Halo

February 2026 - Present

A compute node that consumed so much of my time it needed its own project page.


Specs:

Its low power footprint keeps running costs down while still delivering solid multicore performance.


Applications

OpenWebUI - Web AI Chat UI

Using Docker, I deployed OpenWebUI, a web interface for chatting with LLMs.
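A deployment along these lines can be done with a single docker run, following Open WebUI's published quickstart image. The volume name, container name, and host port below are arbitrary choices, not necessarily what Halo uses:

```shell
# Run Open WebUI detached, persisting its data in a named volume
# so chats and settings survive container upgrades.
# Host port 3000 maps to the app's internal port 8080.
docker run -d \
  --name open-webui \
  -p 3000:8080 \
  -v open-webui:/app/backend/data \
  --restart unless-stopped \
  ghcr.io/open-webui/open-webui:main
```

After it starts, the UI is reachable at http://localhost:3000 on the host.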

OpenClaw - AI Agent

Halo is also the name of the agent that connects to my Discord and terminals. After hearing how easy it is to run a coding agent inside an app I already use, like WhatsApp or Discord, I decided to set it up. For much of my tuning, it served as a great "average" workload for my LLMs, since a multi-modal chat interface keeps context in high demand.

Llama.cpp - LLM host

This project demanded the most memory and compute, so it was also the most heavily tuned.
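A sketch of the kind of llama-server invocation this tuning converges on. The model path and the specific values are placeholders; the flags themselves are standard llama.cpp options:

```shell
# Serve a GGUF model over llama.cpp's built-in HTTP server.
# -c sets the context window (the scarcest resource here),
# -ngl offloads that many layers to the GPU,
# -t pins the CPU thread count.
llama-server \
  -m ./models/model.gguf \
  -c 8192 \
  -ngl 99 \
  -t 8 \
  --host 0.0.0.0 \
  --port 8080
```

The server exposes an OpenAI-compatible API, which is also what a front end like OpenWebUI can point at.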