self-hosted AI

8 posts with this tag

Pan UI Review: Self-Hosted AI Agent Workspace for Hermes

Tired of cloud-hosted AI agents leaking sensitive data? Pan UI is a self-hosted, TypeScript-based workspace built for the Hermes Agent—featuring chat, skills, memory, profile switching, and runtime controls. Lightweight, private, and proven stable across 11 days of testing.

Administrator 4/16/2026
MF0-1984: The Privacy-First Local AI Agent You’ve Been Waiting For

Tired of AI tools that phone home? MF0-1984 is a lean, CLI-first JavaScript AI agent that runs entirely on your hardware—zero cloud dependencies, no telemetry, no subscriptions. Built for sysadmins and privacy advocates who demand real control.

Administrator 4/16/2026
AionsHome Review: A Real AI Companion for Your Smart Home

Tired of forgetful AI 'companions'? AionsHome is a real self-hosted Python stack with long-term memory, local voice I/O, camera vision, and Home Assistant integration — no cloud, no hallucinations, just continuity.

Administrator 4/14/2026
Tigrimos Review: The Self-Hosted AI Workspace That Just Works

Tigrimos is a rare self-hosted AI workspace that runs natively on macOS and Windows—no Docker, no WSL2, no Node.js bloat. Built in TypeScript, it delivers multi-agent orchestration out of the box with strong privacy guarantees.

Administrator 4/13/2026
m_flow: Lightweight Memory-Augmented Knowledge Graph for AI Agents

m_flow is the first lightweight, memory-augmented knowledge graph engine built for self-hosted AI agents — no fine-tuning, no vector DB tuning. It auto-builds contextual graphs from docs like DevOps runbooks and Zettelkasten on minimal hardware.

Administrator 4/4/2026
Relay: The Lightweight LLM Proxy for Self-Hosted AI Stacks

Tired of 'proxy fatigue' juggling Ollama, Llama.cpp, and Groq? Relay is a lightweight, config-driven TypeScript proxy that orchestrates heterogeneous LLM backends—like nginx for reasoning. Proven stable over 11+ days of continuous operation.

Administrator 3/31/2026
Awesome-OpenClaw: The Cohesive Self-Hosted AI Toolkit You Need

Tired of broken LangChain integrations and unreliable RAG pipelines? Awesome-OpenClaw is a curated, maintained GitHub list (253+ stars) offering a cohesive, modular blueprint for self-hosted AI—no 300-line glue code required.

Administrator 3/30/2026