Relay

1 post with this tag

Relay: The Lightweight LLM Proxy for Self-Hosted AI Stacks
Tired of 'proxy fatigue' from juggling Ollama, Llama.cpp, and Groq? Relay is a lightweight, config-driven TypeScript proxy that orchestrates heterogeneous LLM backends—like nginx for reasoning. It has run stably for 11+ days.

Administrator 3/31/2026