A simpler self-hosted alternative to Open WebUI
Mirrored from r/LocalLLaMA for archival readability. Support the source by reading on the original site.
Got Qwen3.6 27B running on my newly assembled 4x 3090 rig (s/o 3090-club) and I'm trying to get the people in my house to adopt the local workflow. Open WebUI has improved a lot in recent updates, but I still found it pretty rough for non-technical people. It often feels more like a dev tool than a self-hosted ChatGPT-style app that "just works".

I built overtchat to focus mainly on getting the core chat experience right: a polished UI, simple setup, and fewer moving parts. The goal is not to compete on agentic workflows with LibreChat/LobeChat/OWUI but to provide a cleaner self-hosted interface for local models.

- Ships with its own tried & tested SearXNG config for web search, plus Kokoro TTS (no API keys needed)
- Single Docker Compose file
- MIT licensed of course, no telemetry
- Optimized for mobile as a PWA

GitHub.

Also being upfront - I write code for a living and have been actively reviewing/debugging/changing things, but I did use quite a lot of AI lol. I promise it's not slop tho 😿. Feedback is welcome!
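For readers unfamiliar with this kind of setup, a "single Docker Compose file" stack like the one described might look roughly like the sketch below. This is an illustrative assumption, not overtchat's actual file: the service names, image tags, ports, and environment variables are all hypothetical (only the public SearXNG image is a known real image).

```yaml
# Hypothetical docker-compose.yml sketch -- service names, images, ports,
# and env vars are illustrative assumptions, not overtchat's real config.
services:
  overtchat:
    image: overtchat/overtchat:latest        # assumed image name
    ports:
      - "3000:3000"
    environment:
      # Point at any local OpenAI-compatible server (llama.cpp, vLLM, etc.)
      - OPENAI_API_BASE=http://llm:8080/v1
      - SEARXNG_URL=http://searxng:8080      # bundled web search
      - TTS_URL=http://kokoro:8880           # bundled Kokoro TTS
    depends_on:
      - searxng
      - kokoro

  searxng:
    image: searxng/searxng:latest            # real public image
    volumes:
      - ./searxng:/etc/searxng               # a tuned SearXNG config would mount here

  kokoro:
    image: example/kokoro-tts:latest         # placeholder for a Kokoro TTS server image
```

With a file like this in place, the whole stack comes up with `docker compose up -d`, which is the kind of one-command setup the post is aiming for.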