GitHub - pwilkin/openmoss: OpenMOSS pure C++ pipeline based on GGML
Mirrored from r/LocalLLaMA.
I'm uploading a full GGML-based pipeline for OpenMOSS (https://huggingface.co/OpenMOSS-Team/MOSS-TTS) that I vibe-coded for myself, in case someone else finds it useful. TTS models are notoriously annoying to set up because of the Python ecosystem around them, so I decided to make things a bit simpler. Both server mode and single-shot CLI mode are supported.

Why OpenMOSS? For me, the reason was that it's one of the few TTS models that handles languages outside the typical English/Chinese pair well, Polish in particular. Maybe someone else will find it useful as well.