Audio input not accepted with llama.cpp for Nemotron 3 Nano Omni?
Mirrored from r/LocalLLaMA.
llama-server does not accept audio input (or video, for that matter) with Nemotron 3 Nano Omni (Unsloth quant). I'm on a recent build of llama.cpp, I re-downloaded Nemotron, and I have the mmproj loaded too. It still accepts images, but not audio; in fact, the audio input option in the llama-server webUI is greyed out. Audio input works with Gemma 3n E4B, so I know my llama.cpp audio setup isn't broken in general. It seems like something is going on with llama.cpp's compatibility with Nemotron 3 Nano Omni specifically.
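One way to narrow this down is to bypass the webUI entirely and feed audio through llama.cpp's multimodal CLI. The sketch below assumes hypothetical file names (`nemotron-3-nano-omni.gguf`, `mmproj-nemotron.gguf`, `sample.wav`); substitute your actual paths. If the CLI also rejects the audio, the limitation is in the model/mmproj support rather than the web frontend.

```shell
# Launch llama-server with the multimodal projector loaded
# (hypothetical file names -- use your actual GGUF paths):
./llama-server -m nemotron-3-nano-omni.gguf \
    --mmproj mmproj-nemotron.gguf \
    --port 8080

# Test audio directly via the multimodal CLI, bypassing the webUI.
# If this fails too, the issue is model/mmproj support, not the frontend:
./llama-mtmd-cli -m nemotron-3-nano-omni.gguf \
    --mmproj mmproj-nemotron.gguf \
    --audio sample.wav \
    -p "Transcribe this audio."
```

The webUI greys out audio input based on what capabilities the loaded mmproj advertises, so a CLI test is a more direct check of whether audio is supported at all for this model.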
Is this a known issue? What's getting in the way?