r/LocalLLaMA · 1 min read

What is the most unexpected thing you have gotten a local model to do?

Mirrored from r/LocalLLaMA for archival readability. Support the source by reading on the original site.

Most local LLM use cases I see are chat, coding, and RAG. But with vision models getting better and faster on consumer hardware, I feel like there is a lot of untapped territory.

I got a local VLM to play a board game just by looking at the screen, and it worked far better than I expected.
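The post doesn't share any code, but the basic loop is simple: screenshot the display, hand the image to a local vision model, and read back a move suggestion. Here is a minimal sketch of that idea, assuming an Ollama server running locally with a vision-capable model pulled (llava here as a placeholder), plus the `ollama` and `mss` Python packages; the prompt and polling interval are illustrative, not from the post.

```python
# Minimal screenshot -> local VLM loop, as a sketch of the OP's setup.
# Assumptions (not from the post): Ollama running locally, a vision model
# named "llava" pulled, and the `ollama` and `mss` packages installed.
import time
import mss
import mss.tools
import ollama

PROMPT = (
    "This screenshot shows a board game in progress. "
    "Describe the current position and suggest the next move."
)

def capture_screen(path: str = "board.png") -> str:
    """Grab the primary monitor and save it as a PNG."""
    with mss.mss() as sct:
        shot = sct.grab(sct.monitors[1])  # monitor 1 = primary display
        mss.tools.to_png(shot.rgb, shot.size, output=path)
    return path

def suggest_move(image_path: str) -> str:
    """Ask the local vision model for a move based on the screenshot."""
    response = ollama.chat(
        model="llava",  # placeholder: any local vision-capable model
        messages=[{"role": "user", "content": PROMPT, "images": [image_path]}],
    )
    return response["message"]["content"]

if __name__ == "__main__":
    while True:
        print(suggest_move(capture_screen()))
        time.sleep(10)  # re-read the board every 10 seconds
```

Reading the board from pixels rather than wiring into the game's state is what makes this generalize: the same loop works for any game that renders on screen.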

What is the weirdest or most unexpected thing you have used a local model for?

submitted by /u/Enough-Astronaut9278

Discussion (0)

No comments yet.
