r/LocalLLaMA · 1 min read

Anyone actually using a local LLM as their daily knowledge base? Not for coding, for life stuff. What's your setup?

Mirrored from r/LocalLLaMA for archival readability.

So I've been going down a rabbit hole lately and I can't find many people actually talking about this specific use case.

Everyone here runs local LLMs for coding, chat, maybe some creative writing. Cool. But what about using one as a proper personal knowledge base? Like, dump your own notes, PDFs, and random docs into it, and actually query your own life privately, every day.

I tried looking into this seriously and hit a wall. Most resources either assume you're a developer building something, or they're 2 years old and recommend tools that have completely changed since.

So, genuinely asking: is anyone here actually doing this day to day? Not as an experiment, but as a real workflow?

Things I keep running into that I can't figure out:

  • What model are you running for this? RAG on consumer hardware seems finicky depending on the quant.
  • Do you actually trust the retrieval, or do you double-check everything because of hallucinations?
  • LlamaIndex vs Ollama vs whatever else: has anything actually made this less painful recently?
  • Context length: how do you handle it when your personal docs start piling up?
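For concreteness, here's the rough shape of the pipeline I mean. This is a toy sketch only: it uses a crude keyword-overlap score in place of a real embedding model, and every name in it is illustrative, not from any particular tool. Real setups would swap the scoring for embeddings served by something like Ollama.

```python
# Toy sketch of the "query your own docs" loop: chunk text, rank
# chunks by word overlap with the question, keep the top-k, and
# build a prompt for a local model. The overlap score stands in
# for a real embedding similarity; all names here are made up.

def chunk(text, size=50):
    """Split text into chunks of roughly `size` words."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def score(question, passage):
    """Crude relevance: fraction of question words present in the passage."""
    q = set(question.lower().split())
    p = set(passage.lower().split())
    return len(q & p) / max(len(q), 1)

def retrieve(question, docs, k=3):
    """Rank all chunks from all docs and return the top-k."""
    chunks = [c for d in docs for c in chunk(d)]
    return sorted(chunks, key=lambda c: score(question, c), reverse=True)[:k]

def build_prompt(question, docs):
    """Stuff the retrieved chunks into a grounded-answer prompt."""
    context = "\n---\n".join(retrieve(question, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"
```

Even this toy version shows where my questions come from: the retrieval step decides what the model ever sees, and the prompt grows with however many chunks you keep, which is exactly where context length starts to bite.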

Not looking for a tutorial or a GitHub repo. Just want to hear from someone who's made this work without it becoming a part time job to maintain.

submitted by /u/InformationSweet808
