r/LocalLLaMA · 1 min read

A VERY lightweight open web-search tool for smaller local LLMs

Mirrored from r/LocalLLaMA for archival readability. Support the source by reading on the original site.

Hey everyone,

Been playing around with local agent setups lately, mostly Cline/Roo with smaller models, and web search kept annoying me.

Not because it doesn’t work, but because it usually throws way too much random page text into the context. Small models really don’t handle that gracefully: they start with a simple search and suddenly half the prompt is scraped garbage.

So I built this bad boy: TinySearch.

It’s a small open-source MCP tool that does web search, crawls a few pages, chunks/retrieves/reranks the useful bits, and gives the agent a much smaller context blob instead of dumping full pages.
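For a rough idea of what that chunk/retrieve step looks like, here's a minimal stdlib-only sketch. This is illustrative, not TinySearch's actual code: the chunker here is naive fixed-size word windows, the scorer is plain BM25 (no dense retrieval or reranking), and every function name is made up.

```python
import math
from collections import Counter

def chunk(text, size=40):
    """Split crawled page text into fixed-size word chunks (a stand-in for smarter chunking)."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def bm25_scores(query, chunks, k1=1.5, b=0.75):
    """Score each chunk against the query with plain BM25."""
    docs = [c.lower().split() for c in chunks]
    avgdl = sum(len(d) for d in docs) / len(docs)
    N = len(docs)
    df = Counter()                      # document frequency per term
    for d in docs:
        df.update(set(d))
    scores = []
    for d in docs:
        tf = Counter(d)                 # term frequency in this chunk
        s = 0.0
        for t in query.lower().split():
            if t not in tf:
                continue
            idf = math.log((N - df[t] + 0.5) / (df[t] + 0.5) + 1)
            s += idf * tf[t] * (k1 + 1) / (tf[t] + k1 * (1 - b + b * len(d) / avgdl))
        scores.append(s)
    return scores

def top_chunks(query, pages, k=3):
    """Crawled pages -> chunks -> BM25 ranking -> small context blob for the agent."""
    chunks = [c for page in pages for c in chunk(page)]
    ranked = sorted(zip(bm25_scores(query, chunks), chunks), reverse=True)
    return [c for _, c in ranked[:k]]
```

The point is the shape of the pipeline: instead of handing the model whole pages, only the top-k scored chunks make it into the prompt.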

Repo:
https://github.com/MarcellM01/TinySearch

Uses DuckDuckGo, Crawl4AI, dense + BM25-style retrieval, reranking, MCP, and it can also run as a FastAPI server.
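Running dense and BM25 retrieval side by side means you need some way to merge two rankings whose scores aren't on the same scale. I don't know exactly how TinySearch fuses them, but reciprocal-rank fusion (RRF) is one common approach that needs no score calibration; a stdlib sketch:

```python
def rrf_fuse(rankings, k=60):
    """Merge several ranked lists (e.g. dense and BM25 results) with reciprocal-rank fusion.

    Each item's fused score is the sum of 1 / (k + rank) over every list it
    appears in, so items ranked highly by either retriever float to the top.
    """
    scores = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

# Example: "b" is near the top of both lists, so it wins overall.
dense = ["a", "b", "c"]
bm25 = ["b", "c", "a"]
fused = rrf_fuse([dense, bm25])
```

A reranker would then typically re-score just the fused top-k before anything reaches the model.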

On my setup (an M4 Mac and an old Lenovo ThinkPad) it usually takes around 5–12 seconds end to end, depending on the query and machine.

Not trying to replace real search infra or anything. It’s more of a little local research layer for people building agents who don’t want to spin up a whole backend just to let the model look stuff up.

Still rough in places, but it’s been useful enough for my own workflows that I figured I’d share it.

Feedback/roasting welcome, especially from people using Cline, Roo, MCP, or smaller local models.

submitted by /u/Scared-Tip7914
