r/LocalLLaMA · 1 min read

AllenAI has been iterating on their MolmoAct2 models for robotics

Mirrored from r/LocalLLaMA for archival readability. Support the source by reading on the original site.

AllenAI is cooking with MolmoAct2, a 5B vision-language-action model for robot control. They keep releasing new fine-tunes on different kinds of robotics datasets, with more landing regularly.

AllenAI has released these as fully open source models, publishing not only the weights but also the complete training datasets (pretraining data included), the training source code, and technical papers covering the theory, training, and evaluation of the models.

If you're fiddling with robots controlled via LLM inference, give the MolmoAct2 models a look.
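For anyone who wants to poke at one locally, here's a minimal sketch based on the Hugging Face loading pattern earlier Molmo-family releases documented on their model cards (custom `processor.process` / `model.generate_from_batch` API via `trust_remote_code`). The checkpoint id below is hypothetical, and whether MolmoAct2 keeps the same custom API is an assumption, so check the actual model card:

```python
# Minimal sketch: load a Molmo-family VLA model and run one inference step.
# The repo id is HYPOTHETICAL; look up the real MolmoAct2 names on AllenAI's
# Hugging Face org. The process/generate_from_batch API mirrors earlier
# Molmo model cards and is an assumption for MolmoAct2.
import requests
from PIL import Image
from transformers import AutoModelForCausalLM, AutoProcessor, GenerationConfig

MODEL_ID = "allenai/MolmoAct2-5B"  # hypothetical checkpoint id

# Molmo-family repos ship custom modeling code, hence trust_remote_code.
processor = AutoProcessor.from_pretrained(
    MODEL_ID, trust_remote_code=True, torch_dtype="auto", device_map="auto"
)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID, trust_remote_code=True, torch_dtype="auto", device_map="auto"
)

# Feed the model a camera frame plus a task instruction.
image = Image.open(requests.get("https://example.com/frame.jpg", stream=True).raw)
inputs = processor.process(images=[image], text="Pick up the red block.")
inputs = {k: v.to(model.device).unsqueeze(0) for k, v in inputs.items()}

output = model.generate_from_batch(
    inputs,
    GenerationConfig(max_new_tokens=256, stop_strings="<|endoftext|>"),
    tokenizer=processor.tokenizer,
)

# Decode only the newly generated tokens (the model's action output).
generated = output[0, inputs["input_ids"].size(1):]
print(processor.tokenizer.decode(generated, skip_special_tokens=True))
```

From there you'd parse the generated text into whatever action representation the fine-tune emits and hand it to your robot's controller.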

submitted by /u/ttkciar
