r/LocalLLaMA · 1 min read

RDNA3 Flash Attention fix just dropped in llama.cpp b9158


