Code: https://github.com/declare-lab/delta-Mem
δ-mem: Efficient Online Memory for Large Language Models
Abstract
Large language models increasingly need to accumulate and reuse historical information in long-term assistants and agent systems. Simply expanding the context window is costly and often fails to ensure effective context utilization. We propose δ-mem, a lightweight memory mechanism that augments a frozen full-attention backbone with a compact online state of associative memory. δ-mem compresses past information into a fixed-size state matrix updated by delta-rule learning, and uses its readout to generate low-rank corrections to the backbone's attention computation during generation. With only an 8×8 online memory state, δ-mem improves the average score to 1.10× that of the frozen backbone and 1.15× that of the strongest non-δ-mem memory baseline. It achieves larger gains on memory-heavy benchmarks, reaching 1.31× on MemoryAgentBench and 1.20× on LoCoMo, while largely preserving general capabilities. These results show that effective memory can be realized through a compact online state directly coupled with attention computation, without full fine-tuning, backbone replacement, or explicit context extension.
AI-generated summary
A lightweight memory mechanism called δ-mem enhances large language models by augmenting a frozen attention backbone with a compact associative memory state that provides low-rank corrections to attention computations.
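The abstract's core mechanism, compressing past information into a fixed-size state matrix via delta-rule learning and reading it back out with a key, can be sketched as follows. This is a minimal illustration of the classical delta rule on a small associative state, not the paper's actual implementation; all function names are hypothetical, and the 8×8 shape simply mirrors the memory size reported in the abstract.

```python
import numpy as np

def delta_rule_update(S, k, v, beta=0.5):
    """One delta-rule write: nudge the state's readout for key k toward value v.

    Hypothetical sketch; the correction is rank-1, which is what keeps
    each update cheap regardless of how much history has been absorbed.
    """
    pred = S @ k                              # what the state currently returns for k
    return S + beta * np.outer(v - pred, k)   # rank-1 correction toward v

def readout(S, k):
    """Query the associative state with key k."""
    return S @ k

# Tiny 8x8 state, mirroring the memory size reported in the abstract.
rng = np.random.default_rng(0)
S = np.zeros((8, 8))
k = rng.normal(size=8)
k /= np.linalg.norm(k)                        # unit-norm key for stable updates
v = rng.normal(size=8)

# Repeated writes of the same (k, v) pair make the readout converge to v:
# with a unit-norm key, the error shrinks by a factor (1 - beta) per write.
for _ in range(20):
    S = delta_rule_update(S, k, v)

print(np.allclose(readout(S, k), v, atol=1e-4))
```

In δ-mem the readout would then be turned into a low-rank correction to the backbone's attention computation; here only the write/read cycle of the state itself is shown.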
Community
This is really, really cool.
I think it's important that we stop treating the weights as the sole keeper of a model's knowledge and start attaching components with their own purposes. This paper introduces one of the most lightweight and successful implementations of tack-on memory that I've seen yet.
Keep it up!
I really appreciate that you like it!
arXiv: arxiv.org/abs/2605.12357