arxiv:2403.11901

Larimar: Large Language Models with Episodic Memory Control

Published on Mar 18 · Submitted by akhaliq on Mar 19 · #3 Paper of the day

Abstract

Efficient and accurate updating of knowledge stored in Large Language Models (LLMs) is one of the most pressing research challenges today. This paper presents Larimar, a novel, brain-inspired architecture for enhancing LLMs with a distributed episodic memory. Larimar's memory allows for dynamic, one-shot updates of knowledge without the need for computationally expensive re-training or fine-tuning. Experimental results on multiple fact-editing benchmarks demonstrate that Larimar not only attains accuracy comparable to that of the most competitive baselines, even in the challenging sequential editing setup, but also excels in speed, yielding speed-ups of 4-10x depending on the base LLM, and in flexibility, since the proposed architecture is simple, LLM-agnostic, and hence general. We further provide mechanisms for selective fact forgetting and input context length generalization with Larimar and show their effectiveness.

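The memory mechanism the abstract describes, a writable external store supporting one-shot edits and selective forgetting without any gradient updates, can be illustrated with a toy sketch. Note this is not the paper's implementation (Larimar couples a generative memory model to the LLM decoder); the class and method names below are hypothetical and only illustrate the write/read/forget interface such a memory exposes.

```python
# Toy sketch only, NOT Larimar's implementation: a slot-based external
# memory with one-shot write, soft read, and selective forgetting.
# All names (EpisodicMemory, write/read/forget) are hypothetical.
import torch
import torch.nn.functional as F

class EpisodicMemory:
    def __init__(self, num_slots: int, dim: int):
        self.keys = torch.zeros(num_slots, dim)    # addressing vectors
        self.values = torch.zeros(num_slots, dim)  # encoded facts
        self.used = torch.zeros(num_slots, dtype=torch.bool)

    def write(self, key: torch.Tensor, value: torch.Tensor) -> int:
        """One-shot edit: overwrite a matching slot, else claim a free one.
        No gradient steps, so an update costs a single memory write."""
        if self.used.any():
            sims = F.cosine_similarity(self.keys[self.used], key.unsqueeze(0))
            if sims.max() > 0.9:  # key already stored: edit in place
                idx = torch.nonzero(self.used).squeeze(1)[sims.argmax()]
                self.keys[idx], self.values[idx] = key, value
                return int(idx)
        idx = int(torch.nonzero(~self.used)[0])  # first free slot
        self.keys[idx], self.values[idx], self.used[idx] = key, value, True
        return idx

    def read(self, query: torch.Tensor) -> torch.Tensor:
        """Soft attention over occupied slots; the returned vector would
        condition the frozen LLM's generation downstream."""
        w = F.softmax(self.keys[self.used] @ query, dim=0)
        return w @ self.values[self.used]

    def forget(self, key: torch.Tensor, threshold: float = 0.9) -> None:
        """Selective forgetting: release every slot whose key matches."""
        sims = F.cosine_similarity(self.keys, key.unsqueeze(0))
        self.used &= ~(sims > threshold)

# One-shot knowledge edit, no re-training or fine-tuning:
mem = EpisodicMemory(num_slots=128, dim=64)
k, v = torch.randn(64), torch.randn(64)
mem.write(k, v)   # inject/update a fact in a single step
_ = mem.read(k)   # retrieve it to condition generation
mem.forget(k)     # selectively drop it again
```
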
Community

Will you be publishing the code for this paper?

Actually, I did not fully understand the methods, since I have no background in episodic memory and related theories, but I can understand why this structure is designed the way it is, with the hippocampus-neocortex interaction inspiring your model. I have some questions about the motivation. Is the concept of episodic memory necessary to build the hippocampus-neocortex interaction? If so, why is episodic memory important? Is an ordinary memory structure such as RAG not appropriate for implementing it?

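A note for readers weighing the RAG question above: one practical difference is that RAG typically retrieves from a read-only corpus indexed offline, whereas an episodic memory is a writable store the model can edit or overwrite at inference time, which is what makes one-shot editing and forgetting natural. A deliberately minimal sketch of that contrast follows; the names are hypothetical and this is not the paper's code.

```python
# Deliberately minimal contrast (hypothetical names, not the paper's code).

# RAG-style store: built offline, read-only at inference time.
# Correcting a fact means re-indexing the corpus.
corpus = {"capital_of_france": "Paris"}

def rag_lookup(query_key: str) -> str:
    return corpus.get(query_key, "unknown")

# Episodic-memory-style store: writable during inference, so a single
# write (or overwrite) performs a knowledge edit.
episodic = {}

def memory_write(key: str, value: str) -> None:
    episodic[key] = value

def memory_read(key: str) -> str:
    return episodic.get(key, "unknown")

memory_write("capital_of_france", "Paris")
memory_write("capital_of_france", "Lyon")  # one-shot counterfactual edit
assert memory_read("capital_of_france") == "Lyon"
```
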

Anyone found a PyTorch implementation out there?

Larimar: Revolutionizing Large Language Models with Brain-Inspired Memory Control

Links 🔗:

👉 Subscribe: https://www.youtube.com/@Arxflix
👉 Twitter: https://x.com/arxflix
👉 LMNT (Partner): https://lmnt.com/

By Arxflix
