UC Berkeley unveils MemGPT: Applying OS architecture to LLMs for unlimited context

Combining an OS-inspired architecture with an LLM for unbounded context via memory paging

aimodels-fyi
Oct 14, 2023

"In MemGPT, a fixed-context LLM processor is augmented with a tiered memory system and a set of functions that allow it to manage its own memory." - From the project site.

Large language models like GPT-3 have revolutionized AI by achieving impressive performance on natural language tasks. However, they are fundamentally limited by their fixed context window…
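To make the OS analogy concrete, here is a minimal sketch of the tiered-memory idea the quote describes: a fixed-size "main context" (the LLM's context window, analogous to RAM) backed by unbounded external storage (analogous to disk), with functions the model's controller can call to page information out of and back into context. This is an illustrative assumption, not MemGPT's actual API; the class and method names below are hypothetical.

```python
# Illustrative sketch only -- not MemGPT's real implementation or API.
class TieredMemory:
    def __init__(self, main_context_limit=3):
        self.main_context = []        # what fits in the LLM's context window ("RAM")
        self.archival_storage = []    # unbounded external storage ("disk")
        self.main_context_limit = main_context_limit

    def append(self, message):
        """Add a message to main context, evicting the oldest entries to
        archival storage when the limit is exceeded (a simple FIFO page-out)."""
        self.main_context.append(message)
        while len(self.main_context) > self.main_context_limit:
            self.archival_storage.append(self.main_context.pop(0))

    def archival_search(self, query):
        """'Page in' relevant items via naive substring search; a real system
        would use embedding-based retrieval."""
        return [m for m in self.archival_storage if query.lower() in m.lower()]

    def render_prompt(self):
        """The text the fixed-context LLM actually sees on each step."""
        return "\n".join(self.main_context)


# Usage: older turns spill to archival storage, and the model can request a
# search to bring evicted facts back into its context window.
memory = TieredMemory(main_context_limit=3)
for turn in ["User: my dog is named Rex", "Assistant: noted!",
             "User: what's the weather?", "Assistant: sunny today."]:
    memory.append(turn)

print(memory.render_prompt())         # only the 3 most recent turns
print(memory.archival_search("dog"))  # retrieves the evicted fact about Rex
```

The key design point is that the paging functions are exposed to the LLM itself (e.g. via function calling), so the model decides when to move data between tiers rather than relying on a fixed truncation policy.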

