TheSequence

Edge 321: Memory and Foundation Models

Reviewing one of the most important concepts in language model programming

Aug 29, 2023
Created Using Midjourney

In this Issue:

  1. An introduction to the concept of memory in foundation models.

  2. Microsoft Research paper about augmenting LLMs with long-term memory.

  3. A review of the super-popular Pinecone vector database.

💡 ML Concept of the Day: Memory and Foundation Models

Throughout this series, we have explored different emerging techniques in large language models (LLMs), such as chain-of-thought reasoning and in-context learning (ICL), which could power the next generation of LLM foundations. Many of these techniques rely on developing a richer context in LLM interactions, which is often related to remembering specific concepts in a conversation. In cognitive theory, we refer to this capability as memory and, not surprisingly, the foundation model space is adopting memory as one of the key components of its architecture.
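To make the idea concrete, here is a minimal sketch of the simplest form of memory: a short-term conversation buffer that keeps recent exchanges and prepends them to each new prompt so the model can "remember" earlier turns. All names are illustrative; production frameworks (and vector databases like Pinecone, for long-term memory) provide far richer implementations.

```python
class ConversationBuffer:
    """Short-term memory: keeps the last `max_turns` exchanges.

    This is an illustrative sketch, not a real framework API.
    """

    def __init__(self, max_turns: int = 5):
        self.max_turns = max_turns
        self.turns: list[tuple[str, str]] = []  # (user, assistant) pairs

    def add(self, user_msg: str, assistant_msg: str) -> None:
        """Record one exchange, evicting the oldest if the buffer is full."""
        self.turns.append((user_msg, assistant_msg))
        self.turns = self.turns[-self.max_turns :]

    def build_prompt(self, new_msg: str) -> str:
        """Prepend remembered turns so the model sees prior context."""
        lines = []
        for user, assistant in self.turns:
            lines.append(f"User: {user}")
            lines.append(f"Assistant: {assistant}")
        lines.append(f"User: {new_msg}")
        lines.append("Assistant:")
        return "\n".join(lines)
```

A buffer like this captures only recent context; long-term memory typically stores embeddings of past interactions in a vector database and retrieves the most relevant ones at prompt time, which is where systems like Pinecone come in.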

© 2025 Jesus Rodriguez