TheSequence
Edge 344: LLMs and Memory is All You Need. Inside One of the Most Shocking Papers of the Year

Can memory-augmented LLMs simulate any algorithm?

Nov 16, 2023
Created Using DALL-E

Large language models (LLMs) continue to push the limits of computational models one breakthrough at a time. How far can this go? A recent research paper from AI researchers at Google Brain and the University of Alberta shows that it can go very far indeed. Could we simulate any algorithm using an LLM augmented with memory? Could the combination of an LLM and external memory be Turing complete?
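The intuition behind the Turing-completeness claim can be sketched in a few lines: treat the LLM as a finite transition function that reads a symbol from external memory, decides what to write and where to move, and loops until it halts. The snippet below is a minimal illustration of that loop, not the paper's construction: a deterministic stub (`llm_stub`) stands in for a prompted LLM, and the transition rules it encodes (binary increment on a tape) are an assumption chosen purely for demonstration.

```python
# Sketch of the "LLM + external memory" loop: the model acts as a finite
# transition function; unbounded memory comes from the tape, not the model.

def llm_stub(state, symbol):
    """Stand-in for a prompted LLM: maps (state, symbol) -> (new_state, write, move).
    These rules implement binary increment (illustrative assumption only)."""
    rules = {
        ("inc", "1"): ("inc", "0", -1),   # carry: flip 1 -> 0, move left
        ("inc", "0"): ("halt", "1", 0),   # absorb carry: flip 0 -> 1, halt
        ("inc", "_"): ("halt", "1", 0),   # past left edge: write leading 1, halt
    }
    return rules[(state, symbol)]

def run(tape_str):
    # External memory: a sparse tape keyed by cell position
    tape = {i: c for i, c in enumerate(tape_str)}
    head = len(tape_str) - 1              # start at the least-significant bit
    state = "inc"
    while state != "halt":
        symbol = tape.get(head, "_")      # "_" marks a blank cell
        state, write, move = llm_stub(state, symbol)
        tape[head] = write                # write back to external memory
        head += move
    lo, hi = min(tape), max(tape)
    return "".join(tape.get(i, "_") for i in range(lo, hi + 1))

print(run("1011"))  # 11 + 1 = 12 -> "1100"
print(run("111"))   #  7 + 1 =  8 -> "1000"
```

The point of the sketch is the division of labor: the transition function is finite and fixed, so all unbounded computation lives in the read/write memory — which is exactly why bolting memory onto an LLM changes what it can, in principle, compute.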

This post is for paid subscribers
© 2025 Jesus Rodriguez