Edge 378: Meet TimesFM: Google's New Foundation Model for Time-Series Forecasting

The model has about 200M parameters and was trained on over 100 billion data points.

Mar 14, 2024
Created Using Ideogram

Time series forecasting has been one of the classic scenarios in machine learning (ML) since the field's early days. The ability to produce predictions over time series data is relevant to many domains, including retail, finance, manufacturing, healthcare, the natural sciences and, yes, stock market prediction. Despite its relevance, progress in these scenarios pales relative to the rapid developments we are seeing in LLMs, computer vision, and other areas of generative AI. Is the pretrained foundation model paradigm applicable to time series forecasting? Google seems to believe so, with a recent research paper that outlines a decoder-only pretrained model for time series forecasting. The paper introduces TimesFM, a 200M-parameter foundation model trained on over 100 billion time series data points. Google also announced that the new model will be available in Vertex AI in the near future.
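To make the decoder-only idea concrete, here is a minimal sketch of a TimesFM-style forecaster in PyTorch: the input series is split into fixed-length patches, each patch is embedded, a causally masked transformer runs over the patch sequence, and every position predicts a longer output patch. The class name, patch lengths, and layer sizes below are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal decoder-only time-series forecaster, in the spirit of TimesFM.
# All hyperparameters here are illustrative, not the paper's settings.
import torch
import torch.nn as nn

class DecoderOnlyForecaster(nn.Module):
    def __init__(self, input_patch=32, output_patch=128,
                 d_model=256, n_heads=4, n_layers=4):
        super().__init__()
        self.input_patch = input_patch
        # Embed each raw-value patch into the model dimension.
        self.embed = nn.Linear(input_patch, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads,
            dim_feedforward=4 * d_model, batch_first=True)
        # An encoder stack with a causal mask behaves as a
        # decoder-only model (no cross-attention is needed).
        self.backbone = nn.TransformerEncoder(layer, num_layers=n_layers)
        # Each position predicts a horizon longer than its input patch.
        self.head = nn.Linear(d_model, output_patch)

    def forward(self, series):
        # series: (batch, num_patches * input_patch)
        b, t = series.shape
        patches = series.view(b, t // self.input_patch, self.input_patch)
        x = self.embed(patches)
        # Causal mask: each patch attends only to itself and earlier patches.
        mask = nn.Transformer.generate_square_subsequent_mask(x.size(1))
        h = self.backbone(x, mask=mask)
        return self.head(h)  # (batch, num_patches, output_patch)

# Usage: forecast the next 128 points from a 256-point history.
model = DecoderOnlyForecaster()
history = torch.randn(8, 256)        # 8 series, 8 patches of 32 points each
forecast = model(history)[:, -1, :]  # the last position's prediction
print(forecast.shape)                # torch.Size([8, 128])
```

One design choice worth noting, which the TimesFM paper emphasizes, is that the output patch is longer than the input patch: forecasting a long horizon then takes fewer autoregressive steps than it would with equal patch lengths.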
