Edge 378: Meet TimesFM: Google's New Foundation Model for Time-Series Forecasting
The model has about 200M parameters and was trained on over 100 billion time series data points.
Time series forecasting has been one of the classic scenarios in machine learning (ML) since the field's early days. The ability to output predictions on time series data is relevant to many domains, including retail, finance, manufacturing, healthcare, the natural sciences and, yes, stock market prediction. Despite its relevance, progress in these scenarios pales relative to the rapid developments we are seeing in LLMs, computer vision, and other areas of generative AI. Is the pretrained-model paradigm applicable to time series forecasting? Google seems to believe so: a recent research paper outlines a decoder-only pretrained model for time series forecasting. The paper introduces TimesFM, a 200M-parameter foundation model trained on over 100 billion time series data points. Google also announced that the new model will be available in Vertex AI in the near future.
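To make the decoder-only idea concrete: models in this family first tokenize a continuous series into fixed-length patches so a Transformer decoder can treat them like tokens. The sketch below is a minimal illustration of that patching step, not TimesFM's actual implementation; the function name and the patch length of 32 are illustrative assumptions.

```python
import numpy as np

def patch_series(series, patch_len=32):
    """Split a 1-D time series into fixed-length patches.

    Patch-based models feed each patch to a decoder-only Transformer
    as one "token". The patch_len=32 default here is an illustrative
    choice, not a confirmed TimesFM configuration. Trailing points
    that do not fill a complete patch are dropped in this sketch.
    """
    n_patches = len(series) // patch_len
    trimmed = series[: n_patches * patch_len]
    return trimmed.reshape(n_patches, patch_len)

# Example: 130 points yield 4 patches of 32; 2 trailing points are dropped.
ts = np.arange(130, dtype=np.float32)
patches = patch_series(ts)
print(patches.shape)  # (4, 32)
```

At inference time, such a model autoregressively emits output patches that are concatenated into the forecast horizon, which is what lets a single pretrained checkpoint produce zero-shot forecasts on unseen series.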