TheSequence

πŸ€– πŸ•• Edge#57: Transformer Architectures for Time Series

TheSequence is the best way to build and reinforce your knowledge about machine learning and AI

Jan 26, 2021

In this issue:

  • we discuss transformers for time series and Google Research’s Temporal Fusion Transformers (TFT) in particular;

  • we learn how Uber manages uncertainty in time-series prediction models;

  • we explore tsfresh – a magical library for feature extraction in time-series datasets (a minimal usage sketch follows this list).
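
For readers who have not used tsfresh before, here is a minimal sketch of how feature extraction typically looks. The DataFrame layout and column names below are illustrative assumptions, not taken from this issue; the only library call used is extract_features, part of tsfresh’s public API.

```python
# Minimal tsfresh sketch (assumes `pip install tsfresh`); data is illustrative.
import pandas as pd
from tsfresh import extract_features

# Long-format time series: one row per observation, keyed by series id and timestamp.
df = pd.DataFrame({
    "id":    [1, 1, 1, 2, 2, 2],
    "time":  [0, 1, 2, 0, 1, 2],
    "value": [1.0, 2.5, 3.2, 0.4, 0.8, 1.1],
})

# extract_features computes a large battery of statistical features per series id
# (e.g. autocorrelation, FFT coefficients, entropy).
features = extract_features(df, column_id="id", column_sort="time")
print(features.shape)  # one row per series id, one column per extracted feature
```

The resulting feature table can then be fed to any standard tabular model (gradient boosting, linear models, etc.) for forecasting or classification.
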

Enjoy the learning!
