🤖 Edge#128: Wu Dao – the Biggest Transformer Model in History

What’s New in AI is a deep dive into one of the freshest research papers or technology frameworks that is worth your attention. Explained in a read of under 5 minutes.

💥 What’s New in AI: Wu Dao is the Biggest Transformer Model in History

It seems that every other month brings a new milestone in the race to build massively large transformer models. The trajectory is astonishing. GPT-2 set new records with a 1.5-billion-parameter model, only to be surpassed by Microsoft’s Turing-NLG with 17 billion parameters. GPT-3 raised the mark to 175 billion parameters, and Google’s Switch Transformer took it to 1.6 trillion. Recently, the Beijing Academy of Artificial Intelligence (BAAI) announced the release of Wu Dao 2.0, a transformer model with a mind-blowing 1.75 trillion parameters. Those numbers are hard even to imagine.
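For a rough sense of scale, here is a minimal Python sketch using only the parameter counts cited above; the model names and figures come from the paragraph, everything else is illustrative:

```python
# Parameter counts cited above, expressed as raw numbers.
MODEL_PARAMS = {
    "GPT-2": 1.5e9,
    "Turing-NLG": 17e9,
    "GPT-3": 175e9,
    "Switch Transformer": 1.6e12,
    "Wu Dao 2.0": 1.75e12,
}

baseline = MODEL_PARAMS["GPT-2"]
for name, params in MODEL_PARAMS.items():
    # Growth factor relative to GPT-2's 1.5 billion parameters.
    print(f"{name}: {params:.3g} parameters (~{params / baseline:,.0f}x GPT-2)")
```

By this count, Wu Dao 2.0 is roughly 1,000 times larger than GPT-2, released only a couple of years earlier.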

>Become Premium and learn the history of Wu Dao, its architecture, its multimodality and training capabilities, and how impressive Wu Dao is in action.

🙌 Let’s connect

Follow us on Twitter. We share lots of useful information, events, educational materials, and informative threads that strengthen your knowledge of ML.
