The Standard for Scalable Deep Learning Models
Weekly news digest curated by the industry insiders
📝 Editorial
Large deep learning models seem to be the norm these days. While deep neural networks with trillions of parameters are very attractive, they are nothing short of a nightmare to train. In most training techniques, the computational cost scales linearly with the number of parameters, resulting in impractical costs for most scenarios. In recent years, mixture of experts (MoE) has emerged as a powerful alternative. Conceptually, MoE operates by partitioning a task into subtasks, routing each input to a small set of specialized expert networks, and aggregating their outputs. When applied to deep learning models, MoE has proven to scale sublinearly with respect to the number of parameters, making it the only viable option for scaling deep learning models to trillions of parameters.
The value proposition of MoE has sparked the creation of new frameworks supporting this technique. Facebook AI Research (FAIR) recently launched fairseq for using MoE in language models. Similarly, researchers from the famous Beijing Academy of Artificial Intelligence (BAAI) open-sourced FastMoE, an implementation of MoE in PyTorch. A few days ago, Microsoft Research jumped into the MoE contributions space with the release of Tutel, an open-source library that uses MoE to enable the implementation of super large deep neural networks. One of the best things about Tutel is that Microsoft didn't only focus on the open-source release but also deeply optimized the framework for the GPUs supported in the Azure platform, streamlining the adoption of this MoE implementation. Little by little, MoE is becoming the gold standard of large deep learning models.
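The routing-and-aggregation idea behind MoE can be sketched in a few lines. The sketch below is illustrative only: the dimensions, gate, and expert weights are made-up assumptions, not the API of Tutel, fairseq, or FastMoE. The key point is that only the top-k selected experts run per token, so compute grows with k while total parameters grow with the number of experts.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes chosen for illustration only.
d_model, n_experts, top_k = 8, 4, 2

# Each "expert" is a small feed-forward weight matrix.
experts = [rng.standard_normal((d_model, d_model)) * 0.1 for _ in range(n_experts)]
# The gate maps a token vector to one score per expert.
gate_w = rng.standard_normal((d_model, n_experts)) * 0.1

def moe_layer(x):
    """Route a single token vector to its top-k experts and mix their outputs."""
    scores = x @ gate_w                    # one score per expert
    top = np.argsort(scores)[-top_k:]      # indices of the k highest-scoring experts
    weights = np.exp(scores[top])
    weights /= weights.sum()               # softmax over the selected experts only
    # Only the chosen experts run: per-token compute scales with top_k,
    # while the parameter count scales with n_experts.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

token = rng.standard_normal(d_model)
out = moe_layer(token)
print(out.shape)  # (8,)
```

Production libraries like Tutel add the hard parts this sketch omits: batched dispatch, load-balancing losses, and expert parallelism across GPUs.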
TheSequence Scope is our Sunday free digest. To receive high-quality educational content about the most relevant concepts, research papers and developments in the ML world every Tuesday and Thursday, please subscribe to TheSequence Edge.
🗓 Next week in TheSequence Edge:
Edge#145: we discuss model observability and how it differs from model monitoring; we explore MLTrace, a reference architecture for observability in ML pipelines; we give an overview of Arize AI, which lays the foundation for ML observability.
Edge#146: we take a deep dive into the Arize AI ML observability platform.
Now, let's review the most important developments in the AI industry this week.
🔎 ML Research
Deep Learning Demystified
The team from Walmart Labs published a remarkable blog post explaining the mathematical and computer science foundations of deep learning → read more on Walmart Global Tech blog
Predictive Text Selection and Federated Learning
Google Research published a blog post detailing how they used federated learning to improve the Smart Text Selection feature in Android → read more on Google Research blog
Safety Envelopes in Robotic Interactions
Carnegie Mellon University published a paper detailing a probabilistic technique for inferring surfaces that guarantee the safety of robots while interacting with objects in an environment → read more on Carnegie Mellon University blog
🤖 Cool AI Tech Releases
Tutel
Microsoft Research open-sourced Tutel, a high-performance mixture of experts (MoE) library for training massively large deep learning models → read more on Microsoft Research blog
GauGAN2
NVIDIA released a demo showcasing its GauGAN2 model, which can generate images from textual input → read more on NVIDIA blog
💸 Money in AI
For ML&AI:
Open-source neural search company Jina.ai raised a $30 million Series A funding round led by Canaan Partners. Hiring in Berlin/Germany, Beijing and Shenzhen/China.
AI-powered transcription company Verbit raised $250 million in a Series E round led by Third Point Ventures. Hiring in Tel Aviv/Israel, New York/US, and Kyiv/Ukraine.
Intelligent computing platform for digital R&D Rescale raised $105 million in an expanded Series C funding round. Hiring globally.
AI-powered e-commerce fulfillment startup Deliverr raised $250 million in a Series E funding round led by Tiger Global. Hiring mostly remote.
Medical platform for AI and Visualization LifeVoxel raised $5 million in a seed round. Hiring in the US and Canada.
IPO