🚜 🛵 Using Transformers in Mainstream Deep Learning Applications
📝 Editorial
Not a week goes by in which we don’t learn about new, marvelous applications of transformers across deep learning domains. Considered one of the most important breakthroughs in the deep learning space in recent years, transformers have gone on to establish remarkable milestones in domains such as natural language understanding (NLU) and computer vision. However, implementing transformer applications remains a privilege of the big technology firms and AI research labs that have access to vast data and compute resources. Making transformers more accessible to mainstream deep learning applications is one of the most interesting challenges for the next few years of practical deep learning.
The news is not all bad. 😉 There are many ongoing efforts to simplify the process of incorporating transformers into mainstream deep learning applications. Just this week, startup Hugging Face raised an impressive $40M financing round to continue its efforts to advance NLU research with a specific focus on transformer architectures. In recent years, Hugging Face has become one of the most popular platforms for using transformer models in NLU programs (we discussed Hugging Face in a previous issue of TheSequence). The new funding should help expand the use of transformers into other domains such as computer vision or time-series analysis. Certainly, we should expect transformers to become one of the most popular architectures in modern, mainstream deep learning applications.
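To give a sense of just how accessible the library has made transformer models, here is a minimal sketch using Hugging Face's Transformers `pipeline` API (this assumes the `transformers` package and a backend such as PyTorch are installed; the default model for the task is downloaded on first use):

```python
from transformers import pipeline

# A pipeline bundles tokenizer + pretrained transformer + post-processing
# behind a single call; "sentiment-analysis" is one of the built-in tasks.
classifier = pipeline("sentiment-analysis")

# Returns a list of dicts, one per input, each with a "label" and a "score".
result = classifier("Transformers make NLU accessible to everyone.")
print(result)
```

Three lines of code stand between a mainstream application and a state-of-the-art pretrained model, which is exactly the kind of accessibility the new funding should extend to other domains.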
🔺🔻TheSequence Scope – our Sunday edition with the industry’s development overview – is free. To receive high-quality content about the most relevant developments in the ML world every Tuesday and Thursday, please subscribe to TheSequence Edge 🔺🔻
🗓 Next week in TheSequence Edge:
Edge#71: the concept of Differentiable Neural Architecture Search; how Facebook FBNet uses NAS to produce efficient CNNs; Google’s AdaNet – a lightweight AutoML framework for TensorFlow.
Edge#72: deep dive into Tecton, which was inspired by Uber’s Michelangelo, to build an enterprise-grade feature store platform.
Now, let’s review the most important developments in the AI industry this week.
🔎 ML Research
Understanding Generalization in Deep Learning
Google Research published a paper about a new framework that uses online optimization techniques to better understand generalization in deep neural networks ->read more on Google Research blog
Adversarial Environment Generation
Google Research published a paper proposing an algorithm that uses adversarial dynamics between multiple agents to generate robust training environments ->read more on Google Research blog
A New Algorithm for Robust Reinforcement Learning Problems
Researchers from the Berkeley AI Research (BAIR) lab published a paper unveiling a new algorithm for robust reinforcement learning, designed to adapt to drastic changes in environments ->read more on BAIR blog
🤖 Cool AI Tech Releases
PyTorch 1.8
The new version of the PyTorch framework was released this week. The new release includes enhancements in areas such as distributed training and mobile deep learning as well as a good number of new libraries ->read more in this blog post from the PyTorch team
New Alexa Prize Challenge
Amazon announced a new Alexa Prize challenge designed to build chatbots that can operate well in multitasking environments ->read more in the Amazon press release
MolGX
IBM Research released IBM Molecule Generation Experience (MolGX), a cloud platform that uses machine learning to help with the design of new molecular structures that can help discover new materials ->read more on IBM Research blog
💸 Money in AI
ML industry
Hugging Face closed a $40 million Series B funding round. The startup is an open-source provider of natural language processing (NLP) technologies. Their open-source framework Transformers has been downloaded over a million times, amassing over 42,000 stars and 10,000 forks on GitHub. It’s one of the engines that moves the AI industry forward.
AI implementation
Intelligent process automation provider WorkFusion raised $220 million. The company built its proprietary cloud-federated learning technology: AI bots learn in real time from data and end users with “no-code” simplicity, then further aggregate and share those learnings across the bot ecosystem.
The cloud data governance and security startup Privacera raised $50 million. The company leverages an open-source AI and ML library for natural language processing in order to automate the discovery of personally identifiable data and to tackle the data privacy and security challenges faced by large enterprises.
AI-powered cancer diagnostics startup Ibex Medical Analytics raised $38 million in funding. It creates AI solutions to detect misdiagnosed and mis-graded cancers in digitized slides, guiding pathologists to areas of cancer in support of a prompt review. It also develops AI-markers for prognostic and predictive applications used in cancer management and drug development.
Capital One Ventures invested $24 million in Securonix, a security startup that reduces noise and prioritizes high-fidelity alerts with the behavioral analytics technology that pioneered the UEBA category. It invests heavily in AI and machine learning for greater automation to meet the growing pace of cyberattacks.
AI-powered bookkeeping startup Zeni raised $13.5 million in a Series A round. Zeni leverages a blend of artificial intelligence and human finance experts to perform daily bookkeeping and manage the different financial needs of a startup.
AI-powered HRtech startup retrain.ai raised a $9 million Series A round. The company leverages AI and ML to help organizations unlock effective talent intelligence and upskill their employees to stay ahead of the curve.