Edge#Recap1: key topics
TheSequence is a convenient way to build and reinforce your knowledge about machine learning and AI
As we approach the end of 2020, we decided to provide a summary of some key topics covered in TheSequence. In the first 50 issues, we have discussed very current topics in the machine learning and deep learning universe. Before we start with a new set of topics next year, let's recap some of the most important concepts, each complemented by the relevant research paper and tech solution. Catch up with what you missed and prepare for the next year!
Automated Machine Learning
AutoML and ideas for automating the creation of machine learning models are becoming a super hot topic. We covered that topic in the following editions of TheSequence Edge:
Edge#4: Neural Architecture Search
we look at Neural Architecture Search (NAS), which produces architectures that equal or outperform hand-designed ones;
we explain the research paper “A Survey on Neural Architecture Search” and how it helps to understand NAS;
we speak about Uber's Ludwig toolbox that lowers the entry point for developers by enabling the training and testing of ML models without writing any code.
we explain the concept of AutoML;
we discuss AutoML-Zero, which proposes a method to expand the frontiers of AutoML models;
we speak about TransmogrifAI, an open-source framework that Salesforce.com used to build Einstein.
we discuss the concept of meta-learning;
we explore Berkeley AI Research Lab's famous paper about an algorithm for meta-learning that is model-agnostic;
we deep dive into Comet.ml, which many people call the GitHub of machine learning.
Generative Models
How do you generate synthetic data that looks just like the real thing? How about networks competing with each other to improve learning? Check out the following issues of TheSequence Edge that cover topics related to generative models.
we evaluate the concept of generative models that gave us GANs;
we discuss Microsoft's Optimus, one of the first generative models that can be used in super large-scale language tasks;
we dive deep into ART, an open-source framework that uses generative models for protecting neural networks.
Edge#8: Generative Adversarial Neural Networks
we explain the concept of GANs;
we walk you through the original GANs paper by Ian Goodfellow;
we discuss TF-GAN, one of the most popular libraries for implementing GANs.
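The adversarial setup behind GANs can be illustrated with a tiny NumPy sketch (a toy model, not from any of the papers above; every name and hyperparameter here is illustrative): a linear generator tries to map standard Gaussian noise onto real data drawn from N(3, 1), while a logistic-regression discriminator tries to tell real samples from generated ones.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Real data comes from N(3, 1); the generator starts far away at N(0, 1).
a, b = 1.0, 0.0        # generator g(z) = a*z + b
w, c = 0.0, 0.0        # discriminator D(x) = sigmoid(w*x + c)

lr, batch = 0.05, 64
for _ in range(2000):
    real = rng.normal(3.0, 1.0, batch)
    z = rng.normal(0.0, 1.0, batch)
    fake = a * z + b

    # Discriminator step: ascend log D(real) + log(1 - D(fake)).
    d_real, d_fake = sigmoid(w * real + c), sigmoid(w * fake + c)
    w += lr * (np.mean((1 - d_real) * real) - np.mean(d_fake * fake))
    c += lr * (np.mean(1 - d_real) - np.mean(d_fake))
    # Mild weight decay damps the oscillation this toy game is prone to.
    w *= 0.99
    c *= 0.99

    # Generator step: ascend log D(fake) (the non-saturating loss).
    d_fake = sigmoid(w * fake + c)
    a += lr * np.mean((1 - d_fake) * w * z)
    b += lr * np.mean((1 - d_fake) * w)

print(round(b, 2))  # the generator's offset drifts toward the real mean of 3
```

The two updates chase each other: the discriminator's gradient points the generator toward the region it currently labels as real, which is exactly the dynamic the original GANs paper formalizes as a minimax game.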
Feature Engineering
Considered by many to be one of the most important topics in real-world machine learning solutions, feature selection and engineering is a never-ending area of research and innovation. You can learn more about it in the following issues.
Edge#10: Feature Extraction vs. Feature Selection
we explain the difference between feature extraction and feature selection;
we explore a feature visualization method known as Activation Atlases;
we review the Hopsworks feature store platform.
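The distinction can be made concrete with a small NumPy sketch (illustrative only): feature selection keeps a subset of the original columns (here, the highest-variance ones), while feature extraction builds new features as combinations of all columns (here, a one-component PCA via SVD).

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))
X[:, 2] *= 5.0            # make column 2 the highest-variance feature
X[:, 3] *= 0.1            # make column 3 nearly constant

# Feature selection: keep the k original columns with the largest variance.
k = 2
keep = np.argsort(X.var(axis=0))[-k:]          # indices of retained columns
X_selected = X[:, keep]

# Feature extraction: project onto the top principal component,
# a brand-new feature that mixes all original columns.
Xc = X - X.mean(axis=0)
_, _, vt = np.linalg.svd(Xc, full_matrices=False)
X_extracted = Xc @ vt[:1].T                    # shape (200, 1)

print(sorted(keep.tolist()), X_selected.shape, X_extracted.shape)
```

Selection preserves the original, interpretable columns; extraction can pack more signal into fewer dimensions at the cost of interpretability.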
Edge#25: Representation Learning
we present the concept of representation learning;
we overview the paper from Microsoft Research about representation and multi-task learning in language;
we explore Facebook's fastText framework.
Practical Machine LearningÂ
TheSequence Edge regularly discusses topics related to best practices and techniques to run machine learning models in production at scale. Here are some of our favorites.
we discuss the concept of parallel training;
we review a famous OpenAI research paper that proposes GNS (gradient noise scale), a metric to measure training scalability;
we explore Horovod, the parallel training framework created by Uber.
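The core idea of data-parallel training, which frameworks like Horovod implement with efficient allreduce communication, can be sketched in a few lines of NumPy (the workers below are simulated in-process, not real distributed code): each worker computes the gradient on its shard of the batch, and averaging the local gradients reproduces the full-batch gradient.

```python
import numpy as np

rng = np.random.default_rng(2)

# A linear model y = X @ theta with mean-squared-error loss.
def gradient(X, y, theta):
    return 2.0 * X.T @ (X @ theta - y) / len(y)

X = rng.normal(size=(128, 3))
y = rng.normal(size=128)
theta = np.zeros(3)

# Single-worker gradient on the full batch.
full_grad = gradient(X, y, theta)

# Data parallelism: shard the batch across 4 simulated workers,
# compute local gradients, then average them (the "allreduce" step).
shards_X = np.array_split(X, 4)
shards_y = np.array_split(y, 4)
local_grads = [gradient(Xs, ys, theta) for Xs, ys in zip(shards_X, shards_y)]
avg_grad = np.mean(local_grads, axis=0)

print(np.allclose(full_grad, avg_grad))  # prints True: equal shards make the average exact
```

Because the math is exact for equal-sized shards, scaling out mainly becomes a communication problem, which is why allreduce performance is central to frameworks like Horovod.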
we explain the concept of model serving;
we review a paper in which Google Research outlined the architecture of a serving pipeline for TensorFlow models;
we discuss MLflow, one of the most complete machine learning lifecycle management frameworks on the market.
we discuss the concept of machine learning operations (MLOps);
we review Google's TFX paper;
we provide an overview of some of the top technologies in the MLOps space.
Emerging Learning Methods
Modern machine learning goes beyond supervised and unsupervised learning. Check out some of the new learning paradigms that are likely to become more relevant in the near future.
Edge#14: Semi-Supervised Learning
we discuss the concept of semi-supervised learning;
we deep dive into a paper that proposes a data augmentation method to advance semi-supervised learning;
we explore Labelbox, a fast-growing platform for data labeling.
Edge#26: Self-Supervised Learning
we explain the concept of self-supervised learning;
we overview the self-supervised method for image classification proposed by Facebook;
we explore Google's SimCLR framework for advancing self-supervised learning.
we present the concept of contrastive learning;
we explore Google's research on view selection for contrastive learning;
we review Uber's impressive open-source machine learning contributions.
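The intuition behind contrastive learning, as in SimCLR, is an objective that pulls embeddings of two views of the same example together while pushing different examples apart. A minimal NumPy version of such an InfoNCE-style loss (a simplification; SimCLR's actual NT-Xent loss also contrasts views within each branch, and the function name here is our own) looks like this:

```python
import numpy as np

def info_nce(z1, z2, temperature=0.5):
    """Contrastive loss between two batches of embeddings.

    z1[i] and z2[i] are two views of the same example (the positive pair);
    every z2[j] with j != i serves as a negative for z1[i].
    """
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature           # cosine similarities
    # Cross-entropy with the matching index as the target class.
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(3)
anchors = rng.normal(size=(8, 16))
aligned = anchors + 0.01 * rng.normal(size=(8, 16))   # views close to their anchors
mismatched = np.roll(aligned, 1, axis=0)              # positive pairs destroyed

print(info_nce(anchors, aligned) < info_nce(anchors, mismatched))  # aligned pairs give the lower loss
```

Minimizing this loss forces an encoder to produce representations in which views of the same example are nearest neighbors, which is the mechanism behind SimCLR-style pretraining.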
In the next issue, we'll summarize the rest of the important topics covered in TheSequence Edge. That's a great way to catch up and get ready for further exploration of the fascinating AI and ML world.
Thank you for your shares and likes. They help spread knowledge about practical implementations of AI, accelerating its adoption across industries.
You can also forward this email to those who want to stay up to date with developments in the AI & ML industry. Edge#2, #4, and #11 from the Automated Machine Learning section do not require a subscription. Thank you.
TheSequence is a summary of groundbreaking ML research papers, engaging explanations of ML concepts, and exploration of new ML frameworks and platforms. TheSequence keeps you up to date with the news, trends, and technology developments in the AI field.
5 minutes of your time, 3 times a week, and you will steadily become knowledgeable about everything happening in the AI space.