✨ Edge#150: Microsoft’s SynapseML is a New Framework for Large Scale Machine Learning
What’s New in AI is a deep dive into one of the freshest research papers or technology frameworks that are worth your attention. Our goal is to keep you up to date with new developments in AI to complement the concepts we debate in other editions of our newsletter.
💥 What’s New in AI: Microsoft’s SynapseML is a New Framework for Large Scale Machine Learning
Building large-scale ML solutions is nothing short of a nightmare. Even with the perfect architecture, highly scalable ML pipelines typically require combining many infrastructure platforms and frameworks that were never designed to integrate seamlessly with each other. Orchestrating these different ML tools is challenging even for the most experienced ML developers.
To illustrate the challenge of large-scale, distributed ML, consider the scenario of distributed evaluation of a deep neural network. The process involves many complicated steps: distributing the model across the nodes of a cluster, monitoring GPU utilization, and gathering the results. Frameworks like Horovod or SparkML can handle that task effectively, but there is no consistent developer experience across the different stacks, which makes comparing results a tedious exercise. Microsoft Research open-sourced a new framework designed to address this challenge.
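To make the orchestration burden concrete, here is a minimal sketch of what hand-rolled distributed model evaluation can look like with vanilla PySpark and ONNX Runtime. This is not SynapseML code; the model path, dataset paths, and column names are hypothetical placeholders, and it assumes onnxruntime is installed on every worker. The point is simply how much glue code a unified framework would remove.

```python
# Hand-rolled distributed inference with plain PySpark + ONNX Runtime.
# Model path, dataset paths, and column names are hypothetical placeholders.
# onnxruntime must be installed on every executor -- part of the orchestration burden.
import numpy as np
import pandas as pd
import onnxruntime as ort
from pyspark.sql import SparkSession
from pyspark.sql.functions import pandas_udf

spark = SparkSession.builder.appName("manual-distributed-eval").getOrCreate()

# 1. Ship the serialized model to every executor ourselves.
spark.sparkContext.addFile("model.onnx")  # hypothetical model file

@pandas_udf("double")
def score(features: pd.Series) -> pd.Series:
    # 2. Each executor loads its own inference session from the shipped file.
    from pyspark import SparkFiles
    session = ort.InferenceSession(SparkFiles.get("model.onnx"))
    input_name = session.get_inputs()[0].name
    batch = np.stack(features.to_numpy()).astype(np.float32)
    # 3. Run inference batch by batch and hand the results back to Spark.
    preds = session.run(None, {input_name: batch})[0]
    return pd.Series(preds.ravel().astype(float))

# 4. Gather the distributed predictions into a single output.
df = spark.read.parquet("features.parquet")            # hypothetical input
scored = df.withColumn("prediction", score("features"))
scored.write.parquet("predictions.parquet")             # hypothetical output
```

Every step above (shipping the model, managing executor-side dependencies, batching, collecting results) has to be wired together manually, and each alternative stack does it differently.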
SynapseML
SynapseML is the new version of MMLSpark, an open-source library designed from the ground up to implement massively scalable ML pipelines. Functionally, SynapseML expands the Apache Spark ecosystem by unifying several existing ML frameworks and new Microsoft algorithms into a single, composable API that can be used from Python, Scala, R, and Java.
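As a rough sketch of that developer experience, the snippet below trains a distributed LightGBM model through SynapseML's Spark estimator API. The dataset path, column names, and the exact Maven coordinate and version are assumptions; check the SynapseML installation docs for the coordinates matching your Spark build.

```python
# A minimal SynapseML sketch: distributed LightGBM training as a Spark ML estimator.
# Dataset path, column names, and the package version below are assumptions.
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler

spark = (
    SparkSession.builder
    .appName("synapseml-sketch")
    # Maven coordinate/version is an assumption; use the one matching your Spark version.
    .config("spark.jars.packages", "com.microsoft.azure:synapseml_2.12:0.11.4")
    .getOrCreate()
)

from synapse.ml.lightgbm import LightGBMClassifier

df = spark.read.parquet("training_data.parquet")  # hypothetical dataset
assembler = VectorAssembler(inputCols=["f1", "f2", "f3"], outputCol="features")
train = assembler.transform(df)

# The estimator plugs into the familiar Spark ML fit/transform workflow,
# while training is distributed across the cluster.
model = LightGBMClassifier(labelCol="label", featuresCol="features",
                           numIterations=100).fit(train)
predictions = model.transform(train)
predictions.select("label", "prediction").show(5)
```

The appeal is that the same fit/transform pattern applies whether the underlying engine is LightGBM, ONNX, or another framework that SynapseML wraps, which is exactly the consistency missing from the hand-rolled approach sketched earlier.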