👑 The GPT-3 Influence Factor

The Scope covers the most relevant ML papers, real-world ML use cases, cool tech releases, and $ in AI. Weekly

📝 Editorial 

It is not very common for an individual artificial intelligence (AI) system to power a whole new generation of models. This is partly because, for decades, it was nearly impossible to reuse capabilities across AI models: the standard practice in traditional supervised learning was to train new models from scratch every time a new task had to be mastered. This started to change with the advent of pretrained models, and with what may be their most prominent example: GPT-3.

OpenAI’s GPT-3 is one of the most famous AI models ever created. By setting new milestones across natural language processing (NLP) tasks such as text completion, question answering, and summarization, GPT-3 has become one of the best examples of the power of massively large deep learning systems. Beyond its impressive capabilities, one of the most fascinating things about GPT-3 is its influence on a new generation of equally impressive models.

Since the release of GPT-3, OpenAI has been very active in leveraging the pretrained model to achieve new milestones in challenging domains. The influence of GPT-3 on recent OpenAI research releases is remarkable. Codex builds on the foundation of GPT-3 for code generation. DALL-E uses it to generate images from natural language sentences. Just this week, OpenAI unveiled a GPT-3-based model able to generate high-level summaries of entire books.

As language is a foundational aspect of human cognition, models like GPT-3 that have mastered language are becoming the key building blocks for a new generation of AI capabilities.  



🔺🔻TheSequence Scope – our Sunday edition with the industry’s development overview – is free. To receive high-quality content about the most relevant developments in the ML world every Tuesday and Thursday, please subscribe to TheSequence Edge 🔺🔻

🗓 Next week in TheSequence Edge:

Edge#127: we discuss Self-Supervised Learning as Contrastive Learning; we cover SimCLR, an open-source framework for contrastive learning.

Edge#128: deep dive into Wu Dao, the Biggest Transformer Model in History. 


Now, let’s review the most important developments in the AI industry this week.

🔎 ML Research

Book Summarization 

OpenAI published a paper detailing a GPT-3-based model for book summarization →read more on OpenAI blog

Generating Images with Never Seen Concepts 

Facebook AI Research (FAIR) published a paper proposing a GAN model that can generate high-quality images of concepts it has never seen before →read more on FAIR blog

Direct Speech-to-Speech Translation 

Google Research published a paper introducing the second version of Translatotron, a model that can directly translate speech between two languages without the need for intermediary subsystems →read more on Google Research blog


🛠 Real World ML

Real-Time Ad Data Processing at Uber Eats 

The Uber engineering team published a blog post detailing the data and analytics architecture powering ads in the Uber Eats app →read more on Uber blog

Data Consumption at Scale 

The Airbnb engineering team published a blog post detailing how their internal metrics architecture, called Minerva, can enable data consumption for different user profiles →read more on Airbnb blog


👩‍💻 Jobs!

Do you want to build a robust ML infrastructure for feature discovery and data quality monitoring and testing? Architect systems that will work with massive unstructured and structured data sets? Design novel approaches for handling high-volume real-time data streams for ML applications? Tecton is hiring a Software Engineer, ML!

Apply here


🤖 Cool AI Tech Releases

Wikipedia Image-Text Dataset 

Google Research open-sourced a Wikipedia-based image-text dataset to train multimodal vision-language systems →read more on Google Research blog


🗯 Useful Tweet

We find the best for you

Follow us on Twitter


💸 Money in AI

  • Document automation platform Ocrolus raised $80 million in Series C funding led by Fin VC. Hiring.

  • Data observability platform Bigeye raised $45 million in Series B funding led by Coatue. Hiring remote.

  • Supply chain intelligence platform Altana AI raised $15 million in a Series A funding round led by GV. Hiring in the US: Brooklyn, NY and Washington, DC.

  • Workflow automation startup Daylight (formerly FormHero) raised $12.3 million in a Series A round led by RTP Global, Bessemer Venture Partners, and Golden Ventures.

  • Inventory optimization startup Flieber raised $12 million in a Series A round co-led by GGV Capital and Monashees. Hiring in New York, US.

  • Data privacy platform Osano raised $11 million in a funding round led by Jump Capital. Hiring remote.

  • Disinformation intelligence platform Blackbird.AI raised $10 million in Series A fundraising led by Dorilton Ventures. Hiring in New York, US.

Do you like TheSequence? What is most helpful? What do we miss? Send us some feedback by replying to this email. We always appreciate your thoughts.