TheSequence
✍🏽 Edge#131: Self-Supervised Learning for Language

Plus XLM-R and Facebook's fastText

Oct 12, 2021


In this issue:

  • we discuss Self-Supervised Learning for Language;

  • we explore XLM-R, one of the most powerful SSL cross-lingual models ever built;

  • we cover Facebook’s fastText, a library for representation learning in language tasks.



💡 ML Concept of the Day: Self-Supervised Learning for Language 

Continuing our series about self-supervised learning (SSL), we would like to cover its applications in language. Without a doubt, natural language processing (NLP) has been the domain in which SSL models have excelled the most. If you think about the entire wave of transformer models (see The Sequence’s series about Transformers), they can all be considered SSL methods by nature. As SSL evolves, many of the lessons from NLP methods have been extrapolated to other domains such as computer vision. However, NLP remains the most advanced application area for SSL techniques.
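
To make the idea concrete, here is a minimal sketch (not from the original issue) of masked language modeling, the self-supervised objective behind BERT-style transformers: the training signal comes from tokens the model hides from itself rather than from human annotations. It assumes the Hugging Face transformers library and the publicly available bert-base-uncased checkpoint; the sentence below is just an illustrative example.

```python
# Minimal sketch of masked language modeling, the canonical SSL objective in NLP.
# Assumes the Hugging Face `transformers` library and the public
# `bert-base-uncased` checkpoint.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# The model must recover the hidden word from the surrounding context alone,
# so the "label" is generated from the text itself rather than by annotators.
for prediction in fill_mask("Self-supervised learning lets models learn from [MASK] data."):
    print(f"{prediction['token_str']:>12}  score={prediction['score']:.3f}")
```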

What makes NLP such an ideal domain for SSL techniques?  

This post is for paid subscribers
