🌐 🕸 Graph Neural Networks Recap
Last week we finished our mini-series about Graph Neural Networks, an important topic. Here is a full recap to help you catch up on everything we covered. As the proverb (and many ML people) goes: repetition is the mother of learning ;)
💡Graph Neural Networks – the different types and concepts behind GNNs
Graphs are one of the most common data structures for describing complex relationships between entities. Social networks and search engines are among the key drivers that have fast-tracked the adoption of graph data structures. Not surprisingly, there has been huge demand for deep learning models that can natively learn from graph structures.
Most modern deep learning models, such as recurrent neural networks (RNNs) or convolutional neural networks (CNNs), are fundamentally designed to work on tabular, vector-based structures and struggle when presented with graph datasets. GNNs are a new area of deep learning focused on tackling this problem. Initially created…→subscribe and keep reading Edge#195. In this issue, we also observe how DeepMind showcases the potential of GNNs and discuss Deep Graph Library, a framework for implementing GNNs.
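To make the contrast with vector-based models concrete, here is a minimal, illustrative sketch (not from the newsletter, and independent of any GNN library) of the core message-passing idea: each node updates its features by aggregating its neighbors' features through a normalized adjacency matrix and a shared weight matrix. The toy graph, feature sizes, and random weights below are all assumptions for demonstration.

```python
import numpy as np

# One graph-convolution-style step: H' = ReLU(A_norm @ H @ W),
# where A_norm is the adjacency matrix with self-loops, symmetrically
# normalized by node degree. Weights are random stand-ins for learned ones.

rng = np.random.default_rng(0)

# Toy undirected graph with 4 nodes and edges 0-1, 1-2, 2-3
A = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

A_hat = A + np.eye(4)                       # add self-loops
deg = A_hat.sum(axis=1)
D_inv_sqrt = np.diag(deg ** -0.5)
A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt    # symmetric degree normalization

H = rng.normal(size=(4, 8))                 # 4 nodes, 8 input features each
W = rng.normal(size=(8, 4))                 # shared transform to 4 features

H_next = np.maximum(A_norm @ H @ W, 0.0)    # aggregate neighbors, then ReLU
print(H_next.shape)                         # (4, 4): one row per node
```

Because the aggregation is driven by the graph's own adjacency structure rather than a fixed grid or sequence, the same layer works on graphs of any shape, which is exactly what CNNs and RNNs cannot do natively.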
Forward this email to those who might benefit from reading it or give a gift subscription.
In Edge#197 (read it without a subscription), we overview the types of graph learning tasks; dive into the original GNN paper; explore Deep Graph Library, a framework for implementing GNNs.
In Edge#199, we discuss building blocks and types of GNN architectures; explain GraphWorld, which provides insights about how to test GNNs; explore Spektral, a library for building GNNs in Keras and TensorFlow.
In Edge#201, we explain Graph Convolutional Neural Networks; overview the original GCN Paper; explore PyTorch Geometric, one of the most complete GNN frameworks available today.
In Edge#203, we explain what Graph Recurrent Neural Networks are, discuss GNNs on Dynamic Graphs, explore DeepMind’s Jraph, a GNN Library for JAX.
In Edge#205, we explain graph attention networks; discuss the original GAT paper; explore TF-GNN, a library for implementing GNNs in TensorFlow.
Next week we are starting a series about ML testing. Super interesting! Remember: by reading TheSequence Edges regularly, you become smarter about ML and AI five minutes at a time 🤓