Recurrent GNNs
We'll start this section by looking at graph neural networks (GraphNNs; see The Graph Neural Network Model, https://ieeexplore.ieee.org/document/4700287). Although the authors of the paper abbreviated the model to GNN, we'll refer to it with the GraphNN acronym to avoid a conflict with the GNN abbreviation, which we reserve for the general class of graph networks. This was one of the first GNN models to be proposed, and it extends existing NNs to process graph-structured data. In the same way that we used the context of a word (that is, its surrounding words) to create embedding vectors (Chapter 6, Language Modeling), we can use the neighboring nodes of a graph node to do the same. GraphNNs aim to create an s-dimensional vector state ...
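To make the idea concrete, here is a minimal sketch of the GraphNN recurrent state update: each node's state vector is repeatedly recomputed from its neighbors' states and its own features until the states converge to a fixed point. The dimensions, the linear-plus-tanh transition function, and all variable names here are illustrative assumptions, not the exact parameterization from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

n_nodes, feat_dim, state_dim = 5, 3, 4
features = rng.normal(size=(n_nodes, feat_dim))  # one feature vector per node

# Symmetric adjacency matrix of a small undirected example graph
adjacency = np.array([[0, 1, 1, 0, 0],
                      [1, 0, 1, 0, 0],
                      [1, 1, 0, 1, 0],
                      [0, 0, 1, 0, 1],
                      [0, 0, 0, 1, 0]], dtype=float)

# Transition weights; the small scale keeps the update contractive,
# so repeated application converges to a fixed point
W_state = 0.05 * rng.normal(size=(state_dim, state_dim))
W_feat = rng.normal(size=(feat_dim, state_dim))

states = np.zeros((n_nodes, state_dim))
for step in range(100):
    # Sum the states of each node's neighbors, then mix in its own features
    neighbor_sum = adjacency @ states
    new_states = np.tanh(neighbor_sum @ W_state + features @ W_feat)
    if np.max(np.abs(new_states - states)) < 1e-6:
        break
    states = new_states

print(states.shape)  # one s-dimensional state vector per node
```

In the original model, the converged states are then fed to a separate output function to produce per-node (or whole-graph) predictions, and both functions are trained end to end.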