Deep Learning Models for Graphs
In recent years, the field of machine learning has witnessed a paradigm shift with the emergence of graph neural networks (GNNs) as powerful tools for prediction tasks on graph-structured data. Here, we'll delve into the transformative potential of GNNs, highlighting their role as optimizable transformations that can handle diverse graph attributes (nodes, edges, and global context) while preserving crucial graph symmetries, particularly permutation invariance.
The foundation of GNNs lies in the message-passing neural network (MPNN) framework. In this framework, each node exchanges information with its neighbors and aggregates the incoming messages, enabling the model to capture intricate relationships and dependencies within the data.
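To make the idea concrete, here is a minimal sketch of a single round of message passing on a toy graph. The graph, features, and the simple sum-and-add update rule are illustrative assumptions, not the definition used by any particular GNN variant:

```python
import numpy as np

# Toy graph: adjacency[i][j] = 1 means node j is a neighbor of node i.
adjacency = np.array([
    [0, 1, 1, 0],
    [1, 0, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 0],
], dtype=float)

# Initial node features: one 2-dimensional embedding per node.
features = np.array([
    [1.0, 0.0],
    [0.0, 1.0],
    [1.0, 1.0],
    [0.5, 0.5],
])

def message_passing_step(adj, feats):
    """One round of message passing: each node sums its neighbors'
    features (the messages) and combines them with its own features."""
    messages = adj @ feats   # aggregate: sum over neighbors
    return feats + messages  # update: add aggregated messages to own features

updated = message_passing_step(adjacency, features)
print(updated)  # node 0 becomes [2.0, 2.0]: its own [1, 0] plus neighbors [0, 1] + [1, 1]
```

Real GNN layers replace the sum with learned, parameterized aggregation and update functions, but the exchange-then-aggregate pattern is the same.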
One distinctive feature of GNNs is their adherence to a graph-in, graph-out architecture. The model accepts a graph as input, with information attached to its nodes, edges, and global context, and produces a graph with the same structure as output. This design aligns with many real-world problems where data exhibits complex relationships and dependencies best represented as graphs.
GNNs excel at progressively transforming the embeddings of the input graph without altering its connectivity. Each successive transformation refines the model's representation of the underlying patterns and structures in the data, contributing to enhanced predictive capability.
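This property can be sketched in a few lines: stacking layers repeatedly refines the node embeddings, while the adjacency structure passed through every layer is never modified. The layer function below (a neighbor sum, a residual connection, and a tanh nonlinearity) is a simplified stand-in for a real GNN layer:

```python
import numpy as np

rng = np.random.default_rng(0)

# A fixed 3-node path graph and random initial 4-dimensional embeddings.
adjacency = np.array([[0, 1, 0],
                      [1, 0, 1],
                      [0, 1, 0]], dtype=float)
embeddings = rng.standard_normal((3, 4))

def layer(adj, emb, weight):
    """One illustrative layer: aggregate neighbors, add a residual,
    apply a learned linear map and a nonlinearity."""
    return np.tanh((adj @ emb + emb) @ weight)

# Stack three layers; only the embeddings change from layer to layer.
weights = [rng.standard_normal((4, 4)) for _ in range(3)]
adj_before = adjacency.copy()
for w in weights:
    embeddings = layer(adjacency, embeddings, w)

# The connectivity is untouched; only the embeddings were refined.
assert np.array_equal(adjacency, adj_before)
print(embeddings.shape)  # prints (3, 4)
```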
We'll cover the following topics in this chapter:
- Message passing in graphs
- Decoding GNNs
- Graph convolutional networks (GCNs)
- Graph Sample and Aggregation (GraphSAGE)
- Graph attention networks (GATs)