Hands-On Graph Neural Networks Using Python

Practical techniques and architectures for building powerful graph and deep learning apps with PyTorch

Product type: Paperback
Published: April 2023
Publisher: Packt
ISBN-13: 9781804617526
Length: 354 pages
Edition: 1st

Author: Maxime Labonne
Table of Contents

Preface
Part 1: Introduction to Graph Learning
Chapter 1: Getting Started with Graph Learning
Chapter 2: Graph Theory for Graph Neural Networks
Chapter 3: Creating Node Representations with DeepWalk
Part 2: Fundamentals
Chapter 4: Improving Embeddings with Biased Random Walks in Node2Vec
Chapter 5: Including Node Features with Vanilla Neural Networks
Chapter 6: Introducing Graph Convolutional Networks
Chapter 7: Graph Attention Networks
Part 3: Advanced Techniques
Chapter 8: Scaling Up Graph Neural Networks with GraphSAGE
Chapter 9: Defining Expressiveness for Graph Classification
Chapter 10: Predicting Links with Graph Neural Networks
Chapter 11: Generating Graphs Using Graph Neural Networks
Chapter 12: Learning from Heterogeneous Graphs
Chapter 13: Temporal Graph Neural Networks
Chapter 14: Explaining Graph Neural Networks
Part 4: Applications
Chapter 15: Forecasting Traffic Using A3T-GCN
Chapter 16: Detecting Anomalies Using Heterogeneous GNNs
Chapter 17: Building a Recommender System Using LightGCN
Chapter 18: Unlocking the Potential of Graph Neural Networks for Real-World Applications
Index
Other Books You May Enjoy

What this book covers

Chapter 1, Getting Started with Graph Learning, provides a comprehensive introduction to GNNs, including their importance in modern data analysis and machine learning. The chapter starts by exploring the relevance of graphs as a representation of data and their widespread use in various domains. It then delves into the importance of graph learning, including different applications and techniques. Finally, the chapter focuses on the GNN architecture and highlights its unique features and performance compared to other methods.

Chapter 2, Graph Theory for Graph Neural Networks, covers the basics of graph theory and introduces various types of graphs, including their properties and applications. This chapter also covers fundamental graph concepts such as the adjacency matrix, graph measures such as centrality, and graph algorithms such as Breadth-First Search (BFS) and Depth-First Search (DFS).
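
As a quick taste of these concepts, here is a minimal sketch using networkx: it builds a toy graph (illustrative, not taken from the chapter), prints its adjacency matrix and degree centrality, and runs a hand-written BFS.

```python
from collections import deque

import networkx as nx

# A small undirected graph: edges between labeled nodes.
G = nx.Graph()
G.add_edges_from([("A", "B"), ("A", "C"), ("B", "D"), ("C", "D"), ("D", "E")])

# The adjacency matrix: A[i][j] = 1 if nodes i and j are connected.
A = nx.to_numpy_array(G, nodelist=sorted(G.nodes))
print(A)

# Degree centrality: the fraction of other nodes each node is connected to.
print(nx.degree_centrality(G))

def bfs(graph, source):
    """Breadth-First Search: visit nodes in order of distance from source."""
    visited, queue = {source}, deque([source])
    order = []
    while queue:
        node = queue.popleft()
        order.append(node)
        for neighbor in graph.neighbors(node):
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(neighbor)
    return order

print(bfs(G, "A"))  # e.g., ['A', 'B', 'C', 'D', 'E']
```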

Chapter 3, Creating Node Representations with DeepWalk, focuses on DeepWalk, a pioneer in applying machine learning to graph data. The main objective of the DeepWalk architecture is to generate node representations that other models can utilize for downstream tasks such as node classification. The chapter covers two key components of DeepWalk – Word2Vec and random walks – with a particular emphasis on the Word2Vec skip-gram model.
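
To illustrate the idea, here is a hedged sketch of the DeepWalk pipeline on Zachary's Karate Club using networkx and gensim; the walk count and Word2Vec hyperparameters are illustrative choices, not the chapter's.

```python
import random

import networkx as nx
from gensim.models import Word2Vec

G = nx.karate_club_graph()  # Zachary's Karate Club, a classic toy graph

def random_walk(graph, start, length):
    """Uniform random walk: at each step, move to a random neighbor."""
    walk = [start]
    for _ in range(length - 1):
        neighbors = list(graph.neighbors(walk[-1]))
        walk.append(random.choice(neighbors))
    return [str(node) for node in walk]  # Word2Vec expects strings

# Generate several walks per node, treating each walk as a "sentence".
walks = [random_walk(G, node, length=10) for node in G.nodes for _ in range(80)]

# Skip-gram model (sg=1) with hierarchical softmax (hs=1), as in DeepWalk.
model = Word2Vec(walks, vector_size=64, window=5, sg=1, hs=1, min_count=1)
print(model.wv["0"][:5])  # first five dimensions of node 0's embedding
```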

Chapter 4, Improving Embeddings with Biased Random Walks in Node2Vec, focuses on the Node2Vec architecture, which is based on the DeepWalk architecture covered in the previous chapter. The chapter covers the modifications made to the random walk generation in Node2Vec and how to select the best parameters for a specific graph. The implementation of Node2Vec is compared to DeepWalk on Zachary’s Karate Club to highlight the differences between the two architectures. The chapter concludes with a practical application of Node2Vec, building a movie recommendation system.
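
The following sketch shows the core of Node2Vec's biased walk, with the return parameter p and the in-out parameter q; the function names and parameter values are illustrative, not the chapter's code.

```python
import random

import networkx as nx

def next_node(graph, previous, current, p=1.0, q=1.0):
    """Pick the next node with Node2Vec's biased transition weights:
    1/p to return to `previous`, 1 for neighbors of `previous`, 1/q otherwise."""
    neighbors = list(graph.neighbors(current))
    weights = []
    for neighbor in neighbors:
        if neighbor == previous:
            weights.append(1 / p)       # go back where we came from
        elif graph.has_edge(neighbor, previous):
            weights.append(1.0)         # stay close (BFS-like behavior)
        else:
            weights.append(1 / q)       # move outward (DFS-like behavior)
    return random.choices(neighbors, weights=weights)[0]

def biased_walk(graph, start, length, p=1.0, q=1.0):
    walk = [start, random.choice(list(graph.neighbors(start)))]
    while len(walk) < length:
        walk.append(next_node(graph, walk[-2], walk[-1], p, q))
    return walk

G = nx.karate_club_graph()
print(biased_walk(G, 0, length=10, p=1, q=0.5))  # low q encourages exploration
```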

Chapter 5, Including Node Features with Vanilla Neural Networks, explores the integration of additional information, such as node and edge features, into the graph embeddings to produce more accurate results. The chapter starts with a comparison of vanilla neural networks’ performance on node features only, treated as tabular datasets. Then, we will experiment with adding topological information to the neural networks, leading to the creation of a simple vanilla GNN architecture.
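
A minimal sketch of such a vanilla graph layer, assuming the simple update H' = ÃXW with self-loops added to the adjacency matrix (the toy graph and dimensions are illustrative):

```python
import torch
from torch import nn

class VanillaGNNLayer(nn.Module):
    """One 'vanilla' graph layer: H' = A~ X W, where A~ = A + I so each
    node aggregates its neighbors' features along with its own."""
    def __init__(self, dim_in, dim_out):
        super().__init__()
        self.linear = nn.Linear(dim_in, dim_out, bias=False)

    def forward(self, x, adjacency):
        return adjacency @ self.linear(x)

# Toy example: 4 nodes with 3 features each on a small undirected graph.
x = torch.rand(4, 3)
adjacency = torch.tensor([[0, 1, 1, 0],
                          [1, 0, 0, 1],
                          [1, 0, 0, 1],
                          [0, 1, 1, 0]], dtype=torch.float)
adjacency += torch.eye(4)  # add self-loops

layer = VanillaGNNLayer(3, 8)
print(layer(x, adjacency).shape)  # torch.Size([4, 8])
```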

Chapter 6, Introducing Graph Convolutional Networks, focuses on the Graph Convolutional Network (GCN) architecture and its importance as a blueprint for GNNs. It covers the limitations of previous vanilla GNN layers and explains the motivation behind GCNs. The chapter details how the GCN layer works, its performance improvements over the vanilla GNN layer, and its implementation on the Cora and Facebook Page-Page datasets using PyTorch Geometric. The chapter also touches upon the task of node regression and the benefits of transforming tabular data into a graph.
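
As a flavor of the workflow, here is a compact GCN on Cora with PyTorch Geometric; the hidden size, learning rate, and epoch count are illustrative, not necessarily the chapter's settings.

```python
import torch
import torch.nn.functional as F
from torch_geometric.datasets import Planetoid
from torch_geometric.nn import GCNConv

dataset = Planetoid(root=".", name="Cora")
data = dataset[0]

class GCN(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = GCNConv(dataset.num_features, 16)
        self.conv2 = GCNConv(16, dataset.num_classes)

    def forward(self, x, edge_index):
        x = F.relu(self.conv1(x, edge_index))
        return self.conv2(x, edge_index)

model = GCN()
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)

model.train()
for epoch in range(100):
    optimizer.zero_grad()
    out = model(data.x, data.edge_index)
    loss = F.cross_entropy(out[data.train_mask], data.y[data.train_mask])
    loss.backward()
    optimizer.step()

model.eval()
pred = model(data.x, data.edge_index).argmax(dim=1)
acc = (pred[data.test_mask] == data.y[data.test_mask]).float().mean()
print(f"Test accuracy: {acc:.4f}")
```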

Chapter 7, Graph Attention Networks, focuses on Graph Attention Networks (GATs), which are an improvement over GCNs. The chapter explains how GATs work by using the concept of self-attention and provides a step-by-step understanding of the graph attention layer. The chapter also implements a graph attention layer from scratch using NumPy. The final section of the chapter discusses the use of a GAT on two node classification datasets, Cora and CiteSeer, and compares the accuracy with that of a GCN.
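
The sketch below shows one graph attention layer in NumPy under the usual GAT formulation (LeakyReLU-scored attention, softmax over each neighborhood); the toy graph and random weights are illustrative.

```python
import numpy as np

np.random.seed(0)

# Toy graph: 4 nodes, 3 features each; adjacency includes self-loops.
A = np.array([[1, 1, 1, 0],
              [1, 1, 0, 1],
              [1, 0, 1, 1],
              [0, 1, 1, 1]])
X = np.random.uniform(-1, 1, (4, 3))

W = np.random.uniform(-1, 1, (4, 3))      # shared weight matrix (out_dim = 4)
W_att = np.random.uniform(-1, 1, (1, 8))  # attention vector a (2 * out_dim)

# Attention scores e_ij = LeakyReLU(a^T [W h_i || W h_j]) for every edge.
src, dst = np.where(A > 0)
h = X @ W.T
e = np.concatenate([h[src], h[dst]], axis=1) @ W_att.T
e = np.where(e > 0, e, 0.2 * e)  # LeakyReLU

# Softmax-normalize the scores over each node's neighborhood.
E = np.zeros_like(A, dtype=float)
E[src, dst] = e.flatten()
alpha = np.exp(E) * A
alpha = alpha / alpha.sum(axis=1, keepdims=True)

# New embeddings: attention-weighted average of transformed neighbors.
print(alpha @ h)
```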

Chapter 8, Scaling Up Graph Neural Networks with GraphSAGE, focuses on the GraphSAGE architecture and its ability to handle large graphs effectively. The chapter covers the two main ideas behind GraphSAGE, including its neighbor sampling technique and aggregation operators. You will learn about the variants proposed by tech companies such as Uber Eats and Pinterest, as well as the benefits of GraphSAGE’s inductive approach. The chapter concludes by implementing GraphSAGE for node classification and multi-label classification tasks.
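
Here is a hedged sketch of that recipe in PyTorch Geometric, combining NeighborLoader for layer-wise neighbor sampling with SAGEConv's default mean aggregator; the batch size and fan-outs are illustrative.

```python
import torch
import torch.nn.functional as F
from torch_geometric.datasets import Planetoid
from torch_geometric.loader import NeighborLoader
from torch_geometric.nn import SAGEConv

dataset = Planetoid(root=".", name="Cora")
data = dataset[0]

# Sample a fixed number of neighbors per layer (10, then 5) instead of
# the full neighborhood, which is what lets GraphSAGE scale.
loader = NeighborLoader(data, num_neighbors=[10, 5], batch_size=16,
                        input_nodes=data.train_mask)

class GraphSAGE(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = SAGEConv(dataset.num_features, 64)  # mean aggregator
        self.conv2 = SAGEConv(64, dataset.num_classes)

    def forward(self, x, edge_index):
        x = F.relu(self.conv1(x, edge_index))
        return self.conv2(x, edge_index)

model = GraphSAGE()
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)

model.train()
for epoch in range(10):
    for batch in loader:
        optimizer.zero_grad()
        out = model(batch.x, batch.edge_index)
        # The sampled seed nodes come first in every mini-batch.
        loss = F.cross_entropy(out[:batch.batch_size],
                               batch.y[:batch.batch_size])
        loss.backward()
        optimizer.step()
```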

Chapter 9, Defining Expressiveness for Graph Classification, explores the concept of expressiveness in GNNs and how it can be used to design better models. It introduces the Weisfeiler-Leman (WL) test, which provides the framework for understanding expressiveness in GNNs. The chapter uses the WL test to compare different GNN layers and determine the most expressive one. Based on this result, a more powerful GNN is designed and implemented using PyTorch Geometric. The chapter concludes with a comparison of different methods for graph classification on the PROTEINS dataset.
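
To make the WL test concrete, here is a small color-refinement sketch in plain Python; the hash-based relabeling and the two example graphs are illustrative, and the final line shows a classic case the test cannot distinguish.

```python
import networkx as nx

def wl_colors(graph, iterations=3):
    """Weisfeiler-Leman color refinement: repeatedly relabel each node by
    hashing its color together with the multiset of its neighbors' colors."""
    colors = {node: 0 for node in graph.nodes}  # uniform initial color
    for _ in range(iterations):
        new_colors = {}
        for node in graph.nodes:
            neighborhood = sorted(colors[n] for n in graph.neighbors(node))
            new_colors[node] = hash((colors[node], tuple(neighborhood)))
        colors = new_colors
    return sorted(colors.values())

# Two graphs are possibly isomorphic only if their color histograms match.
G1 = nx.cycle_graph(6)                                       # one hexagon
G2 = nx.union(nx.cycle_graph(3), nx.cycle_graph(3),
              rename=("a", "b"))                             # two triangles
print(wl_colors(G1) == wl_colors(G2))  # True: WL cannot tell these apart
```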

Chapter 10, Predicting Links with Graph Neural Networks, focuses on link prediction in graphs. It covers traditional techniques, such as matrix factorization, as well as GNN-based methods. The chapter explains the concept of link prediction and its importance in social networks and recommender systems. You will learn about the limitations of traditional techniques and the benefits of using GNN-based methods. We will explore three GNN-based techniques from two different families: node embeddings and subgraph representation. Finally, you will implement various link prediction techniques in PyTorch Geometric and choose the best method for a given problem.
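
As a minimal illustration of one decoder from the node-embedding family, the sketch below scores candidate links by a dot product of endpoint embeddings; the random embeddings and edges are placeholders for the output of any GNN encoder.

```python
import torch

def decode(z, edge_index):
    """Score candidate edges by the dot product of their endpoint
    embeddings: a higher score means the link is more likely to exist."""
    src, dst = edge_index
    return torch.sigmoid((z[src] * z[dst]).sum(dim=-1))

# Toy embeddings standing in for any GNN encoder's output (e.g., a GCN).
z = torch.randn(5, 16)
candidate_edges = torch.tensor([[0, 1, 2],
                                [3, 4, 0]])  # pairs (0,3), (1,4), (2,0)
print(decode(z, candidate_edges))  # probability-like score per candidate
```

In practice, such a decoder is trained against negative samples (random non-edges), so the scores learn to separate real links from absent ones.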

Chapter 11, Generating Graphs Using Graph Neural Networks, explores the field of graph generation, which involves finding methods to create new graphs. The chapter first introduces you to traditional techniques such as Erdős–Rényi and small-world models. Then you will focus on three families of solutions for GNN-based graph generation: VAE-based, autoregressive, and GAN-based models. The chapter concludes with an implementation of a GAN-based framework with Reinforcement Learning (RL) to generate new chemical compounds using the DeepChem library with TensorFlow.
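
For a taste of the traditional baselines, here is a short sketch with networkx (the parameters are illustrative):

```python
import networkx as nx

# Erdos-Renyi G(n, p): each of the n*(n-1)/2 possible edges exists
# independently with probability p.
G_er = nx.erdos_renyi_graph(n=10, p=0.3, seed=0)

# Watts-Strogatz small-world model: a ring lattice where each edge is
# rewired with probability p, yielding short paths and high clustering.
G_sw = nx.watts_strogatz_graph(n=10, k=4, p=0.1, seed=0)

print(G_er.number_of_edges(), nx.average_clustering(G_sw))
```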

Chapter 12, Learning from Heterogeneous Graphs, focuses on heterogeneous GNNs. Heterogeneous graphs contain different types of nodes and edges, in contrast to homogeneous graphs, which only involve one type of node and one type of edge. The chapter begins by reviewing the Message Passing Neural Network (MPNN) framework for homogeneous GNNs, then expands the framework to heterogeneous networks. Finally, we introduce techniques for creating a heterogeneous dataset and transforming homogeneous architectures into heterogeneous ones, and discuss an architecture specifically designed for processing heterogeneous networks.
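
The sketch below shows this homogeneous-to-heterogeneous transformation in PyTorch Geometric with to_hetero; the node types, features, and relation form a made-up toy example.

```python
import torch
import torch_geometric.transforms as T
from torch_geometric.data import HeteroData
from torch_geometric.nn import SAGEConv, to_hetero

# A tiny heterogeneous graph: two node types and one relation.
data = HeteroData()
data["user"].x = torch.randn(4, 8)
data["movie"].x = torch.randn(6, 16)
data["user", "rates", "movie"].edge_index = torch.tensor(
    [[0, 1, 2, 3], [0, 2, 3, 5]])
data = T.ToUndirected()(data)  # add reverse ('movie', 'rev_rates', 'user')

class GNN(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = SAGEConv((-1, -1), 32)  # (-1, -1): infer sizes lazily

    def forward(self, x, edge_index):
        return self.conv(x, edge_index)

# to_hetero duplicates the homogeneous model per edge type and wires the
# message passing between node types automatically.
model = to_hetero(GNN(), data.metadata(), aggr="sum")
out = model(data.x_dict, data.edge_index_dict)
print(out["movie"].shape)  # torch.Size([6, 32])
```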

Chapter 13, Temporal Graph Neural Networks, focuses on Temporal GNNs, or Spatio-Temporal GNNs, which are a type of GNN that can handle graphs with changing edges and features over time. The chapter first explains the concept of dynamic graphs and the applications of temporal GNNs, focusing on time series forecasting. The chapter then moves on to the application of temporal GNNs to web traffic forecasting to improve results using temporal information. Finally, the chapter describes another temporal GNN architecture specifically designed for dynamic graphs and applies it to the task of epidemic forecasting.
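
One common way to combine the two dimensions, shown in this hedged sketch, is to encode each graph snapshot with a GCN and model the resulting embedding sequence with a GRU; this is a generic pattern for illustration, not the specific architectures used in the chapter.

```python
import torch
from torch import nn
from torch_geometric.nn import GCNConv

class SpatioTemporalGNN(nn.Module):
    """A GCN encodes each snapshot's node features; a GRU then models how
    the node embeddings evolve across time steps."""
    def __init__(self, num_features, hidden, out):
        super().__init__()
        self.gcn = GCNConv(num_features, hidden)
        self.gru = nn.GRU(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, out)

    def forward(self, snapshots, edge_index):
        # snapshots: list of (num_nodes, num_features) matrices, one per
        # time step, over a fixed graph structure.
        embeddings = [torch.relu(self.gcn(x, edge_index)) for x in snapshots]
        seq = torch.stack(embeddings, dim=1)   # (num_nodes, T, hidden)
        _, h = self.gru(seq)                   # final hidden state
        return self.head(h.squeeze(0))         # one prediction per node

edge_index = torch.tensor([[0, 1, 1, 2], [1, 0, 2, 1]])
snapshots = [torch.randn(3, 4) for _ in range(5)]  # 5 steps, 3 nodes
model = SpatioTemporalGNN(num_features=4, hidden=16, out=1)
print(model(snapshots, edge_index).shape)  # torch.Size([3, 1])
```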

Chapter 14, Explaining Graph Neural Networks, covers various techniques to better understand the predictions and behavior of a GNN model. The chapter highlights two popular explanation methods: GNNExplainer and integrated gradients. Then, you will see the application of these techniques on a graph classification task using the MUTAG dataset and a node classification task using the Twitch social network.
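
As an illustration of the second method, here is a hand-rolled integrated-gradients sketch for node features; the zero baseline, step count, and toy GCN are assumptions for the example, and the chapter's own tooling may differ.

```python
import torch
from torch_geometric.nn import GCNConv

def integrated_gradients(model, x, edge_index, target_node, target_class,
                         steps=50):
    """Integrated gradients on node features: interpolate from a zero
    baseline to the real features and accumulate gradients along the path."""
    baseline = torch.zeros_like(x)
    total_grads = torch.zeros_like(x)
    for alpha in torch.linspace(0, 1, steps):
        x_step = (baseline + alpha * (x - baseline)).requires_grad_(True)
        out = model(x_step, edge_index)
        out[target_node, target_class].backward()
        total_grads += x_step.grad
    # Attribution: (input - baseline) * average gradient along the path.
    return (x - baseline) * total_grads / steps

class Net(torch.nn.Module):  # hypothetical toy model for the demo
    def __init__(self):
        super().__init__()
        self.conv = GCNConv(4, 2)

    def forward(self, x, edge_index):
        return self.conv(x, edge_index)

x = torch.randn(3, 4)
edge_index = torch.tensor([[0, 1, 1, 2], [1, 0, 2, 1]])
attr = integrated_gradients(Net(), x, edge_index, target_node=0,
                            target_class=1)
print(attr)  # per-feature attribution scores for every node
```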

Chapter 15, Forecasting Traffic Using A3T-GCN, focuses on the application of Temporal Graph Neural Networks in the field of traffic forecasting. It highlights the importance of accurate traffic forecasts in smart cities and the challenges of traffic forecasting due to complex spatial and temporal dependencies. The chapter covers the steps involved in processing a new dataset to create a temporal graph and the implementation of a new type of temporal GNN to predict future traffic speed. Finally, the results are compared to a baseline solution to verify the relevance of the architecture.

Chapter 16, Detecting Anomalies Using Heterogeneous GNNs, focuses on the application of GNNs in anomaly detection. GNNs’ ability to capture complex relationships makes them well suited to detecting anomalies, and they can handle large amounts of data efficiently. In this chapter, you will learn how to implement a GNN for intrusion detection in computer networks using the CIDDS-001 dataset. The chapter covers processing the dataset, building relevant features, implementing a heterogeneous GNN, and evaluating the results to determine its effectiveness in detecting anomalies in network traffic.

Chapter 17, Building a Recommender System Using LightGCN, focuses on the application of GNNs in recommender systems. The goal of recommender systems is to provide personalized recommendations to users based on their interests and past interactions. GNNs are well-suited for this task as they can effectively incorporate complex relationships between users and items. In this chapter, the LightGCN architecture is introduced as a GNN specifically designed for recommender systems. Using the Book-Crossing dataset, the chapter demonstrates how to build a book recommender system with collaborative filtering using the LightGCN architecture.
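
The heart of LightGCN is easy to sketch: propagation is just repeated multiplication by the normalized adjacency matrix, with no weight matrices or nonlinearities, and the final embedding averages all layers. The toy interaction matrix below is illustrative.

```python
import torch

def lightgcn_propagate(adj_norm, embeddings, num_layers=3):
    """LightGCN propagation: no weights, no nonlinearity, just repeated
    normalized neighborhood averaging, then a mean over all layers."""
    all_layers = [embeddings]
    for _ in range(num_layers):
        embeddings = adj_norm @ embeddings
        all_layers.append(embeddings)
    return torch.stack(all_layers).mean(dim=0)

# Toy bipartite user-item graph: 2 users and 3 items (5 nodes total).
A = torch.tensor([[0, 0, 1, 1, 0],
                  [0, 0, 0, 1, 1],
                  [1, 0, 0, 0, 0],
                  [1, 1, 0, 0, 0],
                  [0, 1, 0, 0, 0]], dtype=torch.float)
deg = A.sum(dim=1)
# Symmetric normalization D^-1/2 A D^-1/2: A_ij / sqrt(d_i * d_j).
adj_norm = A / torch.sqrt(deg.unsqueeze(0) * deg.unsqueeze(1))

emb = torch.nn.Embedding(5, 8).weight  # learnable initial embeddings
final = lightgcn_propagate(adj_norm, emb)

# Recommendation score for (user 0, item 2): dot product of embeddings.
print(final[0] @ final[2])
```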

Chapter 18, Unlocking the Potential of Graph Neural Networks for Real-World Applications, summarizes what we have learned throughout the book, and looks ahead to the future of GNNs.
