Forecasting Traffic Using A3T-GCN

We introduced T-GNNs in Chapter 13, but we did not elaborate on their main application: traffic forecasting. In recent years, the concept of smart cities has become increasingly popular. This idea refers to cities where data is used to manage and improve operations and services. In this context, one of the most appealing developments is the creation of intelligent transportation systems. Accurate traffic forecasts can help traffic managers optimize traffic signals, plan infrastructure, and reduce congestion. However, traffic forecasting is a challenging problem due to complex spatial and temporal dependencies.

In this chapter, we will apply T-GNNs to a particular case of traffic forecasting. First, we will explore and process a new dataset to create a temporal graph from raw CSV files. We will then apply a new type of T-GNN to predict future traffic speed. Finally, we will visualize the results and compare them to a baseline solution to verify that our model produces accurate predictions.

Technical requirements

All the code examples from this chapter can be found on GitHub at https://github.com/PacktPublishing/Hands-On-Graph-Neural-Networks-Using-Python/tree/main/Chapter15.

Installation steps required to run the code on your local machine can be found in the Preface of this book. This chapter requires a large amount of GPU memory. You can reduce this requirement by decreasing the size of the training set in the code.

Exploring the PeMS-M dataset

In this section, we will explore our dataset to find patterns and get insights that will be useful to the task of interest.

The dataset we will use for this application is the medium variant of the PeMSD7 dataset [1]. The original dataset was obtained by collecting traffic speed measurements from 39,000 sensor stations on weekdays during May and June 2012 using the Caltrans Performance Measurement System (PeMS). The medium variant only considers 228 stations across District 7 of California. These stations output 30-second speed measurements, which are aggregated into 5-minute intervals in this dataset. For example, the following figure shows the Caltrans PeMS interface (pems.dot.ca.gov) with various traffic speeds:

Figure 15.1 – Traffic data from Caltrans PeMS with high speed (>60 mph) in green and low speed (<35 mph) in red


We can directly load the dataset from GitHub and unzip it:

from io import BytesIO
from urllib.request import urlopen
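Since the full listing is truncated in this excerpt, the following is a minimal sketch of how the download and loading could continue from the imports above. The archive URL and the CSV filenames (PeMSD7_V_228.csv for the speed measurements and PeMSD7_W_228.csv for the distance matrix) are assumptions based on common distributions of PeMSD7(M), not values confirmed by this excerpt:

from zipfile import ZipFile
import pandas as pd

# Assumed location of the zipped dataset in the book's repository
url = 'https://github.com/PacktPublishing/Hands-On-Graph-Neural-Networks-Using-Python/raw/main/Chapter15/PeMSD7.zip'

# Download the archive into memory and extract it to the working directory
with ZipFile(BytesIO(urlopen(url).read())) as zfile:
    zfile.extractall('.')

# The CSV files have no header row: the speed file has one row per 5-minute
# timestep and one column per station; the distance file is a 228 x 228 matrix
speeds = pd.read_csv('PeMSD7_V_228.csv', names=range(228))
distances = pd.read_csv('PeMSD7_W_228.csv', names=range(228))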

Processing the dataset

Now that we have more information about this dataset, it is time to process it before we can feed it to a T-GNN.

The first step consists of transforming the tabular dataset into a temporal graph, which means we first need to create a graph from the raw data. In other words, we must connect the different sensor stations in a meaningful way. Fortunately, we have access to the distance matrix between stations, which gives us a natural way to connect them.

There are several options for computing an adjacency matrix from the distance matrix. For example, we could assign a link whenever the distance between two stations is less than the mean distance. Instead, we will perform a more advanced processing scheme introduced in [2] to calculate a weighted adjacency matrix: instead of binary values, we calculate weights between 0 (no connection) and 1 (strong connection) using the following formula:

$$w_{ij} = \begin{cases} \exp\left(-\dfrac{d_{ij}^{2}}{\sigma^{2}}\right) & \text{if } i \neq j \text{ and } \exp\left(-\dfrac{d_{ij}^{2}}{\sigma^{2}}\right) \geq \epsilon \\ 0 & \text{otherwise} \end{cases}$$

Here, $w_{ij}$ represents the weight of the edge from node $i$ to node $j$, $d_{ij}$ is the distance between these two nodes, and $\sigma^{2}$ and $\epsilon$ are two thresholds that control the sparsity and distribution of the resulting matrix.
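A minimal sketch of this computation is shown below; the rescaling constant of 10,000 and the default thresholds (sigma2=0.1, epsilon=0.5) are assumed values for illustration, not ones confirmed by this excerpt:

import numpy as np

def compute_adj(distances, sigma2=0.1, epsilon=0.5):
    # Rescale the raw distances so the squared values stay in a workable range
    d = distances.to_numpy() / 10000.
    d2 = d * d
    n = distances.shape[0]
    # Mask to force w_ij = 0 when i == j
    w_mask = np.ones([n, n]) - np.identity(n)
    w = np.exp(-d2 / sigma2)
    # Keep only the weights above the epsilon threshold
    return w * (w >= epsilon) * w_mask

adj = compute_adj(distances)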

Implementing the A3T-GCN architecture

In this section, we will train an Attention Temporal Graph Convolutional Network (A3T-GCN), designed for traffic forecasting. This architecture allows us to consider complex spatial and temporal dependencies:

  • Spatial dependencies refer to the fact that the traffic condition of a location can be influenced by the traffic condition of nearby locations. For example, traffic jams often spread to neighboring roads.
  • Temporal dependencies refer to the fact that the traffic condition of a location at a given time can be influenced by the traffic condition of the same location at previous times. For example, if a road is congested during the morning peak, it is likely to remain congested until the evening peak.

A3T-GCN is an improvement over the temporal GCN (TGCN) architecture. The TGCN is a combination of a GCN and a GRU that produces hidden vectors from each input time series. The combination of these two layers captures both spatial and temporal dependencies from the inputs.
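To make this concrete, here is a minimal sketch of a model built around the A3TGCN layer from the PyTorch Geometric Temporal library; the hidden dimension of 32, the linear output head, and the 12-period horizon are illustrative assumptions rather than the exact configuration used in this chapter.

import torch
from torch_geometric_temporal.nn.recurrent import A3TGCN

class TemporalGNN(torch.nn.Module):
    def __init__(self, dim_in, periods):
        super().__init__()
        # A3TGCN applies a TGCN cell (GCN + GRU) over the input periods and
        # aggregates the resulting hidden states with an attention mechanism
        self.tgnn = A3TGCN(in_channels=dim_in, out_channels=32, periods=periods)
        # Map each node's hidden vector to the forecast horizon
        self.linear = torch.nn.Linear(32, periods)

    def forward(self, x, edge_index, edge_weight):
        # x has shape (num_nodes, dim_in, periods)
        h = self.tgnn(x, edge_index, edge_weight).relu()
        return self.linear(h)

model = TemporalGNN(dim_in=1, periods=12)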

Summary

This chapter focused on a traffic forecasting task using T-GNNs. First, we explored the PeMS-M dataset and converted it from tabular data into a static graph dataset with a temporal signal. In practice, we created a weighted adjacency matrix based on the input distance matrix and converted the traffic speeds into time series. Finally, we implemented an A3T-GCN model, a T-GNN designed for traffic forecasting. We compared the results to two baselines and validated the predictions made by our model.

In Chapter 16, Building a Recommender System Using LightGCN, we will see the most popular application of GNNs. We will implement a lightweight GNN on a massive dataset and evaluate it using techniques from recommender systems.

Further reading

  • [1] B. Yu, H. Yin, and Z. Zhu. Spatio-Temporal Graph Convolutional Networks: A Deep Learning Framework for Traffic Forecasting. Jul. 2018. doi: 10.24963/ijcai.2018/505. Available at https://arxiv.org/abs/1709.04875.
  • [2] Y. Li, R. Yu, C. Shahabi, and Y. Liu. Diffusion Convolutional Recurrent Neural Network: Data-Driven Traffic Forecasting. arXiv, 2017. doi: 10.48550/ARXIV.1707.01926. Available at https://arxiv.org/abs/1707.01926.