Master's course - APM_5DS30_TP: Machine Learning with Graphs

Description

Graph data is ubiquitous. Any system with entities and relationships between them can be represented as a graph. Over the past decade, machine learning algorithms have made remarkable progress in fields such as natural language processing, computer vision, and speech recognition. This success is primarily due to deep neural network architectures' ability to extract high-level features from Euclidean-structured data like images, text, and audio. However, graph data has not received the same level of attention.

In this course, we will explore how to build machine learning models that extract high-level features from graph data, a process known as graph representation learning. Topics include graph neural networks (GNNs) such as graph convolutions and graph attention mechanisms, scalable GNNs for big-data applications, spatiotemporal data analysis with GNNs, recommender systems, and graph generation. Laboratory sessions provide hands-on experience with these concepts.
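
To make the notion of graph data concrete, the following is a minimal sketch, assuming PyTorch and PyTorch Geometric (the library used in the lab sessions), of how a small graph is typically represented for graph representation learning: a node-feature matrix together with an edge list. The toy graph and feature values are illustrative only.

    import torch
    from torch_geometric.data import Data

    # Four nodes, each described by a 3-dimensional feature vector.
    x = torch.tensor([[1.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0],
                      [0.0, 0.0, 1.0],
                      [1.0, 1.0, 0.0]])

    # Undirected edges 0-1, 1-2 and 2-3, stored as pairs of directed edges
    # (source node indices in the first row, target node indices in the second).
    edge_index = torch.tensor([[0, 1, 1, 2, 2, 3],
                               [1, 0, 2, 1, 3, 2]], dtype=torch.long)

    graph = Data(x=x, edge_index=edge_index)
    print(graph)  # Data(x=[4, 3], edge_index=[2, 6])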

Grading format

Numerical grade out of 20

Letter grade / European grade

For students in the M1 DATAAI - Data and Artificial Intelligence programme

The course unit (UE) is validated if the final grade is >= 10
  • ECTS credits earned: 3 ECTS

For students in the M2 DATAAI - Data and Artificial Intelligence programme

The course unit (UE) is validated if the final grade is >= 10
  • ECTS credits earned: 3 ECTS

Detailed programme

Schedule: 

  •     Lecture 1: Introduction to Machine Learning on Graphs.
    •     Motivation: Why graphs matter in ML (examples from social networks, biology, internet, chemistry). 
    •     Graph basics: nodes, edges, directed/undirected graphs, etc. 
    •     Applications of graph learning (social networks, molecules, recommender systems).
    •     Classical ML on graphs: centrality measures, node embeddings (DeepWalk, Node2vec).
    •     Introduction to graph representation learning. 
  •     Lecture 2: Introduction to Graph Neural Networks. 
    •     Neural networks on structured data.  
    •     Convolutions in time (1D), space (2D), and graphs. 
    •     Graph signals and convolutional filters.  
    •     Graph Convolutional Networks (GCNs). 
    •     Message Passing Neural Networks (MPNN) framework. 
    •     Graph Attention Networks (GAT). 
    •     Limitations of GNNs (over-smoothing, heterophily). 
    •     Introduction to PyTorch Geometric (PyG); a minimal GCN example in PyG is sketched after the schedule.
  •     Lab Session 1: Graph Neural Networks. 
  •     Lecture 3: Scaling up Graph Neural Networks. 
    •     Challenges with large-scale graphs: memory, mini-batch training, neighborhood explosion. 
    •     Applications with massive graphs: recommender systems, social networks, citation networks. 
    •     Why SGD is not straightforward for GNNs. 
    •     Scalable GNN approaches:
      •     GraphSAGE (neighbor sampling).
      •     Cluster-GCN (graph partitioning).
      •     Simplifying GNN architectures (SGC).
    •     Trade-offs: scalability vs. accuracy. 
  •     Lecture 4: Spatiotemporal Analysis with Graph Neural Networks. 
    •     Introduction to spatiotemporal data and time-varying graph signals. 
    •     Temporal operators and difference signals. 
    •     Spectral GNNs. 
    •     Spatiotemporal graph neural networks. 
    •     Applications: traffic prediction, weather forecasting.  
  •     Lab Session 2: Spatiotemporal Analysis with Graph Neural Networks. 
  •     Lecture 5: Recommender Systems.
    •     Bipartite graphs: users and items.  
    •     Task: predicting new edges (future interactions).  
    •     Embedding-based models for recommendation.   
    •     Evaluation metrics: Recall@K, NDCG.  
    •     GNNs for recommendation: user–item embedding with message passing.   
  •     Lecture 6: Graph Generation.
    •     Applications: drug discovery, material design, social network modeling, anomaly detection. 
    •     Properties of real-world graphs (degree distribution, clustering coefficient, etc.). 
    •     Traditional graph generative models:
      •     Erdős–Rényi random graphs (a short generation sketch follows the schedule).
    •     Deep generative models for graphs:
      •     GraphRNN.
    •     Efficient graph generation methods.
  •     Lab Session 3: Graph Generation.
  •     Exam. 
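
The following is a minimal, illustrative sketch of the kind of model introduced in Lecture 2 and used in Lab Session 1: a two-layer Graph Convolutional Network for node classification, written with PyTorch Geometric. The dataset (the Cora citation network) and the hyperparameters are assumptions chosen for illustration, not the lab's actual assignment.

    import torch
    import torch.nn.functional as F
    from torch_geometric.datasets import Planetoid
    from torch_geometric.nn import GCNConv

    # Cora: a small citation network with one node per paper and one class label per node.
    dataset = Planetoid(root="data/Cora", name="Cora")
    data = dataset[0]

    class GCN(torch.nn.Module):
        def __init__(self, hidden=16):
            super().__init__()
            self.conv1 = GCNConv(dataset.num_node_features, hidden)
            self.conv2 = GCNConv(hidden, dataset.num_classes)

        def forward(self, x, edge_index):
            x = F.relu(self.conv1(x, edge_index))
            x = F.dropout(x, p=0.5, training=self.training)
            return self.conv2(x, edge_index)

    model = GCN()
    optimizer = torch.optim.Adam(model.parameters(), lr=0.01, weight_decay=5e-4)

    # Full-batch training on the nodes marked as training examples.
    model.train()
    for epoch in range(200):
        optimizer.zero_grad()
        out = model(data.x, data.edge_index)
        loss = F.cross_entropy(out[data.train_mask], data.y[data.train_mask])
        loss.backward()
        optimizer.step()

    # Evaluate accuracy on the held-out test nodes.
    model.eval()
    pred = model(data.x, data.edge_index).argmax(dim=-1)
    test_acc = (pred[data.test_mask] == data.y[data.test_mask]).float().mean().item()
    print(f"Test accuracy: {test_acc:.3f}")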
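
As a complement to Lecture 6, here is a short sketch, assuming NetworkX, of the Erdős–Rényi random-graph model: each of the n(n-1)/2 possible edges is included independently with probability p. The parameter values below are illustrative only.

    import networkx as nx

    n, p = 100, 0.05
    G = nx.erdos_renyi_graph(n, p, seed=0)

    print(G.number_of_nodes(), G.number_of_edges())
    # Average clustering coefficient, one of the real-world graph properties discussed in Lecture 6.
    print(nx.average_clustering(G))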
