🎞 Machine Learning with Graphs: Heterogeneous & Knowledge Graph Embedding, Knowledge Graph Completion
💥Free recorded course by Jure Leskovec, Computer Science, PhD
💥 In this lecture, we first introduce heterogeneous graphs with a definition and several examples. Next, we discuss RGCN, a model that extends GCN to heterogeneous graphs. To make the model more scalable, several approximation techniques are introduced, including block-diagonal matrices and basis learning. Finally, we show how RGCN predicts node labels and links.
Then we introduce knowledge graphs by giving several examples and applications.
📽 Watch: part1 part2
📝 slide
💻 Code
📲Channel: @ComplexNetworkAnalysis
#video #course #Graph #Machine_Learning #Knowledge_Graph #GNN
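💻 A minimal sketch of the RGCN idea with basis decomposition in plain PyTorch (not the course's reference code; the dense per-relation adjacencies and toy sizes are illustrative):
```python
import torch
import torch.nn as nn

class RGCNLayer(nn.Module):
    """One relational GCN layer with basis decomposition:
    W_r = sum_b a_{rb} * V_b, so parameters grow with num_bases, not num_relations."""
    def __init__(self, in_dim, out_dim, num_relations, num_bases):
        super().__init__()
        self.bases = nn.Parameter(torch.randn(num_bases, in_dim, out_dim) * 0.01)
        self.coeffs = nn.Parameter(torch.randn(num_relations, num_bases) * 0.01)
        self.self_loop = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj_per_relation):
        # adj_per_relation: list of dense, row-normalized [N, N] adjacency matrices,
        # one per relation type (a toy stand-in for a real heterogeneous graph).
        weights = torch.einsum('rb,bio->rio', self.coeffs, self.bases)  # [R, in, out]
        out = self.self_loop(x)
        for r, adj in enumerate(adj_per_relation):
            out = out + adj @ x @ weights[r]   # aggregate neighbors under relation r
        return torch.relu(out)

# Toy usage: 5 nodes, 4 features, 3 relation types, 2 bases.
x = torch.randn(5, 4)
adjs = [torch.eye(5) for _ in range(3)]        # placeholder adjacencies
layer = RGCNLayer(4, 8, num_relations=3, num_bases=2)
h = layer(x, adjs)                             # [5, 8]
```
Sharing a small set of basis matrices across relations is what keeps the parameter count manageable when the graph has many relation types.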
YouTube
Stanford CS224W: ML with Graphs | 2021 | Lecture 10.1-Heterogeneous & Knowledge Graph Embedding
🎞 Machine Learning with Graphs: Reasoning in Knowledge Graphs, Answering Predictive Queries, Query2box: Reasoning over KGs
💥Free recorded course by Jure Leskovec, Computer Science, PhD
💥 In this lecture, we introduce how to perform reasoning over knowledge graphs and provide answers to complex queries. We discuss the different types of queries one can pose over a knowledge graph and how to answer them by traversing the graph. We also show how the incompleteness of knowledge graphs can limit our ability to provide complete answers. Finally, we talk about how this problem can be solved by generalizing the link prediction task.
📽 Watch: part1 part2 part3
📝 slide
📲Channel: @ComplexNetworkAnalysis
#video #course #Graph #Machine_Learning #Knowledge_Graph
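💻 Query2box itself reasons with box embeddings, but the underlying idea of answering a path query by traversing in embedding space can be sketched with simple TransE-style translations (all entities, relations, and vectors below are toy placeholders):
```python
import numpy as np

rng = np.random.default_rng(0)
dim = 16

# Toy KG embeddings (TransE-style: head + relation ≈ tail).
entities = {name: rng.normal(size=dim) for name in
            ["fulvestrant", "ESR1", "breast_cancer", "lung_cancer"]}
relations = {name: rng.normal(size=dim) for name in ["causes", "assoc"]}

def answer_path_query(anchor, relation_path, k=2):
    """Answer q = (anchor, r1, r2, ...) by translating the anchor embedding
    along each relation, then ranking all entities by distance to the query point."""
    q = entities[anchor].copy()
    for r in relation_path:
        q = q + relations[r]
    scores = {e: np.linalg.norm(q - v) for e, v in entities.items()}
    return sorted(scores, key=scores.get)[:k]

# Path query: start at "fulvestrant", follow "causes" then "assoc", rank candidate answers.
print(answer_path_query("fulvestrant", ["causes", "assoc"]))
```
Because the answer is computed in embedding space rather than by explicit traversal, missing edges in the KG do not block the query, which is the point made in the lecture.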
YouTube
Stanford CS224W: Machine Learning with Graphs | 2021 | Lecture 11.1 - Reasoning in Knowledge Graphs
🎞 Machine Learning with Graphs: Neural Subgraph Matching & Counting, Neural Subgraph Matching, Finding Frequent Subgraphs
💥Free recorded course by Jure Leskovec, Computer Science, PhD
💥In this lecture, we discuss the problem of subgraph matching and counting. Subgraphs are the building blocks of larger networks and have the power to characterize and discriminate between networks. We first introduce two types of subgraphs: node-induced and edge-induced subgraphs. Then we show how to determine the subgraph relation through the concept of graph isomorphism. Finally, we discuss why subgraphs are important and how to identify the most informative ones using the network significance profile.
📽 Watch: part1 part2 part3
📲Channel: @ComplexNetworkAnalysis
#video #course #Graph #Machine_Learning #Subgraph
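💻 The lecture builds toward a neural matcher, but the underlying decision problem (is the query pattern isomorphic to a node-induced subgraph of the target graph?) can be made concrete with NetworkX's exact matcher:
```python
import networkx as nx
from networkx.algorithms import isomorphism

G = nx.karate_club_graph()          # target graph
pattern = nx.cycle_graph(3)         # query pattern: a triangle

# Node-induced subgraph isomorphism: is some induced subgraph of G isomorphic to the pattern?
gm = isomorphism.GraphMatcher(G, pattern)
print(gm.subgraph_is_isomorphic())

# Count the matching node mappings; note each triangle is found once per
# automorphism of the pattern (3! = 6 times), so divide if you want distinct subgraphs.
num_mappings = sum(1 for _ in gm.subgraph_isomorphisms_iter())
print(num_mappings)
```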
YouTube
CS224W: Machine Learning with Graphs | 2021 | Lecture 12.1-Fast Neural Subgraph Matching & Counting
🎞 Graph Analytics and Graph-based Machine Learning
💥Free recorded course by Clair Sullivan
💥Machine learning has traditionally revolved around creating models around data that is characterized by embeddings attributed to individual observations. However, this ignores a signal that could potentially be very strong: the relationships between data points. Network graphs provide great opportunities for identifying relationships that we may not even realize exist within our data. Further, a variety of methods exist to create embeddings of graphs that can enrich models and provide new insights.
In this talk we will look at some examples of common ML problems and demonstrate how they can take advantage of graph analytics and graph-based machine learning. We will also demonstrate how graph embeddings can be used to enhance existing ML pipelines.
📽 Watch
📲Channel: @ComplexNetworkAnalysis
#video #course #Graph #Machine_Learning
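💻 A minimal sketch of the "graph embeddings feed an ordinary ML pipeline" idea. The talk uses Neo4j's tooling; here simple spectral embeddings of the karate-club graph stand in for learned graph embeddings and are fed to a scikit-learn classifier:
```python
import networkx as nx
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

G = nx.karate_club_graph()
A = nx.to_numpy_array(G)

# Simple spectral node embeddings: top eigenvectors of the adjacency matrix
# (a stand-in for node2vec / graph-data-science embeddings used in the talk).
eigvals, eigvecs = np.linalg.eigh(A)
X = eigvecs[:, -8:]                                   # 8-dimensional node embeddings
y = np.array([G.nodes[n]["club"] == "Officer" for n in G.nodes()], dtype=int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```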
YouTube
Graph Analytics and Graph-based Machine Learning
🎞 Machine Learning with Graphs: Community Detection in Network, Network Communities, Louvain Algorithm, Detecting Overlapping Communities
💥Free recorded course by Jure Leskovec, Computer Science, PhD
💥In this lecture, we introduce methods that build on the intuitions presented in the previous part to identify clusters within networks. We define the modularity score Q, which measures how well a network is partitioned into communities, and introduce null models that give the expected number of edges between nodes. Using this idea, we derive a mathematical expression for the modularity score. Finally, we develop an algorithm that finds communities by maximizing modularity.
📽 Watch: part1 part2 part3 part4
📲Channel: @ComplexNetworkAnalysis
#video #course #Graph #Machine_Learning #Community_Detection
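💻 A small sketch that computes the modularity score Q = (1/2m) Σ_ij [A_ij − k_i k_j/(2m)] δ(c_i, c_j) directly from its definition and checks it against NetworkX's built-in value on a greedy modularity-maximizing partition:
```python
import networkx as nx
import numpy as np
from networkx.algorithms.community import greedy_modularity_communities, modularity

G = nx.karate_club_graph()
communities = greedy_modularity_communities(G)        # modularity-maximizing heuristic

def modularity_Q(G, communities):
    """Q = (1/2m) * sum_ij [A_ij - k_i*k_j/(2m)] * delta(c_i, c_j)."""
    A = nx.to_numpy_array(G)
    k = A.sum(axis=1)                                 # node degrees
    two_m = A.sum()                                   # 2m: total degree
    label = {node: c for c, comm in enumerate(communities) for node in comm}
    nodes = list(G.nodes())
    same = np.array([[label[u] == label[v] for v in nodes] for u in nodes])
    return ((A - np.outer(k, k) / two_m) * same).sum() / two_m

print(modularity_Q(G, communities))                   # manual computation
print(modularity(G, communities))                     # NetworkX reference (should match)
```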
YouTube
Stanford CS224W: ML with Graphs | 2021 | Lecture 13.1 - Community Detection in Networks
🎞 Machine Learning with Graphs: Generative Models for Graphs, Erdos Renyi Random Graphs, The Small World Model, Kronecker Graph Model
💥Free recorded course by Jure Leskovec, Computer Science, PhD
💥This lecture covers generative models for graphs. The goal of generative models for graphs is to generate synthetic graphs that are similar to given example graphs. We start with the simplest model for graph generation, the Erdős–Rényi random graph (E-R graph, G(n,p) graph), and then move to small-world graphs (Watts–Strogatz, W-S graphs). Even though E-R graphs can fit the average path length of real-world graphs, their clustering coefficient is much smaller than that of real-world graphs. The small-world model is proposed to generate realistic graphs with both low diameter and high clustering coefficient; specifically, W-S graphs are generated by randomly rewiring edges of a regular lattice graph. We then introduce the Kronecker graph model, where graphs are generated recursively. The key motivation is that real-world graphs often exhibit self-similarity: the whole structure of the graph has the same shape as its parts. Kronecker graphs are generated by recursively applying the Kronecker product to an initiator matrix, which is trained to fit the statistics of the input dataset. We further discuss fast Kronecker generator algorithms. Finally, we show that Kronecker graphs and real graphs are very close in many important graph statistics.
📽 Watch: part1 part2 part3 part4
📲Channel: @ComplexNetworkAnalysis
#video #course #Graph #Machine_Learning #Erdos_Renyi #Small_World #Kronecker_Graph
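💻 A quick sketch generating the three model families with NetworkX/NumPy (the initiator matrix values and graph sizes are arbitrary; real Kronecker graphs fit the initiator to data):
```python
import networkx as nx
import numpy as np

n = 1000
er = nx.erdos_renyi_graph(n, p=0.01)                 # G(n, p) random graph
ws = nx.watts_strogatz_graph(n, k=10, p=0.1)         # ring lattice, 10 neighbors, 10% rewiring

print("E-R clustering:", nx.average_clustering(er))  # low clustering
print("W-S clustering:", nx.average_clustering(ws))  # much higher clustering

# Stochastic Kronecker graph: Kronecker-power an initiator matrix of edge
# probabilities, then sample each edge independently.
theta = np.array([[0.9, 0.5],
                  [0.5, 0.2]])
P = theta.copy()
for _ in range(9):                                   # 2^10 = 1024 nodes
    P = np.kron(P, theta)
sample = np.triu(np.random.rand(*P.shape) < P, k=1)  # sample each pair (i, j), i < j, once
kron = nx.from_numpy_array(sample.astype(int))
print("Kronecker nodes/edges:", kron.number_of_nodes(), kron.number_of_edges())
```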
YouTube
Stanford CS224W: Machine Learning with Graphs | 2021 | Lecture 14.1 - Generative Models for Graphs
🎞 Machine Learning with Graphs: Deep Generative Models for Graphs, Graph RNN: Generating Realistic Graphs, Scaling Up & Evaluating Graph Gen, Applications of Deep Graph Generation.
💥Free recorded course by Jure Leskovec, Computer Science, PhD
💥This lecture focuses on deep generative models for graphs. We outline two types of tasks within graph generation: (1) realistic graph generation, where the goal is to generate graphs that are similar to a given set of graphs; and (2) goal-directed graph generation, where we want to generate graphs that optimize given objectives/constraints. First, we recap the basics of generative models and deep generative models; then we introduce and focus on GraphRNN, one of the first deep generative models for graphs; and finally, we discuss GCPN, a deep graph generative model designed specifically for molecule generation.
📽 Watch: part1 part2 part3 part4
📲Channel: @ComplexNetworkAnalysis
#video #course #Graph #Machine_Learning #GCPN #GraphRNN #DGNN #GNN
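💻 A heavily stripped-down sketch of GraphRNN's autoregressive structure: a node-level GRU carries the graph state and an edge MLP (standing in for the paper's edge-level RNN and BFS node ordering) samples edges from each new node to earlier nodes. The weights are untrained, so this only shows the shape of the generation loop:
```python
import torch
import torch.nn as nn

class TinyGraphRNN(nn.Module):
    """Simplified GraphRNN-style generator: a node-level GRU tracks the graph state,
    and an edge MLP predicts edges from each new node to the previous nodes."""
    def __init__(self, max_prev, hidden=64):
        super().__init__()
        self.max_prev = max_prev
        self.gru = nn.GRU(input_size=max_prev, hidden_size=hidden, batch_first=True)
        self.edge_mlp = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU(),
                                      nn.Linear(hidden, max_prev))

    @torch.no_grad()
    def sample(self, num_nodes):
        adj = torch.zeros(num_nodes, num_nodes)
        x, h = torch.zeros(1, 1, self.max_prev), None    # start token: empty adjacency row
        for i in range(1, num_nodes):
            out, h = self.gru(x, h)                      # update the graph-level state
            probs = torch.sigmoid(self.edge_mlp(out[:, -1]))
            row = torch.bernoulli(probs).squeeze(0)      # sample edges to previous nodes
            row[i:] = 0                                  # node i may connect only to nodes < i
            adj[i, :self.max_prev] = row
            adj[:self.max_prev, i] = row                 # keep the graph undirected
            x = row.view(1, 1, -1)                       # feed the sampled row back in
        return adj

# Untrained weights: this demonstrates the generation loop, not realistic graphs.
adj = TinyGraphRNN(max_prev=15).sample(num_nodes=15)
print("edges generated:", int(adj.sum().item() // 2))
```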
YouTube
Stanford CS224W: ML with Graphs | 2021 | Lecture 15.1 - Deep Generative Models for Graphs
🎞 Applications of Machine Learning in Traffic Optimization
💥Free recorded course by Paweł Gora
📽 Watch
📲Channel: @ComplexNetworkAnalysis
#video #course #Graph #Machine_Learning
YouTube
Paweł Gora: Applications of machine learning in traffic optimization
I will be talking about possible applications of machine learning in traffic optimization (and in optimizing some other complex processes). I will describe the process of building traffic metamodels by approximating outcomes of traffic simulations using machine…
🎞 PyTorch Geometric Tutorial: Graph Attention Networks (GAT) Implementation
💥Free recorded course
📽 Watch
📲Channel: @ComplexNetworkAnalysis
#video #course #Graph #GAT #code #python
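💻 A quick usage sketch of PyTorch Geometric's GATConv, the layer implemented in the tutorial (the toy features and edge list are made up):
```python
import torch
from torch_geometric.nn import GATConv

# Toy graph: 4 nodes with 3 features each; edges in COO format (both directions).
x = torch.randn(4, 3)
edge_index = torch.tensor([[0, 1, 1, 2, 2, 3],
                           [1, 0, 2, 1, 3, 2]])

# Two attention heads; head outputs are concatenated, so the result is [4, 2 * 8].
conv = GATConv(in_channels=3, out_channels=8, heads=2)
out = conv(x, edge_index)
print(out.shape)                                   # torch.Size([4, 16])

# The per-edge attention coefficients can be returned for inspection.
out, (att_edge_index, alpha) = conv(x, edge_index, return_attention_weights=True)
print(alpha.shape)                                 # [num_edges (+ self-loops), heads]
```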
YouTube
Pytorch Geometric tutorial: Graph attention networks (GAT) implementation
In this video we will see the math behind GAT and a simple implementation in Pytorch geometric.
Outcome:
- Recap
- Introduction
- GAT
- Message Passing PyTorch layer
- Simple GCN layer implementation
- GAT implementation
- GAT Usage
🎞 Analysis on Collaboration and Co-Authorship Network using Centrality Measures
💥Free recorded course
📽 Watch
📲Channel: @ComplexNetworkAnalysis
#video #course #Co_Authorship #Centrality
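💻 The classical centrality measures used in the talk are one-liners in NetworkX; the co-authorship edge list below is a made-up stand-in for the Newman collaboration data:
```python
import networkx as nx

# Hypothetical co-authorship edge list: an edge means two researchers share a paper.
coauthorships = [("Alice", "Bob"), ("Alice", "Carol"), ("Bob", "Carol"),
                 ("Carol", "Dave"), ("Dave", "Eve"), ("Eve", "Frank")]
G = nx.Graph(coauthorships)

print("degree:     ", nx.degree_centrality(G))
print("betweenness:", nx.betweenness_centrality(G))   # brokers between groups
print("closeness:  ", nx.closeness_centrality(G))
print("eigenvector:", nx.eigenvector_centrality(G))
```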
YouTube
Analysis on Collaboration and Co-Authorship Network using Centrality Measures
This is a presentation of a mini-paper I wrote on analysis on collaboration and co-authorship network of international Network Science researches by using the classical centrality measures and structural holes. The data set I used here is from M.E.J. Newman…
🎞 Machine Learning with Graphs: Applications of Deep Graph Generation.
💥Free recorded course by Jure Leskovec, Computer Science, PhD
📽 Watch
📲Channel: @ComplexNetworkAnalysis
#video #course #Graph #Machine_Learning #DGNN #GNN
YouTube
Stanford CS224W: ML with Graphs | 2021 | Lecture 15.4 - Applications of Deep Graph Generation
🎞 Tutorial: Graph Neural Networks in TensorFlow: A Practical Guide
💥Free recorded course by Sami Abu-el-Haija, Neslihan Bulut, Bryan Perozzi, and Anton Tsitsulin
💥Graphs are general data structures that can represent information from a variety of domains (social, biomedical, online transactions, and many more). Graph Neural Networks (GNNs) are quickly becoming the de facto machine learning models for learning from graph data and thereby inferring missing information, such as predicting node labels or imputing missing edges. The main goal of this tutorial is to help practitioners and researchers implement GNNs in a TensorFlow setting. The tutorial is mostly hands-on: it walks the audience through running existing GNNs on heterogeneous graph data and gives a tour of how to implement new GNN models. The hands-on portion is based on TF-GNN, a newly open-sourced framework.
📽 Watch
📲Channel: @ComplexNetworkAnalysis
#video #course #Graph #GNN #code #python #tensorflow
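💻 The tutorial itself works with TF-GNN's GraphTensor API; as a warm-up, here is a plain tf.keras sketch of the single message-passing step a GNN layer performs on a dense normalized adjacency (this is not TF-GNN code, and the toy graph is random):
```python
import numpy as np
import tensorflow as tf

class DenseGCNLayer(tf.keras.layers.Layer):
    """One graph-convolution step, H' = ReLU(A_norm @ H @ W), on a dense adjacency.
    TF-GNN wraps this kind of message passing behind its GraphTensor abstraction."""
    def __init__(self, units):
        super().__init__()
        self.dense = tf.keras.layers.Dense(units, use_bias=False)

    def call(self, inputs):
        a_norm, h = inputs
        return tf.nn.relu(tf.matmul(a_norm, self.dense(h)))

# Toy data: 6 nodes, 4 features, random symmetric adjacency with self-loops.
n, d = 6, 4
adj = (np.random.rand(n, n) < 0.4).astype(np.float32)
adj = np.maximum(adj, adj.T)
np.fill_diagonal(adj, 1.0)                           # add self-loops
deg = adj.sum(axis=1)
a_norm = adj / np.sqrt(np.outer(deg, deg))           # D^-1/2 A D^-1/2
h = np.random.randn(n, d).astype(np.float32)

layer = DenseGCNLayer(units=8)
out = layer([tf.constant(a_norm), tf.constant(h)])
print(out.shape)                                     # (6, 8)
```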
YouTube
Tutorial: Graph Neural Networks in TensorFlow: A Practical Guide
🎞 Machine Learning with Graphs: Generative Models for Graphs
💥Free recorded course by Jure Leskovec, Computer Science, PhD
💥In this lecture, we will cover generative models for graphs. The goal of generative models for graphs is to generate synthetic graphs that are similar to given example graphs. Graph generation is important because it offers insight into the formation process of graphs, which is crucial for prediction, simulation, and anomaly detection on graphs. In the first part, we introduce the properties of real-world graphs that a successful graph generative model should fit. These graph statistics include the degree distribution, clustering coefficient, connected components, and path length.
📽 Watch
📲Channel: @ComplexNetworkAnalysis
#video #course #Graph #Machine_Learning #Generative_Models
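💻 The graph statistics listed above are easy to compute with NetworkX, which is handy for checking how well a generative model's samples match a reference graph (the karate-club graph stands in for a real-world network):
```python
import networkx as nx
import numpy as np

G = nx.karate_club_graph()                           # stand-in for a real-world graph

degrees = [d for _, d in G.degree()]
print("degree distribution:", np.bincount(degrees))
print("avg clustering coefficient:", nx.average_clustering(G))

components = list(nx.connected_components(G))
print("connected components:", len(components))

largest = G.subgraph(max(components, key=len))       # path length needs a connected graph
print("avg shortest path length:", nx.average_shortest_path_length(largest))
```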
YouTube
Stanford CS224W: Machine Learning with Graphs | 2021 | Lecture 14.1 - Generative Models for Graphs
🎞 Graph Analytics and Graph-based Machine Learning
💥Free recorded course by Clair Sullivan (Neo4j)
💥Machine learning has traditionally revolved around creating models around data that is characterized by embeddings attributed to individual observations. However, this ignores a signal that could potentially be very strong: the relationships between data points. Network graphs provide great opportunities for identifying relationships that we may not even realize exist within our data. Further, a variety of methods exist to create embeddings of graphs that can enrich models and provide new insights.
In this talk we will look at some examples of common ML problems and demonstrate how they can take advantage of graph analytics and graph-based machine learning. We will also demonstrate how graph embeddings can be used to enhance existing ML pipelines.
📽 Watch
📲Channel: @ComplexNetworkAnalysis
#video #course #Graph #Machine_Learning
YouTube
Graph Analytics and Graph-based Machine Learning
🎞 Machine Learning with Graphs: Applications of Deep Graph Generation.
💥Free recorded course by Jure Leskovec, Computer Science, PhD
📽 Watch
📲Channel: @ComplexNetworkAnalysis
#video #course #Graph #Machine_Learning #DGNN #GNN
YouTube
Stanford CS224W: ML with Graphs | 2021 | Lecture 15.4 - Applications of Deep Graph Generation
🎞 Machine Learning with Graphs: Graph Neural Networks in Computational Biology
💥Free recorded course by Prof. Marinka Zitnik
💥In this lecture, Prof. Zitnik gives an overview of why graph learning techniques can greatly help computational biology research. Concretely, the talk covers three exemplar use cases: (1) discovering safe drug-drug combinations via multi-relational link prediction on heterogeneous knowledge graphs; (2) classifying patient outcomes and diseases via learning subgraph embeddings; and (3) learning effective disease treatments through few-shot learning for graphs.
📽 Watch
📲Channel: @ComplexNetworkAnalysis
#video #course #Graph #GNN #Machine_Learning #computational_biology
YouTube
Stanford CS224W: Machine Learning with Graphs | 2021 | Lecture 18 - GNNs in Computational Biology
🎞 Machine Learning with Graphs: Pre-Training Graph Neural Networks
💥Free recorded course by Prof. Jure Leskovec
💥There are two challenges in applying GNNs to scientific domains: scarcity of labeled data and out-of-distribution prediction. In this video we discuss methods for pre-training GNNs to resolve these challenges. The key idea is to pre-train both node and graph embeddings, which leads to significant performance gains on downstream tasks.
📽 Watch
📑More details can be found in the paper: Strategies for Pre-training Graph Neural Networks
📲Channel: @ComplexNetworkAnalysis
#video #course #Graph #GNN #Machine_Learning
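💻 The paper proposes specific node-level and graph-level pre-training strategies (e.g., attribute masking and context prediction). The sketch below illustrates only the shared pattern: pre-train a GNN encoder on a self-supervised objective (here, edge reconstruction), then reuse it for a downstream head; all sizes and data are toy:
```python
import torch
import torch.nn as nn

class DenseGCNEncoder(nn.Module):
    """Two-layer GCN on a dense normalized adjacency (illustration only)."""
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.lin1 = nn.Linear(in_dim, hid_dim)
        self.lin2 = nn.Linear(hid_dim, hid_dim)

    def forward(self, a_norm, x):
        h = torch.relu(a_norm @ self.lin1(x))
        return a_norm @ self.lin2(h)

def pretrain_by_edge_reconstruction(encoder, a_norm, x, adj, epochs=200, lr=1e-2):
    """Self-supervised pre-training: predict edges from node embeddings."""
    opt = torch.optim.Adam(encoder.parameters(), lr=lr)
    for _ in range(epochs):
        z = encoder(a_norm, x)
        logits = z @ z.t()                           # inner-product edge decoder
        loss = nn.functional.binary_cross_entropy_with_logits(logits, adj)
        opt.zero_grad(); loss.backward(); opt.step()
    return encoder

# Toy unlabeled graph standing in for a large pre-training corpus.
n, d = 30, 8
adj = (torch.rand(n, n) < 0.15).float()
adj = ((adj + adj.t()) > 0).float()
adj.fill_diagonal_(1.0)                              # add self-loops
deg = adj.sum(1)
a_norm = adj / (deg.sqrt().unsqueeze(1) * deg.sqrt().unsqueeze(0))   # D^-1/2 A D^-1/2
x = torch.randn(n, d)

encoder = pretrain_by_edge_reconstruction(DenseGCNEncoder(d, 16), a_norm, x, adj)
classifier = nn.Linear(16, 3)                        # downstream head; fine-tune both on labels
logits = classifier(encoder(a_norm, x))
print(logits.shape)                                  # [30, 3]
```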
arXiv.org
Strategies for Pre-training Graph Neural Networks
🎞 Machine Learning with Graphs: hyperbolic graph embeddings
💥Free recorded course by Prof. Jure Leskovec
💥 Previous lectures focused on graph representation learning in Euclidean embedding spaces. In this lecture, we introduce hyperbolic embedding spaces, which are well suited for modeling hierarchical, tree-like graphs. We also cover the basics of hyperbolic geometry models, which leads to the idea of hyperbolic GNNs. More details can be found in the paper: Hyperbolic Graph Convolutional Neural Networks.
📽 Watch
📲Channel: @ComplexNetworkAnalysis
#video #course #Graph #GNN #Machine_Learning
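💻 The core object is the hyperbolic distance itself; in the Poincaré ball model it is d(u, v) = arccosh(1 + 2‖u − v‖² / ((1 − ‖u‖²)(1 − ‖v‖²))), which a few lines of NumPy make concrete:
```python
import numpy as np

def poincare_distance(u, v, eps=1e-9):
    """Geodesic distance in the Poincare ball model of hyperbolic space."""
    num = 2.0 * np.sum((u - v) ** 2)
    den = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2)) + eps
    return np.arccosh(1.0 + num / den)

# Points near the boundary of the unit ball are exponentially far from each other,
# which is what makes hyperbolic space a good fit for tree-like hierarchies.
root = np.array([0.0, 0.0])
leaf_a = np.array([0.95, 0.0])
leaf_b = np.array([0.0, 0.95])
print(poincare_distance(root, leaf_a))    # moderate
print(poincare_distance(leaf_a, leaf_b))  # much larger than the Euclidean distance
```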
YouTube
Stanford CS224W: Machine Learning with Graphs | 2021 | Lecture 19.2 - Hyperbolic Graph Embeddings
🎞 Machine Learning with Graphs: design space of graph neural networks
💥Free recorded course by Prof. Jure Leskovec
💥 This part discusses the important topic of GNN architecture design. We introduce three key aspects of GNN design: (1) a general GNN design space, covering intra-layer design, inter-layer design, and learning configurations; (2) a GNN task space with similarity metrics, so that different GNN tasks can be characterized and the best GNN models transferred across tasks; and (3) an effective GNN evaluation technique, so that any GNN design question, such as "Is BatchNorm generally useful for GNNs?", can be answered convincingly. Overall, we provide the first systematic investigation of general guidelines for GNN design, an understanding of GNN tasks, and how to transfer the best GNN designs across tasks. We release GraphGym as an easy-to-use code platform for GNN architecture design. More information can be found in the paper: Design Space for Graph Neural Networks.
📽 Watch
📲Channel: @ComplexNetworkAnalysis
#video #course #Graph #GNN #Machine_Learning
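💻 GraphGym drives these choices through configuration files; the sketch below is not its API, just a single dense message-passing layer whose intra-layer design dimensions (aggregation, BatchNorm, dropout, activation) are exposed as arguments to make the "design space" idea concrete:
```python
import torch
import torch.nn as nn

class ConfigurableGNNLayer(nn.Module):
    """One message-passing layer whose intra-layer design choices are arguments,
    mimicking a few dimensions of the GNN design space (not GraphGym's actual API)."""
    def __init__(self, in_dim, out_dim, aggr="mean", batchnorm=True,
                 dropout=0.1, activation=nn.ReLU):
        super().__init__()
        self.aggr = aggr
        self.lin = nn.Linear(in_dim, out_dim)
        self.bn = nn.BatchNorm1d(out_dim) if batchnorm else nn.Identity()
        self.drop = nn.Dropout(dropout)
        self.act = activation()

    def forward(self, adj, x):
        msgs = self.lin(x)                            # transform node features
        if self.aggr == "mean":
            deg = adj.sum(1, keepdim=True).clamp(min=1)
            h = (adj @ msgs) / deg
        elif self.aggr == "sum":
            h = adj @ msgs
        elif self.aggr == "max":
            # broadcast features to (N, N, out), mask non-neighbors, max over neighbors
            expanded = msgs.unsqueeze(0).expand(adj.size(0), -1, -1)
            masked = torch.where(adj.unsqueeze(-1) > 0, expanded,
                                 torch.full_like(expanded, float("-inf")))
            h = masked.max(dim=1).values
        return self.drop(self.act(self.bn(h)))

adj = torch.eye(6) + (torch.rand(6, 6) < 0.3).float()
adj = ((adj + adj.t()) > 0).float()
x = torch.randn(6, 4)
for aggr in ["mean", "sum", "max"]:                   # sweep one design dimension
    layer = ConfigurableGNNLayer(4, 8, aggr=aggr)
    print(aggr, layer(adj, x).shape)
```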
arXiv.org
Design Space for Graph Neural Networks
🎞 Machine Learning with Graphs: GraphSAGE Neighbor Sampling
💥Free recorded course by Prof. Jure Leskovec
💥 This part discusses Neighbor Sampling, a representative method for scaling up GNNs to large graphs. The key insight is that a K-layer GNN generates a node embedding using only the nodes in the K-hop neighborhood around that node. Therefore, to generate embeddings for the nodes in a mini-batch, only the K-hop neighborhood nodes and their features need to be loaded onto the GPU, which is tractable even if the original graph is large. To further reduce the computational cost, only a subset of neighboring nodes is sampled for the GNN to aggregate.
📽 Watch
📲Channel: @ComplexNetworkAnalysis
#video #course #Graph #GNN #Machine_Learning #GraphSAGE
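💻 The core of neighbor sampling fits in a few lines: per GNN layer, keep only a fixed random fanout of neighbors when expanding the K-hop neighborhood of the mini-batch seeds. The sketch below uses a toy adjacency list; libraries such as PyTorch Geometric and DGL ship loaders that implement this at scale:
```python
import random
from collections import defaultdict

def sample_k_hop(adj, seeds, fanouts):
    """Fixed-fanout neighbor sampling: for each GNN layer, keep at most `fanout`
    random neighbors per node, so the mini-batch's computation graph stays small."""
    layers, frontier = [set(seeds)], set(seeds)
    for fanout in fanouts:                       # one fanout per GNN layer (K hops total)
        nxt = set()
        for node in frontier:
            neigh = adj[node]
            nxt.update(random.sample(neigh, min(fanout, len(neigh))))
        frontier = nxt
        layers.append(frontier)
    return set().union(*layers)                  # all nodes to load onto the GPU

# Toy adjacency list standing in for a large graph.
adj = defaultdict(list)
edges = [(0, 1), (0, 2), (1, 3), (2, 3), (3, 4), (4, 5), (5, 6), (2, 6)]
for u, v in edges:
    adj[u].append(v)
    adj[v].append(u)

batch_nodes = sample_k_hop(adj, seeds=[0], fanouts=[2, 2])   # 2-layer GNN, fanout 2
print(batch_nodes)
```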
YouTube
Stanford CS224W: Machine Learning with Graphs | 2021 | Lecture 17.2 - GraphSAGE Neighbor Sampling