Top Trends of Graph Machine Learning in 2020
In this blog post, the author gives an overview of ICLR 2020 papers on Graph Machine Learning and highlights several trends:
1. A more solid theoretical understanding of GNNs:
* the dimension of the node embeddings should be proportional to the size of the graph if we want GNNs to be able to compute solutions to popular graph problems
* under certain conditions on the weights, GCNs cannot learn anything except node degrees and connected components as the number of layers grows
* a certain readout operation after neighborhood aggregation could help capture different types of node classification tasks
2. New cool applications of GNNs:
* a way to simultaneously detect and fix bugs in JavaScript code
* inferring the types of variables for languages like Python or TypeScript
* reasoning in IQ-like tests (Raven Progressive Matrices (RPM) and Diagram Syllogism (DS)) with GNNs
* an RL algorithm to optimize the cost of TensorFlow computation graphs
3. Knowledge graphs are becoming more popular:
* an idea to embed a query into a latent space not as a single point, but as a rectangular box
* a way to work with numerical entities and rules
* a re-evaluation of existing models and how they perform in a fair setting
4. New frameworks for graph embeddings:
* a way to improve running time and accuracy on the node classification problem for any unsupervised embedding method
* a simple baseline that does not utilize the graph topology (i.e. it works on aggregated node features) performs on par with SOTA GNNs
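To make the box-embedding idea from the knowledge-graph section more concrete, here is a minimal sketch of the geometric intuition: a query is embedded not as a point but as an axis-aligned box (a center vector plus a non-negative offset vector), and candidate entities are scored by whether, or how far, they fall outside the box. The function names and the toy vectors below are illustrative, not taken from the paper.

```python
import numpy as np

def inside_box(entity, center, offset):
    """True if the entity embedding lies inside the axis-aligned box."""
    return bool(np.all(np.abs(entity - center) <= offset))

def box_distance(entity, center, offset):
    """Distance from an entity to the box: zero inside, growing outside."""
    outside = np.maximum(np.abs(entity - center) - offset, 0.0)
    return float(np.linalg.norm(outside, ord=1))

# A 2-D box centered at the origin with half-widths (1.0, 0.5).
center = np.array([0.0, 0.0])
offset = np.array([1.0, 0.5])

print(inside_box(np.array([0.5, 0.2]), center, offset))    # True
print(box_distance(np.array([2.0, 0.0]), center, offset))  # 1.0
```

The appeal of boxes over points is that a query can have many answer entities, and a box naturally represents that set as a region rather than a single location.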
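The topology-free baseline in the last bullet can be sketched as a one-off preprocessing step: aggregate each node's neighbor features once, then train any off-the-shelf classifier on the resulting vectors, with no message passing during training. The helper name and the tiny path graph below are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

def aggregate_features(adj, feats):
    """Concatenate each node's own features with the mean of its neighbors' features."""
    deg = adj.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0  # avoid division by zero for isolated nodes
    neighbor_mean = (adj @ feats) / deg
    return np.concatenate([feats, neighbor_mean], axis=1)

# 4-node path graph: 0 - 1 - 2 - 3, with one-hot node features.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
feats = np.eye(4)

agg = aggregate_features(adj, feats)
print(agg.shape)  # (4, 8): original features plus the aggregated ones
```

The resulting `agg` matrix would then be fed to a standard classifier (e.g. logistic regression); the point of the baseline is that this cheap preprocessing already competes with full GNNs on node classification.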
blog post:
https://towardsdatascience.com/top-trends-of-graph-machine-learning-in-2020-1194175351a3
#ICLR #gnn #graphs