[2021-01-18] [CS224W] Graph Neural Network

Basics of Graph Neural Network

Idea: generate node embeddings based on local network neighborhoods.

Neighborhood aggregation: average information from a node's neighbors and apply a neural network.

\[h_v^0 = x_v \\
h_v^k = \sigma\left(W_k \sum_{u \in N(v)} \frac{h_u^{k-1}}{|N(v)|} + B_k h_v^{k-1}\right), \quad \forall k \in \{1, \cdots, K\} \\
z_v = h_v^K\]

$W_k$ and $B_k$ are trainable parameters.
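A minimal sketch of one such update in PyTorch, assuming a dense adjacency matrix `A` and using ReLU for $\sigma$; the class and variable names are illustrative, not from the lecture:

```python
import torch
import torch.nn as nn

class BasicGNNLayer(nn.Module):
    """One round of neighborhood aggregation: average neighbor messages, add a self term."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)  # W_k, applied to the neighbor average
        self.B = nn.Linear(in_dim, out_dim, bias=False)  # B_k, applied to h_v^{k-1}

    def forward(self, A, H):
        # A: (N, N) dense adjacency matrix, H: (N, in_dim) embeddings h^{k-1}
        deg = A.sum(dim=1, keepdim=True).clamp(min=1)    # |N(v)|, clamped to avoid division by zero
        neighbor_mean = (A @ H) / deg                    # sum_{u in N(v)} h_u^{k-1} / |N(v)|
        return torch.relu(self.W(neighbor_mean) + self.B(H))  # sigma = ReLU here
```

Stacking $K$ of these layers on the input features ($h_v^0 = x_v$) and reading out the last layer gives the embeddings $z_v = h_v^K$.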

Graph Convolutional Networks and GraphSAGE

\[h_v^k = \sigma\left(\left[W_k \cdot \text{AGG}\left(\{h_u^{k-1}, \forall u \in N(v)\}\right),\ B_k h_v^{k-1}\right]\right), \quad \forall k \in \{1, \cdots, K\}\]

AGG variants: mean, pool, LSTM
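A rough sketch of such a layer with the mean and pool aggregators (the LSTM variant is omitted). The dense adjacency matrix `A`, the `pool_mlp` transform, and all other names are assumptions for illustration, not the lecture's code:

```python
import torch
import torch.nn as nn

class GraphSAGELayer(nn.Module):
    """GraphSAGE-style update: concatenate the transformed self embedding with an aggregated neighbor message."""
    def __init__(self, in_dim, out_dim, agg="mean"):
        super().__init__()
        self.agg = agg
        self.W = nn.Linear(in_dim, out_dim, bias=False)   # W_k, applied to AGG(...)
        self.B = nn.Linear(in_dim, out_dim, bias=False)   # B_k, applied to h_v^{k-1}
        self.pool_mlp = nn.Linear(in_dim, in_dim)          # used only by the pool aggregator

    def forward(self, A, H):
        # A: (N, N) dense adjacency matrix, H: (N, in_dim) embeddings h^{k-1}
        if self.agg == "mean":
            msg = (A @ H) / A.sum(dim=1, keepdim=True).clamp(min=1)
        else:  # "pool": element-wise max over transformed neighbor embeddings
            Ht = torch.relu(self.pool_mlp(H))              # (N, in_dim)
            neg_inf = torch.full((1,), float("-inf"))
            # keep only the neighbors of each node, then take the per-dimension max over them
            masked = torch.where(A.unsqueeze(-1) > 0, Ht.unsqueeze(0), neg_inf)
            msg = masked.max(dim=1).values
        # [W_k * AGG(...), B_k * h_v^{k-1}] followed by the nonlinearity
        return torch.relu(torch.cat([self.W(msg), self.B(H)], dim=-1))
```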

Efficient Implementation:

  • sparse matrix operations: the aggregation over all neighborhoods can be done at once (see the sketch below)
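For instance, the neighborhood sums for every node can be computed with a single sparse matrix product. A minimal sketch on a hypothetical 3-node toy graph (the edge list and sizes are made up for illustration):

```python
import torch

# Hypothetical toy graph: edges stored as a sparse COO adjacency matrix.
edge_index = torch.tensor([[0, 1, 1, 2],    # source nodes
                           [1, 0, 2, 1]])   # target nodes
values = torch.ones(edge_index.shape[1])
A = torch.sparse_coo_tensor(edge_index, values, size=(3, 3))

H = torch.randn(3, 16)                                              # node embeddings h^{k-1}
deg = torch.sparse.sum(A, dim=1).to_dense().clamp(min=1).unsqueeze(-1)  # |N(v)| per node

# One sparse matmul aggregates all neighborhoods at once: (A @ H)[v] = sum_{u in N(v)} h_u^{k-1}
neighbor_sum = torch.sparse.mm(A, H)
neighbor_mean = neighbor_sum / deg
```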

Graph Attention Networks

Specify arbitrary importances to different neighbors of each node in the graph. Let $\alpha_{vu}$ be computed as a byproduct of an attention mechanism $a$:

\[e_{vu} = a(W_k h_u^{k-1}, W_k h_v^{k-1}) \\
\alpha_{vu} = \frac{\exp(e_{vu})}{\sum_{w \in N(v)} \exp(e_{vw})} \\
h_v^k = \sigma\left(\sum_{u \in N(v)} \alpha_{vu} W_k h_u^{k-1}\right)\]

where $e_{vu}$ indicates the importance of node $u$'s message to node $v$, and $\alpha_{vu}$ is its softmax-normalized weight over $N(v)$.

Attention mechanism $a$

  • e.g. use a simple single-layer neural network
  • parameters of $a$ are trained jointly with the other model weights, end-to-end (a sketch follows below)
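Below is a sketch of a single-head attention layer in this style. It follows the GAT paper's choice of a single-layer network with a LeakyReLU for $a$, assumes a dense adjacency matrix, and assumes every node has at least one neighbor (or a self-loop) so the softmax is well defined; the names are illustrative:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GATLayer(nn.Module):
    """Single-head graph attention layer: attention-weighted neighborhood aggregation."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)   # shared transform W_k
        self.a = nn.Linear(2 * out_dim, 1, bias=False)    # single-layer attention network a

    def forward(self, A, H):
        Wh = self.W(H)                                     # (N, out_dim)
        N = Wh.shape[0]
        # e_{vu} = a(W_k h_u, W_k h_v), computed for all pairs via concatenation
        pairs = torch.cat([Wh.unsqueeze(0).expand(N, N, -1),   # W_k h_u (u along dim 1)
                           Wh.unsqueeze(1).expand(N, N, -1)],  # W_k h_v (v along dim 0)
                          dim=-1)
        e = F.leaky_relu(self.a(pairs)).squeeze(-1)        # (N, N), e[v, u] = e_{vu}
        # normalize only over each node's neighborhood: mask out non-edges before the softmax
        alpha = torch.softmax(e.masked_fill(A == 0, float("-inf")), dim=1)
        return torch.relu(alpha @ Wh)                      # sigma(sum_u alpha_{vu} W_k h_u^{k-1})
```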