Batching for GNN
- Given I training graphs {Xi, Ai} with labels yi, and parameters Φ = {βk, Ωk} for k = 0…K; train with SGD and binary cross-entropy loss
- We cannot concatenate the graphs in a batch into one big tensor, since each graph has a different number of nodes
- Instead, we treat the graphs in each batch as disjoint components of a single large graph.
- We can then run the network over this big graph as a single instance of the network equations
- Then mean-pool the node embeddings separately within each graph, giving one embedding per graph
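The disjoint-union trick above can be sketched in NumPy. This is a minimal illustration, not a reference implementation: the function names (`batch_graphs`, `gnn_layer`, `mean_pool`) and the specific layer rule `relu((A + I) X Ω + β)` are my assumptions for a simple message-passing layer, chosen to match the {βk, Ωk} notation.

```python
import numpy as np

def batch_graphs(Xs, As):
    """Merge graphs into one big graph: block-diagonal adjacency,
    stacked node features, and a graph-index vector for pooling."""
    X = np.concatenate(Xs, axis=0)                  # (total_nodes, D)
    sizes = [A.shape[0] for A in As]
    N = sum(sizes)
    A = np.zeros((N, N))
    graph_idx, offset = [], 0
    for g, (A_g, n) in enumerate(zip(As, sizes)):
        A[offset:offset + n, offset:offset + n] = A_g   # disjoint block
        graph_idx += [g] * n
        offset += n
    return X, A, np.array(graph_idx)

def gnn_layer(X, A, Omega, beta):
    """One simple message-passing layer: H = relu((A + I) X Omega + beta).
    Adding I lets each node keep its own features."""
    return np.maximum(0.0, (A + np.eye(A.shape[0])) @ X @ Omega + beta)

def mean_pool(H, graph_idx, num_graphs):
    """Mean of node embeddings, taken separately per graph."""
    return np.stack([H[graph_idx == g].mean(axis=0)
                     for g in range(num_graphs)])
```

Because the cross-graph blocks of A are zero, no messages pass between graphs, so the batched forward pass gives the same result as running each graph alone.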
Neighborhood Sampling
- Randomly sample a fixed number of neighbors at each hop, recursively, instead of using every neighbor
- Somewhat like dropout: each batch sees a different random subset of the graph
Graph Partitioning
- Cluster the original graph into disjoint subsets of nodes (edges between clusters are dropped, so the subsets are not connected to each other)
- Each subset is used as a batch
- this converts a transductive problem to an inductive one
- For inference, use the full k-hop neighborhood of each node (where k is the number of layers)
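The two pieces above can be sketched as follows; this is an illustration under my own assumptions (a precomputed node-to-cluster assignment, hypothetical helper names). `partition_subgraphs` builds one induced subgraph per cluster for training, and `k_hop_nodes` gathers the k-hop neighborhood a node needs at inference time.

```python
import numpy as np

def partition_subgraphs(A, X, clusters):
    """Given a node -> cluster assignment, build the induced subgraph
    for each cluster; cross-cluster edges are dropped, so each subgraph
    is an independent training batch."""
    batches = []
    for c in np.unique(clusters):
        idx = np.where(clusters == c)[0]
        batches.append((X[idx], A[np.ix_(idx, idx)], idx))
    return batches

def k_hop_nodes(A, node, k):
    """Indices of all nodes within k hops of `node` -- the set an
    exact k-layer forward pass at `node` depends on."""
    reach = np.zeros(A.shape[0], dtype=bool)
    reach[node] = True
    for _ in range(k):
        reach = reach | (A @ reach > 0)   # expand by one hop
    return np.where(reach)[0]
```

How the clusters are found (e.g. METIS-style partitioning) is a separate choice; the point here is only that training sees small disjoint subgraphs while inference restores the exact k-hop receptive field.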