Graph structures, spectral methods, and message passing for graph neural networks.
6 concepts
The graph Laplacian translates a graph's connectivity into a matrix that measures how much a function varies across edges.
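A minimal numerical sketch of this idea, using an assumed toy 4-node path graph (not from the source): the quadratic form x^T L x equals the sum of squared differences of x across edges.

```python
import numpy as np

# Hypothetical example graph: the path 0-1-2-3.
A = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

D = np.diag(A.sum(axis=1))   # degree matrix
L = D - A                    # combinatorial graph Laplacian

# x^T L x = sum over edges (i, j) of (x_i - x_j)^2:
# it measures how much the function x varies across the graph's edges.
x = np.array([0.0, 1.0, 1.0, 2.0])
quad = x @ L @ x
edge_sum = sum((x[i] - x[j]) ** 2 for i, j in [(0, 1), (1, 2), (2, 3)])
print(quad, edge_sum)  # both 2.0
```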
Spectral graph theory studies graphs by looking at eigenvalues and eigenvectors of matrices like the adjacency matrix A and Laplacians L and L_norm.
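One classic spectral fact, sketched on an assumed toy graph (two disjoint edges): the multiplicity of the eigenvalue 0 of L equals the number of connected components.

```python
import numpy as np

# Hypothetical example: two disjoint edges, components {0, 1} and {2, 3}.
A = np.zeros((4, 4))
A[0, 1] = A[1, 0] = 1.0
A[2, 3] = A[3, 2] = 1.0
L = np.diag(A.sum(axis=1)) - A   # combinatorial Laplacian L = D - A

eigvals = np.linalg.eigvalsh(L)  # ascending eigenvalues of the symmetric L
# The number of (near-)zero eigenvalues counts connected components.
num_components = int(np.sum(np.isclose(eigvals, 0.0)))
print(eigvals)
print(num_components)  # 2
```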
Message Passing Neural Networks (MPNNs) learn on graphs by letting nodes repeatedly exchange and aggregate messages from their neighbors.
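A single message-passing layer can be sketched in a few lines; this is an assumed minimal variant (mean aggregation over neighbors, tanh update), with randomly initialized placeholder weights rather than any particular published architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical graph: triangle 0-1-2 plus a pendant node 3 attached to 2.
A = np.array([
    [0, 1, 1, 0],
    [1, 0, 1, 0],
    [1, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)
H = rng.normal(size=(4, 3))       # node features, 4 nodes x 3 dims
W_self = rng.normal(size=(3, 3))  # placeholder (untrained) weights
W_nbr = rng.normal(size=(3, 3))

def mpnn_layer(A, H, W_self, W_nbr):
    # Message + aggregate: A @ H sums neighbor features;
    # dividing by degree turns the sum into a mean.
    deg = A.sum(axis=1, keepdims=True)
    agg = (A @ H) / deg
    # Update: combine each node's own state with its aggregated messages.
    return np.tanh(H @ W_self + agg @ W_nbr)

H1 = mpnn_layer(A, H, W_self, W_nbr)
print(H1.shape)  # (4, 3)
```

Stacking k such layers lets information propagate across k-hop neighborhoods.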
Graph isomorphism asks whether two graphs are the same up to renaming vertices; the Weisfeiler-Leman (WL) test is a powerful heuristic that often distinguishes non-isomorphic graphs quickly.
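A minimal sketch of 1-WL color refinement (a toy implementation, not a full isomorphism solver): each round, a node's color is replaced by a compressed signature of its own color plus the multiset of its neighbors' colors; differing color histograms certify non-isomorphism.

```python
# `adj` maps each node to a list of neighbors (an assumed toy format).
def wl_colors(adj, rounds=3):
    colors = {v: 0 for v in adj}  # every node starts with the same color
    for _ in range(rounds):
        # Signature = (own color, sorted multiset of neighbor colors).
        sigs = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
                for v in adj}
        # Compress distinct signatures back into small integer colors.
        palette = {s: i for i, s in enumerate(sorted(set(sigs.values())))}
        colors = {v: palette[sigs[v]] for v in adj}
    return colors

# A 3-node path vs. a triangle: 1-WL tells them apart by color histograms.
path = {0: [1], 1: [0, 2], 2: [1]}
triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
print(sorted(wl_colors(path).values()))      # [0, 0, 1]
print(sorted(wl_colors(triangle).values()))  # [0, 0, 0]
```

Equal histograms do not prove isomorphism: 1-WL famously fails to separate certain regular graphs, which is exactly the expressiveness ceiling of standard MPNNs.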
A random walk on a graph moves from a node to one of its neighbors chosen uniformly at random at each step.
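The walk's transition matrix is P = D^{-1} A, and on a connected non-bipartite graph the walk converges to the degree-proportional stationary distribution. A sketch on an assumed toy graph (triangle plus pendant node, chosen because it is non-bipartite):

```python
import numpy as np

# Hypothetical graph: triangle 0-1-2 plus a pendant node 3 attached to 2.
A = np.array([
    [0, 1, 1, 0],
    [1, 0, 1, 0],
    [1, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

deg = A.sum(axis=1)
P = A / deg[:, None]          # P[i, j] = probability of stepping i -> j
pi = deg / deg.sum()          # stationary distribution: pi_i ~ degree(i)

dist = np.array([1.0, 0.0, 0.0, 0.0])  # start the walk at node 0
for _ in range(200):
    dist = dist @ P           # evolve the distribution one step
print(np.allclose(dist, pi, atol=1e-6))  # True
```

On a bipartite graph the distribution oscillates instead of converging, which is why lazy walks (staying put with probability 1/2) are often used.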