Self-attention can be viewed as message passing on a fully connected graph where each token (node) sends a weighted message to every other token.
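To make the message-passing view concrete, here is a minimal NumPy sketch of single-head self-attention. The function and weight names (`self_attention`, `Wq`, `Wk`, `Wv`) are illustrative, not from the source; it assumes learned query/key/value projections and scaled dot-product scores.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Single-head self-attention over token embeddings X: (n_tokens, d_model).

    Message-passing view: every token sends its value vector as a "message"
    to every other token; the edge weight from token i to token j is the
    softmax-normalized compatibility of j's query with i's key.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # edge weights of the fully connected graph, (n, n)
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V                  # each token aggregates its incoming messages

# Tiny usage example with random embeddings and projections.
rng = np.random.default_rng(0)
n, d_model = 4, 8
X = rng.standard_normal((n, d_model))
Wq, Wk, Wv = (rng.standard_normal((d_model, d_model)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)  # (4, 8)
```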
Multi-head attention runs several attention operations in parallel, each with its own learned projections, so that different heads can focus on different relationships in the data.
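A minimal sketch of the multi-head variant, under the common assumption that the model dimension is split evenly across heads and the concatenated head outputs are mixed by an output projection; the names (`multi_head_attention`, `Wo`, `n_heads`) are illustrative.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(X, Wq, Wk, Wv, Wo, n_heads):
    """Multi-head self-attention over X: (n_tokens, d_model).

    Runs n_heads scaled dot-product attentions in parallel, each on its own
    d_head-sized slice of the projections, then concatenates the head
    outputs and mixes them with the output projection Wo.
    """
    n, d_model = X.shape
    d_head = d_model // n_heads
    Q, K, V = X @ Wq, X @ Wk, X @ Wv

    def split(M):
        # (n, d_model) -> (n_heads, n, d_head): one slice per head.
        return M.reshape(n, n_heads, d_head).transpose(1, 0, 2)

    Qh, Kh, Vh = split(Q), split(K), split(V)
    scores = Qh @ Kh.transpose(0, 2, 1) / np.sqrt(d_head)  # (n_heads, n, n)
    weights = softmax(scores, axis=-1)   # a separate attention pattern per head
    heads = weights @ Vh                 # (n_heads, n, d_head)
    concat = heads.transpose(1, 0, 2).reshape(n, d_model)
    return concat @ Wo

# Tiny usage example: 2 heads over 4 tokens of dimension 8.
rng = np.random.default_rng(0)
n, d_model, n_heads = 4, 8, 2
X = rng.standard_normal((n, d_model))
Wq, Wk, Wv, Wo = (rng.standard_normal((d_model, d_model)) for _ in range(4))
print(multi_head_attention(X, Wq, Wk, Wv, Wo, n_heads).shape)  # (4, 8)
```

Because each head attends over its own low-dimensional slice, the total cost is comparable to a single full-width head, while the heads are free to learn distinct attention patterns.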