Normalization
Layer Normalization rescales and recenters each sample across its feature dimensions, so its statistics do not depend on the mini-batch; it works identically at batch size 1 and at train and inference time.
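A minimal NumPy sketch of the idea: statistics are computed per sample over the feature (last) axis, followed by a learnable rescale (`gamma`) and recenter (`beta`). The function name and shapes here are illustrative, not a specific library API.

```python
import numpy as np

def layer_norm(x, gamma, beta, eps=1e-5):
    # Per-sample statistics over the feature axis (last axis)
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta  # learnable rescale and recenter

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))  # batch of 4 samples, 8 features each
y = layer_norm(x, gamma=np.ones(8), beta=np.zeros(8))
```

After normalization, each row (sample) of `y` has approximately zero mean and unit variance, regardless of how many samples are in the batch.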
Batch Normalization rescales and recenters activations using mini-batch statistics to stabilize and speed up neural network training.
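The contrast with the previous definition can be sketched the same way: here the statistics are computed over the batch axis, one mean and variance per feature. This is an illustrative implementation of the training-time forward pass only (real implementations also track running statistics for inference).

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Mini-batch statistics over the batch axis (axis 0), one per feature
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta  # learnable rescale and recenter

rng = np.random.default_rng(1)
x = rng.normal(loc=3.0, scale=2.0, size=(32, 8))  # batch of 32, 8 features
y = batch_norm(x, gamma=np.ones(8), beta=np.zeros(8))
```

Each feature column of `y` now has approximately zero mean and unit variance across the batch, which is what stabilizes the distribution of activations during training.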