Groups
Batch Normalization recenters and rescales activations using per-feature mini-batch statistics (mean and variance), then applies learned scale and shift parameters; this stabilizes and speeds up neural network training.
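The normalization step can be sketched as follows. This is a minimal illustration for a batch of scalar activations, not a full layer implementation; the function name `batch_norm` and the default `gamma`, `beta`, and `eps` values are illustrative choices, not from the original text.

```python
import math

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    # Compute mini-batch statistics.
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    # Normalize to (near-)zero mean and unit variance,
    # then apply the learned scale (gamma) and shift (beta).
    return [gamma * (v - mean) / math.sqrt(var + eps) + beta for v in x]

out = batch_norm([1.0, 2.0, 3.0, 4.0])
# out has mean ~0 and variance ~1
```

In a real layer, `gamma` and `beta` are trained parameters, and running averages of the batch statistics are kept for use at inference time.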
Dropout randomly zeroes a fraction of neurons at each training step, which discourages co-adaptation and reduces overfitting; at inference time all neurons are active.
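A minimal sketch of the idea, using the common "inverted dropout" variant in which surviving activations are scaled by 1/(1-p) during training so that no rescaling is needed at inference; the function name `dropout` and the drop probability `p` are illustrative, not from the original text.

```python
import random

def dropout(x, p=0.5, training=True, rng=random):
    # At inference (or with p=0), pass activations through unchanged.
    if not training or p == 0.0:
        return list(x)
    keep = 1.0 - p
    # Zero each unit with probability p; scale survivors by 1/keep
    # so the expected activation is unchanged.
    return [v / keep if rng.random() < keep else 0.0 for v in x]

random.seed(0)
out = dropout([1.0] * 1000, p=0.5)
# roughly half the entries are 0.0, the rest are scaled to 2.0
```

Because of the 1/keep scaling, the mean activation stays close to the original mean (here 1.0), which is why the same weights work at inference without adjustment.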