Multi-task loss balancing automatically sets each task’s weight so that no single task’s loss dominates the combined training objective.
Dropout randomly zeros a fraction of activations during training, which discourages neurons from co-adapting and reduces overfitting; at inference all units are kept active.
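A minimal NumPy sketch of inverted dropout, the common formulation: surviving activations are scaled by 1/(1-p) during training so the expected activation matches inference, where no scaling is needed. The function name and signature are illustrative.

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout (illustrative sketch): during training, zero each
    activation with probability p and scale survivors by 1/(1-p) so the
    expected value is unchanged. At inference, return x untouched."""
    if not training or p == 0.0:
        return x
    rng = rng if rng is not None else np.random.default_rng()
    mask = rng.random(x.shape) >= p   # True with probability 1 - p
    return x * mask / (1.0 - p)
```

Because of the 1/(1-p) rescaling, the mean activation stays roughly constant between training and inference, so no extra scaling is needed at test time.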