Multi-task loss balancing aims to automatically set each task’s weight so that no single loss dominates training.
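One common way to do this is uncertainty weighting in the style of Kendall et al., where each task gets a learnable log-variance that scales its loss and a regularizer discourages the weight from collapsing to zero. A minimal sketch, assuming two tasks and PyTorch (all names and sizes here are illustrative, not a specific library API):

```python
import torch

# One learnable log-variance s_i per task; initialized to 0 (weight = 1).
log_vars = torch.zeros(2, requires_grad=True)

def balanced_loss(task_losses, log_vars):
    # total = sum_i exp(-s_i) * L_i + s_i
    # exp(-s_i) down-weights tasks with high estimated noise,
    # while the +s_i term penalizes making every weight tiny.
    total = torch.zeros(())
    for loss, s in zip(task_losses, log_vars):
        total = total + torch.exp(-s) * loss + s
    return total

# Toy per-task losses (in practice these come from the model's heads).
l1 = torch.tensor(2.0)
l2 = torch.tensor(0.5)
loss = balanced_loss([l1, l2], log_vars)
loss.backward()  # gradients flow into log_vars, so the weights are learned
```

With `log_vars` at zero, the combined loss is just the sum of the task losses; as training proceeds, gradient descent raises the log-variance of noisier tasks, shrinking their effective weight.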
Dropout can be interpreted as variational inference in a Bayesian neural network, where applying random masks approximates sampling from a posterior over weights.
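The practical upshot of this view is Monte Carlo dropout: keep the dropout masks active at prediction time and average several stochastic forward passes, treating their spread as predictive uncertainty. A minimal sketch in PyTorch (the architecture, sizes, and number of samples are illustrative assumptions):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Tiny illustrative network with a dropout layer.
model = nn.Sequential(
    nn.Linear(4, 16),
    nn.ReLU(),
    nn.Dropout(p=0.5),
    nn.Linear(16, 1),
)
model.train()  # train() keeps dropout sampling random masks at "test" time

x = torch.randn(1, 4)
with torch.no_grad():
    # Each forward pass draws a fresh mask, i.e. an approximate posterior sample.
    samples = torch.stack([model(x) for _ in range(100)])

mean = samples.mean(dim=0)  # predictive mean
std = samples.std(dim=0)    # spread across masks ~ predictive uncertainty
```

Calling `model.eval()` instead would freeze dropout and collapse all passes to one deterministic prediction, which is why the mode switch matters here.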