Concepts (7)

📚 Theory · Advanced

Information-Theoretic Lower Bounds

Information-theoretic lower bounds characterize the best possible performance any learning algorithm can achieve on a problem, regardless of cleverness or compute.

#information-theoretic lower bounds · #fano inequality · #le cam method (+12 more)
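As a worked statement of one of the tagged tools (not part of the original card), Fano's inequality in the standard finite-alphabet setup reads:

```latex
% Fano's inequality: for any estimator \hat{X} of X from an observation Y,
% with error probability P_e = \Pr(\hat{X} \neq X), binary entropy H_b,
% and finite alphabet \mathcal{X}:
H(X \mid Y) \le H_b(P_e) + P_e \log\bigl(|\mathcal{X}| - 1\bigr)
% Rearranged (using H_b(P_e) \le 1 bit), it lower-bounds the error of
% *any* estimator, however clever:
P_e \ge \frac{H(X \mid Y) - 1}{\log |\mathcal{X}|}
```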
📚 Theory · Advanced

Representation Learning Theory

Representation learning aims to automatically discover features that make downstream tasks easy, often without human-provided labels.

#representation learning · #contrastive learning · #infonce (+12 more)
📚 Theory · Advanced

Information Bottleneck Theory

Information Bottleneck (IB) studies how to compress an input X into a representation Z that still preserves what is needed to predict Y.

#information bottleneck · #mutual information · #variational information bottleneck (+12 more)
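The compression-versus-prediction trade-off described above has a one-line objective; a sketch in LaTeX (β is the trade-off multiplier):

```latex
% Information Bottleneck: choose a stochastic encoder p(z \mid x) that
% compresses X (small I(X;Z)) while keeping what predicts Y (large I(Z;Y)):
\min_{p(z \mid x)} \; I(X; Z) \;-\; \beta \, I(Z; Y), \qquad \beta > 0
```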
📚 Theory · Intermediate

Contrastive Learning Theory

Contrastive learning learns representations by pulling together positive pairs and pushing apart negatives using a softmax-based objective.

#contrastive learning · #infonce · #nt-xent (+12 more)
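The softmax-based pull/push objective can be sketched in a few lines of numpy; `info_nce` is a hypothetical helper name, and this is a minimal illustration rather than any library's implementation:

```python
import numpy as np

def info_nce(z1, z2, temperature=0.1):
    """InfoNCE loss: row i of z1 and row i of z2 form a positive pair;
    every other row of z2 serves as a negative for row i."""
    # L2-normalize so the dot product is cosine similarity
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature              # (N, N) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_softmax = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_softmax)))  # positives on the diagonal
```

Perfectly aligned views put the maximum similarity on the diagonal, so their loss is far below that of randomly paired embeddings.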
📚 Theory · Intermediate

Mutual Information

Mutual Information (MI) measures how much knowing one random variable reduces uncertainty about another.

#mutual information · #entropy · #kl divergence (+12 more)
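For discrete variables this reduction in uncertainty can be computed directly from a joint probability table; a minimal numpy sketch (the helper name is hypothetical):

```python
import numpy as np

def mutual_information(p_xy):
    """MI in bits from a joint probability table p_xy (rows: X, cols: Y)."""
    p_x = p_xy.sum(axis=1, keepdims=True)     # marginal of X, shape (|X|, 1)
    p_y = p_xy.sum(axis=0, keepdims=True)     # marginal of Y, shape (1, |Y|)
    mask = p_xy > 0                           # convention: 0 * log 0 = 0
    return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / (p_x * p_y)[mask])))

# Perfectly correlated bits share exactly one bit of information:
mutual_information(np.array([[0.5, 0.0], [0.0, 0.5]]))  # → 1.0
# Independent variables share none:
mutual_information(np.outer([0.5, 0.5], [0.3, 0.7]))    # → 0.0
```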
📚 Theory · Intermediate

Shannon Entropy

Shannon entropy quantifies the average uncertainty, or information content, of a random variable; with base-2 logarithms it is measured in bits.

#shannon entropy · #information gain · #mutual information (+12 more)
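A minimal numpy sketch of the definition (the helper name is hypothetical, not from the original card):

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a discrete distribution p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # convention: 0 * log 0 = 0
    return float(-np.sum(p * np.log2(p)))

entropy([0.5, 0.5])   # → 1.0  (a fair coin carries exactly one bit)
entropy([1.0, 0.0])   # a certain outcome carries no information (0 bits)
entropy([0.125] * 8)  # → 3.0  (uniform over 8 outcomes: log2(8) bits)
```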
📚 Theory · Intermediate

Information Theory

Information theory quantifies uncertainty and information using measures like entropy, cross-entropy, KL divergence, and mutual information.

#entropy · #cross-entropy · #kl divergence (+12 more)
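Two of the measures named above, cross-entropy and KL divergence, can be sketched for discrete distributions in a few lines of numpy (helper names are hypothetical):

```python
import numpy as np

def cross_entropy(p, q):
    """H(p, q) in bits: expected code length when data follow p
    but the code is optimized for q."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0                       # convention: 0 * log 0 = 0
    return float(-np.sum(p[mask] * np.log2(q[mask])))

def kl_divergence(p, q):
    """D_KL(p || q) = H(p, q) - H(p): the extra bits paid for using q
    instead of p; zero exactly when q matches p on p's support."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

cross_entropy([0.5, 0.5], [0.5, 0.5])  # → 1.0 (equals the entropy of p)
kl_divergence([1.0, 0.0], [0.5, 0.5])  # → 1.0 (one wasted bit per symbol)
```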