Concepts (105)

📚 Theory · Advanced

P vs NP Problem

P vs NP asks whether every problem whose solutions can be verified quickly can also be solved quickly.
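
The asymmetry is easy to see on a concrete NP problem such as Subset Sum. In the minimal Python sketch below (a hypothetical toy instance), checking a proposed certificate takes linear time, while the naive search for one enumerates all 2^n subsets:

```python
from itertools import combinations

def verify(nums, target, certificate):
    """Verification: linear time, just sum the proposed subset
    (multiplicity checks omitted for brevity)."""
    return sum(certificate) == target and all(x in nums for x in certificate)

def solve(nums, target):
    """Search: naive brute force over all 2^n subsets."""
    for r in range(len(nums) + 1):
        for subset in combinations(nums, r):
            if sum(subset) == target:
                return subset
    return None

nums, target = [3, 34, 4, 12, 5, 2], 9
cert = solve(nums, target)               # exponential in the worst case
print(cert, verify(nums, target, cert))  # checking the answer is fast
```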

#p vs np · #np-complete · #np-hard · +12 more

📚 Theory · Advanced

GAN Theory

Generative Adversarial Networks (GANs) set up a two-player game where a generator tries to make fake samples that look real while a discriminator tries to tell real from fake.
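
The game is usually written as a single minimax value function (the form from the original GAN paper); at the optimal discriminator, the objective reduces, up to constants, to the Jensen–Shannon divergence between the data and generator distributions:

```latex
\min_G \max_D \; V(D, G) =
\mathbb{E}_{x \sim p_{\mathrm{data}}}\big[\log D(x)\big]
+ \mathbb{E}_{z \sim p_z}\big[\log\big(1 - D(G(z))\big)\big]
```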

#gan minimax · #wasserstein gan · #js divergence · +11 more

📚 Theory · Advanced

Representation Learning Theory

Representation learning aims to automatically discover features that make downstream tasks easy, often without human-provided labels.
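
One widely used concrete objective here is InfoNCE, the contrastive loss behind methods like SimCLR and CLIP: each anchor embedding must identify its own positive among in-batch negatives. A minimal NumPy sketch (random vectors stand in for an encoder's outputs):

```python
import numpy as np

def info_nce(anchors, positives, temperature=0.1):
    """InfoNCE: cross-entropy of picking the matching positive
    for each anchor, using the rest of the batch as negatives."""
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature                   # (N, N) cosine similarities
    logits -= logits.max(axis=1, keepdims=True)      # numerical stability
    log_softmax = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_softmax))            # diagonal = true pairs

rng = np.random.default_rng(0)
z1, z2 = rng.normal(size=(8, 16)), rng.normal(size=(8, 16))
print(info_nce(z1, z2))
```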

#representation learning · #contrastive learning · #infonce · +12 more

📚 Theory · Advanced

Information Bottleneck Theory

Information Bottleneck (IB) studies how to compress an input X into a representation Z that still preserves what is needed to predict Y.
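
Formally, IB minimizes a single trade-off objective, where the multiplier β sets how much predictive information is worth per bit of compression:

```latex
\min_{p(z \mid x)} \;\; I(X; Z) \;-\; \beta \, I(Z; Y)
```

Small β drives Z toward maximal compression of X; large β prioritizes retaining information about Y.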

#information bottleneck · #mutual information · #variational information bottleneck · +12 more

📚 Theory · Advanced

Policy Gradient Theorem

The policy gradient theorem gives an exact, sampleable expression for the gradient of a stochastic policy’s expected return, telling us how to push the policy’s parameters to increase it.
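
Concretely, the theorem rewrites the gradient of the expected return J(θ) as an expectation that can be estimated from trajectories sampled under the policy itself:

```latex
\nabla_\theta J(\theta)
= \mathbb{E}_{s \sim d^{\pi_\theta},\; a \sim \pi_\theta(\cdot \mid s)}
\big[\, \nabla_\theta \log \pi_\theta(a \mid s)\, Q^{\pi_\theta}(s, a) \,\big]
```

REINFORCE estimates Q with Monte Carlo returns; actor-critic methods learn it with a value function.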

#policy gradient · #reinforce · #actor-critic · +11 more

📚 Theory · Advanced

Transformer Theory

Transformers map sequences to sequences using layers of self-attention and feed-forward networks wrapped with residual connections and LayerNorm.
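
The core computation is scaled dot-product attention; below is a minimal NumPy sketch of a single head, with random projections standing in for learned weights:

```python
import numpy as np

def attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V: each output row is a
    similarity-weighted mixture of the value vectors."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (n, n) token-pair scores
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 8))                       # 5 tokens, d_model = 8
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
print(attention(x @ Wq, x @ Wk, x @ Wv).shape)    # (5, 8)
```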

#transformer · #self-attention · #positional encoding · +12 more

📚 Theory · Advanced

Reinforcement Learning Theory

Reinforcement Learning (RL) studies how an agent learns to act in an environment to maximize long-term cumulative reward.
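
The standard formalism is a Markov decision process, and the Bellman optimality equation turns "maximize long-term reward" into a fixed-point computation. A minimal value-iteration sketch on a hypothetical two-state, two-action MDP:

```python
import numpy as np

# Hypothetical toy MDP: P[s, a, s'] transition probabilities, R[s, a] rewards.
P = np.array([[[0.9, 0.1], [0.2, 0.8]],
              [[0.0, 1.0], [0.5, 0.5]]])
R = np.array([[1.0, 0.0],
              [0.0, 2.0]])
gamma = 0.9

V = np.zeros(2)
for _ in range(500):                 # apply the Bellman optimality backup
    Q = R + gamma * (P @ V)          # Q[s,a] = R[s,a] + γ Σ_s' P[s,a,s'] V[s']
    V = Q.max(axis=1)

print(V, Q.argmax(axis=1))           # optimal state values and a greedy policy
```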

#reinforcement learning · #mdp · #bellman equation · +12 more

📚 Theory · Advanced

Neural Tangent Kernel (NTK) Theory

The Neural Tangent Kernel (NTK) connects very wide neural networks to classical kernel methods, letting us study training as if it were kernel regression.
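
For a network f_θ, the NTK is the Gram matrix of its parameter gradients; in the infinite-width limit (under appropriate scaling) this kernel stays essentially fixed at its initialization value throughout gradient descent:

```latex
\Theta(x, x') \;=\; \nabla_\theta f_\theta(x)^{\top} \, \nabla_\theta f_\theta(x')
```

Training the network then behaves like kernel regression with the fixed kernel Θ.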

#neural tangent kernel · #ntk · #infinite width · +12 more

📚 Theory · Advanced

Calculus of Variations

Calculus of variations optimizes functionals, which assign a number to an entire function, rather than ordinary functions of numbers.
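
The workhorse result is the Euler–Lagrange equation: any smooth extremum of a functional of the integral form below must satisfy it.

```latex
J[f] = \int_a^b L\bigl(x, f(x), f'(x)\bigr)\, dx
\quad\Longrightarrow\quad
\frac{\partial L}{\partial f} \;-\; \frac{d}{dx}\,\frac{\partial L}{\partial f'} \;=\; 0
```

For example, with L = √(1 + f'²) (arc length), the equation forces f'' = 0, recovering the fact that straight lines are the shortest paths.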

#calculus of variations · #euler–lagrange · #functional derivative · +12 more

📚 Theory · Advanced

Deep Learning Generalization Theory

Deep learning generalization theory tries to explain why overparameterized networks can fit (interpolate) training data yet still perform well on new data.
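
One concrete thread is implicit regularization in overparameterized linear regression: with more parameters than samples, gradient descent from zero initialization converges to the minimum-ℓ2-norm interpolator, which the pseudoinverse computes in closed form. A small NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 20, 100                      # far more parameters than samples
X = rng.normal(size=(n, d))
y = rng.normal(size=n)

w = np.linalg.pinv(X) @ y           # minimum-norm solution among all interpolators
print(np.allclose(X @ w, y))        # True: zero training error
print(np.linalg.norm(w))            # the smallest norm of any exact fit
```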

#generalization · #implicit regularization · #minimum norm · +12 more

📚 Theory · Advanced

Neural Network Expressivity

Neural network expressivity studies what kinds of functions different network architectures can represent and how efficiently they can do so.
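
For scalar inputs this can be made very concrete by counting linear regions: each ReLU in a one-hidden-layer network switches on at a single breakpoint, so k hidden units yield at most k + 1 linear pieces. A sketch with random weights (a hypothetical toy setup) that counts distinct activation patterns on a grid:

```python
import numpy as np

rng = np.random.default_rng(0)
k = 10                                    # hidden width
W1, b1 = rng.normal(size=k), rng.normal(size=k)

xs = np.linspace(-5, 5, 100_000)
patterns = np.outer(xs, W1) + b1 > 0      # which ReLUs are active at each x
n_regions = len(np.unique(patterns, axis=0))
print(n_regions, "linear regions; upper bound k + 1 =", k + 1)
```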

#neural network expressivity · #depth separation · #relu linear regions · +12 more

📚 Theory · Advanced

Statistical Learning Theory

Statistical learning theory explains why a model that fits training data can still predict well on unseen data by relating true risk to empirical risk plus a complexity term.
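
For a finite hypothesis class H and bounded (e.g., 0-1) loss, Hoeffding's inequality plus a union bound gives the canonical instance of this bound: with probability at least 1 − δ over an i.i.d. sample of size n, simultaneously for every h in H,

```latex
R(h) \;\le\; \widehat{R}(h) \;+\; \sqrt{\frac{\ln \lvert \mathcal{H} \rvert + \ln(1/\delta)}{2n}}
```

Richer complexity measures (VC dimension, Rademacher complexity) replace ln|H| for infinite classes.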

#statistical learning theory · #empirical risk minimization · #structural risk minimization · +11 more