How I Study AI - Learn AI Papers & Lectures the Easy Way

Concepts (172)

Groups

📐 Linear Algebra (15)
📈 Calculus & Differentiation (10)
🎯 Optimization (14)
🎲 Probability Theory (12)
📊 Statistics for ML (9)
📡 Information Theory (10)
🔺 Convex Optimization (7)
🔢 Numerical Methods (6)
🕸 Graph Theory for Deep Learning (6)
🔵 Topology for ML (5)
🌐 Differential Geometry (6)
∞ Measure Theory & Functional Analysis (6)
🎰 Random Matrix Theory (5)
🌊 Fourier Analysis & Signal Processing (9)
🎰 Sampling & Monte Carlo Methods (10)
🧠 Deep Learning Theory (12)
🛡️ Regularization Theory (11)
👁️ Attention & Transformer Theory (10)
🎨 Generative Model Theory (11)
🔮 Representation Learning (10)
🎮 Reinforcement Learning Mathematics (9)
🔄 Variational Methods (8)
📉 Loss Functions & Objectives (10)
⏱️ Sequence & Temporal Models (8)
💎 Geometric Deep Learning (8)


Evidence Lower Bound (ELBO)

The Evidence Lower Bound (ELBO) is a tractable lower bound on the log evidence log p(x) used to perform approximate Bayesian inference.

#elbo · #variational inference · #vae · +12
📚Theory · Advanced
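A minimal numeric sketch of the bound, assuming a toy model with one binary latent z and a fixed observation x (all probabilities invented for illustration): any choice of q(z) gives ELBO(q) ≤ log p(x), and the bound is tight at the exact posterior.

```python
import math

# Toy model: one binary latent z, one fixed observation x.
p_z = [0.5, 0.5]           # prior p(z)
p_x_given_z = [0.8, 0.1]   # likelihood p(x | z) at the observed x (invented)

# Exact log evidence: log p(x) = log sum_z p(z) * p(x|z)
evidence = sum(pz * px for pz, px in zip(p_z, p_x_given_z))
log_evidence = math.log(evidence)

def elbo(q):
    """ELBO(q) = E_q[log p(x, z)] - E_q[log q(z)]."""
    return sum(
        qz * (math.log(p_z[z] * p_x_given_z[z]) - math.log(qz))
        for z, qz in enumerate(q) if qz > 0
    )

# Any q lower-bounds the evidence; the exact posterior makes it tight.
loose = elbo([0.5, 0.5])
posterior = [p_z[z] * p_x_given_z[z] / evidence for z in (0, 1)]
tight = elbo(posterior)
```

The gap between the ELBO and log p(x) is exactly KL(q(z) ‖ p(z|x)), which is why maximizing the ELBO over q performs approximate posterior inference.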

Manifold Learning

Manifold learning assumes that high-dimensional data actually lies on or near a much lower-dimensional, smoothly curved surface (a manifold) embedded in the ambient space.

#manifold learning · #isomap · #locally linear embedding · +12
📚Theory · Advanced
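A tiny illustration of the assumption, using the unit circle: 2-D points generated from one intrinsic coordinate (the angle), which is exactly the kind of low-dimensional structure manifold-learning methods try to recover from data alone.

```python
import math

# Data in 2-D that actually lives on a 1-D manifold: the unit circle.
thetas = [2 * math.pi * i / 100 for i in range(100)]
points = [(math.cos(t), math.sin(t)) for t in thetas]

# One intrinsic coordinate (the angle) reproduces every 2-D point exactly,
# so the data's intrinsic dimension is 1 even though it is embedded in 2-D.
recovered = [math.atan2(y, x) % (2 * math.pi) for x, y in points]
```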

Neural Collapse

Neural Collapse describes what happens at the end of training: the penultimate-layer features of each class concentrate tightly around their class mean, and the class means themselves arrange into a maximally symmetric simplex equiangular tight frame (ETF).

#neural collapse · #simplex etf · #equiangular tight frame · +12
📚Theory · Advanced
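A small sketch of the simplex equiangular tight frame the class means are described as collapsing to, using the standard centered-basis construction for K classes: each frame vector is unit norm and every pair meets at cosine −1/(K−1), the most "spread out" K directions can be.

```python
import math

K = 4  # number of classes
# Simplex ETF construction: center the standard basis vectors and rescale.
scale = math.sqrt(K / (K - 1))
etf = [[scale * ((1.0 if i == k else 0.0) - 1.0 / K) for i in range(K)]
       for k in range(K)]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Unit norms, and pairwise cosine exactly -1/(K-1) for all distinct pairs.
norms = [math.sqrt(dot(v, v)) for v in etf]
cosines = [dot(etf[i], etf[j]) for i in range(K) for j in range(K) if i != j]
```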

Transfer Learning Theory

Transfer learning theory studies when and why a model trained on a source distribution will work on a different target distribution.

#transfer learning · #domain adaptation · #hΔh-divergence · +12
📚Theory · Advanced
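A hedged toy stand-in for this kind of bound (using total variation distance rather than the HΔH-divergence, and assuming labels are a deterministic function of x; all distributions invented): for any classifier h, the gap between source and target error is at most TV(source, target).

```python
import math
from itertools import product

# Tiny input space with a fixed labeling rule and two shifted distributions.
xs = [0, 1, 2]
label = {0: 0, 1: 1, 2: 1}
source = {0: 0.5, 1: 0.3, 2: 0.2}
target = {0: 0.1, 1: 0.4, 2: 0.5}

# Total variation distance between source and target.
tv = 0.5 * sum(abs(source[x] - target[x]) for x in xs)

def err(h, dist):
    # Error of classifier h under a distribution: mass where h disagrees
    # with the true label.
    return sum(p for x, p in dist.items() if h[x] != label[x])

# Enumerate every binary classifier on this space and take the worst gap.
worst_gap = max(
    abs(err(dict(zip(xs, h)), source) - err(dict(zip(xs, h)), target))
    for h in product((0, 1), repeat=3)
)
```

The actual domain-adaptation bounds replace TV with the HΔH-divergence, which only measures distribution shift as seen through the hypothesis class, and so can be much tighter.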

Disentangled Representations

Disentangled representations aim to encode independent factors of variation (like shape, size, or color) into separate coordinates of a latent vector.

#disentangled representations · #independent factors · #total correlation · +12
📚Theory · Advanced
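A minimal numeric sketch of total correlation, a common disentanglement measure, for two binary latent coordinates (distributions invented for illustration): it is zero exactly when the coordinates are independent.

```python
import math

def total_correlation(joint):
    # TC = KL(joint || product of marginals); zero iff coordinates independent.
    m0 = [sum(joint[(a, b)] for b in (0, 1)) for a in (0, 1)]
    m1 = [sum(joint[(a, b)] for a in (0, 1)) for b in (0, 1)]
    return sum(p * math.log(p / (m0[a] * m1[b]))
               for (a, b), p in joint.items() if p > 0)

# Independent coordinates vs. perfectly entangled (copied) coordinates.
independent = {(a, b): 0.25 for a in (0, 1) for b in (0, 1)}
entangled = {(0, 0): 0.5, (1, 1): 0.5, (0, 1): 0.0, (1, 0): 0.0}
```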

Energy-Based Models (EBM)

Energy-Based Models (EBMs) define probabilities through an energy landscape: low energy means high probability, with p(x) = exp(-E(x)) / Z.

#energy-based models · #partition function · #langevin dynamics · +12
∑Math · Advanced
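A toy sketch of the definition on a discrete 1-D grid (the quadratic energy is an invented example): normalizing exp(−E(x)) by the partition function Z turns the energy landscape into probabilities, with low-energy states more probable.

```python
import math

# A 1-D energy function on a discrete grid; lower energy = higher probability.
xs = [i / 10.0 for i in range(-30, 31)]
energy = lambda x: x * x          # quadratic energy -> Gaussian-shaped density

Z = sum(math.exp(-energy(x)) for x in xs)       # partition function
p = {x: math.exp(-energy(x)) / Z for x in xs}   # p(x) = exp(-E(x)) / Z
```

In practice Z is intractable for high-dimensional x, which is why EBM training and sampling lean on methods like Langevin dynamics that only need gradients of E.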

Stochastic Differential Equations for Generation

A forward stochastic differential equation (SDE) models a state that drifts deterministically and is shaken by random Brownian noise over time.

#stochastic differential equation · #diffusion model · #euler maruyama · +12
📚Theory · Advanced
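A hedged sketch of the standard Euler-Maruyama discretization, applied to an Ornstein-Uhlenbeck SDE dX = −θX dt + σ dW with invented parameters; this process is convenient because its stationary variance σ²/(2θ) is known in closed form and can be checked by simulation.

```python
import math
import random

random.seed(0)
theta, sigma = 1.0, 0.5          # drift strength and noise scale (invented)
dt, n_steps, n_paths = 0.01, 500, 1000

finals = []
for _ in range(n_paths):
    x = 0.0
    for _ in range(n_steps):
        # Euler-Maruyama step: deterministic drift plus a Brownian
        # increment with variance dt.
        x += -theta * x * dt + sigma * math.sqrt(dt) * random.gauss(0.0, 1.0)
    finals.append(x)

# The OU process has stationary variance sigma^2 / (2 * theta) = 0.125.
var = sum(v * v for v in finals) / n_paths
```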

Diffusion Models (Score-Based)

Score-based diffusion models corrupt data by gradually adding Gaussian noise and then learn to reverse this process by estimating the score, the gradient of the log-density.

#diffusion models · #score-based modeling · #ddpm · +7
📚Theory · Advanced
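A small sketch of what "the score" means, using a 1-D Gaussian whose score is known in closed form (parameters are arbitrary): the analytic score −(x−μ)/σ² should match a finite-difference gradient of the log-density.

```python
import math

mu, sigma = 1.0, 2.0   # parameters of a toy Gaussian data distribution

def log_density(x):
    return (-0.5 * ((x - mu) / sigma) ** 2
            - math.log(sigma * math.sqrt(2 * math.pi)))

def score(x):
    # The score: gradient of the log-density, known analytically here.
    return -(x - mu) / sigma ** 2

# Compare against a central finite difference of the log-density.
h = 1e-5
max_err = max(
    abs((log_density(x + h) - log_density(x - h)) / (2 * h) - score(x))
    for x in (-1.0, 0.5, 3.0)
)
```

Diffusion models never see the true score; a network is trained (via denoising score matching) to approximate it at every noise level, then used to run the reverse process.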

Normalizing Flows

Normalizing flows transform a simple base distribution (like a standard Gaussian) into a complex target distribution using a chain of invertible functions.

#normalizing flows · #change of variables · #jacobian determinant · +12
📚Theory · Advanced
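A minimal sketch of the change-of-variables formula for a single affine flow x = a·z + b (parameters chosen arbitrarily): pushing N(0, 1) through it must reproduce exactly the N(b, a²) density.

```python
import math

def log_normal(x, mu, sigma):
    return (-0.5 * ((x - mu) / sigma) ** 2
            - math.log(sigma * math.sqrt(2 * math.pi)))

a, b = 2.0, 1.0   # invented parameters of the invertible map x = a*z + b

def flow_log_density(x):
    # Change of variables: log p_x(x) = log p_z(z) - log|dx/dz|, z = (x-b)/a
    z = (x - b) / a
    return log_normal(z, 0.0, 1.0) - math.log(abs(a))

# Compare the flow density with the known pushforward N(b, a^2).
max_diff = max(
    abs(flow_log_density(x) - log_normal(x, b, a))
    for x in (-2.0, 0.0, 1.0, 4.0)
)
```

Real flows chain many such invertible maps; the only requirements are an invertible forward pass and a tractable log-Jacobian-determinant.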

GAN Theory & Training Dynamics

GANs frame learning as a two-player game where a generator tries to fool a discriminator, and the discriminator tries to detect fakes.

#gan · #generator · #discriminator · +12
📚Theory · Advanced
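A small numeric check of the classic minimax analysis on two invented discrete distributions: the optimal discriminator is D*(x) = p_data(x) / (p_data(x) + p_g(x)), and plugging it into the objective gives 2·JSD(p_data ‖ p_g) − log 4.

```python
import math

# Two discrete distributions on the same support (invented).
p_data = [0.5, 0.3, 0.2]
p_gen = [0.2, 0.3, 0.5]

# Optimal discriminator from the GAN minimax analysis.
d_star = [pd / (pd + pg) for pd, pg in zip(p_data, p_gen)]

def value(d):
    # Discriminator objective V(D) = E_data[log D] + E_gen[log(1 - D)]
    return sum(pd * math.log(di) + pg * math.log(1 - di)
               for pd, pg, di in zip(p_data, p_gen, d))

def kl(p, q):
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Jensen-Shannon divergence via the mixture m = (p_data + p_gen) / 2.
m = [(pd + pg) / 2 for pd, pg in zip(p_data, p_gen)]
jsd = 0.5 * kl(p_data, m) + 0.5 * kl(p_gen, m)

gap = value(d_star) - (-math.log(4) + 2 * jsd)
```

This identity is why, at the (idealized) optimum, GAN training minimizes the Jensen-Shannon divergence between the data and generator distributions.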

Variational Autoencoders (VAE) Theory

A Variational Autoencoder (VAE) is a probabilistic autoencoder that learns to generate data by inferring hidden causes (latent variables) and decoding them back to observations.

#variational autoencoder · #elbo · #kl divergence · +12
📚Theory · Advanced
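A tiny sketch of the reparameterization trick a VAE's encoder relies on (μ and σ here are invented stand-ins for encoder outputs): writing z = μ + σ·ε with ε ~ N(0, 1) moves the randomness outside the parameters, so gradients can flow through μ and σ while the samples still have the right distribution.

```python
import math
import random

random.seed(0)
mu, sigma = 2.0, 0.5   # stand-ins for encoder outputs (invented)

# Reparameterized sampling: z = mu + sigma * eps, with eps ~ N(0, 1).
samples = [mu + sigma * random.gauss(0.0, 1.0) for _ in range(100_000)]

# The samples should be distributed as N(mu, sigma^2).
mean = sum(samples) / len(samples)
std = math.sqrt(sum((z - mean) ** 2 for z in samples) / len(samples))
```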

In-Context Learning Theory

In-context learning (ICL) means a model learns from examples provided in the input itself, without updating its parameters.

#in-context learning · #transformer · #attention · +12
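A hedged toy interpretation, one common way to build intuition rather than the full theory: softmax attention over context (x, y) pairs acts like kernel regression, producing a prediction for a new query from the prompt alone, with no parameter updates. All numbers below are invented.

```python
import math

# Context examples written "in the prompt": pairs (x_i, y_i) with y = 2x.
context = [(0.0, 0.0), (1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]

def attention_predict(x_query, temp=0.1):
    # Softmax attention over context keys: the prediction is a
    # similarity-weighted average of context values; nothing is trained.
    scores = [math.exp(-((x_query - x) ** 2) / temp) for x, _ in context]
    total = sum(scores)
    return sum(s * y for s, (_, y) in zip(scores, context)) / total

# A query near x = 2 attends mostly to the (2.0, 4.0) example.
pred = attention_predict(1.9)
```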