How I Study AI - Learn AI Papers & Lectures the Easy Way

Concepts (140)

Groups

📐 Linear Algebra (15)
📈 Calculus & Differentiation (10)
🎯 Optimization (14)
🎲 Probability Theory (12)
📊 Statistics for ML (9)
📡 Information Theory (10)
🔺 Convex Optimization (7)
🔢 Numerical Methods (6)
🕸 Graph Theory for Deep Learning (6)
🔵 Topology for ML (5)
🌐 Differential Geometry (6)
∞ Measure Theory & Functional Analysis (6)
🎰 Random Matrix Theory (5)
🌊 Fourier Analysis & Signal Processing (9)
🎰 Sampling & Monte Carlo Methods (10)
🧠 Deep Learning Theory (12)
🛡️ Regularization Theory (11)
👁️ Attention & Transformer Theory (10)
🎨 Generative Model Theory (11)
🔮 Representation Learning (10)
🎮 Reinforcement Learning Mathematics (9)
🔄 Variational Methods (8)
📉 Loss Functions & Objectives (10)
⏱️ Sequence & Temporal Models (8)
💎 Geometric Deep Learning (8)

∑ Math · Intermediate

Partial Derivatives

Partial derivatives measure how a multivariable function changes when you wiggle just one input while keeping the others fixed.

#partial derivatives · #gradient · #jacobian (+12 more)
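A quick NumPy check of the idea (the function f(x, y) = x²y is my own example, not from the card): wiggling only x with a central finite difference should match the analytic partial ∂f/∂x = 2xy.

```python
import numpy as np

def f(x, y):
    return x**2 * y

def partial_x(f, x, y, h=1e-6):
    # Wiggle x only; y stays fixed — that is what makes it a *partial* derivative.
    return (f(x + h, y) - f(x - h, y)) / (2 * h)

x0, y0 = 3.0, 2.0
numeric = partial_x(f, x0, y0)
analytic = 2 * x0 * y0   # d/dx (x^2 y) = 2xy with y held constant
```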
∑ Math · Intermediate

Derivatives & Differentiation Rules

Derivatives measure how fast a function changes, and rules like the product, quotient, and chain rule let us differentiate complex expressions efficiently.

#derivative · #product rule · #quotient rule (+11 more)
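The product rule can be verified numerically; a small NumPy sketch (the choice of u = sin and v = t² is illustrative):

```python
import numpy as np

h = 1e-6
x = 1.5

u = lambda t: np.sin(t)
v = lambda t: t**2

# Numeric derivative of the product u(x)v(x) via central difference.
prod = lambda t: u(t) * v(t)
numeric = (prod(x + h) - prod(x - h)) / (2 * h)

# Product rule: (uv)' = u'v + uv'
analytic = np.cos(x) * x**2 + np.sin(x) * 2 * x
```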
∑ Math · Intermediate

Limits & Continuity

A limit describes what value a function approaches as the input gets close to some point, even if the function is not defined there.

#limit · #continuity · #epsilon-delta (+12 more)
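The classic example is sin(x)/x, which is undefined at x = 0 yet has limit 1 there; a short NumPy sketch of approaching the point:

```python
import numpy as np

# f(x) = sin(x)/x is undefined at x = 0, but as x shrinks the values
# approach 1 — that value is the limit.
xs = np.array([0.1, 0.01, 0.001, 1e-6])
vals = np.sin(xs) / xs
```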
∑ Math · Intermediate

Matrix Calculus Fundamentals

Matrix calculus extends single-variable derivatives to matrices so we can differentiate functions built from matrix multiplications, traces, and norms.

#matrix calculus · #frobenius norm · #trace trick (+12 more)
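As one concrete instance (my own example): for f(X) = ‖X‖²_F = tr(XᵀX), matrix calculus gives ∇f = 2X, which a finite difference on a single entry can confirm.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((3, 3))

# f(X) = ||X||_F^2 = trace(X^T X); matrix calculus gives grad f = 2X.
grad_analytic = 2 * X

# Check one entry, (1, 2), with a finite difference in that direction.
h = 1e-6
E = np.zeros_like(X)
E[1, 2] = 1.0
f = lambda M: np.sum(M * M)
grad_numeric_12 = (f(X + h * E) - f(X - h * E)) / (2 * h)
```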
∑ Math · Intermediate

Low-Rank Approximation

Low-rank approximation replaces a big matrix with one that has far fewer degrees of freedom while preserving most of its action.

#low-rank approximation · #eckart-young theorem · #svd (+12 more)
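A minimal NumPy sketch (random matrix and rank k = 2 are my choices): truncating the SVD gives the best rank-k approximation by Eckart–Young, and the Frobenius error equals the root-sum-square of the discarded singular values.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))

# Rank-2 approximation via truncated SVD (optimal by Eckart-Young).
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Error in Frobenius norm = sqrt of the sum of squared discarded singular values.
err = np.linalg.norm(A - A_k)
```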
∑ Math · Intermediate

Tensor Operations

A tensor is a multi-dimensional array that generalizes scalars (0-D), vectors (1-D), and matrices (2-D) to higher dimensions.

#tensor · #multi-dimensional array · #broadcasting (+12 more)
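The 0-D/1-D/2-D/higher hierarchy maps directly onto NumPy array shapes; a small sketch, including broadcasting:

```python
import numpy as np

scalar = np.array(5.0)        # 0-D tensor
vector = np.arange(3.0)       # 1-D tensor, shape (3,)
matrix = np.ones((2, 3))      # 2-D tensor, shape (2, 3)
tensor = np.zeros((4, 2, 3))  # 3-D tensor, shape (4, 2, 3)

# Broadcasting: the (3,) vector is stretched across each row of the (2, 3) matrix.
summed = matrix + vector      # shape (2, 3)
```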
∑ Math · Intermediate

Matrix Norms & Condition Numbers

Matrix norms measure the size of a matrix in different but related ways, with Frobenius treating entries like a big vector, spectral measuring the strongest stretch, and nuclear summing all singular values.

#matrix norm · #spectral norm · #frobenius norm (+12 more)
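On a diagonal matrix the three norms (and the condition number) are easy to read off, which makes a good sanity check; a NumPy sketch with diag(3, 0.5) as my example:

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [0.0, 0.5]])

fro = np.linalg.norm(A, 'fro')   # entries treated as one big vector: sqrt(9.25)
spec = np.linalg.norm(A, 2)      # strongest stretch (largest singular value): 3
nuc = np.linalg.norm(A, 'nuc')   # sum of singular values: 3.5
cond = np.linalg.cond(A, 2)      # sigma_max / sigma_min: 6
```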
∑ Math · Intermediate

Positive Definite Matrices

A real symmetric matrix A is positive definite if and only if x^T A x > 0 for every nonzero vector x, and positive semidefinite if x^T A x ≥ 0.

#positive definite · #positive semidefinite · #cholesky decomposition (+11 more)
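Equivalent tests: all eigenvalues positive, or the Cholesky factorization succeeding. A NumPy sketch with a 2×2 example of my choosing:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])        # symmetric

# Positive definite <=> all eigenvalues > 0 <=> Cholesky succeeds.
eigvals = np.linalg.eigvalsh(A)   # here: [1, 3]
L = np.linalg.cholesky(A)         # A = L L^T; raises LinAlgError if A is not PD

# x^T A x > 0 for a sample nonzero x.
x = np.array([1.0, -3.0])
quad = x @ A @ x
```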
∑ Math · Intermediate

Eigendecomposition

Eigendecomposition expresses a matrix as a change of basis times a diagonal scaling, revealing its natural stretching directions.

#eigendecomposition · #eigenvalue · #eigenvector (+11 more)
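For a symmetric matrix this is A = QΛQᵀ: rotate into the eigenbasis, scale along the axes, rotate back. A NumPy sketch (the 2×2 matrix is my example):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [1.0, 4.0]])

# A = Q @ Lam @ Q.T — change of basis (Q), diagonal scaling (Lam), change back.
eigvals, Q = np.linalg.eigh(A)    # eigh: for symmetric/Hermitian matrices
Lam = np.diag(eigvals)
reconstructed = Q @ Lam @ Q.T
```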
∑ Math · Intermediate

Inner Products & Norms

An inner product measures how much two vectors point in the same direction; in R^n it is the dot product.

#inner product · #dot product · #norm (+12 more)
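In R^n the dot product induces both the norm (‖u‖ = √⟨u, u⟩) and the angle between vectors; a NumPy sketch with two example vectors of my choosing:

```python
import numpy as np

u = np.array([1.0, 2.0, 2.0])
v = np.array([2.0, 0.0, 1.0])

dot = u @ v                        # inner product in R^n: 1*2 + 2*0 + 2*1 = 4
norm_u = np.linalg.norm(u)         # sqrt(<u, u>) = 3
cos_angle = dot / (norm_u * np.linalg.norm(v))   # "how aligned" the vectors are
```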
∑ Math · Intermediate

Systems of Linear Equations

A system of linear equations asks for numbers that make several linear relationships true at the same time, which we compactly write as Ax = b.

#systems of linear equations · #gaussian elimination · #row echelon form (+12 more)
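A minimal NumPy sketch (the two equations are my example): stack the coefficients into A and the right-hand sides into b, then solve Ax = b directly.

```python
import numpy as np

# Solve the system  2x + y = 5,  x + 3y = 10  written as A x = b.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([5.0, 10.0])

x = np.linalg.solve(A, b)   # LU-based; preferred over forming A^{-1} explicitly
```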
∑ Math · Intermediate

Matrix Operations & Properties

Matrix operations like multiplication and transpose combine or reorient data tables and linear transformations in predictable ways.

#matrix multiplication · #transpose · #trace (+12 more)
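One such predictable rule is (AB)ᵀ = BᵀAᵀ — transposing reverses the order of a product. A NumPy sketch with two small example matrices of my choosing:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])   # swaps columns when multiplied on the right

prod = A @ B                 # matrix multiplication: [[2, 1], [4, 3]]
tr = np.trace(A)             # sum of diagonal entries: 1 + 4 = 5

# Transpose reverses order under multiplication: (AB)^T = B^T A^T.
rule_holds = np.allclose(prod.T, B.T @ A.T)
```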