How I Study AI - Learn AI Papers & Lectures the Easy Way

Concepts (7)

Groups

๐Ÿ“Linear Algebra15๐Ÿ“ˆCalculus & Differentiation10๐ŸŽฏOptimization14๐ŸŽฒProbability Theory12๐Ÿ“ŠStatistics for ML9๐Ÿ“กInformation Theory10๐Ÿ”บConvex Optimization7๐Ÿ”ขNumerical Methods6๐Ÿ•ธGraph Theory for Deep Learning6๐Ÿ”ตTopology for ML5๐ŸŒDifferential Geometry6โˆžMeasure Theory & Functional Analysis6๐ŸŽฐRandom Matrix Theory5๐ŸŒŠFourier Analysis & Signal Processing9๐ŸŽฐSampling & Monte Carlo Methods10๐Ÿง Deep Learning Theory12๐Ÿ›ก๏ธRegularization Theory11๐Ÿ‘๏ธAttention & Transformer Theory10๐ŸŽจGenerative Model Theory11๐Ÿ”ฎRepresentation Learning10๐ŸŽฎReinforcement Learning Mathematics9๐Ÿ”„Variational Methods8๐Ÿ“‰Loss Functions & Objectives10โฑ๏ธSequence & Temporal Models8๐Ÿ’ŽGeometric Deep Learning8

โˆ‘ Math ยท Intermediate

Pseudoinverse (Moore-Penrose)

The Mooreโ€“Penrose pseudoinverse generalizes matrix inversion to rectangular or singular matrices and is denoted Aโบ.

#pseudoinverse ยท #moore-penrose ยท #least squares ยท +12 more
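As a concrete sketch (assuming NumPy, which the card itself doesn't name): `np.linalg.pinv` computes Aโบ via the SVD, and applying it to a right-hand side gives the least-squares solution of an overdetermined system.

```python
import numpy as np

# A tall rectangular matrix: an ordinary inverse does not exist.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# Moore-Penrose pseudoinverse, computed internally via the SVD.
A_pinv = np.linalg.pinv(A)            # shape (2, 3)

# A satisfies the Penrose condition A A+ A = A.
print(np.allclose(A @ A_pinv @ A, A))

# A+ b is the least-squares solution of A x ~ b ...
b = np.array([1.0, 0.0, 1.0])
x = A_pinv @ b

# ... and matches NumPy's dedicated least-squares solver.
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x, x_lstsq))
```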
โš™๏ธAlgorithmIntermediate

Principal Component Analysis (PCA)

Principal Component Analysis (PCA) finds new orthogonal axes (principal components) that capture the maximum variance in your data.

#principal component analysis ยท #svd ยท #pca ยท #eigendecomposition ยท +11 more
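A minimal sketch of the procedure (assuming NumPy; the synthetic data and variable names are illustrative): center the data, eigendecompose the covariance matrix, and sort the eigenvectors by explained variance to get the principal components.

```python
import numpy as np

rng = np.random.default_rng(0)
# Correlated 2-D data: most variance lies along one direction.
X = rng.normal(size=(200, 2)) @ np.array([[3.0, 0.0],
                                          [1.0, 0.5]])

# 1. Center each feature.
Xc = X - X.mean(axis=0)

# 2. Eigendecompose the sample covariance matrix.
cov = Xc.T @ Xc / (len(Xc) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)     # ascending eigenvalues

# 3. Sort descending: each column is a principal component.
order = np.argsort(eigvals)[::-1]
components = eigvecs[:, order]             # orthonormal axes
explained = eigvals[order]                 # variance per component

# Project onto the first principal component.
scores = Xc @ components[:, :1]
```

In practice the same components come out of an SVD of the centered data matrix, which is numerically preferable to forming the covariance explicitly.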
๐Ÿ“š Theory ยท Intermediate

Spectral Normalization

Spectral normalization rescales a weight matrix so its largest singular value (spectral norm) is at most a target value, typically 1.

#spectral normalization ยท #spectral norm ยท #singular value ยท +12 more
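A minimal sketch (assuming NumPy): instead of a full SVD, practical implementations estimate the spectral norm with a few power-iteration steps and then divide the weight matrix by it, so the rescaled matrix has largest singular value (approximately) 1.

```python
import numpy as np

def spectral_normalize(W, n_iters=50):
    """Rescale W so its largest singular value is ~1, estimating
    the spectral norm by power iteration rather than a full SVD."""
    u = np.random.default_rng(0).normal(size=W.shape[0])
    for _ in range(n_iters):
        v = W.T @ u
        v /= np.linalg.norm(v)
        u = W @ v
        u /= np.linalg.norm(u)
    sigma = u @ W @ v       # estimate of the largest singular value
    return W / sigma

W = np.array([[3.0, 1.0],
              [0.0, 2.0]])
W_sn = spectral_normalize(W)
print(np.linalg.norm(W_sn, 2))   # ~1.0
```

In a training loop the vector `u` is usually carried over between steps, so a single iteration per update suffices.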
โˆ‘ Math ยท Intermediate

Low-Rank Approximation

Low-rank approximation replaces a big matrix with one that has far fewer degrees of freedom while preserving most of its action.

#low-rank approximation ยท #eckart-young theorem ยท #svd ยท +12 more
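A minimal sketch (assuming NumPy): truncating the SVD to the top-k singular triplets gives the best rank-k approximation in Frobenius norm, and by the Eckartโ€“Young theorem the error is exactly the norm of the discarded singular values.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(50, 30))

# Truncated SVD: keep only the top-k singular triplets.
k = 5
U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_k = U[:, :k] * s[:k] @ Vt[:k]      # rank-k approximation

# Eckart-Young: the Frobenius error equals the norm of the
# discarded singular values.
err = np.linalg.norm(A - A_k, 'fro')
print(np.isclose(err, np.sqrt(np.sum(s[k:] ** 2))))
```

Storing `A_k` takes only k(m + n + 1) numbers instead of mn, which is the "far fewer degrees of freedom" in the description.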
โˆ‘ Math ยท Intermediate

Matrix Norms & Condition Numbers

Matrix norms measure the size of a matrix in different but related ways: the Frobenius norm treats the entries like one long vector, the spectral norm measures the strongest stretch the matrix applies to any vector, and the nuclear norm sums all singular values. The condition number, the ratio of the largest to the smallest singular value, measures how much the matrix can amplify relative errors.

#matrix norm ยท #spectral norm ยท #frobenius norm ยท +12 more
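On a diagonal matrix all three norms and the condition number can be read off the singular values directly, which makes for an easy sanity check (assuming NumPy):

```python
import numpy as np

# Diagonal matrix: singular values are just |4| and |0.5|.
A = np.array([[4.0, 0.0],
              [0.0, 0.5]])

fro  = np.linalg.norm(A, 'fro')   # sqrt(sum of squared entries)
spec = np.linalg.norm(A, 2)       # largest singular value: 4
nuc  = np.linalg.norm(A, 'nuc')   # sum of singular values: 4.5

# Condition number: largest / smallest singular value.
cond = np.linalg.cond(A)          # 4 / 0.5 = 8
```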
๐Ÿ“š Theory ยท Intermediate

Singular Value Decomposition (SVD)

Singular Value Decomposition (SVD) factors any mร—n matrix A into A = UฮฃVแต€, where U and V are orthogonal and ฮฃ is diagonal with nonnegative entries.

#singular value decomposition ยท #svd ยท #truncated svd ยท +12 more
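A minimal check of the factorization (assuming NumPy): compute the thin SVD of a rectangular matrix, reconstruct A = UฮฃVแต€, and verify the orthogonality of the factors.

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0]])        # 2x3, rectangular

# Thin SVD: U is 2x2, s holds the singular values, Vt is 2x3.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Reconstruct A = U diag(s) V^T.
A_rec = U @ np.diag(s) @ Vt
print(np.allclose(A, A_rec))

# U has orthonormal columns; s is nonnegative and descending.
print(np.allclose(U.T @ U, np.eye(2)))
```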
๐Ÿ“š Theory ยท Intermediate

Linear Algebra Theory

Linear algebra studies vectors, linear combinations, and transformations that preserve addition and scalar multiplication.

#linear algebra ยท #vector space ยท #basis ยท +12 more
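As a tiny illustration of "preserve addition and scalar multiplication" (assuming NumPy; the matrix and vectors are arbitrary examples): any matrix defines a map T(x) = Ax, and the two linearity properties can be checked numerically.

```python
import numpy as np

# Any matrix defines a linear transformation T(x) = A x.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
T = lambda x: A @ x

x = np.array([1.0, 2.0])
y = np.array([-1.0, 4.0])
c = 2.5

print(np.allclose(T(x + y), T(x) + T(y)))  # additivity
print(np.allclose(T(c * x), c * T(x)))     # homogeneity
```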