๐ŸŽ“How I Study AIHISA
๐Ÿ“–Read
๐Ÿ“„Papers๐Ÿ“ฐBlogs๐ŸŽฌCourses
๐Ÿ’กLearn
๐Ÿ›ค๏ธPaths๐Ÿ“šTopics๐Ÿ’กConcepts๐ŸŽดShorts
๐ŸŽฏPractice
๐Ÿ“Daily Log๐ŸŽฏPrompts๐Ÿง Review
SearchSettings
How I Study AI - Learn AI Papers & Lectures the Easy Way

Concepts (12)

Groups

- 📐 Linear Algebra (15)
- 📈 Calculus & Differentiation (10)
- 🎯 Optimization (14)
- 🎲 Probability Theory (12)
- 📊 Statistics for ML (9)
- 📡 Information Theory (10)
- 🔺 Convex Optimization (7)
- 🔢 Numerical Methods (6)
- 🕸 Graph Theory for Deep Learning (6)
- 🔵 Topology for ML (5)
- 🌐 Differential Geometry (6)
- ∞ Measure Theory & Functional Analysis (6)
- 🎰 Random Matrix Theory (5)
- 🌊 Fourier Analysis & Signal Processing (9)
- 🎰 Sampling & Monte Carlo Methods (10)
- 🧠 Deep Learning Theory (12)
- 🛡️ Regularization Theory (11)
- 👁️ Attention & Transformer Theory (10)
- 🎨 Generative Model Theory (11)
- 🔮 Representation Learning (10)
- 🎮 Reinforcement Learning Mathematics (9)
- 🔄 Variational Methods (8)
- 📉 Loss Functions & Objectives (10)
- ⏱️ Sequence & Temporal Models (8)
- 💎 Geometric Deep Learning (8)

โˆ‘MathIntermediate

Numerical Stability

Numerical stability measures how much rounding and tiny input changes can distort an algorithmโ€™s output on real computers using floating-point arithmetic.

#numerical stability#forward error#backward error+12
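A tiny illustration of the idea (my sketch, not from the card): two algebraically identical formulas for √(x+1) − √x, one of which loses all its digits to cancellation in float64.

```python
import math

def naive_diff(x):
    # Subtracting two nearly equal numbers cancels the leading digits.
    return math.sqrt(x + 1) - math.sqrt(x)

def stable_diff(x):
    # Algebraically the same quantity, rewritten to avoid the subtraction.
    return 1.0 / (math.sqrt(x + 1) + math.sqrt(x))

x = 1e16
# At x = 1e16 the float64 spacing is 2, so x + 1 rounds back to x:
# naive_diff(1e16) -> 0.0, while stable_diff(1e16) -> ~5e-9.
```

Same math, very different forward error: the stable form keeps all ~16 significant digits.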
โˆ‘MathIntermediate

Lagrange Multipliers & Constrained Optimization

Lagrange multipliers let you optimize a function while strictly satisfying equality constraints by introducing auxiliary variables (the multipliers).

#lagrange multipliers
Advanced
Filtering by:
#gaussian elimination
#constrained optimization
#kkt conditions
+11
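A worked micro-example (my own, not from the card): maximize f(x, y) = xy subject to x + y = 1, checking the stationarity condition ∇f = λ∇g by hand.

```python
# Maximize f(x, y) = x*y subject to g(x, y) = x + y - 1 = 0.
# Stationarity: grad f = lam * grad g  =>  (y, x) = (lam, lam),
# so x = y = lam, and the constraint forces x = y = lam = 1/2.
x = y = lam = 0.5

grad_f = (y, x)          # gradient of the objective at the candidate
grad_g = (1.0, 1.0)      # gradient of the constraint
assert grad_f == (lam * grad_g[0], lam * grad_g[1])

def f_on_constraint(x):
    # Eliminate y using the constraint to see f along feasible points.
    return x * (1 - x)
```

Nearby feasible points do no better than x = 1/2, confirming the multiplier condition found the constrained maximum.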
โˆ‘MathIntermediate

Implicit Differentiation & Implicit Function Theorem

Implicit differentiation lets you find slopes and higher derivatives even when y is given indirectly by an equation F(x,y)=0.

#implicit differentiation#implicit function theorem#jacobian+12
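A quick check of the idea in plain Python (my sketch): on the circle F(x, y) = x² + y² − 1 = 0, the implicit slope dy/dx = −F_x/F_y = −x/y should match a finite difference of the explicit branch y(x) = √(1 − x²).

```python
import math

def implicit_slope(x, y):
    # F(x, y) = x^2 + y^2 - 1 = 0  =>  dy/dx = -F_x / F_y = -x / y
    return -x / y

x = 0.6
y = math.sqrt(1 - x * x)          # upper half of the unit circle, y = 0.8
h = 1e-6
numeric = (math.sqrt(1 - (x + h) ** 2) - math.sqrt(1 - (x - h) ** 2)) / (2 * h)
# implicit_slope(0.6, 0.8) = -0.75, agreeing with the central difference
```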
โˆ‘MathIntermediate

Systems of Linear Equations

A system of linear equations asks for numbers that make several linear relationships true at the same time, which we compactly write as Ax = b.

#systems of linear equations#gaussian elimination#row echelon form+12
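To make the Ax = b notation concrete, here is a minimal sketch (mine, not the site's) of a 2×2 system written in matrix form and a candidate solution verified by a matrix–vector product.

```python
# The system  2x + y = 5,  x - y = 1  is Ax = b with:
A = [[2.0, 1.0],
     [1.0, -1.0]]
b = [5.0, 1.0]

def matvec(A, x):
    # Each entry of Ax is a row of A dotted with x.
    return [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) for row in A]

x = [2.0, 1.0]   # candidate solution: x = 2, y = 1
# matvec(A, x) == b confirms both equations hold simultaneously.
```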
โˆ‘MathIntermediate

Matrix Operations & Properties

Matrix operations like multiplication and transpose combine or reorient data tables and linear transformations in predictable ways.

#matrix multiplication#transpose#trace+12
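One of those "predictable ways" is the identity (AB)ᵀ = BᵀAᵀ; a small pure-Python sketch (my illustration) checks it on a concrete pair of matrices.

```python
def matmul(A, B):
    # Entry (i, j) of AB is row i of A dotted with column j of B.
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    # Swap rows and columns.
    return [list(row) for row in zip(*A)]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
# Identity to verify: transpose(matmul(A, B)) == matmul(transpose(B), transpose(A))
```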
โˆ‘MathIntermediate

Vectors & Vector Spaces

A vector is an element you can add and scale, and a vector space is any collection of such elements closed under these operations.

#vector space#basis#span+12
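The two defining operations can be sketched directly (my example, not the site's): componentwise addition and scaling on ℝ², under which results stay in ℝ² and axioms like commutativity hold.

```python
def add(u, v):
    # Vector addition: componentwise sum.
    return tuple(a + b for a, b in zip(u, v))

def scale(c, v):
    # Scalar multiplication: stretch every component by c.
    return tuple(c * a for a in v)

u, v = (1.0, 2.0), (3.0, -1.0)
# Any mix of add/scale stays inside R^2 (closure), and
# add(u, v) == add(v, u) illustrates commutativity.
```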
โš™๏ธAlgorithmAdvanced

DP with Probability

DP with probability models how chance flows between states over time by repeatedly redistributing mass according to transition probabilities.

#markov chain#probability dp#absorbing state+12
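The "redistributing mass" step can be sketched in a few lines (my example): push a probability distribution through a 2-state Markov chain until it settles at the stationary distribution.

```python
def step(dist, P):
    # One DP step: mass entering state j is the sum over source states i
    # of (mass at i) * (transition probability i -> j).
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

P = [[0.9, 0.1],
     [0.5, 0.5]]
dist = [1.0, 0.0]        # start with all mass in state 0
for _ in range(100):
    dist = step(dist, P)
# dist converges to the stationary distribution (5/6, 1/6).
```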
โš™๏ธAlgorithmAdvanced

DP with Expected Value

Dynamic programming with expected value solves problems where each state transitions randomly and we seek the expected cost, time, or steps to reach a goal.

#expected value dp#linearity of expectation#indicator variables+11
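A classic instance, sketched by me as an illustration: the expected number of fair-coin flips to see "HH", with one expectation per state (how many heads of progress we have) solved by value iteration.

```python
# States: E0 = expected flips with no progress, E1 = expected flips
# after one head. The DP equations are:
#   E0 = 1 + 0.5 * E1 + 0.5 * E0   (heads advances, tails resets)
#   E1 = 1 + 0.5 * 0  + 0.5 * E0   (heads finishes, tails resets)
# Iterate the equations until the expectations stop changing.
E0 = E1 = 0.0
for _ in range(200):
    E0, E1 = 1 + 0.5 * E1 + 0.5 * E0, 1 + 0.5 * E0
# Converges to E0 = 6.0 flips, E1 = 4.0 flips.
```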
โˆ‘MathIntermediate

Matrix Rank and Linear Independence

Matrix rank is the number of pivots after Gaussian elimination and equals the dimension of both the column space and the row space.

#matrix rank#linear independence#gaussian elimination+12
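The "count the pivots" definition translates directly into code; here is a minimal sketch (mine) of rank via row reduction with a small tolerance for floating-point zeros.

```python
def rank(A, eps=1e-9):
    # Row-reduce a copy of A and count the pivots found.
    A = [row[:] for row in A]
    rows, cols = len(A), len(A[0])
    r = 0                                  # pivots found so far
    for c in range(cols):
        pivot = next((i for i in range(r, rows) if abs(A[i][c]) > eps), None)
        if pivot is None:
            continue                       # no pivot in this column
        A[r], A[pivot] = A[pivot], A[r]
        for i in range(r + 1, rows):
            f = A[i][c] / A[r][c]
            for j in range(c, cols):
                A[i][j] -= f * A[r][j]
        r += 1
    return r
```

A dependent row drops the rank: if row 3 is the sum of rows 1 and 2, only two pivots survive.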
โˆ‘MathAdvanced

Gaussian Elimination over GF(2)

Gaussian elimination over GF(2) is ordinary Gaussian elimination where addition and subtraction are XOR and multiplication is AND.

#gaussian elimination#gf(2)#xor basis+12
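Because row operations become XORs on bitmasks, the whole elimination collapses into the classic "XOR linear basis"; a compact sketch (my illustration):

```python
def xor_basis(nums, bits=60):
    # basis[b] holds a vector whose highest set bit is b (or 0 if none).
    basis = [0] * bits
    for x in nums:
        for b in range(bits - 1, -1, -1):
            if not (x >> b) & 1:
                continue
            if basis[b] == 0:
                basis[b] = x        # x becomes a new pivot row
                break
            x ^= basis[b]           # eliminate bit b: a GF(2) row operation
    return [v for v in basis if v]
```

For [6, 10, 12], the vector 12 reduces to 0 (it is 6 XOR 10), so the basis has only two elements.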
โˆ‘MathIntermediate

Gaussian Elimination

Gaussian elimination is a systematic way to solve linear equations by cleaning a matrix into an upper-triangular form using row swaps, scaling, and adding multiples of rows.

#gaussian elimination#partial pivoting#row echelon form+12
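The steps the card names (row swaps, scaling, adding multiples of rows) fit in a short solver; a minimal sketch (my own) with partial pivoting and back substitution:

```python
def solve(A, b):
    # Solve Ax = b by Gaussian elimination with partial pivoting.
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]   # augmented matrix [A | b]
    for c in range(n):
        p = max(range(c, n), key=lambda i: abs(M[i][c]))  # largest pivot
        M[c], M[p] = M[p], M[c]                    # row swap
        for i in range(c + 1, n):
            f = M[i][c] / M[c][c]
            for j in range(c, n + 1):
                M[i][j] -= f * M[c][j]             # add multiple of pivot row
    x = [0.0] * n
    for i in range(n - 1, -1, -1):                 # back substitution
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x
```

Choosing the largest available pivot (partial pivoting) is what keeps the division factors small and the method numerically stable.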
โˆ‘MathIntermediate

Determinant

The determinant of a square matrix measures how a linear transformation scales volume and whether it flips orientation.

#determinant#gaussian elimination#lu decomposition+12
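Elimination also computes determinants: reduce to upper-triangular form, multiply the pivots, and flip the sign once per row swap. A minimal sketch (mine, not the site's):

```python
def det(A):
    # Determinant via Gaussian elimination: product of pivots,
    # with a sign flip for every row swap.
    A = [row[:] for row in A]
    n, sign, d = len(A), 1.0, 1.0
    for c in range(n):
        p = max(range(c, n), key=lambda i: abs(A[i][c]))
        if abs(A[p][c]) < 1e-12:
            return 0.0               # a zero pivot means a singular matrix
        if p != c:
            A[c], A[p] = A[p], A[c]
            sign = -sign             # each swap flips orientation
        d *= A[c][c]
        for i in range(c + 1, n):
            f = A[i][c] / A[c][c]
            for j in range(c, n):
                A[i][j] -= f * A[c][j]
    return sign * d
```

For [[1, 2], [3, 4]] this gives −2: the transformation doubles area and reverses orientation.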