๐ŸŽ“How I Study AIHISA
๐Ÿ“–Read
๐Ÿ“„Papers๐Ÿ“ฐBlogs๐ŸŽฌCourses
๐Ÿ’กLearn
๐Ÿ›ค๏ธPaths๐Ÿ“šTopics๐Ÿ’กConcepts๐ŸŽดShorts
๐ŸŽฏPractice
๐Ÿ“Daily Log๐ŸŽฏPrompts๐Ÿง Review
SearchSettings
How I Study AI - Learn AI Papers & Lectures the Easy Way

Concepts (25)

Groups

๐Ÿ“Linear Algebra15๐Ÿ“ˆCalculus & Differentiation10๐ŸŽฏOptimization14๐ŸŽฒProbability Theory12๐Ÿ“ŠStatistics for ML9๐Ÿ“กInformation Theory10๐Ÿ”บConvex Optimization7๐Ÿ”ขNumerical Methods6๐Ÿ•ธGraph Theory for Deep Learning6๐Ÿ”ตTopology for ML5๐ŸŒDifferential Geometry6โˆžMeasure Theory & Functional Analysis6๐ŸŽฐRandom Matrix Theory5๐ŸŒŠFourier Analysis & Signal Processing9๐ŸŽฐSampling & Monte Carlo Methods10๐Ÿง Deep Learning Theory12๐Ÿ›ก๏ธRegularization Theory11๐Ÿ‘๏ธAttention & Transformer Theory10๐ŸŽจGenerative Model Theory11๐Ÿ”ฎRepresentation Learning10๐ŸŽฎReinforcement Learning Mathematics9๐Ÿ”„Variational Methods8๐Ÿ“‰Loss Functions & Objectives10โฑ๏ธSequence & Temporal Models8๐Ÿ’ŽGeometric Deep Learning8

โš™๏ธAlgorithmIntermediate

Bootstrap & Resampling Methods

Bootstrap is a resampling method that estimates uncertainty by repeatedly sampling with replacement from the observed data.

#bootstrap · #resampling · #confidence intervals · +12
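The percentile bootstrap fits in a few lines of standard-library Python. The helper below is an illustrative sketch (names are mine, not from the site): resample with replacement, recompute the statistic, and read off the percentiles of the resampled statistics.

```python
import random

def bootstrap_ci(data, stat, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for any statistic."""
    rng = random.Random(seed)
    n = len(data)
    # Resample with replacement, recomputing the statistic each time.
    boots = sorted(
        stat([data[rng.randrange(n)] for _ in range(n)])
        for _ in range(n_boot)
    )
    lo = boots[int(alpha / 2 * n_boot)]
    hi = boots[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

mean = lambda xs: sum(xs) / len(xs)
sample = [2.1, 2.5, 1.9, 3.0, 2.7, 2.2, 2.8, 2.4]
low, high = bootstrap_ci(sample, mean)  # 95% CI around the sample mean 2.45
```

Each resampled dataset has the same size as the original; only the percentile method is shown here, not bias-corrected variants.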
๐Ÿ“šTheoryIntermediate

Bayesian Inference

Bayesian inference updates prior beliefs with observed data to produce a posterior distribution P(θ | D).

#bayesian inference · #posterior · #prior · #monte carlo · +12
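The Beta-Bernoulli conjugate pair makes the prior-to-posterior update explicit: a Beta(a, b) prior over a coin's bias, updated with observed flips, yields a Beta posterior in closed form. A standard textbook sketch, not code from the site:

```python
def beta_bernoulli_update(a, b, flips):
    """Conjugate update: Beta(a, b) prior + Bernoulli data -> Beta posterior."""
    heads = sum(flips)
    tails = len(flips) - heads
    # The posterior simply adds observed counts to the prior pseudo-counts.
    return a + heads, b + tails

# Uniform Beta(1, 1) prior, then observe 5 heads and 2 tails.
a_post, b_post = beta_bernoulli_update(1, 1, [1, 1, 0, 1, 1, 0, 1])
posterior_mean = a_post / (a_post + b_post)  # E[theta | D] = 6/9
```

Conjugacy is what keeps this update a one-liner; non-conjugate models need sampling or variational approximations instead.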
โˆ‘MathIntermediate

Expectation, Variance & Moments

Expectation is the long-run average value of a random variable and acts like the balance point of its distribution.

#expectation · #variance · #moments · +12
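Raw and central moments of a discrete distribution can be computed directly from its pmf; the mean is the first raw moment and the variance the second central moment. A minimal sketch (helper name is illustrative):

```python
def moment(pmf, k, central=False):
    """k-th raw or central moment of a discrete distribution {value: prob}."""
    # Central moments are taken about the mean; raw moments about zero.
    mu = sum(x * p for x, p in pmf.items()) if central else 0.0
    return sum((x - mu) ** k * p for x, p in pmf.items())

die = {x: 1 / 6 for x in range(1, 7)}   # fair six-sided die
die_mean = moment(die, 1)               # first raw moment: 3.5
die_var = moment(die, 2, central=True)  # second central moment: 35/12
```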
โˆ‘MathBeginner

Conditional Probability

Conditional probability measures the chance of event A happening when we already know event B happened.

#conditional probability · #bayes theorem · #law of total probability · +12
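With equally likely outcomes, P(A | B) = P(A and B) / P(B) reduces to counting. A small enumeration over two fair dice (example scenario is mine): A = "first die shows 6", B = "total is at least 10".

```python
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))  # all 36 equally likely rolls
B = [o for o in outcomes if sum(o) >= 10]        # observed: total >= 10
A_and_B = [o for o in B if o[0] == 6]            # and the first die is a 6
p_b = len(B) / len(outcomes)                     # 6/36
p_a_given_b = len(A_and_B) / len(B)              # 3/6 = 0.5
```

Knowing the total is at least 10 raises the chance of a leading 6 from 1/6 to 1/2: conditioning shrinks the sample space to B.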
โˆ‘MathIntermediate

Random Variables & Distributions

A random variable maps uncertain outcomes to numbers and is described by a distribution that assigns likelihoods to values or ranges.

#random variable · #pmf · #pdf · +12
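A discrete random variable is fully described by its pmf; the cdf accumulates that mass, and inverting the cdf at a uniform draw gives a sampler. An illustrative sketch (the specific pmf is made up):

```python
import random

# A discrete random variable described by its pmf {value: probability}.
pmf = {0: 0.1, 1: 0.3, 2: 0.4, 3: 0.2}

def cdf(x):
    """P(X <= x): accumulate pmf mass up to x."""
    return sum(p for v, p in pmf.items() if v <= x)

def sample(rng):
    """Inverse-transform sampling: walk the CDF until it passes a uniform draw."""
    u, acc = rng.random(), 0.0
    for v, p in sorted(pmf.items()):
        acc += p
        if u <= acc:
            return v
    return max(pmf)  # guard against floating-point round-off

rng = random.Random(0)
draws = [sample(rng) for _ in range(1000)]
```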
โˆ‘MathIntermediate

Probability Axioms & Rules

Kolmogorovโ€™s axioms define probability as a measure on events: non-negativity, normalization, and countable additivity.

#kolmogorov axioms · #probability measure · #sample space · +12
๐Ÿ“šTheoryIntermediate

Randomized Algorithm Theory

Randomized algorithms use random bits to make choices that simplify design, avoid worst cases, and often speed up computation.

#randomized algorithms · #las vegas · #monte carlo · +12
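A classic Monte Carlo example is Freivalds' algorithm: it checks a claimed matrix product A·B = C in O(n²) per trial instead of multiplying in O(n³), at the cost of a small, controllable one-sided error. A plain-Python sketch under the assumption of square list-of-lists matrices:

```python
import random

def freivalds(A, B, C, trials=10, seed=0):
    """Monte Carlo check of A @ B == C in O(trials * n^2).

    May wrongly accept with probability <= 2**-trials; never wrongly rejects.
    """
    rng = random.Random(seed)
    n = len(A)
    for _ in range(trials):
        r = [rng.randrange(2) for _ in range(n)]  # random 0/1 vector
        # Compute A(Br) and Cr; both cost O(n^2), avoiding the O(n^3) product.
        Br = [sum(B[i][j] * r[j] for j in range(n)) for i in range(n)]
        ABr = [sum(A[i][j] * Br[j] for j in range(n)) for i in range(n)]
        Cr = [sum(C[i][j] * r[j] for j in range(n)) for i in range(n)]
        if ABr != Cr:
            return False  # a witness vector proves A @ B != C
    return True  # all trials agreed: almost certainly correct

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
ok = freivalds(A, B, [[19, 22], [43, 50]])
```

This is Monte Carlo (fixed runtime, small error probability); a Las Vegas algorithm such as randomized quicksort instead guarantees correctness with random runtime.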
๐Ÿ“šTheoryIntermediate

Probability Theory

Probability theory formalizes uncertainty using a sample space, events, and a probability measure that obeys clear axioms.

#probability measure · #random variable · #expectation · +12
๐Ÿ“šTheoryIntermediate

KL Divergence (Kullback-Leibler Divergence)

Kullbackโ€“Leibler (KL) divergence measures how one probability distribution P devotes probability mass differently from a reference distribution Q.

#kl divergence · #kullback-leibler · #cross-entropy · +12
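For finite distributions the definition KL(P‖Q) = Σₓ P(x) log(P(x)/Q(x)) is a one-liner, which also makes its two key properties easy to check: it is nonnegative, and it is not symmetric. A minimal sketch (distributions are illustrative):

```python
from math import log

def kl_divergence(p, q):
    """KL(P || Q) = sum_x P(x) * log(P(x) / Q(x)); nonnegative, zero iff P == Q."""
    # Terms with P(x) = 0 contribute nothing, by the convention 0 * log 0 = 0.
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]  # fair coin
q = [0.9, 0.1]  # heavily biased coin
```

Because it is asymmetric, KL is not a metric: KL(P‖Q) penalizes Q for assigning little mass where P is large, which is why the direction matters in ML objectives.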
โˆ‘MathIntermediate

Variance and Covariance

Variance measures how spread out a random variable is around its mean, while covariance measures how two variables move together.

#variance · #covariance · #standard deviation · +12
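The two notions are one formula: Var(X) = Cov(X, X). A small sketch computing both from paired samples (population version, dividing by n; the data is made up):

```python
def covariance(xs, ys):
    """Cov(X, Y) = E[(X - EX)(Y - EY)], estimated from paired samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n

def variance(xs):
    return covariance(xs, xs)  # variance is covariance with itself

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]  # ys = 2 * xs, so Cov(X, Y) = 2 * Var(X)
```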
โˆ‘MathIntermediate

Expected Value

Expected value is the long-run average outcome of a random variable if you could repeat the experiment many times.

#expected value · #linearity of expectation · #indicator variables · +12
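Linearity of expectation plus indicator variables gives a classic result: a uniformly random permutation of n items has, on average, exactly one fixed point, because each position i contributes E[Iᵢ] = 1/n and the n contributions sum to 1 with no independence needed. A brute-force check over small n (illustrative helper):

```python
from itertools import permutations

def expected_fixed_points(n):
    """Average number of fixed points over all permutations of n items."""
    perms = list(permutations(range(n)))
    # Count positions where the permutation maps i to itself, then average.
    total = sum(sum(1 for i, v in enumerate(p) if i == v) for p in perms)
    return total / len(perms)
```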
โˆ‘MathIntermediate

Probability Fundamentals

Probability quantifies uncertainty by assigning numbers between 0 and 1 to events in a sample space.

#probability · #sample space · #conditional probability · +12