🎓 How I Study AI
How I Study AI - Learn AI Papers & Lectures the Easy Way

Concepts (7)


∑ Math · Intermediate

Hidden Markov Models

A Hidden Markov Model (HMM) describes sequences where you cannot see the true state directly, but you can observe outputs generated by those hidden states.

#hidden-markov-model #forward-algorithm #viterbi +12
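The forward-algorithm tag can be made concrete with a minimal sketch. The two-state weather model and every probability below are illustrative assumptions, not taken from the card:

```python
# Toy 2-state HMM (states: 0 = Rainy, 1 = Sunny; observations: 0 = walk, 1 = shop, 2 = clean).
# All probabilities are made up for illustration.
start = [0.6, 0.4]                           # P(initial state)
trans = [[0.7, 0.3], [0.4, 0.6]]             # trans[i][j] = P(next state j | current state i)
emit = [[0.1, 0.4, 0.5], [0.6, 0.3, 0.1]]    # emit[i][o] = P(observation o | state i)

def forward(obs):
    """Forward algorithm: P(observation sequence), summing over all hidden paths."""
    alpha = [start[s] * emit[s][obs[0]] for s in range(2)]
    for o in obs[1:]:
        alpha = [sum(alpha[i] * trans[i][j] for i in range(2)) * emit[j][o]
                 for j in range(2)]
    return sum(alpha)

print(forward([0, 1, 2]))  # likelihood of observing walk, shop, clean
```

The recursion runs in O(T·N²) time, versus the exponential cost of enumerating every hidden path explicitly.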
⚙️ Algorithm · Intermediate

Dynamic Time Warping

Dynamic Time Warping (DTW) aligns two time series that may vary in speed to find the minimum-cost correspondence between their elements.

#dynamic-time-warping #dtw-c++ #time-series-alignment +11
Group: Sequence & Temporal Models
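A bare-bones dynamic-programming sketch of DTW. Using absolute difference as the local cost is a hypothetical choice here; the card does not fix one:

```python
def dtw(a, b):
    """Minimum-cost alignment distance between sequences a and b.
    Local cost: |x - y|. Steps allowed: match, repeat-a, repeat-b."""
    n, m = len(a), len(b)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],       # advance in a only
                                 D[i][j - 1],       # advance in b only
                                 D[i - 1][j - 1])   # advance in both
    return D[n][m]

print(dtw([1, 2, 3], [1, 2, 2, 3]))  # 0.0: same shape, one step repeated
```

The repeat moves are what let two series of different speeds align at zero cost when their shapes match.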
📚 Theory · Intermediate

Sequence-to-Sequence with Attention

Sequence-to-sequence with attention lets a decoder focus on the most relevant parts of the input at each output step, rather than compressing everything into a single vector.

#sequence-to-sequence #attention #encoder-decoder +12
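A minimal sketch of one attention step, assuming dot-product scoring and tiny hand-picked vectors (both assumptions, not from the card):

```python
import math

def attention(decoder_state, encoder_states):
    """One attention step: score each encoder state against the decoder state,
    softmax the scores, and return the weighted context vector."""
    scores = [sum(d * e for d, e in zip(decoder_state, h)) for h in encoder_states]
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]       # numerically stable softmax
    total = sum(exps)
    weights = [e / total for e in exps]
    context = [sum(w * h[d] for w, h in zip(weights, encoder_states))
               for d in range(len(decoder_state))]
    return weights, context

# The encoder state most similar to the decoder state gets the largest weight.
w, ctx = attention([0.0, 1.0], [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]])
```

The context vector is recomputed at every decoding step, which is exactly what avoids squeezing the whole input into one fixed vector.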
📚 Theory · Intermediate

Temporal Convolutions

Temporal (causal) convolution computes each output at time t using only the current and past inputs, ensuring no future information leakage.

#temporal-convolution #causal-convolution #fir-filter +12
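The no-leakage property can be sketched with left zero-padding, so each output at time t sees only inputs up to t (the kernel values below are arbitrary):

```python
def causal_conv(x, kernel):
    """Causal FIR convolution: y[t] = sum_j kernel[j] * x[t - j],
    with x zero-padded on the left so no future input is touched."""
    k = len(kernel)
    pad = [0.0] * (k - 1) + list(x)
    return [sum(kernel[j] * pad[t + k - 1 - j] for j in range(k))
            for t in range(len(x))]

# Impulse response: the kernel appears starting at t = 0, never earlier.
print(causal_conv([1, 0, 0, 0], [0.5, 0.25]))  # [0.5, 0.25, 0.0, 0.0]
```

Changing a future input leaves all earlier outputs untouched, which is the causality guarantee the description states.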
∑ Math · Intermediate

State Space Models (SSM)

A State Space Model (SSM) describes a dynamical system using a state vector x(t) that evolves via a first-order matrix differential equation and produces outputs y(t).

#state-space #matrix-exponential #controllability +12
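A sketch of simulating x'(t) = A x + B u, y = C x. Forward-Euler discretization and the damped-oscillator matrices are my assumptions; an exact discretization would use the matrix exponential instead:

```python
def simulate_ssm(A, B, C, u, dt=0.01):
    """Forward-Euler simulation of x'(t) = A x + B u(t), y(t) = C x, x(0) = 0."""
    n = len(A)
    x = [0.0] * n
    ys = []
    for ut in u:
        dx = [sum(A[i][j] * x[j] for j in range(n)) + B[i] * ut for i in range(n)]
        x = [x[i] + dt * dx[i] for i in range(n)]
        ys.append(sum(C[j] * x[j] for j in range(n)))
    return ys

# Damped oscillator: state = (position, velocity), unit step input, read out position.
A = [[0.0, 1.0], [-1.0, -1.0]]
B = [0.0, 1.0]
C = [1.0, 0.0]
ys = simulate_ssm(A, B, C, [1.0] * 100)
```

Once discretized, the update x ← x + dt·(Ax + Bu) is a plain linear recurrence, which is why SSMs slot naturally into sequence models.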
📚 Theory · Intermediate

LSTM & Gating Mechanisms

Long Short-Term Memory (LSTM) networks use gates (forget, input, and output) to control what information to erase, write, and reveal at each time step.

#lstm #forget-gate #input-gate +11
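The three gates can be traced in a single step. Scalar states and the weight layout `W[gate] = (input weight, recurrent weight, bias)` are simplifications for illustration, not the card's notation:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_step(x, h_prev, c_prev, W):
    """One LSTM step with scalar states for clarity."""
    f = sigmoid(W["f"][0] * x + W["f"][1] * h_prev + W["f"][2])    # forget: what to erase
    i = sigmoid(W["i"][0] * x + W["i"][1] * h_prev + W["i"][2])    # input: what to write
    o = sigmoid(W["o"][0] * x + W["o"][1] * h_prev + W["o"][2])    # output: what to reveal
    g = math.tanh(W["g"][0] * x + W["g"][1] * h_prev + W["g"][2])  # candidate value
    c = f * c_prev + i * g          # erase old content, write new content
    h = o * math.tanh(c)            # reveal part of the cell state
    return h, c

# Saturate the gates to expose the mechanics: forget everything, write the candidate fully.
W = {"f": (0.0, 0.0, -100.0), "i": (0.0, 0.0, 100.0),
     "o": (0.0, 0.0, 100.0), "g": (1.0, 0.0, 0.0)}
h, c = lstm_step(0.5, 0.0, 5.0, W)  # the old cell state 5.0 is erased
```

With f ≈ 0 the old cell value 5.0 vanishes and c becomes just the written candidate tanh(0.5), showing the erase/write split the description refers to.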
📚 Theory · Intermediate

Recurrent Neural Network Theory

A Recurrent Neural Network (RNN) processes sequences by carrying a hidden state that is updated at every time step using h_t = f(W_h h_{t-1} + W_x x_t + b).

#recurrent-neural-network #rnn #backpropagation-through-time +12
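The update h_t = f(W_h h_{t-1} + W_x x_t + b) can be sketched in scalar form. Choosing f = tanh and the weight values below are illustrative assumptions:

```python
import math

def rnn(xs, w_h=0.5, w_x=1.0, b=0.0):
    """Scalar RNN: h_t = tanh(w_h * h_{t-1} + w_x * x_t + b), with h_0 = 0."""
    h = 0.0
    hs = []
    for x in xs:
        h = math.tanh(w_h * h + w_x * x + b)  # carry state forward one step
        hs.append(h)
    return hs

hs = rnn([1.0, 0.0, 0.0])  # the input at t=0 echoes through later hidden states
```

Each hidden state depends on the entire prefix of inputs through the recurrence, which is exactly what backpropagation through time must unroll.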