How I Study AI - Learn AI Papers & Lectures the Easy Way
🎰 Random Matrix Theory

Spectral distributions of random matrices and their surprising connections to neural network behavior.

5 concepts


βˆ‘ Math Β· Advanced

Wigner Semicircle Law

The Wigner Semicircle Law says that the histogram of eigenvalues of large random symmetric matrices converges to a semicircle-shaped curve.

#wigner semicircle law Β· #random matrix Β· #empirical spectral distribution Β· +12 more
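The law is easy to see numerically. A minimal sketch with NumPy (matrix size and bin count are illustrative choices): sample a large symmetric Gaussian matrix, scale it so the limiting spectrum lands on [-2, 2], and compare the eigenvalue histogram against the semicircle density.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Symmetric Gaussian matrix, scaled so the limiting spectrum is supported on [-2, 2]
G = rng.standard_normal((n, n))
A = (G + G.T) / np.sqrt(2 * n)
eigs = np.linalg.eigvalsh(A)

# Compare the eigenvalue histogram with the semicircle density sqrt(4 - x^2) / (2*pi)
hist, edges = np.histogram(eigs, bins=40, range=(-2.2, 2.2), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
semicircle = np.sqrt(np.clip(4.0 - centers**2, 0.0, None)) / (2.0 * np.pi)
max_err = np.max(np.abs(hist - semicircle))
```

Already at n = 1000 the histogram tracks the semicircle closely, and the agreement tightens as n grows.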
βˆ‘ Math Β· Advanced

Marchenko-Pastur Distribution

The Marchenko–Pastur (MP) distribution describes the limiting eigenvalue distribution of sample covariance matrices S = (1/n)XXα΅€ when both the dimension p and the sample size n grow with p/n β†’ Ξ³.

#marchenko-pastur Β· #random matrix theory Β· #sample covariance Β· +10 more
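A quick numerical sketch of the statement above (the dimensions are illustrative): draw X with i.i.d. standard normal entries, form S = (1/n)XXα΅€, and check that the sample eigenvalues fill the MP support [(1 βˆ’ βˆšΞ³)Β², (1 + βˆšΞ³)Β²].

```python
import numpy as np

rng = np.random.default_rng(1)
p, n = 500, 1000                  # aspect ratio gamma = p/n = 0.5 (illustrative)
gamma = p / n

X = rng.standard_normal((p, n))   # p-dimensional data, n samples, unit variance
S = X @ X.T / n                   # sample covariance matrix
eigs = np.linalg.eigvalsh(S)

# MP support edges for unit-variance entries
lam_minus = (1 - np.sqrt(gamma)) ** 2
lam_plus = (1 + np.sqrt(gamma)) ** 2
```

The average eigenvalue is still close to the true value 1 (since tr(S)/p concentrates), but the individual eigenvalues spread across the whole MP interval rather than clustering near 1.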
βˆ‘ Math Β· Advanced

Free Probability Theory

Free probability studies "random variables" that do not commute, where independence is replaced by freeness and noncrossing combinatorics replaces classical partitions.

#free probability Β· #freeness Β· #r-transform Β· +11 more
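Freeness can be observed numerically: independent Wigner matrices are asymptotically free, and the free additive convolution of two semicircle laws is again a semicircle with the variances added. A sketch under that standard fact (matrix size is illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1000

def wigner(n, rng):
    """Symmetric Gaussian matrix with semicircle spectrum on [-2, 2] (variance 1)."""
    G = rng.standard_normal((n, n))
    return (G + G.T) / np.sqrt(2 * n)

# Independent Wigner matrices are asymptotically free, so the spectrum of A + B
# is the free additive convolution of two semicircles: again a semicircle,
# with variance 1 + 1 = 2 and support [-2*sqrt(2), 2*sqrt(2)]
A, B = wigner(n, rng), wigner(n, rng)
eigs = np.linalg.eigvalsh(A + B)
second_moment = np.mean(eigs**2)   # should approach 2
```

This is the simplest instance where the R-transform linearizes addition: R-transforms add under free convolution, just as cumulants add under classical convolution of independent random variables.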
πŸ“š Theory Β· Advanced

Spectral Analysis of Neural Networks

Spectral analysis studies the distribution of eigenvalues and singular values of neural network weight matrices during training.

#spectral analysis Β· #eigenvalues Β· #singular values Β· +12 more
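A hedged sketch of the idea: at random initialization a weight matrix's singular values fill the Marchenko–Pastur bulk, while learned structure shows up as singular values escaping the bulk. Here "training" is simulated by adding a strong rank-1 component — an assumption for illustration, not an actual training run:

```python
import numpy as np

rng = np.random.default_rng(3)
p, n = 300, 600                               # layer shape (illustrative)
gamma = p / n

# "Untrained" layer: i.i.d. Gaussian weights; singular values fill the MP bulk
W = rng.standard_normal((p, n)) / np.sqrt(n)
bulk_edge = 1 + np.sqrt(gamma)                # top edge of the random bulk

# Simulate learned structure with a strong rank-1 component (an assumption
# for illustration): one singular value escapes the random bulk
u = rng.standard_normal((p, 1)); u /= np.linalg.norm(u)
v = rng.standard_normal((n, 1)); v /= np.linalg.norm(v)
W_trained = W + 3.0 * u @ v.T

sv_init = np.linalg.svd(W, compute_uv=False)
sv_trained = np.linalg.svd(W_trained, compute_uv=False)
```

Outlier singular values above the bulk edge are the spectral signature one looks for in trained networks; the bulk itself serves as the null model for "no learned structure".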
πŸ“š Theory Β· Advanced

Random Matrix Theory in High-Dimensional Statistics

Random Matrix Theory (RMT) explains how eigenvalues of large random matrices behave when the dimension p is comparable to the sample size n.

#random matrix theory Β· #marchenko-pastur Β· #wigner semicircle Β· +12 more
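A sketch of the core phenomenon (dimensions illustrative): even when the true covariance is the identity, so every population eigenvalue equals 1, the sample eigenvalues spread over the whole MP interval when p is comparable to n — classical intuition from fixed-p asymptotics fails.

```python
import numpy as np

rng = np.random.default_rng(4)
p, n = 400, 800                   # p comparable to n (illustrative)
gamma = p / n

# True covariance is the identity: every population eigenvalue equals 1
X = rng.standard_normal((p, n))
S = X @ X.T / n
eigs = np.linalg.eigvalsh(S)

# Sample eigenvalues do NOT concentrate at 1; they spread over the
# Marchenko-Pastur interval [(1 - sqrt(g))^2, (1 + sqrt(g))^2]
spread = eigs.max() - eigs.min()
```

This systematic spreading is why high-dimensional covariance estimation uses RMT-based corrections (eigenvalue shrinkage) instead of the raw sample eigenvalues.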