The Moore-Penrose pseudoinverse generalizes matrix inversion to rectangular or singular matrices and is denoted A⁺.
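A minimal NumPy sketch of this idea, using `np.linalg.pinv` on an illustrative tall matrix:

```python
import numpy as np

# A tall 3x2 matrix: not square, so it has no ordinary inverse.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

A_pinv = np.linalg.pinv(A)  # Moore-Penrose pseudoinverse, shape (2, 3)

# The defining Penrose identity A A+ A = A still holds.
print(np.allclose(A @ A_pinv @ A, A))  # True
```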
A sparse matrix stores only its nonzero entries, saving huge amounts of memory when most entries are zero.
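A small sketch of the memory saving, assuming SciPy's `csr_matrix` is available (the text itself names no library):

```python
import numpy as np
from scipy.sparse import csr_matrix  # assumes SciPy is installed

dense = np.eye(1000)         # 1,000,000 float64 entries, mostly zero
sparse = csr_matrix(dense)   # stores only the 1,000 nonzeros

print(sparse.nnz)                         # 1000 stored entries
print(sparse.data.nbytes < dense.nbytes)  # True: far less memory for the values
```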
The Kronecker product A ⊗ B expands a small matrix into a larger block matrix by multiplying every entry of A with the whole matrix B.
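NumPy's `np.kron` computes this directly; the 2x2 inputs below are only illustrative:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.eye(2)

K = np.kron(A, B)  # 4x4 block matrix: the (i, j) block is A[i, j] * B

print(K.shape)                        # (4, 4)
print(np.allclose(K[:2, 2:], 2 * B))  # True: the top-right block is a_01 * B = 2B
```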
Orthogonal (real) and unitary (complex) matrices are length- and angle-preserving transformations, like perfect rotations and reflections.
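A quick numerical check of both properties, using a 2-D rotation as a hypothetical example:

```python
import numpy as np

theta = 0.3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # rotation by angle theta

x = np.array([3.0, 4.0])

print(np.allclose(Q.T @ Q, np.eye(2)))                       # True: Q^T Q = I
print(np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x)))  # True: length preserved
```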
Matrix calculus extends single-variable derivatives to matrices so we can differentiate functions built from matrix multiplications, traces, and norms.
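As one concrete instance, the matrix-calculus identity d/dX tr(XᵀX) = 2X can be checked against a finite difference (a sketch; the function and matrix are illustrative):

```python
import numpy as np

def f(X):
    return np.trace(X.T @ X)  # equals the squared Frobenius norm of X

X = np.array([[1.0, 2.0],
              [3.0, 4.0]])
grad = 2 * X  # matrix-calculus result for this f

# Central finite difference on the (0, 1) entry agrees with grad[0, 1]:
eps = 1e-6
E = np.zeros_like(X)
E[0, 1] = eps
numeric = (f(X + E) - f(X - E)) / (2 * eps)
print(np.isclose(numeric, grad[0, 1]))  # True
```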
Low-rank approximation replaces a big matrix with one that has far fewer degrees of freedom while preserving most of its action.
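The standard construction is a truncated SVD (the Eckart-Young result); the matrix below is a synthetic near-rank-2 example:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 2)) @ rng.standard_normal((2, 6))  # exactly rank 2
A = A + 1e-3 * rng.standard_normal((6, 6))                     # plus small noise

U, s, Vt = np.linalg.svd(A)
k = 2
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]  # best rank-k approximation

rel_err = np.linalg.norm(A - A_k) / np.linalg.norm(A)
print(rel_err)  # small: rank 2 captures almost all of A's action
```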
A tensor is a multi-dimensional array that generalizes scalars (0-D), vectors (1-D), and matrices (2-D) to higher dimensions.
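In NumPy terms, the hierarchy is just the array's number of dimensions (a minimal illustration):

```python
import numpy as np

scalar = np.array(3.0)          # 0-D
vector = np.array([1.0, 2.0])   # 1-D
matrix = np.eye(2)              # 2-D
tensor = np.zeros((2, 3, 4))    # 3-D, e.g. a stack of two 3x4 matrices

print(scalar.ndim, vector.ndim, matrix.ndim, tensor.ndim)  # 0 1 2 3
```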
Matrix norms measure the size of a matrix in different but related ways, with Frobenius treating entries like a big vector, spectral measuring the strongest stretch, and nuclear summing all singular values.
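For a diagonal matrix all three norms are easy to verify by hand (an illustrative example; the singular values here are 4 and 3):

```python
import numpy as np

A = np.diag([3.0, 4.0])

fro  = np.linalg.norm(A, 'fro')  # sqrt(3^2 + 4^2) = 5
spec = np.linalg.norm(A, 2)      # largest singular value = 4
nuc  = np.linalg.norm(A, 'nuc')  # 4 + 3 = 7

print(np.isclose(fro, 5.0), np.isclose(spec, 4.0), np.isclose(nuc, 7.0))
# True True True
```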
A real symmetric matrix A is positive definite if and only if x^T A x > 0 for every nonzero vector x, and positive semidefinite if x^T A x ≥ 0 for every x.
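Positive definiteness is equivalent to all eigenvalues being positive, which is easy to check numerically; a sketch with an illustrative 2x2 matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])  # symmetric, eigenvalues 1 and 3

# Positive definite iff all eigenvalues are positive.
print(np.all(np.linalg.eigvalsh(A) > 0))  # True

# Equivalently, the Cholesky factorization A = L L^T exists exactly when
# A is positive definite (NumPy raises LinAlgError otherwise).
L = np.linalg.cholesky(A)
print(np.allclose(L @ L.T, A))  # True
```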
Eigendecomposition expresses a matrix as a change of basis times a diagonal scaling, revealing its natural stretching directions.
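With NumPy's `eig`, the decomposition A = V diag(w) V⁻¹ can be reconstructed directly (the matrix is an illustrative example with distinct eigenvalues):

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [1.0, 3.0]])

w, V = np.linalg.eig(A)  # eigenvalues in w, eigenvectors in the columns of V

# Change of basis V, diagonal scaling diag(w), change back with V^{-1}:
print(np.allclose(V @ np.diag(w) @ np.linalg.inv(V), A))  # True
```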
An inner product measures how much two vectors point in the same direction; in R^n it is the dot product.
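A minimal sketch in R^2 (vectors chosen for illustration): the dot product, normalized by the lengths, gives the cosine of the angle between the vectors.

```python
import numpy as np

u = np.array([1.0, 0.0])
v = np.array([1.0, 1.0])

dot = u @ v  # dot product in R^n: here 1*1 + 0*1 = 1
cos_angle = dot / (np.linalg.norm(u) * np.linalg.norm(v))

print(np.isclose(cos_angle, 1 / np.sqrt(2)))  # True: u and v are 45 degrees apart
```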
A system of linear equations asks for numbers that make several linear relationships true at the same time, which we compactly write as Ax = b.
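For example, the pair of equations x + 2y = 5 and 3x + 4y = 11 becomes Ax = b and can be solved in one call (an illustrative system, using `np.linalg.solve`):

```python
import numpy as np

# x + 2y = 5
# 3x + 4y = 11
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
b = np.array([5.0, 11.0])

x = np.linalg.solve(A, b)
print(np.allclose(x, [1.0, 2.0]))  # True: x = 1, y = 2 satisfies both equations
print(np.allclose(A @ x, b))       # True
```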