Glossary

Matrix

A matrix is a rectangular array of numbers, arranged in rows and columns. An $m \times n$ matrix has $m$ rows and $n$ columns, and its entry in row $i$, column $j$ is denoted $a_{ij}$. Matrices are the natural way to represent linear transformations: functions that map vectors to vectors while preserving addition and scalar multiplication. Every linear transformation from $n$-dimensional to $m$-dimensional space corresponds uniquely to an $m \times n$ matrix, and vice versa.
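The linearity property can be checked numerically. A minimal sketch using NumPy (an assumed library choice, not part of the definition above), with an arbitrary example matrix and vectors:

```python
import numpy as np

# A 2x3 matrix maps 3-dimensional vectors to 2-dimensional vectors.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0]])
x = np.array([1.0, -1.0, 2.0])
y = np.array([0.5, 2.0, 1.0])
c = 4.0

# Linearity: A(x + y) = Ax + Ay and A(cx) = c(Ax).
assert np.allclose(A @ (x + y), A @ x + A @ y)
assert np.allclose(A @ (c * x), c * (A @ x))
```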

In AI, matrices appear everywhere. The weights of a neural network layer form a matrix; a dataset of $m$ examples with $n$ features is an $m \times n$ data matrix; a greyscale image is an $h \times w$ matrix of pixel intensities; the attention mechanism in a transformer computes matrices of query–key similarities. Special types include the identity matrix $I$ (ones on the diagonal, zeros elsewhere), diagonal matrices, symmetric matrices ($A = A^T$), and orthogonal matrices ($Q^T Q = I$). The transpose $A^T$ swaps rows and columns; the inverse $A^{-1}$ undoes the transformation when it exists.
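These special types and operations can be illustrated concretely. A short sketch, again assuming NumPy, using a hand-picked symmetric matrix and a 2D rotation as the orthogonal example:

```python
import numpy as np

# A symmetric matrix: A equals its own transpose.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
assert np.array_equal(A.T, A)

# The inverse undoes the transformation: A^{-1} A = I.
A_inv = np.linalg.inv(A)
assert np.allclose(A_inv @ A, np.eye(2))

# A rotation matrix is orthogonal: Q^T Q = I.
theta = 0.3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
assert np.allclose(Q.T @ Q, np.eye(2))
```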

Key scalar invariants of a square matrix are its determinant (the signed volume-scaling factor of the transformation, zero if the matrix is singular), its trace (the sum of diagonal entries, equal to the sum of eigenvalues), and its rank (the dimension of its column space; rank, unlike the determinant and trace, is defined for rectangular matrices too). Rank in particular is central to AI: low-rank matrix factorisation underpins collaborative filtering, latent semantic analysis, and parameter-efficient fine-tuning methods such as LoRA.
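The three invariants can be verified on small examples. A sketch assuming NumPy, with arbitrary illustrative matrices; the lower-triangular matrix makes the eigenvalues (its diagonal entries) easy to check by hand:

```python
import numpy as np

# Lower-triangular matrix: eigenvalues are the diagonal entries 2 and 3.
A = np.array([[2.0, 0.0],
              [1.0, 3.0]])

# Determinant: signed volume-scaling factor (here 2*3 - 0*1 = 6).
assert np.isclose(np.linalg.det(A), 6.0)

# Trace equals the sum of eigenvalues (2 + 3 = 5).
assert np.isclose(np.trace(A), np.linalg.eigvals(A).sum())

# Rank: a matrix whose second row is a multiple of the first has rank 1.
B = np.array([[1.0, 2.0],
              [2.0, 4.0]])
assert np.linalg.matrix_rank(B) == 1
```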

Related terms: Vector, Matrix Multiplication, Eigenvalue and Eigenvector, Singular Value Decomposition, Tensor

Also defined in: Textbook of AI, Textbook of Medical AI