Glossary

Integral

The integral of a function $f$ over an interval $[a, b]$, written $\int_a^b f(x)\,dx$, is the signed area under the curve of $f$ between $a$ and $b$. The Fundamental Theorem of Calculus connects integration and differentiation: if $F' = f$, then $\int_a^b f(x)\,dx = F(b) - F(a)$. Integrals and derivatives are, in a precise sense, inverse operations.
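The Fundamental Theorem can be checked numerically. A minimal sketch, using a hypothetical trapezoidal-rule helper to integrate $f(x) = 2x$ over $[0, 3]$ and comparing against $F(b) - F(a)$ with the antiderivative $F(x) = x^2$:

```python
# Sketch: verify the Fundamental Theorem of Calculus numerically.
# f(x) = 2x has antiderivative F(x) = x**2, so the integral over
# [0, 3] should equal F(3) - F(0) = 9.

def trapezoid(f, a, b, n=10_000):
    """Approximate the integral of f over [a, b] with n trapezoids."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        total += f(a + i * h)
    return total * h

approx = trapezoid(lambda x: 2 * x, 0.0, 3.0)
exact = 3.0**2 - 0.0**2  # F(b) - F(a)
print(approx, exact)     # the two values agree closely
```

The trapezoidal rule is exact for linear integrands up to floating-point error, so the agreement here is essentially perfect; for general $f$ the error shrinks as the number of subintervals grows.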

In AI, integrals arise primarily in probability theory. A continuous probability distribution is defined by a probability density function $p(x)$ satisfying $\int_{-\infty}^\infty p(x)\,dx = 1$. The expected value is $\mathbb{E}[X] = \int x\, p(x)\,dx$, and the variance is a similar integral. Marginalising out a latent variable requires integration: $p(x) = \int p(x, z)\,dz$. Information-theoretic quantities such as entropy $H(X) = -\int p(x)\ln p(x)\,dx$ and KL divergence are also integrals.
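These probabilistic integrals can be made concrete with a standard normal density. A sketch, assuming a simple midpoint-rule helper and truncating the infinite domain to $[-8, 8]$ (where the remaining tail mass is negligible), verifying that $\int p(x)\,dx \approx 1$ and $\mathbb{E}[X] \approx 0$:

```python
import math

# Standard normal probability density function.
def p(x):
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

# Midpoint-rule quadrature over [a, b] with n subintervals.
def integrate(g, a, b, n=100_000):
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

total = integrate(p, -8, 8)                   # normalisation: close to 1
mean = integrate(lambda x: x * p(x), -8, 8)   # expected value: close to 0
print(total, mean)
```

The same pattern extends to the variance ($\int x^2 p(x)\,dx$ minus the squared mean) and to entropy, with the caveat that deterministic quadrature like this only works in low dimensions.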

Many important integrals in machine learning cannot be computed in closed form: the posterior of a Bayesian neural network, the marginal likelihood (evidence) of a variational autoencoder, the partition function of an energy-based model. Two families of techniques approximate such intractable integrals. Monte Carlo methods draw random samples and average; their error scales as $1/\sqrt{N}$ regardless of dimension, sidestepping the curse of dimensionality that cripples deterministic quadrature. Variational methods replace the intractable distribution with a tractable approximation and optimise its parameters.
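The Monte Carlo idea can be sketched in a few lines. This hypothetical example estimates $\mathbb{E}[X^2]$ under a standard normal, whose exact value is 1 (the variance), by averaging over $N$ samples; the absolute error tends to shrink roughly like $1/\sqrt{N}$:

```python
import random

# Sketch of Monte Carlo integration: estimate E[X^2] for X ~ N(0, 1)
# (exact value 1) by averaging squared samples. No quadrature grid is
# needed, which is what lets the method scale to high dimensions.
def mc_estimate(n, seed=0):
    rng = random.Random(seed)
    return sum(rng.gauss(0, 1) ** 2 for _ in range(n)) / n

for n in (100, 10_000, 100_000):
    est = mc_estimate(n)
    print(n, est, abs(est - 1.0))  # error shrinks as n grows
```

The estimate for a single run can fluctuate, but averaged over runs the error decays at the advertised $1/\sqrt{N}$ rate, whatever the dimension of the integrand.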

Related terms: Expectation

Discussed in:

Also defined in: Textbook of AI