In measure theory, Lebesgue's dominated convergence theorem provides sufficient conditions under which almost everywhere convergence of a sequence of functions implies convergence in the L1 norm. Its power and utility are two of the primary theoretical advantages of Lebesgue integration over Riemann integration.
It is widely used in probability theory, since it gives a sufficient condition for the convergence of expected values of random variables.
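In probabilistic notation this reads as follows (the symbols X_n, X and Y here are an illustrative restatement, not part of the theorem as stated below): if random variables X_n converge to X almost surely and |X_n| ≤ Y for some random variable Y with E[Y] < ∞, then
\[
\lim_{n\to\infty} \mathbb{E}[X_n] = \mathbb{E}[X].
\]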
Lebesgue's Dominated Convergence Theorem. Let {f_n} be a sequence of real-valued measurable functions on a measure space (S, Σ, μ). Suppose that the sequence converges pointwise to a function f and is dominated by some integrable function g in the sense that
\[
|f_n(x)| \le g(x)
\]
for all numbers n in the index set of the sequence and all points x ∈ S. Then f is integrable and
\[
\lim_{n\to\infty} \int_S |f_n - f| \, d\mu = 0,
\]
which also implies
\[
\lim_{n\to\infty} \int_S f_n \, d\mu = \int_S f \, d\mu.
\]
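As a concrete illustration (this example is added here for orientation and is not part of the theorem statement), take S = [0, 1] with Lebesgue measure μ and f_n(x) = x^n. Then
\[
|f_n(x)| = x^n \le 1 =: g(x), \qquad \int_0^1 g \, d\mu = 1 < \infty,
\]
and f_n converges pointwise to the function f with f(x) = 0 for x ∈ [0, 1) and f(1) = 1. The theorem then gives
\[
\lim_{n\to\infty} \int_0^1 x^n \, d\mu = \lim_{n\to\infty} \frac{1}{n+1} = 0 = \int_0^1 f \, d\mu,
\]
in agreement with direct computation; the constant dominating function makes the hypothesis easy to verify.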
Remark 1. The statement "g is integrable" is meant in the sense of Lebesgue; that is,
\[
\int_S |g| \, d\mu < \infty.
\]
Remark 2. The convergence of the sequence and the domination by g can be relaxed to hold only μ-almost everywhere, provided the measure space (S, Σ, μ) is complete, or provided f is chosen as a measurable function that agrees μ-almost everywhere with the pointwise limit, which exists μ-almost everywhere. (These precautions are necessary because otherwise there might exist a non-measurable subset of a μ-null set N ∈ Σ, and hence f might fail to be measurable.)