In probability theory, the total variation distance is a distance measure for probability distributions. It is an example of a statistical distance metric, and is sometimes just called "the" statistical distance.
The total variation distance between two probability measures P and Q on a sigma-algebra \mathcal{F} of subsets of the sample space \Omega is defined via
\[
  \delta(P, Q) = \sup_{A \in \mathcal{F}} \bigl| P(A) - Q(A) \bigr|.
\]
Informally, this is the largest possible difference between the probabilities that the two probability distributions can assign to the same event.
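On a small finite sample space the supremum in the definition can be checked by brute force, enumerating every event. The following sketch does exactly that; it is purely illustrative (enumerating events is exponential in the number of outcomes), and the function name tv_distance_by_events and the coin example are hypothetical, not taken from any particular library.

```python
from itertools import combinations

def tv_distance_by_events(p, q):
    """Brute-force the definition: the largest |P(A) - Q(A)| over all events A.

    p and q are dicts mapping each outcome of a small finite sample space to
    its probability.  Only meant to illustrate the definition.
    """
    outcomes = list(p)
    best = 0.0
    for r in range(len(outcomes) + 1):
        for event in combinations(outcomes, r):
            diff = abs(sum(p[w] for w in event) - sum(q[w] for w in event))
            best = max(best, diff)
    return best

# Hypothetical example: a fair coin versus a biased coin.
p = {"heads": 0.5, "tails": 0.5}
q = {"heads": 0.8, "tails": 0.2}
print(tv_distance_by_events(p, q))  # 0.3, attained by the event {"heads"}
```

For the two coins, the largest gap is attained by the single event {"heads"} (or its complement), giving a total variation distance of 0.3.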
The total variation distance is related to the Kullback–Leibler divergence by Pinsker's inequality:
\[
  \delta(P, Q) \le \sqrt{\tfrac{1}{2} D_{\mathrm{KL}}(P \parallel Q)}.
\]
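As a quick numerical sanity check of Pinsker's inequality, the sketch below computes the total variation distance and the bound \sqrt{D_{\mathrm{KL}}(P \parallel Q)/2} (with the natural logarithm) for the same pair of coins; the helper names are illustrative, assuming finite distributions stored as dictionaries.

```python
import math

def tv_distance(p, q):
    # delta(P, Q) = (1/2) * sum over x of |P(x) - Q(x)| on a finite space.
    return 0.5 * sum(abs(p[x] - q[x]) for x in p)

def kl_divergence(p, q):
    # D_KL(P || Q) in nats; assumes Q(x) > 0 wherever P(x) > 0.
    return sum(p[x] * math.log(p[x] / q[x]) for x in p if p[x] > 0)

p = {"heads": 0.5, "tails": 0.5}
q = {"heads": 0.8, "tails": 0.2}

delta = tv_distance(p, q)
pinsker_bound = math.sqrt(0.5 * kl_divergence(p, q))
print(delta, pinsker_bound, delta <= pinsker_bound)  # 0.3 <= ~0.334, True
```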
On a finite probability space, the total variation distance is related to the L1 norm by the identity:
\[
  \delta(P, Q) = \tfrac{1}{2} \, \| P - Q \|_1 = \tfrac{1}{2} \sum_{\omega \in \Omega} \bigl| P(\omega) - Q(\omega) \bigr|.
\]
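A minimal sketch of this identity, assuming the distributions are stored as dictionaries over a finite sample space; on the two-coin example used above, the half-L1 sum reproduces the value 0.3 obtained by enumerating events.

```python
def tv_by_l1(p, q):
    # delta(P, Q) = (1/2) * sum over omega of |P(omega) - Q(omega)|.
    return 0.5 * sum(abs(p[w] - q[w]) for w in p)

p = {"heads": 0.5, "tails": 0.5}
q = {"heads": 0.8, "tails": 0.2}
print(tv_by_l1(p, q))  # 0.3, matching the event-enumeration value
```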
The total variation distance (or half the L1 norm) arises as the optimal transportation cost when the cost function is c(x, y) = \mathbf{1}_{x \neq y}, that is,
\[
  \tfrac{1}{2} \, \| P - Q \|_1 = \delta(P, Q) = \inf \bigl\{ \Pr(X \neq Y) : \operatorname{Law}(X) = P,\ \operatorname{Law}(Y) = Q \bigr\} = \inf_{\pi} \operatorname{E}_{\pi} \bigl[ \mathbf{1}_{x \neq y} \bigr],
\]
where the infimum is taken over all couplings \pi of P and Q, i.e. over all probability measures on the product space with marginals P and Q.
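On a finite space an optimal coupling can be written down explicitly via the standard maximal-coupling construction: keep X = Y on the overlap min(P(x), Q(x)) and pair off the leftover mass of P and Q off the diagonal. The sketch below is illustrative (the function name maximal_coupling is an assumption, not a library call) and checks that the coupling's mismatch probability equals the total variation distance.

```python
def maximal_coupling(p, q):
    """Build a coupling pi of P and Q with Pr(X != Y) = delta(P, Q).

    Returns (pi, delta), where pi is a dict {(x, y): probability}.  On the
    diagonal it places the overlap min(P(x), Q(x)); the leftover mass of P
    and Q is paired off the diagonal in product form, which keeps the
    marginals equal to P and Q.
    """
    delta = 0.5 * sum(abs(p[x] - q[x]) for x in p)  # total variation distance
    pi = {}
    # Overlap: X = Y = x with probability min(P(x), Q(x)).
    for x in p:
        overlap = min(p[x], q[x])
        if overlap > 0:
            pi[(x, x)] = overlap
    if delta > 0:
        # Leftover mass of P and of Q; each sums to delta.
        p_rest = {x: max(p[x] - q[x], 0.0) for x in p}
        q_rest = {y: max(q[y] - p[y], 0.0) for y in q}
        for x, px in p_rest.items():
            for y, qy in q_rest.items():
                if px > 0 and qy > 0:
                    pi[(x, y)] = pi.get((x, y), 0.0) + px * qy / delta
    return pi, delta

p = {"heads": 0.5, "tails": 0.5}
q = {"heads": 0.8, "tails": 0.2}
pi, delta = maximal_coupling(p, q)
mismatch = sum(mass for (x, y), mass in pi.items() if x != y)
print(delta, mismatch)  # both 0.3: the coupling attains the infimum
```

Any other coupling of the same two coins assigns at least probability 0.3 to the event X ≠ Y, which is what makes this construction optimal for the indicator cost.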