Chebyshev's inequality


In probability theory, Chebyshev's inequality (also spelled Tchebysheff's inequality; Russian: Нера́венство Чебышёва) guarantees that, for a wide class of probability distributions, "nearly all" values are close to the mean. More precisely, no more than 1/k² of the distribution's values can be more than k standard deviations away from the mean (equivalently, at least 1 − 1/k² of the values lie within k standard deviations of the mean). In statistics, this statement about the range of standard deviations around the mean is often called Chebyshev's theorem. The inequality has great utility because it applies to any probability distribution in which the mean and variance are defined. For example, it can be used to prove the weak law of large numbers.
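The bound above can be written compactly. For a random variable X with finite expected value μ and finite non-zero variance σ², and for any real number k > 0:

```latex
\Pr\left(|X - \mu| \geq k\sigma\right) \leq \frac{1}{k^{2}}
```

Note that the bound is only informative for k > 1; for k ≤ 1 the right-hand side is at least 1, which every probability already satisfies.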

In practical usage, in contrast to the 68–95–99.7 rule, which applies only to normal distributions, Chebyshev's inequality is weaker: it guarantees only that a minimum of 75% of values lie within two standard deviations of the mean, and 88.9% (8/9) within three standard deviations.
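These minimum percentages can be checked empirically. The following is a minimal sketch (not part of the original article) using only Python's standard library; the exponential distribution, sample size, and seed are arbitrary illustrative choices of a skewed, non-normal distribution.

```python
import random
import statistics

# Draw a large sample from a skewed, non-normal distribution
# (exponential with rate 1, so its true mean and standard deviation are both 1).
random.seed(0)
data = [random.expovariate(1.0) for _ in range(100_000)]

mu = statistics.fmean(data)     # sample mean
sigma = statistics.pstdev(data)  # population standard deviation of the sample

# Chebyshev guarantees at least 1 - 1/k^2 of values within k standard deviations.
for k in (2, 3):
    within = sum(abs(x - mu) <= k * sigma for x in data) / len(data)
    bound = 1 - 1 / k**2
    print(f"k={k}: observed fraction {within:.3f} >= guaranteed {bound:.3f}")
    assert within >= bound
```

For this distribution the observed fractions (about 0.95 at k = 2 and 0.98 at k = 3) comfortably exceed the guaranteed minimums of 0.75 and 0.889, illustrating that Chebyshev's bound is valid but often far from tight.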

The term Chebyshev's inequality may also refer to Markov's inequality, especially in the context of analysis.

The theorem is named after Russian mathematician Pafnuty Chebyshev, although it was first formulated by his friend and colleague Irénée-Jules Bienaymé. The theorem was first stated without proof by Bienaymé in 1853 and later proved by Chebyshev in 1867. His student Andrey Markov provided another proof in his 1884 Ph.D. thesis.

Chebyshev's inequality is usually stated for random variables, but can be generalized to a statement about measure spaces.
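One common measure-theoretic form (a standard statement, given here for illustration) is as follows: for a measure space (X, Σ, μ), a measurable real-valued function f on X, and any t > 0,

```latex
\mu\left(\{x \in X : |f(x)| \geq t\}\right) \leq \frac{1}{t^{2}} \int_X f^{2}\, d\mu
```

The probabilistic statement is recovered by taking μ to be a probability measure and f = X − E[X], with t = kσ.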
