
Chernoff's inequality


In probability theory, the Chernoff bound, named after Herman Chernoff but due to Herman Rubin, gives exponentially decreasing bounds on tail distributions of sums of independent random variables. It is sharper than the first- or second-moment-based tail bounds such as Markov's inequality or Chebyshev's inequality, which only yield power-law bounds on tail decay. However, the Chernoff bound requires that the variates be independent – a condition that neither Markov's nor Chebyshev's inequality requires.
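
To make the gap in decay rates concrete, here is a minimal numerical sketch (an illustration, not part of the article; n, p, and q are arbitrary choices) comparing the three bounds on the upper tail of a Binomial(n, p) variable, i.e. a sum of n independent Bernoulli(p) variables, at the threshold a = nq with q > p. The closed-form bound exp(-n KL(q || p)) used here is the t-optimized Chernoff bound for Bernoulli sums.

    import math

    # Illustrative parameters: X ~ Binomial(n, p), threshold a = n*q with q > p.
    n, p, q = 1000, 0.5, 0.6
    a = n * q
    mean, var = n * p, n * p * (1 - p)

    markov = mean / a                      # Markov:    P(X >= a) <= E[X] / a
    chebyshev = var / (a - mean) ** 2      # Chebyshev: P(X >= a) <= Var(X) / (a - E[X])^2
    # Chernoff for Bernoulli sums: P(X >= n*q) <= exp(-n * KL(q || p)),
    # where KL is the relative entropy between Bernoulli(q) and Bernoulli(p).
    kl = q * math.log(q / p) + (1 - q) * math.log((1 - q) / (1 - p))
    chernoff = math.exp(-n * kl)

    print(f"Markov:    {markov:.2e}")      # ~8.3e-01, does not improve with n for fixed q
    print(f"Chebyshev: {chebyshev:.2e}")   # ~2.5e-02, decays like 1/n
    print(f"Chernoff:  {chernoff:.2e}")    # ~1.8e-09, decays exponentially in n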

It is related to the (historically prior) Bernstein inequalities, and to Hoeffding's inequality.

The generic Chernoff bound for a random variable X is attained by applying Markov's inequality to e^{tX}. For every t > 0:

    \Pr(X \geq a) = \Pr\left(e^{tX} \geq e^{ta}\right) \leq \frac{\mathbb{E}\left[e^{tX}\right]}{e^{ta}}
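
As a quick numerical illustration of this generic bound (an assumed example, not taken from the article), take X to be a standard normal variable, whose moment generating function is E[e^{tX}] = e^{t^2/2}; the resulting bound e^{t^2/2 - ta} is minimized at t = a:

    import math
    from statistics import NormalDist

    # Generic Chernoff bound P(X >= a) <= E[e^{tX}] * e^{-ta} for X ~ N(0, 1),
    # where E[e^{tX}] = exp(t^2 / 2).  (Illustrative example.)
    a = 3.0
    true_tail = 1 - NormalDist().cdf(a)
    print(f"true tail: {true_tail:.2e}")          # P(X >= 3) ~ 1.35e-03

    for t in (1.0, 2.0, a, 4.0):
        bound = math.exp(t * t / 2 - t * a)       # E[e^{tX}] * e^{-ta}
        print(f"t = {t:.1f}: bound = {bound:.2e}")
    # The bound is smallest at t = a, giving exp(-a^2 / 2) ~ 1.11e-02,
    # which already captures the exponential (Gaussian-like) decay of the tail.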

When X is the sum of n random variables X_1, ..., X_n, we get for any t > 0,

    \Pr(X \geq a) \leq e^{-ta} \, \mathbb{E}\left[\prod_{i=1}^{n} e^{t X_i}\right]

In particular, optimizing over t and using the assumption that the X_i are independent, we obtain,

    \Pr(X \geq a) \leq \min_{t > 0} e^{-ta} \prod_{i=1}^{n} \mathbb{E}\left[e^{t X_i}\right]
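
The following sketch (an assumed example; the distribution and parameters are not from the article) carries out this optimization numerically for a sum of n independent Bernoulli(p) variables, for which each factor is E[e^{t X_i}] = 1 - p + p e^t:

    import math

    # Numerically minimize e^{-ta} * prod_i E[e^{t X_i}] over t > 0 for
    # X = X_1 + ... + X_n with X_i ~ Bernoulli(p) independent,
    # so that each factor is E[e^{t X_i}] = 1 - p + p * e^t.
    n, p, a = 100, 0.3, 45

    def chernoff_bound(t: float) -> float:
        log_mgf_single = math.log(1 - p + p * math.exp(t))   # log E[e^{t X_i}]
        return math.exp(n * log_mgf_single - t * a)

    # Crude grid search over t in (0, 5]; any one-dimensional minimizer would do.
    best_t = min((k / 1000 for k in range(1, 5001)), key=chernoff_bound)
    print(f"optimal t ~ {best_t:.3f}, bound ~ {chernoff_bound(best_t):.2e}")
    # Prints roughly t ~ 0.647 and a bound of about 6.9e-03, whereas
    # Chebyshev's inequality only gives Var(X) / (a - E[X])^2 = 21/225 ~ 9.3e-02.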

