In probability theory and statistics, the cumulants κn of a probability distribution are a set of quantities that provide an alternative to the moments of the distribution. The moments determine the cumulants in the sense that any two probability distributions whose moments are identical will have identical cumulants as well, and similarly the cumulants determine the moments.
The first cumulant is the mean, the second cumulant is the variance, and the third cumulant is the same as the third central moment. But fourth and higher-order cumulants are not equal to central moments. In some cases theoretical treatments of problems in terms of cumulants are simpler than those using moments. In particular, when two or more random variables are statistically independent, the nth-order cumulant of their sum is equal to the sum of their nth-order cumulants. Likewise, the third and higher-order cumulants of a normal distribution are zero, and the normal distribution is the only distribution with this property.
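The additivity of cumulants under independence can be checked exactly with a short computation. The sketch below uses the standard recursion expressing cumulants in terms of raw moments, κn = mn − Σ C(n−1, k−1) κk m(n−k), applied to two independent Bernoulli variables (an illustrative choice, not taken from the text); the raw moments of the sum are built with the binomial convolution that holds under independence, and exact rational arithmetic confirms that each cumulant of the sum equals the sum of the corresponding cumulants.

```python
from math import comb
from fractions import Fraction

def cumulants_from_moments(m):
    # m[0..N] are raw moments with m[0] = 1; returns [kappa_1, ..., kappa_N]
    # via the recursion kappa_n = m_n - sum_{k=1}^{n-1} C(n-1, k-1) kappa_k m_{n-k}.
    N = len(m) - 1
    kappa = [None] * (N + 1)
    for n in range(1, N + 1):
        s = sum(comb(n - 1, k - 1) * kappa[k] * m[n - k] for k in range(1, n))
        kappa[n] = m[n] - s
    return kappa[1:]

def bernoulli_moments(p, N):
    # For Bernoulli(p), E[X^k] = p for every k >= 1.
    return [Fraction(1)] + [p] * N

def sum_moments(mx, my):
    # For independent X, Y: E[(X+Y)^n] = sum_k C(n, k) E[X^k] E[Y^(n-k)].
    N = len(mx) - 1
    return [sum(comb(n, k) * mx[k] * my[n - k] for k in range(n + 1))
            for n in range(N + 1)]

p, q, N = Fraction(1, 3), Fraction(1, 5), 5
mx, my = bernoulli_moments(p, N), bernoulli_moments(q, N)
kx = cumulants_from_moments(mx)
ky = cumulants_from_moments(my)
ks = cumulants_from_moments(sum_moments(mx, my))

# nth-order cumulant of the sum = sum of the nth-order cumulants
assert all(ks[i] == kx[i] + ky[i] for i in range(N))
```

Using Fraction rather than floats makes the check exact; the same additivity would only hold approximately under floating-point rounding.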
Just as for moments, where joint moments are used for collections of random variables, it is possible to define joint cumulants.
The cumulants of a random variable X are defined using the cumulant-generating function K(t), which is the natural logarithm of the moment-generating function: K(t) = log E[e^{tX}].
The cumulants κn are obtained from a power series expansion of the cumulant-generating function: K(t) = Σ_{n=1}^{∞} κn t^n / n!.
This expansion is a Maclaurin series, so the nth cumulant can be obtained by differentiating the above expansion n times and evaluating the result at zero: κn = K^(n)(0).
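The relation K(t) = log(MGF) can also be checked with formal power series: expand the moment-generating function as Σ mn t^n / n!, take its formal logarithm term by term, and read off κn as n! times the nth coefficient. The sketch below does this for a Poisson(λ) variable, chosen for illustration because every cumulant of Poisson(λ) equals λ; its raw moments are generated by the recurrence m(n+1) = λ Σ C(n, k) mk.

```python
from fractions import Fraction
from math import comb, factorial

def log_series(a):
    # a[0..N] are Taylor coefficients of M(t) with a[0] == 1; returns c[0..N]
    # with log M(t) = sum c_n t^n, from the identity n a_n = sum_k k c_k a_{n-k}
    # (obtained by differentiating K(t) = log M(t), i.e. M' = K' M).
    N = len(a) - 1
    c = [Fraction(0)] * (N + 1)
    for n in range(1, N + 1):
        c[n] = a[n] - sum(Fraction(k, n) * c[k] * a[n - k] for k in range(1, n))
    return c

lam = Fraction(7, 3)   # illustrative rate parameter
N = 6
m = [Fraction(1)]      # raw moments of Poisson(lam), m[0] = 1
for n in range(N):
    m.append(lam * sum(comb(n, k) * m[k] for k in range(n + 1)))

a = [m[n] / factorial(n) for n in range(N + 1)]        # MGF Taylor coefficients
kappa = [factorial(n) * c for n, c in enumerate(log_series(a))]

# Every cumulant of a Poisson(lam) distribution equals lam.
assert all(kappa[n] == lam for n in range(1, N + 1))
```

The same log-series routine works for any distribution whose raw moments are available, since it only manipulates the Maclaurin coefficients of the two generating functions.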