
Stein's lemma


Stein's lemma, named in honor of Charles Stein, is a theorem of probability theory that is of interest primarily because of its applications to statistical inference — in particular, to James–Stein estimation and empirical Bayes methods — and its applications to portfolio choice theory. The theorem gives a formula for the covariance of one random variable with the value of a function of another, when the two random variables are jointly normally distributed.

Suppose X is a normally distributed random variable with expectation μ and variance σ². Further suppose g is a differentiable function for which the two expectations E( g(X) (X − μ) ) and E( g′(X) ) both exist (the existence of the expectation of any random variable is equivalent to the finiteness of the expectation of its absolute value). Then

E( g(X) (X − μ) ) = σ² E( g′(X) ).
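
Since both sides of this identity are ordinary expectations, the lemma can be checked by simulation. The following is a minimal Monte Carlo sketch in Python; the parameter values and the test function g(x) = x³ are illustrative choices, not part of the statement.

import numpy as np

# Monte Carlo check of Stein's lemma: E[g(X)(X - mu)] = sigma^2 * E[g'(X)]
# for X ~ N(mu, sigma^2).  Here g(x) = x**3, so g'(x) = 3*x**2; both
# expectations exist for this choice of g.
rng = np.random.default_rng(0)
mu, sigma = 1.5, 2.0
x = rng.normal(mu, sigma, size=1_000_000)

lhs = np.mean(x**3 * (x - mu))        # E[g(X)(X - mu)]
rhs = sigma**2 * np.mean(3 * x**2)    # sigma^2 * E[g'(X)]
print(lhs, rhs)                       # agree up to Monte Carlo error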

In general, suppose X and Y are jointly normally distributed. Then

Cov( g(X), Y ) = Cov( X, Y ) E( g′(X) ).
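
The bivariate form can be checked numerically in the same way; in the sketch below the mean vector, the covariance matrix, and the test function g(x) = sin(x) are illustrative choices.

import numpy as np

# Monte Carlo check of the bivariate form: Cov(g(X), Y) = Cov(X, Y) * E[g'(X)]
# for (X, Y) jointly normal.  With g(x) = sin(x), g'(x) = cos(x).
rng = np.random.default_rng(1)
mean = np.array([0.5, -1.0])
cov = np.array([[2.0, 0.8],
                [0.8, 1.5]])
xy = rng.multivariate_normal(mean, cov, size=1_000_000)
x, y = xy[:, 0], xy[:, 1]

lhs = np.cov(np.sin(x), y)[0, 1]      # Cov(g(X), Y)
rhs = cov[0, 1] * np.mean(np.cos(x))  # Cov(X, Y) * E[g'(X)]
print(lhs, rhs)                       # agree up to Monte Carlo error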

In order to prove the univariate version of this lemma, recall that the probability density function for the normal distribution with expectation 0 and variance 1 is

φ(x) = (1/√(2π)) exp(−x²/2),

and that for a normal distribution with expectation μ and variance σ² it is

f(x) = (1/(σ√(2π))) exp(−(x − μ)²/(2σ²)) = (1/σ) φ((x − μ)/σ).

Then use integration by parts, together with the fact that f′(x) = −((x − μ)/σ²) f(x); a worked version of this step is sketched below.
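
The following display (in LaTeX, using the density f defined above and assuming the boundary term g(x) f(x) vanishes at ±∞, which holds under the stated integrability assumptions) spells out that computation:

\begin{aligned}
E\bigl(g(X)(X-\mu)\bigr)
  &= \int_{-\infty}^{\infty} g(x)\,(x-\mu)\,f(x)\,dx
   = -\sigma^{2}\int_{-\infty}^{\infty} g(x)\,f'(x)\,dx \\
  &= -\sigma^{2}\Bigl[g(x)\,f(x)\Bigr]_{-\infty}^{\infty}
     + \sigma^{2}\int_{-\infty}^{\infty} g'(x)\,f(x)\,dx
   = \sigma^{2}\,E\bigl(g'(X)\bigr),
\end{aligned}

since f'(x) = -\frac{x-\mu}{\sigma^{2}}\,f(x).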

Suppose X is in an exponential family, that is, X has the density

f_η(x) = exp( ∑ᵢ ηᵢ Tᵢ(x) − Ψ(η) ) h(x).

Suppose this density has support (a, b), where a and b could be −∞ or ∞, and that exp( ∑ᵢ ηᵢ Tᵢ(x) ) h(x) g(x) → 0 as x → a and as x → b, where g is any differentiable function such that E( g′(X) ) exists, or exp( ∑ᵢ ηᵢ Tᵢ(x) ) h(x) → 0 if a and b are finite. Then

E( ( h′(X)/h(X) + ∑ᵢ ηᵢ Tᵢ′(X) ) g(X) ) = −E( g′(X) ).
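
As a concrete check, the Exponential(λ) density λ exp(−λx) on (0, ∞) is an exponential family with h(x) = 1, T(x) = x and η = −λ, so the identity reduces to λ E( g(X) ) = E( g′(X) ). The Python sketch below verifies this numerically; λ = 0.7 and g(x) = x² (chosen so that g(0) = 0 and the boundary condition at the finite endpoint holds) are illustrative choices.

import numpy as np

# Monte Carlo check of the exponential-family identity
#   E[(h'(X)/h(X) + sum_i eta_i T_i'(X)) g(X)] = -E[g'(X)]
# for the Exponential(lam) density: h(x) = 1, T(x) = x, eta = -lam,
# so the identity reduces to  lam * E[g(X)] = E[g'(X)].
rng = np.random.default_rng(2)
lam = 0.7
x = rng.exponential(scale=1.0 / lam, size=1_000_000)  # NumPy's scale is 1/lam

lhs = lam * np.mean(x**2)   # lam * E[g(X)]  with g(x) = x**2
rhs = np.mean(2 * x)        # E[g'(X)]       with g'(x) = 2x
print(lhs, rhs)             # agree up to Monte Carlo error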

