Multivariate mutual information


In information theory there have been various attempts over the years to extend the definition of mutual information to more than two random variables. These attempts have met with a great deal of confusion and a realization that interactions among many random variables are poorly understood.

The conditional mutual information can be used to inductively define a multivariate mutual information (MMI) in a set- or measure-theoretic sense in the context of information diagrams. In this sense we define the multivariate mutual information as follows:

    I(X_1; \ldots; X_{n+1}) = I(X_1; \ldots; X_n) - I(X_1; \ldots; X_n \mid X_{n+1}),

where

    I(X_1; \ldots; X_n \mid X_{n+1}) = \mathbb{E}_{X_{n+1}}\big[\, I(X_1; \ldots; X_n) \mid X_{n+1} \,\big],

i.e. the conditional term is the multivariate mutual information of the first n variables, averaged over the value of X_{n+1}.
This definition is identical to that of interaction information except for a change in sign when the number of random variables is odd.
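The inductive definition can be sketched numerically for discrete variables. The following is an illustration, not from the article: the pmf representation (a dict mapping outcome tuples to probabilities) and the helper names `entropy`, `marginal`, and `mmi` are our assumptions.

```python
import math

# Sketch of the inductive definition (helper names assumed, not standard):
#   I(X1;...;X_{n+1}) = I(X1;...;Xn) - I(X1;...;Xn | X_{n+1}),
# where the conditional term averages I(X1;...;Xn) over X_{n+1}.
# A joint pmf is represented as {outcome_tuple: probability}.

def entropy(pmf):
    """Shannon entropy in bits of a pmf {outcome: probability}."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

def marginal(joint, axes):
    """Marginal pmf over the coordinates listed in `axes`."""
    out = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in axes)
        out[key] = out.get(key, 0.0) + p
    return out

def mmi(joint, axes):
    """Multivariate mutual information I(X_axes[0]; ...; X_axes[-1])."""
    if len(axes) == 1:
        return entropy(marginal(joint, axes))  # convention: I(X) = H(X)
    head, last = axes[:-1], axes[-1]
    # Conditional term: average I(head) over the value of the last variable.
    cond = 0.0
    for (z,), p_z in marginal(joint, (last,)).items():
        given_z = {k: p / p_z for k, p in joint.items()
                   if k[last] == z and p > 0}
        cond += p_z * mmi(given_z, head)
    return mmi(joint, head) - cond

# XOR example: X, Y uniform independent bits, Z = X xor Y.
xor = {(x, y, x ^ y): 0.25 for x in (0, 1) for y in (0, 1)}
print(mmi(xor, (0, 1)))     # pairwise I(X;Y): 0.0
print(mmi(xor, (0, 1, 2)))  # I(X;Y;Z): -1.0
```

The negative value for the XOR triple (every pair is independent, yet any two variables jointly determine the third) is one illustration of why interactions among more than two variables resist the intuition built from pairwise mutual information.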

Alternatively, the multivariate mutual information may be defined in measure-theoretic terms as the measure of the intersection of the sets representing the individual entropies:

    I(X_1; \ldots; X_n) = \mu\big( \tilde X_1 \cap \cdots \cap \tilde X_n \big),

where \mu is the signed measure of the information diagram and \tilde X_i is the set corresponding to the entropy H(X_i).

Defining \tilde Y \equiv \tilde X_1 \cup \cdots \cup \tilde X_n, the set-theoretic identity \tilde X_1 \cap \cdots \cap \tilde X_n = \tilde Y \setminus \bigcup_i (\tilde Y \setminus \tilde X_i), which corresponds to the measure-theoretic statement \mu(\tilde X_1 \cap \cdots \cap \tilde X_n) = \mu(\tilde Y) - \mu\big(\bigcup_i (\tilde Y \setminus \tilde X_i)\big), allows the above to be rewritten as an alternating (inclusion–exclusion) sum of joint entropies:

    I(X_1; \ldots; X_n) = \sum_{\emptyset \neq T \subseteq \{1, \ldots, n\}} (-1)^{|T|+1} H(X_T),

where H(X_T) is the joint entropy of the subset of variables indexed by T. For n = 2 this reduces to H(X_1) + H(X_2) - H(X_1, X_2), the usual mutual information.
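The alternating sum over joint entropies of all nonempty subsets can be computed directly. As an illustrative sketch (discrete pmfs stored as dicts keyed by outcome tuples; the helper names are our assumptions, not from the article):

```python
import itertools
import math

# Sketch of the inclusion-exclusion form of multivariate mutual information:
#   I(X1;...;Xn) = sum over nonempty subsets T of (-1)^{|T|+1} H(X_T),
# where H(X_T) is the joint entropy of the variables indexed by T.
# A joint pmf is represented as {outcome_tuple: probability}.

def entropy(pmf):
    """Shannon entropy in bits of a pmf {outcome: probability}."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

def marginal(joint, axes):
    """Marginal pmf over the coordinates listed in `axes`."""
    out = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in axes)
        out[key] = out.get(key, 0.0) + p
    return out

def mmi_alternating(joint, n):
    """I(X_0; ...; X_{n-1}) via the alternating sum of joint entropies."""
    total = 0.0
    for k in range(1, n + 1):
        for subset in itertools.combinations(range(n), k):
            total += (-1) ** (k + 1) * entropy(marginal(joint, subset))
    return total

# XOR example: singletons contribute +3 bits, pairs -6, the triple +2,
# for a total of -1 bit, matching the inductive definition.
xor = {(x, y, x ^ y): 0.25 for x in (0, 1) for y in (0, 1)}
print(mmi_alternating(xor, 3))  # -1.0
```

A quick sanity check on the signs: for two variables the sum has exactly three terms, H(X) + H(Y) - H(X, Y), which is the standard mutual information and vanishes for independent variables.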

