Explaining away


The interaction information (McGill, 1954), also called amounts of information (Hu Kuo Ting, 1962), co-information (Bell, 2003), or information interaction (Licina, 2017), is one of several generalizations of the mutual information. In fact, the definition of interaction information is identical to that of multivariate mutual information except for a change in sign in the case of an odd number of random variables; an exception is the work of Licina, where information is treated more like a physical entity.

Interaction information expresses the amount of information (redundancy or synergy) bound up in a set of variables, beyond that which is present in any subset of those variables. Unlike the mutual information, the interaction information can be either positive or negative. This confusing property has likely hindered its wider adoption as an information measure in machine learning and cognitive science. These functions, their negativity and their minima have a direct interpretation in algebraic topology (Baudot & Bennequin, 2015).

For three variables X, Y, Z, the interaction information is given by

    I(X;Y;Z) = I(X;Y|Z) - I(X;Y)

where, for example, I(X;Y) is the mutual information between variables X and Y, and I(X;Y|Z) is the conditional mutual information between variables X and Y given Z. Formally,

    I(X;Y;Z) = H(X,Y) + H(X,Z) + H(Y,Z) - H(X) - H(Y) - H(Z) - H(X,Y,Z)

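To make the sign behaviour concrete, the following is a minimal sketch, assuming the sign convention I(X;Y;Z) = I(X;Y|Z) - I(X;Y) used above; the function names and the two toy distributions are illustrative choices, not taken from the cited sources. It computes the interaction information of a discrete joint distribution via the entropy expansion and evaluates it on a redundant triple (Y = Z = X, giving -1 bit) and a synergistic XOR triple (giving +1 bit).

```python
import numpy as np
from itertools import product


def entropy(p):
    """Shannon entropy in bits of a probability vector, ignoring zero entries."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))


def interaction_information(joint):
    """Interaction information I(X;Y;Z) = I(X;Y|Z) - I(X;Y) of a joint pmf
    given as a 3-D array indexed by (x, y, z), computed from the entropy
    expansion H(XY)+H(XZ)+H(YZ) - H(X)-H(Y)-H(Z) - H(XYZ)."""
    joint = np.asarray(joint, dtype=float)

    def h(axes):
        # Entropy of the marginal obtained by summing out the given axes.
        return entropy(joint.sum(axis=axes).ravel())

    H_x, H_y, H_z = h((1, 2)), h((0, 2)), h((0, 1))
    H_xy, H_xz, H_yz = h(2), h(1), h(0)
    H_xyz = entropy(joint.ravel())
    return H_xy + H_xz + H_yz - H_x - H_y - H_z - H_xyz


# Redundant case: X is a uniform bit and Y = Z = X.
redundant = np.zeros((2, 2, 2))
redundant[0, 0, 0] = redundant[1, 1, 1] = 0.5

# Synergistic case: X, Y independent uniform bits and Z = X XOR Y.
xor = np.zeros((2, 2, 2))
for x, y in product((0, 1), repeat=2):
    xor[x, y, x ^ y] = 0.25

print(interaction_information(redundant))  # approximately -1.0 bit (redundancy)
print(interaction_information(xor))        # approximately +1.0 bit (synergy)
```

In the redundant case I(X;Y) = 1 bit while I(X;Y|Z) = 0, so the interaction information is negative; in the XOR case I(X;Y) = 0 while I(X;Y|Z) = 1 bit, so it is positive. Under the opposite (co-information) sign convention both values flip sign.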