
Conditional mutual information


In probability theory, particularly information theory, the conditional mutual information is, in its most basic form, the expected value of the mutual information of two random variables given the value of a third.

For discrete random variables $X$, $Y$, and $Z$, we define

$$
I(X;Y\mid Z) = \sum_{z\in\mathcal{Z}} p_Z(z) \sum_{y\in\mathcal{Y}} \sum_{x\in\mathcal{X}} p_{X,Y\mid Z}(x,y\mid z) \log \frac{p_{X,Y\mid Z}(x,y\mid z)}{p_{X\mid Z}(x\mid z)\, p_{Y\mid Z}(y\mid z)}
$$

where the marginal, joint, and/or conditional probability mass functions are denoted by $p$ with the appropriate subscript. This can be simplified as

$$
I(X;Y\mid Z) = \sum_{z\in\mathcal{Z}} \sum_{y\in\mathcal{Y}} \sum_{x\in\mathcal{X}} p_{X,Y,Z}(x,y,z) \log \frac{p_Z(z)\, p_{X,Y,Z}(x,y,z)}{p_{X,Z}(x,z)\, p_{Y,Z}(y,z)}.
$$
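
As a minimal sketch of how the simplified sum can be evaluated numerically, the following Python function computes $I(X;Y\mid Z)$ in bits from a joint probability mass function given as a 3-dimensional array. The function name, the array layout, and the XOR example distribution are illustrative assumptions, not part of the article.

```python
import numpy as np

def conditional_mutual_information(p_xyz: np.ndarray) -> float:
    """I(X;Y|Z) in bits for a joint pmf given as an array indexed [x, y, z]."""
    p_xz = p_xyz.sum(axis=1)      # marginal p_{X,Z}(x, z)
    p_yz = p_xyz.sum(axis=0)      # marginal p_{Y,Z}(y, z)
    p_z = p_xyz.sum(axis=(0, 1))  # marginal p_Z(z)

    cmi = 0.0
    for x in range(p_xyz.shape[0]):
        for y in range(p_xyz.shape[1]):
            for z in range(p_xyz.shape[2]):
                p = p_xyz[x, y, z]
                if p > 0:
                    # term p(x,y,z) * log [ p(z) p(x,y,z) / (p(x,z) p(y,z)) ]
                    cmi += p * np.log2(p_z[z] * p / (p_xz[x, z] * p_yz[y, z]))
    return cmi

# Illustrative example: X and Z independent uniform bits, Y = X XOR Z.
# Given Z, observing Y determines X, so I(X;Y|Z) should equal 1 bit.
p = np.zeros((2, 2, 2))
for x in (0, 1):
    for z in (0, 1):
        p[x, x ^ z, z] = 0.25
print(conditional_mutual_information(p))  # approximately 1.0
```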

