Inequalities are very important in the study of information theory. There are a number of different contexts in which these inequalities appear.
Consider a finite collection of finitely (or at most countably) supported random variables on the same probability space. For a collection of n random variables, there are 2^n − 1 non-empty subsets for which entropies can be defined. For example, when n = 2, we may consider the entropies H(X₁), H(X₂), and H(X₁, X₂) and express the following inequalities (which together characterize the range of the marginal and joint entropies of two random variables):
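These standard relations between marginal and joint entropies can be checked numerically for any concrete joint distribution. A minimal Python sketch, using a hypothetical joint distribution over two binary variables chosen purely for illustration:

```python
import math

# Hypothetical joint distribution of two binary random variables X1 and X2
p_joint = {
    (0, 0): 0.4,
    (0, 1): 0.1,
    (1, 0): 0.2,
    (1, 1): 0.3,
}

def entropy(dist):
    """Shannon entropy in bits of a distribution given as a dict of probabilities."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Compute the marginal distributions of X1 and X2 from the joint distribution
p1, p2 = {}, {}
for (x1, x2), p in p_joint.items():
    p1[x1] = p1.get(x1, 0.0) + p
    p2[x2] = p2.get(x2, 0.0) + p

H1 = entropy(p1)        # H(X1)
H2 = entropy(p2)        # H(X2)
H12 = entropy(p_joint)  # H(X1, X2)

# Basic inequalities characterizing the two-variable case:
assert H1 >= 0 and H2 >= 0              # entropies are non-negative
assert H12 >= max(H1, H2)               # the joint entropy dominates each marginal
assert H12 <= H1 + H2                   # subadditivity: H(X1, X2) <= H(X1) + H(X2)
print(H1, H2, H12)
```

The assertions pass for every valid joint distribution, not just this one; equality in the subadditivity bound holds exactly when X1 and X2 are independent.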