Beliefs depend on the available information. This idea is formalized in probability theory by conditioning. Conditional probabilities, conditional expectations and conditional distributions are treated on three levels: discrete probabilities, probability density functions, and measure theory. Conditioning leads to a non-random result if the condition is completely specified; otherwise, if the condition is left random, the result of conditioning is also random.
This article concentrates on interrelations between various kinds of conditioning, shown mostly by examples. For systematic treatment (and corresponding literature) see more specialized articles mentioned below.
Example. A fair coin is tossed 10 times; the random variable X is the number of heads in these 10 tosses, and Y is the number of heads in the first 3 tosses. Although Y is determined before X, it may happen that someone knows X but not Y.
Given that X = 1, the conditional probability of the event Y = 0 is
\[ \mathbb{P}(Y=0 \mid X=1) = \frac{\mathbb{P}(Y=0,\ X=1)}{\mathbb{P}(X=1)} = \frac{\binom{7}{1}}{\binom{10}{1}} = 0.7. \]
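This value can be checked by brute force, enumerating all 2^10 equally likely toss sequences (a minimal sketch; the variable names are illustrative):

```python
from itertools import product

# All 2**10 equally likely sequences of 10 fair tosses (1 = heads, 0 = tails).
outcomes = list(product([0, 1], repeat=10))

# Condition on X = 1 (exactly one head in all 10 tosses) ...
given_x1 = [s for s in outcomes if sum(s) == 1]
# ... and count how often Y = 0 (no heads among the first 3 tosses).
y0_and_x1 = [s for s in given_x1 if sum(s[:3]) == 0]

p = len(y0_and_x1) / len(given_x1)  # P(Y = 0 | X = 1)
print(p)  # 0.7
```

Of the 10 sequences with a single head, the head avoids the first 3 positions in exactly 7, giving 7/10.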
More generally,
\[ \mathbb{P}(Y=0 \mid X=x) = \frac{\binom{7}{x}}{\binom{10}{x}}, \qquad x = 0, 1, \dots, 7 \]
(for x = 8, 9, 10 the conditional probability is 0, since at least x − 7 of the first 3 tosses must then be heads).
One may also treat the conditional probability as a random variable, a function of the random variable X, namely,
\[ \mathbb{P}(Y=0 \mid X) = \frac{\binom{7}{X}}{\binom{10}{X}}. \]
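The same enumeration can tabulate this function of X and compare each value with the closed-form ratio of binomial coefficients (a sketch; names are illustrative):

```python
from itertools import product
from math import comb

# All 2**10 equally likely sequences of 10 fair tosses (1 = heads).
outcomes = list(product([0, 1], repeat=10))

# P(Y = 0 | X = x) by counting, for each x with nonzero probability of Y = 0.
probs = {}
for x in range(8):
    with_x = [s for s in outcomes if sum(s) == x]              # condition X = x
    probs[x] = sum(sum(s[:3]) == 0 for s in with_x) / len(with_x)
    assert probs[x] == comb(7, x) / comb(10, x)                # matches C(7,x)/C(10,x)
```

Viewed this way, the conditional probability is itself random: its value is determined only once X is observed.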
The expectation of this random variable is equal to the (unconditional) probability,
\[ \mathbb{E} \big( \mathbb{P}(Y=0 \mid X) \big) = \mathbb{P}(Y=0), \]
namely,
\[ \sum_{x=0}^{7} \frac{\binom{7}{x}}{\binom{10}{x}} \cdot \mathbb{P}(X=x) = \sum_{x=0}^{7} \frac{\binom{7}{x}}{2^{10}} = \frac{2^{7}}{2^{10}} = \frac{1}{8}, \]
in agreement with \( \mathbb{P}(Y=0) = (1/2)^3 = 1/8 \), which is an instance of the law of total probability \( \mathbb{E} \big( \mathbb{P}(A \mid X) \big) = \mathbb{P}(A). \)
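As a sanity check of the identity, the sum over x can be evaluated in exact rational arithmetic (a minimal sketch; names are illustrative):

```python
from fractions import Fraction
from math import comb

# E[ P(Y=0 | X) ] = sum over x of P(Y=0 | X=x) * P(X=x),
# with P(Y=0 | X=x) = C(7,x)/C(10,x) and P(X=x) = C(10,x)/2**10.
total = sum(
    Fraction(comb(7, x), comb(10, x)) * Fraction(comb(10, x), 2**10)
    for x in range(8)
)
print(total)  # 1/8, i.e. the unconditional probability P(Y = 0) = (1/2)**3
```

Using Fraction avoids floating-point rounding, so the result is exactly 1/8.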