In information theory, Shannon–Fano–Elias coding is a precursor to arithmetic coding, in which probabilities are used to determine codewords.
Given a discrete random variable X of ordered values to be encoded, let p(x) be the probability for any x in X. Define a function

F(x) = \sum_{x_i < x} p(x_i) + \frac{1}{2} p(x)
Algorithm:

For each x in X:
1. Let Z be the binary expansion of F(x).
2. Set the codeword length L(x) = \lceil \log_2 (1/p(x)) \rceil + 1.
3. Let code(x) be the first L(x) bits after the binary point of Z.
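As a rough illustration of these steps, here is a minimal Python sketch; the function name shannon_fano_elias_code, the use of the fractions module for exact arithmetic, and the bit-extraction loop are choices made for this example, not part of the scheme itself.

```python
from fractions import Fraction
from math import ceil, log2

def shannon_fano_elias_code(probabilities):
    """Map each symbol of an ordered {symbol: probability} dict to its codeword."""
    codes = {}
    cdf = Fraction(0)                    # sum of probabilities of the earlier symbols
    for symbol, p in probabilities.items():
        p = Fraction(p)
        f = cdf + p / 2                  # F(x) = sum_{x_i < x} p(x_i) + p(x)/2
        length = ceil(log2(1 / p)) + 1   # L(x) = ceil(log2(1/p(x))) + 1
        # Take the first L(x) bits of the binary expansion of F(x).
        bits = []
        for _ in range(length):
            f *= 2
            bit = int(f)                 # next bit of the expansion (0 or 1)
            bits.append(str(bit))
            f -= bit
        codes[symbol] = "".join(bits)
        cdf += p
    return codes
```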
Let X = {A, B, C, D}, with probabilities p = {1/3, 1/4, 1/6, 1/4}.
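Continuing the illustrative sketch above with this alphabet (and assuming the symbols are ordered A, B, C, D as listed), the resulting codewords are 001, 011, 1010, and 111:

```python
example = {"A": Fraction(1, 3), "B": Fraction(1, 4),
           "C": Fraction(1, 6), "D": Fraction(1, 4)}
print(shannon_fano_elias_code(example))
# {'A': '001', 'B': '011', 'C': '1010', 'D': '111'}
```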
Shannon–Fano–Elias coding produces a binary prefix code, allowing for direct decoding.
Let bcode(x) be the rational number formed by placing a radix point before a binary code. For example, if code(C) = 1010 then bcode(C) = 0.1010. For all x, if no y exists such that

bcode(x) \le bcode(y) < bcode(x) + 2^{-L(x)}
then all the codes form a prefix code.
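As a quick sanity check (not a proof), this condition can be tested directly on the example codewords above; the helper bcode below and the hard-coded codeword table are specific to this illustration.

```python
from fractions import Fraction

def bcode(codeword):
    """Interpret a binary codeword as the rational number 0.codeword."""
    return Fraction(int(codeword, 2), 2 ** len(codeword))

codes = {"A": "001", "B": "011", "C": "1010", "D": "111"}
for x, cx in codes.items():
    for y, cy in codes.items():
        if x != y:
            # No bcode(y) may fall inside the interval [bcode(x), bcode(x) + 2^-L(x)).
            assert not (bcode(cx) <= bcode(cy) < bcode(cx) + Fraction(1, 2 ** len(cx)))
```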
By comparing F to the CDF of X, this property may be demonstrated for Shannon–Fano–Elias coding as follows.
By definition of L it follows that

2^{-L(y)} \le \frac{1}{2} p(y)
And because the bits after L(y) are truncated from F(y) to form code(y), it follows that

F(y) - bcode(y) \le 2^{-L(y)}
thus bcode(y) \ge F(y) - \frac{1}{2} p(y), which is the total probability of all values preceding y; for any y that follows x in the ordering, this is at least CDF(x), so bcode(y) must be no less than CDF(x).
Since bcode(x) \le F(x) = CDF(x) - \frac{1}{2} p(x), it follows that

bcode(y) - bcode(x) \ge \frac{1}{2} p(x) \ge 2^{-L(x)}

so no such y exists (for a y that precedes x, the same reasoning with the roles of x and y exchanged shows bcode(y) < bcode(x)), and therefore the prefix property holds.
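For reference, one way to write the argument end to end for a y that follows x in the ordering is the following chain, where each step uses one of the inequalities above:

bcode(x) + 2^{-L(x)} \le F(x) + \frac{1}{2} p(x) = CDF(x) \le F(y) - \frac{1}{2} p(y) \le F(y) - 2^{-L(y)} \le bcode(y)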