Slepian–Wolf coding


In information theory and communication, Slepian–Wolf coding, also known as the Slepian–Wolf bound, is a result in distributed source coding discovered by David Slepian and Jack Wolf in 1973. It gives the theoretical limits for losslessly compressing two correlated sources that are encoded separately.

Distributed coding is the coding of two (in this case) or more dependent sources with separate encoders and a joint decoder. Given two statistically dependent i.i.d. finite-alphabet random sequences X and Y, the Slepian–Wolf theorem gives a theoretical bound on the lossless coding rates for distributed coding of the two sources, as shown below:

    R_X ≥ H(X | Y),
    R_Y ≥ H(Y | X),
    R_X + R_Y ≥ H(X, Y),

where R_X and R_Y are the rates used to encode X and Y, H(X | Y) and H(Y | X) are the conditional entropies, and H(X, Y) is the joint entropy of the two sources.
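To make these quantities concrete, here is a short Python sketch (an illustration only; the joint distribution is an assumed toy example, not something given in the article) that computes the entropies defining the Slepian–Wolf region for a pair of correlated binary sources:

    import numpy as np

    def entropy(p):
        """Shannon entropy in bits of a probability array, ignoring zero entries."""
        p = np.asarray(p, dtype=float).ravel()
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())

    # Assumed toy joint pmf: X is a fair bit, Y equals X flipped with probability 0.1.
    eps = 0.1
    p_xy = np.array([[0.5 * (1 - eps), 0.5 * eps],
                     [0.5 * eps,       0.5 * (1 - eps)]])

    H_xy = entropy(p_xy)                  # joint entropy H(X, Y)
    H_x  = entropy(p_xy.sum(axis=1))      # marginal entropy H(X)
    H_y  = entropy(p_xy.sum(axis=0))      # marginal entropy H(Y)
    H_x_given_y = H_xy - H_y              # H(X | Y) = H(X, Y) - H(Y)
    H_y_given_x = H_xy - H_x              # H(Y | X) = H(X, Y) - H(X)

    # Independent coding needs about H(X) + H(Y) bits per symbol pair;
    # the Slepian-Wolf region only requires the sum rate to reach H(X, Y).
    print(f"H(X) + H(Y) = {H_x + H_y:.3f} bits,  H(X,Y) = {H_xy:.3f} bits")
    print(f"corner points: ({H_x_given_y:.3f}, {H_y:.3f}) and ({H_x:.3f}, {H_y_given_x:.3f})")

For this toy distribution the achievable sum rate drops from about 2.0 bits to about 1.47 bits per symbol pair; the saving is exactly the mutual information between X and Y.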

If the encoders and decoders of the two sources operate independently, the lowest rates at which lossless compression can be achieved are H(X) and H(Y) for X and Y respectively, where H(X) and H(Y) are the entropies of X and Y. However, with joint decoding, if a vanishing error probability for long sequences is accepted, the Slepian–Wolf theorem shows that a much better compression rate can be achieved: as long as the total rate of X and Y is larger than their joint entropy H(X, Y) and neither source is encoded at a rate smaller than its conditional entropy given the other (that is, R_X ≥ H(X | Y) and R_Y ≥ H(Y | X)), distributed coding can achieve an arbitrarily small error probability for long sequences.
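A classical way to operate inside this region is syndrome-based binning (in the spirit of Wyner's linear-code construction); the sketch below is a minimal, hypothetical illustration rather than the theorem's own proof technique. The encoder for X transmits only the 3-bit syndrome of each 7-bit block with respect to the (7,4) Hamming code, and the joint decoder recovers X exactly whenever the side information Y differs from X in at most one bit position, so 7 source bits are conveyed with only 3 transmitted bits:

    import numpy as np

    # Parity-check matrix of the (7,4) Hamming code; column j is the binary
    # representation of j (row 0 is the least significant bit).
    H = np.array([[1, 0, 1, 0, 1, 0, 1],
                  [0, 1, 1, 0, 0, 1, 1],
                  [0, 0, 0, 1, 1, 1, 1]], dtype=int)

    def syndrome(word):
        """3-bit syndrome of a 7-bit word."""
        return (H @ word) % 2

    def sw_encode(x):
        """Encoder for X: send only the syndrome (3 bits instead of 7)."""
        return syndrome(x)

    def sw_decode(s_x, y):
        """Joint decoder: recover x from its syndrome and the side information y,
        assuming x and y differ in at most one bit position."""
        s_e = (s_x + syndrome(y)) % 2   # syndrome of the difference pattern e = x XOR y
        x_hat = y.copy()
        if s_e.any():                   # nonzero syndrome -> exactly one differing bit
            pos = int(s_e[0] + 2 * s_e[1] + 4 * s_e[2]) - 1  # its position, read off the column of H
            x_hat[pos] ^= 1
        return x_hat

    # Correlation model (assumed for this example): y is x with one bit flipped.
    rng = np.random.default_rng(0)
    x = rng.integers(0, 2, size=7)
    y = x.copy()
    y[rng.integers(0, 7)] ^= 1

    assert np.array_equal(sw_decode(sw_encode(x), y), x)
    print("x recovered from 3 transmitted bits plus the side information y")

If the difference between X and Y is uniform over the eight patterns of Hamming weight at most one, then H(X | Y) = log2(8) = 3 bits per block, so this scheme operates exactly at the corner point (H(X | Y), H(Y)) of the Slepian–Wolf region.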
