Surprisal


In information theory, self-information or surprisal is a measure of the surprise associated with a particular outcome when a random variable is sampled. It is expressed in a unit of information, for example shannons (often called bits), nats, or hartleys, depending on the base of the logarithm used in its calculation. The expected value of the self-information is the information entropy, which reflects the average surprise or uncertainty associated with sampling the random variable.
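Concretely (a standard formulation not spelled out in this excerpt, with the symbols x, X, P, b, I, and H introduced here for illustration), for an outcome x of a discrete random variable X with probability mass function P, the self-information and the entropy can be written as

    I(x) = -\log_b P(x), \qquad H(X) = \mathrm{E}[I(X)] = -\sum_x P(x)\,\log_b P(x),

where the base b of the logarithm determines the unit: b = 2 gives shannons (bits), b = e gives nats, and b = 10 gives hartleys.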

By definition, information is transferred from an originating entity possessing the information to a receiving entity only when the receiver does not already know the information a priori. If the receiving entity already knew the content of a message with certainty before receiving it, the amount of information in the message received is zero.

For example, quoting a character (the Hippy Dippy Weatherman) of comedian George Carlin, “Weather forecast for tonight: dark. Continued dark overnight, with widely scattered light by morning.” Assuming one does not reside near the Earth's poles or polar circles, the amount of information conveyed in that forecast is zero because it is known, in advance of receiving the forecast, that darkness always comes with the night.

When the content of a message is known a priori with certainty, that is, with probability 1, the message conveys no actual information. Only when the receiver's advance knowledge of the content is less than certain does the message actually convey information.
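For instance, taking base-2 logarithms, an event that is certain carries

    -\log_2 1 = 0 \text{ shannons},

whereas the outcome of a fair coin flip, with probability 1/2, carries -\log_2(1/2) = 1 shannon of information.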

Accordingly, the amount of self-information contained in a message informing the receiver that a particular event has occurred depends only on the probability of that event.
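As an illustrative sketch (not part of the original article), the following Python snippet computes this quantity for a few probabilities and checks that the entropy of a small distribution equals the expected surprisal of its outcomes; the function name surprisal and the example distribution are chosen here for illustration only.

import math

def surprisal(p, base=2.0):
    """Self-information log_base(1/p) of an event with probability p."""
    if not 0.0 < p <= 1.0:
        raise ValueError("probability must be in (0, 1]")
    return math.log(1.0 / p, base)

# A certain event carries no information; rarer events carry more.
print(surprisal(1.0))          # 0.0 shannons (bits)
print(surprisal(0.5))          # 1.0 shannon
print(surprisal(0.5, math.e))  # about 0.693 nats
print(surprisal(0.1, 10))      # 1.0 hartley

# The entropy of a distribution is the expected surprisal of its outcomes.
dist = [0.5, 0.25, 0.25]
entropy = sum(p * surprisal(p) for p in dist)
print(entropy)                 # 1.5 shannons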

