Cross entropy


In information theory, the cross entropy between two probability distributions p and q over the same underlying set of events measures the average number of bits needed to identify an event drawn from the set, if a coding scheme is used that is optimized for an "unnatural" probability distribution q, rather than the "true" distribution p.
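
For discrete distributions p and q over the same support, the cross entropy is

    H(p, q) = -\sum_{x} p(x) \log q(x) = H(p) + D_{\mathrm{KL}}(p \parallel q),

where H(p) is the entropy of p and D_{KL}(p || q) is the Kullback–Leibler divergence of q from p; taking the logarithm to base 2 gives the value in bits.

A minimal Python sketch of this definition, assuming the distributions are given as dictionaries mapping events to probabilities:

    import math

    def cross_entropy(p, q):
        """Cross entropy H(p, q) in bits for two discrete distributions
        given as dictionaries mapping events to probabilities."""
        return -sum(p[x] * math.log2(q[x]) for x in p if p[x] > 0)

    # Example: coding with the true distribution p costs H(p) bits on average;
    # coding with a mismatched distribution q costs more.
    p = {"a": 0.5, "b": 0.25, "c": 0.25}
    q = {"a": 0.25, "b": 0.5, "c": 0.25}
    print(cross_entropy(p, p))  # 1.5 bits (the entropy of p)
    print(cross_entropy(p, q))  # 1.75 bits (extra 0.25 bits is the KL divergence)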

