Kullback–Leibler Divergence
Frederic P. Miller, Agnes F. Vandome, John McBrewster
High Quality Content by WIKIPEDIA articles! In probability theory and information theory, the Kullback–Leibler divergence (also information divergence, information gain, relative entropy, or KLIC) is a non-symmetric measure of the difference between two probability distributions P and Q. The KL divergence measures the expected number of extra bits required to code samples from P when using a code based on Q, rather than a code based on P. Typically P represents the "true" distribution of data, observations, or a precisely calculated theoretical distribution, while Q represents a theory, model, description, or approximation of P. Although it is often intuited as a distance metric, the KL divergence is not a true metric – for...
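The definition above can be sketched for discrete distributions; this is a minimal illustration (the function name and the example distributions `p` and `q` are chosen here for demonstration, not taken from the book) showing both the "extra bits" interpretation and the asymmetry mentioned in the description:

```python
import math

def kl_divergence(p, q):
    """D_KL(P || Q) in bits for discrete distributions given as
    sequences of probabilities over the same outcomes.
    Terms with p_i == 0 contribute nothing by convention."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical example distributions over three outcomes.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

print(kl_divergence(p, q))  # extra bits when coding P with a code built for Q
print(kl_divergence(q, p))  # generally a different value: KL is not symmetric
```

Note that D_KL(P‖Q) is always non-negative and is zero exactly when P = Q, but since D_KL(P‖Q) ≠ D_KL(Q‖P) in general, it fails the symmetry requirement of a true metric.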
ISBN: 978-6-1338-9768-7
Publisher: Книга по требованию (print on demand)
Release date: July 2011