Entropy, Relative Entropy, Cross Entropy
https://www.iitg.ac.in/cseweb/osint/slides/Anasua_Entropy.pdf
Shannon entropy has its roots in real analysis and measure theory, and its reach goes beyond information theory.
https://en.wikipedia.org/wiki/Entropy_(information_theory)#Definition
The lower the probability of an event, the larger its information content (surprisal); entropy is the expected surprisal over all outcomes. >>> The entropy of a fair six-sided die is log2(6).
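A minimal sketch of the three quantities in the title, using only the standard definitions: Shannon entropy H(p), cross entropy H(p, q), and relative entropy (KL divergence) D(p || q), with the fair die as the running example. The function names here are my own, not from any particular library.

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    """Cross entropy H(p, q) = -sum_i p_i * log2(q_i), in bits."""
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    """Relative entropy D(p || q) = sum_i p_i * log2(p_i / q_i), in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# A fair six-sided die: six equally likely outcomes.
fair = [1 / 6] * 6
print(entropy(fair))  # equals log2(6), about 2.585 bits

# A loaded die concentrates probability on one face, so its entropy is lower.
loaded = [0.5, 0.1, 0.1, 0.1, 0.1, 0.1]
print(entropy(loaded))

# Identity tying the three together: H(p, q) = H(p) + D(p || q).
print(cross_entropy(loaded, fair))
print(entropy(loaded) + kl_divergence(loaded, fair))
```

Note that the loaded die's cross entropy against the uniform distribution exceeds its own entropy by exactly the KL divergence, which is the extra cost (in bits) of coding draws from `loaded` with a code optimized for `fair`.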