Publication Date
10-1-2022
Abstract
Entropy is a natural measure of randomness. It ranges from its smallest possible value 0 -- attained in the deterministic case, in which one alternative i occurs with probability 1 (p_i = 1) -- to its largest possible value, attained at the uniform distribution p_1 = ... = p_n = 1/n. Intuitively, there is not much variety in the distribution either in the deterministic case or in the uniform case, while in the intermediate cases, when we have several different values p_i, there is strong variety. Entropy does not seem to capture this notion of variety. In this paper, we discuss how this intuitive notion can be described.
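The range of entropy described in the abstract can be checked numerically. Below is a minimal sketch in Python, using the Shannon entropy with natural logarithms (the choice of logarithm base is an assumption, since the abstract does not fix one); the specific intermediate distribution is also only an illustrative assumption:

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * ln(p_i), with 0*ln(0) taken as 0."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

n = 4
deterministic = [1.0, 0.0, 0.0, 0.0]  # one alternative occurs with probability 1
uniform = [1.0 / n] * n               # p_1 = ... = p_n = 1/n
intermediate = [0.4, 0.3, 0.2, 0.1]   # several different values p_i: strong variety

# Deterministic case: smallest possible value, 0
print(entropy(deterministic))
# Uniform case: largest possible value, ln(n)
print(entropy(uniform), math.log(n))
# Intermediate case: entropy is strictly between 0 and ln(n),
# even though intuitively this is where the variety is largest
print(entropy(intermediate))
```

Note how the two "low-variety" cases sit at the opposite extremes of the entropy scale, while the high-variety intermediate case falls in between -- which is exactly the mismatch between entropy and variety that the paper discusses.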
Comments
Technical Report: UTEP-CS-22-100a
To appear in Proceedings of the 11th IEEE International Conference on Intelligent Systems IS'22, Warsaw, Poland, October 12-14, 2022.