Entropy is a natural measure of randomness. It ranges from its smallest possible value 0, attained in the deterministic case, when one alternative i occurs with probability 1 (p_i = 1), to its largest possible value, attained for the uniform distribution p_1 = ... = p_n = 1/n. Intuitively, however, neither the deterministic case nor the uniform case exhibits much variety in the distribution: in both, all the nonzero probabilities p_i are equal to each other. In the intermediate cases, when the values p_i differ from one another, there is a strong variety. Entropy does not seem to capture this notion of variety. In this paper, we discuss how this intuitive notion can be described.
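The range of entropy described above can be checked numerically. The sketch below (an illustration, not part of the paper's formal development) computes the Shannon entropy H(p) = -sum_i p_i log2 p_i for a deterministic distribution, a uniform distribution, and an intermediate one:

```python
import math

def entropy(p):
    # Shannon entropy H(p) = -sum_i p_i * log2(p_i), with 0 * log 0 := 0
    return -sum(p_i * math.log2(p_i) for p_i in p if p_i > 0)

# Deterministic case: one alternative has probability 1 -> smallest value 0
print(entropy([1.0, 0.0, 0.0, 0.0]))       # 0.0

# Uniform distribution over n = 4 alternatives -> largest value log2(4) = 2
print(entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0

# Intermediate case with several different values p_i
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75
```

Note that entropy ranks the uniform distribution highest, even though, intuitively, its equal probabilities show no more variety than the deterministic case, while the intermediate distribution, whose probabilities actually differ, gets a value strictly in between.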