In physics, entropy has an important physical interpretation as the amount of "disorder" in a system. In mathematics, a more abstract definition is used. The (Shannon) entropy of a variable $X$ is defined as
$$H(X) = -\sum_x P(x)\log_2 P(x)$$
bits, where $P(x)$ is the probability that $X$ is in the state $x$, and the summand $P(x)\log_2 P(x)$ is taken to be $0$ when $P(x) = 0$.
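As a minimal sketch of this definition (the helper name `shannon_entropy` is my own), the following Python computes $H(X)$ for a discrete distribution, using the convention above that zero-probability terms contribute nothing:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete distribution.

    Terms with P(x) = 0 contribute 0, matching the convention above.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin has two equally likely states, so H = 1 bit:
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable ("less disordered"), so H < 1 bit:
print(shannon_entropy([0.9, 0.1]))   # ~0.469
# A certain outcome carries no uncertainty at all:
print(shannon_entropy([1.0]))        # 0.0
```

Note how the more "disordered" (less predictable) distribution has the higher entropy, and a deterministic one has entropy zero.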
Question: Can anyone explain why the "disorder" of a system is defined this way? In particular, where does the $\log_2$ in this formula come from?