Wednesday, November 7, 2018

thermodynamics - Definition of the entropy


In physics, the word entropy carries important physical meaning as a measure of the "disorder" of a system. In mathematics, a more abstract definition is used. The (Shannon) entropy of a variable $X$ is defined as


$$H(X)=-\sum_x P(x)\log_2 [P(x)]$$


bits, where $P(x)$ is the probability that $X$ is in the state $x$, and the term $P(x)\log_2[P(x)]$ is defined as $0$ if $P(x)=0$.


Question: Can anyone explain why the "disorder" of a system is defined this way? In particular, where does the "$\log_2$" in this formula come from?
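As a concrete illustration of the formula in the question, here is a minimal Python sketch (the function name `shannon_entropy` is my own choice, not from the post) that evaluates $H(X)=-\sum_x P(x)\log_2 P(x)$, using the convention that a term with $P(x)=0$ contributes nothing:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum_x P(x) * log2(P(x)),
    with the convention that 0 * log2(0) = 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin (two equally likely states) carries exactly 1 bit:
print(shannon_entropy([0.5, 0.5]))   # → 1.0
# A certain outcome carries no information:
print(shannon_entropy([1.0]))        # → 0.0
# Four equally likely states need 2 bits:
print(shannon_entropy([0.25] * 4))   # → 2.0
```

The base-2 logarithm is what makes the unit "bits": with $n$ equally likely states, $H = \log_2 n$, i.e. the number of binary (yes/no) questions needed on average to identify the state.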




