Monday, May 30, 2016

How is thermodynamic entropy defined? What is its relationship to information entropy?


I read that thermodynamic entropy is a measure of the number of accessible microstates. What is the derivation of $S = k\log N$, where $k$ is the Boltzmann constant and $N$ is the number of microstates?


How is the logarithmic measure justified?


Does thermodynamic entropy have anything to do with the information entropy (defined by Shannon) used in information theory?



Answer



I think that the best way to justify the logarithm is that you want entropy to be an extensive quantity -- that is, if you have two non-interacting systems A and B, you want the entropy of the combined system to be $$ S_{AB}=S_A+S_B. $$ If the two systems have $N_A$ and $N_B$ states respectively, then the combined system has $N_A N_B$ states, since any state of A can be paired with any state of B. So to get additivity in the entropy, you need to take the log.
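Written out, the additivity is a one-line consequence of $S = k\log N$ (with the same constant $k$ throughout): $$ S_{AB} = k\log(N_A N_B) = k\log N_A + k\log N_B = S_A + S_B. $$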


You might wonder why it's so important that the entropy be extensive (i.e., additive). That's partly just history. Before people had worked out the microscopic basis for entropy, they'd worked out a lot of the theory on macroscopic thermodynamic grounds alone, and the quantity that they'd defined as entropy was additive.


Also, the number of states available to a macroscopic system tends to be absurdly, exponentially large, so if you don't take logarithms it's very inconvenient: who wants to be constantly dealing with numbers like $10^{10^{20}}$?
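To illustrate numerically, here is a minimal sketch; the toy system of independent two-state spins and names like `n_spins` are my own invention, not from the question. The raw state count $2^n$ is impractical to write down for macroscopic $n$, but its logarithm, and hence the entropy, is an ordinary number:

    import math

    k = 1.380649e-23           # Boltzmann constant in J/K
    n_spins = 10**20           # a macroscopic number of independent two-state spins

    # The microstate count is 2**n_spins. Writing that integer out would
    # take about 3e19 digits, so we work with its logarithm instead:
    # log(2**n) = n * log(2).
    log_num_states = n_spins * math.log(2)

    S = k * log_num_states     # S = k log N, in J/K
    print(S)                   # about 9.6e-4 J/K -- a perfectly ordinary number

    # Extensivity: two copies of the system have (2**n)**2 states, and the
    # log turns that product into a sum, so the entropy simply doubles.
    S_two_copies = k * (2 * n_spins) * math.log(2)
    print(S_two_copies / S)    # 2.0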

