Tuesday, September 23, 2014

thermodynamics - Clear up confusion about the meaning of entropy


So I thought, and was told, that entropy is the amount of disorder in a system. The specific example was heat flow: heat flows so as to maximize entropy. To me this seemed odd. It seemed more ordered: from two things (hot and cold) to one thing (lukewarm).


So, as I read today on Wikipedia:


https://en.wikipedia.org/wiki/Entropy_%28order_and_disorder%29



In recent years the long-standing use of term "disorder" to discuss entropy has met with some criticism.[20][21][22]


When considered at a microscopic level, the term disorder may quite correctly suggest an increased range of accessible possibilities; but this may result in confusion because, at the macroscopic level of everyday perception, a higher entropy state may appear more homogeneous (more even, or more smoothly mixed), apparently in diametric opposition to its description as being "more disordered". Thus, for example, there may be dissonance at equilibrium being equated with "perfect internal disorder", or the mixing of milk in coffee being described as a transition from an ordered state to a disordered state.


It has to be stressed, therefore, that "disorder", as used in a thermodynamic sense, relates to a full microscopic description of the system, rather than its apparent macroscopic properties. Many popular chemistry textbooks in recent editions increasingly have tended to instead present entropy through the idea of energy dispersal, which is a dominant contribution to entropy in most everyday situations.



So what this is saying, basically, is that many people misread the macroscopic outcome of entropy in the same way I did above: two seemingly dissimilar things combining and becoming one.



They say to understand it microscopically, but this is still odd to me. Take, for example, the particles in a box: when they come to thermal equilibrium, they are said to have the most entropy they can within their system (from the wiki again):



Entropy – in thermodynamics, a parameter representing the state of disorder of a system at the atomic, ionic, or molecular level; the greater the disorder the higher the entropy.[7]



I guess my question is: what is disorder in the molecular sense? When the particles all come to thermal equilibrium they seem to be uniform, which I equate with order.


I feel disorder in the molecular sense would be fast-moving particles making a mind-boggling number of interactions with each other, such as in a hot gas. A hot gas seems super disordered, but put a cold gas in thermal contact with it, let them come to thermal equilibrium, and now they are more ordered!?



Answer



Sir James Jeans has an excellent answer, without using the word "order" or "disorder". Consider a card game... ¿is there anyone else on this forum besides me who still plays whist? His example is whist; I had better use poker.
You have the same probability of being dealt four aces and the king of spades as of being dealt any other precisely specified hand, e.g., the two of clubs, the three of diamonds, the five of hearts, the eight of spades, and the seven of spades. Almost worthless.
This is the microstate. A microstate is the precisely specified hand.

A macrostate is the useful description, as in the rules: two pairs, four of a kind, royal flush, flush, straight, straight flush, worthless.
You have a much lower probability of being dealt four aces than of being dealt a worthless hand.


A macrostate is the only thing about which it makes sense to say "ordered" or "disordered". The concept of "order" is undefined for a microstate. A macrostate is highly ordered or possesses a lot of information if your knowledge that you are in that macrostate implies a very high degree of specificity about what possible microstate you might be in. "Four aces", as a description, tells you a lot. "One pair", as a description, tells you much less. So the former is a state of low entropy and the latter is a state of higher entropy.


A macrostate is a probability distribution on the set of microstates. "Four aces" says that the microstate "ace, deuce, three, four, five, all spades" has zero probability. Most microstates have zero probability; they are excluded from this description. But "four aces and king of spades" has probability 1/48, as does "four aces and king of hearts", and so on down to "four aces and deuce of clubs". The entropy formula is then $-k \log \frac{1}{48} = k \log 48$, where $k$ is not Boltzmann's constant. But the entropy of "one pair" is much higher: put $W$ to be the number of different precisely specified hands which fall under the description "one pair". Then its entropy is $k \log W$.
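
To make the counting concrete, here is a brute-force Python sketch (my addition, not part of the original answer) that enumerates all $\binom{52}{5} = 2{,}598{,}960$ five-card hands and computes $S = k \log W$ for the two macrostates, taking $k = 1$ since Boltzmann's constant is irrelevant for card games:

```python
from itertools import combinations
from collections import Counter
from math import log

RANKS = "23456789TJQKA"
SUITS = "shdc"
DECK = [r + s for r in RANKS for s in SUITS]   # 52 cards

w_aces = 0   # microstates under the macrostate "four aces"
w_pair = 0   # microstates under the macrostate "one pair"

for hand in combinations(DECK, 5):
    ranks = Counter(card[0] for card in hand)
    if ranks["A"] == 4:
        w_aces += 1
    if sorted(ranks.values()) == [1, 1, 1, 2]:   # exactly one pair, nothing better
        w_pair += 1

# S = k log W, with k = 1
print(w_aces, log(w_aces))   # W = 48         -> S ~ 3.9
print(w_pair, log(w_pair))   # W = 1,098,240  -> S ~ 13.9
```

The 48 microstates of "four aces" versus the million-plus microstates of "one pair" is exactly the gap between a low-entropy and a higher-entropy macrostate.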


Jeans makes the analogy with putting a kettle of water on the fire. The fire is hotter than the water. Energy (heat) is transferred from the fire to the water, and also from the water to the fire, by, let us assume, molecular collisions only. When we say "cold kettle on a hot fire" we are describing a macrostate. When we say "water boils" that is another macrostate. When we say "fire gets hotter and water freezes" we are also describing a possible macrostate that might result. ¿What are the entropies? They are proportional to the logarithm of the number of microstates that fall under these three descriptions. Now, by the Maxwell distribution of energies of the molecules, there are very many high-energy molecules in the fire that come into contact with lower-energy molecules in the water and transfer energy to the water. There are very many precisely specified patterns of interaction at the individual molecular level of energy transfer from the fire to the water, so "boils" has a large entropy.


But there are some ways of freezing the water: the Maxwell distribution says that a few of the molecules in the fire are indeed less energetic than the average molecule in the water. It is possible that only (or mostly) these molecules are the ones that collide with the water and receive energy from the water. This is in strict analogy to the card game: there are very few aces, but it is possible you will get dealt all of them. But there are far fewer ways for this to happen, for the water to freeze, than for the previous process of boiling. Therefore this macrostate has less entropy than the "boils" macrostate.
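
That fraction can be estimated numerically. Here is a minimal sketch (my illustration, with arbitrary temperatures, not Jeans's), using the standard result that the kinetic energies of an ideal gas follow a Gamma$(3/2,\,kT)$ distribution:

```python
import numpy as np

rng = np.random.default_rng(0)
kT_fire, kT_water = 5.0, 1.0          # arbitrary units; the fire is 5x hotter

# Maxwell-Boltzmann kinetic energies in 3D: Gamma with shape 3/2, scale kT
e_fire = rng.gamma(shape=1.5, scale=kT_fire, size=1_000_000)
avg_water_energy = 1.5 * kT_water     # <E> = (3/2) kT for the water

# the rare "cold" fire molecules that would have to dominate for freezing
frac = np.mean(e_fire < avg_water_energy)
print(f"fire molecules below the average water energy: {frac:.3f}")  # ~0.1
```

So such molecules exist in quantity, like the aces in the deck; what is astronomically unlikely is that only they take part in the collisions.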


This example shows that to use the definition of entropy you have to have defined the complete set of microstates you will consider as possible, and you have to study macrostates, which are sets of these microstates. These different macrostates can then be compared as to their entropies.
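
A toy version of this prescription (my addition, using the standard Einstein-solid counting, not anything from Jeans): let the complete set of microstates be all ways of distributing $q$ energy quanta between two blocks of $N$ oscillators each. The macrostates "block A holds $q_A$ quanta" can then be compared, and the even split, the asker's "lukewarm" state, has by far the most microstates:

```python
from math import comb, log

N, q = 50, 100   # two blocks of N oscillators sharing q quanta (toy numbers)

def multiplicity(q_a):
    """Microstates with q_a quanta in block A and q - q_a in block B."""
    return comb(q_a + N - 1, q_a) * comb(q - q_a + N - 1, q - q_a)

best = max(range(q + 1), key=multiplicity)
print(best)                      # 50: the even split has the most microstates
print(log(multiplicity(0)))     # all quanta in one block: S ~ 92
print(log(multiplicity(50)))    # equilibrium ("lukewarm"): S ~ 132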


You cannot suddenly switch to a hockey game and compare the entropy of a full house to the entropy of "Leafs win". If you wish to make comparisons such as that, you would have to first define an overarching system which contained both sets of microstates and then define the macrostates, and even then the comparisons of entropy would be merely formalistic. The laws of thermodynamics only apply when there is an interaction between all parts of the system, such that any one component of the system has a chance of interchanging energy with any other part of the system within the time allotted. We also had to assume that each hand was equally likely, i.e., that Persi Diaconis was not dealing... some dealers know how to make some precisely specified hands less likely than others. Without these two assumptions there is no connection between thermodynamic entropy and informational entropy, and so the Second Law of thermodynamics will not apply to informational entropy.
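
To see why the equal-likelihood assumption matters, here is a tiny numerical sketch (my addition; the skewed distribution is a made-up stand-in for a biased dealer). The informational entropy $H = -\sum_i p_i \log p_i$ of a macrostate only reduces to $\log W$ when its $W$ microstates are equally likely:

```python
import numpy as np

W = 48                                   # microstates under "four aces"
uniform = np.full(W, 1.0 / W)            # honest dealer: all equally likely
skewed = np.array([0.5] + [0.5 / (W - 1)] * (W - 1))   # one hand favoured

def shannon(p):
    """Informational entropy H = -sum p_i log p_i (k = 1)."""
    return -np.sum(p * np.log(p))

print(shannon(uniform), np.log(W))   # equal likelihood: H = log 48 ~ 3.87
print(shannon(skewed))               # biased dealing:  H ~ 2.62 < log 48
```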



Better than "more ordered" would be to think "more specificity". There are fewer ways to arrange that all the slow molecules are in one body and all the fast ones in the other than there are to arrange a seemingly random mix, just as there are fewer ways to arrange that I get all the aces than there are to arrange that each player gets one ace.
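
For the card half of that claim the counting can be done in closed form; here is a short sketch (my addition) for a whist-style deal of 52 cards into four hands of 13:

```python
from math import comb

# deals where one named player holds all four aces:
# 9 more cards from the other 48, then split the remaining 39
all_to_me = comb(48, 9) * comb(39, 13) * comb(26, 13)

# deals where each player holds exactly one ace:
# 4! ways to assign the aces, then 12 non-aces per player from the 48
one_each = 24 * comb(48, 12) * comb(36, 12) * comb(24, 12)

print(one_each / all_to_me)   # ~40: "one ace each" is far more numerous
```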





See also Does entropy really always increase (or stay the same)? where I put Sir James's exact words at length.


