After taking a statistical mechanics course, I'm somewhat surprised that my intuitive high-school understanding of entropy doesn't match my current understanding.
When I was introduced to entropy, I was told that it (and the second law of thermodynamics) is just a statement that things move probabilistically toward their most likely state. I remember an example involving atoms, showing how defining entropy (for atoms) as dQ/T makes sense from this perspective.
But now in graduate school I'm told that entropy is the expected value of "information," that it comes from log(Z), that it's always increasing, AND that it is dQ/T.
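Written out the way I've been keeping track of them (canonical ensemble, k_B Boltzmann's constant; the grouping and labels are mine, not anything from the course), the statements I'm trying to reconcile look like this:

```latex
% The four statements about S that I'm trying to reconcile (needs amsmath):
\begin{align*}
  S  &= -k_B \sum_i p_i \ln p_i                 && \text{expected value of ``information''}\\
  S  &= k_B \ln Z + \frac{\langle E \rangle}{T} && \text{the } \log Z \text{ version, via } F = -k_B T \ln Z\\
  dS &= \frac{\delta Q_{\mathrm{rev}}}{T}       && \text{the thermodynamic } dQ/T \text{ definition}\\
  \Delta S_{\mathrm{total}} &\ge 0              && \text{``always increasing'' (second law)}
\end{align*}
```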
I've had a really hard time trying to connect all these facts mathematically. I've read a little bit on Wikipedia about the H-theorem and tried to piece together an understanding of entropy, but I'm still left with quite a few questions:
If entropy can be explained probabilistically, does that mean we can describe a system's entropy as a random variable? Can we find the probability that a given system's entropy changes as a function of time (microcanonically, canonically, or whatever)? And if we did, could we show more explicitly how entropy is always increasing on long timescales?
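Here's the kind of toy calculation I have in mind, a quick Ehrenfest-urn sketch I wrote (the particle number, step count, and the k_B = 1 convention are my own choices, and "entropy" here just means the log of the number of microstates consistent with the current macrostate):

```python
import math
import random

N = 100        # total number of particles shared between two boxes (my choice)
STEPS = 5000   # number of single-particle moves to simulate (my choice)

def entropy(n_left):
    """Boltzmann entropy (k_B = 1) of the macrostate 'n_left particles in the
    left box': S = ln(number of microstates) = ln C(N, n_left)."""
    return math.log(math.comb(N, n_left))

# Start far from equilibrium: every particle in the right box.
n_left = 0
trajectory = [entropy(n_left)]

for _ in range(STEPS):
    # Ehrenfest urn dynamics: pick one particle uniformly at random and move it
    # to the other box.  A uniformly chosen particle is in the left box with
    # probability n_left / N.
    if random.random() < n_left / N:
        n_left -= 1
    else:
        n_left += 1
    trajectory.append(entropy(n_left))

print(f"initial S      = {trajectory[0]:.2f}")
print(f"final S        = {trajectory[-1]:.2f}")
print(f"max possible S = {entropy(N // 2):.2f}")  # ln C(100, 50), about 66.8
```

In this toy model the entropy of the current macrostate climbs quickly toward its maximum and then just fluctuates slightly below it, which seems like exactly the "random variable that almost always increases" behavior I'm asking about. What I don't know is how to make that statement precise for a real microcanonical or canonical system.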
Can we derive the first law of thermodynamics using microscopic arguments? In stat mech, we always calculate the partition function, then just assume that the first law (or the Maxwell relations) holds, and plug and chug. But I always imagined a more statistical argument taking place.
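For what it's worth, the kind of argument I keep imagining looks like this (my own sketch, assuming the canonical ensemble, a quasi-static process, and volume V as the only external parameter):

```latex
% Microscopic bookkeeping for the first law (canonical ensemble; needs amsmath):
\begin{align*}
  \langle E \rangle = \sum_i p_i E_i
  \quad\Longrightarrow\quad
  d\langle E \rangle = \sum_i E_i\, dp_i + \sum_i p_i\, dE_i .
\end{align*}
% Using S = -k_B \sum_i p_i \ln p_i with p_i = e^{-\beta E_i}/Z and \sum_i dp_i = 0,
% the first term is heat and the second is (minus) work:
\begin{align*}
  \sum_i E_i\, dp_i &= T\, dS \equiv \delta Q ,\\
  \sum_i p_i\, dE_i &= \sum_i p_i \frac{\partial E_i}{\partial V}\, dV
                     = -P\, dV \equiv -\delta W ,\\
  d\langle E \rangle &= T\, dS - P\, dV .
\end{align*}
```

Reshuffling the probabilities at fixed levels looks like heat, and shifting the levels at fixed probabilities looks like work, but I'd like to see this done carefully rather than just asserted.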
I've seen entropy talked about so casually, whether it's the heat death of the universe or violations of the laws of thermodynamics, that I really wish I were more comfortable with the logic of going from microstates to macrostates.