Suppose we don't know anything about statistical mechanics, not even the existence of atoms.
Why is entropy defined as $\delta S = \frac{\delta Q}{T}$?
Answer
In thermodynamics, the definition of entropy is unique up to redefinitions of temperature.
The Zeroth Law of thermodynamics tells us that a temperature scale exists, but it doesn't specify anything more than that. Therefore, we are free to replace temperature $T$ with any monotonic function $f(T)$, in which case the definition of entropy becomes $\delta S = \frac{\delta Q}{f(T)}$.
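To make this concrete, consider a hypothetical alternative scale (purely for illustration): measure "temperature" by $\tilde{T} = \sqrt{T}$, so that $f(\tilde{T}) = \tilde{T}^2$ recovers the Kelvin value. In terms of this scale the same entropy would be written
$$\delta S = \frac{\delta Q}{\tilde{T}^{\,2}},$$
and all of thermodynamics would go through unchanged; the Zeroth Law alone never tells us which monotonic relabeling of "hotness" deserves to be called the temperature.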
The specific choice $f(T) = T$ is sort of 'snuck in'. In the usual procedure, we show that a Carnot engine running between reservoirs of temperature $T_1$ and $T_2$ has an efficiency $\eta$ of the form $1 - \eta(T_1, T_2) = \frac{g(T_2)}{g(T_1)}$.
However, we're also free to take any function $g$, and choosing a nontrivial $g$ is equivalent to choosing a nontrivial $f = g^{-1}$. This then affects the definition of entropy as above.
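For completeness, here is how the conventional choice plays out (the standard textbook step, sketched under the usual sign conventions): taking $g$ to be the identity defines the thermodynamic (Kelvin) scale, so a reversible engine absorbing $Q_1$ at $T_1$ and rejecting $Q_2$ at $T_2$ satisfies
$$\frac{Q_1}{T_1} = \frac{Q_2}{T_2}, \qquad \eta = 1 - \frac{T_2}{T_1},$$
and Clausius's theorem $\oint \delta Q / T = 0$ for reversible cycles then makes $\delta S = \delta Q / T$ integrate to a state function. Any other monotonic $g$ would serve just as well; it would only attach different numerical labels to the same reservoirs.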
In statistical mechanics, we have a more fundamental definition of entropy, $S = k_B \log \Omega$.
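A brief sketch of how this removes the ambiguity (assuming the standard microcanonical definitions): with $S = k_B \log \Omega$, temperature is defined by
$$\frac{1}{T} = \frac{\partial S}{\partial E},$$
so for a quasistatic heat exchange at fixed external parameters, $dE = \delta Q$ and hence $\delta S = \delta Q / T$, with no freedom left to rescale beyond the choice of the constant $k_B$.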