Suppose we don't know anything about statistical mechanics, not even the existence of atoms.
Why is entropy defined as $$\delta S=\frac{\delta Q}{T}$$ instead of, say, $$\delta S=\frac{\delta Q}{T^2}$$ or any other function that would preserve the Second Law of thermodynamics? In a nutshell: is our definition of entropy unique?
Answer
In thermodynamics, the definition of entropy is unique up to redefinitions of temperature.
The Zeroth Law of thermodynamics tells us that a temperature scale exists, but it doesn't single out any particular one. We are therefore free to relabel temperature $T$ with any monotonic function $f(T)$, in which case the definition of entropy becomes $$\delta S = \frac{\delta Q}{f(T)}.$$ As you've noted, this doesn't change anything about the Second Law. It does change what 'temperature' means, though: the Carnot efficiency, the form of the ideal gas law, and so on all change accordingly.
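To see this concretely with the asker's own example: define a relabeled temperature $\tau$ by $T = \tau^2$, so that $f(\tau) = \tau^2$. The same physical entropy is then written $$\delta S = \frac{\delta Q}{\tau^2},$$ the Carnot efficiency becomes $\eta = 1 - \tau_2^2/\tau_1^2$, and the ideal gas law reads $pV = N k_B \tau^2$. Nothing observable changes; only the numbers on the thermometer do.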
The specific choice $f(T) = T$ is sort of 'snuck in'. In the usual procedure, we show that a Carnot engine running between reservoirs at temperatures $T_1$ and $T_2$ has an efficiency $\eta$ of the form $$1 - \eta(T_1, T_2) = \frac{g(T_2)}{g(T_1)},$$ which can be shown by composing two Carnot engines in series. By convention, we take $g(T) = T$, and the resulting scale is called the thermodynamic temperature. This is a good choice because it corresponds closely to empirical temperature measures, such as the height of mercury in a thermometer.
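To sketch that composition argument: run one Carnot engine between $T_1$ and $T_2$, and feed its rejected heat into a second engine running between $T_2$ and $T_3$. Since $1 - \eta$ is the ratio of heat out to heat in, the ratios multiply: $$1 - \eta(T_1, T_3) = \big(1 - \eta(T_1, T_2)\big)\big(1 - \eta(T_2, T_3)\big).$$ Fixing a reference temperature $T_0$ and setting $g(T) = 1 - \eta(T_0, T)$ then gives $1 - \eta(T_1, T_2) = g(T_2)/g(T_1)$.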
However, we're also free to take any monotonic function $g$, and choosing a nontrivial $g$ is equivalent to choosing a nontrivial $f = g$, since the Carnot relation then yields $\delta S = \delta Q / g(T)$. This affects the definition of entropy exactly as above.
In statistical mechanics, we have a more fundamental definition of entropy, $$S = k_B \log \Omega,$$ where $\Omega$ is the number of accessible microstates. This definition is unique, and it forces the choice $f(T) = T$: once temperature is defined by $1/T = \partial S / \partial E$, there is no freedom left to relabel it.
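Here's a minimal numerical sketch of that last claim in Python, using a toy model of my own choosing (not anything canonical): $N$ independent two-level systems with level spacing $\varepsilon$, in units where $k_B = \varepsilon = 1$. The temperature obtained from $1/T = \partial S / \partial E$ with $S = k_B \log \Omega$ agrees with the temperature appearing in the Boltzmann occupation factor, i.e. the thermodynamic scale, leaving no room for a nontrivial $f$.

```python
import math

# Toy model: N two-level systems with level spacing eps; energy E = n * eps,
# where n is the number of excited systems. The microstate count is binomial:
#   Omega(n) = C(N, n),  S(n) = kB * ln Omega(n).
# Units: kB = 1, eps = 1.

N = 100_000  # large N, so finite differences approximate derivatives well

def S(n):
    # ln C(N, n) computed via log-gamma to avoid enormous integers
    return math.lgamma(N + 1) - math.lgamma(n + 1) - math.lgamma(N - n + 1)

n = 20_000  # 20% of the systems excited

# Statistical temperature from 1/T = dS/dE (central difference; dE = eps = 1)
beta_stat = (S(n + 1) - S(n - 1)) / 2.0
T_stat = 1.0 / beta_stat

# Thermodynamic temperature from the Boltzmann occupation ratio:
#   n / (N - n) = exp(-eps / T)  =>  T = 1 / ln((N - n) / n)
T_boltz = 1.0 / math.log((N - n) / n)

print(f"T from dS/dE:           {T_stat:.6f}")
print(f"T from Boltzmann ratio: {T_boltz:.6f}")
# The two agree up to O(1/N) corrections; any nontrivial f(T) would break this.
```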