I've been trying to understand how we can equate the Boltzmann entropy $k_B \ln \Omega$ with the entropy from thermodynamics. I'm following the approach in the first chapter of Pathria's Statistical Mechanics, which also appears in many other texts. Several questions on Stack Exchange come close to addressing this problem, but I don't think any of the answers get at my specific question.
So, we're considering two systems, 1 and 2, which together form an isolated composite system; they are brought into thermal contact and allowed to exchange energy (let's assume for simplicity that energy is the only thing they can exchange). On the thermodynamic side of the problem, we have the necessary and sufficient condition for thermal equilibrium
$$T_1=\frac{\partial E_1}{\partial S_1}=T_2=\frac{\partial E_2}{\partial S_2},$$
where the temperatures $T_1$ and $T_2$, the internal energies $E_1$ and $E_2$, and the entropies $S_1$ and $S_2$ are all defined appropriately in operational, thermodynamic terms. On the other hand, we can show that the necessary and sufficient condition for equilibrium from the standpoint of statistical mechanics is given by
$$\beta_1 \equiv \frac{\partial \ln \Omega_1}{\partial E_1}= \beta_2 \equiv \frac{\partial \ln \Omega_2}{\partial E_2}.$$
Here, $\Omega_1$ and $\Omega_2$ are the numbers of microstates associated with the macrostates of the two systems. Now, since both of these relations are necessary and sufficient for equilibrium, one equality holds if and only if the other also holds. My question is: how can we proceed from here to show that $S=k_B \ln \Omega$, without limiting our scope to specific examples (like an ideal gas)? In Pathria's text, and in other treatments, I don't see much explanation of how this step is justified.
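(For reference, the way I understand the statistical condition to be derived: the total energy $E=E_1+E_2$ of the composite system is fixed, and the equilibrium value of $E_1$ is taken to be the one that maximizes the total number of microstates $\Omega(E_1)=\Omega_1(E_1)\,\Omega_2(E-E_1)$, so that
$$\frac{\partial \ln \Omega}{\partial E_1}=\frac{\partial \ln \Omega_1}{\partial E_1}-\frac{\partial \ln \Omega_2}{\partial E_2}\bigg|_{E_2=E-E_1}=0,$$
which is just $\beta_1=\beta_2$. My question is about the step that comes after this.)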
My possibly wrong thoughts are: it seems we first need to show that $\beta$ is a function of $T$ alone (and indeed the same function of $T$ for both systems), and then show that this function is in fact $\beta \propto T^{-1}$. But I'm not sure how to prove either of those claims.
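To make the target concrete, here is a sketch (possibly naive) of what I think the final step would look like once those two claims are granted. Working at fixed volume and particle number, so that $dS = dE/T$, and assuming $\beta = 1/(k_B T)$ with the same constant $k_B$ for every system, the statistical relation $d(\ln\Omega)=\beta\,dE$ gives
$$dS=\frac{dE}{T}=k_B\,\beta\,dE=k_B\,d(\ln\Omega)\;\Longrightarrow\;S=k_B\ln\Omega+\text{const}.$$
So the entire difficulty seems to be concentrated in proving that $\beta$ is a universal function of $T$ and that it has the form $1/(k_B T)$.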