I've been thinking about entropy for a while, and about why it is such a confusing concept: many references are filled with varying descriptions (arrows of time, disorder, etc.) of what is ultimately a statistical quantity. Could this confusion stem from the nature of its units and from the way it is scattered across different equations?
Maxwell published his paper on the molecular distribution of velocities in 1859, which led to the identification of temperature with the mean kinetic energy of the atoms or molecules in a gas. At that point we could have redefined temperature in units of energy, which makes more sense than kelvin. The new absolute temperature would be $T_a = kT$, where $T_a$ is measured in units of energy.
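As a quick numerical illustration (my own sketch, not part of the original argument), here is what the rescaling $T_a = kT$ looks like at room temperature:

```python
# Minimal sketch: converting a conventional temperature in kelvin to the
# proposed absolute temperature T_a = k*T, expressed in energy units.
k = 1.380649e-23       # Boltzmann constant in J/K (exact in the 2019 SI)
eV = 1.602176634e-19   # joules per electronvolt

T = 300.0              # room temperature in kelvin
T_a = k * T            # the proposed temperature, now in joules
print(T_a)             # ~4.14e-21 J
print(1000 * T_a / eV) # ~25.9 meV -- the familiar "kT at room temperature"
```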
Once temperature is in units of energy, entropy becomes unitless (which makes more sense to me):
$$S = \ln W$$
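For instance, for $N$ independent two-state constituents, $W = 2^N$ and the dimensionless entropy is simply $S = \ln 2^N = N \ln 2$ nats, i.e. exactly $N$ bits.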
In addition, this makes other equations clearer:
Maxwell's gas law identity has the form (for atomic particles of mass $m$) $\frac{3kT}{2} = \frac{m\langle v^2\rangle}{2}$, where $T$ is the absolute temperature, $\langle v^2\rangle$ is the average squared velocity of the atoms, and $k$ is Boltzmann's constant.
But with $T_a$, the relation simplifies to $\frac{3T_a}{2} = \frac{m\langle v^2\rangle}{2}$.
And the gas constant $R$ in the ideal-gas equation of state would be replaced by Avogadro's number $N_{AV} = 6.022 \times 10^{23}$, so that the equation of state for one mole of an ideal gas reads $PV = N_{AV}T_a$ instead of $PV = RT$.
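A quick consistency check (my addition): since $R = N_{AV}\,k$, the rescaled law $PV = N_{AV}T_a$ must reproduce the conventional $PV = RT$, e.g. for one mole at standard conditions:

```python
# Sketch: verify R = N_AV * k and recover the textbook molar volume.
k = 1.380649e-23       # Boltzmann constant, J/K
N_AV = 6.02214076e23   # Avogadro's number, 1/mol

R = N_AV * k
print(R)               # ~8.314 J/(mol K), the familiar gas constant

T = 273.15             # standard temperature, K
P = 101325.0           # standard pressure, Pa
V = N_AV * (k * T) / P # PV = N_AV * T_a with T_a = k*T
print(V)               # ~0.0224 m^3, i.e. the textbook ~22.4 L per mole
```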
Why bother?
Redefining temperature in units of energy would make entropy conceptually and formally identical to information. This creates a strong association between entropy and probability, and makes the second law (which isn't absolute anyway) less mysterious.
Would it make sense to have temperature in units of energy at other levels of physics, with a unitless entropy, say in the case of black-hole entropy?
Answer
I mostly agree with Jerry and Danu in their comments: your proposed definition would make a great deal of sense, and indeed the Boltzmann constant is unity in natural (Planck) units.
A unitless entropy would have a great deal of appeal, especially given the tight links between the thermodynamic entropy and the information-theoretic Shannon entropy: they would be equal in your units for a thermalized system of perfectly uncorrelated (statistically independent) constituents, the special case envisaged by Boltzmann's Stosszahlansatz (usually rendered "molecular chaos", although Boltzmann's own word means "collision-number hypothesis"). Your entropy would then be measured in nats; you would have to use units wherein the Boltzmann constant were $\log 2$ to get entropy in bits. Note, however, that the thermodynamic entropy calculated from the marginal distributions, $-N \sum_j p_j \log p_j$, is not in general equal to the Shannon entropy of the whole system in these units: one generally has to take into account correlations between particles, which lessen the entropy (because particle states partially foretell other particle states). For a good explanation of these ideas, see
E. T. Jaynes, "Gibbs vs Boltzmann Entropies", Am. J. Phys. 33(5), 391-398 (1965), as well as many of his other works in this field.
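To make the point about correlations concrete, here is a toy Python illustration of my own (not from Jaynes): two perfectly correlated fair bits have a joint entropy of $\ln 2$ nats, while the sum of their marginal entropies, the analogue of $-N\sum_j p_j \log p_j$, overcounts it as $2\ln 2$:

```python
import math

def shannon(p):
    """Shannon entropy in nats of a discrete probability distribution."""
    return -sum(q * math.log(q) for q in p if q > 0)

# Two fair bits, perfectly correlated: only the joint states 00 and 11 occur.
joint = [0.5, 0.0, 0.0, 0.5]   # probabilities of 00, 01, 10, 11
marginal = [0.5, 0.5]          # each bit on its own still looks fair

print(shannon(joint))          # ln 2 ~ 0.693 nats: the true joint entropy
print(2 * shannon(marginal))   # 2 ln 2 ~ 1.386 nats: the marginal overcount
```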
There is one last point to note, however: the idea that temperature is proportional to the average constituent energy is only approximately true in general. It holds for an ideal gas, as you know. But the most general definition of thermodynamic temperature is that the efficiency of an ideal reversible heat engine working between two infinite heat reservoirs at different temperatures defines the ratio of those temperatures:
$$\frac{T_1}{T_2} = 1 - \eta$$
where $\eta$ is the efficiency of the heat engine, $T_2$ is the higher temperature, that of the reservoir from which the engine draws heat, and $T_1$ is the lower temperature, that of the reservoir into which the engine dumps its waste heat. Once a "standard" unit temperature is defined (e.g. as something like that of the triple point of water), the full temperature definition follows. This definition can be shown to be equivalent to the definition (in your units, with $k = 1$):
$$T^{-1} = \partial_U S$$
i.e. the inverse temperature (sometimes quaintly called the "perk") measures how much a given system "thermalizes" in response to heat added to its internal energy $U$ (how much the system rouses or "perks up"). For a good summary, see the section "Second law of thermodynamics" on the Wikipedia page for Temperature.
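As a concrete instance of the engine definition above (a worked example of my own), the efficiency between two reservoirs fixes only the temperature ratio, whatever unit convention one adopts:

```python
# Sketch: the reversible-engine efficiency fixes the ratio T_1/T_2.
T_hot, T_cold = 400.0, 300.0  # reservoir temperatures, in any common units
eta = 1 - T_cold / T_hot
print(eta)                    # 0.25: the ideal (Carnot) efficiency
print(1 - eta)                # 0.75 = T_1/T_2, independent of the unit chosen
```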
So let's apply this definition to a thermalized system of quantum harmonic oscillators at distinguishable positions. At thermodynamic equilibrium, the Boltzmann distribution for the ladder number $n$ (the number of photons/phonons in a given oscillator) is:
$$p(n) = \left(e^{\beta\, \hbar\,\omega }-1\right) e^{-\beta\,\hbar\, \omega \, (n+1)}$$
The mean oscillator energy is then:
$$\left<E\right> = \frac{\hbar\,\omega}{e^{\beta\,\hbar\,\omega}-1}$$
The Shannon entropy (per oscillator) is then:
$$S = -\sum\limits_{n = 0}^\infty p(n) \log p(n) = \frac{\beta\,\hbar\,\omega\,e^{\beta \,\hbar\,\omega}}{e^{\beta\,\hbar\,\omega}-1} - \log \left(e^{\beta\,\hbar\,\omega}-1\right)$$
so the thermodynamic temperature is then given by (noting that the only way we change this system's energy is by varying $\beta$):
$$T^{-1} = \partial_{\left<E\right>} S = \frac{\partial_\beta S}{\partial_\beta \left<E\right>} = \beta$$
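Here is a quick numerical check of that result (my own sketch, in units where $k = 1$ and $\hbar\omega = 1$): differentiating $S$ with respect to $\left<E\right>$ numerically indeed returns $\beta$:

```python
import math

def mean_E(beta):
    """Mean oscillator energy <E> = 1/(e^beta - 1) in units hbar*omega = 1."""
    return 1.0 / math.expm1(beta)

def entropy(beta):
    """Shannon entropy per oscillator, from the formula above."""
    return beta * math.exp(beta) / math.expm1(beta) - math.log(math.expm1(beta))

beta, h = 0.7, 1e-6
dS = entropy(beta + h) - entropy(beta - h)   # central differences in beta
dE = mean_E(beta + h) - mean_E(beta - h)
print(dS / dE)   # ~0.7 = beta, so T = 1/beta as claimed
```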
However, this temperature is not equal to the mean particle energy at very low temperatures; the mean particle energy is:
$$\begin{array}{lcl}\left<E\right> &=& \frac{\hbar\,\omega}{e^{\hbar\,\omega/T}-1}\\ &\approx& T - \frac{\hbar\,\omega}{2} + O\!\left(\frac{(\hbar\,\omega)^2}{T}\right),\quad T \gg \hbar\,\omega\end{array}$$
so you can see that your original identification of temperature with the mean particle energy is recovered (up to the constant offset $\hbar\,\omega/2$) for $T \gg \hbar\,\omega$, the photon energy.
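And a numerical check of the high-temperature limit (again my own sketch, with $k = 1$ and $\hbar\omega = 1$): the residual between $\left<E\right>$ and $T - \hbar\omega/2$ shrinks as $T$ grows:

```python
import math

# <E> = 1/(e^(1/T) - 1) should approach T - 1/2 for T >> hbar*omega = 1.
for T in (1.0, 10.0, 100.0):
    E = 1.0 / math.expm1(1.0 / T)
    print(T, E, E - (T - 0.5))   # the residual falls off like 1/(12*T)
```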