Monday, January 23, 2017

thermodynamics - How is temperature defined, and measured?


In questions like this one, temperatures of millions of degrees (Celsius, Kelvin, it doesn't really matter at that point) are mentioned.


But, what does it mean exactly? What is measured, and how? As I more or less expected, the Wikipedia article mentions that the official definition (the one I was told in elementary school, either between freezing and boiling points of water for Celsius, or between absolute zero and triple point of water for Kelvin) doesn't really work above 1300K.



So, questions:





  • For "earthly temperatures", how is temperature measured consistently? Why would linearity of the measuring device be assumed, in particular to extrapolate beyond the initial (0-100, or 0-273.16) range? What guarantees that two thermometers, working under different principles (say, mercury and electricity), that agree at 0C and 100C, will agree at 50C?




  • What do statements like "the temperature in the Sun's core is $1.5\times10^7\,\mathrm{K}$" mean? Or, even, what do the "7000K" in the Earth's core mean?







Answer



Definitions


First and foremost, temperature is a parameter defining a statistical distribution, much as the statistical parameters of mean and standard deviation define the normal probability distribution. Temperature defines the equilibrium (maximum likelihood) distribution of energies of particles in a collection of statistically independent particles through the Boltzmann distribution: if the possible energies of the particles are $E_i$, then the maximum likelihood particle energy distribution is proportional to $\exp\left(-\frac{E_i}{k\,T}\right)$, where $T$ is simply a parameter of the distribution.

Most often, the higher the system's total energy, the higher its temperature (but this is not always so, see my answer here), and indeed for ideal gases the temperature is proportional to the mean energy of the constituent molecules. One sometimes hears people incorrectly saying that temperature measures the mean particle energy; this is so for ideal gases but not in general. This latter, incorrect definition will nonetheless give much correct intuition for common systems: an eight-year-old girl at my daughter's school, in our parents' science sessions, once told me that she thought temperature measured the amount of heat energy in a body, and I was pretty impressed by that answer from an eight-year-old.
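To make this concrete, the Boltzmann distribution can be evaluated directly. The sketch below uses a hypothetical two-level system with a 0.1 eV gap (the energies and temperature are illustrative assumptions, not from anything above):

```python
import math

def boltzmann_probs(energies_eV, T):
    """Equilibrium occupation probabilities p_i proportional to exp(-E_i / kT)."""
    k_eV = 8.617333262e-5  # Boltzmann constant in eV/K
    weights = [math.exp(-E / (k_eV * T)) for E in energies_eV]
    Z = sum(weights)       # partition function (normalization)
    return [w / Z for w in weights]

# Hypothetical two-level system with a 0.1 eV gap, at room temperature:
p = boltzmann_probs([0.0, 0.1], T=300.0)
# At 300 K, kT is about 0.026 eV, so the upper level is scarcely populated.
```

Raising $T$ flattens the distribution toward equal occupation, which is the intuitive sense in which $T$ is "just" a shape parameter of the energy distribution.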


An equivalent definition that allows us to calculate the temperature statistical parameter is that an equilibrium thermodynamic system's reciprocal temperature, $\beta = \frac{1}{k\,T}$, is defined by:


$$\frac{1}{k\,T} = \partial_U S\tag{1}$$


where $U$ is the total internal energy of the system and $S$ is the system's entropy, i.e. $\beta$ (sometimes quaintly called the "perk") measures how much a given system "thermalizes" (increases its entropy) in response to heat added to its internal energy $U$ (how much the system rouses or "perks up"). The Boltzmann constant depends on how one defines one's unit of temperature; in natural (Planck) units, the unit of temperature is defined so that $k = 1$.


This definition hearkens back to Carnot's ingenious definition of temperature, whereby one chooses a "standard" heat reservoir and then measures the efficiency of an ideal heat engine working between a reservoir whose temperature is to be measured and the standard one. If the efficiency is $\eta$, then the temperature of the hot reservoir, in units of the standard reservoir's temperature, is $\frac{1}{1-\eta}$. The choice of the standard reservoir is equivalent to fixing the Boltzmann constant. Of course, ideal heat engines do not exist, but this is a "thought experiment" definition. Nonetheless, this definition leads to the realization that there must be a function of state - the entropy - and that we can define the temperature through (1). See my answer here for more details.
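Carnot's thought-experiment definition is a one-liner to express. Since an ideal engine between a hot reservoir and the standard one has $\eta = 1 - T_\text{std}/T_\text{hot}$, inverting gives $T_\text{hot} = T_\text{std}/(1-\eta)$. The sketch below takes the triple point of water as the standard reservoir and an illustrative efficiency:

```python
def temperature_from_carnot(eta, T_standard=1.0):
    """Carnot's definition: an ideal engine run between a hot reservoir and a
    standard (cold) reservoir has efficiency eta = 1 - T_std/T_hot,
    so T_hot = T_std / (1 - eta)."""
    return T_standard / (1.0 - eta)

# Illustrative example: standard reservoir at the triple point of water,
# measured engine efficiency 26.7%:
T_hot = temperature_from_carnot(eta=0.267, T_standard=273.16)
# T_hot comes out near 373 K, i.e. near the boiling point of water
```

Choosing `T_standard` here plays exactly the role the answer describes: it is the choice that fixes the scale, equivalent to fixing the Boltzmann constant.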


Measurements


Extreme temperatures, such as the cores of stars, are calculated theoretically. Given a stellar thermodynamic model and calculations of pressure from gravitational theory, one can calculate the statistical distribution of energies that prevails. Stellar models predict surface temperatures and these latter, not so extreme temperatures can be measured by spectroscopy, i.e. by measuring the spectrum of emitted light and then fitting it to the Planck Radiation Law. Given reasonable agreement between predicted and observed quantities, one can have reasonable confidence in the temperatures calculated for the star core.
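A full spectroscopic measurement fits the whole spectrum to the Planck law; a quick back-of-envelope estimate (a simplification of the fitting procedure described above, not the method the answer names) uses only the peak wavelength via Wien's displacement law, $\lambda_\text{max} T = b$:

```python
def wien_temperature(peak_wavelength_m):
    """Estimate a blackbody's temperature from the peak of its spectrum
    via Wien's displacement law: lambda_max * T = b."""
    b = 2.897771955e-3  # Wien displacement constant, m*K
    return b / peak_wavelength_m

# The Sun's spectrum peaks near 502 nm (green), giving a surface
# temperature close to the accepted value of about 5772 K:
T_sun = wien_temperature(502e-9)
```

Agreement between such measured surface temperatures and the stellar model's prediction is what justifies trusting the model's (unmeasurable) core temperature.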



Pyrometry, grounded on the Stefan-Boltzmann law, is another, simpler (but less accurate) way to measure highish temperatures.
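Pyrometry amounts to inverting the Stefan-Boltzmann law $j = \epsilon\,\sigma\,T^4$ for the radiant flux $j$ (power per unit area); the flux value below is an illustrative assumption:

```python
def pyrometer_temperature(radiant_flux_W_per_m2, emissivity=1.0):
    """Invert the Stefan-Boltzmann law j = eps * sigma * T**4 to recover T."""
    sigma = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
    return (radiant_flux_W_per_m2 / (emissivity * sigma)) ** 0.25

# Illustrative example: a surface radiating 1.0e6 W/m^2 as a perfect blackbody
T_surface = pyrometer_temperature(1.0e6)   # roughly 2000 K
```

The fourth-power dependence is why pyrometry is simple but comparatively inaccurate: a modest error in the measured flux or the assumed emissivity changes the inferred temperature only as its fourth root, but an unknown emissivity well below 1 biases the result substantially.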


Earth core temperatures are deduced partly through theoretical models in the same way, but also inferred from what we know about the behavior of matter at these temperatures. Such temperatures and pressures can be created in the laboratory and monitored through pyrometry. We are reasonably confident of the phase diagram for iron, for example, and we know under what temperatures and pressures it will be liquid and when it will be solid. Then, seismic wave measurements give us a picture of the core of the Earth; thus we know the radius of the inner, solid core. Given that we know the phase diagram for the assumed core iron-nickel alloy, the solid core boundary gives us an indirect measurement of the temperature at the boundary.

