I'm reading George Gamow's book "Thirty Years that Shook Physics" and have trouble understanding his way of describing the UV catastrophe. In the first part he points out that applying the Equipartition Theorem (or principle) to radiation would lead to a situation in which every wave of each frequency should have $0$ energy (or, better, $E/\infty$, if $E$ is the original amount of energy of the system). Next he states that if we introduce, for instance, red light into a Jeans cube and apply the principle to this situation, we would have the absurd consequence that the cube could be a source of high-frequency radiation ($\gamma$, $X$, etc.).
Now, how can the second situation even arise if, equally absurdly, each frequency had zero energy to begin with? I see not a "UV" catastrophe but rather a "disappearing energy" catastrophe. What am I missing?
Answer
The problem I think you are having is that once you assume a false statement, you can prove anything. So everything you said in the second paragraph is true if you treat the problem classically. You are right that each electromagnetic standing-wave mode in the cavity would have no energy, and so there would be no electromagnetic energy at all even at finite temperature.
However, this is not the exact line of reasoning the author intended. The author reasoned as follows:
1. We know from experience that it takes only a finite amount of energy $E$ to raise the temperature of a hollow metal box (radiation cavity) by some amount $\Delta T$.
2. We know from equipartition that this energy $E$ must be split evenly among the modes of the cavity.
3. Since there are an infinite number of modes (made explicit just below), each mode's energy must increase by $E/\infty$; but this is zero, and so no mode will have any more energy after the temperature is raised.
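To see why the number of modes is infinite (this is standard cavity physics, not something the author spells out): for a cubical cavity of side $L$ with conducting walls, the allowed standing-wave frequencies have the form $$ \nu = \frac{c}{2L}\sqrt{n_x^2 + n_y^2 + n_z^2}, $$ where $n_x, n_y, n_z$ are integer mode numbers. Since these integers can be arbitrarily large, there is no highest mode, and dividing a finite $E$ among all of them leaves zero per mode.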
These first three points pretty much agree with what he says. Then I think his next points go something like this:
4. We know that if we pump energy into a low-frequency mode and wait, the system will thermalize, so that energy will be transferred to higher-frequency modes.
5. We know from experience that our cavity will still emit radiation after thermalization.
6. Because of equipartition, we expect much of the radiation to occur at higher frequencies (made quantitative just below). This contradicts experience, because we never see a room-temperature blackbody emitting X-rays.
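To make that last point concrete (this is the standard Rayleigh-Jeans mode counting, which the author does not write out): a cavity of volume $V$ supports $\frac{8\pi V}{c^3}\nu^2\,d\nu$ standing-wave modes with frequency between $\nu$ and $\nu + d\nu$, so giving each mode its equipartition share $kT$ yields the spectral energy density $$ u(\nu, T)\,d\nu = \frac{8\pi \nu^2}{c^3}\,kT\,d\nu, $$ which grows like $\nu^2$. Taken at face value, a room-temperature cavity would then hold far more energy in its X-ray modes than in its red-light modes, which is exactly the absurdity Gamow points to.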
Now, the way I was taught the ultraviolet catastrophe was the following. We try to figure out the total energy $E$ of the system at temperature $T$. This is the sum over all modes $\nu$ of the energy $E_\nu$ in that mode. Since there are modes with arbitrarily high $\nu$, this sum has infinitely many terms, so it can be written as a limit: $$E = \lim_{\nu^* \to \infty} \sum_{\nu=0}^{\nu^*} E_\nu.$$ Now classically each $E_\nu$ should just be $kT$, so that our equation becomes $$ E = \lim_{\nu^* \to \infty} \sum_{\nu=0}^{\nu^*} kT = \lim_{\nu^* \to \infty} kT\, N(\nu < \nu^*),$$ where $N(\nu < \nu^*)$ is the number of modes with frequency $\nu$ less than $\nu^*$. Now when $\nu^*$ is pushed higher and higher (into the ultraviolet), $N(\nu < \nu^*)$ keeps increasing without bound, so the estimate of the total energy $E$ keeps getting bigger and bigger. The fact that $E$ appears to become infinite when you push $\nu^*$ deeper and deeper into the ultraviolet is why it is called the ultraviolet catastrophe.
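To see how fast that count grows (again using the standard classical mode count rather than anything quoted from the book): integrating the mode density above from $0$ to $\nu^*$ for a cavity of volume $V$ gives $$ N(\nu < \nu^*) = \int_0^{\nu^*} \frac{8\pi V}{c^3}\,\nu^2\,d\nu = \frac{8\pi V}{3 c^3}\,(\nu^*)^3, $$ so the classical estimate of the total energy, $E = kT\,N(\nu < \nu^*) = \frac{8\pi V kT}{3 c^3}(\nu^*)^3$, diverges as the cube of the cutoff as $\nu^*$ is pushed into the ultraviolet.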
So now you have seen the ultraviolet catastrophe explained two ways. The way the author explained it, he assumed a finite total energy and divided it among an infinite number of modes to get zero energy per mode; I would say "disappearing energy" catastrophe is a good name for this. The way I explained it was to assume a constant finite energy per mode, and have the total energy diverge as higher and higher frequencies are considered; it makes more sense to call this one the ultraviolet catastrophe. Either way, it is clear something is wrong.