
quantum mechanics - Why would a black hole explode?


It is common in popular science to say that Hawking radiation causes black holes to evaporate and that, at the end of this process, the black hole explodes. I also remember it being mentioned in A Brief History of Time.


Why would a black hole explode? Why can't it simply shrink gradually to nothing? What is the exact mechanism or theory that causes a black hole to explode?



Answer



The expression for the power emitted as Hawking radiation is $$ P = \frac{\hbar c^6}{15360 \pi G^2 M^2} = 3.6\times10^{32} \left(\frac{M}{\mathrm{kg}}\right)^{-2}\ \text{W} = -c^2 \frac{dM}{dt},$$ where the equality on the far right-hand side expresses the rate at which the black hole's mass decreases due to the emission of Hawking radiation.
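As a quick numerical illustration, here is a minimal Python sketch of this formula; the constants and the example masses are my own choices, not part of the answer above:

```python
import math

hbar = 1.054571817e-34   # reduced Planck constant (J s)
c = 2.99792458e8         # speed of light (m/s)
G = 6.67430e-11          # Newton's gravitational constant (m^3 kg^-1 s^-2)

def hawking_power(M):
    """Power radiated as Hawking radiation (W) by a black hole of mass M (kg)."""
    return hbar * c**6 / (15360 * math.pi * G**2 * M**2)

# A solar-mass hole radiates almost nothing; a 100-tonne hole is ferocious.
print(f"{hawking_power(1.989e30):.1e} W")  # ~9e-29 W (1 solar mass)
print(f"{hawking_power(1e5):.1e} W")       # ~3.6e22 W (100 tonnes)
```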


Notice that the power emitted actually increases as $M$ decreases, and with it the rate at which the mass decreases.


So as the black hole gets less massive, it loses mass ever faster, and hence the power it emits grows very, very rapidly: the evaporation is a runaway process.


By solving this differential equation it can be shown that the time to evaporate to nothing is given by $$ t = 8.4\times10^{-17} \left(\frac{M}{\mathrm{kg}}\right)^{3}\ \text{s},$$ so, for example, a 100-tonne black hole would evaporate in $8.4 \times10^{-2}\ \text{s}$, emitting approximately $E = Mc^2 = 9\times 10^{21}$ joules of energy as it does so – equivalent to more than a million megatons of TNT. I guess you could call this an explosion!
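The coefficient follows from separating variables: $15360\pi G^2 M^2\, dM = -\hbar c^4\, dt$ integrates to $t = 5120\pi G^2 M^3/\hbar c^4$. A short standalone sketch reproducing the numbers above (the TNT conversion factor of $4.184\times10^{15}$ J per megaton is my own addition):

```python
import math

hbar, c, G = 1.054571817e-34, 2.99792458e8, 6.67430e-11  # SI units

def evaporation_time(M):
    """Evaporation time (s) for initial mass M (kg), from integrating the mass-loss rate."""
    return 5120 * math.pi * G**2 * M**3 / (hbar * c**4)

M = 1e5  # 100 tonnes
print(f"t = {evaporation_time(M):.1e} s")        # ~8.4e-2 s
print(f"E = {M * c**2:.1e} J")                   # ~9e21 J
print(f"  = {M * c**2 / 4.184e15:.1e} Mt TNT")   # ~2e6 megatons (1 Mt = 4.184e15 J)
```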



This will be the fate of all evaporating black holes, but most will take a very long time to reach this stage (even supposing they do not accrete any matter). The evaporation time is less than the age of the universe only for $M \lesssim$ a few $\times 10^{11}\ \text{kg}$. A one-solar-mass black hole takes about $2\times10^{67}$ years to evaporate.
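Both claims can be checked with the formula above; in this sketch the age of the universe, $4.35\times10^{17}$ s (roughly 13.8 billion years), is my assumed value:

```python
import math

hbar, c, G = 1.054571817e-34, 2.99792458e8, 6.67430e-11  # SI units
t_universe = 4.35e17   # age of the universe (s), ~13.8 Gyr -- my assumed value
yr = 3.156e7           # seconds per year

# Largest initial mass that evaporates within the age of the universe:
M_max = (t_universe * hbar * c**4 / (5120 * math.pi * G**2)) ** (1 / 3)
print(f"M_max ~ {M_max:.1e} kg")  # ~1.7e11 kg, i.e. "a few 10^11 kg"

# Evaporation time of a one-solar-mass (1.989e30 kg) black hole, in years:
t_sun = 5120 * math.pi * G**2 * 1.989e30**3 / (hbar * c**4)
print(f"t_sun ~ {t_sun / yr:.1e} yr")  # ~2e67 years
```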


EDIT: The Hawking radiation temperature is given by $$ kT = \frac{\hbar c^3}{8 \pi GM}.$$ Unless this temperature is well above the ambient temperature (at a minimum, the 2.7 K of the cosmic microwave background), the black hole will absorb more energy than it radiates and will grow. That is, for net evaporation we need $$ \frac{\hbar c^3}{8 \pi GM} > kT_{\rm ambient},$$ i.e. $$ M < \frac{1.2\times10^{23}}{T_{\rm ambient}/\mathrm{K}}\ {\rm kg}.$$
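In the same spirit, a small standalone sketch of the temperature condition (the Boltzmann constant $k_B$ and the CMB temperature of 2.725 K are standard values, supplied by me):

```python
import math

hbar, c, G = 1.054571817e-34, 2.99792458e8, 6.67430e-11  # SI units
k_B = 1.380649e-23  # Boltzmann constant (J/K)

def hawking_temperature(M):
    """Hawking temperature (K) of a black hole of mass M (kg)."""
    return hbar * c**3 / (8 * math.pi * G * k_B * M)

def max_evaporating_mass(T_ambient):
    """Largest mass (kg) whose Hawking temperature exceeds T_ambient (K)."""
    return hbar * c**3 / (8 * math.pi * G * k_B * T_ambient)

print(f"{hawking_temperature(1.989e30):.1e} K")  # 1 solar mass: ~6e-8 K, far below the CMB
print(f"{max_evaporating_mass(2.725):.1e} kg")   # vs today's CMB: ~4.5e22 kg
```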


For today's CMB temperature of 2.7 K this requires only $M < 4.5\times10^{22}$ kg, far above the $\sim 10^{11}$ kg limit for evaporation within the age of the universe. Therefore, unless I've made a mistake, this proviso is of no practical importance except for black holes evaporating in the early universe (i.e. those with $M<10^{11}$ kg), when the ambient temperature was much higher.


A black hole's temperature scales with its evaporation timescale as $T \propto t_{\rm evap}^{-1/3}$ (since $T \propto M^{-1}$ and $t_{\rm evap} \propto M^3$), while the temperature of the early, radiation-dominated universe falls as $t^{-1/2}$. Going back in time, the ambient temperature therefore rises faster than the Hawking temperature of a hole whose evaporation timescale is comparable to the age of the universe, so at some sufficiently early epoch such a hole would have been colder than its surroundings and would have absorbed more than it radiated, delaying its evaporation.
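To make the comparison explicit, a one-line scaling check using the relations above, for a hole with $t_{\rm evap} \sim t$: $$ \frac{T_{\rm BH}}{T_{\rm ambient}} \propto \frac{t^{-1/3}}{t^{-1/2}} = t^{1/6},$$ which decreases toward the past, so the evaporation condition $T_{\rm BH} > T_{\rm ambient}$ must fail at early enough times.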

