Tuesday, January 12, 2016

thermodynamics - Why is absolute zero considered to be asymptotical? Wouldn't regions such as massive gaps between galaxy clusters have temperatures of absolute zero?


I just do not see why our model must work the way that it does. I mean, there have to be regions with no thermal energy out there; the universe is massive.



Answer




We can only approach absolute zero asymptotically because we cannot simply extract heat from a system at will. The only way to get heat out is to place the system in contact with something colder and let the heat flow from hot to cold, as it usually does. Since nothing is colder than absolute zero, we can never get all of the heat to flow out of a system.
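
As a toy illustration (my own sketch, not part of the original answer): suppose each cooling cycle can only remove a fixed fraction of the system's remaining temperature, an assumption chosen just to make the asymptote visible. The temperature then shrinks geometrically and never actually reaches 0 K:

    # Toy model: each cooling step removes an assumed fixed fraction f
    # of the remaining temperature, so T_n = T_0 * (1 - f)**n.
    T = 300.0   # starting temperature, kelvin
    f = 0.5     # fraction removed per step (illustrative assumption)

    for step in range(1, 11):
        T *= 1.0 - f
        print(f"step {step:2d}: T = {T:.6f} K")

    # T stays strictly positive after any finite number of steps,
    # mirroring the asymptotic approach described above.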


We can reduce the temperature by increasing the size of the system and diluting the heat. In fact, this is why the CMB (cosmic microwave background) temperature is only 2.7 K rather than the gazillions of K it was shortly after the Big Bang. The expansion of the universe has diluted the heat left over from the Big Bang and reduced the temperature. However, achieving absolute zero this way would require infinite dilution and therefore infinite time, which is why the universe approaches absolute zero only asymptotically.
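
To make the dilution quantitative: for radiation, temperature scales inversely with the cosmic scale factor a, so in terms of redshift z the CMB temperature obeys T(z) = T0 (1 + z). A minimal sketch using today's measured T0:

    # Standard radiation scaling: T = T0 / a = T0 * (1 + z).
    T0 = 2.725  # present-day CMB temperature, kelvin

    for z in [0, 10, 1100]:      # z ~ 1100 is roughly recombination
        print(f"z = {z:5d}: T = {T0 * (1 + z):8.1f} K")

    # T -> 0 only as the scale factor grows without bound,
    # i.e. only after infinite dilution, as stated above.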


Actually, assuming dark energy doesn't go away, the universe will never cool to absolute zero even given infinite time. This is because the accelerated expansion caused by dark energy creates a cosmological horizon, and this horizon produces Hawking radiation. That Hawking radiation keeps the temperature above absolute zero.
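
For a sense of scale, the temperature of that horizon radiation in a pure de Sitter universe is given by the Gibbons-Hawking formula T = ħH/(2πk_B). Plugging in a rough present-day Hubble rate (an approximate value, assumed here for illustration) puts the floor near 10^-30 K:

    import math

    # Gibbons-Hawking temperature of the de Sitter horizon:
    # T = hbar * H / (2 * pi * k_B).
    hbar = 1.054571817e-34   # reduced Planck constant, J*s
    k_B  = 1.380649e-23      # Boltzmann constant, J/K
    H0   = 2.2e-18           # Hubble rate, 1/s (~68 km/s/Mpc, approximate)

    T_horizon = hbar * H0 / (2 * math.pi * k_B)
    print(f"T_horizon ~ {T_horizon:.1e} K")   # about 3e-30 K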

