As I understand it (and admittedly it's a weak grasp), a computer processes information irreversibly (AND gates, for example), and therefore has some minimum entropy increase associated with its computations. The true entropy increase is much greater, and comes from the conversion of electrical energy to heat.
How efficient is a typical desktop computer when viewed in this light? Make any assumptions you find useful about the energy use, computations per second, temperature of the room, etc.
Answer
Assume a typical computer with a CPU clock rate of ~1 GHz. This means it can generate an output byte sequence at ~$10^9$ bytes/s, which corresponds to a minimum entropy production of ~$10^{-13}$ J/(K·s) (counting $k_B \ln 2$ of entropy per output bit). Meanwhile, the power consumption of a typical CPU is ~100 W, which at room temperature corresponds to an actual entropy production of ~0.3 J/(K·s).
So (minimum $\Delta S$) / (actual $\Delta S$) $\sim 10^{-13}$.
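As a sanity check, here is a minimal sketch of this estimate in Python; the clock rate, output rate, power draw, and room temperature are the rough assumptions made above, not measured values.

```python
import math

k_B = 1.380649e-23       # Boltzmann constant, J/K

# Rough assumptions from the estimate above (not measured values)
output_rate_bits = 8e9   # ~1 GHz clock, ~1 byte of fresh output per cycle
power = 100.0            # W, typical CPU power draw
T_room = 300.0           # K, room temperature

# Minimum entropy production: k_B ln 2 per bit irreversibly produced/erased
min_entropy_rate = output_rate_bits * k_B * math.log(2)   # ~1e-13 J/(K*s)

# Actual entropy production: dissipated power dumped into the room
actual_entropy_rate = power / T_room                      # ~0.3 J/(K*s)

print(f"minimum dS/dt: {min_entropy_rate:.1e} J/(K*s)")
print(f"actual  dS/dt: {actual_entropy_rate:.1e} J/(K*s)")
print(f"ratio:         {min_entropy_rate / actual_entropy_rate:.1e}")  # ~3e-13
```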
This calculation is not quite right, because it is hard to pin down what the actual output of a computer is: in most cases, previous output is fed back in as later input. The calculation above also assumes that all output is continuously written to some external device.
A better point of view is that each gate taking two inputs to one output (AND, OR, NAND, ...) must discard one bit, dumping its entropy into the surroundings as heat. By Landauer's principle this costs at least $k_B T \ln 2$ per erased bit; summed over all gate operations per second, this gives the minimum heating rate $W$ required to process information in a classical computer. In this sense we may define the efficiency as $e = W/Q$, where $Q$ is the actual heat generated per second.
The efficiency depends on how many such logic gates fire per clock cycle; guessing fewer than a thousand at a typical ~1 GHz clock rate, $e \approx 10^{-11}$.
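A minimal sketch of this gate-level estimate, again in Python; the figure of ~1000 bit erasures per clock cycle is the guess made above, not a property of any real CPU.

```python
import math

k_B = 1.380649e-23        # Boltzmann constant, J/K

# Assumed figures (guesses, as in the estimate above)
clock_rate = 1e9          # Hz
erasures_per_cycle = 1e3  # two-input gates discarding one bit each
power = 100.0             # W, actual heat generation per second (Q)
T_room = 300.0            # K

# Landauer bound: each erased bit dissipates at least k_B * T * ln 2 of heat
landauer_energy = k_B * T_room * math.log(2)             # ~3e-21 J per bit
W = clock_rate * erasures_per_cycle * landauer_energy    # minimum heat per second

efficiency = W / power
print(f"minimum heat rate W: {W:.1e} W")          # ~3e-9 W
print(f"efficiency e = W/Q:  {efficiency:.1e}")   # ~3e-11
```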
This means our computer has very low efficiency as an information processor, though it probably works well as a heater. The theoretical minimum energy requirement is also hard to verify experimentally because of the high accuracy required.