Tuesday, September 17, 2019

quantum mechanics - Maximum theoretical data density


Our ability to store data on or in physical media continues to grow, with the maximum amount of data you can store in a given volume increasing exponentially from year to year. Storage devices keep getting smaller while their capacities keep getting bigger.



This can't continue forever, though, I would imagine. "Things" can only get so small; but what about information? How small can a single bit of information be?


Put another way: given a limited physical space -- say 1 cubic centimeter -- and without assuming more dimensions than we currently have access to, what is the maximum amount of information that can be stored in that space? At what point does the exponential growth of storage density come to such a conclusive and final halt that we have no reason to even attempt to increase it further?



Answer



The answer is given by the covariant entropy bound (CEB), also referred to as the Bousso bound after Raphael Bousso, who first suggested it. The CEB sounds very similar to the holographic principle (HP) in that both relate the dynamics of a system to what happens on its boundary, but the similarity ends there.


The HP suggests that the physics (specifically supergravity, or SUGRA) in a $d$-dimensional spacetime can be mapped to the physics of a conformal field theory living on its $(d-1)$-dimensional boundary.


The CEB is more along the lines of the Bekenstein-Hawking entropy formula, which says that the entropy of a black hole is proportional to the area of its horizon:


$$ S = \frac{k_\mathrm{B} A}{4\, l_\mathrm{pl}^2} $$

where $l_\mathrm{pl}$ is the Planck length.


To cut a long story short, the maximum information that you can store in $1\ \mathrm{cm^3} = 10^{-6}\ \mathrm{m^3}$ of space is proportional to the area of its boundary. For a spherical volume, that area is of order:


$$ A \sim V^{2/3} = 10^{-4}\ \mathrm{m^2}$$
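For an actual sphere the exact relation is $A = (36\pi)^{1/3} V^{2/3} \approx 4.8\, V^{2/3}$, so dropping the prefactor costs less than one order of magnitude. A quick sketch to check this (plain Python, not part of the original answer):

```python
import math

V = 1e-6  # volume of 1 cm^3 in m^3

# Exact surface area of a sphere of volume V:
# V = (4/3) pi r^3  =>  r = (3V / 4pi)^(1/3),  A = 4 pi r^2
r = (3 * V / (4 * math.pi)) ** (1 / 3)
A_exact = 4 * math.pi * r ** 2

# Order-of-magnitude estimate used above: A ~ V^(2/3)
A_estimate = V ** (2 / 3)

print(A_exact)     # ~4.8e-4 m^2
print(A_estimate)  # 1e-4 m^2
```

The ratio is $(36\pi)^{1/3} \approx 4.8$, which is negligible at the order-of-magnitude level we care about here.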


Therefore the maximum information (number of bits) you can store is approximately given by:



$$ S \sim \frac{A}{A_\mathrm{pl}} $$


where $A_\mathrm{pl}$ is the Planck area, $\sim 10^{-70}\ \mathrm{m^2}$. For our $1\ \mathrm{cm^3}$ volume this gives $ S_\mathrm{max} \sim 10^{66} $ bits.
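As a numeric sanity check (a sketch assuming the standard value $l_\mathrm{pl} \approx 1.6 \times 10^{-35}\ \mathrm{m}$; not part of the original answer):

```python
l_pl = 1.616e-35          # assumed Planck length in metres
A_pl = l_pl ** 2          # Planck area, ~2.6e-70 m^2
A = 1e-4                  # boundary area of the 1 cm^3 volume, from A ~ V^(2/3)

S_max = A / A_pl          # maximum information in bits, up to order-one factors
print(f"{S_max:.1e}")     # a few times 10^65, i.e. ~10^66 at order of magnitude
```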


Of course, this is a rough order-of-magnitude estimate, but it lies in the general ballpark and gives you an idea of the limit that you are talking about. As you can see, we still have decades if not centuries before our technology can saturate this bound!




Edit: Thanks to @mark for pointing out that $1\ \mathrm{cm^3} = 10^{-6}\ \mathrm{m^3}$ and not $10^{-9}\ \mathrm{m^3}$. Since $A \sim V^{2/3}$, this changes the final result by two orders of magnitude.


On Entropy and Planck Area


In response to @david's observations in the comments let me elaborate on two issues.




  1. Planck Area: From LQG (loop quantum gravity), and also from string theory, we know that geometric observables such as area and volume are quantized in a theory of gravity. This result holds at the kinematical level and is independent of what the actual dynamics are. The quantum of area, as one would expect, is of the order of $l_\mathrm{pl}^2$, where $l_\mathrm{pl}$ is the Planck length. In quantum gravity the dynamical entities are precisely these area elements, to each of which one associates a spin variable $j$; the smallest value is $j = 1/2$, the lowest nontrivial rep of SU(2). Each spin can carry a single qubit of information. Thus it is natural to associate a Planck area with a single unit of information.





  2. Entropy as a measure of Information: There is a great misunderstanding in the physics community regarding the relationship between entropy $S$ – usually described as a measure of disorder – and useful information $I$, such as that stored on a chip, an abacus or any other device. However, they are one and the same. I remember being laughed out of a physics chat room once for saying this, so I don't expect anyone to take this at face value.




But think about this for a second (or two). What is entropy?


$$ S = k_\mathrm B \ln(N) $$


where $k_\mathrm B$ is Boltzmann's constant and $N$ is the number of microstates of the system. For a gas in a box, for example, $N$ counts the number of different ways to distribute the molecules in the given volume. If we could actually use a gas chamber as an information storage device, then each of these configurations would correspond to a unit of memory. Or consider a spin chain with $m$ spins. Each spin can take two (classical) values, $\pm 1/2$. Using each spin to represent a bit, we see that a spin chain of length $m$ can encode $2^m$ different numbers. The corresponding entropy (in units where $k_\mathrm B = 1$) is:


$$ S \sim \ln(2^m) = m \ln(2) \sim \textrm{number of bits} $$


since we have identified each spin with a bit (more precisely, a qubit). Therefore we can safely say that the entropy of a system is proportional to the number of bits required to describe it, and hence to its storage capacity.
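The spin-chain count above can be verified in a few lines (a toy sketch; the chain length $m = 10$ is an arbitrary choice for illustration):

```python
import math

m = 10  # number of two-state spins in the chain (arbitrary example)

# Number of distinct classical configurations of m spins
N = 2 ** m

# Entropy in units where k_B = 1
S = math.log(N)

# Dividing by ln 2 recovers the number of bits, which equals m
bits = S / math.log(2)
print(N, bits)  # 1024 spins-configurations, ~10 bits
```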


