I'm not familiar (yet) with how information theory emerges or is used in QM/QFT, but I was thinking about this question:
While we have the Heisenberg uncertainty principle for measurements of conjugate observables, can we express it through some more fundamental or abstract concept, like an information uncertainty (especially since we have an information conservation principle, analogous to those for energy and momentum, with an associated symmetry, i.e. CPT)? In the sense that, because there are always some details we cannot measure or know, can we express that as some kind of information uncertainty?
And regardless of the answer, please explain why there is or isn't a connection between the two concepts.
Answer
Lubos' answer is correct: information is not an observable, so it does not have fluctuations in the sense that could enter an uncertainty relation. However, there does exist a relationship between 'information' and the uncertainty principle, although not of the type the OP seems to expect.
First of all, note that 'information conservation' could never be an explanation for the uncertainty principle. Information is not a conserved quantity in quantum mechanics, since measurements are part of the formalism. Measurements, by definition, produce a discontinuous change in the information content of a system with respect to an observer. It is important to remember that, despite its current fashionable status as a paradigm for understanding physics, information is still not a physical property of a system. Rather, it is a property of the relationship between an observer and a system. The only subtlety is that quantum mechanics places a fundamental restriction on the amount of information that can be gained by any observer.
To understand how to make this restriction quantitative, you will need to learn a bit of quantum estimation theory. I won't derive it all here; you can find details in reviews such as, for example, this paper. The basic idea is that if you want to estimate some parameter $\lambda$ on which a state depends, which may or may not be an "observable" in the traditional sense, your precision will be limited by the Cramér-Rao bound: $$\mathrm{Var}(\lambda) \geq \frac{1}{M F(\lambda)},$$ where $\mathrm{Var}(\lambda)$ is the variance of any unbiased estimator of $\lambda$ constructed from the measurement outcomes, $M$ is the number of measurements and $F(\lambda)$ is the so-called Fisher Information. This is a result from classical information theory.
Given a system and a parameter to be estimated, the Fisher Information generally depends on the choice of measurements. In the quantum case, one can do even better and show that the Fisher Information is bounded from above by the Quantum Fisher Information $H(\lambda)$, so the quantum Cramér-Rao bound reads $$\mathrm{Var}(\lambda) \geq \frac{1}{M H(\lambda)}.$$ The quantum Fisher information gives the absolute upper bound on the amount of information that an observer can gain about the parameter $\lambda$ by measuring the system. It is the Fisher information corresponding to the optimal measurement basis.
How does this relate to the uncertainty principle? Specialise to the particular case of a system in a pure state, where the parameter dependence is produced by the unitary transformation $$ |\psi(\lambda)\rangle = U_{\lambda}|\psi(0)\rangle,$$ where $$U_{\lambda} = e^{-\mathrm{i} \lambda G}, $$ and $G$ is the Hermitian generator of the unitary transformation. This includes scenarios such as energy-time uncertainty, where $G = \hat{H}$ is the Hamiltonian (the generator of time translations) and $\lambda = t$ is the waiting time after initial preparation of the state $|\psi(0)\rangle$ (I set $\hbar = 1$). In this case the quantum Fisher information reduces to $H(\lambda) = 4\left(\langle\psi(0)|G^2|\psi(0)\rangle - \langle\psi(0)|G|\psi(0)\rangle^2\right) \leq 4\langle\psi(0)|G^2|\psi(0)\rangle$, so the quantum Cramér-Rao bound implies $$ \mathrm{Var}(\lambda)\, \langle \psi(0)| G^2 |\psi(0)\rangle \geq \frac{1}{4 M}, $$ which is exactly an uncertainty relation. Note that this example is slightly artificial: the uncertainty relations are more general than this scenario. However, hopefully this example gives you a flavour of how uncertainty relations can be linked to concepts from information theory. (It also shows that energy-time uncertainty relations don't require a lot of hand-waving to derive, as some people seem to believe.)
Another subtle connection that is worth mentioning is a very deep fact about quantum mechanics: "information gain implies disturbance". This means that it is impossible to gain some information about a system without disturbing it. The more information gained, the greater the disturbance. See this paper for more info. If you take the information-disturbance trade-off as a fundamental principle for quantum mechanics, such as in this recent paper, then you have a heuristic way of understanding the physical origin of the uncertainty principle.