Sunday, October 30, 2016

Is there conservation of information during quantum measurement?


Consider the following experiment. I take a spin-$\frac{1}{2}$ particle and make a $\sigma_x$ measurement (measure the spin in the $x$ direction), then make a $\sigma_y$ measurement, then another $\sigma_x$ one, then $\sigma_y$, and so on for $n$ measurements. The formalism of quantum mechanics tells us that the outcomes of these measurements will be random and independent. I now have a string of completely random bits, of length $n$. Briefly, my question is where does the information in this string come from?
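For concreteness, here is a minimal numerical sketch of this measurement sequence, assuming standard textbook projective measurements and simulating them with numpy (the `measure` helper, the random seed, and the choice of initial state are just illustrative):

```python
import numpy as np

# Pauli matrices
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)

def measure(state, observable, rng):
    """Projectively measure `observable` on `state`; return (outcome, collapsed state)."""
    eigvals, eigvecs = np.linalg.eigh(observable)
    probs = np.abs(eigvecs.conj().T @ state) ** 2   # Born rule
    k = rng.choice(len(eigvals), p=probs)
    return eigvals[k], eigvecs[:, k]

rng = np.random.default_rng(0)
state = np.array([1, 0], dtype=complex)   # start in the sigma_z "up" state
bits = []
for i in range(20):                        # n = 20 alternating measurements
    obs = sx if i % 2 == 0 else sy
    outcome, state = measure(state, obs, rng)
    bits.append(0 if outcome > 0 else 1)
print(bits)                                # a string of independent random bits
```

Each run produces a uniformly random, independent bit string: every $\sigma_x$ eigenstate is an equal-weight superposition of the $\sigma_y$ eigenstates and vice versa, so each outcome is 50/50 regardless of the previous one.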


The obvious answer is "quantum measurements are fundamentally indeterministic and it's simply a law of physics that you get a random string when you do that". The problem I have with this is that it can be shown that unitary evolution of quantum systems conserves von Neumann entropy, just as Hamiltonian evolution of a classical system conserves Shannon entropy. In the classical case this can be interpreted as "no process can create or destroy information on the microscopic level." It seems like the same should be true for the quantum case as well, but this seems hard to reconcile with the existence of "true" randomness in quantum measurement, which does seem to create information.
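To spell out why unitary evolution conserves the von Neumann entropy: for any unitary $U$ the spectral theorem gives $f(U\rho U^\dagger)=U f(\rho) U^\dagger$, so by cyclicity of the trace

$$S(U\rho U^\dagger) = -\mathrm{Tr}\!\left[U\rho U^\dagger \log\!\left(U\rho U^\dagger\right)\right] = -\mathrm{Tr}\!\left[U\,\rho\log\rho\,U^\dagger\right] = -\mathrm{Tr}\!\left[\rho\log\rho\right] = S(\rho).$$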


It's clear that there are some interpretations for which this isn't a problem. In particular, for a no-collapse interpretation the Universe just ends up in a superposition of $2^n$ states, each containing an observer looking at a different output string.


But I'm not a big fan of no-collapse interpretations, so I'm wondering how other quantum interpretations cope with this. In particular, in the "standard" interpretation (by which I mean the one that people adhere to when they say quantum mechanics doesn't need an interpretation), how is the indeterminacy of measurement reconciled with the conservation of von Neumann entropy? Is there an interpretation other than no-collapse that resolves this tension particularly well?


Addendum


It seems worth summarising my current thinking on this, and having another go at making clear what I'm really asking.


I want to start by talking about the classical case, because only then can I make it clear where the analogy seems to break down. Let's consider a classical system that can take on one of $n$ discrete states (microstates). Since I don't initially know which state the system is in, I model the system with a probability distribution.


The system evolves over time. We model this by taking the vector $p$ of probabilities and multiplying it by a matrix $T$ at each time step, i.e. $p_{t+1} = Tp_t$. The discrete analogue of Hamiltonian dynamics turns out to be the assumption that $T$ is a permutation matrix, i.e. it has exactly one 1 in each row and each column, and all its other entries are 0. (Note that permutation matrices are a subset of unitary matrices.) It turns out that, under this assumption, the Gibbs entropy (aka Shannon entropy) $H(p)$ does not change over time.
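As a quick numerical check of this claim (a numpy sketch; the particular distribution and permutation are arbitrary illustrative choices):

```python
import numpy as np

def shannon_entropy(p):
    """H(p) = -sum_i p_i log2 p_i, with 0 log 0 = 0."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(2)
n_states = 5
p = rng.random(n_states)
p /= p.sum()                               # a random probability distribution

# A random permutation matrix T (a special case of a unitary matrix)
T = np.eye(n_states)[rng.permutation(n_states)]

print(shannon_entropy(p))       # e.g. 2.1...
print(shannon_entropy(T @ p))   # identical: a permutation only relabels the states
```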


(It's also worth mentioning, as an aside, that instead of representing $p$ as a vector, I could choose to represent it as a diagonal matrix $P$, with $P_{ii}=p_i$. It then looks a lot like the density matrix formalism, with $P$ playing the role of $\rho$ and $T$ being equivalent to unitary evolution.)
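A small sketch of that correspondence (again numpy, with arbitrary illustrative matrices):

```python
import numpy as np

rng = np.random.default_rng(3)
p = rng.random(4)
p /= p.sum()                               # a probability distribution over 4 states
T = np.eye(4)[rng.permutation(4)]          # permutation "dynamics"
P = np.diag(p)                             # diagonal-matrix representation of p

# Evolving P as T P T^T reproduces the evolution p -> T p on the diagonal,
# mirroring the quantum rule rho -> U rho U^dagger.
print(np.allclose(np.diag(T @ P @ T.T), T @ p))   # True
```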



Now let's say I make a measurement of the system. We'll assume that I don't disturb the system when I do this. For example, let's say the system has two states, and that initially I have no idea which of them the system is in, so $p=(\frac{1}{2},\frac{1}{2})$. After my measurement I know what state the system is in, so $p$ will become either $(1,0)$ or $(0,1)$ with equal probability. I have gained one bit of information about the system, and $H(p)$ has decreased by one bit. In the classical case the information gained and the reduction in entropy are always equal, unless the system interacts with some other system whose state I don't precisely know (such as, for example, a heat bath).
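Spelling out the entropy bookkeeping for this two-state example (taking $0\log_2 0 = 0$):

$$H\!\left(\tfrac{1}{2},\tfrac{1}{2}\right) = -\tfrac{1}{2}\log_2\tfrac{1}{2}-\tfrac{1}{2}\log_2\tfrac{1}{2} = 1\ \text{bit},\qquad H(1,0) = -1\log_2 1 - 0\log_2 0 = 0\ \text{bits},$$

so the one bit of information I gain exactly matches the one-bit drop in $H(p)$.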


Seen from this point of view, the change in von Neumann entropy when a quantum measurement is performed is not surprising. If entropy just represents a lack of information about a system then of course it should decrease when we get some information. In the comments below, where I refer to "subjective collapse" interpretations, I mean interpretations that try to interpret the "collapse" of a wavefunction as analogous to the "collapse" of the classical probability distribution as described above, and the von Neumann entropy as analogous to the Gibbs entropy. These are also called "$\psi$-epistemic" interpretations.


But there's a problem, which is this: in the experiment described at the beginning of this question, I'm getting one bit of information with every measurement, but the von Neumann entropy is remaining constant (at zero) instead of decreasing by one bit each time. In the classical case, "the total information I have gained about the system" + "uncertainty I have about the system" is constant, whereas in the quantum case it can increase. This is disturbing, and I suppose what I really want to know is whether there's any known interpretation in which this "extra" information is accounted for somehow (e.g. perhaps it could come from thermal degrees of freedom in the measuring apparatus), or in which it can be shown that something other than the von Neumann entropy plays a role analogous to the Gibbs entropy.
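To make the mismatch concrete, here is the measurement sequence from the top of the question with the entropy bookkeeping added (the same numpy sketch and textbook-measurement assumptions as before):

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)

def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]                          # 0 log 0 = 0
    return float(-np.sum(evals * np.log2(evals)) + 0.0)   # + 0.0 normalises -0.0

rng = np.random.default_rng(4)
state = np.array([1, 0], dtype=complex)                   # start in the sigma_z "up" state
for i in range(10):
    obs = sx if i % 2 == 0 else sy                        # alternate sigma_x, sigma_y
    eigvals, eigvecs = np.linalg.eigh(obs)
    probs = np.abs(eigvecs.conj().T @ state) ** 2         # Born rule
    k = rng.choice(2, p=probs)
    state = eigvecs[:, k]                                 # projective collapse
    rho = np.outer(state, state.conj())                   # post-measurement density matrix
    print(i + 1, von_neumann_entropy(rho))                # bits gained so far vs. entropy
```

After every measurement the state is a pure $\sigma_x$ or $\sigma_y$ eigenstate, so $S(\rho)$ stays at zero, while the record of outcomes grows by one bit per step.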



