In non-relativistic QM, the $\Delta E$ in the time-energy uncertainty principle is the limiting standard deviation of the set of energy measurements of $n$ identically prepared systems as $n$ goes to infinity. What does the $\Delta t$ mean, since $t$ is not even an observable?
Answer
Let a quantum system with Hamiltonian $H$ be given, and suppose it occupies a pure state $|\psi(t)\rangle$ evolving under that Hamiltonian. For any observable $\Omega$ we use the shorthand $$ \langle \Omega \rangle = \langle \psi(t)|\Omega|\psi(t)\rangle. $$ One can show (see eq. 3.72 in Griffiths, *Introduction to Quantum Mechanics*) that for any observable $\Omega$ with no explicit time dependence $$ \sigma_H\sigma_\Omega\geq\frac{\hbar}{2}\left|\frac{d\langle \Omega\rangle}{dt}\right|, $$ where $\sigma_H$ and $\sigma_\Omega$ are the standard deviations $$ \sigma_H^2 = \langle H^2\rangle-\langle H\rangle^2, \qquad \sigma_\Omega^2 = \langle \Omega^2\rangle-\langle \Omega\rangle^2, $$ and the angled brackets denote expectation values in $|\psi(t)\rangle$. It follows that if we define $$ \Delta E = \sigma_H, \qquad \Delta t = \frac{\sigma_\Omega}{|d\langle\Omega\rangle/dt|}, $$ then we obtain the desired uncertainty relation $$ \Delta E\, \Delta t \geq \frac{\hbar}{2}. $$

It remains to interpret the quantity $\Delta t$. It is the approximate amount of time it takes for the expectation value of the observable $\Omega$ to change by one standard deviation, given that the system is in a pure state evolving under $H$. To see this, note that if $\Delta t$ is small enough that $d\langle\Omega\rangle/dt$ is roughly constant over the interval, then in a time $\Delta t$ we have $$ |\Delta\langle\Omega\rangle| =\left|\int_t^{t+\Delta t} \frac{d\langle \Omega\rangle}{dt'}\,dt'\right| \approx \left|\frac{d\langle \Omega\rangle}{dt}\right|\Delta t = \sigma_\Omega. $$
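To make the interpretation concrete, here is a minimal numerical sketch (not part of the original answer) for a two-level system with the illustrative choices $H = \tfrac{\hbar\omega}{2}\sigma_z$, $\Omega = \sigma_x$, initial state $(|0\rangle+|1\rangle)/\sqrt{2}$, and units with $\hbar = 1$. It computes $\sigma_H$, $\sigma_\Omega$, and $d\langle\Omega\rangle/dt$ (via the Ehrenfest relation $d\langle\Omega\rangle/dt = \tfrac{i}{\hbar}\langle[H,\Omega]\rangle$) and checks $\Delta E\,\Delta t \geq \hbar/2$.

```python
import numpy as np

# Hypothetical two-level example: H = (hbar*w/2) sigma_z, Omega = sigma_x,
# initial state (|0> + |1>)/sqrt(2).  Units with hbar = 1.
hbar, w = 1.0, 2.0
sz = np.array([[1, 0], [0, -1]], dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
H = 0.5 * hbar * w * sz
Omega = sx
psi0 = np.array([1, 1], dtype=complex) / np.sqrt(2)

def expval(A, psi):
    """<psi|A|psi> for Hermitian A (real part only)."""
    return np.real(np.vdot(psi, A @ psi))

for t in [0.3, 0.7, 1.1]:
    # Evolve the pure state: |psi(t)> = exp(-iHt/hbar)|psi(0)>  (H is diagonal here)
    U = np.diag(np.exp(-1j * np.diag(H) * t / hbar))
    psi = U @ psi0

    sigma_H = np.sqrt(expval(H @ H, psi) - expval(H, psi) ** 2)
    sigma_O = np.sqrt(expval(Omega @ Omega, psi) - expval(Omega, psi) ** 2)

    # Ehrenfest relation: d<Omega>/dt = (i/hbar) <[H, Omega]> for time-independent Omega
    dO_dt = np.real(1j / hbar * np.vdot(psi, (H @ Omega - Omega @ H) @ psi))

    dt_char = sigma_O / abs(dO_dt)  # Delta t: time for <Omega> to shift by sigma_O
    print(f"t={t:.1f}  Delta E * Delta t = {sigma_H * dt_char:.4f}  >=  hbar/2 = {hbar / 2}")
```

For this particular choice of state and observable the bound is saturated: $\langle\sigma_x\rangle(t) = \cos\omega t$, so $\Delta t = 1/\omega$ at every generic time, and $\Delta E\,\Delta t = \hbar/2$ exactly.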