Can we find the exponential radioactive decay formula from first principles? It's always presented as an empirical result, rather than one you can derive. I've looked around on the internet, but can't really find any information about how to calculate it from first principles. I've seen decay rate calculations in Tong's QFT notes for toy models, but never an actual physical calculation, so I was wondering whether it's possible, and if so, whether someone could link me to the result.
Answer
If you want to be very nitpicky about it, the decay will not be exponential. The exponential approximation breaks down both at small times and at long times:
At small times, perturbation theory dictates that the amplitude of the decay channel will increase linearly with time, which means that the probability of decay at small times is only quadratic, and the survival probability is slightly rounded near $t=0$ before going down as $e^{-t/\tau}$ (a short expansion making this quantitative is sketched below). This should not be surprising, because the survival probability is time-reversal invariant and should therefore be an even function of $t$.
At very long times, there are bounds on how fast the bound state amplitude can decay which are essentially due to the fact that the hamiltonian is bounded from below, and which I demonstrate in detail below.
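To make the small-time statement concrete, here is the standard textbook expansion, sketched with the notation that is set up properly further down (and with $\hbar=1$ throughout): writing $|\varphi⟩$ for the initial state, $A(t)=⟨\varphi|e^{-iHt}|\varphi⟩$ for its survival amplitude, and $⟨\cdot⟩$ for expectation values in $|\varphi⟩$, then for a state with finite energy moments $$ A(t)=1-it⟨H⟩-\tfrac{t^2}{2}⟨H^2⟩+O(t^3), $$ so that $$ P(t)=|A(t)|^2=1-\left(⟨H^2⟩-⟨H⟩^2\right)t^2+O(t^4)\equiv 1-\frac{t^2}{\tau_Z^2}+O(t^4), $$ where the timescale $\tau_Z$, set by the energy uncertainty of the initial state, is usually called the Zeno time. The decay probability therefore starts off quadratically in $t$, not linearly.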
Both of these regimes are very hard to observe experimentally. At short times, you usually need very good time resolution and the ability to prepare your system essentially instantaneously. At long times, you probably wouldn't need to go that far out, but it is typically very hard to get a good signal-to-noise ratio, because the exponential decay has pretty much killed all your systems, so you need very large populations to really see this.
However, both sorts of deviations can indeed be observed experimentally. At long times, the first observation is
Violation of the Exponential-Decay Law at Long Times. C Rothe, SI Hintschich and AP Monkman. Phys. Rev. Lett. 96, 163601 (2006); Durham University eprint.
(To emphasize the difficulty of these observations: they had to follow an unstable system for over 20 lifetimes to see the deviations from the exponential, by which time only $\sim10^{-9}$ of the population remains.) For short times, the first observations are
Experimental evidence for non-exponential decay in quantum tunnelling. SR Wilkinson et al. Nature 387 no. 6633 p.575 (1997). UT Austin eprint,
which measured tunnelling of sodium atoms inside an optical lattice, and
Observation of the Quantum Zeno and Anti-Zeno Effects in an Unstable System. MC Fischer, B GutiƩrrez-Medina and MG Raizen. Phys. Rev. Lett. 87, 040402 (2001), UT Austin eprint (ps).
To be clear, the survival probability of a metastable state is, for all practical purposes, exponential. It's only with a careful experiment - with large populations over very long times, or with very fine temporal control - that you can observe these deviations.
So much for the experiments; now for the actual argument. Consider a system initialized at $t=0$ in the state $|\psi(0)⟩=|\varphi⟩$ and left to evolve under a time-independent hamiltonian $H$. At time $t$, the survival amplitude is, by definition, $$ A(t)=⟨\varphi|\psi(t)⟩=⟨\varphi|e^{-iHt}|\varphi⟩ $$ and the survival probability is $P(t)=|A(t)|^2$. (Note, however, that this is a reasonable but loaded definition; for more details see this other answer of mine.)

Suppose that $H$ has a complete eigenbasis $|E,a⟩$, where $E$ is the energy and the extra index $a$ denotes the eigenvalues of a set $\alpha$ of operators which, together with $H$, form a CSCO, so you can write the identity operator as $$1=\int\mathrm dE\,\mathrm da\,|E,a⟩⟨E,a|.$$ If you plug this into the expression for $A(t)$ you can easily bring it into the form $$ A(t)=\int \mathrm dE\, B(E)e^{-iEt},\quad\text{where}\quad B(E)=\int \mathrm da\, |⟨E,a|\varphi⟩|^2. $$ Here it's easy to see that $B(E)\geq0$ and $\int B(E)\,\mathrm dE=1$, so $B(E)$ needs to be pretty nicely behaved, and in particular it is in $L^1$ over the energy spectrum.
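To get oriented, here is a standard textbook example (added as an illustrative sketch, not part of the argument that follows): if the spectral density were a Lorentzian over the whole real line, $$ B(E)=\frac{\Gamma/2\pi}{(E-E_0)^2+\Gamma^2/4}, $$ then closing the contour in the lower half of the complex $E$ plane for $t>0$ picks up only the pole at $E=E_0-i\Gamma/2$, giving $$ A(t)=e^{-iE_0t}e^{-\Gamma|t|/2} \quad\Longrightarrow\quad P(t)=e^{-\Gamma|t|}, $$ i.e. exact exponential decay at all times. The catch is that no physical $B(E)$ can actually look like this, because it cannot extend over the whole real line.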
This is where the energy spectrum comes in. In any actual physical theory, the spectrum of the hamiltonian needs to be bounded from below, so there is a minimal energy $E_\text{min}$, set to $0$ for convenience, below which the spectrum has no support. This looks quite innocent, and it lets us refine our expression for $A(t)$ into the harmless-looking $$ A(t)=\int_{0}^\infty \mathrm dE\, B(E)e^{-iEt}.\tag1 $$ As it turns out, though, this restriction is already enough to rule out a strictly exponential asymptotic decay $e^{-t/\tau}$.
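If you want to see this concretely, here is a minimal numerical sketch (mine, not taken from any of the references below; the Lorentzian line shape, the cutoff at $E=0$, and all parameter values are arbitrary illustrative choices). It evaluates the integral $(1)$ by brute-force quadrature and compares the resulting survival probability with the pure exponential that the same Lorentzian would give on the full real line:

```python
# Numerical sketch: survival probability from eq. (1) for a Lorentzian
# spectral density cut off at E = 0, compared with pure exponential decay.
# All numbers here (E0, Gamma, grid sizes) are arbitrary illustrative choices.
import numpy as np

def trapezoid(y, x):
    """Composite trapezoidal rule for samples y on the grid x."""
    return np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0

E0, Gamma = 10.0, 1.0                  # resonance position and width (arbitrary units)
E = np.linspace(0.0, 200.0, 200001)    # energy grid, restricted to E >= 0
B = (Gamma / (2 * np.pi)) / ((E - E0) ** 2 + Gamma ** 2 / 4)
B /= trapezoid(B, E)                   # renormalize B(E) after the cutoff

for t in [0.0, 5.0, 10.0, 20.0, 30.0, 40.0]:
    A = trapezoid(B * np.exp(-1j * E * t), E)   # A(t) = int_0^inf B(E) e^{-iEt} dE
    print(f"t = {t:4.1f}   P(t) = {abs(A) ** 2:.3e}   exp(-Gamma t) = {np.exp(-Gamma * t):.3e}")
```

With these (arbitrary) numbers the two columns track each other for the first dozen or so lifetimes, after which the numerical $P(t)$ flattens out into a much slower power-law tail - exactly the kind of long-time deviation described above.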
The reason for this behaviour is that, in this form, $A(t)$ is analytic in the lower half of the complex-$t$ plane. To see this, consider a complex time $t\in\mathbb C^-$, for which \begin{align} |A(t)| & =\left|\int_0^\infty B(E)e^{-iEt}\mathrm dE\right| \leq\int_0^\infty \left| B(E)e^{-iEt}\right|\mathrm dE =\int_{0}^\infty \left| B(E)\right|e^{+E \,\mathrm{Im}(t)}\mathrm dE \\ & \leq\int_{0}^\infty \left| B(E)\right|\mathrm dE=1, \end{align} since $E\geq0$ and $\mathrm{Im}(t)<0$. This means that the integral $(1)$ exists for all $t$ with $\mathrm{Im}(t)\leq 0$ and, because of its form, that it is analytic in $t$ in the interior of that region.
This is nice, but it is also damning, because analytic functions are very restricted in how they can behave. In particular, $A(t)$ grows exponentially in the direction of increasing $\mathrm{Im}(t)$ and decays exponentially in the direction of decreasing $\mathrm{Im}(t)$. This means that its behaviour along $\mathrm{Re}(t)$ should in principle be something like oscillatory, though you can get away with something like a decay. What you cannot get away with, however, is exponential decay along both directions of $\mathrm{Re}(t)$ - that is simply no longer compatible with the demands of analyticity.
The way to make this precise is to use something called the Paley-Wiener theorem, which in this specific setting demands that $$ \int_{-\infty}^\infty \frac{\left|\ln|A(t)|\right|}{1+t^2}\mathrm dt<\infty. $$ That is, of course, a wonky integral if anyone ever saw one, but you can see that if $A(t)\sim e^{-|t|/\tau}$ for large $|t|$ (and $|A(t)|$ must be an even function of $t$, by time-reversal symmetry), then the integral above (only just) diverges. There's more one can say about why this happens, but for me the bottom line is: analyticity demands some restrictions on how fast $A(t)$ can decay along the real axis, and when you do the calculation, this turns out to be it.
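To spell out that last claim (a quick back-of-the-envelope check): if $|A(t)|\sim e^{-|t|/\tau}$ as $|t|\to\infty$, then $\left|\ln|A(t)|\right|\sim|t|/\tau$, so the integrand behaves as $$ \frac{\left|\ln|A(t)|\right|}{1+t^2}\sim\frac{|t|/\tau}{1+t^2}\sim\frac{1}{\tau|t|}, $$ whose integral diverges, but only logarithmically. A slower asymptotic decay, say a power law $|A(t)|\sim|t|^{-\alpha}$, gives $\left|\ln|A(t)|\right|\sim\alpha\ln|t|$ and the integral converges, so the theorem forbids exponential tails while leaving power-law tails perfectly acceptable.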
(For those wondering: yes, this bound is saturated. The place to start digging is the Beurling-Malliavin theorem, but I can't promise it won't be painful.)
For more details on the proofs and the intuition behind this stuff, see my MathOverflow question The Paley-Wiener theorem and exponential decay and Alexandre Eremenko's answer there, as well as the paper
L. Fonda, G. C. Ghirardi and A. Rimini. Decay theory of unstable quantum systems. Rep. Prog. Phys. 41, pp. 587-631 (1978). §3.1 and 3.2.
from which most of this stuff was taken.