It is generally accepted that the radius of convergence of perturbation series in quantum field theory is zero.
For example, 't Hooft writes in "Quantum Field Theory for Elementary Particles":
"The only difficulty is that these expansions will at best be asymptotic expansions only; there is no reason to expect a finite radius of convergence.”
Or Jackiw in "The Unreasonable Effectiveness of Quantum Field Theory":
"Quantum field theoretic divergences arise in several ways. First of all, there is the lack of convergence of the perturbation series, which at best is an asymptotic series."
The main argument for this comes from Dyson (F. J. Dyson, "Divergence of perturbation theory in quantum electrodynamics", Phys. Rev. 85 (1952) 631–632).
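Dyson's point is that if the QED series in $e^2$ converged for any positive coupling, it would also converge for small negative $e^2$, where the vacuum is unstable, so the radius of convergence must be zero and the series is at best asymptotic. As a toy illustration of that kind of behaviour (this is not the QED series, just a standard model of a factorially divergent asymptotic expansion), one can compare the partial sums of $\sum_n (-1)^n n!\, g^n$ with the function $F(g) = \int_0^\infty e^{-t}/(1+gt)\,dt$ that they approximate:

```python
import math
from scipy.integrate import quad

# Toy model of an asymptotic series with zero radius of convergence
# (illustration only; this is NOT the QED perturbation series):
#   F(g) = \int_0^\infty e^{-t} / (1 + g*t) dt  ~  sum_n (-1)^n n! g^n
# The coefficients grow like n!, so the series diverges for every g != 0,
# yet its truncations approximate F(g) well when g is small.

g = 0.2
exact, _ = quad(lambda t: math.exp(-t) / (1.0 + g * t), 0.0, math.inf)

partial = 0.0
for n in range(20):
    partial += (-1) ** n * math.factorial(n) * g ** n
    print(f"order {n:2d}: partial sum = {partial: .4f}   |error| = {abs(partial - exact):.2e}")

# The error shrinks until roughly n ~ 1/g, then the factorial growth of the
# terms takes over and the partial sums blow up: asymptotic, but divergent.
```

Running this with a smaller $g$ pushes the optimal truncation order higher and the best achievable error down (roughly like $e^{-1/g}$), which is the sense in which a divergent series can still be numerically very accurate.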
Now, in "Can We Make Sense Out of Quantum Chromodynamics?" t' Hooft introduced a transformation (now called t' Hooft transformation, or t' Hooft renormalization scheme) that replaces the coupling constant $g$ with $g_R$ such that $\beta(g_R) =a_1 g_R+ a_2 g_R^2$. In other words, the beta function for the new parameter $g_R$ has only two terms and not infinitely many as the series for $g$.
Isn't this in contradiction with the claim that the radius of convergence is zero? Or are perturbation series expressed in terms of $g_R$ still divergent, even though the beta function for $g_R$ contains only two terms? If so, where can we observe the divergence after the 't Hooft transformation?