Saturday, August 12, 2017

More questions on string theory and the standard model


This is a follow-up question to How does string theory reduce to the standard model?


Ron Maimon's answer there clarified to some extent what can be expected from string theory, but left open details that, in my opinion, are needed to justify his narrative. Let me state what his answer suggested to me:


Based on some indirect evidence, it is conjectured that
(i) one day, string theory will produce, in one of its many possible low-energy approximations, a theory equivalent to the standard model or a slight variation of it;
(ii) string theory cannot predict that the universe must be in this low-energy approximation, as the actual low-energy approximation is a random result of a stochastic dynamics, and hence as unpredictable as the masses and distances of the sun and planets in the solar system;
(iii) in particular, the phenomenological input needed (roughly corresponding to boundary conditions in a classical theory) includes ''the qualitative structure of the standard model, plus the SUSY, plus say 2-decimal place data on 20 parameters''. From this input, it is conjectured that string theory determines the infinitely accurate values of all parameters of the standard model.



Support for these conjectures is a matter of belief. To me, none of the three statements is so far plausible:
(i) can be supported only by finding such an approximation.
What other finding could support this conclusion?


(ii) is quite different from what we see at lower energies. For example, while we cannot predict the shape of a particular bunch of water, we can derive all macroscopic (i.e., low-energy) properties of water completely from microscopic principles and a few constants defining the microscopic laws (the masses of H, O, and the electron, and the fine structure constant). And the parameters of the standard model define in principle the properties of all nuclei (though we do not yet have good enough numerical methods to compute them).
What makes string theory unique in that not only the particular low-energy states but also the low-energy dynamical laws are random results?


(iii) could only be substantiated if one could show, at least for some of the many low-energy approximations, that knowing some numbers to low precision fixes all coupling constants in the theory to arbitrary precision.
Is there such a case, so that one could study the science behind this claim?


Also, (iii) is somewhat disappointing, as it means that most of the potential insight an underlying theory could provide must already be assumed. Quite unlike Newton's theory, which is indispensable for deriving the complete motion of the planets from a few dozen constants, or quantum chemistry and statistical mechanics, which are indispensable for deriving all properties of macroscopic materials from a few dozen constants, string theory does not provide a similar service for elementary particle theory: quantum field theory already contains all the machinery needed to draw the empirical consequences from ''the qualitative structure of the standard model, plus the SUSY, plus say 4-decimal place data on 33 parameters''.



Answer



When Newton's mechanics was new, people expected a theory of the solar system to produce better descriptions for the stuff left unexplained by Ptolemy: the equant distances, the main-cycle periods, and the epicycle locations. Newton's theory didn't do much there--- it just traded in the Ptolemaic parameters for the orbital parameters of the planets. But the result predicted the distances to the planets (in terms of the astronomical unit), and to the sun, and these distances could be determined by triangulation. Further, the theory explained the much later observation of stellar aberration, and gave a value for the speed of light. If your idea of what a theory should predict was blinkered by having been brought up in a Ptolemaic universe, you might have considered the Kepler/Newton theory observationally inconsequential, since it did not modify the Ptolemaic values for the observed locations of the planets in any deep way.



The points you bring up in string theory are similar. String theory tells you that you must relate the standard model to a microscopic configuration of extra dimensions and geometry, including perhaps some matter branes and almost certainly some orbifolds. These predictions are predicated on knowing the structure of the standard model, much like the Newtonian model is predicated on the structure of the Ptolemaic one. But the result is that you get a complete self-consistent gravitational model with nothing left unknown, so the number of predictions is vastly greater than the number of inputs, even in the worst-case scenario you can imagine.


The idea that we will have $10^{40}$ standard-model-like vacua with different values of the electron mass, muon mass, and so on, is extremely pessimistic. I was taking this position to show that even in this worst of all possible worlds, string theory is predictive. It is difficult to see how you could make so many standard-model-like vacua, when we have such a hard time constructing just one.


There are some predictions that we can make from string theory while knowing hardly anything at all about the vacuum, just from the observation of a high Planck scale and the general principles of degree-of-freedom counting. These predictions are generally weak. For example, if we find that the Higgs is in a technicolor sector, with a high Planck scale, and the technicolor gauge group is SO(1000) or SP(10000), it is flat out impossible to account for this using string theory. The size of the gauge group in a heterotic compactification is two E8s, and no bigger. In order to violate this bound on the number of gauge generators, you need a lot of branes in some kind of type II theory, and then you won't be able to stabilize the compactification at a small enough scale, because all the gauge flux from the branes will push the volume to be too big, so that the GUT scale will fall too low, and the problems of large extra dimensions will reappear.
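To make the counting concrete (a back-of-the-envelope illustration, not part of the original argument): $E_8$ has $248$ generators, so the heterotic $E_8 \times E_8$ gauge group provides at most $2 \times 248 = 496$ of them, while

$$\dim SO(N) = \frac{N(N-1)}{2}, \qquad \dim SO(1000) = 499500,$$

so a technicolor gauge group of that size would overshoot the heterotic bound by three orders of magnitude.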


In a similar vein, if you discover a new ultra-weak gauge field in nature, like a new charge that protons carry but electrons don't, you will falsify string theory. You can't make small charges without small masses, and this constraint is the most stringent of several model constraints on low-energy strings in the swampland program.
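The constraint meant here is presumably what the swampland literature calls the weak gravity conjecture: in its simplest form, a $U(1)$ gauge field with coupling $g$ must come with a charged particle of mass

$$m \lesssim g\, M_{\mathrm{Pl}},$$

so pushing the charge toward zero drags a light mass scale into the spectrum along with it.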


These types of things are too general--- they rule out stuff that nobody seriously proposes (although at least one group in the 90s did propose ultra-weak gauge charges as a way to stabilize the proton). But it is notable that what we observe at low energies are relatively small gauge groups, compared to what could exist theoretically, appearing in generational echoes of a relatively limited number of representations, all of the most basic sort. This is the kind of stuff that naturally appears in compactified string theory, without any fine adjustments.


Still, in order to make real predictions, you need to know the details of the standard model vacuum we live in, the shape of the compactification and all the crud in it. The reason we don't know yet is mostly because we are limited in our ability to explore non-supersymmetric compactifications.


When the low-energy theory is supersymmetric, it often has parameters, moduli, which you can vary while keeping the theory supersymmetric. These are usually geometrical things; the most famous example is the size of the compactification circles in a toroidal compactification of type II strings. In such a vacuum, there would be parameters that we would need to determine experimentally. But these universes are generally empty. If you put nonsupersymmetric random stuff into a toroidal compactification, it is no longer stable. Our universe has many dimensions already collapsed, and only a few dimensions growing quickly.
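To illustrate how such a modulus would show up experimentally (this is the standard textbook formula, not anything specific to this answer): for strings on a circle of radius $R$, the closed-string spectrum contains momentum and winding towers

$$m^2 = \left(\frac{n}{R}\right)^2 + \left(\frac{w R}{\alpha'}\right)^2 + \text{oscillator terms},$$

so the undetermined value of $R$ would feed directly into measurable masses.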


The cosmological constant in our universe is an important clue, because it is, as far as we can see, just a miraculous cancellation among QCD pion-condensate energy, QCD gluon-condensate energy, Higgs condensate energy, zero-point energy density in the low-energy fields, dark-matter field zero-point energy (and any dark-matter condensates), and some energy which comes from the Planckian configuration of the extra dimensions. If the low cosmological constant is accidental to 110 decimal places, which I find extremely unlikely, the observation of its close-to-zero value gives 110 decimal places of model-restricting data. It is much more likely that the cosmological constant is cancelled out at the Higgs scale, or perhaps it is an SO(16)xSO(16) type thing, where the cosmological constant is so close to zero because the theory is an orbifold-like projection of a close-to-SUSY theory. In this case, you might have only 20 decimal places of data from the cosmological constant, or even fewer.
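For a rough sense of where a number like 110 decimal places could come from (my own round-number estimate, not from the original answer): the observed vacuum energy density is about $(2 \times 10^{-3}\,\mathrm{eV})^4$, so measured against a Planck-scale density $(10^{19}\,\mathrm{GeV})^4$ the cancellation is one part in roughly $10^{120}$, while against a GUT-scale density $(10^{16}\,\mathrm{GeV})^4$ it is one part in roughly $10^{110}$; the decimal count depends on the scale at which one assumes the cancellation takes place.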


To answer your specific concerns:


for i: we can make string compactifications that contain the MSSM emerging from an SO(10) or SU(5) GUT at low energies, with no exotic matter. Although these types of compactifications are generally still too supersymmetric, they have the right number of generations, and the right gauge groups and particle content. These ideas are found here: http://arxiv.org/abs/hep-th/0512177 , and the general scheme is based on the 1985 work of Candelas, Horowitz, Strominger, and Witten on supersymmetric geometric compactifications of the heterotic string theories of Gross, Harvey, Martinec, and Rohm.



Kachru and collaborators in the last decade explained how you can break SUSY at low energies using only gauge fluxes in the extra dimensions. Orbifold-type constructions can even break SUSY at high energy, leaving only non-SUSY stuff; the classic example is the SO(16)xSO(16) strings of Alvarez-Gaume, Ginsparg, Moore, Vafa (see here: http://arxiv.org/abs/hep-th/9707160 ). This suggests very strongly that we can find the standard model in a natural compactification, and, what is more, that we can find several different embeddings.


for ii--- the answer is not so different from in other theories. The low-energy "laws" are not immutable laws: they can be modified by knocking the extra-dimensional stuff around. There are no immutable field theory laws in string theory--- the field theory is an effective field theory describing the fluctuations of a moderately complex system, about as complicated as a typical non-biological high-Tc superconducting ceramic. So the laws are random only inasmuch as the crud at high energy can be rearranged consistently, which is not that much.


for iii--- you must remember that I am talking about the worst-case scenario. We have a lot of clues in the standard model, like the small electron mass, and the SUSY (or lack thereof) that we will (or will not) find at the LHC. These clues are qualitative things that cut down the number of possibilities drastically. It seems very unlikely to me that we will have to do a computer-aided search through $10^{40}$ vacua, but if push comes to shove, we should be able to do that too.


Historical note about Ptolemy, Aristarchus, Archimedes and Apollonius


To be strictly honest about the history I incidentally mentioned, I should say that I believe the preponderance of the evidence suggests that Aristarchus, Archimedes, and Apollonius developed a heliocentric model with elliptical orbits, or perhaps only off-center circular orbits with nonuniform motion, already in the 3rd century BC, but they couldn't convince anybody else, precisely because the theory couldn't really make new predictions with measurements that were available at the time, and it made counterintuitive and, to some denominations, heretical predictions: that the Earth was moving frictionlessly through a Democritus-style void. The reason one should believe this about those ancient folks is that we know for sure, from the Sand Reckoner, that Archimedes was a fan of Aristarchus's heliocentric model, and that Apollonius and Archimedes felt a strong motivation to make a detailed study of conic sections--- they knew what we would today call the defining algebraic equations of the parabola, the ellipse, and the hyperbola. It was Apollonius who introduced the idea of epicycle and deferent, I believe as an Earth-centered approximation to a nonuniform conic orbit in a heliocentric model. It is certain to my mind that Apollonius, a contemporary of Archimedes, was a heliocentrist.


Further, Ptolemy's deferent/epicycle/equant system is explained in the Almagest with an introduction which is replete with an anachronistic, Heisenberg-like positivism: Ptolemy notes that his system is observationally equivalent to some other models, which is an obvious reference to the heliocentric model, but that we can't determine distances to the planets, so we will never know which model is right in an abstract sense, so we might as well use the more convenient model which treats the Earth as stationary, and makes orbits relative to the Earth. The Ptolemaic deferent/equant/epicycle model is elegant. If you only heard about it by hearsay, you would never know this--- there were no "epicycles upon epicycles" in Ptolemy; only Copernicus did stuff like that, and then only so as to match Ptolemy in accuracy while using only uniform circular orbits centered on the sun, not off-center circles with equants, or area-law ellipses. Ptolemy's system can be obtained by taking a heliocentric model with off-center circular orbits and an equal-areas law, putting your finger on the Earth, and demanding that everything revolve around the Earth. This back-derivation from a preexisting heliocentric off-center circle model is more plausible to me than saying that Ptolemy came up with it from scratch, especially since Apollonius is responsible for the basic idea.


Barring an unlikely discovery of new surviving works by Archimedes, Apollonius, or Aristarchus, one way to check whether this idea is true is to look for clues in ancient texts--- for a mention of off-center circles in the heliocentric model, or of nonuniform circular motion. Aristotle is too early--- he died before Aristarchus was active--- but his school continued, and might have opposed, in later texts, the scientific ideas floating around.

