Forgive me if this topic is too much in the realm of philosophy. John Baez has an interesting perspective on the relative importance of dimensionless constants such as $\alpha$, which he calls fundamental, versus dimensionful constants like $G$ or $c$ [ http://math.ucr.edu/home/baez/constants.html ]. What is the relative importance or significance of one class versus the other, and is this an area in which physicists have real concerns or on which they expend significant research effort?
Answer
First of all, the question you are asking is very important, and it is one you can master completely.
Dimensionful constants are those that have units, like $c, \hbar, G$, or even $k_{\rm Boltzmann}$ or $\epsilon_0$ in SI. The units, such as the meter, kilogram, second, ampere, and kelvin, were chosen partly arbitrarily. They are the results of random cultural accidents in the history of mankind. A second was originally chosen as 1/86,400 of a solar day, one meter as 1/40,000,000 of the average meridian, one kilogram as the mass of 1/1,000 of a cubic meter (one liter) of water, or later the mass of a randomly chosen prototype, one ampere so that $4\pi \epsilon_0 c^2$ is a simple power of 10 in SI units, and one kelvin as 1/100 of the difference between the melting and boiling points of water.
Clearly, the circumference of the Earth, the solar day, a platinum prototype brick in a French castle, or the phase transitions of water are not among the most "fundamental" features of the Universe. There are lots of other ways in which the units could have been chosen. Someone could choose 1.75 meters, an average man's height, as his unit of length (people throughout history have even used their feet to measure distances) and he could still call it "one meter". It would be his meter. In those units, the numerical value of the speed of light would be different.
Dimensionless products or ratios of powers of the fundamental constants are, by definition, exactly those that don't carry any units, which means that they are independent of all the random cultural choices of units. So all civilizations in the Universe - despite the absence of any interactions between them in the past - will agree about the numerical value of the proton-electron mass ratio, about 1836.15, which happens to lie close to $6\pi^5 \approx 1836.12$ (the formula is just a numerological teaser I noticed when I was 10!), and about the fine-structure constant, $\alpha \approx 1/137.036$, and so on.
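As a quick numerical check of that teaser (the mass-ratio value below is the CODATA figure, quoted only to a few decimal places):

```python
import math

# CODATA value of the dimensionless proton-electron mass ratio,
# quoted here to a few decimal places.
mass_ratio = 1836.15267

# The numerological "teaser" from the text: 6 * pi^5.
teaser = 6 * math.pi ** 5

print(f"m_p / m_e = {mass_ratio}")
print(f"6 * pi^5  = {teaser:.5f}")  # ~1836.11811
print(f"relative mismatch = {abs(teaser - mass_ratio) / mass_ratio:.1e}")
```

The two numbers agree to about two parts in a hundred thousand - close enough to amuse a ten-year-old, but with no known physical significance.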
In the Standard Model of particle physics, there are about 19 such dimensionless parameters that "really" determine the character of physics. All other constants, such as $\hbar, c, G, k_{\rm Boltzmann}, \epsilon_0$, depend on the choice of units. In fact, the number of independent units (meter, kilogram, second, ampere, kelvin) is exactly large enough that all of these constants may be set equal to one, which simplifies all the fundamental equations of physics in which they appear frequently. By changing the value of $c$, one only changes social conventions (what the units mean), not the laws of physics.
The units in which all these constants are numerically equal to 1 are called Planck units or natural units, and Max Planck understood that this was the most natural choice already a century ago. $c=1$ is set in any "mature" analysis that involves special relativity; $\hbar=1$ is used everywhere in "adult" quantum mechanics; $G=1$ or $8\pi G=1$ is sometimes used in gravity research; $k_{\rm Boltzmann}=1$ is used whenever thermal phenomena are studied microscopically, at a professional level; $4\pi\epsilon_0$ is just an annoying factor that may be set to one (and in the 19th-century Gaussian units, such things actually are set to one, with a different treatment of the $4\pi$ factor). Instead of one mole in chemistry, physicists (researchers in a more fundamental discipline) simply count the molecules or atoms, knowing that a mole is just a package of $6.022\times 10^{23}$ atoms or molecules.
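Planck's observation can be made concrete: from the SI values of $c$, $\hbar$, $G$, and $k_{\rm Boltzmann}$, the unique combinations with the dimensions of length, time, mass, and temperature come out as follows (a minimal sketch using approximate CODATA values):

```python
import math

# SI values of the dimensionful constants (CODATA, approximate).
c    = 2.99792458e8      # speed of light, m/s (exact by definition)
hbar = 1.054571817e-34   # reduced Planck constant, J*s
G    = 6.67430e-11       # Newton's constant, m^3 kg^-1 s^-2
kB   = 1.380649e-23      # Boltzmann constant, J/K (exact by definition)

# Planck units: the unique combinations of c, hbar, G (and kB)
# with the dimensions of length, time, mass and temperature.
l_P = math.sqrt(hbar * G / c**3)   # Planck length, m
t_P = math.sqrt(hbar * G / c**5)   # Planck time, s
m_P = math.sqrt(hbar * c / G)      # Planck mass, kg
T_P = m_P * c**2 / kB              # Planck temperature, K

print(f"Planck length      ~ {l_P:.3e} m")   # ~1.6e-35 m
print(f"Planck time        ~ {t_P:.3e} s")   # ~5.4e-44 s
print(f"Planck mass        ~ {m_P:.3e} kg")  # ~2.2e-8 kg
print(f"Planck temperature ~ {T_P:.3e} K")   # ~1.4e32 K
```

In these units, all the dimensionful clutter disappears from the fundamental equations, and only the dimensionless parameters discussed below remain.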
The 19 (or 20?) actual dimensionless parameters of the Standard Model may be classified as: the three gauge couplings $g_1,g_2,g_3$ of the $U(1)\times SU(2)\times SU(3)$ gauge group (analogues of the fine-structure constant); the Higgs vacuum expectation value divided by the Planck mass (the only quantity that introduces a mass scale, and this mass scale only distinguishes different theories once we also take gravity into account); and the Yukawa couplings to the Higgs that determine the quark and lepton masses and their mixing. One should also count the strong CP angle of QCD and a few others.
Once you choose a modified Standard Model that accounts for the fact that the neutrinos are massive and oscillate, 19 is lifted to about 30. New physics of course inflates the number: supersymmetry with soft SUSY breaking adds about 105 parameters in the minimal model.
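The bookkeeping behind those counts can be sketched as follows. This is one common convention; the exact tallies shift a little depending on how the Higgs sector and the neutrino phases are treated:

```python
# A bookkeeping sketch of the usual counting of the Standard Model's
# dimensionless parameters (one common convention; counts vary slightly
# with how the Higgs sector and neutrinos are treated).
sm_parameters = {
    "gauge couplings g1, g2, g3":        3,
    "quark Yukawas (6 masses)":          6,
    "charged-lepton Yukawas (3 masses)": 3,
    "CKM mixing (3 angles + 1 phase)":   4,
    "Higgs sector (vev + quartic)":      2,
    "strong CP angle theta_QCD":         1,
}
total = sum(sm_parameters.values())
print(f"Standard Model total: {total}")  # 19

# With massive (Majorana) neutrinos: 3 masses, 3 PMNS mixing angles,
# 1 Dirac phase and 2 Majorana phases.
neutrino_extension = 3 + 3 + 1 + 2
print(f"With neutrino masses: {total + neutrino_extension}")  # 28, i.e. "about 30"
```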
The original 19 parameters of the Standard Model may be expressed in terms of more "fundamental" parameters. For example, $\alpha$ of electromagnetism is not terribly fundamental in high-energy physics, because electromagnetism and the weak interactions get unified at higher energies, so it's more natural to calculate $\alpha$ from the couplings $g_1,g_2$ of the $U(1)\times SU(2)$ gauge group. Also, these couplings $g_1,g_2,g_3$ run - that is, they depend on the energy scale approximately logarithmically. Values such as $1/137$ for the fine-structure constant are the low-energy values, but the high-energy values are actually more fundamental, because the fundamental laws of physics are those describing very short-distance physics, while long-distance (low-energy) physics is derived from them.
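The logarithmic running can be illustrated with the standard one-loop formula $1/\alpha(\mu) = 1/\alpha(\mu_0) - (b/2\pi)\ln(\mu/\mu_0)$. The numbers below are a toy illustration, assuming only a single unit-charge Dirac fermion contributes ($b=4/3$); this is not the full Standard Model running:

```python
import math

# One-loop running of an inverse coupling, the textbook form
#   1/alpha(mu) = 1/alpha(mu0) - (b / (2*pi)) * ln(mu / mu0),
# where the coefficient b depends on the gauge group and matter content.
def alpha_inverse(mu, mu0, alpha_inv_0, b):
    """Inverse coupling at scale mu, given its value 1/alpha at mu0."""
    return alpha_inv_0 - (b / (2 * math.pi)) * math.log(mu / mu0)

# Toy illustration: QED with only the electron contributing (b = 4/3 for
# one unit-charge Dirac fermion), starting from 1/alpha = 137.036 at the
# electron mass.  This is NOT the full Standard Model running.
mu0 = 0.511e-3                # electron mass in GeV
for mu in (1.0, 91.2, 1e16):  # 1 GeV, roughly the Z mass, ~GUT scale
    print(f"mu = {mu:9.3e} GeV  ->  1/alpha = "
          f"{alpha_inverse(mu, mu0, 137.036, 4.0 / 3.0):.2f}")
```

In the real Standard Model, every charged particle contributes above its mass threshold, which is why the measured $1/\alpha$ near the $Z$ mass is about 129 rather than the single-fermion toy value printed above.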
I mentioned that the number of dimensionless parameters increases if you add new physics such as SUSY with soft breaking. However, more complete, unifying theories - such as grand unified theories and especially string theory - also imply various relations between the previously independent constants, so they reduce the number of independent dimensionless parameters of the Universe. Grand unified theories basically set $g_1=g_2=g_3$ (with the right factor of $\sqrt{3/5}$ added to $g_1$) at their characteristic "GUT" energy scale; they may also relate certain Yukawa couplings.
String theory is a perfectionist in this job. In principle, all dimensionless continuous constants may be calculated from any stabilized string vacuum, so all continuous uncertainty may be removed by string theory; one may actually prove that this is the case. There is nothing to continuously adjust in string theory. However, string theory comes with a large discrete class of stabilized vacua, which is at most countable, and possibly finite but large. Still, if there are $10^{500}$ stabilized semi-realistic stringy vacua, there are only about 500 digits to adjust (after which you may predict everything to any accuracy, in principle), while the Standard Model with its 19 continuous parameters has 19 times infinity of digits to adjust according to experiments.