To Avoid the Implications of Cosmic Fine-Tuning, a Continuing Quest
Remember, the multiverse is the currently favored prophylactic against the theistic implications of cosmic fine-tuning. So if the following sounds a bit abstruse, remember what's at stake.
Under the headline "At Multiverse Impasse, a New Theory of Scale," Natalie Wolchover at Quanta Magazine reports that, as the solution to a "nasty impasse," some researchers now propose to revise ideas in physics that seemingly could not be more basic. Perhaps, we learn, "the fundamental description of the universe does not include the concepts of 'mass' and 'length,' implying that at its core, nature lacks a sense of scale." The notion is called "scale symmetry."
This passage is telling:
As the logical conclusion of prevailing assumptions, the multiverse hypothesis has surged in begrudging popularity in recent years. But the argument feels like a cop-out to many, or at least a huge letdown. A universe shaped by chance cancellations eludes understanding, and the existence of unreachable, alien universes may be impossible to prove. "And it's pretty unsatisfactory to use the multiverse hypothesis to explain only things we don't understand," said Graham Ross, an emeritus professor of theoretical physics at the University of Oxford.
The multiverse ennui can't last forever.
Perhaps a new theory of "agravity" is in order, then, short for "adimensional gravity" and described in a paper by Alberto Salvio and Alessandro Strumia. We queried physicist and ENV contributor Dr. Rob Sheldon, currently consulting with NASA's Marshall Space Flight Center, for his take:
Okay, you asked for it. I'll try to summarize the agravity paper, beginning with the problems we are trying to solve.
The "Standard Model" right now has three outstanding mysteries: 1) "dark matter," which supplies most (roughly 85 percent) of the gravitating matter in the universe; 2) "dark energy," a weak "antigravity" that has come to dominate the expansion in the current epoch; and 3) the "inflaton" that stretched the universe in the first fraction of a second so as to smooth everything out.
Now when adding in the cosmological constant (the antigravity dark-energy term in Einstein's equations), the naïve calculation using dimensional analysis is wrong by a factor of about 10^120, which is greater than William Dembski's "maximum probability bound." For the sake of comparison, the fine balance between the Big Bang energy and the mass of the universe is about 1 part in 10^60. This "fine tuning" feature is what necessitated the "inflaton."
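Sheldon's 10^120 figure can be reproduced with a back-of-the-envelope calculation (a sketch using standard textbook values for the Planck scale and the dark-energy density, not numbers taken from the article; depending on exactly where the cutoff is placed, the mismatch comes out between roughly 10^120 and 10^123):

```python
import math

# Planck-scale quantities (approximate CODATA values; illustrative inputs,
# not taken from the article itself)
E_planck = 1.956e9        # Planck energy, joules
l_planck = 1.616e-35      # Planck length, metres

# Naive dimensional-analysis estimate of the vacuum energy density:
# one Planck energy per Planck volume.
rho_naive = E_planck / l_planck**3          # J/m^3

# Observed dark-energy density: roughly 70% of the critical density
rho_crit = 8.5e-10                          # J/m^3, approximate
rho_observed = 0.7 * rho_crit               # J/m^3

# Orders of magnitude by which the naive estimate overshoots observation
mismatch = math.log10(rho_naive / rho_observed)
print(f"naive estimate is off by ~10^{mismatch:.0f}")
```

With these inputs the overshoot lands near 10^123; the oft-quoted "10^120" is the same discrepancy stated as a round number under a slightly different cutoff convention.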
Many theorists have proposed ways to resolve the dark energy problem. Most solutions involve "accidental cancellations" of fundamental forces/particles/terms that reduce the vacuum energy by 10120. The favorite cancellation model involved "supersymmetric particles" that are just above the Higgs boson mass and never live long enough to be seen, but whose existence provides that cancellation term.
The Higgs boson discovery last year at 125 GeV was at the low end of predictions, and SUSY (supersymmetry) particles were not seen all the way up to 7 TeV or so, the upper range of the Large Hadron Collider at the time. (It's down for upgrades right now.) This lack of confirmation has trashed 95 percent of all SUSY theories, and theorists are in a panic. A less-favored solution was "axions," which involve a spin-2 graviton if I remember correctly, but searches for them have also turned up negative. Proposals totaling some $100M aim to push the axion search to fill in any gaps, but the theory is not very elegant, nor is it going to solve the "dark matter" problem. And if you're going to spend $100M, you want to solve more problems than you create.
This paper -- agravity -- wants to go back to a rejected theory about cancellations and pretty it up as a third candidate for the dark energy. It doesn't even attempt to solve the dark matter problem. Instead, it proposes to solve the "inflaton" problem, so it addresses problems #2 and #3 only.
Remember, the "problem" everyone is addressing in #2 and #3 is "fine tuning," whereas #1 is an observational problem.
Here's some relevant text from the arXiv paper:
Section 6: ...This alternative understanding of the Higgs mass hierarchy problem relies on the smallness of some parameters. All parameters assumed to be small are naturally small, just like the Yukawa coupling of the electron, y_e ≈ 10^-6, is naturally small. These small parameters do not receive unnaturally large quantum corrections. No fine-tuned cancellations are necessary. ...
As a final comment, we notice that accidental global symmetries (a key ingredient of axion models) are a natural consequence of the dimensionless principle. In the usual scenario, ad hoc model building is needed in order to suppress explicit breaking due to mass terms or non-renormalizable operators.
So what are they claiming? That if you make their set of "natural" assumptions, the other guy's ad hoc fine-tuning assumption goes away. In other words, their assumptions are better than his assumptions based on some aesthetic principle.
I didn't count carefully, but "their assumptions" involved at least three separate coupling constants, quadratic approximations, smallness criteria, and an unobserved "ghost" graviton -- all to remove the fine-tuning problem from the Standard Model. They are following in the footsteps of the "inflaton" theories -- expanding sets of hypotheses to avoid some embarrassing metaphysics.
I do wish all these theorists would apply Bayesian methods to their theories. It might help them think more clearly.
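A toy example of what such Bayesian bookkeeping does (hypothetical numbers, purely illustrative; nothing here comes from the agravity paper): a model carrying an extra tunable parameter pays an automatic "Occam penalty" in its marginal likelihood, because its fit to the data must be averaged over everything the parameter could have been.

```python
import math

# An observed quantity x_obs, measured with uncertainty sigma
# (hypothetical data, for illustration only)
x_obs, sigma = 1.0, 0.1

def gauss(x, mu, s):
    """Normal probability density at x with mean mu, std dev s."""
    return math.exp(-0.5 * ((x - mu) / s) ** 2) / (s * math.sqrt(2 * math.pi))

# Model A: no free parameters; it predicts x = 1.0 outright.
evidence_A = gauss(x_obs, 1.0, sigma)

# Model B: one tunable parameter mu with a flat prior on [-50, 50].
# Its evidence is the likelihood averaged over the whole prior range,
# so the unused parameter volume dilutes the fit (the Occam factor).
lo, hi, n = -50.0, 50.0, 100_000
dmu = (hi - lo) / n
prior = 1.0 / (hi - lo)
evidence_B = sum(
    gauss(x_obs, lo + (i + 0.5) * dmu, sigma) * prior for i in range(n)
) * dmu

bayes_factor = evidence_A / evidence_B
print(f"Bayes factor A:B ≈ {bayes_factor:.0f}")
```

With these numbers the factor works out to roughly the prior width divided by sigma·√(2π), about 400:1 in favor of the parameter-free model, even though both fit the datum equally well at their best point. That is the kind of accounting that quietly charges a theory for every coupling constant it is free to dial.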
Fortunately we have Dr. Sheldon to do some clear thinking on their behalf.