Naturalness and the LHC nightmare

Phil Gibbs wrote a nice essay,
Naturally Unnatural,
in which he discusses the annual EPS-HEP conference that just began in Stockholm and the subdued/nightmare feelings that phenomenologists may have due to the perfect agreement between the LHC and the Standard Model. He adds comments about naturalness and the multiverse.



Unnaturalness is sometimes in the eyes of the beholder.

Nima is quoted as a defender of a \(100\TeV\) collider. One possible outcome is that nothing new is found, which Nima would consider fascinating because it would be a proof of some unnaturalness in the Universe. Well, I have been defending "ever higher energies" as the most well-motivated post-LHC direction of particle physics for quite some time but I wouldn't be too thrilled by a negative result. It seems totally plausible to me – the probability is something like 30–50 percent – that the Standard Model would work even at such a higher-energy collider.




The reason is that I have never had any trouble with a modest amount of "unnaturalness". Many dimensionless quantities in physics are of order one, but there is nothing impossible about a quantity of order 0.001 or somewhat smaller.




Although this point has been analyzed many times on this blog, let me add a few words.

Unless we have a more accurate way to estimate or calculate a quantity \(g\) which is a priori between \(0\) and \(1\) – let's normalize it in this way – it's rather natural to choose the uniform prior probability distribution in which the probability that \(g\) is in the interval \((g,g+dg)\) is simply \(dp=dg\). In particular, the probability that \(g\) is smaller than some very small \(g_0\) is of order \(g_0\) itself.
That's the real reason why we don't expect the dimensionless parameters – which probably result from some more fundamental physics – to be much smaller than their "natural" value. It's all about Bayesian inference and some prior probabilities.
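
As a sanity check of the uniform-prior statement, one can draw \(g\) uniformly from \((0,1)\) and count how often it falls below a small threshold \(g_0\); here is a minimal numerical sketch (only NumPy is assumed):

```python
# Toy check of the uniform-prior argument: if g is drawn uniformly from (0, 1),
# the probability that g falls below a small threshold g0 is simply g0 itself.
import numpy as np

rng = np.random.default_rng(0)
g = rng.uniform(0.0, 1.0, size=10_000_000)

for g0 in (0.1, 0.01, 0.001):
    estimate = np.mean(g < g0)  # Monte Carlo estimate of P(g < g0)
    print(f"g0 = {g0:7.3f}:  P(g < g0) ~= {estimate:.4f}   (exact: {g0})")
```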

However, this uniform prior probability distribution is rather unrefined and naive. It reflects our ignorance. It's very clear that if we knew more about the physical phenomena that determine the value of the parameter, we could also learn that these phenomena favor low values, so that the probability distribution could be, for example\[

dp = dg\cdot (1-k)\,g^{-k},\quad 0\lt k \lt 1

\] which makes the values of \(g\) near zero much more likely. Alternatively, a better theory may also predict that the right probability distribution is\[

dp = dg\cdot \delta\left( g - \sqrt{ \frac{4\pi}{137.036} }\right)

\] i.e. a distribution that simply tells you the right value of the constant. The original uniform distribution wasn't fundamentally true in any sense. It only reflected our immediate knowledge about the system and the distribution was uniform because we had minimal knowledge about the question. It's clear that a theory that gives non-uniform distributions, especially one that produces the delta-function-like distribution, is more predictive, more refined, and unless falsified, may be much more correct and accurate than the "theory" producing the uniform distribution.
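
For the power-law prior above, the cumulative probability is \(P(g\lt g_0)=g_0^{1-k}\), which for \(k\) close to one is vastly larger than \(g_0\) itself. A small sketch that evaluates this and cross-checks it by sampling:

```python
# For the prior dp = (1 - k) * g**(-k) dg on (0, 1), the cumulative probability
# is P(g < g0) = g0**(1 - k): small values of g become far more likely than
# under the uniform prior as k approaches 1.
import numpy as np

g0 = 0.001
for k in (0.0, 0.5, 0.9):
    print(f"k = {k:3.1f}:  P(g < {g0}) = {g0 ** (1.0 - k):.3f}")

# Cross-check by sampling: inverting the CDF u = g**(1-k) gives g = u**(1/(1-k)).
rng = np.random.default_rng(1)
k = 0.9
samples = rng.uniform(size=1_000_000) ** (1.0 / (1.0 - k))
print("Monte Carlo, k = 0.9:", np.mean(samples < g0))  # ~0.50 rather than 0.001
```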



Is this animal unnatural? The main reason why most people will say Yes is just their lack of experience with it.

I used the number from the fine-structure constant because it's going to be my example. It is equal to\[

\alpha = \frac{e^2}{4\pi\epsilon_0\cdot \hbar c} \sim\frac{1}{137.036} \sim 0.007297

\] and this dimensionless number quantifies the characteristic strength of one of the most familiar fundamental interactions, the electromagnetic force. Note that the value is also "much smaller than one", so under the naive uniform prior we could say that it is rather unlikely (less than a one-percent probability). However, with some extra knowledge, we may argue that the value isn't too unnatural. Why?

First of all, we might argue that in the rationalized system of units, \(\epsilon_0\) is more natural than \(4\pi\epsilon_0\), and\[

\frac{e^2}{\epsilon_0\hbar c}\sim 0.092

\] which is already rather close to one. Moreover, you could write the QED Lagrangian in such a way that \(e\) appears to the first power. So a natural quantification of the strength of the force could be the square root of the number above\[

\frac{e}{\sqrt{\epsilon_0\hbar c}} \sim 0.303

\] which is "really" of order one. But these were high-school-level numerological games that don't reflect the actual expertise of a particle physicist well. The real reasons why it's natural for the fine-structure constant to be this small are different. In fact, it's more natural to express the strength using \(e^2\) and not \(e\); and it's more natural to divide this \(e^2\) by \(4\pi\) than not to do it. Moreover, quantum loops naturally add an extra factor of \(1/(16\pi^2)\), which makes the unnaturalness of \(\alpha\) worse, not better.
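
If you want to verify these numbers yourself, here is a quick sketch using the tabulated CODATA values in scipy.constants; it reproduces \(\alpha\), the two numerological variants above, and the loop factor \(1/(16\pi^2)\):

```python
# Checking the numbers above directly from CODATA values in scipy.constants.
from math import pi, sqrt
from scipy.constants import e, epsilon_0, hbar, c, fine_structure

alpha = e**2 / (4 * pi * epsilon_0 * hbar * c)
print("alpha =", alpha, "  1/alpha =", 1 / alpha)   # ~0.007297, ~137.036
print("tabulated fine_structure =", fine_structure)  # cross-check

print("e^2/(eps0*hbar*c)      =", e**2 / (epsilon_0 * hbar * c))   # ~0.092
print("e/sqrt(eps0*hbar*c)    =", e / sqrt(epsilon_0 * hbar * c))  # ~0.303
print("loop factor 1/(16pi^2) =", 1 / (16 * pi**2))                # ~0.0063
```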

However, in the electroweak theory, \(\alpha\) and electromagnetism are no longer fundamental. They are produced as a mixture of two interactions, those mediated by the \(SU(2)_W\) and \(U(1)_Y\) gauge fields, respectively, and the extra angle determining the mixing – the Weinberg weak angle – is another source of the potential smallness of the low-energy fine-structure constants such as the electromagnetic one.
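
To see the effect of the mixing, recall that at tree level the electromagnetic coupling obeys \(1/e^2 = 1/g^2 + 1/g'^2\), so it is automatically smaller than either of the two underlying couplings. The sketch below plugs in rough running values of \(g\) and \(g'\) near the \(Z\) mass – the numbers are only illustrative – and lands near the running value \(\alpha(m_Z)\approx 1/128\):

```python
# Sketch of how the electromagnetic coupling arises as a mixture of the
# SU(2)_W and U(1)_Y couplings: e = g * g' / sqrt(g**2 + g'**2), i.e.
# 1/e**2 = 1/g**2 + 1/g'**2.  The coupling values are rough running values
# near the Z mass, quoted only for illustration.
from math import sqrt, pi

g  = 0.652   # SU(2)_W gauge coupling (approximate, near M_Z)
gp = 0.357   # U(1)_Y hypercharge coupling (approximate, near M_Z)

e = g * gp / sqrt(g**2 + gp**2)
sin2_theta_W = gp**2 / (g**2 + gp**2)
alpha = e**2 / (4 * pi)

print("e              ~", round(e, 4))              # ~0.31, smaller than g or g'
print("sin^2(theta_W) ~", round(sin2_theta_W, 3))   # ~0.23
print("alpha(M_Z)     ~ 1 /", round(1 / alpha, 1))  # ~1/128 (running value)
```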



These muscles look a bit unnatural and I admit, they are unnatural – it is a Halloween outfit – but some people have very similar natural muscles, anyway.

Moreover, we know that \(\alpha\) had better not be too close to one – too high – because in that case, the coupling could diverge at slightly shorter distances (a Landau pole, inconsistencies of the QFT) or at slightly longer distances (confinement of a source). Neither of these behaviors qualitatively agrees with the long-range electromagnetic force as we know it – an interaction that acts equally on a wide variety of objects and scales – which is a reason to expect the electromagnetic fine-structure constant to be significantly smaller than the maximum or natural value.

The comments above are vague reflections of rudimentary particle physics. A deeper knowledge of the correct short-distance theory could allow you to say much more. So I wouldn't be surprised at all if some dimensionless parameters encoding the soft supersymmetry breaking were found to be \(0.0001\) or smaller. On the contrary, I do expect such values to appear rather often. They only look unnatural if you believe the roughest, most naive probability distribution for these parameters. Any kind of better knowledge is bound to make some "natural candidates for preferred values" and their vicinity more likely. Zero is an extremely special point in the interval and there may exist many mechanisms and/or arguments that favor it or that favor its vicinity. If you say that you want to be indefinitely impressed by the relative smallness of a parameter, you're de facto saying that you don't want to learn about a more accurate physical explanation of these parameters and the dynamics that underlies them – more accurate relative to the stupidest, roughest possible theory in which all parameters are equally likely.

When phenomenologists were solving the naturalness problem – effectively why the Higgs boson is so much lighter than the Planck scale – they focused on the theories that add new physics such as SUSY or technicolor very close to the scale of the Higgs mass. So the general belief is that any theory that makes the Higgs mass look natural must predict new particles whose mass is very close to the Higgs mass.

Even though I do realize that scalar masses may be produced or modified by pretty much any process, i.e. they are unprotected, I don't believe that this is inevitable. There are lots of examples in particle physics where one produces a small mass parameter without adding new particles that are equally light. An obvious example is the seesaw mechanism for the tiny neutrino masses. The neutrino masses that are generated in certain models are of order\[

m_\nu\sim \frac{m_{EW}^2}{m_{GUT}}

\] so they're smaller than the electroweak scale by the same factor by which the electroweak scale is smaller than the grand unified scale. Even though I appreciate that the lightness of the neutrino has been helped by the chiral symmetry, I am not aware of any no-go theorem that would say that nothing vaguely similar is possible for the Higgs mass. In general, supersymmetry links the light Higgs to a comparably light higgsino (whose mass may be naturally light due to the same chiral symmetry as the neutrino) but other principles (symmetries and/or their stringy alternatives) or more specific types of SUSY may say more than that and they may favor a very light Higgs even in the absence of a "truly comparable in mass" higgsino.
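
For a sense of the orders of magnitude, here is the one-line seesaw estimate with illustrative inputs – the electroweak scale taken as the Higgs vev, \(246\GeV\), and a GUT scale of \(2\times 10^{16}\GeV\); both choices are mine, not unique:

```python
# Order-of-magnitude seesaw estimate, m_nu ~ m_EW**2 / m_GUT, with illustrative
# choices for the two scales (in GeV).
m_EW_GeV = 246.0    # electroweak scale, taken as the Higgs vev (illustrative)
m_GUT_GeV = 2e16    # grand unified scale (illustrative)

m_nu_GeV = m_EW_GeV**2 / m_GUT_GeV
print("m_nu ~", m_nu_GeV, "GeV ~", m_nu_GeV * 1e9, "eV")
# ~3e-12 GeV ~ 3e-3 eV, in the right ballpark for the observed mass splittings.
```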

So if you imagine that the success of the Standard Model continues during or after the \(13\TeV\) LHC run that will begin in April 2015, it would mean that we have a proof of some 1% fine-tuning in Nature. That's great, but it would only be evidence that our naive, egalitarian, uniform probability distribution is no good – and rather weak evidence, for that matter. If there's still a 1% or 0.1% probability that such a smallness occurs by chance, its appearance is a 2.5–3.5-sigma signal supporting "something new". Moreover, we're not really sure what this "something new" is, not even approximately. Let me say in advance that if the LHC agrees with the Standard Model in 2015, I won't think that the multiverse will have been established; such a conclusion would seem totally premature to me.
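
To see where the 2.5–3.5-sigma range comes from, one can convert the 1% and 0.1% probabilities into Gaussian significances; the exact number depends on whether one uses the one-sided or two-sided convention, which is all this sketch does (scipy is assumed):

```python
# Converting a small probability of "smallness by chance" into a Gaussian
# significance, in both the one-sided and two-sided conventions.
from scipy.stats import norm

for p in (0.01, 0.001):
    one_sided = norm.isf(p)      # z such that P(Z > z) = p
    two_sided = norm.isf(p / 2)  # z such that P(|Z| > z) = p
    print(f"p = {p}:  one-sided ~ {one_sided:.2f} sigma, two-sided ~ {two_sided:.2f} sigma")
# Output spans roughly 2.3 to 3.3 sigma, i.e. the quoted ballpark.
```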

People have spent $10 billion and many of them expected huge new results – because a smaller fraction of them had promised some huge new results – but there has never been a reason to be really certain that new physics (aside from the Higgs) would have to appear at the LHC. The LHC may be expensive but it brings us nowhere near the fundamental scale. With the 2012 LHC data, we're just 7% closer to the Planck scale than we were before the LHC! (That's on the log scale; on the linear scale, we have made almost no progress whatsoever.) Nothing fundamental has really changed about our inability to experimentally probe Nature at the fundamental scale. And the idea that some new physical phenomena would "have to" occur at the LHC has always been more wishful thinking (plus promises used to convince everyone that a new collider had to be built) than solid, justified reasoning.
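
The "percent of the way to the Planck scale" bookkeeping is easy to redo, although the exact figure depends entirely on which pre-LHC and LHC reach scales one plugs in; the toy below uses round illustrative numbers of \(1\TeV\) and \(10\TeV\), which should not be read as official limits:

```python
# Toy bookkeeping for how far up the log-scale ladder to the Planck scale the
# LHC took us.  The two reach values are illustrative placeholders; the
# resulting percentage depends entirely on which scales one chooses to compare.
from math import log10

reach_before_GeV = 1.0e3    # assumed scale probed before the LHC (~1 TeV, illustrative)
reach_after_GeV  = 1.0e4    # assumed scale probed with the 2012 data (~10 TeV, illustrative)
planck_GeV       = 1.22e19  # Planck mass in GeV

covered   = log10(reach_after_GeV / reach_before_GeV)
total_gap = log10(planck_GeV / reach_before_GeV)
print(f"fraction of the log-scale gap covered: {covered / total_gap:.1%}")  # ~6% with these inputs
```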

So for various reasons – dark matter and some naturalness bias among them – I still think that it's slightly more likely than not that the LHC will start to see new physics in 2015. So far, it has agreed with the Standard Model almost perfectly. Since yesterday, after a pause of nearly one month, the LHC experimenters have begun to release a huge package (dozens) of new papers (see the CMS, ATLAS hyperlinks in the right sidebar of this blog). I've gone through these papers and everything agrees with the Standard Model. One of the bins somewhere showed a 2.4-sigma excess and I won't even tell you where it was – somewhere in a \((180\GeV,200\GeV)\) bin – because given the large number of channels and bins, an excess of this magnitude is very likely to appear.

Some of the tests of the Standard Model are somewhat impressive. For example, while I was writing this text, CMS tweeted that a new rare decay had been spotted.

If there's no experimental breakthrough and no theoretical revolution that immediately and convincingly changes our opinion about what is right around the corner behind the Standard Model, the status quo will simply continue whether you like it or not. In particular, some kind of supersymmetric scenario remains the most likely candidate for new physics that will be observed on a sunny day in the future. In fact, we have repeatedly argued that supersymmetry's relative odds have increased due to the "negative" LHC data so far. But without a theoretical revolution, we can't really know when the Standard Model will finally break down. It's like throwing dice: every year in which the LHC is running, you throw the dice, and when you get a 6 (or perhaps a 5 or 6), the Standard Model collapses. It may take quite some time, and it is completely natural for such things to take a while.
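
In the dice picture, if each year of running has an independent chance \(p=1/6\) of exposing new physics – a number chosen purely for illustration – the survival probability of the Standard Model after \(N\) years is \((1-p)^N\) and the expected wait is \(1/p\) years:

```python
# Toy version of the dice analogy: with an independent per-year discovery
# probability p, the Standard Model survives N years with probability (1-p)**N.
p = 1.0 / 6.0
for years in (1, 3, 5, 10):
    print(f"P(no new physics after {years:2d} years) = {(1 - p)**years:.2f}")
print("expected waiting time:", 1.0 / p, "years")
```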

The lightness of the Higgs combined with the evidence of no new particles with similar masses would strengthen – and has already strengthened – the proposal that we should think about particle physics in the multiverse framework. But I don't think that this strengthening has a chance of turning the multiverse into a "completely dominant" paradigm within a few years or a decade. Whether we like it or not, the LHC has no magic bullet to be a game-changer, and the status quo is bound to continue with minor gradual modifications if no game-changing discoveries are made.

And that's the memo.