
LHC confirms a modest stringy prediction on black holes

Comments about the mass of the smallest black holes

As David Gross said, string theory is really the Wild West of Physics. At the same time, it is the most conservative - and the only sufficiently conservative - extension of the well-established principles of quantum field theory.

A physicist who tries to find new conceptual frameworks for physics is walking a very thin tightrope: while he wants to escape the boring old waters filled with the familiar phenomena and equations, he must also avoid waters that are so new that everything would break down. It is not too surprising that there is only one viable framework that goes beyond the achievements of local quantum field theory.

For years, science haters and various assorted ideologues and liars would claim that string theory was bad because except for supersymmetry, string theory didn't predict any spectacular new effects for the LHC and similar easily doable experiments.

The reason is simple: string theory is actually the correct description of the real world and there are no qualitatively new, non-QFT effects below one TeV. So string theory must obviously say the same thing, otherwise it would be wrong. And be sure it is not wrong.




Now, when the LHC has made it clear that there are no preons, first-generation leptoquarks, second-generation leptoquarks, W' bosons, or light black holes near the electroweak scale, see
Search for Microscopic Black Hole Signatures at the Large Hadron Collider (CMS)
for the latter point, some people have made a complete U-turn and they began to - nonsensically - say that string theory was actually predicting these spectacular things to occur during the early stages of the LHC experiment. Well, it was not.

I am totally flabbergasted by the complete ignorance of the people. Even though some of them sell themselves as science journalists, many of them are still unfamiliar with totally basic facts about quantum gravity and string theory - for example with the fact that black holes can only be produced at inaccessibly high energies.

This very insight is no rocket science. For example, in The Elegant Universe, read by millions of lay readers, Brian Greene dedicates the whole Chapter 13 to black holes in string/M-theory. On the very first page, we learn that the smallest black holes deserving the name are so heavy that they can't be reached by particle accelerators:
As we try to make ever less massive black holes, however, there comes a point when they are so light and small that quantum mechanics does come into play. This happens if the total mass of the black hole is about the Planck mass or less. (From the point of view of elementary particle physics, the Planck mass is huge — some ten billion billion times the mass of a proton. From the point of view of black holes, though, the Planck mass, being equal to that of an average grain of dust, is quite tiny.) And so, physicists who speculated that tiny black holes and elementary particles might be closely related immediately ran up against the incompatibility between general relativity — the theoretical heart of black holes — and quantum mechanics. In the past, the incompatibility stymied all progress in this intriguing direction.
I will show the elementary calculation of the smallest black hole mass below. But already now, I have to scream: how is it possible that so many journalists get this elementary point upside down? For example, John Timmer at the otherwise very good Ars Technica has claimed:
Missing Black Holes Cause Trouble for String Theory
Oh, really? And there were several others. They had to confuse string theory, a theory studied by the planet's smartest people, with the nightmares of high school teacher Walter Wagner, who claimed that the LHC would eat the Earth, right?

In the first part of this text, I will describe the scaling laws for black holes in the conventional theories. At the very end, we will look at the models with large or warped extra dimensions.

Estimates of black hole parameters

As we have explained many times, black holes are the most important and most universal localized objects predicted by the general theory of relativity. They massively distort the space around them. The geometry is so curved that you obviously need the mathematics of Riemannian geometry to describe them.

However, to roughly estimate the relationship between the mass and the radius, it is actually enough to use the laws of Newtonian mechanics, which work in flat (and non-relativistic) space and time. Let me emphasize at the very beginning that this picture only leads to a correct answer to this particular question by chance. There is absolutely no reason why an obsolete and, when considered literally, wrong theory of space and time should be making the right predictions about the extreme conditions in the Universe.

With this disclaimer, let us proceed.

A black hole is an object that is so heavy that not even light has a sufficient speed to escape from its gravitational field. Other objects may fall into it, so the object is a "hole". Because no light is coming from the hole, the hole looks black. It's as simple as that. Now, imagine that the gravitational field is produced by an object of mass "M" and let us ask what is the radius "R" into which we must compress the material so that the light cannot escape.

Well, the escape speed "V" is calculated simply by equating the kinetic energy with the absolute value of the potential energy inside the gravitational field (the latter is negative inside the attractive field): that's exactly the condition that is needed for the velocity to marginally drop to zero as the object gets very far from the source of the gravitational field. If the initial velocity is just a little bit bigger, the object may escape and converge to a nonzero speed even when it's at infinity, i.e. very far from the gravitational field. So we have
(1/2) m.V^2 = GMm / R
The right hand side is the potential energy (without the minus sign): note that there is only "1/R" in the denominator; the derivative of this potential energy gives us the force, "GMm/R^2". Here, "M" is the mass of the object that creates the gravitational field and "m" is the mass of the object that probes it and tries to escape.

Clearly, we can cancel "m" - the calculation is about the accelerations only - and get the radius "R" quickly:
(1/2) V^2 = GM / R
R = 2GM / V^2
That was easy, right? Because we want the escape velocity to be the speed of light, "C", we have:
R = 2GM / C^2
This happens to be the correct formula for the radius of a neutral, Schwarzschild black hole (the radius as measured by the proper distances on the spherical event horizon). As I have emphasized, it is a complete coincidence that the simple Newtonian argument leads to exactly the right result. In other cases, you could only get the right result up to a wrong numerical coefficient; in more subtle problems, even the scaling could fail if you used a wrong theory. In reality, if you learn general relativity, the exact result is not too hard to find even in the relativistic framework. Karl Schwarzschild managed to do so while fighting against the Russians in the First World War.

To give you some idea, stellar-mass black holes have radii of a few miles. The simplest way to calculate the answer is to translate "R" and "M" to the Planck units, "10^{-35} meters" and "10^{-8} kg". In those units, the two quantities are approximately equal. The solar mass is "2 x 10^{30} kg" which is about "10^{38}" Planck masses. And indeed, "10^{38}" Planck lengths is a kilometer or so. Similarly, a nucleus-sized black hole weighs billions of tons. The black holes you may speculatively create from some energy in particle collisions have negligibly tiny radii - not far from the Planck length itself.
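If you want to check these numbers yourself, here is a minimal Python sketch (not part of the original argument) that uses standard SI values of the constants; the only formula it relies on is "R = 2GM/C^2":

```python
# Schwarzschild radius of a solar-mass black hole, and the same
# comparison redone in Planck units (standard SI constants).
G = 6.674e-11          # m^3 kg^-1 s^-2
c = 2.998e8            # m/s
m_planck = 2.176e-8    # kg
l_planck = 1.616e-35   # m
m_sun = 1.989e30       # kg

r_schwarzschild = 2 * G * m_sun / c**2
print(r_schwarzschild / 1e3)        # ~3 km

# The "Planck way": the mass in Planck masses and the radius in Planck
# lengths agree up to the factor of 2 from "R = 2GM/C^2".
print(m_sun / m_planck)             # ~9 x 10^37
print(r_schwarzschild / l_planck)   # ~1.8 x 10^38
```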

You may check that the recipe to convert "M" to "R" and vice versa has to work - even if you're not familiar with the natural Planck units. But you should get familiar with them. Without loss of generality, one may work in units where "c=hbar=1" (units for adult particle physicists) and maybe even "G=1" (Planck units). It leads to no contradiction if you set a few universal constants equal to one - it just means that previously unrelated quantities (distance, time, inverse energy) and their units become related and "convertible".

Now, classical (i.e. non-quantum) general relativity is only an OK approximation of the reality if the quantum corrections are small.

In the black hole context, a very canonical quantum correction to the laws of general relativity is the Hawking radiation. Classically, the black holes are totally stable. In the real world, the situation is different. A black hole is not quite the stable object that is implied by general relativity. Instead, it evaporates by emitting some thermal radiation. The smaller the black hole is, the higher the temperature is. In fact, the Hawking temperature is, in certain natural units (and up to a simple numerical coefficient, "1/2.pi"), equal to the "gravitational acceleration" at the black hole surface, the event horizon.
Temperature = kappa / (2.pi) = hbar.C / (4.pi.R.kB)
This is a summary of Hawking's calculation of the temperature. Up to the numerical constant, the temperature is such that the energy per degree of freedom, "kB.T" ("kB" is Boltzmann's constant), is equal to the energy of a photon whose wavelength is comparable to the black hole radius, "hbar.C/R" ("hbar" comes from "E=hf" and the frequency of radiation is "C/lambda"). It's not a shocking result that the black hole wants to emit photons whose wavelength is comparable to the black hole radius "R" itself, is it?
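For readers who prefer to plug in numbers, here is a small Python sketch of the formula above (SI constants; the prefactor is the one from Hawking's result, "hbar.C^3 / (8.pi.G.M.kB)", which equals "hbar.C / (4.pi.R.kB)" once you substitute "R = 2GM/C^2"):

```python
import math

# Hawking temperature of a Schwarzschild black hole of mass M (in kg),
# T = hbar*c^3 / (8*pi*G*M*k_B), returned in kelvins.
G, c, hbar, k_B = 6.674e-11, 2.998e8, 1.055e-34, 1.381e-23

def hawking_temperature(mass_kg):
    return hbar * c**3 / (8 * math.pi * G * mass_kg * k_B)

print(hawking_temperature(1.989e30))   # solar mass: ~6e-8 K, colder than the CMB
print(hawking_temperature(1e12))       # a billion-ton hole: ~1e11 K
```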

Note that the temperature may drop, relatively to this estimate, if the black hole is extremal or near-extremal.

Also note that as the black hole is getting lighter, the light quanta it emits are actually getting more energetic because shorter-wavelength photons carry a higher energy. How many photons or other particles does a black hole approximately emit before it disappears? Well, it's not hard to calculate this number "N". Just divide the total black hole latent energy, i.e. the black hole mass multiplied by "C^2", as "E = MC^2" tells us, by the characteristic energy of the photon we discussed in the previous paragraph, i.e. by "hbar.C/R":
N = MC^2 / (hbar.C/R) =
= MC^2 / (hbar.C^3 / 2GM) =
= 2.G.M^2 / (hbar.C)
Up to the factor of "2" that we haven't really computed accurately, it is nothing else than the squared black hole mass in the Planck units: you divide the mass "M" by the Planck mass, i.e. by "sqrt(hbar.C/G) = 2.176 x 10^{-8} kg" we mentioned previously, and square the dimensionless ratio.
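Expressed as a one-liner (a sketch only; the O(1) factor of two is dropped, as discussed):

```python
# Number of Hawking quanta emitted over the black hole's life,
# N ~ (M / m_Planck)^2, up to O(1) factors.
m_planck = 2.176e-8    # kg

def n_quanta(mass_kg):
    return (mass_kg / m_planck) ** 2

print(n_quanta(2.176e-8))   # a Planck-mass "hole": about one quantum
print(n_quanta(1.989e30))   # a solar-mass hole: ~8 x 10^75 quanta
```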

Clearly, the black hole must be heavier than the Planck mass itself - otherwise quantum mechanics predicts that it disappears after it emits less than one particle. A real black hole that can be well approximated by general relativity must be sufficiently long-lived so that the Hawking quanta are relatively small perturbations of its mass. Consequently, black holes that behave like black holes are heavier than the Planck mass.

In a similar way, one may calculate that the lifetime of a black hole of mass "M" is "M^3" in the Planck units; note that the Planck time is "sqrt(hbar.G/c^5) = 5.39 x 10^{-44} seconds". Again, if the mass of your black hole were smaller than the Planck mass, its lifetime would be too short - in fact, shorter than the time that light needs to fly from one side of the black hole to the other. This clearly can't occur so the classical theory breaks down at that point. Again, the conclusion is that the black holes must be heavier than the Planck mass i.e. larger than the Planck radius "10^{-35} meters".

If you want a derivation of the "M^3" scaling for the lifetime, I will offer you one in the Planck units only. The temperature goes like "1/R", so the Stefan-Boltzmann law tells you that the energy outflow goes like "1/R^4" per unit area. The area goes like "R^2", so the total energy outflow goes like "1/R^2" in Planck units. Divide the total mass "M" by the rate of energy decrease, "1/M^2", to get "M^3" for the lifetime.
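The same scaling, with the numbers plugged in (again a sketch only: the genuine numerical prefactor, which is of order ten thousand, is dropped here):

```python
# Evaporation lifetime scaling, tau ~ (M / m_Planck)^3 * t_Planck.
m_planck = 2.176e-8    # kg
t_planck = 5.39e-44    # s
year = 3.156e7         # s

def lifetime_seconds(mass_kg):
    return (mass_kg / m_planck) ** 3 * t_planck

print(lifetime_seconds(2.176e-8))          # Planck-mass hole: ~the Planck time
print(lifetime_seconds(1.989e30) / year)   # solar-mass hole: ~1e63 years (prefactor dropped)
```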

Higher-dimensional generalizations

The calculations above were done for a 3+1-dimensional spacetime but they may be generalized to higher-dimensional spacetimes, too. The total spacetime dimension will be called "d". First of all, we have to change the gravitational potential "GM/R" to "GM/R^{d-3}"; note that 4-3=1. This changes many exponents. In particular, the mass and the radius are no longer proportional to one another.

In the Planck units, derived from a higher-dimensional "G" by the same formulae as in 3+1 dimensions, the mass goes like "R^{d-3}". The entropy - or area of the event horizon - scales like "R^{d-2}" i.e. as "M^{(d-2)/(d-3)}". And the number of Hawking quanta that the black hole emits before it evaporates scales like the very same thing. It is no coincidence: every Hawking particle carries O(1) units of entropy or so.

I can also redo the calculation of the lifetime that was sketched in "d=4" three paragraphs above. In "d" dimensions, the temperature still goes like "1/R". The Stefan-Boltzmann law dictates the loss of energy at "1/R^d" per unit time and unit (d-2)-dimensional area. Multiply it by the area "R^{d-2}" to get the rate of loss of energy at "1/R^2" - in this form, the exponent is d-independent. However, the mass goes like "M=R^{d-3}", so the ratio is "R^{d-3+2}=R^{d-1}" which is the lifetime in "d" spacetime dimensions.
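To keep track of the exponents, here is a tiny Python sketch summarizing the d-dimensional scaling laws derived above (everything in the higher-dimensional Planck units):

```python
# Scaling exponents for a black hole of radius R in d spacetime dimensions:
# mass ~ R^(d-3), entropy ~ R^(d-2), lifetime ~ R^(d-1),
# and the entropy expressed in terms of the mass, ~ M^((d-2)/(d-3)).
def scalings(d):
    return {"mass": d - 3, "entropy": d - 2, "lifetime": d - 1,
            "entropy_vs_mass": (d - 2) / (d - 3)}

for d in (4, 5, 6, 10, 11):
    print(d, scalings(d))
```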

Models of the real world with extra dimensions

Note that if you want to describe the black holes by a 3+1-dimensional classical general relativity, there is absolutely no way to change the scaling laws above - in either direction: indeed, if the black hole is (much) larger than the Planck scale, the equations of the 3+1-dimensional relativity are guaranteed to be approximately valid because the corrections are much smaller than 100%. On the other hand, if the black hole is smaller, it's guaranteed that already the first quantum corrections we understand - such as the Hawking radiation - make the classical description inapplicable. Obviously, the transition from "GR works" to "GR fails" is gradual.

The only way you can get a different estimate for the minimal black hole mass is to change the classical theory by which the black hole should be described. And the only way you can replace the 3+1-dimensional general relativity by another theory that passes the tests of relativity and that behaves as 3+1-dimensional relativity at long distances is to consider a higher-dimensional relativity!

However, the extra dimensions have to be "hidden" by a legitimate physical mechanism, so as not to contradict the fact that we haven't observed them (and most likely, we will not observe them in any foreseeable future).

In ordinary stringy compactifications, some dimensions may be a bit larger than others. In particular, the heterotic E8 x E8 string theory's strong coupling limit is described by the so-called Hořava-Witten's heterotic M-theory. M-theory has 11 spacetime dimensions. If one of them is a line interval with two end points, a gauge group E8 will live on each boundary of the spacetime. If the two boundaries are very close to one another, the M-theoretical dynamics will reduce to the E8 x E8 heterotic string theory: the heterotic strings themselves are cylindrical M2-branes (membranes) stretched between the end-of-the-world domain walls.

This extra, eleventh dimension in M-theory may be larger than the other six (usually compactified on a Calabi-Yau manifold). If the line interval's length is a million Planck lengths or so, one gets slightly different scalings at very short distances - and, as Witten showed, one may actually improve the gauge coupling unification that includes gravity (one may erase the gap between the GUT scale and the Planck scale). However, this length scale - a million Planck lengths - is still way too short to be observable. Ever.

However, phenomenologists have always been partly driven by their wishful thinking - that the upcoming experiments will show something cool. Even before the experiments show it, the phenomenologists have studied what cool things could be observed soon, i.e. what cool things could be hiding right around the corner. This phenomenologists' motivation is a less harmful version of what the climatologists began to produce in recent years: the amount of induced fear became much more important than the accuracy of their papers.

My own relationship to this whole mode of reasoning is, well, unexcited. I think it is nothing else than a scientific bias to selectively focus on theories because they are catchy or because they try to invent things that could be "interesting" for many people "soon" - and this is an example. Scientists should focus on models that are likely to be right, not on those that would be more observationally flashy if they happened to be right, by Nature's mistake. ;-)

So of course, no string theorist has ever believed that the new scenarios in which the extra dimensions could be "right around the corner" were more likely than 50%. In fact, as I will argue, not even the very proponents of the scenarios ever thought that their models were likely. But we will look at them, anyway.

Old large dimensions

All the extra-dimensional models with visible signs of extra dimensions require branes; they are "braneworld models". It means that all (or at least most of the) particles of the Standard Model we know have to be confined to a brane, a higher-dimensional brane that contains at least the 3+1 dimensions we know. (If the photons lived in the whole space, their properties would dramatically change at distances comparable to the extra dimensions' radii.) But there can be additional dimensions that are perpendicular to the branes. Photons etc. don't see those new dimensions - because the photons are stuck. And the extra dimensions are therefore less constrained by experiments.

In the 1998 ADD "large dimensions" models by Arkani-Hamed, Dimopoulos, and Dvali, the extra dimensions are transverse to our brane and they are considered independent of the 3+1 dimensions we know and love. One usually imagines simple new toroidal dimensions - although this shape is extremely unlikely according to string theory. Because electromagnetism, the strong force, and the weak force are stuck at the brane but gravity - defined as dynamics of all space - may propagate to the extra dimensions, it follows that the strength of the 3 forces will remain constant while gravity will get "diluted" by propagating into the extra dimensions.

The smallness of the usual 3+1-dimensional Newton's constant "G" is partly explained by the large volume of the extra dimensions into which the higher-dimensional gravity gets "diluted".
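The standard ADD bookkeeping of this "dilution" is "MPl^2 ~ Mstar^{n+2} . Vn" (in "hbar=c=1" units, up to O(1) factors), where "n" is the number of extra dimensions, "Mstar" is the fundamental higher-dimensional scale, and "Vn ~ R^n" is the volume of the extra dimensions. Here is a small Python sketch that inverts this relation to see how large the extra dimensions would have to be for the fundamental scale to sit near a TeV:

```python
# ADD relation M_Planck^2 ~ M_star^(n+2) * R^n, solved for R
# (hbar = c = 1; O(1) factors and convention-dependent details dropped).
M_PLANCK_GEV = 1.22e19        # 4D Planck mass in GeV
GEV_INV_TO_M = 1.97e-16       # 1 GeV^-1 expressed in meters

def add_radius_meters(n, m_star_gev=1e3):
    r_in_inverse_gev = (M_PLANCK_GEV**2 / m_star_gev**(n + 2)) ** (1.0 / n)
    return r_in_inverse_gev * GEV_INV_TO_M

for n in (1, 2, 6):
    print(n, add_radius_meters(n))
# n=1: ~1e13 m, solar-system sized and excluded outright;
# n=2: a few millimeters, the regime attacked by tabletop gravity tests;
# n=6: ~1e-14 m, far below the reach of direct gravitational experiments.
```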

If a black hole is smaller than the radius of some large extra dimensions, it will inevitably be a higher-dimensional black hole, well-described by a higher-dimensional version of the classical general theory of relativity. In terms of "c,hbar,G", the scalings will change and will depend on the volume of the extra dimensions.

The gravitational "1/R^2" law will change to something else if two objects are closer than the radius of the extra dimensions. As of today, the experiments have measured gravity down to separations of about 10 microns. Those people are very skillful, indeed. Try to appreciate that even Mt Everest's own gravity only deflects the local horizontal surface - the plumb line - around it by a tiny amount. But those experimenters are able to measure the gravity of foils whose thickness is just 10 microns or so!

From these gravitational experiments, we know that the extra dimensions can't be bigger than 10 microns. But precision particle-physics experiments "mostly" exclude the ADD models that could leave visible traces at the LHC. Under some awkward assumptions, they're still possible but a reasonable expert's subjective probability is just a fraction of a percent that the LHC could see a glimpse of the ADD models.

The 1999 Randall-Sundrum (RS) models are more likely and I like to estimate their probability of showing up at the LHC to be 1-2 percent (this figure includes the ADD models as well). In this scenario, the extra "radial" dimension "u" may be pretty large but what is important is that it is not quite independent from the 3+1 dimensions we know: the spacetime is not a simple Cartesian product in this case.

Instead, the proper distances as measured along the 3+1 dimensions we know - distances between points with coordinates "(u,t,x,y,z)" and "(u,t',x',y',z')" - depend on the extra coordinate "u". If the coordinate "u" is chosen so that the proper distances in the "u" directions are measured by the coordinate "u" itself, then the proper distances in the usual 3+1 dimensions depend on "u" exponentially. This is a way to write the anti de Sitter space and the AdS/CFT correspondence teaches you to work with the same geometry (although one usually imagines very different radii and very different applications).
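In the usual parametrization (a sketch of the warped ansatz, not necessarily with the exact conventions of the original RS papers), the five-dimensional line element reads
ds^2 = exp(-2k|u|) . (-C^2.dt^2 + dx^2 + dy^2 + dz^2) + du^2
so the warp factor "exp(-2k|u|)" is precisely the exponential dependence of the 3+1-dimensional proper distances on "u" described above, and "1/k" sets the curvature radius of the five-dimensional anti de Sitter space.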

In those models, the very small black holes are inherently higher-dimensional black holes as well. Their spherical event horizon is not 2-dimensional but 3-dimensional. Consequently, the "fundamental higher-dimensional Planck length" is much longer - less insanely short - than in the 3+1-dimensional models. All the new scalings depend on the curvature of the 5-dimensional spacetime. And there is a (remote) possibility that the smallest black holes, as well as signs of the extra dimensions or excited strings and other objects that could appear at longer distances than the Planck scale, could even leave some visible traces at the LHC.

These models are sometimes called "string-inspired" because extra dimensions are often associated with string theory. However, we should appreciate that every good phenomenologist - whether or not he or she considers string theory to be a settled framework for quantum gravity - does consider theories with extra dimensions as a possibility simply because they obviously are a possibility. Their history goes back to Kaluza's work in 1919.

Some features of the Randall-Sundrum model appear in various compactifications of string theory, especially in the F-theory vacua (which also underlie the largest part of the "landscape" discussed in the context of the anthropic principle). Nevertheless, it's still true - in string theory or otherwise - that there exists no good reason why the effects of the new dimensions should be "right around the corner". A solution of the hierarchy problem may be quoted as a reason in field theory; however, wishful thinking seems to be the best argument why string theory should produce the right parameters of the extra dimension, especially because the string models typically solve the hierarchy problem by supersymmetry, even in the presence of the warped dimensions.

In fact, as Lisa Randall and Patrick Meade showed in 2007, the LHC won't observe multi-particle final states corresponding to the Hawking radiation of black holes. (If the number of Hawking particles "N" emitted by a black hole before it decays is large, they would be highly isotropic in the black hole rest frame - a very different decay than what we're used to.) Note that it's the same Lisa Randall. Even if the "tiny black holes" were as close to the observable scales as the previous experiments allow, the first observation of new physics wouldn't be in terms of multiparticle final states - corresponding to black holes decaying to many products.

Instead, they quantitatively argued that two-particle final states would show some patterns of the type that are being looked for whenever hypothetical compositeness of the elementary particles is being experimentally studied. After all, the black holes that decay to "very many" Hawking particles must be really heavy.

This conclusion seems to hold for all Randall-Sundrum-like models. For this reason, the conclusions of the CMS black hole preprint aren't really new for the theorists or phenomenologists. While it may not be self-evident to an experimenter that previous experiments imply that they can't see multi-particle signs of black holes at this stage, it is actually possible to derive that they can't see them. The more "top-down" or the more "theoretical" approach you use, the more tightly the newest experiments may be linked to the previous ones and the more you can say about them.

While the possibility that some signs of extra dimensions could be found by the LHC later was investigated by many researchers, they still realize that this possibility is very unlikely.

The most motivated models in string/M-theory - and even models slightly more general than those in string/M-theory - imply that the fundamental higher-dimensional Planck scale is still very far from accessible energies and probably very close to the usual 3+1-dimensional Planck scale, "10^{19} GeV". That's about fifteen orders of magnitude higher energy per particle than what the LHC can achieve.

Below the Planck scale, we find the GUT scale and a few other things. However, it seems very sensible to assume that there is almost nothing happening between the GUT scale and the electroweak scale. The gauge coupling unification is the most quantitative argument supporting this "big desert scenario" - the absence of any new features even if you magnify the electroweak particles and phenomena a million billion times. ;-) If you inserted any new patterns or "new physics" in between, the numerical agreement behind the "gauge coupling unification" would cease to work. Also, the seesaw explanation of the small neutrino masses - which seem to be as many times lighter than the Higgs boson as the Higgs boson is lighter than the GUT scale - could cease to work if you tried to bring the Planck scale much closer.

Quite obviously, supersymmetry is the only truly motivated consequence of string/M-theory among those that could appear at the LHC although, obviously, phenomenologists can't forget about many other, much less likely possibilities. The year 2011 - and maybe a few years that will follow - will (almost) settle the question whether supersymmetry exists at the TeV scale. (It's possible that the LHC may miss it even if it is there.)

However, as the TRF readers have been led to realize many times, no other "truly spectacular" new effects are really compatible with the well-established principles of physics, and they can't be seen and they won't be seen. This includes various compositeness, technicolor, and preon models, as well as all models involving a violation of the Lorentz symmetry.

The people who have been fantasizing about the totally new effects are likely to be proved wrong - and indeed, it's a huge fraction of the phenomenology literature. Some of those models were more clever, more conceptually new, and more realistic, while many of them have been really dumb (especially those that ambitiously tried to define quantum gravity in a non-stringy way), but the LHC will surely sweep away a huge majority of them.

I hope that after the LHC is stopped in a decade or two, and physics possibly enters another era dominated by the theorists and phenomenologists, the phenomenologists will already be more sober and impartial and they will avoid the overproduction of unlikely models whose only advantage is that they would be flashy if they were right, even though it can be seen that they're almost certainly not right. ;-)

Some fundamental features of Nature - such as the very postulates of quantum mechanics - are more shocking, creative, and unfamiliar to the laymen than most people are willing to imagine.

However, when it comes to "more mundane" features that the laymen could imagine but that are not included in the state-of-the-art physics, Nature is pretty conservative and string theory describes Nature as it is - as pretty conservative. No violation of fundamental symmetries, no compositeness inside quarks, no discreteness, no anything of this sort. You don't like it? Well, then move to another Universe where the laws of physics are more progressive or more philosophically pleasing to you. Meanwhile, I would appreciate it if the science journalists avoided downright lies about the predictions of string theory.

String theory predicts that the lightest black holes have masses that are almost certainly close to the Planck scale, to say the least, and that they will never be produced by particle accelerators. It's the other "alternative" theories that are driven by hype rather than good science and that have predicted lots of flashy phenomena - which are not being seen and won't be seen as the dust is settling and string theory is proving to be the only beyond-QFT framework with a lasting value.

And that's the memo.