Several years ago, the writers of the sitcom The Big Bang Theory may have been inspired by Sabine Hossenfelder when they had Leslie Winkle deliver a ludicrous anti-string rant defending loop quantum gravity in front of Sheldon Cooper.
The episode S02E02 was actually the first one I ever watched because of that exchange.
Winkle would say the very same things as her real-world subpar physics counterparts: things like "loop quantum gravity is predictive" and "it's so great that it predicts that the speed of light depends on the frequency due to the discrete character of spacetime near the Planck scale".
In the last 10 years, you could have read dozens of blog posts on this weblog about the incompatibility of all theories of discrete physics with the established rules of special relativity – laws first appreciated by Albert Einstein that have been increasingly demonstrated by experiments (e.g. recently by the Fermi telescope) to be almost certainly exact.
The room for a fundamental violation of the Lorentz symmetry has shrunk to such a small volume that all the fundamentally Lorentz-breaking theories that remain compatible with the experimental constraints are unavoidably unnatural, unmotivated, unjustifiable, and heavily fine-tuned cousins of a Lorentz-invariant theory. As long as a Lorentz-violating theory may be made compatible with the experiments at all – which is rare and usually requires lots of extra assumptions – you may always pick the Lorentz-invariant relative as the superior cousin. That reduces discussions about the Lorentz-violating deformations to the status of completely uninteresting speculations about some numbers that seem to be strictly zero but, within the experimental tolerance, could be supertiny and nonzero – discussions that have nothing to do with the explanation of any phenomena in Nature.
If a theory fails to naturally explain the Lorentz symmetry, it is a huge problem for the theory – the theory becomes pretty much a falsified one – rather than something to boast about!
For example, you may look at this 2009 blog post about the misconceptions related to the "minimal length". I was arguing that, when correctly interpreted, the new physics phenomena that may be associated with the concept of a "minimal length" aren't – and shouldn't be – incompatible with the Lorentz invariance.
However, if you discretize the spacetime in a brutal way, e.g. if you attempt to embed a graph into the Minkowski space, you inevitably break the Lorentz invariance. I have shown this picture
and argued that these randomly sprinkled points are Lorentz-invariant when it comes to their distribution. When you boost a picture like that, it looks the same. On the other hand, if you boost a picture of the Minkowski space that has some edges or faces, such as the U.S. map,
you get a very different picture that betrays that you have boosted a nicer picture to a less natural inertial frame:
You see that the areas are stretched in one of the diagonal directions but shrunk in the complementary direction. This is unavoidable for a simple reason I have explained as well. The edges have directions and in the Minkowski space, the space of possible directions is the hyperboloid composed of (e.g. timelike) vectors obeying\[
v^\mu v_\mu = +1.
\] But this hyperboloid, the coset \(SO(3,1)/SO(3)\), is non-compact. Lorentz invariance requires all elements of this hyperboloid to be equally frequently represented as "edges of your spin network". But no such uniform probability distribution may exist on the hyperboloid because it's noncompact – its volume is infinite – which means that a uniform distribution would inevitably have an infinite total probability. It couldn't be normalized.
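To see the non-normalizability explicitly, one may parameterize the unit timelike hyperboloid by the rapidity \(\eta\) and a direction on the two-sphere – a standard parameterization, written here just for illustration. The Lorentz-invariant volume element and the total volume are\[
d\mu = \sinh^2\eta \,d\eta\, d\Omega_2, \qquad \int d\mu = 4\pi \int_0^\infty \sinh^2\eta\, d\eta = \infty,
\] so a would-be uniform distribution would have to be divided by an infinite normalization factor.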
The very same argument eliminates not just timelike edges of a graph but also spacelike edges of a graph – the space of their directions is also a non-compact hyperboloid. You may also eliminate null edges because their proper length is zero while all lengths must be equally allowed – but again, the space of possible lengths is non-compact. For the same reason, faces (or higher-dimensional shapes, except for the maximally-dimensional ones) embedded into a Minkowski space inevitably obey a statistical distribution that violates the Lorentz symmetry, too.
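As a purely illustrative check of the sprinkling-vs-lattice claim – my sketch, not anything from the original argument – here is a small Python snippet that boosts both a random sprinkling and a square lattice in 1+1 dimensions and compares a simple "how the picture looks" statistic, the mean Euclidean nearest-neighbour distance in a central window:

```python
# Compare what a Lorentz boost does to a random "sprinkling" of points and
# to a square lattice in 1+1-dimensional Minkowski space.  The sprinkling's
# statistics are unchanged; the lattice acquires a visibly preferred frame.
import numpy as np

rng = np.random.default_rng(1)

def boost(points, eta):
    """Lorentz boost with rapidity eta acting on rows (t, x)."""
    ch, sh = np.cosh(eta), np.sinh(eta)
    t, x = points[:, 0], points[:, 1]
    return np.column_stack([ch * t + sh * x, sh * t + ch * x])

def mean_nn_distance(points, window=3.0):
    """Mean Euclidean nearest-neighbour distance, measured only for points
    in a central window to avoid boundary effects."""
    central = points[np.max(np.abs(points), axis=1) < window]
    d = np.linalg.norm(central[:, None, :] - points[None, :, :], axis=-1)
    d[d == 0.0] = np.inf                      # ignore the point itself
    return d.min(axis=1).mean()

# Poisson sprinkling and a square lattice with comparable densities.
sprinkling = rng.uniform(-10.0, 10.0, size=(1600, 2))
grid = np.arange(-10.0, 10.0, 0.5)
lattice = np.array([(t, x) for t in grid for x in grid])

for name, pts in [("sprinkling", sprinkling), ("lattice", lattice)]:
    before = mean_nn_distance(pts)
    after = mean_nn_distance(boost(pts, eta=0.8))
    print(f"{name:10s}  mean NN distance: {before:.3f} -> {after:.3f}")

# Expected outcome: the sprinkling's value is statistically unchanged (a
# boost has unit Jacobian, so it maps a uniform Poisson process to a uniform
# Poisson process of the same density), while the lattice's value changes
# because the boosted lattice is squeezed along one light-cone diagonal and
# stretched along the other - it singles out a preferred frame.
```

The point of the chosen statistic is that the sprinkling only carries information about a density (which is boost-invariant), while the lattice carries information about directions and spacings (which are not).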
I don't want to describe all these trivial things again because I have done so many times but it's nice to see that now, in 2014, Sabine Hossenfelder wrote her blog post
Evolving dimensions, now vanishing
that not only contains the same arguments why the discrete structures embedded into the Minkowski space inevitably pick a preferred reference frame and therefore violate relativity, but also includes pretty much the same pictures! That's great because my previous blog posts about the very same argument were partially inspired by her misunderstanding in the past.
So five years were enough to forget that I should be credited with these arguments, which made it possible and politically correct to mention them! ;-)
More generally, she talks about a paper by Dejan Stojkovič about some scale-dependent spacetime dimension. I am not 100% sure whether Dejan understands the simple argument above, but I would bet that, unlike Sabine, he still misunderstands those things. The idea of a scale-dependent spacetime dimension is credible and intriguing and, in some sense, it is correct. For example, at distances shorter than the Kaluza-Klein compactification scale, the spacetime dimensionality includes the small dimensions, while the effective dimensionality drops if you go to longer distances.
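As a simple illustration of the Kaluza-Klein statement – these are just the textbook power laws, not anything specific to the paper being discussed – with \(n\) extra dimensions compactified at a radius \(R\), the Newtonian potential between two masses behaves as\[
V(r)\sim \frac{1}{r^{\,1+n}} \quad (r\ll R), \qquad V(r)\sim \frac{1}{R^{n}\,r} \quad (r\gg R),
\] so short-distance experiments effectively see \(4+n\) spacetime dimensions while long-distance experiments see the usual four.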
This "loss of dimensions" associated with the flow towards longer distances is generalized in perturbative string theory by the Zamolodčikov \(c\)-theorem which proves that \(c\), effectively the number of dimensions, is decreasing. There are other ways to argue that the spacetime dimensionality is effectively depending on the scale. In holography, the spacetime dimension is pretty much lowered if the physics is well described by the physics of the event horizon (or the boundary of the AdS space). In perturbative string theory, the fundamental, generating "spacetime" dimension is really \(1+1\) because the spacetime is created out of the world sheet. And there exist more sophisticated approaches to see that the claim that "the effective spacetime dimension is scale-dependent" is at least morally correct.
According to the Lorentz-invariant measure, hyperboloids are non-compact and have infinite volumes, which prohibits a Lorentz-invariant probability distribution on them. If something describing the structure of the vacuum takes values in a hyperboloid and the number of these patterns per unit spacetime hypervolume is finite, the violation of the Lorentz symmetry is inevitable.
What is completely wrong, however, is the idea that the spacetime of a relativistic theory may be described by any particular discrete structure connecting "points" or other localized objects – at any scale. Whenever you draw something like that into your spacetime, you are inevitably breaking the Lorentz symmetry because the probability distribution for the directions of these edges or other discrete shapes must be "centered" around a direction that defines the preferred reference frame. It must be concentrated because a non-concentrated, uniform probability distribution wouldn't be a normalizable one.
After all, if a lattice-like structure were embedded in the vacuum, the configuration of the lattice/vacuum wouldn't be unique and would carry a huge entropy (like a liquid). This leads to many problems unrelated to relativity, but it also violates relativity because the entropy density is the temporal component of a 4-vector, and if this 4-vector is nonzero, it picks a preferred frame, too. Too bad. The vacuum of a relativistic theory must have a vanishing entropy density! So every attempt to imagine that, much like a crystal, the vacuum is made out of some connected, visualizable pieces is wrong.
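To restate the entropy argument in formulae (just a repackaging of the sentence above): the entropy density and flux combine into a current \(s^\mu\), which for an equilibrium-like state takes the form\[
s^\mu = s\, u^\mu,
\] where \(u^\mu\) is a unit timelike 4-velocity. If \(s\neq 0\), the vector \(u^\mu\) defines a preferred rest frame; the only Lorentz-invariant choice is \(s^\mu = 0\), i.e. a vacuum with vanishing entropy density.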
This argument is extremely simple and obvious and it eliminates pretty much all proposed "discrete theories" of physics that have ever been promoted by naive people. What is amazing from a sociological viewpoint is how many people have invested their would-be creative thinking into this manifestly incorrect research program. It's not clear whether these people could have contributed something to science if they hadn't been caught by this trap. But even if the answer is "No", it's always good to see someone who finally sees the light, and Sabine apparently does.