Dixon law firm: CyberSUSY

John Dixon from the Dixon Law Firm in Calgary submitted the first, 40-page preprint about CyberSUSY. He plans to write and submit four papers that should total hundreds of pages.

His law firm sounds even better than a patent office, and because he claims to calculate the fermion masses and solve the cosmological constant problem, among other things, I couldn't resist looking at the paper. It would be pretty easy for me to overlook all kinds of idiosyncrasies if he had something to say.

Unfortunately, five minutes were enough to transform the eager expectations into laughter. What does he claim to do?

He claims to have a framework that breaks SUSY at the same moment when the electroweak symmetry is broken. That surely sounds impossible, but you don't want to give up too early. So you read on to see how he intends to realize such a heroic tour de force.

Dixon modifies the supersymmetry transformations by adding new terms that have a simple impact on some particular composite operators in the theory. After going through 10 pages, you may finally see that this is his strategy.

Cracks appear soon

However, when you want to know what exactly happens, several explosive surprises are waiting for you. First of all, the modified transformations are BRST transformations. That's not what you would expect because the BRST transformation is a technical tool to deal with local symmetries, not with global symmetries such as supersymmetry in the supersymmetric standard model.
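To see the contrast concretely, recall the textbook BRST rules for the simplest local symmetry, a free abelian gauge field (this is standard material, nothing specific to Dixon's paper). One replaces the local gauge parameter λ(x) by a Grassmann-odd ghost field c(x):

$$ s A_\mu = \partial_\mu c, \qquad s c = 0, \qquad s \bar{c} = B, \qquad s B = 0, $$

and nilpotency, s² = 0, holds on every field. The ghost is a dynamical field standing in for a local parameter; a rigid symmetry such as the global supersymmetry of the MSSM is parameterized by a constant spinor and needs no such ghost.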

Such an observation will shock you a bit, so in order to check whether you're on the same wavelength, you return to his definition of the "normal" BRST transformation at the beginning of the paper. And the surprises continue. His "BRST" transformation is schematically equal to "C" times "Q". Now, I suppose that "Q" is the supercharge, because of its spinor index and declared dimension. Clearly, "C" must then be the parameter of the supersymmetry transformation for the variation to be a variation: it has the dimension "-1/2" and a spinor index, too.
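For orientation, the bookkeeping behind this guess (my schematic reading, not a quote from the paper): a supersymmetry variation acts as

$$ \delta \Phi = (\bar{\epsilon}\, Q)\, \Phi, \qquad [\epsilon] = -\tfrac{1}{2}, \qquad [Q] = +\tfrac{1}{2} $$

in units of mass dimension, so that δ itself carries dimension zero, and both the parameter and the supercharge carry spinor indices - exactly the quantum numbers Dixon assigns to his "C" and "Q".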

Now, it's clear that Dixon has confused the supersymmetry parameters with the BRST ghosts "C". As a result, he has also confused supersymmetry transformations and BRST transformations. He still requires this (inherently) supersymmetric transformation to be nilpotent. ;-)
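What a correct paper would verify instead of nilpotency is the supersymmetry algebra itself. In two-component notation,

$$ \{ Q_\alpha, \bar{Q}_{\dot{\beta}} \} = 2\, \sigma^\mu_{\alpha\dot{\beta}}\, P_\mu, \qquad \{ Q_\alpha, Q_\beta \} = 0, $$

so two successive SUSY variations close on a translation, $[\delta_{\epsilon_1}, \delta_{\epsilon_2}] \sim (\epsilon_1 \sigma^\mu \bar{\epsilon}_2 - \epsilon_2 \sigma^\mu \bar{\epsilon}_1)\, \partial_\mu$: a transformation whose square is a nonzero translation is manifestly not nilpotent.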

That's not an excessively promising mistake at the beginning of a paper that tries to derive a new kind of supersymmetry breaking from deformations of SUSY or BRST transformations. But such bizarre things continue. For example, even though Dixon claims to link SUSY breaking to gauge symmetry breaking, you will see no gauge fields or gauge transformations anywhere in the paper. ;-)

Also, the additional terms in the transformation rules are only written down for some composite fields, not for all (elementary) fields. The actions don't depend on the ghosts "C" at all, which makes it less surprising that he manages to make his "BRST charge" nilpotent.

The cosmological constant problem is "solved" by looking at leptons only: gravity, gauge fields, and other fields seem to be "irrelevant" in Dixon's opinion. After 10 minutes, you see that the paper is complete balderdash and you stop reading. But there are several universal, incredible patterns in papers written by these amateur scientists (a category that also includes lots of professional amateur physicists) that I want to describe explicitly.

Physics vs formalism

In real physics, one has to have an idea that can be described in physical words. The idea has to refer to actual physical objects - particles, black holes, fields, other measurable quantities, and physically verifiable symmetries and phase transitions, among other things. Once the idea or the problem is formulated, it must be refined and investigated by mathematical tools (such as the BRST formalism or others) that take over.

These mathematical tools are not "canonically" connected with the physical statements and questions but they nevertheless decide whether various hypotheses are correct and what the results are. The mathematical tools to find the answers are usually not unique (there typically exist many equivalent ways to obtain a result) but they are always essential. Sometimes you are led to an interesting equation that determines an observable quantity.

In amateur physics, at least the subtype of amateur physics that doesn't avoid equations, it is the other way around. They start with a program that is defined in terms of a mathematical object that would normally be just a tool to study physical assertions: a slave becomes the master. I am talking about the BRST transformation, a superconnection, etc. The physical propositions, physical starting points, and links of their new work to the existing laws of physics are completely missing. You can also see that simple mathematical objects such as algebraic equations or numerological identities are at the very center of their work while the other "calculations" are mostly added as a sort of decoration.

They do something with these mathematical objects that makes no sense and finally they "arrive" at grandiose statements such as "we have found a theory of everything" or "we have calculated the fermion masses" and "solved the cosmological constant problem" etc. You can see that most of these assertions were formulated long before they wrote the equations - the decoration. ;-) But it is not possible, not even in principle, to solve any of these problems by following a similar path.

To do anything sensible in theoretical physics, one must actually begin with a well-defined physical framework, modify some of its components or assume something about its parameters and initial conditions according to rules that can be motivated in physical terms, use mathematical tools to calculate the results as carefully as we can, and end up with some conclusions that couldn't be known at the beginning.

Consistency checks inside the text

Another thing that the amateur scientists seem to misunderstand is that it typically takes a few minutes to see that their paper is balderdash. They seem to believe that another physicist has to read every word of the paper, evaluate it for the same long period of time that they spent writing the paper, and only afterwards does the physicist have a chance to form an opinion about the paper.

That's not how it works in reality. Meaningful papers in theoretical physics are actually not composed of 100% new information. Quite the contrary. Something like one half of the sentences in a typical paper are actually consistency checks by which the reader can verify that the author is not on a completely wrong track: the reader may check that what the author says is consistent with some of his previous, limited knowledge (either standard knowledge or other results in the same new paper), while the author also adds something new.

For example, when a new, more universal expression is given for a physical quantity (a function), the paper typically checks that the quantity is well-behaved in some special regime that was already understood before the paper was written. The author is supposed to be "a few steps ahead" of the reader and to help him get through (imagine two mountaineers). When they don't share anything from the previous trip, that's already too bad.
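A standard textbook illustration of such a check (my example, not one from the paper): a relativistic dispersion relation

$$ E = \sqrt{m^2 c^4 + p^2 c^2} \approx m c^2 + \frac{p^2}{2m} + O(p^4) $$

has to reduce, for p ≪ mc, to the rest energy plus the Newtonian kinetic term. A reader who only remembers the nonrelativistic limit can still verify this one line, and that shared foothold is exactly what the two mountaineers need.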

Even more importantly, whenever there is something that seems to be a contradiction, a meaningful and well-written paper typically explains the subtlety that actually resolves (or might resolve) the contradiction. The author and the reader don't have an "identical" understanding of what is surprising and what is expected in science, but because they live in the same world with the same existing knowledge of the discipline, their expectations shouldn't be terribly different. If they're too different, it becomes impossible for the reader to read the paper.

But of course, the author is not obliged to pedagogically explain the fate of apparent contradictions etc. I wouldn't call the resulting paper "well-written" if no attention is paid to important cases and possible contradictions, but it could still be a "correct" paper, in some sense.

However, the amateur physicists use the physical concepts in such a wrong way that it becomes immediately clear that they don't know what they're talking about. They don't actually omit the checksums. They include them, and virtually all of them are wrong. For example, Dixon
  • shows that he thinks that the BRST operator is a physical object in a global SUSY model; in reality, it shouldn't be there at all, and even when one uses it, it is just a technical tool to deal with other symmetries and physical objects, not a "primary" object to build a paper upon
  • confuses the BRST operator with the supersymmetry transformations
  • in doing so, he checks the "nilpotency" of the supersymmetry transformations, instead of the correct (nonzero) anticommutators
  • links supersymmetry breaking to gauge symmetry breaking but gauge fields don't seem to appear in the paper at all
  • claims to solve the cosmological constant problem without talking about lots of objects (such as gravity and loops of particles) that are clearly necessary to say something about the value of the cosmological constant (electrons only are not enough)
and so on and on and on. So a meaningful scientific paper has 1 byte of new information followed by 1 byte of checksums. And the checksums must work correctly in 90+ percent of the cases; otherwise the paper is thrown away as gibberish. However, in Dixon's case, about 90 percent of these checksums are wrong.
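If you want the metaphor in a tangible form, here is a tongue-in-cheek toy model in Python (my own illustration with made-up numbers, not a claim about actual refereeing): a reader scans the checksums one by one and abandons the paper as soon as their failure rate becomes indefensible.

import random

def read_paper(n_checks: int, p_check_correct: float,
               reject_threshold: float = 0.10) -> str:
    """Scan a paper's consistency checks; reject once the running
    failure rate exceeds the threshold (after a minimal sample)."""
    failures = 0
    for i in range(1, n_checks + 1):
        if random.random() > p_check_correct:
            failures += 1
        if i >= 5 and failures / i > reject_threshold:
            return f"thrown away after {i} checks ({failures} failed)"
    return f"read to the end ({failures}/{n_checks} checks failed)"

random.seed(0)
print("solid paper:     ", read_paper(50, p_check_correct=0.95))
print("Dixon-like paper:", read_paper(50, p_check_correct=0.10))

With a 95% success rate per checksum the paper is typically read to the end; with a 10% rate it is abandoned within the first handful of lines, which is just the "five minutes" from the beginning of this post.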

Moreover, I believe that someone like Dixon must fully realize that he has no clue what, for example, the BRST operator is. When most people are shown the BRST formalism for the first time, they don't understand why the details were chosen the way they were. At the beginning, the BRST framework looks like a mysterious gift from the aliens. (See some motivation starting with the Abelian case.)

So I would personally never use the BRST formalism if I didn't know why the terms are what they are, why the operator should be nilpotent, and why the physical states should be the cohomologies. Dixon clearly doesn't understand most of these things. So why does he use this technical tool in his preprint? Does he really believe that it is possible to end up with anything meaningful if one uses tools that he completely misunderstands? It's like a native of a cargo cult tribe who pilots an airplane. The results simply can't be good.
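For completeness, the conditions one would have to understand before deforming anything (standard material again): physical states are the cohomology of the nilpotent BRST charge,

$$ Q_B |\psi\rangle = 0, \qquad |\psi\rangle \sim |\psi\rangle + Q_B |\chi\rangle, $$

and in the abelian example above, this pair of lines is precisely what decouples the timelike and longitudinal photon polarizations. If you can't derive why these conditions select the right states, deforming $Q_B$ is flying blind.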

Building a TOE from scratch

In principle, you could imagine that an ingenious author writes an important paper that is completely disconnected from (almost) all the previous knowledge of science. You could imagine that the new genius would need no shoulders of giants to stand upon. It has never happened in the history of physics, but if such a thing occurred, the genius would still have to offer his own replacement for all the physical laws that have been verified and that he ignores or rejects.

Such a paper would have to be much longer - thousands of pages - and it would have to refer to a lot of experiments. Why? When you claim that a theory properly reproduces certain phenomena in physics, you must link the theory to experiments. It either means that you link it directly or, more typically, that you link it to some other papers and approximate theories and principles that have already been verified to agree with the relevant experiments: you normally link your new theories to the observations indirectly.

But if you want to avoid the existing state-of-the-art theories, the latter option evaporates. Your paper would have to talk about the experiments themselves and it would look very different from Dixon's paper.