Posts Tagged ‘Faster Than Light’

“Proof” That Faster Than Light Communications Are Impossible Is False

December 16, 2017

There are theories everywhere, and the more ingrained they are, the more suspiciously they should be looked at. From the basic equations of relativity it is clear that if one adds speeds less than the speed of light, one will get a speed less than the speed of light. It is also clear that adding impulse to a mass will make it more massive, while its speed will asymptotically approach that of light (and, as I explained, the reason is intuitive, from Time Dilation).

The subject is not all sci-fi: modern cosmology brazenly assumes that space itself, after the alleged Big Bang, expanded at a speed of at least 10^23 c (something like one hundred thousand billion billions times the speed of light c). The grossest, yet simplest, proof of that is simple: the observable universe is roughly 100 billion light years across, yet it is only about ten billion years old. Thus it expanded at a minimum average clip of ten billion light years every billion years, that is, at least 10 c, according to standard cosmology. One could furiously imagine a spaceship somehow surfing on a wave of warped space, expanding for the same obscure reason as the Big Bang itself, that is…
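That back-of-the-envelope is easy to check; a tiny sketch using the text's round numbers (my illustrative figures, not precision cosmology):

```python
# Average expansion speed of the observable universe, using the text's
# round numbers (illustrative, not precision cosmology).

diameter_gly = 100.0   # observable universe diameter, billions of light years
age_gyr = 10.0         # age of the universe, billions of years

avg_speed_c = diameter_gly / age_gyr  # average expansion speed, in units of c
print(avg_speed_c)  # 10.0 -> ten times the speed of light, on average
```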

The question naturally arises whether velocities which are greater than that of light could ever possibly be obtained in other ways. For example, are there communication speeds faster than light? (Throwing some material across will not work: its mass will increase, while its speed stays less than c.)

Textbooks say it’s not possible. There is actually a “proof” of that alleged impossibility, dating all the way back to Einstein (1907) and Tolman (1917). The mathematics are trivial (they are reproduced in my picture below). But the interpretation is apparently less so. Wikipedia weirdly claims that faster than light communications would allow travel back in time. No. One could synchronize all clocks on all planets in the galaxies, and having faster than light communications would not change anything. Why? Time is local; faster than light data travel is nonlocal.

The problem of faster than light communications can be attacked in the following manner.

Consider two points A and B on the X axis of the system S, and suppose that some impulse originates at A, travels to B with the velocity u and at B produces some observable phenomenon, the starting of the impulse at A and the resulting phenomenon at B thus being connected by the relation of cause and effect. The time elapsing between the cause and its effect as measured in the units of system S will evidently be as follows in the calligraphy below. Then I use the usual Relativity formula (due to Lorentz) of time as it elapses in S’:

Equations help, but they are neither the beginning, nor the end of a story. Just an abstraction of it. The cult of equations is naive, interpretation is everything. The same thing, more generally, holds for models.
As Tolman put it in 1917: “Let us suppose now that there are no limits to the possible magnitude of the velocities u and V, and in particular that the causal impulse can travel from A to B with a velocity u greater than that of light. It is evident that we could then take a velocity u great enough that uV/C^2 will be greater than one, so that Delta(t) would become negative. In other words, for an observer in system S’ the effect which occurs at B would precede in time its cause which originates at A.”
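Tolman's trivial mathematics can be replayed numerically; a sketch in my own notation (units with c = 1) showing Delta(t)' flipping sign exactly when uV/c^2 exceeds one:

```python
import math

def dt_prime(dt, u, V, c=1.0):
    """Time between cause (at A) and effect (at B), as seen from frame S'
    moving at velocity V along x. The impulse goes from A to B at speed u,
    so dx = u * dt; the Lorentz transformation then gives
    dt' = (dt - V*dx/c^2) / sqrt(1 - V^2/c^2)."""
    dx = u * dt
    return (dt - V * dx / c**2) / math.sqrt(1.0 - (V / c) ** 2)

# Subluminal impulse: cause precedes effect in every frame.
print(dt_prime(1.0, u=0.5, V=0.9))   # positive

# Superluminal impulse, u*V/c^2 = 1.8 > 1: dt' < 0, Tolman's
# "effect precedes cause" for the S' observer.
print(dt_prime(1.0, u=2.0, V=0.9))   # negative
```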

I quote Tolman, because he is generally viewed as the one having definitively established the impossibility of faster than light communications. Tolman, though, is not so sure; in his next sentence he turns out wishy-washy: “Such a condition of affairs might not be a logical impossibility; nevertheless its extraordinary nature might incline us to believe that no causal impulse can travel with a velocity greater than that of light.”

Actually it is an effect those who have seen movies running in reverse are familiar with. Causality apparently running in reverse is no more surprising than the fact that two events at x1 and x2 which are simultaneous in S are separated in S’ by a time: (x1-x2) (V/c^2) / square root(1-V^2/c^2). That introduces a sort of fake, or apparent, causality: sometimes this before that, sometimes that before this.

(The computation is straightforward and found in Tolman’s own textbook; it originated with Henri Poincaré.[9][10] In 1898 Poincaré argued that the postulate of light speed constancy in all directions is useful to formulate physical laws in a simple way. He also showed that the definition of simultaneity of events at different places is only a convention.[11]) Notice that, in the case of simultaneity, the signs of V and (x1-x2) matter. Basically, depending upon how V moves, light in S going to S’ takes more time to catch up with the moving frame, and the more so the further it is: the same exact effect which explains the nil result of the Michelson-Morley interferometer. There is an underlying logic below all of this, and it’s always the same.
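The simultaneity offset just discussed can be evaluated directly; this little sketch (mine, units with c = 1) shows how its sign flips with both V and (x1-x2):

```python
import math

def simultaneity_offset(x1, x2, V, c=1.0):
    """Time separation, in S', of two events simultaneous in S at x1 and x2:
    (x1 - x2) * (V/c^2) / sqrt(1 - V^2/c^2). The sign depends on both
    the sign of V and the sign of (x1 - x2)."""
    return (x1 - x2) * (V / c**2) / math.sqrt(1.0 - (V / c) ** 2)

print(simultaneity_offset(0.0, 1.0, 0.5))   # negative: "that" before "this"
print(simultaneity_offset(1.0, 0.0, 0.5))   # positive: "this" before "that"
print(simultaneity_offset(1.0, 0.0, -0.5))  # reversing V reverses the order
```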

Tolman’s argumentation about the impossibility of faster than light communications is, in the end, purely philosophical, and fully inconsistent with the closely related, and fully mainstream, relativity of simultaneity.

Poincaré in 1900 proposed the following convention for defining clock synchronisation: two observers A and B, moving through space (which Poincaré called the aether), synchronise their clocks by means of optical signals. They believe themselves to be at rest in space (“the aether”) because they are not moving relative to distant galaxies or the Cosmic Radiation Background, and they assume that the speed of light is constant in all directions. Therefore, they have only to consider the transmission time of the signals, and then to cross their observations to examine whether their clocks are synchronous.

“Let us suppose that there are some observers placed at various points, and they synchronize their clocks using light signals. They attempt to adjust the measured transmission time of the signals, but they are not aware of their common motion, and consequently believe that the signals travel equally fast in both directions. They perform observations of crossing signals, one traveling from A to B, followed by another traveling from B to A.” 

In 1904 Poincaré illustrated the same procedure in the following way:

“Imagine two observers who wish to adjust their timepieces by optical signals; they exchange signals, but as they know that the transmission of light is not instantaneous, they are careful to cross them. When station B perceives the signal from station A, its clock should not mark the same hour as that of station A at the moment of sending the signal, but this hour augmented by a constant representing the duration of the transmission. Suppose, for example, that station A sends its signal when its clock marks the hour 0, and that station B perceives it when its clock marks the hour t. The clocks are adjusted if the slowness equal to t represents the duration of the transmission, and to verify it, station B sends in its turn a signal when its clock marks 0; then station A should perceive it when its clock marks t. The timepieces are then adjusted. And in fact they mark the same hour at the same physical instant, but on the one condition, that the two stations are fixed. Otherwise the duration of the transmission will not be the same in the two senses, since the station A, for example, moves forward to meet the optical perturbation emanating from B, whereas the station B flees before the perturbation emanating from A. The watches adjusted in that way will not mark, therefore, the true time; they will mark what may be called the local time, so that one of them will be slow of the other.”[13]

This Poincaré (“–Einstein”) synchronisation was used by telegraphers as early as the mid-nineteenth century. It would allow covering the galaxy with synchronized clocks (although local times will differ a bit depending upon the motion of stars, and in particular where in the galactic rotation curve a star sits). Transmitting instantaneous signals in that network would not affect causality. Ludicrously, Wikipedia asserts that faster than light signals would make “Bertha” rich (!!!). That comes simply from Wikipedia getting thoroughly confused, allowing faster than light signals for some data, and not for other data, thus giving an advantage to some, and not others.
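The Poincaré synchronisation just described boils down to halving the round-trip time; a toy sketch of my own, for two stations at rest:

```python
# A minimal sketch (my own toy model) of the Poincare-Einstein two-way
# synchronisation: A sends at its local hour 0, B receives and replies,
# and each station sets its clock offset from half the round trip.

def synchronise(distance, c=1.0):
    """Two stations at rest, a given distance apart. Returns the one-way
    delay each station should assume: half the round-trip time. This is
    the constant 't' in Poincare's description above."""
    round_trip = 2 * distance / c
    return round_trip / 2

one_way = synchronise(distance=3.0)
print(one_way)  # 3.0 -- B sets its clock to A's send hour plus this delay
```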

***

Quantum Entanglement (QE) enables at-a-distance changes of Quantum states:

(It comes in at least three types of increasing strength.) Quantum Entanglement, as known today, operates from Quantum state to Quantum state; but we cannot control which Quantum state the particle will start in, so we cannot use QE for communicating faster than light (because we don’t control what we write, so to speak: as we write with states, we send gibberish).

This argument is formalized in a “No Faster Than Light Communication theorem”. However, IMHO, the proof contains massive loopholes (the proof assumes that there is no Sub Quantum Reality whatsoever, nor could there ever be one, and thus that the unlikely QM axioms are forever absolutely true beyond all possible redshifts you could possibly imagine, inter alia). So this is not the final story here. QE enables, surprisingly, the Quantum Radar (something I didn’t see coming). And it is not clear to me that we have absolutely no control over states statistically, thus that we can’t use what Schrödinger, building on the EPR thought experiment, called “Quantum Steering” to communicate at a distance. Quantum Radar and Quantum Steering are now enacted through real devices. They use faster-than-light effects in their inner machinery.
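The standard No-Communication argument can itself be sketched: for the Bell state (|00> + |11>)/sqrt(2), Bob's reduced density matrix stays I/2 whatever basis Alice measures in, so her choice carries no signal. (Plain-Python toy of my own, with my own function names; it takes the QM axioms at face value, which is exactly what I question above:)

```python
import math

# Bell state amplitudes for |00>, |01>, |10>, |11>.
s = 1 / math.sqrt(2)
bell = [s, 0.0, 0.0, s]

def alice_basis(theta):
    """Alice's measurement basis, rotated by angle theta."""
    up = [math.cos(theta), math.sin(theta)]
    dn = [-math.sin(theta), math.cos(theta)]
    return [up, dn]

def bob_reduced_after(theta):
    """Bob's 2x2 density matrix, averaged over Alice's outcomes."""
    rho = [[0.0, 0.0], [0.0, 0.0]]
    for a in alice_basis(theta):
        # Unnormalized Bob amplitudes when Alice gets outcome 'a':
        b0 = a[0] * bell[0] + a[1] * bell[2]
        b1 = a[0] * bell[1] + a[1] * bell[3]
        for i, bi in enumerate((b0, b1)):
            for j, bj in enumerate((b0, b1)):
                rho[i][j] += bi * bj
    return rho

for theta in (0.0, 0.3, math.pi / 4):
    rho = bob_reduced_after(theta)
    print([[round(x, 6) for x in row] for row in rho])  # always ~I/2
```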

As the preceding showed, the supposed contradiction of faster-than-light communications with Relativity is just an urban legend. It makes the tribe of physicists more priestly, as they evoke a taboo nobody can understand, for the good reason that it makes no sense. It is also intellectually comfortable, as it simplifies brainwork, as taboos always do. But it is a lie. And it is high time this civilization switched to the no-more-lies theorem, lest it finish roasted, poisoned, flooded, weaponized and demonized.

Patrice Ayme’

Technical addendum:

https://en.wikipedia.org/wiki/Relativity_of_simultaneity

As Wikipedia itself puts it, weasel-style, to try to insinuate that Einstein brought something very significant to the debate, namely the eradication of the aether (but the aether came back soon after, and there are now several “reasons” for it; the point being that, as Poincaré suspected, there is a notion of absolute rest, and we now know this for several reasons: CRB, Unruh effect, etc.):

In 1892 and 1895, Hendrik Lorentz used a mathematical method called “local time” t′ = t − vx/c² for explaining the negative aether drift experiments.[5] However, Lorentz gave no physical explanation of this effect. This was done by Henri Poincaré who already emphasized in 1898 the conventional nature of simultaneity and who argued that it is convenient to postulate the constancy of the speed of light in all directions. However, this paper does not contain any discussion of Lorentz’s theory or the possible difference in defining simultaneity for observers in different states of motion.[6][7] This was done in 1900, when Poincaré derived local time by assuming that the speed of light is invariant within the aether. Due to the “principle of relative motion”, moving observers within the aether also assume that they are at rest and that the speed of light is constant in all directions (only to first order in v/c). Therefore, if they synchronize their clocks by using light signals, they will only consider the transit time for the signals, but not their motion in respect to the aether. So the moving clocks are not synchronous and do not indicate the “true” time. Poincaré calculated that this synchronization error corresponds to Lorentz’s local time.[8][9] In 1904, Poincaré emphasized the connection between the principle of relativity, “local time”, and light speed invariance; however, the reasoning in that paper was presented in a qualitative and conjectural manner.[10][11]

Albert Einstein used a similar method in 1905 to derive the time transformation for all orders in v/c, i.e., the complete Lorentz transformation. Poincaré obtained the full transformation earlier in 1905 but in the papers of that year he did not mention his synchronization procedure. This derivation was completely based on light speed invariance and the relativity principle, so Einstein noted that for the electrodynamics of moving bodies the aether is superfluous. Thus, the separation into “true” and “local” times of Lorentz and Poincaré vanishes – all times are equally valid and therefore the relativity of length and time is a natural consequence.[12][13][14]

… Except of course, absolute relativity of length and time is not really true: everywhere in the universe, locally at-rest frames can be defined, in several ways (optical, mechanical, gravitational, and even using a variant of the Quantum Field Theory Casimir Effect). All other frames are in trouble, so absolute motion can be detected. The hope of Einstein, in devising General Relativity, was to explain inertia, but he ended up with just a modification of the 1800 CE Bullialdus-Newton-Laplace theory… (Newton knew his instantaneous gravitation made no sense, and condemned it severely, so Laplace introduced a gravitation speed, and thus gravitational waves, and Poincaré made them relativistic in 1905… Einstein got the applause…)

REALITY: At Your COMMAND, FASTER Than LIGHT

September 11, 2015

Feynman: “It is safe to say that no one understands Quantum Mechanics.”

Einstein: “Insanity is doing the same thing over and over and expecting different results.”

Nature: “That’s how the world works.”

Wilczek (Physics Nobel Prize): “Naïveté is doing the same thing over and over, and always expecting the same result.”

Parmenides, the ancient Greek philosopher, theorized that reality is unchanging and indivisible and that movement is an illusion. Zeno, a student of Parmenides, devised four famous paradoxes to illustrate the logical difficulties in the very concept of motion. Zeno’s arrow paradox starts and ends this way:

  • If you know where an arrow is, you know everything about its physical state….
  • The arrow does not move…

Classical Mechanics found the first point to be erroneous. To know the state of a particle, one must know not only its position X, but also its velocity and mass (what’s called its momentum P). Something similar happens with Quantum Physics. To know the state of a particle, we need to know whether the state of what it has interacted with before… exists, or not. According to old-fashioned metaphysics, that’s beyond weird. It’s simply incomprehensible.

The EPR Interaction: Sein Und Zeit. For Real.

[The Nazi philosopher Heidegger, an ex would-be priest, wrote a famous book “Being And Time“. However, rather than a fascist fantasy, the EPR is exactly about that level of depth: how existence and time come to be! And how those interact with our will…]

With that information, X and P, position and momentum, for each particle, classical mechanics predicts a set of particles’ future evolution completely. (Formally dynamic evolution satisfies a second order linear differential equation. That was thoroughly checked by thousands of officers of gunnery, worldwide, over the last five centuries.)
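The second-order equation in question is just ballistics; a minimal sketch (my own toy Euler integrator, not the gunners' tables) reproducing the parabolic arc those officers relied on:

```python
# Classical prediction from (X, P): x'' = 0, y'' = -g.
# A few Euler steps reproduce the parabolic trajectory of a projectile.
# (Toy integrator and toy numbers, mine, for illustration.)

g = 9.81        # m/s^2
dt = 0.001      # time step, s
x, y = 0.0, 0.0
vx, vy = 30.0, 40.0  # initial velocity components, m/s

while y >= 0.0:
    x += vx * dt
    y += vy * dt
    vy -= g * dt

# Analytic range for comparison: vx * (2 * vy0 / g) ~ 244.6 m
print(round(x, 1))
```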

Highly predictive classical mechanics is the model of Einstein Sanity.

Aristotle had ignored the notion of momentum, P. For Aristotle, one needed a force to maintain motion (an objective proof of Aristotle’s stupidity; no wonder Aristotle supported, and instigated, fascist dictatorship as the best system of governance). Around 1320 CE, the Parisian genius Buridan declared that Aristotle was completely wrong and introduced momentum P, calling it “IMPETUS”.

May we be in a similar situation? Just as the Ancient Greeks ignored P, is Quantum Wave Mechanics incomplete because of an inadequate concept of what a complete description of the world is?

Einstein thought so, and demonstrated it to his satisfaction in his EPR Thought Experiment. The EPR paper basically observed that, according to the Quantum Axiomatics, two particles, after they interacted still formed JUST ONE WAVE. Einstein claimed that there had to exist hidden “elements of reality”, not yet identified in the (Copenhagen Interpretation of) quantum theory. Those heretofore hidden “elements of reality” would re-establish Einstein Sanity, Einstein feverishly hoped.

According to Einstein, following his friend Prince Louis De Broglie (on whom he had conferred the Doctorate) and maybe the philosopher Karl Popper (with whom he had corresponded earlier on non-locality), Quantum Mechanics appears random. But that randomness is only because of our ignorance of those “hidden variables.” Einstein’s demonstration rested on the impossibility of what he labelled “spooky action at a distance”.

That was an idea too far. The “spooky action at a distance” has been (amply) demonstrated in the meantime. Decades of experimental tests, including a “loophole-free” test published on the scientific preprint site arxiv.org last month, show that the world is like that: completely non-local everywhere.

In 1964, the physicist John Bell, CERN’s theory chief, working with David Bohm’s version of Einstein’s EPR thought experiment, identified an inequality obeyed by any physical theory that is both local — meaning that interactions don’t travel faster than light — and where the physical properties usually attributed to “particles” exist prior to “measurement.”
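Bell's bound and its quantum violation can be exhibited with the textbook singlet-state correlation E(a,b) = −cos(a−b) at the standard CHSH angles (a standard computation, sketched here, not Bell's original derivation):

```python
import math

# CHSH form of Bell's inequality: any local hidden-variable theory obeys
# |S| <= 2; Quantum Mechanics, with the singlet correlation
# E(a, b) = -cos(a - b), reaches |S| = 2*sqrt(2) at the standard angles.

def E(a, b):
    """Quantum correlation of spin measurements along angles a and b."""
    return -math.cos(a - b)

a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S), 2 * math.sqrt(2))  # ~2.828 > 2: the local bound is violated
```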

(As an interesting aside, Richard Feynman tried to steal Bell’s result, at a time when Bell was not famous, at least in the USA: a nice example of “French Theory” at work! And I love Feynman…)

Einstein’s hidden “elements of reality” probably exist, but they are NON-LOCAL. (Einstein was obsessed by locality; but that’s an error. All what can be said in favor of locality is that mathematics, and Field Theory, so far, are local: that’s the famous story of the drunk who looks for his keys under the lamp post, because that’s the only thing he sees.)

Either some physical influences travel faster than light, or some properties don’t exist before measurement. Or both.

I believe both happen. Yes, both: reality is both faster than light, and it is pointwise fabricated by interactions (“measurement”). Because:

  1. The EPR Thought Experiment established the faster than light influence (and that was checked experimentally).
  2. But then some properties cannot exist prior to “EPR style influence”. Because, if they did, why do they have no influence whatsoever, once the EPR effect is launched?

Now visualize the “isolated” “particle”. It’s neither truly “isolated” nor truly a “particle”, as some of its properties have not come in existence yet. How to achieve this lack of existence elegantly? Through non-localization, as observed in the one-slit and two-slit experiments.

Why did I say that the “isolated” “particle” was not isolated? Because it interfered with some other “particle” before. Of course. Thus it’s EPR entangled with that prior “particle”. And when that “particle” is “measured” (namely INTERACTS with another “particle”), the so-called “isolated” “particle” gets changed, by the “spooky action at a distance”, at a speed much faster than light.

(This is no flight of fancy of mine, consecutive to some naïve misinterpretation; Zeilinger et al. in Austria back-checked the effect experimentally; Aspect in Paris and Zeilinger got the Wolf prize for their work on non-locality, so the appreciation for their art is not restricted to me!)

All these questions are extremely practical: they are at the heart of the difficulties in engineering a Quantum Computer.

Old physics is out of the window. The Quantum Computer is not here yet, because the new physics is not understood enough, yet.

Patrice Ayme’

QUANTUM ENTANGLEMENT: Nature’s Faster Than Light Architecture

November 22, 2014

A drastically back-to-basic reasoning shows that the universe is held together and ordered by a Faster Than Light Interaction, QUANTUM ENTANGLEMENT. Nature is beautifully simple and clever.

(For those who spurn Physics, let me point out that Quantum Entanglement, being the Fundamental Process, occurs massively in the brain. Thus explaining the non-local nature of consciousness.)

***

The Universe is held together by an entangled, faster than light interaction. It is time to talk about it, instead of the (related) idiocy of the “multiverse”. OK, it is easier to talk idiotically than to talk smart.

Entanglement Propagates, Says the National Science Foundation (NSF)

I will present Entanglement in such a simple way, that nobody spoke of it that way before.

Suppose that out of an interaction, or system S, come two particles, and only two particles, X and Y. Suppose the energy of S is known, that its position is the origin of the coordinates one is using, and that its momentum is zero.

By conservation of momentum, momentum of X is equal to minus momentum of Y.

In Classical Mechanics, knowing where X is tells us immediately where Y is.

One can say that the system made of X and Y is entangled. Call that CLASSICAL ENTANGLEMENT.

This is fully understood, and not surprising: even Newton would have understood it perfectly.
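That classical inference can be sketched with toy numbers of my own: measuring X fixes Y with no further measurement.

```python
# "Classical entanglement": system S, at rest at the origin, splits into X
# and Y. Conservation laws alone determine Y from a measurement on X.
# (Toy numbers of my own; equal masses assumed for the position inference.)

p_x = (3.0, -1.0, 2.0)            # measured momentum of X
p_y = tuple(-q for q in p_x)      # momentum of Y: inferred, never measured

x_pos = (5.0, 1.0, 2.0)           # X found here at time t
y_pos = tuple(-q for q in x_pos)  # Y must be at the mirror-image point

print(p_y)    # (-3.0, 1.0, -2.0)
print(y_pos)  # (-5.0, -1.0, -2.0)
```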

The same situation holds in Quantum Physics.

This is not surprising: Quantum Physics ought not to contradict Classical Mechanics, because the latter is fully demonstrated, at least for macroscopic objects X and Y. So why not for smaller ones?

So far, so good.

In Quantum Physics, Classical Entanglement gets a new name. It is called QUANTUM ENTANGLEMENT. It shows up as a “paradox”, the EPR.

That paradox makes the greatest physicists freak out, starting with Einstein, who called QUANTUM ENTANGLEMENT “spooky action at a distance”.

Why are physicists so shocked that what happens in Classical Mechanics would also be true in Quantum Physics?

Some say John Bell, chief theorist at CERN, “solved” the EPR Paradox, in 1964. Not so. Bell, who unfortunately died of a heart attack at 64, showed that the problem was real.

So what’s the problem? We have to go back to what is the fundamental axiom of Quantum Physics (Note 1). Here it is:

De Broglie decreed in 1924 that all and any particle X of energy-momentum (E,p) is associated to a wave W. That wave W is uniquely defined by E and p. So one can symbolize this by: W(E,p).

W(E,p) determines in turn the behavior of X. In particular all its interactions.

De Broglie’s obscure reasoning seems to have been understood by (nearly) no one to this day. However it was checked right away for electrons, and De Broglie got the Nobel all for himself within three years of his thesis.

Most of basic Quantum Mechanics is in De Broglie’s insight. Not just the “Schrödinger” equation, but also the Uncertainty Principle.

Why?

Take a “particle X”. Let’s try to find out where it is. Well, that means we will have to interact with it. Wait, if we interact, it is a wave W. How does one find the position of a wave? Well the answer is that one cannot: when one tries to corner a wave, it becomes vicious, as everybody familiar with the sea will testify. Thus to try to find the position of a particle X makes its wave develop great momentum.

A few years after De Broglie’s seminal work, Heisenberg explained that in detail in the particular case of trying to find where an electron is, by throwing a photon on it.

This consequence of De Broglie’s Wave Principle was well understood in several ways, and got to be known as the Heisenberg Uncertainty Principle:

(Uncertainty of Position)(Uncertainty of Momentum) > (Planck Constant)

[Roughly.]

The Quantum Wave, and thus the Uncertainty, applies to any “particle” (it could be a truck).
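A quick sketch of why the truck never notices its wave: De Broglie's lambda = h/p, for an electron and for a truck (standard constants; the masses and speeds are round figures of my own):

```python
# De Broglie wavelength lambda = h / p, electron versus truck, to show
# why the wave (and the Uncertainty) is invisible at truck scale.
# Standard SI constants; illustrative round masses and speeds (mine).

h = 6.626e-34     # Planck constant, J*s
m_e = 9.109e-31   # electron mass, kg

electron = h / (m_e * 1.0e6)   # electron at 10^6 m/s
truck = h / (2000.0 * 10.0)    # 2-tonne truck at 10 m/s

print(electron)   # ~7e-10 m: atomic scale, waves matter
print(truck)      # ~3e-38 m: hopelessly below any measurable scale
```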

It is crucial to understand what the Uncertainty Principle says. In light of all particles being waves (so to speak), the Uncertainty Principle says that, AT NO MOMENT DOES A PARTICLE HAVE, EVER, A PERFECTLY DEFINED MOMENTUM and POSITION.

It would contradict the “particle’s” wavy nature. It’s always this question of putting a wave into a box: you cannot reduce the box to a point. There are NO POINTS in physics.

Now we are set to understand why Quantum Entanglement created great anxiety. Let’s go back to our two entangled particles, X and Y, sole, albeit not lonely, daughters of system S. Suppose X and Y are a light year apart.

Measure the momentum of X, at universal time t (Relativity allows one to do this, thanks to a process of slow synchronization of clocks described by Poincare’ and certified later by Einstein). The momentum of Y is equal and opposite.

But, wait, at same time t, the position of Y could be determined.

Thus the Uncertainty Principle would be violated at time t at Y: one could retrospectively fully determine Y’s momentum and position, and Y would have revealed itself to be, at that particular time t, a vulgar point-particle… as in Classical Mechanics. But there are no point-particles in Quantum Physics: that is, no points in Nature; that’s the whole point!

Contradiction.

(This contradiction is conventionally called the “EPR Paradox”; it probably ought to be called the De Broglie-Einstein-Popper Paradox, or, simply, the Non-Locality Paradox.)

This is the essence of why Quantum Entanglement makes physicists with brains freak out. I myself have thought of this problem, very hard, for decades. However, very early on, I found none of the solutions by the great names presented to be satisfactory. And so I developed my own. The more time passes, the more I believe in it.

A difficulty I had is that my theory creates lots of cosmic garbage, if true (;-)).

At this point, Albert Einstein and his sidekicks (one of them was just used to translate from Einstein’s German) wrote:

“We are thus forced to conclude that the quantum-mechanical description of physical reality given by wave functions is not complete.” [Einstein, A; B Podolsky; N Rosen (1935-05-15). “Can Quantum-Mechanical Description of Physical Reality be Considered Complete?”. Physical Review 47 (10): 777–780.]

The EPR paper ends by saying:

“While we have thus shown that the wave function does not provide a complete description of the physical reality, we left open the question of whether or not such a description exists. We believe, however, that such a theory is possible.”

This is high lawyerese: even as vicious a critic as your humble servant cannot find anything wrong with this craftily composed conceptology.

Einstein had corresponded on the subject with the excellent philosopher Karl Popper earlier (and Popper found his own version of the EPR). This is no doubt why he was more circumspect than he had been before.

Let’s recapitulate the problem, my way.

After interacting, according to the WAVE PRINCIPLE, both widely separating particles X and Y share the SAME WAVE.

I talk, I talk, but this is what the equations that all physicists write say: SAME WAVE. They can write all the equations they want, I think about them.

That wave is non-local, and yes, it could be a light year across. Einstein had a problem with that? I don’t.

Those who cling to the past, tried everything to explain away the Non-Locality Paradox.

Einstein was a particular man, and the beginning of the EPR paper clearly shows he wanted to cling back to particles, what I view as his error of 1905. Namely that particles are particles during fundamental processes (he got the Physics Nobel for it in 1922; however, as I will not get the Nobel, I am not afraid to declare the Nobel Committee in error; Einstein deserved several Nobels, yet he made a grievous error in 1905, which has led most physicists astray, to this day… hence the striking madness of the so-called “multiverse”).

The Bell Inequality (which Richard Feynman stole for himself!) conclusively demonstrated that experiments could be made to check whether the Quantum Non-Local effects would show up.

The experiments were conducted, and the Non-Local effects were found.

That they would not have been found would have shattered Quantum Physics completely. Indeed, all the modern formalism of Quantum Physics is about Non-Locality, right from the start.

So what is my vision of what is going on? Simple: when one determines, through an interaction I, the momentum of particle X, the wave made of X and Y, W(X,Y), so to speak, “collapses”, and transmits the fact of I to particle Y at a faster than light speed TAU. (I have computed that TAU is more than 10^10 times the speed of light, c; Chinese scientists have given a minimum value for TAU of 10^4 c.)
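The experimental lower bound on TAU works like this: detectors a distance d apart, whose outcomes stay correlated within a timing window dt, force TAU >= d/dt. With round numbers of my own choosing (illustrative, not the experiment's exact figures):

```python
# Lower bound on the "collapse" speed TAU from detector separation and
# timing resolution: TAU >= d / dt. Round illustrative numbers (mine).

c = 3.0e8     # speed of light, m/s
d = 16.0e3    # detector separation, ~16 km
dt = 3.5e-9   # timing window within which outcomes are correlated, s

tau_min = d / dt
print(tau_min / c)  # ~1.5e4 -> at least ~10^4 times the speed of light
```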

Then Y reacts as if it had been touched. Because, well, it has been touched: amoebae-like, it may have extended a light year, or more.

Quantum Entanglement will turn into Einstein’s worst nightmare. Informed, and all around, quasi-instantaneously. Tell me, Albert, how does it feel to have thought for a while one had figured out the universe, and then, now, clearly, not at all?

(Why not? I did not stay stuck, as Einstein did, making metaphors from moving trains, clocks, etc; a first problem with clocks is that Quantum Physics does not treat time and space equivalently. Actually the whole Quantum conceptology is an offense to hard core Relativity.)

Faster than light entanglement is a new way to look at Nature. It will have consequences all over. Indeed particles bump into each other all the time, so they get entangled. This immediately implies that topology is important for classifying, and uncovering, hundreds of states of matter that we did not suspect existed. None of this is idle: Entanglement is central to Quantum Computing.

Entanglement’s consequences, from philosophy to technology, are going to dwarf all prior science.

Can we make predictions, from this spectacular, faster than light, new way to look at Nature?

Yes.

Dark Matter. [2]

Patrice Ayme’

***

[1]: That the De Broglie Principle, the Wave Principle, implies Planck’s work is my idea; it’s not conventional Quantum Physics as found in textbooks.

[2]: Interaction density depends upon matter density. I propose that Dark Matter is the remnants of waves that were too spread-out to be fully brought back by Quantum Wave Collapse. In low matter density, thus, will Dark Matter be generated. As observed.

FASTER THAN LIGHT

October 29, 2011

WHY RELATIVITY OUGHT TO FAIL AT HIGH ENERGIES.

Why Relativity Is Not That Relative: A Theory Of Velocity?

***

Main Idea: Theoretical triage on Special Relativity leaves the theory in shambles at high energies. A precise mechanism to blow up the theory is produced which may cause Faster Than Light (FTL).

***

Abstract: I don’t know whether neutrinos go faster than light or not. The OPERA neutrino anomaly rests on some guesswork about the shape of long neutrino pulses, so its superluminal results may well be a mirage. (The experiment will soon be run with very short pulses.)

However, neutrinos may go Faster Than Light. Why not? Just because the idea has caused obvious distress among many of the great priests of physics?

In science one should never suppose more than necessary, nor should one suppose less than necessary. Faith is necessary in physics, but it should reflect the facts, and nothing but the facts. Such is the difference between physics and superstitious religion.

There are reasons to doubt that the leanness of Relativity reflects appropriately the subtlety of the known universe. This is what this essay is about.

OK, I agree that the equations of what Henri Poincare’ named “Relativity” seem, at first sight, to lead one to doubt Faster Than Light speeds. Looking at those equations naively, and without questioning their realm of validity, it looks as if, as one approaches the speed of light, one gets infinitely heavy, short, and slow.

However, adding as an ingredient much of the rest of fundamental physics, I present below an argument that those equations ought to break down at very high energies.

As my reasoning uses basic “Relativity” (Special and General), plus basic Quantum physics, one may wonder why something so basic was not pointed out before.

The answer is that, just as in economics most people are passengers on the Titanic, in cattle class, so it is in physics.

Most people have a quasi religious approach to “Relativity”, and their critical senses have been so stunted, that they even fail to appreciate that Einstein did not invent “Relativity” (as I will perhaps show in another, gossipy essay). This is of some consequence, because Einstein had a rather shallow understanding of some of the concepts involved (as he proved aplenty in his attempts at a Unified Field Theory… Pauli called them “embarrassing“… well, may be, my dear Wolfgang, all of Relativity was embarrassing, another collective German hallucination…)

The logic of the essay below is multi-dimensional, but tight. As it throws most of relativity out, I am going to be rather stern describing it:

1) At high energies, absolute motion can be detected. This renders Galileo’s Relativity, and its refurbishment by Poincare’, suspicious.

2) Time dilation is real. There is no "Twin Paradox" whatsoever; fast clocks are really slow. This is well known, but I review the physical reason why.

3) Length contraction is also real. Moving rods really shrink. It is not a question of fancy circular definitions (as some have had it). I show this below by an electromagnetic argument which makes the connection between relativistic transformation and Maxwell equation obvious.

4) A contradiction is derived. A particle would suffer gravitational collapse if its speed came close enough to the speed of light, as it would get confined within its Schwarzschild radius, should the equation of “Relativity” remain the same at all energies. A particle gun, at high enough energy, would become a black hole gun. That would violate all sorts of laws.

Thus “Relativity” breaks down. The simplest alternative is that some of the energy spills into acceleration (beyond c!) instead of mass acquisition.

5) Einstein’s conviction that Faster Than Light is equivalent to time travel is shown to be the result of superficial analysis. If the equations of relativity break down at high energies, they cannot be used to present us with negative time. Although I will not insist on this too heavily in this particular essay, Einstein got confused between a local notion (time) and a non local one (speed, and its parallel transport around loops).

As a humorous aside, should high energy neutrinos go faster than light, it should be possible to measure time beyond the speed of light, using a high energy neutrino clock.

Thus a reassessment of “Relativity” is needed, starting with the name: if all the “Relative” laws blow up at high energies, that is, at high velocity, uniform motion is not relative, but absolute. The theory of Relativity ought to become the Theory of Velocity (because, at intermediate energies, all of the laws of the present “Theory of Relativity”, including fancy rotational additions of velocities, do still apply!) Mach’s principle, already absolute for rotational motion, and already favoring a class of uniform motion, would become absolute for any motion.

***

ABSOLUTE, UNIFORM MOTION IS DETECTABLE… IF ENERGETIC ENOUGH:

Poincare’ named the theory that he, Lorentz and a dozen other physicists invented, “Relativity“. Yes, Poincare’, not Einstein: this essay is about truth, not convention to please a few thousand physicists and a few billions imprinted on the Einstein cult. (And I like Einstein… When he is not insufferable.)

That name, “Relativity”, may have been a mistake. A better name, I would suggest, would be “THEORY OF VELOCITY“, for the following reasons:

In 1904, summarizing the experimental situation of the time, Henri Poincare' generalized the Galilean relativity principle to all natural phenomena, writing:

"The principle of relativity, according to which the laws of physical phenomena should be the same, whether to an observer fixed, or for an observer carried along in a uniform motion of translation, so that we have not and could not have any means of discovering whether or not we are carried along in such a motion."

A first problem is that the American Edwin Hubble, and others, some fifteen years after the precocious death of Poincare', discovered cosmic expansion, which defines absolute rest. OK, Galileo would have said:

“Patrice, just don’t look outside, I told you to stay in your cabin, in the bowels of the ship.” Fair enough, but rather curious that inquiring minds ought to be blind.

However, fifty years after the death of Poincare’, the Cosmic Microwave Background (CMB) was discovered.  That changed the game completely. You can stay in the bowels of the ship all you want, Galileo, when you move fast enough relative to the CMB, the wall of your cabin towards your uniform motion, v, will turn incandescent, and then into a plasma, as the CMB will turn into gamma rays, thanks to the Doppler shift. Even burying Galileo’s head in the sand will not work. Even a Galileo ostrich can’t escape a solid wall of gamma rays.

Relativity fanatics may insist that they are correct in first order, when they don’t go fast enough. OK, whatever: the equations of “Relativity” do not change shape with v, but, as I just said, the bowels of Galileo’s ship will always disintegrate, if the speed is high enough.

When equations don't fit reality, you must quit them. That's why it's called science.

But let’s forget this glaring problem, as I said, I have found a much worse one. To understand it requires some background in so called “Relativity“, the theory of Gravitation (so called “General Relativity“, an even more insipid name), and Quantum physics.

We have to go through some preliminaries which show that time dilation and length contraction are real physical effects. There is nothing relative about them. So it’s not just the relativity of uniform motion which is not relative. It is the other two basic effects of relativity which are not relative either.  

People can write all the fancy "relativistic" equations they want, a la Dirac, and evoke some spacetime mumbo jumbo. Those equations rest on, and depict, the three preceding relative effects, and if these are not relative at high energies anymore, one cannot use them at very high energies either. Paul Dirac can sing song that they are pretty all he wants, like a canary in a coal mine. Physics is not a fashion show. It's about what's really happening. If the canary is dead, it's time to get out of there.

***

POINCARE’-LORENTZ LOCAL TIME (prior to 1902):

One central notion of standard relativity is “Local Time“, which Poincare’ named and extracted from the work of Lorentz (sometimes calling it “diminished time“). We have:

t’ = t multiplied by square root of (1- vv/cc)

[Because of problems with the Internet carrying squares and square roots properly, I write cc for the square of c, instead of c^2, as some do, etc… After all, it’s exactly what it is. In the end, mathematics is eased, and rendered powerful by abstraction, but it is all about words.]

So when v= 0, t’ = t and when v = c, t’ = 0, or, in other words, t’ stops. Here t’ is the time in the coordinate system F’ travelling at speed v relative to the coordinate system F, with its time t. More exactly t is what one could call “local electromagnetic time“. Some physicists would get irritated at that point, and snarl that there is nothing like “local electromagnetic time”. In science precision is important: we are more clever than chimps because we make more distinctions than chimps do.

How do we find t' knowing t? We look at a light clock in F' from F. If we look at a light clock perpendicular to v, in F', from F, we see that light in F' will have to cover more distance to hit the far mirror of the light clock. That clock will run slow. If we suppose there is only one measure of time in F', that means time in F' will run slow. (This unicity of time is a philosophical hypothesis, but it has been partly confirmed experimentally since.) This is what Poincare' (also) called "diminished time", in his 1902 recommendation of Hendrik Lorentz to the Nobel Prize in physics (for what Poincare' called the Lorentz transformations).

The math to compute t' from t uses nothing harder than Pythagoras' theorem. The idea is to compare two IDENTICAL light clocks, both perpendicular to v, one in F, standing still, the other in F', moving along at v. We look at the situation from F. While the moving clock's light goes from one mirror to the other, a time t elapses in F, during which that light covers the hypotenuse, of length ct. Why ct? Well, light is supposed to always (appear to) go at the same speed, the astronomers' practice, as Poincare' reiterated in 1898. The moving clock itself registers only its diminished time t', so its arm, the vertical leg the light crosses, has length ct'.

Meanwhile the mirror the light started from has moved by vt. By Pythagoras:

(ct)(ct) = (vt)(vt) + (ct')(ct'). Solving for t' gives the relation between t' and t above.
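The relation can be checked numerically. A minimal sketch (the one-second lab time is illustrative):

```python
import math

c = 299_792_458.0   # speed of light, m/s

def diminished_time(t, v):
    """Solve (ct)^2 = (vt)^2 + (ct')^2 for t':
    the moving light clock's diminished ('local') time per lab time t."""
    return t * math.sqrt(1 - (v / c) ** 2)

t = 1.0             # one second of lab (F) time
for v in (0.0, 0.5 * c, 0.9 * c, 0.99 * c):
    t_prime = diminished_time(t, v)
    # Pythagoras check: hypotenuse (ct) against legs (vt) and (ct')
    lhs = (c * t) ** 2
    rhs = (v * t) ** 2 + (c * t_prime) ** 2
    assert math.isclose(lhs, rhs)
    print(f"v = {v/c:4.2f} c -> t' = {t_prime:.4f} s")
```

At v = 0 the two clocks agree, and t' collapses toward zero as v approaches c, exactly as the formula above states.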

***

FITZGERALD CONTRACTION IS A REAL PHYSICAL EFFECT:

The Irish physicist Fitzgerald suggested that, to explain the null result of the Michelson-Morley experiment, the arm of the instrument was shortened, as needed. What is the Michelson-Morley device? Basically two light clocks at right angles, one perpendicular to v, the other along v. These two arms allow one to make light interfere, after it has gone back and forth either along the direction of v, or perpendicular to it.

No difference was found. The way I look at it, it shows that electromagnetic time is indifferent to direction. The way it was looked at then was that there was no “ether drag“.

What is going on physically is very simple: F’ moves relative to F. Light released at x = x’ = t = t’ = 0 is going to have to catch up with the other mirror. (notice that this is a slight abuse of notation, as the xs and ts are in different dimensions, and F and F’ in different coordinate systems…) Suppose v is very high. Seen from F, it is obvious that photons will take a very long time to catch up with the mirror at the end of the arm. Actually, if v was equal to the speed of light, it would never catch up (if someone looked in a mirror, when going at the speed of light, she would stop seeing herself).

Well, however, the M-M device showed no such effect. Thus the only alternative was that the length of the M-M interferometer shrank, as Fitzgerald proposed in 1892 (13 years before Einstein’s duplication).

Poincare’ introduced “Poincare’ stresses” to explain the effect as a real physical effect. That was explored further in 1911 by Lorentz, and then worked out in even more detail in the 1940s, using Dirac’s quantum electrodynamics.

The reason I am giving all these details is that Einstein could not understand Poincare’s insistence on “mechanical” models. That’s OK; not everybody can be super bright.

Einstein preferred the more formal insistence that the arm along v had to shrink, because c was constant, and that was it. This was exactly Fitzgerald's initial reasoning, and it does not explain anything: true, it seems necessary that the arm will shrink, but is it really happening, and if so, how? Einstein's platitudes about the mind of god, which he was supposedly most apt to seize, are just plain embarrassing… Especially as it turns out that, in this case, the one who got the idea was Poincare'. Maybe a god to Einstein, but to me, just a man. (All the more confusing as Einstein tried to refer, and defer, to Poincare' less than justice required.) By clinging to Fitzgerald's original vision like a rat to a reed in the middle of the ocean, Einstein aborted the debate with a hefty dose of superstition.

As the detailed development of Quantum Field Theory showed, the mechanical models were the way to go. They reveal a lot of otherwise unpredictable, unanalyzable complexities. Just as we are going to do below.

So what is happening with the relativistic contraction?

Maxwell's equations tell us how the electromagnetic field behaves. In ultra modern notation, they come down to dF = 0, d*F = J (d being the covariant exterior derivative, J the current). Very pretty.

Maxwell is not all, though. The Lorentz force equation tells us how particles move, when submitted to the electromagnetic field. It is:

Force = q(E + v × B)

[E and B are vector fields; v × B is the cross product of the particle velocity v with B.]

Let’s suppose the particle moves at v. As it does, it will be reached simultaneously by the electromagnetic field from two different places. Say one of these field elements will be a retarded component, Fretarded, and the other is obtained from Pythagoras theorem, using the same sort of diagram used for a light clock. One can call that component Frelativistic. Frelativistic is proportional to the usual gamma factor of relativity, namely 1/sqrt(1-vv/cc). The total field incorporates Fretarded plus Frelativistic. The relativistic component basically crushes any particle sensitive to the Lorentz force in the direction of motion.

In particular, it will crush electronic orbitals. So atoms will get squeezed. That reasoning, by the way, explains directly, physically, why the Lorentz transformations are the only ones to respect the Maxwell equations. It is better to achieve physical understanding than just formal understanding (by the way, professor Voigt found the formal argument for the Lorentz transformations in 1887, 18 years before Einstein).
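To get a feel for the magnitudes, here is a hedged back-of-the-envelope estimate of how close to c an atom would have to move for the contraction to crush a hydrogen-sized orbital down to nuclear size (the Bohr radius is standard; the ~1 fm nuclear scale is my rough assumption):

```python
import math

a_bohr = 5.29e-11    # Bohr radius, m (size scale of a hydrogen orbital)
r_nuc = 1.0e-15      # rough nuclear radius, m (assumed for illustration)

# Fitzgerald contraction along the motion: L = L0 / gamma.
# Gamma needed to squeeze an orbital down to nuclear size:
gamma_needed = a_bohr / r_nuc

# Invert gamma = 1/sqrt(1 - beta^2) for beta = v/c:
beta = math.sqrt(1.0 - 1.0 / gamma_needed ** 2)

print(f"gamma needed : {gamma_needed:.3g}")
print(f"1 - v/c      : {1.0 - beta:.3g}")
```

A gamma of a few tens of thousands, i.e. a speed within a part in ten billion of c, already does it; present-day accelerators reach such gammas for electrons.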

How do formal relativists a la Einstein look at this? They look at it eerily, not to say… ethereally. Max Born, a Nobel Prize winner (for the statistical interpretation of the Quantum waves) and a personal friend of Einstein, expounds the formalism with coherently infuriating declarations around page 253 of his famous book "Einstein's Theory of Relativity".

“For if one and the same measuring rod…has a different length according to it being at rest in F, or moving relative to F, then, so these people say, there must be a cause for this change. But Einstein’s theory gives no cause; rather it states that the contraction occurs by itself, that is an accompanying circumstance to the fact of motion. In fact this objection is not justified. It is due to too limited view of the concept “change”. In itself such a concept has no meaning. It denotes nothing absolute, just as data denoting distances or times have no absolute significance. For we do not mean to say that a system which is moving uniformly in a straight line with respect to an inertial system F ‘undergoes a change’ although it actually changes its situation with respect to system F.

The standpoint of Einstein’s theory about contraction is as follows: a material rod is physically not a spatial thing, but a spacetime configuration.”

This sort of theology will remind some of Heidegger’s writings, when the pseudo philosopher looks for the “ground” and never finds it. Too much relativity will do that for you. Stay with confusion, end up with Auschwitz.

Could it be that, according to Born, a bird which flies by does not change, as it 'actually changes its situation as a spacetime configuration'? Thus the bird at rest on its branch has not changed into a bird flying. Maybe Born should have received another Nobel for this other wonderful theory?

What Born forgets is this: 1) let’s suppose there is something as absolute rest (given, once again, by the reality of the CMB). 2) then a fast frame passing by has been colossally accelerated first before reaching that high relative speed when vv/cc approaches 1. So the fast frame has undergone an absolute change… And I explained what it is.

3) trying to play relative games between relativistically moving frames does not wash, as there is a privileged state of rest (or quasi rest: the Earth moves at 370 km/s relative to the CMB).

The derivation I sketched above provides a cause for the change. It shows that, by reason of the uniform displacement, a real boost in the e-m field occurs, which causes the contraction. So, basically, it's the electromagnetic geometry of uniform motion which causes the Lorentz transformations. It's deep down not mysterious at all.

The reasons for real time dilation and real length contraction are plain, and absolute.

This is all about the geometry of electromagnetism. When Einstein wrote down in marble his formal considerations about relative this, relative that, a full generation would elapse before the discovery of the neutrino, which responds to weak interactions (OK, there is an electroweak theory, but we are not going to remake all of physics in this essay!)

By the way, I can address in passing another confusion: some will say that by standing on the surface of the Earth one undergoes an acceleration of one g (correct!), and thus why would an acceleration of one g in a straight line for a year (which is enough for high relativistic speeds) cause all these absolute changes I crow about?

The reason is simple: in one case no energy is stored, the acceleration is purely virtual, as the ground is in the way. In the other case, a tremendous amount of energy piles up inside the moving body.

***

CLINCHER: YOU CAN’T ALWAYS SQUEEZE WHAT YOU WANT, BUT IF YOU TRY SOMETIMES, YOU GO FTL:

The Schwarzschild radius is given by R = 2GM/cc, where M is the mass of the body, G is the universal constant of gravitation, and c is the speed of light.

The energy of a particle is E = hf, where h is Planck’s constant, and f the frequency of its matter wave. Plug in Poincare’ mass-energy relation: E = M cc. Now E/cc is inertial mass M. Thus it causes, by the equivalence principle, the gravitational mass: hf/cc.

Thus the Schwarzschild radius of a particle of matter wave of frequency f is: R = 2Ghf/cccc.
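For scale, one can evaluate this radius for a particle at roughly OPERA beam energies (the ~17 GeV figure is my assumption for illustration; constants rounded):

```python
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
h = 6.626e-34       # Planck constant, J s
c = 2.998e8         # speed of light, m/s
eV = 1.602e-19      # joules per electron-volt

# Schwarzschild radius of a matter wave of frequency f: R = 2Ghf/cccc
def schwarzschild_radius(E_joules):
    f = E_joules / h            # E = hf
    return 2 * G * h * f / c**4

E = 17e9 * eV                   # ~17 GeV, roughly a CERN/OPERA neutrino
R = schwarzschild_radius(E)
wavelength = h * c / E          # the wave's own length scale, c/f

print(f"Schwarzschild radius : {R:.2e} m")
print(f"Wave length scale    : {wavelength:.2e} m")
print(f"Ratio wavelength/R   : {wavelength/R:.2e}")
```

At such energies the wave is some 36 orders of magnitude wider than its Schwarzschild radius, which is why the crunch argued below only bites at enormously higher energies.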

But now comes the clincher: As the particle accelerates, at ever increasing speed v, its matter wave will shrink ever more, from (Fitzgerald) length contraction. At some point the wave will shrink within the Schwarzschild radius of the particle.

If one takes the gravitational collapse of the particle within itself at face value, the particle would exit physics, never to be seen again, but for its gravitational effect. That is obviously impossible.

So something has to give in the equation F = ma; here F is the force through which one pumps energy into the system (an electromagnetic field, or a muon jet, whatever), m is the relativistic mass, and a is the acceleration. What we saw is that m is bounded. Thus a has to augment, and the particle will accelerate through the speed of light. QED.

Some could object that I used the photon energy relation E = hf, instead of M = (Rest Mass)/square root of (1 − vv/cc). But that would not change the gist of the argument any.

[Neutrinos are suspected of having a non zero mass, because they oscillate between various states; but the mass is so small, it’s not known yet. Similarly the question of the photon rest mass is an experimental problem!]

Another objection could be that I used the Schwarzschild radius, but it does not apply to single particles. Indeed, the initial argument (Tolman Oppenheimer Volkoff), was that the gravitational force would overwhelm the nuclear force inside an extremely dense star. The nuclear force is repulsive at short distance.

[I tried to show a picture of the nuclear force, but I had to give up as the WordPress primitive system does not allow me to.]

A rough sketch of the force between two nucleons shows a strong repulsive peak, which is, however, finite. The force depends upon gluon exchanges, is very spin dependent, etc… The idea of TOV was that gravitation would overwhelm it, and neutron degeneracy, under some circumstances. The basic idea in this essay is the same, although it is the all too real Lorentz-Fitzgerald contraction, not gravity, which does the crunching.

Previously, Einstein had tried to demonstrate that the Schwarzschild singularity had no physical meaning. His argument was contrived, erroneous. TOV succeeded in proving Einstein wrong (hey, it can be done!).

Besides, all these subtleties can be blown away by looking at particles the strong force does not apply to, such as photons or neutrinos. They both have mass, as far as creating geometry is concerned.

In the Schwarzschild computation, a term shows up, causing a singularity at a finite distance. The term is caused purely by the spherical coordinates, and the imposition of the vacuum Einstein gravitational equation. It is indifferent to kinetic effects (an important detail, as I put the Fitzgerald squeeze on). That term is identified to a mass for purely geometrical reasons. That mass will appear through Poincare’s E = mcc, or m = hf/cc, after plugging in the de Broglie relation.  

This allows one to circumvent Hawking-style, trans-Planckian arguments (which, anyhow, Hawking superbly ignored in deriving Hawking radiation).

In any case f = square root [ccccc/2Gh]. Plugging in the numbers, one gets trouble when the frequency gets to ten to the power 43 Hz or so. That's about ten to the power 16 TeV, or roughly a ton of TNT. About a billion billion times more energetic than the CERN neutrinos. But then, of course, the effect would be progressive, somehow proportional to energy. (Also see Large Dimensions below.)
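The quoted orders of magnitude can be rechecked directly (a sketch; constants rounded):

```python
import math

G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
h = 6.626e-34    # Planck constant, J s
c = 2.998e8      # speed of light, m/s

# Critical frequency where the contracted wave meets its own
# Schwarzschild radius: c/f = 2Ghf/c^4  =>  f = sqrt(c^5 / 2Gh)
f_crit = math.sqrt(c**5 / (2 * G * h))
E_crit = h * f_crit                    # corresponding energy, J

TeV = 1.602e-7                         # joules per TeV
ton_TNT = 4.184e9                      # joules per ton of TNT

print(f"Critical frequency : {f_crit:.2e} Hz")
print(f"Critical energy    : {E_crit/TeV:.2e} TeV")
print(f"                   = {E_crit/ton_TNT:.2f} tons of TNT")
```

This is, up to the factor of 2, the Planck frequency and the Planck energy, which is no accident: the construction pits a Compton-type wavelength against a Schwarzschild radius.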

***

EINSTEIN'S GRANDFATHER OBJECTION IS SELF-CONTRADICTING:

Conventional physicists will make the following meta objection to the preceding. According to lore of the standard theory of Special Relativity, the preceding scheme is completely impossible. Countless physicists would say that if one had a particle going faster than light, one could go back in time. Einstein said this, and everybody has been repeating it ever since, because Einstein looked so brainy.

Or maybe because he was also German. Thus he had got to have invented the Poincare’-Lorentz relativity! (We meet here again the Keynes-Hitler problem, that, in much of the Anglo-Saxon world, blind admiration for anything German, reigns sometimes to the point of spiting reason. I am myself fanatically pro-German, but there are lines of prejudice I will not cross).

OK, granted, there is some mathematical superficiality to support Einstein’s confusion between speed of light, time, and causality. Those mathematics are reproduced in the next chapter, where the mistake is exposed.

There are three problems with using the Grandfather Paradox to shoot down my reason for FTL.

The first objection is that Einstein's General Relativity itself makes the "Grandfather Paradox" possible. So it is rather hypocritical to use it to fend off contradictions of (Special) Relativity.

That grandfather paradox increasingly haunts standard fundamental physics. The paradox arises from "Closed Timelike Curves", which appear in (standard) Relativity (not in my version of Relativity). Hawking was reduced, around 1992, to the rather ridiculous subterfuge of a "Chronology Protection Conjecture".

The second objection to the objection is that, in practice, experimentally, the objection has proven irrelevant, by years of increasingly precise experiments. Basically what happens is that Quantum theory allows Faster Than Light teleportation of some sort ("states", as the saying goes). And experiments are confirming this, leading to claims that time travel has been achieved, experimentally. There are many references; here I give one from July 2011, one of whose authors, professor Lloyd from MIT, is a Principal Investigator on just this sort of problem.

Personally I have no problem with the results: they fit my vision of things, which is very friendly to Faster Than Light. But I have a problem with the standard semantics of “time travel“, which comes straight from the way Einstein looked at time.

Einstein was, in my opinion, very confused about time. He notoriously said: “time is nothing but a stubbornly persistent illusion“. In my vision of physics, time is fundamental, and it has nothing to do with space. I believe the notion of spacetime applies well to gravitation, at great distances, such as Earth orbit. I used above one of the staples of “General Relativity”, the Schwarzschild radius, but very carefully, from first principles. When people argue “time travel“, they argue from later principles, later day saints, so to speak. They confuse cornucopia and utopia.

I believe this: Time is the measure of the change of the universe. It’s not subject to travelling. But it is subject to confusion, and Einstein is the latter’s prophet.

(I also believe that the second Law of Thermodynamics applies even at the subquantal level (the subquantal level is what those who study entanglement, non locality, in Quantum physics, such as the authors I just quoted, tangle with). OK, the latter point is beyond this essay, the main theme of which does not use it at all.)

***

THE EINSTEIN GRANDFATHER ERROR:

The third objection to using the grandfather paradox to contradict me is much more drastic, and revealed by my own pencil and paper (the following is just an abstract of my reasoning). The conventional reasoning due to Einstein is faulty (and faulty in several ways). 

Most specialists of relativity subscribe to statements such as “it should be possible to transmit Faster Than Light signals into the past by placing the apparatus in a fast-moving frame of reference“.

Einstein's argument rests on equations for time such as t' = (t − vx/cc)/square root of (1 − vv/cc).

In this, v is the speed of a moving frame, and x the position of an alleged Faster Than Light signal travelling at speed w, so that x = wt. Then t' = t(1 − vw/cc)/square root of (1 − vv/cc). One sees that, given a w > c, there is a v close enough to, but inferior to, c, such that vw > cc, and t' becomes negative.
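A quick numerical sketch (in units where c = 1) reproduces the textbook sign flip: applying the standard transformation t' = (t − vx/cc)/square root of (1 − vv/cc) to a hypothetical signal x = wt with w = 2c, the transformed time turns negative exactly when vw > cc:

```python
import math

c = 1.0   # work in units where c = 1

def t_prime(t, v, w):
    """Standard Lorentz-transformed time of the event (t, x = w*t):
    t' = (t - v*x/c^2) / sqrt(1 - v^2/c^2)."""
    x = w * t
    return (t - v * x / c**2) / math.sqrt(1 - v**2 / c**2)

t = 1.0
w = 2.0 * c          # a hypothetical signal at twice light speed
for v in (0.1, 0.4, 0.5, 0.6, 0.9):
    tp = t_prime(t, v, w)
    flag = "  <- negative: the 'time travel' claim" if tp < 0 else ""
    print(f"v = {v:.1f} c -> t' = {tp:+.3f}{flag}")
```

This is the whole mathematical content of the causality objection; the next chapter argues that the transformation itself stops being valid before such speeds can be exploited.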

t' is time. So what we have, given a Faster Than Light signal w, is frames moving with enough of a speed v in which events seem to be running in reverse. Thus Very Serious Professors have argued that the putative existence of that FTL w reverses causality. They abbreviate this by saying that time travel is a consequence of FTL. And what do I say to this howling of the Boeotians? Not so fast.

Indeed, this ought to be physics, not just vulgar mathematics. One argument was overlooked by Einstein, who apparently believed blindly in Poincare's Relativity Postulate even more than its author himself did.

We have discovered above that there is a limit energy, beyond which Relativity breaks down. Thus the Lorentz transformations break down.

Let's look at the situation in a more refined way. What happens to the traditional transverse light clock encountered above, and in all traditional Relativity treatises? Well, in this thought experiment, admitting my reasoning above as valid, the light clock, if made energetic enough, will start to accelerate faster than light. So it would stop functioning, because light will not be able to catch up with it. This may sound strange, but it's not. In the mood of the preceding, it just says that the light used is limited in energy, and the clock is not. To still keep time for a while, we may have to use a high energy neutrino clock (supposing that neutrinos, indeed, travel Faster Than Light; a neutrino clock is built exactly like a photon clock, with the photons replaced by neutrinos).

This is why the Lorentz transformations fail: because the clock fails, as the light cannot catch up with the mirror. Pernicious ones could claim I contradict myself, as my argument used the Fitzgerald contraction. Yes, I used it, up to the point where it fails. In my logic, the contraction fails first.

Another argument is that the alleged violations of causality hard core relativists find in the apparition of negative times are much less frequent than they fear, because time dilation dilutes the statistics. A relativistic point they absolutely ignore. Also, as I have argued in the past, in https://patriceayme.wordpress.com/2011/09/01/quantum-non-locality/, Quantum physics non locality, in conjunction with Faster Than Light space expansion, implies violations of local causality. Basically, Quantum entanglements can happen beyond the space horizon given by the cosmic FTL expansion (which is an experimental fact).

Thus causality has to be taken with a grain of salt, and plenty of statistics (a characteristic of modern physics, at least since Boltzmann).

***

AND LET’S NOT FORGET FASTER THAN LIGHT FROM THE QUANTUM…

In physics, these days, a thousand theories blossom, and nearly a thousand fail miserably. So, on the odds, it ought to be the case with the preceding theory. However the preceding rests on fundamentals, whereas much of the fashionable stuff rests on notions nobody understands, and few bothered to study (such as the interdiction of FTL, which rests just on the shallow logic of Einstein exposed above).

An interesting sort are the Large Extra Dimension theories (partly instigated and made popular by the famous Lisa Randall, a glamorous professor from Princeton, Harvard and Solvay, author of the just published "Knocking On Heaven's Door"). They would help the preceding arguments, by lowering considerably the threshold of relativity's high energy failure. Thus, if superluminal neutrinos are indeed observed, in light of the preceding, they would suggest the existence of Large Extra Dimensions.

Many of the theories which have blossomed are "not even wrong". The obsession with superstrings was typical. Why obsess over those, when basic Quantum theory had not been figured out? Is it because superstrings cannot be tested?

Basic Quantum theory is subject to experiments, and some have given spectacular, albeit extremely controversial results. There was curiously little interest to confront those tests. If nothing else, the OPERA experiment illustrates that the failure of Relativity at high energy can be tested (OPERA was really out to test neutrino oscillations, which are, already, a violation of Relativity in some sense, as they confer a (rest?) mass on a particle going at the speed of light!)

The preceding theory, and its absolute frame, fits like a glove with the idea with an absolute (but stochastic and statistical) space, constructed from Quantum Interaction (or “potential” as Bohm has it). That would partly cause the CMB. That theory rests on the predicted failure of standard Quantum theory at large distances, and the existence of an extremely Faster Than Light interaction. It’s getting to be a small world, theoretically speaking…

***

Patrice Ayme

***

P/S: The reasoning above is so simple that normal physicists will not rest before they can brandish a mistake. So what could that mistake be? Oh, simply that the perpetrator, yours truly, did not use full Relativistic Quantum Mechanics, obviously because s/he is ignorant. But referring to Relativistic Quantum Mechanics, by itself, would use in the proof what one wants to demonstrate with the proof. Relativistic Quantum Mechanics assumes that Relativity is correct at any speed. I don’t. I don’t, especially after looking outside.

OK, that was a meta argument. Can I build a more pointed objection? After all, Dirac got QED started on esthetic grounds, while elevating relativity to a metaprinciple. But beauty is no proof. Nor is the fact that Dirac’s point of view led to several predictions (Dirac equation for electrons, Spin, Positrons).

One could say that the Heisenberg Uncertainty Principle (HUP) would prevent the confinement of a particle in such an increasingly small box. However, the HUP is subordinate to general De Broglie Mechanics (GDM), and I see no argument why GDM could not get confined.