Posts Tagged ‘Poincaré’

“Proof” That Faster Than Light Communications Are Impossible Is False

December 16, 2017

There are theories everywhere, and the more ingrained they are, the more suspiciously they should be looked at. From the basic equations of relativity it is clear that if one adds speeds less than the speed of light, one will get a speed less than the speed of light. It is also clear that adding impulse to a mass will make it more massive, while its speed will asymptotically approach that of light (and, as I explained, the reason is intuitive, from Time Dilation).

The subject is not all sci-fi: modern cosmology brazenly assumes that space itself, after the alleged Big Bang, expanded at a speed of at least 10^23 c (something like one hundred thousand billion billions times the speed of light c). The grossest, yet simplest, proof of that is this: the observable universe is roughly 100 billion light years across, and it is ten billion years old. Thus it expanded at the minimum average clip of ten billion light years every billion years: 100/10 = 10, so at least 10 c, according to standard cosmology. (One could furiously imagine a spaceship somehow surfing on a wave of warped space, expanding for the same obscure reason as the Big Bang itself, that is…)
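The back-of-the-envelope arithmetic can be spelled out in a few lines of Python (a sketch using the essay's round numbers, not precision cosmology):

```python
# The essay's round numbers (a crude estimate, not precision cosmology):
diameter_ly = 100e9   # observable universe ~100 billion light-years across
age_yr = 10e9         # ~10 billion years old

# A light-year per year is exactly c, so:
avg_speed_c = diameter_ly / age_yr
print(avg_speed_c)    # 10.0, i.e. an average expansion of at least 10 c
```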

The question naturally arises whether velocities which are greater than that of light could ever possibly be obtained in other ways. For example, are there communication speeds faster than light? (Throwing some material across will not work: its mass will increase, while its speed stays less than c.)

Textbooks say it’s not possible. There is actually a “proof” of that alleged impossibility, dating all the way back to Einstein (1907) and Tolman (1917). The mathematics are trivial (they are reproduced in my picture below). But the interpretation is apparently less so. Wikipedia weirdly claims that faster than light communications would allow travel back in time. No. One could synchronize all clocks on all planets in the galaxy, and having faster than light communications would not change anything. Why? Time is local, faster than light data travel is nonlocal.

The problem of faster than light communications can be attacked in the following manner.

Consider two points A and B on the X axis of the system S, and suppose that some impulse originates at A, travels to B with the velocity u and at B produces some observable phenomenon, the starting of the impulse at A and the resulting phenomenon at B thus being connected by the relation of cause and effect. The time elapsing between the cause and its effect as measured in the units of system S will evidently be as follows in the calligraphy below. Then I use the usual Relativity formula (due to Lorentz) of time as it elapses in S’:

Equations help, but they are neither the beginning, nor the end of a story. Just an abstraction of it. The cult of equations is naive, interpretation is everything. The same thing, more generally, holds for models.
As Tolman put it in 1917: “Let us suppose now that there are no limits to the possible magnitude of the velocities u and V, and in particular that the causal impulse can travel from A to B with a velocity u greater than that of light. It is evident that we could then take a velocity u great enough [so that uV/c² will be greater than one] so that Δt′ would become negative. In other words, for an observer in system S’ the effect which occurs at B would precede in time its cause which originates at A.”
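A minimal sketch of the Einstein-Tolman computation, with c = 1 and illustrative velocities (the formula is the standard Lorentz time transformation reproduced in the picture):

```python
import math

def dt_prime(dt, u, V, c=1.0):
    """Interval between cause (at A) and effect (at B) as seen in S':
    dt' = dt * (1 - u*V/c**2) / sqrt(1 - V**2/c**2)."""
    return dt * (1.0 - u * V / c**2) / math.sqrt(1.0 - (V / c) ** 2)

# Ordinary signal, u < c: cause still precedes effect in S'.
print(dt_prime(1.0, u=0.5, V=0.8))   # positive

# Superluminal signal with u*V > c**2: dt' < 0, the order appears reversed.
print(dt_prime(1.0, u=2.0, V=0.8))   # negative
```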

I quote Tolman, because he is generally viewed as the one having definitively established the impossibility of faster than light communications. Tolman, though, is not so sure; in his next sentence he turns out wishy-washy: “Such a condition of affairs might not be a logical impossibility; nevertheless its extraordinary nature might incline us to believe that no causal impulse can travel with a velocity greater than that of light.”

Actually it is an effect those who have seen movies running in reverse are familiar with. Causality apparently running in reverse is no more surprising than the fact that two events at x1 and x2 which are simultaneous in S are separated, in S’, by the time (x1−x2)(V/c²)/√(1−V²/c²). That introduces a sort of fake, or apparent, causality: sometimes this before that, sometimes that before this.

(The computation is straightforward and found in Tolman’s own textbook; it originated with Henri Poincaré.[9][10] In 1898 Poincaré argued that the postulate of light speed constancy in all directions is useful to formulate physical laws in a simple way. He also showed that the definition of simultaneity of events at different places is only a convention.[11]) Notice that, in the case of simultaneity, the signs of V and (x1−x2) matter. Basically, depending upon how V moves, light in S going to S’ takes more time to catch up with the moving frame, and the more so, the further it is: the same exact effect which explains the nil result in the Michelson-Morley interferometer. There is an underlying logic beneath all of this, and it is always the same.
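The simultaneity formula can be checked the same way (c = 1, illustrative values; note how the sign flips with the sign of V):

```python
import math

def dt_in_S_prime(x1, x2, V, c=1.0):
    """Time separation, in S', of two events simultaneous in S at x1, x2:
    dt' = (x1 - x2) * (V / c**2) / sqrt(1 - V**2 / c**2)."""
    return (x1 - x2) * (V / c**2) / math.sqrt(1.0 - (V / c) ** 2)

# The sign depends on both V and (x1 - x2):
print(dt_in_S_prime(1.0, 0.0, +0.6))   # positive: one temporal order
print(dt_in_S_prime(1.0, 0.0, -0.6))   # negative: the opposite order
```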

Tolman’s argumentation about the impossibility of faster than light communications is, in the end, purely philosophical and fully inconsistent with the closely related, and fully mainstream, relativity of simultaneity.

Poincaré in 1900 proposed the following convention for defining clock synchronisation: two observers A and B, moving through space (which Poincaré called the aether), synchronise their clocks by means of optical signals. They believe themselves to be at rest in space (“the aether”), from not moving relative to distant galaxies or the Cosmic Radiation Background, and assume that the speed of light is constant in all directions. Therefore, they have only to consider the transmission time of the signals, and then to cross their observations to examine whether their clocks are synchronous.

“Let us suppose that there are some observers placed at various points, and they synchronize their clocks using light signals. They attempt to adjust the measured transmission time of the signals, but they are not aware of their common motion, and consequently believe that the signals travel equally fast in both directions. They perform observations of crossing signals, one traveling from A to B, followed by another traveling from B to A.” 

In 1904 Poincaré illustrated the same procedure in the following way:

“Imagine two observers who wish to adjust their timepieces by optical signals; they exchange signals, but as they know that the transmission of light is not instantaneous, they are careful to cross them. When station B perceives the signal from station A, its clock should not mark the same hour as that of station A at the moment of sending the signal, but this hour augmented by a constant representing the duration of the transmission. Suppose, for example, that station A sends its signal when its clock marks the hour 0, and that station B perceives it when its clock marks the hour t. The clocks are adjusted if the slowness equal to t represents the duration of the transmission, and to verify it, station B sends in its turn a signal when its clock marks 0; then station A should perceive it when its clock marks t. The timepieces are then adjusted. And in fact they mark the same hour at the same physical instant, but on the one condition, that the two stations are fixed. Otherwise the duration of the transmission will not be the same in the two senses, since the station A, for example, moves forward to meet the optical perturbation emanating from B, whereas the station B flees before the perturbation emanating from A. The watches adjusted in that way will not mark, therefore, the true time; they will mark what may be called the local time, so that one of them will be slow of the other.”[13]
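Poincaré's point can be illustrated numerically: in a frame where the two stations drift at speed v, the two one-way transmission times differ, and the naive cross-signal adjustment leaves behind exactly Lorentz's "local time" offset, to first order in v/c (a sketch, with illustrative values):

```python
# Poincaré's cross-signal synchronisation, seen from a frame in which
# both stations drift at speed v (all values illustrative; c = 1).
c = 1.0
d = 1.0     # distance between stations A and B
v = 0.1     # common drift speed of the stations

t_AB = d / (c - v)   # light chasing the receding station B
t_BA = d / (c + v)   # light meeting the approaching station A

# Unaware of v, the stations take the transit time to be the same both
# ways, i.e. half the round trip; the residual clock offset is then
# half the asymmetry between the two one-way times:
sync_error = (t_AB - t_BA) / 2
print(sync_error)        # ~0.101
print(d * v / c**2)      # 0.1 = Lorentz's "local time" term, first order in v/c
```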

This Poincaré (“–Einstein”) synchronisation was used by telegraphers as early as the mid-nineteenth century. It would allow one to cover the galaxy with synchronized clocks (although local times will differ a bit depending upon the motion of stars, and in particular where in the galactic rotation curve a star sits). Transmitting instantaneous signals in that network would not affect causality. Ludicrously, Wikipedia asserts that faster than light signals would make “Bertha” rich (!!!). That comes simply from Wikipedia getting thoroughly confused, allowing faster than light signals for some data, and not for other data, thus giving an advantage to some, and not others.

***

Quantum Entanglement (QE) enables at-a-distance changes of Quantum states:

(It comes in at least three types of increasing strength.) Quantum Entanglement, as known today, operates from Quantum state to Quantum state; but we cannot control which Quantum state the particle will be in, to start with, so we cannot use QE for communicating faster than light (because we don’t control what we write, so to speak; as we write with states, we send gibberish).

This argument is formalized in a “No Faster Than Light Communication theorem”. However, IMHO, the proof contains massive loopholes (it assumes that there is no Sub Quantum Reality whatsoever, nor could there ever be one, and thus that the unlikely QM axioms are forever absolutely true beyond all possible redshifts you could possibly imagine, inter alia). So this is not the final story here. QE enables, surprisingly, the Quantum Radar (something I didn’t see coming). And it is not clear to me that we have absolutely no statistical control of states, thus that we can’t use what Schrödinger, building on the EPR thought experiment, called “Quantum Steering” to communicate at a distance. Quantum Radar and Quantum Steering are now enacted through real devices. They use faster-than-light effects in their inner machinery.

As the preceding showed, the supposed contradiction of faster-than-light communications with Relativity is just an urban legend. It makes the tribe of physicists more priestly, as they evoke a taboo nobody can understand, for the good reason that it makes no sense. It is intellectually comfortable, as it simplifies brainwork, as taboos always do, but it is a lie. And it is high time this civilization switched to the no-more-lies theorem, lest it finish roasted, poisoned, flooded, weaponized and demonized.

Patrice Ayme’

Technical addendum:

https://en.wikipedia.org/wiki/Relativity_of_simultaneity

As Wikipedia itself puts it, weasel-style, to try to insinuate that Einstein brought something very significant to the debate, namely the eradication of the aether (but the aether came back soon after, and there are now several “reasons” for it; the point being that, as Poincaré suspected, there is a notion of absolute rest, and now we know this for several reasons: CRB, Unruh effect, etc.):

In 1892 and 1895, Hendrik Lorentz used a mathematical method called “local time” t′ = t − vx/c² for explaining the negative aether drift experiments.[5] However, Lorentz gave no physical explanation of this effect. This was done by Henri Poincaré who already emphasized in 1898 the conventional nature of simultaneity and who argued that it is convenient to postulate the constancy of the speed of light in all directions. However, this paper does not contain any discussion of Lorentz’s theory or the possible difference in defining simultaneity for observers in different states of motion.[6][7] This was done in 1900, when Poincaré derived local time by assuming that the speed of light is invariant within the aether. Due to the “principle of relative motion”, moving observers within the aether also assume that they are at rest and that the speed of light is constant in all directions (only to first order in v/c). Therefore, if they synchronize their clocks by using light signals, they will only consider the transit time for the signals, but not their motion in respect to the aether. So the moving clocks are not synchronous and do not indicate the “true” time. Poincaré calculated that this synchronization error corresponds to Lorentz’s local time.[8][9] In 1904, Poincaré emphasized the connection between the principle of relativity, “local time”, and light speed invariance; however, the reasoning in that paper was presented in a qualitative and conjectural manner.[10][11]

Albert Einstein used a similar method in 1905 to derive the time transformation for all orders in v/c, i.e., the complete Lorentz transformation. Poincaré obtained the full transformation earlier in 1905 but in the papers of that year he did not mention his synchronization procedure. This derivation was completely based on light speed invariance and the relativity principle, so Einstein noted that for the electrodynamics of moving bodies the aether is superfluous. Thus, the separation into “true” and “local” times of Lorentz and Poincaré vanishes – all times are equally valid and therefore the relativity of length and time is a natural consequence.[12][13][14]

… Except of course, absolute relativity of length and time is not really true: everywhere in the universe, locally at-rest frames can be defined, in several manners (optical, mechanical, gravitational, and even using a variant of the Quantum Field Theory Casimir Effect). All other frames are in trouble, so absolute motion can be detected. The hope of Einstein, in devising General Relativity, was to explain inertia, but he ended up with just a modification of the 1800 CE Bullialdus-Newton-Laplace theory… (Newton knew his instantaneous gravitation made no sense, and condemned it severely, so Laplace introduced a gravitation speed, thus gravitational waves, and Poincaré made them relativistic in 1905… Einstein got the applause…)

Relativity, Absolute Frame, Simultaneity, Action At A Distance

September 15, 2016

Quantum Physics comes with an instantaneous action at a distance. A simultaneity. I call it the QI, the Quantum Interaction.

This simultaneity, this action at a distance, has baffled Relativity enthusiasts. See “Taming The Quantum Spooks”. 

https://aeon.co/essays/can-retrocausality-solve-the-puzzle-of-action-at-a-distance

According to Einsteinian lore, one cannot have such an “instantaneous” interaction: it would contradict “Relativity”. (From my point of view the interaction is not instantaneous, just faster than 10^10 c, that is, 10^10 times the speed of light, at least.)

Jules Henri Poincaré asserted the Principle of Relativity (1904) and demonstrated that, supposing that the speed of light was always constant, one could get all the equations of Special Relativity. Then Einstein, opportunistically jumping on the immensely famous Poincaré’s work, asserted that the Frenchman’s work showed that the speed of light was constant (whereas a more cautious  Poincaré asserted earlier that, considering that the speed of light was always found experimentally to be constant, one should view that as a law of physics). Of course, Einstein did not quote the French, as he was a good Swabian (and not a good European), keen to ride, as his mentor Planck was, Prussian fascism.

This Field Of Galaxies Defines An Absolute Frame. It Is Plain To See, Only Years Of Learning Academic Physics Can Brainwash Someone, Not To See It.


Poincaré knew Lorentz’s Local Time theory very well; he had helped establish it over the preceding quarter of a century. However, Jules Henri still believed in Absolute Time (Einstein did not).

Why believe in Absolute Time? Poincaré did not wax lyrical on the subject. He actually said nothing (contrary to Nobel laureate Bergson twenty years later, who violently contradicted Einstein). Nor did any physicist, in the meantime (110 years), dare defend Absolute Time (we have lived under an Einstein terror regime!). But this is what Quantum Physics quietly does, and what I will now dare to do (if I can contradict professional Salafists, I surely can dare to contradict professional physicists).

Suppose we have an absolute reference frame. Bring a light clock there, at rest, call that time: Absolute time. One can slow transport clocks (say using chemical rockets, and taking 100,000 years to get to Proxima Centauri) all over the universe, establishing UNIVERSAL TIME. Relativistic effects depend upon vv/cc. The square of speed, divided by the square of the speed of light c. If v/c is small, vv/cc is even much smaller, and negligible. (Poincaré showed this first.)
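How negligible is vv/cc for such slow transport? A quick estimate, using the rough figures in the text (Proxima Centauri is about 4.25 light-years away, reached in 100,000 years):

```python
# Slow clock transport, as in the text: a chemical rocket taking
# ~100,000 years to reach Proxima Centauri (~4.25 light-years away).
distance_ly = 4.25
trip_years = 100_000.0

beta = distance_ly / trip_years        # v/c
beta_sq = beta ** 2                    # vv/cc
print(beta)      # ~4.25e-05
print(beta_sq)   # ~1.8e-09: utterly negligible

# Accumulated desynchronisation over the whole trip, using the
# first-order expansion of time dilation, (1/2)(v/c)^2 per unit time:
desync_years = 0.5 * beta_sq * trip_years
desync_seconds = desync_years * 365.25 * 24 * 3600
print(desync_seconds)  # ~2.9e3 seconds: under an hour, over 100,000 years
```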

So is there an absolute reference frame? Sure. That frame is the one steady relative to distant pulsars, quasars, distant galaxies, etc. (no rotation), and steady relative to the Cosmological Background Radiation. Then one can talk about simultaneity, absolute time, and thus instantaneous interaction at a distance.

(This is one approach; there is another approach of mine, more mathematical, using the fact that a manifold of dimension n can be embedded in one of dimension 2n+1 (Whitney). Or one can use the celebrated Nash embedding theorem.)

There is no contradiction of Absolute Time theory, or should we say, possibility, with Local Time Theory (LTT). LTT is about light clocks. Relativity is about light clocks. Yet we know of other interactions… plus the QUANTUM INTERACTION.

BTW, in “General Relativity”, “Einstein’s theory of gravitation”, the speed of light is not constant. Even Einstein recognized this.

Conclusion? One can profitably consider Ian Miller’s “Dark Energy and Modern Science“. Even physicists can believe what they believe, on the most important fundamentals, because it is fashionable: a rite one has to believe in, so that one can become an initiated member of the tribe. And the more absurd the belief, the better.

Patrice Ayme’

 

Entangled Universe: Bell Inequality

May 9, 2016

Abstract: The Bell Inequality shatters the picture of reality that civilization had previously established. A simple proof is produced.

What is the greatest scientific discovery of the Twentieth Century? Not Jules Henri Poincaré’s Theory of Relativity and his famous equation: E = mcc. Although a spectacular theory, since Poincaré made time local, in order to keep the speed of light constant, it stemmed from Galileo’s Principle of Relativity, extended to Electromagnetism. To save electromagnetism globally, Jules Henri Poincaré made time and length local.

So was the discovery of the Quantum by Planck the greatest discovery? To explain two mysteries of academic physics, Planck posited that energy was emitted in lumps. Philosophically, though, the idea was just to extend to energy the basic philosophical principle of atomism, which was two thousand years old. Energy itself was discovered by Émilie Du Châtelet in the 1730s.

Quantum Entanglement Is NOT AT ALL Classically Predictable


Just as matter went in lumps (strict atomism), so did energy. In light of Poincaré’s E = mc², matter and energy are the same, so this is not surprising (by a strange coincidence (?), Poincaré demonstrated, and published, E = mc² a few months apart in the same year, 1900, as Max Planck did E = hf; Einstein used both formulas in 1905).

The greatest scientific discovery of the Twentieth Century was Entanglement… which is roughly the same as Non-Locality. Non-Locality would have astounded Newton: he was explicitly very much against it, and viewed it, correctly, as the greatest flaw of his theory. My essay “Non-Locality” entangles Newton, Émilie Du Châtelet, and the Quantum, because therefrom the ideas first sprang.

***

Bell Inequality Is Obvious:

The head of the Theoretical division of CERN, John Bell, discovered an inequality which is trivial and apparently so basic, so incredibly obvious, reflecting the most elementary common sense, that it should always be true. Ian Miller (PhD, Physical Chemistry) provided a very nice perspective on all this. Here it is, cut and pasted (with his agreement):

Ian Miller: A Challenge! How can Entangled Particles violate Bell’s Inequalities?

Posted on May 8, 2016 by ianmillerblog           

  The role of mathematics in physics is interesting. Originally, mathematical relationships were used to summarise a myriad of observations, thus from Newtonian gravity and mechanics, it is possible to know where the moon will be in the sky at any time. But somewhere around the beginning of the twentieth century, an odd thing happened: the mathematics of General Relativity became so complicated that many, if not most physicists could not use it. Then came the state vector formalism for quantum mechanics, a procedure that strictly speaking allowed people to come up with an answer without really understanding why. Then, as the twentieth century proceeded, something further developed: a belief that mathematics was the basis of nature. Theory started with equations, not observations. An equation, of course, is a statement, thus A equals B can be written with an equal sign instead of words. Now we have string theory, where a number of physicists have been working for decades without coming up with anything that can be tested. Nevertheless, most physicists would agree that if observation falsifies a mathematical relationship, then something has gone wrong with the mathematics, and the problem is usually a false premise. With Bell’s Inequalities, however, it seems logic goes out the window.

Bell’s inequalities are applicable only when the following premises are satisfied:

Premise 1: One can devise a test that will give one of two discrete results. For simplicity we label these (+) and (-).

Premise 2: We can carry out such a test under three different sets of conditions, which we label A, B and C. When we do this, the results between tests have to be comparable, and the simplest way of doing this is to represent the probability of a positive result at A as A(+). The reason for this is that if we did 10 tests at A, 10 at B, and 500 at C, we cannot properly compare the results simply by totalling results.

Premise 1 is reasonably easily met. John Bell used washing socks as an example. The socks would either pass a test (e.g. they are clean) or fail (i.e. they need rewashing). In quantum mechanics there are good examples of suitable candidates, e.g. a spin can be either clockwise or counterclockwise, but not both. Further, all particles must have the same spin, and as long as they are the same particle, this is imposed by quantum mechanics. Thus an electron has a spin of either +1/2 or -1/2.

Premises 1 and 2 can be combined. By working with probabilities, we can say that each particle must register once, one way or the other (or each sock is tested once), which gives us

A(+) + A(-) = 1; B(+) + B(-) = 1; C(+) + C(-) = 1

i.e. the probability of one particle tested once and giving one of the two results is 1. At this point we neglect experimental error, such as a particle failing to register.

Now, let us do a little algebra/set theory by combining probabilities from more than one determination. By combining, we might take two pieces of apparatus, and with one determine the (+) result at condition A, and with the other the (-) result at B. If so, we take the product of these, because probabilities are multiplicative, and we can write

A(+) B(-) = A(+) B(-) [C(+) + C(-)]

because the bracketed term [C(+) + C(-)] equals 1, the sum of the probabilities of results that occurred under conditions C.

Similarly

B(+)C(-) = [A(+) + A(-)] B(+)C(-)

By adding and expanding

A(+) B(-) + B(+)C(-) = A(+) B(-) C(+) + A(+) B(-) C(-) + A(+) B(+)C(-) + A(-)B(+)C(-)

= A(+)C(-) [B(+) + B(-)] + A(+)B(-)C(+) + A(-)B(+)C(-)

Since the bracketed term [B(+) + B(-)] equals 1 and the last two terms are positive numbers, or at least zero, we have

A(+) B(-) + B(+)C(-) ≧ A(+)C(-)

This is the simplest form of a Bell inequality. In Bell’s sock-washing example, he showed how socks washed at three different temperatures had to comply.
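The claim that any classical situation must comply can be machine-checked: enumerate the eight possible definite assignments of (+)/(−) to A, B and C, weight them with an arbitrary probability distribution, and the inequality always holds (a sketch):

```python
import itertools
import random

# Each "classical" sample carries definite (+)/(-) results for tests A, B, C.
# Whatever probability distribution weights the 8 possible assignments,
# the inequality A(+)B(-) + B(+)C(-) >= A(+)C(-) must hold.
random.seed(0)
assignments = list(itertools.product([+1, -1], repeat=3))  # (A, B, C)

for trial in range(1000):
    weights = [random.random() for _ in assignments]
    total = sum(weights)
    probs = [w / total for w in weights]

    ab = sum(p for p, (a, b, c) in zip(probs, assignments) if a > 0 and b < 0)
    bc = sum(p for p, (a, b, c) in zip(probs, assignments) if b > 0 and c < 0)
    ac = sum(p for p, (a, b, c) in zip(probs, assignments) if a > 0 and c < 0)
    assert ab + bc >= ac - 1e-12   # never violated by any classical mixture

print("inequality held for all 1000 random classical distributions")
```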

An important point is that provided the samples in the tests give only one result from only two possible results, and provided the tests are applied under three sets of conditions, the mathematics say the results must comply with the inequality. Further, only premise 1 relates to the physics of the samples tested; the second is merely a requirement that the tests are done competently. The problem is, modern physicists say entangled particles violate the inequality. How can this be?

Non-compliance by entangled particles is usually considered a consequence of the entanglement being non-local, but that makes no sense because in the above derivation, locality is not mentioned. All that is required is that premise 1 holds, i.e. measuring the spin of one particle, say, means the other is known without measurement. So, the entangled particles have properties that fulfil premise 1. Thus violation of the inequality means either one of the premises is false, or the associative law of sets, used in the derivation, is false, which would mean all mathematics are invalid.

So my challenge is to produce a mathematical relationship that shows how these violations could conceivably occur. You must come up with a mathematical relationship or a logic statement that falsifies the above inequality, and it must include a term that specifies when the inequality is violated. So, any takers? My answer in my next Monday post.

[Ian Miller.]

***

The treatment above shows how ludicrous it should be that reality violates that inequality… BUT IT DOES! This is something which nobody had seen coming. No philosopher ever imagined something as weird. I gave an immediate answer to Ian:

‘Locality is going to come in the following way: A is going to be in the Milky Way, B and C on Andromeda. A(+) B(-) is going to be ½ cos²[(b−a)/2]. Therefrom the contradiction. There is more to be said. But first of all, I will re-blog your essay, as it makes the situation very clear.’
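Numerically, the violation looks like this. The sketch below assumes the standard spin-singlet joint probabilities (the "½ cos²" form in the reply is the analogous photon-polarisation convention, which differs by a factor of 2 in the angles); the three analyser settings are an arbitrary illustrative choice:

```python
import math

# Standard spin-singlet joint probabilities (assumed convention).
# Perfect anti-correlation means finding B(+) on one particle certifies
# B(-) on its partner without measuring it, so Premise 1 is fulfilled.
def p_pm(a, b):
    """P[A(+) and B(-)] for analyser angles a, b (radians)."""
    return 0.5 * math.sin((a - b) / 2.0) ** 2

a, b, c = 0.0, math.pi / 3, 2 * math.pi / 3   # 0, 60, 120 degrees (a choice)

lhs = p_pm(a, b) + p_pm(b, c)   # A(+)B(-) + B(+)C(-) = 0.25
rhs = p_pm(a, c)                # A(+)C(-)            = 0.375
print(lhs, rhs)                 # lhs < rhs: the Bell inequality is violated
```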

Patrice Ayme’

Poincaré: LOCAL TIME Implies MASS = ENERGY

March 29, 2016

Historically three functions were attributed to time: simultaneity, synchronization and duration. Time became important in physics even before Galileo analyzed how gravity could be diluted by using a slope. Medieval mathematicians made the first differential calculus computations using time, two centuries before Fermat established calculus.

Newton used calculus for his detailed theory of gravitation. However, Isaac thought his own theory made no sense. The problem was that gravity was supposed to act instantaneously at a distance. Isaac thought that “it is inconceivable that inanimate Matter should, without the Mediation of something else, which is not material, operate upon, and affect other matter without mutual Contact… That Gravity should be innate, inherent and essential to Matter, so that one body may act upon another at a distance thro’ a Vacuum, without the Mediation of any thing else, by and through which their Action and Force may be conveyed from one to another, is to me so great an Absurdity that I believe no Man who has in philosophical Matters a competent Faculty of thinking can ever fall into it.”

— Isaac Newton, Letters to Bentley, 1692/3

Poincaré: Time Is Local, MASS = ENERGY, Yet Relativity Is Not Fully Relative


[The picture actually alludes to a completely different work of Poincaré: his discovery that qualitative methods in non-solvable differential equations produced results where exact differential equations à la Newton did not; in particular, Poincaré’s recurrence theorem… Useful in astronomy.]

Newton’s theory depended crucially on an absolute, universal time: thus the gravity force vector could always point to the center of (the) mass (exerting the gravitational force).

However the wrapping up of the electromagnetic equations by Maxwell showed that light was an electromagnetic field travelling at speed c. And c was universal, independent of any “rest frame”. After thinking about the problem for twenty years, Lorentz discovered that, for electromagnetic phenomena to stay the same in a moving frame, one had to introduce what Poincaré called a “Local Time”. Poincaré then pointed out that there was no absolute rest relative to an “ether”; all one could do was to analyze the motion of matter relative to matter.

Then Poincaré thought some more for five years, and published in 1900, in the major Dutch physics Journal, that electromagnetic field retardation and its violation of Newton’s Third Law (Action equals reaction) could be resolved by attributing the inertial mass E/cc to the electromagnetic field.

(Mass = energy was attributed to a number of second-order German physicists for Francophobic and nationalistic reasons, and the notion is repeated to this day by ignorant parrots; that would be sort of funny, if it did not distort not just the history of physics, but even the understanding of physics, as the parrots tend not to have as deep an understanding of the underlying concepts.)

“The principle of relativity, according to which the laws of physical phenomena must be the same for a stationary observer as for one carried along in a uniform motion of translation, so that we have no means, and can have none, of determining whether or not we are being carried along in such a motion… From all these results, if they were to be confirmed, would issue a wholly new mechanics which would be characterized above all by this fact, that there could be no velocity greater than that of light, any more than a temperature below that of absolute zero. For an observer, participating himself in a motion of translation of which he has no suspicion, no apparent velocity could surpass that of light, and this would be a contradiction, unless one recalls the fact that this observer does not use the same sort of timepiece as that used by a stationary observer, but rather a watch giving the “local time”. […] Perhaps, too, we shall have to construct an entirely new mechanics that we only succeed in catching a glimpse of, where, inertia increasing with the velocity, the velocity of light would become an impassable limit. The ordinary mechanics, more simple, would remain a first approximation, since it would be true for velocities not too great, so that the old dynamics would still be found under the new” [Poincaré, 1904.]

So after Poincaré’s work, what was the situation? Time is local (yet clocks could be synchronized at a distance), Galilean relativity could be extended to electromagnetism as long as mass = energy.

Are we further along today?

Poincaré kept a distinction between “apparent time” and “ether” given time. Einstein’s variation of the theory does not preserve this distinction (and that makes it false, ha ha ha). I will not go into the details here, as it would be pure research of the sort that 99% of theoretical physicists are unwilling to consider (some other day, in simple words). I am not trying to spite Einstein, long my preferred physicist (no more, though, he has exhausted my patience with vindictive plagiarism, in particular against Poincaré and Karl Popper, let alone abandoning his little daughter). Actually Einstein admitted there was some sort of ether: …”we may say that according to the general theory of relativity space is endowed with physical qualities; in this sense, therefore, there exists an ether. According to the general theory of relativity space without ether is unthinkable.” [Einstein, 1920.]

But there is much worse: we now know that Quantum Physics ignores Local Time. Quantum Physics brings back the instantaneous interaction at a distance which repulsed Newton. (At least, it appears instantaneous experimentally, so far, and it is certainly instantaneous in the existing Quantum formalism, which, amusingly, is in the same exact situation as Newtonian Physics: the Quantum as we know it today cannot function without that instantaneous Quantum Interaction.)

Whatever happens next, only one thing is clear: those who claim physics has been figured out know very little, and should be advised to shut up, lest their egregious statements confuse the public about the scientific method.

Patrice Ayme’

***

***

E = mcc? Here is my take on it:

The simplest idea to get to Energy = Mass is that light has momentum (experiments, and Poynting’s work on electromagnetism). Integrated (that is, summed up), the momentum transferred is… energy.

But also, upon emission of light, a recoil appears (Newton’s Third Law; that is what it means for light to have momentum). To keep the center of mass where it was before (Buridan’s law, aka “Newton’s” First/Second Law), light needs to carry inertial mass (also gravitational, according to the equivalence principle)… Poincaré, no fool, must have been teaching that at the Sorbonne in 1899 (when he first publicized E = mcc)…
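The recoil argument can be made quantitative with the standard photon-in-a-box sketch (my reconstruction, not necessarily the form Poincaré taught; I use c² rather than the essay’s cc notation inside the math):

```latex
% Photon-in-a-box: a box of mass M and length L emits a photon of energy E
% from its left wall; the photon is absorbed at the right wall.
\begin{align*}
 p_{\gamma} &= \frac{E}{c} & &\text{(light's momentum)}\\
 v_{\text{box}} &= \frac{E}{Mc} & &\text{(recoil, Newton's Third Law)}\\
 \Delta x &= v_{\text{box}}\, t \approx \frac{E}{Mc}\cdot\frac{L}{c}
           = \frac{EL}{Mc^{2}} & &\text{(box drift during the crossing)}\\
 m\,L &= M\,\Delta x = \frac{EL}{c^{2}} & &\text{(center of mass stays put)}\\
 \Rightarrow\; m &= \frac{E}{c^{2}}, \qquad E = mc^{2}. & &
\end{align*}
```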

QUANTUM TRUMPS SPACETIME

August 8, 2013

Abstract: simple considerations of a philosophical, non computational, nature, on Space, Time and the Quantum show that the former two are not basic (and that some apparently most baffling traits of the Quantum are intuitive!). Progress in knowledge of the interdependence of things should not be hampered by traditional prejudices. (Not an easy essay: readers are encouraged to jump around it like kangaroos!)

***

What is time? Today’s physics does not answer that question; it just computes with the notion as if it were obvious. To find out what time could be, a little bout of metaphysics, different from the tentative one underlying today’s understanding of nature, is needed.

Einstein amplified the notion that the universe is about spacetime (x,t) in a reference frame F. He and his friends Hilbert and Besso used the mathematical and physical ideas created by Riemann (and his Italian successors: Ricci, Levi-Civita, etc.)

Riemann: “Solitary and Uncomprehended Genius” (as Einstein said)

Lorentz discovered one had to assume that (x’,t’) in a moving frame F’ cruising by at a steady speed v is related to (x,t) in frame F according to the Lorentz transformations.

Lorentz got the Nobel Prize for finding these (thanks to the recommendation of the towering Henri Poincaré); I am not pointing this out to compare the relative merits of celebrities, but to establish the hierarchy of the discoveries they made, and thus the logic therein. (Poincaré’s 1904 “Principe de Relativité” was firmly established before Einstein showed up on the scene, and the latter’s contributions, although enlightening, have been vastly overestimated.)

Not that the initial logic of a discovery always endures, but sometimes it’s important. The Einstein cult has been obscuring reality; Einstein would have been the first to decry it. (Einstein basically ran away with Poincaré’s idea that the constancy of the speed of light, c, being always observed, was thus a fundamental law of physics, and made it the foundation of what Poincaré called “Relativité”.)

Only by using the Lorentz transformations are the equations of electrodynamics preserved. In other words: only thus is the speed of light measured to be c in both F, using (x,t) and F’, using (x’,t’).

So what is time t?

According to the scheme in Relativity, it’s simple: given the sanctity of the speed of light, c, and space x, time can be measured by having a photon of light going between two perfect mirrors, and counting the impacts (that’s what is called a light clock; it’s very useful to derive most equations of Relativity).

Indeed space is measured by the time it takes light to go back and forth. This sounds like circular logic: time is needed to measure space, and space is needed to measure time.

Does that mean one of the two, say, time, is derivative?

I used to think so (propped by the lack of time in Quantum Theory, see below). But, actually, no.

Indeed, time can be localized down to the proton scale.

One can measure time at that scale with how long it takes some elementary particle to decay. Or because to any particle is associated its De Broglie wave, hence a frequency (and that particle can be confined in as small a space as a proton).

Basically time can be measured at a point.

However, space, by definition, is… non local (space is always an extent, all the more if time is used to measure it, thanks to c; technically my idea is that space depends upon the holonomy group, while time does not; thus Minkowski’s “spacetime” belongs to the dustbin!).

Thus the conceptual universe in which electromagnetism basks makes it look as if, somehow, time were more fundamental.

The situation is the exact opposite in Quantum Theory. Quantum Theory is full of entangled situations. Measure such a situation somewhere, and it changes all over: a Quantum Process is all over it, whatever “it” is. Einstein called that “spooky action at a distance”. I call it the QUANTUM INTERACTION.

Einstein tried to escape the spookiness. Instead, I claim it should be embraced. After all, Quantum spookiness makes life possible.

We indeed know now that this spooky Quantum interaction is fundamental to life. It allows life to be more efficient than any understanding from classical mechanics could have it. Vision and the chlorophyll molecule use Quantum spookiness at a distance. This recent discovery did not surprise me at all. I fully expected it, just as I fully expect that consciousness will be revealed to be a Quantum effect (an easy prediction, at this point, in this Quantum universe!)

A computer using Quantum Theory would be more efficient, for the same reason: the Quantum computer computes all over, in a non local way. (The computers we have now are just sleek electron-using versions of the classical computers the ancient Greeks had, with their little toothed wheels; the Quantum computer is founded on a completely different process.)

This “spooky” non locality has alarmed many a thinker. But notice this simple fact: space itself, even the classical space used in electromagnetism, is non local (as one uses light travel, plus time, to determine space).

So it’s only natural that space in Quantum Theory be non local too.

The “spookiness” is easily understood thus: spacetime physics a la Einstein and company singles out a particular interaction, electromagnetism, and the sanctity of c, to measure the universe with. Why this one, and not another of the fundamental interactions we know?

Quantum Theory (QT) gets out of this would-be choice by choosing none of the traditional forces to measure space with!

As QT has it, as it stands, QT does not need to measure the universe. (I believe it does, using the Quantum Interaction, and I can support that with impossible simultaneous measurements at great distances, but that’s another, more advanced set of considerations.)

Those who think thinking is reduced to computing will object that it is not the same type of non locality (the one I claim to see in classical space, and the “spooky” one of Quantum space). Whatever: the non locality in Quantum Theory does not depend upon light speed. That’s the important point.

There, the lesson cannot be emphasized enough: on the face of it, the basic set-up of Quantum Theory tells us that light, and, in particular light speed, is NOT fundamental.

These few observations, should they prove to be as deep and correct as I believe they are, show the power of the philosophical method, even in today’s physics. Some will scoff, yet not consider carefully all the philosophy behind spacetime a la Einstein.

A warning for those who scoff about the importance of meta-physics: the founding paper of differential geometry in mathematics, and physics, was a lecture by Bernhard Riemann. It’s full of metaphysics and metamathematics, for the best.

The paper had just one equation (and it is a definition!)

That lecture was entitled “Über die Hypothesen, welche der Geometrie zu Grunde liegen” (“On the Hypotheses Which Underlie Geometry”). (Call these “hypotheses” meta-geometrical, metamathematical, or metaphysical.)

The lecture was published in 1868, two years after its author’s death (and 14 years after he gave it). Riemann’s main idea was to define manifolds and curvature. (Riemannian) manifolds were defined by a metric. Curvature ought to be a tensor, Riemann said, not just a simple number (a scalar, as with Gaussian curvature).

From top to bottom: positive, negative and no curvature.

Riemann generalized the notion of curvature to any dimension, thanks to the Riemann Curvature Tensor (the simplified Ricci form of which appears in Einstein’s gravitational field equation).
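For reference, here is the standard modern statement of these objects (textbook form, not Riemann’s original notation):

```latex
% Riemann curvature tensor, from the connection coefficients \Gamma:
R^{\rho}{}_{\sigma\mu\nu}
  = \partial_{\mu}\Gamma^{\rho}{}_{\nu\sigma}
  - \partial_{\nu}\Gamma^{\rho}{}_{\mu\sigma}
  + \Gamma^{\rho}{}_{\mu\lambda}\Gamma^{\lambda}{}_{\nu\sigma}
  - \Gamma^{\rho}{}_{\nu\lambda}\Gamma^{\lambda}{}_{\mu\sigma}
% Its Ricci contraction, and the Einstein gravitational field equation
% in which that contraction appears:
R_{\mu\nu} = R^{\rho}{}_{\mu\rho\nu}, \qquad
R_{\mu\nu} - \tfrac{1}{2}\,R\,g_{\mu\nu} = \frac{8\pi G}{c^{4}}\,T_{\mu\nu}
```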

Here is for some meta-physics; Riemann: “It is quite conceivable that the geometry of space in the very small does not satisfy the axioms of [Euclidean] geometry… The properties which distinguish space from other conceivable triply-extended magnitudes are only to be deduced from experience.”

Gauss, Riemann’s teacher, knew this so well that he had tried to measure the curvature of space, if any, using a triangle of tall peaks. Gauss found no curvature, but now we know that gravitation is best described as curved spacetime.

(This lack of Gaussian curvature shows that it’s not because a situation is not found under some conditions that it is absent under others. In biology, the proof by Medawar that Lamarckism was false, using mice, for which he got the Nobel (being British, ;-)) comes to mind: no Lamarckism in Medawar’s experiments did not prove that there would be no Lamarckism in other experiments; now four Lamarckist mechanisms are known!)

Twentieth Century physics, in particular the theory of gravitation, exploits the following fact, understood by Riemann as he lay dying from tuberculosis in Italy: force is a tautology for geodesics coming closer (or not). Thus curvature is force.

Einstein remarkably said: “Only the genius of Riemann, solitary and uncomprehended, had already won its way by the middle of the last century to a new conception of space, in which space was deprived of its rigidity, and in which its power to take part in physical events was recognized as possible.”

(I find this statement all the more remarkable and prophetic in that it is not in Einstein’s physics, and could not be, but rather in the one I would like to have, where fundamental dynamic processes literally create space…)

The fact that a tautology is at the heart of Einstein’s Theory of Relativity means that it explains nothing much! (Relativity fanatics are going to hate that statement!…although it describes very well what happens to objects evolving in spacetime, especially GPS, let it be said in passing.)

“Only to be deduced from experience”, said mathematician Riemann. What’s the ultimate experience we have? Quantum Theory. And what did we find QT said? You can’t measure with space, you can’t measure with time (although clearly the Quantum depends upon the differential topology of the situation, see the Aharonov-Bohm effect! Where, by the way, the space metric is made fun of once again!)

Last splendid idea from Riemann (1854-1866):

“Researches starting from general notions, like the investigation we have just made, can only be useful in preventing this work from being hampered by too narrow views, and progress in knowledge of the interdependence of things from being checked by traditional prejudices.”

Amen.

***

Patrice Ayme

FASTER THAN LIGHT

October 29, 2011

WHY RELATIVITY OUGHT TO FAIL AT HIGH ENERGIES.

Why Relativity Is Not That Relative: A Theory Of Velocity?

***

Main Idea: Theoretical triage on Special Relativity leaves the theory in shambles at high energies. A precise mechanism to blow up the theory is produced, which may allow Faster Than Light (FTL) travel.

***

Abstract: I don’t know whether neutrinos go faster than light or not. The OPERA neutrino anomaly rests on some guesswork about the shape of long neutrino pulses, so its superluminal results may well be a mirage. (The experiment will soon be run with very short pulses.)

However, neutrinos may go Faster Than Light. Why not? Because the idea has caused obvious distress among many of the high priests of physics?

In science one should never suppose more than necessary, nor should one suppose less than necessary. Faith is necessary in physics, but it should reflect the facts, and nothing but the facts. Such is the difference between physics and superstitious religion.

There are reasons to doubt that the leanness of Relativity reflects appropriately the subtlety of the known universe. This is what this essay is about.

OK, I agree that the equations of what Henri Poincaré named “Relativity” seem, at first sight, to lead one to doubt Faster Than Light speeds. Looking at those equations naively, and without questioning their realm of validity, it looks as if, as one approaches the speed of light, one gets infinitely heavy, short, and slow.

However, adding as an ingredient much of the rest of fundamental physics, I present below an argument that those equations ought to break down at very high energies.

As my reasoning uses basic “Relativity” (Special and General), plus basic Quantum physics, one may wonder why something so basic was not pointed out before.

The answer is that, just as in economics most people are passengers on the Titanic, in cattle class, so it is in physics.

Most people have a quasi religious approach to “Relativity”, and their critical senses have been so stunted that they even fail to appreciate that Einstein did not invent “Relativity” (as I will perhaps show in another, gossipy essay). This is of some consequence, because Einstein had a rather shallow understanding of some of the concepts involved (as he proved aplenty in his attempts at a Unified Field Theory… Pauli called them “embarrassing“… well, maybe, my dear Wolfgang, all of Relativity was embarrassing, another collective German hallucination…)

The logic of the essay below is multi-dimensional, but tight. As it throws most of relativity out, I am going to be rather stern describing it:

1) At high energies, absolute motion can be detected. This renders Galileo’s Relativity, and its refurbishment by Poincare’, suspicious.

2) Time dilation is real. There is no “Twin Paradox’ whatsoever; fast clocks are really slow. This is well known, but I review the physical reason why.

3) Length contraction is also real. Moving rods really shrink. It is not a question of fancy circular definitions (as some have had it). I show this below by an electromagnetic argument which makes the connection between relativistic transformation and Maxwell equation obvious.

4) A contradiction is derived. A particle would suffer gravitational collapse if its speed came close enough to the speed of light, as it would get confined within its Schwarzschild radius, should the equations of “Relativity” remain the same at all energies. A particle gun, at high enough energy, would become a black hole gun. That would violate all sorts of laws.

Thus “Relativity” breaks down. The simplest alternative is that some of the energy spills into acceleration (beyond c!) instead of mass acquisition.

5) Einstein’s conviction that Faster Than Light is equivalent to time travel is shown to be the result of superficial analysis. If the equations of relativity break down at high energies, they cannot be used to present us with negative time. Although I will not insist on this too heavily in this particular essay, Einstein got confused between a local notion (time) and a non local one (speed, and its parallel transport around loops).

As a humorous aside, should high energy neutrinos go faster than light, it should be possible to measure time beyond the speed of light, using a high energy neutrino clock.

Thus a reassessment of “Relativity” is needed, starting with the name: if all the “Relative” laws blow up at high energies, that is, at high velocity, uniform motion is not relative, but absolute. The theory of Relativity ought to become the Theory of Velocity (because, at intermediate energies, all of the laws of the present “Theory of Relativity”, including fancy rotational additions of velocities, do still apply!) Mach’s principle, already absolute for rotational motion, and already favoring a class of uniform motion, would become absolute for any motion.

***

ABSOLUTE, UNIFORM MOTION IS DETECTABLE… IF ENERGETIC ENOUGH:

Poincare’ named the theory that he, Lorentz and a dozen other physicists invented, “Relativity“. Yes, Poincare’, not Einstein: this essay is about truth, not convention to please a few thousand physicists and a few billions imprinted on the Einstein cult. (And I like Einstein… When he is not insufferable.)

That name, “Relativity”, may have been a mistake. A better name, I would suggest, would be “THEORY OF VELOCITY“, for the following reasons:

In 1904, summarizing the experimental situation then, Henri Poincaré generalized the Galilean relativity principle to all natural phenomena, writing:

“The principle of relativity, according to which the laws of physical phenomena should be the same, whether to an observer fixed, or for an observer carried along in a uniform motion of translation, so that we have not and could not have any means of discovering whether or not we are carried along in such a motion”.

A first problem is that the American Edwin Hubble, and others, ten years after the premature death of Poincaré, discovered cosmic expansion, which defines absolute rest. OK, Galileo would have said:

“Patrice, just don’t look outside, I told you to stay in your cabin, in the bowels of the ship.” Fair enough, but rather curious that inquiring minds ought to be blind.

However, fifty years after the death of Poincare’, the Cosmic Microwave Background (CMB) was discovered.  That changed the game completely. You can stay in the bowels of the ship all you want, Galileo, when you move fast enough relative to the CMB, the wall of your cabin towards your uniform motion, v, will turn incandescent, and then into a plasma, as the CMB will turn into gamma rays, thanks to the Doppler shift. Even burying Galileo’s head in the sand will not work. Even a Galileo ostrich can’t escape a solid wall of gamma rays.
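To put numbers on this (my own back-of-the-envelope, not in the original post; the 100 keV “gamma ray” threshold is an arbitrary illustrative choice), here is the Doppler factor required:

```python
import math

# How fast must Galileo's ship cruise relative to the CMB (T ~ 2.725 K)
# before head-on CMB photons are Doppler-shifted into gamma rays?
# Head-on relativistic Doppler factor: D = sqrt((1 + b) / (1 - b)).

K_B_EV = 8.617e-5                      # Boltzmann constant, eV per kelvin
T_CMB = 2.725                          # CMB temperature, K
E_peak = 2.82 * K_B_EV * T_CMB         # Wien-peak photon energy, ~6.6e-4 eV

E_gamma = 1.0e5                        # a modest gamma ray: 100 keV
D = E_gamma / E_peak                   # required Doppler factor

# For D this large, beta is so close to 1 that computing it directly
# underflows; use numerically safe forms instead.
one_minus_beta = 2 / (D**2 + 1)
gamma = (D + 1 / D) / 2                # exact, since D = gamma * (1 + beta)

print(f"Doppler factor D ~ {D:.3e}")
print(f"1 - beta ~ {one_minus_beta:.3e}")
print(f"gamma ~ {gamma:.3e}")
```

The required Lorentz factor is huge (tens of millions), but nothing in the “Relativity” formalism itself forbids it: the wall of gamma rays is a fact about the CMB, not about the equations.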

Relativity fanatics may insist that they are correct to first order, when they don’t go fast enough. OK, whatever: the equations of “Relativity” do not change shape with v, but, as I just said, the bowels of Galileo’s ship will always disintegrate, if the speed is high enough.

When equations don’t fit reality, you must quit them. That’s why it’s called science.

But let’s forget this glaring problem, as I said, I have found a much worse one. To understand it requires some background in so called “Relativity“, the theory of Gravitation (so called “General Relativity“, an even more insipid name), and Quantum physics.

We have to go through some preliminaries which show that time dilation and length contraction are real physical effects. There is nothing relative about them. So it’s not just the relativity of uniform motion which is not relative: the other two basic effects of relativity are not relative either.

People can write all the fancy “relativistic” equations they want, a la Dirac, and evoke some spacetime mumbo jumbo. Those equations rest on, and depict, the three preceding relative effects, and if these are no longer relative at high energies, one cannot use the equations at very high energies either. Paul Dirac can singsong that they are pretty all he wants, like a canary in a coal mine. Physics is not a fashion show. It’s about what’s really happening. If the canary is dead, it’s time to get out of there.

***

POINCARE’-LORENTZ LOCAL TIME (prior to 1902):

One central notion of standard relativity is “Local Time“, which Poincare’ named and extracted from the work of Lorentz (sometimes calling it “diminished time“). We have:

t’ = t multiplied by square root of (1- vv/cc)

[Because of problems with the Internet carrying squares and square roots properly, I write cc for the square of c, instead of c^2, as some do, etc… After all, it’s exactly what it is. In the end, mathematics is eased, and rendered powerful by abstraction, but it is all about words.]

So when v = 0, t’ = t, and when v = c, t’ = 0, or, in other words, t’ stops. Here t’ is the time in the coordinate system F’ travelling at speed v relative to the coordinate system F, with its time t. More exactly, t is what one could call “local electromagnetic time”. Some physicists would get irritated at that point, and snarl that there is nothing like “local electromagnetic time”. In science precision is important: we are more clever than chimps because we make more distinctions than chimps do.

How do we find t’ knowing t? We look at a light clock in F’ from F. If we look at a light clock perpendicular to v, in F’, from F, we see that light in F’ will have to cover more distance to hit the far mirror of the light clock. That clock will run slow. If we suppose there is only one measure of time in F’, that means time in F’ will run slow. (This unicity of time is a philosophical hypothesis, but it has been partly confirmed experimentally since.) This is what Poincaré (also) called “diminished time”, in his 1902 recommendation of Hendrik Lorentz for the Nobel Prize in physics (for what Poincaré called the Lorentz transformations).

The math to compute t’ from t uses nothing harder than Pythagoras’ theorem. The idea is to compare two IDENTICAL light clocks, both perpendicular to v: one in F, standing still, the other in F’, moving along at v. We look at the situation from F. While the light in the moving clock goes from one mirror to the other, a time t elapses in F, so that light covers ct along its slanted path; meanwhile the moving clock ticks off its own time t’, so the mirror separation is ct’. Why always c? Well, light is supposed to always (appear to) go at the same speed, the astronomers’ practice, as Poincaré reiterated in 1898.

Meanwhile the clock itself, where the light emanated from in F’, has moved by vt. By Pythagoras:

(ct)(ct) = (vt)(vt) + (ct’)(ct’). Solving gives t’ = t multiplied by square root of (1 – vv/cc): the relation between t’ and t above.
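A numeric sanity check of the diminished-time relation t’ = t·sqrt(1 – vv/cc) (a sketch with c set to 1; the speed 0.8c is an arbitrary choice):

```python
import math

# Light clock check, with c = 1. With t' = t * sqrt(1 - vv/cc), the three
# lengths ct' (mirror separation), vt (clock drift) and ct (slanted light
# path, as seen from F) close a right triangle, per Pythagoras.
c = 1.0
v = 0.8      # speed of F' relative to F, as a fraction of c
t = 1.0      # elapsed time in F

t_prime = t * math.sqrt(1 - v * v / (c * c))   # diminished local time

hypotenuse_sq = (c * t) ** 2
legs_sq = (v * t) ** 2 + (c * t_prime) ** 2
print(t_prime)                      # ~0.6: the moving clock ticks slow
print(abs(hypotenuse_sq - legs_sq)) # ~0, up to rounding
```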

***

FITZGERALD CONTRACTION IS A REAL PHYSICAL EFFECT:

The Irish physicist Fitzgerald suggested that, to explain the null result of the Michelson-Morley experiment, the arm of the instrument was shortened, as needed. What is the Michelson-Morley device? Basically two light clocks at right angles, one perpendicular to v, the other along v. These two arms allow one to make light interfere, after it has gone back and forth either along the direction of v, or perpendicular to it.

No difference was found. The way I look at it, it shows that electromagnetic time is indifferent to direction. The way it was looked at then was that there was no “ether drag“.

What is going on physically is very simple: F’ moves relative to F. Light released at x = x’ = t = t’ = 0 is going to have to catch up with the other mirror. (notice that this is a slight abuse of notation, as the xs and ts are in different dimensions, and F and F’ in different coordinate systems…) Suppose v is very high. Seen from F, it is obvious that photons will take a very long time to catch up with the mirror at the end of the arm. Actually, if v was equal to the speed of light, it would never catch up (if someone looked in a mirror, when going at the speed of light, she would stop seeing herself).

Well, however, the M-M device showed no such effect. Thus the only alternative was that the length of the M-M interferometer shrank, as Fitzgerald proposed in 1892 (13 years before Einstein’s duplication).
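The sizes involved are easy to compute (a sketch in the “ether” frame, with c = 1 and an arbitrary v = 0.5c):

```python
import math

# Round-trip light times in the two Michelson-Morley arms, each of rest
# length L, computed in the frame where the apparatus moves at v --
# deliberately WITHOUT any length contraction.
c = 1.0
L = 1.0
v = 0.5
gamma = 1 / math.sqrt(1 - v * v / (c * c))

# Arm along v: closing speed c - v going out, c + v coming back.
t_along = L / (c - v) + L / (c + v)             # = (2L/c) * gamma**2

# Arm across v: Pythagoras, exactly as in the light clock.
t_across = 2 * L / (c * math.sqrt(1 - v * v / (c * c)))   # = (2L/c) * gamma

print(t_along, t_across)   # unequal: the fringes SHOULD shift...
# ...but they don't. Shrinking the along-arm by 1/gamma restores equality:
print(abs(t_along / gamma - t_across))   # ~0 with Fitzgerald contraction
```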

Poincare’ introduced “Poincare’ stresses” to explain the effect as a real physical effect. That was explored further in 1911 by Lorentz, and then worked out in even more detail in the 1940s, using Dirac’s quantum electrodynamics.

The reason I am giving all these details is that Einstein could not understand Poincare’s insistence on “mechanical” models. That’s OK; not everybody can be super bright.

Einstein preferred the more formal insistence that the arm along v had to shrink, because c was constant, and that was it. This was exactly Fitzgerald’s initial reasoning, and it does not explain anything: true, it seems necessary that the arm will shrink, but is it really happening, and if so, how? Einstein’s platitudes about the mind of god, which he was most apt to seize, are just plain embarrassing… Especially as it turns out that, in this case, the one who got the idea was Poincaré. Maybe a god to Einstein, but to me, just a man. (All the more confusing as Einstein tried to refer, and defer, to Poincaré less than justice required.) By clinging to Fitzgerald’s original vision like a rat to a reed in the middle of the ocean, Einstein aborted the debate with a hefty dose of superstition.

As the detailed development of Quantum Field Theory showed, the mechanical models were the way to go. They reveal a lot of otherwise unpredictable, unanalyzable complexities. Just as we are going to do below.

So what is happening with the relativistic contraction?

Maxwell’s equations tell us how the electromagnetic field behaves. In ultra modern notation, they come down to dF = 0, d*F = J (d being covariant differentiation, J the current). Very pretty.

Maxwell is not all, though. The Lorentz force equation tells us how particles move, when submitted to the electromagnetic field. It is:

Force = q(E + vB)

[E, B are vector fields, vB is the (vector) cross product of the particle velocity, the vector v, with B.]

Let’s suppose the particle moves at v. As it does, it will be reached simultaneously by the electromagnetic field from two different places. One of these field elements will be a retarded component, Fretarded; the other is obtained from Pythagoras’ theorem, using the same sort of diagram used for the light clock. One can call that component Frelativistic. Frelativistic is proportional to the usual gamma factor of relativity, namely 1/sqrt(1-vv/cc). The total field incorporates Fretarded plus Frelativistic. The relativistic component basically crushes any particle sensitive to the Lorentz force in the direction of motion.

In particular, it will crush electronic orbitals. So atoms will get squeezed. That reasoning, by the way, explains directly, physically, why the Lorentz transformations are the only ones to respect the Maxwell equations. It is better to achieve physical understanding rather than just formal understanding (by the way, professor Voigt found the formal argument for the Lorentz transformations in 1887, 18 years before Einstein).
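One way to see the squeeze concretely (my own illustration, not the essay’s derivation): the exact field of a uniformly moving point charge, from the Liénard-Wiechert potentials, flattens into a pancake perpendicular to v:

```python
import math

# Angular factor of the electric field of a point charge in uniform motion
# at beta = v/c, at fixed distance, versus the angle theta between v and
# the line of sight:  E(theta) ∝ (1 - b^2) / (1 - b^2 sin^2 theta)^(3/2).

def field_factor(beta, theta):
    b2 = beta * beta
    return (1 - b2) / (1 - b2 * math.sin(theta) ** 2) ** 1.5

beta = 0.9
gamma = 1 / math.sqrt(1 - beta ** 2)

along = field_factor(beta, 0.0)           # along the motion: weakened, 1/gamma^2
across = field_factor(beta, math.pi / 2)  # perpendicular: strengthened, gamma

print(along, across)
# The field binding an atom's electrons is thus crushed along v -- the
# electromagnetic geometry behind the contraction argued above.
```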

How do formal relativists a la Einstein look at this? They look at it eerily, not to say… ethereally. Max Born, a Nobel Prize winner (for the statistical interpretation of the Quantum waves) and a personal friend of Einstein, expounds the formalism with coherently infuriating declarations around p. 253 of his famous book “Einstein’s Theory of Relativity“.

“For if one and the same measuring rod…has a different length according to it being at rest in F, or moving relative to F, then, so these people say, there must be a cause for this change. But Einstein’s theory gives no cause; rather it states that the contraction occurs by itself, that is an accompanying circumstance to the fact of motion. In fact this objection is not justified. It is due to too limited view of the concept “change”. In itself such a concept has no meaning. It denotes nothing absolute, just as data denoting distances or times have no absolute significance. For we do not mean to say that a system which is moving uniformly in a straight line with respect to an inertial system F ‘undergoes a change’ although it actually changes its situation with respect to system F.

The standpoint of Einstein’s theory about contraction is as follows: a material rod is physically not a spatial thing, but a spacetime configuration.”

This sort of theology will remind some of Heidegger’s writings, when the pseudo philosopher looks for the “ground” and never finds it. Too much relativity will do that for you. Stay with confusion, end up with Auschwitz.

Could it be that, according to Born, a bird which flies by does not change, as it merely ‘changes its situation as a spacetime configuration’? Thus the bird at rest on its branch has not changed into a bird flying. Maybe Born should have received another Nobel for this other wonderful theory?

What Born forgets is this: 1) let’s suppose there is such a thing as absolute rest (given, once again, by the reality of the CMB). 2) then a fast frame passing by has first been colossally accelerated before reaching that high relative speed where vv/cc approaches 1. So the fast frame has undergone an absolute change… And I explained what it is.

3) trying to play relative games between relatively relativistically moving frames does not wash, as there is privileged state of rest (or quasi rest: the Earth moves at 370 km/s relative to the CMB).

The derivation I sketched above provides a cause for the change. It shows that, by reason of the uniform displacement, a real boost in the e-m field occurs, which causes the contraction. So, basically, it’s the electromagnetic geometry of uniform motion which causes the Lorentz transformations. It’s deep down not mysterious at all.

The reasons for real time dilation and real length contraction are plain, and absolute.

This is all about the geometry of electromagnetism. When Einstein wrote down in marble his formal considerations about relative this, relative that, a full generation would elapse before the discovery of the neutrino, which responds to weak interactions (OK, there is an electroweak theory, but we are not going to remake all of physics in this essay!)

By the way, I can address in passing another confusion: some will say that by standing on the surface of the Earth one undergoes an acceleration of one g (correct!), and thus why would an acceleration of one g in a straight line for a year (which is enough for high relativistic speeds) cause all these absolute changes I crow about?

The reason is simple: in one case no energy is stored, the acceleration is purely virtual, as the ground is in the way. In the other case, a tremendous amount of energy piles up inside the moving body.

***

CLINCHER: YOU CAN’T ALWAYS SQUEEZE WHAT YOU WANT, BUT IF YOU TRY SOMETIMES, YOU GO FTL:

The Schwarzschild radius is given by R = 2GM/cc, where M is the mass of the body, G is the universal constant of gravitation, and c is the speed of light.

The energy of a particle is E = hf, where h is Planck’s constant, and f the frequency of its matter wave. Plug in Poincaré’s mass-energy relation: E = Mcc. Now E/cc is the inertial mass M, and thus, by the equivalence principle, the gravitational mass: hf/cc.

Thus the Schwarzschild radius of a particle of matter wave of frequency f is: R = 2Ghf/cccc.

But now comes the clincher: As the particle accelerates, at ever increasing speed v, its matter wave will shrink ever more, from (Fitzgerald) length contraction. At some point the wave will shrink within the Schwarzschild radius of the particle.
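Setting the contracted matter wave equal to the Schwarzschild radius gives the energy scale at which this clincher bites (my own numeric sketch, not in the original argument, with h-versus-hbar and factor-of-2 conventions left loose):

```python
import math

# At what energy does a particle's de Broglie wavelength, lambda = h c / E
# (ultra-relativistic), shrink inside its own Schwarzschild radius,
# R = 2 G (E/cc) / cc?  Solve lambda = R for E.  SI units throughout.
G = 6.674e-11        # gravitational constant
c = 2.998e8          # speed of light
h = 6.626e-34        # Planck's constant

E_collapse = c**2 * math.sqrt(h * c / (2 * G))            # joules
E_planck = c**2 * math.sqrt(h * c / (2 * math.pi * G))    # Planck energy (hbar-based)

print(f"collapse energy ~ {E_collapse:.2e} J")
print(f"Planck energy   ~ {E_planck:.2e} J")
# Both land near 1e9 J (~1e19 GeV): the argument only bites at the Planck scale.
```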

If one takes the gravitational collapse of the particle within itself at face value, the particle would exit physics, never to be seen again, but for its gravitational effect. That is obviously impossible.

So something has to give in the equation F = ma; here F is the force through which one pumps energy into the system (an electromagnetic field, or a muon jet, whatever), m is the relativistic mass, and a is the acceleration. What we saw is that m is bounded. Thus a has to increase, and the particle will accelerate through the speed of light. QED.

Some could object that I used the photon energy relation, E = hf, instead of M = (Rest Mass)/sqrt(1 – vv/cc). But that would not change the gist of the argument any.

[Neutrinos are suspected of having a non-zero mass, because they oscillate between various states; but the mass is so small, it has not yet been measured. Similarly the question of the photon rest mass is an experimental problem!]

Another objection could be that I used the Schwarzschild radius, although it does not apply to single particles. Indeed, the initial argument (Tolman-Oppenheimer-Volkoff) was that the gravitational force would overwhelm the nuclear force inside an extremely dense star. The nuclear force is repulsive at short distance.

[I tried to show a picture of the nuclear force, but I had to give up as the WordPress primitive system does not allow me to.]

A rough sketch of the force between two nucleons shows a strong repulsive peak, which is, however, finite. The force depends upon gluon exchanges, is very spin dependent, etc. The idea of TOV was that gravitation would overwhelm it, and neutron degeneracy, under some circumstances. The basic idea in this essay is the same, although it is the all too real Lorentz-Fitzgerald contraction, not gravity, which does the crunching.

Previously, Einstein had tried to demonstrate that the Schwarzschild singularity had no physical meaning. His argument was contrived, erroneous. TOV succeeded in proving Einstein wrong (hey, it can be done!).

Besides, all these subtleties can be blown away by looking at particles to which the strong force does not apply, such as photons or neutrinos. They both have mass, as far as creating geometry is concerned.

In the Schwarzschild computation, a term shows up which causes a singularity at a finite distance. The term arises purely from the spherical coordinates and the imposition of the vacuum Einstein gravitational equation. It is indifferent to kinetic effects (an important detail, as I put the Fitzgerald squeeze on). That term is identified with a mass for purely geometrical reasons. That mass will appear through Poincare’s E = mcc, or m = hf/cc, after plugging in the de Broglie relation.

This allows one to circumvent Hawking-style trans-Planckian arguments (which, anyhow, Hawking superbly ignored in deriving Hawking radiation).

In any case f = square root [ccccc/2Gh]. Plugging in the numbers, one gets trouble when the frequency reaches ten to the power 43 or so. That’s about ten to the power 16 TeV, or roughly 2 tons of TNT. About one million billion times more energetic than the CERN neutrinos. But then, of course, the effect would be progressive, somehow proportional to energy. (Also see Large Dimensions below.)
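The arithmetic of this paragraph can be checked directly. A small Python script (the physical constants are standard reference values, and the TNT equivalence of 4.184 gigajoules per ton is a conventional figure I have added myself):

```python
import math

G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8       # speed of light, m/s
h = 6.626e-34     # Planck's constant, J s

# Frequency at which the matter wavelength meets the Schwarzschild radius:
# f = sqrt(c^5 / (2*G*h)), as in the text.
f = math.sqrt(c**5 / (2 * G * h))
E = h * f                  # corresponding energy, joules
tnt = E / 4.184e9          # tons of TNT (1 ton TNT = 4.184e9 J, by convention)
print(f"f ~ {f:.2e} Hz, E ~ {E:.2e} J ~ {tnt:.1f} tons of TNT")
```

The frequency indeed comes out of order ten to the power 43 Hz, and the energy of order a few gigajoules, i.e. of order a ton of TNT, consistent with the rough figure in the text.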

***

EINSTEIN GRANDFATHER OBJECTION SELF CONTRADICTING:

Conventional physicists will make the following meta-objection to the preceding. According to the lore of the standard theory of Special Relativity, the preceding scheme is completely impossible. Countless physicists would say that if one had a particle going faster than light, one could go back in time. Einstein said this, and everybody has been repeating it ever since, because Einstein looked so brainy.

Or maybe because he was also German. Thus he had got to have invented Poincare’-Lorentz relativity! (We meet here again the Keynes-Hitler problem: in much of the Anglo-Saxon world, blind admiration for anything German reigns, sometimes to the point of spiting reason. I am myself fanatically pro-German, but there are lines of prejudice I will not cross.)

OK, granted, there is some mathematical superficiality to support Einstein’s confusion between the speed of light, time, and causality. Those mathematics are reproduced in the next chapter, where the mistake is exposed.

There are three problems with using the Grandfather Paradox to shoot down my reason for FTL.

The first objection is that Einstein’s General Relativity itself makes the “Grandfather Paradox” possible. So it is rather hypocritical to use it to fend off contradictions of (Special) Relativity.

That grandfather paradox increasingly haunts standard fundamental physics. The paradox arises from “Closed Timelike Curves“, which appear in (standard) Relativity (not in my version of Relativity). Hawking was reduced, around 1992, to the rather ridiculous subterfuge of a “Chronology Protection Conjecture“.

The second objection to the objection is that, in practice, experimentally, the objection has proven irrelevant, through years of increasingly precise experiments. Basically, what happens is that Quantum theory allows Faster Than Light teleportation of some sort (of “states“, as the saying goes). And experiments are confirming this, leading to claims that time travel has been achieved experimentally. There are many references; here I give one from July 2011; one of the authors, Professor Lloyd from MIT, is a Principal Investigator on just this sort of problem.

Personally I have no problem with the results: they fit my vision of things, which is very friendly to Faster Than Light. But I have a problem with the standard semantics of “time travel“, which comes straight from the way Einstein looked at time.

Einstein was, in my opinion, very confused about time. He notoriously said: “time is nothing but a stubbornly persistent illusion“. In my vision of physics, time is fundamental, and it has nothing to do with space. I believe the notion of spacetime applies well to gravitation, at great distances, such as Earth orbit. I used above one of the staples of “General Relativity”, the Schwarzschild radius, but very carefully, from first principles. When people argue “time travel“, they argue from later principles, later day saints, so to speak. They confuse cornucopia and utopia.

I believe this: Time is the measure of the change of the universe. It’s not subject to travelling. But it is subject to confusion, and Einstein is the latter’s prophet.

(I also believe that the Second Law of Thermodynamics applies even at the subquantal level; the subquantal level is what those who study entanglement and non-locality in Quantum physics, such as the authors I just quoted, tangle with. OK, the latter point is beyond this essay, the main theme of which does not use it at all.)

***

THE EINSTEIN GRANDFATHER ERROR:

The third objection to using the grandfather paradox to contradict me is much more drastic, and revealed by my own pencil and paper (the following is just an abstract of my reasoning). The conventional reasoning due to Einstein is faulty (and faulty in several ways). 

Most specialists of relativity subscribe to statements such as “it should be possible to transmit Faster Than Light signals into the past by placing the apparatus in a fast-moving frame of reference“.

Einstein’s argument rests on equations for time such as t’ = (t – vx/cc)/square root of (1 – vv/cc); for a signal of speed w, x = wt, so that t’ = t(1 – vw/cc)/square root of (1 – vv/cc).

In this, v is the speed of the moving frame, and w the speed of an alleged Faster Than Light signal. One sees that, given w > c, there is a v close enough to, but less than, c, such that t’ becomes negative.
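The sign flip is easy to verify numerically. A minimal sketch (the particular values of v and w are my choices, purely illustrative): with x = wt, the transformed time is t’ = t(1 – vw/cc)/square root of (1 – vv/cc), which goes negative exactly when vw > cc:

```python
import math

def t_prime(t, v, w, c=1.0):
    """Lorentz-transformed arrival time of a signal of speed w,
    as seen from a frame moving at speed v (units with c = 1)."""
    x = w * t  # position of the signal at lab time t
    return (t - v * x / c**2) / math.sqrt(1 - v**2 / c**2)

# A subluminal signal (w < c): t' stays positive in every subluminal frame,
# since v*w < c^2 whenever both v and w are below c.
print(t_prime(1.0, v=0.9, w=0.5))   # positive
# An FTL signal (w = 2c) seen from a fast frame (v = 0.6c): v*w = 1.2 > c^2,
# so the transformed time comes out negative.
print(t_prime(1.0, v=0.6, w=2.0))   # negative
```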

t’ is time. So, given a Faster Than Light signal of speed w, there are frames, moving with enough of a speed v, in which events seem to run in reverse. Thus Very Serious Professors have argued that the putative existence of that FTL w reverses causality. They abbreviate this by saying that time travel is a consequence of FTL. And what do I say to this howling of the Boeotians? Not so fast.

Indeed, this ought to be physics, not just vulgar mathematics. One argument was overlooked by Einstein, who apparently believed blindly in Poincare’s Relativity Postulate, even more than its author himself did.

We have discovered above that there is a limit energy, beyond which Relativity breaks down. Thus the Lorentz transformations break down.

Let’s look at the situation in a more refined way. What happens to the traditional transverse light clock encountered above, and in all traditional Relativity treatises? Well, in this thought experiment, admitting my reasoning above as valid, the light clock, if made energetic enough, will start to accelerate faster than light. So it would stop functioning, because light will not be able to catch up with it. This may sound strange, but it’s not. In the mood of the preceding, it just says that the light used is limited in energy, and the clock is not. To still keep time for a while, we may have to use a high energy neutrino clock (supposing that neutrinos, indeed, travel Faster Than Light; a neutrino clock is built exactly like a photon clock, with the photons replaced by neutrinos).

This is why the Lorentz transformations fail: because the clock fails, as the light cannot catch up with the mirror. Pernicious ones could claim that I contradict myself, as my argument used the Fitzgerald contraction. Yes, I used it, up to the point where it fails. In my logic, the contraction fails first.

Another argument is that the alleged violations of causality which hard-core relativists find in the appearance of negative times are much less frequent than they fear, because time dilation dilutes the statistics; a relativistic point they absolutely ignore. Also, as I have argued in the past, in https://patriceayme.wordpress.com/2011/09/01/quantum-non-locality/, Quantum non-locality, in conjunction with Faster Than Light space expansion, implies violations of local causality. Basically, Quantum entanglements can happen beyond the space horizon set by the cosmic FTL expansion (which is an experimental fact).

Thus causality has to be taken with a grain of salt, and plenty of statistics (a characteristic of modern physics, at least since Boltzmann).

***

AND LET’S NOT FORGET FASTER THAN LIGHT FROM THE QUANTUM…

In physics, these days, a thousand theories blossom, and nearly a thousand fail miserably. So, on the odds, it ought to be the case with the preceding theory. However, the preceding rests on fundamentals, whereas much of the fashionable stuff rests on notions nobody understands, and few bothered to study (such as the interdiction of FTL, which rests just on the shallow logic of Einstein exposed above).

An interesting sort are the Large Extra Dimension theories (partly instigated and made popular by the famous Lisa Randall, a glamorous professor from Princeton, Harvard and Solvay, author of the just-published “Knocking On Heaven’s Door”). They would help the preceding arguments, by considerably lowering the energy threshold at which relativity fails. Thus, if superluminal neutrinos are indeed observed, in light of the preceding, they would suggest the existence of Large Extra Dimensions.

Many of the theories which have blossomed are “not even wrong“. The obsession with superstrings is typical. Why obsess over those, when basic Quantum theory had not been figured out? Is it because superstrings cannot be tested?

Basic Quantum theory is subject to experiments, and some have given spectacular, albeit extremely controversial, results. There was curiously little interest in confronting those tests. If nothing else, the OPERA experiment illustrates that the failure of Relativity at high energy can be tested (OPERA was really out to test neutrino oscillations, which are, already, a violation of Relativity in some sense, as they confer a (rest?) mass on a particle going at the speed of light!).

The preceding theory, and its absolute frame, fits like a glove with the idea of an absolute (but stochastic and statistical) space, constructed from the Quantum Interaction (or “potential“, as Bohm has it). That would partly cause the CMB. That theory rests on the predicted failure of standard Quantum theory at large distances, and on the existence of an extremely Faster Than Light interaction. It’s getting to be a small world, theoretically speaking…

***

Patrice Ayme

***

P/S: The reasoning above is so simple that normal physicists will not rest before they can brandish a mistake. So what could that mistake be? Oh, simply that the perpetrator, yours truly, did not use full Relativistic Quantum Mechanics, obviously because s/he is ignorant. But referring to Relativistic Quantum Mechanics would, by itself, use in the proof what one wants the proof to demonstrate. Relativistic Quantum Mechanics assumes that Relativity is correct at any speed. I don’t. I don’t, especially after looking outside.

OK, that was a meta-argument. Can I build a more pointed objection? After all, Dirac got QED started on esthetic grounds, while elevating relativity to a metaprinciple. But beauty is no proof. Nor is the fact that Dirac’s point of view led to several predictions (the Dirac equation for electrons, Spin, Positrons).

One could say that the Heisenberg Uncertainty Principle (HUP) would prevent the confinement of a particle in such an ever smaller box. However, the HUP is subordinate to general De Broglie Mechanics (GDM), and I see no argument why GDM could not get confined.