Archive for the ‘Non Locality’ Category

“Proof” That Faster Than Light Communications Are Impossible Is False

December 16, 2017

There are theories everywhere, and the more ingrained they are, the more suspiciously they should be looked at. From the basic equations of relativity it is clear that if one adds speeds less than the speed of light, one will get a speed less than the speed of light. It is also clear that adding impulse (momentum) to a mass makes it more massive, while its speed asymptotically approaches that of light (and, as I explained, the reason is intuitive, from Time Dilation).

The subject is not all sci-fi: modern cosmology brazenly assumes that space itself, after the alleged Big Bang, expanded at a speed of at least 10^23 c (something like one hundred thousand billion billion times the speed of light c). The grossest, yet simplest, argument for superluminal expansion is this: the observable universe is roughly 100 billion light years across, yet only about ten billion years old. Thus it expanded at a minimum average clip of ten billion light years every billion years: 100/10 = 10, so ten times c on average, according to standard cosmology. One could furiously imagine a spaceship somehow surfing on a wave of warped space, expanding for the same obscure reason.
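
A back-of-the-envelope check of that average rate, in Python (a sketch with the round numbers used above; the more precise figures, about 93 billion light years and 13.8 billion years, do not change the conclusion):

# Average expansion speed of the observable universe, in units of c.
diameter_ly = 100e9   # light years across, rough figure used above
age_yr = 10e9         # years old, rough figure used above

average_speed_in_c = diameter_ly / age_yr  # light years per year = multiples of c
print(average_speed_in_c)                  # -> 10.0, i.e., ten times c on average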

The question naturally arises whether velocities which are greater than that of light could ever possibly be obtained in other ways. For example, are there communication speeds faster than light? (Throwing some material across will not work: its mass will increase, while its speed stays less than c.)

Textbooks say it’s not possible. There is actually a “proof” of that alleged impossibility, dating all the way back to Einstein (1907) and Tolman (1917). The mathematics are trivial (they are reproduced in my picture below). But the interpretation is apparently less so. Wikipedia weirdly claims that faster than light communications would allow travel back in time. No. One could synchronize all clocks on all planets in all the galaxies, and having faster than light communications would not change anything. Why? Time is local, faster than light data travel is nonlocal.

The problem of faster than light communications can be attacked in the following manner.

Consider two points A and B on the X axis of the system S, and suppose that some impulse originates at A, travels to B with the velocity u and at B produces some observable phenomenon, the starting of the impulse at A and the resulting phenomenon at B thus being connected by the relation of cause and effect. The time elapsing between the cause and its effect as measured in the units of system S will evidently be as follows in the calligraphy below. Then I use the usual Relativity formula (due to Lorentz) of time as it elapses in S’:
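
(For readers without the picture: the calligraphy computes the standard result. With Delta t = tB − tA the time between cause and effect in S, u the speed of the impulse, V the speed of S’ relative to S, and Delta x = u Delta t, the Lorentz transformation gives:

Delta t’ = [Delta t − (V/c^2) Delta x] / √(1 − V^2/c^2) = Delta t [1 − uV/c^2] / √(1 − V^2/c^2).)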

Equations help, but they are neither the beginning, nor the end of a story. Just an abstraction of it. The cult of equations is naive, interpretation is everything. The same thing, more generally, holds for models.
As Tolman put it in 1917: “Let us suppose now that there are no limits to the possible magnitude of the velocities u and V, and in particular that the causal impulse can travel from A to B with a velocity u greater than that of light. It is evident that we could then take a velocity u great enough so that uV/c^2 would be greater than one, and Delta t’ would become negative. In other words, for an observer in system S’ the effect which occurs at B would precede in time its cause which originates at A.”

I quote Tolman, because he is generally viewed as the one having definitively established the impossibility of faster than light communications. Tolman, though, is not so sure; in his next sentence he turns out wishy-washy: “Such a condition of affairs might not be a logical impossibility; nevertheless its extraordinary nature might incline us to believe that no causal impulse can travel with a velocity greater than that of light.”

Actually it is an effect those who have seen movies running in reverse are familiar with. Causality apparently running in reverse is no more surprising than the fact that two events at x1 and x2 which are simultaneous in S are separated in S’ by the time (x1 − x2)(V/c^2)/√(1 − V^2/c^2). That introduces a sort of fake, or apparent, causality: sometimes this before that, sometimes that before this.

(The computation is straightforward and found in Tolman’s own textbook; it originated with Henri Poincaré.[9][10] In 1898 Poincaré argued that the postulate of light speed constancy in all directions is useful to formulate physical laws in a simple way. He also showed that the definition of simultaneity of events at different places is only a convention.[11] Notice that, in the case of simultaneity, the signs of V and (x1 − x2) matter. Basically, depending upon how V moves, light in S going to S’ takes more time to catch up with the moving frame, and the more so, the further away it is; this is the same exact effect which explains the null result of the Michelson-Morley interferometer. There is an underlying logic beneath all of this, and it’s always the same.)

Tolman’s argumentation about the impossibility of faster than light communications is, in the end, purely philosophical, and fully inconsistent with the closely related, and fully mainstream, relativity of simultaneity.

Poincaré in 1900 proposed the following convention for defining clock synchronisation: two observers, A and B, moving through space (which Poincaré called the aether), synchronise their clocks by means of optical signals. They believe themselves to be at rest in space (“the aether”), since they are not moving relative to distant galaxies or the Cosmic Radiation Background, and they assume that the speed of light is constant in all directions. Therefore, they have to consider only the transmission time of the signals, and then they cross their observations to examine whether their clocks are synchronous.

“Let us suppose that there are some observers placed at various points, and they synchronize their clocks using light signals. They attempt to adjust the measured transmission time of the signals, but they are not aware of their common motion, and consequently believe that the signals travel equally fast in both directions. They perform observations of crossing signals, one traveling from A to B, followed by another traveling from B to A.” 

In 1904 Poincaré illustrated the same procedure in the following way:

“Imagine two observers who wish to adjust their timepieces by optical signals; they exchange signals, but as they know that the transmission of light is not instantaneous, they are careful to cross them. When station B perceives the signal from station A, its clock should not mark the same hour as that of station A at the moment of sending the signal, but this hour augmented by a constant representing the duration of the transmission. Suppose, for example, that station A sends its signal when its clock marks the hour 0, and that station B perceives it when its clock marks the hour t. The clocks are adjusted if the slowness equal to t represents the duration of the transmission, and to verify it, station B sends in its turn a signal when its clock marks 0; then station A should perceive it when its clock marks t. The timepieces are then adjusted. And in fact they mark the same hour at the same physical instant, but on the one condition, that the two stations are fixed. Otherwise the duration of the transmission will not be the same in the two senses, since the station A, for example, moves forward to meet the optical perturbation emanating from B, whereas the station B flees before the perturbation emanating from A. The watches adjusted in that way will not mark, therefore, the true time; they will mark what may be called the local time, so that one of them will be slow of the other.”[13]

This Poincaré (“–Einstein”) synchronisation was used by telegraphers as early as the mid-nineteenth century. It would allow us to cover the galaxy with synchronized clocks (although local times will differ a bit, depending upon the motion of stars, and in particular where in the galactic rotation curve a star sits). Transmitting instantaneous signals in such a network would not affect causality. Ludicrously, Wikipedia asserts that faster than light signals would make “Bertha” rich (!!!). That comes simply from Wikipedia getting thoroughly confused, allowing faster than light signals for some data, and not for other data, thus giving an advantage to some, and not others.
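
A minimal simulation of the crossed-signal procedure (illustrative, assumed values: two stations a distance L apart, drifting together at speed v through the frame in which light moves at c in all directions); the synchronization error it computes is exactly Lorentz’s first-order “local time”:

# Poincare-Einstein synchronisation by crossed light signals.
c = 299_792_458.0      # m/s
L = 1.0e9              # m, station separation (assumed)
v = 3.0e7              # m/s, common drift speed, 0.1 c (assumed)

t_AB = L / (c - v)     # light chasing the fleeing station B
t_BA = L / (c + v)     # light meeting the approaching station A

# The stations, believing themselves at rest, assign L/c to each leg.
# Half the asymmetry is the synchronisation error ("local time" offset):
error = (t_AB - t_BA) / 2
print(error)           # seconds by which B's clock ends up offset
print(v * L / c**2)    # Lorentz's first-order local time, v*x/c^2: nearly equal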

***

Quantum Entanglement (QE) enables at-a-distance changes of Quantum states:

(It comes in at least three types of increasing strength.) Quantum Entanglement, as known today, operates from Quantum state to Quantum state; but we cannot control which Quantum state a particle will start in, so we cannot use QE for communicating faster than light (we don’t control what we write, so to speak, since we write with states; we would send only gibberish).

This argument is formalized in a “No Faster Than Light Communication theorem” (the mainstream “no-communication theorem”). However, IMHO, the proof contains massive loopholes: it assumes that there is no Sub Quantum Reality whatsoever, nor could there ever be one, and thus that the unlikely QM axioms are forever absolutely true, beyond all possible redshifts you could possibly imagine, inter alia. So this is not the final story here. QE enables, surprisingly, the Quantum Radar (something I didn’t see coming). And it is not clear to me that we have absolutely no statistical control over states, thus that we can’t use what Schrödinger, building on the EPR thought experiment, called “Quantum Steering” to communicate at a distance. Quantum Radar and Quantum Steering are now enacted through real devices, and they use faster-than-light effects in their inner machinery.
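
For the record, here is what the mainstream theorem does establish, in a minimal numpy sketch assuming the standard QM axioms (the very axioms questioned above): whatever basis a distant experimenter (Alice) measures her half of a Bell pair in, the local half’s reduced density matrix, all Bob can observe, stays I/2, so no message gets through.

import numpy as np

# Bell state |00> + |11>, normalized; standard QM assumed throughout.
phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho = np.outer(phi, phi)                     # 4x4 density matrix

def bob_state_after_alice(theta):
    # Alice's orthonormal measurement basis, tilted by angle theta.
    up = np.array([np.cos(theta), np.sin(theta)])
    dn = np.array([-np.sin(theta), np.cos(theta)])
    rho_bob = np.zeros((2, 2))
    for a in (up, dn):
        P = np.kron(np.outer(a, a), np.eye(2))   # Alice's projector (x) identity
        post = P @ rho @ P                       # unnormalized post-measurement state
        # partial trace over Alice's index, summed over her two outcomes
        rho_bob += post.reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)
    return rho_bob

print(bob_state_after_alice(0.0))   # [[0.5 0.] [0. 0.5]], i.e., I/2
print(bob_state_after_alice(1.2))   # identical I/2: theta leaves no trace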

As the preceding showed, the supposed contradiction of faster-than-light communications with Relativity is just an urban legend. It makes the tribe of physicists more priestly, as they evoke a taboo nobody can understand, for the good reason that it makes no sense. It is also intellectually comfortable, as it simplifies brainwork, as taboos always do. But it is a lie. And it is high time this civilization switched to the no-more-lies theorem, lest it finish roasted, poisoned, flooded, weaponized and demonized.

Patrice Ayme’

Technical addendum:

https://en.wikipedia.org/wiki/Relativity_of_simultaneity

As Wikipedia itself puts it, weasel-style, to try to insinuate that Einstein brought something very significant to the debate, namely the eradication of the aether (but the aether came back soon afterwards, and there are now several “reasons” for it; the point being that, as Poincaré suspected, there is a notion of absolute rest, and we now know this for several reasons: the CRB, the Unruh effect, etc.):

In 1892 and 1895, Hendrik Lorentz used a mathematical method called “local time”, t’ = t − vx/c^2, for explaining the negative aether drift experiments.[5] However, Lorentz gave no physical explanation of this effect. This was done by Henri Poincaré who already emphasized in 1898 the conventional nature of simultaneity and who argued that it is convenient to postulate the constancy of the speed of light in all directions. However, this paper does not contain any discussion of Lorentz’s theory or the possible difference in defining simultaneity for observers in different states of motion.[6][7] This was done in 1900, when Poincaré derived local time by assuming that the speed of light is invariant within the aether. Due to the “principle of relative motion”, moving observers within the aether also assume that they are at rest and that the speed of light is constant in all directions (only to first order in v/c). Therefore, if they synchronize their clocks by using light signals, they will only consider the transit time for the signals, but not their motion in respect to the aether. So the moving clocks are not synchronous and do not indicate the “true” time. Poincaré calculated that this synchronization error corresponds to Lorentz’s local time.[8][9] In 1904, Poincaré emphasized the connection between the principle of relativity, “local time”, and light speed invariance; however, the reasoning in that paper was presented in a qualitative and conjectural manner.[10][11]

Albert Einstein used a similar method in 1905 to derive the time transformation for all orders in v/c, i.e., the complete Lorentz transformation. Poincaré obtained the full transformation earlier in 1905 but in the papers of that year he did not mention his synchronization procedure. This derivation was completely based on light speed invariance and the relativity principle, so Einstein noted that for the electrodynamics of moving bodies the aether is superfluous. Thus, the separation into “true” and “local” times of Lorentz and Poincaré vanishes – all times are equally valid and therefore the relativity of length and time is a natural consequence.[12][13][14]

… Except of course, absolute relativity of length and time is not really true: everywhere in the universe, locally-at-rest frames can be defined, in several manners (optical, mechanical, gravitational, and even using a variant of the Quantum Field Theory Casimir Effect). All other frames are in trouble, so absolute motion can be detected. The hope of Einstein, in devising General Relativity, was to explain inertia, but he ended up with just a modification of the 1800 CE Bullialdus-Newton-Laplace theory… (Newton knew his instantaneous gravitation made no sense, and condemned it severely, so Laplace introduced a gravitation speed, and thus gravitational waves, and Poincaré made them relativistic in 1905… Einstein got the applause…)


CONTINUUM FROM DISCONTINUUM

December 1, 2017

Discontinuing The Continuum, Replacing It By Quantum Entanglement Of Granular Substrate:

Is the universe granular? Discontinuous? Is spacetime somehow emergent? I do have an integrated solution to these quandaries, using basic mass-energy physics, and quantum entanglement. (The two master ideas I use here are mine alone, and if I am right, will change physics radically in the fullness of time.)  

First let me point out that worrying about this is not just a pet lunacy of mine. Edward Witten is the only physicist to have received the top mathematics prize (the Fields Medal), and is viewed by many as the world’s top physicist (I have met with him). He gave a very interesting interview to Quanta Magazine: “A Physicist’s Physicist Ponders the Nature of Reality”.

“Edward Witten reflects on the meaning of dualities in physics and math, emergent space-time, and the pursuit of a complete description of nature.”

Witten ponders, I answer.

Quantum Entanglement enables building existence over extended space, with a wealth growing exponentially beyond granular space

Witten: “I tend to assume that space-time and everything in it are in some sense emergent. By the way, you’ll certainly find that that’s what Wheeler expected in his essay” [Information, Physics, Quantum, Wheeler’s 1989 essay propounding the idea that the physical universe arises from information, which he dubbed “it from bit.” He should have called it: “It from Qubit”. But the word “Qubit” didn’t exist yet; nor, really, the concept, as physicists had not yet realized the importance of entanglement and nonlocality in building the universe: they viewed them more as “spooky” oddities on the verge of self-contradiction…]

Edward Witten: As you’ll read, he [Wheeler] thought the continuum was wrong in both physics and math. He did not think one’s microscopic description of space-time should use a continuum of any kind — neither a continuum of space nor a continuum of time, nor even a continuum of real numbers. On the space and time, I’m sympathetic to that. On the real numbers, I’ve got to plead ignorance or agnosticism. It is something I wonder about, but I’ve tried to imagine what it could mean to not use the continuum of real numbers, and the one logician I tried discussing it with didn’t help me.”

***

Well, I spent much more time studying logic than Witten, a forlorn, despised and alienating task. (Yet, when one is driven by knowledge, nothing beats an Internet-connected cave in the desert, far from the distracting trivialities!) Studying fundamental logic, an exercise mathematicians, let alone physicists, tend to detest, brought me enlightenment, mostly because it shows how relative logic is, and how it can take thousands of years to make simple, obvious steps. How to solve this lack of logical imagination affecting the tremendous mathematician cum physicist Witten? Simple. From energy considerations, there is an event horizon to how large an expression can be written. Thus, in particular, there is a limit to the size of a number. Basically, a number can’t be larger than the universe.

https://patriceayme.wordpress.com/2011/10/10/largest-number/

This also holds for the continuum: just as numbers can’t be arbitrarily large, neither can the digital expression of a given number be arbitrarily long. In other words, irrational numbers don’t exist (I will detail in the future what is wrong with the 24-century-old proof, step by step).

As the world consists of sets of entangled quantum states (also known as “qubits”), the number of states can get much larger than the world of numbers. For example, a set of 300 entangled up-or-down spins presents 2^300 states (much larger than the number of atoms in the observable universe, which is some 100 billion light years across). Such sets (“quantum simulators”) have basically been implemented in the lab.
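
The counting claim is a one-liner to check (Python; the 10^80 figure is the usual rough estimate for the number of atoms of ordinary matter):

# 300 entangled two-state spins span 2**300 joint states.
n_states = 2 ** 300                 # about 2.04 * 10**90
n_atoms_universe = 10 ** 80         # rough standard estimate
print(n_states > n_atoms_universe)  # True, by a factor of ~2 * 10**10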

Digital computers only work with finite expressions. Thus practical, effective logic already uses only finite mathematics, and finite logic. Thus there is no difficulty in using only finite mathematics. Physically, it presents the interest of removing many infinities (although not the need for renormalization!).

Quantum entanglement creates a much richer spacetime than the granular subjacent space. Thus an apparently continuous spacetime is emergent from granular space. Let’s go back to the example above: 300 spins, in a small space, once quantum entangled, give a much richer quantum space of 2^300 states.

Consider again a set S of 300 particles (a practical case would be 300 atoms with spins up or down). If a set of “particles” are all entangled together, I will call that an EQN (Entangled Quantum Network). Now consider an incoming wave W (typically a photonic or gravitational wave; but it could be a phonon, etc.). Classically, if the 300 particles were… classical, W has little probability of interacting with S, because it has ONLY 300 “things”, 300 entities, to interact with. Quantum Mechanically, though, it has 2^300 “things”, all the states of the EQN, to interact with. Thus, a much higher probability of interacting. Certainly the wave W is more likely to interact with 2^300 entities than with 300, in the same space! (The classical computations can’t be made from scratch, by me or anybody else; but the classical computation, depending on the “transparency” of a film of 300 particles, would actually depend upon the Quantum computation nature makes discreetly, yet pervasively!)
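
The exponential growth of the joint state space is easy to exhibit in miniature (a sketch only; 300 spins are far beyond any computer, which is rather the point):

import numpy as np

# Joint state space of n spins, built by repeated Kronecker products:
# its dimension doubles with each added spin, i.e., grows as 2**n.
up = np.array([1.0, 0.0])
state = up
for n in range(1, 11):
    print(n, state.size)        # dimension = 2**n
    state = np.kron(state, up)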

EQNs make (mathematically at least) an all-pervasive, “volume”-occupying wave. I wrote “volume” in quotes, because some smart asses, very long ago (nearly a century), pointed out that the Quantum Waves live in “PHASE” space, thus are NOT “real” waves. Whatever that means: Quantum volumes/spaces in which Quantum Waves compute can be very complicated, beyond electoral gerrymandering of congressional districts in the USA! In particular, they don’t have to be 3D “volumes”. That doesn’t make them less “real”. To allude to well-established mathematics: a segment is a one-dimensional volume. A space-filling curve is also a sort of volume, as is a fractal (and it has a fractal dimension).

Now quantum entanglement has been demonstrated over thousands of kilometers, and mass (so to speak) quantum entanglement has been demonstrated over 500 nanometers (5,000 times the size of an atom). One has to understand that solids are held together by quantum entanglement. So there is plenty enough entanglement to generate spaces of apparently continuous possibilities, and even consciousness… from a fundamentally granular space.

Entanglement, or how to get continuum from discontinuum. (To sound like Wheeler.)

The preceding seems pretty obvious to me. Once those truths get around, everybody will say: ‘But of course, that’s so obvious! Didn’t Witten say that first?’

No, he didn’t.

You read it here first.

Granular space giving rise to practically continuous spacetime is an idea where deep philosophy proved vastly superior to the shortsightedness of vulgar mathematics.

Patrice Ayme’

SUB-QUANTUM GRAVITATIONAL COLLAPSE 2 SLIT Thought Experiment

September 23, 2017

A Proposed Lab SUB QUANTUM TEST: SQPR, Patrice Aymé Contra Albert Einstein: GRAVITATIONALLY DETECTING QUANTUM COLLAPSE! 

Einstein claimed that a “particle” was a lump of energy, even while in translation. He had no proof of this assertion, it underlies all modern fundamental physics, and I believe it’s false. As I see it, this error, duplicated by 99.99% of Twentieth Century theoretical physicists, led the search for the foundations of physics astray in the Twentieth Century. How could one prove my idea, and disprove Einstein?

What Einstein wrote is this, in what is perhaps his most famous work (1905 CE): “Energy, during the propagation of a ray of light, is not continuously distributed over steadily increasing spaces, but it consists of a finite number of energy quanta LOCALIZED AT POINTS IN SPACE, MOVING WITHOUT DIVIDING…” [What’s in capital letters, I view as extremely probably false. Einstein then added nine words, four of which explain the photoelectric effect, and for which he got the Nobel Prize. Those nine words were entirely correct, but physically independent of the preceding quote!]

If those “energy quanta” are “localized at points in space“, they concentrate onto themselves all the mass-energy.

It’s simple. According to me, the particle disperses while it is in translation (roughly following, and becoming a nonlinear variant of, its De Broglie/Matter Wave dispersion, the bedrock of Quantum Physics as everybody knows it). That means its mass-energy disperses. According to Einstein, it doesn’t.

However, a gravitational field can be measured. In my theory, SQPR, the matter waves are real. What can “real” mean, in its simplest imaginable form? Something is real if that something has mass-energy-momentum. So one can then do a thought experiment. Take the traditional Double Slit experiment, and install a gravitational needle (two masses linked by a rigid rod, like a hydrogen molecule at absolute zero) in the middle of the usual interference screen.

Sub Quantum Patrice Reality Is Experimentally Discernible From Einstein’s Version of Quantum Physics! Notice in passing that none of the physics super minds of the Twentieth Century seem to have noticed Einstein’s Axiom, which is ubiquitously used all over Quantum Physics and QFT!

According to Einstein, the gravitational needle will move before the process of interference is finished and the self-interfering particle hits the screen (some may object that, because photons travel at c, and so do gravitons, one can’t really gravitationally point at the photon; however, that’s not correct: there should be a delayed field moving the needle).

According to me, the particle is dispersed during the self-interfering process: it’s nowhere in particular. Thus the mass-energy is dispersed before the collapse/singularization. Thus a gravitational field from the self-interfering particle can’t be measured from inside the self-interfering geometry.

Could the experiment be done?

Yes. But it won’t be easy.

Molecules made of 5,000 protons, 5,000 neutrons and 5,000 electrons have exhibited double slit behavior. That’s plenty enough mass to turn a gravitational needle made of two hydrogen atoms. However, with such a large object, my theory may well fail to be experimentally checked (the molecule probably re-localizes continually, thus the needle will move before impact). Ideally, one should best check this Sub Quantum Reality with a simple unique particle, such as a photon, or an electron.
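
An order-of-magnitude sketch of why it won’t be easy (illustrative numbers only: a 10,000 amu molecule, roughly the mass above, pulling on one hydrogen atom at an assumed one micrometer):

# Gravitational pull of the heaviest self-interfering molecule on one
# hydrogen atom of the needle, at an assumed 1 micrometer distance.
G = 6.674e-11            # m^3 kg^-1 s^-2
amu = 1.66054e-27        # kg
m_molecule = 10_000 * amu
m_hydrogen = 1 * amu
r = 1e-6                 # m, assumed distance, purely illustrative

F = G * m_molecule * m_hydrogen / r**2
print(F)                 # ~ 2e-48 N: a fantastically delicate torque to detect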

Why did I long believe Einstein was wrong on this point, what I called “Einstein’s Axiom” above?

First, he had no proof of what he said. Allure can’t replace reason.

Second, localization into a point is contrary to the philosophical spirit, so to speak, of Quantum Physics. The basic idea of Quantum Physics is that one can’t localize physics into points in space… or into points in energy (this was Planck’s gist). Both space and energy come in LUMPS. For example, an electron delocalizes around a proton, creating an atom of hydrogen.

The lump thing, for emissions of energy, is Planck’s great discovery (a blackbody sends energy packets hf, where f is the frequency and h, Planck’s constant). The non-relevance of points is De Broglie’s great intuition: De Broglie introduced the axiom that one can compute everything about the translation behavior of an object from the waves associated to the energy-momentum of said object.
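
De Broglie’s rule in practice is a one-line computation, lambda = h/p (a sketch; the beam speeds are assumed for illustration):

h = 6.626e-34                        # J s, Planck's constant
m_e = 9.109e-31                      # kg, electron mass
p_electron = m_e * 0.01 * 2.998e8    # electron at 1% of c (assumed)
print(h / p_electron)                # ~ 2.4e-10 m, atomic-scale wavelength

m_mol = 10_000 * 1.66054e-27         # kg, the big molecule above
p_mol = m_mol * 100.0                # assumed 100 m/s beam
print(h / p_mol)                     # ~ 4e-13 m: tiny, yet it interferes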

So Einstein was wrong on the philosophy, as he himself concluded after thirty years of thinking hard about Quantum Physics, as one of its two founders, with his discovery of what he called “Spooky Interaction At A Distance” (the “EPR”, which has turned from thought experiment to real experiment, checked by now in hundreds of different experiments). If “elements of reality” (to use the Einstein EPR language) are subject to “spooky action at a distance”, why not so when the particle is in flight? That is precisely the gist of the EPR… (After I thought of this, I found a paper by Zurek et al. who seem to draw a similar conclusion.)

The philosophy of Quantum Physics in one sentence: small is big, or even, everywhere.

Third, Einstein’s hypothesis of point particles being always localized has led to lots of problems, including the so-called “Multiverse”, or the “Many Worlds Interpretation of Quantum Mechanics” (at least, according to yours truly…).

Fourth, the development of Twentieth Century physics according to Einstein’s roadmap has led to theories covering 5% or so of known mass-energy, at most: an epic failure. Whereas my own Sub Quantum Reality readily predicts the appearance of Dark Matter, and the joint appearance of Dark Energy, as observed.

Fifth: If Einstein were right, the which-path information in the 2-slit experiment would be readily available, at least as a thought experiment, and that can’t work. The entire subject is still highly controversial: contemplate the massive paper in the Proceedings of the National Academy of Sciences, “Finally making sense of the double-slit experiment”, March 20, 2017, whose lead author is Yakir Aharonov, from the extremely famous and important Aharonov-Bohm effect. The Aharonov-Bohm effect pointed out that the potentials, not the fields themselves, were the crucial inputs of Quantum Physics. That should have been obvious to all and any who studied Quantum Physics. Yet it was overlooked by all the super minds for nearly 40 years!

Sixth: This is technical, so I won’t give the details (which are not deep). One can modify Einstein’s original EPR experiment (which had to do with pairs of particles in general, not just photon polarization a la Bohm-Bell). One can introduce, in the EPR 1935 set-up, an ideal gravity detector. If Einstein was right about the particle being always localized, determinism would always hold for particle A of an {A,B} interaction pair. Thus particle A could be tracked, gravitationally, always. But that would grossly violate the free will of a lab experimenter deciding to tinker with B’s path, through an experiment of her choosing. (How do large particles do it, then? Well, they tend to partly localize continually, thanks to their own size and random singularizations.)

The naked truth can be in full view, yet, precisely because it’s naked, nobody dares to see it!

Richard Feynman famously said that the double slit experiment was central to physics, and that no one understood it. He considered it carefully. Gravitation should stand under it, though! The preceding experiment was obvious to propose. Yet, no one proposed it, because they just couldn’t seriously envision Quantum Collapse, and thus its impact on gravitation. Yet, I do! And therein lies the connection between Quantum Physics and Gravitation, the quest for the Grail of modern physicists…

So let’s have an experiment, Mr. Einstein!

Patrice Ayme’

DARK MATTER PROPULSION Proposed

December 10, 2016

In Sub-Quantum Patrice’s Reality (SQPR), Matter Waves are real (in the Quantum Theory Copenhagen Interpretation (QTCI), the Matter Waves are just probability waves of… knowledge… hence the insistence that “it came from bit“). There has been no direct evidence that Matter Waves were real. So far. But the times, they are a-changin’, as the other one, Bob Dylan, a gifted yet not too deep singer who got his Nobel today, said.

Both Dark Matter and Dark Energy are consequences of SQPR. So: observing both Dark Matter and Dark Energy constitutes proof of SQPR.

The deviation of light by the Sun predicted by “General Relativity” was twice the one predicted by Newtonian Mechanics. The effect was minute, and detected only in grazing starlight, during the Solar eclipse of 29 May 1919 (by the ultra famous British astronomer and physicist Eddington). Thus, as 95% of the universe’s matter-energy is Dark Matter or Dark Energy, my prediction carries more weight.
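
The famous factor of two is a short computation, using the standard formulas for grazing light: delta = 2GM/(c^2 R) Newtonian, 4GM/(c^2 R) Einsteinian.

import math

G = 6.674e-11          # m^3 kg^-1 s^-2
M_sun = 1.989e30       # kg
R_sun = 6.957e8        # m, grazing impact parameter
c = 2.998e8            # m/s

newton = 2 * G * M_sun / (c**2 * R_sun)   # radians
einstein = 2 * newton                     # General Relativity doubles it
to_arcsec = 180 * 3600 / math.pi
print(newton * to_arcsec)     # ~ 0.87 arc seconds
print(einstein * to_arcsec)   # ~ 1.75 arc seconds, Eddington's 1919 value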

SQPR also predicts “fuel-less” propulsion, in a variant of the effect which produces Dark Matter in SQPR (also called PSQR below):

Dark Matter Pushes, Patrice Ayme Says. Explaining NASA’s Findings?

How does Dark Matter create propulsion? Well, that it does is evident: just look at galactic clusters (more details another day). A Matter Wave will expand, until it singularizes. If it expands enough, it will become so big that it will lose a (smaller) piece of itself during re-singularization. That  piece is the Dark Matter.

Thus visualize this: take a cavity C, and bounce a Matter Wave around it (there is plenty of direct theoretical and experimental evidence that this can be arranged).

Make a hole H in the boundary of C (this is no different from the blackbody oven, the consideration of which led Planck to discover the Quantum).

Some Dark Matter then escapes. By the hole. 

However, Dark Matter carries energy momentum (evidence from Galaxies, Galactic Clusters, etc.).

Hence a push. A Dark Matter push. (Notice: the Dark Matter is created inside the device, it doesn’t have to be “gathered”. DM propellant speed could be many times the speed of light, hence great efficiency…)

The (spectacular) effect has been apparently observed by NASA.

Does this violate Newton’s Third Law? (As it has been alleged.)

No. I actually just used Newton’s Third Law, the Action = Reaction law. So SQPR explains the observed effect in combination with the Action = Reaction Law, “proving” both.

How could we prove SQPR? There should be a decrease of energy-momentum after a while, and the decrease should equal the observed push exactly.
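
For scale, the benchmark any thruster must beat under standard physics is the photon rocket, with thrust F = P/c (a sketch; the milli-newton figure is an assumed, illustrative order of magnitude for the contested NASA-class claims):

c = 2.998e8          # m/s
F_claimed = 1e-3     # N, assumed order of magnitude, illustrative

P_needed = F_claimed * c   # power a pure photon rocket would require
print(P_needed)            # ~ 3e5 W; reported test powers were of order
                           # 1 kW, hence either experimental error, or
                           # physics beyond the photon-rocket limit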

Patrice Ayme’

***

Warning: The preceding considerations are at the edge of plausible physics. (Groups of dissenting physicists are always busy making theories in which Dark Matter does not exist, and they should be! Should they be right, the preceding is nonsense. The consensus, though, is that Dark Matter exists, but is explained by a variant of the so-called “Standard Model”, using “Supersymmetry”, “WIMPs”, or “Axions”. My own theory, SQPR, is by far the most exotic, as it uses a hypothesized Sub Quantum Reality, obtained by throwing the Quantum Theory Copenhagen Interpretation, QTCI, through the window as a first order theory.)

DARK GALAXY (Explained?)

October 1, 2016

A giant galaxy made nearly entirely of Dark Matter has been discovered. Theories of Dark Matter proposed by people salaried for professing physics cannot explain (easily, if at all!) why there would be so much Dark Matter in one galaxy. I can. In my own theory, Dark Matter is not really matter, although matter gives birth to it, under some particular geometrical conditions. In my theory, in some geometrodynamic situations, a galaxy will churn inordinate amounts of Dark Matter quickly. So I was not surprised by the find.

There are many potential theories of Dark Matter. Most are fairly conventional. They typically hypothesize new particles (some of these new particles could come from new symmetries, such as supersymmetry). I do not see how they can predict why these particular particles appear in some places, and not others. However, the importance of location, of geometry, is a crucial feature of my own theory.

I postulate that the Quantum Interaction (copyright myself) does not have infinite range. Thus quantum interactions, in some conditions of low mass-energy density, leave behind part of the Quantum Wave. Such debris have mass-energy, so they exert gravitational pull, but they have little else besides (most of the characteristics of the particles they were part of concentrate somewhere else).

I Can Explain This Dark Galaxy, By Changing The Foundations Of Physics. No Less.

[From the Hawaiian Gemini telescope.]

In my own theory, one can imagine that the geometry of a galaxy is, at some point extremely favorable to the creation of Dark Matter: it is just a question of dispersing the matter just so. The Dark Galaxy has 1% of the stars of our Milky Way, or less. In my theory, once Dark Matter has formed, it does not seem possible to make visible matter again with it (broken Quantum Wave debris float around like a cosmic fog).

All past science started as a mix of philosophy and science-fiction (Aristarchus, Lucretius, Giordano Bruno, Immanuel Kant, Lamarck are examples). One can only surmise it will be the same in the future, and this is supported by very good logic: guessing always comes before knowing. Those who claim that science will never again be born from philosophy and fantasy are saying that really new science will never happen again. They say that all the foundations of science are known already. So they are into preaching, just like religious fanatics.

It was fashionable to say so, among physicists in the 1990s, the times of the fable known as TOE, the so-called Theory Of Everything. Shortly after this orgasm of self-satisfaction by self-appointed pontiffs, the evidence became clear that the universe’s mass-energy was mostly Dark Energy, and Dark Matter.

This is an interesting case of a meta-mood shared: also in the 1990s, clever idiots (Fukuyama, etc.) claimed history had ended: a similar claim from the same period, permeating the same mood of stunted imagination. The advantage, for those who pontificated that way? They could claim they knew everything: they had become gods, living gods.

I had known about Dark Matter all along (the problem surfaced nearly a century ago). I considered it a huge problem: it held galaxies, and galactic clusters, together. But maybe something had been overlooked. Meanwhile Main Stream Physics (MSP) dutifully, studiously, ignored it. For decades. Speaking of Dark Matter made one despicable, a conspiracy theorist.

Another thing MSP ignored was the foundations of physics. Only the most prestigious physicists, such as Richard Feynman, could afford to repeat Einstein’s famous opinion that “nobody understands Quantum Mechanics”. I made it my intellectual life’s main axis of reflection to try to understand what nobody wanted to understand, what nobody thought they could afford to understand: the real foundations of physics. (So doing, I was forced to reflect on why it is that people do not want to understand the most fundamental things, even while professing they do. It is particularly blatant in, say, economics.)

I have long discovered that the real foundations of physics are entangled with those of mathematics (it is not just that physics, nature, is written with mathematics, as Galileo wrote; there is a dialogue between the mathematics that we invent, and the universe that we discover: they lead to each other). For example, whether the infinity axiom is allowed in mathematics changes the physics radically (the renormalization problem of physics is solved if one removes the infinity axiom).

Right now, research at the foundations of (proper) physics is hindered by our lack of nonlinear mathematics: Quantum Mechanics, as it is, is linear (waves add up in the simplest way). However the “collapse of the wave packet” is obviously nonlinear (this is why it’s outside of existing physics, from lack of math). From that Quantum collapse, when incomplete because of the great distances involved, comes Dark Matter. At least, so I propose.

Patrice Ayme’

DARK MATTER, Or How Inquiry Proceeds

September 7, 2016

How to find really new knowledge? How do you find really new science? Not by knowing the result: this is what we don’t have yet. Any really new science will not be deduced from pre-existing science. Any really new knowledge will come out of the blue. Poetical logic will help before linear logic does.

The case of Dark Matter is telling: this increasingly irritating elephant in the bathroom has been in evidence for 80 years, lumbering about. As the encumbering beast did not fit existing science, it was long religiously ignored by the faithful, as a subject not worthy of serious inquiry by very serious physicists. Now Dark Matter, five times more massive than Standard Model matter, is clearly sitting heavily outside of the Standard Model, threatening to crush it into irrelevance. Dark matter obscures the lofty pretense of known physics to explain everything (remember the grandly named TOE, the so-called “Theory Of Everything“? That was a fraud, snake oil, because main stream physics celebrities crowed about TOE, while knowing perfectly well that Dark Matter dwarfed standard matter, and was completely outside of the Standard Model).

Physicists are presently looking for Dark Matter, knowing what they know, namely that nature has offered them a vast zoo of particles, many of them without rhyme or reason (some have rhyme, a symmetry, a mathematical group such as SU3 acting upon them; symmetries revealed new particles, sometimes). 

Bullet Cluster, 100 Million Years Old. Two Galaxy Clusters Colliding. The Dark Matter, In Blue, Is Physically Separated From the Hot, Standard Matter Gas, in Red.

[This sort of picture is most of what we presently have to guess what Dark Matter could be; the physical separation of DM and SM is most telling to me: it seems to indicate that SM and DM do not respond to the same forces, something that my Quantum theory predicts; it’s known that Dark Matter causes gravitational lensing, as one would expect, as it was first found by its gravitational effects, in the 1930s…]

However, remember: a truly completely new piece of science cannot be deduced from the pre-existing paradigm. Thus, if Dark Matter were really about finding a new particle type, it would be interesting, but not as interesting as it would be if it were not, after all, a new particle type, but the result of a completely new law of physics.

This is the quandary about finding truly completely new science. It can never be deduced from ruling paradigms, and may actually overthrow them. What should then be the method to use? Can Descartes and Sherlock Holmes help? The paradigm presented by Quantum Physics helps. The Quantum looks everywhere in space to find solutions: this is where its (“weird”) nonlocality comes in. Nonlocality is crucial for interference patterns and for finding lowest energy solutions, as in the chlorophyll molecule. This suggests that our minds should go nonlocal too, and we should look outside of a more extensive particle zoo to find what Dark Matter is.

In general, searching for new science should be by looking everywhere, not hesitating to possibly contradict what is more traditional than well established.

An obvious possibility is, precisely, that Quantum Physics is itself incomplete, and generates Dark Matter in places where said incompleteness would be most blatant. More precisely: Quantum processes, stretched over cosmic distances, instead of being perfectly efficient and nonlocal over gigantically cosmic locales, could leave a Quantum mass-energy residue, precisely in the places where extravagant cosmic stretching of Quanta occurs (before “collapse”, aka “decoherence”).

The longer one fails to find a conventional explanation (namely a new type of particle) for Dark Matter, the more likely my style of explanation becomes. How could one demonstrate it? Not by looking for new particles, but by conducting new and more refined experiments on the foundations of Quantum Physics.

If this guess is correct, whatever is found askew in the axioms of present Quantum Physics could actually help future Quantum Computer technology (because the latter works with Quantum foundations directly, whereas conventional high energy physics tends to eschew the wave aspects, due to the high frequencies involved).

Going on a tangent is what happens when the central, attractive force, is let go. A direct effect of freedom. Free thinking is tangential. We have to learn to produce tangential thinking.

René Descartes tried to doubt the truth of all his beliefs, to determine which beliefs he could be certain were true. However, at the end of “The Meditations”, he hastily concludes that we can distinguish between dream and reality. It is not that simple. The logic found in dreams is all too similar to the logic used by full-grown individuals in society.

Proof? Back to Quantum Physics. On the face of it, the axioms of Quantum Physics have a dream-like quality (there is no “here”, nor “there”; “now” is everywhere; and, mysteriously, the experiment is Quantum, whereas the “apparatus” is “classical”). Still, most physicists, after insinuating they have figured out the universe, eschew the subject carefully. The specialists of Foundations are thoroughly confused: see Sean Carroll, http://www.preposterousuniverse.com/blog/2013/01/17/the-most-embarrassing-graph-in-modern-physics/

However unbelievable Quantum Physics is, however dream-like it is, physicists believe in it, and don’t question it any more than cardinals would Jesus. Actually, it’s this dream-like nature which, shared by all, defines the community of physicists. Cartesian doubt, pushed further than Descartes did, will question not just the facts, the allegations, but the logic itself. And even the mood behind it.

Certainly, in the case of Dark Matter, some of the questions civilization has to ask should be:

  1. How sure are we of the Foundations of Quantum Physics? (Answer: very sure, all too sure!)
  2. Could it not be that Dark Matter is a cosmic-size experiment in the Foundations of Quantum Physics?

Physics, properly done, does not just question the nature of nature. Physics, properly done, questions the nature of how we find out the nature of anything. Physics, properly done, even questions the nature of why we feel the way we do. And the way we did. About anything, even poetry. In the end, indeed, even the toughest logic is a form of poetry, hanging out there, justified by its own beauty, and nothing else. Don’t underestimate moods: they call what beauty is.

Patrice Ayme’

Entangled Universe: Bell Inequality

May 9, 2016

Abstract: The Bell Inequality shatters the picture of reality civilization previously established. A simple proof is produced.

What is the greatest scientific discovery of the Twentieth Century? Not Jules Henri Poincaré’s Theory of Relativity and his famous equation: E = mc². Although a spectacular theory, since Poincaré made time local, in order to keep the speed of light constant, it stemmed from Galileo’s Principle of Relativity, extended to Electromagnetism. To save electromagnetism globally, Jules Henri Poincaré made time and length local.

So was the discovery of the Quantum by Planck the greatest discovery? To explain two mysteries of academic physics, Planck posited that energy was emitted in lumps. Philosophically, though, the idea was just to extend to energy the basic philosophical principle of atomism, which was two thousand years old. Energy itself was discovered by Émilie Du Châtelet in the 1730s.

Quantum Entanglement Is NOT AT ALL Classically Predictable

Just as matter went in lumps (strict atomism), so did energy. In light of Poincaré’s E = mc², matter and energy are the same, so this is not surprising (by a strange coincidence (?), Poincaré demonstrated, and published, E = mc² a few months apart, in the same year, 1900, as Max Planck did E = hf; Einstein used both formulas in 1905).

The greatest scientific discovery of the Twentieth Century was Entanglement… which is roughly the same as Non-Locality. Non-Locality would have astounded Newton: he was explicitly very much against it, and viewed it, correctly, as the greatest flaw of his theory. My essay “Non-Locality” entangles Newton, Émilie Du Châtelet, and the Quantum, because therefrom the ideas first sprang.

***

Bell Inequality Is Obvious:

The head of the Theoretical division of CERN, John Bell, discovered an inequality which is trivial, and apparently so basic, so incredibly obvious, reflecting the most basic common sense, that it should always be true. Ian Miller (PhD, Physical Chemistry) provided a very nice perspective on all this. Here it is, cut and pasted (with his agreement):

Ian Miller: A Challenge! How can Entangled Particles violate Bell’s Inequalities?

Posted on May 8, 2016 by ianmillerblog           

  The role of mathematics in physics is interesting. Originally, mathematical relationships were used to summarise a myriad of observations, thus from Newtonian gravity and mechanics, it is possible to know where the moon will be in the sky at any time. But somewhere around the beginning of the twentieth century, an odd thing happened: the mathematics of General Relativity became so complicated that many, if not most physicists could not use it. Then came the state vector formalism for quantum mechanics, a procedure that strictly speaking allowed people to come up with an answer without really understanding why. Then, as the twentieth century proceeded, something further developed: a belief that mathematics was the basis of nature. Theory started with equations, not observations. An equation, of course, is a statement, thus A equals B can be written with an equal sign instead of words. Now we have string theory, where a number of physicists have been working for decades without coming up with anything that can be tested. Nevertheless, most physicists would agree that if observation falsifies a mathematical relationship, then something has gone wrong with the mathematics, and the problem is usually a false premise. With Bell’s Inequalities, however, it seems logic goes out the window.

Bell’s inequalities are applicable only when the following premises are satisfied:

Premise 1: One can devise a test that will give one of two discrete results. For simplicity we label these (+) and (-).

Premise 2: We can carry out such a test under three different sets of conditions, which we label A, B and C. When we do this, the results between tests have to be comparable, and the simplest way of doing this is to represent the probability of a positive result at A as A(+). The reason for this is that if we did 10 tests at A, 10 at B, and 500 at C, we cannot properly compare the results simply by totalling results.

Premise 1 is reasonably easily met. John Bell used as an example, washing socks. The socks would either pass a test (e.g. they are clean) or fail, (i.e. they need rewashing). In quantum mechanics there are good examples of suitable candidates, e.g. a spin can be either clockwise or counterclockwise, but not both. Further, all particles must have the same spin, and as long as they are the same particle, this is imposed by quantum mechanics. Thus an electron has a spin of either +1/2 or -1/2.

Premises 1 and 2 can be combined. By working with probabilities, we can say that each particle must register once, one way or the other (or each sock is tested once), which gives us

A(+) + A(-) = 1; B(+) + B(-) = 1;   C(+) + C(-) = 1

i.e. the probability of one particle tested once and giving one of the two results is 1. At this point we neglect experimental error, such as a particle failing to register.

Now, let us do a little algebra/set theory by combining probabilities from more than one determination. By combining, we might take two pieces of apparatus, and with one determine the (+) result at condition A, and with the other the negative one at B. If so, we take the product of these, because probabilities are multiplicative. If so, we can write

A(+) B(-) = A(+) B(-) [C(+) + C(-)]

because the bracketed term [C(+) + C(-)] equals 1, the sum of the probabilities of results that occurred under conditions C.

Similarly

B(+)C(-)   = [A(+) + A(-)] B(+)C(-)

By adding and expanding

A(+) B(-) + B(+)C(-) = A(+) B(-) C(+) + A(+) B(-) C(-) + A(+) B(+)C(-) + A(-)B(+)C(-)

=   A(+)C(-) [B(+) + B(-)] + A(+)B(-)C(+) + A(-)B(+)C(-)

Since the bracketed term [B(+) + B(-)] equals 1 and the last two terms are positive numbers, or at least zero, we have

A(+) B(-) + B(+)C(-) ≧ A(+)C(-)

This is the simplest form of a Bell inequality. In Bell’s sock-washing example, he showed how socks washed at three different temperatures had to comply.

An important point is that, provided the samples in the tests give only one result from only two possible results, and provided the tests are applied under three sets of conditions, the mathematics says the results must comply with the inequality. Further, only premise 1 relates to the physics of the samples tested; the second is merely a requirement that the tests are done competently. The problem is, modern physicists say entangled particles violate the inequality. How can this be?

Non-compliance by entangled particles is usually considered a consequence of the entanglement being non-local, but that makes no sense because in the above derivation, locality is not mentioned. All that is required is that premise 1 holds, i.e. measuring the spin of one particle, say, means the other is known without measurement. So, the entangled particles have properties that fulfil premise 1. Thus violation of the inequality means either one of the premises is false, or the associative law of sets, used in the derivation, is false, which would mean all mathematics are invalid.

So my challenge is to produce a mathematical relationship that shows how these violations could conceivably occur? You must come up with a mathematical relationship or a logic statement that falsifies the above inequality, and it must include a term that specifies when the inequality is violated. So, any takers? My answer in my next Monday post.

[Ian Miller.]

***

The treatment above shows how ludicrous it should be that reality violates that inequality… BUT IT DOES! This is something which nobody had seen coming. No philosopher ever imagined something as weird. I gave an immediate answer to Ian:

‘Locality is going to come in the following way: A is going to be in the Milky Way, B and C, on Andromeda. A(+)B(-) is going to be ½ [cos(b-a)]². Therefrom the contradiction. There is more to be said. But first of all, I will re-blog your essay, as it makes the situation very clear.’
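
Numerically, the clash is easy to exhibit (a Python sketch, assuming the quantum rule quoted above and, as in Bell’s original argument, perfect anticorrelation, so that pair statistics stand in for one particle’s counterfactual values):

import itertools, math

# Classical side: a particle carrying preset answers (+1/-1) for the
# three settings obeys A(+)B(-) + B(+)C(-) >= A(+)C(-), and therefore
# so does any statistical mixture of such "instruction sets".
for a, b, c in itertools.product((1, -1), repeat=3):
    lhs = (a == 1 and b == -1) + (b == 1 and c == -1)
    assert lhs >= (a == 1 and c == -1)

# Quantum side: with A(+)B(-) = (1/2) cos^2(b - a), as quoted above,
# the inequality fails, e.g., at analyzer angles 0, 75, 150 degrees.
P = lambda x, y: 0.5 * math.cos(math.radians(y - x)) ** 2
print(P(0, 75) + P(75, 150), "<", P(0, 150))   # 0.067 < 0.375: violated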

Patrice Ayme’

TO BE AND NOT TO BE? Is Entangled Physics Thinking, Or Sinking?

April 29, 2016

Frank Wilczek, a physics Nobel laureate, wrote a first soporific, then baffling, article in Quanta Magazine: “Entanglement Made Simple”. Yes, all too simple: it sweeps the difficulties under the rug. After a thorough description of classical entanglement, we are swiftly told at the end that classical entanglement supports the Many Worlds Interpretation of Quantum Mechanics. However, classical entanglement (from various conservation laws) has been known since the seventeenth century.

Skeptical founders of Quantum physics (such as Einstein, De Broglie, Schrödinger, Bohm, Bell) knew classical entanglement very well. David Bohm found the Aharonov-Bohm effect, which demonstrated the importance of (nonlocal) potentials; John Bell found his inequality, which demonstrated, with the help of experiments (Alain Aspect, etc.), that Quantum physics is nonlocal.

Differently From Classical Entanglement, Which Acts As One, Quantum Entanglement Acts At A Distance: It Interferes With Measurement, At A Distance

The point about the cats is that everybody, even maniacs, ought to know that cats are either dead, or alive. Quantum mechanics make the point they can compute things about cats, from their point of view. OK.

Quantum mechanics, in their busy shops, compute with dead and live cats as possible outcomes. No problem. But then does that mean there is a universe, a “world“, with a dead cat, happening, and then one with a live cat, also happening simultaneously?

Any serious philosopher, somebody endowed with common sense, the nemesis of a Quantum mechanic, will say no: in a philosopher’s opinion, a cat is either dead, or alive. To be, or not to be. Not: to be, and not to be.

A Quantum mechanic can compute with dead and live cats, but that does not mean she creates worlds, by simply rearranging her computation, this way, or that. Her various dead and live cats arrangements just mean she has partial knowledge of what she computes with, and that Quantum measurements, even from an excellent mechanic, are just partial, mechanic-dependent measurements.

For example, if one measures spin, one needs to orient a machine (a Stern Gerlach device). That’s just a magnetic field going one way, like a big arrow, a big direction. Thus one measures spin in one direction, not another.
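
In miniature, the orientation dependence looks like this (a sketch assuming standard QM: a spin prepared “up”, measured along an axis tilted by theta, yields “up” with probability cos²(theta/2)):

import numpy as np

# Spin prepared up along z, measured along an axis tilted by theta.
sz_up = np.array([1.0, 0.0])

def p_up_along(theta):
    # Probability of the + outcome along the tilted axis.
    up_theta = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    return abs(up_theta @ sz_up) ** 2

print(p_up_along(0.0))        # 1.0: same direction, certain outcome
print(p_up_along(np.pi / 2))  # 0.5: orthogonal axis, a coin flip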

What’s more surprising is that, later on, thanks to a nonlocal entanglement, one may be able to determine that, at this point in time, the particle had a spin that could be measured, from far away, in another direction. So far, so good: this is like classical mechanics.

However, whether or not that measurement at a distance has occurred, roughly simultaneously, and way out of the causality light cone, AFFECTS the first measurement.

This is what the famous Bell Inequality means.

And this is what the problem with Quantum Entanglement is. Quantum Entanglement implies that wilful action somewhere disturbs a measurement beyond the reach of the five known forces. It brings all sorts of questions of a philosophical nature, and makes them into burning physical subjects. For example, does the experimenter at a distance have real free will?

Calling the world otherworldly, or many worldly, does not really help to understand what is going on. Einstein’s “Spooky Interaction At A Distance” seems a more faithful, honest rendition of reality than supposing that each and any Quantum mechanic in her shop, creates worlds, willy-nilly, each time it strikes her fancy to press a button.

What Mr. Wilczek did is what manyworldists and multiversists always do: they jump into their derangement (cats alive AND dead) after saying there is no problem. Details are never revealed.

Here is, in extenso, the fully confusing and unsupported conclusion of Mr. Wilczek:

“Everyday language is ill suited to describe quantum complementarity, in part because everyday experience does not encounter it. Practical cats interact with surrounding air molecules, among other things, in very different ways depending on whether they are alive or dead, so in practice the measurement gets made automatically, and the cat gets on with its life (or death). But entangled histories describe q-ons that are, in a real sense, Schrödinger kittens. Their full description requires, at intermediate times, that we take both of two contradictory property-trajectories into account.

The controlled experimental realization of entangled histories is delicate because it requires we gather partial information about our q-on. Conventional quantum measurements generally gather complete information at one time — for example, they determine a definite shape, or a definite color — rather than partial information spanning several times. But it can be done — indeed, without great technical difficulty. In this way we can give definite mathematical and experimental meaning to the proliferation of “many worlds” in quantum theory, and demonstrate its substantiality.”

It sounds impressive, but the reasons given are either well-known, or they rest on a sleight of hand.

Explicitly: “take both of two contradictory property-trajectories into account”: just read Feynman’s QED, first chapter. Feynman invented the ‘sum over histories’, and Wilczek is his parrot; but Feynman did not become crazy from his ‘sum over histories’: Richard smirked when his picturesque evocation was taken literally, decades later…

And now the sleight of hand: …“rather than [gather] partial information spanning several times. But it can be done — indeed, without great technical difficulty.” This is nothing new: it is the essence of the double slit discovered by that Medical Doctor and polymath, Young, around 1800 CE: when one runs lots of ‘particles’ through it, one sees the (wave) patterns. This is what Wilczek means by “partial information”. Guess what? We knew that already.
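
For the record, a bare-bones sketch of that double-slit arithmetic (toy numbers of my choosing, not Young’s or Wilczek’s): one complex amplitude per slit, added then squared, yields the fringe pattern that many individual ‘particles’ gradually build up.

    import numpy as np

    wavelength = 500e-9        # 500 nm light (illustrative)
    k = 2 * np.pi / wavelength
    d = 50e-6                  # slit separation
    L = 1.0                    # distance to the screen

    x = np.linspace(-0.02, 0.02, 9)          # screen positions (m)
    r1 = np.hypot(L, x - d / 2)              # path length from slit 1
    r2 = np.hypot(L, x + d / 2)              # path length from slit 2
    amplitude = np.exp(1j * k * r1) + np.exp(1j * k * r2)
    intensity = np.abs(amplitude) ** 2       # fringes, varying between 0 and 4
    print(np.round(intensity, 3))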

Believing that one can be, while not being, and putting that at the foundation of physics, is a new low in thinking. And it impacts the general mood, making it more favorable towards unreason.

If anything can be, without being, if anything not happening here, is happening somewhere else, then is not anything permitted? Dostoyevsky had a Russian aristocrat suggest that, if God did not exist, anything was permitted. And, come to think of it, the argument was at the core of Christianism. Or, more exactly, of the Christian reign of terror which started in the period 363 CE-381 CE, from the reign of emperor Jovian to the reign of emperor Theodosius. To prevent everything from being permitted, a god had to enforce the law.

What we have now is way worse: the new nihilists (Wilczek and his fellow manyworldists) do not just say that everything is permitted. They say: it does not matter if everything is permitted, or not. It is happening, anyway. Somewhere.

Thus Many-Worlds physics endangers not just the foundations of reason, but the very justification of morality: that what is undesirable should be avoided. Even the Nazis agreed with that principle. Many-Worlds physics says it does not matter, because it is happening, anyway. Somewhere, out there.

So what is going on, here, at the level of moods? Well, professor Wilczek teaches at Harvard. Harvard professors advised president Yeltsin of Russia to set up a plutocracy. It ruined Russia. The same professors made a fortune from it, while others were advising president Clinton to do the same; meanwhile Prime Minister Balladur in France was mightily impressed, and followed this new enlightenment by the Dark Side, as did British leaders, and many others. All these societies were ruined in turn. Harvard was the principal spirit behind the rise of plutocracy, and the engine propelling that rise was the principle that morality did not matter, because, because, well, Many-Worlds!

How does one go from the foundations of physics to the foundations of plutocracy? Faculty members in the richest, most powerful universities meet in mutual admiration societies known as “faculty clubs”, and in lots of other I-scratch-your-back, you-scratch-my-back social occasions they spend much of their time indulging in. So they influence each other, at the very least through the atmospheres of moods they create, and then breathe together.

Remember? It is not that everything is permitted: it’s happening anyway, so we may as well profit from it first. Many-Worlds physics feeds a mood favorable to many plutocrats, and that’s all there is to it. (But that, of course, is a lot, all too much.)

Patrice Ayme’

Crazy Physics Helps With Overall Madness?

April 27, 2016

Quantum Physics has long been a circus. When De Broglie proposed his thesis, his thesis jury (which comprised top physicists, including a Nobel Laureate) did not know what to make of it, and consulted Einstein. Einstein was enthusiastic, saying de Broglie “lifted a piece of the veil”. Three years later, de Broglie proposed his pilot wave theory (and received the Nobel in 1929). Pauli made an objection; de Broglie replied to it with the consummate politeness of the Prince he was, and thus the reply was not noticed. Five years after, the great mathematician Von Neumann asserted a “proof” that there was no Quantum Mechanics but for the one elaborated in Copenhagen. De Broglie’s objections were not listened to. Another two decades later, David Bohm presented de Broglie’s theory at the Institute for Advanced Study in Princeton. But Bohm was drowned by questions about why he had refused to testify before the House Un-American Activities Committee in Congress (the American-born Bohm promptly lost his job at Princeton University and his US passport, and would leave the US forever).

The usual interpretation of Quantum Physics considers that the De Broglie Matter Waves therein are only probability waves. This idea of Nobel Laureate Born has escaped controversy. However Einstein sourly remarked: “God does not play with dice.” To which Nobel Laureate Bohr smartly replied: “Stop telling God what to do!”

Qubits Are Real. But The Multiverse Is Madness. And Madness Is Contagious.

De Broglie suggested a “Double Solution” theory, which was promptly forgotten as Dirac launched Quantum ElectroDynamics by starting from the simplest relativistic wave, and building the (spinor) space he needed to let said wave wave in it. Bohm revived (some of) De Broglie’s ideas by proposing to guide an always well-defined particle with a (nonlocal) “quantum potential”.

***

And The Madness Set In:

Nowadays, descriptions of Quantum Physics are keen to assert that something can be in two places at the same time, that there are many worlds, or universes, created each time something happens, that cats are dead and alive, that the observer creates reality, etc…

All this derangement affecting physicists has something to do with a collective madness similar to the pseudo-scientific theories behind the Slave Trade, Stalinism, or Nazism.

No, I am not exaggerating. The theory behind enslaving Black Africans (going all the way back to the Middle Ages) was that Black Africans were, somehow, the missing link between man and ape. That’s why the Pope allowed the slave trade.

Neither am I exaggerating about fascism: the Nazis were actually obsessed by the new physics, a world where everything seemed possible. They called it “Jewish Physics”, and several Nobel laureates (Lenard, etc.) and top mathematicians (say Teichmüller, who died in combat on the Eastern Front) were its opponents.

It contributed to suggesting an overall mood: ‘if anything is possible, why not surrealism, fascism, Stalinism, Nazism?’

Germany had long led, intellectually (not that France did not lead too; but France was the great opponent). Thus when top physicists became Nazis even before Hitler did, they no doubt impressed the latter with their attacks on “Jewish Science”.

The madness was not confined to the Nazis, stricto sensu. An excellent example is Max Planck, discoverer of the Quantum.

Planck accepted Einstein’s paper on “The Electrodynamics of Moving Bodies” without references… even though it is certain that Planck knew about the work of Poincaré, Lorentz, FitzGerald, Michelson-Morley, etc. on Relativity. Poincaré was a star, and had toured the USA the year prior, delivering lectures on “Relativity”.

So what was Planck up to? Promoting the German arriviste at the expense of the most accomplished mathematician and physicist, because the latter was a Frenchman. (Poincaré, who was as elevated a character as can be found, nevertheless complained later about Einstein’s plagiarism.) Not only was Poincaré French, but his family were refugees from the occupation of Lorraine by the Prussians. Raymond Poincaré, who was prime minister of France several times and president of the French Republic during World War I, was Henri’s cousin.

This is of some import for the understanding of ideas: Poincaré discovered the idea of gravitational waves, and explained why all interactions should go at the speed of light. Scientists who later published (stole) the same ideas could not copy all of Poincaré’s arguments, as that would have been too obvious (that they stole the ideas), so those important details of Poincaré’s have been forgotten… And this haunts physics to this day.

I believe that this is how the extremely, all too relative, theory of Relativity à la Einstein appeared: Einstein could not duplicate all of Poincaré’s details, so he omitted (some of) them… resulting in a (slick) theory with a glaring defect: all classes of frames in uniform motion are supposed to be equivalent, a blatant absurdity (as even the Big Bang theory imposes a unique class of comoving frames). This brought a lot of (ongoing) confusion (say about “rest” mass).

Planck did not stop with stealing Relativity from Poincaré, and offering it to the Great German Empire.

Planck endorsed the general excitement of the German public, when Germany attacked the world on August 1, 1914. He wrote that, “Besides much that is horrible, there is also much that is unexpectedly great and beautiful: the smooth solution of the most difficult domestic political problems by the unification of all parties (and) … the extolling of everything good and noble.”

Planck also signed the infamous “Manifesto of the 93 intellectuals”, a pamphlet of war propaganda (while Einstein, at the academy in Berlin, retained a pacifistic attitude which almost led to his imprisonment; he was saved by his Swiss citizenship). The Manifesto, ironically enough, enumerated German war crimes, while denying (‘not true’) that they had happened. It did not occur to the idiots who signed it that just denying this long litany of crimes was itself a proof that they had occurred… And it’s telling that they had to deny them: the German population obviously was debating whether those crimes had happened, now that the war was not going well.

Planck got punished for his nationalism: his second son Erwin was taken prisoner by the French in 1914. His eldest son Karl died at Verdun (along with another 305,000 soldiers). When he saw Hitler was destroying Germany, Planck went to see the dictator, to try to change his mind, bringing to his attention that he was demolishing German universities. But to no avail. In January 1945, Erwin, to whom he had been particularly close, was sentenced to death by the obscene and delirious Nazi “people’s” court, the Volksgerichtshof, because Erwin had participated in the failed attempt at a coup against the criminal Hitler in July 1944. Erwin was executed on 23 January 1945 (along with around 5,000 German army officers, all the way up to a Feldmarschall).

So what to think of the “Multiverse”, “Dead and Alive Cats”, Things which are in different places at the same time, etc.? Do they have to do with suggesting, even promoting, a global reign of unreason?

I think they do. I think the top mood contaminates lesser intellectuals, political advisers, even politicians themselves. Thus political and social leaders feel anything goes; so, next thing you know, they suggest crazy things, like self-regulating finance, trade treaties where plutocrats can sue states (apparently one of the features of TPP and TTIP), or a world which keeps on piling up CO2, because everything is relative, dead, thus alive, and everywhere is the same, here, there and everywhere, since at the same place, in space, time, or whatever.

Physics, historically, was not just a model of knowledge, but of rational rectitude. This has been lost. And it was lost for technical reasons, by discarding other approaches, in part because of sheer nationalism.

In the 1960s, John Bell, the Irishman who was director of theory at CERN, proved his famous theorem on nonlocality; his papers on it were later collected in a book: “Speakable and Unspeakable in Quantum Mechanics”. A title full of hidden sense.

Patrice Ayme

The Quantum Puzzle

April 26, 2016

CAN PHYSICS COMPUTE?

Is Quantum Computing Beyond Physics?

More exactly, do we know, can we know, enough physics for (full) quantum computing?

I have long suggested that the answer to this question was negative, and smirked at physicists sitting billions of universes on a pinhead, as if they had nothing better to do, the children they are. (Just like their Christian predecessors in the Middle Ages, their motives are not pure.)

Now an article in the Notices of the American Mathematical Society of May 2016 repeats (some of) the arguments I had in mind: “The Quantum Computer Puzzle”. Here are some of the arguments. One often hears that Quantum Computers are a done deal. Here is the explanation from Justin Trudeau, Canada’s Prime Minister, which reflects perfectly the official scientific conventional wisdom on the subject: https://youtu.be/rRmv4uD2RQ4

(One wishes all our great leaders would be as knowledgeable… And I am not joking as I write this! Trudeau did engineering and ecological studies.)

… Supposing, Of Course, That One Can Isolate And Manipulate Qubits As One Does Normal Bits…

Before some object that physicists are better qualified than mathematicians to talk about the Quantum, let me point towards someone who is perhaps the most qualified experimentalist in the world on the foundations of Quantum Physics. Serge Haroche is a French physicist who got the Nobel Prize for figuring out how to count photons without seeing them. It’s the most delicate Quantum Non-Demolition (QND) method I have heard of. It involved making the world’s most perfect mirrors. The punch line? Serge Haroche does not believe Quantum Computers are feasible. However, Haroche does not say how he reached that conclusion. The article in the AMS does make plenty of suggestions to that effect.

Let me hasten to add that some form of Quantum Computing (or Quantum Simulation) called “annealing” is obviously feasible. D-Wave, a Canadian company, is selling such devices. In my view, Quantum Annealing is just the two-slit experiment writ large. Thus the counter-argument can be made that conventional computers can simulate annealing (and that has been the argument against D-Wave’s machines).
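
To make that counter-argument tangible, here is a minimal classical simulated-annealing sketch on a toy Ising chain (my own toy; it assumes nothing about D-Wave’s actual hardware or problem encodings):

    import math, random

    random.seed(0)
    n, J = 20, 1.0                       # chain length, ferromagnetic coupling
    spins = [random.choice([-1, 1]) for _ in range(n)]

    def delta_E(s, i):
        """Energy change from flipping spin i (E = -J * sum of s_i * s_{i+1})."""
        left = s[i - 1] if i > 0 else 0
        right = s[i + 1] if i < n - 1 else 0
        return 2 * J * s[i] * (left + right)

    T = 5.0
    while T > 0.01:
        i = random.randrange(n)
        dE = delta_E(spins, i)
        if dE <= 0 or random.random() < math.exp(-dE / T):
            spins[i] *= -1               # accept the flip
        T *= 0.999                       # slow cooling schedule

    print(spins)   # typically all aligned: the chain's ground state

An ordinary laptop “anneals” its way to the ground state; the open question is whether quantum hardware does so in a way no classical machine can match.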

Full Quantum Computing (also called “Quantum Supremacy”) would be something completely different. Gil Kalai, a famous mathematician and a specialist of Quantum Computing, is skeptical:

“Quantum computers are hypothetical devices, based on quantum physics, which would enable us to perform certain computations hundreds of orders of magnitude faster than digital computers. This feature is coined “quantum supremacy”, and one aspect or another of such quantum computational supremacy might be seen by experiments in the near future: by implementing quantum error-correction or by systems of noninteracting bosons or by exotic new phases of matter called anyons or by quantum annealing, or in various other ways…

A main reason for concern regarding the feasibility of quantum computers is that quantum systems are inherently noisy. We will describe an optimistic hypothesis regarding quantum noise that will allow quantum computing and a pessimistic hypothesis that won’t.”
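
To get a feel for why noise is the crux, a back-of-envelope sketch (my illustrative numbers, not Kalai’s): without error correction, the probability that a computation survives decays exponentially with the number of gates.

    # If each gate succeeds with probability 1 - eps, an uncorrected
    # g-gate computation survives with probability about (1 - eps)**g.
    eps = 0.001                       # an optimistic per-gate error rate
    for g in (1_000, 10_000, 100_000):
        print(g, (1 - eps) ** g)      # ~0.37, ~4.5e-5, ~3.7e-44

Hence the whole field’s obsession with quantum error correction, and Kalai’s question of whether it can work at all.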

Gil Kalai rolls out a couple of theorems which suggest that Quantum Computing is very sensitive to noise (those are similar to finding out which slit a photon went through). Moreover, he uses a philosophical argument against Quantum Computing:

“It is often claimed that quantum computers can perform certain computations that even a classical computer of the size of the entire universe cannot perform! Indeed it is useful to examine not only things that were previously impossible and that are now made possible by a new technology but also the improvement in terms of orders of magnitude for tasks that could have been achieved by the old technology.

Quantum computers represent enormous, unprecedented order-of-magnitude improvement of controlled physical phenomena as well as of algorithms. Nuclear weapons represent an improvement of 6–7 orders of magnitude over conventional ordnance: the first atomic bomb was a million times stronger than the most powerful (single) conventional bomb at the time. The telegraph could deliver a transatlantic message in a few seconds compared to the previous three-month period. This represents an (immense) improvement of 4–5 orders of magnitude. Memory and speed of computers were improved by 10–12 orders of magnitude over several decades. Breakthrough algorithms at the time of their discovery also represented practical improvements of no more than a few orders of magnitude. Yet implementing Boson Sampling with a hundred bosons represents more than a hundred orders of magnitude improvement compared to digital computers.”

In other words, it is unrealistic to expect such a, well, quantum jump…
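
As a hedged back-of-envelope (my own numbers, merely restating the scale argument): tracking the amplitudes of n qubits classically takes about 2^n numbers, which outruns any conceivable classical resource long before n reaches a few hundred.

    import math

    # The 10^80 atom count of the observable universe is a common rough estimate.
    print(math.log10(2 ** 100))   # ~30 orders of magnitude for 100 qubits
    print(math.log10(2 ** 300))   # ~90: beyond the ~80 orders of magnitude
                                  # estimated for the atoms of the universe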

“Boson Sampling” is the simplest way proposed, so far, to implement a Quantum Computer, and it is still hypothetical. (It is neither known whether it could be built, nor whether it would be good enough for Quantum Computing; yet it is intensely studied nevertheless.)
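
The classical hardness of Boson Sampling hinges on the permanent of a matrix, which, unlike the determinant, has no known fast algorithm. Here is a sketch of Ryser’s formula, a standard result offered as my own illustration; its cost grows like n·2^n, which is why a hundred bosons overwhelm digital machines.

    from itertools import combinations

    def permanent(A):
        """Permanent of an n x n matrix via Ryser's formula, O(n * 2^n)."""
        n = len(A)
        total = 0.0
        for r in range(1, n + 1):
            for cols in combinations(range(n), r):
                prod = 1.0
                for i in range(n):
                    prod *= sum(A[i][j] for j in cols)
                total += (-1) ** r * prod
        return (-1) ** n * total

    print(permanent([[1, 2], [3, 4]]))   # 1*4 + 2*3 = 10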

***

Quantum Physics Is The Non-Local Engine Of Space, and Time Itself:

Here is Gil Kalai again:

“Locality, Space and Time

The decision between the optimistic and pessimistic hypotheses is, to a large extent, a question about modeling locality in quantum physics. Modeling natural quantum evolutions by quantum computers represents the important physical principle of “locality”: quantum interactions are limited to a few particles. The quantum circuit model enforces local rules on quantum evolutions and still allows the creation of very nonlocal quantum states.

This remains true for noisy quantum circuits under the optimistic hypothesis. The pessimistic hypothesis suggests that quantum supremacy is an artifact of incorrect modeling of locality. We expect modeling based on the pessimistic hypothesis, which relates the laws of the “noise” to the laws of the “signal”, to imply a strong form of locality for both. We can even propose that spacetime itself emerges from the absence of quantum fault tolerance. It is a familiar idea that since (noiseless) quantum systems are time reversible, time emerges from quantum noise (decoherence). However, also in the presence of noise, with quantum fault tolerance, every quantum evolution that can experimentally be created can be time-reversed, and, in fact, we can time-permute the sequence of unitary operators describing the evolution in an arbitrary way. It is therefore both quantum noise and the absence of quantum fault tolerance that enable an arrow of time.”

Just for future reference, let’s “note that with quantum computers one can emulate a quantum evolution on an arbitrary geometry. For example, a complicated quantum evolution representing the dynamics of a four-dimensional lattice model could be emulated on a one-dimensional chain of qubits.

This would be vastly different from today’s experimental quantum physics, and it is also in tension with insights from physics, where witnessing different geometries supporting the same physics is rare and important. Since a universal quantum computer allows the breaking of the connection between physics and geometry, it is noise and the absence of quantum fault tolerance that distinguish physical processes based on different geometries and enable geometry to emerge from the physics.”
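
The time-reversal point can be made concrete: a noiseless quantum evolution is a product of unitaries, and applying their conjugate transposes in reverse order undoes it exactly. A small sketch (my own illustration, independent of Kalai’s formalism):

    import numpy as np

    rng = np.random.default_rng(1)

    def random_unitary(n):
        """A random unitary via QR decomposition (Haar-ish; illustrative)."""
        z = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
        q, r = np.linalg.qr(z)
        return q * (np.diag(r) / np.abs(np.diag(r)))   # fix column phases

    state = np.zeros(4, dtype=complex)
    state[0] = 1.0
    Us = [random_unitary(4) for _ in range(5)]

    evolved = state
    for U in Us:
        evolved = U @ evolved                 # forward evolution
    for U in reversed(Us):
        evolved = U.conj().T @ evolved        # exact time reversal
    print(np.allclose(evolved, state))        # True: no intrinsic arrow of time

Add noise, and the reversal fails; that is Kalai’s point about where the arrow of time comes from.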

***

I have proposed a theory which explains the preceding features, including the emergence of space. Let’s call it Sub Quantum Physics (SQP). The theory breaks a lot of sacred cows. Besides, it brings an obvious explanation for Dark Matter. If I am correct, the Dark Matter Puzzle is directly tied in with the Quantum Puzzle.

In any case, it is a delight to see in print part of what I have been severely criticized for saying for all too many decades… The gist of it all is that present-day physics may well be deeply incomplete.

Patrice Ayme’