Archive for the ‘Entanglement’ Category


May 19, 2017

Through Wave Collapse and the ensuing Entanglements it sometimes brings, QUANTUM PHYSICS CREATES A CAUSAL STRUCTURE THROUGHOUT THE UNIVERSE, THUS AN ARROW OF TIME.

Actually it’s more than a simple causal structure: it is an existential structure, as localization creates materialization, in the (Sub-)Quantum Theory I advocate. (It’s a theory where there are no dead-and-alive cats, but particles in flight are not particles… Contrary to what Einstein thought, but more along the lines of Niels Bohr, horror of horrors…) It also means that time, at the smallest scale, is a nonlocal entanglement. This is not weird new-age poetry, but pretty much what the raw formalism of Quantum Physics says. I throw the challenge to any physicist to contradict this in any way. It’s completely obvious on the face of it.

You read it here first, as they say (although I may have said it before). Is time absolute? How could time be absolute? Where does the Arrow Of Time (Eddington) come from? Is there something else which grows with time?

The old answer is entropy, traditionally denoted by S.

Boltzmann’s equation S = k log P relates entropy to P, the number of states accessible to the system; the Second Law then says that entropy augments during the evolution of a system. Entropy was a construction from later Nineteenth Century physics, a successful attempt to understand the basic laws of thermodynamics (mostly due to Carnot).
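To make the formula concrete, here is a minimal Python sketch of Boltzmann’s relation (the state counts are illustrative, not physical):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy(num_states: int) -> float:
    """Boltzmann entropy S = k log P, for P equally likely accessible states."""
    return K_B * math.log(num_states)

# For two independent subsystems the accessible states multiply,
# so entropies add; any interaction that opens new states raises S.
s1, s2 = entropy(10**20), entropy(10**22)
s_combined = entropy(10**20 * 10**22)
assert abs(s_combined - (s1 + s2)) < 1e-30
```

Note how the logarithm turns the multiplication of state counts into the addition of entropies, which is why S is extensive.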

A big problem for classical thermodynamics: what’s a state? That’s not clear.

However, Quantum Physics defines states very precisely, and very specifically: a situation, defined in space-time (what Bohr et al. called an “experiment”, rightly so!), defines a number of possible outcomes; the latter become the “states”, a basis for the Hilbert Space the “experiment” defines.

Classical statistical mechanics does not enjoy such precisely defined states. So why not use the states of Quantum Physics? Some could object that Quantum “experiments” are set up by people. However, Quantum Interactions happen all the time, independently of people. As in the Quantum experiments set up by people, those Quantum Interactions grow something: Quantum Entanglement. (Self-described “Quantum Mechanic” Seth Lloyd from MIT has also mentioned that entanglement and the arrow of time could be related.)

Quantum Entanglement has a direction: from where singularization (= localization = the collapse of the Quantum wave packet) happened first, to the distant place it creates the geometry of (yes, entanglement creates geometry; that’s why it’s so baffling to specialists!).

Quantum Physics, Or, More Precisely, What I Call QUANTUM INTERACTIONS, Are Irreversible Processes. Hence the Arrow Of Time.

So we have two things which grow, and can’t be reversed: Time and Wave Collapse/Quantum Entanglement. I propose to identify them. (After all, Maxwell proposed to identify electromagnetic waves and light, just because they are both waves and went at the same speed; it turned out to be a magnificent insight.)

Quantum Wave function collapse is time irreversible (actually, the entire Quantum Wave deployment is time irreversible, because it depends only upon the geometry it’s deployed in). The mechanism of wave function collapse is philosophically a matter of often obscure interpretations, and arguably the greatest problem in physics and philosophy.

My position here is perfectly coherent: I believe the Quantum Waves are real. (So I do not believe the waves are waves of ignorance, and an artefact, as some partisans of Quantum decoherence have it). Those objective waves are real, although not always in one piece (that’s how I generate Cold Dark Matter).

By the way, it is the collapse of the Quantum Wave which “creates” the Quantum Entanglement. At least that’s how the mathematics, the description of the theory, has it! The picture it creates in one’s mind (first the wave, then the collapse, then the entanglement) makes sense. Actually I am arguing that this is how sense makes sense!

Quantum Entanglement is a proven experimental fact. All physicists have to agree with that. Thus the Quantum Wave has to be real, as it is the cause of the Quantum Entanglement! (I am pointing out here that those, and that’s now nearly all of them, who believe in Entanglement are incoherent if they don’t believe in the wave too!).

Jules Henri Poincaré had seen that time and space were not equivalent. That was meritorious, as Poincaré had proposed the original ideas of “local time” and “local space” theories, which are the fundamental backbones of Special Relativity (they are deduced from the constancy of the speed of light).

Even Einstein publicly frowned on the concept of “spacetime”, which identifies space and time; “spacetime” was proposed by Minkowski, Einstein’s own professor at the ETH… They may not have been friends, as Minkowski compared Einstein to a “lazy dog”; Einstein, of course, respected Poincaré so much, that he grabbed the entire theory of Relativity from him, including its name…

Quantum Physics does not outright treat time as equivalent to space, quite the opposite (although Quantum Field theorists have tried to, and do treat space and “imaginary time” as the same!). In fundamental Quantum Physics, time is a one-parameter group of transformations, not really a dimension.
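That last statement can be made concrete in a toy model (my own choice of a two-level Hamiltonian, not anything canonical): time evolution is a family of operators U(t) obeying the group law U(t1)U(t2) = U(t1+t2), which is what “one-parameter group” means.

```python
import math

# Two-level system with Hamiltonian H = sigma_x (hbar = 1):
# the evolution operator is U(t) = exp(-i H t) = cos(t) I - i sin(t) sigma_x.
def U(t):
    c, s = math.cos(t), -1j * math.sin(t)
    return [[c, s], [s, c]]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def close(X, Y):
    return all(abs(X[i][j] - Y[i][j]) < 1e-12 for i in range(2) for j in range(2))

t1, t2 = 0.7, 1.3
# One-parameter group law: evolving for t1 then t2 equals evolving for t1 + t2.
# Time composes like a parameter, not like an extra spatial dimension.
assert close(matmul(U(t1), U(t2)), U(t1 + t2))
```

Each U(t) is unitary, so norms (probabilities) are preserved along the parameter; nothing analogous holds for a spatial coordinate.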

When a glass falls and shatters, Classical Mechanics is at a loss: “Why can’t it reassemble itself, with as little work?” Classical Thermodynamics mumbles: “Because Entropy augments.” (That may be a tenable position, but one will have to count the states of the glass in a Quantum way. Even then, the full energy computation will reveal a lack of symmetry.)

I say, simply: “A glass which has shattered can’t be reassembled, because Quantum Interactions, and ensuing entanglements, happen.” The resulting topology of cause and effect is more complicated than what one started with, and can’t be reversed. Quantum Interactions, and the ensuing effects at a distance they provide, create a partial, nonlocal ordering of the universe. Time. (Once a set has been physically defined, it has been thoroughly interacted with, Quantum Mechanically, and then it becomes a “well ordering”!)

So what’s time? The causal structure of the universe as determined by irreversible, causal Quantum Wave collapse and Quantum Entanglement.

Patrice Ayme’

DARK MATTER EMERGENCE! (If so, is a New Quantum revolution at hand?)

March 31, 2017

Long story short: My own theory of Dark Matter predicts that Dark Matter is EMERGENT. That could be viewed as a huge flaw, easy to disprove, sending me back to a burrow somewhere to pursue my humble subterranean existence of sorts. HOWEVER, big surprise: DARK MATTER EMERGENCE seems to be exactly what was just observed in 2017, at the European Southern Observatory (ESO)!


Anomalies in the behavior of gravitation at a galactic scale have become the greatest crisis in physics. Ever:

What is the problem? Four centuries of physics possibly standing on their head! Using the virial theorem, Swiss astronomer Fritz Zwicky discovered and named Dark Matter, or, as Zwicky said in German, “dunkle Materie“, in 1933. Zwicky observed an enormously mysterious gravitational pull.

Zwicky computed that the observed gravitational pull did not correspond to the visible matter, by an ORDER OF MAGNITUDE, and thus Zwicky assumed that there was plenty of matter that could not be seen. (At the time, physicists scoffed, and went to stuff more interesting to the military, thus, better esteemed and more propitious to glorious splurging and handshakes from political leaders!)

If spiral galaxies were only made up of the matter that we can see, stars at the outer edge should orbit the center slower than those closer to the center. But that is not the case: the stars in a galaxy like Andromeda move at similar speeds, regardless of their distance from the galactic center. (Zwicky’s own evidence came from galaxy motions in the Coma cluster; flat rotation curves such as Andromeda’s were measured later. For nationalistic reasons Americans love to attribute DM’s discovery to the American astronomers Vera Rubin and Kent Ford in the 1970s. However great Vera Rubin is, that’s despicable: they worked 40 years after Zwicky.)

Many studies since the 1930s provided evidence for Dark Matter. Such matter doesn’t interact with light, that’s why it is dark. Thus, one can only observe the effects of Dark Matter via its gravitational effects.

Nobel Prizes Were Only Given To the 5% So Far. The 5% Are All That Today’s Official Physics Is About. This Is One Of The Reasons Why I Am Thinking Outside Of Their 5% Box…


How does one compute the mass of a galaxy?

One just looks at how many stars it has. (In the Solar System, the sun is a thousand times more massive than all the planets combined; studies on how much stars are moved by the planets around them confirm that most of the mass is in the stars.) And that shows up as the overall light emitted by a galaxy. Summing up the observed light sums up the mass. Or, at least, that was the long-standing idea. (More recently, the pull gravitation exerts on light has been used to detect Dark Matter, and it has been used on a… massive scale!)

At the scale of galaxies, or galactic clusters, the motions of objects indicate at least ten times the gravitational force that should be there, according to gravitation theory, considering the mass we see (that is, the mass of all the stars we see).
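A back-of-the-envelope sketch of the mismatch, with illustrative Milky-Way-like numbers (my own assumed values, not figures from the observations discussed here):

```python
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
KPC = 3.086e19         # meters per kiloparsec
M_SUN = 1.989e30       # kg

def dynamical_mass(v_orbit: float, radius: float) -> float:
    """Mass enclosed within `radius` implied by a circular orbit at v_orbit.
    From v^2 / r = G M / r^2, i.e. M = v^2 r / G."""
    return v_orbit**2 * radius / G

# Illustrative numbers (assumed for the sketch):
v = 220e3              # flat rotation speed, m/s
r = 30 * KPC           # outer radius probed
m_dyn = dynamical_mass(v, r) / M_SUN
m_luminous = 6e10      # rough luminous (stars + gas) mass, solar masses (assumed)
print(f"dynamical ~ {m_dyn:.1e} M_sun vs luminous ~ {m_luminous:.1e} M_sun")
# The dynamical estimate exceeds the luminous one severalfold: "missing" mass.
```

The exact factor depends on the galaxy and radius probed; the point is only that the orbit-implied mass dwarfs the light-implied mass.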

Problem: that would mean that the so-called “Standard Model” of physics has no explanation for most of the mass in the galactic clusters.

Reality check: the celebrities of physics are very arrogant, and think they know exactly what the universe had for breakfast, 13.8 billion years ago, and how big it was (never mind that their logic is ridiculously flawed). Up to a few years ago, many were in denial that they were missing most of the mass-energy in the universe with their Standard Model theory. 

However, here they are now, having to admit they missed 95.1% of the mass-energy in the universe (according to their own latest estimates)!

A low-logical-cost solution to the riddle of the apparently missing mass was to decree that all physicists who have studied gravitation since Bullialdus, nearly four centuries ago, got it wrong, and that gravitation is not, after all, an inverse-square-of-the-distance law. A problem is that French astronomer Bullialdus’ very elementary reasoning seems still to have kept some validity today. Remember that, in the Quantum Field Theory setting, forces are supposedly due to (virtual) particle exchanges? Well, that was the basic picture Bullialdus had in mind! (Thus those who want to modify so-called “Newtonian Dynamics” wreck the basic particle exchange model!)


Bullialdus’ Inverse Distance Squared Law, Basic to Newton-Einstein:

Ismael Boulliau (aka Bullialdus), a famous astronomer and member of the English Royal Society, proposed the inverse square law for gravity, a generation before Newton. (Bullialdus crater on the Moon, named for Boulliau, would have water, by the way.) Boulliau reasoned that the force would come from particles emitted by the sun, just like light. Here is Bullialdus’ voice:

“As for the power by which the Sun seizes or holds the planets, and which, being corporeal, functions in the manner of hands, it is emitted in straight lines throughout the whole extent of the world… seeing that it is corporeal, it becomes weaker and attenuated at a greater distance or interval, and the ratio of its decrease in strength is the same as in the case of light, namely, the duplicate proportion, but inversely, of the distances, that is, 1/d².”

Why is this still true today? The carriers of force are particles. If they go to infinite distance (as electromagnetism and gravitation do), then the density of field carriers (photons, gravitons) will go down, as Bullialdus said, for the reason he gave.
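Bullialdus’ geometric argument is easily sketched: carriers emitted isotropically spread over a sphere of area 4πd², so their surface density falls as 1/d². A minimal illustration:

```python
import math

def carrier_density(n_emitted: float, distance: float) -> float:
    """Bullialdus' picture: n carriers emitted isotropically spread over a
    sphere of area 4*pi*d^2, so their surface density falls as 1/d^2."""
    return n_emitted / (4 * math.pi * distance**2)

# Doubling the distance quarters the density: the inverse-square law.
assert math.isclose(carrier_density(1e6, 2.0), carrier_density(1e6, 1.0) / 4)
```

Any force carried by conserved particles radiating into three-dimensional space inherits this law; that is the geometric content of 1/d².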

Bullialdus’ observation is the basis of Newton’s gravitation theory, which is itself the first-order approximation of Einstein’s theory of gravitation. (Einstein’s gravitation is a tweak on Newton’s theory; what Einstein did was actually to re-activate Buridan’s inertial theory with advanced mathematics invented by others: Riemann, Ricci, Hilbert, Levi-Civita.)

There is a basic problem here: although Einstein’s theory is a small tweak on Newton’s, MONDs are not. Correcting a theory by a factor of ten, a hundred, or a thousand is no tweak. Moreover: 

The ESO (European Southern Observatory) observation, illustrated above by ESO itself, seems to condemn BOTH of the two known, “official” classes of solutions for the gravitation problem: LCDM Dark Matter and MOND. The only theory left standing is my own Sub-Quantic Dark Matter theory, which is fully emergent.


2017 ESO Discovery: Slowly Spinning Old Galaxies:

Natascha Förster Schreiber at the Max Planck Institute for Extraterrestrial Physics in Germany and her colleagues have used the European Very Large Telescope in Chile to make the most detailed observations so far of the movement of six giant galactic discs, 10 billion years ago.

They found that, unlike in (quasi-)contemporary galaxies, the stars at the edges of these galaxies long ago, far away, move more slowly than those closer in.

“This tells us that at early stages of galaxy formation, the relative distribution of the normal matter and the dark matter was significantly different from what it is today,” says Förster Schreiber. (Well, maybe. MY interpretation would be very different! No DM!)

In order to check their unexpected results, the researchers used a “stack” of 101 images of other early galaxies to find an average picture of their rotations. The stacked galaxies matched the rotations of the more rigorously studied ones. “We’re not just looking at six weirdo galaxies – this could be more common,” says Förster Schreiber. “For me, that was the wow moment.”


MOdified Newtonian Dynamics (MONDs) Don’t Work:

About 10 billion years ago, there was a peak formation period of galaxies. By looking 10 billion light years away, one can see what was going on then, and have plenty of galaxies to look at. Where was the Dark Matter there? Was there Dark Matter then? One can answer these questions by just looking, because Dark Matter shows up in the way galaxies rotate, or orbit (in galactic cluster).

The result is both completely unexpected and spectacular! I am thrilled by it, because what is observed to happen is exactly the main prediction of MY theory of Dark Matter!

What is found is that, ten billion years ago, the largest star-forming galaxies were dominated by normal matter, not by the dark matter that’s so influential in galaxies today. (I reckon that this result was already indicated by the existence of galaxies which are mostly Dark Matter… at least in my sort of cosmology which differs massively from the standard Lambda Cold Dark Matter, LCDM model.)

MOND theories, relativistic or not, say that gravity is ten times stronger at, say, 30,000 light-years away from a mass. If that’s the true law of gravitation in the last few hundreds of millions of years (as observed in presently surrounding galaxies), it should have been the case ten billion years ago. But that’s not what’s observed. So MOND theories can’t be true.
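To see why the observation bites, compare the two regimes numerically (a sketch with assumed values; a0 is the usual MOND acceleration scale, and the mass is illustrative):

```python
import math

G = 6.674e-11          # m^3 kg^-1 s^-2
A0 = 1.2e-10           # MOND acceleration scale, m/s^2
KPC = 3.086e19         # meters per kiloparsec
M = 1e11 * 1.989e30    # baryonic mass, kg (illustrative value)

def v_newton(r: float) -> float:
    """Newtonian circular speed: falls off as 1/sqrt(r)."""
    return math.sqrt(G * M / r)

def v_deep_mond(r: float) -> float:
    """Deep-MOND regime (a << a0): v^4 = G*M*a0, independent of r (flat curve)."""
    return (G * M * A0) ** 0.25   # r does not appear: the curve is flat

r1, r2 = 10 * KPC, 40 * KPC
assert v_newton(r2) < v_newton(r1)                     # Keplerian decline
assert math.isclose(v_deep_mond(r1), v_deep_mond(r2))  # MOND flatness
```

If MOND were the law of gravitation, the flat curve would be forced at all epochs; the declining outer rotation seen in the ten-billion-year-old disks is therefore the Newtonian-like signature, which MOND cannot reproduce.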


LCDM cop-out: Dark Matter makes halos, like around the Virgin Mary’s Head!

On the face of it, the discovery about those ten-billion-year-old galaxies says that the galactic disks then did not contain Dark Matter. That seems to me to shoot down both MOND theories and the LCDM model (that’s the fancy name for the conventional Big Bang, latest version).

However, conventional scientists, and, in particular, cosmologists, are good at pirouettes; that’s why they are professionals. There is still a (twisted) logical escape for the LCDM model. The differences in early galaxies’ rotations demonstrate that there is very little Dark Matter toward the middle of their disks, to start with, reason the Cold Dark Matter specialists. Instead, those ancient galaxies’ disks are almost entirely made up of the matter we see as stars and gas. The further away (and thus earlier in cosmic history) the galaxies were, the less dark matter their disks contained.

The specialists suggest that the turbulent gas in early galaxies condensed into the flat, rotating disk shapes we see today more quickly than Dark Matter, which remained in a diffuse “halo”, which would progressively fall in… but had not fallen in enough, ten billion years ago. (That’s weird, because I thought LCDM mixed normal matter and dark matter right from the start. In any case, I am not going to make their increasingly fishy case for them!)

Dark Matter gathers – but it takes time. This is exactly what my theory of Dark Matter predicts. In my own theory, Dark Matter is the result, the debris, of Quantum Interactions (entanglement resolutions, singularization) at very large distances. This debris gathering takes time.

My Dark Matter theory predicts that Dark Matter is an Emergent phenomenon. No other theory does that. Studies of more than 100 old giant galaxies support my theory, while making the situation (very) difficult for the conventional Dark Matter theory (“LCDM”) and impossible for the MOND theories.

This progressive build-up of Dark Matter is NOT predicted by the other two Dark Matter theories. The standard (LCDM) cosmological Dark Matter model does NOT predict a slow gathering of Dark Matter. Nor do the MOdified Newtonian Dynamics theories (MOND, relativistic or not) predict a slow apparition of Dark Matter.

It has been taken for granted by the Dark Matter advocates that Dark Matter, a sort of non-standard standard matter, was in the universe from its legendary start, the Big Boom, aka Big Bang.

“This is an important step in trying to figure out how galaxies like the Milky Way and larger galaxies must have assembled,” says Mark Swinbank at Durham University. “Having a constraint on how early the gas and stars must have formed the discs and how well-mixed they were with dark matter is important to informing their evolution.”

Journal reference: Nature, DOI: 10.1038/nature21685

Right. Or maybe, as I speculate, for plenty of excellent reasons coming from logically far away, this is an indication that it is not Gravitation Theory, but Quantum Theory, which is incorrect. Oh, the Standard Model, too, is not correct. But we all already knew this…

Conclusion: If the ESO observation that Dark Matter was not present in large galactic disks, ten billion years ago, is correct, I cannot imagine how MOdified Newtonian Dynamics theories could survive. And I find it highly implausible that LCDM would. All that is left standing is my own theory, the apparent main flaw of which is now turned into a spectacular prediction! DARK MATTER Appears SLOWLY, as predicted by Patrice Ayme’s SUB-QUANTIC Model. (Wow!)

Patrice Ayme’

DARK MATTER, Or How Inquiry Proceeds

September 7, 2016

How to find really new knowledge? How do you find really new science? Not by knowing the result: this is what we don’t have yet. Any really new science will not be deduced from pre-existing science. Any really new knowledge will come out of the blue. Poetical logic will help before linear logic does.

The case of Dark Matter is telling: this increasingly irritating elephant in the bathroom has been in evidence for 80 years, lumbering about. As the encumbering beast did not fit existing science, it was long religiously ignored by the faithful, as a subject not worthy of serious inquiry by very serious physicists. Now Dark Matter, five times more massive than Standard Model matter, is clearly sitting heavily outside of the Standard Model, threatening to crush it into irrelevance. Dark matter obscures the lofty pretense of known physics to explain everything (remember the grandly named TOE, the so-called “Theory Of Everything“? That was a fraud, snake oil, because mainstream physics celebrities crowed about TOE, while knowing perfectly well that Dark Matter dwarfed standard matter, and was completely outside of the Standard Model).

Physicists are presently looking for Dark Matter, knowing what they know, namely that nature has offered them a vast zoo of particles, many of them without rhyme or reason (some have rhyme, a symmetry, a mathematical group such as SU3 acting upon them; symmetries revealed new particles, sometimes). 

Bullet Cluster, 100 Million Years Old. Two Galaxies Colliding. The Dark Matter, In Blue, Is Physically Separated From the Hot, Standard Matter Gas, in Red.


[This sort of picture is most of what we presently have to guess what Dark Matter could be; the physical separation of DM and SM is most telling to me: it seems to indicate that SM and DM do not respond to the same forces, something that my Quantum theory predicts; it’s known that Dark Matter causes gravitational lensing, as one would expect, as it was first found by its gravitational effects, in the 1930s…]

However, remember: a truly completely new piece of science cannot be deduced from a pre-existing paradigm. Thus, if Dark Matter were really about finding a new particle type, it would be interesting, but not as interesting as it would be if it came, not from a new particle type after all, but from a completely new law of physics.

This is the quandary about finding truly completely new science. It can never be deduced from ruling paradigms, and may actually overthrow them. What should then be the method to use? Can Descartes and Sherlock Holmes help? The paradigm presented by Quantum Physics helps. The Quantum looks everywhere in space to find solutions: this is where its (“weird”) nonlocality comes in. Nonlocality is crucial for interference patterns and for finding lowest energy solutions, as in the chlorophyll molecule. This suggests that our minds should go nonlocal too, and we should look outside of a more extensive particle zoo to find what Dark Matter is.

In general, searching for new science should be by looking everywhere, not hesitating to possibly contradict what is more traditional than well established.

An obvious possibility is, precisely, that Quantum Physics is itself incomplete, and generating Dark Matter in places where said incompleteness would be most blatant. More precisely, Quantum processes, stretched over cosmic distances, instead of being perfectly efficient and nonlocal over gigantically cosmic locales, could leave a Quantum mass-energy residue, precisely in the places where extravagant cosmic stretching of Quanta occurs (before “collapse”, aka “decoherence”).

The more one fails to find a conventional explanation (namely a new type of particle) for Dark Matter, the more likely my style of explanation becomes. How could one demonstrate it? Not by looking for new particles, but by conducting new and more refined experiments on the foundations of Quantum Physics.

If this guess is correct, whatever is found askew in the axioms of present Quantum Physics could actually help future Quantum Computer technology (because the latter works with Quantum foundations directly, whereas conventional high energy physics tend to eschew the wave aspects, due to the high frequencies involved).

Going on a tangent is what happens when the central, attractive force, is let go. A direct effect of freedom. Free thinking is tangential. We have to learn to produce tangential thinking.

René Descartes tried to doubt the truth of all his beliefs to determine which beliefs he could be certain were true. However, at the end of “The Meditations” he hastily concluded that we can distinguish between dream and reality. It is not that simple. The logic found in dreams is all too similar to the logic used by full-grown individuals in society.

Proof? Back to Quantum Physics. On the face of it, the axioms of Quantum Physics have a dream-like quality (there is no “here”, nor “there”, “now” is everywhere, and, mysteriously, the experiment is Quantum, whereas the “apparatus” is “classical”). Still, most physicists, after insinuating they have figured out the universe, eschew the subject carefully. The specialists of Foundations are thoroughly confused: see Sean Carroll.

However unbelievable Quantum Physics is, however dream-like it is, physicists believe in it, and don’t question it any more than cardinals would Jesus. Actually, it’s this dream-like nature which, shared by all, defines the community of physicists. Cartesian doubt, pushed further than Descartes did, will question not just the facts, the allegations, but the logic itself. And even the mood behind it.

Certainly, in the case of Dark Matter, some of the questions civilization has to ask should be:

  1. How sure are we of the Foundations of Quantum Physics? (Answer: very sure, all too sure!)
  2. Could it not be that Dark Matter is a cosmic-size experiment in the Foundations of Quantum Physics?

Physics, properly done, does not just question the nature of nature. Physics, properly done, questions the nature of how we find out the nature of anything. Physics, properly done, even questions the nature of why we feel the way we do. And the way we did. About anything, even poetry. In the end, indeed, even the toughest logic is a form of poetry, hanging out there, justified by its own beauty, and nothing else. Don’t underestimate moods: they call what beauty is.

Patrice Ayme’

Entangled Universe: Bell Inequality

May 9, 2016

Abstract: The Bell Inequality shatters the picture of reality civilization previously established. A simple proof is produced.

What is the greatest scientific discovery of the Twentieth Century? Not Jules Henri Poincaré’s Theory of Relativity and his famous equation: E = mc². Although a spectacular theory, since Poincaré made time local in order to keep the speed of light constant, it stemmed from Galileo’s Principle of Relativity, extended to Electromagnetism. To save electromagnetism globally, Jules Henri Poincaré made time and length local.

So was the discovery of the Quantum by Planck the greatest discovery? To explain two mysteries of academic physics, Planck posited that energy was emitted in lumps. Philosophically, though, the idea was just to extend to energy the basic philosophical principle of atomism, which was two thousand years old. Energy itself was discovered by Émilie Du Châtelet in the 1730s.

Quantum Entanglement Is NOT AT ALL Classically Predictable


Just as matter went in lumps (strict atomism), so did energy. In light of Poincaré’s E = mc², matter and energy are the same, so this is not surprising. (By a strange coincidence (?), Poincaré demonstrated, and published, E = mc² a few months apart, in the same year, 1900, as Max Planck did E = hf; Einstein used both formulas in 1905.)

The greatest scientific discovery of Twentieth Century was Entanglement… which is roughly the same as Non-Locality. Non-Locality would have astounded Newton: he was explicitly very much against it, and viewed it, correctly, as the greatest flaw of his theory. My essay “Non-Locality” entangles Newton, Émilie Du Châtelet, and the Quantum, because therefrom the ideas first sprung.


Bell Inequality Is Obvious:

John Bell, of CERN’s Theory Division, discovered an inequality so trivial, so apparently basic, so incredibly obvious, that the most elementary common sense says it should always be true. Ian Miller (PhD, Physical Chemistry) provided a very nice perspective on all this. Here it is, cut and pasted (with his agreement):

Ian Miller: A Challenge! How can Entangled Particles violate Bell’s Inequalities?

Posted on May 8, 2016 by ianmillerblog

  The role of mathematics in physics is interesting. Originally, mathematical relationships were used to summarise a myriad of observations, thus from Newtonian gravity and mechanics, it is possible to know where the moon will be in the sky at any time. But somewhere around the beginning of the twentieth century, an odd thing happened: the mathematics of General Relativity became so complicated that many, if not most physicists could not use it. Then came the state vector formalism for quantum mechanics, a procedure that strictly speaking allowed people to come up with an answer without really understanding why. Then, as the twentieth century proceeded, something further developed: a belief that mathematics was the basis of nature. Theory started with equations, not observations. An equation, of course, is a statement, thus A equals B can be written with an equal sign instead of words. Now we have string theory, where a number of physicists have been working for decades without coming up with anything that can be tested. Nevertheless, most physicists would agree that if observation falsifies a mathematical relationship, then something has gone wrong with the mathematics, and the problem is usually a false premise. With Bell’s Inequalities, however, it seems logic goes out the window.

Bell’s inequalities are applicable only when the following premises are satisfied:

Premise 1: One can devise a test that will give one of two discrete results. For simplicity we label these (+) and (-).

Premise 2: We can carry out such a test under three different sets of conditions, which we label A, B and C. When we do this, the results between tests have to be comparable, and the simplest way of doing this is to represent the probability of a positive result at A as A(+). The reason for this is that if we did 10 tests at A, 10 at B, and 500 at C, we cannot properly compare the results simply by totalling results.

Premise 1 is reasonably easily met. John Bell used, as an example, washing socks. The socks would either pass a test (e.g. they are clean) or fail (i.e. they need rewashing). In quantum mechanics there are good examples of suitable candidates, e.g. a spin can be either clockwise or counterclockwise, but not both. Further, all particles must have the same spin, and as long as they are the same particle, this is imposed by quantum mechanics. Thus an electron has a spin of either +1/2 or -1/2.

Premises 1 and 2 can be combined. By working with probabilities, we can say that each particle must register once, one way or the other (or each sock is tested once), which gives us

A(+) + A(-) = 1; B(+) + B(-) = 1;   C(+) + C(-) = 1

i.e. the probability of one particle tested once and giving one of the two results is 1. At this point we neglect experimental error, such as a particle failing to register.

Now, let us do a little algebra/set theory by combining probabilities from more than one determination. By combining, we might take two pieces of apparatus, and with one determine the (+) result at condition A, and the negative one at B. We then take the product of these, because probabilities are multiplicative. If so, we can write

A(+) B(-) = A(+) B(-) [C(+) + C(-)]

because the bracketed term [C(+) + C(-)] equals 1, the sum of the probabilities of results that occurred under conditions C.


Similarly,

B(+)C(-) = [A(+) + A(-)] B(+)C(-)

By adding and expanding

A(+) B(-) + B(+)C(-) = A(+) B(-) C(+) + A(+) B(-) C(-) + A(+) B(+)C(-) + A(-)B(+)C(-)

= A(+)C(-) [B(+) + B(-)] + A(+)B(-)C(+) + A(-)B(+)C(-)

Since the bracketed term [B(+) + B(-)] equals 1 and the last two terms are positive numbers, or at least zero, we have

A(+) B(-) + B(+)C(-) ≧ A(+)C(-)

This is the simplest form of a Bell inequality. In Bell’s sock-washing example, he showed how socks washed at three different temperatures had to comply.

An important point is that, provided the samples in the tests give only one result from only two possible results, and provided the tests are applied under three sets of conditions, the mathematics says the results must comply with the inequality. Further, only premise 1 relates to the physics of the samples tested; the second is merely a requirement that the tests are done competently. The problem is, modern physicists say entangled particles violate the inequality. How can this be?

Non-compliance by entangled particles is usually considered a consequence of the entanglement being non-local, but that makes no sense because in the above derivation, locality is not mentioned. All that is required is that premise 1 holds, i.e. measuring the spin of one particle, say, means the other is known without measurement. So, the entangled particles have properties that fulfil premise 1. Thus violation of the inequality means either one of the premises is false, or the associative law of sets, used in the derivation, is false, which would mean all mathematics are invalid.

So my challenge is: produce a mathematical relationship that shows how these violations could conceivably occur. You must come up with a mathematical relationship or a logic statement that falsifies the above inequality, and it must include a term that specifies when the inequality is violated. So, any takers? My answer in next Monday's post.

[Ian Miller.]


The treatment above shows how ludicrous it should seem that reality could violate that inequality… BUT IT DOES! This is something which nobody saw coming. No philosopher ever imagined something so weird. I gave an immediate answer to Ian:

‘Locality is going to come in the following way: A is going to be in the Milky Way; B and C, on Andromeda. A(+)B(-) is going to be ½ cos²(b-a). Therefrom the contradiction. There is more to be said. But first of all, I will re-blog your essay, as it makes the situation very clear.’
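To see numerically how such angle-dependent correlations break the inequality, here is a hedged sketch (Python; it uses the textbook quantum prediction for a polarization-correlated photon pair, in which the chance that one photon passes its analyzer while its partner fails is ½ sin² of the relative angle, the complement of the ½ cos² probability quoted above):

```python
import math

def p_pass_fail(theta):
    # Textbook prediction for a polarization-correlated photon pair:
    # probability one photon passes its analyzer while the other fails,
    # as a function of the relative analyzer angle theta
    return 0.5 * math.sin(theta) ** 2

a, b, c = 0.0, math.pi / 6, math.pi / 3        # analyzers at 0, 30, 60 degrees
lhs = p_pass_fail(b - a) + p_pass_fail(c - b)  # plays the role of A(+)B(-) + B(+)C(-)
rhs = p_pass_fail(c - a)                       # plays the role of A(+)C(-)
print(lhs < rhs)   # True: 0.25 < 0.375, the inequality is violated
```

The correlations depend only on the angle between far-apart analyzers, and that is exactly where locality comes back in.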

Patrice Ayme’

TO BE AND NOT TO BE? Is Entangled Physics Thinking, Or Sinking?

April 29, 2016

Frank Wilczek, a physics Nobel laureate, wrote a soporific, then baffling, article in Quanta magazine: “Entanglement Made Simple”. Yes, all too simple: it sweeps the difficulties under the rug. After a thorough description of classical entanglement, we are swiftly told at the end that classical entanglement supports the Many Worlds Interpretation of Quantum Mechanics. However, classical entanglement (from various conservation laws) has been known since the seventeenth century.

Skeptical founders of Quantum physics (such as Einstein, De Broglie, Schrödinger, Bohm, Bell) knew classical entanglement very well. David Bohm co-discovered the Aharonov–Bohm effect, which demonstrated the importance of the (nonlocal) potential; John Bell found his inequality, which demonstrated, with the help of experiments (Alain Aspect, etc.), that Quantum physics is nonlocal.

Differently From Classical Entanglement, Which Acts As One, Quantum Entanglement Acts At A Distance: It Interferes With Measurement, At A Distance


The point about the cats is that everybody, even maniacs, ought to know that cats are either dead or alive. Quantum mechanics make the point that they can compute things about cats, from their point of view. OK.

Quantum mechanics, in their busy shops, compute with dead and live cats as possible outcomes. No problem. But then does that mean there is a universe, a “world“, with a dead cat, happening, and then one with a live cat, also happening simultaneously?

Any serious philosopher, somebody endowed with common sense, the nemesis of a Quantum mechanic, will say no: in a philosopher’s opinion, a cat is either dead, or alive. To be, or not to be. Not to be, and not to be.

A Quantum mechanic can compute with dead and live cats, but that does not mean she creates worlds, by simply rearranging her computation, this way, or that. Her various dead and live cats arrangements just mean she has partial knowledge of what she computes with, and that Quantum measurements, even from an excellent mechanic, are just partial, mechanic-dependent measurements.

For example, if one measures spin, one needs to orient a machine (a Stern Gerlach device). That’s just a magnetic field going one way, like a big arrow, a big direction. Thus one measures spin in one direction, not another.

What’s more surprising is that, later on, thanks to a nonlocal entanglement, one may be able to determine that, at this point in time, the particle had a spin that could be measured, from far away, in another direction. So far, so good: this is like classical mechanics.
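A hedged numerical sketch of that direction-dependence (Python; this is just the standard Born rule for spin-½, nothing specific to entanglement yet): prepare a spin “up” along one axis, measure along an axis tilted by θ, and the “+” outcome has probability cos²(θ/2).

```python
import math

def p_plus(theta):
    # Born rule for spin-1/2: probability of the "+" outcome along
    # an axis tilted by theta from the preparation axis
    return math.cos(theta / 2) ** 2

print(p_plus(0.0))           # 1.0: same direction, certain outcome
print(p_plus(math.pi / 2))   # 0.5: orthogonal direction, a coin toss
print(p_plus(math.pi))       # ~0.0: opposite direction
```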

However, whether or not that measurement at a distance has occurred, roughly simultaneously and way outside the causality light cone, AFFECTS the first measurement.

This is what the famous Bell Inequality means.

And this is what the problem with Quantum Entanglement is. Quantum Entanglement implies that wilful action somewhere disturbs a measurement beyond the reach of the five known forces. It brings all sorts of questions of a philosophical nature, and makes them into burning physical subjects. For example, does the experimenter at a distance have real free will?

Calling the world otherworldly, or many worldly, does not really help to understand what is going on. Einstein’s “Spooky Interaction At A Distance” seems a more faithful, honest rendition of reality than supposing that each and any Quantum mechanic in her shop, creates worlds, willy-nilly, each time it strikes her fancy to press a button.

What Mr. Wilczek did is what manyworldists and multiversists always do: they jump into their derangement (cats alive AND dead) after saying there is no problem. Details are never revealed.

Here is, in extenso, the fully confusing and unsupported conclusion of Mr. Wilczek:

“Everyday language is ill suited to describe quantum complementarity, in part because everyday experience does not encounter it. Practical cats interact with surrounding air molecules, among other things, in very different ways depending on whether they are alive or dead, so in practice the measurement gets made automatically, and the cat gets on with its life (or death). But entangled histories describe q-ons that are, in a real sense, Schrödinger kittens. Their full description requires, at intermediate times, that we take both of two contradictory property-trajectories into account.

The controlled experimental realization of entangled histories is delicate because it requires we gather partial information about our q-on. Conventional quantum measurements generally gather complete information at one time — for example, they determine a definite shape, or a definite color — rather than partial information spanning several times. But it can be done — indeed, without great technical difficulty. In this way we can give definite mathematical and experimental meaning to the proliferation of “many worlds” in quantum theory, and demonstrate its substantiality.”

Sounds impressive, but the reasons given are either well-known, or else they rest on a sleight of hand.

Explicitly: “take both of two contradictory property-trajectories into account”: just read Feynman’s QED, first chapter. Feynman invented the ‘sum over histories’, and Wilczek is his parrot; but Feynman did not become crazy from his ‘sum over histories’: Richard smirked when his picturesque evocation was taken literally, decades later…

And now the sleight of hand: …”rather than [gather] partial information spanning several times. But it can be done — indeed, without great technical difficulty.” This is nothing new: it is the essence of the double slit discovered by that Medical Doctor and polymath, Young, around 1800 CE: when one runs lots of ‘particles’ through it, one sees the (wave) patterns. This is what Wilczek means by “partial information“. Guess what? We knew that already.

Believing that one can be while not being, and putting that at the foundation of physics, is a new low in thinking. And it impacts the general mood, making it more favorable towards unreason.

If anything can be, without being, if anything not happening here is happening somewhere else, then is not anything permitted? Dostoyevsky had a Russian aristocrat suggest that, if god did not exist, anything was permitted. And, come to think of it, the argument was at the core of Christianism. Or, more exactly, of the Christian reign of terror which started in the period 363 CE to 381 CE, from the reign of emperor Jovian to the reign of emperor Theodosius. To prevent everything from being permitted, a god had to enforce the law.

What we have now is way worse: the new nihilists (Wilczek and his fellow manyworldists) do not just say that everything is permitted. They say: it does not matter if everything is permitted, or not. It is happening, anyway. Somewhere.

Thus Many-Worlds physics endangers not just the foundations of reason, but the very justification for morality: namely, that what is undesirable should be avoided. Even the Nazis agreed with that principle. Many-Worlds physics says it does not matter, because it is happening anyway. Somewhere, out there.

So what is going on, here, at the level of moods? Well, professor Wilczek teaches at Harvard. Harvard professors advised president Yeltsin of Russia to set up a plutocracy. It ruined Russia. The same professors made a fortune from it, while others were advising president Clinton to do the same; meanwhile Prime Minister Balladur in France was mightily impressed, and followed this new enlightenment by the Dark Side, as did British leaders, and many others. All these societies were ruined in turn. Harvard was the principal spirit behind the rise of plutocracy, and the engine propelling that rise was the principle that morality did not matter. Because, because, well, Many-Worlds!

How does one go from the foundations of physics to the foundations of plutocracy? Faculty members in the richest, most powerful universities meet in mutual admiration societies known as “faculty clubs”, and in lots of other I-scratch-your-back, you-scratch-mine social occasions they spend much of their time indulging in. So they influence each other, at the very least through the atmospheres of moods they create, and then breathe together.

Remember? It is not that everything is permitted: it’s happening anyway, so we may as well profit from it first. Many-Worlds physics feeds a mood favorable to many plutocrats, and that’s all there is to it. (But that, of course, is a lot, all too much.)

Patrice Ayme’

The Quantum Puzzle

April 26, 2016


Is Quantum Computing Beyond Physics?

More exactly, do we know, can we know, enough physics for (full) quantum computing?

I have long suggested that the answer to this question was negative, and smirked at physicists sitting billions of universes on a pinhead, as if they had nothing better to do, the children they are. (Just as with their Christian predecessors in the Middle Ages, their motives are not pure.)

Now an article in the May 2016 Notices of the American Mathematical Society repeats some of the arguments I had in mind: “The Quantum Computer Puzzle”. Here are some of the arguments. One often hears that Quantum Computers are a done deal. Here is the explanation from Justin Trudeau, Canada’s Prime Minister, which reflects perfectly the official scientific conventional wisdom on the subject:

(One wishes all our great leaders would be as knowledgeable… And I am not joking as I write this! Trudeau did engineering and ecological studies.)

... Supposing, Of Course, That One Can Isolate And Manipulate Qubits As One Does Normal Bits...


Before some object that physicists are better qualified than mathematicians to talk about the Quantum, let me point towards someone who is perhaps the most qualified experimentalist in the world on the foundations of Quantum Physics. Serge Haroche is a French physicist who got the Nobel Prize for figuring out how to count photons without seeing them. It’s the most delicate Quantum Non-Demolition (QND) method I have heard of; it involved making the world’s most perfect mirrors. The punch line? Serge Haroche does not believe Quantum Computers are feasible. However, Haroche does not say how he reached that conclusion. The article in the AMS does make plenty of suggestions to that effect.

Let me hasten to add that some form of Quantum Computing (or Quantum Simulation), called “annealing”, is obviously feasible. D-Wave, a Canadian company, is selling such devices. In my view, Quantum Annealing is just the two slit experiment writ large. Thus the counter-argument can be made that conventional computers can simulate annealing (and that has been the argument against D-Wave’s machines).

Full Quantum Computing (also called  “Quantum Supremacy”) would be something completely different. Gil Kalai, a famous mathematician, and a specialist of Quantum Computing, is skeptical:

“Quantum computers are hypothetical devices, based on quantum physics, which would enable us to perform certain computations hundreds of orders of magnitude faster than digital computers. This feature is coined “quantum supremacy”, and one aspect or another of such quantum computational supremacy might be seen by experiments in the near future: by implementing quantum error-correction or by systems of noninteracting bosons or by exotic new phases of matter called anyons or by quantum annealing, or in various other ways…

A main reason for concern regarding the feasibility of quantum computers is that quantum systems are inherently noisy. We will describe an optimistic hypothesis regarding quantum noise that will allow quantum computing and a pessimistic hypothesis that won’t.”

Gil Kalai rolls out a couple of theorems which suggest that Quantum Computing is very sensitive to noise (those theorems are similar to finding out which slit a photon went through). Moreover, he uses a philosophical argument against Quantum Computing:

It is often claimed that quantum computers can perform certain computations that even a classical computer of the size of the entire universe cannot perform! Indeed it is useful to examine not only things that were previously impossible and that are now made possible by a new technology but also the improvement in terms of orders of magnitude for tasks that could have been achieved by the old technology.

Quantum computers represent enormous, unprecedented order-of-magnitude improvement of controlled physical phenomena as well as of algorithms. Nuclear weapons represent an improvement of 6–7 orders of magnitude over conventional ordnance: the first atomic bomb was a million times stronger than the most powerful (single) conventional bomb at the time. The telegraph could deliver a transatlantic message in a few seconds compared to the previous three-month period. This represents an (immense) improvement of 4–5 orders of magnitude. Memory and speed of computers were improved by 10–12 orders of magnitude over several decades. Breakthrough algorithms at the time of their discovery also represented practical improvements of no more than a few orders of magnitude. Yet implementing Boson Sampling with a hundred bosons represents more than a hundred orders of magnitude improvement compared to digital computers.

In other words, it is unrealistic to expect such a, well, quantum jump…

“Boson Sampling” is the simplest way proposed, so far hypothetical, to implement a Quantum Computer. (It is neither known whether it could be built, nor whether it would be good enough for Quantum Computing; yet it is intensely studied nevertheless.)


Quantum Physics Is The Non-Local Engine Of Space, and Time Itself:

Here is Gil Kalai again:

“Locality, Space and Time

The decision between the optimistic and pessimistic hypotheses is, to a large extent, a question about modeling locality in quantum physics. Modeling natural quantum evolutions by quantum computers represents the important physical principle of “locality”: quantum interactions are limited to a few particles. The quantum circuit model enforces local rules on quantum evolutions and still allows the creation of very nonlocal quantum states.

This remains true for noisy quantum circuits under the optimistic hypothesis. The pessimistic hypothesis suggests that quantum supremacy is an artifact of incorrect modeling of locality. We expect modeling based on the pessimistic hypothesis, which relates the laws of the “noise” to the laws of the “signal”, to imply a strong form of locality for both. We can even propose that spacetime itself emerges from the absence of quantum fault tolerance. It is a familiar idea that since (noiseless) quantum systems are time reversible, time emerges from quantum noise (decoherence). However, also in the presence of noise, with quantum fault tolerance, every quantum evolution that can experimentally be created can be time-reversed, and, in fact, we can time-permute the sequence of unitary operators describing the evolution in an arbitrary way. It is therefore both quantum noise and the absence of quantum fault tolerance that enable an arrow of time.”

Just for future reference, let’s “note that with quantum computers one can emulate a quantum evolution on an arbitrary geometry. For example, a complicated quantum evolution representing the dynamics of a four-dimensional lattice model could be emulated on a one-dimensional chain of qubits.

This would be vastly different from today’s experimental quantum physics, and it is also in tension with insights from physics, where witnessing different geometries supporting the same physics is rare and important. Since a universal quantum computer allows the breaking of the connection between physics and geometry, it is noise and the absence of quantum fault tolerance that distinguish physical processes based on different geometries and enable geometry to emerge from the physics.”
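Kalai's remark that noiseless quantum evolutions are time reversible can be illustrated in a few lines (Python with NumPy; the random unitary is my toy example, not Kalai's):

```python
import numpy as np

rng = np.random.default_rng(0)
# Build a random unitary via QR decomposition: a noiseless quantum evolution
M = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
U, _ = np.linalg.qr(M)

psi = np.zeros(4, dtype=complex)
psi[0] = 1.0                       # some initial quantum state
evolved = U @ psi                  # run the evolution forward
recovered = U.conj().T @ evolved   # run it backward: U†U = 1
print(np.allclose(recovered, psi))  # True: no arrow of time without noise
```

Add noise (a projection, a measurement) and the inverse no longer exists; in Kalai's telling, that is where the arrow of time comes from.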


I have proposed a theory which explains the preceding features, including the emergence of space. Let’s call it Sub Quantum Physics (SQP). The theory breaks a lot of sacred cows. Besides, it brings an obvious explanation for Dark Matter. If I am correct, the Dark Matter puzzle is directly tied in with the Quantum Puzzle.

In any case, it is a delight to see in print part of what I have been severely criticized for saying for all too many decades… The gist of it all is that present day physics would be completely incomplete.

Patrice Ayme’

Is “Spacetime” Important?

November 3, 2015

Revolutions spawn from, and contribute to, the revolutionary mood. It is no coincidence that many revolutionary ideas in science: Chemistry (Lavoisier), Biological Evolution (Lamarck), Lagrangians, Black Holes, Fourier Analysis, Thermodynamics (Carnot), Wave Optics (Young, Poisson), Ampère’s Electrodynamics, spawned at roughly the same time and place, around the French Revolution.

In the Encyclopédie, under the term “dimension”, Jean le Rond d’Alembert speculated that time might be considered a fourth dimension… if the idea was not too novel. Joseph-Louis Lagrange, in his Theory of Analytic Functions (1797), wrote: “One may view mechanics as a geometry of four dimensions…” The idea of spacetime is to view reality as a four dimensional manifold, something measured by the “Real Line” going in four directions.

There is, it turns out, a huge problem with this: R, the real line, has what is called a separated topology: distinct points have disjoint neighborhoods. However, the QUANTUM world is not like that, not at all. Countless experiments, and the most basic logic, show this:

Reality Does Not Care About Speed, & The Relativity It Brings


Manifolds were defined by Bernhard Riemann, in his 1854 habilitation lecture (he died in 1866, still young, of tuberculosis). A manifold is made of chunks (technically: neighborhoods), each of them diffeomorphic to a neighborhood in R^n (thus a deformed piece of R^n; see the tech annex).

Einstein admitted that there was a huge problem with the “now” in physics (even if one confines oneself to his own set-ups in Relativity theories). Worse: the Quantum changes completely the problem of the “now”… Let alone the “here”.

In 1905, Henri Poincaré showed that by taking time to be an imaginary fourth spacetime coordinate (√−1 c t), a Lorentz transformation can be regarded as a rotation of coordinates in a four-dimensional Euclidean space with three real coordinates representing space, and one imaginary coordinate, representing time, as the fourth dimension.
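Poincaré's trick can be verified with one line of arithmetic: with the fourth coordinate x₄ = ict, the formally Euclidean sum of four squares reproduces the physical (Minkowski) interval. A sketch (Python; the sample coordinates are arbitrary):

```python
c = 299_792_458.0               # speed of light, m/s
x, y, z, t = 1.0, 2.0, 3.0, 1e-8
x4 = 1j * c * t                 # Poincare's imaginary fourth coordinate

euclidean = x**2 + y**2 + z**2 + x4**2       # formally a Euclidean norm
minkowski = x**2 + y**2 + z**2 - (c * t)**2  # the physical interval
print(abs(euclidean - minkowski))   # 0.0: the two agree exactly
```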

Hermann Minkowski, Einstein’s mathematics professor in Zurich, concluded in 1908: “The views of space and time which I wish to lay before you have sprung from the soil of experimental physics, and therein lies their strength. They are radical. Henceforth space by itself, and time by itself, are doomed to fade away into mere shadows, and only a kind of union of the two will preserve an independent reality.”

This remark rests on Lorentz’s work: how to go from coordinates (x, t) to (x’, t’). In the simplest case (one space dimension, relative frame velocity v):

x’ = (x – vt)/√(1 – v²/c²)

t’ = (t – vx/c²)/√(1 – v²/c²)

Here c is the speed of light. Lorentz found one needed such transformations to respect electrodynamics. If v/c is set to zero (as it is if one supposes the speed v negligible relative to c, i.e. the speed of light effectively infinite), one gets:

t = t’

x’ = x – vt

The first equation exhibits universal time: time does not depend upon the frame of reference. But notice that the second equation mixes space and time already. Thus, philosophically speaking, proclaiming “spacetime” could have been done before. Now, in so-called “General Relativity”, there are problems with “time-like” geodesics (but they would surface long after Minkowski’s death).
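The mixing of space and time in those equations is easy to check numerically. A small sketch (Python; the sample numbers are arbitrary) applies the full Lorentz transformation and confirms that, at everyday speeds, universal time t = t’ and the Galilean x’ = x – vt are recovered to overwhelming accuracy:

```python
import math

C = 299_792_458.0   # speed of light, m/s

def lorentz(x, t, v):
    # Full Lorentz transformation from (x, t) to (x', t')
    gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)
    return gamma * (x - v * t), gamma * (t - v * x / C**2)

x, t, v = 1000.0, 2.0, 30.0   # 30 m/s: a car, not a light ray
xp, tp = lorentz(x, t, v)
print(abs(xp - (x - v * t)))  # ~0: Galilean space transformation recovered
print(abs(tp - t))            # ~0: universal time recovered
```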

Another problem with conceptually equating time and space is that time is not space: space dimensions have a plus sign, time a minus sign (something Quantum Field Theory often ignores, by putting pluses everywhere in computations).

In any case, I hope this makes clear that, philosophically, just looking at the equations, “spacetime” does not have to be an important concept.

And Quantum Physics seems to say that it is not: the QUANTUM INTERACTION (QI; my neologism) is (apparently, so far) INSTANTANEOUS (like old fashion time).

As we saw previously (“Can Space Be Faster Than Light“), the top cosmologists are arguing whether the speed of space can be viewed as faster than light. Call that the Cosmic Inflation Interaction (CII; it has its own hypothesized exchange particle, the “Inflaton”). We see that c, the speed of light, is less than CII, which may, or may not, be related to QI (standard Quantum Physics implicitly assumes that the speed of the Quantum Interaction QI is infinite).

One thing is sure: we are very far from a TOE, the “Theory Of Everything”, which physicists, anxious to appear as the world’s smartest organisms, with all the power and wealth to go with it, have touted for decades.

Patrice Ayme’

Tech Annex: R is the real line; RxR = R^2, the plane; RxRxR = R^3, the usual three dimensional space, etc. Spacetime was initially viewed as just RxRxRxR = R^4. What does diffeomorphic mean? It means a copy which can be shrunk or dilated somewhat, in all imaginable ways, but without breaks, and so that all points can be tracked (a diffeomorphism does this, and so do all its derivatives).


September 11, 2015

Feynman: “It is safe to say that no one understands Quantum Mechanics.”

Einstein: “Insanity is doing the same thing over and over and expecting different results.”

Nature: “That’s how the world works.”

Wilczek (Physics Nobel Prize): “Naïveté is doing the same thing over and over, and always expecting the same result.”

Parmenides, the ancient Greek philosopher, theorized that reality is unchanging and indivisible and that movement is an illusion. Zeno, a student of Parmenides, devised four famous paradoxes to illustrate the logical difficulties in the very concept of motion. Zeno’s arrow paradox starts and ends this way:

  • If you know where an arrow is, you know everything about its physical state….
  • The arrow does not move…

Classical Mechanics found the first point to be erroneous. To know the state of a particle, one must know not only its position X, but also its velocity and mass (together, its momentum P). Something similar happens with Quantum Physics. To know the state of a particle, we need to know whether the state of what it interacted with before… exists, or not. According to old-fashioned metaphysics, that’s beyond weird. It’s simply incomprehensible.

The EPR Interaction: Sein Und Zeit. For Real.


[The Nazi philosopher Heidegger, an ex would-be priest, wrote a famous book “Being And Time“. However, rather than a fascist fantasy, the EPR is exactly about that level of depth: how existence and time come to be! And how those interact with our will…]

With that information, X and P, position and momentum, for each particle, classical mechanics predicts a set of particles’ future evolution completely. (Formally, the dynamical evolution satisfies a second order differential equation. That was thoroughly checked by thousands of officers of gunnery, worldwide, over the last five centuries.)
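A hedged illustration of that classical determinism (Python; a falling mass, integrated step by step): give the same position X and momentum P twice, and the predicted future is identical, down to the last bit.

```python
def evolve(x, p, m=1.0, g=9.8, dt=1e-3, steps=1000):
    # Classical mechanics: knowing position x and momentum p at one
    # instant fixes the whole future (here, free fall under gravity)
    for _ in range(steps):
        x += (p / m) * dt   # dx/dt = p/m
        p += -m * g * dt    # dp/dt = force
    return x, p

# Two runs from the same initial state: identical futures, Einstein Sanity
print(evolve(0.0, 5.0) == evolve(0.0, 5.0))   # True
```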

Highly predictive, classical mechanics is the model of Einstein Sanity.

Aristotle had ignored the notion of momentum, P. For Aristotle, one needed a force to maintain motion (an objective proof of Aristotle’s stupidity; no wonder Aristotle supported, and instigated, fascist dictatorship as the best system of governance). Around 1320 CE, the Parisian genius Buridan declared that Aristotle was completely wrong and introduced momentum P, calling it “IMPETUS”.

May we be in a similar situation? Just as the Ancient Greeks ignored P, is Quantum Wave Mechanics incomplete, from an inadequate concept of what a complete description of the world is?

Einstein thought so, and demonstrated it to his satisfaction in his EPR Thought Experiment. The EPR paper basically observed that, according to the Quantum Axiomatics, two particles, after they have interacted, still form JUST ONE WAVE. Einstein claimed that there had to exist hidden “elements of reality”, not yet identified in the (Copenhagen Interpretation of) quantum theory. Those heretofore hidden “elements of reality” would re-establish Einstein Sanity, Einstein feverishly hoped.

According to Einstein, following his friend Prince Louis De Broglie (on whom he had conferred the Doctorate), and maybe the philosopher Karl Popper (with whom he had corresponded earlier about non-locality), Quantum Mechanics only appears random. That randomness is only because of our ignorance of those “hidden variables”. Einstein’s demonstration rested on the impossibility of what he labelled “spooky action at a distance”.

That was an idea too far. The “spooky action at a distance” has been (amply) demonstrated in the meantime. Decades of experimental tests, including a “loophole-free” test published on the scientific preprint site last month, show that the world is like that: completely non-local everywhere.

In 1964, the physicist John Bell, CERN’s theory chief, working with David Bohm’s version of Einstein’s EPR thought experiment, identified an inequality obeyed by any physical theory that is both local — meaning that interactions don’t travel faster than light — and where the physical properties usually attributed to “particles” exist prior to “measurement.”

(As an interesting aside, Richard Feynman tried to steal Bell’s result, at a time when Bell was not famous, at least in the USA: a nice example of “French Theory” at work! And I love Feynman…)

Einstein’s hidden “elements of reality” probably exist, but they are NON-LOCAL. (Einstein was obsessed by locality; but that’s an error. All that can be said in favor of locality is that mathematics, and Field Theory, so far, are local: that’s the famous story of the drunk who looks for his keys under the lamp post, because that’s the only place he sees.)

Either some physical influences travel faster than light, or some properties don’t exist before measurement. Or both.

I believe both happen. Yes, both: reality is both faster than light, and it is pointwise fabricated by interactions (“measurement”). Because:

  1. The EPR Thought Experiment established the faster than light influence (and that was checked experimentally).
  2. But then some properties cannot exist prior to “EPR style influence”. Because, if they did, why do they have no influence whatsoever, once the EPR effect is launched?

Now visualize the “isolated” “particle”. It’s neither truly “isolated” nor truly a “particle”, as some of its properties have not come in existence yet. How to achieve this lack of existence elegantly? Through non-localization, as observed in the one-slit and two-slit experiments.

Why did I say that the “isolated” “particle” was not isolated? Because it interfered with some other “particle” before. Of course. Thus it’s EPR entangled with that prior “particle”. And when that “particle” is “measured” (namely INTERACTS with another “particle”), the so-called “isolated” “particle” gets changed, by the “spooky action at a distance”, at a speed much faster than light.

(This is no flight of fancy of mine, consecutive to some naïve misinterpretation; Zeilinger et al., in Austria, checked the effect experimentally; Aspect in Paris and Zeilinger got the Wolf Prize for their work on non-locality, so the appreciation for their art is not restricted to me!)

All these questions are extremely practical: they are at the heart of the difficulties in engineering a Quantum Computer.

Old physics is out of the window. The Quantum Computer is not here yet, because the new physics is not understood enough, yet.

Patrice Ayme’

No Multiverse, No Teleportation. Yet Quantum Consciousness?

June 27, 2015

There is a flaw, at the very root of the definition of the Multiverse:

Multiverse partisans believe anything, any physics, is possible. However, if such is the case, one of those possibilities is that there is just one universe: the Universe. But then, if the Universe exists, there is just one universe, and the Multiverse can’t be!

Logic is a terrifying thing for those who have too little…

[The preceding is actually the latest variant, thanks to yours truly, of the 25 centuries old Cretan Paradox.]

We are led by some physicists who not only have little knowledge, and little imagination, but not much logic, either! We look up to physics because we look up to intellectual, or, more precisely, logical, scientific leadership. Prominent statements about the “Multiverse” or “Teleportation”, though, go the other way.

"Teleportation" Is About States, Not Particles. Nothing Simplistic!


In one of the world’s major science museums, instruction is conducted for children between the ages of 4 and 94. Somewhere above the mastodon and triceratops fossils is a special exhibition on the science of science-fiction.

An exhibit was about “teleportation”. There I was informed that particles had been successfully “teletransported” by “scientists” already.

I was so pleased to be informed of this that I teletransported all those who believe such inanities to a mental asylum.

They make a drastic mistake: confusing “particle” and “state”.

Particles cannot be “teletransported”. To pretend otherwise is a complete affabulation. What can be “teletransported” are Quantum States.

The staff of Sciencealerts, 22 September, 2014, used the following banner: “Physicists have quantum teleported a particle of light across 25 kilometres.”

No, they did not. They teleported the state of a third photon.

This sort of confusion goes to the core of the mental retardation in which physics has spent most of the Twentieth Century. I pointed out that it originated with Einstein. Einstein made the following statement, which I view as an extreme error:

“Energy, during the propagation of a ray of light, is not continuously distributed over steadily increasing spaces, but it consists of a finite number of energy quanta LOCALIZED AT POINTS IN SPACE, MOVING WITHOUT DIVIDING and capable of being absorbed or generated only as entities.”

That opinion of Einstein above, that “the propagation of a ray of light… consists of a finite number of energy quanta LOCALIZED AT POINTS IN SPACE, MOVING WITHOUT DIVIDING”, is complete affabulation, a fantasy. Yes, I know, Einstein got the Nobel Prize in Physics for it, and, thus, by saying this, I do not just grab Einstein by the horns, but the entire physics establishment. As Martin Luther would say, though, I see no other way…

I affabulate, and fantasize too, most often. However, when I do, while searching for truth, I try to respect known, well-established facts. In 1905, Einstein could imagine things about photons the way he did. Why not? It was natural: from Lucretius to Newton, most thinkers believed in particles. Particles were supposed to be the ultimate atoms of matter (atom means, in Greek, what cannot be divided).

However, since then, facts have intervened. The “particle” hypothesis became untenable. Indeed, the particle effect, how the Quantum shows up, is only how the energy of fundamental processes is released. In complete contrast, how the fundamental process proceeds is all about waves.

Einstein himself, after talking extensively about this with the (physicist and) philosopher Karl Popper, came to write the “EPR” paper, describing what is now called TELEPORTATION.

Einstein called this teleportation of states a “spooky action at a distance”. In truth, it is an obvious consequence of the fact that fundamental processes are computed with waves, and waves are, by definition, NON-LOCAL.


Quantum Computing: What’s the Difference, And How Conscious Is It?

Present computing is similar to computing with water canals, one primitive manipulation at a time. Quantum Computing will be about computing with the interference that waves bring.
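The difference can be seen in the smallest possible example: two Hadamard gates in a row. One Hadamard turns a definite bit into a 50/50 superposition; a second one makes the |1⟩ amplitudes cancel (destructive interference) and the |0⟩ amplitudes add (constructive interference), restoring certainty. A classical coin flipped twice stays random; the wave computes. A minimal sketch:

```python
import numpy as np

# Hadamard gate: sends |0> to (|0> + |1>)/sqrt(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
zero = np.array([1.0, 0.0])  # the definite state |0>

# One Hadamard: measurement probabilities are 50/50
superposed = H @ zero
print(np.abs(superposed) ** 2)  # [0.5 0.5]

# A second Hadamard: the |1> amplitudes cancel, the |0> amplitudes reinforce.
# The result is |0> with certainty, something no classical coin flip can do.
back = H @ superposed
print(np.round(np.abs(back) ** 2))  # [1. 0.]
```

Quantum algorithms are, at bottom, choreographies of such cancellations, arranged so wrong answers interfere away.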


And there a quandary presents itself: Quantum behavior has much in common with the attributes of consciousness. Thus a full Quantum Computer may well behave unpredictably, and as if it had consciousness, but also, truly, be conscious. We would be facing not just Artificial Intelligence, but Artificial Consciousness.

Skynet may not just acquire control, but be sentient…

This, I do believe, is a real “danger”. Working on the Quantum Computer is working on Artificial Consciousness. However, the proximal danger is that an aura of contagious stupidity has infected what passes for political leadership. To wit, European “leaders”, leading into the abyss, because the Greek leader decided to submit the latest austerity measures to a referendum of the Greek People.

Does not the Greek Prime Minister know that the People does not rule? Demo-cracy = Demos Kratos, People Power. Not what we have. How come the Prime Minister of Greece does not know the basics of the corrupto-world we live in? Democracy is just a word polite people of wealth and taste use to mask plutocracy.

The Greeks want a referendum on whether they want to suffer some more? Unforgivable. So negotiations of the worthies with uppity Greece are interrupted. The EC chief, J-C Juncker, is little more than a polyglot Mafioso, having managed the tax evasion of hundreds of billions of Euros by hundreds of companies when he “led” Luxembourg. Now he can talk tough.

Insanity in physics has shown the way to insanity in politics and ethics. Inspired by the Schrodinger cat who is supposed to be both dead and alive, our great leaders thought they could get away with being all about money, and all about the people. If you don’t like this universe, go live in another.

(OK, maybe our great political leaders do not know enough physics to think this consciously; however the little critters who advise them, and write their discourses for them have themselves friends who feel they are very smart, and that physics says one can be all things to all people, at the same time. So the pernicious influence of mad physics go far, that way. And it has penetrated ethics, indeed.)

Even the Pope has noticed that supposedly refined economics, such as “cap and trade” (a European invention now used in California) were obviously inspired by the Devil. He condemned them. But, nowadays, like Schrodinger’s Cat, our great leaders imagine they can be the Devil and the Good Lord at the same time, in different places, and we will still embrace their feet religiously, our hearts frantic with unbounded admiration.

Time to cut the Gordian knot, with a very sharp sword. A sword cannot cut the universe in two (as the naïve Multiversists believe), but it can certainly cut the crap. And teletransport minds to a state closer to reality.

Patrice Ayme’

Entropy & Quantum: The Relativity of States

May 17, 2015

Entropy (usual symbol S) measures the number of specific ways in which a thermodynamic system may be arranged. It measures the number of states. It is understood as a measure of disorder.
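This counting of states can be made concrete with Boltzmann’s formula S = k log P, where P is the number of states accessible to the system. A toy sketch (the system of N two-state spins is my own illustrative assumption):

```python
import math

k = 1.380649e-23  # Boltzmann's constant, in joules per kelvin (exact SI value)

# Boltzmann: S = k log P, where P counts the states accessible to the system.
# Toy system, for illustration: N two-state spins, hence P = 2**N microstates.
N = 100
P = 2 ** N
S = k * math.log(P)

print(S)  # S = N * k * ln(2): entropy grows linearly with the number of spins
assert math.isclose(S, N * k * math.log(2))
```

The toy makes the “what is a state?” problem visible: the value of S depends entirely on how one chose to count the 2**N arrangements as distinct states in the first place.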

Another part of physics which worries about states is Quantum Physics. A Quantum Process is associated with a Quantum Space which turns out to be a Hilbert Space (a complete complex vector space with an inner product; basically the nicest, simplest high dimensional complex vector space one can conceive of). The measurement is identified with an operator (say A) in said space, which has eigenspaces and eigenvalues (Av = av; where v is a vector called an eigenvector, and a, a number, the eigenvalue).
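The eigenvalue equation Av = av can be checked numerically. A sketch with a hypothetical 2×2 Hermitian observable (the matrix entries are an arbitrary example of mine):

```python
import numpy as np

# A measurement is modeled by a Hermitian operator A on the Hilbert space.
A = np.array([[2, 1 - 1j],
              [1 + 1j, 3]])
assert np.allclose(A, A.conj().T)  # Hermitian: A equals its conjugate transpose

eigenvalues, eigenvectors = np.linalg.eigh(A)

# Each column v of `eigenvectors` satisfies A v = a v for its eigenvalue a
for a, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, a * v)

# Hermitian operators have REAL eigenvalues: the possible measurement outcomes
print(np.round(eigenvalues, 4))  # → [1. 4.]
```

The eigenvectors form the basis of “states” the experiment defines; the real eigenvalues are the only numbers a measurement can return.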

Forget Cats. In Which States Is The World Really In?

[Haroche, from the ENS lab in Paris which invented optical pumping, thus the laser, 62 years ago, and Wineland, from Boulder, got the Nobel in 2012.]

Both Entropy and the Quantum suffer from the same problem, namely: what is a state? Can a state be absolutely defined?

As it is, things have been all too relative.

This is exemplified in Quantum Physics with the Schrodinger Cat Paradox. A cat is put in a box, with an infernal Quantum mechanism that is supposed to gas it (shortly after, the Nazis did it for real… An interesting Freudian slip, that German and Austrian physicists were involved with the idea of mixtures of dead and live cats).

The question is whether mixing live and dead cat waves is a full description of the system. It obviously stretches credulity. This was the argument of Schrodinger (initiated in exchanges with Einstein).

From the point of view of the cat, inside the box, the waves, states, and chosen Quantum spaces would be quite different.

My wished-for solution?

Apply an order on Hilbert spaces, according to fullness of description, and consider only ultrafilters (in the topological sense) as genuinely representative of the best approximation of reality. Hey, nobody said we should not think big… Anyway, that’s my answer to the Multiverse and its multiversists.

Now back to entropy.

As it exists, thermodynamics is about particles. Thus, it is subordinated to the problem of states in Quantum Physics. Hence solving the Quantum Cat problem solves the problem of Entropy.

Or does it?

The deepest problem subjacent to Quantum Physics is whether some sort of thermodynamics could be, and thus should be, applied to the isolated particle (I believe it could, and should).

The Haroche and Wineland methods, above, are a step in the right direction, namely measuring what the real states, the ultimate elements of reality of the world, are.

So is Entropy useless? Is it physics? Yes, it is physics, just like computer science is science. Both are emergent aspects of the world. Not as fundamental as a future sub-Quantum Physics, but all the fundamentalism, and no more, that we need, much of the time.

Patrice Ayme’