Archive for the ‘Foundations Of Physics’ Category

Free Will Destroys The Holographic Principle

February 12, 2017

Abstract: Many famous physicists promote (themselves and) the “Holographic Universe” (aka the “Holographic Principle”). I show that the Holographic Universe is incompatible with the notion of Free Will.

***

When studying Advanced Calculus, one discovers situations where the information on the boundary of a region enables one to reconstitute the information inside it. From my mathematical philosophy point of view, this phenomenon is a generalization of the Fundamental Theorem of Calculus, which says that the sum of the infinitesimals df is equal to the value of the function f on the boundary.
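The phrase "the sum of the infinitesimals df equals the value of f on the boundary" can be made concrete numerically: summing the small increments of f across an interval telescopes down to f(b) − f(a), the value of f on the boundary of [a, b]. A minimal sketch in Python (the cubic test function is an arbitrary illustrative choice):

```python
def sum_of_increments(f, a, b, n=100_000):
    """Sum the increments df = f(x + dx) - f(x) across [a, b].
    Interior terms cancel (telescoping), leaving f(b) - f(a):
    the 'sum of infinitesimals' equals f on the boundary."""
    dx = (b - a) / n
    return sum(f(a + (i + 1) * dx) - f(a + i * dx) for i in range(n))

print(sum_of_increments(lambda x: x**3, 0.0, 2.0))  # ≈ 2**3 - 0**3 = 8
```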

The Fundamental Theorem of Calculus was discovered by the French lawyer and parliamentarian Fermat, usually known rather for proposing a theorem in Number Theory which took nearly 400 years to be proven! Fermat actually invented calculus, a bigger fish he landed while Leibniz's and Newton's parents were in diapers.

As Wikipedia puts it, inserting a bit of francophobic fake news for good measure: “Fermat was the first person known to have evaluated the integral of general power functions. With his method, he was able to reduce this evaluation to the sum of geometric series.[10] The resulting formula was helpful to Newton, and then Leibniz, when they independently developed the fundamental theorem of calculus.” (Independently of each other, but not of Fermat; Fermat published his discovery in 1629. Newton and Leibniz were born in 1642 and 1646…)

Holography is a fascinating technology.  

Basic Setup To Make A Hologram. Once the Object, The Green Star, Has Fallen Inside A Black Hole, It’s Clearly Impossible To Make A Hologram of the Situation, If Free Will Reigns Inside the Green Star.

The objection is similar to one made in Relativity with light: if one travelled at the speed of light (supposing one could) and looked into a mirror, the light to be reflected could never catch up with the mirror. Hence, once moving at the speed of light, one could not look at oneself in a mirror. Einstein claimed he got this idea when he was 16 years old (cute, but by then others had long figured out the part of Relativity pertaining to that situation…).

My further objection below is going to be a bit more subtle.

***

Here Is The Holographic Principle As Described In Wikipedia:

The holographic principle is a principle of string theories and a supposed property of quantum gravity that states that the description of a volume of space can be thought of as encoded on a lower-dimensional boundary to the region—preferably a light-like boundary like a gravitational horizon. First proposed by Gerard ‘t Hooft, it was given a precise string-theory interpretation by Leonard Susskind[1] who combined his ideas with previous ones of ‘t Hooft and Charles Thorn.[1][2] As pointed out by Raphael Bousso,[3] Thorn observed in 1978 that string theory admits a lower-dimensional description in which gravity emerges from it in what would now be called a holographic way.

In a larger sense, the theory suggests that the entire universe can be seen as two-dimensional information on the cosmological horizon, the event horizon from which information may still be gathered and not lost due to the natural limitations of spacetime supporting a black hole, an observer and a given setting of these specific elements,[clarification needed] such that the three dimensions we observe are an effective description only at macroscopic scales and at low energies. Cosmological holography has not been made mathematically precise, partly because the particle horizon has a non-zero area and grows with time.[4][5]

The holographic principle was inspired by black hole thermodynamics, which conjectures that the maximal entropy in any region scales with the radius squared, and not cubed as might be expected. In the case of a black hole, the insight was that the informational content of all the objects that have fallen into the hole might be entirely contained in surface fluctuations of the event horizon.
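The area scaling quoted here is quantitative: the Bekenstein-Hawking entropy is S = k c³A/(4Għ), with A = 4πr_s² and r_s = 2GM/c², so the entropy grows as the radius squared, not cubed. A quick check in Python (standard constants, with the solar mass as an illustrative example):

```python
import math

G = 6.674e-11     # gravitational constant (m^3 kg^-1 s^-2)
C = 2.998e8       # speed of light (m/s)
HBAR = 1.055e-34  # reduced Planck constant (J s)
KB = 1.381e-23    # Boltzmann constant (J/K)

def bh_entropy(M):
    """Bekenstein-Hawking entropy (J/K) of a black hole of mass M (kg):
    proportional to the horizon AREA, not to the enclosed volume."""
    r_s = 2 * G * M / C**2          # Schwarzschild radius
    area = 4 * math.pi * r_s**2     # horizon area, scales as M**2
    return KB * C**3 * area / (4 * G * HBAR)

M_SUN = 1.989e30
print(bh_entropy(M_SUN))                          # about 1.5e54 J/K
print(bh_entropy(2 * M_SUN) / bh_entropy(M_SUN))  # doubling mass quadruples S
```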

***

The Superficiality Principle Rules:

I have long suspected that physicists and mathematicians are taken by the beauty of the simplification of knowing the inside from the outside. It is a sort of beauty-contest, fashion-model way of looking at the world. It fails miserably with Black Holes.

To figure this out, one needs to know one thing about Black Holes, and another in philosophy of mind.

***

FREE WILL DEMOLITION OF THE HOLOGRAPHIC PRINCIPLE:

My reasoning is simple:

  1. Consider a Black Hole so large that a human being can fall into it without being shredded by tidal effects. A few lines of high school computation show that a Milky Way sized volume with the density of air on Earth is a Black Hole: light falling into it cannot come back. (Newton could have made the computation; Laplace did it.)
  2. So here we have this Human (call her H), falling in the Milky Way Air Black Hole (MWAB).
  3. Once past the boundary of the Black Hole, Human H cannot be communicated with from outside the boundary (at least according to known physics).
  4. What the Holographic proponents claim is that they can know what is inside the MWAB.
  5. Suppose that Human H decides to have scrambled eggs for breakfast instead of pancakes. The partisans of the Holographic Universe claim that they had that information already. However, they stand outside of the MWAB, the giant Black Hole, and cannot communicate with its interior. Nevertheless, Susskind and company claim they knew it all along.
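The "high school computation" of point 1 can be sketched: a uniform sphere of density ρ becomes a Black Hole when its radius equals its own Schwarzschild radius r_s = 2GM/c², which with M = (4/3)πρr³ gives r = c·√(3/(8πGρ)). A minimal sketch in Python (sea-level air density is used as the illustrative input; the point is only that a large enough volume of thin gas traps light):

```python
import math

G = 6.674e-11  # gravitational constant (m^3 kg^-1 s^-2)
C = 2.998e8    # speed of light (m/s)

def critical_radius(rho):
    """Radius at which a uniform sphere of density rho (kg/m^3) equals
    its own Schwarzschild radius r_s = 2*G*M/c^2, with M = (4/3)*pi*rho*r^3."""
    return C * math.sqrt(3.0 / (8.0 * math.pi * G * rho))

print(critical_radius(1.2))  # radius in meters, for Earth sea-level air density
```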

That is obviously grotesque. (Except if you believe Stanford physicists are omniscient, omnipotent gods, violating known laws of physics: that is basically what they claim.)

This is not as ridiculous as the multiverse (the most ridiculous theory ever). But it is pretty ridiculous too. (Not to say that the questions Free Will leads to in physics are all ridiculous: they are not, especially regarding Quantum Theory!)

By the way, there are other objections against the Holographic Universe, having to do with the COSMOLOGICAL Event Horizon (in contradistinction to those generated by Black Holes). Another time…

***

We Are Hypocrites, So We Live From Fake News:

Tellingly, the men promoting the Holographic Universe are Nobel Laureates, or the like. Such men tend to be very ambitious, full of Free Will, ready to say or do anything to dominate (I have met dozens in person). It is revealing that their Free Will is so great that they are ready to contradict what they are all about, just to make everybody talk about them, and promote their already colossal glories.

Patrice Ayme’

DARK MATTER PROPULSION Proposed

December 10, 2016

In Patrice's Sub-Quantum Reality (PSQR), Matter Waves are real (in the Quantum Theory Copenhagen Interpretation (QTCI), the Matter Waves are just probability waves). There has been no direct evidence that Matter Waves are real. So far. But the times they are a-changin', as the other one, who got his Nobel today, said.

Both Dark Matter and Dark Energy are consequences of PSQR. So: observing both Dark Matter and Dark Energy constitutes proof of PSQR.

The deviation of light by the Sun predicted by “General Relativity” was twice the one predicted by Newtonian Mechanics. The effect was minute, and detected only in grazing starlight, during the Solar eclipse of 29 May 1919 (by the ultra famous British astronomer and physicist Eddington). Thus, as 95% of the universe's matter-energy is Dark Matter or Dark Energy, my prediction carries more weight.
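The two predictions in question can be computed in a few lines: a Newtonian corpuscular treatment gives a grazing deflection of 2GM/(c²R), and General Relativity gives exactly twice that, about 1.75 arc seconds for the Sun. A quick check in Python, with standard solar values:

```python
import math

GM_SUN = 1.327e20  # G * M_sun (m^3/s^2)
R_SUN = 6.96e8     # solar radius (m): grazing impact parameter
C = 2.998e8        # speed of light (m/s)
ARCSEC = 180 * 3600 / math.pi  # radians -> arc seconds

newton = 2 * GM_SUN / (C**2 * R_SUN) * ARCSEC  # Newtonian-style deflection
einstein = 2 * newton                          # General Relativity: twice as much
print(newton, einstein)  # roughly 0.88 and 1.75 arc seconds
```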

PSQR also predicts “fuel-less” propulsion, in a variant of the effect which produces Dark Matter in PSQR:

Dark Matter Pushes, Patrice Ayme Says. Explaining NASA's Findings?

How does Dark Matter create propulsion? Well, that it does is evident: just look at galactic clusters (more details another day). A Matter Wave will expand until it singularizes. If it expands enough, it will become so big that it loses a (smaller) piece of itself during re-singularization. That piece is the Dark Matter.

Thus visualize this: take a cavity C, and bounce a Matter Wave around it (there is plenty of direct theoretical and experimental evidence that this can be arranged).

Make a hole H in the boundary of C (this is no different from the Black Body oven, the consideration of which led Planck to discover the Quantum).

Some Dark Matter then escapes. By the hole. 

However, Dark Matter carries energy momentum (evidence from Galaxies, Galactic Clusters, etc.).

Hence a push. A Dark Matter push.

The (spectacular) effect has been apparently observed by NASA.

Does this violate Newton’s Third Law? (As it has been alleged.)

No. I actually just used Newton's Third Law, the Action = Reaction law. So PSQR explains the observed effect in combination with the Action = Reaction Law, “proving” both.

How could we prove PSQR? There should be a decrease of energy-momentum after a while, and the decrease should equal the observed push exactly.

Patrice Ayme’

***

Warning: The preceding considerations are at the edge of plausible physics. (A tiny group of dissenting physicists is even busy making theories in which Dark Matter does not exist. The consensus is that Dark Matter exists, but is explained by a variant of the so-called “Standard Model”, using “Supersymmetry”, or “WIMPs”, or “Axions”. My own theory, PSQR, is by far the most exotic, as it throws the Quantum Theory Copenhagen Interpretation, QTCI, through the window.)

Nature Of The Physical Law & Reaction Law

December 5, 2016

Human laws are modelled, in spirit, after physical laws. So it is socially important to realize how physical laws are established, and that they are not immutable. Physical laws are established by observation (some direct, some axiomatic; yes, a paradox). However, if you read the magazine “Wired”, you may feel that physical laws are established, like the Bible or the Qur’an, by the sheer power of a personality cult:

“LAST MONTH, NASA researchers dropped news with potentially huge consequences for space travel and science as a whole: They ran an experiment whose results seem to defy the very laws of physics, and could change how we travel through outer space. Problem is, experts say that it’s incredibly unlikely that Isaac Newton is wrong. Instead, the most likely explanation is the team simply made a mistake somewhere along the way

The team was testing a theory that there’s a new way to propel satellites, instead of using rockets powered by a limited supply of fuel. So they put a radio antenna in a specially designed, sealed container. Turned on, the antenna bounced 935MHz radio waves (similar to those used by some cell phones) around, and the container apparently moved a tiny, tiny bit. This violates Newton’s third law of motion, one of the basic tenets of physics.

Loosely put, Newton taught us that no action can occur without an equal and opposite reaction.”

[WIRED from August 2014: https://www.wired.com/2014/08/why-nasas-physics-defying-space-engine-is-probably-bogus/]

Reaction = Action Is An Experimental Fact. Or Was, Until Recently. Does not have to stay that way

Right, the article is from 2014. However, the riddle got more interesting in 2016, when the same tests were conducted in hard vacuum… with the same results. (It was initially thought that the radiation heated the air, which expanded, creating a push; without air, that counter-idea failed.)
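For scale, one can compare the reported thrust to the most any cavity could push by simply leaking radiation out of one end: a perfectly collimated photon beam of power P exerts F = P/c. A minimal sketch in Python (the ~1.2 mN/kW figure often quoted for these tests is an assumption here, used only for comparison):

```python
C = 2.998e8  # speed of light (m/s)

def photon_thrust(power_watts):
    """Maximum thrust from radiating power_watts as a collimated beam: F = P/c."""
    return power_watts / C

P = 1000.0                   # 1 kW of microwaves
f_photon = photon_thrust(P)  # ~3.3 micronewtons
f_claimed = 1.2e-3           # assumed ~1.2 mN/kW figure for the reported anomaly
print(f_photon, f_claimed / f_photon)  # the claimed push is hundreds of times larger
```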

Who are these “experts”? People who gave the Nobel Prize to each other? Newton did not so much “teach” us that action = reaction as demonstrate it (thanks to arcane mathematics). Before I explain what I mean, let me mention that Richard Feynman wrote a famous book, “The Character of Physical Law” (which I read). Feynman observes that there is a hierarchy of laws. Here I will observe something even more subtle: there is a hierarchy in how fundamental laws come to be viewed as fundamental.

***

Newton ASSUMED this “Third Law”; he made an hypothesis of it (and the law was probably known to cannoneers for centuries). Using in part this action = reaction hypothesis, Newton was able to deduce theorems from a large axiomatic system, with lots of arcane mathematics. And some of these theorems had practical consequences which were found, or known, to be true (Kepler's laws). So it was reasonably assumed that Newton's Third Law was correct: it is an axiom the use of which brings the correct theorems. The same sort of reasoning established the First and Second Laws of motion, which were discovered by the stupendous genius Buridan, three centuries BEFORE Newton.

To my knowledge, the Third Law was first stated by Newton. However, that law was certainly well-known to Roman artillery engineers, who were used to catapulting large masses over enormous distances: they knew the recoil all too well. Roman and European Middle Age artillery enabled armies to seize cities (armies less competent in artillery found seizing cities difficult to do; the Turks used Hungarian engineers to breach the walls of Constantinople with giant guns).

Thus we see there are two sorts of physical laws: those we assume as axioms, and then we certify them, because the mathematical logic they give rise to bring apparently correct results. Other natural laws are observed directly.

For example, the so-called “Standard Model” can be viewed as a sort of giant law. It uses, in its axioms, the so-called Higgs boson, and that was indeed found (sort of).

Thus direct observations can suggest a law (say action = reaction, or gravitation) which is then established through the axiomatic method (heavily used in modern physics). Actually the case of gravitation is even more interesting: observations suggested an attractive force. Then Ismaël Bullialdus, a French priest-astronomer-mathematician, found a reasoning why it should be an inverse square law (Bullialdus has a crater named after him on the Moon). Armed with Bullialdus' inverse-square law, Isaac Newton used it as an axiom to “deduce” Kepler's laws. (I wrote “deduce” because, centuries later, it was called into question whether Newton had properly demonstrated Gauss' law, which reduces, gravitationally speaking, planets to massive points.)
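Gauss' law mentioned here (a spherically symmetric mass pulls, externally, exactly like a point mass at its center) is easy to verify numerically: integrate the pull of a uniform spherical shell over all its mass elements and compare with GM/d². A minimal sketch in Python, in units where G = M = 1:

```python
import math

def shell_field(R, d, n=200_000):
    """Gravitational field of a uniform shell (radius R, unit mass, G = 1)
    at distance d > R from its center, summing over rings at polar angle
    theta. Should equal the point-mass value 1/d**2."""
    dtheta = math.pi / n
    total = 0.0
    for i in range(n):
        theta = (i + 0.5) * dtheta
        dm = 0.5 * math.sin(theta) * dtheta              # mass of the ring
        s2 = R*R + d*d - 2*R*d*math.cos(theta)           # ring-to-point distance squared
        total += dm * (d - R*math.cos(theta)) / s2**1.5  # axial field component only
    return total

print(shell_field(1.0, 2.0))  # point-mass value 1/2**2 = 0.25
```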

Examples of laws observed directly are numerous: they include the classical laws of optics, of forces (depicted by vectors; but one cannot use vector theory to prove how forces behave… because vectors are abstracted forces), much of electrical behavior, etc.

Some laws were deduced from axiomatics before being demonstrated experimentally. Newton's crowning achievement was (more or less) demonstrating the equivalence of Kepler's Laws with the 1/d² inverse square universal attraction law… given the laws of “Newtonian” Mechanics.

As I said, the laws of mechanics were largely deduced by Buridan and various engineers, generations before Newton.

Could the same be going on now? Who knows?

It is a question of observation. Ultimately physics, nature, is what is observed, nothing less. It gets to be more than what is observed because of our imagination, and the fact that it needs to use the logics and maths it knows.

Meta-lesson? Politics degenerated in the West, in the last 50 years, because what was really going on was observed only in a fragmentary way. This is in particular the drama of so-called “left”, or progress. We have to stick to what is observed.

In the case of democrats, what was observed is that “Democrats” selected a candidate who was the object of 4 Congressional inquiries (Sanders had none, never had any).

Now they insult us.

Patrice Ayme’

DARK GALAXY (Explained?)

October 1, 2016

A giant galaxy made nearly entirely of Dark Matter has been discovered. Theories of Dark Matter proposed by people salaried for professing physics cannot explain (easily, if at all!) why there would be so much Dark Matter in one galaxy. I can. In my own theory, Dark Matter is not really matter, although matter gives birth to it, under some particular geometrical conditions. In my theory, in some geometrodynamic situations, a galaxy will churn out inordinate amounts of Dark Matter quickly. So I was not surprised by the find.

There are many potential theories of Dark Matter. Most are fairly conventional. They typically hypothesize new particles (some of these new particles could come from new symmetries, such as supersymmetry). I do not see how they can predict why these particular particles appear in some places, and not others. However, the importance of location, of geometry, is a crucial feature of my own theory.

I posit that the Quantum Interaction (copyright myself) does not have infinite range. Thus quantum interactions, in some conditions of low mass-energy density, leave behind part of the Quantum Wave. Such debris have mass-energy, so they exert gravitational pull, but they have little else besides (most of the characteristics of the particles they were part of concentrate somewhere else).

I Can Explain This Dark Galaxy, By Changing The Foundations Of Physics. No Less.

[From the Hawaiian Gemini telescope.]

In my own theory, one can imagine that the geometry of a galaxy is, at some point, extremely favorable to the creation of Dark Matter: it is just a question of dispersing the matter just so. The Dark Galaxy has 1% of the stars of our Milky Way, or less. In my theory, once Dark Matter has formed, it does not seem possible to make visible matter out of it again (broken Quantum Wave debris float around like a cosmic fog).

All past science started as a mix of philosophy and science-fiction (Aristarchus, Lucretius, Giordano Bruno, Immanuel Kant, Lamarck are examples). One can only surmise it will be the same in the future, and this is supported by very good logic: guessing always comes before knowing. Those who claim that science will never again be born from philosophy and fantasy are saying that really new science will never happen again. They say that all the foundations of science are already known. So they are into preaching, just like religious fanatics.

It was fashionable to say so, among physicists in the 1990s, the times of the fable known as TOE, the so-called Theory Of Everything. Shortly after this orgasm of self-satisfaction by self-appointed pontiffs, the evidence became clear that the universe’s mass-energy was mostly Dark Energy, and Dark Matter.

This is an interesting case of a meta-mood shared: also in the 1990s, clever idiots (Fukuyama, etc.) claimed history had ended: a similar claim from the same period, permeating the same mood of stunted imagination. The advantage, for those who pontificated that way? They could claim they knew everything: they had become gods, living gods.

I had known about Dark Matter all along (the problem surfaced nearly a century ago). I considered it a huge problem: it held galaxies, and galactic clusters, together. But maybe something had been overlooked. Meanwhile Main Stream Physics (MSP) dutifully, studiously, ignored it. For decades. Speaking of Dark Matter made one despicable, a conspiracy theorist.

Another thing MSP ignored was the foundations of physics. Only the most prestigious physicists, such as Richard Feynman, could afford to repeat Einstein's famous opinion that “nobody understands Quantum Mechanics”. I made it my intellectual life's main axis of reflection to try to understand what nobody wanted to understand, what nobody thought they could afford to understand: the real foundations of physics. (So doing, I was forced to reflect on why it is that people do not want to understand the most fundamental things, even while professing they do. It is particularly blatant in, say, economics.)

I long ago discovered that the real foundations of physics are entangled with those of mathematics (it is not just that physics, nature, is written with mathematics, as Galileo wrote; there is a dialogue between the mathematics that we invent and the universe that we discover: they lead to each other). For example, whether the infinity axiom is allowed in mathematics changes the physics radically (the renormalization problem of physics is solved if one removes the infinity axiom).

Right now, research at the foundations of (proper) physics is hindered by our lack of nonlinear mathematics: Quantum Mechanics, as it is, is linear (waves add up in the simplest way). However, the “collapse of the wave packet” is obviously nonlinear (this is why it is outside of existing physics, for lack of math). From that Quantum collapse, when incomplete over the great distances involved, comes Dark Matter. At least, so I propose.

Patrice Ayme’

DARK MATTER, Or How Inquiry Proceeds

September 7, 2016

How to find really new knowledge? How do you find really new science? Not by knowing the result: this is what we don’t have yet. Any really new science will not be deduced from pre-existing science. Any really new knowledge will come out of the blue. Poetical logic will help before linear logic does.

The case of Dark Matter is telling: this increasingly irritating elephant in the bathroom has been in evidence for 80 years, lumbering about. As the encumbering beast did not fit existing science, it was long religiously ignored by the faithful, as a subject not worthy of serious inquiry by very serious physicists. Now Dark Matter, five times more massive than Standard Model matter, is clearly sitting heavily outside of the Standard Model, threatening to crush it into irrelevance. Dark Matter obscures the lofty pretense of known physics to explain everything (remember the grandly named TOE, the so-called “Theory Of Everything”? That was a fraud, snake oil, because mainstream physics celebrities crowed about TOE while knowing perfectly well that Dark Matter dwarfed standard matter, and was completely outside of the Standard Model).

Physicists are presently looking for Dark Matter, knowing what they know, namely that nature has offered them a vast zoo of particles, many of them without rhyme or reason (some have rhyme, a symmetry, a mathematical group such as SU(3) acting upon them; symmetries have sometimes revealed new particles).

Bullet Cluster, 100 Million Years Old. Two Galaxies Colliding. The Dark Matter, In Blue, Is Physically Separated From the Hot, Standard Matter Gas, in Red.

[This sort of picture is most of what we presently have to guess what Dark Matter could be; the physical separation of DM and SM is most telling to me: it seems to indicate that SM and DM do not respond to the same forces, something my Quantum theory predicts; it is known that Dark Matter causes gravitational lensing, as one would expect, since it was first found by its gravitational effects, in the 1930s…]

However, remember: a truly completely new piece of science cannot be deduced from a pre-existing paradigm. Thus, if Dark Matter were really about finding a new particle type, it would be interesting, but not as interesting as if it came, after all, not from a new particle type, but from a completely new law of physics.

This is the quandary about finding truly completely new science. It can never be deduced from ruling paradigms, and may actually overthrow them. What should then be the method to use? Can Descartes and Sherlock Holmes help? The paradigm presented by Quantum Physics helps. The Quantum looks everywhere in space to find solutions: this is where its (“weird”) nonlocality comes in. Nonlocality is crucial for interference patterns and for finding lowest energy solutions, as in the chlorophyll molecule. This suggests that our minds should go nonlocal too, and we should look outside of a more extensive particle zoo to find what Dark Matter is.
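The interference patterns that the Quantum's nonlocality produces can be written down in two lines: for two ideal slits a distance d apart, the normalized screen intensity goes as cos² of the path-difference phase, πdx/(λL). A minimal sketch in Python (the slit separation, wavelength, and screen distance are arbitrary illustrative values):

```python
import math

def two_slit_intensity(x, d=1e-3, lam=500e-9, L=1.0):
    """Normalized far-field intensity for two ideal slits separated by d,
    wavelength lam, screen distance L, at screen position x."""
    phase = math.pi * d * x / (lam * L)  # half the phase difference between paths
    return math.cos(phase) ** 2

print(two_slit_intensity(0.0))      # central bright fringe
print(two_slit_intensity(250e-6))   # first dark fringe, at x = lam*L/(2*d)
```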

In general, searching for new science should be by looking everywhere, not hesitating to possibly contradict what is more traditional than well established.

An obvious possibility is, precisely, that Quantum Physics is itself incomplete, and generating Dark Matter in places where said incompleteness would be most blatant. More precisely, Quantum processes, stretched over cosmic distances, instead of being perfectly efficient and nonlocal over gigantically cosmic locales, could leave a Quantum mass-energy residue, precisely in the places where extravagant cosmic stretching of Quanta occurs (before “collapse”, aka “decoherence”).

The longer a conventional explanation (namely, a new type of particle) for Dark Matter fails to be found, the more likely my style of explanation becomes. How could one demonstrate it? Not by looking for new particles, but by conducting new and more refined experiments on the foundations of Quantum Physics.

If this guess is correct, whatever is found askew in the axioms of present Quantum Physics could actually help future Quantum Computer technology (because the latter works with Quantum foundations directly, whereas conventional high energy physics tends to eschew the wave aspects, due to the high frequencies involved).

Going on a tangent is what happens when the central, attractive force, is let go. A direct effect of freedom. Free thinking is tangential. We have to learn to produce tangential thinking.

René Descartes tried to doubt the truth of all his beliefs, to determine which beliefs he could be certain were true. However, at the end of the “Meditations” he hastily concluded that we can distinguish between dream and reality. It is not that simple. The logic found in dreams is all too similar to the logic used by full-grown individuals in society.

Proof? Back to Quantum Physics. On the face of it, the axioms of Quantum Physics have a dream-like quality (there is no “here”, nor “there”; “now” is everywhere; and, mysteriously, the experiment is Quantum, whereas the “apparatus” is “classical”). Still, most physicists, after insinuating they have figured out the universe, eschew the subject carefully. The specialists of Foundations are thoroughly confused: see Sean Carroll, http://www.preposterousuniverse.com/blog/2013/01/17/the-most-embarrassing-graph-in-modern-physics/

However unbelievable Quantum Physics is, however dream-like, physicists believe in it, and don't question it any more than cardinals would Jesus. Actually, it is this dream-like nature which, shared by all, defines the community of physicists. Cartesian doubt, pushed further than Descartes pushed it, will question not just the facts and the allegations, but the logic itself. And even the mood behind it.

Certainly, in the case of Dark Matter, some of the questions civilization has to ask should be:

  1. How sure are we of the Foundations of Quantum Physics? (Answer: very sure, all too sure!)
  2. Could it not be that Dark Matter is a cosmic-size experiment in the Foundations of Quantum Physics?

Physics, properly done, does not just question the nature of nature. Physics, properly done, questions the nature of how we find out the nature of anything. Physics, properly done, even questions the nature of why we feel the way we do. And the way we did. About anything, even poetry. In the end, indeed, even the toughest logic is a form of poetry, hanging out there, justified by its own beauty, and nothing else. Don’t underestimate moods: they call what beauty is.

Patrice Ayme’

Happy In the Sky With New Logics: Einstein’s Error II

August 6, 2016

Einstein assumed reality was localized and definite in one of his famous 1905 papers, and physics never recovered from that ridiculous, out-of-the-blue, wanton, gratuitous error. (The present essay complements the preceding one, found in the link.)

At the origin of Quantum Mechanics is Max Planck's train of thought. Max demonstrated that supposing that electromagnetic energy was EMITTED as packets of energy hf explained the two glaring problems of the physics of the day; h is a constant (since then named after Planck), f is the frequency of the light.
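The core of Planck's move can be reproduced in a few lines: the classical (Rayleigh-Jeans) spectral radiance 2f²kT/c² diverges at high frequency, while emission in packets hf tames it with the factor hf/(e^(hf/kT) − 1). A quick comparison in Python (the temperature is an arbitrary illustrative choice, roughly the Sun's surface):

```python
import math

H = 6.626e-34  # Planck constant (J s)
K = 1.381e-23  # Boltzmann constant (J/K)
C = 2.998e8    # speed of light (m/s)

def planck(f, T):
    """Planck spectral radiance: energy emitted in packets h*f."""
    return (2 * H * f**3 / C**2) / math.expm1(H * f / (K * T))

def rayleigh_jeans(f, T):
    """Classical prediction, which blows up at high frequency."""
    return 2 * f**2 * K * T / C**2

T = 5800.0
for f in (1e12, 1e15):  # low frequency: they agree; high frequency: they diverge
    print(f, planck(f, T), rayleigh_jeans(f, T))
```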

Then, five years later, came Einstein. He explained the photoelectric effect's mysterious features by reciprocating Planck's picture: light's energy was RECEIVED as packets of energy hf. Fine.

However, in so doing, Einstein claimed that light, LIGHT IN TRANSIT, was made of “LICHT QUANTEN” (quanta of light), which he described as localized. He had absolutely no proof of that. Centuries of observation stood against it. And the photoelectric effect did not necessitate this grainy feature in flight, so it did not justify it.

Thus Einstein introduced the assumption that the ultimate description of nature was that of grains of mass-energy. That was, in a way, nothing new, but the old hypothesis of the Ancient Greeks, the atomic theory. So one could call this the Greco-Einstein hypothesis. The following experiment, conducted in 1921, demonstrated Einstein was wrong. Thus the perpetrator, Walther Gerlach, did not get the Nobel, and the Nobel Committee never mentioned the importance of the experiment. Arguably, Gerlach's experiment was more important than any work of Einstein, and thus deserved punishment. The Jewish Stern, an assistant of Einstein, got the Nobel alone, in 1944, when Sweden was anxious to make friends with the winning “United Nations”:

Two Points. The Classical Prediction Is A Vertical Smear. It Is Also Einstein’s Prediction. And That Smear Is Incomprehensible In Einstein’s View Of The World.

Yet, Einstein's advocacy of nature as made of grains was obviously wrong: since the seventeenth century, it was known that there were wave effects ruling matter (diffraction, refraction, Newton's rings). That was so true that Huygens proposed that light was made of waves. Around 1800 CE, Young (the two-slit experiment) and then Fresnel and Arago (Poisson's bright dot) provided proofs of the wave nature of light. The final proof of the wave theory was Maxwell's completion and synthesis of electromagnetism, which showed that light was an electromagnetic wave (travelling always at the same speed, c).
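Maxwell's synthesis fixed the speed of the wave from two measured constants of electricity and magnetism: c = 1/√(μ₀ε₀). A one-line check in Python:

```python
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability (H/m), classical defined value
EPS0 = 8.854e-12          # vacuum permittivity (F/m)

c = 1.0 / math.sqrt(MU0 * EPS0)
print(c)  # ≈ 2.998e8 m/s, the measured speed of light
```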

Einstein’s hypothesis of light as made of grain is fundamentally incompatible with the wave theory. The wave theory was invented precisely to explain DELOCALIZATION. A grain’s definition is the exact opposite.

There is worse.

Spin was discovered as an experimental fact in the 1920s. Interestingly, it had been discovered mathematically by the French Alpine mathematician Élie Cartan before World War One, and was stumbled upon again with Dirac's invention of the eponymous equation.

The simplest case is the spin of an electron. What is it? When an electron is put in a magnetic field M, it deviates either along the direction of M (call it M!) or in the opposite direction (-M). This sounds innocuous enough, until one realizes that it is the OBSERVER who selects the direction “M” of M. Also, there are only two angles of deviation. (The Gerlach experiment was realized with silver (Ag) atoms, but the deviation was caused by a single electron therein.)

Einstein would have us believe that the electron is a grain. Call it G. Then G would itself have its own spin. A rotating charged particle G generates a magnetic field. Call it m. If Einstein were correct, as the direction of M varies, the interaction between M and the grain G's magnetic field m would vary. But that is not the case: it is as if m did not count. At all. Does not count, at all, whatsoever. It is all about M, the direction of M.
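The two-outcomes-only behavior described above is standard quantum mechanics: whatever axis M the observer picks, the electron comes out +M or -M, with probabilities set by the angle θ between the preparation axis and M, P(+M) = cos²(θ/2). A Monte-Carlo sketch in Python (the sample size and seed are arbitrary):

```python
import math
import random

def measure_spin(theta, n=100_000, seed=1):
    """Simulate measuring a spin-1/2 prepared at angle theta from the
    observer-chosen field direction M: only two outcomes, +M or -M,
    with quantum probability P(+M) = cos^2(theta / 2).
    Returns the observed fraction of +M outcomes."""
    rng = random.Random(seed)
    p_up = math.cos(theta / 2) ** 2
    ups = sum(rng.random() < p_up for _ in range(n))
    return ups / n

print(measure_spin(0.0))          # aligned with M: always +M
print(measure_spin(math.pi / 2))  # perpendicular to M: about half and half
```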

So Einstein was wrong: there is no grain G with an independent existence, an independent magnetic field m.

Bohr was right: Einstein was, obviously, wrong. That does not mean that Bohr and his followers, who proclaimed the “Copenhagen Interpretation” were right on other issues. Just like Einstein hypothesized something he did not need, so did the Copenhagists.

Backtrack above: M is determined by the observer, I said (so bleated the Copenhagen herd). However, although M can be changed by an observer, clearly an observer is NOT necessary to create a magnetic field M and its direction.

Overlooking that blatant fact, that not all magnetic fields are created by observers, is the source of Copenhagen confusion.

We saw above that correct philosophical analysis is crucial to physics. Computations are also crucial, but less so: a correct computation giving correct results can be made from false hypotheses (the paradigm here is epicycle theory: false axiomatics, the Sun did not turn around the Earth, yet, roughly correct computations produced what was observed).

Out of Quantum Theory came Quantum ElectroDynamics (QED), and, from there, Quantum Field Theory (QFT).  

QED is one of the most precise scientific theories ever. However, there is a much more precise measurement: the mass of the photon is determined to be no more than 10^(-60) kilogram (by looking at whether the electromagnetic field of Jupiter decreases as 1/d^2…).
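One can see what such a bound means: a photon of mass m would cut off static electromagnetic fields beyond the Yukawa range λ = ħ/(mc). A quick estimate (the 10^(-60) kg figure is the one quoted above; published bounds vary):

```python
import math

hbar = 1.054_571_8e-34   # J s, reduced Planck constant
c = 2.997_924_58e8       # m/s, speed of light

def yukawa_range(m_photon_kg):
    """A massive photon would cut off static fields beyond lambda = hbar / (m c)."""
    return hbar / (m_photon_kg * c)

lam = yukawa_range(1e-60)
print(f"range for m = 1e-60 kg: {lam:.2e} m = {lam / 9.461e15:.1f} light-years")
```

So a 10^(-60) kg photon would only start damping fields tens of light-years out, far beyond Jupiter’s magnetosphere: such a bound requires astrophysical-scale field measurements.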

Nevertheless, QED is also clearly the most erroneous physical theory ever (by a factor of order 10^60). Indeed, it predicts, or rather uses, the obviously false hypothesis that there is some finite energy at each point of space. Ironically enough, it is Einstein and Stern (see above) who introduced the notion of “zero point energy” (so, when Einstein later could not understand, or refused to understand, Quantum Electrodynamics, it was not because all the weirdest concepts therein were not of his own making…)

The debate on the Foundations of Quantum Physics is strong among experts, all over the map, and permeated with philosophy. Thus don’t listen to those who scoff at the idea that philosophy is the master of science: it always has been, it frantically is, and always will be. It is a question of method: the philosophical method uses anything to construct a logic. The scientific method can be used only when one knows roughly what one is talking about. Otherwise, as in Zeroth Century, or Twentieth Century, physics, one can go on imaginary wild goose chases.

From my point of view, Dark Matter itself is a consequence of the True Quantum Physics. This means that experiments could be devised to test it. The belief that some scientific theory is likely incites beholders to make experiments to test it. Absent the belief, there would be no will, hence no financing. Testing for gravitational waves was long viewed as a wild goose chase. However, the Federal government of the USA invested more than one billion dollars in the experimental field of gravitational wave detection, half a century after an early pioneer (who was made fun of). It worked, in the end, splendidly: several Black Hole (-like) events were detected, and their nature was unexpected, bringing new fundamental questions.

Some will say that all this thinking, at the edges of physics and philosophy, is irrelevant to their lives, now. Maybe they cannot understand the following. Society can either put its resources into making the rich richer, more powerful and domineering, or society can pursue higher aims, such as understanding more complex issues. If nothing else, the research involved will bring new technology which nothing else will bring (the World Wide Web was developed at CERN).

Moreover, such results change the nature not just of what we believe reality to be, but also of the logic we have developed to analyze it. Even if interest in all the rest faded away, the newly found diamonds of more sophisticated, revolutionary logics would not fade away.

Patrice Ayme’

 

Not An Infinity Of Angels On Pinheads

July 1, 2016

Thomas Aquinas and other ludicrous pseudo-philosophers (in contradistinction with real philosophers such as Abelard) used to ponder questions about angels, such as whether they can interpenetrate (as bosons do).

Are today’s mathematicians just as ridiculous? The assumption of infinity has been “proven” by the simplest reasoning ever: if n is the largest number, clearly (n+1) is larger. I have long disagreed with that hare-brained sort of certainty, and it’s not a matter of shooting the breeze. (My point of view has been spreading in recent years!) Just saying something exists does not make it so (or else one would believe Hitler and Brexiters). If I say: “I am emperor of the galaxy known as the Milky Way!”, that has a nice ring to it, but it does not make it so (too bad, that would be fun).

Given n symbols, each labelled by something, can one always find a new something to label (n+1) with? I say: no. Why? Because reality prevents it. Somebody (see below) objected that I confused “map” and “territory”. But I am a differential geometer, and the essential idea there, from the genius B. Riemann, is that maps allow one to define the “territory”:

Fundamental Idea Of Riemann: the Maps At the Bottom Are Differentiable

The reason has to do with discoveries made between 1600 and 1923. Around 1600, Kepler tried to make concrete the attraction of planets to the sun (with a 1/d law). Ismaël Boulliau (or Bullialdus) loved eclipses (a top astronomer; a crater on the Moon is named after him). But Boulliau strongly disagreed with 1/d and gave a simple, yet strong, argument that it should be 1/dd, the famous inverse square law.

Newton later (supposedly) established the equivalence between the 1/dd law and Kepler’s three laws of orbital motion, thus demonstrating the former (there is some controversy as to whether Newton fully demonstrated that he could treat planets as point-masses, what’s now known as Gauss’ law).
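That equivalence is easy to check numerically: integrating the 1/dd force gives closed ellipses obeying Kepler’s third law, T² proportional to a³. A sketch (units with GM = 1; the integrator and tolerances are my choices, nothing from Newton):

```python
import math

GM = 1.0

def acc(x, y):
    """Inverse-square acceleration toward the origin."""
    d3 = (x * x + y * y) ** 1.5
    return -GM * x / d3, -GM * y / d3

def orbit(r0, v0, dt=1e-4):
    """Integrate a 1/d^2 orbit (velocity Verlet); return (period, semi-major axis)."""
    x, y, vx, vy = r0, 0.0, 0.0, v0
    ax, ay = acc(x, y)
    t, prev_y = 0.0, 0.0
    while True:
        vx += 0.5 * dt * ax; vy += 0.5 * dt * ay
        x += dt * vx;        y += dt * vy
        ax, ay = acc(x, y)
        vx += 0.5 * dt * ax; vy += 0.5 * dt * ay
        t += dt
        if prev_y < 0.0 <= y:            # back through the start axis: one revolution
            break
        prev_y = y
    energy = 0.5 * v0 * v0 - GM / r0     # conserved specific energy
    a = -GM / (2.0 * energy)             # semi-major axis, from vis-viva
    return t, a

for r0, v0 in [(1.0, 1.0), (1.0, 1.2)]:
    T, a = orbit(r0, v0)
    print(f"r0={r0}, v0={v0}: T^2/a^3 = {T * T / a ** 3:.4f} (4 pi^2 = {4 * math.pi ** 2:.4f})")
```

Both the circular and the eccentric orbit return the same T²/a³, which is Kepler’s third law falling out of the inverse square law.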

I insist upon the 1/dd law, because we have no better (roll over Einstein…), on a small-scale.

Laplace (and the British natural philosopher John Michell) pointed out in the late 18C that this 1/dd law implied Black Holes.

In 1900, Jules Henri Poincaré demonstrated that energy had inertial mass. That’s the famous E = mcc.

So famous, it could only be attributed to a member of the superior Prussian race.

The third ingredient in the annihilation of infinity was De Broglie’s assertion that to every particle a wave should be associated. The simple fact that, in some sense, a particle is a wave (or “wave-packet”) makes the particle delocalized, thus attached to a neighborhood, not a point. At this point, points exited reality.

Moreover, the frequency of the wave is given by its momentum-energy, said De Broglie (and that was promptly demonstrated in various ways). That latter fact prevents making a particle too much of a point. Because, to have a short wave, one needs a high frequency, thus a high energy, and if that’s high enough, it becomes a Black Hole, and, even worse, a Whole Hole (gravity falls out of sight, physics implodes).
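The scale at which a wave-packet becomes its own black hole can be estimated by setting the reduced Compton wavelength ħ/(mc) equal to the Schwarzschild radius 2Gm/c². A back-of-the-envelope sketch (this lands, unsurprisingly, at the Planck scale, up to a factor of √2):

```python
import math

hbar = 1.054_571_8e-34  # J s
c = 2.997_924_58e8      # m/s
G = 6.674_30e-11        # m^3 kg^-1 s^-2

# hbar / (m c) = 2 G m / c^2   =>   m = sqrt(hbar c / (2 G))
m = math.sqrt(hbar * c / (2.0 * G))
lam = hbar / (m * c)    # the common length scale at that mass

print(f"critical mass ~ {m:.2e} kg, wavelength ~ {lam:.2e} m")
```

Below that wavelength (a few 10^-35 m), localizing a particle further costs so much energy that gravity swallows the result: the argument in the text, in numbers.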

To a variant of the preceding, in “Solution: ‘Is Infinity Real?’”, Pradeep Mutalik says:

July 1, 2016 at 12:31 pm

@Patrice Ayme: It seems that you are making the exact same conflation of “the map” and “the territory” that I’ve recommended should be avoided. There is no such thing as the largest number in our conceptual model of numbers, but there is at any given point, a limit on the number of particles in the physical universe. If tomorrow we find that each fermion consists of a million vibrating strings, we can easily accommodate the new limit because of the flexible conceptual structure provided by the infinite assumption in our mathematics.

***

I know very well the difference between “maps” and territory: all of post-Riemann mathematics rests on it. Abstract manifolds (the “territories”) are defined by “maps” Fi (such that Fi composed with the inverse of Fj is itself a differentiable map from an open set in Rx…xR to another, the number of Real lines R being the dimension). Instead of arrogantly pointing out that I have all the angles covered, I replied:

Dear Pradeep Mutalik:

Thanks for the answer. What limits the number of particles in a (small enough) neighborhood is density: if mass-energy density gets too high, according to (generally admitted) gravity theory, not even a graviton could come out (that’s even worse than having a Black Hole!)

According to Quantum Theory, to each particle is associated a wave, itself computed from, and expressing, the momentum-energy of said particle.

Each neighborhood could be of (barely more than) Planck radius. Tessellate the entire visible universe this way. If to each distinct wave one attaches an integer, it is clear that one will run out of waves, at some point, to label integers with. My view does not depend upon strings, super or not: I just incorporated the simplest model of strings.
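The resulting largest number can be estimated: tessellating the observable universe into Planck-radius cells gives only about 10^185 labels. A sketch (the comoving radius of the observable universe, roughly 4.4 × 10^26 m, is an assumed input):

```python
import math

l_planck = 1.616e-35     # m, Planck length
R_universe = 4.40e26     # m, comoving radius of the observable universe (assumption)

# Number of Planck-scale cells filling a sphere of that radius.
n_cells = (4.0 / 3.0) * math.pi * R_universe ** 3 / l_planck ** 3
print(f"Planck-radius cells in the observable universe: ~10^{math.log10(n_cells):.0f}")
```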

Another mathematician just told me: ‘Ah, but the idea of infinity is like that of God.’ Well, right. Precisely the point. Mathematics, ultimately, is abstract physics. We don’t need god in physics, as Laplace pointed out to Napoleon (“Sire, je n’ai pas besoin de cette hypothèse”). (I know well that Plato and his elite, tyrant-friendly friends and students replied to all of this that mathematical objects were not of this world, a view known as “Platonism”, generally embraced by mathematicians, especially if they are from plutocratic Harvard University… And I also know why this sort of self-serving, ludicrous opinion, similar to those of so-called “Saint” Thomas, a friend of the Inquisition, and various variants of Satanism, has been widely advocated by those who demand respect for their class of haughty persons…)

The presence of God, aka infinity, in mathematics, is not innocuous. Many mathematical brain teasers become easier, or solvable, if one assumes only a largest number (this is also how computers compute, nota bene). Assuming infinity, aka God, has diverted mathematical innovation away from the real world (say fluid flow, plasma physics, nonlinear PDEs, nonlinear waves, etc.) and into questions akin to asking whether an infinity of angels can hold on a pinhead. Well, sorry, but modern physics has an answer: only a finite number.
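Computers do indeed live happily with a largest number. A minimal illustration (IEEE floats saturate at a top value; fixed-width integers simply wrap around, so “largest + 1” does not exist in their arithmetic):

```python
import math
import sys

# Floating point has a largest number; going past it saturates to infinity.
biggest = sys.float_info.max
assert biggest * 2 == math.inf

# Fixed-width integers have a largest number too; (n + 1) wraps around.
MASK = (1 << 64) - 1            # largest 64-bit unsigned value
def add64(a, b):
    """Addition as a 64-bit machine does it: modulo 2^64."""
    return (a + b) & MASK

assert add64(MASK, 1) == 0      # no (n+1) beyond the top
print("machine arithmetic is finite: there is a largest number")
```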

Patrice Ayme’

 

Earth’s Core Is Younger Than Its Crust

May 30, 2016

Inner parts of the Earth are younger than the surface by an appreciable amount. Richard Feynman made this point first. But he underestimated the effect by a factor of 100! As the Danes who just discovered that put it: “The pedagogical value of this discussion is to show students that any number or observation, no matter who brought it forward, must be critically examined”.

Local Time is a theory invented by Poincaré, to make sense of Lorentz’s work. Local Time became famous when Einstein, a German, advertised it, and was himself advertised by Kaiser nationalists such as Max Planck. A gravitational field slows down (Local) Time. (The proof is easy.)

Notice that, at the very center of the core, the gravitational field vanishes; yet the potential is deepest there. So the slowing down of light clocks is actually a function of depth (of the potential, not of the local field). Local time really slows down.

Local time, as given by light clocks, has to be the same as local time given by the weak force (radioactive decay). If not, one could easily tell absolute motion from inside the bowels of the ship’s lab. That would contradict the Principle of Relativity.

I have argued for decades that the Cosmic Background Radiation gives an absolute frame. However, the situation is a bit more subtle than that. Galileo argued that a laboratory in the bowels of a ship cannot provide an indication of motion (as long as one does not look outside!)

I recently dug around and found the argument came initially from bishop Oresme, a student and collaborator of Buridan. Both were major philosophers, mathematicians and physicists of the Fourteenth Century in Paris. Oresme considered the principle of relativity self-evident (to “intelligent” persons). However, that held only as long as one was in the bowels of a ship, not looking at heavenly bodies. Oresme explicitly said so, because he argued the diurnal rotation of the Earth about its axis could not be detected inside a lab (five centuries later, that turned out to be false: consider Foucault’s pendulum, 1851 CE).

So can we find a sort of Foucault pendulum for absolute linear motion? General Relativity insists on what Newton already knew: the Earth falls around the Sun. Can we detect this rotation inside a mine, 2 kilometers down? In theory, yes: the CBR will slow down the Earth at some times, and push it at others. A supersensitive accelerometer could detect that.

Nor can we do away with the likes of a CBR-like reference frame. The simple fact that there are galactic clusters all around, and that they generate the gravitational field, defines a state of rest relative to them.

The formalism of Quantum Physics already has an absolute time for all to see. That absolute time is what enables the non-local effects.

So is physics finished? No. Will the philosophical approach help? Of course (roll over, Feynman, go back to your faulty computations!). It took 32 years for physicists to realize that the potential was on the right side of the De Broglie-Schrödinger equation of 1924… That immediately provided (the idea for) an experimental confirmation, the Aharonov-Bohm effect…
Patrice Ayme’

ianmillerblog

There was a rather interesting announcement recently: three Danes calculated that the centre of the earth is 2.5 years younger than the crust ( U I Uggerhøj et al. The young centre of the Earth, European Journal of Physics (2016). DOI: 10.1088/0143-0807/37/3/035602 ). The concept is that from general relativity, the gravitational field of earth warps the fabric of space-time, thus slowing down time. This asserts that space-time is something more than a calculating aid and it brings up a certain logic problem. First, what is time and how do we measure it? The usual answer to the question or measurement is that we use a clock, and a clock is anything that has a change over a predictable period of time, as determined by some reference clock. One entity that can be used as a clock is radioactive decay and according to general relativity, that clock at the core…


Entangled Universe: Bell Inequality

May 9, 2016

Abstract: The Bell Inequality shatters the picture of reality civilization previously established. A simple proof is produced.

What is the greatest scientific discovery of the Twentieth Century? Not Jules Henri Poincaré’s Theory of Relativity and his famous equation: E = mcc. Although a spectacular theory, since Poincaré made time local in order to keep the speed of light constant, it stemmed from Galileo’s Principle of Relativity, extended to electromagnetism. To save electromagnetism globally, Jules Henri Poincaré made time and length local.

So was the discovery of the Quantum by Planck the greatest discovery? To explain two mysteries of academic physics, Planck posited that energy was emitted in lumps. Philosophically, though, the idea was just to extend to energy the basic philosophical principle of atomism, which was two thousand years old. Energy itself was discovered by Émilie Du Châtelet in the 1730s.

Quantum Entanglement Is NOT AT ALL Classically Predictable

Just as matter comes in lumps (strict atomism), so does energy. In light of Poincaré’s E = mc2, matter and energy are the same, so this is not surprising (by a strange coincidence (?) Poincaré demonstrated, and published, E = mc2 a few months apart, in the same year, 1900, as Max Planck did E = hf; Einstein used both formulas in 1905).

The greatest scientific discovery of Twentieth Century was Entanglement… which is roughly the same as Non-Locality. Non-Locality would have astounded Newton: he was explicitly very much against it, and viewed it, correctly, as the greatest flaw of his theory. My essay “Non-Locality” entangles Newton, Émilie Du Châtelet, and the Quantum, because therefrom the ideas first sprung.

***

Bell Inequality Is Obvious:

John Bell, of the Theory Division at CERN, discovered an inequality so trivial, apparently so basic, so incredibly obvious, reflecting the most elementary common sense, that it should always be true. Ian Miller (PhD, Physical Chemistry) provided a very nice perspective on all this. Here it is, cut and pasted (with his agreement):

Ian Miller: A Challenge! How can Entangled Particles violate Bell’s Inequalities?

Posted on May 8, 2016 by ianmillerblog           

  The role of mathematics in physics is interesting. Originally, mathematical relationships were used to summarise a myriad of observations, thus from Newtonian gravity and mechanics, it is possible to know where the moon will be in the sky at any time. But somewhere around the beginning of the twentieth century, an odd thing happened: the mathematics of General Relativity became so complicated that many, if not most physicists could not use it. Then came the state vector formalism for quantum mechanics, a procedure that strictly speaking allowed people to come up with an answer without really understanding why. Then, as the twentieth century proceeded, something further developed: a belief that mathematics was the basis of nature. Theory started with equations, not observations. An equation, of course, is a statement, thus A equals B can be written with an equal sign instead of words. Now we have string theory, where a number of physicists have been working for decades without coming up with anything that can be tested. Nevertheless, most physicists would agree that if observation falsifies a mathematical relationship, then something has gone wrong with the mathematics, and the problem is usually a false premise. With Bell’s Inequalities, however, it seems logic goes out the window.

Bell’s inequalities are applicable only when the following premises are satisfied:

Premise 1: One can devise a test that will give one of two discrete results. For simplicity we label these (+) and (-).

Premise 2: We can carry out such a test under three different sets of conditions, which we label A, B and C. When we do this, the results between tests have to be comparable, and the simplest way of doing this is to represent the probability of a positive result at A as A(+). The reason for this is that if we did 10 tests at A, 10 at B, and 500 at C, we cannot properly compare the results simply by totalling results.

Premise 1 is reasonably easily met. John Bell used as an example, washing socks. The socks would either pass a test (e.g. they are clean) or fail, (i.e. they need rewashing). In quantum mechanics there are good examples of suitable candidates, e.g. a spin can be either clockwise or counterclockwise, but not both. Further, all particles must have the same spin, and as long as they are the same particle, this is imposed by quantum mechanics. Thus an electron has a spin of either +1/2 or -1/2.

Premises 1 and 2 can be combined. By working with probabilities, we can say that each particle must register once, one way or the other (or each sock is tested once), which gives us

A(+) + A(-) = 1; B(+) + B(-) = 1;   C(+) + C(-) = 1

i.e. the probability of one particle tested once and giving one of the two results is 1. At this point we neglect experimental error, such as a particle failing to register.

Now, let us do a little algebra/set theory by combining probabilities from more than one determination. By combining, we might take two pieces of apparatus, and with one determine the (+) result at condition A, and the negative one at (B). If so, we take the product of these, because probabilities are multiplicative. If so, we can write

A(+) B(-) = A(+) B(-) [C(+) + C(-)]

because the bracketed term [C(+) + C(-)] equals 1, the sum of the probabilities of results that occurred under conditions C.

Similarly

B(+)C(-)   = [A(+) + A(-)] B(+)C(-)

By adding and expanding

A(+) B(-) + B(+)C(-) = A(+) B(-) C(+) + A(+) B(-) C(-) + A(+) B(+)C(-) + A(-)B(+)C(-)

=   A(+)C(-) [B(+) + B(-)] + A(+)B(-)C(+) + A(-)B(+)C(-)

Since the bracketed term [B(+) + B(-)] equals 1 and the last two terms are positive numbers, or at least zero, we have

A(+) B(-) + B(+)C(-) ≧ A(+)C(-)

This is the simplest form of a Bell inequality. In Bell’s sock-washing example, he showed how socks washed at three different temperatures had to comply.

An important point is that, provided the samples in the tests give only one result from only two possible results, and provided the tests are applied under three sets of conditions, the mathematics says the results must comply with the inequality. Further, only premise 1 relates to the physics of the samples tested; the second is merely a requirement that the tests are done competently. The problem is, modern physicists say entangled particles violate the inequality. How can this be?

Non-compliance by entangled particles is usually considered a consequence of the entanglement being non-local, but that makes no sense because in the above derivation, locality is not mentioned. All that is required is that premise 1 holds, i.e. measuring the spin of one particle, say, means the other is known without measurement. So, the entangled particles have properties that fulfil premise 1. Thus violation of the inequality means either one of the premises is false, or the associative law of sets, used in the derivation, is false, which would mean all mathematics are invalid.

So my challenge is to produce a mathematical relationship that shows how these violations could conceivably occur? You must come up with a mathematical relationship or a logic statement that falsifies the above inequality, and it must include a term that specifies when the inequality is violated. So, any takers? My answer in my next Monday post.

[Ian Miller.]
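Ian’s derivation can be verified mechanically: for any local “hidden variable” model, i.e., any probability distribution over predetermined outcome triples (a, b, c), the inequality holds. A brute-force sketch (the random models are my construction, for illustration):

```python
import random

random.seed(1)

def random_local_model():
    """A local 'hidden variable' model: a probability for each of the 8
    predetermined outcome triples (a, b, c), each outcome + or -."""
    w = [random.random() for _ in range(8)]
    s = sum(w)
    return [x / s for x in w]

def joint(p, i, vi, j, vj):
    """P(outcome vi under setting i AND outcome vj under setting j); 0 means +, 1 means -."""
    total = 0.0
    for k in range(8):
        bits = [(k >> n) & 1 for n in range(3)]
        if bits[i] == vi and bits[j] == vj:
            total += p[k]
    return total

for _ in range(10_000):
    p = random_local_model()
    ab = joint(p, 0, 0, 1, 1)    # A(+) B(-)
    bc = joint(p, 1, 0, 2, 1)    # B(+) C(-)
    ac = joint(p, 0, 0, 2, 1)    # A(+) C(-)
    assert ab + bc >= ac - 1e-12
print("10,000 random local models: Bell's inequality always holds")
```

The reason is the set identity behind the derivation: any pair with a=+ and c=- has b either -, landing in the first term, or +, landing in the second.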

***

The treatment above shows how ludicrous it would be for reality to violate that inequality… BUT IT DOES! This is something which nobody had seen coming. No philosopher ever imagined something as weird. I gave an immediate answer to Ian:

‘Locality is going to come in the following way: A is going to be in the Milky Way, B and C, on Andromeda. A(+) B(-) is going to be ½ [cos(b−a)]². Therefrom the contradiction. There is more to be said. But first of all, I will re-blog your essay, as it makes the situation very clear.’
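Plugging in the standard quantum prediction for the singlet state, P(+, -) = ½ cos²((b−a)/2), with suitably chosen angles, does violate the inequality. A sketch (the angles are my choice, for illustration):

```python
import math

def p_plus_minus(angle_deg):
    """Singlet-state joint probability P(+ at one detector, - at the other)
    when the two measurement axes differ by the given angle (standard QM)."""
    return 0.5 * math.cos(math.radians(angle_deg) / 2.0) ** 2

a, b, c = 0.0, 160.0, 320.0          # measurement angles, in degrees
ab = p_plus_minus(b - a)             # A(+) B(-)
bc = p_plus_minus(c - b)             # B(+) C(-)
ac = p_plus_minus(c - a)             # A(+) C(-)

print(f"A(+)B(-) + B(+)C(-) = {ab + bc:.4f}  but  A(+)C(-) = {ac:.4f}")
assert ab + bc < ac                  # the 'obvious' inequality is violated
```

No assignment of predetermined outcomes can produce these numbers, which is exactly the shock of the Bell Inequality.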

Patrice Ayme’

TO BE AND NOT TO BE? Is Entangled Physics Thinking, Or Sinking?

April 29, 2016

Frank Wilczek, a physics Nobel laureate, wrote a soporific, then baffling, article in Quanta magazine: “Entanglement Made Simple”. Yes, all too simple: it sweeps the difficulties under the rug. After a thorough description of classical entanglement, we are swiftly told at the end that classical entanglement supports the Many Worlds Interpretation of Quantum Mechanics. However, classical entanglement (from various conservation laws) has been known since the seventeenth century.

Skeptical founders of Quantum physics (such as Einstein, De Broglie, Schrödinger, Bohm, Bell) knew classical entanglement very well. David Bohm found the Aharonov-Bohm effect, which demonstrated the importance of the (nonlocal) potential; John Bell found his inequality which demonstrated, with the help of experiments (Alain Aspect, etc.), that Quantum physics is nonlocal.

Differently From Classical Entanglement, Which Acts As One, Quantum Entanglement Acts At A Distance: It Interferes With Measurement, At A Distance

The point about the cats is that everybody, even maniacs, ought to know that cats are either dead, or alive. Quantum mechanics make the point they can compute things about cats, from their point of view. OK.

Quantum mechanics, in their busy shops, compute with dead and live cats as possible outcomes. No problem. But then does that mean there is a universe, a “world“, with a dead cat, happening, and then one with a live cat, also happening simultaneously?

Any serious philosopher, somebody endowed with common sense, the nemesis of a Quantum mechanic, will say no: in a philosopher’s opinion, a cat is either dead, or alive. To be, or not to be. Not both to be and not to be.

A Quantum mechanic can compute with dead and live cats, but that does not mean she creates worlds, by simply rearranging her computation, this way, or that. Her various dead and live cats arrangements just mean she has partial knowledge of what she computes with, and that Quantum measurements, even from an excellent mechanic, are just partial, mechanic-dependent measurements.

For example, if one measures spin, one needs to orient a machine (a Stern Gerlach device). That’s just a magnetic field going one way, like a big arrow, a big direction. Thus one measures spin in one direction, not another.
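For a single spin, this orientation-dependence is textbook material: a spin prepared “up” along one axis, measured along a Stern-Gerlach axis tilted by θ, comes out “+” with probability cos²(θ/2). A sketch (standard spin-1/2 result, not anything new):

```python
import math

def p_up(theta_deg):
    """Probability of the '+' outcome when a spin prepared up along z
    is measured along an axis tilted by theta (standard spin-1/2 rule)."""
    return math.cos(math.radians(theta_deg) / 2.0) ** 2

for theta in (0, 90, 180):
    print(f"tilt {theta:3d} deg: P(+) = {p_up(theta):.3f}")
# Only the relative angle between preparation and the Stern-Gerlach
# field matters, and there are only the two outcomes, + and -.
```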

What’s more surprising is that, later on, thanks to a nonlocal entanglement, one may be able to determine that, at this point in time, the particle had a spin that could be measured, from far away, in another direction. So far, so good: this is like classical mechanics.

However, whether or not that measurement at a distance has occurred, roughly simultaneously, and way outside the causality light cone, AFFECTS the first measurement.

This is what the famous Bell Inequality means.

And this is what the problem with Quantum Entanglement is. Quantum Entanglement implies that wilful action somewhere disturbs a measurement beyond the reach of the five known forces. It brings all sorts of questions of a philosophical nature, and makes them into burning physical subjects. For example, does the experimenter at a distance have real free will?

Calling the world otherworldly, or many worldly, does not really help to understand what is going on. Einstein’s “Spooky Interaction At A Distance” seems a more faithful, honest rendition of reality than supposing that each and any Quantum mechanic in her shop, creates worlds, willy-nilly, each time it strikes her fancy to press a button.

What Mr. Wilczek did is what manyworldists and multiversists always do: they jump into their derangement (cats alive AND dead) after saying there is no problem. Details are never revealed.

Here is, in extenso, the fully confusing and unsupported conclusion of Mr. Wilczek:

“Everyday language is ill suited to describe quantum complementarity, in part because everyday experience does not encounter it. Practical cats interact with surrounding air molecules, among other things, in very different ways depending on whether they are alive or dead, so in practice the measurement gets made automatically, and the cat gets on with its life (or death). But entangled histories describe q-ons that are, in a real sense, Schrödinger kittens. Their full description requires, at intermediate times, that we take both of two contradictory property-trajectories into account.

The controlled experimental realization of entangled histories is delicate because it requires we gather partial information about our q-on. Conventional quantum measurements generally gather complete information at one time — for example, they determine a definite shape, or a definite color — rather than partial information spanning several times. But it can be done — indeed, without great technical difficulty. In this way we can give definite mathematical and experimental meaning to the proliferation of “many worlds” in quantum theory, and demonstrate its substantiality.”

Sounds impressive, but the reasons are either well-known or then those reasons use a sleight of hand.

Explicitly: “take both of two contradictory property-trajectories into account”: just read Feynman’s QED, first chapter. Feynman invented the ‘sum over histories’, and Wilczek is his parrot; but Feynman did not become crazy from his ‘sum over histories’: Richard smirked when his picturesque evocation was taken literally, decades later…

And now the sleight of hand: …”rather than [gather] partial information spanning several times. But it can be done — indeed, without great technical difficulty.” This is nothing new: it is the essence of the double slit discovered by that Medical Doctor and polymath, Young, around 1800 CE: when one runs lots of ‘particles’ through it, one sees the (wave) patterns. This is what Wilczek means by “partial information“. Guess what? We knew that already.
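Young’s point can be rehearsed in a few lines: send ‘particles’ one at a time, each landing at random according to the cos² fringe law, and the wave pattern emerges from the accumulated hits. A toy sketch (units where the fringe spacing is 1; the fringe law is the standard two-slit result, my framing, not Wilczek’s):

```python
import math
import random

random.seed(2)

def intensity(x):
    """Two-slit fringe pattern ~ cos^2, in units where the fringe spacing is 1."""
    return math.cos(math.pi * x) ** 2

# Send 'particles' one at a time; accept landing positions by rejection sampling.
hits = []
while len(hits) < 20_000:
    x = random.uniform(-2, 2)
    if random.random() < intensity(x):
        hits.append(x)

# The dark fringes (x = +/- 0.5, +/- 1.5) should be nearly empty.
near_minimum = sum(1 for x in hits if abs(abs(x) % 1.0 - 0.5) < 0.05)
print(f"{near_minimum} of {len(hits)} hits near the dark fringes")
```

Each hit, alone, tells you nothing; the fringes, that is, the “partial information”, only appear in the statistics of many.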

Believing that one can be, while not being, and putting that at the foundation of physics, is a new low in thinking. And it impacts the general mood, making it more favorable towards unreason.

If anything can be, without being, if anything not happening here is happening somewhere else, then is not anything permitted? Dostoyevsky had a Russian aristocrat suggest that, if god did not exist, anything was permitted. And, come to think of it, the argument was at the core of Christianism. Or, more exactly, of the Christian reign of terror which started in the period 363 CE-381 CE, from the reign of emperor Jovian to the reign of emperor Theodosius. To prevent everything from being permitted, a god had to enforce the law.

What we have now is way worse: the new nihilists (Wilczek and his fellow manyworldists) do not just say that everything is permitted. They say: it does not matter if everything is permitted, or not. It is happening, anyway. Somewhere.

Thus Many-Worlds physics endangers not just the foundations of reason, but the very justification for morality: namely, that what is undesirable should be avoided. Even the Nazis agreed with that principle. Many-Worlds physics says it does not matter, because it is happening anyway. Somewhere, out there.

So what is going on, here, at the level of moods? Well, professor Wilczek teaches at MIT, just across town from Harvard. Harvard professors advised president Yeltsin of Russia to set up a plutocracy. It ruined Russia. The same professors made a fortune from it, while others advised president Clinton to do the same; meanwhile Prime Minister Balladur in France was mightily impressed, and followed this new enlightenment by the Dark Side, as did British leaders, and many others. All these societies were ruined in turn. Harvard was the principal spirit behind the rise of plutocracy, and the engine propelling that rise was the principle that morality did not matter, because, because, well, Many-Worlds!

How does one go from the foundations of physics to the foundations of plutocracy? Faculty members in the richest, most powerful universities meet in mutual admiration societies known as “faculty clubs”, and in lots of other I-scratch-your-back, you-scratch-my-back social occasions in which they spend much of their time indulging. So they influence each other, at the very least through the atmospheres of moods they create, and then breathe, together.

Remember? It is not that everything is permitted: it’s happening anyway, so we may as well profit from it first. Many-Worlds physics feeds a mood favorable to many plutocrats, and that’s all there is to it. (But that, of course, is a lot, all too much.)

Patrice Ayme’