Archive for the ‘Entanglement’ Category

ENTANGLEMENT, FASTER THAN LIGHT COMMUNICATIONS, CAUSALITY, etc.

January 25, 2023

Abstract: Faster Than Light Particle Transfer? Not Possible According To Special Relativity. But Faster Than Light Communications? Some Day, Probably. Using Quantum Entanglement… Not Particle Transport.

Tyranosopher: Folklore based on vague reasoning by Einstein holds that Faster Than Light Communications are impossible (a variant supposedly breaks the universe… see below). Having read Einstein carefully, yours truly determined that Einstein’s reasoning was flimsy (Albert himself hints at that in his original paper).

Most of Special Relativity stays intact if Faster Than Light Communication, FTLC, is possible. ALL the equations, and thus the verifying experiments, of Special Relativity stay intact. (See answers to objections towards the end.)

Simplicia: Many people will write you off because you wrote off Einstein. They won’t read any further.

Tyranosopher: OK, in another essay I will detail my objections to the packaging of Special Relativity which forbids FTLC. Below is just a sketch.

Now about Einstein: he is not God. Actually, there is no God. When I was young and naive, I approved of (all of) Einstein’s critiques of Quantum theory, a theory to which he crucially contributed as number two, Planck being number one. Planck said emission of radiation came in grains, quanta, and explained two facts this way. Einstein explained that supposing absorption of radiation also came in quanta explained the photoelectric effect. Planck condemned the latter, but Einstein was right. Then other physicists contributed. The next huge conceptual breakthrough was De Broglie’s matter waves. Then CIQ (the Copenhagen Interpretation of the Quantum) arose, with correct physics and admirable math, but a sick, un-realistic metaphysics. De Broglie objected and rolled out a realistic model of fundamental physics. Einstein seconded De Broglie, but they were overwhelmed by the launch of QED by Dirac. Then came all sorts of strange and marvellous high energy zoology, then QFT, etc.

Nevertheless, after exchanges with Karl Popper, Einstein wrote the EPR paper on nonlocality, in 1935… EPR criticized Quantum Physics and its nonlocality from the “realistic” point of view. I am also all for Sub Quantum Physical Reality (SQPR), but I have an axiom neither De Broglie nor Einstein had. Science progresses one axiom at a time…

However, as the decades passed, and I deepened my understanding, I realized that Einstein’s admirable work was not as revolutionary and crazy as needed. 

Simplicia: The funny thing is that Einstein discovered nonlocality in the 1935 EPR paper. Which is one of the top ten papers in theoretical physics, and very hot today, as Quantum Computers use nonlocality.

Tyranosopher: Einstein was honest enough to not throw nonlocality out of the window. Maybe his conversation with the philosopher Karl Popper helped: Popper did contribute to the discovery of nonlocality. Einstein called nonlocality “spooky action at a distance”.

Simplicia: Now nonlocality is a proven experimental fact.

Tyranosopher: Yes, the “SPOOKY ACTION AT A DISTANCE”, which initially was a purely theoretical fact coming out of the axiomatics of Quantum Theory, has been PROVEN over distances of many kilometers. One has to know the crucial difference between QUANTUM SPIN and classical spin to see nonlocality clearly.

Chinese scientists have measured a minimum speed for this “spooky action at a distance”. I call it the QUANTUM INTERACTION, and assign to it a finite speed, TAU. This supplementary axiom contradicts Quantum Theory.

Instead, classical Twentieth Century Quantum Physics says that Quantum Entanglement proceeds at infinite speed.

So this supplementary axiom of propagating finite speed nonlocality should be experimentally testable. I claim the proof of a finite speed for the Quantum Interaction is all around us: Dark Matter and Dark Energy are the results of this experiment, conducted for free by the universe itself. Amen.

Simplicia: What do you mean that nonlocality has been proven? Your friend Ian Miller, who is a physical chemist, denies a proof was achieved.

***

Tyranosopher: I admire Ian, he is a renaissance man, but I don’t understand his arguments in this particular case. There are countless variants and proofs under the label “Bell’s theorem”, in a jungle of tweaked axiomatics. Ian uses the classical Noether’s theorem… which doesn’t apply to Quantum situations. For once I will use an argument from authority: the Nobel was given for nonlocality in 2022, and should have been given at least two decades earlier to Alain Aspect. That could have helped physics.

To understand the simplest quantifiable proof of nonlocality, one has to know about Quantum Spin and what has been experimentally discovered. Quantum Spin does NOT behave like Classical Spin. Classical Spin can be measured in all directions simultaneously, but Quantum Spin can be measured in only one direction at a time, and that measurement erases the preceding one.
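This one-direction-at-a-time behavior can be sketched numerically with the standard spin-1/2 Born rule. A toy simulation, not a model of any specific apparatus:

```python
import math
import random

def measure(state_angle, apparatus_angle, rng):
    """Measure spin-1/2 along apparatus_angle (radians), given that the
    spin was last prepared 'up' along state_angle.  Born rule for
    spin 1/2: P(up) = cos^2((apparatus_angle - state_angle) / 2)."""
    p_up = math.cos((apparatus_angle - state_angle) / 2) ** 2
    return rng.random() < p_up

rng = random.Random(0)

# Prepare 'up' along z (angle 0), then measure along z: always up.
assert all(measure(0.0, 0.0, rng) for _ in range(1000))

# Measure along x (90 degrees off): 50/50.  The z information is gone,
# because the x measurement re-prepares the state along x.
ups = sum(measure(0.0, math.pi / 2, rng) for _ in range(10_000))
print(ups / 10_000)  # close to 0.5
```

The point of the sketch: a second measurement at right angles carries no memory of the first, which is exactly the erasure the text describes.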

https://patriceayme.wordpress.com/2022/11/10/proof-of-no-local-hidden-variables-from-magnets/

Building on Einstein’s 1935 EPR, the simplest Quantum Entanglement which can be studied over a distance was elaborated by David Bohm in the 1950s, and then studied in detail by a very small group of physicists, including John Bell of CERN’s theory division, in the 1960s, to produce an experimentally testable inequality… whose experimental tests were given the Physics Nobel for 2022.

Simplicia: OK, many people have thought this instantaneous nonlocality could be used for Faster Than Light, FTL.

Tyranosopher: Maybe. But one has to distinguish FTL transport from FTL Communication. FTL for massive objects is impossible, except by transporting a space bubble, which is pure science fiction of the extravagant type.

However if SQPR is correct and TAU is finite, one should be able, theoretically speaking, to create energy imbalances at a distance, after an elaborate technological setup, and thus create FTLC channels. 

***

QUANTUM ENTANGLEMENT SEEMS TO PRODUCE FTLC: 

Suppose we produce a state of total spin zero shared by two particles. (Particle streams, in practice.)

We keep one going in circles around Earth, and send the other to Proxima Centauri, 4 lightyears away.

Now say that, after 4 years, we measure the spin in the z direction in the Earth neighborhood, and we find |+>. Then we know that the other particle has spin |-> at Proxima.

So our measurement at Earth created a spin down at Proxima… instantaneously.

Now, with particle streams and synchronized clocks one could easily transform this into an FTL Morse code….

Except for one Quantum difficulty: we do not know how to prepare a |+> state to start with. We have the same probability of creating a |-> state… We can’t make a stream of |+> states, so we can’t type our FTL Morse code in the first place! It’s as if we told a cosmic monkey in another room to type, but he can’t select letters.

***

Hence the impossibility of Faster Than Light Communications rests only upon claiming to know something we know nothing about: can one NEVER EVER prepare, and/or NEVER EVER select, Quantum states before measuring them? In other words, do Quantum States have tails?

There is a so-called “No-Cloning” [of states] theorem… But the “proof” has a loophole (it depends upon assuming a unitary operator, thus denying that there are quantum tails, exactly what it wants to prove). In truth, it’s an experimental problem: if what the prestigious French physicist Devoret at Yale and his collaborators report is true, it has been possible to prepare some (contrived) Quantum states… but, SO FAR, it has not been possible to prepare Quantum states which happen to be ENTANGLED.

***

When some physicists pretend Faster Than Light Communications are impossible, they pontificate, because, in truth, we don’t know. And science doesn’t progress one pontifex at a time, but one correct intuition at a time. The intuitive case for FTLC is growing as the Quantum amazes us ever more.

***

What we know is that something we thought to be completely impossible, SWAPPING QUANTUM ENTANGLEMENT, is not only possible, but now so amply demonstrated that it is central to various developing Quantum technologies.

***

SQPR assumes particles have complex structures, a linear part (the guiding wave) and a nonlinear part (the “particle”), the entire structure being unstable and prone to contracting at TAU, the collapse and entanglement speed. 

***

However, Quantum Swapping shows that, somehow, one can have Quantum Interactions without collapse, namely the propagation of QE.

***

Thus it is starting to smell as if one could interact with a particle’s extended presence without inducing collapse, and then select the type we like…

***

Simplicia: Hence FTLC should be possible?

Tyranosopher: FTLC through Quantum Entanglement would not contradict Relativity, because it would change nothing for light clocks, or for the equation Force = d(mv/sqrt(1-vv/cc))/dt. There would be no mass transport.

***

It all smells as if FTLC will become possible. That does not mean that Faster Than Light matter transport should be possible. The latter is impossible without warp drives.

Simplicia: Wait, don’t go. It is well known that FTL Communication leads to the breakdown of causality, and thus, sheer madness. Consider the excellent video:

https://www.youtube.com/watch?v=an0M-wcHw5A

Tyranosopher: Yes, beautiful video. Minkowski spacetime diagrams. Einstein didn’t like them; he liked neither Minkowski nor “spacetime”. It was reciprocal: Minkowski, who was Einstein’s professor at Zurich Polytechnic, ETH, called Albert a “lazy dog” and made sure he couldn’t get an academic appointment. Instead a friend got Einstein a job at the Patent Office in Bern.

Simplicia: Can we get to the point? You don’t like spacetime as a concept, so what?

Tyranosopher: Notice that they draw these spacetime diagrams all over the galaxy’s real space, in various places, and then derive a contradiction.

Simplicia: Yes, so what?

Tyranosopher: Relativity was invented by Henri Poincaré to describe local effects. Basically, local speed makes the local time of the speeding object run slow: a fast-traveling light clock runs slow when oriented along the direction of motion, at that speed. From there, after quite a bit of half-hidden logic, plus Michelson-Morley type experiments, which showed the undetectability of speed from within a ship cabin without looking outside (the original Galileo imagery), one deduced that length also contracted, and so did the local time of the moving device.

Simplicia: Thanks for the two sentences recap of Relativity.

Tyranosopher: The slowing down of local time has been amply confirmed with fast particles like muons; and, in a slightly different context, GPS computations crucially depend upon the time dilation of the orbiting satellites.
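The muon case makes a good back-of-the-envelope check of the Lorentz factor in the equation above. Round, illustrative numbers:

```python
import math

def gamma(v_over_c):
    """Lorentz factor 1/sqrt(1 - v^2/c^2)."""
    return 1.0 / math.sqrt(1.0 - v_over_c**2)

# Cosmic-ray muon: proper lifetime ~2.2 microseconds.  Without time
# dilation it could travel only ~660 m before decaying; at 0.999c the
# lab-frame lifetime stretches by gamma ~ 22, letting muons produced
# ~15 km up in the atmosphere reach the ground.
c = 3.0e8        # speed of light, m/s (approximate)
tau = 2.2e-6     # muon proper lifetime, s
v = 0.999        # speed as a fraction of c

g = gamma(v)
print(round(g, 1))                  # Lorentz factor, ~22.4
print(round(g * tau * v * c))       # lab-frame decay length in metres
```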

Simplicia: And then? Why are spacetime diagrams bad?

Tyranosopher: Spacetime diagrams are tangent space objects. They are, at best, local approximations. Extending a spacetime diagram to Vega has degraded meaning. Einstein knew this, he mentioned somewhere that General Relativity violates the constancy of the speed of light. And that’s fairly obvious as light could be put in orbit around a black hole. Now the silly ones cry that time would be in orbit around said black hole, and bite its own tail, etc.  Grandchildren would kill all their grandparents, etc. Silly stuff: they confuse local and global, although that’s the bedrock of differential geometry. Differential geometry is locally flat (aka “Euclidean”) and globally curved (or even twisted). But this is not even the worst…

Simplicia: How come all this is not well-known?

T: Long ago I gave a seminar along these lines at Stanford. Many of the best and brightest were in attendance, Hawking, Penrose, Yau, Susskind, etc., and they were not too happy with what I said. But my point about General Relativity making no sense without Quantum is viewed as trivially obvious nowadays.

Simplicia: So you are saying one can’t just rotate the spacetime axes of a moving spaceship and make deductions?

T: One can make deductions, but one can’t make deductions where the local time of a moving ship becomes global time, as in the video I linked above. Earth can synchronize time with Vega; Henri Poincaré described how that can be done. But one can’t synchronize time with a moving spaceship (as those who claim to have demonstrated that FTLC breaks causality do).

If one sends an FTL message to a moving spaceship, it does not get it in our past. It gets it in our future. Our past and our future are local… to us, and to Vega, if we synchronized time with Vega. A really silly mistake.

Simplicia: Please stop insulting fellow intellectuals, or they are not going to be fellows anymore. And why did you link to a false video?

Tyranosopher: Right, let me rephrase this: it has been known since the onset of Relativity that at speed, simultaneity is violated. So cause and effect can look inverted in a moving ship relative to what they are in a co-moving frame. That’s basic. The video misses the point, although it looks so reasonable, with great graphics.

Therefore, in the Special Theory of Relativity, causality can only be established and defined in the co-moving frame. (Same for mass, let it be said in passing. Even the otherwise excellent Richard Feynman makes that mistake in his lectures; the video I linked above makes that mistake too.)

So claiming that Faster Than Light Communications violate causality is erroneous!

***

Simplicia: If and when do you think we can realize FTLC?

Tyranosopher: We are tantalizingly close. Some physicists (Devoret) adorned with prizes, glory and long careers claim that they can detect the preparation of a Quantum jump, and even that they can revert it. If that’s true, and we can apply that kind of selection to Quantum Spin, FTLC could be installed with Mars before humanity lands on the planet.

Simplicia: Are you serious?

Tyranosopher: Absolutely. 

Patrice Ayme  

Proof Of NO LOCAL Hidden Variables From Magnets

November 10, 2022

Abstract: Looked at the right way, the Stern Gerlach experiment with three consecutive magnets oriented various ways shows that there can’t be LOCAL hidden variables. To exhibit nonlocality, there is no need for the precise but obscure logic of the Bell Inequality. The argument here is less mathematically precise, but more intuitive.

***

Stern-Gerlach Magnets (SGM) reveal an aspect of the Quantum, namely the quantization of (some sort of) angular momentum. SGM launched in 1922 the saga of Quantum Spin (it turns out to be connected to deep pure geometry which had been developed independently by Élie Cartan, a decade earlier). Drive an electron, or (appropriate) atomic beam through a SGM, and one will get two dots, one up, one down. Whatever the axis of the SGM [1]. (The SGM is just a magnetic field with a specific linear direction.) 

That means, at the emotional level, that at the smallest scale spin, the electronic (sort of) angular momentum, or the orbital (sort of) angular momentum, reduces to UP and DOWN. First surprise. (This is actually the case of Spin 1/2, the simplest, such as for an electron; we are trying to learn the most from the simplest case.)

Say the first SGM is vertical (magnetic field along the “z axis”) and a second SGM is horizontal (magnetic field along the “x axis”). Call them respectively SGMV and SGMH. So SGMH produces LEFT-RIGHT beams. Put SGMH across the UP beam coming out of SGMV; one could call that beam SGMV (UP). Once it goes through SGMH, one will get 50-50 LEFT-RIGHT. No surprise there.

Now say one selects the RIGHT beam coming out SGMH. Call that beam SGMH (UP; RIGHT)… because first the beam went up, then the beam went right.  

Naively one would expect, from classical mechanics, SGMH (UP; RIGHT) to have kept a memory of its initial source as SGMV (UP).

That would be to assume that beam SGMV (UP) and its descendant SGMH (UP; RIGHT) have kept some memory; in other words, that the beams through the first magnet SGMV and then the second magnet SGMH harbor some LOCAL HIDDEN VARIABLES.

But that’s not the case. 

Indeed, please run SGMH (UP; RIGHT) into a second SGMV (that is, a Stern Gerlach Magnet parallel to the first magnet SGMV). Call that second vertical Stern Gerlach Magnet SGMV2. One gets fifty-fifty UP and DOWN coming out of SGMV2. It is as if the initial Stern Gerlach magnet, SGMV, never happened. (This set-up is presented in the Feynman Lectures on Physics III, Chapter 6, Spin 1/2.)

So if there were local hidden variables carried after SGMV, that is, in the beam SGMV (UP), they got somehow completely eradicated by getting into the beam SGMH (RIGHT).

So these proposed local hidden variables do not stay hidden inside the “particle”: an outside agent can erase them… Thus those putative local hidden variables aren’t really “local” anymore: the environment, outside of the particle, impacts them, and drastically so, just as the potential impacts the phase of an electron in the Aharonov-Bohm experiment… nonlocally.

***

One can rerun the experiment using both beams SGMH (LEFT) and SGMH (RIGHT), mixing them up. Then it turns out that SGMV2 deflects ONLY UP. So simply going through magnet SGMH, WITHOUT selecting a beam (either SGMH (LEFT) or SGMH (RIGHT)), doesn’t do anything: a collapsing of the Quantum space available to the Quantum wave, selecting either left or right space, is what does something.
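The select-versus-recombine contrast can be verified with two-component amplitudes and the Born rule. A minimal sketch using textbook spin-1/2 states, not a simulation of the actual beams:

```python
import math

def dot(u, v):
    """Inner product of two real 2-component amplitude vectors."""
    return u[0] * v[0] + u[1] * v[1]

s = 1.0 / math.sqrt(2.0)
up_z    = (1.0, 0.0)   # |up> along z
right_x = (s,  s)      # |right> along x
left_x  = (s, -s)      # |left>  along x

def prob(bra, ket):
    """Born rule: probability = |<bra|ket>|^2 (real amplitudes here)."""
    return dot(bra, ket) ** 2

# Case 1: SELECT the RIGHT beam after SGMH.  The state collapses to
# |right>; a second vertical magnet then gives 50/50.  The original
# UP information has been erased.
p_up_after_selection = prob(up_z, right_x)
print(p_up_after_selection)        # 0.5 (up to rounding)

# Case 2: keep BOTH beams and recombine them coherently.  The state
# amp_right*|right> + amp_left*|left> is just |up> again:
amp_right = dot(right_x, up_z)
amp_left  = dot(left_x, up_z)
recombined = (amp_right * right_x[0] + amp_left * left_x[0],
              amp_right * right_x[1] + amp_left * left_x[1])
p_up_recombined = prob(up_z, recombined)
print(p_up_recombined)             # 1.0: SGMV2 deflects only UP
```

Selecting one beam changes the statistics; merely traversing the magnet with both beams open does not, which is the nonlocal point of the argument.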

Conventional Quantum Physics, in its newish path integral version, phrases this by saying one can’t tell which path has been followed [2], so the information SGMV UP or SGMV DOWN is kept. The Copenhagen Interpretation of the Quantum (CIQ) simply says that selecting beam SGMH (RIGHT) is a measurement, and thus collapses the wave function… SQPR says roughly the same thing.

In any case, this eradication of the influence of SGMH on the “particle”, achieved by just keeping open the OTHER beam (which the putative local hidden variable “particle” is by definition NOT taking), is itself a NONLOCAL effect, thus once again demolishing the “LOCAL Hidden Variable” concept. (One could say that one beam is entangled with the other…)

The advantage of this conceptual approach is that it exhibits directly the nonlocality… without hermetic complications [3]. It also shows the interest of a more philosophical rather than purely formalistic approach to physics.

Patrice Ayme
***

[1] Wolfgang Pauli, in 1924, was the first to propose a doubling of the number of available electron states due to a two-valued non-classical “hidden rotation”. In 1925, George Uhlenbeck and Samuel Goudsmit suggested the simple physical interpretation of a particle spinning around its own axis… But clearly that doesn’t fit what is observed. Pauli built a mathematical machinery which reflected the observed SGM behavior. It turned out to be a particular case of deep mathematical work by the French mathematician Élie Cartan, who was born and initially educated in the small Alpine coal mining village of La Mure, and rose through merit in the republican educational system. Spin is a bit like taking the square root of space. I don’t understand it; neither did the extremely famous mathematician Atiyah…

It is easy to be blinded by the math. But actually the math describes an observed physical behavior. Now this behavior may arise from a deeper geometrical reason.

***

[2] In SQPR, the “particles” are preceded by the linear guiding waves. Blocking some of them triggers “collapse”. By selecting SGMH (RIGHT) one clearly collapses the linear guidance.

***

[3] Stern Gerlach Magnets also directly illustrate Spin, as shown in the first few lines above (magnetic field —> two dots!). The Pauli machinery is often how Spin is introduced in Quantum Physics courses, but that, philosophically, is confusing the formalism derived from what is observed with the observation itself.

REALIZATION NEEDS LOCALIZATION. NONLOCALITY Gets Nobel Prize

October 4, 2022

Finally! The most surprising discovery of the last two centuries in science was not the Quantum (that had been sort of anticipated by the Greeks, who thought they had demonstrated the existence of atoms, literally “non-divisibles”; the photon is the atom of light…). Nor was it DNA: the discovery that there are laws of inheritance is hundreds of thousands of years old, and was made ever more specific with time, as humans learned to breed characteristics, etc.

NONLOCALITY was an enormous surprise because it contradicted everything… even mathematics, come to think of it deeply enough.

The old approach, which had become crystallized by the ancient Greeks was that the world was made of indivisibles, atoms in nature, points in mathematics. So granular nature and extreme precision.

As my own dad once told me, unprompted, while I described the Quantum to him (in English translation): “the idea that one can get ever smaller and nothing changes can’t possibly be true”.

What NONLOCALITY says is that if one gets small enough, one ends up somewhere else!

This shatters the expectation that smaller is no different. Instead:

1) smaller is somewhere else.

2) Properties and matter as we expect it, do not exist at a small enough scale. REALIZATION NEEDS LOCALIZATION.  

(The mathematics of QM requires this. Einstein and company objected to the notion, as they put it, that the “Moon does not exist if no one looks at it”… That was an exaggerated objection; there is a qualitative difference between a very small object and a very large one… at least in SQPR. The drawing from the Swedish Academy is also exaggerated… However, it makes the general idea clear, in first approximation.)

This gives the rough idea: the smallest properties, at the Quantum scale, do not exist until an interaction has occurred. A lot of open questions are connected to this: what of energy-momentum? Does it spread all over, or stay concentrated, as in what I called Einstein’s Greatest Error? SQPR has an in-between position: most, but not all, of the energy-momentum stays concentrated. It is this lack which created Dark Matter and Dark Energy.

4 October 2022

The Royal Swedish Academy of Sciences has decided to award the Nobel Prize in Physics 2022 to

Alain Aspect

Université Paris-Saclay and

École Polytechnique, Palaiseau, France

John F. Clauser

J.F. Clauser & Assoc., Walnut Creek, CA, USA

Anton Zeilinger

University of Vienna, Austria

“for experiments with entangled photons, establishing the violation of Bell inequalities and pioneering quantum information science”

Entangled states – from theory to technology

Alain Aspect, John Clauser and Anton Zeilinger have each conducted groundbreaking experiments using entangled quantum states, where two particles behave like a single unit even when they are separated. Their results have cleared the way for new technology based upon quantum information.

The ineffable effects of quantum mechanics are starting to find applications. There is now a large field of research that includes quantum computers, quantum networks and secure quantum encrypted communication.

One key factor in this development is how quantum mechanics allows two or more particles to exist in what is called an entangled state. What happens to one of the particles in an entangled pair determines what happens to the other particle, even if they are far apart.

For a long time, the question was whether the correlation was because the particles in an entangled pair contained hidden variables, instructions that tell them which result they should give in an experiment. In the 1960s, John Stewart Bell developed the mathematical inequality that is named after him. This states that if there are hidden variables, the correlation between the results of a large number of measurements will never exceed a certain value. However, quantum mechanics predicts that a certain type of experiment will violate Bell’s inequality, thus resulting in a stronger correlation than would otherwise be possible [0].

John Clauser developed John Bell’s ideas, leading to a practical experiment. When he took the measurements, they supported quantum mechanics by clearly violating a Bell inequality. This means that quantum mechanics cannot be replaced by a theory that uses hidden variables [1].

Some loopholes remained after John Clauser’s experiment. Alain Aspect developed the setup, using it in a way that closed an important loophole. He was able to switch the measurement settings after an entangled pair had left its source, so the setting that existed when they were emitted could not affect the result [2].

Using refined tools and long series of experiments, Anton Zeilinger started to use entangled quantum states. Among other things, his research group has demonstrated a phenomenon called quantum teleportation, which makes it possible to move a quantum state from one particle to one at a distance.

“It has become increasingly clear that a new kind of quantum technology is emerging. We can see that the laureates’ work with entangled states is of great importance, even beyond the fundamental questions about the interpretation of quantum mechanics,” says Anders Irbäck, Chair of the Nobel Committee for Physics.

***

Clauser (and others with him, not mentioned by the Nobel Committee) took entanglement seriously. At the time it was viewed as a curiosity, not really physics. As a top physicist in my department said: “Serious people don’t do these things, it just gives headaches!”

Personal notes: [0] This depends upon the fact that measuring the spin of a particle in one direction affects the measurement of the spin of the same particle in another direction… which is certainly not the case in Classical Mechanics: if one measures the rotation of a particle about direction x, it does not affect a subsequent measurement about direction y…
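That direction-dependence is exactly what the Bell/CHSH combination exploits. A minimal numerical sketch, using the textbook singlet correlation and the standard optimal angles (not tied to any particular experiment):

```python
import math

def E(a, b):
    """Quantum-mechanical spin correlation for a singlet pair,
    measured along directions at angles a and b: E = -cos(a - b)."""
    return -math.cos(a - b)

# CHSH combination with the standard optimal measurement angles.
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

S = abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))
print(round(S, 4))   # 2.8284 = 2*sqrt(2)

# Any local-hidden-variable model obeys |S| <= 2; quantum mechanics
# exceeds that bound, which is what the laureates' experiments confirmed.
assert S > 2
```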

[1] The Nobel Committee should have said: quantum mechanics cannot be replaced by a theory that uses LOCAL hidden variables.

And what does “LOCAL” mean? Topology as defined using the speed of light as metric. In other words, nonlocal hidden variables are permitted, and this is what SQPR, a SUB QUANTUM Physical Reality, uses.

[2] Aspect thus showed that there is such a thing as a QUANTUM INTERACTION, and that it propagates faster than light. The big question is how fast. In excess of 10^23 c.

The notion of Quantum Interaction is important. Once one agrees (with Newton) that no interaction can be instantaneous, and once one has identified Quantum Collapse with it, Dark Matter and Dark Energy are near instantaneous deductions.
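The logic of such experimental speed bounds is simple: two correlated detections separated by distance d, observed within a timing window dt, imply that any influence linking them moves at least d/dt. A sketch with purely illustrative numbers, not the parameters of the actual experiments:

```python
# Lower bound on the speed of a hypothetical influence linking two
# entangled detections.  The numbers below are illustrative only.
d = 16_000.0     # metres between the two detectors (illustrative)
dt = 350e-12     # seconds of timing resolution (illustrative)
c = 2.998e8      # speed of light, m/s

v_min = d / dt   # any linking influence must travel at least this fast
print(f"lower bound: {v_min / c:.0f} c")
```

Tighter timing and longer baselines push the bound up; published experiments of this kind quote lower bounds of at least ten thousand times c.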

***

Anyway, here it is: Nonlocality has become official!

Patrice Ayme 

How To Generate Matter Waves In Large Objects

June 14, 2020

L = h/P is the De Broglie hypothesis, where L is the wavelength of the matter wave, P is the momentum of the object, and h is Planck constant. 

The relationship was known in the case of the photon [1].
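The formula L = h/P is easy to evaluate, and the numbers show why matter waves are conspicuous for electrons but invisible for everyday objects. Standard constants, non-relativistic momentum:

```python
H = 6.626e-34   # Planck constant, J*s

def de_broglie(mass_kg, speed_m_s):
    """De Broglie wavelength L = h / p, with non-relativistic
    momentum p = m * v."""
    return H / (mass_kg * speed_m_s)

# Electron at 1e6 m/s: wavelength ~0.7 nm, comparable to atomic
# spacings, which is why electron diffraction is easy to observe.
print(de_broglie(9.109e-31, 1.0e6))   # ~7.3e-10 m

# A 1 kg object at 1 m/s: ~6.6e-34 m, absurdly small, which is why
# the matter waves of large objects are invisible in everyday life.
print(de_broglie(1.0, 1.0))           # ~6.6e-34 m
```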

What I viewed as a mystery: the validity of the formula for any object. It seemed to me to make all particles, and all masses, sort of fundamental. How could that be? Meanwhile, the mystery of mass got thicker. I heard of the Higgs; it’s mostly a gimmick to create mass out of friction with an assumed universal field, now supposedly observed… for particles in the Standard Model which wouldn’t otherwise have any. How much of the mass of a nucleon is produced by simply harnessing E = mcc is unknown to me, but it could be most of it.

De Broglie matter waves, simplest version. In general p, the momentum, is relativistic, and “mv” is just the slow-poke version of it… So all I am saying is that the zillions of linear parts of all these waves add up… OK, I should make my own drawing, severely more sophisticated: this one gives the impression that the linear tail (the outside, guiding part) of the wave is nonlinear… In SQPR, the center is highly nonlinear, and the outer, guiding part is ready to transform itself into Dark Matter, given the right geometry.

Then SQPR appeared. In SQPR, waves are everything, but they are additive, nonlinear… And they are all what space is. 

So visualize an object O. It’s made of a zillion elementary particles, call that number Z, each actually an elementary wave: expanding, entangling, then nonlinearly collapsing, then expanding again after that interaction, entangled all over, etc. On average, though, the expanding quantum waves of these objects will have momentum p/Z and mass m/Z. This clarity of mind escapes Quantum Field Theorists (QFTists), because their version of space is haunted by so-called “vacuum energy”… for which there is only anti-evidence (namely, the universe exists; if vacuum energy existed, the universe wouldn’t, because it would be collapsed all the time! [2])

Now SQPR says those linear parts all add up, averaging p/Z each, and thus constitute a mass m and momentum p. It’s all very simple, and now it sounds intuitive… A return of intuition to physics would be welcome, instead of complete insanity…

Patrice Ayme

***


[1] Planck had discovered E = hF, where F is the frequency of light. Einstein proposed to generalize it to quanta of light (“Lichtquanten”) in flight, and immediately explained the photoelectric effect that way (he got the Nobel for that; by the way, SQPR immediately explains Dark Matter).

***

[2] One piece of evidence for vacuum energy is the “CASIMIR EFFECT”… which is thoroughly demonstrated in very practical nanophysics. I explained once how to make it produce much energy. Nobel laureate S. Weinberg, in his book on Gravitation, rolls out Casimir as a proof of vacuum energy (instead, I roll out the universe to disprove vacuum energy!). However, it turns out one doesn’t need the full vacuum energy to explain Casimir…

See:

The Casimir Effect and the Quantum Vacuum
R. L. Jaffe, Center for Theoretical Physics, Laboratory for Nuclear Science and Department of Physics, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139

Abstract: In discussions of the cosmological constant, the Casimir effect is often invoked as decisive evidence that the zero point energies of quantum fields are “real”. On the contrary, Casimir effects can be formulated and Casimir forces can be computed without reference to zero point energies. They are relativistic, quantum forces between charges and currents. The Casimir force (per unit area) between parallel plates vanishes as α, the fine structure constant, goes to zero, and the standard result, which appears to be independent of α, corresponds to the α → ∞ limit.
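For scale, the standard ideal-plate Casimir formula quoted in textbooks, P = π²ħc/(240 a⁴), gives a small but measurable pressure at nanometre separations. An illustrative evaluation:

```python
import math

HBAR = 1.0546e-34   # reduced Planck constant, J*s
C = 2.998e8         # speed of light, m/s

def casimir_pressure(a):
    """Attraction per unit area between ideal parallel plates
    separated by a (metres): P = pi^2 * hbar * c / (240 * a^4)."""
    return math.pi**2 * HBAR * C / (240 * a**4)

# At 100 nm separation the pressure is about 13 Pa: small, but
# measurable, and it grows as 1/a^4 at nanometre scales, which is
# why the effect matters in practical nanophysics.
print(round(casimir_pressure(100e-9), 1))   # ~13.0 Pa
```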

SQPR has its own version of the vacuum: it’s truly empty, devoid of mass-energy. All the space that is physical is made of matter waves… Matter waves and pieces thereof all have mass-energy, making them real.

FREE WILL SHOWS QUANTUM PHYSICS IS INCOMPLETE

February 15, 2020

Present Day Quantum Physics Is Entangled With Photon Awareness, While Contradicting Free Will, In A Most Peculiar Way…

Abstract: The Axiomatics of the Copenhagen Interpretation of the Quantum is written as if photons were aware of slits-at-a-distance… and as if photons acted accordingly (as if photons cared about slits!)… But the Copenhagen Interpretation of the Quantum provides NO mechanism for photons to take care of slits. This is absurd in two ways. It’s as if an anthropomorphic Mr. Photon were supposed to be telepathic. Another problem with the Copenhagen Interpretation of the Quantum (“CIQ”) is that CIQ Quantum Physics, being deterministic, denies Free Will.

Conclusion: Quantum Theory is not the final story. Guiding Wave theories with delayed causality, such as SQPR, are necessary to reduce the nonsense.

***

We act, we decide, we initiate actions. Can we insert this faculty for action of our own Free Will, this human agency, into the general picture of nature (“physics”) that we presently have? No! Because physics as we know it is deterministic… And we are not! (Quantum Physics, contrarily to its repute, is deterministic… as long as its nonlocal effects are not considered…) 

Thus the humanity-as-an-independent-agent question leads to the depths of the human mind and its relationship with physical reality, throwing up profound connections to the mysteries of entropy (disorder increases… something biology seems to violate) and the arrow of time (time flows one way… although fundamental physics runs both ways, contradicting even entropy as fundamental). Even reality gets questioned (what is it?), and consciousness (what could it be?). Surprising answers are readily discernible.

“Quantum fields don’t have any agency. Atoms don’t; do bacteria?” asks physicist Sean Carroll from the California Institute of Technology. “I don’t know, but human beings do. Somewhere along that continuum it sneaked in.”

Quantum Determinism a la Copenhagen Means We Have No Freedom Of Choice

Well, it is not even as simple as believing that “agency” sprouts somewhere on the continuum between atoms and human beings. Let me make a ridiculously simple observation.

Take the two-slit experiment. This phenomenon is the conceptual heart of Quantum Physics. If we take the Copenhagen Interpretation of the Quantum (CIQ, pronounced “SICK”) at face value, something astounding occurs: it looks as if an electron, or a photon, has AWARENESS.

Indeed, according to Einstein, a photon in flight is a localized concentrated “quantum” (Einstein wrote about “Lichtquanten”, light quanta; they got named “photons” 20 years later). 

The following is exactly what Albert wrote, in his otherwise beautiful Nobel Prize-winning paper on the photoelectric effect, and it has been viewed as definitive truth ever since: “Energy, during the propagation of a ray of light, is not continuously distributed over steadily increasing spaces, but it consists of a finite number of energy quanta LOCALIZED AT POINTS IN SPACE, MOVING WITHOUT DIVIDING…” (I view this unsupported opinion of Einstein’s as a grave error which the herd has repeated ever since… But I am going to proceed, for the sake of argument, as if this ridiculous idea were true, in the next few lines!)

When one cuts two slits in a screen, a photon (going through just one slit, according to Einstein) somehow knows about the other slit. How? Certainly not by having Mr. Photon look over at the other slit. So, then, what is the root, the nature of this photon “awareness”, Einstein and his followers want us to believe in? 

A photon is aware of the other slit: could such an elementary particle’s awareness at a distance be the fundamental “element of consciousness“? (I am sarcastically parroting terminology of Einstein in 1935, introducing the notion of “elements of reality”) 

A shallow philosopher could chuckle that all consciousness comes from sensation, which comes from the senses… And obviously the photon has no senses… Except that, somehow, according to Einstein and CIQ, the photon (or any fundamental particle) senses the other slit at a distance (always under Einstein’s locality-of-the-quantum hypothesis, which permeates modern physics like a poison gas)… So, according to them, the photon has a sense, somehow.

Experiments with bouncing droplets, such as the three above, were started in Paris in the Twenty-First Century. They provide the first analogy to guide De Broglie’s Pilot Wave Theory of 1927 and the much more sophisticated SQPR… A problem for the Pilot Wave Theories being that we have NO mathematical models… as mathematicians often prefer to focus on silly problems posed by infinities, the modern analogue of the worry that an infinite number of angels sitting on pinheads posed to medieval Catholic bishops….

“The surface waves generated by the silicon oil droplets above are analogous to quantum mechanical waves that guide the dynamics of quantum particles. While the droplets move like quantum particles, they behave like quantum waves”… says award-winning photographer and physicist Dr. Aleks Labuda, who took the picture above.

Guiding Wave (GW) partisans, such as yours truly, don’t have the problem of the telepathic, all-aware photon endowed with Free Will: the Guiding Wave goes through both slits of the two-slit experiment, and thereafter “guides” the photon according to the presence of these two slits. (Experimental models of the two slits, with bouncing liquid droplets, exist… and have thus attracted great hatred from partisans of the Copenhagen Interpretation, such as the grandson of Niels Bohr, himself a physicist. I will not put links, so as not to confuse readers…)
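The guiding-wave picture of the two slits can be sketched numerically. Below is a minimal toy model (all numbers illustrative, not from any experiment): the wave contributes one complex amplitude per slit, and the detection probability on the screen is the squared magnitude of their sum.

```python
import numpy as np

# Toy guiding-wave model of the two slits (all numbers illustrative):
# the wave goes through BOTH slits; the two contributions are added as
# complex amplitudes, and the detection probability on the screen is
# the squared magnitude of the sum.
wavelength = 1.0
k = 2 * np.pi / wavelength
slit_separation = 10.0
screen_distance = 1000.0      # far-field screen

def intensity(x):
    """Unnormalized detection probability at screen position x."""
    r1 = np.hypot(screen_distance, x - slit_separation / 2)  # path from slit 1
    r2 = np.hypot(screen_distance, x + slit_separation / 2)  # path from slit 2
    amplitude = np.exp(1j * k * r1) + np.exp(1j * k * r2)    # both slits count
    return np.abs(amplitude) ** 2

x = np.linspace(-200, 200, 2001)
pattern = intensity(x)
print(pattern[1000])   # center of the screen: ~4, fully constructive
```

At the center both paths have equal length, so the amplitudes add in phase (intensity ~4, versus 1 for one slit alone); elsewhere the two contributions can cancel, producing the dark fringes no single-slit story explains.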

So, basically, if one rejects a strange photon “awareness”, implicitly assumed by CIQ, one is immediately led to Guiding Wave theories. To this, people familiar with the Foundations of Quantum Physics may retort that a GW theory such as De Broglie-Bohm is indistinguishable from CIQ. Right. But I don’t think the De Broglie-Bohm Guiding Wave can withstand the EPR argument of 1935. Moreover, my own theory, SQPR, is distinguishable from CIQ: SQPR produces Dark Matter… CIQ doesn’t.

In any case, a GW theory is a mechanical, nonlocal field of awareness (Bohm makes it into a Quantum Potential). [1]

***

With Quantum Physics, we find ourselves back in the ultra-deterministic setting of the Eighteenth Century… But now with a theory which claims to understand everything (whereas in the 18C-19C some pieces were known to be missing, and not just the two clouds Lord Kelvin saw on the horizon…) So the Quantum should explain Consciousness, Free Will… As it explains the universe. But, clearly, it contradicts Free Will… EXCEPT if one considers nonlocal effects. Nonlocal effects violate local determinism.

The preceding essay stands as testimony to the usefulness of the philosophical approach for digging deeper into what physics should become in the (hopefully) near future [2]. It was never different, each time physics jumped ahead. All revolutions in physics have been revolutions in philosophy… and the most general revolutions in philosophy often preceded revolutions in physics and science in general [3]. For example, the Renaissance of the Eleventh Century preceded the astronomical, physical and mathematical revolution of Buridan (and his schools!) after 1350 CE. In turn, that may have accelerated the Fifteenth-Sixteenth Century philosophical Renaissance, which clearly led to the Seventeenth Century scientific and technological revolutions, an ambitious protest against more modest understanding.

Patrice Ayme

***

***

[1] One thing that makes SQPR different from De Broglie-Bohm (DBB) is that SQPR supposes the Guiding Field proceeds, expanding or collapsing, at an extremely fast… but NOT infinite… speed. Another is that the Guiding Field carries minute, but nonzero, mass-energy. Both effects together predict the Dark Matter effect… Also, SQPR makes Quantum Entanglement a mass-energy conveyor, hence non-magical, another deviation from both CIQ and DBB…

***

[2] Quantum Computers exploit the Foundations of the Quantum… but not through the brute force of Quantum Field Theory and its (glaringly very incomplete and haphazard) “Standard Model” ….the one with no model for Dark Matter or Dark Energy. So Quantum Computers bring the foundations, such as Quantum Entanglement, crucial for their operations, into focus… Hence expect foundations to become ever more crucial in the common Zeitgeist…

***

[3] This is particularly blatant reading Descartes, who justified his enormous advances in mathematics with a cocktail of philosophical and psychological observations of the most judicious types. Just as I question infinity, Descartes questioned the sort of proofs mathematicians had been satisfied with for two millennia… and did something about it (by inventing Algebraic Geometry)….

(Aaronson) Misleading “Quantum Talk”

December 8, 2019

Of Those For Whom Despising Others Fosters One’s Sense Of Existence:

Aaronson, a renowned Quantum Computation expert, thought it funny to pretend that what he studies (programming Quantum amplitudes) is the end-all, be-all of the universe. So this came out in a cartoon, complete with an arrogant child and an even more stupid mom. Although the fact that the mom is stupid is not revealed by the cartoon, quite the opposite. When does she go wrong?

Besides having no eyes to speak to, or from? So here is the cartoon. It starts pretty well. It ends up as badly as possible, with plenty of lies about Quantum Mechanics and unsupported assertions that consciousness has nothing to do with Quantum Mechanics (when there is plenty of evidence to the contrary, including that QM is at the core of biology). Anyway here is the silly cartoon, from a respected Quantum Computational authority:

I condescend and lie, therefore I think, hence I exist? Too many physicists have succumbed to that temptation, due to their otherwise modest position in society.

There is a partially correct message above, not well known by the unwashed multitudes: yes, Quantum Mechanics computes with complex amplitudes. Why is that? Let me reveal it: consider light. It’s the simplest Quantum phenomenon, and it requires Complex Numbers to be described as compactly as possible. Light is made of entangled electric and magnetic fields at an angle to the propagation direction. More precisely: E is the electric field vector, and B is the magnetic field vector of the EM wave. For electromagnetic waves, E and B are always perpendicular to each other and perpendicular to the direction of propagation. The direction of propagation is the direction of E x B.

Electromagnetic waves are the solutions of Maxwell’s equations in a vacuum, which abstract the experiments pioneered by at least half a dozen physicists (Monge, Ampere, Faraday, Fresnel, etc.).
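The geometry just described can be checked with a one-line cross product; the unit vectors below are illustrative choices.

```python
import numpy as np

# For a plane electromagnetic wave: E and B are perpendicular, and the
# propagation direction is that of E x B. Illustrative unit vectors:
E = np.array([0.0, 1.0, 0.0])   # electric field along y
B = np.array([0.0, 0.0, 1.0])   # magnetic field along z
k_dir = np.cross(E, B)          # propagation direction
print(k_dir)                    # [1. 0. 0.]: the wave travels along x
```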

E (and thus B!) can rotate, or point in any direction, as long as E and B stay perpendicular to each other and to the propagation of the light wave. However, one can also fix E, by letting the light through a grid. This is a standard experiment; it’s used in polarized sunglasses, etc. However, it’s very deep, deeper than Aaronson can understand, or than present physics understands: it means the light is somehow extended… Even if it’s a single, proverbial photon.

Consider a light wave coming from the left. The first, vertical grid forces the E electric field of the light to be vertical (any other direction gets captured by the vertical grid). The second, horizontal grid then kills the vertically polarized light.

In general, the electric field E can adopt any direction, or rotate. To depict this mathematically, there is a handy instrument: the Complex Number field, which is the one and only (commutative) field generalization of the Real Number field [1]. Light is also the simplest Quantum phenomenon known. Thus, to fully depict the simplest Quantum, one needs Complex Numbers. More Quantum will naturally mean more Complex Numbers.

Aaronson claims that “physicists have a custom, when describing these matters to outsiders: they want to avoid being… too graphic, too explicit… too *gulp* mathematically correct“.

The Schrodinger Cat and nonlocality are caused by “talking to outsiders”? This is what Aaronson says explicitly lower down in the cartoon. This is hogwash. Physicists didn’t understand much of the very basics of the Quantum Mechanics they were in the process of inventing. Even the most trivially blatant features of Quantum Mechanics were not understood for more than a generation, although they were the first thing one wrote on the subject. Let me explain.

The basic equation (De Broglie-Schrodinger) reads basically as:

i (change of W relative to time) = (change of the change of W across space) + (W)(A)

More “mathematically”: i dW/dt = ddW/dxdx + AW

Where W is a complex wave, and A is a potential W interacts with (say, the electromagnetic field for an electron) [2].

It’s obvious from this most basic equation that W reacts to the potential A. However, it took 33 years or so to notice this: Bohm and Aharonov did it… and all the big geniuses of Quantum Mechanics (Einstein and De Broglie, let alone Bohr, Heisenberg, Schrodinger, Pauli, Born, the hyper-arrogant Von Neumann, Dirac, etc.) didn’t notice it.
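One can watch W react to A numerically. Below is a minimal split-step sketch of the equation above, using the standard textbook signs (i dW/dt = -(1/2) ddW/dxdx + AW, with the Planck constant and mass set to 1); the grid sizes and the harmonic potential are illustrative choices, nothing canonical.

```python
import numpy as np

# Split-step integration of i dW/dt = -(1/2) ddW/dxdx + A(x) W
# (standard textbook signs, Planck constant and mass set to 1),
# to see the wave W react to the potential A it is multiplied by.
n, L = 512, 40.0
x = np.linspace(-L / 2, L / 2, n, endpoint=False)
k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)   # momentum grid for the FFT
dt, steps = 0.01, 400

def evolve(A):
    """Evolve a Gaussian packet centred at x = 3 under the potential A."""
    W = np.exp(-(x - 3.0) ** 2).astype(complex)
    W /= np.sqrt(np.sum(np.abs(W) ** 2))     # normalize on the grid
    for _ in range(steps):
        W = np.exp(-0.5j * dt * A) * W       # half kick from the potential
        W = np.fft.ifft(np.exp(-0.5j * dt * k**2) * np.fft.fft(W))  # kinetic step
        W = np.exp(-0.5j * dt * A) * W       # second half kick
    return np.abs(W) ** 2

free = evolve(np.zeros_like(x))      # A = 0: the packet spreads, its mean stays put
trapped = evolve(0.5 * x ** 2)       # harmonic A: the packet is pulled back
print(np.sum(x * free), np.sum(x * trapped))   # the potential changes the motion
```

With A = 0 the packet’s mean position stays at 3; with the harmonic potential the very same initial wave is dragged across the origin. W reacts to A, exactly as the equation says it must.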

Notice Aaronson describes physicists as “insiders“. In other words, physicists make a tribe… just like many philosophers around “French Theory” have claimed…

However, the real truth is that physicists were too ignorant to explain well to “outsiders” WHY the Quantum Wave is COMPLEX, because there was not enough inside the heads of these insiders. But here I come.

So I take my right arm, and project it forward, like your standard Kung Fu master. I open three fingers. My index points in the direction of motion of my arm, ready to punch inside the eye of the tribal physicist “insider”. My middle finger points perpendicular to the index (because physicist insiders get my symbolic finger; it also represents the E field above). My thumb, representing the B field, is perpendicular to the other two extended fingers. At this point the whole assembly of these three fingers progressing forward can be described by three real numbers. But then I impose a rotation of the entire hand: there is your Complex Number field. The rotation requires the complex numbers.

Some could sneer that a complex number is just a pair of real numbers with weird multiplication rules, so we don’t really need Complex Numbers, bla bla bla… But Mathematics is not just a language: mathematics is the most compact, most efficient language: it achieves that by compactifying the logic maximally.
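To make the point concrete, here is the same rotation of the E field written both ways: as a real 2x2 rotation acting on the pair (Ey, Ez), and as one multiplication by the unit complex number e^(i*theta). The angle and field values are arbitrary illustrations.

```python
import cmath, math

# One rotation of the E field, written two ways: a 2x2 real rotation
# acting on the pair (Ey, Ez), and a single multiplication by the unit
# complex number e^(i*theta). Angle chosen arbitrarily for illustration.
theta = math.pi / 3
Ey, Ez = 1.0, 0.0                        # E field before the rotation

# Real-number version: matrix-vector product, four multiplications.
c, s = math.cos(theta), math.sin(theta)
E_matrix = (c * Ey - s * Ez, s * Ey + c * Ez)

# Complex-number version: the field is the single number Ey + i*Ez.
E_complex = complex(Ey, Ez) * cmath.exp(1j * theta)

print(E_matrix, E_complex)   # same rotated field, fewer symbols the second way
```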

What does Patrice mean by “compactifying the logic”? How to measure that? Simple: count the symbols: the fewer symbols, the more compact… [3]

***

Aaronson claims that “classical events have probabilities and quantum events have amplitude“. That’s false. Quantum events have probabilities too (the norm squared of the amplitude: the Born rule). Earlier, Aaronson showed he had no modesty, didn’t know the history of physics, and was a tribalist. Now he is throwing out the window not just the philosophy of Quantum Mechanics, but also the entire Theory of Measurement, and the refined analysis of what an “event” is.

A basic axiom of Quantum Mechanics is that an unobserved event is not an event. As long as one has “amplitudes”, the quantum system computes quantum mechanically, but one has no “event”. The “event” happens after the collapse, when there are no more amplitudes. I call that singularization. Other more or less equivalent concepts are “collapse” and “decoherence” (collapse frightened the children, so they opted for decoherence as they became snowflakes).
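The amplitude/probability distinction takes only a few lines to exhibit; the three-outcome state below is purely illustrative.

```python
import numpy as np

# Amplitudes are not probabilities: the probability of a quantum outcome
# is the norm squared of its complex amplitude (the Born rule). A toy,
# purely illustrative three-outcome state:
amplitudes = np.array([0.5, 0.5j, 1 / np.sqrt(2)])
probabilities = np.abs(amplitudes) ** 2        # Born rule: |amplitude|^2
print(probabilities)                           # 0.25, 0.25, 0.5: sums to 1

# Before the "event" (the collapse), amplitudes can interfere: two paths
# with opposite amplitudes cancel, though each path alone would be likely.
path1, path2 = 1 / np.sqrt(2), -1 / np.sqrt(2)
print(abs(path1 + path2) ** 2)                 # 0.0: destructive interference
```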

So what Aaronson presents as the one and only axiom of Quantum Physics is actually a self-contradicting ERROR.

Actually Wikipedia offers nine different complicated and independent axioms for Quantum Mechanics.

So why does Aaronson speak only of amplitudes? Because it’s all he needs for his job, computer programming. Forget physics: Aaronson doesn’t seem to know the difference between “amplitudes” and “waves”: waves have “amplitudes”, but do not reduce to “amplitudes”. Moreover, Quantum waves don’t behave like classical waves, with their (only) local behavior: Quantum waves are global. In the author’s own SQPR, the waves propagate at an enormous speed, tens of trillions of trillions of times faster than the speed of light (so they appear “instantaneous”, as the present Quantum mechanical axiomatics has it…)

https://patriceayme.wordpress.com/2016/05/18/quantum-waves-are-real/

The (so-called “Heisenberg”) Uncertainty Principle is a direct consequence of the De Broglie hypothesis. Yet the basic idea is pre-Quantum… Or rather, Quantum in the purely electromagnetic sense… As I said, the very basics of Quantum Mechanics is electromagnetics, thus optics. Let me explain a bit more.

To find out where something is, one shines a light on it; one hits it, say, with light. However, the light’s precision is greater the shorter its wavelength (otherwise light turns around objects, for the exact same reason radio waves do). But the energy of the light is proportional to its frequency, which is inversely proportional to its wavelength. So the more precisely one tries to ascertain where an object is by looking at it, the greater the kick one imparts to it. It’s obvious one will get an uncertainty. At this point, one doesn’t even need De Broglie’s equation, the relation between wavelength and momentum, but plain 19th-century physics, mostly that light has momentum… the very same property the tremendous genius Jules Henri Poincaré used in 1899 to demonstrate E = mcc, the famous mass-energy relationship usually attributed to the fluffy parrot Einstein, then a young brat who tried to make us believe he invented all of physics, besides telling God how to organize the universe (as Quantum founder and Nobel laureate Niels Bohr told him).

To get the best numbers of the exact inequality, one needs De Broglie…
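With De Broglie (and the Planck constant set to 1), the exact inequality can be checked numerically: for a Gaussian wave packet, the product of the position spread and the momentum spread sits right at the minimum 1/2. A sketch, with an arbitrary grid and an arbitrary width:

```python
import numpy as np

# Gaussian wave packet check of the uncertainty relation (Planck constant
# set to 1): position spread times momentum spread equals 1/2, the minimum
# allowed. Grid size and packet width are arbitrary illustrative choices.
n, L = 4096, 80.0
x = np.linspace(-L / 2, L / 2, n, endpoint=False)
sigma = 1.7                                   # try any width: dx*dp stays 1/2
psi = np.exp(-x**2 / (4 * sigma**2))
psi /= np.sqrt(np.sum(np.abs(psi)**2))        # normalize on the grid

dx = np.sqrt(np.sum(x**2 * np.abs(psi)**2))   # position spread (mean is 0)

p = 2 * np.pi * np.fft.fftfreq(n, d=L / n)    # momentum values of the FFT modes
phi = np.fft.fft(psi)
phi /= np.sqrt(np.sum(np.abs(phi)**2))        # normalize momentum amplitudes
dp = np.sqrt(np.sum(p**2 * np.abs(phi)**2))   # momentum spread

print(dx * dp)   # ~0.5: a Gaussian saturates the uncertainty bound
```

The momentum spread is read off the Fourier transform of the wave: it is the WAVE, not a bare amplitude, that carries the inequality.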

Notice that what is important here is WAVES, not just amplitudes. And the relationship between momentum, energy, and wave frequency and direction. Right, the waves are complex, and they have amplitudes.

By the way, waves are not all one needs for Quantum Mechanics: SPIN, for example, comes from a different logical source, purely geometrical. Spin was discovered by another tremendous French genius, Elie Cartan, before World War One. A generation later, Spin surfaced in Dirac’s fertile mind: Dirac wanted the simplest equation possible to describe the electron… That required a new space, for spinors to live on… (The mathematics of all this is not well understood at the deepest level; it’s a bit as if one took the square root of space. I have not much clue beyond that, nor does anyone else…)

Aaronson also claims that Quantum nonlocality is just a matter of amplitudes. Well, it’s not. Between the EPR paper of 1935 and Bell’s inequality of 1964, there were nearly three decades during which physicists were more than perplexed: they ignored nonlocality altogether (until Bohm-Aharonov). Experimental tests started much later, and some physicists have received the greatest prizes for them… albeit not the Nobel… which was instead attributed for realizing the Bose-Einstein condensate, some 70 years after it became theoretically imaginable…
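Nonlocality is indeed more than amplitudes-in-a-bathtub, and Bell’s CHSH inequality makes that quantitative: any local model is bounded by 2, while the correlations of an entangled singlet state, computed below from standard quantum amplitudes, reach 2√2. A minimal sketch with the textbook angle choices:

```python
import numpy as np

# CHSH sketch: singlet-state correlations beat the bound of 2 that any
# local classical model must respect, reaching 2*sqrt(2).
sz = np.array([[1, 0], [0, -1]], dtype=complex)   # Pauli z
sx = np.array([[0, 1], [1, 0]], dtype=complex)    # Pauli x

def spin(theta):
    """Spin measurement along an axis at angle theta in the x-z plane."""
    return np.cos(theta) * sz + np.sin(theta) * sx

# The entangled singlet state (|01> - |10>)/sqrt(2), basis 00,01,10,11:
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def E(a, b):
    """Correlation <singlet| spin(a) (x) spin(b) |singlet>."""
    return np.real(singlet.conj() @ np.kron(spin(a), spin(b)) @ singlet)

# Textbook CHSH angle choices:
a1, a2, b1, b2 = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
S = abs(E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2))
print(S)   # ~2.828 = 2*sqrt(2), above the local-model bound of 2
```

Note there is no “bathtub” arrangement of local amplitudes that reproduces these correlations: the violation comes from the entangled, global wave, not from amplitudes as such.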

With the end of his silly cartoon, Aaronson demonstrates that he doesn’t know physics (the beginning is pretty good, though…), and that he is teaching lies.

Aaronson demonstrated that he may have had a condescending, mechanical mom without eyes, and the arrogance simpletons come with, by definition.

***

Authorities (such as politicians, economists, media owners and physicists in positions of authority) have several reasons to lie. Yes, lying manipulates people, but more precisely:

  1. It prevents people from accessing truth, thus power.
  2. It confuses people, leveraging 1).
  3. It fills the public with awe: confused and powerless, they view the authoritative figure as quite a bit of a magician, since lying enables the authority to apparently master the (otherwise) incomprehensible.
  4. In manipulating We The People into submission, it is crucial not to reveal that the incomprehensible has been made incomprehensible by the authorities, by deliberate design of an immense distraction. (Following the misleading Aaronson, good luck trying to deduce nonlocality from “amplitudes”: there are “amplitudes” in my bathtub, but that doesn’t make them nonlocal… So this is an impossible task, which Aaronson pretends to have mastered; economists and politicians, bankers and other high finance types, even ecologists on government payroll, do the same, day in, day out, 24/7: outrageous lying, to confuse the multitude.)

All too many “intellectuals” in recent decades have been for sale…

Teaching Quantum Physics to all is teaching the universe, as it is, to all. Knowing Quantum Physics enriches one’s arsenal of understanding schemes, all over. So teaching it correctly is a mission civilisatrice. When humanity is more intelligent in the future, it will be in part because of this. For example, the meta-observation that everything is made, in the small, of waves is most enlightening, and impacts sensibilities…

Thus la trahison des clercs, here lying about the nature of the Quantum while posing as an “insider”, is the sort of pseudo-intellectual posturing humanity really doesn’t need. This sort of deliberately dishonest and malicious posturing has brought deplorable racism, such as imposing the notion that fearing Wahhabism is racist… and has distracted from the most major problems at hand, such as global plutocratization and the man-made mass extinction.

Patrice Ayme

***

***

P/S: A friend of mine, a professional university researcher in Quantum Computing, long working for a GAFAM, sent me the cartoon above, in an apparent slight to my essay on brain modularity making consciousness necessary. I am grateful as the paper was intelligently stupid.

***

[1] Quaternions make a non-commutative division algebra.

***

[2] i is the square root of (-1), a rotation by 90 degrees in the complex number field. I put the Planck constant = 1… As I am limited by the WordPress software, I denote the second (partial) derivative relative to space as dd/(dx)(dx), which is what it is… but different from the usual symbolism… which also uses the Greek letter psi instead of W…

***

[3] Maxwell’s equations initially covered an entire page. Now they can be reduced to nine symbols: dF = 0 and d*F = J.

To understand that, you have to learn more advanced differential geometry: exterior differentiation, the * (Hodge star) operator, etc. But you will admit that’s compact… And thus, in the only sense that makes sense, MOST intuitive.

LOGIC IS MATERIAL

April 11, 2018

Logic doesn’t just matter, it is matter.

FUNDAMENTAL PROCESSES, INCLUDING COMPUTATIONS, LOGIC, ARE MATERIAL OBJECTS:

Is there something besides matter? No. What is matter? Ah, two types of things, corresponding to wave-particle duality… Or, as I often put it, process-quanta duality.

***

We should have come a long way in 24 centuries, yet some keep repeating ideas of Plato, an Athenian plutocrat. Plato (and his teacher Socrates and student Aristotle) had an extreme right wing agenda, much of it pursued later as the “Hellenistic” regimes (dictatorships), imperial fascist Roman Principate, and the rage against innovation. Plato’s metaphysics has much in common, if not everything, with Christianism (this explains its survival…)

And now for a word from this essay’s sponsor, the gentleman contradicting me. Robin Herbert replied to me: …”many don’t seem to grasp that the classical logics are not tied to any physical assumptions… I think the problem is that we have this term “classical physics” and another term “classical logic” and people think they are related. They aren’t.”

Are we that stupid? I guess, our enemies wish we were…

***

Only those who have never heard of Platonism would be unfamiliar with the notion that logic is not “material”: it is at the core of Plato’s view of the universe. And also at the core of Christianism, so help me not god!

I beg to oppose the dematerialization of logic. Differently from Plato, I have careful observation of nature, Quantum theory, the mechanics of atomic theory, to back me up. Frankly, relative to what we know now, Plato is an ignorant twerp. So why the reverence for his antique antics? My unforgiving mood is driven in part by the observation that the Ancient Greeks had plenty of holes in their axiomatics… especially in mathematics (where they made several ludicrous mistakes, such as forgetting non-Euclidean geometry, generations after discovering it).

If logic is not tied to “physics”, or what’s material, we want to know what that is. But, as I am going to show, all we do is go back to the Gospel of John as the ultimate authority (itself straight out of Plato!)

Twentieth Century physics has revealed that physics is made of “Fundamental Processes” (see the very nice, pre-QCD book by that title from Feynman)… And Quanta. The former, the processes, are described by waves; the latter, those lumps of energy, by particles.

Thus, saying that “logic is not physics” is tantamount to saying that logic is neither a fundamental process (or set thereof), nor quanta (or set thereof).

Orbitals of an electron around a proton (the Hydrogen atom), visualized in 2013 (Phys. Rev.). What you are looking at is one electron, when it is delocalized. The electron is the cloud. The cloud is a process. The process is what an atom of hydrogen is, 99.9999999% of the time… At least…

There are several problems with such a claim: far from being immaterial, any logic shows up as quanta (aka “symbols”), and is itself a process (classical logic rests on implication, the simplest process, “if A then B”, and chains therefrom). Logic shows up as nothing else, so that’s what it is: a bunch of fundamental processes and quanta. This is the modern philosophy of physics in action! (It originated with Newton and Laplace, and was then amplified by Jules Henri Poincaré.)

There was a famous exchange between Heisenberg and Einstein; the latter, at the peak of his glory, accused the young Quantum physicist of having put only observables in his matrix quantum theory. Heisenberg coolly smirked back that it was Einstein who had taught him to do so! (Constructively infuriated, ten years later Einstein rolled out the EPR thought experiment, alleging a contradiction between Quantum Mechanics and LOCAL “elements of reality“. The effect was relabeled “entanglement” by Schrödinger, and is now the central notion in Quantum theory… Einstein should have realized that it was this very delocalization which made atoms wholes…)

So what’s “material”? What’s observable! And what is observable? (Delocalized) fundamental processes and (localized, yet ephemeral) quanta. Claiming that the logos is neither is (implicitly) done in the first sentence of the Gospel of John, and John adds that its name is god. We of the natural school shall excommunicate those evoking god. Those who claim “logic”, the logos, escapes nature (= physis) are just followers of whom John followed, namely Plato. They are Platocrats, a particular prototype of plutocrats…

Fundamental processes are described by equations, but that doesn’t mean the equations are “real”, beyond being symbols (“quanta”) in a medium. First of all, equations are approximations: a classical computer can make only a finite number of operations (differently from a full Quantum computer, which works with a continuum, the circle S1). Instead, what is really real is the fundamental process(es) the equations approximate.

Indeed, consider atoms: they are real, “indivisible” (sort of)… and yet mostly made of delocalized processes known as electronic orbitals.  It is the delocalization which creates the substance: see the picture above… 

So is a classical computation a real object, in the aforementioned sense? Yes, because it is a FINITE set of fundamental processes (moving electrons and photons around). However, if the proposed computation, or logical deduction, takes an infinite amount of time, it becomes something that never comes to exist. (That’s an allusion to a classical computer trying to duplicate Quantum computers; in the case of the chlorophyll molecule, no classical computer could do what the molecule, viewed as a Quantum computer, does!)

In this view, call it material logic, time, whether we want it or not, whether logicians realize it or not, is an essential part of logic: the time-energy principle de facto granulates time (we need infinite energy for infinitely small time intervals, hence for would-be infinite logical computations). To say time is not part of logic is another of these oversights (such as Archimedes made, implicitly using what non-standard analysts, Robinson et al., called the “Archimedes Axiom”, which excludes infinitely small (or large) integers). Any piece of logic comes with its own duration, namely how many steps it needs in its simplest form.
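The claim that any piece of logic comes with a duration can be made concrete: run a chain of implications and count the steps. The rules below are hypothetical, purely for illustration.

```python
# Toy illustration of "material logic": a chain of implications is a
# finite process, and each deduction has a duration, counted here in
# elementary steps. The rules are hypothetical, purely illustrative.
rules = {"A": "B", "B": "C", "C": "D"}   # if A then B, if B then C, ...

def deduce(start, goal, rules):
    """Follow the implication chain; return (reached, number of steps)."""
    current, steps = start, 0
    while current != goal:
        if current not in rules:
            return False, steps          # chain breaks: no deduction exists
        current = rules[current]         # one modus ponens = one step
        steps += 1
    return True, steps

print(deduce("A", "D", rules))   # -> (True, 3): three steps of logic
```

Each run of this deduction is a finite set of physical operations; a “deduction” requiring infinitely many steps would simply never come to exist, which is the point of the paragraph above.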

Quantum computing uses one (hypothesized) infinity: the assumed instantaneity of what I call the Quantum Interaction (aka Quantum Collapse). That enables one to delocalize Quantum logic (no distributive law of propositional logic!), as delocalized Quantum processes, and this is why it can’t be classically duplicated (aka “Quantum supremacy”).

Happy processes!

Patrice Aymé

Discrepancy In Universe’s Expansion & Quantum Interaction

January 17, 2018

In “New Dark Matter Physics Could Solve The Expanding Universe Controversy“, Ethan Siegel points out that:

“Multiple teams of scientists can’t agree on how fast the Universe expands. Dark matter may unlock why.
There’s an enormous controversy in astrophysics today over how quickly the Universe is expanding. One camp of scientists, the same camp that won the Nobel Prize for discovering dark energy, measured the expansion rate to be 73 km/s/Mpc, with an uncertainty of only 2.4%. But a second method, based on the leftover relics from the Big Bang, reveals an answer that’s incompatibly lower at 67 km/s/Mpc, with an uncertainty of only 1%. It’s possible that one of the teams has an unidentified error that’s causing this discrepancy, but independent checks have failed to show any cracks in either analysis. Instead, new physics might be the culprit. If so, we just might have our first real clue to how dark matter might be detected.

Twenty years ago, a number of teams published, peer-reviewed, the discovery that we are in an ever-faster expanding universe. The Physics Nobel was given for that to a Berkeley team and to an Australian team. There are now several methods to prove this accelerating expansion, and they (roughly) agree.

Notice the striking differences between different models in the past; only a Universe with dark energy matches our observations. Possible fates of the expanding Universe which used to be considered were, ironically enough, only the three on the left, which are now excluded.  Image credit: The Cosmic Perspective / Jeffrey O. Bennett, Megan O. Donahue, Nicholas Schneider and Mark Voit.

Three main classes of possibilities for why the Universe appears to accelerate have been considered:

  1. Vacuum energy, like a cosmological constant, is energy inherent to space itself, and drives the Universe’s expansion. (This idea goes back to Einstein, who introduced a “Cosmological Constant” in the basic gravitational equation… to make the universe static, a weird idea akin to the crystal spheres of Ptolemaic astronomy; later Einstein realized that, had he not done that, he could have posed as real smart by predicting the expansion of the universe… So he called it, in a self-deprecating way, his “greatest mistake”… However, in the last 20 years, the “greatest mistake” has come to be viewed as a master stroke…).
  2. Dynamical dark energy, driven by some kind of field that changes over time, could lead to differences in the Universe’s expansion rate depending on when/how you measure it. (Also called “quintessence”; not really different from 1), from my point of view!)
  3. General Relativity could be wrong, and a modification of gravity might explain what appears to us as an acceleration. (However, the basic idea of the theory of gravitation is so simple that it’s hard to see how it could be wrong, as long as one doesn’t introduce Quantum effects… which is exactly what I do! In my own theory, said effects occur only at large cosmic distances, on the scale of large galaxies.)

Ethan: “At the dawn of 2018, however, the controversy over the expanding Universe might threaten that picture. Our Universe, made up of 68% dark energy, 27% dark matter, and just 5% of all the “normal” stuff (including stars, planets, gas, dust, plasma, black holes, etc.), should be expanding at the same rate regardless of the method you use to measure it. At least, that would be the case if dark energy were truly a cosmological constant, and if dark matter were truly cold and collisionless, interacting only gravitationally. If everyone measured the same rate for the expanding Universe, there would be nothing to challenge this picture, known as standard (or “vanilla”) ΛCDM.

But everyone doesn’t measure the same rate.”

The standard, oldest method of measuring the Hubble cosmic expansion rate is known as the cosmic distance ladder. The simplest version has only three rungs. First, you measure the distances to nearby stars directly, through parallax: the variation of their apparent angular position during the year, as the Earth goes around its orbit. More specifically, you measure the distances to long-period Cepheid stars this way. Cepheids are “standard candles”: their luminosity varies, with a pulsation period tightly tied to their intrinsic brightness, so measuring the period tells us how luminous they truly are, and comparing that with how bright they appear tells us how far away they are. Second, you measure those same types of Cepheid stars in nearby galaxies, learning how far away those galaxies are. And lastly, in some of those galaxies, you’ll find a specific class of supernovae known as Type Ia supernovae. Those supernovae explode when a white dwarf, accreting matter from another orbiting star, reaches about 1.4 solar masses (a theory of the Indian-born Nobel laureate Chandrasekhar, who taught at the University of Chicago). One can see these Type Ia supernovae all over the universe: inside the Milky Way, as well as many billions of light-years away. With just these three steps, you can measure the expanding Universe, arriving at a result of 73.24 ± 1.74 km/s/Mpc.
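The three rungs can be sketched numerically. A minimal Python sketch, under stated assumptions: the Type Ia peak absolute magnitude ≈ −19.3 is the standard calibration, while the specific parallax and apparent-magnitude values are illustrative, not catalog data.

```python
# Rung 1: parallax. A star with parallax angle p (in arcseconds) lies at
# d = 1/p parsecs. (Values below are illustrative, not catalog data.)
def parallax_distance_pc(parallax_arcsec):
    return 1.0 / parallax_arcsec

# Rungs 2 and 3: standard candles. Knowing a candle's absolute magnitude M
# (calibrated on Cepheids, or the ~ -19.3 peak of Type Ia supernovae) and
# measuring its apparent magnitude m, the distance modulus gives:
#   m - M = 5 * log10(d_pc) - 5
def candle_distance_pc(m_apparent, M_absolute):
    return 10 ** ((m_apparent - M_absolute + 5) / 5)

print(parallax_distance_pc(0.1))        # 10 parsecs
print(candle_distance_pc(19.0, -19.3))  # ~4.6e8 pc: hundreds of megaparsecs
```

Each rung calibrates the next: parallax fixes nearby Cepheids, Cepheids fix the galaxies hosting supernovae, and supernovae reach across the observable universe.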

The other method makes all sorts of suppositions about the early universe. I view it as a miracle that its result is as close as it is: 66.9 km/s/Megaparsec…

Ethan concludes that: “Currently, the fact that distance ladder measurements say the Universe expands 9% faster than the leftover relic method is one of the greatest puzzles in modern cosmology. Whether that’s because there’s a systematic error in one of the two methods used to measure the expansion rate or because there’s new physics afoot is still undetermined, but it’s vital to remain open-minded to both possibilities. As improvements are made to parallax data, as more Cepheids are found, and as we come to better understand the rungs of the distance ladder, it becomes harder and harder to justify blaming systematics. The resolution to this paradox may be new physics, after all. And if it is, it just might teach us something about the dark side of the Universe.”
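The “9% faster” figure Ethan quotes follows directly from the two measurements given above:

```python
# The two headline measurements quoted in the text:
H0_ladder = 73.24   # km/s/Mpc, from the cosmic distance ladder
H0_relic = 66.9     # km/s/Mpc, from early-universe ("leftover relic") methods

tension = (H0_ladder - H0_relic) / H0_relic
print(f"{tension:.1%}")   # ~9.5%, the "9% faster" tension in the text
```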

My comment: The QUANTUM INTERACTION CHANGES EVERYTHING:

My own starting point is a revision of Quantum Mechanics: I simply assume that Newton was right (that’s supposed to be a joke, but with wisdom attached). Newton described his own theory of gravitation as absurd. (The basic inverse-square equation, F ∝ M1 M2/d², where d is the distance, came from the French astronomer Ismaël Boulliau, as Newton himself said. Actually this “Bullialdus” then spoiled his basically correct reasoning with a number of absurdities, which Newton corrected.)

Newton was actually scathing about his own theory: he said that no one with the slightest understanding of philosophy would assume gravitation to be instantaneous.

Newton’s condemnation was resolved by Laplace, a century later. Laplace just introduced a finite speed for the propagation of the gravitational field. That implied gravitational waves, for the same reason as a whip makes waves.

We are in a similar situation now. Present Quantum Physics assumes that the Quantum Interaction (the one which carries Quantum Entanglement) is instantaneous. This is absurd for exactly the same reason Newton presented, and Laplace took seriously, for gravitation.

Suppose, then, that the Quantum Interaction has a finite speed (it could be bigger than 10^23 c, where c is the speed of light).

Supposing this implies (after a number of logical and plausible steps) both Dark Matter and Dark Energy. It is worth looking at. But let’s remember that the telescope (which could have been invented in antiquity) was invented not to prove that the Moon was not a crystal ball, but simply to make money (by distinguishing first which sort of cargo was coming back from the Indies).

We see what we want to see, because that’s what we have been taught to see; we search for what we want to search for, because that’s what we have been taught to search for. Keeping an open mind is great, but a fully open mind is a most disturbing thing…

Patrice Aymé

“Proof” That Faster Than Light Communications Are Impossible Is False

December 16, 2017

There are theories everywhere, and the more ingrained they are, the more suspiciously they should be looked at. From the basic equations of relativity it is clear that if one adds speeds less than the speed of light, one will get a speed less than the speed of light. It is also clear that adding impulse to a mass will make it more massive, while its speed will asymptotically approach that of light (and, as I explained, the reason is intuitive, from Time Dilation).

The subject is not all sci-fi: modern cosmology brazenly assumes that space itself, after the alleged Big Bang, expanded at a speed of at least 10^23 c (something like one hundred thousand billion billions times the speed of light c). The grossest, yet simplest, argument is this: the observable universe is roughly 100 billion light-years across, and it is about ten billion years old. Thus it expanded at a minimum average clip of ten billion light-years every billion years: 100/10 = 10, an average of 10 c, according to standard cosmology. One could furiously imagine a spaceship somehow surfing on a wave of warped space, expanding for the same obscure reason as the Big Bang itself, that is…
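The back-of-envelope arithmetic in this paragraph, spelled out (using the text’s round figures; the more precise values are ~13.8 billion years and ~93 billion light-years):

```python
diameter_Gly = 100.0  # observable-universe diameter, billions of light-years
age_Gyr = 10.0        # age, billions of years (the text's round figure)

# Average rate at which that diameter grew: light-years per year is
# exactly a multiple of c, so this number is already in units of c.
avg_rate_in_c = diameter_Gly / age_Gyr
print(avg_rate_in_c)  # 10.0 -> an average of 10 c, as claimed
```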

The question naturally arises whether velocities which are greater than that of light could ever possibly be obtained in other ways. For example, are there communication speeds faster than light? (Throwing some material across will not work: its mass will increase, while its speed stays less than c.)

Textbooks say it’s not possible. There is actually a “proof” of that alleged impossibility, dating all the way back to Einstein (1907) and Tolman (1917). The mathematics are trivial (they are reproduced in my picture below). But the interpretation is apparently less so. Wikipedia weirdly claims that faster than light communications would allow travel back in time. No. One could synchronize all clocks on all planets in the galaxy, and having faster than light communications would not change anything. Why? Time is local; faster than light data travel is nonlocal.

The problem of faster than light communications can be attacked in the following manner.

Consider two points A and B on the X axis of the system S, and suppose that some impulse originates at A, travels to B with the velocity u, and at B produces some observable phenomenon, the starting of the impulse at A and the resulting phenomenon at B thus being connected by the relation of cause and effect. The time elapsing between the cause and its effect, as measured in the units of system S, is written out in the calligraphy below. Then I use the usual Relativity formula (due to Lorentz) for the time elapsed in S′:

Equations help, but they are neither the beginning, nor the end of a story. Just an abstraction of it. The cult of equations is naive, interpretation is everything. The same thing, more generally, holds for models.
As Tolman put it in 1917: “Let us suppose now that there are no limits to the possible magnitude of the velocities u and V, and in particular that the causal impulse can travel from A to B with a velocity u greater than that of light. It is evident that we could then take a velocity u great enough that uV/c² will be greater than one, so that Δt′ would become negative. In other words, for an observer in system S′ the effect which occurs at B would precede in time its cause which originates at A.”
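Tolman’s sign flip can be checked numerically. A minimal sketch in units with c = 1; `dt_prime` is the standard Lorentz transform of the elapsed time for a causal impulse of speed u:

```python
import math

def dt_prime(dt, u, V, c=1.0):
    """Time between cause (at A) and effect (at B) as measured in S',
    for a causal impulse of speed u in S and frame velocity V < c:
    dt' = dt * (1 - u*V/c**2) / sqrt(1 - V**2/c**2)."""
    return dt * (1 - u * V / c**2) / math.sqrt(1 - V**2 / c**2)

# Subluminal impulse: cause precedes effect in every frame.
print(dt_prime(1.0, u=0.5, V=0.8))  # ~ +1.0

# Superluminal impulse with u*V/c^2 > 1: dt' goes negative --
# Tolman's "effect precedes cause" for the S' observer.
print(dt_prime(1.0, u=2.0, V=0.8))  # ~ -1.0
```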

I quote Tolman because he is generally viewed as the one having definitively established the impossibility of faster than light communications. Tolman, though, is not so sure; in his next sentence he turns wishy-washy: “Such a condition of affairs might not be a logical impossibility; nevertheless its extraordinary nature might incline us to believe that no causal impulse can travel with a velocity greater than that of light.”

Actually it is an effect familiar to anyone who has seen a movie run in reverse. Causality apparently running in reverse is no more surprising than the fact that two events at x1 and x2 which are simultaneous in S are separated in S′ by a time (x1 − x2) V / (c² √(1 − V²/c²)). That introduces a sort of fake, or apparent, causality: sometimes this before that, sometimes that before this.

(The computation is straightforward and found in Tolman’s own textbook; it originated with Henri Poincaré.[9][10] In 1898 Poincaré argued that the postulate of light speed constancy in all directions is useful to formulate physical laws in a simple way. He also showed that the definition of simultaneity of events at different places is only a convention.[11] Notice that, in the case of simultaneity, the signs of V and (x1 − x2) matter. Basically, depending upon how V moves, light in S going to S′ takes more time to catch up with the moving frame, and the more so the further away it is: the same exact effect which explains the null result of the Michelson-Morley interferometer. There is an underlying logic beneath all of this, and it is always the same.)
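The simultaneity offset, and its dependence on the signs of V and (x1 − x2), can be computed directly. A sketch with the c² factor written explicitly, in units with c = 1 (standard Lorentz transformation, illustrative event positions):

```python
import math

def simultaneity_offset(x1, x2, V, c=1.0):
    """For two events simultaneous in S at positions x1 and x2, their
    time separation t1' - t2' in S' is
    -V*(x1 - x2) / (c**2 * sqrt(1 - V**2/c**2)).
    The sign depends on the signs of both V and (x1 - x2)."""
    return -V * (x1 - x2) / (c**2 * math.sqrt(1 - V**2 / c**2))

print(simultaneity_offset(1.0, 0.0, V=0.6))   # ~ -0.75: event 1 earlier in S'
print(simultaneity_offset(1.0, 0.0, V=-0.6))  # ~ +0.75: order reversed
```

Flipping either the frame velocity or the spatial order of the events flips which event “comes first” in S′: that is the “sometimes this before that, sometimes that before this” of the text.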

Tolman’s argumentation about the impossibility of faster than light communications is, in the end, purely philosophical, and fully inconsistent with the closely related, and fully mainstream, relativity of simultaneity.

Poincaré in 1900 proposed the following convention for defining clock synchronisation: two observers A and B, moving through space (which Poincaré called the aether), synchronise their clocks by means of optical signals. They believe themselves to be at rest in space (“the aether”), from not moving relative to distant galaxies or the Cosmic Microwave Background, and assume that the speed of light is constant in all directions. Therefore, they have to consider only the transmission time of the signals, and then cross their observations to examine whether their clocks are synchronous.

“Let us suppose that there are some observers placed at various points, and they synchronize their clocks using light signals. They attempt to adjust the measured transmission time of the signals, but they are not aware of their common motion, and consequently believe that the signals travel equally fast in both directions. They perform observations of crossing signals, one traveling from A to B, followed by another traveling from B to A.” 

In 1904 Poincaré illustrated the same procedure in the following way:

“Imagine two observers who wish to adjust their timepieces by optical signals; they exchange signals, but as they know that the transmission of light is not instantaneous, they are careful to cross them. When station B perceives the signal from station A, its clock should not mark the same hour as that of station A at the moment of sending the signal, but this hour augmented by a constant representing the duration of the transmission. Suppose, for example, that station A sends its signal when its clock marks the hour 0, and that station B perceives it when its clock marks the hour t. The clocks are adjusted if the slowness equal to t represents the duration of the transmission, and to verify it, station B sends in its turn a signal when its clock marks 0; then station A should perceive it when its clock marks t. The timepieces are then adjusted. And in fact they mark the same hour at the same physical instant, but on the one condition, that the two stations are fixed. Otherwise the duration of the transmission will not be the same in the two senses, since the station A, for example, moves forward to meet the optical perturbation emanating from B, whereas the station B flees before the perturbation emanating from A. The watches adjusted in that way will not mark, therefore, the true time; they will mark what may be called the local time, so that one of them will be slow of the other.”[13]

This Poincaré (“–Einstein”) synchronisation was used by telegraphers as early as the mid-nineteenth century. It would make it possible to cover the galaxy with synchronized clocks (although local times will differ a bit depending upon the motion of stars, and in particular where in the galactic rotation curve a star sits). Transmitting instantaneous signals in that network would not affect causality. Ludicrously, Wikipedia asserts that faster than light signals would make “Bertha” rich (!!!). That comes simply from Wikipedia getting thoroughly confused, allowing faster than light signals for some data and not for others, thus giving an advantage to some and not others.
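Poincaré’s cross-signal procedure can be simulated in a preferred frame. A sketch with c = 1; the station separation and common drift speed are illustrative values, not physical data:

```python
# Stations A and B, separation L, both drifting at speed v along the AB axis
# in a preferred ("aether") frame. They assume light takes equal times each
# way; the resulting clock error is Lorentz's "local time", ~ v*L/c**2.
c = 1.0
L = 1.0   # illustrative separation
v = 0.1   # illustrative common drift speed

t_AB = L / (c - v)            # light chasing the receding station B
t_BA = L / (c + v)            # light meeting the approaching station A
assumed_one_way = (t_AB + t_BA) / 2   # what both stations take the trip to be

offset = (t_AB - t_BA) / 2    # how far off B's "synchronized" clock ends up
print(offset, v * L / c**2)   # ~0.101 vs the first-order value 0.1
```

The computed offset matches Lorentz’s local time term v·L/c² to first order in v/c, which is exactly the connection the Wikipedia passage further down describes.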

***

Quantum Entanglement (QE) enables at-a-distance changes of Quantum states:

(It comes in at least three types of increasing strength.) Quantum Entanglement, as known today, operates from Quantum state to Quantum state, but we cannot control which Quantum state a particle starts in, so we cannot use QE for communicating faster than light (because we don’t control what we write, so to speak; we write with states, so we send gibberish).

This argument is formalized in a “No Faster Than Light Communication theorem”. However, IMHO, the proof contains massive loopholes (it assumes that there is no Sub Quantum Reality whatsoever, nor could there ever be one, and thus that the unlikely QM axioms are forever absolutely true beyond all possible redshifts you could possibly imagine, inter alia). So this is not the final story here. QE enables, surprisingly, the Quantum Radar (something I didn’t see coming). And it is not clear to me that we have absolutely no statistical control over states, thus that we can’t use what Schrödinger, building on the EPR thought experiment, called “Quantum Steering” to communicate at a distance. Quantum Radar and Quantum Steering are now implemented in real devices. They use faster-than-light effects in their inner machinery.

As the preceding showed, the supposed contradiction of faster-than-light communications with Relativity is just an urban legend. It makes the tribe of physicists more priestly, as they evoke a taboo nobody can understand, for the good reason that it makes no sense. It is also intellectually comfortable, as it simplifies brainwork (taboos always do). But it is a lie. And it is high time this civilization switched to the no-more-lies theorem, lest it finish roasted, poisoned, flooded, weaponized and demonized.

Patrice Ayme’

Technical addendum:

https://en.wikipedia.org/wiki/Relativity_of_simultaneity

As Wikipedia itself puts it, weasel-style, trying to insinuate that Einstein brought something very significant to the debate, namely the eradication of the aether (but the aether came back soon after, and there are now several “reasons” for it; the point being that, as Poincaré suspected, there is a notion of absolute rest, and we now know this for several reasons: the CMB, the Unruh effect, etc.):

In 1892 and 1895, Hendrik Lorentz used a mathematical method called “local time” t′ = t − vx/c² for explaining the negative aether drift experiments.[5] However, Lorentz gave no physical explanation of this effect. This was done by Henri Poincaré who already emphasized in 1898 the conventional nature of simultaneity and who argued that it is convenient to postulate the constancy of the speed of light in all directions. However, this paper does not contain any discussion of Lorentz’s theory or the possible difference in defining simultaneity for observers in different states of motion.[6][7] This was done in 1900, when Poincaré derived local time by assuming that the speed of light is invariant within the aether. Due to the “principle of relative motion”, moving observers within the aether also assume that they are at rest and that the speed of light is constant in all directions (only to first order in v/c). Therefore, if they synchronize their clocks by using light signals, they will only consider the transit time for the signals, but not their motion in respect to the aether. So the moving clocks are not synchronous and do not indicate the “true” time. Poincaré calculated that this synchronization error corresponds to Lorentz’s local time.[8][9] In 1904, Poincaré emphasized the connection between the principle of relativity, “local time”, and light speed invariance; however, the reasoning in that paper was presented in a qualitative and conjectural manner.[10][11]

Albert Einstein used a similar method in 1905 to derive the time transformation for all orders in v/c, i.e., the complete Lorentz transformation. Poincaré obtained the full transformation earlier in 1905 but in the papers of that year he did not mention his synchronization procedure. This derivation was completely based on light speed invariance and the relativity principle, so Einstein noted that for the electrodynamics of moving bodies the aether is superfluous. Thus, the separation into “true” and “local” times of Lorentz and Poincaré vanishes – all times are equally valid and therefore the relativity of length and time is a natural consequence.[12][13][14]

… Except, of course, absolute relativity of length and time is not really true: everywhere in the universe, locally at-rest frames can be defined, in several manners (optical, mechanical, gravitational, and even using a variant of the Quantum Field Theory Casimir Effect). All other frames are in trouble, so absolute motion can be detected. The hope of Einstein, in devising General Relativity, was to explain inertia, but he ended up with just a modification of the 1800 CE Bullialdus-Newton-Laplace theory… (Newton knew his instantaneous gravitation made no sense, and condemned it severely, so Laplace introduced a gravitation speed, and thus gravitational waves, and Poincaré made them relativistic in 1905… Einstein got the applause…)

CONTINUUM FROM DISCONTINUUM

December 1, 2017

Discontinuing The Continuum, Replacing It By Quantum Entanglement Of Granular Substrate:

Is the universe granular? Discontinuous? Is spacetime somehow emergent? I do have an integrated solution to these quandaries, using basic mass-energy physics, and quantum entanglement. (The two master ideas I use here are mine alone, and if I am right, will change physics radically in the fullness of time.)  

First let me point out that worrying about this is not just a pet lunacy of mine. Edward Witten is the only physicist to have won the Fields Medal, mathematics’ top prize, and is viewed by many as the world’s top physicist (I have met with him). He gave a very interesting interview to Quanta Magazine: A Physicist’s Physicist Ponders the Nature of Reality.

“Edward Witten reflects on the meaning of dualities in physics and math, emergent space-time, and the pursuit of a complete description of nature.”

Witten ponders, I answer.

Quantum Entanglement enables one to build existence over extended space, with a wealth of states growing exponentially beyond granular space

Witten: “I tend to assume that space-time and everything in it are in some sense emergent. By the way, you’ll certainly find that that’s what Wheeler expected in his essay.” [Information, Physics, Quantum is Wheeler’s 1989 essay propounding the idea that the physical universe arises from information, which he dubbed “it from bit”. He should have called it “It from Qubit”. But the word “Qubit” didn’t exist yet; nor, really, the concept, as physicists had not yet realized the importance of entanglement and nonlocality in building the universe: they viewed them more as “spooky” oddities on the verge of self-contradiction…]

Edward Witten: As you’ll read, he [Wheeler] thought the continuum was wrong in both physics and math. He did not think one’s microscopic description of space-time should use a continuum of any kind — neither a continuum of space nor a continuum of time, nor even a continuum of real numbers. On the space and time, I’m sympathetic to that. On the real numbers, I’ve got to plead ignorance or agnosticism. It is something I wonder about, but I’ve tried to imagine what it could mean to not use the continuum of real numbers, and the one logician I tried discussing it with didn’t help me.”

***

Well, I spent much more time studying logic than Witten, a forlorn, despised and alienating task. (Yet, when one is driven by knowledge, nothing beats an Internet-connected cave in the desert, far from the distracting trivialities!) Studying fundamental logic, an exercise mathematicians, let alone physicists, tend to detest, brought me enlightenment, mostly because it shows how relative logic is, and how it can take thousands of years to make simple, obvious steps. How to solve this lack of logical imagination affecting the tremendous mathematician cum physicist Witten? Simple. From energy considerations, there is an event horizon to how large an expression can be written. Thus, in particular, there is a limit to the size of a number. Basically, a number can’t be larger than the universe.

https://patriceayme.wordpress.com/2011/10/10/largest-number/

This also holds for the continuum: just as numbers can’t be arbitrarily large, neither can the digital expression of a given number be arbitrarily long. In other words, irrational numbers don’t exist (I will detail in the future what is wrong with the 24-century-old proof, step by step).

As the world consists of sets of entangled quantum states (also known as “qubits”), the number of states can get much larger than the world of numbers. For example, a set of 300 entangled up-or-down spins presents 2^300 states (much larger than the number of atoms in the observable universe, which is 100 billion light-years across). Such sets (“quantum simulators”) have basically been implemented in the lab.
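The state-count comparison can be checked directly; the ~10^80 atom count used below is the commonly quoted rough estimate for the observable universe:

```python
# 300 two-level systems (spins up/down), fully entangled, span a Hilbert
# space with 2**300 basis states.
n_states = 2 ** 300
print(f"2^300 ~ {float(n_states):.2e}")  # ~2.04e90

# Commonly quoted rough count of atoms in the observable universe:
atoms_in_universe = 10 ** 80
print(n_states > atoms_in_universe)      # True: states dwarf atoms
```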

Digital computers only work with finite expressions. Thus practical, effective logic already uses only finite mathematics, and finite logic. Thus there is no difficulty in using only finite mathematics. Physically, it has the advantage of removing many infinities (although not renormalization!).

Quantum entanglement creates a much richer spacetime than the granular subjacent space. Thus an apparently continuous spacetime is emergent from granular space. Let’s go back to the example above: 300 spins, in a small space, once quantum entangled, give a much richer quantum spacetime of 2^300 states.

Consider again a set S of 300 particles (a practical case would be 300 atoms with spins up or down). If a set of “particles” are all entangled together, I will call that an EQN (Entangled Quantum Network). Now consider an incoming wave W (typically a photonic or gravitational wave; but it could be a phonon, etc.). Classically, if the 300 particles were… classical, W would have little probability of interacting with S, because it has ONLY 300 “things”, 300 entities, to interact with. Quantum Mechanically, though, it has 2^300 “things”, all the states of the EQN, to interact with. Thus, a much higher probability of interacting. Certainly the wave W is more likely to interact with 2^300 entities than with 300, in the same space! (The classical computations can’t be made from scratch by me, or anybody else; but the classical computation, depending on the “transparency” of a film of 300 particles, would actually depend upon the Quantum computation nature makes discreetly, yet pervasively!)

EQNs make (mathematically at least) an all-pervasive, “volume”-occupying wave. I wrote “volume” in quotes, because some smart asses, very long ago (nearly a century), pointed out that the Quantum Waves live in “PHASE” space, and thus are NOT “real” waves. Whatever that means: the Quantum volumes/spaces in which Quantum Waves compute can be very complicated, beyond the electoral gerrymandering of congressional districts in the USA! In particular, they don’t have to be 3D “volumes”. That doesn’t make them less “real”. To allude to well-established mathematics: a segment is a one-dimensional volume. A space-filling curve is also a sort of volume, as is a fractal (and it has a fractal dimension).

Now quantum entanglement has been demonstrated over thousands of kilometers, and mass (so to speak) quantum entanglement has been demonstrated over 500 nanometers (5,000 times the size of an atom). One has to understand that solids are held together by quantum entanglement. So there is plenty enough entanglement to generate spaces of apparently continuous possibilities, and even consciousness… from a fundamentally granular space.

Entanglement, or how to get continuum from discontinuum. (To sound like Wheeler.)

The preceding seems pretty obvious to me. Once those truths get around, everybody will say: ‘But of course, that’s so obvious! Didn’t Witten say that first?’

No, he didn’t.

You read it here first.

Granular space giving rise to practically continuous spacetime is an idea where deep philosophy proved vastly superior to the shortsightedness of vulgar mathematics.

Patrice Ayme’

