Archive for the ‘Physics’ Category

Discrepancy In Universe’s Expansion & Quantum Interaction

January 17, 2018

In “New Dark Matter Physics Could Solve The Expanding Universe Controversy“, Ethan Siegel points out that:

“Multiple teams of scientists can’t agree on how fast the Universe expands. Dark matter may unlock why.
There’s an enormous controversy in astrophysics today over how quickly the Universe is expanding. One camp of scientists, the same camp that won the Nobel Prize for discovering dark energy, measured the expansion rate to be 73 km/s/Mpc, with an uncertainty of only 2.4%. But a second method, based on the leftover relics from the Big Bang, reveals an answer that’s incompatibly lower at 67 km/s/Mpc, with an uncertainty of only 1%. It’s possible that one of the teams has an unidentified error that’s causing this discrepancy, but independent checks have failed to show any cracks in either analysis. Instead, new physics might be the culprit. If so, we just might have our first real clue to how dark matter might be detected.

Twenty years ago, several teams published peer-reviewed evidence that we live in an ever faster expanding universe. The Physics Nobel was given for that to a Berkeley team and to an Australian team. There are now several methods to establish this accelerating expansion, and they (roughly) agree.

Notice the striking differences between different models in the past; only a Universe with dark energy matches our observations. Possible fates of the expanding Universe which used to be considered were, ironically enough, only the three on the left, which are now excluded.  Image credit: The Cosmic Perspective / Jeffrey O. Bennett, Megan O. Donahue, Nicholas Schneider and Mark Voit.

Three main classes of possibilities for why the Universe appears to accelerate have been considered:

  1. Vacuum energy, like a cosmological constant, is energy inherent to space itself, and drives the Universe’s expansion. (This idea goes back to Einstein, who introduced a “Cosmological Constant” in the basic gravitational equation… to make the universe static, a weird idea akin to the crystal spheres of Ptolemaic astronomy; later Einstein realized that, had he not done that, he could have posed as real smart by predicting the expansion of the universe… So he called it, in a self-congratulating way, his “greatest mistake”… However, in the last 20 years, the “greatest mistake” has come to be viewed as a master stroke…).
  2. Dynamical dark energy, driven by some kind of field that changes over time, could lead to differences in the Universe’s expansion rate depending on when/how you measure it. (Also called “quintessence”; not really different from 1), from my point of view!)
  3. General Relativity could be wrong, and a modification to gravity might explain what appears to us as an apparent acceleration. (However, the basic idea of the theory of gravitation is so simple, it’s hard to see how it could be wrong, as long as one doesn’t introduce Quantum effects… Which is exactly what I do! In my own theory, said effects occur only at large cosmic distances, on the scale of large galaxies.)

Ethan: “At the dawn of 2018, however, the controversy over the expanding Universe might threaten that picture. Our Universe, made up of 68% dark energy, 27% dark matter, and just 5% of all the “normal” stuff (including stars, planets, gas, dust, plasma, black holes, etc.), should be expanding at the same rate regardless of the method you use to measure it. At least, that would be the case if dark energy were truly a cosmological constant, and if dark matter were truly cold and collisionless, interacting only gravitationally. If everyone measured the same rate for the expanding Universe, there would be nothing to challenge this picture, known as standard (or “vanilla”) ΛCDM.

But everyone doesn’t measure the same rate.”

The standard, oldest method of measuring the Hubble cosmic expansion rate is known as the cosmic distance ladder. The simplest version has only three rungs. First, you measure the distances to nearby stars directly, through parallax, the small variation of a star’s apparent direction during the year as the Earth goes around its orbit. Most importantly, you measure in this way the distances to long-period Cepheid stars. Cepheids are “standard candles”: their luminosities vary periodically, and the period of variation reveals their intrinsic power, so we can tell how far away they are from how bright they appear. Second, you then measure the same types of Cepheid stars in nearby galaxies, learning how far away those galaxies are. And lastly, some of those galaxies host a specific class of supernovae known as Type Ia supernovae. Those supernovae explode exactly when they accrete to 1.4 solar masses from a companion star (a theory of the Indian-born Nobel laureate Chandrasekhar, who taught at the University of Chicago). One can see these Type Ia supernovae all over the universe, inside the Milky Way as well as many billions of light-years away. With just these three steps, you can measure the expanding Universe, arriving at a result of 73.24 ± 1.74 km/s/Mpc.
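The arithmetic of the last rung is easy to make concrete. Below is a minimal sketch in Python, with made-up but representative distances and recession velocities (the real analyses propagate errors over thousands of objects); it shows how the ladder lands near 73 km/s/Mpc, about 9% above the relic value:

```python
# Illustrative (made-up) numbers: how the last rung of the distance ladder
# turns distances and recession velocities into a Hubble constant.
galaxies = [
    # (distance in megaparsecs, from Cepheid-calibrated Type Ia supernovae;
    #  recession velocity in km/s, from the galaxy's redshift)
    (100.0, 7310.0),
    (250.0, 18350.0),
    (400.0, 29200.0),
]

# Each galaxy gives an estimate H0 = v / d; average them.
h0_ladder = sum(v / d for d, v in galaxies) / len(galaxies)
h0_relics = 67.0   # km/s/Mpc, the value inferred from the Big Bang relics

print(f"distance-ladder H0 ~ {h0_ladder:.1f} km/s/Mpc")
print(f"relic-based H0     ~ {h0_relics:.1f} km/s/Mpc")
print(f"discrepancy        ~ {100 * (h0_ladder / h0_relics - 1):.0f} %")
```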

The other method makes all sorts of suppositions about the early universe. I view it as a miracle that it comes as close as it does: 66.9 km/s/Megaparsec…

Ethan concludes that: “Currently, the fact that distance ladder measurements say the Universe expands 9% faster than the leftover relic method is one of the greatest puzzles in modern cosmology. Whether that’s because there’s a systematic error in one of the two methods used to measure the expansion rate or because there’s new physics afoot is still undetermined, but it’s vital to remain open-minded to both possibilities. As improvements are made to parallax data, as more Cepheids are found, and as we come to better understand the rungs of the distance ladder, it becomes harder and harder to justify blaming systematics. The resolution to this paradox may be new physics, after all. And if it is, it just might teach us something about the dark side of the Universe.”

My comment: The QUANTUM INTERACTION CHANGES EVERYTHING:

My own starting point is a revision of Quantum Mechanics: I simply assume that Newton was right (that’s supposed to be a joke, but with wisdom attached). Newton described his own theory of gravitation as absurd. (The basic equation, F ∝ M1 M2/d², where d is the distance, came from a French astronomer, Ismaël Boulliau, as Newton himself said. Actually this “Bullialdus” then spoiled his basically correct reasoning with a number of absurdities, which Newton corrected.)

Newton was actually insulting about his own theory. He said no one with the slightest understanding of philosophy would assume that gravitation was instantaneous.

Newton’s condemnation was resolved by Laplace, a century later. Laplace just introduced a finite speed for the propagation of the gravitational field. That implied gravitational waves, for the same reason as a whip makes waves.

We are in a similar situation now. Present Quantum Physics assumes that the Quantum Interaction (the one which carries Quantum Entanglement) is instantaneous. This is absurd for exactly the same reason Newton presented, and Laplace took seriously, for gravitation.

Suppose that the Quantum Interaction has a finite speed (it could be bigger than 10^23 c, where c is the speed of light).

This supposition implies (after a number of logical and plausible steps) both Dark Matter and Dark Energy. It is worth looking at. But let’s remember that the telescope (which could have been invented in antiquity) was invented not to prove that the Moon was not a crystal ball, but simply to make money (by distinguishing first which sort of cargo was coming back from the Indies).

We see what we want to see, because that’s what we have been taught to see; we search for what we want to search for, because that’s what we have been taught to search for. Keeping an open mind is great, but a fully open mind is a most disturbing thing…

Patrice Aymé


“Proof” That Faster Than Light Communications Are Impossible Is False

December 16, 2017

There are theories everywhere, and the more ingrained they are, the more suspiciously they should be looked at. From the basic equations of relativity it is clear that if one adds speeds less than the speed of light, one will get a speed less than the speed of light. It is also clear that adding impulse to a mass will make it more massive, while its speed will asymptotically approach that of light (and, as I explained, the reason is intuitive, from Time Dilation).

The subject is not all sci-fi: modern cosmology brazenly assumes that space itself, after the alleged Big Bang, expanded at a speed of at least 10^23 c (something like a hundred thousand billion billion times the speed of light c). The grossest, yet simplest, argument that space expands faster than light is this: the observable universe is roughly 100 billion light-years across, and it is ten billion years old. Thus it expanded at a minimum average clip of ten billion light-years every billion years: 100c/10 = 10c, according to standard cosmology. One could furiously imagine a spaceship somehow surfing on a wave of warped space, expanding for the same obscure reason.
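For what it is worth, that back-of-the-envelope arithmetic, spelled out with the rounded figures used above (not precision cosmology):

```python
# Back-of-the-envelope expansion arithmetic from the paragraph above.
observable_diameter_ly = 100e9   # ~100 billion light-years across
age_yr = 10e9                    # ~10 billion years, rounded as in the text

# Average growth of the diameter, in light-years per year = multiples of c.
average_speed_in_c = observable_diameter_ly / age_yr
print(f"average expansion speed ~ {average_speed_in_c:.0f} c")   # ~10 c
```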

The question naturally arises whether velocities which are greater than that of light could ever possibly be obtained in other ways. For example, are there communication speeds faster than light? (Throwing some material across will not work: its mass will increase, while its speed stays less than c.)

Textbooks say it’s not possible. There is actually a “proof” of that alleged impossibility, dating all the way back to Einstein (1907) and Tolman (1917). The mathematics is trivial (it is reproduced in my picture below). But the interpretation is apparently less so. Wikipedia weirdly claims that faster than light communications would allow travel back in time. No. One could synchronize all clocks on all planets in the galaxy, and having faster than light communications would not change anything. Why? Time is local, faster than light data travel is nonlocal.

The problem of faster than light communications can be attacked in the following manner.

Consider two points A and B on the X axis of the system S, and suppose that some impulse originates at A, travels to B with the velocity u, and at B produces some observable phenomenon; the starting of the impulse at A and the resulting phenomenon at B are thus connected by the relation of cause and effect. The time elapsing between the cause and its effect, as measured in the units of system S, is written out below. Then I use the usual Relativity formula (due to Lorentz) for the time elapsed in S’:
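The picture is not reproduced here, so here is a minimal reconstruction of the standard computation it shows, assuming only the usual Lorentz transformation, with the impulse leaving A at time t1 and reaching B at time t2, so that Δx = x_B − x_A = uΔt:

```latex
\Delta t = t_2 - t_1 = \frac{x_B - x_A}{u},
\qquad
\Delta t' = \frac{\Delta t - \frac{V\,\Delta x}{c^2}}{\sqrt{1 - \frac{V^2}{c^2}}}
          = \Delta t \, \frac{1 - \frac{uV}{c^2}}{\sqrt{1 - \frac{V^2}{c^2}}}.
```

So Δt’ turns negative, effect before cause in S’, exactly when uV/c² exceeds one, which is Tolman’s point below.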

Equations help, but they are neither the beginning, nor the end of a story. Just an abstraction of it. The cult of equations is naive, interpretation is everything. The same thing, more generally, holds for models.
As Tolman put it in 1917: “Let us suppose now that there are no limits to the possible magnitude of the velocities u and V, and in particular that the causal impulse can travel from A to B with a velocity u greater than that of light. It is evident that we could then take a velocity u great enough that uV/c² will be greater than one, so that Δt′ would become negative. In other words, for an observer in system S′ the effect which occurs at B would precede in time its cause which originates at A.”

I quote Tolman, because he is generally viewed as the one having definitively established the impossibility of faster than light communications. Tolman, though, is not so sure; in his next sentence he turns out wishy washy: “Such a condition of affairs might not be a logical impossibility; nevertheless its extraordinary nature might incline us to believe that no causal impulse can travel with a velocity greater than that of light.”

Actually it is an effect those who have seen movies running in reverse are familiar with. Causality apparently running in reverse is no more surprising than the fact that two events at x1 and x2 which are simultaneous in S are separated, in S′, by a time (x1 − x2)(V/c²)/√(1 − V²/c²). That introduces a sort of fake, or apparent, causality: sometimes this before that, sometimes that before this.

(The computation is straightforward and found in Tolman’s own textbook; it originated with Henri Poincaré.[9][10] In 1898 Poincaré argued that the postulate of light speed constancy in all directions is useful to formulate physical laws in a simple way. He also showed that the definition of simultaneity of events at different places is only a convention.[11]) Notice that, in the case of simultaneity, the signs of V and (x1 − x2) matter. Basically, depending upon how V moves, light in S going to S′ takes more time to catch up with the moving frame, and the more so the further it is; the same exact effect explains the nil result of the Michelson-Morley interferometer. There is an underlying logic below all of this, and it’s always the same.

Tolman’s argumentation about the impossibility of faster than light communications is, in the end, purely philosophical, and fully inconsistent with the closely related, and fully mainstream, relativity of simultaneity.

Poincaré in 1900 proposed the following convention for defining clock synchronisation: two observers, A and B, which are moving through space (which Poincaré called the aether), synchronise their clocks by means of optical signals. They believe themselves to be at rest in space (“the aether”), since they are not moving relative to distant galaxies or the Cosmic Radiation Background, and they assume that the speed of light is constant in all directions. Therefore, they have to consider only the transmission time of the signals, and then cross their observations to examine whether their clocks are synchronous.

“Let us suppose that there are some observers placed at various points, and they synchronize their clocks using light signals. They attempt to adjust the measured transmission time of the signals, but they are not aware of their common motion, and consequently believe that the signals travel equally fast in both directions. They perform observations of crossing signals, one traveling from A to B, followed by another traveling from B to A.” 

In 1904 Poincaré illustrated the same procedure in the following way:

“Imagine two observers who wish to adjust their timepieces by optical signals; they exchange signals, but as they know that the transmission of light is not instantaneous, they are careful to cross them. When station B perceives the signal from station A, its clock should not mark the same hour as that of station A at the moment of sending the signal, but this hour augmented by a constant representing the duration of the transmission. Suppose, for example, that station A sends its signal when its clock marks the hour 0, and that station B perceives it when its clock marks the hour t. The clocks are adjusted if the slowness equal to t represents the duration of the transmission, and to verify it, station B sends in its turn a signal when its clock marks 0; then station A should perceive it when its clock marks t. The timepieces are then adjusted. And in fact they mark the same hour at the same physical instant, but on the one condition, that the two stations are fixed. Otherwise the duration of the transmission will not be the same in the two senses, since the station A, for example, moves forward to meet the optical perturbation emanating from B, whereas the station B flees before the perturbation emanating from A. The watches adjusted in that way will not mark, therefore, the true time; they will mark what may be called the local time, so that one of them will be slow of the other.”[13]

This Poincaré (“–Einstein”) synchronisation was used by telegraphers as early as the mid-nineteenth century. It would allow covering the galaxy with synchronized clocks (although local times will differ a bit depending upon the motion of stars, and in particular where in the galactic rotation curve a star sits). Transmitting instantaneous signals in that network would not affect causality. Ludicrously, Wikipedia asserts that faster than light signals would make “Bertha” rich (!!!). That comes simply from Wikipedia getting thoroughly confused, allowing faster than light signals for some data and not for other data, thus giving an advantage to some, and not others.
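A toy, first-order simulation of that convention is easy to write (the numbers below are mine, purely illustrative): it shows that the “slowness” of one clock relative to the other is just Lorentz’s local-time term, roughly vL/c².

```python
# Toy (first-order) simulation of the Poincare synchronisation convention.
# Two stations, A and B, drift through "absolute" space at speed v; they
# exchange light signals and assume, wrongly, that each one-way trip takes
# the same time. B's clock then ends up offset by Lorentz's "local time".
c = 299_792_458.0      # speed of light, m/s
v = 30_000.0           # drift speed of the two stations, m/s (illustrative)
L = 1.0e12             # separation of A and B along the motion, m (illustrative)

# Actual one-way travel times in the rest frame of space ("the aether"):
t_A_to_B = L / (c - v)     # B flees the signal coming from A
t_B_to_A = L / (c + v)     # A moves toward the signal coming from B

# The stations assume both legs take the same time (half the round trip),
# so B's clock ends up offset from "true" time by half the asymmetry:
offset = (t_A_to_B - t_B_to_A) / 2.0
print(f"synchronisation offset : {offset:.6e} s")
print(f"local-time term v*L/c^2: {v * L / c**2:.6e} s")
```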

***

Quantum Entanglement (QE) enables at-a-distance changes of Quantum states:

(It comes in at least three types of increasing strength.) Quantum Entanglement, as known today, goes from Quantum state to Quantum state, but we cannot control in which Quantum state the particle will be to start with, so we cannot use QE for communicating faster than light (because we don’t control what we write, so to speak, as we write with states, so we send gibberish).
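For reference, here is the bare-bones calculation behind that “gibberish” argument, a minimal numpy sketch of my own (not anyone’s official code): whatever measurement Bob chooses on his half of a maximally entangled pair, Alice’s local statistics stay exactly 50/50, so no message gets through this way.

```python
import numpy as np

# For a maximally entangled pair, the statistics seen on one side do not
# depend on which measurement the other side chooses: this is the core of
# the standard no-faster-than-light-communication argument.
phi_plus = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)   # (|00> + |11>)/sqrt(2)
rho = np.outer(phi_plus, phi_plus)                        # density matrix, 4x4

def bob_projectors(theta):
    """Projectors for Bob measuring spin along an axis at angle theta."""
    up = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    down = np.array([-np.sin(theta / 2), np.cos(theta / 2)])
    return [np.outer(b, b) for b in (up, down)]

alice_0 = np.diag([1.0, 0.0])                             # projector |0><0| on Alice
for theta in (0.0, 0.7, np.pi / 2):
    # Probability that Alice finds |0>, summed over Bob's two outcomes.
    p_alice_0 = sum(
        np.trace(np.kron(alice_0, P_bob) @ rho).real
        for P_bob in bob_projectors(theta)
    )
    print(f"Bob angle {theta:.2f} rad -> P(Alice reads 0) = {p_alice_0:.3f}")
# Always 0.500: Bob's choice of setting leaves Alice's marginals untouched.
```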

This argument is formalized in a “No Faster Than Light Communication theorem”. However, IMHO, the proof contains massive loopholes (the proof assumes that there is no Sub Quantum Reality, whatsoever, nor could there ever be some, ever, and thus that the unlikely QM axioms are forever absolutely true beyond all possible redshifts you could possibly imagine, inter alia). So this is not the final story here. QE enables, surprisingly, the Quantum Radar (something I didn’t see coming). And it is not clear to me that we have absolutely no control on states statistically, thus that we can’t use what Schrödinger, building on the EPR thought experiment, called “Quantum Steering” to communicate at a distance. Quantum Radar and Quantum Steering are now enacted through real devices. They use faster-than-light in their inner machinery.

As the preceding showed, the supposed contradiction of faster-than-light communications with Relativity is just an urban legend. It makes the tribe of physicists more priestly, as they evoke a taboo nobody can understand, for the good reason that it makes no sense; and it is intellectually comfortable, as it simplifies brainwork, as taboos always do; but it is a lie. And it is high time this civilization switched to the no-more-lies theorem, lest it finish roasted, poisoned, flooded, weaponized and demonized.

Patrice Ayme’

Technical addendum:

https://en.wikipedia.org/wiki/Relativity_of_simultaneity

As Wikipedia itself puts it, weasel-style, to try to insinuate that Einstein brought something very significant to the debate, the eradication of the aether (but the aether came back soon after, and there are now several “reasons” for it; the point being that, as Poincaré suspected, there is a notion of absolute rest, and now we know this for several reasons: CRB, Unruh effect, etc.):

In 1892 and 1895, Hendrik Lorentz used a mathematical method called “local time” t′ = t − vx/c² for explaining the negative aether drift experiments.[5] However, Lorentz gave no physical explanation of this effect. This was done by Henri Poincaré who already emphasized in 1898 the conventional nature of simultaneity and who argued that it is convenient to postulate the constancy of the speed of light in all directions. However, this paper does not contain any discussion of Lorentz’s theory or the possible difference in defining simultaneity for observers in different states of motion.[6][7] This was done in 1900, when Poincaré derived local time by assuming that the speed of light is invariant within the aether. Due to the “principle of relative motion”, moving observers within the aether also assume that they are at rest and that the speed of light is constant in all directions (only to first order in v/c). Therefore, if they synchronize their clocks by using light signals, they will only consider the transit time for the signals, but not their motion in respect to the aether. So the moving clocks are not synchronous and do not indicate the “true” time. Poincaré calculated that this synchronization error corresponds to Lorentz’s local time.[8][9] In 1904, Poincaré emphasized the connection between the principle of relativity, “local time”, and light speed invariance; however, the reasoning in that paper was presented in a qualitative and conjectural manner.[10][11]

Albert Einstein used a similar method in 1905 to derive the time transformation for all orders in v/c, i.e., the complete Lorentz transformation. Poincaré obtained the full transformation earlier in 1905 but in the papers of that year he did not mention his synchronization procedure. This derivation was completely based on light speed invariance and the relativity principle, so Einstein noted that for the electrodynamics of moving bodies the aether is superfluous. Thus, the separation into “true” and “local” times of Lorentz and Poincaré vanishes – all times are equally valid and therefore the relativity of length and time is a natural consequence.[12][13][14]

… Except of course, absolute relativity of length and time is not really true: everywhere in the universe, locally at-rest frames can be defined, in several ways (optical, mechanical, gravitational, and even using a variant of the Quantum Field Theory Casimir Effect). All other frames are in trouble, so absolute motion can be detected. The hope of Einstein, in devising General Relativity, was to explain inertia, but he ended up with just a modification of the 1800 CE Bullialdus-Newton-Laplace theory… (Newton knew his instantaneous gravitation made no sense, and condemned it severely, so Laplace introduced a gravitation speed, thus gravitational waves, and Poincaré made them relativistic in 1905… Einstein got the applause…)

CONTINUUM FROM DISCONTINUUM

December 1, 2017

Discontinuing The Continuum, Replacing It By Quantum Entanglement Of Granular Substrate:

Is the universe granular? Discontinuous? Is spacetime somehow emergent? I do have an integrated solution to these quandaries, using basic mass-energy physics, and quantum entanglement. (The two master ideas I use here are mine alone, and if I am right, will change physics radically in the fullness of time.)  

First let me point out that worrying about this is not just a pet lunacy of mine. Edward Witten is the only physicist to have won the Fields Medal, a top mathematics prize, and is viewed by many as the world’s top physicist (I have met with him). He gave a very interesting interview to Quanta Magazine: A Physicist’s Physicist Ponders the Nature of Reality.

“Edward Witten reflects on the meaning of dualities in physics and math, emergent space-time, and the pursuit of a complete description of nature.”

Witten ponders, I answer.

Quantum Entanglement enables one to build existence over extended space with a wealth growing exponentially beyond that of the granular space below.

Witten: “I tend to assume that space-time and everything in it are in some sense emergent. By the way, you’ll certainly find that that’s what Wheeler expected in his essay [Information, Physics, Quantum, Wheeler’s 1989 essay propounding the idea that the physical universe arises from information, which he dubbed “it from bit.” He should have called it: “It from Qubit”. But the word “Qubit” didn’t exist yet; nor really the concept, as physicists had not realized yet the importance of entanglement and nonlocality in building the universe: they viewed them more as “spooky” oddities on the verge of self-contradiction. ..]

Edward Witten: As you’ll read, he [Wheeler] thought the continuum was wrong in both physics and math. He did not think one’s microscopic description of space-time should use a continuum of any kind — neither a continuum of space nor a continuum of time, nor even a continuum of real numbers. On the space and time, I’m sympathetic to that. On the real numbers, I’ve got to plead ignorance or agnosticism. It is something I wonder about, but I’ve tried to imagine what it could mean to not use the continuum of real numbers, and the one logician I tried discussing it with didn’t help me.”

***

Well, I spent much more time studying logic than Witten, a forlorn, despised and alienating task. (Yet, when one is driven by knowledge, nothing beats an Internet-connected cave in the desert, far from the distracting trivialities!) Studying fundamental logic, an exercise mathematicians, let alone physicists, tend to detest, brought me enlightenment, mostly because it shows how relative logic is, and how it can take thousands of years to make simple, obvious steps. How to solve this lack of logical imagination affecting the tremendous mathematician cum physicist Witten? Simple. From energy considerations, there is an event horizon to how large an expression can be written. Thus, in particular, there is a limit to the size of a number. Basically, a number can’t be larger than the universe.

https://patriceayme.wordpress.com/2011/10/10/largest-number/

This also holds for the continuum: just as numbers can’t be arbitrarily large, neither can the digital expression of a given number be arbitrarily long. In other words, irrational numbers don’t exist (I will detail in the future what is wrong with the 24-century-old proof, step by step).

As the world consists of sets of entangled quantum states (also known as “qubits”), the number of states can get much larger than the world of numbers. For example, a set of 300 entangled up-or-down spins presents 2^300 states (much larger than the number of atoms in the observable universe, which is 100 billion light-years across). Such sets (“quantum simulators”) have basically been implemented in the lab.
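The count is easy to check, assuming the usual order-of-magnitude estimate of about 10^80 atoms in the observable universe:

```python
# Counting states: 300 entangled two-level systems versus the atoms of the
# observable universe (standard rough estimate: ~10^80 atoms).
n_spins = 300
n_states = 2 ** n_spins                 # exact integer, ~2 x 10^90
atoms_in_universe = 10 ** 80

print(f"2^300 has {len(str(n_states))} digits, i.e. ~10^{len(str(n_states)) - 1}")
print(f"that is ~10^{len(str(n_states // atoms_in_universe)) - 1} times the atom count")
```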

Digital computers only work with finite expressions. Thus practical, effective logic already uses only finite mathematics, and finite logic. Thus there is no difficulty in using only finite mathematics. Physically, it presents the interest of removing many infinities (although not renormalization!).

Quantum entanglement creates a much richer spacetime than the granular subjacent space. Thus an apparently continuous spacetime is emergent from granular space. Let’s go back to the example above: 300 spins, in a small space, once quantum entangled, give a much richer spacetime quantum space of 2^300 states.

Consider again a set S of 300 particles (a practical case would be 300 atoms with spins up or down). If a set of “particles” are all entangled together, I will call that an EQN (Entangled Quantum Network). Now consider an incoming wave W (typically a photonic or gravitational wave; but it could be a phonon, etc.). Classically, if the 300 particles were… classical, W would have little probability of interacting with S, because it has ONLY 300 “things”, 300 entities, to interact with. Quantum Mechanically, though, it has 2^300 “things”, all the states of the EQN, to interact with. Thus, a much higher probability of interacting. Certainly the wave W is more likely to interact with 2^300 entities than with 300, in the same space! (The classical computation can’t be made from scratch by me, or anybody else; but the classical computation, depending on the “transparency” of a film of 300 particles, would actually depend upon the Quantum computation nature makes discreetly, yet pervasively!)

EQNs make (mathematically at least) an all-pervasive “volume”-occupying wave. I wrote “volume” in quotes, because some smart asses, very long ago (nearly a century), pointed out that Quantum Waves live in “PHASE” space, and thus are NOT “real” waves. Whatever that means: the Quantum volumes/spaces in which Quantum Waves compute can be very complicated, beyond the electoral gerrymandering of congressional districts in the USA! In particular, they don’t have to be 3D “volumes”. That doesn’t make them less “real”. To allude to well-established mathematics: a segment is a one-dimensional volume. A space-filling curve is also a sort of volume, as is a fractal (which has a fractal dimension).

Now quantum entanglement has been demonstrated over thousands of kilometers, and mass (so to speak) quantum entanglement has been demonstrated over 500 nanometers (5,000 times the size of an atom). One has to understand that solids are held by quantum entanglement. So there is plenty enough entanglement to generate spaces of apparently continuous possibilities and even consciousness… from a fundamentally granular space.

Entanglement, or how to get continuum from discontinuum. (To sound like Wheeler.)

The preceding seems pretty obvious to me. Once those truths get around, everybody will say: ‘But of course, that’s so obvious! Didn’t Witten say that first?’

No, he didn’t.

You read it here first.

Granular space giving rise to practically continuous spacetime is an idea where deep philosophy proved vastly superior to the shortsightedness of vulgar mathematics.

Patrice Ayme’

Science and Philosophy: two aspects of the same thing. Why they are separated.

November 22, 2017

 

Separating philosophy from science is like separating breathing in, from breathing out.

Philosophy is how one guesses, science is how one makes sure.

To this “Jan Sand” retorted: ‘Science is how one attempts to make sure.’

Well, no. Attempting is not science. Hope enables one to live, but it’s not life. “One makes sure” comes with a context, a context enabling one to express the problem and the answer attached to it.

Science is both a method, and a field of knowledge. Both are relative to the context at hand. The method consists in using only elements of reality one is sure of.

In their context, for example, classical optics, mechanics, electromagnetism and thermodynamics are all appropriate and correct. Yet, they don’t work next to a Black Hole: a Black Hole is the wrong context for them.

The first interstellar asteroid is a shard, probably a metallic one. It was observed to cover the Earth-Moon distance in less than three hours. With the new telescopes being built, it is the first of many.

Consider how the first interstellar asteroid was observed passing by the sun, on a highly hyperbolic trajectory. Speed: 139,000 kilometers per hour. Color: the deep red of severely irradiated material (an orange-like picture was obtained). No water or other volatile elements. Estimates of the albedo (reflectivity) vary by a factor of ten. Making an absolute hypothesis of what the albedo is, its size would be about one hundred meters across and a kilometer long. Found first by a Hawaiian telescope, its name is 1I/‘Oumuamua (Hawaiian for a scout or messenger arriving first from afar; the “1I” stands for “first interstellar”).
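Two of those figures are easy to sanity-check (the Earth-Moon distance below is the standard mean value, my own input, and the size-versus-albedo scaling is the usual one for a fixed apparent brightness):

```python
# Quick checks of the figures quoted above for 1I/'Oumuamua.
speed_kmh = 139_000.0          # km/h, as quoted
earth_moon_km = 384_400.0      # mean Earth-Moon distance, km (assumed here)
print(f"Earth-Moon crossing time ~ {earth_moon_km / speed_kmh:.1f} hours")

# For a fixed apparent brightness, the inferred diameter scales as
# 1/sqrt(albedo): a factor-of-ten spread in reflectivity is roughly a
# factor-of-three spread in size.
albedo_spread = 10.0
print(f"size uncertainty factor ~ {albedo_spread ** 0.5:.1f}")
```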

This is all science, because many telescopes, including Europe’s VLT (Very Large Telescope) in Chile, observed the object, and science dating back more than four centuries has made telescopes highly reliable (although cardinals initially demurred).

Rubbing sticks vigorously, just so, generates enough heat to start a fire: that’s science. (The fundamental science of humanity, 1.3 million years old.)

But not all “attempts” at “making sure” turn out to be science. Philosophy is what organizes these attempts.

For “superstrings”, it was felt that, instead of supposing point-particles, one could suppose strings, and some problems would disappear. Other problems would disappear if one supposed a symmetry between fermions and bosons. Thus “superstrings” came to be.

Superstrings is certainly a sort of logic, but not science. In particular, it makes no predictions of its own, aside from the hypotheses it started with!

Similarly, Euclidean geometry pushed all the way, is unending logic, not science (because it has nothing to do with reality, it says nothing relevant to reality, once pushed far enough).

Most famously, epicycle theory was a sort-of logic, with some truths mixed in, but not science: it turned out to be 100% false (although the Fourier analysis hidden therein gave it some respectability, because parts of a lie can be true).

I have my own proposal for Sub Quantum Reality (“SQPR”). It is an attempt. It is astoundingly smart. It does make predictions, and explains some significant phenomena, for example Dark Matter, Dark Energy. So it looks good. However, it is not science.

Why?

Because my theory makes extraordinary claims giving a completely different picture of physics, extremely far from the facts and moods which give meaning to both Relativity and Quantum Mechanics.

So SQPR would need extraordinary proofs.

One could be simply that all other explanations for Dark Matter fail, and only SQPR is left standing.

A more direct proof would be that SQPR predicts a measurable difference in energy distribution during the famous 2-slit experiment from the prediction Albert Einstein explicitly made. If it turned out to be true that my prediction is correct on this, pretty much all of existing physics becomes false, or, let’s say more precisely, it becomes a very good approximation to a deeper theory.

And then SQPR would become a science (if all other testable predictions turn out to be in accord with it).

Elements of science have to be certain, within a particular context, or “universe” (in the logic sense of “universe”) which, itself, is part of the real world.

For example Quantum Field Theory makes probabilistic predictions which can boil down in very precise numbers which can be measured. Quantum Computers will also make probabilistic predictions (about the real world, even the world of numbers).

In the latter case, it’s just a guess. In other words, philosophy.

Those who claim science does not depend upon philosophy, just as those who claim philosophy does not depend upon science are, at best, trivially correct: they have got to be looking at small subfields of these activities, cleaning the corners.  

In the grand scheme of things, science and philosophy are roughly the same activity: twisting logic any which way, to get testable consequences. Thus discovering new logics on the way, not just new facts.

***

One may ask: why did philosophy and science get separated?

Because our masters, the plutocrats, want to keep on ruling. That means they don’t want us to understand what they are doing. Thus, smarts are their enemy. Hence people have to be kept in little mental boxes, stupid, just so.

This is nothing new. When Rome was at its apogee, very learned Greek slaves educated the youth of the elite. As they were slaves, they knew their place. This helps to explain why Rome stagnated intellectually, and thus was unable to solve its pressing strategic, technological, economic, health and ecological problems. Stupidly educated youths make stupid, and obedient, adults.

Specialization is a way for plutocrats to keep on ruling. After all, to run a civilization, one needs special capabilities. The ultimate specialization is to pretend that certain knowledge, that is science, is independent from guessing new sure knowledge, that is, philosophy.

Philosophy, though, is intrinsically dangerous to such rule, since, if it were thoroughly applied, it would allow We The People to understand how plutocracy works. Thus philosophy was strongly encouraged to degenerate, by being cut off from knowledge, be it sure knowledge, historical knowledge, etc.

If society wants to survive, it will have to forge ahead in the way of understanding. Failing to comprehend or to implement this has led many civilizations or states to collapse (Maya, Sumer, Egypt, the Abbasid Caliphate, the Jin dynasty, Western Xia, the Dali Kingdom, the Southern Song, the Aztecs, etc.).

Thus sustainable plutocracy is a balancing act between understanding and obedience. This time, though, understanding has to be maximized, be it only to solve the climate crisis (there are many other crises). Thus plutocracy has to foster understanding (quite a bit as Jeff Bezos is doing with Amazon, hence his success).

We may be unable to get rid of plutocracy, because We The Sheep People out there are so supine. The next best thing, which is also the necessary thing, is that it is in the interest of everybody to let philosophy roll, and thus get reacquainted with science. And reciprocally.

Patrice Ayme

WHY LIGHT & GRAVITATION GO AT SAME SPEED

November 2, 2017

As long as one does not have a simple explanation, and, or description, of a subject, one does not understand it fully.

The present essay presents a direct proof, found by me, from basic principles, that gravitational waves go at the speed of light.

The essay also presents the direct experimental proof of the same fact, which we got a few days ago, when the explosion of a “kilonova” was observed (kilonovae are very rare, but crucial to the appearance of life as we know it; details below).

A consequence of the preceding is that the MOND theories are false. MOND was a philosophical horror, something full of ad hoc hypotheses, so I am happy it’s out the window. MOND eschewed the simplest description of gravity, the basics of which, the 1/d² law, preceded the birth of Newton himself.

***

First things first: WHY GRAVITATIONAL WAVES?

When two sources of a field of type 1/d² (such as gravitation or electromagnetism) rotate around each other, they generate waves which go to infinity (even if I don’t believe in infinity as an absolute, it works fine as a figure of speech…).

That’s simply because the field changes, as sometimes the charges are aligned, sometimes sideways. As the field changes it moves the objects it acts on. Now the point is that this disturbance of the field propagates indefinitely.

At this point, a philosophical question may arise: do the disturbances of the field carry away energy? Well, in a way, it’s an idiotic question, because we know it does, that’s an experimental fact.

This experimental fact shows fields are real.

Now, let’s slow down a bit: one century of experimentation with electromagnetic fields had shown, by 1900 CE, that electromagnetic fields carried away energy.

What about gravitation? Well,  theories were made in which a waving gravitational field carried away energy, such as Poincaré’s theories of gravitation, and, in particular, Einstein’s.

The experimental proof came when closely rotating stars, which should have been emitting copious amounts of gravitational field energy, were observed to lose energy just as predicted. But first the theory:

Orbiting masses generate gravitational waves (top). If the gravitational waves lagged behind the light, many reference frames would observe non-conservation of energy after a collision event (bottom) between the aforesaid masses. This is my thought experiment, and it’s also what happened 130 million years ago in a galaxy not that far away.

***

HERE IS WHY GRAVITATIONAL WAVES GO AS FAST AS LIGHT WAVES:

Patrice Thought Experiment Demonstrating Gravitation & Electromagnetic Waves Go At the Same Speed:

So now visualize this. Say, to simplify, that two equal masses rotate around each other. Call them M1 and M2. Say M1 is matter and M2 antimatter, each of mass m. The system M1-M2 emits more and more gravitational energy as the two masses approach each other. Finally they collide. At this point, the system M1-M2 becomes pure electromagnetic radiation, of energy E = 2mc².

Now what does one see at a distance?

Suppose the electromagnetic energy E going at the speed of light, c, travelled faster than the gravitational wave of energy G, travelling at speed g.

Then suppose also one is in a reference frame R travelling at uniform speed V, away from the M1-M2 collision event. As g is less than c, V can be more than g.

And then what?

The gravitational wave of energy G going at speed g, CANNOT catch up with the reference frame R.

However, before the collision, some of the energy of the system was inside G. And it’s easy to compute how much: it’s equal to the potential energy of the rotating system before the collision. In the scenario we constructed, that energy is never seen again, from the point of view of R. Let me repeat slowly: before the collision, M1 and M2 can be seen, orbiting each other. The potential energy of the system, P, can be computed using this VISUAL information (visual, hence travelling at the speed of light, c). So then the energy of the system is 2mc² + P.

All of P is transformed into G, the energy of the gravitational wave. If the speed g of the wave is less than the speed of light, c, there are reference frames, namely those with V > g, where P will be seen to have disappeared.

Thus if the speed of gravitational waves were less than the speed of light, there would be frames in which one could observe distant events where energy would not be conserved.
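The bookkeeping of the thought experiment compresses into one line (a restatement of the argument above, not a new result):

```latex
E_{\text{before}} = 2mc^{2} + P
\;\longrightarrow\;
E_{\text{after}} = \underbrace{2mc^{2}}_{\text{light, speed } c} \;+\; \underbrace{P}_{\text{gravitational wave, speed } g}
```

In a frame R receding at a speed V with g < V < c, the light eventually catches up but the gravitational wave never does: the observer in R tallies only 2mc², and P has vanished from the books. Demanding energy conservation in every such frame forces g = c.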

Now let’s make it realistic.  The preceding situation is not just possible, but common:

***

Closely Orbiting Annihilating Stars Were Just Observed:

Instead of making the preceding M2 out of antimatter, one can simply make M1 and M2 into neutron stars. That’s exactly what happened 130 million years ago, when dinosaurs roamed the Earth, in a galaxy far away—NGC 4993, to be exact—where two neutron stars spiraled into each other, emitting gravitational radiation, and emitting more the more they spiraled (the detected waveforms are routinely converted into sound). The stars then went into a frantic dance, and collided.

Had this happened inside our own Milky Way, the present gravitational wave detectors, the U.S.-built LIGO and the European-built Virgo observatories, would have detected the gravitational waves for minutes, or maybe hours. But the gravitational waves we got were diluted by a factor of 10^10 (!) relative to what they would have been if the collision had been just 10,000 light-years away, inside the Milky Way.

After billions of years spent slowly circling each other, in their last moments the two neutron-degenerate stars spiraled around each other thousands of times in a frenzy before finally smashing together at a significant fraction of light-speed, likely creating a black hole (neutron stars are remnants of massive stars; two of them packed in a small volume make a black hole).

Such an event is called a “kilonova” (because it has the energy of 1,000 novas). Kilonovae are rare cosmic events, happening perhaps once every 10,000 years in a giant galaxy like the Milky Way. That’s because neutron stars are produced by supernovae. To boot, supernovae explode asymmetrically, giving a hefty “kick” to those remnants, strong enough to eject a neutron star entirely from its galaxy (the Crab Nebula remnant moves at 375 km/s relative to the explosion nebula).

***

Exit MOND:

MOND, or MOdified Newtonian Dynamics, is a somewhat ridiculous class of theories invented in the last few decades to deny the existence of DARK MATTER. Instead, the MOND monkeys devised an ad hoc theory, which basically claims that gravity is stronger at very low accelerations (whatever), as was more or less observed (sort of) inside galaxies (it didn’t work so well, or at all, for clusters).

You see, gravitation’s basic behavior is simple. Kepler thought it was an attractive force in 1/d. However Bullialdus suggested the law was 1/d², in analogy with the behavior of… light. (Bullialdus didn’t understand that, in combination with Buridan’s mechanics from 1350 CE, one could explain Kepler’s laws; but Hooke and then Newton did.)

***

The collision of the two neutron stars, and the black hole they created, also emitted electromagnetic radiation. That light comes from the fact that material falls in at enormous speeds. Thus both gravitational waves and electromagnetic waves were captured from a single source. The first light from the merger was a brief, brilliant burst of gamma rays, the birth scream of the black hole. The gamma ray flash was picked up by NASA’s Fermi Gamma-Ray Space Telescope, 1.7 seconds after the arrival of the gravitational waves (dust would have delayed the light a bit at the onset, but not the gravitational waves). Hours later, astronomers using ground-based telescopes detected more light from the merger, the “kilonova” produced by the expansion of debris. The kilonova faded from view over the following weeks.
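That 1.7 second lag, accumulated over roughly 130 million years of travel, is what pins the two speeds together; a one-line estimate (ignoring any astrophysical emission delay, which would only loosen the bound):

```python
# How tightly the 1.7 s delay over ~130 million light-years constrains any
# difference between the speed of gravity and the speed of light.
seconds_per_year = 3.156e7
travel_time_s = 130e6 * seconds_per_year   # ~130 million years, in seconds
delay_s = 1.7                              # observed gamma-ray lag

print(f"|c_grav - c_light| / c <~ {delay_s / travel_time_s:.1e}")   # ~4e-16
```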

As expected, astronomers saw in the aftermath various wavelengths corresponding to the many heavy elements formed instantly during the collision (it was an old prediction that merging neutron stars would form the heaviest elements, such as gold and titanium, neutron-rich metals that are not known to form in normal stars).

(Caveat: I hold that star theory is incomplete for super hyper giant stars with hundreds of solar masses and a very reduced lifetime; that has been one of my arguments against standard Big Bang theory.) But let’s go back to my thought experiment. What about the other aspect I envisioned, being on a frame R travelling at a very large speed? That too is very realistic: frames moving at near light speed are common.

***

Frames Travelling At Close To Speed Of Light Are Common:

… Not just a figment of my imagination. That’s also very realistic: looking toward the cosmological horizon, entire galaxies recede at speeds ever closer to the speed of light; there is the V I was talking about above.

***

Simple science is deep science

All treatises on Gravitation tend to be the same: hundreds of pages of computation, and one wrong equation could well sink the ship. (Quantum Field Theory is worse, as few fundamental hypotheses therein make any sense. Hence the famous prediction from QFT that the energy of the vacuum should be 10^120 times greater than observed…)

I believe instead in a modular approach: from incontrovertible facts, quick reasonings give striking conclusions. This makes science logically compartmentalized, avoiding that any subfield of science follow the Titanic down the abyss from a single breach. It also makes science easier to teach, and even easier to think about. For example, the reality of Quantum Waves comes not just from all sorts of variants of the 2-slit experiment, but also from the Casimir Effect, a direct evidence for the reality of Quantum Waves in empty space, which is observed so much that it has to be taken into account in the engineering of any nano-machinery (I also suggested a device to extract energy from this “vacuum”).

***

Conclusion: Just from the necessity of apparent conservation of energy in all inertial frames, rather simple physics shows that the speed of gravitational waves has to be exactly the speed of light. No need for hundreds of pages of obscure computations and devious logic. No need even for Relativity, just basic kinematics from 1800 CE.

Patrice Ayme’

LEARN TO LEARN: Henri Poincaré, Not Einstein, Discovered Gravitational Waves, 111 years Ago

October 3, 2017

Physics Nobel Committee Should Learn Physics! And the notion of truth!

The truth shall not just make us free, but also safe, and moral. Teaching thinking is to teach truth and how to get to it. One should start by not deliberately lying. And understanding when it is that humanity started to understand something.

Intellectuals should revere the truth. If Satan speaks the truth, intellectuals should quote him approvingly. Why? Because ethics is truth! The Nobel in Physics was given to screwdriver turners “for decisive contributions to the LIGO detector and the observation of gravitational waves”.

However the rest of the press release from the Nobel committee on physics is a lie: it attributes the original idea of gravitational waves to a German. Surely the physicists who sit on the Nobel Committee are knowledgeable enough to know this is a lie. That sort of lie may sound innocuous; it’s not: it’s anti-scientific, and proto-Nazi. It teaches the youth wrong. It teaches present day Nazis wrong.

The generation of waves by a central source field is easy to understand in primary school.

It’s because of these sorts of nationalistic distortions that Germans, a century ago, got so full of hubris that they went mad: everybody told them they invented everything! Everybody told Germans they were the superior race! And Max Planck was one of the prophets of this German superiority! And the hated French were nothing, because that “inferior race” had invented nothing! Thus, naturally enough, since they were told from everywhere that they were so smart, the Germans decided to subjugate the rest of humanity, be it only to enlighten it (that was the idea of Keynes in “The Economic Consequences of the Peace”).

Actually, it’s not a German who discovered, and named, “Relativity”, but a Frenchman.    

In press releases announcing the detection of gravitational waves, the collaborations LIGO and VIRGO, as well as the Centre National de la Recherche Scientifique (CNRS, France), explicitly (and WRONGLY) attributed to the German Albert Einstein the original prediction of the existence of gravitational waves in 1916. A similar comment is made in the Physical Review Letters article by LIGO and VIRGO.

But actually, gravitational waves traveling at the speed of light, were clearly predicted by Henri Poincaré on June 5, 1905, as a relativistic requirement. Poincaré made this requirement explicit in his academic note Sur la dynamique de l’électron (On electron dynamics, June 5, 1905) published by the French Académie des Sciences.

At the time, Poincaré was already world famous, and Einstein nothing. Planck, a German nationalist, would make Einstein everything by allowing Einstein to publish articles without any reference to the preceding work he knew about, and parroted. This was sheer propaganda.

After explicitly formulating special relativity in this fundamental article, Poincaré further develops the requirement suggested by Hendrik Antoon Lorentz that the new space-time transformation leading to special relativity should apply to all existing forces and not just to the electromagnetic interaction. (At the insistence of Poincaré, Lorentz got the Nobel Prize in 1902.)

Henri Poincaré concludes that, as a consequence of the new space-time geometry, gravitation must generate waves traveling at the speed of light in a similar way to electromagnetism.

Following the pre-Nazi German nationalistic propaganda contained in the press releases of scientific collaborations and institutions, almost all media attribute to Albert Einstein the original prediction of gravitational waves.

The Physical Review Letters article by LIGO and VIRGO, Observation of Gravitational Waves from a Binary Black Hole Merger, PRL 116, 061102 (11 February 2016), explicitly states (https://journals.aps.org/prl/pdf/10.1103/PhysRevLett.116.061102): “In 1916, the year after the final formulation of the field equations of general relativity, Albert Einstein predicted the existence of gravitational waves”. What, then, about the work done by Henri Poincaré 11 years before the Einstein finding?

Actually, the situation seems quite clear. In his short article of 5 June 1905 Sur la dynamique de l’électron, C.R. T.140 (1905) 1504-1508 (Comptes Rendus de l’Académie des Sciences, France), http://www.academie-sciences.fr/pdf/dossiers/Poincare/Poincare_pdf/Poincare_CR1905.pdf , the French mathematician and physicist Henri Poincaré explicitly formulated special relativity upgrading the space-time transformations that he called “Lorentz transformations” and to which he referred as the “Lorentz group”. After having worked out and discussed the new space-time geometry, Poincaré writes:

… But that is not all: Lorentz, in the work cited, judged it necessary to complete his hypothesis by supposing that all forces, whatever their origin, are affected by a translation [a change of inertial frame, in Poincaré’s language] in the same way as the electromagnetic forces, and that, consequently, the effect produced on their components by the Lorentz transformation is still defined by equations (4).

It was important to examine this hypothesis more closely, and in particular to investigate what modifications it would oblige us to make to the laws of gravitation [HOW TO MODIFY GRAVITATION]. That is what I sought to determine; I was first led to suppose that the propagation of gravitation is not instantaneous, but happens at the speed of light. (…)

So when we speak of the position or of the velocity of the attracting body, it will be that position or that velocity at the instant when the gravific wave [GRAVITATIONAL WAVE] left that body; when we speak of the position or of the velocity of the attracted body, it will be that position or that velocity at the instant when that attracted body was reached by the gravific wave emanating from the other body; it is clear that the first instant is earlier than the second… [End of quote]

Gravitational waves were thus explicitly predicted by Henri Poincaré in his 5 June 1905 article formulating special relativity. All of these ideas got incorporated into the gravitational wave equations of Einstein (who worked closely, day by day, with a number of top mathematicians of the time, including the crack mathematician David Hilbert, who found a different approach).

In special relativity, as already defined explicitly, with all its equations, by Poincaré and Lorentz, the speed of light c is not just the speed of a specific object (light) but a universal constant defining (local) space-time geometry. As a consequence, no physical object, signal, or correlation can travel faster than c. Poincaré explained in extreme detail the philosophy behind it (if something is always true, it’s a law of nature), in a book which Einstein and his student friends studied in thorough detail (although Einstein didn’t quote Poincaré in his famous 1905 parrot work, naturally enough for a nationalistic parrot; later Einstein would have a falling-out with the French Nobel laureate Bergson, about Relativity).

According to Poincaré in his article of 5 June 1905, the requirement of a universal space-time geometry with the speed of light c as the critical speed implies that the gravitational force must be propagated by gravitational waves with a speed equal to c , just as electromagnetic waves carry the electromagnetic interaction.

As Henri Poincaré explicitly underlines, the space-time geometry defined by Lorentz transformations applies to all existing forces, including the gravitational ones. Thus, gravitation cannot propagate instantaneously and must instead propagate at the speed of light. The same argument clearly applies to any object associated with gravitation.

Considering as a simple example the gravitational interaction between two bodies, Poincaré introduces a “gravific wave” leaving the first body, traveling at the speed of light and reaching the second body at a later time. This was the original formulation of the prediction of gravitational waves in a context where its general scope was obvious. Poincaré had been working for years on electromagnetism, and knew perfectly well that more sophisticated scenarios than the example he was providing could be imagined without altering the role of c as the critical speed.

A decade later, with general relativity, Albert Einstein considered in detail more involved scenarios than the one made explicit by Poincaré, incorporating in particular an effective space-time curvature generated by gravitation in a static universe. But this does not invalidate the basic principle discovered and formulated by Henri Poincaré in 1905.

In his article, Poincaré also refers to the previous work by Pierre-Simon de Laplace, Count of Laplace (1749-1827), one of the main French scientists of the period of Napoléon Bonaparte. Laplace had already considered the possibility that gravitation propagates at some finite speed, but he did not question the basic space-time geometry.

Poincaré had demonstrated and published E = mc²… in 1900, more than 5 years before Einstein plagiarized it.

I have talked about this for years. I am happy that Science 2.0 picked up the notion in “Henri Poincaré Predicted The Existence Of Gravitational Waves As Early As June 5, 1905”

Correct attribution of civilization defining discoveries is fundamental. Example: India discovered numbers & zero as used today.

The chronological hierarchy of discoveries reflects, in general, the logical hierarchy of evidence supporting these discoveries. Whether in science, or in global thinking. Thus who discovered what, when, how and why is not just anecdotal. It’s logical, according to the most natural logic.

As it turns out, few places in spacetime made most civilization-defining discoveries, and then they made plenty of them, and that was related to political processes: a few Greek city-states, especially Ionian cities and Athens, and Paris and its satellites, are obvious examples.

One can learn to learn better, one can learn to think better; this is what the existence of concentrations of civilizational genesis shows.

It’s crucially important to understand what made these places tick, and how, with the aim of reproducing such circumstances. Paris was the pioneering place in science, worldwide, for around a millennium, and this was the core mental skeleton of Europe, and even of civilization. Buridan discovered in particular inertia, and thus the heliocentric system (attributed to Copernicus, well after the Catholic Church made studying Buridan a capital crime!); Lamarck, evolution (taught in Paris while forbidden in England); etc. The same crowd probably wants us to believe in Donald Trump and Neo-Liberalism, as no good idea could possibly come from anywhere not Germanoido-Anglo-Saxon. The Nobel Committee is dominated by US physicists anxious to demonstrate US superiority and, in particular, the superiority of US universities, because there is beaucoup money in it, and it could please their sponsors (the tax-free plutocrats).

It’s also important to make correct attributions, because the original authors are always clearer about their reasonings, and how they got there. Plagiarists tend to be more obscure, because they hide their tracks.

Re-attributing discoveries correctly can be shattering, and teaches us how obscurantism proceeds to eradicate knowledge. The disappearance, for two millennia, of non-Euclidean geometry is a case in point. So is that of atomism, and of “Brownian” motion. The suppression of Buridan and the heliocentric system by the Christian church is a particularly sinister instance: it was vicious, deliberate, and motivated by hatred of thinking.

So let’s celebrate the discovery of gravitational waves. My little drawing above shows that one does not even need relativity to make waves. A big motion of the source will do, as anybody watching a tsunami on TV knows.

The gravitational wave detectors inaugurate a new sort of measuring instrument. However, the idea is at least as old as the Michelson and Morley interferometer of the Nineteenth Century. There is nothing new to it. (That’s why I called the laureates “screwdriver turners”.)

And what of Planck, Einstein’s unhinged sponsor? Planck signed a disgusting message in World War One denying that Germany had committed war crimes (he later denounced it, once the war was over). The French took one of Planck’s sons prisoner in World War One, and his other son was caged and executed by Hitler. That Hitler interlocutor, Max Planck, got his just deserts, unfortunately not just for himself, but for all humanity. But let’s not keep on having them now. Want Relativity? Think Henri Poincaré, and forget about his parrots!

Planck enabled Einstein to publish in the Annalen der Physik, the oldest journal in physics (founded 1799), WITHOUT any references, on the three most famous subjects in physics at the time. It was vicious and deliberate, serving the satanic god of hyper-nationalism of the racist type. Playing with hyper-nationalism, Planck ended up losing, and Einstein and the German Jews became double losers (they lost as Germans and as Jews). So here is a case of the losers writing history… German hyper-nationalism was encouraged by Einstein and Planck, with a false-flag attribution, and they, and their kind, lost twice.

Truth is not seen just with the eyes. Truth is seen through the mind of a thorough debate.

Patrice Ayme’

 

SUB-QUANTUM GRAVITATIONAL COLLAPSE 2 SLIT Thought Experiment

September 23, 2017

A Proposed Lab SUB QUANTUM TEST: SQPR, Patrice Aymé Contra Albert Einstein: GRAVITATIONALLY DETECTING QUANTUM COLLAPSE! 

Einstein claimed that a “particle” was a lump of energy, even while in translation. He had no proof of this assertion, which underlies all modern fundamental physics, and I believe it’s false. As I see it, this error, duplicated by 99.99% of theoretical physicists, led the search for the foundations of physics astray in the Twentieth Century. How could one prove my idea, and disprove Einstein?

What Einstein wrote is this, in what is perhaps his most famous work (1905 CE): “Energy, during the propagation of a ray of light, is not continuously distributed over steadily increasing spaces, but it consists of a finite number of energy quanta LOCALIZED AT POINTS IN SPACE, MOVING WITHOUT DIVIDING…” [What’s in capital letters, I view as extremely probably false. Einstein then added nine words, four of which explained the photoelectric effect, for which he got the Nobel Prize. Those nine words were entirely correct, but physically independent of the preceding quote!]

If those “energy quanta” are “localized at points in space“, they concentrate onto themselves all the mass-energy.

It’s simple. According to me, the particle disperses while it is in translation (roughly following, and becoming a nonlinear variant of, its De Broglie/Matter Wave dispersion, the bedrock of Quantum Physics as everybody knows it). That means its mass-energy disperses. According to Einstein, it doesn’t.

However, a gravitational field can be measured. In my theory, SQPR, the matter waves are real. What can “real” mean, in its simplest imaginable form? Something is real if that something has mass-energy-momentum. So one can then do a thought experiment. Take the traditional Double Slit experiment, and install a gravitational needle (two masses linked by a rigid rod, like a hydrogen molecule at absolute zero) in the middle of the usual interference screen.

Sub Quantum Patrice Reality Is Experimentally Discernible From Einstein’s Version of Quantum Physics! Notice in passing that none of the physics super minds of the Twentieth Century seem to have noticed Einstein’s Axiom, which is ubiquitously used all over Quantum Physics and QFT!

According to Einstein, the gravitational needle will move before the process of interference is finished, and before the self-interfering particle hits the screen (some may object that, because photons travel at c, and so do gravitons, one can’t really gravitationally point at the photon; however, that’s not correct: there should be a delayed field moving the needle).

According to me, the particle is dispersed during the self-interfering process: it’s nowhere in particular. Thus the mass-energy is dispersed before the collapse/singularization. Thus a gravitational field from the self-interfering particle can’t be measured from inside the self-interfering geometry.

Could the experiment be done?

Yes. But it won’t be easy.

Molecules consisting of 5,000 protons, 5,000 neutrons and 5,000 electrons have exhibited double-slit behavior. That’s plenty enough mass to turn a gravitational needle made of two hydrogen atoms. However, with such a large object, my theory may well fail to be experimentally checked (the molecule probably re-localizes continually, so the needle would move before impact). Ideally, one should check this Sub Quantum Reality with a single simple particle, such as a photon or an electron.
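To get a sense of the orders of magnitude such an experiment would involve, here is a minimal Python sketch with purely illustrative assumptions (a ~10,000 amu molecule, a needle arm of one hydrogen atom, and an assumed separation of one micron); an actual design would have to pick its own geometry and integration time.

```python
# Order-of-magnitude sketch: Newtonian gravitational pull of a ~10,000 amu molecule
# on one arm of a two-hydrogen-atom "gravitational needle".
# All masses and the separation are illustrative assumptions, not a real design.

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
amu = 1.661e-27        # atomic mass unit, kg

m_molecule = 10_000 * amu   # ~5000 protons + 5000 neutrons (electron mass negligible)
m_needle_arm = 1 * amu      # one hydrogen atom at the needle's tip
r = 1e-6                    # assumed separation: one micron

F = G * m_molecule * m_needle_arm / r**2   # Newtonian gravitational force
a = F / m_needle_arm                        # resulting acceleration of the needle arm

print(f"force on needle arm  ~ {F:.2e} N")
print(f"acceleration of arm  ~ {a:.2e} m/s^2")
```

Whatever the verdict, the numbers depend entirely on the assumed separation and on how long the needle is exposed to the field.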

Why did I long believe Einstein was wrong on this point, what I called “Einstein’s Axiom” above?

First, he had no proof of what he said. Allure can’t replace reason.

Second, localization into a point is contrary to the philosophical spirit, so to speak, of Quantum Physics. The basic idea of Quantum Physics is that one can’t localize physics into points in space… or into points in energy (this was Planck’s gist). Both space and energy come in LUMPS. For example, an electron delocalizes around a proton, creating an atom of hydrogen.

The lump thing for emissions of energy is Planck’s great discovery (a blackbody emits energy packets hf, where f is the frequency and h is Planck’s constant). The non-relevance of points is De Broglie’s great intuition: De Broglie introduced the axiom that one can compute everything about the translational behavior of an object from the waves associated with the energy-momentum of said object.
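As a quick numerical illustration of both lumps (the chosen wavelength and electron speed are arbitrary), here is a minimal Python sketch computing a Planck energy quantum E = hf for visible light and a De Broglie wavelength λ = h/p for a slow electron.

```python
# Planck's energy lump E = h*f and De Broglie's wavelength lambda = h/p,
# evaluated for illustrative (arbitrary) inputs.

h = 6.626e-34       # Planck's constant, J*s
c = 2.998e8         # speed of light, m/s
m_e = 9.109e-31     # electron mass, kg

# Planck: a single quantum of green light (wavelength ~500 nm)
f = c / 500e-9
E = h * f
print(f"green-light quantum: f = {f:.2e} Hz, E = hf = {E:.2e} J")

# De Broglie: an electron moving at 10^6 m/s (non-relativistic)
p = m_e * 1e6
lam = h / p
print(f"electron at 1e6 m/s: lambda = h/p = {lam:.2e} m")
```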

So Einstein was wrong on the philosophy, as he himself concluded after thirty years of thinking hard about Quantum Physics, as one of its two founders, with his discovery of what he called “spooky action at a distance” (the “EPR”, which has gone from thought experiment to real experiment, checked by now in hundreds of different experiments). If “elements of reality” (to use Einstein’s EPR language) are subject to spooky action at a distance, why not also when the particle is in flight, which is precisely the gist of the EPR… (After I thought of this, I found a paper by Zurek et al. who seem to draw a similar conclusion.)

The philosophy of Quantum Physics in one sentence: small is big, or even, everywhere.

Third, Einstein’s hypothesis of point particles being always localized has led to lots of problems, including the so-called “Multiverse” or the “Many Worlds Interpretation of Quantum Mechanics” (at least, according to yours truly…).

Fourth, the development of Twentieth Century physics according to Einstein’s roadmap has led to theories covering 5% or so of known mass-energy, at most: an epic failure. Whereas my own Sub Quantum Reality readily predicts the appearance of Dark Matter and the joint appearance of Dark Energy, as observed.

Fifth: If Einstein were right, the which-path information in the 2-slit experiment would be readily available, at least as a thought experiment, and that can’t work. The entire subject is still highly controversial: contemplate the massive paper in the Proceedings of the National Academy of Sciences, “Finally making sense of the double-slit experiment”, March 20, 2017, whose lead author is Yakir Aharonov, from the extremely famous and important Aharonov-Bohm effect. The Aharonov-Bohm effect pointed out that the potentials, not the fields themselves, were the crucial inputs of Quantum Physics. That should have been obvious to all and any who studied Quantum Physics. Yet it was overlooked by all the super minds for nearly 40 years!

Sixth: This is technical, so I won’t give the details (which are not deep). One can modify Einstein’s original EPR experiment (which had to do with pairs of particles in general, not just photon polarization à la Bohm-Bell). One can introduce, in the EPR 1935 set-up, an ideal gravity detector. If Einstein was right about the particle being always localized, determinism would always be true of particle A of an {A,B} interaction pair. Thus particle A could always be tracked, gravitationally. But that would grossly violate the free will of a lab experimenter deciding to tinker with B’s path, through an experiment of her choosing. (How do large particles do it, then? Well, they tend to partly localize continually, thanks to their own size and to random singularizations.)

The naked truth can be in full view, yet, precisely because it’s naked, nobody dares to see it!

Richard Feynman famously said that the double slit experiment was central to physics, and that no one understood it. He considered it carefully. Gravitation should stand under it, though! The preceding experiment was an obvious one to propose. Yet no one proposed it, because they just couldn’t seriously envision Quantum Collapse, and thus its impact on gravitation. Yet I do! And therein lies the connection between Quantum Physics and Gravitation, the quest for the Grail of modern physicists…

So let’s have an experiment, Mr. Einstein!

Patrice Ayme’

Physics Of Hurricanes: Force Six Hurricanes Someday Soon?

September 18, 2017

There is another powerful hurricane on the way in the Caribbean: Maria, already category 5 and strengthening, will hit the large islands of Guadeloupe (population 500,000), Martinique (400,000), and Dominica (75,000) today. Sustained winds of up to 260 kilometers an hour (about 160 miles per hour) are already being observed, with gusts to 350 km/h. Meanwhile, the long-lasting hurricane Jose is still active, out there in the Atlantic Ocean.

The physics of hurricanes, as usually depicted in the media, shows what’s going on, but not fully why it’s going on. Probably because those who write the articles have an insufficient understanding. Let’s fill in the cognitive gap.

Overall, a hurricane works like a rotary thermal engine, with a warm source (the warm, moist ocean) and a cold sink (the icy stratosphere, up high). The warm, moist air goes up, because it is less dense than colder air.

The mechanism above depends only upon having a warm source and a cold source (known in thermodynamics as a “Carnot engine”). So one can have Polar Cyclones, or Cyclones on Jupiter (“Great Red Spot”)!
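A minimal Python sketch of the Carnot bound implied by that picture, assuming an illustrative sea-surface temperature of 300 K and an outflow temperature near the tropopause of 200 K; real storms are far from ideal Carnot engines, so this is only an upper bound on the fraction of heat that can be turned into work.

```python
# Carnot efficiency of an ideal heat engine running between a warm ocean surface
# and the cold upper troposphere / lower stratosphere.
# Both temperatures are illustrative assumptions.

T_warm = 300.0   # sea-surface temperature, K (~27 C)
T_cold = 200.0   # outflow temperature near the tropopause, K (~ -73 C)

eta = 1.0 - T_cold / T_warm   # Carnot efficiency: maximum work per unit of heat input
print(f"Carnot efficiency ~ {eta:.0%}")   # ~33% for these numbers
```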

How does it start, and why is it self-feeding? If the ocean is warm, many large storm clouds will rise and dot the ocean. Now the overall rising of warm air creates a low pressure L in the center of a particularly active zone of storms (or “cells”). This is not, per se, exceptional: the entire tropical belt tends to be low pressure, just because warm air there rises more than the colder air further north.

That phenomenon creates the trade winds: air from the upper edge of the tropical belt rushes in towards the equator, the “inter-tropical convergence zone” (ITCZ). Because of the rotation of the Earth, the trade winds, which would just go straight south (in the northern hemisphere) if the Earth didn’t turn, get deflected to the west.

Hurricanes have been piling up in September 2017, from a lack of wind shear in the hurricane-forming region… Six hurricanes in the Caribbean in two weeks… If this keeps up, the question of evacuating many islands arises…

Now let’s go back to hurricane formation. Three or four large cells in the ocean, if close by, will develop a particularly low Low L in the center of the formation. At that point, the cells will tend to gather towards that center. However, the cell closest to the equator will have a greater momentum to the east, thanks to the Earth’s rotation, and the one furthest from the equator will deviate west. Thus a counterclockwise rotation (in the northern hemisphere) of the set of cells will appear. From conservation of angular momentum, the more the warm air rushes towards the center, the more it tends to rotate (the same effect which makes a skater spin faster by pulling in her arms). Next the cells merge, and a hurricane is formed.
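The skater analogy can be made quantitative with a minimal sketch, under the idealized assumption that an inflowing air parcel conserves its angular momentum (v·r = constant); friction and the direct Coriolis term are ignored, and the numbers are purely illustrative.

```python
# Spin-up of inflowing air by conservation of angular momentum (v * r = constant),
# ignoring friction and the Coriolis term. Numbers are illustrative.

v_outer = 5.0        # tangential wind at large radius, m/s
r_outer = 500e3      # starting radius: 500 km
r_inner = 50e3       # final radius: 50 km

v_inner = v_outer * r_outer / r_inner   # angular momentum conservation
print(f"tangential wind at {r_inner/1e3:.0f} km: {v_inner:.0f} m/s")  # 50 m/s, i.e. 180 km/h
```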

Now, the warmer the ocean, the more powerful the rise of air in the middle, the lower the Low L, the greater the rush of air towards the center, and thus the greater the rotating winds. And the greater the winds, the more warm, moist air can rush in from just above the surrounding seas, thus feeding the hurricane.

When part of the frontal edge of the hurricane touches land, or, worse, a mountain range, it loses power in that part (as the power comes from rushing warm, moist air), losing its low there. So naturally the hurricane steers towards areas which can feed it, avoiding large land masses and mountain ranges.

(Thus hurricane steering is reminiscent of how an elementary particle should be steered by the geometry in a future Sub Quantum Mechanics.)

In any case, the hurricane is a rotating engine whose rotation brings in the warm, moist air it uses as fuel. Thus, if the rotation can’t develop, the engine won’t start. And the rotation develops because of the unequal drag of the cloud cells depending upon how far from the equator they are (big word: Coriolis Force). In particular, if the cloud cells sit astride the equator, they will be dragged equally, and no rotation will occur. Thus, there are no hurricanes around the equator itself.
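A minimal sketch of why the effect vanishes at the equator: the standard Coriolis parameter f = 2Ω·sin(latitude) is zero at 0° and grows with latitude. The latitudes below are arbitrary examples.

```python
# Coriolis parameter f = 2 * Omega * sin(latitude): the local measure of how strongly
# the Earth's rotation deflects horizontal motion. It vanishes at the equator.
import math

Omega = 7.292e-5   # Earth's angular velocity, rad/s

for lat_deg in (0, 5, 15, 30):
    f = 2 * Omega * math.sin(math.radians(lat_deg))
    print(f"latitude {lat_deg:2d} deg: f = {f:.2e} 1/s")
```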

(The energies involved are enormous: around a ten-megaton H-bomb every twenty minutes; nuking a category 5 hurricane would have no effect whatsoever, except to augment a bit more the sucking action of the hurricane…)
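Taking the figure quoted above at face value (ten megatons of TNT equivalent released every twenty minutes), a minimal sketch of what that corresponds to as a continuous power:

```python
# Convert "a ten-megaton H-bomb every twenty minutes" into a continuous power.
# The 10 Mt / 20 min figure is the one quoted in the text, taken at face value.

J_per_megaton = 4.184e15   # energy of one megaton of TNT, joules
energy = 10 * J_per_megaton
interval = 20 * 60         # twenty minutes, in seconds

power = energy / interval
print(f"average power ~ {power:.1e} W (~{power/1e12:.0f} terawatts)")
```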

What of the frequency of hurricanes? The scenario above supposes that the large storm cells can start to rotate. However, the greenhouse effect augments winds all over: linear winds, not just rotating winds. It’s a question of equipartition of energy (spreading the energy around all the available dimensions).

https://patriceayme.wordpress.com/2008/03/08/the-equipartition-of-energy-theorem-should-be-applied-for-climate-change-and-predicts-wild-fluctuations-of-temperatures/

Those winds can, and will, shear thunderstorm cells… just as Saharan sand can collapse them (so stronger trade winds also play against hurricane formation, at least in the Atlantic). Thus hurricanes will tend to form more ferociously, but not more frequently. What will augment, though, is the ferocity and frequency of linear storms, and many such storms have ravaged Europe in the last decade.

So far, the Earth has warmed up one degree centigrade from the anthropogenic greenhouse, since 1800 CE. Another two degrees seem baked in. In the Carboniferous (“carbon-making”) era, more than 300 million years ago, the CO2 level and the heat were greater. There is also evidence that pretty much all the continents had joined. Yet there was moisture in the interior of said continents (because there were plants). Moisture, in the sort of climate we know now, should never have penetrated so deep inland. How come? Super-giant hurricanes, obviously. So we can expect force six, or more, hurricanes in the future… It happened before.

Patrice Ayme

Relativistic Philosophy Beyond Consensus

August 4, 2017

It’s good to focus on “General Relativity” and Cosmology without the cloak of mathematics gone wild and unsupervised, indeed.

Anything having to do with “General Relativity” has a lot of extremely debatable philosophy hidden below a thick carpet of computations. Abuse of philosophically unsupervised spacetime leads one to believe in time machines, wormholes, and similar absurdities. A recent discovery such as Dark Energy (space expanding ever faster than previously anticipated), and a not so recent one, Dark Matter, show one has to be extremely careful.

Einstein’s equation of “General Relativity” (GR) is basically Curvature = Mass-Energy. Einstein long observed that the left-hand side of the equation was built of mathematical beauty, and the right-hand side of a murky mud of a mess. The discovery of Dark Matter proved him prophetic about that. (BTW, I know perfectly well that, stricto sensu, it’s the Ricci tensor, derived from the full Curvature tensor, on the left…)
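Spelled out, the standard form of that equation, with the geometric (Ricci) side on the left and the mass-energy (stress-energy) side on the right, and the optional cosmological-constant term omitted:

```latex
R_{\mu\nu} \;-\; \tfrac{1}{2}\,R\,g_{\mu\nu} \;=\; \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
```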

First, a philosophical trap: “General Relativity” (GR) is a misnomer. It’s not clear what is being generalized. GR is certainly a theory of the relationship between gravity and local space-times (the Theory of Relativity of space and time, which Poincaré named that way in 1904).

Einstein was initially motivated to explain inertia according to the Newton-Mach observation that the distant stars seemed to endow matter with inertia (because if matter rotates relative to distant stars, a centrifugal force appears).

In that, he failed, as Kurt Goedel produced spacetime models which rotate wildly without local consequences. Frame dragging nevertheless exists, and general relativistic corrections are crucial to GPS. So GR has local consequences.

Neither Poincaré nor Einstein liked the concept of “spacetime”.

There are massive galaxy clusters, such as Abell 370 (shown here). They can be made up of thousands of Milky Way-sized galaxies. This is beyond anything we can presently have a feeling for. The space inside this cluster is not expanding, that’s a fact; but the space between this cluster and other, unbound, galaxies and clusters is viewed, by today’s Main Stream Cosmology, as expanding. I’m robustly skeptical. Image credit: NASA, ESA/Hubble, HST Frontier Fields.

A question has naturally come up: if space expands, how come we don’t? An answer to this has been the raisin bread model of the expanding universe.

As Sabine Hossenfelder, a theoretical physicist working on Quantum Gravity and high-energy physics, puts it: “In cosmology, too, it helps to first clarify what it is we measure. We don’t measure the size of space between galaxies — how would we do that? We measure the light that comes from distant galaxies. And it turns out to be systematically red-shifted regardless of where we look. A simple way to describe this — a space-time slicing that makes calculations and interpretations easy — is that space between the galaxies expands.”
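The standard textbook relation behind that description: the measured redshift z compares the wavelength at reception to the wavelength at emission, and, in the expanding-space slicing, equals the ratio of the scale factor a at those two times.

```latex
1 + z \;=\; \frac{\lambda_{\mathrm{obs}}}{\lambda_{\mathrm{emit}}} \;=\; \frac{a(t_{\mathrm{obs}})}{a(t_{\mathrm{emit}})}
```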

However, the entire area is contentious. The usual snap-back of the haughty physicist, keen to deny any brains worth noticing to the Commons, is to say that all those who don’t understand the mathematics at hand should shut up.

That’s a disingenuous answer, as NOBODY understands fully the mathematics at hand (those with snappy rejoinders know this, but they enjoy their power maliciously).

An example of the non-universality of the notion of expanding space is the following exact quote from Physics Nobel Laureate Steven Weinberg, co-inventor of the Weinberg-Salam model of the electroweak interaction and author of the most famous textbook on the subject, “Gravitation and Cosmology”: “…how is it possible for space, which is utterly empty, to expand? How can nothing expand? The answer is: space does not expand. Cosmologists sometimes talk about expanding space, but they should know better.”

Well, they don’t.

Reference https://www.physicsforums.com/threads/raisin-bread-model-of-space-time.901290/

Personally, I think that both space and time are local concepts (as long as one does not bring in the Quantum theory as it was created, post-1923, by De Broglie, and after 1924, by the Copenhagen School). Local space and local time are united by the speed of light, c, through naturally ubiquitous light clocks. Space and time are measured locally (although Poincaré proposed slowly moving synchronized clocks around, and Einstein copied and published that mechanism, verbatim, as he had with E = mc²).

It has been proposed that the redshift of cosmological photons, and its attribution, 100%, to the expansion of spacetime, is a proof of the expanding “spacetime”. One must say that this statement is the core of present cosmology, and anybody looking down on the idea will not be viewed as serious by famous physicists. However, just saying something does not prove it. Especially when the conclusion seems to be the hypothesis.

The Lorentz-Poincaré theory of Local Space and Time was experimentally provable (electromagnetism proved it).

But where is the proof that the universe is like an expanding dough, spacetime, with galactic raisin grains in it? Just waving around the notion that the atomic force is 10⁴⁰ times the gravitational force at small scales does not seem compelling to me. It’s rather a question of range: gravitation is much longer range, although much weaker. Thus the geodesic deviations due to gravitation show up at very great distances, whereas those due to atomic and molecular forces cause enormous geodesic deviations, but only at very short range. We are these enormous local deviations, larger by 10⁴⁰ locally.
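The 10⁴⁰ figure can be checked with a minimal sketch: the ratio of the Coulomb force to the gravitational force between an electron and a proton (the textbook comparison; other particle pairs give somewhat different powers of ten).

```python
# Ratio of the electrostatic to the gravitational attraction between an electron
# and a proton. The separation cancels out of the ratio.

k = 8.988e9      # Coulomb constant, N m^2 C^-2
G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
e = 1.602e-19    # elementary charge, C
m_e = 9.109e-31  # electron mass, kg
m_p = 1.673e-27  # proton mass, kg

ratio = (k * e**2) / (G * m_e * m_p)
print(f"F_electric / F_gravity ~ {ratio:.1e}")   # ~2e39
```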

Yet even this more precise argument smacks of hand waving. Why? Because a theory of local forces as curvatures, although posited by Riemann in the 1850s, and the foundation of GR, still does not exist (that’s one thing string theory was trying to achieve, and failed). Gravitation remains the only force that is tautologically equivalent to a curved-space theory.

Quantum Physics has provided that theoretical spacetime with a nonlocal causal architecture (through Quantum Entanglement). However, that “causality”, although geometric, is non-metric (and thus manifests itself with no geodesic deviation, no force).

Einstein, after a debate with the Austrian philosopher Karl Popper on the nonlocality imparted by the Quantum, attracted the world’s attention to that problem in 1935, with his famous EPR paper. There Einstein denounced the way the “spooky action at a distance” affected distant “elements of reality”. Since then, the spookiness at a distance has been amply confirmed (and enables encrypting space communications while knowing with certainty whether they have been breached, as a Chinese satellite recently showed). Nonlocal effects show unambiguously that the metric (of “spacetime”) does not capture all the geometry (a notion which may surprise physicists, but not those mathematicians who have studied the foundations of their field).

This Quantum architecture has led, so far, to no prophecy, let alone theory, from established physicists. Entangled Quantum architecture is actually not part of the General Relativistic raisin-cake model (or of any GR model). However, I will venture to say that one can view it as predicting Dark Matter, at the very least. It’s just a question of baking something more sophisticated than raisin bread.

Patrice Ayme

QUANTUM ENTANGLEMENTS MAKE TIME AN ARROW

May 19, 2017

Through Wave Collapse and the ensuing Entanglements it sometimes brings, QUANTUM PHYSICS CREATES A CAUSAL STRUCTURE, THROUGHOUT THE UNIVERSE, THUS, AN ARROW OF TIME.

Actually it’s more than a simple causal structure: it is an existential structure, as localization creates materialization, in the (Sub-)Quantum Theory I advocate. (It’s a theory in which there are no dead-and-alive cats, but particles in flight are not particles… contrary to what Einstein thought, and more along the lines of Niels Bohr, horror of horrors…) It also means that time, at the smallest scale, is a nonlocal entanglement. This is not weird new-age poetry, but pretty much what the raw formalism of Quantum Physics says. I throw out the challenge to any physicist to contradict this in any way. It’s completely obvious on the face of it.

You read it here first, as they say (although I may have said it before). Is time absolute? How could time be absolute? Where does the Arrow Of Time (Eddington) come from? Is there something else which grows with time?

The old answer is entropy, traditionally denoted by S.

Boltzmann’s equation S = k log P ties entropy S to P, the number of states accessible to the system; the second law then says that S augments during the evolution of an isolated system. Entropy was a construction of later Nineteenth Century physics, a successful attempt to understand the basic laws of thermodynamics (mostly due to Carnot).
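In that notation, a change in the number of accessible states translates directly into an entropy change; as a minimal worked example (using the natural logarithm and the modern value of k), merely doubling the number of accessible states adds a fixed amount:

```latex
\Delta S \;=\; k \,\ln\!\frac{P_{2}}{P_{1}}, \qquad P_{2}=2P_{1} \;\Rightarrow\; \Delta S = k\ln 2 \approx 9.6\times 10^{-24}\ \mathrm{J/K}
```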

A big problem for classical thermodynamics: what’s a state? That’s not clear.

However, Quantum Physics defines states very precisely, and very specifically: a situation, defined in space-time, what Bohr et al. called an “experiment” (rightly so!), defines a number of possible outcomes; the latter become the “states”, a basis for the Hilbert Space which the “experiment” defines.

Classical statistical mechanics does not enjoy such precisely defined states. So why not use the states of Quantum Physics? Some could object that Quantum “experiments” are set up by people. However, Quantum Interactions happen all the time, independently of people. And, just as in the Quantum experiments set up by people, those Quantum Interactions grow something: Quantum Entanglement. (Self-described “Quantum Mechanic” Seth Lloyd, from MIT, has also mentioned that entanglement and the arrow of time could be related.)

Quantum Entanglement has a direction: from where singularization (= localization = the collapse of the Quantum wave packet) happened first, to the distant place whose geometry it creates (yes, entanglement creates geometry; that’s why it’s so baffling to specialists!)

Quantum Physics, Or, More Precisely, What I call QUANTUM INTERACTIONS are irreversible processes. Hence the Arrow Of Time

So we have two things which grow, and can’t be reversed: Time, and Wave Collapse/Quantum Entanglement. I propose to identify them. (After all, Maxwell proposed to identify electromagnetic waves and light, just because both are waves and travel at the same speed; it turned out to be a magnificent insight.)
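Maxwell’s identification can be re-done in two lines: the speed of electromagnetic waves comes out of the vacuum permittivity and permeability alone, and matches the measured speed of light.

```python
# Maxwell's check: the speed of electromagnetic waves, 1/sqrt(mu0 * eps0),
# computed from the vacuum constants, equals the measured speed of light.
import math

mu0 = 4 * math.pi * 1e-7   # vacuum permeability, H/m (classical defined value)
eps0 = 8.854e-12           # vacuum permittivity, F/m

c = 1.0 / math.sqrt(mu0 * eps0)
print(f"1/sqrt(mu0*eps0) = {c:.3e} m/s")   # ~2.998e8 m/s
```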

Quantum Wave function collapse is time irreversible (actually, the entire Quantum Wave deployment is time irreversible, because it depends only upon the geometry it’s deployed in). The mechanism of wave function collapse is philosophically a matter of often obscure interpretations, and arguably the greatest problem in physics and philosophy.

My position here is perfectly coherent: I believe the Quantum Waves are real. (So I do not believe the waves are waves of ignorance, an artefact, as some partisans of Quantum decoherence have it.) Those objective waves are real, although not always in one piece (that’s how I generate Cold Dark Matter).

By the way, it is the collapse of the Quantum Wave which “creates” the Quantum Entanglement. At least, that’s how the mathematics, the description of the theory, has it! The picture it creates in one’s mind (first the wave, then the collapse, then the entanglement) makes sense. Actually, I am arguing that this is how sense makes sense!

Quantum Entanglement is a proven experimental fact; all physicists have to agree with that. Thus the Quantum Wave has to be real, as it is the cause of the Quantum Entanglement! (I am pointing out here that those who believe in Entanglement, and that’s now nearly all of them, are incoherent if they don’t believe in the wave too!)

Jules Henri Poincaré had seen that time and space were not equivalent. That was meritorious, as Poincaré had proposed the original ideas of “local time” and “local space” theories, which are the fundamental backbones of Special Relativity (they are deduced from the constancy of the speed of light).

Even Einstein publicly frowned on the concept of “spacetime”, which identifies space and time; “spacetime” was proposed by Minkowski, Einstein’s own professor at the ETH… They may not have been friends, as Minkowski compared Einstein to a “lazy dog”; Einstein, of course, respected Poincaré so much that he grabbed the entire theory of Relativity from him, including its name…

Quantum Physics does not outright treat time as equivalent to space, quite the opposite (although Quantum Field theorists have tried to, and do treat space and “imaginary time” as the same!). In fundamental Quantum Physics, time is a one-parameter group of transformations, not really a dimension.
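Concretely, in the standard formalism (a textbook statement, not specific to the theory advocated here), time evolution is generated by the Hamiltonian H, and composing two evolutions simply adds their parameters, which is exactly what “one-parameter group” means:

```latex
U(t) \;=\; e^{-iHt/\hbar}, \qquad U(t_{1})\,U(t_{2}) \;=\; U(t_{1}+t_{2}), \qquad U(0)=\mathbb{1}
```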

When a glass falls and shatters, Classical Mechanics is at a loss: “Why can’t it reassemble itself, with as little work?” Classical Thermodynamics mumbles: “Because Entropy augments.” (That may be a tenable position, but one will have to count the states of the glass in a Quantum way. Even then, the full energy computation will reveal a lack of symmetry.)

I say, simply: “A glass which has shattered can’t be reassembled, because Quantum Interactions, and the ensuing entanglements, happen.” The resulting topology of cause and effect is more complicated than what one started with, and can’t be reversed. Quantum Interactions, and the ensuing effects at a distance they provide, create a partial, nonlocal ordering of the universe: Time. (Once a set has been physically defined, it has been thoroughly interacted with, Quantum Mechanically, and then it becomes a “well-ordering”!)

So what’s time? The causal structure of the universe as determined by irreversible, causal Quantum Wave collapse and Quantum Entanglement.

Patrice Ayme’