## Archive for the ‘Cosmology’ Category

### Sub Quantum Physical Reality (SQPR) Implies An Anisotropic Universe

April 16, 2020

Sub Quantum Physical Reality (SQPR) uses CIQ (the Copenhagen Interpretation of Quantum) as a first order, linear approximation. SQPR adds to CIQ just one axiom: the Quantum Interaction (aka collapse, aka decoherence, aka entanglement) proceeds at an enormous speed TAU (at least 10^23 c, the supposed speed of Cosmic Inflation in the standard Big Bang model known as LCDM).

(Yes, having an entanglement speed of 10^23 c would contradict Jules Henri Poincaré’s idea that the speed of light, c, is an absolute speed limit, the fundamental idea of his Theory of Relativity… But it does not actually contradict Relativity… in part because of several logical twists many overlook: Poincaré’s Relativity is strictly local, at the neighborhood ultrafilter limit… and says something between events, not within events… moreover, the speed of space is not the speed of light. TAU is the speed of these expanding chunks of space we call “particles”, although when in translation, they are known to behave like waves).

Out of this added hypothesis comes Dark Matter and probably Dark Energy (I say probably because DE is even more mysterious than DM).

Galaxies galore… To believe that Solar System physics works on the grandest scale is silly. Actually, that what’s grand can have nothing to do with what’s little is the fundamental hypothesis of “General Relativity”. In particular, GR predicts that light can go in circles, or stop (and even Einstein was aware of this). That has been experimentally demonstrated for more than a century…

How does SQPR predict Dark Matter? Some Quantum Interactions, on a cosmic scale, will leave mass-energy behind, because TAU is not fast enough for a proper collapse. This is Dark Matter. In particular, DM will tend to gather in rings around galactic cores. Another remnant, from the tearing apart of fundamental processes, averaged on a cosmic scale, is the 3 degree Kelvin background radiation [1].
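To give a feel for the scales involved, here is a minimal arithmetic sketch, assuming the lower bound TAU = 10^23 c stated above; the distances are rough round numbers, not measurements:

```python
# Illustrative arithmetic only: how long a collapse front moving at
# TAU = 1e23 * c would take to cross various distances.
c = 2.998e8            # speed of light, m/s
TAU = 1e23 * c         # SQPR's assumed minimum speed of the Quantum Interaction
ly = 9.461e15          # one light year, m

crossing = {
    "Milky Way (1e5 ly)": 1e5 * ly / TAU,
    "1 Gpc (3.26e9 ly)": 3.26e9 * ly / TAU,
}
for name, t in crossing.items():
    print(f"{name}: {t:.2e} s")
# Even at 1e23 c, a gigaparsec takes ~1e-6 s to cross: enormous compared
# to characteristic quantum timescales (1e-16 s and shorter), which is the
# window in which, per SQPR, a collapse could fail to complete.
```

The point of the sketch: on cosmic distances, even a fantastically fast collapse becomes slow relative to quantum processes.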

If one assumes that interactions are mediated by quantum processes, the SQPR mechanism will apply to them. Thus long range forces will be affected, and distorted. In particular gravity [2]. Hence the prediction of DE (details in the future, as DE is too mysterious to describe cogently beyond the basic fact that it accelerates the cosmic expansion… How exactly is not clear).

Depending upon the exact geometry of a part of the universe then gravity will be affected… thus affecting the expansion speed, and making it potentially anisotropic, as observed in 2020. Now how to explain my slightly more sensationalist title, beyond a natural love for the spectacular?

Simple: any theory incorporating some sort of gravitation (a universal attraction) will develop grains, hence a chaotic geometry in the distribution of matter. At that point the Quantum Interaction (what causes Quantum Collapse) will guarantee that lumps of Dark Matter form. In turn, if virtual particles are really real (and not just a mathematical symbolism, as they are presently), and thus submitted to the Quantum Interaction, we will see variation in gravitation on a cosmic scale, namely a lowering of attraction in some regions. Hence the tabloid inspiration. By the way, the same reasoning gives a reason for the expanding universe in toto… [3].

Patrice Ayme

***

[1] So SQPR throws out the main reason for the entire Big Bang. One thing is gained by SQPR: the need for Cosmic Inflation disappears (cosmic inflation is an ad hoc hypothesis built to make LCDM work).

***

[2] MOND boldly explains away DM by claiming that the 1/dd law, the inverse square of the distance law, is not valid anymore. So the Dark Matter effect, according to MOND, is not caused by a mysterious form of matter (explained by SQPR), but simply by a failure of gravitation.

That 1/dd law is the bedrock of so-called “General Relativity” (which, despite its grandiloquent name, is truly *just* an improved theory of gravitation; one could call it Grav 4… Grav 1 being Boulliau/Bullialdus (1/dd), Grav 2 Newton (equivalence with Kepler), Grav 3 Laplace (waves), Grav 4 (Riemann-Poincaré-Einstein)).

MOND looks impossible, and is also ugly (being ad hoc, without deep, or logically simplifying, justification). SQPR is natural, logically lean… But it also predicts, in conjunction with the assumption that all forces are Quantum mediated, that at scales of the order of galactic clusters, gravity will turn into some sort of locally geometrically dependent MOND…
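For concreteness, here is a hedged sketch of what MOND does to a rotation curve, using the common “simple” interpolating function and the usual acceleration scale a0 ≈ 1.2×10⁻¹⁰ m/s² (illustrative values for a point-mass galaxy, not a claim about any specific system):

```python
import numpy as np

# Sketch: Newtonian vs MOND circular velocity around a point mass.
# With the "simple" interpolating function mu(x) = x/(1+x), the MOND
# acceleration a solves a * mu(a/a0) = a_N, a quadratic in a.
G = 6.674e-11                      # m^3 kg^-1 s^-2
M = 1e11 * 1.989e30                # a galaxy-scale mass, kg (illustrative)
a0 = 1.2e-10                       # MOND acceleration scale, m/s^2

def v_newton(r):
    return np.sqrt(G * M / r)

def v_mond(r):
    aN = G * M / r**2
    a = 0.5 * (aN + np.sqrt(aN**2 + 4.0 * aN * a0))  # root of a^2 - aN*a - aN*a0 = 0
    return np.sqrt(a * r)

for r in [3.086e20, 3.086e21, 3.086e22]:             # 10, 100, 1000 kpc
    print(f"r = {r:.1e} m: Newton {v_newton(r)/1e3:.0f} km/s, "
          f"MOND {v_mond(r)/1e3:.0f} km/s")
# Newtonian velocity falls as 1/sqrt(r); the MOND curve flattens toward
# (G*M*a0)**0.25, about 200 km/s here: a "Dark Matter effect" without matter.
```

The flat asymptote (G·M·a0)^(1/4) is the baryonic Tully-Fisher behavior MOND was built to reproduce.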

***

[3] Esteemed commenter physicist Ian Miller (FRS) correctly observes that: “I don’t see that [Cosmic Anisotropy] cancels out the big bang concept, but it certainly puts the cat amongst the pigeons for cosmic inflation, which was devised to account for the difficulty in explaining the complete symmetry and uniformity of the expansion. Since it is no longer so uniform or symmetric, presumably cosmic inflation should disappear. Any bets on whether such an idea with so many devout followers will die out?” The problem is that the Big Bang LCDM model doesn’t work without cosmic inflation. So then the Big Bang is down to two arguments: the cosmic background radiation (answered above by SQPR), and the fabrication of helium (supposedly stars can’t do it, but those stars, made the old-fashioned way, were weaklings… Thus that could be answered by a very old universe, as I have proposed, and as is necessary now that we know Cosmic Inflation is lame, at best… with stars or supernovae hotter than classical theory had it; it seems such stars have been observed…)

### Exit Big Bang? The Universe Is Anisotropic!

April 14, 2020

Of few things truly certain we are, but of many things most falsely speak…

Astronomers assumed for decades, without any proof, that the Universe was expanding at the same rate in all directions: it was simpler that way (after all, some hangers-on were claiming they were present during the “First Three Minutes”!… and thus became very famous…). A new study based on data from ESA’s XMM-Newton, NASA’s Chandra and the German-led ROSAT X-ray observatories suggests this key premise of cosmology might be wrong.

The Universe in simplified glory. However… Not as simple as expected! The blue areas expand more slowly than expected, the yellow areas faster. In isotropy, the image would be monochromatic red. Credit: © Konstantinos Nikolaos Migkas, Uni Bonn/Astronomy & Astrophysics. And the differences are not small: thirty percent! (30%!)

The isotropy hypothesis says that the Universe has, despite some local differences, the same properties in each direction on the large scale. The hypothesis has been supported by observations of the cosmic microwave background (CMB). An alleged direct remnant of the Big Bang, the CMB would reflect the state of the Universe as it was in its infancy, at only 380 000 years of age. The CMB’s uniform distribution in the sky suggested that in those early days the Universe must have been expanding at the same rate in all directions.

If this were still true in more recent times, the expansion rates inferred from galaxy clusters should average out across the sky. But significant differences were observed.

The astronomers used X-ray temperature measurements of the extremely hot gas that pervades the clusters and compared the data with how bright the clusters appear in the sky. Clusters of the same temperature and located at a similar distance should appear similarly bright. But that is not what the astronomers observed.

Clusters with the same properties, with similar temperatures, appeared to be less bright than expected in one direction of the sky, and brighter than expected in another direction. The difference was quite significant, around 30 percent. These differences are not random but have a clear pattern depending on the direction in which we observe the sky.

Before challenging the widely accepted status quo ante, the cosmology model known as LCDM, which provides the basis for estimating the cluster distances, other possible explanations were looked at. Perhaps, there could be undetected gas or dust clouds obscuring the view and making clusters in a certain area appear dimmer. The data, however, do not support this scenario. Nor does it support that the distribution of clusters is affected by bulk flows, large-scale motions of matter caused by the gravitational pull of extremely massive structures such as large cluster groups.

The authors speculate that this uneven effect on cosmic expansion might be caused by Dark Energy, the mysterious component of the cosmos which accounts for the majority (around 69%) of its overall energy. Very little is known about dark energy today, except that it appears to have been accelerating the expansion of the Universe in the past few billion years.

Meanwhile, lots of things will have to be recomputed… And the flow of surprises from the heavens doesn’t stop here… A Milky Way-sized Dark Matter galaxy may have been discovered…

Patrice Ayme

### “Fuzzy” Dark Matter & Sub Quantum Physical Reality (SQPR)

October 17, 2019

Abstract: An early Quantum universe would have appeared “fuzzy”, and striated, from Quantum self-interference… if one adopts one basic consequence of my own SQPR theory: Dark Matter is made of ultra-light, ultra-low momentum particles. A team of physicists at prestigious institutions, by adopting this conclusion of SQPR, gets a drastically different-looking model explaining the filamentary nature of galaxy distributions. (This completely new approach is indirectly rather supportive of SQPR… and very different from the usual LCDM; it should be testable soon, with new telescopes under construction…)

***

According to official, ruling Big Bang theory, Dark Matter was the starting ingredient for coagulating the very first galaxies in the universe. According to that “LCDM” model, shortly after the Big Bang, particles of Dark Matter clumped together in gravitational “halos,” pulling surrounding gas into their cores, which over time cooled and condensed into the first galaxies. [1]

Thus a curious situation: Dark Matter is considered the backbone to the structure of the universe, while physicists know very little about its nature, because the DM “particles” have so far evaded detection.

Now scientists at MIT, Princeton University, and Cambridge University have admitted the obvious, namely that the early universe, and the very first galaxies, would have looked very different depending upon the exact nature of Dark Matter. They simulated what early galaxy formation would have looked like if Dark Matter were “fuzzy,” rather than cold or warm. “Fuzzy” here has a precise definition: it means very low momentum DM “particles”. Such “fuzzy” particles are what my own theory, SQPR, is full of, as a consequence of my hypothesis that Quantum Mechanics is LOCAL.

Left is the conventional distribution-of-galaxies prediction of the conventional Big Bang (“LCDM”). Center is that with “warm” Dark Matter. Right is the Quantum “fuzzy” DM model (compatible with SQPR).

Light Mechanics, electromagnetism, is local: this is also called Relativity (Poincaré named it thus). QM being a generalization of Light Mechanics, it is natural that it would be local too: this is the fundamental axiom of SQPR.

In that most widely accepted scenario, the so-called LCDM (Lambda Cold Dark Matter) model of the early universe, Dark Matter is Cold: it is made up of slow-moving particles that, aside from gravitational effects, have no interaction with ordinary matter (SQPR readily explains why DM doesn’t interact but gravitationally).

In LCDM, Warm Dark Matter is thought to be a slightly lighter and faster version of Cold Dark Matter (it has been heated by galaxies).

Fuzzy Dark Matter is, for official physics, a new concept, something entirely different, consisting of ultralight particles, each about one octillionth (10^(-27)) of the mass of an electron (the Cold Dark Matter particles of LCDM are far heavier: about 100 times more massive than an electron). Repeat: the proposed mass for Dark Matter particles in this new simulation is the mass of an electron divided by 1,000,000,000,000,000,000,000,000,000.
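A quick sanity check of that number (hedged arithmetic only, using the electron rest mass-energy of about 0.511 MeV): one octillionth of an electron mass lands right at the ~10⁻²² eV scale usually quoted for fuzzy Dark Matter.

```python
# Arithmetic check: one octillionth (1e-27) of the electron mass, in eV.
m_electron_eV = 0.511e6             # electron rest mass-energy, ~511 keV
m_fuzzy_eV = m_electron_eV * 1e-27  # the proposed "fuzzy" particle mass
print(m_fuzzy_eV)                   # ~5e-22 eV, the usual fuzzy-DM ballpark
```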

Now we are talking. This is the sort of numbers my own theory, SQPR considers.

The Millennium Simulation (below) is an example of an over-10-billion-particle simulation that tries to reproduce the cosmic web of dark matter upon which exist the galaxy clusters, filaments, and voids we see today. The LCDM (Lambda Cold Dark Matter) model of the universe assumes a flat universe now dominated by a cosmological constant Λ (Lambda), Einstein’s Cosmological Constant (Dark Energy?). As I said, cosmological large-structure formation is dominated by cold (non-relativistic) dark matter.

A view of the distribution of dark matter in our universe, based on the Millennium Simulation. The simulation is based on our current ideas about the universe’s origin and evolution. It included ten billion particles, and consumed 343,000 cpu-hours (Image: Virgo Consortium).

Researchers found that if Dark Matter is cold, then galaxies in the early universe would have formed in nearly spherical halos, with ten times too much mass there. But if the nature of Dark Matter is fuzzy or warm, the early universe would have looked very different, with galaxies forming first in extended, tail-like filaments. In a fuzzy universe, these filaments would have appeared striated, like star-lit strings on a harp… As observed.

As new telescopes come online, with the ability to see further back into the early universe, scientists may be able to deduce, from the pattern of galaxy formation, whether the nature of dark matter, which today makes up nearly 85 percent of the matter in the universe, is fuzzy as opposed to cold or warm.

“The first galaxies in the early universe may illuminate what type of dark matter we have today,” says Mark Vogelsberger, associate professor of physics in MIT’s Kavli Institute for Astrophysics and Space Research. “Either we see this filament pattern, and fuzzy dark matter is plausible, or we don’t, and we can rule that model out. We now have a blueprint for how to do this.” [2]

Fuzzy Quantum Waves:

While dark matter has yet to be directly detected, the hypothesis that describes dark matter as cold has proven successful at describing the large-scale structure of the observable universe. As a result, models of galaxy formation are based on the assumption that dark matter is cold.

“The problem is, there are some discrepancies between observations and predictions of cold dark matter,” Vogelsberger points out. “For example, if you look at very small galaxies, the inferred distribution of dark matter within these galaxies doesn’t perfectly agree with what theoretical models predict. So there is tension there.” This is a euphemism: According to LCDM, the heavy DM particles should sink towards the core of galaxies, and this is exactly what is not observed.

Enter, then, alternative theories for dark matter, including warm, and fuzzy, which researchers have proposed in recent years.

“The nature of dark matter is still a mystery,” Fialkov says. “Fuzzy dark matter is motivated by fundamental physics, for instance, string theory, and thus is an interesting dark matter candidate. Cosmic structures hold the key to validating or ruling out such dark matter models.”

Fuzzy dark matter is made up of particles that are so light that they act in a quantum, wave-like fashion, rather than as individual particles. This quantum, fuzzy nature, Mocz says, could have produced early galaxies that look entirely different from what standard models predict for cold dark matter.

“Even though in the late universe these different dark matter scenarios may predict similar shapes for galaxies, the first galaxies would be strikingly different, which will give us a clue about what dark matter is,” Mocz says.

To see how different a cold early universe could be, relative to a fuzzy early universe, the researchers simulated a small, cubic space of the early universe, measuring about 3 million light years across, and ran it forward in time to see how galaxies would form given one of the three dark matter scenarios: cold, warm, and fuzzy.

The team began each simulation by assuming a certain distribution of dark matter, which scientists have some idea of, based on measurements of the cosmic microwave background: “relic radiation” emitted by the infant universe, and detectable from just 400,000 years after the alleged Big Bang. Dark matter doesn’t have a constant density, even at these early times. There are tiny perturbations on top of a constant density field. Those perturbations would gather more Dark Matter, nonlinearly.

The researchers were able to use existing algorithms to simulate galaxy formation under scenarios of cold and warm dark matter. But to simulate fuzzy dark matter, with its quantum nature, they needed to bring in the Quantum.

A cosmological map of Interfering Quantum strings:

To the usual simulation of cold dark matter were added two extra equations in order to simulate galaxy formation in a fuzzy dark matter universe. The first, Schrödinger’s equation, describes how a quantum wave evolves in the presence of (potential) energy, while the second, Poisson’s equation, describes how that (self-interfering) quantum wave generates a density field, or distribution of Dark Matter, and how that distribution leads to (uneven) gravity — the force that eventually pulls in matter to form galaxies. They then coupled this simulation to a model that describes the behavior of gas in the universe, and the way it condenses into galaxies in response to gravitational effects.
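The coupled Schrödinger-Poisson system described above can be sketched in a toy one-dimensional form. The following is a minimal illustration only (periodic box, normalized units with ħ/m = 1 and 4πG = 1, an invented seed perturbation), not the team’s actual code; it uses the standard split-step Fourier method for the Schrödinger equation and solves Poisson’s equation spectrally:

```python
import numpy as np

# Toy 1D Schrödinger-Poisson ("fuzzy dark matter") evolution.
# Units: hbar/m = 1, 4*pi*G = 1; density is relative to the cosmic mean.
N, L = 256, 100.0
x = np.linspace(0.0, L, N, endpoint=False)
k = 2.0 * np.pi * np.fft.fftfreq(N, d=L / N)

psi = (1.0 + 0.1 * np.cos(2.0 * np.pi * x / L)).astype(complex)  # seed over-density
psi /= np.sqrt(np.mean(np.abs(psi) ** 2))                        # mean density = 1

def potential(psi):
    """Solve d^2 V / dx^2 = |psi|^2 - 1 spectrally (periodic box)."""
    rho_k = np.fft.fft(np.abs(psi) ** 2 - 1.0)
    k2 = np.where(k == 0.0, 1.0, k ** 2)
    V_k = -rho_k / k2
    V_k[0] = 0.0          # zero-mean potential
    return np.real(np.fft.ifft(V_k))

dt, steps = 0.1, 60
for _ in range(steps):    # kick-drift-kick split-step (unitary)
    psi *= np.exp(-0.5j * dt * potential(psi))
    psi = np.fft.ifft(np.exp(-0.5j * dt * k ** 2) * np.fft.fft(psi))
    psi *= np.exp(-0.5j * dt * potential(psi))

rho = np.abs(psi) ** 2
print(rho.max(), rho.mean())  # the seed collapses; mean density is conserved
```

Because the evolution is unitary, total mass is conserved exactly; the over-density grows under self-gravity, and the interference of the collapsing waves is what produces the striated density pattern the article describes.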

In all three scenarios, galaxies formed wherever there were over-densities, or large concentrations of gravitationally collapsed Dark Matter. The pattern of this Dark Matter, however, was different, depending on whether it was cold, warm, or fuzzy.

In a scenario of cold dark matter, galaxies formed in spherical halos, as well as smaller subhalos. Warm Dark Matter produced the first galaxies in tail-like filaments, and no subhalos. This may be due to warm dark matter’s lighter, faster nature, making particles less likely to stick around in smaller, subhalo clumps.

Similar to warm dark matter, fuzzy dark matter formed stars along filaments. But then quantum wave effects took over in shaping the galaxies, which formed more striated filaments, like strings on an invisible harp. This striated pattern is due to interference, an effect that occurs when two waves overlap, as in the famous Double Slit experiment. In waves of light, points where crests align with crests reinforce into bright spots, while points where crests align with troughs cancel into dark spots, creating an alternating pattern of bright and dark regions.

In the case of fuzzy dark matter, instead of bright and dark points, it generates an alternating pattern of over-dense and under-dense concentrations of dark matter.

“You would get a lot of gravitational pull at these over-densities, and the gas would follow, and at some point would form galaxies along those over-densities, and not the under-densities. This picture would be replicated throughout the early universe,” Vogelsberger explains.

The team is developing more detailed predictions of what early galaxies may have looked like in a universe dominated by fuzzy dark matter. Their goal is to provide a map for upcoming telescopes, such as the James Webb Space Telescope, that may be able to look far enough back in time to spot the earliest galaxies. If they see filamentary galaxies such as those simulated by Mocz, Fialkov, Vogelsberger, and their colleagues, it could be the first signs that Dark Matter’s nature is fuzzy.

“It’s this observational test we can provide for the nature of dark matter, based on observations of the early universe, which will become feasible in the next couple of years,” Vogelsberger says.

SQPR predicts less “fuzzy” Dark Matter in the earlier universe. However, a lot of the effects described by the MIT team would nevertheless happen, and for the same exact reasons. So the appearance of striated structures would not be surprising… even if LCDM were completely wrong.

Patrice Ayme

***

[1] There is a famous theorem that Newton needed for his celestial mechanics and tried to prove (and may have succeeded in proving; it’s controversial whether he did), according to which a ball of mass M acts gravitationally as a point of mass M.
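The theorem is easy to check numerically. A sketch with made-up unit values (G = M = R = 1), integrating the shell’s potential ring by ring over the polar angle:

```python
import numpy as np

# Numerical check of Newton's shell theorem: the potential of a uniform
# spherical shell (mass M, radius R) at an exterior point a distance d
# from its center matches that of a point mass M at the center.
G, M, R, d = 1.0, 1.0, 1.0, 3.0
n = 200_000
theta = (np.arange(n) + 0.5) * np.pi / n                # midpoint rule on [0, pi]
s = np.sqrt(R**2 + d**2 - 2.0 * R * d * np.cos(theta))  # ring-to-point distance
dm = 0.5 * M * np.sin(theta)                            # shell mass per unit theta
phi_shell = -G * np.sum(dm / s) * (np.pi / n)
phi_point = -G * M / d
print(phi_shell, phi_point)  # agree to many decimal places
```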

***

[2] Vogelsberger is a co-author of a paper appearing (October 2019) in Physical Review Letters, along with the paper’s lead author, Philip Mocz of Princeton University, and Anastasia Fialkov of Cambridge University and previously the University of Sussex.

### Will Starburst Galaxies Explode the Big Bang?

June 11, 2018

There are MANY MORE GIANT STARS THAN EXPECTED IN THE UNIVERSE (Factor of ten?)

I have proposed that the Big Bang Model is wrong, and that the universe could be much older, of the order of 100 billion years old, not 13.8 billion years; my iconoclastic and inconsiderate reasoning was philosophical: we have one expansion mechanism, DARK ENERGY. That expansion, Dark Energy, was directly observed; it exists, it’s not a figment of imagination. Many a physicist made a sour face, as Dark Energy was not expected at all: hundreds of arrogant claims to explain the whole universe, talking to the media and the gullible as if one were god, and then, next thing one knows, one’s theories don’t explain 95% of the universe…

So an insolent philosophy asked: ‘Why would we need another cosmic expansion mechanism?’ Especially one expansion mechanism NOT directly observed, a figment of the imagination, the so-called Inflaton Field, necessary to make the Big Bang theory work (because of arcane complications: basically the universe as observed is around 100 billion light years across, and can have got that big only if it expanded at 10^10 times the speed of light, or something like this… Confusing enough? I have explained what is going on here and there, such as the locality of the speed of light, and the embedding theorem of Lorentzian manifolds. Stay tuned…)
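The “around 100 billion light years across” figure can be reproduced with a back-of-the-envelope integration of the flat LCDM comoving distance, D = c ∫ dz / H(z). A hedged sketch with illustrative parameter values (the radiation term is a crude addition so the high-redshift tail behaves):

```python
import numpy as np

# Rough comoving radius of the observable universe in flat LCDM.
H0 = 67.7                    # km/s/Mpc (illustrative value)
c = 299792.458               # km/s
Om, Orad = 0.31, 9e-5        # matter and (crude) radiation densities
OL = 1.0 - Om - Orad         # cosmological constant

def E(z):
    return np.sqrt(Om * (1 + z)**3 + Orad * (1 + z)**4 + OL)

n, zmax = 2_000_000, 3000.0  # midpoint rule, far past the CMB at z ~ 1100
z = (np.arange(n) + 0.5) * zmax / n
D_mpc = (c / H0) * np.sum(1.0 / E(z)) * (zmax / n)
D_gly = D_mpc * 3.2616e-3    # 1 Mpc = 3.2616e6 light years
print(D_gly)                 # ~46 Gly radius, i.e. ~92 Gly across
```

Note the contrast driving the whole argument: a comoving radius of roughly 46 billion light years, versus an age of only 13.8 billion years.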

A (Non Spectacular) Starburst galaxy, the Cigar, 12 million light year away. Full starburst galaxies are very blue, from the giant extremely hot (thus blue) stars in their midst. How much do we know about Helium formation in such super giant stars? Philosophers want to know!

So why is the Big Bang necessary? Besides making some people more puffed up than god itself?

Inspired by the H bombs they were thoroughly familiar with, Gamow, Alpher and Herman proposed the hot Big Bang as a means to produce all of the elements: extreme heat caused collisions and the nuclei fused (from the “STRONG FORCE”).

The lightest elements (hydrogen, helium, deuterium, lithium) were produced in Big Bang nucleosynthesis.

Ms. Burbidge, Mr. Burbidge, Fowler and Hoyle worked out the nucleosynthesis processes that go on in stars, where the much greater density and longer time scales allow the triple-alpha process (He + He + He → C) to proceed and make the elements heavier than helium.

But BBFH could not produce enough helium. The solution, which Hoyle didn’t like at all, was to make the Helium in the Big Bang. Now we think we know that both processes occur: most helium is produced in the Big Bang but carbon and everything heavier is produced in stars. Most lithium and beryllium is produced by cosmic ray collisions breaking up some of the carbon produced in stars.
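The energetics of the triple-alpha step can be checked from atomic masses (values quoted from memory, so treat them as approximate: ⁴He ≈ 4.002602 u, ¹²C = 12 u by definition, 1 u ≈ 931.494 MeV):

```python
# Energy released by the triple-alpha process, 3 He-4 -> C-12 + gamma.
u_MeV = 931.494   # 1 atomic mass unit in MeV (approximate)
m_He4 = 4.002602  # atomic mass of He-4, in u (approximate)
m_C12 = 12.0      # C-12 defines the atomic mass unit
Q = (3 * m_He4 - m_C12) * u_MeV
print(Q)          # ~7.27 MeV released per carbon nucleus formed
```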

In a pirouette, Helium abundance is now viewed as the observation which makes the Big Bang necessary… Yet, all this rests on an ironclad understanding of stellar physics… which we assume we have, although we don’t.

Astronomers at the gigantic, high altitude Atacama Large Millimeter/submillimeter Array (ALMA) in Chile investigated intense bouts of star formation in four distant, gas-rich starburst galaxies, where new stars are formed 100 or more times faster than they are in the Milky Way.

By looking at isotope ratios in Inter Stellar Medium (ISM) carbon monoxide, CO, one can see whether it was generated in light, or heavy, stars. To quote from the original article in Nature:

“Oxygen, carbon and their stable isotopes are produced solely by nucleosynthesis in stars. The minor isotopes, 13C and 18O, are released mainly by low- and intermediate-mass stars (those with stellar mass less than eight solar masses, M* < 8M⊙) and massive stars (M* > 8M⊙), respectively, owing to their differing energy barriers in nuclear reactions and evolution of stars. These isotopes then mix with the interstellar medium (ISM) such that the 13C/18O abundance ratio measured in the ISM becomes a ‘fossil’, imprinted by evolutionary history and the stellar initial mass function (IMF). The abundances of the 13CO and C18O isotopologues in the molecular ISM, whose measurements are immune to the pernicious effects of dust, are therefore a very sensitive index of the IMF in galaxies.”

***

Conclusion of the Nature article:

Classical ideas about the evolutionary tracks of galaxies and our understanding of cosmic star-formation history are challenged. Fundamental parameters governing galaxy formation and evolution—star-formation rates, stellar masses, gas-depletion and dust-formation timescales, dust extinction laws, and more—must be re-addressed, exploiting recent advances in stellar physics.

This doesn’t prove my ideas about the universe are right. Yet the article mentions that star formation rates have to be lowered by a factor of… seven. (I will resist multiplying 13.8 billion by 7, which is… not making this up, very close to 97 billion…)

This doesn’t prove my ideas about the universe are right… But it goes my way… OK, let a professional conclude:

“Our findings lead us to question our understanding of cosmic history,” Rob Ivison, co-author of the study and director for science at the European Southern Observatory, said in the statement. “Astronomers building models of the universe must now go back to the drawing board, with yet more sophistication required.”

Moods in science cannot change until evidence contrary to the old visions one had of things accumulates. Before that, a change of paradigm can’t be hoped for. Long ago, when I used to be all too human, I communicated with a director at ESO. I am delighted by the change of tone, not to say mood… (Another guy I knew was so arrogant that he posited one was not really a scientist until one was the director of a lab, which he happened to be… in astrophysics, the field at hand, where it turns out the big picture was missed…)

But, ladies and gentlemen, remember this: wisdom, even scientific wisdom, doesn’t always triumph in a timely manner. We have examples in science, and mathematics, where wisdom was delayed and defeated for 24 centuries… by the greatest stupidity.

Patrice Ayme

***

Examples of delayed wisdom: a) The Atomic Theory, of course, complete with eternal motion in the small (which the Greeks had observed, and which is strikingly described by Lucretius). The theory was then forgotten until the 19C.

b) The Archimedean Axiom in arithmetic/the theory of infinity, whose circumvention went undetected until circa 1960, when the logician/mathematician Abraham Robinson detected it (founding Non-Standard Analysis).

c) Non-Euclidean geometry found 24 centuries ago, and then lost until 1830 CE…

d) Biological evolution theory, lost between Anaximander and Lamarck… Although practiced by all serious breeders (especially Greek).

e) Computers, lost for 17 centuries… we have one surviving proof, the Antikythera mechanism (and various written descriptions), and then nothing until Blaise Pascal… Hence the computer language “Pascal”.

f) The Heliocentric theory of Aristarchus of Samos, lost between Archimedes and Buridan (and buried again by Catholicism). Heliocentrism was of course obvious, except if one is a caveman, and not too observant…

g) And of course that the Earth was round, and how big, established and measured first by the great scientist and explorer Pytheas of Massalia (Pytheas de Marseilles), circa 320 BCE. Pytheas even related the tides to the Moon, and got the explanation roughly right (whereas Galileo Galilei, 19 centuries later, got the explanation of the tides completely wrong; and not just that, but got into a near-lethal fight with his friend the Pope, whom he brushed off as an ignorant… when the Pope was actually less wrong than Galileo…)

### Ever Darker Universe Expanding Ever Faster?

June 3, 2016

The most important discoveries in physics of the last 50 years are Dark Matter, and so-called Dark Energy.

The two most precise methods to evaluate the accelerated expansion of the Universe disagree by 9%. This surfaces from a recent 2016 paper. I am astounded by the fact that different methods agree so much.

A paper detailing the discrepancy, reported on the pre-print server Arxiv in April by Adam Riess of the Space Telescope Science Institute in Baltimore, Maryland, and colleagues, accepted by The Astrophysical Journal, reveals the slight discrepancy between the methods we have of measuring the expansion of the universe.
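The size of the discrepancy is simple arithmetic. Taking the two headline numbers from that period (quoted from memory and so approximate: a local, Cepheid-ladder value near 73.2 km/s/Mpc against a CMB-inferred value near 66.9 km/s/Mpc):

```python
# The ~9% Hubble-constant discrepancy, as simple arithmetic.
H0_local = 73.2   # km/s/Mpc, Cepheid + supernova ladder (approximate)
H0_cmb = 66.9     # km/s/Mpc, inferred from the CMB under LCDM (approximate)
gap = (H0_local - H0_cmb) / H0_cmb
print(f"{100 * gap:.1f}%")  # roughly 9 percent
```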

Not auspicious for life: Cepheid Stars Enable Distances To Be Computed. RS Puppis, Shown Here, Varies By A Factor Of 5 Every 40 Days.

One method looks at dimples in the cosmic microwave background (CMB), a glow supposedly left behind by the hot, early universe just a few hundred thousand years after the alleged Big Bang. Space-based observatories like NASA’s WMAP and ESA’s Planck have measured small fluctuations in temperature in the CMB. Assuming we understand the physics in extreme detail, the size of these fluctuations lets physicists calculate how fast the universe has been expanding since it began, some 13.7 billion years ago.

The other method measures how distant galaxies appear to recede from us as the universe expands, using stars and supernovae of type Ia, which have a known brightness to estimate the distance to those galaxies. These Type Ia supernovae measurements led to the discovery of dark energy, and earned Riess and other physicists in Berkeley and Australia a Nobel prize in 2011.

The discovery of Dark Energy was astounding (although rumors existed since the 1970s). The physics established in the early Twentieth Century did not predict Dark Energy any more than Dark Matter (Dark Matter was indirectly observed around 1934, but mainstream physics obstinately refused to pay attention for many decades… and still does not, on the theoretical side).

In the case of Dark Matter, it is hoped by the Standard Persons of the Standard Model that a mundane, anticipated explanation will surface, such as SuperSymmetry (“SUSY”). SUSY would provide for plenty of mass, because it adds plenty of particles (one for each existing particle). SUSY assumes a perfect symmetry between bosons and fermions.

But I don’t believe very much that SUSY, even if it existed, would explain Dark Matter, for a number of reasons. Somehow the mass of the Super Partners would have to add up to ten times the mass of everyday matter. That’s weird (to me). Even worse, SUSY does not explain why Super Partners would get spatially segregated, as Dark Matter is (as far as I know, only my own theory explains this readily).

Instead I believe an obvious logical loophole in Quantum Physics will provide (plenty of) Dark Matter. And it makes the observed spatial segregation between Dark Matter and normal matter, obvious. One could call that little pet of mine, the Quantum Leak Theory (QLT).

I do not see a natural explanation for Dark Energy. Nor do any of the established theories. Actually, Dark Energy is not described well enough to even know what is really going on (different scenarios are known as “Einstein Cosmological Constant”, or “Quintessence”, etc.).

Yet, it is imaginable, at least in my own theory of Dark Matter, that the mechanism creating Dark Matter itself could also produce Dark Energy. Indeed the QLT implies that long-range forces such as gravity change over cosmological distances (a bit like MOdified Newtonian Dynamics, MOND).

To come back down to the most prosaic level: supernova distance measurements depend on knowing the distance to nearby pulsating stars very precisely (such as the Cepheid RS Puppis depicted above). The European Space Agency’s Gaia mission, an observatory launched last year, which is measuring the distance to 1 billion Milky Way stars, should help.

Many other telescopes will soon come online. Astronomy leads physics, just as it did 25 centuries ago. Nothing beats looking out of the box, and peering into the dark universe.

Patrice Ayme’

### Points Against Multiverses

December 31, 2015

Physics, the study of nature, is grounded not just in precise facts, but also in a loose form of logic called mathematics, and in even more general reasonings we know as “philosophy”. For example, the rise of Quantum Field Theory required massive Effective Ontology: define things by their effects. The reigning philosophy of physics became “shut up and calculate”. But it’s not that simple. Even the simplest Quantum Mechanics, although computable, is rife with mind-numbing mysteries (about the nature of matter, time, and non-locality).

Recently the (increasing) wild wackiness of the Foundations of Physics, combined with the fact that physics, as it presently officially exists, cannot understand Dark Energy and Dark Matter, most of the mass-energy out there, has led some Europeans to organize conferences where physicists meet with reputable philosophers.

Einstein Was Classical, The World Is Not. It’s Weirder Than We Have Imagined. So Far.

[Bell, a CERN theorist, discovered a now famous inequality expressing locality, which Quantum Physics violates. Unfortunately he died of a cerebral hemorrhage shortly thereafter.]

Something funny happened in these conferences: many physicists came out of them persuaded, more than ever, or so they claimed, that they were on the right track. Like little rodents scampering out in the daylight, now sure that there was nothing like a big philosophical eagle to swoop down on them. They made many of these little reasonings in the back of their minds official (thus offering juicy targets now).

Coel Hellier below thus wrote clearly what has been in the back of the minds of the Multiverse Partisans. I show “his” argument in full below. Coel’s (rehashing of what has become the conventional Multiverse) argument is neat, cogent, powerful.

However I claim that it is not as plausible, not as likely, as the alternative, which I will present. Coel’s argument rests on a view of cosmology which I claim is neither mathematically necessary, nor physically tenable (in light of the physics we know).

To understand what I say, it’s better to read Coel first. Especially as I believe famous partisans of the Multiverse have been thinking along the same lines (maybe not as clearly). However, to make it fast, those interested by my demolition of it can jump directly to my counter, at the end: NO POINTS, And Thus No Multiverse.

***

Multiverses Everywhere: Coel Hellier’s Argument:

“Prompted by reading about the recent Munich conference on the philosophy of science, I am reminded that many people regard the idea of a multiverse as so wild and wacky that talking about it brings science into disrepute.”

Well, being guided by non-thinking physicists will do that. As fundamental physicist Mermin put it, decades ago:

The Philosophy “Shut Up And Calculate” Is A Neat Example Of Intellectual Fascism. It Is Increasingly Undermined By The Effort Toward Quantum Computing, Where Non-Locality Reigns.

Coel, claiming to have invented something which has been around for quite a while, probably decades: “My argument here is the reverse: that the idea of multiple Big Bangs, and thus of a multiverse, is actually more mundane and prosaic than the suggestion that there has only ever been one Big Bang. I’m calling this a “philosophical” argument since I’m going to argue on very general grounds rather than get into the details of particular cosmological models.

First, let me clarify that several different ideas can be called a “multiverse”, and here I am concerned with only one. That “cosmological multiverse” is the idea that our Big Bang was not unique, but rather is one of many, and that the different “universes” created by each Big Bang are simply separated by vast amounts of space.

Should we regard our Big Bang as a normal, physical event, being the result of physical processes, or was it a one-off event unlike anything else, perhaps the origin of all things? It is tempting to regard it as the latter, but there is no evidence for that idea. The Big Bang might be the furthest back thing we have evidence of, but there will always be a furthest-back thing we have evidence of. That doesn’t mean its occurrence was anything other than a normal physical process.

If you want to regard it as a one-off special event, unlike any other physical event, then ok. But that seems to me a rather outlandish idea. When physics encounters a phenomenon, the normal reaction is to try to understand it in terms of physical processes.”

Then Coel exposes some of the basic conclusions of the Standard Big Bang model:

So what does the evidence say? We know that our “observable” universe is a region of roughly 13.8 billion light years in radius, that being the distance light can have traveled since our Big Bang. (Actually, that’s how we see it, but it is now bigger than that, at about 90 billion light years across, since the distant parts have moved away since they emitted the light we now see.) We also know that over that time our observable universe has been steadily expanding.
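The “13.8 versus 90 billion light years” numbers in the quote can be reproduced with a one-line integral over the standard Lambda-CDM expansion history (a sketch; the H0 and density parameters are assumed Planck-like values, not taken from the post):

```python
import math

C_KM_S = 299792.458
H0 = 67.7            # km/s/Mpc (assumed)
OM, OL = 0.31, 0.69  # matter and dark-energy fractions (assumed)

def comoving_distance_gly(z, steps=100_000):
    """Comoving distance D_C = (c/H0) * integral dz'/E(z') in a flat universe."""
    dz = z / steps
    integral = sum(dz / math.sqrt(OM * (1 + (i + 0.5) * dz) ** 3 + OL)
                   for i in range(steps))
    d_mpc = (C_KM_S / H0) * integral
    return d_mpc * 3.2616e6 / 1e9  # Mpc -> billion light years

# Distance today to the most distant light sources (z ~ 1090): ~46 Gly,
# so the observable universe is ~92 Gly across, even though the light
# traveled for only ~13.8 billion years.
radius = comoving_distance_gly(1090)
```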

Then astrophysicist Coel starts to consider as necessary something about the geometry of the universe which is not so, in my opinion. Coel:

“At about 1 second after the Big Bang, what is now our observable universe was only a few light years across, and so would have fitted into (what is now) the space between us and the nearest star beyond our Sun. Before that it would have been yet smaller.”

What’s wrong? Coel assumes implicitly that the universe started from a POINT. But that does not have to be the case. Suppose the universe started as an elastic table. As we go back in time, the table shrinks, distances diminish. Coel:

“We can have good confidence in our models back to the first seconds and minutes, since the physics at that time led to consequences that are directly observable in the universe today, such as the abundance of helium-4 relative to hydrogen, and of trace elements such as helium-3, deuterium, and lithium-7.[1] Before that time, though, our knowledge gets increasingly uncertain and speculative the further back we push.”

These arguments about how the elements were generated have a long history. They could actually be generated in stars (I guess, following Hoyle and company). Star physics is not so well known that we can be sure they can’t (stars as massive as 600 Suns seem to have been discovered; usual astrophysics says they are impossible; such stars would be hotter than the hottest stars known for sure).

Big Bangists insist that there would have been no time to generate these elements in stars, because the universe is 13.8 billion years old. But that 13.8 billion is from their Big Bang model. So their argument is circular: it explodes if the universe is, actually, 100 billion years old.

But back to Coel’s Multiverses All Over. At that point, Coel makes a serious mistake, the one he was drifting towards above:

“One could, if one likes, try to extrapolate backwards to a “time = zero” event at which all scales go to zero and everything is thus in the same place. But trying to consider that is not very sensible since we have no evidence that such an event occurred (from any finite time or length scale, extrapolating back to exactly zero is an infinite extrapolation in logarithmic space, and making an infinite extrapolation guided by zero data is not sensible). Further, we have no physics that would be remotely workable or reliable if applied to such a scenario.[2]”

…”all scales go to zero and everything is thus in the same place.” is not true, in the sense that it does not have to be. Never mind, Coel excludes it, although he claims “extrapolating back in time” leads there. It does not.

Instead, Coel invites us to Voodoo (Quantum) Physics:

“So what is it sensible to consider? Well, as the length scale decreases, quantum mechanics becomes increasingly important. And quantum mechanics is all about quantum fluctuations which occur with given probabilities. In particular, we can predict that at about the Planck scale of 10^-35 metres, quantum-gravity effects would have dominated.[3] We don’t yet have a working theory of quantum gravity, but our best guess would be that our Big Bang originated as a quantum-gravity fluctuation at about that Planck-length scale.”
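The Planck scale invoked in the quote follows from dimensional analysis of the three constants involved (CODATA values; the computation is mine, not Coel’s):

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
G = 6.67430e-11         # Newton's constant, m^3 kg^-1 s^-2
C = 2.99792458e8        # speed of light, m/s

# Planck length: the unique length built from hbar, G and c.
l_planck = math.sqrt(HBAR * G / C**3)   # ~1.6e-35 m
```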

Well, this is conventional pata-physics. Maybe it’s true, maybe not. I have an excellent reason why it should not (details another time). At this point, Coel is firmly in the conventional Multiverse argument (come to think of it, he did not invent it). The universe originated in a Quantum fluctuation at a point, thus:

“So, we can either regard our Big Bang as an un-natural and un-physical one-off event that perhaps originated absolutely everything (un-natural and un-physical because it would not have been a natural and physical process arising from a pre-existing state), or we can suppose that our Big Bang started as something like a quantum-gravity fluctuation in pre-existing stuff. Any physicist is surely going to explore the latter option (and only be forced to the former if there is no way of making the latter work).

At times in our human past we regarded our Solar System as unique, with our Earth, Sun and Moon being unique objects, perhaps uniquely created. But the scientific approach was to look for a physical process that creates stars and planets. And, given a physical process that creates stars, it creates not just one star, but oodles of them strewn across the galaxy. Similarly, given a physical process that creates Earth-like planets, we get not just one planet, but planets around nearly every star.”

Coel then gets into the famous all-is-relative mood, rendered famous by “French Theory”:

“It was quite wrong to regard the Sun and Earth as unique; they are simply mundane examples of common physical objects created by normal physical processes that occur all over the galaxy and indeed the universe.

But humans have a bias to a highly anthropocentric view, and so we tend to regard ourselves and what we see around us as special, and generally we need to be dragged kicking and screaming to the realisation that we’re normal and natural products of a universe that is much the same everywhere — and thus is strewn with stars like our Sun, with most of them being orbited by planets much like ours.

Similarly, when astronomers first realised that we are in a galaxy, they anthropocentrically assumed that there was only one galaxy. Again, it took a beating over the head with evidence to convince us that our galaxy is just one of many.”

Well, it’s not because things we thought were special turned out not to be that nothing is special. The jury is still out about how special Earth, or, for that matter, the Solar System, are. I have argued Earth is what it is because of the Moon and the powerful nuclear fission reactor inside Earth. The special twist being that radioactive elements tend to gather close to the star, and not in the habitable zone. So Earth may be, after all, special.

At this point, Coel is on a roll: multiverses all over. Says he:

“So, if we have a physical process that produces a Big Bang then likely we don’t get just one Big Bang, we get oodles of them. No physical process that we’re aware of happens once and only once, and any restriction to one occurrence only would be weird and unnatural. In the same way, any physical process that creates sand grains tends to create lots of them, not just one; and any physical process that creates snowflakes tends to create lots of them, not just one.

So, we have three choices: (1) regard the Big Bang as an unnatural, unphysical and unexplained event that had no cause or precursor; (2) regard the Big Bang as a natural and physical process, but add the rider that it happened only once, with absolutely no good reason for adding that rider other than human parochial insularity; or (3) regard the Big Bang as a natural and physical event, and conclude that, most likely, such events have occurred oodles of times.

Thus Big Bangs would be strewn across space just as galaxies, stars and planets are — the only difference being that the separation between Big Bangs is much greater, such that we can see only one of them within our observable horizon.

Well, I don’t know about you, but it seems to me that those opting for (3) are the ones being sensible and scientifically minded, and those going for (1) or (2) are not, and need to re-tune their intuition to make it less parochial.”

To make sure you get it, professor Coel repeats the argument in more detail, and I will quote him there, because as I say, the Multiverse partisans have exactly that argument in the back of their mind:

“So, let’s assume we have a Big Bang originating as a quantum-gravity fluctuation in a pre-existing “stuff”. That gives it a specific length scale and time scale, and presumably it would have, as all quantum fluctuations do, a particular probability of occurring. Lacking a theory of quantum gravity we can’t calculate that probability, but we can presume (on the evidence of our own Big Bang) that it is not zero.

Thus the number of Big Bangs would simply be a product of that probability times the number of opportunities to occur. The likelihood is that the pre-existing “stuff” was large compared to the quantum-gravity fluctuation, and thus, if there was one fluctuation, then there would have been multiple fluctuations across that space. Hence it would likely lead to multiple Big Bangs.

The only way that would not be the case is if the size of the pre-existing “stuff” had been small enough (in both space and time) that only one quantum fluctuation could have ever occurred. Boy, talk about fine tuning! There really is no good reason to suppose that.

Any such quantum fluctuation would start as a localised event at the Planck scale, and thus have a finite — and quite small — spatial extent. Its influence on other regions would spread outwards, but that rate of spreading would be limited by the finite speed of light. Given a finite amount of time, any product of such a fluctuation must then be finite in spatial extent.

Thus our expectation would be of a pre-existing space, in which there have occurred multiple Big Bangs, separated in space and time, and with each of these leading to a spatially finite (though perhaps very large) universe.

The pre-existing space might be supposed to be infinite (since we have no evidence or reason for there being any “edge” to it), but my argument depends only on it being significantly larger than the scale of the original quantum fluctuation.

One could, of course, counter that since the initial quantum fluctuation was a quantum-gravity event, and thus involved both space and time, then space and time themselves might have originated in that fluctuation, which might then be self-contained, and not originate out of any pre-existing “stuff”.[5] Then there might not have been any pre-existing “stuff” to argue about. But if quantum-gravity fluctuations are a process that can do that, then why would it happen only once? The natural supposition would be, again, that if that can happen once, then — given the probabilistic nature of physics — it would happen many times producing multiple different universes (though these might be self-contained and entirely causally disconnected from each other).”

Then, lest you don’t feel Multiversal enough, professor Coel rolls out the famous argument which brings the Multiverse out of Cosmic Inflation. Indeed, the universe-out-of-nothing Quantum fluctuation is basically the same as that of Cosmic Inflation. It’s the same general mindset: I fluctuate, therefore I am (that’s close to Paris’ motto, Fluctuat Nec Mergitur…). Coel:

“In order to explain various aspects of our observed universe, current cosmological models suggest that the initial quantum fluctuation led — early in the first second of its existence — to an inflationary episode. As a result the “bubble” of space that arose from the original quantum-fluctuation would have grown hugely, by a factor of perhaps 10^30. Indeed, one can envisage some quantum-gravity fluctuations leading to inflationary episodes, but others not doing so.

The inflationary scenario also more or less requires a multiverse, and for a similar reason to that given above. One needs the region that will become our universe to drop out of the inflationary state into the “normal” state, doing so again by a quantum fluctuation. Such a quantum fluctuation will again be localised, and so can only have a spatially finite influence in a finite time.

Yet, the inflationary-state bubble continues to expand so rapidly, much more rapidly than the pocket of normal-state stuff within it, that its extent does not decrease, but only increases further. Therefore whatever process caused our universe to drop out of the inflationary state will cause other regions of that bubble to do the same, leading to multiple different “pocket universes” within the inflationary-state bubble.

Cosmologists are finding it difficult to construct any model that successfully transitions from the inflationary state to the normal state, that does not automatically produce multiple pocket universes.[6] Again, this follows from basic principles: the probabilistic nature of quantum mechanics, the spatial localisation of quantum fluctuations, and the finite speed at which influence can travel from one region to another.”
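The inflationary growth factor of roughly 10^30 that these models invoke is usually quoted in “e-folds”; the conversion is a one-liner (my sketch, not from the quoted text):

```python
import math

# Number of e-folds N for a growth factor of 10^30: N = ln(10^30) ~ 69.
# Inflationary model builders typically require N >= 60 or so to solve
# the horizon and flatness problems.
growth_factor = 1e30
n_efolds = math.log(growth_factor)   # ~69.1
```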

The driver of the entire Multiverse thinking is alleged Quantum Fluctuations in a realm about which we know nothing. Those who are obsessed by fluctuations may have the wrong obsession. And professor Coel concludes with more fluctuations:

“The dropping out of the inflationary state is what produces all of the energy and matter that we now have in our universe, and so effectively that dropping-out event is what we “see” as our Big Bang. This process therefore produces what is effectively a multiverse of Big Bangs strewn across that inflationary bubble. Thus we have a multiverse of multiverses! Each of the (very large number of?) quantum-gravity fluctuations (that undergo an inflationary state) then itself produces a whole multiverse of pocket universes.

The point I am trying to emphasize is that any process that is at all along the lines of current known physics involves the probabilistic nature of quantum mechanics, and that means that more or less any conceivable process for creating one Big Bang is going to produce not just a single event, but almost inevitably a vast number of such events. You’d really have to try hard to fine-tune and rig the model to get only one Big Bang.

As with any other physical process, producing multiple Big Bangs is far more natural and in-line with known physics than trying to find a model that produces only one. Trying to find such a model — while totally lacking any good reason to do so — would be akin to looking for a process that could create one snowflake or one sand grain or one star or galaxy, but not more than one.”

Patrice Says: NO POINTS, AND THUS NO MULTIVERSE(s):

Did the universe expand from one point? Not necessarily. It could have been from a line, a plane, a volume, even something with a crazy topology. The Big Bang is the time zero limit of the FLRW metric. Then the spacing between every point in the universe becomes zero and the density goes to infinity.
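For reference (added here in standard notation; the post itself gives no formula), the FLRW metric in question is:

```latex
ds^2 = -c^2\,dt^2 + a(t)^2\left[\frac{dr^2}{1-kr^2}
       + r^2\left(d\theta^2 + \sin^2\theta\, d\phi^2\right)\right]
```

The “Big Bang” is the limit a(t) → 0: every proper distance a(t)·Δr goes to zero, but the coordinate grid itself need not be a point; for k = 0 or k = −1 the spatial sections are infinite at every t > 0, which is exactly the “not necessarily a point” observation.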

Did the Universe expand from Quantum Gravity? Beats me, I don’t have a theory of Quantum Gravity.

What I know is that, extrapolating from what’s known of gravity, if the universe expanded from a “point”, that point would be smaller than the Planck volume, thus the universe would have been within a Black Hole. From what we know about those, no expansion.
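The black-hole objection can be checked on the back of an envelope (my numbers: the often-quoted ~1.5 × 10^53 kg for the mass-energy of the observable universe is an assumption, not a figure from the post):

```python
G = 6.67430e-11    # m^3 kg^-1 s^-2
C = 2.99792458e8   # m/s

def schwarzschild_radius_m(mass_kg):
    """Schwarzschild radius r_s = 2 G M / c^2."""
    return 2 * G * mass_kg / C**2

# For ~1.5e53 kg, r_s ~ 2.2e26 m, i.e. tens of billions of light years:
# squeezing that mass into a sub-Planck volume puts it overwhelmingly
# deep inside its own event horizon.
r_s = schwarzschild_radius_m(1.5e53)
r_s_gly = r_s / 9.4607e15 / 1e9   # metres -> billion light years
```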

Once we don’t have the universe expanding from a point, we cannot argue that it expanded from one point in some sort of “stuff”. If the universe is the “stuff” itself, and it’s everywhere, and expanding from everywhere, exit the argument about a “point”.

The argument about a “point” was: why this particular point? Why not another “Quantum Fluctuation” from another “point” in the “stuff”? Why should our “point” be special? Is it not scientific to believe in the equality of points? Except points have measure zero in three-dimensional space, and thus it’s more “scientific”, more “mathematical”, to suppose the universe expanded from a set of non-zero measure, namely a volume (and it had better be bigger than the Planck Volume).

So the argument that there should be many universes because there are many points and many Quantum (Gravity) fluctuations flies apart.

Remains the argument that we need Cosmic Inflation. Yes, but if the universe expands from all over, all over, there is only one such inflation. Cosmic Inflation does not have to appear at all points, generating baby universes; it becomes more like Dark Energy.

Speaking of which, why should we have two Cosmic Inflations when we already have one? Even my spell checker does not like the idea of two inflations. It does not like the “s”. Ah, yes, the existing Big Bang needs its own Inflation.

Yet if there is only one inflation, presto, no more standard Big Bang. But then what of Helium, Lithium, etc.? How do we synthesize enough of those? Well, maybe we would have much more time to synthesize them, inside stars… Especially supergiant stars.

Another word about these Quantum Fluctuations. Are they the fundamental lesson of Quantum Physics (as the Multiversists implicitly claim)? No.

Why? There are several most fundamental lessons of Quantum Physics. Most prominent: the DYNAMICAL universe is made of waves. That fact, by itself, implies NON-LOCALITY. It also implies that neighborhoods, not points, are the fundamental concepts (one cannot localize a wave at a point). This is the origin of the “Quantum Fluctuations”.

So we just saw that “Quantum Fluctuations” may not be the most fundamental concept. Fundamental, yes, but not most fundamental. When debating fundamentals with the Devil, you better bring exquisite logic, and a Non-Local spoon, otherwise you will be Quantum fluctuated out.

Patrice Ayme’

### Can Space Be Faster Than Light?

October 30, 2015

Is space faster than light? The question may sound weird, like comparing apples and red herrings. Yet, it is being asked by serious cosmologists.

Here is Sean Carroll, a famous professional cosmologist from Caltech, in his essay “The Universe Never Expands Faster Than the Speed of Light”. That is intriguing, because it was alleged, long ago, that so-called Cosmic Inflation, precisely, allowed the Universe to “expand faster than light”. Carroll:

…”here to get a little nitpick off my chest: the claim that during inflation, the universe “expanded faster than the speed of light.” It’s extraordinarily common, if utterly and hopelessly incorrect. (I just noticed it in this otherwise generally excellent post by Fraser Cain.) A Google search for “inflation superluminal expansion” reveals over 100,000 hits, although happily a few of the first ones are brave attempts to squelch the misconception. I can recommend this nice article by Tamara Davis and Charlie Lineweaver, which tries to address this and several other cosmological misconceptions.”

Notice How Big Bang Expansion Accelerates, Slows Down, Then Re-Accelerates. Twist & Turn?

Well. The varying speed of light model was proposed by Jean-Pierre Petit in 1988 (and copied by John Moffat in 1992, Albrecht and João Magueijo in 1999). Instead of superluminal expansion of space, the speed of light was proposed to be 60 orders of magnitude faster than its current value solving the horizon and homogeneity problems in the early universe…

Even for those who are not interested in cosmological physics and relativity, this is fascinating, because it means that most cosmologists had no idea what they were talking about, or what other cosmologists were talking about… Over the last few decades, their collective cosmological wisdom got sold in tens of millions of books on the subject.

This means that the making of “science” is considerably less obvious and appealing than the making of sausage. It also means that we have no idea what space, time, and even light, mean.

The reason many of these physicists do not understand, what they are talking about, is that they did not use the more advanced mathematics I will introduce later in an essay.

In “Expanding Confusion: common misconceptions of cosmological horizons and the superluminal expansion of the Universe”, Tamara and Charlie claim that:

“We use standard general relativity to illustrate and clarify several common misconceptions about the expansion of the Universe. To show the abundance of these misconceptions we cite numerous misleading, or easily misinterpreted, statements in the literature. In the context of the new standard Lambda-CDM cosmology we point out confusions regarding the particle horizon, the event horizon, the “observable universe” and the Hubble sphere (distance at which recession velocity = c). We show that we can observe galaxies that have, and always have had, recession velocities greater than the speed of light. We explain why this does not violate special relativity and we link these concepts to observational tests. Attempts to restrict recession velocities to less than the speed of light require a special relativistic interpretation of cosmological redshifts. We analyze apparent magnitudes of supernovae and observationally rule out the special relativistic Doppler interpretation of cosmological redshifts at a confidence level of 23 sigma.”
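The “Hubble sphere” in the abstract is easy to compute (a sketch; the round H0 = 70 km/s/Mpc is my assumed value):

```python
C_KM_S = 299792.458
H0 = 70.0   # km/s/Mpc (assumed round value)

# Hubble's law: recession velocity v = H0 * d (d in Mpc, v in km/s).
# The Hubble sphere is the proper distance at which v = c:
d_hubble_mpc = C_KM_S / H0   # ~4280 Mpc, roughly 14 billion light years

# A galaxy at, say, 10 Gpc recedes "faster than light" -- with no
# conflict with special relativity, since nothing overtakes a photon
# locally; it is the intervening space that stretches.
v_at_10_gpc = H0 * 10_000                  # km/s
faster_than_light = v_at_10_gpc > C_KM_S   # True
```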

Before I expose my more advanced mathematics, let me point out this: the Big Bang created this problem, Cosmological Inflation. Cosmological Inflation, if admitted, is a huge problem: why did it start? Why did it stop? (Guth himself views the lack of even a glimpse of an explanation here as a problem.) Why is it all over the place? (Physicists such as Linde, an ex-Russian at Stanford, believe in “chaotic inflation”, with inflation all over the place: completely silly? Well, maybe not: I have a re-interpretation with Quantum Entanglement!)

A solution to solve the Cosmic Inflation problem is to decapitate the Big Bang. That’s what I do with my proposal of Eternal Dark Energy, the “100 billion years old universe”.

On the face of it, it’s obvious: why imagine one which makes no sense, when we already have one, for sure, which makes no sense either?

Some things are obvious. Some car makers claimed that their “hybrid” cars went one hundred kilometers on 1.2 liters. A French magazine went out and measured. It was found that hybrid car fuel usage was three times higher than officially announced.

Surprising? No.

Obvious.

Why would a hybrid car going in a straight line at uniform speed use less fuel? Witchcraft? In uniform motion, only the gasoline engine works. The electric engine(s) get dragged along. In truth, the hybrid machinery is heavy and, if anything, the car should use more fuel, no less. As found.

Simple: basic logic is a killer, absent obvious evidence to the contrary.

Patrice Ayme’

### Flat Universe Flattens Twisted Logic

April 11, 2015

The observed universe is flat. I will explain what it means in practice, before going into a bit of theory. Including a sickle move through the lamentable precedent of the heliocentric system.

Basically, when we look at a galaxy which is very very very far away, it appears to have the same size it should have considering its distance. Ah, yes, because we can determine the distance of a very remote galaxy, or so we think, by looking at its redshift (how much redder it looks than it would if it were next door).
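The redshift-to-distance step can be sketched in two lines (the wavelengths and H0 here are my illustrative numbers; the approximation d ≈ cz/H0 only holds for nearby galaxies, z << 1):

```python
C_KM_S = 299792.458
H0 = 70.0   # km/s/Mpc (assumed)

def naive_distance_mpc(lambda_obs_nm, lambda_emit_nm):
    """Redshift z = (observed - emitted)/emitted, then d ~ c*z/H0 for z << 1."""
    z = (lambda_obs_nm - lambda_emit_nm) / lambda_emit_nm
    return C_KM_S * z / H0

# H-alpha emitted at 656.3 nm but observed at 689.1 nm gives z ~ 0.05,
# i.e. roughly 214 Mpc.
d = naive_distance_mpc(689.1, 656.3)
```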

This apparently innocuous set-up creates lots of problems for the ruling cosmological theory, the Big Noise Bang. The barnacles attached to the Big Noise, thousands of professional cosmologists, would not be happy to see their ship sink, so they insist it’s doing all right. Yet I am dancing all around with the facts, and, should they read me carefully, they would be less sanguine about the respect they will enjoy, in the fullness of time.

Gravitational Lensing. Lensing Without Gravitation Would Signal Curvature. So Would Apparent Size Variations. Neither Is Observed, However far We Look.

The Big Noise cosmologists may well be wrong, because they suppose plenty of things for their model. All too many things, some of them, pretty weird. I get to the same observations, while being much more parsimonious with my hypotheses.

We have seen it all before, this conflict between common sense and complicated absurdities by great priests, themselves at the service of higher authorities. Remember the Ptolemaic system? It claimed the Sun rotated around Earth. That absurdity ruled for around 15 centuries.

***

The Ptolemaic System Was An Obese Lie, Thus Contradicting It, A Capital Crime:

The bigger the lie, the greater the authority. So great authority loves big lies: it is a training ground for the feeble minds which make authority so great.

The greatest philosopher of the Fourteenth Century, and the greatest physicist of the Middle Ages, the Parisian Johannes Buridanus, sent the Ptolemaic system to the bottom of the sea (1320s CE).

However Jean Buridan, adviser to 4 kings, and head of the University of Paris, did not want to be burned alive. So Buridan presented all his new physics and cosmology as something “supporters” of the point of view that “authority does not demonstrate” were talking about (he named no names).

Buridan believed that the Earth turned on itself each day, and around the Sun in a year, and that an arrow shot straight up would fall back at the same point, because of his own theory of impetus. Etc. It’s all very clear, and some of it can even be read. (In this extract Buridan supports geocentrism; in later extracts, he concludes it cannot be distinguished from heliocentrism observationally; a full study of Buridan is not extant. Some of the later arguments of Buridan are found in Oresme.)

Even the ship example used by Galileo, 300 years later, to demonstrate the undetectability of uniform motion is Buridan’s invention, for the same purpose (Buridan’s student, bishop Oresme wrote about it too).

The Catholic Church, supported by King Plutocrat Louis XI, made reading Buridan a capital crime in 1473 CE. Buridan’s cosmology was posthumously re-amplified by his student and (self) publicist, the dying Abbot Copernicus.

That fancy, the heliocentric system, was, on the face of it, quite ridiculous. Yet Buridan said the Earth was “tiny”, so it was only understandable that the tiny thing would rotate on itself, while the enormous thing would stay put.

***

Authorities Love Systems Which Lie And Make No Sense:

Why the geocentric system was entertained so long explains much of the enthusiasm for the Big Bang. The psychology is similar: an obscure set of ideas was made more hermetic by computations nobody understands. Actually, it’s Plato who launched the Big Ptolemaic Noise, six centuries prior to Ptolemy’s efforts.

Believing in the geocentric system was good training for submitting to stupid authority, and learning to become non-critical.

But let’s go back to flatness.

Basic Math Of Flatness:

Our universe of stars, clouds, and galaxies is three dimensional (as I often talk of high dimensions, see note: the “3” may be an average of the “many”).

Geometries can be flat (a plane) or spherical (aka “elliptic”; as on a round planet), or “hyperbolic” (a saddle).

A mighty theorem (Thurston’s geometrization, proven by Perelman; see the mathematical note below) implies that astronomically plausible geometries are built out of flat, spherical or hyperbolic pieces.

I will simplify further.

Geometries are determined by their geodesics (the shortest paths). At least locally.

A non-flat universe means that some perspective can be found from which two neighboring geodesics will either converge or diverge.

For a proof, just look at a sphere, or a saddle; the geodesics can be determined by pulling a string between two points, making the shortest paths. They are the great circles in the case of a sphere. Notice that the distances between two nearby strings, once pulled into geodesics, vary. The big math proof, with equations, does not say anything more.
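The convergence of geodesics on a sphere can be checked with a few lines of arithmetic (a toy illustration I am adding here; the meridian formula is elementary spherical geometry, nothing more):

```python
import math

# Two meridians on a unit sphere are geodesics. At the equator they are
# parallel; their separation at latitude phi shrinks like cos(phi), so
# they converge toward the pole. On a flat plane, two parallel straight
# lines keep constant separation forever.

delta_lon = 0.01  # angular gap between the two meridians at the equator

def separation_sphere(lat):
    """Distance between the two meridians at latitude lat (unit sphere)."""
    return delta_lon * math.cos(lat)

def separation_plane(distance_travelled):
    """Two parallel straight lines never converge, whatever the distance."""
    return delta_lon

for lat in [0.0, math.pi / 4, math.pi / 2 - 1e-9]:
    print(f"lat={lat:.3f}  sphere={separation_sphere(lat):.6f}"
          f"  plane={separation_plane(lat):.6f}")
```

The sphere's geodesics, launched parallel, meet at the pole; the plane's never do. That contrast is the whole content of "curvature" at this level of the argument.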

No Empty Space Lensing, No Curvature:

In space, geodesics are paths followed by light. If the universe is not flat, light will either diverge, or converge, as if space itself was a lens. This means that a galaxy, or a galactic cluster, will appear bigger, or smaller, than it should.

Some may object that lensing in space is well known, and is even used to look at the furthest galaxies. However that lensing is due to gravity slowing down, and bending light, as happens with light grazing the sun. That’s called gravitational lensing. Entire galactic clusters are known to operate as giant lenses.

If one saw lensing, with nothing in between, the lensing would not be gravitational and the universe would not be flat.

But so far, this has not been observed.
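The lens-like effect of global curvature can be sketched numerically (a toy model: the curvature radius is set to 1 and the ruler length is illustrative; the sin/sinh formulas are the standard ones for constant-curvature spaces):

```python
import math

# Apparent angular size of a standard ruler of proper length L at
# comoving distance chi (in units of the curvature radius).
# theta ~ L / d_A, where the transverse distance d_A scales like:
#   closed (k=+1): sin(chi)   -- geodesics converge, objects look bigger
#   flat   (k= 0): chi
#   open   (k=-1): sinh(chi)  -- geodesics diverge, objects look smaller

def transverse_distance(chi, k):
    if k > 0:
        return math.sin(chi)
    if k < 0:
        return math.sinh(chi)
    return chi

L = 0.01   # ruler length, same units as chi
chi = 1.0
for k, name in [(+1, "closed"), (0, "flat"), (-1, "open")]:
    theta = L / transverse_distance(chi, k)
    print(f"{name:6s} universe: apparent angle = {theta:.5f} rad")
```

At the same distance, the same galaxy looks larger in a closed universe and smaller in an open one than in a flat one: empty-space lensing, exactly what the surveys fail to see.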

A perfectly flat universe means global curvature zero. However the basic idea of the Einstein Field Equation (EFE) is:

CURVATURE = MASS-ENERGY-MOMENTUM

Actually, this equation is the basic idea, thus the ultimate simplification. As it is, it cannot work without further complications, because the object on the left has much higher dimension than the 10 dimensional tensor on the right; so one has to simplify the curvature first. The real equation is more like:

Function of Curvature = Mass-Energy-Momentum

There are a lot of mathematical details to figure out, to make that basic idea fit. It took many stupendous mathematicians and physicists many years working together frantically to figure them out. In particular, Einstein and Hilbert cooperated intensely, helped by many collaborators… And the initial idea comes from the mathematician/physicist/philosopher Riemann (in his 1854 lecture on the foundations of geometry). So it took 60 years to make the idea work, and one should not expect casual readers to get the ideas in 60 lines, let alone 60 seconds.
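For readers who want the standard form this paragraph gestures at (textbook General Relativity, not specific to this essay): the full Riemann curvature tensor has 20 independent components in four dimensions, so it is contracted down to the 10-component Einstein tensor to match the 10-component stress-energy tensor on the right:

```latex
G_{\mu\nu} \;\equiv\; R_{\mu\nu} - \tfrac{1}{2}\,R\,g_{\mu\nu}
\;=\; \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
```

Here R_{μν} and R are contractions of the Riemann tensor, g_{μν} is the metric, and T_{μν} is the mass-energy-momentum tensor: that contraction is the "simplification of the curvature" mentioned above.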

An obvious (sort of) prediction was that, as the Mass-Energy of the universe is not zero (it’s full of galaxies, which have mass, and energy), then the curvature could not be zero. But then, if curvature (of the space-time of the universe) is not zero, then the universe has got to be moving.

Revolted by a moving universe, Einstein then added another curvature term, Lg. Lg counterbalanced Mass-Energy-Momentum, and gave a static (but unstable) universe.

Thus Einstein did not predict what the astronomers were starting to observe, namely the expansion of the universe. Einstein abandoned L (“Lambda”), calling it the “biggest blunder [he] ever made”.

(According to me, he made a much graver error in 1905.)
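The static balance Einstein engineered can be made explicit (standard Friedmann analysis, which I state here for completeness): with the extra term, the acceleration of the scale factor a of a dust-filled universe is

```latex
\frac{\ddot a}{a} \;=\; -\,\frac{4\pi G}{3}\,\rho \;+\; \frac{\Lambda c^{2}}{3},
\qquad
\ddot a = 0 \;\Longrightarrow\; \Lambda = \frac{4\pi G \rho}{c^{2}}
```

Any perturbation away from that exact balance grows, which is why the static universe is unstable, as noted above.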

***

Dark Energy Flattens Cosmological Logic:

Ninety years later, the simplest supernovas were studied. They arise in binary systems: a star transfers part of itself to its companion, a super hot white dwarf. It is a bit like pouring gasoline onto an ember: when enough mass has been transferred to the dwarf, the pressure and heat in its depths are just right for thermonuclear fusion to re-ignite explosively. It happens in exactly the same way every time (although some argue about this). So these Type Ia supernovae are viewed as candles always of the same luminosity.

Large surveys (rejecting some explosions viewed as outliers) concluded that far-away Type Ia explosions were fainter than the Hubble law of expansion predicted. And the further one looked, the more the Type Ia explosions faded.

The conclusion was drawn that the universe expanded faster than the old model of Hubble and Einstein’s Gravitation theory predicted.

Greater expansion meant greater energy, and its source was not clear, so it was named DARK ENERGY.
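The supernova argument can be sketched with a toy calculation (the standard FRW luminosity-distance integral; the density parameters 0.3/0.7 and H0 = 70 km/s/Mpc are illustrative round numbers, not a fit to data):

```python
import math

# Luminosity distance d_L(z) = (1+z) * (c/H0) * Integral_0^z dz'/E(z'),
# with E(z) = sqrt(Om*(1+z)^3 + OL) for a flat universe.
# More dark energy -> larger d_L -> fainter supernovae at fixed redshift.

C_OVER_H0 = 4283.0  # Mpc, for H0 ~ 70 km/s/Mpc

def luminosity_distance(z, omega_m, omega_l, steps=10000):
    total = 0.0
    dz = z / steps
    for i in range(steps):          # midpoint-rule integration
        zp = (i + 0.5) * dz
        e = math.sqrt(omega_m * (1 + zp) ** 3 + omega_l)
        total += dz / e
    return (1 + z) * C_OVER_H0 * total

z = 0.5
d_matter = luminosity_distance(z, 1.0, 0.0)   # matter-only universe
d_lcdm = luminosity_distance(z, 0.3, 0.7)     # with dark energy
print(f"matter only: {d_matter:.0f} Mpc, with dark energy: {d_lcdm:.0f} Mpc")
# Extra distance means extra dimming: Delta m = 5*log10(d_lcdm/d_matter)
print(f"extra dimming: {5 * math.log10(d_lcdm / d_matter):.2f} mag")
```

The few tenths of a magnitude of extra dimming at moderate redshift is, in essence, what the surveys measured.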

Ironically, the simplest way to describe it was just to re-introduce the Lg term Einstein had introduced and then rejected, while he blundered about clumsily.

***

It remains that the original theory of Einstein requires a very fine tuning of parameters to make our universe explode into its present very flat state in a bit less than 14 billion years. It also requires a supplementary explosion, called “Cosmological Inflation”.
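The fine-tuning can be quantified with a back-of-the-envelope estimate (the textbook flatness-problem scaling; the scale-factor values below are illustrative orders of magnitude):

```python
# |Omega - 1| = |k| / (a*H)^2 grows with time: like a^2 in the
# radiation era and like a in the matter era. So a universe flat to
# ~1% today had to be flat to fantastic precision early on.

def omega_deviation_early(dev_today, a_eq=3.0e-4, a_early=1.0e-10):
    # matter era: deviation scales like a (from today, a=1, back to a_eq)
    dev_at_eq = dev_today * a_eq
    # radiation era: deviation scales like a^2 (from a_eq back to a_early)
    return dev_at_eq * (a_early / a_eq) ** 2

dev = omega_deviation_early(0.01)
print(f"|Omega - 1| required in the early universe: {dev:.1e}")
```

Initial conditions tuned to better than one part in a quintillion are what Cosmological Inflation was invented to explain away.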

I don’t have this problem.

I just wipe Einstein and his cohorts clean. I am master of my own soul. They have two Cosmological Inflations. I have just one, the one that is observed.

And my version of the universe can be 100 billion years old, or more.

I don’t confuse gravitation and revolution, inflation and what not. The Einstein Field Equations are correct, I just don’t apply them to the universe.

Simple does it.

Making something complicated simply because it allows one to “shut up and calculate” (the philosophical doctrine of contemporary physics) has been seen before. This was the trap into which Ancient Greek astronomy fell, making ever more sophisticated versions of the Ptolemaic system.

We should avoid duplicating our forebears’ mistakes.

Patrice Ayme’

Mathematical Note:

That I consider the universe three dimensional may sound like a strange admission, as I always advocate all sorts of dimensions, from the brain to fundamental physics. But there is no contradiction: just view the three dimensional aspect as an… average.

(Here I am going to talk as a common physicist or mathematician, and elide the tweaking of fundamental axioms of topology and logic that I am wont to engage in, because I want to present the simplest picture.)

More precisely, the flat/spherical/hyperbolic trichotomy is exactly what happens in two dimensions. In one dimension, the line or circle, there is just one geometry.

The US mathematician Thurston conjectured, and the Russian Perelman proved, that there are just eight fundamental geometries in three dimensions.

(Disgusted by the dog eat dog attitude of famous mathematicians, some of whom I personally know, Perelman refused prizes, and abandoned math; I do share Perelman’s indignation, and then, more. Austerity, as imposed by plutocrats, has made even mathematicians like rats, prone to devour the innocent. The problem is not just in physics.)
