Archive for the ‘Astronomy’ Category

Black Hole For Dummies: An Old Illuminating Story

April 11, 2019

Black Hole Seen At Core Of Galaxy Messier 87 

Black Holes were predicted at the end of the Eighteenth Century. I am not just campaigning here for justice or historical precision, by giving Michell and Laplace the honor due to them. I am also defending physics, and promoting understanding. The guy with the bushy hairdo didn’t launch the understanding of Black Holes. That means Black Hole theory arose for DEEPER reasons than Einstein’s theory of gravitation. Deeper reasons are what science is all about.

Black Holes are indeed an effect of the most basic theory of gravity, which was elaborated in the 1560-1800 CE period by Tycho, Kepler, Galileo, Bullialdus, Hooke, Newton, and finally Laplace. That basic theory of gravitation is the first order of the present theory of gravitation. The Black Hole effect, per se, has nothing to do with Jules Henri Poincaré’s Theory of Relativity (translated into German by Einstein).

In 1796 the marquis Pierre-Simon de Laplace, mathematician, physicist, astronomer and philosopher (of course), rediscovered the idea of John Michell, a cleric and independent scholar. Michell had noticed that a body falling from far away onto something big enough would arrive exceeding the speed of light. Thus, supposing that light is made of particles, those particles would lose just as much speed trying to escape that big body, and thus would fall back onto it. Laplace wrote:

Un astre lumineux, de la même densité que la Terre, et dont le diamètre serait 250 fois plus grand que le Soleil, ne permettrait, en vertu de son attraction, à aucun de ses rayons de parvenir jusqu’à nous. Il est dès lors possible que les plus grands corps lumineux de l’univers puissent, par cette cause, être invisibles.

(A luminous star, of the same density as the Earth, and whose diameter would be 250 times larger than the Sun’s, would not, by virtue of its attraction, allow any of its rays to reach us. It is therefore possible that the greatest luminous bodies of the universe may, for this reason, be invisible.)

Here I will follow Laplace’s proof.

Laplace on top. Don’t pay much attention to the text (not from me) which is a bit confusing

The Black Hole effect comes from the fact that the gravitational energy binding an object is proportional to the mass of that object, but also inversely proportional to its distance from the gravitational center, while the kinetic energy available to that object at a given speed is simply proportional to its mass. So, if too close, gravitation will overwhelm any escape energy.

Here is a bit more detailed reasoning. Suppose a particle of light has mass m; (1/2)mv² is its kinetic energy. If it sits at distance x from the gravitational center, the energy needed to bring it to infinity is Gm/x. (Here G is aM, where a is the gravitational constant and M the central mass.)

Equating, we get: (1/2)mv² = Gm/x

Thus, cancelling m and changing the constant: v² = bM/x

But, as early as the late 17C, the speed of light became known, by carefully observing Jupiter’s satellites (Rømer, 1676). It’s c, a constant. Setting v = c, we get: x = bM/c².

Hence, if x is smaller than bM/c², the gravitational potential energy Gm/x is TOO BIG to be overcome by (1/2)mc²: light cannot escape.
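To make the numbers concrete, here is a minimal Python sketch of that criterion, using the modern value b = 2G (so the critical distance is 2GM/c², which happens to coincide numerically with the Schwarzschild radius of General Relativity); the 6.5 billion solar mass figure for M87* is the one quoted below.

```python
# Michell-Laplace "dark star" criterion: light cannot escape from closer than
# x = 2GM/c^2 (a sketch; constants rounded).
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg
AU = 1.496e11      # astronomical unit, m

def dark_star_radius(mass_kg):
    """Critical radius below which the escape energy exceeds (1/2)mc^2."""
    return 2.0 * G * mass_kg / c**2

print(f"Sun: {dark_star_radius(M_SUN) / 1e3:.1f} km")                       # ~3 km
print(f"M87* (6.5e9 suns): {dark_star_radius(6.5e9 * M_SUN) / AU:.0f} AU")  # ~130 AU
```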

Let’s recap in words only: suppose light is made of particles of mass m; then, close enough to a big enough mass, those particles cannot escape.

OK, let me wait for the laughter of professional physicists to die off… Indeed, those simple souls will object that I neglected Relativity and its guru, Einstein. Well, my reply is that I know very well what I am doing, and they don’t. Meanwhile, here is the Black Hole:

Matter falling into the Black Hole, or running crazy orbits around it at relativistic speeds, generates lots of heat, by collision and sheer acceleration (like a super enormously incredibly humongously giant circular particle accelerator, CERN on unimaginable steroids…). With 6.5 BILLION Solar Masses, this is one of the largest Black Holes known.

OK, this reasoning was in Laplace. The incredibly famous Laplace, after whom Laplacians are named, made gravitation into a field theory, thus predicting gravitational waves (said waves were relativized by Jules Henri Poincaré)… Modern Quantum Field Theory is all about manipulating Laplacians…

So is light a particle? Einstein said so (following Newton) [I have my doubts: SQPR changes the game!]. Does light have mass? Definitely yes, according to E = mc², a relation first demonstrated and taught by Jules Henri Poincaré in 1899 at the Sorbonne (the Einstein cult omits this little detail). There is a simple reasoning for that… simple once one knows Maxwell’s equations, or observes light’s momentum…

Here is the simplest proof of E = mc². Light pushes: it has momentum. So light acts as if it had what’s called “inertial mass”. Now the “Equivalence Principle” says that inertial mass = gravitational mass. Thus, light behaves as if it were endowed with a gravitational mass m, as used above.
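A back-of-the-envelope Python sketch of that argument; the one-kilojoule-ish figure below (roughly one second of full sunlight on a square meter) is just an illustrative assumption.

```python
# Light of energy E carries momentum p = E/c, so it behaves as if it had
# inertial (hence, by the Equivalence Principle, gravitational) mass m = E/c^2.
c = 2.998e8  # speed of light, m/s

def effective_mass(energy_joules):
    return energy_joules / c**2

# ~1.4 kJ of light (about one second of full sunlight on one square meter)
print(effective_mass(1.4e3))   # ~1.6e-14 kg: tiny, but not zero
```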

(The EP is truly an experimental observation, last checked excruciatingly, a year ago, by a French satellite launched for that purpose.)

So what’s the next problem with my hare-brained derivation of Black Holes? None, really. The modern gravitation theory (aka General Relativity) integrates the LOCAL TIME theory of Lorentz and Jules Henri Poincaré into the gravitation theory of Newton. Local time runs slow in a gravitational field, and the deeper the gravitational well, the slower the time. Thus, if I wanted to ameliorate the hare-brained Black Hole theory, I would have to add that….
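For the curious, a minimal Python sketch of that “local time” correction in the weak-field limit; the GPS-like orbital radius below is an assumed round number.

```python
# A clock at distance r from mass M ticks slower by sqrt(1 - 2GM/(r c^2)):
# the deeper the gravitational well, the slower the local time.
import math

G, c = 6.674e-11, 2.998e8
M_EARTH, R_EARTH = 5.972e24, 6.371e6   # kg, m

def clock_rate(mass_kg, r_m):
    return math.sqrt(1.0 - 2.0 * G * mass_kg / (r_m * c**2))

print(clock_rate(M_EARTH, R_EARTH))   # ground clock: a hair below 1
print(clock_rate(M_EARTH, 2.66e7))    # ~GPS orbit radius: closer to 1, runs faster
```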

The full Einstein gravitation theory simply says that: Ricci Tensor = Mass-Energy Tensor.

The Italian Ricci, starting in 1890, simplified the full Riemann Curvature tensor. It’s applied to the spacetime metric g. We see immediately that, the more mass-energy, the more curvature. In the limit of small masses and low speeds, this is Newton’s equation…

The preceding is very simple, thus ironclad.  

So here it is: physics is not that complicated.

***

Many scientists present science as more complicated than it is, so they appear to be great sorcerers or shamans. An example is the claim made by Darwin that man arose in East Africa (then a UK dominion). There was evidence for this, as the Brits dug in East Africa. When the Chinese dug in China, they begged to differ. Humans had originated in China too, they insisted.

Now another human species was just discovered, in the Philippines… ‘Homo luzonensis’ boasted an eclectic mix of features comparable to, but distinct from, those of other species of hominins. So this is another human species, which lived 50,000 to 60,000 years ago. We now know of five human species alive back then. It’s clearly a different species, as they have teeth with three roots where ours have just two.

Contemporary humans have genetic material from three human species: ancestral Sapiens, Neanderthals and Denisovans…

Science is both simple, and complex. Often the lack of simplicity, and the grandiloquent style of exposition, is just an attempt to hide ignorance, and to leverage said ignorance into awe for the perpetrators of pseudo-scientific obscurantism.

Physicists have been particularly culpable of this in recent decades. Consider titles such as: “The First Three Minutes”, “The Theory of Everything”, “A Universe Out Of Nothing”, “The God Particle”, etc… The more fatuous physicists became, the less the theory progressed. Now, right, they probably couldn’t have done better. Fortunately, experimental physics, and especially astronomy, has kept on advancing, ever more spectacularly… cornering the fatuous ones, even when adorned with Nobel Prizes, into irrelevance…

Decades ago, I caused a scandal at an integrated physics-mathematics seminar at Stanford by exposing the shortcomings of Black Hole Theory… I was coming from the mathematical, hyper-logical side, unearthing all the little problems, which weren’t so little… Namely, I claimed it didn’t take Quantum Theory into account enough. (Following my generously provided orientation, a cottage industry of quantized “Black Hole” theories has arisen… Some not really black, just frozen…)

Many surprises await… Stay tuned…

Patrice Ayme

 

New Definition Of A Planet Makes Pluto A Planet

March 5, 2019

In the past, one distinguished among three sorts of bodies in the Solar System: planets, satellites, asteroids. However, it turned out that Pluto, once thought to be Earth-sized, is much smaller… although its atmosphere extends way out, nearly as far as the diameter of Earth! Yet some smart ass astronomer, greedy for fame as academic types tend to be (their feeding depends upon their renown), made his claim to fame by demoting Pluto. He even sold T-shirts; I have one.

Planet means “wanderer”, in Greek. Because the planets wander across the stars without any reason that prehistoric men, or even the Greeks, could discern. Upon closer inspection (astronomer Tycho and his pupil Kepler), planets turned out to follow slowly changing ellipses.

The modern scientific sense of planet as “world that orbits a star” was prominent in Giordano Bruno’s cosmology, giving the sadists in the Vatican a reason to torture him to death for seven years (Catholic sadistic abusers are persistent, they have been around for nearly two millennia).

Very round, and very blue. Don’t tell me that’s not a planet. Real picture of Pluto from the nuclear-powered New Horizons spacecraft, as it left the planet behind….

Pluto was loudly demoted, officially speaking, because it was argued that “Pluto didn’t clear its orbit”. The statement sounds, superficially, impressively scientific. But, as always in science, the devil lurks in a more careful examination of the situation. Actually, Pluto, as far as I can see, does clear its orbit. So this was an example of the “Big Lie Technique” dear to Hitler: the bigger the lie, the more it convinces people… Just like Obama fixing healthcare with “Romneycare“…

Pluto clears its orbit so well that it is accompanied by an entire cortege of satellites, including the relatively enormous Charon, a companion so large that Pluto and Charon orbit around a common point exterior to Pluto.

Pluto is usually the farthest from the Sun. However, its orbit is closer to the Sun than Neptune’s for 20 years out of every 248 years. Pluto got closer than Neptune on February 7, 1979, temporarily becoming the 8th planet from the Sun. Pluto crossed back beyond Neptune’s orbit on February 11, 1999, resuming its place as the 9th planet from the Sun for the next 228 years.

In truth (see Note), Pluto overflies Neptune’s orbit when at its closest point to the Sun. So the two orbits never intersect. One could introduce the notion of an orbit disk (the part of a planet’s orbital plane inside its orbit). The intersection of the orbit disks of Pluto and Neptune has planar measure zero! (First Objection to the “clearing” notion.)

Anyway, one may as well say that Neptune didn’t clear its own orbit (as Pluto occupies it sometimes, according to those who aren’t smart enough to understand the First Objection, namely that the orbits don’t cross).

Pluto could be more colorful than expected… Charon hanging through the blue haze, which is blue for the same exact reason as Earth’s atmosphere (Rayleigh scattering….) Mountains are made of water ice with very different properties than terrestrial ice (it’s much harder).

To satisfy all, it was decreed that Pluto was a “Dwarf Planet”… Other Pluto-sized objects have been found since, further out. But some have weird shapes… Some expected Pluto’s atmosphere to freeze down… But the New Horizons robot found it alive and well. The Pluto system has turned out to be very complex.

As one finds exoplanets, the possibility exists that Earth-sized satellites will be found in orbit around giant planets. As in the movie Avatar. Actually one may have been discovered (there is a controversy, as with all new scientific discoveries).

One will want to call Earth sized moons orbiting giant planets “planets”.

So what would be a planet? One can still use the official criterion brandished by the International Astronomical Union. And then I would add the following NEW criterion:

Worlds around giant planets do exist, and one may have already been found. (from variation of the light of the local sun, with slow downs and accelerations similar to those observed in Jupiter’s satellites in the 17C…)

An object large enough, and round enough, to hold an atmosphere all around its entire surface should be called a planet. That would make the giant asteroid Ceres NOT a planet: it doesn’t hold an atmosphere, and didn’t clear its orbit (it’s part of the asteroid belt). According to that definition Ganymede, which has an oxygen atmosphere, and a diameter of 5,262 km, is a planet, and so is Titan (very thick atmosphere; 5,150 km diameter). Mercury, clearly a planet, has a diameter of 4,880 km, a tiny bit larger than Callisto, Jupiter’s second largest satellite… which also has a tenuous atmosphere.
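A toy sketch of that criterion as a small Python lookup; the diameters are the ones quoted above (plus Ceres’ and Callisto’s, added from memory), and the atmosphere flags simply follow the text’s own assessments.

```python
# Proposed criterion: round enough, and big enough, to hold an atmosphere
# all around its surface (however tenuous).
bodies = {
    "Ceres":    {"diameter_km": 940,  "global_atmosphere": False},
    "Mercury":  {"diameter_km": 4880, "global_atmosphere": True},   # tenuous exosphere
    "Callisto": {"diameter_km": 4821, "global_atmosphere": True},   # tenuous
    "Titan":    {"diameter_km": 5150, "global_atmosphere": True},   # very thick
    "Ganymede": {"diameter_km": 5262, "global_atmosphere": True},   # thin oxygen
}

planets = [name for name, b in bodies.items() if b["global_atmosphere"]]
print(planets)   # everything but Ceres qualifies under this criterion
```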

Whether one wants to call large moons equipped with atmospheres planets is a matter of taste.

But “Pandora”-like worlds would be planets (Ganymede is 4.5 billion years old, at least as old as Earth… Plenty of time to evolve life…)

So what’s a planet? As Giordano Bruno said, a world. And certainly an atmosphere all around, especially if propitious to life, makes it a world. Pandora is a world, that is, a planet. It doesn’t matter that, as it turns around its sun, Pandora also turns around a hyper giant planet.

And certainly, Pluto is a world, too. A much smaller one, but still a world.

Worlds, here we come!

Patrice Ayme

***

***

Note 1:

Pluto has cleared its orbit. That’s why it’s so full of satellites….

Diagram of Pluto’s and Neptune’s orbit, on a distance scale in AUs.

Pluto and Neptune will never collide. You can see this in the image below, which shows a view as seen from the side as the planets orbit around the Sun.

Neptune is cleared of Pluto’s orbit, and reciprocally

Most planets only make small excursions in the vertical and radial directions, but Pluto makes large ones. Pluto at times will move closer to the Sun than Neptune, but it is always well “below” the orbit of Neptune when this happens. The orbits never actually cross the same point in space. Simulations have shown this is stable for the next 4 billion years.

***

Note 2:

Could one have an irregularly shaped body with an atmosphere only in some basins? Probably, but it is unlikely to be naturally sustainable (one would need to make computer simulations taking into account the Roche limit, and Quantum effects on geological stress… Atmospheric pressure varies significantly across parts of Mars… which has giant high mountains, but also deep basins…)

 

Most Habitable Exoplanets Found By NASA’s Kepler Aren’t So, Hints Gaia

October 30, 2018

Yes, before science becomes straightforward, it’s made of crooked timber: something we held as sure, and a great discovery of the last few years, is often revealed, on second examination, as in need of serious tweaking: the initial breakthrough survives, but transmogrified. As an interlude between aspects of the civilization bandwagon, with more lofty essays, let’s look at the future… space. Yes, the future is space: it is to us what the savannah was to the genus Homo. The savanna, in combination with necessity, will, and the Élan vital of colonization, evolved us. The same Élan vital spurs space colonization. Élan vital, popularized by Nobel laureate philosopher Henri Bergson, was central to the Lucretius-Epicurus philosophy of 23 centuries ago (a philosophy which Christianism eradicated by burning its books, and killing its practitioners and defenders).

It’s pretty clear that humanity, barring a deplorable accident, will be able to spawn across the galaxy: there are plenty of habitable planets out there, and, thanks to NASA, Elon Musk and his ilk, cheap access to space will come very soon.

However, the complementary technology we need for mass space colonization, compact controlled thermonuclear technology, has not yet arrived… Indeed, “habitable planet” doesn’t mean life appeared there, let alone advanced life, or civilization. So planets will be found, ripe for colonization, yet lifeless. Colonizing Europa, for example, is feasible: there is plenty of water. Yet it will necessitate harnessing fusion power (unless battery tech leaps ahead, and photovoltaics can be used after all…).

Nearly 4,000 exoplanets have been found by 2018.

Artist’s illustration of how rocky, potentially habitable worlds elsewhere in our galaxy might appear (from data found so far). Data gathered by telescopes in space and on the ground suggest that small, rocky planets are common (some systems, such as Trappist-1, have many, close together… although not as close as here, ha ha ha.)
Credit: R. Hurt (SSC-Caltech)/NASA/JPL-Caltech

https://patriceayme.wordpress.com/2017/02/24/contemplating-philosophically-trappist-habitable-planets/

Supposedly, NASA’s prolific Kepler space telescope has discovered about 30 roughly Earth-size exoplanets in their host stars’ “habitable zone” — the range of orbital distances at which liquid water can likely exist on a world’s surface.

One doesn’t want planets to be too large: they would crush life as we know it, and retain light gases, making them “mini-Neptunes”.

However observations by the European Space Agency’s (ESA) Gaia spacecraft suggest that the actual number of habitable planets among them is probably only between two and 12, NASA officials said today. Ooops.

Before the scoffing starts, let me observe that this means, mostly, that the habitable zones have to be computed again: so planets in systems viewed as uninhabitable may be, actually, habitable after all. And vice versa. So maybe only a couple of the Kepler planets deemed habitable really are, but others may turn out to be.

Gaia launched in December of 2013 to create an ultraprecise 3D map of our giant galaxy, the Milky Way. So far, this map includes position information for about 1.7 billion stars and distance data for about 1.3 billion stars.

Gaia’s observations suggest that some of the Kepler host stars are brighter and bigger than previously believed. Planets orbiting such stars are therefore likely larger and hotter than previously thought.

A larger, brighter star releases more light, hence more heat. The “larger” correlation is a consequence of how Kepler measures: the “transit method.”

Kepler records the brightness dips caused when a planet crosses its parent star’s face from the Kepler telescope’s perspective. Estimates of such planets’ sizes are derived from the percentage of the stellar disk they block during these “transits.” So, if the star’s diameter is revised upward, because it’s brighter, so is the size of the planet.
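Here is a minimal Python sketch of that scaling; the transit depth and the stellar radius revision below are illustrative assumptions, not Gaia numbers.

```python
# Transit method: the brightness dip fixes only the RATIO of planet to star,
# depth = (R_planet / R_star)^2, so a bigger star implies a bigger planet.
import math

SUN_RADIUS_KM, EARTH_RADIUS_KM = 696_000.0, 6_371.0

def planet_radius_earths(depth, stellar_radius_suns):
    r_planet_suns = stellar_radius_suns * math.sqrt(depth)
    return r_planet_suns * SUN_RADIUS_KM / EARTH_RADIUS_KM

depth = 1.0e-4   # a 0.01% dip (assumed)
print(planet_radius_earths(depth, 1.0))   # ~1.1 Earth radii with the old stellar radius
print(planet_radius_earths(depth, 1.5))   # ~1.6 Earth radii if the star is 50% bigger
```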

Astronomers, astrobiologists and planetary scientists still have a lot to learn about exoplanet habitability. So philosophers can strike: I pointed out that life may be common in the galaxy, but not so advanced life.

https://patriceayme.wordpress.com/2013/11/06/40-billion-earths-yes-no/

I pointed out too that the Earth’s nuclear reactor enables plate tectonics, and that the latter, hence the former, may be necessary for life. So one may have to consider a radioactive belt, not just a water belt…

https://patriceayme.wordpress.com/2014/01/14/life-giving-nuclear-earth-reactor/

And in more details:

https://patriceayme.wordpress.com/2016/02/22/is-this-why-we-are-alone/

“We’re still trying to figure out how big a planet can be and still be rocky,” declared Jessie Dotson, astrophysicist at NASA’s Ames Research Center in Silicon Valley. Dotson is the project scientist for Kepler’s current, extended mission, known as K2. That will depend in part upon the nature of the planet’s rock, especially its density.

As I have pointed out, the concept of the habitable zone can’t be based solely on water and orbital size relative to the star’s output. That would ignore important planetary characteristics, such as mass, which influences a world’s ability to hold onto an atmosphere, and which sort of atmosphere it holds. Then, there’s atmospheric composition, which greatly affects a planet’s temperature, and depends upon the planet’s gravity.

Life may not require liquid water on the surface. A number of frozen-over moons outside our own solar system’s habitable zone, such as Jupiter’s Europa and Saturn’s Enceladus, have giant buried oceans that may be capable of having evolved life as we know it: at first sight, they seem to have had warm, liquid water, for even longer than Earth. Indeed their heat is gravitationally generated, from massaging the moon with the giant planet’s gravity.

(An old consideration has also been life as we don’t know it, which would depend on something other than water as a solvent; however, the combination water-carbon seems impossible to beat in the wealth of possibilities…)

The $600 million Kepler mission launched in March 2009, following a successful pioneering French satellite, CoRoT (named, in part, after the painter Corot). The first confirmed exoplanet around a Sun-like star was discovered at the French Observatoire de Haute Provence. As of 1 October 2018, there are 3,851 confirmed exoplanets in 2,871 stellar systems, with 636 systems having more than one planet.

Further philosophy out of all this?

Of course!

Next: the related, and philosophically as deep as it gets: Big Silence From Necessary Malevolence?

Patrice Ayme

 

 

Will Starburst Galaxies Explode the Big Bang?

June 11, 2018

There are MANY MORE GIANT STARS THAN EXPECTED IN THE UNIVERSE (a factor of ten?)

I have proposed that the Big Bang Model is wrong, and that the universe could be much older, of the order of 100 billion years old, not 13.8 billion years; my iconoclastic and inconsiderate reasoning was philosophical: we have one expansion mechanism, DARK ENERGY. That expansion, Dark Energy, was directly observed; it exists, it’s not a figment of the imagination. Many a physicist made a sour face, as Dark Energy was not expected at all: hundreds of arrogant claims to explain the whole universe, talking to the media and the gullible as if one were god, and then, next thing one knows, one’s theories don’t explain 95% of the universe…

So an insolent philosophy asked: ‘Why would we need another cosmic expansion mechanism?’ Especially an expansion mechanism NOT directly observed, a figment of the imagination, the so-called Inflaton Field, necessary to make the Big Bang theory work (because of arcane complications: basically the universe as observed is around 100 billion light years across, and could have got that big only if it expanded at 10^10 times the speed of light, or something like this… Confusing enough? I have explained what is going on here and there, such as the locality of the speed of light, and the embedding theorem of Lorentzian manifolds. Stay tuned…)

A (non spectacular) starburst galaxy, the Cigar, 12 million light years away. Full starburst galaxies are very blue, from the giant, extremely hot (thus blue) stars in their midst. How much do we know about Helium formation in such supergiant stars? Philosophers want to know!

So why is the Big Bang necessary? Besides making some people more puffed up than god itself?

Inspired by the H bombs they were thoroughly familiar with, Gamow, Alpher and Herman proposed the hot Big Bang as a means to produce all of the elements: extreme heat caused collisions and the nuclei fused (from the “STRONG FORCE”).

The lightest elements (hydrogen, helium, deuterium, lithium) were produced in Big Bang nucleosynthesis.

Ms. Burbidge, Mr. Burbidge, Fowler and Hoyle worked out the nucleosynthesis processes that go on in stars, where the much greater density and longer time scales allow the triple-alpha process (He + He + He → C) to proceed and make the elements heavier than helium.

But B²FH could not produce enough helium. The solution, which Hoyle didn’t like at all, was to make the helium in the Big Bang. Now we think we know that both processes occur: most helium is produced in the Big Bang, but carbon and everything heavier is produced in stars. Most lithium and beryllium is produced by cosmic ray collisions breaking up some of the carbon produced in stars.

In a pirouette, helium abundance is now viewed as the observation which makes the Big Bang necessary… Yet, all this rests on an ironclad understanding of stellar physics… which we assume we have, although we don’t.

Astronomers at the gigantic, high altitude Atacama Large Millimeter/submillimeter Array (ALMA) in Chile investigated intense bouts of star formation in four distant, gas-rich starburst galaxies, where new stars are formed 100 or more times faster than they are in the Milky Way.

By looking at isotope ratios in Inter-Stellar Medium (ISM) carbon monoxide, CO, one can see if it has been generated in light, or heavy, stars. To quote from the original article in Nature: “

Oxygen, carbon and their stable isotopes are produced solely by nucleosynthesis in stars. The minor isotopes, ¹³C and ¹⁸O, are released mainly by low- and intermediate-mass stars (those with stellar mass less than eight solar masses, M* < 8M⊙) and massive stars (M* > 8M⊙), respectively, owing to their differing energy barriers in nuclear reactions and evolution of stars. These isotopes then mix with the interstellar medium (ISM) such that the ¹³C/¹⁸O abundance ratio measured in the ISM becomes a ‘fossil’, imprinted by evolutionary history and the stellar initial mass function (IMF). The abundances of the ¹³CO and C¹⁸O isotopologues in the molecular ISM, whose measurements are immune to the pernicious effects of dust, are therefore a very sensitive index of the IMF in galaxies.”

***

Conclusion of the Nature article:

Classical ideas about the evolutionary tracks of galaxies and our understanding of cosmic star-formation history are challenged. Fundamental parameters governing galaxy formation and evolution—star-formation rates, stellar masses, gas-depletion and dust-formation timescales, dust extinction laws, and more—must be re-addressed, exploiting recent advances in stellar physics.

This doesn’t prove my ideas about the universe are right. Yet the article mentions that star formation rates have to be lowered by a factor of… seven. (I will resist multiplying 13.8 billion by 7, which is… not making this up, very close to 97 billion…)

This doesn’t prove my ideas about the universe are right… But it goes my way… OK, let a professional conclude:

“Our findings lead us to question our understanding of cosmic history,” Rob Ivison, co-author of the study and director for science at the European Southern Observatory, said in the statement. “Astronomers building models of the universe must now go back to the drawing board, with yet more sophistication required.”

Moods in science cannot change until evidence contrary to the old vision of things accumulates. Before that, a change of paradigm can’t be hoped for. Long ago, when I used to be all too human, I communicated with a director at ESO. I am delighted by the change of tone, not to say mood… (Another guy I knew was so arrogant that he posited one was not really a scientist until one was the director of a lab, which he happened to be… in astrophysics, the field at hand, where it turns out the big picture was missed…)

But, ladies and gentlemen, remember this: wisdom, even scientific wisdom, doesn’t always triumph in a timely manner. We have examples in science, and mathematics, where wisdom was delayed and defeated for 24 centuries… by the greatest stupidity

Patrice Ayme

***

***

Examples of delayed wisdom: a) The Atomic Theory, of course, complete with eternal motion in the small (which the Greeks had observed and is strikingly described by Lucretius). The theory was then forgotten until the 19C.

b) The Archimedean Axiom in arithmetic/the theory of infinity, whose import went undetected until circa 1960, when the logician/mathematician Abraham Robinson exploited it (non-standard analysis).

c) Non-Euclidean geometry found 24 centuries ago, and then lost until 1830 CE…

d) Biological evolution theory, lost between Anaximander and Lamarck… Although practiced by all serious breeders (especially Greek).

e) Computers, lost for 17 centuries… we have one surviving proof, the Antikythera mechanism (and various written descriptions), then nothing until Blaise Pascal… Hence the computer language “Pascal”.

f) The heliocentric theory of Aristarchus of Samos, lost between Archimedes and Buridan (and buried again by Catholicism). Heliocentrism was of course obvious, except if one is a caveman, and not too observant…

g) And of course that the Earth was round, and how big, established and measured first by the great scientist and explorer Pytheas of Massalia (Pytheas de Marseilles), circa 320 BCE. Pytheas even related the tides to the Moon, and got the explanation roughly right (whereas Galileo Galilei, 19 centuries later, got the explanation of the tides completely wrong, and, not just that, got into a near-lethal fight with his friend the Pope, whom he brushed off as an ignoramus… when the Pope was actually less wrong than Galileo…)

Super Earths, Or How The Exponential Function Can Matter

April 23, 2018

We live in the times where exponentials have come to rule, as they never ruled before. Ignore them at the risk of everything we claim to hold dear. As mathematically challenged Silicon Valley nerds put it, all too simplistically, the coming “singularity” looms. Simple minds do not create much understanding, though, so here is a little elaboration…

An example of exponentials in action, is graciously offered by so-called “Super Earths“, giant versions of Earths, hundreds of which have been discovered in our neighborhood.

Before I get into this, a short lesson on the exponential.

The Ancient Greeks thought they knew mathematics, but they were prisoners of linear thinking (especially after the top intellectuals spurned non-Euclidean geometry and arithmetic). The exponential is the most obvious, most crucial to understand, most vital to handle example of nonlinear thinking.

An exponential is any function which grows proportionally to itself.
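In symbols, df/dt = kf, whose solution is f(t) = f(0)e^{kt}. A tiny Python sketch, with an illustrative 7% growth rate:

```python
# "Grows proportionally to itself": integrate df/dt = k*f crudely (Euler steps)
# and compare with the exact exponential solution f0 * exp(k*t).
import math

k, f0 = 0.07, 1.0          # 7% growth rate, illustrative
dt, steps = 0.01, 1000     # integrate out to t = 10

f = f0
for _ in range(steps):
    f += k * f * dt        # the increment is proportional to the current value

t = dt * steps
print(f, f0 * math.exp(k * t))   # both ~2.01: 7% growth doubles in about 10 units
```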

Our present “leaders” (Putin, Trump, Xi, Macron, etc.), and their underlings have no idea what an exponential is, and that it feeds on itself.

Civilizations get ambushed by exponentials. This is why they so often irresistibly decay: the effect is blatant, be it the Late Roman empire, Tang China, the Maya…  

***

Socrates: “The unexamined life is not worth living“. That was HIS (wise) feeling. His own feeling. Others don’t have to share it. Actually vain, self-admiring, erroneous, hateful people detest nothing more than self-examination. They deeply dislike, and hinder, those who, and that which, promote self-examination.

And tell me, Socrates, you who didn’t like knowledge you didn’t already have, and you thought everybody had, when did you learn about the exponential function? How can you know something that important you never even suspected existed? And, absent that tool of the spirit, you thought you could examine everything? How stupid was that? And you, out there, the ignorant admirers of Socrates and his ilk: you don’t even have the excuse to have been dead for 24 centuries! To extract you from the gutter, seize the exponential!

***

After the discovery of a few thousand exoplanets, Super Earths appear, so far, more frequent than simple Earths (it may be a bias of our present telescopes, but I don’t think so…). If a Super Earth is only slightly bigger than Earth, depending upon the nature of its core, its surface gravity doesn’t have to be much higher than Earth’s (I computed). However, the present article considers Super Earths where the gravity is much higher than on Earth…

“Super-Earth” planets are gigantic versions of Earth. In some ways, they are more likely to be habitable than Earth-size worlds: their thicker atmospheres protect them better from radiation, be it from their parent stars, supernovae, gamma ray bursts, galactic core explosions, etc. However, it would be difficult for any inhabitants of these exoplanets to access space. At least with known, or imaginable, technologies.

To launch a vehicle as light as the Apollo moon mission capsule, a rocket on a super-Earth such as (potentially inhabitable) Kepler 20b would require more than double the escape velocity.

To leave Earth’s (“⊕”) gravitational influence, a rocket needs to achieve at minimum the escape velocity v_esc = √(2GM⊕/R⊕) ≈ 11.2 km/s for Earth, and v_esc ≈ 27.1 km/s for a 10 M⊕, 1.7 R⊕ Super-Earth similar to Kepler-20b. Computation shows one would need a launch mass of about 400,000 metric tons, mostly due to the exponential demand for fuel. That’s 5% of the mass of the Great Pyramid of Giza in Egypt (still by far the Earth’s most massive monument, excluding utilitarian walls and dams).
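A hedged Python sketch of why the fuel demand explodes: the Tsiolkovsky rocket equation. The exhaust velocity and payload below are assumed, illustrative values, and gravity, drag and staging losses are ignored, which is why the real figures (like the ~400,000 tons above) come out far bigger.

```python
# Tsiolkovsky: initial mass / final mass = exp(delta_v / v_exhaust).
# The fuel bill is exponential in the velocity the planet demands.
import math

V_EXHAUST = 4.4e3      # m/s, roughly a hydrogen/oxygen engine in vacuum (assumed)
PAYLOAD_KG = 45_000.0  # roughly an Apollo-class payload (assumed)

def minimum_launch_mass(delta_v_m_s):
    """Lower bound on the launch mass, ignoring gravity and drag losses."""
    return PAYLOAD_KG * math.exp(delta_v_m_s / V_EXHAUST)

for world, v_esc in [("Earth", 11.2e3), ("Kepler-20b-like Super-Earth", 27.1e3)]:
    print(f"{world}: v_esc = {v_esc/1e3:.1f} km/s, "
          f"launch mass > {minimum_launch_mass(v_esc)/1e3:,.0f} t")
```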

That means a chemical rocket there should have one hundred times the mass of one here (Apollo’s Saturn V launcher was 3,000 tons). However, that’s not a show stopper: our largest ocean-going ships are more massive than that, and a massive rocket is imaginable. So Hippke is not correct when he says that:

“On more-massive planets, spaceflight would be exponentially more expensive,” said study author Michael Hippke, an independent researcher affiliated with the Sonneberg Observatory in Germany. “Such civilizations would not have satellite TV, a moon mission or a Hubble Space Telescope.”

This is of great practical interest. Research has revealed that Super Earths are abundant, and obvious targets for human colonization. They can reach up to 10 times the mass of our own Earth (after that, they retain light gases, and turn into mini-Neptunes, unsuitable for direct colonization, although Pandora-like scenarios are highly plausible). Many super-Earths apparently lie in the habitable zones of their stars, where temperatures can theoretically support liquid water on the planetary surface and thus, potentially, life as it is known on Earth. Although I have had reservations about this: I view the presence of a nuclear reactor inside the planet as necessary for life, since it provides a magnetic shield, and the recycling of the atmosphere through plate tectonics, let alone continents… (Being in the water belt and the nuclear belt simultaneously is a miracle Earth’s biosphere profits from.)

This being said, it is true that some of the ways to access space that we potentially have won’t work on Super Earths. Rockets work better in the vacuum of space than in an atmosphere: super-Earthlings might want to launch from a mountaintop. However, the strong gravitational pull of super-Earths would squash down super Alps (it’s a pure application of Quantum mechanics). Super towers won’t be feasible, either…
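A back-of-envelope Python sketch of the “squashed Alps” point: rock fails when the pressure at a mountain’s base reaches its strength (a strength which is, ultimately, of quantum origin), so the maximum height scales like S/(ρg). The strength and density values below are rough assumptions.

```python
# Maximum mountain height ~ S / (rho * g): the stronger the gravity,
# the flatter the topography.
S_ROCK = 2.5e8     # Pa, rough compressive strength of crustal rock (assumed)
RHO_ROCK = 2.7e3   # kg/m^3, typical rock density (assumed)
G_EARTH = 9.81     # m/s^2

def max_mountain_height_m(surface_gravity):
    return S_ROCK / (RHO_ROCK * surface_gravity)

print(max_mountain_height_m(G_EARTH))         # ~9,400 m: Everest-class
print(max_mountain_height_m(2.8 * G_EARTH))   # ~3,400 m on a Kepler-20b-like world
```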

Using space elevators traveling on giant cables rising out of the atmosphere depends upon the strength of the cable material. The strongest (per unit of mass) material known today, carbon nanotubes, is just barely strong enough for Earth’s gravity (it is not at this point possible to imagine stronger materials, putting in doubt the feasibility of space elevators on super-Earths). Here is Michael Hippke (Submitted on 12 Apr 2018):

Spaceflight from Super-Earths is difficult:

 

Many rocky exoplanets are heavier and larger than the Earth, and have higher surface gravity. This makes space-flight on these worlds very challenging, because the required fuel mass for a given payload is an exponential function of planetary surface gravity, ∼3.3exp(g0). We find that chemical rockets still allow for escape velocities on Super-Earths up to 10 times Earth mass. More massive rocky worlds, if they exist, would require other means to leave the planet, such as nuclear propulsion.

Comments: Serious version of the April Fool’s idea (arXiv:1803.11384). Submitted on April 4th 2018
Subjects: Popular Physics (physics.pop-ph); Earth and Planetary Astrophysics (astro-ph.EP)
Cite as: arXiv:1804.04727 [physics.pop-ph]
(or arXiv:1804.04727v1 [physics.pop-ph] for this version)
1. INTRODUCTION

Do we inhabit the best of all possible worlds (Leibnitz 1710)? From a variety of habitable worlds that may exist, Earth might well turn out as one that is marginally habitable. Other, more habitable (“superhabitable”) worlds might exist (Heller & Armstrong 2014). Planets more massive than Earth can have a higher surface gravity, which can hold a thicker atmosphere, and thus better shielding for life on the surface against harmful cosmic rays. Increased surface erosion and flatter topography could result in an “archipelago planet” of shallow oceans ideally suited for biodiversity. There is apparently no limit for habitability as a function of surface gravity as such (Dorn et al. 2017). Size limits arise from the transition between Terran and Neptunian worlds around 2 ± 0.6 R⊕ (Chen & Kipping 2017). The largest rocky planets known so far are ∼ 1.87 R⊕, ∼ 9.7 M⊕ (Kepler-20 b, Buchhave et al. 2016). When such planets are in the habitable zone, they may be inhabited. Can “Super-Earthlings” still use chemical rockets to leave their planet? This question is relevant for SETI and space colonization (Lingam 2016; Forgan 2016, 2017).

***

Pessimistically, Hippke considered another possibility, a staple of science-fiction which originated in the very serious “Orion” project of the 1950s, an apocalyptic period: nuclear pulse propulsion. It works by detonating thousands of atom bombs below a shield cum shock absorber attached to the vehicle, hurling it through space. This explosive propulsion has much more lifting power than chemical rockets, and might be the only way for a civilization to leave a planet more than 10 times Earth’s mass, Hippke (naively) said.

However, slaying the radioactive dragon he himself brought up, such a nuclear-powered spacecraft would pose not only technical challenges but political ones as well, he said: “A launch failure, which typically happens with a 1 percent risk, could cause dramatic effects on the environment. I could only imagine that a society takes these risks in a flagship project where no other options are available, but the desire is strong — for example, one single mission to leave their planet and visit a moon.”

Unwittingly, Hippke then demonstrates the danger of the single mind (in this case, his!) Indeed the most obvious way to use nuclear propulsion is simply to run a liquid, even water, through the core of a nuclear fission reactor. That was tested, and it works extremely well… and very safely! It’s much less prone to failure than a chemical rocket.  On a planet with ten times the Earth’s surface, there would be plenty of space to do such dirty launches by the thousands.

Besides, it may be possible to engineer absolutely giant thermonuclear PROPULSION reactors (thermonuclear fusion gets easier the larger the reactor: the exponential at work again; if we just made a fusion reactor that was large enough, it would certainly work). The radioactivity generated would be negligible.

So we don’t have to worry about colonizing Super Earths… We just have to worry about weight (that is, surface gravity)….

But, here, now, we have to worry about all those exponentials going crazy. Last I checked, the Arctic ice was running one million square miles below its old minimum: at some point the so-far linear decrease of Arctic ice is going to turn exponential, as warming there is highly self-feeding (that’s why it already runs at twice the rate of the rest of the planet…).

And as usual, let’s remember what the arrogant, stupid imperial Romans never learned, and the Maya never reached: inventing completely new, liberating, energizing technologies is how, and the only way how, to break the strangulation from the ecological, political, economical and moral exponentials which smother civilizations. A most recent example is diffuse, dim light solar cells, dye-sensitized solar cells (DSSCs), a tech already in full deployment, which has just made spectacular progress in the lab.

Even language acquisition is exponential… Let alone thought system acquisition. You want to examine life, in ultimate depth? Learn to think exponentially!

The coming “singularity” looms. How to manage it? First by understanding what makes it tick, exponentials.

Patrice Aymé

 

We Are No Dinosaurs: H Bombs Potentially Save Lives, Whereas PC kills

March 18, 2018

I have long advocated comet and asteroid defense. The subject is intricate, with high stakes, and made even more interesting by the very strange reactions it has brought.

Years ago, I claimed that H bombs were the ultimate, and necessary, tool to achieve security. At the time, “experts” were ruling out nukes. Contemporaneously, I had a major fight with a (decorated) geophysicist about my claim that the Earth functions as a nuclear reactor, thus driving the Earth’s magnetic field, which protects the upper atmosphere from stripping by the solar wind during Coronal Mass Ejections (which is, we now know, the way Mars’ atmosphere was stripped). In my opinion both experts’ opinions were the fruit, not of science, but of the Political Correctness which fed them (they got to their expert positions not thanks to their smarts, but thanks to being PC).

Political Correctness, in this case “nukes are evil”, was so great at the time that “experts”, to achieve astronomical moral superiority over the moral turpitude of the ilk of yours truly, the moral turpitude of those who tell the truth, pontificated idiotically that H-bombs would be inefficient, unsuitable, inappropriate, and that to promote their usage was, besides criminal, the mark of a lack of scientific culture. Moreover, they added, impacts were impossible in the foreseeable future.

Here is an update: the old “experts” were wrong, like 100% wrong, as peer-reviewed journals and the US government now recognize, it is my pleasure to reveal.

Siding Spring comet: the smallest one is even more dangerous, because of its speed. Europe landed on, and orbited with Rosetta, the bigger one… which is a tenth of the largest comet known. As comets fly by, they can destabilize others, or asteroids.

 

A new generation of experts has arisen, who say exactly what I used to say: the obvious. The latest study, in a Russian lab, mimicked nuclear blasts, using lasers (whose energy, just as a nuclear bomb’s, is mostly photons, initially). A US study, on the project Hypervelocity Asteroid Mitigation Mission for Emergency Response (HAMMER), concluded the same. New facts have come to the surface.

***

The probability of impacts, as I said in the past, was underestimated:

As demonstrated by the Tunguska impact in Siberia, the “impact” of a bolide less than 200 meters across has a high probability of ending as an airburst. Tunguska flattened 2,150 square kilometers of forest, destroying 80 million trees. That’s a circle with a diameter of about 50 kilometers. In other words, exploding above one of the largest cities on Earth, it could kill up to 30 million people, annihilating Tokyo-Yokohama, Mexico City, New York, Moscow, the greater Paris, etc… Initially it was thought the explosion was up to 30 megatons of TNT, but then it was realized one should take into account the momentum of the disintegrating bolide, just as in the case of a hollow charge penetrating armor. That lowered the yield to no more than 5 megatons!

A stony asteroid of about 10 m (33 ft) in diameter can produce an explosion of around 20 kilotons, similar to that of the Fat Man plutonium implosion bomb dropped on Nagasaki. Data released by the US Air Force’s Defense Support Program indicate that such explosions occur high in the upper atmosphere more than once a year.

The 1930 Curuçá River event in Brazil, observed by many, was an explosion of a superbolide that left no clear evidence of an impact crater. A smaller air burst occurred over a populated area on 15 February 2013, at Chelyabinsk, in the Ural district of Russia. The exploding meteoroid was an asteroid that measured about 17 to 20 metres across, with an estimated initial mass of 11,000 tonnes, and inflicted over 1,200 injuries (mostly from flying glass, like a nuclear blast)… It would have been worse, had it streaked closer to the city.

***

ATLAS, the Asteroid Terrestrial-impact Last Alert System:

So now, the good news, thanks to our old friends at NASA: ATLAS is running, with one telescope on top of Maui at 3,000 meters, and the other on top of Mauna Loa, 100 miles away, at 4,200 meters.

ATLAS is an asteroid impact early warning system developed by the University of Hawaii and funded by NASA. It consists of two telescopes, 100 miles apart, which automatically scan the whole sky several times every night looking for moving objects. On the side, it has already detected several comets, and 1,200 supernovae, mostly type Ia, among other scientific feats. It has also discovered 17 Potentially Hazardous Asteroids…

Second good news, a new generation of experts, American and now Russian, has established that nuclear bombs would be safe and effective to dispose of a dangerous impactor detected too late to nudge it away… Besides, they would be the only way (as I used to say).   

***

Blowing off skeptics with the hypervelocity Siding Spring comet:

Some fanatics, outwardly humanistic, inwardly the opposite, will sneer that my desire to find some use for thermonuclear fusion is pathetic: who cares if a given city has a non-negligible probability of being vaporized by the heavens in the next 100,000 years?

However, the probability is much higher than Conventional Wisdom has it. And a new reasoning will be deployed below, not found in the scientific literature, to my knowledge.

An example was the Siding Spring comet. Nobody expected this sort of comet. It zoomed on what looked like a straight line through the Solar System, passing Mars at 56 kilometers per second. A typical speed for a meteorite (and probably Tunguska) is 11 kilometers per second.

As the really great, although female, physicist Émilie Du Châtelet demonstrated in the Eighteenth Century, the energy of a body of mass m going at speed v is ½mv². So energy per unit of mass augments as the square of the speed. Thus one kilogram of the Siding Spring comet had 25 times the energy of one kilogram of the Tunguska bolide. Add to this the fact that Siding Spring was much smaller than its big tail advertised: only 400 to 700 meters across. Siding Spring passed very close to Mars (140,000 kilometers, half the Earth-Moon distance). If Siding Spring had hit Earth, that would probably have been at an angle, and it would have liberated an amazing amount of energy.

Computer simulations show that a 200 meter diameter comet going at 11 km/s would explode with an energy of about 30 megatons. If the comet were 650 meters across, the volume, hence the mass, goes up by a factor of (650/200)³ ≈ 34, so say: 1,000 megatons (Siding Spring’s diameter: 400 to 700 meters). Multiply by 25, from the v² factor of Émilie Du Châtelet: roughly 25,000 megatons of TNT. The average penetration angle would be 45 degrees, giving an average trajectory of hundreds of kilometers by hypersonic surfing of the shock wave on the upper atmosphere, say 600 kilometers over 10 seconds. Deceleration: 56,000 meters per second lost over 10 seconds, in other words 5,600 m/s², more than 500 times the acceleration of Earth’s gravity.
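A minimal Python sketch of those orders of magnitude; the comet density of 0.5 g/cm³ is an assumed, typical value for comet nuclei, and the spherical shape is an idealization.

```python
# Kinetic energy of a spherical comet, expressed in megatons of TNT:
# energy scales with the CUBE of the diameter and the SQUARE of the speed.
import math

MEGATON_J = 4.184e15   # joules per megaton of TNT

def comet_energy_mt(diameter_m, speed_m_s, density_kg_m3=500.0):
    mass = density_kg_m3 * (4.0 / 3.0) * math.pi * (diameter_m / 2.0) ** 3
    return 0.5 * mass * speed_m_s ** 2 / MEGATON_J

print(comet_energy_mt(200, 11_000))   # ~30 Mt: Tunguska-class
print(comet_energy_mt(650, 11_000))   # ~1,000 Mt: bigger comet, ordinary speed
print(comet_energy_mt(650, 56_000))   # ~27,000 Mt: Siding Spring's speed
```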

These are orders of magnitude. Maybe the comet would surf the atmosphere over a minute, etc. In any case, no solid body, a fortiori a fragile comet, can withstand hundreds of gs. It will disintegrate. As it does, it would fry everything in its path. If it surfs two minutes, it could leave a trail of devastation hundreds of kilometers wide, over thousands of kilometers. And guess what? No impact!

***

Some mysterious degradations of living conditions in recorded history are so far unexplained. Volcanoes, of course, are generally the prime culprit, as I long suspected, and explained, in the case of the dinosaurs:

https://patriceayme.wordpress.com/2009/11/21/trapped-by-super-traps/

Chinese and Roman records indicate a spectacular deterioration of conditions in the Sixth Century. Volcano, or impact? It seems two distinct eruptions were the culprits. Recently, mainly from the work of some researchers in France, mass deaths in Europe from starvation in 1257 CE were explained by the explosion of a giant volcano on the island of Lombok, which brought down the kingdom there. The explosion of Thera/Santorini brought down Cretan civilization, through tsunamis, ash, quakes, etc…

If an impact was found to have occurred in recorded history, with catastrophic consequences, instead of still another crazy volcano, the probability of those extraterrestrial events would be jacked up. In the scenario I gave, the impact from a fast-moving, small comet, there would be no traces on the ground.

In any case, the pollution of questions of life and death by Political Correctness can be utterly grotesque: it was clear all along that, in some configurations, we would have to nuke asteroids or comets. To pretend otherwise was idiotic, corrupt.

Having to use nuclear energy to save ourselves is very good. And ultimately, we need controlled, sustained thermonuclear fusion. If we had it now, the CO2 catastrophe (and many other catastrophes) would be avoided. Nuclear is our friend, if, and only if, well done.

What’s never our friend is any notion that it is philosophically correct to believe that fancy and corrupt notions of so-called Political Correctness trump the truth. It’s not just that nothing trumps reality. By faking reality, we reject altruism. 

Patrice Aymé

 

Watch This Ocean Of Galaxies, And Tremble!

October 10, 2017

SOME BARYONIC MATTER FOUND

Observations of galaxies and galaxy clusters in the local universe accounted for only 10% of the “normal” particle (baryon) content inferred from measurements of the cosmic microwave background and from nuclear reactions in the early Universe. Locating the remaining 90% of baryons has been one of the major challenges of modern cosmology. Cosmological simulations predict that the ‘missing baryons’ are spread throughout filamentary structures in the cosmic web, forming a low density gas with temperatures of 10^5 to 10^7 Kelvin.

Using the boost given to photons by very hot plasma (“Inverse Compton Scattering”), the estimated gas density in these 15 Megaparsec-long filaments (that’s around 50 million light years) is approximately 6 times the mean universal baryon density, and overall this can account for ∼30% of the (Big Bang hypothesized, thus deduced) total baryon content of the Universe. This result establishes the presence of ionised gas in large-scale filaments, and suggests that the missing baryon problem may be resolved via observations of the cosmic web.
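(A one-line Python check of the unit conversion above:)

```python
LY_PER_PARSEC = 3.2616           # light years in one parsec
print(15e6 * LY_PER_PARSEC)      # 15 Mpc ≈ 4.9e7 light years, i.e. ~50 million
```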

Hubble Ultra Deep Field: galaxies forever. Something very simple and deep here: where is everybody? There are more stars out there than grains of sand on Earth. But any civilization in our style would show up very quickly, thanks to the large structures it would build, none of which are observed… So tremble: all the imaginable explanations are rather ominous…

Think of it: there may be 40 billion Earths in our galaxy alone! Then remember that 10^12 galaxies loom out there…

That partly solves the missing mass problem for normal matter. It has nothing to do with the missing mass problem for Dark Matter, or Dark Energy. I suggest both arise from a (Sub-)Quantum Effect, a prediction from a theory more general than Quantum Physics as we know it today. The basic idea is that there is something one should know as the “Quantum Interaction”, and it proceeds at a finite speed.

The “Quantum Interaction” would proceed at the Entanglement speed, the Collapse speed. Over cosmological distances, it leaves remnants: Dark Matter. It also weakens gravitation over cosmic distances, accelerating the universe.

Some will scoff. However, basic ideas in physics can be simple.  Often the simpler, the deeper.

If I am right about Sub Quantum Physics, all our physics establishment looks rather pathetic… All the more as experiments could be made…

https://patriceayme.wordpress.com/2017/09/23/sub-quantum-gravitational-collapse-2-slit-thought-experiment/

Back in 1969 the Sunyaev-Zel’dovich paper predicting the effect of hot plasma on Cosmic Background light came out: “The interaction of matter and radiation in a hot-model universe”. It would be decades before the effect was first detected. The paper was written almost entirely by Sunyaev, with the famous Zel’dovich (“Cosmic Inflation”) merely adding in how difficult the effect would be to detect. Nearly 50 years later, it has been used to detect the missing normal matter in the Universe. However the fundamental idea is just Inverse Compton Scattering. Nothing new.

Prizes such as the Nobel lionize, erroneously, a few people, misleading us about how humanity’s achievements in science are made (even Scientific American agrees a bit with me now). The nearly dozen scientists related to the present story, however meritorious, were eminently replaceable, but their discovery was not.

Science needs to be supported by all (taxes! redistribution!), and can rise, only if shared and appreciated by all. Modesty, when looking up at this immense universe, is of the essence. It may well be full of life, but empty of any advanced intelligence. Why? Hubris. Hubris is mostly to be suspected there. Even our most advanced thinkers are just monkeys on a beach, looking at pretty shells. They should admit it, and to themselves first of all… (Thanks to Isaac Newton for the basic idea here: he said he was just a boy on a beach, picking up pretty shells…)  

Watch this ocean of galaxies, and tremble!

Patrice Ayme’

Relativistic Philosophy Beyond Consensus

August 4, 2017

It’s good to focus on “General Relativity” and Cosmology without the cloak of mathematics gone wild and unsupervised, indeed.

Anything having to do with “General Relativity” has a lot of extremely debatable philosophy hidden below a thick carpet of computations. Abuse of philosophically unsupervised spacetime leads one to believe in time machines, wormholes, and similar absurdities. A recent discovery such as Dark Energy (ever expanding space faster than previously anticipated), and a not so recent one, Dark Matter, show one has to be extremely careful.

Einstein equation of “General Relativity” (GR) is basically Curvature = Mass-Energy. Einstein long observed that the left hand side of the equation was built of mathematical beauty, and the right hand side of a murky mud of a mess. The discovery of Dark Matter proved him prophetic about that. (BTW, I know perfectly well that, stricto sensu, it’s the Ricci tensor, derived from the full Curvature tensor on the left…)

First a philosophical trap: “General Relativity” (GR) is a misnomer. It’s not clear what’s being generalized. GR is certainly a theory of the relationship between gravity and local space-times (the Theory of Relativity of space and time which Poincaré named that way in 1904).

Einstein was initially motivated to explain inertia according to the Newton-Mach observation that the distant stars seemed to endow matter with inertia (because if matter rotates relative to distant stars, a centrifugal force appears).

That way, he failed, as Kurt Goedel produced spacetime models which rotated wildly without local consequences. Frame dragging exists nevertheless, and relativistic corrections are crucial to GPS. So GR has local consequences.

Neither Poincaré nor Einstein liked the concept of “spacetime”.

There are massive galaxy clusters, such as Abell 370 (shown here). They can be made up of thousands of Milky Way-sized galaxies. This is beyond anything we can presently have a feeling for. The space inside this cluster is not expanding, that’s a fact, but the space between this cluster and other, unbound, galaxies and clusters is viewed by today’s Main Stream Cosmology as expanding. I’m robustly skeptical. Image credit: NASA, ESA/Hubble, HST Frontier Fields.

A question has naturally come up: if space expands, how come we don’t? An answer to this has been the raisin bread model of the expanding universe.

As Sabine Hossenfelder, a theoretical physicist in Quantum Gravity and High energy physics  puts it: “In cosmology, too, it helps to first clarify what it is we measure. We don’t measure the size of space between galaxies — how would we do that? We measure the light that comes from distant galaxies. And it turns out to be systematically red-shifted regardless of where we look. A simple way to describe this — a space-time slicing that makes calculations and interpretations easy — is that space between the galaxies expands.”

However, the entire area is contentious. The usual snap-back of haughty physicists, keen to deny the Commons any brains worth noticing, is to say that all those who don’t understand the mathematics at hand should shut up.

That’s a disingenuous answer, as NOBODY understands fully the mathematics at hand (those with snappy rejoinders know this, but they enjoy their power maliciously).

An example of the non-universality of the notion of expanding space is the following exact quote from Physics Nobel Laureate Steven Weinberg, author, among many other things, of the Weinberg-Salam model of the electroweak interaction, and of the most famous textbook on the subject, “Gravitation and Cosmology”: “…how is it possible for space, which is utterly empty, to expand? How can nothing expand? The answer is: space does not expand. Cosmologists sometimes talk about expanding space, but they should know better”

Well, they don’t.

Reference https://www.physicsforums.com/threads/raisin-bread-model-of-space-time.901290/

Personally, I think that both space and time are local concepts (as long as one does not bring into consideration the Quantum theory, created post 1923 by De Broglie and, after 1924, by the Copenhagen School). Local space and local time are united by the speed of light, c, through naturally ubiquitous light clocks. Space and time are measured locally (although Poincaré proposed slowly moving synchronized clocks around, and Einstein copied and published that mechanism, verbatim, as he had with E = mc²).

It has been proposed that the redshift of cosmological photons, attributed 100% to the expansion of spacetime, is a proof of that expanding “spacetime”. One must say that this statement is the core of present cosmology, and anybody looking down on the idea will not be viewed as serious by famous physicists. However, just saying something does not prove it. Especially when the conclusion seems to be the hypothesis.

Lorentz-Poincaré Local Space and Time theory was experimentally provable (electromagnetism proved it).

But where is the proof that the universe is like an expanding dough, spacetime, with galactic raisin grains in it? Just waving the notion that the atomic force is 10⁴⁰ times the gravitational force at small scale does not seem compelling to me. It’s rather a question of range: gravitation is much longer range, although much weaker. Thus the geodesic deviations due to gravitation show up at very great distance, whereas those due to atomic and molecular forces cause enormous geodesic deviations, but only at very short range. We are these enormous local deviations, larger by 10⁴⁰ locally.
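To put a number on that “10⁴⁰” (a back-of-the-envelope check, using the usual comparison of the electric and gravitational forces between a proton and an electron; since both fall off as 1/d², the ratio is independent of distance):

```python
# Ratio of Coulomb attraction to gravitational attraction between a proton
# and an electron; the 1/d^2 cancels, so the ratio is distance-independent.
k   = 8.988e9     # Coulomb constant, N m^2 / C^2
e   = 1.602e-19   # elementary charge, C
G   = 6.674e-11   # gravitational constant, N m^2 / kg^2
m_p = 1.673e-27   # proton mass, kg
m_e = 9.109e-31   # electron mass, kg

ratio = (k * e**2) / (G * m_p * m_e)
print(f"F_electric / F_gravity ~ {ratio:.1e}")  # ~2e39, i.e. of order 10^39 to 10^40
```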

Yet, even this more precise argument smacks of hand waving. Why? Because a theory of local forces as curvatures, although posited by Riemann in 1854, and the foundation of GR, still does not exist (that’s one thing string theory was trying to achieve, and failed). Gravitation remains the only force that is tautologically equivalent to a curved space theory.

Quantum Physics has provided that theoretical spacetime with a nonlocal causal architecture (through Quantum Entanglement). However, that “causality”, although geometric, is non-metric (and thus manifests itself with no geodesic deviation, no force).

Einstein, after a debate with the Austrian philosopher Karl Popper on the nonlocality imparted by the Quantum, attracted the world’s attention to that problem in 1935, with his famous EPR paper. There Einstein denounced the way the “spooky action at a distance” affected distant “elements of reality”. Since then, the spookiness at a distance has been amply confirmed (and it enables one to encrypt space communications while knowing with certainty whether they have been breached, as a Chinese satellite recently showed). Nonlocal effects show unambiguously that the metric (of “spacetime”) does not capture all the geometry (a notion which may surprise physicists, but not those mathematicians who have studied the foundations of their field).

This Quantum architecture has led, so far, to no prophecy, let alone theory, by established physicists. Entangled Quantum architecture is actually not part of the General Relativistic raisin cake model (or any GR model). However, I will venture to say one can view it as predicting Dark Matter, at the very least. It’s just a question of baking something more sophisticated than raisin bread.

Patrice Ayme

Olbers’ Paradox Solved, Yesterday, Now & Tomorrow

June 15, 2017

The oldest cosmological paradox concerns the fact that the night sky should not appear dark in an infinite, ageless Universe. It should glow with the brightness of a stellar surface, because, if we look far enough in any direction, we would see some star.
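The standard way to see why the paradox bites, assuming a static, eternal universe uniformly filled with stars of luminosity L and number density n:

```latex
% Flux received from a thin spherical shell of radius r and thickness dr:
% each star contributes L / (4 pi r^2), and the shell contains n 4 pi r^2 dr stars.
dF \;=\; \frac{L}{4\pi r^{2}} \; n \, 4\pi r^{2} \, dr \;=\; n\,L\,dr
% Every shell contributes the same flux, so an infinite, ageless universe
% would pile up an unbounded (in practice, stellar-surface) sky brightness.
```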

Possible explanations have been considered to get rid of the problem. Here are the most obvious:

  1. There’s too much dust to see distant stars. (This was Heinrich Olbers’ attempted explanation, in 1826. If true, it showed the universe was young! Olbers had several predecessors, including Kepler in 1610 and Jean-Philippe de Chéseaux in 1744… But a German name beats a French one, in the matter of Anglo-Saxon fame….)
  2. The Universe has only a finite number of stars.
  3. The distribution of stars is not uniform. So, for example, there could be an infinity of stars, but they hide behind one another so that only a finite angular area is subtended by them.
  4. The Universe is expanding, so distant stars are red-shifted into obscurity.
  5. The Universe is young. Distant light hasn’t even reached us yet.

Galaxies galore! Hubble Ultra Deep Field, 2014. Other Hubble pictures, of fields within our own giant Milky Way galaxy, show a nearly solid wall of stars: that is, the Olbers effect!

The first attempted explanation is wrong, because dust would heat up too. If it didn’t heat up, that would mean the universe is young. (So Olbers could have predicted that! Or a finite universe!)

The premise of the second explanation may technically be correct. But that means that the universe is finite. The third explanation may be partially correct, because matter is very far from being uniformly distributed in the universe. We just don’t know how severe the lumping is: there are Great Walls (of galaxies!), Great Attractors (of galaxies!), Great Blobs (of quasars!), etc. If the stars are distributed in a lumpy way, then there could be large patches of empty space (which there are, because they have been seen!), so the sky could appear dark except in those directions.

Look far enough, you will hit a galaxy! At least if light does not somehow age…

The final two possibilities are presently viewed as correct by mainstream cosmologists, and as the cause of what’s observed. Some computational arguments suggest that the finite age of the Universe is the larger effect. We live inside a sphere of “Observable Universe” which has a diameter equal to the (comoving) distance covered by the expanding universe during the lifetime of said Universe. That’s 95 billion light-years, according to the most esteemed conventional computation. Objects which were far enough away to start with are too far away for their light ever to reach us.

The resolution of Olbers’ paradox is found in the combined observation that 1) the speed of light is finite and 2) the Universe has a finite age, i.e. we only see the light from parts of the Universe which, at some point in time, were less than 15 billion light-years away. Everywhere far away, say the conventionalists, we should see the fiery light of the Big Bang, and we do, they add: this is the 3 degree Kelvin cosmic background radiation. Initially it was hyper hot, but the light got stretched over the last 13.8 billion years by the expansion of the universe, so now it appears very cold… (Except that I have a different explanation for it!)
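As a numerical aside on that conventional account (standard numbers, used here purely for illustration): radiation temperature scales as 1 + z, so light emitted at roughly 3000 K, at a redshift of about 1100, is observed today near 3 K.

```python
# Conventional stretching of the cosmic background radiation:
# temperature scales as T_emitted = T_observed * (1 + z).
T_now = 2.725             # K, background temperature observed today
z_last_scattering = 1090  # approximate redshift of the last-scattering surface

T_then = T_now * (1 + z_last_scattering)
print(f"Emitted at ~{T_then:.0f} K, observed today at {T_now} K")  # ~2970 K
```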

And now for a word from our sponsor:

***

Subquantum Cosmology’s Olbers’ Paradox Resolution:

How does my own SubQuantum Patrice Reality (SQPR) theory fit in all this? Very well. In my theory, the universe also expands (that’s called “Dark Energy”, and it’s a direct experimental fact). But the universe expands slowly (that’s how I resolve the problems “cosmological inflation” is supposed to resolve, but doesn’t!).

As the universe slowly expands, every single photon wave gets stretched, as in the usual Big Bang Lemaître metric. However, now that effect is not enough to solve Olbers’ paradox (the expansion being too slow). So another effect comes into play: light ages, per Sub Quantum Reality (SQPR). The average photon coming from far away is so spread out when it hits an object somewhere that part of said photon is too far away to coalesce with the rest, thus gets disconnected from the main singularization, and is left, on average, as a 3 Kelvin remnant.

***

Notice that Olbers and his predecessors could have deduced much from the simple fact that the sky was not all like the surface of the sun. Olbers said: that’s because there is dust. But ultimately dust would turn as yellow and hot as the sun too. It didn’t, so either the density of stars was not constant… or the universe was only 6,000 years old, or so (;-)).
This being said, dust should not be ignored. Recently, it was proclaimed that a proof of cosmological inflation had been found, and eminent inflationistas such as Guth were already attributing the Nobel Prize to themselves, but the signal was only an effect of galactic dust.

Conclusion: a simple observation can very well contain revolutionary science, when, and if, logically processed. But one needs courage to do this. An obvious candidate is the collapse of the “wave packet” in Quantum Physics. Attempts to ignore, or deny, that collapse have brought about the “Many Worlds” Derangement Syndrome affecting physics (and not just physics, thanks to mood transmission…)

Patrice Ayme’

DARK MATTER EMERGENCE! (If so, is a New Quantum revolution at hand?)

March 31, 2017

Long story short: My own theory of Dark Matter predicts that Dark Matter is EMERGENT. That could be viewed as a huge flaw, easy to disprove, sending me back to a burrow somewhere to pursue my humble subterranean existence of sorts. HOWEVER, big surprise: DARK MATTER EMERGENCE seems to be exactly what was just observed in 2017, at the European Southern Observatory (ESO)!

***

Anomalies in the behavior of gravitation at galactic scale have become the greatest crisis in physics. Ever:

What is the problem? Four centuries of physics possibly standing on its head! Using the virial theorem, Swiss astronomer Fritz Zwicky discovered and named Dark Matter, or, as Zwicky said in German, “dunkle Materie“, in 1933. Zwicky observed an enormously mysterious gravitational pull.

Zwicky computed that the observed gravitational pull did not correspond to the visible matter, by an ORDER OF MAGNITUDE, and thus Zwicky assumed that there was plenty of matter that could not be seen. (At the time, physicists scoffed, and went to stuff more interesting to the military, thus, better esteemed and more propitious to glorious splurging and handshakes from political leaders!)

If spiral galaxies were only made up of the matter that we can see, stars at the outer edge should orbit the center more slowly than those closer to the center. But this is not the case: all the stars in the Andromeda galaxy move at similar speeds, regardless of their distance from the galactic center. (For nationalistic reasons, Americans love to attribute DM’s discovery to the American astronomers Vera Rubin and Kent Ford, who measured such flat rotation curves in the 1970s. However great Vera Rubin is, that’s despicable: they worked 40 years after Zwicky, whose own evidence came from the motions of whole galaxies in the Coma cluster.)
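For concreteness, here is the comparison the rotation-curve argument rests on, as a sketch with illustrative numbers (not a fit to real Andromeda data): if the visible mass dominated and sat mostly toward the center, orbital speeds should fall off as 1/√r, whereas observed speeds stay roughly flat out to large radii.

```python
import math

G = 6.674e-11        # m^3 kg^-1 s^-2
kpc = 3.086e19       # meters per kiloparsec
M_visible = 2.0e41   # kg, roughly 1e11 solar masses of visible matter (illustrative)

def keplerian_speed_kms(r_kpc):
    """Circular speed expected if essentially all the mass lies inside radius r."""
    r = r_kpc * kpc
    return math.sqrt(G * M_visible / r) / 1000.0

for r_kpc in (5, 10, 20, 40):
    v = keplerian_speed_kms(r_kpc)
    print(f"r = {r_kpc:>2} kpc : Keplerian ~ {v:3.0f} km/s "
          "(observed curves stay roughly flat, ~200-250 km/s)")
```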

Many studies since the 1930s provided evidence for Dark Matter. Such matter doesn’t interact with light, that’s why it is dark. Thus, one can only observe the effects of Dark Matter via its gravitational effects.

Nobel Prizes Were Only Given To the 5% So Far. The 5% Are All That Today’s Official Physics Is About. This Is One Of The Reasons Why I Am Thinking Outside Of Their 5% Box…

***

How does one compute the mass of a galaxy?

One just looks at how many stars it has. (In the Solar System, the sun is a thousand times more massive than all the planets combined; studies of how much stars are moved by the planets around them confirm that most of the mass is in the stars.) And that shows up as the overall light emitted by a galaxy. Summing up the observed light sums up the mass. Or, at least, that was the long-standing idea. (More recently, the pull gravitation exerts on light has been used to detect Dark Matter, and it has been used on a… massive scale!)

At the scale of galaxies, or galactic clusters, the motions of objects indicate at least ten times the gravitational force that should be there, according to gravitation theory, considering the mass we see (that is, the mass of all the stars we see).
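A minimal sketch of that factor-of-ten mismatch (illustrative numbers, not a measurement): the mass needed to hold a star on a circular orbit of speed v at radius r is roughly M_dyn ≈ v²r/G, which can then be compared with the mass summed up from starlight.

```python
G = 6.674e-11       # m^3 kg^-1 s^-2
kpc = 3.086e19      # meters per kiloparsec
M_sun = 1.989e30    # kg

v = 220e3           # m/s, a typical flat-rotation-curve speed (illustrative)
r = 50 * kpc        # m, a radius well outside most of the visible disk

M_dyn = v**2 * r / G      # mass required, inside r, to hold that orbit
M_lum = 6e10 * M_sun      # illustrative luminous-mass estimate from starlight

print(f"Dynamical mass ~ {M_dyn / M_sun:.1e} solar masses")
print(f"Luminous  mass ~ {M_lum / M_sun:.1e} solar masses")
print(f"Ratio ~ {M_dyn / M_lum:.0f}")  # roughly an order of magnitude of 'missing' mass
```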

Problem: that would mean that the so-called “Standard Model” of physics has no explanation for most of the mass in the galactic clusters.

Reality check: the celebrities of physics are very arrogant, and think they know exactly what the universe had for breakfast, 13.8 billion years ago, and how big it was (never mind that their logic is ridiculously flawed). Up to a few years ago, many were in denial that they were missing most of the mass-energy in the universe with their Standard Model theory. 

However, here they are now, having to admit they missed 95.1% of the mass-energy in the universe (according to their own latest estimates)!

A low logical cost solution to the riddle of the apparently missing mass was to decree that all physicists who have studied gravitation since Bullialdus, nearly four centuries ago, got it wrong, and that gravitation is not, after all, an inverse square of the distance law. A problem is that French astronomer Bullialdus’ very elementary reasoning seems still to have kept some validity today. Remember that, in the Quantum Field Theory setting, forces are supposedly due to (virtual) particle exchanges? Well, that was the basic picture Bullialdus had in mind! (Thus those who want to modify so-called “Newtonian Dynamics” wreck the basic particle exchange model!)

***

Bullialdus’ Inverse Distance Squared Law, Basic to Newton-Einstein:

Ismael Boulliau (aka Bullialdus), a famous astronomer and member of the English Royal Society, proposed the inverse square law for gravity a generation before Newton. (Bullialdus crater on the Moon, named for Boulliau, would have water, by the way.) Boulliau reasoned that the force would come from particles emitted by the sun, just like light. Here is Bullialdus’ voice:

“As for the power by which the Sun seizes or holds the planets, and which, being corporeal, functions in the manner of hands, it is emitted in straight lines throughout the whole extent of the world… seeing that it is corporeal, it becomes weaker and attenuated at a greater distance or interval, and the ratio of its decrease in strength is the same as in the case of light, namely, the duplicate proportion, but inversely, of the distances that is, 1/d².”

Why is this still true today? The carriers of force are particles. If they go out to infinite distance (as electromagnetism and gravitation do), then the density of field carriers (photons, gravitons) will go down, as Bullialdus said, for the reason he gave.
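Bullialdus’ geometric argument can be stated in a few lines (a sketch of the counting, nothing more): if a source emits N carriers per second in all directions, the same N must cross every sphere centered on it, so the number per unit area falls as 1/(4πr²), hence the inverse square law.

```python
import math

def carrier_flux(N_per_second, r_m):
    """Carriers crossing a unit area per second at distance r from an isotropic
    source: the same N spreads over a sphere of area 4 * pi * r^2."""
    return N_per_second / (4.0 * math.pi * r_m**2)

N = 1.0e20
print(carrier_flux(N, 1.0) / carrier_flux(N, 2.0))  # 4.0: doubling the distance quarters the flux
```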

Bullialdus’ observation is the basis of Newton’s gravitation theory, which is itself the first order approximation of Einstein’s theory of gravitation. (Einstein’s gravitation is a tweak on Newton’s theory; what Einstein did, actually, was to re-activate Buridan’s inertial theory with advanced mathematics invented by others: Riemann, Ricci, Hilbert, Levi-Civita.)

There is a basic problem here: although Einstein’s theory is a small tweak on Newton’s, MONDs are not. Correcting a theory by a factor of ten, a hundred, or a thousand is no tweak. Moreover: 

The ESO (European Southern Observatory) observation, illustrated above by ESO itself, seems to condemn BOTH of the two known, “official” classes of solutions for the gravitation problem: LCDM Dark Matter and MOND. The only theory left standing is my own Sub Quantic Dark Matter theory, which is fully emergent.

***

2017 ESO Discovery: Slowly Spinning Old Galaxies:

Natascha Förster Schreiber at the Max Planck Institute for Extraterrestrial Physics in Germany and her colleagues have used ESO’s Very Large Telescope in Chile to make the most detailed observations so far of the movement of six giant galactic discs, as they were 10 billion years ago.

They found that, unlike in (quasi-)contemporary galaxies, the stars at the edges of these galaxies long ago, far away, move more slowly than those closer in.

“This tells us that at early stages of galaxy formation, the relative distribution of the normal matter and the dark matter was significantly different from what it is today,” says Förster Schreiber. (Well, maybe. MY interpretation would be very different! No DM!)

In order to check their unexpected results, the researchers used a “stack” of 101 images of other early galaxies to find an average picture of their rotations. The stacked galaxies matched the rotations of the more rigorously studied ones. “We’re not just looking at six weirdo galaxies – this could be more common,” says Förster Schreiber. “For me, that was the wow moment.”

***

MOdified Newtonian Dynamics (MONDs) Don’t Work:

About 10 billion years ago, there was a peak formation period of galaxies. By looking 10 billion light-years away, one can see what was going on then, and have plenty of galaxies to look at. Where was the Dark Matter then? Was there Dark Matter at all? One can answer these questions by just looking, because Dark Matter shows up in the way galaxies rotate, or orbit (in galactic clusters).

The result is both completely unexpected and spectacular! I am thrilled by it, because what is observed to happen is exactly the main prediction of MY theory of Dark Matter!

What is found is that, ten billion years ago, the largest star-forming galaxies were dominated by normal matter, not by the dark matter that’s so influential in galaxies today. (I reckon that this result was already indicated by the existence of galaxies which are mostly Dark Matter… at least in my sort of cosmology which differs massively from the standard Lambda Cold Dark Matter, LCDM model.)

MOND theories, relativistic or not, say that gravity is roughly ten times stronger than Newton’s law predicts far from a mass (say, 30,000 light-years from a galactic center). If that’s the true law of gravitation over the last few hundreds of millions of years (as observed in presently surrounding galaxies), it should also have been the case ten billion years ago. But that’s not what’s observed. So MOND theories can’t be true.
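For reference, the standard deep-MOND relation that such statements paraphrase (textbook MOND, nothing to do with this essay’s own theory): below the acceleration scale a₀ ≈ 1.2×10⁻¹⁰ m/s², the effective acceleration is g = √(g_Newton · a₀), which forces rotation curves to flatten at v⁴ = G·M·a₀.

```python
G = 6.674e-11      # m^3 kg^-1 s^-2
a0 = 1.2e-10       # m/s^2, the MOND acceleration scale
M_sun = 1.989e30   # kg

def mond_flat_speed_kms(M_baryonic_kg):
    """Asymptotic (flat) rotation speed in the deep-MOND regime: v^4 = G * M * a0."""
    return (G * M_baryonic_kg * a0) ** 0.25 / 1000.0

print(f"{mond_flat_speed_kms(6e10 * M_sun):.0f} km/s")  # ~175 km/s for 6e10 solar masses of baryons
```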

***

LCDM cop-out: Dark Matter makes halos, like around the Virgin Mary’s Head!

On the face of it, the discovery about those ten billion year old galaxies says that the galactic disks then did not contain Dark Matter. It seems to me that this shoots down both MOND theories and the LCDM model (that’s the fancy name for the conventional Big Bang, latest version).

However, conventional scientists, and, in particular, cosmologists, are good at pirouettes; that’s why they are professionals. There is still a (twisted) logical escape for the LCDM model. The differences in early galaxies’ rotations demonstrate that there is very little Dark Matter towards the middle of their disks, to start with, reason the Cold Dark Matter specialists. Instead, those ancient galaxies’ disks are almost entirely made up of the matter we see as stars and gas. The further away (and thus earlier in cosmic history) the galaxies were, the less dark matter their disks contained.

The specialists suggest that the turbulent gas in early galaxies condensed into the flat, rotating disk shapes we see today more quickly than Dark Matter, which remained in a diffuse “halo”, which would progressively fall in… but had not yet fallen in enough, ten billion years ago. (That’s weird, because I thought LCDM mixed normal matter and dark matter right from the start. In any case, I am not going to make their increasingly fishy case for them!)

Dark Matter gathers – but it takes time. This is exactly what my theory of Dark Matter predicts. In my own theory, Dark Matter is the result, the debris, of Quantum Interactions (entanglement resolutions, singularization) at very large distances. This debris gathering takes time.

My Dark Matter theory predicts that Dark Matter is an Emergent phenomenon. No other theory does that. Studies of more than 100 old giant galaxies support my theory, thereby making the situation (very) difficult for the conventional Dark Matter theory (“LCDM”), and impossible for the MOND theories.

This progressive build-up of Dark Matter is NOT predicted by the other two Dark Matter theories. The standard (LCDM) cosmological Dark Matter model does NOT predict a slow gathering of Dark Matter. Nor do the MOdified Newtonian Dynamics theories (MOND, relativistic or not) predict a slow apparition of Dark Matter.

It has been taken for granted by the Dark Matter advocates that Dark Matter, a sort of non-standard standard matter, was in the universe from its legendary start, the Big Boom, aka Big Bang.

“This is an important step in trying to figure out how galaxies like the Milky Way and larger galaxies must have assembled,” says Mark Swinbank at Durham University. “Having a constraint on how early the gas and stars must have formed the discs and how well-mixed they were with dark matter is important to informing their evolution.”

Journal reference: Nature, DOI: 10.1038/nature21685

Right. Or maybe, as I speculate, for plenty of excellent reasons coming from logically far away, this is an indication that it is not Gravitation Theory, but Quantum Theory, which is not correct. Oh, the Standard Model, too, is not correct. But we all already knew this…

Conclusion: If the ESO observation that Dark Matter was not present in large galactic disks ten billion years ago is correct, I cannot imagine how MOdified Newtonian Dynamics theories could survive. And I find it highly implausible that LCDM would. All that is left standing is my own theory, the apparent main flaw of which is now turned into a spectacular prediction! DARK MATTER Appears SLOWLY, as predicted by Patrice Ayme’s SUB-QUANTIC Model. (Wow!)

Patrice Ayme’