Archive for the ‘Foundations of Physics’ Category

Galaxy Without Dark Matter Found: Another Proof of New Physics?

March 31, 2018

ASTRONOMERS OGLE GALAXY DEVOID OF DARK MATTER!

The newfound object NGC 1052-DF2, a vast, diffuse galaxy, defies conventional explanations. It is to be feared (just kidding!) that various breakthroughs are in the offing, including in fundamental physics (if SQPR, a proposed new foundation for physics which could well be true, is indeed true).

The “ultra-diffuse” galaxy NGC1052-DF2, seen here in an image from the Hubble Space Telescope, is the same size as our Milky Way but contains just 1 percent as many stars. It also appears to be empty of Dark Matter. And therein lies a big problem for Conventional Wisdom:

Yes, that’s a galaxy… Looks dark, but without DM… Nothing the LCDM model saw coming… Is resistance to the New Physics Futile?
NGC1052-DF2 doesn’t look like a typical spiral or elliptical galaxy, but rather a loosely connected glob of star-pocked gas and dust. If it contained an amount of Dark Matter typical for a galaxy of its size, the Dark Matter’s gravity would hasten the motions of several star clusters that orbit it. Instead, van Dokkum’s team found those star clusters moving languidly around NGC 1052-DF2… That suggests Dark Matter can decouple not only from regular, visible matter, but from entire galaxies—a phenomenon LCDM cosmologists claimed couldn’t happen.

Large galaxies, radiant agglomerations of stars, are tied together by the gravitational pull of Dark Matter, a hidden material that is revealed and observed by its gravitational pull upon the shiny stars it seems to outmass by a factor of ten (we know this from the virial theorem, which basically says: v^2 ~ GM/R, where M is the global gravitational mass, v the (“dispersion”) speed, and R the radius where the speed is measured; so the higher the speed of the orbiting stars, clusters, or galaxies, at ever greater distances, the higher the global mass M).
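To make that concrete, here is a minimal numerical sketch of the virial relation (the value of G in astronomers' units is standard; the Milky Way numbers are round illustrative figures, and real mass estimators carry geometric factors of order one):

```python
# Order-of-magnitude dynamical mass from the virial relation v^2 ~ GM/R.
G = 4.301e-6  # Newton's constant in kpc * (km/s)^2 / Msun

def dynamical_mass(v_kms, r_kpc):
    """Enclosed gravitational mass (in solar masses) implied by speed v at radius r."""
    return v_kms**2 * r_kpc / G

# Milky Way ballpark: rotation speed ~220 km/s measured at ~8 kpc
m_mw = dynamical_mass(220.0, 8.0)
print(f"M(<8 kpc) ~ {m_mw:.1e} Msun")  # ~9e10 solar masses
```

The faster the tracers move at a given radius, the more total mass, dark or not, must lie within that radius; that is the whole logic behind “weighing” galaxies.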

Dark Matter is considered to be as defining a feature of galaxies as stars and gas… and is thought, in the reigning LCDM model, to provide the gravitational seeds from which galaxies assemble and grow (the top cosmologist Sean Carroll insisted on this point in correspondence with me). I strongly disagree with the latter point (in my model, Dark Matter is EMERGENT, a fruit of the Quantum Interaction).

A galaxy without Dark Matter—or without some bizarre, twisted deformation of gravity (such as MOND) that would mimic Dark Matter behavior, in some, only some, cases, and not in cases such as the Bullet Cluster —would contradict the religion of LCDM (Lambda Cold Dark Matter) and the sect of MOND, in other words, such a heretical galaxy would shred official thinking and its main alternative. Yet that is exactly what Yale University astronomer Pieter van Dokkum and his colleagues have found, they report in a study published Wednesday in Nature.

From the horse’s mouth:

A GALAXY LACKING DARK MATTER

(Pieter van Dokkum et al.)

Studies of galaxy surveys in the context of the cold dark matter paradigm have shown that the mass of the dark matter halo and the total stellar mass are coupled through a function that varies smoothly with mass. Their average ratio Mhalo/Mstars has a minimum of about 30 for galaxies with stellar masses near that of the Milky Way (approximately 5 × 10^10 solar masses) and increases both towards lower masses and towards higher masses… Here we report the radial velocities of ten luminous globular-cluster-like objects in the ultra-diffuse galaxy NGC1052–DF2, which has a stellar mass of approximately 2 × 10^8 solar masses. We infer that its velocity dispersion is less than 10.5 kilometres per second with 90 per cent confidence, and we determine from this that its total mass within a radius of 7.6 kiloparsecs is less than 3.4 × 10^8 solar masses. This implies that the ratio Mhalo/Mstars is of order unity (and consistent with zero), a factor of at least 400 lower than expected. NGC1052–DF2 demonstrates that dark matter is not always coupled with baryonic matter on galactic scales.

The twelve (!) authors from Yale, Harvard, Heidelberg, Santa Cruz, who used the giant Keck observatory in Hawai’i, don’t shrink from the exciting consequences:

Regardless of the formation history of NGC1052–DF2, its existence has implications for the dark matter paradigm. Our results demonstrate that dark matter is separable from galaxies, which is (under certain circumstances) expected if it is bound to baryons through nothing but gravity. The ‘bullet cluster’ demonstrates that dark matter does not always trace the bulk of the baryonic mass, which in clusters is in the form of gas. NGC1052–DF2 enables us to make the complementary point that dark matter does not always coincide with galaxies either: it is a distinct ‘substance’ that may or may not be present in a galaxy. Furthermore, and paradoxically, the existence of NGC1052–DF2 may falsify alternatives to dark matter. In theories such as modified Newtonian dynamics (MOND) and the recently proposed emergent gravity paradigm, a ‘dark matter’ signature should always be detected, as it is an unavoidable consequence of the presence of ordinary matter. In fact, it had been argued previously that the apparent absence of galaxies such as NGC1052–DF2 constituted a falsification of the standard cosmological model and offered evidence for modified gravity. For a MOND acceleration scale of a0 = 3.7 × 10^3 km^2 s^−2 kpc^−1, the expected [28] velocity dispersion of NGC1052–DF2 is σ_M ≈ (0.05 G Mstars a0)^(1/4) ≈ 20 km s^−1, where G is the gravitational constant—a factor of two higher than the 90% upper limit on the observed dispersion.
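The MOND dispersion quoted at the end of that passage is easy to reproduce numerically (a sketch; the standard value of G in kpc (km/s)^2 per solar mass is assumed, the other numbers are taken from the quote):

```python
# Reproduce the quoted MOND estimate: sigma_M ~ (0.05 * G * Mstars * a0)^(1/4)
G = 4.301e-6   # Newton's constant, kpc * (km/s)^2 / Msun
a0 = 3.7e3     # MOND acceleration scale, km^2 s^-2 kpc^-1
Mstars = 2e8   # stellar mass of NGC1052-DF2, Msun

sigma_mond = (0.05 * G * Mstars * a0) ** 0.25
print(f"sigma_M ~ {sigma_mond:.0f} km/s")  # ~20 km/s, twice the observed 10.5 km/s upper limit
```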

So exit MOND, Modified Newtonian Dynamics, once again! How many times do we need to kill that vampire? MOND is philosophically ugly, as it is an ad hoc theory, strictly engineered to explain a peculiar feature that is observed… Whereas my own theory, SQPR, was invented for reasons which have strictly to do with the foundations of Quantum Theory; Dark Matter, and moreover Dark Matter as it turns out to be, is just a particular consequence.

***

SQPR, Sub Quantum Patrice Reality Shines, Once Again:

In SQPR, Dark Matter is created by the Quantum Interaction, at particular cosmic distances from ordinary matter, and only then. The density of matter at cosmic distances needs to be just so, otherwise Dark Matter, Patrice’s way, will NOT decouple from normal matter. Instead the galaxy will not develop Dark Matter, just DELOCALIZED Matter.

So how did we get to the present situation, as found in NGC1052–DF2? Suppose the existence of an ultra diffuse gas, on a larger scale than the Milky Way, way back in time. Under its own gravity, the ultra diffuse gas will gather, and form stars. What is the difference with LCDM? In LCDM, Dark Matter is present to start with; it seeds and accelerates galaxy formation.

Whereas in my model, the universe being much older, perhaps 100 billion years old, there is no need for Dark Matter to seed galaxies: in complete contrast with LCDM, there is plenty of time for ultra diffuse gas to gather into ultra diffuse galaxies… So this is not just about Dark Matter: the way I see it, it's the entire vision of cosmology and the Quantum which is in question.

Patrice Aymé

***

Contextual Notes: 1) Only discovered in 2015, ultra-diffuse galaxies are thought to be cosmic laboratories for Dark Matter. Surely, astronomers thought, Dark Matter must provide the sorely needed mass to form these objects so devoid of normal stars. That thinking led van Dokkum and his colleagues to build the Dragonfly Telephoto Array, a telescope in New Mexico created for the express purpose of scrutinizing ultra-diffuse galaxies. The researchers initially used Dragonfly to study a different galaxy, which possesses an almost inconceivably gargantuan amount of Dark Matter, a “weird” result in and of itself. When van Dokkum and his team found NGC 1052-DF2, they expected to see something similar.

“Instead we saw the opposite, leading to this remarkable conclusion that there’s actually no room for dark matter at all in this thing,” van Dokkum says. “It’s not something we were looking for or expecting. At all. But you go in the directions the data takes you, even if it’s in contradiction to what you’ve found before.”

In Dragonfly images, NGC 1052-DF2 looked like a standard ultra-diffuse galaxy. But when the team compared them to a better image from the Sloan Digital Sky Survey, they found a surprising mismatch. What had seemed to be dim basic galactic structures in Dragonfly’s view appeared as point-like sources in the Sloan image. To resolve the discrepancy, the team scrutinized the galaxy with the Hubble Space Telescope, the W.M. Keck Observatory and the Gemini Observatory, the latter two on giant volcano Mauna Kea in Hawaii.

The point sources proved to be 10 globular clusters—compact and spherical groupings of stars orbiting the galaxy’s core. The researchers then set about measuring the movements of the clusters as a way to estimate the galaxy’s total mass. Simply put, the velocity at which clusters orbit a galaxy is related to the amount of matter—normal or dark—that a galaxy contains. Using information from the Keck telescopes, the team found the globular clusters were moving much more slowly than expected. And therefrom the tale above…
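The arithmetic behind “slower clusters means less mass” can be sketched as follows (same virial-style estimate as above; the published bound of 3.4 × 10^8 solar masses uses a more careful estimator, so agreement to a factor of order one is all that should be expected):

```python
# Crude mass bound for NGC 1052-DF2 from its globular clusters: M ~ sigma^2 * R / G
G = 4.301e-6    # Newton's constant, kpc * (km/s)^2 / Msun
sigma = 10.5    # km/s, 90% upper limit on the cluster velocity dispersion
R = 7.6         # kpc, radius enclosing the ten clusters

m_df2 = sigma**2 * R / G
print(f"M(<7.6 kpc) < ~{m_df2:.1e} Msun")  # ~2e8 Msun, comparable to the stellar mass alone
```

With a typical galaxy's Dark Matter halo, the clusters would have had to move several times faster; instead the dynamical mass barely exceeds the stellar mass.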

2) Without vastly modifying the age of the universe, as I boldly suggest, there are a few theories to explain how galaxies like NGC 1052-DF2 could come together without being seeded by Dark Matter (as LCDM necessarily has it). That would be a downer (for me!) but, in the interest of scientific fairness, let's mention them.

It could be that NGC 1052-DF2 was once a glob of gas perturbed by another unseen (?) galaxy nearby, sparking DF2 star formation. Or, van Dokkum speculates, perhaps this ultra-diffuse, dark-matter-free galaxy arose from two streams of gas that collided and compressed to form a scattering of stars. Another idea, first proposed more than two decades ago by Yale astronomer Priyamvada Natarajan, suggests galaxies like NGC 1052-DF2 form from galaxy-sized globs of gas clumping together in jets ejected by supermassive black holes in large galaxies' hearts. NGC1052-DF2 does reside in a region where such things could occur, as it lies near a giant elliptical galaxy (the largest type of galaxy) with a supermassive black hole at its heart.

Notice that, in any case, it looks bad for MOND… MOND has several variants, but basically says that, at the scale of 50,000 light years (say), gravity, as described by the French astronomer Ismaël Bullialdus (Ismaël Boulliau), a notion picked up by Hooke, Newton, etc., and amply confirmed since on the scale of the Solar System, is actually false. Thus the virial theorem (see above), at the scale of R = 50,000 light years, should be false. But above, everybody (not just me, but also the honorable professional astronomers) assumed it was true! Not just that, but the pull of gravity was observed to be just as needed. MOND assumes it's stronger! So MOND, in case there is indeed NO apparent Dark Matter in NGC 1052-DF2, predicts the existence of NEGATIVE mass (a reference to the movie Avatar, I presume?). Laughter, please!

In any case, time will tell… Paradigm shift, or overlooked subtleties? Big telescopes are coming soon to a desert near you…


Of God, Mice, And Men Who Believe They Created The Universe

February 8, 2018

When theists say that the universe exists because of God, they are saying that the universe exists because of some agent they know: that makes those theists vastly superior to us, simple miscreants, who do not happen to be acquainted with what, or who, created all and everything. Surely, those superior beings should lead us? So what sounds metaphysical, by asserting a “God”, boils down to claiming a higher place in an all too human hierarchy.

“Universe” means literally, “turned into one”, whereas “multiverse” would be: “turned into many”. So the set of all multiverses is the universe. (So the alleged existence of “multiverses” is akin to Bertrand Russell's famous paradox of the set whose elements are not elements of itself; Russell's paradox brought down mathematical logic as it had been known prior; present day physicists have been repeating that mistake, from lack of basic culture in the matter of mathematical logic!)

If we were to claim, and, or, even worse, have the feeling, that we know why the universe exists, we would be claiming, or have the impression, that we were God. This is not the business of physics, only the business of those who want us to be guided by absolutism.

Alexander the Great, seeing his blood flow, asked himself that question: am I a God? His Greek and Macedonian companions laughed him off. Later, on the advice of his mom, Olympia, Alexander ordered the old, most senior generalissimo Antipater, a companion of Alexander’s father, from Greece to Babylon. Antipater refused to obey. Antipater’s youngest son was Alexander’s page. Alexander found himself ceasing to be, before he could even organize his affairs.

We are both everything and nothing relative to the universe. The key to wisdom, is to keep a balance.

Man, playing God, touches man, playing Adam. All very touching, self-obsessing, self-gratifying, self-glorifying mental, self-stimulation, and self-mutilation.

The universe is, what it is. Science can describe it, not explain how it came to be. That is the proper mood that wisdom should embrace. Embracing the humility of reality, so we can unleash the power of truth.

Let theologians, dinosaurian conservatives, the Politically Correct and the Perfect Cretins, among others, try to learn this: We have to embrace the way things are, before we can hope to change what needs to be changed. And there is plenty of the latter. So stop claiming some human beings know why there is all there is. They don’t. They, and, or, their supporters just want everything you could possibly imagine, and then more.

Patrice Aymé

Note 1: the comment above was an answer to: “Why Is There Something, Rather Than Nothing?
Posted on February 8, 2018 by Sean Carroll
A good question!

Or is it?”

In it, Sean points out notions which I have exposed in the past, but are worth repeating, as many physicists, let alone philosophers and theologians, don't get them. First of all, Sean basically points out that the universe just is (as I said above, by definition of this neuronal activity!). And secondly Sean Carroll, a famous Caltech cosmologist, points out that all too many professional physicists don't even understand that physics, as presently understood, doesn't explain the universe! In other words, as I have said for decades, all too many physicists take themselves for God! (That is in the same meta category as Niels Bohr's famous retort to Albert Einstein: “Stop telling God what to do!“)

“The right question to ask isn’t “Why did this happen?”, but “Could this have happened in accordance with the laws of physics?” As far as the universe and our current knowledge of the laws of physics is concerned, the answer is a resounding “Yes.” The demand for something more — a reason why the universe exists at all — is a relic piece of metaphysical baggage we would be better off to discard.

This perspective gets pushback from two different sides. On the one hand we have theists, who believe that they can answer why the universe exists, and the answer is God. As we all know, this raises the question of why God exists; but aha, say the theists, that’s different, because God necessarily exists, unlike the universe which could plausibly have not. The problem with that is that nothing exists necessarily, so the move is pretty obviously a cheat. I didn’t have a lot of room in the paper to discuss this in detail (in what after all was meant as a contribution to a volume on the philosophy of physics, not the philosophy of religion), but the basic idea is there. Whether or not you want to invoke God, you will be left with certain features of reality that have to be explained by “and that’s just the way it is.” (Theism could possibly offer a better account of the nature of reality than naturalism — that’s a different question — but it doesn’t let you wiggle out of positing some brute facts about what exists.)

The other side are those scientists who think that modern physics explains why the universe exists. It doesn’t! One purported answer — “because Nothing is unstable” — was never even supposed to explain why the universe exists; it was suggested by Frank Wilczek as a way of explaining why there is more matter than antimatter. But any such line of reasoning has to start by assuming a certain set of laws of physics in the first place. Why is there even a universe that obeys those laws? This, I argue, is not a question to which science is ever going to provide a snappy and convincing answer. The right response is “that’s just the way things are.” It’s up to us as a species to cultivate the intellectual maturity to accept that some questions don’t have the kinds of answers that are designed to make us feel satisfied.”

Note 2: Swiss citizen Tariq Ramadan, the world's most famous Islamist propagandist, holder of two chairs (no less!) at Oxford University, and now in a French prison, was going around the world grievously beating and raping women. Why? Because, precisely, he wanted everything, and that included beating up handicapped women. Even now, as he sits in prison, he enjoys his power: immensely powerful organizations behind him, the sort who made him an Oxford Don, are threatening many more women, who also want to file complaints against Ramadan, but are afraid to do so. The human species is naturally metaphysical. Ramadan wanted to create a universe where he and his ilk could hurt and terrorize women at will. This is not any different from telling us that Muhammad flew to Jerusalem, the capital of Israel, on a winged horse: it is outrageous, but it creates a universe, and its cause (and in this case Islamists are the cause of said universe!)

DOING AWAY WITH INFINITY SOLVES MUCH MATH & PHYSICS

January 11, 2018

Particle physics: Fundamental physics is frustrating physicists: No GUTs, no glory, intones The Economist, January 11, 2018. Is this caused by a fundamental flaw in logic? That's what I have long suggested.

Says The Economist:“Persistence in the face of adversity is a virtue… physicists have been nothing if not persistent. Yet it is an uncomfortable fact that the relentless pursuit of ever bigger and better experiments in their field is driven as much by belief as by evidence. The core of this belief is that Nature’s rules should be mathematically elegant. So far, they have been, so it is not a belief without foundation. But the conviction that the truth must be mathematically elegant can easily lead to a false obverse: that what is mathematically elegant must be true. Hence the unwillingness to give up on GUTs and supersymmetry.”

Mathematical elegance? What is mathematics, already? What may be at fault is the logic brought to bear in present day theoretical physics. And I will say even more: all of today's logic may be at fault. It's not just physics which should tremble. The Economist gives a good description of the developing situation, arguably the greatest standstill in physics in four centuries:

“In the dark

GUTs are among several long-established theories that remain stubbornly unsupported by the big, costly experiments testing them. Supersymmetry, which posits that all known fundamental particles have a heavier supersymmetric partner, called a sparticle, is another creature of the seventies that remains in limbo. ADD, a relative newcomer (it is barely 20 years old), proposes the existence of extra dimensions beyond the familiar four: the three of space and the one of time. These other dimensions, if they exist, remain hidden from those searching for them.

Finally, theories that touch on the composition of dark matter (of which supersymmetry is one, but not the only one) have also suffered blows in the past few years. The existence of this mysterious stuff, which is thought to make up almost 85% of the matter in the universe, can be inferred from its gravitational effects on the motion of galaxies. Yet no experiment has glimpsed any of the menagerie of hypothetical particles physicists have speculated might compose it.

Despite the dearth of data, the answers that all these theories offer to some of the most vexing questions in physics are so elegant that they populate postgraduate textbooks. As Peter Woit of Columbia University observes, “Over time, these ideas became institutionalised. People stopped thinking of them as speculative.” That is understandable, for they appear to have great explanatory power.”

A lot of the theories found in theoretical physics “go to infinity”, and a lot of their properties depend upon infinity computations (for example “renormalization”). Also a lot of the problems which appear, and that, say, “supersymmetry” tries to “solve”, have to do with turning around infinite computations which go mad for all to see. For example, the plethora of virtual particles makes Quantum Field Theory miss reality by a factor of 10^120. Thus, curiously, Quantum Field Theory is both the most precise and the most false theory ever devised. Confronted with all this, physicists have tried to do what has worked in the past, like finding the keys below the same lighted lamp post, and counting the same angels on the same pinhead.
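That 10^120 figure can itself be sketched in a few lines (the naive zero-point energy density scales as the fourth power of the cutoff; with a Planck-scale cutoff and the observed dark-energy density of roughly 10^-47 GeV^4, one gets a mismatch of 10^120 to 10^123 depending on conventions):

```python
import math

# Naive QFT vacuum energy with a Planck-scale cutoff vs. observation.
E_planck = 1.22e19        # Planck energy, GeV
rho_naive = E_planck**4   # vacuum energy density ~ cutoff^4, GeV^4 (natural units)
rho_observed = 1e-47      # measured dark-energy density, ~1e-47 GeV^4

mismatch = math.log10(rho_naive / rho_observed)
print(f"mismatch ~ 10^{mismatch:.0f}")  # ~10^123 with this cutoff convention
```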

A radical way out presents itself. It is simple. And it is global, clearing out much of logic, mathematics and physics, of a dreadful madness which has seized those fields: INFINITY. Observe that infinity itself is not just a mathematical hypothesis, it is a mathematically impossible hypothesis: infinity is not an object. Infinity has been used as a device (for computations in mathematics). But what if that device is not an object, is not constructible?

Then lots of the problems theoretical physics tries to solve, a lot of these “infinities”, simply disappear.

Colliding Galaxies In the X Ray Spectrum (Spitzer Telescope, NASA). Very very very big is not infinity! We have no time for infinity!

The conventional way is to cancel particles with particles: “as a Higgs boson moves through space, it encounters “virtual” versions of Standard Model particles (like photons and electrons) that are constantly popping in and out of existence. According to the Standard Model, these interactions drive the mass of the Higgs up to improbable values. In supersymmetry, however, they are cancelled out by interactions with their sparticle equivalents.” Having a finite cut-off would do the same.

A related logic creates the difficulty with Dark Matter, in my opinion. Here is why. Usual Quantum Mechanics assumes the existence of infinity in the basic formalism of Quantum Mechanics. This brings the non-prediction of Dark Matter. Some physicists will scoff: infinity? In Quantum Mechanics? However, the Hilbert spaces which the Quantum Mechanical formalism uses are often infinite in extent. Crucial to the Quantum Mechanics formalism, but still extraneous to it, festers a ubiquitous instantaneous collapse (semantically partly erased as “decoherence” nowadays). “Instantaneous” is the inverse of “infinity” (in perverse infinity logic). If the latter has got to go, so does the former. As it is, Quantum Mechanics depends upon infinity. Removing the latter requires us to change the former.

Laplace did exactly this with gravity around 1800 CE. Laplace removed the infinity in gravitation, which had aggravated Isaac Newton, a century earlier. Laplace made gravity into a field theory, with gravitation propagating at finite speed, and thus predicted gravitational waves (relativized by Poincaré in 1905).

Thus, doing away with infinity makes GUTs' logic faulty, and predicts Dark Matter, and even Dark Energy, in one blow.

If one new hypothesis puts in a new light, and explains, much of physics in one blow, it has got to be right.

Besides doing away with infinity would clean out a lot of hopelessly all-too-sophisticated mathematics, which shouldn’t even exist, IMHO. By the way, computers don’t use infinity (as I said, infinity can’t be defined, let alone constructed).

Sometimes one has to let go of the past, drastically. Theories of infinity should go the way of those crystal balls theories which were supposed to explain the universe: silly stuff, collective madness.

Patrice Aymé

Notes: What do I mean by infinity not being constructible? There are two approaches to mathematics: 1) counting on one's digits, out of which comes all of arithmetic. If one counts on one's digits, one runs out of digits after a while, as any computer knows, and as I have made into a global objection, by observing that, de facto, there is a largest number (contrarily to what fake, yet time-honored, 25 centuries old “proofs” pretend to demonstrate; basically the “proof” assumes what it pretends to demonstrate, by claiming that, once one has “N”, there is always “N + 1”).
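Computers illustrate the point directly: every fixed-width machine representation has a largest number, past which “N + 1” simply fails (a sketch using Python's 64-bit floats and an emulated 64-bit integer):

```python
import sys

# Largest representable 64-bit float: going past it overflows to infinity.
biggest_float = sys.float_info.max
print(biggest_float)        # about 1.8e308
print(biggest_float * 2)    # inf

# Fixed-width integers have a largest value too; emulate 64-bit wrap-around.
mask = (1 << 64) - 1
n = (1 << 63) - 1           # largest signed 64-bit integer
print(((n + 1) & mask) == (1 << 63))  # True: adding 1 wraps past the maximum
```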

2) Set theory. Set theory is about sets. An example of “set” could be the set of all atoms in the universe. That may, or may not, be “infinite”. In any case, it is not “constructible”, not even available for extended consideration, precisely because it is so considerable (conventional Special Relativity, let alone basic practicality, prevents that; Axiomatic Set Theory à la Bertrand Russell has tried to turn around infinity with the notion of a proper class…)

In both 1) and 2), the infinite can't be considered, precisely because it doesn't finish.

Some will scoff that I am going back to Zeno's paradox, being baffled by what baffled Zeno. But I know Zeno, he is a friend of mine. My own theory explains Zeno's paradox. And, in any case, so does Cauchy's theory of limits (which depends upon infinity only superficially; even infinitesimal theory, aka non-standard analysis, from Leibniz plus Model Theory, survives my scathing refounding of all of logic, math, and physics).

By the way, this is all so true that mathematicians have developed still another notion, which makes logic, de facto, local, and spurns infinity, namely Category Theory. Category Theory is very practical, but also an implicit admission that mathematicians don't need infinity to make mathematics. Category Theory has now become fashionable in some corners of theoretical physics.

3) The famous mathematician Brouwer threw out some of the famous mathematical results he had himself established, on grounds somewhat similar to those evoked above, when he promoted “Intuitionism”. The latter field was started by Émile Borel and Henri Lebesgue (of the Lebesgue integral), two important French analysts, viewed as semi-intuitionists. They elaborated a constructive treatment of the continuum (the real line, R), leading to the definition of the Borel hierarchy. For Borel and Lebesgue, considering the set of all sets of real numbers is meaningless, and therefore has to be replaced by a hierarchy of subsets that do have a clear description. My own position is much more radical, and can be described as ultra-finitism: it does away even with so-called “potential infinity” (this is how I get rid of many infinities in physics, which truly are artefacts from mathematical infinity). I expect no sympathy: thousands of mathematicians live off infinity.

4) Let me help those who want to cling to infinity. I would propose two sorts of mathematical problems: a) those which can be solved when considered in Ultra Finite mathematics (“UF”); b) those which stay hard, not yet solved, even in UF mathematics.

DARK MATTER, Or How Inquiry Proceeds

September 7, 2016

How to find really new knowledge? How do you find really new science? Not by knowing the result: this is what we don’t have yet. Any really new science will not be deduced from pre-existing science. Any really new knowledge will come out of the blue. Poetical logic will help before linear logic does.

The case of Dark Matter is telling: this increasingly irritating elephant in the bathroom has been in evidence for 80 years, lumbering about. As the encumbering beast did not fit existing science, it was long religiously ignored by the faithful, as a subject not worthy of serious inquiry by very serious physicists. Now Dark Matter, five times more massive than Standard Model matter, is clearly sitting heavily outside of the Standard Model, threatening to crush it into irrelevance. Dark matter obscures the lofty pretense of known physics to explain everything (remember the grandly named TOE, the so-called “Theory Of Everything“? That was a fraud, snake oil, because mainstream physics celebrities crowed about TOE, while knowing perfectly well that Dark Matter dwarfed standard matter, and was completely outside of the Standard Model).

Physicists are presently looking for Dark Matter, knowing what they know, namely that nature has offered them a vast zoo of particles, many of them without rhyme or reason (some have rhyme, a symmetry, a mathematical group such as SU3 acting upon them; symmetries revealed new particles, sometimes). 

Bullet Cluster, 100 Million Years Old. Two Galaxies Colliding. The Dark Matter, In Blue, Is Physically Separated From the Hot, Standard Matter Gas, in Red.

[This sort of picture is most of what we presently have to guess what Dark Matter could be; the physical separation of DM and SM is most telling to me: it seems to indicate that SM and DM do not respond to the same forces, something that my Quantum theory predicts; it's known that Dark Matter causes gravitational lensing, as one would expect, since it was first found by its gravitational effects, in the 1930s…]

However, remember: a truly completely new piece of science cannot be deduced from a pre-existing paradigm. Thus, if Dark Matter were really about finding a new particle type, it would be interesting, but not as interesting as it would be if it turned out not to be a new particle type after all, but the consequence of a completely new law of physics.

This is the quandary about finding truly completely new science. It can never be deduced from ruling paradigms, and may actually overthrow them. What should then be the method to use? Can Descartes and Sherlock Holmes help? The paradigm presented by Quantum Physics helps. The Quantum looks everywhere in space to find solutions: this is where its (“weird”) nonlocality comes in. Nonlocality is crucial for interference patterns and for finding lowest energy solutions, as in the chlorophyll molecule. This suggests that our minds should go nonlocal too, and we should look outside of a more extensive particle zoo to find what Dark Matter is.

In general, searching for new science should be by looking everywhere, not hesitating to possibly contradict what is more traditional than well established.

An obvious possibility is, precisely, that Quantum Physics is itself incomplete, and generating Dark Matter in places where said incompleteness would be most blatant. More precisely, Quantum processes, stretched over cosmic distances, instead of being perfectly efficient and nonlocal over gigantically cosmic locales, could leave a Quantum mass-energy residue, precisely in the places where extravagant cosmic stretching of Quanta occurs (before “collapse”, aka “decoherence”).

The longer one fails to find a conventional explanation (namely a new type of particle) for Dark Matter, the more likely my style of explanation becomes. How could one demonstrate it? Not by looking for new particles, but by conducting new and more refined experiments in the foundations of Quantum Physics.

If this guess is correct, whatever is found askew in the axioms of present Quantum Physics could actually help future Quantum Computer technology (because the latter works with Quantum foundations directly, whereas conventional high energy physics tends to eschew the wave aspects, due to the high frequencies involved).

Going on a tangent is what happens when the central, attractive force is let go. A direct effect of freedom. Free thinking is tangential. We have to learn to produce tangential thinking.

René Descartes tried to doubt the truth of all his beliefs to determine which he could be certain were true. However, at the end of the “Meditations” he hastily concluded that we can distinguish dream from reality. It is not that simple. The logic found in dreams is all too similar to the logic used by full-grown individuals in society.

Proof? Back to Quantum Physics. On the face of it, the axioms of Quantum Physics have a dream-like quality (there is no “here”, nor “there”; “now” is everywhere; and, mysteriously, the experiment is Quantum, whereas the “apparatus” is “classical”). Still, most physicists, after insinuating they have figured out the universe, eschew the subject carefully. The specialists of Foundations are thoroughly confused: see Sean Carroll, http://www.preposterousuniverse.com/blog/2013/01/17/the-most-embarrassing-graph-in-modern-physics/

However unbelievable Quantum Physics may be, however dream-like, physicists believe in it, and don’t question it any more than cardinals would Jesus. Actually, it is this dream-like nature which, shared by all, defines the community of physicists. Cartesian doubt, pushed further than Descartes pushed it, will question not just the facts and the allegations, but the logic itself. And even the mood behind it.

Certainly, in the case of Dark Matter, some of the questions civilization has to ask should be:

  1. How sure are we of the Foundations of Quantum Physics? (Answer: very sure, all too sure!)
  2. Could it not be that Dark Matter is a cosmic-size experiment in the Foundations of Quantum Physics?

Physics, properly done, does not just question the nature of nature. Physics, properly done, questions the nature of how we find out the nature of anything. Physics, properly done, even questions the nature of why we feel the way we do. And the way we did. About anything, even poetry. In the end, indeed, even the toughest logic is a form of poetry, hanging out there, justified by its own beauty, and nothing else. Don’t underestimate moods: they call what beauty is.

Patrice Ayme’

Astronomy Domine

April 6, 2016

Astronomy domine is a song much played in philosophy, not just by Pink Floyd, ever since there have been men, and they have observed. (Homo Erectus probably observed the last fabulous Galactic Core Eruption, two million years ago.)

Before feeding the pocketbooks of the greedy, science feeds the imagination of poets.

Astronomy has been at the forefront of physics, at least since Buridan (14th Century). Buridan applied his notion of impetus to explain that planets went around in circles from what we now call inertia. In Greek Antiquity, a large, wagon sized meteorite landed in Northern Greece, and was visited for centuries (it may have been a piece of Halley’s comet, which whizzed by spectacularly close in 466 BCE).

A Place Of Great Eruptions, Past & Future. Eta Carinae Nebula, At Least A Couple of Giant Stars, The Lightest One At Least 30 Sun Masses, the Largest Maybe As Much As 220 Solar Masses, 7,500 Light Years Away. Five Million Times The Luminosity Of the Sun. Stellar Natures & Explosions Are Far From Fully Understood!

Supernova explosions are awesome: the most luminous one ever detected had a peak luminosity 570 BILLION times that of the Sun (yes, 570 × 10^9 Suns; it was seen in 2015).

Supernovae are us. Supernovae create most of chemistry: the extremely high temperatures of their explosions enable light nuclei to smash into each other, and fuse, making most elements of the periodic table.

There are two main types of stars which explode as supernovae: white dwarfs and massive giant stars. In the so-called Type Ia supernovae, gases falling onto a white dwarf raise its mass until it nears a critical level, the Chandrasekhar limit, resulting in an explosion when the mass approaches 1.44 Solar Masses. In Type Ib/c and Type II supernovae, the progenitor star is a massive star which runs out of fuel to power its nuclear fusion reactions and collapses in on itself, reaching astounding temperatures as it implodes, and then explodes.
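The 1.44 Solar Mass figure is not arbitrary: the Chandrasekhar mass can be estimated from fundamental constants alone. A rough numerical check (using the standard n = 3 polytrope coefficient, and a mean molecular weight per electron of 2, as for a carbon-oxygen dwarf):

```python
import math

# Physical constants (SI units)
hbar = 1.054571817e-34   # reduced Planck constant, J s
c = 2.99792458e8         # speed of light, m/s
G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2
m_H = 1.67262192e-27     # hydrogen (proton) mass, kg
M_sun = 1.98892e30       # solar mass, kg

def chandrasekhar_mass(mu_e=2.0):
    """Chandrasekhar mass for mean molecular weight per electron mu_e:
    M_Ch = (omega * sqrt(3*pi) / 2) * (hbar*c/G)**1.5 / (mu_e * m_H)**2,
    with omega ~ 2.018 from the Lane-Emden n=3 polytrope solution."""
    omega = 2.01824
    return (omega * math.sqrt(3 * math.pi) / 2) * (hbar * c / G) ** 1.5 / (mu_e * m_H) ** 2

m_ch = chandrasekhar_mass()
print(f"M_Ch ≈ {m_ch / M_sun:.2f} solar masses")  # close to the 1.44 quoted above
```

Note that white dwarfs with larger μ_e (heavier composition) would have a smaller critical mass, since M_Ch scales as 1/μ_e².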

Supernova science is very far from finished knowledge. Even the nature of the Crab Nebula supernova, which was seen to explode in 1054 CE, is not clear (it is known it was a big star, more than 8 Solar Masses; it left a pulsar).

Even the Crab was philosophically interesting in devious ways: the explosion was duly recorded by Europeans and Chinese. However the Muslims tried very hard not to see it (a mention was recently found). Indeed, the heavens, for desert savages, are supposed to be messages from God, and God playing games with stars was apparently not kosher…

Type Ia supernovae have completely changed our idea of the universe in the last two decades. (According to your modest servant, other types of supernovae may change our view of the universe even more dramatically. See the conclusion!)

Eta Carinae is the only star known to produce ultraviolet laser emission!

There is some philosophy to be extracted from Eta Carinae: if a star, or a system of gravitationally bound stars, can be that exotic, how sure are we of the astrophysics we think we know?

I am not the only one who thought of this. The teams which determined the accelerating expansion of the universe (“Dark Energy”) had to exclude weird, sort-of Type Ia supernovae from their statistics (pre-selecting the population of explosions to which they would apply statistics…). There are now other ways to detect Dark Energy (and they give the same results as the pre-selected Type Ia supernova studies). So the results have been confirmed.

However my position is more subtle, and general. How sure are we of the astrophysics we have, to the point that we can claim that stars are unable to create all the known elements? In the proportion observed?

I am no specialist of astrophysics. But, as a philosopher, I have seen the science evolve considerably, so I think we cannot be sure that we absolutely need the hellish temperatures of the Big Bang to generate all observed elements.

Very large stars (600 Solar Masses) have now been observed. They don’t live very long. I don’t see why stars of thousands of Solar Masses, living only a few hundred years before exploding, are not possible. During these so-far-unconceived apocalypses, nucleosynthesis could well follow unexpected paths.

And that could well remove one of the main arguments for the Big Bang.

Patrice Ayme’

QUANTUM FLUCTUATIONS & ARROW OF TIME

January 18, 2016

What is time? Quantum Physics gives an answer; classical physics does not. Quantum Physics suggests that time is the set of all irreversible processes. This is a world first, so it requires some explanations. I have been thinking, hard, about these things all my life. Sean Carroll, bless his soul, called my attention to the new development that mainstream physicists are starting to pay attention to my little kingdom (so I thank him).

***

SCIENCE IS WHAT WE DO:

Sean Carroll in “Quantum Fluctuations”:

“Let’s conjure some science up in here. Science is good for the soul.”

Patrice Ayme’: Why is science good for the soul? Because the human soul is centered on finding truth. Science is truth, thus science is human. Nothing is more human than science. Science is what humans do. Another thing humans do is art, and it tries to duplicate, distort, and invent new nature, or interpretations, interpolations, and suggestions of, and from, nature:

Claim: Quantum Interference Is An Irreversible Process, Time’s Arrows All Over. Quantum Interference Goes From Several Waves, To One Geometry. Soap Bubbles Brim With Quantum Interference.

SC: …what are “quantum fluctuations,” anyway? Talk about quantum fluctuations can be vague. There are really 3 different types of fluctuations: Boltzmann, Vacuum, & Measurement. Boltzmann Fluctuations are basically classical: random motions of things lead to unlikely events, even in equilibrium.

Patrice Ayme’: As we will see, or we have already seen in my own “Quantum Wave”, Quantum Fluctuations are just the Quantum Waves. Richard Feynman, at the end of his chapter on entropy in the Feynman Lectures on Physics, ponders how to get an arrow of time in a universe governed by time-symmetric underlying laws. Feynman:

“So far as we know, all the fundamental laws of physics, such as Newton’s equations, are reversible. Then where does irreversibility come from? It comes from order going to disorder, but we do not understand this until we know the origin of the order. Why is it that the situations we find ourselves in every day are always out of equilibrium?”

Patrice Ayme’: Is that really true? Are equations time-symmetric? Not really. First, equations don’t stand alone. Differential equations depend upon initial conditions. Obviously, even if the equations are time-symmetric, the initial conditions are not: the final state cannot be exchanged with the initial state.

Quantum Physics makes this observation even more important. The generic Quantum set-up depends upon a geometric space S in which the equation(s) of motion will evolve. Take for example the 2-slit: the space one generally considers, S, is the space AFTER the 2-slit. The one before the 2-slit, C (for coherence), is generally ignored. S is ordered by Quantum interference.

The full situation is made of C, S, and Quantum interference. It is not symmetric. The Quantum depends upon the space (it could be a so-called “phase space”) in which it deploys. That makes it time-asymmetric. An example: the Casimir Effect.
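The ordering produced by interference in the space S after the 2-slit can be made concrete by adding two waves; a toy sketch in arbitrary units (the slit spacing and wavelength are made up for illustration):

```python
import math

wavelength = 1.0    # arbitrary units
d = 5.0             # slit separation (hypothetical)
k = 2 * math.pi / wavelength

thetas = [-0.3, -0.2, -0.1, 0.0, 0.1, 0.2, 0.3]  # screen angles, radians
intensities = []
for theta in thetas:
    # path difference between the two slits seen at angle theta
    phase = k * d * math.sin(theta)
    # |e^{i*0} + e^{i*phase}|^2 = 2 + 2*cos(phase)
    intensity = 2 + 2 * math.cos(phase)
    intensities.append(intensity)
    print(f"theta = {theta:+.2f} rad  intensity = {intensity:.2f}")
# Central maximum at theta = 0: both waves in phase, intensity = 4
```

The sum of the two waves imposes a fixed fringe geometry on S: that is the sense in which interference orders the space, rather than letting it fill up haphazardly.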

***

QUANTUM PHYSICS IS ABOUT WAVES:

Sean Carroll: “Nothing actually “fluctuates” in vacuum fluctuations! The system can be perfectly static. Just that quantum states are more spread out.”

Indeed. Quantum states are, intrinsically, more spread out. They are NON-LOCAL. Why?

One has to go back to the basics. What is Quantum Physics about? Some, mostly the “Copenhagen Interpretation” followers, claim Quantum Physics is a subset of functional analysis. (The famous mathematician Von Neumann, one of the creators of Functional Analysis, was the founder of this system of thought; this scion of plutocrats, famously, yet satanically, claimed that De Broglie and Bohmian mechanics were impossible… Von Neumann had made a logical mistake; maybe that had to do with being involved with the satanic part of the American establishment, as, by then, that Hungarian had migrated to the USA and wanted to be called “Johnny”!).

The Quantum-as-functional-analysis school became dominant. It had great successes in the past. It allows one to view Quantum Physics as “Non-Commutative Geometry”. However, contrary to its repute, it is not the most fundamental view. (I have my own approach, which eschews Functional Analysis.)

But let’s backtrack. Where does Quantum-as-functional-analysis come from? A Quantum system is made of a (“configuration”) space S and an equation E (which is a Partial Differential Equation). Out of S and E is created a Hilbert Space with a basis, the “eigenstates”.

In practice, the eigenstates are fundamental waves. They can be clearly seen, with the mind’s eye, in the case of the Casimir Effect with two metallic plates: there is a maximal size for the electromagnetic wavelengths between the plates (as they have to zero out where they touch the metal).
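That mode structure between the plates can be written down directly: waves must vanish at the plates, so only half-integer multiples of the wavelength fit. A minimal sketch (the plate separation is a made-up illustrative value):

```python
# Standing electromagnetic waves between two plates a distance d apart
# must have nodes at both plates: d = n * (lambda / 2), n = 1, 2, 3, ...
d = 1e-6  # plate separation in metres (hypothetical, ~1 micron)

def allowed_wavelengths(d, n_max=5):
    """Wavelengths of the first n_max standing-wave modes: lambda_n = 2d/n."""
    return [2 * d / n for n in range(1, n_max + 1)]

for n, lam in enumerate(allowed_wavelengths(d), start=1):
    print(f"mode n={n}: lambda = {lam:.2e} m")
# The longest allowed wavelength is 2d: anything longer is excluded.
# That cutoff is the "maximal size" the text refers to.
```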

The notion of wave is more general than the notion of eigenstate (Dirac pushed the notion of wave so far, and so successfully, that it created a space, Spinor Space; and Quantum Field Theory has done more of the same, extending the general mood of De Broglie-Dirac to ever fancier Lagrangians, energy expressions guiding the waves according to De Broglie’s scheme).

Historically, De Broglie suggested in 1923 (in several publications to the French Academy of Science) that to each particle was associated a (relativistic) wave. De Broglie’s reasons were looked at by Einstein, who was impressed (few, aside from Einstein, could understand what De Broglie said; actually De Broglie’s French thesis jury, which included two Nobel Prize winners, was so baffled by the thesis that it sent it to Einstein, to ask him what he thought. Einstein replied with the greatest compliment he ever made to anyone: “De Broglie has started to lift the great veil,” etc.).

De Broglie’s wave appears on page 111 of his 1924 thesis, which has 118 pages (and contains, among other things, the Schrödinger wave equation and, of course, the uncertainty principle). The latter is obvious in this picture: De Broglie said all particles were guided by waves whose wavelengths depended upon their (relativistic) energy. An uncertainty automatically appears when one tries to localize a particle (that is, a wave) with another particle (that is, another wave!).
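De Broglie’s relation, wavelength = h/p, can be evaluated directly; a minimal sketch for a non-relativistic electron (the chosen speed is illustrative):

```python
import math

h = 6.62607015e-34       # Planck constant, J s
m_e = 9.1093837015e-31   # electron mass, kg

def de_broglie_wavelength(mass, speed):
    """Non-relativistic de Broglie wavelength: lambda = h / (m v)."""
    return h / (mass * speed)

# Electron at 1% of light speed (slow enough to ignore relativity)
v = 0.01 * 2.99792458e8
lam = de_broglie_wavelength(m_e, v)
print(f"lambda ≈ {lam:.3e} m")  # a fraction of a nanometre: atomic scale
```

That the result lands at atomic scale is why electron waves explain atomic structure, as the text notes below for Bohr’s orbits.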

***

CLASSICAL PHYSICS HAS NO ARROW OF TIME:

Consider an empty space S. If the space S is made available to (classical) Boltzmann particles, S is progressively invaded by (classical) particles occupying ever more states.
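This progressive invasion can be simulated in a few lines: start all the classical particles in one cell of a coarse-grained S and let them hop at random; the Shannon entropy of the occupation numbers climbs from zero toward its maximum. A toy sketch:

```python
import math
import random

random.seed(0)

N_CELLS = 10        # coarse-grained cells of the space S
N_PARTICLES = 500
STEPS = 200

# All particles start in cell 0: perfectly ordered, entropy zero
positions = [0] * N_PARTICLES

def entropy(positions):
    """Shannon entropy of the coarse-grained occupation distribution."""
    counts = [positions.count(c) for c in range(N_CELLS)]
    probs = [c / len(positions) for c in counts if c > 0]
    return -sum(p * math.log(p) for p in probs)

s0 = entropy(positions)
for _ in range(STEPS):
    for i in range(N_PARTICLES):
        # each particle hops left or right (a reversible microscopic rule)
        positions[i] = min(N_CELLS - 1, max(0, positions[i] + random.choice((-1, 1))))
s1 = entropy(positions)
print(f"entropy: initial {s0:.3f} -> final {s1:.3f}")  # rises toward log(10) ~ 2.30
```

The microscopic hopping rule is time-symmetric, yet the coarse-grained entropy only goes up: exactly the tension the Second Law discussion below is about.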

Classical physicists (Boltzmann, etc.) postulated the Second Law of Thermodynamics: a quantity called entropy increases during any process. Problem, rather drastic: all classical laws of physics are reversible! So, how can reversible physics generate a time-irreversible law? Classical physicists have found no answer. But I did, knight in shining armor, mounted on my powerful Quantum Monster:

***

QUANTUM PROCESSES CREATE IRREVERSIBLE GEOMETRIES:

When the same space S is made available as part of a Quantum System, the situation is strikingly different. As Sean Carroll points out, the situation is immediately static; it provides an order (as Bohm insisted it did). The observation is not new: the De Broglie waves provided an immediate explanation of the stability of electronic waves around atoms (thus supporting Bohr’s “First, or Semi-Classical, Quantum Theory”).

What is the difference between a Quantum System and a classical system? The classical system evolves from a given order to one more disordered. The Quantum system does not evolve through increasing disorder. Instead, the space S, once accessed, becomes not so much an initial condition as a global order.

The afore-mentioned Hilbert Space with its eigenstates is that implicit, or implicate (Bohm), order. So the Quantum System is static in an important sense (made of standing Quantum Waves, it sort of vibrates through time).

Thus Quantum Systems have an intrinsic time-asymmetry (at least when dealing with cavities). When there are no cavities, entanglement causes the asymmetry: once an interaction has happened, until observation, there is entanglement. Before the interaction, there was no entanglement. Two classical billiard balls are not entangled either before or after they interact, so the interaction by collision is fully time-reversible.

Entanglement is also something waves exhibit once they have interacted, and not before; classical particles are deprived of it.

Once more we see the power of the Quantum mindset for explaining the world in a much more correct, much simpler, and thus much more powerful way. The Quantum even decides what time is.

So far as we know, all the classical fundamental laws of physics, such as Newton’s equations, are reversible. Then where does irreversibility come from? It does NOT come, as was previously suggested, from order going to disorder.

Quite the opposite: irreversibility comes from disorder (several waves) going to order (one wave, ordered by its surrounding geometry). And we do understand the origin of the order: it is the implicit order of Quantum Waves deployed.

You want to know the world? Let me introduce you to the Quantum, a concept of wealth, taste and intelligence.

Last and not least: if I am right, the Quantum brings the spontaneous apparition of order, the exact opposite of the picture which has constituted the manger in which the great cows of physics have found their sustenance. Hence the fact that life, and many other complicated naturally occurring physical systems, are observed to create order in the universe is not so baffling anymore. Yes, they violate the Second Law of Thermodynamics. However, fundamentally, it is that law which violated the spirit, the principle, of the universe: the Quantum itself.

Patrice Ayme’

BEING FROM DOING: EFFECTIVE ONTOLOGY, Brain & Consciousness

December 29, 2015

Thesis: Quantum Waves themselves are (part of) what information is made of. Consciousness, being Quantum, shows up as information. Reciprocally, information gets translated by the Quantum, and then builds the brain, then the mind, thus consciousness. So the brain is a machine debating with the Quantum. Let me explain a bit, while expounding along the way the theory of the General Relativity of Ontological Effectiveness, “GROE”:

***

What is the relationship between the brain and consciousness? Some will point out we have to define our terms: what is the brain, what is consciousness? We can roll out an effective definition of the brain (it’s where most neurons are). But consciousness eludes definition.

Still, that does not mean we cannot say more. And, from saying more, we will define more.

Relationships between definitions, axioms, logic and knowledge are a matter of theory:

Take Euclid: he starts with points. What is a point? Euclid does not say; he does not know; he has to start somewhere. However, where exactly that somewhere is may itself be full of untoward consequences (in the 1960s, mathematicians working in Algebraic Geometry found that points caused problems; they have caused problems in Set Theory too; vast efforts were directed at, and around, points). Effectiveness defines. Consider this:

Effective Ontology: I Compute, Therefore That's What I Am

Schematic of a nanoparticle network (about 200 nanometres in diameter). By applying electrical signals at the electrodes (yellow), and using artificial evolution, this disordered network can be configured into useful electronic circuits.

Read more at: http://phys.org/news/2015-09-electronic-circuits-artificial-evolution.html#jCp

All right, more on my General Relativity of Ontological Effectiveness:

Modern physics talks of the electron. What is it? Well, strictly speaking, we don’t know. But, fuzzy as our thinking is, we do have a theory of the electron, and it is so precise, it can be put in equations. So it is the theory of the electron which defines the electron. As the former could, and did, vary, so did the latter (at some point the physicist Wheeler and his student Feynman suggested the entire universe was peopled by just one electron going back and forth in time).

Hence the important notion: concepts are defined by EFFECTIVE THEORIES OF THEIR INTERACTION with other concepts (General Relativity of Ontological Effectiveness: GROE).

***

NATURALLY Occurring Patterns Of Matter Can Recognize Patterns, Make Logic:

Random assemblies of gold nanoparticles can perform sophisticated calculations. Thus Nature can start computing, all by itself. There is no need for the carefully arranged patterns of silicon.

Classical computers rely on ordered circuits where electric charges follow preprogrammed rules, but this strategy limits how efficient they can be. Plans have to be made in advance, yet the possibilities multiply so fast that the human brain cannot envision them all. The alternative is to do as evolution itself does when it creates intelligence: select the fittest. In this case, the fittest electronic circuits.

(Selection of the fittest was well-known to the Ancient Greeks, 25 centuries ago, 10 centuries before the Christian superstition. The Ancient Greeks used artificial and natural selection explicitly to create new breeds of domestic animals. However, Anglo-Saxons prefer to name things after themselves, so they can feel they exist; thus selection of the fittest is known by Anglo-Saxons as “Darwinian”. Hence soon we will hear about “Darwinian electronics”, for sure!)
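A caricature of such selection of the fittest electronic circuits fits in a few lines: encode a candidate configuration as a bitstring, score it against a desired behavior, and keep the best mutants. The target pattern below is hypothetical, a stand-in for a real measured circuit response:

```python
import random

random.seed(1)

TARGET = [0, 1, 1, 0, 1, 0, 0, 1]  # desired output pattern (hypothetical)

def fitness(genome):
    """Number of output bits matching the target behavior."""
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.2):
    """Flip each configuration bit with probability `rate`."""
    return [1 - g if random.random() < rate else g for g in genome]

# Start from a random "circuit configuration"
best = [random.randint(0, 1) for _ in TARGET]
for generation in range(100):
    # Produce mutants and keep the fittest (a simple elitist strategy)
    candidates = [best] + [mutate(best) for _ in range(4)]
    best = max(candidates, key=fitness)
    if fitness(best) == len(TARGET):
        break

print(f"generations used: {generation}, fitness: {fitness(best)}/{len(TARGET)}")
```

No plan of the final configuration is ever drawn; only the selection pressure is specified, which is the point of the nanoparticle-network approach described here.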

“The best microprocessors you can buy in a store now can do 10 to the power 11 (10^11; one hundred billion) operations per second and use a few hundred watts,” says Wilfred van der Wiel of the University of Twente in the Netherlands, a leader of the gold circuitry effort. “The human brain can do orders of magnitude more and uses only 10 to 20 watts. That’s a huge gap in efficiency.”

To close the gap, one goes back to basics. The first electronic computers, in the 1940s, tried to mimic what were thought at the time to be brain operations. So the European Union and the USA are trying more of the same, developing “brain-like” computers that do computations naturally, without their innards having been specifically laid out for the purpose. For a few years, the candidate material found able to reliably perform real calculations has been gold.

Van der Wiel and colleagues have observed that clumps of gold grains handle bits of information (=electric charge) in the same way that existing microprocessors do.

Clumps of computing grains operate as a unit, in parallel, much as neurons seem to do in the brain. This should improve pattern recognition. A pattern, after all, is characterized by dimension higher than one, and so is a clump operating together. A mask to recognize a mask.

Patterns are everywhere; logics themselves are patterns.

***

WE ARE WHAT WE DO:

So what am I saying, philosophically? I am proposing a (new) foundation for ontology which makes explicit what scientists and prehistoric men have been doing all along. 

The theory of the nature of being is ontology, the “Logic of Being”. Many philosophers, or pseudo-philosophers, have wrapped themselves up in knots about what “Being” is. (For example, Heidegger, trained as a Catholic seminarian, who later blossomed as a fanatical professional Nazi, wrote a famous book called “Sein und Zeit”, Being and Time. Heidegger tries at some point to obscurely mumble feelings not far removed from some explicit notions in the present essay.)

Things are defined by what they do. And they do what they do in relation with other things.

Where does it stop? Well, it does not. What we have done is define being by effectiveness. This is what mathematicians have been doing all along. Defining things by how they work produces things, and theories, which work. The obvious example is mathematics: it may be a castle in the sky, but this castle is bristling with guns, and its cannon balls are exquisitely precise, thanks to the science of ballistics, a mathematical creation.

Things are what they do. Fundamental things do few things, sophisticated things do many things, and thus have many ways of being.

Some will say: ‘all right, you have presented an offering to the gods of wisdom, so now can we get back to the practical, such as the problems Europe faces?’

Be reassured, creatures of little faith: Effective Ontology is very practical. First of all, it is what all of physics and mathematics, and actually all of science, rest on (and it defines them beyond Karl Popper’s feeble attempt).

Moreover, watch Europe. Some, including learned, yet nearly hysterical, commenters who have graced this site, are desperately yelling to be spared from a “Federal Europe”, the dreaded “European Superstate”. The theory of Effective Ontology focuses on the essence of Europe. According to Effective Ontology, Europe is what it does.

And what does Europe do? Treaties. A treaty, in Latin, is “foedus”. Its genitive is foederis, and it gives foederatus, hence the French fédéral and, from there, 150 years later in the USA, “federal”. Europe makes treaties (with the Swiss (Con)federation alone, the European Union has more than 600 treaties). Thus Europe IS a Federal State.

Effective Ontology has been the driver of Relativity, Quantum Physics, and Quantum Field Theory. And this is precisely why those theories have made so many uncomfortable.

Patrice Ayme’

Why Mathematics Is Natural

April 21, 2015

There is nothing obvious about the mathematics we know. It is basically neurology we learn, that is, learn to construct (with a lot of difficulty). Neurology is all about connecting facts, things, ideas, emotions together. We cannot possibly imagine another universe where mathematics is not as given to us, because our neurology is an integral part of the universe we belong to.

Let’s consider the physics and mathematics which evolved around the gravitational law. How did the law arise? It was a cultural, thus neurological, process. More striking, it was a historical process. It took many centuries. On the way, century after century, a colossal amount of mathematics was invented, from graphs, to forces (vectors), trajectories, equations, “Cartesian” geometry, long before Galileo, Descartes, and their successors were born.

Buridan, around 1330 CE, to justify the diurnal rotation of Earth, said we stayed on the ground, because of gravity. Buridan also wrote that “gravity continually accelerates a heavy body to the end” [In his “Questions on Aristotle”]. Buridan asserted a number of propositions, including some which are equivalent to Newton’s first two laws.

Because, Albert, Your Brain Was Just A Concentrate Of Experiences & Connections Thereof, Real, Or Imagined. "Human Thought Independent of Experience" Does Not Exist.

At some point someone suggested that gravity kept the heliocentric system together.

Newton claimed it was himself, with his thought experiment of the apple. However it is certainly not so: Kepler believed gravity varied as 1/d. The French astronomer Bullialdus (Ismaël Boulliau) then explained why Kepler was wrong, and why gravity should vary as the inverse of the square of the distance, not just the inverse of the distance. So gravity went as 1/dd. (Bullialdus was elected to the Royal Society of London before Newton’s birth; Hooke picked up the idea, then Newton; then those two had a nasty fight, and Newton recognized Bullialdus was first; Bullialdus now has a crater on the Moon named after him, a reduced version of the Copernicus crater.)

In spite of considerable mental confusion, Leonardo finally demonstrated correct laws of motion on an inclined plane. Those Da Vinci laws, more important than his paintings, are now attributed to Galileo (who rolled them out a century later).

It took 350 years of the efforts of the Paris-Oxford school of mathematics, and students of Buridan, luminaries such as Albert of Saxony and Oresme, and Leonardo Da Vinci, to arrive at an enormous arsenal of mathematics and physics entangled…

This effort is generally mostly attributed to Galileo and Newton (who neither “invented” nor “discovered” any of it!). Newton demonstrated that the laws discovered by Kepler implied that gravity varied as 1/dd (Newton’s reasoning, using a still newer level of mathematics, Fermat’s calculus geometrically interpreted, was different from Bullialdus’).
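The Kepler-to-1/dd step can be checked with two lines of arithmetic: for circular orbits, an inverse-square force implies T² proportional to d³, so T²/d³ must be the same for every planet. A quick check with well-known orbital data for Earth and Mars:

```python
# Kepler's third law check: T^2 / a^3 should be constant across planets.
# (T in years, a in astronomical units; standard orbital data.)
planets = {
    "Earth": (1.000, 1.000),   # (a [AU], T [yr])
    "Mars":  (1.524, 1.881),
}

for name, (a, T) in planets.items():
    print(f"{name}: T^2/a^3 = {T**2 / a**3:.3f}")
# Both ratios come out ~1.000, exactly what an inverse-square force predicts:
# m*v^2/d = G*M*m/d^2 with v = 2*pi*d/T gives T^2 = (4*pi^2 / G*M) * d^3.
```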

Major discoveries in mathematics and physics take centuries to be accepted, because they are, basically, neurological processes. Processes which are culturally transmitted, but, still, fundamentally neurological.

Atiyah, one of the greatest living mathematicians, hinted at this recently about Spinors. Spinors, discovered, or invented, a century ago by Élie Cartan, are not yet fully understood, said Atiyah (Dirac used them for physics 20 years after Cartan discerned them). Atiyah gave an example I have long used: Imaginary Numbers. It took more than three centuries for imaginary numbers (which were used for the resolution of the Third Degree equation) to be accepted. Neurologically accepted.

So there is nothing obvious about mathematics and physics: they are basically neurology we learn through a cultural (or experimental) process. What is learning? Making a neurology that makes the input we know correspond to the output we observe. It is a construction project.

Now where does neurology sit, so to speak? In the physical world. Hence mathematics is neurology, and neurology is physics. Physics in its original sense, nature, something not yet discovered.

We cannot possibly imagine another universe where mathematics is not as given to us, because the neurology it consists of forms an integral part of the universe we belong to.

Patrice Ayme’

Flat Universe Flattens Twisted Logic

April 11, 2015

The observed universe is flat. I will explain what it means in practice, before going into a bit of theory. Including a sickle move through the lamentable precedent of the heliocentric system.

Basically, when we look at a galaxy which is very very very far away, it appears to have the same size as it should have considering its distance. Ah, yes, because we can determine the distance of a very very remote galaxy, or so we think, by looking at its red shift (how much redder it looks than what it would be if it were next door).
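How a red shift becomes a distance can be sketched numerically: in the standard flat cosmology, the comoving distance is an integral over the expansion history. A minimal sketch (the parameter values H0 = 70 km/s/Mpc and Omega_m = 0.3 are the conventionally assumed ones, not measured here):

```python
import math

H0 = 70.0          # Hubble constant, km/s/Mpc (assumed)
OMEGA_M = 0.3      # matter density (assumed)
OMEGA_L = 0.7      # dark energy density (flat universe: the two sum to 1)
C = 299792.458     # speed of light, km/s

def comoving_distance(z, steps=10000):
    """Comoving distance in Mpc: (c/H0) * integral_0^z dz'/E(z'),
    with E(z) = sqrt(Omega_m*(1+z)^3 + Omega_L) for a flat universe."""
    dz = z / steps
    total = 0.0
    for i in range(steps):
        zi = (i + 0.5) * dz  # midpoint rule
        E = math.sqrt(OMEGA_M * (1 + zi) ** 3 + OMEGA_L)
        total += dz / E
    return (C / H0) * total

for z in (0.01, 0.1, 1.0):
    print(f"z = {z}: D ≈ {comoving_distance(z):.0f} Mpc")
# At small z this reduces to Hubble's law, D ≈ c*z/H0.
```

Note that this distance-from-redshift recipe is model-dependent: change the assumed cosmology, and the same redshift maps to a different distance, which is part of the point being made here.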

This apparently innocuous set-up creates lots of problems for the ruling cosmological theory, the Big Noise Bang. The barnacles attached to the Big Noise, thousands of professional cosmologists, would not be happy to see their ship sink, so they insist it’s doing all right. Yet I am dancing all around with the facts, and, should they read me carefully, they would be less sanguine about the respect they will enjoy, in the fullness of time.

Gravitational Lensing. Lensing Without Gravitation Would Signal Curvature. So Would Apparent Size Variations. Neither Is Observed, However far We Look.

The Big Noise cosmologists may well be wrong, because they suppose plenty of things for their model. All too many things, some of them, pretty weird. I get to the same observations, while being much more parsimonious with my hypotheses.

We have seen it all before, this conflict between common sense and complicated absurdities by great priests, themselves at the service of higher authorities. Remember the Ptolemaic system? It claimed the Sun rotated around the Earth. That absurdity ruled for around 15 centuries.

***

Cosmology is serious business:

The Ptolemaic System Was An Obese Lie, Thus Contradicting It, A Capital Crime:

The bigger the lie, the greater the authority. So great authority loves big lies: it is a training ground for the feeble minds which make authority so great.

The greatest philosopher of the Fourteenth Century, and the greatest physicist of the Middle Ages, the Parisian Johannes Buridanus, sent the Ptolemaic system to the bottom of the sea (1320s CE).

However Jean Buridan, adviser to 4 kings, and head of the University of Paris, did not want to be burned alive. So Buridan presented all his new physics and cosmology as something “supporters” of the point of view that “authority does not demonstrate” were talking about (he named no names).

Buridan believed that the Earth turned on itself each day, and around the Sun in a year, and that an arrow shot straight up would fall back at the same point, because of his own theory of impetus. Etc. It’s all very clear, and some of it can even be read. (In this extract Buridan supports geocentrism; in later extracts, he concludes it cannot be distinguished from heliocentrism observationally; a full study of Buridan is not extant. Some of Buridan’s later arguments are found in Oresme.)

Even the ship example used by Galileo, 300 years later, to demonstrate the undetectability of uniform motion is Buridan’s invention, for the same purpose (Buridan’s student, bishop Oresme wrote about it too).

The Catholic Church, supported by King Plutocrat Louis XI, made reading Buridan a capital crime in 1473 CE. Buridan’s cosmology was posthumously re-amplified by his student and (self) publicist, the dying Abbot Copernicus.

On the face of it, that fancy, the heliocentric system, looked quite ridiculous. Yet Buridan pointed out that the Earth was “tiny”, so it was only understandable that the tiny thing would rotate on itself, while the enormous thing stayed put.

***

Authorities Love Systems Which Lie And Make No Sense:

Why the geocentric system was entertained so long explains much of the enthusiasm for the Big Bang. The psychology is similar: an obscure set of ideas was made more hermetic by computations nobody understands. Actually, it was Plato who launched the Big Ptolemaic Noise, six centuries prior to Ptolemy’s efforts.

Believing in the geocentric system was good training for submitting to stupid authority, and for learning to become non-critical.

But let’s go back to flatness.

Basic Math Of Flatness:

Our universe of stars, clouds, and galaxies is three dimensional (as I often talk of higher dimensions, see the mathematical note below: the “3” may be an average of the “many”).

Geometries can be flat (a plane) or spherical (aka “elliptic”; as on a round planet), or “hyperbolic” (a saddle).

A mighty theorem (Perelman-Thurston; see technical note on mathematical background) implies that astronomically plausible non-flat geometries contain flat, spherical or hyperbolic elements.

I will simplify further.

Geometries are determined by their geodesics (the shortest paths). At least locally.

A non-flat universe means that some perspective can be found from which two neighboring geodesics will either converge or diverge.

For a proof, just look at a sphere, or a saddle; the geodesics can be determined by pulling a string between two points, making the shortest paths. They are the greatest circles in the case of a sphere. Notice that the distances between two nearby strings, once pulled to make geodesics, vary. The big math proof, with equations, does not say anything more.
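The string-pulling picture above can be put in numbers. Here is a toy numerical sketch (my own illustration, with made-up parameters): two meridians on a sphere start out parallel at the equator, yet their separation shrinks to zero at the pole, while two parallel lines on a flat plane keep a constant separation forever.

```python
import math

# Two geodesics (meridians) on a unit sphere, separated by delta_lon radians
# of longitude at the equator. Their separation at latitude `lat_rad` is
# radius * delta_lon * cos(lat_rad): it shrinks as we move toward the pole.

def sphere_separation(lat_rad, delta_lon=0.01, radius=1.0):
    """Distance between two nearby meridians at a given latitude."""
    return radius * delta_lon * math.cos(lat_rad)

def plane_separation(distance_traveled, delta=0.01):
    """Two parallel straight lines on a flat plane: separation never changes."""
    return delta

equator = sphere_separation(0.0)                    # full separation
halfway = sphere_separation(math.pi / 4)            # already smaller
near_pole = sphere_separation(math.pi / 2 - 1e-9)   # essentially zero

print(equator, halfway, near_pole)  # converging geodesics reveal curvature
```

On the sphere the separation varies with position; on the plane it does not. That varying separation is exactly the signature of non-zero curvature the text describes.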

No Empty Space Lensing, No Curvature:

In space, geodesics are paths followed by light. If the universe is not flat, light will either diverge, or converge, as if space itself was a lens. This means that a galaxy, or a galactic cluster, will appear bigger, or smaller, than it should.

Some may object that lensing in space is well known, and is even used to look at the furthest galaxies. However that lensing is due to gravity slowing down, and bending light, as happens with light grazing the sun. That’s called gravitational lensing. Entire galactic clusters are known to operate as giant lenses.

If one saw lensing, with nothing in between, the lensing would not be gravitational and the universe would not be flat.

But so far, this has not been observed.

A perfectly flat universe means global curvature zero. However the basic idea of the Einstein Field Equation (EFE) is:

CURVATURE = MASS-ENERGY-MOMENTUM

Actually, this equation is just the basic idea, thus the ultimate simplification. As it stands, it cannot work without further complications, because the full curvature object on the left (the Riemann tensor) has many more independent components than the 10-component tensor on the right; so one has to contract the curvature down first. The real equation is more like:

Function of Curvature = Mass-Energy-Momentum

There are a lot of mathematical details to figure out, to make that basic idea fit in. It took many stupendous mathematicians and physicists many years working together frantically to figure them out. In particular, Einstein and Hilbert cooperated intensely, helped by many collaborators… And the initial idea comes from the mathematician/physicist/philosopher Riemann (1866). So it took 60 years to make the idea work, and one should not expect casual readers to get the ideas in 60 lines, let alone 60 seconds.
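For completeness, in standard textbook notation (my addition, not the author’s), the “function of curvature” that finally worked is the Einstein tensor, and the field equations read:

```latex
G_{\mu\nu} \;\equiv\; R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu}
\;=\; \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
```

where \(R_{\mu\nu}\) is the Ricci tensor (a contraction of the full Riemann curvature), \(R\) its trace, \(g_{\mu\nu}\) the metric, and \(T_{\mu\nu}\) the stress-energy-momentum tensor.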

An obvious (sort of) prediction was that, as the Mass-Energy of the universe is not zero (it’s full of galaxies, which have mass, and energy), then the curvature could not be zero. But then, if curvature (of the space-time of the universe) is not zero, then the universe has got to be moving.

Revolted by a moving universe, Einstein then added another curvature term, Λg. This term counterbalanced Mass-Energy-Momentum, and gave a static (but unstable) universe.

Thus Einstein did not predict what the astronomers were starting to observe, namely the expansion of the universe. Einstein abandoned L (“Lambda”), calling it the “biggest blunder [he] ever made”.
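In the same standard notation (again my addition), Einstein’s static modification amounts to:

```latex
R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu} + \Lambda\, g_{\mu\nu}
\;=\; \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
```

with the constant \(\Lambda\) tuned so that the universe neither expands nor contracts. The balance is unstable, like a pencil standing on its tip: any perturbation makes the universe run away into expansion or collapse.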

(According to me, he made a much graver error in 1905.)

***

Dark Energy Flattens Cosmological Logic:

Ninety years later, the most basic supernovas were studied. They arise in binary systems: a star transfers part of itself to its companion, a super hot white dwarf. It is a bit like pouring gasoline on an ember: when enough mass has been transferred to the dwarf, the pressure and heat in its depths become just right for thermonuclear fusion to re-ignite explosively. It happens in exactly the same way every time (although some argue about this). So these Type 1a supernovae are viewed as standard candles, always of the same luminosity.

Large surveys (rejecting some explosions viewed as outliers) concluded that far-away Type 1a explosions were fainter than the Hubble law of expansion predicted. And the further one looked, the more the 1a explosions faded.

The conclusion was drawn that the universe expanded faster than the old model of Hubble and Einstein’s Gravitation theory predicted.

Greater expansion meant greater energy, and its source was not clear, so it was named DARK ENERGY.
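The supernova argument can be sketched numerically. Below is my own hedged illustration (parameter values are conventional assumptions, not taken from the text): compare how faint a Type 1a supernova at redshift z = 0.5 looks in a matter-only universe versus one with Dark Energy. The Dark Energy universe puts the same supernova farther away, hence fainter, by a fraction of a magnitude.

```python
import math

C_KM_S = 299792.458   # speed of light, km/s
H0 = 70.0             # Hubble constant, km/s/Mpc (assumed illustrative value)

def luminosity_distance_mpc(z, omega_m, omega_lambda, steps=10000):
    """Luminosity distance in a flat FLRW universe, by midpoint-rule integration."""
    def E(zp):  # dimensionless expansion rate H(z)/H0
        return math.sqrt(omega_m * (1 + zp) ** 3 + omega_lambda)
    dz = z / steps
    comoving = sum(dz / E((i + 0.5) * dz) for i in range(steps))
    return (1 + z) * (C_KM_S / H0) * comoving

def distance_modulus(d_mpc):
    """Astronomer's magnitude-distance relation: mu = 5 log10(d / 10 pc)."""
    return 5 * math.log10(d_mpc * 1e6 / 10)

z = 0.5
mu_matter_only = distance_modulus(luminosity_distance_mpc(z, 1.0, 0.0))
mu_dark_energy = distance_modulus(luminosity_distance_mpc(z, 0.3, 0.7))

# Larger distance modulus = fainter supernova.
print(mu_dark_energy - mu_matter_only)
```

The difference comes out to roughly 0.4 magnitudes at z = 0.5: that extra faintness of distant 1a candles is, in essence, what the surveys measured and what got labeled Dark Energy.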

Ironically, the simplest way to describe it was just to re-introduce the Λg term which Einstein had first introduced and then rejected, while he blundered about clumsily.

***

Your Humble Servant Flattens All:

It remains that the original theory of Einstein requires a very fine tuning of parameters to make our universe explode into its present very flat state in a bit less than 14 billion years. It also requires a supplementary explosion, called “Cosmological Inflation”.

I don’t have this problem.

I just wipe the slate clean of Einstein and his cohorts. I am master of my own soul. They have two Cosmological Inflations; I have just one, the one that is observed.

And my version of the universe can be 100 billion years old, or more.

I don’t confuse gravitation and revolution, inflation and what not. The Einstein Field Equations are correct, I just don’t apply them to the universe.

Simple does it.

Making something complicated simply because it allows one to “shut up and calculate” (the philosophical doctrine of contemporary physics) has been seen before. This was the trap into which Ancient Greek astronomy fell, making ever more sophisticated versions of the Ptolemaic system.

We should avoid duplicating our forebears’ mistakes.

Patrice Ayme’

Mathematical Note:

That I consider the universe three dimensional may sound as a strange admission, as I always advocate all sorts of dimensions, from the brain to fundamental physics. But not so: just view the three dimensional aspect as an… average.

(Here I am going to talk as a common physicist or mathematician, and elide the tweaking of fundamental axioms of topology and logic that I am wont to engage in, because I want to present the simplest picture.)

More precisely, those three geometries (flat, spherical, hyperbolic) classify what happens in two dimensions. In one dimension, the line or circle, there is just one geometry.

The USA mathematician Thurston conjectured a theorem, proven by the Russian Perelman, which showed there are just eight fundamental geometries in three dimensions.

(Disgusted by the dog eat dog attitude of famous mathematicians, some of whom I personally know, Perelman refused prizes, and abandoned math; I do share Perelman’s indignation, and then, more. Austerity, as imposed by plutocrats, has made even mathematicians like rats, prone to devour the innocent. The problem is not just in physics.)

STRUCTURED LIGHT: WHY LIGHT SLOWS DOWN IN WATER

February 14, 2015

Light slows down in water. That’s a known experimental fact. The usual explanation is that, when light advances through water, it collides with water molecules. So it zigs and zags through the water, and this zig-zagging action slows it down.

This makes no sense (sorry, noble predecessors!)

After showing why it makes no sense, I will present my solution, STRUCTURED LIGHT. The reasoning squarely contradicts Einstein on the photon, and its triumph helps to demonstrate how right it is.

Structured Light Slows Down In Empty Space. I Apply To H2O


If the zig-zag collision theory of the slowing down of light were true, light would lose energy in these collisions. (Light speed through water is only about three-quarters of c, since water’s refractive index is about 1.33; the collision theory would mean that laser light through water covers roughly one third more distance, simply due to haphazard collisions; thus laser light would certainly lose coherence.)

Simple basic physics shows that such light would lose energy: if particle P hits particle W, and particle P’s momentum changes, then W’s momentum also changes, and so does its energy. Energy is conserved (at least over long enough times), so as P gives energy to W, P loses energy. Here P is for Photon, of course, and W for Water. (Remember that Quantum Physics does not contradict Classical Mechanics; instead, it gives it a SUBSTRUCTURE, in the finer domain that subtends the Classical domain.)

So the slow-down through collision theory predicts that light will lose energy when it goes through water.
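Here is a toy one-dimensional check of the bookkeeping in that argument (my own sketch, with purely illustrative numbers): whatever momentum kick the photon gives the water molecule, the molecule gains a strictly positive kinetic energy, which the photon must have lost.

```python
# Toy 1-D collision bookkeeping (illustrative numbers, my own sketch):
# if P's momentum changes by delta_p when it hits W (mass m_w, at rest),
# momentum conservation gives W momentum delta_p, hence kinetic energy
# delta_p**2 / (2 * m_w) -- energy that P necessarily loses.

def energy_lost_by_p(delta_p, m_w):
    """Kinetic energy gained by W (hence lost by P) in an elastic 1-D kick."""
    return delta_p ** 2 / (2 * m_w)

delta_p = 1e-27   # hypothetical momentum transfer (SI units, made up)
m_w = 3e-26       # roughly the mass of a water molecule, kg

print(energy_lost_by_p(delta_p, m_w))  # strictly positive for any real kick
```

The point is qualitative: a collision with any momentum transfer at all implies an energy loss for the photon, so a collision-based slowdown predicts a color change.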

However, it does not. Light comes out of water at the same exact color, thus energy, as it came in. Laser light keeps being laser light under water. It surely would not if every single photon of the beam had to collide with a water molecule. (Notice in the link how confused research presently is about optics and liquids; my proposed reasoning is at a scale thousands of times smaller.)

Proposing that light slows down from collision is thus wrong.

So, what’s my solution?

Absolute Wave Theory.

According to said theory, propagating photons are NOT particles (Vade Retro, Einsteinas!)

What are photons, when viewed as Absolute Waves?

Einstein proposed that photons (“Lichtquanten”) were points. He made it up. He had no proof, whatsoever, that this was true. It just sounded good. Worse: he did not need point-particle photons to explain the photoelectric effect. That error has poisoned the well of physics for 110 years. Thousands of physicists repeated what Einstein said. That Einstein was given the Nobel Prize for this exact idea, is no proof of its validity, as far as I am concerned. That makes me special.

But I have very good reasons to believe photons are not points. Because:

  1. I don’t know what points are. Not only do I not know what points are physically, I don’t even know what they are mathematically. (By the way, I know Real Analysis and some Model Theory, so I am not as naïve as I may sound to the unwary.)
  2. Light diffracts and bends around corners. Isolated photons do this. How could they do it, if they were not spread about transversally?

Here is my conclusion: Photons are structured waves. This basically means that they have some width.

This is now experimentally supported. What was published in Science on January 22, 2015?

Spatially structured photons that travel in free space slower than the speed of light (Daniel Giovannini et al.)

http://www.sciencemag.org/content/early/2015/01/21/science.aaa3035

“Abstract: That the speed of light in free space is constant is a cornerstone of modern physics. However, light beams have finite transverse size, which leads to a modification of their wavevectors resulting in a change to their phase and group velocities. We study the group velocity of single photons by measuring a change in their arrival time that results from changing the beam’s transverse spatial structure. Using time-correlated photon pairs we show a reduction of the group velocity of photons in both a Bessel beam and photons in a focused Gaussian beam. In both cases, the delay is several micrometers over a propagation distance of the order of 1 m. Our work highlights that, even in free space, the invariance of the speed of light only applies to plane waves.”
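A back-of-the-envelope version of that result (my own sketch, not the paper’s calculation): a Bessel beam is built from plane waves travelling at a small angle θ to the axis, so its axial group velocity is roughly c·cos θ, and over a path L it lags a plane wave by about L(1/cos θ − 1) ≈ Lθ²/2. The cone half-angle below is an assumed illustrative value.

```python
import math

C = 299792458.0  # speed of light in vacuum, m/s

def axial_lag(path_m, cone_half_angle_rad):
    """Spatial lag of a Bessel beam behind a plane wave over path_m meters,
    assuming axial group velocity ~ c * cos(theta)."""
    return path_m * (1.0 / math.cos(cone_half_angle_rad) - 1.0)

# 1 m of propagation with a ~3 milliradian cone half-angle (assumed value):
lag = axial_lag(1.0, 3e-3)
print(lag)  # on the order of micrometers, like the reported effect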

So what do I propose?

That water structures photons propagating through it. Structuring is what slows light down. Instead of having just one mask, as in the Glasgow experiment, we have thousands within one wavelength of light. Thus, instead of being slowed down by .0001%, light is slowed down by on the order of 10% or more.
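To put rough numbers on that contrast (my own hedged estimate, not from the text): the free-space structured-light lag is a few micrometers per meter, a fractional slowdown around 10⁻⁶ to 10⁻⁵, whereas water (refractive index n ≈ 1.33) slows light by 1 − 1/n ≈ 25%, several orders of magnitude more.

```python
# Hedged comparison of fractional slowdowns (illustrative order-of-magnitude
# numbers, my own sketch): free-space structured light vs. light in water.

glasgow_lag_m = 5e-6     # ~5 micrometers of lag...
glasgow_path_m = 1.0     # ...over ~1 m of propagation (order of magnitude)
frac_structured = glasgow_lag_m / glasgow_path_m   # ~5e-6

n_water = 1.33                                     # refractive index of water
frac_water = 1.0 - 1.0 / n_water                   # ~0.25

print(frac_structured, frac_water, frac_water / frac_structured)
```

So the proposal amounts to claiming that water applies the structuring effect tens of thousands of times more strongly than a single free-space mask does.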

As in the Glasgow experiment, photons are not “particles”, they are spread about (they have a “TRANSVERSE structure”).

When a photon enters water, should it NOT hit a water molecule, the photonic wave gets endowed with a complex topology of non-trivial genus (because the non-linear wave that constitutes the photon has to avoid the nuclei and orbiting electrons, and the only way it can do that is by evolving holes in the right places).

As a photon passes a water molecular group, it slows down a bit. The water molecules act like the mask the physicists applied to slow down the beam photons in their experiment. Those braking episodes pile up, and integrate into a global slow-down.

Frequency, thus energy, is unaffected.

Some may object that the theory is obviously false: should not the slow-downs pile up, so that the thicker the water, the more the photons would slow down?

No. For the Structured Photons in vacuum, the slowing down is necessitated by the collapse of the photon back into a linear wave: a one-time event. In water, however, once the photon has acquired a structure enough like a sieve, after going around enough water molecules, it needs time to restructure. Over that distance, it has slowed down. Then the process repeats.

Let me quote a bit more from the violation of light speed Glasgow University paper (from behind its pay wall):

“The speed of light in free space propagation is a fundamental quantity. It holds a pivotal role in the foundations of relativity and field theory, as well as in technological applications such as time-of-flight measurements. It has previously been experimentally established that single photons travel at the group velocity (20). We have now shown that transverse structuring of the photon results in a decrease in the group velocity along the axis of propagation. We emphasize that in our full-aperture experiments, no pre- or post-selection is applied to the spatially structured photons, and that the group velocities are always compared over the same propagation distance, much as if they were in a race. The effect can be derived from a simple geometric argument, which is also supported by a rigorous calculation of the harmonic average of the group velocity. Beyond light, the effect observed will have applications to any wave theory, including sound waves.”

The authors have declared that they could not see any application of the effect they discovered. In particular not in cosmology.

However, I just found one, in everyday physics.

Einstein said nobody understood Quantum Mechanics. Feynman added that all the mystery of the Quantum was in the Double Slit Experiment. Here I explain speed of light in a medium by piling up thousands of double slit experiments within a wavelength of light, and the slow-down they bring. (It’s not quite the Double Slit as it involves continual collapses along the propagation axis.)

The structured photon is the fundamental idea, the order one idea, of the Absolute Quantum Wave theory. The preceding, and the Glasgow experiment itself, establish it further (more is coming soon).

There is no experimental support for Einstein’s views on the spatial nature of the photon as a point-particle; there is plenty of evidence against it (the latest being Structured Light).

By contrast there is increasing evidence for the Absolute Wave Theory. Einstein and company, bless their souls, pontificated about a lot of things they did not know anything about. That photons were point-particles is one of them. Time to move on.

Patrice Ayme’