A Vulnerability in the Mathematics of Physics

The field of physics advances relative to the accepted tower of information that has emerged from the history of physics theory, mathematics, and experimental observation. When theory predictions match experimental observations to a high degree of statistical significance, that match binds the experimentally supported theory into the tower of information. Of course, we know that bindings of thought to reality can be perfectly sensible one day and unwound the next by a new, more scientifically sensible paradigm.

A geocentric model of our universe, from James Ferguson’s 1756 “Astronomy Explained,” highlights its absurdity; the Ptolemaic system required that the planets in our solar system orbit Earth in strange, looping arcs. Credit: Science Photo Library, via The New York Times.

Let’s carefully examine what we mean by the following principle of the scientific method:

When theory predictions match experimental results to a high degree of statistical significance, that match builds confidence in the theory.

When the scientific method is applied properly, we must also consider the possibility that an assumption is FALSE.

Consider that the Planck length scale is some 15 orders of magnitude smaller than anything detectable with advanced experimental and observational technology circa 2020, which achieves at best $10^{-20}$ meter discernment. From the Planck length scale looking up, the experiments of 2020 are a factor of a quadrillion larger, as the worked ratio below shows.

1,000,000,000,000,000

A QUADRILLION
A THOUSAND TRILLION
A MILLION BILLION
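
To make that factor concrete, here is the arithmetic, using the standard value of the Planck length, $\ell_P \approx 1.6 \times 10^{-35}$ m, and the $10^{-20}$ m discernment figure cited above:

$$\frac{10^{-20}\ \text{m}}{\ell_P} \approx \frac{10^{-20}\ \text{m}}{1.6 \times 10^{-35}\ \text{m}} \approx 6 \times 10^{14} \sim 10^{15}$$

Rounded to the nearest order of magnitude, that is the factor of $10^{15}$, a quadrillion, emphasized above.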

The scientific method has been interpreted such that treating the Planck scale as a meaningful clue about physical nature is taboo, because ideas at that scale cannot be falsified via testing. The Planck scale includes length, energy, and other dimensions and measures. Let’s extend our claim beyond the Planck scale to include all scales of structure not currently observable by state-of-the-art experimentation.

Proposition: For any theory of nature, nature beyond the current capabilities of experimental observation could impact the assumptions or findings of the theory.

If the proposition were TRUE, then for each theory that relies on faulty assumptions, the statistical significance of the findings is invalidated, and the theory must be re-evaluated theoretically and experimentally.

The proposition is not even close to being FALSIFIABLE given the state of physics and cosmology circa 2020. There are so many paradoxes and open problems that no scientist would claim we understand nature at its most fundamental level.

If the proposition were actually FALSE, then it would be the case that our experiments had reached the natural limit of experimentation. There would be no ‘plus ultra’, no more beyond; nature and the universe would make eminent sense. We’re not there yet.


The long con of physics is the degree of confidence attributed to various theories. Physics has many theories and narratives about why certain propositions we expected to be TRUE were actually FALSE. What if those propositions really were TRUE, but science could not tell because state-of-the-art experiments could not detect the underlying structure? What if physics has cemented one or more incorrect theories into the tower of information?

This is exactly the situation we find ourselves in with NPQG. The vast majority of scientists in the fields of physics, cosmology, and astronomy consider it nearly taboo to question GR, QM, or ΛCDM. However, as you study NPQG, the case continues to build that nature outside our scales of detection leads to a new, more sensible theory of nature and the universe.

J Mark Morris : San Diego : California