Can you imagine that this minimal and simple set of ingredients produces the observable universe, given the right densities of Planck scale point charges and energy? Can you imagine that this set of ingredients might produce a background **spacetime æther structure** that emerges from the complex dynamical nature of electrinos and positrinos? Electrinos and positrinos interacting with each other at a variety of energies produce structure, much as individuals create with rocks and wood blocks, Erector sets, Tinkertoys, Lincoln Logs, K’nex, Lego, or, for that matter, software, solid modeling, lattice modeling, and a profusion of other intensely mathematical forms, including simulation. Nature implements emergence through structure formation, and structure begets structure.

Once we realize that this is how nature and the universe work, we can accelerate our efforts to understand all of the emergent structures and reactions under all conditions we find interesting and promising. Revising the advanced modeling techniques of prior science with NPQG will result in enormous (well beyond quantum!) leaps forward through the remainder of the configuration space provided by nature. Entirely new algorithms will emerge that are far more efficient after being informed by NPQG.

Beyond improving knowledge, models, and algorithms, things get really interesting when NPQG reaches the stage of impacting implementation technology. What types of applications fit the technology that will emerge early in the NPQG era? It is difficult to predict with any accuracy: there are many advanced applications in many fields, and investment capital and potential return on investment will play a large role. The core of NPQG knowledge and software modeling is, and will remain, open source. The hope is that the open source project evolves into a vibrant contributor community, with many sub-projects available for universities, institutes, and businesses to support or extend with proprietary value while also funding and contributing to the core.

The most important questions are how NPQG will inform improved technology, at what scale, cost, and manufacturability, and in what fields. Answering them requires consideration of a large multi-dimensional matrix of factors, and it is premature to make confident assessments.

One exciting application of NPQG will be computing and memory. Since we will understand how structure works down to the lowest possible level of nature, we will set our sights on leveraging that fundamental behaviour at the tiniest and fastest scales to architect computers and memory of incredible capability. Of course it will take technology developers some time to reach this point, but they will be informed by a precise understanding of nature. That precision means modeling will play a significant role in determining the most promising technology paths forward at the fastest pace.

Already today, prior to NPQG, entanglement- and uncertainty-based computing grounded in quantum mechanics, aka quantum computing, is garnering significant investment and research, and may well reach a plateau of productivity and have its heyday. I am certain there is physics at that level to exploit. NPQG may also help to inform the equivalent of Moore’s law for quantum technology.

Let’s turn to the fundamental layers of nature and Planck scale point charges. How do we model them based on our reverse engineering of nature?

- Immutable point charges are symmetric under any condition.
- Their charge emits from their center point.
- The origin and implementation of point charges is unknown.
- They have a sphere of immutability at a radius near the Planck length.
- Their sphere of immutability may not be penetrated by another Planck particle.
- Electromagnetic fields flow through them with no impact.
- Planck scale point charges operate in a continuous Euclidean geometry of absolute space and absolute time.
- While the point charges themselves are physical quanta, they operate in a continuous geometry.
- As structures form from point charges, those structures can take on quantum/discrete behaviours and/or continuous behaviours. Compare how we make digital circuits out of analog gates.
- The most important basic emergent structure is the Tau dipole, consisting of an orbiting electrino and positrino pair, which can take on increments of *ħ* (joule-seconds).
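
As a numerical aside, the dipole’s “increments of ħ” can be read through the standard quantum-oscillator ladder, in which adjacent energy levels differ by exactly ħω. This is a minimal sketch under that reading; the dipole frequency used here is purely illustrative, not a value from the model.

```python
import math

hbar = 1.054571817e-34      # reduced Planck constant, J*s
omega = 2 * math.pi * 1e15  # hypothetical dipole angular frequency, rad/s

def level_energy(n):
    """Energy of level n of a harmonic oscillator: E_n = (n + 1/2) * hbar * omega."""
    return (n + 0.5) * hbar * omega

# Spacing between adjacent levels is hbar * omega -- roughly 6.6e-19 J here.
step = level_energy(1) - level_energy(0)
print(f"Level spacing: {step:.3e} J")
```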

For the purposes of this article, let’s presume that in free space the spacetime *æther* is composed of structures formed from extremely tired (redshifted to very low energy) photons and neutrinos, which are composite particles. These structures rotate slowly and are nearly depleted of energy; their apparent kinetic energy corresponds to 2.7 Kelvin, and their generation 1 dipoles approach zero frequency.

Photons and neutrinos can take on an incredible number of precise energy levels, from near zero to the Planck frequency. Imagine if we could isolate a photon as a computational memory cell. If I did the math correctly, that is about 2^143 states, or the equivalent of a 143-bit register. Of course, real technology spends raw capability to handle error rates, so who knows the conversion rate, but still, considering the size, that is an incredibly small and fast memory cell. And yes, this is probably many years down the road and in an entirely different form of technology. I am turning up the contrast to show where NPQG ultimately leads, technology-wise.
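
The 143-bit figure can be checked quickly. Assuming one distinguishable state per hertz of frequency, from near zero up to the Planck frequency, the state count is just the Planck frequency expressed as a number:

```python
import math

# Planck frequency in Hz (CODATA-derived value)
f_planck = 1.8549e43

# Assumption: one distinguishable state per 1 Hz step, from ~0 Hz
# up to the Planck frequency, so the state count equals f_planck.
bits = math.log2(f_planck)
print(f"{bits:.1f} bits")  # about 143.7, matching the ~143-bit register estimate
```

Any coarser frequency resolution would subtract bits accordingly (a 1 kHz step, for example, costs about 10 bits).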

We would need the following operations:

- Read the energy level of a photon precisely
- Add energy to a photon
- Subtract energy from a photon
- Characterize any uncertainties or error rates
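
The operations above can be sketched as a hypothetical interface. `PhotonCell`, `read_energy`, `add`, and `subtract` are invented names purely for illustration; no such hardware or API exists today, and the integer "energy level" stands in for whatever precise physical readout such a device would use.

```python
class PhotonCell:
    """Hypothetical sketch of a photon-based memory cell."""

    def __init__(self, energy_level: int = 0, error_rate: float = 0.0):
        self.energy_level = energy_level  # stored value, as an energy-step count
        self.error_rate = error_rate      # expected fraction of erroneous reads

    def read_energy(self) -> int:
        """Read the photon's energy level precisely."""
        return self.energy_level

    def add(self, steps: int) -> None:
        """Add energy to the photon (increment the stored value)."""
        self.energy_level += steps

    def subtract(self, steps: int) -> None:
        """Subtract energy from the photon (decrement, floored at zero)."""
        self.energy_level = max(0, self.energy_level - steps)

cell = PhotonCell()
cell.add(40)
cell.subtract(2)
print(cell.read_energy())  # 38
```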

Such a photon would be the ultimate computational memory cell. Not only would it make for extremely fast and dense memory, but it would have built-in add and subtract capability, and it would be non-volatile as well.

*I hope you enjoyed this post. It has been a bit imaginative, yet it is certainly fun to contemplate the ultimate leverage of nature.*

*J Mark Morris : San Diego : California*
