
Planck Shell Memory and Computation

NEOCLASSICAL PHYSICS
and QUANTUM GRAVITY
Given energetic immutable point charges permeating a flat Euclidean space and time, emergence creates our universe.
NPQG unifies GR and QM and transforms ΛCDM.

Seed a Euclidean garden with duos of energetic, charged, immutable Planck scale point charges and a Universe will emerge.

J Mark Morris

Can you imagine that this minimal and simple set of ingredients produces the observable universe, given the right densities of Planck scale point charges and energy? Can you imagine that this set of ingredients might produce a background spacetime æther structure that emerges from the complex dynamical nature of electrinos and positrinos? Electrinos and positrinos interacting with each other at a variety of energies produce structure, much as individuals create with rocks and wood blocks, Erector sets, Tinkertoys, Lincoln Logs, K’nex, Lego, or, for that matter, software, solid modeling, lattice modeling, and a profusion of other intensely mathematical forms, including simulation. Nature’s ingredients implement emergence via structure formation, and structure begets structure.

Once we realize that this is how nature and the universe work, we can accelerate our efforts to understand all of the emergent structures and reactions under all conditions we find interesting and promising. Revising the advanced modeling techniques of prior science with NPQG will result in enormous (well beyond quantum!) leaps forward through the remainder of the configuration space provided by nature. Entirely new algorithms will emerge that are far more efficient after being informed by NPQG.

Beyond improving knowledge, models, and algorithms, it is when NPQG reaches the stage of impacting implementation technology that things get really interesting. What types of applications are a fit with the technology that will emerge early in the NPQG era? It is difficult to predict with any accuracy. There are many advanced applications in many fields, and investment capital and potential return on investment will play a large role. The core of NPQG knowledge and software modeling is, and will remain, open source. I hope and project that the open source project will evolve into a vibrant contributor community, with many sub-projects available for universities, institutes, and businesses to support or to extend with proprietary value while also funding and contributing to the core.

The most important questions are how NPQG will inform improved technology, at what scale, cost, and manufacturability, and in which fields. Answering these questions requires consideration of a large multi-dimensional matrix of factors, and it is premature to make confident assessments.

One exciting application of NPQG will be to computing and memory. Since we will understand how structure works to the lowest possible level of nature, we will set our sights on leveraging that fundamental level behaviour at the tiniest and fastest scales to architect computers and memory of incredible capability. Of course it will take technology developers some time to reach this point, but they will be informed by a precise understanding of nature. Being precise means that modeling will play a significant role in determining the most promising technology paths forward at the fastest pace.

Already today, prior to NPQG, computing based on quantum mechanical entanglement and uncertainty, aka quantum computing, is garnering significant investment and research, and it may well reach a plateau of productivity and have its heyday. I am certain there is physics at that level to exploit. NPQG may also help to inform the equivalent of Moore’s law in quantum technology.

Let’s turn to the fundamental layers of nature and Planck scale point charges. How do we model them based on our reverse engineering of nature?

  • Planck point charges are symmetric under any condition.
  • Their charge emits from their center point.
  • The origin and implementation of point charges is unknown.
  • They are immutable.
  • They have a sphere of immutability with a great-circle circumference of the Planck length, i.e., a radius of Lp/τ (where τ = 2π).
  • Their sphere of immutability may not be penetrated or dented by another Planck particle.
  • Electromagnetic fields flow through them with no impact.
  • Planck scale point charges operate in a continuous Euclidean geometry of absolute space and absolute time.
  • While the point charges themselves are physical quanta, they operate in a continuous geometry.
  • As structures form from point charges, those structures can take on quantum/discrete behaviours and/or continuous behaviours. Refer to how we make digital circuits out of analog gates.
  • The most important basic emergent structure is the tau dipole, consisting of an orbiting electrino and positrino pair, which can take on energy in increments of Planck’s constant h.
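As a toy illustration of the modeling just described, a minimal sketch of a point charge and a tau dipole might look like the following. The class names, the unit charge signs, and the energy bookkeeping (n stored quanta at orbital frequency f giving E = n·h·f) are my assumptions for illustration, not a definitive implementation.

```python
from dataclasses import dataclass
from typing import Tuple

H = 6.62607015e-34  # Planck's constant, J·s


@dataclass
class PointCharge:
    """Immutable Planck scale point charge in continuous Euclidean space.
    Sign is +1 (positrino) or -1 (electrino), in arbitrary charge units."""
    sign: int
    position: Tuple[float, float, float]


@dataclass
class TauDipole:
    """Orbiting electrino/positrino pair that stores energy in
    h-sized increments, per the bullet list above."""
    electrino: PointCharge
    positrino: PointCharge
    quanta: int = 0  # number of h-sized increments currently stored

    def absorb(self, n: int) -> None:
        """Take on n increments of energy."""
        self.quanta += n

    def emit(self, n: int) -> None:
        """Shed n increments; the dipole cannot go below zero stored quanta."""
        if n > self.quanta:
            raise ValueError("cannot emit more quanta than stored")
        self.quanta -= n

    def energy(self, frequency_hz: float) -> float:
        """Stored energy in joules, assuming E = n * h * f."""
        return self.quanta * H * frequency_hz
```

For example, a dipole that absorbs three quanta while orbiting at 1 THz would hold 3 · h · 10¹² joules under this bookkeeping.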

For the purposes of this article, let’s presume that in free space the spacetime æther is composed of extremely tired (redshifted), very low energy photons and neutrinos: composite particles consisting of a shell with no payload. They rotate slowly and are nearly depleted of energy. Their kinetic energy corresponds to 2.7 Kelvin, and their flywheel battery shell approaches zero frequency.

Photons and neutrinos can take on an incredible number of precise energy levels, from near zero to the Planck frequency, in half integer frequencies. Imagine if we could isolate a photon as a computational memory cell. If I did the math correctly, that is about 2^143 states, or the equivalent of a 143-bit register. Of course, real technology uses up raw capability to handle error rates, so who knows the conversion rate, but still, considering the size, that is an incredibly small and fast memory cell. And yes, this is probably many years down the road and in an entirely different form of technology. I am turning up the contrast to show where NPQG ultimately leads, technology-wise.
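The back-of-the-envelope count above can be checked in a few lines. The Planck frequency value and the assumption of one distinguishable state per unit of frequency are mine; the point is only the order of magnitude.

```python
import math

# Planck frequency in hertz, sqrt(c^5 / (hbar * G)) ≈ 1.855e43.
F_PLANCK = 1.855e43

# Counting one distinguishable state per integer frequency step, from
# near zero up to the Planck frequency, gives ~F_PLANCK states.
bits_integer = math.log2(F_PLANCK)  # ≈ 143.7, i.e. roughly a 143-bit register

# Allowing half-integer frequency steps doubles the state count (+1 bit).
bits_half_integer = math.log2(2 * F_PLANCK)
```

So a single photon, under these assumptions, encodes on the order of 2^143 to 2^144 distinguishable states before any allowance for error correction.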

We would need the following operations:

  • Read the energy level of a photon precisely
  • Add energy to a photon
  • Subtract energy from a photon
  • Characterize any uncertainties or error rates

Such a photon would be the ultimate computational memory cell. Not only would it make for extremely fast and dense memory, but it would have built-in add and subtract capability, and it would be non-volatile as well.

I hope you enjoyed this post. It has been a bit imaginative, yet it is certainly fun to contemplate the ultimate leverage of nature.

J Mark Morris : San Diego : California : August 11, 2020

By J Mark Morris

I am imagining and reverse engineering a model of nature and sharing my journey via social media. Join me! I would love to have collaborators in this open effort. To support this research please donate: https://www.paypal.me/johnmarkmorris

https://johnmarkmorris.com
https://twitter.com/J_Mark_Morris
https://www.reddit.com/r/NPQG/
https://www.facebook.com/NPQG/
https://www.linkedin.com/in/johnmarkmorris/
