## Let’s Get On With It!

It amuses me to say that ‘physicists are field happy’. It’s in the same genre of groan-inducing puns as ‘physicist out standing in their field.’ Perhaps the biggest irony of all is that physicists did not invent the correction field to Einstein’s curvy, stretchy spacetime that would have let them work in normal Euclidean four dimensional space and time. Had they done so, it’s a short stroll down Easy Street to realize that the correction field must be an aether. Then bingo, bango, bongo, the solution to the universe emerges. It is a fascinating set of circumstances that obfuscated the solution, which was right in front of physicists and cosmologists all along, giving them clues galore.

I purchased Richard Feynman’s book QED and am looking forward to reading it with a critical eye and attempting to link NPQG to Feynman diagrams. I already know one secret. Study this Feynman diagram and see if you can apply your knowledge of NPQG to find a new insight.

In NPQG we know that each point charge is immutable, and therefore conserved. Therefore at each node (junction) in the Feynman graph there is a reaction with mathematical identities between the reactants (what goes into the reaction) and the products (what comes out of the reaction). We know that the sum of the energy carried by the reactant point charges must equal the sum of the energy carried by the product point charges. What else do we know? The number of reactant electrinos must equal the number of product electrinos. Likewise, the number of reactant positrinos must equal the number of product positrinos. Not only that, we can trace the provenance of each point charge in the reaction. These are huge new insights that NPQG brings to the equation.

In this Feynman diagram we have three junctions or reactions.

1. A reaction where an electron reactant and a positron reactant results in a photon product. 9/3 + 3/9 = 6/6 + x/y. Therefore x/y must be 6/6 or 3/3 + 3/3.
2. A reaction where a photon reactant results in a quark product and an anti-quark product. 6/6 + x/y = 3/3(a/b) + 3/3(b/a), where a + b = 6, presuming these are generation 1 reactants and products. Apparently this reaction is consuming a 6/6 structure or two 3/3 structures from spacetime aether.
3. A reaction where an antiquark reactant results in an antiquark product and a gluon product. We haven’t yet deciphered the composition of the gluon but indications are it may be 1/1, a single energetic dipole. From whence does that gluon arise?
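As a concreteness check, the point-charge bookkeeping for a reaction node can be sketched in a few lines of Python. The compositions below (electron 9/3, positron 3/9, photon 6/6) and the 6/6 aether contribution are taken from the notation in the reactions above; they are assumptions of this sketch, not independently established values.

```python
# Illustrative bookkeeping for point-charge conservation at a reaction node.
# Compositions are written (electrinos, positrinos), per the text's notation.
STRUCTURES = {
    "electron":   (9, 3),   # 9/3
    "positron":   (3, 9),   # 3/9
    "photon":     (6, 6),   # 6/6
    "aether_6_6": (6, 6),   # assumed 6/6 structure exchanged with the aether
}

def totals(names):
    """Sum electrino and positrino counts over a list of structure names."""
    e = sum(STRUCTURES[n][0] for n in names)
    p = sum(STRUCTURES[n][1] for n in names)
    return e, p

def balanced(reactants, products):
    """True when both electrino and positrino counts are conserved."""
    return totals(reactants) == totals(products)

# Reaction 1: electron + positron -> photon, with a 6/6 balancing structure.
print(balanced(["electron", "positron"], ["photon", "aether_6_6"]))  # True
```

The same `balanced` check applies to any node: if it fails, the diagram is hiding a reactant or product, which is exactly the accounting gap discussed next.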

When we have an imbalance in point charge count between reactants and products, where are these extra point charges coming from or going to? It appears the answer must be spacetime aether. Spacetime aether provides ready vessels to exchange (transact) energy and point charges with standard matter reactant and product structures. How cool is that? Every reaction can be modeled relationally as a transaction involving conserved quantities. Those conserved quantities include a count of point charges by flavor and energy in various forms. The energy conservation is a mix of kinetic and electromagnetic forms and their directionality so that momentum is conserved as well.

These findings indicate that if Feynman diagrams are to be used in the NPQG era then the symbology and diagrams will need updates to show ALL the reactants and products in every reaction. We no longer need the ‘virtual particle’ crutch employed by physicists. There is no such thing as a ‘virtual particle’ in the point charge universe. Wherever we encounter the term ‘virtual particle’, we know that description of science is out of date. The same can be said about the erroneous concept of ‘annihilation’.

Neutrons (18/18) and protons (15/21) each have a structure built with 36 point charges. A photon (6/6) has 12 point charges. It is both odd and exciting to realize that photons are created out of energetic matter. Three photons have the same number of electrinos and positrinos as a neutron. The apparent energy carried by those point charges is vastly different and that is our path to rationalization and understanding.
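The counts in this paragraph are easy to verify mechanically. A minimal sketch, using only the compositions stated above (neutron 18/18, proton 15/21, photon 6/6):

```python
# Point-charge censuses (electrinos, positrinos) as stated in the text.
neutron = (18, 18)
proton  = (15, 21)
photon  = (6, 6)

def total(s):
    """Total point charges in a structure."""
    return s[0] + s[1]

print(total(neutron), total(proton), total(photon))  # 36 36 12

# Three photons carry the same electrino/positrino census as one neutron.
three_photons = (3 * photon[0], 3 * photon[1])
print(three_photons == neutron)  # True
```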

The new finding of NPQG that photons are actually wave emitting material and not a pure wave also means that photons are carrying point charge ‘matter’ away from the initiating reaction and delivering point charge matter to the consuming reaction. From this I ponder the set of reactions that consume photons of different energies, consuming not just their energy but their point charges as well. For example, might photosynthesis consume some or all of the point charges in arriving photons? Wouldn’t that be interesting? Any such reaction is like 4D printing using both the energy and the point charges from arriving photons. If such reactions occur in nature, then what is the accumulation rate of point charges on Earth from photons arriving from our Sun? Is that a significant inflow of point charges to Earth over large time scales? What percentage of the point charges in solar radiation (by frequency) are absorbed by Earth vs. being diverted by fields and solar winds, or by reflection? I have no idea of the answers to these questions, but they open up an entire new line of inquiry that can be applied not just to the Earth-Sun photonic relationship, but to other reactions across the universe where photons are reactants.

One incredibly awesome potential future application is that we may find a way to use a variety of lasers of different energies to 4D print directly from photons, or from photons combined with spacetime aether structures. I imagine a printer that consumes spacetime aether and somehow (a laser equipped Maxwell’s demon?) can output two streams of spacetime aether, one stream with less energy that could be considered exhaust, and one stream with more energy that we keep and send to the next stage. Through a series of these conservative cycles the 4D printer would build up energetic point charges to be used in subsequent stages that implement the process of creating assemblies. This could be the ultimate printer. You could make anything directly from spacetime.

What are the technologies required?

• An efficient process, with energy cost E0, that consumes spacetime aether at energy E1 and outputs two aether streams at energies E2 and E3. We know that E0 + E1 = E2 + E3. We want to start the race for technologies that maximize the difference | E2 – E3 | with an efficient E0 and that can be mounted in a spacecraft.
• Radiation-friendly processes and radiation control technologies. Radiation could be quite adverse for existing structure (including the operators!), so we’ll need to find safe processes that keep all the energy we are working with under control and deposit it safely into the energy cores of matter (i.e., nested dipoles).
• A process that consumes energetic point charges and creates matter made from protons, neutrons, and electrons. Need a raw ingredient for an existing process? Let’s turn on the printer and whip up a mole of gold or silver. Imagine that.
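The energy identity in the first requirement, E0 + E1 = E2 + E3, can be sketched as a trivial bookkeeping function. Everything here (the function name, the `fraction` parameter, the sample values) is an illustration of the stated identity, not a model of any real process:

```python
# Energy bookkeeping for the hypothetical aether-splitting stage, per the
# stated identity E0 + E1 = E2 + E3. All names and values are illustrative.
def split_streams(e0, e1, fraction):
    """Split the available energy (process energy e0 plus input aether
    energy e1) into two output streams; `fraction` sets the high-energy
    stream's share of the total."""
    total = e0 + e1
    e2 = fraction * total
    e3 = total - e2
    return e2, e3

e2, e3 = split_streams(e0=1.0, e1=10.0, fraction=0.9)
assert abs((1.0 + 10.0) - (e2 + e3)) < 1e-12  # conservation holds by construction
print(abs(e2 - e3))  # the spread |E2 - E3| the text wants to maximize
```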

A set of requirements could be developed for the ultimate technologies required for our spacecraft based 4D printer. Technology planners could then carefully break this down into a network of steps to occur over a certain time frame.

• How much matter-energy do you need to generate per unit time?
• How much printer time is required to produce each necessary object?
• What are the operating parameters for such a manufacturing system? How large a pod bay? Temperature? Acceleration tolerances? Velocity tolerances?
• Steps to reach zero emissions and 100% green manufacturing.

I would hazard a wishful guess that initial cost would be more than a reactor and less than or equal to a CERN class particle collider. However, that is only the early technology development period. Eventually the equivalent of Moore’s Law will kick in for NPQG and it is off to the races, reducing the cost and size of these printers while increasing their capability.

Lest anyone be skeptical and say these ideas are as ludicrous as alchemy, be aware that the science of nuclear transmutation of one element to another element already exists. Furthermore, think about the standard model of ‘fundamental’ particles. The standard model is based upon transmutation of these particles that are incorrectly termed ‘fundamental’. Seriously, physicists? Well, whatever — now we will finally have immutable fundamental point charge particles that make every type of structure. Of course, structures can be decomposed into constituents and those constituents used to build other structures. That is exactly what nature does. That IS emergence.

Imagine the network of structures and reactions occurring in absolute time throughout our galaxy. Imagine that network with traceable histories of every point charge. That’s a lot of structures and reactions!

Let’s imagine the universal network of point charges, structures, and reactions is a database. Let’s write some pseudo-SQL queries. For now I will use the symbol # to represent the network of all energetic point charges.

• select count(*) from # group by reaction
  • This is the set of all reactions that occur in our galaxy during the absolute time interval used to project our network of reactions, #.
  • If the absolute time interval is long enough, we should capture several inflationary mini-bangs from the SMBH, including point charge plasma breaches, eruptions, and massive jets of reacting point charge plasma and emerging structure.
  • If the spatial volume over which you sampled # includes the outskirts of the galaxy, where aether has expanded maximally given the opposing flows of dissipating aether from neighboring galaxies, then I would think you have a reasonably accurate view of the frequency of reactions throughout the entire universe. Perhaps you would also want to model some galaxy mergers for any unique reactions.
• select count(*) from # group by point charge, reactant/product
  • This will output four numbers describing all reactions:
    • input electrinos = output electrinos,
    • input positrinos = output positrinos.
• select timestamp, structure, count(*) from # where timestamp = T group by timestamp, structure
  • This query will show every type of structure in the galaxy at time T and the count of those structures.
  • This query could be extended to answer many questions.
    • What percentage of point charges in our galaxy are tied up in carbon atoms?
    • What percentage of point charges in our galaxy are in the SMBH?
    • What percentage of space in our galaxy is inaccessible due to being inside the sphere of immutability around point charges?
  • You could go on and on imagining queries you could write against a relational knowledge cube of the galaxy.
  • Perhaps it won’t be long until scientists can simulate small and large scale NPQG processes and generate such cubes. That would be fascinating.
  • We’ll need to include the absolute location of each point charge as well.
  • Quite obviously there is a tremendous amount of redundancy in a point charge knowledge cube as specified.
  • A realistic goal would be to generate highly compressed and encoded purpose-built cubes that could be used for specific research.
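The pseudo-SQL above runs almost verbatim against a toy version of the # network. Here is a minimal sketch in Python with sqlite3, where each row is one point charge observed in a reaction; the schema, table name, and sample reaction are my illustrative choices, not part of NPQG:

```python
import sqlite3

# A toy relational "knowledge cube": one row per point charge, tagged with
# its reaction id, flavor, role, host structure, and timestamp.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE network (
        reaction  INTEGER,
        flavor    TEXT,     -- 'electrino' or 'positrino'
        role      TEXT,     -- 'reactant' or 'product'
        structure TEXT,
        timestamp REAL
    )
""")

# Reaction 1: electron (9/3) + positron (3/9) -> two photons (6/6 each).
rows = [
    *[(1, "electrino", "reactant", "electron", 0.0)] * 9,
    *[(1, "positrino", "reactant", "electron", 0.0)] * 3,
    *[(1, "electrino", "reactant", "positron", 0.0)] * 3,
    *[(1, "positrino", "reactant", "positron", 0.0)] * 9,
    *[(1, "electrino", "product",  "photon",   0.0)] * 12,
    *[(1, "positrino", "product",  "photon",   0.0)] * 12,
]
con.executemany("INSERT INTO network VALUES (?, ?, ?, ?, ?)", rows)

# The conservation query: counts grouped by flavor and role must match
# pairwise (reactant electrinos = product electrinos, and likewise for
# positrinos), exactly as the pseudo-SQL above asserts.
for flavor, role, n in con.execute(
        "SELECT flavor, role, COUNT(*) FROM network "
        "GROUP BY flavor, role ORDER BY flavor, role"):
    print(flavor, role, n)
```

A full cube would of course need far more columns (absolute position, energy partition, provenance links), but the relational shape of the queries carries over directly.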

When will this paradigm shift occur? By my timeline we are already 2.5 years behind schedule and the clock is ticking. We could have been out of the starting gate in mid-2018. Let’s consider the timeline.

• T0 : Mid 2018 : My confidence in NPQG skyrockets. My description is nascent. Physicists ignore, express skepticism, ridicule, and bully during this period. None engage productively.
• T1 : ?. NPQG goes viral. T1 >= January 2023. T1 – T0 > 4.5 years.
• T2 = T1 + 0.5 years : Scientists, universities, institutions, corporations, and investors take six months to wrap their minds around NPQG and realize the opportunities. Superscalar technology driven corporations will be much faster than others at realizing the potentials and making large strategic investments. I am thinking the likes of Tesla, Amazon, Google, Apple, Microsoft, and IBM will move very quickly.
• T3 = T1 + 1 year : Groups of well funded and well compensated people are working in teams to take NPQG research to the next levels and to pursue NPQG enabled technologies. New startups and reorganized departments and institutions, throughout academia and industry, spring up like mushrooms. Interestingly, each group can set reasonably aggressive yet achievable goals. With access to the source code of nature, we will have realistic confidence when assessing what it is going to take to reach each new level in knowledge and technology. NPQG is beginning to be taught in middle school, high school, and universities as educators have raced to produce new learning material and revise existing material.
• T4 = T1 + 3 years : Productivity of NPQG related research and technology development rises quickly as groups find their groove. The paradigm change has occurred, and new entrants to universities are coming in prepared with NPQG knowledge of nature, the universe, and the source code. To address demand for talent, online and university educators have developed certification programs for various skill tracks so that prior graduates, as well as those yet to be university educated, can develop NPQG skills, achieve certifications, and fill those roles.
• T5 = T1 + 5 years : By now many existing technologies have been improved with knowledge of NPQG. New technologies enabled by NPQG are popping up every day. The use cases rise as fast as or faster than AI deep neural networks did from 2015 to 2020. Deep neural networks will also have been applied to advance NPQG. Perfect timing.
• T6 = T1 + 10 years : After a decade the first large scale applications of NPQG are coming online. The first NPQG enabled 4D printers are in use in labs. New energy production technologies are emerging. The era of resource abundance, predicted several decades earlier, is well under way. There are many newly minted NPQG millionaires and some billionaires.

The main issue is when T1 occurs. I am hoping it is any day now. You’ll note that I have established a ten year timeline from T1. It is rather normal for technology development to take two or more decades. Witness fusion reactors: going on what, 50 years now? I firmly believe that with the specifications for the universe in hand, intelligent individuals can at least double the technology development pace of prior eras.

J Mark Morris : San Diego : California