Here is a conversation I had on the PBS Space Time Discord. As usual, the fanbois who frequent social media demonstrated a complete lack of creativity and imagination. I continue to find it to be either a lack of intellectual capability, or perhaps some kind of subterfuge, when these people claim they don't recognize the mapping from point charge theory to general relativity and quantum theory. Alas, this seems to be the common reaction on social media. I've mostly given up on trying to engage via sites like Reddit, Twitter, or Discord. Occasionally I venture forth to see if the winds of understanding have changed, but I have never failed to be disappointed in the people who claim to understand physics.
The conversation started with a link to an article: "Notorious dark-matter signal could be due to analysis error: Observations that physicists have so far failed to replicate could be the result of misinterpreted data."
I read the article as saying that the veracity of the signal is facing yet another challenge, while remaining in the same status of "uncertain finding." My sense, perhaps wishful, is that the rumblings of dissatisfaction with LCDM are getting louder and stronger. The next several years will be very interesting.
My posts are in this type of quote.
Fanboi : In other news, water is wet. That is always a problem in science… Misinterpreted data is always a possibility; it is not like it's something new. Basically "might be" is not gonna cut it; show exactly what is wrong and prove it to 5-sigma precision, then we are talking.
What I'm sensing is not so much a change in the first-order accepted physics and cosmology (unless the Big Bang were to be falsified by JWST); it seems more like a lot of second-order contender theories are shifting around like horses in a horse race. It feels like MOND is moving up. Dark matter is lagging a bit. Tensions in LCDM feel poised to increase significantly. That's sort of interesting, but still tragic given what I think is the real problem.
Fanboi : My general impression is that nobody likes LCDM, but it’s really unclear what the next step should be with a lot of alternatives, none of which actually work that well. This is not dissimilar to the GR/QM disagreements. LCDM will remain the status quo regardless of the problems until a truly better alternative arises. So IMO adding more problems to LCDM doesn’t really change anything; actually coming up with a better model which solves those problems without making things worse is the issue.
Fanboi : Yeah, it is like "we know there are problems, no need to point at them over and over again, we need solutions, not pointers."
Basically, knowing what the problem is doesn't magically give a solution to said problem.
Fanboi : Seemingly in relation to some of the past discussion here, I saw an article this morning that talks about some of the inconsistencies that JWST has revealed (though I suppose the mainstay of the article is rebuking the whole thing from the other week, where some articles were claiming JWST was "disproving" the Big Bang or something; I didn't give those too much attention).
Fanboi : All the fuss comes from an article that started with "Panic! […]" and, of course, Twitter.
I think it is quite amusing that the scientists aren't just fessing up and saying that JWST looks like it will present a number of technical challenges to the "early times" of LCDM. Instead we get over-reaction from the scientists themselves to first-pass observations, many of which will be culled. Cue the un-punny "Panic!" paper. Then, on cue, the science press goes nuts. Now we are in the phase where some scientists are kind of pissed off at the other scientists who published provocative quick memos, because that's kind of uncool in some ways. The part I find most horrifying is that one of the arguments in this current backlash phase is "oh, we just have to tweak some knobs and dials in our models." Yeah, duh. That's called curve fitting, and it's pseudo-science! So I think this is not helping scientists' image. It would be far better to say, plain as day, that it's wonderful JWST will present new challenges to our models, and that we'll follow those clues dispassionately, even if it means we have to turn our understanding upside-down and inside-out.
Fanboi : One Panic! At the Disco pun and everyone loses their minds!
Fanboi : The Lambda CDM model seems to be infinitely tuneable. Kind of like the Ptolemaic model a few hundred years ago, with its mathematical epicycles to account for every observed motion in the cosmos. Paul Steinhardt, who helped develop the LCDM model, has defected from it:
Fanboi : Not really… quintessence is infinitely tunable, by construction. We just don't have the data to warrant reaching into that… it's still non-applied math. The theoretical purpose of the inflation fixup to the Big Bang model was to avoid a "probability zero possible" situation (read: infer a Prime Mover from the cosmic microwave background radiation) by enabling thermal equilibrium to be reached "almost always." That rationale has not gone away.
Fanboi : Can it be that the LCDM problems with JWST early galaxies and the cosmological crisis are related? It seems like if the universe expanded a bit more slowly in the early years than we thought, that gives galaxies more time to form, and it might impact cosmological constant measurements.
Fanboi : Yes, they are intimately related. (LCDM is already "too flexible" to have real problems with JWST; what has problems is the pre-JWST reverse fits of LCDM.) But that's exactly the kind of problem Aristotle thought his comments on science were vulnerable to: he predicted that the instant devices became available that allowed seeing better than mere human senses, much of what he was writing would become obsolete.
Fanboi : I guess the real question is whether altering the fit to make the universe older, so all the mature early galaxies are covered, solves the cosmological crisis. One also wonders if this solves the issue of early black holes being too big?
That's tuning. It's like curve fitting with Fourier coefficients (see the sketch below).
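To make the analogy concrete, here is a minimal sketch of what "tuning Fourier coefficients" looks like. The data, the target curve, and the number of terms are all made up for illustration; the point is only that a generic basis with enough adjustable coefficients will track almost any smooth data set, whether or not the basis reflects the underlying mechanism.

```python
# Minimal illustration: fit arbitrary data with a truncated Fourier basis.
# The quality of the fit says nothing about whether the basis is "right".
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
y = np.exp(-3.0 * x) + 0.02 * rng.standard_normal(x.size)  # arbitrary "observations"

n_terms = 8  # the knob: more terms, better fit, not more meaning
basis = [np.ones_like(x)]
for k in range(1, n_terms + 1):
    basis.append(np.cos(2 * np.pi * k * x))
    basis.append(np.sin(2 * np.pi * k * x))
A = np.column_stack(basis)

coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)  # "tune" the coefficients
residual = np.max(np.abs(A @ coeffs - y))
print(f"{len(coeffs)} coefficients, max residual {residual:.3f}")
```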
Fanboi : All models involve some tuning.
NPQG doesn’t. Everything is empirically measurable.
What we are talking about for LCDM is not a minor adjustment.
Fanboi : I don't have a strong opinion here (a case of the relevant models not really being open source, so it's hard for someone without credentials to audit them without a total re-implementation). What is clear is that "very early" works great (Big Bang nucleosynthesis is numerically reverse-fittable to measurement accuracy here, 4-5 significant digits for all of hydrogen through boron) and that we never had great numerical modeling for the early universe (past z-factor 6); the models simply weren't coming up as large as even what was being seen with Hubble. JWST just made that numerical modeling even worse, and you just don't have a lot of flex to re-slow things (the thermal equilibrium fixup already requires a massive pulse of dark energy, and always has, even as far back as the 1960s when it was initially proposed).
Fanboi : Measuring it empirically and then setting the parameters to be what you measured is exactly tuning.
Fanboi : I mean, it is a possibility… There are inherent problems with cosmological models. We cannot do multiple experiments to see what happens, because we have only one universe; the models themselves only go so far, and they really depend on us getting the laws of physics right when we can't be sure that we did.
I don't think so. I'm talking about universal constants. One of them, the speed of the potential field, is constant everywhere. Two are large-scale constants: the density of unit potential point charges, and the density of energy carried by unit potential point charges. That's not tuning, that's simply empirical measurement.
Fanboi : The age of the universe is a constant, it’s just a question of difficulty measuring it.
In NPQG, there is no known age of the universe: no known beginning or end to time or space. No parameters here.
Fanboi : It doesn't really matter whether your model treats it as a parameter. Since you are complaining about tuning: tuning the age of the universe in LCDM to fit the data is no different from tuning the speed of the potential field to fit the data. You are simply saying that there are some values for which the model doesn't give a reason why they should be one thing or another, and you can only determine them via measurement.
The Standard Model has ~26 parameters? How many does LCDM have?
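(For reference, and noting that different analyses count differently and that extensions add more, the baseline LCDM fit reported by the Planck collaboration has six free parameters; quantities such as the age of the universe are derived from these rather than fitted separately.)

```latex
% The six base parameters of the Planck baseline LCDM fit;
% the age of the universe, H_0, etc. are derived quantities.
\{\ \Omega_b h^2,\quad \Omega_c h^2,\quad 100\,\theta_{\mathrm{MC}},\quad \tau,\quad \ln(10^{10} A_s),\quad n_s\ \}
```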
Fanboi : Yeah, but since your model is incredibly incomplete, I don’t think we can really count the number of parameters as a fair comparison.
Tuning and measuring seem to me to be two distinctly different concepts. NPQG is more complete than any other model, since it builds on the current patchwork math of the state-of-the-art accepted effective theories. It also eliminates all the incorrect narratives. More complete and more correct.
Fanboi : To be complete you would have to show that it actually reproduces known physics and experiments, which you are a very long way from doing, which I am not currently complaining about, but merely pointing out that it’s hard to assert your model’s superiority in such a case.
The state of the art models don’t even know what to make of the Planck scale! The state of the art models for the effective theories are based upon false priors!!
Fanboi : Immaterial.
Psshawww. ROFL.
Fanboi : I am not saying that your model will not be better at some point in the distant future, but as of right now though, it is very far from any comparison to the existing models.
It is better now, but since physicists are so lost, they can’t even comprehend that they are lost in la-la land of Woo.
Fanboi : It is only better once you can prove that it reproduces existing experiments, at least. In any case, this is getting beyond the scope of my original point. All models require some tuning; they have some parameters or details which the model does not predict or explain, and you simply measure the proper value. This is no different in nature here, although one could certainly wonder about the magnitude.
Fanboi Y : Besides all the talky-talky, here is what a proper paper and scientific work look like; Table 1 gives the best-fitted parameters, including the age of the universe, using Planck data. I know you two can't read it, but when talking about climbing a mountain, it may be humbling to be put at its base.
Dear Fanboi Y, I very much appreciate your participating, given that you are a professional. Not many professionals engage in dialogue with hobbyists, amateurs, or aspirants. Thank you. That said, although I understand that you cannot comprehend large-scale alternative narratives, I think you are unable to see that possibility due to false priors. Once those false priors are corrected, it will pull the narrative rug right out from under you and your colleagues.
Cosmologists are going to be thrown under the bus by particle physicists and run over back and forth a dozen times. It will be exhausting for cosmologists to repair their field, to self-diagnose how deluded they became, and to ask why the scientific method did not detect this era of confusion.
Particle physicists will be in a true quandary, because they will be handed a new model on a silver platter which they think is unbelievable, but which will turn out to be exactly what they have been seeking. It's going to be difficult for hard-core physicists to cross the chasm from unbelievable complexity to the land of a simple solution. The good news is that there is a gold mine at the end of the rainbow, and if they can make the transition quickly and smartly, the sky is the limit.
I imagine that sounds like a lot of hubris coming from one person, but I've been trying to engage your field since July 2018, and if its members weren't so stand-offish (a kind euphemism for the bullying I have endured), we would already have the fields converted over and be well into the next era.
In my ideal vision, there would be one paper announcing the basics of the next era, with opt-in co-authors from all extant professionals in the field. I don't even have to be an author, but I would appreciate acknowledgement as a reverse-engineering sleuth.
In my view, this whole thing is a tragedy. There were so many points along the way where the actual solution to nature was within reach, but it was not grasped due to interpretational confusion descending from false priors. So I truly feel sad that so much intelligence has been squandered on less productive endeavors in effective-theory land.
We would be so much farther along the Moore's Law-like curve of understanding nature (and the resulting technology $$$$) had the false-prior seeds not been planted circa 1900 and earlier, with Michelson-Morley. On the other hand, I often wonder if there were a few scientists circa 1905-1927 who divined the solution and somehow secreted it away for fear of what evil people could do with it. The era from 1900 to roughly 1970 was heavily influenced by evil and hawkish people. Even today there is cause to worry about what might result from the source code to nature, similar to the worries about evil uses of the advancements in DNA science, viz. CRISPR and its follow-ons.
Driving along the Maine coastline today, I was trying to differentiate between definitions of the term "tunable." Maybe there should be various gradations. I presume academics have studied this, but I haven't yet googled the topic. Perhaps we could use the concept of order? A zeroth-order tunable would be a fundamental constant that is not determined by anything other than empirical means. It is simply a fact of nature at a certain scale. Perhaps it is the limit that is approached in the fundamental dynamical geometry.
Example : The speed of the point charge potential emission determines the closest orbital radius of a dipole, which ties directly to the Planck length and Planck energy. A zeroth-order tunable is a foundation upon which we can build and derive nature from first principles. Another type of tunable is one that is a control point in a larger mathematical model. Similar to recent advances in AI, we may or may not have some level of understanding of the implementation of that tunable. Perhaps these are graded in terms of scale and degree of influence. These tunables may be both specific in a model and vague at a larger scale of understanding the implementation of nature. I'd leave it to the professionals to define this taxonomy. So those are my first primitive thoughts about tunables and why we might understand them differently.
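For orientation, these are the conventional definitions and approximate values of the Planck length and Planck energy referenced above; they are the standard expressions, not anything specific to the point charge model.

```latex
% Conventional definitions of the Planck length and Planck energy.
l_P = \sqrt{\frac{\hbar G}{c^3}} \approx 1.616 \times 10^{-35}\ \mathrm{m},
\qquad
E_P = \sqrt{\frac{\hbar c^5}{G}} \approx 1.956 \times 10^{9}\ \mathrm{J} \approx 1.22 \times 10^{19}\ \mathrm{GeV}
```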
Imagine logarithmic spikes of energy-matter density. If we map the universe, we see isolated peaks, often in galaxy centers, with smaller-scale peaks decreasing radially. Why did science invent a single big peak some 13.8 billion years ago? Why do we need that? What is the differential signal of the CMB in a quasi-stable universe with no known beginning or end in time or space? How does that differ from the Big Bang model? Is the solution that there is a density of distributed mini-bangs emitting what we perceive as the CMB? Each pixel in the CMB would have incredible depth, letting us see photons of extreme redshift in the microwave band. Imagine photons that begin life near the Planck scale energy. How far would they have traveled to be observable in the microwave band? (A rough number for the required redshift factor is sketched below.)
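As a back-of-the-envelope aid to that question, here is the bare energy ratio between a Planck-energy photon and a microwave-band photon. The 160 GHz reference frequency (near the CMB peak) is my choice for illustration, and the ratio says nothing about distance in any particular cosmology; it only indicates how large a cumulative redshift factor would be required.

```python
# Ratio of Planck photon energy to a microwave-band photon energy:
# a rough measure of the total redshift factor (1 + z) that would be
# needed, independent of any particular expansion history.
h = 6.626e-34        # Planck constant, J*s
E_planck = 1.956e9   # Planck energy, J

nu_microwave = 160e9                 # ~160 GHz, near the CMB peak (assumed reference)
E_microwave = h * nu_microwave       # photon energy in J

ratio = E_planck / E_microwave
print(f"Required redshift factor ~ {ratio:.2e}")   # roughly 1.8e31
```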
At the other end of the scale, physics doesn't seem to understand that this same spike in energy density occurs in the core of each standard matter particle. This is the strong force we perceive in Standard Model particles. Nature has simply used superposition in a clever survivor model that we call emergence. Orbiting dipoles are neutral. Their potential field alternates with frequency (energy). The point charge velocities in the dipoles approach c in nines: 99.999… Then we have the region where point charge velocity in the dipole exceeds field speed. We can surmise it still clicks off h-bar with increasing frequency, up to the Planck frequency, as the orbital radius decreases towards the Planck scale.
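To put a rough number on that last step, here is a purely kinematic sketch. Assuming, as the paragraph does, that the point charge speed in the dipole stays near c and the orbit is circular, the orbital frequency is f = v / (2*pi*r), so shrinking the radius toward the Planck length pushes the frequency toward the Planck frequency (within a factor of 2*pi). The example radii are arbitrary, and nothing here depends on the dynamics of the model itself.

```python
import math

c = 2.998e8           # speed of light, m/s
l_planck = 1.616e-35  # Planck length, m
f_planck = 1.855e43   # Planck frequency, Hz (1 / Planck time)

# Orbital frequency of a circular orbit at speed ~c, for radii shrinking toward l_P.
for r in (1e-18, 1e-25, 1e-30, l_planck):
    f = c / (2 * math.pi * r)
    print(f"r = {r:.3e} m  ->  f = {f:.3e} Hz  ({f / f_planck:.2e} of Planck frequency)")
```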
At this point the conversation fizzled out, presumably because the fanbois could not follow along with my descriptions of the point charge universe, or could not deal with the cognitive dissonance with their incorrect narratives. Alas. Sigh.
J Mark Morris : Boston : Massachusetts