Saturday, April 18, 2020

I have company

Finally We May Have a Path to the Fundamental Theory of Physics… and It’s Beautiful
But then, at our annual Summer School in 2019, there were two young physicists (Jonathan Gorard and Max Piskunov) who were like, “You just have to pursue this!” Physics had been my great passion when I was young, and in August 2019 I had a big birthday and realized that, yes, after all these years I really should see if I can make something work.
So—along with the two young physicists who’d encouraged me—I began in earnest in October 2019. It helped that—after a lifetime of developing them—we now had great computational tools. And it wasn’t long before we started finding what I might call “very interesting things”. We reproduced, more elegantly, what I had done in the 1990s. And from tiny, structureless rules out were coming space, time, relativity, gravity and hints of quantum mechanics.
For ten years I have been staring at Wolfram's description of Hurwitz theory and saying there is missing spectral theory. The key was to find the no-arbitrage solution, treating it like a finance problem. That meant keeping the round-off error close to white noise. That was the connection with Hurwitz and Markov. We are going to have a boatload of sandboxers.
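
To make the Hurwitz piece concrete: the textbook theorem says every irrational x has infinitely many convergents p/q with |x - p/q| < 1/(√5·q²), the golden ratio sits right at that boundary, and the Markov spectrum classifies the constants beyond √5. A minimal numeric check in Python; this is my illustration only, nothing here is from Wolfram:

```python
from fractions import Fraction
import math

def convergents(terms):
    """Yield the rational convergents p/q of a continued fraction."""
    p0, q0, p1, q1 = 1, 0, terms[0], 1
    for a in terms[1:]:
        p0, q0, p1, q1 = p1, q1, a * p1 + p0, a * q1 + q0
        yield Fraction(p1, q1)

phi = (1 + math.sqrt(5)) / 2            # golden ratio: all cf terms are 1
for pq in convergents([1] * 20):
    err = abs(phi - pq.numerator / pq.denominator)
    # Hurwitz bound: q^2 * |x - p/q| < 1/sqrt(5) ~ 0.447; phi is the worst case
    print(pq, pq.denominator ** 2 * err)
```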

Some notes for Steve:
We’ve talked several times about particles like electrons. In current physics theories, the various (truly) elementary particles—the quarks, the leptons (electron, muon, neutrinos, etc.), the gauge bosons, the Higgs—are all assumed to intrinsically be point particles, of zero size. In our models, that’s not how it works. The particles are all effectively “little lumps of space” that have various special properties.
A particle has the property of rolling cancellation of error updates: spin updates, especially, get cancelled before they are issued, and so often that the machine has mass.

And this raises an intriguing possibility. Perhaps the particles—like electrons—that we currently know about are the “big ones”. (With our estimates, an electron would have hypergraph elements in it.) And maybe there are some much smaller, and much lighter ones.

He missed an important point that I discovered. You do not need all the hypergraphs, because the N adjustment locks you in place. There is a specific point where Avogadro and Planck match. But you are on the right path.
there will usually be many places where this rule can be applied. So which update should we do first? The model doesn’t tell us.
Yes it does. The next update surfaces because the curvature issue makes the system look Gaussian. This is the disequilibrium of spin, and that disequilibrium spreads and causes error updates by the charge system.
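
To see the ambiguity Stephen is pointing at, here is a toy rewriting sketch in Python. The rule {{x,y}} -> {{x,z},{z,y}} is a stand-in for his hypergraph rules; my claimed selection criterion, spin disequilibrium picking the next update, is not in this code.

```python
def update_sites(hypergraph):
    """All places where the toy rule {{x,y}} -> {{x,z},{z,y}} could fire."""
    return [e for e in hypergraph if len(e) == 2]

def fire(hypergraph, edge, fresh):
    """Apply the rule at one chosen site, splicing in a fresh node z."""
    x, y = edge
    rest = [e for e in hypergraph if e != edge]
    return rest + [(x, fresh), (fresh, y)]

g = [(0, 1), (1, 2), (2, 0)]
print(update_sites(g))              # three equally valid choices of next update
print(fire(g, (1, 2), fresh=3))     # [(0, 1), (2, 0), (1, 3), (3, 2)]
```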

Look, it is a Huffman encoding graph, though he misses it. My discovery: we do not make this for whole integers, we just do this for the error updates. Spin disequilibrium allows us to do this.

He has some good stuff: he has figured out how to make this a graph problem and then count paths through the graph ("Good Will Hunting"). He misses that the graph is a compression framework, limited to solutions of Huffman codes. Those are a very specific kind of directed graph that keeps all the properties needed for quantization. Huffman codes do not have reverse paths, hence always no arbitrage when kept up to date. The decoder exists, and we can use it, but nature generally makes the coding part only. Huffman graphs are always entropy maximizing, the first and current best estimate.
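
The standard half of that claim is easy to demonstrate. A minimal Huffman construction in Python; the physics reading, that nature codes only the error updates, is mine and not in the code:

```python
import heapq
from collections import Counter

def huffman_code(freqs):
    """Build a Huffman code from symbol frequencies.

    The result is a prefix-free code: no codeword is a prefix of another,
    so decoding is a one-way walk down the tree with no backtracking
    (no "reverse paths").
    """
    # Heap entries are (weight, tiebreak, node); leaves are bare symbols.
    heap = [(w, i, sym) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        w1, _, left = heapq.heappop(heap)
        w2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, count, (left, right)))
        count += 1
    _, _, tree = heap[0]

    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):          # internal node: branch 0/1
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                                # leaf: assign the codeword
            codes[node] = prefix or "0"
    walk(tree, "")
    return codes

freqs = Counter("spin disequilibrium drives the error updates")
print(huffman_code(freqs))
```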

I did not invent the idea of seeing this as an information processor; that goes back to the 50s. My discovery was that Shannon described a general-purpose congestion manager, and I then found it can be expanded to a 3- and 5-color channel packer. I found the relationship between graph coloring and dimensionality.
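
The coloring half is textbook; the channel packer and the dimensionality link are my claims and are not in the code. A minimal greedy vertex coloring sketch:

```python
def greedy_coloring(adj):
    """Color vertices greedily: each takes the smallest color
    not already used by a colored neighbor."""
    colors = {}
    for v in sorted(adj, key=lambda u: -len(adj[u])):   # high degree first
        used = {colors[u] for u in adj[v] if u in colors}
        c = 0
        while c in used:
            c += 1
        colors[v] = c
    return colors

# An odd cycle needs 3 colors; an even cycle packs into 2.
triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
square = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
print(greedy_coloring(triangle))   # {0: 0, 1: 1, 2: 2}
print(greedy_coloring(square))     # two colors suffice
```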

But it is not physics, really. Formal physics is about making the universe fit the engineer's model. This problem is all about counting stuff, pure Planck beginning to end.

Spin again.
There is no perfect packing of a sphere. One can get it down to two choices and be disequilibrated. This disequilibrium drives the whole system until Planck cannot find its way. It is locked in by the need to match Avogadro. There is no magnetic field for spin; each disequilibrium is handled internally and locally. Hence one gets an N mismatch sooner rather than later, and the mass collects more vacuum particles, causing it to have charge cancellation and thus magnetic adjustment. It does this until it gets an N match.

Steve needs to look at the bipartite graph problem: how closely can two or three graphs be matched? Then he gets the abstract tree issue, making the system rounded. This is Huffman encoding, and this is what we see, an optimum triple-sided Shannon congested flow. The big bang model as we see it is just a march up the Markov tree until we get an N match. That new proof that is floating around used bipartite graphs to do the counting. That triggers the idea of the abstract tree; the bipartite graph process is what makes things round. And I managed to connect the making-round with Markov via the error updates, and it all fell into place. (A sketch of the standard matching step is below.)
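
The matching step itself is standard: maximum bipartite matching by augmenting paths (Kuhn's algorithm), where the size of the matching measures how closely the two sides line up. The abstract tree and rounding reading is mine:

```python
def max_bipartite_matching(adj, n_right):
    """Maximum bipartite matching by augmenting paths (Kuhn's algorithm).

    adj[u] lists the right-side vertices that left vertex u may pair with;
    the return value is the size of the largest matching.
    """
    match_right = [-1] * n_right        # right vertex -> matched left vertex

    def try_augment(u, seen):
        for v in adj[u]:
            if v in seen:
                continue
            seen.add(v)
            # v is free, or its current partner can be re-routed elsewhere.
            if match_right[v] == -1 or try_augment(match_right[v], seen):
                match_right[v] = u
                return True
        return False

    return sum(try_augment(u, set()) for u in range(len(adj)))

# Three left vertices competing for three right vertices.
adj = [[0, 1], [0], [1, 2]]
print(max_bipartite_matching(adj, 3))   # 3: a perfect matching exists
```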

Welcome to the club, Stephen.
