Monday, November 21, 2016

Like this orbital

Suppose your aggregate were a bunch of volume grid markers, and the grid markers adjusted to estimate two things at once: a Pi-like curvature that keeps bit error spread laterally, and a Phi-like conservation rule that estimates a center. The co-estimators cannot perfectly correlate; one gets an adjustment that makes a better Pi, and the other trades that away to make a better Phi. They get stuck, sharing bit error. No force interaction, just bit error with no empty space: a local and semi-repeatable traveling probability bulge in the elements of vacuum.
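A toy sketch of that deadlock, under my own assumptions (the name `co_estimate`, the quantized grid, and the half-step adjustment rule are all made up here): two coupled estimators share one integer-quantized state, each nudging it toward its own target, and neither target is ever hit exactly.

```python
# Hypothetical sketch: two co-estimators share one quantized state.
# The "Pi" step nudges the state toward a curvature target, the "Phi"
# step trades some of that back toward a center target. Because the
# state snaps to a whole-number grid, the residual error never goes to
# zero; it just gets shared between the two estimators.

def co_estimate(pi_target, phi_target, steps=50):
    state = 0.0
    for _ in range(steps):
        # Pi-like adjustment, rounded onto the grid
        state = round(state + 0.5 * (pi_target - state))
        # Phi-like adjustment trades part of that away
        state = round(state + 0.5 * (phi_target - state))
    # Neither target is matched exactly: both keep a share of the error
    return state, abs(state - pi_target), abs(state - phi_target)
```

Run it with Pi-ish and Phi-ish targets and the state settles into a short cycle with both residuals nonzero: stuck, sharing bit error.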

Now, understand quantum physics as if the vacuum process were a snippet from some Python geek. It would be the Feynman diagram, the individual singleton ask/bid trade. Then that Feynman diagram gets busy and spawns: high energy makes bit error, and the bit error is converted into order by Mr. Feynman.
Pit.boss = FeynmanOperator. And as usual, on an accumulation of bit error, a new Feynman operator is deployed.
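Spelling out that geek's snippet as a minimal sketch (the classes `FeynmanOperator` and `Pit` and the threshold constant are my own stand-ins, not anything established): each operator handles singleton ask/bid trades, and when its accumulated bit error crosses a threshold, the pit deploys a fresh operator.

```python
# Hypothetical sketch of the "Pit.boss = FeynmanOperator" picture.

class FeynmanOperator:
    def __init__(self):
        self.bit_error = 0.0

    def trade(self, ask, bid):
        # One singleton ask/bid trade; the mismatch is kept as bit error.
        self.bit_error += abs(ask - bid)
        return (ask + bid) / 2

class Pit:
    THRESHOLD = 1.0  # made-up bit-error budget per operator

    def __init__(self):
        self.boss = FeynmanOperator()
        self.operators = [self.boss]

    def trade(self, ask, bid):
        price = self.boss.trade(ask, bid)
        if self.boss.bit_error >= self.THRESHOLD:
            # On an accumulation of bit error, deploy a new operator.
            self.boss = FeynmanOperator()
            self.operators.append(self.boss)
        return price
```

High-mismatch (high-energy) trades pile up bit error fast, so they spawn operators fast; quiet trades leave the one diagram alone.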

And the ask and bid are posi/anti annihilators. So, unlike the Wythoff game, or an infinite-precision blockchain, the system keeps only a finite past memory; we have Planck's disease.
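The annihilation plus finite memory can be sketched in a few lines (the `MEMORY` bound and unit-sized orders are my simplifications): an opposite order cancels a resting one, and anything older than the memory bound is simply forgotten rather than archived.

```python
from collections import deque

MEMORY = 8  # made-up stand-in for the Planck-like bound on past memory

def post(book, side):
    """Post a unit ask or bid; an opposite resting order annihilates it."""
    if book and book[-1] != side:
        book.pop()                 # posi/anti pair annihilates
    else:
        book.append(side)
        if len(book) > MEMORY:
            book.popleft()         # oldest history is forgotten, not archived
    return book
```

A blockchain would keep every matched pair forever; this book cannot, which is exactly the disease.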

The core of the trading pit is this very simple nested block structure, optimized for very fast descent and collection. That makes it a very nice tool for doing Monte Carlo synthesis when you know the probability graph, the Huffman graph of the sequence. Basically, dF and F have been gridded out to make probability convolutions very fast, defaulting to array processing for a list of singletons.
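Here is what that fast descent looks like as a sketch, with a made-up nested tuple standing in for the nested block structure: when the probability graph is a Huffman-style binary tree, drawing a Monte Carlo sample is just a root-to-leaf walk, one fair bit per step, so likely symbols sit at shallow leaves.

```python
import random

# Hypothetical nested block structure: (left, right) tuples, leaves are
# symbols. Depth encodes dyadic probability: p(a)=1/2, p(b)=1/4,
# p(c)=p(d)=1/8.
tree = ('a', ('b', ('c', 'd')))

def sample(tree, rng=random):
    """Descend the Huffman graph with one random bit per level."""
    node = tree
    while isinstance(node, tuple):
        node = node[rng.randrange(2)]
    return node
```

Because the probabilities are gridded onto powers of two, no arithmetic happens during the draw at all; it is pure descent and collection.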

But it is made fast because it is entropy maximizing, so it is optimally reducing redundant cycles on the processor. One says, hmm, maybe those atomic orbitals are the natural result of limited dimensionality in bubble overlap. Bubble overlap selects the available prime segmentations the aggregate can use for bit error dispersal.

Computational quantum wells

Looking at the computational task, and considering the multiplies as integer approximated, the computer geek says: sure, when my bit error is bell shaped and the mean mostly looks like variance, then the integral approximations will be accurate. He will have the whole chain working the proper mantissa and exponent, so to speak. But the geek gets a quantum well in that he is only using a small set of the available grid sizes; the Huffman graph is implied in that.
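A minimal sketch of those integer-approximated multiplies, assuming a single made-up grid step (`GRID` is mine, one of the geek's small set of grid sizes): every operand and every product gets snapped to the grid, so each multiply picks up a small, roughly bell-shaped rounding error, and the summed result stays close to exact.

```python
GRID = 1 / 256   # hypothetical grid step: a fixed exponent, integer mantissa

def q(x):
    """Snap a value onto the integer grid."""
    return round(x / GRID) * GRID

def qdot(xs, ys):
    """Dot product with every operand and every multiply quantized."""
    return sum(q(q(x) * q(y)) for x, y in zip(xs, ys))
```

With the errors centered on zero they mostly cancel in the sum; the well appears only because the geek never switches to a finer or coarser grid when the data would prefer one.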

Or, the system is working in some cloud on that Wythoff chart, some set of positions away from center. And in the cloud, the process is to find a center and a curvature. What? Making the trade between mantissa and exponent.
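That mantissa/exponent trade can be made concrete with a toy encoding of my own devising (the `BITS` budget, the candidate exponent range, and the helper names are all assumptions): the cloud shares one exponent, each point keeps a small integer mantissa, and a bigger exponent buys range at the cost of precision while a smaller one buys precision at the cost of clipping.

```python
BITS = 4  # made-up mantissa budget: signed integers in [-8, 7]

def encode(values, exp):
    """Shared exponent, per-value integer mantissa, clamped to the budget."""
    scale = 2.0 ** exp
    lo, hi = -(2 ** (BITS - 1)), 2 ** (BITS - 1) - 1
    return [max(lo, min(hi, round(v / scale))) for v in values]

def decode(mantissas, exp):
    scale = 2.0 ** exp
    return [m * scale for m in mantissas]

def best_exponent(values, exps):
    """Find the center/curvature trade: the exponent with least total error."""
    def err(e):
        coded = decode(encode(values, e), e)
        return sum(abs(v, ) - 0 if False else abs(v - w) for v, w in zip(values, coded))
    return min(exps, key=err)
```

Too small an exponent and the far points clip; too large and the near points blur. Finding the center and curvature of the cloud is exactly picking the exponent in the middle.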
