At any point in the universe, the vacuum only operates where there is no simultaneity. Where there is simultaneity, Pauli would be broken, so the vacuum simply treats the simultaneous samples as one. How does it do this? Simple: the two simultaneous samples are left as they are, and eventually the ordering process evacuates nearby samples, leaving the two that break Pauli alone where they are not a bother. Not a bother, that is, until the compaction process again crowds them with enough variation that the vacuum once more sees them as different.
Wave motion operates on a simple ordering process. On the wave front, the few samples in motion are exchanged, in order of phase reduction, with the next sample that reduces phase; that is all we need.
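A minimal sketch of that exchange rule, with plain value order standing in for the phase measure (my assumption, not something the post specifies):

```python
def exchange_pass(samples):
    """One ordering pass over the wave front: swap adjacent samples
    whenever the exchange reduces phase (modeled here as value order)."""
    out = list(samples)
    for i in range(len(out) - 1):
        if out[i] > out[i + 1]:  # an exchange here reduces phase
            out[i], out[i + 1] = out[i + 1], out[i]
    return out
```

One pass carries the largest out-of-place sample one step along the front, which is the wave-front motion described above.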
I think this gets us the big bang process. The original disturbance is an unordered, unsampled disturbance. The vacuum collects signals up to the Nyquist noise level, making all samples distinguishable.
OK, the big bang starts with a disturbance having some unordered grouping by density. The vacuum initially treats this as a simple reordering problem without simultaneity. But the sampler assumes no simultaneity from any angle, and attacks the problem from many different angles, two samples at a time, until it reaches a pair for which no exchange minimizes phase.
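The "two at a time until no exchange minimizes phase" step can be sketched as exchange passes repeated to a fixpoint (again with value order standing in for phase, an assumption of mine):

```python
def sort_until_stable(samples):
    """Repeat pairwise exchange passes until no exchange reduces
    phase, i.e. until the sampler finds no pair left to swap."""
    out = list(samples)
    swapped = True
    while swapped:
        swapped = False
        for i in range(len(out) - 1):
            if out[i] > out[i + 1]:
                out[i], out[i + 1] = out[i + 1], out[i]
                swapped = True
    return out
```

The process halts exactly when every adjacent pair is either ordered or indistinguishable, which is the stopping condition described above.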
Dense regions get sequential samples like -1,-1,-1,0,0,+2,+2, and so on, always with a slight imbalance. A pair of -1s is not separable, and the sorting process simply gets hung switching between the two, but the -1,0,+1 sequences get sorted away. Finally we have a batch of similar numbers, the size of the batch being the gradient level, or density. The disturbance is sorted by density: the number of indistinguishable samples together is the batch density, and that gets quantized in the sort process. Density regions become optimally separated, the gradient quantized into batch sizes, which are integers. They will be numbered 1, 2, 3, 5, 8; a Fibonacci sequence, I think, as the process completes.
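The batching above, runs of indistinguishable samples left together after sorting, can be counted like this (a sketch under my assumptions; the batch sizes are whatever the input produces, not necessarily Fibonacci):

```python
from itertools import groupby

def batch_densities(samples):
    """Sort the samples, then measure each run of indistinguishable
    (equal) values; the run length is that region's batch density."""
    return [(value, len(list(run))) for value, run in groupby(sorted(samples))]
```

For example, batch_densities([-1, 0, -1, 2, -1, 0, 2]) returns [(-1, 3), (0, 2), (2, 2)]: three density regions with integer batch sizes.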
The redshift phenomenon is simply the batching process: long waves have finer granularity, and the vacuum is increasing sparsity in free space at the expense of density elsewhere, by the sorting process. All wave action is the sorting process. Quantization points, like electrons or magnetrons, are simply density peaks that have established the region's batching level.
The batching level explains why light is energy. The higher frequencies need more sample pairs, each sample pair requires its own sampler, and sampling at the Nyquist rate is energy.
If you follow this approach, you end up with the group theory behind Hilbert spaces. Each 'batching' level becomes a
Let's assume separable groups and estimate the order of the electron and the nucleus. The electron has a mass approximately 1/1836 that of the proton.
Note the overlap between orders; that means the groups are not closed. That is wavelength to me, and it corresponds to quantization levels. The Nyquist gets three levels. Taken two at a time, and assuming closed groups, I get five levels. Each pairwise step increases the number of states by 2. But I can mix and match the three-group and the five-group, no?
So 3, 3*5, 3*5*7, 3*5*7*9, 3*5*7*9*11, 3*5*7*9*11*13 gets me:
3; 15; 105; 945; 10,395; 135,135. These numbers are an overestimate, but I get that the electron is order five, the proton order four. True? We have:
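The arithmetic above is just the cumulative product of the odd numbers, easy to check (the tie to the 1836 mass ratio is my reading of the claim, not a derivation):

```python
from itertools import accumulate
from operator import mul

# Cumulative products of the odd numbers 3, 5, 7, 9, 11, 13.
orders = list(accumulate(range(3, 15, 2), mul))
print(orders)  # [3, 15, 105, 945, 10395, 135135]

# The proton/electron mass ratio (~1836) falls between the fourth
# entry (945) and the fifth (10395), which is how I read the claim
# that the electron is order five and the proton order four.
```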
Bosons, hadrons, fermions. Notice the overlap? That is what I mean when I say a wave won't become simultaneous until it jumps two orders.
Two missing orders, if I calculated right. My numbers are likely off, but then maybe there is dark subatomic mass? Where am I going wrong?