Our model of the vacuum is that of an optimum sampler, congested by Pauli exclusion. It samples the disturbance with increasing accuracy but at a bandwidth-limited sample rate. In other words, a particular bit of disturbance is first sampled by the higher-frequency quantizers, but the lower-frequency quantizers start measuring it too; the order of the Huffman samples is infinite, starting at the band limit and descending through ever slower quantizers, each quantizer having its own 'field'.
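As a toy illustration only, the Python sketch below stands in for the quantizer tower with a Haar-style octave filter bank: the fastest field sits at the band limit, each slower field measures what the faster ones hand down, and the energy held across all the fields accounts exactly for the energy of the disturbance. Every rate, the number of fields and the signal itself are invented for the sketch; it is not a claim about the physics.

    import numpy as np

    # Toy stand-in for the tower of quantizers described above. The
    # fastest 'field' sits at the band limit; each slower field measures
    # the half-rate remainder handed down by the faster ones. All values
    # here are invented purely for illustration.
    rng = np.random.default_rng(0)
    disturbance = rng.standard_normal(1024)   # the undimensioned 'disturbance'

    n_fields = 6                               # truncate the infinite tower
    remaining = disturbance
    fields = []
    for k in range(n_fields):
        # one octave: the detail stays in field k, the half-rate average
        # is passed down for the slower fields to measure
        detail = (remaining[0::2] - remaining[1::2]) / np.sqrt(2)
        remaining = (remaining[0::2] + remaining[1::2]) / np.sqrt(2)
        fields.append(detail)

    total_in = np.sum(disturbance ** 2)
    total_out = sum(np.sum(f ** 2) for f in fields) + np.sum(remaining ** 2)
    for k, f in enumerate(fields):
        print(f"field {k} (rate 1/2^{k + 1}): energy {np.sum(f ** 2):7.2f}")
    print(f"energy in {total_in:.2f} == energy across fields {total_out:.2f}")

Because the octave split is orthonormal, the per-field energies always sum back to the input energy, which is the sense in which nothing is lost as the disturbance is passed down the tower.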
In this model, all the disturbance is undimensioned and immediate; there is no light travelling. When we 'fire' a photon, it is simply sampled more and more in the lower modes and appears to us, through the fake variables, to be moving away. It is not: the increasing dimness is just that disturbance being captured in the lower-frequency modes. Black holes, galaxies and electrons are all coexistent.
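A second toy sketch of the 'fired' photon: its energy starts entirely in the band-limit field and, tick by tick, leaks into ever slower fields. The apparent brightness (the energy left in the fastest field) falls away as if the photon were receding, while the total never changes. The leak rate and tick count are arbitrary choices, purely for illustration.

    # Toy sketch: a 'photon' never travels, its energy just migrates from
    # the band-limit field into ever slower fields, and what we read as
    # recession is the dimming of the fast field. All numbers are arbitrary.
    n_fields, ticks, leak = 6, 10, 0.5
    energy = [1.0] + [0.0] * (n_fields - 1)   # photon starts fully in field 0

    for t in range(ticks):
        # each tick, a fraction of every field's energy is recaptured by
        # the next slower field; the total is never lost
        for k in range(n_fields - 2, -1, -1):
            moved = leak * energy[k]
            energy[k] -= moved
            energy[k + 1] += moved
        print(f"tick {t}: apparent brightness {energy[0]:.4f}, "
              f"total {sum(energy):.4f}")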
Is this a consistent model? A black hole seems to have an event horizon, which we notice as a 15% variation. It is really light being modelled as part of an unknown, long-wavelength field, and the hole is just the sampler doing its normal quantization with longer-wavelength particles. The set of fields is infinite: energy is never lost, just captured in the low-frequency modes, and the disturbance is described more and more accurately. Gravity is just the next long-wavelength mode up, just below dark matter.
Now, all that particles require is that the encoder be a minimum-volatility sampler; particles and Pauli exclusion fall out from that.