The problem with quantum physics is that it implies the vacuum is a single 16-bit computer spanning the vastness. A computer is a congested system, optimally designed so that only one of its significant bits changes per clock pulse. Such a computer cannot count the 4th bit and the 8th bit at the same time. That is also what quantum physics specifies, which is why the idea that the universe is a simulation keeps coming up.
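The one-significant-bit-per-clock constraint is exactly what a Gray-code counter does: successive counts differ in a single bit, so the 4th and 8th bits never flip together. A minimal sketch in Python, as my own illustration of that constraint (the `gray` function is my example, not part of the analysis):

```python
def gray(n: int) -> int:
    """Return the n-th code of a 16-bit Gray-code counter."""
    return (n ^ (n >> 1)) & 0xFFFF

# Check the congestion property: each clock tick changes exactly one bit.
for n in range(2**16 - 1):
    changed_bits = gray(n) ^ gray(n + 1)
    assert bin(changed_bits).count("1") == 1
```

So a "vacuum computer" of this kind can still count through all 2^16 states; it just cannot update two distant bits in the same tick.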
This is implied by my analysis, here and here. I don't know the answer; I just built the model because I know what an optimum, congested system in a bandwidth-limited space should look like.
But if the speed of light is constant everywhere, one possibility is multiprocessing. Light has the same speed everywhere, but its phase shifts in the vacuum sampler. In that case the vastness is a multiprocessing system of identical computers, and light should show random phase shifts.
The other possibility is that the vacuum has infinite precision but the same clock rate and the same natural uncertainty. This implies that disturbances will increasingly be counted as larger and larger particles with properties yet unknown. The set of fields is infinite, but optimally separated. In this model, space disturbances slow way down but get larger and larger. Black holes are just huge particles and fields, unknowable by humans who live in a vacuum where disturbances are short-lived and particles smaller.
Physicists need to do some more work.