Light must behave as if it consists of particles to explain low-intensity Compton scattering. Compton's experiment convinced physicists that light can behave as a stream of particle-like objects (quanta) whose energy is proportional to the frequency. It is not either-or but both-and. The key is that the vacuum has both nulls and phase, and both are involved in matter and in waves; the difference is which one is the whole part and which is the fractional. Waves count whole phase quants and partial null quants; mass counts whole null quants and partial phase quants.
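For reference, the two textbook relations the paragraph leans on are the photon energy E = hf and the Compton wavelength shift. Here is a minimal numerical sketch using the standard SI constants; none of the values or function names come from the post itself, they are only there to pin down the "energy proportional to frequency" claim.

```python
from math import cos, pi

# Standard quantum relations behind Compton scattering (SI units).
h = 6.62607015e-34      # Planck constant, J*s
c = 2.99792458e8        # speed of light, m/s
m_e = 9.1093837015e-31  # electron rest mass, kg

def photon_energy(frequency_hz):
    """Energy of one light quantum: E = h * f."""
    return h * frequency_hz

def compton_shift(theta_rad):
    """Wavelength shift of a photon scattered off a free electron:
    delta_lambda = (h / (m_e * c)) * (1 - cos(theta))."""
    return (h / (m_e * c)) * (1.0 - cos(theta_rad))

# Example: green light (~5.6e14 Hz) carries ~2.3 eV per quantum, and a
# photon scattered through 90 degrees shifts by one Compton wavelength,
# about 2.43e-12 m.
print(photon_energy(5.6e14) / 1.602176634e-19, "eV")
print(compton_shift(pi / 2), "m")
```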
Now, one might say that I just played another trick, converting everything to sample space. But it matters not: there is an equivalence between sample space and elastic space, as long as the sample space is smaller than any observable effect.
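If that equivalence claim is read as a sampling statement, the closest standard result is the Nyquist-Shannon theorem: a band-limited signal sampled finely enough can be reconstructed exactly from its samples. That reading is mine, not the post's; the sketch below is only a toy check of that standard result, with made-up signal and rate values.

```python
import numpy as np

# Toy check: a continuous ("elastic") band-limited signal and its discrete
# samples carry the same information, provided the samples are finer than
# the finest feature present (Nyquist condition fs > 2 * f_max).
f_max = 3.0                      # highest frequency in the signal, Hz
fs = 8.0                         # sample rate, Hz
T = 1.0 / fs

def signal(t):
    # Band-limited "observable": nothing varies faster than f_max.
    return np.sin(2 * np.pi * 1.0 * t) + 0.5 * np.cos(2 * np.pi * f_max * t)

n = np.arange(-200, 201)         # finite window of samples
samples = signal(n * T)

def reconstruct(t):
    # Whittaker-Shannon interpolation: a sinc kernel at every sample point.
    return np.sum(samples * np.sinc((t - n * T) / T))

t_test = np.linspace(-1.0, 1.0, 50)
err = max(abs(reconstruct(t) - signal(t)) for t in t_test)
print("max reconstruction error:", err)   # small; shrinks as the window grows
```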
Folks who believe in elastic space have a very serious problem: deriving an elastic computation of the sine function. Once you believe in elastic space, you will spend your life trying to derive an elastic integral for that function. String theory runs into this problem, as does general relativity; each has to add dimensionality to some kernel as physicists get more accurate. Quantum theory has a simpler solution: as physicists get more accurate, it must be that a better set of quants is being executed by nature, so they just rescale. All that sample data space (and the associated group theory) does is add a uniform method of rescaling to normalize the separation of potential and kinetic energy. We get signal-to-noise instead of a Hamiltonian, and that includes tunneling and just about everything else.
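One toy way to make the sine contrast concrete, under my own reading of the paragraph: the continuous route is an infinite series you can only ever truncate, while a quantized route stores a finite table of values and, when more accuracy is wanted, simply rescales to a finer table. The function names and sizes below are illustrative assumptions, not anything defined in the post.

```python
from math import sin, pi, factorial

def sine_series(x, terms):
    """"Elastic" route: truncated Taylor series sum_k (-1)^k x^(2k+1)/(2k+1)!,
    exact only in the limit of infinitely many terms."""
    return sum((-1) ** k * x ** (2 * k + 1) / factorial(2 * k + 1)
               for k in range(terms))

def sine_table(x, quants):
    """"Sampled" route: a finite table of `quants` values over one period,
    read out with linear interpolation. More accuracy means rescaling to a
    finer table; no new analysis is needed."""
    x = x % (2 * pi)
    step = 2 * pi / quants
    i = int(x / step)
    t = (x - i * step) / step
    lo = sin(i * step)          # in practice the table would be precomputed
    hi = sin((i + 1) * step)
    return lo + t * (hi - lo)

x = 1.2345
print(sin(x), sine_series(x, 6), sine_table(x, 1024))
# Doubling `quants` roughly quarters the table error (it scales with the
# square of the quant size), which is the "just rescale" step.
```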