The optical approximation tells us that we are looking at a star from near infinity, so we can ignore beam spread. But maximum entropy tells us that the flux density is lower, hence the signal-to-noise ratio is lower, so the light must be delivered in smaller quanta. Then Compton tells us that longer wavelengths come with smaller quanta. So how does general relativity correct for beam spread when spacetime is expanding? Our sample bandwidth is fixed, the energy is reduced, and frequency is energy. I do not get it. I do not need general relativity to tell me that wavelength increases with beam spread.
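For reference, a minimal set of the textbook relations being juggled here (standard physics; the symbols L for source luminosity, d for distance, and z for redshift are mine, added for clarity):

```
E = h\nu = \frac{hc}{\lambda},
\qquad
S = \frac{L}{4\pi d^{2}},
\qquad
1 + z = \frac{\lambda_{\mathrm{obs}}}{\lambda_{\mathrm{emit}}}
```

The first says frequency is energy (per quantum), the second is plain geometric beam spread, and the third is the cosmological stretch that general relativity adds on top.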
When light arrives at Earth, it is absorbed by atoms using the quantum sizes it carries. If it needs to drop the quantum size to match the SNR, then the atoms will drop the quantum size. If the noise of free space is also quantized in the same ratios, then it too will drop its quantum sizes to meet the SNR.
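A toy numeric sketch of the SNR side of this, assuming nothing beyond photon shot noise (N counts give signal N and noise sqrt(N)); the numbers and function names are illustrative, not from any real instrument:

```python
import math

H = 6.626e-34  # Planck constant, J*s
C = 2.998e8    # speed of light, m/s

def photon_energy(wavelength_m):
    # One quantum at this wavelength: E = h*c/lambda
    return H * C / wavelength_m

def shot_noise_snr(flux_w_m2, area_m2, t_s, wavelength_m):
    # Photon-counting detector: N counts give signal N, noise sqrt(N),
    # so SNR = sqrt(N). Lower flux density -> fewer quanta -> lower SNR.
    n = flux_w_m2 * area_m2 * t_s / photon_energy(wavelength_m)
    return math.sqrt(n)

# Doubling the distance quarters the flux (beam spread), halving the SNR:
near = shot_noise_snr(1e-12,     area_m2=1.0, t_s=1.0, wavelength_m=500e-9)
far  = shot_noise_snr(1e-12 / 4, area_m2=1.0, t_s=1.0, wavelength_m=500e-9)
print(near / far)  # ~2.0
```

Note this only captures the count statistics; it says nothing about the wavelength itself changing, which is exactly the gap the paragraph above is pointing at.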
I looked all over Wikipedia and could not find even a classical explanation relating beam spread to frequency drop, even though classical physics has a frequency-energy equivalence.
I am not working this one yet. But the usual correction Einstein would want would be on the order of 3/2 in wavelength, and that is not bad for the vacuum over a 20-billion-year path. Good job!
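For the record, my arithmetic on that factor, using the standard redshift definition (whether a 20-billion-year path actually yields this is the estimate above, not something derived here):

```
z \;=\; \frac{\lambda_{\mathrm{obs}}}{\lambda_{\mathrm{emit}}} - 1 \;=\; \frac{3}{2} - 1 \;=\; \frac{1}{2}
```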