Friday, July 25, 2014

Jaynes and the principle of maximum entropy

Edwin T. Jaynes
Central to the MaxEnt thesis is the principle of maximum entropy. It takes as given a partly specified model and some specified data related to that model, and it selects a preferred probability distribution to represent the model: the one with the greatest entropy that is still consistent with the data.
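
As a concrete illustration (my own toy example, not Jaynes' wording): the recipe is to maximize the Shannon entropy subject to whatever constraints the data impose. For a six-sided die whose only known property is an average roll of 4.5, a minimal Python sketch looks like this:

    # A minimal sketch of the maximum-entropy recipe for a six-sided die
    # (hypothetical numbers): we know only that the average roll is 4.5,
    # and we want the least-biased distribution consistent with that.
    import numpy as np
    from scipy.optimize import minimize

    faces = np.arange(1, 7)          # the partly specified model: outcomes 1..6
    target_mean = 4.5                # the specified data: an observed average

    def neg_entropy(p):
        # Shannon entropy H = -sum p log p; we minimize its negative.
        p = np.clip(p, 1e-12, 1.0)
        return np.sum(p * np.log(p))

    constraints = [
        {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},                # normalization
        {"type": "eq", "fun": lambda p: np.dot(p, faces) - target_mean}, # mean constraint
    ]

    p0 = np.full(6, 1.0 / 6.0)       # start from the uniform distribution
    result = minimize(neg_entropy, p0, bounds=[(0, 1)] * 6,
                      constraints=constraints)

    print(result.x)   # an exponential (Gibbs-like) distribution tilted toward 6

The point of the sketch is only that the preferred distribution falls out of the constraints; nothing beyond the mean is assumed.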

Systems evolve to minimize motion, and the result is always maximum entropy, or minimum redundancy. Is this true?
Take the entropy of a sequence of disturbances, H = -sum p·log p. The (negative) log of a disturbance's probability tells us how many quants the disturbance holds, and the probability tells us the relative frequency of that disturbance. So the log is the number of additive steps needed to disperse the event, and the probability is how often those steps are used. When the sum is maximized, the probability of collisions between events is minimized.
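
One way to read that last claim (my reading, not the original text): if the chance of two independent draws landing on the same event is sum p^2, then the distribution that maximizes entropy, the uniform one, is also the one with the fewest such collisions. A quick numerical check:

    # A small check of the claim above (an illustration): the "collision
    # probability" of drawing the same event twice is sum(p^2), and the
    # distribution that maximizes entropy (the uniform one) also minimizes it.
    import numpy as np

    def entropy(p):
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        return -np.sum(p * np.log2(p))   # in bits: -sum p log2 p

    def collision_prob(p):
        return float(np.sum(np.asarray(p, dtype=float) ** 2))

    for p in ([0.25, 0.25, 0.25, 0.25],   # uniform: maximum entropy
              [0.70, 0.10, 0.10, 0.10],   # skewed
              [0.97, 0.01, 0.01, 0.01]):  # nearly deterministic
        print(f"H = {entropy(p):.3f} bits, collision prob = {collision_prob(p):.3f}")

The uniform case gives 2 bits and a collision probability of 0.25; the more concentrated the distribution, the lower the entropy and the higher the collision probability.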

Collisions

The system, by virtue of the Higgs field, maintains S/N ratios as agglomerations of the Higgs field. When collisions happen in physics, the resulting collision noise multiplies through the agglomerations and reduces the total quant number N. This restores maximum entropy, and the side lobes of the collision spectrum are lost as quantization noise. The Higgs spectrum is malleable, though finite. The log function is continually maintained.

So Jaynes is right once we understand that the fundamental quants, the constituents of the vacuum, obey the principle of minimum redundancy in their actions.
