Understanding this equivalence lets us refer back to the optimum congestion Huffman encoder, where it all started. Maximum entropy encoding is minimum redundancy coding. The idea is to assign quants to levels of redundancy, and you can see that in the Huffman encoder. Once we have that, I no longer have to hand wave the Nyquist vacuum samples exchanging phase values.
The optimum congestion (Pauli) encoder is a Huffman encoder that assumes the sequence will ultimately be represented as a single code looking like 2**N-1, all ones, in binary. We are not binary but (3/2)-nary. Redundancy is a bundle of dense phase energy which needs to be counted (compacted in place).
In binary terms the Universe looks like, in sequence:
Order        Code   Remaining energy
1            1      1/2 of total
1 and 2      11     1/4 of total
1, 2 and 3   111    1/8 of total
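The table above is just the dyadic case of Huffman coding: for symbol probabilities 1/2, 1/4, 1/8, ... the optimal code lengths are exactly 1, 2, 3, ... bits, which is where the 1, 11, 111 pattern comes from. A minimal sketch of the standard Huffman construction checks this (the function name and the trailing 1/8 symbol that closes the probability sum are my own choices, not from the post):

```python
import heapq
from itertools import count

def huffman_code_lengths(probs):
    """Build a Huffman tree and return the code length of each symbol.

    For a dyadic distribution (1/2, 1/4, 1/8, ...) the optimal
    lengths are exactly -log2(p): codes shaped like 1, 11, 111, ...
    """
    tiebreak = count()  # keeps heap entries comparable on probability ties
    # Each heap entry: (probability, tiebreaker, symbols under this node)
    heap = [(p, next(tiebreak), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for sym in s1 + s2:  # every merge adds one bit to these codes
            lengths[sym] += 1
        heapq.heappush(heap, (p1 + p2, next(tiebreak), s1 + s2))
    return lengths

# Orders 1, 2, 3 carry probability 1/2, 1/4, 1/8;
# a second 1/8 symbol is added so the probabilities sum to 1.
print(huffman_code_lengths([0.5, 0.25, 0.125, 0.125]))  # [1, 2, 3, 3]
```

The deeper an order sits, the longer its code and the less energy remains for it, exactly as in the table.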
Where do these codes appear? Right in place: each 1 is matter. A 111 is three matters together at Pauli separation. That is what is happening, except the Nyquist uses the asymmetric counting system, -1,0,+2, and 0,00,000 are the matters.
It will miss: it will find regions where the Pauli does not obey, where the next level of encoding will not fit loose energy into a unit of matter. Those are the compromises it makes, and it will squeeze back on matter already created, making room, but in the end it wins, via a relaxation process. It will constantly try to balance sparsity and density among the orders.
Nyquist is dumb, but easy to understand if we remember: 'Optimum Pauli coder of redundant energy'
I want to leave the mode of hand waving the phase balancing process, and just start counting Nulls, waves, kinetic energy and phase imbalance (charge).