Huffman encode one or more complete sequences and you get a compression tree whose rank is roughly set by the channel size. This works well for short sequences, say 256 samples, where precision stays most stable and round-off error is minimized. In the abstract, that maps onto the checkout-counter picture: the root of the encoding graph has rank equal to the number of stable Poisson queues the set of counters keeps. A lot of this becomes trivially true when you examine systematic node-by-node alterations: some graph shape will always be preserved under any transformation. That has to be true because maximum entropy is not a multiple equilibrium. But the variations from typical are still captured in the uncertainty of quantization, the bit error.
This is me hand-waving; I am a blogger.
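To make the hand-waving slightly concrete, here is a minimal sketch of the starting point above: Huffman encoding one short 256-sample sequence and checking the resulting tree against the entropy bound. The alphabet and frequencies are made up for illustration; nothing here claims to model the queueing analogy, just the encode step.

```python
import heapq
import math
import random
from collections import Counter

def huffman_code(freqs):
    """Build a Huffman code (symbol -> bitstring) from a frequency map."""
    # Each heap entry: (frequency, tie-breaker, partial codebook).
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        # Merge the two lightest subtrees; prepend a branch bit to each side.
        merged = {s: "0" + b for s, b in c1.items()}
        merged.update({s: "1" + b for s, b in c2.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

# One short, complete sequence: 256 samples from a skewed toy alphabet.
random.seed(0)
seq = [random.choice("aaaabbbccd") for _ in range(256)]
freqs = Counter(seq)
code = huffman_code(freqs)

n = len(seq)
avg_len = sum(freqs[s] * len(code[s]) for s in freqs) / n
entropy = -sum((f / n) * math.log2(f / n) for f in freqs.values())
print(f"avg bits/symbol {avg_len:.3f} vs entropy {entropy:.3f}")
```

The average code length lands within one bit of the empirical entropy, which is the classical Huffman guarantee; the "rank" of the tree here is just its depth, bounded by the alphabet size.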