Then I mean that over some window the elements can be ordered locally, based on how their variations aggregate. So this is Windowed Huffman encoding. Local sorting takes place, and the error band is a short, ordered histogram rather than the traditional Gaussian. Probability of occurrence replaces sequence. Absent a fixed ordering there is no time, right? So that's why I say the Huffman encoding network is the dual of the optimal distribution network: they both minimize distribution steps. And goods should be packaged as Huffman encodings.
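Here is a minimal sketch of that windowed idea in Python, assuming a fixed window size and a code table rebuilt from each window's local frequencies. The names (huffman_codes, encode_windowed) and the window size are illustrative choices, not an existing library.

```python
# Windowed Huffman sketch: each window gets its own code table,
# built from the local (not global) symbol frequencies.
import heapq
from collections import Counter
from itertools import count

def huffman_codes(freqs):
    """Build a prefix code from a frequency table (symbol -> count)."""
    tiebreak = count()
    heap = [(f, next(tiebreak), {sym: ""}) for sym, f in freqs.items()]
    heapq.heapify(heap)
    if len(heap) == 1:                       # degenerate single-symbol window
        return {sym: "0" for sym in freqs}
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (f1 + f2, next(tiebreak), merged))
    return heap[0][2]

def encode_windowed(symbols, window_size=16):
    """Encode the stream window by window; ordering is local to each window."""
    out = []
    for i in range(0, len(symbols), window_size):
        window = symbols[i:i + window_size]
        codes = huffman_codes(Counter(window))   # local sort by probability
        out.append("".join(codes[s] for s in window))
    return out

print(encode_windowed(list("abracadabra abracadabra"), window_size=8))
```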
Even in databases this applies. Take the ordering of a keyword set: give me a large bag of keywords, and you set the window. Using a partial reordering algorithm over that window, scramble the first three and generate a set of graph queries based on the reorderings. In other words, language. Yes, that must become part of Lazy J, the semi-ordered bag of keywords. If we humans are maximum entropy encoded, then the bots will find that in our click-thru rates, and order graphs such that a Fibonacci click-rate generator gets sequentially rarer, but larger, blobs of results. The pages get cached to minimize total moves.
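A rough sketch of that semi-ordered bag, assuming the window slides over the keyword list and only its first few words get scrambled. query_variants and its parameters are hypothetical names, not an existing Lazy J API.

```python
# Semi-ordered bag of keywords: keep the overall bag order, permute only the
# head of each window, and emit every reordering as a candidate graph query.
from itertools import permutations

def query_variants(keywords, window=5, scramble=3):
    """Yield reorderings of each window, permuting only its first `scramble` words."""
    for i in range(0, max(1, len(keywords) - window + 1)):
        w = keywords[i:i + window]
        head, tail = w[:scramble], w[scramble:]
        for perm in permutations(head):
            yield list(perm) + tail           # one candidate graph query

bag = ["coffee", "roast", "dark", "ethiopia", "whole", "bean"]
for q in query_variants(bag):
    print(q)
```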