Two graphs are coherent if the convolution of the two closes everywhere with two leaf nodes. The idea is that one is a multiple of the other in rank, everywhere, and that the two meet boundary conditions.
So we can say that, to the extent that France and the USA trade, some coherent abstraction of the French Huffman tree should be a factor of the USA Huffman tree.
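As a toy illustration of that claim (the frequency tables and symbols below are invented for the purpose), here is a minimal sketch that builds Huffman code lengths for two distributions and compares their depth profiles rank by rank; coherent trees in the sense above should keep roughly proportional code lengths at matching ranks.

```python
# Hypothetical sketch: Huffman code lengths for two invented frequency
# tables, compared rank by rank.
import heapq

def huffman_code_lengths(freqs):
    """Return {symbol: code length} for a frequency dict via Huffman merging."""
    heap = [(f, i, {s: 0}) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, d1 = heapq.heappop(heap)
        f2, _, d2 = heapq.heappop(heap)
        # Merging two subtrees pushes every leaf one level deeper.
        merged = {s: d + 1 for s, d in {**d1, **d2}.items()}
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

french  = {"le": 40, "de": 35, "et": 20, "vin": 5}     # invented counts
english = {"the": 42, "of": 30, "and": 22, "wine": 6}  # invented counts

fr = sorted(huffman_code_lengths(french).values())
en = sorted(huffman_code_lengths(english).values())
print(list(zip(fr, en)))  # coherent trees: proportional depths at each rank
```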
If we add that matching along the convolution is probabilistic and weighted, then we get a measure of mutual entropy: how big the covariance is at each of the matched nodes. I think this works; it deals with large queue lengths at the nodes that have high error, and graph coherence there is down-weighted.
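A rough sketch of what that score might look like, under assumptions of my own: each matched node contributes a pointwise mutual-information term, down-weighted by an error variance standing in for its queue length. All the probabilities and variances are invented.

```python
# Speculative sketch: a coherence score over matched nodes, where each node
# contributes a pointwise mutual-information term and high-error (long-queue)
# nodes are down-weighted. Numbers below are invented.
import math

def node_score(p_joint, p_a, p_b):
    """Pointwise mutual-information contribution of one matched node."""
    if p_joint == 0:
        return 0.0
    return p_joint * math.log2(p_joint / (p_a * p_b))

def coherence(matches):
    """Sum node scores, down-weighting nodes with high match-error variance."""
    total = 0.0
    for p_joint, p_a, p_b, error_var in matches:
        weight = 1.0 / (1.0 + error_var)  # long queues / high error count less
        total += weight * node_score(p_joint, p_a, p_b)
    return total

# Each tuple: (p_joint, p_a, p_b, error_variance) at one matched node.
matches = [(0.30, 0.4, 0.5, 0.1), (0.10, 0.3, 0.2, 0.8), (0.05, 0.3, 0.3, 2.0)]
print(coherence(matches))
```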
This is an important point for decision trees that learn: what we are really after is high mutual entropy between a decision path and a structured feature set. If we get a countable stack of features that match, they will jam; we get large queue lengths, and large variance means lower mutual entropy.
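Reading the post's "mutual entropy" as empirical mutual information, the quantity can be estimated directly. Below is a minimal sketch computing it between which leaf each example reaches and one of its features; the path labels and feature values are hypothetical.

```python
# Minimal sketch: empirical mutual information (in bits) between the decision
# path an example takes and one discrete feature. Data below is hypothetical.
import math
from collections import Counter

def mutual_information(xs, ys):
    """Empirical mutual information between two paired discrete sequences."""
    n = len(xs)
    joint = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in joint.items())

paths   = ["A", "A", "B", "B", "A", "B", "A", "B"]  # leaf reached per example
feature = [ 1,   1,   0,   0,   1,   0,   1,   1 ]  # one structured feature
print(mutual_information(paths, feature))
```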
Entropy is conserved: there are finitely many features to be distributed, and multiple graphs of equal mutual entropy must equally divide the queues at each matching node. So we can talk about optimum basis sets across large classes of possibilities. Transfer the concept to word lists: given a closed set of words, run against masses of unknown text structure, we should be able to identify word-list paths that optimally separate the various texts by 'meaning', equal and mutually entropic paths through a decision dictionary, a measure of understanding relative to what is available.
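As a toy version of that closing idea (corpus, labels, and vocabulary all invented), the sketch below ranks words from a closed list by how much mutual information their presence carries about which group a text belongs to; the top-scoring words would be the candidate entropic paths through the dictionary.

```python
# Speculative sketch: rank words from a closed list by the mutual information
# their presence carries about a text's group. Corpus and labels are invented.
import math
from collections import Counter

def mi(xs, ys):
    n = len(xs)
    joint, px, py = Counter(zip(xs, ys)), Counter(xs), Counter(ys)
    return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in joint.items())

texts  = ["wine and cheese", "cheese market report", "market trade report",
          "trade and wine", "quarterly market report", "wine cheese pairing"]
labels = ["food", "finance", "finance", "food", "finance", "food"]
vocab  = ["wine", "cheese", "market", "report", "trade"]

scores = {w: mi([w in t.split() for t in texts], labels) for w in vocab}
for w, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{w:8s} {s:.3f} bits")
```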