Monday, October 24, 2016

Compact graph structure

I use the term a lot.  Compact graphs were the basis for storing JSON graphs in SQLite. It means the graphs are spanning trees without loops, and each graph holds a sequence count and is composed of a sequence of graphs.

G is a (count,G) or (0,Null)

It means, 1) the Turing structure has no loops, so a descent down the graphs leads to a fixed point, and 2) memory retrieval is efficient and linear. So, algorithms which maintain the graph by deleting and adding nodes are always deleting and adding whole graphs, and they must maintain the block count.  But the counts simply add and subtract, so node counting algorithms are needed.
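Here is a minimal sketch of one way to hold such a graph as a single flat Python list; the (count, value) layout, the names, and the sample data are my assumptions for illustration, not anything from the original code.

# A minimal sketch, assuming the whole graph lives in one flat list and
# each entry is a (count, value) pair: 'count' is the number of entries
# in the block nested beneath it, so every block is self contained.
graph = [
    (5, 'root'),   # root encloses the next 5 entries
    (2, 'a'),      #   'a' encloses 'a1' and 'a2'
    (0, 'a1'),
    (0, 'a2'),
    (1, 'b'),      #   'b' encloses 'b1'
    (0, 'b1'),
]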

Now, when your bot is traversing the graph, the next node algorithm becomes:

def traverse(g):
    for i in range(g.count):
        if function(g.nodes[i]):     # condition met, descend into sub-graph i
            traverse(g.nodes[i])
        else:
            traverse(g.link)         # otherwise follow the local link

I doubt this is exact; I wrote it here on the fly.   Now, all of g is a linear array, so the nodes list of any g is simply a structure pointer into the array below.

But notice the difference: we do not decide to go left or right; instead, the default is to always go down the array unless a condition requires following a link. The links should always be local, they simply jump down the array to some point within the 'count' of nodes, as in the sketch below. But this requires care when altering the graph, which the bot does on the fly.
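A tiny sketch of that locality constraint, under the same flat-array assumption as above; the function name and arguments are mine.

# Sketch: a link stored inside a block must land within that block,
# i.e. no more than 'count' entries below the block's start index.
def link_is_local(block_start, count, link_target):
    return block_start < link_target <= block_start + count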

Hence, node delete simply dumps the enclosed block of nodes and shifts everything up. Enclosed counts, being nested, are self-contained blocks.
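Here is a rough sketch of that delete under the flat (count, value) layout assumed earlier; delete_block and its bookkeeping are my guess at the idea, not the original code.

# Dump the enclosed block, let the list shift up, then walk back and
# decrement the count of every enclosing block.
def delete_block(graph, i):
    size = graph[i][0] + 1           # the node itself plus its enclosed block
    del graph[i:i + size]            # dump the block; Python shifts the rest up
    for j in range(i - 1, -1, -1):   # fix up the counts of enclosing blocks
        count, value = graph[j]
        if j + count >= i:           # block j enclosed the deleted span
            graph[j] = (count - size, value)

For example, delete_block(graph, 1) on the sample list above removes 'a', 'a1', and 'a2' and drops the root count from 5 to 2.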

Do this when you have many retrievals and fewer updates. You spend the time maintaining block counts on update or delete, but access is a zip once the counts are maintained.

In the Huffman encoder, the nodes contain a symbol value and frequency, and the graph becomes organized to minimize path length. You compare the user symbol to the symbol at each node, descending the graph and accumulating the code word. The adaptive Huffman holds the typical value and its recent frequency. A comparison is not exact; a user symbol will be tagged to a node if it is within precision of the typical value.
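A rough sketch of that within-precision comparison, using an ordinary binary tree for simplicity rather than the compact flat-array layout; the Node class, 'typical', and 'precision' are my illustrative names.

class Node:
    def __init__(self, typical, left=None, right=None):
        self.typical = typical     # the typical symbol value held at this node
        self.left = left
        self.right = right

def encode(node, symbol, precision, code=""):
    # Tag the user symbol to this node if it lies within precision.
    if abs(symbol - node.typical) <= precision:
        return code
    # Otherwise keep descending, accumulating one code bit per step.
    if symbol < node.typical and node.left is not None:
        return encode(node.left, symbol, precision, code + "0")
    if node.right is not None:
        return encode(node.right, symbol, precision, code + "1")
    return code                    # no closer node; settle for this one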

Hence, the difference between the adaptive and complete Huffman. The adaptive needs a local linearity in the symbols; it needs a price, because its precision is finite.

What about block chain?

Look, a complete block chain can be organized as a compact graph.  That means it is Turing complete, and is a finite spanning tree with no loops.  Hence there exists a symbol set that represents the original sequence that built the graph.  That symbol sequence is something like the order of the transactions, their index, I think. Whatever value is used in exact comparison, that sequence exists and can be derived backwards, according to my proof, mainly based on the fixed point structure of the compact graph. How about that math!

For AI this means that if the AI bot can derive some local linearity on the nodes, then it can find the theory of the nodes, the dual structure (its approximate expression graph, its grammar) that must have created the compact graph.

Scheduler
I have to do this traversal method to test the Redneck debugger; it is another of its necessary tests. So I will get this basic snippet going in a day or two, post the method, and go on from there.
