There is an implicit, ongoing contract between the client and the data provider: each seeks the minimum number of clicks to reach the other.
Self-directed graphs can measure this. They count and aggregate access statistics in table variables using what? BSON expressions on BSON variables that are compatible with the graph layer. What are these statistics? Access sequences that can be Huffman-encoded into better schema forms, which then cause the bot to fill in parameters on a subgraph it carries around. The bot launches this subgraph to extract from some table an inner join that selects matches to the chosen schema. In this way, the bots keep ongoing statistics about information entropy, and they reformulate the schema patterns to optimize squareness and nearness. Well, every graph is a bot, really.
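The core loop described above can be sketched concretely: aggregate access sequences into frequency counts, then derive Huffman code lengths so that the most frequent sequence (the shortest code) becomes the candidate schema form. This is a minimal sketch in Python under stated assumptions; the `access_log` data, the tuple-of-fields representation of an access sequence, and the helper name `huffman_code_lengths` are all illustrative, not part of any real graph layer.

```python
import heapq
from collections import Counter

def huffman_code_lengths(freqs):
    """Map each access sequence to its Huffman code length.

    Shorter code length = more frequent sequence = stronger
    candidate for a dedicated schema form.
    """
    if len(freqs) == 1:
        return {next(iter(freqs)): 1}
    # Heap entries: (total count, tiebreak id, {sequence: depth}).
    # The tiebreak id keeps the heap from ever comparing dicts.
    heap = [(count, i, {seq: 0}) for i, (seq, count) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        c1, _, d1 = heapq.heappop(heap)
        c2, _, d2 = heapq.heappop(heap)
        # Merging two subtrees pushes every leaf one level deeper.
        merged = {s: d + 1 for s, d in {**d1, **d2}.items()}
        heapq.heappush(heap, (c1 + c2, tie, merged))
        tie += 1
    return heap[0][2]

# Hypothetical access log: each entry is the tuple of fields one
# client request touched, in order.
access_log = [
    ("user", "orders"), ("user", "orders"), ("user", "orders"),
    ("user", "profile"), ("orders", "items"),
]
stats = Counter(access_log)           # aggregated access statistics
lengths = huffman_code_lengths(stats)
# The sequence with the shortest code is the schema the bot would
# parameterize its subgraph with.
best = min(lengths, key=lambda s: (lengths[s], -stats[s]))
```

With this toy log, `("user", "orders")` dominates, so it gets the shortest code and would be the schema pattern the subgraph's inner join selects matches against. The entropy of `stats` is what the bots would track over time to decide when to re-run this reformulation.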