The definition of a word is itself an ontology graph. Each definition is a short, rank-two or rank-three ontology whose keyword entries are themselves entries into the same dictionary graph. So a web bot looking to match semantic patterns can take a sample, cleaned of the tiny words, and attempt a best fit of its pattern onto this self-referencing graph. Use partial ordering, once again, to minimize search steps. This is a Watson trick, a more advanced use of graph traversal. It gives the web bots a systematic way to reduce human text.
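A minimal sketch of that best-fit step, assuming a toy dictionary already reduced to keyword entries (tiny words stripped); the words and scoring here are illustrative, not a real lexicon:

```python
# Toy dictionary: each definition reduced to its keyword entries,
# which are themselves entries in the same graph (illustrative data).
DICTIONARY = {
    "bank":  {"money", "institution", "deposit"},
    "river": {"water", "stream", "flow"},
    "money": {"currency", "exchange", "value"},
}

def best_fit(sample_keywords, dictionary):
    """Score each entry by keyword overlap with the sample; return the best."""
    best_word, best_score = None, 0
    for word, keywords in dictionary.items():
        score = len(sample_keywords & keywords)
        if score > best_score:
            best_word, best_score = word, score
    return best_word

sample = {"deposit", "money", "loan"}
print(best_fit(sample, DICTIONARY))  # → bank
```

A real bot would order the candidate entries (partial ordering) so most samples never touch most of the dictionary; the brute-force loop above is just the match rule made concrete.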
Graphs give us objects that have inner and outer joins:
An inner join is the set of graph pairs with non-null matches between them; a left outer join is the set of left graphs with their matches, including nulls; and so on. A semantic dictionary is a self-join over small ranked graphs. Define a match as a rank-two overlap on a rank-three pattern! How about that. This thing is going to read.
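The join definitions above can be sketched directly, with each small ranked graph represented as a set of keyword nodes and a match defined as the rank-two overlap just described (names and representation are assumptions for illustration):

```python
# Each small ranked graph is modeled as a set of keyword nodes.
def matches(left, right, overlap=2):
    """Rank-two overlap: the two graphs share at least `overlap` nodes."""
    return len(left & right) >= overlap

def inner_join(lefts, rights):
    """Pairs of graphs with a non-null match between them."""
    return [(l, r) for l in lefts for r in rights if matches(l, r)]

def left_outer_join(lefts, rights):
    """Every left graph with its matches, pairing unmatched ones with None."""
    out = []
    for l in lefts:
        hits = [r for r in rights if matches(l, r)]
        if hits:
            out.extend((l, r) for r in hits)
        else:
            out.append((l, None))
    return out
```

A self-join of the dictionary is then just `inner_join(graphs, graphs)` minus the trivial identity pairs.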
Default start up and tables:
This is the default start-up. The default grammar is to check the config table for set-up, or respond with the default prompt.
It will look up UserWord in the local table list; if not found, it responds with the default prompt. So every machine has a set of local tables, per sqlite3 tradition.
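A sketch of that start-up lookup, using Python's stdlib sqlite3; the `config` table, its column, and the default prompt string are all hypothetical names, not something the text pins down:

```python
import sqlite3

DEFAULT_PROMPT = "?"  # hypothetical default prompt

def respond(user_word, conn):
    """Look up user_word in the local table list; fall back to the prompt."""
    row = conn.execute(
        "SELECT tbl FROM config WHERE tbl = ?", (user_word,)
    ).fetchone()
    return row[0] if row else DEFAULT_PROMPT

# Every machine carries its own local tables, sqlite3-style.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE config (tbl TEXT)")
conn.execute("INSERT INTO config VALUES ('prices')")
print(respond("prices", conn))   # → prices
print(respond("unknown", conn))  # → ?
```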
All clients in the model have the SQL sequencer, which might be better termed the graph sequencer, since its methods operate at the graph layer (keyword match, sets, descents, wild cards, schema, overloads, open query, micro methods on the result table). So having an intelligent, self-directed table, modeled for the local client by the local web bots, is the norm. The client has slider bars to set the signal-to-noise ratio and drag-and-drop selection of optimum forms. The bots, with the graph layer, click-through counts, default cast pairings, and partial ordering, have default methods to switch from short to long form, or special forms, as the client drags and drops by trial and error. The client gets it: the first word he types means something he has invented by prior action.
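To make the graph-layer methods concrete, here is a hypothetical skeleton of such a sequencer, showing only two of the listed operations (keyword match and descent); the class, its names, and the adjacency representation are all assumptions for illustration:

```python
class GraphSequencer:
    """Thin layer exposing graph-level methods over the local tables."""

    def __init__(self, graph):
        self.graph = graph  # adjacency: node -> set of child nodes

    def keyword_match(self, word):
        """Is this keyword an entry in the local graph?"""
        return word in self.graph

    def descend(self, word, depth=2):
        """Walk down the graph from a keyword, collecting reachable nodes."""
        seen, frontier = set(), {word}
        for _ in range(depth):
            frontier = {c for n in frontier for c in self.graph.get(n, set())}
            seen |= frontier
        return seen
```

The remaining operations (wild cards, overloads, open query, micro methods on the result table) would sit beside these as further methods on the same object.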