Well, I am doing market research. Everything is there except the ontology optimizer, the piece that maintains the Shannon-encoded ontology map. Let's talk about that.
How do humans encode? They Shannon encode their language. Consider a small company and its employees' interest in sales reports: 'last month's sales', 'december sales report', 'sales from the last four months', 'summer sales'. Phrases in common use are very likely to be Shannon encoded through normal usage, irrespective of any conscious design, because that yields the most information transfer for the least effort.
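To make the Shannon-encoding claim concrete: an optimal code gives a phrase of probability p a code about -log2(p) bits long, so frequent phrases earn short codes. A tiny sketch, with made-up usage counts for the sales-report queries above (illustrative numbers, not real data):

```python
import math

# Hypothetical usage counts for sales-report queries at a small company.
counts = {
    "last month's sales": 40,
    "december sales report": 25,
    "summer sales": 10,
    "sales from the last four months": 5,
}

total = sum(counts.values())

# Shannon's source-coding bound: a phrase of probability p gets a code of
# about -log2(p) bits. Frequent phrases get short codes; rare ones, long.
for phrase, n in sorted(counts.items(), key=lambda kv: -kv[1]):
    p = n / total
    print(f"{phrase!r}: p={p:.3f}, ideal length ~{-math.log2(p):.1f} bits")
```

The most common query, 'last month's sales', lands at about 1 bit, while the long-tail 'sales from the last four months' needs about 4, which is the least-effort shape natural usage drifts toward.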
The optimizer can find these easily using click-through counts and search set length. The selected ontologies are the set most likely to be Shannon encoded.
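Here is one way the selection step could look. This is my own reading, not a spec: I take "most likely Shannon encoded" to mean phrases whose observed length (words, as a crude effort proxy) best tracks the ideal code length implied by their click-through frequency. The function name and data are hypothetical:

```python
import math

def shannon_fit(clicks: dict) -> list:
    """Rank query phrases by how closely their length tracks -log2(p).

    Assumption (mine): word count stands in for code length, and
    click-through share stands in for usage probability.
    """
    total = sum(clicks.values())
    scored = []
    for phrase, n in clicks.items():
        ideal = -math.log2(n / total)   # ideal code length in bits
        actual = len(phrase.split())    # crude proxy: length in words
        scored.append((phrase, abs(actual - ideal)))
    return sorted(scored, key=lambda t: t[1])  # best fit first

clicks = {"last month's sales": 40, "summer sales": 20,
          "sales from the last four months": 4}
print(shannon_fit(clicks)[0][0])  # prints: summer sales
```

The best-fit phrases would be the ones the optimizer promotes into the ontology; the misfits are candidates for renaming or demotion.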
But we need all this in open source. Hmm. Keep it simple. Linked lists of keywords, ultimately pointing to a URL. Links may be arbitrary, including loops, so it is an arbitrary graph whose nodes are a small finite set of partially ordered keywords. Easy to standardize. Maybe we need to formally define a convolution on the graph that produces a maximum-entropy resulting graph without loops. I will think on that and continue searching the market for possibilities.
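The structure above can be sketched in a few lines. Assuming (my reading) each keyword node carries outgoing links to other keywords plus an optional terminal URL, loops are allowed in the stored graph; the sketch also includes a simple DFS that derives a loop-free view by dropping back edges. The maximum-entropy convolution itself is left open, as in the post:

```python
from dataclasses import dataclass, field

@dataclass
class Keyword:
    name: str
    url: str = None                       # terminal URL, if any
    links: list = field(default_factory=list)

def acyclic_edges(root: Keyword) -> list:
    """Return edges reachable from root, skipping back edges (loops)."""
    edges, on_path, done = [], set(), set()
    def dfs(node):
        on_path.add(node.name)
        for nxt in node.links:
            if nxt.name in on_path:       # back edge: would close a loop
                continue
            edges.append((node.name, nxt.name))
            if nxt.name not in done:
                dfs(nxt)
        on_path.discard(node.name)
        done.add(node.name)
    dfs(root)
    return edges

sales = Keyword("sales")
report = Keyword("report", url="http://example.com/sales-report")
sales.links.append(report)
report.links.append(sales)                # a loop, allowed by the format
print(acyclic_edges(sales))               # the edge back to "sales" is dropped
```

Dropping back edges is just one loop-removal rule; a proper convolution would presumably pick among the possible acyclic views the one with maximum entropy.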
That is how we do it at Imagisoft.