Thursday, October 27, 2016

Think search engines

How often is the search engine's semantic graph updated? Likely weekly, or better yet, on a schedule that is 100 times slower than any read-only search. So your engine does the background work: it really does build the Huffman graph on search-word frequency, and it quantizes the typical search results along the block structure matching word frequency. What are you delivering back to the user? Information, sure, but where is that information?
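
A minimal sketch of that background job, assuming a toy query log (the queries, words, and frequencies here are all hypothetical): count word frequencies across searches, then build the standard Huffman tree so that frequent search words get short codes.

```python
import heapq
from collections import Counter

# Hypothetical search log; in practice this comes from the
# engine's slow background batch over recent queries.
queries = ["cheap flights", "cheap hotels", "flights to paris",
           "hotels in paris", "cheap flights to paris"]

# Count word frequencies across all searches.
freq = Counter(word for q in queries for word in q.split())

# Standard Huffman construction: repeatedly merge the two
# least-frequent nodes. Heap entries are (weight, tiebreak, tree).
heap = [(f, i, word) for i, (word, f) in enumerate(freq.items())]
heapq.heapify(heap)
count = len(heap)
while len(heap) > 1:
    f1, _, left = heapq.heappop(heap)
    f2, _, right = heapq.heappop(heap)
    heapq.heappush(heap, (f1 + f2, count, (left, right)))
    count += 1
tree = heap[0][2]

# Walk the tree to assign a code to each word: frequent words
# get short codes, rare words get long ones.
def codes(node, prefix=""):
    if isinstance(node, str):
        return {node: prefix or "0"}
    left, right = node
    out = codes(left, prefix + "0")
    out.update(codes(right, prefix + "1"))
    return out

print(codes(tree))
```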

It is in the search language that the site bots and the human user co-created by implicitly pricing words. We and the site bot learn the cost of words in that little window. It costs, it costs redundant searches when we get the word order and significance all screwed up. Redundancy is minimized when you do a Huffman encode, and you get the dictionary graph for free: just map your disk to that and let the disk handle the whole subject line. It will be doing linear, short hops down the compact graph, and I will once again become a gazillionaire. Once again, the algorithmic gain is so significant that transaction costs can be brought to zero, except for the 100 lines of micro-instructions I will write, which cost 2 trillion.
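
A sketch of those "short hops," under the same assumptions: decoding an encoded query is just a walk down the tree, one branch per bit, so a disk laid out in tree order would do short, nearly linear reads. The hardcoded toy tree here is hypothetical, chosen only to keep the example self-contained.

```python
# Toy Huffman tree: "cheap"=00, "flights"=01, "hotels"=10,
# "to"=110, "paris"=111.
tree = (("cheap", "flights"), ("hotels", ("to", "paris")))

def decode(tree, bits):
    words, node = [], tree
    for b in bits:
        node = node[0] if b == "0" else node[1]  # one hop per bit
        if isinstance(node, str):                # leaf reached: emit word
            words.append(node)
            node = tree                          # restart at the root
    return words

# "cheap flights" encodes to "0001".
print(decode(tree, "0001"))  # ['cheap', 'flights']
```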
