I have a simple example for users, adventurers, and technologists of where the web is going. My example is the search that happens when I type text into my browser's URL box. The modern browser searches my local bookmarks as I type, printing on my screen the best match between the current text and the bookmarks. The function is a mobile web bot, traversing json expression graphs asynchronously, on its own. It is continually doing a join between the text I type, viewed as a json expression graph, and the bookmarks, which are stored as json expression graphs. The Ugly Syntax is:
html:join(bookmarks:,text:"stuff I type")
The colon operator identifies the controlling interface that makes the serialized object appear as a json expression graph. html:, for example, makes the browser window appear, to the join operator, as an output expression graph that can be written to using standard json expression graph moves. The entire ongoing operation is built around the universal computing operation: the binary convolution of finite directed graphs.
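The bookmark join described above can be sketched in plain Javascript. This is a minimal illustration, not a real browser API; the names `bookmarks` and `joinBookmarks` are hypothetical, and the "join" here is simple substring matching between the typed text and each stored bookmark.

```javascript
// Hypothetical sketch of the URL-box join: as the user types, the text
// is matched against the stored bookmarks, and the best candidates are
// returned for display.
const bookmarks = [
  { title: "Hacker News", url: "https://news.ycombinator.com" },
  { title: "Haskell wiki", url: "https://wiki.haskell.org" },
  { title: "MDN Web Docs", url: "https://developer.mozilla.org" },
];

function joinBookmarks(typed, marks) {
  const needle = typed.toLowerCase();
  // Keep every bookmark whose title or URL contains the typed text.
  return marks.filter(
    (b) =>
      b.title.toLowerCase().includes(needle) ||
      b.url.toLowerCase().includes(needle)
  );
}

console.log(joinBookmarks("ha", bookmarks).map((b) => b.title));
```

A real browser folds history, frecency scores, and remote suggestions into the same ranking, but the shape of the operation is this running join between two collections.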
Hence we must view Javascript as the default database for json expression graphs. Javascript is graph-oriented storage using asynchronous record markers. A complete join operates directly on Javascript. But Javascript needs some navigational operators to direct the join, including wildcards and negation logic, most of which it already has.
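What a navigational operator over a Javascript object store might look like can be sketched as follows. The function `matchPath` is hypothetical, not part of any standard: it walks a nested object along a path, where `"*"` is a wildcard matching any key at that step.

```javascript
// Sketch: treating a Javascript object as a graph store and directing
// a join with a path that may contain wildcards. Hypothetical API.
const store = {
  bookmarks: {
    tech: { hn: "https://news.ycombinator.com" },
    docs: { mdn: "https://developer.mozilla.org" },
  },
};

function matchPath(node, path) {
  if (path.length === 0) return [node]; // path exhausted: this node matches
  const [step, ...rest] = path;
  if (typeof node !== "object" || node === null) return []; // dead end
  // "*" fans out over every key; otherwise follow the named key if present.
  const keys = step === "*" ? Object.keys(node) : step in node ? [step] : [];
  return keys.flatMap((k) => matchPath(node[k], rest));
}

console.log(matchPath(store, ["bookmarks", "*", "mdn"]));
// finds the mdn url under any bookmark category
```

Negation could be added the same way, as a step that matches every key except the named one; the point is that the operators live outside the data and steer the join across it.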
The traditional Javascript interpreter operates on the script in place. It treats the script as a ready-to-go expression graph that is mostly self-referential. The most an interpreter will do is build a fast index to speed references back into the script. So in the join context, the browser does:
html:join(Some Java script text,*)
which gets your display. In the new mode the browser is in control, able to release Javascript bots that collect and return data remotely, the results appearing asynchronously in the browser. Is this already done? Yes, true, and it is increasingly being done with this Watson-like technology.
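The "released bot" pattern is, mechanically, just an asynchronous task that the browser fires off and later hears back from. A minimal sketch, where `fetchRemote` is a stand-in for a real network call and `render` for writing into the display:

```javascript
// Hypothetical sketch of a released Javascript bot: the browser starts an
// async task, keeps control, and the result appears when it arrives.
function fetchRemote(query) {
  // Simulated remote lookup; resolves asynchronously after a short delay.
  return new Promise((resolve) =>
    setTimeout(() => resolve({ query, hits: ["result-1", "result-2"] }), 10)
  );
}

async function releaseBot(query, render) {
  const data = await fetchRemote(query); // the bot runs on its own
  render(data); // result written back asynchronously
}

// The caller is not blocked; the display updates when the bot returns.
releaseBot("stuff I type", (data) =>
  console.log(`${data.query}: ${data.hits.join(", ")}`)
);
```

In a real browser the same shape shows up as `fetch` plus a promise chain feeding the DOM.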
My View of Big Data History
It started with the exponentially growing mass of web pages, mixes of html/xml and Javascript. The first to tackle the problem were the Javascript kernels, data processors executing right in the browser, then V8 and the server-side Javascript kernels. This was driven by Moore's law, the ability to manage reformatting and searching with scripts at high speed. The mass of web pages began to look like Javascript objects, mainly json expression graphs. Javascript and its forms were now being shipped everywhere, and that made simple json expressions canonical.