The first part calls the join loop, new_cursor, which delivers a bland version. The attachment is selected via an argument and called to specialize the cursor. Some cursors get a whole massive text space and manage it themselves, leaving their pointers on the new cursors. Others are just repetitive search strings.
The user application has the same access to the system as the macro shell, via the exec file, which holds all the spaghetti. The user doesn't notice: the real path, the step-and-match path, is tens of instructions, and if the attachments have simple, mostly comma-dominated graphs, then join only imposes another 20 processor instructions.
Blazingly fast, and for plain-text words comprising some information structure, this machine will do the trick. Especially given our ability to extract plain text from across the web. It need not even be centralized: every day, people who read web pages can run the scraper and dump the results into the company semantic processor. My desktop can run a few hundred word lists in the background, selecting and encoding graphs based on prior knowledge of the word lists. The powers that be can pay us, or our bots will read our favorite web pages.
It has one problem: we move the key twice to get to output. A bug, easily fixed. I put it there on purpose; since I deal with words sorted by length, I keep a copy of the key in the cursor, not a pointer.