We assume the Wiki is searchable and modifiable by search bots. These bots watch everywhere, external to the Wiki, like IBM Watson: they read everything, including ancient texts, oddball side notes in research papers, and obscure references.
Then, building a humongous Bayesian map of it all, the Wiki bots find earlier solutions to puzzles only recently solved. They may identify a side note, or a chain of semantics, indicating that some odd researcher somewhere had already figured it out.
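To make that a little more concrete, here is a minimal sketch of what such a linkage search might look like, assuming nothing fancier than a Laplace-smoothed unigram (naive Bayes style) model over archival passages; the archive entries, the modern puzzle text, and every name below are invented placeholders, not anything a real bot actually does.

```python
# A minimal sketch of the kind of "Bayesian map" a search bot might build:
# score each archival passage by how well a smoothed unigram model of it
# predicts the terms of a modern problem statement (naive Bayes style).
import math
import re
from collections import Counter

def tokens(text):
    return re.findall(r"[a-z']+", text.lower())

def log_score(passage, query, vocab_size, alpha=1.0):
    """Log-likelihood of the query terms under a Laplace-smoothed
    unigram model of the passage -- higher means a stronger link."""
    counts = Counter(tokens(passage))
    total = sum(counts.values())
    return sum(
        math.log((counts[t] + alpha) / (total + alpha * vocab_size))
        for t in tokens(query)
    )

# Hypothetical corpus: obscure side notes and old texts.
archive = {
    "1897 footnote": "a note on heritable traits passed in discrete units",
    "1823 marginalia": "observations on interest rates and grain prices",
}
modern_puzzle = "discrete units of heredity in modern genetics"

vocab = {t for p in archive.values() for t in tokens(p)} | set(tokens(modern_puzzle))
ranked = sorted(
    archive, key=lambda k: log_score(archive[k], modern_puzzle, len(vocab)), reverse=True
)
print(ranked)  # the 1897 footnote should rank first
```

A real system would of course use a far richer semantic model, but the idea is the same: old passages that best "predict" a modern result rise to the top as candidate hidden linkages.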
The Wiki bots then rewrite the Wiki pages to restore these hidden linkages. Humans wake up the next day and see what looks like evidence of time travel: biologists notice a foundational scientist they have never heard of, and begin accusing their colleagues of time travel.