Every computing model can be decomposed into a Turing-complete, binary convolution, or join, of two expression graphs:
- Each expression graph has nodes of the form (keyvalue, operator, object).
- Every object is itself an expression graph, subject to the same join rules.
- Every instance of a join operation connects to an external namespace via the (keyvalue, no-op, object) form.
- Every join system has built-in traversal operators, the join is commutative, and operator pairs form a consistent precedence, meeting Turing completeness. (A toy sketch of these rules follows this list.)
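A minimal sketch of what these rules could look like in Python. The `Node` class, `join`, `resolve`, and `NAMESPACE` names are my own illustration, not part of the claim itself; a node whose operator is `None` plays the role of the (keyvalue, no-op, object) link into an external namespace.

```python
from dataclasses import dataclass
from typing import Optional, Tuple, Union

# A node is the triple (keyvalue, operator, object); the object slot may hold
# another expression graph, so graphs nest recursively.
@dataclass
class Node:
    keyvalue: str
    operator: Optional[str]                      # None marks the (keyvalue, no-op, object) namespace link
    obj: Union["Node", Tuple, str, None] = None  # an object is itself an expression graph, a pair, or a leaf

# A purely illustrative external namespace, reached through no-op nodes.
NAMESPACE = {"dot": ".", "dash": "-"}

def join(left: Node, right: Node, operator: str = "join") -> Node:
    """Join two expression graphs: the operands become the objects of the
    new node, so the previous graphs are subsumed whole."""
    return Node(keyvalue=f"{left.keyvalue}+{right.keyvalue}",
                operator=operator,
                obj=(left, right))

def resolve(node: Node):
    """A traversal operator: follow a no-op node out to the external namespace."""
    if node.operator is None:
        return NAMESPACE.get(node.keyvalue, node.obj)
    return node

a = Node("dot", None)
b = Node("dash", None)
print(resolve(a), resolve(b), join(a, b).keyvalue)   # . - dot+dash
```

The join is symmetric in its operands: `join(a, b)` and `join(b, a)` carry the same pair, only ordered differently.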
The claim runs from Morse signalling all the way up to Watson talking to me. There is no other model of AI.
The effect of Moore's Law is to increase the rank of the recursion: each shift in Moore's Law moves up the expression graph, the previous graph model being subsumed into the objects of the new join model. The scale changes; the rules remain.
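Continuing the sketch above, recursion by scale is just re-wrapping: the previous generation's entire graph sits in the object slot of the next generation's join. The era names below are illustrative labels, not a claim about actual hardware history.

```python
# Recursion by scale, reusing the hypothetical Node/join sketch above: the
# previous generation's whole graph becomes a single object in the next
# generation's join, so the rules stay the same while the rank grows.
transistor_era = join(Node("nand", None), Node("latch", None))
chip_era = join(transistor_era, Node("alu", None))
system_era = join(chip_era, Node("network", None))

def rank(node: Node) -> int:
    """Depth of nesting: how many generations of joins have been subsumed."""
    if not isinstance(node.obj, tuple):
        return 0
    return 1 + max(rank(child) for child in node.obj)

print(rank(transistor_era), rank(chip_era), rank(system_era))  # 1 2 3
```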
My proof.
Trivial, really: I prove that all semantics are serializable, finite, and terminating. Then I prove that the basis function for the model is the finite expression graph. I show this to hold when building multipliers from transistor logic, as well as when organizing signals on a telegraph line. Then I prove recursion by scale.
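As a toy version of the multiplier case, here is a 2-bit multiplier written as a finite expression graph, again reusing the hypothetical `Node` sketch above. The `leaf`, `gate`, and `evaluate` helpers are my own illustration: each gate is a node whose operator is a Boolean function and whose object is the pair of sub-graphs it joins, and the leaves are no-op links into an input namespace.

```python
# A 2-bit multiplier as a finite expression graph of AND/XOR gates.
def leaf(name): return Node(name, None)
def gate(op, x, y): return Node(op, op, (x, y))

def evaluate(node, inputs):
    """Traversal operator: serialize the graph bottom-up into a bit value."""
    if node.operator is None:                       # no-op namespace link
        return inputs[node.keyvalue]
    x, y = (evaluate(child, inputs) for child in node.obj)
    return {"and": x & y, "xor": x ^ y}[node.operator]

a0, a1, b0, b1 = map(leaf, ["a0", "a1", "b0", "b1"])
p0 = gate("and", a0, b0)
p1 = gate("xor", gate("and", a1, b0), gate("and", a0, b1))
c1 = gate("and", gate("and", a1, b0), gate("and", a0, b1))
p2 = gate("xor", gate("and", a1, b1), c1)
p3 = gate("and", gate("and", a1, b1), c1)

bits = {"a0": 1, "a1": 1, "b0": 1, "b1": 1}          # 3 * 3
product = sum(evaluate(p, bits) << i for i, p in enumerate([p0, p1, p2, p3]))
print(product)                                        # 9
```

The whole multiplier is a finite, terminating graph, and `evaluate` serializes it, which is the shape of the argument above.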
Then I prove this is likely to be true because I did it, by confirmation bias; thus no further effort on my part is needed.