My theory says: give me some finite number N and specify the precision, and I can tell you the distribution of primes. Increase N without limit while keeping the same precision, and I can scale the function systematically.
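The post does not spell out the scaling construction, so as a stand-in illustration only (not the author's method), here is a minimal sketch of the standard situation it alludes to: comparing the exact prime count π(N) against the classic N/ln N estimate as N grows, so you can watch how the relative precision of the estimate behaves.

```python
from math import log

def primes_up_to(n):
    """Sieve of Eratosthenes; returns a boolean primality table 0..n."""
    sieve = [True] * (n + 1)
    sieve[0:2] = [False, False]
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p :: p] = [False] * len(sieve[p * p :: p])
    return sieve

def pi_exact(sieve, n):
    """Exact prime-counting function pi(n) from the sieve table."""
    return sum(sieve[: n + 1])

N_MAX = 10 ** 6
sieve = primes_up_to(N_MAX)
for n in (10 ** 3, 10 ** 4, 10 ** 5, 10 ** 6):
    approx = n / log(n)  # prime number theorem estimate
    exact = pi_exact(sieve, n)
    print(n, exact, round(approx), f"rel. err {abs(exact - approx) / exact:.1%}")
```

Running this shows the relative error of the N/ln N estimate shrinking slowly with N; the theory claimed in the post would instead hold a chosen precision fixed while N grows.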
This is the same as specifying a path through the Markov tree, or selecting a sub-tree from the Wythoff array. The form of the tree stays the same, determined by the precision, but the tree scales systematically as its rank grows. I claim this because minimum redundancy and uniqueness are equivalent.
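The Wythoff array itself is a concrete object (OEIS A035513), and a small sketch makes the "selecting a sub-tree" picture tangible: each row is a Fibonacci-like sequence, and its entries follow the known closed form W(n, k) = ⌊nφ⌋·F(k+1) + (n−1)·F(k). The function names here are my own.

```python
# Golden ratio; float precision is fine for the small row indices used here.
PHI = (1 + 5 ** 0.5) / 2

def fib(k):
    """Fibonacci numbers with F(1) = F(2) = 1."""
    a, b = 1, 1
    for _ in range(k - 1):
        a, b = b, a + b
    return a

def wythoff(n, k):
    """Entry in row n, column k (both 1-indexed) of the Wythoff array:
    W(n, k) = floor(n*phi) * F(k+1) + (n-1) * F(k)."""
    return int(n * PHI) * fib(k + 1) + (n - 1) * fib(k)

# Row 1 is the Fibonacci sequence itself: 1, 2, 3, 5, 8, ...
print([wythoff(1, k) for k in range(1, 6)])
# Each later row obeys the same recurrence with a shifted start.
print([wythoff(2, k) for k in range(1, 6)])
```

Every row satisfies the same Fibonacci recurrence, so picking a row fixes the "form" while the entries scale, which is the sense in which the post's sub-tree selection preserves form under growth.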
The problem is that my precision is the total precision, and the total precision is subdivided into the precision of identifying any particular prime in my multiplier tree. In particular, my frequency estimates will be more specific for some primes than for others. That is a big drawback, but the concept of minimally redundant graphs associated with a Wiener process of some dimension is still valid.
It is a result of non-emptiness. All natural processes are congested; there is no empty, divisible space larger than the precision of the natural process. Thus Pauli exclusion is a requirement, and that requires minimal redundancy, since any redundancy has a finite probability of violating Pauli. Finite, bounded precision is ergodic and also maximum entropy. There is only one finite map between the two, and thus there is a unique finite polynomial of some order that defines the map and generates a finite graph for that polynomial.