Sunday, September 27, 2020

Weighted binomials

 

So let Px be the frequency with which binomial X gets updated, and let Ix be the total number of coin tosses allocated to binomial X; likewise for the other binomials.

Then we have a relationship: the sum -i*log(i) will be the log(x^x * y^y * z^z), and we can let x, y, z be integers. Here -i*log(i) is the mean of binomial i, where means are normalized to 1. A switch of parameters gets us to integers, but we have to include N, the total number of coin tosses. The Pi, Pj, Pk are just the number of binomial terms each holds out of the total.
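A quick sketch of that switch of parameters, assuming the frequencies are Pi = x/N and so on for integer counts x, y, z summing to N (the counts here are made up for illustration): the entropy sum -Σ p*log(p) works out to log(N) minus (1/N)*log(x^x * y^y * z^z).

```python
import math

# Hypothetical integer counts for the three binomials; N is total coin tosses.
x, y, z = 3, 5, 8
N = x + y + z

# Entropy form: sum of -p*log(p) over the frequencies p = count/N.
entropy = -sum((c / N) * math.log(c / N) for c in (x, y, z))

# Integer form: log(N) - (1/N) * log(x^x * y^y * z^z).
integer_form = math.log(N) - (x * math.log(x) + y * math.log(y) + z * math.log(z)) / N

# The two agree to floating-point precision.
assert abs(entropy - integer_form) < 1e-12
```

The equivalence is just -Σ (c/N)·log(c/N) = log(N) - (1/N)·Σ c·log(c), so the free parameter N has to ride along once you move to integer counts.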

This tells me something about Markov 3-tuples. The left side is simply the weighted mean of binomials with a free parameter, N.

Markov is open ended in that getting three hits in a row solves to an adjacent node. So in the solution set, one has to call an 'out of bounds' for some rational reason related to context.
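The open-endedness can be made concrete. Markov 3-tuples are integer solutions of x^2 + y^2 + z^2 = 3xyz, and "solving to an adjacent node" corresponds to the standard Vieta flip: fix two coordinates and replace the third by, e.g., z -> 3xy - z, which always lands on another solution. A minimal sketch:

```python
# Markov triples satisfy x^2 + y^2 + z^2 = 3*x*y*z.
def is_markov(x, y, z):
    return x * x + y * y + z * z == 3 * x * y * z

# The three Vieta flips: each replaces one coordinate by the
# other root of the quadratic in that coordinate, giving the
# three adjacent nodes on the Markov tree.
def neighbors(x, y, z):
    return [(3 * y * z - x, y, z),
            (x, 3 * x * z - y, z),
            (x, y, 3 * x * y - z)]

root = (1, 1, 1)
assert is_markov(*root)
for t in neighbors(*root):
    assert is_markov(*t)  # every flip of a solution is a solution
```

Since the tree branches without bound, any finite exploration has to impose the kind of 'out of bounds' cutoff described above.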

I digress. The weighted mean must be 1/2, normed to one. Moments are being removed. The left side happens half the time, the right side is cropped significantly. This is where my thinking is: then reconstruct the final weighted binomial, and it should be white noise.

It is adaptive: the most entropic path is taken and is not known ex ante. The -i*log(i) should be approximated as normal. One can see that having any one binomial be the most entropic path three times in a row means a redundancy in the other coins; they are forced out of relative primality. Exceptions are generally explosions. There is an inherent cropping, allowed by the round-off error.
