Wednesday, March 13, 2019

65 little Hoovers

Suppose we model government as a distribution graph. Then consider the single link from one of the small western states: Montana, Idaho, Nevada, or Utah. On the other side are Maine, Vermont, and so on. These represent the smallest possible links in any complete model.

A fully controlled model would then break up all the larger states to get balanced representation, say 65 states. Using this model, consider a stable flow of liquidity events, top down. At equilibrium, we would expect an economic measurement, by "state," to have high mutual entropy with private-sector liquidity events. That means that at the state transition step, the null hypothesis, after all differences are controlled, is that the indifference curves for state and private sector run parallel.
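One way to put a number on "high mutual entropy" is to read it as mutual information between the binned state measurement and the private-sector event stream. A minimal sketch, where the rates, noise level, and variable names are all made up for illustration:

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Estimate mutual information (bits) from a joint histogram."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

# Made-up numbers: a "state" measurement tracking private-sector
# liquidity events, plus observation noise.
rng = np.random.default_rng(0)
private = rng.poisson(lam=20, size=5000).astype(float)
state = private + rng.normal(0, 2, size=5000)
print(mutual_information(state, private))                    # high: equilibrium
print(mutual_information(rng.permutation(state), private))   # ~0: broken link
```

The shuffled baseline is the null case: same marginals, no shared structure.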

Which we won't see, but that is the closest thing we could have to an unbiased model. Then hunt for the causes of oval and concave indifference curves in the real state setup. You are really looking at a random set of graphs, superimposed on a finitely small sample. The variance, on a matched-node basis, is then equivalent to a queueing process.
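The superposition picture can be made concrete: draw many random graphs on the same node set and watch each matched node's variance, which scales with its mean the way an arrival process does. A toy sketch, assuming Erdős–Rényi draws stand in for the real graphs:

```python
import numpy as np

rng = np.random.default_rng(1)
n_nodes, n_draws, p = 50, 200, 0.1

# Superimpose independent Erdos-Renyi graphs on the same node set,
# tracking each matched node's degree across draws.
degrees = np.empty((n_draws, n_nodes))
for t in range(n_draws):
    upper = np.triu(rng.random((n_nodes, n_nodes)) < p, 1)
    adj = upper | upper.T              # undirected, no self-loops
    degrees[t] = adj.sum(axis=1)

# Per-node variance across draws is binomial, ~ (n-1)p(1-p): the same
# mean-tracks-variance behavior a Poisson arrival (queueing) process has.
print(degrees.var(axis=0).mean(), (n_nodes - 1) * p * (1 - p))
```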

If the queues are stable, then there should be a compact graph that generates random 'queue structures', each structure within one normal variance of the others. Clerks and customers meet often enough to prevent sample-space aliasing: a clerk over-sampling customers (texting friends at the empty counter), or customers over-sampling clerks (loud, angry crowds at a backed-up counter). There is an equivalence between queueing variance in a directed graph and random graphs from a compact generator.
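A quick way to see the stability condition is the Lindley recursion for a single clerk. A sketch with made-up rates: as utilization approaches one, waiting-time variance blows up (the angry crowd), while at low utilization the clerk sits idle (texting at the empty counter):

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_queue(arrival_rate, service_rate, n=50_000):
    """M/M/1 clerk/customer queue via the Lindley recursion.
    Returns per-customer waits and the clerk's rough idle fraction."""
    inter = rng.exponential(1 / arrival_rate, n)   # gaps between customers
    serve = rng.exponential(1 / service_rate, n)   # clerk service times
    wait = np.zeros(n)
    for i in range(1, n):
        wait[i] = max(0.0, wait[i - 1] + serve[i - 1] - inter[i])
    idle = np.maximum(0.0, inter[1:] - wait[:-1] - serve[:-1])
    return wait, idle.sum() / inter.sum()

# Stable queues keep waiting-time variance bounded; push utilization
# toward 1 and the backed-up counter appears.
for lam in (0.5, 0.9, 0.99):
    wait, idle_frac = simulate_queue(lam, 1.0)
    print(f"rho={lam:.2f}  wait_var={wait.var():10.1f}  clerk_idle={idle_frac:.2f}")
```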

The aggregate system must obey the variance rule in order to serve as a measurement reference axis. It is hard to even compose the problem. The variance rule says, in effect, that bandwidth limits are met: your agents cannot infinitely subdivide, but finite solutions work, with finite, constant error. Add quantization and we get bounded error in the model, basically modeled as uncertainty in the generator. The quantizing model should have tipping points: when error exceeds the bounds, requantization takes place.
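As a toy version of requantization at a tipping point, here is a uniform quantizer that re-fits its grid whenever a running error estimate exceeds the bound. Everything here (the error bound, the smoothing constant, the drifting signal) is an assumption for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

def adaptive_quantize(signal, n_levels=8, error_bound=0.15):
    """Uniform quantizer that re-fits ('requants') its grid whenever a
    running error estimate exceeds the bound -- a toy tipping point."""
    lo, hi = float(signal[:50].min()), float(signal[:50].max())
    step = (hi - lo) / n_levels
    out, requants, err = [], 0, 0.0
    for x in signal:
        q = lo + step * min(max(round((x - lo) / step), 0), n_levels)
        out.append(q)
        err = 0.95 * err + 0.05 * abs(x - q)   # smoothed quantization error
        if err > error_bound:                  # error exceeds the bound...
            lo, hi = min(lo, x), max(hi, x)    # ...so requantize the grid
            step = (hi - lo) / n_levels
            requants, err = requants + 1, 0.0
    return np.array(out), requants

# A made-up drifting signal: the grid must keep re-fitting as it wanders.
sig = np.cumsum(rng.normal(0, 0.1, 2000))
quant, n_requants = adaptive_quantize(sig)
print(n_requants, np.abs(sig - quant).mean())
```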

Equivalence in this sense

Let us say we have a compact generator and we feed it a uniform random input, generating output sequences that are random sequences.
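A finite-state Markov chain is one concrete stand-in for such a compact generator: uniform random input in, structured random sequence out. The transition matrix below is hypothetical:

```python
import numpy as np

rng = np.random.default_rng(4)

def compact_generator(transition, n_steps, state=0):
    """Finite-state generator: each step, one uniform random draw picks
    the next state from the transition row; the state path is the output."""
    cdf = transition.cumsum(axis=1)
    out = np.empty(n_steps, dtype=int)
    for t in range(n_steps):
        state = int(np.searchsorted(cdf[state], rng.random()))
        out[t] = state
    return out

# Hypothetical 4-state generator with a fixed, compact structure.
P = np.array([[0.6, 0.2, 0.1, 0.1],
              [0.1, 0.6, 0.2, 0.1],
              [0.1, 0.1, 0.6, 0.2],
              [0.2, 0.1, 0.1, 0.6]])
stream = compact_generator(P, 10_000)
print(np.bincount(stream) / len(stream))   # stationary-looking output mix
```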

Then take the resulting stream and feed it into an FIR estimator of the compact generator. Compare it across trials and note that the resulting estimated generators form a random graph sequence. The two processes are equivalent: queueing, and random graph generation by a compacting process.
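A sketch of that loop, taking "FIR estimator" as an ordinary least-squares finite-impulse-response fit: run the same hypothetical two-state generator on fresh uniform input each trial, estimate it, and look at the spread across the estimated structures:

```python
import numpy as np

rng = np.random.default_rng(5)

def fir_fit(stream, order=8):
    """Least-squares FIR estimate: predict each sample from the
    previous `order` samples; returns the tap coefficients."""
    X = np.column_stack([stream[i:len(stream) - order + i]
                         for i in range(order)])
    taps, *_ = np.linalg.lstsq(X, stream[order:], rcond=None)
    return taps

# Each trial: run the same compact (two-state Markov) generator on fresh
# uniform input, then estimate it with the FIR model.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])
trials = []
for _ in range(20):
    state, seq = 0, np.empty(5000)
    for t in range(5000):
        state = int(rng.random() > P[state, 0])   # uniform draw -> next state
        seq[t] = state
    trials.append(fir_fit(seq))

taps = np.array(trials)
# The estimated structures jitter trial to trial: a random sequence of
# near-identical generators, each within normal variance of the others.
print(taps.mean(axis=0).round(3))
print(taps.std(axis=0).round(3))
```

The trial-to-trial scatter in the taps is the point: the estimator hands back a random sequence of nearly identical compact structures.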
