The subject came up in a comment on another blog: why don't distribution networks collapse in a multi-stage queuing analysis? The answer can be illustrated using my universal economic calculator and the uncertainty constant. In the process we get the outline of a proof for Milton Friedman's Plucking Theory.
Looking at the yield curve for some stable region, the fuzzy boundaries are the variation in the yields over a 30-day update cycle. That fuzzy boundary is a normal, identically distributed estimate of the uncertainty region. If I start at the 30-year yield and ask how far down the curve I must travel before I can pick a spot whose uncertainty region does not overlap the previous one, I get the 20-year yield, and so on. The method is an approximation.
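The selection step above can be sketched in a few lines. This is only an illustration of the procedure, not actual market data: the maturities, yields, and 30-day band half-widths below are made-up numbers, and the greedy walk from the long end is my reading of the method described.

```python
def pick_distinguishable(points):
    """points: list of (maturity_years, yield_pct, band_halfwidth),
    sorted from longest maturity to shortest. Keep a point only when
    its uncertainty band does not overlap the last kept point's band."""
    kept = []
    for maturity, y, half in points:
        if not kept:
            kept.append((maturity, y, half))
            continue
        _, last_y, last_half = kept[-1]
        # Two bands overlap when the gap between the yields is
        # smaller than the sum of the two half-widths.
        if abs(y - last_y) > half + last_half:
            kept.append((maturity, y, half))
    return [m for m, _, _ in kept]

# Hypothetical curve: (maturity, yield %, 30-day half-width)
curve = [(30, 4.40, 0.10), (20, 4.15, 0.10), (10, 3.95, 0.12),
         (7, 3.85, 0.12), (5, 3.60, 0.10), (2, 3.20, 0.15)]
print(pick_distinguishable(curve))  # → [30, 20, 7, 5, 2]
```

Note that the 10-year point is dropped here: its band overlaps the 20-year band, so it carries no distinguishable information at this measurement uncertainty.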
What this means is that in any given N-level distribution network with one N-1 stage of distribution, there is an adjustment process that will yield a situation in which an extra step of distribution becomes observable as a gain in scale. That is, after a period of incoherence, the system settles to the point where an intermediate manufacturing step observably reduces the total number of transactions in the network.
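The transaction-count claim can be made concrete with a toy count. The model here is my own simplification, not anything from the original argument: with S suppliers trading directly with B buyers, every pair transacts, giving S × B transactions; routing everything through one intermediate stage needs only S + B. The extra stage pays off exactly when the network is large enough that S + B < S × B.

```python
def direct_transactions(suppliers, buyers):
    # Every supplier transacts with every buyer.
    return suppliers * buyers

def staged_transactions(suppliers, buyers):
    # Each party transacts once with the single intermediary.
    return suppliers + buyers

# Hypothetical network sizes: the stage only helps past a small scale.
for s, b in [(2, 2), (3, 3), (10, 20)]:
    print(s, b, direct_transactions(s, b), staged_transactions(s, b))
# 2 2: 4 direct vs 4 staged (no gain yet)
# 3 3: 9 direct vs 6 staged
# 10 20: 200 direct vs 30 staged
```

The crossover at small scale is the "period of incoherency": below it the extra stage is pure overhead, above it the stage observably shrinks the transaction count.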
Hence, the Plucking Theorem. The system always reverts to the number of stages defined by the shape of the yield curve and the measurement uncertainty. The shape of the curve is fixed at equilibrium by our Hamiltonian, the maximum variation in the minimum space: the Gaussian.