Friday, September 23, 2011

Operation Twist and Channel theory

I should go into the relationship.
All things being equal, I have switched the norm: minimum variance becomes minimum redundancy. I have a theory of minimum span flow graphs, from queuing theory. The banking network at equilibrium will react to small shocks with a tendency to reduce the network flow rank, getting economies of scale to conserve inventory.

This is maximum entropy calculus, so I use channel theory: the banking network at equilibrium is perturbed, and should show a set of events i, as in -i*log(i), within an integer, as if they were encoded by a Huffman encoder. How does the network operate under flow conditions?
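That "within an integer" claim can be sketched numerically: a Huffman code's average length always lands within one bit of the Shannon entropy, the sum of the -p*log(p) terms. A minimal sketch, with made-up event probabilities standing in for the perturbed network's events:

```python
import heapq
from math import log2

def huffman_lengths(probs):
    """Build a Huffman tree bottom-up; return the code length per symbol."""
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)   # two least-probable subtrees
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:
            lengths[s] += 1               # each merge adds one bit of depth
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

# hypothetical event probabilities (illustrative only, not market data)
probs = [0.4, 0.3, 0.15, 0.1, 0.05]
entropy = -sum(p * log2(p) for p in probs)                       # ~2.01 bits
avg_len = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
print(entropy, avg_len)  # average code length stays within 1 bit of entropy
```

The gap `avg_len - entropy` is never negative and never reaches a full bit, which is the integer-bounded behavior the events are supposed to show.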

I convert the finite precision of Shannon into a yield curve, plotting the N as in the signal-to-noise ratio (SNR). That plot of sigma variation in SNR, versus the finite set i, will be matched by a basis set from Fibonacci. So I can model the ideal banking network, and it will be visible as the dual of the Huffman encoding tree. My model in a nutshell.
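The Fibonacci/Huffman duality has a concrete, well-known instance worth sketching: feeding consecutive Fibonacci numbers as weights into a Huffman encoder yields the maximally skewed tree, one leaf per level. A hedged illustration (the weights are illustrative, not market data):

```python
import heapq

def huffman_lengths(weights):
    """Standard Huffman construction; weights need not sum to 1."""
    heap = [(w, i, [i]) for i, w in enumerate(weights)]
    heapq.heapify(heap)
    lengths = [0] * len(weights)
    counter = len(weights)
    while len(heap) > 1:
        w1, _, s1 = heapq.heappop(heap)
        w2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:
            lengths[s] += 1            # one more bit for every merged symbol
        heapq.heappush(heap, (w1 + w2, counter, s1 + s2))
        counter += 1
    return lengths

fib = [1, 1, 2, 3, 5, 8]           # Fibonacci weights
print(huffman_lengths(fib))        # → [5, 5, 4, 3, 2, 1]
```

Because each Fibonacci number equals the sum of the two below it, every merge pairs a leaf with the whole subtree built so far, producing the fully skewed "staircase" tree: Fibonacci weights are, in this sense, the dual of the Huffman tree shape.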

When I run it over real data like the SP100, I see unattended aggregates appear in the data. A perfect flow will appear to the Huffman encoder as 'already encoded'. This is simple economies of scale.
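'Already encoded' can be read as zero redundancy: when event probabilities are dyadic (exact powers of two), the Huffman average length equals the entropy and the encoder has nothing left to squeeze, while a lumpy distribution dominated by one aggregate leaves positive redundancy. A small sketch under that reading (both distributions are made up):

```python
import heapq
from math import log2

def huffman_avg_length(probs):
    """Average Huffman code length for a probability distribution."""
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return sum(p * l for p, l in zip(probs, lengths))

def redundancy(probs):
    """Bits per event the code wastes over the entropy bound."""
    return huffman_avg_length(probs) - (-sum(p * log2(p) for p in probs))

smooth = [0.5, 0.25, 0.125, 0.125]   # dyadic: looks 'already encoded'
lumpy  = [0.9, 0.05, 0.05]           # one big aggregate dominates
print(redundancy(smooth))  # ~0.0
print(redundancy(lumpy))   # > 0
```

On this reading, an unattended aggregate is simply a symbol so probable that no integer-length code fits it tightly, so the encoder flags it as wasted bits.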

Result of Operation Twist in the finite channel model? A shock disturbs inventory and the net wants to drop rank and get economies of scale, a contraction.

I go on.
The central banker is squeezed by this effect and by central government debt service. A steep curve helps the banking network, but makes the borrowing costs of Congress jump a large chunk in the finite solution space. Government is forced to get economies of scale when the curve gets steep. Crowding out, channel style: central government is a quantum constraint, a condensate, a large stone ball and chain.

How does the central government constraint show up in the data? If we could assemble all government transactions into a flow, then run our Huffman encoder over it, the Huffman would find huge bunches of unattended aggregates.
