Tuesday, December 7, 2010

Economic reading

The velocity theory of money gives us a proxy for transaction rate: GDP divided by the monetary base estimates how many times dollar inventories went into transactions. The Levine paper also has a proxy for velocity, but there it is an implied sequence of actions through the production steps. The paper itself is a time-independent construction of production chains.
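As a minimal sketch of that proxy, velocity is just the ratio of nominal GDP to the monetary base. The figures below are made-up round numbers for illustration, not actual data:

```python
# Velocity proxy: how many times each base dollar turned over in a year.
# Both figures are hypothetical round numbers, not real statistics.
gdp = 15.0e12            # nominal GDP, dollars per year (hypothetical)
monetary_base = 2.0e12   # monetary base, dollars (hypothetical)

velocity = gdp / monetary_base
print(velocity)  # 7.5 turnovers per dollar of base money
```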

In the economy, a reduction in money velocity is, in the paper's terms, a reduction in k, the number of steps in production; in money theory's terms, it is a decrease in the average transaction rate.

Velocity reduction was associated with a shortening of maturity, a shift toward liquidity. So investors now want a peak growth period of 7-10 years rather than the patient 15-20 years before the crash. Investors did this because production with k steps resulted in inventories going to zero, so they assume a (k-1)-step system. The paper considers this a link failure; I consider it an inventory failure, but the two can be equivalent.

In QM theory, inventory depletion results from too much flow variance. We might model failure as the probability that an inventory will go to zero, which is the standard one-sided tail test. The agent would keep inventory averages about two sigma away from zero. That implies a specific signal-to-noise ratio (SNR). When investors want safety they act as if the channel is noisy (the SNR is low), so they set up a channel with smaller k and lower transaction rates. This is channel theory.
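The one-sided tail test can be sketched directly: assuming inventory fluctuations are roughly normal (my assumption for illustration), the probability of depletion is the lower tail below zero, and a mean held two sigma above zero leaves roughly a 2.3% chance of running dry on any draw.

```python
from math import erf, sqrt

def depletion_prob(mean, sigma):
    """One-sided tail: P(inventory <= 0) under a normal flow model."""
    z = mean / sigma
    return 0.5 * (1.0 - erf(z / sqrt(2.0)))

# Mean inventory two sigma above zero (z = 2): about a 2.3% depletion risk.
p = depletion_prob(mean=200.0, sigma=100.0)
print(round(p, 4))  # ~0.0228
```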

So, right away, I get a bit of a jump: I can say that investors assume a shorter, noisier channel after the crash, which gets me channel theory, a much simpler mathematics. Channel theory comes complete with its own set of limit and convexity proofs, and it tells me the optimum transaction rate.
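The "noisier channel means lower optimum rate" claim is just the Shannon–Hartley capacity formula, C = B log2(1 + SNR). The numbers below are arbitrary illustrations, not calibrated to any economy:

```python
from math import log2

def channel_capacity(bandwidth, snr):
    """Shannon-Hartley capacity: C = B * log2(1 + SNR)."""
    return bandwidth * log2(1.0 + snr)

# A nervous investor acts as if the SNR dropped, so the optimal
# transaction rate falls with it (hypothetical SNR values).
calm    = channel_capacity(bandwidth=1.0, snr=15.0)  # 4.0 bits per use
nervous = channel_capacity(bandwidth=1.0, snr=3.0)   # 2.0 bits per use
```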

The paper's author does tackle cross-chain correlated noise. I am still working on doing that with channels, and I don't know if it can be done with Shannon theory. My approach would be to use a correlated noise term and redo the theory, but more likely I would just do a web search for a similar problem already solved.

Oh, one thing that confuses people. When they look up Shannon they do not see finite-stage production, but if they look up Huffman encoders they will. I use the two interchangeably and consider the Huffman encoder a finite version of the Shannon channel. A Huffman encoder takes frequent messages and gives them shorter symbols, with the result that bits are assigned so as to equalize redundancy across messages. That forces symbols to appear as constant innovation, a constant-noise channel. The Shannon channel is the pure result of an infinite-symbol Huffman encoder. The symbols we talk about are actual boxes of goods, containers, etc. To repeat: transaction size and rate are adjusted so that the arrival of a container is just as innovative as the delivery of a loaf of bread.
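A tiny Huffman sketch shows the "frequent messages get shorter symbols" property. The goods and frequencies here are invented for illustration; only the code lengths matter:

```python
import heapq

def huffman_code_lengths(freqs):
    """Return the Huffman code length for each symbol, given frequencies.

    Each heap entry carries (total frequency, tie-break index, {symbol: depth});
    merging two entries pushes every contained symbol one level deeper.
    """
    heap = [(f, i, {s: 0}) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tick = len(heap)
    while len(heap) > 1:
        f1, _, d1 = heapq.heappop(heap)
        f2, _, d2 = heapq.heappop(heap)
        merged = {s: d + 1 for s, d in {**d1, **d2}.items()}
        heapq.heappush(heap, (f1 + f2, tick, merged))
        tick += 1
    return heap[0][2]

# Hypothetical delivery frequencies: bread arrives often, pallets rarely.
lengths = huffman_code_lengths({'bread': 8, 'milk': 4, 'container': 2, 'pallet': 2})
print(lengths)  # bread gets a 1-bit code, the rare shipments get 3 bits
```

The frequent "bread" transaction gets the shortest code, so each arrival carries roughly the same surprise as the rare container, which is the equalized-redundancy point above.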

Sorry for not reminding folks and remembering myself.
