Take the standard curve generator and add interference from an entirely separate channel. The variance of the channel component suffering the interference looks like the equation above: the first term is quantization error, and the second results from another channel causing congestion. For example, the Dynamic Yield Curve shows the long-term bond stuck, unable to move, for nearly three years before the crash. From 2003 until February 2006 the 30-year bond was pinned in position. That term was dominated by an external channel, most likely the massive borrowing by house buyers and by the Treasury in support of the war. Take a look at the government borrowing demands on FRED.
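The equation itself sits above this section, but a plausible form of the decomposition, in my own notation (a uniform quantizer step $\Delta_i$ for the first term and a cross-channel coupling $\rho_{ij}$ for the second, both placeholders), would be:

$$\sigma_i^2 \;=\; \frac{\Delta_i^2}{12} \;+\; \sum_{j \neq i} \rho_{ij}\,\sigma_j^2$$

The first term is the usual quantization noise of a uniform quantizer; the second grows when a congested neighboring channel $j$ leaks into term $i$.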
This is mutually shared sample space: in this case the interfering variance dominated the 30-year term. How does the rest of the yield curve respond? The terms move toward the stuck position; investors crowd around the noise, grabbing yield. The remaining terms rose up and to the right to "tune out" the cause, including the Fed chasing inflation nearly all the way.
February 23, 2006 was when house prices crashed. Look here at the inflation chart: after a peak in the fourth quarter of 2005, inflation dropped off for the remainder of 2006. Business continued as usual, amazingly, with the Fed holding the curve inverted until June 2007, the start of the crash.
We get a collision: short-term interest rates high and investors caught long in the bond market. But the point here is that I need to extend channel theory to handle channel interference.
Modelling the process of adjustment
Before taking this theory farther, we need a simple model of adjustment during expansion and contraction of the economic bandwidth. How do investors jump in and out of the market during the change?
I propose a crowd-right, expand-left process. During contraction, bankers shift right, readjusting their separation and keeping a Gibbs separation. Investors at the long end get longer and longer, drifting toward a sample rate of zero, otherwise known as taking their losses. Hence we really don't have to drop rank; we can leave those terms in the system as dormant dead weight. I'll go back to my R code model, try this out, and report back. It will be interesting to see whether micro-adjustments in the model during changes produce the same scaled Fibonacci sequence, another F sequence, or something different. A rough sketch of the contraction step I have in mind follows.
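For concreteness, here is a minimal sketch of the crowd-right half of the step, written in R since that is what the model runs in. The rate grid, the Gibbs gap, the shrink factor, and the dormancy cutoff are all placeholders of mine, not the actual model:

    # Sketch: crowd-right adjustment of a quantized yield curve.
    # All parameter values are illustrative placeholders.
    crowd_right <- function(rates, shrink = 0.9, gibbs = 0.15, cutoff = 0.05) {
      # rates: sample rates ordered short end first (highest rate first)
      rates <- rates * shrink                 # contraction squeezes the bandwidth
      for (i in 2:length(rates)) {
        # keep at least a Gibbs separation below the shorter neighbor,
        # crowding every term rightward toward longer maturities
        rates[i] <- min(rates[i], rates[i - 1] - gibbs)
      }
      dormant <- rates < cutoff               # the long end drifts toward zero
      rates[dormant] <- 0                     # dormant dead weight, not dropped from rank
      list(rates = rates, dormant = dormant)
    }

    # Start from a scaled Fibonacci grid and apply a few contractions;
    # watch the long end go dormant instead of being removed.
    curve <- rev(c(1, 2, 3, 5, 8)) / 8
    for (step in 1:4) {
      out <- crowd_right(curve)
      curve <- out$rates
      cat(step, ":", round(curve, 3), " dormant:", sum(out$dormant), "\n")
    }

Expansion would run the mirrored logic, expand left, re-activating dormant terms as bandwidth returns.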