Sunday, January 9, 2011

Making zero bound predictions with entropy

Let us consider the yield curve to be a Huffman encoder: in the presence of noise, the economy tries to quantize surprising events. A good noise proxy would be the dollar deflator, for example. We are encoding events that happen in a sequence, and the events come from the Canadian Interpretation. Under that interpretation, the yields represent forecasting-error events; the economy measures and quantizes the error in estimating the arrival of some innovation in the economy.
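As a toy illustration of the analogy, here is a minimal Huffman coder over a made-up distribution of forecasting-error events. The event labels and frequencies are assumptions for illustration only, not data; the point is that rare, surprising events get the longer codewords.

```python
import heapq

def huffman_code_lengths(freqs):
    """Return the Huffman codeword length assigned to each symbol."""
    # Heap entries: (total frequency, tiebreak id, {symbol: depth so far}).
    heap = [(f, i, {sym: 0}) for i, (sym, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, d1 = heapq.heappop(heap)   # the two least frequent subtrees
        f2, _, d2 = heapq.heappop(heap)
        merged = {s: d + 1 for s, d in {**d1, **d2}.items()}  # one level deeper
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

# Hypothetical forecasting-error events: frequent small surprises, rare big ones.
events = {"small": 0.60, "medium": 0.25, "large": 0.10, "crisis": 0.05}
print(huffman_code_lengths(events))
# -> {'crisis': 3, 'large': 3, 'medium': 2, 'small': 1}
```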

The Huffman encoder model and the Great Moderation tell us that the long-term yield drops over time while noise remains stable, implying fewer forecasting errors as the series continues. Following the Huffman algorithm, remove the large, less frequent events and measure the arrival of the smaller, more frequent events. We will see that even the smaller events happen less often as the series progresses.
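A quick sketch of that claim in entropy terms, again with made-up distributions: as large events are pruned and the remaining events concentrate, the bits needed per event (the analogue of the long yield here) fall.

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits per event."""
    return -sum(p * log2(p) for p in probs if p > 0)

early = [0.60, 0.25, 0.10, 0.05]  # varied events, including rare large ones
late  = [0.90, 0.08, 0.02]        # large events pruned; small ones dominate
print(entropy(early), entropy(late))  # roughly 1.49 vs 0.54 bits: the rate drops
```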

If the economy is measuring forecasting errors with the same relative variance (the constant-SNR assumption), then the number of independent channels needed to encode events has dropped; the channel has lost dimensionality. The yield curve is measuring less variation over time, so minimizing transaction costs implies a less precise economy: we no longer have many varied medium-sized innovations over time. We operate the economy with fewer stages of production.
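One way to make the channel-counting argument concrete, under stated assumptions: take the Shannon-Hartley per-channel capacity C = 0.5 * log2(1 + SNR) bits per use, hold SNR constant, and ask how many independent channels the event entropy requires. The SNR value and the entropy figures (carried over from the toy distributions above) are hypothetical.

```python
from math import log2, ceil

def channels_needed(event_entropy_bits, snr):
    """Independent channels needed at a fixed per-channel capacity."""
    per_channel = 0.5 * log2(1 + snr)  # bits per channel use (Shannon-Hartley form)
    return ceil(event_entropy_bits / per_channel)

snr = 3.0                           # constant-SNR assumption: 1 bit per channel use
print(channels_needed(1.49, snr))   # early, varied economy: 2 channels
print(channels_needed(0.54, snr))   # moderated economy: 1 channel, dimensionality lost
```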

In a world of constant imprecision, where has all the variation gone? There is a larger global channel in play, and our yield curve is watching that channel for very large, very infrequent events. We have traded away internal variation to reserve more of the channel for forecasting external events.
