Back to theory.
We started with a general precision metaphor, and began casting about for ways to turn that into a queuing theory for network flows. We knew the key had to be a constant imprecision across all inventories (we deliberately chose a quantum model).
Then Nick Rowe brought up the old Friedman puzzle, that a variable targeted to an instrument will decorrelate from the instrument upon success. So we had our Copenhagen interpretation for a quantum theory.
Within a constant precision, we will decorrelate from the targeted instrument. The basic transaction rate of the economy is large compared to transaction sizes; quantum theory is necessary in economics. That in turn led us to the minimum redundancy norm and Shannon theory. We statisticians had been using the wrong norm in the past.
We should treat the economy as channels of quantized flows, as if the quantities and rates were determined by Huffman coding to a predetermined precision. These Huffman channels do a good job of modeling economies-of-scale flows in well-isolated industrial channels.
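To make that concrete, here is a minimal sketch of the idea in Python: quantize flows into a handful of standard lot sizes, count how often each lot moves, and let Huffman coding assign codeword lengths to the lots. The lot sizes and counts are invented for illustration, not taken from any real shipment data.

import heapq
from collections import Counter

def huffman_lengths(freqs):
    # Return {symbol: codeword length} for a frequency table, by building the
    # Huffman merge tree and tracking how deep each leaf ends up.
    if len(freqs) == 1:
        return {next(iter(freqs)): 1}
    heap = [(f, i, {sym: 0}) for i, (sym, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, d1 = heapq.heappop(heap)
        f2, _, d2 = heapq.heappop(heap)
        merged = {s: depth + 1 for s, depth in {**d1, **d2}.items()}
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

# Hypothetical flows, already quantized to standard lot sizes.
flows = [1, 1, 2, 1, 5, 2, 1, 13, 2, 1, 5, 1, 2, 1, 5, 8, 1, 2]
freqs = Counter(flows)
lengths = huffman_lengths(freqs)
for lot in sorted(freqs):
    print(f"lot size {lot:>2}: count {freqs[lot]}, codeword length {lengths[lot]} bits")

The common lot size gets the short codeword; the rare, oversized shipment gets a long one. That is the sense in which the channel is coded to a precision.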
The agent in the system is not a money counter, but a queue manager.
That leads to a corollary: whenever large arbitrage opportunities persist for some time, the market will set up aggregators and economies of scale.
We worked the S&P 500 through a Huffman encoder to measure the amount of unused entropy during the crash; that unused entropy represents unobserved arbitrage opportunities.
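For readers who want to reproduce the flavor of that exercise, here is one plausible reading of "unused entropy", sketched in Python with synthetic data: bin the daily returns, Huffman-code the bins, and treat the gap between the average codeword length and the Shannon entropy of the bins as the unused part. The return series below is a random stand-in, not the actual S&P 500 series used for the post.

import heapq, math, random

def huffman_avg_length(probs):
    # Expected Huffman codeword length, using the identity that it equals the
    # sum of the merged weights produced while building the tree.
    heap = list(probs)
    heapq.heapify(heap)
    total = 0.0
    while len(heap) > 1:
        a = heapq.heappop(heap)
        b = heapq.heappop(heap)
        total += a + b
        heapq.heappush(heap, a + b)
    return total

random.seed(0)
returns = [random.gauss(0.0, 0.01) for _ in range(2000)]  # stand-in for daily S&P 500 returns

n_bins = 16
lo, hi = min(returns), max(returns)
width = (hi - lo) / n_bins
counts = [0] * n_bins
for r in returns:
    counts[min(int((r - lo) / width), n_bins - 1)] += 1
probs = [c / len(returns) for c in counts if c > 0]

H = -sum(p * math.log2(p) for p in probs)  # Shannon entropy of the binned returns
L = huffman_avg_length(probs)              # average Huffman codeword length
print(f"entropy {H:.3f} bits, Huffman length {L:.3f} bits, unused (redundancy) {L - H:.3f} bits")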
The theory predicts that the Fibonacci series is fundamental to the distribution of inventory, and we showed that it obeys the necessary property that -i·log(i) must be within an integer if inventory is to be conserved with minimal redundancy. Fibonacci holds that property and is a candidate for the natural basis set.
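A quick numerical probe of that claim, under one reading of it: treat normalized Fibonacci numbers as inventory shares and check how far the ideal codeword lengths -log2(p) fall from the nearest integer, which is the quantity a minimal-redundancy (Huffman) code cares about. This is only an illustration of how one might test the property, not the calculation from the post.

import math

fib = [1, 1]
while len(fib) < 12:
    fib.append(fib[-1] + fib[-2])

total = sum(fib)
for f in fib:
    p = f / total                    # a normalized Fibonacci number, read as an inventory share
    ideal = -math.log2(p)            # ideal codeword length for that share
    gap = abs(ideal - round(ideal))  # distance from an achievable integer length
    print(f"share {p:.4f}   -log2(p) = {ideal:.3f}   gap to nearest integer = {gap:.3f}")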
Relationship with standard stochastic theory:
Minimal redundancy rather than minimal variance norm
Both have a golden ratio
mutual entropy vs covariance (spelled out in symbols after this list)
integer solutions vs infinite basis set
asymmetric vs symmetric
graph algebra vs full algebra
network flows vs statistical mechanics
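To put the mutual entropy vs covariance line in symbols: mutual entropy (mutual information) is I(X;Y) = H(X) + H(Y) - H(X,Y), while covariance is Cov(X,Y) = E[XY] - E[X]E[Y]. The first is zero only when the two variables are fully independent; the second is zero whenever the dependence has no linear component. That difference runs through the whole list above.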