Saturday, October 18, 2014

Nature's amazing counting ability

Look here at real GDP: it looks like a 2.2% growth rate with about a half-point variance. If you take growth at quarterly rates, I bet the mean and variance of growth come out equal. It also means the bean counters are using log base 2; they have approximated the M2 velocity as 2 instead of 1.8.
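That bet is easy to check numerically. A minimal sketch, assuming you have pulled FRED's real GDP series (GDPC1) into a local CSV; the file name and column label here are placeholders, not data I have verified:

```python
# Quick check of the "mean equals variance" bet for quarterly real-GDP growth.
# Assumes a hypothetical file real_gdp.csv with columns DATE, GDPC1 (FRED quarterly real GDP).
import pandas as pd

gdp = pd.read_csv("real_gdp.csv", parse_dates=["DATE"], index_col="DATE")["GDPC1"]

# Quarter-over-quarter growth, in percent.
growth = gdp.pct_change().dropna() * 100

print(f"mean of quarterly growth:     {growth.mean():.3f}")
print(f"variance of quarterly growth: {growth.var():.3f}")
# If the bet holds, the two numbers should be close.
```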

Is this a plot? No, the BEA has to do revisions because data is late. Data is late because inventories are jammed and uncountable, or foreign currency has to be swapped, or the derivatives industry has to price insurance. Stores across the economy take a good shot at getting this right, and the count basically improves in accuracy as it moves up the stream.

If you look at the ratio of RGDP to NGDP year-over-year growth rates and average it over two or three recession cycles, the number is 0.5, to an accuracy of 1%. This has held from 1950 until today. That number is also a variance, and it evenly splits the variance between money and goods inventory; it is the optimally accurate double-entry accounting system. It is Shannon sampling theory, though accountants never really used that theory, and entropy theory was mainly buried in physics and engineering until the digital age. This is minimum redundancy, and we need it as a basic law of nature.
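The 0.5 figure can be checked the same way. A sketch, assuming a hypothetical CSV holding both FRED series, GDPC1 (real) and GDP (nominal); names and layout are placeholders:

```python
# Check that the ratio of real to nominal YoY growth averages near 0.5.
# Assumes a hypothetical file gdp_series.csv with DATE, GDPC1 (real), GDP (nominal) from FRED.
import pandas as pd

df = pd.read_csv("gdp_series.csv", parse_dates=["DATE"], index_col="DATE")

# Year-over-year growth rates on quarterly data, so shift by 4 periods.
real_yoy = df["GDPC1"].pct_change(4).dropna()
nominal_yoy = df["GDP"].pct_change(4).dropna()

# Quarters where nominal growth is near zero would blow up the ratio; they are rare,
# but a careful run would filter them out.
ratio = (real_yoy / nominal_yoy).dropna()
print(f"average RGDP/NGDP growth ratio: {ratio.mean():.2f}")
```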

What does it mean?
The economy has somehow built a six-bit, base-2 counter, sampling at the Nyquist-Shannon rate on half-quarter periods. Six bits because the economy knows we run an eight-year cycle and counts down from eight: 8, 4, 2, 1, 1/2, 1/4. How did 100 million monkeys across three time zones, four geographies, and four different weather systems figure this out? They did a Huffman encode on trade, creating a minimum-redundancy network.
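Huffman coding is the textbook minimum-redundancy construction. A toy sketch with made-up symbol frequencies shows the merge-the-two-smallest idea:

```python
# Huffman coding: the classic minimum-redundancy prefix code.
import heapq

def huffman_code(freqs):
    """Return a dict symbol -> bit string, given {symbol: frequency}."""
    # Heap entries: (weight, tie_breaker, {symbol: code_so_far}).
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    if len(heap) == 1:
        return {sym: "0" for sym in freqs}
    tie = len(heap)
    while len(heap) > 1:
        w1, _, codes1 = heapq.heappop(heap)   # two lightest subtrees...
        w2, _, codes2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in codes1.items()}   # ...get prefixed with 0 and 1
        merged.update({s: "1" + c for s, c in codes2.items()})
        heapq.heappush(heap, (w1 + w2, tie, merged))       # and merged into one node
        tie += 1
    return heap[0][2]

# Made-up frequencies for a handful of "trades".
print(huffman_code({"A": 45, "B": 13, "C": 12, "D": 16, "E": 9, "F": 5}))
```

The frequent symbols end up with short codes and the rare ones with long codes, which is all "minimum redundancy" means here.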

Is it a trick?

No. Quasars make a mixed base-two and base-three log system and make baryons, matching the coefficients of motion to fit the finite log pattern. DNA adds a log base five, and Vegas gets the number seven. Something is going on. Nature makes groups to match finite logs, creates minimum-spanning finite log networks, and packs them with Higgs bubbles. Mathematicians are just now cracking nature's code.

The connection:
There is a connection between mean-equals-variance from the Poisson distribution and optimal transfer networks. Something about making all the queues equal length distributes the uncertainty (or the motion, in physics). Mean equals variance seems to be the fundamental Compton wave equivalence.
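The Poisson property itself is easy to see numerically; a quick simulated illustration, where the rate 2.2 is just an arbitrary echo of the growth figure above:

```python
# Mean equals variance for a Poisson stream: both moments estimate the rate.
import numpy as np

rng = np.random.default_rng(0)
samples = rng.poisson(lam=2.2, size=1_000_000)  # rate chosen arbitrarily

print(f"sample mean:     {samples.mean():.3f}")
print(f"sample variance: {samples.var():.3f}")
# The two agree up to sampling noise, because both estimate lam.
```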

How did nature figure out how to make F and 1/F from recursive equations? We can do it by simple division and aggregation from a starting ratio. How did the universe figure out how to do it in log? How did it figure out it needed two different bubbles of wave to effect subtraction? Is this simply minimum redundancy, such that non-solutions fly away? Do random events settle into a stable solution?
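If F here means the golden ratio, which is one natural reading of a ratio built by recursive division and aggregation (that reading is an assumption on my part), the recursion looks like this:

```python
# One reading of "F and 1/F from a starting ratio": iterate r -> 1 + 1/r,
# one aggregation and one division per step. The fixed point F satisfies 1/F = F - 1.
# Taking F to be the golden ratio is an assumption, not something fixed above.
def fixed_ratio(start=1.0, steps=40):
    r = start
    for _ in range(steps):
        r = 1.0 + 1.0 / r
    return r

F = fixed_ratio()
print(f"F   = {F:.6f}")        # ~1.618034
print(f"1/F = {1.0 / F:.6f}")  # ~0.618034, i.e. F - 1
```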


This is not a drill. These are exciting times for mathematicians; the world is at their door. This stuff is better than Isaac's rules of grammar. I wish I were young and brilliant.
