Wednesday, December 30, 2009

The real problem with the Fed's Exit

Assume the finance market makes Kelly trades. The formula:

Wealth W grows at the rate G(p) = b*log(1+p) + (1-b)*log(1-p); G is maximum at p = 2b - 1, where G = log 2 - H(b). The optimal betting proportion extracts exactly the information that the entropy of the outcome leaves available.

Here p is the proportion of wealth, W, to bet, and b the probability that the Fed will exit in the next trade. H(b) is the entropy of that outcome.
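A minimal sketch of the Kelly setup above, assuming an even-odds bet and an illustrative exit probability b = 0.6 (an assumed value, not a market estimate):

```python
import numpy as np

def growth_rate(p, b):
    """Expected log-growth of wealth when betting fraction p
    at even odds on an event with probability b (Kelly, 1956)."""
    return b * np.log(1 + p) + (1 - b) * np.log(1 - p)

def entropy(b):
    """Shannon entropy of a binary event, in nats."""
    return -b * np.log(b) - (1 - b) * np.log(1 - b)

b = 0.6                    # assumed probability the Fed exits next period
p_star = 2 * b - 1         # optimal Kelly fraction for even odds
g_star = growth_rate(p_star, b)

# At the optimum, growth equals log(2) minus the entropy of b.
print(p_star, g_star, np.log(2) - entropy(b))
```

The point of the sketch: the less we know about b, the larger H(b), and the smaller the growth rate the Kelly trade can deliver.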

Right now the Fed is making noises about an exit strategy. The effect of the noise is to reduce our knowledge of b, hence to slow down the Kelly trades, which are, I think, just how we do Carry Trades. A Carry Trade is a bet that one can use cheap money and hold decorrelated assets during the betting period.

So bettors bet a proportion of wealth on the probability that one-year rates rise, and they bet every quarter. But information theory says that b will rapidly converge to .5. QM says b will converge to within an snr quantum of .5.

The mechanism of this convergence is the growing economies of scale for the carry trade. The carry trade subdivides the problem into a bet that rates hold for four quarters; that is, it begins to bet Kelly trades farther into the future, and an efficient, multi-stage market emerges, the production network. The Fed faces a more precise market and discovers its inability to quantify its own future. b goes to .5 on the noise. The result is a return to the nominal aggregate noise floor.


The Fed is competing against a Carry structure that is in the process of computing a more accurate future for the Carry Trade. The Fed loses control when it is less accurate than the Carry Trade market.

How to test this theory? Well, the TIPS market is the closest thing we have to a retail inflation future. Look to see how responsive TIPS is to Fed announcements. If TIPS is the more stable of the two, then the Fed has lost control?
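One way to run the test, sketched on synthetic data; the series, their volatilities, and the announcement dates below are all invented, and real data would come from TIPS and nominal Treasury quotes around FOMC dates:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical daily yield changes in basis points.
tips    = rng.normal(0, 2.0, 500)
nominal = rng.normal(0, 5.0, 500)
fomc_days = np.arange(20, 500, 40)     # assumed announcement dates

def event_response(series, events, window=2):
    """Average absolute move in a +/-window of days around each event."""
    moves = [np.abs(series[e - window:e + window + 1]).mean() for e in events]
    return float(np.mean(moves))

# If TIPS moves less than nominals on Fed news, TIPS is the stable leg.
print(event_response(tips, fomc_days), event_response(nominal, fomc_days))
```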

Kelly, A New Interpretation of Information Rate

Kevin Drum misses the point

Here he talks about the stealth program to eliminate insurance companies.

Health care reform is not about insurance companies, it is about the AMA wage setting process. Kevin's dream of eliminating insurance companies is really a long drawn out battle between Congress and the AMA. No one is going to subsidize anybody's health care unless Congress wins this battle. The more likely outcome is a huge cash market for retail health care as the nominal health system shuts down during the coming ten year battle.

Remember what Yglesias, Krugman and Drum are asking of us. They want taxpayer payments to physicians who have monopoly wage power. So, Yglesias is asking person X to pay whatever fees the doctor decides to charge person Y. The average middle class taxpayer cannot absorb the concept, and will refuse to participate.

Do I have to look for research on the mal-distribution of wages for medical workers?


Hot Air talks about the Gubinator in California, who was misled into believing that the Health Care reform will not cost the state. Now the Gubinator finds the reform will saddle California with an additional $2 to $3 billion in debt, for a state on the edge of bankruptcy. Does the Gubinator seem a little out of his depth?

A reading of entropy analysis of CEO pay

In this research.

If CEO pay is informative, then the salary of a manager will reveal the increment of value he provides. Using the quantization noise metaphor we can clarify this. If an incremental piece of value is computed with a linear combination of firm wages, then a top-heavy management team will dominate the calculation with large quants. The firm cannot be precise (in the general case).
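A toy illustration of the point, with made-up wage figures: quantize each wage to its lot size, and the single coarse quant at the top contributes as much round-off error as the whole workforce combined.

```python
import numpy as np

rng = np.random.default_rng(1)

def round_to_quant(x, q):
    """Quantize values to the nearest multiple of lot size q."""
    return np.round(np.asarray(x) / q) * q

# Hypothetical firm: 999 workers priced on a fine wage quant,
# one CEO priced on a very coarse one.
wages = rng.uniform(40_000, 80_000, size=999)
ceo = 9_300_000.0
fine, coarse = 1_000.0, 1_000_000.0     # assumed wage quants

err_workers = float(np.abs(round_to_quant(wages, fine) - wages).sum())
err_ceo = float(abs(round_to_quant(ceo, coarse) - ceo))

# The top-heavy quant dominates the precision of any linear
# combination of firm wages.
print(err_workers, err_ceo)
```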

The author states his technique is most valuable for small firms, which would be correct. Larger firms can use variance approximations.

Better measures of employment dispersion are important in deciding the issue of structural vs cyclical unemployment, an issue for Recalculationists.

The upshot of this study is that CEOs are paid about 150 times more than they need to be.

Here is 2006 research that uses entropy to determine lot size and transaction rates in the firm! I thought I was original! This research is valuable in that it looks at the learning process (Ramsey Search) in determining lot sizes for a firm.

This research computes entropy measures of regional skill level in Europe, and shows a positive correlation between maximum entropy and growth.

I will be wading into this morass using the entropy model. Note the main issue here: do government stimulus efforts increase or decrease entropy in the work force?

Tuesday, December 29, 2009

An old Stiglitz paper on the Limitations of the Price Norm

Which Brad referenced in a post about pricing CO2 emissions.

The paper is gated, but I assume Stiglitz explains that at equilibrium all inventory variances will drop to zero, and the price signal reaches the informationally useless level of zero. Another statement is that at equilibrium, all sectors are perfectly decorrelated and price comparisons across products fail.

In fact, producers might find alternative uses for a product in surplus and thus inflate the production line. Stiglitz would be led to one of two conclusions: 1) another Norm emerges which is informationally useful, or 2) we end up back at Creative Destruction. My argument is that Stiglitz is stuck with Constant Uncertainty, hence he has proved Keynes wrong. Kling started this focus a few years ago when he stated we need a non-price norm to understand change.

We know what the basic underlying Norm is for the economy, and that price comparison is an approximation.

Monday, December 28, 2009

Krugman mistaken about what we learned

He complains:

"Even as the dot-com bubble deflated, credulous bankers and investors began inflating a new bubble in housing. Even after famous, admired companies like Enron and WorldCom were revealed to have been Potemkin corporations with facades built out of creative accounting, analysts and investors believed banks’ claims about their own financial strength and bought into the hype about investments they didn’t understand."

Potemkin firms and credulous bankers exist to maintain a semblance of normality while the rest of the components align for a restructuring. Then we get on with the unwinding, all at once with economies of scale.

Here is an update on the GS shenanigans:

"The Times article points out that unusually large contrary bets placed by Goldman and others were not primarily defensive. According to industry experts interviewed, these bets put the firms’ interests clearly at odds with their clients’ interests. Goldman says that its clients knew that it might place contrary bets. But does that excuse placing them? What goals, other than lining it pockets, were served by the deals?"

GS serves the unsustainable goals of the government mortgage insurance game. It is their job to force the bankruptcy.

More on California's looming default

Pension Pulse points us toward a Bloomberg Report:

"Investors have demanded higher interest rates from California, compared with other borrowers. The state’s 10-year bonds yielded 4.6 percent by the end of last week, 1.51 percentage points more than top-rated municipal borrowers, according to Bloomberg indexes. Three months ago, that difference was as little as 1.06 percentage points. Greek 10- year bonds yield 5.72 percent, Ireland’s 4.78 percent and Spain’s 3.93 percent.

In California, “it’s never a quick budget, it’s always prolonged and when it’s prolonged the headlines get worse and spreads widen,” said Peter Hayes, who oversees $115 billion in municipal bonds for New York-based BlackRock Inc., the world’s largest asset manager."

It is the last paragraph that tells the tale. Negotiations with lenders grow more drawn out, and the deals involve so many complications that the lenders no longer have the resources to continue. Hence the Gubinator goes to Washington to have Federal mandates lifted. In other words, variance lending no longer works; time to recode entropy without mandates.

Yet another of Felix Salmon's endless government insurance programs causing long term risk and uncertainty.

When caught in a fraud, settle out of court

Which is really what is happening here:

"The Treasury Department announced Dec. 24 that the two mortgage-finance companies, which were seized by the U.S. almost 16 months ago, could tap an unlimited amount of capital for three years, up from as much as $200 billion each. It reworked caps on Fannie Mae and Freddie Mac’s mortgage-asset portfolios to require the holdings to fall to $810 billion each by Dec. 31, 2010, rather than about $690 billion."

The aggregate taxpayer somewhere in the past agreed to backstop Congressional involvement in the mortgage industry. That promise is now being kept by the aggregate taxpayer. If I were an aggregate taxpayer I would think seriously before off-handedly issuing block insurance promises.

26 miles of Oil Tankers!

Which Mish talks about.

So what is happening from QM perspective? As long as traders have a flexible supply of tankers, then they can use them for storage and make variance trades. As soon as the tanker supply shows stress, one gets quantum effects, the lot size (one tanker) begins to have noticeable round off errors. That signifies it is time for traders to engage in entropy trades. Mainly, see if you can buy a few tankers of oil in an optimum region, trying to match the size of the trade and maximize entropy.

Now a smart oil trader would have a special consultant doing entropy analysis on the distribution of tankers over time and region. QM Theory as it develops will incorporate the dual norms in trade optimizers. The result will be greater transparency because each party will have tools to measure the entropy loss or gain in trades.

Can Congress provide general risk insurance?

I hate to read anything into Felix Salmon's post, but he says:

"I’d actually go further than that, and say that the dynamism of capitalism is largely a function of safety nets, dispersed risk, and limited downside."

I don't know about you, but I have a gestalt about Congress that tells me a business partner like the Representative of San Francisco will not make my firm secure.

Maybe Felix is doing the Keynesian head fake, giving us a false sense of security so that we go out and make big commitments. Then, I fear, the Pelosi jerk of the leash, a yank back to serfdom.

Felix does not inspire me to buy insurance from Congress.

Sunday, December 27, 2009

Sheldon Silver, go away

New York, the State and City would do well if Sheldon Silver retired, gracefully.

Entropy Encoding in the Brain?

Science to Go reports:

"Researchers led by neuroscientist Joe Tsien found that the brain appears to have a system of repeatedly replaying and reinforcing the same cellular event that led to the initial formation of a memory. The reinforcement is critical for creating the cell-to-cell connections that constitute long-term memories, the researchers found."

Remember Pavlov?

The idea is that coincidence detectors remove redundancy in neuronal spike trains. The rat in Tsien's experiment could associate the vision of the water pool with the muscle reaction required to escape. So the brain gets a learning shortcut, a sort of Cliff Notes for the complex problem of getting out of the water.

The debate here is when the spike trains cause some protein build up, dendrite growth, that makes long term memory. Coincidence detectors appear on the surface, cortical, I think. The brain embeds the past, so to speak.

Can we find out whether we get a thrill when we coincidence detect? Remember the seven things in a row we remember? A seven level entropy encoder, pretty neat?

I am reading through some of the papers that look for entropy efficiency in spike trains. The complement of entropy is redundancy. The brain might want less redundancy; it means less work.

Saturday, December 26, 2009

Venezuela and default probability

Reuters reports on CMA Datavision default probability numbers. WSJ reports on some of the "progressive" political antics that caused the problem. HT Carpe Diem.

Looking at the CMA Datavision report, the top sovereign default probabilities are Venezuela (57.7%), Ukraine (54.6%) and Argentina (49.1%). For comparison, Greece (17%) sits at eighth place.

The method maps the insurance premiums on a country's debt to a cumulative probability of default in the next five years. The bottom line of the computation is to build a trend model from the history of debt insurance prices and see how probable it is that the model converges to a higher, unsustainable premium.
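A back-of-the-envelope version of that mapping, assuming a flat hazard rate and a 40% recovery; the spreads below are illustrative, not CMA's inputs:

```python
import math

def implied_default_prob(spread_bps, recovery=0.4, years=5):
    """Standard CDS rule of thumb: constant hazard rate
    lam = s / (1 - R), cumulative P(default by T) = 1 - exp(-lam * T)."""
    s = spread_bps / 10_000.0
    lam = s / (1.0 - recovery)
    return 1.0 - math.exp(-lam * years)

# Hypothetical spreads in basis points; a wider spread implies a higher
# cumulative five-year default probability.
for name, bps in [("Venezuela", 1000), ("Greece", 250), ("California", 350)]:
    print(name, round(implied_default_prob(bps), 3))
```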

Earlier, CMA put California default probability at 27%. Given that California is part of the USA, that is a very high default probability.

Congress is in a race, a race to keep the dollar internationally sound while inflating debt away. I do not think that is possible with the eyes of all investors watching inflation rates. Congress will sooner or later either engage in serious financial reorganization, or the US economy will enter another deflation. I am with Mish on this. The international reserve system is gearing up for an alternative to the dollar, and they will move fast once Congress begins the debt inflation.

Thursday, December 24, 2009

Derivatives and QM Theory and Fraud

If I were a believer in QM Theory and I wanted to cheat the buyers of derivatives out of their money, this is how I would do it.

I would deliberately construct lot sizes that are not maximum entropy, and hence secretly create opportunities for minimum variance trades and short the sold bundles. My side information is the disequilibrated entropy design of the bundles; my hedging system is minimum variance insurance trades.

The most familiar version of this cheat is the programmer who shaves fractions of pennies off computerized interest accounting, then holds the saved fractions in a secret account.

This is complementary to cheating the Fed with Kelly trades when the Fed distorts the banking channel. In this case, the Fed creates channel distortion by deliberately enabling minimum variance trades, but entropy traders jump in and hog the channel.

With bundled derivatives, the sellers design in a round off error because the bundle lot sizes do not match the actual entropy of the mortgage aggregate. That round off error becomes the probability in a Kelly trade formula or the minimum variance insurance premium.
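The round-off error can be made concrete with a toy four-tranche aggregate: if the lot sizes are designed around a distribution q while the mortgage pool actually follows p, the buyer pays the cross-entropy while the seller knows the true entropy, and the gap is the seller's hidden edge. The probabilities below are invented for illustration.

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy of distribution p, in bits."""
    return float(-(p * np.log2(p)).sum())

def code_length_bits(p, q):
    """Average bits paid when lot sizes are designed for distribution q
    (ideal lengths -log2 q) but the aggregate actually follows p."""
    return float(-(p * np.log2(q)).sum())

# Hypothetical tranche probabilities: true aggregate p vs. the
# distribution q the seller designed the bundle lot sizes around.
p = np.array([0.5, 0.25, 0.125, 0.125])
q = np.array([0.25, 0.25, 0.25, 0.25])

h = entropy_bits(p)          # best possible: 1.75 bits
l = code_length_bits(p, q)   # what the buyer pays: 2.0 bits
print(h, l, l - h)           # the gap is the seller's hidden edge
```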

The seller can do this when he has advance or more detailed knowledge of the original mortgage aggregates than the buyer has. When this situation exists, the buyer cannot take the bundles and reverse the coding process with any accuracy.

Anyway, by understanding which norm dominates a channel, the trader can find other opportunities, maybe, using the complementary norm. The norms are balanced when transaction costs of variance trades equal quantization noise of entropy trades. Hire me as an expert witness in the trial against Goldman Sachs.

Congress is performing a similar trick with Health Care. They are Requantizing, and key legislators know that the quantization is not entropy optimum. The variance trades in this case are the many free riders that result. Employers, insurers and medical professionals will be searching out empty spaces in the market where variance trades can shave off a percentage from the government.

My Christmas Story

Christmas, like most holidays, is a retreat into the past, to a previous way of doing business which survives in residual culture. In Christmas we retreat to the gift economy, in which we balance accounts. Santa is the King of old, and we are his serfdom. We keep Santa around because when we requantized society after overthrowing the King, we could not quite cover all the King's liabilities. So our new division of labor carries with it a small imbalance, an inability to cover some portions of the King's liabilities, which grows over the year. So we have a planned deflation, a return to the simpler production once a year, a time in which the old rules apply and we can settle this residual account of the King.

It is our children, because when we overthrew the King, we partially abandoned the children with our new duties. So we bring back the kindly King for the child in us.

Audi on the move

Audi is going to send an E Car on the Pikes Peak run in [June 27?]. The report is from the Sydney Morning Herald.

"Audi, however, is pre-empting a backlash from motorists by stressing that the Autonomous Audi TTS Coupe is not designed to dispense with drivers in the future."

Audi knows the deal here: every sports car owner wants to turn on race mode and sit back. That will be drivers too much in love with their car's automation. The way Audi set up the E network for communicating with the car is spot on, and I expect every Audi lover will want this digital system.

Audi should advertise on my blog and I will extol the virtues of their technology.

Another economist joins the Tort Club

Says Graham Dawson from the Mises Blog:

"The use of fossil fuels, like any other economic activity, should be subject to constraints designed to avoid the infringement of other people's property rights. Tort litigation on the basis of strict liability would protect people against others meddling with their climates. The courts would build up a body of common law and establish precedents to guide the actions of the users of fossil fuels — a privatized policy."

Or should we call this the Mises Club?

Wednesday, December 23, 2009

Looking at oil markets

Here I try and use the Dual Norm insight and see if it fits my view of the oil markets.

A simple entropy analysis of oil distribution reveals the quants: the tanker ship, the oil port, the refinery, the gasoline truck, the gas station and the auto gas tank. These quants are unlikely to change, but more or fewer of them will be deployed over the globe in a spanning tree. And oil is liquid. Hence, I would expect the oil experts to use, effectively, Minimum Variance estimates of future oil flows.

The yield curve of oil distribution will be segmented by the depreciation schedules of the technology supporting these Quants.

DeLong, Kling, Speculators, Austrians, and Norm

DeLong helps the bankers play the Fed accommodation game. I always like Brad's little math tricks, being such a bad mathematician myself. Kling wonders if DeLong is close to Austria. Brad would make a great Recalculationist, and his math would be accurate.

So, let me simply discuss the whole issue of Speculators, Fed accommodation, Norms, and games. If, by some magic, the Fed can force open an accommodative banking channel, then entropy based Kelly gamblers will fill the channel with optimum sized trades such that any other effort to find minimum variance trades will be drowned in quantization noise. In English, the transaction costs are now very high for finding available Variance trades in an Entropy dominated channel.

From an optimization point of view this is clear: if the Fed can distort the channel, then the channel should develop a recoding, changing the frequency and size of trades across the yield curve. Look at the Shannon channel formula, the C/B in the exponent. By magic we distort B, the channel bandwidth. Then the market has a lower C, the goods rate, and hence a Recoding must occur in an Entropy channel (this is the Ramsey search). In the financial system, the Recoding will simply result in a production chain that generates a way around the Fed distortion.
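For reference, the formula in question with made-up numbers: C = B * log2(1 + snr), so shrinking B directly lowers the goods rate C the channel can carry at the fixed snr, and the trades must be recoded into fewer, larger lots.

```python
import math

def capacity(bandwidth, snr):
    """Shannon channel capacity: C = B * log2(1 + snr)."""
    return bandwidth * math.log2(1 + snr)

snr = 7.0            # fixed, biologically driven in this story
b_normal = 100.0     # hypothetical channel bandwidth units
b_distorted = 60.0   # the Fed narrows the channel

c0 = capacity(b_normal, snr)      # 300 goods per period
c1 = capacity(b_distorted, snr)   # 180: a forced recoding
print(c0, c1)
```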

Boys and girls, my guess is that right now entropy coding dominates variance coding in the banking channel. Fiscal and Monetary games cost a lot in transaction costs, because the markets have to recode.

Tuesday, December 22, 2009

Tracking the Taxonomy of QM

From the Hydraulic model to the Dual Norms model.

Infinite dimensional, reversible flow. This is the hydraulic model with no inventories, infinite stages of production, smoothly changing. The result is that all goods distributions have smooth Gaussian yield curves. No shock can harm the system because we can costlessly repossess and disassemble products and put their components into inventory. Everything is visible and firms minimize inventory variance. This is the minimum Variance Norm. If we add hidden information, nothing changes except that decay processes are known ex post.

I suggest following the physics standard and put decay processes on the right side of any yield curve where they have negative sign.

The next addition is positive definite flow, little reversibility. This introduces inventories and the Zero Bound problem. But the Zero Bound problem still allows smooth changes as long as firms in the stages of production can costlessly merge and divide so as to avoid the bound.

Then I introduce Constant Uncertainty: inventories at all levels of production will have the same variance. This introduces the market, for the first time, in my opinion. Markets exist so the stages of production can negotiate the size and frequency of goods shipments to meet constant uncertainty. This introduces the Shannon coding problem of the market and introduces the Entropy Norm.

Up through hydraulic macro, the simple mechanics of minimizing inventory variances was the main concern. Inventories were allowed to grow to reasonable size; even the Zero Bound was handled. In these conditions, the economy simply acts like a hydraulic flow, with local adjustments promising to lead to global adjustments. Hence, no real need for markets, whether truck-and-barter or monetary.


Here I perform a selective web search of entropy based economic studies. I limit references to studies that utilize entropy coding in markets, excluding the theories based on general thermodynamic entropy. The two themes should be resolved, but not by me right now.


"Interbank markets allow banks to cope with specific liquidity shocks. At the same time, they may be a channel allowing a bank default to spread to other banks. This paper analyzes how contagion propagates within the Italian interbank market using a unique data set including actual bilateral exposures. Since information on bilateral exposures was not available in most previous studies, they assumed that banks spread their lending as evenly as possible among all the other banks by maximizing the entropy of interbank linkages. Based on the data available on actual bilateral exposures for all Italian banks, the results obtained by assuming the maximum entropy are compared with those reflecting the observed structure of interbank claims. The comparison indicates that, in line with the thesis prevailing in the literature, the maximum entropy method tends to underestimate the extent of contagion. However, this does not hold in general. Under certain circumstances, depending on the structure of the interbank linkages, the recovery rates of interbank exposures and banks’ capitalization, the maximum entropy approach overestimates the scope for contagion."
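The maximum entropy assumption the paper tests can be sketched in a few lines: given only each bank's total interbank assets and liabilities, the most even spread consistent with those marginals is the outer product estimate. Real studies also zero the diagonal and rebalance, e.g. with the RAS algorithm; the balance sheet figures below are invented.

```python
import numpy as np

def max_entropy_exposures(assets, liabilities):
    """Maximum entropy estimate of bilateral interbank exposures:
    spread lending as evenly as the marginals allow,
    X[i, j] = a[i] * l[j] / total."""
    a = np.asarray(assets, dtype=float)
    l = np.asarray(liabilities, dtype=float)
    return np.outer(a, l) / a.sum()

a = [30.0, 20.0, 50.0]   # hypothetical interbank assets per bank
l = [40.0, 40.0, 20.0]   # hypothetical interbank liabilities
X = max_entropy_exposures(a, l)
print(X)                  # row sums recover a, column sums recover l
```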


This one actually develops the entropy model for insurance networks; still reading it.

Toronto prediction Blog:
Measuring Entropy in prediction markets.

Philippatos and Wilson:
State Weighted Entropy as a Measurement of portfolio risk.

Reesor and McLeish:
Uncertainty in financial markets: Can entropy be a solution?

And we should read the work of Kelly, Entropy and Gambling discussed in this Wiki.

The Democrats certainly hire a lot of lobbyists!

Reports Victoria McGrane at Politico:

"Washington’s influence industry is on track to shatter last year’s record $3.3 billion spent to lobby Congress and the rest of the federal government — and that’s with a down economy and about 1,500 fewer registered lobbyists in town, according to data collected by the Center for Responsive Politics."

So, Obama clearly did not understand the process when he talked about Change.

HT to Instapundit

Dueling Norms and QM Theory

Back on the issue of what Norm should be the vertical axis in the supply and demand curves. Economists currently use price comparison as the Norm for the household and firm. Let us call the household or firm by one name, the Firm, hoping our approach will show the household and firm operate from the same principles.

I am suggesting, tentatively, that QM Theory should put the Mechanics inside the Firm as minimum variance of inventories, and put Quantization in the market as a maximum entropy norm. Hence, the firm is schizophrenic, and for Supply and Demand the vertical axis is Entropy. The goal of the Firm is to minimize internal variance while minimizing quantization error in the market. What do we mean when the Firm attempts to maximize entropy? The Firm will construct the quantum of value it sells such that the product delivers the maximum of new information in the market.

Dual (or Dueling!) Norms describe the asymmetry of aggregate data. We deflate with entropy adjustments to product quanta, and inflate via inventory variance minimization.

California's default date

Blogging Stocks reports on a new index, the Sovereign Risk Monitor, which ranks governments by their probability of default. California ranks ninth, just above Greece, and Greece is very close to default. The interpretation of the default risk is that California will default within five years. However, if the index is accurate, the economy is likely to force a California default sooner, if it is going to happen anyway.

Monday, December 21, 2009

Felix Salmon wants to Reduce the Shame of Default

In this post he channels Steve Waldman. Bankers need not be let off the hook, and default is the way out; I agree.

I just want to extend the concept to local government suffering under pension obligations.

Saturday, December 19, 2009

Supply and Demand curves

Economists draw nonsensical Supply/Demand curves. The vertical axis has been screwed up by economists since they began. The vertical axis depends on your norm, and few theories put price as the norm. The supply of apples equals demand when the apple market has highest entropy (Entropy Norm) or lowest noise (Minimum Variance). These are the two Norms in common use; no economic theory I know of actually makes price a norm, so why are they still drawing the curve like this? If we have price illusion or money illusion, it is a separate condition, one which traditional undergraduate economics enforces. Money illusions have to do with the misapprehension of the utility of money. Around here we misapprehend the value of pot farming.

What is on the vertical axis of the IS/LM curves? Entropy or variance, depending on the norm. All the derangement theorems imply that the normative function for some important market has inversions. Since the theory needs a norm, its failure under any derangement syndrome must imply incompleteness in individual transactions. (The normative function does not strictly obey the triangle inequality in all markets.)

More later.

Thursday, December 17, 2009

Real World comments on Minimum Wage

The minimum wage is an accounting standard, an established one. Untold business hours are saved in planning to the extent the minimum wage is accurate. But it is the tail end of labor, the minimum labor Quant, so the economy resets the full skill range, ultimately, in response to a change in the minimum Quant. But the Minimum Wage has to move now and then.

If we followed good advice we would have it set on a more regional basis, more often with more market competition. But, good advice is expensive, so the thing is set nationally, less often.

It doesn't go down, and that quantum exclusion causes unplanned and sudden devaluations in real wages via monetary disturbance.

In Roman times, multiplication was expensive, and accounting standards mostly followed an additive trade pattern in denominations. The shrewd merchant could use multiplication and odd lots, losing economies of scale but clearing profit on quantization noise. Then the economy learned multiplication. Perhaps now, the economy should learn negative numbers?

Even the Hidden Hand takes a lesson now and then.

LGL on the Minimum Wage

Lawrence Lux posts the Minimum Wage debate.

"The entire Concept of Minimum Wage should be reevaluated by Keynesians, as the most notable part is ignored. This consists of the fact that Wage levels are differentiated by Skill levels; alteration of the Minimum Wage brings on the Shelving effect, where labor insists on separate Wage levels based upon Skill level"

Lawrence is appealing to Quantum analysis. If we set the Minimum Wage level as the smallest quantity of observable wage difference, then we can construct an N length labor production system that assigns wages as binary multiples of the standard wage unit. First back out the distribution of wages from the data, then assign wage quanta according to the Shannon channel formula.

I pointed this out on Thoma's blog on the subject, noting that the Minimum Wage might just be used as an accounting standard. It makes planning simpler by defining a preset range of wages at each skill level. The minimum wage is likely a wage standard more than a wage regulation.
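One way to make the binary-multiples idea concrete, with an invented skill distribution: give each skill level an ideal Shannon code length, ceil(-log2 p), and pay a wage of the minimum-wage quant doubled that many times.

```python
import math

# Hypothetical distribution of workers across skill levels.
skill_share = {"entry": 0.5, "skilled": 0.25, "senior": 0.125, "expert": 0.125}

MIN_WAGE = 8.0   # the minimum wage as the smallest wage quant (assumed)

# Rarer skills need more bits to signal, so they get more doublings
# of the standard wage unit.
wages = {}
for skill, p in skill_share.items():
    doublings = math.ceil(-math.log2(p))
    wages[skill] = MIN_WAGE * 2 ** doublings

print(wages)
```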

Wednesday, December 16, 2009

I like the Oil Dollar

Proposed image of Mohammed for the new Petrol Dollar

As reported here.

I earlier stated the oil dollar makes sense right now and for the next ten years. We are solving the oil problem; let's align the major reserve currency to oil while we solve it. If done right, then the energy cost of our planning will be apparent.

The Gulf Dollar should be called the Jihadi! The world will go on an Oil Efficiency Jihad.

The Jihadi may not last longer than ten or fifteen years, but it will be an enormously profitable monetary system during that period, and if its decline draws near, all parties can prepare a little crash.

The rise of the Jihadi in international trade will be a substantial monetary stimulus, forcing discipline on the investment banks. The Jihadi will be a close approximation to the Selgin Productivity Norm.

Because the Jihadi monetary system will optimally align oil trade, oil will be globally priced with maximum precision. The Jihadi will force developed nations to match oil scarcity with oil efficiency on a one to one basis.

Monday, December 14, 2009


The Entropy Function

This function is relevant to Paul Samuelson because information theory grew out of statistical mechanics, which was Samuelson's favorite tool.

This chart says that a coin with equiprobable heads or tails will deliver the most information over a large number of coin flips. If the coin always turns up tails, there is no information gained by coin flipping. Nor is any information gained when heads always turns up.
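The chart is the binary entropy function, which takes two lines of code: a certain coin carries no information, and the fair coin carries a full bit per flip.

```python
import math

def binary_entropy(p):
    """H(p) in bits for a coin with heads probability p."""
    if p in (0.0, 1.0):
        return 0.0           # a certain coin carries no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Maximum information per flip at the fair coin, none at the extremes.
for p in (0.0, 0.1, 0.5, 0.9, 1.0):
    print(p, round(binary_entropy(p), 3))
```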

Leading us to Huffman coding. The idea with Huffman coding is to spend the most bits on the least probable events. Common events are coded with the fewest bits. Hence, the communications channel devotes the proper amount of bandwidth according to event probability. Thus bandwidth is allocated such that symbols arrive with equal probability, and one gets maximum entropy, as in the equiprobable coin toss.

The equation above is of the form P(x)*log(P(x)), which I take to be the amount of work required to get maximum information entropy. The Huffman coder actually resembles a distribution network, and the number of decisions on the network should correspond to an NlogN format. Operation counts and the maximum entropy function should be related.
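A minimal Huffman sketch over four invented event frequencies, showing the tree-like distribution network: the common event costs one bit while the rare ones spend three.

```python
import heapq

def huffman_lengths(probs):
    """Code length in bits per symbol from a Huffman tree: rare
    symbols get long codes, common symbols get short ones."""
    # Heap entries: (probability, tie-breaker, {symbol: depth}).
    heap = [(p, i, {s: 0}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    tick = len(heap)
    while len(heap) > 1:
        p1, _, a = heapq.heappop(heap)
        p2, _, b = heapq.heappop(heap)
        # Merging two subtrees pushes every leaf one level deeper.
        merged = {s: d + 1 for s, d in {**a, **b}.items()}
        heapq.heappush(heap, (p1 + p2, tick, merged))
        tick += 1
    return heap[0][2]

# Hypothetical event frequencies.
probs = {"common": 0.5, "likely": 0.25, "rare": 0.125, "rarer": 0.125}
print(huffman_lengths(probs))
```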

What does this have to do with the economy? The economy is a noisy channel which must allocate inventory investment to goods, where the domain is the relative constraint of the good and the range is the inventory space allocated for the good. The allocation of symbols, in the economy, is what I call setting the lot size; the bits are the stages of production. The result is that inventories can arrive with the same variance regardless of the constraint; the inventory channels are maximum entropy.

When referring to the Shannon coding theorem, remember my a priori assumption is that the noise level is a constant, biologically driven. Well, let me just write out the equation in a form we need.

2**(C/B) = 1 + snr

SNR, the signal-to-noise ratio, is fixed. B is the bandwidth of the channel, or in our case the bandwidth of the industrial production equipment. C, the channel capacity, is in our case the transaction rate at each level; for now let's treat it as the scalar transaction rate at the retail level. The Hidden Hand tries to adjust C such that 1 + snr is met, making the human happy. We adjust C, the number of symbols, or sales, mainly by changing lot sizes, or in the channel case by reallocating bits so they minimize transactions for constrained resources.
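
Rearranged, the equation is the familiar Shannon-Hartley form, C = B log2(1 + snr). A quick sketch, with the specific numbers chosen only for illustration:

```python
import math

def channel_capacity(bandwidth, snr):
    """Shannon-Hartley capacity: C = B * log2(1 + snr), from 2**(C/B) = 1 + snr."""
    return bandwidth * math.log2(1 + snr)

# With snr held fixed (the biologically driven constant above),
# capacity can only grow by widening B -- retooling the production line.
print(channel_capacity(1.0, 7))  # -> 3.0
print(channel_capacity(2.0, 7))  # -> 6.0 (doubling B doubles C at fixed snr)
```

The point of the sketch: at a fixed snr, the only lever left on capacity is bandwidth, which is the longer-term adjustment described two paragraphs down.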

I have to be careful that I understand the concepts of deflated and inflated states with respect to bit assignment in a Huffman encoder. Also, I introduce the second constraint in the system: the cost of providing additional NlogN chunks of labor. And still, my thinking on the asymmetry problem is muddled. More later.

Consider the supply chain for consumer electronics with inventories growing at all levels from producer to consumer. This is productivity increasing faster than demand; excess profits are plowed into specialization, and the industry inflates, increasing the stages of production and delivering more specialized products to the consumer. The consumer is happier with the lower volume but greater specialization. The system has matched the consumer's snr levels. Increasing the number of bits increases the precision of the product.

Changing B in the production system is a longer-term process, involving supply chain adjustments in lot sizes and in the technology of the industrial machines.

The key to understanding the deflation/inflation tipping points is to understand the nature of the delta NlogN in transaction rates across the jump.

Where is this leading? First, I probably got signs wrong; I usually do. Second, B and C become matrices, and the left side of the channel equation above becomes a matrix power series. The eigenfunctions on the right will be snr * 2**i, with i finite and small. The power series gives the vector of inventory variances, and the solution will actually be the lot size, in units of snr. An example might be taking a consumer survey to discover the smallest noticeable consumption of water, then computing the lot sizes, which yield the inventory capacity in an N-stage, smooth-earth water distribution system.

Saturday, December 12, 2009

Robert Burton and the certainty research

I first ran into Dr. Burton's research via Arnold Kling. Here is a Burton interview for starters. Dr. Burton refers to the mistake of complete certainty; it is a false emotion, and life's alternatives come from the slight uncertainty of things. The QM physicists say such things.

To the point: certainty and its inhibition are layered, not symmetrical, in the brain. Certainty (impulse to action) is the older brain function, likely olfactory; the Limbic system starts the inhibition process, widening the band of certainty we find comfortable.

Cutting to the chase, I think we are pulse-frequency-modulated stuff trackers; we track the frequency of arrival of good stuff. Our ability to track events, seasons, grazing routes, the herd: all that must have tuned itself to the ebb and flow of generational life, and our brains are probably preset to the frequencies of the regular events in life. We are a tuned tracker of good stuff. Operating with a fixed uncertainty has benefits.

In this conservation of work, evolution discounts infrequent events, minimizing the certainty that matches the mammal to its environment. Thus it can fit simple trackers into unique environments, devoting more of its budget to running, jumping, etc.

This gets us to a simple Kalman filter tracker with fixed uncertainty, tuned to the plains of Africa. Buyers and sellers judge each other, ultimately, by arrival rates; hence in our search for a tuned environment we create one, inadvertently, as with a Hidden Hand.
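
A minimal scalar Kalman tracker with the measurement uncertainty r held fixed, as a sketch of the idea; the function name, noise values, and data here are mine, purely illustrative:

```python
def kalman_track(measurements, q=0.01, r=0.125):
    """Scalar Kalman filter tracking a slowly drifting arrival rate.

    q: process noise (how fast the herd can move)
    r: measurement noise, held fixed -- the tuned uncertainty of the post
    """
    x, p = measurements[0], 1.0  # initial estimate and its variance
    estimates = [x]
    for z in measurements[1:]:
        p = p + q            # predict: uncertainty grows between observations
        k = p / (p + r)      # Kalman gain: trust in the new observation
        x = x + k * (z - x)  # update the estimate toward the observation
        p = (1 - k) * p      # uncertainty shrinks after the update
        estimates.append(x)
    return estimates

# A noisy but steady arrival rate settles near its true value.
est = kalman_track([1.0, 1.2, 0.9, 1.1, 1.0, 1.05, 0.95])
print(round(est[-1], 2))
```

Because r never adapts, the tracker converges to a fixed steady-state gain: a cheap, preset level of certainty, which is the evolutionary bargain described above.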

Yes, Kling reminds me of this Tim Harford post on why poor nations often stay that way. At the heart is the inability of poor societies to maintain the repeatability of goods flow, so this approach necessarily emphasizes the development of transportation.

Friday, December 11, 2009

Tight and Loose money

In a flow model, I talk about tight and loose goods flow, then specialize to debt.

Simple: in the inflated state, a distribution network has tight inventories; in the deflated state it has loose inventories. The results follow because the state change is faster than the goods adjustment, so the state change overshoots or undershoots. When bankers inflate, we have more stages of bankers in the production line than we have debt flowing. When the bankers have deflated, money is loose.

Deflation and inflation paths have to be asymmetric, but I am still a little confused there.

How's the bandwidth?

The shortest sample period the bankers are following is about one year, with a bandwidth of two years on the short end. Bankers have simplified the yield curve to three broad strokes. At the long end, the 20 and 30 year terms are blurred, so we have little vision of the downward slope into negative term territory.

If we are going sideways, that is good; it means decay and growth are currently even, and we are stationary. But we are seeing only the broad strokes of the economy, because whatever ails us was highly correlated with all aspects of the economy. We have lowered the dimensionality of our view on purpose, as part of a Ramsey search, finding the optimal distribution for the most constrained input.

Thursday, December 10, 2009

A Stimulus tool for foreign aid

New work reported by VOX on the limitations of internal transportation in Africa. Krugman reported similar results when working with trade patterns. I find it important, and it leads to a stimulus. Why not simply subsidize the shipment of goods to an African villager? Pay any shipper of goods a 20% subsidy on shipping costs when the good arrives at an African village.

The subsidy in shipping will, over time, ease the leading constraint on African development. The cost is slightly higher taxes for OECD citizens.

Forget African exports here; the idea is to get them great stuff sooner and allow the natural development of transportation.

Positive Remandation

I have decided to take up Supreme Court Law, why not, if it helps Justice Roberts out of his dilemma.

For his sake, I invent Positive Remandation, otherwise known as nailing the vampire in his coffin, as in Citizens United Corp. Justice Roberts affirms that standard contract judges rule in corporate rights (non)cases, and necessarily retracts the law on limited liability to nothingness, zero, zip. No legislature may ever remove personal liberties by grants of limited liability.

Put me on the court next time.

Wednesday, December 9, 2009

My assessment of the economy

With an update below.

I note that oil imports are approaching their 2001 import levels, by volume, when oil supplies first became constrained. Oil prices are now at $75/barrel, about a 10% annual inflation from the $30/barrel of 2001. Hence we now have some wiggle room between nominal and real oil prices; oil supplies are much less tight. Retail inflation (core) is about 0%, so we have an equilibrium. The economy has lost output capability, about $150 billion a year, due to excessive oil prices, a consequence we will have to endure until efficiency or time erases it. At 3.5% a year over ten years, we are potentially at risk for about $3.0 trillion total over the next ten years.
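
A quick back-of-the-envelope check on the 10% figure, assuming roughly nine years of compounding from 2001 (the year count is my assumption):

```python
# Compounding $30/barrel (2001) at 10% a year for about nine years
# lands near the quoted $75/barrel.
price_2001, rate, years = 30.0, 0.10, 9
price_now = price_2001 * (1 + rate) ** years
print(round(price_now, 2))  # -> 70.74
```

So "about 10% annual" is consistent with the $30-to-$75 move over the period.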

If the Fed normalizes the yield curve in mid-2010, then, with its constant six-month delay, we should see real rates form a normalized yield curve about now. Hence the Fed is in unavoidable bubble mode, unless it can pull an unexpected rate hike. Theory says it cannot pull the unexpected, so here is a test of the theory.

For an update, I am going to link to Gail the Actuary who laid out the calculationist argument using the oil constraint and oil volatility in a post at Oil Drum.

Monday, December 7, 2009

In which I take question number 3 from Bryan Caplan

which he asks of the Recalculationists:

"By what percentage do real GDP and employment fall if nominal GDP unexpectedly declines by 5%?"

I go back to Bryan's original example, mud pies. The mud pie industry deflates because mud pie utility suddenly drops. I use bits of precision as my recalculation metaphor.

Pie utility has dropped, so the industry wants to simplify its supply chain, increasing lot sizes at each stage and gaining economies of scale. The pie distribution network drops from a four-stage network to a three-stage one. Measurement accuracy drops from 1/16 to 1/8, but the transaction rates reduce from 4log4 to 3log3, and employment drops in proportion to transactions. Before and after the adjustment, at each stage of production, the lot size is set to present an inventory variation equal to the constant uncertainty.
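
The arithmetic of the stage drop, as a sketch; I assume base-2 logs, which the original "4log4 to 3log3" does not specify:

```python
import math

def stage_metrics(n_stages):
    """Precision and NlogN transaction load of an n-stage network (base 2 assumed)."""
    precision = 2.0 ** -n_stages                   # 4 stages -> 1/16, 3 -> 1/8
    transactions = n_stages * math.log2(n_stages)  # the post's NlogN load
    return precision, transactions

p4, t4 = stage_metrics(4)
p3, t3 = stage_metrics(3)
print(p4, p3)  # -> 0.0625 0.125
print(f"employment falls to about {t3 / t4:.0%} of its former level")  # -> about 59%
```

So dropping one stage halves precision (1/16 to 1/8) but cuts the transaction load, and hence employment in this story, to roughly 59% of its former level.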

Now, if we make no other assumptions, especially assumptions about rates of deflation and asymmetry in trades, but assume the same completeness* of the banking sector, then the central bank would notice increased imprecision in its measurement of NGDP proportional to the net loss in utility of mud pies. The economy is restored to precision when the mud pie industry is absorbed into a four-stage food conglomerate. The one-time restructuring fee is the precision loss weighted by the mud pie share of the economy. (Mud pie industrial equipment will be devalued as it is less precise in matching input rates to output rates in the firm.)

Note 1) QM theory would have various distribution networks popping between deflation and inflation states with differing probability, based on constraints. Banking and all other sectors would change the relative probability of being deflated vs. inflated based on the net loss.

Note 2) If there is not a large drop in mud pie utility, then the industry stays in its corridor, and over the business cycle will display nominal precision.

* Complete (my definition) : The dimensionality of the domain equals dimensionality of range. The bankers track the economy with acceptable uncertainty.

Inflation and debt revisited

Research on inflation by Reis and Watson, and by Aizenman and Marion. The former says that inflation is composed of 20% common monetary inflation and the rest relative inflation among goods. The latter says the most debt we can inflate away is about 20%. Are these two numbers a coincidence?

Reading both papers (which I am still doing), my model says, of the former, that as bankers push inflation they also spread the relative pricing of consumer goods and hit a constraint. Of the latter, it says that if the bankers push inflation they also cause future debt to be indexed to inflation, and they hit a constraint.

HT to Econbrowser and EconLog. I post this to keep these two references close, for this issue is going to play out very soon. We are at the 20% limit, or very near it.

Sunday, December 6, 2009

More brain evidence of the uncertainty constant

I always watch the neurosciences, looking for some clue that tells me what the universal "herding" constant is for humans, that is the comfortable level of uncertainty about where the herd is headed.

Lauren Schenkman files a report on new advances for ABC News:

"Countless psychological experiments have shown that, on average, the longest sequence a normal person can recall on the fly contains about seven items."

She then goes on to describe a neuronal model for this devised by Mikhail Rabinovich, a neuroscientist at the BioCircuits Institute at the University of California.

The point for economic theory is that the eighth item in a list is barely remembered; the brain can track seven items reliably. This kind of research leads us toward the master uncertainty constant of human economies. We are comfortable when our error rate reaches .125, our first guess at the universal uncertainty constant. This tells Quantum Economists a lot about the structure of the economy in terms of distribution networks. More later.
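
One way to connect the seven-item limit to the .125 constant, assuming seven reliably tracked items amounts to eight distinguishable states (my reading, not the article's):

```python
import math

# Seven reliably tracked items, plus the "lost track" case, gives eight
# distinguishable states (my assumption) -- about three bits of working
# memory, with a leftover error rate of one part in eight.
states = 8
bits = math.log2(states)  # -> 3.0
error_rate = 2 ** -bits   # -> 0.125, the post's uncertainty constant
print(bits, error_rate)   # -> 3.0 0.125
```

The .125 figure, then, is just 2 to the minus 3: the residual beyond a three-bit tracker.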

Kling, Recalculation and the Great Depression

I generally like the Recalculation story by Kling.

I think incompleteness does provide a context that fits his story in detail. By incompleteness I mean that the economy is deliberately imprecise because of the cost of precision. It is very expensive to track all the possible misallocations, or to find all the profit opportunities. Hence, over time, the economy has adapted to a boom-bust cycle as the least-cost alternative.

Second comment I have is the details of the Great Depression. Kling comments:

"The amount of economic realignment caused by the internal combustion engine ultimately was huge. "

Then goes on to comment:

"A source of the current Recalculation is the Internet. "

He is close, but I wish he would have commented on the advent of broadcast-based commercial markets in the late '20s. Like the Internet, broadcast radio changed the utility of freight transportation greatly and suddenly, through much more efficient advertising.

Saturday, December 5, 2009

We await the spiritual raising by Justice Roberts

Justice Roberts, in an amazing extension of rights, wants to grant artificial persons fundamental legal rights in the case of Citizens United vs. the Federal Election Commission. Why he took the case, or why he didn't remand it back to contract court, is beyond me.

It would take a tremendously ignorant law professor to accept that somewhere in the Constitution artificial beings have rights. Anybody know of a tremendously ignorant law professor?

My home town has half of its residences underwater

GoBiz has the latest numbers from CoreLogic on underwater mortgages. My hometown, Fresno, CA, has 48% of residences underwater. My hometown is also the center of public union mafia control of government in the Central Valley. This post shows our city approaching bankruptcy after years of handing out future goodies.

What is the outcome of the pension tsunami in Fresno? Ultimately it is bankruptcy and restructuring. Our new mayor is dealing with a city council that basically reports to the union head, especially the police union. Our citizens are wondering which crime is the least painful, street crime or city boardroom crime.

Obama may well delay the restructuring with a bailout, but he just ensures another Double Dip. One thing we can all count on: the Pension Tsunami is unaffordable for the federales or the locals.

Zero Hedge is doing its job.

Using forensic accounting, they have trapped the FDIC in an insider trading scandal, which Mary Schapiro at the SEC will cover up.

Tyler Cowen on Health Care

He makes a very strong data case that Medicare costs have never been under control, in spite of bad reports cited by Yglesias. What we have is wishful thinking on the part of progressives and realism on the part of investors, the result will be a severe inflation/deflation cycle, a Double Dip.

Romer worries about a repeat of the 1937 change in circumstance. That second dip came in 1937 because government changed its product mix substantially, switching to wartime production. Healthcare reform will look like a Double Dip trigger when we are done. Something which was not a current constraint in the economy now becomes one.

Thursday, December 3, 2009

What you need to know about Climategate

The fundamental we are dealing with is that the Earth is in the Holocene period, and has been for 12,000 years or so. That is the hottest period in the glacial cycle, but the length of the Holocene may be abnormal, and related to biosphere effects, a change in CO2 some 12,000 years ago. We are at the changeover period in the glacial cycle. That period is generally very rapid, a collapse in temperature, meaning it is a tipping point to a rapid cool-down.

With the CO2, then, where do we tip? Dunno, really, because we have never dwelled at the top of the glacial cycle with so much CO2, not in a million years. I can think of scenarios and minimizing constraints that might put us back to an ice ball. The danger is precisely that we are doing this at the top of the hottest period. Why? Is there some natural reason? If so, what says the tipping points go one way rather than the other? Did we do anything 12,000 years ago, what are we doing now, and what was evolution's intent regarding our relationship to the glacial cycle?

But, bottom line, the mix of CO2 with the tipping point combined with the length of Holocene will cause unexpected events.

Looking at the right side of yield

When we last left the subject, the relationship between economic yield and Fourier analysis, I said that a yield curve, as a spectrum, will be DC (zero-centered) normally distributed; at equilibrium, the time series is zero-centered Gaussian noise. I mentioned the negative terms in Fourier; they were hidden terms in the economy.

We should use the Fourier sign convention, in which the right side of the yield curve takes negative values going up. So the yield at infinity, zero frequency, has positive yield on the left and negative yield on the right. The normal distribution is symmetrical in growth and decay processes.

But the right side of published economic yields is vacant, unknown. Those terms can be recovered in hindsight with forensics, or in real time by the likes of Zero Hedge, and they are observed in revision histories. The right-side terms are held within the firm and are accounted for by command and control of internal inventory flows. They may include fraud, or potential stimulus actions by a monopoly.

The economy is risk averse, so it operates its aggregate yield curve to the left, such that the tail risk, the right side of yield, is small. We want slightly more growth processes than decay processes. But that puts the center of yield at or near the 20-year term, so we are Gaussian about a cycle because we are overly cautious. However, because of quantum effects we only measure yield from its peak to zero as a straight line, so it is not clear where the peak of yield actually lies.

Summary: we are cautious. We overproduce to keep stockpiles high. As time passes, the reserve losses do not occur, so the reserves are moved from the right to the left in profit taking; the curve obviously peaks at the 20-year term, the portion on the right dwindles, and we relax, deflate, and shift economies of scale for a while.

The theory tells us exactly when a deflation is going to occur: when the peak is noticeably at the 20-year point, for that means we get it, the past has been saved.

Felix Salmon and Political Science!

HT Marginal Revolution

Felix wants to mix economics and politics:

"Given the government’s insatiable appetite for cash, it’s only natural that it would prefer to tax plutocrats, spending some of that money on poorer Americans, rather than move to a world where poorer Americans earn more (but still don’t pay that much in taxes), and the plutocrats earn less, depriving the national fisc of untold billions in revenue."

The equilibrium is here in what Felix says: the balance between smaller government and progressive taxes. The problem, once again, is the lack of precision; we can never get that close to the equilibrium.

By the way:
fisc n. The treasury of a kingdom or state.

Felix taught me a new word, I can still learn!

The Fed Target: Interest rates, NGDP, Base Money?

That is the discussion among the banking economists at the moment: Kling, Sumner, etc. In short, which variable should the Fed watch when setting monetary policy: nominal (non-inflation-adjusted) GDP, real short-term interest rates, or the money base in circulation? I propose here that the discussion revolves around the response time of the Fed, not the target.

We are really talking about the TIPs aberration in the chart above. As I pointed out, this aberration is the result of the economy changing faster than the adaptation period of the Fed. There is a six-month bandwidth limit in the financial system; it cannot be avoided regardless of the targeting regime.

Even George Selgin's competitive monetary system would only smooth out the pulse, but the delay remains.

Why do we have a six-month noise limit in aggregate measurements? Because the cost of adding more bandwidth is enormous. How enormous? Look at the cost of our central bank in dealing with the crisis in late 2008. The central bank and Treasury were in the headlines many days, with high-level, visible meetings all day among officials in Washington. We simply cannot afford the cost of collecting aggregate information on timelines shorter than six months.

Information theory gives us a clue. If finance operates with 5 bits of precision and suddenly needs 6 bits, then the number of transactions finance needs goes from 5log(5) to 6log(6). As long as the economy is operating within a stable "corridor" we will want the Fed operating with low precision, and we are willing to suffer the consequences when change happens too fast.
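
The size of that jump, as a sketch; `transaction_load` is my own illustrative helper for the NlogN count:

```python
import math

def transaction_load(bits):
    """The NlogN transaction count the post assigns to a given precision."""
    return bits * math.log2(bits)

low, high = transaction_load(5), transaction_load(6)
print(f"one extra bit of precision costs {high / low - 1:.0%} more transactions")
# -> one extra bit of precision costs 34% more transactions
```

Note the ratio is the same whatever log base the "5log(5)" in the text means, since the base cancels; one extra bit of Fed precision is roughly a third more transaction work.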

Tuesday, December 1, 2009

Measuring GDP in a constrained environment

Comparing Real GDP to Oil Imports

Under the assumption that oil, or energy, is the dominant constraint in the economy, we would expect the oil yield curve to best represent the average yield curve of the economy. So, I am looking for correlations between oil imports and GDP. Note the Real GDP in the lower chart, from 1992 to today. Compare it with the oil import chart. They both have the inverted hockey stick. In typical asymmetry, oil declines faster than it rose.

What is the causality here? Real GDP will be decomposed mostly by the top few constraints in the economy; that is, there is only so much regression coefficient to be distributed. Most of that correlation coefficient goes to oil imports when oil is excessively constraining.

I think a look at the last portion of the oil import chart will show a small correlation between oil imports and the American Recovery Act. There was a typical, temporary excess of oil inventory after the crash. Congress managed to use it up.