Saturday, May 30, 2009

Zero crossings, asymmetry, and holiday seasons

What do these have in common? Why is it that zero inventory levels cannot be accounted for in the money system except by bankruptcy? If economic change is asymmetric, then why do holiday traditions occur at regular intervals?

The invention of finance relies on the relatively fast adaptation time of money. Yet there are many economic activities in which inventory changes occur faster than money can adapt. Money has never completely conquered the seasonal output of agriculture. Nor has the pricing system eliminated criminal activity.

When inventory goes to zero, the ratio of lot sizes (in QM Theory) determines price. Inventory coherency becomes unstable as we get a zero in the denominator. Seasonal changes in agriculture, or the sudden arrival of winter, cause periodic events in which inventory changes happen faster than money can adapt. During these periods the pricing system fails and the economy reverts to pre-monetary, culturally enforced coherency.
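
An illustrative sketch of the zero-denominator claim (not the formal QM Theory model; the numbers and the `price_ratio` helper are hypothetical): treat price as the ratio of the money lot size to the goods lot size, where the goods lot is capped by remaining inventory. As inventory goes to zero the denominator vanishes and the price loses meaning.

```python
def price_ratio(money_lot: float, goods_lot: float, inventory: float):
    """Price as a lot-size ratio; undefined (None) once inventory is exhausted."""
    effective_lot = min(goods_lot, inventory)
    if effective_lot <= 0:
        return None  # pricing fails; coherency must come from non-monetary rules
    return money_lot / effective_lot

# As inventory drains, the same money lot buys a shrinking goods lot and the
# "price" blows up, then becomes undefined at the zero crossing:
for inv in [100.0, 10.0, 1.0, 0.0]:
    print(inv, price_ratio(money_lot=50.0, goods_lot=10.0, inventory=inv))
```

The `None` branch is the code's stand-in for the post's point that bankruptcy and holiday customs price these events by rule rather than by ratio.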

A bankruptcy is priced by arbitrary rule, Christmas is a period of non-monetary giving, and Halloween is a traditional form of non-monetary taking. Herding mammals still have to maintain coherency even in the absence of money.

Stock market manipulation

As reported in the Baltimore Chronicle and Sentinel. Read at your own risk; it basically points out that natural monopolies don't go away, they just get the government's help in stock manipulation.
HT Zero Hedge

Queuing transaction rates and the yield curve

Rather than use transaction rates to map to the yield curve, use inter-arrival times. The optimization problem is: how can one minimize variance in queue size with the most idle time? So, 30-year term structures are things that happen on 30-year intervals, like building a highway or establishing a family neighborhood.

The sample period is the average inter-arrival period, so we convert to sample periods. The question becomes: how many sample frequencies do we need to estimate queue sizes? Since we are restricted to computing equilibrium moments, the question becomes: how many sampling frequencies do I need to sample Gaussian noise? The distribution of sampling frequencies becomes the Fourier transform of that, which is again Gaussian. So we pick sampling frequencies along the yield curve, which is Gaussian at stability. Use the constant measurement error to limit the dimensionality. Do one of a number of decompositions of NGDP, in time, and pick the top five modes. They should be distributed to minimize inter-nodal errors due to measurement uncertainty, which one can check against the actual real yield curves.
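
A minimal sketch of the "pick the top five modes" step, on synthetic data (the series, periods, and amplitudes are hypothetical stand-ins, not actual NGDP): Fourier-transform the series and keep the five strongest modes as candidate sampling frequencies.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(240)  # say, 240 monthly observations
series = (np.sin(2 * np.pi * t / 120)            # slow component, period 120
          + 0.5 * np.sin(2 * np.pi * t / 40)     # faster component, period 40
          + 0.1 * rng.standard_normal(t.size))   # measurement noise

power = np.abs(np.fft.rfft(series))
power[0] = 0.0  # drop the DC (mean) term

top_modes = np.argsort(power)[-5:][::-1]  # five strongest frequency bins
periods = t.size / top_modes              # corresponding periods, in samples
print(top_modes[:2], periods[:2])         # the two planted components dominate
```

Here the two planted components surface as the strongest modes; in the post's scheme, the spacing of the surviving modes would then be checked against the term points of the real yield curve.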

In the multi-stage queue model, all distribution chains will adapt to these sampling rates, dividing up production and lot sizes to meet a specific production structure, with capital investment decisions agglomerating about the sample period. So, in the queuing model, introduce lot size, again getting back to price theory as lot size ratios.

So, when a yield curve is steep, there is a tendency for long-term capital investments to hold for a long time, and variations from planned output will be managed by lower-level, shorter-term queues. Our curve is steep; whatever we have decided long term can be changed by a double-dip recession and restart (asymmetry), or it can be made to work.

Money Velocity and government spending

DeLong is at it again, trying to determine the determinants of money velocity.

Money velocity theory ultimately boils down to inventory management, and the yield curve measures inventory shortages and gluts at the various term scales. Brad thinks part of our solution is more stimulus. Then Brad believes that, right now, the set of government goods and services is a better balance in terms of inventory management. To the extent that government offers a mix of production that better balances inventory, the higher the multipliers; and vice versa. Increase the government deficit if the inventory chain becomes better managed, decrease the deficit otherwise.

Markets have given their first indication that the Obama administration still has its priorities wrong. The market disagrees with Brad. If the Obama administration shows no change, then another stimulus boost sends us to a double dip.

Friday, May 29, 2009

Maximize inter arrival times.

Maximizing inter-arrival times makes the best objective function: maximize the length of time before I have to go collect inventory. Use that in place of velocity of money for some one of a few equilibrium points. Then the yield curve makes better sense: going from longer to shorter, it estimates how often a transaction needs to take place, on average, to maintain coherency.

Then, at any given mode of the curve, there is an average lot size and typical queue size, the price being the ratio between lot sizes for both money and any other good.

If you recall, I work with the multi-stage queue model, finite span.

When was the pending recession first observed?

Macro and Other Musings discussed the issue. So I go back to my Universal Economic Calculator and, following David Beckworth's argument, I see the spot precisely as March 10, 2006. At that point, there was a rush to buy long-term Treasuries. The financial market correctly predicted the recession and the subsequent drop in long-term rates. Going to my oil charts (second chart), I discover that oil had just peaked past $60, setting a new record since 1983. By late 2006, the yield curve was fully inverted.

Niall Ferguson hits back at Paul Krugman for the Econ 101 lecture when Niall pointed out that large deficits would raise inflation expectations.

The correct answer? The recent run up in rates after the Treasury auctions is caused by the financial markets indicating the deficits cannot hold with the severe oil constraint still on the docket. It is the deficit in comparison to the constraint that matters. Financial markets are testing whether the "stimulus" is actually solving the constraint. It is not. The stimulus is trying to solve problems not related to the cause of the recession.

Thursday, May 28, 2009

Let them mix, Podcar revisited

Podcars and pedestrians mix, I am convinced. Let the Podcar know how to stop and say excuse me, and the human pedestrian will treat the Podcar with respect.

These Podcars can move as slow as pedestrians, or as fast. Podcars can go up and down a straight line on Broadway, in the middle of pedestrian traffic. Podcars are courteous and safe. They can group and move together like a bus. Podcars hang out in parks, shopping centers, factories, and busy streets.

Some Podcars carry boxes of groceries. Bicyclists should not surprise the Podcar.

Profits first, financial adjustment second

Free Exchange asks a question, Can output recover before investment does?

The answer, according to Quantum Economics, is yes: output always must give a signal that it meets economies of scale before credit can transmit that information. In a multiple-good model, in which money is just one of many goods, money distribution relies on its ability to react fast with its own inventory adjustments, faster than other goods. The job of money is to give a coherent account of inventory adjustments in other goods, so it must see them before it reports.

Within observability, most sectors see money adapting as fast as, or faster than, other goods, but this is the Money Illusion. Generally one systemic goods constraint is much harder than the constraints on other inventory systems. The one constrained good reaches a satisfactory solution, and the happy news is transmitted to other goods as fast as finance can adapt.

Tyler Cowen asks a question and I answer

In this post he asks:

"But if you can explain to me exactly why oil prices rose as they did during the first part of 2008, despite the slowing global economy"

The answer is that some technology shift allowed important parts of the economy to use oil more efficiently. There has been some positive productivity shock to the economy.

MV=PQ, modified to fit Quantum Economics

The correct form would be:
For some finite N, which seems to be in the range of 5-7, the equation becomes:

Vi becomes the transaction rate for equilibrium point i
Qi becomes the average lot size for equilibrium point i
Pi is the ratio of Qi for this equilibrium point to the average Q overall.

Then for each i, Vi = Pi Qi

Vi (or Qi?) would follow the Hamiltonian and be normal, independent; thus 1/Vi (or 1/Qi?) traces the yield curve of the economy, since long-term purchases occur less frequently. This part needs some thought, maybe later.

The sum of Vi over all i is the minimum transaction rate required to meet the Hamiltonian, and should minimize total measurement error. The end result is a Krugman agglomeration.

But we can still break Qi down into its subcomponents within an equilibrium point and talk about price just for that equilibrium point.
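
A numeric sketch of the tentative formulation above, with hypothetical lot sizes for N = 5 equilibrium points, reading the post's "Vi = PQi" as Vi = Pi·Qi with Pi the ratio of Qi to the overall average Q (the numbers and that reading are assumptions, and the post itself flags the formulation as provisional).

```python
Q = [2.0, 4.0, 8.0, 16.0, 32.0]    # hypothetical average lot sizes, point i = 0..4
Q_bar = sum(Q) / len(Q)            # overall average lot size
P = [q / Q_bar for q in Q]         # Pi: price ratio for equilibrium point i
V = [p * q for p, q in zip(P, Q)]  # Vi = Pi * Qi, the per-point transaction term

print(Q_bar)  # 12.4
print(P)      # ratios bracket 1.0 around the average lot size
print(V)
```

Note that V grows with lot size here, so 1/Vi falls along the curve from short lots to long, which is the direction the post wants for "long-term purchases occur less frequently."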

Podcars and bicycles

Can share the same road. Podcars are the safest of all the new transportation technologies, rarely going faster than 30 MPH, and they are very good at avoiding humans.

The problem is companies like Ultra, makers of the current fad in PRT. Ultra is really selling concrete skyways, which can be very restricting to transportation innovation. What we want from Ultra are the Podcars; skip the concrete. Podcars work well around the town's major loop, they work well along the center of Broadway, up and down marked lanes. They work well with a half-meter curb or a green line to mark the lane.

City administrators need to look for traffic innovations they can buy a little at a time, try out, and expand with more returns to scale. Do not buy a transit system that requires billions in up-front infrastructure.

The Treasury Bubble is bursting

That is how I read the results of the Treasury auctions. Tax receipts are down 38%, the Treasury now pays an inflation premium on debt, and investors in the Treasury hedge are pulling out their deposits.

We now have to go back to this Congress for a reallocation, or wait until the next election. Smart money is parking their funds away from Congress until we can see clearly a change in direction from Congress.

Wednesday, May 27, 2009

Using monetary policy for stimulus

Economists who like to work from the wealth curve should be worried that the poor are not sufficiently monetized. They should think of monetary policy that keeps the poor within the currency system.

That is why I have been promoting the concept of subsidizing the medium-term savings accounts of the poor. The concept is to make the behavior of the poor more coherent with the economy. Give them an extra incentive to optimize their six-month planning behavior with a few points of subsidy on their six-month bank account.

Tuesday, May 26, 2009

Is the Taylor rule still working?

Yes, but there has been a regime change.
Reset the baseline to the last quarter of 2008; we have had an equilibrium shift.

The Fed Money letter talks about the Taylor Rule (HT Greg Mankiw) and why it predicts a -5% interest rate. What nonsense; the Taylor Rule is a Newtonian construct, only valid for keeping the economy stable about an equilibrium point that has been stable for some time. We have had a shift in the eigenvectors, we have incorporated the new technology, we have restructured. One has to start from the point of restructuring and reset the initial point from there.

Remember stationarity in stochastic approximation? The proper interest rate for overnight funds should be raised, as John Taylor suggests; start fresh.
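
For reference, the textbook Taylor (1993) rule the Fed letter is applying looks like this (the standard formulation, not necessarily the letter's exact calculation; the sample inputs below are hypothetical 2009-style numbers, not the letter's data):

```python
def taylor_rate(inflation: float, output_gap: float,
                neutral_real_rate: float = 2.0,
                target_inflation: float = 2.0) -> float:
    """Recommended nominal short rate, all arguments in percent."""
    return (neutral_real_rate + inflation
            + 0.5 * (inflation - target_inflation)
            + 0.5 * output_gap)

# At target inflation with no gap, the rule gives the classic 4% benchmark:
print(taylor_rate(inflation=2.0, output_gap=0.0))   # 4.0

# Mild deflation plus a deep output gap pushes the rule well below zero,
# which is how a negative recommended rate arises:
print(taylor_rate(inflation=-1.0, output_gap=-8.0))  # -4.5
```

The post's complaint, in these terms, is that the rule's baseline (the neutral rate and target) is calibrated to the old equilibrium; resetting the baseline after a regime change amounts to changing those constants before reading off the recommendation.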

Could we have regulated the dark matter of finance?

Mark Thoma opens the debate again. The issue is whether "regulators" could have regulated the "predictors" [credit derivatives].

This debate is about who is assigned the task of predicting the future some 30 years out.

Arnold Kling has an answer, to paraphrase: for the next 30 years we will have a group of investors who think they are better predictors than the previous group.

You are stuck. As long as we demand 30 year predictions we will get dark matter.

When and how do we restructure?

When the economies of scale in the production networks cause wide variation in queue sizes, and, due to coherence, many reach the zero bound in their supply networks simultaneously.

Our QM construction gives us a guide on how to interpret a technology shock and how it forces a restructuring, as Schumpeter predicts. Consider the current restructuring. I can describe the process under the assumption that the shock we suffer is on-line shopping. We leave it to professional economists to prove or disprove the thesis, but the thesis, real or imagined, is a good discussion point.

On-line shopping reduced the number of steps in key distribution networks from N to N-1, and left us with an eigenvalue mismatch: not all distribution networks are operating with the same eigenfunction set. The result is that the money queue adapts to the average, becomes more accurate, and demands much more accurate management of queues from unaffected networks. They often attempt refined economies of scale only to see their queues reach the "zero" bound, resulting in price spikes. The zero-bound queues result in bankruptcies, and reabsorption into the banks, which now hold assets. The assets sold at auction are purchased by the more efficient distribution networks, which adapt them to the new technology and restore the natural eigenfunction set.

Explosive recombination of many sectors occurs when inventory bulges back up the distribution chains at points of convergence. In the 2008 Mini-Depression, the efficiencies of scale for on-line sales transmitted inventory bulges in other sectors up to the commonality of the energy distribution network. These inefficient sectors began to share scarce energy resources with each other, and with the more efficient sector. The more efficient sector maintains its oil inventory above the zero bound, and the others reach the zero bound more or less simultaneously due to coherence.

Monday, May 25, 2009

Deriving nodal points on the yield curve

The subject came up in a comment on another blog: why don't the distribution networks collapse in a multi-stage queuing analysis? The answer can be illustrated using my universal economic calculator and the uncertainty constant. In the process we get the outline of a proof for Milt Friedman's Plucking Theory.

Looking at the yield curve for some stable region, the fuzzy boundaries are the variation in the yields over a 30-day update cycle. That fuzzy boundary is a normal IID estimate of the uncertainty region. If one looks at the 30-year yield and asks how far down to travel on the curve before picking a spot at which the uncertainty regions do not overlap, one gets the 20-year yield, and so on. The method is an approximation.
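
The walk down the curve can be sketched as follows; the maturities, yields, and uncertainty half-width here are hypothetical placeholders, not actual Treasury data, so the particular nodes selected are illustrative only.

```python
maturities = [30, 20, 10, 7, 5, 3, 2, 1]                # years, long end first
yields_pct = [4.2, 4.0, 3.4, 3.1, 2.6, 1.8, 1.2, 0.5]  # hypothetical yields
uncertainty = 0.25   # half-width of the 30-day "fuzzy boundary" around a yield

nodes = [0]  # always keep the longest maturity as the first node
for i in range(1, len(maturities)):
    # Keep a maturity only if its uncertainty band does not overlap the band
    # of the last node kept, i.e. the yields differ by more than two half-widths.
    if abs(yields_pct[i] - yields_pct[nodes[-1]]) > 2 * uncertainty:
        nodes.append(i)

print([maturities[i] for i in nodes])
```

Widening `uncertainty` thins out the surviving nodes, which is the post's dimensionality-limiting mechanism: larger measurement error supports fewer distinguishable term points.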

What this means is that in any given N-level distribution network which has one N-1 stage distribution, there is an adjustment process that will yield a situation in which an extra step of distribution will be observable as a gain in scale. That is, after a period of incoherency, the system settles to the point that an intermediate manufacturing step will observably reduce the total number of transactions in the network.

Hence, the Plucking Theorem. The system always reverts to the number of stages defined by the shape of the yield curve and the measurement uncertainty. The shape of the curve is fixed at equilibrium by our Hamiltonian: the maximum variation in the minimum space, Gaussian.

Business cycles and all that

Business cycles under coherency and constant measurement error assumptions using the multi-stage distribution networks model.

When the household sees a drop in credit card interest rates, it estimates an increase in the economies of scale across the vector of goods. Its response is to buy larger lots of goods, less frequently, maintaining larger variation in the inventory at home. That is, households accrue some benefit from measured gains in efficiency in the supply chains, and they lower their transaction rates. (Hayek again, and the minimization of transactions.)

If investors somewhere in the middle of the supply chain see a drop in medium-term interest rates, they too act as if there have been increases in economies of scale up the chain, and they too maintain larger lot purchases, less often, and manage a larger queue variation at their level of the supply chain.

All participants, down the chain, act so as to reduce the number of transactions while accruing their share of the returns to scale.

At any point in the queuing network for any good, including money, a "drop in interest rate" is interpreted as a gain from economies of scale from that point up the distribution chain, and the economies of scale are passed down the distribution networks by reducing the number of transactions, raising the lot size per transaction, and increasing the variation in queue size.
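
A toy version of that adjustment (the `adjust` helper and all numbers are hypothetical, a minimal sketch rather than the queue model itself): a participant reading a rate drop as a scale gain raises lot size and cuts the transaction rate by the same factor, so throughput (lot size times rate) is preserved while queue variation grows with the lot size.

```python
def adjust(lot_size: float, tx_rate: float, scale_gain: float):
    """Scale lots up and transactions down by one factor; throughput is fixed."""
    new_lot = lot_size * scale_gain
    new_rate = tx_rate / scale_gain
    return new_lot, new_rate

lot, rate = 10.0, 6.0  # units per lot, transactions per period (hypothetical)
new_lot, new_rate = adjust(lot, rate, scale_gain=1.5)

# Throughput is unchanged, but each queue now swings by larger lots:
assert abs(lot * rate - new_lot * new_rate) < 1e-9
print(new_lot, new_rate)  # 15.0 4.0
```

Chaining `adjust` down successive stages is the sense in which the scale gain is "passed down" the network as fewer, larger transactions.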

If we add the condition of constant measurement error, then the queues will always revert back to their original lot sizes and yields (Milt Friedman's plucking theorem; Quantum Mechanics under coherence and constant uncertainty).

In the interim, some distribution chain, via technological improvement, is operating from an inventory network that is non-coherent with the system; the transaction rates at each point differ from the others, within measurement error. The system is in disequilibrium, incoherent.

So, from the multi-good, multi-stage coherent queue model, all shocks, whether errors or technical changes, obey Uncle Milt's plucking theorem. The supply chains return to their original structure and original transaction rates at each level. The difference in state is that some owner, somewhere, by virtue of increasing economies of scale, now owns part of the money supply system. And vice versa for a negative shock or an estimation error: the money system ends up owning part of a goods distribution network.

Classicals vs Quantums in economic theory, comment

This distinction between the two groups will never disappear. I have a hard time distinguishing the differences among economists because they often switch between the two models.

But the central difference between the two is whether they assume the agent's measurement uncertainty is small relative to the returns to scale. When the measurement uncertainty is small, the supply chains will have more stages in production; they have larger numbers of measurable eigenfunctions, the number of eigenfunctions being best estimated in the real economy by the number of term points on the financial yield curve.

When the estimation uncertainty is large, the number of eigenfunctions in the yield curve is small. Quantum theory will show this using multi-stage queuing theory for multiple goods under conditions of coherency.

Under the assumption of small uncertainty, the perturbations of the economy about equilibrium can be considered smooth, even though each price change and inventory adjustment is still quantized.

The difference between depression economics and normal economics, as Krugman calls it, is the difference between when the smooth assumption holds and when it does not.

Incoherence in the economy results when a technological shock creates a difference in the natural eigenset in one or more sectors of distribution. The macro economy is no longer operating from the same set of eigenfunctions, and incoherence is demonstrated by bubbles as the economy re-establishes the proper, and natural set of eigenfunctions on the yield curves.

The bubble economy is executing a sort of Ramsey equilibrium process, and that process often ends up with the supply chain that maintains the largest lot sizes in its economies of scale, like IBM, GM, or the government. During readjustment back to the proper eigenfunction set, the process starts with sectors that have the greatest economies of scale, the largest lot sizes.

So the answer is, stop, they are both right. One being a very efficient estimator of price adaption near equilibrium, the other being seen by us as a restructuring by Ramsey Theory.

Graphical Calculator for the Taylor Rule

I understand the Taylor Rule, established by John Taylor, though apologies to him for not yet reading his paper. I have a handy graphical system for calculating the proper short-term interest rate using real data from my master calculator. This site has the S&P charted over time, and the Treasury yield curve at any point in time is constructed. I use the S&P as a proxy for real output of the economy, and the Treasury yield curve shape as a proxy for stability (zero inflation).

I move the pointer to a point in time at which the S&P is relatively stable between booms and busts, and note the shape of the yield curve. The stable period is one without rapid growth, in which the short-term variation in the S&P is relatively small.

Then I move the pointer to the current time and compare the yield curve shape to its shape during the stable period. I estimate the change on the short end of the yield curve to restore its stable shape.

I do a lot of playing with that yield curve, noting changes to it and the S&P during various information shocks, Treasury auctions, and the like.

My graphical computation and John Taylor's estimation for the current setting of short-term rates seem to agree, though I have no confidence that my use of this graphical system was not influenced by John Taylor's recent announcement that he thinks short-term rates should be raised a smidgen.

My Current activities:

I have selected a general approach to computing the particulars of our Quantum Mechanical economy. I am looking at and playing with a multi-good, multi-stage queueing problem under conditions of coherence and constant measurement uncertainty. I will report back soon, hopefully, with the single QM equation that matches pricing theory, credit theory, and inventory management, at least to a first-order estimate of the finite eigenvalues. I should also be able to establish the specific finite number and position of the term points on the yield curve that we humans rely on.

Sunday, May 24, 2009

Calculated Risk, give us a counter factual

Calculated Risk claims much of the recession could have been avoided with proper oversight of the bundled securities problem. If so, then what?

Where would the excess savings have gone? Invested in commodities (an oil bubble), invested in government (inflation), invested in stocks (a dot-com bubble)? Our economy had a reasonable investment environment in 2004; before that our government demanded excess money, and after that the investment opportunities dried up. We were in need of restructuring. Where and when does Calculated Risk suspect we should have restructured?

We, the economy, chose the time and place: not the Fed, not the regulators, and not the government.

Driverless freight and taxi in one

See the simulation photos.

I have one question that constantly plagues me. Why do they add rails in the age of microprocessors, when a green line can be followed just as easily? As in the Toyota driverless bus in the upper photo.

Even the PRT from England hasn't yet figured out that the concrete guideways are an unnecessary expense, as in the lower picture.

There is something in the European brain maybe.

Lux has some answers

Lawrence G. Lux nails it. Read the post, it makes sense, with two provisos.

1) The problem is really inefficient use of gasoline, but the housing bubble sure did not help.
2) The solution is right: remove the Bush tax cuts. I add one more: raise the short-term rates as Taylor recommends.

The essence of his post, if I can be interpretive here, is to keep the housing problem within the housing sector, though the taxpayer will have to take its losses with the Freddies and Fannies.

On the energy front, it really is up to state and local government to remove their imposed bottlenecks to transportation efficiency, if they refuse, then they accept a permanently lower level of output.

Saturday, May 23, 2009

Vice Chairman Kohn speaks

A speech HT Calculated Risk.

"With traditional monetary policy currently constrained from further reductions in the target policy rate, and with many analysts forecasting lower-than-desired inflation and a persistent, large output gap, agents may anticipate that the target federal funds rate will remain near zero for an extended period."

He is justifying this:

"But as a result, fiscal stimulus has potentially become more effective in boosting economic activity than it usually would be. "

What he is saying is that his borrowing customers cannot find solutions for the long term plans currently being presented by the Obama administration. If investors did have solutions for Obama and company, then they would be borrowing and investing in new firms like mad. Obama has specified too much, and it does not compute.

Fiscal expansion may be more effective? Obama and Congress need a more rational long term plan, try again. Either we expect too much from Congress, or Congress is contradictory and uncomputable.

The longer we wait for Congress to figure this out, the more rapid the rise in inflation as we restart.

Supply and demand curves are theoretically wrong

Bad assumptions built into economics for a hundred years.

The assumption built into supply/demand curves is that transactions occur in point time: nothing is learned about changes in the macro distributions of goods during the transaction itself.

That is not true. Customers can see the queue in the grocery store, as can the clerk. Each party gets a small trace of the differential reduction or enhancement of a shelf full of apples as they buy. Construction workers get a sense of how many fellow workers queue up for a job interview. These people are professional micro-transaction makers; they have sufficient knowledge of how things work, and they speculate in the goods when they shop.

Bankers have fast adaptation, so money users can simply wait until the next day, week, or month and find changes in money distribution. The key to money is that it adapts and publishes much faster than other goods; that is its value.

So, the consumers and workers always make continual adjustments to their knowledge of goods distributions as they shop and work. There is always a force of coherence between macro estimates and micro estimates.

The system never gets stuck in a disequilibrated condition, absent a technology shock, positive or negative. Once disequilibrated, the participants who can observe the shock will design the stimulus.

Great Depression and Technology shock

I have stated my view. The Great Depression was caused by the technology shock of commercial radio and the constraint it imposed on transportation.

Just a quick background. A radio station in a major metropolitan area cost a few hundred dollars, and it expanded, overnight, information about goods. The relatively instantaneous knowledge of goods in the city put a premium on the automobile over the streetcar. The transportation network could not handle the load.

Hoover's response should have been "Roads, roads, roads". Roads were mainly a government responsibility, for good or bad.

What the economy needed was knowledge about the elasticity of road building so that it could set the yield curve properly.

Roosevelt came into power and established that measure of elasticity so the yield curve could stabilize. (Scott Sumner outlined this process.) His task was aided by the stimulus effect of European gold (see Christina Romer). European gold came into this country because the European yield curve could not stabilize fast enough in this task of measuring road-building capability, due to Hitler and German policies.

Friday, May 22, 2009

The Crisis and How to Deal with It

Panel Discussion

I guess I should comment.

Paul claims we had a housing crash, so people do not feel as wealthy. Oh, and a financial crash. Well, we have also had an oil crash, a transportation crash, and likely a coming crash in government goods. Why does Paul mention Keynes? Keynes does not predict the sudden crash of multiple sectors. Schumpeter does.

Note, all of this was preceded by a housing bubble, an oil bubble, a government bubble, and a transportation bubble. Why did all these bubbles form, and why did they all crash nearly simultaneously? Keynes does not explain that, except by animal spirits. More on animal spirits later.

If Keynes and his theory only have a vague idea, as John Quiggin says:

"(rational optimization in competitive markets) can produce large changes in macroeconomic outcomes. "

why did we spend hundreds of millions in research to get from Keynes, who made none of these predictions, to this?

No, Paul. I am not someone who claims deep insight; I am someone who simply searches the Wiki for mathematical models that describe what Schumpeter seems to have predicted. I do not have the desire or need to try to justify Keynes to fit the facts. I have Wiki, its explanation of various mathematical models, and an array of economists over 200 years, one of whom might just have predicted the simultaneous destructive recombination of multiple sectors.

Quantum mechanical theory will win this debate, in light of the rapid communications network which makes coherence obvious.

Scott Sumner says the immediate preceding event to these depressions is monetary policy mismatch. Scott got me on the right track; the key, as I paraphrase him, is that monetary events are always and everywhere coinciding with sudden changes in aggregate GDP, and I thank him for clearing that up. But Scott says it is the failure of monetary policy to understand events and react. Not so fast. Monetary policy is always just behind the curve, a little behind, not simultaneous. Monetary systems adapt fast, but they adapt just like any other good; that is the key.

So, we have a solution for today's depression, and yesterday's, and 1870's, and on and on: quantum adjustments to the structure of the economy.

Scott would want to stop the theory at money, not me.

If quantum mechanics models disruptive change, then what describes the when and where of the technology shocks that start the process? For, under coherence, there is nothing that can change things except an external shock.

That brings us back to Keynes, and the one contribution of his career: we are animals.

Animals change their minds when they receive a rapid expansion of information about their environment. They change their long-term models of where things are and how they move. They changed their minds when commercial radio suddenly appeared in 1928. They changed their minds when the international telegraph suddenly connected Europe and America in 1870. And when animals discover a whole lot more about where the goods are, they get a transportation constraint, the leading edge of disruptive recombination.

Scott needs to ask: why did important events suddenly appear in 1928? Because all across Europe came the sudden realization by everyone that they could use radio for greater efficiency, and that resulted in a cascade of changed outlooks. Just as today, the Internet opens a wider world of where goods are, and our model of how things move has changed.

So, to answer the question of what to do about it: follow the disruption backwards, and solve each constraint, each bubble, with innovation. Do not hold anything back; get on the backs of local governments and tell them, try new ideas, try new technologies, drag those inventions out of the lab, push change onto the government unions.

Try again Paul.

Brad DeLong and his search for asymmetry

Brad ponders how he can get an asymmetric theory to explain the lack of symmetry. I give him the short answer: asymmetry is built into the quantum theory.

Once the transaction starts, the long-term eigenvalues are computed before the results of the transaction complete. We cannot do the reverse: go back in time and revise the long-term estimates while the transaction is ongoing.

This is also why Paul has causality reversed, thinking short term interest rates predict long term rates. Short term rates are determined by the remainder of the transaction, after the long term estimates are made.

Brad may ponder why transactions are not infinitely short, as they are in all of his models. Physical inertia, the Hamiltonian component of the system. We will talk about that later, but I have to run an errand.

The Hamiltonian in this situation is the economies of scale, the Hayek minimization of transactions. That Hamiltonian determines momentum and delay in each of the micro transactions that make up the system. Each of Brad's models assumes instant transactions, which violates the macro premise. Each of these transactions will have delay, synchronous with the macro delay determined by the Hamiltonian. That delay, plus the constant of uncertainty, gets you asymmetry. The micro transaction will get partial results, in units determined by the yield curve eigenfunctions, because that micro transaction will have delays proportional to the distribution of arrivals in the yield curve.

Or something like that.

Krugman wrong with his expectation theory

"The reason for the historical relationship between the slope of the yield curve and the economy’s performance is that the long-term rate is, in effect, a prediction of future short-term rates"

This is Krugman working with expectations theories. My claim is simple: long term rates represent what investors learn about long term events, as knowledge of long term events rises above uncertainty. The yield curve will be composed of stochastically independent, orthogonal estimations.

The issue came up because the FRB in Cleveland came out with an update of their predictions based on yield curve shape. The Krugman expectations view is that the yield curve is not functioning properly as a predictor because of the zero interest bound. Investors are betting that long term rates must rise because short term rates cannot drop further. Hence, like an option trade, there are few choices where short term rates can go.

The recent rise in ten and twenty year rates occurred because investors made choices as the restructuring of our economy unfolded; they had to make asset allocations as long term trends became clearer. The "option trading" on short term rates is suffering very low volume; investors are simply not betting on the short term yet. Look at the number of short term traders sitting out the market, as Zero Hedge has identified over and over.

Thursday, May 21, 2009

Quick note on asphalt trains

Consider the concept of asphalt trains discussed earlier (much earlier) in this post which involved powered freight wagons, powered with an electrical tether from the locomotive.

Now, replace the self powered cargo wagons with a mechanical power from the locomotive, but tether enough electricity to power steering servos on each wagon, along with the standard microprocessor.

There will be other differences; the diesel electric locomotive must have multiple, very wide rubber tires to spread the traction across the asphalt and avoid tearing up our highways. But the same concept holds: long trains, 20-30, possibly 50-car trains running down our major highways.

We still keep our constant returns to scale and any parking lot becomes a freight yard.

My quick assessment of Keynes

He did well, but was not in a position to complete his theory. The economy of the time was in long term Malthusian expansion, so the best approximations were based on exogenous shocks. Until the macro system shows global constraints, the data will not be clear enough to show coherence between micro and macro models. Christina Romer's paper on the stimulus effects of gold flows from Europe to America in the 1930s had to treat them as exogenous, because feedback effects were longer term than our outlook, and unobservable.

Keynes must have known this, and animal spirits was an approximate heuristic that got us past the dearth of data. By singling out government, the largest monopoly in national economies, Keynes could approximately fill the gap needed to model national coherence.

In 1928, the consumer could assume an infinite dimensional system, finding investment opportunity globally under the assumption of unlimited outward flows of technology. By 1990, the use of technology transplanted to developing economies synchronized with developed economies in a time period within the normal planning outlook. Our models had to accept finite dimensionality in the yield curve.

Bottom line:
A closed endogenous model under constant measurement uncertainty has the following characteristics.

1) The money good will always have a visible term structure

2) Money will be defined by a unique and finite set of eigenfunctions defining the yield curve; the placements of the functions differ by multiples of the measurement uncertainty.

3) Regardless of asymmetric information or relatively different estimations of the elasticity distribution, the term structures of all other goods become a linear combination of the money functions, with a single solution. Agents will converge on that unique estimate of the term structure of the good, though they could occupy differing subsets of the finite set of eigenfunctions.

4) If the unique solution in 3 is inverted, then a partial restructuring must occur.

5) The depth of the restructuring should be determined by the point in which the money yield curve inverts.

6) At the micro level, each transaction takes place over finite time and causes a series of adjustments to the agent's balance sheet, equivalent to an eigenvalue decomposition of the traded, single good.
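Point 2 can be made concrete with a toy sketch. All the numbers and the placement rule here are my own illustrative assumptions, not derived from any data: modes are placed along the curve at multiples of the uncertainty constant, so their count comes out finite and fixed.

```python
def mode_placements(span, h):
    """Place term-structure modes along the yield curve, separated by
    multiples of the uncertainty constant h.  The count is finite and
    fixed by span // h, as point 2 above asserts."""
    n = int(span // h)
    return [k * h for k in range(1, n + 1)]

# Illustrative numbers only: a 30-year span resolved to 5-year accuracy
modes = mode_placements(span=30.0, h=5.0)
# → [5.0, 10.0, 15.0, 20.0, 25.0, 30.0]
```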

Wednesday, May 20, 2009

Uncertainty constant and the economic theory of quantum adjustments

In this post I want to talk about the process of a single economic transaction under the restriction that the agents have a fixed variance in their ability to measure value. This is equivalent to physicists talking about particle interactions under the restriction of Planck's constant. Let me preface this discussion with profound thanks to all the folks who created and populated Wikipedia.

Under this theory I postulate that agents are comfortable estimating elasticity to within a fixed uncertainty. Then I postulate that any given transaction is a non zero time length process. I show that the resultant transaction process is equivalent to an optimum fit orthogonal decomposition of the transaction value into subdivisions of value equal to multiples of the fixed measurement uncertainty. This, then, is the link between the linear estimation theory of economics, Minsky restructuring, and the time independent model of the macro economy. It unifies micro and macro. It provides a model of asymmetry.

As usual, I use lazy mathematics and hand waving, leaving the pros to sort the details, match the data, and win the Nobel.

Let us start with the process of a house purchase, whose total transaction time is around a few weeks. The buyer and seller do not know the final outcome of the transaction until the few weeks needed to negotiate final details have passed. I will discuss the process from the buyer's side. From the start the reader will see where I am going, and may have some sudden insights into quantum mechanics.

The buyer's first task in a home purchase is to estimate the typical price he will have to pay. He will estimate that price to the level of his constant accuracy, no better and no worse. As soon as his survey of the market is complete, he makes adjustments, or planned adjustments, to his balance sheet, perhaps pre-qualifying for a 30 year loan.

In the second step the buyer narrows down his choice to houses within that first estimate. He examines the typical commute, he does a walk through. Based on this he finds further adjustments to his balance sheet, as soon as he can estimate the refinements within his accuracy level. Some houses need work, some houses come with a refrigerator. He determines this further refinement of price. Note the separation between his first and second estimates will be nearly equal to the accuracy limit. In all steps, he is performing an orthogonal estimate of the residual of the prior estimate, to within his constant accuracy.

He finishes by negotiating with the agent about transaction costs, fees, and move in times. In the final deal, he has a remaining residual nearly equal to his uncertainty constant.

Now, at equilibrium, value differences must be synchronous across all transactions; each agent suffers the same constant accuracy restriction. I will get into that later, as soon as I get an idea of the why and how. But the outcome is that the finite set of eigenfunctions will be separated by multiples of the uncertainty constant. The aggregate system will be synchronous at equilibrium, and the monetary yield curve carved up into the same eigenfunctions.
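A minimal numerical sketch of the buyer's procedure, under my own illustrative assumptions (the Gaussian measurement noise of scale h, the dollar figures, and the step count are all made up for the example):

```python
import random

def refine(true_value, h, steps=5, seed=7):
    """Each step, the buyer measures the residual of his prior estimate,
    but only to within the constant accuracy h, then folds that noisy
    measurement into his running estimate.  The residual shrinks until
    it is comparable to h, after which no step can improve it."""
    rng = random.Random(seed)
    estimate = 0.0
    residuals = []
    for _ in range(steps):
        # measure the residual with fixed error scale h, then correct
        measured = (true_value - estimate) + rng.gauss(0.0, h)
        estimate += measured
        residuals.append(abs(true_value - estimate))
    return estimate, residuals

estimate, residuals = refine(true_value=352_300.0, h=5_000.0)
# After the first correction, every residual is on the order of h:
# the constant accuracy restriction sets the floor.
```

The point of the sketch is only the stopping behavior: refinement stalls at the scale of the uncertainty constant, matching the residual left in the final deal above.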

Mathematizing Minsky, general approach

Severe restructuring, asymmetry and how to model it. DeLong worries about this, I got as far as this general approach, and no further.

I always start with the idea that things are restricted such that their positions are known to a fixed uncertainty, neither more nor less. Don't ask me why.

Then I add that things are optimal when organized as a sort of bell shape in distribution (maximum variation in minimal space).

These two assumptions get me fixed dimensionality. Specifically, if I break my measurements into orthogonal functions, the number of functions I need is fixed, determined by my uncertainty constant.

Having assumed very little, then, I get to the conclusion that large groups of things that must know about their neighbors will organize into fixed spanning trees, with information moving as in a directed graph, and the dimensionality of the tree fixed at N. If the number of things grows, then the system must spawn sub spanning trees, and pick orthogonal functions so the hierarchy of things gets back to a composite of N-ranked spanning trees.
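A toy sketch of the spawning rule. The splitting strategy and the choice of N = 4 are my own illustrative assumptions: when a group outgrows the fixed dimensionality N, it splits into N sub-groups, and the hierarchy reproduces the ranked spanning-tree structure.

```python
def build_hierarchy(items, n):
    """Organize items so no node coordinates more than n others
    directly.  A group larger than n spawns n sub-groups and recurses,
    yielding a composite of n-ranked trees."""
    if len(items) <= n:
        return items  # small enough to coordinate directly
    size = -(-len(items) // n)  # ceiling division: n near-equal groups
    return [build_hierarchy(items[i:i + size], n)
            for i in range(0, len(items), size)]

tree = build_hierarchy(list(range(20)), n=4)
# 20 things exceed n = 4, so the system spawns sub spanning trees:
# → [[[0, 1], [2, 3], [4]], [[5, 6], [7, 8], [9]], ...]
```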

This latter process describes a severe restructuring.

Tuesday, May 19, 2009

Who made those budget deficits?

InstaPundit yet again posted its misleading chart on Obama vs Bush deficits, hence another refutation is in order.

From Capital Gains and Games:

How Much Bigger Is the Obama Deficit? Ed Lazear, former chairman of the Council of Economic Advisers with the Bush administration, said on CNBC this past Friday that he was not happy with the Obama administration’s projected $1.5 trillion fiscal 2009 deficit. But as I’ve pointed out before, the Obama deficit is not all that different from what would have occurred had President George W. Bush remained in office and Lazear still been the CEA chairman.

As Bush was leaving Washington, the Congressional Budget Office estimated that the 2009 deficit would be around $1.2 trillion. A month or so later, Congress and Obama agreed to something that the Bush administration would have strongly supported, yet another one-year patch for the alternative minimum tax. That added around $70 billion to the deficit.

The $1.2 trillion estimate also didn’t include a supplemental appropriation for activities in Iraq and Afghanistan for this year. Although Bush didn’t request a supplemental before he left, he certainly would have had he been in office. Obama requested $75 billion for this in his budget.

These two additional expenditures increase the $1.2 trillion to about $1.35 trillion. Add to that the effect of the economy not performing quite as well as had been assumed when the original $1.2 trillion estimate was released, and you’re not that far away from what Lazear said was the Obama number.

Monday, May 18, 2009

More evidence on relative elasticities

With a HT to Justin Fox of the Curious Capitalist, he references this article by James Surowiecki regarding the adjustment times of zombie banks and zombie companies in Japan. He estimates that 30% of Japanese companies were being propped up by the banking system because these companies could not adapt.

So, if we think about the constraints caused by slow adaptation of the financial sector vs. constraints in other distribution networks, we can say that only 30% of the prolongation of the problem was caused by liquidity transmission. That is within a reasonable error of the estimate obtained by this research, which measured 10-15% of total constraints caused by the liquidity sector.

Sunday, May 17, 2009

Financial services and its elasticity

We often hear the complaint that investors should be investing in this or that innovation, often times the complaint is about energy efficiency investments.

How rapidly can the investment sector respond to an opportunity in energy developments? We have examples of financial elasticity in Silicon Valley. From the start it takes about 6-8 months (my estimate) for a venture capital group to raise money and open up an investment office. That is elasticity.

From start up to production, it takes a car company about five years to develop the next generation of high mileage vehicle. That is inelasticity.

Great recessions are rarely caused by financial inelasticity. Consider the example of Roosevelt raising the gold price from 1933 on. Sumner correctly observes that liquidity transmits faster than any other good. But, as Roosevelt played his game of raising the price of gold, what was the pacing item? More likely, the pacing item was the rate at which investment could be applied to the economy to solve constraints. What investors wanted was consistency of price; whether that consistency came from Roosevelt pacing the price of gold, or private investors pacing the rate of deflation, made no difference. The requirement was consistency.

Generally great recessions are caused by real constraints that take time to solve. I can guesstimate the elasticity of various sectors, defined in terms of their equilibrium constants, as follows:

Financial - 6-8 months
Food Retail - 12 months
Housing - 24 months
Transportation vehicles - 48 months
Roads - 60 months
Energy - 80 months
Defense - 120 months
Government services - 240 months

These guesses are highly biased and professional economists should measure them exactly. Equilibrium times will also have a long and short term component.


To avoid confusion, how does one distinguish between a change in the flow rate in an existing distribution channel and a change in the distribution channel itself? It depends on where the change is viewed from. Adding 1,000 homes to my home town looks like a structural change to us here in town, but to Freddie Mac it looks like a change in the flows of an existing channel. So, I would lump all changes into one category, structural changes, and then say that some structural changes can be estimated to sufficient accuracy as changes in flow rates. In Great Recessions, we adopt structural change models throughout; when operating near equilibrium we use the change in flow rate model. They yield different mathematics.

Thursday, May 14, 2009

Notes on relative elasticity

These are comments on this paper posted by Econobrowser.

The paper is important because it breaks down inflation into relative inflation of one good to another. I got through my first read of the paper, and I get the gist of how he derived his result. I need another two reads, but here I go.

The best way to understand the result is to think of the economy as a series of production chains, commodities going in the top of a chain and consumer goods emitted from the bottom.
The next part of the framework is to consider liquidity (money) as just another good at each level of the chain.
Now, each vertex in the chain seeks to adjust a finite vector of goods (including liquidity) such that the distribution of elasticities is smooth and bell shaped. This result is in conformance with Hayek's minimization of transactions, and yields, at equilibrium, a Gaussian output of random noise. That is, there are no discernible bottlenecks at equilibrium, and the economies of scale yield maximum output for minimum input.
Given this framework we can hypothesize the effect of a positive productivity shock at some mid-point in the chain. The effect of better production methods in the middle of the chain is to pass elasticity down the chain to the consumer for a segment of goods, and pass inelasticity up the chain for a segment of inputs. Hence, initially, up the chain there is a distortion in the distribution of elasticities, and a complementary distortion down the chain. These distortions must be equilibrated out of the system over time, and during each step of equilibration the distribution of elasticities at each vertex vibrates.
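The vibration-and-settling story can be sketched numerically. Here the complementary distortions are modeled as a +/- pair on a chain, and the equilibration as simple neighbor averaging; the chain length, shock size, and smoothing rate are all my own illustrative assumptions, not the paper's model:

```python
def propagate_shock(levels=9, shock_at=4, shock=1.0, rounds=40, rate=0.25):
    """A chain of production levels, each carrying a scalar elasticity
    distortion (0 = equilibrium).  A productivity shock at a mid-point
    passes +shock down the chain and -shock up the chain; each round,
    neighbors average the distortion away."""
    d = [0.0] * levels
    d[shock_at + 1] += shock   # extra elasticity passed downstream
    d[shock_at - 1] -= shock   # inelasticity passed upstream
    for _ in range(rounds):
        nxt = d[:]
        for i in range(levels):
            left = d[i - 1] if i > 0 else d[i]
            right = d[i + 1] if i < levels - 1 else d[i]
            nxt[i] = d[i] + rate * (left + right - 2 * d[i])
        d = nxt
    return d

final = propagate_shock()
# The +/- distortions are complementary: their sum stays ~0 while each
# level decays toward equilibrium, so inflation at one level is counter
# balanced by deflation at another.
```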

Under this formulation, then, the authors discovered that liquidity makes up about 10-15% of the vector of elasticities. But, overall, there is no such thing as monetary inflation in the sense of Uncle Milt. There may be productivity shocks in the financial sector, but these shocks, like all other shocks, simply travel up and down the system, settling over time in complementary fashion. At equilibrium, regardless of the shock, the system always reverts to the same distribution (or tries to) at each level of the system.
What we see as inflation at one level must always be counter balanced by deflation at another level.
Uncle Milt was getting at this theory with his plucking model.

The framework I have outlined lets us discuss instability theorems by examining the possibility of equilibrium along the supply chain. I note that each input to a given vertex attempts to be a spectrally deficient white noise. But each vertex also attempts to combine its incident arcs into a Gaussian distribution. However, by construction, a Gaussian white noise has no slope; it is spectrally deficient. Hence, there is no solution at equilibrium, though all vertices try to achieve it in their collective actions. The best that can be had is to minimize the variations off of Gaussian such that the total variance of the economy has minimal spectral components.

More robo vehicles

I post these as I find them. This one is a toy tank chasing a human. And another. Another. More. This one is not a toy.
Then we have a variety of robo vehicles adapted from the Segway. HT To Net Traveller

Tuesday, May 12, 2009

Obama may finally break Reagan's Big Government Record

HT From Angry Bear.

The chart shows what we all know, Reagan had been the leading Big Government politician, the nation's premier socialist, holding the record for 30 years.

Monday, May 11, 2009

Driverless car roams NYC

An article with video describing the test of automated cars on the streets of NYC. And here is a complete report on the Intelligent Traffic Congress by Blue Print America. And traffic planners in NY State are getting the idea, along with a dozen other regions worldwide. Popular Mechanics weighs in with a report from Germany, calling them tantalizingly close. And here is a robocar crash!

Instapundit and its intellectual dishonesty.

I have decided to monitor Glenn Reynolds because I seem to notice a pattern of intellectual dishonesty.

Take his continual printing of bar charts for federal budget deficits. He prints the current administration in red, and includes future estimates for seven years beyond the current term. For previous administrations he prints only the last four years.

He does not use share of GDP, which is the most important measure.

Nor does he consider lags, allowing the reader to ignore that tax receipts are way down due to a mini-recession that was most likely attributable to the first four years of the previous administration.

I am going to start monitoring this guy more carefully; I want to track the trend of his dishonest stunts. I like his blog, read it constantly, but the reverse filter I need to apply gets cumbersome. Perhaps he can change.

Sunday, May 10, 2009

Another automated vehicle deployment

ROBOSOFT announces the delivery of three VolcanBuls to Vulcania . These vehicles, without a driver and guided exclusively by GPS, take visitors on a 1km tour through the park to observe the Puy volcanic mountains and to learn about their history.

Vulcania is the world's first site where such an autonomous transportation system has been installed and certified. No driver, no need for any specific infrastructure, and its operation in an area shared by walkers make the VolcanBul particularly well adapted for all sites welcoming the public, such as city centers, airports, hospitals, campuses, amusement parks, etc.


And, watch this video of the Heathrow Airport automated taxi. Also, don't miss this round up of deployed transport robocars. And be sure to peruse the European web site for automated transportation.

Thursday, May 7, 2009

Zero Hedge

Great site, great learning tool. My mathematics needs help with the acronyms, however. They are more insightful than the macro-economists.

Wednesday, May 6, 2009

More bottlenecks

This one concerns the running of freight trains through Chicago, where the disparate train corporations do not have a smooth exchange yard.

Monday, May 4, 2009

Did home buyers make a mistake?

How much of the run up in home prices was justified, relative to the value of other household expenses?

New homes have better energy efficiency, and new home construction bubbles gain efficiencies of scale. New technology may have added value to the home relative to competing needs.