Naturally, looking at Lagrangian systems I immediately got in over my head. But a tangent bundle, in plain English, says that on some surface the tangent lines can be described by a basis (in the Newton sense) that is linear, closed, and compact, I guess those are the words. So the surface allows the system to use ratios that have inverses, a great idea for doing Newton.
What about Ito? What is a dimension? It is the number of times we scale up. Consider a ruler with large, medium, and small notches, N of each. Can I fit N small notches between each of the N medium notches, and fit all of those between each of the large notches? Then can I make their inverses the same way, counting backwards from one toward zero, and never hit zero? Do they all count in sequence? Do I have a reversible linear map between indices from one up and one down?
So I go ahead and try this out with Phi. I let Phi be a rational approximation, say F16/F15, the F being Fibonacci numbers. I get F16/F15 - F15/F16 = .9999..., so my value of one is off. Does it improve or get worse?
I take my .9990, .9999, ... as the starting point for my next set of Fibonacci numbers, and repeat. The error in one stabilizes, as near as I can tell! In fact I can vary the starting sequence up to the fine structure (1+fs and 1-fs) and have no problem; it always stabilizes. I did no more testing. This is a property of Fibonacci sequences, and there should be something equivalent for all Lucas sequences. But the clear idea here is that we have at least a fixed point, a sequence of stable mappings as we expand dimensionality.
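The first experiment above is easy to rerun; here is a minimal sketch in Python (the indexing F15 = 610, F16 = 987 and the `fib` helper are my own, not from the original notes):

```python
# Re-running the Phi experiment described above: approximate Phi by a ratio
# of consecutive Fibonacci numbers and check how far b - 1/b lands from the
# exact identity Phi - 1/Phi = 1.
def fib(n):
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

b = fib(16) / fib(15)      # 987 / 610, a rational approximation of Phi
error = 1 - (b - 1 / b)    # exact Phi would make this exactly zero
print(b, error)            # the error is small but never exactly zero
```

The error here is on the order of 10^-6, consistent with the .9999... figure quoted above.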
But I think this means the TOE has no bounds: it can build up systems to unlimited dimensionality and the Ito indices will still work.
Thursday, April 30, 2015
When the power series do not converge evenly
If we take the series sum (1/3)^n, n = 1 to some large N, then we get 1/2; or, in the game of Wythoff, we get one hot move per cold move, I guess.
And the series sum of (2/3)^n, where n goes to the same N, gets 2, and that is 2 hot moves per cold move.
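Those two sums are ordinary geometric series; a quick sketch confirming the limits (the `geometric_sum` helper is my own naming):

```python
# Sanity check of the two sums above: sum_{n=1..N} r^n tends to r/(1-r),
# which is 1/2 for r = 1/3 and 2 for r = 2/3.
def geometric_sum(r, N):
    return sum(r**n for n in range(1, N + 1))

print(geometric_sum(1/3, 60))   # ~0.5, one hot move per cold move
print(geometric_sum(2/3, 60))   # ~2.0, two hot moves per cold move
```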
The first is the fermion, the second is the boson. Now that is fine when we can add energy into the system; we get this situation when we do fixed Shannon channels in engineering. In the vacuum the bubbles do not have that choice: they have to adjust chemical potential to match the vacuum at the edge while avoiding the winning position in the center. So the fermion gains a few hot moves and the boson loses a few. On net, the cold positions in the fermion get shared, so they move around relative to the hot positions. Physicists call this relativity; I call it making Ito's calculus work.
Hence, in my original spectrum, from sheer dumb luck, I was finding the solution by matching the series sum (Phi)^k to the boson (3/2)^j. It turns out that is the solution. It is stable because it is the connected solution, always locally additive. The math is hyperbolic at discrete Lucas angles. So the silly metaphor becomes that in the adapted system, adiabasis occurs when the hot players and the cold players choose not to win the game, but keep on playing. What happens if one of the fermion bubbles wins? It lands in the center, the abyss. I dunno, ask Professor Higgs. But it seems to me we get empty space if that happens, but that is impossible; there is no such thing.
Not the SmartPhone, but the SmartCard
Time: Can a smartphone satisfy all of your computing needs? That was the pitch Microsoft made during its Build developer’s conference on Tuesday, and it’s not so far-fetched as it sounds.
Close, but the essential computing interface will be the smart card 'function' and the distributed network of Banker Bots. The SmartPhone will just be a location where the SmartCard may reside.
On stage at the event, Microsoft Corporate Vice President Joe Belfiore plugged a Windows phone into a large screen display. On the small screen, he opened a mobile version of a PowerPoint presentation, but on the large screen, that same presentation was transformed. Menu options that would normally appear in the desktop version of PowerPoint suddenly appeared at the top of the screen. The “universal app” detected the display’s expansive canvas and packed it with new features. In essence, the app ran two independent displays from one device.
Distributed Banker Bots are the Singularity.
They constantly know, better than any other source, the distribution of goods. The consumer will always have the best information about where she can find the items she wants. All of the consumer computations will be driven by that interface, including SmartPhone functions. Most of the time the user can leave the tablet at home and take off for short trips with nothing more than a $20 smart card.
Oil prices rising faster than Jerry's Energy Taxes
Oil at $59, so it has jumped 20% in the last few weeks. I would expect the energy-efficient private sector to adapt faster than the energy-inefficient budget demands in Sacramento. The data will be coming in soon.
A simple metaphor for the hologram universe
What was the new idea? The universe has no concept of x, y, z; it's clueless. What the universe does is pack bubbles until the bubbles snap into a stable group, then it packs those groups until they snap to, then does it again; and likely again and again.
What is Avogadro's number? 2^79-1; let's add some motion and call that 2^80 Wythoff moves in the game. What is that? ((2^5)^4)^4 = 2^80. These are likely close. This is a three-scale buildup, five or four power series each, each series likely a Lucas sequence. At each level they have the hyperbolic boson/fermion pair thing going, an entropy match locked solid.
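The arithmetic above is easy to check; this sketch uses the standard CODATA value for Avogadro's number (the comparison is loose, as the post says):

```python
# Checking the numerology above: 2^79 against Avogadro's number, and the
# nested power-of-powers form ((2^5)^4)^4 against 2^80.
avogadro = 6.02214076e23           # CODATA value
print(2**79)                       # 604462909807314587353088, ~6.04e23
print(2**79 / avogadro)            # ratio ~1.004, so "likely close"
print(((2**5)**4)**4 == 2**80)     # True: nested exponents just multiply
```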
Now everything we do in life is built around Avogadro, actually. It's the number of vacuum bubbles in a proton, and we are proton based; everything we do is filtered by this Avogadro scale system. So as we go through life and look at things close or far away, we see everything scaled by these three levels of detail, and we make that a 3 dimensional world. But it is strictly an illusion.
Now we know our solar system is surrounded by a wall of protons. What are they doing? They are making four-scale groups out of our three-scale Avogadros. Groups on groups, or tangent bundles I guess. Once a scale level packs lower-dimensional groups, it gets Pi damn close, and with a little kinetic energy can hold stable, locked in with a Shannon match; it is adapted. At the next level of chaos up, that group looks just like a bubble, ready for grouping.
Some astrophysicists think we may have an Avogadro of universes, and our tiny bubble is overlapping, part of some local five-order power series. We are just a finite ratio in the grand scheme of things.
Wednesday, April 29, 2015
Run time adaptive Huffman encoding
I should mention this; it is one algorithm that makes adapted systems, and I think it should now be common practice in the stock market. But that is what really started this whole thing about using Shannon theory in the TOE. I did a few experiments with Huffman encoding and the SP500, as my long time readers know.
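For readers who have not seen the algorithm, here is a minimal static Huffman coder. The run-time adaptive variant the title refers to rebuilds this tree as the symbol counts drift, but the static sketch shows the core idea; the `huffman_codes` helper and the sample string are my own, not from the original SP500 experiments:

```python
import heapq
from collections import Counter

# A minimal static Huffman coder: repeatedly merge the two lightest
# subtrees, then read code words off the tree paths (0 = left, 1 = right).
def huffman_codes(text):
    counts = Counter(text)
    # Heap entries: [subtree weight, tie-breaker, subtree]; a subtree is
    # either a bare symbol (leaf) or a (left, right) tuple.
    heap = [[w, i, s] for i, (s, w) in enumerate(counts.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                       # degenerate one-symbol input
        return {heap[0][2]: "0"}
    tie = len(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)
        hi = heapq.heappop(heap)
        heapq.heappush(heap, [lo[0] + hi[0], tie, (lo[2], hi[2])])
        tie += 1
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):          # internal node: recurse
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                                # leaf: record the code word
            codes[node] = prefix
    walk(heap[0][2], "")
    return codes

codes = huffman_codes("abracadabra")
encoded = "".join(codes[c] for c in "abracadabra")
print(codes, len(encoded))                   # 23 bits for these counts
```

An adaptive coder would update `counts` after every symbol and re-derive the table, which is what lets it track a non-stationary source like a price series.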
I should make a note to traders: the structure of the market is twofold, one for the traders and one for the companies themselves. The trading market will eventually become a complete no-arbitrage system under the TOE, so these robot wars will cease as all the robots just collectively organize themselves. As Zero Hedge keeps pointing out, this is not real; the real thing is the companies themselves and how they connect.
We do not use equilibrium analysis to analyze Jerry's energy tax hike
Matt Kahn says:
Note that everything I discussed above [about Jerry's energy tax] cannot be analyzed with a computable general equilibrium model.
We will use a constrained flow model and watch what Jerry does with his tax income from selling permits. Here is the law:
Q: Where does the auction money go?
The taxes are passed around. So we will see how they are passed around. If the energy tax is effective, we expect the ratio of government 'passed around taxes' to total energy consumption to be stable. Passing around taxes is not likely to be any more energy efficient than any other activity.
A: The money goes into two buckets. Investor-owned utilities like PG&E and Southern California Edison auctioned their allowances under one program, and proceeds from these sales must be used for the exclusive benefit of those utilities' ratepayers. The California Public Utilities Commission has proposed giving residential ratepayers a twice-a-year "climate dividend" worth about $30 and credits to small businesses; that proposal is expected to be voted on Dec. 20 at the CPUC. The second bucket includes proceeds from the industrial and transportation sectors. These will be deposited in a new special fund in the state treasury that will be used to further the state's clean energy goals. Legislation signed last September will require at least 25 percent of the proceeds to benefit the state's most disadvantaged communities.
But I bet the opposite happens. I bet the folks who get the taxes will grow and size much faster than energy is taxed. So we are going to look at the volatility of taxes being spent, volatility in this case we interpret as energy inefficiency. People who want the new taxes, like energy inefficient light rail and high speed rail, will grow.
We have a new governor in Texas?
I just now found out about their election cycle, whaddaya know. Who is this guy?
Well, he speaks Spanish, our native tongue out here in the Southwest. He is willing to use the Texas National Guard as a force against invasion by DC.
Calling HorseManure on Jim Hoft on job growth
Gateway Pundit:
Ronald Reagan’s economic plan saw GDP surge at a 3.5% clip – 4.9% after the recession. That’s a 32% bump.
Jim continues with graphs.
During the Obama years, thanks to his big government policies, the US economy has stalled. Today the quarterly GDP was announced. The GDP for the first quarter of 2015 braked more sharply than expected at only a .2% pace. The US economy has grown an anemic 9.6% during the Obama years (excluding today’s dismal number).
His analysis is bullshit, since the severe 1981 recession was entirely a Reagan recession. We can see Reagan's footprint: he immediately ran up a huge deficit on taking office, causing a crash and sending unemployment into the stratosphere. Reagan paid the highest real price for debt of any president in history, nearly a 7% real rate for ten year debt, and he left the deficit much larger than the deficit Carter gave him. Reagan started the debt catastrophe. He left the second worst deficit since the Nixon Shock. Counting unemployment, Reagan actually had a worse recession than lil Bush.
The 2008 recession was entirely a Bush recession. Obama came into office with lil Bush's recession, the worst recession since the Nixon Shock. Obama has yet to cause a recession, and Clinton's mild recession was revised away. Obama reduced lil Bush's deficit faster than any other postwar deficit has been reduced, and lil Bush left the deficit at 10%, the worst deficit since the Nixon Shock.
It was Clinton who finally got debt prices down, and Obama has finished the job with real debt prices down to 2.3% or so. So right now, every single recession since the Nixon Shock has been either a Bush or a Reagan recession. That makes the entire unemployment series almost entirely a Reagan or Bush disaster.
I am now labelling Jim Hoft an incompetent hack and will constantly expose the Gateway blog site as a nest of incompetent statistical liars.
John Whitehead wants instructions
John talks about Jerry Brown's energy tax hike:
Well John, the data will tell you, and I am at your side collecting data and looking at past results. We have a couple of complexities here.
1) California CapnTrade is a net revenue gain for government
2) California government invests in energy inefficient systems.
3) CapnTrade prices usually drop faster than the economy in a downturn.
So how can we answer your question?
We are going to see if Jerry gets more taxes for his projects that increase emissions. We are going to check the private sector's sensitivity to energy prices. If the private sector is more sensitive to energy prices than government spending is, then we have our answer: Jerry will increase emissions with CapnTrade because it is a net tax increase the way it is designed.
Then John asks:
One feature of economic incentive-based environmental policy is that it is possible to achieve (1) a given amount of emissions reductions at lower cost and/or (2) additional emissions reductions at the same cost relative to environmental standards (i.e., command and control regulation):
Gov. Jerry Brown issued an executive order Wednesday dramatically ramping up this state’s already ambitious program aimed at curbing greenhouse gas emissions, saying it was critical to address what he called “an ever-growing threat” posed by global warming to the state’s economy and well-being.
Under Mr. Brown’s order, emissions would have to be reduced by 40 percent over 1990 levels by 2030. Under existing state law, emissions are supposed to be cut back by 80 percent over 1990 levels by 2050, and Mr. Brown said this tough new interim target was essential to helping the state make investment and regulatory decisions that will assure that goal is reached.
Mr. Brown’s order marks an aggressive turn in what had already been among the toughest programs in the nation aimed at reducing greenhouse gas emissions. Under the law put into place by Mr. Brown’s predecessor, Arnold Schwarzenegger, the state was required to reduce greenhouse gas emissions to 1990 levels by 2020 on the way to reach the 2050 target; California is already well on its way to meeting the 2020 goal, and may exceed it, officials said Wednesday.
Unless I'm told otherwise, I'm attributing California's ability to achieve additional emissions reductions to cap-and-trade.
More deficit spending, say Obama and Republicans
Find me a Republican who won't deficit spend; it's in their blood, they know no other method of legislating. Obama agrees: raise spending across the board as a .2% growth rate cuts tax income. The ten year yield is now 2%; interest costs will rise and crowd out this government spending plan across the board. And, I am sure, the debt rollovers generated since 2008 are coming home to roost. We will likely see greater stress in the state pension sector; watch Chicago. This is the pro-cyclical government spending process.
Yahoo: Shaun Donovan, Director of the Office of Management and Budget under President Obama tells Yahoo Finance’s Rick Newman that the President has drawn “two clear, red lines.” “He’s not going to accept a budget that locks in sequestration,” says Donovan. “Second, he’s not going to accept fixing the defense spending cuts without fixing the non-defense side of the budget as well.”
Jerry Brown raises taxes:
Under Mr. Brown’s order, emissions would have to be reduced by 40 percent over 1990 levels by 2030. Under existing state law, emissions are supposed to be cut back by 80 percent over 1990 levels by 2050, and Mr. Brown said this tough new interim target was essential to helping the state make investment and regulatory decisions that will assure that goal is reached.
This is not about reducing emissions. This is about raising the revenue from CapnTrade to cover short term shortfalls. It won't work because the shortfalls in revenue will be dominated by the slowdown, and the slowdown will automatically reduce emissions. Most likely CapnTrade revenue will decrease, not increase.
Messing with the Lagrangian
As opposed to Lagrange numbers. I know they are connected, Lagrange numbers likely come from applying Lagrangian systems to estimation theory.
OK, why am I doing this? I want the Euler-Lagrange equations converted to a finite system: convert the function q(t) into a Lucas sequence, put that into the hyperbolic, show all the match points between Boltzmann, temperature, chemical potential, the Hamiltonian, the coupling constant, kinetic energy, uniform convergence of the q(t), and the relationship between time and quant number. Then relate the boson and fermion to their natural ratio of hot Wythoff moves to total moves, adjust the unit one with kT to account for an adapted system that adds motion to obtain uniform convergence, and then establish the sample rate needed to remain adiabatic. Boltzmann is mostly about the change in elasticity of an ensemble of colliding bubbles. Then show that Shannon sets minimum action and all this resolves into the finite precision Lagrangian. It works because with finite precision any multi-dimensional system can be decomposed into a recursive order of two dimensional systems, and still be within precision. The TOE is a real thing, the basics of adapted systems with no empty space. Believe it.
For example: when we deal with a Lucas sequence, q(n) = P*q(n-1) + Q*q(n-2), then we get the asymmetric hyperbolic condition. When we regroup the parameters, we should get, on the right, a 1/(kT)^2, or equivalent, which is the coupling energy. The Shannon channel rate changes. The chemical potential, which is the fermion's native hot move ratio, will imply kinetic energy in the variable Shannon calls noise. The numerators, information rate, are the hot moves for bosons, and signal the hot moves for fermions that retain stability. But everything goes back to the normalized two period lookahead, and the derived base forms a convergent series in b and 1/b for the Lucas sequence. We should get boson and fermion statistics. Stuff like that should happen.
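A small numeric illustration of the recurrence just quoted, assuming the standard Lucas-sequence starting values U(0) = 0, U(1) = 1 (the `lucas_ratio` helper is mine): the ratio of successive terms converges to the dominant root b of x^2 = P*x + Q, which is Phi when P = Q = 1.

```python
# Ratio of successive terms of the Lucas sequence q(n) = P*q(n-1) + Q*q(n-2);
# it converges to the dominant root b of x^2 - P*x - Q = 0, and 1/b shows up
# as the other root's magnitude when Q = 1.
def lucas_ratio(P, Q, terms=40):
    prev, cur = 0, 1                  # U(0) = 0, U(1) = 1
    for _ in range(terms):
        prev, cur = cur, P * cur + Q * prev
    return cur / prev

print(lucas_ratio(1, 1))   # the golden ratio Phi, ~1.618
print(lucas_ratio(2, 1))   # the silver ratio 1 + sqrt(2), ~2.414
```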
The final outcome will be:
[1-(1/b)^(1/N)] + [1+b^(1/N)] = 1/N for some finite N which makes the power series in b and 1/b match. The difference 1/N is the coupling energy, and should include kT. The error after order N is the uncertainty in the finite self adapted system. This is the finite log solution in the TOE, and I think it will work for the general case of adapted, finite systems. Does this get us all the powers of powers, or do we have to apply this recursively? Dunno, you tell me.
Why am I telling you all this? Because friggen brilliant mathematicians should know this is the next step in the unification. There are going to be a bunch of next steps as we redo math and science to match the TOE. So, you brilliant young, soon to be wealthy mathematicians and physicists: this is what I am doing, please beat me to it.
Tuesday, April 28, 2015
Appelbaum, incompetent economist
NY Times:
Central banks influence economic growth by raising and lowering borrowing costs. Higher costs crimp risk-taking; lower costs stimulate expansion. Those costs, expressed as interest rates, combine the price of money with an additional increment to compensate for inflation. Higher inflation means rates will run higher in normal times, allowing the Fed to make larger cuts during periods of duress.
Biny:
Look at the data and see if the central bank sets rates. I looked, and I found three times the Fed has set rates since 1970. One, the Nixon Shock; two, Volcker raised the reserve requirement; and three, Selgin and Beckworth proved their case, at least in 2003, that the Fed acted to reduce rates.
That's it; I have found no other place where Greenspan or Bernanke did anything other than follow the bond market. The Fed mostly smooths out the daily variance in overnight lending, and helps banks meet their monthly reserve requirement. Otherwise, rates almost always vary with the one year bond rate.
The assumption that the Fed sets rates was made by the MIT Basket Weavers, so they could use Newton's grammar, and they never checked that assumption. We are now at three generations of economists who have failed to check that assumption. It is not true in fact.
Right now the effective funds rate is below the IOER, and the IOER is the first direct rate setting power the Fed has ever had; it has been around since 2008, and the Fed is frightened to death to touch it. Right now the market rate for overnight lending is about .15, .1 below the IOER rate.
Biny, try looking at the data, then speak. Isn't that what they teach in econ?
Krugman has no evidence
Krugman wanted more stimulus:
But we should bear in mind that the world would be in much better shape right now if economic orthodoxy had in fact been followed. In practice, all the heterodoxy with any real-world influence has been used by politicians to justify policies that have deepened the slump and increased suffering.
Forget what economists tell us, show me the evidence. Look at the picture. Wages are sticky and unemployment flip-flops from high to low. In January 2009, when Obama took office, the unemployment rate was already 90% of its way to the peak. The election was in November 2008. Krugman is asking Congress to respond in January 2008, in the middle of a presidential election. Impossible.
Could Congress have made the unemployment rate drop faster? The only time it has ever dropped faster was the 1980 recession. In 2009 we had the debt rising faster than ever before in the series. We have no evidence in the series that DC could have done better.
Find me an economist who proves DC could have borrowed and spent at a faster pace. During the whole period from when Obama took office until April 2010, the ten year rate was 3.5%, about one point over its fair rate. Debt at those rates would have been useless. At those elevated ten year rates we likely shifted 1% of nominal GDP into interest payments.
Some numbers:
We can put some numbers up here. Unemployment increased by 7%, a debt incurred by workers. DC increased debt by 35%, but it is 20% of the economy, so relative to the economy, DC incurred debt of 7%. The numbers match in the most constrained environment seen since the Nixon Shock. There is no proof, I can guarantee that, and I seem to be better at this than the economists.
Find me that economist, now. It's time for them to put up or shut up.
Adapted, maximum entropy systems and fermion spin
For example, when using measure theory on self adapting systems.
Fermions and bosons are paired because of the maximum entropy Shannon condition. Bosons count the fermions plus the coupling constant, and the energy level, the 'bandwidth capacity', is set so the finite order of the fermion power series and the boson power series match. Absent kinetic energy in the fermion, the fermion will converge to maximum entropy about twice as fast; that is not minimally redundant. So nature adds kinetic energy to the fermion. Surprised? No; adiabatic systems are self adapted, finite measure; they get that fine tuning with motion. The effect of spin is to make Compton power spectral matching happen. Compton bandwidth matching is simply the derivative of Compton power spectral matching, which is mass-energy equivalence.
Adaptation by 2 period look ahead:
Why the 2 period model? Because causality is not ambiguous; events can be sequenced. That does not mean an action has only one cause; it means the causes, multiple, must happen in an ordered sequence. This is equivalent to Shannon noting that, within any finite precision, any 3 or 5 period model can be decomposed into a series of 2 period models.
But if you look at how spin adapted the system to minimum redundancy you will see the overlap is the Phi series, with Phi approximated to the precision limit. That limit is about 2^16, in units of quants, so we have a fourth order subdivision, and that gives us three dimensions of space and one of time. I presume time is the scalar, so I dunno, the system of sequential causality requires at least one scalar and one non-scalar? Measure guys, straighten this all out, OK, and the corporations will put huge quantities on your account.
The short answer on the nature of nature
My request to mathematicians: The math involved takes about three weeks for me, but what it says is that the two conditions above imply the universe can do its adjustments on the surface area, not the volume, in any adapted system. The key term being adapted, meaning there are fermions and matching bosons. But these come right out of maximum entropy, or minimum action, principles. This also implies that the Greek letters are defined by an imprecise power series. The fermion spin results because the two power series, boson and fermion, will not converge uniformly without spin in the fermion.
What's the deal with gravity? Dunno; Einstein shows that gravity has a non-uniform convergence of two power series, and he called it relativity.
Most of this is about nature doing Ito's calculus as accurately as it can while we are still deluded by Newton's calculus. The main difference: zero does not exist (the vacuum is not empty); there is no measure for nothing.
Take this straight to measure theory and show that any adapted system must match the group multiplicative identity to a series of boundary conditions at each measure quant, when the quants are ordered in a causal sequence. In other words, do the science of making the yardstick. Remember that zero has no measure; it is the precision, a positive value. That should explain it all.
Take fermion spin, for example. The fermion has to have extra kinetic energy because its power series, under maximum entropy, will not converge at the same rate as the boson's. The fermion has about half the ratio of hot to cold positions as the boson when they play the game of Wythoff. Zero, the winning position, never happens. That means Phi, in the Wythoff array, is a finite approximation, and the array is finite size for any adapted system.
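The Wythoff positions and the finite approximations of Phi are standard math, so they can be checked directly. A minimal sketch, assuming nothing beyond the closed form for cold positions (the n-th cold pair is (floor(n*Phi), floor(n*Phi^2))) and the Fibonacci ratio approximation; the function names are mine:

```python
import math

PHI = (1 + math.sqrt(5)) / 2  # golden ratio

def wythoff_cold(n):
    """n-th cold (losing) position of Wythoff's game: (floor(n*Phi), floor(n*Phi^2))."""
    return (math.floor(n * PHI), math.floor(n * PHI * PHI))

def fib_ratio(k):
    """Rational approximation of Phi from consecutive Fibonacci numbers F(k+1)/F(k)."""
    a, b = 1, 1
    for _ in range(k - 1):
        a, b = b, a + b
    return b / a
```

Note that fib_ratio(15) is F16/F15, and fib_ratio(15) - 1/fib_ratio(15) lands just under one, the stable near-one value that the finite approximation of Phi produces.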
Quantum entanglement is really the vacuum doing a finite power series by overlapping the bubbles; that is how it makes Pi equally imprecise going out from the 'one' on the ruler to the maximum integer notch.
Monday, April 27, 2015
Doing time with Ken Rogoff
Hilsenrath on Rogoff: Interest rates are low, he says, because investors are averse to risk, governments are forcing banks to hold low-risk government debt and central bankers are pushing rates down. This view leads to different prescriptions for the problems that now nag the global economy. Mr. Summers argues for government borrowing to fund investment. Mr. Rogoff wonders if the solution to a debt supercycle is really more debt. "It is highly superficial and dangerous to argue that debt is basically free," he says.

What is the cycle time of the debt cycle, relative to the cycle time of monetary regimes? He does not say. I have Reagan and Bush the Elder running up debt for 12 years, and Bill Clinton running it partly back down over eight. Then Dick 'deficits don't matter' Cheney ran it up until we crashed. Now Obama has gotten deficits partly back down. We are losing the time race.
The good news here is that debt cycles don’t last forever. “As the economy recovers,” Mr. Rogoff says, “the economy will be in position for a new rising phase of the leverage cycle. Over time, financial innovation will bypass some of the more onerous regulations. If so, real interest rates will rise though the overall credit surface facing the economy will flatten and ease.”
The Fed has its loan portfolio set at six years from now. That is a long time to maintain political sanity in DC, and too long to maintain sanity in California or New York, much less Illinois. We are in our regularly scheduled slowdown, it's election season, and the time is ripe for mass insanity among the politicians.
And what about public sector pensions? These folks are going to retire soon, so how do the fund managers convert stocks into cash without crashing the market?
The real interest rate under Reagan
This is the ten year rate, DC's cost of money, minus the implicit price deflator, YoY. It gives us a measure of the real rate of interest, interest paid in units of real stuff.
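As a trivial check of the arithmetic above, the real rate is just the spread; the numbers in the test below are made up for illustration:

```python
def real_rate(ten_year_pct, deflator_yoy_pct):
    """Real interest rate: nominal ten-year rate minus the YoY implicit price deflator."""
    return ten_year_pct - deflator_yoy_pct
```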
Now Reagan took office in Q1 1981, and that is when real rates took off. His first major act: cutting taxes across the board in August 1981. And whaddya know, rates took off and the deficit doubled.
We spent 12 years under Republican regimes paying huge interest rates while driving up debt. Those Republicans are a stupid lot.
Anyway, the reason I bring it up is that real rates for the US government seem to be closer to 2-2.5%. Republicans nearly bankrupted the government, actually.
Symmetry and fine tuning in physics
That one bugs me because symmetry changes the direction of fine tuning.
Is the particle an uncompressed system in a compressing vacuum, or vice versa?
The boson always has twice as many hot moves as cold; the fermion about half that ratio. The maximum intermixing between the two is where all the players want to be, and that is where they all have the same approximation for Pi. So the fine tuning always starts there and adjusts inward for the fermion and outward for its corresponding boson. Reversing charge makes the exterior the starting point inward for the boson, and the center the starting point outward for the fermion.
It is the differential direction of fine tuning that blocks symmetry. The system is low dimensioned; the power series that the vacuum computes, by a local additive process, is barely more than order six. It does not know Pi, it is computing it. So physicists seem to be mixing up the symmetry in Newton's calculus with what nature does. Nature starts with low dimensionality Ito, then increases dimensionality up to the band limit of light. It never gets to Newton, it just improves Ito.
It is all in the game of Wythoff, except no one wins the game. To win the game and be stable would mean the vacuum does Newton.
Sunday, April 26, 2015
Boston Fed says we need the Smart Card
Boston Fed: You can check in but you can't check out.
Their report says just use the dollar for the top end of the curve and make interest payments. The rest of us will be needing the new smart card currencies.
http://www.bostonfed.org/economic/current-policy-perspectives/2014/cpp1412.htm
During the onset of a very severe financial and economic crisis in 2008, the federal funds rate reached the zero lower bound (ZLB). With this primary monetary policy tool therefore rendered ineffective, in November 2008 the Federal Reserve started to use its balance sheet as an alternative policy tool when it began the large-scale asset purchases. Now attention is turning to how the Fed should transition back to a more conventional monetary policy stance. Largely missing from these discussions about the Fed's "exit strategy" is a consideration that perhaps it should retain, not discard, the balance sheet tools. Since the Dodd-Frank Act (DFA) has added maintaining financial stability to the Fed's existing dual mandate to achieve maximum sustainable employment in the context of price stability, it might be beneficial to have several tools to achieve multiple policy objectives. An additional consideration is that some of these tools may be needed to stem future crises as a result of the DFA's new limitations on how the Fed can provide liquidity under such adverse circumstances. In an effort to spur a broader debate, this brief discusses what is known and knowable regarding the effectiveness of balance sheet tools and examines four primary arguments for keeping these as part of the Fed's toolkit.
The relative costs and benefits of conventional versus unconventional policy are difficult to know, so appeals to these types of arguments in favor of one type of policy tool are hard to support. In terms of the absolute costs of balance sheet tools, there was a fear that using such tools would engender high inflation or unanchored inflation expectations. Yet after six years, neither result has occurred.
Having more than just one primary policy tool confers greater flexibility and may allow the Fed to better fulfill what are now its three policy goals. Moreover, using balance sheet tools to specifically target the sector(s) that are in disequilibrium would let the Fed better focus its policy efforts on the sectors it wants to affect and would diminish some of the potential policy tradeoffs that arise when just one policy instrument is available. Such arguments become even more powerful when the simultaneous objective of ensuring financial stability must be met.
In a low inflation environment, the probable frequency and duration of hitting the ZLB may actually be much greater than previously appreciated, and hence the need for having alternative policy instruments may be more critical than before.
With the ability to operate more directly on the asset classes and interest rates it would like to see changed, the Fed may be able to better communicate its policy intentions to market participants.
While more research is needed to study the relative costs and benefits of conventional versus unconventional monetary policy tools, including the credit allocation effects that result from using the federal funds rate versus more precisely targeted balance sheet tools, there are benefits from using balance sheet tools that are not available from using the federal funds rate alone. Policymakers should seriously consider the gains that could result from keeping balance sheet tools in the Fed's arsenal.
Doing Shannon as an adapted quantized system
This is unfinished work; I ain't figured it all out. There will be another post on fermions and bosons, likely unfinished. In that work I construct the fermion problem as a queue with only 0 or 1 item, and the boson as a queue with 1 or 2 items, then construct the queues as series sums in (1/3)^n and (2/3)^n, getting aggregate sample rates (Temp, in the physics) of 2 and 1/2. Stuff like that. I am fishing for those friggen mathematicians, as usual. But I post unfinished work, that is what I do, sorry.
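The two series sums quoted above are ordinary geometric tails, easy to check numerically; `geometric_tail` is my own name for the sketch:

```python
def geometric_tail(r, big_n=200):
    """Partial sum of r^n for n = 1..big_n; converges to r/(1-r) for |r| < 1."""
    return sum(r ** n for n in range(1, big_n + 1))
```

So the fermion queue rate comes out at 1/2 and the boson queue rate at 2, the one-hot-per-cold and two-hot-per-cold figures from the Wythoff discussion.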
Making Shannon a simple spectral allocation
Let's consider all following as variance or power spectra:
C - Capacity
B - Total power available, energy
S - Available spectra, signal spectra as an output
N - Allowable quantization error
Now I have:
C/B = 1 + S/N, or: the proportion of capacity variance, as a proportion of the available variance, must be the variance of the separator (Ito's dx, unitized) plus the proportion of the signal space available relative to error; this ratio is precision.
In other words, given the allowable error the system needs, and the precision available, we have the capacity needed to stay within the total energy. It's a kind of quantized Hamiltonian.
Now I want to adapt the system to get the dynamics. The system always makes exchanges to maintain the balance above, especially precision, S/N, which is fixed. To make it adapted I need to ensure the system maintains exchanges within the bandwidth allowed for symmetric precision. So I convert all my power spectra into exponential growth projected two periods ahead, so the exchanges sample at twice the rate of the projection. For simplicity, let the lower case of each variable above be the log of its square root. We get, after moving things around:
e * [e^(2*(c-b)) - e^(2*(s-n))] = 1
or
{e^[c-b+1/2]}^2 -{e^[s-n+1/2]}^2 = 1
Is this legal? It is if I have two independent exchange systems at the adapted equilibrium, but it says nothing about dis-equilibrium. One solution set is the hyperbolics, if we have a function that maps c-b+1/2 into s-n+1/2; these are the logs of cosh and sinh respectively.
Solution:
Anyway, to make a short story here: we get two roots for Y in Y + 1/Y and Y - 1/Y, one from B/S and one from S/N.
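A quick numerical check of the identity behind those two roots, with cosh and sinh written as (Y + 1/Y)/2 and (Y - 1/Y)/2 for Y = e^x. The factor of 1/2 is the standard hyperbolic convention, an assumption on my part about where the author's Y + 1/Y and Y - 1/Y land; function names are mine:

```python
import math

def hyperbolic_gap(x):
    """cosh(x)^2 - sinh(x)^2, identically 1: the adapted-equilibrium form above."""
    return math.cosh(x) ** 2 - math.sinh(x) ** 2

def roots_from_y(y):
    """Recover cosh and sinh from Y = e^x as (Y + 1/Y)/2 and (Y - 1/Y)/2."""
    return (y + 1 / y) / 2, (y - 1 / y) / 2
```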
Protecting copyrights and patents is not free trade
Free Trade means two parties exchange without fear of threats or force.
The Trans-Pacific Trade Agreement is an agreement to eliminate tariffs, except for copyrights, patents and instances where partners get a prior agreement on tariffs. This agreement expands the power of courts to restrict trade where there are IP conflicts. It does not meet the definition of free trade.
Krugman is right to complain. Mankiw argues for free trade, but this agreement is restricted trade. Mankiw simply changes the semantics so he can select the restricted trade that favors his point of view; Mankiw is not a free trader. We all know the difference between restricted trade and free trade, and Mankiw is wrong.
Don Boudreaux quotes a definition that denies this is a free trade agreement:
Understanding the connection between economic freedom, entrepreneurship and prosperity isn’t difficult. In a free market, entrepreneurs devise new products, as well as new methods of production and distribution. If consumers find entrepreneur Jones’s new product valuable enough to buy it at a price that covers its cost, Jones reaps profits. If consumers find entrepreneur Smith’s new product to not be worth the price necessary to cover its costs, Smith suffers losses that are his to bear. This simple market test—one in which each consumer and entrepreneur spends his or her own money, and in which almost all economic transactions are consensual—is by far the best means yet devised for ensuring not only that scarce resources are used as productively as possible, but also that creative human effort is continually called forth to discover ever-newer and better ways to use resources.

OK, Don, how does going to court to retract a cash transaction satisfy this definition? You say people have the right to buy and sell patents, fine. We also have the right not to buy and sell them, fine also. The definition calls a product the thing exchanged; I exchange a product without a patent, you don't like it? Great, so you restrict my trade using the government police force? You and Mankiw are friggen socialists. If an entrepreneur does not want someone to copy, then hide and protect the method using his/her own money, and keep the Cafe Hayek socialists out of government.
Score one for Krugman, he defends free trade.
Deutsche Banks wants a liquidity formula
Right now, people in markets are worried about one big thing: liquidity.
But there's a problem: no one is exactly sure how to define or measure it. This week, Peter Hooper and his team at Deutsche Bank wrote a big report dissecting the subject of liquidity and defined it — or tried to — as follows:
Let i be the probability of a trade relative to the total market. Then we want -iLog(i) less than 1.0. Now this makes a bold assumption, namely that the stock market or bond market trades paper coherently with the economy it measures, and the economy is adiabatic.

What do we mean by market liquidity? Although there are potentially many different definitions of market liquidity, in its simplest form we think of a liquid market as one in which trades can be executed with some immediacy at low transaction costs. But even within this short and simple definition there are many uncertainties: Does this refer to all trades, regardless of size, or only trades of a "normal" size? What constitutes a low transaction cost, and how do we best measure this? Because of these uncertainties, there is no single best metric for the level of liquidity in a market.
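My -iLog(i) condition can be sketched directly. The natural log is an assumption on my part; note that -p*log(p) peaks at 1/e for p in (0, 1), so the 1.0 bound is generous under that choice of base:

```python
import math

def trade_entropy(p):
    """-p*log(p): entropy contribution of a trade with probability p (natural log assumed)."""
    return -p * math.log(p)

def liquid(p, bound=1.0):
    """Liquidity condition sketched above: the entropy contribution stays under the bound."""
    return trade_entropy(p) < bound
```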
Saturday, April 25, 2015
Potential output, yet another bogus variable from the MIT basket Weavers
I had another of those shocks that economists give from time to time. This one from Krugman, complaining about the IMF calculation of potential output. I used it when I was talking about spectral decomposition in DSGE, so now I have to rethink that some more.
What is the problem with potential output as defined? Here is the definition from Wiki:
In economics, potential output (also referred to as "natural gross domestic product") refers to the highest level of real Gross Domestic Product output that can be sustained over the long term. The existence of a limit is due to natural and institutional constraints.
OK, institutions are 40 years long, the long bond is thirty years. It has to be measured over that time span. The IMF measures it over the recession cycle, about eight years. What are they measuring? The ability of economies to adapt to the American recession cycle. Is that a good idea? Well, the US has dominated the world economy, so I guess so. But what a dreary proposition, and it makes the measure useless for much of anything endogenous to the native economy.
Is my method of DSGE spectral decomposition any better? Sure, a bit, to the extent it is locally accurate. Spectral decomposition under the adiabatic assumption tells us the relative risk to each component of the DSGE when the economy slows, because it tells us which component restructures first, in all probability.
The Web is virtual money
What is it when a medium size company negotiates airline and hotel discounts based on volume? Those discounts go onto the company books and are used over time, just like a savings account. In 1985, medium sized companies could not do this easily, searching for the proper set of discounts was costly in the Peter Diamond sense. So the travel budget was real money, sitting in a real bank deposit. Today negotiating and setting a large deposits of hotel discounts is a few clicks on the web, maybe a phone call. All companies do it. That is lost business for the standard banks.
Banks want that business back, so banks need to have those discount points tradable. Banks need to get into the discount point business, and they need smart cards and banker bots. Banker Bot is especially useful if a trusted bank, like Bank of America, can offer a no-arbitrage savings and loan business on frequent flyer miles. That is a big win for everyone: the bank, the airlines and the flyer. Banker bot can create networks of discount points, so the correlations between hotel discounts and airline discounts match; then everyone wins to the extent that hotels and airlines share customers. If the banker bots trade, then tourist industries can buy and sell both discounts and adjust their capacity, another win.
Now we make this capability available to consumers via the smart card. The smart card can purchase in the optimally matched price of dollars plus discounts. This is a must win business for traditional banks. All of these banks need to jump in with both feet and push the smart card and banker bot technology as far out into commerce as they can.
Housing boom in 2015 says Goldman Sachs? Not so.
The chart on the right explains GS assumptions of 2.8% GDP growth in 2015. The main driver is 6.8% growth in residential investment, on the right side of the chart. Currently Q1 is showing a 4.8% decline, YoY, in existing sales, to Q3 2014 levels. And new home sales dropped 11% in March, but that drop was from a previous high.
This is the annual rate of housing permits, seasonally adjusted. Can GS get a 6.8% jump in these numbers? The market has leveled off. Where are the new buyers going to come from? Job growth has leveled off, and we have a low growth Q1. New buyers need at least one quarter to become a bit more confident, so that puts all the growth back into the summer home season.
I do not think so.
Friday, April 24, 2015
Jerry Dumps the Environmentalists
CalWatch: Gov. Jerry Brown’s administration has decided to scrap a key environmental commitment and forge ahead with a controversial, costly plan to build two massive water tunnels under California’s San Joaquin River Delta region.
The unpalatable choice underscored both Brown’s resolve to see the project through and the daunting challenges he still faces in trying to secure enough political support to do so. Without the tunnel plan, Brown would have to go back to square one in his ambition to secure adequate water resources for Southern California over the long term.

Few options

As the San Jose Mercury News reported, the $25 billion tunnel project contained $7.8 billion earmarked for preservation and restoration efforts affecting some 100,000 acres of wetlands and targeted areas. Throwing that sum into doubt, Brown has walked away from the centerpiece of the environmental plan — a so-called “50-year guarantee” to environmentally safeguard the Delta.

Although the surprise change has not yet been finalized, Brown’s choice appeared to be cemented by a lack of federal regulatory support for the commitment. According to CBS San Francisco, “biologists at the U.S. Fish and Wildlife Service and other federal agencies told the state they won’t issue permits for the environmental plan because the state cannot prove it will restore salmon, smelt and other wildlife.”

Now, as water resources officials told the San Francisco Chronicle, Brown hopes “to use $17 billion from state water contractors just to build and operate the tunnels. That would allow habitat restoration work on the delta and surrounding waterways to begin immediately regardless of what happens with the tunnel project,” according to Bay Delta Conservation Plan spokesman Richard Stapler. But the Chronicle noted that “large water contractors footing the bill have said they aren’t willing to pay for the plan if they don’t have long-term water guarantees.”

A difficult calculus

Contractors have become skittish over the plan’s budget for complex reasons. The half-century guarantee Brown has decided to jettison didn’t just provide environmentalists with a predictable framework. It helped water agencies manage their own expectations as well. “The agricultural and urban water districts that are the major drivers of the long-planned project were betting that a 50-year permit would stabilize delta deliveries that have been restricted by increasingly stringent protections for endangered fish,” the Los Angeles Times reported.

Since a narrower time frame would open up the possibility of changes in permitting rules, water districts and contractors grew concerned that it was becoming too hard to determine if the high cost of the tunnels would pay off.

Longstanding concerns
Brown’s move confirmed fears state environmental groups have harbored for years. As early as 2013, environmentalists went public with criticism of the tunnel plan. In a sign of how strongly they opposed Brown on the issue, some policy activists used the divisiveness of Brown’s other cherished high-stakes project — high-speed rail — to amplify skepticism for the tunnels.
Simple ideas on non linear DSGE
We want to keep it simple. We assume households, firms and government are adapted; that is, they transact goods at the proper size and frequency such that the two period model is met: they are adiabatic. How do we find the number of goods that can be moved in some given, short period?
Assume perfect linearity and measure the real growth rate level and the variance of the growth rate over some stable period. The variance is the power deviation from linearity. It is a measure of the gap between the value of e in the limit and the value of e in actual use, where e is Euler's number. That variance should be small, equal to the variation in prices.
The signal in this case is e, everyone is trying to keep up. The noise is the variance from trend. So the signal to noise ratio is high, the variation in the checkout counter is not that bad. How much value can be transacted in aggregate?
We get C/(2*B) from the Shannon condition, value rate over potential transaction rate if everyone knew e. Of the total bandwidth, 2*B, 2*B-C is the clock rate of the economy, the speed at which the check out counter can work the register.
Now I used value instead of good because, when the economy is adapted, packing occurs, and value is the -iLog(i), not i alone.
Now we know, when the economy is two period adapted, there is an index that can be uniquely assigned to each transaction. That index is:
2^(C/(2*B)) - 1. C/B is large, and let's just take it up to the next integer. Then we get a two bit counter which assigns a two bit number to each good. The checkout person takes the good package and runs it through a decoder selecting the proper index.
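The counter-size arithmetic in that last step can be sketched; the S/N value in the test below is illustrative only, and the function names are mine:

```python
import math

def bits_per_sample(snr):
    """Shannon condition: C/(2*B) = 0.5*log2(1 + S/N) bits per transaction sample."""
    return 0.5 * math.log2(1.0 + snr)

def index_count(snr):
    """Distinct transaction indices 2^(C/(2*B)) - 1, rounding bits up to the next integer."""
    return 2 ** math.ceil(bits_per_sample(snr)) - 1
```

With an illustrative S/N of 15 this gives 2 bits per sample and 3 usable indices, the two bit counter described above.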
Now we go back to our linear DSGE model and go ahead and solve it, getting a short series sum like e^(a*t) + e^(b*t) + e^(c*t), for example, the sum computing the real GDP growth over time. But t counts from 1 to 2*B. The a, b and c are ordered and diagonal. The number of clock ticks they get is proportional to their covariance (or deviation, see note below), and the bits in their index are proportional to the coefficients a, b, c. So from there we get the relative transaction value and probability of occurrence.
Non-linear effects occur with the number of agents, N, in each group, or when the transaction rate slows down and real GDP growth variance changes. Quantization requires one agent per transaction. Check for the probability that some agents will crowd the checkout counter.
Note of caution!
The two period model implies the agents can plan two periods ahead, and that determines whether you divide the spectrum by the covariation of the sectors or the square root of the covariance. So you have to keep that straight; that is where the checkout counter samples at twice the rate of purchases.
Assume perfect linearity and measure real growth
The signal in this case is e, everyone is trying to keep up. The noise is the variance from trend. So the signal to noise ratio is high, the variation in the checkout counter is not that bad. How much value can be transacted in aggregate?
We get C/(2*B) from the Shannon condition, value rate over potential transaction rate if everyone knew e. Of the total bandwidth, 2*B, 2*B-C is the clock rate of the economy, the speed at which the check out counter can work the register.
Now I used value instead of good because when the economy is adapted, packing occurs, and value it the -iLog(i), not i alone.
Now we know then the economy is two period adapted, then there is an index that can be uniquely assigned to each transaction. That index is:
2^(C/2*B)-1. C/B is large, and let's just take it up to the next integer. Then we get a two bit counter which assigns a two bit number to each good. The checkout person takes the good package and runs it through a decoder selecting the proper index.
Count me as puzzled by this transgender thing
BOSTON (CBS) – A Massachusetts couple knew their child wasn’t happy. So they made the courageous and difficult decision to raise their daughter as a boy at his own request.
As he showed off his collection of rocks and dead bugs in his bedroom, Jacob Lemay also said how he hated his old name. “Cause it was the stupidest name ever,” Jacob said.
He’s talking about the name ‘Mia,’ which was scrapped last June, when his parents and family therapy team concluded the then four-year old was transgender — a little girl wired as a little boy.
“That’s the kind of psychological burden that I don’t think anyone should have to deal with, especially not my child,” Jacob’s mother Mimi said.
They say this wasn’t tomboy stuff or some passing phase.
Mia began verbalizing it at age two, triggering a long family struggle with the subject, as the little girl grew increasingly unhappy and withdrawn — stuck with a gender she did not embrace.
Now, don't get me too wrong, but the child is being raised as a normal kid, and how many normal girls really want to spend the rest of their lives in dresses that are no fun, hair that is a pain in the ass, ill-fitting shoes, stupid dolls and learning color coordination?
How exactly does a girl dress like a boy? Blue jeans and a shirt? For crying out loud, all girls and boys dress like that, that is the standard uniform. It is some other bizarre culture that creates the dolled up look as we know it.
And what is a boys name? OK, 'Buck', I give you that. Naming a kid after a dog is definitely a male thing.
Market regulators arrested the wrong Bot
Zero Hedge complaining about Bot inequality
Zero Hedge has been bitching about this for some time. And when some HFT bots lose money, the regulators have turned back time and reversed the trades. But when Nav's bot does well, he gets arrested!
That x thing in Brownian motion
See it? That x^2 thing? What is it? It is the solvent. Einstein and company are, yet again, assuming the vacant space in which Newton's Grammar can be written. That is not so; the x things are particles, just like the thing that is doing the D, for diffusion. There are two, and x^2/D is a power ratio, otherwise known as a spectrum ratio.
What we are looking at is more than likely e^[-(tanh)^2]. You can't get completely there by assuming the solvent is a flat property; there is a covariant viscosity between the two materials. This is a power spectral distribution between two adapted systems. T is just Ito's calculus counter, which tells you the order in which things must fit into the differential. It is really a sequential quant number. N is the additive number which generates the rational approximation to Pi and e. It will look like ak/aj, where the k and j go as a1 + 2*a2 or a1 + a2, depending upon the Lagrange.
I checked this out: that hyperbolic angle 3/2*ln(Phi) comes out as something like log(Phi+2), so at that half angle the second Lagrange is taking over, and Phi will be something like F16/F17, making it the log of (F16+2*F17)/F17. The silver ratio takes over at the barrier where probability is highest and the first Lagrange cannot get any more accurate. So that N and the Pi thing will cancel, e becomes a rational ratio, and p(x,t) will end up being -tanh*log(tanh) or tanh*tanh', one of those, I am sure.
Hyperbolics does this with the Hurwitz rational approximation and additive sequence. I mean, the graph guys are using a similar scheme to make 'greedy' links and nodes which work just like the rational approximation sequence. It is time to dump the Greek letters. Newton is the limit as Ito's index gets large, nothing more than that. When the index gets larger, the sequential Ito counters, t, become dense; that is all that is happening.
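To make the Hurwitz rational approximation concrete: consecutive Fibonacci ratios F(n+1)/F(n) are exactly the greedy rational approximations to Phi, and every one of them lands within 1/q^2 of the target (Hurwitz's theorem sharpens the constant to 1/(sqrt(5)*q^2) for infinitely many of them). A quick check:

```python
import math

phi = (1 + math.sqrt(5)) / 2

# Build the Fibonacci sequence; consecutive ratios F(n+1)/F(n) approximate Phi.
f = [1, 1]
while len(f) < 18:
    f.append(f[-1] + f[-2])

# Every convergent p/q stays within 1/q^2 of Phi, and the error shrinks
# monotonically as the index grows -- the "dense counter" behavior.
errors = [abs(phi - p / q) for p, q in zip(f[1:], f[:-1])]
assert all(e < 1 / (q * q) for e, q in zip(errors, f[:-1]))
assert all(e1 > e2 for e1, e2 in zip(errors, errors[1:]))
```

So the sequence of stable mappings really does tighten at every step, which is the fixed-point behavior described above.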
Thursday, April 23, 2015
Smart Cards, the modern way to hoard cash
Bankers think cash is inefficient. Citi’s Willem Buiter looks at the inefficiency of cash hoarding and is looking for better hoarding technology.
What do we need?
- We need trusted exchangers that use no arbitrage banker bot technology.
- We need to use the public key encryption technology so we can always verify that a set of digits was encoded by a trusted exchanger.
- We need counterfeit-proof smart cards that contain our faces in photo and digital form.
With these we can hoard valuable, secure digit sequences all we want. If the connection between the banker bot spreadsheet and the digit encryption is secure, then there is no way any banker or money exchanger can cheat the system. The banker bot can verify the digits using its own private key, then match block counts, and then put the digits into the cell labeled "savings balance" with its uncertain term length and posted rate, as I have mentioned here. The weak link is the trusted borrower; Banker Bot runs a savings and loan ratio business. It guarantees that all the digits it produces will collectively comprise a cotangent/tangent group. So it needs trusted members who will try to 'out fox' banker bot, and have fun doing so, so as to keep the ratio function up to date and accurate. But as long as merchants like secure digits, then there should be a stable set of connected merchants to perform the member bank function.
What Banker Bot really does is ensure that all customers buying stuff will be a unit-variance Gaussian distribution of arrival rates at the checkout counter. If they are crowding in line, then the merchant has assurance that a price hike will restore equilibrium. And for customers, Banker Bot guarantees that when the line is crowded, they will be better off shopping less and putting money in the savings account. Neutral prices are discovered by the queue length. Merchants are all enthused about this, Banker Bot guarantees it; this will be very popular, safe and secure money.
How does the customer trade digits for tax dollars? Easy: the merchant knows the price he pays for a can of beans, in tax dollars. His register can always invert the price and get tax dollars per digit, since he sells beans for digits. The merchant is simply adding money transactions to the list of products. Since the digits are always a Gaussian distribution there should be no problem, except with the tax dollar. If the tax dollar has an uncertain distribution, then raise the price of the exchange.
Consider my little community here in the Tower District of Fresno.
It is a pedestrian community with about 50 shops, restaurants and clubs. All this community needs to do is have about 10% of prices paid in Tower points, like discount coupons. That is enough to stabilize the foot traffic. They get a 30% boost in inventory efficiency, and the cost is nearly nothing, once the smart cards are in place. Our local Bank of America could easily run the system, providing a secure place for Banker Bot. The new smart cards would be compatible with existing terminal and ATM systems. The bank web site could sell local advertising. Every merchant in the area would join because the digits are targeted to the local customer base. Digits and customers have the same probability distribution around the community.
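A toy version of that register inversion. The bean prices below are made-up numbers, purely for illustration:

```python
# Hypothetical register entries for one product.
beans_price_dollars = 1.50    # tax dollars per can
beans_price_digits = 3.0      # digits per can

# Inverting the posted prices gives tax dollars per digit,
# with no outside exchange quote needed.
dollars_per_digit = beans_price_dollars / beans_price_digits

# A customer cashing out 10 digits gets their tax-dollar value the same way.
payout = 10 * dollars_per_digit
```

The merchant's own price list is the exchange rate; adding a spread for an uncertain tax dollar is just raising `beans_price_digits`.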
Where is the problem?
Texas might want to rejoin the secession movement
Money velocity, good or bad?
Velocity: the number of times we trade per quarter. There are two of them; the blue is the slower, pegged to the left, and the purple the faster. The blue includes checking deposits. I just say the faster ones are weekly cash purchases, the slower are monthly bills paid.
The faster we go, the more specialization we have, more or less. But that is less leisure and more work. You be the judge.
Aggregate statistics would just tell us that we need enough transactions to match leisure, to be adapted. In other words, hot and cold Wythoff positions match. But it looks like velocity was too high in the 90s and too low today. Look at the blue, the monthly bill payers. There are fewer of them, less monthly banking. And they both seem to drop with real growth. They both took that leap down during the crash and neither are returning. Banking has become less efficient.
Wednesday, April 22, 2015
Are we doing the sudden stop in new home sales?
Kathleen Madigan at WSJ looks at new home prices and old:
Two different segments of the housing market are yielding two different price trends. Will we do the sudden stop on new home sales?
When discussing Wednesday’s news that existing home sales climbed in March, National Association of Realtors chief economist Lawrence Yun said the 7.8% yearly rise in March’s median price was unsustainable. “This price gain of near 8% is not healthy, considering people’s incomes are only rising by 2%,” said Mr. Yun. “The only way to relieve housing cost pressure is to have more homes coming onto the market.”
Yet research by economists at TD Securities show the uptrend in resale values is nothing compared to the speedy rise in new-home prices.
Unsustainable means the market will be filled. When that happens, the flow of additional interest payments from mortgages stops. That in turn means short-term credit becomes tight, having no backup collateral. The short end of the curve flattens. Why is that? The long- and short-term credit markets have to stay connected, so we get a steep yield curve. Connecting the credit network means keeping enough credit activity at the knee of the curve, ensuring enough market liquidity.
Is this a crash?
No, not necessarily. There were no sharp moves in home prices during the run-up to the 2008 crash; the housing market performed fine. See for yourself on the chart above: those prices in 2004-2009 seem smooth and symmetrical to me. Housing will cause a small rise in short-term credit, then the flattening. Housing prices, by themselves, seem to adjust as needed. Home owners are a heterogeneous lot, each with different credit demands and constraints, mostly unconnected, hence liquidity is not hampered.
On the other hand:
The City of Chicago, or the California public sector, are no heterogeneous mix. For example, all the public pensions tie the stock market back to local government budgets and back to public sector employment. California, being 15% of the economy, has all this connected up. And this employment is tied back to DC, which handles the flow for the mix of government programs out here. And these mixed programs, like Obamacare, tie back into the pension payments of the public sector. And the adjustment volatility of adapting these new programs again affects local government budgets. Then the Obamacare tax changes alter the tax collection in California, and that changes the funds flow from Sacramento back to city and county governments. Finally, we have the sudden retirement shift as each laid-off public worker picks the same timeframe to retire.
So housing adjusts with no problem, but the California public system gets massive jitters, segments, and fails.
Tuesday, April 21, 2015
Fermions and Bosons
Just thinking about the adapted hyperbolic system:
(1/2 * (b + 1/b))^2 - (1/2 * (b - 1/b))^2 = 1
What is this except a matched set of bosons and fermions. Call it the Shannon condition, why not. The bandlimited number of fermions minus the bandlimited number of bosons will be one unit of exchange energy. This confirms my view that the Cosh is the cold position, and cosh(0) is the winning position, the bubble that does not exchange. But the sinh are the odd functions, and I would have thought they would be the fermions; I guess something is not quite right in my thinking, but never mind, go on.
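Writing b = e^x, the first square is cosh(x)^2 and the second is sinh(x)^2, so the two matched counts always differ by exactly one unit; a quick numerical check:

```python
import math

# For b = e^x, (b + 1/b)/2 = cosh(x) and (b - 1/b)/2 = sinh(x),
# so the difference of squares is always exactly one unit of exchange energy.
for x in (0.0, 0.5, 1.0, 2.0, 5.0):
    b = math.exp(x)
    cosh_x = (b + 1 / b) / 2
    sinh_x = (b - 1 / b) / 2
    assert abs(cosh_x**2 - sinh_x**2 - 1.0) < 1e-9
```

At x = 0 the sinh side vanishes entirely, which is the cosh(0) = 1 winning position, the bubble that does not exchange.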
I am cheating here, certainly, because I assume the bubbles have already done the job of figuring out how to pack a sphere, and I am just dumbly decoding what they did. The numbers of things are built right into the exponents.
Now we can get the statistics because we have energy levels quantized, by assumption. So, b is really:
kT is what you make it, the transaction rate of hitting a tennis racket or the transaction rate of hot molecules, who cares, the system is adapted. That exponent is closely related to how close the energy levels get to tanh = 1.0 before the capacity is exceeded.
The statistics will always look something like:
Out of scale, as usual, but these are just the second derivative plots for Coth (red, fermion) and Tanh (blue, boson). They meet the flow conditions for the two groups, and give the probability for each of the integer exponents.
I mean, this is all about encoding the groups to meet Shannon. In one direction you are mapping to SNR and including the optimum sampler; in the other, the fermions have their own sampler, and you are mapping NSR and excluding the sampler.
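One concrete place the tanh/coth pairing does show up in the standard statistics (a textbook fact I am adding for support, not the author's derivation): the Bose-Einstein occupancy 1/(e^x - 1) equals (coth(x/2) - 1)/2 and the Fermi-Dirac occupancy 1/(e^x + 1) equals (1 - tanh(x/2))/2, with x = E/kT:

```python
import math

# Verify the hyperbolic forms of the two occupancy numbers, x = E/kT.
for x in (0.1, 0.5, 1.0, 3.0):
    n_be = 1 / (math.exp(x) - 1)            # Bose-Einstein occupancy
    n_fd = 1 / (math.exp(x) + 1)            # Fermi-Dirac occupancy
    assert abs(n_be - (1 / math.tanh(x / 2) - 1) / 2) < 1e-9
    assert abs(n_fd - (1 - math.tanh(x / 2)) / 2) < 1e-9
```

So coth carries the boson-like occupancy and tanh the fermion-like one in the standard convention, which may be why the labeling above feels reversed.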
So let's play the game of Wythoff.
F is the number of fermions, B the number of bosons. For some energy states, the ratio of Fermions to Bosons is F/B. Now it takes three Wythoff moves for a Boson to maintain a cold position, or F/B^3. The same holds for the Fermions, B/F^3. At low energy levels, the fermions start with at least one. And the probability of finding a fermion is F/B^3, which is large when B, the sinh, is small. At high energies, the number of fermions is close to the number of bosons.
At high energy states, the denominator dominates.
So do I understand this? No, still confused. But it seems clear the game of Wythoff is happening, and it takes three moves to exit the sphere. The connection with fermion and boson statistics most likely has to do with the connected system, so we end up with a rational approximation, of small order, for the exponential.
Unit rooting our way through the Chicago Schools budget
(Reuters) - Yields topped out at 5.63 pct for bonds due in 2039 in a $295.7 million Chicago Board of Education general obligation debt sale on Tuesday, according to an initial pricing scale obtained by Reuters.
The yield was 285 basis points over Municipal Market Data's benchmark triple-A scale. The financially struggling public school district is selling the debt through lead underwriter PNC Capital Markets after its ratings with Moody's Investors Service and Fitch Ratings were lowered last month to one notch above the junk level. (Reporting By Karen Pierog)
Chicago can never unit-root their way through 25 years; this is risky, though bond dealers deny it. Then this: 5.6% for a 25-year bond. 25 years includes three recessions with unemployment at 9%, and three collapses in home prices. The interest rate is also about 1/3 of what the school board needs to pay into pension funds, likely close to 15% of teacher income.
Here is the opening line on the Chicago Schools web site:
The Chicago Public Schools Fiscal Year 2014 budget protects critical investments in the future of Chicago’s next generation while reflecting the challenging financial realities of a district facing a historic $1 billion budget deficit. This year’s deficit, driven by a $400 million increase in pension payment obligations coupled by flat and declining revenues and increasing contractual and statutory obligations, has led to some difficult choices. But in spite of our financial challenges, our commitment to providing every child in every community with a high-quality education is as strong as ever, and the district, parents, teachers, principals, community members, local elected officials and business leaders must work together to protect investments in student learning that will maintain the tangible progress being made by our students.
What do Moody's bond raters say?
New York, March 06, 2015 -- Moody's Investors Service has downgraded to Baa3 from Baa1 the rating on the Chicago Board of Education, IL's general obligation (GO) debt. The Baa3 rating applies to $6.3 billion of the district's outstanding GO debt. The outlook remains negative. The Chicago Board of Education is the primary debt issuer for the Chicago Public Schools (CPS).
Chicago schools put out their own budget forecasts and summary. Here we have funds used for debt service, two-year-old data. The trend is up; it never goes down. The system is bankrupt.
Monday, April 20, 2015
How banker bot handles term length, gains and losses
The hyperbolic banker bot equation is:
savings^2-loans^2 = 1
where savings and loans are growth rates in the balances. Banker bot is a spreadsheet, and the equation projects a net loss over the next two periods.
What is a period?
It is left undefined in units of time; but in units of spectrum, it is two banker bot transactions, and a transaction is the event where banker bot sets savings and loan rates, again unspecified as to term period. Banker bot is hunting for the economic bandwidth, operating in the spectral domain.
In other words, banker bot is challenging the member banks to determine the term periods for the posted rates. The rates are like adjustable-term loans and deposits. Once the rates are posted we get the following dynamics. The longer the member banks wait, the cheaper the periodic rate for their loan on balance and the more expensive the periodic rate for their savings on balance. Eventually member banks will pull their savings, or increase their borrowing, as the term length grows. At that point, banker bot sees that the flow constraint is out of balance by some small value. How small is that value? Banker bot has to pay a few humans to manage the member bankers and rent a building. So banker bot actually makes a bet by posting balances; it uses its own system to fund its own humans. Its unit of error is what it needs to make sure rent and salaries are paid.
I can only guess what banker bot aims to do. It will attempt to take enough of its posted losses so its expenses are paid. Beyond that I have not thought out whether its humans are paid by commission or by month. That is a determination to be made by experimenting.
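A minimal sketch of the spreadsheet step, assuming the posted savings and loan figures are per-period growth numbers bound by the savings^2 - loans^2 = 1 constraint. The function name and the 5% figure are mine, purely for illustration:

```python
import math

def post_savings_rate(loan_growth: float) -> float:
    """Given the loan-balance growth banker bot observes, the constraint
    savings^2 - loans^2 = 1 pins down the savings growth it can post."""
    return math.sqrt(1.0 + loan_growth ** 2)

# If loan balances grow at 5% per period, the savings side is forced:
s = post_savings_rate(0.05)
assert abs(s ** 2 - 0.05 ** 2 - 1.0) < 1e-12
```

Any small out-of-balance flow the bot detects against this constraint is its unit of error, the margin it uses to pay its rent and humans.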
So, David Beckworth, this is helicopter money done right, so I have this ongoing bet. I think you are going to adopt the helicopter as the standard vehicle for currency banking.
So, you see, banker bot is all about finding the Schramm-Loewner index of diffusion, the bandwidth of the member banks, including itself.
Let's do a Beckworth
Banker theorists talking regime change.
Beckworth on NGDP targeting: First, the Fed adopts a NGDP level target. Doing so would better anchor nominal spending and income expectations and therefore minimize the chance of ever entering a liquidity-trap... [I]f the public believes the Fed will do whatever it takes to maintain a stable growth path for NGDP, then they would have no need to panic and hoard liquid assets in the first place when an adverse economic shock hits.
Second, the Fed and Treasury sign an agreement that should a liquidity trap emerge anyhow [say due to central bank incompetence] and knock NGDP off its targeted path, they would then quickly work together to implement a helicopter drop. The Fed would provide the funding and the Treasury Department would provide the logistical support to deliver the funds to households. Once NGDP returned to its targeted path the helicopter drop would end and the Fed would implement policy using normal open market operations. If the public understood this plan, it would further stabilize NGDP expectations and make it unlikely a helicopter drop would ever be needed.
Taking the second part: "they would then quickly work together to implement a helicopter drop." We do, every forty years or so. That is how long it takes to get into a liquidity trap. So this part is done, and now is the time, yet again, for our forty year helicopter drop.
The first part: "anchor nominal spending and income expectations". Inflation expectations are anchored, they are anchored to expect a Reagan style inflation after the Nixon style helicopter drop, followed by a Clinton style disinflation, then a short Hoover deflation until the next helicopter drop.
Beckworth has everything he wants, but it would seem to me we should shorten the cycle to, say, maybe one year and do the helicopter in smaller amounts. What is the point of doing the same monetary regime we have done for 120 years? Let's try a more efficient version of the same thing. Plus, the best method to handle a Reagan inflation is a bit of the helicopter pick-up. You know, drops and pickups both, why let a good tool go to waste?
We now have the tools to do helicopter drops and pick-ups on a day by day basis, using spreadsheets and web bots. Let's do it, anybody up for some fun?
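A minimal sketch of what a daily drop/pick-up rule could look like: compare observed NGDP to its target path each day and transfer (or claw back) a small fraction of the gap, instead of one huge drop every forty years. The feedback gain and the numbers are entirely hypothetical, purely for illustration:

```python
# Toy daily helicopter rule: positive result = drop, negative = pick-up.
# The gain k is an assumed illustration parameter, not an estimate.
def daily_transfer(ngdp_observed: float, ngdp_target: float,
                   k: float = 0.05) -> float:
    """Return the day's transfer as a fraction k of the NGDP gap."""
    gap = ngdp_target - ngdp_observed
    return k * gap

# Example: NGDP running 2% below a hypothetical 18,000 target path.
print(daily_transfer(17_640.0, 18_000.0))  # a small daily drop, ~18.0
# And if NGDP overshoots, the same rule picks money back up:
print(daily_transfer(18_360.0, 18_000.0))  # ~ -18.0, a pick-up
```

The point of the sketch is only that drops and pick-ups are the same feedback rule with opposite signs, which is the "why let a good tool go to waste" argument above.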
Helicopter drops and pickups are fundamental. I think we miss this. Grocery store coupons, frequent flyer miles, Walmart value points, high frequency trading, the Feynman diagram in physics. It is the fundamental action in adapted aggregate statistics, the exchange.
Why are we taking sides?
It's the Sunni vs Shia. Why would we care? And if we care, it seems to me the Shia are the least psychotic of the two. Iran is helping in Iraq. Yes, we know the Persians march around doing the "death to" thing. I wish they wouldn't. Couldn't they do the "a mild sickness to" thing? It sounds more congenial.
AP: WASHINGTON (AP) -- The U.S. Navy aircraft carrier USS Theodore Roosevelt is steaming toward the waters off Yemen to beef up security and join other American ships that are prepared to intercept any Iranian vessels carrying weapons to the Houthi rebels fighting in Yemen. Navy officials said Monday that the Roosevelt was moving through the Arabian Sea. The U.S. Navy has been beefing up its presence in the Gulf of Aden and the southern Arabian Sea amid reports that a convoy of about eight Iranian ships is heading toward Yemen and possibly carrying arms for the Houthis. Navy officials said there are about nine U.S. warships in the region, including cruisers and destroyers carrying teams that can board and search other vessels. The officials spoke on condition of anonymity because they were not authorized to discuss the ship movement on the record. The Houthis are battling government-backed fighters in an effort to take control of the country. The U.S. has been providing logistical and intelligence support to the Saudi Arabia-led coalition launching airstrikes against the Houthis. That air campaign is now in its fourth week.
I never noticed this about quantum conductance
Wiki: Note that the conductance quantum does not mean that the conductance of any system must be an integer multiple of G0. Instead, it describes the conductance of two quantum channels (one channel for spin-up and one channel for spin-down) if the probability for transmitting an electron that enters the channel is unity, i.e. if transport through the channel is ballistic.
That is just the hyperbolic condition, cosh^2 - sinh^2 = 1. The equation ensures that the spread of the outer surface minus the spread of the inner leaves one unit of conductance between them. The G0 will be the cotangent, I think, in this case. Then we have this:
Alpha is the fine structure constant, which we know is the residual noise when the system is perfectly adapted and likely at Lagrange number two. I only get it close with Lagrange one.
But, the spectral energy needed to make exchanges will be twice the noise energy, no? After all, maximum entropy conditions have to be met; we have an adapted probability system. So the ratio G0/alpha is simply the center frequency to power spectra. I may have bandwidth and power spectrum mixed up here, bear with me. But that ratio is just the bandwidth of space, including the coupling constant, so 1/Z0 is simply a measure of spectral capacity. The formula has all the usual time and distance fake units in it, so there is some reduction that needs to be done. But I know where that center frequency is: it is hyperbolic angle 3/2 * ln(Phi). This system is just making pi out of the 1.5 quants supported by free space.
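There is an exact identity behind the G0, alpha, and Z0 ratio being gestured at here: with G0 = 2e^2/h and Z0 the impedance of free space, G0 * Z0 = 4 * alpha in SI units, so 1/Z0 = G0/(4*alpha). A quick numeric sanity check using CODATA constants (this only verifies the identity, it says nothing about the hyperbolic interpretation above):

```python
# Check that the conductance quantum, the impedance of free space,
# and the fine structure constant satisfy G0 * Z0 = 4 * alpha.
from scipy.constants import e, h, c, mu_0, alpha

G0 = 2 * e**2 / h   # conductance quantum, about 7.75e-5 siemens
Z0 = mu_0 * c       # impedance of free space, about 376.73 ohm
print(G0 * Z0 / alpha)  # very close to 4.0
```

The identity follows directly from alpha = e^2 * Z0 / (2h), so the "reduction of fake time and distance units" mentioned above really does collapse to a pure number.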
Sunday, April 19, 2015
Obamacare hits keep on coming
Public Sector Pension perils of Obamacare
CalWatch: Even those who clearly understand the changes, however, have found reason for worry. Not only individuals but large organizations have begun to brace for dramatic changes in the years ahead. At a meeting last month, for instance, CalPERS’ Pension and Health Benefits Committee voiced dismay over the so-called “Cadillac tax” due to hit the kinds of coverage many of its beneficiaries have enjoyed to date. A cost report for staff warned that the excise tax, which takes effect in 2018, will impose “a 40 percent excise tax on the aggregate cost of health benefits that exceed $10,200 for individual coverage and $27,500 for family coverage, indexed to inflation. Contracting agencies that offer health benefits through CalPERS are very concerned about cost impacts of the 40 percent excise tax.” According to the report, “as part of exploring options for containing costs while maintaining high quality, the Board’s Pension & Health Benefits Committee directed staff to research the option of removing the broad networks in areas where narrow networks are widely available.” At the same time, CalPERS recently announced that the Golden State “and its schools will increase their contributions to employee pension funds by 6 percent starting July 1,” according to Reuters. “The California Public Employees’ Retirement System, or CalPERS, said the increases were driven by payroll growth, salary increases and retirees living longer.”
Now, let me say: any UC college professor who has been around for ten years knows perfectly well this volatility was established when No Child Left Behind hit California. All California legislators are perfectly aware that we face ten years of volatility. Kevin Drum has wide experience watching the mix of California and DC public programs; he is perfectly aware. So if any of this collection fails to give a sound analysis of these perils, they are frauds.
Let us list the Obamacare hits.
- Obamacare taxes were incorrectly listed as consumption expenditures on the BEA report, and that caused nationwide confusion.
- Public sector pension system will be under stress for years to come. This may force Chicago into bankruptcy.
- Tax systems will become increasingly fraudulent, volatile and backlogged.
- There is a real issue of how California is going to cover the costs for all the new California citizens who do not qualify as federal citizens.
- Hospital and insurance systems are going through massive consolidation causing supply chain volatility which will last for years.
Why wait for federal citizenship, just come to California
Citizenship in California is hard to avoid; you actually have to get a court order declaring that you are not a California citizen. The law is simple: if you have a dwelling address, then you are a California citizen. Can't avoid it, it's the law.
WA Examiner: While the administration struggles to move forward with its plan to grant amnesty to illegal immigrants, the list of foreigners trying to get into the United States legally has surged to 4.4 million, over 100,000 more than last year, according to the State Department.
Those on the list either have a family member who is a U.S. citizen or green card holder, sponsoring their entry, or an employer wants them.
ECB loans and deposits through 2011
From VOX EU. The point here is that the so-called raising of interest rates by the ECB in 2011 was nothing. By 2011, borrowing from the ECB was essentially zilch; the 0.5-point hike in interest rates was bluster. Look at excess deposits, they did go way up in May 2011. Was that a result of the raise in deposit rates?
Saturday, April 18, 2015
Bonehead speeches crash economy says Greg Ip
WSJ: The U.S. economy has downshifted rather abruptly in the last few months, prompting new discussion within the Federal Reserve about delaying its first interest-rate increase. Yet the growth deceleration should not come as a surprise, because the Fed has already tightened.
True, the Fed’s interest-rate target remains close to zero. But the Fed tightens through its words, not just its actions, and the drumbeat of chatter from the Fed in the last year has made it clear that officials plan to start raising rates sometime this year.
If a bunch of badly educated bureaucrats can crash the American economy, then we are in real trouble.
The sub-prime lending crisis and the Fed, once again
This time it is Selgin and Beckworth. They state the obvious: the Fed is bonkers.
I got into exactly what the Federal Reserve was doing in 2002. In summary, the Federal Reserve's target was on track with the market. There was no real skew in the deviations from target, nothing extraordinary in borrowing from the discount window.
2003 to 2004 was another story altogether, the Fed began buying government bonds, even as the implicit price deflator was rising. Why? Because the Fed needs to keep DC funded, it is as simple as that. The Fed was looking at increasing deficits in 2002, and concluded government, its boss, might be in trouble.
So no regime change in the central banker will work as long as government gets a special place. Economic models against that backdrop are like modelling all the hairs on an elephant to determine its size; you are just bouncing the rubble.
Putin seems a little dense to me
If Vladimir was really going to do a sneak attack, then why does he telegraph his signals for months ahead of time?
Here is the problem that the Russian airforce is trying to fix:
So what is NATO supposed to do? Invade and have sex with a bunch of Russian women? I mean, that is the only explanation I can come up with.
As far as the Ukraine, here is their problem:
Much worse? So how does this work? It seems to me, the Ukrainian desire to join NATO, and the Russian desire to get screwed by NATO have something in common.
I get that Russians are under an existential threat, but that is mainly from living too far north and drinking too much vodka. So Vladimir sends his little airforce running around protesting the miserable state of Russia, as if what? And considering the web, I doubt he can fool the Russian people for very long.
Telegraph: The commander of the US army in Europe has warned that Nato must remain united in the face of a "real threat" from Russia. "It's not an assumption. There is a Russian threat," Lt-Gen Frederick "Ben" Hodges said. "You've got the Russian ambassador threatening that Denmark will be a nuclear target if it participates in any missile defence programme. And when you look at the unsafe way Russian aircraft are flying without transponders in proximity to civilian aircraft, that's not professional conduct." Gen Hodges spoke to the Telegraph on the sidelines of a military debriefing after an exercise to move live Patriot missiles 750 miles across Europe by road and deploy them on the outskirts of Warsaw. The sight of a US military convoy crossing the German-Polish border more than 20 years after the end of the Cold War made international headlines and brought traffic to a standstill as people posed for selfies beside the troops. The intention of such a highly visible deployment was to send a signal, Gen Hodges said.
"That's exactly what it was about, reassuring our allies," he said.
------------
Two-dimensional universe? The universe uses the two-period model.
Quantum entanglement? There is no empty space, so the two-period model changes the amount of overlap between the bubbles of the non-empty vacuum.