Monday, March 30, 2015

Hyperbolic group decomposition

That is where we are going when we incorporate the higher Lagrange numbers into the hyperbolic system.

Why do these Lagrange numbers work the way they do? They take advantage of their own nature: they reorganize sequences into separable groups, each group defined by a power series. The Lagrange number itself just divides a sequence into single matches, then dual matches, and so on. When combined with the set calculus of hyperbolics, we get powerful artificial intelligence.
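For reference, the classical Lagrange numbers invoked throughout this post can be computed directly from the Markov numbers; a minimal sketch (the Markov list here is just the known first few):

```python
import math

# First few Markov numbers (solutions of x^2 + y^2 + z^2 = 3xyz)
markov = [1, 2, 5, 13, 29]

# The corresponding Lagrange number is sqrt(9 - 4/m^2) for Markov number m
lagrange = [math.sqrt(9 - 4 / m**2) for m in markov]

print(lagrange)  # starts sqrt(5) ~ 2.236, sqrt(8) ~ 2.828, then climbs toward 3
```

The sequence climbs toward 3 from below, which is why "moving up to a higher Lagrange number" squeezes the available angle.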

The Lagrange numbers just combine sequential sets. But within the hyperbolic system, they find the minimum redundant sequence, label it, and then the system just jumps up to a higher number and finds the next sequential sets. That is a powerful artificial basis system. This idea of finite group calculus is very powerful; it is the next-generation AI, and we have barely passed the first.

Take speech understanding. What would the Lagrange hyperbolics do? Find the optimum groups of growth and decay processes. Do this recursively, and get group powers of groups. They would actually learn speech.

Isn't this idea, if developed, really finite discrete log? Or actually, decomposition into optimum groups. We move to the higher Lagrange, and we squeeze the angle, but get an independent basis to obtain the next estimate of divergence. This is the missing theory I wandered onto about two years ago. I always thought the Lagrange was the generalization of minimum redundancy encoding. If we are working with an existing group, the goal is to estimate the maximum divergence about the ring. Find its 'pi'; this is the road to big stuff. I think this approach eventually solves this:
In probability theory, a balance equation is an equation that describes the probability flux associated with a Markov chain in and out of states or set of states.
It is an optimum decomposition of the probability flux, I think. It finds the most divergent path about the ring, stepping between Lagrange power series.
See this chart? From the last post. But I added some lines zigzagging across the red and white barrier, just on the left. That path is the path through the discrete points of the probability flux. That path should be jumping across Lagrange quants.

The little thing itself, if done accurately, should be the prime group, a new and novel concept, and very powerful.

So hyperbolics, when used with the Lagrange angles, are doing group intersections, decomposing some group using prime groups. The diagram is the first prime group; it is a basis set of all other groups. So it is perfectly reasonable in particle physics to have a prime group, though we may never actually detect it. But try taking a perfect sphere in a pure simple vacuum. Then cool the vacuum down, way down, until you are freezing gravity. Then slowly compress the sphere; I think you might make some of these, and the vacuum will get sloshy.


This is great stuff, it really is.

The smallest piece of matter possible

I was playing around with my spreadsheet and wondered what the best rational approximation of pi was that would create a bubble trap. 1.5 * ln(phi), naturally. But how many bubbles of space would be trapped? The basic idea is that the bubbles of space play a Wythoff game, and once the most efficient positions were reached, at the white and red barrier, the game was won by the cold positions in the center, and mass formed.

That point, reached by Fibonacci numbers, is F17/F16 (or F16/F15, I am not sure yet), but that is about 1,000 bubbles. Inside the barrier there are twice as many cold positions as hot, and just outside, twice as many hot as cold.
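As a sanity check on the "about 1,000 bubbles" figure, a quick sketch (using the convention F1 = F2 = 1):

```python
def fib(n):
    """n-th Fibonacci number with F1 = F2 = 1."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print(fib(16))            # 987, roughly 1,000 bubbles
print(fib(17) / fib(16))  # ~1.618, already very close to phi
```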

At the barrier, the divergence of moves is optimum, and most moves take place inside the perimeter in circular fashion. Pi is the curvature, computed, and held in position when Lucas swaps the positions in the center. In that white and red ring, the system has locked onto the ratio as the tangent, and the Lucas angle keeps the pairing straight; there are two quants, 1 inside the ring and 2 outside. Feynman plays around in the barrier.

The blue are hots trying to get cold. Outside, the hots try to get in, forcing cold positions out, but the extreme accuracy of the barrier makes it impossible to penetrate.

I know this little guy has to exist; the Lagrange mathematicians say we cannot make bigger things until we trap all the 'bad' phi approximations. Anyway, I call this the next biggest thing above the vacuum bubbles. These have to be the composites that make a lepton.

I know the point is likely F16 because I have run the numbers so many different ways, and I get phi^16 very close to (3/2)^19. After splitting up the possible cold bubbles, I get (3/2)^18 for inside and outside, so I know this is close.
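That comparison is easy to check in a couple of lines (how close counts as "very close" is a judgment call; the gap is under half a percent):

```python
phi = (1 + 5**0.5) / 2

a = phi**16   # ~2207.0 (phi^16 is nearly the Lucas number 2207)
b = 1.5**19   # ~2216.8

print(a, b, abs(a - b) / b)  # relative gap roughly 0.0044
```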

This is more accurate than the fine structure, and I can put a bunch of these things together and make a lepton using the 1+2^(1/2) ratio. The effect is to flatten out tanh'' so the sweet spot is not so sweet. But as the mathematicians say, with the bad phi approximations trapped in a well, I should have no problem adding spectral modes all the way up through quarkland.

Let's derive the money equation from the supply/demand equation

We will solve this for the normalized symmetric case, the case when prices and supply make a symmetrical V at equilibrium.  And we are going to normalize it so gains are always one unit of currency.  Then extend it to a connected distribution network.

First, we let prices decay by some rate as supply grows by the same rate, and we set the quantity levels to normalize.
We have:

(1+1/r) is the growth rate of supply, and (1-1/r) the decay rate for price. Then the normalizing quantity basis is r, so that:

(1+1/r)^2 - (1-1/r)^2  =4/r

I think I have that right. The square ensures equilibrium because it projects the gain out two periods. Hence there will be two transactions available to realize the projections, and with probability one the queue length for all transactions will be less than three. The Nyquist bandwidth requirement is met, and that is the condition of an adapted process.
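The normalization is just a difference of squares, so it holds for any r; a two-line check:

```python
# (1 + 1/r)^2 - (1 - 1/r)^2 = [(1+1/r) + (1-1/r)] * [(1+1/r) - (1-1/r)]
#                           = 2 * (2/r) = 4/r
for r in (2.0, 3.0, 7.5, 100.0):
    lhs = (1 + 1/r)**2 - (1 - 1/r)**2
    assert abs(lhs - 4/r) < 1e-12
print("identity holds")
```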

Now one should always be able to do this if the real quantities and rates are known and independent; there are enough variables to normalize.

But we can see I have constructed the problem as a unitized, symmetric hyperbolic condition.  Hence the differential equation of flow is satisfied.

Let price be (1-1/r)/(1+1/r), the ratio of price to supply in the current quarter. Then that is tanh, and we get inflation right away as tanh'. So then, computing or measuring tanh'', we get the money equation:

Price * Inflation +1/2 Inflation' = 0.
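Taking price as tanh of some hyperbolic angle and inflation as its derivative (as the text sets up), the money equation can be checked numerically; a finite-difference sketch, where the step size h is my choice:

```python
import math

h = 1e-5  # finite-difference step (my choice)

def price(theta):
    return math.tanh(theta)

def inflation(theta):
    """First derivative of price, by central difference."""
    return (price(theta + h) - price(theta - h)) / (2 * h)

def inflation_rate(theta):
    """Derivative of inflation, i.e. tanh''."""
    return (inflation(theta + h) - inflation(theta - h)) / (2 * h)

theta = 0.75
residual = price(theta) * inflation(theta) + 0.5 * inflation_rate(theta)
print(residual)  # ~0: Price * Inflation + 1/2 Inflation' = 0
```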

If we require a connected network where all quantities are conserved, then r will always be of the form 1/Phi^(2n), where n enumerates the node level of the distribution network.

I think I have all that; if not, economists can work it out before they teach it. The symmetric case is the first Lagrange. The asymmetric cases that use higher Lagrange numbers will require less inflation'; the higher Lagrange numbers assume all the Phi estimates have been consumed. I am still working that one out.

This helps explain why Dean Baker's logic on inflation and the inflation rate of change is a bit wrong. His theory is that inflation and the inflation rate of change can be set independently, and we see they are mutually constrained by the price level.

But what this all means is that I have two chances to get to my favorite restaurant for apple pie, and that should be enough to prevent crowding.

Here by the way is the connection between hyperbolics and Lagrange.

x^2 - r^2*y^2 = 4. One can see this form is closely related to the equation above. The variable r is a quadratic surd, but it determines 1/r * tanh'', where 1/r is 1/2 when the symmetric hyperbolic is used. Otherwise much remains the same, but there is certainly a bit of work for mathematicians to formally redo Lagrange theory in generalized hyperbolics. Following the link we see all the connections between hyperbolics and Markov numbers, bringing us to Wiener and Brownian motion, then Schramm-Loewner.

It is the calculus of combinatorics, a Schur reduction of chaos into optimal sets up to a standard error. Raising the Lagrange order moves the spectral peak down to a different angle and decomposes the tanh curve with another basis set. The theory of everything, actually.

Derive the whole mess from discrete angle hyperbolics and be done with it.


Sunday, March 29, 2015

Something strange in the neighborhood!


This chart has Nominal GDP in purple, which tracks Personal Consumption, measured in nominal dollars, I presume. Everything is YoY.

That red line is the price deflator, and it tells us how much of the nominal GDP is real growth. That deflator is headed downhill, and the latest estimates are that it should be hovering around zero. That implies we have a 3% real GDP growth print coming up in a few weeks. And the jobs report is expected to show reasonable job growth.

But the Atlanta GDPNow forecast says no, the real GDP growth is likely to be 1%, if we are lucky. And consumption growth is barely .6% YoY. The other line, blue, is hourly wage changes, and it is headed down, slightly, along with the deflator. We are in a zero inflation environment.

So, something does not jibe between the top two numbers and the bottom two. You can actually see them diverge, and the divergence has gotten worse in the last few quarters. Something strange in the neighborhood!


Saturday, March 28, 2015

The Lepton has to be a composite particle

Let's do some back of the envelope on this.

The ratio of the proton to electron is 1836. We have three quarks, so independently they manage one third each, or 612 each. Now that number is roughly 2^9, but with hyperbolics (and Feynman) they run bidirectional flows, so make that 4^5. OK, this makes sense since 4^phi = 3*pi, so we can see the quarks baking three pies. And the 4^5 can be broken into three groups, each 1/phi^5 times the previous, and the exponent matches the number of spectral modes for the quark; we have a way to make quarks work.
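The 4^phi = 3*pi coincidence is easy to check; it is approximate, good to about three decimal places:

```python
import math

phi = (1 + 5**0.5) / 2

a = 4**phi        # ~9.4223
b = 3 * math.pi   # ~9.4248

print(a, b, abs(a - b) / b)  # relative difference ~2.7e-4
```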

Great, things fit. Now what about the electron itself? It has about 2^50 bubbles of the vacuum inside. (After all, Avogadro is the iron law of spheres.) So those bubbles have to be managed as grouped particles, as there is no way to fit all the bubbles onto one tanh curve and still make the connections. Ergo, the lepton is a composite particle, case closed.
 
What kind of particles? Well, make them with the second Lagrange number. They don't bake the good pi, so they are a bit redundant in their Feynman transitions; they spin to make up for it. So you get something like 4^5 bubbles per small thing, and they group by fives, each crammed badly onto tanh and using the redundant Lagrange. That gets to 2^50 bubbles of vacuum and we are home free. Or we have the second and third Lagrange in there, and the electron is always spinning and on the move, but it holds integrity. The quarks can work with that, and refine the accuracy of pi.

ObamaCare whoops

I took the contributions to GDP from consumer spending from the latest Q4 2014 BEA update. The blue line is the contribution to GDP growth from health care spending, the red is the rate at which that changes, and the yellow is the rate at which the rate changes. These are like differentials.

Note: These are year-over-year numbers, and my results are all YoY.


Here are the numbers from the BEA:
0.37, 0.24, 0.32, 0.20, 0.22, -0.03, 0.51, 0.70, 0.13, 0.40,
0.13, 0.04, 0.40, 0.30, 0.48, -0.16, 0.45, 0.52, 0.88

These are the health service spending growth contributions from the personal consumption graph, Line 17 in the second chart. They go back to Q1 2011.

The number that stands out is the latest, Q4 2014, where health care spending comprises 0.88 of the GDP growth, which is 2.2. So 40% of GDP growth last quarter came from increased health care spending. Now this quarter the GDP growth is best guessed at 1.3%, and if the increase in health spending holds up, then Obamacare will drive 67% of this quarter's growth.
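Running the arithmetic on those BEA contributions (the series and growth figures are as quoted above):

```python
# Health care contributions to GDP growth, from the BEA series quoted above,
# ending with Q4 2014
contrib = [0.37, 0.24, 0.32, 0.20, 0.22, -0.03, 0.51, 0.70, 0.13, 0.40,
           0.13, 0.04, 0.40, 0.30, 0.48, -0.16, 0.45, 0.52, 0.88]

q4_share = contrib[-1] / 2.2   # share of Q4 2014 GDP growth of 2.2
next_q   = contrib[-1] / 1.3   # if the 0.88 holds against a 1.3 growth guess

print(round(q4_share * 100))   # 40 (percent)
print(round(next_q * 100))     # 68 (percent), roughly the 67% in the text
```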

Can the economy devote 2/3 of its growth rate to Obamacare?

No, the blue line tells us that health spending should typically be about 25% of the GDP growth rate. That red line, the first difference, tells us how fast the economy can adapt to changes in health expenses, and that number hovers around zero. That red line, the adaptation rate, needs to revert to zero mean in about three quarters. The agents in the economy cannot adjust their balances any faster.

So, multiply the blue and the red line, and the result should be trending closer to zero, reverting to zero much faster. It is not. The yellow line is a proxy for that second differential, and it is widely out of the region of sustainability.

Let's try hyperbolic discounting, shall we?
Since we work with ratios, the total amounts are unitized to 1, year over year.

Now we expect one sector of the economy to contribute 2/3 of growth, so in the short term the total economy shrinks by 1-2/3 to support a growth of 1+2/3. That ratio comes to:
(1-2/3)/(1+2/3) = .2, and that would be the loans-to-deposit ratio needed to support Obamacare growth. What is the loan-to-deposit ratio at the Fed? 1.0, and that number has been holding for ten years, folks.

How fast will loans to deposits change? That number, the first differential, is 1-ratio^2. So for the effect of Obamacare, we expect loans to deposits to change by 0.96; in other words, the differences need to be almost the entire ratio. But at the Fed that differential is 1-1.0^2, or zero. Loans to deposits at the Fed are not changing to accommodate Obamacare. These differentials are all calculated with respect to the Obamacare variable. Hyperbolic discounting tells us something is about to break because of Obamacare.
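The hyperbolic-discounting arithmetic above, spelled out with the 2/3 growth share as the input:

```python
share = 2 / 3                       # Obamacare's expected share of growth

ratio = (1 - share) / (1 + share)   # implied loans-to-deposit ratio, ~0.2
first_diff = 1 - ratio**2           # the '1 - ratio^2' first differential, ~0.96

fed_ratio = 1.0                     # loans-to-deposit ratio at the Fed
fed_diff = 1 - fed_ratio**2         # zero: the Fed is not adjusting

print(ratio, first_diff, fed_diff)
```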

If Obamacare spending does not settle down, the economy will shrink.


Alaskans are moving to Anchorage

Vox gives us a nice map of where we moved to and from. That blue area in Alaska is Anchorage; Alaska is concentrating in that spot. Not a bad idea as the oil runs dry.

Otherwise we are moving west and south. Florida, the retirement state, is growing. California is breaking even, and Texas is gaining the most. The entire Northeast continues its slow decline. The dark blue, north center, are the frackers.

Friday, March 27, 2015

Talking about the hyperbolic conservation constraints

This curve is:
tanh * tanh' + 1/2 tanh'' = 0
This is the Hamiltonian for the TOE.

The curve is tanh'', composed of tanh * tanh'.  It is the environment that is imposed on the system when adapted.

This curve goes down to the uncertainty of the system and is in balance with the environment. That peak is where all the transcendentals are most accurate; their rational approximations get settled at 1.5 on this chart. But 1.5 is not a solution, mainly because the system needs to play the Wythoff game at that point where the combinations of events are maximum. Tanh is the amount of 'charge' the system supports for any integer point along the curve. tanh' is equivalent to momentum.
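The constraint is in fact an exact identity: with tanh' = 1 - tanh^2 and tanh'' = -2*tanh*(1 - tanh^2), the two terms cancel. A closed-form check:

```python
import math

for theta in (-2.0, -0.5, 0.1, 1.5, 3.0):
    t  = math.tanh(theta)
    t1 = 1 - t**2        # tanh'
    t2 = -2 * t * t1     # tanh''
    # tanh * tanh' + 1/2 tanh'' = t*t1 - t*t1 = 0
    assert abs(t * t1 + 0.5 * t2) < 1e-15
print("constraint satisfied at every angle")
```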

To the left of that peak, the bubbles are crowded and want to migrate to tanh = 0; to the right, tanh goes to one and there is no gain from an exchange, since sinh and cosh are about the same. The effective quant of the electron is 1.5, mainly because the quarks bounce it between 1 and 2.

Where is that peak relative to the nucleus? I am not sure, actually. But I am pretty sure angles count from the center of compaction, starting at zero. But I have gotten mixed up on that many times, beware.

I think the environment to the right of the peak is actually decompressed relative to the nominal vacuum; it is the positive charge region from the proton. To the left of the peak, the tanh function is very linear with the angle and tanh' is maximum. So the gluons, as you see, have a lot of room between zero and one to set the environment, always near the vacuum uncertainty. This collection is an Avogadro of bubbles, and the electron is about 10^15 bubbles, so there are a lot of Wythoff moves going on, but they all obey Wick's theorem.

Positrons can slip over the peak going from left to right. But they are quant one stuck in a quant two environment; they need to decompress, so their tanh' flips sign and they appear positive. They do not meet the constraint equation above.


A note on anti-particles

In a compressed environment, the Wythoff players want to get to a cold position. In a decompressed environment they want a hot position. Anti-particles exist when the environment has been decompressed below ground by a magnetic field; it traps electrons, I suppose, and that decompresses the vacuum. A neutral vacuum has as many hot positions as cold. The nominal vacuum we have has more hot positions than cold.

Moving to a cold position is an increase in the elasticity of interactions, and vice versa. I think I have that. More elasticity means more overlap, meaning tanh is closer to zero.

A quick note on running time backwards

The difference between the Feynman diagrams and the hyperbolic approach is that the hyperbolic approach assumes the system is already adapted, or encoded. It is adiabatic. So actions are always adiabatic. This has always been my approach. So a sequence of two actions, the second evolving from the first, will produce a unique carrier between the two vertices in the Feynman diagram, and that carrier is already prepped to happen simultaneously and independently of the two sequences on either side of the Feynman connection. So no need to introduce time at all; all sequences already occur in the proper order.

The positron, for example, is only observable when the physicist breaks the adiabatic condition.  I don't think anti-particles can survive. I will get into that sometime, I have not worked the whole thing out.

How accurate is the finite approximation to log(Phi)?

The Taylor series expansion of ln(Phi) is an expansion in 1/Phi^2, from the fact that Phi and Phi-1 are inverses. So I am interested in the order of the expansion when ln(phi) is within the fine structure.

Here is the graph:

And we see log(Phi) is within the fine structure when the order is 16. I take the expansion up to some order, and correct by the fine structure. Then I see how close we are. The numbers are here:

Order   Final error
  15     0.0005046831
  16     1.214354e-05
  17    -0.0004219024

So, nothing special here except the optimum choice is order 16. For any other value of the fine structure, there would be some optimum expansion order, but what seems interesting is that the expansion order comes to 16. 16 is a canonical set of binary values, or a set of values in which the Shannon encoding is fully non-redundant. This is to be expected in adapted systems, and one might expect this number to be 4, 8, 16, 32, or 64 (I think), depending upon how accurate the quasars are in pressing the vacuum into electrons. So the fine structure is really an output from the Theory of Everything.
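The exact expansion scheme behind that table is not spelled out, so here is only a sketch of the kind of experiment described, using the identity ln(phi) = atanh(1/sqrt(5)); each added order shrinks the truncation error by roughly a factor of 5, so some finite order lands nearest any given target:

```python
import math

phi = (1 + 5**0.5) / 2
x = 1 / math.sqrt(5)

def partial(orders):
    """Truncated series atanh(x) = sum x^(2k+1)/(2k+1), which sums to ln(phi)."""
    return sum(x**(2*k + 1) / (2*k + 1) for k in range(orders))

exact = math.log(phi)
for n in (5, 10, 16):
    print(n, partial(n) - exact)  # error shrinks ~5x per added term
```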

The principles:

This is all about aggregates confined in a compression. The solution is always some elasticity in collisions such that collisions are minimally redundant, connected, and equally precise across the dimensions of actions. The three principles boil down to the aggregate finding rational approximations to phi, e, and pi. It is Newton done backwards: the chaos needs to find the best approximation to Newton's rules of grammar. We end up with one or more independent additive sequences in Phi that superimpose. Near tanh(0) there is a linear region where the center is adjusting energy so the aggregate is adiabatic to the compression force exterior. So we get an uncertainty, the fine structure, that separates the spectral modes and matches the uncertainty of the exterior environment.

My observations on Feynman Diagrams, so far

Here we go.  I decode this using the theory of everything.

First, why do they have Pi^4? Well, that means they are integrating all the little bubbles, all of whom seem to be operating at hyperbolic angle 3*ln(phi), where the sinh and cosh should have pi^2 in them. So the hyperbolic condition cosh^2 - sinh^2 = 1 will have Pi^4.

Why the imaginary number? Because they use Newton's calculus, which says Pi is always known, but the sin and cos Taylor series do not converge in sync, so Newton's grammar needs to include things that have not yet happened.

So this diagram is integrating over all the 10^15 little three-bubble exchanges of the vacuum that make up the electron. Under the TOE, we know the mix of events happening at 3*ln(phi), because an adapted process is a connected system, and Nyquist tells us events are maximally divergent and e, Euler's number, is well approximated as precision is distributed over the notches on the X axis.

Hence, Pa to Pb must be the projection out two transactions such that a single transaction samples twice. Same with Pc and Pd. So we get:
cosh(3*ln(phi))^2 - sinh(3*ln(phi))^2 = 1
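One pleasant fact about that particular angle (my observation, consistent with the identity above): at theta = 3*ln(phi), sinh is exactly 2 and cosh is exactly sqrt(5), because phi^3 - phi^-3 = 4 and phi^3 + phi^-3 = 2*sqrt(5):

```python
import math

phi = (1 + 5**0.5) / 2
theta = 3 * math.log(phi)

s, c = math.sinh(theta), math.cosh(theta)
print(s, c)  # 2.0 and sqrt(5), up to rounding
assert abs(c**2 - s**2 - 1) < 1e-12  # cosh^2 - sinh^2 = 1
```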

By using the TOE we eliminate that pesky time thing. Now what about those bubbles outside the electron? Why quant three? That is the first perturbation, I guess, and all these perturbations are happening as they pass where tanh and tanh' are both equal. At that point the little bubbles can try to hop the hump. Combinations are maximum, and that is where Wythoff winners and losers are determined.

Thursday, March 26, 2015

How do all those Wythoff players fit inside the electron

I got curious, and went back to my spectral chart where phi^91 = (3/2)^108. This is the ratio of cold positions to hot positions in the proton (with a 1/3 adjustment, likely).

That is a lot of Wythoff players! Each player trying to get to the ground state.

Taking 1/1836 of the mass of the proton and getting close to (3/2)^89 = phi^75, I get the electron spectral peak. That means there are some 5e15 tiny chunks of the vacuum sitting at the cold position in the game.
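Those exponent pairs do line up surprisingly well; checking in log space (the arithmetic is mine, the physics reading is the post's):

```python
import math

phi = (1 + 5**0.5) / 2
lphi, l32 = math.log(phi), math.log(1.5)

# proton: phi^91 vs (3/2)^108 ; electron: phi^75 vs (3/2)^89
print(91 * lphi, 108 * l32)   # both ~43.79
print(75 * lphi, 89 * l32)    # both ~36.09
print(math.exp(75 * lphi))    # ~4.7e15 chunks, the '5e15' in the text
```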

But I guess it works; they all have the same rotation inside the little ball and thus the games stay separated. I suppose adding a little energy to the electron causes more dimensions of angular momentum. Hence, the number of counters taken from any hot position to reach a cold one must be larger. This must be the higher order perturbations in the Feynman diagram.

I am still going through the Feynman description. But it seems to imply matching of moves so they all remain separated.

The land where evolution takes place

Hillary and Obama were, of course, completely delusional about the middle east.

Politico has a complete description of the free fall in the Middle East. This has been going on for 50,000 years, way before the Persians. This is the spot where humanoids left Africa and evolved into humans who can live up north and enjoy Swedish welfare.
Politico: Barack Obama faces a slew of Middle East crises that some call the worst in a generation, as new chaos from Yemen to Iraq — along with deteriorating U.S.-Israeli relations — is confounding the president’s efforts to stabilize the region and strike a nuclear deal with Iran.
The meltdown has Obama officials defending their management of a region that some call impossible to control, even as critics say U.S. policies there are partly to blame for the spreading anarchy.
“If there’s one lesson this administration has learned, from President Obama’s 2009 Cairo speech through the Arab Spring, it’s that when it comes to this region, nothing happens in a linear way — and precious little is actually about us, which is a hard reality to accept,” said a senior State Department official.

Not everyone is so forgiving. “We’re in a goddamn free fall here,” said James Jeffrey, who served as Obama’s ambassador to Iraq and was a top national security aide in the George W. Bush White House.
For years, members of the Obama team has grappled with the chaotic aftermath of the Arab Spring. But of late they have been repeatedly caught off-guard, raising new questions about America’s ability to manage the dangerous region.
Obama officials were surprised earlier this month, for instance, when the Iraqi government joined with Iranian-backed militias to mount a sudden offensive aimed at freeing the city of Tikrit from the Islamic State in Iraq and the Levant. Nor did they foresee the swift rise of the Iranian-backed rebels who toppled Yemen’s U.S.-friendly government and disrupted a crucial U.S. counterterrorism mission against Al Qaeda there.
Both situations took dramatic new turns this week. The U.S. announced its support for a Saudi-led coalition of 10 Sunni Arab nations that began bombing the Houthis, while Egypt threatened to send ground troops — a move that could initiate the worst intra-Arab war in decades.

Read more:

Tuesday, March 24, 2015

Feynman diagrams are Wythoff game moves

They look like this. The process on the left samples the blue line, as does the process on the right. Two fermion samples per boson in the middle: the Nyquist rate. In this game the players are trying to reach the hottest position by adding chips to their pile. That is a reverse Wythoff goal.

There should be another dual of the Feynman diagram in which two bosons sample the fermion in the middle. That would be the normal Wythoff game. When mass fermions sample bosons, the ratio of nulls (lowest state vacuum element, cosh(0)) to bosons is 2 to 1, still Nyquist. When the bosons sample the nulls, the ratio is still 2 to 1. In one case the effective sample rate is 2/3 and in the other 3/2.

The hyperbolics do this: cosh(theta)^2 - sinh(theta)^2 = 1. Thus, the one sampled element remains. Reversing the process becomes cosh(-theta)^2 - sinh(-theta)^2 = 1. And there are some 2^50 games being played in the hydrogen ion. Each game has some 2^16 moves available at any given time.

The whole key to understanding the adapted Wiener process seems to always be the Nyquist/Shannon sampling theorem. Whenever I go back to that, issues clear up. The Nyquist rate and Avogadro's number are related, the first setting the second.



Monday, March 23, 2015

Kevin Drum lies with numbers

He uses this chart to compare Texas with its neighbors, only to find the second largest economy in the nation has the mean unemployment with respect to its smaller neighbors. Well, waddya know!



I have a better chart. Since Kevin's post title is the California miracle, let's compare the second largest state with the largest state, shall we?
There we have it, unemployment between California and Texas, a very sound comparison. Note that California actually started the recession, and carried it farther, as a result of California's fraudulent public sector pension system. California was, in fact, the overall worst performer throughout the post-recession cycle. Texas mainly carried the economy for most of the past five years.



Election Fraud Season is upon us

It is election time and we will get mostly foul statistics and outright fraud from the Kevin Drums, Brad DeLongs, Dean Bakers, Art Laffers, and all the Krugmans. I am sorry for the voters, for the enormous fraud that must be sorted through. We seriously need a method to keep California as far away from the Federals in DC as possible, including secession. The California economy is likely to collapse from the enormous cost of electioneering fraud.

Election fraud and lying are upon us, from both parties. We are beyond help, and disunion seems to be the best way out of the horrible mess.

New research from the Fed

Mortgaging the Future?
In the six decades following World War II, bank lending measured as a ratio to GDP has quadrupled in advanced economies. To a great extent, this unprecedented expansion of credit was driven by a dramatic growth in mortgage loans. Lending backed by real estate has allowed households to leverage up and has changed the traditional business of banking in fundamental ways. This “Great Mortgaging” has had a profound influence on the dynamics of business cycles.

Ratio of mortgage debt to value of U.S. housing stock


The ratio of debt to all of housing.
The ratio of debt to equity then is .6/(1-.6) = 1.5, we owe 1.5 times what we own in housing.

The ratio should be 2/3, or .3 on the chart. TOE tells us the number should be 1/Phi, I think; that is where Newton's calculus works best. But we can see the ratio was about right all the way through the 70s; we could have handled that. The real problems started 15 years after the Nixon Shock, the point where the losses produced by the Fed had worked through the economy, the point where disinflation, then deflation, set in. And, naturally, that coincided with the great debt surge by DC.



Then they conclude:
 The vast expansion of bank lending after World War II is one of the most extraordinary developments in the history of modern finance and macroeconomics. Our research suggests that the explosion of credit has played a more important role in shaping the business cycle than has been appreciated up to now.
The business cycles since 1980 coincide with the recessions, which have an 85% probability of coinciding with presidential regime change. We know three things about DC and housing.

  • DC always engages in tax and spending fraud such that the bills come due at the end of the term.
  • DC (and many states) make mortgage interest deductible.
  • DC has agencies to encourage growth in the mortgage industry.
This is not a coincidence; these are deliberate destructive policies by our elected politicians.



What was that slam in the final minutes of the market?

Look at the end of this chart from the market today.  What the frig was the sudden sell off?

Changing the rules of the Wythoff game

Wythoff's game is a two-player mathematical game of strategy, played with two piles of counters. Players take turns removing counters from one or both piles; in the latter case, the numbers of counters removed from each pile must be equal. The game ends when one person removes the last counter or counters, thus winning.
This is the current rule set from Wiki. We have to change the goals a bit. Let's set the goals in terms of loans to deposits, like banker bot. The two players change the loan-to-deposit ratio such that the ratio moves closer to 1/Phi. They exchange money between deposits and lending. A player wins when the other player has no exchanges which move towards the ratio, while the winning player always has at least one. The winning positions become two, (1,2) and (3,5). But these are also the best positions for the standard Wythoff game.
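For reference, the standard cold (losing) positions of Wythoff's game are the Beatty pairs (floor(n*phi), floor(n*phi^2)), which is where the (1,2) and (3,5) above come from:

```python
import math

phi = (1 + 5**0.5) / 2

# n-th cold position of standard Wythoff: (floor(n*phi), floor(n*phi^2))
cold = [(math.floor(n * phi), math.floor(n * phi * phi)) for n in range(1, 6)]
print(cold)  # [(1, 2), (3, 5), (4, 7), (6, 10), (8, 13)]
```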

Sunday, March 22, 2015

Smart Cash Card and BankerBots.

A Cal driver's license, really. But we can easily put the smart chip in this thing. That face, easily captured in a hundred features. Personal smart-card ID verification, nearly perfect.

Having the visual face on the terminal equipment is cheap and optional. Add the magnetic stripe; the near field is optional.

Consumers would love a completely operational multi-currency, counterfeit-proof digital cash card. Every merchant will support it. Block chain cash, immediate clearance cash, personal verification cash; available as currency slots for hundreds of merchant coupon supplementals. Make a variety, low scale and upscale. Any store, the local grocer at the corner, can make a cash supplement for your regulars; easy to do, nearly free. Invent your own money and be a central banker.

Who else can make their own discount coupons?
Local charity clubs. Offer deposit and loan rates, charity money. All of it spendable as charity donations. Health clubs, make it part of membership fees. Offer rates, spendable as discount entry into the club. Families, use it for children's allowance, exchanged to be no-arbitrage with dollars. Farm sale coops, insurance companies, endless. Any connected network will want a dollar supplement. This is one of the technical explosion productivity things.



I would buy one of these for a higher price than any digital watch. I would pay up to $100.


We have a shadow banking committee!

Economic Principles talks about it. Let's try some corrections, shall we!

To repair its damaged credibility and place its policies back on solid analytic foundations, the FOMC should place greater emphasis on its continued commitment to the two percent long-run inflation target first established in January 2012. Next, the FOMC must embody its objectives in an explicit and empirically defensible monetary policy rule.
Now what they really mean is a 2% variance in prices. They want the second difference, not the first.

The monetary policy rule should also be somewhat countercyclical, recognizing that, within limits, monetary policy can be used to stabilize output and employment over the medium term, even as it focuses principally on stabilizing prices in the long run. The policy rule will then ensure that the Fed remains accountable in achieving both sides of its statutory dual mandate.
At equilibrium there are no cycles, money is no arbitrage, meaning it is a Wiener process. Ultimately the cycles are the result of fiscal budget cycles, and the 2% first difference was set by the necessity of accommodating fiscal cycles; it comes out of the way John Taylor had to set his stability parameters to get his stationary coefficients when DC does its cycle.

Any workable rule must limit the set of variables to which policy responds. This imposes discipline on policymakers, avoiding the temptation of excessive fine-tuning and ensuring that the Fed remains insulated from fiscal pressures. It also helps the Fed remain forward-looking, as it must be to account for the long and variable lags with which its policies affect the economy.

Long and variable lags are the responsibility of the banking network. They are best handled by having a competent system allowing entry and exit of member banks. The Fed is, at equilibrium, a Black-Scholes spreadsheet, and should be operating at the short end of the curve. I have no idea what excessive fine tuning means, but the tuning rule is simple: when hyperbolic flow constraints are not met, the spreadsheet sterilizes losses or gains and resets deposit and loan rates, projected over the next two periods.

Within such a rules-based system, several very specific points of guidance for monetary policymakers become clear. First, historical experience tells us that whenever interest rates are too low for too long, financial markets become distorted and inflation begins to rise.
Hyperbolic discounting at equilibrium along with optimum selection of member banks handles all this.

Summary:
  • Currency policy is a no arbitrage spreadsheet
  • Human skills involve recruiting great bankers
  • Government is acting like a foul member bank.

Saturday, March 21, 2015

Austerity debating with Simon Wren Lewis

The debate on austerity continues. Simon thinks that more government spending helps real growth, let's check out the facts, shall we?

Just to make matters short, let's ask Simon if he ever actually looked at what the Federal government spends money on. Here are changes in government spending, changes in real growth and changes in interest expense, all quarterly. Throughout this whole five-year period, interest expenses occupied about 11% of the US budget; it is the green line. It has a variance of about 10%.

That red line is total federal spending; notice that total spending varies about 1%. All of that variance in spending is due to volatility in interest expenses, all of it.

If these Keynesians had a clue they would notice that the differential rate in government spending available for discretionary spending is all taken by interest payments. Do these Keynesians think that stimulating the bond industry is the central idea? Simon Wren Lewis is clueless because he cannot show a damn thing about government spending over the period as long as interest expenses consume all the variance. The null hypothesis has not been rejected; in fact the interest rate volatility hypothesis entirely consumes his data.

This, noticing variance and finding its cause, is what Simon missed out on in statistics class. All things being equal, and they are not really, any large change in government spending would cause a large change in interest expenses. This effect is mainly due to the rollover effect: a rise in borrowing for current spending causes a rise in spending on the 2 trillion in debt we roll over each year.

Now the counter argument is QE. But QE has just jammed the Fed into a corner from which it will not escape (its tanh angle is way too high and loans to savings leave no room for flexibility, hence the deflation). Go see banker bot, it knows. Folks like Simon are the reason we have to discover the theory of everything; we had to undo generations of badly taught mathematics, and now we know.

Why is it that I, a simple mathematician, can spot these correlations in about three minutes, yet every single Keynesian, except maybe Roger Farmer, seems absolutely clueless? Did they all learn from Krugman and the Basket Weavers from MIT? What skill do I have? Answer: the theory of everything. I get flow-constrained, connected finite networks. So, economists, get with the program; the new math is appearing all over your campuses.

The elections are going to be a nightmare of fraud

Is there some way the SouthWest can keep the SwampRats east of the Rockies? Sort of have California and Texas agree to split the vote beforehand, then keep the Liars out of the region altogether.

Getting it wrong on Japan

QE did not cause inflation in Japan. The peak in the implicit price deflator occurred as a result of the tax hike on consumption. Shortly thereafter we see the deflator bounce around the zero point again.

There is a slight rise in the deflator since the crash, where it nosedived. But Japan is going back to zero inflation.

Understanding the Higgs and vacuum expectation process

sinh(1.5 * ln(Phi)) = pi/4 with an error of .0007, well within the fine structure.

tanh(1.5 * ln(phi)) = 1/phi = e^(-ln(phi))

tanh'/tanh = 1 at that point.

So it's all right here, the connection between e, pi and phi. This is it: the constants of a finite set of things in a crowded environment. And it meets the conservation of quants, energy and mass. The set of angles are the Schramm-Loewner indices. It is Shannon band limited. It is the theory of everything, right in front of us.
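Those claims are easy to check numerically. The 1/phi relation is in fact exact, since tanh(1.5*ln(phi)) = (phi^3 - 1)/(phi^3 + 1) = 1/phi, while the pi/4 match is approximate with the quoted small error. A minimal sketch:

```python
import math

phi = (1 + math.sqrt(5)) / 2
x = 1.5 * math.log(phi)   # the Lucas angle used throughout these posts

# sinh at the angle is close to pi/4 (small error, roughly the .0007 quoted)
print(abs(math.sinh(x) - math.pi / 4))   # ~7.5e-4

# tanh at the angle equals 1/phi exactly: tanh(x) = (phi^3-1)/(phi^3+1) = 1/phi
print(abs(math.tanh(x) - 1 / phi))       # ~1e-16

# and tanh'/tanh = (1 - tanh^2)/tanh = 1, since 1/phi + 1/phi^2 = 1
t = math.tanh(x)
print((1 - t * t) / t)                   # ≈ 1.0
```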

How did it get the half angles?
It jumped to the third row of Wythoff.  The half angle seems to be spin. This would happen if the quarks begin to ping pong between the two positive charged pairs. They would be a half angle out of phase from each other.
I am not sure yet of the sequence, but once the system has sinh(1.5 * ln(Phi)), then it has Phi and has the Lucas numbers. It combines over two quants meeting the Shannon requirement, a two period look ahead as banker bot would say.
Our proton cannot multiply, so it must have generated this sequence:

tanh and tanh^3, and combining them while the Lucas polynomial is within tanh'', which is kinetic.  I am staring at my spreadsheet, and it is clear that the constraint is met at the half angle when the sequences add across the full angle.  That is one unit of charge and one half angle of spin meeting the flow conditions.  I will take a bingo on that one, thank you, just e mail me my Swedish banana.


That is, the Lucas polynomial, adding variance about the angle, makes the Wiener motion. Hence there must be a sequence of the two powers of tanh generated somewhere in the Wythoff, and I have to hunt it down. Making that tanh sequence is the Higgs effect, and finding 1.5*ln(Phi) must be the vacuum expectation phase.

I am very close here. In logs, the flow constraint is:
 log(sinh) - log(cosh) + log(cosh^2), and these are accumulations of the tanh or coth values up the chain. The tanh'' is a bunch of things that happened and things not yet happened on the unit circle, so it has the potential and kinetic energy within the Lucas polynomials as it varies the circular angle. But that set of combinations, some not yet happening, has to be log.

Friday, March 20, 2015

Why Phi?

Its this:
t * t' = -1/2 * t''

The 1/2, that means the aggregate is sampled exactly at bandwidth and motion matches mass. At that point, e is optimally matched. It gets back to this:

c^2 - s^2 = 1.  When every point along the measuring stick has exactly one unit of imprecision at Shannon rate, then the system has the proper size measuring stick.  cosh are the incoming combinations of things at x and sinh the outgoing combinations of things. Just like banker bot says.

Another way to say it is, for each of the two elements exchanging, at the Shannon rate, there must be a null exchange. But, remember, the two that exchange will overlap, hence the Phi and not the 3/2.

 So, in the end, the entire universe is built around  causality, bandwidths have to match exchange rates.

Is there an easier Black-Scholes?

Sure. The formula uses Pi, so it is Nyquist and Lucas applies.
The formula assumes the safe rate is fixed, so we know we are working around a single Lucas angle. e is known, so we know the Lucas angle is 1.5 * ln(Phi).

Hence tanh is the strike price, tanh' is the gain, and 1/2 * tanh'' is the derivative cost, normalized to a unit variance of 1.0. So then one needs to scale.
I think that is right, but check me.
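A minimal sketch of that reading, just evaluating the three tanh terms at the fixed Lucas angle. The financial labels (strike, gain, derivative cost) are this post's interpretation, not standard Black-Scholes:

```python
import math

phi = (1 + math.sqrt(5)) / 2
x = 1.5 * math.log(phi)   # fixed "safe rate" Lucas angle from the post

t = math.tanh(x)          # the post's stand-in for the strike price
t1 = 1 - t * t            # tanh' : the gain
t2 = -2 * t * t1          # tanh''; half of this is the derivative cost
print(t, t1, 0.5 * t2)    # ≈ 0.618, 0.618, -0.382
# note t*t1 + 0.5*t2 == 0, the flow constraint used elsewhere in these posts
```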

The theory of everything simplifies life a whole bunch.

OK, so what is a circle? What are imaginary numbers?

What is this:

pi*r^2 = Area, and volume is derived.

Start with a finite set of things crowded together within an environment. They have a center and must be connected. The approximation to Pi is a result of that condition. The model of the circle we humans created is a grammar, pencil and paper, and it means: what happens if the number of crowded things goes to infinity in a small space. But be careful, nature takes the square because of causality and finite systems. Hence, the Shannon-Nyquist rate makes causality work and generates optimum divergence of neighboring combinations, and that makes pi. In a non-uniform environment, the sample rate would vary, or the system would superimpose two segmented divergence processes.

a + ib, an imaginary number. It means a recombinations happened and b recombinations did not. The total number of actions is partitioned into those that have happened and those that have not happened. a^2 + b^2 is a number of things optimally packed to maintain connectivity. One is a packing of actions that did not happen, the other a packing of things that did happen. The unit circle, which counts both actions that have happened and actions that have not, assumes the unactions and the actions were optimally packed together.

What is -i?

That tells us that if exp(-i) is to be the inverse of exp(i), then those actions that have not happened, i, will always have not happened. It's about happenings (recombinations); they are [future, past] and [never, always]. Kind of like a Feynman diagram. I have been looking into these lately because they resemble an exchange between two Lucas angles.

Imaginary numbers and Newton calculus.

As I often point out, Newton's calculus made the premise that all power series converge uniformly. That implied a very simple grammar, if you used the transcendentals for symbolic multiply. But that assumption thus needed to include actions that have not happened, so the grammar balances.

Lucas polynomials:

The Lucas polynomials, for example, at x=1: all the necessary combinations have been made to generate the Lucas numbers. But at x < 1, the polynomials generate the actions taken and the actions not yet taken. Since the hyperbolic process is a continual approximation process we would expect some wandering about the solution set, governed by the Lucas polynomials. So take the polynomials for some order n, compute the derivative as a combination of Pn and Pn-1, perform the cosh^2 - sinh^2 = 1, and find the solution set: a path of combinations of actions taken and actions yet to happen.
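For reference, the Lucas polynomials follow the recurrence L0 = 2, L1 = x, Ln = x*Ln-1 + Ln-2. A small sketch that builds them and confirms the Lucas numbers at x = 1:

```python
def lucas_polys(n):
    """Lucas polynomials L0=2, L1=x, Lk = x*L(k-1) + L(k-2), each as a
    coefficient list [c0, c1, ...] meaning c0 + c1*x + c2*x^2 + ..."""
    polys = [[2], [0, 1]]
    for _ in range(2, n + 1):
        a, b = polys[-1], polys[-2]
        shifted = [0] + a      # multiply the previous polynomial by x
        nxt = [s + (b[i] if i < len(b) else 0) for i, s in enumerate(shifted)]
        polys.append(nxt)
    return polys

def eval_poly(p, x):
    return sum(c * x ** i for i, c in enumerate(p))

# At x = 1 the polynomials give the Lucas numbers 2, 1, 3, 4, 7, 11, ...
print([eval_poly(p, 1) for p in lucas_polys(5)])  # [2, 1, 3, 4, 7, 11]
```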

Folks, we are doing math differently and these are exciting moments.

Wiener processes

I see this process as a very large number of elements crowded together, relative to the external environment. The elements have the capacity to combine and reduce crowding.
I have t as the number of combinations that have taken place.
I have x as the notches on a measuring stick.

The job, then, is to approximate pi and e such that every notch on the measuring stick is equally precise in measuring the x number of combinations. The elements have, evidently, the ability to partially overlap when they combine.
In this formulation, the number of elements is finite, since the approximations to pi and e stabilize, and the approximation error determines the largest x on the measuring stick. At that point, the environment and the largest x have the same imprecision, but we can let that precision go to zero, yet never get to zero.

This is what I am playing with at the moment. But I can tell you that this formulation leads to the hyperbolics with solutions at the Lucas angles, to some combination of the Wythoff array if precision increases. That is because the rate of recombinations relative to the number of combinations (tanh), at some x along the measuring stick, will equal the minimum precision possible, and that will determine the approximations to e and pi. The result is true because I set the precision at a finite value. That precision will be 1/2 * tanh'', I think. And when that value is normalized to one, then I always get the same solution, and that fixes the number of elements possible. The Lucas angles themselves are finite approximations based on Phi. And that is why we have quarks: the vacuum is precise enough that it needs to get close to the angle 1.5 * ln(Phi). Hence the proton needs three quarks, making three Pi, allowing the proton to operate at that angle offset. We all want to know if the vacuum is equally precise everywhere; that should be a major project for astrophysicists.

The Lucas solution is fixed because of the 1/2, and the 1/root(2), in the formulation above. This defines causality: the process is spectrally matched, it can sample itself (recombine) at a rate needed to cover its stable motion; it is at the Shannon-Nyquist rate. That is a necessity because of the value pi that is used as the maximum divergence. Spectra different from symmetric, band limited are possible, if the environment has a gradient (is not uniform). But since the environment has a gradient, then it too must have elements doing recombinations, and you are eventually stuck with either finding God or making the world spherical.

What does log(1/x) = -log(x) mean?

It means that 1/x will not do -log(x) actions so that log(x) could do them.  1/x will never see those actions (recombinations). Actions that cannot happen for 1/x if it is to be a good estimate of the inverse of x.  x and 1/x have to divide up their allocated actions.


Negative time really means actions that did not happen, yet. So let time go on forever, like the fools we are, and in finite systems we are unknowingly counting actions that did not happen. Time is not going anywhere; it is just subdividing actions and unactions.

The Dow Jones average vs the Ten year bond yield

That is the Dow market price, average, percent change year over year divided by the ten year yield, yearly. The Dow index grows five times faster than the ten year safe rate, that is called the equity premium over the safe rate.

Securing Smart Cards is not a problem

With Crypto currency, everyone wonders about security.  Use the smart card, and secure it with the face.

Use the person's face.

Just secure the smart card by putting the owner's photo right on the card, like a driver's license. The store terminal can pull up the face and display it, so there are three faces right in front: one on the card, one on the person and one in the terminal.

How much space is needed to store a good face likeness in memory?

How about 100 features, each feature chosen from 256 possibilities; that comes to 800 bits (100 bytes) and makes a very good likeness. A 2 GByte flash is about $10, and that is 2,000 million bytes, enough to store faces for 20 million people. The store owner can keep as many faces as he likes in his local terminal, each face tied to the card number and card issuer. It would be a severe pain in the ass to try and counterfeit the system.
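The arithmetic, assuming one byte per 256-way feature:

```python
features = 100                # features per face
bits_per_feature = 8          # 256 possibilities -> 8 bits each
bytes_per_face = features * bits_per_feature // 8       # 100 bytes per face
flash_bytes = 2_000_000_000                             # a ~$10 2 GB flash part
print(bytes_per_face, flash_bytes // bytes_per_face)    # 100 20000000
```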

Once the smart card is issued, the merchant need not really contact the central bank for clearance of money since the smart card can hold cash as well as a wallet.

So it is perfectly reasonable to have as many  types of currency floating around, storable in the smart card without any real fear of counterfeiters.
  • Face on card.
  • Face on person.
  • Face in the local merchant terminal. 
If you are a merchant.

Find a mathematician and get banker bot up and running, then issue your own currency. These cards are coming late this year or early next. Merchants will have very fine-tuned control over inventory; volatility is likely to drop by 50%, at least. Your money is crypto discount coupons, issued in your own name. Whether it is frequent flyer miles, energy savings credits, rental car discounts, or Wal-Mart money, get ready to make it crypto. The Silicon Valley techies will make it as painless as can be.

If you make point of sale equipment.

Having a bit of trouble with face compression and face feature technology? Hire a mathematician; they have this down cold.

Thursday, March 19, 2015

Islamic art lovers

Business Insider: TUNIS, Tunisia (AP) -- The Islamic State group has issued a statement claiming responsibility for the deadly attack on Tunisia's national museum that killed 23 people, mostly tourists.
Thursday's statement described the attack as a "blessed invasion of one of the dens of infidels and vice in Muslim Tunisia," and appeared on a forum that carries messages from the group.

Wednesday, March 18, 2015

Back to sample rates in light and matter

My bubble theory says, simply, that mass is the condition when ground state elements of the vacuum are sufficient to trap non-ground states, and the non-ground states are called light in normal physics. Getting a bit handy with causality and sample rates, then, ground state particles do not initiate exchange, but may be exchanged by others. Hence ground state particles must mix in a proportion of 1 element per 2 non-ground-state elements, as this insures that the Shannon rate is applied to the ground state particles. Hence the 3/2.

So my spectrum makes sense: phi^91 = (3/2)^108 is the condition when the most elements meet that match, and ground state elements are conserved because non-ground-state elements always sample them at twice the rate. Those exponents are the maximum packing when non-ground-state elements diverge at the rate of Phi, a necessity if they are connected.
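That exponent match is easy to verify in log space; a one-line sketch:

```python
import math

phi = (1 + math.sqrt(5)) / 2
lhs = 91 * math.log(phi)      # log of phi^91
rhs = 108 * math.log(1.5)     # log of (3/2)^108
print(lhs, rhs)               # both ~43.79
print(abs(lhs - rhs) / lhs)   # relative mismatch ~1e-6
```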

That helps, glad to get that over with. That was why Einstein and Bose needed that ratio in the Zeta function.

Tuesday, March 17, 2015

Thayer, Frederick C, bonehead author

Thayer, Frederick C., The American Journal of Economics and Sociology:
"... since 1791, there have been six significant economic depressions among the innumerable "business cycles." Each sustained period of budget-balancing was immediately followed by a significant depression. There are as yet no exceptions to this historical pattern.
Here is his horse manure. I correct his errors below the quote, so please read on.

This is the record of six depressions:

1. 1817-21: in five years, the national debt was reduced by 29 percent, to $90 million. A depression began in 1819.

2. 1823-36: in 14 years, the debt was reduced by 99.7 percent, to $38,000. A depression began in 1837.

3. 1852-57: in six years, the debt was reduced by 59 percent, to $28.7 million. A depression began in 1857.

4. 1867-73: in seven years, the debt was reduced by 27 percent, to $2.2 billion. A depression began in 1873.

5. 1880-93: in 14 years, the debt was reduced by 57 percent, to $1 billion. A depression began in 1893.

6. 1920-30: in 11 years, the debt was reduced by 36 percent, to $16.2 billion. A depression began in 1929.
...

The question is whether this consistent pattern of balance the budget-reduce the national debt-have a big depression is anything other than a set of coincidences. According to economic myths, none of these sequences should have occurred at all. How on earth, for example, could we virtually wipe out the national debt in the mid-1830s, then fall immediately into one of the six recognized collapses in our history? ..."
For each item on the author's list I give the result: every time, a new monetary regime for the US, or the Civil War:


1812, first depression, followed by the Second Bank of the United States, new monetary regime.
1836, second depression, followed by elimination of the Second Bank by Jackson, new monetary regime.
1857, third depression, followed by the Civil War.
1873, fourth depression, followed by the Coinage Act, new monetary regime.
1893, fifth depression, led to the free silver movement, a failed attempt at a new monetary regime.
1930, Roosevelt and a new monetary regime.
1971, Nixon Shock, new monetary regime.
Yes indeed, we do new monetary regimes whenever voters get tired of paying the interest costs of the old. So let's default like we always do, but do it with knowledge about how monetary systems really work.

What am I doing with the physics, you ask?

This relationship:
I know tanh(1.5*ln(phi)) = 1/phi, and I want to put that into the integer angle 3*ln(phi). I will do the same with 3/2.






At that point I go back to my original sparse spectral chart which showed phi^91 = (3/2)^108, with two spectral peaks I call the electron and the Higgs peaks. I can then cancel exponents and find the Lucas angles that fill in the sparse spectrum. The chart becomes a mixed set of Lucas angles, one set for the quarks and one for the electron orbitals. Then I am done.

Will this work?
Well, it is either easy or one of two things: 1) No theory or 2) I am too dumb.
My secret hope is, as always, that some brilliant young mathematician solves the problem before I look fundamentally stupid. I always prefer to cheer the successful mathematician before I make a fool of myself.

My entire odyssey here has been me learning from the mathematicians on the web. I had stayed partially awake through math school so I had just the right amount of navigation skills, and the web is a fantastic theorem connection tool.
Where I ended up was with hyperbolics: with discrete Lucas angles they can describe set combinatorics that conserve the totality of elements by assigning them finite sets of spectral features, best described as compressibility, minimizing redundancy and exchanges.

Housing takes a dump

Atlanta Fed Nowcast:

Latest forecast

The GDPNow model forecast for real GDP growth (seasonally adjusted annual rate) in the first quarter of 2015 was 0.3 percent on March 17, down from 0.6 percent on March 12. Following yesterday morning's industrial production release from the Federal Reserve Board that reported a 17 percent decline in oil and gas well drilling in February, the nowcast for first-quarter real nonresidential structures investment growth fell from -13.3 percent to -19.6 percent.

Cause? Housing starts plummet. Calculated Risk has a graph:


That would be a .3 YoY GDP growth predicted for this quarter.
Hence my suspicions confirmed, we will see a negative, year over year change in the implicit price deflator, the first  since the Nixon Shock.   This is a data geek's paradise moment.

How's the quark banker bot coming you ask?

Well I did give it some thought.
Let's start with the basic equation:

D(1+d)^2 - L(1+l)^2 = 1

Total deposits at the deposit rate projected out two periods, minus the same for loans, will be 1.0. There is a missing delta GDP, which I have divided out, so we are dealing with the fraction component of money managed by the currency banker at the short end of the curve. If we define sqrt(L/D) * (1+l)/(1+d) = tanh(x) as assets to liabilities, then they must obey the constraint equation.

What about taking the root of D and L? I added that in; I think it is correct though I neglected it in earlier posts. That ratio should correspond to the Schramm-Loewner index.
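A minimal sketch of this setup, under the reading that D(1+d)^2 plays the role of cosh^2 and L(1+l)^2 the role of sinh^2, so the balance constraint holds by construction and the asset-to-liability ratio lands on tanh(x). The rates d and l here are hypothetical:

```python
import math

phi = (1 + math.sqrt(5)) / 2
x = 1.5 * math.log(phi)              # the operating Lucas angle from the post

d, l = 0.01, 0.03                    # hypothetical deposit and loan rates
D = (math.cosh(x) / (1 + d)) ** 2    # deposits sized so D*(1+d)^2 = cosh(x)^2
L = (math.sinh(x) / (1 + l)) ** 2    # loans sized so L*(1+l)^2 = sinh(x)^2

# cosh^2 - sinh^2 = 1 gives the balance sheet constraint for free ...
print(D * (1 + d) ** 2 - L * (1 + l) ** 2)                  # ≈ 1.0
# ... and the asset-to-liability ratio collapses to tanh(x)
print(math.sqrt(L / D) * (1 + l) / (1 + d) - math.tanh(x))  # ≈ 0.0
```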

Here is the flow constraint equation:

tanh*tanh' + 1/2 tanh'' = 0

The constraint equation ensures that the assets to liabilities are no arbitrage. Rate adjustment occurs when the asset-to-liability ratio is in error, as determined by taking the first and second differences of deposits and loans, as they occur. Eventually the constraint deviates from zero by the uncertainty, and rates are sterilized, yes, the banking term. Putting rates back into no arbitrage is sterilization.
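One caution worth a quick check: with the exact derivatives of tanh, the flow constraint is an identity, zero at every angle, so the bot only learns something when it plugs in the measured first and second differences rather than the analytic ones. A sketch:

```python
import math

def flow_constraint(x):
    """tanh*tanh' + 1/2*tanh'', using tanh' = 1 - tanh^2 and tanh'' = -2*tanh*tanh'."""
    t = math.tanh(x)
    t1 = 1 - t * t        # tanh'
    t2 = -2 * t * t1      # tanh''
    return t * t1 + 0.5 * t2

# With analytic derivatives this vanishes at every angle, not just in balance;
# the bot must test it against measured differences of deposit and loan data.
print(max(abs(flow_constraint(k * 0.1)) for k in range(1, 30)))  # 0.0
```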

What is x in the tanh?
That will be integer multiples of log(phi), starting with 0. Phi implies a finite connected network, which is additive and segments the hyperbolic angles into discrete Lucas angles. That is the condition needed. These angles should fall out of the equilibrium conditions. The currency banker occupies the first two or three angles, and the banking network subdivides the remainder of the tanh curve with multiples.


OK, what is the deal with two loan rates?

Everything operates exactly the same; the currency banker has one deposit rate and two loan rates. The currency banker monitors both loan portfolios for the no-arbitrage constraint. The first loan-to-deposit that is out of balance will be sterilized with a rate adjustment for both the deposit and loan, as if the second loan balance was completely ignored. In other words, the unadjusted loan portfolio becomes a shared liability for the whole network; bankers will be aware that it may be closer or farther from balance relative to the adjusted rate. The banker then continues, simply adjusting whatever loan rate becomes out of balance.

Why two loan portfolios?

Because the currency banker is serving a large complicated economy and wants to approach the operating angle of 1.5 * ln(phi), which gives the best operating range to simulate Euler's number. When that point is reached then Phi, Euler and pi, the three transcendentals, will be best matched and permanent Brownian motion operates within the capability of the aggregate.

This will work much better than anything that came out of the MIT basket weavers club! I know this will work because I have seen the work on tanh used by our brilliant mathematicians, too numerous to mention. But if you are a large corporation, go find one of these mathematicians and hire them with a very large salary.

Monday, March 16, 2015

Deflation becoming inflation says Billion Prices Project

WSJ: One of the biggest economic questions facing the U.S. economy in 2015 is this: Will measures of inflation veer into deflationary territory, or will prices firm?
The Billion Prices Project, which scrapes the Internet daily to capture changing prices online and has often foreshadowed subsequent changes in official price indexes, shows a sharp turn upward in measures of inflation, albeit from a low starting point.
The issue was whether that sharp drop in the red line would continue declining.  These are YoY changes, meaning that it takes a big couple of quarterly price declines to register a price decline over the previous year. The official CPI, the blue line, barely went negative, year over year, but given that basis was 2%, it must have taken a fairly large negative plunge this quarter. Here is inflation data:

Monthly deflation was -0.57% in December 2014,  -0.54% in November, -0.25% in October, -0.17% in August and -0.04% in July. But as the January 2014 numbers fell out of the calculation and were replaced by a massive -0.47% the annual inflation became deflation.

When they say, 'January 14 fell out of the calculation', they mean it no longer was in the 4 quarter look back.  So January had a price jump holding up the Year over Year rate.   So is the Billion Prices uptick, that little red hook at the end, significant? Not yet.  We are still dominated by six months of consumer price declines and that is still pinching profits.  The next tell will be jobs reports, will job gains taper off?
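The "fell out of the calculation" mechanism in a few lines: YoY inflation is roughly the sum of the trailing 12 monthly changes, so a big old month dropping out of the window can flip the sign on its own. The numbers below are made up for illustration:

```python
# Twelve hypothetical monthly price changes (percent), oldest first:
monthly = [0.5] + [0.0] * 10 + [-0.3]   # big January, quiet year, soft December
yoy_with_jan = sum(monthly)             # January still inside the 12-month window
yoy_next = sum(monthly[1:] + [0.0])     # January "fell out", a flat month replaces it
print(round(yoy_with_jan, 3), round(yoy_next, 3))   # 0.2 -0.3
```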

A book on the Smart Card revolution!

George Selgin has convinced me that total segmentation of currencies is unlikely, which this author claims. But all the currencies he mentions will exist as supplemental discount currencies, making prices a combination, the way discount coupons work at the store. But that is where the smart card is so powerful: combining the main currency and the discount coupon into a single payment.

Also, when you read this article, every one of these supplemental currencies will be managed by open source, hyperbolic  banker bot.

Yahoo: How will you pay for groceries or gasoline in the future?  According to Kabir Sehgal, author of the new book “Coined,” it might not be the old familiar greenback. As technology advances, so do currencies and payment systems.
“Look at Amazon (AMZN) points, Starbucks (SBUX) points. These are effectively the new currencies,” he says. “There are more frequent flyer miles in circulation than dollars, so it's one of the biggest currencies in the world.”
Sehgal believes that the future of money may involve the linking of corporate currencies.
“What if you could go in a taxi and pay with your Starbucks points, or go to Starbucks and pay with Amazon points?” he says. “So effectively you can link up all these corporations and create one unified corporate currency, and it may be very user-friendly.”

He also thinks that mobile phones will likely become the payment device of the future. In recent years Apple Pay (AAPL), Google Wallet (GOOGL), and M-Pesa have been created to facilitate these transactions.
“So increasingly the mobile phone can be the way to pay for items and increase speed in a store,” he says. “Starbucks for example, is it quicker to pay with a mobile phone? Sure, the line will go faster and it's better for consumers, better for Starbucks.”
As for digital currencies, such as Bitcoin, Sehgal thinks it’s unlikely they will gain traction as an alternative to the U.S. dollar. He points out that the U.S. government still maintains the ultimate power to determine what is a currency. Importantly, taxes in the U.S. can only be paid in U.S. dollars.
“I think what we're seeing now is the Bitcoin will thrive as a technology, not necessarily as a currency,” he says. “Bitcoin is way to transfer files in authenticated manner. That technology will remain and flourish.”
Sehgal says it isn’t illegal to create currency, noting that both Philadelphia and Ithaca created their own local currencies in the 1990s as a way to stimulate economic activity.
America has a long history of using many different currencies. “If you look at the Civil War, there are 8,000 currencies in United States circulating throughout America,” Sehgal points out.

James Pethokoukis, Yes you did miss something

Pethokoukis on Kurt Anderson's column about the 90s:
With so many stats, strange that Andersen fails to offer even a single one about rising inequality. The top 1% US income share in 1992 — which Andersen cites as the start of the good times — was 13.5%, according to the World Top Income Data Base. When Bill Clinton left office in 2000, the share had risen to 16.5%.
James, look at the data once again, please. The Gini took a huge jump in 1992; Clinton took office in 1993. I went through the data, but then I data-check, you just spout useless recipes. The jump in Gini happened when rates plunged after the 1991 recession, Bush the elder's recession. After that, the Gini remained on its normal trend.

Is this your take? You left no reference:

The US economy entered the 1990s after undergoing a huge revamp in the 1980s: marginal tax rates were lowered from 70% to 28%, the inflation menace slayed, regulations reduced, and Corporate America got restructured. Then in the 1990s, government spending and debt were reduced, investment taxes cut,

Well, James. Taxes were still low when lil Bush took office; he ran the deficit to high heaven, and growth continually dropped until the big crash, the normal Republican Communist behavior. Bill Clinton took the deficit down, continuously, from the day he took office, a straight shot into surplus. Every friggen Republican president frigged that up, clueless dunces, all of them.

James, you have not yet learned: I data-check your bullshit. You cannot get away with recipes, you have to check your friggen data. I have covered all this material with charts, causality, and the order of events since the Nixon Shock; you end up on the wrong part of the recipes because you fail data analysis.

Sunday, March 15, 2015

Quark banker bot

Dealing with the hyperbolic banker, I want to find more separation dimension in the economy. So I decided to split the banking network between a base 2 and a base 3 network by making the currency banker offer two loan rates and one deposit rate, with the rule that any member bank can choose one of the loan rates but not both. This lets the currency banker split the member banks into two groups, and each group can bet on the other. Thus, for a large economy, like the one we have, the banking network can have a revolving account flow, an extra degree of freedom over which prices can equilibrate.

The revolving accounts are the two loan rates in the bot, allowing one part of the economy to rotate money into and out of the other part. This is going to be part of the banker bot for the US, China, and Europe; for smaller economies it is not likely needed. As usual, I cheated and just borrowed the concept of complex numbers from the math wizards; using complex numbers keeps money in motion, letting me further sub-sample the exchanges from Phi to 3/2. I will use the trigonometric or hyperbolic (at a different sample rate) stability conditions, which yield two complex rate solutions for assets and one for liabilities. I think this is how the quarks do it, so I call it quark banking.

My astute readers will note that Euler's number, e = 2.718..., is between 2 and 3. The idea here is to allow our bankers to make a better finite approximation of that number, which allows more bankers to operate over the network; it becomes dense. But we cannot allow local density, as that removes network connectivity when overlap reduces precision below the bounds of Phi. Hence, quark bot: segmenting the bankers into two phases, each phase orthogonal to the other. This will work, and it also helps me understand particle physics a bit better, one of my ambitions in life.
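As a side note, e's simple continued fraction, [2; 1, 2, 1, 1, 4, 1, 1, 6, ...], produces exactly this kind of finite approximation trapped between 2 and 3. Here is a minimal sketch of those successive approximations; the banker interpretation is mine, not anything worked out above:

```python
from fractions import Fraction

def e_cf_terms(n):
    """First n terms of e's continued fraction: [2; 1, 2, 1, 1, 4, 1, 1, 6, ...]."""
    terms = [2]
    k = 1
    while len(terms) < n:
        terms += [1, 2 * k, 1]
        k += 1
    return terms[:n]

def convergent(terms):
    """Fold a continued fraction [a0; a1, ..., an] into a single exact fraction."""
    x = Fraction(terms[-1])
    for a in reversed(terms[:-1]):
        x = a + 1 / x
    return x

# Successive finite approximations of e: 2, 3, 8/3, 11/4, 19/7, 87/32, 106/39, ...
approximations = [convergent(e_cf_terms(n)) for n in range(1, 8)]
```

Each convergent stays between 2 and 3 and closes in on 2.718..., which is roughly the "denser network of bankers" picture: finer finite splits between the base 2 and base 3 extremes.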

Can I represent this as a jump in my Wythoff Array? Dunno, still clueless and deferring to the pros. Why not let the bankers trample each other and let competition separate them? Not a bad idea, except money still has to measure the trampling process. That is what quark banking does: it measures how well bankers trample their way to 2.718...

Back to optimum congestion

I started the whole physics thing with that concept, and now it is returning. Banker bot will balance deposits against the loan portfolio most out of balance. The effect should be to make deposits more balanced and have an indirect effect on the second loan portfolio. Eventually the second loan portfolio rebalances with deposits. The system should stabilize with optimum divergence between the two loan portfolios until they are complex conjugate quadratic roots. The idea is that one loan portfolio is always in the queue, so the banker bot flow has a very small probability of running dry. At any time, I think, member banks can place their bets in one or the other, but not both. I am still fidgeting with the idea.
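As a toy illustration of that end state (my reading of "complex conjugate quadratic roots," not a model spelled out in the post): if the two portfolio rates were the conjugate roots of some quadratic, they would be equal in magnitude and opposite in phase, so one portfolio rotates into the queue as the other rotates out. A minimal numerical sketch with a hypothetical quadratic:

```python
import cmath

def conjugate_rates(b, c):
    """Roots of t^2 + b*t + c = 0; a conjugate pair when b^2 < 4c."""
    disc = cmath.sqrt(b * b - 4 * c)
    return ((-b + disc) / 2, (-b - disc) / 2)

# Hypothetical characteristic quadratic t^2 - t + 1 = 0 (negative discriminant).
r1, r2 = conjugate_rates(-1.0, 1.0)

# Conjugate pair: same modulus, opposite phase -- the two loan portfolios
# stay balanced in size while cycling out of phase with each other.
same_size = abs(abs(r1) - abs(r2)) < 1e-12
opposite_phase = abs(cmath.phase(r1) + cmath.phase(r2)) < 1e-12
```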

Saturday, March 14, 2015

Something strange in the neighborhood

The implicit price deflator seems in free fall?

I am having a hard time dealing with the implicit price deflator, year over year. And these numbers should include home pricing.

The Producer Price Index is showing a very recent -1.7%, but that thing is volatile, and the number came from the commodities industry. PPI is the red line, and it does spend time in negative territory. It makes up about 1/4 of the implicit deflator. The CPI, all items, the green line above, did dip into negative territory during the crash. Its most recent value is near zero, I think; the last release had it at -0.2%, according to Market Realist. These numbers are in flux, as the NowCast GDP report expects a 0.6% GDP growth rate, a drop from 2.2%.

But, put these numbers together and we get something like a negative half point price deflator. That would be a record, the first negative deflator in 40 years, the first time since the Nixon Shock!
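The rough arithmetic behind that half point, taking the 1/4 PPI weight mentioned above and letting the CPI-like component carry the remainder (a back-of-envelope blend, not the BEA's actual methodology):

```python
# Back-of-envelope blend of the two price series cited above.
ppi_yoy = -1.7     # producer prices, percent, year over year
cpi_yoy = -0.2     # consumer prices, percent, most recent release
ppi_weight = 0.25  # PPI is roughly 1/4 of the implicit deflator

deflator_estimate = ppi_weight * ppi_yoy + (1 - ppi_weight) * cpi_yoy
# Comes out near -0.6, i.e. "something like a negative half point" deflator.
```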

Do we care?
Normally no, we should be able to handle a half point price decline, except we had a 1 point rise last quarter, so the second differential on price is very high. We do not meet the flow constraints; this is not adiabatic. This looks fubar. But does it mean Gray Bar? Dunno; if it is contained to one quarter, then fine. But it looks very weird to me, very out of the ordinary.
Here are the quarter-on-quarter changes, not year over year. More erratic, and the PPI, red line, has since taken a jump back down; otherwise the quarterly changes are down.

Behind all this is the straight up, 18% rise in the trade weighted dollar.  The world wants our prices to be reset to 1985 levels!

Sticky wages

The blue line is hourly wages per person. The red line is total payroll. Both are in units of change per month. Hourly wages per person wriggle about the average; total payroll takes the plunge during Gray Bar. That is sticky wages.

Friday, March 13, 2015

I will protest that


Rubio plans to go deficit spending and crash the economy

WA Post: That's the same big idea that Marco Rubio and Mike Lee have today. They also want to cut the top rate from 39.6 to "only" 35 percent, and use the money that could have gone into cutting it even further to expand the Child Tax Credit from $1,000 to $3,500 instead. Now that credit would only be refundable against payroll and income taxes—so if you didn't owe any, you wouldn't get any help—which is why most of its benefits would go to the middle and upper-middle classes. The nonpartisan Tax Policy Center estimates that, altogether, this plan would raise after-tax incomes 1 percent for the bottom 40 percent, around 2.4 percent for the next 50, 2.8 percent for the top 1, and 3.8 percent for the top 0.1. That adds up to a lot of red ink, though: $2.4 trillion more in deficits over the next decade, to be exact.

Note to self, Marco Rubio, Republican Communist Rat

It's hard to be on time, Matt O'Brien

Matt: The U.S. economy is doing well enough that it's getting ready to raise rates, and the rest of the world is slowing down enough that it's cutting them.

The GDPNow model forecast for real GDP growth (seasonally adjusted annual rate) in the first quarter of 2015 was 0.6 percent on March 12, down from 1.2 percent on March 6. The nowcast for first-quarter real consumption growth fell from 2.9 percent to 2.2 percent following this morning's retail sales release from the U.S. Census Bureau.

Well, the dollar must be up because Spain is outperforming the USA? Naw, more likely it's rollover season in DC and the debt cartel is collecting dollars for high yields.

Oldie but goodie

Check out this graph of Tanh and derivatives

That blue line is the second derivative, the negative actually. The peak around 5 is where tanh'/tanh = 1, and tanh = 1/Phi, so that means that, as a spectrum in integer quants, the peak is where the ratio of one spectrum to its neighbor is Phi. That curve also looks like a Planck curve, no?

Now check out this story from Quanta Magazine:

What struck John Learned about the blinking of KIC 5520878, a bluish-white star 16,000 light-years away, was how artificial it seemed.
Learned, a neutrino physicist at the University of Hawaii, Mānoa, has a pet theory that super-advanced alien civilizations might send messages by tickling stars with neutrino beams, eliciting Morse code-like pulses. “It’s the sort of thing tenured senior professors can get away with,” he said. The pulsations of KIC 5520878, recorded recently by NASA’s Kepler telescope, suggested that the star might be so employed.
A “variable” star, KIC 5520878 brightens and dims in a six-hour cycle, seesawing between cool-and-clear and hot-and-opaque. Overlaying this rhythm is a second, subtler variation of unknown origin; this frequency interplays with the first to make some of the star’s pulses brighter than others. In the fluctuations, Learned had identified interesting and, he thought, possibly intelligent sequences, such as prime numbers (which have been floated as a conceivable basis of extraterrestrial communication). He then found hints that the star’s pulses were chaotic.

But when Learned mentioned his investigations to a colleague, William Ditto, last summer, Ditto was struck by the ratio of the two frequencies driving the star’s pulsations.
“I said, ‘Wait a minute, that’s the golden mean.’”
This irrational number, which begins 1.618, is found in certain spirals, golden rectangles and now the relative speeds of two mysterious stellar processes. It meant that the blinking of KIC 5520878 wasn’t an extraterrestrial signal, Ditto realized, but something else that had never before been found in nature: a mathematical curiosity caught halfway between order and chaos called a “strange nonchaotic attractor.”

Related? Well, the frequency seems to be on the order of hours, though I could not get that number from the article. There is more to the story, and chaos and order seem to be associated with Phi, but between chaos and order is a Wiener process, no?

Thursday, March 12, 2015

Another odd connection between hyperbolics and Phi

Consider tanh'/tanh = 1, the point where tanh is locally growing like e^x.

The solution is:
(1 - tanh^2)/tanh = 1/tanh - tanh = 1
and we know then that
tanh = 1/Phi.

The angle where that happens is: 1.5 * log(Phi)

But that angle is not one of our Lucas angles, which are integer multiples of log(Phi).
But we know that:
tanh(x/2) = (cosh(x)-1)/sinh(x), so this drops out of the divisibility of Lucas polynomials, though I have not worked it.
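A quick numerical check of the claim, using only the standard library: at the angle 1.5 * log(Phi), tanh comes out to 1/Phi and the ratio tanh'/tanh is exactly 1.

```python
import math

phi = (1 + math.sqrt(5)) / 2      # the golden ratio
x = 1.5 * math.log(phi)           # the angle claimed above

t = math.tanh(x)
ratio = (1 - t * t) / t           # tanh'/tanh, since tanh' = 1 - tanh^2

# tanh(1.5 * log(phi)) = 1/phi, and the log-derivative is 1 there.
```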

For the trigonometric functions we have:
tan'/tan = 1/tan + tan = 1

We get a similar result. It may seem odd, but circular functions are maximally divergent, and we just get the connection between log, pi, and phi; phi is the most irrational number and will be maximally divergent.



Regarding the hyperbolic banker

I have added an intermediate variable, delta. Using the same terminology I have:

D/delta*(1+d)^2 - L/delta*(1+l)^2 = 1

The delta is the proportion that makes the unit one; it cancels in the flow constraints, but we will be using it. Let me remind my readers: the equation above simply states the condition that the growth on D and L is within the bandwidth needed for causality. That is, a two-period look-ahead is ensured.

Delta, for the hyperbolic banker, is the amount of growth expected, I think. Since we have a finite banker net, the period must correspond to the rates d and l. The term at each level of the banker net changes, and rates correspond. How do we find the term in units of time? We measure the activity at equilibrium; time is an output. So this is different from DSGE-style equilibrium because we equilibrate over flows.
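A minimal sketch of that bookkeeping, with hypothetical numbers (D, d, L, l are the equation's deposit/loan totals and rates; delta here is just the scale that makes the flow condition come out to one):

```python
def delta(D, d, L, l):
    """Scale factor such that D/delta*(1+d)^2 - L/delta*(1+l)^2 = 1,
    i.e. delta = D*(1+d)^2 - L*(1+l)^2."""
    return D * (1 + d) ** 2 - L * (1 + l) ** 2

# Hypothetical portfolio: 110 of deposits at 2%, 100 of loans at 5%.
dlt = delta(110.0, 0.02, 100.0, 0.05)

# Dividing through by delta recovers the unit flow condition.
unit = 110.0 / dlt * 1.02 ** 2 - 100.0 / dlt * 1.05 ** 2
```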

A price neutral monetary regime

My proposal.

Congress sells the Fed to the public for 10 Trillion in government debt relief. The sales price is paid over a ten to fifteen year period by having the new fed buy government debt and burn it.

The new fed is managed by a spreadsheet which sets the savings and lending rates. The Fed staff is paid with printed cash. No member bank owns any part of the spreadsheet, but the spreadsheet functions are open source. No dividends are paid, no money remitted to anyone, the central bank simply pays the staff and runs the lending and deposit portfolios, automatically, via spreadsheet.

That leaves only the human function of selecting and qualifying member banks, and securing the PC on which the spreadsheet runs.

Simple plan, DC walks away and gets a 10 T debt relief. The public gets efficient price neutral money. And the Republicans are no longer blamed for being a bunch of deficit loving communist Yahoos.

Why not? Just buy Congress out permanently. The spreadsheet will be banker bot, the no-arbitrage currency banker. We can have member banks come and go as long as the Fed staff can pick quality members. Congress is damn near broke; they won't last another recession anyway.

And, to top it all off, we have Obama, the president who reduces deficits faster than any other post-war president. Obama knows this stuff, as does the debt cartel. This is a good deal, a sensible deal.

Greg Ip said what?

Greg Ip: Yet focusing solely on the dollar’s fall gives a blinkered view. As the Fed bought bonds, their yields fell, and so investors rushed to other markets in search of better returns. While that pushed those countries’ currencies up, it also pushed their interest rates down and stock prices higher.

Let's check:
Ten year yield, red, rises when the Fed is buying bonds, blue.

Perhaps Greg meant relative to trend?

Nope, the ten year headed down hill in a fairly consistent manner. Nothing in the blue line seems to have much effect.

DC sold more than the Fed bought, says Larry Summers.  Mostly what happened is the long term trend simply shifted. It is still a bit of a puzzle.

But the best explanation I see is that the bond dealers consider QE an implicit tax. The Fed bought some 800 billion and DC was selling about a trillion of new debt. So the bond market bought about 200 billion at ten-year yields of 3.2%, mostly by restricting trade, as near as I can tell. This must be a controlled market. How did the big banks compute the optimum yield?

They must have known the budget constraints beforehand, and had the Treasury elasticity function. The Fed gave that money at 0.23%, basically the IOER plus dividends, minus earnings from the rest of the portfolio. So, on net, Treasury paid a combined 1% (market plus Fed) or so, relative to 2% before QE2. I guess, in the end, the bond traders got their half of the tax cost paid for, with hiked rates. That is a 15% tax per year on the government bond market.
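The blend implied by those rounded figures (the split is my arithmetic on the numbers in the paragraph above, not any official accounting):

```python
# Treasury's blended borrowing cost across the Fed's slice and the market's slice.
fed_share, fed_rate = 800.0, 0.23       # $B bought by the Fed, effective percent
market_share, market_rate = 200.0, 3.2  # $B bought by the market, ten-year percent

blended = (fed_share * fed_rate + market_share * market_rate) / (fed_share + market_share)
# Lands just above 0.8%, i.e. "a combined 1% or so" versus 2% before QE2.
```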

What was going on then?
GDP had reached its pre-crash level, and federal spending stayed at that level. And everything remained stable at those levels. The only thing different would have been that rollover of the old debt was swamping the system. The entire mess was likely the sudden rollover of thirty and ten year debt from Republican deficit spending in the past. The Republican recession problem always hits the oil supplies first because that is the import. So the claim that the ten-year rate was lower is mainly a claim that all parties agreed to a 15% bond tax on interest revenues because of the rollover problem. A problem almost entirely created by Republicans, especially Reagan and Cheney.