Monday, March 30, 2015

Hyperbolic group decomposition

That is where we are going when we incorporate the higher Lagrange numbers into the hyperbolic system.

Why do these Lagrange numbers work the way they do? They take advantage of their own nature; they reorganize sequences into separable groups, each group defined by a power series. The Lagrange number itself just divides the sequence into single matches, then dual matches, and so on. When combined with the set calculus of hyperbolics we get powerful artificial intelligence.

The Lagrange numbers just combine sequential sets. But within the hyperbolic system, they find the minimum redundant sequence, label them, then the system just jumps up to a higher number and finds the next sequential sets. That is a powerful artificial basis system. This idea of finite group calculus is very powerful; it is the next generation of AI, and we have barely passed the first.

Take speech understanding.  What would the Lagrange hyperbolics do? Find the optimum groups of growth and decay processes.  Do this recursively, get group powers of groups. They would actually learn speech.

Isn't this idea, if developed, really finite discrete log? Or actually, decomposition into optimum groups. We move to the higher Lagrange, and we squeeze the angle, but get an independent basis to obtain the next estimate of divergence. This is the missing theory I wandered onto about two years ago. I always thought the Lagrange was the generalization of minimum redundancy encoding. If we are working with an existing group, the goal is to estimate the maximum divergence about the ring. Find its 'pi'; this is the road to big stuff. I think this approach eventually solves this:
In probability theory, a balance equation is an equation that describes the probability flux associated with a Markov chain in and out of states or set of states.
It is an optimum decomposition of the probability flux, I think. It finds the most divergent path about the ring, stepping between Lagrange power series.
See this chart? From the last post. But I added some lines zigzagging across the red and white barrier, just on the left. That path is the path through the discrete points of the probability flux. That path should be jumping across Lagrange quants.

The little thing itself, if done accurately, should be the prime group, a new and novel concept, and very powerful.

So hyperbolics, when used with the Lagrange angles, are doing group intersections, decomposing some group using prime groups. The diagram is the first prime group; it is a basis set of all other groups. So it is perfectly reasonable in particle physics to have a prime group, though we may never actually detect it. But try taking a perfect sphere in a pure simple vacuum, then cool the vacuum down, way down, until you are freezing gravity. Then slowly compress the sphere; I think you might make some of these, and the vacuum will get sloshy.


This is great stuff, it really is.

The smallest piece of matter possible

I was playing around with my spreadsheet and wondered what the rational approximation was for the best approximation of pi that would create a bubble trap. 1.5 * ln(phi), naturally. But how many bubbles of space would be trapped? The basic idea is that the bubbles of space play a Wythoff game, and once the most efficient positions were reached, at the white and red barrier, the game was won by the cold positions in the center, and mass formed.

That point, reached by Fibonacci numbers, is F17/F16 (or F16/F15, I am not sure yet), but that is about 1,000 bubbles. Inside the barrier there are twice as many cold positions as hot, and just outside, twice as many hot as cold.

At the barrier, the divergence of moves is optimum and most moves take place inside the perimeter in circular fashion. Pi is the curvature, computed, and held in position when Lucas swaps the positions in the center. In that white and red ring, the system has locked onto the ratio as the tangent, and the Lucas angle keeps the pairing straight; there are two quants, 1 inside the ring and 2 outside. Feynman plays around in the barrier.

The blue are hots trying to get cold. Outside, the hots try to get in, forcing cold positions out, but the extreme accuracy of the barrier makes it impossible to penetrate.

I know this little guy has to exist; the Lagrange mathematicians say we cannot make bigger things until we trap all the 'bad' phi approximations. Anyway, I call this the next biggest thing above the vacuum bubbles. These have to be the composites that make a lepton.

I know the point is likely F16 because I have run the numbers so many different ways, and I get phi^16 very close to (3/2)^19.  After splitting up the possible cold bubbles, I get (3/2)^18  for inside and outside, so I know this is close.
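A quick check of those numbers in Python, using nothing beyond the figures quoted above:

import math

phi = (1 + math.sqrt(5)) / 2

# Fibonacci numbers, F1 = F2 = 1
fib = [1, 1]
while len(fib) < 17:
    fib.append(fib[-1] + fib[-2])
F16, F17 = fib[15], fib[16]

print(F16, F17, F17 / F16)     # 987, 1597 -- roughly 1,000 bubbles, ratio near phi
print(phi**16, (3 / 2)**19)    # ~2207 vs ~2217, within about half a percent
print((3 / 2)**18)             # the inside/outside split value quoted above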

This is more accurate than the fine structure, and I can put a bunch of these things together and make a lepton using the 1 + 2^(1/2) ratio. The effect is to flatten out tanh'' so the sweet spot is not so sweet. But as the mathematicians say, with the bad phi approximations trapped in a well, I should have no problem adding spectral modes all the way up through quarkland.

Let's derive the money equation from the supply/demand equation

We will solve this for the normalized symmetric case, the case when prices and supply make a symmetrical V at equilibrium.  And we are going to normalize it so gains are always one unit of currency.  Then extend it to a connected distribution network.

First, we let prices decay by some rate as supply grows by the same rate, and we set the quantity levels to normalize.
We have:

(1+1/r) is the growth rate of supply, and (1-1/r) the decay rate for price. Then the normalizing quantity basis is r, so that:

(1+1/r)^2 - (1-1/r)^2 = 4/r

I think I have that right. The square ensures equilibrium because it projects the gain out two periods. Hence there will be two transactions available to realize the projections, and with probability one the queue length for all transactions will be less than three. The Nyquist bandwidth requirement is met, and that is the condition of an adapted process.
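For the record, the normalization above is just the difference of squares; expanding it:

\[
\Bigl(1+\tfrac{1}{r}\Bigr)^2 - \Bigl(1-\tfrac{1}{r}\Bigr)^2
= \Bigl(1 + \tfrac{2}{r} + \tfrac{1}{r^2}\Bigr) - \Bigl(1 - \tfrac{2}{r} + \tfrac{1}{r^2}\Bigr)
= \frac{4}{r}.
\]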

Now one should always be able to do this if the real quantities and rates are known and independent; there are enough variables to normalize.

But we can see I have constructed the problem as a unitized, symmetric hyperbolic condition.  Hence the differential equation of flow is satisfied.

Let price be (1-1/r)/(1+1/r), the ratio of price to supply in the current quarter. Then that is tanh, and we get inflation right away as tanh'. So then, computing or measuring tanh'', we get the money equation:

Price * Inflation +1/2 Inflation' = 0.
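With price standing in for tanh and inflation for tanh', that money equation is an identity of the hyperbolic functions. A quick symbolic check (just a sketch with sympy; the variable theta is a stand-in for the discrete angle):

import sympy as sp

theta = sp.symbols('theta', real=True)

price = sp.tanh(theta)                      # price-to-supply ratio modeled as tanh
inflation = sp.diff(price, theta)           # tanh', the inflation term
inflation_prime = sp.diff(price, theta, 2)  # tanh'', the change in inflation

money_eq = price * inflation + sp.Rational(1, 2) * inflation_prime
print(sp.simplify(money_eq))                # prints 0: the constraint holds identically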

If we require a connected network where all quantities are conserved, then r will always be of the form 1/Phi^(2n), where n enumerates the node level of the distribution network.

I think I have all that; if not, economists can work it out before they teach it. The symmetric case is the first Lagrange. The asymmetric cases that use higher Lagrange numbers will require less inflation'; the higher Lagrange numbers assume all the Phi estimates have been consumed. I am still working that one out.

This helps explain why Dean Baker's logic on inflation and inflation rate of change is a bit wrong. His theory is that inflation and inflation rate of change can be set independently, and we see they are mutually constrained by the price level.

But what this all means is that I have two chances to get to my favorite restaurant for apple pie, and that should be enough to prevent a crowd.

Here by the way is the connection between hyperbolics and Lagrange.

x^2 - r^2 y^2 = 4. One can see this form is closely related to the equation above. The variable r is a quadratic surd, but it determines 1/r * tanh'', where 1/r is 1/2 when the symmetric hyperbolic is used. Otherwise much remains the same, but there is certainly a bit of work for mathematicians to formally redo Lagrange theory in generalized hyperbolics. Following the link we see all the connections between hyperbolics and Markov, bringing us to Wiener and Brownian motion, then Schramm-Loewner.
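One way to see the hyperbolic connection in that form (my own reading, not something from the linked material): the curve x^2 - r^2 y^2 = 4 is swept out by hyperbolic functions of a single angle,

\[
x = 2\cosh\theta, \qquad y = \frac{2}{r}\sinh\theta
\quad\Longrightarrow\quad
x^2 - r^2 y^2 = 4\bigl(\cosh^2\theta - \sinh^2\theta\bigr) = 4,
\]

so the quadratic surd r only rescales the y direction while the angle does the stepping.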

It is the calculus of combinatorics, a Schur reduction of chaos into optimal sets up to a standard error. Raising the Lagrange order moves the spectral peak down to a different angle and decomposes the tanh curve with another basis set. The theory of everything, actually.

Derive the whole mess from discrete angle hyperbolics and be done with it.


Sunday, March 29, 2015

Something strange in the neighborhood!


This chart has nominal GDP in purple, which tracks personal consumption, measured in nominal dollars, I presume. Everything is YoY.

That red line is the price deflator, and it tells us how much of the nominal GDP is real growth. That deflator is headed downhill, and the latest estimates are that it should be hovering around zero. That implies we have a 3% real GDP growth print coming up in a few weeks. And the jobs report is expected to show reasonable job growth.

But the Atlanta GDPNow forecast says no, real GDP growth is likely to be 1%, if we are lucky. And consumption growth is barely .6% YoY. The other line, blue, is hourly wage changes, and it is headed down, slightly, along with the deflator. We are in a zero inflation environment.

So, something does not jibe between the top two numbers and the bottom two. You can actually see them diverge, and the divergence has gotten worse in the last few quarters. Something strange in the neighborhood!


Saturday, March 28, 2015

The Lepton has to be a composite particle

Let's do some back of the envelope on this.

The ratio of the proton to the electron is 1836. We have three quarks, so they independently manage one third each, or 612 each. Now that number is roughly 2^9, but with hyperbolics (and Feynman) they run bidirectional flows, so make that 4^5. OK, this makes sense, since 4^phi = 3*pi, so we can see the quarks baking three pies. And the 4^5 can be broken into three groups, each 1/phi^5 times the previous, and the exponent matches the number of spectral modes for the quark; we have a way to make quarks work.
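Back-of-the-envelope check of those numbers in plain Python; the coincidence being leaned on is 4^phi versus 3*pi:

import math

phi = (1 + math.sqrt(5)) / 2

print(1836 / 3)              # 612.0, the per-quark share of the mass ratio
print(2**9, 4**5)            # 512 and 1024, the nearby powers being invoked
print(4**phi, 3 * math.pi)   # ~9.422 vs ~9.425, agreeing to about 0.03%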

Great, things fit. Now what about the electron itself? It has about 2^50 bubbles of the vacuum inside. (After all, Avogadro is the iron law of spheres.) So those bubbles have to be managed as grouped particles, as there is no way to fit all the bubbles onto one tanh curve and still make the connections. Ergo, the lepton is a composite particle, case closed.
 
What kind of particles? Well, make them with the second Lagrange number. They don't bake the good pi, so they are a bit redundant in their Feynman transitions; they spin to make up for it. So you get something like 4^5 bubbles per small thing, and they group by fives, each crammed badly onto tanh and using the redundant Lagrange. That gets to 2^50 bubbles of vacuum and we are home free. Or we have the second and third Lagrange in there, and the electron is always spinning and on the move, but it holds integrity. The quarks can work with that, and refine the accuracy of pi.
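One reading of that counting (my interpretation of 'group by fives', not spelled out above): 4^5 bubbles per small thing, taken through five levels of grouping, lands exactly on 2^50.

per_small_thing = 4**5            # 1024 bubbles in each composite piece
total = per_small_thing**5        # five levels of grouping by the same factor
print(total, total == 2**50)      # 1125899906842624 True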

ObamaCare whoops

I took the contributions to GDP from consumer spending from the latest Q4 2014 BEA update. The blue line is the contribution to GDP growth from health care spending, the red is the rate at which that changes, and the yellow is the rate at which the rate changes. These are like differentials.

Note: These are year-over-year numbers, and my results are all YoY.


Here are the numbers from the BEA:
0.37, 0.24, 0.32, 0.20, 0.22, -0.03, 0.51, 0.70, 0.13, 0.40,
0.13, 0.04, 0.40, 0.30, 0.48, -0.16, 0.45, 0.52, 0.88

These are the health service spending contributions to growth from the personal consumption graph, Line 17 in the second chart. They go back to Q1 2011.

The number that stands out is the latest, Q4 2014, where health care spending comprises 0.88 of the GDP growth of 2.2. So 40% of GDP growth last quarter came from increased health care spending. Now this quarter the GDP growth is best guessed at 1.3%, and if the increase in health spending holds up, then Obamacare will drive 67% of this quarter's growth.
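The shares are just the quoted contribution divided by the quoted growth rate:

print(0.88 / 2.2)   # 0.40: health care's share of the 2.2 growth print
print(0.88 / 1.3)   # ~0.68: its share if the 1.3 best guess holds up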

Can the economy devote 2/3 of its growth rate to Obamacare?

No, the blue line tells us that health spending should typically be about 25% of the GDP growth rate. That red line, the first difference, tells us how fast the economy can adapt to changes in health expenses, and that number hovers around zero. That red line, the adaptation rate, needs to revert to its zero mean in about three quarters. The agents in the economy cannot adjust their balances any faster.

So, multiply the blue and the red line, and the result should be trending closer to zero, reverting to zero much faster. It is not. The yellow line is a proxy for that second differential, and it is widely out of the region of sustainability.

Let's try hyperbolic discounting, shall we?
Since we work with ratios, the total amounts are unitized to 1, year over year.

Now we expect one sector of the economy to contribute 2/3 of growth, so in the short term, the total economy shrinks by 1 - 2/3 to support a growth of 1 + 2/3. That ratio comes to:
(1 - 2/3)/(1 + 2/3) = 0.2, and that would be the loans-to-deposit ratio needed to support Obamacare growth. What is the loans-to-deposit ratio at the Fed? 1.0, and that number has been holding for ten years, folks.

How fast will loans to deposits change? That number, the first differential, is 1 - ratio^2. So for the effect of Obamacare, we expect loans to deposits to change by 0.96; in other words, the second differences need to be almost the entire ratio. But at the Fed that differential is 1 - .99^2, essentially zero. Loans to deposits at the Fed are not changing to accommodate Obamacare. These differentials are all calculated with respect to the Obamacare variable. Hyperbolic discounting tells us something is about to break because of Obamacare.
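The hyperbolic-discounting arithmetic above, as I read it (the ratio playing the role of tanh, and 1 - ratio^2 its first differential):

share = 2 / 3                         # Obamacare's expected share of growth
ratio = (1 - share) / (1 + share)     # the implied loans-to-deposit ratio
print(ratio)                          # 0.2

print(1 - ratio**2)                   # 0.96, the first differential that would be needed
print(1 - 0.99**2)                    # ~0.02, what the Fed's actual ratio near 1.0 gives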

If Obamacare spending does not settle down, the economy will shrink.


Alaskans are moving to Anchorage

Vox gives us a nice map of where we moved to and from. That blue area in Alaska is Anchorage; Alaska is concentrating in that spot. Not a bad idea as the oil runs dry.

Otherwise we are moving west and south. Florida, the retirement state, is growing. California is breaking even, and Texas is gaining the most. The entire Northeast continues its slow decline. The dark blue, north center, are the frackers.

Friday, March 27, 2015

Talking about the hyperbolic conservation constraints

This curve is:
tanh * tanh' + 1/2 tanh'' = 0
This is the Hamiltonian for the TOE.
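For what it's worth, this constraint is an identity of the hyperbolic tangent, which is why it can hold everywhere along the curve: since tanh' = 1 - tanh^2,

\[
\tanh'' = \frac{d}{d\theta}\bigl(1 - \tanh^2\theta\bigr) = -2\tanh\theta\,\tanh'\theta,
\qquad\text{so}\qquad
\tanh\,\tanh' + \tfrac{1}{2}\tanh'' = 0 .
\]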

The curve is tanh'', composed of tanh * tanh'.  It is the environment that is imposed on the system when adapted.

This curve goes down to the uncertainty of the system and is in balance with the environment. That peak is where all the transcendentals are most accurate; their rational approximations get settled at 1.5 on this chart. But 1.5 is not a solution, mainly because the system needs to play the Wythoff game at that point where the combinations of events are maximum. Tanh is the amount of 'charge' the system supports for any integer point along the curve. Tanh' is equivalent to momentum.

To the left of that peak, the bubbles are crowded and want to migrate to tanh = 0; to the right, tanh goes to one and there is no gain from an exchange, since sinh and cosh are about the same. The effective quant of the electron is 1.5, mainly because the quarks bounce it between 1 and 2.

Where is that peak relative to the nucleus? I am not sure, actually. But I am pretty sure angles count from the center of compaction, starting at zero. But I have gotten mixed up on that many times, so beware.

I think the environment to the right of the peak is actually decompressed relative to the nominal vacuum; it is the positive charge region from the proton. To the left of the peak, the tanh function is very linear with the angle and tanh' is maximum. So the gluons, as you see, have a lot of room between zero and one to set the environment, always near the vacuum uncertainty. This collection is an Avogadro of bubbles, and the electron is about 10e14 bubbles, so there are a lot of Wythoff moves going on, but they all obey Wick's theorem.

Positrons can slip over the peak going from left to right. But they are quant one stuck in a quant two environment; they need to decompress, so their tanh' flips sign and they appear positive. They do not meet the constraint equation above.


A note on anti-particles

In a compressed environment, the Wythoff players want to get to a cold position. In a decompressed environment they want a hot position. Anti-particles exist when the environment has been decompressed below ground by a magnetic field; it traps electrons, I suppose, and that decompresses the vacuum. A neutral vacuum has as many hot positions as cold. The nominal vacuum we have has more hot positions than cold.

Moving to a cold position is an increase in the elasticity of interactions, and vice versa. I think I have that. More elasticity means more overlap, meaning tanh is closer to zero.

A quick note on running time backwards

The difference between the Feynman diagrams and the hyperbolic approach is that the hyperbolic approach assumes the system is already adapted, or encoded; it is adiabatic. So actions are always adiabatic. This has always been my approach. So a sequence of two actions, the second evolving from the first, will produce a unique carrier between the two vertices in the Feynman diagram, and that carrier is already prepped to happen simultaneously and independently of the two sequences on either side of the Feynman connection. So there is no need to introduce time at all; all sequences already occur in the proper order.

The positron, for example, is only observable when the physicist breaks the adiabatic condition.  I don't think anti-particles can survive. I will get into that sometime, I have not worked the whole thing out.