Sunday, May 31, 2015

The work involved in finding a notch on the yardstick

Probability, pure math, physics, and statistics all have equivalent functions called work, energy, rate, and ratio. They transform into frequency power spectra and probability distributions.

All these things seem to reappear across fields because that is what it takes to optimize a finite yardstick which supports a ring: matching precision along the notches, allocating white space and notch space, matching actions to significance, making sure the log is equally accurate everywhere, getting a good estimate of Pi. It's all hard work, and somebody's got to do it.

Can the bots learn to cook?

Certainly, if the bot had a representative sample of actions in the kitchen. I think it could digest recipe books and learn the relationship between actions in the kitchen and the recipes. It might need descriptive word trees to encode the kitchen actions, a sort of aid. But assume not, assume that it has mechanical sequences that match each recipe. It should be able to read the recipe book and identify the action sequence, by entropy matching.
Can the bot discover a circle? Give it the XY coordinates, a representative sample. It will reduce precision until it gets an encoding tree of rank greater than one. It should discover the tree is perfectly symmetric, and construct a code that is linear in angle, a circle. Given a set of unknown data, the bot builds the encoding tree, then likely bounces that graph against a set of common graphs in its base, to find out how unique it is.
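Here is a minimal sketch of the circle case, the way I picture it; the sample size, noise level, and grid cell size are all made up for illustration. Quantize the samples, count cell occupancy, build a Huffman tree over the occupied cells, and check whether the code lengths come out nearly equal, which is the symmetry that says the code is linear in angle.

```python
import heapq, itertools, math, random

# Fake "unknown data": noisy samples from a circle (radius and noise are arbitrary).
pts = [(math.cos(t) + random.gauss(0, 0.01), math.sin(t) + random.gauss(0, 0.01))
       for t in (random.uniform(0, 2 * math.pi) for _ in range(20000))]

# Reduce precision: snap each sample to a coarse grid cell and count occupancy.
cell = 0.25
counts = {}
for x, y in pts:
    key = (round(x / cell), round(y / cell))
    counts[key] = counts.get(key, 0) + 1

# Build a Huffman encoding tree over the occupied cells.
tie = itertools.count()
heap = [(n, next(tie), key) for key, n in counts.items()]
heapq.heapify(heap)
depth = {key: 0 for key in counts}          # code length per cell
members = {key: [key] for key in counts}    # cells sitting under each subtree
while len(heap) > 1:
    n1, _, k1 = heapq.heappop(heap)
    n2, _, k2 = heapq.heappop(heap)
    merged = members[k1] + members[k2]
    for k in merged:
        depth[k] += 1                       # every merge adds one bit to its leaves
    members[k1] = merged
    heapq.heappush(heap, (n1 + n2, next(tie), k1))

lengths = sorted(depth.values())
print("cells:", len(lengths), "code lengths min/max:", lengths[0], lengths[-1])
# A near-symmetric tree (min close to max) is the hint that the encoding is linear in angle.
```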


This is the crossover candidate


DAVENPORT, Iowa (AP) -- Former Maryland Gov. Martin O'Malley entered the Democratic presidential race on Saturday in a longshot challenge to Hillary Rodham Clinton for the 2016 nomination, casting himself as a new generation leader who would rebuild the economy and reform Wall Street.
"I'm running for you," he told a crowd of about 1,000 people in a populist message at Federal Hill Park in Baltimore, where he served as mayor before two terms as governor. O'Malley said was drawn into the campaign "to rebuild the truth of the American dream for all Americans."

He would do well; he is a government executive first. He is like Brown, mostly managerial, and only secondly worried about party politics. So I don't care that he has a progressive bent; he is going to be pressured, and when that happens he does the work.

Hillary will always resort to script reading; she is the worst government manager. I would pick Walker over her for the same reason: Walker had to, sooner or later, be managerial in Wisconsin.

Dealing with the equipartition

We show any bijective set decomposition meets equipartition when the two period conditions are met. Dip into the Fisher metric and show no gain in significance from taking sets three at a time. These are Shannon bandwidth conditions, and we get hyperbolics straight away.

Then the standard physics of energy, momentum, work, acceleration and mass: those laws are conditions imposed by the finite three dimensional system. Mostly the Hamiltonian is a condition on the frame indices. They arise because P/Q and Q/P do not converge uniformly, as they are rational. Spacetime physics is all about aligning the frame, comparing it to the quantized system, then realigning the axes using the laws. Doing the relativity fix where p/q and q/p suddenly get more accurate.

No real multiplicative field, it's all locally additive. That is minimum redundancy: it keeps the bandwidth matched, which is maximizing information or significance, or meeting Ito conditions, or making Riemann smooth enough.

The TOE is a big thing

This will be one of the top two or three events that mark civilization here, this is a leap. It changes our entire world view, makes honesty almost mandatory, creates brilliant artificial intelligence. If you took a survey of all intelligent life in the cosmos, the one category you want is, did they find the TOE. Finding the TOE marks the jump to super intelligence.

Combinatorics allocation and optimal flow

Matilde, my new hero, got me thinking about the Fisher metric; it measures the change in data significance between coordinates on the hypothetical Riemann surface.

The number of combinations seems to be the root of significance. Mix things one at a time, two at a time, or three at a time. In the two period model, one side contracts the number of combinations, the other expands. So we have an allocation problem: what is the maximum entropy assignment of basis set algebras that makes a smoother Riemann surface? That means a stickiness, an elasticity ratio, a number of combinations left over, the Solow growth residual!

Customers make combinations taken two at a time. Inventory has to match, with one combination to spare. We get combinations of input and output. We still get a queue. Some customers do one period, some do three periods. But inventory refuses to do three period combinations; they are not available except at a low price. One-perioders are no problem, a better profit. But the two-perioders form the bulk of its operations.

Set and number of elements are controlled by the overlap function, which is quantized.  We restrict ourselves to quantum entanglement.

Lagrange himself

We have to prove we make a σ-algebra:

The main use of σ-algebras is in the definition of measures; specifically, the collection of those subsets for which a given measure is defined is necessarily a σ-algebra. This concept is important in mathematical analysis as the foundation for Lebesgue integration, and in probability theory, where it is interpreted as the collection of events which can be assigned probabilities. Also, in probability, σ-algebras are pivotal in the definition of conditional expectation.
In statistics, (sub) σ-algebras are needed for a formal mathematical definition of sufficient statistic,[1] particularly when the statistic is a function or a random process and the notion of conditional density is not applicable.


So the overlapping bubbles need this:

Let X be some set, and let 2^X represent its power set. Then a subset Σ ⊂ 2^X is called a σ-algebra if it satisfies the following three properties:[2]
  1. X is in Σ.
  2. Σ is closed under complementation: If A is in Σ, then so is its complement, X\A.
  3. Σ is closed under countable unions: If A1, A2, A3, ... are in Σ, then so is A = A1 ∪ A2 ∪ A3 ∪ ….
From these properties, it follows that the σ-algebra is also closed under countable intersections (by applying De Morgan's laws).
I think the bubbles get this for us. We do not need them spherical, we need each bubble conserved.
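A toy check of those three properties on a finite set, just to make the definition concrete; the example collections below are mine, and for a finite X the countable unions reduce to pairwise unions.

```python
from itertools import combinations

def is_sigma_algebra(X, sigma):
    """Check the three properties on a finite set X (countable union -> finite union)."""
    sigma = {frozenset(s) for s in sigma}
    X = frozenset(X)
    if X not in sigma:
        return False                                   # 1. X is in Sigma
    if any(X - A not in sigma for A in sigma):
        return False                                   # 2. closed under complement
    for A in sigma:
        for B in sigma:
            if A | B not in sigma:
                return False                           # 3. closed under (finite) unions
    return True

X = {1, 2, 3, 4}
coarse = [set(), {1, 2}, {3, 4}, X]                    # generated by the partition {1,2} | {3,4}
power_set = [set(c) for r in range(len(X) + 1) for c in combinations(X, r)]

print(is_sigma_algebra(X, coarse))                     # True
print(is_sigma_algebra(X, power_set))                  # True
print(is_sigma_algebra(X, [set(), {1}, X]))            # False: complement {2,3,4} is missing
```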

Saturday, May 30, 2015

Mish points us to a remarkable economic paper.

The future of unemployment.

This has TOE written all over it. They essentially do spectrum matching, finding the spectrum of actions for any task. Then finding the quantum entanglement with the aggregate Riemann surface. That gets them the probability of any set of actions over a Cobb-Douglas growth.

Take away for economists.
The Cobb-Douglas is being decomposed into a spectrum of activities in a basis of the authors' choosing. Each dimension of activity defines an integer index for the number of combinations this activity generates. That is spectral decomposition of an adapted ensemble. It finds the digit system based on minimizing the spectrum, the redundant actions.

NowCast coins!

I talked about the no arbitrage bot that runs bets on the NowCast output. It pays off by the most probable outcome relative to the actual NowCast, pays out all the dollars, and earns ad space revenue.

But if the players are online anyway, they can choose to take either dollars or NowCast coins off the table. Those become zero bets with zero probability; banker bot is still no arbitrage.

Why would players take NowCast coins off the table? To buy NowCast products, sold by vendors in NowCast points. If this were RetailCast, betting the accumulation of retail growth and decay, then all retail vendors could immediately quote in retail points and gain all the money transaction costs.

Banker Bot always runs double entry accounts, cash in advance on one and cash on pay-off for the other, enough to generate fermion and boson transactions, and make the maximum likelihood betting graph. So NowCast coins would rapidly become the dominant unit of account. They become the Fed target.

My new hero, the yardstick physicist

Caltech mathematician Matilde Marcolli and her graduate students. Their research says:

The Ryu-Takayanagi formula relates the entanglement entropy in a conformal field theory to the area of a minimal surface in its holographic dual. We show that this relation can be inverted for any state in the conformal field theory to compute the bulk stress-energy tensor near the boundary of the bulk spacetime, reconstructing the local data in the bulk from the entanglement on the boundary. We also show that positivity, monotonicity, and convexity of the relative entropy for small spherical domains between the reduced density matrices of any state and of the ground state of the conformal field theory, follow from positivity conditions on the bulk matter energy density. We discuss an information theoretical interpretation of the convexity in terms of the Fisher metric.
I have heard of this Fisher metric before.
It tells us how the significance of x changes when we count x on some smooth surface having a coordinate system. This is what got her the TOE job, she had the idea of optimum divergence. But we can skip the smooth surface, can't we, and go straight to bandwidth matched. Go from probability to spectrum. Use the Shannon idea that any large collection can be broken into a series of bisected sets, which are inclusive in one direction. Make that a sampler for the complete sets and the incomplete sets, and get an index on both. The second sequence of sets is the round off error, and makes the fermion. Define boson and fermion by capacity, in units of relative sample rate, the sample rate being convergent in one and divergent in the other.

Once you have conditions on matching sample rates, go to interactions of finite elements, described by their 'chemical potential', otherwise known as bubbles overlapping while compressed. Everything is local, and you go straight to hyperbolics, skipping the information part altogether.

The requirement on set distribution is sufficient to make a stationary Lagrange point, which is at least elliptic. We get our Riemann grid. In other words, Lagrange theory is half done, I think. Then set distribution is maintained by quantum entanglement, equivalent to sample rate matching, adjusting the bisecting point from one set arrangement to the next. Straight to constrained flow, get to constrained flow ASAP, then write the book: Having Fun with Finite Hyperbolics. We get Winnie Cooper's version of the TOE.

The paper begins with this:

One important development in this direction was the proposal of Ryu and Takayanagi [1, 2] that the entanglement entropy (EE) between a spatial domain D of a CFT and its complement is equal to the area of the bulk extremal surface Σ homologous to it

Now this seems like irrational approximation with rational ratio.

But anyway, I would not go near my hero until I have completely digested this paper.

Friday, May 29, 2015

Can we bet the NowCast machine?

Anybody taking bets on the output? That is a great betting tool: players monitor the NowCast queue, consisting of reports. This is something a Banker Bot should be able to do, take no-arbitrage bets on the price of a NowCast output. Bettors can bet on the future NowCast. BankerBot makes no hedge on the NowCast output, it only reports the most probable price distribution of NowCast bets. It pays off so as to stay neutral; its cost is 25 cents. I get the ad revenue. A venture capitalist guarantees me 20% of the ad revenue and I would write a banker bot.

I would set Nmax, the number of quants. Create Nmax queues, the nodes on a Huffman encoding tree. Send the next price to the middle queue, and work left or right to the price. That queue computes the most likely price, using the second difference. At some point NowCast reports, and all current bets are paid. Cash in advance on bets.

But if the bot just treated all bets as savings and loans, with the top node having the most likely price, then each bet causes a rate change. Each queue can have savings and loan accounts with the queue upstream, and changes its savings and loans when rates change upstream.

The actual NowCast comes in and it becomes the exact price. That rescales the probability graph, where we have eight betting queues (nodes), each with a probable price. The eight queues earn one unit, so the dollars bet are split between them according to the most probable.
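A minimal sketch of the settlement step as I read it; the eight queue prices, the bet amounts, and the distance weighting are all assumptions for illustration. When the actual NowCast prints, each queue gets a weight that falls off with its distance from the actual value, and the whole pot is split in proportion, so the banker stays neutral.

```python
import math

# Eight betting queues (nodes), each holding a candidate price and the dollars bet on it.
queues = [
    {"price": 1.8, "bets": 40.0}, {"price": 2.0, "bets": 75.0},
    {"price": 2.2, "bets": 120.0}, {"price": 2.4, "bets": 90.0},
    {"price": 2.6, "bets": 60.0}, {"price": 2.8, "bets": 30.0},
    {"price": 3.0, "bets": 15.0}, {"price": 3.2, "bets": 5.0},
]

def settle(queues, actual, scale=0.2):
    """Split the whole pot across queues by closeness to the actual NowCast print."""
    pot = sum(q["bets"] for q in queues)
    weights = [math.exp(-abs(q["price"] - actual) / scale) for q in queues]
    total_w = sum(weights)
    return [round(pot * w / total_w, 2) for w in weights]

payouts = settle(queues, actual=2.3)
print("pot:", sum(q["bets"] for q in queues))
print("payouts:", payouts)     # sums back to the pot, so the banker carries no exposure
```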

Frequency can be misleading



According to Planck, each energy element (E) is proportional to its frequency (ν):

E = hν, where h is Planck's constant.

Max Planck is considered the father of the quantum theory.
OK, fine. But Newton defined frequency as the infinitely sub-dividable number of events, a scalar. When things become finitely divisible, then frequency becomes a quant, a slightly different thing. This can be confusing for a probability theorist coming straight from Newton land, because that person will say, no, frequency must be squared in this case, it is used as a rate of energy transfer. In fact, quantum physicists have to define frequency as change over some small dx*dt.

All Planck's constant does is scale one unit of proton quant to engineering units when the frame of reference is directly perpendicular to the surface of maximum divergence. It is the common prime multiple of all the energy modes in the proton.
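For the engineering-units point, the textbook conversion; the 500 THz frequency is just an example I picked, roughly green light.

```python
h = 6.62607015e-34          # Planck's constant, J*s
nu = 5.0e14                 # an example frequency, Hz (~green light)
E = h * nu                  # energy of one quantum, E = h*nu
print(E, "J =", E / 1.602176634e-19, "eV")   # ~3.3e-19 J, ~2.1 eV
```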


Gravity and fermion statistics

My guess:

Gravity is a standing wave, intermixed with fermion statistics, going from earth to the L1 spot with the Sun. The earth has the same kinetic energy it had when it was a band of proton dust in the original cloud. The tendency is to remove redundant interactions. So the protons coalesce, but the proton, being sufficiently accurate, accommodates the line of symmetry created. That creates the quantum entanglement at L1, the spot where interaction probability goes to zero. We get optimum queueing up to L1, and that means fermion exchanges optimally queue up to L1. Proton distribution. That channel is impedance matched, up to the accuracy of the protons in organizing themselves. There was never gravity in the spacetime sense, because there was never a spacetime. All the kinetic energy just got reorganized. The protons simply keep the kinetic spectra matched within precision.

Le Grisbe is le root of le evil in man

The Silk Road quickly became massively successful and extremely profitable: so much so that Ulbricht promptly forgot the idealism that made him launch the project and quickly subverted the power and wealth it provided him for his own selfish ways, among which ordering the assassinations of subordinates who crossed him.

The pros are coming to town on the Theory of Everything!

The pros from Caltech are putting everything in order. Caltech mathematician Matilde Marcolli and graduate students Jennifer Lin and Bogdan Stoica will be doing the work. I checked out their previous work, it's good, they know their stuff.

One change: I would dump information theory and go with hyperbolic queuing over finite networks instead; that lets them adjust the liquidity factor that I have talked about, as in Cosh^2 - Sinh^2 = Q, where Q may deviate from one.

The goal here is to unify Hurwitz rational approximation, constrained flow, and the Shannon equation. Then Markov falls into place. That gets us through queuing on graphs and group theory.

Hat tip to Thoma's website, Economist's View. He does the work to find this stuff, probably should get a grad student or two involved.

Mohammed cartoon time!

Phoenix (CNN) Jon Ritzheimer is a former Marine, and he has no middle ground when it comes to Islam.
His T-shirt pretty much says it all: "F--- Islam."
Ritzheimer is the organizer of Friday's "Freedom of Speech Rally" outside the Islamic Community Center in Phoenix.
It's the mosque that Elton Simpson and Nadir Soofi attended for a time. They're the men who drove from Arizona to a Dallas suburb to shoot up a Prophet Mohammed cartoon contest there. Both were killed by police early this month.
Many Muslims consider demeaning depictions of Mohammed to be blasphemous and banned by Islamic law.

Thursday, May 28, 2015

Baltimore gone haywire

CBS:
“It’s so bad, people are afraid to let their kids outside,” Perrine said. “People wake up with shots through their windows. Police used to sit on every corner, on the top of the block. These days? They’re nowhere.”
West Baltimore residents worry they’ve been abandoned by the officers they once accused of harassing them, leaving some neighborhoods like the Wild West without a lawman around.
“Before it was over-policing. Now there’s no police,” said Donnail “Dreads” Lee, 34, who lives in the Gilmor Homes, the public housing complex where Gray, 25, was chased down. “People feel as though they can do things and get away with it. I see people walking with guns almost every single day, because they know the police aren’t pulling them up like they used to.”
Police Commissioner Anthony Batts said his officers “are not holding back,” despite encountering dangerous hostility in the Western District.
“Our officers tell me that when officers pull up, they have 30 to 50 people surrounding them at any time,” Batts said.
Batts provided more details at a City Council meeting Wednesday night, saying officers now fear getting arrested for making mistakes.
“What is happening, there is a lot of levels of confusion in the police organization. There are people who have pain, there are people who are hurt, there are people who are frustrated, there are people who are angry,” Batts said. “There are people, and they’ve said this to me, `If I get out of my car and make a stop for a reasonable suspicion that leads to probable cause but I make a mistake on it, will I be arrested?’ They pull up to a scene and another officer has done something that they don’t know, it may be illegal, will they be arrested for it? Those are things they are asking.”

Wednesday, May 27, 2015

Lots of Republicans plan on a deficit spending binge

Hill: The Republican presidential field will swell to nine official candidates in the next week as three new contenders enter the race.
Former Sen. Rick Santorum (R-Pa.), who, in 2012, won the Iowa caucuses and finished second overall to eventual nominee Mitt Romney, is expected to announce his second consecutive presidential bid from Pittsburgh on Wednesday.
On Thursday, former New York Gov. George Pataki (R), a long shot, will most likely hit the launch button from New Hampshire. And Monday, Sen. Lindsey Graham (R-S.C.) will enter the race from his hometown of Central, S.C., becoming the fourth senator to throw his hat into the ring.
The trio faces an uphill climb in the fight for money, media, and top-level political staffers and advisers.
“This group doesn’t look like it has a real shot at becoming president, but they could be fighting for Cabinet or V.P. slots and can contribute to the debate in different ways by highlighting their views on issues like social conservatism and foreign policy,” said GOP strategist Ron Bonjean.

Redundant exchanges

Liquidity, spare exchanges, it's like impedance; local storage of sets, available for exchange.

Cosh^2 - Sinh^2 = Q

We call Q the spare exchanges at any quant number i. It causes the shift by a quant in the hyperbolic angle, and it appears as the ratio on tanh'' in the flow constraint. When Q = 1 the ratio on tanh'' is 1/2, that is adapted, and the redundant exchanges are the Shannon clock.

When Q > 1, we get mass at i < 1.  There are redundant  exchanges, but not enough to generate a wave.  We get fermion statistics.

Tuesday, May 26, 2015

Banker Bot to the rescue

Skift: It used to be that saving up 25,000 award miles almost guaranteed a passenger a free domestic plane ticket.
In an era of stingier loyalty programs and fewer cheap award seats though, it’s getting extremely difficult to put those 25,000 miles to good use. As a result, some passengers have turned to external help to find their ideal award tickets.

Award booking services have been around for years, but a more active points economy and an increasingly complex airline industry have slowly put them back in the spotlight.
The points economy has been driven on the supply side primarily by points-based credit cards and the big banks behind them. With a new points-backed product or promotion launching every week, consumers are constantly sold on a variety of credit cards that promise enormous signup bonuses and a continuous stream of inbound points based on spend.

We need a new company to handle all the frequent flyer miles. Then let Banker Bot manage savings and loan rates on the frequent flyer points while watching flight congestion. An intelligent banker bot can keep flights optimally filled and waiting lines small. If airlines get freaked, they have the option of buying and selling them on the open market. Airline executives and their staff should be capable of making the trade between holding flyer points and booking passengers.

Skift is tracking and reporting on the new complementary money in the travel industry. Here is part of their work:

Executive Summary
Loyalty marketing is a multi-billion dollar revenue stream for the travel industry and the programs, as measured in terms of total number of enrollments, are more popular than ever. But changes to the industry landscape and changes in consumer behavior have many observers questioning if they still serve their original purpose: building connections with customers. This question is all the more urgent as new technology tools give travelers access to better information about their travel experience, opening up more information about everything from prices to customer reviews than ever before. This information is also creating new learnings for travel marketers as well, who are gaining more ability to selectively personalize the travel experience and identify pain points as they happen.
While the “points for rewards” loyalty paradigm still rules, there are plenty of new loyalty strategies that are challenging this model. A tidal wave of data, coming from customer mobile phones, social media and digital behavior on booking websites offers new opportunities to better personalize services, surprise and satisfy customers. Meanwhile, this huge wealth of information is also forcing many companies to rethink how and why they are deliver rewards to loyal customers, becoming more transparent and responsive in the process. Those companies that manage to incorporate this wealth of travel data into their loyalty offering stand to reap significant rewards, both in terms of customer perception, but also in terms of their bottom line.

A note on rates and the two period model

The two period rate on the ten year is about 23%. That is, one interest payment made at the ten year point, or 2.1% times 10, roughly. Take (1+0.23)^(1/10) and get a good idea of the yearly rate.

The hyperbolic two period rate is set for that particular term period.  So be careful when interpreting rates.   But the hyperbolic banker has no knowledge of time, time is set by the humans when they space themselves along the yield curve. I really did not make that clear, and in fact ignored it in a previous post.
For bankers everything comes down to:

D(1+d)^2 - L(1+l)^2 = Q, where Q is the liquidity. It's a two period model; we impose the rule of no imaginary numbers, and the deposits are the derivative of the loans and vice versa. If you want this in hyperbolic form, take the ratio, root it, and get the hyperbolic angle. If the banker imposed fixed terms then Q has to vary for each term.
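A small worked example of both halves of this note, under my reading of it; the 23% figure is the one quoted above, and the liquidity and angle values are illustrative.

```python
import math

# Annualize the ten-year two-period rate quoted above (~23% total over ten years).
two_period = 0.23
annual = (1 + two_period) ** (1 / 10) - 1
print("yearly rate ~", round(annual * 100, 2), "%")          # ~2.1%

# Hyperbolic form: write the deposit and loan legs as sqrt(Q)*cosh(m), sqrt(Q)*sinh(m).
Q, m = 1.5, 0.8                                              # illustrative liquidity and angle
D_leg = math.sqrt(Q) * math.cosh(m)                          # D*(1+d)
L_leg = math.sqrt(Q) * math.sinh(m)                          # L*(1+l)
print("D^2 - L^2 =", round(D_leg**2 - L_leg**2, 6))          # recovers Q

# Recover the hyperbolic angle: take the ratio (D+L)/(D-L) = e^(2m) and root it.
ratio = (D_leg + L_leg) / (D_leg - L_leg)
print("recovered angle:", round(0.5 * math.log(ratio), 6))   # = m
```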



The liquidity premium in one chart

Why is the yearly rate on a long term bond higher than the yearly rate on a short term bond? Because the borrower pays a price for avoiding the chart on the left.

That chart is from Merrill Lynch’s US economist Ethan Harris. The one year lender has to figure out the one year demand for money in the face of this huge seasonal swing in money demand.


Who pays for that one year cycle? Mostly the liquidity cost is paid for at the ten year rate. The rate curve is not nearly so steep from ten to thirty.

Monday, May 25, 2015

Light motion as a bandwidth adapted aggregate

I thought that light should move even if no one was looking and there was no time or space. So, light has to move via the adaptation process, a process trying to match exchange rates between a bundle of hot positions and the density of cold positions in the vacuum; using terminology from the game of Wythoff.

Fake Chart
So, I made a fake power spectrum, using the distribution of Coth and Tanh, derived from their second derivatives.  Coth are cold positions, they  exchange at half the rate of the hot positions. Cold in red, hot in blue; and they look like fermion and boson statistics.

Now in this model, the vacuum of space has cold positions, but when a density of hot positions is dumped into the vacuum, the bandwidth equalization process takes over. Cold positions are redistributed such that their probability is four times the probability of the hot positions. In the chart I did this by shifting the X and 1/X exponents to simulate charge. But since the probability distributions of fermion and boson are asymmetric, there will not be a complete match. At some point the cold positions will have more bandwidth than the hot. That point is the line of symmetry over which the hot positions will move. Other hot positions, with high quant, greater exchange rate, will circle back.
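Here is my attempt to reproduce that fake chart, assuming the distributions are just the absolute second derivatives of tanh and coth; the charge shift is left out.

```python
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(0.2, 4.0, 400)           # stay off x=0 where coth blows up

tanh_dd = np.abs(-2 * np.tanh(x) / np.cosh(x) ** 2)                  # |tanh''|, the "hot" curve
coth_dd = np.abs(2 * (np.cosh(x) / np.sinh(x)) / np.sinh(x) ** 2)    # |coth''|, the "cold" curve

plt.plot(x, tanh_dd, "b", label="hot: |tanh''|")
plt.plot(x, coth_dd, "r", label="cold: |coth''|")
plt.legend()
plt.title("fake power spectrum from second derivatives")
plt.show()
```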

I am close here, but I am in no position to try out actual numbers.  However, I am almost certain that this is the method used when the modern physicist dumps time and distance.

Let's try some X axis rules

Rule 1: The banker X axis and Planck's X axis are the same, they count wavelength (term length) low to high going right.

Rule 2: Mathematicians count frequency, low to high going right, the same as the trig and hyperbolic exponents count.

So banker and Planck count quants Nmax to 0 going left to right, and their negative complements, -Nmax to 0, the same. But atomic physicists count energy quants low to high, going right, the opposite of Planck and bankers.

Hyperbolics count the one period rate as e^(-2N), where N is the quant number.   So two period rates go low as the quant goes high, for the tangent function. But loans use either cash in advance or they project two period in reverse time.
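Just evaluating that rule for the first few quant numbers:

```python
import math

for N in range(6):
    print(N, round(math.exp(-2 * N), 6))   # quant number, one period rate e^(-2N)
```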

There, but I am sure I will mix this up again.

Stanley Fischer said what?

HERZLIYA, Israel—Federal Reserve Vice Chairman Stanley Fischer said Monday the central bank expects to follow a “gradual and relatively slow” trajectory of short-term interest-rate increases over the next three to four years to bring borrowing costs back to “normal” levels.
Mr. Fischer said observers focus too much on when The Fed will start raising its benchmark short-term rate from near zero, and instead should think more about where interest rates are headed over time. He said Fed economists expect the rate will reach from 3.25% to 4% in three to four years.
“There is so much importance given to the first move. But I think it’s misleading,” said Mr. Fischer in a lecture at the Interdisciplinary Center Herzliya, a college in a suburb outside Tel Aviv.
Mr. Fischer, who served as chief of Israel’s central bank for eight years before becoming the No. 2 U.S. central banker, said the coming Fed rate increases “will be a gradual process.”
He said it would not be like the relatively rapid and predictable path of Fed rate increases from 2004 to 2006, when the benchmark rate rose by 0.25 percentage point at each of 11 consecutive monetary policy meetings.

That is an impossibility. That means interest costs will grow by about 18% per year for four years, doubling the interest costs in DC when the ten year is 5.25%. A Republican Congress cannot even budget anything close to that.

Sunday, May 24, 2015

My favorite sequence

Sum of (2/3)^n and sum of (1/3)^n

These power series sums make 2 and 1/2, so we have:

cosh^2-sinh^2 = 3/2  Spare capacity

I call this the optimum queuing model, it keeps two queues. One queue has 1 or 2 in line at a time, the other 0 or 1. Phi^16 ≈ (3/2)^19. Now Phi is just a numerical tool, in this case it stands for the ratio of two Fibonacci numbers.
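A quick numeric check of the sums and the Phi relation, summing from n = 1, which is how I read it:

```python
import math

hot = sum((2/3) ** n for n in range(1, 200))    # -> 2
cold = sum((1/3) ** n for n in range(1, 200))   # -> 1/2
print(hot, cold, hot - cold)                    # spare capacity 3/2

phi = (1 + math.sqrt(5)) / 2
print(phi ** 16, (3/2) ** 19)                   # ~2207 vs ~2217, about half a percent apart
```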

Anyway, to my thinking, 2/3 and 1/3 are the Boson and Fermion exchange rates when there is no motion. These are the conversion ratios when the bubbles are separated into the cold and hot positions of Wythoff. For Fermions, the cold spots outnumber hot spots 2 to 1. Two times as many bubbles have no desire to go overlapping.

Now the 13th Fibonacci number is 377, about where the spare capacity is near the fine structure. That should not surprise us, the hyperbolics are simply a useful tool to manage power series. And Phi is just an estimation tool. So any reduced physics model will have lots of matches to some power series tool. But the adapted bandwidth requires sinh = cosh', and vice versa, so it supports the impedance model. The tool is built around bandwidth adapted systems, the two period constraint.

How does the system make (2/3)^n match (1/3)^n?  The 1/3 and 2/3 series shift, that makes charge, and they  become an additive series.

Republican Communist Party watch

(Bloomberg) -- Republicans in the U.S. House and Senate say their budget proposals add up. It takes some creative math and logic to make that true.
The plans unveiled this week call for the U.S. government to collect about $2 trillion in taxes in the next decade that Republicans have little or no intention of collecting. Some of that revenue would come straight from taxes to pay for Obamacare -- which they want to repeal.
Republicans also gloss over details of where they’d cut more than $5 trillion to balance the books. Senate Budget Chairman Mike Enzi’s plan cuts $430 billion from Medicare without saying how. House Budget Chairman Tom Price’s proposal includes $1 trillion in “other mandatory” reductions that aren’t entirely laid out.
Budget analysts are criticizing the approach.
“While the goals put forward by the budget resolution are praiseworthy, the details are in some ways unrealistic and unspecified,” Maya MacGuineas, president of the bipartisan Committee for a Responsible Federal Budget, based in Washington, said in a statement.
The House proposal includes about $94 billion for a special war-funding account that isn’t subject to spending limits set by Congress in 2011. The Senate plan includes $58 billion in war funding, the same amount requested by President Barack Obama.
We balanced it under Bill Clinton. Put the sequester back, that was working fine.

Traversing from hyperbolics to yield curve

This part is mainly to clear things up because of the annoying habit of looking at the tanh curve and seeing it is shaped like a yield curve, and that is confusing.

I made a fake yield curve, out of scale, mainly to get the axis and units aligned. I converted the hyperbolic angle, the x axis, to term length. And the curve is not hyperbolic shaped because the hyperbolic angle is inverted and flipped, both. I used liquidity 1.5 as that seems to be the liquidity for the real curve up to the six year term. But I am still working that out.

When liabilities to assets are near one, that is the short end and rates are low.  Loan terms are short, no term premium so rates are low and liabilities high.  So, long term loan, the market carries more assets against the loan, large down payments required. I think I have this.
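A sketch of how I would generate that fake curve; liquidity 1.5 and the rate form r = Q*e^(-2m) come from the earlier posts, while the term grid and the angle-versus-term mapping are made up. The hyperbolic angle shrinks as the term grows, so the rate rises with term.

```python
import numpy as np
import matplotlib.pyplot as plt

Q = 1.5                                         # liquidity, from the post
terms = np.array([1, 2, 3, 5, 7, 10, 20, 30])   # years, an arbitrary grid
m = 3.0 - 0.35 * np.log(terms + 1)              # assumed mapping: angle shrinks with term

rates = Q * np.exp(-2 * m) * 100                # r = Q*e^(-2m), in percent
plt.plot(terms, rates, marker="o")
plt.xlabel("term (years)")
plt.ylabel("rate (%)")
plt.title("fake yield curve from the hyperbolic angle (Q = 1.5)")
plt.show()
```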


Saturday, May 23, 2015

Hillary created ISIS?

This report says so. But I tried to link through to real evidence and got not much more than innuendo.

What Secretary of State Hillary Dingbat created was chaos, chaos everywhere she went. That is all she does, create chaos.  Look over her entire career, almost all of it, from first lady to presidential candidate,  Dingbat Chaos. She is clueless, going through life reading from a script.

Liquidity and Forex trading

Zero Hedge brings it up. How much liquidity should a bot retain? Generally Phi if the market is adapting perfectly.  So for any two period model, a currency value may grow by G^2 and shrink by S^2, so G^2- S^2 = Phi.  That gives the market enough liquidity to pass the 'map' around as circumstances change.

Here is my thinking, take it or leave it.

1) All the moves we see in the markets are one period moves designed to stabilize a two period plan. This includes the bots.  If that were not the case then the market would blow up.

2) When a currency is gaining value then the market is doing the tanh function and the tanh flow constraints apply. Tanh gives you the down move over the up move and it should be less than one. The probability of that trade pair is given by (1/2)*tanh'', taken as positive (see the sketch after this list). The most likely trades are Pi/4 down and Phi*Pi/4 up. But the bot should track the probability curve and bet to make that valid.

3) When a currency is losing value then the market is doing the coth function, and the most probable trades are around 1/2 up and sqrt(5)/2 down. But use the (1/2)*coth'' curve, taken positive, for the most probable trades.

4) There is twice the market activity up than there is down, I think. This is the boson up and fermion down thing.

5) I am not sure how to scale these to real currency values.  Phi should be the signal to noise power ratio, if this market is adapting. Measure that SNR in units of the currency and you can scale all this.
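Here is the sketch of the two curves named in points 2 and 3, treated exactly as written: take (1/2)|tanh''| and (1/2)|coth''| and normalize them over a grid so they read as trade-pair probabilities. Scaling the x axis to actual currency moves is the part I have not settled, per point 5.

```python
import numpy as np

x = np.linspace(0.05, 4.0, 800)                 # hyperbolic angle grid, off the coth pole

up_curve = 0.5 * np.abs(-2 * np.tanh(x) / np.cosh(x) ** 2)                   # (1/2)|tanh''|, gaining value
down_curve = 0.5 * np.abs(2 * (np.cosh(x) / np.sinh(x)) / np.sinh(x) ** 2)   # (1/2)|coth''|, losing value

# Normalize each into a discrete probability over the grid.
up_p = up_curve / up_curve.sum()
down_p = down_curve / down_curve.sum()

print("most probable x (up curve):  ", round(x[up_p.argmax()], 3))
print("most probable x (down curve):", round(x[down_p.argmax()], 3))
```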

Frequency falls as the hyperbolic angle drops, is my latest contorted thinking, but I am still a bit disoriented about mapping the hyperbolics to real markets. But once the no arbitrage bots protect everyone's trade, then high frequency spoofing will not work. Zero Hedge wants the world to go back to fundamentals; the no arbitrage bots do that because they recognize only new information.



Sorting vectors

I have a collection of N vectors, of arbitrary dimension k. So they can be arranged as a k column by N row matrix. I want to sort them by significance, the vectors which most precisely describe the collection sorted higher.

Step one, find the mean/variance of each column. The column with the highest mean/variance is the most significant, it has the least noise. I pick some base b for log_b(1 + m^2/var) over all columns that makes the most significant column have a log at least one greater than the rest. I take the log of each element in the most significant column. I sort the rows based on that column. I split the matrix into two groups depending on whether the most significant bit is set in that column.

I then repeat this process for each of the two resulting matrices until there is no more sorting to do. Does this work?
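Here is a literal-ish sketch of the procedure, with one guess baked in: I read "most significant bit set" as splitting at the midpoint of the log values in the chosen column, and I stop when a group is a single row or refuses to split.

```python
import numpy as np

def significance_sort(M):
    """Recursively sort rows of M by the column with the highest mean^2/variance."""
    M = np.asarray(M, dtype=float)
    if len(M) <= 1:
        return M

    var = M.var(axis=0)
    score = np.where(var > 0, M.mean(axis=0) ** 2 / np.maximum(var, 1e-12), np.inf)
    col = int(np.argmax(score))               # most significant column: least relative noise

    order = np.argsort(-M[:, col])            # sort rows on that column, high first
    M = M[order]

    logs = np.log1p(np.abs(M[:, col]))        # stand-in for the element-wise log of that column
    split = logs >= (logs.max() + logs.min()) / 2   # my reading of "most significant bit set"
    top, bottom = M[split], M[~split]
    if len(top) == 0 or len(bottom) == 0:     # nothing left to separate on this column
        return M
    return np.vstack([significance_sort(top), significance_sort(bottom)])

# Example: three columns with different signal-to-noise.
rng = np.random.default_rng(0)
data = np.column_stack([
    10 + 0.1 * rng.standard_normal(8),        # high mean^2/var: picked first
    rng.standard_normal(8),                   # noise
    2 + rng.standard_normal(8),
])
print(significance_sort(data))
```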

Applying hyperbolics to the treasury curve

I simply took the existing rates and their logs. Then I subtract one log from the other. The hyperbolic angle is largest at the short end. I always get this mixed up, but now that I got off my rear and did the work it makes more sense. The largest hyperbolic angle is pi()/2, so we see that the first three rates are split evenly by some delta of the hyperbolic angle; they have the same liquidity. Then going past the knee of the curve, liquidity changes. But this is clear from looking at the curve, it is linear up to the knee. All this proves is that two period planning is likely the norm and that we change the market liquidity at the knee of the curve. It also trains me to orient myself better when trying to match hyperbolics to any aggregate system; it's not clear how to orient the thing until one actually works through a data sample.


Rate log diff
0.200 1.349 -0.221
0.610 1.107 -0.242
1.560 0.903 -0.204
2.210 0.828 -0.076
2.980 0.763 -0.065
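The table looks like it comes out of r = Q*e^(-2m), i.e. the middle column is an affine function of -0.5*log(rate); the base-10 log and the +1 offset below are inferred from the numbers, not something stated in the table.

```python
import math

rates = [0.200, 0.610, 1.560, 2.210, 2.980]            # treasury yields, percent

angles = [1 - 0.5 * math.log10(r) for r in rates]      # reproduces the "log" column
diffs = [round(b - a, 3) for a, b in zip(angles, angles[1:])]

for r, a in zip(rates, angles):
    print(f"{r:6.3f}  {a:6.3f}")
print("diffs:", diffs)     # ~[-0.242, -0.204, -0.075, -0.065], matching the last four diff rows
```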

Getting oriented then.
Hyperbolic angle is large at the short end of the curve, rates are lower, and loan/deposit is close to one. Loan/deposit decreases at the long end of the curve. But these are the aggregate numbers for the lending market. The problem is when lenders and borrowers individually deviate from the two period model. DC, for example. It acts like a member bank, but it segments the member banks into itself, with all the loans, and the real banks, which have all the deposits. Then at the long end, DC appears again, with all the loans and most of the deposits held by wealth. G fouls the whole mess up and causes liquidity crashes; the markets become unstable.

Friday, May 22, 2015

Liquidity and interest rates

Consider two cases: fixed term, non-tradeable deposit certificates and loans; and the other case, tradeable deposits and loans.

The non-tradable paper has a fixed market liquidity which is set to the standard value of 1.0. One means one transaction, essentially. This is the case where neither the lender nor borrower expects any material changes in the economy, a fixed term non-liquid market. At the end of the period, the depositor demands principal and interest, and vice versa for the lender. They both have to prepare to make the money good. Hence the two period planning model. Lender and depositor get one period to adjust their cash in preparation for the two period demand; this is the basic rule of adapted statistics.

Let's define an index, m, which identifies an ordering of markets for which loans and deposits are made with liquidity 1. Set deposits to D, loans to L, then we have:

D(m) is the amount due, principal and interest, over one period in market m. Do the same for loans, L(m). Then we apply the two period model and take the derivative with respect to a change in markets.

D(m)^2 - L(m)^2 = 1, and its derivative, D(m)D'(m) - L(m)L'(m) = 0.

All this says is that we are dealing with markets having liquidity constant 1, so this differential must be zero. And we see that D' = L and L' = D. Regardless of the liquidity value, stable markets have constant liquidity. So we get a standard result if the market liquidity is constant and two period planning is in effect.

The form of the deposits and loans

We are already hyperbolic, let's look:

deposit value = (e^m + e^-m)/2 and loan value = (e^m - e^-m)/2
So what is going on here is maintaining liquidity. Over one period we want the deposits to earn 1/2 of the liquidity requirement and loans should retain 1/2 unit of liquidity. So, in one period, we have a half unit earned on deposits and a half unit retained on borrowings. Thus, the required liquidity in an adapted network is met.

Next we deal with liquidity different than one. This is the case when the network has to adapt. Let the adaption liquidity be some Q. Then we have:
D^2 - L^2 = Q. So in one period, D has to earn Q/2 and loans reserve Q/2. But our derivatives still have to work, and they do according to my wxMaxima machine.

What are rates and balances?

D is the one period value, so:

b(1+r) = (e^m + Q*e^-m)* 1/2 = 1/2 * e^m *(1+Q * e^(-2m))

r= Q* e^(-2m)

b = 1/2 * sqrt(Q/r), with likely math errors 

So there is a fixed relationship between liquidity required and rates for any market m. For loans, use reverse time like they do with the anti-electron, or like cash in advance. I will work it out later.

So, the term period is not known, but there is a frequency relationship between the currency banker at m=1 and the rest of the markets. And for any given m, Q, the liquidity, is a scale factor to the value of deposits and loans. That is, any quantity of deposit and loan values can be simply accumulated as a batch of unitary loans, so scale does not yet come into play. But the number of deposit/loan pairs at any given m, relative to another m, will be constricted by the flow conditions. The probabilities of a deposit/loan pair across all m have to sum to one, and that probability is inherent in the loan/deposit ratios at any m.
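A quick symbolic check of the algebra above, with the liquidity written as Q the way the rate formula uses it:

```python
import sympy as sp

m, Q = sp.symbols("m Q", positive=True)

D = (sp.exp(m) + Q * sp.exp(-m)) / 2       # deposit leg, one period value
L = (sp.exp(m) - Q * sp.exp(-m)) / 2       # loan leg

print(sp.simplify(D**2 - L**2))            # Q: the liquidity is conserved
print(sp.simplify(sp.diff(D, m) - L))      # 0: D' = L
print(sp.simplify(sp.diff(L, m) - D))      # 0: L' = D

# Rates and balances: b*(1+r) = D with r = Q*e^(-2m) gives b = e^m/2 = (1/2)*sqrt(Q/r).
r = Q * sp.exp(-2 * m)
b = D / (1 + r)
print(sp.simplify(b - sp.exp(m) / 2))      # 0
print(sp.simplify(b - sp.sqrt(Q / r) / 2)) # 0
```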








Did QE lead to faster NGDP growth?

The blue line is growth in NGDP, the red is the Fed bond portfolio. I see NGDP growth stuck around 3%, maybe the Fed added a quarter point to that number.

Jeb Bush caught in a lie

Angry Bear: Questioned by a voter inside a sports bar about whether there is “space” between himself and his older brother on any issues, Bush offered a clear critique.
“Are there differences? Yeah, I mean, sure,” Bush said. “I think that in Washington during my brother’s time, Republicans spent too much money. I think he could have used the veto power — he didn’t have line-item veto power, but he could have brought budget discipline to Washington, D.C. That seems kind of quaint right now given the fact that after he left, budget deficits and spending just like lit up astronomically. But having constraints on spending across the board during his time would have been a good thing.”

The deficit when Obama took over was about 9%, after adjusting for Obama's stimulus act. After it increased a bit, Obama has reduced it steadily.
Jeb Bush is a liar.

Consumer prices flat over Q1

Here we are, lucky consumers with the CPI being the red line.  The grey bars are consumer prices minus the expensive stuff.

Thursday, May 21, 2015

Is the Fed to blame for price distortion?

Lawrence Lindsey, former fed governor says yes.
Market Watch: WASHINGTON (MarketWatch) — The Federal Reserve risks another bond market tantrum if it continues to hold off on a rate hike, a former U.S. central banker said Tuesday.
Lawrence Lindsey, who served at the Fed in the 1990s before joining the George W. Bush White House, said the central bank had delayed normalization of rates “way beyond what is prudent.”
“You would have been laughed out of the classroom” in graduate school if you proposed holding rates at zero with the unemployment rate at 5.4%, as the Fed is doing now, Lindsey said during a panel discussion on Fed policy at an event sponsored by the Peterson Foundation.
“At some point we’re going to get a series of bad numbers, showing a little higher inflation and the market is going to say ‘on my god, we’re so far behind the curve’ and force an adjustment that is going to be wrenching,” Lindsey said.
He predicted that the market disruption would be a “seven or eight” on a scale of 10, which 10 being the worst.
This risk could be mitigated if the Fed made some modest hikes now, he said.
The “taper tantrum” in the bond market occurred in the summer of 2013 when then-Fed Chairman Ben Bernanke switched gears and hinted for the first time of the eventual end of the third round of bond purchases, commonly known as QE3. After the comments, bond prices fell and yields spiked. The housing market arguably suffered the most as mortgage rates moved higher and potential buyers pulled back.
Lindsey said the Fed “has almost no credibility” with his clients about its ability to “stay on top of ticking monetary bomb.”
Maybe, but raising interest on deposits reduces the remits back to Treasury, and all of treasury debt costs rise.  Bernanke's favorite member bank could freeze up. So before raising the deposit rate, make sure Congress is well educated on the issue.

And Hillary Rodham, Secretary of State

That  would be Hillary's idea of foreign policy as nanny therapy.
Via Zero Hedge: Just hours after ISIS scored a significant victory in Iraq when it captured the town of Ramadi over the weekend, the first Iraqi town that had been actively defended by the US as opposed to just Iraqi troops, overnight ISIS also captured the ancient Syrian town of Palmyra, which the mainstream media promptly concludes was proof that the Islamic State's momentum was growing.
Around a third of the 200,000 people living in Palmyra may have fled in the past few days during fighting between government forces and Islamic State militants, the U.N. human rights office said on Thursday.
That's not all: according to Reuters, "extending its reach in the region, fighters loyal to the Sunni Muslim group have also consolidated their grip on the Libyan city of Sirte, hometown of former leader Muammar Gaddafi.

"ISIL has reportedly been carrying out door-to-door searches in the city, looking for people affiliated with the government. At least 14 civilians are reported to have been executed by ISIL in Palmyra this week," Shamdasani said in emailed comments.

Napoleon Bonaparte where art thou?

I am trying to be nice here, but it's hard. What we need, Mr. Fischer, is a clear understanding that having government hedge the currency banker is not really a good idea. Look at the loan/deposit balance on your sheet. You have 2.4T loaned to one member bank and all the rest have 2.4T on deposit. A clear indication that you need to re-shuffle your member banks, maybe dump the member bank with the 2.4T on the loan books.

Fed’s Fischer Calls for Greater European Fiscal Integration
Europe’s Economic and Monetary Union will “very likely” survive its current crisis, but greater fiscal integration is needed for the eurozone’s future, the U.S. central bank’s No. 2 official said Thursday.
“The decision to use the single currency to drive the European project forward was a risky one, and at some stage or probably in several stages, it will be necessary to put the missing fiscal framework into place,” Federal Reserve Vice Chairman Stanley Fischer said in remarks prepared for delivery at a conference hosted by the European Central Bank in Sintra, Portugal.
Mr. Fischer said past setbacks and crises for the post-World War II European integration effort have “spurred policy makers to take steps that they might not otherwise have taken at that time, and the end result of those steps has been a more unified European monetary union.”
Greater monetary integration in the form of the euro’s 1999 introduction “until recently seemed to be a major success,” Mr. Fischer said, and “in turn made crystal clear the need for more fiscal integration.”
Mr. Fischer expressed optimism that the eurozone will endure, though he acknowledged the European Union faces “the possibilities of major difficulties associated with the current Greek crisis and, later, with a potential British exit.”

Martin O'Malley, here is some short advice

Tell us you will fix potholes in government. Seriously, we need a pothole fixer; think Jerry Brown, second term.

Be the small state governor who is going to fix potholes in federal government.

I am going to be nicer to the economists

I have been a prick to those folks for too long.

Wednesday, May 20, 2015

Nice curve

This is curve steepness, the ten year yield minus the overnight rate.  Look it over, our little smooth curve since 2011 has never appeared before a recession. Take a look.
 I am thrilled, perplexed, nervous and mixed. But the curve looks nice.

Krugman mostly weaseling out

Krugman argues that rates matter (they do). Did Krugman actually say the Fed sets rates here?

Krugman: Via FT Alphaville, James Montier has an interesting piece castigating economists for their “interest rate idolatry”, their belief that central bank-set interest rates matter a lot for the economy and that therefore it is useful, at least conceptually, to think about the “natural” rate of interest that would lead the economy to full employment. There is no evidence that interest rates matter in that way, he says, and economists who talk about natural rates are simply engaged in groupthink.
In particular, he identifies three blind and/or stupid economists leading everyone astray: Janet Yellen, Larry Summers, and yours truly.
Well, it could be true; there’s plenty of stupidity in the world, and much of it imagines itself wise. But in my experience people who declare confidently that “economists don’t understand X” usually turn out to be wrong both about X and about what economists understand. As I wrote in one context, often what they imagine to be a big conceptual or empirical failure is just a failure of their own reading comprehension.

Not clear, is it? It is not clear because Krugman knows there has been an ongoing search by me to see exactly when and where the Fed actually sets rates. In almost all cases, except Volcker and the Nixon shock, the Fed simply follows the market. This is a well documented fact and Krugman knows it.

So, you see, he takes the argument and skillfully changes it to rates matter, and does not address the issue proposed, that the Fed is a fraud which claims to set rates.

Here is my bet: Krugman will never say that any Fed actually set rates beyond Volcker. He will slightly alter the subject, then skip town.

Tuesday, May 19, 2015

Martin Feldstein thinks the dollar no longer works?

Mark Perry reports on Martin Feldstein's op-ed. Subject: the failure of GDP measurement to capture all the value in the economy. The dollar no longer measures; we have a pricing inefficiency problem, the theory says. In short:

In short, there is no way to know how much of each measured price increase reflects quality improvements and how much is a pure price increase. Yet the answers that come out of this process are reflected in the CPI and in the government’s measures of real growth. This is why we shouldn’t place much weight on the official measures of real GDP growth. It is relatively easy to add up the total dollars that are spent in the economy—the amount labeled nominal GDP. Calculating the growth of real GDP requires comparing the increase of nominal GDP to the increase in the price level. That is impossibly difficult.
So, my question: how does the economy price quality? It must have a unit of account somewhere. Here is one answer:

Big Business Is Getting Bigger - FiveThirtyEight

Andrew Flowers reports that private corporations are getting larger, so that allows them to use an internal unit of account. But it is not just the private sector; government enterprises grew enormously in the early 2000s, crowding out the ability of the Fed to set rates. Obamacare has grown the government health industry by six fold, at least. Hospital systems are undergoing mergers and concentration. The doubling of the national debt has virtually required that Goldman Sachs take over the government debt industry.

It is measurement theory. Mark Perry cannot make a theory about quality unless quality is measured; otherwise it is non-existent. So, the economy is trying to escape from the inefficient dollar.

That means banker bot and smart card.

The mad dash is on: Apple, Google, PayPal, CardLogix, the merchant exchange group, and just about every consumer in the USA are demanding these new currencies and smart cards. The pressure on Silicon Valley to get this done is enormous. The SP500 is being put under a bitcoin-like accounting system. The use of points, value points, discounts, flyer miles, and all these supplemental currencies is rising faster than Silicon Valley can produce banker bot. Government and their fiat bankers have moved off the grid and will have a hard time restoring their tax currencies.

Housing boom!

Zero Hedge looking at permits: Following two ugly months of dramatically missed expectations, Housing Starts exploded to 'recovery' highs (highest since Nov 2007) jumping 20.2% MoM to 1.135million (against 1.015 exp.). This is the 2nd biggest MoM jump in history. Both single-family (3rd biggest MoM surge since the crisis peak) and multi-family starts surged. Permits also surged in April (jumping 10.1% MoM - the most since 2012) to 1.143 million (well above expectations) and the highest since June 2008.

Well, the permit business is looking up, puts money in local government accounts.  Does it help? Yes, this summer if the economy has not gone grey bar.  But I have a hard time seeing any help this quarter, the lead times are too great.

Monday, May 18, 2015

Atlanta Fed forecast has current growth at .7%

The GDPNow model forecast for real GDP growth (seasonally adjusted annual rate) in the second quarter of 2015 was 0.7 percent on May 13, down slightly from 0.8 percent on May 5.

This forecast seems accurate, it nailed the last quarter pretty well. So that means average YoY growth for the first two quarters seems to be around .5%. Will the NBER recession daters grey bar us?

Krugman asks, Why doesn't Treasury borrow short?

He's back:
"the predictions of Hicks-type liquidity trap analysis"
Short term rates are low because Treasury does not borrow short term. We went through this, many times. If this were a liquidity trap then Treasury could borrow indefinitely at the short end where YoY rates are .23%. In fact the one year bond was sitting at .17% for a long time. Treasury tested Krugman's theory, borrowed short, and ran the rates up to .23%, equal to the rate on reserves.

It's not Hicks, it's Treasury in DC; they are the ones who let the excess reserves pile up. Why pay 2.3%, the ten year rate, when vast stores of liquidity are available at the one year rate? It's not Hicks, for the umpteenth time; Treasury would simply overwhelm the reserves, drain them in about 6 months, if it tapped those reserves. Congress cannot manage a 2 trillion dollar rollover at the one year rate; Congress does not have the flexibility.

Krugman is simply trying to divert attention from the debt problem by relying on some magical property of the Fed. It's not the Fed, it's not Hicks. The liquidity trap is Congressional debt: Congress has more debt than it has budget flexibility. Congress is screwed because Congress was fooled by the Kanosian Magic.

There is some crazy theory around that says money is created when government borrows.  How stupid! Money is created when the Fed takes losses about once every 40 years.  Then Congress consumes all the liquidity over the period, we have a war or some other catastrophe, and the Fed takes another round of losses.  I dunno where these crazy economists get their version of history, but this is how the USA has done it since it was founded. 

We tested Krugman's Theory

That is the thing. Months ago he was spouting this stuff, so we, and his undergraduate students, took the challenge. If rates are stuck low, at the short end, then it must be the case of some massive fraud at Treasury. Because Krugman says we can borrow infinite money at the short end! And Jack Lew, rather than be accused of fraud, did in fact try to use the short end. In fact, the debt cartel which manages government debt wrote memos to Jack, and Jack repeated these memos to Congress. The memo says: 'If Congress wants to manage its debt at .23% rates, then Congress needs to have cash in the account, liquidity.' Disputing Krugman's theory was a big deal in Congress, in the finance committee. They are the ones who are supposed to make this crazy theory true. And they, the Senate, concluded, correctly, that they cannot do what the Krugman theory says; Congress spent the liquidity!

And the theory of everything proves Krugman wrong because the TOE can compute a very accurate measure of Congressional liquidity as a constrained flow. That means Congress needs the two rate. That is where the curve matches Brad DeLong's linearity.

Sunday, May 17, 2015

Jerry says a recession is on the way


Mercury News: The Californians most unhappy with Brown's spending plan were health and anti-poverty advocates. That's because the governor ignored lawmakers' pleas for spending more on home-care workers' salaries, higher Medi-Cal reimbursement rates and giving all illegal immigrants health coverage.
But Brown brushed back the criticism. He argued that long-term fiscal forecasts show the state could slip back into deficits within a few years, so this is no time for runaway spending.
"I don't want to get caught in the jaws of the persistent fiscal instability of the state government in California," Brown said.
"We know a recession is on the way. It's around the corner. And when it comes and you get these cutbacks, who's cut back? School teachers, people on welfare, all sorts of programs."The governor said he wouldn't support a pending bill that would give all of California's illegal immigrants access to health insurance because it would be too expensive.
Overall, the budget Brown unveiled Thursday is $7.3 billion larger than the $108 billion budget enacted last June for the current fiscal year. It reflects a $6.7 billion increase in projected general-fund revenues compared to the proposal he released in January.
Nice forecast, Jerry. You have beaten 90% of the economists to the finish line.

Theories of voting, via Bryan Caplan

I searched 'hidden agenda'; my theory says we plan ahead, and in the transaction model the hidden agenda is the second period of planning. It has to be this way, otherwise we could not plan ahead.

But I ran across Bryan Caplan's course syllabus on a class in voter motivation. Let me clear a few things up regarding the collection of data, I have gone into this previously.

In California, 90% of the voting on ballots concerns ballot measures: local, county and state. 8% concerns the actual election of legislators by party. Barely more than 2% here concerns national politics. Collecting aggregate voting data will simply mix the instruments one is looking for. Political scientists who do this voting analysis correctly almost always do voter analysis on a district-by-district basis before aggregation. The research I looked into was extensive and was mostly about ballot initiatives and voter education on the issues. Political scientists often do good work. But never try to discern anything from national voting aggregates without decomposition at the district level first.

Let's go through Maxwell's light equations for a moment

My claim here is that all physicists have done is find Avogadro's number, yet again, and the whole formulation is based on that number and the assumption that electrons pack a sphere. These equations, below, are simply a restatement of all the sphere packing theories.

[Maxwell's equations image not reproduced here.]

Let's start; I use all standard electrodynamics here. The first thing that happens is physicists create a nearly straight magnetic line through a nearly spherical cloud of charge. Then they align their sensor so that the change in charge is nearly vertical from the magnetic line, that is, they get their instrument exactly perpendicular to the sphere surface. Now, to test the relationship between the charge and the magnetic, they insert some tiny batch of electrons which move in response to charge motion. They adjust the instrument to nullify all spurious motion.

What do they find? The Compton match between the fermion and the boson, the relative rates of action between the two for the given radius. The given radius is as close to a standing wave of charge/magnetic as they can get. So what they end up doing is calibrating radius and relative actions along one degree of freedom. Now, from there, getting the speed of light is simply taking all the charge from the sphere, laying it out in a line and counting up the number of radii. That gets them radii per second for a planar, single mode wave.

When they do all this, under the assumption of good sphere making, then the equation above will encapsulate the entire experiment. The mass of the electron, or any of the other units, can be a free variable that lets them scale the remaining units. The reason this all works is simply that the Compton match is maximum entropy, and thus the light is finite spectrum. But relative to the mass of the electron, Newton's approximation works just fine for most engineering work. It is not a bad solution for engineers.

Look at the atomic orbitals.

They do not look like they are packing a uniform sphere, but if you remove the myth of stable mass and forces, and just work the solution as packed vacuum, then the units of packed vacuum do appear to make an Avogadro sphere. The packed vacuum units do this with connected power series, Lucas sequences. But these power series do not converge uniformly over the overlap mode and its inverse, so they are short order power series. From these short order power series, another power series uses them to concoct another short order power series.

Electrons appear when error causes redundant exchanges; some exchanges go to Null, and that causes a fermion to result, so the Compton match is maintained. So electrons really do disappear and reappear in the atom, appearing whenever redundancy grows. Magnetic overlap modes appear to be the inverse of charge overlap modes. And that mode, the impedance of space, in fact appears to be the sequence limit; it's an actual Fibonacci number. It defines the longest power series that can be maintained. The finite set of exchange modes is simply a result of the finite spectrum of light, and the compression causes the optimum discrete spectrum of packing modes.

Fermions simply collect the round-off error from a short order Boson power series, and they re-distribute that round-off error around the atom as needed. So the excess energy needed, the adaptation energy, is the kinetic energy of these fermions as they are destroyed and created to make Ito's calculus work. That excess energy should appear in either the hyperbolic equations or the Shannon maximum entropy equations. Let's use Shannon:

c/b = log2(e+SNR)

e is 1 when this is a fixed cable. e should be 1/2 + sqrt(5)/2 when this is a sphere. e can be a bit larger if the sphere is contained in a vacuum with variations; e likely goes up to the second Lagrange number, but I have not, and likely will not, work that problem. Pi is simply the maximum amount of divergence the overlap modes can have and still remain connected. Remaining connected is the same as making sure empty space does not exist. Empty space does not exist; there is no such thing. So there is no other choice for the vacuum when it is compressed.
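To see what that buys numerically, here is a minimal Python sketch; the SNR test points are arbitrary, and only the c/b = log2(e + SNR) form above is taken from the post:

```python
# Minimal sketch: compare c/b = log2(e + SNR) for e = 1 (the fixed cable
# case) and e = 1/2 + sqrt(5)/2 (the sphere case suggested above).
# The SNR test points are arbitrary, chosen only for illustration.
import math

PHI = 0.5 + math.sqrt(5) / 2   # the golden ratio

def spectral_efficiency(snr, e=1.0):
    """Bits per second per hertz, c/b = log2(e + SNR)."""
    return math.log2(e + snr)

for snr in (0.5, 1.0, 10.0, 100.0):
    cable = spectral_efficiency(snr, e=1.0)
    sphere = spectral_efficiency(snr, e=PHI)
    print(f"SNR={snr:6.1f}  cable={cable:.4f}  sphere={sphere:.4f}  "
          f"excess={sphere - cable:.4f}")
```

The excess column is the extra capacity the larger e buys; it fades as SNR grows, so the correction only matters near the low end.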

The bond vigilantes never showed up?

Interest rates didn't bark says Delong
Delong 2011: It is in this situation that we want a government deficit--the government to print and issue the safe bonds that private investors really want to hold. As these bonds hit the market, people who otherwise would have socked their money away in cash--thus diminishing monetary velocity and slowing spending--buy the bonds instead. A large and timely government deficit thus short-circuits the adjustment mechanism, and avoids the collapse in monetary velocity that was the source of all the trouble. And as long as output is depressed--as long as monetary velocity is low and there is slack in the economy--printing more and more bonds will have next to no effect increasing interest rates.
And he shows this chart:

[Chart not reproduced here.]

Look at the rates from 2009-2011, up at 3.5%, when Brad says rates were low.

Here is the long view:
Interest rates did bark, right after the Nixon Shock. Yes, rates are low compared to the monetary reset in 1972. And much of the debt from 1981 was buried in the 30-year bond; in 2011 we rolled over some of that debt, then in 2014 we rolled over some of the lil Bush debt and some of the Kanosian Obama debt. But not quite fast enough; as my previous post points out, the key variable is interest costs in DC, and they spiked.

We are not going to get an interest rate spike because we are not going to do another Nixon Shock. And the long view shows a long-term trend down in the ten-year rate. 2.5% seems to be near normal. The question is: why do we only now discover the true nature of rates? Why do we set rates at the central bank once every 40 years? And why do all those gray bars line up with the DC election cycle? The evidence points to a deep structural flaw in DC governance.

DC interest costs plunging, deflator at zero

The red line: interest costs plunging in DC. We are paying off the high interest rate Kanosian spending we were stuck with, courtesy of the Krugman, Delong crowd. They urged us to borrow, saying that 3%-plus ten-year rates were cheap. Now the ten-year bond costs 2.3%, and inflation is at zero (the blue line).

That red line sitting at a mean of 420 billion per year in interest costs from 2010 through 2014? Those high costs were all the result of the 'debt is what we owe ourselves' crowd. See it peak just two quarters ago in 2014? That is what caused our current recession. Running that much of the economy through the Goldman Sachs debt machine so soon required a contraction of the economy, and we are in contraction. Let's just hope the current Kanosian recession cycle is mild, but at least we know why we are here.

Now the key here is California, will the Flounder succeed in raising taxes some 30%? If they manage that, then the public sector pensions go to hell.

Generalizing hyperbolics

I am working the general solution to:
cosh^2-sinh^2 = Y
Y is generally greater than one. But I work it from the cosh function, so the solution is always symmetric around Y or 1/Y.

So divide through by Y and change the base from Euler to 1/sqrt(Y), then quantize the exponent in integers. The farther Y is from one, the fewer quants until the tanh curve is full. It works like Fermi-Dirac:

For the distribution of coth'' it works like Fermi-Dirac, and like the bosons for tanh''. So making Y greater than one reduces the supported exchange rate, kT, as in the plot. That is lower energy; all the fermions crowd near the low hyperbolic angles. Setting Y to Phi generates the Lucas sequence, and I suspect I can generate the silver ratio sequence, or any of the Lagrange sequences.
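A quick numerical check of the Y = Phi claim, as a Python sketch; reading 'quantize the exponent in integers' as evaluating 2*cosh and 2*sinh at integer multiples of ln(Phi) is my assumption, not something spelled out above:

```python
# With base Phi, the quantized hyperbolic values land on the Lucas
# sequence: L_n = phi**n + (-1/phi)**n, so for integer n
#   even n -> 2*cosh(n*ln(phi)) = L_n
#   odd  n -> 2*sinh(n*ln(phi)) = L_n
# Also, the Fermi-Dirac and Bose-Einstein occupancies are shifted tanh
# and coth: 1/(exp(x)+1) = (1 - tanh(x/2))/2 and
#           1/(exp(x)-1) = (coth(x/2) - 1)/2.
import math

PHI = 0.5 + math.sqrt(5) / 2
LN_PHI = math.log(PHI)

def lucas(n):
    """Lucas numbers from the closed form."""
    return round(PHI**n + (-1 / PHI)**n)

for n in range(1, 10):
    hyper = 2 * math.cosh(n * LN_PHI) if n % 2 == 0 else 2 * math.sinh(n * LN_PHI)
    print(n, lucas(n), round(hyper))

x = 1.7  # arbitrary test point for the occupancy identities
assert abs(1 / (math.exp(x) + 1) - (1 - math.tanh(x / 2)) / 2) < 1e-12
assert abs(1 / (math.exp(x) - 1) - (1 / math.tanh(x / 2) - 1) / 2) < 1e-12
```

The last two assertions are just the standard identities tying the hyperbolic pair to the Fermi-Dirac and Bose-Einstein occupancies.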

Saturday, May 16, 2015

Feds tell us airline manufacturers have done a stupid

Wired: A security researcher kicked off a United Airlines flight last month after tweeting about security vulnerabilities in its system had previously taken control of an airplane and caused it to briefly fly sideways, according to an application for a search warrant filed by an FBI agent.
Chris Roberts, a security researcher with One World Labs, told the FBI agent during an interview in February that he had hacked the in-flight entertainment system, or IFE, on an airplane and overwrote code on the plane’s Thrust Management Computer while aboard the flight. He was able to issue a climb command and make the plane briefly change course, the document states.
“He stated that he thereby caused one of the airplane engines to climb resulting in a lateral or sideways movement of the plane during one of these flights,” FBI Special Agent Mark Hurley wrote in his warrant application (.pdf). “He also stated that he used Vortex software after comprising/exploiting or ‘hacking’ the airplane’s networks. He used the software to monitor traffic from the cockpit system.”

Now this seems either a lie or a perfect case of engineering stupidity. How did the flight system become connected to the entertainment system? Impossible, unless...

But this is what the researcher claimed!

History of Researching Planes

Roberts began investigating aviation security about six years ago after he and a research colleague got hold of publicly available flight manuals and wiring diagrams for various planes. The documents showed how inflight entertainment systems on some planes were connected to the passenger satellite phone network, which included functions for operating some cabin control systems. These systems were in turn connected to the plane avionics systems. They built a test lab using demo software obtained from infotainment vendors and others in order to explore what they could do to the networks.
He named the manufacturing vendors:

 After news broke about a report from the Government Accountability Office revealing that passenger Wi-Fi networks on some Boeing and Airbus planes could allow an attacker to gain access to avionics systems and commandeer a flight, Roberts published a Tweet that said, “Find myself on a 737/800, lets see Box-IFE-ICE-SATCOM,? Shall we start playing with EICAS messages? ‘PASS OXYGEN ON’ Anyone?” He punctuated the tweet with a smiley face.

So, his claim is that there are stupid software engineering managers at these companies, both companies! Boeing and Airbus both have stupids? Hardly possible, but there may be a third vendor with stupidity squared.

Stay tuned.

Friday, May 15, 2015

How can we believe in the Higgs Boson and think space has intrinsic properties?

Space impedance, once again. I always get it mixed up, myself, but it is still taught as an intrinsic property of free space. But obviously, the capacity of the vacuum, as a holder of light, is the maximum compression the vacuum can handle before triggering the Higgs effect. Wiki says:

Since 1948, the SI unit ampere has been defined by choosing the numerical value of μ0 to be exactly 4π×10−7 H/m. Similarly, since 1983 the SI metre has been defined by choosing the value of c0 to be 299 792 458 m/s. Consequently
Z_0 = μ_0 c_0 = 119.9169832 π Ω exactly.

Exactly? Only to the extent that free space gets pi, but since free space capacity is limited, the precision of pi is limited. The physicists are mad at me because I don't believe in free space. So they are on alert whenever my readers try to ask the simple question: how can free space hold anything?

Otherwise, they have already decided the capacity of light when they introduced the speed of light, so this is just a change of variables to electrical engineers' units.
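Just to show that the 'exact' figure is pure unit bookkeeping, here is a one-off Python check using only the defined SI values quoted above:

```python
# Z_0 = mu_0 * c_0 with mu_0 = 4*pi*1e-7 H/m and c_0 = 299792458 m/s,
# both fixed by definition, so the result is exact by construction.
import math

mu_0 = 4 * math.pi * 1e-7    # H/m, defined
c_0 = 299792458              # m/s, defined
Z_0 = mu_0 * c_0             # ohms

print(Z_0)                   # about 376.730313668 ohms
print(Z_0 / math.pi)         # 119.9169832, i.e. 4e-7 * 299792458
```

So the only physical content behind the 'exact' figure is the defined value of c_0; the rest is the choice of units, which is the point being argued here.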

And they are using a sphere surface. What they really mean by pi is maximum divergence: there is no interference between the magnetic exchanges laterally, they are not curving. What they get is the bandwidth of proton light that has escaped the proton, in reduced units. By reduced I mean all the common coefficients have been cancelled.

Anyway, free light is free because it escaped a proton. So free light will not have a Compton match to make a mass equivalent in free space.

I am not coding up a banker bot

I quit the hieroglyphics game. But I can describe it in more detail, right here.

The bot always keys off the incoming bet, finds the appropriate node and updates the no-arbitrage price for that node. Then it emits downstream gains and upstream losses; the residual goes to the appropriate bot and the next no-arbitrage price is updated. When branches are 'out of the money' the bot will create or destroy a branch. Basically it warps the graph to fit the tanh curve, spreading out precision (or error variance) evenly, and the complexity of the incoming prices gets reflected in the spectral decomposition done by the bot. The graph should always approach the minimum spanning tree. Start simple, then add branch and cut functions. Any bot node should see its cosh updated twice as often as its sinh, so sinh is the echo, it is magnetism. Otherwise, top to bottom, bottom to top, multiple streams, whatever, add it in and have fun.
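For concreteness, here is a minimal toy sketch of that node update in Python. The class name, the exponential-moving-average price step, and the equal residual split among children are all my own placeholders, not anything specified above; treat it as a sketch under those assumptions, not an implementation:

```python
# Toy banker-bot node: keyed off an incoming bet, it nudges its
# no-arbitrage price, then passes gains downstream and losses upstream.
# The EMA step and the equal split among children are placeholders.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class BotNode:
    price: float                        # current no-arbitrage estimate
    parent: Optional["BotNode"] = None
    children: List["BotNode"] = field(default_factory=list)
    cosh_ticks: int = 0                 # direct price updates
    sinh_ticks: int = 0                 # echo updates (the 'magnetic' side)

    def on_bet(self, bet_price: float, weight: float = 0.1) -> None:
        residual = bet_price - self.price
        self.price += weight * residual          # move toward no arbitrage
        self.cosh_ticks += 1
        if residual > 0 and self.children:       # emit downstream gains
            share = residual / len(self.children)
            for child in self.children:
                child.price += weight * share
                child.sinh_ticks += 1
        elif residual < 0 and self.parent:       # emit upstream losses
            self.parent.price += weight * residual
            self.parent.sinh_ticks += 1
```

A fuller version would add the branch create/destroy step when a branch goes out of the money and a periodic rebalance toward the minimum spanning tree; this only shows the per-bet propagation, with the cosh counter tracking direct updates and the sinh counter the echoes.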

Bitcoin coders should figure this out, except we use 'almost probable', as in Ito's calculus. Error is accepted; it is the thing we adapt. Look at bitcoin block processing, look at Huffman encoders, especially run-time variable window Huffman encoders. Find the folks doing that stuff, and find the mathematicians who know graph theory, hyperbolics, Lagrange numbers, Lucas polynomials and sequences, and maximum entropy systems. Physicists are getting a real handle on this stuff.

I am retired from coding; I freak at the very thought of rummaging through this Ubuntu and trying to recover all the software infrastructure. I am not aiming to be a gazillionaire.

An even simpler proof of Shannon entropy theory

In any channel, running at maximum information transfer, the residual entropy in the channel must be zero. Hence the power retained in  the channel must always be normalized to one.  Thus:
Power_in - Power_out = 1, but power is a square. So this is the hyperbolic flow constraint.

That does bring up a point. The adapted system is not a stable Shannon, it keeps entropy in the channel above zero, it transfers the map via connectivity, so we have:

Power_in-Power_out = Phi, I do believe. I think that adds asymmetry because:

phi^(1/2+n) + phi^(1/2 - n -1) is the hyperbolic form, but if the powers of Phi come from a sequence then the sequence is shifted by one to compose the hyperbolic:


Exp X:    1/2   | 1/2+2 | 1/2+3
Exp 1/X:  1/2-1 | 1/2-3 | 1/2-4

So there is a charge shift.  Now the result is still the hyperbolic flow model, but the connected network is offset. I think there is a Wythoff row that does this, let me check.
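A quick arithmetic check of that offset, as a Python sketch; pairing the two rows as exponents of phi, and noting that the difference of squares collapses to phi raised to the sum of the exponents, is my reading of the table, not a derivation given above:

```python
# For A = phi**a and B = phi**b, the power difference
#   ((A + B)/2)**2 - ((A - B)/2)**2 = A*B = phi**(a + b),
# so the flow constraint is 1 when the exponent rows are symmetric
# (a + b = 0) and a power of phi when one row is shifted off symmetry.
import math

PHI = 0.5 + math.sqrt(5) / 2

def flow_constraint(a, b):
    A, B = PHI**a, PHI**b
    return ((A + B) / 2)**2 - ((A - B) / 2)**2

print(flow_constraint(0.5 + 2, -(0.5 + 2)))   # symmetric rows -> 1.0
print(flow_constraint(0.5 + 2, -(0.5 + 1)))   # shifted by one -> phi
print(PHI)                                    # for comparison
```

Which shift reproduces the table above exactly is the open question; the point of the sketch is only that an offset in the exponents is what moves Power_in - Power_out off 1.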

Jerry's Emission tax finds a home

Professor Kahn, our environmental expert on emission taxes. He predicts this passing out of the new taxes will reduce the flow of emissions. My claim is that UC Professors emit more than the average California driver.

Today, UC's President Janet Napolitano distributed a letter which features some very good news for the University of California.   Instate tuition will remain frozen at roughly $12,000 per year and Governor Brown will provide the UC with extra $.  This deal opens up the "win win" of providing the resources for continuing to have an excellent university while protecting the middle class from tuition increases.

As an economist always seeking "natural experiments", one part of the Governor's letter caught my eye.

Pension changes

·     The agreement's $436 million in one-time funding over three years to help UC pay down its pension liability recognizes the State's obligation to help support UC's pension plan.

·     In exchange for the pension funding, UC would adopt, upon approval by the Regents, a new pension tier by July 1, 2016.  The new tier, which would affect only new employees hired after it is implemented, would provide, at the employee's election, either:
-- A defined benefit plan with a pensionable salary up to the California Public Employees' Pension Reform Act of 2013 (PEPRA) cap (currently $117,020), plus a supplemental defined contribution plan for certain employees,

How will we know? We have to find data on the quantity of foul gas emitted by UC Professors. My readers know the scam very well; this is a repeat. Jerry gets away with it because most of us speak Spanish.

As I have already mentioned, our governor did not raise the tax on emissions; he raised a sales tax on the middle class to fund a pile of gas-emitting UC Professors.

So, you tell me.  How did Jerry Brown figure out that funding UC Pensions reduces emissions?

Wednesday, May 13, 2015

Federal pre-emptive surge against the Southwest Secessionists?

Jerry and Nancy have gone and done it, started a civil war. It's always California that starts these things.
Rasmussen Poll: As usual, recent mainstream media reporting on the controversy over the Jade Helm 15 military exercises, set to take place over eight weeks across several U.S. states, completely missed the point. Mainstream media focus was primarily about characterizing and demonizing Texans concerned about the exercises as backwater, paranoid rednecks with wild fantasies about an imminent government takeover. While exercises like these will always cause the imagination of some to run amok, the key point here is this: concerns that U.S. military exercises will be used to exert more power over states is not a fringe view.

Does the government in Washington fear a California secession? Has the Rubicon already been crossed?

Banker Bot, the unpatented, better version of Merkle

Merkle Tree from Wiki.

The banker bot does not hash; it collects compact polynomial representations of the probability distribution in the branch. So the accumulation of the polynomials generates a successive set of combinatorics that meet the control flow constraint. And there is a bunch of theory along with that which comes from highly paid mathematicians.

The banker bot is also a very old and well known algorithm, in common use with no known ownership claim. It cannot have a patent.

But, back to banker. For banker bot, that tree is a decoding/encoding tree: encode up, decode down. The leaves get raw prices, each price parsed from leaf to root, higher price bets to lower price bets. The root generates the encoded gain or loss back down. That is what I have in a banker bot network. It should, ultimately, do a Lagrange number shift higher going up, each polynomial a function of the previous.

The theory should be something like: the two-period adaptation meets the Shannon bandwidth limit, hence encoding/decoding apply. The graph is a compact polynomial, is minimally redundant, and adapts to the Shannon condition of maximum entropy. That makes it hyperbolic. The graph undergoes non-adiabatic change when rank changes. Hire a mathematician; they make very good bot algorithms.
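Here is a toy encode-up/decode-down pass over a small tree, as a Python sketch. A plain sum stands in for the 'compact polynomial' accumulated at each branch, and a pro-rata split stands in for the decoded gain or loss; both are my simplifications:

```python
# Toy encode-up / decode-down tree: leaves hold raw prices, each parent
# accumulates a summary of its branch (here just a sum), and the root
# pushes a gain or loss back down in proportion to each branch's weight.
from dataclasses import dataclass, field
from typing import List

@dataclass
class TreeNode:
    price: float = 0.0
    children: List["TreeNode"] = field(default_factory=list)

def encode_up(node: TreeNode) -> float:
    """Accumulate leaf prices up to the root."""
    if node.children:
        node.price = sum(encode_up(child) for child in node.children)
    return node.price

def decode_down(node: TreeNode, residual: float) -> None:
    """Distribute the root's gain or loss back down, pro rata by branch."""
    if not node.children or node.price == 0:
        node.price += residual
        return
    for child in node.children:
        decode_down(child, residual * child.price / node.price)
    node.price += residual   # keep the branch total consistent

# Usage: two branches of leaf prices, encode up, then push a loss down.
root = TreeNode(children=[
    TreeNode(children=[TreeNode(3.0), TreeNode(1.0)]),
    TreeNode(children=[TreeNode(2.0)]),
])
encode_up(root)          # root.price becomes 6.0
decode_down(root, -1.0)  # loss spread 2/3 to one branch, 1/3 to the other
```

A real banker bot would replace the sum with the branch's polynomial and the pro-rata rule with the no-arbitrage update, but the up/down traffic pattern is the one described above.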

In economics, the banker graph is the belief function that Roger Farmer talks about. The accumulated polynomial is the graph generator of what the agent does. From the neural point of view, that model is exact: neurons form the minimal rate reduction graph which generates human actions, in sequence. Shopping habits are a multi-modal rate counter, adapting toward a common finite ratio, low order power series; a shopping clock.