Saturday, December 31, 2011

Software, databases and tree structured file formats!

Looking at disk I/O for databases, I began to investigate all the paging structures databases have available to manage disk data. Generally, database data is organized as trees rather than linear arrangements. Trees make for short traversals down the database.

What is my point?
What the industry really needs is a set of tree testers, a standalone program that can take a variety of trees and test their functionality against data sets. If I started a new project, that would be the target, making it easier for new database engine designers to try and test various tree structures.
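A minimal sketch of what such a tree tester might look like, in Python; the insert/search interface and the toy binary search tree are my own placeholders, not any particular engine's structures.

import random
import time

class SimpleBST:
    # Placeholder tree: any candidate structure just needs insert() and search().
    def __init__(self):
        self.root = None

    def insert(self, key):
        node = self.root
        if node is None:
            self.root = [key, None, None]   # [key, left, right]
            return
        while True:
            if key == node[0]:
                return
            side = 1 if key < node[0] else 2
            if node[side] is None:
                node[side] = [key, None, None]
                return
            node = node[side]

    def search(self, key):
        node = self.root
        while node is not None:
            if key == node[0]:
                return True
            node = node[1] if key < node[0] else node[2]
        return False

def test_tree(tree_factory, data):
    # Run one tree implementation against a data set; time inserts and lookups.
    tree = tree_factory()
    start = time.perf_counter()
    for key in data:
        tree.insert(key)
    insert_time = time.perf_counter() - start
    start = time.perf_counter()
    hits = sum(tree.search(key) for key in data)
    lookup_time = time.perf_counter() - start
    return hits, insert_time, lookup_time

data = random.sample(range(1000000), 10000)
print(test_tree(SimpleBST, data))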

This is where my programming addiction starts. I think, hey, a few lines of code and I can try out one or two trees from this selection. I am getting better at resisting, however.

We are supposed to read this paper

Global imbalances and the financial crisis: Link or no link?
We conjecture that the main contributing factor to the financial crisis was not “excess saving” but the “excess elasticity” of the international monetary and financial system: the monetary and financial regimes in place failed to restrain the build-up of unsustainable credit and asset price booms (“financial imbalances”). Credit creation, a defining feature of a monetary economy, plays a key role in this story.

I am reading the paper, but I gather this paper is referring to a global monetary system betting on American productivity. Geithner's Socialist Banking network requires a productive American middle class to work. It doesn't work because politicians abuse the system. If you have Geithner's brand of Banking Socialism, politicians eventually cause large bankruptcies.

So what does this suggest for 2012? Well, most American politicians will screw up the debt issue, but Chinese and European politicians are much worse! So we have cascading bankruptcies of the major OECD entitlement systems, starting with Greece, Ireland, Italy, then France, then the USA.

In channel theory we would ask, what is the relative frequency of bankruptcies? Europe can suffer bankruptcies in succession, one small country after another. In America it is going to be larger but fewer bankruptcies.

Chart of the day

From David Rhodes and Daniel Stelter. The red is how much the politicians promised, the blue is how much we believed.

Kind of a problem for the 'Deficits don't matter' types.

Court blocks another Democrat attempt at collapsing energy markets

Judge Lawrence J. O’Neill, a federal judge appointed by George W. Bush, issued the stop order until the case can be heard. The basis of the order was that California’s Cap and Trade Law violates the Commerce Clause of the U.S. Constitution. The Commerce Clause reserves regulation of interstate trade to the Federal government.
Too bad, Jerry Brown the senile was planning the same bullshit that got us in trouble with the electricity price controls in 2000. I can see Jerry stuck with dysfunctional politics, much of which he created in his early years. But how can he be so dumb as to propagate the same energy catastrophe again and again?

Republican Communist party Watch, CA edition

When your political enemies give you a gift, you ought to take it. Instead of taking it, California Republicans actively opposed the governor’s plan and shamelessly sided with the people who run roughshod over everything the GOP is supposed to stand for. Forget all the talk about property rights, limited government, free markets and family values. Cal Watch
Republicans are like this everywhere: they want the goodies and have someone else pay. Does that remind you of any other party?

Republican Communist party Watch

Rep. Steve King, an influential Republican congressman from western Iowa, has not endorsed a presidential candidate. But in the final sprint toward Tuesday’s caucuses, he is urging conservatives to stay away from Rep. Ron Paul of Texas.

King is adamant that Paul’s “isolationist” views are a threat to the country. NRO
Tis the season for Big Republican Debt, Big Republican Communism.
Remember Grover Norquist. You can only be elected if you gut defense.

Ron Paul, senile?

“Well, first i thought it was a very inappropriate question, you know, for the presidency to be decided on a scientific matter,” he said. “I think it’s a theory…the theory of evolution and I don’t accept it as a theory. But I think the creator that i know, you know created us, every one of us and created the universe and the precise time and manner and all. I just don’t think we’re at the point where anybody has absolute proof on either side.” OTB
I mean he was a doctor.

Laws in California

California law: AB 641 : This bill protects same-sex spouses and domestic partners of nursing home residents from losing access to shared assets and other financial protections. Every Deranged California Assembly person gets a law.
Assemblyman Mike Feuer is the author. He is representing the 42nd Assembly District. The district includes Beverly Hills, West Hollywood, and parts of West Los Angeles. I dunno, maybe it's the drugs they eat in West Hollywood.

Is this a Higgs?

This is a plot derived from the accelerator experiments, Sean Carroll filling us in. The energy band is around 125 GeV (giga-electron-volts). The spike around 126 is, I presume, a Higgs symptom appearing for the briefest moment. Reading the article we get a decomposition of the Higgs into photons or into the more exotic. So they work a probability problem, finding all possible sources of the photons taken two at a time, I guess, and see which combinations match which hypothesis.


In terms of optimal flow we might say the Higgs is the smallest cargo nature can carry. We cannot see it directly because we cannot get the Nyquist rate. We see instead the aliasing effect of undersampling an event in nature.

Time particles!!!!

Ten Things Everyone Should Know About Time
The Discover blogger went to a conference on time, but I think he was late.
Is time fundamental or derived?

1. Time exists. Might as well get this common question out of the way. Of course time exists — otherwise how would we set our alarm clocks? Time organizes the universe into an ordered series of moments, and thank goodness; what a mess it would be if reality were completely different from moment to moment. The real question is whether or not time is fundamental, or perhaps emergent. We used to think that “temperature” was a basic category of nature, but now we know it emerges from the motion of atoms. When it comes to whether time is fundamental, the answer is: nobody knows. My bet is “yes,” but we’ll need to understand quantum gravity much better before we can say for sure.

I have a problem with the standard model and time. Is there a time particle? No time particle, no time constant. How could a time particle have velocity in a standard particle model?

Time is not real, folks, it is an illusion we have created to simplify the math.

Bailouts make things worse?

As the investigative reports in the Wall Street Journal this week make clear, this same type of fear of contagion (expressed, according to the WSJ story, very strongly by Jean-Claude Trichet during the past two years) is why a restructuring of Greek sovereign debt has been kicked down the road so many times. Instead of reducing the role of bailouts as occurred for emerging markets around the time of Argentina, the role of bailouts in European policy has increased and this has made policy even less predictable. As the prospects of bailout increased, the political incentives to take action to reduce deficits and debt decreased, as evidenced in Italy over the past year, and made the crisis much worse. Economics One

Why would this happen? Bureaucrats can choose to stay within the current communication channels; it is an institutional block, and it allows them to ignore reality until the last month.

This time is different

Hmm...

Different than a central government of a badly proportioned nation going belly up? How does a broad based tax regime produce the volatility in government debt since 1980? We can see the problem start even earlier.  We have a government ruling most of North America and it cannot figure out a stable investment pattern. Sounds like the same old shit to me.


Let's look at Britain, courtesy of Paul:


What's that look like to you? A debt history of a nation losing its empire and accumulating debt to bail out the denialists, then entering two world wars as each nation fights over default issues. In other words, sounds a lot like the Republican party!

But here is the real secret:

Information revolutions hit the institutions. Bureaucrats are stuck with the old, restricted lines of communication and operate with missing data.  They fill in the missing data with paranoid delusions and start warring, like meth addicts.  Entanglement, folks, do not underestimate entanglement.

Brad said this?

A reader of mine provided me with this quote (apparently, from Brad DeLong):
I learned this from Andy Abel and Olivier Blanchard before my eyes first opened: increases in government purchases are ineffective only if (a) "Ricardian Equivalence holds and (b) what the government buys (and distributes to households) is exactly what households would buy for themselves. RE by itself doesn't do it." 
Macromania quoting.

I am a little confused since about half of the federal government is devoted to providing what households normally provide for themselves. Brad, for example, just indicated that food stamps have low multipliers. Who would have known? This explains the Dollars for Clunkers fail: people buy cars anyway. This also explains why stimulus would have such a low multiplier when essential goods, like energy, are in short supply.

Also, while we are on the subject, does anyone still doubt that pushing monetary stimulus causes price spreads to appear?

Otherwise known as encoding the surplus

James Hamilton writes,

My suggestion is that America should try to return to what some scholars maintain was the original source of America's success, which came from using North America's abundant natural resources as a basis for a competitive advantage in manufacturing.

He notes that shale gas and rare earth elements are resources that we could exploit. It seems to me that these offer opportunities for patterns of sustainable specialization and trade. Kling

In and around the mine, piles of rare earths appear, round-off error from the mining network. Little miners say, "Hey, I think I can do a second stage processing of this little pile." The new second stage processing industry increases the rank of the total economic network and we get an N log N increase in economic activity, at equilibrium.

Think of a web bot cruising a database.  The web bot notices piles of data and says:  "Hey, I can put a schema around this data and make faster access"

Semantic Networks again


We present statistical analyses of the large-scale structure of 3 types of semantic networks: word associations, WordNet, and Roget’s Thesaurus. We show that they have a small-world structure, characterized by sparse connectivity, short average path lengths between words, and strong local clustering. In addition, the distributions of the number of connections follow power laws that indicate a scale-free pattern of connectivity, with most nodes having relatively few connections joined together through a small number of hubs with many connections. These regularities have also been found in certain other complex natural networks, such as the World Wide Web, but they are not consistent with many conventional models of semantic organization, based on inheritance hierarchies, arbitrarily structured networks, or high-dimensional vector spaces. We propose that these structures reflect the mechanisms by which semantic networks grow. We describe a simple model for semantic growth, in which each new word or concept is connected to an existing network by differentiating the connectivity pattern of an existing node. This model generates appropriate small-world statistics and power-law
connectivity distributions, and it also suggests one possible mechanistic basis for the effects of learning history variables (age of acquisition, usage frequency) on behavioral performance in semantic processing tasks. Author lost via Blogger inattention

Try looking at semantic networks as entropy-encoded grammars. The idea is that computers and humans organize data so all ideas have codes that follow Shannon optimization.
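A toy illustration of that Shannon idea, with made-up data: estimate word probabilities from a tiny corpus and note that the optimal code length, -log2(p), comes out short for common words and long for rare ones.

import math
from collections import Counter

corpus = "the cat sat on the mat the cat ran".split()
counts = Counter(corpus)
total = sum(counts.values())

for word, n in counts.most_common():
    p = n / total
    code_len = -math.log2(p)      # Shannon-optimal code length in bits
    print(f"{word:>4}  p={p:.2f}  ~{code_len:.2f} bits")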

Brad says Dean wrote this

In an article discussing the implications of the extension of the payroll tax cut, the Washington Post told readers:
"This year, the Social Security system projects that it will pay out $46 billion more in benefits than it will collect in cash. It made up for the shortfall by redeeming Treasury bonds bought in years when there were cash surpluses."
This is not true. The Social Security trust fund is projected to earn $114.9 billion in interest on the bonds it holds. It will use a portion of these earnings to pay current benefits. It will not be redeeming its bonds.
This is not real theory, Brad DeLong; this is Dean spouting from the Book of the Exogenous. Listen carefully:

Dean is claiming that some government agency is making money on its investments, so much money that the political establishment can cut back the funding for this agency, as a stimulus experiment. How is this agency making money? Investing in a sister government agency!

If being a theoretical professor of economics ever has any meaning, then this is not theory.

George Will, you need a lesson in supply and demand

In 2011, for the first time in 62 years, America was a net exporter of petroleum products. For the indefinite future, a specter is haunting progressivism, the specter of abundance. George Will
His opinion piece skips supply and demand throughout. We have an abundance of oil because we are using less of it at the higher price. Foreigners, evidently, can find better uses for refined petroleum at the margin.

George attacks the Progressives, failing to acknowledge that progressives, including their enabling economists, ignore supply and demand. We have too many government bureaucrats, George, because taxes are too low. Supply and demand? Get it?

Supply and Demand, it still works in spite of the nonsense learned in graduate economics.

Never seen the curve so low

I don't think the Treasury curve got this low even at the point of the crash. M1V has dropped like a stone in the last measurement. I thought we had made a soft landing; maybe I was wrong.

The aggregate vs the distribution

The thesis that public debt allows the cost of public spending programs to be shifted forward in time has proven difficult for many economists to accept. Consider a simple formulation of the public debt controversy: instead of taxing to finance a rail system, a government borrows. Buchanan argues that borrowing shifts the cost of the rail system into the future, while others say that the cost remains in the present, noting the resources that went into the rail system are used up this year regardless of how it is financed. Public debt, they say, cannot be a burden on future generations because “we owe it to ourselves, [now].”….

It is irrelevant that “we owe it to ourselves,” for we are not many heads attached to one body. Some owe it to others, and it is irrelevant to note that the sum of the debts equals the sum of the credits. One could similarly aggregate over mortgage lenders and mortgage borrowers and say we owe it to ourselves, but it would be no more enlightening. Public debt allows present taxpayers to reduce their tax payments and obligates future taxpayers to amortize that debt, and it is here that the burden of the debt resides. True, the resources required to construct the rail system this year are furnished by bondholders, but those bondholders do not bear the burden of the debt simply because they are compensated sufficiently to make them willing lenders. Cafe Hayek quoting Dick Wagner

The debate rages between Krugman the collective, who aggregates over the whole, and the Austrians who think distribution matters.

Normally the way I analyze this issue is to eliminate time from the equation and go straight to probability of transaction. But we all know how that works; let's look at Krugman's real problem.

Krugman does not have a democracy to work with; he can never claim that government spending can be measured in the aggregate unless he has democracy, by definition. I have to repeat this for the simple Keynesian mind. You can only measure the aggregate to the extent that lower orders of the economy are in the same relative sample space. Austrians know this, it is the local knowledge problem, and they might not believe me, but channel theory explains the local knowledge problem. No fair vote, no equilibrium, no valid measure of the aggregate.

The vote is the ultimate transaction that takes place, and governments do exist. If the vote and the economic life of the voter are at disequilibrium, the government dead weight losses increase and multipliers drop. Would a government with very good democracy ever vote to make DON BOUDREAUX a billionaire? They most assuredly will take action so someone becomes a billionaire; it can't be avoided. It might be Don, I would vote for Don over someone granted a bogus patent.

Obamacare complaints

Our company, CKE Restaurants Inc., employs about 21,000 people (our franchisees employ 49,000 more) in Carl’s Jr. and Hardee’s restaurants. For months, we have been working with Mercer Health & Benefits LLC, our health-care consultant, to identify Obamacare’s potential financial impact on CKE. Mercer estimated that when the law is fully implemented our health-care costs will increase about $18 million a year. That would put our total health-care costs at $29.8 million, a 150 percent increase from the roughly $12 million we spent last year. Bloomberg

Obamacare, adding costs to the readjustment process, likely adding years to the depression.

Choo Choo con game was back in town

I haven't been mentioning one of my most popular blog posts, the High Speed Choo Choo scam Ray LaHood and Jim Costas keep running in Fresno CA. Remember the issue, build 20 miles of ultra high speed Choo Choo from Corchran to Weep Patch. Purpose: To get Jim Costas re-elected.

They were in town the other day, Ray LaHood insisting that the Choo Choo will be built. I guess I bring it up because the County here is broke, literally on the edge of bankruptcy and recently all county employees took a 10% pay cut.

There is a much cheaper way for Jim Costas to repair the economy in Fresno. Spend $100,000 on a billboard in central Fresno with a large picture of a brain scan from some meth head. Once the meth heads see the hole they have made in their brains they may just cut back a little, reducing the flow of federal funds to the Mexican Drug Cartels. Fresno has no economy when 10% of its population has a hole in their head, literally.

Web searching, target the metaphor, not the data

Web searching is all about metaphors. What was the model scientists used last time? Try to duplicate that pattern on the web. It is a give and take between real scientists and jerks like me who blog and pester.

Scientists respond to good blogs, especially on Wiki. If I search for what might be a solution to problem X, by modeling the solution to problem Y, then either I get hits or I get obscure scientific articles. Even with those, I generally get into the articles that are relevant. In my next search and on my blog I reference the article. The bots do the rest, the scientists say, close but no cigar, then they correct me, often with new entries in Wiki.

My readers know I rerun themes. I drop a theme when my own blog appears in the search hits. Then weeks or a month later, I return to search the original theme, looking for progress of any kind.

It is not just me who does this, it is a common method of semantic communication now taking place under the covers of the web. The whole thing is intermediated by, well, web bots. All the search engines constantly indexing pages. Sometime in the next year or so, we will be experimenting with bots that can suggest patterns, less programmed by click-throughs. Interesting stuff, I do not think we all appreciate how the semantic web has already begun.

Santorum planning to bailout his buddies with trillions!

Santorum said that he is targeting “real conservatives, Reagan conservatives.” Hot Air
Reagan, the originator of borrow and bailout. Reagan the Alzheimer's president. Ronald 'Deficits Don't Matter' Reagan.

Seriously, Santorum is going to get trounced by Obama, with me helping. Give the independent someone to vote for, quit throwing these religious psychotics at the independent. We have had enough with Perry, Santorum and Bachmann delving into the occult, kind of like voting for an Islamic nutcase.

There is no intelligent design among Republican voters

In 2001, Santorum tried unsuccessfully to insert into the No Child Left Behind bill language, which came to be known as the "Santorum Amendment", that sought to promote the teaching of intelligent design while questioning the academic standing of evolution in public schools. Wiki

Republican voters have become just another socialist group who couldn't find their goodies in the Democratic pile. Whenever their goodies are threatened, Republican voters resort to making the Constitution a bible. It is either 'Gimme my goodies or I will write the Book of Santorum (Perry, Bachmann) into the Constitution.'

All of this nuttiness arises because there is no entity called the 'vote'. The vote varies by an order of magnitude across the state boundaries. Pollsters, dumb as shit, dunno how to get good samples as a result. Politicians use the skew to lie and economists choose to ignore the problem because dictatorship makes their math easier.

What is Honorable about the Classroom?

My Phan, a senior at Monterey Trail High School in Elk Grove, says she can't get help from teachers before or after school. She and other members of the school's National Honor Society no longer can meet on Wednesday mornings because the teacher isn't in the classroom.

Read more here: http://www.sacbee.com/2011/12/31/4154061/some-elk-grove-teachers-are-working.html#storylink=cpy
This is a Sacramento story about teachers working only 7.5 hours a day and no more. Nothing wrong with that.

My question is for the National Honor Society. Why do smart people think time in a classroom is more productive than time on the information network? You know these kids are on the computer each night. What is it that an Elk Grove teacher knows that Khan, and thousands of PhDs and thousands of online materials, do not know?

Something's wrong here; this story seems too contrived, something the reporter misrepresented.

28 year old midget threatens South Korea

Kim Jong-un, who was declared Supreme Leader of North Korea yesterday, issued his first terrifying threat of war today.
North Korea may "smash the stronghold of the puppet forces" in the South in retaliation for "hideous crimes" committed during the mourning period for Kim Jong-il.
Read more: http://www.businessinsider.com/north-korea-hideous-crimes-morning-period-2011-12#ixzz1i6OZdu91
Why not, I guess.

Spanish voters getting a clue?

Facing a wider than expected budget deficit, Spain’s new government announced a $19.3 billion package of tax hikes and spending cuts Friday and admitted the picture was likely even worse than it appeared because of overspending by the country’s autonomous regions. NYT

We have a regional government called Fresno County. They make Spanish voters look like Einstein.

Deranged Egyptian Muslims are coming for me!

Muslim, Christians clash in southern Egypt
Someone drew a picture of the Child Molester and so deranged Muslims go on a rampage.

Friday, December 30, 2011

A programming addiction

I am convincing myself the database industry is headed to where it should go, and I think about programming something, just for the hell of it!

It's weird, the goal is to program just enough to establish proof of concept; after that I am done. But I have this addiction, an addiction just to start up the Linux file system and run some IO checks for various buffer sizes and see if I can recreate enough of a database file structure to test my triplet scheme. That's terrible! That is just goofing off, I resist.

BSON and SQLITE file formats

I am not sure how Mongo might be packing data onto disk. I did however look at SQLITE3 and its paging system for file store. It occurred to me, we are dealing with a single disk, generally, except in large data centers where disk striping would occur.

So the question becomes: if we have a single disk then data serialization is the best option, except maybe for one case, discussed below. What I mean here is that even though key strings are variable length, it helps nothing to store them in separate pages, as that just causes another disk seek. Conclusion: I think we can abstract a graph layer from the BSON approach; what they are doing seems just fine to me, in terms of graph convolutions.

What is the exception? Large binary blobs, like multi-media files. Does BSON intend to break these large binaries into BSON segments? Not likely, probably just treat them as large strings. I am not sure about blobs, they might want their own file pages, but it matters not since I am not messing with them and can worry about those later.

What would a machine disk 'node' look like?
Well, we need four bytes of object pointer; it defines the following set of nodes that constitute the descending object. And we need four bytes for the predicate, which I call the link. One byte of that for the graph layer and one byte for BSON, two bytes spare. Following these two int32s is the BSON data, say in chunks of 16 bytes. The only issue remaining is when the BSON data is a matchable key value; then the graph layer has to conform to a BSON string, no problem. Counting the size of the BSON data in units of 16 bytes is easy pointer arithmetic.
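Here is a minimal sketch of that node layout in Python, assuming exactly what is described above: a 4-byte object pointer, a 4-byte link split into one graph-layer byte, one BSON byte and two spare bytes, then the BSON payload padded to 16-byte chunks. Field names are mine.

import struct

CHUNK = 16   # BSON payload stored in 16-byte chunks

def pack_node(object_ptr, graph_byte, bson_byte, bson_payload):
    # One disk node: int32 object pointer, int32 link, padded BSON chunks.
    chunks = (len(bson_payload) + CHUNK - 1) // CHUNK
    padded = bson_payload.ljust(chunks * CHUNK, b'\x00')
    link = struct.pack('<BBxx', graph_byte, bson_byte)   # graph byte, BSON byte, two spare
    return struct.pack('<i', object_ptr) + link + padded

def payload_chunks(node_bytes):
    # Counting the BSON data in 16-byte units is easy pointer arithmetic.
    return (len(node_bytes) - 8) // CHUNK

node = pack_node(object_ptr=42, graph_byte=1, bson_byte=2, bson_payload=b'{"k":"v"}')
print(len(node), payload_chunks(node))   # 24 bytes total, 1 payload chunk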

Oracle exec wants to screw Jodie Foster!

Wait, neither of these are Jodie Foster! Never mind.

There goes Hungary

Ten Year Yields
They got their own currency, and the government just passed a law making it legal for politicians to hyperinflate. Austrian banks are getting the shaft.

Database Taxonomy discussion at Wiki

In writing the database articles for Wiki, the geeks are debating the attribute differences between an SQL database and a MongoDB database. The main issue is whether data constraints and regularity are imposed at the storage layer or the next layer up.

Savings in the news

The webosphere is talking savings:
Kling, Henderson, McArdle, Kotlikoff. They even talk about the paradox of thrift.

The point of saving is that the future looks more efficient than the present. But in channel theory we have no time, we have relative probability of transaction. Let's formalize savings a bit:

View the economy as a set of transaction rates i. To meet maximum entropy we should see the set of terms -i log(i), where the i range from fast to slow in a Fibonacci sequence. All the -i log(i) are within 1.0 of each other at equilibrium (this is the optimum flow condition).

OK, here then is the finite savings rule. If you expect some i to have a greater -i log(i) than some subset j of the rates, where j > i, then do less of the j and more of the i; trade wholesale!! Should I buy apples at the grocer or shares in an apple farm?

Savings is buying on the wholesale end (less frequent, larger sizes), consumption is buying on the retail end (more frequent smaller transactions).
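A small numerical sketch of that equilibrium test, under my own reading of the rule above: take a Fibonacci-spaced set of transaction rates, normalize them, compute each -i log(i) term, and check whether the terms sit within 1.0 of each other. The rates and the normalization are illustrative assumptions.

import math

# Fibonacci-spaced transaction rates, retail (fast) to wholesale (slow)
rates = [1.0, 2.0, 3.0, 5.0, 8.0, 13.0]
total = sum(rates)
probs = [r / total for r in rates]           # treat rates as relative probabilities

terms = [-p * math.log2(p) for p in probs]   # the -i log(i) entropy terms

spread = max(terms) - min(terms)
print([round(t, 3) for t in terms])
print("within 1.0 of each other:", spread <= 1.0)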

Political corruption dealt a blow in California

Supreme Court Upholds Decision to Eliminate Redevelopment Agencies
Yes, finally. Crooks from both parties have been stealing from voters with the scam.

Thursday, December 29, 2011

Saudi Arabia in a panic over the Iranian bomb

The White House Just Agreed To Sell Saudi Arabia The Largest US Overseas Arms Package Ever

It's a war of the Sheep, Shia vs Sunni. Too bad, I kind of liked Persia.

The Microsoft embedded query language

http://en.wikipedia.org/wiki/Language_Integrated_Query
int someValue = 5;

var results = from c in SomeCollection
              where c.SomeProperty < someValue * 2
              select new { c.SomeProperty, c.OtherProperty };

foreach (var result in results)
{
    Console.WriteLine(result);
}

No brainer: the graph machine can make this language happen over the web. But it is query only; no floating schemas in the data to manipulate, no graph layer, no named graphs. Great stuff, actually: build the query functions right into the language. Imagine if a graph of data contained its own instructions on unpacking and packing. There is a functional data/declaration language, k, that does this. The focus: abstract a graph layer, a container layer designed for high-speed mobility, then build everything on that. Within that layer, first-class objects are automatic. Where object code was mapped to memory, we want to lift that a bit, let the object code roam about as it works; hence the graph layer, it provides structured movement.
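For comparison, a rough Python analogue of the LINQ query quoted above, using a generator expression as the embedded query; the collection and the property names are just placeholders.

some_collection = [
    {"some_property": 3, "other_property": "a"},
    {"some_property": 12, "other_property": "b"},
    {"some_property": 7, "other_property": "c"},
]
some_value = 5

# the query sits in the language itself, like LINQ's from/where/select
results = (
    {"some_property": c["some_property"], "other_property": c["other_property"]}
    for c in some_collection
    if c["some_property"] < some_value * 2
)

for result in results:
    print(result)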

I start to get anxious to open up the debugger and screw around with the machine. But it has to be really interesting for me to jump in, and this is not quite there yet. Still, I like LINQ, and I think it can drive a ton of traffic onto the Global Wide Graph.

Oh, yes. Everything you can do in LINQ, you can do in R with the Sqlite3 interface; the 150 Essential Lines for R/Sqlite Integration, cut that from the page on the right, and you can make it do LINQ and more using R string handlers.

President Obamney

Say the polls.
Click onto the web site, then wait a few seconds, and this person, Megyn Kelly, starts yapping, real loud. I turned and damn near kicked my flat panel out of fright! Right out of the blue, yapping at me in my own home. Bad web practice; you have to sneak up on them. Sound serious, theoretical, then out of the blue, hit 'em with a curse word. But type it, don't yell it.

Abundant software

Software is abundant with the rise of browsers, HTML, and protected languages. I love it, but it is addicting. I can spend hours looking over all the open source projects. The stuff is getting more automated each year.

Get a shovel, start digging

As a way out of the economic depression, says James Hamilton. All about mining for oil, rare earths and shale in the USA. Carpe Diem has been on that bandwagon also.

Whoops!

Hungary's bond auction failed this morning, when debt management agency AKK rejected all bids for its three-year notes and took in reduced bids on the 10-year offering. Yields on 10-year Hungary debt soared to 9.70%, up from 8.78% last month. The nation's debt was downgraded to junk by Standard and Poors and Moody's last week.

Read more: http://www.businessinsider.com/10-things-you-need-to-know-before-the-opening-bell-dec-29-2011-12#ixzz1hvmIfKaK

Anti genocide is now global law

The issue came up: what is the global law regarding genocide? The answer, I think, is that global law accepts or requires action against it. I say accepted tradition: if dictators begin genocidal killing, they generally expect intervention, and there is a global custom that intervention is accepted.

The issue came up in the Ron Paul libertarian foreign policy.  I don't know what the libertarian tradition is formally, but my tradition tells me that when human custom begins to prevail on an issue, it is better to accommodate it with minimal cost.

Is it accept intervention or require it?
Hmmm...

Web languages and graph convolution

Consider the following:
@(JavaRoutine,Arglist)

I interpret that to mean: take a Java routine, construed as a graph, and convolve it with an argument list, and the result is a graph. I can change this a bit:
@(JavaRoutine(Arglist1,Arglist2,Arglist3,...))

My claim is that the TE above holds; that is, Java code can be construed as a convolving graph, one that tries to find matches in another convolving graph of arguments. The two are commutative. Normally we see the Java code as mobile against a fixed argument arrangement in memory. But the virtual machine is, in reality, a graph convolver.

The graph machine, when it sees a BSON script, matches it against the next element and runs the Statement, or SubQuery. Anything above the graph layer is construed as a query.
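As a toy illustration of what I mean by convolving one graph against another, here graphs are modeled as nested Python dicts and a pattern node starting with '?' binds to whatever it meets. This is my own sketch of the idea, not any BSON or Cray machinery.

def convolve(pattern, data, bindings=None):
    # Walk a pattern graph against a data graph, collecting variable bindings.
    bindings = dict(bindings or {})
    if isinstance(pattern, str) and pattern.startswith("?"):
        bindings[pattern] = data                  # variable node: bind and succeed
        return bindings
    if isinstance(pattern, dict) and isinstance(data, dict):
        for key, sub in pattern.items():          # every pattern edge must find a data edge
            if key not in data:
                return None
            bindings = convolve(sub, data[key], bindings)
            if bindings is None:
                return None
        return bindings
    return bindings if pattern == data else None

routine = {"who": "?name", "title": "engineer"}    # the 'code' graph
arglist = {"who": "alice", "title": "engineer"}    # the 'argument' graph
print(convolve(routine, arglist))                  # {'?name': 'alice'}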

Related notes:
I still lurk within the web confines of BSON, JSON, UBJSON and variants. They are getting real close, real close to something Cray XMT supercomputers could use. UBJSON is an attempt, and demands strict adherence to JSON. One point to the BSON folks: do not forget that JSON is our TE, our string version of nested store. But the BSON folks anticipate variable expressions in graphs, so we are almost there. I have also seen all four of our basic ugly set used simultaneously as descriptive and query!!

But still, my message. Remember: local sequences, local data, few lookups and very super fast sequencing through the expression tree. Fixed-format fields at the graph layer; keep the variable-length data separate until you need it; expose the variable types and operator overloads in the fixed field. I have called it the link, we can rename it the predicate. We need a node pointer; if worried about space, make it an int16.

The idea is to abstract the graph layer out of XBSON. The fundamental data can be matched and is organized as a descending graph among a set of descending graphs. That is, in subject/predicate/object, the subject is a key string associated with the Dot and Comma at the predicate. The object is the descending graph bound by the pointer. The graph layer also knows about named graphs; that is the only reason it would ever open up the subject.

Everything above that, XBSON can have, and really, have at it.

Wednesday, December 28, 2011

Public Capitulation Unions

Recent media reports have suggested that to solve the unfunded liability the state will have to increase CalSTRS funding by $3.8 billion a year for 30 years for a total of more than $114 billion.
Although this is an accurate statement based on current projections, achieving adequate funding can occur several ways that would be phased in over time. The CalSTRS $56 billion funding shortfall can be managed, but it will require gradual and predictable increases in contributions. Calstrs
HT Calwhine

Here in Fresno we knew that the day the County Supervisors stole the billion dollars, some ten years ago.

Adding another chart to a Beckworth analysis

What really caused the crisis, he asks. So, according to Beckworth, Ben failed to apply the monetary juice just when oil was hitting $140. How do we know it was an oil shortage? Nominal and real prices converged; in a particular week, oil became very short. Likely some economy went home without its allotment of oil that week.

A libertarian as head of Socialist DC?

Presidential hopeful Gary Johnson announced Wednesday that he’s bolting the Republican Party in favor of a long-shot Libertarian bid. WA Times
Libertarians, I think, should be seeking the break up of the USA into regional autonomous economies.

Krugman in Wonderland

People think of debt’s role in the economy as if it were the same as what debt means for an individual: there’s a lot of money you have to pay to someone else. But that’s all wrong; the debt we create is basically money we owe to ourselves, and the burden it imposes does not involve a real transfer of resources. NY Times
Translation, if we have enough public sector teachers brainwashing us into thinking we are one family, then yes, we might get a bunch of foolish Californians to pay a 25% premium on federal debt costs. The Keynesian lie.

Truth: We are not all the same. We do not even have the same amount of democracy among us. It doesn't help if bureaucrats in DC get to print their own salaries; in California you made it illegal for John Lockyer to print money. We are not all the same.

Tuesday, December 27, 2011

The open source client search model

I mean the poor slob typing into the search window. In his world he wants to click on an icon that does:
@(TodaysNews,ShortForm:BookmarkOntologies)
This gets him the shorthand version of the general bookmarks, listed as a linear graph, perhaps. Breakfast reading.

The named graph, TodaysNews, is something he and his local machine created over time, with click-throughs, keyword searches, and the occasional personal management of keywords via drop-down menu. It is the client's morning aggregator, and convolves with the client's general ontological set of information resources.

Underneath the client's browser are loads of synonym lists, expander graphs, and indices provided by his favorite sources, like Wiki. The client has sets of botlets, little optimizer graphs designed to hunt through raw keyword text, looking for patterns and associations.

The client is constantly tagging stuff found with external ad hoc attribute tags. So he has bots underneath constantly trying to keep these tags organized, sometimes thrashing.

Every business person has a personalized Watson system, tuned to his specifications, but created with graphlet downloads. He loves his system, constantly dragging and dropping graphlet sequences into search complexes.

On that disk striping thing

In the big machine, disk striping is likely a nested graph model, right? That is how most of these linked memory managers work. But that is exactly what we want to manipulate with the graph convolution syntax. So we consider the query optimization process as a mobile web bot. This thing can cruise through the Cray XMT semantic nest looking for high click-through counts, passing and collecting the keywords that get them, along with their predicates. The result is a better keyword index, grouping the highest click-through counts together, maybe.

Then you run the second web bot; it looks over the list and breaks it out into subgraphs based upon required striping patterns. So striping patterns become named graphs, subject to morphing. The result is high-efficiency bandwidth matching: click-through counts matching disk IO, matching nested store decomposition optimized in shared memory. You get flow with a minimal number of processing steps, but flow matched to client habits.
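A small sketch of what that second bot might do, in Python: take keyword click-through counts (made up here) and spread the keywords across stripes so the click load per disk stays roughly balanced. The greedy placement is my own simplification.

from collections import defaultdict

# hypothetical output of the first bot: keyword -> click-through count
click_counts = {"higgs": 950, "cray": 800, "bson": 400, "sqlite": 300, "meth": 120, "pavlov": 40}

def stripe_keywords(counts, num_stripes):
    # Place the hottest keywords first, each onto the currently lightest stripe.
    stripes = defaultdict(list)
    loads = [0] * num_stripes
    for word, hits in sorted(counts.items(), key=lambda kv: -kv[1]):
        target = loads.index(min(loads))
        stripes[target].append(word)
        loads[target] += hits
    return dict(stripes), loads

stripes, loads = stripe_keywords(click_counts, num_stripes=2)
print(stripes)   # each stripe becomes a named graph bound to one disk
print(loads)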

Threads, Java byte codes, DOM trees, CSV files, mallocs: all of them can be modeled as a graph of sets and descents, incorporated into an abstract graph layer with exposed node pointers and predicates.

Related, what is named graph transparency?

It means that at any level, when the client wants to search slightly beyond his norm, then the client may be slightly annoyed by named graph syntax.

Monday, December 26, 2011

Disk striping the Cray XMT

Cray has a disk striping layer, the ability to split data onto more than one disk allowing much greater recovery times. Disk striping should be part of the graph layer, for whoever gets the Cray job. We have a bunch of work to do on links and jumps in the graph layer, how is that formatted, what are the overloads, how does it become transparent to the client, yet robust and complex for the geek.

We will crack this problem in a week or two, hopefully someone besides me.

Other items on the agenda:

Having now rediscovered Pavlov, we are going to spend time looking at Pavlov stimulus response from a channel point of view. The responses are encodings of a channel, phase-locked to the stimulus. I think I have that right; we will be taking a look at the neuroscientists and entropy norms again, see if they are making progress.

What else?
We will be fleshing out the cure all meth web business plan, openly, in full view. We will calculate large market numbers, and see if we can kick start the thing.

Other things we let stew as the web bots mull them over.

Simple solution to the national meth problem?

In the war on drugs, the one drug with the greatest externality is meth; it creates semi-intelligent violent zombie drones. Perhaps there is a simple way to use the web and cure the problem.

Of all the cures, probably the cure with the highest success rate is to pay meth users for bona fide clean tests. A lot of theoretical reasons why this should work, though maybe expensive. The therapy was tested in SF regions, where Pelosi keeps legions of gay zombie voters. Many of them stayed off the drug when paid for clean tests. Many of them began to think clearly. So here is the plan:

Business Model:

Create a web site that can act as an intermediary between meth addicts and friends and family. Anonymously, family or friends can propose a payment for clean testing. The business therapists discreetly contact the user, and without naming the donor, offer the deal. If accepted, the business therapists will conduct and bond the testing procedure and report results.

Philanthropists who are concerned about thousands of zombie Mexican drug dealers marching north now have a mechanism to cut back the domestic drug demand. I bet Bill Gates, with a million dollars and a bit of Pavlovian research could cure a good 20% of the meth problem in central Fresno, meth capital of the world, the city with 20 thousand deranged zombies on our streets.

Pavlov was a long time ago, but I remember some of the stuff

Like, if a family member has a recalcitrant meth relative, Pavlov would say, discreetly offer the zombie free money for a test, even if it is not clean. But offer a bit of a bonus the cleaner it is. You use a Pavlov schedule to outwit the drug triggers that the zombie encounters each day. The zombie will begin to actually think: if I skip one trigger, pass up one hit from the glass pipe, I get a few dollars more. The Pavlovian schedule would try to optimize that utility by variances in the rate of return. So the Pavlovian system works like the economy, on the margins. It works on rats and politicians, if I remember, so it should work on meth zombies.

Think of the incentives from the zombie point of view.
The meth case can purchase the next batch of the stuff if they have money. But one sure way to get money is to lay off a day or two and show a reasonable test. Eventually we get to the point that the addict decides to skip the next meth cycle and double his testing profits, and bingo, we have him hooked on money.
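A toy sketch of the kind of payment schedule I have in mind: pay a little even for a dirty test, pay more the cleaner the result, and add a bonus for consecutive clean tests. The dollar figures and the cleanliness scale are invented for illustration, and a real Pavlovian schedule would also vary the payouts.

def payout(cleanliness, clean_streak, base=5.0, clean_bonus=20.0, streak_bonus=10.0):
    # cleanliness in [0, 1]: 0 = fully dirty test, 1 = fully clean
    pay = base                              # something for just showing up to test
    pay += clean_bonus * cleanliness        # more, the cleaner the test
    if cleanliness >= 0.9:                  # streak bonus for consecutive clean tests
        pay += streak_bonus * clean_streak
    return round(pay, 2)

streak = 0
for week, result in enumerate([0.2, 0.5, 0.95, 1.0, 1.0], start=1):
    streak = streak + 1 if result >= 0.9 else 0
    print(f"week {week}: test={result:.2f} pay=${payout(result, streak)}")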

A bit of good analysis by Cullen Roche

A Major Historical Trend That Predicts Recession In 2012

Read more: http://pragcap.com/will-the-shorter-business-cycle-lead-to-recession-in-2012#ixzz1heKwincL

Worth reading because they break down the economic cycles into a finite set of, well, patterns, Elliott waves maybe? But they call it two long cycles, one of about 30 years and one of about 4 years. The 30-year trend was declining prices due to economies of scale in global trade. The sovereigns used up all that good stuff with debt, using debt to fix problems along the way.
We think that the three ‘super-cycles’ between 1982-2007 were the exception rather than the norm and existed largely because of a near 30 year secular global decline in inflation that transcended the business cycle. This was perhaps caused by Globalisation and the reduction in cost pressures that it facilitated. We think that the Western authorities ‘maxed out’ on the benefits of this inflationary decline by pumping monetary and fiscal stimulus into their economies whenever they had an economic problem. Given the lack of inflationary pressures they had a rare ability to do this without the normal subsequent price rises.”

Read more: http://pragcap.com/will-the-shorter-business-cycle-lead-to-recession-in-2012#ixzz1heLcxlHC

Let me put this in context. The Chinese were funding their Republican socialism, and Republicans lied about it. All their crony capitalist friends got bailout money from middle-class taxpayers. The scam was masked by Chinese workers, so we had a de facto communist alliance between the two communist parties of the respective nations.

Who started the shit? Reagan, the Alzheimer's president. That idiot was a tool in the hands of communist bailouts, which continue to this day.

Can't get simpler searches than this!

Cray XMT™ System

The Cray XMT supercomputing system is a scalable massively multithreaded platform with a shared memory architecture for large-scale data analysis and data mining. The system is purpose-built for parallel applications that are dynamically changing, require random access to shared memory and typically do not run well on conventional systems.
Hey, Cray, get me mucho processors working with big chunks of shared memory, and, no need to hire me, we can independently push the standards group to get your software architecture for Cray; then Cray can have boom times. Seriously, if these things are what they claim, then Cray can win big in the semantic convolution system.
Multithreaded technology is ideally suited for tasks such as pattern matching, scenario development, behavioral prediction, anomaly identification and graph analysis.
I bet they found my blog and added the last item about graph analysis. They should go back and alter the description and say the thing is designed for next generation semantic pattern matching, for the WWW.


The programming staff here at Imagisoft would start debugging code if they had access to one of these. The market model here is that the BSON approach is going to yield combined data and variable expression streams built on the exposed nested store. It will be driven by a high-performance BSON expression evaluator. So, the way I do the g engine, the nested store itself is self-configuring, able to deliver expression sequences as well as semantic entity sequences, intermixed and properly nested with scoping rules that manage the pointer variable. That BSON code would run as the kernel in each member of a group of processors. Above this group of SMP processors is a Linux CPU, managing IO, mainly doing link jumps, but it is critical that it does well with disk IO. If we split SQLITE3, then the compiled queries run on the SMP processors, with generic table/URL management running in the Linux CPU. The entire system would be nothing but a graph layer, with BSON overloads triggering special node matches.

Cambridge Semantics would love this.


Graph convolution is a natural for SMP

Wiki has a good summary:
Mesh architectures avoid these bottlenecks, and provide nearly linear scalability to much higher processor counts at the sacrifice of programmability:
Serious programming challenges remain with this kind of architecture because it requires two distinct modes of programming, one for the CPUs themselves and one for the interconnect between the CPUs. A single programming language would have to be able to not only partition the workload, but also comprehend the memory locality, which is severe in a mesh-based architecture.[1]
A computer system that uses symmetric multiprocessing is called a symmetric multiprocessor or symmetric multiprocessor system (SMP system).[2][3] SMP systems allow any processor to work on any task no matter where the data for that task are located in memory, provided that each task in the system is not in execution on two or more processors at the same time; with proper operating system support, SMP systems can easily move tasks between processors to balance the workload efficiently.

Graphs are self-directed, nested form with forward pointers; they meet the SMP architecture. So we can conceive of an ideal graph layer. Read a whole burst of nested form into shared memory. No matter if it comes from one graph or a zillion graphs, as long as the graphs meet syntax.

Then get some microprocessors, doesn't matter how many, and have all the processors get access to shared memory. Each microprocessor executes the convolution function at the request of the data. The only thing we need to care about is making sure the threads of convolution coalesce into the proper output segments, more of an ordering problem than a shared memory problem.

I think we are looking for SMP processors with four, maybe eight, independent processors doing convolutions into and out of a huge chunk of RAM. Every so often, when a huge chunk of RAM has become free, we load in a huge block of nested store.

The issue of multi-processing has no context in self-directed graph convolutions. RAM bandwidth being about 10,000 times the disk IO bandwidth, naturally, one disk per eight SMP processors, and pile on the size of the shared RAM. The ultimate efficiency comes when you have enough processors to chew up RAM bandwidth, and then the disk rate and the RAM rate are equalized.
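A rough sketch of that pattern with Python's multiprocessing standing in for the SMP box: a pool of workers each runs the same convolution kernel over chunks of nested store, and results are collected in order. The chunks and the stand-in kernel (a node counter) are placeholders.

from multiprocessing import Pool

def convolve_chunk(chunk):
    # Stand-in convolution kernel: count the nodes in one nested-store chunk.
    if isinstance(chunk, dict):
        return 1 + sum(convolve_chunk(v) for v in chunk.values())
    if isinstance(chunk, list):
        return sum(convolve_chunk(v) for v in chunk)
    return 1

if __name__ == "__main__":
    # a burst of nested forms read into (conceptually shared) memory
    chunks = [
        {"a": {"b": 1, "c": [2, 3]}},
        {"x": [{"y": 4}, {"z": 5}]},
        {"k": "v"},
    ]
    with Pool(processes=4) as pool:
        # map keeps output order, so coalescing results is an ordering problem
        results = pool.map(convolve_chunk, chunks)
    print(results)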

SMP and simplicity engineering match here.

Sunday, December 25, 2011

Christmas picture of the year


So far. I am not looking far enough because anti-planner keeps winning!
Author unknown

New England, getting the government goodies

An example of the fiscal policy skew. Look at the Ceridian, the pulse of commerce index based upon transportation. Click regional view, go to New England. Ever since the recession and/or election of Obama, the economy has been run from DC with future taxpayer money, much of it expected from California and the big states. Who got the greatest growth? Traditional small-state New England, with about three times the Senate democracy of the average state. That is anti-democratic Senate skew. There are cross-entropy effects because California moderately outperforms when DC money is flowing. The Mountain states underperform with fiscal spending, it seems, and I am not sure why.

So, if you want to beat Obama, the thing to do is to tailor the message according to the skew. Tell New England that hard work and private enterprise wins. But tell the Mountain States that federal spending is draining their region. The same message is likely to work in all big states.

They are coming for me!!

Bloggers in Oregon, watch out. That’s because this month an Oregon court ruled that bloggers do not have same protection as the “media.”

This ruling emerged when Crystal Cox, a blogger, was accused of defaming Obsidian Finance Group and its co-founder Kevin Padrick on her blog. She posted that Padrick acted criminally in a federal bankruptcy case. Padrick sued and the court found that Cox was not protected under the state’s media shield law. This decision has implications for bloggers around the country. Cal Watch
Let me list my potential assassins for the Tourette Syndrome blog of the century.
So far I must be on the assassination list for the IRS, Fresno County Felonious Supervisors, Mexican Drug Cartel, Association of Islamic Belt Bombers, Republican Communist party, and now the Oregon media police!! I think the pope also has a hit out on me!

Saturday, December 24, 2011

Politician discovers anti-democracy

Rep. Hurt said he understands why Speaker of the House John Boehner (R-OH) agreed to the compromise, but he's unhappy with the Senate for not coming back to the bargaining table.

"I was in the state legislature for nine years, and never heard an objection to sitting down at a conference table and ironing out differences between the (State) Senate position and the House (House of Delegates) position,” Hurt said. “The consequences of what happened (Thursday) is the House position was totally made irrelevant." WSLS
HT Topix

Politicians are generally a delusional bunch of lying scoundrels, but so much for their good side.

Here Rep Hurt discovers a symptom but fails to count to ten. The Senate problem is that there is no democracy; look at the variations in Senate apportionment over the various states. Fair democracy does not make the Senate. Foolish politicians who think or expect there to be democracy are politicians unable to count.

Entanglement:
Rep Hurt is entangled; the Senate is the 100 folks there at the moment, and nothing more. That is entanglement, it ignores the huge anti-democratic process that enforces Senate skew. Entanglement is the means and mechanism for enforcing the entropy norm.

Deranged African Muslims

In hard-hit Yobe state, where at least 50 people died, the government ordered a dusk-till-dawn curfew following attacks by the sect known as Boko Haram. In Maiduguri, the capital of neighboring Borno state, bombs reduced at least three churches to rubble and raised fears of further attacks by a group that claimed Christmas Eve bombings last year that killed dozens. Huffington Post

A theory of infotech and productivity

An explanation related to information technology

Greenwood and Yorukoglu (1997) turn the argument about diminishing returns on its head, arguing that the slowdown resulted not from an exhaustion of technical possibilities but from the opening up of new ones, specifically, the introduction of information technologies (IT). The authors argue that firms and workers will take a while to learn how to use the new technology. For example, they point to David's (1991) analysis of electrification in America. Before electricity, factories used a single source of energy—typically steam or water—to power all the machines at once, using a system of belts and drives. They continued to use this single-power-source structure even after the advent of electric power, using motors to drive groups of machines. Over time, however, firms figured out that machines could be powered individually, leading to more efficient production processes; for instance, the production plans for one machine no longer had to take account of when the other machines were running.

Critically, during this period, when both firms and workers are learning what to do with the new technology, worker productivity is likely to fall below what it was otherwise. Thus, Greenwood and Yorukoglu (GY) argue that while new technology ultimately leads to higher productivity, the immediate response to the new technology is likely to be a decrease in productivity. FRB

This theory is but one of several in the whole article, and they discuss the productivity slowdown in the 70s, which we still don't get. More later.

Open Source getting popular

In August 2010 Jeffrey Hammond, the principal analyst at Forrester Research pronounced: "Linux has crossed the chasm to mainstream adoption." His declaration was based on the huge number of enterprises that had moved to Linux during the late-2000s recession. In a company survey completed in the third quarter of 2009, 48% of companies surveyed reported using an open source operating system.[6]  http://en.wikipedia.org/wiki/Linux_adoption
A big chunk of that server market is getting filled with MySql and Sqlite3 as well as a few other open source DB. What is Oracle up to?

More on Cray and the rest

Cray Inc. (NASDAQ:CRAY) shares slumped 11.15 percent to $5.50 in post market trading session. The stock has a 52-week low of $4.96, a 52-week high of $8.38 and $224.55 million in market capitalization.
This explains a lot. Cray is looking for a bigger business model.
Then this whoops!
Though upfront costs of commissioning a customized HPC system have largely been eliminated, the increased demand for performance results in astronomical rates of energy consumption. "When people talk about exascale computing, what they're really talking about is exascale at 20 megawatts," said Moore, going on to briefly explain that a comparable system operating today would use approximately 200 megawatts yearly at the prohibitive annual cost of $200 million. TechnTrend
So, we know why the web bots are moving to the North Pole to live with Santa.

Regarding the NVIDIA architecture, I probed far enough to read a paper on asynchronous message passing between cores. That architecture, independent routing between cores, makes it work. Each core has about 48k of memory, maybe sharable. Worth further thought, mainly about how a graph can be decomposed. The ideal application is an NVIDIA card prestored with components of a dictionary, then pushing through ontology streams, each GPU core collecting any matches in its section of the subgraph. Worth looking into further, likely a role for them to play.

But Cray is right to take a shot at this market. It is not traditional scientific computing, but it is highly scalable to supercomputer. If I were the CEO of Cray, my message is: it is IBM Watson vs Cray and a semantic machine. Cray will have to be ready and force this issue, but certainly there is room for Cray, especially as they are one of the few remaining supercomputer companies.

Hey, look at this


I can make a graph layer with this

Cray is looking for a Senior Semantic Web Engineer in Pleasanton, CA.
Qualifications for this position include: “Sound understanding and experience with RDF. Proficient in the SPARQL query language. Knowledge of Ontology Modeling and Description Logics. A minimum B.S. degree, or equivalent, in a technical field (Engineering, Science, Mathematics, Computer Science, etc.); advanced degrees desirable. Proficiency in the following: C, C++, multithreading programming, Java, Linux, shell programming, Semantic databases, web 3.0 applications stack. Analytical problem solving ability. A thorough understanding of vertical informatics applications and databases.” Semantic Web

The reason I quit the industry (aside from being a jerk) is that I find the process of convincing the hiring manager that I am qualified is actually slower than getting the job done via open source. So, what does Cray think it knows that is a big secret? Anybody's guess. I can get more done, at no cost and by stealth, with this blog and open source than with anything Cray has invented. We have the entire research record of convolution-style data queries going back 30 years.

You also know right away something is wrong, because you actually have to go to Pleasanton and stand in front of the big red thing. That's not semantic networking.

My best advice to Cray: push the BSON folks on a graph convolution, a superfast graph layer embedded into the RISC architecture. Once the BSON folks get moving, Cray can take their big box and claim to have the fastest semantic network available, open source, open standards; leverage the demand for this stuff from paying customers.

What if they are targeting me?
I know my ontology stuff comes up on their searches; they are in this huge business already. What can I do for them? Well, send me to the BSON committee with some cash and I can bribe and kick butt. The fact that I am being a jerk on Cray's behalf would certainly indicate to everyone that Cray will have the fastest semantic net available. But, again, with this blog, some occasional market and technical research, and detective work, I can get Cray what it wants. But at the same time I get Sea Micro what it wants; them's the breaks.

Look here is some architecture:
The XK6, announced on Tuesday, is made up of multiple supercomputer blade servers. Each blade includes up to four compute nodes containing AMD Opteron CPUs and Nvidia Tesla-architecture GPUs. It marks Cray's first attempt to blend dedicated GPUs and CPUs in a single high-performance computing (HPC) system.

NVIDIA, the same crew that screwed up my graphics driver on Windows 7. I could look at what Nvidia is doing. Maybe they can build a triplet flow architecture and move a few million of them per second. Parallel processing is what NVIDIA is pushing. Does this work for a graph layer? Dunno yet; it depends on how fast we can get in and out of each processor and how much cache each one has.
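Some rough numbers on that, with an assumed triplet of an 8-byte key, 8-byte predicate and 8-byte pointer (my layout, nobody's spec):

TRIPLET_BYTES = 8 + 8 + 8                      # assumed key + predicate + pointer
rate = 5_000_000                               # a few million triplets per second
bandwidth_mb_s = rate * TRIPLET_BYTES / 1e6    # ~120 MB/s, well inside a PCIe slot
per_core_cache = 48 * 1024                     # the ~48k per core noted above
triplets_per_core = per_core_cache // TRIPLET_BYTES   # ~2,000 resident triplets
print(bandwidth_mb_s, triplets_per_core)

So the wire is not the problem; the per-core working set is the tight constraint.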

Getting close to the semantic web

Simple Sloppy Semantic Database (S3DB) is a distributed infrastructure that relies on Semantic Web concepts for management of heterogeneous data. This distributed data management system was first proposed in 2006[1], following the argumentation the previous year that omics data sets would be more easily managed if fragmented in RDF triples.[2] That first version, 1.0, was focused on the support of an indexing engine for triplestore management. The second version, made available in October 2007, added cross-referencing between triples in distinct S3DB deployments to support it as a distributed infrastructure. S3DB
I will check into this in more detail.

Republican Communist Party Watch, McCain, Romney edition

Arizona Sen. John McCain said the coordinated attacks were proof that the U.S. is, "paying a very heavy price in Baghdad because of our failure to have a residual force there. I’m deeply disturbed by events there, but not surprised."

Meanwhile, White House hopeful Mitt Romney also agreed with McCain's assessment, telling reporters on the campaign trail, "The president's failure to secure an agreement and maintain 10,000 to 30,000 troops in Iraq has to be one of his signature failures." Big Talker

Let me get this straight. The middle class is supposed to cover $800 billion/yr in defense because another nation 12,000 miles away might upset some Communists in America?

Anybody doubt Eric Holder's citizenship

Eric Holder has blocked South Carolina’s voter ID law. Hans von Spakovsky and I have been predicting this was going to happen for over eight months here at PJ Media. The only surprising thing is that no halftime adjustments were made after it became even clearer an objection was on the way. Texas now faces the same dilemma. Sadly, I’m not convinced Texas understands the battlefield or the stakes involved.PJ Media

There are two types of black people in the USA, the ex-slave families and Jamaican potheads. Some of us believe that Obama's father was a third type of black citizen, but I dispute the findings. Obama's father was an unemployed Italian pizza chef in Honolulu.

Back to the point. I do not think black people who are ex-slaves have anything to fear about voter ID; their citizenship is in the fucking Constitution. So, when I hear progressives claim that most black people are too fucking stupid to fill out a form, I look at Eric Holder and think, they may be right.

Republican Communist Party Santorum edition

Executive orders Santorum favors include banning federal funding for embryonic stem cell research and restoring conscience clause protections for health care workers. If a pharmacist did not want to fill a prescription for, say, the morning-after pill he or she would not be required to do so. Santorum would also ban military chaplains from performing same sex ceremonies on military bases.
Congressional directives include advocating for a personhood amendment to the constitution, a highly controversial initiative — a fetus would be considered a person at the moment of conception and could not be  aborted.

He is like Rick Perry, fairly dense. I mean, the issue here is the impending bankruptcy in DC. The last thing we need in DC is another screwed up father figure telling us when to fuck and when to eat.

He goes on with the usual crap about how the middle class needs to have a humongous military so wealthy American businessmen get an edge.

Republicans, socialism for another group, Santorum, Communist idiot.

Friday, December 23, 2011

Economic banking jargon

Macro and Other Musings reports on something called the safe asset shortage (bogus in my opinion). Francis Warnock summarizes the problem; he says:
To supply the world’s risk-free asset, the country at the heart of the international monetary system has to run a current account deficit. In doing so, it becomes more indebted to foreigners until the risk-free asset ceases to be risk-free. Here
Let me translate this brilliance with a metaphor. If you pour water into a glass, it eventually fills up. OK, let's go on:
a point made recently by David Andolfatto:
[G]iven the huge worldwide appetite for U.S. treasury debt (as reflected by absurdly low yields), this is the time to start accommodating this demand. Failure to do so at this time will only drive real rates lower.
This is a difficult concept; let me help. If you empty the glass of water, eventually the stream dribbles down to nothing. Then he says:
Both the Fed and the ECB need to return aggregate nominal incomes in their regions to their pre-crisis trends and do so using a nominal GDP level target.
OK, why are pre-crisis risk premia better than post-crisis risk premia? Supply and demand both meet; what business is it of DC bureaucrats to decide on another price/supply equilibrium? The only thing I can figure out is that some buyers of safe assets are friends of DC bureaucrats and need to be satisfied.

I have news for economists, about 40% of voters in the USA could care less whether rich bankers and Congress control the price of safe assets.

Drug wars in the news

Carpe Diem wants us to read Richard Branson on drug legalization in Portugal.
The paper, published by Cato in April 2011, found that in the five years after personal possession was decriminalized, illegal drug use among teens in Portugal declined and rates of new HIV infections caused by sharing of dirty needles dropped, while the number of people seeking treatment for drug addiction more than doubled.

It has enabled the Portuguese government to manage and control the problem far better than virtually every other Western country does.
Well! CATO says socialized drug therapy works!
Following decriminalization, Portugal has the lowest rate of lifetime marijuana use in people over 15 in the EU: 10%. The most comparable figure in America is in people over 12: 39.8%. Proportionally, more Americans have used cocaine than Portuguese have used marijuana.

The Cato paper reports that between 2001 and 2006 in Portugal, rates of lifetime use of any illegal drug among seventh through ninth graders fell from 14.1% to 10.6%. Drug use in older teens also declined. Lifetime heroin use among 16-18 year olds fell from 2.5% to 1.8%.

New HIV infections in drug users fell by 17% between 1999 and 2003.
There are a lot of cross correlations that are ignored, but for a preliminary look this seems like good news.

Has this worked in California? No, public sector unions got on the drug counseling bandwagon and screwed up the process. The other thing: California bureaucrats get power from distributing drugs, so they make heroin substitutes available only from government-run drug houses. That is turning out to be a disaster; just let the doctors prescribe.

But otherwise, California has no choice, we can no longer afford to deal with drug addicts in any event. I don't think we have a solution anymore, except to warn the straights and stay out of the drug neighborhoods.

This is Fresno on meth

"Mean looks" between groups of teens Tuesday night sparked a confrontation that led to the shooting deaths of two teens, Fresno Police Chief Jerry Dyer said Thursday.Two other teens -- Jarrad Beard, 19, and his 16-year-old brother -- were arrested after the shootings. Dyer said the brothers, who were booked on two counts of murder each, are gang members. Beard was booked into the Fresno County Jail, while his brother was booked into the Juvenile Justice Center.Before announcing the arrests, Dyer expressed his condolences to the families of shooting victims Justin Hesketh, 18, and Brandon Moore, 16, both of Fresno.Hesketh and Moore were not associated with any gang, police said.

Read more here: http://www.fresnobee.com/2011/12/22/2658951/mean-looks-led-to-fresno-slayings.html#storylink=cpy

If you are a 19 year old kid from Fresno, you have already been tagged as a potential zombie creation of the Mexican Drug cartel. My advice to Fresno parents? Get out of Fresno with your family, get up to Clovis or somewhere that the local government is not overrun with Mexican drug importers.

Sea Micro can make graph machines

Sea Micro
The company is designing computers that require less space and power than conventional machines yet perform just as well. The idea is to fill a gap left by Hewlett-Packard Co. (HPQ) and Dell Inc. (DELL), which have curtailed research and development, SeaMicro Chief Executive Officer Andrew Feldman said in an interview.
What do customers want?
Instead of buying powerful computers designed for the complex needs of a few internal users, cloud-service providers want servers to handle millions of customers. That means they want a much bigger array of cheap, efficient machinery, he said.

They need to make a Linux box with simple optical connects to other Linux boxes. Stack 'em up, as many as needed. The graph layer will make everything happen for the cloud customer.
Feature list from their site:
768 1.66 GHz x86-64 cores
1.28 Terabit interconnect fabric
Up to 64 1 Gbps or 16 10 Gbps uplinks
0-64 SATA SSD/Hard disk
Integrated load balancing, Ethernet switching, and server management
Uses less than 3.5 kW of power
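Quick arithmetic on that spec sheet:

cores = 768
power_kw = 3.5                               # "uses less than 3.5 kW"
fabric_tbit = 1.28                           # interconnect fabric
watts_per_core = power_kw * 1000 / cores     # about 4.6 W per core
gbps_per_core = fabric_tbit * 1000 / cores   # about 1.7 Gbps per core
print(round(watts_per_core, 1), round(gbps_per_core, 1))

That is exactly the kind of cheap, efficient array the cloud providers say they want.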

If you want cloud applications, then you need a graph layer. The standards group is going to have a gestalt on this in about five days, watch.

A sign of the times


An unprecedented surge in internet shopping this Christmas has caused chaos for retailers and parcel delivery companies, who are struggling to deliver orders on time and admit they were unprepared for the sharp rise in demand. FT

Deliverbot is arriving, likely next year in gated communities.

If your payroll service is run by dolts, call me.

The National Payroll Reporting Consortium, representing those who process paychecks, said of the two-month extension passed by the Senate just days before the new year: “There is insufficient lead time to accommodate the proposal,” because “many payroll systems are not likely to be able to make such a substantial programming change before January or even February,” thereby “creat[ing] substantial problems, confusion and costs.” NRO
There are better open source solutions that come with intelligent programmers; I can find them using my human web search techniques.

Make cash illegal?

Prime Minister Mario Monti, in office just over a month, wants landlords, plumbers, electricians and small businesses to stop conducting large transactions in cash, which critics say helps them evade taxes. The government on Dec. 4 reduced the maximum allowed cash payment to 1,000 euros from 2,500 euros. Bloomberg

That's some solution to tax evasion!

Thursday, December 22, 2011

Externally assigned attributes, once again

Here at Imagisoft, I convened our imaginary staff and we drew up a statement on how to manage external properties assigned to pre-existing objects in the graph.

The client can only attach external properties to named graphs. The named graph function will maintain the reversible link between the named graph and the external property. Named graphs have to be promoted to an external index.
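A minimal sqlite3 sketch of what that reversible link might look like; the table and column names are mine, not anything the imaginary staff signed off on:

import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE named_graph     (graph_id INTEGER PRIMARY KEY, graph_name TEXT UNIQUE);
CREATE TABLE ext_property    (prop_id  INTEGER PRIMARY KEY, prop_name TEXT, prop_value TEXT);
CREATE TABLE graph_prop_link (graph_id INTEGER, prop_id INTEGER,
                              PRIMARY KEY (graph_id, prop_id));
CREATE INDEX link_by_prop ON graph_prop_link (prop_id, graph_id);
""")

# The primary key covers graph-to-property lookups; the extra index makes the
# link reversible, property back to graph.
db.execute("INSERT INTO named_graph VALUES (1, 'MyGraph')")
db.execute("INSERT INTO ext_property VALUES (1, 'color', 'red')")
db.execute("INSERT INTO graph_prop_link VALUES (1, 1)")

print(db.execute("""SELECT graph_name FROM named_graph
                    JOIN graph_prop_link USING (graph_id)
                    WHERE prop_id = 1""").fetchall())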

So, if the rest of the group agrees to this, then maybe I can kick our programmer in the ass to debug a line or two of code.

A little Greecey

Rhode Island communities face more credit-rating cuts as the local economy declines, property values plunge and pension liabilities rise, Moody’s Investors Service said, citing sharper trends than in most states.

The growing cost of retirement benefits is “reaching a crisis point” for many of its local governments, Moody’s said in a report released today. …

“A lot of national negative trends are particularly acute in Rhode Island,” Naomi Richman, an analyst at the credit rating company in New York, said by telephone before the report was released.

The trend for 2012 is “likely to favor downgrades,” and there will be “few if any upgrades,” Moody’s said in the report. It also warned that a state oversight program, set up in June 2010 to help municipalities facing financial pressures, is untested and may be overwhelmed by multiple simultaneous requests for assistance. Moody's

If you read the article you find pensions a bit sticky. Keynes would say sticky government wages are good, look at Greece and Italy.

Attributes vs Overloads

I should set my terminology before I mix myself up too much. From now on, overloads are overloads on the predicate; they actually make the link field overloaded. The domain over which the overload appears is the proper object for the overload. An attribute is something a client tags onto a bound object. The object may or may not know it has been tagged; I dunno, looking at that now.
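A toy way to keep the two straight in code; the flag bit and the side table are my inventions, just to show where each thing lives:

from dataclasses import dataclass

OVERLOAD_FLAG = 0x8000          # assumed spare bit in the link/predicate field

@dataclass
class Triplet:
    key: str
    predicate: int              # overloads live here, in the link field itself
    pointer: int

def overload(pred):
    return pred | OVERLOAD_FLAG

# Attributes live off to the side: a client-only table keyed by the bound
# object, which the object never has to know about.
client_attributes = {}

def tag(obj_key, name, value):
    client_attributes.setdefault(obj_key, {})[name] = value

t = Triplet("bedpan", overload(0x0007), pointer=42)
tag("bedpan", "note", "client-only annotation")
print(t, client_attributes)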

So are MongoDB and Freebase, by the way, looking at this issue this very moment. The industry will solve it. But externally tagging an existing graph segment with a property is important to the client; it has to be solved.

A multiplier greater than one? Not

The population of the District of Columbia is growing faster than that of any state in the country, according to a new U.S. Census report that shows an acceleration of a trend in which largely skilled and educated workers have flocked to the city’s resilient local economy and its well-paying jobs connected to the federal government. WA Times

What is the logic here? The Keynesians would say that moving a lot of people to DC to run the machinery better will make the whole economy better. But that is a two-step process, numbskulls, just the process Keynes deplored. People moving to DC because we have a shortage of bureaucrats? That is not a high multiplier.

The war: Shia vs Sunni

Baghdad Blasts Kill 57 as Political Tensions Rise

Believe it or not, the Iranian, Syrian and Iraqi regimes are coordinating their push. Dunno what the human race can do about the Middle East; I think it is a passageway out of Africa, and that is a difficult position. The price of oil will rise very soon as the Sunni vs Shia thing begins to target oil.

Wednesday, December 21, 2011

Ontologies, named graphs and links

Normally our searches undergo a process like:

@(@(MyGraph,OptimizerGraph),GraphLand)

The major search engines take a sloppy TE list of key words and run it through the graph optimizer, which finds two or three orderings for the words and adds a few synonyms. The resulting optimized search graph is then convolved with G land.
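A hand-wavy rendering of that pipeline, treating '@' as nothing more than "expand or filter the left graph against the right one"; the toy dictionaries stand in for the optimizer and for G land:

def convolve(terms, graph):
    # Walk the graph, keep every edge reachable from the current terms,
    # and pick up whatever new terms those edges introduce.
    out_terms, out_edges = set(terms), []
    for a, rel, b in graph:
        if a in out_terms:
            out_edges.append((a, rel, b))
            out_terms.add(b)
    return out_terms, out_edges

MyGraph = {"cheap", "bedpan"}                        # the sloppy TE keyword list
OptimizerGraph = [("bedpan", "synonym", "bed pan"),  # local click-thru knowledge
                  ("cheap", "synonym", "discount")]
GraphLand = [("bed pan", "sold_by", "acme"),         # the big indexed world
             ("discount", "applies_to", "acme")]

terms, _ = convolve(MyGraph, OptimizerGraph)         # the inner @
_, results = convolve(terms, GraphLand)              # the outer @
print(results)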

In the expression, all three graphs are named graphs. How do named graphs and URL jumps coalesce? A named graph is either a node index into a local table or a node index into a remote table. How do nested stores maintain the distinction, and do URL jumps always require a named graph? Is there special indexing used for named graphs?
Dunno, but a few posts and I can drive the industry toward a consensus on this. It is a big problem right now with everyone doing semantic nets and convolution models.

One thing I can add: the table is not known to the client, who knows only graphs. A table is an internal construct, mainly because my graph layer is sqlite3. Graph layers will, in general, have to have an internal construct, like a table or a collection, something the query system can latch onto. We now know with certainty, I think, that the table will at least be a complete and correct nested store.

So let's define the general philosophy of meaning and look ups in ontology networks.
The idea is a series of convolutions of a search graph, transforming it from a less informative structure to a more informative structure. In the example above, the philosophy would state that the query optimizer is a local ontology network, familiar from past user keywords and click-thrus. So the query optimizer takes the search graph through a revision process, translating loose TE into tight TE based upon local knowledge of the client. The result is sent into G land, where the data structures may be very square, very well indexed, and using very proprietary search methods. Do not lose meaning once it has been captured. Under different clients, bedpan may refer to products, or it may refer to a set of very badly designed URLs; who knows what ontologies the client has created with his key words? The local optimizer knows.

What is my underlying assumption?
The graph layer and BSON layer support web bots, bots that operate within G land and recognize meanings via click-thru statistics and keyword sets. Consider attributes. On my local machine I have enough indexing space to cross a thousand URLs with a thousand attributes. But this attribute list is unavailable to anyone but me, easy to do. The web bots are the individuals who determine whether an attribute link set should be promoted: promoted up the net so other clients can access it, and promoted into square schema wherever possible.
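A sketch of that local cross index, with a bot-style promotion rule; the threshold and the example URL are made up:

from collections import Counter

cross_index = Counter()              # (url, attribute) -> click-thru count
PROMOTE_AT = 100                     # assumed promotion threshold

def record_click(url, attribute):
    cross_index[(url, attribute)] += 1

def promotable():
    # The attribute link sets a web bot would push up the net, or into
    # square schema wherever possible.
    return [pair for pair, clicks in cross_index.items() if clicks >= PROMOTE_AT]

for _ in range(120):
    record_click("http://example.com/bedpans", "product")
print(promotable())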

The Freebase node

In the [Freebase] quad dump, every line is a freebase node. Think of a node as an atomic fact. A fact in this context is a single proposition that relates an entity (an object) to a predicate.
The entity is represented by a mid, a machine-generated id, which is in the first column (the "source" column). The source column always contains a value.
The predicate is represented by at least two, and sometimes three of the remaining columns.
The second column is the "property". Values in this column are names like "/type/object/name". These represent a particular kind of quality of the entity mentioned in the "source" column. Like the source column, the property column always contains a value.
The value of the property is held in the remaining columns ("destination" and "value"). Depending on the kind of property, either or both the destination and value columns have a value. Also depending on the kind of property, a single property name can appear multiple times for a particular mid in the source column. In this latter case, the property is multivalued or represents a 1:m relationship with a set of other entities.
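Just to make that layout concrete, a minimal reader for it; the tab separation and the empty-column handling are my assumptions about the dump format:

from collections import defaultdict

def parse_quads(lines):
    # source (mid), property, destination, value -- one fact per line.
    facts = defaultdict(list)
    for line in lines:
        cols = line.rstrip("\n").split("\t")
        cols += [""] * (4 - len(cols))               # pad short rows
        source, prop, dest, value = cols[:4]
        facts[source].append((prop, dest or None, value or None))
    return facts

sample = [
    "/m/0abc\t/type/object/name\t\tEarth",            # value column only
    "/m/0abc\t/astronomy/planet/star\t/m/0sun\t",     # destination column only
]
for mid, props in parse_quads(sample).items():
    print(mid, props)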

Well, I can tell what's going on. They don't have a real graph layer. The subsegment bounds are hidden in node values and have to be extracted to use them. With no graph layer they will have to adopt triplets, or scrap the technology. Gotta have a real graph layer or you're gonna get lost.

Here are some general instructions: The node has to expose a node pointer so the graph machine can get at node graphs without opening any string up. The predicate is broken up into the graph layer and the BSON layer, with two available bytes at the moment. The graph layer never opens the key value. So we have the key, predicate, pointer; the general triplet is more useful to the graph layer than to the humans, and engine designers have to think about machines talking to each other in triplets. Any compound object, like name/value pairs, should be composed of triplet sequences; the predicates will indicate what's going on. Do not have two variable triplet formats; it really screws up the graph layer. Generally the graph layer does not look up symbols, but with the named schema that might change.
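As a sketch of such a fixed triplet record, with field widths that are mine alone (an 8-byte key hash, 2 + 2 bytes of predicate split between the graph and BSON layers, an 8-byte pointer):

import struct

TRIPLET = struct.Struct("<QHHQ")   # key hash, graph-layer bytes, BSON-layer bytes, pointer

def pack_triplet(key_hash, graph_bits, bson_bits, pointer):
    return TRIPLET.pack(key_hash, graph_bits, bson_bits, pointer)

def graph_layer_view(record):
    # The graph layer reads its two bytes and the pointer; it never opens the key.
    _key, graph_bits, _bson, pointer = TRIPLET.unpack(record)
    return graph_bits, pointer

rec = pack_triplet(0xDEADBEEF, 0x0001, 0x0002, 42)
print(graph_layer_view(rec))       # -> (1, 42)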

Loops in the process

Setting up GitHub, and git, the source code control. Their access points: command line, GUI, and an instruction set from GitHub itself. If you work all three setups simultaneously, you end up with two incompatible usernames! Loops in the process: complexity engineers create them, and simplicity engineers make gazillions removing them.

Anyway, following various instructions, I likely have three projects, two usernames, and three code repositories, none of the repositories having code! Loops, unemployed complexity engineers create them. Having created this mess, I now likely have the keys, which I never wanted, mixed up, making it impossible to coordinate my local git with GitHub on the web.

Complexity engineering makes marketing managers. Rather than sit in Silicon Valley listening to complexity engineers, I would prefer to market simplicity, and let free competition create it.

OK, so I have now deleted my GitHub account; I will now delete everything I have done with git, and restart. This is query optimization: search the blog posts of hundreds of complexity engineers, then find the optimum path via an inner join of all their git setup instructions. I can do this in byte code mode now.

Murphy's law keeps on going. So on a restart, what stuff do I want to keep in my repository on the local machine? Dunno. Since git installed some defaults and I screwed up some usernames, I have no idea how to reverse things. The alternative is to decode the Typeface Explosion of Unix commands, do the git commands, then forget them immediately. Or erase the entire contents and reinstall the local git hub. Or just leave it the way it is.

What happens if I screw up again? I am still forced, sooner or later, to go relearn Unix typeface for the umpteenth time. Yes, just uninstall GitHub and let someone else worry about the issue.

The Freebase API

http://api.freebase.com/api/service/mqlread?query={"query":[{"id":null,"name":null,"type":"/astronomy/planet"}]}
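For reference, here is the same planet query built up and encoded from Python instead of pasted as a raw string; the response handling assumes the service hands back JSON with a "result" list, as the mqlread documentation of the day described:

import json
import urllib.parse
import urllib.request

mql = {"query": [{"id": None, "name": None, "type": "/astronomy/planet"}]}
url = ("http://api.freebase.com/api/service/mqlread?query="
       + urllib.parse.quote(json.dumps(mql)))

with urllib.request.urlopen(url) as resp:
    planets = json.load(resp).get("result", [])
for planet in planets:
    print(planet["id"], planet["name"])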

They go back to the string form; they should ship things around in their BSON form. Inside the 200 square foot brain at the North Pole are thousands of Linux boxes shipping billions of graphs per second among each other.

What is Oracle going to do? I will do a bit of research. But Oracle has adopted SQLite3 and is a big member of the sqlite3 forum. Hmm... maybe Larry Ellison has sent his detectives digging into Google's trash and knows something. I doubt it; I doubt Freebase knows that much. But you can see the basic fail in Google's plan: the graph layer extends right into the client's machine; there is very little to prepare a search, mostly freehand search words. It is the web bots, underneath, who create a BSON query, web bots executing BSON expressions. Doing so, they filter the key words through the client's own query expander.

The problem here is that Freebase waited ten years to develop a technology that, at the time, was 20 years old; it even predated computers. This whole episode is all about getting standards to catch up before big companies patent the prior art.