Friday, November 30, 2018

Consider enterprise management or software upgrades

The new flexible shell is opaque: the developer can deploy a command and control library for the deployed application or hardware, and get a clean interface to the standard GNU power shell.

New graphics driver? Release the lab utilities in a library and the user can execute them easily enough. Or write a complete application management system: just expose your control utilities with the simple API interface and it is fully shell compatible, and you get the flexibility of full duplex paths over the arg list. It is going to be much easier to bring the power of the command line to the user.

We will add, as an option, a char based GUI, an upgrade from Ncurses, designed over XCB, universally available anywhere in the linux world. A simple character windowing code. Developers and deployers know the user sees a basic format for command and control on the console, and the user gets a systematic response across all subsystems. A whole lot about linux, and marketing linux products, will get easier.

I posted a link to the complete shell prototype code on the right. It includes a rudimentary symbol table; it is incomplete and completely untested, a first pass. About 100 lines of code in that prototype replace about 50 lines of code in the shell, a fair trade. I leave the cheap symbol table as is, low priority. I implement only the c functions I need for testing join and report the results. The open source community should do the rest. Not much work, no new header files, standard c, keep the whole shell under 600 lines; it is a cut n paste and fully standardized in the c and linux community.

All character based, add another 400 lines of c code. Everyone will use this, or a competitive copy. It is the packaging:
First cut n paste, second gcc shell.c, third ./a.out a

And you are connected: all the help, command and control, special diagnostic libraries, distributed management systems, driver managers, all under the standard linux c macro shell, portable.

Something broke

The ten year yield has been steadily dropping, the one to ten slope at 31 basis points. Something went bonkers in developing markets or in Europe; some groups are running for the safety of ten year yields. This is oligarchs running from risk.

My c language parser

After looking at two or three simple c interpreters, I stole some code and got a simple parser down to 200 lines of code. But it is a parser only; the few hundred lines of code that execute the c code are separate, I just parse it.

I used the mass approach. I grabbed a list of every actionable token in the c language, and check the source token against that list. If this is a key c language token, I get a tag ptr and push it on a stack. The tag ptr tells me the string representation of the action and a ptr into the source token that triggered it.

If the source token is not on the official list, then it is a variable, and goes in the symbol table.

Then pass two is a simple switch statement:

Idiom * process_tags(Idiom * tag) {
    void * arg_list[20];
    int argc = 0;
    while (1) {
        if (tag->str == 0 || *tag->str == '}') return (tag + 1);
        switch (tag->form) {
        case null:   // Not implemented, or pass through
        case unary:  // ^ a
        case binary: // + a b as a cmd sequence
            arg_list[argc] = tag->str; argc++;
            break;
        case command:
            if (*tag->str == ';' || *tag->str == 0)  // The cmd sequence is complete
                tag = ExecCommand(&argc, arg_list);  // ExecCommand calls the generic cmd list
            break;
        case cast:   // Already in the symbol table
            break;
        case pair:   // Process internals until done
            tag = process_tags(tag + 1);
            break;
        case variable: // Already in the symbol table; falls through
        case integer:  // Becomes an argument to pass, or a value in an expression
        case string:   // ditto
            arg_list[argc] = tag->str; argc++;
            break;
        }
        tag++;
    }
}



OK, notice one thing: in my syntax the ';' char causes the switch to immediately call ExecCommand. A command of the form:

cmd arg1 arg2 ..; That process has hundreds of lines of code, all of them unwritten. Here is my object code for while:
while arg1; 

Test the argument, then continue if true, return from the routine above if not. The decision is made in the command handler for while, and it returns the continuing tag pointer into the stack. There is some code for that. Also there is code that needs to compute expressions, none of it written, and any of it I do insert will be mostly stolen.

My point is, ExecCommand, that routine is a command handler; it can do ls, cd, even gcc, as well as while, for, >>, << and so on, a unified command handler with all arguments being:

ExecCommand(int *argc, void *args[]);

In other words, the standard linux and c argument list grammar is my object code format. The parser above will pass through all the undefined tokens as long as they fit a proper cmd sequence form.

Well, actually, the parser above does none of that now, it is untested. But we can see the key idea: combine the cmd sequence grammar and scripting grammar in one parser. We can get this in under a thousand lines of code, get the basic c control structure, and add command implementations as we go to get more functionality. Making that argument list full duplex and typeless gives great flexibility down at the subsystem level.

Plus, the idea that this is opaque to any application underneath. A very powerful concept, an adaptable layer that can compete and win over PowerShell. It is really a layer, slightly more than a protocol; thin, easy to deploy. The customer will load the dynamic lib for any subsystem he needs to manage. The customer gets the standard command format,

VerbAdjectiveNoun arg1 arg2 ..;

The light bulb clicks because, hey, we all already agree on what a command sequence looks like; all the existing utilities have adapted to the linux argument format. So the pros can write the sophisticated spaghetti to get in and out of linux, then just connect the user to his system with a load lib on the shell. All the subsystem commands become part of the macro shell language, and the pros can pass arguments down the chain, returning well formed results using the arglist format.

Macros and functions?

Do them both. Macros are already easy, just insert the expanded script in place. Functions are set on an arg_list, a separate one or even the same one. They use the same parser, but a separate arg list. The cmd format is a great format; it frees the developer to write whatever spaghetti is desired, and always the same interface.

Two year 2.81 five year 2.84

Three basis points/year is all one gets for tying up their money for an additional three years. Investors will take the more liquid position.

The guts of my macro capability

Here on GitHub, Kai Sun wrote this: a thousand lines of code, but it treats simple c syntax as a scripting language. Why not dump this into my shell? Then all the macro facilities will be simple c like loops and tests. The code actually removes some of my spaghetti and replaces it with new spaghetti, so the total load on bloat is about 800 lines.

This parser has the capability of variable definition; I can modify that slightly to define variable interfaces like the argument list, like data storage. The interpreter is independent of any object code, and unknown cmds need not be errors if they obey the command syntax. When the interpreter sees something like:

if(true)  gcc -c joinloop.c;

That statement in normal c generates an undefined for gcc; there are no operators on the arguments, everything fails. But nothing stops me from modifying the language to include the SPACE as a valid operator, separating arguments which are a new fundamental type. One or more spaces is a binary operator that concatenates undefineds into the argument list. Any correct command line sequence is valid script syntax; undefineds in that context are first class vars.

Easy changes, and remember, this is not meant to be a c interpreter, just a macro expansion capability, and users are aware of the argument grammar; they are experienced PowerShell users.

This code is well written, and there are others.  Can I reformat c code into linux argument syntax?

The syntax gives us type flags as in:

gcc -c join.c -o join

Good enough to reformat a while statement, for example:

while(a) { do code }

becomes

while -var a;  // test a, then interpret the enclosed statements, exit on the '}' char

This is a just in time compile, the while statement expanded properly as soon as it is reached.


I use the flag to identify types as in

var    - in the symbol table
const - immediate value
local  - index into arg_list

Note: At run time the macro shell actually has all arguments in an array, essentially a linear array of tokens which we can call the executable format.

Then add binary and unary cmds for expressions. Thus, we can intermix cmd, args, and macro control in the same format; just obey the context specific grammar. Context specific means the cmd may re-interpret flags in the current cmd sequence: gcc -c uses the flag -c differently than any other command sequence. Nor is any cmd restricted in how local variables are identified; it can keep its own set of variable names, or make assumptions about all its arguments. The contract is between user and cmd; the shell does not participate in context specific grammar within the cmd line.






I like the Windows Power shell

I have not looked far into it, but I get the basic concepts and keep the framework in mind.  We need a GNUPowerShell, and I give it these specific requirements:

  • Combine standard argument syntax for commands with scripting controls to make one general parser that produces command line sequences of the form: cmd arg1 arg2 ..;
  • Deliver arguments in a full duplex argument list of type void * argv[]. Upgrade arguments to first class variables so that the command can be subclassed while keeping most of the call structure.
  • The shell maintains its own environment variables. Subcommands are free to access system variables.
The shell is disconnected from subsystems; it is just a full duplex channel. Hence, leave it in c: the inheritance is informal, the code base is not likely to exceed 1500 lines, a single package, easy to compile. Then use the dynamic loader to get specific subsystem commands into the system, as needed. Dynamic loading works fine here because the call interface does not change; older versions of a sub shell still function at the shell level and problems are easily diagnosed.

Existing commands can be easily adapted at the subsystem level. If they cannot accept the call format, then the subsystem easily reconstructs the format it needs. But most commands need to decompose the arguments into system calls anyway and would prefer the arguments pre-parsed by the shell.

My shell is mostly an incomplete version of this, just enough implemented to get me through testing join.  Join needs a full command and control channel, no other option here. The GNUPowerShell idea needs to move forward as an open source project.

The typical American ID has been hacked many times

(Reuters) - Marriott International (MAR.O) said on Friday hackers stole about 500 million records from its Starwood Hotels reservation system in an attack that began four years ago, exposing personal data of customers including some payment card numbers.

And we will be hacked many times more under the current system. The inherent failure? Customers never should have given Marriott their specifics in the first place. The reason our IDs get stolen is because we give them away to organizations like Marriott. Instead, we need to have our banker bot just give them the public key needed to complete the sale, and keep the secret key hidden by the banker bot.

My smart card proposal is simple, but fails in one regard. We continue to have laws disallowing our possession of honest bots watching our stuff; the smart card is illegal. When the NSA wants secret keys forbidden, they actually desire to peruse our stuff. Those terms are unacceptable and violate basic rights. Techies need to deploy the smart card functionality; they know it.

The big tech companies fear the political correctness crowd. At the heart of Silicon Valley is the desire to control our identities, it is in their politics and their software.  Everything built around selling our stuff to others.  We can fix this, outside of Silicon Valley, it is not something we can do in the Silicon Valley political atmosphere.

David Mitchell at the Guardian discusses the issue:
I do realise that, if we’re going to have online banking, customers have to take some responsibility for keeping their money secure. If you start putting your passwords on Facebook so friends can help you remember them, banks are put in an impossible position. Then again, if you find the online world impersonal and bewildering, there is no longer a realistic option of banking in the old-fashioned way – of having a personal contact with a bank employee, in a branch you can walk to, to whom you can hand your money and who will hand it back only to you. We’re all forced to engage with internet and telephone banking, with all their possibilities for fraud, primarily because it’s a cost-efficient way for banks to do business. All those high street premises, all the cash and cash machines and UK-based staff created huge overheads. But it doesn’t seem right that the banks benefit from all the cost savings made by going online, while customers take the hit for the consequent ease with which money can be stolen.

Passwords should be generated and held in secret by our hand held bot. Even I should not know my password; the smart card has NFC, it can log in for me. It is, in general, impossible to have humans protect identities while on the open web. Hence, put the bot in charge, keep humans out. As long as I have a valid thumb print, the bot can operate safely on my behalf.

What key was revealed?

The key revelation of Michael Cohen’s new guilty plea is this: Justice Department Special Counsel Robert Mueller is one step closer to showing links between Donald Trump’s business interests in Russia and his conduct as a candidate for president.
The guy sold hotels and ran for president.   Truman was a haberdasher, Carter a psycho peanut farmer, Bushies professional welfare bums, Reagan brain dead.

Was Trump's economic policy rigged to make it easier to sell hotels abroad? Absolutely, we voted for that policy, it was public news, the very heart of his campaign.

More competition!

Acumos Project's 1st Software, Athena, Helps Ease AI Deployment


IBM has an AI platform also, as does MS, Google, and the rest. At the heart of all these things is a kind of 'join' system with various grammars for finding paths through the spaghetti.  I should work with these folks, Acumos is Open Source.  But I fear bloat, I fear the case where everyone has their own analysis tool jammed into the system making it difficult to sort the good and bad.

My best bet is to plug away slowly and post the files.  Soon I will have competitors trying to architect the same idea, incremental grammars operating on structured data.

I will win, I have no bloat. The core join program has less than a thousand lines of code, before grammars are loaded. It is not a GUI framework, it is a machine framework, designed to be crammed into an AI co-processor. Developers are free to add the GUI that supports their needs, but a python application built over the join machine is the goal.

The join system is built around LazyJ, which is an executable form of data and execution in one format, based on comma separated files, like JSON. By adding grammar specific operators to join, the LazyJ syntax is still usable; it is context adaptable and works on high and low level data structures, like templates. LazyJ is relatively undefined in terms of its search, replace and delete capability inside the join. The added functionality I add to LazyJ is all experimental, ad hoc. I fully expect the pros to get a complete LazyJ definition soon. Anyway, I have the winning architecture.

Thursday, November 29, 2018

What did I do on the join today?

Brought the memory module up to snuff by adding the second indexing mode.  Not too bad as both indexing models share the same fate with respect to Mach, and the other LazyJ ops.  Hence, most of it is shared in that module except for a few lines doing Step, Skip and Append. 

That is the core; the other grammars are all dll loaded, optionally. Thus the dynamic loading on demand simplifies the update process: the core remains stable while attached grammars go through their revisions. The core consists of the join loop, the executive cursor manager, and the memory module. The shell just starts it up. So those four modules are a much smaller task.

Updating files in a day or two on memory.c

And a damn good plan it was

Sater told BuzzFeed News today that he and Cohen thought giving the Trump Tower’s most luxurious apartment, a $50 million penthouse, to Putin would entice other wealthy buyers to purchase their own. “In Russia, the oligarchs would bend over backwards to live in the same building as Vladimir Putin,” Sater told BuzzFeed News. “My idea was to give a $50 million penthouse to Putin and charge $250 million more for the rest of the units. All the oligarchs would line up to live in the same building as Putin.” A second source confirmed the plan. -BuzzFeed
It didn't work out, Trump got elected.  Using ill gotten gains to build hotels is as good as any other use for the money.  Economically, the bigger money is building housing within the range of the Moscow middle class, the volume * price product should be highest across the median class.

More to the point, marketing American hotel brands is an OK business, not necessarily all on the up and up.

Smart card concept in linux

In a promotional video not dissimilar from the sleek ads released by tech giants such as Apple, Coinvest unveiled the Coinvest Vault – which is designed for storing digital assets. The company says the hardware wallet provides a “one-of-a-kind security architecture” and supports the transfer of hundreds of different assets – and a special emphasis has been placed on simplicity and user experience.
The Coinvest Vault runs on the Linux operating system and is equipped with a 3.5-inch LED touchscreen display, and is furnished with a USB-C connection that can be used for authenticating desktop and mobile devices. According to the company, the hardware wallet’s screen – along with its authentication chip – can “protect consumers even if the host computer is infected with a virus or malware.”

A linux in your pocket!  This looks like an iPod, and really will work.   Linux is fully a distributed architecture in that development groups can remotely agree on a transaction contract layer.

The full linux is a bit of overkill; we need my idea of a simpler, character based linux for embedded 'accounting devices', devices managed by bots which view the world as arrays and directed graphs of ints and strings, not bitmaps. Designed to run on an accountant's character based screen. Then, since it is single user by security of design, all of the Spectre fixes work fine, super threading not needed. The smart contracts protocol utilizes instruction cache control to implement shortest path to resolution with trusted miners, and that makes a pure, liquid bearer asset.
2018 was year one, my prediction is true. Most of this is driven by what we know from Spectre. The entire fintech industry 'gets it'.

This gets my vote:

Coinvest argues that entering the crypto world has been a bewildering prospect for the public to date – with thousands of coins, hundreds of wallets and multiple exchanges vying for their attention. Its website explains: “We aim to lower the barriers to investing and liquidity by providing simple, accessible solutions to all audiences.”

This company is founded by a bunch of MS alumni. Linux needs to agree on a contracts protocol. It needs to be constructed from a small, closed set of atomic transactions, operations on account. The grammar specifies a parse tree made of these: finite, no loops, a top to bottom directed graph, the format for the protocol.

The protocol acts like a script, executing kernel secure transactions, all consistent within the instruction cache. The protocol has an extensive, off line 'proofing' industry, and ongoing, online protocol police bots monitoring accounts ex-post, the trusted miners. So the protocol defines confirm points, and defines clear exit and swap points within the cache for security clean up.

But it is, in essence, a special cache mode, likely built into linux at the kernel layer and connected to the secret key capability of the processor. The result is a scripting grammar executed by the kernel inside the closed instruction cache, the kernel in charge, connected to processor secret key management. The proof here is 'proof of only path available': all counter parties locked in the cache, their trusted miners give the go ahead, and nothing leaves the cache until all parties reach another check point in the protocol.

In this manner, we can guarantee that the cost of scofflaws cheating the trusted miners is bounded, the honest miners gaining advantage in tracking down scofflaws, ex post. The protocol includes auxiliary methods for crypto badges and prequals. This is all easy stuff for the linux community.

The protocol looks like a standard set of spreadsheet functions, the contract a rectangular balance sheet. Generate these from the NASB list, automatically; look at Quicken and GNUCash, both of which will be participants. Given the device above, we can imagine a Quicken contracts manager which shows the user all the automatic updates to his sheet, in real time. Quicken or GNUCash pros could lead the charge, full integration with Coinvest.

Dumping the bitmap in hardware wallet

GUI is bad in hardware wallets because the bots do not talk GUI, they talk plain text. The cards will not work for the user if we have to translate between GUI and contract script. All contracts will be plain structured text. Accounting is built upon well defined monetary entries; the human user needs to see the plain text operate in raw contract form. We are not masking anything, and GUI in this app is a mask. The card is designed to talk to other bots while the user watches. Keep the GUI at the personal desktop or laptop, where applications live.

Key insight
Spectre was not a bug; it is a proof of completion protocol for instruction caches, the key innovation that makes 'shortest path to completion' work, which enables trusted miners. It is kernel expensive to swap the protocol in and out of cache, securely. But we don't care: the trusted miners take over at check points, swapping out will be the norm; keep the swap secure. We have inverted the bug. Malicious threads have become trusted miners. Swapping costs have become secure checkpoint waits.

A system block diagram

I tried out LibreDraw, figured on making a block diagram of the join system.  I also did quite a bit of software yesterday, after working my tax form.

But, the point is, the shell. It knows nothing about what is going on; all it does is generate command sequences of the form:

cmd arg1 arg2 ....NULL

all elements of the valid command sequence are encapsulated in a NULL terminated array of pointers and called with:

int Entry(int * argc, void * argv[]);

The argv is, in essence, the program, and argc the program counter, all kept in shell space.

The cursor manager will partially execute those cursor commands that need management, but otherwise passes the command and control to the grammar specific layers, as in:


// go through the list of attachments
// and pass the command down
while (devices[i].entry) {
    if (devices[i].entry(argc, args) == SUCCESS)
        break;
    i++;
}

Then the command and control interface has inheritance properties, making configuration much more systematic. Further, the subsystem grammars may alter the argv list, and otherwise use it for both input and output, making command and control a full duplex process. And the nice thing: if the new macro capability is not called out, it should default to bash syntax.


Initialize:

I have a simple rule: the first shared object loaded from the shell becomes the executive, and is the only module called by shell. All load requests are followed by the command sequence Init, using:

argv[0]="Init";
entry(argc,argv);

which is passed down the line to the specific grammar attachment that was loaded.

So the system maximally uses inheritance, informally, in both the command path and the run path. The shell is kept completely in the dark about any of this, just preparing command sequences from the script with no knowledge of their function. Shell does not even include headers from any particular subsystem, and could be managing an apple farm for all it cares.

This works because the bash macro grammar and linux argument list standards are well accepted, so shell can just duplicate that functionality in one parse routine. That exposes the extra functionality of treating the arguments as first class variables without destroying old style functionality.

The added advantage: each command handler needs to break out the argument list anyway, so why not have the macro scripter do it once at the start.

 Code status:

I am slowly updating the files listed on the right side of this blog.  Most of this architecture is in place in the lab, but I hesitate to post all the files as I cannot say I am ready to expose a production beta.  The files on the right will not compile together, at the moment, and should be read as prototype examples of the production beta changes.

A conspiracy to build hotels

According to the Hill, Cohen's alleged lies stem from testimony he gave in 2017, when he told the House Intelligence Committee that a planned real-estate deal to build the Trump Moscow Hotel had been abandoned in January 2016 after the Trump Organization decided that "the proposal was not feasible." While Cohen's previous plea was an agreement with federal prosecutors in New York, this marks the first time Cohen has been charged by Mueller.
Trump's attorney to plead guilty to lying to Congress. He lied about a hotel project in Moscow, and this was and remains Trump's crime business, building hotels and selling rooms to oligarchs.

Trump was never in a position, intellectually or otherwise to collude on American policy, he simply didn't have time or knowledge to make that deal at the moment in question, the hectic months before the election.

Wednesday, November 28, 2018

About that Ruskie conspiracy

Two things.
Mueller is in a huff about something.
The other is someone claims to have seen the Manafort guy visiting Assange.

The second is unconfirmed, consider the first.

Who knew about Ruskies having the Hillary goods?  This is July before the election, Roger Stone is trying to get a pal to get a pal to go see Assange, no Ruskies involved.  Whatever Roger did, it had nothing to do with any policy change purchased by Putin from Trump. Roger was still working Assange rumors.

On the other item, the Manafort guy visiting Assange, I doubt it. It would have come up much sooner, and it is just a rumor from an anonymous source.

I am not sure that folks living in a free society can be charged for being gossip mongers on the internet. I enjoyed Roger's antics; it is entertainment, hardly a criminal conspiracy.

The obstruction of justice charge was about Manafort laundering Ukraine money, nothing to do with e mails. So Trump did not use his authority to impede Mueller, he used his power to impede a bunch of other stuff.

Mueller has the goods on hotel deals, most of that prior to the political victories. Damaging stuff, likely illegal, but having nothing to do with e mails.

BTC and the cryptos got a 13% boost overnight

Is this anticipation of the Powell speech? Bitcoin doing its job, hedging the central banks. Then BTC leads all the cryptos higher; they trade in pairs a lot. We get CB fiat into the BTC market, and a portion of that leaves blockchain for other cryptos, actually a kind of overflow, a market outlet.

We want to harness the energy, courtesy of Treasury. When BTC jumps, we want it flowing into managed contracts, investments in people doing work. The difference between flowing into other cryptos and flowing into work contracts is minimal, one more rank in the transaction graph. All of it transported via the same crypto base verification system. Grow beyond simple currency counters.

Treasury getting killed

Powell Sees Solid Economic Outlook as Rates ‘Just Below’ Neutral

Look, deposits exit seeking the one year treasury at 2.67%, nice number and as I pointed out earlier, the curve slope represents the cost of liquidity, quite low.

Treasury is getting killed, hit by the double whammy: rates up and QE sold off at the same time. The central issue: the Fed thinks Treasury has suffered enough for a while, and doesn't want Trump to invert the curve.

It was an unfair move, raise IOER and reverse QE at once as that is not reverse order.

Back to taxes

I screwed around too long, the union thugs are at the gate. No more posts for a while.

The architecture of a personal contract manager

This stuff is years old.  But it is a simple, secure, small spreadsheet function in our ATM card.  It operates on the standard personal accounts schema, and contracts have access to secure macros (or dictionaries) that they can check for compliance.  Contracts become scripts on the secure spreadsheet.

Use the same layout to manage keys held privately in the card. Keep a few secure key macros that can generate public keys of various sorts, under contract.  Simple stuff, but it requires counterfeit proofing and monitoring by trusted forensic accounting miners on the network. It works, it brings everyone equally close to the currency issuer, it enables things like generic stock issuance from groups of individuals.  Techies need to break the rules here, get the prototypes out and deal with NSA later. Just keep transacted message protocols short, to the point so there is less space for cheating and terrorizing.

Chain contracts

Open software collaboration made easy. Each party uses their smart card for login and releases, and work performance is monitored by the smart card contract. Remuneration back to the developer is assured at a later date, depending on product significance. Philanthropists can offer personal loans to the poor, and return on deposits for same. Hand held, used everywhere from purchase to driver license to employment ID. All really based on a simple thumbprinted agreement to execute specific scripts over our personal spreadsheet for a variety of contracts.

Easily extended to handling remote power of attorney: we agree to allow a bot to trade remotely for us, done through the checks and balances of the smart card and limited to contracts represented as convolutions over short finite spreadsheet structures. Handle all forms of secure login via its NFC interface, carry our photo ID.

Family budget agreements, shared rental agreements, group contracts of all sorts, including buyers clubs. Distributed credit apps that the corner grocer can use with his measurable customer base. Put in the contract what the corner grocer knows about customer ability to make promises. Transaction costs dropping everywhere yield a rich industry of distributed, contracted credit applications. The ability of like minded individuals to make contracts across the globe would be unprecedented.

What is the format?

Comma separated CSV files, a subset of LazyJ. CSV files have been working for multi sheet, complex accounting for years. We are not that complex, or need not be that complex, for the handheld ATM card. And we generally need encryption on transmissions. Otherwise, there is enough functionality, way more than enough, in the CSV grammar to meet all the known contracts or use cases. The macro operators over the ATM sheet are approved by the NASB. Ownership contracts, employment contracts, all abstracted from the SEC, and so on. Open up a huge CSV industry; it blows this wide open with remote bot operations, allowing power of attorney to operate hot wallets at the pits.

Marketable security?

The definition means 'no real asset is exchanged'.  What is exchanged is a legal contract by a counter party to provide goods or currency in the future. The emphasis is on personal contract; a pre-qualified agent promising to deliver creates liquidity.

The stock market buys and sells personal contracts; all buyers and sellers know the stock represents a uniform balance sheet structure with stability guaranteed by the personal employment contracts of the employees. The stock market has liquidity because it has a well layered software format, actually: the standard corporate spreadsheet.

All of these 'liquidity standards' are subject to becoming computerized, layered contracts, monitored by our select trusted miners. Being based on standard personal contracts, we have a provable finite graph transaction; the corporate spreadsheet is, yet again, a step or skip directed graph and all exit states can be measured by queuing in real time.

All of this is subject to simple automation with networks of linux processors.   And it is happening, except for one thing.  The small player has a right to prove their reliability, in proportion to their cash flow.  We need a device to take with us, a device that can prove our compliance with low transaction costs. It is not fair to leave the technology only to the wealthy; techies have a market and moral issue here, they need to promote autonomous, private key holding, provable bots, handheld contract managers.

The topic is arising in central banking again: what should the central banker buy and sell?  It really wants to measure the willingness of dollar holders to obey the contract, live with budget balance, their baskets optimally full.  That is the measure of future investment uncertainty, the bounds on currency risk.  Any currency issuer really needs the individual agents to hold savings to loan balances, which it can observe by allowing individuals to hold accounts if they meet reserve rules.  So the Fed always has the best measure of our willingness to make promises about the future at a uniform risk level in measurement.  This keeps currency risk bounded; it is, in effect, the current consensus and a shared cost.

Auto html in join

The html emitter and LazyJ search and replace can generate the basic boundaries, but it need not go directly to press; it can move to a memory index and be further joined with a javascript layout manager. The CIO works with his information warriors to impose coordination between generic header and subheader, upon which can be written anchors, selectors, forms, all via the interposition of javascript control of the base html tagged structures.

Call the highest level of join in the stack the layout grammar, and it can spin right out to the user browser, serially, as the final script is composed. The presentation layout is done by a human with an editor, which abstracts the javascript compositions and widget forms into a template to the layout grammar in join.  Join can run freely, generating timely summaries of the web to the user in the proper form.  Any request from the user is recovered, made self consistent (paths resolved), then dumped to the local join machine.

Tuesday, November 27, 2018

No longer pretending to be smart

"I'm doing deals, and I'm not being accommodated by the Fed," Trump said. "They're making a mistake because I have a gut, and my gut tells me more sometimes than anybody else's brain can ever tell me."

Likely a helluva gut also. What he means: the Fed should start up the QE thing, the sooner the better.   I hear a bit more anxiety these days, like interest shock.

Wayland and XCB

I looked closely at this stuff, and the two are compatible.  xcb is a protocol generated from X, automatically, with adjustments to remove the synchronous waits.  Works great.

To make xcb interface to Wayland is the reverse process: use the auto generated headers to make the graphical object templates needed for the client API. I think this auto-reversal was on the minds of the xcb developers, as there is way too much redundancy in all the interfaces; most work fine on the generic interface structures. But by capturing it all with individual definitions, in name only, they capture a complete reversal, and xcb users can be unaware.  If they follow the complete definition, xcb code will adapt to wayland with little work.

So who will generate any html?

The bots will, if they read the stuff and present summaries to their owners.

Well, bots generate html pages, simple enough with the html output emitter on join; couple that with LazyJ select and replace syntax and we get nice web pages.  Bots do not generally scrape their own html output; they have already collaborated in LazyJ, they have all the plain structured text in their semantic graphs.

HTML authors still have jobs, they control the html grammar using controls on the HTML emitter.

A theory of graph coloring

A while back I was posting Huffman trees onto my 20 line Ncurses display.  They ended up filling the console window with the nodes of the tree spread properly, relative to each other, on the plain grid.

Consider the banker who has a Huffman tree of depositors. Put that on my 20 line Ncurses grid, and also put the loan tree on the same grid. The banker's job is to make the market by filling in her chits, on either graph, such that the covered grid is 'white', the two spectrally matched, each equally imprecise in sampling the other.  Neither will meet the Shannon sampling theorem.  But they will have a bounded mismatch if they are sphere packing somewhere.

We can show this puzzle breaks down to the standard vertex coloring problem of graphs.  Our solution implies an equal density, or consistent density, over the grid.  The two colored vertex patterns will nearly balance, and there exists an extended graph with links between opposite colors, almost. We work the problem backwards: we find the combined graph that expresses a two coloring, almost.  This is like Taylor approximation in graphs. I can do it because Huffman codes have a spatial density property; the more precise the tree, the more dense, and density spreads as this is a balanced queuing process, a sphere packer.

I think if we scale and round we can fit them both to the same grid and derive interest charges from the scale while returning the round back to individual accounts. Going to three color, I am not sure the process is commutative, I ain't thunk it yet.

The total subject regards central banking treasuries unmatched with private sector depositors. Off white is a cycle in the graph coloring business; a mis coloring in a graph is a loop shorter than the number of colors. It is spectrally off white, meaning the grid sizes aren't matching and you have sample aliasing, as in sampling theory. That is the very definition of a cycle.
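The two coloring claim can be made concrete: a graph is two colorable exactly when it has no odd cycle, the 'loop shorter than the number of colors'. A minimal sketch in standard c (the function name and the 16 vertex limit are mine, for illustration only):

```c
#include <string.h>

/* Check whether a small graph is two colorable (bipartite).
   It fails exactly when an odd cycle exists.  adj is an n x n
   0/1 adjacency matrix, n <= 16; a sketch, not production code. */
int two_colorable(int n, int adj[][16])
{
    int color[16];                    /* -1 = uncolored */
    int stack[16], top;
    memset(color, -1, sizeof color);
    for (int s = 0; s < n; s++) {     /* cover every component */
        if (color[s] != -1)
            continue;
        color[s] = 0;
        top = 0;
        stack[top++] = s;
        while (top > 0) {
            int u = stack[--top];
            for (int v = 0; v < n; v++) {
                if (!adj[u][v])
                    continue;
                if (color[v] == -1) {
                    color[v] = 1 - color[u];  /* opposite color */
                    stack[top++] = v;
                } else if (color[v] == color[u]) {
                    return 0;         /* odd cycle: mis coloring */
                }
            }
        }
    }
    return 1;
}
```

A triangle (odd cycle) fails, a square (even cycle) passes, which is the white versus off white distinction on the grid.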

Consolidating scripting in linux

I have noticed something about the general format of command switches:
gcc -o out x.c -I.

OK, selected switches at random, but in gcc ordering is important, and some switches modify the function of previous switches.  The switches are presumed to be processed left to right. Command line switches in linux are a sort of scripting language, informally defined.

But, command shell, bash, has scripting:

n=3
while [ $n -gt 0 ]
do
  gcc -c $n.c
  ((n--))
done

Or something like this, I never wrote a while loop in bash until just now.

For a complete macro shell, I see no reason the two scripts cannot be intermixed into a single grammar allowing:

gcc -o while[n] do %n.c ((n--)) done

Which is different from the one above.  This one compiles the whole set of arguments all at once. But it demonstrates the basic idea.

Further.  Arguments can be delivered as an array of string pointers, a perfect instruction format for scripting languages.  This allows the macro pre-processor to expand macros along the way, my shell does this. The steps then become:

Text strings convert, line by line or ; separated, into an array of null terminated strings.  The strings are pre-processed on the fly, expanding macros and generating the array of argument pointers.  But we can add argument switches such as:

command %1, where %1 expands into args[1], the first argument in the argument list.  Thus command sequences can carry over arguments, and modify them. We also get some variable control for output arguments, commands sending results back to the requester.

All the commands may be idiosyncratic, having nothing to do with each other, but they share a generic grammar regarding the argument list.  Thus the whole script grammar is lifted up, above the actual command function.
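The steps above can be sketched in standard c. This is my own simplification, not the actual shell code: make_args splits a command line in place and expands single digit %N references against a previous argument list:

```c
#include <ctype.h>
#include <string.h>

/* Split a command line in place into an argv style array,
   expanding %N references from a previous argument list.
   A sketch of the macro shell idea; real code would also
   handle quoting and multi digit references. */
int make_args(char *line, char *argv[], int max,
              char *prev[], int nprev)
{
    int argc = 0;
    char *tok = strtok(line, " \t");
    while (tok && argc < max) {
        if (tok[0] == '%' && isdigit((unsigned char)tok[1])) {
            int i = tok[1] - '0';       /* %1 -> prev[1], etc. */
            if (i < nprev)
                tok = prev[i];          /* carry argument over */
        }
        argv[argc++] = tok;
        tok = strtok(NULL, " \t");
    }
    return argc;
}
```

So after "gcc x.c" runs, the next line "command %1 -v" picks up "x.c" as its first argument, which is the carry-over the consolidated grammar wants.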

Anyway, this issue came up in trying out the simple macro shell I wrote, a lab version really.    But in the join system, I have a stack of pending join instances.  I need array and loop functioning within the command arguments, especially when referring to cursors; there are lots of those.  I need to do a bit of iteration, not much more than bash, but inside the argument list, within a single command.  Then maybe reset the counter and repeat with another command.

The consolidated grammar would fall back to normal bash if it is not used. This consolidation adds nothing, really, it just takes advantage of the similarity of argument lists and bash macro sequences.

Regarding the AI model:

In my AI model, the bots are really this scripting layer, which could also be python.  But that script operates a massively parallel system. Whether a trading bot, a search graph, or text readers, the bot will be spawning script and other subsets of the problem, as they are all construed to be directed graphs and segments thus parceled out freely.  My entire enterprise text reader will be written in this scripting variant; it will just condition the source text via a series of intersections with word lists from the word smiths. Hundreds of differing word lists matched against the daily web news. A problem not much more complex than make, it just executes more iterations. Make sits on top of powerful compilers; reader bots sit on top of powerful join machines. Like gcc is multi-lingual, the join bots have access to tens of different grammars on op,key pairs.  BerkeleyDB is the natural database structure and needs to be installed as a grammar attachment.

I do not see the architecture changing much for S&L operations, nor do I see any conflicts with the spectre standard.  Join is something we might want a co-processor to do. I think the key is plain structured text, and bots have dictionaries to deal with special join set up and control.

Why open source linux?

It is the open source that is working.  Open source projects that work are projects in which most of the developers have applications in mind; they need a distributed architecture.  Like xcb, it is really a protocol, almost automatically generated from X itself.  Its success is because many others had applications in mind, they needed a better layering.

Snap is another case, a generic binary package format which is isolated from system specifics, a multi-application protocol.

Linux machines are everywhere in the network because of the distributed development and application protocols. It has always been about collaboration.

Like this join layer, not much there in the join loop; it works because of the attachment concept and the ability to layer specialized grammars on top of LazyJ with thin code.  My application is reading plain human text. But I need joins everywhere to collaborate, to parcel out the day's reading for the entire enterprise.  I can't make my application until the other apps appear in collaborators' minds, and they want a join loop everywhere for their grammar. At that moment we get a consensus on what the base join grammar looks like. Then in my case, wordsmiths can start generating word lists, specialized matching lists they sell to corporations to ferret out the minute details of their market.

The cost of liquidity

The difference in yields between the one and ten year is less than 40 basis points.  Over a two year period, that difference in yields becomes less than a point.  But buy the one year and, with a loss of less than a point, you get the liquidity of cash, almost.  That is a low price for liquidity; ATMs charge nearly 1-2% for typical cash withdrawals, by comparison.

Treasury is losing when it piles debt at the short end; it creates cheap liquidity for the money market funds. Market makers can keep the one year around for back up as they match deposits with loans. They do not need to hold a bunch of long term bonds with a slope that low.


Millennials have to execute a monetary regime change

Retirement’s a long way off for the 83 million Americans born between 1982 and 2000 (that’s the date range used by the Census Bureau to identify millennials), but the smart ones are already planning. One big worry they have: Despite all they’re paying into Social Security now, most — a whopping 80% — say they don’t expect to get a nickel of it back.

Young and unknowing.  What is killing social security is high debt levels in the economy due to boomers never covering the debt they created.

Work that cannot meet the safe rate in terms of productivity growth will not be employed.  That is ultimately the millennial problem: higher and higher productivity needed as we continue to lag in paying back all that boomer debt.

Computing history

A Hollerith card could hold up to 64 bytes.

This from History of Digital Storage

One program was a shoe box full of these cards. The result was Fortran, a deliberately minimal, and specialized script language.





Even today we hear the words 'tape drive' in the linux community.
First memory chip.
The first solid state logic gate. A regular 'linear' amplifier, this transistor is forced into two meta stable states; on and off. This pic from History of Transistor.

An early internet router.  They used Morse code, a more efficient version of ASCII.  Morse was Huffman encoded: frequent letters had the shortest codes.
... - .... .- - .. ... .... .. ... - --- .-. -.--
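The Huffman property is easy to check against the published International Morse table; this little lookup (my own sketch, covering just a handful of letters) shows frequent letters getting short codes and rare ones long codes:

```c
/* A few International Morse codes.  E, the most frequent English
   letter, gets one symbol; rare letters like Q get four, which
   is exactly the Huffman idea. */
const char *morse(char c)
{
    switch (c) {
    case 'E': return ".";      /* most frequent, 1 symbol */
    case 'T': return "-";
    case 'A': return ".-";
    case 'I': return "..";
    case 'Q': return "--.-";   /* rare, 4 symbols */
    case 'J': return ".---";
    default:  return "";
    }
}
```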


The first web browser. Xwindows carried the display server model into the 21st century.

Great idea, horse manure logic

OK, this public school in Virginia apologizes for reading the pledge in Spanish. Apologize to whom? Dunno, read the case for cancellation:
“It was my understanding it would be Spanish, French and German, and I think there’s 80, 90 different languages at this school and I thought that would be a cool thing,” she said.
Fairfax county Public Schools contends the program wasn’t canceled but rather designed to only last the two days before Thanksgiving as a means to “promote engagement and inclusion,” according to a prepared statement to WTTG.
“A school administrator suggested students lead the pledge in Spanish to promote engagement and inclusion,” the statement read. “Administrators believed this was an opportunity for other voices and languages to be heard and recognize the school diversity.

My bold.
Now here is the horse manure.  Have the kids read the pledge in 3 or 5 languages, enough to cover at least the euro spectrum.  They learn word derivation, boneheads; it is a great idea, and you teachers should focus on teaching skills, not diversity.

Reinstate the plan, reduce the number of languages, and tell the community this is about learning word stems, word derivations and the like. There is nothing offensive.  The diversity here is language skills among little kiddies of different colors.

Library hell in linux

A good portion of the requests for help in linux are of the form: cannot find object in the library path.

The linux pros have the pat answer, need more details to find out what went wrong.

The answer is currently ldconfig, a command tool that maintains the library index for run time start up of apps (ldconfig -p will at least print the cache).  The more correct answer is a GUI based tool over ldconfig which lists, for the user, intelligently, what libraries are known by the run time linker.  I have not found that tool, but I know someone has written it and has it using ldconfig in the background.  Where is that tool?

It seems to me that enterprise management of linker libraries is big business, a path to shortening the network distribution of enterprise software.  Someone must have invented the tools.

Let us cycle

"Raising rates too quickly could unnecessarily shorten the economic expansion, while moving too slowly could result in rising inflation and inflation expectations down the road that could be costly to reverse, as well as potentially pose financial stability risks."
A currency issuer at equilibrium has no idea what the next move in interest charges will look like.

In the case of the Fed they seem to know a lot about the future, as if they were far enough off equilibrium that there is only one direction back to normal.  So they make these announcements, basically telling us what the debt cartel has computed based on their wealthy clients.


In other words, these are coordinated rates to 'glide' government debt and get creditors paid off. We can see the deliberate control over the curve slope, accumulating paper until the pressure is off the ten year yield. The cycle comes from regime changes in the Swamp, which is the only borrower on the Fed balance sheet.

Mathematically it is easy to treat the loan and deposit queues as spectra, and they obviously do not match; one is 'aliased' with respect to the other.  That is, government debt is way over sampled and private deposits way undersampled.  Mathematically, two spectral distributions aliased against each other is the definition of a cycle.

Chairman Mao of Alabama, Commie Rat

Jack Ma Confirmed as Chinese Communist Party Member

Monday, November 26, 2018

Nvidia is in trouble with the linux developers

Nvidia has evidently been quite an asshole towards open source graphics drivers, especially the new wayland window server to replace parts of X11.

The new graphics system built using the wayland protocol eliminates the concept of messaging and streams between client and window server.  It relies on shared memory objects that have their own rendering system; it is a great improvement.

The XCB update for X11 will be compatible with wayland.  But nvidia is refusing to budge on the new interface, and in general is disruptive to linux users, delaying progress by tying up their graphics drivers.

But, IBM just paid 34 billion for Red Hat, and I am sure IBM agrees:

Steve Almy, principal product manager of Red Hat Enterprise Linux, told El Reg in an email: “Based on trends in the Red Hat Enterprise Linux customer base, there is overwhelming interest in desktop technologies such as Gnome and Wayland, while interest in KDE has been waning in our installed base.”

A good strategy and one that IBM will adopt.  So, IBM needs to develop the alternative graphics chip for a Nvidia replacement, and we also want to run Nvidia out of the AI business; we don't need them, we can use an Intel co-processor well enough to search and modify our decision trees.   Also, support for AMD is crucial, they are the ones supporting linux; always ask for AMD AI chips, not Nvidia.

I am completely in support of Red Hat going to wayland completely, and most of the OS vendors will follow suit.   Wayland works well with XCB and the x.org group lives on.

Solved a housing problem

While it’s too soon to know if the improvement [opioid deaths down] is part of a long-term trend, it is clear there are some lessons to be learned from Dayton. The New York Times spent several days here interviewing police and public health officials; doctors, nurses and other treatment providers; people recovering from opioid addiction and people who are still using heroin and other drugs.

Mayor Nan Whaley thinks nothing has had as big an impact on overdose deaths as Gov. John Kasich’s decision to expand Medicaid in 2015, a move that gave nearly 700,000 low-income adults access to free addiction and mental health treatment.

I know the treatment business.

The key here is to house them away from the hood, or at least get them away from the hood during the day.  Especially with meth, it is not 'medicine in the medicaid sense', the traditional sense. It is militant cultural intervention under the cover of medical license. It works.  Medical techies can be militant; they have license, funds, insurance and a monopoly.  This works, but it is a fiction that medicaid does it; the Conservation Corps could do this with a fake jobs program, the Peace Corps does this.

Especially with meth, one needs militant camp guards who can take a bus to the meth heads, offer them a hundred bucks for a weekend at work camp.  That alone gets you over half way home on the meth problem, by simply getting them out of the meth network for the weekend.

Call it medicine and I will triple the government costs, because we are using medical militancy.  Call it a jobs program for meth heads and your costs go way down; we know what we are dealing with and we specialize the work camp to treat the meth heads.  They can be induced to exit the bad scene for a weekend.

Heroin is a bit different; the medics are generally forced into substitute medicines, less dangerous.  Then it is mainly getting them in and out of the clinic on a regular basis.  This is real medicine, and it works for cocaine also, which is easy to quit with quick recovery.  With meth you need the work camp, the militancy; you are really dealing with zombies.

Every meth head recovery story I have dealt with had one thing in common.  For circumstantial reasons, the person was isolated from the meth network. Meth is true mid brain damage, recoverable after a few years.  On meth, the person has few human emotions and cannot really respond to counseling.  Once isolated from the meth network the meth head is forced to deal directly with human connections, and doing this under structured work camp conditions is ideal, like taking them back to summer camp to learn human interactions.

Reading the AI trade press

Got bored with doing my tax return, went web surfing for AI trade mags like:

ChatBots

AI Magazine | Artificial Intelligence News and Discussions 

Digitalist Magazine News on Cloud, Mobile, Big Data, Analytics & More | SAP Innovation

BizTech Magazine | Technology Solutions That Drive Business

The sales to business will be a 30 billion market in a few years, no doubt.  Take, for example, having a network of text readers scanning the web pages and presenting results to corporate workers.  The work can be shared with any number of processors passing LazyJ messages around to indicate what they read and what they summarize.

Massive multi-processing with cheap high performance processors.  We can do this with the join system.  We can create complete S&L architectures, find data correlations.  Most of AI can be incorporated into a single join architecture, a framework for working through the various grammars of AI.

I should be working the software, but alas, I am bottlenecked with the IRS.

Fedora update??

Red Hat insisted on updating my OS last night.  I feared the worst, but let it go.  Sure enough, this morn I find that Fedora starts with a bunch of missing library errors, and gnome even crashed a couple of times.

This morn, Red Hat wants to do another update! No kidding!

This is how we do it with free software, I, who use free software, become the guinea pig.  Red Hat creates the enterprise version after testing all the updates on their free users.

Tax day!

For me, one form to amend and return.  But I can say that I am at least a legal citizen again.

So, no software today.

A dead journalist trades for ten bucks a barrel.

Saudis Confuse Traders By Pumping A Record Amount Of Oil As Goldman Top Trade Says "Buy"


I think the Saudis got ripped off.

More evidence about the debt cartel

We had previously documented large excess returns on equities ahead of scheduled announcements of the Federal Open Market Committee (FOMC)—the Federal Reserve’s monetary policy-making body—between 1994 and 2011. This post updates our original analysis with more recent data. We find evidence of continued large excess returns during FOMC meetings, but only for those featuring a press conference by the Chair of the FOMC. 
From the research staff of the NY Fed. 

The federal reserve board represents member banks in their districts.  The member banks are within the same organizations that manage debt: the debt advisory board, the primary dealers.  The whole point of the Fed chair making an announcement is to inform the world that the debt cartel has reached a decision. So, obviously, all the members of the debt cartel and their wealthy patrons have advance knowledge.  In fact it is worse.  The large banks offer a premium yield to wealthy investors who play the debt cartel game.  In this way, debt flow is regular and smooth, but it is oligarchy capitalism, required to manage the unending flow of massive borrowings by Congress.

The main point is how we do this in the sandbox. We let everyone play the debt cartel game as long as they keep the required reserve ratios.  Since citizens are stuck with a volatile republican form of government, citizens need to play the government debt game.

What is a linux terminal?

When talking about command line shells, like MS power shell, the linux community talks about terminal emulation.  What is a terminal?

Some old mechanical thing for talking to mainframes, like a typewriter.  But it has nothing to do with command line execution, the main function of shells.  One of those leftovers from days gone by, like text files or the Xwindows client-server model.

What I want is a simple Xwindows based character shell, a graphics buffer holding rectangular arrays of character bits.  I guess we can call that a terminal, but it is confusing if you don't know the history, and confounding if you do.

I keep thinking: an Xwindow, basic character mode GUI.  Divide the screen into rectangles according to font size. No graphics smaller than a character font.  In fact, font shell is the more appropriate term.  It should have been invented, all 200 lines of code: set up window, compute font size, implement:

strout(str,i,j);  // write a string of characters to a char window

Too simple, and I think it is not worth managing, not much more than a code snippet, too small to be a generic 'tool'.  Easier to gin one up than hunt down open source code?  Be nice if someone had already done this.
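A minimal in-memory model of that strout is a sketch like the following. The 20 by 80 grid and the helper names are my own assumptions; a real version would blit font bitmaps through XCB instead of storing into an array:

```c
#include <string.h>

/* Model the char window idea: the screen is a grid of character
   cells and strout writes a string at row i, column j.
   Hypothetical names; an XCB version would draw glyphs instead. */
#define ROWS 20
#define COLS 80

static char grid[ROWS][COLS];

void clear_grid(void)
{
    memset(grid, ' ', sizeof grid);   /* blank screen */
}

void strout(const char *str, int i, int j)
{
    size_t len = strlen(str);
    if (i < 0 || i >= ROWS || j < 0 || j >= COLS)
        return;                       /* off the grid entirely */
    if (len > (size_t)(COLS - j))
        len = COLS - j;               /* clip at the right edge */
    memcpy(&grid[i][j], str, len);
}

char cell(int i, int j) { return grid[i][j]; }
```

Everything above this layer, menus and text blocks, is just more strout calls, which is why the whole thing stays a snippet.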

I cutnpasted an Xwindows 'hello world' window, and it basically does this strout(str,i,j), or something similar, with a default character height.  From there create Xchar and on top of that add whatever text menu headers and text block utilities to make a Char manager.  So we get a stack similar to Gnome over X, for example.  We want GnomeChars over Xchars, everything just coarse grained down to character arrays.

The interface becomes very simple, limited to the printable chars.  Like the plaintext idea, sometimes pixel graphics are a nuisance. But some things are too simple to pay the cost of software management, Xchars is a cutnpaste kind of snippet.

Ncurses is really a char window manager, though it has no standard Xwindows sublayer. If we could extract the window and menu manager from Ncurses, that is the real code.  Drop all the stuff about old style typewriters.

Sunday, November 25, 2018

Firefox bug

The Mozilla browser, Firefox, has a bug.  One window loading will slow the other window loading.  I performed the following experiment:

python -m SimpleHTTPServer 8080

Then I set up a local ping to that local server:


ping 127.0.0.1
Works fine. Access time less than a millisec, as we would expect with local access.  I add the following to the firefox address bar:
http://localhost:8080/

Access works fine, fast no problem.

Then I open a new Firefox tab, access a website and Firefox is slow as usual, almost a minute to load.  But the first tab, the local access, also slows when the other tab is slow!  Yet ping keeps on trucking, fast, to the local server.

Why would one open firefox tab slow another open tab but not ping?  Clearly a bug in firefox (plus a lot of bugs with Comcast, my provider).

Chrome loaded onto fedora, with the usual trial and error.  My main problem is that chrome will not run in root mode, too dangerous.

However, I have heard complaints about chrome becoming slow, for some reason.  So far it works on fedora much better than firefox.



Dynamic linking under linux, working

Removing the last of windows specific code in the join system so it all becomes linux compatible, then put the Windows switches in last.  Dynamic linking with dl lib is about the same as with windows dll.

Saturday, November 24, 2018

Xwindows has legs

Xwindows is the low level unix windowing system used on most linux releases.  It has kept up with the times with the Xcb release.  Most of the unnecessary synchronous waits are gone in calls to Xwindows. So it becomes efficient, again, and is ubiquitous across linux. It does not replace the window managers with their drop down menu systems and layouts; its main job is to move pixels to restore windows uncovered on the screen, and to deal with mouse or keyboard clicks.  It also defines the low level font storage.

The new technology of Xcb enables a new genre of games; not as fast as direct access to the graphics card, but those games use direct access anyway.  So X fits, it is long lasting.

Only in California

Here’s a longer quote of Rep. McClintock (R-CA) from his speech on the House floor on Tuesday, October 3, 2017 concerning the Resilient Federal Forests Act:
The wildfire crisis facing our forests across the West comes down to a simple adage. Excess timber comes out of the forest one way or the other. It is either carried out, or it burns out. But it comes out. When we carried out our excess timber, we had healthy, resilient forests and we had thriving prosperous communities. Excess timber sales from federal lands not only generated revenues for our mountain communities, but created thousands of jobs.
But in the 1970s, we adopted laws like the National Environmental Policy Act and the Endangered Species Act that have resulted in endlessly time consuming and cost-prohibitive restrictions and requirements that have made the scientific management of our forests virtually impossible. Timber sales from the federal lands have dropped 80% in the intervening years, with a concomitant increase in forest fires. In California alone, the number of saw mills has dropped from 149 in 1981 to just 27 today.
Timber that once had room to grow healthy and strong now fights for its life against other trees trying to occupy the same ground. Average tree density in the Sierra Nevada is three to four times the density the land can support. In this weakened condition, trees lose their natural defenses to drought, disease, pestilence, and ultimately succumb to catastrophic wildfire.
After 45 years of experience with these environmental laws – all passed with the promise they would improve the forest environment – I think we are entitled to ask, “How is the forest environment doing?” All around us, the answer is damning. These laws have not only failed to improve our forest environment – they are literally killing our forests.
Throughout our vast forests, it is often very easy to visually identify the property lines between well managed private forests and the neglected federal lands – I’ve seen it myself on aerial inspections.  The managed forests are green, healthy and thriving.  The neglected federal forests are densely overcrowded and often scarred by fire because we can’t even salvage the fire-killed timber while it still has value.  How clever of the climate to know exactly what is the boundary between private and government lands!
This is not complicated.  Our forests are catastrophically overgrown.  Drought is a catalyst – it is not a cause.  In overgrown forests, much snow evaporates in dense canopies and cannot reach the ground.  The transpiration volume in an overgrown forest is a problem in normal years – in a drought it becomes lethal.
Pestilence is a catalyst – it is not a cause.  Healthy trees can naturally resist bark beetles – stressed trees can’t.  A properly managed forest matches the tree density to the ability of the land to support it.   But we cannot properly manage our forests because of the laws now in place.
California environmentalists are philosophers, not scientists.  We can blame Jerry for the deaths; he should know better.

Fellow web scraper


I Don't Need No Stinking API: Web Scraping For Fun and Profit


The author has an HTML scrape tool similar to mine. I joined his club, and we will do more than just scrape the web: we will also read the text itself and find the additional structure that is not retained as part of the HTML.

I am starting to update files now and then, but the versions are old and I am a bit slow on the join project. I now run a simple GUI editor and builder called Geany, and it works with gcc by default. Everything Linux.

But I also have one more IRS tax form to revise and will likely go nuts before it is done. My tax problems cause my computer game addiction (I have moved on to GNOME Chess). My game addiction causes me to write software in an effort to beat the game AI.

Only in California

We have many lawyer protection acts here in California; one in particular is the historical buildings preservation act.  That act allows lawyers to sue if a historical building is destroyed improperly.

In the typical case here in Fresno, old buildings that have no future value must be arsoned to avoid the lawyer fees. Step one is to demonstrate that the old dilapidated home is unlivable and must be condemned for remodeling.  Then board it up; then the meth heads invade, and eventually it burns, and we have lawyer-free urban renewal.

We have let four dilapidated homes go in this neighborhood, all requiring way too much remodeling to meet any modern usage.

Arson as a form of urban renewal is part of the meth militia wars.  We have decided the best cure for meth use is homelessness: do not let the zombies have a place to settle and create havoc.  So they wander among abandoned and condemned buildings, which makes it easier to chase them.  In fact, this is true: homelessness and poverty make for a meth cure, the only cure that seems to work.  We should supplement that with occasional trips to a work camp for a few days of paid regimentation; this further keeps the meth heads on the run.

My hometown is full of dummies

We are the red dots in central California.  Few of us graduate college, fewer still in the STEM fields.  The likely cause is that we have many on meth out here in the meth capital of the world, Fresno, CA.