Monday, October 31, 2016

Encode up and decode down the graph

The traditional Huffman encoder goes from the leaves to the root. The value, uncoded, finds its slot at the bottom of the graph, the pit. As the original value travels up the tree it collects its code, the code being the path.

So, the natural path down the nested block is decode, the encoded symbol resulting in the original. So, for any bid with low probability of acceptance, its bot will likely have a longer path down the graph to find a match, and it likely reaches the bottom without one and times out. The system centers on liquidity: short paths are liquid and carry cheaper transaction costs. And short paths are easier for the Pit Boss, leaving more cycles to the traders. In the balance, the probability distribution of the accepts should look bell shaped; it should conform to the shape of the noise that makes Shannon true.
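
A minimal sketch of the decode-down step, with the tree as nested pairs (my own toy representation, not the site format):

def decode(tree, bits):
    node = tree
    for b in bits:                      # each bit picks the left or right block
        node = node[b]
        if not isinstance(node, tuple):
            return node                 # hit a leaf: the original value
    return None                         # fell through: no match, times out

print(decode((('a', 'b'), 'c'), [0, 1]))   # -> 'b'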

The Pit Boss, in stress, may raise the price of cycles, as per contract.

Call it the Pit Boss

The site bot, the pit boss. It has the job of quickly organizing the mass of undifferentiated asks and bids at the bottom of the graph, where the singletons are piling up.

Each of the singletons will activate a bot soon, to go search for a match. The pit boss wants these bid/ask bots to wake up on a node that encloses their most probable match.

So this pit boss has a trade off: group the singletons in large sets, few groups, or pay the cycle price and get small sets of many groups. So my pit boss will check transaction rates on input, set the group size, then map the bid/ask values to the proper bin, then sort the bins in Huffman style.

How to map prices to bins? It is the sort function that I supply to select the sort key value, and it is not necessarily ordered by bit significance. The real ordering is how the trader bots left the tree after getting matches, or not. The Pit Boss can be constructed as a learning banker. What it gets is binary prices, but what it wants is probability of a match. There is a lot of room here to make big bucks with a smart, efficient pit boss.

If the algorithm sorted by the MSBs of price, and by ask or bid, then the bins with unequal amounts of bids to asks have a low probability of a match; bubble them up and further subdivide.
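
A rough sketch of that check, assuming orders arrive as (price, side) pairs and using a mask I picked arbitrarily for the top bits:

from collections import defaultdict

def unbalanced_bins(orders, msb_mask=0x380):
    # bin by the MSBs of price; bins with unequal bid/ask counts have a
    # low probability of a match and are candidates for subdivision
    bins = defaultdict(lambda: {'bid': 0, 'ask': 0})
    for price, side in orders:
        bins[price & msb_mask][side] += 1
    return [b for b, c in bins.items() if c['bid'] != c['ask']]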

The trading bots get it: they wake up, their asking price has no match in the current bin, they spend cycles searching, or they drop through, bid not taken. No ask/bid offer is permanent, all contracts have time limits. The traders will flock to sites that have great Pit Boss algorithms, I will be rich.

Translating contract parameters

In the Trading Pit, the site owner publishes the bot contract, and that contract has to be operational for the bots. 'Read the documentation' doesn't work; there is a closed parameter space from which the site owner has to choose. If the site owner wants 30% of the graph cycles, say it in a standard resource parameter. Any depth limit, define a seigniorage parameter, and stick to it. Posted prices for graph cycles.
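
Something like this hypothetical parameter set (the field names are my inventions; the point is that every term a bot needs is a standard, machine-readable entry in a closed space):

contract = {
    'owner_cycle_share': 0.30,   # the site owner's 30% of graph cycles
    'max_depth': 8,              # depth limit on the graph
    'seigniorage': 0.0,          # owner's cut, zero unless stated
    'cycle_price': 0.0001,       # posted price per graph cycle
}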

The Ethereum folks should have figured out a lot of this; take their experience and build it into the pit software, in fact, build the pit for use in the Ethereum network.

Folks who are writing these financial algos know of which I speak, it is the difference between being in the pit or hearing about it a week later. So they will readily come up with the fair parameter set, if it allows them to launch their software into a fair pit.

How should the trader see this?

The trader's bot is doing real time chart analysis, in the pit, and making trades on the spot.  The site owner, under contract, is expected to spend some time organizing a probability tree for the pit, to reduce trader cycle times in finding bids.  That is the contract.

Horse manure

Market Watch: The Federal Reserve is not expected to leave anyone guessing.
The central bank will use its two-day meeting Tuesday and Wednesday to put the market on notice that it intends to raise interest rates — in December.
Economists expect the Fed to borrow a page from last year’s playbook to tee up a rate hike.
In fact, Market Watch is fundamentally horse manure. 

Listen up, if the currency banker is doing its job, everything it does is a mild surprise.

The Fed has already raised rates

The one year has been above .5% for a couple of months, and the Fed is not buying short term securities, or otherwise attempting the twist. That is a raise in rates. It is not some metaphysical philosophy where the Fed might 'raise rates', whatever that means. The Fed is either buying or selling short term securities, and right now it is mostly selling. That is the answer, and that answer sets up our expectations, philosophies and anxieties.

The Trading Pit will need the virtual coin option

The standard setup for S&L will include the option of running externally protected digits, or running a virtual coin. Running with virtual coin means the site will hold negative values of coins, representing coins actually 'mis-placed': rates out minus rates in. What Nick Rowe calls negative money.
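
A minimal sketch of the virtual-coin account, as I read the idea: a net position that is allowed to go negative, tracking rates out minus rates in:

class VirtualCoin:
    def __init__(self):
        self.net = 0.0              # negative = coins 'mis-placed'
    def flow_in(self, amount):
        self.net += amount
    def flow_out(self, amount):
        self.net -= amount          # may drive the balance below zero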

We talked about this: micro-helicopters, they distribute what is ultimately bit error and spread it around. It happens when member bankers snooker the currency banker with real growth; the mis-placed coins stabilize prices.

Sunday, October 30, 2016

Let's talk about the flow of bitcoin into the S&L bot

The scenario: some fool downloads tradingpit, sets it to default S&L, and connects to the bitcoin exchange. What happens? His smart card will flash red and redder until the lending rates are sky high and deposit rates are at zero. The smart card has already picked up the typical bitcoin sequence, and it is imprecise. So, set the lending rates sky high, shut the deposit rate, and the channel shuts; the site owner is losing $5/month, plus enormous labor and humor.

One day, on a lark, someone borrows a bitcoin for the day at 20%, and pays back principal and interest. Who gets the interest? In the default case the excess flow in, the gains, stays with the S&L bot until it takes losses. There is no seigniorage unless clearly specified in the contract.

The bot knows the behavior of those digits, and the behavior of the bitcoin; it will raise deposit rates according to the published liquidity bounds, the queue size difference between rates in and rates out.
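
A hedged sketch of that rate rule; the step size and the clipping are my assumptions:

def adjust_deposit_rate(rate, queue_in, queue_out, bound, step=0.001):
    gap = queue_out - queue_in           # more flowing out than in?
    rate += step * gap                   # raise the deposit rate to pull coin in
    return max(0.0, min(rate, bound))    # stay inside the published liquidity bounds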

The information revealed about bitcoin goes from zilch to a couple of bits, and bitcoin suddenly stabilizes; your S&L site is stable, it self measures.


It is the 'Who is going to stop me?'

Hey, I have a spare hundred, I download tradingpit, limit my losses to a hundred a month, and let the bitcoin S&L set rates. Transaction costs, remember, measurably at zero.



Bring bitcoin to the trading pit

Bitcoin is a great fit, it has its own security, external to the system. So the secure digit part is done, that just leaves verification.

But bitcoin is volatile, you say!
Not any more. The trading pit can remember the typical monetary sequence by the Commie rats over there, and the queues will hedge properly. Mathematicians will write great bots that keep the medium term bin for shock.

Locked into a maximum entropy box, a savings and loan outfit, operated automatically, imposes a known probability of arrival function; bitcoin will organize according to its typical sequence, including the shenanigans over there.

Iterators on block structure, I think

It explains a lot. I use dot, comma syntax. NextDot jumps by the block count, NextComma just increments by one. Interesting but useless if Python does not understand the g array index, and I have not worked that out.

But the singleton is a scalar, its block count is one, so it has no differentiation between dot and comma. Most of these are at the bottom of the array, the pit. But singletons can be moved up in groups, like all groups within a quant bin. The bins are quantized, their bids are not. The bots trade the actual value, not the bin.
I updated the block code here. In default, it dumps a new node in the pit, otherwise it is enclosed by the specified node. I am trying to get iterated functions, like sort, to work in the native block mode.

Talk about bins and values a moment
If the site owner publishes a standard Huffman encoder for the tree, complete, the trader bots can detect that. There are standard parameters like internal and external precision, and minimum cycle timer allocations for the site bot; parameters readable by the bots, with standard mechanisms for describing tree structure. So, a trading bot descending the tree may find groups of unclassified bids at the third level down. It has a natural use of NextDot, NextComma; the iterators resolve, and the bot can just scan each value in the bin, from none to many. It will never really see a finite symbol Huffman encode, except maybe in test. The window size adapts as the queues adapt; most sites will never have a complete tree structure. Think of it as a mix of index funds, day traders and pit traders. These are real sequences, and the site bot will have a variable window in the general case, and an incomplete tree, but a tree in which more precision moves it closer to fixed point binary Huffman. (The last condition is the red/green smooth condition and the Ito condition: almost one everywhere means a slight compression of probability space under decomposition, a tendency to fixed point.)

And, generally, iterators are dangerous, especially mine with bugs. But in general, we want bots to 'free fall'; the metaphor of a bot sitting stationary and picking through the tree with iterators is dangerous, unless the iterators enforce descent. And iterators are difficult for code verification, unless the counting is locked, guarded, and bounded properly.


# G is the global garray (defined further down this page)
class NextDot:
    # Jumps by block count: each step skips a whole enclosed block.
    def __init__(self, low, high):
        self.current = low
        self.high = high
    def __iter__(self):
        return self
    def __next__(self):  # Python 2: def next(self)
        self.current += G.GET(self.current).count  # skip the enclosed block
        if self.current >= self.high:
            raise StopIteration
        return self.current

class NextComma:
    # Increments by one: visits every node, including enclosed ones.
    def __init__(self, low, high):
        self.current = low
        self.high = high
    def __iter__(self):
        return self
    def __next__(self):  # Python 2: def next(self)
        self.current += 1
        if self.current >= self.high:
            raise StopIteration
        return self.current


Using trading pit on a standard commercial site

If you are selling from inventory, at fixed price, with no discount point, you are not taking currency risk; you are a banker. But the trading pit library is still the way to go, by far, given the security and its ability to minimize inventory volatility.

Purchases coming in get stacked in the pit, as usual. But each bid is still attached to one of the standard graph bots. In this case, it's a bot that just completes a standard smart card purchase, on behalf of the buyer.

But the site owner runs a different kind of graph tree: he runs a tree that organizes according to inventory priority. His inventory staff continues to run shipment bots down, arranging shipments, and the purchase folks run their purchase bots down the graph. The commercial site keeps its finite graph at 8 bits, accurate enough. It all works because optimum inventory combinatorics and queue sizes are related.

My consulting fees. I train executives not to be bozos


This must have been tried

Assume price is elastic, a smooth Red/Green scalar.

Take the bids, as integers. Sort them on the MSB, into two groups. Then sort each subgroup by the second bit down, etc. Use the set size as a measure of probability, and build your tree. If this works, then we can systematically keep the 'best look' on the tree, the accuracy of the tree dependent on the rate of innovations in the bids. Hence, in the scenario, the precision of the tree depends on the site bot getting requant cycles. If traders jam the queue, so what, they see the two bit version of the tree.
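
A minimal sketch of that bit-recursive sort, under the assumption of fixed-width integer bids:

def bit_tree(bids, bit):
    # split on the current bit, recurse on the next bit down; the set
    # size at each node is the probability measure
    if bit < 0 or len(bids) <= 1:
        return {'weight': len(bids), 'bids': bids}
    zeros = [b for b in bids if not b & (1 << bit)]
    ones = [b for b in bids if b & (1 << bit)]
    return {'weight': len(bids),
            'zero': bit_tree(zeros, bit - 1),
            'one': bit_tree(ones, bit - 1)}

tree = bit_tree([37, 41, 12, 63, 5, 38], 5)   # 6-bit prices, MSB first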

Notice it self balances: the Python interpreter and the human market pit are both entropy increasing, or redundancy removing; they self organize over a limited bandwidth. Combinatorial theory makes the math similar in both cases, as long as price is locally elastic. If the Python folks get an enforced, thread-fair bot system built in, secure and verifiable, then the interpreter is the market and it guarantees pure cash digits. I should mention the interpreter guarantees pure cash because it produces queue distributions accurate to the stated precision. It measures queue size for actual goods.

So, we would:
import tradingpit

And we get it all: the classes to box in trader bots, keep the trading trees and verify the secure digits. And the smart card connects it. An important trade off: the better that Python can bound the system, the simpler is code verification for the bots. Binding the bot to a gravity fall through the graph, having timeouts. Add to that very short term data integrity checks, done asynchronously, at the interpreter level. Then code verification: we are looking for complete fall-through in the code, shorter snippets, no use of dictionaries to store indexes. These can be enforced, and we get:

import sourceverify

A package that does standard checks on snippet bots. And this code is tied to the integrity system by Python; it itself is verified. The Python group will generate such an enthusiastic response from traders, they will get confident with the system. The geeks can do this, make them do it, demand it.

The market works the way it seems to be shaping up. The hot commodities are the new fangled quant bots, and they will be running against a set of standard site bots, all working from the same framework: import tradingpit. This is huge money, way beyond a few trillion, folks.

Build and sell a complete trading box, physically. Offer a hardware connect between microprocessor and smart card. A locked down, verified interpreter with full security control, with the memory hardware built in. It comes complete with security, verify and trading pit, all libs burned in, which cannot be replaced until the guy with the smart card shows up, and nothing else. No trading site would ever use any other box; it guarantees pure cash, because it guarantees a secure, efficient trading pit.

I ask the public a question

What does it mean when some web site wants you to register, fill out some details?

Well, the bots do this in an instant; the only reason the site is collecting human text information is because some dumbshit coward of an executive is going to peruse your data and pick arbitrages. Well, fuck you, executive, because my bots can spot arbitrages in an instant, as do all the other bots, including Google autofill. There is no reason for anyone to collect data on you except when you and your bot think it wise, which will be rare.

If Google bonded autofill and gave it intelligence, I would pay a hundred dollars a year for the thing; it is worth a lot more than a Windows OS.

Do any of us remember Wal Mart's first attempt at digital currency? It flopped; the whole thing was conditional on some dumbshit executive perusing your e-mail address! And anyone with a tiny brain said, all at once, what the fuck does my email have to do with banking theory. Retailers, get a clue, pay attention.

How to lose customers immediately on the web

Give them a choice of register or login.

The user gets involved in a nightmare of trying to remember if he ever registered for the site; then he says no, then he registers, then his user name is fouled, then he is lost, and he hits the back button.

Here is a clue. Have the user keep his identity at home. If he likes something on your site, then do a one shot exchange. If your bot tries to make friends with some strange human using a fill-out form? Doesn't work, the human clicks the back button.

I just back buttoned on WalMart, a great site. But I got fouled on what my username was, or whether I had registered. I had just hit click to buy my bike tubes, and their figgen bot takes me on an unpleasant journey. Google autofill offered to consummate the deal for me, but it has no verification protocol with the WalMart bot.

So, back button on WalMart, and I will always remember: dumbshit WalMart software geeks, just like those stupid Microsoft geeks and their parental controls in the Xbox.

Someone has done some work for me!

PyAlgoTrade is a Python Algorithmic Trading Library with focus on backtesting and support for paper-trading and live-trading. Let’s say you have an idea for a trading strategy and you’d like to evaluate it with historical data and see how it behaves. PyAlgoTrade allows you to do so with minimal effort.

And we have Trading with Python by Jev Kuznetsov

Just the tip of the iceberg. All of these neat sites, and now we have Bitcoin style structured block chain. So, I think the market is going exactly where we expect, especially with Ethereum surviving their blunder and Lending Tree surviving.

I am hesitant to write a lot of code here since I expect the pros will knock this thing out with perfection.


Quantopian has the right idea, but they have no framework. It assumes that traders in remote locations download the pit data and start from scratch. No, we are going to change this. Their algorithms become launchable bots and they operate in the pit, at the trading site.

Modelling the trading pit on the graph

Use our dot and comma protocol to describe the graph:

top.(Huff_tree_a,...Huff_tree_n,PIT)

The top node, top, is simply the wrapper for the whole array; the owner can easily grab the whole thing.
The next node down is where precision starts, with one or more Huffman trees hanging. It has subgraphs for each tree, plus a subgraph of all the singletons that have been jammed onto the queue at the bottom of the pit. This is what needs to bubble up asap.

So, if we want to iterate on the pit, say, group the singletons into fives by probability, then bubble one or more groups up to start the requantization.
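
A sketch of that pass, assuming each singleton carries some probability estimate to sort on:

def group_pit(singletons, prob, size=5):
    # chunk the sorted pit into groups of five, ready to bubble up;
    # in practice the group size would come from the cycle price
    ordered = sorted(singletons, key=prob)
    return [ordered[i:i+size] for i in range(0, len(ordered), size)]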

How to find the pit?

Insert another anchor node, one that points you directly to the pit, and one where the iterator can bounce you along the queue until you're finished. The iterator can also sort, though we want it to sort in place, a problem to be fixed. The site management bots will have a pit iterator and sorter. The mathematician can use these tools, and a grouping algorithm can short cut the normal Huffman traversals; the goal is to get partially finished, then let the traders run, then run the site bot, ...etc.



Find probability bins?

I generate a series of random numbers. What is their probability? One simple technique is to set the lower bits to zero (five of them, in the code below), and sort the values on the masked result. The real values then group according to mask value, and you get the count of values all having the same mask, grouped together. The count is a good approximation to probability for bins of that particular bit size. I tried it out; it let me learn how to do sorting under Python.


import random
random.seed()

class Data:
    def __init__(self, val):
        self.val = val
        self.mask = self.val & 0x3e0    # zero the low five bits: the bin key
    def __repr__(self):
        return repr((self.val, self.mask))

Mylist = []
for i in range(50):
    Mylist.append(Data(random.randint(0, 1024)))

# values sharing a mask sort together; each run's length approximates
# the probability of that bin
print(sorted(Mylist, key=lambda thing: thing.mask))


One might consider a strategy: find the lowest probability group, and bubble the whole group up as a subgraph. An imperfect Huffman tree, but it gets the significant groups up the tree ASAP, and you can refine the sort later.
The next step is to make the tree an iterable object, if it's possible.

Saturday, October 29, 2016

Queuing the bots

The point I am going to make is that all bid queues ultimately approach first in, first out at the bottom of the array, as they pile in. It is the outcome of stable queues: within the allowed liquidity preference, the bots are all driven to optimum precision, since cycles cost. Optimum precision means queue sizes are small, four or five, so everything in the pit looks like single priority FIFO.

So, everything coming in should be smart card protocol approved. That means just throw it in the pit of singletons on arrival, and your site graph operators, if running at stability, will get to it on time, mostly. The protocol guarantees fees (rates) adjust to make sure queues are informative. Doing so makes the queues very efficient: scoff the next six or fewer, then the tour operators look them over and bubble them up, with multiple descents. These site bots will be dropping spawned bots down the top all of the time, making tiny adjustments and measurements in the free fall until they meet their spawner somewhere in the net, or not, and just fall through.

Requests, just price them to near equal priority. The Fed runs two graphs, deposits and borrows. They have to be fairly coherent or the throttle is applied to make it so, via pricing or rates. The bots know no difference, just spots on the queue.

Simple methods to measure the global economy with the bot

Go back to the idea that Janet lets any smart card in the world scan the reserve tree, collecting the typical sequence of liquidity changes to six bits. We, or I, concluded in a previous post that in the collective, bots would lower their request rate until the information they get back has a queue size that yields useful information. Otherwise they would simply self measure their own jammed queue, and their bots have the precision for that, which is bad. They adjust.

So the clever mathematician says, ha ha, I will collect the request tree all by its lonesome. What is that tree? It is the typical sequence of Red/Green settings on 6 billion smart cards, to six bits of precision. This mathematician can tell you how anxious the world is. How big are the big spending groups? How concentrated is the wealth?

What else does that tree represent? The typical set of independent school girls the  good reverend got marching abreast.

Fine, let's give Janet a pricing knob; she can turn it right and a hundredth of a cent is charged for requests. The bots will spread the word, they know what it means. It means their request market is offering greater precision for more secure digits.

The bots do their daily, hourly convolutions on graphs, right there, inside your wallet. Your card bot has spawns out there, running the various graphs. They have an army out in the net working for you.

I have to mention
When the smart cards pay for queue scans at the Fed liquidity tree, the request tree becomes coherent with the member bank liquidity events. So, only the most clever of clever mathematicians will find a third tree, one he can use to get mutual dis-entropy and tease out the actual request tree. We get Dynamic Entropic Local Disequilibrium analysis, the old DELD; you gotta get your DELDs done.

I should mention timeouts

For starters, no protocol is stable over a probabilistic network unless it has a timeout, agreed by both sides. The timeout leads to a stable outcome known to both sides.

Trader bots have a strategy when they have their own thread. They will drop to a depth of four, then pause, saving cycles on the graph. They can go down, to the right, and take a rest. The bots will take a rest just above the spot where the re-quantization takes place. They will wake up and take a peek; if they see green, they hit the CONTINUE flag and drop down the net.

That is OK, but the timeout is published. It is part of critical stability; it insures secure digits don't get lost. Since the bot's movements are bounded anyway, it is a simple task, I am sure, to formally verify that every bot's code responds properly to the timeout interrupt. If not, the site owner complains to his bonded code verifier.
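
A sketch of a descent loop honoring the published timeout; the bot and graph methods here are assumptions, not a real API:

import time

def descend(bot, graph, timeout_s):
    deadline = time.monotonic() + timeout_s    # the published, agreed timeout
    depth = 0
    while depth < graph.max_depth:
        if time.monotonic() > deadline:
            return 'TIMEOUT'                   # a stable outcome known to both sides
        if bot.peek(graph, depth) == 'GREEN':
            depth += 1                         # hit CONTINUE, drop down the net
        else:
            bot.rest()                         # pause just above the requant spot
    return 'BOTTOM'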

The plot

Now that I have my nested block structure working with prepend, I will add the simple operations, like append, then build some higher order operators to descend the graph.

But in the process, we get a method to dump a list onto the graph, or dump a list of tuples onto the graph. The idea is to use Python to easily generate some structured samples, then try different methods to bubble them up in groups, not to test entropy but to count cycles. That gets the math folks excited: a rock solid, high speed, descend-only graph with millions of nodes, most of them secure digits.

The approach seems to be: take the incoming bids and just 'append' to the top. That is simple: create the node, add it to the g array, and bump the top block count. Then it's locked in, all other nodes sustain the block structure, and the job of the mathematicians will be to design entropy methods that bubble the bids up the graph for market purposes. Prepend is what the bubble up process uses, basically a swap between two subgraphs relative to a parent.
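
In terms of the garray class that appears further down this page, the append step is just:

def append_bid(G, node):
    node.count = 1      # a singleton scalar
    G.APPEND(node)      # lands at the end of the array, in the pit
    G.top.count += 1    # bump the top block count so the root still covers it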

Market traders buy sophisticated graph operators, operators that bubble up or expose critical market moves, innovations. Let's give the math folks some room, lots of room, to innovate.

Python strings, list and tuples

Here is one that just removes parentheses and their contents; it exercises the string methods.

text = "No parenth (alpha(how are you?) ((((add some)date),d) ending ) ) " # Sample
while True:
    print(text)
    end = text.find(')')                 # the first closer ends an innermost pair
    if end < 0: break
    start = text[0:end].rfind('(')       # its matching opener
    if start < 0: break
    text = text[0:start] + text[end+1:]  # cut the pair and its contents

Learning about Python, still. C coders will at first just write Python like it is C code, then slowly begin to thumb through the documentation.

Just playing around with the various Python features, I modify the parenthesis remover. I capture the parenthesized contents in a list, along with the character pointer, so I can replace them later.

text = "No parenth (alpha(how are you?) ((((add some)date),d) ending ) )      "
mylist = []

def parenthesis(text):
    while True:
        print(text)
        end = text.find(')')             # the first closer ends an innermost pair
        if end < 0: break
        start = text[0:end].rfind('(')   # its matching opener
        if start < 0: break
        mylist.append((start+1, text[start+1:end]))  # save position and contents
        text = text[0:start] + text[end+1:]

parenthesis(text)
for item in mylist:
    print(item)


So, I have the string methods, list methods, and tuples, I think; let me look it up... yes, I did a tuple!

One more! I want to collect the parentheses from outer first to inner:

def outer_first(text):
    while True:
        print(text)
        end = text.rfind(')')            # the last closer...
        if end < 0: break
        start = text[0:end].find('(')    # ...pairs with the first opener: the outermost
        if start < 0: break
        mylist.append((start+1, end-start, text[start+1:end]))
        text = text[start+1:end]         # descend into the contents


And I learn clear(): it keeps the list, just empties it.

Let us try a strange thought experiment

We will suppose a theory of the universe and see where it leads; likely fun but screwy.

Here it is. Quasars take protons, strip them back down to pure vacuum and then rebuild them, shooting them out a 'channel'. What would we need to verify this theory?

Well, we have to see an aggregate, and we cannot see one unless the new protons retain some state information about their origins. So, at the minimum, protons have to be talking to each other, collectively estimating where each thinks the point of origin was.

In a metaphoric way, that is what the physicists' Qubit conferences are all about. It is the same idea. Once you dump the concept of empty space, you get local knowledge only, and then it is a combinatorial (and therefore a graph queuing) problem. How much bandwidth is needed to maintain a consistent view of the center without disintegrating the codecs from radiation? The clock energy, or quantization rate in sending messages, is bandwidth used. If they settle for low precision, they do not have to requantize for 10^32 years. It is a game: how many proton bots can you let loose without jamming the queue?

An explicit mechanism for Shannon information theory?

Looking at the equation, we see the noise term, which is a stationary, probabilistic restriction on channel bandwidth. A reasonably constant Gaussian noise limits bandwidth. Shannon's construction is complete, almost everywhere, so Ito is happy.
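
For reference, the equation in question is the Gaussian channel capacity, with the noise power N supplying that restriction:

C = B \log_2 \left( 1 + \frac{S}{N} \right)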

Where, then, is the cost of clocking data, the cost of the clock itself? It is in the noise term, implicitly. We see it appear in the literature as the cost of the map, 'the bandwidth cost of shipping the map', when we talk pure digital systems. In our case, the Shannon noise term is time on the graph needed to re-normalize, that is, bandwidth lost in the input and output queues. That time on the graph is the liquidity exposure that traders suffer when their secure digits get stuck on the queue. What we have is sort of the obvious connection between queuing and information: when the encoding tree is stable, input symbols will not queue up along the nodes. Or, the encoding graph is a minimal match to the re-normalizing time, a self adapted system.

Set up your own dark pool

Simple under the new system. You are a group of brokers who mostly operate in the middle of the market (N/2 orders down on the graph). You and your pals watch your node, where your trades usually end up, and think, hey, that's a stable node.

So you buy a cheap computer, put it in your house, and fire up a six bit, self measured currency graph. Your group trades wholesale stocks at that stable level. You set buy in and buy out where members can trade the group virtual currency against the external government tax currency.

All bets are six bit integers, secure digits. What has your group gained by breaking off your own dark pool? You get cycles on the graph, because you know your precision; it is in your new contract. The aggregate market had no such limiting contract, and you paid for useless cycles on their graph, paying for information your group could monetize internally.

These money systems will be downloaded, like click and setup.

Giving mathematicians freedom on the graph

There is a good reason I hesitated to describe the actual 'algorithm' to rebalance. First, I am partly ignorant, but mostly there is a lot of freedom in how mutual entropy and a coding tree interact. Or, in other words, there is more than one way to batch your Huffman encoder to do a partial rebalance, then rest. Like collecting the five least common trades in the queue and bubbling them up. The Huffman tree is binary, ultimately, because binary is complete. But your graph is never completely balanced; there is always some marginal 'secure digit' risk.

So, I think the mathematicians have a bunch of selections to make in working the tree, as long as they can get a smooth red/green indicator of mutual entropy even when the tree is imprecise. There is plenty of room for innovation in how that tree is managed; just remember, each cycle taken by the owner is one less cycle sold.

The pure cash fairness doctrine

All Smart Cards are created equal, in terms of protocol. So, no smart card can create special groups without pricing them in or out.

Consider the idea of free scans of the graph. Under the fairness doctrine the number of requests in the queue will represent the value of 6 billion smart cards doing requests. The collection of bots, all 6 billion, will simultaneously discover that this site estimates the typical red/green indicator over the aggregate, as a generating graph. It will see a relative N bit value, the typical sequence of the request queue. Remember the golden rule of the TOE: the aggregate becomes a theory of operation as it attempts a two coloring. So a site offering free scans will eventually become a useful indicator of something as the request queue balances. Or, under the fairness doctrine, the aggregate always generates a price.

How does the adjustment work?
Bots deliver honest assessments of precision; it is the equilibrating function. So when a flood of bots send requests, the queue size reported back will have very low precision, being full of requests.

The originating bot will make fewer requests, since it is getting low precision results back. Eventually the requests stabilize, in the collective, and the system self measures its collective desire to submit requests.

The current Fed is saying money is all about requests for queue size

It is what happens when one member bank gets stuck and there is no exit protocol. So, the member banks have settled down to checking the queue size every so often to see what the member bank with the 2.4 T in loans is up to.

The central banker has all the knowledge it needs

The Knowledge Problem In Monetary Policy


By Beckworth

Dave talks about the idea that central bankers do not know the potential growth rate. The member banks have that knowledge.

We have two groups of member banks. One group does not borrow from the Fed; the other group, a group of one, has borrowed 2.4 Trillion. The knowledge of that imbalance is well known; it is one of the most common political themes of the era.

The problem is that there is no entry and exit of member banks, hence we have to wait indefinitely for the one member bank to clear some of its borrowings.
Let's do an example, say the US Fed.
Now, if you had, say, a currency banker in your pocket watching the blue line, it would think, hey, the borrowing queue is getting large chunks of 'secure digits'. Your bot would respond green, you should make the counterposing deposits. Why? Because your bot knows there are two outcomes: the blue line will make money useless, or there is going to be a retroactive rate increase on deposits once the graph re-balances.
Is the retroactive rate increase here?

In international terms it sure has; see that dollar spike? A 10% return on deposits in one year. The natural rate imposes itself, even when the central bank is stuck with a very bad member bank. Bots naturally detect the natural rate because it shows up in queue size first, and they watch that.

Mutual entropy between the blue and red lines in the first chart: that mutual entropy reveals queue size imbalance. Your bot knows a deposit will get picked up by the betting site right away; the graph renormalizer has to run, it's the contract, the rule.

When money becomes useless

The bots don't care; their transaction costs are close to zero. They will wait for two people to make one trade every thousand years, secure digits are like gold.

Friday, October 28, 2016

The simplest parser

It is a parser in which the text has already been separated out between operators and tokens. Separating out tokens and operators has always been the battle, but when that is done, what is left?

# Expressions are simple when the text has been tokenized
# so let's do that
tokop = [('a', ','), ('b', '.')]

parent = root   # the current enclosing node; assumes previous() and append()
for tok, op in tokop:
    if op == '.':
        parent = parent.previous()  # '.' closes the block: back to the previous parent
    parent.append(tok)

This code collects tokens by block level, nothing more. But it assumes that append maintains graph integrity, however the graph is stored.

Grammar is always block based; it is the decoding of all the syntax shortcuts we use, and need. Dealing with that is simply painful.

The other point is that in a descend-only graph, how do you find the previous parent? Launch another operator from the top that stops when the target is at the next block start; block counts work. It is a direct shot when the child index is known, because you can skip down by block count, homing in with log n steps. The site operator cannot let graph operations maintain stacks and jump up and down the graph arbitrarily, that is a no no.
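
A minimal sketch of that skip-down search, with the graph reduced to a list of block counts (my simplification): node i encloses indices i+1 through i+g[i]-1.

def find_parent(g, child):
    i = 0                          # start at the root
    while True:
        j = i + 1
        while j < i + g[i]:        # scan the children of node i
            if j == child:
                return i           # found: i is the parent
            if j < child < j + g[j]:
                i = j              # descend into the enclosing child block
                break
            j += g[j]              # skip the whole sibling block
        else:
            return None            # child is not under node i

print(find_parent([5, 3, 1, 1, 1], 2))   # -> 1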


Liquidity risk on the graph

We will hear the following description:

6 bit precision betting over an 8 bit graph. The site owner is telling the traders the contract: they can search a depth of 6, and the tree is organized out to eight. The site owner is chicken; he is only going to cover the bit error for eight bits, much less than 2%, meanwhile the bets have a precision of more than 3%. That bit error is liquidity risk. This site owner is not taking any; he is making his money on ads. Most likely this is a lark betting site, betting college games and stuffed with recreational bettors. Where are the informing bets? They are being stuffed onto the end of the graph, at bit level 9, and the site has a graph operator in the background combing through those bets and bubbling up the significant (lowest probability) pairs.

The government currency banker is keeping two trees, the saving and borrowing trees. It is fairly brave: it keeps the trees about six deep, and actually kicks out member banks when a tree gets too deep. The currency banker wants to be transparently exposed; it holds no more depth than member banks can search.

There are a lot of methods to run the graphs, as long as you have a mathematician trained in mutual entropy calculations. You are free to really design markets, just keep the concept of precision in mind. The contracts close automatically when recursions are bounded and cycles are paid for.

Precision is recursion 

That is why it is costly for the trader and site owner. That was the problem Ethereum had: offering greater precision and having the queue stack up as they pay out the bit error. Site owners of today, with their smart cards, can obtain generators of sequences from actual trades, and run them against your site before you turn on the contract. Remember, these are not humans; they are bots equipped to scan for entropy mismatches, gaps in the queues. One bad outcome is when the site owner gets it wrong, and his graph loads up with singletons at the end; his site Huffman encoder is plowing through the chaos and the owner is committed to pay the liquidity cost for whatever is in the huge pile. Everyone is making 'secure digits' when the queue sizes are optimal. The currency banker wants the queue short; five pending transactions is large when billions are flowing per second.

The user understanding of verification

How does the bot, in your smart card, guarantee authenticity?
The example is that you authorized auto trading. There are two smart cards: yours and that of the person who is on the hook, contractually, at the trading site.

A verification is an echo: your bot sends a verification request down the line to the destination bot. The destination bot returns the echo, and the echo will collect verifications along the path back, verifications up to the level of precision required. The required level here is system wide, and may not be what you like, but verifications have to remain free, I think. All the weak links in the chain have to have their critical code verifiable.
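
A sketch of the echo, with the hop API invented for illustration:

def verify_echo(path, request):
    reply = path[-1].echo(request)          # the destination bot returns the echo
    stamps = []
    for hop in reversed(path):              # the echo retraces the route back
        stamps.append(hop.verify(reply))    # collecting a verification at each hop
    return reply, stamps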

It's the weak spot in the architecture

There exists the possibility of hackers flooding the system with verification reports. The system lowers precision to reduce traffic; the hackers then jump into the sudden bandwidth gap and sneak in front of the line. It illustrates the point: when traffic is priced, there is not this risk in time of flood, because sites raise prices a bit, and significant trades still get through while the hackers go broke.

Running the graph in low precision

Simple.  Set your search depth, that is your precision. Anything below your search depth is considered a bag of noise traders. Low precision scans reduce your cycles on the graph; cycles on the graph cost you secure digits.

But big traders will be shy to reveal their trades on the system! Big traders do not make the trade, their bot does. Their bot will run the Red/Green imbalance operator and report back the optimum trade size, the size that fits into the current stream without causing a reconfigure. Bots, in the system, make the trade naturally.

Just always remember: pure cash is redeemable in probability of arrival; exchange of the digits guarantees cycles on the graph, and reserves cycles in the actual flow of goods. Shippers run the same system, they compare the typical sequence of goods in with the sequence of goods out, a convolution of two coding graphs. The rule is simple: within some N bits of precision, the graph probably holds all events within a half bit of precision, and Ito, god rest his soul, will be happy.

Let's take an example

The site owner wants to accumulate the red/green indicator between the savings tree and the borrowing tree. In graph world this becomes:

recurse on A, depth = 6, with fun = (recurse on B, depth = 6, with fun = redgreen)

This is the equivalent of 'for each row, column; for each row, column' in a square convolve. This gets you a six bit value for red/green as accumulated by the redgreen function. How is that scalar computed? Probability weighted distance between two node values, is my first guess. If I plow into it, the mathematicians will have already worked it out. My suggestion is to get the interpreter developers talking directly with the mathematicians, both being paid by the bankers.
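
A hedged sketch of that double recursion, with nodes flattened to dicts and redgreen set to my first-guess scalar:

def redgreen(a, b):
    # probability weighted distance between two node values
    return a['prob'] * b['prob'] * abs(a['val'] - b['val'])

def convolve(nodes_a, nodes_b):
    # nodes_a, nodes_b: the nodes of each tree down to depth 6
    total = 0.0
    for a in nodes_a:          # for each row, column...
        for b in nodes_b:      # ...for each row, column
            total += redgreen(a, b)
    return total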

Stats on this blog
How important is this stuff, what is the significance? I get four times the number of hits as usual when talking about self adapting systems, relative to being a politically incorrect asshole. When talking physics I get about ten times the hits. For you marketing folks, that tells me the work is being done, out there, and I have seen applications that perform to this model. Mathematicians are getting the connection between semi-random sequences, semi-random graph generators and grammars. I know, I steal their stuff.

Past the cycle?

Yahoo: The US economy grew at an annualized rate of 2.9% in the third quarter of 2016, the fastest rate since the third quarter of 2014.
But underneath this better-than-expected headline — Wall Street economists were looking for growth of 2.6% — we see a number boosted by two of the least-reliable elements of this report: inventories and trade.
In the third quarter, inventory accumulation added 0.61% to GDP while trade added 0.83% to growth. Neil Dutta, an economist at Renaissance Macro, notes these are the “most volatile” components of the report, adding that taking out these elements yields annualized GDP growth of just 1.4%.
The third quarter’s inventory accumulation largely offsets what had been a drag from this component of GDP over the last 5 quarters. The boost from trade was largely the result of a (likely) one-off jump in soybean exports.

It looks like we might just get by with no formal recession, farmers have picked up the slack.  I am becoming a believer in the 'bounce along at low growth' theory.

Recursions on steroids!

Below is the current recursion call; I want to make it a generator. What is the stack used for? It carries the accumulated index count as the operator descends the graph. The other call parameters are for the operator, and are static pointers. So I am going to replace the recursive call with a pop and push on a list, and carry the list rather than the id itself. Wrap the whole routine below in a while loop which says 'while the id list is not empty, keep doing this'.

Modified, it looks like (untested):

def Recurse(fun, arglist):
    idlist = []
    idlist.append(0)                   # start at the root
    while len(idlist) > 0:
        id = idlist.pop()
        self = G.GET(id)
        end = id + self.count
        if fun(self, id, arglist) == CONTINUE:
            id = id + 1                # skip the anchor
            while id < end:            # push each enclosed block
                idlist.append(id)
                id = id + G.GET(id).count
        elif self.return_val == BLOCK:
            self.return_val = CONTINUE

Look, no function call back to Recurse! Does it help much? Where we have a short structure, and each 'node' mostly contains a long list of singletons, then this does not help much, because we skip the recurse on block count = 1. But it doesn't hurt much.

What do the savings and loan trees look like?

Treat it like two trees, or two branches off the main node.  Left branch is loans, right is savings.

We would expect the two to be highly similar. When a string of borrowings comes in, the bot expects a similar stream of deposits to arrive. Compare the trees: they should have similar shapes, and we can construct a similarity measure.
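
One possible similarity measure (my assumption, not a worked-out choice): compare the probability mass per depth level of the two trees.

def similarity(levels_a, levels_b):
    # levels_*: probability mass per depth level of each tree
    return 1.0 - 0.5 * sum(abs(x - y) for x, y in zip(levels_a, levels_b))

print(similarity([0.5, 0.3, 0.2], [0.4, 0.4, 0.2]))   # -> 0.9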

If the bank were a channel with limited bandwidth, we want savers and borrowers to spend equal time using the channel. It is known to the traders that when the loans are out of balance, there will be a retroactive rate hike for savers, which is to say the bot allocates more of the channel to savers.

Lending Tree failed this part. If they could have looked at the two trees separately, the Lending Tree bot would have raised loan rates earlier. The bot would have spotted the string of duplicate loans coming in.

In coding, the code word sent down the channel is found by entering the encode tree, comparing a trade to the left/right quant boundaries, and thus moving down the tree to a leaf, collecting the code word on the way down. The graph, itself, is the allocation of bandwidth, as measured in the recent past. Creating the tree is the reverse: starting at the leaves, where all the input transactions are listed, the bot finds the two least probable and encloses them together, thus building the tree.
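
That leaf-up construction is the classic Huffman build; a minimal sketch with heapq:

import heapq, itertools

def huffman(weighted):                      # [(prob, symbol), ...]
    tie = itertools.count()                 # tiebreaker so equal weights never compare trees
    heap = [(p, next(tie), s) for p, s in weighted]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, a = heapq.heappop(heap)      # the two least probable...
        p2, _, b = heapq.heappop(heap)
        heapq.heappush(heap, (p1 + p2, next(tie), (a, b)))   # ...enclosed together
    return heap[0][2]                       # the nested tree

print(huffman([(0.5, 'a'), (0.3, 'b'), (0.2, 'c')]))   # -> ('a', ('c', 'b'))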

You see, the solution has to exist because the savings and loan business is simply a queuing problem between savers and borrowers. Think of it as Poisson queuing, except you are finite; arrival rates can vary a bit over some window in the past. The bots, in their maximum entropy computations, naturally keep the queue length such that it exposes the greatest amount of current information. The larger the bet, the more time on the tree, the more the queues stack up, and the bet stream is slowed by the generator bots.

In other words, do not worry; the Lending Tree problem is going away, the Ethereum snafu is going away. This problem is contained in a box, packed with minimum redundancy; equipartition tells us there has to be a local, smooth red/green detector. Just be clever: there is a convolution operator that gives you an imbalance measure, and it can be an operator running in an independent thread.

Thursday, October 27, 2016

Nested structure eliminates Facebook

Mainly because of the protected graph traversal: it takes the task of data control out of the hands of Facebook, and it becomes standard. The Python interpreter will have the protected block structure type. So site software rents out various operators, like check posts, grab mail, drop post, stream posts, etc. Each of these operators will traverse the graph; if you're cheap, you get ads with your return blocks. But if you don't like ads, tap on the icon that says no ads, and your card will deliver some secure digits.

Since the nested blocks are built-ins with guarded traversal, real ownership of the graph is handed over to the anonymous bot. He charges for electricity only. The whole concept of Facebook seems absurd to me.

Replicating a block structure

We want to solve a problem.  We have this syntax:

top(left_pane((entrybox,ok_button),sourcetext,lower_pane(menu1,menu2)))

If I got my block markers right. What do we want to do? Read it left to right and duplicate it as a formal block structure.
Then we can call top.traverse(draw, drawdict)

In other words, run the graph with the draw method. Well, convert the syntax above into a block structure of pointers to widgets. The syntax is block nested in design. So just read it, count blocks, and make the appropriate calls to create and enclose. Your operator has the token and dictionary; it can transpose the text into an object pointer for the array. We would like nested block structure to be a built-in, to get it ground down to binary efficiency.

What is the trick?

You have to stay one step behind in unfolding your text; that creates at least one forward traversal on the output graph, so you won't get booted by end of graph.
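
A sketch of the one-pass rebuild, assuming a node factory whose nodes carry an enclose method like the one shown further down; anonymous blocks are flattened here:

def build(text, make_node):
    root = make_node('root')
    stack, last, token = [root], root, ''
    for ch in text:
        if ch in '(),':
            if token:
                last = make_node(token)
                stack[-1].enclose(last)     # the current block covers the new node
                token = ''
            if ch == '(':
                stack.append(last)          # the node just read encloses what follows
            elif ch == ')':
                stack.pop()
        else:
            token += ch
    return root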

Traders pay money to rent these operators

These are the graph operators. The site owner has a bunch of these snippets on entry to his site. The trader can rent one and run the graph with it. The general call form, which I ignore, is:
funcall(self,id,dict)
where dict is a variable length dictionary. These will be bounded operators, and have a couple of license levels. So if the trader thinks the market is rigged, he can get verifications for free, and also run one or more inspection scanners over the pool of trades.


# recursion control constants 
APPENDED = 0
CONTINUE = 1
BLOCK = 2
ERROR = 3
DONE = 4
# graph operators allow variable arg list
def NullFun(n,id,x=None) : # complete recursion
 n.return_val = CONTINUE
 print("Null ",n.count,id)
 return(n.return_val ) 
def JustDotsFun(n,id,x=None) : # Testing recursion
 n.return_val = BLOCK
 print("Dots ",n.count,id)
 return(n.return_val ) 
def IsEqualFun(n,id,x) : # Stop when the target node is found
 if  n == x['target'] : n.return_val = DONE
 elif n.count <= x['target'].count : n.return_val = BLOCK # too small to enclose the target
 return(n.return_val)
def UpdateFun(n,id,x) : # Bump counts by the offset until the target is found
 if  n == x['target'] :
  print("found ") 
  n.return_val = DONE
 else : 
  n.count = n.count + x['offset']
 return(n.return_val)

The Marketplace to buy and sell operators.

The site owner finds a great operator, verified and bonded; a tap of the screen, and the owner is going to pay big bucks to the software geek on the other end, secure digits get passed. Because, you see, the array is rock solid and valuable.

Physicists discover the unit of the universe

“All the world’s a stage…,” Shakespeare wrote, and physicists tend to think that way, too. Space seems like a backdrop to the action of forces and fields that inhabit it but space itself is not made of anything—or is it? Lately scientists have begun to question this conventional thinking and speculate that space—and its extension according to general relativity, spacetime—is actually composed of tiny chunks of information. These chunks might interact to create spacetime and give rise to its properties, such as the concept that curvature in spacetime causes gravity. If so, the idea might not just explain spacetime but might help physicists achieve a long-sought goal: a quantum theory of gravity that can merge general relativity and quantum mechanics, the two grand theories of the universe that tend not to get along. Lately the excitement of this possibility has engrossed hundreds of physicists who have been meeting every three months or so under the banner of a project dubbed “It from Qubit.”
The “it” in this case is spacetime, and the qubit (pronounced “cue-bit,” from “quantum bit”) represents the smallest possible amount of information—a computer “bit” on a quantum scale. 

Theoretical physicists are a fun bunch. Here they discover, or hope to prove, that we are all really just some huge quantity of tiny bubbles we call the vacuum. These bubbles have to estimate pi, and the best they can do is about 22/7, except in the quasars, their 'bits of precision'. It is not that the vacuum is too stupid to count with more accuracy, it's just that we can't have quasars everywhere.

I wonder if these physicists suffer economics envy, the desire to figure out sticky things.

We could enslave the Berkeley Blacks and force them to play basketball

Reason: Student protesters at the University of California-Berkeley gathered in front of a bridge on campus and forcibly prevented white people from crossing it. Students of color were allowed to pass.
The massive human wall was conceived as a pro-safe space demonstration. Activists wanted the university administration to designate additional safe spaces for trans students, gay students, and students of color. They were apparently incensed that one of their official safe spaces had been moved from the fifth floor of a building to the basement.
But that would be unfair to the NFL, so we need to institutionalize the issue, make Black slavery regulated. 

Not in my hometown

Yahoo: The US housing market is supply constrained, sending home prices in major US metros back to levels last seen in the winter of 2007.
Research out of JP Morgan published Thursday indicates that this situation appears unlikely to resolve itself anytime soon.
“Nationwide house price indexes have been pushing steadily higher—real house prices are now 25% above their 2012 trough and at the highest levels on record outside the pre-crisis boom years,” JP Morgan’s Jesse Edgerton writes.
“One might wonder if these high prices reflect growing demand that could soon elicit a wave of construction that would prove our forecasts wrong. We find, however, that high prices are concentrated in markets where supply is constrained by geography or regulation, suggesting there may be little room for additional construction.” (Emphasis added.)

My town, Fresno CA, has the micro house in the backyard ordinance; I can comfortably house someone with 8 grand and a couple of weeks of hammer, nail and saw. We have a smart mayor, and I have been impressed lately. We got our pensions fully funded! Cash strapped for sure, but we done it. And now our pals ain't gonna be homeless, not when a little sweat makes them comfortable, and swamp coolered.

I should invite the world's mathematicians to come live here and make banker bots. Wouldn't have to pay them, they make their own money.

How do goods merchants manage the business?

In the mall, each shop keeper has a knob, the red/green knob.

If the merchant chooses to make larger inventory shipments less often, that is, hold inventory longer, then he turns the knob to the right and smart cards passing by will see more green and less red.

Exactly how it works today: the big sale clears out whole chunks of inventory storage to take one large shipment coming in. The merchant knows the risk, he has a smart card.

The critical code

Block operations on the protected array.

These are moves on the linear array, blind, with no regard for block counts. When graph operators foul the block counts, there is partial recovery; but when the block array gets fouled by this code, a day, an hour, a week of a generation of trades is lost. This is what the microprocessor and the interpreter have to protect, subject to verification over the net.


#Node operations know about block counts
# this code does not, except on initialization
class garray:
 def __init__(self,r): # r is the root node, enclosing the whole array
  self.top = r 
  self.top.count = 1
  self.g =[]
  self.g.append(r)
 # Swap a block with the segment above it, blind to block counts
 def MOVE(self,dest,src,num) :
  h=[] # grab some temporary storage
  h[0:num] = self.g[src:src+num] # copy the source segment
  self.g[src:src+num] = self.g[dest:dest+num] # move list down
  self.g[dest:dest+num]=h[0:num] # replace
 def DELETE(self,i,num) : pass

 def APPEND(self,n) :
  self.g.append(n)
 def GET(self,i) :
  return(self.g[i])

The big question: does the block structure ever get fouled?

The graph operators get jammed to a halt during what we might call big black swan events, things that are unmeasurable since they have no real past, and block chains are, ultimately, finite. When the terrorists blow up city hall in LA, the secure digits are stuck where they are for a while. But gold shipments get stuck worse, so what can I say?

A better solution to python variable arguments and dictionaries

We add one feature to dictionaries: they have a local context, even if it is None. An essential component of dictionaries, but otherwise we keep their functionality. In the attached local context, key words are immediately defined, as keywords. Then tell me how I can use keywords in the local context. Make keyword have an operational definition.

Nested block works great for managing screen widgets

Exactly what we want: nest the widgets, select the nesting structure from a pattern. You see some graph patterns, icons, on your IDE, like seeing block patterns on your blog design tool. Then screen changes become enclose operations: prepend this nested block of widgets into the structure of some subgraph in the layout. It lets you create a whole set of configuration contexts for some part of the screen, say, a separate watch window functionality in the debugger, the new nest replacing the previous nest enclosed by that panel.

Transactions costs in the bot net

Define the term first. In the aggregate, the total cost of running the net is the heat radiation generated as my 20 lines of code are executed in silicon in a million places at millions of times per second. Lost money: the heat leaves the nested blocks, we gotta do a delete.

Otherwise, everything is double entry exchanges of secure digits, and the cost of the exchange is one bot's travel price charged to another bot. There is one exception: verification reports are free when trading contracts are authorized. We have to trust the bots on this; they have to have built-ins bounding the use of verification. Verifications are authorized by us, when we thumb print our smart cards, releasing them for auto trade.

Scans, read only grabbing of finite precision block structure from the graphs? Costs you, a tiny bit, but it's gonna cost.

Then we have writeable trades, gonna cost.  The two costs are proportional to the graph size and the published liquidity variance bounds on block structure.  Cycles on the graph are the double entry charges passed around between bots; that is how they view themselves.
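
As a toy sketch of that proportionality (the constants and the linear form are mine, not contract terms):

# Hypothetical pricing sketch: scan and trade charges proportional to
# graph size and the published liquidity variance bound.
def scan_cost(graph_size, variance_bound, k_scan=1e-6):
    return k_scan * graph_size * variance_bound     # read-only: tiny, but nonzero

def trade_cost(graph_size, variance_bound, k_trade=1e-4):
    return k_trade * graph_size * variance_bound    # writeable: costs more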

Nested block structure makes assets = liabilities within known precision.

Enclose is the new insert

Here is the code that maintains the block structure.  All inserts, except the root node, are 'enclose': self encloses subgraph new.

# self prepends the new block; one descent, two parts
    def enclose(self, new):
        # part 1: from the top, find self's index i, updating counts on the way down
        i = G.top.traverse(0, UpdateFun, dict(target=self, offset=new.count, func="Update"))
        # part 2: from that point, locate the new block's index k
        k = self.traverse(i, IsEqualFun, dict(target=new, func="IsEqual"))
        G.MOVE(i+1, k, new.count)   # slide new's nodes up, just inside self
        self.count = self.count + new.count

Note the use of recursive descent to collect graph indices.  It is efficient because it is one pass down the graph from the top to the enclosed block: start from the top until you find self, accumulating indices, then from that point find the index of new.  One pass, in two parts.

Will this work on a massive scale?  Well, it is optimized for wealthy trades, trades that happen from the top of the graph down.  Those critical trades can flood in times of stress.

The general procedure is that new trades are appended to the array, enclosed by the root, so initially you have one huge subgraph as the first block in the root and this tiny trade at the end.  From then on the graph site bot will be doing the bubble sort, moving the trade up with successive calls to enclose.  Tiny trades do not bubble far; the bot is bubbling up and checking imbalance at each node, and if it is out of variance, another bubble up is required.  Large trades which impact the market will bubble up higher and higher, making the site bot spend more and more graph cycles.
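
A sketch of the site bot's loop, where the chain of covering blocks, the imbalance measure, and the variance bound are all stand-ins of mine:

# Hypothetical bubble-up sketch, using APPEND and enclose from above.
def bubble_up(G, trade, covering_blocks, imbalance, bound):
    G.APPEND(trade)                    # new trade lands at the array end, under the root
    for block in covering_blocks:      # bottom-up chain of candidate blocks
        block.enclose(trade)           # one bubble step: the block encloses the trade
        if imbalance(block) <= bound:  # back within variance, stop here
            break
        # out of variance: large trades keep climbing, burning graph cycles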

That is the solvable issue we face. The solutions involve shortcuts, leaps down the array; because the block counts are accurate, the recursion can make the jump, and how well it jumps depends on the mix of bots running the graph.

Queuing on the graph keeps the balance

Bots know the published rules, and the site owner runs under the rule that the graph cannot get more than 5% out of balance.  Any bot can scan the graph, get a measure of imbalance, and determine its time on the net, figuring out the queue length.  The longer the queue the more uncertain the trade.  Risk management is automatic, the red/green light, and risk management will maintain the queue, or the site manager will get rich.
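
A sketch of such a scan (read only, so it is cheap); the 'side' field on each node is my stand-in, and 0.05 is the published 5% rule:

# Hypothetical read-only imbalance scan over the linear array.
def graph_imbalance(G):
    bids = sum(1 for n in G.g if getattr(n, "side", None) == "bid")
    asks = sum(1 for n in G.g if getattr(n, "side", None) == "ask")
    total = bids + asks
    return abs(bids - asks) / total if total else 0.0

# A bot's red/green check before committing queue time:
# if graph_imbalance(G) > 0.05, expect a longer, riskier queue.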

The dot, comma algebra

It helps.  Like right now the graph machine does:

a.(b,c) -> a.(c,b)  : a, b, c all nested blocks

and it has to do a special, the append.  Append is not a natural act: every append or delete alters the length of the array of nodes, and that requires a special arrangement between the bots, a verification about when and where.

But, all appends happen at the array end, and are 'covered' by the root:

a.x -> a.(x,new), and new is generally a graph of rank one.  But, other than the root node, every node is covered, and its covering is to the left.   Amateur that I am, I can tell you that the bot graph managers are going to do a lot of local bubble sort, and it will look like some guy with a heavy graph in each arm, standing at the node of some lattice, trying to pull the one up, let the other down, and get a switch.  It does this because, in the adaptive window, some prior quantization allocated too much room for the information monetization delivered.

So, local switching between two peer blocks is simple, basically done.  We might get away with, not combinations (nightmarish maybe), but sequences of local exchanges.  This is another big deal for both security and efficiency; it is really mathematically required to make this mostly a responsive local bubble sort.  So, I move on. Gonna pull out the Huffman literature, see what's new, and get the standard algorithm going for test.  Gotta see how efficient it is with nested block.
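
That local switch is one slice rotation on the linear array.  A sketch (swap_peers is my name; it assumes the caller already knows both block counts, and unlike MOVE it handles unequal lengths):

# Hypothetical local exchange of two adjacent peer blocks, a.(b,c) -> a.(c,b).
# i is b's index; one slice rotation swaps the two blocks in place while
# every index outside the combined span stays put.
def swap_peers(G, i, b_count, c_count):
    j = i + b_count                                    # c starts right after b
    G.g[i:j + c_count] = G.g[j:j + c_count] + G.g[i:j]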

What if President Hillary's jaw gets stuck

Don't want to alarm the electorate, but say she's giving a major speech to the Russkies, Putin is all, kind of being a dickhead at the press conference, and her jaw gets stuck, right in front of everyone!

So we can see the security profile develop

We can see what we need from the python interpreter and from micro software: the linear array is block protected, by virtue of a verification chain, end to end, a peer to peer exchange between two worried smart cards.  Underneath, a trader bot, a tiny piece of code on some plastic somewhere, is getting verification reports from the python interpreter at its favorite trading site.  The natural probability adaption, driven by queue priority, will soon flash your light red.    Your card will lose mutual entropy between your generator graph and the verifications it gets.  An incoherence between the terms and rates of one group versus the terms and rates of another is detectable.  Verifications fly all the time; hard for the bots not to notice.

Take disk drive development, for example

I know nothing about it, but I can still be an expert.  How do we design the disk head when we know the typical segment size?  Imagine a disk design engineer who followed a simple algorithm: matching the typical head sequence to the particular encoding graph for the typical searches.

I don't see a problem with theory here; we get a minimum redundancy result, in linear nested block.  The engineer has his optimization window: derive the fastest typical transaction rate against the typical search result.  Then try that disk in the web, get click data, because, we all know, the process of self adaption requires some fermion statistics.  So, the engineer gets a co-design; the web of users co-developed a language closer to your disk design, and you can go another round.  But if ever there was a need for minimal spanning tree recursion, it is the disk head.

Think search engines

How often are the search semantic graphs updated?  Likely on a weekly schedule, or better yet, on a schedule that is 100 times slower than any read only search. So, your engine does the background work: it really does the Huffman graph on search word frequency, and quantizes the typical search results along the block structure matching word frequency.  What are you delivering back to the user? Information, sure, but where is that information?

It is in the search language that the site bots and the human user co-created, by implicitly pricing words.  We and the site bot learn the cost of words in that little window. It costs; it costs redundant searches when we get the word order and significance all screwed up. Redundancy is minimized when you do a Huffman encode, and you get the dictionary graph for free: just map your disk to that, and let the disk handle that whole subject line.  It will be doing linear, short hops down the compact graph and I will once again become a gazillionaire.  Once again, the algorithm gain is so significant that transaction costs can be brought to zero, except for the 100 lines of micro-instructions I will write, that cost 2 trillion.
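
Since the whole scheme leans on the standard Huffman encode, here is a sketch of it over word frequencies, standard library only; the sample frequencies are made up:

import heapq
from itertools import count

# Textbook Huffman over word frequencies.  The tie-breaking counter keeps
# heapq from ever comparing the tree tuples themselves.
def huffman(freqs):
    tie = count()
    heap = [(f, next(tie), word) for word, f in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)     # two least frequent subtrees
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, next(tie), (left, right)))
    root = heap[0][2]
    codes = {}
    def walk(node, path):
        if isinstance(node, tuple):           # internal node: recurse both ways
            walk(node[0], path + "0")
            walk(node[1], path + "1")
        else:
            codes[node] = path or "0"         # frequent words get short paths
    walk(root, "")
    return codes

print(huffman({"the": 50, "block": 20, "graph": 20, "pit": 10}))
# {'the': '0', 'graph': '10', 'pit': '110', 'block': '111'}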

Wednesday, October 26, 2016

Quick numbers

How precise is the stock market?  Assume there was some theory, a rationale for market moves; how accurately would you expect the market to execute the theory?  One biased estimator, but a good guess, is inflation, or the price variation over the monetary economy over the typical term. About 3% is a good number.  We went through this: I can model a theory having 3% error with a six bit calculator, I know my bit error.
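
A quick sanity check on that bit count:

# A five bit quantizer steps at 2^-5, six bits at 2^-6; the 3% error sits between.
for bits in (5, 6):
    print(bits, "bits ->", 100 * 2**-bits, "% step")
# 5 bits -> 3.125 % step, 6 bits -> 1.5625 % step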

What does that tell us about the graph?  Most of your bets will be 'binned' across a six high, maybe five wide spanning tree, if it were viewed as a Huffman encoding tree.  But at each of those nodes are the bets of equal significance at that level, and there are likely millions of them.

So, we see the asymmetry; that is what I am after.  But the site keeps the whole graph as nearly infinite, I guess; that is what they talk about with block chain.  Whatever the case, we want the scanner bots to jam down the graph, fast, and collect some low precision version of the entire block chain, while being completely self protecting and mutually protecting against fake bots.  This seems to be the stumbling block in moving to the revolution.  This is all doable in a priced environment, that is key.  If it's priced, then it is either locally stable around each of these nodes, or the variations are within precision.  The bots descend faster in a stable tree.  But that is the market, a queuing problem, always has been; let's crack this case and move to the new technology.

My current traverse; I can make it an in-place generator

# descend from the top and collect G_index going down
    def traverse(self, id, fun, arglist=None):
        # self anchors multiple blocks; end is one past the elements covered by self
        end = id + self.count
        if fun(self, id, arglist) != CONTINUE:
            return end

        # go through each of the comma-separated blocks
        id = id + 1                 # skip the anchor
        while id < end and self.return_val == CONTINUE:
            id = G.GET(id).traverse(id, fun, arglist)  # here it gets the actual node from the array
        if self.return_val == BLOCK:
            self.return_val = CONTINUE
        return end

This descends a nested block sequence. See the line where the comment says "skip the anchor": that means it's jumping the dot and doing the commas, as in:

graph = a.(b,c)

This is beginning computer stuff, sure, but there is a point. If you have a bunch of singletons, no compound blocks, then it is two lines of code and you exit this routine. It is almost like a scan of the linear array. Right now, on the enter side, I have block moves going on like bananas, and that is where the focus is. So I don't want a flexible graph; I want something like this, that is maintained in a way it can be perfectly protected even while supporting millions of node scans a second on large graphs.

At the site, we have probability counters that run the graph for the site owner, and they schedule a rebalance.  We want that rebalance bot to efficiently walk the graph even while it and millions of other bots pour through it.  But it is exactly those moments when the traders know something that is not yet priced.  That's the surprise, the innovation; that is when everyone wants to be running the graph, and also when the graph needs the most rebalancing. If I can make that work, I am rich, I tell you, filthy rich. Why, I might even start acting like some dumbfounded Silicon Valley geek.

But, look how short and contained.  Why bother with calls? Just generate the code sequence whenever the software wants a traverse.  I am not in a hurry until I get a full simulator; I know this laptop can do it, it is an old used proton smasher.
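
A sketch of that generator form, my rewrite of the traverse above; same descent, same G array, but it yields instead of calling back:

# Sketch: generator version of traverse, as another method on the node class.
# It yields (node, index) pairs; yield from carries each block's end index
# back up, so no CONTINUE/BLOCK plumbing is needed.
def itraverse(self, id=0):
    end = id + self.count
    yield (self, id)                # visit the anchor
    id = id + 1                     # skip the anchor, walk the commas
    while id < end:
        id = yield from G.GET(id).itraverse(id)   # returns that block's end
    return end

# usage sketch: for node, i in G.top.itraverse(): ...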

How much code, really?
Been there, done that. Under the right conditions, I could hand-code that into less than a hundred micro-instructions and deliver a binary set of bits that is a graph engine for millions of transactions.  But it is egads! on insert, delete, and replace; a small nightmare.

Pounding the graph

My tiny little graph traversal is starting to work OK, so I am pounding it.  My goal here is to simulate the major problem: keeping very large betting graphs up to date without locking out the traders.   I will crack the problem, segmenting actions on graph reconfiguration such that trading can go on simultaneously, mostly.

I have queueing theory on my side.  When transaction costs are very low, queuing time on the graph becomes the limiting constraint in trading. I will solve that problem by segmentation of graph operations and become a gazillionaire; Tim Cook will be jealous, I beat him to the punch.

Let's call it re-compressing the bets.  It is what my Yellen bot does when the graph becomes unbalanced.  The rebalancer goes up and down the graph; it is the one parceling out risk.  The trading site that keeps the graph in balance better will draw mucho traffic. Watch the revolution unfold.

Code here.

Not enough lipstick to cover the pig, Mr. Drum

Kevin thinks he can whitewash the chart by noting it is only a 6.1% yearly price hike. Compared to what? Kevin lives in the past, just after the helicopter flight in 1974, when rates were 10-15%.

Today, all consumer goods, minus housing and medical, are dropping almost a point a year now.  So, in the wash, Kevin Drum and the Kanosians are literally starving the middle class to death, as the middle pays the subsidies.

Then, having committed to the program based on a foul understanding, they are stuck.  Kanosians have a tribe they have to answer for; they are not really scientists.

Yes, I did software today

Fixed bugs in the debugger. Set up a nice init file to extract graph structure as I pound my broken graph machine. Becoming smooth, almost ready to quit!

When boomers want comfy retirement, they import boomer slaves

Wa Examiner: Moms in the United States illegally gave birth to 275,000 babies in 2014, enough birthright U.S. citizens to fill a city the size of Orlando, Florida, according to an analysis of data from the National Center for Health Statistics.
My question is how does one give birth illegally? 

Variable length argument lists in python

They have new syntax for declaring them, the asterisk. I don't completely get it, so here is what I do:

UpdateFun,dict(target=self,offset=self.count)

Then the called function just selects the arguments with:

def UpdateFun(n, id, x):            # stop when n is the target node
    if n == x['target']:
        print("found ", id, n.value, n.count)
        n.return_val = DONE
    else:
        n.count = n.count + x['offset']
    return n.return_val

It works. But I don't get the part about quoting keywords. They are keywords; they should not be quoted, by definition of the term keyword, which means "no quoting necessary".  But since the keywords are not known to be keywords until the dictionary is opened and peeked at, the operation seems correct. What I am doing does not match their semantics; I am likely doing this inefficiently.

If there were a method: we want the interpreter to check the token against the normal dictionary; it cannot find it, so it has to stop looking, and dict has no way to indicate that another dict is in the variable list.  Dict should be a bit smarter about setting itself up.

I mean, when I print out the dictionary that is implicit, the one that executes, all the keywords are quoted; like a local dictionary, they work the same, fine.  Then an unquoted x, which is in the implicit dict, works fine.  But the explicit local dict needs 'x'. Why not attach the dictionary to the local context, optionally?
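
For what it's worth, the asterisk syntax seems to be the answer to the quoting complaint: keyword-only parameters land the keywords as real names in the function's local context.  A sketch (UpdateFun2 is my variant; traverse would have to unpack the dict with ** at the call site):

# Sketch: with keyword-only parameters (the bare asterisk), the keywords
# arrive unquoted; traverse would call fun(self, id, **arglist) instead.
def UpdateFun2(n, id, *, target, offset=0):
    if n == target:                 # no x['target'] quoting anywhere
        n.return_val = DONE
    else:
        n.count = n.count + offset
    return n.return_val

# call site: UpdateFun2(node, 3, target=some_node, offset=2)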