Saturday, October 29, 2016

An explicit mechanism for Shannon information theory?

Looking at the equation, we see the noise term, which is a stationary, probabilistic restriction on channel bandwidth.  A reasonably constant Gaussian noise limits bandwidth.  Shannon's construction is complete, almost everywhere, so Ito is happy.
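For reference, the standard statement of that bandwidth limit is the Shannon-Hartley capacity for a channel with additive white Gaussian noise. A minimal sketch (the function name is mine, not from the post):

```python
import math

def shannon_capacity(bandwidth_hz, signal_power, noise_power):
    """Shannon-Hartley: C = B * log2(1 + S/N), the maximum reliable
    rate over a band-limited channel with Gaussian noise.
    Hypothetical helper for illustration."""
    return bandwidth_hz * math.log2(1.0 + signal_power / noise_power)

# A 1 kHz channel at unit signal-to-noise ratio carries 1000 bits/s;
# raising the constant noise power shrinks the achievable rate.
print(shannon_capacity(1000.0, 1.0, 1.0))
print(shannon_capacity(1000.0, 1.0, 4.0))
```

The point the paragraph makes falls out directly: the noise term enters only through the S/N ratio, so a stationary Gaussian noise puts a fixed ceiling on bandwidth.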

Where, then, is the cost of clocking data, the cost of the clock itself? It is in the noise term, implicitly.  We see it appear in the literature as the cost of the map, 'the bandwidth cost of shipping the map' when we talk pure digital systems.  In our case, the Shannon noise term is the time on the graph needed to re-normalize, that is, bandwidth lost in the input and output queues.  That time on the graph is the liquidity exposure that traders suffer when their secure digits get stuck in the queue.  What we have here is the obvious connection between queuing and information: when the encoding tree is stable, input symbols will not queue up along the nodes. Or, the encoding graph is a minimal match to the re-normalizing time, a self-adapted system.
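The queuing claim can be seen in a toy discrete-time model (entirely my own sketch, with made-up names and rates, not anything from the post): when the encoder's service rate matches or exceeds the symbol arrival rate, the queue stays shallow; when the encoding is mismatched to the re-normalizing time, symbols pile up along the nodes.

```python
import random

def simulate_queue(arrival_rate, service_rate, steps, seed=0):
    """Toy single-node queue: each tick, a symbol arrives with
    probability arrival_rate, and the encoder clears one waiting
    symbol with probability service_rate.  Returns final depth."""
    rng = random.Random(seed)
    depth = 0
    for _ in range(steps):
        if rng.random() < arrival_rate:
            depth += 1
        if depth > 0 and rng.random() < service_rate:
            depth -= 1
    return depth

# Stable encoding: service comfortably covers arrivals, queue stays short.
print(simulate_queue(0.3, 0.7, 10_000))
# Mismatched encoding: arrivals outrun service, queue grows without bound.
print(simulate_queue(0.7, 0.3, 10_000))
```

The depth of that queue is the "time on the graph" in the paragraph above: bandwidth lost, or liquidity exposure, while digits sit waiting to be re-normalized.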

No comments: