This is the Poisson probability distribution. It tells us how many drunkards are waiting for the bartender when they go bar hopping. The drunkards arrive alone, ungrouped, wandering the street until they meander into the next bar. Independent solo arrivals like these are Poisson arrivals, and the bartender has a fixed service time: the time to mix one drink. In a Shannon decoding network, by contrast, the drunkards are organized into groups; some groups want a martini, some a screwdriver, so the bartender can run an assembly line: first setting up the glasses, then pouring the liquor, then adding the mixer. When the bar is organized this way, the wait in line is minimized and the number of drinks moving along the assembly line is stabilized.
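The solo-arrival bar is, in standard queueing terms, an M/D/1 queue: memoryless (Poisson) arrivals into a single server with a deterministic service time. A minimal sketch, simulating that queue and checking the simulated mean wait against the classical Pollaczek-Khinchine value; the arrival rate of 1.0 and service time of 0.5 are made-up numbers for illustration only.

```python
import random

random.seed(42)

def simulate_wait(interarrivals, service_time):
    """Mean wait in line for a single FIFO server, via the Lindley recursion."""
    clock = 0.0          # arrival time of the current customer
    server_free = 0.0    # time at which the server next becomes idle
    total_wait = 0.0
    for gap in interarrivals:
        clock += gap
        total_wait += max(0.0, server_free - clock)
        server_free = max(server_free, clock) + service_time
    return total_wait / len(interarrivals)

lam, service = 1.0, 0.5   # illustrative arrival rate and fixed drink-mixing time
n = 200_000

# Ungrouped (Poisson) arrivals: exponentially distributed gaps between drunkards.
solo = [random.expovariate(lam) for _ in range(n)]

simulated = simulate_wait(solo, service)

# Pollaczek-Khinchine mean wait for deterministic service:
# Wq = lam * s^2 / (2 * (1 - rho)), with utilization rho = lam * s.
rho = lam * service
theory = lam * service**2 / (2 * (1 - rho))

print(simulated, theory)
```

With these numbers the theoretical mean wait is 0.25, and the simulation should land close to it; the point is only that the solo-drunkard story has an exact classical answer.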
Here is the entropy of a Poisson distribution, the amount of redundancy removed from the wait in line when customers are not grouped. The Greek character that looks like a chair is lambda, the average arrival rate. This equation, with a change of sign and the right-hand part moved across the equals sign, looks like our typical Shannon condition. Put the summation into the form of our digits in powers of the natural log and we have it.
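The formula itself is in the image, so here is a hedged stand-in: the standard Shannon entropy (in nats) of a Poisson(lambda) distribution, summed directly from the probabilities, compared against the well-known large-lambda approximation (1/2)ln(2*pi*e*lambda). The value lambda = 10 is arbitrary.

```python
import math

def poisson_entropy(lam, terms=200):
    """Shannon entropy (nats) of Poisson(lam), summed term by term."""
    h = 0.0
    log_pk = -lam   # log of p_0 = e^{-lam}
    for k in range(terms):
        if k > 0:
            # p_k = p_{k-1} * lam / k, so log p_k updates additively
            log_pk += math.log(lam) - math.log(k)
        h -= math.exp(log_pk) * log_pk
    return h

lam = 10.0
exact = poisson_entropy(lam)
approx = 0.5 * math.log(2 * math.pi * math.e * lam)  # large-lam approximation

print(exact, approx)
```

The direct sum and the logarithmic approximation agree to about one percent at lambda = 10, which is the "entropy grows like the log of the rate" behavior the post is gesturing at.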
What is time?
Simply the smallest finite action that can be taken, measured relative to the error rate of the multiplier; it is the baud rate. How is that related to the entropy and size of the aggregate? What is the relationship between the finite log in math, the Stiglitz screening function in economics, and acquiring the vacuum expectation value in atomic physics? How do we relate entropic inefficiency and the lambda constant?
Mathematicians need to set all this up; queueing is basic to the whole natural process everywhere, from the atom to the web to economics. Queue shifts are changes in the recursion formula of discrete sets of integers.
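One concrete reading of a "recursion formula of discrete sets of integers": the Poisson probabilities themselves are generated by the integer recursion p_{k+1} = p_k * lambda / (k + 1), so shifting lambda reshapes the whole queue-length distribution through that single rule. A sketch, with lambda = 4 chosen arbitrarily, checking the recursion against the closed form.

```python
import math

lam = 4.0

# Generate Poisson probabilities by the integer recursion
# p_{k+1} = p_k * lam / (k + 1), starting from p_0 = e^{-lam}.
p = [math.exp(-lam)]
for k in range(20):
    p.append(p[-1] * lam / (k + 1))

# Check a few terms against the closed form e^{-lam} * lam^k / k!.
for k in (0, 4, 10):
    closed = math.exp(-lam) * lam**k / math.factorial(k)
    print(k, p[k], closed)
```

A "queue shift" in this picture is just a change in lambda: one parameter in one recursion moves every probability in the discrete set at once.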
The time is now for mathematical heroism, and companies need to pay high wages for mathematicians who understand minimal redundancy, finite measuring networks, and the Theory of Counting.