In this economic model, we assume the fruit harvest grows at a constant rate. Our problem is converting the fruit into bags of fruit chips, fruit pie, cans of fruit, and all the rest. So first, how much does each of the banker bots make in its manufacturing? We assume complete manufacturing, that is, we can take fruit one at a time (harvesters), or two at a time, or three at a time. There are a fixed number of subsets available at each stage in a hierarchy of banker bots. When all inputs and outputs are minimal for each banker bot, the bot earns one unit of the X axis growth.
Let the X axis be a two's complement number, 2^n - 1. Then the exponent at the most significant bit is the growth rate of the harvest. That is, it tells us how fast the X axis is growing way out at the maximum value of X. That maximum value of X is how much fruit is in inventory. But if fruit harvesting rates are constant, the economy can make the number line as long as it wants.
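A minimal sketch of that bookkeeping, assuming the "two's complement number 2^n - 1" is read simply as an n-bit inventory count; the exponent at the most significant bit is then the bit length minus one, and lengthening the number line is just increasing n.

```python
def inventory_stats(n_bits: int):
    """Return (max inventory on the X axis, exponent of the most significant bit)."""
    x_max = 2 ** n_bits - 1                  # largest inventory the X axis can hold
    msb_exponent = x_max.bit_length() - 1    # position of the most significant bit
    return x_max, msb_exponent

for n in (4, 8, 16):
    x_max, msb = inventory_stats(n)
    print(f"n={n:2d}  max inventory={x_max:6d}  MSB exponent={msb}")
```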
Each harvester earns the two-period input rate minus the two-period output rate, and the net is the projected break-even earnings, one. So we can see that under this very ideal condition, the manufacturing tree is a 2-ary symmetric graph, and at its pinnacle is Janet Yellen, and she makes nothing but the unit of account with no output. That is, Janet consumes one unit of a very sophisticated fruit product; Janet is the pinnacle of the tree.
The two-period constraint then gives us the hyperbolic conditions, and the one-period consumption rate at each rung of the ladder is output/input. And using Newton's shortcut we find the sum of all consumption is log(input) for the harvesters. That is, the banker bots consume the bit length of our two's complement binary number.
Exchange rates reduce going up the ladder; harvesters pick at the faster rate. This model is a subdivided yardstick: harvesters work every small notch, fruit chip makers work every two notches, canned fruit makers every fourth notch, and so on.
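A sketch of that subdivided yardstick, assuming each rung works every 2^k-th notch: harvesters at every notch (k=0), fruit chip makers at every second notch (k=1), canners at every fourth (k=2), and so on. With n rungs the yardstick has 2^n notches, and the number of rungs is the bit length of the inventory count, which is the log(input) consumption noted above.

```python
def yardstick(n_rungs: int):
    notches = 2 ** n_rungs
    for k in range(n_rungs):
        worked = notches // 2 ** k       # how many notches rung k actually works
        print(f"rung {k}: works every {2**k:3d} notches -> {worked} events")
    print(f"total rungs (bit length) = {n_rungs}")

yardstick(5)
```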
What happens when the harvest rate is uncertain?
In that case, the bots have to adapt. They have to estimate the rate of change in inputs, then calculate the projected output. That is the root function my spreadsheet did to make this chart. When that happens the bots no longer have an exact Euler's number; the number has error. Each bot has to let its input queue grow a bit until the variance and mean of the queue size match.
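A rough sketch of the "let the input queue grow until variance and mean match" rule. The arrival and service numbers here are hypothetical, and I am reading the matching condition as a Poisson-like balance where the sample mean and sample variance of the observed queue length agree; the bot stops letting the queue grow once they are close.

```python
import random
from statistics import mean, pvariance

random.seed(1)
queue, history = 0, []
for tick in range(1, 20_001):
    # Uncertain harvest: arrivals are Binomial(10, 0.3), about 3 units per tick on average.
    arrivals = sum(random.random() < 0.3 for _ in range(10))
    queue = max(0, queue + arrivals - 3)      # the bot ships 3 units per tick
    history.append(queue)
    if tick % 5000 == 0:
        m, v = mean(history), pvariance(history)
        print(f"tick {tick:6d}  queue mean={m:7.2f}  variance={v:7.2f}  gap={abs(m - v):.2f}")
```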
What is the difference between the two cases?
In the first, the uncertainty is fixed everywhere; I have a fixed error all the way down the X axis. In the second case the error changes going down the axis. If I fix my bots to the maximum error, then some bots will over-hedge and no longer meet the no-arbitrage condition. So the bots do a run-time Black-Scholes. There is no consistent safe rate.
What does the queue look like?
First, consider Janet's position. As an encoder she clocks once per scan of the X axis. But as a decoder, she clocks once per incoming message. Look at this as the encoder problem. Janet is not reporting to anyone, except staff salaries, the shredder, and the press. So she always consumes all of her input and wants that as close to one as possible. Her output queue is zero. She is looking at the entire X axis, her mean error is about zero, and her X variance is maximum. She wants lots of member banks to spread her error so she produces no output. Her input queue is the largest of all the bots; her output is always zero.
For the rest of the bots going down the chain, their output queue averages half of Janet's input queue, about N/2 for N bots, by conservation of queue and by guessing. But their input queues will still be large; they look at a considerable amount of the X axis, depending upon the aggregation ratio. Somewhere toward the long end (toward the fruit pickers), the input and output queues almost match, within 1/2. They look at a small section of the X axis; they are nearly impervious to the rest of the X axis.
Where is the decoder? I guess it's an echo right off Janet, bouncing back down through the cotangent bundle, supported going down. When Janet makes more than one error of botcoin, she has an arbitrage; she wins the game of Wythoff. So she is going to wait on input until she gets a big N that drops the portion of t'' she needs to adjust. So, slow the bet rate down, lower rates, and vice versa. Growing N is a reduction of the payoff rates on bets. Bets earn t'', and Janet has gone to 1/N of that, with N larger. That in turn puts the bots down the chain off on their t'', and they add a bit of N.
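Since the bots are Wythoff players, here is the standard cold-position test for the game itself (a known result about Wythoff's game, not something specific to this model): a position (a, b) with a <= b is a losing position for the player to move exactly when a = floor((b - a) * phi), with phi the golden ratio.

```python
from math import floor, sqrt

PHI = (1 + sqrt(5)) / 2

def is_cold(a: int, b: int) -> bool:
    """True if (a, b) is a cold (losing for the player to move) Wythoff position."""
    a, b = sorted((a, b))
    return a == floor((b - a) * PHI)

# The first few cold positions: (0,0), (1,2), (3,5), (4,7), (6,10), (8,13), ...
print([(a, b) for a in range(12) for b in range(a, 20) if is_cold(a, b)][:6])
```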
What kind of error function on the X axis is allowed?
I think that is where Lucas, the polynomial guy, comes in. He has a series of Lucas polynomials which all have spectral properties relating to boundary hyperbolics, especially around the Newton unit circle. Look at those various polynomials; I think when x = 1 in those equations, they all make the Wythoff array, and the first two rows are for beginners, like me. But x never makes it to one, because that makes an irrational and we are on a Newton Strike.
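A small check on the Lucas-polynomial remark. The standard recurrences are F_n(x) = x*F_{n-1}(x) + F_{n-2}(x) with F_0 = 0, F_1 = 1 (Fibonacci polynomials) and L_n(x) = x*L_{n-1}(x) + L_{n-2}(x) with L_0 = 2, L_1 = x (Lucas polynomials). Evaluated at x = 1 they give the Fibonacci and Lucas numbers, which, after their first few terms, are the familiar first two rows of the Wythoff array; whether x ever reaches one in the model is the question raised above, not settled here.

```python
def poly_at_one(a0: int, a1: int, count: int):
    """Evaluate the recurrence p_n = x*p_{n-1} + p_{n-2} at x = 1."""
    seq = [a0, a1]
    while len(seq) < count:
        seq.append(seq[-1] + seq[-2])
    return seq

print("Fibonacci polynomials at x=1:", poly_at_one(0, 1, 10))  # 0 1 1 2 3 5 8 13 21 34
print("Lucas polynomials at x=1:    ", poly_at_one(2, 1, 10))  # 2 1 3 4 7 11 18 29 47 76
```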
Why would the error function on X change? I dunno, mostly because physicists invent fun games. But, say, someone bops an atom with light; you move up the polynomial ladder, and maybe get to move x closer to one.
Anyway, these bots are Wythoff players, Feynman diagrammers, Hopfield neural net jumpers, and Groupies. So we have the cotangent/tangent optimum we have to meet. That is the uniform convergence of the power series: the cotangent series has to match the tangent series in their approximation as order increases. The series come out of the Shannon condition and have a Lucas solution on hyperbolics.
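One way to read "the cotangent series has to match the tangent series as order increases": truncate both expansions around x = 0 and check that 1/tan agrees with cot ever more closely as the order grows. This sketch uses sympy series expansions and only illustrates that convergence, not the Shannon or Lucas claims.

```python
import sympy as sp

x = sp.symbols('x')
point = sp.Rational(3, 10)   # test point inside the unit circle

for order in (4, 8, 12):
    tan_trunc = sp.series(sp.tan(x), x, 0, order).removeO()
    cot_trunc = sp.series(sp.cot(x), x, 0, order).removeO()
    gap = abs((1 / tan_trunc - cot_trunc).subs(x, point))
    print(f"order {order:2d}: |1/tan_series - cot_series| at x=0.3 -> {sp.N(gap, 6)}")
```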