Saturday, May 31, 2014

Great work, Mr. Gibbs and Wiki, for the explanation.

In statistical mechanics, the Gibbs algorithm, first introduced by J. Willard Gibbs in 1878, is the injunction to choose a statistical ensemble (probability distribution) for the unknown microscopic state of a thermodynamic system by minimising the average log probability

\langle \ln p_i \rangle = \sum_i p_i \ln p_i

subject to the probability distribution satisfying a set of constraints (usually expectation values) corresponding to the known macroscopic quantities.

You are right up there with Planck and Heisenberg, Weinberg and Higgs. And that Russian guy with no vowels.
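Gibbs's prescription above can be checked with a quick numerical sketch (my own illustration; the choice of N = 8 and the random trial distributions are arbitrary). Among all distributions over N states, the uniform one minimises the average log probability, i.e. maximises the entropy:

```python
import math
import random

def avg_log_prob(p):
    """Average log probability <ln p> = sum_i p_i ln p_i (negative entropy, in nats)."""
    return sum(pi * math.log(pi) for pi in p if pi > 0)

N = 8
uniform = [1.0 / N] * N

random.seed(0)
for _ in range(1000):
    # Draw a random normalised distribution over N states.
    w = [random.random() for _ in range(N)]
    s = sum(w)
    p = [wi / s for wi in w]
    # The uniform ensemble minimises <ln p>, i.e. maximises the Gibbs entropy.
    assert avg_log_prob(uniform) <= avg_log_prob(p) + 1e-12

print(round(-avg_log_prob(uniform), 4))  # ln 8 ≈ 2.0794 nats
```

With constraints added (e.g. a fixed mean energy), the same minimisation yields the familiar Boltzmann distribution instead of the uniform one.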
Entropy has the dimension of energy divided by temperature, which has a unit of joules per kelvin (J/K) in the International System of Units.

Correct, that is bandwidth per quantum of the third-order moment.


We are now ready to provide a definition of entropy. The entropy S is defined as

S = k_B \ln \Omega
where

k_B is Boltzmann's constant and
\Omega is the number of microstates consistent with the given macrostate.
Here "consistent" means the Hamiltonian of the system is consistent between the micro and macro descriptions. That implies the decoding map is not necessary. If you include the decoding map, then you have made the system consistent, but you have also added the entropy of the decoding map.
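A quick numerical illustration of S = k_B ln Ω (my own toy example; the ten-spin macrostate is an arbitrary choice, not from the post):

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K (exact SI value)

def boltzmann_entropy(omega):
    """S = k_B ln(Omega) for Omega microstates consistent with the macrostate."""
    return K_B * math.log(omega)

# Toy macrostate: 10 distinguishable two-level spins with exactly 5 spins up.
omega = math.comb(10, 5)  # 252 microstates share this macrostate
print(omega, boltzmann_entropy(omega))  # 252 microstates, S ~ 7.6e-23 J/K
```

Note that a macrostate with a single compatible microstate (Ω = 1) has zero entropy, consistent with the units quoted above: joules per kelvin.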
In classical statistical mechanics, the number of microstates is actually uncountably infinite, since the properties of classical systems are continuous. For example, a microstate of a classical ideal gas is specified by the positions and momenta of all the atoms, which range continuously over the real numbers. If we want to define Ω, we have to come up with a method of grouping the microstates together to obtain a countable set. This procedure is known as coarse graining.
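The coarse-graining step can be sketched directly: partition a continuous phase space into cells of finite width, so that the uncountable set of (x, p) points collapses to a countable set of occupied cells. (This is my own minimal illustration; the cell widths and sample counts are arbitrary.)

```python
import random

def coarse_grain(samples, cell):
    """Map continuous (x, p) phase-space points to a countable set of cells."""
    return {(int(x // cell), int(p // cell)) for x, p in samples}

random.seed(1)
points = [(random.uniform(0, 1), random.uniform(0, 1)) for _ in range(10_000)]

# Finer cells resolve more distinct microstates: Omega depends on the graining.
omega_coarse = len(coarse_grain(points, 0.25))
omega_fine = len(coarse_grain(points, 0.05))
assert omega_coarse <= omega_fine
print(omega_coarse, omega_fine)
```

The key point survives the illustration: Ω is not a property of the system alone but of the system plus the chosen graining, which is exactly where the "decoding map" discussion below comes in.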

Coarse graining is actually the business of physics; that is what it is all about: finding the decoding map. The states in nature are not continuous, but Isaac's rules of grammar are defined for continuous systems. Isaac's rules just define a discrete set of functions which go to their limit faster than the finite number line grows in size.

The decoding map:

The correct interpretation of the Shannon channel rate in the presence of carrier noise includes the bandwidth needed to send the decoding map. That is, it is information flow up to the limit of Gaussian noise in the channel. If you know how many degrees of freedom are needed for the map, then take your baud rate and include room for the map, separating the two functions.

In physics, the proton does this: it sets aside bauds to send information to the vacuum, called quantum entanglement. It is just enough information, 1 1/2 to 2 bits, so the vacuum and the proton can agree on space curvature, which is the only map they ever use. The 1 1/2 to 2 bit measure is a third moment, and according to the MIT cold people, that is 180 meters, or 6.4e6 meters as a first moment. That means gravity knows the typical space curvature at 1e6 meters. But within 180 meters, quantum entanglement to 1 1/2 degrees of freedom is maintained.

So I think 180/6.4e6 is the correct Schwarzschild radius, or about 2.8e-5. But we have to do the 4*Pi/3, I think, or about 1.2e-4. And I am off an order of magnitude. Hmmm. That point should be where gravity is completely part of the proton baud system. Together they have gained 2 baud in group organization. Am I supposed to multiply by 2^3 to account for encoding gain? That would be the same as packing efficiency, which I have as 3.
