
Maxwell’s Demon and counting

I have already written about Maxwell's demon in a previous post some time back. Unless you know the demon well, you should read that post before going on with this one. He is a most interesting wee beastie.

In the early days of the demon, the notion was that he could decide which molecules had greater or lesser energy, and with this knowledge decrease the entropy of the system by raising the temperature of one side of the container and lowering that of the other. By doing this without any expenditure of work, he violates the second law of thermodynamics. This operation is shown in the first few images in the gallery below. In the next-to-last images, the demon is shown unmixing a mixture of gas molecules, again decreasing the entropy of his system without doing work. Finally, in a cycle due to Szilard, the problem is reduced to a single molecule in a box. The demon first detects which half of the box contains the molecule. He then introduces a diaphragm that separates the box into two halves. Knowing which half contains the molecule, the demon cleverly attaches a mechanism to the diaphragm so that the pressure exerted against it by the molecule, with its random kinetic energy at temperature T, lifts a weight. Work is done by the molecule. The diaphragm is then removed and the cycle begins again.

In the context of my last post on numbers, it should be immediately apparent that all of these cases involve counting. This is especially apparent in the model due to Szilard. Over the history of the analysis of the demon, it became apparent that the simple counting problem he faces involves exactly 1 bit of information, with an associated energy of k_B T ln(2), where k_B is the Boltzmann constant and T is the temperature.
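As a quick sanity check, here is a minimal sketch in Python of the energy that goes with that one bit; the 300 K room temperature is just an illustrative choice.

    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K
    T = 300.0           # an illustrative room temperature, K

    # Energy associated with acquiring (or erasing) one bit at temperature T
    E_bit = k_B * T * math.log(2)
    print(f"k_B T ln(2) at {T} K = {E_bit:.3e} J")  # ~2.87e-21 J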

From the point of view of classical physics, the problem in the Szilard model can be stated as either "the molecule is in the right half of the box" or "the molecule is in the left half of the box". Without a measurement, there is a symmetry to the situation, because the truth of either proposition is unknown. The act of measurement breaks the symmetry by defining a truth value for these propositions: one is true, the other false. This symmetry breaking is not spontaneous as such; it is the consequence of an act by the demon. The result of the measurement is also a count: either "1" for the right half and "0" for the left, or vice versa. Whatever the count is, one bit of information is obtained by making it.
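A minimal sketch of that measurement as a coin flip, assuming (as in Szilard's construction) that the two halves are equally likely:

    import math
    import random

    # One Szilard measurement: the molecule is equally likely to be in
    # either half, so each outcome carries -log2(1/2) = 1 bit.
    bit = random.randint(0, 1)  # 1: right half, 0: left half
    info = -math.log2(0.5)      # surprisal of the outcome, in bits
    print(f"count: {bit}, information gained: {info:.0f} bit")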

Furthermore, knowledge of the count allows work to be done. The acquisition of information yields work and hence energy. In the early analysis of this model, it was postulated that the balance between the work done per cycle and the entropy originated in the process of acquiring the 1 bit of information. However, a more complete study of the problem in the context of digital computation (Landauer's principle, later applied to the demon by Bennett) revealed that the detailed balance between the work per cycle and the entropy increase of the system is actually achieved in the erasure of the information obtained at the measurement stage. In other words, if we were to construct a digital system to execute Szilard's model cyclically, we would have to include a 1-bit memory storage element in our digital demon. At the end of the cycle, in preparation for the measurement step of the following cycle, we have to erase the information of the previous cycle. When this erasure is done, one bit of information is lost, and the entropy of the surroundings increases by k_B ln(2), in detailed balance with the work k_B T ln(2) done during the cycle.
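Here is a minimal bookkeeping sketch of such a digital demon under those assumptions: each cycle extracts k_B T ln(2) of work, and erasing the register at the end of the cycle dissipates the same amount, so the ledger nets to zero. The temperature and cycle count are arbitrary illustrative choices.

    import math
    import random

    k_B, T = 1.380649e-23, 300.0     # J/K, and an illustrative temperature in K
    W_CYCLE = k_B * T * math.log(2)  # work per cycle, J

    memory = None                    # the demon's 1-bit register, initially blank
    work_out = heat_in = 0.0

    for _ in range(1000):
        memory = random.randint(0, 1)  # measure: which half holds the molecule
        work_out += W_CYCLE            # isothermal expansion lifts the weight
        heat_in += W_CYCLE             # erasure dumps k_B T ln(2) to the bath
        memory = None                  # reset the register for the next cycle

    print(f"work extracted: {work_out:.3e} J, erasure cost: {heat_in:.3e} J")
    print(f"net: {work_out - heat_in:.1e} J")  # zero, as the second law demands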

We do not have to erase that bit of information; we could store a history of the experiment. We could transmit a copy to a colleague. In this way, we could delay the increase in entropy, perhaps indefinitely. The work done is balanced by the information stored.

This model is a classical one. It involves the notion that the position and momentum of the molecule can both be estimated simultaneously in such a manner as to achieve work. In quantum mechanics, this becomes somewhat more difficult. The act of localizing the molecule to one half of the box allows the uncertainty in the momentum to increase such that trapping it on the side where it was found becomes just about impossible. W. H. Zurek analyzed this problem and published the result in "Maxwell's Demon, Szilard's Engine and Quantum Measurements", in Frontiers of Nonequilibrium Statistical Physics, ed. G. T. Moore and M. O. Scully (Plenum Press, New York, 1984), pp. 151–161. Zurek's analysis proceeds by accounting for the quantum states of the demon. He finds, in a manner consistent with what I mentioned earlier, that whatever the details of the quantum mechanical system under consideration, the demon must be reset from a final, entangled measurement state back to a "ready to measure" state. This reset of the demon, again, removes one bit of information from the system.

In short, counting gathers information; preparing a new count loses information. I have been focussed on the case of counting a single item. What about counting up to 2 or 3? This question can be answered in terms of a successor function. We simply ask, "is there another one?" We can start at "0" and ask, is there another one? We may count "1". We then ask, is there another one? We may count "2". We then ask, is there another one? We may count "3", and so on. We are done when the answer to our question is finally "no".
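A minimal sketch of that successor-style count in Python; the count function and the little list it consumes are hypothetical stand-ins for whatever is being counted.

    def count(items):
        """Count by repeatedly asking 'is there another one?'"""
        it = iter(items)
        n = 0
        while True:
            try:
                next(it)       # is there another one?
            except StopIteration:
                return n       # the answer is finally "no"
            n += 1             # successor: n -> n + 1

    print(count(["ewe", "ram", "lamb"]))  # 3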

This raises an interesting question about the amount of information obtained at each of these recursive steps. Is it 1 bit per step? The answer, of course, is given by the negative logarithm of the probability of each outcome; the expected value of that quantity over all possible outcomes is the entropy. In the Szilard model, the problem is constructed so that there is only one molecule, and the count on the right or left side is equally probable. If we were instead counting all of the sheep on Earth, getting the first 100 sheep counted is no news whatsoever. The count does not get interesting until we have a few hundred million on the books. (According to Wikipedia, the UN FAO estimates something over 1 billion sheep. There is some a priori information to prime your Bayesian thinking with.) In short, the amount of information depends on what we are counting. If we were counting decimal numeric representations of the natural numbers, there would be no information in counting the next one at all, no matter how large the number: it is guaranteed that there is a representation for the next number.
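A minimal sketch of that dependence: the information in an observation is the negative base-2 logarithm of its probability, so near-certain counts carry almost nothing. The probabilities below are invented for illustration.

    import math

    def surprisal_bits(p):
        """Information, in bits, carried by an event of probability p."""
        return 0.0 - math.log2(p)

    for label, p in [("next decimal numeral exists", 1.0),
                     ("molecule on the right half", 0.5),
                     ("an unlikely tail event", 0.01)]:
        print(f"{label}: p = {p} -> {surprisal_bits(p):.2f} bits")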

It might be worth reviewing our units. From thermodynamics, we have an equation of the form

 dW = P dV - T dS

where dS is a differential change in entropy, P is pressure, and dV is a differential change in volume. In the present context, we are equating a certain amount of work \Delta W to a change in entropy of \Delta S = k_B ln(2) at a given temperature T. Our unit of information, the bit, comes from the use of base-2 logarithms instead of the natural logarithms commonly used in equations of physics like this one. It would be a trivial matter to re-express Boltzmann's constant in a way consistent with base-2 logarithms, or equally, to divide all of our units through by the constant and express energy in terms of bit-degrees. That is, the expression above shows us that energy is equivalent to temperature in kelvin, times k_B ln(2), times bits. Dividing by c^2 would yield units of mass in terms of bits. Interesting? Conversely, one might measure information in joules per kelvin instead of bits, multiplying through by k_B ln(2).
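A minimal sketch of those conversions, with 300 K again an arbitrary illustrative temperature: one bit corresponds to k_B ln(2) joules per kelvin, to k_B T ln(2) joules at temperature T, and, dividing by c^2, to a mass.

    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K
    c = 2.99792458e8    # speed of light, m/s
    T = 300.0           # illustrative temperature, K

    entropy_per_bit = k_B * math.log(2)   # J/K per bit
    energy_per_bit = entropy_per_bit * T  # J per bit at temperature T
    mass_per_bit = energy_per_bit / c**2  # kg per bit, via E = m c^2

    print(f"1 bit = {entropy_per_bit:.3e} J/K")
    print(f"1 bit at {T} K = {energy_per_bit:.3e} J = {mass_per_bit:.3e} kg")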

Back to counting and state… In statistical thermodynamics, one generally computes a partition function Z, also known as a "sum over states": the sum, over all possible states of the system, of their Boltzmann weights. The probability of the occupancy of a particular state, or group of states, is obtained by dividing the weight of that state or group of states by the partition function Z. To mention that this is a counting operation, and that it goes directly to an estimate of entropy, is virtually trivial. Like a Maxwell demon, we may sample the energy of specific molecules in a gas at some temperature. That we obtain less information in finding one near the mean energy than in finding one at three times the mean goes directly to the negative logarithm of the corresponding probabilities; that is, the information.
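A minimal sketch of this, using an arbitrary discrete grid of energy levels in units of k_B T with Boltzmann weights e^{-E / k_B T}; E = 1 k_B T is close to the mean of this distribution, and sampling a molecule at three times that energy carries visibly more bits.

    import math

    # Discrete energy levels in units of k_B T (an arbitrary grid for illustration)
    levels = [0.1 * n for n in range(200)]

    # Partition function: the "sum over states" with Boltzmann weights
    Z = sum(math.exp(-E) for E in levels)

    def prob(E):
        """Boltzmann probability of the level with energy E (in units of k_B T)."""
        return math.exp(-E) / Z

    for E in (1.0, 3.0):  # near the mean energy, and at three times it
        p = prob(E)
        print(f"E = {E} k_B T: p = {p:.4f}, surprisal = {-math.log2(p):.2f} bits")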

Here is a simple exercise in Planck units. The Planck mass is 2.17651 \times 10^{-8} kg, which is equivalent to a Planck energy of 1.9561 \times 10^{9} joules. The Planck temperature is a whopping 1.416834 \times 10^{32} kelvin. Dividing the Planck energy by the Planck temperature gives a Planck entropy of 1.3806 \times 10^{-23} joules per kelvin. But this is exactly Boltzmann's constant. The Planck entropy is precisely 1 in natural units.
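Here is a minimal check of that arithmetic in Python, using the values quoted above:

    # Verify the Planck-entropy arithmetic with the values quoted above
    c = 2.99792458e8   # speed of light, m/s
    m_P = 2.17651e-8   # Planck mass, kg
    T_P = 1.416834e32  # Planck temperature, K

    E_P = m_P * c**2   # Planck energy, J
    S_P = E_P / T_P    # "Planck entropy", J/K

    print(f"E_P = {E_P:.4e} J")    # ~1.9561e9 J
    print(f"S_P = {S_P:.4e} J/K")  # ~1.3806e-23 J/K, i.e. Boltzmann's constant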

Fascinating…
