Note added March 31, 2002.

On the ARN website, someone identified only as Erik accused me of misunderstanding Dembski's definition of information. [See "How to Evolve Specified Complexity by Natural Means."] He is correct, but the point is irrelevant.

Specifically, Dembski defines the information in a sequence of numbers as -log(P), where P is the probability of generating that sequence randomly and the logarithm is to the base 2. Information is more commonly defined in terms of entropy, which differs from Dembski's information: the entropy is the average information needed to specify, for example, a sequence of a given length, whereas Dembski's information is the information of a specific sequence and reduces to the entropy only in a special case. I call the units of -log(P) Dembski bits, or dits, to distinguish them from ordinary bits.
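In symbols (the notation is mine, not Dembski's), Dembski's information attaches to a specific outcome x, whereas the entropy averages over all possible outcomes:

\[
I_D(x) = -\log_2 P(x) \quad \text{[dits]}, \qquad
H = -\sum_x P(x) \log_2 P(x) \quad \text{[bits]}.
\]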

In almost all his examples, Dembski uses a sequence of binary digits. He assumes, without justification, that the digits in the sequence are independent of one another and that each digit is 0 or 1 with probability ½. In that special case, Dembski's information is exactly equal to the entropy of the sequence; that is, 1 dit = 1 bit.
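To spell out the arithmetic (my worked example, not Dembski's): any particular sequence of n such digits has probability

\[
P = 2^{-n}, \qquad \text{so} \qquad -\log_2 P = n \ \text{dits},
\]

while the entropy of the ensemble of all 2^n equally likely sequences is

\[
H = -\sum_{k=1}^{2^n} 2^{-n} \log_2 2^{-n} = n \ \text{bits},
\]

so the two measures coincide.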

The point is moot anyway: My model will generate Dembski information as readily as it will generate conventional information.

Erik further claims that the entropy of information theory is not the same as the entropy of thermodynamics. I will say only that the two entropies may be described by the same equation, so at a minimum they may be analogized. Erik's point is nevertheless irrelevant to my argument, inasmuch as my model does not depend on any analogy between information theory and thermodynamics. Erik's discussion of the two vials of gas is likewise irrelevant because the state of the gas is not specified in the sense that Dembski means it; that is, it does not exhibit "the type of pattern characteristic of intelligence" (Intelligent Design, p. 158).
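For the record, the equations in question have the standard textbook forms (these are common knowledge, not quotations from Erik or Dembski):

\[
H = -\sum_i p_i \log_2 p_i \quad \text{(Shannon)}, \qquad
S = -k_B \sum_i p_i \ln p_i \quad \text{(Gibbs)},
\]

which differ only in the base of the logarithm and the factor of Boltzmann's constant k_B.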

In a different vein, Dembski sometimes uses a wholly different measure of probability: the probability of choosing one or more relevant outcomes from a set of possibilities (thanks to Richard Wein for pointing this out). This measure allows Dembski to call a uniform probability distribution complex, since it is only one choice out of an infinite number of possibilities. A crystal, however, is not complex, according to Dembski (No Free Lunch, p. 12). Thus, to Dembski, a snowflake is simpler than an antenna pattern that is the same in all directions. I hope to have more to say about Dembski's use of probability in a paper tentatively titled "Dembski's Arrow."
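Schematically, and on my reading rather than in Dembski's own notation: if the uniform distribution is treated as one relevant outcome among N possible distributions, then

\[
P = \frac{1}{N}, \qquad -\log_2 P = \log_2 N \longrightarrow \infty \ \text{as} \ N \to \infty,
\]

so by this counting measure the uniform distribution can be assigned arbitrarily high complexity.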