The Probability of Information -- Part 3
A thoughtful person once asked the question: “Does entropy apply to information?” The answer is obviously “yes”, but why? The short answer: because the probability of information is extremely low. The purpose of this blog entry is to explain that answer.
Consider the highly organized sequence of characters below.
Mathematics is the language of science. To succeed in science, one must use mathematics. Thus high quality science depends on high quality mathematics.
Now if we were to apply a random single-point mutation to the above paragraph, we might get the paragraph below.
Mathematics is the language of science. To succeed iK science, one must use mathematics. Thus high quality science depends on high quality mathematics.
In this case an “n” was replaced by a “K”. The resulting character sequence is less organized than the original paragraph; in other words, the entropy of the sequence increased. We could apply another random single-point mutation to the sequence and happen to get the original back. That would be an increase in organization and therefore a decrease in entropy. However, given the 55-character set used in The Probability of Information, there are 16,665 ways to apply a single-point mutation to the sequence, as shown below, and only one of them — changing the “K” back to an “n” — represents a decrease in entropy.
Any of the 151 characters in the sequence can be changed to one of the other 54 characters (151 x 54).
Any character can be deleted (151).
Any of the 55 characters can be inserted into or appended to the sequence (152 x 55).
(151 x 54) + 151 + (152 x 55) = 16,665
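The arithmetic above can be verified with a few lines of Python (a minimal sketch; the 151-character sequence length and 55-character alphabet size are taken directly from the text):

```python
# Count the single-point mutations available to a 151-character
# sequence drawn from a 55-character alphabet.
seq_len = 151       # length of the example paragraph
alphabet = 55       # size of the character set

substitutions = seq_len * (alphabet - 1)  # each position -> 54 other characters
deletions = seq_len                       # remove any one character
insertions = (seq_len + 1) * alphabet     # 152 insertion points x 55 characters

total = substitutions + deletions + insertions
print(total)  # 16665
```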
So while it is possible for a random mutation to decrease entropy (increase organization), it is far more likely that a random mutation will increase entropy (decrease organization). In this case, the chances of increasing the organization of the second character sequence are only 1 in 16,665. If we apply just three single-point mutations, there are over 4.6 trillion (16,665 x 16,665 x 16,665) ways to do so, and the chances of increasing the organization of the sequence are considerably slimmer. In other words, if you randomly modify information, there is a very strong tendency for it to become disorganized.
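We can also check the 1-in-16,665 figure empirically with a short simulation (a sketch; the exact 55-character set is defined in The Probability of Information, so I assume here a plausible stand-in: upper- and lowercase letters plus space, comma, and period, which happens to total 55):

```python
import random
import string

# Assumed 55-character alphabet (a stand-in for the set defined in
# "The Probability of Information"): letters, space, comma, period.
ALPHABET = string.ascii_letters + " ,."
assert len(ALPHABET) == 55

def random_mutation(seq):
    """Apply one single-point mutation chosen uniformly from all
    possible substitutions, deletions, and insertions."""
    n = len(seq)
    subs = n * (len(ALPHABET) - 1)
    dels = n
    ins = (n + 1) * len(ALPHABET)
    k = random.randrange(subs + dels + ins)
    if k < subs:                              # substitution
        pos, off = divmod(k, len(ALPHABET) - 1)
        others = [c for c in ALPHABET if c != seq[pos]]
        return seq[:pos] + others[off] + seq[pos + 1:]
    k -= subs
    if k < dels:                              # deletion
        return seq[:k] + seq[k + 1:]
    k -= dels                                 # insertion
    pos, off = divmod(k, len(ALPHABET))
    return seq[:pos] + ALPHABET[off] + seq[pos:]

original = ("Mathematics is the language of science. To succeed in science, "
            "one must use mathematics. Thus high quality science depends on "
            "high quality mathematics.")
damaged = original.replace("succeed in", "succeed iK", 1)

# How often does a random mutation restore the original?  The
# expected rate is 1 in 16,665, i.e. a handful of hits per 100,000.
trials = 100_000
hits = sum(random_mutation(damaged) == original for _ in range(trials))
print(hits, "restorations in", trials, "trials")
```

Every other mutation leaves the sequence at least as disorganized as before, which is the asymmetry the argument rests on.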
If you continue to apply random modifications to the character sequence, it will wander away from information into a vast sea of nonsense, never to return. Is it possible that the sequence will ever again turn into something resembling a coherent thought? Well … yes. However, we showed in The Probability of Information that the chances of a character sequence of similar size representing information are less than 1 in 8.5E+90. So you could make trillions of single-point modifications per microsecond for a length of time comparable to the age of the universe and probably still not happen upon another sequence representing information.
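That closing claim can be checked with back-of-the-envelope arithmetic (a sketch; I read “trillions per microsecond” as 10^12 mutations per microsecond and take roughly 13.8 billion years for the age of the universe):

```python
# "Trillions of single-point modifications per microsecond" sustained
# for the age of the universe, versus odds of less than 1 in 8.5e90.
mutations_per_second = 1e12 * 1e6                  # 1e12 per microsecond
age_of_universe_s = 13.8e9 * 365.25 * 24 * 3600    # ~4.35e17 seconds
total_attempts = mutations_per_second * age_of_universe_s

print(total_attempts)           # about 4.4e35 attempts
print(total_attempts / 8.5e90)  # about 5e-56 -- effectively zero
```

Even this absurdly generous mutation rate falls about 55 orders of magnitude short of the odds, which is why the sequence is, for all practical purposes, never coming back.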