### Bits

The information theoreticians maintain that any piece of information can be reduced to small "information units." Think of the chemist, who breaks down - theoretically, at any rate - every kind of matter into its constituent elements, and these elements again into atoms. The information theoreticians have developed a "computer's atom" for their own science, which they call a "bit." (We shall learn a little later on why this particular name was chosen.) A "bit," scientifically speaking, is a minimum unit of information.

Imagine that you ask someone "Are you a business man?"

He can reply in several different ways. He may say "I wish I were!"

He may also say "Good grief, what do you take me for?"

Or he can simply say "Yes" or "No".

All these answers are pieces of information, and very informative ones. The sentence "I wish I were!" is a story in itself. But the clearest and most precise replies are the "Yes" and the "No". It is not possible to give briefer answers to the question if you wish to answer it at all. "Yes" and "No" are the smallest possible units of information. They are "bits".

With a little trouble, assuming that you are fond of playing games, every piece of information can be reduced to "bit" form. This is what happens in a television quiz, for instance. A man is introduced, his trade or profession is a secret, and the competitors have to guess what it is. But the man is not allowed to give any answer but "Yes" or "No" to each of the questions put to him.

Let us assume the man is a plumber. The first question might be, "Do you work with your hands?"

The answer, of course, is "Yes".

Second question, "Do you use wood?"

Answer, "No".

So it continues. If the man's business has not been guessed in ten or twelve questions, the questioners have lost. Surprisingly enough, if reasonably intelligent questions are asked, the puzzle is often solved after the eighth or ninth question. For its solution, therefore, eight or nine "Ayes" or "Nays" - eight or nine bits - were necessary.

In our game (and in the theory of information as a whole) ten bits will suffice to provide a selection from 1,024 pieces of information. If you are allowed to put 20 questions - in other words, to use 20 bits - you have a choice of 1,048,576 possibilities. Industrious people have succeeded in working out that the page of a book contains an average of 10,000 bits, and a whole thick book ten million bits. The more abstruse an idea - the more remote it is from the normal or commonplace, and the more difficult to disentangle from a mass of similar notions - the more bits are needed to describe it.
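The counting behind these figures can be sketched in a few lines of modern code: each yes/no answer doubles the number of items that can be told apart, so n bits distinguish 2 to the power n possibilities (the numbers 1,024 and 1,048,576 above are simply 2^10 and 2^20).

```python
# Each yes/no answer (one bit) doubles the number of items we can
# tell apart, so n bits distinguish 2**n possibilities.
for bits in (1, 2, 10, 20):
    print(bits, "bits ->", 2 ** bits, "possibilities")
```

Running this prints, among other lines, "10 bits -> 1024 possibilities" and "20 bits -> 1048576 possibilities", matching the figures in the text.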

Let us take another example - a card game. Suppose you have extracted the four kings from a deck of cards: the red King of Hearts, the black King of Clubs, the red King of Diamonds and the black King of Spades. How many bits do you think you need to indicate one of these four cards? You need exactly two.

Let's say you thought of the King of Clubs. Our first question would be, "Is the card red?"

Your answer is "No".

So the card must be black, and our next question would be, "Is it the King of Spades?"

Your answer would be "No". So we would know that it must be the King of Clubs. Two answers, two bits.

It is no more difficult, but only a trifle more complicated, to designate one card (let's say the Jack of Diamonds) by bits out of a pile of 32 cards. How would you go about it?

You could make two piles of the 32 cards, one of 16 at the upper end of the table and one of 16 at the lower. Then the question-and-answer game starts:

"Is the card in the upper pile?"

The answer can only be "Yes" or "No". One bit!

You now take the pile containing the card and divide it once more into two heaps of 8 each. Again we ask, "Is the card in the upper pile?" and again you must reply.

By the time we have arrived at the fifth question, you have only two cards to choose between, and with the fifth answer the right card has been found. To discover one in a pack of 32 cards, five bits were needed.
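The halving game above can be sketched as a short program. This is a minimal sketch, not from the original text: the function name `questions_needed` and the use of a card's position as its identity are assumptions made for illustration.

```python
# A sketch of the halving game: repeatedly split the remaining
# cards into an "upper" and a "lower" pile, spending one yes/no
# question (one bit) per split, until a single card remains.
def questions_needed(deck_size, secret_index):
    lo, hi = 0, deck_size  # cards numbered lo .. hi-1 are still possible
    questions = 0
    while hi - lo > 1:
        mid = (lo + hi) // 2
        questions += 1           # "Is the card in the upper pile?"
        if secret_index < mid:   # "Yes" - keep the upper pile
            hi = mid
        else:                    # "No" - keep the lower pile
            lo = mid
    return questions

print(questions_needed(32, 11))  # five questions for a pile of 32
print(questions_needed(4, 2))    # two questions for the four kings
```

Whichever card is chosen, a pile of 32 always takes exactly five questions (32 halves to 16, 8, 4, 2, 1), just as the four kings always take two - in agreement with the counts in the text.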


First printed in Germany: 1963