Representation of Characters

Before proceeding to understand a microprocessor chip, it is necessary to understand how characters are represented in the computer. The term byte refers to a single character of storage. A byte is essentially a collection of 1s and 0s representing a character.

One is familiar with the ten distinct decimal digits 0 to 9 commonly used in the decimal arithmetic system. Computers, being electronic machines, work better with just two signals (on/off), giving them only two distinct digits to work with: 1 (for on) and 0 (for off).

This inherent ease of working with two digits makes the computer amenable to binary arithmetic (as opposed to the decimal arithmetic that one is familiar with).

Hence, in the binary system used by computers, each number or character is represented in terms of the binary digits 1 and 0. Each of these binary digits is called a bit. A collection of 8 bits or a byte is used to represent a character in the computer.

The number of bits used to represent a character will determine the number of unique characters that can be represented.

If only 3 bits were used, there would be 8 (2³) possible combinations of 1s and 0s. These are:

000, 001, 010, 011, 100, 101, 110, 111.

Each could represent a unique character in the computer. This means that 8 different characters can be represented using 3 bits. In a similar way, by using 8 bits, the PC can represent a much larger set of 2⁸ = 256 characters.
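The counting argument above can be sketched in a few lines of Python, which enumerates every 3-bit pattern and confirms that the count doubles with each added bit:

```python
from itertools import product

# Enumerate every 3-bit pattern: 2**3 = 8 combinations.
patterns = ["".join(bits) for bits in product("01", repeat=3)]
print(patterns)       # ['000', '001', '010', '011', '100', '101', '110', '111']
print(len(patterns))  # 8

# With 8 bits (one byte), the count grows to 2**8 = 256 characters.
print(2 ** 8)         # 256
```

Each additional bit doubles the number of distinct patterns, which is why n bits can represent 2ⁿ unique characters.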

As to which combination of 1s and 0s will represent which number (0-9) or character (A-Z, *, /... etc.) is decided by a representation scheme used by the computer. The most widely used representation scheme is called ASCII (American Standard Code for Information Interchange).

A sample of the ASCII representation for some characters is shown in the table below.

Character    ASCII Representation
6            0011 0110
7            0011 0111
A            0100 0001
B            0100 0010
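The entries in the table can be verified with a short Python sketch, using the built-in ord function to obtain each character's ASCII code and printing it as an 8-bit pattern:

```python
# Print the 8-bit ASCII pattern for the characters in the table,
# split into two 4-bit groups for readability.
for ch in "67AB":
    bits = format(ord(ch), "08b")  # ASCII code as an 8-bit binary string
    print(ch, bits[:4], bits[4:])
# 6 0011 0110
# 7 0011 0111
# A 0100 0001
# B 0100 0010
```

Note how the digits 0-9 share the prefix 0011 and the letters A-Z start from 0100 0001; this regularity is part of the ASCII design.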