Data and instructions presented in written or typed form can be understood only by a user who knows the language in which they are expressed; if the data is not in the user's language, he or she will not be able to understand it. It is the same with the computer: the computer's language is binary 0s and 1s, so it cannot understand typed or written instructions or data directly. Whenever data or instructions are input to the computer, they are first converted to 0s and 1s, which are called binary digits (bits). There are a number of methods used to represent data in a computer system, namely:
1. Binary Representation
2. ASCII - American Standard Code for Information Interchange
3. EBCDIC - Extended Binary Coded Decimal Interchange Code
4. Binary Coded Decimal (BCD)
5. Sign-and-Magnitude
6. One's Complement
7. Two's Complement
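As a simple illustration of the first method, binary representation, the short sketch below (written in Python purely for illustration; the choice of language and values is an assumption, not part of the original text) shows the bit patterns the computer actually stores for a decimal number and for a typed character:

    # A minimal sketch of binary representation: everything the computer
    # stores is ultimately a pattern of 0s and 1s (bits).

    number = 13
    character = "A"

    # Decimal 13 stored as an 8-bit binary number.
    print(format(number, "08b"))          # -> 00001101

    # The character 'A' is stored as its character code (65 in ASCII),
    # which in turn is held as a bit pattern.
    print(format(ord(character), "08b"))  # -> 01000001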
Representation of Characters

Data used on the computer for input and output operations are expressed using characters, because this is what we humans understand. These characters are broken down into three categories, namely:
1. Numeric - These include all the digits from 0 to 9.
2. Alphabetic - These include all the letters from A to Z and a to z.
3. Special Characters - These include punctuation marks, symbols, etc.
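To make the three categories concrete, the following sketch (again in Python, purely illustrative; the sample string is an assumption) classifies each character of a short string:

    # A minimal sketch classifying characters into the three categories
    # described above: numeric, alphabetic, and special characters.

    def classify(ch):
        if ch.isdigit():      # 0 to 9
            return "Numeric"
        elif ch.isalpha():    # A to Z, a to z
            return "Alphabetic"
        else:                 # punctuation, symbols, spaces, etc.
            return "Special"

    for ch in "Pay $25!":
        print(repr(ch), "->", classify(ch))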
The code used to represent each character is usually a unique group of 7 or 8 binary digits (bits). There are several methods used to represent characters, namely:
➢ ASCII - The most popular method used to represent characters in computers is the American Standard Code for Information Interchange. This is a 7-bit code. Sometimes an eighth bit, called a parity bit, is added for checking purposes (a short sketch after the note below illustrates this).
➢ EBCDIC - The Extended Binary Coded Decimal Interchange Code is an 8-bit code that is widely used on IBM machines.
NB:- These codes are called character codes and are widely used to represent non-numeric data.
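The sketch below (an illustrative Python example; the even-parity scheme shown is one common convention, assumed here for concreteness) shows the 7-bit ASCII code of a character and how an eighth parity bit can be added for checking purposes:

    # A minimal sketch of a 7-bit ASCII code plus an even-parity bit.
    # With even parity, the extra bit is chosen so that the total number
    # of 1s in the resulting 8-bit group is even.

    ch = "A"
    ascii_bits = format(ord(ch), "07b")          # 'A' -> 1000001 (decimal 65)
    parity_bit = str(ascii_bits.count("1") % 2)  # 1000001 has two 1s -> parity 0
    print(ch, "->", ascii_bits, "with parity bit:", parity_bit + ascii_bits)

If a single bit is flipped during transmission, the count of 1s becomes odd and the receiver can detect that an error occurred.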
The