1.4 DATA REPRESENTATION: BIT, BYTE AND CHARACTER
BIT
A bit (short for binary digit) is the smallest unit of data a computer can process. The binary system is a number system that has just two unique digits, 0 and 1, called bits. These two values correspond to the states on and off, true and false, or yes and no.
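As a quick illustration, here is a short Python sketch (Python is our choice for illustration; it is not part of the original notes) that counts from 0 to 5 in binary. Notice that only the digits 0 and 1 ever appear:

    # Counting from 0 to 5 in the binary system.
    # Every value is written using only the two digits (bits) 0 and 1.
    for n in range(6):
        print(n, format(n, "b"))
    # Output:
    # 0 0
    # 1 1
    # 2 10
    # 3 11
    # 4 100
    # 5 101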
BYTE
A byte is a unit of information built from bits. When 8 bits are grouped together as a unit, they form a byte. Bits and bytes are the basis for representing all meaningful information and programs on computers.
CHARACTER
One byte is equal to 8 bits. One byte represents a single character such as a number, letter, or symbol. For example, the capital letter F is represented by the binary code 01000110, which the computer system can understand.
A computer does not understand letters, numbers, pictures, or symbols directly. It uses the binary system to count, as it recognizes only two states, 0 and 1. The number 9 is represented by the binary code 00111001. Eight bits grouped together as a unit are called a byte, and a byte represents a single character in the computer.
1 byte = 8 bits = 1 character
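The character examples above can be checked with another short Python sketch (again an illustrative choice). The built-in ord() returns a character's numeric code, and format(code, "08b") writes that code as 8 bits:

    # Each character corresponds to one byte (8 bits).
    for ch in ("F", "9"):
        code = ord(ch)                      # numeric code of the character
        print(ch, code, format(code, "08b"))
    # Output:
    # F 70 01000110
    # 9 57 00111001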
There are three character codes, or coding schemes, used to represent characters: ASCII, EBCDIC and Unicode. Each byte contains eight bits, so a byte provides enough different combinations of 0s and 1s to represent 256 characters.
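The figure of 256 follows directly from the 8 bits in a byte: each bit can be 0 or 1, and each additional bit doubles the number of possible patterns. A one-line Python check:

    # Each of the 8 bits can be 0 or 1, so a byte has 2**8 distinct patterns.
    print(2 ** 8)   # 256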
ASCII is the most widely used binary code for microcomputers (personal computers).
EBCDIC was developed by IBM and is used primarily for large computers such as mainframes and high-end servers.
The combinations of 0s and 1s are defined by patterns. These patterns are called a coding system. The 256-character capability of ASCII and EBCDIC is too small to represent languages such as Arabic, Japanese and Chinese.
The Unicode coding scheme is designed to solve this problem. It uses two bytes (16 bits) to represent one character, giving it the capability to represent 65,536 different characters. This can cover all the world's languages. Unicode is backward-compatible with ASCII, meaning that Unicode can recognize ASCII characters.
Unicode
2 bytes = 16 bits = 1 character
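The earlier sketch extends naturally to Unicode. Two bytes give 2**16 = 65,536 possible characters, and ASCII characters such as F keep their original codes, which is what backward compatibility means here (the non-English sample characters below are our own illustrative picks):

    # Unicode code points shown as 16 bits (2 bytes).
    # ASCII characters keep their old codes; other scripts get higher ones.
    for ch in ("F", "中", "あ"):
        print(ch, ord(ch), format(ord(ch), "016b"))
    # Output:
    # F     70 0000000001000110
    # 中 20013 0100111000101101
    # あ 12354 0011000001000010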