Decimal architecture


Decimal computers are computers that can represent numbers and addresses in decimal and that provide instructions for operating on those numbers and addresses directly in decimal, without conversion to a pure binary representation. Some also had a variable word length, which enabled operations on numbers with a large number of digits.

Early computers that were exclusively decimal include the ENIAC, the IBM NORC, the IBM 650, the IBM 1620, and the IBM 7070. In these machines the basic unit of data was the decimal digit, encoded in one of several schemes, including binary-coded decimal (BCD), bi-quinary, excess-3, and two-out-of-five code. Except for the 1620, these machines used word addressing. When non-numeric characters were used in these machines, they were encoded as two decimal digits.
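
These schemes differ only in how the ten digit values are mapped onto bit patterns. The Python sketch below illustrates three of them; the bit assignments are the common textbook ones, and the two-out-of-five mapping shown is one valid choice rather than that of any particular machine.

    # Illustrative encodings of a single decimal digit (0-9).
    # BCD and excess-3 are computed; the two-out-of-five table is one
    # possible assignment (every codeword has exactly two bits set).

    def bcd(digit):
        """Binary-coded decimal: the digit's value in four bits."""
        return format(digit, "04b")

    def excess_3(digit):
        """Excess-3: the digit plus three, in four bits (0 -> 0011, 9 -> 1100)."""
        return format(digit + 3, "04b")

    TWO_OUT_OF_FIVE = {
        0: "01100", 1: "11000", 2: "10100", 3: "10010", 4: "10001",
        5: "01010", 6: "01001", 7: "00110", 8: "00101", 9: "00011",
    }
    assert all(code.count("1") == 2 for code in TWO_OUT_OF_FIVE.values())

    for d in range(10):
        print(d, bcd(d), excess_3(d), TWO_OUT_OF_FIVE[d])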

Other early computers were character-oriented, providing instructions for performing arithmetic on character strings of decimal numerals. On these machines the basic data element was an alphanumeric character, typically encoded in six bits. The UNIVAC I and UNIVAC II used word addressing, with 12-character words. IBM examples include the IBM 702, IBM 705, the IBM 1400 series, the IBM 7010, and the IBM 7080.
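
On such machines the arithmetic instructions walked a numeric field one character position at a time, from the least significant digit upward. The sketch below illustrates that digit-serial addition in Python, using ordinary digit characters rather than any 6-bit machine code; it is an illustration of the idea, not a model of a specific instruction.

    # Digit-serial addition of two unsigned decimal numbers held as
    # character strings, processed from the least significant digit.

    def add_decimal_strings(a, b):
        width = max(len(a), len(b))
        a, b = a.zfill(width), b.zfill(width)
        result, carry = [], 0
        for da, db in zip(reversed(a), reversed(b)):
            carry, digit = divmod(int(da) + int(db) + carry, 10)
            result.append(str(digit))
        if carry:
            result.append(str(carry))
        return "".join(reversed(result))

    print(add_decimal_strings("1905", "87"))   # prints 1992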

The IBM System/360, introduced in 1964 to unify IBM's product lines, used binary addressing of 8-bit bytes and included instructions for packed decimal arithmetic as well as binary integer arithmetic and hexadecimal floating point. It used 8-bit characters and introduced the EBCDIC encoding, though ASCII was also supported. The Burroughs B2500, introduced in 1966, also used 8-bit EBCDIC or ASCII characters and could pack two decimal digits per byte, but it did not provide binary arithmetic, making it a decimal architecture.
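
Packed decimal stores two digits in each 8-bit byte, with the low nibble of the final byte holding the sign (on System/360, 0xC is the preferred plus code and 0xD the minus code). The Python sketch below shows the packing and unpacking as an illustration of the format, not of the actual hardware instructions.

    # Packing a signed integer into packed-decimal bytes and back:
    # two digits per byte, trailing sign nibble (0xC plus, 0xD minus).

    def pack_decimal(value):
        sign = 0xD if value < 0 else 0xC
        nibbles = [int(d) for d in str(abs(value))] + [sign]
        if len(nibbles) % 2:            # pad with a leading zero digit
            nibbles.insert(0, 0)
        return bytes((nibbles[i] << 4) | nibbles[i + 1]
                     for i in range(0, len(nibbles), 2))

    def unpack_decimal(packed):
        nibbles = [n for b in packed for n in (b >> 4, b & 0xF)]
        value = 0
        for d in nibbles[:-1]:
            value = value * 10 + d
        return -value if nibbles[-1] == 0xD else value

    print(pack_decimal(-1234).hex())              # prints 01234d
    print(unpack_decimal(pack_decimal(-1234)))    # prints -1234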

