Serial decimal

In computers, a serial decimal numeric representation is one in which ten bits are reserved for each decimal digit, with exactly one bit turned on to indicate which of the ten possible digit values (0 through 9) is intended; this is what is now commonly called a 1-of-10 or one-hot code. ENIAC and CALDIC used this representation.[1]
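
As a minimal sketch of the encoding (not based on any historical machine's circuitry), the Python below represents each decimal digit as a 10-bit group with exactly one bit set; the helper names encode_serial_decimal and decode_serial_decimal are hypothetical.

    def encode_serial_decimal(n):
        """Encode a non-negative integer as one 10-bit group per
        decimal digit: bit k of a group is set when the digit is k."""
        if n < 0:
            raise ValueError("non-negative integers only")
        return [1 << int(d) for d in str(n)]

    def decode_serial_decimal(groups):
        """Recover the integer from its 1-of-10 digit groups."""
        digits = []
        for g in groups:
            if g >= 1 << 10 or bin(g).count("1") != 1:
                raise ValueError("not a valid 1-of-10 group: %r" % g)
            digits.append(str(g.bit_length() - 1))  # position of the set bit
        return int("".join(digits))

    # 1945 -> four groups; e.g. the digit 9 becomes 0b1000000000
    assert encode_serial_decimal(1945) == [0b10, 0b1000000000, 0b10000, 0b100000]
    assert decode_serial_decimal(encode_serial_decimal(1945)) == 1945

Note the storage cost this illustrates: ten bits per digit, versus the four bits per digit of binary-coded decimal.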

Notes and References

  1. tjhsst.edu: http://www.tjhsst.edu/~prittman/arch/Chap1/Chap1Old.html