In computers, a serial decimal numeric representation is one in which ten bits are reserved for each decimal digit, with exactly one of those bits turned on to indicate which of the ten possible digit values (0 through 9) is intended. ENIAC and CALDIC used this representation.[1]
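The following Python sketch is not part of the original article; it is only a minimal illustration of the one-bit-per-digit-value idea described above, and the function names (encode_digit, encode_number) are invented for this example.

```python
def encode_digit(d: int) -> list[int]:
    """Return a 10-bit pattern for one decimal digit: exactly one bit set,
    at the position matching the digit's value."""
    if not 0 <= d <= 9:
        raise ValueError("digit must be in 0..9")
    bits = [0] * 10
    bits[d] = 1  # turn on the single bit corresponding to this digit value
    return bits


def encode_number(n: int) -> list[list[int]]:
    """Encode a non-negative integer digit by digit, ten bits per digit,
    most significant digit first."""
    return [encode_digit(int(ch)) for ch in str(n)]


if __name__ == "__main__":
    # 42 -> two ten-bit groups: bit 4 set for '4', bit 2 set for '2'
    for pattern in encode_number(42):
        print(pattern)
```

Note that this scheme spends ten bits where four would suffice for a binary-coded digit; its appeal in early machines such as ENIAC lay in how naturally it mapped onto digit-at-a-time (serial) hardware rather than in storage efficiency.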