Electronic data processing (EDP) or business information processing refers to the use of automated methods to process commercial data. Typically, this involves relatively simple, repetitive operations applied to large volumes of similar information: stock updates applied to an inventory, banking transactions applied to account and customer master files, booking and ticketing transactions applied to an airline's reservation system, or billing for utility services. The modifier "electronic" or "automatic" was used with "data processing" (DP), especially c. 1960, to distinguish human clerical data processing from that done by computer.[1][2]
Herman Hollerith, then at the U.S. Census Bureau, devised a tabulating system that included cards (the Hollerith card, later the punched card), a punch for making holes in them to represent data, a tabulator and a sorter.[3] The system was tested in computing mortality statistics for the city of Baltimore.[3] In the first commercial application of this kind of data processing, Hollerith machines were used to compile the data accumulated in the 1890 U.S. Census of population.[4] Hollerith's Tabulating Machine Company merged with two other firms to form the Computing-Tabulating-Recording Company, later renamed IBM. The punched-card and tabulating machine business remained the core of commercial data processing until the advent of electronic computing in the 1950s (which still relied on punched cards for storing information).[5]
The first commercial business computer was developed in the United Kingdom in 1951 by the J. Lyons and Co. catering organization.[6] Known as the 'Lyons Electronic Office', or LEO for short, it was developed further and used widely during the 1960s and early 1970s. (Lyons formed a separate company to develop the LEO computers; this subsequently merged into English Electric Leo Marconi and then International Computers Limited.[7]) By the end of the 1950s the punched card manufacturers – Hollerith, Powers-Samas, IBM and others – were also marketing an array of computers.[8] Early commercial systems were installed exclusively by large organizations, which could afford to invest the time and capital necessary to purchase hardware, hire specialist staff to develop bespoke software and work through the consequent (and often unexpected) organizational and cultural changes.
At first, individual organizations developed their own software, including data management utilities. Different products might also have 'one-off' bespoke software. This fragmented approach led to duplicated effort, and producing management information required additional manual work.
High hardware costs and relatively slow processing speeds forced developers to use resources 'efficiently'. Data storage formats were heavily compacted; a common example is the omission of the century digits from dates, a saving that eventually led to the 'millennium bug'.
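A minimal sketch (not drawn from the cited sources) of the kind of ambiguity that compacted two-digit year fields introduced:

```python
def years_between(start_yy: int, end_yy: int) -> int:
    """Naive interval calculation on two-digit years, as many early
    systems stored them to save space."""
    return end_yy - start_yy

# Works while both dates fall in the same century:
print(years_between(65, 99))   # 34 -- e.g. 1965 to 1999

# Breaks once dates cross the year 2000, because '00' is less than '99':
print(years_between(65, 0))    # -65 -- 1965 to 2000 should give 35
```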
Data input required intermediate processing via punched paper tape or punched card, keyed in a separate, repetitive, labor-intensive step that was removed from user control and prone to error. Invalid or incorrect data needed correction and resubmission, with consequences for data and account reconciliation.
Data storage was strictly serial, first on paper tape and later on magnetic tape: keeping data in readily accessible storage was not cost-effective until hard disk drives were invented and began shipping in 1957. Significant developments took place in 1959, with IBM announcing the 1401 computer, and in 1962, with ICT (International Computers & Tabulators) delivering the ICT 1301. Like all machines of this period, the processor together with its peripherals – magnetic tape drives, disk drives, drums, printers, and card and paper tape input and output devices – required considerable space in specially constructed, air-conditioned accommodation.[9] Often parts of the punched card installation, in particular the sorters, were retained to present the card input to the computer in pre-sorted form, reducing the processing time involved in sorting large amounts of data.[9]
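The reliance on pre-sorted serial input can be illustrated by the classic sequential master-file update, in which a sorted transaction batch is merged against a sorted master in a single pass. The sketch below is hypothetical and not taken from the cited sources; simple in-memory lists stand in for the tape or card files.

```python
def update_master(master, transactions):
    """master: sorted list of (account, balance);
    transactions: sorted list of (account, amount).
    Both files are read serially, once, and a new master is written."""
    new_master, t = [], 0
    for account, balance in master:
        # Skip any transaction with no matching master record
        # (a real system would list these as errors for resubmission).
        while t < len(transactions) and transactions[t][0] < account:
            t += 1
        # Apply every transaction for this account before moving on.
        while t < len(transactions) and transactions[t][0] == account:
            balance += transactions[t][1]
            t += 1
        new_master.append((account, balance))
    return new_master

old = [(1001, 250.0), (1002, 80.0), (1003, 410.0)]
txns = [(1001, -50.0), (1003, 25.0), (1003, 15.0)]
print(update_master(old, txns))
# [(1001, 200.0), (1002, 80.0), (1003, 450.0)]
```

Because both files must already be in key order for the single pass to work, sorting the input cards beforehand on electromechanical sorters saved expensive computer time.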
Data processing facilities became available to smaller organizations in the form of the computer services bureau. These offered processing of specific applications, such as payroll, and were often a prelude to the purchase of a customer's own computer. Organizations also used these facilities to test programs while awaiting the arrival of their own machine.
These initial machines were delivered to customers with limited software. The design staff was divided into two groups: systems analysts produced a systems specification, and programmers translated the specification into machine language.
Literature on computers and EDP was sparse, mostly obtained through articles in accountancy publications and material supplied by the equipment manufacturers. The first issue of The Computer Journal, published by The British Computer Society, appeared in mid-1958.[9] The UK accountancy body now named the Association of Chartered Certified Accountants formed an Electronic Data Processing Committee in July 1958 to inform its members of the opportunities created by the computer.[9] The Committee produced its first booklet, An Introduction to Electronic Computers, in 1959. Also in 1958, The Institute of Chartered Accountants in England and Wales produced a paper, Accounting by Electronic Methods.[9] These notes showed what might be possible and the potential implications of using a computer.
Progressive organizations attempted to go beyond the straight transfer of existing systems from punched card equipment and unit accounting machines to the computer, toward producing accounts to the trial balance stage and integrated management information systems.[9] New procedures redesigned the way paper flowed, changed organizational structures, called for a rethink of the way information was presented to management, and challenged the internal control principles adopted by the designers of accounting systems.[10] But the full realization of these benefits had to await the arrival of the next generation of computers.
As with other industrial processes, commercial IT has in most cases moved from a custom-order, craft-based industry, where the product was tailored to fit the customer, to one built from multi-use components taken off the shelf to find the best fit in any situation. Mass production has greatly reduced costs, and IT is available to the smallest organization.
LEO was hardware tailored to a single client. Today, Intel Pentium and compatible chips are standard and become parts of other components which are combined as needed. One notable change was the freeing of computers and removable storage from protected, air-filtered environments. Microsoft and IBM have at various times been influential enough to impose order on IT, and the resulting standardization allowed specialist software to flourish.
Software is available off the shelf. Apart from products such as Microsoft Office and IBM Lotus, there are also specialist packages for payroll and personnel management, account maintenance and customer management, to name a few. These are highly specialized and intricate components of larger environments, but they rely upon common conventions and interfaces.
Data storage has also been standardized. Relational databases developed by different suppliers use common formats and conventions, and common file formats can be shared by large mainframes and desktop personal computers, allowing online, real-time input and validation.
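As an illustrative sketch only (the table, names and rule below are invented for the example), the shared SQL conventions mean the same schema and statements behave the same way across different suppliers' relational products; Python's built-in sqlite3 module stands in here for any such database, with a simple real-time validation check on input.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE account (id INTEGER PRIMARY KEY, holder TEXT, balance REAL)")
conn.execute("INSERT INTO account (id, holder, balance) VALUES (?, ?, ?)",
             (1001, "Example Holder", 250.0))
conn.commit()

def post(amount, account_id):
    """Validate a posting at input time: reject it if it would overdraw the account."""
    (balance,) = conn.execute("SELECT balance FROM account WHERE id = ?",
                              (account_id,)).fetchone()
    if balance + amount < 0:
        raise ValueError("posting rejected: insufficient funds")
    conn.execute("UPDATE account SET balance = balance + ? WHERE id = ?",
                 (amount, account_id))
    conn.commit()

post(-50.0, 1001)
print(conn.execute("SELECT balance FROM account WHERE id = 1001").fetchone())  # (200.0,)
```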
In parallel, software development has fragmented. There are still specialist technicians, but these increasingly use standardized methodologies where outcomes are predictable and accessible.[9] Specialized software is written for a specific task rather than for a broad application area, providing facilities specifically for the purpose for which it was designed. At the other end of the scale, any office manager can dabble in spreadsheets or databases and obtain acceptable results, though there are risks, because many do not know what software testing is.[9]