History of computing hardware
The
history of computing hardware is the record of the ongoing effort to make computer hardware faster, cheaper, and capable of storing more data.
Computing hardware evolved from machines that needed separate manual action to perform each arithmetic operation, to punched card machines, and then to
stored-program computers. The history of stored-program computers relates first to computer architecture, that is, the organization of the units to perform input and output, to store data and to operate as an integrated mechanism (see
block diagram to the right). Secondly, this is a history of the electronic components and mechanical devices that comprise these units. Finally, we describe the continuing integration of 21st-century supercomputers, networks, personal devices, and integrated computers/communicators into many aspects of today's society. Increases in speed and memory capacity, and decreases in cost and size in relation to compute power, are major features of the history. As all computers rely on digital storage, and tend to be limited by the size and speed of memory, the history of
computer data storage is tied to the development of computers.
Overview
Before the development of the general-purpose computer, most calculations were done by humans. Tools to help humans calculate were then called "calculating machines", were known by proprietary names, or were simply called calculators, as they are now. It was the humans who used the machines who were then called computers; there are pictures of enormous rooms filled with desks at which computers (often young women) used their machines to jointly perform calculations, such as the aerodynamic calculations required in aircraft design.
Calculators have continued to develop, but computers add the critical elements of conditional response and larger memory, allowing automation of both numerical calculation and, more generally, of many symbol-manipulation tasks. Computer technology has undergone profound changes every decade since the 1940s.
Computing hardware has become a platform for uses other than mere computation, such as process automation, electronic communications, equipment control, entertainment, education, etc. Each field in turn has imposed its own requirements on the hardware, which has evolved in response to those requirements, such as the role of the
touch screen to create a more intuitive and
natural user interface.
Aside from written numerals, the first aids to computation were purely mechanical devices that required the operator to set up the initial values of an elementary arithmetic operation and then manipulate the device by hand to obtain the result. A sophisticated (and comparatively recent) example is the
slide rule, in which numbers are represented as lengths on a logarithmic scale and computation is performed by setting a cursor and aligning sliding scales, thus adding those lengths. Numbers could also be represented in a continuous "analog" form, in which a voltage or some other physical property was made proportional to the number. Analog computers, like those designed and built by
Vannevar Bush before World War II, were of this type. Alternatively, numbers could be represented in the form of digits, automatically manipulated by a mechanism. Although this last approach often required more complex mechanisms, it made for greater precision of results.
Both analog and digital mechanical techniques continued to be developed, producing many practical computing machines. Electrical methods rapidly improved the speed and precision of calculating machines, at first by providing motive power for mechanical calculating devices, and later directly as the medium for representation of numbers. Numbers could be represented by voltages or currents and manipulated by linear electronic amplifiers. Or, numbers could be represented as discrete binary or decimal digits, and electrically controlled switches and combinational circuits could perform mathematical operations.
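The way electrically controlled switches and combinational circuits can perform arithmetic can be made concrete with a small sketch. The following Python fragment is an illustration only, not the logic of any historical machine: it builds a one-bit full adder from the Boolean operations that switching circuits realize, then chains the adders to add two binary numbers.

```python
# Illustrative sketch: a one-bit full adder built only from Boolean
# operations, the way combinational switching circuits realize arithmetic.

def full_adder(a: int, b: int, carry_in: int) -> tuple[int, int]:
    """Add three bits; return (sum_bit, carry_out)."""
    sum_bit = a ^ b ^ carry_in                  # XOR gives the sum bit
    carry_out = (a & b) | (carry_in & (a ^ b))  # carry out when at least two inputs are 1
    return sum_bit, carry_out

def add_words(x: list[int], y: list[int]) -> list[int]:
    """Ripple-carry addition of two equal-length bit lists (least significant bit first)."""
    result, carry = [], 0
    for a, b in zip(x, y):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    return result + [carry]

# 5 (101) + 6 (110) = 11 (1011), bits listed least significant first
print(add_words([1, 0, 1], [0, 1, 1]))  # [1, 1, 0, 1]
```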
The invention of electronic amplifiers made calculating machines much faster than their mechanical or electromechanical predecessors.
Vacuum tube (thermionic valve) amplifiers gave way to solid state
transistors, and then rapidly to
integrated circuits, which continue to improve, placing millions of electrical switches (typically transistors) on a single elaborately manufactured piece of semiconductor the size of a fingernail. By defeating the
tyranny of numbers, integrated circuits made high-speed and low-cost digital computers a widespread commodity.
Earliest true hardware
Devices have been used to aid computation for thousands of years, mostly using
one-to-one correspondence with our
fingers. The earliest counting device was probably a form of
tally stick. Later record-keeping aids throughout the
Fertile Crescent included calculi (clay spheres, cones, etc.), which represented counts of items, probably livestock or grains, sealed in containers.
[1][2] The use of
counting rods is one example.
The
abacus was used early on for arithmetic tasks. What we now call the
Roman abacus was used in
Babylonia as early as 2400 BC. Since then, many other forms of reckoning boards or tables have been invented. In a medieval European
counting house, a checkered cloth would be placed on a table, and markers moved around on it according to certain rules, as an aid to calculating sums of money.
Several
analog computers were constructed in ancient and medieval times to perform astronomical calculations. These include the
Antikythera mechanism and the
astrolabe from
ancient Greece (c. 150–100 BC), which are generally regarded as the earliest known mechanical analog computers.
[3] Other early mechanical devices used to perform one type of calculation or another include the
planisphere and other mechanical computing devices invented by
Abū Rayhān al-Bīrūnī (c. AD 1000); the
equatorium and universal latitude-independent astrolabe by
Abū Ishāq Ibrāhīm al-Zarqālī (c. AD 1015); the astronomical analog computers of other medieval
Muslim astronomers and engineers; and the
astronomical clock tower of
Su Song (c. AD 1090) during the
Song Dynasty.
The "castle clock", an
astronomical clock invented by
Al-Jazari in 1206, is thought to be the earliest
programmable analog computer.
[4] It displayed the
zodiac, the
solar and
lunar orbits, a
crescent moon-shaped
pointer traveling across a gateway causing
automatic doors to open every
hour,
[5][6] and five
robotic musicians who play music when struck by
levers operated by a
camshaft attached to a
water wheel. The length of
day and
night could be re-programmed every day in order to account for the changing lengths of day and night throughout the year.
[4]
Suanpan (the number represented on this abacus is 6,302,715,408)
Scottish mathematician and physicist
John Napier noted that multiplication and division of numbers could be performed by addition and subtraction, respectively, of logarithms of those numbers. While producing the first logarithmic tables, Napier needed to perform many multiplications, and it was at this point that he designed
Napier's bones, an abacus-like device used for multiplication and division.
[7] Since
real numbers can be represented as distances or intervals on a line, the
slide rule was invented in the 1620s to allow multiplication and division operations to be carried out significantly faster than was previously possible.
[8] Slide rules were used by generations of engineers and other mathematically involved professional workers, until the invention of the
pocket calculator.
[9]
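The principle behind Napier's observation, and behind the slide rule's added lengths, is the basic identity of logarithms (stated here for any fixed base):

\[
\log(xy) = \log x + \log y, \qquad \log\frac{x}{y} = \log x - \log y .
\]

A product is therefore found by adding two logarithms, or by placing two logarithmically ruled lengths end to end, and reading off the antilogarithm. For example, 2 × 8 = 16 because log₂ 2 + log₂ 8 = 1 + 3 = 4, and 2⁴ = 16.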
Yazu Arithmometer. Patented in Japan in 1903. Note the lever for turning the gears of the calculator.
Wilhelm Schickard, a German
polymath, designed a calculating clock in 1623; unfortunately, a fire destroyed it during its construction in 1624, and Schickard abandoned the project. Two sketches of it were discovered in 1957, too late to have any impact on the development of mechanical calculators.
[10]
In 1642, while still a teenager,
Blaise Pascal started some pioneering work on calculating machines and after three years of effort and 50 prototypes
[11] he invented the
mechanical calculator.
[12][13] He built twenty of these machines (called the
Pascaline) in the following ten years.
[14]
Gottfried Wilhelm von Leibniz invented the
Stepped Reckoner and his
famous cylinders around 1672 while adding direct multiplication and division to the Pascaline. Leibniz once said "It is unworthy of excellent men to lose hours like slaves in the labour of calculation which could safely be relegated to anyone else if machines were used."
[15]
Around 1820,
Charles Xavier Thomas created the first successful, mass-produced mechanical calculator, the Thomas
Arithmometer, that could add, subtract, multiply, and divide.
[16] It was mainly based on Leibniz' work. Mechanical calculators, like the base-ten
addiator, the
comptometer, the
Monroe, the
Curta, and the
Addo-X, remained in use until the 1970s. Leibniz also described the
binary numeral system,
[17] a central ingredient of all modern computers. However, up to the 1940s, many subsequent designs (including
Charles Babbage's machines of 1822 and even
ENIAC of 1945) were based on the decimal system;
[18] ENIAC's ring counters emulated the operation of the digit wheels of a mechanical adding machine.
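The difference between the two representations can be sketched briefly. In a decimal machine such as ENIAC, each digit of a number occupies its own counter (emulating a digit wheel), whereas in a binary machine the whole number is a string of bits. The Python fragment below is only an illustration of the two encodings, not of any machine's circuitry.

```python
# Illustration only: the same number held as decimal digits (one per
# counter or digit wheel, as in ENIAC) and as binary digits (as Leibniz
# described and as modern computers store it).

def decimal_digits(n: int) -> list[int]:
    """One entry per digit wheel / ring counter, most significant first."""
    return [int(d) for d in str(n)]

def binary_digits(n: int) -> list[int]:
    """One entry per bit, most significant first."""
    return [int(b) for b in bin(n)[2:]]

print(decimal_digits(1945))  # [1, 9, 4, 5] -> four ten-state counters
print(binary_digits(1945))   # [1, 1, 1, 1, 0, 0, 1, 1, 0, 0, 1] -> eleven two-state elements
```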
In Japan,
Ryōichi Yazu patented a mechanical calculator called the Yazu Arithmometer in 1903. It consisted of a single cylinder and 22 gears, and employed the mixed base-2 and base-5 number system familiar to users of the
soroban (Japanese abacus). Carry and end of calculation were determined automatically.
[19] More than 200 units were sold, mainly to government agencies such as the Ministry of War and agricultural experiment stations.
[20][21]
1801: punched card technology
Main article: Analytical engine. See also: Logic piano
Punched card system of a music machine, also referred to as Book music
In 1801,
Joseph-Marie Jacquard developed
a loom in which the pattern being woven was controlled by
punched cards. The series of cards could be changed without changing the mechanical design of the loom. This was a landmark achievement in programmability. His machine was an improvement over similar weaving looms. Punch cards were preceded by punch bands, as in the machine proposed by
Basile Bouchon. These bands would inspire information recording for automatic pianos and, more recently, numerically controlled (NC) machine tools.
In 1833,
Charles Babbage moved on from developing his
difference engine (for navigational calculations) to a general purpose design, the Analytical Engine, which drew directly on Jacquard's punched cards for its program storage.
[22] In 1835, Babbage described his
analytical engine. It was a general-purpose programmable computer, employing punch cards for input and a steam engine for power, and using the positions of gears and shafts to represent numbers. His initial idea had been to use punch cards to control a machine that could calculate and print logarithmic tables with great precision (a special-purpose machine), but the idea soon developed into the general-purpose programmable Analytical Engine. While his design was sound and the plans were probably correct, or at least
debuggable, the project was slowed by various problems, including disputes with the chief machinist building parts for it. Babbage was a difficult man to work with and argued with everyone. All the parts for his machine had to be made by hand, and small errors in each item could sometimes sum to large discrepancies. In a machine with thousands of parts, which had to be made to much finer tolerances than were usual at the time, this was a major problem. The project dissolved in disputes with the artisan who built parts and ended with the decision of the British Government to cease funding.
Ada Lovelace,
Lord Byron's daughter, translated and
added notes to the "
Sketch of the Analytical Engine" by
Federico Luigi, Conte Menabrea. This appears to be the first published description of programming.
[23]
A reconstruction of the
Difference Engine II, an earlier, more limited design, has been operational since 1991 at the
London Science Museum. With a few trivial changes, it works exactly as Babbage designed it and shows that Babbage's design ideas were correct, merely too far ahead of his time. The museum used computer-controlled machine tools to construct the necessary parts, using tolerances a good machinist of the period would have been able to achieve. Babbage's failure to complete the analytical engine can be chiefly attributed to difficulties not only of politics and financing, but also to his desire to develop an increasingly sophisticated computer and to move ahead faster than anyone else could follow.
Following Babbage, although unaware of his earlier work, was
Percy Ludgate, an accountant from Dublin, Ireland. He independently designed a programmable mechanical computer, which he described in a work that was published in 1909.
In the late 1880s, the American
Herman Hollerith invented data storage on a medium that could then be read by a machine. Prior uses of machine readable media had been for control (
automatons such as
piano rolls or
looms), not data. "After some initial trials with paper tape, he settled on
punched cards..."
[24] Hollerith came to use punched cards after observing how
railroad conductors encoded personal characteristics of each passenger with punches on their tickets. To process these punched cards he invented the
tabulator, and the
key punch machine. These three inventions were the foundation of the modern information processing industry. His machines used mechanical
relays (and
solenoids) to increment
mechanical counters. Hollerith's method was used in the
1890 United States Census and the completed results were "... finished months ahead of schedule and far under budget".
[25] Indeed, the census was processed years faster than the prior census had required. Hollerith's company eventually became the core of
IBM. IBM developed punch card technology into a powerful tool for business data-processing and produced an extensive line of
unit record equipment. By 1950, the IBM card had become ubiquitous in industry and government. The warning printed on most cards intended for circulation as documents (checks, for example), "Do not fold,
spindle or mutilate," became a catch phrase for the post-World War II era.
[26]
Leslie Comrie's articles on punched card methods and
W.J. Eckert's publication of
Punched Card Methods in Scientific Computation in 1940 described punch card techniques sufficiently advanced to solve some differential equations
[27] or perform multiplication and division using floating point representations, all on punched cards and
unit record machines. Those same machines had been used during World War II for cryptographic statistical processing. In the image of the tabulator (see left), note the
patch panel, which is visible on the right side of the tabulator. A row of
toggle switches is above the patch panel. The
Thomas J. Watson Astronomical Computing Bureau at
Columbia University performed astronomical calculations representing the state of the art in
computing.
[28]
Computer programming in the punch card era was centered in the "computer center". Computer users, for example science and engineering students at universities, would submit their programming assignments to their local computer center in the form of a stack of punched cards, one card per program line. They then had to wait for the program to be read in, queued for processing, compiled, and executed. In due course, a printout of any results, marked with the submitter's identification, would be placed in an output tray, typically in the computer center lobby. In many cases these results would be only a series of error messages, requiring yet another
edit-punch-compile-run cycle.
[29] Punched cards are still used and manufactured to this day, and their distinctive dimensions (and 80-column capacity) can still be recognized in forms, records, and programs around the world. They are the size of American paper currency of Hollerith's time, a choice he made because there was already equipment available to handle bills.
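As a rough illustration of the "one card per program line" convention, each line of source text was punched into a fixed 80-column record, and anything beyond column 80 was simply lost. The layout below is a generic sketch, not the card format of any particular language or machine.

```python
# Sketch of an 80-column card image: one program line per card,
# padded or truncated to exactly 80 characters. A generic illustration,
# not a specific historical card format.

CARD_WIDTH = 80

def to_card(line: str) -> str:
    return line[:CARD_WIDTH].ljust(CARD_WIDTH)

deck = [to_card(line) for line in ["X = 1", "Y = X + 2", "PRINT Y"]]
for card in deck:
    assert len(card) == CARD_WIDTH  # every card image has the same width
```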
Desktop calculators
The
Curta calculator can also do multiplication and division
By the 20th century, earlier mechanical calculators, cash registers, accounting machines, and so on were redesigned to use electric motors, with gear position as the representation for the state of a variable. The word "computer" was a job title assigned to people who used these calculators to perform mathematical calculations. By the 1920s
Lewis Fry Richardson's interest in weather prediction led him to propose
human computers and
numerical analysis to model the weather; to this day, the most powerful computers on
Earth are needed to adequately model its weather using the
Navier-Stokes equations.
[30]
Companies like
Friden,
Marchant Calculator and
Monroe made desktop mechanical
calculators from the 1930s that could add, subtract, multiply and divide. During the
Manhattan Project, future Nobel laureate
Richard Feynman was the supervisor of a roomful of human computers, many of them female mathematicians, who understood the
differential equations being solved for the war effort.
In 1948, the
Curta was introduced. This was a small, portable, mechanical calculator that was about the size of a
pepper grinder. Over time, during the 1950s and 1960s a variety of different brands of mechanical calculators appeared on the market. The first all-electronic desktop calculator was the British
ANITA Mk.VII, which used a
Nixie tube display and 177 subminiature
thyratron tubes. In June 1963, Friden introduced the four-function EC-130. It had an all-transistor design, 13-digit capacity on a 5-inch (130 mm)
CRT, and introduced
Reverse Polish notation (RPN) to the calculator market at a price of $2200. The EC-132 model added square root and reciprocal functions. In 1965,
Wang Laboratories produced the LOCI-2, a 10-digit transistorized desktop calculator that used a Nixie tube display and could compute
logarithms.
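Reverse Polish notation, which the Friden EC-130 brought to the calculator market, dispenses with parentheses by writing each operator after its operands and keeping intermediate results on a stack. The Python fragment below is a minimal sketch of the idea, not the EC-130's actual implementation.

```python
# Minimal Reverse Polish notation evaluator: operands are pushed on a
# stack, and each operator pops its two operands and pushes the result.

def eval_rpn(tokens: list[str]) -> float:
    stack: list[float] = []
    ops = {
        "+": lambda a, b: a + b,
        "-": lambda a, b: a - b,
        "*": lambda a, b: a * b,
        "/": lambda a, b: a / b,
    }
    for tok in tokens:
        if tok in ops:
            b = stack.pop()
            a = stack.pop()
            stack.append(ops[tok](a, b))
        else:
            stack.append(float(tok))
    return stack.pop()

# (3 + 4) * 2 is written "3 4 + 2 *" in RPN
print(eval_rpn("3 4 + 2 *".split()))  # 14.0
```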
In the early days of binary vacuum-tube computers, their reliability was poor enough to justify marketing a mechanical octal version ("Binary Octal") of the Marchant desktop calculator. It was intended to check and verify calculation results of such computers.
Advanced analog computers
Cambridge differential analyzer, 1938
Before World War II, mechanical and electrical
analog computers were considered the "state of the art", and many thought they were the future of computing. Analog computers take advantage of the strong similarities between the mathematics of small-scale properties—the position and motion of wheels or the voltage and current of electronic components—and the mathematics of other physical phenomena, for example, ballistic trajectories, inertia, resonance, energy transfer, momentum, and so forth. They model physical phenomena with electrical
voltages and
currents[31] as the analog quantities.
Centrally, these analog systems work by creating electrical
analogs of other systems, allowing users to predict behavior of the systems of interest by observing the electrical analogs.
[32] The most useful of the analogies was the way the small-scale behavior could be represented with integral and differential equations, and could thus be used to solve those equations. An ingenious example of such a machine, using water as the analog quantity, was the
water integrator built in 1928; an electrical example is the
Mallock machine built in 1941. A
planimeter is a device which does integrals, using
distance as the analog quantity. Unlike modern digital computers, analog computers are not very flexible, and need to be rewired manually to switch them from working on one problem to another. Analog computers had an advantage over early digital computers in that they could be used to solve complex problems using behavioral analogues while the earliest attempts at digital computers were quite limited.
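The central operation of such machines was integration: an integrator continuously accumulates its input, and chaining integrators solves a differential equation. The short numerical sketch below imitates a single integrator solving dy/dt = -k·y; the equation and the step size are assumptions chosen only for illustration, and a real differential analyser integrates continuously rather than in discrete time steps.

```python
# Numerical imitation of an analog integrator solving dy/dt = -k*y.
# An analog computer would do this continuously with a wheel-and-disc
# or electronic integrator; here we step through time for illustration.

def integrate(k: float, y0: float, dt: float, steps: int) -> float:
    y = y0
    for _ in range(steps):
        dydt = -k * y      # the "patched" relationship between the quantities
        y += dydt * dt     # the integrator accumulates its input over time
    return y

print(integrate(k=1.0, y0=1.0, dt=0.001, steps=1000))  # ~0.368, close to exp(-1)
```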
Some of the most widely deployed analog computers included devices for aiming weapons, such as the
Norden bombsight[33] and the
fire-control systems,
[34] such as
Arthur Pollen's Argo system for naval vessels. Some stayed in use for decades after World War II; the
Mark I Fire Control Computer was deployed by the
United States Navy on a variety of ships from
destroyers to
battleships. Other analog computers included the
Heathkit EC-1, and the hydraulic
MONIAC Computer which modeled econometric flows.
[35]
The art of mechanical analog computing reached its zenith with the
differential analyzer,
[36] built by H. L. Hazen and
Vannevar Bush at
MIT starting in 1927, which in turn built on the mechanical integrators invented in 1876 by
James Thomson and the torque amplifiers invented by H. W. Nieman. A dozen of these devices were built before their obsolescence was obvious; the most powerful was constructed at the
University of Pennsylvania's
Moore School of Electrical Engineering, where the
ENIAC was built. Digital electronic computers like the ENIAC spelled the end for most analog computing machines, but hybrid analog computers, controlled by digital electronics, remained in substantial use into the 1950s and 1960s, and later in some specialized applications. But as with all digital devices, the numeric
precision of a digital computer is its limitation, whereas an analog device is limited by its
accuracy.
[37] As
electronics progressed during the 20th century, its problems of operation at low voltages while maintaining high
signal-to-noise ratios[38] were steadily addressed, as shown below, for a digital circuit is a specialized form of analog circuit, intended to operate at standardized settings (continuing in the same vein,
logic gates can be realized as forms of digital circuits). But as digital computers have become faster and use larger memory (for example,
RAM or internal storage), they have almost entirely displaced analog computers.
Computer programming, or coding, has arisen as another human profession.
Electronic digital computation
Friden paper tape punch.
Punched tape programs would be much longer than the short fragment of yellow paper tape shown.
The era of modern computing began with a flurry of development before and during World War II, as
electronic circuit elements replaced mechanical equivalents, and digital calculations replaced analog calculations. Machines such as the
Z3, the
Atanasoff–Berry Computer, the
Colossus computers, and the
ENIAC were built by hand using circuits containing relays or valves (vacuum tubes), and often used
punched cards or
punched paper tape for input and as the main (non-volatile) storage medium. Defining a single point in the series as the "first computer" misses many subtleties (see the table "Defining characteristics of some early digital computers of the 1940s" below).
Alan Turing's 1936 paper
[39] proved enormously influential in computing and
computer science in two ways. Its main purpose was to prove that there were problems (namely the
halting problem) that could not be solved by any sequential process. In doing so, Turing provided a definition of a universal computer which executes a program stored on tape. This construct came to be called a
Turing machine.
[40] Except for the limitations imposed by their finite memory stores, modern computers are said to be
Turing-complete, which is to say, they have
algorithm execution capability equivalent to a universal Turing machine.
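The construct Turing described can be made concrete with a very small simulator. The sketch below is an illustration, not Turing's own formulation: it runs a machine defined by a transition table over an unbounded tape, and the example machine simply inverts the bits it reads until it reaches a blank.

```python
# A tiny Turing machine simulator: (state, symbol) -> (new symbol, move, new state).
# Illustration only; the example machine flips 0s and 1s until it reaches a blank.

def run(program, tape, state="start", halt="halt"):
    tape = dict(enumerate(tape))   # unbounded tape, blank (" ") by default
    pos = 0
    while state != halt:
        symbol = tape.get(pos, " ")
        write, move, state = program[(state, symbol)]
        tape[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape))

flipper = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", " "): (" ", "R", "halt"),
}

print(run(flipper, "10110"))  # prints "01001" (plus a trailing blank)
```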
For a computing machine to be a practical general-purpose computer, there must be some convenient read-write mechanism, punched tape, for example. With knowledge of Alan Turing's theoretical 'universal computing machine'
John von Neumann defined an architecture which uses the same
memory both to store programs and data: virtually all contemporary computers use this architecture (or some variant). While it is theoretically possible to implement a full computer entirely mechanically (as Babbage's design showed), electronics made possible the speed and later the miniaturization that characterize modern computers.
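The essence of the architecture, instructions and data side by side in a single memory, read by a fetch-decode-execute cycle, can be sketched in a few lines. Everything below (the instruction names and the simple two-field format) is invented for illustration and does not correspond to any particular historical machine.

```python
# Sketch of a stored-program machine: one memory array holds both the
# program and its data, and a simple fetch-decode-execute loop runs it.
# The instruction set (LOAD/ADD/STORE/HALT) is invented for illustration.

memory = [
    ("LOAD", 5),     # 0: accumulator <- memory[5]
    ("ADD", 6),      # 1: accumulator <- accumulator + memory[6]
    ("STORE", 7),    # 2: memory[7] <- accumulator
    ("HALT", None),  # 3:
    None,            # 4: (unused)
    2,               # 5: data
    3,               # 6: data
    0,               # 7: result goes here
]

acc, pc = 0, 0
while True:
    op, addr = memory[pc]   # fetch: instructions live in the same memory as data
    pc += 1
    if op == "LOAD":
        acc = memory[addr]
    elif op == "ADD":
        acc += memory[addr]
    elif op == "STORE":
        memory[addr] = acc
    elif op == "HALT":
        break

print(memory[7])  # 5
```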
There were three parallel streams of computer development in the World War II era; the first stream was largely ignored, and the second was deliberately kept secret. The first was the German work of
Konrad Zuse. The second was the secret development of the Colossus computers in the UK. Neither of these had much influence on the various computing projects in the United States. The third stream of computer development, Eckert and Mauchly's ENIAC and EDVAC, was widely publicized.
[41][42]
George Stibitz is internationally recognized as one of the fathers of the modern digital computer. While working at Bell Labs in November 1937, Stibitz invented and built a relay-based calculator that he dubbed the "Model K" (for "kitchen table", on which he had assembled it), which was the first to calculate using
binary form.
[43]
Zuse
Main article: Konrad Zuse
A reproduction of Zuse's Z1 computer
Working in isolation in Germany,
Konrad Zuse started construction in 1936 of his first Z-series calculators featuring memory and (initially limited) programmability. Zuse's purely mechanical, but already binary
Z1, finished in 1938, never worked reliably due to problems with the precision of parts.
Zuse's later machine, the
Z3,
[44] was finished in 1941. It was based on telephone relays and did work satisfactorily. The Z3 thus became the first functional program-controlled, all-purpose, digital computer. In many ways it was quite similar to modern machines, pioneering numerous advances, such as
floating point numbers. Replacement of the hard-to-implement decimal system (used in
Charles Babbage's earlier design) by the simpler
binary system meant that Zuse's machines were easier to build and potentially more reliable, given the technologies available at that time.
Programs were fed into the
Z3 on punched films. Conditional jumps were missing, but since the 1990s it has been proved theoretically that the Z3 was still a
universal computer (as always, ignoring physical storage limitations). In two 1936
patent applications,
Konrad Zuse also anticipated that machine instructions could be stored in the same storage used for data—the key insight of what became known as the
von Neumann architecture, first implemented in the British
SSEM of 1948.
[45] Zuse also claimed to have designed the first higher-level
programming language, which he named
Plankalkül, in 1945 (published in 1948) although it was implemented for the first time in 2000 by a team around
Raúl Rojas at the
Free University of Berlin—five years after Zuse died.
Zuse suffered setbacks during World War II when some of his machines were destroyed in the course of
Allied bombing campaigns. Apparently his work remained largely unknown to engineers in the UK and US until much later, although at least IBM was aware of it as it financed his post-war startup company in 1946 in return for an option on Zuse's patents.
Colossus
Colossus was used to break German ciphers during World War II.
During World War II, the British at
Bletchley Park (40 miles north of London) achieved a number of successes at breaking encrypted German military communications. The German encryption machine,
Enigma, was attacked with the help of electro-mechanical machines called
bombes. The bombe, designed by
Alan Turing and
Gordon Welchman, after the Polish cryptographic
bomba by
Marian Rejewski (1938), came into productive use in 1941.
[46] They ruled out possible Enigma settings by performing chains of logical deductions implemented electrically. Most possibilities led to a contradiction, and the few remaining could be tested by hand.
The Germans also developed a series of teleprinter encryption systems, quite different from Enigma. The
Lorenz SZ 40/42 machine was used for high-level Army communications, termed "
Tunny" by the British. The first intercepts of Lorenz messages began in 1941. As part of an attack on Tunny, Professor
Max Newman and his colleagues helped specify the Colossus.
[47] The Mk I Colossus was built between March and December 1943 by
Tommy Flowers and his colleagues at the
Post Office Research Station at
Dollis Hill in London and then shipped to
Bletchley Park in January 1944.
Colossus was the world's first electronic programmable computing device. It used a large number of valves (vacuum tubes). It had paper-tape input and was capable of being configured to perform a variety of
boolean logical operations on its data, but it was not
Turing-complete. Nine Mk II Colossi were built (the Mk I was converted to a Mk II, making ten machines in total). Details of their existence, design, and use were kept secret well into the 1970s.
Winston Churchill personally issued an order for their destruction into pieces no larger than a man's hand, to keep secret, during the oncoming Cold War, that the British had been capable of cracking Lorenz. As a result the machines were not included in many histories of computing. A reconstructed copy of one of the Colossus machines is now on display at Bletchley Park.
American developments
In 1937,
Claude Shannon showed there is a
one-to-one correspondence between the concepts of
Boolean logic and certain electrical circuits, now called
logic gates, which are now ubiquitous in digital computers.
[48] In his master's thesis
[49] at
MIT, for the first time in history, Shannon showed that electronic relays and switches can realize the
expressions of
Boolean algebra. Entitled
A Symbolic Analysis of Relay and Switching Circuits, Shannon's thesis essentially founded practical
digital circuit design. George Stibitz completed a relay-based computer he dubbed the "Model K" at
Bell Labs in November 1937. Bell Labs authorized a full research program in late 1938 with Stibitz at the helm. Their
Complex Number Calculator,
[50] completed January 8, 1940, was able to calculate
complex numbers. In a demonstration to the
American Mathematical Society conference at
Dartmouth College on September 11, 1940, Stibitz was able to send the Complex Number Calculator remote commands over telephone lines by a
teletype. It was the first computing machine ever used remotely, in this case over a phone line. Some participants in the conference who witnessed the demonstration were
John von Neumann, John Mauchly, and
Norbert Wiener, who wrote about it in their memoirs.
In 1939, John Vincent Atanasoff and Clifford E. Berry of Iowa State University developed the
Atanasoff–Berry Computer (ABC),
[51] which was the world's first electronic digital computer.
[52] The design used over 300 vacuum tubes and employed capacitors fixed in a mechanically rotating drum for memory. Though the ABC machine was not programmable, it was the first to use electronic tubes in an adder. ENIAC co-inventor John Mauchly examined the ABC in June 1941, and its influence on the design of the later ENIAC machine is a matter of contention among computer historians. The ABC was largely forgotten until it became the focus of the lawsuit
Honeywell v. Sperry Rand, the ruling of which invalidated the ENIAC patent (and several others) as, among many reasons, having been anticipated by Atanasoff's work.
In 1939, development began at IBM's Endicott laboratories on the
Harvard Mark I. Known officially as the Automatic Sequence Controlled Calculator,
[53] the Mark I was a general purpose electro-mechanical computer built with IBM financing and with assistance from IBM personnel, under the direction of Harvard mathematician Howard Aiken. Its design was influenced by Babbage's Analytical Engine, using decimal arithmetic and storage wheels and rotary switches in addition to electromagnetic relays. It was programmable via punched paper tape, and contained several calculation units working in parallel. Later versions contained several paper tape readers and the machine could switch between readers based on a condition. Nevertheless, the machine was not quite Turing-complete. The Mark I was moved to
Harvard University and began operation in May 1944.
ENIAC performed ballistic trajectory calculations using 160 kW of power
The US-built ENIAC (Electronic Numerical Integrator and Computer) was the first electronic general-purpose computer. It combined, for the first time, the high speed of electronics with the ability to be programmed for many complex problems. It could add or subtract 5,000 times a second, a thousand times faster than any other machine (Colossus could not add). It also had modules to multiply, divide, and take square roots. High-speed memory was limited to 20 words (about 80 bytes). Built under the direction of
John Mauchly and
J. Presper Eckert at the
University of Pennsylvania, ENIAC's development and construction lasted from 1943 to full operation at the end of 1945. The machine was huge, weighing 30 tons, and contained over 18,000 vacuum tubes. One of the major engineering feats was to minimize tube burnout, which was a common problem at that time. The machine was in almost constant use for the next ten years.
ENIAC was unambiguously a Turing-complete device. It could compute any problem (that would fit in memory). A "program" on the ENIAC, however, was defined by the states of its patch cables and switches, a far cry from the
stored program electronic machines that evolved from it. Once a program was written, it had to be mechanically set into the machine.
Six women did most of the programming of ENIAC. (Improvements completed in 1948 made it possible to execute stored programs set in function table memory, which made programming less a "one-off" effort, and more systematic).
Early computer characteristics
First-generation machines
Even before the ENIAC was finished, Eckert and Mauchly recognized its limitations and started the design of a
stored-program computer, EDVAC.
John von Neumann was credited with a
widely circulated report describing the
EDVAC design in which both the programs and working data were stored in a single, unified store. This basic design, denoted the
von Neumann architecture, would serve as the foundation for the worldwide development of ENIAC's successors.
[54] In this generation of equipment, temporary or working storage was provided by
acoustic delay lines, which used the propagation time of sound through a medium such as liquid
mercury (or through a wire) to briefly store data. A series of
acoustic pulses is sent along a tube; after a time, as the pulse reached the end of the tube, the circuitry detected whether the pulse represented a 1 or 0 and caused the oscillator to re-send the pulse. Others used
Williams tubes, which use the ability of a small cathode-ray tube (CRT) to store and retrieve data as charged areas on the phosphor screen. By 1954,
magnetic core memory[55] was rapidly displacing most other forms of temporary storage, and dominated the field through the mid-1970s.
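A delay line can be pictured as a circulating queue: bits leave one end of the tube, are detected, and are immediately re-launched at the other end, so the store's contents are forever in transit. The Python fragment below is only a schematic imitation of that regeneration loop, not a model of any specific delay-line store.

```python
# Schematic imitation of a mercury delay-line store: the bits circulate,
# and on each tick the bit emerging from the far end is re-transmitted
# into the near end (regeneration keeps the data alive).

from collections import deque

line = deque([1, 0, 1, 1, 0, 0, 1, 0])   # bits currently travelling in the tube

def tick(line: deque) -> int:
    bit = line.popleft()   # pulse arrives at the detector...
    line.append(bit)       # ...and is re-sent into the tube
    return bit

# Reading the whole store means waiting one full circulation.
print([tick(line) for _ in range(len(line))])  # [1, 0, 1, 1, 0, 0, 1, 0]
```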
EDVAC was the first stored-program computer designed; however, it was not the first to run. Eckert and Mauchly left the project and its construction foundered. The first working von Neumann machine was the Manchester "Baby" or
Small-Scale Experimental Machine, developed by
Frederic C. Williams and
Tom Kilburn at the
University of Manchester in 1948 as a test bed for the
Williams tube;
[56] it was followed in 1949 by the
Manchester Mark 1 computer, a complete system, using Williams tube and
magnetic drum memory, and introducing
index registers.
The other contender for the title "first digital stored-program computer" was
EDSAC, designed and constructed at the
University of Cambridge. Operational less than one year after the Manchester "Baby", it was also capable of tackling real problems. EDSAC was actually inspired by plans for EDVAC (Electronic Discrete Variable Automatic Computer), the successor to ENIAC; these plans were already in place by the time ENIAC was successfully operational. Unlike ENIAC, which used parallel processing, EDVAC used a single processing unit. This simpler design was the first to be implemented in each succeeding wave of miniaturization, and offered increased reliability. Some view the Manchester Mark 1 / EDSAC / EDVAC as the "Eves" from which nearly all current computers derive their architecture. Manchester University's machine became the prototype for the
Ferranti Mark 1. The first Ferranti Mark 1 machine was delivered to the University in February 1951, and at least nine others were sold between 1951 and 1957.
The first universal programmable computer in the Soviet Union was created by a team of scientists under the direction of
Sergei Alekseyevich Lebedev at the
Kiev Institute of Electrotechnology,
Soviet Union (now
Ukraine). The computer
MESM (
МЭСМ,
Small Electronic Calculating Machine) became operational in 1950. It had about 6,000 vacuum tubes and consumed 25 kW of power. It could perform approximately 3,000 operations per second. Another early machine was
CSIRAC, an Australian design that ran its first test program in 1949. CSIRAC is the oldest computer still in existence and the first to have been used to play digital music.
[58]
Commercial computers
The first commercial computer was the
Ferranti Mark 1, which was delivered to the
University of Manchester in February 1951. It was based on the
Manchester Mark 1. The main improvements over the Manchester Mark 1 were in the size of the
primary storage (using
random access Williams tubes),
secondary storage (using a
magnetic drum), a faster multiplier, and additional instructions. The basic cycle time was 1.2 milliseconds, and a multiplication could be completed in about 2.16 milliseconds. The multiplier used almost a quarter of the machine's 4,050 vacuum tubes (valves).
[59] A second machine was purchased by the
University of Toronto, before the design was revised into the
Mark 1 Star. At least seven of these later machines were delivered between 1953 and 1957, one of them to
Shell labs in
Amsterdam.
[60]
In October 1947, the directors of
J. Lyons & Company, a British catering company famous for its teashops but with strong interests in new office management techniques, decided to take an active role in promoting the commercial development of computers. The
LEO I computer became operational in April 1951
[61] and ran the world's first regular routine office computer
job. On 17 November 1951, the J. Lyons company began weekly operation of a bakery valuations job on the LEO (Lyons Electronic Office). This was the first business
application to go live on a stored program computer.
[62]
In June 1951, the
UNIVAC I (Universal Automatic Computer) was delivered to the
U.S. Census Bureau. Remington Rand eventually sold 46 machines at more than $1 million each ($8.46 million as of 2011).
[63] UNIVAC was the first "mass produced" computer. It used 5,200 vacuum tubes and consumed 125 kW of power. Its primary storage was
serial-access mercury delay lines capable of storing 1,000 words of 11 decimal digits plus sign (72-bit words). A key feature of the UNIVAC system was a newly invented type of metal magnetic tape, and a high-speed tape unit, for non-volatile storage. Magnetic media are still used in many computers.
[64] In 1952, IBM publicly announced the
IBM 701 Electronic Data Processing Machine, the first in its successful
700/7000 series and its first
IBM mainframe computer. The
IBM 704, introduced in 1954, used magnetic core memory, which became the standard for large machines. The first implemented high-level general purpose
programming language,
Fortran, was also being developed at IBM for the 704 during 1955 and 1956 and released in early 1957. (Konrad Zuse's 1945 design of the high-level language
Plankalkül was not implemented at that time.) A volunteer
user group, which exists to this day, was founded in 1955 to
share their software and experiences with the IBM 701.
IBM introduced a smaller, more affordable computer in 1954 that proved very popular.
[65] The
IBM 650 weighed over 900 kg, the attached power supply weighed around 1350 kg and both were held in separate cabinets of roughly 1.5 meters by 0.9 meters by 1.8 meters. It cost $500,000
[66] ($4.09 million as of 2011) or could be leased for $3,500 a month ($30 thousand as of 2011).
[63] Its drum memory was originally 2,000 ten-digit words, later expanded to 4,000 words. Memory limitations such as this were to dominate programming for decades afterward. The program instructions were fetched from the spinning drum as the code ran. Efficient execution using drum memory was provided by a combination of hardware architecture: the instruction format included the address of the next instruction; and software: the Symbolic Optimal Assembly Program, SOAP,
[67] assigned instructions to the optimal addresses (to the extent possible by static analysis of the source program). Thus many instructions were, when needed, located in the next row of the drum to be read and additional wait time for drum rotation was not required.
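The optimization SOAP performed can be illustrated with a toy model: if an instruction takes a known number of word-times to execute, the best place for its successor is the drum location that will be arriving under the read head just as execution finishes, so no revolution is wasted. The drum size, the timing numbers, and the placement rule below are assumptions for illustration only, not the IBM 650's actual parameters or SOAP's actual algorithm.

```python
# Toy model of optimal drum placement (in the spirit of SOAP, not its
# actual algorithm): place each next instruction at the drum address
# that comes under the read head when the current instruction finishes.

DRUM_WORDS = 50          # assumed number of words per drum track

def best_next_address(current_addr: int, execute_word_times: int) -> int:
    # While the instruction executes, the drum rotates past
    # `execute_word_times` further words; the address arriving then
    # can be read with no extra waiting.
    return (current_addr + 1 + execute_word_times) % DRUM_WORDS

print(best_next_address(current_addr=10, execute_word_times=4))  # 15

# A naive layout at consecutive addresses would instead wait almost a
# full revolution for address 11 to come around again.
```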
In 1955,
Maurice Wilkes invented
microprogramming,
[68] which allows the base instruction set to be defined or extended by built-in programs (now called
firmware or
microcode).
[69] It was widely used in the
CPUs and
floating-point units of
mainframe and other computers, such as the
Manchester Atlas [70] and the
IBM 360 series.
[71]
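The idea of microprogramming is that each machine-level instruction is itself carried out by a short built-in sequence of simpler micro-operations held in a control store. The sketch below invents a tiny micro-operation vocabulary purely for illustration; it is not the scheme of the Atlas, the IBM 360, or any other real machine.

```python
# Illustrative microprogramming sketch: each machine instruction is
# expanded into a built-in sequence of micro-operations (the "microcode").
# The micro-operation names are invented for this example.

MICROCODE = {
    "ADD":  ["read_operand", "alu_add", "write_accumulator"],
    "LOAD": ["read_operand", "write_accumulator"],
    "NOP":  [],
}

def execute(instruction: str) -> None:
    for micro_op in MICROCODE[instruction]:   # the control store drives the datapath
        print(f"  micro-op: {micro_op}")

for instr in ["LOAD", "ADD"]:
    print(instr)
    execute(instr)

# Extending or redefining the instruction set means changing MICROCODE
# (the firmware), not the wiring of the machine.
```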
IBM introduced its
first magnetic disk system,
RAMAC (Random Access Method of Accounting and Control) in 1956. Using fifty 24-inch (610 mm) metal disks, with 100 tracks per side, it was able to store 5
megabytes of data at a cost of $10,000 per megabyte ($80 thousand as of 2011).
[63][72]
Second generation: transistors
The bipolar
transistor was invented in 1947. From 1955 onwards transistors replaced
vacuum tubes in computer designs,
[73] giving rise to the "second generation" of computers. Initially the only devices available were
germanium point-contact transistors, which, although less reliable than the vacuum tubes they replaced, had the advantage of consuming far less power.
[74] The first
transistorised computer was built at the
University of Manchester and was operational by 1953;
[75] a second version was completed there in April 1955. The later machine used 200 transistors and 1,300
solid-state diodes and had a power consumption of 150 watts. However, it still required valves to generate the clock waveforms at 125 kHz and to read and write on the magnetic
drum memory, whereas the
Harwell CADET operated without any valves by using a lower clock frequency, of 58 kHz when it became operational in February 1955.
[76] Problems with the reliability of early batches of point contact and alloyed junction transistors meant that the machine's
mean time between failures was about 90 minutes, but this improved once the more reliable
bipolar junction transistors became available.
[77]
Compared to vacuum tubes, transistors have many advantages: they are smaller and require less power than vacuum tubes, so they give off less heat. Silicon junction transistors were much more reliable than vacuum tubes and had a longer, effectively indefinite, service life. Transistorized computers could contain tens of thousands of binary logic circuits in a relatively compact space. Transistors greatly reduced computers' size, initial cost, and
operating cost. Typically, second-generation computers were composed of large numbers of
printed circuit boards such as the
IBM Standard Modular System[78] each carrying one to four
logic gates or
flip-flops.
A second generation computer, the
IBM 1401, captured about one third of the world market. IBM installed more than ten thousand 1401s between 1960 and 1964.
Transistorized electronics improved not only the
CPU (Central Processing Unit), but also the
peripheral devices. The
IBM 350 RAMAC was introduced in 1956 and was the world's first disk drive. The second generation
disk data storage units were able to store tens of millions of letters and digits. Next to the
fixed disk storage units, connected to the CPU via high-speed data transmission, were removable disk data storage units. A removable disk stack could be easily exchanged with another stack in a few seconds. Even though the removable disks' capacity was smaller than that of fixed disks, their interchangeability guaranteed a nearly unlimited quantity of data close at hand.
Magnetic tape provided archival capability for this data, at a lower cost than disk.
Many second-generation CPUs delegated peripheral device communications to a secondary processor. For example, while the communication processor controlled
card reading and punching, the main CPU executed calculations and binary
branch instructions. One
databus would bear data between the main CPU and core memory at the CPU's
fetch-execute cycle rate, and other databusses would typically serve the peripheral devices. On the
PDP-1, the core memory's cycle time was 5 microseconds; consequently most arithmetic instructions took 10 microseconds (100,000 operations per second) because most operations took at least two memory cycles; one for the instruction, one for the
operand data fetch.
During the second generation,
remote terminal units (often in the form of
teletype machines like a
Friden Flexowriter) saw greatly increased use. Telephone connections provided sufficient speed for early remote terminals and allowed hundreds of kilometers of separation between remote terminals and the computing center. Eventually these stand-alone computer networks would be generalized into an interconnected
network of networks—the Internet.
[79]
Post-1960: third generation and beyond
The explosion in the use of computers began with "third-generation" computers, making use of
Jack St. Clair Kilby's
[80] and
Robert Noyce's
[81] independent invention of the
integrated circuit (or microchip), which led to the invention of the
microprocessor,
[82] by
Ted Hoff,
Federico Faggin, and Stanley Mazor at
Intel.
[83] The integrated circuit in the image on the right, for example, an
Intel 8742, is an 8-bit
microcontroller that includes a
CPU running at 12 MHz, 128 bytes of
RAM, 2048 bytes of
EPROM, and
I/O in the same chip.
During the 1960s there was considerable overlap between second and third generation technologies.
[84] IBM implemented its
IBM Solid Logic Technology modules in
hybrid circuits for the IBM System/360 in 1964. As late as 1975, Sperry Univac continued the manufacture of second-generation machines such as the UNIVAC 494. The
Burroughs large systems such as the B5000 were
stack machines, which allowed for simpler programming. These
pushdown automata were also implemented in minicomputers and microprocessors later, which influenced programming language design. Minicomputers served as low-cost computer centers for industry, business and universities.
[85] It became possible to simulate analog circuits with the
simulation program with integrated circuit emphasis, or
SPICE (1971) on minicomputers, one of the programs for electronic design automation (
EDA). The microprocessor led to the development of the
microcomputer, small, low-cost computers that could be owned by individuals and small businesses. Microcomputers, the first of which appeared in the 1970s, became ubiquitous in the 1980s and beyond.
In April 1975, at the Hannover Fair,
Olivetti presented the
P6060, the world's first personal computer with a built-in floppy disk drive: a central unit on two boards (code-named PUCE1 and PUCE2) built from
TTL components, an 8-inch single- or double-sided
floppy disk drive, a 32-character alphanumeric
plasma display, an 80-column graphical
thermal printer, 48 Kbytes of
RAM, the
Basic language, and a weight of 40 kilograms. It competed with a similar product by IBM that had an external floppy disk drive.
Steve Wozniak, co-founder of
Apple Computer, is sometimes erroneously credited with developing the first mass-market
home computers. However, his first computer, the
Apple I, came out some time after the
MOS Technology KIM-1 and
Altair 8800, and the first Apple computer with graphic and sound capabilities came out well after the
Commodore PET. Computing has evolved with microcomputer architectures, with features added from their larger brethren, now dominant in most market segments.
Systems as complicated as computers require very high
reliability. ENIAC remained on, in continuous operation from 1947 to 1955, for eight years before being shut down. Although a vacuum tube might fail, it would be replaced without bringing down the system. By the simple strategy of never shutting down ENIAC, the failures were dramatically reduced. The vacuum-tube SAGE air-defense computers became remarkably reliable – installed in pairs, one off-line, tubes likely to fail did so when the computer was intentionally run at reduced power to find them.
Hot-pluggable hard disks, like the hot-pluggable vacuum tubes of yesteryear, continue the tradition of repair during continuous operation. Semiconductor memories routinely have no errors when they operate, although operating systems like Unix have employed memory tests on start-up to detect failing hardware. Today, the requirement of reliable performance is made even more stringent when
server farms are the delivery platform.
[86] Google has managed this by using fault-tolerant software to recover from hardware failures, and is even working on the concept of replacing entire server farms on-the-fly, during a service event.
[87][88]
In the 21st century,
multi-core CPUs became commercially available.
[89] Content-addressable memory (CAM)
[90] has become inexpensive enough to be used in networking, although no computer system has yet implemented hardware CAMs for use in programming languages. Currently, CAMs (or associative arrays) in software are programming-language-specific. Semiconductor memory cell arrays are very regular structures, and manufacturers prove their processes on them; this allows price reductions on memory products. During the 1980s, CMOS
logic gates developed into devices that could be made as fast as other circuit types; computer power consumption could therefore be decreased dramatically. Unlike the continuous current draw of a gate based on other logic types, a
CMOS gate only draws significant current during the 'transition' between logic states, except for leakage.
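The distinction between ordinary (address-indexed) memory and content-addressable, or associative, lookup can be shown in a few lines of Python. The dictionary here stands in for the software associative arrays the text mentions; it is not a model of a hardware CAM, which searches all entries in parallel.

```python
# Address-based lookup versus content-based (associative) lookup.
# The dict plays the role of a software associative array; a hardware CAM
# would perform the reverse search in parallel across all entries.

ram = ["alpha", "beta", "gamma"]           # indexed by address
print(ram[2])                              # address -> content: "gamma"

cam = {"alpha": 0, "beta": 1, "gamma": 2}  # indexed by content
print(cam["gamma"])                        # content -> address: 2
```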
This has allowed computing to become a
commodity which is now ubiquitous, embedded in
many forms, from greeting cards and telephones to
satellites. Computing hardware and its software have even become a metaphor for the operation of the universe.
[91] Although DNA-based computing and
quantum qubit computing are years or decades in the future, the infrastructure is being laid today, for example, with
DNA origami on photolithography
[92] and with quantum antennae for transferring information between ion traps.
[93] Fast
digital circuits (including those based on
Josephson junctions and
rapid single flux quantum technology) are becoming more nearly realizable with the discovery of
nanoscale superconductors.
[94]
Fiber-optic and photonic devices, which already have been used to transport data over long distances, are now entering the data center, side by side with CPU and semiconductor memory components. This allows the separation of RAM from CPU by optical interconnects.
[95]
An indication of the rapidity of development of this field can be inferred from the history of the seminal article.
[96] By the time that anyone had time to write anything down, it was obsolete. After 1945, others read John von Neumann's
First Draft of a Report on the EDVAC, and immediately started implementing their own systems. To this day, the pace of development has continued, worldwide.
[97][98]