NATS 1700 6.0 COMPUTERS, INFORMATION AND SOCIETY
Lecture 10: The Modern Computer
Introduction
- The start of WW 2 marked the beginning of a rapid succession of digital computer prototypes. This is not to say
that the war effort was solely responsible for this outburst, but it certainly accelerated it to a great
degree. In the US and the UK, governments allocated substantial financial and human resources to these projects, while,
strangely enough, the Nazi government did not grasp the strategic importance of computers and did not support
their development.
- Consult again the useful Chronology of Digital Computing Machines (to 1952).
There you will find many more pioneering machines and details than are mentioned in this lecture.
- Ken Polsson's Chronology of Events in the History of Microcomputers
is a "collection of product announcements and delivery dates from various sources, mainly computer magazines and newspapers."
- Read Internet Pioneers: J C R Licklider, a short biography of the man who was called Computing's Johnny Appleseed
because he laid the foundations of modern computing (e.g. time-sharing, point-and-click interfaces,
graphics and the Internet). Read Licklider's visionary paper Man-Computer Symbiosis (1960).
- Visit the EDSAC Simulator Home Page. Here
you can download the simulator software (and find links to other simulators). "The Edsac simulator is a faithful
software evocation of the EDSAC computer as it existed in 1949-51. The user interface has all the controls and displays
of the original machine, and the system includes a library of original programs, subroutines, and debugging software.
The simulator is intended for use in teaching the history of computing; as a tutorial introduction to the classic
'von Neumann' computer; or as an historical experience for current computer practitioners." Play with it, and
pause for a moment to think that the 'original' of this simulation was a huge, heavy, expensive and slow vacuum tube
machine.
Topics
- In Germany, beginning in 1931, Konrad Zuse built the Z1, Z2, Z3 and
Z4 series of machines, and in 1945-46 he introduced Plankalkül (plan calculus),
perhaps the world's first programming language. Check also Goldstrasz and Pantle's brief but very interesting account of modern
Germany's views on the history of Computers During WW 2.
- In 1939 George Stibitz constructed a large scale electro-mechanical Complex Number Calculator (one of
the machines vying for the title of first digital computer) for Bell Labs, and in 1940 the Bell Labs Model I
became the first computing device to be linked via telephone lines to another device.
- Also in 1939 John V Atanasoff designed a special-purpose calculator (also known as the ABC) for solving systems
of simultaneous linear equations (see John Vincent Atanasoff & the
Birth of the Digital Computer), with the help of graduate student Clifford Berry at Iowa State College. In 1973 a judge
ruled it the first automatic digital computer.
The ENIAC
- In 1943 John William Mauchly and John Presper Eckert, under guidance from John Brainerd, Dean of the Moore School of
Electrical Engineering at the University of Pennsylvania, began development of the Electronic Numerical Integrator And
Computer, or ENIAC for short. John von Neumann later wrote important papers on their work.
- The history of ENIAC is described in some detail in a commemorative article by D Winegrad and A Akera significantly
entitled A Short History of the Second American Revolution,
and in a first-hand account by Herman Goldstine: Computers at
the University of Pennsylvania's Moore School. Another interesting article is John
W Mauchly and the Development of the ENIAC Computer.
- Visit the University of Pennsylvania site that celebrates the 50th Anniversary of the ENIAC,
which was activated in 1946. ENIAC featured some 18,000 vacuum tubes, weighed 80 tons and could perform 5,000 additions and
360 multiplications per second. By comparison, the latest IBM supercomputer can perform a few trillion arithmetical
operations per second.
- In 1944 John von Neumann and others designed the EDVAC. The idea was further refined and implemented
by Maurice Wilkes in Cambridge, who built the EDSAC. In a report entitled First Draft of a Report
on the EDVAC (1945), the concept of stored program is first introduced. Here is a brief
summary. The fundamental idea, which was developed
by Eckert and Mauchly, besides von Neumann, is that the sequences of instructions, i.e. the program, can be stored
in memory, just like the data on which the instructions operate. This was a major breakthrough, because until then a
computer had to be laboriously re-wired by hand for each new problem.
- The digital computer, as we know it today, was born. A digital computer consists of the following components:
- memory unit: it holds the instructions and the data required by the instructions
- control unit: it fetches the instructions from memory
- arithmetic processor: it performs the operations specified by the instructions
- input/output and peripheral devices: they transfer the data to and from the system
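The interplay of these components can be sketched in a few lines of code. The following toy stored-program machine is purely illustrative (its four-instruction set and the little program are invented for this sketch): instructions and data live in the same memory, a program counter plays the role of the control unit, and an accumulator stands in for the arithmetic processor.

```python
# A toy stored-program machine: program and data share one memory,
# a control unit (pc) fetches instructions, and an accumulator (acc)
# does the arithmetic. The LOAD/ADD/STORE/HALT instruction set is invented.

def run(memory):
    acc = 0          # arithmetic processor's accumulator
    pc = 0           # control unit's program counter
    while True:
        op, arg = memory[pc]      # fetch the next instruction from memory
        pc += 1
        if op == "LOAD":          # acc <- memory[arg]
            acc = memory[arg]
        elif op == "ADD":         # acc <- acc + memory[arg]
            acc += memory[arg]
        elif op == "STORE":       # memory[arg] <- acc
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Cells 0-3 hold the program, cells 4-6 hold the data: compute 2 + 3.
memory = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0), 2, 3, 0]
print(run(memory)[6])  # prints 5
```

Because the program is just data in memory, running a different problem means loading different cells, not re-wiring the machine.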
- You may also want to get a rough idea of the basic logic elements of a modern computer. Such logic elements or gates are
the operators that allow elementary programming statements to be combined, transformed and in general manipulated. For example,
if the inputs of an AND gate are two statements A and B, the output of the gate will be a new statement C which
is true if and only if A and B are both true. If we represent 'true' as a 1 and 'false' as a 0, we can summarize all possible
combinations in a table:
Input A | Input B | Output
   0    |    0    |    0
   0    |    1    |    0
   1    |    0    |    0
   1    |    1    |    1
0's and 1's are only symbols. They may denote, for example, 'open' and 'closed,' or 'on' and 'off,' if we have in
mind an electrical circuit or a switch. And circuits and switches are precisely the physical elements that make up
a computer. We open and close a circuit, we flip a switch on or off by sending an appropriate current or by applying
a suitable voltage.
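The gates themselves are easy to mimic in software. The sketch below is for illustration only (real gates are circuits, not function calls); it reproduces the AND truth table above using 1 for 'true' and 0 for 'false':

```python
# Logic gates as functions on the symbols 0 and 1.
# AND and OR use Python's bitwise operators; NOT flips 0 and 1.

def AND(a, b):
    return a & b

def OR(a, b):
    return a | b

def NOT(a):
    return 1 - a

# Print the AND truth table: all four input combinations.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, AND(a, b))
# prints:
# 0 0 0
# 0 1 0
# 1 0 0
# 1 1 1
```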
- The concept of a stored program led to compilers, i.e. programs that translate the human, or high-level,
formulation of a program into the machine code or low-level language understood by the computer. The first
compiler was probably the A-0, created by Grace Murray Hopper. There followed, in fairly rapid succession, FORTRAN
(1954), COBOL (1959), BASIC (1964). Grace Hopper is also credited with the
creation of the computer term 'debugging.' The story is
that one day Grace Hopper "found [a moth] trapped between points at Relay # 70, Panel F, of the Mark II
Aiken Relay Calculator while it was being tested at Harvard University, 9 September 1945. The operators affixed the moth
to the computer log, with the entry: 'First actual case of bug being found.' They put out the word that they had 'debugged' the
machine, thus introducing the term 'debugging a computer program.' In 1988, the log, with the moth still taped by the entry, was
in the Naval Surface Warfare Center Computer Museum at Dahlgren, Virginia."
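The idea behind a compiler can be illustrated with a deliberately tiny sketch: it translates an arithmetic formula into instructions for an imaginary stack machine. Both the instruction names and the postfix input format are simplifications invented for this example; they are not how A-0 or FORTRAN actually worked.

```python
# A toy 'compiler': turns a formula, written here in postfix order
# (operands before their operator, to keep parsing out of the picture),
# into instructions for a simple stack machine. Instruction names invented.

def compile_postfix(tokens):
    code = []
    for t in tokens:
        if t in ("+", "*"):
            code.append(("ADD" if t == "+" else "MUL", None))
        else:
            code.append(("PUSH", int(t)))
    return code

def execute(code):
    stack = []
    for op, arg in code:
        if op == "PUSH":
            stack.append(arg)
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack.pop()

# (2 + 3) * 4 written in postfix: 2 3 + 4 *
code = compile_postfix(["2", "3", "+", "4", "*"])
print(execute(code))  # prints 20
```

The human writes the formula once; the compiler, not the programmer, produces the low-level instruction sequence.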
- In 1948 Max Newman, F C Williams, and others at Manchester University built the Mark I. This is
another contender for the title of first modern digital computer. It featured a true stored-program capability, and used
a new type of memory conceived by Williams.
- In 1949 Presper Eckert and John Mauchly built the first UNIVAC, which was purchased by the US Census
Bureau. This was the first truly commercial computer. In total, 47 UNIVACs were built. Read this fascinating story
by Ryan Weston: UNIVAC: The Paul Revere of the Computer Revolution.
- In 1947 J Bardeen, W Brattain and W Shockley invented the transistor, which
replaced the vacuum tube in computers, as well as in televisions and radios. They were awarded the 1956 Nobel Prize in
physics. The size and power consumption of computers were reduced dramatically, while reliability increased, equally
dramatically. The first machines to make full use of transistor technology were the early supercomputers:
Stretch by IBM and LARC by Sperry-Rand. This fundamental invention made possible many
subsequent improvements. In 1958 Jack Kilby and Robert Noyce developed the integrated circuit (IC), a single
silicon chip on which several computer components could be fitted. The next step came with large scale
integration (LSI), then with very large scale integration (VLSI), and finally with
ultra-large scale integration (ULSI). These technologies squeezed first hundreds and thousands, then
hundreds of thousands, then millions of components onto one chip.
- Another quantum leap was taken with the introduction of magnetic-core memory. In 1949 An Wang patented
the concept of core memory, and Jay Forrester used iron cores as main memory in the Whirlwind, a machine
built for the US Navy's Office of Research and Inventions.
- In 1954 Gene Amdahl developed the first operating system, for the IBM 704. An 'OS' allowed machines
to run many different programs at once with a central program that monitored and coordinated the computer's operations.
In 1959 C Strachey introduced the idea of time-sharing, making it possible for several users with
different programs to access the computer simultaneously.
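Strachey's time-sharing idea can be sketched as round-robin scheduling: a central program gives each user's job a short slice of processor time in turn, so everyone appears to be using the machine at once. The job names and slice sizes below are invented for illustration.

```python
# A sketch of time-sharing: each job gets a fixed time slice, then goes
# to the back of the queue until its remaining work is done.
from collections import deque

def time_share(jobs, slice_size=2):
    """jobs: dict of user name -> units of work remaining.
    Returns the order in which users receive the processor."""
    queue = deque(jobs.items())
    turns = []
    while queue:
        name, remaining = queue.popleft()
        turns.append(name)                   # this user gets the CPU now
        remaining -= slice_size              # run for one time slice
        if remaining > 0:
            queue.append((name, remaining))  # unfinished: rejoin the queue
    return turns

print(time_share({"alice": 3, "bob": 5}))
# prints ['alice', 'bob', 'alice', 'bob', 'bob']
```

Interleaving turns fast enough is what makes several users feel they each have the whole computer.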
- In 1975 MITS introduced the Altair personal computer, a home kit designed by Ed Roberts and Bill Yates.
In 1977 the Apple II appeared on the market, and in 1981 IBM manufactured the first IBM PC.
As they say, the rest is history.
- A good website to visit is Personal Computer Milestones,
where you can see fairly detailed descriptions and pictures of these early machines.
Questions and Exercises
- Explain why mathematics was so important in the development of computers.
- Check your computer manual and, using your own words, identify the memory unit, the control unit, the arithmetic processor,
and the input/output and peripheral devices. Hint: if you can't find your computer manual, search the web for a
description of a typical desktop computer.
Picture Credit: PENN Library Exhibitions
Last Modification Date: 07 April 2010