NATS 1700 6.0 COMPUTERS, INFORMATION AND SOCIETY
Lecture 22: Review and Conclusions
Introduction
- Although written primarily with the US scene in mind, Jeffrey R Cooper's article in First Monday, The CyberFrontier and America at the Turn of the 21st Century: Reopening Frederick Jackson Turner's Frontier, is generally a very good and useful analysis of the meaning and implications of the information revolution.
- Computers are usually considered environmentally friendly, but this is far from the truth. The computer microchip industry, for example, still uses freon and other CFC products--ozone-depleting chemicals--in its microelectronics assembly cleaning processes and in computer chip cooling. And old computers and their peripherals still end up in waste disposal sites and garbage dumps. See Poison PCs, an interesting article that appeared recently on Salon.
- Both space exploration and modern computing started after WWII. "Like most busy commuters these days, astronauts
need to take their laptop computers with them on the road--even if they are travelling at 17,500 miles per hour in a
billion dollar space shuttle between Earth and the International Space Station. Given that NASA often touts itself as
being at the cutting edge of technology you'd expect that they'd lavish nothing but the latest and speediest laptop
computers on their astronauts. Alas, that isn't the case. But don't think that these folks aren't being given the tools
they need either." Read Keith Cowing's 2001: A Space Laptop
for a detailed look at computing in the Shuttle. And for something a bit more up-to-date, see Space Shuttle Computers and Avionics.
Topics
- The first part of the course was devoted to a survey of some of the major ideas concerning the nature of science. The commonly held notion that science is synonymous with the scientific method was shown to be inadequate. You don't have to believe in the extreme views of post-modernism (science is simply a socially constructed system) to see, for example, that collecting facts is a theory-laden activity, that a hypothesis which fails an experimental test is often retained rather than automatically discarded, or that the boundary between science and non-science is often quite fuzzy. To an extent which is not easy to define, science is also an institution, one which protects itself, for example, by developing an orthodoxy. The question of what truth is in science, or how closely science can approach it, remains problematic.
- The second part of the course started with a brief analysis of the meaning of the concept of information, from its colloquial use to the refined theories developed by Shannon and Weaver. One way to express the difference between the everyday and the technical definitions of information is to say that the latter is not synonymous with meaning, but is a symbolic way to encode meaning (the small sketch just below makes this concrete).
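As a minimal illustration (the code and the sample messages are my own, not Shannon's), the following Python snippet computes Shannon's entropy, H = -sum p(x) log2 p(x): the average number of bits of information per symbol, derived purely from symbol frequencies, with no reference to what the message means.

    # Shannon entropy: average bits of information per symbol of a message,
    # computed from observed symbol frequencies alone (meaning plays no role).
    from collections import Counter
    from math import log2

    def entropy(message: str) -> float:
        counts = Counter(message)
        total = len(message)
        return -sum((n / total) * log2(n / total) for n in counts.values())

    print(entropy("aaaa"))         # 0.0  -- perfectly predictable: no information
    print(entropy("abab"))         # 1.0  -- exactly one bit per symbol
    print(entropy("hello world"))  # ~2.85 -- more varied, hence more bits per symbol

Note that "aaaa" and any other four-symbol repetition score exactly the same: entropy measures surprise, not meaning, which is precisely the gap between the technical and the everyday notions of information.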
- We then proceeded to review the main events in the history of computing, from the abacus to the modern computer. The concept of the (modern) computer was developed essentially by Turing, von Neumann, and others (although Babbage had already had a similar idea a century earlier). In his famous First Draft of a Report on the EDVAC (1945), von Neumann introduced the idea of storing in the machine not only the data, but also the instructions (the program) for processing the data. Thus a computer consists of the following units (a toy simulation follows the list):
- a memory unit, which holds the instructions and the data required by the instructions
- a control unit, which fetches the instructions from memory
- an arithmetic processor, which performs the operations specified by the instructions
- input/output devices, which transfer the data to and from the system
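Here is a minimal sketch of the stored-program idea in Python (the five-instruction set and the memory layout are invented for illustration, not taken from the EDVAC or any historical machine). Instructions and data sit in one shared memory, and the control loop repeatedly fetches and executes:

    # A toy stored-program machine: one memory holds both program and data.
    def run(memory):
        acc = 0  # accumulator (the arithmetic processor's working register)
        pc = 0   # program counter (kept by the control unit)
        while True:
            op, arg = memory[pc]  # control unit: fetch the next instruction
            pc += 1
            if op == "LOAD":      # arithmetic: acc <- memory[arg]
                acc = memory[arg]
            elif op == "ADD":     # arithmetic: acc <- acc + memory[arg]
                acc += memory[arg]
            elif op == "STORE":   # memory: memory[arg] <- acc
                memory[arg] = acc
            elif op == "PRINT":   # output device
                print(acc)
            elif op == "HALT":
                return

    # Cells 0-4 hold the instructions, cells 5-7 hold the data -- same memory.
    run([("LOAD", 5), ("ADD", 6), ("STORE", 7), ("PRINT", None), ("HALT", None),
         2, 3, 0])  # prints 5

Because the program is itself data in memory, it can in principle be read, copied, or even modified by another program: the key insight behind the von Neumann design.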
- The work of Turing and many others shows that computers can solve a very large class of problems--so much so that the question of whether computers may (one day) become intelligent, and even conscious, has become a real (as opposed to rhetorical) question. We discussed the history of the idea of intelligence, and the fact that the answer is not quite forthcoming yet. We then tackled the debate on artificial intelligence and its two major camps, strong AI and weak AI. We reviewed two of the most important tests proposed to answer the question: Turing's Test and Searle's Chinese Room. We looked at some of the most powerful techniques of AI, such as a-life, genetic algorithms, neural nets, and knowledge discovery or data mining; a tiny genetic-algorithm example follows below.
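As one concrete taste of these techniques, the following is a minimal genetic algorithm (the target string, population size, and mutation rate are arbitrary choices for illustration): random bit strings are repeatedly selected by fitness, recombined, and mutated until one matches a target.

    # Minimal genetic algorithm: evolve bit strings toward an all-ones target.
    import random

    TARGET = [1] * 20

    def fitness(ind):
        return sum(a == b for a, b in zip(ind, TARGET))  # matching positions

    def evolve(pop_size=30, generations=100, mutation_rate=0.02):
        pop = [[random.randint(0, 1) for _ in TARGET] for _ in range(pop_size)]
        for gen in range(generations):
            pop.sort(key=fitness, reverse=True)
            if fitness(pop[0]) == len(TARGET):
                return gen, pop[0]                  # perfect individual found
            parents = pop[: pop_size // 2]          # selection: fitter half survives
            children = []
            while len(children) < pop_size:
                a, b = random.sample(parents, 2)
                cut = random.randrange(1, len(TARGET))            # crossover point
                child = [bit ^ (random.random() < mutation_rate)  # mutation
                         for bit in a[:cut] + b[cut:]]
                children.append(child)
            pop = children
        return generations, max(pop, key=fitness)

    gen, best = evolve()
    print(f"generation {gen}: {best}")

No one tells the population how to reach the target; selection pressure alone does the work, which is what makes such techniques interesting for AI.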
- Information technology has led to the emergence of the Internet, which has brought this technology into everyday life. We discussed various examples of the reactions, sometimes very critical, sometimes enthusiastic, to this phenomenon. These critiques led us to realize that the answers to the problems posed by IT often depend on the much more difficult answers to old problems: What is the relationship between man and machine? What is freedom? What is creativity? What is the purpose of education? And so on.
- These questions were examined in greater detail in the last part of the course, where we discussed
e-business, property rights, privacy, censorship,
and security. Some of these are not new, but the web has made them more urgent and more complicated.
- So, in conclusion, this course has barely scratched the surface. Hopefully, it has offered you a useful sketch of the long and tortuous, but rewarding, path you would have to follow if you chose to pursue the study of information technology. Hopefully, it has given you a bird's-eye view of the field, taught you the importance of asking critical questions about this technology, and given you some examples of how to do so. As one of my professors told me when I received my PhD, "remember, don't think you know anything yet--you have only begun to learn how to go about asking questions and finding answers."
Questions and Exercises
- We have not even touched many important topics and questions. Can you think of some examples? Here is one: Digital Preservation, where many of the issues are presented and discussed. See also K D Bollacker's Avoiding a Digital Dark Age, which appeared in the March-April 2010 issue of American Scientist. Here are other resources: Preserving the Internet: How Archiving Online Content Can Make History, The Archivist in the Electronic Age, and Digital Preservation. A very important initiative is the Wayback Machine, where "85 billion web pages [are] archived from 1996 to a few months ago." It is not a coincidence that one of the 'mirrors' of this repository is housed in the newly reconstructed Bibliotheca Alexandrina or New Library of Alexandria, "dedicated to recapture the spirit of the original."
- Read The Machine Stops, a story by E M Forster.
Notice that it was first published in the Oxford and Cambridge Review in 1909! Compare it with Vannevar Bush's
article "As We May Think" (see Lecture 9), and Licklider's
paper on "Man-Computer Symbiosis" (see Lecture 10).
- For a brief, interesting outlook on the future, read J Strickland's recent article (and video) at HowStuffWorks:
How will computers evolve over the next 100 years?
- Am I a geek? Are you? Check out Harvey Blume's article Geek Studies in the July 13, 2000 issue of The Atlantic:
"Hackers, freaks, outsiders, Homo Superior? Call them what you will, geeks are everywhere, and their stories help explain how
science is shaping us."
- ...and if you need to relax and still learn a lot about computers, computing, and the internet, check out The New Hacker's Dictionary.
Don't be misled by the title--it's much more than a dictionary.
Last Modification Date: 08 April 2010