History of
Computing
Chapter 1
Introduction to Social and
Ethical Computing
Historical Development of Computing
Development of the Internet
Development of the World Wide Web
The Emergence of Social and Ethical Problems in
Computing
The Case for Computer Ethics Education
Historical Development
Before 1900AD
Man sought to improve life through the invention of gadgets.
First utility tools recorded dealt with numbers
First recorded on bones – 20,000 to 30,000 B.C.
First place-value number system in place – 1800 B.C.
Abacus – Mother of Computers – between 1000 B.C. and 500 B.C.
Zero and Negative Numbers – between 300 B.C. and 500 A.D.
Between 1500 AD and 1900 AD there was a lot of activity in the
development of computing devices
Driven by commerce
1500 Leonardo da Vinci invented mechanical calculator
1621 invention of the slide rule
1625 Wilhelm Schickard’s mechanical calculator
1640 Blaise Pascal’s Arithmetic Machine
A major breakthrough in speeding up computation came around
1800 AD with the invention of the punched card by Joseph-Marie
Jacquard
Revolutionized computing
Quickly spread to other fields
Sped up computation and storage of information
Historical Development
Before 1900AD
The 1830s were an exciting period
1830 - Charles Babbage’s Analytical Engine
Georg and Edvard Scheutz’s Difference Engine
Within a decade - major milestone
George Boole’s invention of Boolean Algebra
Opened fields of mathematics, engineering, & computing
Led to new frontiers in logic
Historical Development
Before 1900AD
Mid-1850s through the turn of the century
1857 - Sir Charles Wheatstone’s invention
Paper tape to store information
Created new excitement in the computing community of
the time.
Huge amounts of data could be entered & stored
1869 AD - Logic Machine by William Stanley Jevons
~1874 - first Keyboard by Sholes
1881 - Rectangular Logic Diagrams by Allan Marquand
Historical Development
Before 1900AD
Mid-1850s through the turn of the century
1886 - Charles Peirce first linked Boolean Algebra to
circuits based on switches (a short illustrative sketch
follows at the end of this list)
Major breakthrough in mathematics, engineering, and
computing science
1890 - John Venn invented the Venn
diagrams
Used extensively in switching algebras in both
hardware and software development
1890 - Herman Hollerith invented the
Tabulating Machine
Utilized Jacquard’s punched card to read the presence
or absence of holes.
The data read was to be collated using an automatic
electrical tabulating machine
Large number of clock-like counters
Summed up and accumulated the results in a number
of selected categories.
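A minimal, illustrative Python sketch (not from the slides) of the idea behind
linking Boolean Algebra to switch circuits, which later switching algebra built
on: switches wired in series behave like Boolean AND, and switches wired in
parallel behave like Boolean OR.

    # Illustrative sketch: series switches act as AND, parallel switches as OR.
    def series(switch_a: bool, switch_b: bool) -> bool:
        """Current flows only if both switches are closed (logical AND)."""
        return switch_a and switch_b

    def parallel(switch_a: bool, switch_b: bool) -> bool:
        """Current flows if either switch is closed (logical OR)."""
        return switch_a or switch_b

    # Truth table for a small circuit: (A in series with B) in parallel with C.
    for a in (False, True):
        for b in (False, True):
            for c in (False, True):
                print(a, b, c, "-> lamp on:", parallel(series(a, b), c))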
After 1900 AD
Computing in infancy
Century began with a major milestone
Vacuum tube by John Ambrose Fleming.
Played a major role in computing for the next half
century.
All digital computers in the first half of the century ran on
vacuum tubes.
1906 - triode by Lee de Forest
1926 - first semiconductor transistor
Not used for several years
Came to dominate the computing industry in later
years
1937 - Turing Machine by Alan Turing
Invention of an abstract computer
Showed that some problems do not lend themselves to
algorithmic representation and are therefore not computable
1942 - COLOSSUS, one of the first working programmable
digital computers
After 1900 AD
1942 – Turing designed COLOSSUS
One of the first working programmable digital computers
1939 – John Vincent Atanasoff – 1st digital computer model
Utilized capacitors to store electric charge representing the
binary digits 0 and 1 used by the machine in calculations
Input and output data were on punched cards
Some doubt it ever worked
After 1900 AD
Howard Aiken – developed the Harvard Mark I
1st large-scale automatic digital computer
Also known as the IBM Automatic Sequence Controlled
Calculator (ASCC)
1943, Alan Turing – COLOSSUS
Considered the 1st programmable electronic computer
Designed to break German wartime codes
Used about 1,800 vacuum tubes
Could execute a variety of routines
After 1900
John William Mauchly & J. Presper Eckert Jr. – ENIAC
Vacuum tube-based general-purpose computer
10 feet high
Weighed 30 tons
Occupied 1000 square feet
70,000 resistors
10,000 capacitors
6000 switches
18,000 vacuum tubes
No internal memory
Hard-wired
Programmed manually by setting switches and plugging cables
After 1900
1944-1952 John William Mauchly & J. Presper
Eckert Jr. – EDVAC
Electronic Discrete Variable Automatic Computer
1st truly general-purpose digital computer
Stored-program instruction concept
Completed in 1956
4,000 vacuum tubes and 10,000 crystal diodes
1951 - UNIVAC I
1st commercially available computer.
After 1900
Many companies became involved
International Business Machines (IBM), Honeywell,
and Control Data Corporation (CDC) in the USA,
and International Computers Limited (ICL) in the UK
Built mainframes
Huge – took up entire rooms
Expensive – use limited to big corporations
Mid to late sixties
Developed less expensive and smaller computers
The minicomputer
Timesharing concept
Led to the idea of networking
After 1900
1971 to 1976 - the first microprocessors
Built as integrated circuits with many
transistors on a single chip
Vacuum tubes and diodes no longer used
Ted Hoff
The 4004
4-bit data path
1972 – Intel 8008
8-bit microprocessor based on the 4004
First microprocessor to use a compiler
Application-specific microprocessors
Microprocessor
1974 - truly general-purpose microprocessor
The 8080 - an 8-bit device with 4,500 transistors
& an astonishing 200,000 operations per second
After 1974, development
exploded
Computer Software and
Personal Computer (PC)
Until the mid-1970s
Development was led by hardware
Computers were designed first and software was
designed to fit the hardware
Personal computing industry began
1976-1977 - the Apple I and Apple II microcomputers
were unveiled
1981 - IBM joined the PC wars
3 Major Players
IBM
Gary Kildall - developed CP/M, the first PC
operating system
Bill Gates - Developed the Disk Operating
System (DOS).
The Development of the
Internet
Internet based on 4 technologies
Telegraph
Telephone
Radio
Computers
Originated from the early work of J.C.R.
Licklider
Conceptualized a global interconnected set of
computers
Concept for communication between network nodes
Packets instead of circuits
Enabled computers to talk to each other.
1961 - Leonard Kleinrock
Published the first work on packet-switching theory
(a small illustrative sketch of the packet idea follows below)
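A minimal, illustrative Python sketch (not from the slides) of the packet idea:
a message is split into small numbered packets that can travel independently
and be reassembled in order at the destination; the message text and packet
size below are arbitrary.

    def to_packets(message, size=8):
        # Split a message into (sequence_number, payload) packets.
        return [(i, message[pos:pos + size])
                for i, pos in enumerate(range(0, len(message), size))]

    def reassemble(packets):
        # Rebuild the message even if packets arrive out of order.
        return "".join(payload for _, payload in sorted(packets))

    packets = to_packets("Packets instead of circuits let computers talk.")
    packets.reverse()  # simulate out-of-order arrival
    print(reassemble(packets))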
The Development of the
Internet
Two additional important projects
Donald Davies and Roger Scantlebury -
coined the term packet
1965 - Lawrence Roberts and Thomas Merrill
connected a computer in Boston with
one in Los Angeles over a
low-speed dial-up telephone line
Created the first working Wide Area
Network (WAN)
1967 - Roberts published the first plan
for the ARPANET
1968 - a team led by Frank Heart, which
included Bob Kahn, developed the IMP
(Interface Message Processor)
ARPANET
Began as a tool for defense contractors
Universities added
Government joined
Other countries joined
ARPANET ceased to exist in 1989
The Internet became an entity unto itself
Development of the World Wide Web
Beginning concepts - Tim Berners-Lee's 1989
proposal called HyperText and CERN
Enabled collaboration between physicists & other
researchers in the high-energy physics research community
Three new technologies were incorporated (a small
illustrative sketch showing them working together
follows this list):
HyperText Markup Language (HTML) - based on
hypertext concepts, used to write web documents
HyperText Transfer Protocol (HTTP) - a protocol
used to transmit web pages between hosts
Web browser - a client software program to
receive and interpret data and display the
results
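A minimal, illustrative Python sketch (not from the slides) of the three
technologies working together: a browser-like client sends an HTTP request,
the server returns an HTML document, and the client interprets part of it. The
host name example.com is an arbitrary illustrative choice, and a real browser
would render the full page rather than just the title.

    import http.client
    import re

    conn = http.client.HTTPConnection("example.com")  # illustrative host only
    conn.request("GET", "/")                          # HTTP: request a web page
    response = conn.getresponse()
    html = response.read().decode("utf-8")            # HTML: the page markup

    # A real browser renders the whole document; this sketch extracts <title>.
    match = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    print(response.status, match.group(1).strip() if match else "(no title)")
    conn.close()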
Development of the World Wide Web
Proposal included a very important
concept for the user interface
Consistent across all types of computer
platforms
Enabling users to access information from any
computer
Line-mode interface was developed &
named at CERN in late 1989
Development of the World Wide Web
Growth
A central computer at CERN with a few web pages in 1991
50 world wide by 1992
720,000 by 1999
Over 24 million by 2001
1993 - Mosaic, a graphical user interface browser
Popularized the web and fueled the growth of the
Internet
Emergence of Social &
Ethical Problems in Computing
The Emergence of Computer Crimes
Perhaps started with the invention of the computer
virus
The term virus is derived from the Latin word virus,
which means poison
Computer virus
A self-propagating computer program
Designed to alter or destroy a computer system resource
Spreads in its new environment
Attacks major system resources
Weakens the capacity of resources to perform
1972 – "virus" first used to describe a piece of
unwanted computer code
Growth of Computer
Vulnerabilities
The Case for Computer
Ethics Education
What is Computer Ethics
James H. Moor
First coined the phrase "computer ethics"
Computer ethics is the analysis of the nature and social
impact of computer technology and the corresponding
formulation and justification of policies for the ethical use of
such technology.
The definition focuses on human actions
A study and analysis of the values of human actions
influenced by computer technology
The computer's influence on human actions is widespread
throughout the decision-making process preceding
the action
In computer ethics education we study the factors that
influence the decision-making process
Why You Should Study
Computer Ethics
Central task of computer ethics
determine what should be done
Especially whenever there is a policy vacuum
Vacuums are caused by 'confusion' between
known policies and what a new situation
presents
Professionals are often unprepared to deal
effectively with the ethical issues
Studying computer ethics can help fill these vacuums
and can prepare professionals
Schools of Thought
Study computer ethics as remedial moral education
Study computer ethics not as moral education
but as a field worthy of study in its own right
Justification for First Thought
We should study computer ethics
because doing so will make us
behave like responsible
professionals.
We should study computer ethics
because doing so will teach us how
to avoid computer abuse and
catastrophes.
Material taken from Walter Maner, "Is Computer Ethics Unique?"
Justification for Second Thought
We should study computer ethics because
the advance of computing technology
will continue to create temporary policy
vacuums.
We should study computer ethics
because the use of computing
permanently transforms certain ethical
issues to the degree that their alterations
require independent study.
We should study computer ethics
because the use of computing
technology creates, and will continue to
create, novel ethical issues that require
special study.
We should study computer ethics because
the set of novel and transformed issues is
large enough and coherent enough to
define a new field
Material taken from Walter Maner, "Is Computer Ethics Unique?"