Lecture 06

The lecture discusses sorting algorithms, establishing that the best possible running time for comparison-based sorting is Θ(n lg n). It introduces the decision-tree model for comparison sorts and proves that any such algorithm must have a worst-case running time of Ω(n lg n). Additionally, it presents counting sort as a linear time sorting algorithm that does not rely on comparisons.


Introduction to Algorithms

6.046J/18.401J

Lecture 6
Prof. Piotr Indyk
Today: sorting

• Show that Θ(n lg n) is the best possible running time for a sorting algorithm.
• Design an algorithm that sorts in O(n) time.
• Hint: different models?

© Charles E. Leiserson and Piotr Indyk Introduction to Algorithms September 27, 2004 L6.2
Comparison sort
All the sorting algorithms we have seen so far
are comparison sorts: only use comparisons to
determine the relative order of elements.
• E.g., insertion sort, merge sort, quicksort,
heapsort.

Partitioning subroutine
PARTITION(A, p, r)            ⊳ A[p . . r]
    x ← A[p]                  ⊳ pivot = A[p]
    i ← p
    for j ← p + 1 to r
        do if A[j] ≤ x
              then i ← i + 1
                   exchange A[i] ↔ A[j]
    exchange A[p] ↔ A[i]
    return i
Invariant:
    [ x ][ ≤ x ][ ≥ x ][ ? ]
     ↑p      ↑i     ↑j    ↑r
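As a sketch, the same partitioning scheme in Python (0-indexed, our own translation of the pseudocode above, not the lecture's code):

```python
def partition(a, p, r):
    """Partition a[p..r] in place around the pivot a[p].

    Returns the final index of the pivot; afterwards every element to
    its left is <= pivot and every element to its right is >= pivot.
    """
    x = a[p]                       # pivot = a[p]
    i = p
    for j in range(p + 1, r + 1):
        if a[j] <= x:
            i += 1
            a[i], a[j] = a[j], a[i]
    a[p], a[i] = a[i], a[p]        # put the pivot in its final position
    return i

vals = [6, 10, 13, 5, 8, 3, 2, 11]
q = partition(vals, 0, len(vals) - 1)
```

After the call, `vals[q]` holds the pivot 6, with smaller-or-equal elements before it and larger-or-equal elements after it.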
Comparison sort

• All of our algorithms used comparisons.
• All of our algorithms have worst-case running time Ω(n lg n).
• Is that the best we can do using just comparisons?
• Answer: YES, via decision trees.

Decision-tree example
Sort 〈a1, a2, …, an〉 (n = 3):

                1:2
            ≤ /     \ ≥
          2:3         1:3
          /  \        /  \
     〈123〉  1:3  〈213〉  2:3
            /  \        /  \
        〈132〉〈312〉 〈231〉〈321〉

Each internal node is labeled i:j for i, j ∈ {1, 2, …, n}.
• The left subtree shows subsequent comparisons if ai ≤ aj.
• The right subtree shows subsequent comparisons if ai ≥ aj.
Decision-tree example
Sorting 〈a1, a2, a3〉 = 〈9, 4, 6〉 traces one root-to-leaf path:
• At 1:2, a1 = 9 ≥ 4 = a2, so go right.
• At 1:3, a1 = 9 ≥ 6 = a3, so go right.
• At 2:3, a2 = 4 ≤ 6 = a3, so go left.
• Reach the leaf 〈2, 3, 1〉: the ordering a2 ≤ a3 ≤ a1, i.e., 4 ≤ 6 ≤ 9.

Each leaf contains a permutation 〈π(1), π(2), …, π(n)〉 to
indicate that the ordering aπ(1) ≤ aπ(2) ≤ ⋯ ≤ aπ(n) has been
established.
Decision-tree model
A decision tree can model the execution of
any comparison sort:
• One tree for each input size n.
• View the algorithm as splitting whenever it compares two
elements.
• The tree contains the comparisons along all possible
instruction traces.
• The number of comparisons done by the algorithm on a given input = the length of the path taken.
• Worst-case number of comparisons = the maximum path length = the height of the tree.
• Worst-case time ≥ worst-case number of comparisons.
Lower bound for decision-tree sorting
Theorem. Any decision tree that can sort n elements must have height Ω(n lg n).

Corollary. Any comparison sorting algorithm has worst-case running time Ω(n lg n).

Corollary 2. Merge sort and heapsort are asymptotically optimal comparison sorting algorithms.

Lower bound for decision-tree sorting
Theorem. Any decision tree that can sort n
elements must have height Ω(n lg n) .
Proof.
• The tree must contain ≥ n! leaves, since there are n! possible permutations.
• A height-h binary tree has ≤ 2^h leaves.
• Thus, 2^h ≥ #leaves ≥ n!, or h ≥ lg(n!).

Proof, ctd.

2^h ≥ n!
    ≥ n·(n–1)·…·(n/2)
    ≥ (n/2)^(n/2)
⇒ h ≥ lg((n/2)^(n/2))
    = (n/2)(lg n – lg 2)
    = Ω(n lg n).

Example: sorting 3 elements

Recall h ≥ lg(n!).
• n = 3
• n! = 6
• lg 6 ≈ 2.58
• Since h is an integer, sorting 3 elements requires ≥ ⌈2.58⌉ = 3 comparisons in the worst case.
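The bound h ≥ ⌈lg(n!)⌉ is easy to check numerically; a minimal sketch using only the standard library (the function name is ours):

```python
import math

def min_comparisons(n):
    """Lower bound on worst-case comparisons for sorting n elements:
    the decision tree needs >= n! leaves, so its height h >= ceil(lg(n!))."""
    return math.ceil(math.log2(math.factorial(n)))

# n = 3: lg(3!) = lg 6 ~ 2.58, so at least 3 comparisons are needed.
bound = min_comparisons(3)
```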

Decision-tree for n=3
Sort 〈a1, a2, a3〉:

                1:2
            ≤ /     \ ≥
          2:3         1:3
          /  \        /  \
     〈123〉  1:3  〈213〉  2:3
            /  \        /  \
        〈132〉〈312〉 〈231〉〈321〉

Sorting in linear time
Counting sort: No comparisons between elements.
• Input: A[1 . . n], where A[ j]∈{1, 2, …, k} .
• Output: B[1 . . n], sorted*
• Auxiliary storage: C[1 . . k] .

*Actually, we require the algorithm to construct a permutation of the input array A that produces the sorted array B. This permutation can be obtained by making small changes to the last loop of the algorithm.

Counting sort
for i ← 1 to k
do C[i] ← 0
for j ← 1 to n
do C[A[ j]] ← C[A[ j]] + 1 ⊳ C[i] = |{key = i}|
for i ← 2 to k
do C[i] ← C[i] + C[i–1] ⊳ C[i] = |{key ≤ i}|
for j ← n downto 1
do B[C[A[ j]]] ← A[ j]
C[A[ j]] ← C[A[ j]] – 1
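A direct Python transcription of the four loops above (0-indexed arrays, keys in {1, …, k}; a sketch, not the course's reference code — note that with 0-indexed output we decrement C before placing, which yields the same stable result as the 1-indexed pseudocode):

```python
def counting_sort(A, k):
    """Stable counting sort of A, whose keys are integers in {1, ..., k}."""
    n = len(A)
    C = [0] * (k + 1)              # C[1..k]; index 0 unused
    for j in range(n):             # loop 2: C[i] = |{key = i}|
        C[A[j]] += 1
    for i in range(2, k + 1):      # loop 3: C[i] = |{key <= i}|
        C[i] += C[i - 1]
    B = [0] * n
    for j in range(n - 1, -1, -1):  # loop 4: place right-to-left (stable)
        C[A[j]] -= 1               # decrement first: 0-indexed position
        B[C[A[j]]] = A[j]
    return B

out = counting_sort([4, 1, 3, 4, 3], k=4)
```

On the lecture's example A = 4 1 3 4 3 this returns B = 1 3 3 4 4.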
Counting-sort example
A[1..5] = 4 1 3 4 3        C[1..4]: (to be computed)

B[1..5]: (empty)

Loop 1
A[1..5] = 4 1 3 4 3        C[1..4] = 0 0 0 0        B: (empty)

for i ← 1 to k
do C[i] ← 0

Loop 2
for j ← 1 to n
    do C[A[ j]] ← C[A[ j]] + 1        ⊳ C[i] = |{key = i}|

Scanning A = 4 1 3 4 3 left to right:
  after A[1] = 4:  C = 0 0 0 1
  after A[2] = 1:  C = 1 0 0 1
  after A[3] = 3:  C = 1 0 1 1
  after A[4] = 4:  C = 1 0 1 2
  after A[5] = 3:  C = 1 0 2 2
Loop 3
for i ← 2 to k
    do C[i] ← C[i] + C[i–1]        ⊳ C[i] = |{key ≤ i}|

Starting from C = 1 0 2 2, the running prefix sums give:
  i = 2:  C' = 1 1 2 2
  i = 3:  C' = 1 1 3 2
  i = 4:  C' = 1 1 3 5
Loop 4
for j ← n downto 1
    do B[C[A[ j]]] ← A[ j]
       C[A[ j]] ← C[A[ j]] – 1

Scanning A = 4 1 3 4 3 right to left, starting from C' = 1 1 3 5:
  j = 5:  A[5] = 3 → B[3] ← 3;  C' = 1 1 2 5
  j = 4:  A[4] = 4 → B[5] ← 4;  C' = 1 1 2 4
  j = 3:  A[3] = 3 → B[2] ← 3;  C' = 1 1 1 4
  j = 2:  A[2] = 1 → B[1] ← 1;  C' = 0 1 1 4
  j = 1:  A[1] = 4 → B[4] ← 4;  C' = 0 1 1 3

Final: B = 1 3 3 4 4
B vs C
B[1..5] = 1 3 3 4 4        prefix-sum array C' = 1 1 3 5

In the end, each value i occupies the range B[C[i–1]+1 .. C[i]].

Analysis
Θ(k):   for i ← 1 to k
            do C[i] ← 0
Θ(n):   for j ← 1 to n
            do C[A[ j]] ← C[A[ j]] + 1
Θ(k):   for i ← 2 to k
            do C[i] ← C[i] + C[i–1]
Θ(n):   for j ← n downto 1
            do B[C[A[ j]]] ← A[ j]
               C[A[ j]] ← C[A[ j]] – 1
Total: Θ(n + k)
Running time
If k = O(n), then counting sort takes Θ(n) time.
• But sorting takes Ω(n lg n) time!
• Why?
Answer:
• Comparison sorting takes Ω(n lg n) time.
• Counting sort is not a comparison sort.
• In fact, not a single comparison between
elements occurs!
Stable sorting
Counting sort is a stable sort: it preserves
the input order among equal elements.

A: 4 1 3 4 3

B: 1 3 3 4 4

(The two 3s and the two 4s appear in B in the same order as in A.)

Sorting integers

• We can sort n integers from {1, 2, …, k} in O(n + k) time.
• This is nice if k = O(n).
• What if, say, k = n²?

Radix sort
• Origin: Herman Hollerith’s card-sorting machine for the 1890 U.S. Census. (See Appendix.)
• Digit-by-digit sort.
• Hollerith’s original (bad) idea: sort on
most-significant digit first.
• Good idea: Sort on least-significant digit
first with auxiliary stable sort.
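As a sketch of the good idea in Python (our own code, not the lecture's), least-significant-digit-first radix sort with a stable bucket pass per decimal digit:

```python
def radix_sort(nums, digits=3):
    """LSD radix sort: a stable pass on each decimal digit,
    least-significant digit first."""
    for t in range(digits):                 # pass t sorts on digit t
        div = 10 ** t
        buckets = [[] for _ in range(10)]
        for x in nums:                      # scanning in current order keeps
            buckets[(x // div) % 10].append(x)  # equal digits stable
        nums = [x for b in buckets for x in b]
    return nums

out = radix_sort([329, 457, 657, 839, 436, 720, 355])
```

This reproduces the pass-by-pass example on the next slide; in the lecture's analysis the bucket pass is replaced by counting sort.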

Operation of radix sort

input   pass 1    pass 2   pass 3
        (units)   (tens)   (hundreds)
 329     720       720      329
 457     355       329      355
 657     436       436      436
 839     457       839      457
 436     657       355      657
 720     329       457      720
 355     839       657      839

Correctness of radix sort
Induction on digit position t:
• Assume that the numbers are sorted by their low-order t – 1 digits.
• Sort on digit t with a stable sort:
  ▪ Two numbers that differ in digit t are correctly sorted.
  ▪ Two numbers equal in digit t are put in the same order as the input, so their low-order digits remain sorted ⇒ correct order.

Example (sorted on low-order digits → after sorting on digit t):
  720 → 329
  329 → 355
  436 → 436
  839 → 457
  355 → 657
  457 → 720
  657 → 839
Analysis of radix sort
• Assume counting sort is the auxiliary stable sort.
• Sort n computer words of b bits each.
  E.g., to sort elements in {1 … n²}, b = 2 lg n.
• Each word can be viewed as having b/r base-2^r digits.

Example: a 32-bit word is four 8-bit fields:
  r = 8  ⇒ b/r = 4 passes of counting sort on base-2^8 digits;
  r = 16 ⇒ b/r = 2 passes of counting sort on base-2^16 digits.
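Extracting the t-th base-2^r digit of a word is just a shift and a mask; a small sketch (the helper name is ours):

```python
def digit(x, t, r):
    """Return the t-th base-2**r digit of x (t = 0 is least significant)."""
    return (x >> (t * r)) & ((1 << r) - 1)

# A 32-bit word with r = 8 has b/r = 4 digits, one per byte.
word = 0x12345678
bytes_of_word = [digit(word, t, 8) for t in range(4)]
```

Each counting-sort pass would key on `digit(x, t, r)` for its pass number t.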
Analysis (continued)
Recall: Counting sort takes Θ(n + k) time to
sort n numbers in the range from 0 to k – 1.
If each b-bit word is broken into r-bit pieces,
each pass of counting sort takes Θ(n + 2r) time.
Since there are b/r passes, we have
T(n, b) = Θ((b/r)(n + 2^r)).
Choose r to minimize T(n, b):
• Increasing r means fewer passes, but as
r >> lg n, the time grows exponentially.
Choosing r
T(n, b) = Θ((b/r)(n + 2^r))
Minimize T(n, b) by differentiating and setting to 0.
Or, just observe that we don’t want 2r >> n, and
there’s no harm asymptotically in choosing r as
large as possible subject to this constraint.
Choosing r = lg n implies T(n, b) = Θ(bn/lg n) .
• For numbers in the range from 0 to n^d – 1, we have b = d lg n ⇒ radix sort runs in Θ(dn) time.
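A quick numeric check of the trade-off, evaluating (b/r)(n + 2^r) for a few r (a sketch with made-up parameters; constants hidden by Θ are ignored, and the exact minimizer sits a constant factor below lg n):

```python
import math

def passes_cost(n, b, r):
    """Cost model ceil(b/r) * (n + 2**r) for radix sort with r-bit digits."""
    return math.ceil(b / r) * (n + 2 ** r)

n, b = 1 << 20, 64                      # a million 64-bit keys, lg n = 20
costs = {r: passes_cost(n, b, r) for r in (4, 8, 16, 24, 32)}
best_r = min(costs, key=costs.get)      # small r: many passes; r >> lg n: 2**r blows up
```

Increasing r reduces the pass count only while 2^r stays comparable to n; beyond lg n the per-pass cost grows exponentially.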
Conclusions
In practice, radix sort is fast for large inputs, as
well as simple to code and maintain.
Example (32-bit numbers):
• At most 3 passes when sorting ≥ 2000 numbers.
• Merge sort and quicksort do at least ⌈lg 2000⌉ = 11 passes.
Downside: Unlike quicksort, radix sort displays
little locality of reference.

Appendix: Punched-card technology
• Herman Hollerith (1860-1929)
• Punched cards
• Hollerith’s tabulating system
• Operation of the sorter
• Origin of radix sort
• “Modern” IBM card
• Web resources on punched-card technology

Herman Hollerith
(1860-1929)
• The 1880 U.S. Census took almost 10 years to process.
• While a lecturer at MIT, Hollerith prototyped punched-card technology.
• His machines, including a “card sorter,” allowed the 1890 census total to be reported in 6 weeks.
• He founded the Tabulating Machine Company in 1896, which merged with other companies in 1911 and was renamed International Business Machines in 1924.
Punched cards
• Punched card = data record.
• Hole = value.
• Algorithm = machine + human operator.


Hollerith’s tabulating system
• Pantograph card punch
• Hand-press reader
• Dial counters
• Sorting box
Operation of the sorter
• An operator inserts a card into the press.
• Pins on the press reach through the punched holes to make electrical contact with mercury-filled cups beneath the card.
• Whenever a particular digit value is punched, the lid of the corresponding sorting bin lifts.
• The operator deposits the card into the bin and closes the lid.
• When all cards have been processed, the front panel is opened, and the cards are collected in order, yielding one pass of a stable sort.
Origin of radix sort
Hollerith’s original 1889 patent alludes to a most-significant-digit-first radix sort:
“The most complicated combinations can readily be
counted with comparatively few counters or relays by first
assorting the cards according to the first items entering
into the combinations, then reassorting each group
according to the second item entering into the combination,
and so on, and finally counting on a few counters the last
item of the combination for each group of cards.”
Least-significant-digit-first radix sort seems to be
a folk invention originated by machine operators.
“Modern” IBM card
• One character per column.
• So, that’s why text windows have 80 columns!

Web resources on punched-card technology
• Doug Jones’s punched card index
• Biography of Herman Hollerith
• The 1890 U.S. Census
• Early history of IBM
• Pictures of Hollerith’s inventions
• Hollerith’s patent application (borrowed
from Gordon Bell’s CyberMuseum)
• Impact of punched cards on U.S. history
