Lecture 3 : Analysis of
Algorithms & Insertion Sort
Jayavignesh T
Asst Professor
SENSE
Time Complexity
• Amount of computer time required by an algorithm
to run to completion
• Difficult to compute time complexity in terms of
physically clocked time.
• Drawbacks of measuring running time in terms of
seconds, milliseconds, etc.:
– Dependence on the speed of the underlying hardware
– Number of other programs running (system load)
– Dependence on the compiler used to generate
machine code
How to calculate running time then?
• Time complexity given in terms of FREQUENCY
COUNT
• Count denoting the number of times each
statement executes.
for (i = 0; i < n; i++) {   // init: 1; test: n+1; increment: n
    sum = sum + a[i];       // body: n times
}
Total = 1 + (n+1) + n + n = 3n + 2 = O(n), neglecting constants and lower-order terms
How to calculate running time then?
for (i = 0; i < n; i++)          // 1; n+1; n
{
    for (j = 0; j < n; j++)      // n; n(n+1); n·n
    {
        c[i][j] = a[i][j] + b[i][j];   // n·n
    }
}
Total = 3n² + 4n + 2 = O(n²)
Time Complexity
• Number of steps required by an algorithm varies
with the size of the problem it is solving.
• Normally expressed as an order of magnitude
– e.g. O(n²)
– If the problem size doubles, the algorithm
takes 4 times as many steps to complete
How to calculate running time then?
• All algorithms run longer on larger inputs
• Algorithm's efficiency is a function f(n) of input size
• Identify the most important operation of the
algorithm – the BASIC operation
• Basic operation – the one contributing most of the total
running time
• Count the number of times the basic operation is
executed (usually in the innermost loop)
Ex: Sorting algorithms – comparisons (<, >)
Matrix multiplication, polynomial evaluation – arithmetic operations (*, +)
= (assignment), == (equality test), etc.
Order of Growth of Algorithm
• Measures the performance of an algorithm
in relation to input size n
• We cannot say the running time equals n², only that it grows like n²
EFFICIENCY COMPARISONS
Rate of Growth of Algorithm as a function of input size
Determination of Complexities
• How do you determine the running time of a
piece of code?
Ans: It depends on the kinds of statements used.
1. Sequence of Statements
Statement 1;
Statement 2;
…
…
Statement k;
• Independent statements in a piece of code, not an
unrolled loop
• Total time: add the times for all statements.
• Total time = time(Statement 1) + time(Statement 2) + … +
time(Statement k)
• If each statement is simple (basic operations), its time is constant,
so the total time is also constant: O(1)
1 (Constant Time)
• When the instructions of a program are executed once or at
most only a few times, the running time complexity
of the algorithm is constant.
• It is independent of the problem size.
• It is represented as O(1).
• For example, the best-case complexity of linear search is O(1)
Log n (Logarithmic)
• An algorithm that solves a large problem by repeatedly
reducing it to smaller subproblems runs in
logarithmic time.
• Becomes only slightly slower as n grows.
• It does not process every element of an input of size n.
• The running time does not double until n increases to
n².
• It is represented as O(log n).
• For example, the running time complexity of binary
search is O(log n).
2. For loops
for (i = 0; i < N; i++)
{
    sequence of statements
}
• The loop executes N times, so the sequence of statements also executes N
times.
• If each pass is O(1), total time for the for loop = N * O(1) = O(N)
3. If-then-else statements
if (cond) {
    sequence of statements 1
}
else {
    sequence of statements 2
}
• Either sequence 1 or sequence 2 will execute.
• Worst-case time is the slower of the two possibilities:
– max{ time(sequence 1), time(sequence 2) }
– If sequence 1 is O(N) and sequence 2 is O(1), the worst-case time for
the if-then-else is O(N)
n (Linear)
• The complete set of instructions is executed once for each
input element, i.e. the whole input of size n is processed.
• It is represented as O(n).
• This is the best attainable when the whole input has
to be processed.
• In this situation the time requirement increases directly with the
size of the problem.
• For example, the worst-case complexity of linear search is O(n).
4. Nested Loops
for (i = 0; i < N; i++) {
    for (j = 0; j < M; j++) {
        sequence of statements;
    }
}
Total complexity = O(N*M)
                 = O(N²) when M = N
5. Statements with function calls
• for (j = 0; j < N; j++) g(N); has complexity O(N²)
– The loop executes N times
– Each call g(N) has complexity O(N)
n² (Quadratic)
• The running time of an algorithm is quadratic
when it processes all pairs of data items.
• Such an algorithm typically has two nested loops.
• For input size n, the running time is O(n²).
• Practically this is acceptable only for problems with small input
size, e.g. elementary sorting algorithms.
• In this situation the time requirement grows quickly with
the size of the problem.
• For example, the running time complexity of insertion sort is
O(n²).
Performance Classification
Efficiency comparisons
Function of Growth Rate
Prob 1. Calculate worst-case complexity!
• Nested loop + non-nested loop
for (i = 0; i < N; i++) {
    for (j = 0; j < N; j++) {
        sequence of statements;
    }
}
for (k = 0; k < N; k++) {
    sequence of statements;
}
• O(N²) + O(N) = O(max(N², N)) = O(N²)
Prob 2. Calculate worst-case complexity!
• Nested loop
for (i = 0; i < N; i++) {
    for (j = i; j < N; j++) {
        sequence of statements;
    }
}
• N + (N-1) + (N-2) + … + 1 = N(N+1)/2 = O(N²)
Approaches of Designing Algorithms
• Incremental Approach
– Insertion sort
– In each iteration one more element joins the sorted array
• Divide and Conquer Approach
– Recursively break the problem down into 2 or more subproblems until
they become easy to solve; the solutions are then combined to give
the solution to the original problem
• Merge Sort
• Quick Sort
Insertion Sort
Example input: 3 4 6 8 9 7 2 5 1
(indices 1..n; j marks the current element, i scans left through the sorted part)
Strategy
• Start empty-handed
• Insert each card in the right position
in the already sorted hand
• Continue until all the cards are
inserted/sorted
Analysis – Insertion Sort
Insertion Sort – Tracing Input
Analysis – Insertion Sort
• Assume that the i-th line takes time c_i, which is a constant.
(Since the third line is a comment, it takes no time.)
• For j = 2, 3, …, n, let t_j be the number of times the
while-loop test is executed for that value of j.
• Note that when a for or while loop exits in the usual way,
due to the test in the loop header, the test is executed
one more time than the loop body.
Analysis – Insertion Sort – Running time
Best case Analysis
Worst case Analysis
Average Case
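The three cases listed above follow from the t_j counts; the key sums, as in the standard textbook (CLRS) analysis the slides follow:

```latex
% Best case (already sorted): every t_j = 1, so
T(n) = (c_1 + c_2 + c_4 + c_5 + c_8)\,n - (c_2 + c_4 + c_5 + c_8)
     = an + b = \Theta(n)

% Worst case (reverse sorted): t_j = j, and
\sum_{j=2}^{n} t_j = \frac{n(n+1)}{2} - 1, \qquad
\sum_{j=2}^{n} (t_j - 1) = \frac{n(n-1)}{2},
% giving a running time of the form
T(n) = an^2 + bn + c = \Theta(n^2)

% Average case: on average about half of a[1..j-1] is scanned,
% so t_j \approx j/2 -- the sum is still quadratic, \Theta(n^2).
```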
Divide-and-Conquer
• The most well-known algorithm design strategy:
1. Divide an instance of the problem into two or more smaller
instances
2. Solve the smaller instances recursively
3. Obtain the solution to the original (larger) instance by
combining these solutions
• Running time is described by a recurrence relation
Divide-and-Conquer Technique (cont.)