Asymptotic Analysis in Data Structure using C
Presented By:
Deepshikha Haritwal
MCA/25001/18
• Meaning of algorithm
• Algorithm analysis
• Asymptotic analysis
• Asymptotic notations
• An algorithm may be defined as a finite sequence of instructions, each of which has a clear meaning and can be performed with a finite amount of effort in a finite length of time.
• An algorithm has the following properties:
• Finiteness
• Definiteness
• Generality
• Effectiveness
• Input/Output
• The performance of an algorithm can be measured in terms of time and space.
• Time complexity of an algorithm or a program is a function of its running time.
• Space complexity of an algorithm or a program is a function of the space it needs to run to completion.
• The time complexity of an algorithm can be computed by:
• A posteriori analysis
• A priori analysis
• A posteriori analysis calls for implementing the complete algorithm and executing it on a computer for various instances of the problem.
• The time taken by the execution of the program for the various instances is then noted and compared.
• A priori analysis calls for mathematically determining the resources, such as time and space, as a function of a parameter related to the instances of the problem.
In asymptotic analysis, we evaluate the performance of an algorithm in terms of its input size. Asymptotic analysis refers to computing the running time of any operation in mathematical units of computation.
• The time required by an algorithm falls into three cases:
• Best Case − Minimum time required for program execution.
• Average Case − Average time required for program execution.
• Worst Case − Maximum time required for program execution.
• Following are the commonly used asymptotic notations to calculate the running time complexity of an algorithm:
• O Notation
• Ω Notation
• Θ Notation
• Little o notation
• Little omega notation
• The notation O(g(n)) is the formal way to express the upper bound of an algorithm's running time.
• It measures the worst-case time complexity, or the longest time an algorithm can possibly take to complete.
• For example: the worst-case time complexity of Insertion sort is O(n^2).
• O(g(n)) = { f(n): there exist positive constants c and n0 such that 0 <= f(n) <= c*g(n) for all n >= n0 }
• Example: if f(n) = 16n^3 + 78n^2 + 12n and g(n) = n^3, then f(n) = O(n^3).
• The notation Ω(g(n)) is the formal way to express the lower bound of an algorithm's running time.
• It measures the best-case time complexity, or the shortest time an algorithm can possibly take to complete.
• For example: the best-case time complexity of Insertion Sort can be written as Ω(n).
• Ω(g(n)) = { f(n): there exist positive constants c and n0 such that 0 <= c*g(n) <= f(n) for all n >= n0 }
• Example: if f(n) = 24n + 9 and g(n) = n, then f(n) = Ω(n).
• The notation Θ(g(n)) is the formal way to express both the lower bound and the upper bound of an algorithm's running time.
• For example: if we use Θ notation to represent the time complexity of Insertion Sort, we have to use two statements for the best and worst cases:
• The worst-case time complexity of Insertion Sort is Θ(n^2).
• The best-case time complexity of Insertion Sort is Θ(n).
• Θ(g(n)) = { f(n): there exist positive constants c1, c2 and n0 such that 0 <= c1*g(n) <= f(n) <= c2*g(n) for all n >= n0 }
• Example: if f(n) = 28n + 9 and g(n) = n, then f(n) = Θ(n), since 28n <= f(n) <= 37n for all n >= 1 (take c1 = 28, c2 = 37, n0 = 1).
• Little o provides a strict upper bound (the equality condition of Big O is removed).
• "Little-o" (o()) notation is used to describe an upper bound that cannot be tight.
• Definition:
• Let f(n) and g(n) be functions that map positive integers to positive real numbers. We say that f(n) is o(g(n)) if for any real constant c > 0, there exists an integer constant n0 >= 1 such that 0 <= f(n) < c*g(n) for every integer n >= n0.
• Little omega provides a strict lower bound (the equality condition of Big Omega is removed).
• We use ω notation to denote a lower bound that is not asymptotically tight.
• Definition:
• Let f(n) and g(n) be functions that map positive integers to positive real numbers. We say that f(n) is ω(g(n)) if for any real constant c > 0, there exists an integer constant n0 >= 1 such that f(n) > c*g(n) >= 0 for every integer n >= n0.