Algorithm - Introduction
• A process or set of rules to be followed in calculations or other problem-solving operations.
• An algorithm refers to a set of rules/instructions that define, step by step, how a task is to be
executed in order to get the expected results.
Algorithm Design
• An algorithm is a way to represent the solution of a particular problem in
a simple and efficient form.
• If we have an algorithm for a specific problem, then we can implement it in any
programming language, meaning that the algorithm is independent of any
particular programming language.
• The important aspect of algorithm design is creating an efficient algorithm
that solves a problem using minimum time and space.
Algorithm of factorial of a
number
Algorithm? (Exercise: write the step-by-step algorithm.)
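One possible sketch of the factorial algorithm in Python (the function name and iterative approach are my own choices, since the slide leaves the algorithm as an exercise):

```python
def factorial(n):
    """Compute n! by starting from 1 and multiplying
    by every integer from 2 up to n."""
    result = 1
    for i in range(2, n + 1):
        result *= i
    return result

print(factorial(5))  # 120
```

A recursive version (`n! = n * (n-1)!`) is equally valid; the iterative form avoids recursion stack space, which becomes relevant in the space-complexity discussion later.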
Contd…
Efficiency of an algorithm can be analyzed at two different stages, before implementation and after
implementation. They are the following −
A Priori Analysis − This is a theoretical analysis of an algorithm. Efficiency of an algorithm is measured
by assuming that all other factors, for example processor speed, are constant and have no effect on
the implementation.
A Posteriori Analysis − This is an empirical analysis of an algorithm. The selected algorithm is
implemented in a programming language and then executed on a target machine. In this
analysis, actual statistics such as running time and space required are collected.
• In industry, we often cannot perform a posteriori analysis, as software is generally built for
anonymous users who run it on systems different from those used during development.
• This is why a priori analysis uses asymptotic notations to determine time and space complexity:
actual time and space change from computer to computer, but asymptotically they are the same.
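As a toy illustration of a posteriori analysis, one might time an implemented algorithm directly on the current machine (the helper name `measure` is my own):

```python
import time

def measure(func, arg):
    """A posteriori analysis: run the implemented algorithm
    and record the actual wall-clock time on this machine."""
    start = time.perf_counter()
    func(arg)
    return time.perf_counter() - start

# The same algorithm measured on two input sizes; the absolute
# numbers are machine-specific, which is exactly why a priori
# analysis prefers asymptotic notation.
t_small = measure(sorted, list(range(1_000)))
t_large = measure(sorted, list(range(100_000)))
print(t_small, t_large)
```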
Algorithm Complexity
• Suppose X is an algorithm and n is the size of input data, the time and
space used by the algorithm X are the two main factors, which decide
the efficiency of X.
• Time Factor − Time is measured by counting the number of key
operations such as comparisons in the sorting algorithm.
• Space Factor − Space is measured by counting the maximum memory
space required by the algorithm.
• The complexity of an algorithm f(n) gives the running time and/or the
storage space required by the algorithm in terms of n as the size of
input data.
Space complexity:
• Space complexity of an algorithm represents the amount of memory space
required by the algorithm in its life cycle. The space required by an algorithm
is equal to the sum of the following two components −
• A fixed part, that is, the space required to store certain data and variables that
are independent of the size of the problem. For example, simple variables and
constants used, program size, etc.
• A variable part, that is, the space required by variables whose size depends on the
size of the problem. For example, dynamic memory allocation, recursion stack
space, etc.
Space complexity:
• The space requirement S(P) of any algorithm P is given by the equation
S(P) = C + Sₚ(I). In this equation:
• S(P): denotes the total space complexity of an algorithm P.
• C: is a fixed part of the space that is independent of the input
size. This includes space for the program code, fixed-size
variables, and constants.
• Sₚ(I): is the variable part of the space, which depends on the
specific input instance (I) of the problem. This includes space
for dynamically allocated memory, recursion stack, and other
data structures whose size varies with the input.
Following is a simple example that tries to explain the concept −
• Algorithm: SUM(A, B)
• Step 1 - START
• Step 2 - C ← A + B + 10
• Step 3 - Stop
Here we have three variables A, B, and C, and one constant. Hence
S(P) = 1 + 3. Now, the actual space depends on the data types of the given
variables and constants, and the count is multiplied by each type's size accordingly.
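The pseudocode above translates directly to Python (a sketch; variable names follow the slide):

```python
def SUM(A, B):
    # Three variables (A, B, C) and one constant (10), so the
    # space requirement is fixed: S(P) = 1 + 3, i.e. O(1) space
    # regardless of the magnitude of the inputs.
    C = A + B + 10
    return C

print(SUM(5, 7))  # 22
```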
Time Complexity:
• Time complexity of an algorithm represents the amount of time
required by the algorithm to run to completion. Time requirements
can be defined as a numerical function T(n), where T(n) can be
measured as the number of steps, provided each step consumes
constant time.
• For example, addition of two n-bit integers takes n steps.
Consequently, the total computational time is
T(n) = c ∗ n
where c is the time taken for the addition of two bits. Here, we observe
that T(n) grows linearly as the input size increases.
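The linear growth T(n) = c ∗ n can be illustrated by counting constant-time steps explicitly (the step counter is my own device for illustration):

```python
def count_steps(n):
    """Count one 'step' per loop iteration; assuming each step
    takes constant time c, the total time is T(n) = c * n."""
    steps = 0
    for _ in range(n):
        steps += 1  # one constant-time operation
    return steps

# Doubling n doubles the step count: linear growth.
print(count_steps(10), count_steps(20))  # 10 20
```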
Asymptotic Analysis:
What is it?
• Asymptotic analysis is used for
analyzing algorithms in terms of time
and space. We evaluate the
performance of an algorithm with
respect to its input size (n).
• We do not compute the actual
time taken or memory occupied by an
algorithm; we just analyze how the
time or space taken by the algorithm
changes as the input size increases.
• The main idea of asymptotic analysis is
to have a measure of algorithm
efficiency that does not depend on
machine-specific constants and does not
require algorithms to be implemented
and the running times of programs to be
compared.
Worst, Average and Best
cases…
We have three cases in which to analyze an algorithm:
1. Worst case (usually done) − The maximum time taken by the algorithm to solve a
particular problem. In worst-case analysis, we calculate an upper bound
on the running time of an algorithm. We must know the case that causes the
maximum number of operations to be executed.
2. Average case (sometimes done) − The average time taken by the algorithm to solve a
particular problem. In average-case analysis, we take all possible inputs
and calculate the computing time for each input. We sum all the calculated
values and divide the sum by the total number of inputs. We must know (or
predict) the distribution of cases.
3. Best case (bogus) − The best/minimum time taken by the algorithm to solve a
particular problem. In best-case analysis, we calculate a lower bound on
the running time of an algorithm. We must know the case that causes the
minimum number of operations to be executed.
Let us take an example of Linear Search.
• Worst Case: When the element to be
searched is not present in the array. When x
is not present, the search() function
compares it with all the elements of arr[] one
by one. Therefore, the worst-case time
complexity of linear search would be Θ(n).
• Average Case: Let us assume that all cases
are uniformly distributed (including the case
of x not being present in the array). We sum
the costs of all the cases and divide the sum
by (n+1), which again gives Θ(n).
• Best Case: When the element is present at the
first location. The number of operations in
the best case is constant (not dependent on
n). So the time complexity in the best case would
be Θ(1).
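A minimal sketch of linear search in Python, annotated with the three cases (the function name follows the slide's search()):

```python
def search(arr, x):
    """Linear search: compare x with each element of arr in turn.
    Returns the index of x, or -1 if x is not present."""
    for i, value in enumerate(arr):
        if value == x:
            return i
    return -1

arr = [3, 7, 1, 9, 5]
print(search(arr, 3))  # best case: first position, Θ(1)
print(search(arr, 4))  # worst case: absent, all n comparisons, Θ(n)
```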
Asymptotic
Notations
• Asymptotic notations are mathematical tools to represent
time complexity of algorithms for asymptotic analysis.
• Asymptotic Notation is used to describe the running time of an
algorithm - how much time an algorithm takes with a given input, n.
• This is also known as an algorithm's growth rate. Does the algorithm
suddenly become incredibly slow when the input size grows?
• The following 3 asymptotic notations are mostly used to
represent time complexity of algorithms.
O − Big Oh
Ω − Big omega
θ − Big theta
O − Big Oh
Notation
• Asymptotic Upper Bound
• ‘O’ (Big Oh) is the most commonly used notation.
(Worst Case)
• The Big O notation defines an upper bound
of an algorithm, it bounds a function only
from above.
• A function f(n) is said to be of the order
of g(n), written f(n) = O(g(n)), if there exist
a positive integer n0 and a positive
constant c such that
f(n) ≤ c.g(n) for all n > n0
• Hence, function g(n) is an upper bound for
function f(n): g(n) grows at least as fast as f(n).
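The definition can be checked numerically. Here f(n) = 3n + 2 with g(n) = n, and the witnesses c = 4, n0 = 2 are my own illustrative choices:

```python
# Big-O definition check: f(n) <= c * g(n) for all n > n0.
f = lambda n: 3 * n + 2
g = lambda n: n
c, n0 = 4, 2

# 3n + 2 <= 4n holds exactly when n >= 2, so every n > n0 passes.
print(all(f(n) <= c * g(n) for n in range(n0 + 1, 1000)))  # True
```

One valid pair (c, n0) is enough; the definition does not ask for the smallest possible constants.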
Ω − Big omega
• Asymptotic Lower Bound
• Ω Notation is the least used notation
because best-case performance of an
algorithm is generally not useful(Best
Case)
• The Ω notation defines a lower bound of an
algorithm, it bounds a function only from
below.
• We say that f(n) = Ω(g(n)) when there
exists a constant c such that
f(n) ≥ c.g(n)
for all sufficiently large values of n. Here n is a
positive integer. This means function g is a
lower bound for function f, after a certain value of n.
• It may be noted that an
algorithm that takes
O(n) time is better than
one that takes O(n²) time.
Think
Upon..
θ − Big theta
• It bounds a function from above
and below, so it defines exact
asymptotic behavior. (Average
Case)
• We say that f(n) = θ(g(n)) when there
exist constants c1 and c2 such that
c1.g(n) ≤ f(n) ≤ c2.g(n)
for all sufficiently large values of n. Here n
is a positive integer. This means function g
is a tight bound for function f, after a certain
value of n.
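As with Big O, the Θ definition can be verified numerically for f(n) = 3n + 2 with g(n) = n; the constants c1 = 3, c2 = 4 are my own illustrative choices:

```python
# Theta definition check: c1*g(n) <= f(n) <= c2*g(n) for large n.
f = lambda n: 3 * n + 2
g = lambda n: n
c1, c2 = 3, 4

# 3n <= 3n + 2 always holds, and 3n + 2 <= 4n holds for n >= 2,
# so the sandwich holds for every n >= 2.
print(all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(2, 1000)))  # True
```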
Try….
• A Queue data structure takes …………runtime to retrieve the first item
in a Queue.
• A Stack data structure takes ………..runtime to retrieve the first value
in a Stack.
• A Queue data structure
is based on First In First
Out order. It takes O(1)
runtime to retrieve the
first item in a Queue.
• A Stack data structure is
based on Last In First
Out order. Therefore, it
takes O(N) runtime to
retrieve the first value
pushed onto a Stack, because
it is all the way at the bottom.
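The contrast above can be sketched with Python's built-in structures (using `collections.deque` as the queue, a common choice; the variable names are my own):

```python
from collections import deque

# Queue (FIFO): the first item inserted sits at the front,
# so retrieving it is a single O(1) operation.
queue = deque([1, 2, 3])
first_in_queue = queue.popleft()

# Stack (LIFO): the first item inserted sits at the bottom,
# so reaching it means popping everything above it: O(N).
stack = [1, 2, 3]
while len(stack) > 1:
    stack.pop()
first_in_stack = stack.pop()

print(first_in_queue, first_in_stack)  # 1 1
```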
Some common Asymptotic Analysis