Algorithm Analysis: Time and Space Complexity
Space Complexity
- The space complexity of an algorithm or program is the amount of memory it needs to run to completion.
- The space needed by a program consists of the following components:
- Fixed space requirements: Independent of the number and size of the program's input and output. It includes:
- Instruction Space (Space needed to store the code)
- Space for simple variables
- Space for constants
- Variable space requirements: This component consists of:
- Space needed by structured variables whose size depends on the particular instance I of the problem being solved
- Space required when a function uses recursion (one stack frame per active call; see the sketch below)
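A minimal Python sketch of the recursion component (an illustrative example, not from the source; recursive_sum is a hypothetical helper): each active call occupies one stack frame, so the depth of the recursion determines how much stack space is needed.

```python
def recursive_sum(a, i=0):
    # Each active call keeps one stack frame (parameters, return address),
    # so a list of n elements produces n nested frames: the recursion
    # contributes O(n) to the variable space requirement Sp(I).
    if i == len(a):                        # base case: deepest frame
        return 0
    return a[i] + recursive_sum(a, i + 1)  # one extra frame per element
```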
Total Space Complexity
S(P) of a program is:
S(P) = C + Sp(I)
Here Sp(I) is the variable space requirements of program P working on an instance I.
C is a constant representing the fixed space requirements.
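As an illustration (an assumed example, not from the source; copy_list is a hypothetical function), consider copying an n-element list: the fixed space C covers the code and the simple variables, while the output list is a structured variable of n cells, so S(P) = C + n, which is O(n) space.

```python
def copy_list(a):
    b = []           # fixed space C: code, simple variables b (reference) and x
    for x in a:
        b.append(x)  # b grows to n elements: variable space Sp(I) = n
    return b         # total space S(P) = C + n, i.e. O(n)
```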
Time Complexity
The time complexity of an algorithm or a program is the amount of time it needs to run to completion.
T(P) = C + Tp
Here C is the compile time (a constant) and Tp is the run time.
- To calculate the time complexity, we use a method called frequency count, i.e., counting the number of steps each kind of statement contributes (see the sketch after this list):
- Comment – 0 steps
- Assignment statement – 1 step
- Conditional statement – 1 step
- Loop condition for ‘n’ iterations – n + 1 steps (the extra step is the final failing test)
- Body of the loop – n steps
- Return statement – 1 step
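A minimal sketch of a frequency count (an assumed example, not from the source; sum_array is a hypothetical function). Note that the total, 2n + 3, is exactly the f(n) analyzed in the asymptotic-notation examples below.

```python
def sum_array(a):
    total = 0       # assignment statement : 1 step
    for x in a:     # loop condition test  : n + 1 steps
        total += x  # body of the loop     : n steps
    return total    # return statement     : 1 step
                    # frequency count: 1 + (n + 1) + n + 1 = 2n + 3 steps
```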
Algorithm Analysis Cases
When we analyze an algorithm, three cases arise depending on the input data (illustrated by the linear search sketch after this list):
- Best case: The minimum number of steps that can be executed for the given parameters.
- Average case: The average number of steps executed on instances with the given parameters.
- Worst case: The maximum number of steps that can be executed for the given parameters.
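Linear search makes the three cases concrete (an assumed example, not from the source): the key may be found immediately, late, or not at all.

```python
def linear_search(a, key):
    for i, x in enumerate(a):
        if x == key:
            return i  # best case: key at a[0], just 1 comparison
    return -1         # worst case: key absent (or last), n comparisons

# Average case: if the key is equally likely to be at each of the n
# positions, the expected number of comparisons is (n + 1) / 2.
```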
Asymptotic Notation
- The complexity of an algorithm is usually a function of n.
- The behavior of this function is usually expressed in terms of one or more standard functions.
- Expressing the complexity function with reference to other known functions is called asymptotic complexity.
- Three basic notations are used to express the asymptotic complexity:
Big-Oh Notation (O)
- A formal method of expressing the upper bound of an algorithm’s running time.
- i.e. it is a measure of the longest amount of time it could possibly take for an algorithm to complete.
- It is used to represent the worst-case complexity.
- f(n) = O(g(n)) if and only if there are two positive constants c and n0 such that
f(n) ≤ c g(n) for all n ≥ n0.
- Then we say that “f(n) is big-O of g(n)”.
Example: Derive the Big-Oh notation for f(n) = 2n + 3
2n + 3 ≤ 2n + 3n = 5n for all n ≥ 1
Here c = 5 and g(n) = n,
so f(n) = O(n).
Big-Omega Notation (Ω)
- f(n) = Ω(g(n)) if and only if there are two positive constants c and n0 such that
f(n) ≥ c g(n) for all n ≥ n0.
- Then we say that “f(n) is omega of g(n)”.
Example: Derive the Big-Omega notation for f(n) = 2n + 3
2n + 3 ≥ 1·n for all n ≥ 1
Here c = 1 and g(n) = n,
so f(n) = Ω(n).
Big-Theta Notation (Θ)
- f(n) = Θ(g(n)) if and only if there are three positive constants c1, c2 and n0 such that
c1 g(n) ≤ f(n) ≤ c2 g(n) for all n ≥ n0.
- Then we say that “f(n) is theta of g(n)”.
Example: Derive the Big-Theta notation for f(n) = 2n + 3
1·n ≤ 2n + 3 ≤ 5n for all n ≥ 1
Here c1 = 1, c2 = 5, and g(n) = n,
so f(n) = Θ(n).
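A closing sanity check (an assumed snippet, not from the source): the constants derived above can be spot-checked numerically in Python. Since the theta inequality holds, the big-O and big-omega bounds follow from the same constants.

```python
def f(n):
    return 2 * n + 3

c1, c2, n0 = 1, 5, 1   # constants derived in the examples above

# Spot-check c1*g(n) <= f(n) <= c2*g(n) with g(n) = n for n >= n0.
assert all(c1 * n <= f(n) <= c2 * n for n in range(n0, 10_000))
print("1*n <= 2n + 3 <= 5n holds for every tested n >= 1")
```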