Asymptotic Notations


The efficiency of an algorithm is determined by the time, storage, and other resources needed to run it. This efficiency is typically measured using asymptotic notations.

An algorithm's performance can vary depending on the input type. As the input size grows, the algorithm’s performance also changes.

Asymptotic analysis studies how an algorithm's performance grows with the order of its input size.


Asymptotic Notations:

Asymptotic notations are mathematical tools used to describe an algorithm's running time as the input size approaches a specific or limiting value.

For example, in insertion sort, if the input array is already sorted, the algorithm’s running time is linear, which represents the best-case scenario.

However, if the input array is in reverse order, the algorithm requires the maximum, quadratic, time to sort the elements, representing the worst-case scenario. When the input array is neither sorted nor in reverse order, the running time falls somewhere in between, which is the average case.
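The best- and worst-case behavior described above can be observed directly by counting comparisons. Below is a minimal sketch of insertion sort in Python with a comparison counter (the counter and function names are ours, not from the text):

```python
# Insertion sort with a comparison counter, to contrast best and worst cases.
def insertion_sort(arr):
    a = list(arr)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1          # one comparison of key against a[j]
            if a[j] > key:
                a[j + 1] = a[j]       # shift the larger element right
                j -= 1
            else:
                break
        a[j + 1] = key
    return a, comparisons

n = 100
_, best = insertion_sort(list(range(n)))         # already sorted input
_, worst = insertion_sort(list(range(n, 0, -1))) # reverse-sorted input
print(best)   # n - 1 = 99 comparisons: linear
print(worst)  # n(n - 1)/2 = 4950 comparisons: quadratic
```

A sorted input triggers exactly one comparison per element (linear), while a reversed input forces every element to be compared against all previously placed elements (quadratic), matching the best- and worst-case scenarios above.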


There are mainly three asymptotic notations:

Big-O notation (O)

Omega notation (Ω)

Theta notation (Θ)


1. Big-O notation (O):

Big-O notation represents the upper bound of the running time of an algorithm. Thus, it gives the worst-case complexity of an algorithm.

A function f(n) is said to be of the order of g(n), written f(n) = O(g(n)), if there exist a positive constant c and a positive integer n0 such that

f(n) ≤ c·g(n) for all n ≥ n0.

Hence, function g(n) is an upper bound for function f(n): from n0 onward, c·g(n) grows at least as fast as f(n).

Example:

Let’s consider a function, f(n) = 3n² + 7n + 2

Now, if we take g(n) = n²

we can say f(n) ≤ 4g(n) for all values of n ≥ 8

Thus, the complexity of f(n) can be represented as O(g(n)), i.e., O(n²).
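A quick numeric check makes the choice of constants concrete. The sketch below (function names f and g mirror the text) finds the smallest n from which the bound with c = 4 holds:

```python
def f(n):
    return 3 * n**2 + 7 * n + 2

def g(n):
    return n**2

# 4*g(n) - f(n) = n**2 - 7*n - 2 is increasing for n >= 4,
# so once the bound holds, it keeps holding for larger n.
n0 = next(n for n in range(1, 100) if f(n) <= 4 * g(n))
print(n0)  # 8
```

Note that the pair (c, n0) is not unique: a larger constant such as c = 7 would make the bound hold from a smaller n0, and any valid pair witnesses f(n) = O(n²).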


2. Omega notation (Ω): Omega notation represents the lower bound of the running time of an algorithm. Thus, it provides the best-case complexity of an algorithm.

We say that f(n) = Ω(g(n)) when there exists a positive constant c such that

f(n) ≥ c·g(n)

for all sufficiently large values of n, where n is a positive integer. This means function g is a lower bound for function f; after a certain value of n, f will never go below c·g.


Example:

Let’s consider a function, f(n) = 5n² + 8n + 3

Now, if we take g(n) = n²

we can say f(n) ≥ 5g(n) for all values of n > 0

Thus, the complexity of f(n) can be represented as Ω(g(n)), i.e., Ω(n²).
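This lower bound is easy to verify numerically, since f(n) − 5g(n) = 8n + 3 is positive for every positive n. A minimal Python check (a sketch, with f and g mirroring the text):

```python
def f(n):
    return 5 * n**2 + 8 * n + 3

def g(n):
    return n**2

# f(n) - 5*g(n) = 8*n + 3, which is positive for every n >= 1,
# so the bound holds with c = 5 from n = 1 onward.
ok = all(f(n) >= 5 * g(n) for n in range(1, 10_000))
print(ok)  # True
```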


3. Theta notation (Θ):

Theta notation encloses the function from above and below. Since it represents the upper and the lower bound of the running time of an algorithm, it is used for analyzing the average-case complexity of an algorithm.

We say that f(n) = Θ(g(n)) when there exist positive constants c1 and c2 such that

c1·g(n) ≤ f(n) ≤ c2·g(n)

for all sufficiently large values of n, where n is a positive integer. This means function g is a tight bound for function f.



Example:

Let’s consider a function, f(n) = 6n² + 9n + 4

Now, if we take g(n) = n²

we can say 6g(n) ≤ f(n) ≤ 7g(n) for all values of n ≥ 10

Thus, the complexity of f(n) can be represented as Θ(g(n)), i.e., Θ(n²).
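Both sides of the Θ bound can be checked at once. The sketch below (f and g as in the text) finds where the upper bound with c2 = 7 starts holding; the lower bound with c1 = 6 holds for every n:

```python
def f(n):
    return 6 * n**2 + 9 * n + 4

def g(n):
    return n**2

# Lower bound: f(n) - 6*g(n) = 9*n + 4 > 0 for all n >= 0.
# Upper bound: 7*g(n) - f(n) = n**2 - 9*n - 4 first becomes
# non-negative at n = 10 and keeps growing after that.
n0 = next(n for n in range(1, 100) if f(n) <= 7 * g(n))
ok = all(6 * g(n) <= f(n) <= 7 * g(n) for n in range(n0, 10_000))
print(n0, ok)  # 10 True
```

Since f(n) is sandwiched between constant multiples of n² from n = 10 onward, it is simultaneously O(n²) and Ω(n²), which is exactly what Θ(n²) asserts.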