What are asymptotic notations for complexity analysis?
A problem may have numerous algorithmic solutions. In order to choose the best algorithm for a particular task, you need to be able to judge how long a particular solution will take to run. Or, more accurately, you need to be able to judge how long two solutions will take to run, and choose the better of the two. You don't need to know how many minutes and seconds they will take, but you do need some way to compare algorithms against one another.
Asymptotic notations are used for exactly this, i.e. they are used for comparing the growth of the functions that describe two different algorithms.
Let us suppose I write f(n) = W(g(n)), where W is an asymptotic notation. This expression describes how the function f(n) grows in comparison to another function g(n).
What are the most common Asymptotic Notations used in Complexity Analysis?
The most common asymptotic notations used for time complexity analysis are Big Oh (O), Big Theta (Θ), Big Omega (Ω), Small Oh (o), and Small Omega (ω). Let us discuss them one by one.
How to describe Big Oh (O)?
If the run time of an algorithm is O(g(n)), it means that the running time of the algorithm (as n gets larger) is at most proportional to g(n). Hence it gives an upper bound on the number of basic operations to be performed.
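To make "at most proportional to g(n)" concrete, here is a minimal sketch in Python. The `linear_search` function and its operation counter are illustrative assumptions (not from the text above); the point is that the number of comparisons never exceeds 1 * n, so the run time is O(n):

```python
def linear_search(arr, target):
    """Return (index of target or -1, number of comparisons made)."""
    comparisons = 0
    for i, value in enumerate(arr):
        comparisons += 1          # one basic operation per element examined
        if value == target:
            return i, comparisons
    return -1, comparisons

# Worst case: target is absent, so every element is examined.
# comparisons == len(arr), i.e. the count is bounded by 1 * n => O(n).
idx, ops = linear_search([4, 8, 15, 16, 23, 42], 99)
print(idx, ops)  # -1 6
```

The worst-case count grows linearly with the input size, which is what the upper bound O(n) captures.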
Mathematically, a function f(n) is equal to Big Oh of another function g(n), i.e. f(n) = O(g(n)) is true, if and only if there exist two constants C1 and C2 such that
a) C1 and C2 are always positive, and
b) 0 <= f(n) <= C1*g(n) for every value of n >= C2.
Let us work through an example with f(n) being 2n^2 and g(n) being n^3.
Let us check whether 2n^2 = O(n^3) is true or not.
If we take C1 = 1 and C2 = 2 (both positive, maintaining the condition described in a), then for every n >= C2 we have 2n^2 <= 1*n^3, since 2n^2 <= n^3 is equivalent to 2 <= n. For instance, at n = 3 the relation holds:
0 <= 2*3^2 = 18 <= 1*3^3 = 27.
Hence 2n^2 = O(n^3) is true for C1 = 1 and C2 = 2.