When we look at input sizes large enough that only the order of growth of the running time matters, we are studying the asymptotic efficiency of the algorithm. In this blog, I'll give a short description of asymptotic notation.

## Asymptotic Notation

Asymptotic notation is used to describe the running time of an algorithm. It expresses the order of growth of a function, describing an algorithm's efficiency and performance in a meaningful way. It also describes the behaviour of time and space complexity for large input sizes.

##### A. Time Complexity

The time complexity of a program is a measure of how fast the time taken by the program grows as the input size increases.

##### B. Space Complexity

The space complexity of a program is a measure of the amount of memory the program requires as the input size increases.
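As a small illustration (my own example, not from the text), summing a list takes time proportional to the input size but only a constant amount of extra memory:

```python
def list_sum(values):
    # One pass over the input: the time taken grows linearly with
    # len(values) (linear time complexity), while `total` is the only
    # extra storage, so the extra space stays constant (constant space
    # complexity).
    total = 0
    for v in values:
        total += v
    return total

print(list_sum(range(10)))  # -> 45
```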

The asymptotic running time of an algorithm is defined in terms of three notations:

1. Big-O notation
2. Big-omega notation
3. Theta notation

#### 1. Big-O Notation

- It is a formal way of expressing an upper bound on an algorithm's running time.
- It measures the longest time the algorithm could possibly take to complete.
- It is denoted by O.

f(n) = O(g(n)) if and only if there exist positive constants c and n0 such that

f(n) <= c*g(n) for all n >= n0.
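The definition can be checked numerically for a concrete pair of functions. Here I use f(n) = 3n + 2 and g(n) = n (my own example, not from the text); with the witness constants c = 4 and n0 = 2, the inequality f(n) <= c*g(n) holds for every n >= n0, so f(n) = O(n):

```python
def f(n):
    return 3 * n + 2  # example function whose growth we bound

def g(n):
    return n  # candidate upper-bound function

# Witness constants for the Big-O definition: f(n) <= c*g(n) for n >= n0.
c, n0 = 4, 2

# Spot-check the inequality over a range of n (a numeric sanity check,
# not a proof).
print(all(f(n) <= c * g(n) for n in range(n0, 1000)))  # -> True
```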

#### 2. Big-Omega Notation

- It provides an asymptotic lower bound.
- It describes the best case that can happen for a given input size.
- It is denoted by Ω.

For non-negative functions f(n) and g(n), if there exist an integer n0 and a constant c > 0 such that for all integers n >= n0, f(n) >= c*g(n), then f(n) is big-omega of g(n), denoted by

f(n) = Ω(g(n))
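The same numeric sanity check works for the lower bound. Using f(n) = 3n + 2 and g(n) = n again (my own example), the witnesses c = 3 and n0 = 1 satisfy f(n) >= c*g(n) for all n >= n0, so f(n) = Ω(n):

```python
def f(n):
    return 3 * n + 2  # example function whose growth we bound

def g(n):
    return n  # candidate lower-bound function

# Witness constants for the Big-Omega definition: f(n) >= c*g(n) for n >= n0.
c, n0 = 3, 1

# Spot-check the inequality over a range of n (a numeric sanity check,
# not a proof).
print(all(f(n) >= c * g(n) for n in range(n0, 1000)))  # -> True
```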

#### 3. Theta Notation

- It provides both a lower bound and an upper bound for the function f.
- It is denoted by Θ.

For non-negative functions f(n) and g(n), if there exist an integer n0 and positive constants c1 and c2 (c1 > 0, c2 > 0) such that for all integers n >= n0,

c1*g(n) <= f(n) <= c2*g(n),

then f(n) is theta of g(n), denoted by

f(n) = Θ(g(n))
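Since Θ combines the two previous bounds, both inequalities can be spot-checked together. Keeping f(n) = 3n + 2 and g(n) = n (my own example), the witnesses c1 = 3, c2 = 4, and n0 = 2 give c1*g(n) <= f(n) <= c2*g(n) for all n >= n0, so f(n) = Θ(n):

```python
def f(n):
    return 3 * n + 2  # example function whose growth we bound

def g(n):
    return n  # candidate tight-bound function

# Witness constants for the Theta definition:
# c1*g(n) <= f(n) <= c2*g(n) for all n >= n0.
c1, c2, n0 = 3, 4, 2

# Spot-check both inequalities over a range of n (a numeric sanity
# check, not a proof).
print(all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 1000)))  # -> True
```

Because f(n) is bounded above and below by constant multiples of the same g(n), Θ pins down the exact order of growth, which is why it is the preferred notation when a tight bound is known.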