Asymptotic Notations

Description of Asymptotic Notations.

Asymptotic Notations: Big O, Big Omega and Big Theta Explained

Asymptotic notation helps us understand the efficiency of an algorithm by comparing it to other algorithms. It provides a way to express the algorithm's performance in terms of its growth rate as the input size increases. This comparison allows us to determine which algorithm is more efficient in handling larger datasets.

Now let's look at the mathematical definitions behind 'order of.' There are three widely used asymptotic notations:

  1. Big O notation ( O )
  2. Big Omega notation ( Ω )
  3. Big Theta notation ( Θ ) – the most widely used

Big O notation ( O ):

  • Big O notation is used to describe an asymptotic upper bound.
  • Mathematically, if f(n) describes the running time of an algorithm, then f(n) is O(g(n)) if and only if there exist positive constants c and n₀ such that:
0 ≤ f(n) ≤ c g(n)        for all n ≥ n₀.
  • Here, n is the input size and g(n) is any complexity function, e.g. n, n², etc. (Big O is used to give an upper bound on a function.)
  • If a function is O(n), it is automatically O(n²) as well, because f(n) ≤ c n implies f(n) ≤ c n² for all n ≥ 1.
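The definition above can be spot-checked numerically. Below is a small Python sketch (the function choices and constants are illustrative, not from the text) that tests whether f(n) = 3n + 2 stays below c·g(n) with c = 5 and n₀ = 1:

```python
# Numeric sanity check of the Big O definition:
# f(n) is O(g(n)) if f(n) <= c * g(n) for all n >= n0.
# Here f(n) = 3n + 2 and g(n) = n.

def f(n):
    return 3 * n + 2

def g(n):
    return n

def satisfies_big_o(f, g, c, n0, n_max=10_000):
    """Check f(n) <= c * g(n) for n0 <= n <= n_max.
    A finite spot check, not a proof."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

print(satisfies_big_o(f, g, c=5, n0=1))  # True: 3n + 2 <= 5n for n >= 1
print(satisfies_big_o(f, g, c=2, n0=1))  # False: this constant is too small
```

Note that the check only inspects finitely many values of n; a real proof still requires the algebra (here, 3n + 2 ≤ 5n reduces to n ≥ 1).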

Graphic example for Big O ( O ):

[Figure: c g(n) upper-bounds f(n) for all n ≥ n₀]

Big Omega Notation ( Ω ):

  • Big O notation provides an asymptotic upper bound on an algorithm's performance, while Omega (Ω) notation offers an asymptotic lower bound. Together, they give us a clearer picture of an algorithm's efficiency in different scenarios.
  • Let f(n) define the running time of an algorithm; f(n) is said to be Ω(g(n)) if and only if there exist positive constants c and n₀ such that:
0 ≤ c g(n) ≤ f(n)        for all n ≥ n₀.
  • It is used to give a lower bound on a function.
  • If a function is Ω(n²), it is automatically Ω(n) as well, because f(n) ≥ c n² implies f(n) ≥ c n for all n ≥ 1.
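The lower-bound definition can be spot-checked the same way. A minimal Python sketch, assuming the illustrative choices f(n) = n² and g(n) = n with c = 1 and n₀ = 1:

```python
# Numeric sanity check of the Big Omega definition:
# f(n) is Omega(g(n)) if 0 <= c * g(n) <= f(n) for all n >= n0.
# Here f(n) = n^2 and g(n) = n.

def f(n):
    return n * n

def g(n):
    return n

def satisfies_big_omega(f, g, c, n0, n_max=10_000):
    """Check c * g(n) <= f(n) for n0 <= n <= n_max (finite spot check)."""
    return all(c * g(n) <= f(n) for n in range(n0, n_max + 1))

print(satisfies_big_omega(f, g, c=1, n0=1))  # True: n <= n^2 for n >= 1
```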

Graphic example for Big Omega (Ω):

[Figure: c g(n) lower-bounds f(n) for all n ≥ n₀]

Big Theta notation ( Θ ):

  • Let f(n) define the running time of an algorithm.
  • f(n) is said to be Θ(g(n)) if and only if f(n) is O(g(n)) and f(n) is Ω(g(n)). Mathematically, there exist positive constants c1, c2 and n₀ such that:
0 ≤ f(n) ≤ c1 g(n)        for all n ≥ n₀
0 ≤ c2 g(n) ≤ f(n)        for all n ≥ n₀

Merging both the equations, we get:

 0 ≤ c2 g(n) ≤ f(n) ≤ c1 g(n)       ∀ n ≥ n₀.

The equation means that there are positive constants c1 and c2 such that the function f(n) is bounded between c2 g(n) and c1 g(n). In other words, f(n) grows at the same rate as g(n) within these constants, providing a precise comparison of their growth rates.
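The sandwich inequality can also be spot-checked numerically. A minimal Python sketch, assuming the illustrative choices f(n) = 2n + 3, g(n) = n, c1 = 5, c2 = 2, and n₀ = 1:

```python
# Numeric sanity check of the Big Theta sandwich:
# c2 * g(n) <= f(n) <= c1 * g(n) for all n >= n0.
# Here f(n) = 2n + 3 and g(n) = n.

def f(n):
    return 2 * n + 3

def g(n):
    return n

def satisfies_big_theta(f, g, c1, c2, n0, n_max=10_000):
    """Check c2 * g(n) <= f(n) <= c1 * g(n) for n0 <= n <= n_max."""
    return all(c2 * g(n) <= f(n) <= c1 * g(n)
               for n in range(n0, n_max + 1))

# 2n <= 2n + 3 always holds, and 2n + 3 <= 5n reduces to n >= 1.
print(satisfies_big_theta(f, g, c1=5, c2=2, n0=1))  # True
```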

Graphic example of Big Theta ( Θ ):

[Figure: f(n) sandwiched between c2 g(n) and c1 g(n) for n ≥ n₀]

Which one of these to use?

Big Theta (Θ) provides a more accurate representation of an algorithm's run time, which is why most interviewers expect you to respond with Big Theta when they ask about the "order of" an algorithm. When you express the time complexity in Big Theta, it inherently includes both Big O and Big Omega. For this reason, it is recommended to use Big Theta for clarity and completeness.

Quick Quiz: Prove that n² + n + 1 is O(n³), Ω(n²), and Θ(n²) using the respective definitions.

Hint: You can approach this both graphically, by sketching rough graphs, and mathematically, by finding valid constants.
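As a starting point for the mathematical route, here is a small Python spot check of some candidate constants for the quiz function (it only tests finitely many n, so it guides the proof rather than replacing it):

```python
# Spot-check candidate constants for f(n) = n^2 + n + 1.
# The constants below (3, 1, n0 = 1) are candidates to try, not givens.

def f(n):
    return n * n + n + 1

def holds_for_all(pred, n0, n_max=10_000):
    """Check that pred(n) holds for every n in [n0, n_max]."""
    return all(pred(n) for n in range(n0, n_max + 1))

# O(n^3): try c = 3, n0 = 1
print(holds_for_all(lambda n: f(n) <= 3 * n**3, n0=1))            # True
# Omega(n^2): try c = 1, n0 = 1
print(holds_for_all(lambda n: n**2 <= f(n), n0=1))                # True
# Theta(n^2): try c2 = 1, c1 = 3, n0 = 1
print(holds_for_all(lambda n: n**2 <= f(n) <= 3 * n**2, n0=1))    # True
```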

Increasing order of common runtimes: Below are some common runtimes you will come across in your coding career, from slowest-growing to fastest-growing:

O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(n³) < O(2ⁿ) < O(n!)
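To get a feel for how quickly these runtimes diverge, here is a small Python sketch (the sampled input sizes are arbitrary) that evaluates a few of the growth functions at increasing n:

```python
import math

# Evaluate common runtime functions at growing input sizes to see
# how quickly they pull apart.
runtimes = [
    ("O(1)",       lambda n: 1),
    ("O(log n)",   lambda n: math.log2(n)),
    ("O(n)",       lambda n: n),
    ("O(n log n)", lambda n: n * math.log2(n)),
    ("O(n^2)",     lambda n: n ** 2),
    ("O(2^n)",     lambda n: 2 ** n),
]

for name, fn in runtimes:
    values = [round(fn(n)) for n in (8, 16, 32)]
    print(f"{name:>10}: {values}")
```

Even at n = 32 the exponential term is already in the billions while n log n is still in the hundreds, which is why the ordering above matters so much for large inputs.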

So, this was all about asymptotic notations. We’ll come across them often in later topics, so there's no need to worry if they still feel abstract right now.