Friday, May 29, 2015

Runtime Analysis Notations and Time Complexity of Algorithms

Runtime Analysis
There are two aspects to writing an algorithm: correctness and efficiency. Runtime analysis evaluates the efficiency of an algorithm, that is, how much resources it uses. Sometimes a problem is easy to solve with some algorithm, but if that algorithm is too slow, you are likely to lose the competition.

Time Complexity
Time complexity of an algorithm signifies the total time required by the program to run to completion. It is most commonly expressed using big O notation.

The time complexity of an algorithm quantifies the amount of time taken by an algorithm to run as a function of the length of the input. It is commonly expressed using big O notation, which excludes coefficients and lower-order terms. When expressed this way, the time complexity is said to be described asymptotically, i.e., as the input size goes to infinity. For example, let's see how we simplify 3N + 2 machine instructions down to just O(N).

Why do we remove the 3 and the 2?

We want to know the performance of the algorithm as N becomes large, so consider the two terms 3N and 2 and their relative influence as N grows. Say N is a million.

Then the first term is 3 million while the second term is only 2. For this reason, we drop all but the largest term for large N.

So, now we have gone from 3N + 2 to 3N.

Traditionally, we are interested in performance only up to constant factors, meaning we don't really care if there is a constant-multiple difference in performance when N is large. The unit of 3N is not well defined in the first place anyway, so we can multiply or divide by a constant factor to reach the simplest expression.

So 3N becomes just N.
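To make the 3N + 2 count concrete, here is a small sketch in Python. The function and its per-iteration cost model (three modeled operations per loop iteration plus two for setup and return) are illustrative assumptions, not an exact machine-instruction count:

```python
def count_operations(arr):
    """Sum a list while tracking a rough operation count.

    Models about 3 operations per iteration (loop test, add,
    index increment) plus 2 for setup and return: 3N + 2 total.
    """
    ops = 0
    total = 0
    ops += 1            # initialize total
    for x in arr:
        total += x      # the real work: one addition
        ops += 3        # modeled cost of one iteration
    ops += 1            # return
    return total, ops

_, ops = count_operations(list(range(1000)))
print(ops)  # 3 * 1000 + 2 = 3002
```

Doubling N roughly doubles the count, which is exactly what O(N) captures once the constant 3 and the additive 2 are dropped.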

Types of Notations for Time Complexity
  • Big Oh denotes "fewer than or the same as" <expression> iterations.
  • Big Omega denotes "more than or the same as" <expression> iterations.
  • Big Theta denotes "the same as" <expression> iterations.
  • Little Oh denotes "fewer than" <expression> iterations.
  • Little Omega denotes "more than" <expression> iterations.

Understanding Notations of Time Complexity with Example
  • O(expression) is the set of functions that grow slower than or at the same rate as expression.
  • Omega(expression) is the set of functions that grow faster than or at the same rate as expression.
  • Theta(expression) consists of all the functions that lie in both O(expression) and Omega(expression).
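The formal definition behind these sets is that f is in O(g) when f(n) <= c * g(n) for some constant c and all n beyond some threshold n0. The sketch below checks that inequality numerically over a finite range; the helper name, the constants, and the cutoff n_max are all illustrative choices, and a finite check is a sanity test, not a proof:

```python
def looks_like_big_o(f, g, c, n0, n_max=10_000):
    """Check f(n) <= c * g(n) for every n in [n0, n_max).

    A finite numerical sanity check of the Big O definition,
    not a mathematical proof.
    """
    return all(f(n) <= c * g(n) for n in range(n0, n_max))

f = lambda n: 3 * n**2 + 2 * n + 4

# f is in O(n^2): with c = 4, the inequality holds from n0 = 4 on.
print(looks_like_big_o(f, lambda n: n**2, c=4, n0=4))     # True

# f is NOT in O(n): no fixed c keeps up; c = 100 fails by n = 200.
print(looks_like_big_o(f, lambda n: n, c=100, n0=200))    # False
```

The same check with the roles of f and g swapped would test Omega membership, and passing both tests would place f in Theta(g).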

Suppose you've calculated that an algorithm takes f(n) operations, where,

f(n) = 3*n^2 + 2*n + 4  // n^2 means n squared

Since this polynomial grows at the same rate as n^2, the function f lies in the set Theta(n^2). (It also lies in the sets O(n^2) and Omega(n^2) for the same reason.)

The simplest explanation is that Theta denotes "the same as" the expression. Since f(n) grows at the same rate as n^2, its time complexity is best represented as Theta(n^2).
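One way to see "grows at the same rate" is to watch the ratio f(n) / n^2 as n increases: it settles toward the leading coefficient 3, which is exactly the constant factor that Theta ignores. A minimal sketch:

```python
# Ratio f(n) / n^2 approaches the leading coefficient 3 as n grows,
# showing f and n^2 grow at the same rate up to a constant factor.
f = lambda n: 3 * n**2 + 2 * n + 4

for n in (10, 1_000, 1_000_000):
    print(n, f(n) / n**2)
```

For n = 10 the ratio is 3.24; by n = 1,000,000 it is within a few millionths of 3. The lower-order terms 2n + 4 become negligible, which is why they are dropped from the asymptotic description.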
