# Overview

Big O notation describes how an algorithm's running time grows as the input size $$n$$ becomes large: it ignores behaviour when $$n$$ is small, and ignores coefficients (e.g. an algorithm that grows at $$2n$$ is still $$O(n)$$).

The following complexities are described from best to worst.

# Constant Time

The following algorithm just prints “hello” once, and doesn’t depend on the number of elements (n). It will always run in constant time, and therefore is $$O(1)$$.
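A minimal sketch of such an algorithm (the function name `say_hello` is illustrative):

```python
def say_hello(n):
    # The work done here does not depend on n, so this is O(1).
    print("hello")

say_hello(1000)  # prints "hello" once, no matter how large n is
```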

Note that even though the following algorithm prints out “hello” three times, it still does not depend on n, and runs in constant time. Again, it is still classified as $$O(1)$$.
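A sketch of that algorithm (again, the function name is illustrative):

```python
def say_hello_three_times(n):
    # Three prints is still a constant amount of work,
    # independent of n, so this is also O(1).
    print("hello")
    print("hello")
    print("hello")

say_hello_three_times(1000)  # prints "hello" exactly three times
```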

$$O(1)$$ complexity is the best algorithm complexity you can achieve. Common software operations that have $$O(1)$$ complexity are:

• Reading an element from an array (or vector).
• Reading an element from a hash table (in the average and best case only).
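Both operations can be shown in a few lines of Python, where a list stands in for an array and a dict for a hash table (the variable names are illustrative):

```python
data = [10, 20, 30, 40]          # array (a Python list)
table = {"apple": 1, "pear": 2}  # hash table (a Python dict)

x = data[2]        # indexing into an array: O(1)
y = table["pear"]  # hash table lookup: O(1) on average
```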

# log(n) Time

Note that whenever we are talking about software algorithms with $$O(log(n))$$ complexity, we are really talking about $$O(log_2(n))$$ complexity. However, since logarithms of different bases differ only by a constant factor (which Big O drops), the base is usually omitted.

Notice the i = i * 2, which makes it run in $$O(log(n))$$ time.
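The loop with the i = i * 2 step might look like this (a sketch; the function name is illustrative):

```python
def log_time(n):
    # i doubles each pass, so the loop runs about log2(n) times.
    i = 1
    while i < n:
        print("hello")
        i = i * 2

log_time(16)  # prints "hello" 4 times (log2(16) = 4)
```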

Common software operations that have $$O(log(n))$$ complexity are:

• Reading an element from a binary search tree
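Looking up an element in a balanced binary search tree behaves like a binary search over sorted data: each comparison halves the remaining search space. Here is a binary-search sketch over a sorted list (a stand-in for a tree lookup, not a tree implementation):

```python
def binary_search(sorted_list, target):
    # Each iteration halves the search range, giving O(log(n)) lookups.
    low, high = 0, len(sorted_list) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_list[mid] == target:
            return mid
        elif sorted_list[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1  # not found

binary_search([1, 3, 5, 7, 9], 7)  # returns index 3
```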

# Linear Time

Linear time is when an algorithm grows at a rate proportional to the number of elements, $$n$$. A simple for loop has $$O(n)$$ complexity:
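For example (the function name is illustrative):

```python
def linear_time(n):
    # The loop body runs exactly n times, so this is O(n).
    for i in range(n):
        print("hello")

linear_time(5)  # prints "hello" 5 times
```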

Another example:
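A sketch of a loop that only does work on every second value of $$n$$:

```python
def linear_time_half(n):
    # Stepping by 2 means the body runs n/2 times,
    # but 0.5n is still O(n) once the coefficient is dropped.
    for i in range(0, n, 2):
        print("hello")

linear_time_half(10)  # prints "hello" 5 times
```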

Note that even though the above example prints "hello" for every second $$n$$ ($$0.5n$$), it is still said to have $$O(n)$$ complexity (remember that coefficients are dropped).

# nlog(n) Time

$$O(nlog(n))$$ complexity can be thought of as a combination of $$O(n)$$ and $$O(log(n))$$ complexity.

This can be demonstrated by a nested for loop, one having $$O(n)$$ complexity and the other $$O(log(n))$$ complexity.
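A sketch of such a nested loop (the function name is illustrative):

```python
def n_log_n_time(n):
    for i in range(n):   # outer loop runs n times: O(n)
        j = 1
        while j < n:     # inner loop runs about log2(n) times: O(log(n))
            print("hello")
            j = j * 2

n_log_n_time(8)  # prints "hello" 8 * 3 = 24 times
```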

# n^2 Time

$$O(n^2)$$ complexity is proportional to the square of the number of elements $$n$$. This is a bad form of complexity to have, especially when $$n$$ grows large.
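The classic example is two nested loops that each run $$n$$ times (a sketch; the function name is illustrative):

```python
def quadratic_time(n):
    # The inner body runs n * n = n^2 times, so this is O(n^2).
    for i in range(n):
        for j in range(n):
            print("hello")

quadratic_time(4)  # prints "hello" 16 times
```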

Here is another example which has $$O(n^2)$$ complexity.
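A sketch of an $$O(n^2)$$ algorithm performed twice in a row (the function name is illustrative):

```python
def quadratic_time_twice(n):
    # First pass: n^2 iterations.
    for i in range(n):
        for j in range(n):
            print("hello")
    # Second pass: another n^2 iterations.
    # 2n^2 total, but the coefficient is dropped: still O(n^2).
    for i in range(n):
        for j in range(n):
            print("hello")

quadratic_time_twice(3)  # prints "hello" 2 * 9 = 18 times
```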

Note that this is just a 2x repetition of an algorithm with $$O(n^2)$$ complexity. Repeating an algorithm (i.e. performing it twice) does not change the complexity, since $$2n^2$$ is still $$O(n^2)$$.

# n^3 Time

$$O(n^3)$$ complexity is rarely seen in a single software algorithm (but can easily arise when multiple algorithms are combined to solve a problem).
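Following the same pattern as above, three nested loops give $$O(n^3)$$ (a sketch; the function name is illustrative):

```python
def cubic_time(n):
    # The inner body runs n * n * n = n^3 times, so this is O(n^3).
    for i in range(n):
        for j in range(n):
            for k in range(n):
                print("hello")

cubic_time(3)  # prints "hello" 27 times
```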

Naturally, we could keep going forever describing poorer and poorer complexities ($$n^4$$, $$n^5$$, etc.), but these are hardly educational and rarely seen in real software algorithms.

Posted: July 6th, 2017 at 12:11 pm
Last Updated on: July 6th, 2017 at 12:48 pm