Understanding Time Complexity

We hear programmers and computer scientists talk about the time complexity of an algorithm every now and then. This article gently introduces you to time complexity and asymptotic analysis, usually expressed with Big O notation.

Understanding the basics

Every piece of code you write is ultimately translated into machine instructions that the computer executes.

For example, let us look at the simple code snippet below:

for (i = 0; i <= n; i++) {   /* initialize i, compare i to n, increment i */
    m = 5;                   /* assign 5 to m on every pass */
}

From it, we can identify the kinds of operations your computer would perform:

  • Assignment operation (assigning a value to i or m)
  • Arithmetic operation (incrementing i)
  • Comparison operation (comparing i to n)

So let us go through the actual work the computer would do to execute the program above:

  • Assign 0 to i
  • Compare i to n
  • Continue into the loop body if i <= n
  • Assign 5 to m
  • Increment i by 1

This process repeats until i > n.

Now, treating each of those five steps as a single instruction and charging all five to every pass through the loop (a simplification, since assigning 0 to i really happens only once), we can count the work. If n = 0, the loop makes one pass and the computer executes about 5 instructions.

If n = 1, the loop makes two passes, so the computer executes about 5 * 2 = 10 instructions.

If n = 2, three passes: about 5 * 3 = 15 instructions.

In general the loop makes n + 1 passes, so the number of instructions executed, as a function of n, is f(n) = 5(n + 1).
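
Here is a minimal sketch in C of that bookkeeping (the ops counter and the five-operations-per-pass charge are assumptions from the simplified model above, not something the hardware actually reports):

#include <stdio.h>

int main(void) {
    /* Tally operations for n = 0, 1 and 2 under the simplified
       model above: charge five operations to every pass. */
    for (long n = 0; n <= 2; n++) {
        long ops = 0;
        for (long i = 0; i <= n; i++) {
            int m = 5;   /* the loop body: one assignment */
            (void)m;     /* keep the compiler quiet about the unused m */
            ops += 5;    /* one pass = five operations in this model */
        }
        printf("n = %ld -> %ld instructions (f(n) = %ld)\n",
               n, ops, 5 * (n + 1));
    }
    return 0;
}

For n = 0, 1 and 2 it prints 5, 10 and 15, matching the counts above.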

Asymptotic Analysis (Big O Notation)

Asymptotic analysis (Big O) describes how the function above behaves as n approaches infinity, that is, as n grows very large. Suppose n is 100,000,000. The computer would then perform 5(100,000,000 + 1) operations. At that scale the smaller term hardly matters: 100,000,001 is approximately 100,000,000, so we can drop the + 1. Big O also discards constant factors such as the 5, because multiplying by a constant changes how steep the line is but not its overall shape. We are left with f(n) = n, so this algorithm runs in linear time as n grows: O(n).
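
As a quick sanity check, here is a small sketch (still using the assumed five-operations-per-pass cost model) that prints f(n) = 5(n + 1) next to n as n grows. The ratio f(n)/n settles near the constant 5, which is exactly why constant factors do not affect the growth rate:

#include <stdio.h>

int main(void) {
    /* As n grows, f(n) = 5(n + 1) stays a fixed multiple of n:
       the + 1 fades away and the factor 5 never changes the shape. */
    for (long n = 10; n <= 100000000; n *= 100) {
        long f = 5 * (n + 1);
        printf("n = %9ld   f(n) = %10ld   f(n)/n = %.7f\n",
               n, f, (double)f / (double)n);
    }
    return 0;
}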

Other asymptotic functions come out of the same process (the loop shapes behind them are sketched in code below):

f(n) = n² gives quadratic time, O(n²)

f(n) = log n gives logarithmic time, O(log n)

and so on….
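
As a rough illustration, and with the same simplified counting, the loop shapes behind those growth rates look something like this (the loops here are hypothetical examples, not taken from any particular algorithm):

#include <stdio.h>

int main(void) {
    long n = 16;
    long ops;

    /* Quadratic, O(n^2): a loop nested inside a loop runs
       the inner body n times for each of the n outer passes. */
    ops = 0;
    for (long i = 0; i < n; i++)
        for (long j = 0; j < n; j++)
            ops++;
    printf("quadratic:   n = %ld -> %ld operations\n", n, ops);

    /* Logarithmic, O(log n): halving the remaining range on
       every pass finishes after about log2(n) passes. */
    ops = 0;
    for (long i = n; i > 1; i /= 2)
        ops++;
    printf("logarithmic: n = %ld -> %ld operations\n", n, ops);

    return 0;
}

With n = 16, the nested loop does 16 * 16 = 256 units of work, while the halving loop finishes in just 4 passes, since log2(16) = 4.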