We use big-O notation to express the time complexity of algorithms. In this post, I’ll show how to calculate the time complexity of recursive algorithms.

As a reminder, an algorithm is a finite sequence of steps that, when executed in a certain order, solve a problem.

A problem is a pair (E, S) where E is the input and S is the expected output.

Recursion is a very important technique in programming. To design a solution using recursion we need to do three things:

- Identify the recursion stop condition.
- Decompose the original problem into subproblems of smaller instance size.
- Combine the solution of the partial problems into one to give the solution to the original problem.

Our goal, when calculating the time complexity of an algorithm, is to determine how many elementary operations the algorithm performs as a function of the input size.

When recursion is involved, this becomes more complex than in iterative algorithms.

Let’s see an example.

## Recursive algorithm to calculate the factorial of a number

Factorial definition: f(n) = n * f(n-1), f(0)=f(1)=1

The algorithm:

```python
def factorial(n):
    if n == 0 or n == 1:
        return 1
    else:
        return n * factorial(n - 1)
```

From the algorithm above, you can see the three aspects of recursion:

- recursion stop condition: if n == 0 or n == 1
- using problems of smaller size: factorial(n-1); here we reduce the size by 1
- combining partial solutions: n * factorial(n-1)

Let’s calculate the time complexity of this algorithm.

If we assume that T(n) is the number of Elementary Operations (EO) for an input of size n, then T(n-1) is the number of EO for an input of size n-1.

Therefore:

T(n) = a, if n = 0 or n = 1, where a is a constant

T(n) = T(n-1) + b, if n > 1, where b is a constant

Notice that a and b stand for the constant number of elementary operations spent on the comparisons, the multiplication, and the return statements.

Now, T(n) is a recurrence equation. To solve this equation, we can use several methods.
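To make the constants concrete, here is an instrumented version of the factorial algorithm that counts the elementary operations it performs. This is only a sketch: the exact values of a and b depend on what you choose to count as one operation (here, a = 3 for the base case and b = 4 per recursive step).

```python
def factorial_ops(n):
    """Return (factorial(n), ops), where ops counts elementary
    operations under an arbitrary but fixed accounting choice."""
    ops = 2                          # the two comparisons n == 0, n == 1
    if n == 0 or n == 1:
        return 1, ops + 1            # + the return
    value, sub_ops = factorial_ops(n - 1)
    ops += sub_ops + 2               # + the multiplication and the return
    return n * value, ops

# The count grows by the same constant b at every step:
# factorial_ops(5) -> (120, 19)
# factorial_ops(6) -> (720, 23)
```

Whatever accounting we pick, each recursive step adds the same constant, which is exactly what the recurrence T(n) = T(n-1) + b captures.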

## Substitution method

Let’s start with the recurrence equation from the previous algorithm.

T(n)=T(n-1) + b, n>1

To apply this method, we “guess” a solution, and then we prove it using mathematical induction. In this link you can find several examples of mathematical induction.

T(n)=T(n-1) + b, n>1 (1)

T(n-1) = T(n-2) + b, n>2 (2)

T(n-2) = T(n-3) + b, n>3 (3)

Now, if we substitute equation (3) in (2), we get

T(n-1) = T(n-3) + b + b = T(n-3) + 2b, n>3

Substituting the previous equation in (1), we get

T(n)= T(n-3) + 2b + b = T(n-3) + 3b, n>3

Here we can see that there is a general form:

T(n) = T(n-i) + ib, n>i

If i=n, we get

T(n)=T(n-n) + nb = T(0) + nb = a + nb

This represents a solution to the recurrence equation, and we already know how to use the big-O notation in this case.

Now, we need to prove the solution by induction.

Claim: T(n) = a + nb.

Base case: T(0) = a = a + 0·b, which matches the base case of the recurrence.

Induction hypothesis: assume T(n) = a + nb holds for some n ≥ 0.

Induction step: the recurrence states T(n) = T(n-1) + b, so substituting n by n+1 gives T(n+1) = T(n) + b. Applying the hypothesis:

T(n+1) = T(n) + b = a + nb + b = a + (n+1)b

which is the claimed formula for n+1. ∎

To finish, we use the big-O notation using the solution to the recurrence.

T(n) = a+ nb means that T(n) is O(n).
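As a sanity check, we can evaluate the recurrence directly and compare it with the closed form a + nb. Here a = 3 and b = 4 are arbitrary placeholder constants, and the recursion bottoms out at T(0) = a exactly as in the expansion above:

```python
def T(n, a=3, b=4):
    """T(n) = T(n-1) + b, with T(0) = a."""
    if n == 0:
        return a
    return T(n - 1) + b

# Matches the closed form a + n*b: T(10) = 3 + 10*4 = 43
```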

## The master theorem

Theorem: Let a >= 1 and b > 1 be constants, let f(n) be a function, and let T(n) be defined on the nonnegative integers by the recurrence T(n) = aT(n/b) + f(n), where we interpret n/b to mean either ⌊n/b⌋ or ⌈n/b⌉.

Then T(n) has the following asymptotic bounds:

- if f(n) = O(n^(log_b a − e)) for some constant e > 0, then T(n) = 𝚯(n^(log_b a))
- if f(n) = 𝚯(n^(log_b a)), then T(n) = 𝚯(n^(log_b a) lg n)
- if f(n) = 𝛀(n^(log_b a + e)) for some constant e > 0, and if af(n/b) <= cf(n) for some constant c < 1 and all sufficiently large n, then T(n) = 𝚯(f(n))

You can find examples and explanations of the master theorem and other methods to solve recurrence equations in this source.

Applying the master theorem is straightforward. We just need to substitute the values from the recurrence to get the time complexity of the algorithm.

Suppose we have a recurrence equation that represents the EO of a certain algorithm:

T(n)=9T(n/3)+n.

We substitute the values of a, b and f(n) according to the master theorem to find out which case applies.

a = 9, b = 3, f(n) = n, and n^(log_b a) = n^(log_3 9) = n² = Θ(n²).

Since f(n) = n = O(n^(log_3 9 − e)) with e = 1, case 1 of the master theorem applies.

The solution is T(n) = Θ(n²).
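We can check this numerically by evaluating the recurrence for powers of 3 (taking T(1) = 1 as an assumed base value) and watching T(n)/n² approach a constant:

```python
def T(n):
    """T(n) = 9*T(n/3) + n for n a power of 3, with T(1) = 1
    (an assumed base value)."""
    if n == 1:
        return 1
    return 9 * T(n // 3) + n

# T(n)/n**2 settles toward a constant (here 3/2), as Theta(n^2) predicts:
# T(3) = 12, T(9) = 117, T(81)/81**2 ≈ 1.49
```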

The master theorem is especially useful for calculating the time complexity of algorithms that follow a divide-and-conquer strategy.
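For the common case where f(n) is a simple power n^d, picking the right case can even be mechanized. The helper below is a hypothetical sketch for that polynomial case only, not a library function:

```python
import math

def master_case(a, b, d):
    """Which master theorem case applies to T(n) = a*T(n/b) + n**d?
    Hypothetical helper, valid only for f(n) = n**d."""
    crit = math.log(a, b)      # the critical exponent log_b(a)
    if math.isclose(d, crit):
        return 2               # T(n) = Theta(n**crit * lg n)
    if d < crit:
        return 1               # T(n) = Theta(n**crit)
    return 3                   # d > crit; regularity a*(n/b)**d = (a/b**d)*n**d
                               # holds with constant a/b**d < 1, so Theta(n**d)

# master_case(9, 3, 1) -> 1   (T(n) = 9T(n/3) + n   is Theta(n^2))
# master_case(2, 2, 1) -> 2   (T(n) = 2T(n/2) + n   is Theta(n lg n))
# master_case(2, 2, 2) -> 3   (T(n) = 2T(n/2) + n^2 is Theta(n^2))
```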