What is the Complexity of Euclid’s Algorithm?

Categories: Algorithms, Computers

Author: Eliott Kalfon

Published: August 31, 2024



Euclid’s algorithm is one of the earliest algorithms ever recorded. It finds the Greatest Common Divisor, or “GCD”, of two integers $a$ and $b$ with $a > b$. For example, the GCD of 36 and 27 is 9, as 9 is the largest integer that divides both numbers.

One simple way to find this GCD is to iterate over all integers between 1 and $b$ (the smaller of the two) and keep the largest one that divides both $a$ and $b$. This brute-force approach has complexity $O(b)$.
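A minimal sketch of this brute-force search (the helper name gcd_brute_force is illustrative):

def gcd_brute_force(a, b):
    """Find the GCD of a and b (with a > b) by testing every candidate up to b."""
    gcd = 1
    for d in range(1, b + 1):   # O(b) iterations
        if a % d == 0 and b % d == 0:
            gcd = d             # keep the largest divisor found so far
    return gcd

print(gcd_brute_force(36, 27))  # 9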

Euclid’s algorithm elegantly leverages the fact that the GCD of two numbers is also the GCD of their difference to significantly reduce computation time.

Intuitively, this must be better than the brute-force $O(b)$ method, but by how much exactly?

Let $a_0 = a$ and $b_0 = b$.
In the following update, $a_1 = b_0$ and $b_1 = a_0 \bmod b_0$.
More generally, $a_{k+1} = b_k$ and $b_{k+1} = a_k \bmod b_k$.
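This update rule translates directly into a short loop; a minimal sketch in Python (the name gcd_euclid is illustrative):

def gcd_euclid(a, b):
    """Greatest Common Divisor via Euclid's algorithm."""
    while b != 0:
        a, b = b, a % b  # a_{k+1} = b_k, b_{k+1} = a_k mod b_k
    return a

print(gcd_euclid(36, 27))  # 9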

Using the properties of the modulo operator, it can be proven that $b_{k+2} \le \frac{b_k}{2}$. In English: $b$ is at least halved every two steps of Euclid’s algorithm.

Validating this with a quick example, with $a_0 = 72$ and $b_0 = 54$:

$a_1 = 54,\ b_1 = 72 \bmod 54 = 18$
$a_2 = 18,\ b_2 = 54 \bmod 18 = 0$
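A small trace helper (again an illustrative sketch, separate from the chart code below) reproduces these steps:

def euclid_trace(a, b):
    """Print each (a_k, b_k) pair produced by Euclid's algorithm."""
    k = 0
    while b != 0:
        print(f"step {k}: a = {a}, b = {b}")
        a, b = b, a % b
        k += 1
    print(f"step {k}: a = {a}, b = {b}  (GCD = {a})")

euclid_trace(72, 54)
# step 0: a = 72, b = 54
# step 1: a = 54, b = 18
# step 2: a = 18, b = 0  (GCD = 18)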

The full proof is left to the Appendix, but this finding can be empirically validated by computing $b_0$ and $b_2$ for all combinations of $a$ and $b$ within the range $[1, 500]$ and plotting their ratio.

Computer Simulation

Figure: distribution of the ratio of $b_2$ to $b_0$
Python code used to generate the chart
import matplotlib.pyplot as plt

def euclid_algorithm(a, b):
    """Run two steps of Euclid's algorithm on (a, b) with a > b.

    Returns (b0, b2), or (b0, None) if the algorithm terminates
    before the second step.
    """
    b0 = b             # initial value of b
    a1, b1 = b, a % b  # first update:  a1 = b0, b1 = a0 mod b0
    if b1 == 0:
        return b0, None
    b2 = a1 % b1       # second update: b2 = a1 mod b1
    return b0, b2

ratios = []

for a in range(1, 501):
    for b in range(1, 501):
        if a > b:
            b0, b2 = euclid_algorithm(a, b)
            if b2 is not None:
                ratio = b2 / b0
                ratios.append(ratio)

plt.hist(ratios, bins=50, edgecolor='black')
plt.xlabel('Ratio (b2 / b0)')
plt.ylabel('Frequency')
plt.title('Distribution of the Ratio between b0 and b2')
plt.show()

Achieving a Logarithmic Upper Bound

Given the upper bound $b_{k+2} \le \frac{b_k}{2}$, we can show that the time complexity of Euclid’s algorithm is $O(\log(b))$.

Using our computer science intuition, a repeated division by 2 points to a logarithmic decrease.

As $b$ is at least halved every two steps, we need to find the number of steps $k$ needed to reach $b = 1$.

Putting this into mathematical form, we get:

$$\frac{b_0}{2^{k/2}} = 1$$
$$2^{k/2} = b_0$$
$$\frac{k}{2} = \log_2(b_0)$$
$$k = 2\log_2(b_0) \in O(\log(b))$$
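This bound can be sanity-checked numerically; the sketch below (separate from the chart code above, with an illustrative helper euclid_steps) counts the actual number of modulo steps for every pair in a small range and compares the worst case with $2\log_2(b)$:

import math

def euclid_steps(a, b):
    """Count the number of modulo steps Euclid's algorithm performs."""
    steps = 0
    while b != 0:
        a, b = b, a % b
        steps += 1
    return steps

# Report the worst case over a small range and compare it with 2 * log2(b).
worst_pair, worst_steps = None, 0
for a in range(2, 501):
    for b in range(1, a):
        steps = euclid_steps(a, b)
        if steps > worst_steps:
            worst_pair, worst_steps = (a, b), steps

a, b = worst_pair
print(f"worst case in range: ({a}, {b}) took {worst_steps} steps; 2*log2(b) ~ {2 * math.log2(b):.1f}")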

Final Thoughts

In its beautiful simplicity, Euclid’s algorithm is a prime example of leveraging information about a problem to avoid exhaustive search. By exploiting the fact that two numbers’ Greatest Common Divisor is also the Greatest Common Divisor of their difference, this algorithm recursively reduces the problem - reaching a time complexity of $O(\log(b))$.

Appendix

Proving $b_{k+2} \le \frac{b_k}{2}$

Before going into further details, let’s remember how Euclid’s algorithm works with the example of $a_0 = 72$ and $b_0 = 54$.

We can summarise the steps in the following table:

| Step Number | a  | b  |
|-------------|----|----|
| 0           | 72 | 54 |
| 1           | 54 | 18 |
| 2           | 18 | 0  |

The update rule of Euclid’s algorithm is as follows:

$$
\begin{cases}
a_{k+1} = b_k \\
b_{k+1} = a_k \bmod b_k
\end{cases}
$$

Now, how to prove that $b_{k+2}$ is always less than $\frac{b_k}{2}$? There are two cases to study:

Case 1: $b_{k+1} \le \frac{b_k}{2}$

In this case, $b_{k+1}$ is already at most $\frac{b_k}{2}$. Since the remainder $b_{k+2} = b_k \bmod b_{k+1}$ is strictly smaller than $b_{k+1}$, it follows that $b_{k+2} < b_{k+1} \le \frac{b_k}{2}$. This is the case at the first update of the example above, with $b_0 = 54$ and $b_1 = 18$.

Case 2: $b_{k+1} > \frac{b_k}{2}$

We observe the following:

$$b_{k+1} = a_k \bmod b_k \tag{1}$$

$$b_{k+2} = b_k \bmod b_{k+1} \tag{2}$$

If $b_{k+1} > \frac{b_k}{2}$, then:

$$b_k \bmod b_{k+1} = b_k - b_{k+1} \tag{3}$$

This is due to the quotient-remainder division rule: since $\frac{b_k}{2} < b_{k+1} < b_k$, the quotient of $b_k$ divided by $b_{k+1}$ is exactly 1, so

$$b_k = b_{k+1} \times 1 + (b_k - b_{k+1})$$

and the remainder is $b_k - b_{k+1}$.
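As a quick numeric illustration (values chosen for illustration only): with $b_k = 18$ and $b_{k+1} = 10$, we have $10 > \frac{18}{2}$ and indeed $18 \bmod 10 = 18 - 10 = 8$.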

Substituting equation (3) into equation (2), we get:

$$b_{k+2} = b_k - b_{k+1} \tag{4}$$

From $b_{k+1} > \frac{b_k}{2}$, we have:

$$b_k - b_{k+1} < b_k - \frac{b_k}{2} = \frac{b_k}{2}$$

Substituting back into equation (4), we obtain:

$$b_{k+2} < \frac{b_k}{2} \tag{5}$$

Thus, in both cases, we have shown that $b_{k+2} < \frac{b_k}{2}$.
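As a final sanity check (an illustrative sketch, separate from the chart code above), the bound can be verified directly for every pair in a small range:

def two_steps(a, b):
    """Return (b0, b2) after two updates of Euclid's algorithm,
    or (b0, None) if it terminates before the second update."""
    b0, b1 = b, a % b
    if b1 == 0:
        return b0, None
    return b0, b % b1

for a in range(2, 501):
    for b in range(1, a):
        b0, b2 = two_steps(a, b)
        assert b2 is None or b2 <= b0 / 2   # the bound proven above

print("Verified: b2 <= b0 / 2 for all a > b in [1, 500]")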

Like what you read? Subscribe to my newsletter to hear about my latest posts!