Euclid's algorithm for greatest common divisor, upper bound on worst-case running time
def gcd3(i, j):
    # assumption: i and j are positive integers
    if i == j: return i
    if i == 1 or j == 1: return 1
    if i % j == 0: return j
    if j % i == 0: return i
    if i < j: return gcd3(i, j % i)
    if i > j: return gcd3(i % j, j)
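As a quick sanity check (a sketch, not part of the notes; it assumes Python 3 and uses the standard library's math.gcd as a reference):

    import math
    import random

    # gcd3 should agree with the built-in gcd on random positive integers.
    for _ in range(1000):
        i = random.randint(1, 10**6)
        j = random.randint(1, 10**6)
        assert gcd3(i, j) == math.gcd(i, j), (i, j)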
Does the algorithm terminate? Yes: max(i,j) strictly decreases with each recursive call, and a strictly decreasing sequence of positive integers cannot go on forever, so the recursion must reach a base case.
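One way to watch this argument in action is a tracing variant (a sketch; gcd3_trace is a name introduced here for illustration, not from the notes):

    def gcd3_trace(i, j):
        # Same recursion as gcd3, but prints the arguments at each call;
        # max(i, j) strictly decreases down the printed list.
        print(i, j)
        if i == j: return i
        if i == 1 or j == 1: return 1
        if i % j == 0: return j
        if j % i == 0: return i
        if i < j: return gcd3_trace(i, j % i)
        return gcd3_trace(i % j, j)

    gcd3_trace(1071, 462)   # prints 1071 462, then 147 462, then 147 21; returns 21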
correctness: Is the algorithm correct? Prove that if i < j, the g.c.d. of i and j equals the g.c.d. of i and j mod i. (Sketch: writing j = qi + (j mod i), any common divisor of i and j also divides j mod i, and vice versa, so the two pairs have exactly the same common divisors.) The correctness of the algorithm then follows by induction on max(i,j).
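The identity itself is easy to spot-check numerically (a sketch, again using math.gcd as the reference):

    import math
    import random

    # Spot-check: gcd(i, j) == gcd(i, j mod i) whenever i < j.
    for _ in range(1000):
        i = random.randint(2, 10**6)
        j = random.randint(i + 1, 2 * 10**6)
        assert math.gcd(i, j) == math.gcd(i, j % i)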
running time: Prove that if i < j, then j mod i < j/2. (Sketch: if i <= j/2, then j mod i < i <= j/2; if i > j/2, then j mod i = j - i < j/2.) Consequently, the larger argument decreases by at least a factor of two every other recursive call. Thus, the number of recursive calls is at most 2 * log2(j), i.e., at most proportional to log(j).
By the same argument, the number of recursive calls is also at most proportional to log(i): after the first recursive call, both arguments are at most i (assuming i < j), and the halving argument then applies with i in place of j.
Since each recursive call does only a constant number of operations (outside of the recursion), this means the running time is at most proportional to log(min(i,j)).
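To see the bound empirically, here is a hedged sketch: an instrumented variant (gcd3_counted, a name introduced here) that counts recursive calls and checks them against 2 * log2(min(i,j)) plus a small additive slack for the first call and rounding:

    import math
    import random

    def gcd3_counted(i, j, calls=0):
        # Same logic as gcd3, threading through the number of recursive calls.
        if i == j: return i, calls
        if i == 1 or j == 1: return 1, calls
        if i % j == 0: return j, calls
        if j % i == 0: return i, calls
        if i < j: return gcd3_counted(i, j % i, calls + 1)
        return gcd3_counted(i % j, j, calls + 1)

    for _ in range(1000):
        i = random.randint(1, 10**9)
        j = random.randint(1, 10**9)
        _, calls = gcd3_counted(i, j)
        assert calls <= 2 * math.log2(min(i, j)) + 3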
Note that we have only shown an upper bound on the running time. That's why we say the running time is "at most" proportional to log(min(i,j)). A priori, it's possible that better upper bounds can be proven, but this is a pretty good one. Also, note that we've bounded the worst-case running time, because our bound holds for all inputs.
lower bound: To show that this upper bound is tight, it is not enough to exhibit one slow input! You need a sequence of larger and larger inputs on which the running time really is proportional to log(min(i,j)).
Consider the Fibonacci sequence f(0), f(1), ..., f(n) defined by the relation f(n) = f(n-1) + f(n-2), with f(0) = f(1) = 1.
Claim: gcd3(f(n),f(n-1)) takes time proportional to n.
To see why, note that f(n-1) < f(n) and f(n) mod f(n-1) = f(n-2). So gcd3(f(n), f(n-1)) calls gcd3(f(n-2), f(n-1)), which in turn calls gcd3(f(n-2), f(n-3)), and so on: each recursive call drops the index by one, giving roughly n recursive calls.
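Empirically (a sketch, reusing the hypothetical gcd3_counted helper from above), the call count on consecutive Fibonacci pairs grows linearly in n:

    # Build f(0..39) with f(0) = f(1) = 1, then count calls on (f(n), f(n-1)).
    fib = [1, 1]
    while len(fib) < 40:
        fib.append(fib[-1] + fib[-2])
    for n in range(5, 40, 5):
        _, calls = gcd3_counted(fib[n], fib[n - 1])
        print(n, calls)   # calls grows like n, up to a small additive constant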
Claim: n is proportional to log f(n).
Proved in /Lecture3
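For intuition only (a hedged sketch under the f(0) = f(1) = 1 convention above; see /Lecture3 for the real proof): induction on the recurrence gives f(n) >= 2 f(n-2) and f(n) <= f(n-1) + f(n-2) <= 2^n, hence

\[
2^{\lfloor n/2 \rfloor} \;\le\; f(n) \;\le\; 2^{n}
\qquad\Longrightarrow\qquad
\log_2 f(n) \;\le\; n \;\le\; 2\log_2 f(n) + 2,
\]

so n and log f(n) agree up to constant factors.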
Assuming the claim is true, we can conclude that there are arbitrarily large inputs (i,j) such that gcd3(i,j) takes time proportional to log(min(i,j)).
Combining this with our upper bound from before, we conclude that the worst-case running time of gcd3(i,j) is proportional to log(min(i,j)).
continued... /Lecture3