When I need to calculate the greatest common divisor of two integers, instead of using Euclid’s algorithm or the binary GCD algorithm, I sometimes use this super-simple algorithm:
// Given 0 < A < B, calculate GCD(A, B)
unsigned gcd = A;
for (unsigned r = B % gcd; r > 0; gcd = r, r = B % gcd)
    ; // deliberately empty body: all the work is in the loop header
This is the kind of thing I feel OK about writing inline in some cases, instead of defining a function somewhere. I use it when the time for the GCD calculation is swamped by something else, even if the loop takes a lot of steps.
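If you'd rather have it as a function (for testing, say), here is the same loop wrapped up with a tiny driver. The wrapper and the name gcd_loop are just for illustration:

#include <stdio.h>

// The same loop as above, wrapped in a function.  Starting from
// g = A, it repeatedly replaces g with B % g, stopping as soon as
// the remainder is zero, i.e. as soon as g divides B.
unsigned gcd_loop(unsigned A, unsigned B)
{
    unsigned gcd = A;
    for (unsigned r = B % gcd; r > 0; gcd = r, r = B % gcd)
        ; // deliberately empty body
    return gcd;
}

int main(void)
{
    printf("%u\n", gcd_loop(9, 15)); // iterates 9 -> 6 -> 3, prints 3
    return 0;
}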
However, experimentally, the number of steps that this can take seems to grow just a little bit faster than log2 B: https://ideone.com/SVRa64
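For reference, here is a self-contained sketch of that kind of experiment. It is not literally the linked program; the function name steps, the range of B, and the choice to measure the worst case over all A are arbitrary:

#include <stdio.h>
#include <math.h>

// Number of loop iterations for one pair (A, B).
static unsigned steps(unsigned A, unsigned B)
{
    unsigned n = 0;
    unsigned gcd = A;
    for (unsigned r = B % gcd; r > 0; gcd = r, r = B % gcd)
        ++n;
    return n;
}

int main(void)
{
    // For each B, take the worst case over all 0 < A < B and
    // compare it with log2(B).  (Link with -lm.)
    for (unsigned B = 10; B <= 1000000; B *= 10) {
        unsigned worst = 0;
        for (unsigned A = 1; A < B; ++A) {
            unsigned n = steps(A, B);
            if (n > worst)
                worst = n;
        }
        printf("B = %7u  worst steps = %2u  log2(B) = %4.1f\n",
               B, worst, log2((double)B));
    }
    return 0;
}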
Can anyone prove a tight bound on the number of steps?