Hi, I am trying to write the Euclidean algorithm in Python using the fact that a = b*k + r. It works in some cases but messes up near the end. Can someone help me figure out where I made an error?
a = int(input("Enter First Number: "))
b = int(input("Enter Second Number: "))
A = a
B = b
k = 0
r = 1
while (a!=(b*b)):
    while(a>(b*k)):
        k = k+1
    k = k-1
    r = (a-(b*k))
    a = b
    b = r
    print (a,b) #to debug which step it is at
print("gcd(",A,",",B,") =",b)
I've read what you were trying to do, but I strongly recommend re-reading the definition of the Euclidean algorithm before coding it. The problem is how you handle k: it is never reset to 0 at the start of each pass through the outer loop, so the inner loop and the k = k-1 step work with a stale value from the previous pass, and that is what makes your code loop in an overextended way. The outer condition a != (b*b) also doesn't match the algorithm's stopping rule, which is that the remainder reaches 0.
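To make that concrete, here is a minimal sketch of your own a = b*k + r approach. I kept your variable names, reset k to 0 on every pass, and assumed the loop should stop once the remainder becomes 0 (that stopping condition is my reading of what you intended, not something stated in your post):

a = int(input("Enter First Number: "))
b = int(input("Enter Second Number: "))
A = a
B = b
while b != 0:
    k = 0
    # find the largest k with b*k <= a, i.e. the integer quotient of a by b
    while a >= b * (k + 1):
        k = k + 1
    r = a - b * k      # remainder of a divided by b
    a = b
    b = r
    print(a, b)        # to see each step
print("gcd(", A, ",", B, ") =", a)

Note that when the loop ends b is 0, so the gcd is in a, not b.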
A simple way to code the Euclidean algorithm is to use its division-based form: repeatedly replace (a, b) with (b, a % b) until b becomes 0. Just keep in mind what the modulus represents and you are ready to go.
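As a rough illustration of that division-based form (the function name gcd is just my placeholder):

def gcd(a, b):
    # Replace (a, b) with (b, a % b) until b is 0;
    # at that point a holds the greatest common divisor.
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(48, 18))   # prints 6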
Happy learning.