
Finding the Least Common Multiple (LCM) of two integers in C


I am trying to use the following code to find the LCM of two integers in C. I have no idea why the code shows me the larger of the two integers instead of their LCM. Have I done something wrong?

#include<stdio.h>
int main(){
    int num1, num2;
    printf("Input your number 1:");
    scanf("%d", &num1);
    printf("Input your number 2:");
    scanf("%d", &num2);
    //find max and min
    int max = num1, min = num2;
    if(num1 < num2){
        max = num2;
        min = num1;
    }
    int k = 1, LCM;
    do{
        LCM = max*k;
        k++;
    }while(LCM%min == 0);
    printf("LCM of %d and %d is %d", num1 ,num2, LCM);
}

Here is the output of the program:

Input your number 1:5
Input your number 2:3
LCM of 5 and 3 is 5
Process returned 0 (0x0)   execution time : 1.735 s
Press any key to continue.

Solution

  • while(LCM%min == 0) checks whether LCM is divisible by min, and the loop continues as long as it is. On the first iteration LCM = max (k is 1), so with inputs 5 and 3 the test 5 % 3 == 0 is false, the loop exits immediately, and the program prints the larger input.

    You should use while(LCM%min != 0) instead, so the loop keeps generating multiples of max and exits at the first one that is also divisible by min. See the corrected sketch below.
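For reference, here is a minimal corrected sketch of the program; it is the question's code with only the loop condition changed as described above, plus a trailing newline and an explicit return:

#include <stdio.h>
int main(void){
    int num1, num2;
    printf("Input your number 1:");
    scanf("%d", &num1);
    printf("Input your number 2:");
    scanf("%d", &num2);
    //find max and min of the two inputs
    int max = num1, min = num2;
    if(num1 < num2){
        max = num2;
        min = num1;
    }
    //try successive multiples of max until one is divisible by min
    int k = 1, LCM;
    do{
        LCM = max*k;
        k++;
    }while(LCM % min != 0);  //keep looping while LCM is NOT divisible by min
    printf("LCM of %d and %d is %d\n", num1, num2, LCM);
    return 0;
}

With inputs 5 and 3 this version tries 5 and 10 before stopping at 15, so it prints "LCM of 5 and 3 is 15".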