So I have this formula to calculate the number of years it takes to double an amount of money at a given interest rate (i):
log(2) / log(1 + (i/100) )
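For example, at 5% interest that gives log(2) / log(1.05) ≈ 14.2 years.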
So here's my code:
import java.util.Scanner;

public class JavaApplication37 {
    public static void main(String[] args) {
        Scanner reader = new Scanner(System.in);
        System.out.println("What's the interest rate?: ");
        int i = reader.nextInt();
        double t = (Math.log(2)) / (Math.log(1 + (i / 100)));
        System.out.println("It takes " + t + " years before the amount has doubled");
    }
}
That gives me the output: It takes Infinity years before the amount has doubled
What did I do wrong?
The problem is that i / 100 is integer division: i and 100 are both ints, so for any rate below 100 the result is 0. Math.log(1 + 0) is then 0.0, and dividing by 0.0 gives Infinity. Write 100 as 100.0 instead, so the division is done in floating-point. (Literals with a decimal point, such as 1.0 or 2.3, are doubles, not floats; notice the decimal point.)
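You can see the difference with a quick check; this is just a throwaway sketch, with 5 as an example rate and IntDivisionDemo as a made-up class name:

// Minimal sketch of the integer-division pitfall.
public class IntDivisionDemo {
    public static void main(String[] args) {
        int i = 5;                                  // example rate, just for illustration
        System.out.println(i / 100);                // 0     -> integer division truncates
        System.out.println(i / 100.0);              // 0.05  -> one double operand forces double division
        System.out.println(Math.log(1 + i / 100));  // 0.0   -> so log(2) / 0.0 is Infinity
    }
}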
So rewrite your code as below and it will run:
import java.util.Scanner;

public class JavaApplication37 {
    public static void main(String[] args) {
        Scanner reader = new Scanner(System.in);
        System.out.println("What's the interest rate?: ");
        int i = reader.nextInt();
        // 100.0 makes i / 100.0 a double division, so the fraction is kept
        double t = Math.log(2) / Math.log(1 + (i / 100.0));
        System.out.println("It takes " + t + " years before the amount has doubled");
    }
}
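As an alternative sketch, if you also want to accept fractional rates like 2.5, you could read the input as a double directly with Scanner.nextDouble(); DoublingTime is just a placeholder class name:

import java.util.Scanner;

public class DoublingTime {
    public static void main(String[] args) {
        Scanner reader = new Scanner(System.in);
        System.out.println("What's the interest rate?: ");
        double i = reader.nextDouble();             // accepts 7 as well as 2.5
        double t = Math.log(2) / Math.log(1 + i / 100.0);
        System.out.println("It takes " + t + " years before the amount has doubled");
    }
}

Either way, the key point is the same: at least one operand of the division has to be a double.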