
Math.random() outcome seems flipped


I'm currently playing with a true-or-false script based on the outcome of Math.random().

The code is as follows:

var win = Math.random() >= (percent / 100);

if (win === true) {
    console.log("true");
} else {
    console.log("false");
}
  • In this case percent is 50 by default. If I set percent to 90, it keeps returning false more or less all the time with Math.random() >= (90 / 100). But since there should only be a 10% chance of returning false, that's strange.

  • If I set percent to 1, it keeps returning true (but I'd expect it to almost always return false, since there should technically only be a 1% chance of returning true).

Is the >= (greater than or equal to) flipped the wrong way around? It feels like you get a smaller chance of win = true even when you pass in 90 (percent) / 100.

Hope someone can clarify whether my symbols are in the wrong place.


Solution

  • If percent = 90, then Math.random() has to return 0.9 or greater for win to be true, which happens in only 10% of cases. In other words, your comparison computes the probability of the complement: true 10% of the time, false 90% of the time.

    So reverse the comparator:

    var win = Math.random() < (percent / 100);
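
    As a quick sanity check, you can simulate many trials and measure how often the corrected comparison returns true. The sketch below (winRate is just an illustrative helper name, not part of your original code) should report a rate close to percent / 100:

    // Run many trials of the corrected comparison and return
    // the fraction that came out true.
    function winRate(percent, trials) {
        let wins = 0;
        for (let i = 0; i < trials; i++) {
            // Corrected comparator: true with probability percent / 100
            if (Math.random() < percent / 100) {
                wins++;
            }
        }
        return wins / trials;
    }

    console.log(winRate(90, 100000)); // roughly 0.9
    console.log(winRate(1, 100000));  // roughly 0.01

    With the original >= comparison, the two printed rates would be swapped (roughly 0.1 and 0.99), which matches the flipped behaviour you observed.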