I have two defined objects: x and y.
If I do the following, the chances of getting either x or y are equal (1 in 2):
var primary = [x, y];
var secondary = primary[Math.floor(Math.random() * primary.length)];
This would lower the chance of getting y to 1 in 3:
var primary = [x, x, y];
// secondary unchanged
etc.
But I believe this is bad practice, because if I wanted to set an infinitesimal chance (e.g. 1 in 1e9) of getting y, I would have to do something extremely wasteful like this:
var primary = [];
for (var i = 0; i < 1e9 - 1; i++) primary.push(x); // pad with copies of x
primary.push(y); // a single y among 1e9 elements
var secondary = primary[Math.floor(Math.random() * primary.length)];
Is there a better way to do this in JavaScript?
Without digging too much into the ECMAScript specification and its actual implementations, Math.random() appears to produce a uniformly distributed number in the range [0, 1). This means the result is 50% likely to be less than 0.5, 25% likely to be less than 0.25, 10% likely to be less than 0.1, and so on.
To get x 17% of the time (and, conversely, y 83% of the time), one can use that threshold as a gateway for Math.random()'s result:
const x = "x";
const y = "y";
function getRandom() {
return Math.random() < 0.17 ? x : y;
}
const value = getRandom();
// 17% "x", 83% "y"
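To sanity-check the split, one could tally a large sample (a quick illustrative check, reusing getRandom() from above):

let hits = 0;
for (let i = 0; i < 1e6; i++) {
  if (getRandom() === x) hits++;
}
console.log(hits / 1e6); // ≈ 0.17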
This works fine for two values, but lists of three or more elements require a different approach.
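One way to handle an arbitrary list, sketched below, is to pair each value with a weight and walk the cumulative weights until the random draw is used up (weightedRandom is an illustrative helper, not a built-in):

function weightedRandom(entries) {
  // entries: array of [value, weight] pairs with weights > 0
  const total = entries.reduce((sum, [, weight]) => sum + weight, 0);
  let draw = Math.random() * total; // uniform in [0, total)
  for (const [value, weight] of entries) {
    draw -= weight;
    if (draw < 0) return value; // draw landed inside this value's slice
  }
  return entries[entries.length - 1][0]; // floating-point safety net
}

// A 1-in-1e9 chance of y, with no billion-element array:
const secondary = weightedRandom([[x, 1e9 - 1], [y, 1]]);

This also addresses the original concern: the weights carry the probability, so the array stays as small as the list of values.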