Given a number of characters, I want to generate the greatest integer whose binary representation fits in that number of characters.
My code works perfectly fine:
const getValueOf = (y) => parseInt('1'.repeat(y), 2);
/*
| Chars | Binary | Decimal
| 4 | 1111 | 15
*/
console.log(getValueOf(4)); // = 15
/*
| Chars | Binary | Decimal
| 10 | 1111111111 | 1023
*/
console.log(getValueOf(10)); // = 1023
/* -------------- */
console.time('Time spent on 10000000 executions');
for (let z = 0; z < 10000000; ++z) { getValueOf(10); }
console.timeEnd('Time spent on 10000000 executions');
However, this is too slow for my purposes: it takes about 666ms for 10000000 executions. I guess the conversion to text is what makes it sluggish. How can I make this run faster?
This is a matter of using bitwise operators:
const getValueOf = y => (1<<y)-1;
For instance, if y is 4, this shifts a 1 bit four places to the left, which gives 10000 in binary, and then subtracts 1, leaving 1111 in binary.
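As a quick check, here is a minimal sketch that reuses the examples and the benchmark loop from the question with the bitwise version; it produces the same values (15 and 1023):

const getValueOf = y => (1 << y) - 1;
// Same examples as above
console.log(getValueOf(4));  // = 15
console.log(getValueOf(10)); // = 1023
// Same benchmark loop as in the question, now without any string work
console.time('Time spent on 10000000 executions');
for (let z = 0; z < 10000000; ++z) { getValueOf(10); }
console.timeEnd('Time spent on 10000000 executions');

One caveat worth noting: JavaScript's bitwise operators work on signed 32-bit integers, so this sketch assumes y is at most 30.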