
Why does a shift by 0 truncate the decimal?


You're correct; it is used to truncate the value.

The reason >> works is because it operates only on 32-bit integers, so the value is truncated. (It's also commonly used in cases like these instead of Math.floor because bitwise operators have a low operator precedence, so you can avoid a mess of parentheses.)
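A quick sketch of both points, the truncation and the precedence (the values here are illustrative, chosen to be exact in binary floating point):

```javascript
const scaled = 0.5 * 15;             // 7.5

// ">> 0" converts the left operand to a 32-bit integer first,
// which drops the fractional part:
console.log(scaled >> 0);            // 7

// Because ">>" has lower precedence than "*", the multiplication
// runs first and no parentheses are needed:
console.log(0.5 * 15 >> 0);          // 7
console.log(Math.floor(0.5 * 15));   // 7

// One caveat: the conversion truncates toward zero, while
// Math.floor rounds toward negative infinity:
console.log(-7.5 >> 0);              // -7
console.log(Math.floor(-7.5));       // -8
```

So for negative values the two are not interchangeable.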

And since it operates only on 32-bit integers, it also keeps only the low 32 bits of the value (interpreted as signed), as if masked with 0xffffffff after rounding. So:

0x110000000      // 4563402752
0x110000000 >> 0 // 268435456
0x010000000      // 268435456

But that edge case doesn't come into play here, since Math.random() returns a value between 0 and 1.

Also, it does the same thing as | 0, which is more common.
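Any bitwise operation with an identity operand triggers the same 32-bit integer conversion, so these idioms are interchangeable (values are illustrative):

```javascript
console.log(7.5 >> 0); // 7
console.log(7.5 | 0);  // 7
console.log(~~7.5);    // 7  (double bitwise NOT, another common idiom)
```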


Math.random() returns a number between 0 (inclusive) and 1 (exclusive). Multiplying this number by a whole number usually results in a number with a decimal portion. The << operator is a shortcut for eliminating that decimal portion:

The operands of all bitwise operators are converted to signed 32-bit integers in big-endian order and in two's complement format.

The above statement means that the JavaScript engine will implicitly convert both operands of the << operator to 32-bit integers; for numbers it does so by chopping off the fractional portion (numbers that do not fit the 32-bit integer range lose more than just the decimal portion).
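For example (illustrative values), the conversion can discard far more than the decimal portion once the value leaves the 32-bit range:

```javascript
// Within 32-bit range, only the fraction is dropped:
console.log(1234.5 << 0);              // 1234

// Outside that range, high bits are lost too: the value is reduced
// modulo 2^32 and reinterpreted as a signed integer.
console.log(Math.pow(2, 32) + 5 << 0); // 5
console.log(Math.pow(2, 31) << 0);     // -2147483648
```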

And is it just a nuance of JavaScript, or does it occur in other languages as well?

You'll notice similar behavior in loosely typed languages. PHP for example:

var_dump(1234.56789 << 0); // int(1234)

In strongly typed languages, such code will usually refuse to compile. C# complains like this:

Console.Write(1234.56789 << 0);
// error CS0019: Operator '<<' cannot be applied to operands of type 'double' and 'int'

Those languages already provide type-casting operators for this:

Console.Write((int)1234.56789); // 1234


From the Mozilla documentation of bitwise operators (which includes the shift operators):

The operands of all bitwise operators are converted to signed 32-bit integers in big-endian order and in two's complement format.

So the code is relying on that somewhat-incidental side effect of the shift operator: because it shifts by 0 bits, the implicit integer conversion is the only significant thing it does. Ick.

And is it just a nuance of JavaScript, or does it occur in other languages as well?

I can't speak for all languages of course, but neither Java nor C# permits double values as the left operand of a shift operator.