JavaScript: Is This Truly Signed Integer Division?
Given the following code, with both a and b being Numbers representing values within the range of signed 32-bit integers:
var quotient = ((a|0) / (b|0))|0;
and assuming a runtime in full compliance with the ECMAScript 6 specification, will the value of quotient always be the correct signed integer division of a and b as integers? In other words, is this the proper method to achieve true signed integer division in JavaScript, equivalent to the machine instruction?
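For concreteness, here is a minimal sketch of the pattern wrapped in a helper; the name intDiv is just for illustration and is not part of the question:

function intDiv(a, b) {
    // Coerce both operands to 32-bit signed integers, divide as doubles,
    // then truncate the quotient back to a 32-bit signed integer.
    return ((a | 0) / (b | 0)) | 0;
}

console.log(intDiv(7, 2));           // 3
console.log(intDiv(2147483647, 2));  // 1073741823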
I'm no expert on floating-point numbers, but Wikipedia says that doubles have 52 bits of precision. Logically, it seems that 52 bits should be enough to reliably approximate the integer division of two 32-bit integers.
Dividing the minimum and maximum 32-bit signed ints, -2147483648 / 2147483647, produces -1.0000000004656613, which still has a reasonable number of significant digits. The same goes for the inverse, 2147483647 / -2147483648, which produces -0.9999999995343387.
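As a quick spot check, these are the values a console reports under IEEE 754 double division, together with the truncated quotients:

console.log(-2147483648 / 2147483647);        // -1.0000000004656613
console.log((-2147483648 / 2147483647) | 0);  // -1
console.log(2147483647 / -2147483648);        // -0.9999999995343387
console.log((2147483647 / -2147483648) | 0);  // 0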
An exception is division by zero, as mentioned in a comment. As the linked question states, integer division by 0 throws some sort of error, whereas the floating-point coercion results in (1 / 0) | 0 == 0.
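For illustration, this is how the non-finite intermediate results behave under the coercion (ToInt32 maps both Infinity and NaN to 0):

console.log(1 / 0);        // Infinity
console.log((1 / 0) | 0);  // 0
console.log((0 / 0) | 0);  // 0 (0 / 0 is NaN, which ToInt32 also maps to 0)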
Update: According to another answer, integer division in C truncates towards zero, which is what |0 does in JavaScript. In addition, division by 0 is undefined in C, so JavaScript is technically not incorrect in returning zero. Unless I've missed something else, the answer to the original question should be yes.
Update 2: Relevant sections of the ECMAScript 6 spec: how to divide Numbers, and how to convert to a 32-bit signed integer, which is what |0 does.
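As a small illustration of that ToInt32 step (results assume the spec's modulo-2^32 conversion):

console.log(3.7 | 0);           // 3 (fractional part dropped)
console.log(-3.7 | 0);          // -3 (truncation toward zero)
console.log(2147483648 | 0);    // -2147483648 (2^31 wraps modulo 2^32)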