Problem 1’s Complement
1’s complement seems slightly off?!
console.log(~0xFF); //gives -256
Taking a closer look at the ECMAScript specification, you'll find bitwise NOT defined as follows:
1. Let expr be the result of evaluating UnaryExpression.
2. Let oldValue be ToInt32(GetValue(expr)).
3. Return the result of applying bitwise complement to oldValue. The result is a signed 32-bit integer.
Now this is where things get interesting. You might naturally assume that if you give the NOT operator an 8-bit value you'd get an 8-bit field back, but this is clearly not the case. If you want to dig even further, the real concrete instance of the value is a 64-bit IEEE 754 double; the operator only coerces it to an Int32 for the operation.
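The ToInt32 step is worth seeing in action. A quick sketch, with inputs chosen purely for illustration, showing how fractional and out-of-range doubles are normalized before the complement is taken:

```javascript
// ~ runs its operand through ToInt32 first, so fractional and
// out-of-range doubles are truncated and wrapped before complementing.
console.log(~255.9);            // -256: ToInt32 truncates 255.9 to 255
console.log(~(2 ** 32 + 255));  // -256: the 2^32 part wraps away, leaving 255
console.log(~0xFF);             // -256: same Int32 input, same result
```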
So that neatly explains why
~0xff transforms to
0000 0000 0000 0000 0000 0000 1111 1111
apply ~ (NOT)
1111 1111 1111 1111 1111 1111 0000 0000
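And why does that pattern read back as -256? The standard two's-complement identity ~x === -(x + 1) covers it, sketched here as a quick check:

```javascript
// Bitwise NOT on an Int32 follows the identity ~x === -(x + 1),
// which is why the all-ones-then-zeros pattern above prints as -256.
console.log(~0xFF);        // -256
console.log(-(0xFF + 1));  // -256, the same value via the identity
```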
But that still leaves a problem: how can you easily debug a value that isn't really a Number but is represented as one?
If you've got this far there is probably a very good reason you're using bitwise ops; otherwise you might have given up on them altogether. So it would be handy to look at the real bit pattern, and not the Int32 interpretation that console.log and friends report, which is misleading here.
Getting at the full representation
A nice simple function can show all the bits for the input value.
The >>> operator coerces the value into an unsigned 32-bit representation by right shifting and filling in with zeros; depending on your usage you can also slice the result to show only the 8 bits needed for debugging, like so:
If dealing with bits, clamp the field to the proper bit length before assignment. Using an 8-bit clamped array is no guarantee that the value will be assigned properly; there are cases where zero is assigned instead of the intended bit pattern. For an 8-bit field you would use
array[i] = bits & 0xFF, which makes sure the value is appropriately truncated and does not get evaluated as a full Number.