01.20.12
Accurate hexadecimal to decimal conversion in JavaScript
A problem came up at work yesterday: I was building a web page that received 64-bit hex numbers from one API but needed to pass them along to another API that expected decimal numbers.
Usually this would not be a problem — JavaScript has built-in functions for converting between hex and decimal:
parseInt("1234abcd", 16) = 305441741
(305441741).toString(16) = "1234abcd"
Unfortunately, for larger numbers, there’s a big problem lurking:
parseInt("123456789abcdef", 16) = 81985529216486900
(81985529216486900).toString(16) = "123456789abcdf0"
The last two digits are wrong. Why did these functions stop being inverses of one another?
The answer has to do with how JavaScript stores numbers. It uses 64-bit floating point representation for all numbers, even integers. This means that not every integer larger than 2^53 can be represented exactly. You can see this by evaluating:
(Math.pow(2, 53) + 1) - 1 = 9007199254740991
That ends with a 1, so whatever it is, it's certainly not the power of 2 you'd expect. The + 1 gets rounded away, so subtracting 1 leaves 2^53 - 1 instead of 2^53. (It's off by one.)
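For a hex-flavored version of the same problem (my own check, assuming a standard IEEE 754 double), two different hex inputs just past 2^53 already parse to the same number:

parseInt("20000000000000", 16) = 9007199254740992
parseInt("20000000000001", 16) = 9007199254740992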
To solve this problem, I wrote some very simple hex <-> decimal conversion functions that use arbitrary-precision arithmetic. In particular, they work for 64-bit and 128-bit numbers. The code is only about 65 lines, so it's much more lightweight than a full-fledged arbitrary-precision arithmetic library.
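The basic idea is to keep the number as an array of small digits and do the carrying by hand, so no intermediate value ever gets big enough to lose precision in a double. Here's a rough sketch of one direction (hex to decimal); the function names are my own and this is not the exact code from the page linked below:

// Multiply a little-endian digit array (in the given base) by a small
// multiplier and add a small addend, carrying by hand.
function multiplyAdd(digits, base, multiplier, addend) {
  var carry = addend;
  for (var i = 0; i < digits.length; i++) {
    var value = digits[i] * multiplier + carry;
    digits[i] = value % base;
    carry = Math.floor(value / base);
  }
  while (carry > 0) {
    digits.push(carry % base);
    carry = Math.floor(carry / base);
  }
}

// Convert a hex string to a decimal string one hex digit at a time:
// result = result * 16 + digit, carried out on an array of decimal digits.
function hexToDec(hex) {
  var digits = [0];  // little-endian decimal digits of the result
  for (var i = 0; i < hex.length; i++) {
    multiplyAdd(digits, 10, 16, parseInt(hex.charAt(i), 16));
  }
  return digits.reverse().join('');
}

hexToDec("123456789abcdef") = "81985529216486895"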
The algorithm is pretty cool. You can see a demo, read an explanation and get the code here:
http://danvk.org/hex2dec.html.