How to decode (very large in value) hex string into decimal?

I'm trying to decode a hex string that looks like the following:

0x00000000000000000000000000000000000000000000000bf97e2a21966df7fe

I was able to decode it with an online calculator. The correctly decoded number should be 220892037897060743166.

However, when I try to decode it in Python with the following code, it returns an error:

"0x00000000000000000000000000000000000000000000000bf97e2a21966df7fe".decode("hex")

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-32-1cf86ff46cbc> in <module>()
      9 key=keyarr[0]
     10 
---> 11 "0x00000000000000000000000000000000000000000000000bf97e2a21966df7fe".decode("hex")

/usr/local/Cellar/python/2.7.13_1/Frameworks/Python.framework/Versions/2.7/lib/python2.7/encodings/hex_codec.py in hex_decode(input, errors)
     40     """
     41     assert errors == 'strict'
---> 42     output = binascii.a2b_hex(input)
     43     return (output, len(input))
     44 

TypeError: Non-hexadecimal digit found

Then I removed the 0x in front of the hex number and tried again:

"00000000000000000000000000000000000000000000000bf97e2a21966df7fe".decode("hex")

Then the output becomes:

'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x0b\xf9~*!\x96m\xf7\xfe'

I actually can't make sense of this output...

In case you are wondering where these numbers come from, they are from Ethereum blockchain (ERC20) tokens.

Call int with base 16:

int(your_string, base=16)

.decode('hex') means you want the string to be treated as the hex encoding of a sequence of individual characters (raw bytes), not as one big number.
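In other words, the byte string you got back from .decode('hex') is just the big-endian byte representation of the same number. A minimal sketch (assuming Python 3, standard library only; the variable names are mine) of going from those raw bytes to the integer:

hex_string = '0x00000000000000000000000000000000000000000000000bf97e2a21966df7fe'

# Drop the '0x' prefix and decode the hex digits into raw bytes
# (the same bytes that .decode('hex') produced in Python 2).
raw = bytes.fromhex(hex_string[2:])

# Interpret those bytes as one big-endian unsigned integer.
print(int.from_bytes(raw, byteorder='big'))   # 220892037897060743166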

With Python 3.6.1:

>>> a = '0x00000000000000000000000000000000000000000000000bf97e2a21966df7fe'
>>> a
'0x00000000000000000000000000000000000000000000000bf97e2a21966df7fe'
>>> int(a, 16)
220892037897060743166