byteArray to Hex NSString - adds some wrong hex content

I am trying to convert a byteArray to a hexadecimal NSString.

This is the solution I referred to for converting it to a hex NSString. However, I found that it inserts runs of ffffffffffffff. How do I get the correct hex NSString?

Best way to serialize an NSData into a hexadeximal string

const char myByteArray[] = {
    0x12,0x23,0x34,0x45,0x56,0x67,0x78,0x89,
    0x12,0x23,0x34,0x45,
    0x56,0x67,0x78,0x89 };

NSData *myByteData = [NSData dataWithBytes:myByteArray length:sizeof(myByteArray)];
NSMutableString *myHexString = [NSMutableString stringWithCapacity:myByteData.length * 2];
for (int i = 0; i < myByteData.length; i++) {
    NSString *resultString = [NSString stringWithFormat:@"%02lx", (unsigned long)myByteArray[i]];
    [myHexString appendString:resultString];
}

Output string:

12233445566778ffffffffffffff8912233445566778ffffffffffffff89

Don't cast each byte to unsigned long. Also, what is the point of myByteData if you never use it?

And since you aren't really using these values as characters, use uint8_t.

Try this:

const uint8_t myByteArray[] = {
    0x12,0x23,0x34,0x45,0x56,0x67,0x78,0x89,
    0x12,0x23,0x34,0x45,
    0x56,0x67,0x78,0x89 };

size_t len = sizeof(myByteArray) / sizeof(uint8_t);
NSMutableString *myHexString = [NSMutableString stringWithCapacity:len * 2];
for (size_t i = 0; i < len; i++) {
    [myHexString appendFormat:@"%02x", (int)myByteArray[i]];
}

Your initial byte data is char, not unsigned char. That means any value greater than 127 (0x7f) is treated as a two's-complement negative number, and when it is sign-extended to unsigned long it prints as ffffffffffffff89.

If you change the data to unsigned char, you will get the desired result.

const unsigned char myByteArray[] = {
    0x12,0x23,0x34,0x45,0x56,0x67,0x78,0x89,
    0x12,0x23,0x34,0x45,
    0x56,0x67,0x78,0x89 };

NSData *myByteData = [NSData dataWithBytes:myByteArray length:sizeof(myByteArray)];
NSMutableString *myHexString = [NSMutableString stringWithCapacity:myByteData.length * 2];
for (int i = 0; i < myByteData.length; i++) {
    NSString *resultString = [NSString stringWithFormat:@"%02lx", (unsigned long)myByteArray[i]];
    [myHexString appendString:resultString];
}