Why can String.prototype.charCodeAt() convert a binary string into a Uint8Array?

Suppose I have a base64-encoded string and I want to convert it into an ArrayBuffer. I can do it like this:

// base64 decode the string to get the binary data
const binaryString = window.atob(base64EncodedString);

// convert from a binary string to an ArrayBuffer
const buf = new ArrayBuffer(binaryString.length);
const bufView = new Uint8Array(buf);   
for (let i = 0, strLen = binaryString.length; i < strLen; i++) {
    bufView[i] = binaryString.charCodeAt(i);
}

// get ArrayBuffer: `buf`  
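The snippet above can be wrapped into a small helper for testing. The function name `base64ToArrayBuffer` is mine, not from the post, and I use the global `atob` (available in browsers as `window.atob` and in Node 16+):

```javascript
// Hypothetical helper wrapping the snippet above (the name is my own choice)
function base64ToArrayBuffer(base64EncodedString) {
  // base64 decode the string to get the binary data
  const binaryString = atob(base64EncodedString);

  // convert from a binary string to an ArrayBuffer
  const buf = new ArrayBuffer(binaryString.length);
  const bufView = new Uint8Array(buf);
  for (let i = 0, strLen = binaryString.length; i < strLen; i++) {
    bufView[i] = binaryString.charCodeAt(i);
  }
  return buf;
}

// "aGVsbG8=" is the base64 encoding of "hello"
const buf = base64ToArrayBuffer("aGVsbG8=");
const bytes = new Uint8Array(buf);
```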

String.prototype.charCodeAt() returns an integer between 0 and 65535 representing the UTF-16 code unit at the given index. But a Uint8Array can only hold values in the range [0, 255].
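To see why this range mismatch could matter: for a non-ASCII character, charCodeAt() really can return a value above 255, and assigning such a value to a Uint8Array silently keeps only the low 8 bits:

```javascript
// '€' is a single UTF-16 code unit with value 8364, well above 255
const euroCode = '€'.charCodeAt(0);

// Writing it into a Uint8Array keeps only the low 8 bits: 8364 % 256 === 172
const arr = new Uint8Array(1);
arr[0] = euroCode;
```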

I initially thought the code units we get from charCodeAt() might fall outside the Uint8Array range. Then I looked at the built-in atob() function, which returns an ASCII string containing the decoded data. According to Binary Array, the range of an ASCII string is 0 to 127, which is contained within the Uint8Array range, so we can safely use charCodeAt() in this case.
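This can be checked directly. A quick sketch (my own example, not from the post) decodes the base64 string `'//8='`, which encodes the two bytes 0xFF 0xFF, and confirms that every code unit atob() produces fits in a byte:

```javascript
// '//8=' is the base64 encoding of the two bytes 0xFF 0xFF,
// the largest byte value atob can produce
const decoded = atob('//8=');

// Collect the code unit of each character in the decoded string
const codes = [...decoded].map(function (ch) { return ch.charCodeAt(0); });
```

Note that this also shows atob() can produce code units up to 255, not just the 0–127 ASCII range.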

That's my understanding. I'm not sure whether I've explained it correctly. Thanks for your help!

It looks like my understanding was correct.

Thanks to @Konrad; here is their addition:

charCodeAt is designed to support UTF-16. And UTF-16 was designed to be compatible with ASCII, so the first 256 characters have exactly the same values as in ASCII-compatible encoding.
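That compatibility claim is easy to verify with a quick loop (my own check, not from the answer): for every byte value 0 through 255, building a one-character string and reading its code unit back gives the original value unchanged.

```javascript
// Check that for every value 0..255, round-tripping through a one-character
// string preserves the value, i.e. charCodeAt is byte-faithful in this range
let allMatch = true;
for (let byte = 0; byte < 256; byte++) {
  if (String.fromCharCode(byte).charCodeAt(0) !== byte) {
    allMatch = false;
  }
}
```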