
How can I understand the result of this test case in this code challenge?

I am trying to understand the first test case of this challenge on Codeforces.

The description is:

Sergey is testing a next-generation processor. Instead of bytes the processor works with memory cells consisting of n bits. These bits are numbered from 1 to n. An integer is stored in the cell in the following way: the least significant bit is stored in the first bit of the cell, the next significant bit is stored in the second bit, and so on; the most significant bit is stored in the n-th bit. Now Sergey wants to test the following instruction: "add 1 to the value of the cell". As a result of the instruction, the integer that is written in the cell must be increased by one; if some of the most significant bits of the resulting number do not fit into the cell, they must be discarded. Sergey wrote certain values of the bits in the cell and is going to add one to its value. How many bits of the cell will change after the operation?

Summary

Given a binary number, add 1 to its decimal value; how many bits change after the operation?

Test cases

4

1100
= 3

4

1111
= 4

Note: in the first sample the cell's final value is 0010, and in the second sample it is 0000.

In the second test case, 1111 is 15, and 15 + 1 = 16 (binary 10000), so all the 1s change, hence the answer is 4.

But in the first test case, 1100 is 12, and 12 + 1 = 13 (binary 1101), so only the last bit changes here; yet the answer is 3. Why?

You missed the key part: the least significant bit is the first one, i.e. the leftmost, not the last one as in the usual way of writing binary numbers.

So 1100 is not 12, it is 3. Therefore 1100 + 1 = 3 + 1 = 4 = 0010, which means 3 bits change.

"Least significant bit" literally means the bit that matters least, i.e. "the one representing the smallest value". In binary, the bit worth 2^0 is the least significant one. So the binary strings in your task are laid out like this:

bit no. 0    1    2    3    4   (...)
value   2^0  2^1  2^2  2^3  2^4 (...)
        | least             | most
        | significant       | significant
        | bit               | bit
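
To see the two readings side by side, here is a quick Python check (my own illustration, not part of the original answer; note that Python's int(s, 2) parses a string MSB first, so the task's LSB-first string has to be reversed before parsing):

s = "1100"              # the bit string exactly as printed in the task
print(int(s, 2))        # 12 -- the usual MSB-first reading you assumed
print(int(s[::-1], 2))  # 3  -- the task's LSB-first reading
print(sum(int(b) * 2**i for i, b in enumerate(s)))  # 3, summing bit * 2^i directly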

That is why 1100 is:

1100 = 1*2^0 + 1*2^1 + 0*2^2 + 0*2^3 = 1 + 2 + 0 + 0 = 3

and not the other way around (as binary numbers are usually written).
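
For completeness, here is a minimal Python sketch of one way to solve the whole task (my own code, not an official solution; the helper name changed_bits is made up). It simulates the "add 1" instruction, discards the overflowing bits with a modulo, and counts the positions that differ:

def changed_bits(n: int, cell: str) -> int:
    # cell stores its bits LSB first, exactly as in the task statement
    value = int(cell[::-1], 2)                    # current value of the cell
    new_value = (value + 1) % (1 << n)            # add 1, drop overflowing bits
    new_cell = format(new_value, f"0{n}b")[::-1]  # back to an LSB-first string
    return sum(a != b for a, b in zip(cell, new_cell))

print(changed_bits(4, "1100"))  # 3 -- the cell becomes 0010
print(changed_bits(4, "1111"))  # 4 -- the cell becomes 0000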