Testcase failed after converting code from Objective-C to Swift

I am doing some bit manipulation in Swift, ported from code originally written in Objective-C/C. I use UnsafeMutablePointer to hold the start address of the allocated memory, and UnsafeMutableBufferPointer to access the elements within that range.

You can access the original Objective-C file Here.

public init(size: Int) {
    self.size = size
    self.bitsLength = (size + 31) / 32
    self.startIdx = UnsafeMutablePointer<Int32>.alloc(bitsLength * sizeof(Int32))
    self.bits = UnsafeMutableBufferPointer(start: startIdx, count: bitsLength)
}

/**
 * @param from first bit to check
 * @return index of first bit that is set, starting from the given index, or size if none are set
 *  at or beyond its given index
 */
public func nextSet(from: Int) -> Int {
    if from >= size { return size }
    var bitsOffset = from / 32
    var currentBits: Int32 = bits[bitsOffset]
    currentBits &= ~((1 << (from & 0x1F)) - 1).to32
    while currentBits == 0 {
        if ++bitsOffset == bitsLength {
            return size
        }
        currentBits = bits[bitsOffset]
    }
    let result: Int = bitsOffset * 32 + numberOfTrailingZeros(currentBits).toInt
    return result > size ? size : result
}

func numberOfTrailingZeros(i: Int32) -> Int {
    var i = i
    guard i != 0 else { return 32 }
    var n = 31
    var y: Int32
    y = i << 16
    if y != 0 { n = n - 16; i = y }
    y = i << 8
    if y != 0 { n = n - 8; i = y }
    y = i << 4
    if y != 0 { n = n - 4; i = y }
    y = i << 2
    if y != 0 { n = n - 2; i = y }
    return n - Int((UInt((i << 1)) >> 31))
}

Test case:

func testGetNextSet1() {
    // Passed
    var bits = BitArray(size: 32)
    for i in 0..<bits.size {
        XCTAssertEqual(32, bits.nextSet(i), "\(i)")
    }
    // Failed
    bits = BitArray(size: 34)
    for i in 0..<bits.size {
        XCTAssertEqual(34, bits.nextSet(i), "\(i)")
    }
}

Can someone point out why the second test case fails while the Objective-C version passes?

Edit: As @vacawama mentioned, if testGetNextSet is split into 2 tests, both pass.

Edit 2: When I run the tests with xctool, the test crashes while running as soon as it calls BitArray's nextSet().

Objective-C version of numberOfTrailingZeros:

// Ported from OpenJDK Integer.numberOfTrailingZeros implementation
- (int32_t)numberOfTrailingZeros:(int32_t)i {
    int32_t y;
    if (i == 0) return 32;
    int32_t n = 31;
    y = i <<16; if (y != 0) { n = n -16; i = y; }
    y = i << 8; if (y != 0) { n = n - 8; i = y; }
    y = i << 4; if (y != 0) { n = n - 4; i = y; }
    y = i << 2; if (y != 0) { n = n - 2; i = y; }
    return n - (int32_t)((uint32_t)(i << 1) >> 31);
}
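As a sanity check, the ported algorithm can be compared against Swift's built-in `trailingZeroBitCount` property (available on `FixedWidthInteger` since Swift 4). The sketch below uses modern Swift syntax and includes the corrected last line discussed further down; the standalone function is just for illustration:

```swift
// Port of the OpenJDK Integer.numberOfTrailingZeros algorithm,
// checked against Swift's built-in trailingZeroBitCount.
func numberOfTrailingZeros(_ input: Int32) -> Int {
    var i = input
    guard i != 0 else { return 32 }
    var n = 31
    var y: Int32
    y = i << 16; if y != 0 { n -= 16; i = y }
    y = i << 8;  if y != 0 { n -= 8;  i = y }
    y = i << 4;  if y != 0 { n -= 4;  i = y }
    y = i << 2;  if y != 0 { n -= 2;  i = y }
    // bitPattern keeps only the low 32 bits before the unsigned shift.
    return n - Int(UInt32(bitPattern: i << 1) >> 31)
}

// The port agrees with the standard library on a spread of values,
// including both extremes and negative inputs.
for x: Int32 in [1, 2, 8, 96, -8, Int32.min, Int32.max] {
    assert(numberOfTrailingZeros(x) == x.trailingZeroBitCount)
}
```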

When translating numberOfTrailingZeros, you changed the return value from Int32 to Int. That is fine, but the last line of the function misbehaves as you translated it.

In numberOfTrailingZeros, replace this:

return n - Int((UInt((i << 1)) >> 31))

with this:

return n - Int(UInt32(bitPattern: i << 1) >> 31)

Converting to UInt32 drops all but the lower 32 bits. Since you were converting to UInt, you weren't dropping those bits. It is necessary to use bitPattern to make this happen.
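A quick illustration (modern Swift syntax) of why the bit-pattern conversion matters: `UInt32(bitPattern:)` reinterprets the 32 bits as-is, whereas passing a negative `Int32` to a plain unsigned initializer traps at runtime, which may be the crash seen when running the tests:

```swift
let x: Int32 = -4                         // bit pattern 0xFFFFFFFC

// UInt32(bitPattern:) reinterprets the same 32 bits unchanged:
let reinterpreted = UInt32(bitPattern: x) // 4294967292 (0xFFFFFFFC)
assert(reinterpreted >> 31 == 1)          // the sign bit survives as the top bit

// By contrast, value-preserving conversion of a negative number traps:
// let crashes = UInt(x)                  // runtime error: negative value
//                                        // is not representable
```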

Finally, it turned out startIdx just needed to be initialized after being allocated:

self.startIdx = UnsafeMutablePointer<Int32>.alloc(bitsLength * sizeof(Int32))
self.startIdx.initializeFrom(Array(count: bitsLength, repeatedValue: 0))

Or do it in one line with calloc:

self.startIdx = unsafeBitCast(calloc(bitsLength, sizeof(Int32)), UnsafeMutablePointer<Int32>.self)

Additionally, I used a lazy var to defer initialization of the UnsafeMutableBufferPointer until the property is first used.

lazy var bits: UnsafeMutableBufferPointer<Int32> = {
   return UnsafeMutableBufferPointer<Int32>(start: self.startIdx, count: self.bitsLength)
}()

And on the flip side, don't forget deinit:

deinit {
    startIdx.destroy()
    startIdx.dealloc(bitsLength * sizeof(Int32))
}
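For reference, in current Swift (where `alloc`/`initializeFrom`/`destroy`/`dealloc` no longer exist) the same allocate/zero/tear-down lifecycle is spelled with `allocate(capacity:)`, `initialize(repeating:count:)`, `deinitialize(count:)` and `deallocate()`. A minimal sketch, with bitsLength chosen arbitrarily for illustration:

```swift
// Same lifecycle as the init/deinit above, in current Swift.
let bitsLength = 2
let startIdx = UnsafeMutablePointer<Int32>.allocate(capacity: bitsLength)
startIdx.initialize(repeating: 0, count: bitsLength)   // zero-fill, like calloc

let bits = UnsafeMutableBufferPointer(start: startIdx, count: bitsLength)
bits[1] = 5
assert(bits[0] == 0 && bits[1] == 5)

// Mirror of deinit:
startIdx.deinitialize(count: bitsLength)
startIdx.deallocate()
```

Note that `allocate(capacity:)` takes a count of elements, not a byte count, so there is no `sizeof` multiplication here.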