CMSampleBuffer frame converted to vImage has wrong colors
I'm trying to convert a CMSampleBuffer from the camera output to a vImage and then apply some processing. Unfortunately, even without any further editing, the frame I get from the buffer has the wrong colors:

Implementation (disregarding memory management and error handling):

Configuring the video output device:
videoDataOutput = AVCaptureVideoDataOutput()
videoDataOutput.videoSettings = [String(kCVPixelBufferPixelFormatTypeKey): kCVPixelFormatType_32BGRA]
videoDataOutput.alwaysDiscardsLateVideoFrames = true
videoDataOutput.setSampleBufferDelegate(self, queue: captureQueue)
videoConnection = videoDataOutput.connection(withMediaType: AVMediaTypeVideo)
captureSession.sessionPreset = AVCaptureSessionPreset1280x720
let videoDevice = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)
guard let videoDeviceInput = try? AVCaptureDeviceInput(device: videoDevice) else {
    return
}
Creating a vImage from the CMSampleBuffer received from the camera:
// Convert `CMSampleBuffer` to `CVImageBuffer`
guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
var buffer: vImage_Buffer = vImage_Buffer()
buffer.data = CVPixelBufferGetBaseAddress(pixelBuffer)
buffer.rowBytes = CVPixelBufferGetBytesPerRow(pixelBuffer)
buffer.width = vImagePixelCount(CVPixelBufferGetWidth(pixelBuffer))
buffer.height = vImagePixelCount(CVPixelBufferGetHeight(pixelBuffer))
let vformat = vImageCVImageFormat_CreateWithCVPixelBuffer(pixelBuffer)
let bitmapInfo: CGBitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.last.rawValue | CGBitmapInfo.byteOrder32Little.rawValue)
var cgFormat = vImage_CGImageFormat(bitsPerComponent: 8,
bitsPerPixel: 32,
colorSpace: nil,
bitmapInfo: bitmapInfo,
version: 0,
decode: nil,
renderingIntent: .defaultIntent)
// Create vImage
vImageBuffer_InitWithCVPixelBuffer(&buffer, &cgFormat, pixelBuffer, vformat!.takeRetainedValue(), cgColor, vImage_Flags(kvImageNoFlags))
Converting the buffer to a UIImage:

For testing, the CVPixelBuffer is exported to a UIImage, but adding it back to the video buffer has the same result.
var dstPixelBuffer: CVPixelBuffer?
let status = CVPixelBufferCreateWithBytes(nil, Int(buffer.width), Int(buffer.height),
kCVPixelFormatType_32BGRA, buffer.data,
Int(buffer.rowBytes), releaseCallback,
nil, nil, &dstPixelBuffer)
let destCGImage = vImageCreateCGImageFromBuffer(&buffer, &cgFormat, nil, nil, numericCast(kvImageNoFlags), nil)?.takeRetainedValue()
// create a UIImage
let exportedImage = destCGImage.flatMap { UIImage(cgImage: $0, scale: 0.0, orientation: UIImageOrientation.right) }
DispatchQueue.main.async {
    self.previewView.image = exportedImage
}
The call to vImageBuffer_InitWithCVPixelBuffer is modifying the contents of both your vImage_Buffer and your CVPixelBuffer, which is a bit naughty, because in your (linked) code you promise not to modify the pixels when you say

CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
The correct way to initialise CGBitmapInfo for BGRA8888 is alpha first, 32-bit little endian, which is non-obvious, but covered in the headerdoc for vImage_CGImageFormat in vImage_Utilities.h:
let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.first.rawValue | CGImageByteOrderInfo.order32Little.rawValue)
I don't understand why vImageBuffer_InitWithCVPixelBuffer is modifying your buffer, as cgFormat (the desiredFormat) should match vformat, but it is documented as modifying the buffer, so maybe you should copy the data first.
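If you do want to defend against the source pixels being mutated, one option is to copy the pixel data into a separately allocated vImage_Buffer before doing any vImage work on it. A minimal sketch, untested and assuming the kCVPixelFormatType_32BGRA settings from the question (the function name copyPixels is made up for illustration):

```swift
import Accelerate
import CoreVideo

// Allocate a destination buffer the same size as the source and copy it row by
// row, so later vImage operations touch only the copy, never the camera's pixels.
func copyPixels(from pixelBuffer: CVPixelBuffer) -> vImage_Buffer {
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    let height = CVPixelBufferGetHeight(pixelBuffer)
    let width = CVPixelBufferGetWidth(pixelBuffer)
    let srcRowBytes = CVPixelBufferGetBytesPerRow(pixelBuffer)

    var copy = vImage_Buffer()
    // 32 bits per pixel, matching kCVPixelFormatType_32BGRA
    vImageBuffer_Init(&copy, vImagePixelCount(height), vImagePixelCount(width),
                      32, vImage_Flags(kvImageNoFlags))

    let src = CVPixelBufferGetBaseAddress(pixelBuffer)!
    for row in 0..<height {
        // Row strides may differ between the two buffers, so copy per row.
        memcpy(copy.data.advanced(by: row * copy.rowBytes),
               src.advanced(by: row * srcRowBytes),
               min(copy.rowBytes, srcRowBytes))
    }
    return copy // caller releases copy.data with free() when done
}
```

The per-row memcpy matters because vImageBuffer_Init may pad rowBytes for alignment, so a single flat copy of width × height × 4 bytes would skew the rows.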
Try setting the colour space on your CV image format:
let vformat = vImageCVImageFormat_CreateWithCVPixelBuffer(pixelBuffer).takeRetainedValue()
vImageCVImageFormat_SetColorSpace(vformat,
CGColorSpaceCreateDeviceRGB())
...and update your call to vImageBuffer_InitWithCVPixelBuffer to reflect the fact that vformat is now a managed reference:
let error = vImageBuffer_InitWithCVPixelBuffer(&buffer, &cgFormat, pixelBuffer, vformat, nil, vImage_Flags(kvImageNoFlags))
Finally, you can remove the following lines; vImageBuffer_InitWithCVPixelBuffer is doing that work for you:
// buffer.data = CVPixelBufferGetBaseAddress(pixelBuffer)
// buffer.rowBytes = CVPixelBufferGetBytesPerRow(pixelBuffer)
// buffer.width = vImagePixelCount(CVPixelBufferGetWidth(pixelBuffer))
// buffer.height = vImagePixelCount(CVPixelBufferGetHeight(pixelBuffer))
Note that you don't need to lock the Core Video pixel buffer: if you look at the headerdoc, it says "It is not necessary to lock the CVPixelBuffer before calling this function".
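Putting these suggestions together, the conversion might look like the sketch below. This is untested and assumes the 32BGRA capture settings from the question; the function name makeBuffer is made up for illustration:

```swift
import Accelerate
import CoreMedia
import CoreVideo

func makeBuffer(from sampleBuffer: CMSampleBuffer) -> vImage_Buffer? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }

    // BGRA8888 is alpha first, 32-bit little endian.
    let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.first.rawValue |
                                            CGImageByteOrderInfo.order32Little.rawValue)
    var cgFormat = vImage_CGImageFormat(bitsPerComponent: 8,
                                        bitsPerPixel: 32,
                                        colorSpace: nil,
                                        bitmapInfo: bitmapInfo,
                                        version: 0,
                                        decode: nil,
                                        renderingIntent: .defaultIntent)

    // Describe the CVPixelBuffer and give it an explicit colour space.
    let vformat = vImageCVImageFormat_CreateWithCVPixelBuffer(pixelBuffer).takeRetainedValue()
    vImageCVImageFormat_SetColorSpace(vformat, CGColorSpaceCreateDeviceRGB())

    // No manual data/rowBytes/width/height setup and no locking needed:
    // vImageBuffer_InitWithCVPixelBuffer allocates and fills the buffer itself.
    var buffer = vImage_Buffer()
    let error = vImageBuffer_InitWithCVPixelBuffer(&buffer, &cgFormat, pixelBuffer,
                                                   vformat, nil, vImage_Flags(kvImageNoFlags))
    guard error == kvImageNoError else { return nil }
    return buffer // caller releases buffer.data with free() when done
}
```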