What is the best way to record a video with augmented reality?

What is the best way to record a video with augmented reality? (adding text and image logos to the frames coming from the iPhone/iPad camera)

Previously I tried to figure out how to draw into a CIImage (How to draw text into CIImage?) and how to convert a CIImage back to a CMSampleBuffer (CIImage back to CMSampleBuffer).

I got almost everything working; the only problem left was recording video with the new CMSampleBuffer via AVAssetWriterInput.

But this solution is not good anyway: it eats a lot of CPU while converting the CIImage to a CVPixelBuffer (ciContext.render(ciImage!, to: aBuffer)).

So I want to stop here and find some other way to record a video with augmented reality (for example, dynamically adding (drawing) text into the frames while encoding the video into an mp4 file).

Here is what I tried, but don't want to use anymore...

// convert original CMSampleBuffer to CIImage, 
// combine multiple `CIImage`s into one (adding augmented reality -  
// text or some additional images)
let pixelBuffer: CVPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
let ciimage : CIImage = CIImage(cvPixelBuffer: pixelBuffer)
var outputImage: CIImage?
let images : Array<CIImage> = [ciimage, ciimageSec!] // add all your CIImages that you'd like to combine
for image in images {
    outputImage = outputImage == nil ? image : image.composited(over: outputImage!)
}

// allocate this class variable once         
if pixelBufferNew == nil {
    CVPixelBufferCreate(kCFAllocatorSystemDefault, CVPixelBufferGetWidth(pixelBuffer),  CVPixelBufferGetHeight(pixelBuffer), kCVPixelFormatType_32BGRA, nil, &pixelBufferNew)
}

// convert CIImage to CVPixelBuffer
// NOTE: creating a CIContext is expensive - allocate it once (e.g. as a
// class variable) and reuse it for every frame instead of recreating it here
let ciContext = CIContext(options: nil)
if let aBuffer = pixelBufferNew {
    ciContext.render(outputImage!, to: aBuffer) // >>> IT EATS A LOT OF <<< CPU
}

// convert new CVPixelBuffer to new CMSampleBuffer
var sampleTime = CMSampleTimingInfo()
sampleTime.duration = CMSampleBufferGetDuration(sampleBuffer)
sampleTime.presentationTimeStamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
sampleTime.decodeTimeStamp = CMSampleBufferGetDecodeTimeStamp(sampleBuffer)
var videoInfo: CMVideoFormatDescription? = nil
CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, pixelBufferNew!, &videoInfo)
var oBuf: CMSampleBuffer?
CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, pixelBufferNew!, true, nil, nil, videoInfo!, &sampleTime, &oBuf)

/*
try to append the new CMSampleBuffer to a file (.mp4) using
AVAssetWriter & AVAssetWriterInput... (I ran into errors with it; the original
buffer works ok - from func captureOutput(_ output: AVCaptureOutput,
didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection))
*/
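One way to sidestep the CMSampleBuffer recreation (and the append errors) entirely is AVAssetWriterInputPixelBufferAdaptor, which appends a CVPixelBuffer directly with a presentation time. A minimal sketch, assuming `writerInput` is an AVAssetWriterInput already attached to a started AVAssetWriter and `pixelBufferNew` holds the rendered frame:

```swift
import AVFoundation

// Create the adaptor once, alongside the writer input
let adaptor = AVAssetWriterInputPixelBufferAdaptor(
    assetWriterInput: writerInput,
    sourcePixelBufferAttributes: [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ])

// Then, inside captureOutput(_:didOutput:from:), after rendering the
// composited CIImage into pixelBufferNew:
let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
if writerInput.isReadyForMoreMediaData, let buffer = pixelBufferNew {
    adaptor.append(buffer, withPresentationTime: pts)
}
```

This removes the whole CMSampleTimingInfo / CMVideoFormatDescription / CMSampleBufferCreateForImageBuffer step, though the CPU cost of ciContext.render(...) itself remains.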

Is there a better solution?

Now I will answer my own question.

The best option is to use an Objective-C++ class (.mm), where we can use OpenCV to easily and quickly convert a CMSampleBuffer to a cv::Mat, and back to a CMSampleBuffer after processing.

We can easily call Objective-C++ functions from Swift.
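A sketch of that Objective-C++ helper, assuming the camera delivers kCVPixelFormatType_32BGRA frames (the function name and overlay text are placeholders). Because the cv::Mat wraps the buffer's bytes without copying, drawing happens in place and no conversion back is needed:

```objc
// Overlay.mm - callable from Swift via a bridging header
#import <CoreMedia/CoreMedia.h>
#import <CoreVideo/CoreVideo.h>
#import <opencv2/opencv.hpp>

// Draws text directly into the frame carried by the sample buffer;
// the original CMSampleBuffer can then be appended to the writer as usual.
void drawOverlayInPlace(CMSampleBufferRef sampleBuffer) {
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    void *base    = CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t width  = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);
    size_t stride = CVPixelBufferGetBytesPerRow(pixelBuffer);

    // CV_8UC4 view over the BGRA bytes - no copy, cv::putText mutates
    // the pixel buffer directly
    cv::Mat frame((int)height, (int)width, CV_8UC4, base, stride);
    cv::putText(frame, "AR overlay", cv::Point(40, 80),
                cv::FONT_HERSHEY_SIMPLEX, 1.5,
                cv::Scalar(255, 255, 255, 255), 3);

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
}
```

Since the buffer is modified in place, this avoids both the CIImage round trip and the CVPixelBuffer/CMSampleBuffer recreation, which is where the CPU was going.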