How to create an instance of a Sample Buffer (CMSampleBufferRef)?
I'm trying to write an iOS camera app, and I took part of the code from Apple:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Create a UIImage from the sample buffer data
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
    // < Add your code here that uses the image >
}
I need to call this function from anywhere in my program, but it requires an object of type (CMSampleBufferRef). How do I create one?
I tried writing something like this:
buf1 = [[CMSampleBufferRef alloc] init]
but that approach doesn't work.
Try all of these (one of them may work):
UIImage *image = [self imageFromSampleBuffer:&sampleBuffer];
UIImage *image = [self imageFromSampleBuffer:(id)sampleBuffer];
UIImage *image = [self imageFromSampleBuffer:(__bridge CMSampleBufferRef)sampleBuffer];
UIImage *image = [self imageFromSampleBuffer:(__bridge id)sampleBuffer];
If none of these work, create a reference to the CMSampleBuffer's CVImageBuffer, without adding any of the above, to replace sampleBuffer in the UIImage method:
CVImageBufferRef cvImageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
If that doesn't work, you can create a separate method that converts a CMSampleBuffer into a UIImage, like this:
// Create a UIImage from sample buffer data
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    // Get the CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
        bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    // Free up the context and color space
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    // Create an image object from the Quartz image
    UIImage *image = [UIImage imageWithCGImage:quartzImage];
    // Release the Quartz image
    CGImageRelease(quartzImage);
    return image;
}
This method works; in fact, it's the one I use.
Here is the snippet I currently use to mock a CMSampleBuffer for unit tests in Swift 3:
fileprivate func getCMSampleBuffer() -> CMSampleBuffer {
    var pixelBuffer: CVPixelBuffer? = nil
    CVPixelBufferCreate(kCFAllocatorDefault, 100, 100, kCVPixelFormatType_32BGRA, nil, &pixelBuffer)
    var info = CMSampleTimingInfo()
    info.presentationTimeStamp = kCMTimeZero
    info.duration = kCMTimeInvalid
    info.decodeTimeStamp = kCMTimeInvalid
    var formatDesc: CMFormatDescription? = nil
    CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer!, &formatDesc)
    var sampleBuffer: CMSampleBuffer? = nil
    CMSampleBufferCreateReadyWithImageBuffer(kCFAllocatorDefault,
                                             pixelBuffer!,
                                             formatDesc!,
                                             &info,
                                             &sampleBuffer)
    return sampleBuffer!
}
Swift 5 version of @Rotem Tamir's answer:
fileprivate func getCMSampleBuffer() -> CMSampleBuffer {
    var pixelBuffer: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault, 100, 100, kCVPixelFormatType_32BGRA, nil, &pixelBuffer)
    var info = CMSampleTimingInfo()
    info.presentationTimeStamp = CMTime.zero
    info.duration = CMTime.invalid
    info.decodeTimeStamp = CMTime.invalid
    var formatDesc: CMFormatDescription?
    CMVideoFormatDescriptionCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                                 imageBuffer: pixelBuffer!,
                                                 formatDescriptionOut: &formatDesc)
    var sampleBuffer: CMSampleBuffer?
    CMSampleBufferCreateReadyWithImageBuffer(allocator: kCFAllocatorDefault,
                                             imageBuffer: pixelBuffer!,
                                             formatDescription: formatDesc!,
                                             sampleTiming: &info,
                                             sampleBufferOut: &sampleBuffer)
    return sampleBuffer!
}
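As a hedged illustration, the mock above could be exercised from a unit test; the test-class and method names here are hypothetical, and this sketch assumes getCMSampleBuffer() is visible to the test target (it depends on Apple's CoreMedia framework, so it only compiles on Apple platforms):

```swift
import XCTest
import CoreMedia

final class SampleBufferMockTests: XCTestCase {
    func testMockSampleBufferCarriesAPixelBuffer() {
        let sampleBuffer = getCMSampleBuffer()
        // The mock should be marked ready and wrap the 100x100 pixel buffer created above
        XCTAssertTrue(CMSampleBufferDataIsReady(sampleBuffer))
        let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
        XCTAssertNotNil(imageBuffer)
        XCTAssertEqual(CVPixelBufferGetWidth(imageBuffer!), 100)
        XCTAssertEqual(CVPixelBufferGetHeight(imageBuffer!), 100)
    }
}
```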
Swift 5.6
public func convertCMSampleBuffer(_ cvPixelBuffer: CVPixelBuffer?) -> CMSampleBuffer {
    var pixelBuffer = cvPixelBuffer
    // Only create a blank 100x100 BGRA buffer when no pixel buffer was passed in;
    // creating one unconditionally would overwrite the caller's buffer
    if pixelBuffer == nil {
        CVPixelBufferCreate(kCFAllocatorDefault, 100, 100, kCVPixelFormatType_32BGRA, nil, &pixelBuffer)
    }
    var info = CMSampleTimingInfo()
    info.presentationTimeStamp = CMTime.zero
    info.duration = CMTime.invalid
    info.decodeTimeStamp = CMTime.invalid
    var formatDesc: CMFormatDescription?
    CMVideoFormatDescriptionCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                                 imageBuffer: pixelBuffer!,
                                                 formatDescriptionOut: &formatDesc)
    var sampleBuffer: CMSampleBuffer?
    CMSampleBufferCreateReadyWithImageBuffer(allocator: kCFAllocatorDefault,
                                             imageBuffer: pixelBuffer!,
                                             formatDescription: formatDesc!,
                                             sampleTiming: &info,
                                             sampleBufferOut: &sampleBuffer)
    return sampleBuffer!
}