iOS memory building up while creating UIImage from CMSampleBufferRef

I'm creating UIImage objects from a CMSampleBufferRef. I do this on a separate (background) queue, so I wrap the processing in an @autoreleasepool. The problem is that memory keeps building up without any leak being reported. Here is the method I'm using:

- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    @autoreleasepool {
        // Get the CMSampleBuffer's Core Video image buffer for the media data
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        // Lock the base address of the pixel buffer
        CVPixelBufferLockBaseAddress(imageBuffer, 0);

        // Get the base address of the pixel buffer
        void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

        // Get the number of bytes per row for the pixel buffer
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
        // Get the pixel buffer width and height
        size_t width = CVPixelBufferGetWidth(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);

        // Create a device-dependent RGB color space
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

        // Create a bitmap graphics context with the sample buffer data
        CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                     bytesPerRow, colorSpace,
                                                     kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        // Create a Quartz image from the pixel data in the bitmap graphics context
        CGImageRef quartzImage = CGBitmapContextCreateImage(context);
        // Unlock the pixel buffer
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

        // Free up the context and color space
        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);

        // Create an image object from the Quartz image; the extra retain (MRC)
        // keeps it alive past the pool drain -- the caller must release it
        UIImage *image = [[UIImage imageWithCGImage:quartzImage] retain];

        // Release the Quartz image
        CGImageRelease(quartzImage);

        return image;
    }
}

This is how I use it:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CFRetain(sampleBuffer);
    dispatch_async(movieWritingQueue, ^{
        @autoreleasepool {
            if (self.returnCapturedImages && captureOutput != audioOutput) {

                UIImage *capturedImage = [self imageFromSampleBuffer:sampleBuffer];

                dispatch_async(callbackQueue, ^{
                    @autoreleasepool {
                        if (self.delegate && [self.delegate respondsToSelector:@selector(recorderCapturedImage:)]) {
                            [self.delegate recorderCapturedImage:capturedImage];
                        }
                        [capturedImage release];
                    }
                });
            }
            CFRelease(sampleBuffer);
        }
    });
}

Actually, I ran into a similar problem a few days ago...

You already release your CMSampleBufferRef, but also try releasing your CVPixelBufferRef, e.g.:

- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer
{
    @autoreleasepool {

       // ...

       // Free up the context and color space
       CGContextRelease(context);
       CGColorSpaceRelease(colorSpace);

       // Create an image object from the Quartz image
       UIImage *image = [[UIImage imageWithCGImage:quartzImage] retain];

       // Release the Quartz image
       CGImageRelease(quartzImage);

       CVPixelBufferRelease(imageBuffer);   // <-- release your pixel buffer

       return (image);
   }
}

I found a temporary workaround. I do the same operations, but on the main queue. This isn't elegant or efficient at all, but at least the memory doesn't build up anymore.

I wonder whether this is an iOS bug...?

Update: this is how I process the CMSampleBuffers on the main thread:

[[NSOperationQueue mainQueue] addOperationWithBlock:^{

    CGImageRef cgImage = [self cgImageFromSampleBuffer:sampleBuffer];
    UIImage *capturedImage = [UIImage imageWithCGImage:cgImage];

    // Do something with the image -- I suggest doing it on a background queue
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        // do something with capturedImage
    });

    CGImageRelease(cgImage);
    CFRelease(sampleBuffer);
}];

- (CGImageRef) cgImageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer,0);        // Lock the image buffer

    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);   // Get information of the image
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef newImage = CGBitmapContextCreateImage(newContext);
    CGContextRelease(newContext);

    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer,0);
    /* CVBufferRelease(imageBuffer); */  // do not call this!

    return newImage;
}
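One assumption both conversion methods above make is that the capture output delivers 32-bit BGRA frames, which is what the `kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst` bitmap context expects. A sketch of configuring the video data output accordingly (assuming `videoOutput` is the AVCaptureVideoDataOutput that feeds the `captureOutput:didOutputSampleBuffer:fromConnection:` callback above):

```objc
// Assumed setup: request BGRA so the CGBitmapContextCreate flags used in
// imageFromSampleBuffer: / cgImageFromSampleBuffer: match the pixel layout.
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
videoOutput.videoSettings = @{
    (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)
};
```

If the output is left at its default (often a biplanar YUV format), the base address handed to `CGBitmapContextCreate` would not be BGRA pixel data and the resulting images would be garbage, independent of the memory issue.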