iOS - CVPixelBufferCreate memory not released correctly when converting images to video

I am converting an array of images into a video, but the app keeps crashing with memory warnings because CVPixelBufferCreate allocates too much. I don't know how to handle it correctly. I have looked at many similar topics, but none of them solved my problem.

Here is my code:

- (void) writeImagesArray:(NSArray*)array asMovie:(NSString*)path
{
    NSError *error  = nil;
    UIImage *first = [array objectAtIndex:0];
    CGSize frameSize = first.size;
    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
                                  [NSURL fileURLWithPath:path] fileType:AVFileTypeQuickTimeMovie
                                                              error:&error];
    NSParameterAssert(videoWriter);
    
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithDouble:frameSize.width],AVVideoWidthKey,
                                   [NSNumber numberWithDouble:frameSize.height], AVVideoHeightKey,
                                   nil];
    
    AVAssetWriterInput* writerInput = [AVAssetWriterInput
                                       assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:videoSettings];
    
    self.adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                     sourcePixelBufferAttributes:nil];
    
    [videoWriter addInput:writerInput];
    
    //Start Session
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];
    
    int frameCount = 0;
    CVPixelBufferRef buffer = NULL;
    for(UIImage *img in array)
    {
        buffer = [self newPixelBufferFromCGImage:[img CGImage] andFrameSize:frameSize];
        if (self.adaptor.assetWriterInput.readyForMoreMediaData)
        {
            CMTime frameTime =  CMTimeMake(frameCount,FPS);
            [self.adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
        }
        if(buffer)
            CVPixelBufferRelease(buffer);
        
        frameCount++;
    }
    
    [writerInput markAsFinished];
    [videoWriter finishWritingWithCompletionHandler:^{
        
        if (videoWriter.status == AVAssetWriterStatusFailed) {
            
            NSLog(@"Movie save failed.");
            
        }else{
            
            NSLog(@"Movie saved.");
        }
    }];
    
    NSLog(@"Finished.");
}

        
- (CVPixelBufferRef)newPixelBufferFromCGImage: (CGImageRef) image andFrameSize:(CGSize)frameSize
{
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];
    
    CVPixelBufferRef pxbuffer = NULL;
    
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                          frameSize.width,
                                          frameSize.height, kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef) options,
                                          &pxbuffer);
    
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);
    
    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    NSParameterAssert(pxdata != NULL);
    
    CGBitmapInfo bitmapInfo = (CGBitmapInfo) kCGImageAlphaNoneSkipFirst;
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata,
                                                 frameSize.width,
                                                 frameSize.height,
                                                 8,
                                                 4*frameSize.width,
                                                 rgbColorSpace,
                                                 bitmapInfo);
    
    NSParameterAssert(context);
    CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
    CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
                                           CGImageGetHeight(image)), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);
    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    return pxbuffer;
}

Update:

I split my video into small segments and added `[NSThread sleepForTimeInterval:0.00005];` inside the loop, and the memory was magically released.

However, this line causes my UI to freeze for several seconds. Is there a better solution?

for(UIImage *img in array)
{
    buffer = [self newPixelBufferFromCGImage:[img CGImage] andFrameSize:frameSize];
    //CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, adaptor.pixelBufferPool, &buffer);
    if (adaptor.assetWriterInput.readyForMoreMediaData)
    {
        CMTime frameTime =  CMTimeMake(frameCount,FPS);
        [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
    }
    
    if(buffer)
        CVPixelBufferRelease(buffer);
    
    frameCount++;
    
    [NSThread sleepForTimeInterval:0.00005];
}

The memory usage looked like this: (screenshot not included)

From a quick look at your code, I can't see anything wrong with the management of the CVBuffer itself.
What I think may be the root of your problem is the array of UIImages.
UIImage has a behavior where the backing image data is not decoded into memory until you request the CGImage property or draw the image, so unused images have little memory impact.
Your enumeration calls the CGImage property on every image and you never get rid of them, which would explain the continuous growth in memory allocations.
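
One common remedy (a sketch, reusing `writeImagesArray:asMovie:` and `newPixelBufferFromCGImage:andFrameSize:` from the question) is to wrap each iteration in an `@autoreleasepool`, so autoreleased objects created while decoding and drawing each frame are drained per iteration instead of piling up until the method returns. The sketch also waits for `readyForMoreMediaData` rather than silently dropping frames:

```objectivec
// Inside writeImagesArray:asMovie: — drain per-frame autoreleased
// objects (decoded image data, temporary buffers) every iteration.
for (UIImage *img in array)
{
    @autoreleasepool {
        CVPixelBufferRef buffer = [self newPixelBufferFromCGImage:[img CGImage]
                                                     andFrameSize:frameSize];
        // Wait until the input can accept another frame instead of
        // dropping it; run this loop on a background queue, not the UI thread.
        while (!self.adaptor.assetWriterInput.readyForMoreMediaData) {
            [NSThread sleepForTimeInterval:0.01];
        }
        CMTime frameTime = CMTimeMake(frameCount, FPS);
        [self.adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];

        if (buffer) {
            CVPixelBufferRelease(buffer);
        }
        frameCount++;
    }
}
```

Dispatching the whole loop onto a background queue (e.g. with `dispatch_async`) keeps the UI responsive, unlike sleeping on the main thread as in your update.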

If you don't need the images afterwards, you can do this:

    // `images` must be an NSMutableArray so entries can be cleared, and
    // frameCount must be captured mutably, e.g.: __block int64_t frameCount = 0;
    [images enumerateObjectsUsingBlock:^(UIImage * _Nonnull img, NSUInteger idx, BOOL * _Nonnull stop) {
        CVPixelBufferRef pixelBuffer = [self pixelBufferFromCGImage:img.CGImage frameSize:[VDVideoEncodeConfig globalConfig].size];

        CMTime frameTime = CMTimeMake(frameCount, (int32_t)[VDVideoEncodeConfig globalConfig].frameRate);
        frameCount++;
        [_assetRW appendNewSampleBuffer:pixelBuffer pst:frameTime];

        CVPixelBufferRelease(pixelBuffer);
        // Clearing the slot releases the decoded image data that
        // img.CGImage keeps alive and that shows up in Instruments.
        images[idx] = [NSNull null];
    }];
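
A further refinement, building on the `CVPixelBufferPoolCreatePixelBuffer` line you commented out in your update: once `startWriting` has been called, the adaptor exposes a `pixelBufferPool`, and drawing into recycled pool buffers avoids a fresh `CVPixelBufferCreate` allocation per frame. A sketch (the helper name is hypothetical, not from your code):

```objectivec
// Hypothetical helper: render a CGImage into a buffer taken from the
// adaptor's pool instead of allocating a new one each frame.
- (CVPixelBufferRef)newPooledPixelBufferFromCGImage:(CGImageRef)image
                                            adaptor:(AVAssetWriterInputPixelBufferAdaptor *)adaptor
                                          frameSize:(CGSize)frameSize
{
    CVPixelBufferRef pxbuffer = NULL;
    // pixelBufferPool is non-NULL only after -[AVAssetWriter startWriting].
    CVReturn status = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                                         adaptor.pixelBufferPool,
                                                         &pxbuffer);
    if (status != kCVReturnSuccess || pxbuffer == NULL) {
        return NULL;
    }

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(pxbuffer),
                                                 frameSize.width,
                                                 frameSize.height,
                                                 8,
                                                 CVPixelBufferGetBytesPerRow(pxbuffer),
                                                 rgbColorSpace,
                                                 (CGBitmapInfo)kCGImageAlphaNoneSkipFirst);
    CGContextDrawImage(context, CGRectMake(0, 0, frameSize.width, frameSize.height), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);
    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
    // Caller still releases with CVPixelBufferRelease after appending.
    return pxbuffer;
}
```

For the pool to vend ARGB buffers of the right size, pass `sourcePixelBufferAttributes` (with `kCVPixelBufferPixelFormatTypeKey`, `kCVPixelBufferWidthKey`, and `kCVPixelBufferHeightKey`) when creating the adaptor instead of `nil`.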