GPUImage allocations not releasing in an ARC-enabled project

I'm taking multiple pictures in rapid succession and processing them with the GPUImage framework. I have a helper class whose main job is to run a GPUImageSubtractBlendFilter. Here is what I have:

#import "ImageProcessor.h"

@interface ImageProcessor ()

@end

@implementation ImageProcessor

GPUImageSubtractBlendFilter *subFilter;

- (id)init {
    if ((self = [super init])) {
        subFilter = [[GPUImageSubtractBlendFilter alloc] init];
    }
    return self;
}

- (UIImage *)flashSubtract:(UIImage *)image1 :(UIImage *)image2 {
    UIImage *processedImage;
//    @autoreleasepool {

    //CAUSING MEMORY ISSUE
    GPUImagePicture *img1 = [[GPUImagePicture alloc] initWithImage:image1];
    GPUImagePicture *img2 = [[GPUImagePicture alloc] initWithImage:image2];
    //MEMORY ISSUE END

    [img1 addTarget:subFilter];
    [img2 addTarget:subFilter];

    [img1 processImage];
    [img2 processImage];
    [subFilter useNextFrameForImageCapture];
    processedImage = [subFilter imageFromCurrentFramebuffer];

//    }

    //consider modifications to filter possibly?


    return processedImage;
}

@end

Memory keeps growing and never gets released, even with ARC enabled. I debugged it and narrowed the problem down to these two allocations:

 img1 = [[GPUImagePicture alloc] initWithImage:[imagesArray objectAtIndex:1]];
 img2 = [[GPUImagePicture alloc] initWithImage:[imagesArray objectAtIndex:0]];

What am I missing here, or what should I do to avoid allocating new GPUImagePicture instances on every pass?
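Along the lines of the commented-out @autoreleasepool above, this is the kind of variation I have been considering (just a sketch, not verified to fix the growth; removeAllTargets is GPUImageOutput's method for detaching targets):

- (UIImage *)flashSubtract:(UIImage *)image1 :(UIImage *)image2 {
    UIImage *processedImage;
    @autoreleasepool {
        GPUImagePicture *img1 = [[GPUImagePicture alloc] initWithImage:image1];
        GPUImagePicture *img2 = [[GPUImagePicture alloc] initWithImage:image2];

        [img1 addTarget:subFilter];
        [img2 addTarget:subFilter];

        [img1 processImage];
        [img2 processImage];
        [subFilter useNextFrameForImageCapture];
        processedImage = [subFilter imageFromCurrentFramebuffer];

        // Detach the pictures so the filter does not keep them (and their
        // GPU framebuffers) alive across calls.
        [img1 removeAllTargets];
        [img2 removeAllTargets];
    }
    return processedImage;
}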

Here is where this code gets called from:

- (void)burstModeCapture:(AVCaptureConnection *)videoConnection :(int)i { // start capturing pictures rapidly and cache them in RAM

    dispatch_group_t group = dispatch_group_create();
    dispatch_group_enter(group);

    NSLog(@"time entering: %d", i);


    [photoOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
     {

         if(error)
             NSLog(@"%s",[[error localizedDescription] UTF8String]);

         CVImageBufferRef cameraFrame = CMSampleBufferGetImageBuffer(imageSampleBuffer);
         CVPixelBufferLockBaseAddress(cameraFrame, 0);
         Byte *rawImageBytes = CVPixelBufferGetBaseAddress(cameraFrame);
         size_t bytesPerRow = CVPixelBufferGetBytesPerRow(cameraFrame);
         size_t width = CVPixelBufferGetWidth(cameraFrame);
         size_t height = CVPixelBufferGetHeight(cameraFrame);
         NSData *dataForRawBytes = [NSData dataWithBytes:rawImageBytes length:bytesPerRow * CVPixelBufferGetHeight(cameraFrame)];
         // Do whatever with your bytes

         // create suitable color space
         CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

         //Create suitable context (suitable for camera output setting kCVPixelFormatType_32BGRA)
         CGContextRef newContext = CGBitmapContextCreate(rawImageBytes, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

         CVPixelBufferUnlockBaseAddress(cameraFrame, 0);

         // release color space
         CGColorSpaceRelease(colorSpace);

         //Create a CGImageRef from the CVImageBufferRef
         CGImageRef newImage = CGBitmapContextCreateImage(newContext);
         UIImage *FinalImage = [[UIImage alloc] initWithCGImage:newImage];
         [imagesArray addObject:FinalImage];//append image to array


         dispatch_group_leave(group);
     }];

    dispatch_group_notify(group, dispatch_get_main_queue(), ^{//execute function recursively to shoot n photos
        //base case to stop shooting pictures
        shootCounter--;

        if (shootCounter <= 0) {
            [flash turnOffFlash];
            shootCounter = NUMSHOTS;
            UIImage *output = [self processImages]; //THIS IS WHERE MEMORY STARTS ACCUMULATING
            [self updateUIWithOutput:output];
            NSLog(@"Done shooting!");
        }
        else {
            [NSThread sleepForTimeInterval: 0.1];
            [self burstModeCapture:videoConnection : shootCounter];
        }
    });


}

I run this function recursively, twice per burst, to capture the images in pairs. [imageProcessor flashSubtract] is where the problem shows up.
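For reference, the initial call is along these lines (a hypothetical reconstruction of the call site; shootCounter starts at NUMSHOTS, which is 2 here since the images come in pairs):

    shootCounter = NUMSHOTS; // 2: one pair of images per burst (hypothetical setup)
    [self burstModeCapture:videoConnection : shootCounter];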

You are missing a CGContextRelease(newContext); after the CGImageRef newImage = CGBitmapContextCreateImage(newContext); line. That is likely what is causing your memory leak.
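Applied to the completion handler above, the cleanup would look roughly like this (note that CGBitmapContextCreateImage also follows the Create rule, so newImage needs its own CGImageRelease once the UIImage has been created; ARC does not manage Core Graphics references):

    CGImageRef newImage = CGBitmapContextCreateImage(newContext);
    UIImage *FinalImage = [[UIImage alloc] initWithCGImage:newImage];
    [imagesArray addObject:FinalImage]; // append image to array

    // Core Graphics objects returned by Create functions must be released manually:
    CGContextRelease(newContext);
    CGImageRelease(newImage); // -initWithCGImage: retains its own reference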