CGImageCreateWithImageInRect Holding Onto Image Data - Leaking?

I am trying to take a snapshot image, crop it, and save it to a UIImageView.

I have attacked this from dozens of different directions, but here is the general setup.

First off, I am using ARC with Xcode 7.2, testing on a 6 Plus phone running iOS 9.2.

Here is the delegate setup:

    - (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSLog(@"CameraViewController : imagePickerController");


    //Get the Image Data
    NSData *getDataImage = UIImageJPEGRepresentation([info objectForKey:UIImagePickerControllerOriginalImage], 0.9);

    // Turn it into a UI image
    UIImage *getCapturedImage = [[UIImage alloc] initWithData:getDataImage];

    // Figure out the size and build the rectangle we are going to put the image into
    CGSize imageSize = getCapturedImage.size;
    CGFloat imageScale = getCapturedImage.scale;
    int yCoord = (imageSize.height - ((imageSize.width*2)/3))/2;
    CGRect getRect = CGRectMake(0, yCoord, imageSize.width, ((imageSize.width*2)/3));
    CGRect rect = CGRectMake(getRect.origin.x*imageScale,
                             getRect.origin.y*imageScale,
                             getRect.size.width*imageScale,
                             getRect.size.height*imageScale);


    //Crop the image down to the rect computed above
    CGImageRef imageRef = CGImageCreateWithImageInRect([getCapturedImage CGImage], rect);

    //Stick the resulting image into an image variable
    UIImage *cropped = [UIImage imageWithCGImage:imageRef];

    //Release that reference
    CGImageRelease(imageRef);

    //Save the newly cropped image to a UIImageView property
    _imageView.image = cropped;


    _saveBtn.hidden = NO;
    [picker dismissViewControllerAnimated:YES completion:^{
        // After we are finished with dismissing the picker, run the below to close out the camera tool
        [self dismissCameraViewFromImageSelect];

    }];


}

When I run the above, I get the image below.



At this point I am looking at the image that was previously set on _imageView.image, and the image data has already gobbled up 30 MB. But when I leave this view, that image data is still retained.

If I then go through the process of capturing a new image, this is what I get.



When I bypass the crop and just assign the image to the image view, the 30 MB is not consumed.
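
For reference, the bypass is nothing more than skipping the crop and handing the picker's image straight to the image view (a minimal sketch, using the same variables as the delegate method above):

    //Skip CGImageCreateWithImageInRect() entirely and assign the uncropped image --
    //this path never shows the extra ~30 MB.
    _imageView.image = getCapturedImage;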

I have gone through every suggestion on this I could find, and none of them made a difference. To recap what I have tried without success:

  1. Putting it inside an @autoreleasepool block.

This never seems to work. Maybe I am not doing it right, but I have tried several different variations and none of them released the memory (a rough sketch of one attempt is shown after this list).

  2. CGImageRelease(imageRef);

I am already doing this, but I have also tried it in several different ways. Still no luck.

  3. CFRelease(imageRef);

That does not work either.

  4. Setting imageRef = nil;

It still holds onto the memory. Even combining it with CGImageRelease did not work for me.
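
For completeness, this is roughly what one of the @autoreleasepool attempts looked like (just a sketch of the shape of it, not a fix):

    //Do the crop inside an autorelease pool so any temporary objects created by
    //the crop are drained as soon as the block ends.
    UIImage *cropped = nil;
    @autoreleasepool {
        CGImageRef imageRef = CGImageCreateWithImageInRect([getCapturedImage CGImage], rect);
        cropped = [UIImage imageWithCGImage:imageRef];
        CGImageRelease(imageRef);
    }
    _imageView.image = cropped;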

I have also tried splitting the cropping out into its own function and returning the result, but still no success.

I have not found anything particularly helpful online; every reference to a similar problem makes the same suggestions, which do not seem to work (as noted above).

Thanks in advance for any suggestions.

Well, after thinking on this for a long while I decided to start from scratch. Since most of my recent work has been in Swift, I put together a Swift class that can be called to bring up the camera, control it, and hand the image back to the caller through a delegate.

The end result is that I no longer have the memory leak where some variable was holding onto the previous image in memory, and I can use it from my Obj-C ViewControllers by bridging the Swift class file in.
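
Calling the Swift class from an Obj-C view controller is just a matter of importing the generated Swift header and wiring up the delegate. A minimal sketch of that side, assuming the module is named CameraTesting, the Obj-C controller is a hypothetical ObjCViewController, and CameraOverlayDelegate is marked @objc so it appears in the generated header:

    //ObjCViewController.m (hypothetical) -- using the Swift CameraOverlay class from Obj-C.
    //"CameraTesting-Swift.h" is the header Xcode generates; the module name is an assumption.
    #import "CameraTesting-Swift.h"

    @interface ObjCViewController () <CameraOverlayDelegate>
    @property (nonatomic, strong) CameraOverlay *cameraOverlay;
    @property (nonatomic, weak) IBOutlet UIImageView *imgShowImage;
    @end

    @implementation ObjCViewController

    - (void)viewDidLoad {
        [super viewDidLoad];
        //Pass in the parent view and make this controller the delegate.
        self.cameraOverlay = [[CameraOverlay alloc] initWithParentView:self.view];
        self.cameraOverlay.delegate = self;
    }

    - (IBAction)btnPictureTouch:(id)sender {
        //Clear the old image and bring up the camera view.
        self.imgShowImage.image = nil;
        [self.cameraOverlay showCameraView];
    }

    //CameraOverlayDelegate callback: the captured image comes back here.
    - (void)cameraOverlayImage:(UIImage *)image {
        self.imgShowImage.image = image;
    }

    @end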

Here is the code for the class that does the capturing.

//
//  CameraOverlay.swift
//  CameraTesting
//
//  Created by Chris Cantley on 3/3/16.
//  Copyright © 2016 Chris Cantley. All rights reserved.
//

import Foundation
import UIKit
import AVFoundation

//We want to pass an image up to the parent class once the image has been taken so the easiest way to send it up
// and trigger the placing of the image is through a delegate.
protocol CameraOverlayDelegate: class {
    func cameraOverlayImage(image:UIImage)
}

class CameraOverlay: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    //MARK: Internal Variables

    //Setting up the delegate reference to be used later on.
    internal var delegate: CameraOverlayDelegate?


    //Variables for setting up the camera view
    internal var returnImage : UIImage!
    internal var previewView : UIView!
    internal var boxView:UIView!
    internal let myButton: UIButton = UIButton()

    //Setting up Camera Capture required properties
    internal var previewLayer:AVCaptureVideoPreviewLayer!
    internal var captureDevice : AVCaptureDevice!
    internal let session=AVCaptureSession()
    internal var stillImageOutput: AVCaptureStillImageOutput!

    //When we put up the camera preview and the button we have to reference a parent view, so this will hold the
    // parent view passed into the class so that other methods can work with it.
    internal var view : UIView!



    //When this class is instantiated, we want to require that the calling class passes us
    //some view that we can tie the camera previewer and button to.

    //MARK: - Instantiation Methods
    init(parentView: UIView){


        //Instantiate the reference to the passed-in UIView
        self.view = parentView

        //We are doing the following here because this only needs to be set up once per instantiation.

        //Create the output container with settings to specify that we are getting a still Image, and that it is a JPEG.
        stillImageOutput = AVCaptureStillImageOutput()
        stillImageOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]

        //Now we are sticking the image into the above formatted container
        session.addOutput(stillImageOutput)
    }

    //MARK: - Public Functions
    func showCameraView() {

        //This handles showing the camera previewer and button
        self.setupCameraView()

        //This sets up the parameters for the camera and begins the camera session.
        self.setupAVCapture()
    }

    //MARK: - Internal Functions

    //When the user clicks the button, this gets the image, sends it up to the delegate, and shuts down all the Camera related views.
    internal func didPressTakePhoto(sender: UIButton) {

        //Create a media connection...
        if let videoConnection = stillImageOutput!.connectionWithMediaType(AVMediaTypeVideo) {

            //Setup the orientation to be locked to portrait
            videoConnection.videoOrientation = AVCaptureVideoOrientation.Portrait

            //capture the still image from the camera
            stillImageOutput?.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: {(sampleBuffer, error) in
                if (sampleBuffer != nil) {

                    //Get the image data
                    let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
                    let dataProvider = CGDataProviderCreateWithCFData(imageData)
                    let cgImageRef = CGImageCreateWithJPEGDataProvider(dataProvider, nil, true, CGColorRenderingIntent.RenderingIntentDefault)
                    //The 2.0 scale halves the size of the image, whereas 1.0 gives you the full size.
                    let image = UIImage(CGImage: cgImageRef!, scale: 2.0, orientation: UIImageOrientation.Up)


                    // What size is this image.
                    let imageSize = image.size
                    let imageScale = image.scale
                    let yCoord = (imageSize.height - ((imageSize.width*2)/3))/2
                    let getRect = CGRectMake(0, yCoord, imageSize.width, ((imageSize.width*2)/3))
                    let rect = CGRectMake(getRect.origin.x*imageScale, getRect.origin.y*imageScale, getRect.size.width*imageScale, getRect.size.height*imageScale)
                    let imageRef = CGImageCreateWithImageInRect(image.CGImage, rect)
                    //let newImage = UIImage(CGImage: imageRef!)

                    //This app forces the user to use landscape to take pictures, so this simply rotates the image so that it looks correct when displayed.
                    let newImage: UIImage = UIImage(CGImage: imageRef!, scale: image.scale, orientation: UIImageOrientation.Down)

                    //Pass the image up to the delegate.
                    self.delegate?.cameraOverlayImage(newImage)

                    //stop the session
                    self.session.stopRunning()

                    //Remove the views.
                    self.previewView.removeFromSuperview()
                    self.boxView.removeFromSuperview()
                    self.myButton.removeFromSuperview()

                    //By this point the image has been handed off to the caller through the delegate and memory has been cleaned up.

                }
            })
        }
    }


    internal func setupCameraView(){

        //Add a view that is as big as the frame to act as a background.
        self.boxView = UIView(frame: self.view.frame)
        self.boxView.backgroundColor = UIColor(red: 255, green: 255, blue: 255, alpha: 1.0)
        self.view.addSubview(self.boxView)

        //Add Camera Preview View
        // This sets up the previewView to be a 3:2 aspect ratio
        let newHeight = UIScreen.mainScreen().bounds.size.width / 2 * 3
        self.previewView = UIView(frame: CGRectMake(0, 0, UIScreen.mainScreen().bounds.size.width, newHeight))
        self.previewView.backgroundColor = UIColor.cyanColor()


        self.previewView.contentMode = UIViewContentMode.ScaleToFill
        self.view.addSubview(previewView)


        //Add the button.
        myButton.frame = CGRectMake(0,0,200,40)
        myButton.backgroundColor = UIColor.redColor()
        myButton.layer.masksToBounds = true
        myButton.setTitle("press me", forState: UIControlState.Normal)
        myButton.setTitleColor(UIColor.whiteColor(), forState: UIControlState.Normal)
        myButton.layer.cornerRadius = 20.0
        myButton.layer.position = CGPoint(x: self.view.frame.width/2, y:(self.view.frame.height - myButton.frame.height ) )
        myButton.addTarget(self, action: "didPressTakePhoto:", forControlEvents: .TouchUpInside)
        self.view.addSubview(myButton)

    }


    internal func setupAVCapture(){

        session.sessionPreset = AVCaptureSessionPresetPhoto;

        let devices = AVCaptureDevice.devices();

        // Loop through all the capture devices on this phone
        for device in devices {

            // Make sure this particular device supports video
            if (device.hasMediaType(AVMediaTypeVideo)) {

                // Finally check the position and confirm we've got the back camera
                if(device.position == AVCaptureDevicePosition.Back) {
                    captureDevice = device as? AVCaptureDevice
                    if captureDevice != nil {

                        //-> Now that we have the back camera, start a session.
                        beginSession()
                        break;
                    }
                }
            }
        }
    }

    // Sets up the session
    internal func beginSession(){

        var err : NSError? = nil
        var deviceInput:AVCaptureDeviceInput?

        //See if we can get input from the Capture device as defined in setupAVCapture()
        do {
            deviceInput = try AVCaptureDeviceInput(device: captureDevice)
        } catch let error as NSError {
            err = error
            deviceInput = nil
        }
        if err != nil {
            print("error: \(err?.localizedDescription)")
        }

        //If we can add input into the AVCaptureSession() then do so.
        if self.session.canAddInput(deviceInput){
            self.session.addInput(deviceInput)
        }


        //Now show layers that were setup in the previewView, and mask it to the boundary of the previewView layer.
        let rootLayer :CALayer = self.previewView.layer
        rootLayer.masksToBounds=true


        //Put up a live video preview layer based on the current session.
        self.previewLayer = AVCaptureVideoPreviewLayer(session: self.session);

        // Determine how to fill the previewLayer.  In this case, I want to fill out the space of the previewLayer.
        self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
        self.previewLayer.frame = rootLayer.bounds


        //Add the previewLayer as a sublayer of the previewView's root layer
        rootLayer.addSublayer(self.previewLayer)

        session.startRunning()

    }


}

Here is how I use this class in a view controller.

//
//  ViewController.swift
//  CameraTesting
//
//  Created by Chris Cantley on 2/26/16.
//  Copyright © 2016 Chris Cantley. All rights reserved.
//

import UIKit
import AVFoundation


class ViewController: UIViewController, CameraOverlayDelegate{

    //Setting up the class reference.
    var cameraOverlay : CameraOverlay!

    //Connected to the UIViewController main view.
    @IBOutlet var getView: UIView!

    //Connected to an ImageView that will display the image when it is passed back to the delegate.
    @IBOutlet weak var imgShowImage: UIImageView!


    //Connected to the button that is pressed to bring up the camera view.
    @IBAction func btnPictureTouch(sender: AnyObject) {

        //Remove the image from the UIImageView and take another picture.
        self.imgShowImage.image = nil
        self.cameraOverlay.showCameraView()
    }


    override func viewDidLoad() {

        super.viewDidLoad()

        //Pass in the target UIView which in this case is the main view
        self.cameraOverlay = CameraOverlay(parentView: getView)

        //Make this class the delegate for the instantiated class.  
        //That way it knows to receive the image when the user takes a picture
        self.cameraOverlay.delegate = self


    }


    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()

        //Nothing here, but if you run out of memory you might want to do something here.

    }

    override func shouldAutorotate() -> Bool {
        if (UIDevice.currentDevice().orientation == UIDeviceOrientation.LandscapeLeft ||
            UIDevice.currentDevice().orientation == UIDeviceOrientation.LandscapeRight ||
            UIDevice.currentDevice().orientation == UIDeviceOrientation.Unknown) {
                return false;
        }
        else {
            return true;
        }
    }

    //This implements the delegate method from CameraOverlayDelegate
    func cameraOverlayImage(image: UIImage) {

        //Put the image passed up from the CameraOverlay class into the UIImageView
        self.imgShowImage.image = image
    }



}

Here is a link to the project where I put this together: GitHub - Boiler plate get image from camera