GPUImage and GPUImageView: App terminated due to memory error

I am using GPUImage with many instances of GPUImageView. The goal is to display the original image, layer several filtered slices of it on top, and finally animate those filtered slices slowly across the original. Picture an image with a few sepia bars scrolling across it, revealing the normal and sepia versions of the image.

I wrapped this functionality in a subclass of UIView, as follows:

import Foundation
import QuartzCore
import UIKit
// GPUImage classes are assumed to be available via the project's bridging header or module import

class FilteredImageMaskView : UIView {

init(frame: CGRect, image: UIImage){
    super.init(frame: frame);

    let imageViewFrame = CGRectMake(frame.origin.x, 0.0, frame.size.width, frame.size.height);

    let origImage = GPUImagePicture(image: image);
    origImage.forceProcessingAtSizeRespectingAspectRatio(imageViewFrame.size);

    // Display the original image without a filter
    let imageView = GPUImageView(frame: imageViewFrame);
    origImage.addTarget(imageView);
    origImage.processImageWithCompletionHandler(){
        origImage.removeAllTargets();

        var contentMode = UIViewContentMode.ScaleAspectFit;
        imageView.contentMode = contentMode;

        // Width of the unfiltered region
        let regularWidth: CGFloat = 30.0;
        // Width of filtered region
        let filterWidth: CGFloat = 30.0;

        // How much we are moving each bar
        let totalXMovement = (regularWidth + filterWidth) * 2;

        // The start X position
        var currentXForFilter: CGFloat = -totalXMovement;

        // The filter being applied to an image
        let filter = GPUImageSepiaFilter();
        filter.intensity = 0.5;
        // Add the filter to the originalImage
        origImage.addTarget(filter);

        let filteredViewCollection = FilteredViewCollection(filteredViews: [GPUImageView]());

        // Iterate over the X positions until the whole image is covered
        while(currentXForFilter < imageView.frame.width + totalXMovement){
            let frame = CGRectMake(currentXForFilter, imageViewFrame.origin.y, imageViewFrame.width, imageViewFrame.height);
            var filteredView = GPUImageView(frame: frame);
            filteredView.clipsToBounds = true;
            filteredView.layer.contentsGravity = kCAGravityTopLeft;

            // This is the slice of the overall image that we are going to display as filtered
            filteredView.layer.contentsRect = CGRectMake(currentXForFilter / imageViewFrame.width, 0.0, filterWidth / imageViewFrame.width, 1.0);
            filteredView.fillMode = kGPUImageFillModePreserveAspectRatio;

            filter.addTarget(filteredView);

            // Add the filteredView to the super view
            self.addSubview(filteredView);

            // Add the filteredView to the collection so we can animate it later
            filteredViewCollection.filteredViews.append(filteredView);

            // Increment the X position           
            currentXForFilter += regularWidth + filterWidth;
        }

        origImage.processImageWithCompletionHandler(){
            filter.removeAllTargets();

            // Move to the UI thread
            ThreadUtility.runOnMainThread(){
                // Add the unfiltered image
                self.addSubview(imageView);
                // And move it behind the filtered slices
                self.sendSubviewToBack(imageView);

                // Animate the slices slowly across the image
                UIView.animateWithDuration(20.0, delay: 0.0, options: UIViewAnimationOptions.Repeat, animations: { [weak filteredViewCollection] in
                    if let strongfilteredViewCollection = filteredViewCollection {
                        if(strongfilteredViewCollection.filteredViews != nil){
                            for(var i = 0; i < strongfilteredViewCollection.filteredViews.count; i++){
                                strongfilteredViewCollection.filteredViews[i].frame.origin.x += totalXMovement;
                                strongfilteredViewCollection.filteredViews[i].layer.contentsRect.origin.x += (totalXMovement / imageView.frame.width);
                            }
                        }
                    }
                }, completion: nil);
            }
        }
    }
}

required init(coder aDecoder: NSCoder) {
    super.init(coder: aDecoder);
}

}

class FilteredViewCollection {
    var filteredViews: [GPUImageView]! = [GPUImageView]();

    init(filteredViews: [GPUImageView]!){
        self.filteredViews = filteredViews;
    }
}

An instance of FilteredImageMaskView is added programmatically to a view in the viewController. When the viewController is dismissed, the resources should be released - I was careful to avoid retain cycles. When I watch memory consumption in the debugger on a real device, memory does drop appropriately when the viewController is dismissed. However, if I repeatedly load that viewController to view an image, dismiss it, and load it again, I eventually hit "App terminated due to memory error".

If I wait a while after dismissing the viewController, the memory error seems to happen less often, which leads me to believe that memory is still being freed after the viewController is dismissed...? But I have also seen the error after opening and dismissing the viewController several times not so rapidly.
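One way to confirm whether the viewController is really being released (just a debugging sketch, not something in my code above) is to log its deallocation:

// Hypothetical debugging aid: add to the viewController that shows the image
deinit {
    NSLog("view controller deallocated");
}

If that message never appears after dismissal, something is still holding a strong reference.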

I must be using GPUImage and/or GPUImageView inefficiently, and I'm looking for guidance.

Thanks!

EDIT: See the view controller implementation below.

import UIKit

class ViewImageViewController: UIViewController, FetchImageDelegate {

    var imageManager = ImageManager();

    @IBOutlet var mainView: UIView!

    override func viewDidLoad() {
        super.viewDidLoad()

        imageManager.fetchImageAsync(delegate: self);
    }

    // This callback is dispatched on the UI thread
    func imageFetchCompleted(imageData: [UInt8]) {
        // Wrap the raw bytes in an NSData so UIImage can consume them
        let data = NSData(bytes: imageData, length: imageData.count);
        if let image = UIImage(data: data) {
            let imageView = FilteredImageMaskView(frame: self.mainView.frame, image: image);
            mainView.addSubview(imageView);
        }

        var timer = NSTimer.scheduledTimerWithTimeInterval(NSTimeInterval(10.0), target: self, selector: Selector("displayReminder"), userInfo: nil, repeats: false);
    }

    func displayReminder(){
        // Show an alert or message here
    }

}

class ImageManager {

    func fetchImageAsync(delegate: FetchImageDelegate) {
        // This dispatches a high priority background thread
        ThreadUtility.runOnHighPriorityBackgroundThread() { [weak delegate] in
            // Get the image (This part could take a while in the real implementation)
            var imageData = [UInt8]();

            // Move to the UI thread
            ThreadUtility.runOnMainThread({
                if let strongDelegate = delegate {
                    strongDelegate.imageFetchCompleted(imageData);
                }
            });
        }
    }
}

Now that I'm looking at this trimmed-down version: does passing self into ImageManager create a retain cycle, even though I only reference it weakly going into the background thread? Could I pass it as a weak reference directly from ViewImageViewController? The ViewImageViewController could certainly be dismissed before the fetchImageAsync method completes and invokes the callback.
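One detail worth noting for the [weak delegate] capture to compile at all (an assumption, since the protocol definition isn't shown above): FetchImageDelegate would have to be declared as a class-only protocol, roughly like this:

// Hypothetical declaration - the real FetchImageDelegate is not shown above
protocol FetchImageDelegate: class {
    func imageFetchCompleted(imageData: [UInt8]);
}

Without the class constraint, weak references to the protocol type are not allowed.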

EDIT: I think I found the problem. If you look at ViewImageViewController, in the callback I create an NSTimer and pass it self. I suspect that a retain cycle is created if the viewController is dismissed before the timer fires. That would explain why I don't get the memory error if I wait a few extra seconds - because the timer fires and the viewController is then deallocated properly. Here is the fix (I think):

// This is on the ViewImageViewController
var timer: NSTimer!;

// Then instead of creating a new variable, assign the timer to the class variable
self.timer = NSTimer.scheduledTimerWithTimeInterval(NSTimeInterval(10.0), target: self, selector: Selector("displayReminder"), userInfo: nil, repeats: false);

// And finally, on dismiss of the viewcontroller (viewWillDisappear or back button click event, or both)
func cancelTimer() {
    if(self.timer != nil){
        self.timer.invalidate();
        self.timer = nil;
    }
}
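For completeness, here is a sketch of calling cancelTimer from viewWillDisappear, as mentioned in the comment above (back-button handling omitted):

override func viewWillDisappear(animated: Bool) {
    super.viewWillDisappear(animated);
    // Invalidate the pending timer so it no longer keeps this view controller alive
    cancelTimer();
}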

FilteredImageMaskView is strongly referenced inside the processImageWithCompletionHandler block, which most likely creates a retain cycle. Try using weak self in the block.
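A minimal sketch of that suggestion, adapted from the init above (an illustration, not a verified fix):

origImage.processImageWithCompletionHandler(){ [weak self] in
    origImage.removeAllTargets();

    // Bail out if the FilteredImageMaskView has already been deallocated
    if let strongSelf = self {
        // ... build the filter and the GPUImageView slices exactly as before,
        // but call strongSelf.addSubview(...) instead of self.addSubview(...) ...
    }
}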
