AVCaptureSession and Camera Threads Not Closing [iOS]
Problem
When I stop running my AVCaptureSession, the threads that were created while the AVCaptureSession was running do not close.
Symptoms
Normally, the dispatch_queue that grabs frames from the camera starts up immediately. But after about four opens/closes of the ViewController hosting the AVCaptureSession, the dispatch_queue takes roughly ten seconds to start.
Diagnosis
The threads associated with the AVCaptureSession do not appear to be cleaned up.
After closing the AVCaptureSession, I can see that these threads still exist:
com.apple.coremedia.capturesource.connections(serial) 1 Pending Block
com.apple.coremedia.capturesession.connections(serial) 1 Pending Block
<AVCMNotificationDispatcher: 0x16bce00> serial queue(serial) 4 Pending Blocks
com.apple.avfoundation.videocapturedevice.observed_properties_queue(serial)
com.apple.tcc.cache_queue(serial) 1 Pending Block
com.apple.tcc.preflight.kTCCServiceCamera(serial) 1 Pending Block
And after I open/close the ViewController with the AVCaptureSession again, the same threads still exist, but these three have accumulated more pending blocks:
<AVCMNotificationDispatcher: 0x17c441a0> serial queue (serial) 9 Pending Blocks
com.apple.avfoundation.videocapturedevice.observed_properties_queue(serial)
com.apple.tcc.preflight.kTCCServiceCamera(serial) 5 Pending Blocks
Code Setup
VideoSource.h and VideoSource.mm
In my ViewController, I initialize it like this:
self.videoSource = [[VideoSource alloc] init];
self.videoSource.delegate = self;
[self.videoSource setResolution:AVCaptureSessionPreset352x288]; // was 640
[self.videoSource startWithDevicePosition:AVCaptureDevicePositionFront];
I start and stop the captureSession as follows, and it starts and stops just fine. Actual frame grabbing works very well.
[self.videoSource.captureSession startRunning];
[self.videoSource.captureSession stopRunning];
Relevant parts of VideoSource below; let me know if you need to see more.
From VideoSource.mm:
- (void)dealloc {
    NSLog(@"Cleaning Up Video Source");
    [_captureSession stopRunning];
    AVCaptureInput *input = [_captureSession.inputs objectAtIndex:0];
    [_captureSession removeInput:input];
    input = nil;
    AVCaptureVideoDataOutput *output = (AVCaptureVideoDataOutput *)[_captureSession.outputs objectAtIndex:0];
    [_captureSession removeOutput:output];
    output = nil;
    _captureSession = nil;
    _deviceInput = nil;
    _delegate = nil;
    // [super dealloc]; // compiler handles this for you with ARC
}
- (void)addVideoDataOutput {
    // (1) Instantiate a new video data output object
    AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
    NSLog(@"Create Dispatch Queue");
    // (2) The sample buffer delegate requires a serial dispatch queue
    dispatch_queue_t queue = dispatch_queue_create("com.name.test", DISPATCH_QUEUE_SERIAL);
    [captureOutput setSampleBufferDelegate:self queue:queue];
    // dispatch_release(queue); // compiler handles this for you with ARC
    // (3) Define the pixel format for the video data output
    NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary *settings = @{key: value};
    NSLog(@"Set Video Settings");
    [captureOutput setVideoSettings:settings];
    NSLog(@"Always Discard Late Video Frames");
    [captureOutput setAlwaysDiscardsLateVideoFrames:YES];
    // (4) Configure the output port on the captureSession property
    [self.captureSession addOutput:captureOutput];
}
From VideoSource.h:
@interface VideoSource : NSObject
@property (nonatomic, strong) AVCaptureSession * captureSession;
@property (nonatomic, strong) AVCaptureDeviceInput * deviceInput;
@property (nonatomic, weak) id<VideoSourceDelegate> delegate;
- (BOOL)startWithDevicePosition:(AVCaptureDevicePosition)devicePosition;
- (void) setResolution:(NSString*)resolution;
@end
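The question does not include the body of startWithDevicePosition:. For context, a minimal sketch of what such a method typically looks like with the pre-iOS 10 device enumeration API; the details here are illustrative assumptions, not the original code:

```objectivec
// Hypothetical sketch — the original implementation was not posted.
- (BOOL)startWithDevicePosition:(AVCaptureDevicePosition)devicePosition {
    // Find the camera at the requested position (front/back)
    AVCaptureDevice *device = nil;
    for (AVCaptureDevice *d in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if (d.position == devicePosition) { device = d; break; }
    }
    if (!device) return NO;

    // Wrap the device in an input and attach it to the session
    NSError *error = nil;
    self.deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!self.deviceInput || ![self.captureSession canAddInput:self.deviceInput]) return NO;
    [self.captureSession addInput:self.deviceInput];

    [self addVideoDataOutput];
    [self.captureSession startRunning];
    return YES;
}
```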
Request
How do I ensure these threads close when I deallocate VideoSource?
Solved!
Solution: call startRunning and stopRunning on the same dispatch_queue that you pass to setSampleBufferDelegate for the captureOutput.
Here is my new setup:
#import "VideoSource.h"
@interface VideoSource () <AVCaptureVideoDataOutputSampleBufferDelegate>
// Session management.
@property (nonatomic) dispatch_queue_t sessionQueue;
@property (nonatomic) AVCaptureSession *captureSession;
@property (nonatomic) AVCaptureDeviceInput *deviceInput;
/*@property (nonatomic, strong) AVCaptureSession * captureSession;
@property (nonatomic, strong) AVCaptureDeviceInput * deviceInput; */
@end
@implementation VideoSource
- (instancetype)init {
    if (self = [super init]) {
        self.captureSession = [[AVCaptureSession alloc] init];
        self.sessionQueue = dispatch_queue_create("session queue", DISPATCH_QUEUE_SERIAL);
    }
    return self;
}
Then use that exact same sessionQueue as the queue for setSampleBufferDelegate:
[captureOutput setSampleBufferDelegate:self queue:self.sessionQueue];
Now the most important part: make sure startRunning/stopRunning are called from that very same queue:
dispatch_async(self.sessionQueue, ^{
    [self.captureSession startRunning];
});
Likewise, you can create a nice little function that stops and cleans up the captureSession:
- (void)closeCaptureSession {
    dispatch_async(self.sessionQueue, ^{
        if ([self.captureSession isRunning]) {
            [self.captureSession stopRunning];
        }
        // Remove all inputs
        for (AVCaptureInput *input in self.captureSession.inputs) {
            [self.captureSession removeInput:input];
        }
        // Remove all outputs
        for (AVCaptureVideoDataOutput *output in self.captureSession.outputs) {
            [output setSampleBufferDelegate:nil queue:NULL];
            [self.captureSession removeOutput:output];
        }
        // Set to nil to make ARC's job a little easier
        self.captureSession = nil;
        self.deviceInput = nil;
        self.delegate = nil;
        self.sessionQueue = nil;
    });
}
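One way the ViewController might drive this teardown (illustrative, not from the original post): call closeCaptureSession explicitly when the view goes away, rather than relying on dealloc, so the cleanup block gets a chance to run on sessionQueue before the VideoSource is released:

```objectivec
// Hypothetical ViewController usage — method and property names assumed.
- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];
    [self.videoSource closeCaptureSession]; // async teardown on sessionQueue
    self.videoSource = nil;
}
```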