Using iOS CIDetector face detection to trigger a function
I'm trying to build a face detector with CIDetector that enables a button whenever a face is detected. The part I've searched for but can't find is how to make the code trigger a function when a face is detected, and disable it again when the face leaves the camera frame.

Here is the code I have so far:

The .h file:
#import <UIKit/UIKit.h>
@interface ViewController : UIViewController
@property (weak, nonatomic) IBOutlet UIButton *actionButton;
//Update 2:
@property (weak, nonatomic) IBOutlet UIView *containerView;
- (IBAction)actionButton:(id)sender;
@end
The .m file:
#import "ViewController.h"
@import AVFoundation;

@interface ViewController () <AVCaptureMetadataOutputObjectsDelegate> {
    AVCaptureVideoPreviewLayer *_previewLayer;
    AVCaptureSession *_session;
    CIDetector *_faceDetector;
    CIContext *_ciContext;
}
@end
@implementation ViewController
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.

    // Create a new AVCaptureSession
    _session = [[AVCaptureSession alloc] init];
    [_session setSessionPreset:AVCaptureSessionPreset640x480];

    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (input) {
        // Add the input to the session
        [_session addInput:input];
    } else {
        NSLog(@"error: %@", error);
        return;
    }

    AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
    // Have to add the output before setting metadata types
    [_session addOutput:output];
    // Restrict the output metadata to faces
    [output setMetadataObjectTypes:@[AVMetadataObjectTypeFace]];
    // This VC is the delegate. Please call us on the main queue
    [output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];

    // Display on screen
    _previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:_session];
    _previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    _previewLayer.bounds = self.view.bounds;
    _previewLayer.position = CGPointMake(CGRectGetMidX(self.view.bounds), CGRectGetMidY(self.view.bounds));
    // Update 2 change
    [self.containerView.layer addSublayer:_previewLayer];

    // Hide the button
    self.actionButton.hidden = YES;

    // Start the AVSession running
    [_session startRunning];
}
// Update 1:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputMetadataObjects:(NSArray *)metadataObjects
       fromConnection:(AVCaptureConnection *)connection
{
    for (AVMetadataObject *metadataObject in metadataObjects) {
        if ([metadataObject.type isEqualToString:AVMetadataObjectTypeFace]) {
            // Note: the .h declares actionButton, not retakeButton
            self.actionButton.hidden = NO;
        }
    }
}
- (IBAction)actionButton:(id)sender {
}
@end
In your storyboard, add a new view to the main view and create an outlet for it:

@property (weak, nonatomic) IBOutlet UIView *containerView;

The button you added should be a sibling of the newly created subview, and it should sit in front of that subview.

Then, in your code, change:

[self.view.layer addSublayer:_previewLayer];

to:

[self.containerView.layer addSublayer:_previewLayer];

Hope this helps.
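If you prefer to skip the storyboard, the same container/button arrangement can be sketched in code. This is an illustrative sketch, not part of the original answer; `setupContainerViewProgrammatically` is a hypothetical helper, and it assumes the `containerView` and `actionButton` properties from the .h above:

```objc
// Hypothetical programmatic equivalent of the storyboard setup:
// a container view to host the preview layer, with the button
// kept in front of it so the video does not cover it.
- (void)setupContainerViewProgrammatically {
    UIView *container = [[UIView alloc] initWithFrame:self.view.bounds];
    container.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;
    [self.view addSubview:container];   // sibling of the button
    self.containerView = container;     // superview retains it, so the weak outlet is safe

    // The button must sit above the container in the view hierarchy.
    [self.view bringSubviewToFront:self.actionButton];
}
```

Call it from viewDidLoad before adding `_previewLayer` as a sublayer of `self.containerView.layer`.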
Update:

If you have a gesture recognizer but no UI for it yet, you can use this quick and simple fix:
NSTimer *timer = [NSTimer timerWithTimeInterval:0.2f target:self selector:@selector(hideButton) userInfo:nil repeats:YES];
[[NSRunLoop mainRunLoop] addTimer:timer forMode:NSRunLoopCommonModes];
where:
- (void)hideButton {
    if (counterSeconds == 2) {
        if (counterCaptureOutput == 0) {
            NSLog(@"hide button");
            [self.actionButton setHidden:YES];
        }
        counterCaptureOutput = 0;
        counterSeconds = 0;
    }
    counterSeconds++;
}
and:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputMetadataObjects:(NSArray *)metadataObjects
       fromConnection:(AVCaptureConnection *)connection
{
    for (AVMetadataObject *metadataObject in metadataObjects) {
        if ([metadataObject.type isEqualToString:AVMetadataObjectTypeFace]) {
            self.actionButton.hidden = NO;
            counterCaptureOutput++;
            NSLog(@"ENTER FUNCTION");
        }
    }
}
Also declare in the .m:
int counterCaptureOutput;
int counterSeconds;
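The two counters above can be replaced by a single "last seen" timestamp: the delegate records when a face was last reported, and the repeating timer hides the button once that timestamp goes stale. A minimal sketch under the same setup (`_lastFaceSeen` is a hypothetical ivar; the 0.5 s threshold is an arbitrary choice):

```objc
// Ivar in the .m, alongside the others:
NSDate *_lastFaceSeen;

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputMetadataObjects:(NSArray *)metadataObjects
       fromConnection:(AVCaptureConnection *)connection
{
    for (AVMetadataObject *metadataObject in metadataObjects) {
        if ([metadataObject.type isEqualToString:AVMetadataObjectTypeFace]) {
            // A face is in frame right now: show the button and stamp the time.
            _lastFaceSeen = [NSDate date];
            self.actionButton.hidden = NO;
        }
    }
}

// Fired by the same repeating 0.2 s NSTimer as above.
- (void)hideButton {
    // No face callback for more than ~0.5 s means the face left the frame.
    if (_lastFaceSeen == nil ||
        [[NSDate date] timeIntervalSinceDate:_lastFaceSeen] > 0.5) {
        self.actionButton.hidden = YES;
    }
}
```

This avoids the counter bookkeeping and makes the timeout explicit in one place.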