Recently I was looking into implementing liveness (live-face) detection in Objective-C and found that the SDK already ships with the corresponding classes, so I investigated and implemented it.

During the implementation I hit a rather nasty pitfall: the GPUImage framework captures video in YUV format by default, and YUV frames cannot be fed to the Objective-C (Core Image) face-detection classes for real-time recognition.

Let's now dissect the implementation of GPUImageVideoCamera:

@interface GPUImageVideoCamera : GPUImageOutput <AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureAudioDataOutputSampleBufferDelegate>

- (id)initWithSessionPreset:(NSString *)sessionPreset cameraPosition:(AVCaptureDevicePosition)cameraPosition;

As you can see, it exposes a single initializer; its internal code is as follows:

- (id)initWithSessionPreset:(NSString *)sessionPreset cameraPosition:(AVCaptureDevicePosition)cameraPosition;
{
    if (!(self = [super init])) {
        return nil;
    }

    cameraProcessingQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0);
    audioProcessingQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_LOW, 0);
    frameRenderingSemaphore = dispatch_semaphore_create(1);

    _frameRate = 0; // This will not set frame rate unless this value gets set to 1 or above
    _runBenchmark = NO;
    capturePaused = NO;
    outputRotation = kGPUImageNoRotation;
    internalRotation = kGPUImageNoRotation;
    captureAsYUV = YES;
    _preferredConversion = kColorConversion709;

    // Grab the back-facing or front-facing camera
    _inputCamera = nil;
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) {
        if ([device position] == cameraPosition) {
            _inputCamera = device;
        }
    }

    if (!_inputCamera) {
        return nil;
    }

    // Create the capture session
    _captureSession = [[AVCaptureSession alloc] init];
    [_captureSession beginConfiguration];

    // Add the video input
    NSError *error = nil;
    videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:_inputCamera error:&error];
    if ([_captureSession canAddInput:videoInput]) {
        [_captureSession addInput:videoInput];
    }

    // Add the video frame output
    videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    [videoOutput setAlwaysDiscardsLateVideoFrames:NO];

//    if (captureAsYUV && [GPUImageContext deviceSupportsRedTextures])
    if (captureAsYUV && [GPUImageContext supportsFastTextureUpload]) {
        BOOL supportsFullYUVRange = NO;
        NSArray *supportedPixelFormats = videoOutput.availableVideoCVPixelFormatTypes;
        for (NSNumber *currentPixelFormat in supportedPixelFormats) {
            if ([currentPixelFormat intValue] == kCVPixelFormatType_420YpCbCr8BiPlanarFullRange) {
                supportsFullYUVRange = YES;
            }
        }

        if (supportsFullYUVRange) {
            [videoOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarFullRange] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
            isFullYUVRange = YES;
        } else {
            [videoOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
            isFullYUVRange = NO;
        }
    } else {
        [videoOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
    }

    runSynchronouslyOnVideoProcessingQueue(^{
        if (captureAsYUV) {
            [GPUImageContext useImageProcessingContext];
//            if ([GPUImageContext deviceSupportsRedTextures])
//            {
//                yuvConversionProgram = [[GPUImageContext sharedImageProcessingContext] programForVertexShaderString:kGPUImageVertexShaderString fragmentShaderString:kGPUImageYUVVideoRangeConversionForRGFragmentShaderString];
//            }
//            else
//            {
            if (isFullYUVRange) {
                yuvConversionProgram = [[GPUImageContext sharedImageProcessingContext] programForVertexShaderString:kGPUImageVertexShaderString fragmentShaderString:kGPUImageYUVFullRangeConversionForLAFragmentShaderString];
            } else {
                yuvConversionProgram = [[GPUImageContext sharedImageProcessingContext] programForVertexShaderString:kGPUImageVertexShaderString fragmentShaderString:kGPUImageYUVVideoRangeConversionForLAFragmentShaderString];
            }
//            }

            if (!yuvConversionProgram.initialized) {
                [yuvConversionProgram addAttribute:@"position"];
                [yuvConversionProgram addAttribute:@"inputTextureCoordinate"];

                if (![yuvConversionProgram link]) {
                    NSString *progLog = [yuvConversionProgram programLog];
                    NSLog(@"Program link log: %@", progLog);
                    NSString *fragLog = [yuvConversionProgram fragmentShaderLog];
                    NSLog(@"Fragment shader compile log: %@", fragLog);
                    NSString *vertLog = [yuvConversionProgram vertexShaderLog];
                    NSLog(@"Vertex shader compile log: %@", vertLog);
                    yuvConversionProgram = nil;
                    NSAssert(NO, @"Filter shader link failed");
                }
            }

            yuvConversionPositionAttribute = [yuvConversionProgram attributeIndex:@"position"];
            yuvConversionTextureCoordinateAttribute = [yuvConversionProgram attributeIndex:@"inputTextureCoordinate"];
            yuvConversionLuminanceTextureUniform = [yuvConversionProgram uniformIndex:@"luminanceTexture"];
            yuvConversionChrominanceTextureUniform = [yuvConversionProgram uniformIndex:@"chrominanceTexture"];
            yuvConversionMatrixUniform = [yuvConversionProgram uniformIndex:@"colorConversionMatrix"];

            [GPUImageContext setActiveShaderProgram:yuvConversionProgram];

            glEnableVertexAttribArray(yuvConversionPositionAttribute);
            glEnableVertexAttribArray(yuvConversionTextureCoordinateAttribute);
        }
    });

    [videoOutput setSampleBufferDelegate:self queue:cameraProcessingQueue];
    if ([_captureSession canAddOutput:videoOutput]) {
        [_captureSession addOutput:videoOutput];
    } else {
        NSLog(@"Couldn't add video output");
        return nil;
    }

    _captureSessionPreset = sessionPreset;
    [_captureSession setSessionPreset:_captureSessionPreset];

    // This will let you get 60 FPS video from the 720p preset on an iPhone 4S, but only that device and that preset
//    AVCaptureConnection *conn = [videoOutput connectionWithMediaType:AVMediaTypeVideo];
//
//    if (conn.supportsVideoMinFrameDuration)
//        conn.videoMinFrameDuration = CMTimeMake(1,60);
//    if (conn.supportsVideoMaxFrameDuration)
//        conn.videoMaxFrameDuration = CMTimeMake(1,60);

    [_captureSession commitConfiguration];

    return self;
}

Look closely at this initializer. The nasty part is right here: during initialization the following instance variable is hard-coded to YES, which makes every downstream feature capture video as YUV.

captureAsYUV = YES;
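
If you want to see for yourself what the camera is actually delivering, a small diagnostic like the one below (my own sketch, not part of GPUImage) can be dropped into any GPUImageVideoCameraDelegate to log the pixel format of each incoming frame:

- (void)willOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    // Diagnostic only: report whether frames arrive as bi-planar YUV (420f/420v) or as BGRA.
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    OSType format = CVPixelBufferGetPixelFormatType(pixelBuffer);
    if (format == kCVPixelFormatType_420YpCbCr8BiPlanarFullRange ||
        format == kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) {
        NSLog(@"Frame is bi-planar YUV; not usable for the CIDetector pipeline described here");
    } else if (format == kCVPixelFormatType_32BGRA) {
        NSLog(@"Frame is 32BGRA; fine for CIDetector");
    }
}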

Since there was no way around it, I modified the framework and added a new initializer:

// Original initializer

- (id)initWithSessionPreset:(NSString *)sessionPreset cameraPosition:(AVCaptureDevicePosition)cameraPosition;

// New initializer

- (id)initWithSessionPreset:(NSString *)sessionPreset cameraPosition:(AVCaptureDevicePosition)cameraPosition yuvColorSpace:(BOOL)yuvColorSpace;

- (id)initWithSessionPreset:(NSString *)sessionPreset cameraPosition:(AVCaptureDevicePosition)cameraPosition yuvColorSpace:(BOOL)yuvColorSpace
{
    if (!(self = [super init])) {
        return nil;
    }

    cameraProcessingQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0);
    audioProcessingQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_LOW, 0);
    frameRenderingSemaphore = dispatch_semaphore_create(1);

    _frameRate = 0; // This will not set frame rate unless this value gets set to 1 or above
    _runBenchmark = NO;
    capturePaused = NO;
    outputRotation = kGPUImageNoRotation;
    internalRotation = kGPUImageNoRotation;
    captureAsYUV = yuvColorSpace;   // the only real change: the caller now decides YUV vs BGRA
    _preferredConversion = kColorConversion709;

    // From here on the body is identical to -initWithSessionPreset:cameraPosition: above:
    // pick the camera for the requested position, build the AVCaptureSession, choose the
    // pixel format (bi-planar YUV or 32BGRA) based on captureAsYUV, compile the YUV
    // conversion shader when needed, attach the video data output, apply the session
    // preset and commit the configuration.

    return self;
}

All the code above does is expose, at initialization time, whether to capture in BGRA (RGB) or YUV:

captureAsYUV = yuvColorSpace;
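
For reference, creating the camera with the new initializer looks roughly like this (a sketch that assumes the modified GPUImageVideoCamera above has been compiled into your project):

// Passing NO makes the camera configure its AVCaptureVideoDataOutput with
// kCVPixelFormatType_32BGRA instead of a bi-planar YUV format, so frames can be
// handed straight to CIDetector.
GPUImageVideoCamera *videoCamera =
    [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPresetMedium
                                         cameraPosition:AVCaptureDevicePositionFront
                                          yuvColorSpace:NO];
videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;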

The helper class defined in the project:

#import <Foundation/Foundation.h>
#import <CoreImage/CoreImage.h>
#import <GPUImage.h>

@interface CIFaceFeatureMeta : NSObject

@property (nonatomic, strong) CIFeature *features;
@property (nonatomic, strong) UIImage *featureImage;

@end

@interface NSFaceFeature : NSObject

+ (CGRect)faceRect:(CIFeature *)feature;

- (NSArray<CIFaceFeatureMeta *> *)processFaceFeaturesWithPicBuffer:(CMSampleBufferRef)sampleBuffer
                                                    cameraPosition:(AVCaptureDevicePosition)currentCameraPosition;

@end
//
//  NSFaceFeature.m
//  VJFaceDetection
//
//  Created by Vincent·Ge on 2018/6/19.
//  Copyright © 2018年 Filelife. All rights reserved.
//

#import "NSFaceFeature.h"
#import "GPUImageBeautifyFilter.h"

// Options that can be used with -[CIDetector featuresInImage:options:].
// The value for CIDetectorImageOrientation is an integer NSNumber from 1..8, as found in
// kCGImagePropertyOrientation. If present, detection is done for that orientation, but the
// coordinates of the returned features are still expressed in the image's own coordinates.
typedef NS_ENUM(NSInteger, PHOTOS_EXIF_ENUM) {
    PHOTOS_EXIF_0ROW_TOP_0COL_LEFT     = 1, // 0th row at the top, 0th column on the left (the default)
    PHOTOS_EXIF_0ROW_TOP_0COL_RIGHT    = 2, // 0th row at the top, 0th column on the right
    PHOTOS_EXIF_0ROW_BOTTOM_0COL_RIGHT = 3, // 0th row at the bottom, 0th column on the right
    PHOTOS_EXIF_0ROW_BOTTOM_0COL_LEFT  = 4, // 0th row at the bottom, 0th column on the left
    PHOTOS_EXIF_0ROW_LEFT_0COL_TOP     = 5, // 0th row on the left, 0th column at the top
    PHOTOS_EXIF_0ROW_RIGHT_0COL_TOP    = 6, // 0th row on the right, 0th column at the top
    PHOTOS_EXIF_0ROW_RIGHT_0COL_BOTTOM = 7, // 0th row on the right, 0th column at the bottom
    PHOTOS_EXIF_0ROW_LEFT_0COL_BOTTOM  = 8  // 0th row on the left, 0th column at the bottom
};

@implementation CIFaceFeatureMeta
@end

@interface NSFaceFeature ()

@property (nonatomic, strong) CIDetector *faceDetector;

@end

@implementation NSFaceFeature

- (instancetype)init {
    self = [super init];
    [self loadFaceDetector];
    return self;
}

- (void)loadFaceDetector {
    NSDictionary *detectorOptions = @{CIDetectorAccuracy: CIDetectorAccuracyLow,
                                      CIDetectorTracking: @(YES)};
    self.faceDetector = [CIDetector detectorOfType:CIDetectorTypeFace context:nil options:detectorOptions];
}

- (NSArray<CIFaceFeatureMeta *> *)processFaceFeaturesWithPicBuffer:(CMSampleBufferRef)sampleBuffer
                                                    cameraPosition:(AVCaptureDevicePosition)currentCameraPosition {
    return [NSFaceFeature processFaceFeaturesWithPicBuffer:sampleBuffer
                                              faceDetector:self.faceDetector
                                            cameraPosition:currentCameraPosition];
}

#pragma mark - Category Function

+ (CGRect)faceRect:(CIFeature *)feature {
    // The frame image is rotated 90° relative to what the lens sees, so swap x and y.
    CGRect faceRect = feature.bounds;
    CGFloat temp = faceRect.origin.x;
    faceRect.origin.x = faceRect.origin.y;
    faceRect.origin.y = temp;
    return faceRect;
}

+ (NSArray<CIFaceFeatureMeta *> *)processFaceFeaturesWithPicBuffer:(CMSampleBufferRef)sampleBuffer
                                                      faceDetector:(CIDetector *)faceDetector
                                                    cameraPosition:(AVCaptureDevicePosition)currentCameraPosition {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CFDictionaryRef attachments = CMCopyDictionaryOfAttachments(kCFAllocatorDefault, sampleBuffer, kCMAttachmentMode_ShouldPropagate);
    // The image taken from the frame is rotated 90° to the left compared with what the lens
    // sees, so be careful when converting coordinates later on.
    CIImage *convertedImage = [[CIImage alloc] initWithCVPixelBuffer:pixelBuffer options:(__bridge NSDictionary *)attachments];
    if (attachments) {
        CFRelease(attachments);
    }

    UIDeviceOrientation curDeviceOrientation = [[UIDevice currentDevice] orientation];
    int exifOrientation;
    BOOL isUsingFrontFacingCamera = currentCameraPosition != AVCaptureDevicePositionBack;
    switch (curDeviceOrientation) {
        case UIDeviceOrientationPortraitUpsideDown:
            exifOrientation = PHOTOS_EXIF_0ROW_LEFT_0COL_BOTTOM;
            break;
        case UIDeviceOrientationLandscapeLeft:
            exifOrientation = isUsingFrontFacingCamera ? PHOTOS_EXIF_0ROW_BOTTOM_0COL_RIGHT : PHOTOS_EXIF_0ROW_TOP_0COL_LEFT;
            break;
        case UIDeviceOrientationLandscapeRight:
            exifOrientation = isUsingFrontFacingCamera ? PHOTOS_EXIF_0ROW_TOP_0COL_LEFT : PHOTOS_EXIF_0ROW_BOTTOM_0COL_RIGHT;
            break;
        default:
            // Value 6: the origin is at the top right; the raw image has to be rotated 90°
            // clockwise to display upright (rows run along y, columns along x).
            exifOrientation = PHOTOS_EXIF_0ROW_RIGHT_0COL_TOP;
            break;
    }

    // exifOrientation tells the detector which way the image is oriented.
    NSDictionary *imageOptions = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:exifOrientation]
                                                              forKey:CIDetectorImageOrientation];
    NSArray<CIFeature *> *lt = [faceDetector featuresInImage:convertedImage options:imageOptions];

    NSMutableArray *at = [NSMutableArray arrayWithCapacity:0];
    [lt enumerateObjectsUsingBlock:^(CIFeature * _Nonnull obj, NSUInteger idx, BOOL * _Nonnull stop) {
        CIFaceFeatureMeta *b = [[CIFaceFeatureMeta alloc] init];
        [b setFeatures:obj];
        UIImage *portraitImage = [[UIImage alloc] initWithCIImage:convertedImage scale:1.0 orientation:UIImageOrientationRight];
        [b setFeatureImage:portraitImage];
        [at addObject:b];
    }];
    return at;
}

@end
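
As a rough usage sketch (assuming the host controller keeps an NSFaceFeature instance in a faceFeature property, as the controller below does), the class can be driven straight from the GPUImage delegate callback:

// Sketch only: feed a camera frame to NSFaceFeature from the GPUImageVideoCameraDelegate
// callback. This assumes the camera was created with yuvColorSpace:NO so the buffer is BGRA.
- (void)willOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    NSArray<CIFaceFeatureMeta *> *metas =
        [self.faceFeature processFaceFeaturesWithPicBuffer:sampleBuffer
                                            cameraPosition:AVCaptureDevicePositionFront];
    for (CIFaceFeatureMeta *meta in metas) {
        NSLog(@"face bounds: %@", NSStringFromCGRect([NSFaceFeature faceRect:meta.features]));
    }
}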

The view controller:

@interface AttendanceBioViewController : UIViewController

- (void)showWithResultsDelegateBlock:(void (^)(NSMutableDictionary *ej))delegateBlock
                       useOtherBlock:(void (^)(void))useOtherBlock;
@end
//
//  AttendanceBioViewController.m
//  SGBProject
//
//  Created by carbonzhao on 2022/4/14.
//  Copyright © 2022 All rights reserved.
//

#import "AttendanceBioViewController.h"
#import "GPUImageBeautifyFilter.h"
#import "NSFaceFeature.h"

@interface AttendanceBioViewController () <GPUImageVideoCameraDelegate>

// The timer and the two callback blocks are used below but were not declared in the
// original listing; they are added here so the class compiles.
@property (nonatomic, strong) NSAutoTimer *atimer;
@property (nonatomic, copy) void (^delegateBlock)(NSMutableDictionary *ej);
@property (nonatomic, copy) void (^useOtherBlock)(void);

@property (nonatomic, strong) GPUImageStillCamera *videoCamera;
@property (nonatomic, strong) GPUImageView *filterView;
@property (nonatomic, strong) GPUImageBeautifyFilter *beautifyFilter;
@property (nonatomic, strong) NSFaceFeature *faceFeature;
@property (nonatomic, strong) UIImageView *maskView;
@property (nonatomic, strong) UIImage *capturedImage;
@property (nonatomic, strong) UISpinnerAnimationView *animationView;
@property (nonatomic, strong) UILabel *tipsLabel;
@property (nonatomic, strong) UILabel *resultLabel;
@property (nonatomic, assign) BOOL hasSmile;
@property (nonatomic, assign) BOOL leftEyeClosed;
@property (nonatomic, assign) BOOL rightEyeClosed;

@end

@implementation AttendanceBioViewController

- (instancetype)init {
    if (self = [super init]) {
    }
    return self;
}

- (void)viewDidLoad {
    [super viewDidLoad];
    // Real-time face detection does not work on YUV frames; GPUImageVideoCamera.m was
    // modified (see above) so captureAsYUV can be turned off for this screen.
    [self setupCameraUI];
}

#pragma mark - otherMethod

- (void)setupCameraUI {
    self.faceFeature = [NSFaceFeature new];

    // Real-time face detection does not work in YUV, so pass NO for yuvColorSpace.
    self.videoCamera = [[GPUImageStillCamera alloc] initWithSessionPreset:AVCaptureSessionPresetMedium
                                                           cameraPosition:AVCaptureDevicePositionFront
                                                            yuvColorSpace:NO];
    self.videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
    self.videoCamera.horizontallyMirrorFrontFacingCamera = YES;

    WeakSelf(self);
    [NSAutoTimer scheduledTimerWithTimeInterval:2 target:self scheduledBlock:^(NSInteger timerIndex) {
        weakSelf.videoCamera.delegate = self;
    }];

    UILabel *lb = [[UILabel alloc] initWithFrame:CGRectMake(0, CG_NAV_BAR_HEIGHT, self.view.size.width, CG_NAV_BAR_HEIGHT)];
    [lb setBackgroundColor:[UIColor clearColor]];
    [lb setText:@"拿起手机,眨眨眼"]; // "Hold up the phone and blink"
    [lb setFont:[UIFont boldSystemFontOfSize:20]];
    [lb setTextAlignment:NSTextAlignmentCenter];
    [lb setTextColor:[UIColor blackColor]];
    [self setTipsLabel:lb];
    [self.view addSubview:lb];

    CGFloat sz = MIN(self.view.bounds.size.width, self.view.bounds.size.height) / 2;
    CGRect ft = CGRectMake((self.view.bounds.size.width - sz) / 2, lb.bottom + CG_NAV_BAR_HEIGHT, sz, sz);
    UISpinnerAnimationView *c1View = [[UISpinnerAnimationView alloc] initWithFrame:ft];
    [c1View.layer setMasksToBounds:YES];
    [c1View.layer setCornerRadius:sz / 2];
    [c1View setAnimationType:UISpinnerAnimationViewAnimationSpinner];

    CGPoint arcCenter = CGPointMake(c1View.frame.size.width / 2, c1View.frame.size.height / 2);
    UIBezierPath *path = [UIBezierPath bezierPath];
    [path addArcWithCenter:arcCenter radius:sz / 2 startAngle:0.55 * M_PI endAngle:0.45 * M_PI clockwise:YES];
    CAShapeLayer *shapeLayer = [CAShapeLayer layer];
    shapeLayer.path = path.CGPath;
    shapeLayer.fillColor = [UIColor clearColor].CGColor;       // fill color
    shapeLayer.strokeColor = [UIColor lightGrayColor].CGColor; // stroke color
    shapeLayer.lineCap = kCALineCapRound;
    shapeLayer.lineWidth = 4;
    [c1View.layer addSublayer:shapeLayer];
    [self setAnimationView:c1View];
    [self.view addSubview:c1View];

    CGRect fm = CGRectMake(4, 4, c1View.bounds.size.width - 8, c1View.bounds.size.height - 8);
    self.filterView = [[GPUImageView alloc] initWithFrame:fm];
    self.filterView.fillMode = kGPUImageFillModePreserveAspectRatioAndFill;
//    self.filterView.center = c1View.center;
    [self.filterView.layer setMasksToBounds:NO];
    [self.filterView.layer setCornerRadius:fm.size.height / 2];
    [self.filterView setClipsToBounds:YES];
    self.filterView.layer.shouldRasterize = YES;
    self.filterView.layer.rasterizationScale = [UIScreen mainScreen].scale;
    [c1View addSubview:self.filterView];

    self.beautifyFilter = [[GPUImageBeautifyFilter alloc] init];
    [self.videoCamera addTarget:self.beautifyFilter];
    [self.beautifyFilter addTarget:self.filterView];

    UIView *mview = [[UIView alloc] initWithFrame:ft];
    [mview.layer setMasksToBounds:YES];
    [mview.layer setCornerRadius:ft.size.height / 2];
    [mview setClipsToBounds:YES];
    mview.layer.shouldRasterize = YES;
    mview.layer.rasterizationScale = [UIScreen mainScreen].scale;
    [self.view addSubview:mview];

    CGRect myRect = CGRectMake(0, 0, ft.size.width, ft.size.height);
    // Background path
    path = [UIBezierPath bezierPathWithRoundedRect:mview.bounds cornerRadius:0];
    // Punched-out circle
    UIBezierPath *circlePath = [UIBezierPath bezierPathWithOvalInRect:myRect];
    [path appendPath:circlePath];
    [path setUsesEvenOddFillRule:YES];
    CAShapeLayer *fillLayer = [CAShapeLayer layer];
    fillLayer.path = path.CGPath;
    fillLayer.fillRule = kCAFillRuleEvenOdd;
    fillLayer.fillColor = [UIColor clearColor].CGColor;
    fillLayer.opacity = 0;
    [mview.layer addSublayer:fillLayer];

    UIImageView *icView = [[UIImageView alloc] initWithFrame:CGRectMake(0, -mview.bounds.size.height, mview.bounds.size.width, mview.bounds.size.height)];
    [icView setImage:[UIImage imageNamed:@"scannet"]];
    [icView setAlpha:0.7];
    [mview addSubview:icView];
    [self setMaskView:icView];

    lb = [[UILabel alloc] initWithFrame:CGRectMake(0, 0, ft.size.width, 40)];
    [lb setBackgroundColor:RGBA(40, 40, 40, 0.7)];
    [lb setText:@"没有检测到人脸"]; // "No face detected"
    [lb setFont:[UIFont systemFontOfSize:12]];
    [lb setTextAlignment:NSTextAlignmentCenter];
    [lb setTextColor:[UIColor whiteColor]];
    lb.hidden = YES;
    self.resultLabel = lb;
    [mview addSubview:lb];

    UIButton *cbtn = [UIButton buttonWithType:UIButtonTypeCustom];
    [cbtn setFrame:CGRectMake(20, CG_NAV_BAR_HEIGHT, 20, 20)];
    [cbtn setBackgroundImage:[UIImage imageNamed:@"meeting_reserve_refuse_blue"] forState:UIControlStateNormal];
    [cbtn addTargetActionBlock:^(UIButton * _Nonnull aButton) {
        [weakSelf closeView:nil];
    }];
    [self.view addSubview:cbtn];

    [self.videoCamera startCameraCapture];
    [self loopDrawLine];
    self.atimer = [NSAutoTimer scheduledTimerWithTimeInterval:2 target:self scheduledBlock:^(NSInteger timerIndex) {
        [weakSelf loopDrawLine];
    } userInfo:nil repeats:YES];
}

- (void)loopDrawLine {
    UIImageView *readLineView = self.maskView;
    [readLineView setFrame:CGRectMake(0, -readLineView.bounds.size.height, readLineView.bounds.size.width, readLineView.bounds.size.height)];
    [UIView animateWithDuration:2 animations:^{
        // Move the scan line down across the circle
        CGRect ft = readLineView.frame;
        ft.origin.y += ft.size.height;
        ft.origin.y += ft.size.height;
        readLineView.frame = ft;
    } completion:^(BOOL finished) {
    }];
}

- (void)closeView:(void (^)(void))hideFinishedBlock {
    [self.atimer invalidate];
    [self.videoCamera stopCameraCapture];
    [self dismissViewControllerAnimated:YES completion:^{
        if (hideFinishedBlock) {
            dispatch_async(dispatch_get_main_queue(), ^{
                hideFinishedBlock();
            });
        }
    }];
}

- (void)showWithResultsDelegateBlock:(void (^)(NSMutableDictionary *ej))block useOtherBlock:(void (^)(void))useOtherBlock {
    [self setDelegateBlock:block];
    [self setUseOtherBlock:useOtherBlock];
}

// Upload the captured face image to the server
- (void)uploadFaceImgWithCompleteBlock:(void (^)(void))completeBlock {
    WeakSelf(self);
    UIImage *image = self.capturedImage;
    // Call your own server API here and invoke completeBlock on success.
}

#pragma mark - Face Detection

- (UIImage *)sampleBufferToImage:(CMSampleBufferRef)sampleBuffer {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:imageBuffer];
    CIContext *temporaryContext = [CIContext contextWithOptions:nil];
    CGImageRef videoImage = [temporaryContext createCGImage:ciImage
                                                   fromRect:CGRectMake(0, 0,
                                                                       CVPixelBufferGetWidth(imageBuffer),
                                                                       CVPixelBufferGetHeight(imageBuffer))];
    UIImage *result = [[UIImage alloc] initWithCGImage:videoImage scale:1.0 orientation:UIImageOrientationUp];
    CGImageRelease(videoImage);
    return result;
}

- (NSMutableArray *)faceFeatureResults:(CIImage *)ciImage {
    NSNumber *minSize = [NSNumber numberWithFloat:.45];
    NSMutableDictionary *options = [NSMutableDictionary dictionaryWithCapacity:0];
    [options setObject:CIDetectorAccuracyHigh forKey:CIDetectorAccuracy];
    [options setObject:minSize forKey:CIDetectorMinFeatureSize];
    [options setObject:[NSNumber numberWithBool:YES] forKey:CIDetectorSmile];
    [options setObject:[NSNumber numberWithBool:YES] forKey:CIDetectorEyeBlink];
    [options setObject:[NSNumber numberWithBool:YES] forKey:CIDetectorTracking];
    CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace context:nil options:options];
    NSMutableArray<CIFaceFeature *> *faceFeatures = (NSMutableArray<CIFaceFeature *> *)[detector featuresInImage:ciImage options:options];
    return faceFeatures;
}

- (void)willOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    UIImage *resultImage = [self sampleBufferToImage:sampleBuffer];
    CIImage *ciImage = [[CIImage alloc] initWithImage:resultImage];
    WeakSelf(self);

    if (/*self.hasSmile && */(self.leftEyeClosed || self.rightEyeClosed)) {
        // A blink was detected on the previous frame; capture this frame as the face photo.
        NSArray *faceFeatures = [self faceFeatureResults:ciImage];
        if (faceFeatures.count > 0) {
            [self.videoCamera stopCameraCapture];
            self.capturedImage = resultImage;
            self.videoCamera.delegate = nil;
            [self.atimer invalidate];
            dispatch_async(dispatch_get_main_queue(), ^{
                [weakSelf.resultLabel setHidden:YES];
                [weakSelf.animationView startAnimation];
                [weakSelf uploadFaceImgWithCompleteBlock:^{
                    [weakSelf.animationView stopAnimation];
                }];
            });
        } else {
            self.hasSmile = NO;
            self.leftEyeClosed = NO;
            self.rightEyeClosed = NO;
        }
    } else {
        NSMutableArray<CIFaceFeature *> *faceFeatures = [self faceFeatureResults:ciImage];
        if (faceFeatures && faceFeatures.count > 0) {
            dispatch_async(dispatch_get_main_queue(), ^{
                [weakSelf.resultLabel setHidden:YES];
            });
            __block BOOL hasSmile = NO;
            __block BOOL leftEyeClosed = NO;
            __block BOOL rightEyeClosed = NO;
            [faceFeatures enumerateObjectsUsingBlock:^(CIFaceFeature *ft, NSUInteger idx, BOOL * _Nonnull stop) {
                hasSmile |= ft.hasSmile;
                leftEyeClosed |= ft.leftEyeClosed;
                rightEyeClosed |= ft.rightEyeClosed;
            }];
            self.hasSmile = hasSmile;
            self.leftEyeClosed = leftEyeClosed;
            self.rightEyeClosed = rightEyeClosed;
            if (/*hasSmile &&*/ (leftEyeClosed || rightEyeClosed)) {
                // A blink was seen; the next frame will be captured as the face photo.
            } else {
                dispatch_async(dispatch_get_main_queue(), ^{
                    [weakSelf.tipsLabel setText:@"请来一个阳光的微笑或眨眨眼"]; // "Give a sunny smile or blink"
                });
            }
        } else {
            dispatch_async(dispatch_get_main_queue(), ^{
                [self.resultLabel setHidden:NO];
                [self.resultLabel setText:@"没有检测到脸"]; // "No face detected"
            });
        }
    }
}

@end
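
Finally, a sketch of how the controller might be presented from a check-in screen. The semantics of the two blocks (result dictionary from the upload, fall back to another verification method) are my reading of the code above:

// Sketch only: presenting the liveness check from another view controller.
AttendanceBioViewController *bioVC = [[AttendanceBioViewController alloc] init];
[bioVC showWithResultsDelegateBlock:^(NSMutableDictionary *ej) {
    // ej carries whatever your upload endpoint returns; handle success here.
    NSLog(@"liveness check finished: %@", ej);
} useOtherBlock:^{
    // The user chose another verification method (e.g. password or PIN).
}];
bioVC.modalPresentationStyle = UIModalPresentationFullScreen;
[self presentViewController:bioVC animated:YES completion:nil];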

The spinner animation class (UISpinnerAnimationView) used above is covered in the following post:

OC 环形等待UI spiner动画_zhaocarbon的博客-CSDN博客
