
Recording Video with AVFoundation, and Some Pitfalls Along the Way

2018-05-09  kikido

I recently ran into a requirement in a project:

Play a pre-recorded audio file, have the customer answer the questions posed in the audio, record video on the phone throughout the whole process, and upload the result to the server when finished.

This breaks down into a few sub-requirements. To cover them, I use AVFoundation's AVCaptureSession + AVCaptureMovieFileOutput to record the video. Below I'll walk through part of the code from my project and point out the trouble I ran into, so anyone with a similar need can step into fewer pits...

Pitfalls

This post is fairly long, and I'd hate for impatient readers to walk away early, so here is the demo for this post right up front: demo link. And the pitfalls, in summary:

When configuring AVCaptureMovieFileOutput, set its movieFragmentInterval property to kCMTimeInvalid, otherwise recordings longer than 10 seconds will have no sound.

When using CMMotionManager, keep it as a global variable or a private property. When calling its - (void)startAccelerometerUpdatesToQueue:(NSOperationQueue *)queue withHandler:(CMAccelerometerHandler)handler method, reference self inside the block through a __weak pointer; otherwise you create a retain cycle and the controller never gets dealloc'd.

    // Capture self weakly outside the block to avoid the retain cycle.
    __weak typeof(self) weakSelf = self;
    [self.motionManager startAccelerometerUpdatesToQueue:[NSOperationQueue currentQueue] withHandler:^(CMAccelerometerData * _Nullable accelerometerData, NSError * _Nullable error) {
        __strong typeof(weakSelf) strongSelf = weakSelf;
        
        if (!error) {
            [strongSelf outputData:accelerometerData.acceleration];
        } else {
            NSLog(@"error = %@", error);
        }
    }];

You can pause and resume an NSTimer with [_recordTimer setFireDate:[NSDate distantFuture]]; (pause) and [_recordTimer setFireDate:[NSDate distantPast]]; (resume), as sketched below.
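
A minimal sketch of that timer trick, assuming a scheduled NSTimer held in a recordTimer property (the helper method names here are mine, not from the demo):

    //Pause: a fire date in the far future means the timer never fires
    - (void)pauseRecordTimer {
        [self.recordTimer setFireDate:[NSDate distantFuture]];
    }
    
    //Resume: a fire date in the past makes the timer fire on the next
    //run-loop pass and then continue at its normal interval
    - (void)resumeRecordTimer {
        [self.recordTimer setFireDate:[NSDate distantPast]];
    }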

The Classes Involved

First, import the header: #import <AVFoundation/AVFoundation.h>

Then declare the properties the camera setup needs:

//The AVCaptureSession object carries data between the input and output devices
@property (nonatomic, strong) AVCaptureSession *captureSession;
//The input device source
@property (nonatomic, strong) AVCaptureDeviceInput *captureDeviceInput;
//The movie file output
@property (nonatomic, strong) AVCaptureMovieFileOutput *captureMovieFileOutput;
//The camera preview layer
@property (nonatomic, strong) AVCaptureVideoPreviewLayer *captureVideoPreviewLayer;

AVCaptureSession: controls the flow of data between the input and output devices
AVCaptureDeviceInput: wraps an input device, e.g. the camera or the microphone
AVCaptureMovieFileOutput: writes the captured video to a file
AVCaptureVideoPreviewLayer: a layer showing the live preview from the lens

A single AVCaptureSession object can manage multiple input and output devices, as shown below:

(Figure: the relationship between input and output devices)

Next we initialize all of these objects; this can be done in viewDidLoad, roughly as sketched below.
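
As a rough sketch (setupSession is a name I'm using here; its body is assembled from the snippets in the sections that follow):

    - (void)viewDidLoad {
        [super viewDidLoad];
        //Build the session and attach the camera/microphone inputs
        //and the movie file output (see the following sections)
        [self setupSession];
        //Insert the live preview behind all other views
        [self.view.layer insertSublayer:self.captureVideoPreviewLayer atIndex:0];
    }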

CMMotionManager

#import <CoreMotion/CoreMotion.h>, then keep the manager as a private variable or property; a temporary, locally created instance will not work, because it is deallocated before it can deliver updates. Through the accelerometer this class manages we can tell which way the device is oriented and set the AVCaptureConnection's orientation to match, so the video comes out upright whether the user shoots in landscape or portrait.

//Initialize the motion manager
    self.motionManager = [[CMMotionManager alloc] init];
    //Sample the accelerometer every 2 seconds
    self.motionManager.accelerometerUpdateInterval = 2.;
    //Avoid a retain cycle
    __weak typeof(self) weakSelf = self;
    [self.motionManager startAccelerometerUpdatesToQueue:[NSOperationQueue currentQueue] withHandler:^(CMAccelerometerData * _Nullable accelerometerData, NSError * _Nullable error) {
        __strong typeof(weakSelf) strongSelf = weakSelf;
        
        if (!error) {
            [strongSelf outputData:accelerometerData.acceleration];
        } else {
            NSLog(@"error = %@", error);
        }
    }];


/**
 Update the device orientation

 @param data the accelerometer reading
 */
-(void)outputData:(CMAcceleration)data
{
    UIInterfaceOrientation orientation;
    if(data.x >= 0.75){
        orientation = UIInterfaceOrientationLandscapeLeft;
    }
    else if (data.x<= -0.75){
        orientation = UIInterfaceOrientationLandscapeRight;
    }
    else if (data.y <= -0.75){
        orientation = UIInterfaceOrientationPortrait;
    }
    else if (data.y >= 0.75){
        orientation = UIInterfaceOrientationPortraitUpsideDown;
    }
    else{
        return;
    }
    self.videoOrientation = orientation;
}

AVCaptureSession

- (AVCaptureSession *)captureSession
{
    // A 5-second clip at the high preset is ~10 MB (~0.5 MB after medium-quality compression)
    // A 5-second clip at the medium preset is ~0.5 MB (~0.5 MB after medium-quality compression)
    // A 5-second clip at the low preset is ~0.1 MB (~0.1 MB after medium-quality compression)
    if (_captureSession == nil) {
        //Set the capture resolution via the session preset
        _captureSession = [[AVCaptureSession alloc] init];
        if ([_captureSession canSetSessionPreset:AVCaptureSessionPresetHigh]) {
            _captureSession.sessionPreset = AVCaptureSessionPresetHigh;
        }
    }
    return _captureSession;
}

AVCaptureDevice (camera and microphone)

//Get the input devices
    //Get the back-facing camera
    AVCaptureDevice *captureDevice = [self getCameraDeviceWithPosition:AVCaptureDevicePositionBack];
    if (!captureDevice) {
        NSLog(@"Problem getting the back-facing camera");
        return;
    }

    //Get the audio input device
    AVCaptureDevice *audioCaptureDevice = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject];

AVCaptureDeviceInput

Initialize the AVCaptureDeviceInput objects from the AVCaptureDevice inputs.

    NSError *error = nil;
    //Create the device input object from the camera, used to obtain input data
    _captureDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:captureDevice error:&error];
    if (error) {
        NSLog(@"Error creating the device input object: %@", error.localizedDescription);
        return;
    }
    
    //Get the audio input device
    AVCaptureDevice *audioCaptureDevice = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject];
    //Create the audio input source
    NSError *tError;
    AVCaptureDeviceInput *audioCaptureDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:audioCaptureDevice error:&tError];
    if (tError) {
        NSLog(@"Error creating the audio input object: %@", tError.localizedDescription);
        return;
    }

AVCaptureMovieFileOutput

Next, initialize the output object: to take still photos you would initialize an AVCaptureStillImageOutput; to record video, an AVCaptureMovieFileOutput.

One thing needs special attention:

You must set the movieFragmentInterval property to kCMTimeInvalid, otherwise recordings longer than ten seconds will have no sound. (The default fragment interval is 10 seconds, which is exactly where the audio drops out.)


![](https://img.haomeiwen.com/i1929756/833746eb5d9c73b1.jpeg?imageMogr2/auto-orient/strip%7CimageView2/2/w/1240)

//Initialize the device output object, used to obtain the output data
    _captureMovieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
    //movieFragmentInterval must be set, otherwise the video loses sound after ten seconds
    _captureMovieFileOutput.movieFragmentInterval = kCMTimeInvalid;

Adding the inputs and the output to the session

//Add the video input to the session
    if ([self.captureSession canAddInput:_captureDeviceInput]) {
        [_captureSession addInput:_captureDeviceInput];
    }
//Add the audio input to the session
    if ([self.captureSession canAddInput:audioCaptureDeviceInput]) {
        [_captureSession addInput:audioCaptureDeviceInput];
    }
    AVCaptureConnection *captureConnection = [_captureMovieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    //Video stabilization was introduced with iOS 6 and the iPhone 4S. The iPhone 6 added a stronger,
    //smoother mode known as cinematic video stabilization, and the API changed with it (at the time
    //this was visible only in the headers, not the docs). Stabilization is configured on the
    //AVCaptureConnection, not on the capture device, and since not every device format supports
    //every mode, check for support before enabling it.
    if ([captureConnection isVideoStabilizationSupported]) {
        captureConnection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeAuto;
    }
    //Add the device output to the session
    if ([_captureSession canAddOutput:_captureMovieFileOutput]) {
        [_captureSession addOutput:_captureMovieFileOutput];
    }

AVCaptureVideoPreviewLayer

- (AVCaptureVideoPreviewLayer *)captureVideoPreviewLayer
{
    if (_captureVideoPreviewLayer == nil) {
        _captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.captureSession];
        //Fill mode
        _captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
        _captureVideoPreviewLayer.frame = self.view.bounds;
    }
    return _captureVideoPreviewLayer;
}
    //Create the preview layer to show the camera feed in real time,
    //and insert it into the view hierarchy
    [self.view.layer insertSublayer:self.captureVideoPreviewLayer atIndex:0];

Starting the Recording

Start and stop the session in viewWillAppear and viewDidDisappear, which captures the camera feed into the AVCaptureVideoPreviewLayer:

- (void)viewWillAppear:(BOOL)animated
{
    [super viewWillAppear:animated];
    [self.captureSession startRunning];
}


- (void)viewDidDisappear:(BOOL)animated
{
    [super viewDidDisappear:animated];
    [self.captureSession stopRunning];
}

Writing the recording to the output file

//Start / stop recording
- (void)startOrEndAction:(UIButton *)sender
{
    sender.selected = !sender.selected;
    
    if (sender.selected) {
        //Start recording
        if (!self.captureMovieFileOutput.isRecording) {
            //Start the timer
            [self startTimeLabel];
            [sender setTitle:@"End Recording" forState:UIControlStateNormal];
            
            //Keep the video's output orientation in sync with the device orientation
            AVCaptureConnection *connection = [self.captureMovieFileOutput connectionWithMediaType:AVMediaTypeVideo];
            connection.videoOrientation = self.videoOrientation;
            
            [self.captureMovieFileOutput startRecordingToOutputFileURL:self.fileUrl recordingDelegate:self];
        }
    } else {
        //Stop recording
        [sender setTitle:@"Start Recording" forState:UIControlStateNormal];
        [self.captureMovieFileOutput stopRecording];
        //Stop the timer
        [self.recordTimer invalidate];
        self.recordTimer = nil;
    }
}
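
One assumption in the snippet above: self.fileUrl must be a writable file URL. The post never shows how it is built, so here is a hypothetical construction (my own sketch; the demo may do this differently) that writes into the temporary directory:

    //Hypothetical: build a writable output URL in the temp directory
    NSString *filePath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"movie.mov"];
    self.fileUrl = [NSURL fileURLWithPath:filePath];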

The Delegate

Both the start and the end of a recording have callbacks; the AVCaptureFileOutputRecordingDelegate protocol gives us everything we need.

#pragma mark - AVCaptureFileOutputRecordingDelegate

-(void)captureOutput:(AVCaptureFileOutput *)captureOutput didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray *)connections
{
    NSLog(@"Recording started...");
}

-(void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
    NSLog(@"Recording finished");
    
    [self stopTimeLabel];
    
    NSFileManager *fileManager = [[NSFileManager alloc] init];
    float filesize = -1.0;
    if ([fileManager fileExistsAtPath:self.fileUrl.path]) {
        NSDictionary *fileDic = [fileManager attributesOfItemAtPath:self.fileUrl.path error:nil]; //read the file's attributes
        unsigned long long size = fileDic.fileSize;
        filesize = 1.0 * size;
    }

    NSLog(@"video size %lf MB", filesize / 1024 / 1024);
}

At this point, the video recording feature is complete~

Private Methods

/**
 *  Get the camera at the specified position
 *
 *  @param position the camera position
 *
 *  @return the camera device
 */
-(AVCaptureDevice *)getCameraDeviceWithPosition:(AVCaptureDevicePosition )position{
    NSArray *cameras= [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *camera in cameras) {
        if ([camera position]==position) {
            return camera;
        }
    }
    return nil;
}
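
A side note: devicesWithMediaType: (used above and when fetching the microphone) has been deprecated since iOS 10. On newer SDKs the same lookup can be done with AVCaptureDeviceDiscoverySession; a sketch (not part of the original demo, and the method name here is mine):

    //iOS 10+ alternative: discover the built-in wide-angle camera at a given position
    -(AVCaptureDevice *)cameraDeviceWithPosition:(AVCaptureDevicePosition)position{
        AVCaptureDeviceDiscoverySession *discoverySession =
            [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera]
                                                                   mediaType:AVMediaTypeVideo
                                                                    position:position];
        return discoverySession.devices.firstObject;
    }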

/**
 Switch between the front and back cameras

 @param sender the button
 */
- (void)switchCameraClicked:(UIButton *)sender
{
    AVCaptureDevice *currentDevice = [self.captureDeviceInput device];
    AVCaptureDevicePosition currentPosition = [currentDevice position];
    [self removeNotificationFromCaptureDevice:currentDevice];
    
    AVCaptureDevice *toChangeDevice;
    AVCaptureDevicePosition toChangePosition = AVCaptureDevicePositionFront;
    if (currentPosition == AVCaptureDevicePositionUnspecified || currentPosition == AVCaptureDevicePositionFront) {
        toChangePosition = AVCaptureDevicePositionBack;
    }
    toChangeDevice = [self getCameraDeviceWithPosition:toChangePosition];
    if (!toChangeDevice) {
        NSLog(@"Failed to switch cameras");
        return;
    }
    [self addNotificationToCaptureDevice:toChangeDevice];
    //Create the device input object for the new camera
    AVCaptureDeviceInput *toChangeDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:toChangeDevice error:nil];
    
    //Always call beginConfiguration before changing the session configuration,
    //and commit the change when done
    [self.captureSession beginConfiguration];
    //Remove the existing input
    [self.captureSession removeInput:self.captureDeviceInput];
    //Add the new input
    if ([self.captureSession canAddInput:toChangeDeviceInput]) {
        [self.captureSession addInput:toChangeDeviceInput];
        self.captureDeviceInput = toChangeDeviceInput;
    } else {
        [self.captureSession addInput:self.captureDeviceInput];
    }
    
    //Commit the session configuration
    [self.captureSession commitConfiguration];
}

//The block type used below; declared here since its definition does not appear elsewhere in the post:
typedef void(^PropertyChangeBlock)(AVCaptureDevice *captureDevice);

/**
 *  Common helper for changing device properties
 *
 *  @param propertyChange the property-changing operation
 */
-(void)changeDeviceProperty:(PropertyChangeBlock)propertyChange{
    AVCaptureDevice *captureDevice = [self.captureDeviceInput device];
    NSError *error;
    //Always call lockForConfiguration: before changing a device property,
    //and unlockForConfiguration when done
    if ([captureDevice lockForConfiguration:&error]) {
        propertyChange(captureDevice);
        [captureDevice unlockForConfiguration];
    } else {
        NSLog(@"Error while setting device properties: %@", error.localizedDescription);
    }
}

/**
 *  Set the focus mode
 *
 *  @param focusMode the focus mode
 */
-(void)setFocusMode:(AVCaptureFocusMode)focusMode{
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        if ([captureDevice isFocusModeSupported:focusMode]) {
            [captureDevice setFocusMode:focusMode];
        }
    }];
}


/**
 *  Set the focus and exposure point
 *
 *  @param point the point of interest
 */
-(void)focusWithMode:(AVCaptureFocusMode)focusMode exposureMode:(AVCaptureExposureMode)exposureMode atPoint:(CGPoint)point{
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        //Apply the modes passed in as parameters, since those are what
        //the support checks were made against
        if ([captureDevice isFocusModeSupported:focusMode]) {
            [captureDevice setFocusMode:focusMode];
        }
        if ([captureDevice isFocusPointOfInterestSupported]) {
            [captureDevice setFocusPointOfInterest:point];
        }
        if ([captureDevice isExposureModeSupported:exposureMode]) {
            [captureDevice setExposureMode:exposureMode];
        }
        if ([captureDevice isExposurePointOfInterestSupported]) {
            [captureDevice setExposurePointOfInterest:point];
        }
    }];
}

/**
 *  Add a tap gesture; tapping the screen focuses at that point
 */
-(void)addGestureRecognizer{
    UITapGestureRecognizer *tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapScreen:)];
    [self.viewContainer addGestureRecognizer:tapGesture];
}
-(void)tapScreen:(UITapGestureRecognizer *)tapGesture{
    CGPoint point = [tapGesture locationInView:self.viewContainer];
    //Convert the UI coordinate into a camera coordinate
    CGPoint cameraPoint = [self.captureVideoPreviewLayer captureDevicePointOfInterestForPoint:point];
    [self setFocusCursorWithPoint:point];
    [self focusWithMode:AVCaptureFocusModeAutoFocus exposureMode:AVCaptureExposureModeAutoExpose atPoint:cameraPoint];
}

/**
 *  Show the focus cursor at the given position
 *
 *  @param point the cursor position
 */
-(void)setFocusCursorWithPoint:(CGPoint)point{
    self.focusImage.center = point;
    self.focusImage.transform = CGAffineTransformMakeScale(1.5, 1.5);
    self.focusImage.alpha = 1.0;
    [UIView animateWithDuration:.2 animations:^{
        self.focusImage.transform = CGAffineTransformIdentity;
    } completion:^(BOOL finished) {
        [UIView animateWithDuration:1.0 animations:^{
            self.focusImage.alpha = 0;
        }];
    }];
}

Wrapping Up

That covers the main points; spelling out every detail would take a lot more code and get messy. So if you're interested in recording video, you can download the demo from my GitHub: demo link.

It's been a while since I've written anything, so please forgive any rough edges and point out mistakes~~~ If this post helped you, please give it a like!
