
AVCaptureSession: Photo and Video Capture

2016-05-24 · 代码守望者

First, the demo: a WeChat-style short-video recorder.
https://github.com/chengssir/MMRecoringDemo

Classes involved (a sketch of how they fit together follows this list):

AVCaptureSession (the capture session that coordinates recording and photo capture)
AVCaptureMovieFileOutput (data output)
AVCaptureDeviceInput (data input)
AVCaptureDevice (the hardware device)
AVCaptureVideoPreviewLayer (the preview layer)
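Roughly how these pieces fit together (a minimal sketch only; the setupSession method name, the captureSession property, and the view the preview layer is attached to are assumptions for illustration):

- (void)setupSession
{
    self.captureSession = [[AVCaptureSession alloc] init];

    // input: the default (back) camera
    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
    if ([self.captureSession canAddInput:input]) {
        [self.captureSession addInput:input];
    }

    // output: record to a movie file
    AVCaptureMovieFileOutput *output = [[AVCaptureMovieFileOutput alloc] init];
    if ([self.captureSession canAddOutput:output]) {
        [self.captureSession addOutput:output];
    }

    // preview layer showing what the camera sees
    AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    previewLayer.frame = self.view.bounds;
    [self.view.layer addSublayer:previewLayer];

    [self.captureSession startRunning];
}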

1. Creating the AVCaptureSession

AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
// add the inputs and outputs here (see sections 2 and 3)
[captureSession startRunning];

1.1 Setting the capture quality

if ([captureSession canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
    captureSession.sessionPreset = AVCaptureSessionPreset1280x720;
}
AVCaptureSessionPresetHigh      highest quality
AVCaptureSessionPresetPhoto     photo mode; cannot output video
AVCaptureSessionPresetMedium    medium quality
AVCaptureSessionPresetLow       lowest quality
AVCaptureSessionPreset320x240   320x240 (not supported on iPhone)
AVCaptureSessionPreset352x288   (supported on iPhone)
AVCaptureSessionPreset640x480   (supported on iPhone), and so on...
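If the preferred preset is not available on the current device, falling back to a lower one keeps the session usable; a small sketch:

// pick the best preset the device supports
if ([captureSession canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
    captureSession.sessionPreset = AVCaptureSessionPreset1280x720;
} else if ([captureSession canSetSessionPreset:AVCaptureSessionPreset640x480]) {
    captureSession.sessionPreset = AVCaptureSessionPreset640x480;
} else {
    captureSession.sessionPreset = AVCaptureSessionPresetHigh;
}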

1.2 Reconfiguring the AVCaptureSession

[_captureSession beginConfiguration];
// reconfigure here (e.g. switch cameras, change the flash mode, etc.)
[_captureSession commitConfiguration];
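Changing the preset on a running session is one example of a reconfiguration that belongs between the two calls (a sketch):

[_captureSession beginConfiguration];
// e.g. drop to a lower preset; the change is applied atomically at commit
if ([_captureSession canSetSessionPreset:AVCaptureSessionPreset640x480]) {
    _captureSession.sessionPreset = AVCaptureSessionPreset640x480;
}
[_captureSession commitConfiguration];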

2. Adding the Input

2.1 Getting the camera device

// defaults to the back camera
AVCaptureDevice *backCamera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
// all cameras (front + back)
//NSArray *cameras = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
// all capture devices
//NSArray *cameras = [AVCaptureDevice devices];

2.2 Adding the input

AVCaptureDeviceInput  *_videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:backCamera error:nil];
if ([_captureSession canAddInput:_videoDeviceInput]) {
    [_captureSession addInput:_videoDeviceInput];
}
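To capture sound as well, add a microphone input the same way (a sketch, with error handling omitted):

// microphone input so the movie file output also records audio
AVCaptureDevice *microphone = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:microphone error:nil];
if ([_captureSession canAddInput:audioInput]) {
    [_captureSession addInput:audioInput];
}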

2.3.1 Switching cameras

[_captureSession beginConfiguration];
[_captureSession removeInput:_videoDeviceInput];

self.isUsingFrontCamera = !_isUsingFrontCamera;
AVCaptureDevice *device = [self getCameraDevice:_isUsingFrontCamera];

[device lockForConfiguration:nil];
if ([device isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure]) {
    [device setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
}
[device unlockForConfiguration];
self.videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
[_captureSession addInput:_videoDeviceInput];

[_captureSession commitConfiguration];

2.3.2 Looking up the camera device for a position

- (AVCaptureDevice *)getCameraDevice:(BOOL)isFront
  {
    NSArray *cameras = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];

    AVCaptureDevice *frontCamera;
    AVCaptureDevice *backCamera;

    for (AVCaptureDevice *camera in cameras) {
        if (camera.position == AVCaptureDevicePositionBack) {
            backCamera = camera;
        } else {
            frontCamera = camera;
        }
    }

    if (isFront) {
        return frontCamera;
    }
    return backCamera;
  }
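On iOS 10 and later, devicesWithMediaType: is deprecated in favor of AVCaptureDeviceDiscoverySession; a sketch of the same front/back lookup with it (the cameraForPosition: method name is an assumption):

- (AVCaptureDevice *)cameraForPosition:(AVCaptureDevicePosition)position
{
    // iOS 10+ replacement for devicesWithMediaType:
    AVCaptureDeviceDiscoverySession *discovery =
        [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera]
                                                               mediaType:AVMediaTypeVideo
                                                                position:position];
    return discovery.devices.firstObject;
}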

3. Adding the Output

AVCaptureMovieFileOutput  *_movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
if ([_captureSession canAddOutput:_movieFileOutput])
{
    [_captureSession addOutput:_movieFileOutput];
}

3.1 Start recording to the specified file

Remember to adopt the <AVCaptureFileOutputRecordingDelegate> protocol.
[_movieFileOutput startRecordingToOutputFileURL:fileURL recordingDelegate:self];
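The fileURL is not created for you; a common choice is a unique file in the temporary directory (a sketch, the naming scheme is an assumption):

// a unique .mov file in the temporary directory to record into
NSString *fileName = [NSString stringWithFormat:@"%@.mov", [[NSUUID UUID] UUIDString]];
NSString *filePath = [NSTemporaryDirectory() stringByAppendingPathComponent:fileName];
NSURL *fileURL = [NSURL fileURLWithPath:filePath];

[_movieFileOutput startRecordingToOutputFileURL:fileURL recordingDelegate:self];

// later, e.g. when the user lifts their finger:
[_movieFileOutput stopRecording];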

3.2 Setting the capture format

self.output = [[AVCaptureVideoDataOutput alloc] init];
if ([_captureSession canAddOutput:_output]) {
    [_captureSession addOutput:_output];
}
self.output.videoSettings =
    [NSDictionary dictionaryWithObject:
     [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange]
                                forKey:(id)kCVPixelBufferPixelFormatTypeKey];

3.3 Delegate method called when recording finishes

 - (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
  {
  }
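One possible body for this callback, feeding straight into the merge step of section 5 (a sketch; the mergeFilePath property is an assumption):

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
    if (error) {
        NSLog(@"recording failed: %@", error);
        return;
    }
    // outputFileURL points at the finished .mov; hand it to the merge step
    // (self.mergeFilePath is an assumed property holding the target path)
    [self mergeAndExportVideosAtFileURLs:outputFileURL newUrl:self.mergeFilePath];
}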

4. Setting up the preview layer

self.preViewLayer = [AVCaptureVideoPreviewLayer layerWithSession:_captureSession];
_preViewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
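The layer still needs a frame and has to be added to a view's layer tree before anything is visible (a sketch, assuming this runs inside a view controller):

// size the preview to the hosting view and insert it below any controls
_preViewLayer.frame = self.view.bounds;
[self.view.layer insertSublayer:_preViewLayer atIndex:0];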

5. Compositing the video

fileURL: the local URL of the video recorded in the previous step
mergeFilePath: the path to write the processed video to

- (void)mergeAndExportVideosAtFileURLs:(NSURL *)fileURL newUrl:(NSString *)mergeFilePath
{
    NSError *error = nil;
    CMTime totalDuration = kCMTimeZero;
    // wrap the recorded file in an AVAsset
    AVAsset *asset = [AVAsset assetWithURL:fileURL];
    if (!asset) {
        return;
    }

    AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
    // pull out the source video track (audio is looked up inside audioTrackWith:)
    AVAssetTrack *assetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

    // AVMediaTypeAudio
    [self audioTrackWith:mixComposition assetTrack:assetTrack asset:asset totalDuration:totalDuration error:error];

    // AVMediaTypeVideo
    AVMutableCompositionTrack *videoTrack = [self videoTrackWith:mixComposition assetTrack:assetTrack asset:asset totalDuration:totalDuration error:error];

    CGFloat renderW = [self videoTrackRenderSizeWithassetTrack:assetTrack];
    totalDuration = CMTimeAdd(totalDuration, asset.duration);

    NSMutableArray *layerInstructionArray = [self assetArrayWith:videoTrack totalDuration:totalDuration assetTrack:assetTrack renderW:renderW];

    [self mergingVideoWithmergeFilePath:mergeFilePath layerInstructionArray:layerInstructionArray mixComposition:mixComposition totalDuration:totalDuration renderW:renderW];
}

Compressing and exporting the video; the delegate callback returns the path of the processed file.

- (void)mergingVideoWithmergeFilePath:(NSString *)mergeFilePath
                layerInstructionArray:(NSMutableArray *)layerInstructionArray
                       mixComposition:(AVMutableComposition *)mixComposition
                        totalDuration:(CMTime)totalDuration
                              renderW:(CGFloat)renderW
{
    // the save path
    NSURL *mergeFileURL = [NSURL fileURLWithPath:mergeFilePath];

    // build the video composition for export
    AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, totalDuration);
    mainInstruction.layerInstructions = layerInstructionArray;
    AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
    mainCompositionInst.instructions = @[mainInstruction];
    mainCompositionInst.frameDuration = CMTimeMake(1, 30);
    mainCompositionInst.renderSize = CGSizeMake(renderW, renderW / 4 * 3); // 4:3 output

    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetMediumQuality];
    exporter.videoComposition = mainCompositionInst;
    exporter.outputURL = mergeFileURL;
    exporter.outputFileType = AVFileTypeMPEG4; // container format of the generated video
    exporter.shouldOptimizeForNetworkUse = YES;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            if ([_delegate respondsToSelector:@selector(videoRecorder:didFinishMergingVideosToOutPutFileAtURL:)]) {
                [_delegate videoRecorder:self didFinishMergingVideosToOutPutFileAtURL:mergeFileURL];
            }
        });
    }];
}
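The completion handler above also fires when the export fails, so checking the status before notifying the delegate is worth the extra lines (a sketch using the same exporter variable):

[exporter exportAsynchronouslyWithCompletionHandler:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        if (exporter.status == AVAssetExportSessionStatusCompleted) {
            // success: hand mergeFileURL back to the delegate as above
        } else {
            // failed or cancelled: exporter.error describes what went wrong
            NSLog(@"export failed: %@", exporter.error);
        }
    });
}];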

Building the layer instructions

- (NSMutableArray *)assetArrayWith:(AVMutableCompositionTrack *)videoTrack
                     totalDuration:(CMTime)totalDuration
                        assetTrack:(AVAssetTrack *)assetTrack
                           renderW:(CGFloat)renderW
{
    NSMutableArray *layerInstructionArray = [[NSMutableArray alloc] init];

    AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];

    CGFloat rate = renderW / MIN(assetTrack.naturalSize.width, assetTrack.naturalSize.height);

    CGAffineTransform layerTransform = CGAffineTransformMake(assetTrack.preferredTransform.a, assetTrack.preferredTransform.b, assetTrack.preferredTransform.c, assetTrack.preferredTransform.d, assetTrack.preferredTransform.tx * rate, assetTrack.preferredTransform.ty * rate);

    // shift up so the middle of the frame is kept
    layerTransform = CGAffineTransformConcat(layerTransform, CGAffineTransformMake(1, 0, 0, 1, 0, -(assetTrack.naturalSize.width - assetTrack.naturalSize.height / 4 * 3) / 2.0));
    // scale so front- and back-camera footage end up the same size
    layerTransform = CGAffineTransformScale(layerTransform, rate, rate);

    [layerInstruction setTransform:layerTransform atTime:kCMTimeZero];
    [layerInstruction setOpacity:0.0 atTime:totalDuration];

    [layerInstructionArray addObject:layerInstruction];

    return layerInstructionArray;
}

The render size of the video track (for a 1920x1080 track this returns 1080)

- (CGFloat)videoTrackRenderSizeWithassetTrack:(AVAssetTrack *)assetTrack
{
    CGSize renderSize = CGSizeMake(0, 0);
    renderSize.width = MAX(renderSize.width, assetTrack.naturalSize.height);
    renderSize.height = MAX(renderSize.height, assetTrack.naturalSize.width);
    return MIN(renderSize.width, renderSize.height);
}

videoTrack: the video track

- (AVMutableCompositionTrack *)videoTrackWith:(AVMutableComposition *)mixComposition
                                   assetTrack:(AVAssetTrack *)assetTrack
                                        asset:(AVAsset *)asset
                                totalDuration:(CMTime)totalDuration
                                        error:(NSError *)error
{
    AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];

    [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                        ofTrack:assetTrack
                         atTime:totalDuration
                          error:&error];

    return videoTrack;
}

audioTrack: the audio track

- (void)audioTrackWith:(AVMutableComposition *)mixComposition
            assetTrack:(AVAssetTrack *)assetTrack
                 asset:(AVAsset *)asset
         totalDuration:(CMTime)totalDuration
                 error:(NSError *)error
{
    AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

    NSArray *array = [asset tracksWithMediaType:AVMediaTypeAudio];
    if (array.count > 0) {
        AVAssetTrack *audioAssetTrack = [array objectAtIndex:0];
        [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                            ofTrack:audioAssetTrack
                             atTime:totalDuration
                              error:nil];
    }
}