Short Video from Scratch (Part 2): Beauty Filters, Segmented Recording, and Video Merging

2018-07-13  卢叁


As many know, GPUImage, built on OpenGL ES, is a well-known framework. With GPUImage we can produce a beauty (skin-smoothing) effect very easily, and it ships with more than 120 built-in filters. Today I will only cover how to use GPUImage; there is plenty of material online for studying its internals, and I plan to write a dedicated GPUImage article later.

1. The beauty-filter flow is: remove the entire existing processing chain -> create the beauty filter -> rebuild the GPUImage chain. The code looks like this:

- (void)beautyClick {
    beautyBtn.selected = !beautyBtn.selected;
    if (beautyBtn.selected) {

        // Remove the entire existing processing chain
        [videoCamera removeAllTargets];

        // Create the beauty filter (a bilateral filter smooths skin)
        filter = [[GPUImageBilateralFilter alloc] init];

        // Rebuild the GPUImage chain: camera data -> filter -> on-screen view
        [videoCamera addTarget:filter];
        [filter addTarget:filterView];

    } else {

        // Remove the chain and show the raw camera output
        [videoCamera removeAllTargets];
        [videoCamera addTarget:filterView];
    }
}

2. The segmented-recording flow: start a timer when recording begins; once the time limit is reached, stop the timer and finish recording. The code looks like this:

- (void)starWrite {

    deleteBtn.hidden = NO;
    beautyBtn.hidden = YES;
    [self initProgressView];

    isRecording = YES;
    seconds = 0.00;

    [self initMovieWriter];

    dispatch_async(dispatch_get_main_queue(), ^{
        [movieWriter startRecording];
        [self gcdTimer];
    });
}

For accurate timing I use a GCD timer:


- (void)gcdTimer {
    // Use a GCD timer for precise timing
    NSTimeInterval period = floatTime;
    dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

    _GCDtimer = dispatch_source_create(DISPATCH_SOURCE_TYPE_TIMER, 0, 0, queue);

    dispatch_source_set_timer(_GCDtimer, DISPATCH_TIME_NOW, period * NSEC_PER_SEC, 0 * NSEC_PER_SEC);

    dispatch_source_set_event_handler(_GCDtimer, ^{

        // Hop back to the main thread for UI work
        dispatch_async(dispatch_get_main_queue(), ^{

            seconds += floatTime;
            totalTime += floatTime;

            // Stretch the current progress segment to match the elapsed time
            UIView *view = [self.viewArray lastObject];
            view.frame = CGRectMake(viewX, viewY, seconds * (ScreenWidth / timeSeconds), viewHight);

            if (totalTime >= timeSeconds) {

                NSLog(@"Finished at %f, limit %ld", totalTime, (long)timeSeconds);

                [self stopWrite];
            }
        });
    });

    dispatch_resume(_GCDtimer);
}

For example, after 10 seconds of recording, stop the timer and end the capture:

- (void)stopWrite {

    // Because the GCD timer is precise, remove the timer first,
    // and only then stop recording
    [self removeTimer];
    dispatch_async(dispatch_get_main_queue(), ^{
        // The order here matters
        [movieWriter finishRecording];
        [filter removeTarget:movieWriter];
        videoCamera.audioEncodingTarget = nil;
    });

    beautyBtn.hidden = NO;
    [self.secondsArray addObject:[NSString stringWithFormat:@"%f", seconds]];
    [self saveModelToVideoArray];

    recordBtn.selected = NO;

    isRecording = NO;

    [self savePhotosAlbum:videoURL];
    NSLog(@"Segment length: %f, total time: %f", seconds, totalTime);
}

Note: a session may contain several recorded segments. When deleting an unwanted segment, you must also remove its duration from the bookkeeping, so the finished video still matches the total length we configured. Also, saving the video to the photo album must be dispatched after a short delay, otherwise the save may fail.
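The segment-deletion bookkeeping described above could be sketched like this. This is a minimal sketch, not code from the project: `deleteClick` and `removeLastVideoModel` are assumed names, and the ivars (`secondsArray`, `viewArray`, `totalTime`) are borrowed from the methods shown elsewhere in this article.

```objectivec
// Hypothetical delete-button handler: drops the last segment and rolls
// back the accumulated time so the final merged video keeps the right length.
- (void)deleteClick {
    if (self.secondsArray.count == 0) return;

    // Roll back the total recorded time by the last segment's duration
    NSTimeInterval lastSeconds = [[self.secondsArray lastObject] doubleValue];
    totalTime -= lastSeconds;
    [self.secondsArray removeLastObject];

    // Remove the matching progress-bar segment from the screen
    UIView *lastView = [self.viewArray lastObject];
    [lastView removeFromSuperview];
    [self.viewArray removeLastObject];

    // Also drop the segment's file/model so it is not merged later
    // (assumed counterpart of saveModelToVideoArray)
    [self removeLastVideoModel];
}
```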

// Save to the photo album
- (void)savePhotosAlbum:(NSURL *)videoPathURL {

    // Must be delayed, otherwise the save may fail
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.1 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{

        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:videoPathURL]) {
            [library writeVideoAtPathToSavedPhotosAlbum:videoPathURL
                                        completionBlock:^(NSURL *assetURL, NSError *error) {
                if (error) {
                    NSLog(@"Save failed: %@", error);
                } else {
                    NSLog(@"Saved to photo album: %@", assetURL);
                }
            }];
        }
    });
}

3. Video merging combines several clips into one. This uses AVMutableComposition from AVFoundation to work with the audio and video tracks. The key point: iterate over the clips in videosPathArray, pull out each clip's audio track and video track, and insert them end to end; the merged composition is then exported with AVAssetExportSession. Here is the code:

/* Merge videos | convert video format
 @param videosPathArray path array of the videos to merge
 @param outPath         output path
 @param outputFileType  video format
 @param presetName      resolution preset
 @param completeBlock   called with the merged video's URL (mergeFileURL)
 */
+ (void)mergeAndExportVideos:(NSArray *)videosPathArray withOutPath:(NSString *)outPath outputFileType:(NSString *)outputFileType presetName:(NSString *)presetName didComplete:(void (^)(NSError *error, NSURL *mergeFileURL))completeBlock {
    
    if (videosPathArray.count == 0) {

        NSLog(@"No videos to merge");
        NSError *error = [NSError errorWithDomain:NSCocoaErrorDomain code:0 userInfo:@{NSLocalizedDescriptionKey : @"The number of videos cannot be 0"}];

        if (completeBlock) {
            completeBlock(error, nil);
        }

        return;
    }
    
    AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];

    AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                        preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                        preferredTrackID:kCMPersistentTrackID_Invalid];
    CMTime totalDuration = kCMTimeZero;
    for (int i = 0; i < videosPathArray.count; i++) {

        NSDictionary *options = @{AVURLAssetPreferPreciseDurationAndTimingKey : @YES};
        AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:videosPathArray[i]] options:options];

        // Pull the audio track out of the asset and append it to the composition
        NSError *audioError = nil;
        AVAssetTrack *assetAudioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
        BOOL audioOK = [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                                           ofTrack:assetAudioTrack
                                            atTime:totalDuration
                                             error:&audioError];
        NSLog(@"audio error: %@, inserted: %d", audioError, audioOK);

        // Same for the video track
        NSError *videoError = nil;
        AVAssetTrack *assetVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
        BOOL videoOK = [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                                           ofTrack:assetVideoTrack
                                            atTime:totalDuration
                                             error:&videoError];
        NSLog(@"video error: %@, inserted: %d", videoError, videoOK);

        totalDuration = CMTimeAdd(totalDuration, asset.duration);
    }

    // Make sure no stale file sits at the output path
    NSURL *mergeFileURL = [NSURL fileURLWithPath:outPath];
    if ([[NSFileManager defaultManager] fileExistsAtPath:outPath]) {
        [[NSFileManager defaultManager] removeItemAtPath:outPath error:nil];
    }
    
    
    
    // Export
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                      presetName:presetName];

    exporter.outputURL = mergeFileURL;
    exporter.outputFileType = outputFileType;
    exporter.shouldOptimizeForNetworkUse = YES;

    // exporter.progress cannot be observed via KVO, so if you need a
    // progress UI, start a timer here and read exporter.progress periodically
    [exporter exportAsynchronouslyWithCompletionHandler:^{

        dispatch_async(dispatch_get_main_queue(), ^{

            switch (exporter.status) {
                case AVAssetExportSessionStatusFailed: {
                    if (completeBlock) {
                        completeBlock(exporter.error, mergeFileURL);
                    }
                    break;
                }
                case AVAssetExportSessionStatusCancelled: {
                    NSLog(@"Export Status: Cancelled");
                    break;
                }
                case AVAssetExportSessionStatusCompleted: {
                    if (completeBlock) {
                        completeBlock(nil, mergeFileURL);
                    }
                    break;
                }
                case AVAssetExportSessionStatusUnknown: {
                    NSLog(@"Export Status: Unknown");
                    break;
                }
                case AVAssetExportSessionStatusExporting: {
                    NSLog(@"Export Status: Exporting");
                    break;
                }
                case AVAssetExportSessionStatusWaiting: {
                    NSLog(@"Export Status: Waiting");
                    break;
                }
            }
        });
    }];
}
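The progress-polling idea mentioned in the comment above could look like the following. This is a hedged sketch under the same local names (`exporter`); the timer would be started just before calling `exportAsynchronouslyWithCompletionHandler:`.

```objectivec
// Poll exporter.progress on a repeating timer, since AVAssetExportSession's
// progress property is not KVO-observable.
NSTimer *progressTimer =
    [NSTimer scheduledTimerWithTimeInterval:0.3
                                    repeats:YES
                                      block:^(NSTimer *timer) {
    NSLog(@"Export progress: %.0f%%", exporter.progress * 100);

    // Stop polling once the session leaves the exporting/waiting states
    if (exporter.status != AVAssetExportSessionStatusExporting &&
        exporter.status != AVAssetExportSessionStatusWaiting) {
        [timer invalidate];
    }
}];
```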

Note the concept behind CMTimeAdd: it is simply addition for media time. CMTime is a C struct that represents time throughout AVFoundation and is very important in audio/video work; I will write a dedicated article about it later, since it is fundamental.
When merging, you can also choose the output file type and resolution preset, so this same method doubles as a way to transcode and compress video.
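As a quick illustration of CMTime arithmetic (plain CoreMedia API, nothing specific to this project): CMTime stores time as a rational value/timescale pair, which avoids floating-point drift when durations are added clip after clip.

```objectivec
// CMTime represents time as value/timescale, e.g. 600/600 = 1 second.
CMTime a = CMTimeMake(600, 600);             // 1 second
CMTime b = CMTimeMakeWithSeconds(2.5, 600);  // 2.5 seconds at timescale 600
CMTime sum = CMTimeAdd(a, b);                // exact rational addition
NSLog(@"sum = %f seconds", CMTimeGetSeconds(sum)); // prints 3.500000
```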
