
MergeVideo

2016-08-25  MatthewSp

The same old topic: merging videos.

Please do not repost~

P.S. This post only covers the key points and the pitfalls I hit while using this stuff; if you have questions, please leave a comment on Jianshu.


- (IBAction)mergeVideo:(id)sender
{
    /* The recipe for merging videos is just the steps below; follow them and google the individual functions for details.
     1. Loading, obviously. Use AVAsset.
     2. Audio tracks are not considered here, so only the video information is fetched. Use track to pull the video info out of each asset. There are two tracks in total: one is the clip you shot yourself, the other is the effect clip. The two clips need to play at the same time, so they share the same start time, kCMTimeZero, and the overall duration is naturally that of your own clip. Both tracks then go into mixComposition.
     3. The third step is the most important one: the layer instructions. The name gives most of it away: they tell the system, ahead of the merge, how the output should look, i.e. video size, orientation, and so on. This is the core of the composition. All we need here is an opacity change: lower the effect track's opacity so the layer with your own clip underneath shows through.
     4. Export the result with AVAssetExportSession.
    */

    // 1. Load both assets.
    secondAsset = [AVAsset assetWithURL:videoURL];

    NSString *path = [[NSBundle mainBundle] pathForResource:@"rain" ofType:@"mp4"];
    NSLog(@"path is %@", path);
    firstAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:path] options:nil];

    NSLog(@"first Asset = %@", firstAsset);
    NSLog(@"second Asset = %@", secondAsset);
    NSLog(@"firstAsset == %f, %f", firstAsset.naturalSize.width, firstAsset.naturalSize.height);
    NSLog(@"secondAsset == %f, %f", secondAsset.naturalSize.width, secondAsset.naturalSize.height);

    if (firstAsset && secondAsset) {
        
        // 2. Create the composition and add both video tracks.
        CGSize targetSize = CGSizeMake(640, 480);

        mixComposition = [[AVMutableComposition alloc] init];
        AVMutableCompositionTrack *firstTrack =
        [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                    preferredTrackID:kCMPersistentTrackID_Invalid];
        [firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration)
                            ofTrack:[[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                             atTime:kCMTimeZero
                              error:nil];

        // A segment can be retimed after insertion: stretch the effect clip by 5 seconds.
        [firstTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration)
                        toDuration:CMTimeAdd(firstAsset.duration, CMTimeMakeWithSeconds(5.0f, 600))];
        AVMutableCompositionTrack *secondTrack =
        [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                    preferredTrackID:kCMPersistentTrackID_Invalid];
        [secondTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, secondAsset.duration)
                             ofTrack:[[secondAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                              atTime:kCMTimeZero
                               error:nil];
        
        // CGAffineTransformMake(a,b,c,d,tx,ty): a/d scale, b/c rotate, tx/ty translate.

        // 3. Build the layer instructions.
        AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        // The instruction's timeRange must cover the whole composition, or the export will fail.
        mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, mixComposition.duration);

        // Layer instruction for the first (effect) video.
        AVMutableVideoCompositionLayerInstruction *firstlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:firstTrack];
        [firstlayerInstruction setOpacity:0 atTime:CMTimeAdd(firstAsset.duration, CMTimeMakeWithSeconds(5.0f, 600))];
        //[firstlayerInstruction setTransform:[self layerTrans:firstAsset withTargetSize:targetSize] atTime:kCMTimeZero];
//        [firstlayerInstruction setOpacityRampFromStartOpacity:1.0 toEndOpacity:0.0 timeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration)];

        // Crop: show only this rect of the layer.
        CGRect test = CGRectMake(0, 0, 300, 300);
        [firstlayerInstruction setCropRectangle:test atTime:kCMTimeZero];

        // Layer instruction for the second video.
        AVMutableVideoCompositionLayerInstruction *secondlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:secondTrack];
        [secondlayerInstruction setTransform:[self layerTrans:secondAsset withTargetSize:targetSize] atTime:kCMTimeZero];
//        [secondlayerInstruction setOpacityRampFromStartOpacity:0.0 toEndOpacity:1.0 timeRange:CMTimeRangeMake(kCMTimeZero, CMTimeMake(600, 600))];

        // Reverse the order of this array and the stacking order (which video is on top) flips too.
        mainInstruction.layerInstructions = [NSArray arrayWithObjects:firstlayerInstruction, secondlayerInstruction, nil];
        
        mainComposition = [AVMutableVideoComposition videoComposition];
        mainComposition.instructions = [NSArray arrayWithObjects:mainInstruction, nil];
        mainComposition.frameDuration = CMTimeMake(1, 30); // 30 fps
        mainComposition.renderSize = targetSize;
        
        // Export path: Documents/mergeVideo.mov (remove any previous output first).
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = [paths objectAtIndex:0];
        NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:@"mergeVideo.mov"];

        NSFileManager *fileManager = [NSFileManager defaultManager];
        [fileManager removeItemAtPath:myPathDocs error:NULL];

        NSURL *url = [NSURL fileURLWithPath:myPathDocs];

        // Export.
        
        AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
        
        exporter.outputURL = url;
        
        exporter.outputFileType = AVFileTypeQuickTimeMovie;
        
        exporter.shouldOptimizeForNetworkUse = YES;
        
        exporter.videoComposition = mainComposition;
        
        NSLog(@"%.0f",mixComposition.duration.value/mixComposition.duration.timescale + 0.0f);
        
        // Preview: play the (not yet exported) composition directly.
        AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:exporter.asset];
        playerItem.videoComposition = exporter.videoComposition;
        
        
        AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
        
        AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
        
        playerLayer.frame = self.view.layer.bounds;
        playerLayer.videoGravity = AVLayerVideoGravityResizeAspect;
        [self.view.layer addSublayer:playerLayer];
        
        // -playItem: (not shown here) starts playback shortly after the layer is added.
        [NSTimer scheduledTimerWithTimeInterval:0.1 target:self selector:@selector(playItem:) userInfo:@{@"item": player} repeats:NO];
        
        [exporter exportAsynchronouslyWithCompletionHandler:^{
            
            dispatch_async(dispatch_get_main_queue(), ^{
                
                [self exportDidFinish:exporter];
                
            });
        }];
        
    } else {

        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error!" message:@"Pick a video first"
                                                       delegate:self cancelButtonTitle:@"OK" otherButtonTitles:nil];
        [alert show];
    }
    

}
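
The layerTrans:withTargetSize: helper called above isn't listed in the post. A minimal sketch, assuming it merely scales the first video track's naturalSize to fill the target render size (no rotation handling, i.e. an identity preferredTransform), might look like this:

- (CGAffineTransform)layerTrans:(AVAsset *)asset withTargetSize:(CGSize)targetSize
{
    // Hypothetical implementation: stretch the track's natural size onto targetSize.
    // Assumes the source carries no rotation metadata (identity preferredTransform).
    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    CGSize naturalSize = videoTrack.naturalSize;
    CGFloat sx = targetSize.width  / naturalSize.width;
    CGFloat sy = targetSize.height / naturalSize.height;
    return CGAffineTransformMakeScale(sx, sy);
}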

The code above covers multi-clip video merging as well as mixing in background audio:
1. Video merging relies mainly on AVMutableComposition; alongside it, AVMutableVideoComposition carries all the information about how the video segments are rendered, and AVMutableAudioMix carries the information about the audio segments (a minimal audio sketch follows below).
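
The audio side isn't shown in the snippet above, but the ivars audioUrl and audioMixParams declared further down hint at it. A minimal sketch of wiring a background audio track into the same mixComposition, assuming audioUrl points to a local audio file:

        // Minimal sketch, assuming audioUrl points to a local audio file.
        AVURLAsset *audioAsset = [AVURLAsset URLAssetWithURL:audioUrl options:nil];
        AVMutableCompositionTrack *audioTrack =
        [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                    preferredTrackID:kCMPersistentTrackID_Invalid];
        [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, secondAsset.duration)
                            ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] firstObject]
                             atTime:kCMTimeZero
                              error:nil];

        // Optional volume control via AVMutableAudioMix;
        // audioMixParams would presumably collect several of these parameter objects.
        AVMutableAudioMixInputParameters *params =
        [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:audioTrack];
        [params setVolume:0.5f atTime:kCMTimeZero];

        AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
        audioMix.inputParameters = @[params];
        // Hand it to the export session before exporting:
        // exporter.audioMix = audioMix;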

  1. As shown above, if it's just a simple video merge, with one audio track or none, you only need to create AVMutableComposition, AVMutableVideoComposition, AVMutableCompositionTrack, and AVMutableVideoCompositionLayerInstruction in turn.
  2. The code above is the ancient code mentioned in the previous post, apparently written by an Indian developer (with some fragments from my own tests mixed in); look closely and the ordering is clearly awkward. Still, it contains some instructive methods, such as scaleTimeRange:toDuration:, which shows that a segment can be stretched or compressed after it has been inserted into an AVMutableCompositionTrack; read the docs carefully and you'll find many finer-grained methods for inserting, removing, and so on (see the sketch after this list).
  3. AVMutableVideoCompositionLayerInstruction pairs with a track: think track is to layerInstruction as AVPlayer is to AVPlayerLayer. Skim the layer-instruction docs and you'll find the simple per-layer operations: transform, scale, opacity, and the like (also shown in the sketch below).
  4. Worth mentioning: there is a limit on how many AVMutableCompositionTracks you can add, 15 being the ceiling. So if you want to merge many short clips, first think through how each clip should behave, decide from that how many tracks to create, and only then apply the timing and transform rules to each layer.
  5. AVAssetExportSession does the export. Several output formats are available, though it seems only AVFileTypeQuickTimeMovie actually works here; if your source video comes from the network, set shouldOptimizeForNetworkUse to YES.
  6. The vast majority of export failures come from a problem with mainInstruction (most often a timeRange that does not cover the whole composition).
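
To make points 2 and 3 concrete, here is a short sketch; videoTrack and the times are made up for illustration:

        // Point 2: a 10 s segment already inserted into `videoTrack`
        // (any AVMutableCompositionTrack) stretched to 20 s, i.e. half speed.
        [videoTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(10.0, 600))
                        toDuration:CMTimeMakeWithSeconds(20.0, 600)];

        // Point 3: per-track layer instruction with a transform and an opacity ramp.
        AVMutableVideoCompositionLayerInstruction *layer =
        [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
        [layer setTransform:CGAffineTransformMakeScale(0.5, 0.5) atTime:kCMTimeZero]; // half size
        [layer setOpacityRampFromStartOpacity:1.0
                                 toEndOpacity:0.0
                                    timeRange:CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(2.0, 600))]; // 2 s fade-out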

For reference, the instance variables used above:

    NSURL *videoURL;
    AVURLAsset *firstAsset;
    AVURLAsset *secondAsset;
    AVMutableVideoComposition *mainComposition;
    AVMutableComposition *mixComposition;
    NSMutableArray * audioMixParams;
    NSURL * audioUrl;

Oh well, to wrap up, here is the GitHub repo. It contains the audio/video merging and immediate-playback code I used for testing, but it's all throwaway code, especially the playback part, which won't necessarily work.
https://github.com/Thetiso/MergeVideo
