
iOS: Adding Audio and Subtitles to a Video

2019-08-12  Bart_Simpson

I recently needed to build a feature that adds subtitles to a video, and on top of that it had to be written in Objective-C, which had me worried.
For something I had never done before, the only option was search-engine-driven programming.
After a lot of searching and experimenting I got it working in two days. The only shortcoming is that the subtitles aren't sharp enough, and for now I don't know whether that is down to the quality of the video composition.

Here is a quick walkthrough of the overall flow and its key points:
1. Create an AVMutableComposition and its AVMutableCompositionTracks
AVMutableComposition *mix = [AVMutableComposition composition];
// video track
AVMutableCompositionTrack *videoTrack = [mix addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
// audio track
AVMutableCompositionTrack *audioTrack = [mix addMutableTrackWithMediaType:AVMediaTypeAudio
            preferredTrackID:kCMPersistentTrackID_Invalid];
2. Insert the source assets into the tracks
// append the video at the current end of the composition
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:videoTrack.asset.duration error:nil];
// lay the audio over the full length of the video track
[audioTrack insertTimeRange:CMTimeRangeFromTimeToTime(kCMTimeZero, videoTrack.asset.duration) ofTrack:[[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:nil];
3. Create the export session
self.exporter = [[AVAssetExportSession alloc] initWithAsset:mix presetName:AVAssetExportPreset3840x2160];
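One thing worth checking here (my own addition, not part of the original code): AVAssetExportPreset3840x2160 is hard-coded, and if a preset isn't compatible with the composition the export session can fail or come back nil. A small defensive sketch:
// Hypothetical sanity check before creating the exporter
NSArray<NSString *> *presets = [AVAssetExportSession exportPresetsCompatibleWithAsset:mix];
if (![presets containsObject:AVAssetExportPreset3840x2160]) {
    // fall back to a preset that keeps the source quality
    self.exporter = [[AVAssetExportSession alloc] initWithAsset:mix presetName:AVAssetExportPresetHighestQuality];
}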
4. AVMutableVideoComposition and AVMutableVideoCompositionInstruction
// AVMutableVideoComposition manages all the video tracks; it determines the final video's render size, and any cropping happens here
AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
// AVMutableVideoCompositionInstruction covers a time range of the video track and lets you scale, rotate, etc.
AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
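To make the relationship between these two objects concrete, here is a minimal sketch of how they are usually wired together (sizes and durations are placeholders; the full version is in the complete code further down):
mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, mix.duration);
AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
mainInstruction.layerInstructions = @[layerInstruction];
mainCompositionInst.instructions = @[mainInstruction];
mainCompositionInst.renderSize = videoTrack.naturalSize; // final output size; crop/scale decisions live here
mainCompositionInst.frameDuration = CMTimeMake(1, 30);   // 30 fps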
5. Add subtitles or a watermark
// videoLayer hosts the video frames; parentLayer is the root layer: videoLayer and every subtitle/sticker layer must be added to it
CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
...
[parentLayer addSublayer:subtitleLayer]; // one subtitleLayer per subtitle
Since adding subtitles is the main goal here, let me go into this part in more detail.
A subtitle is really just a piece of text (a label) sitting on top of the video layer. It starts out hidden, appears at a specified time, and disappears again at a specified time. In other words, the layer needs an animation group: the first animation sets its opacity to 1, and the second sets it back to 0.
After parsing the subtitle file, set the animation group's beginTime to the subtitle's start time and its duration to the display duration; set the fade-out animation's beginTime to that same duration and its duration to 0.1.
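Here is a minimal sketch of the animation group for a single subtitle; subtitleLayer, startTime and endTime stand in for the values parsed from the subtitle file:
CABasicAnimation *showAnim = [CABasicAnimation animationWithKeyPath:@"opacity"];
showAnim.fromValue = @1.0;   // hold the layer visible for the whole group
showAnim.toValue = @1.0;
showAnim.removedOnCompletion = NO;

CABasicAnimation *hideAnim = [CABasicAnimation animationWithKeyPath:@"opacity"];
hideAnim.fromValue = @1.0;   // then fade it back out at the end
hideAnim.toValue = @0.0;
hideAnim.beginTime = endTime - startTime; // relative to the group's beginTime
hideAnim.duration = 0.1;
hideAnim.removedOnCompletion = NO;

CAAnimationGroup *group = [CAAnimationGroup animation];
group.animations = @[showAnim, hideAnim];
// a beginTime of 0 means "begin now" to Core Animation, so use AVCoreAnimationBeginTimeAtZero (or a small epsilon) when startTime is 0
group.beginTime = (startTime == 0) ? AVCoreAnimationBeginTimeAtZero : startTime;
group.duration = endTime - startTime;
[subtitleLayer addAnimation:group forKey:@"subtitle"];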
6. AVVideoCompositionCoreAnimationTool and hooking everything up to the exporter
mainCompositionInst.animationTool = [AVVideoCompositionCoreAnimationTool
                                         videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
/// assign the video composition that carries the subtitle layers
    self.exporter.videoComposition = mainCompositionInst;
7. Export
[self.exporter exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
        });
}];
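The completion handler above only hops back to the main queue; in practice you will also want to check how the export ended. A small sketch of that check (my addition, not shown in the original flow):
[self.exporter exportAsynchronouslyWithCompletionHandler:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        if (self.exporter.status == AVAssetExportSessionStatusCompleted) {
            // success: the finished file is at self.exporter.outputURL
        } else {
            NSLog(@"export failed: %@", self.exporter.error);
        }
    });
}];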

Thanks to the article "ios 通过AVFoundation给视频添加字幕" (adding subtitles to a video with AVFoundation) by 愛我你就抱抱我; without it I would have had no clue how to approach subtitles. The article doesn't include the complete subtitle animation code, though...

The complete code is below. It was written against an actual business requirement, so it deviates a little from the outline above.
I also hit a particularly strange problem when adding the subtitle labels: adding a single label.layer worked fine, but adding them inside a for loop added none of them at all. After trying things for ages with no idea why, I had a flash of inspiration and added one line that puts the label into a view first, and that fixed it.
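For reference, a hypothetical reconstruction of that workaround (the original post doesn't show the exact line, so the view used here is an assumption):
UILabel *titleLabel = [[UILabel alloc] initWithFrame:CGRectMake(8, 8, kScreenW - 16, 38)];
titleLabel.attributedText = [[IVYSubTitleManager manager] getSRTSubtitleWithCurrentObj:obj];
[self.view addSubview:titleLabel];          // the extra line: attaching the label to a view made its layer render
[parentLayer addSublayer:titleLabel.layer]; // only then does the layer show up in the composition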
However, the subtitles rendered from labels weren't sharp enough, and I still don't know whether that is caused by the video composition or by the resolution I chose. So I switched to CATextLayer, which is a tiny bit crisper than a label. Oddly, setting textLayer.contentsScale = [UIScreen mainScreen].scale; actually made it blurrier, so I commented that contentsScale line out.
/**
 @param videos the video files to be composited
 @param audios the audio files to be composited
 */
- (void)compositionVideos:(NSArray <NSURL *>*)videos audios:(NSArray <NSURL *>*)audios
{
    [SVProgressHUD setDefaultMaskType:(SVProgressHUDMaskTypeClear)];
    [SVProgressHUD showWithStatus:@"正在合成..."];
    
    NSMutableArray *videosAsset = [NSMutableArray arrayWithCapacity:0];
    NSMutableArray *audiosAsset = [NSMutableArray arrayWithCapacity:0];
    
    for (NSURL *videoPath in videos) {
        AVAsset *asset = [AVAsset assetWithURL:videoPath];
        NSLog(@"%@",asset);
        [videosAsset insertObject:asset atIndex:videosAsset.count];
    }
    
    for (NSURL *audioPath in audios) {
        AVAsset *asset = [AVAsset assetWithURL:audioPath];
        [audiosAsset insertObject:asset atIndex:audiosAsset.count];
    }
    
    // 1 - Create the AVMutableComposition object; it will hold the AVMutableCompositionTrack instances.
    AVMutableComposition *mix = [AVMutableComposition composition];
    // 2 - Video track
    AVMutableCompositionTrack *videoTrack = [mix addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    
    // 3 - Append each video at the current end of the composition
    [videosAsset enumerateObjectsUsingBlock:^(AVAsset *asset, NSUInteger idx, BOOL *stop) {
        [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:videoTrack.asset.duration error:nil];
    }];
    
    NSMutableArray *audioTrackes = [NSMutableArray arrayWithCapacity:0];
    [audiosAsset enumerateObjectsUsingBlock:^(AVAsset *asset, NSUInteger idx, BOOL *stop) {
        // add one composition audio track per audio asset and lay it over the full length of the video
        AVMutableCompositionTrack *audioTrack = [mix addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
        [audioTrack insertTimeRange:CMTimeRangeFromTimeToTime(kCMTimeZero, videoTrack.asset.duration) ofTrack:[[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:nil];
        [audioTrackes addObject:audioTrack];
    }];
    
    // 4 - Build the output path
    NSURL *url = [EditAudioVideo exporterVideoPath];
    
    // 5 - Create the export session
    self.exporter = [[AVAssetExportSession alloc] initWithAsset:mix presetName:AVAssetExportPreset3840x2160];
    
    // Adjust the background music volume - start
    AVMutableAudioMix *videoAudioMixTools = [AVMutableAudioMix audioMix];
    
    // Build the volume parameters for each audio track
    NSMutableArray *inputParameters = [NSMutableArray arrayWithCapacity:0];
    [audiosAsset enumerateObjectsUsingBlock:^(AVAsset *asset, NSUInteger idx, BOOL *stop) {
        // audioMixInputParametersWithTrack: already associates the parameters with this track's ID,
        // so no explicit setTrackID: call is needed here
        AVMutableAudioMixInputParameters *audioParameters = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:audioTrackes[idx]];
        [audioParameters setVolumeRampFromStartVolume:1.0 toEndVolume:1.0 timeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)];
        
        [inputParameters addObject:audioParameters];
    }];
    
    videoAudioMixTools.inputParameters = inputParameters;
    
    // 3.1 AVMutableVideoCompositionInstruction covers a time range of the video track and lets you scale, rotate, etc.
    AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, videoTrack.asset.duration);
    
    /// 3.2 AVMutableVideoCompositionLayerInstruction wraps a single video track and covers all the clips on that track;
    /// transforms and opacity ramps have to go through it
    AVMutableVideoCompositionLayerInstruction *videolayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
    // grab the composition's video track so we can read its preferred transform and work out the orientation
    AVAssetTrack *videoAssetTrack = [[videoTrack.asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    UIImageOrientation videoAssetOrientation_  = UIImageOrientationUp;
    BOOL isVideoAssetPortrait_  = NO;
    CGAffineTransform videoTransform = videoAssetTrack.preferredTransform;
    if (videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {
        videoAssetOrientation_ = UIImageOrientationRight;
        isVideoAssetPortrait_ = YES;
    }
    if (videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {
        videoAssetOrientation_ =  UIImageOrientationLeft;
        isVideoAssetPortrait_ = YES;
    }
    if (videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0) {
        videoAssetOrientation_ =  UIImageOrientationUp;
    }
    if (videoTransform.a == -1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == -1.0) {
        videoAssetOrientation_ = UIImageOrientationDown;
    }
    [videolayerInstruction setTransform:videoAssetTrack.preferredTransform atTime:kCMTimeZero];
    [videolayerInstruction setOpacity:0.0 atTime:videoTrack.asset.duration];
    
    mainInstruction.layerInstructions = [NSArray arrayWithObjects:videolayerInstruction,nil];
    
    // AVMutableVideoComposition manages all the video tracks; it determines the final video's render size, and cropping would be done here
    AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
    
    CGSize naturalSize;
    if(isVideoAssetPortrait_){
        naturalSize = CGSizeMake(videoTrack.naturalSize.height, videoTrack.naturalSize.width);
    } else {
        naturalSize = videoTrack.naturalSize;
    }
    
    /// Set the render width and height
    float renderWidth, renderHeight;
    renderWidth = naturalSize.width;
    renderHeight = naturalSize.height;
    mainCompositionInst.renderSize = CGSizeMake(renderWidth, renderHeight);
    mainCompositionInst.instructions = [NSArray arrayWithObject:mainInstruction];
    mainCompositionInst.frameDuration = CMTimeMake(1, 30);
    
    /// Add the watermark / subtitles
    // videoLayer hosts the video frames; parentLayer is the root layer that videoLayer and all subtitle/sticker layers get added to
    CALayer *parentLayer = [CALayer layer];
    CALayer *videoLayer = [CALayer layer];
    parentLayer.frame = CGRectMake(0, 0, renderWidth, renderHeight);
    videoLayer.frame = CGRectMake(0, 0, renderWidth, renderHeight);
    [parentLayer addSublayer:videoLayer];
    
    for (IVYSubTitleManager *obj in [IVYSubTitleManager manager].subTitles) {
        
//        UILabel * titleLabel = [[UILabel alloc]initWithFrame: CGRectMake(8, 8, kScreenW - 16, 38)];
//        titleLabel.attributedText = [[IVYSubTitleManager manager] getSRTSubtitleWithCurrentObj:obj];
//        titleLabel.minimumScaleFactor = 9/14;
//        titleLabel.adjustsFontSizeToFitWidth = true;
//        titleLabel.numberOfLines = 3;
//        titleLabel.shadowOffset = CGSizeMake(0, -1);
//        titleLabel.textAlignment = NSTextAlignmentCenter;
//        titleLabel.layer.opacity = 0;
        
        CATextLayer * textLayer = [CATextLayer layer];
        textLayer.string = [[IVYSubTitleManager manager] getSRTSubtitleWithCurrentObj:obj];
        textLayer.frame = CGRectMake(8, 8, kScreenW - 16, 38);
        textLayer.wrapped = YES;
//        textLayer.contentsScale = [UIScreen mainScreen].scale;
        textLayer.alignmentMode = kCAAlignmentCenter;
        textLayer.truncationMode = kCATruncationNone;
        textLayer.opacity = 0;
        
        // Opacity animations: the subtitle should only be visible during its own time range and stay hidden the rest of the time
        CABasicAnimation *opacityAnim = [CABasicAnimation animationWithKeyPath:@"opacity"];
        opacityAnim.fromValue = [NSNumber numberWithFloat:1];
        opacityAnim.toValue = [NSNumber numberWithFloat:1];
        opacityAnim.removedOnCompletion = NO;
        
        CABasicAnimation *opacityDismissAnim = [CABasicAnimation animationWithKeyPath:@"opacity"];
        opacityDismissAnim.fromValue = [NSNumber numberWithFloat:1];
        opacityDismissAnim.toValue = [NSNumber numberWithFloat:0];
        opacityDismissAnim.removedOnCompletion = NO;
        opacityDismissAnim.beginTime = obj.endTime - obj.startTime;
        opacityDismissAnim.duration = 0.1;
        
        CAAnimationGroup *groupAnimation = [CAAnimationGroup animation];
        groupAnimation.animations = [NSArray arrayWithObjects:opacityAnim, opacityDismissAnim, nil];
        
        if (obj.startTime == 0) {
            // a beginTime of 0 means "begin now" to Core Animation, so use a small epsilon instead
            groupAnimation.beginTime = 0.01;
        } else {
            groupAnimation.beginTime = obj.startTime;
        }
        groupAnimation.duration = obj.endTime - obj.startTime;
        [textLayer addAnimation:groupAnimation forKey:@"groupAnimation"];
        [parentLayer addSublayer: textLayer];
    }
    
    mainCompositionInst.animationTool = [AVVideoCompositionCoreAnimationTool
                                         videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
    self.exporter.outputURL = url;
    self.exporter.outputFileType = AVFileTypeQuickTimeMovie;
    /// assign the audio mix
    self.exporter.audioMix = videoAudioMixTools;
    self.exporter.shouldOptimizeForNetworkUse = YES;
    /// assign the video composition that carries the subtitle layers
    self.exporter.videoComposition = mainCompositionInst;
    
    __weak typeof(self) wSelf = self;
    [self.exporter exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            [SVProgressHUD showWithStatus:@"录制合成完成"];
            [wSelf exportDidFinish:wSelf.exporter];
        });
    }];
}
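A hypothetical call site, assuming the source files are bundled with the app (the file names are placeholders):
NSURL *videoURL = [[NSBundle mainBundle] URLForResource:@"demo" withExtension:@"mp4"];
NSURL *audioURL = [[NSBundle mainBundle] URLForResource:@"bgm" withExtension:@"m4a"];
[self compositionVideos:@[videoURL] audios:@[audioURL]];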