AV Foundation (Part 4: Video Playback)

2019-01-16  Trigger_o

1. AVPlayer, AVPlayerLayer, and AVPlayerItem

AVPlayer is a non-visual component: it plays audiovisual media of any kind, audio or video, and it will play even when no visible view has been created.
AVPlayerLayer is the component that makes playback visible; it inherits from CALayer.
AVAsset is a static description of a media resource, while AVPlayerItem is the dynamic data model for that resource, exposing things like seekToTime: and currentTime.
An AVPlayerItem is composed of one or more media tracks, modeled by the AVPlayerItemTrack class; each AVPlayerItemTrack represents a single media stream.
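
As a quick illustration of the dynamic model, seeking goes through the player (or the item) rather than the asset. A minimal sketch, assuming a self.player property already exists:

// Jump to the 30-second mark; the completion block reports whether the
// seek finished or was interrupted by a newer seek request
[self.player seekToTime:CMTimeMake(30, 1) completionHandler:^(BOOL finished) {
    if (finished) {
        NSLog(@"now at %f seconds", CMTimeGetSeconds(self.player.currentTime));
    }
}];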

2. Playing a Video

//Create the AVAsset
AVAsset *asset = [AVAsset assetWithURL:[[NSBundle mainBundle] URLForResource:@"测试录屏" withExtension:@"mov"]];
//Create the AVPlayerItem
AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];
//Create the AVPlayer
self.player = [AVPlayer playerWithPlayerItem:item];
//Create the AVPlayerLayer
AVPlayerLayer *layer = [AVPlayerLayer playerLayerWithPlayer:self.player];
layer.bounds = self.view.bounds;
layer.position = CGPointMake(0, 0);
layer.anchorPoint = CGPointMake(0, 0);
[self.view.layer addSublayer:layer];
//Observe the AVPlayerItem's status via KVO
[item addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nil];

//Playback can begin once the status is AVPlayerItemStatusReadyToPlay
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary<NSKeyValueChangeKey,id> *)change context:(void *)context{
    if([keyPath isEqualToString:@"status"]){
        AVPlayerItem *item = (AVPlayerItem *)object;
        if(item.status == AVPlayerItemStatusReadyToPlay){
            [self.player play];
        }
    }
}
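
The registration needs to be balanced when the observer goes away; a minimal sketch, assuming the item is retained in a hypothetical self.playerItem property:

//Unregister the KVO observer before this object is deallocated
- (void)dealloc{
    [self.playerItem removeObserver:self forKeyPath:@"status"];
}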

When building a custom playback view, you can turn the view's backing layer into an AVPlayerLayer directly by overriding +layerClass. When a UIView is initialized, its default layer is a CALayer; overriding this method substitutes another CALayer subclass.

+ (Class)layerClass{
    return [AVPlayerLayer class];
}

- (instancetype)initWithPlayer:(AVPlayer *)player{
    if(self = [super initWithFrame:CGRectZero]){
        self.backgroundColor = [UIColor blackColor];
        self.autoresizingMask = UIViewAutoresizingFlexibleHeight | UIViewAutoresizingFlexibleWidth;
        //Equivalent to [AVPlayerLayer playerLayerWithPlayer:]
        [(AVPlayerLayer *)self.layer setPlayer:player];
    }
    return self;
}
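
Using the custom view then takes only a few lines; PlayerView is a hypothetical name for the class sketched above:

PlayerView *playerView = [[PlayerView alloc] initWithPlayer:self.player];
playerView.frame = self.view.bounds;
[self.view addSubview:playerView];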

3. Handling Time with CMTime

CMTime is defined as the following struct:

typedef struct{
    CMTimeValue value;      // 64-bit numerator
    CMTimeScale timescale;  // 32-bit denominator
    CMTimeFlags flags;
    CMTimeEpoch epoch;
} CMTime;

The key fields are value and timescale: value is a 64-bit integer, timescale a 32-bit integer. When a CMTime represents a time, value is the numerator and timescale the denominator, for example:

CMTime half = CMTimeMake(1, 2); // 0.5 seconds
CMTime five = CMTimeMake(5, 1); // 5 seconds
CMTime zero = kCMTimeZero;      // 0
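
Because CMTime stores a rational number rather than a float, arithmetic goes through dedicated Core Media functions; for example:

CMTime sum = CMTimeAdd(half, five);                // 5.5 seconds
CMTime diff = CMTimeSubtract(five, half);          // 4.5 seconds
NSTimeInterval sec = CMTimeGetSeconds(diff);       // 4.5
CMTime twoHalf = CMTimeMakeWithSeconds(2.5, 600);  // 2.5 seconds at timescale 600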

AVFoundation provides two APIs for observing playback time: one fires periodically, the other fires at specified time points.

//Observe periodically, every 0.5 seconds
[player addPeriodicTimeObserverForInterval:CMTimeMake(1, 2) queue:dispatch_get_main_queue() usingBlock:^(CMTime time) {
    NSTimeInterval second = CMTimeGetSeconds(time);
}];

//Observe a set of specific time points
[player addBoundaryTimeObserverForTimes:@[[NSValue valueWithCMTime:CMTimeMake(10, 1)],[NSValue valueWithCMTime:CMTimeMake(30, 1)]] queue:dispatch_get_main_queue() usingBlock:^{
    //Note: the block is not handed the time point that fired; you have to
    //work out which boundary was hit yourself, as sketched below
}];
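
One way to work out which boundary fired is to compare the player's currentTime against the registered times inside the block. A sketch, where the 0.1-second tolerance is an arbitrary assumption:

NSArray<NSValue *> *boundaries = @[[NSValue valueWithCMTime:CMTimeMake(10, 1)],
                                   [NSValue valueWithCMTime:CMTimeMake(30, 1)]];
__weak AVPlayer *weakPlayer = player;
[player addBoundaryTimeObserverForTimes:boundaries queue:dispatch_get_main_queue() usingBlock:^{
    NSTimeInterval now = CMTimeGetSeconds(weakPlayer.currentTime);
    for (NSValue *value in boundaries) {
        // The callback fires right around a boundary, so a small tolerance suffices
        if (fabs(now - CMTimeGetSeconds(value.CMTimeValue)) < 0.1) {
            NSLog(@"hit the boundary at %f seconds", CMTimeGetSeconds(value.CMTimeValue));
        }
    }
}];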

Playback completion is signaled by the AVPlayerItemDidPlayToEndTimeNotification notification.
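
A sketch of subscribing to it (the playerItemDidPlayToEnd: selector name is arbitrary), here rewinding to the start when the item finishes:

[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(playerItemDidPlayToEnd:)
                                             name:AVPlayerItemDidPlayToEndTimeNotification
                                           object:item];

- (void)playerItemDidPlayToEnd:(NSNotification *)notification{
    //Rewind; call [self.player play] again here if looping is desired
    [self.player seekToTime:kCMTimeZero];
}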

4. Generating Images

AVAssetImageGenerator defines two methods: copyCGImageAtTime:actualTime:error: (grab a single image at a given time) and generateCGImagesAsynchronouslyForTimes:completionHandler: (grab a group of images at multiple times).

//copyCGImageAtTime
NSTimeInterval timeSec = 10;
//Find where the picture actually starts: some videos have no image at the
//very beginning, so locate the first non-empty segment of the video track
AVAssetTrack *videoTrack = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;
NSArray<AVAssetTrackSegment *> *segs = videoTrack.segments;
CMTime currentStartTime = kCMTimeZero;
for (NSInteger i = 0; i < segs.count; i++) {
    if (!segs[i].isEmpty) {
        currentStartTime = segs[i].timeMapping.target.start;
        break;
    }
}

CMTime coverAtTimeSec = CMTimeMakeWithSeconds(timeSec, asset.duration.timescale);
//If the requested time is past the total duration, or before the actual
//start of the picture, fall back to the actual start time
if (CMTimeCompare(coverAtTimeSec, asset.duration) == 1 || CMTimeCompare(coverAtTimeSec, currentStartTime) == -1) {
    coverAtTimeSec = currentStartTime;
}

AVAssetImageGenerator *assetGen = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
assetGen.requestedTimeToleranceBefore = kCMTimeZero;
assetGen.requestedTimeToleranceAfter = kCMTimeZero;
assetGen.appliesPreferredTrackTransform = YES;

NSError *error = nil;
CGImageRef image = [assetGen copyCGImageAtTime:coverAtTimeSec actualTime:NULL error:&error];
if (image != NULL) {
    UIImage *videoImage = [UIImage imageWithCGImage:image];
    CGImageRelease(image);
}
//generateCGImagesAsynchronouslyForTimes
NSMutableArray *times = [NSMutableArray array];
//Generate about 20 images, evenly spaced across the duration
CMTimeValue unit = asset.duration.value / 20;
CMTimeValue current = kCMTimeZero.value;
while (current <= asset.duration.value) {
    CMTime time = CMTimeMake(current, asset.duration.timescale);
    [times addObject:[NSValue valueWithCMTime:time]];
    current += unit;
}
[assetGen generateCGImagesAsynchronouslyForTimes:times completionHandler:^(CMTime requestedTime, CGImageRef _Nullable image, CMTime actualTime, AVAssetImageGeneratorResult result, NSError * _Nullable error) {
    //requestedTime: the time point from the times array
    //image:         the generated image
    //actualTime:    the time at which the image was actually generated
    //result:        succeeded / failed / cancelled
    //error:         error information
    if (result == AVAssetImageGeneratorSucceeded && image != NULL) {
        UIImage *img = [UIImage imageWithCGImage:image];
    }
}];
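
Two details worth noting about the generator: requestedTimeToleranceBefore/After default to kCMTimePositiveInfinity, so leaving them unset lets actualTime drift noticeably from requestedTime in exchange for faster generation, and setting the maximumSize property bounds the decoded image dimensions, which keeps memory in check when producing many thumbnails.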