Video Playback
Playing a video involves several classes: AVPlayer, AVPlayerLayer, AVAsset, AVPlayerItem, and AVPlayerItemTrack. Specifically:
- AVAsset
AVAsset is the most important class in AV Foundation and sits at the core of its design. It is an abstract, immutable class that defines a uniform way to present media resources, modeling the static aspects of a media resource as one whole. An AVAsset is not the media itself; it acts as a container for timed media and is composed of one or more tracks, each carrying its own descriptive metadata, represented by AVAssetTrack.
Creating an asset
NSURL *assetURL = //url
AVAsset *asset = [AVAsset assetWithURL:assetURL];
// AVAsset is an abstract class; assetWithURL: actually returns an instance of its concrete subclass AVURLAsset.
// You can also create an AVURLAsset directly, which lets you pass additional options:
NSDictionary *options = @{AVURLAssetPreferPreciseDurationAndTimingKey:@YES};
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:assetURL options:options];
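An asset's properties are loaded lazily. Before reading values such as duration or tracks it is common to load them asynchronously; a minimal sketch continuing from the asset created above (the key names here are only examples):
NSArray *keys = @[@"duration", @"tracks"];
[asset loadValuesAsynchronouslyForKeys:keys completionHandler:^{
    NSError *error = nil;
    AVKeyValueStatus status = [asset statusOfValueForKey:@"duration" error:&error];
    if (status == AVKeyValueStatusLoaded) {
        // The value can now be read without blocking
        NSLog(@"duration: %f", CMTimeGetSeconds(asset.duration));
    } else {
        NSLog(@"failed to load duration: %@", error);
    }
}];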
- AVPlayer
AVPlayer is an invisible component. It is a controller object for playing time-based audiovisual media, that is, an object that manages playback and the timing information of a resource.
AVPlayer only manages playback of a single asset at a time; the framework also provides a subclass, AVQueuePlayer, which manages a queue of items (a short sketch follows the property declarations below).
@interface AVPlayer : NSObject
AVPlayer inherits from NSObject. Its primary interface declares only a few initializers taking different parameters, plus the status and error properties; the remaining properties and methods are defined in categories.
@property (nonatomic, readonly) AVPlayerStatus status;
@property (nonatomic, readonly, nullable) NSError *error;
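As mentioned above, AVQueuePlayer plays several items back to back; a minimal sketch (firstURL and secondURL are placeholder NSURLs):
AVPlayerItem *item1 = [AVPlayerItem playerItemWithURL:firstURL];
AVPlayerItem *item2 = [AVPlayerItem playerItemWithURL:secondURL];
AVQueuePlayer *queuePlayer = [AVQueuePlayer queuePlayerWithItems:@[item1, item2]];
[queuePlayer play]; // items play in order; -advanceToNextItem skips to the next one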
- AVPlayerLayer
AVPlayerLayer is built on Core Animation and is a visible component that subclasses CALayer. Core Animation is itself time-based and, because it is backed by OpenGL, performs very well. The only property of AVPlayerLayer you typically configure yourself is
@property(copy) AVLayerVideoGravity videoGravity;
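videoGravity controls how the video is scaled inside the layer's bounds. A small sketch of the three supported values (player here is assumed to be an existing AVPlayer):
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
// Preserve aspect ratio, fit inside the layer (the default)
playerLayer.videoGravity = AVLayerVideoGravityResizeAspect;
// Preserve aspect ratio, fill the layer (may crop)
// playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
// Stretch to fill the layer exactly
// playerLayer.videoGravity = AVLayerVideoGravityResize;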
- AVPlayerItem
AVAsset holds only the static information about a media resource; an asset alone cannot drive playback. AVPlayerItem builds a dynamic data model on top of the asset and keeps track of the presentation state while AVPlayer plays it. Its AVPlayerItemTimeControl category exposes methods such as
- (CMTime)currentTime;
- (void)seekToTime:(CMTime)time completionHandler:(void (^_Nullable)(BOOL finished))completionHandler;
for controlling playback timing. An AVPlayerItem is composed of one or more tracks, each modeled by AVPlayerItemTrack.
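For example, given an existing AVPlayerItem (playerItem below is assumed), these methods report the current position and seek back to the start:
CMTime now = [playerItem currentTime];
NSLog(@"current time: %f", CMTimeGetSeconds(now));
[playerItem seekToTime:kCMTimeZero completionHandler:^(BOOL finished) {
    if (finished) {
        NSLog(@"seek completed");
    }
}];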
Playing a video from the bundle
- (void)viewDidLoad {
    [super viewDidLoad];
    // 1. Build the URL
    NSURL *assetURL = [[NSBundle mainBundle] URLForResource:@"hubblecast" withExtension:@"m4v"];
    // 2. Create the AVAsset
    AVAsset *asset = [AVAsset assetWithURL:assetURL];
    // 3. Create the AVPlayerItem; keys lists the asset properties to load automatically
    NSArray *keys = @[@"tracks", @"duration"];
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset automaticallyLoadedAssetKeys:keys];
    // 4. Create an AVPlayer pointing at the playerItem
    self.player = [AVPlayer playerWithPlayerItem:playerItem];
    // 5. Create the layer that displays the video
    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    // Add the layer to the view; in practice you usually use a custom view whose backing layer is an AVPlayerLayer
    [self.view.layer addSublayer:playerLayer];
}
AVPlayerItem has a property named status. When a player item is first created its status is AVPlayerItemStatusUnknown, meaning the media has not yet been loaded into the playback queue.
typedef NS_ENUM(NSInteger, AVPlayerItemStatus) {
    AVPlayerItemStatusUnknown,
    AVPlayerItemStatusReadyToPlay,
    AVPlayerItemStatusFailed
};
Playback can begin only after the status changes to AVPlayerItemStatusReadyToPlay, so you need to observe changes to the status property. Adding the observer:
[self.playerItem addObserver:self
                  forKeyPath:@"status"
                     options:0
                     context:&PlayerItemStatusContext];
- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {
    if (context == &PlayerItemStatusContext) {
        dispatch_async(dispatch_get_main_queue(), ^{
            [self.playerItem removeObserver:self forKeyPath:@"status"];
            if (self.playerItem.status == AVPlayerItemStatusReadyToPlay) {
            }
        });
    }
}
Time handling
Both AVPlayer and AVPlayerItem are time-based objects. AV Foundation represents time with the CMTime structure:
typedef struct {
    CMTimeValue value;      /* value/timescale = seconds. */
    CMTimeScale timescale;
    CMTimeFlags flags;
    CMTimeEpoch epoch;
} CMTime;
value is a 64-bit integer and timescale is a 32-bit integer; in the time representation they act as the numerator and the denominator respectively.
CMTime halfSecond = CMTimeMake(1, 2);   // 1/2 = 0.5s
CMTime fiveSeconds = CMTimeMake(5, 1);  // 5/1 = 5s
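A CMTime can also be created from a number of seconds and combined with the Core Media arithmetic functions, for example:
// 3 seconds expressed with a nanosecond timescale
CMTime threeSeconds = CMTimeMakeWithSeconds(3.0, NSEC_PER_SEC);
CMTime total = CMTimeAdd(threeSeconds, CMTimeMake(5, 1)); // 3s + 5s
NSLog(@"total = %f seconds", CMTimeGetSeconds(total));    // prints 8.0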
The player
- Creating the video view
Create a view for displaying the video content.
THPlayerView.h
#import "THTransport.h" #协议
@class AVPlayer;
@interface THPlayerView : UIView
- (id)initWithPlayer:(AVPlayer *)player; // takes a reference to the current AVPlayer instance
@property (nonatomic, readonly) id <THTransport> transport;
@end
THPlayerView.m
#import "THPlayerView.h"
#import "THOverlayView.h"
#import <AVFoundation/AVFoundation.h>
@interface THPlayerView ()
@property (strong, nonatomic) THOverlayView *overlayView; // 1
@end
@implementation THPlayerView
+ (Class)layerClass { // 2 Return AVPlayerLayer as the view's backing layer class
return [AVPlayerLayer class];
}
- (id)initWithPlayer:(AVPlayer *)player {
    self = [super initWithFrame:CGRectZero];               // 3
    if (self) {
        self.backgroundColor = [UIColor blackColor];
        self.autoresizingMask = UIViewAutoresizingFlexibleHeight |
                                UIViewAutoresizingFlexibleWidth;
        [(AVPlayerLayer *) [self layer] setPlayer:player]; // 4 Point the AVPlayerLayer at the player so its video output is rendered in this view
        // 5 Load the overlay view that was built in a xib
        [[NSBundle mainBundle] loadNibNamed:@"THOverlayView"
                                      owner:self
                                    options:nil];
        [self addSubview:_overlayView];
    }
    return self;
}
- (void)layoutSubviews {
[super layoutSubviews];
self.overlayView.frame = self.bounds;
}
- (id <THTransport>)transport {
return self.overlayView;
}
@end
- The video controller
This is where the playback API calls are handled.
@interface THPlayerController : NSObject
- (id)initWithURL:(NSURL *)assetURL;
@property (strong, nonatomic, readonly) UIView *view;
@end
#import "THPlayerController.h"
#import <AVFoundation/AVFoundation.h>
#import "THTransport.h"
#import "THPlayerView.h"
#import "AVAsset+THAdditions.h"
#import "UIAlertView+THAdditions.h"
#import "THNotifications.h"
// AVPlayerItem's status property
#define STATUS_KEYPATH @"status"
// Refresh interval for timed observations of AVPlayer
#define REFRESH_INTERVAL 0.5f
// Define this constant for the key-value observation context.
static const NSString *PlayerItemStatusContext;
@interface THPlayerController () <THTransportDelegate>
@property (strong, nonatomic) AVAsset *asset;
@property (strong, nonatomic) AVPlayerItem *playerItem;
@property (strong, nonatomic) AVPlayer *player;
@property (strong, nonatomic) THPlayerView *playerView;
@property (weak, nonatomic) id <THTransport> transport; // the object that implements the THTransport protocol
@property (strong, nonatomic) id timeObserver;
@property (strong, nonatomic) id itemEndObserver;
@property (assign, nonatomic) float lastPlaybackRate;
@end
@implementation THPlayerController
#pragma mark - Setup
- (id)initWithURL:(NSURL *)assetURL {
    self = [super init];
    if (self) {
        _asset = [AVAsset assetWithURL:assetURL]; // 1 Create the AVAsset from the URL passed in
        [self prepareToPlay];
    }
    return self;
}
- (void)prepareToPlay {
    NSArray *keys = @[
        @"tracks",
        @"duration",
        @"commonMetadata",
        @"availableMediaCharacteristicsWithMediaSelectionOptions"
    ];
    self.playerItem = [AVPlayerItem playerItemWithAsset:self.asset // 2 Automatically load the listed asset properties (they live in different AVAsset categories)
                         automaticallyLoadedAssetKeys:keys];
    [self.playerItem addObserver:self                              // 3 Observe changes to the item's status
                      forKeyPath:STATUS_KEYPATH
                         options:0
                         context:&PlayerItemStatusContext];
    self.player = [AVPlayer playerWithPlayerItem:self.playerItem];        // 4 Create an AVPlayer for the item
    self.playerView = [[THPlayerView alloc] initWithPlayer:self.player];  // 5 Create the view whose backing layer is an AVPlayerLayer, passing it the player
    self.transport = self.playerView.transport; // Wire the transport to the controller; THTransport is a protocol whose delegate property is typed to another protocol
    self.transport.delegate = self;             // THTransportDelegate
}
- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {
    if (context == &PlayerItemStatusContext) {
        dispatch_async(dispatch_get_main_queue(), ^{                         // 1 Make sure the work runs on the main queue
            [self.playerItem removeObserver:self forKeyPath:STATUS_KEYPATH]; // Remove the one-shot observer
            if (self.playerItem.status == AVPlayerItemStatusReadyToPlay) {
                // 2 Set up the time observers
                [self addPlayerItemTimeObserver];
                [self addItemEndObserverForPlayerItem];
                CMTime duration = self.playerItem.duration;
                // 3 Synchronize the transport's time display with the current time and total duration
                [self.transport setCurrentTime:CMTimeGetSeconds(kCMTimeZero)
                                      duration:CMTimeGetSeconds(duration)];
                // 4 Show the title read from the asset's metadata (AVMetadataItem, exposed by an AVAsset category)
                [self.transport setTitle:self.asset.title];
                [self.player play];                                          // 5 Start playback
            } else {
                [UIAlertView showAlertWithTitle:@"Error"
                                        message:@"Failed to load video"];
            }
        });
    }
}
#pragma mark - Time Observers
/*
addPeriodicTimeObserverForInterval:(CMTime)interval
queue:(nullable dispatch_queue_t)queue
usingBlock:(void (^)(CMTime time))block;
*/
- (void)addPlayerItemTimeObserver {
    // Create 0.5 second refresh interval - REFRESH_INTERVAL == 0.5
    CMTime interval =
        CMTimeMakeWithSeconds(REFRESH_INTERVAL, NSEC_PER_SEC); // 1 Build the CMTime interval
    // Main dispatch queue
    dispatch_queue_t queue = dispatch_get_main_queue();        // 2 Deliver updates on the main queue
    // Create callback block for time observer
    __weak THPlayerController *weakSelf = self;                // 3 Define the callback block
    void (^callback)(CMTime time) = ^(CMTime time) {
        NSTimeInterval currentTime = CMTimeGetSeconds(time);
        NSTimeInterval duration = CMTimeGetSeconds(weakSelf.playerItem.duration);
        [weakSelf.transport setCurrentTime:currentTime duration:duration]; // 4 Push the times to the transport UI
    };
    // Add observer and store the returned token for later removal
    self.timeObserver =                                        // 5
        [self.player addPeriodicTimeObserverForInterval:interval
                                                  queue:queue
                                             usingBlock:callback];
}
/*
 * Observe when the item plays to its end
 */
- (void)addItemEndObserverForPlayerItem {
    NSString *name = AVPlayerItemDidPlayToEndTimeNotification;
    NSOperationQueue *queue = [NSOperationQueue mainQueue];
    __weak THPlayerController *weakSelf = self;                // 1
    void (^callback)(NSNotification *note) = ^(NSNotification *notification) {
        [weakSelf.player seekToTime:kCMTimeZero                // 2 Seek back to the beginning
                  completionHandler:^(BOOL finished) {
            [weakSelf.transport playbackComplete];             // 3 Tell the transport UI that playback finished
        }];
    };
    self.itemEndObserver =                                     // 4 Keep the token so the observer can be removed in dealloc
        [[NSNotificationCenter defaultCenter] addObserverForName:name
                                                          object:self.playerItem
                                                           queue:queue
                                                      usingBlock:callback];
}
#pragma mark - THTransportDelegate Methods
- (void)play {
[self.player play];
}
- (void)pause {
self.lastPlaybackRate = self.player.rate;
[self.player pause];
}
- (void)stop {
[self.player setRate:0.0f];
[self.transport playbackComplete];
}
- (void)jumpedToTime:(NSTimeInterval)time {
[self.player seekToTime:CMTimeMakeWithSeconds(time, NSEC_PER_SEC)];
}
/*
 The UISlider control events map to these transport callbacks:
 UIControlEventTouchDown     --> scrubbingDidStart
 UIControlEventValueChanged  --> scrubbedToTime
 UIControlEventTouchUpInside --> scrubbingDidEnd
 */
- (void)scrubbingDidStart {                   // 1 Remember the current rate so it can be restored after scrubbing
    self.lastPlaybackRate = self.player.rate; // Pause playback and remove the periodic time observer
    [self.player pause];
    [self.player removeTimeObserver:self.timeObserver];
    self.timeObserver = nil;
}
- (void)scrubbedToTime:(NSTimeInterval)time { // 2 Cancel any pending seek requests, then seek with zero tolerance for precise positioning
    [self.playerItem cancelPendingSeeks];
    [self.player seekToTime:CMTimeMakeWithSeconds(time, NSEC_PER_SEC)
            toleranceBefore:kCMTimeZero
             toleranceAfter:kCMTimeZero];
}
- (void)scrubbingDidEnd {                     // 3 Re-add the time observer and resume playback; lastPlaybackRate == 0 means it was paused, > 0 means it was playing
    [self addPlayerItemTimeObserver];
    if (self.lastPlaybackRate > 0.0f) {       // The video was playing before the slider drag started
        [self.player play];
    }
}
#pragma mark - Housekeeping
- (UIView *)view {
return self.playerView;
}
- (void)dealloc {
if (self.itemEndObserver) { // 5
NSNotificationCenter *nc = [NSNotificationCenter defaultCenter];
[nc removeObserver:self.itemEndObserver
name:AVPlayerItemDidPlayToEndTimeNotification
object:self.player.currentItem];
self.itemEndObserver = nil;
}
}
@end
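A presenting view controller only needs the public interface shown above; a hypothetical usage sketch (playerController is assumed to be a strong property of that view controller):
NSURL *videoURL = [[NSBundle mainBundle] URLForResource:@"hubblecast" withExtension:@"m4v"];
self.playerController = [[THPlayerController alloc] initWithURL:videoURL];
self.playerController.view.frame = self.view.bounds;
[self.view addSubview:self.playerController.view];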
The overlay view embedded in the main view is where the UI events interact with the video controller; the two never hold direct references to each other and communicate only through the protocol and its delegate.
The THOverlayView view
#import "THFilmstripView.h"
#import "THTransport.h"
@interface THOverlayView : UIView <THTransport>
@property (weak, nonatomic) IBOutlet UINavigationBar *navigationBar;
@property (weak, nonatomic) IBOutlet UIToolbar *toolbar;
@property (weak, nonatomic) IBOutlet UIButton *filmstripToggleButton;
@property (weak, nonatomic) IBOutlet UIButton *togglePlaybackButton;
@property (weak, nonatomic) IBOutlet UILabel *currentTimeLabel;
@property (weak, nonatomic) IBOutlet UILabel *remainingTimeLabel;
@property (weak, nonatomic) IBOutlet UISlider *scrubberSlider;
@property (weak, nonatomic) IBOutlet UIView *infoView;
@property (weak, nonatomic) IBOutlet UILabel *scrubbingTimeLabel;
@property (weak, nonatomic) IBOutlet THFilmstripView *filmStripView;
@property (weak, nonatomic) id <THTransportDelegate> delegate;
- (IBAction)toggleFilmstrip:(id)sender;
- (IBAction)toggleControls:(id)sender;
- (IBAction)togglePlayback:(UIButton *)sender;
- (IBAction)closeWindow:(id)sender;
- (void)setCurrentTime:(NSTimeInterval)time;
@end
#import "THOverlayView.h"
#import "UIView+THAdditions.h"
#import "NSTimer+Additions.h"
@interface THOverlayView () <THSubtitleViewControllerDelegate>
@property (nonatomic) BOOL controlsHidden;
@property (nonatomic) BOOL filmstripHidden;
@property (strong, nonatomic) NSArray *excludedViews;
@property (nonatomic, assign) CGFloat infoViewOffset;
@property (strong, nonatomic) NSTimer *timer;
@property (assign) BOOL scrubbing;
@property (strong, nonatomic) NSArray *subtitles;
@property (copy, nonatomic) NSString *selectedSubtitle;
@property (assign) CGFloat lastPlaybackRate;
@property (strong, nonatomic) MPVolumeView *volumeView;
@end
@implementation THOverlayView
- (void)awakeFromNib {
[super awakeFromNib];
self.filmstripHidden = YES;
self.excludedViews = @[self.navigationBar, self.toolbar, self.filmStripView];
UIImage *thumbNormalImage = [UIImage imageNamed:@"knob"];
UIImage *thumbHighlightedImage = [UIImage imageNamed:@"knob_highlighted"];
[self.scrubberSlider setThumbImage:thumbNormalImage forState:UIControlStateNormal];
[self.scrubberSlider setThumbImage:thumbHighlightedImage forState:UIControlStateHighlighted];
self.infoView.hidden = YES;
[self calculateInfoViewOffset];
// Set up actions
[self.scrubberSlider addTarget:self action:@selector(showPopupUI) forControlEvents:UIControlEventValueChanged];
[self.scrubberSlider addTarget:self action:@selector(hidePopupUI) forControlEvents:UIControlEventTouchUpInside];
[self.scrubberSlider addTarget:self action:@selector(unhidePopupUI) forControlEvents:UIControlEventTouchDown];
self.filmStripView.layer.shadowOffset = CGSizeMake(0, 2);
self.filmStripView.layer.shadowColor = [UIColor darkGrayColor].CGColor;
self.filmStripView.layer.shadowRadius = 2.0f;
self.filmStripView.layer.shadowOpacity = 0.8f;
[self resetTimer];
}
- (void)calculateInfoViewOffset {
[self.infoView sizeToFit];
self.infoViewOffset = ceilf(CGRectGetWidth(self.infoView.frame) / 2);
}
- (void)subtitleSelected:(NSString *)subtitle {
self.selectedSubtitle = subtitle;
[self.delegate subtitleSelected:subtitle];
if (self.lastPlaybackRate > 0) {
[self.delegate play];
}
}
- (void)setCurrentTime:(NSTimeInterval)time duration:(NSTimeInterval)duration {
NSInteger currentSeconds = ceilf(time);
double remainingTime = duration - time;
self.currentTimeLabel.text = [self formatSeconds:currentSeconds];
self.remainingTimeLabel.text = [self formatSeconds:remainingTime];
self.scrubberSlider.minimumValue = 0.0f;
self.scrubberSlider.maximumValue = duration;
self.scrubberSlider.value = time;
}
- (void)setScrubbingTime:(NSTimeInterval)time {
self.scrubbingTimeLabel.text = [self formatSeconds:time];
}
- (NSString *)formatSeconds:(NSInteger)value {
NSInteger seconds = value % 60;
NSInteger minutes = value / 60;
return [NSString stringWithFormat:@"%02ld:%02ld", (long) minutes, (long) seconds];
}
- (UILabel *)createTransportLabel {
UILabel *label = [[UILabel alloc] initWithFrame:CGRectZero];
label.backgroundColor = [UIColor clearColor];
//label.textColor = [UIColor whiteColor];
label.font = [UIFont boldSystemFontOfSize:15.0f];
label.text = @"00:00";
label.userInteractionEnabled = YES;
[label sizeToFit];
return label;
}
- (IBAction)toggleFilmstrip:(id)sender {
[UIView animateWithDuration:0.35 animations:^{
if (self.filmstripHidden) {
self.filmStripView.hidden = NO;
self.filmStripView.frameY = 0;
} else {
self.filmStripView.frameY -= self.filmStripView.frameHeight;
}
self.filmstripHidden = !self.filmstripHidden;
} completion:^(BOOL complete) {
if (self.filmstripHidden) {
self.filmStripView.hidden = YES;
}
}];
self.filmstripToggleButton.selected = !self.filmstripToggleButton.selected;
}
- (IBAction)toggleControls:(id)sender {
[UIView animateWithDuration:0.35 animations:^{
if (!self.controlsHidden) {
if (!self.filmstripHidden) {
[UIView animateWithDuration:0.35 animations:^{
self.filmStripView.frameY -= self.filmStripView.frameHeight;
self.filmstripHidden = YES;
self.filmstripToggleButton.selected = NO;
} completion:^(BOOL complete) {
self.filmStripView.hidden = YES;
[UIView animateWithDuration:0.35 animations:^{
self.navigationBar.frameY -= self.navigationBar.frameHeight;
self.toolbar.frameY += self.toolbar.frameHeight;
}];
}];
} else {
self.navigationBar.frameY -= self.navigationBar.frameHeight;
self.toolbar.frameY += self.toolbar.frameHeight;
}
} else {
self.navigationBar.frameY += self.navigationBar.frameHeight;
self.toolbar.frameY -= self.toolbar.frameHeight;
}
self.controlsHidden = !self.controlsHidden;
}];
}
- (IBAction)togglePlayback:(UIButton *)sender {
sender.selected = !sender.selected;
if (self.delegate) {
SEL callback = sender.selected ? @selector(play) : @selector(pause);
[self.delegate performSelector:callback];
}
}
- (IBAction)closeWindow:(id)sender {
[self.timer invalidate];
self.timer = nil;
[self.delegate stop];
self.filmStripView.hidden = YES;
[self.window.rootViewController dismissViewControllerAnimated:YES completion:nil];
}
- (void)showPopupUI {
self.infoView.hidden = NO;
CGRect trackRect = [self.scrubberSlider convertRect:self.scrubberSlider.bounds toView:nil];
CGRect thumbRect = [self.scrubberSlider thumbRectForBounds:self.scrubberSlider.bounds trackRect:trackRect value:self.scrubberSlider.value];
CGRect rect = self.infoView.frame;
rect.origin.x = (thumbRect.origin.x) - self.infoViewOffset + 16;
rect.origin.y = self.boundsHeight - 80;
self.infoView.frame = rect;
self.currentTimeLabel.text = @"-- : --";
self.remainingTimeLabel.text = @"-- : --";
[self setScrubbingTime:self.scrubberSlider.value];
[self.delegate scrubbedToTime:self.scrubberSlider.value];
}
- (void)unhidePopupUI {
self.infoView.hidden = NO;
self.infoView.alpha = 0.0f;
[UIView animateWithDuration:0.2f animations:^{
self.infoView.alpha = 1.0f;
}];
self.scrubbing = YES;
[self resetTimer];
[self.delegate scrubbingDidStart];
}
- (void)hidePopupUI {
[UIView animateWithDuration:0.3f animations:^{
self.infoView.alpha = 0.0f;
} completion:^(BOOL complete) {
self.infoView.alpha = 1.0f;
self.infoView.hidden = YES;
}];
self.scrubbing = NO;
[self.delegate scrubbingDidEnd];
}
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
[self resetTimer];
return ![self.excludedViews containsObject:touch.view] && ![self.excludedViews containsObject:touch.view.superview];
}
- (void)setCurrentTime:(NSTimeInterval)currentTime {
[self.delegate jumpedToTime:currentTime];
}
- (void)playbackComplete {
self.scrubberSlider.value = 0.0f;
self.togglePlaybackButton.selected = NO;
}
- (void)resetTimer {
[self.timer invalidate];
if (!self.scrubbing) {
self.timer = [NSTimer scheduledTimerWithTimeInterval:5.0 firing:^{
if (self.timer.isValid && !self.controlsHidden) {
[self toggleControls:nil];
}
}];
}
}
- (void)setTitle:(NSString *)title {
self.navigationBar.topItem.title = title ? title : @"Video Player";
}
@end
Subtitles
AV Foundation provides a reliable way to show and hide subtitles, and AVPlayerLayer renders them automatically. The available options are modeled by the AVMediaSelectionGroup and AVMediaSelectionOption classes:
NSString *mc = AVMediaCharacteristicLegible;                   // 1 The "legible" characteristic identifies subtitle options
AVMediaSelectionGroup *group =
    [self.asset mediaSelectionGroupForMediaCharacteristic:mc]; // 2 Ask the asset for the matching selection group
if (group) {
    NSMutableArray *subtitles = [NSMutableArray array];
    for (AVMediaSelectionOption *option in group.options) {
        [subtitles addObject:option.displayName];
        NSLog(@"%@", option.displayName);
    }
    // Make one of the options (e.g. the first) the active subtitle track
    AVMediaSelectionOption *option = group.options.firstObject;
    [self.playerItem selectMediaOption:option inMediaSelectionGroup:group];
}
Before doing this, availableMediaCharacteristicsWithMediaSelectionOptions must be included in the keys passed to the player item's automaticallyLoadedAssetKeys (as in prepareToPlay above).
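To see which characteristics an asset offers at all (audible, legible, visual), you can inspect that loaded property directly; a small sketch, assuming self.asset has finished loading:
for (NSString *characteristic in self.asset.availableMediaCharacteristicsWithMediaSelectionOptions) {
    // Typically AVMediaCharacteristicAudible, AVMediaCharacteristicLegible, or AVMediaCharacteristicVisual
    NSLog(@"available characteristic: %@", characteristic);
}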
Generating thumbnails
AVAssetImageGenerator is the AV Foundation class for generating thumbnails. It provides two methods for retrieving images from a video asset.
// Generates a sequence of images at the times specified by the first parameter.
- (void)generateCGImagesAsynchronouslyForTimes:(NSArray<NSValue *> *)requestedTimes completionHandler:(AVAssetImageGeneratorCompletionHandler)handler;
// Captures a single image at (or near) the specified time.
- (nullable CGImageRef)copyCGImageAtTime:(CMTime)requestedTime actualTime:(nullable CMTime *)actualTime error:(NSError * _Nullable * _Nullable)outError;
@property (strong, nonatomic) AVAssetImageGenerator *imageGenerator;
self.imageGenerator =                                       // 1 Create a generator for the asset
    [AVAssetImageGenerator assetImageGeneratorWithAsset:self.asset];
// Generate the @2x equivalent
self.imageGenerator.maximumSize = CGSizeMake(200.0f, 0.0f); // 2 Constrain the width to 200; a height of 0 lets the height follow the video's aspect ratio
CMTime duration = self.asset.duration;
NSMutableArray *times = [NSMutableArray array];             // 3 Build the collection of CMTime values, dividing the full duration into 20 steps
CMTimeValue increment = duration.value / 20;
CMTimeValue currentValue = 2.0 * duration.timescale;
while (currentValue <= duration.value) {
    CMTime time = CMTimeMake(currentValue, duration.timescale);
    [times addObject:[NSValue valueWithCMTime:time]];
    currentValue += increment;
}
__block NSUInteger imageCount = times.count;                // 4
__block NSMutableArray *images = [NSMutableArray array];
AVAssetImageGeneratorCompletionHandler handler;             // 5
The completion handler:
handler = ^(CMTime requestedTime,
CGImageRef imageRef,
CMTime actualTime,
AVAssetImageGeneratorResult result,
NSError *error) {
if (result == AVAssetImageGeneratorSucceeded) { // 6
UIImage *image = [UIImage imageWithCGImage:imageRef];
[images addObject: image];
} else {
NSLog(@"Error: %@", [error localizedDescription]);
}
};
[self.imageGenerator generateCGImagesAsynchronouslyForTimes:times // 8
completionHandler:handler];
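For a single thumbnail, the synchronous copyCGImageAtTime:actualTime:error: method mentioned above is often enough; a minimal sketch (the returned CGImage follows the Copy rule and must be released):
NSError *error = nil;
CMTime actualTime = kCMTimeZero;
CGImageRef imageRef = [self.imageGenerator copyCGImageAtTime:CMTimeMakeWithSeconds(2.0, NSEC_PER_SEC)
                                                  actualTime:&actualTime
                                                       error:&error];
if (imageRef) {
    UIImage *thumbnail = [UIImage imageWithCGImage:imageRef]; // use the thumbnail as needed
    CGImageRelease(imageRef);
} else {
    NSLog(@"Thumbnail failed: %@", error.localizedDescription);
}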