
Extracting the Frames of a Video on iOS

2016-01-30  LeeYZ

0x00 Requirement

Split a video into images. A video file is essentially a sequence of frames, i.e. still images. I won't go over video file formats and structure here; there are plenty of good articles online, so please Google them yourself.

0x01 Implementation (AVAssetImageGenerator)

#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

/**
 *  Split a video file into frame images saved in the app sandbox
 *
 *  @param fileUrl        URL of the local video file
 *  @param fps            frame rate used when splitting out frames
 *  @param completedBlock called after all frames have been extracted
 */
- (void)splitVideo:(NSURL *)fileUrl fps:(float)fps completedBlock:(void (^)(void))completedBlock {
    if (!fileUrl) {
        return;
    }
    NSDictionary *optDict = @{AVURLAssetPreferPreciseDurationAndTimingKey : @NO};
    AVURLAsset *avasset = [[AVURLAsset alloc] initWithURL:fileUrl options:optDict];
    
    CMTime cmtime = avasset.duration;                   // duration of the asset as a CMTime
    Float64 durationSeconds = CMTimeGetSeconds(cmtime); // total duration in seconds
    
    NSMutableArray *times = [NSMutableArray array];
    Float64 totalFrames = durationSeconds * fps;        // total number of frames to extract
    CMTime timeFrame;
    for (int i = 1; i <= totalFrames; i++) {
        timeFrame = CMTimeMake(i, (int32_t)fps);        // frame i at the given frame rate
        NSValue *timeValue = [NSValue valueWithCMTime:timeFrame];
        [times addObject:timeValue];
    }
    
    NSLog(@"------- start");
    AVAssetImageGenerator *imgGenerator = [[AVAssetImageGenerator alloc] initWithAsset:avasset];
    // Request exact frame times; otherwise the generator may return a nearby frame instead
    imgGenerator.requestedTimeToleranceBefore = kCMTimeZero;
    imgGenerator.requestedTimeToleranceAfter = kCMTimeZero;
    NSString *cachePath = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES) lastObject];
    NSInteger timesCount = [times count];
    [imgGenerator generateCGImagesAsynchronouslyForTimes:times completionHandler:^(CMTime requestedTime, CGImageRef _Nullable image, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *_Nullable error) {
        // The handler is invoked once per requested time, on a background queue
        printf("current-----: %lld\n", requestedTime.value);
        switch (result) {
            case AVAssetImageGeneratorCancelled:
                NSLog(@"Cancelled");
                break;
            case AVAssetImageGeneratorFailed:
                NSLog(@"Failed: %@", error);
                break;
            case AVAssetImageGeneratorSucceeded: {
                NSString *filePath = [cachePath stringByAppendingPathComponent:[NSString stringWithFormat:@"%lld.png", requestedTime.value]];
                NSData *imgData = UIImagePNGRepresentation([UIImage imageWithCGImage:image]);
                [imgData writeToFile:filePath atomically:YES];
            }
                break;
        }
        // requestedTime.value is the frame index i, so the last callback carries value == timesCount;
        // checking here (not only on success) lets the completion block fire even if that last frame fails
        if (requestedTime.value == timesCount) {
            NSLog(@"completed");
            if (completedBlock) {
                completedBlock();
            }
        }
    }];
}
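
For reference, a call site might look like the following sketch; the bundled file name sample.mp4 is just a placeholder for illustration, not something from the original code.

// Hypothetical call site: split a bundled video at 30 fps ("sample.mp4" is a placeholder name)
NSURL *videoURL = [[NSBundle mainBundle] URLForResource:@"sample" withExtension:@"mp4"];
[self splitVideo:videoURL fps:30 completedBlock:^{
    // The completion handler runs on the generator's background queue,
    // so hop back to the main queue before touching any UI
    dispatch_async(dispatch_get_main_queue(), ^{
        NSLog(@"all frames written to the Caches directory");
    });
}];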

0x10 Extracting a Single Frame

- (nullable CGImageRef)copyCGImageAtTime:(CMTime)requestedTime actualTime:(nullable CMTime *)actualTime error:(NSError * __nullable * __nullable)outError CF_RETURNS_RETAINED;

requestedTime: the time of the frame to grab, built with CMTimeMake(a, b)
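
A minimal sketch of calling it synchronously, assuming an existing videoURL and picking the 10-second mark purely for illustration:

AVAsset *asset = [AVAsset assetWithURL:videoURL]; // videoURL: local video URL (assumed)
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
generator.requestedTimeToleranceBefore = kCMTimeZero;
generator.requestedTimeToleranceAfter = kCMTimeZero;

NSError *error = nil;
CMTime actualTime;
// Frame at the 10-second mark, expressed with a timescale of 600
CGImageRef frame = [generator copyCGImageAtTime:CMTimeMake(6000, 600) actualTime:&actualTime error:&error];
if (frame) {
    UIImage *image = [UIImage imageWithCGImage:frame];
    CGImageRelease(frame); // "copy" in the method name means we own the returned CGImageRef
    NSLog(@"frame at %f s, size %@", CMTimeGetSeconds(actualTime), NSStringFromCGSize(image.size));
} else {
    NSLog(@"failed: %@", error);
}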

0x11 The CMTime Struct

/*!
 @typedef CMTime
 @abstract Rational time value represented as int64/int32.
*/
typedef struct
{
    CMTimeValue value;     /*! @field value The value of the CMTime. value/timescale = seconds. */
    CMTimeScale timescale; /*! @field timescale The timescale of the CMTime. value/timescale = seconds. */
    CMTimeFlags flags;     /*! @field flags The flags, eg. kCMTimeFlags_Valid, kCMTimeFlags_PositiveInfinity, etc. */
    CMTimeEpoch epoch;     /*! @field epoch Differentiates between equal timestamps that are actually different because
                               of looping, multi-item sequencing, etc.
                               Will be used during comparison: greater epochs happen after lesser ones.
                               Additions/subtraction is only possible within a single epoch,
                               however, since epoch length may be unknown/variable. */
} CMTime;

CMTime is a struct designed specifically for representing media time.
value by itself is not a number of seconds.
timescale is the number of time units per second; for frame-based work you can think of it as the frame rate.
To get seconds: value / timescale = seconds.
Create a CMTime with CMTimeMake(a, b): frame a, at b frames per second.
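
A quick sanity check on value / timescale = seconds (the numbers below are arbitrary examples):

CMTime t1 = CMTimeMake(90, 30);              // frame 90 at 30 fps -> 90 / 30 = 3 seconds
CMTime t2 = CMTimeMakeWithSeconds(3.0, 600); // the same instant, stored with a timescale of 600
NSLog(@"t1 = %f s, t2 = %f s", CMTimeGetSeconds(t1), CMTimeGetSeconds(t2)); // both print 3.000000
CMTimeShow(t1);                              // prints {90/30 = 3.000}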
