
ImgToVideo

2016-08-24  MatthewSp

Based on some code I found ages ago on a foreign site, with a bit of my own understanding added.

It can turn multiple images into one video, or a single image into a video on its own. As I recall, I originally pulled this code in for multi-video composition; in hindsight that approach was a bit clumsy, but the low-level methods it uses are still needed later for video recording and compositing.

The sample code below generates a video from a single image and has been simplified through a few revisions; to handle multiple images you can add a few parameters yourself along the same lines:


+ (void)saveVideoToPhotosWithImage:(UIImage *)image
                              item:(VideoItem *)item
                 withCallbackBlock:(SuccessBlock)callbackBlock
{
    image = [image normalizedImage:UIImageOrientationLeft];
    CGSize imgSize = CGSizeMake(CGImageGetWidth(image.CGImage), CGImageGetHeight(image.CGImage));
    CGSize videoOutputSize = imgSize;
    //Compute the ratio: ratio > 1 means the image is wider than expected, < 1 means taller than expected
    float ratio = item.renderSize.width * imgSize.height / item.renderSize.height / imgSize.width;
    float scale = 1;
    float maxRatio = 3;
    float minRatio = 1/maxRatio;
    
    NSLog(@"deviceheight:%f version:%f",[DeviceConfig config].Height * [DeviceConfig config].Scale,[DeviceConfig config].SysVersion);
    //Not yet sure why some of these fail to display; for now, treat them as older devices/OS versions and handle them separately
    if ([DeviceConfig config].Height * [DeviceConfig config].Scale <= 1136 || [DeviceConfig config].SysVersion < 9) {
        //
        maxRatio = 2;
        minRatio = 1/maxRatio;
    }
    
    //Find the best-matching display scale
    if (ratio <= maxRatio && ratio >= minRatio) {
        scale = MIN(imgSize.width/item.renderSize.width, imgSize.height/item.renderSize.height);
    } else {
        scale = MAX(imgSize.width/item.renderSize.width, imgSize.height/item.renderSize.height);
    }

    videoOutputSize = CGSizeMake(round(imgSize.width/scale), round(imgSize.height/scale));

//    //  Because we render in landscape; photos are usually shot in portrait and are taller than wide
//    if (videoOutputSize.width != item.renderSize.width && videoOutputSize.height == item.renderSize.height) {
////    if (videoOutputSize.width < videoOutputSize.height) {
//        //Landscape image: if height == 720, rotate the image
////        image = [[UIImage alloc] initWithCGImage:image.CGImage scale:image.scale orientation:UIImageOrientationLeft];
//        image = [image normalizedImage:UIImageOrientationLeft];
//        videoOutputSize = CGSizeMake(videoOutputSize.height, videoOutputSize.width);
//    }

    videoOutputSize = [self correctSize:videoOutputSize];
    
    NSLog(@"image2video :ouputSize.Width = %f, Height=%f", videoOutputSize.width, videoOutputSize.height);
    UIImage * outImage = [ImagesToVideo OriginImage:image scaleToSize:videoOutputSize];
    
    [ImagesToVideo writeImageToMovie:outImage
                              toPath:item.path
                                size:videoOutputSize
                                 fps:item.duration.timescale
                  animateTransitions:YES
                          repeatTime:CMTimeGetSeconds(item.duration)
                   withCallbackBlock:^(BOOL success,CGFloat progress) {
                       item.generateProgress = progress;
                       if(success && progress>=1)
                       {
                           item.status = YES;
                       }
                       else if(success)
                       {
                           item.lastGenerateInterval = [CommonUtil getDateTicks:[NSDate date]];
                           //UISaveVideoAtPathToSavedPhotosAlbum([[NSURL fileURLWithPath:item.path] path], self, nil, nil);
                           
                       }
                       if (callbackBlock) {
                           callbackBlock(success,progress);
                       }
                   }];
}

The above is just preprocessing of the image (size and aspect-ratio checks and the like), nothing special. One thing to watch is the image dimensions: the method below produces a corrupted, garbled video when the image width is not a multiple of 16.
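The correctSize: helper called above is not shown in the post; as a rough sketch of what such a helper might look like (the name reuse and the rounding direction are my assumptions), it just snaps both dimensions to a multiple of 16:

+ (CGSize)correctSize:(CGSize)size
{
    // Round each dimension down to the nearest multiple of 16 so the
    // H.264 encoder does not produce a garbled picture.
    CGFloat width  = floor(size.width  / 16.0) * 16.0;
    CGFloat height = floor(size.height / 16.0) * 16.0;
    return CGSizeMake(MAX(width, 16.0), MAX(height, 16.0));
}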


+ (void)writeImageToMovie:(UIImage *)image
                   toPath:(NSString*)path
                     size:(CGSize)size
                      fps:(int)fps
       animateTransitions:(BOOL)shouldAnimateTransitions
               repeatTime:(CGFloat)repeatTime
        withCallbackBlock:(SuccessBlock)callbackBlock
{
    
    NSLog(@"image2video :ready to write video: %@(%@)", [path lastPathComponent],NSStringFromCGSize(size));
    NSError *error = nil;
    if ([[NSFileManager defaultManager] fileExistsAtPath:path]) {
        NSLog(@"image2video :删除同名文件!!!%@",[path lastPathComponent]);
        if ([[NSFileManager defaultManager] removeItemAtPath:path error:&error])
        {
            //            NSLog(@"删除文件成功!!!");
        } else {
            NSLog(@"image2video :删除文件失败!!!%@",[error description]);
        }
    }
    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:path]
                                                           fileType:AVFileTypeMPEG4
                                                              error:&error];
    if (error) {
        NSLog(@"image2video :创建文件%@失败!!!%@",[path lastPathComponent],[error description]);
        if (callbackBlock) {
            callbackBlock(NO,0);
        }
        return;
    }
    NSParameterAssert(videoWriter);
    
    NSDictionary *videoSettings = @{AVVideoCodecKey: AVVideoCodecH264,
                                    AVVideoWidthKey: [NSNumber numberWithInt:size.width],
                                    AVVideoHeightKey: [NSNumber numberWithInt:size.height]};
    
    AVAssetWriterInput* writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                         outputSettings:videoSettings];
        NSLog(@"image2video: ready 1");
    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                                                                                     sourcePixelBufferAttributes:nil];
    [videoWriter addInput:writerInput];
    
    //Start a session:
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];
    NSLog(@"image2video: ready 2");
    CVPixelBufferRef buffer;
//    CVPixelBufferPoolCreatePixelBuffer(NULL, adaptor.pixelBufferPool, &buffer);
//    
//    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);
    
    CMTime presentTime = CMTimeMake(0, fps);
        NSLog(@"image2video: ready 3");
    int i = 0;
    
    while (1) {
        if(writerInput.readyForMoreMediaData){
            
            presentTime = CMTimeMake(i, fps);
            
            //may not be a whole number
            if (i >= 1) {
                buffer = NULL;
            } else {
                buffer = [ImagesToVideo pixelBufferFromCGImage2:image.CGImage size:size];
                    NSLog(@"image2video: ready 4");
            }
            
            if (buffer) {
                //append buffer
                
                BOOL appendSuccess = [ImagesToVideo append2Adapter:adaptor
                                                       pixelBuffer:buffer
                                                            atTime:presentTime
                                                         withInput:writerInput];
                    NSLog(@"image2video: ready 5");
                i++;
                CVPixelBufferRelease(buffer);
                if (!appendSuccess) {
                    if (callbackBlock) {
                        callbackBlock(NO,1);
                    }
                }
                else
                {
                    if (callbackBlock) {
                        callbackBlock(YES,(CGFloat)i/1);
                    }
                }
            } else {
                
                //Finish the session:
                [writerInput markAsFinished];
                
                [videoWriter finishWritingWithCompletionHandler:^{
                    NSLog (@"image2video : %@ write done ",[path lastPathComponent]);
                    //NSLog(@"Successfully closed video writer");
                    if (videoWriter.status == AVAssetWriterStatusCompleted) {
                        
//                        [self writeCompletedFlagFile:path];
                        
                        if (callbackBlock) {
                            callbackBlock(YES,1);
                        }
                    } else {
                        if (callbackBlock) {
                            callbackBlock(NO,1);
                        }
                    }
                }];
                    NSLog(@"image2video: ready 6");
                CVPixelBufferPoolRelease(adaptor.pixelBufferPool);
                break;
            }
        }
        
    }
}

Core function No. 1:
Open an AVAssetWriter and do some pre-configuration;
videoSettings specifies the output file, i.e. the target file's width and height;
AVAssetWriterInputPixelBufferAdaptor provides the pixel-buffer cache;
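As a side note, and hinted at by the commented-out CVPixelBufferPoolCreatePixelBuffer call in the loop above, the adaptor can also hand out buffers from its own pool if you pass sourcePixelBufferAttributes when creating it. A rough sketch of that variant, reusing size and writerInput from writeImageToMovie: above (my assumption; the pool is only populated once writing has started):

// Create the adaptor with explicit source pixel buffer attributes so that
// adaptor.pixelBufferPool is populated, then draw into pooled buffers
// instead of calling CVPixelBufferCreate for every frame.
NSDictionary *bufferAttributes = @{
    (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32ARGB),
    (id)kCVPixelBufferWidthKey           : @((int)size.width),
    (id)kCVPixelBufferHeightKey          : @((int)size.height)
};
AVAssetWriterInputPixelBufferAdaptor *adaptor =
    [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                             sourcePixelBufferAttributes:bufferAttributes];

// ...after [videoWriter startWriting] the pool should be available:
CVPixelBufferRef pooledBuffer = NULL;
CVReturn status = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                                     adaptor.pixelBufferPool,
                                                     &pooledBuffer);
if (status != kCVReturnSuccess || pooledBuffer == NULL) {
    NSLog(@"image2video : could not get a pooled pixel buffer (%d)", status);
}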


+ (CVPixelBufferRef)pixelBufferFromCGImage2:(CGImageRef)image
                                       size:(CGSize)imageSize
{
    NSLog(@"image2video : ready buffer :%@",NSStringFromCGSize(imageSize));
    
    NSDictionary *options = @{(id)kCVPixelBufferCGImageCompatibilityKey: @YES,
                              (id)kCVPixelBufferCGBitmapContextCompatibilityKey: @YES,
                              (id)kCVPixelBufferWidthKey:[NSNumber numberWithInt:imageSize.width],
                              (id)kCVPixelBufferHeightKey:[NSNumber numberWithInt:imageSize.height]};
    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, imageSize.width,
                                          imageSize.height, kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef) options,
                                          &pxbuffer);
    if(status!=kCVReturnSuccess)
    {
        NSLog(@"image2video : ready buffer failure:%d",status);
    }
    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    //The real output width/height are these two values
    CGContextRef context = CGBitmapContextCreate(pxdata, imageSize.width,
                                                 imageSize.height, 8, 4*imageSize.width, rgbColorSpace,
                                                 kCGImageAlphaNoneSkipFirst);
    //Static image video: the image does not move
    CGRect rec = CGRectMake(0,0,CGImageGetWidth(image),CGImageGetHeight(image));
    
    CGContextDrawImage(context, rec, image);
    
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);
    
    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
    
    return pxbuffer;
}

+ (BOOL)append2Adapter:(AVAssetWriterInputPixelBufferAdaptor*)adaptor
           pixelBuffer:(CVPixelBufferRef)buffer
                atTime:(CMTime)presentTime
             withInput:(AVAssetWriterInput*)writerInput
{
    while (!writerInput.readyForMoreMediaData) {
        [NSThread sleepForTimeInterval:5];
    }
    BOOL result = [adaptor appendPixelBuffer:buffer withPresentationTime:presentTime];
    return result;
    
}

Core function No. 2:
The crux of the whole approach is creating a CVPixelBufferRef from the image and then appending it to the AVAssetWriterInputPixelBufferAdaptor;
If you want the image to animate, you can add a few parameters to pixelBufferFromCGImage2 so that the rect passed to CGContextDrawImage changes from frame to frame;
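A rough sketch of that idea, as I understand it (the frameIndex/totalFrames parameters and the horizontal pan are my own additions for illustration, not part of the original code):

// Variant of pixelBufferFromCGImage2 that takes a frame index and pans the
// image horizontally over the clip, assuming the source image is wider than
// the output size.
+ (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image
                                      size:(CGSize)imageSize
                                frameIndex:(int)frameIndex
                               totalFrames:(int)totalFrames
{
    NSDictionary *options = @{(id)kCVPixelBufferCGImageCompatibilityKey: @YES,
                              (id)kCVPixelBufferCGBitmapContextCompatibilityKey: @YES};
    CVPixelBufferRef pxbuffer = NULL;
    CVPixelBufferCreate(kCFAllocatorDefault, imageSize.width, imageSize.height,
                        kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef)options, &pxbuffer);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, imageSize.width, imageSize.height,
                                                 8, CVPixelBufferGetBytesPerRow(pxbuffer),
                                                 rgbColorSpace, kCGImageAlphaNoneSkipFirst);

    // Move the image a little further left on every frame to fake a pan.
    CGFloat progress = totalFrames > 1 ? (CGFloat)frameIndex / (totalFrames - 1) : 0;
    CGFloat maxOffset = CGImageGetWidth(image) - imageSize.width;
    CGRect rect = CGRectMake(-maxOffset * progress, 0,
                             CGImageGetWidth(image), CGImageGetHeight(image));
    CGContextDrawImage(context, rect, image);

    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);
    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
    return pxbuffer;
}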
Inside core function No. 1, this snippet:
CMTime presentTime = CMTimeMake(0, fps);
NSLog(@"image2video: ready 3");
int i = 0;

while (1) {....}

In this snippet you can change fps to adjust the frame rate of the output video file. If you are writing a video of a static image, 1 fps is enough; if you plan to animate the image, 20-30 fps works better.

A. A problem follows from this: a 1 fps video effectively has only a single keyframe at the start of the file. If you later seek (SeekTime) into such a video you will inevitably get a black screen, and the same goes for grabbing snapshots from it. So for a static-image video it is better to use 5-10 fps to avoid this.
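A rough sketch of that adjustment, reusing the variables already defined in writeImageToMovie: above (the 0.05 s polling interval and the ceil rounding are my assumptions): write repeatTime * fps copies of the same buffer instead of a single frame, so seeking anywhere in the clip lands on actual picture data.

// Write ceil(repeatTime * fps) copies of the same buffer, one per frame.
int totalFrames = (int)ceil(repeatTime * fps);
CVPixelBufferRef buffer = [ImagesToVideo pixelBufferFromCGImage2:image.CGImage size:size];

for (int i = 0; i < totalFrames; i++) {
    // Block until the writer input can take more data.
    while (!writerInput.readyForMoreMediaData) {
        [NSThread sleepForTimeInterval:0.05];
    }
    CMTime presentTime = CMTimeMake(i, fps);
    if (![adaptor appendPixelBuffer:buffer withPresentationTime:presentTime]) {
        NSLog(@"image2video : append failed at frame %d", i);
        break;
    }
    if (callbackBlock) {
        callbackBlock(YES, (CGFloat)(i + 1) / totalFrames);
    }
}
CVPixelBufferRelease(buffer);

[writerInput markAsFinished];
[videoWriter finishWritingWithCompletionHandler:^{ /* same completion handling as above */ }];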

B. I will write more about CVPixelBufferRef in a later post. In fact, for image-to-video it is best not to use the approach above, or at least not CGContextDrawImage and friends, since CGImage is simply too costly for performance; CIImage is recommended instead. How to do that will be covered later, when I write about my experiments with video overlays.
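For reference, a minimal sketch of the CIImage route as I would attempt it (method name and pixel format are my assumptions, not the author's final approach): render the CIImage straight into the CVPixelBufferRef with a CIContext and skip CGContextDrawImage entirely.

// Requires CoreImage. Fill a pixel buffer from a CIImage via CIContext
// instead of Core Graphics; the context can be GPU-backed.
+ (CVPixelBufferRef)pixelBufferFromCIImage:(CIImage *)ciImage size:(CGSize)size
{
    NSDictionary *options = @{(id)kCVPixelBufferCGImageCompatibilityKey: @YES,
                              (id)kCVPixelBufferCGBitmapContextCompatibilityKey: @YES};
    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width, size.height,
                                          kCVPixelFormatType_32BGRA,
                                          (__bridge CFDictionaryRef)options, &pxbuffer);
    if (status != kCVReturnSuccess || pxbuffer == NULL) {
        return NULL;
    }
    // A shared, reusable CIContext; creating one per frame would be wasteful.
    static CIContext *ciContext = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        ciContext = [CIContext contextWithOptions:nil];
    });
    [ciContext render:ciImage toCVPixelBuffer:pxbuffer];
    return pxbuffer;
}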
