
A Pre-Recording Design for Live Photo Capture

2018-09-04  咸鱼有只喵

Introduction

First, consider what happens when you shoot a Live Photo: you tap the shutter, and the system presents you with a short video plus a cover still.
If it were just a video and a photo, we could simply call the pair a Live Photo. The hard part is this: of the roughly 3 seconds of video the system shows you, the first 1.5 seconds were captured before you pressed the shutter. That is the behavior we need pre-recording for.

Implementation Approach

The obvious idea is: start recording video the moment the user taps the Live button (or opens the camera app), stop recording 1.5 seconds after the shutter tap, and finally trim out the last 3 seconds. That can work, but the consequence is that if the user lingers on the live screen too long, the buffer on a low-memory device fills up and the app crashes.
After I asked on Stack Overflow, someone suggested maintaining a buffer that only ever holds the most recent 1.5 s of video, and stitching the pieces together once capture finishes.

So the question becomes: how should that buffer be designed? How do we keep only the most recent 1.5 seconds?

1. Setting up the buffer queue

After some thought, I decided that maintaining a queue of video segments is both the best fit and the easiest to implement.
Recording starts when the user taps the Live button, but with a twist: a timer stops the recording every 1.5 seconds, pushes the finished segment onto the buffer queue, and immediately starts recording again. Whenever the queue grows past 2 segments, the segment at the head of the queue is deleted.
This guarantees the buffer always contains the 1.5 seconds of video from just before the shutter tap, without ever letting the buffer fill up.
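The rolling buffer described above can be sketched as a tiny queue type. This is an illustrative sketch, not code from the project; `SegmentBuffer` and its members are invented names:

```swift
import Foundation

/// A minimal sketch of the rolling segment buffer: it keeps at most
/// `capacity` of the most recent segment URLs, so at any moment the
/// buffer covers roughly the last 1.5-3.0 seconds of footage.
struct SegmentBuffer {
    private(set) var segments: [URL] = []
    let capacity: Int

    init(capacity: Int = 2) {
        self.capacity = capacity
    }

    /// Called every 1.5 s when a finished segment is handed over.
    mutating func append(_ url: URL) {
        segments.append(url)
        // Drop the oldest segment (and its file) once we exceed capacity.
        while segments.count > capacity {
            let oldest = segments.removeFirst()
            try? FileManager.default.removeItem(at: oldest)
        }
    }
}
```

With capacity 2, appending a third segment evicts the first, which is exactly the "delete the head of the queue" rule above.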

2. Handling the shutter tap

When the user taps the shutter, we immediately stop the current recording, save it, and then record one more 1.5-second segment. That gives us the two trailing segments.
Here is a rough diagram (my handwriting is bad, bear with me):


IMG_20180904_144636.jpg

In other words, the moment I tap the shutter almost never coincides with a 1.5-second segment boundary; say it lands 0.6 s into the current segment. I save all three segments, and those three segments are enough to assemble one Live Photo.

So the core idea is: use a queue to hold three video segments, and at the end, trim and merge only those three.
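The trim amount follows from the segment durations alone: keep the total at 3.0 s by cutting the excess off the front of the oldest segment. A sketch of that arithmetic (the helper `frontTrim` is an invented name, not project code):

```swift
/// Given the durations (in seconds) of the three kept segments,
/// returns how much to cut from the front of the first one so the
/// total comes out to `target` seconds.
func frontTrim(for durations: [Double], target: Double = 3.0) -> Double {
    let total = durations.reduce(0, +)
    return max(0, total - target)
}
```

For the example above (1.5 s + 0.6 s + 1.5 s = 3.6 s), the front trim is 0.6 s, which is exactly the length of the middle (partial) segment.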

3. Implementation code

Here is part of the code. It comes from 折纸相机 (Origami Camera), a project live on the App Store; your support is appreciated.
Note that the latest release does not yet include the pre-recorded Live Photo feature.

Variable definitions

    // Whether we are currently capturing a Live Photo
    var isLivePhoto = false
    // Timer for the rolling buffer segments
    var liveTimer: Timer?
    // Timer for the trailing 1.5 s segment
    var liveTimer2: Timer?
    var liveCounter: Double = 0.5
    var liveCounter2: Double = 0.5
    var liveUrl: URL!
    var videoUrls = [URL]()
    var saveManager: SaveVieoManager?

Core logic

    func setLiveMode(){
        if !isLivePhoto{
            isLivePhoto = true
            setLiveStart()
        }else{
            isLivePhoto = false
            movieWriter?.finishRecording()
            liveTimer?.invalidate()
            liveTimer = nil
        }
    }
    
    /// Start Live Photo recording
    func setLiveStart(){
        startRecord()
        videoUrls.append(videoUrl!)
        self.topView.liveCounter.isHidden = false
        topView.setCounter(text: "0")
        liveTimer = Timer.scheduledTimer(timeInterval: 0.5, target: self, selector: #selector(updateLiveCounter), userInfo: nil, repeats: true)
    }

    // Tick handler for the rolling-buffer timer
    @objc func updateLiveCounter(){
        liveCounter = liveCounter + 0.5
        print("Recording Live Photo: ", liveCounter)
        topView.setCounter(text: "\(liveCounter)")

        // Every 1.5 s: finish the current segment, rotate the buffer,
        // and immediately start recording the next segment.
        if liveCounter == 1.5 {
            movieWriter?.finishRecording()
            deleteLiveBuffer()
            startRecord()
            videoUrls.append(videoUrl!)
            liveCounter = 0
        }
    }
    
    /// Shutter tapped: stop the rolling recording and capture the trailing 1.5 s
    func finishLiveRecord(){
        movieWriter?.finishRecording()
        shotButton.isUserInteractionEnabled = false
        liveTimer?.invalidate()
        startRecord()
        liveCounter2 = 0
        liveTimer2 = Timer.scheduledTimer(timeInterval: 0.5, target: self, selector: #selector(setIntervalFinish), userInfo: nil, repeats: true)
    }
    
    
    
    // Trailing segment finished: assemble the three segments and present the result
    @objc func setIntervalFinish(){
        liveCounter2 = liveCounter2 + 0.5
        if liveCounter2 == 1.5 {
            deleteAdditionalBuffer()
            shotButton.isUserInteractionEnabled = false
            movieWriter?.failureBlock = { error in
                print(error as Any)
            }
            self.videoUrls.append(self.videoUrl!)
            self.movieWriter?.finishRecording()
            print("-------------:", self.videoUrls)
            self.topView.liveCounter.isHidden = true
            self.shotButton.isUserInteractionEnabled = true
            self.liveTimer = nil
            self.liveTimer2?.invalidate()
            self.liveTimer2 = nil
            self.liveCounter2 = 0
            setLiveStart()

            // Trim and merge the three segments
            saveManager = SaveVieoManager(urls: videoUrls)
            let newUrl = URL(fileURLWithPath: "\(NSTemporaryDirectory())folder_all.mp4")
            unlink(newUrl.path)
            videoUrl = newUrl

            saveManager?.combineLiveVideos(success: { com in
                self.saveManager?.store(com, storeUrl: newUrl, success: {
                    DispatchQueue.main.async {
                        let vc = CheckViewController.init(image: nil, type: 2)
                        vc.videoUrl = newUrl
                        weak var weakSelf = self
                        vc.videoScale = weakSelf?.scaleRate
                        vc.willDismiss = {
                            // Reset the beauty-filter state
                            if (weakSelf?.isBeauty)! {
                                weakSelf?.isBeauty = false
                                weakSelf?.defaultBottomView.beautyButton.isSelected = false
                            }
                            // Hide the bottom bar when vc returns; un-hide on mode switch
                            if weakSelf?.scaleRate != 0 {
                                weakSelf?.defaultBottomView.backgroundColor = UIColor.clear
                            }
                        }
                        weakSelf?.present(vc, animated: true, completion: nil)
                    }
                })
            })
        }
    }
    
    /// Keep the rolling buffer at no more than two segments: drop the oldest file
    func deleteLiveBuffer(){
        if videoUrls.count >= 2 {
            do {
                try FileManager.default.removeItem(atPath: videoUrls.first!.path)
                videoUrls.removeFirst()
            } catch {
            }
        }
    }

    /// After the shutter tap, keep only the last three segments
    func deleteAdditionalBuffer(){
        while videoUrls.count >= 3 {
            do {
                try FileManager.default.removeItem(atPath: videoUrls.first!.path)
                videoUrls.removeFirst()
            } catch {
            }
        }
    }

Trimming and merging code:

Trimming a video

    /// Trim a video
    ///
    /// - Parameters:
    ///   - frontOffset: seconds to cut from the front
    ///   - endOffset: seconds to cut from the end
    ///   - index: index into videoUrls
    /// - Returns: the trimmed composition
    func cutLiveVideo(frontOffset: Float64, endOffset: Float64, index: Int) -> AVMutableComposition {
        let composition = AVMutableComposition()
        // Create the video composition track.
        let compositionVideoTrack: AVMutableCompositionTrack? = composition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)
        // Create the audio composition track.
        let compositionAudioTrack: AVMutableCompositionTrack? = composition.addMutableTrack(withMediaType: .audio, preferredTrackID: kCMPersistentTrackID_Invalid)
        let pathUrl = videoUrls[index]
        let asset = AVURLAsset(url: pathUrl, options: nil)

        let videoTrack: AVAssetTrack = asset.tracks(withMediaType: .video)[0]
        let audioTrack: AVAssetTrack = asset.tracks(withMediaType: .audio)[0]
        compositionVideoTrack?.preferredTransform = videoTrack.preferredTransform

        // Build the front/back cut positions as CMTime using the track's timescale
        let trackDuration: CMTime = videoTrack.timeRange.duration
        let trackTimescale: CMTimeScale = trackDuration.timescale
        let startTime: CMTime = CMTimeMakeWithSeconds(frontOffset, trackTimescale)
        let endTime: CMTime = CMTimeMakeWithSeconds(endOffset, trackTimescale)
        let intendedDuration: CMTime = CMTimeSubtract(asset.duration, CMTimeAdd(startTime, endTime))

        try? compositionVideoTrack?.insertTimeRange(CMTimeRangeMake(startTime, intendedDuration), of: videoTrack, at: kCMTimeZero)
        try? compositionAudioTrack?.insertTimeRange(CMTimeRangeMake(startTime, intendedDuration), of: audioTrack, at: kCMTimeZero)

        return composition
    }
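In plain seconds, the time-range arithmetic inside `cutLiveVideo` reduces to: start at `frontOffset`, and keep `duration - frontOffset - endOffset`. A sketch under that reading (`trimmedRange` is an invented helper, not project code):

```swift
/// Sketch of the trimming arithmetic in plain seconds instead of CMTime.
/// Returns the start position and remaining duration after cutting
/// `frontOffset` seconds off the front and `endOffset` off the end.
func trimmedRange(assetDuration: Double, frontOffset: Double, endOffset: Double) -> (start: Double, duration: Double) {
    let duration = assetDuration - frontOffset - endOffset
    return (frontOffset, max(0, duration))
}
```

For the first segment in the running example (1.5 s long, trimming 0.6 s off the front), this keeps the range starting at 0.6 s with 0.9 s of footage.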
    

Merging the videos

    func combineLiveVideos(success: @escaping (_ mixComposition: AVMutableComposition) -> ()) {
        // Only the first segment needs trimming. The count check must be
        // >= 2, since we read videoUrls[1] below.
        if videoUrls.count >= 2 {
            // The tail of segment 1 overlaps segment 2 in time, so cut
            // segment 2's duration off the front of segment 1.
            let videoAsset2 = AVURLAsset(url: videoUrls[1])
            let tt = videoAsset2.duration
            let getLengthOfVideo2 = Double(tt.value) / Double(tt.timescale)
            let video1Composition = cutLiveVideo(frontOffset: getLengthOfVideo2, endOffset: 0.0, index: 0)
            let newUrl = URL(fileURLWithPath: "\(NSTemporaryDirectory())foldercut_1.mp4")
            unlink(newUrl.path)
            videoUrls[0] = newUrl
            // Once the first segment is trimmed, merge all three
            store(video1Composition, storeUrl: newUrl, success: {
                let mixCom = self.combineVideos()
                success(mixCom)
            })
        }
    }

Storing the video

    /**
     *  Export the merged composition to disk
     *
     *  @param mixComposition the composition to export
     *  @param storeUrl       destination path
     *  @param successBlock   called when the export finishes
     */
    func store(_ mixComposition: AVMutableComposition, storeUrl: URL, success successBlock: @escaping () -> ()) {
        let assetExport = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPreset640x480)
        assetExport?.outputFileType = AVFileType.mov
        assetExport?.outputURL = storeUrl
        assetExport?.exportAsynchronously(completionHandler: {
            successBlock()
        })
    }

That's all for now.
