
ReplayKit2 Screen Recording

2020-12-01  Show_Perry


If you need to record the screen of an iPhone, ReplayKit is something you will certainly have to learn. This article covers tips and usage for ReplayKit2 on iOS 12 and later. Why not cover recording before iOS 12? Because the old workflow was so cumbersome that even we developers found it awkward to use, let alone end users; besides, Apple is already on iOS 14...

Tips for Using ReplayKit2

  1. The system only provides a view of type RPSystemBroadcastPickerView, and the recording sheet appears only when the user taps it. So how do we hide it elegantly? The answer: cover it with a view of our own and forward the tap event to the picker's internal button. The event types differ slightly across system versions; the code is below, followed by a sketch of the overlay wiring:

    @property (nonatomic, strong) RPSystemBroadcastPickerView *sysTemBroadCastPickerView; // system picker view
    @property (nonatomic, strong) UIButton *startPushStreamBtn;                           // our visible "start recording" button
    
    
    - (void)showReplayKitView
    {
        if (@available(iOS 12.0, *)) {
            for (UIView *view in _sysTemBroadCastPickerView.subviews) {
                if ([view isKindOfClass:[UIButton class]]) {
                    float iOSVersion = [[UIDevice currentDevice].systemVersion floatValue];
                    UIButton *button = (UIButton *)view;
                    if (button != self.startPushStreamBtn) {
                        if (iOSVersion >= 13) {
                            // iOS 13+: the picker's button needs touch-down plus touch-up-inside
                            [button sendActionsForControlEvents:UIControlEventTouchDown];
                            [button sendActionsForControlEvents:UIControlEventTouchUpInside];
                        } else {
                            // iOS 12: a touch-down alone is enough
                            [button sendActionsForControlEvents:UIControlEventTouchDown];
                        }
                    }
                }
            }
        }
    }
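
    For completeness, here is a minimal sketch of the wiring (the setup method name and frame handling are illustrative assumptions, not from the original): the picker sits underneath our styled button, whose tap handler calls showReplayKitView above.

    // Illustrative setup, e.g. called from viewDidLoad
    - (void)setupRecordButton API_AVAILABLE(ios(12.0))
    {
        self.sysTemBroadCastPickerView.frame = self.startPushStreamBtn.frame;
        [self.view addSubview:self.sysTemBroadCastPickerView]; // hidden underneath
        [self.view addSubview:self.startPushStreamBtn];        // our visible button on top
        [self.startPushStreamBtn addTarget:self
                                    action:@selector(showReplayKitView)
                          forControlEvents:UIControlEventTouchUpInside];
    }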
    
  2. How to pin the picker to your own broadcast extension and control the microphone button: note that the BundleID here is the recording target's BundleID (the Broadcast Upload Extension's), distinct from the main project's BundleID. If you leave it unset, the picker sheet lists every broadcast-capable app installed on the phone, which is rather unfriendly...

    - (RPSystemBroadcastPickerView *)sysTemBroadCastPickerView
        API_AVAILABLE(ios(12.0))
    {
        if (!_sysTemBroadCastPickerView) {
            _sysTemBroadCastPickerView = [[RPSystemBroadcastPickerView alloc] init];
            _sysTemBroadCastPickerView.showsMicrophoneButton = NO; // hide the microphone toggle
            _sysTemBroadCastPickerView.preferredExtension = [MnaConfig replayKitBundleID]; // BundleID of our broadcast extension
        }
        return _sysTemBroadCastPickerView;
    }
    
    
  3. After tapping "Start Broadcast" there is a countdown; once it finishes, how do we dismiss the recording sheet elegantly? First we need to capture the system sheet. Judging by the presentation animation it goes through presentViewController, so we try Method Swizzling, and indeed we can grab it. Then the broadcast extension posts a cross-process notification once it has started, and the main process dismisses the sheet on receipt (a minimal sketch of such a manager follows the update note below). Code:

    @implementation UIViewController (MnaPresentSwizzleAdd)
    
    + (void)load
    {
        static dispatch_once_t onceToken;
        dispatch_once(&onceToken, ^{
            [self swizzleSelector:@selector(presentViewController:animated:completion:) withAnotherSelector:@selector(mna_presentViewController:animated:completion:)];
        });
    }
    
    + (void)swizzleSelector:(SEL)originalSelector withAnotherSelector:(SEL)swizzledSelector
    {
        Class aClass = [self class];
    
        Method originalMethod = class_getInstanceMethod(aClass, originalSelector);
        Method swizzledMethod = class_getInstanceMethod(aClass, swizzledSelector);
    
        BOOL didAddMethod =
            class_addMethod(aClass,
                            originalSelector,
                            method_getImplementation(swizzledMethod),
                            method_getTypeEncoding(swizzledMethod));
    
        if (didAddMethod) {
            class_replaceMethod(aClass,
                                swizzledSelector,
                                method_getImplementation(originalMethod),
                                method_getTypeEncoding(originalMethod));
        } else {
            method_exchangeImplementations(originalMethod, swizzledMethod);
        }
    }
    
    #pragma mark - Method Swizzling
    
    - (void)mna_presentViewController:(UIViewController *)viewControllerToPresent animated:(BOOL)flag completion:(void (^)(void))completion
    {
        if ([NSStringFromClass(viewControllerToPresent.class) isEqualToString:@"RPBroadcastPickerStandaloneViewController"]) {
            // The manager listens for the extension-started notification and then dismisses this controller
            MnaReplayKitHiddenManager.sharedInstance.replayKitBraodViewControler = viewControllerToPresent;
        }
        [self mna_presentViewController:viewControllerToPresent animated:flag completion:completion];
    }
    
    
    @end
    
    

    Update: iOS 14 changed how the sheet is presented, and we don't yet know how it is shown, so this trick only works below iOS 14.
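
    For reference, here is a minimal sketch of such a manager. The class and property names mirror the code above; the Darwin notification name "com.example.broadcastStarted" is an assumption, and the real project may wire this differently.

    #import <UIKit/UIKit.h>

    static void MnaBroadcastStartedCallback(CFNotificationCenterRef center, void *observer,
                                            CFNotificationName name, const void *object,
                                            CFDictionaryRef userInfo);

    @interface MnaReplayKitHiddenManager : NSObject
    @property (nonatomic, strong) UIViewController *replayKitBraodViewControler;
    + (instancetype)sharedInstance;
    @end

    @implementation MnaReplayKitHiddenManager

    + (instancetype)sharedInstance
    {
        static MnaReplayKitHiddenManager *instance;
        static dispatch_once_t onceToken;
        dispatch_once(&onceToken, ^{ instance = [self new]; });
        return instance;
    }

    - (instancetype)init
    {
        if (self = [super init]) {
            // Darwin notifications cross the process boundary (no payload allowed)
            CFNotificationCenterAddObserver(CFNotificationCenterGetDarwinNotifyCenter(),
                                            (__bridge const void *)self,
                                            MnaBroadcastStartedCallback,
                                            CFSTR("com.example.broadcastStarted"),
                                            NULL,
                                            CFNotificationSuspensionBehaviorDeliverImmediately);
        }
        return self;
    }

    @end

    static void MnaBroadcastStartedCallback(CFNotificationCenterRef center, void *observer,
                                            CFNotificationName name, const void *object,
                                            CFDictionaryRef userInfo)
    {
        dispatch_async(dispatch_get_main_queue(), ^{
            // The extension has started; dismiss the captured system sheet
            [MnaReplayKitHiddenManager.sharedInstance.replayKitBraodViewControler
                dismissViewControllerAnimated:YES completion:nil];
        });
    }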

  4. Cross-process notifications: inter-process communication is a real hassle. CFNotificationCenterPostNotification works for delivering a signal, but it cannot carry data; for data, share it through an App Group container instead. A ready-made open-source wrapper worth recommending: MMWormhole. The posting side, from inside the extension, is sketched below.
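
    A minimal sketch of the posting side, inside the extension's RPBroadcastSampleHandler subclass (same assumed notification name as in the manager sketch above):

    // In the Broadcast Upload Extension, tell the host app we have started
    - (void)broadcastStartedWithSetupInfo:(NSDictionary<NSString *, NSObject *> *)setupInfo
    {
        // Darwin notifications cannot carry userInfo; use an App Group (or MMWormhole) for data.
        CFNotificationCenterPostNotification(CFNotificationCenterGetDarwinNotifyCenter(),
                                             CFSTR("com.example.broadcastStarted"), // assumed name
                                             NULL, NULL, true);
    }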

Pitfalls We Hit

  1. The most painful part of an Extension is debugging. To see logs while recording, choose the extension scheme -> Run -> attach to the main app's process; logs then appear in the console after launch. Be warned: colleagues have hit Xcode versions that could neither run this way nor show any logs. An Xcode version verified to work: Version 12.1 (12A7403).

  2. If you need to persist logs for later troubleshooting, write them into the App Group shared container, obtained like this (a sketch of appending to a shared log file follows the snippet):

    [[NSFileManager defaultManager] containerURLForSecurityApplicationGroupIdentifier:groupName];
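
    A minimal sketch, assuming a group identifier and log file name of our own choosing:

    // Append one log line to a file in the App Group container (names are assumptions)
    - (void)appendLog:(NSString *)line
    {
        NSURL *container = [[NSFileManager defaultManager]
            containerURLForSecurityApplicationGroupIdentifier:@"group.com.example.replaykit"];
        NSURL *fileURL = [container URLByAppendingPathComponent:@"extension.log"];
        NSData *data = [[line stringByAppendingString:@"\n"] dataUsingEncoding:NSUTF8StringEncoding];
        if (![[NSFileManager defaultManager] fileExistsAtPath:fileURL.path]) {
            [data writeToURL:fileURL atomically:YES];
            return;
        }
        NSFileHandle *handle = [NSFileHandle fileHandleForWritingToURL:fileURL error:nil];
        [handle seekToEndOfFile];
        [handle writeData:data];
        [handle closeFile];
    }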
    
  3. The 50 MB memory limit is critical: exceed it and the system kills the extension on the spot. If you are pushing a stream, you must cap your buffer queue and manage memory carefully; see the bounded-queue sketch below.
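
    A minimal sketch of such a cap (the queue property and the limit are illustrative assumptions, not from the original): drop the oldest frame when the queue is full so memory stays bounded.

    static const NSUInteger kMaxPendingFrames = 10; // illustrative cap, tune for your bitrate

    // self.pendingFrames is an assumed NSMutableArray property
    - (void)enqueueSampleBuffer:(CMSampleBufferRef)sampleBuffer
    {
        @synchronized (self.pendingFrames) {
            if (self.pendingFrames.count >= kMaxPendingFrames) {
                [self.pendingFrames removeObjectAtIndex:0]; // drop the oldest frame
            }
            // NSArray retains/releases CF objects, so no manual CFRetain is needed here
            [self.pendingFrames addObject:(__bridge id)sampleBuffer];
        }
    }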

  4. Audio samples come out big-endian; swap the byte order if you need little-endian. Conversion code:

    - (NSData *)convertAudioSampleBufferToPcmData:(CMSampleBufferRef)sampleBuffer
    {
        CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer);
        if (blockBuffer == nil) {
            return nil;
        }
    
        AudioBufferList bufferList;
        CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBuffer,
                                                                NULL,
                                                                &bufferList,
                                                                sizeof(bufferList),
                                                                NULL,
                                                                NULL,
                                                                kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment,
                                                                &blockBuffer);
    
        int8_t *audioBuffer = (int8_t *)bufferList.mBuffers[0].mData;
        UInt32 audioBufferSizeInBytes = bufferList.mBuffers[0].mDataByteSize;
    
        CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);
        const AudioStreamBasicDescription *asbd = CMAudioFormatDescriptionGetStreamBasicDescription(formatDescription);
    
        // Perform an endianness conversion if needed; downstream consumers expect little-endian samples.
        if (asbd->mFormatFlags & kAudioFormatFlagIsBigEndian) { // big-endian: swap each 16-bit sample in place
            for (int i = 0; i < (audioBufferSizeInBytes - 1); i += 2) {
                int8_t temp = audioBuffer[i];
                audioBuffer[i] = audioBuffer[i + 1];
                audioBuffer[i + 1] = temp;
            }
        } // already little-endian: nothing to do
        NSData *data = [NSData dataWithBytes:audioBuffer length:audioBufferSizeInBytes];
        CFRelease(blockBuffer);
    
        return data;
    }
    
  5. Annoyingly, video frames are always delivered portrait-sized, no matter whether the phone is in portrait or landscape. Fortunately, since iOS 11 each output frame carries its orientation:

    CGImagePropertyOrientation orientation = ((__bridge NSNumber *)CMGetAttachment(buffer, (__bridge CFStringRef)RPVideoSampleOrientationKey, NULL)).unsignedIntValue;
    typedef CF_CLOSED_ENUM(uint32_t, CGImagePropertyOrientation) {
        kCGImagePropertyOrientationUp = 1,        // 0th row at top,    0th column on left   - default orientation
        kCGImagePropertyOrientationUpMirrored,    // 0th row at top,    0th column on right  - horizontal flip
        kCGImagePropertyOrientationDown,          // 0th row at bottom, 0th column on right  - 180 deg rotation
        kCGImagePropertyOrientationDownMirrored,  // 0th row at bottom, 0th column on left   - vertical flip
        kCGImagePropertyOrientationLeftMirrored,  // 0th row on left,   0th column at top
        kCGImagePropertyOrientationRight,         // 0th row on right,  0th column at top    - 90 deg CW
        kCGImagePropertyOrientationRightMirrored, // 0th row on right,  0th column on bottom
        kCGImagePropertyOrientationLeft           // 0th row on left,   0th column at bottom - 90 deg CCW
    };
    
  6. Since frames are always portrait-sized, we must rotate them and render back into a CVPixelBufferRef for hardware encoding. Material on this is scarce; some posts convert to I420 with the open-source libyuv and then hand off to Tencent Cloud's SDK, which doesn't apply here. The approach currently in use:

    #pragma mark - Rotation default stream
    
    - (void)dealWithSampleBuffer:(CMSampleBufferRef)buffer timeStamp:(uint64_t)timeStamp
    {
        if (@available(iOS 11.0, *)) {
            CGImagePropertyOrientation orientation = ((__bridge NSNumber *)CMGetAttachment(buffer, (__bridge CFStringRef)RPVideoSampleOrientationKey, NULL)).unsignedIntValue;
            CIImage *outputImage = nil;
            CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(buffer);
            CGFloat outputWidth = self.session.videoConfiguration.videoSize.width;
            CGFloat outputHeight = self.session.videoConfiguration.videoSize.height;
            BOOL isLandScape = self.session.videoConfiguration.landscape;
            size_t inputWidth = CVPixelBufferGetWidth(pixelBuffer);
            size_t inputHeight = CVPixelBufferGetHeight(pixelBuffer);
            CGAffineTransform lastRotateTransform = CGAffineTransformMakeScale(0.5, 0.5);
            CIImage *sourceImage = nil;
            
            CGImagePropertyOrientation lastRotateOrientation = orientation;
            sourceImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
            // Landscape output with a landscape source (e.g. iPad Pro), or portrait output with a portrait source
            if ((inputWidth > inputHeight && isLandScape) || (inputWidth <= inputHeight && !isLandScape)) {
                if (orientation == kCGImagePropertyOrientationUp) {
                    lastRotateOrientation = kCGImagePropertyOrientationUp;
                } else if (orientation == kCGImagePropertyOrientationDown) {
                    lastRotateOrientation = kCGImagePropertyOrientationDown;
                }
                lastRotateTransform = CGAffineTransformMakeScale(outputWidth / inputWidth, outputHeight / inputHeight);
            } else {
                if (orientation == kCGImagePropertyOrientationLeft) {
                    lastRotateOrientation = kCGImagePropertyOrientationRight;
                } else if (orientation == kCGImagePropertyOrientationRight) {
                    lastRotateOrientation = kCGImagePropertyOrientationLeft;
                } else {
                    lastRotateOrientation = kCGImagePropertyOrientationLeft;
                }
                lastRotateTransform = CGAffineTransformMakeScale(outputWidth / inputHeight, outputHeight / inputWidth);
            }
            sourceImage = [sourceImage imageByApplyingCGOrientation:lastRotateOrientation];
            outputImage = [sourceImage imageByApplyingTransform:lastRotateTransform];
            
            
            if (outputImage) {
                NSDictionary *pixelBufferOptions = @{(NSString *)kCVPixelBufferWidthKey : @(outputWidth),
                                                     (NSString *)kCVPixelBufferHeightKey : @(outputHeight),
                                                     (NSString *)kCVPixelBufferOpenGLESCompatibilityKey : @YES,
                                                     (NSString *)kCVPixelBufferIOSurfacePropertiesKey : @{} };
                
                
                CVPixelBufferLockBaseAddress(pixelBuffer, 0);
                CVPixelBufferRef newPixelBuffer = nil;
                CVPixelBufferCreate(kCFAllocatorDefault, outputWidth, outputHeight, kCVPixelFormatType_420YpCbCr8BiPlanarFullRange, (__bridge CFDictionaryRef)pixelBufferOptions, &newPixelBuffer);
                [_ciContext render:outputImage toCVPixelBuffer:newPixelBuffer];
                CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
                CMVideoFormatDescriptionRef videoInfo = nil;
                CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, newPixelBuffer, &videoInfo);
                CMTime duration = CMSampleBufferGetDuration(buffer);
                CMTime presentationTimeStamp = CMSampleBufferGetPresentationTimeStamp(buffer);
                CMTime decodeTimeStamp = CMSampleBufferGetDecodeTimeStamp(buffer);
                CMSampleTimingInfo sampleTimingInfo;
                sampleTimingInfo.duration = duration;
                sampleTimingInfo.presentationTimeStamp = presentationTimeStamp;
                sampleTimingInfo.decodeTimeStamp = decodeTimeStamp;
                // Wrap the rotated pixel buffer back into a CMSampleBuffer with the original timing
                CMSampleBufferRef newSampleBuffer = nil;
                CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, newPixelBuffer, true, nil, nil, videoInfo, &sampleTimingInfo, &newSampleBuffer);
                // Hand the new buffer on for encoding/streaming
                [self.session pushVideoBuffer:newSampleBuffer timeStamp:timeStamp];
                // Release everything we created
                if (videoInfo) {
                    CFRelease(videoInfo);
                }
                if (newPixelBuffer) {
                    CVPixelBufferRelease(newPixelBuffer);
                }
                if (newSampleBuffer) {
                    CFRelease(newSampleBuffer);
                }
            }
        } else {
            // Fallback on earlier versions
            [self.session pushVideoBuffer:buffer timeStamp:timeStamp];
        }
    }
    
  7. Streaming: LFLiveKit is a good reference. Remember the buffer cap discussed above, and also tune LibRTMP's output chunk size inside it to reduce CPU usage.