iOS live streaming: capture and publishing with LFLiveKit, plus custom filters and tap-to-focus

2019-08-10  by 豆浆油条cc

1. LFLiveKit

GitHub: https://github.com/LaiFengiOS/LFLiveKit

For setting up a local RTMP server, see: https://www.jianshu.com/p/561df087fd4d


The framework supports H.264 and AAC encoding, dynamic bitrate adjustment, RTMP transport, beauty filters, frame dropping on weak networks, dynamic watermarks, and ingesting external hardware media streams.

Core classes

LFLiveSession: the main entry point; wraps AVFoundation's video capture/recording pipeline.
LFLiveVideoConfiguration: video stream resolution, bitrate, frame rate, etc.
LFLiveAudioConfiguration: audio bitrate, sample rate, channel count, etc.

Default video quality -- resolution: 360×640, frame rate: 24 fps, bitrate: 800 Kbps.
Default audio quality -- sample rate: 44.1 kHz, bitrate: 96 Kbps.
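If the defaults don't fit, LFLiveKit also ships preset quality levels; `LFLiveVideoQuality_High2` and `LFLiveAudioQuality_High` below are taken from the upstream headers, but double-check the enum names against the version of the framework you actually use:

```objectivec
// Create configurations from the framework's quality presets instead of the defaults.
LFLiveVideoConfiguration *videoConfig =
    [LFLiveVideoConfiguration defaultConfigurationForQuality:LFLiveVideoQuality_High2];
LFLiveAudioConfiguration *audioConfig =
    [LFLiveAudioConfiguration defaultConfigurationForQuality:LFLiveAudioQuality_High];

LFLiveSession *session =
    [[LFLiveSession alloc] initWithAudioConfiguration:audioConfig
                                   videoConfiguration:videoConfig];
```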

- (LFLiveSession *)session {
    if (!_session) {
        //default audio and video configuration
        _session = [[LFLiveSession alloc] initWithAudioConfiguration:[LFLiveAudioConfiguration defaultConfiguration] videoConfiguration:[LFLiveVideoConfiguration defaultConfiguration]];
        _session.reconnectCount = 5;//number of reconnect attempts
        _session.saveLocalVideo = YES;//also record to a local file
        _session.saveLocalVideoPath = [NSURL fileURLWithPath:localVideoPath];
        _session.preView = self.view;
        _session.delegate = self;
    }
    return _session;
}
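Note that the getter does not start capture: the camera preview only appears once the session is running, and iOS prompts for camera/microphone permission on first access. A sketch of a plausible call site (the `running` property comes from the LFLiveSession header; verify it exists in your version):

```objectivec
// e.g. in viewDidLoad: request access, then start capture/preview.
[AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL videoGranted) {
    [AVCaptureDevice requestAccessForMediaType:AVMediaTypeAudio completionHandler:^(BOOL audioGranted) {
        dispatch_async(dispatch_get_main_queue(), ^{
            if (videoGranted && audioGranted) {
                self.session.running = YES;  // starts capture and renders into session.preView
            } else {
                NSLog(@"Camera or microphone access denied");
            }
        });
    }];
}];
```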
//start streaming
- (void)startLive {
    LFLiveStreamInfo *streamInfo = [LFLiveStreamInfo new];
    streamInfo.url = rtmpUrl;
    [self.session startLive:streamInfo];
}

//stop streaming
- (void)stopLive {
    [self.session stopLive];
}

LFLiveSessionDelegate

//streaming state changed
- (void)liveSession:(nullable LFLiveSession *)session liveStateDidChange:(LFLiveState)state {
    
    NSString *stateStr = @"";  //initialize so the label is never set from garbage
    switch (state) {
        case LFLiveReady:
            stateStr = @"Ready";
            break;
            
        case LFLivePending:
            stateStr = @"Connecting";
            break;
            
        case LFLiveStart:
            stateStr = @"Connected";
            break;
            
        case LFLiveStop:
            stateStr = @"Disconnected";
            break;
            
        case LFLiveError:
            stateStr = @"Connection error";
            break;
            
        case LFLiveRefresh:
            stateStr = @"Refreshing";
            break;
            
        default:
            break;
    }
    
    self.liveStateLabel.text = stateStr;
}

//streaming debug info (bitrate, dropped frames, etc.)
- (void)liveSession:(nullable LFLiveSession *)session debugInfo:(nullable LFLiveDebug*)debugInfo{
}

//streaming error callback
- (void)liveSession:(nullable LFLiveSession *)session errorCode:(LFLiveSocketErrorCode)errorCode {
    switch (errorCode) {
        case LFLiveSocketError_PreView:
            NSLog(@"Preview failed");
            break;
        case LFLiveSocketError_GetStreamInfo:
            NSLog(@"Failed to get stream info");
            break;
        case LFLiveSocketError_ConnectSocket:
            NSLog(@"Socket connection failed");
            break;
        case LFLiveSocketError_Verification:
            NSLog(@"Server verification failed");
            break;
        case LFLiveSocketError_ReConnectTimeOut:
            NSLog(@"Reconnect timed out");
            break;
        default:
            break;
    }
}
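In a real app you would typically recover from some of these errors rather than just log them. One possible policy inside the callback above; the single retry after 5 seconds is purely illustrative, not part of LFLiveKit:

```objectivec
// Inside the errorCode callback: after the framework's own reconnect
// attempts time out, stop the stream and schedule one manual retry.
if (errorCode == LFLiveSocketError_ReConnectTimeOut) {
    [self.session stopLive];
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(5 * NSEC_PER_SEC)),
                   dispatch_get_main_queue(), ^{
        [self startLive];  // reuse the startLive helper defined earlier
    });
}
```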

Torch and front/back camera switching

    //switch between the front and back cameras
    AVCaptureDevicePosition devicePosition = self.session.captureDevicePosition;
    self.session.captureDevicePosition = (devicePosition == AVCaptureDevicePositionBack) ? AVCaptureDevicePositionFront : AVCaptureDevicePositionBack;

    //toggle the torch (flash)
    self.session.torch = !self.session.torch;

2. Custom filters

With basic streaming in place, the next step is custom filters. LFLiveKit only exposes its built-in beauty adjustment and provides no hook for arbitrary filters, so we modify the framework source.

First, find the class LFVideoCapture and add a property to its .h:

@property (nonatomic, strong) GPUImageOutput<GPUImageInput>* _Nullable currentFilter;

Then add a setter in the .m:

-(void)setCurrentFilter:(GPUImageOutput<GPUImageInput> *)currentFilter{
    _currentFilter = currentFilter;
    [self reloadFilter];
}

Next, modify the reloadFilter method:

- (void)reloadFilter{
    [self.filter removeAllTargets];
    [self.blendFilter removeAllTargets];
    [self.uiElementInput removeAllTargets];
    [self.videoCamera removeAllTargets];
    [self.output removeAllTargets];
    [self.cropfilter removeAllTargets];
    
    self.output = [[LFGPUImageEmptyFilter alloc] init];
    
    if (!self.beautyFace) {
        if (self.currentFilter) {
            self.filter = _currentFilter;
            self.beautyFilter = _currentFilter;
        }else{
            self.filter = [[LFGPUImageEmptyFilter alloc] init];
            self.beautyFilter = nil;
        }
        
    }else{
        self.filter = [[LFGPUImageBeautyFilter alloc] init];
        self.beautyFilter = (LFGPUImageBeautyFilter*)self.filter;
    }
    
    ///< adjust mirroring
    [self reloadMirror];
    
    //< 480*640 is 4:3; crop to force 16:9
    if([self.configuration.avSessionPreset isEqualToString:AVCaptureSessionPreset640x480]){
        CGRect cropRect = self.configuration.landscape ? CGRectMake(0, 0.125, 1, 0.75) : CGRectMake(0.125, 0, 0.75, 1);
        self.cropfilter = [[GPUImageCropFilter alloc] initWithCropRegion:cropRect];
        [self.videoCamera addTarget:self.cropfilter];
        [self.cropfilter addTarget:self.filter];
    }else{
        [self.videoCamera addTarget:self.filter];
    }
    
    //< add the watermark
    if(self.warterMarkView){
        [self.filter addTarget:self.blendFilter];
        [self.uiElementInput addTarget:self.blendFilter];
        [self.blendFilter addTarget:self.gpuImageView];
        if(self.saveLocalVideo) [self.blendFilter addTarget:self.movieWriter];
        [self.filter addTarget:self.output];
        [self.uiElementInput update];
    }else{
        [self.filter addTarget:self.output];
        [self.output addTarget:self.gpuImageView];
        if(self.saveLocalVideo) [self.output addTarget:self.movieWriter];
    }
    
    [self.filter forceProcessingAtSize:self.configuration.videoSize];
    [self.output forceProcessingAtSize:self.configuration.videoSize];
    [self.blendFilter forceProcessingAtSize:self.configuration.videoSize];
    [self.uiElementInput forceProcessingAtSize:self.configuration.videoSize];
    
    
    //< output the processed frames
    __weak typeof(self) _self = self;
    [self.output setFrameProcessingCompletionBlock:^(GPUImageOutput *output, CMTime time) {
        [_self processVideo:output];
    }];
    
}

Then add a forwarding setter in LFLiveSession.m:

-(void)setCurrentFilter:(GPUImageOutput<GPUImageInput> *)currentFilter{
    [self willChangeValueForKey:@"currentFilter"];
    [self.videoCaptureSource setCurrentFilter:currentFilter];
    [self didChangeValueForKey:@"currentFilter"];
}

And declare the matching property in LFLiveSession.h:

@property (nonatomic, strong) GPUImageOutput<GPUImageInput>* _Nullable currentFilter;

With that, custom filters can be set directly on the session.
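Any GPUImage filter conforming to GPUImageOutput&lt;GPUImageInput&gt; can now be injected from outside. Note that in the reloadFilter above, currentFilter is only picked up when beautyFace is off, so disable beauty first. A minimal usage sketch (GPUImageSepiaFilter is just an arbitrary example filter from GPUImage):

```objectivec
self.session.beautyFace = NO;  // the custom-filter branch in reloadFilter requires beauty off
self.session.currentFilter = [[GPUImageSepiaFilter alloc] init];  // apply a sepia filter

// Setting nil falls back to the pass-through LFGPUImageEmptyFilter:
self.session.currentFilter = nil;
```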
Tap-to-focus follows the same pattern: add a method to the LFVideoCapture class:

- (void)setFocusPoint:(CGPoint)point {
    if (!self.videoCamera.captureSession) return;
    AVCaptureSession *session = (AVCaptureSession *)self.videoCamera.captureSession;
    AVCaptureDevice *device = self.videoCamera.inputCamera;
    if (!device) return;
    
    [session beginConfiguration];
    NSError *err = nil;
    if ([device lockForConfiguration:&err]) {
        // Set the point of interest before the mode, otherwise the mode
        // change can trigger a focus/exposure run at the old point.
        if ([device isFocusPointOfInterestSupported]) {
            [device setFocusPointOfInterest:point];
        }
        if ([device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
            [device setFocusMode:AVCaptureFocusModeAutoFocus];
        }
        if ([device isExposurePointOfInterestSupported]) {
            [device setExposurePointOfInterest:point];
        }
        if ([device isExposureModeSupported:AVCaptureExposureModeAutoExpose]) {
            [device setExposureMode:AVCaptureExposureModeAutoExpose];
        }
        [device unlockForConfiguration];
    } else {
        NSLog(@"Error locking device for focus configuration: %@", err);
    }
    [session commitConfiguration];
}
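As with currentFilter, the new method still needs to be exposed through LFLiveSession, and the CGPoint handed to AVFoundation must be in its normalized coordinate space: (0,0) is the top-left and (1,1) the bottom-right of the landscape sensor image. A sketch, where the portrait tap-to-sensor conversion assumes the back camera and a full-screen preview (handleTap: is a hypothetical tap-gesture handler):

```objectivec
// In LFLiveSession.m, mirroring the currentFilter forwarding:
- (void)setFocusPoint:(CGPoint)point {
    [self.videoCaptureSource setFocusPoint:point];
}

// In the view controller: translate a tap on the preview into sensor coordinates.
- (void)handleTap:(UITapGestureRecognizer *)tap {
    CGPoint location = [tap locationInView:self.view];
    CGPoint focusPoint = CGPointMake(location.y / CGRectGetHeight(self.view.bounds),
                                     1.0 - location.x / CGRectGetWidth(self.view.bounds));
    [self.session setFocusPoint:focusPoint];
}
```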

Demo project:
https://github.com/qw9685/LFLiveKitDemo.git
