
On Screen-Mirroring (Casting) Solutions with Live Streaming

2017-05-22  大花头

The projects at my company tend to be a bit unusual: the product team wanted the app to support screen mirroring (casting the phone screen to a PC or smart TV), so I had to bite the bullet and dig through whatever material I could find. The first thing that came to mind was Apple's AirPlay. Many iOS casting products are built on the AirPlay Protocol, and AirServer on the desktop side really does work well (look it up if you haven't used it). Unfortunately our PC client ran into problems integrating an AirServer-style receiver, so that approach was cut (this is Approach 1). In the end we went with a screen-recording framework plus live push streaming (this is Approach 2).


I. Pros and cons of the two approaches

Approach 1: AirPlay is a mature protocol; connection is fast and the mirrored picture looks good and stays smooth. The downside is that you have to study the AirPlay Protocol, and the PC-side receiver is hard to develop and integrate.

Approach 2: the screen-recording framework is less polished (under the hood it still opens a virtual AirPlay connection and exports the real-time screen stream); connecting is slower, push/pull streaming adds latency, and picture quality and smoothness are only average. The upside is that you can push directly to an RTMP server and broadcast over the network.

Related links (Approach 1 and Approach 2):

Unofficial AirPlay Protocol Specification

Screen Recorder SDK + Airplay Mirroring SDK For Windows

LFLiveKit


II. Implementation of Approach 1 (for reference)

First, think about what you actually need. If you only want mirroring and have no requirement for a customized client experience (i.e. your app presenting its own connect-to-device screen instead of the user pulling up the system menu and tapping the AirPlay Mirroring button themselves), then the built-in picker is enough and you can skip most of what follows; a quick sketch of that option is shown after the comparison image.

(Comparison screenshot: custom connection UI vs. the system AirPlay picker)
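If the system picker is acceptable, you can surface it without touching any private headers by using the public MPVolumeView route button (a minimal sketch; on pre-iOS 13 systems the route button brings up the system AirPlay device list, and AVRoutePickerView supersedes it later):

#import <MediaPlayer/MediaPlayer.h>

// Show only the AirPlay/route button and let the user pick the target
// from the system list themselves (e.g. somewhere in viewDidLoad).
MPVolumeView *routeView = [[MPVolumeView alloc] initWithFrame:CGRectMake(20, 80, 44, 44)];
routeView.showsVolumeSlider = NO;  // hide the volume slider
routeView.showsRouteButton = YES;  // keep the route (AirPlay) button
[self.view addSubview:routeView];

The rest of this section assumes you do need a custom connection UI, which is why it reaches for the private MediaPlayer headers.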

How it works: import MediaPlayer.framework (via its private headers) and call its APIs to fetch the list of AirPlay-capable routes; walk the list until you find the device name you want and pick that route to start mirroring. Separately, register for the screen-connection notifications and check the number of screens and whether a mirrored screen exists to determine whether mirroring is currently active.

2.1 First, download the AirplaySelector project, copy its MediaPlayer Headers folder into your own project, and import the headers in your view controller to use the (private) API:


#import "ViewController.h"

#import "MPAudioVideoRoutingViewController.h"

#import "MPAVRoutingController.h"

#import "MPAVSystemRoutingController.h"

#import "MPAudioVideoRoutingTableViewController.h"

#import "MPAudioDeviceController.h"

#import "MPAVRoute.h"

2.2 Register for notifications to monitor the screen-connection state

- (void)addNotification{
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(screenDidConnect:) name:UIScreenDidConnectNotification object:nil];
    
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(screenDidDisconnect:) name:UIScreenDidDisconnectNotification object:nil];
}

#pragma mark - Notifications Handler

- (void)screenDidConnect:(NSNotification *)notification
{
    NSLog(@"connect");
    
    [self checkConnected];
}

- (void)screenDidDisconnect:(NSNotification *)notification
{
    NSLog(@"disconnect");
    
    [self checkConnected];
}

- (bool)checkConnected {
    
    // we need to use the secondary screen since the mainScreen does not mirror itself
    if ([[UIScreen screens] count] > 1) {
        UIScreen *secondaryScreen = [[UIScreen screens] objectAtIndex:1];
        NSLog(@"%@", secondaryScreen.mirroredScreen); // is actually the mainScreen
        if (secondaryScreen.mirroredScreen != nil) {
            // Screen is being mirrored
            [self changeLabelText:@"投屏中"]; // "Mirroring in progress"
            [_linkBtn setImage:[UIImage imageNamed:@"btn_end"] forState:UIControlStateNormal];
            [_linkBtn setImage:[UIImage imageNamed:@"btn_end_on"] forState:UIControlStateHighlighted];
            return true;
        }
    }
    [self changeLabelText:@"投屏异常,请重试!"]; // "Mirroring failed, please retry!"
    [_linkBtn setImage:[UIImage imageNamed:@"try_again"] forState:UIControlStateNormal];
    [_linkBtn setImage:[UIImage imageNamed:@"try_again_on"] forState:UIControlStateHighlighted];
    return false;
}
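
A natural place to hook these up is the view controller lifecycle (a small sketch; the exact placement is an assumption, not from the original post):

- (void)viewDidLoad {
    [super viewDidLoad];
    [self addNotification];  // start listening for screen connect/disconnect
    [self checkConnected];   // reflect the current mirroring state on first load
}

- (void)dealloc {
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}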

2.3 Walk the AirPlay route list and connect to the device whose name matches the user's input (_severName.text is the device name the user typed)

- (void)searchAirplayDevices {
    
    MPAudioVideoRoutingTableViewController *tableViewController = [[MPAudioVideoRoutingTableViewController alloc] initWithType:0 displayMirroringRoutes:YES];
    
    MPAVRoutingController *tableRouteController = [tableViewController routingController];
    _tableRouteController = tableRouteController;
    
    mirrorFound = NO; // reset the flag before each search
    
    [tableRouteController fetchAvailableRoutesWithCompletionHandler:^(NSArray *routes) {
        for (MPAVRoute *route in routes) {
            MPAVRoute *displayRoute = [route wirelessDisplayRoute];
            NSLog(@"* display route %@", displayRoute);
            if (displayRoute) {
                if ([[displayRoute routeName] isEqualToString:_severName.text]) {
                    mirrorFound = YES;
                    NSLog(@"found!! %@", [displayRoute routeName]);
                    // Pick the matching route to start mirroring
                    [tableRouteController pickRoute:displayRoute];
                } else {
                    NSLog(@"* route name does not match");
                }
            }
        }
        if (!mirrorFound) {
            [self changeLabelText:@"投屏异常,请重试!"]; // "Mirroring failed, please retry!"
            [_linkBtn setImage:[UIImage imageNamed:@"try_again"] forState:UIControlStateNormal];
            [_linkBtn setImage:[UIImage imageNamed:@"try_again_on"] forState:UIControlStateHighlighted];
        }
    }];
}

To disconnect, switch back to the local (handset) route:

    [_tableRouteController pickHandsetRoute];
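
Wired into the UI, the whole thing boils down to something like this (a sketch only; the action names are hypothetical and not from the original project):

// Hypothetical connect action: search for the named route and pick it
- (IBAction)connectTapped:(id)sender {
    [self searchAirplayDevices];
}

// Hypothetical disconnect action: drop the mirroring route
- (IBAction)disconnectTapped:(id)sender {
    [_tableRouteController pickHandsetRoute];
}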

That is the approach of automatically connecting an AirPlay mirroring session from a device name entered by the user. If you also need the PC side to run an AirServer-style receiver, refer to the Airplay Mirroring SDK For Windows link above; our Windows team never got it working, so if you do, feel free to get in touch and we can compare notes.


III. Implementation of Approach 2

How it works: use the XDWScreenRecorder SDK to capture a real-time stream of the screen, push that stream to an RTMP server with LFLiveKit, and have the PC or TV pull and play the live stream.

(Screen-mirroring diagram)

3.1 Download the LiveScreenStreamForiOS project, copy its include folder into your own project, add the .a static libraries from its lib folder, and bring in LFLiveKit either via CocoaPods or manually.

(Screenshot)

3.2 Import the headers and conform to the protocols

#import "LivingViewController.h"
#import "LFLiveKit.h"
#import "XDWScreenRecorder.h"
#import "ScanResultModel.h"

@interface LivingViewController () <XDWScreenRecorderDelegate,CmdSocketDelegate>
@property(nonatomic, strong) LFLiveStreamInfo *liveStreamInfo;
@property(nonatomic, strong) LFLiveSession *lfLiveSession;
@property(nonatomic, strong) XDWScreenRecorder *screenRecorder;

@property (weak, nonatomic) IBOutlet UIImageView *screenType;
@property (weak, nonatomic) IBOutlet UILabel *typeLabel;
@property (weak, nonatomic) IBOutlet UIButton *screenBtn;
@property (weak, nonatomic) IBOutlet UIView *baseView;
@property (nonatomic, strong) UIButton *leftBtn;
@property (nonatomic, strong) UIButton *rightBtn;

@end
- (XDWScreenRecorder *)screenRecorder {
    if (!_screenRecorder) {
        XDWScreenRecorderConfig *screenRecorderConfig = [[XDWScreenRecorderConfig alloc] init];
        screenRecorderConfig.videoSize = CGSizeMake(1080, 1920);
        screenRecorderConfig.framerate = 24;
        screenRecorderConfig.airTunesPort = 57000;
        screenRecorderConfig.airVideoPort = 8134;
        screenRecorderConfig.activeCode = @"000000000"; // activation code for the SDK
        screenRecorderConfig.airPlayName = @"ios";      // name shown in the AirPlay device list
        screenRecorderConfig.autoRotate = 0; //0 or 90 or 270
        
        _screenRecorder = [[XDWScreenRecorder alloc] initWithConfig:screenRecorderConfig];
        _screenRecorder.delegate = self;
    }
    
    return _screenRecorder;
}

- (LFLiveSession *)lfLiveSession {
    if (!_lfLiveSession) {
        
        _lfLiveSession = [[LFLiveSession alloc] initWithAudioConfiguration:[LFLiveAudioConfiguration defaultConfiguration]
                                                        videoConfiguration:[LFLiveVideoConfiguration defaultConfigurationForQuality:LFLiveVideoQuality_Medium2 outputImageOrientation:UIInterfaceOrientationPortrait]
                                                               captureType:LFLiveCaptureMaskAudioInputVideo];
        
    }
    
    return _lfLiveSession;
}
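
One thing the snippets above don't show is where the RTMP address goes: before any frames can be pushed, the session has to be started with a stream URL. A minimal sketch using LFLiveKit's public startLive: API (the method name and URL are placeholders; self.liveStreamInfo is the property declared above):

- (void)startPushing {
    LFLiveStreamInfo *streamInfo = [LFLiveStreamInfo new];
    streamInfo.url = @"rtmp://your.server/live/streamKey"; // placeholder RTMP address
    self.liveStreamInfo = streamInfo;
    
    [self.lfLiveSession startLive:streamInfo]; // start pushing to the RTMP server
}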

3.3 Implement the delegate methods

#pragma mark - XDWScreenRecorderDelegate

- (void)screenRecorder:(XDWScreenRecorder *)screenRecorder didStartRecordingWithVideoSize:(CGSize)videoSize {
    // The screen recorder has started: bring up the LFLiveKit side here
    self.liveStreamInfo.videoConfiguration.videoSize = videoSize;
    self.lfLiveSession.running = YES;
}

- (void)screenRecorder:(XDWScreenRecorder *)screenRecorder startError:(NSError *)error {
    // Called when AirPlay cannot be started or some other error occurs; handle the failure as needed.
    // Re-enable the buttons here.
    self.rightBtn.enabled = YES;
    self.leftBtn.enabled = YES;
    self.screenBtn.enabled = YES;
    
    [self changeScreenType:Reconnection];
}

- (void)screenRecorder:(XDWScreenRecorder *)screenRecorder videoBuffer:(CVPixelBufferRef)buffer timestamp:(NSTimeInterval)timestamp {
    // buffer is the real-time screen frame; push it to your RTMP server via LFLiveKit
    [self.lfLiveSession pushVideo:buffer];
}

- (void)screenRecorderDidStopRecording:(XDWScreenRecorder *)screenRecorder {
    // Called when screen recording stops
}
- (void)home{
    // Suspend the app (send it to the background); note that this relies on a private selector
    [[UIApplication sharedApplication] performSelector:@selector(suspend)];
}
When stopping the mirroring session (for example from a stop button), tear everything down:

    [self.screenRecorder stop];
    [self.lfLiveSession stopLive];
    [self.lfLiveSession setRunning:NO];
    self.lfLiveSession.delegate = nil;
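
Since the teardown clears lfLiveSession.delegate, the delegate is presumably set somewhere (e.g. self.lfLiveSession.delegate = self; when starting the push). A rough sketch of monitoring the push state via LFLiveKit's LFLiveSessionDelegate; remember to add the protocol to the class extension's protocol list:

#pragma mark - LFLiveSessionDelegate

- (void)liveSession:(LFLiveSession *)session liveStateDidChange:(LFLiveState)state {
    // e.g. LFLiveStart / LFLiveStop / LFLiveError; update the UI accordingly
    NSLog(@"live state changed: %lu", (unsigned long)state);
}

- (void)liveSession:(LFLiveSession *)session errorCode:(LFLiveSocketErrorCode)errorCode {
    // Push failed (bad RTMP URL, network drop, ...); offer a retry here
    NSLog(@"live error: %lu", (unsigned long)errorCode);
}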


Closing remarks

Things are not too busy on the project lately, so when I have time I'll keep writing up problems I've run into, for reference and study; anyone interested is welcome to get in touch and discuss.
