
iOS Audio/Video Communication Based on WebRTC: A Summary (Updated for 2020)

2019-10-22  coder_xiang


Here is a link to my Swift project. It contains the overall structure of a Swift application, a networking layer, DSBridge for native-to-H5 interaction, uses of reflection, a WCDB database wrapper, a WebRTC audio/video live-streaming demo, socket usage, a socket protocol wrapper, and more. I hope it is useful: --> Complete Swift project, continuously updated through 2020

My company needed WebRTC for audio/video communication. After working through many blog posts and demos from China and abroad, here is a summary of the experience:
WebRTC official site
WebRTC's notes on iOS usage

Another standalone demo that makes WebRTC easier to understand: https://github.com/Xianlau/WebRTC_Demo
First, a screenshot of the demo:

[Figure: demo screenshot]

WebRTC architecture

A complete WebRTC setup consists of two major parts: the server side and the client side.

Here are WebRTC's three main APIs, along with the flow for establishing a point-to-point connection.

  1. MediaStream: the MediaStream API lets you obtain synchronized video and audio streams from the device's camera and microphone
  2. RTCPeerConnection: the component WebRTC uses to build a stable, efficient streaming transport between two peers
  3. RTCDataChannel: lets the two endpoints (peer to peer) establish a high-throughput, low-latency channel for transferring arbitrary data.
    Of the three, RTCPeerConnection is the core component of WebRTC.
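
As a rough mapping of those three APIs onto the classes in the iOS WebRTC framework, here is a minimal sketch (the full, working setup appears in the WebRTCClient code further down):

import WebRTC

let factory = RTCPeerConnectionFactory()

// MediaStream: audio/video tracks captured from the microphone and camera
let audioTrack = factory.audioTrack(with: factory.audioSource(with: nil), trackId: "audio0")
let videoTrack = factory.videoTrack(with: factory.videoSource(), trackId: "video0")

// RTCPeerConnection: the peer-to-peer transport itself
let config = RTCConfiguration()
let constraints = RTCMediaConstraints(mandatoryConstraints: nil, optionalConstraints: nil)
let peerConnection = factory.peerConnection(with: config, constraints: constraints, delegate: nil)

// RTCDataChannel: a high-throughput, low-latency channel for arbitrary data
let dataChannel = peerConnection.dataChannel(forLabel: "WebRTCData",
                                             configuration: RTCDataChannelConfiguration())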

WebRTC connection setup flow

[Figure: WebRTC connection flow diagram]

Walkthrough of the whole WebRTC connection flow

The main flow is shown in the figure above; the details are as follows:

  1. The client establishes a persistent TCP connection to the server over a socket. WebRTC does not provide an API for this part, so a third-party framework helps here. For Objective-C code, CocoaAsyncSocket is a good choice: https://github.com/robbiehanson/CocoaAsyncSocket
    For Swift, engineers abroad tend to favor Starscream (WebSocket); a minimal Starscream sketch follows this list:
    https://github.com/daltoniam/Starscream

  2. The client performs the offer SDP handshake through the signaling server

SDP (Session Description Protocol): describes the properties of the audio/video connection being set up, such as the audio codec, the video codec, whether audio/video is sent and/or received, and so on.
The SDP is created by the PeerConnection from the WebRTC framework; see my demo for the details.

  3. The client performs the Candidate handshake through the signaling server

Candidate: mainly contains each party's IP information, including the local LAN IP, the public IP, the TURN server IP, the STUN server IP, and so on.
Candidates are created by the PeerConnection from the WebRTC framework; see my demo for the details.

  4. Once the SDP and Candidate handshakes succeed, a P2P (end-to-end) link is established and the video stream can be transmitted directly, without passing through the server.
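
For reference, if your team chooses Starscream rather than a raw TCP socket, connecting to the signaling server is only a few lines. This is a minimal sketch assuming Starscream 4.x (delegate signatures differ slightly between versions, and the URL is a placeholder):

import Starscream

final class SignalingSocket: WebSocketDelegate {

    private var socket: WebSocket

    init(url: URL) {
        var request = URLRequest(url: url)   // e.g. ws://your.signaling.server:6868 (placeholder)
        request.timeoutInterval = 5
        socket = WebSocket(request: request)
        socket.delegate = self
        socket.connect()
    }

    func didReceive(event: WebSocketEvent, client: WebSocket) {
        switch event {
        case .connected(let headers):
            print("websocket connected: \(headers)")
        case .disconnected(let reason, let code):
            print("websocket disconnected: \(reason) (\(code))")
        case .text(let text):
            print("signaling text received: \(text)")   // route SDP / candidate JSON here
        case .binary(let data):
            print("received \(data.count) bytes")
        case .error(let error):
            print("websocket error: \(String(describing: error))")
        default:
            break
        }
    }
}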

The Candidate handshake works much like the SDP handshake, but both are a bit fiddly, so here is a brief walkthrough of the SDP handshake:

The figure below shows how WebRTC completes an SDP handshake over signaling. Only after the SDP handshake do the two sides know each other's information, which is the foundation for establishing the p2p channel.


[Figure: SDP handshake sequence]

1. The anchor generates an SDP description via createOffer
2. The anchor sets the local description via setLocalDescription
3. The anchor sends the offer SDP to the viewer
4. The viewer sets the remote description via setRemoteDescription
5. The viewer creates its own SDP description via createAnswer
6. The viewer sets the local description via setLocalDescription
7. The viewer sends the answer SDP back to the anchor
8. The anchor sets the remote description via setRemoteDescription
9. After this SDP handshake, a direct end-to-end communication channel exists between the two ends.
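
Mapped onto the WebRTCClient and SocketClient wrappers shown later in this article, the two sides of that handshake reduce to roughly the following sketch (error handling omitted; remoteOffer stands for the RTCSessionDescription received over signaling):

// Anchor (caller) side: create the offer and push it to the peer through the signaling socket.
webRTCClient.offer { localSdp in
    socketClient.send(sdp: localSdp)                  // steps 1-3: createOffer + setLocalDescription + send
}

// Viewer (callee) side: apply the remote offer, then create and send the answer.
webRTCClient.set(remoteSdp: remoteOffer) { error in   // step 4: setRemoteDescription
    webRTCClient.answer { localSdp in                 // steps 5-6: createAnswer + setLocalDescription
        socketClient.send(sdp: localSdp)              // step 7: send the answer back
    }
}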

Because real-world network environments are messy and users often sit inside private networks, p2p transport runs into NAT and firewall obstacles. That is when STUN/TURN/ICE NAT-traversal techniques are used during the SDP handshake to make sure the p2p link can still be established.
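
Concretely, the STUN/TURN servers are handed to WebRTC through RTCConfiguration.iceServers when the peer connection is created. A sketch (the server URLs, username and credential below are placeholders, not ones used by this article):

let config = RTCConfiguration()
config.iceServers = [
    // STUN: lets each peer discover its public-facing address
    RTCIceServer(urlStrings: ["stun:stun.l.google.com:19302"]),
    // TURN: relays the media when a direct p2p path cannot be punched through
    RTCIceServer(urlStrings: ["turn:turn.example.com:3478"],
                 username: "user",
                 credential: "pass")
]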

1. Establish the persistent socket connection, laying the groundwork for the signaling that follows.

For the persistent connection to the server I went with a plain socket, using the third-party framework CocoaAsyncSocket; a WebSocket would work too, so it depends on your team's technology choices.

  • Below is the logic for establishing the socket connection and then the WebRTC connection. The socket part is very little code; the CocoaAsyncSocket README on GitHub covers it, so don't spend much time there. The real work is establishing the WebRTC connection. When exchanging data with the server, note that you may need a packet framing (split/merge) strategy; a sketch of one appears after the SocketClient code below.
  • Most of the code online is Objective-C, and much of it is outdated and scattered; the OC version is comparatively simple. What follows is the Swift version. Before reading it, please make sure you have studied the two sequence diagrams above.
// MARK: - Socket state delegate
protocol SocketClientDelegate: class {
    
    func signalClientDidConnect(_ signalClient: SocketClient)
    func signalClientDidDisconnect(_ signalClient: SocketClient)
    func signalClient(_ signalClient: SocketClient, didReceiveRemoteSdp sdp: RTCSessionDescription)
    func signalClient(_ signalClient: SocketClient, didReceiveCandidate candidate: RTCIceCandidate)
}

final class SocketClient: NSObject {
    
    //the underlying GCDAsyncSocket
    var socket: GCDAsyncSocket = {
       return GCDAsyncSocket.init()
    }()
    
    private var host: String? //server IP
    private var port: UInt16? //port
    weak var delegate: SocketClientDelegate?//delegate
    
    var receiveHeartBeatDuation = 0 //heartbeat tick counter
    let heartBeatOverTime = 10 //heartbeat timeout
    var sendHeartbeatTimer:Timer? //timer for sending heartbeats
    var receiveHeartbearTimer:Timer? //timer for receiving heartbeats

    //buffer for received data
    var dataBuffer:Data = Data.init()
    
    //our peer_id obtained at login
    var peer_id = 0
    //the remote device's peer_id obtained at login
    var remote_peer_id = 0

    // MARK:- Initialization
    init(hostStr: String , port: UInt16) {
        super.init()
        
        self.socket.delegate = self
        self.socket.delegateQueue = DispatchQueue.main
        self.host = hostStr
        self.port = port
        //start connecting the socket
        connect()
    }

    // MARK:- Connect
    func connect() {
        
        do {
            try  self.socket.connect(toHost: self.host ?? "", onPort: self.port ?? 6868, withTimeout: -1)
            
        }catch {
            print(error)
        }
    }
    
    // MARK:- Send a message
    func sendMessage(_ data: Data){
        self.socket.write(data, withTimeout: -1, tag: 0)
    }

    // MARK:- Send SDP offer/answer
    func send(sdp rtcSdp: RTCSessionDescription) {
        
        //convert to our own SDP wire model
        let type = rtcSdp.type
        var typeStr = ""
        switch type {
        case .answer:
            typeStr = "answer"
        case .offer:
            typeStr = "offer"
        default:
            print("sdpType错误")
        }
        let newSDP:SDPSocket = SDPSocket.init(sdp: rtcSdp.sdp, type: typeStr)
        let jsonInfo = newSDP.toJSON()
        let dic = ["sdp" : jsonInfo]
        let info:SocketInfo = SocketInfo.init(type: .sdp, source: self.peer_id, destination: self.remote_peer_id, params: dic as Dictionary<String, Any>)
        let data = self.packData(info: info)
        //print(data)
        self.sendMessage(data)
        print("发送SDP")
    }

    // MARK:- Send an ICE candidate
    func send(candidate rtcIceCandidate: RTCIceCandidate) {
        
        let iceCandidateMessage = IceCandidate_Socket(from: rtcIceCandidate)
        let jsonInfo = iceCandidateMessage.toJSON()
        let dic = ["icecandidate" : jsonInfo]
        let info:SocketInfo = SocketInfo.init(type: .icecandidate, source: self.peer_id, destination: self.remote_peer_id, params: dic as Dictionary<String, Any>)
        let data = self.packData(info: info)
        //print(data)
        self.sendMessage(data)
         print("发送ICE")
    }
}

extension SocketClient: GCDAsyncSocketDelegate {
    
    // MARK:- Socket connected
    func socket(_ sock: GCDAsyncSocket, didConnectToHost host: String, port: UInt16) {
        
        debugPrint("socket连接成功")
        self.delegate?.signalClientDidConnect(self)
        
        //log in to obtain our peer_id
        login()
        //start sending heartbeats
        startHeartbeatTimer()
        //start the timer that watches for incoming heartbeats
        startReceiveHeartbeatTimer()
        
        //keep reading data
        socket.readData(withTimeout: -1, tag: 0)
    }
    
    // MARK:- Receive data: the socket received a packet
    func socket(_ sock: GCDAsyncSocket, didRead data: Data, withTag tag: Int) {
        
        //debugPrint("socket接收到一个数据包")
        let _:SocketInfo? = self.unpackData(data)
        //let type:SigType = SigType(rawValue: socketInfo?.type ?? "")!
        //print(socketInfo ?? "")
        //print(type)

        //继续接收数据
        socket.readData(withTimeout: -1, tag: 0)
    }
    
    // MARK:- Disconnected
    func socketDidDisconnect(_ sock: GCDAsyncSocket, withError err: Error?) {
        
        debugPrint("socket断开连接")
        print(err ?? "")
        
        self.disconnectSocket()
        
        // try to reconnect after five seconds
        DispatchQueue.global().asyncAfter(deadline: .now() + 5) {
            debugPrint("Trying to reconnect to signaling server...")
            self.connect()
        }
    }

}
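
The packData(info:) and unpackData(_:) calls above belong to the demo project and aren't shown in this excerpt. For reference, a common framing strategy for a TCP signaling channel is a 4-byte big-endian length prefix in front of each JSON payload. A hypothetical sketch of that idea (the helper names frame/deframe are mine, not the demo's, and the demo's real wire protocol may differ):

extension SocketClient {

    // Wrap one encoded payload with a 4-byte big-endian length header before writing it to the socket.
    func frame(_ payload: Data) -> Data {
        var data = Data()
        data.append(contentsOf: [
            UInt8((payload.count >> 24) & 0xff),
            UInt8((payload.count >> 16) & 0xff),
            UInt8((payload.count >> 8) & 0xff),
            UInt8(payload.count & 0xff)
        ])
        data.append(payload)
        return data
    }

    // Accumulate incoming bytes in dataBuffer and peel off complete payloads one by one.
    func deframe(_ incoming: Data) -> [Data] {
        dataBuffer.append(incoming)
        var payloads: [Data] = []
        while dataBuffer.count >= 4 {
            let header = [UInt8](dataBuffer.prefix(4))
            let length = Int(header[0]) << 24 | Int(header[1]) << 16 | Int(header[2]) << 8 | Int(header[3])
            guard dataBuffer.count >= 4 + length else { break }   // wait for the rest of the frame
            payloads.append(dataBuffer.subdata(in: 4..<(4 + length)))
            dataBuffer = Data(dataBuffer.dropFirst(4 + length))
        }
        return payloads
    }
}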

2. Carry out the signaling and establish the end-to-end connection.


import Foundation
import WebRTC

// MARK: - WebRTC connection state delegate
protocol WebRTCClientDelegate: class {
    func webRTCClient(_ client: WebRTCClient, didDiscoverLocalCandidate candidate: RTCIceCandidate)
    func webRTCClient(_ client: WebRTCClient, didChangeConnectionState state: RTCIceConnectionState)
    func webRTCClient(_ client: WebRTCClient, didReceiveData data: Data)
}

final class WebRTCClient: NSObject {

    // MARK:- Lazily created factory
    private static let factory: RTCPeerConnectionFactory = {
        RTCInitializeSSL()
        let videoEncoderFactory = RTCVideoEncoderFactoryH264()
        let videoDecoderFactory = RTCVideoDecoderFactoryH264()
        let factory = RTCPeerConnectionFactory(encoderFactory: videoEncoderFactory, decoderFactory: videoDecoderFactory)

//        let options = RTCPeerConnectionFactoryOptions()
//        options.ignoreVPNNetworkAdapter = true
//        options.ignoreWiFiNetworkAdapter = true
//        options.ignoreCellularNetworkAdapter = true
//        options.ignoreEthernetNetworkAdapter = true
//        options.ignoreLoopbackNetworkAdapter = true
//        factory.setOptions(options)
        return factory
    }()
    
    weak var delegate: WebRTCClientDelegate?
    
    private let peerConnection: RTCPeerConnection
    private let rtcAudioSession =  RTCAudioSession.sharedInstance()
    private let audioQueue = DispatchQueue(label: "audio")
    private let mediaConstrains = [kRTCMediaConstraintsOfferToReceiveAudio: kRTCMediaConstraintsValueTrue,
                                   kRTCMediaConstraintsOfferToReceiveVideo: kRTCMediaConstraintsValueTrue]    
    private var videoCapturer: RTCVideoCapturer?
    private var localVideoTrack: RTCVideoTrack?
    private var remoteVideoTrack: RTCVideoTrack?
    private var localDataChannel: RTCDataChannel?
    private var remoteDataChannel: RTCDataChannel?

    @available(*, unavailable)
    override init() {
        fatalError("WebRTCClient:init is unavailable")
    }
    
    required init(iceServers: [String]) {

//        // gatherContinually will let WebRTC to listen to any network changes and send any new candidates to the other client
//        config.continualGatheringPolicy = .gatherContinually

        //config.iceTransportPolicy = .all
        //constraints: control the MediaStream contents (media type, resolution, frame rate)
//        let constraints = RTCMediaConstraints(mandatoryConstraints: nil,
//                                              optionalConstraints: ["DtlsSrtpKeyAgreement":kRTCMediaConstraintsValueTrue])
        
        let config = RTCConfiguration()
        config.iceServers = [RTCIceServer(urlStrings: iceServers)]
        // Unified Plan is preferred over Plan B
        config.sdpSemantics = .unifiedPlan
        //constraints: control the MediaStream contents (media type, resolution, frame rate)
        let mediaConstraints = RTCMediaConstraints.init(mandatoryConstraints: nil, optionalConstraints: nil)
        self.peerConnection = WebRTCClient.factory.peerConnection(with: config, constraints: mediaConstraints, delegate: nil)
        
        super.init()
        self.createMediaSenders()
        self.configureAudioSession()
        self.peerConnection.delegate = self
    }
    
    
    // MARK: Hang up
    func disconnect(){
        self.peerConnection.close()
    }
    
    // MARK: Signaling: create the local SDP offer to send to the socket server
    func offer(completion: @escaping (_ sdp: RTCSessionDescription) -> Void) {
        let constrains = RTCMediaConstraints(mandatoryConstraints: self.mediaConstrains,
                                             optionalConstraints: nil)
        self.peerConnection.offer(for: constrains) { (sdp, error) in
            guard let sdp = sdp else {
                return
            }
            
            self.peerConnection.setLocalDescription(sdp, completionHandler: { (error) in
                completion(sdp)
            })
        }
    }
    // MARK:- Reply to the socket server with an SDP answer
    func answer(completion: @escaping (_ sdp: RTCSessionDescription) -> Void)  {
        let constrains = RTCMediaConstraints(mandatoryConstraints: self.mediaConstrains,
                                             optionalConstraints: nil)
        //create the local SDP
        self.peerConnection.answer(for: constrains) { (sdp, error) in
            guard let sdp = sdp else {
                return
            }
            //set the local SDP
            self.peerConnection.setLocalDescription(sdp, completionHandler: { (error) in
                //hand the SDP to the caller to send out
                completion(sdp)
            })
        }
    }
    // MARK:- Set the remote SDP
    func set(remoteSdp: RTCSessionDescription, completion: @escaping (Error?) -> ()) {
        self.peerConnection.setRemoteDescription(remoteSdp, completionHandler: completion)
    }
    
    // MARK:- Add a remote candidate
    func set(remoteCandidate: RTCIceCandidate) {
        self.peerConnection.add(remoteCandidate)
    }
    
    // MARK: Media
    func startCaptureLocalVideo(renderer: RTCVideoRenderer) {
        guard let capturer = self.videoCapturer as? RTCCameraVideoCapturer else {
            return
        }

        guard
            //use the front camera; use .back for the rear camera
            let frontCamera = (RTCCameraVideoCapturer.captureDevices().first { $0.position == .front }),
            // choose highest res
            let format = (RTCCameraVideoCapturer.supportedFormats(for: frontCamera).sorted { (f1, f2) -> Bool in
                let width1 = CMVideoFormatDescriptionGetDimensions(f1.formatDescription).width
                let width2 = CMVideoFormatDescriptionGetDimensions(f2.formatDescription).width
                return width1 < width2
            }).last,
        
            // choose highest fps
            let fps = (format.videoSupportedFrameRateRanges.sorted { return $0.maxFrameRate < $1.maxFrameRate }.last) else {
            return
        }

        capturer.startCapture(with: frontCamera,
                              format: format,
                              fps: Int(fps.maxFrameRate))
        
        self.localVideoTrack?.add(renderer)
    }
    
    func renderRemoteVideo(to renderer: RTCVideoRenderer) {

        self.remoteVideoTrack?.add(renderer)
    }
    
    private func configureAudioSession() {
        self.rtcAudioSession.lockForConfiguration()
        do {
            try self.rtcAudioSession.setCategory(AVAudioSession.Category.playAndRecord.rawValue)
            try self.rtcAudioSession.setMode(AVAudioSession.Mode.voiceChat.rawValue)
        } catch let error {
            debugPrint("Error changeing AVAudioSession category: \(error)")
        }
        self.rtcAudioSession.unlockForConfiguration()
    }
    
    // MARK:- Create the media senders
    private func createMediaSenders() {
        let streamId = "stream"
        
        // Audio
        let audioTrack = self.createAudioTrack()
        self.peerConnection.add(audioTrack, streamIds: [streamId])
        
        // Video
        let videoTrack = self.createVideoTrack()
        self.localVideoTrack = videoTrack
        self.peerConnection.add(videoTrack, streamIds: [streamId])
        self.remoteVideoTrack = self.peerConnection.transceivers.first { $0.mediaType == .video }?.receiver.track as? RTCVideoTrack

        //samadd
        //self.remoteVideoTrack?.source.adaptOutputFormat(toWidth: 960, height: 480, fps: 30)

        // Data
        if let dataChannel = createDataChannel() {
            dataChannel.delegate = self
            self.localDataChannel = dataChannel
        }
    }
    
    // MARK:- Create the audio track
    private func createAudioTrack() -> RTCAudioTrack {
        let audioConstrains = RTCMediaConstraints(mandatoryConstraints: nil, optionalConstraints: nil)
        let audioSource = WebRTCClient.factory.audioSource(with: audioConstrains)
        let audioTrack = WebRTCClient.factory.audioTrack(with: audioSource, trackId: "audio0")
        return audioTrack
    }
    
    // MARK:- Create the video track
    private func createVideoTrack() -> RTCVideoTrack {
        let videoSource = WebRTCClient.factory.videoSource()
        
        #if targetEnvironment(simulator)
        self.videoCapturer = RTCFileVideoCapturer(delegate: videoSource)
        #else
        self.videoCapturer = RTCCameraVideoCapturer(delegate: videoSource)
        #endif
        
        let videoTrack = WebRTCClient.factory.videoTrack(with: videoSource, trackId: "video0")
        return videoTrack
    }
    
    // MARK: Data Channels
    // MARK:- Create the data channel
    private func createDataChannel() -> RTCDataChannel? {
        let config = RTCDataChannelConfiguration()
        guard let dataChannel = self.peerConnection.dataChannel(forLabel: "WebRTCData", configuration: config) else {
            debugPrint("Warning: Couldn't create data channel.")
            return nil
        }
        return dataChannel
    }
    
    // MARK:- Send data over the data channel
    func sendData(_ data: Data) {
        let buffer = RTCDataBuffer(data: data, isBinary: true)
        self.remoteDataChannel?.sendData(buffer)
    }
}

// MARK:- Audio control
extension WebRTCClient {
    func muteAudio() {
        self.setAudioEnabled(false)
    }
    
    func unmuteAudio() {
        self.setAudioEnabled(true)
    }
    
    // Fallback to the default playing device: headphones/bluetooth/ear speaker
    func speakerOff() {
        self.audioQueue.async { [weak self] in
            guard let self = self else {
                return
            }
            
            self.rtcAudioSession.lockForConfiguration()
            do {
                try self.rtcAudioSession.setCategory(AVAudioSession.Category.playAndRecord.rawValue)
                try self.rtcAudioSession.overrideOutputAudioPort(.none)
            } catch let error {
                debugPrint("Error setting AVAudioSession category: \(error)")
            }
            self.rtcAudioSession.unlockForConfiguration()
        }
    }
    
    // Force speaker
    func speakerOn() {
        self.audioQueue.async { [weak self] in
            guard let self = self else {
                return
            }
            
            self.rtcAudioSession.lockForConfiguration()
            do {
                try self.rtcAudioSession.setCategory(AVAudioSession.Category.playAndRecord.rawValue)
                try self.rtcAudioSession.overrideOutputAudioPort(.speaker)
                try self.rtcAudioSession.setActive(true)
            } catch let error {
                debugPrint("Couldn't force audio to speaker: \(error)")
            }
            self.rtcAudioSession.unlockForConfiguration()
        }
    }
    
    private func setAudioEnabled(_ isEnabled: Bool) {
        let audioTracks = self.peerConnection.transceivers.compactMap { return $0.sender.track as? RTCAudioTrack }
        audioTracks.forEach { $0.isEnabled = isEnabled }
    }
}

extension WebRTCClient: RTCDataChannelDelegate {
    func dataChannelDidChangeState(_ dataChannel: RTCDataChannel) {
        debugPrint("dataChannel did change state: \(dataChannel.readyState)")
    }
    
    func dataChannel(_ dataChannel: RTCDataChannel, didReceiveMessageWith buffer: RTCDataBuffer) {
        self.delegate?.webRTCClient(self, didReceiveData: buffer.data)
    }
}
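
One piece the excerpt above does not show is the RTCPeerConnectionDelegate conformance that the line self.peerConnection.delegate = self relies on. Based on the callbacks WebRTCClientDelegate already forwards, a minimal version would look roughly like this (a sketch; callbacks this article doesn't use are left empty):

extension WebRTCClient: RTCPeerConnectionDelegate {

    func peerConnection(_ peerConnection: RTCPeerConnection, didChange stateChanged: RTCSignalingState) {
        debugPrint("peerConnection new signaling state: \(stateChanged)")
    }

    func peerConnection(_ peerConnection: RTCPeerConnection, didAdd stream: RTCMediaStream) {}

    func peerConnection(_ peerConnection: RTCPeerConnection, didRemove stream: RTCMediaStream) {}

    func peerConnectionShouldNegotiate(_ peerConnection: RTCPeerConnection) {}

    // ICE connection state changes are forwarded so WebRTCManager can react to connected/failed.
    func peerConnection(_ peerConnection: RTCPeerConnection, didChange newState: RTCIceConnectionState) {
        self.delegate?.webRTCClient(self, didChangeConnectionState: newState)
    }

    func peerConnection(_ peerConnection: RTCPeerConnection, didChange newState: RTCIceGatheringState) {}

    // Locally gathered candidates are handed to the delegate so it can push them over the socket.
    func peerConnection(_ peerConnection: RTCPeerConnection, didGenerate candidate: RTCIceCandidate) {
        self.delegate?.webRTCClient(self, didDiscoverLocalCandidate: candidate)
    }

    func peerConnection(_ peerConnection: RTCPeerConnection, didRemove candidates: [RTCIceCandidate]) {}

    // Keep a reference to the data channel opened by the remote side so sendData(_:) can use it.
    func peerConnection(_ peerConnection: RTCPeerConnection, didOpen dataChannel: RTCDataChannel) {
        self.remoteDataChannel = dataChannel
    }
}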

3. Wrap and manage the WebRTC module

import Foundation
import AVFoundation
import WebRTC

// MARK:- Video-link connection state
public enum RtcConnectedState {
    
    case sucessed //connected successfully
    case falure   //connection failed
    case connecting //connecting
}


protocol WebRTCManagerDelegate: class {

    //whether the socket is connected
    func webRTCManager(_ manager: WebRTCManager, socketConnectState isSucessed: Bool)
    
    //WebRTC connection state
    func webRTCManager(_ manager: WebRTCManager, didChangeConnectionState state: RTCIceConnectionState)
}

class WebRTCManager {
    
    static let shareInstance:WebRTCManager  = WebRTCManager()
    
    //private let signalClient: SignalingClient
    var signalClient: SocketClient?
    var webRTCClient: WebRTCClient?

    ///pass in the config when initializing
    var sockitConfig: SocketConfig =  SocketConfig.default
    
    //delegate
    weak var delegate: WebRTCManagerDelegate?
    
    var remoteCandidate: Int = 0
    ///callback invoked once the RTC connection succeeds
    var feedbackConnectedBlock: ((_ webClient: WebRTCClient)->())?

    // MARK:- Disconnect the socket connection
    public func disconnect(){
        
        self.signalClient?.disconnectSocket()
        
        self.webRTCClient?.disconnect()
        self.signalClient?.delegate = nil
        self.webRTCClient?.delegate = nil
        self.signalClient = nil
        self.webRTCClient = nil
        remoteCandidate = 0
    }
    // MARK:- Start connecting the socket
    public func connect(){

        //print RTC logs
        //RTCSetMinDebugLogLevel(.verbose)
        //let log = RTCFileLogger.init()
        //log.start()
        
        //create the socket and RTC objects
        signalClient = SocketClient.init(hostStr: sockitConfig.host, port: sockitConfig.port)
        webRTCClient = WebRTCClient(iceServers: sockitConfig.webRTCIceServers)
        webRTCClient?.delegate = self
        signalClient?.delegate = self

        self.signalClient?.connect()
    }
    
}

extension WebRTCManager: SocketClientDelegate {
    
    //socket login succeeded
    func signalClientdidLogin(_ signalClient: SocketClient) {
        logger.info("********socket login succeeded************************")
    }
    
    // MARK:- Socket connected
    func signalClientDidConnect(_ signalClient: SocketClient) {

        self.delegate?.webRTCManager(self, socketConnectState: true)
    }

    // MARK:- Socket disconnected
    func signalClientDidDisconnect(_ signalClient: SocketClient) {
        
        self.delegate?.webRTCManager(self, socketConnectState: false)
    }
    
    // MARK:- Received the remote SDP
    func signalClient(_ signalClient: SocketClient, didReceiveRemoteSdp sdp: RTCSessionDescription) {
        logger.info("************received remote SDP****************************")
        
        //apply the remote SDP, then answer (this is the callee side of the handshake)
        self.webRTCClient?.set(remoteSdp: sdp) { (error) in
            
            self.webRTCClient?.answer { (localSdp) in
                
                self.signalClient?.send(sdp: localSdp)
            }
            
            logger.error(error.debugDescription)
        }
        
    }
    // MARK:- Received a remote ICE candidate
    func signalClient(_ signalClient: SocketClient, didReceiveCandidate candidate: RTCIceCandidate) {
        logger.info("************received remote ICE candidate****************************")
        
        self.remoteCandidate += 1
        //add the remote ICE candidate
        self.webRTCClient?.set(remoteCandidate: candidate)
    }
}

extension WebRTCManager: WebRTCClientDelegate {
    
    // MARK:- Discovered a local ICE candidate
    func webRTCClient(_ client: WebRTCClient, didDiscoverLocalCandidate candidate: RTCIceCandidate) {
        logger.info("********************************discovered local ICE candidate**********")
        
        self.signalClient?.send(candidate: candidate)
    }
    // MARK:- RTC connection state changed
    func webRTCClient(_ client: WebRTCClient, didChangeConnectionState state: RTCIceConnectionState) {
        
        self.delegate?.webRTCManager(self, didChangeConnectionState: state)
        
        switch state {
        case .connected, .completed:
            
            logger.info("*********RTC连接状态成功*****************************************")
            if let block = feedbackConnectedBlock {
                block(client)
            }
            
        case .disconnected:
            logger.info("*********RTC失去连接*****************************************")
            
        case .failed, .closed:
            
            logger.info("*********RTC连接失败*****************************************")
        case .new, .checking, .count: break
            
        @unknown default: break
            
        }

    }
    // MARK:- Received data over the RTC data channel
    func webRTCClient(_ client: WebRTCClient, didReceiveData data: Data) {
//        DispatchQueue.main.async {
//            let message = String(data: data, encoding: .utf8) ?? "(Binary: \(data.count) bytes)"
//            let alert = UIAlertController(title: "Message from WebRTC", message: message, preferredStyle: .alert)
//            alert.addAction(UIAlertAction(title: "OK", style: .cancel, handler: nil))
//            self.present(alert, animated: true, completion: nil)
//        }
    }
}
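
To tie the three layers together, the answering side can drive everything from a view controller along these lines (a sketch; RTCMTLVideoView is one possible renderer from the WebRTC framework, and layout code is omitted):

import UIKit
import WebRTC

class LiveViewController: UIViewController, WebRTCManagerDelegate {

    private let localRenderer = RTCMTLVideoView(frame: .zero)
    private let remoteRenderer = RTCMTLVideoView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()

        let manager = WebRTCManager.shareInstance
        manager.delegate = self
        // Once ICE reports connected/completed, attach the renderers to the local and remote tracks.
        manager.feedbackConnectedBlock = { [weak self] client in
            DispatchQueue.main.async {
                guard let self = self else { return }
                client.startCaptureLocalVideo(renderer: self.localRenderer)
                client.renderRemoteVideo(to: self.remoteRenderer)
            }
        }
        manager.connect()
    }

    func webRTCManager(_ manager: WebRTCManager, socketConnectState isSucessed: Bool) {
        debugPrint("socket connected: \(isSucessed)")
    }

    func webRTCManager(_ manager: WebRTCManager, didChangeConnectionState state: RTCIceConnectionState) {
        debugPrint("ICE connection state: \(state)")
    }
}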

Continuously being updated...
If you have any questions, you can reach me on QQ: 506299396