iOS RTMP Live Streaming Development Notes (1) – Capturing Camera Video

2018-09-04  半岛夏天
1. Capturing Video from the Camera Hardware

Here is a quick walkthrough of camera capture on iOS.
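
Before touching the camera, the app needs permission: add an NSCameraUsageDescription entry to Info.plist (required since iOS 10) and request access at runtime. A minimal check, with real error handling left out:

// Ask for camera permission before configuring the capture session
[AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
    if (!granted) {
        NSLog(@"Camera access denied; capture will not work");
    }
}];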

First, initialize an AVCaptureSession. Speaking of sessions, does AVAudioSession come to mind?

// Initialize the AVCaptureSession
_session = [[AVCaptureSession alloc] init];
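
The session also determines the capture resolution, via its sessionPreset. This step is optional, and 1280x720 here is just one choice:

// Pick a capture resolution; the session keeps its default preset if this one is unsupported
if ([_session canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
    _session.sessionPreset = AVCaptureSessionPreset1280x720;
}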

Next, set up the Video and Audio capture formats. These are configured separately, which means you can capture video only.


// Configure the capture input (the camera)
NSError *error = nil;
// Get a capture device, e.g. the front or back camera
AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
// Wrap the device in a capture input object
AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
if (error) {
    NSLog(@"Error getting video input device: %@", error.description);
}
if ([_session canAddInput:videoInput]) {
    [_session addInput:videoInput]; // Add it to the session
}

// Configure the capture output, i.e. the interface through which we receive video frames
_videoQueue = dispatch_queue_create("Video Capture Queue", DISPATCH_QUEUE_SERIAL);
_videoOutput = [[AVCaptureVideoDataOutput alloc] init];
[_videoOutput setSampleBufferDelegate:self queue:_videoQueue];

// Configure the output pixel format
NSDictionary *captureSettings = @{(NSString *)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA)};

_videoOutput.videoSettings = captureSettings;
_videoOutput.alwaysDiscardsLateVideoFrames = YES;
if ([_session canAddOutput:_videoOutput]) {
    [_session addOutput:_videoOutput];  // Add it to the session
}

// Keep the connection around so the sample buffer delegate can tell
// whether a given buffer is video or audio
_videoConnection = [_videoOutput connectionWithMediaType:AVMediaTypeVideo];
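
The delegate below also checks an _audioConnection. Audio capture isn't covered in this post, but it mirrors the video path; here is a sketch, assuming _audioQueue and _audioOutput ivars analogous to the video ones (and an NSMicrophoneUsageDescription entry in Info.plist):

// Audio input (microphone) and output, mirroring the video setup above
NSError *audioError = nil;
AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&audioError];
if ([_session canAddInput:audioInput]) {
    [_session addInput:audioInput];
}
_audioQueue = dispatch_queue_create("Audio Capture Queue", DISPATCH_QUEUE_SERIAL);
_audioOutput = [[AVCaptureAudioDataOutput alloc] init];
[_audioOutput setSampleBufferDelegate:self queue:_audioQueue];
if ([_session canAddOutput:_audioOutput]) {
    [_session addOutput:_audioOutput];
}
_audioConnection = [_audioOutput connectionWithMediaType:AVMediaTypeAudio];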

Implement the sample buffer delegate (AVCaptureVideoDataOutputSampleBufferDelegate; the audio output's delegate protocol shares the same callback):


- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // sampleBuffer is the captured data, but whether it is video or audio
    // has to be determined from the connection it arrived on
    if (connection == _videoConnection) {  // Video
        /*
        // Read the current video dimensions
        CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        size_t width = CVPixelBufferGetWidth(pixelBuffer);
        size_t height = CVPixelBufferGetHeight(pixelBuffer);
        NSLog(@"video width: %zu  height: %zu", width, height);
        */
        NSLog(@"Got a video sampleBuffer here; process it further (encode to H.264)");
    } else if (connection == _audioConnection) {  // Audio
        NSLog(@"Got an audio sampleBuffer here; process it further (encode to AAC)");
    }
}
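
As a first step toward processing, this is one way to read the raw BGRA bytes out of a video sample buffer (a sketch; the buffer must be locked before its memory is touched):

// Access the raw pixel data of a captured frame (32BGRA as configured above)
CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(pixelBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
// ... hand baseAddress / bytesPerRow to the processing or encoding code ...
CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);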

Encoding these buffers to H.264 and AAC in real time are two further technical topics, which I'll cover later.
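
One note on the pixel format chosen above: 32BGRA is convenient for CPU-side processing, but hardware H.264 encoders generally consume biplanar YUV (NV12), so capturing in that format directly can save a color conversion. An alternative setting, if your pipeline takes YUV:

// Capture NV12 directly instead of BGRA (useful when feeding VideoToolbox)
_videoOutput.videoSettings = @{(NSString *)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)};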

Configuration is done; now start the session:

// Start the session
[_session startRunning];
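
One caveat: startRunning is a blocking call that can take noticeable time, and Apple recommends invoking it off the main thread:

// startRunning blocks until capture actually starts; keep it off the main thread
dispatch_async(dispatch_get_global_queue(QOS_CLASS_USER_INITIATED, 0), ^{
    [self->_session startRunning];
});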

1.1 Bonus Task: Displaying the Captured Camera Video On Screen

Very simple: on the sending side, just display it with Apple's own AVCaptureVideoPreviewLayer.

_previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:_session];
_previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; // How the preview scales the video
[[_previewLayer connection] setVideoOrientation:AVCaptureVideoOrientationPortrait]; // Video orientation for the preview
 
_previewLayer.frame = self.view.layer.bounds;
[self.view.layer addSublayer:_previewLayer];

Adding this layer to the view hierarchy, as the last two lines do, is all it takes to display the preview.

There is one small gotcha with capture: configuring the size and orientation of the captured image. For orientation, something like the following:

- (void)setupCaptureOrientation {
    if ([_previewLayer.connection isVideoOrientationSupported]) {
        AVCaptureVideoOrientation orientation;

        // Note: statusBarOrientation is deprecated since iOS 13; newer code
        // should read the interface orientation from the window scene instead.
        switch ([[UIApplication sharedApplication] statusBarOrientation]) {
            case UIInterfaceOrientationLandscapeLeft:
                orientation = AVCaptureVideoOrientationLandscapeLeft;
                break;
            case UIInterfaceOrientationLandscapeRight:
                orientation = AVCaptureVideoOrientationLandscapeRight;
                break;
            case UIInterfaceOrientationPortrait:
                orientation = AVCaptureVideoOrientationPortrait;
                break;
            case UIInterfaceOrientationPortraitUpsideDown:
                orientation = AVCaptureVideoOrientationPortraitUpsideDown;
                break;
            default:
                orientation = AVCaptureVideoOrientationLandscapeLeft;
                break;
        }

        [_previewLayer.connection setVideoOrientation:orientation];
    }
}
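
Note that this only rotates the preview. The sample buffers delivered to the delegate are unaffected; to rotate the captured frames themselves (and therefore the encoded stream), set the orientation on the data-output connection as well, e.g. inside the same method, using the _videoConnection saved earlier:

// Rotating the preview is cosmetic; rotate the captured frames too
if ([_videoConnection isVideoOrientationSupported]) {
    _videoConnection.videoOrientation = orientation;
}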

Reposted from Tony's blog
