Pulling the Stream

2017-07-24  吴霸格07

Pulling the stream from a network camera

1. Receiving video data

1.1 Structs

1.1.1 The basic coding unit: NALU

// A single NAL unit parsed out of the H.264 Annex-B byte stream
typedef struct _MP4ENC_NaluUnit
{
    int type;               // nal_unit_type (e.g. 0x07 SPS, 0x08 PPS, 0x05 IDR slice)
    int size;               // payload size in bytes (start code excluded)
    unsigned char *data;    // pointer to the payload
}MP4ENC_NaluUnit;

1.1.2 The time base: AVRational

/**
 * rational number numerator/denominator
 */
typedef struct AVRational{
    int num; ///< numerator
    int den; ///< denominator
} AVRational;

The AVRational struct represents a rational number: num is the numerator and den is the denominator.

In practice, time_base is the unit (tick) in which time is measured:
for example (1, 25) means a tick of 1/25 second, and (1, 90000) means a tick of 1/90000 second.
So time = 5 in the 1/25 time base becomes, in the 1/90000 time base, (5 × 1/25) / (1/90000) = 5 × 3600 = 18000.

FFmpeg performs this kind of conversion all over the place when computing pts.
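
A quick way to check the arithmetic above is libavutil's av_rescale_q(), which rescales a value from one time base to another. A small standalone sketch using the same numbers as the example (not code from this project):

    #include <libavutil/rational.h>
    #include <libavutil/mathematics.h>

    // time = 5 expressed in the 1/25 time base, converted to the 1/90000 time base
    int64_t t = av_rescale_q(5,
                             (AVRational){1, 25},       // source time base
                             (AVRational){1, 90000});   // destination time base
    // t == 18000, matching the hand calculation above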

A time_base appears in, among others, the following structures:
AVCodecContext: the encoder/decoder context.
AVStream: one track inside a file or other container.
If a codec produces a fixed-frame-rate stream,
the AVRational in AVCodecContext is set from the frame rate, e.g. at 25 fps num = 1, den = 25,
while the time_base in AVStream is usually set from its sampling clock, e.g. (1, 90000).
Whenever the PTS has to be computed, you therefore have to convert between two times, and you have to know which time_base to convert from and to:

Scenario 1: frames produced by an encoder are written directly into an AVStream of some container; the packet's time must be converted from the AVCodecContext time base to the destination AVStream time base.
Scenario 2: frames demuxed from a source AVStream in one container are written into a destination AVStream in another container; the time must be converted from the source AVStream time base to the destination AVStream time base.
The key is to understand which time base a frame's time is expressed in for each situation:
the time of a demuxed frame is relative to the source AVStream's time_base;
the time of a frame coming out of an encoder is relative to the AVCodecContext's time_base;
the time written when muxing into a file or other container is relative to the destination AVStream's time_base.

Here, "time" means the pts.
http://blog.csdn.net/peckjerry/article/details/48344389
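
For scenario 1 this conversion is typically done right before muxing. A minimal sketch (enc, st, fmt_ctx and pkt are placeholder names for the AVCodecContext, destination AVStream, AVFormatContext and AVPacket; this is not code from this project):

    // pkt->pts/dts/duration are currently in the encoder's time base (e.g. 1/25);
    // rescale them into the muxer stream's time base (e.g. 1/90000) before writing
    av_packet_rescale_ts(pkt, enc->time_base, st->time_base);
    pkt->stream_index = st->index;
    av_interleaved_write_frame(fmt_ctx, pkt);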

1.1.3 Recording file info: RecordInfo

typedef struct tag_RECORD_INFO
{
    int m_nCurPts;                  // pts at the current position (presentation time stamp:
                                    // when the decoded frame should be displayed)
    double m_dCurTime;              // time at the current position (unused)
    unsigned int m_nTotalTime;      // total duration
    unsigned int m_nLastTimestamp;  // timestamp of the previous position
}RecordInfo;

1.1.4 Tick counter

#include <sys/time.h>   // gettimeofday()

// millisecond tick counter
unsigned int _getTickCount() {
    
    struct timeval tv;
    // gettimeofday() returns the current wall-clock time with microsecond resolution:
    /*
     int gettimeofday(struct timeval *tv, struct timezone *tz);
     tv receives the time, tz receives the time zone:
     struct timezone {
         int tz_minuteswest;  // minutes west of Greenwich
         int tz_dsttime;      // type of DST correction
     };
     Pass NULL for tz if the time zone is not needed.
     */
    if (gettimeofday(&tv, NULL) != 0)
        return 0;
    
    // truncation to 32 bits is fine because the value is only used for differences
    return (unsigned int)(tv.tv_sec * 1000 + tv.tv_usec / 1000);
}
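
Since the return value is only ever used for differences, the unsigned wrap-around is harmless. A short usage sketch:

    unsigned int t0 = _getTickCount();
    // ... receive or decode one frame ...
    unsigned int elapsedMs = _getTickCount() - t0;   // unsigned subtraction survives wrap-around
    printf("frame took %u ms\n", elapsedMs);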

1.2 Properties

1.2.1 Properties of Mp4VideoRecorder

@interface Mp4VideoRecorder : NSObject
{
    bool m_bRecord;             // recording state
    
    NSString* m_strRecordFile;  // path of the recording file
    
    int m_nFileTotalSize;       // total file size
    
    RecordInfo m_stFrameInfo;   // video track info
    RecordInfo m_stAudioInfo;   // audio track info
    
    AVStream            *m_pVideoSt;    // video stream
    AVStream            *m_pAudioSt;    // audio stream
    AVFormatContext     *m_pFormatCtx;  // muxing context
    
    int                 m_nWidth;       // frame width
    int                 m_nHeight;      // frame height
    int                 m_nFrameRate;   // frame rate
    double              startTimeStamp; // tick count used as the timestamp origin
    
    int                 firstIFrame_states; // 1 once the first SPS/I-frame has been seen
    int                 startRecord_states;
    
    MP4_RecordAvccBox   m_avcCBox;      // cached SPS/PPS for the avcC box
    
    AACEncodeConfig*    g_aacEncodeConfig;
    
    int                 g_writeRemainSize;
    int                 g_bufferRemainSize;
    
    NSLock*             videoLock;
}

1.3 Core code: doRecvVideo

- (void)doRecvVideo
{

    
    int readSize = -1;
    // receive buffer for one H.264 frame from the camera
    char *recvBuf = malloc(RECV_VIDEO_BUFFER_SIZE);
    
    FRAMEINFO_t frmInfo = {0};

    // frame numbers used to detect dropped frames
    unsigned int frmNo = 0, prevFrmNo = 0x0FFFFFFF;

    int outBufSize = 0, outFrmSize = 0, outFrmInfoSize = 0;

    // H.264 decoder; decoded frames are delivered through updateDelegate
    H264Decoder *decoder = [[H264Decoder alloc] init];
    decoder.updateDelegate = self;
    
    // start of the current 1-second bitrate measuring window
    unsigned int timestamp = _getTickCount();
    
    // bytes received during the current window
    int videoBps = 0;
    
    if (sessionID >= 0 && avIndex >= 0)
    {
        // drop any video frames already buffered in the AV channel so that
        // receiving starts from fresh data
        avClientCleanVideoBuf(avIndex);

        while (isRunningRecvVideoThread)
        {
           
           
            unsigned int now = _getTickCount();
            
            // once per second, report the bytes received in the last window
            // (videoBps) to the delegate on the main queue, then reset the counter
            if (now - timestamp > 1000)
            {
                dispatch_async(dispatch_get_main_queue(), ^{
                    if (self.cameraDelegate && [self.cameraDelegate respondsToSelector:@selector(camera:frameInfoWithVideoBPS:)])
                    {
                        [self.cameraDelegate camera:self frameInfoWithVideoBPS:videoBps];
                    }
                });
                
                timestamp = now;
                videoBps = 0;
            }
            
            //usleep(1*1000);
            
  
            // pull one frame (and its FRAMEINFO_t header) out of the AV channel
            readSize = avRecvFrameData2(avIndex, recvBuf, RECV_VIDEO_BUFFER_SIZE, &outBufSize, &outFrmSize, (char *)&frmInfo, sizeof(frmInfo), &outFrmInfoSize, &frmNo);

            
            // buffer too small, out of memory, incomplete frame or frame lost:
            // skip this frame and keep receiving
            if (readSize == AV_ER_BUFPARA_MAXSIZE_INSUFF ||
                readSize == AV_ER_MEM_INSUFF ||
                readSize == AV_ER_INCOMPLETE_FRAME ||
                readSize == AV_ER_LOSED_THIS_FRAME)
            {
                continue;
            }
            else if (readSize == AV_ER_DATA_NOREADY)
            {
                // no frame available yet: back off for 10 ms
                usleep(10*1000);
                continue;
            }
            else if (readSize >= 0)
            {
                // accept the frame only if it is an I-frame or the next frame in
                // sequence; anything else means a frame was lost, so drop it
                if (frmInfo.flags == IPC_FRAME_FLAG_IFRAME || frmNo == (prevFrmNo + 1))
                {
                    prevFrmNo = frmNo;
                    if (frmInfo.codec_id == MEDIA_CODEC_VIDEO_H264)
                    {
                        //printf("RECV H.264 ======================size: %d\n",readSize);
                        // hand the raw H.264 frame to the MP4 recorder and to the decoder
                        [[Mp4VideoRecorder getInstance] writeFrameData:(unsigned char *)recvBuf withSize:readSize];
                        
                        [decoder DecodeH264Frames:(unsigned char*)recvBuf withLength:readSize];
                    }
                    videoBps += readSize;
                }
                else
                {
                    NSLog(@"\t[H264] Incorrect %@ frame no(%d), prev:%d -> drop frame", (frmInfo.flags == IPC_FRAME_FLAG_IFRAME ? @"I" : @"P"), frmNo, prevFrmNo);
                    usleep(1*1000);
                    continue;
                }

                
            }
            else
            {
                usleep(1*1000);
                continue;
                
            }


        }

    }
    else
    {
        free(recvBuf);
        
        decoder.updateDelegate=nil;
        [decoder release];
        
        printf("doRecvVideo: [sessionID < 0 || avIndex < 0]\n");
        return;
    }



    free(recvBuf);
    
    decoder.updateDelegate=nil;
    [decoder release];
    
    NSLog(@"\t=== RecvVideo Thread Exit ===\n");
    

}
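
doRecvVideo blocks in its while loop until isRunningRecvVideoThread is cleared, so it has to run off the main thread. A minimal sketch of how it might be driven (the startRecvVideo/stopRecvVideo method names are assumptions, not part of the original code):

    - (void)startRecvVideo          // hypothetical helper, not in the original source
    {
        isRunningRecvVideoThread = true;
        // run the blocking receive loop on its own thread
        [NSThread detachNewThreadSelector:@selector(doRecvVideo)
                                 toTarget:self
                               withObject:nil];
    }
    
    - (void)stopRecvVideo           // hypothetical helper, not in the original source
    {
        // the loop checks this flag on every iteration, then frees recvBuf,
        // releases the decoder and exits
        isRunningRecvVideoThread = false;
    }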

2. Writing the video

- (void)writeFrameData: (unsigned char *)pszData withSize: (int)aSize
{
    if (!m_bRecord)
    {
        return;     // not currently recording
    }
    if (aSize <= 4)
    {
        return;     // too small to contain a start code plus a NALU
    }
    
    
    MP4ENC_NaluUnit nalu;
    memset(&nalu, 0, sizeof(nalu));
    
    int len = 0;
    int pos = 0;
    
    // reset the cached SPS/PPS before re-parsing them out of this buffer
    // (clearing the whole struct also clears spsBuffer/ppsBuffer)
    memset(&m_avcCBox, 0, sizeof(m_avcCBox));
    
    // walk the Annex-B buffer one NALU at a time
    while ((len = [self readOneNaluFromBuffer:pszData withSize:aSize withOffSet:pos nalUnit:&nalu]))
    {
    {
        if (nalu.type == 0x07)          // SPS
        {
            if (nalu.size > 0) {
                memcpy(m_avcCBox.spsBuffer, nalu.data, nalu.size);
                m_avcCBox.sps_length = nalu.size;
            }
            firstIFrame_states = 1;     // an SPS marks the start of the first I-frame
        }
        
        if (nalu.type == 0x08)          // PPS
        {
            if (nalu.size > 0) {
                memcpy(m_avcCBox.ppsBuffer, nalu.data, nalu.size);
                m_avcCBox.pps_length = nalu.size;
            }
        }
        
        // wait for the first SPS/I-frame; until then skip every NALU and keep
        // moving the timestamp origin forward
        if (firstIFrame_states != 1) {
            printf("NO_SPS_RETURN: \n");
            startTimeStamp = [self _GetTickCount];
            pos += len;
            continue;
        }
        
        if (nalu.type == 0x05 || nalu.type == 0x01)   // IDR slice (I-frame) or non-IDR slice (P-frame)
        {
            int keyFlag = (nalu.type == 0x05) ? 1 : 0;
            
            FrameData *pFrameData = (FrameData *)malloc(sizeof(FrameData));
            memset(pFrameData, 0, sizeof(FrameData));
            
            // In MP4, every NALU is stored with a 4-byte big-endian length prefix
            // instead of the Annex-B start code
            int datalen = nalu.size + 4;
            unsigned char *pData = (unsigned char *)malloc(datalen * sizeof(unsigned char));
            pData[0] = nalu.size >> 24;
            pData[1] = nalu.size >> 16;
            pData[2] = nalu.size >> 8;
            pData[3] = nalu.size & 0xff;
            memcpy(pData + 4, nalu.data, nalu.size);
            
            pFrameData->m_nSize = datalen;
            memcpy(pFrameData->m_pszData, pData, datalen);
            
            pFrameData->m_nCodecID = CODEC_ID_H264;
            // timestamp in milliseconds relative to the first recorded frame
            pFrameData->m_nTimestamp = [self _GetTickCount] - startTimeStamp;
            
            [self WriteVideoSt:pFrameData withKeyFlags:keyFlag];
            
            free(pFrameData);
            free(pData);
        }
        pos += len;
        
    }
    
    
}
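
The method readOneNaluFromBuffer:withSize:withOffSet:nalUnit: is not shown in the article. For reference, here is a minimal sketch of what such an Annex-B parser typically looks like (an assumption, not the author's implementation): it finds the next 00 00 01 / 00 00 00 01 start code, fills the MP4ENC_NaluUnit with the NALU type, payload pointer and size, and returns the number of bytes consumed so the caller can advance pos.

    // NOTE: a sketch matching the call site above, not the original implementation
    - (int)readOneNaluFromBuffer:(unsigned char *)buffer
                        withSize:(int)size
                      withOffSet:(int)offset
                         nalUnit:(MP4ENC_NaluUnit *)nalu
    {
        int i = offset;
        // locate the start code that begins this NALU
        while (i + 3 < size)
        {
            if (buffer[i] == 0 && buffer[i+1] == 0 &&
                (buffer[i+2] == 1 || (buffer[i+2] == 0 && buffer[i+3] == 1)))
                break;
            i++;
        }
        if (i + 3 >= size)
            return 0;                               // no start code left
        
        int startCodeLen = (buffer[i+2] == 1) ? 3 : 4;
        int payloadStart = i + startCodeLen;
        if (payloadStart >= size)
            return 0;                               // start code with no payload at the end
        
        // locate the start code of the next NALU (or stop at the end of the buffer)
        int j = payloadStart;
        while (j + 3 < size)
        {
            if (buffer[j] == 0 && buffer[j+1] == 0 &&
                (buffer[j+2] == 1 || (buffer[j+2] == 0 && buffer[j+3] == 1)))
                break;
            j++;
        }
        int payloadEnd = (j + 3 < size) ? j : size;
        
        nalu->type = buffer[payloadStart] & 0x1F;   // nal_unit_type: low 5 bits of the header byte
        nalu->data = buffer + payloadStart;         // points into the caller's buffer, no copy
        nalu->size = payloadEnd - payloadStart;
        
        // bytes consumed relative to offset: everything up to the next start code
        return payloadEnd - offset;
    }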
