OpenCV on iOS: Remapping

2019-11-14 · 充满活力的早晨


Goal

This tutorial shows you how to use the OpenCV function remap to implement a simple remapping.

Theory

What does remapping mean?

Remapping takes the pixel at one position in an image and places it at a different position in a new image:

g(x,y) = f( h(x,y) )

Here g() is the destination image, f() is the source image, and h(x,y) is the mapping function applied to (x,y).

Explanation: suppose the image is 300 × 300, so I.cols = 300. With the mapping h(x,y) = ( I.cols - x, y ), the term I.cols - x is simply 300 - x, which means the pixel that used to sit at column x ends up at column 300 - x.

What happens? The image is flipped along the x direction. For example, given the source image below:



Notice how the red circle's position changes with respect to x (a horizontal flip):
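Before walking through the full iOS view controller, here is a minimal standalone C++ sketch of exactly this horizontal flip (an illustration, not taken from the original post; the image file name is assumed):

#include <opencv2/opencv.hpp>
using namespace cv;

int main() {
    Mat src = imread("dog.jpg");                  // assumed test image
    Mat map_x(src.size(), CV_32FC1);
    Mat map_y(src.size(), CV_32FC1);
    for (int j = 0; j < src.rows; j++) {
        for (int i = 0; i < src.cols; i++) {
            map_x.at<float>(j, i) = src.cols - i; // h(x,y).x = I.cols - x
            map_y.at<float>(j, i) = j;            // y stays the same
        }
    }
    Mat dst;
    remap(src, dst, map_x, map_y, INTER_LINEAR, BORDER_CONSTANT, Scalar(0, 0, 0));
    imwrite("dog_flipped.jpg", dst);              // dst is src mirrored left-right
    return 0;
}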

Code

#ifdef __cplusplus
#import <opencv2/opencv.hpp>
#import <opencv2/imgcodecs/ios.h>
#import <opencv2/imgproc.hpp>
#import <opencv2/highgui.hpp>
#import <opencv2/core/operations.hpp>

#import <opencv2/core/core_c.h>
using namespace cv;
using namespace std;

#endif
#import "RemapViewController.h"

@interface RemapViewController ()

@end

@implementation RemapViewController

Mat src, dst;
Mat map_x, map_y;
 int ind = 0;
- (void)viewDidLoad {
    [super viewDidLoad];

    UIImage * src1Image = [UIImage imageNamed:@"dog.jpg"];
     src  = [self cvMatFromUIImage:src1Image];
    UIImageView *imageView;
    imageView = [self createImageViewInRect:CGRectMake(0, 100, 150, 150)];
    [self.view addSubview:imageView];
    imageView.image  = [self UIImageFromCVMat:src];
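    // dst matches src in size and type; map_x / map_y will hold, for every
    // destination pixel, the x / y coordinate in src to sample (one float channel each)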
    dst.create( src.size(), src.type() );
    map_x.create( src.size(), CV_32FC1 );
    map_y.create( src.size(), CV_32FC1 );
    
    [self createTimer:1 exeBlock:^{
        [self update_map];
        remap( src, dst, map_x, map_y, INTER_LINEAR, BORDER_CONSTANT, Scalar(0, 0, 0) );
        UIImageView *imageView;
        imageView = [self createImageViewInRect:CGRectMake(0, 250, 150, 150)];
        [self.view addSubview:imageView];
        imageView.image  = [self UIImageFromCVMat:dst];
    }];
}
-(void)update_map{
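    // Rebuild map_x / map_y according to the current value of ind (0...3);
    // the timer block above then calls remap with the freshly filled maps.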
    ind = ind%4;

      for( int j = 0; j < src.rows; j++ )
      { for( int i = 0; i < src.cols; i++ )
          {
            switch( ind )
            {
              case 0:
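                // a) shrink the image to half size and show it in the center;
                // everything outside the central window maps to (0,0)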
                if( i > src.cols*0.25 && i < src.cols*0.75 && j > src.rows*0.25 && j < src.rows*0.75 )
                  {
                    map_x.at<float>(j,i) = 2*( i - src.cols*0.25 ) + 0.5 ;
                    map_y.at<float>(j,i) = 2*( j - src.rows*0.25 ) + 0.5 ;
                   }
                else
                  { map_x.at<float>(j,i) = 0 ;
                    map_y.at<float>(j,i) = 0 ;
                  }
                    break;
              case 1:
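                    // b) flip the image upside down (reflect across the horizontal axis)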
                    map_x.at<float>(j,i) = i ;
                    map_y.at<float>(j,i) = src.rows - j ;
                    break;
              case 2:
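                    // c) mirror the image left to right (reflect across the vertical axis)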
                    map_x.at<float>(j,i) = src.cols - i ;
                    map_y.at<float>(j,i) = j ;
                    break;
              case 3:
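                    // d) combine b) and c): equivalent to rotating the image 180 degrees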
                    map_x.at<float>(j,i) = src.cols - i ;
                    map_y.at<float>(j,i) = src.rows - j ;
                    break;
            } // end of switch
          }
       }
     ind++;
}


#pragma mark  - private
// UIImage -> cv::Mat; returns a 3-channel BGR matrix
- (cv::Mat)cvMatFromUIImage:(UIImage *)image
{
  CGColorSpaceRef colorSpace =CGColorSpaceCreateDeviceRGB();
    
  CGFloat cols = image.size.width;
  CGFloat rows = image.size.height;
    Mat cvMat(rows, cols, CV_8UC4); // 8 bits per component, 4 channels (color channels + alpha)
  CGContextRef contextRef = CGBitmapContextCreate(cvMat.data,                 // Pointer to  data
                                                 cols,                       // Width of bitmap
                                                 rows,                       // Height of bitmap
                                                 8,                          // Bits per component
                                                 cvMat.step[0],              // Bytes per row
                                                 colorSpace,                 // Colorspace
                                                 kCGImageAlphaNoneSkipLast |
                                                 kCGBitmapByteOrderDefault); // Bitmap info flags
  CGContextDrawImage(contextRef, CGRectMake(0, 0, cols, rows), image.CGImage);
  CGContextRelease(contextRef);
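    // CoreGraphics drew into an RGBA-ordered buffer (alpha ignored);
    // convert to BGRA, then drop the fourth channel so the returned Mat
    // uses the BGR layout OpenCV expects.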
    
    Mat dst;
    Mat src;
    cvtColor(cvMat, dst, COLOR_RGBA2BGRA);
    cvtColor(dst, src, COLOR_BGRA2BGR);

  return src;
}

-(UIImage *)UIImageFromCVMat:(cv::Mat)cvMat
{
//    the Mat is BGR or BGRA, while CoreGraphics expects RGB/RGBA, so convert before building the CGImage
    Mat src;
    NSData *data=nil;
    CGBitmapInfo info =kCGImageAlphaNone|kCGBitmapByteOrderDefault;
    CGColorSpaceRef colorSpace;
    if (cvMat.depth()!=CV_8U) {
        Mat result;
        cvMat.convertTo(result, CV_8U,255.0);
        cvMat = result;
    }
  if (cvMat.elemSize() == 1) {
      colorSpace = CGColorSpaceCreateDeviceGray();
      data= [NSData dataWithBytes:cvMat.data length:cvMat.elemSize()*cvMat.total()];
  } else if(cvMat.elemSize() == 3){
      cvtColor(cvMat, src, COLOR_BGR2RGB);
       data= [NSData dataWithBytes:src.data length:src.elemSize()*src.total()];
      colorSpace = CGColorSpaceCreateDeviceRGB();
  }else if(cvMat.elemSize() == 4){
      colorSpace = CGColorSpaceCreateDeviceRGB();
      cvtColor(cvMat, src, COLOR_BGRA2RGBA);
      data= [NSData dataWithBytes:src.data length:src.elemSize()*src.total()];
      info =kCGImageAlphaNoneSkipLast | kCGBitmapByteOrderDefault;
  }else{
      NSLog(@"[error:] 错误的颜色通道");
      return nil;
  }
  CGDataProviderRef provider = CGDataProviderCreateWithCFData((__bridge CFDataRef)data);
  // Creating CGImage from cv::Mat
  CGImageRef imageRef = CGImageCreate(cvMat.cols,                                 //width
                                     cvMat.rows,                                 //height
                                     8,                                          //bits per component
                                     8 * cvMat.elemSize(),                       //bits per pixel
                                     cvMat.step[0],                            //bytesPerRow
                                     colorSpace,                                 //colorspace
                                     info,                                       // bitmap info (set above per channel count)
                                     provider,                                   //CGDataProviderRef
                                     NULL,                                       //decode
                                     false,                                      //should interpolate
                                     kCGRenderingIntentAbsoluteColorimetric                   //intent
                                     );
  // Getting UIImage from CGImage
  UIImage *finalImage = [UIImage imageWithCGImage:imageRef];
  CGImageRelease(imageRef);
  CGDataProviderRelease(provider);
  CGColorSpaceRelease(colorSpace);
  return finalImage;
 }
@end

Explanation

The view controller keeps the source image, the destination image, and the two map matrices as file-level variables. viewDidLoad loads the test image, converts it to a cv::Mat, displays it, allocates dst with the same size and type as src and the maps as single-channel float matrices, and then starts a one-second timer that rebuilds the maps and re-runs remap:

Mat src, dst;
Mat map_x, map_y;
int ind = 0;

UIImage *src1Image = [UIImage imageNamed:@"dog.jpg"];
src = [self cvMatFromUIImage:src1Image];
UIImageView *imageView;
imageView = [self createImageViewInRect:CGRectMake(0, 100, 150, 150)];
[self.view addSubview:imageView];
imageView.image = [self UIImageFromCVMat:src];

dst.create( src.size(), src.type() );
map_x.create( src.size(), CV_32FC1 );
map_y.create( src.size(), CV_32FC1 );

[self createTimer:1 exeBlock:^{
    [self update_map];
    remap( src, dst, map_x, map_y, INTER_LINEAR, BORDER_CONSTANT, Scalar(0, 0, 0) );
    UIImageView *imageView;
    imageView = [self createImageViewInRect:CGRectMake(0, 250, 150, 150)];
    [self.view addSubview:imageView];
    imageView.image = [self UIImageFromCVMat:dst];
}];

The remapping function used above is remap. Its parameters are:

- src: the source image
- dst: the destination image, same size as the maps and same type as src
- map_x (map1): for every destination pixel, the x coordinate in the source image to sample; here a CV_32FC1 matrix
- map_y (map2): the corresponding y coordinates, also CV_32FC1
- interpolation: the interpolation method, INTER_LINEAR here
- borderMode: how coordinates that fall outside the source are handled, BORDER_CONSTANT here
- borderValue: the constant color used with BORDER_CONSTANT, black (Scalar(0, 0, 0)) here
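As a quick sanity check of what the two maps mean (a minimal sketch, not part of the original post), an identity map makes remap reproduce the source unchanged:

Mat ident_x(src.size(), CV_32FC1), ident_y(src.size(), CV_32FC1);
for (int j = 0; j < src.rows; j++) {
    for (int i = 0; i < src.cols; i++) {
        ident_x.at<float>(j, i) = (float)i;   // sample the same column...
        ident_y.at<float>(j, i) = (float)j;   // ...and the same row
    }
}
remap(src, dst, ident_x, ident_y, INTER_LINEAR, BORDER_CONSTANT, Scalar(0, 0, 0));
// dst is now pixel-for-pixel identical to src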

How are the remapping matrices map_x and map_y updated? Read on:

a) Shrink the image to half its width and height and display it in the center. For every pair (i, j) inside the central window, i.e. satisfying

src.cols*0.25 < i < src.cols*0.75  and  src.rows*0.25 < j < src.rows*0.75

the maps are set to

map_x(j,i) = 2*( i - src.cols*0.25 ) + 0.5
map_y(j,i) = 2*( j - src.rows*0.25 ) + 0.5

For every (i, j) outside that window both maps are set to 0, so those pixels are filled from the source's top-left corner.
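As a worked example (using the 300 × 300 size assumed earlier, so src.cols*0.25 = 75): the destination pixel (i, j) = (80, 90) samples the source at x = 2*(80 - 75) + 0.5 = 10.5 and y = 2*(90 - 75) + 0.5 = 30.5, while the far corner of the window, (224, 224), samples (298.5, 298.5). The whole 300 × 300 source is therefore squeezed into the central 150 × 150 window, and everything outside that window shows the source's pixel (0, 0).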



b) Flip the image upside down:

map_x(j,i) = i
map_y(j,i) = src.rows - j

c) Mirror the image left to right:

map_x(j,i) = src.cols - i
map_y(j,i) = j

d) Apply b) and c) at the same time:

map_x(j,i) = src.cols - i
map_y(j,i) = src.rows - j

Result

(Animated result: QQ20191114-192717.gif)

GitHub repository

Excerpted from blog
