
Advanced Android — The Camera2 You Know and the Camera2 You Don't

2017-10-10  CrazyMO_

Introduction

It all started with a bug in one of my projects. The requirement was to auto-start the camera preview in MainActivity and also allow jumping to SignedActivity to take a check-in photo. The first time either screen was entered everything worked, but the preview would sometimes fail on the way back: jumping from MainActivity to SignedActivity, or returning from SignedActivity to MainActivity, occasionally failed with a CAMERA_IN_USE (ERROR = 1) error. The strange thing was that I really had released everything completely. I had mostly used the old Camera API before and had never studied how Camera2 actually works, so I was stumped. I went to Stack Overflow first, where one answer boiled down to "I gave up on the new API and went back to the old one" (orz), and the other results had no answer either. Not willing to give up, I worked through the official documentation from beginning to end, and what follows is the understanding I came away with. If you don't yet know how to use Camera2, this article is worth your time: you will find that Camera2 is not nearly as complicated to use as most people claim.

I. Overview of the New Camera2 Architecture

Android 5.0 completely redesigned the camera API and introduced the brand-new android.hardware.camera2 package. The new API not only makes the Android camera far more capable, it also supports RAW photo output and even lets an app control the camera's focus mode, exposure mode, shutter and so on.

[Figure: Camera2 API architecture compared with the legacy android.hardware.Camera API]
As the figure above shows, the Camera2 API is architecturally very different from the original android.hardware.Camera API. It makes the phone camera much more powerful, but it also raises the development complexity, which is not hard to see from all the class roles that appear in the Camera2 architecture diagram below.
[Figure: the classes participating in the Camera2 architecture]
Camera2 introduces the concept of a pipeline connecting the Android device and the camera: the system sends capture requests (CaptureRequest) to the camera, the camera returns CameraMetadata, and all of this happens inside a session called a CameraCaptureSession.

II. The Main Classes in the Camera2 Architecture

The core classes participating in the Camera2 architecture are: CameraManager, CameraDevice, CameraCharacteristics, CaptureRequest with CaptureRequest.Builder, CameraCaptureSession, and CaptureResult.

1. CameraManager

CameraManager lives in android.hardware.camera2.CameraManager and, like the rest of the package, was added in API 21 (Android 5.0). Like any other system service it is obtained through Context.getSystemService(CameraManager.class) (API 23 and above) or Context.getSystemService(Context.CAMERA_SERVICE), and its main job is to manage the system cameras:

CameraManager manager = (CameraManager)context.getSystemService(Context.CAMERA_SERVICE);
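Once you have the manager you can enumerate the cameras it knows about. The sketch below is not taken from the project; the helper name pickBackCamera() is made up for illustration, but the CameraManager and CameraCharacteristics calls are the standard API 21 ones:

private String pickBackCamera(CameraManager manager) throws CameraAccessException {
    // iterate over every camera the system reports
    for (String cameraId : manager.getCameraIdList()) {
        CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
        Integer facing = characteristics.get(CameraCharacteristics.LENS_FACING);
        if (facing != null && facing == CameraCharacteristics.LENS_FACING_BACK) {
            return cameraId; // first rear-facing camera
        }
    }
    return null; // no rear-facing camera found
}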

2. CameraDevice

CameraDevice is the abstraction in Camera2 that maps directly to a physical camera on the device. Because not every camera can support the advanced features, camera capabilities are graded into levels (for example LIMITED and FULL); on a LIMITED device, Camera2 offers roughly the same functionality as the old Camera API. Beyond that, CameraDevice has two other important jobs in the Camera2 architecture: it creates the CameraCaptureSession and it creates the CaptureRequest.
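The supported level can be read from the camera's CameraCharacteristics. A minimal sketch (the method name isFullLevelCamera() is mine, not part of the helper class shown later):

private boolean isFullLevelCamera(CameraManager manager, String cameraId) throws CameraAccessException {
    CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
    Integer level = characteristics.get(CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL);
    // LEGACY < LIMITED < FULL; a LIMITED device behaves roughly like the old Camera API
    return level != null && level == CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL_FULL;
}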

3. CameraCaptureSession

As mentioned above, the system sends capture requests to the camera and the camera returns CameraMetadata, and all of this happens inside a CameraCaptureSession created by the corresponding CameraDevice; whenever the app wants to preview, take a picture, or go back to previewing, it has to go through a session. The official documentation puts it like this: "A configured capture session for a CameraDevice, used for capturing images from the camera or reprocessing images captured from the camera in the same session previously. A CameraCaptureSession is created by providing a set of target output surfaces to createCaptureSession, or by providing an InputConfiguration and a set of target output surfaces to createReprocessableCaptureSession for a reprocessable capture session. Once created, the session is active until a new session is created by the camera device, or the camera device is closed."

In other words, once created, a CameraCaptureSession stays alive until the camera device creates a new session or is itself closed. A session captures images from the camera, and only that same session can reprocess images it captured earlier. Creating a session is also an expensive asynchronous operation that can take several hundred milliseconds, because it has to configure the camera device's internal pipelines and allocate memory buffers for sending images to the requested targets; that is why createCaptureSession and createReprocessableCaptureSession deliver the ready-to-use CameraCaptureSession through the onConfigured callback of the listener you provide. If the configuration cannot be completed, onConfigureFailed is invoked instead and the session never becomes active. Also note that when the camera device creates a new session, the previous one is closed and its onClosed callback fires; if you do not handle this and keep calling methods on the closed session, every call will throw an IllegalStateException. Closing a session clears any repeating request (much like calling stopRepeating()), but the closed session still completes all in-flight capture requests normally before the newly created session takes over and reconfigures the camera device. In short, CameraCaptureSession plays a very important role in Camera2:

[Figure: the responsibilities of CameraCaptureSession]
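Below is a hedged sketch (not the project's code) of the three session callbacks just described; onClosed() is exactly where an old session ends up when a new one takes over, which is worth handling if you, like me, switch between two preview screens. mCaptureSession is assumed to be the same field the helper class later in this article uses:

private final CameraCaptureSession.StateCallback mSessionStateCallback =
        new CameraCaptureSession.StateCallback() {
    @Override
    public void onConfigured(@NonNull CameraCaptureSession session) {
        mCaptureSession = session; // the session is active, requests may be submitted now
    }

    @Override
    public void onConfigureFailed(@NonNull CameraCaptureSession session) {
        // configuration failed, the session never becomes active
    }

    @Override
    public void onClosed(@NonNull CameraCaptureSession session) {
        // a new session was created or the device was closed; stop using this one
        if (mCaptureSession == session) {
            mCaptureSession = null;
        }
    }
};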

4. CameraCharacteristics

CameraCharacteristics is the object that describes the properties of a CameraDevice; it can be queried through the CameraManager with getCameraCharacteristics(String cameraId).
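For example, a few of the properties that the helper class below actually reads (sensor orientation, flash availability and the supported output sizes) can be queried like this; manager and cameraId are assumed to be in scope, and the calls belong inside the usual try/catch for CameraAccessException:

CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
Integer sensorOrientation = characteristics.get(CameraCharacteristics.SENSOR_ORIENTATION);
Boolean flashAvailable = characteristics.get(CameraCharacteristics.FLASH_INFO_AVAILABLE);
StreamConfigurationMap map = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
Size[] jpegSizes = (map == null) ? new Size[0] : map.getOutputSizes(ImageFormat.JPEG);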

5. CaptureRequest and CaptureRequest.Builder

A CaptureRequest represents a single capture request, and CaptureRequest.Builder describes all of the settings for capturing an image, including the capture hardware (sensor, lens, flash), the focus and exposure modes, the processing pipeline, the control algorithms, and the output buffer configuration; the built request is then handed to the corresponding session. CaptureRequest.Builder is what actually produces the CaptureRequest object: you obtain a Builder by calling CameraDevice.createCaptureRequest() and build() it into a CaptureRequest. Whenever the program calls setRepeatingRequest() to preview, or capture() to take a picture, it has to pass in a CaptureRequest.
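Putting that together, a typical preview request looks roughly like this (a sketch only; mCameraDevice, mCaptureSession, previewSurface, mCaptureCallback and mBackgroundHandler are assumed to exist, as they do in the helper class below):

try {
    // the Builder always comes from the CameraDevice
    CaptureRequest.Builder builder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
    builder.addTarget(previewSurface); // where the frames should be delivered
    builder.set(CaptureRequest.CONTROL_AF_MODE,
            CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE); // continuous auto-focus
    CaptureRequest previewRequest = builder.build(); // immutable request
    // hand it to the session: repeating for preview, capture() for a single shot
    mCaptureSession.setRepeatingRequest(previewRequest, mCaptureCallback, mBackgroundHandler);
} catch (CameraAccessException e) {
    e.printStackTrace();
}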

6. CaptureResult

A CaptureResult is the object describing a subset of the results of capturing a single image from the image sensor ("CaptureResults are produced by a CameraDevice after processing a CaptureRequest"); it is produced by the CameraDevice once the corresponding CaptureRequest has been processed.
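For example, inside a CameraCaptureSession.CaptureCallback you can read individual metadata keys from the result. A minimal sketch (the keys are standard ones; the callback name mResultLogger is only illustrative):

private final CameraCaptureSession.CaptureCallback mResultLogger = new CameraCaptureSession.CaptureCallback() {
    @Override
    public void onCaptureCompleted(@NonNull CameraCaptureSession session,
                                   @NonNull CaptureRequest request,
                                   @NonNull TotalCaptureResult result) {
        Integer afState = result.get(CaptureResult.CONTROL_AF_STATE);   // auto-focus state
        Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);   // auto-exposure state
        Long exposureTime = result.get(CaptureResult.SENSOR_EXPOSURE_TIME);
        // any of these can be null on devices that do not report the key
    }
};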

7. How the Main Camera2 Classes Fit Together

- CameraManager sits at the top of the hierarchy: it detects and enumerates all cameras and their characteristics, and opens a specific camera with the CameraDevice.StateCallback you pass in.
- CameraDevice is the managing abstraction: it listens to the camera's state through CameraDevice.StateCallback and creates the CameraCaptureSession and the CaptureRequest.
- CameraCaptureSession describes a single image-capture session: it listens for its own state through CameraCaptureSession.StateCallback, receives results through the CameraCaptureSession.CaptureCallback capture callback, and submits and processes CaptureRequests.
- CaptureRequest can be thought of as a "JavaBean" that describes the configuration with which this request should be processed.
- The three callbacks, finally, are there to monitor the corresponding states.

III. Using Camera2, Step by Step

The snippets below (taken from the check-in screen) walk through the typical flow: start a background HandlerThread, wait for the TextureView to become available, open the camera, create the preview session, and drive the focus/exposure state machine from the capture callback.

private void startBackgroundThread() {
    LogUtil.showModelLog("摄像头"+"启动HandlerThread");
    mBackgroundThread = new HandlerThread("CameraBackground");
    mBackgroundThread.start();
    mBackgroundHandler = new Handler(mBackgroundThread.getLooper());
}
    private final TextureView.SurfaceTextureListener mSurfaceTextureListener = new TextureView.SurfaceTextureListener() {

        @Override
        public void onSurfaceTextureAvailable(SurfaceTexture texture, int width, int height) {
            LogUtil.showModelLog("摄像头"+"当TextureView 可用时");
            //3.在TextureView可用的时候尝试打开摄像头
            openCamera(width, height);
        }

        @Override
        public void onSurfaceTextureSizeChanged(SurfaceTexture texture, int width, int height) {
            configureTransform(width, height);
        }

        @Override
        public boolean onSurfaceTextureDestroyed(SurfaceTexture texture) {
            return true;
        }

        @Override
        public void onSurfaceTextureUpdated(SurfaceTexture texture) {
        }
    };
manager.openCamera(mCameraId, mStateCallback, mBackgroundHandler);
    private CameraCaptureSession.CaptureCallback mCaptureCallback
            = new CameraCaptureSession.CaptureCallback() {

        private void process(CaptureResult result) {
            switch (mState) {
                case STATE_PREVIEW: {
                    LogUtil.showModelLog("摄像头"+"在CameraCaptureSession.CaptureCallback捕获回调处理CaptureResult,在process里预览成功工作");
                    // We have nothing to do when the camera preview is working normally.
                    break;
                }
                case STATE_WAITING_LOCK: {

                    Integer afState = result.get(CaptureResult.CONTROL_AF_STATE);
                    if (afState == null) {
                        captureStillPicture();
                    } else if (CaptureResult.CONTROL_AF_STATE_FOCUSED_LOCKED == afState ||
                            CaptureResult.CONTROL_AF_STATE_NOT_FOCUSED_LOCKED == afState) {
                        // CONTROL_AE_STATE can be null on some devices
                        Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
                        if (aeState == null ||
                                aeState == CaptureResult.CONTROL_AE_STATE_CONVERGED) {
                            mState = STATE_PICTURE_TAKEN;
                            captureStillPicture();
                        } else {
                            runPrecaptureSequence();
                        }
                    }
                    break;
                }
                case STATE_WAITING_PRECAPTURE: {

                    // CONTROL_AE_STATE can be null on some devices
                    Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
                    if (aeState == null ||
                            aeState == CaptureResult.CONTROL_AE_STATE_PRECAPTURE ||
                            aeState == CaptureRequest.CONTROL_AE_STATE_FLASH_REQUIRED) {
                        mState = STATE_WAITING_NON_PRECAPTURE;
                    }
                    break;
                }
                case STATE_WAITING_NON_PRECAPTURE: {
                    // CONTROL_AE_STATE can be null on some devices
                    Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
                    if (aeState == null || aeState != CaptureResult.CONTROL_AE_STATE_PRECAPTURE) {
                        mState = STATE_PICTURE_TAKEN;
                        captureStillPicture();
                    }
                    break;
                }
                default:
                    break;
            }
        }

        @Override
        public void onCaptureProgressed(@NonNull CameraCaptureSession session,
                                        @NonNull CaptureRequest request,
                                        @NonNull CaptureResult partialResult) {
            process(partialResult);
            LogUtil.showDebugLog("onCaptureProgressed");
        }

        @Override
        public void onCaptureCompleted(@NonNull CameraCaptureSession session,
                                       @NonNull CaptureRequest request,
                                       @NonNull TotalCaptureResult result) {
            process(result);
            LogUtil.showModelLog("摄像头"+"在CameraCaptureSession.CaptureCallback捕获回调处理CaptureResult");

        }

    };
    private void createCameraPreviewSession() {
        try {
            SurfaceTexture texture = cameraViewSigned.getSurfaceTexture();
            assert texture != null;

            // We configure the size of default buffer to be the size of camera preview we want.
            texture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());

            // This is the output Surface we need to start preview.
            Surface surface = new Surface(texture);

            LogUtil.showModelLog("摄像头"+"通过调用CameraDevice.createCaptureRequest方法创建mPreviewRequestBuilder预览请求,并把要附着的Surface通过addTarget 封到请求中");
            // We set up a CaptureRequest.Builder with the output Surface.
            mPreviewRequestBuilder= mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            mPreviewRequestBuilder.addTarget(surface);
            LogUtil.showModelLog("摄像头"+"通过调用CameraDevice.createCaptureSession方法并传入CameraCaptureSession.StateCallbac创建预览会话mCaptureSession");
            // Here, we create a CameraCaptureSession for camera preview.
            mCameraDevice.createCaptureSession(Arrays.asList(surface, mImageReader.getSurface()),
                    new CameraCaptureSession.StateCallback() {

                        @Override
                        public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
                            // The camera is already closed
                            if (null == mCameraDevice) {
                                return;
                            }
                            LogUtil.showModelLog("摄像头"+"当预览会话完成配置并开始处理请求时候,把闪光灯,模式等参数封装到预览请求");
                            // When the session is ready, we start displaying the preview.
                            mCaptureSession = cameraCaptureSession;
                            try {
                                // Auto focus should be continuous for camera preview.
                                mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE,
                                        CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
                                // Flash is automatically enabled when necessary.
                                setAutoFlash(mPreviewRequestBuilder);

                                // Finally, we start displaying the camera preview.
                                mPreviewRequest = mPreviewRequestBuilder.build();
                                LogUtil.showModelLog("摄像头"+"接着把预览请求设置到预览会话mCaptureSession,传入CameraCaptureSession.CaptureCallback捕获回调并发出不断捕获图像的请求");
                                mCaptureSession.setRepeatingRequest(mPreviewRequest,
                                        mCaptureCallback, mBackgroundHandler);
                            } catch (CameraAccessException e) {
                                e.printStackTrace();
                            }
                        }

                        @Override
                        public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
                            showToast("拍照失败");
                        }
                    }, null
            );
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

To summarize: CameraManager sits at the top, detects all cameras and sets up the output parameters, is handed a CameraDevice.StateCallback, and then opens the chosen camera, which triggers that callback's onOpened() method; inside onOpened() the preview session is created. CameraDevice is responsible for creating the session and the requests, while CameraCharacteristics, CaptureRequest with its Builder, and CaptureResult can be seen as "JavaBeans" that describe the configuration with which the request should be processed.

IV. Camera2 in Practice: Wrapping It in a Camera2Helper Class

Camera2Helper is only a thin wrapper whose purpose is to keep the Camera2 initialization well separated from the Activity; it is demo-stage code and parts of it can still be improved. In addition, my particular use case limits the picture size, so by default I compress every image using a sample-size compression ratio.
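BitmapUtil itself is not shown in this article; as a rough sketch of the sample-size idea (my assumption about how such a helper is usually written, not the project's actual BitmapUtil), decoding the JPEG bytes with BitmapFactory.Options.inSampleSize might look like this:

// requires android.graphics.Bitmap and android.graphics.BitmapFactory
public static Bitmap decodeWithSampleSize(byte[] jpegBytes, int sampleSize) {
    BitmapFactory.Options options = new BitmapFactory.Options();
    options.inSampleSize = sampleSize; // e.g. 4 keeps roughly 1/16 of the pixels
    return BitmapFactory.decodeByteArray(jpegBytes, 0, jpegBytes.length, options);
}

The full Camera2Helper class follows: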

package com.crazyview.crazycamera2;

import android.Manifest;
import android.app.Activity;
import android.content.Context;
import android.content.pm.PackageManager;
import android.content.res.Configuration;
import android.graphics.Bitmap;
import android.graphics.ImageFormat;
import android.graphics.Matrix;
import android.graphics.Point;
import android.graphics.RectF;
import android.graphics.SurfaceTexture;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.CameraMetadata;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.CaptureResult;
import android.hardware.camera2.TotalCaptureResult;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.media.Image;
import android.media.ImageReader;
import android.os.Build;
import android.os.Handler;
import android.os.HandlerThread;
import android.support.annotation.NonNull;
import android.support.annotation.RequiresApi;
import android.support.v4.app.ActivityCompat;
import android.util.Size;
import android.util.SparseIntArray;
import android.view.Surface;
import android.view.TextureView;

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;
import java.util.concurrent.Semaphore;
import java.util.concurrent.TimeUnit;

/**
 * Author: Crazy.Mo
 * DateTime: 2017/9/12 12:11
 * Summary:
 */
public class Camera2Helper {
    private static Activity activity;
    private static final SparseIntArray ORIENTATIONS = new SparseIntArray();
    private static final int STATE_PREVIEW = 0;
    private static final int STATE_WAITING_LOCK = 1;//Camera state: Waiting for the focus to be locked.
    private static final int STATE_WAITING_PRECAPTURE = 2;//Camera state: Waiting for the exposure to be pre capture state.
    private static final int STATE_WAITING_NON_PRECAPTURE = 3;//Camera state: Waiting for the exposure state to be something other than precapture.
    private static final int STATE_PICTURE_TAKEN = 4;//Camera state: Picture was taken.
    private static final int MAX_PREVIEW_WIDTH = 1920;//Max preview width that is guaranteed by Camera2 API
    private static final int MAX_PREVIEW_HEIGHT = 1080;//Max preview height that is guaranteed by Camera2 API
    private AutoFitTextureView textureView;
    private String mCameraId;
    private CameraCaptureSession mCaptureSession;
    private static CameraDevice mCameraDevice;
    private Size mPreviewSize;
    private HandlerThread mBackgroundThread;//An additional thread for running tasks that shouldn't block the UI.
    private Handler mBackgroundHandler;//A {@link Handler} for running tasks in the background.
    private ImageReader mImageReader;
    private static File mFile = null;
    private Semaphore mCameraOpenCloseLock = new Semaphore(1);//prevents the app from exiting before the camera is closed
    private boolean mFlashSupported;
    private int mSensorOrientation;
    private CaptureRequest.Builder mPreviewRequestBuilder;//{@link CaptureRequest.Builder} for the camera preview
    private CaptureRequest mPreviewRequest;//{@link CaptureRequest} generated by {@link #mPreviewRequestBuilder}
    private int mState = STATE_PREVIEW;//{#see mCaptureCallback}The current state of camera state for taking pictures.
    private static CameraManager manager;
    private AfterDoListener listener;
    private boolean isNeedHideProgressbar=true;

    //Conversion from screen rotation to JPEG orientation
    static {
        ORIENTATIONS.append(Surface.ROTATION_0, 90);
        ORIENTATIONS.append(Surface.ROTATION_90, 0);
        ORIENTATIONS.append(Surface.ROTATION_180, 270);
        ORIENTATIONS.append(Surface.ROTATION_270, 180);
    }

    //This a callback object for the {@link ImageReader}. "onImageAvailable" will be called when a still image is ready to be saved.
    private final ImageReader.OnImageAvailableListener mOnImageAvailableListener = new ImageReader.OnImageAvailableListener() {
        @Override
        public void onImageAvailable(ImageReader reader) {
            mBackgroundHandler.post(new Camera2Helper.ImageSaver(reader.acquireNextImage(), mFile));
        }
    };

    private final TextureView.SurfaceTextureListener mSurfaceTextureListener = new TextureView.SurfaceTextureListener() {

        @Override
        public void onSurfaceTextureAvailable(SurfaceTexture texture, int width, int height) {
            //3. Try to open the camera once the TextureView is available
            openCamera(width, height);
        }

        @Override
        public void onSurfaceTextureSizeChanged(SurfaceTexture texture, int width, int height) {
            configureTransform(width, height);
        }

        @Override
        public boolean onSurfaceTextureDestroyed(SurfaceTexture texture) {
            return true;
        }

        @Override
        public void onSurfaceTextureUpdated(SurfaceTexture texture) {
        }
    };

    //Callback that listens for CameraDevice state changes
    private final CameraDevice.StateCallback mStateCallback = new CameraDevice.StateCallback() {

        @Override
        public void onOpened(@NonNull CameraDevice cameraDevice) {
            mCameraOpenCloseLock.release();
            mCameraDevice = cameraDevice;
            createCameraPreviewSession();//preview, capture and the rest all go through a session, so create one for the preview
        }

        @Override
        public void onDisconnected(@NonNull CameraDevice cameraDevice) {
            mCameraOpenCloseLock.release();
            cameraDevice.close();
            mCameraDevice = null;
        }

        @Override
        public void onError(@NonNull CameraDevice cameraDevice, int error) {
            mCameraOpenCloseLock.release();
            cameraDevice.close();
            mCameraDevice = null;
        }
    };

    /**
     * A {@link CameraCaptureSession.CaptureCallback} that handles events related to JPEG capture.
     */
    private CameraCaptureSession.CaptureCallback mCaptureCallback = new CameraCaptureSession.CaptureCallback() {

        private void process(CaptureResult result) {
            switch (mState) {
                case STATE_PREVIEW: {
                    if(isNeedHideProgressbar) {
                        listener.onAfterPreviewBack();
                        isNeedHideProgressbar=false;
                    }
                    // We have nothing to do when the camera preview is working normally.
                    break;
                }
                case STATE_WAITING_LOCK: {

                    Integer afState = result.get(CaptureResult.CONTROL_AF_STATE);
                    if (afState == null) {
                        captureStillPicture();
                    } else if (CaptureResult.CONTROL_AF_STATE_FOCUSED_LOCKED == afState ||
                            CaptureResult.CONTROL_AF_STATE_NOT_FOCUSED_LOCKED == afState) {
                        // CONTROL_AE_STATE can be null on some devices
                        Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
                        if (aeState == null ||
                                aeState == CaptureResult.CONTROL_AE_STATE_CONVERGED) {
                            mState = STATE_PICTURE_TAKEN;
                            captureStillPicture();
                        } else {
                            runPrecaptureSequence();
                        }
                    }
                    break;
                }
                case STATE_WAITING_PRECAPTURE: {

                    // CONTROL_AE_STATE can be null on some devices
                    Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
                    if (aeState == null ||
                            aeState == CaptureResult.CONTROL_AE_STATE_PRECAPTURE ||
                            aeState == CaptureRequest.CONTROL_AE_STATE_FLASH_REQUIRED) {
                        mState = STATE_WAITING_NON_PRECAPTURE;
                    }
                    break;
                }
                case STATE_WAITING_NON_PRECAPTURE: {
                    // CONTROL_AE_STATE can be null on some devices
                    Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
                    if (aeState == null || aeState != CaptureResult.CONTROL_AE_STATE_PRECAPTURE) {
                        mState = STATE_PICTURE_TAKEN;
                        captureStillPicture();
                    }
                    break;
                }
                default:
                    break;
            }
        }

        @Override
        public void onCaptureProgressed(@NonNull CameraCaptureSession session,
                                        @NonNull CaptureRequest request,
                                        @NonNull CaptureResult partialResult) {
            process(partialResult);
            LogUtil.showDebugLog("onCaptureProgressed");
        }

        @Override
        public void onCaptureCompleted(@NonNull CameraCaptureSession session,
                                       @NonNull CaptureRequest request,
                                       @NonNull TotalCaptureResult result) {
            process(result);
        }

    };

    private volatile static Camera2Helper singleton;//note: with a volatile field, every thread reads the most recently written value on each access

    private Camera2Helper(Activity act, AutoFitTextureView view) {
    }

    private Camera2Helper(Activity act, AutoFitTextureView view, File file) {
        activity = act;
        textureView = view;
        mFile = file;
        manager = (CameraManager) activity.getSystemService(Context.CAMERA_SERVICE);
    }

    public static Camera2Helper getInstance(Activity act, AutoFitTextureView view, File file) {
        if (singleton == null) {
            synchronized (Camera2Helper.class) {
                if (singleton == null) {
                    // double-checked locking: re-check inside the lock before creating the instance
                    singleton = new Camera2Helper(act, view, file);
                }
            }
        }
        return singleton;
    }

    /**
     * Start the camera preview.
     */
    public void startCameraPreView() {
        startBackgroundThread();
        //1. If the TextureView is already available, open the camera directly
        new Handler().postDelayed(new Runnable() {
            @Override
            public void run() {
                if (textureView != null) {
                    if (textureView.isAvailable()) {
                        openCamera(textureView.getWidth(), textureView.getHeight());
                    } else {
                        textureView.setSurfaceTextureListener(mSurfaceTextureListener);//otherwise set the listener; openCamera() is invoked from the callback once the TextureView becomes available
                    }
                }
            }
        },300);//the small delay is recommended, especially when several screens need to start the preview

    }

    /**
     * Start the background HandlerThread.
     */
    private void startBackgroundThread() {
        mBackgroundThread = new HandlerThread("CameraBackground");
        mBackgroundThread.start();
        mBackgroundHandler = new Handler(mBackgroundThread.getLooper());
    }

    /**
     * Stops the background thread and its {@link Handler}.
     */
    private void stopBackgroundThread() {
        if (mBackgroundThread == null) {
            return;
        }
        mBackgroundThread.quitSafely();
        try {
            mBackgroundThread.join();
            mBackgroundThread = null;
            mBackgroundHandler = null;
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }

    /**
     * Take a picture.
     */
    public void takePicture() {
        lockFocus();
    }

    /**
     * Submit a still-capture request through the session; usually called once the capture callback reports that focus is locked.
     */
    private void captureStillPicture() {
        try {
            if (null == mCameraDevice) {
                return;
            }
            //Create the CaptureRequest.Builder used for the still capture
            final CaptureRequest.Builder captureBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
            captureBuilder.addTarget(mImageReader.getSurface());

            // Use the same AE and AF modes as the preview
            captureBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
            setAutoFlash(captureBuilder);
            int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
            captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, getOrientation(rotation));

            CameraCaptureSession.CaptureCallback CaptureCallback = new CameraCaptureSession.CaptureCallback() {

                @Override
                public void onCaptureCompleted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
                    unlockFocus();
                    listener.onAfterTakePicture();
                    LogUtil.showErroLog("onCaptureCompleted" + "保存照片成功");
                }
            };
            mCaptureSession.stopRepeating();
            mCaptureSession.capture(captureBuilder.build(), CaptureCallback, null);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    /**
     * Lock the focus as the first step for a still image capture.
     */
    private void lockFocus() {
        try {
            if (mCaptureSession == null) {
                return;
            }
            // This is how to tell the camera to lock focus.
            mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_START);
            // Tell #mCaptureCallback to wait for the lock.
            mState = STATE_WAITING_LOCK;
            mCaptureSession.capture(mPreviewRequestBuilder.build(), mCaptureCallback, mBackgroundHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    /**
     * Unlock the focus. This method should be called when still image capture sequence is
     * finished.
     */
    private void unlockFocus() {
        try {
            // Reset the auto-focus trigger
            if (mCaptureSession == null) {
                return;
            }
            mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_CANCEL);
            setAutoFlash(mPreviewRequestBuilder);
            mCaptureSession.capture(mPreviewRequestBuilder.build(), mCaptureCallback, mBackgroundHandler);
            // After this, the camera will go back to the normal preview state
            mState = STATE_PREVIEW;
            mCaptureSession.setRepeatingRequest(mPreviewRequest, mCaptureCallback, mBackgroundHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    /**
     * Enable auto flash when the device supports it.
     *
     * @param requestBuilder
     */
    private void setAutoFlash(CaptureRequest.Builder requestBuilder) {
        if (mFlashSupported) {
            requestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);
        }
    }

    /**
     * Retrieves the JPEG orientation from the specified screen rotation.
     *
     * @param rotation The screen rotation.
     * @return The JPEG orientation (one of 0, 90, 180, and 270)
     */
    private int getOrientation(int rotation) {
        // Sensor orientation is 90 for most devices, or 270 for some devices (eg. Nexus 5X)
        // We have to take that into account and rotate JPEG properly.
        // For devices with orientation of 90, we simply return our mapping from ORIENTATIONS.
        // For devices with orientation of 270, we need to rotate the JPEG 180 degrees.
        return (ORIENTATIONS.get(rotation) + mSensorOrientation + 270) % 360;
    }

    /**
     * Run the precapture sequence for capturing a still image. This method should be called when
     * we get a response in {@link #mCaptureCallback} from {@link #lockFocus()}
     */
    private void runPrecaptureSequence() {
        try {
            // This is how to tell the camera to trigger.
            mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER, CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER_START);
            // Tell #mCaptureCallback to wait for the precapture sequence to be set.
            mState = STATE_WAITING_PRECAPTURE;
            mCaptureSession.capture(mPreviewRequestBuilder.build(), mCaptureCallback, mBackgroundHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    private void createCameraPreviewSession() {
        try {
            SurfaceTexture texture = textureView.getSurfaceTexture();
            assert texture != null;
            // Configure the size of the default buffer to the camera preview size we want.
            texture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
            // This is the output Surface we need to start preview.
            Surface surface = new Surface(texture);
            // set up a CaptureRequest.Builder with the output Surface.
            mPreviewRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            mPreviewRequestBuilder.addTarget(surface);// add the Surface backed by the preview TextureView as a target of the request
            // create a CameraCaptureSession for camera preview.
            mCameraDevice.createCaptureSession(Arrays.asList(surface, mImageReader.getSurface()), new CameraCaptureSession.StateCallback() {

                @Override
                public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
                    // The camera is already closed
                    if (null == mCameraDevice) {
                        return;
                    }
                    // When the session is ready, we start displaying the preview.
                    mCaptureSession = cameraCaptureSession;
                    try {
                        // Set continuous auto-focus on the CaptureRequest.Builder
                        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
                        //Enable auto flash mode when supported
                        setAutoFlash(mPreviewRequestBuilder);

                        // Once the Builder is configured, call build() to create the CaptureRequest and submit it
                        mPreviewRequest = mPreviewRequestBuilder.build();
                        mCaptureSession.setRepeatingRequest(mPreviewRequest, mCaptureCallback, mBackgroundHandler);
                    } catch (CameraAccessException e) {
                        e.printStackTrace();
                    }
                }

                @Override
                public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {

                }
            }, null);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    private void setUpCameraOutputs(int width, int height) {

        try {
            for (String cameraId : manager.getCameraIdList()) {
                CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);

                /*
                Generally, when a device has both a front and a rear camera, the rear camera id is "0" and the front camera id is "1".
                Integer facing = characteristics.get(CameraCharacteristics.LENS_FACING);
                //front-facing camera
                if (facing != null && facing == CameraCharacteristics.LENS_FACING_FRONT) {
                    continue;
                }else if(facing != null && facing == CameraCharacteristics.LENS_FACING_BACK){
                    continue;
                }*/
                StreamConfigurationMap map = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
                if (map == null) {
                    continue;
                }

                // For still image captures, we use the largest available size.
                Size largest = Collections.max(Arrays.asList(map.getOutputSizes(ImageFormat.JPEG)), new Camera2Helper.CompareSizesByArea());
                mImageReader = ImageReader.newInstance(largest.getWidth(), largest.getHeight(), ImageFormat.JPEG, /*maxImages*/2);//初始化ImageReader
                mImageReader.setOnImageAvailableListener(mOnImageAvailableListener, mBackgroundHandler);//设置ImageReader监听

                //Handle the sensor / display orientation
                int displayRotation = activity.getWindowManager().getDefaultDisplay().getRotation();
                mSensorOrientation = characteristics.get(CameraCharacteristics.SENSOR_ORIENTATION);
                boolean swappedDimensions = false;
                switch (displayRotation) {
                    case Surface.ROTATION_0:
                    case Surface.ROTATION_180:
                        if (mSensorOrientation == 90 || mSensorOrientation == 270) {
                            swappedDimensions = true;
                        }
                        break;
                    case Surface.ROTATION_90:
                    case Surface.ROTATION_270:
                        if (mSensorOrientation == 0 || mSensorOrientation == 180) {
                            swappedDimensions = true;
                        }
                        break;
                    default:
                        break;
                }
                Point displaySize = new Point();
                activity.getWindowManager().getDefaultDisplay().getSize(displaySize);
                int rotatedPreviewWidth = width;
                int rotatedPreviewHeight = height;
                int maxPreviewWidth = displaySize.x;
                int maxPreviewHeight = displaySize.y;
                if (swappedDimensions) {
                    rotatedPreviewWidth = height;
                    rotatedPreviewHeight = width;
                    maxPreviewWidth = displaySize.y;
                    maxPreviewHeight = displaySize.x;
                }

                if (maxPreviewWidth > MAX_PREVIEW_WIDTH) {
                    maxPreviewWidth = MAX_PREVIEW_WIDTH;
                }

                if (maxPreviewHeight > MAX_PREVIEW_HEIGHT) {
                    maxPreviewHeight = MAX_PREVIEW_HEIGHT;
                }

                // Danger, W.R.! Attempting to use too large a preview size could  exceed the camera
                // bus' bandwidth limitation, resulting in gorgeous previews but the storage of
                // garbage capture data.
                mPreviewSize = chooseOptimalSize(map.getOutputSizes(SurfaceTexture.class), rotatedPreviewWidth, rotatedPreviewHeight, maxPreviewWidth,
                        maxPreviewHeight, largest);

                // We fit the aspect ratio of TextureView to the size of preview we picked.
                int orientation = activity.getResources().getConfiguration().orientation;
                if (orientation == Configuration.ORIENTATION_LANDSCAPE) {
                    textureView.setAspectRatio(mPreviewSize.getWidth(), mPreviewSize.getHeight());
                } else {
                    textureView.setAspectRatio(mPreviewSize.getHeight(), mPreviewSize.getWidth());
                }
                // Check whether the flash is supported
                Boolean available = characteristics.get(CameraCharacteristics.FLASH_INFO_AVAILABLE);
                mFlashSupported = available == null ? false : available;
                mCameraId = cameraId;
                return;
            }
        } catch (CameraAccessException e) {
            e.printStackTrace();
        } catch (NullPointerException e) {
        }
    }

    /**
     * Chooses the optimal preview size, so that an oversized preview does not exceed the camera bus bandwidth limit.
     */
    private static Size chooseOptimalSize(Size[] choices, int textureViewWidth, int textureViewHeight, int maxWidth, int maxHeight, Size aspectRatio) {

        // Collect the supported resolutions that are at least as big as the preview Surface
        List<Size> bigEnough = new ArrayList<>();
        // Collect the supported resolutions that are smaller than the preview Surface
        List<Size> notBigEnough = new ArrayList<>();
        int w = aspectRatio.getWidth();
        int h = aspectRatio.getHeight();
        for (Size option : choices) {
            if (option.getWidth() <= maxWidth && option.getHeight() <= maxHeight && option.getHeight() == option.getWidth() * h / w) {
                if (option.getWidth() >= textureViewWidth && option.getHeight() >= textureViewHeight) {
                    bigEnough.add(option);
                } else {
                    notBigEnough.add(option);
                }
            }
        }
        if (bigEnough.size() > 0) {
            return Collections.min(bigEnough, new Camera2Helper.CompareSizesByArea());
        } else if (notBigEnough.size() > 0) {
            return Collections.max(notBigEnough, new Camera2Helper.CompareSizesByArea());
        } else {
            return choices[0];
        }
    }

    /**
     * Open the camera identified by mCameraId.
     */
    private void openCamera(int width, int height) {

        //4. Configure the camera outputs and the preview transform
        setUpCameraOutputs(width, height);
        configureTransform(width, height);
        try {
            if (!mCameraOpenCloseLock.tryAcquire(2500, TimeUnit.MILLISECONDS)) {
                throw new RuntimeException("Time out waiting to lock camera opening.");
            }
            if (ActivityCompat.checkSelfPermission(activity, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
                return;
            }
            manager.openCamera(mCameraId, mStateCallback, mBackgroundHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        } catch (InterruptedException e) {
            throw new RuntimeException("Interrupted while trying to lock camera opening.", e);
        }
    }

    /**
     * Closes the current {@link CameraDevice}. Remember that closing is an asynchronous operation.
     */
    private void closeCamera() {
        try {
            mCameraOpenCloseLock.acquire();
            if (null != mCaptureSession) {
                mCaptureSession.close();
                mCaptureSession = null;
            }
            if (null != mCameraDevice) {
                mCameraDevice.close();
                mCameraDevice = null;
            }
            if (null != mImageReader) {
                mImageReader.close();
                mImageReader = null;
            }
        } catch (InterruptedException e) {
            throw new RuntimeException("Interrupted while trying to lock camera closing.", e);
        } finally {
            mCameraOpenCloseLock.release();
        }
    }

    /**
     * Configures the necessary {@link android.graphics.Matrix} transformation to `textureView`.
     * This method should be called after the camera preview size is determined in
     * setUpCameraOutputs and also the size of `textureView` is fixed.
     *
     * @param viewWidth  The width of `textureView`
     * @param viewHeight The height of `textureView`
     */
    private void configureTransform(int viewWidth, int viewHeight) {

        if (null == textureView || null == mPreviewSize) {
            return;
        }
        int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
        Matrix matrix = new Matrix();
        RectF viewRect = new RectF(0, 0, viewWidth, viewHeight);
        RectF bufferRect = new RectF(0, 0, mPreviewSize.getHeight(), mPreviewSize.getWidth());
        float centerX = viewRect.centerX();
        float centerY = viewRect.centerY();
        if (Surface.ROTATION_90 == rotation || Surface.ROTATION_270 == rotation) {
            bufferRect.offset(centerX - bufferRect.centerX(), centerY - bufferRect.centerY());
            matrix.setRectToRect(viewRect, bufferRect, Matrix.ScaleToFit.FILL);
            float scale = Math.max(
                    (float) viewHeight / mPreviewSize.getHeight(),
                    (float) viewWidth / mPreviewSize.getWidth());
            matrix.postScale(scale, scale, centerX, centerY);
            matrix.postRotate(90 * (rotation - 2), centerX, centerY);
        } else if (Surface.ROTATION_180 == rotation) {
            matrix.postRotate(180, centerX, centerY);
        }
        textureView.setTransform(matrix);
    }

    /**
     * Release the Activity, the view and the listener.
     */
    public void onDestroyHelper() {
        stopBackgroundThread();
        closeCamera();
        activity = null;
        textureView = null;
        listener=null;
    }

    private static class CompareSizesByArea implements Comparator<Size> {

        @RequiresApi(api = Build.VERSION_CODES.LOLLIPOP)
        @Override
        public int compare(Size lhs, Size rhs) {
            // We cast here to ensure the multiplications won't overflow
            return Long.signum((long) lhs.getWidth() * lhs.getHeight() - (long) rhs.getWidth() * rhs.getHeight());
        }
    }

    private static class ImageSaver implements Runnable {

        private final Image mImage;
        private final File mFile;

        public ImageSaver(Image image, File file) {
            mImage = image;
            mFile = file;
        }

        @Override
        public void run() {
            ByteBuffer buffer = mImage.getPlanes()[0].getBuffer();
            byte[] bytes = new byte[buffer.remaining()];
            FileOutputStream output = null;
            buffer.get(bytes);
            try {
                Bitmap bitmap = BitmapUtil.bytes2Bitmap(bytes);//decode the raw JPEG bytes into a Bitmap
                Bitmap bitmapAfter = BitmapUtil.compressBitmapBySampleSize(bitmap, 4);//compress it by sample size
                byte[] bytesAfter = BitmapUtil.bitmap2Bytes(bitmapAfter);//convert the compressed Bitmap back to bytes
                output = new FileOutputStream(mFile);
                output.write(bytesAfter);//write them to the file output stream
            } catch (IOException e) {
                LogUtil.showErroLog(e.getMessage());
                e.printStackTrace();
            } finally {
                mImage.close();
                if (null != output) {
                    try {
                        output.close();
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                }
            }
        }
    }

    public void setAfterDoListener(AfterDoListener listener){
        this.listener=listener;
    }

    public interface AfterDoListener {
        void onAfterPreviewBack();

        void onAfterTakePicture();
    }
}


Using it in an Activity

public class MainActivity extends AppCompatActivity implements Camera2Helper.AfterDoListener {

    private Camera2Helper camera2Helper;
    private File file;
    private AutoFitTextureView textureView;
    private ImageView imageView;
    private Button button;
    private ProgressBar progressBar;
    public static final String PHOTO_PATH = Environment.getExternalStorageDirectory().getPath();
    public static final String PHOTO_NAME = "camera2";

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        init();
    }

    @Override
    protected void onResume() {
        super.onResume();
        camera2Helper.startCameraPreView();
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
        camera2Helper.onDestroyHelper();
    }

    private void init(){
        textureView= (AutoFitTextureView) findViewById(R.id.texture);
        imageView= (ImageView) findViewById(R.id.imv_photo);
        button= (Button) findViewById(R.id.btn_take_photo);
        progressBar= (ProgressBar) findViewById(R.id.progressbar_loading);
        file = new File(PHOTO_PATH, PHOTO_NAME + ".jpg");
        button.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                camera2Helper.takePicture();
            }
        });
        camera2Helper=Camera2Helper.getInstance(MainActivity.this,textureView,file);
        camera2Helper.setAfterDoListener(this);
    }


    @Override
    public void onAfterPreviewBack() {
        runOnUiThread(new Runnable() {
            @Override
            public void run() {
                progressBar.setVisibility(View.GONE);
            }
        });
    }

    @Override
    public void onAfterTakePicture() {
        runOnUiThread(new Runnable() {
            @Override
            public void run() {
                InputStream input = null;
                try {
                    input = new FileInputStream(file);
                    byte[] byt = new byte[input.available()];
                    input.read(byt);
                    imageView.setImageBitmap(BitmapUtil.bytes2Bitmap(byt));
                } catch (FileNotFoundException e) {
                    e.printStackTrace();
                } catch (IOException e) {
                    e.printStackTrace();
                } finally {
                    if (input != null) {
                        try {
                            input.close();//close the stream to avoid leaking the file descriptor
                        } catch (IOException e) {
                            e.printStackTrace();
                        }
                    }
                }
            }
        });
    }
}

For everything else, see the source code (note that the demo does not handle runtime permissions): Camera2Demo
