MTK Camera Learning, Part 2 (Camera Initialization)
This article covers the camera start-up flow. The sequence from the onCreate method to a successful camera open is summarized in the following diagram:
camera-open.png
The diagram above consists of three parts: from Java to JNI, from JNI/framework to the HAL, and from the HAL to the driver (the driver itself is not covered).
In the first part, at the application layer, we obtain a framework Camera object directly; this object is then wrapped further into an AndroidCamera object:
public static ICamera openCamera(int cameraId) {
Camera camera = null;
if (sTrySwitchToLegacyMode > 0) {
// choose legacy mode in order to enter cam hal 1.0
camera = Camera.openLegacy(cameraId, Camera.CAMERA_HAL_API_VERSION_1_0);
} else {
camera = Camera.open(cameraId);
}
if (null == camera) {
Log.e(TAG, "openCamera:got null hardware camera!");
return null;
}
// wrap it with ICamera
return new AndroidCamera(camera);
}
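For reference, here is a minimal usage sketch of this factory method (the class that hosts openCamera() is not shown in the excerpt, so CameraWrapper below is a hypothetical placeholder):
ICamera device = CameraWrapper.openCamera(0);   // 0 is typically the back camera
if (device != null) {
    device.setDisplayOrientation(90);           // ICamera methods, see the interface below
    device.startPreview();
    // ... use the camera, then release it when finished
    device.release();
}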
The ICamera interface defines a series of methods for operating on the camera, and AndroidCamera implements them:
public interface ICamera {
Camera getInstance();
void addCallbackBuffer(byte[] callbackBuffer);
void addRawImageCallbackBuffer(byte[] callbackBuffer);
void autoFocus(AutoFocusCallback cb);
void cancelAutoFocus();
void cancelContinuousShot();
void stopSmileDetection();
void lock();
Parameters getParameters();
void release();
void reconnect() throws IOException;
void setAsdCallback(AsdCallback cb);
void setAutoFocusMoveCallback(AutoFocusMoveCallback cb);
void setUncompressedImageCallback(PictureCallback cb);
void setAutoRamaCallback(AutoRamaCallback cb);
void setAutoRamaMoveCallback(AutoRamaMoveCallback cb);
void setJpsCallback(StereoCameraJpsCallback cb);
void setWarningCallback(StereoCameraWarningCallback cb);
void setMaskCallback(StereoCameraMaskCallback cb);
void setDistanceInfoCallback(DistanceInfoCallback cb);
void setContext(Context context);
void setContinuousShotCallback(ContinuousShotCallback callback);
void setContinuousShotSpeed(int speed);
void setDisplayOrientation(int degrees);
void setErrorCallback(ErrorCallback cb);
void setFaceDetectionListener(FaceDetectionListener listener);
void setFbOriginalCallback(FbOriginalCallback cb);
void setHdrOriginalCallback(HdrOriginalCallback cb);
void setParameters(Parameters params);
void setPreviewCallbackWithBuffer(PreviewCallback cb);
void setPreviewDoneCallback(ZSDPreviewDone callback);
void setPreviewTexture(SurfaceTexture surfaceTexture) throws IOException;
void setPreviewDisplay(SurfaceHolder holder) throws IOException;
void setSmileCallback(SmileCallback cb);
void setZoomChangeListener(OnZoomChangeListener listener);
// void slowdownContinuousShot();
void startAutoRama(int num);
void start3DSHOT(int num);
void stop3DSHOT(int num);
void setPreview3DModeForCamera(boolean enable);
void startFaceDetection();
void startObjectTracking(int x, int y);
void stopObjectTracking();
void setObjectTrackingListener(ObjectTrackingListener listener);
void startPreview();
void startSmoothZoom(int value);
void startSmileDetection();
void stopAutoRama(int isMerge);
void stopFaceDetection();
void stopPreview();
void setGestureCallback(GestureCallback cb);
void startGestureDetection();
void stopGestureDetection();
void takePicture(ShutterCallback shutter, PictureCallback raw, PictureCallback jpeg);
void takePicture(ShutterCallback shutter, PictureCallback raw, PictureCallback postview,
PictureCallback jpeg);
void unlock();
public void setOneShotPreviewCallback(PreviewCallback cb);
public void setMainFaceCoordinate(int x, int y);
public void cancelMainFaceInfo();
}
public class AndroidCamera implements ICamera {...}
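AndroidCamera is essentially a thin wrapper that delegates to the framework camera. As a rough illustration of the pattern only (not the actual MTK code, which implements every ICamera method), a cut-down wrapper might look like this:
import android.hardware.Camera;

// Cut-down illustration of the delegation pattern used by AndroidCamera;
// the real class implements all ICamera methods, some of which call into
// MTK-specific extensions rather than the stock framework API.
class SimpleCameraWrapper {
    private final Camera mCamera;

    SimpleCameraWrapper(Camera camera) {
        mCamera = camera;
    }

    Camera getInstance() {
        return mCamera;          // expose the wrapped framework object
    }

    void startPreview() {
        mCamera.startPreview();  // plain delegation
    }

    void release() {
        mCamera.release();
    }
}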
In addition, there are two inner classes; let's take a quick look at how they relate:
simpleuml.png
An inner class of CameraHolder holds an instance of CameraManager's inner class CameraProxy, so CameraHolder can indirectly call CameraProxy's methods. CameraManager also has another inner class, CameraHandler, which receives the messages posted by CameraProxy.
private class CameraHandler extends Handler {
CameraHandler(Looper looper) {
super(looper);
}
@Override
public void handleMessage(final Message msg) {
long now = SystemClock.uptimeMillis();
if (mCamera == null) {
Log.e(mSubTag, "[handleMessage] camera device is null,return! ");
return;
}
Log.i(mSubTag, "[handleMessage]msg.what = " + getMsgLabel(msg.what)
+ " pending time = " + (now - msg.getWhen()) + "ms.");
try {
switch (msg.what) {
case RELEASE:
CameraPerformanceTracker.onEvent(TAG,
CameraPerformanceTracker.NAME_CAMERA_RELEASE,
CameraPerformanceTracker.ISBEGIN);
mCamera.release();
Log.i(mSubTag, "release camera device = " + mCamera);
CameraPerformanceTracker.onEvent(TAG,
CameraPerformanceTracker.NAME_CAMERA_RELEASE,
CameraPerformanceTracker.ISEND);
mCamera = null;
mCameraProxy = null;
mFaceDetectionRunning = false;
return;
case RECONNECT:
mReconnectException = null;
try {
mCamera.reconnect();
Log.i(mSubTag, "reconnect camera device = " + mCamera);
} catch (IOException ex) {
mReconnectException = ex;
}
mFaceDetectionRunning = false;
return;
case UNLOCK:
mCamera.unlock();
return;
case LOCK:
mCamera.lock();
return;
...
}
} catch (RuntimeException e) {
// exception handling elided in this excerpt
}
}
}
Here mCamera is an ICamera object, so these messages ultimately end up calling the corresponding methods of AndroidCamera.
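Putting the two pieces together: CameraProxy exposes ordinary-looking methods to the app but forwards most calls as messages to CameraHandler, which runs on a dedicated camera thread and finally operates on the ICamera object. A simplified sketch of that pattern (illustrative only, not the MTK source; the real proxy adds waitDone() synchronization and many more message types):
import android.os.Handler;
import android.os.HandlerThread;
import android.os.Message;

// Simplified proxy/handler pattern: calls made on any thread are serialized
// onto one camera thread before they touch the ICamera object.
class CameraProxySketch {
    private static final int START_PREVIEW = 1;
    private static final int RELEASE = 2;

    private final Handler mHandler;

    CameraProxySketch(final ICamera camera) {
        HandlerThread thread = new HandlerThread("camera-ops");
        thread.start();
        mHandler = new Handler(thread.getLooper()) {
            @Override
            public void handleMessage(Message msg) {
                switch (msg.what) {
                    case START_PREVIEW:
                        camera.startPreview();   // runs on the camera thread
                        break;
                    case RELEASE:
                        camera.release();
                        break;
                    default:
                        break;
                }
            }
        };
    }

    void startPreviewAsync() {
        mHandler.sendEmptyMessage(START_PREVIEW); // returns immediately
    }

    void release() {
        mHandler.sendEmptyMessage(RELEASE);
    }
}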
In the second part, the app connects to CameraService through JNI, and the service then creates a client for the app; this is why the camera stack is commonly described as a client/server (C/S) architecture:
status_t CameraService::makeClient(const sp<CameraService>& cameraService,
const sp<IInterface>& cameraCb, const String16& packageName, const String8& cameraId,
int facing, int clientPid, uid_t clientUid, int servicePid, bool legacyMode,
int halVersion, int deviceVersion, apiLevel effectiveApiLevel,
/*out*/sp<BasicClient>* client) {
...
sp<ICameraClient> tmp = static_cast<ICameraClient*>(cameraCb.get());
*client = new CameraClient(cameraService, tmp, packageName, id, facing,
clientPid, clientUid, getpid(), legacyMode);
...
}
So how does the code get into the HAL layer? At boot the system starts a CameraService, whose onFirstRef() method is executed, as shown below:
void CameraService::onFirstRef()
{
ALOGI("CameraService process starting");
BnCameraService::onFirstRef();
// Update battery life tracking if service is restarting
BatteryNotifier& notifier(BatteryNotifier::getInstance());
notifier.noteResetCamera();
notifier.noteResetFlashlight();
camera_module_t *rawModule;
int err = hw_get_module(CAMERA_HARDWARE_MODULE_ID,
(const hw_module_t **)&rawModule);
if (err < 0) {
ALOGE("Could not load camera HAL module: %d (%s)", err, strerror(-err));
logServiceError("Could not load camera HAL module", err);
mNumberOfCameras = 0;
return;
}
mModule = new CameraModule(rawModule);
ALOGI("Loaded \"%s\" camera module", mModule->getModuleName());
err = mModule->init();
if (err != OK) {
ALOGE("Could not initialize camera HAL module: %d (%s)", err,
strerror(-err));
logServiceError("Could not initialize camera HAL module", err);
mNumberOfCameras = 0;
delete mModule;
mModule = nullptr;
return;
...
}
...
}
hw_get_module loads the corresponding .so file according to CAMERA_HARDWARE_MODULE_ID (see hardware.c for the details of the loading process). What does this module look like? As follows:
static
camera_module
get_camera_module()
{
camera_module module = {
common:{
tag : HARDWARE_MODULE_TAG,
#if (PLATFORM_SDK_VERSION >= 21)
module_api_version : CAMERA_MODULE_API_VERSION_2_4,
#else
module_api_version : CAMERA_DEVICE_API_VERSION_1_0,
#endif
hal_api_version : HARDWARE_HAL_API_VERSION,
id : CAMERA_HARDWARE_MODULE_ID,
name : "MediaTek Camera Module",
author : "MediaTek",
methods : get_module_methods(),
dso : NULL,
reserved : {0},
},
get_number_of_cameras : get_number_of_cameras,
get_camera_info : get_camera_info,
set_callbacks : set_callbacks,
get_vendor_tag_ops : get_vendor_tag_ops,
#if (PLATFORM_SDK_VERSION >= 21)
open_legacy : open_legacy,
#endif
set_torch_mode : set_torch_mode,
init : NULL,
reserved : {0},
};
return module;
};
And this module's open method is in fact open_device:
static
hw_module_methods_t*
get_module_methods()
{
static
hw_module_methods_t
_methods =
{
open: open_device
};
return &_methods;
}
Is all of this work really just to open the camera? Let's go back to the JNI entry method, native_setup:
static jint android_hardware_Camera_native_setup(JNIEnv *env, jobject thiz,
jobject weak_this, jint cameraId, jint halVersion, jstring clientPackageName)
{
// Convert jstring to String16
const char16_t *rawClientName = reinterpret_cast<const char16_t*>(
env->GetStringChars(clientPackageName, NULL));
jsize rawClientNameLen = env->GetStringLength(clientPackageName);
String16 clientName(rawClientName, rawClientNameLen);
env->ReleaseStringChars(clientPackageName,
reinterpret_cast<const jchar*>(rawClientName));
sp<Camera> camera;
if (halVersion == CAMERA_HAL_API_VERSION_NORMAL_CONNECT) {
// Default path: hal version is don't care, do normal camera connect.
camera = Camera::connect(cameraId, clientName,
Camera::USE_CALLING_UID);
} else {
jint status = Camera::connectLegacy(cameraId, halVersion, clientName,
Camera::USE_CALLING_UID, camera);
if (status != NO_ERROR) {
return status;
}
}
...
}
As we can see, we obtain an sp<Camera> object and the client-to-server connection is established.
After the camera is opened we usually need to configure various camera settings, which means we need a Parameters object:
private int firstOpenCamera(){
...
mParameters = (mCameraDevice == null) ? null : CameraHolder.instance()
.getOriginalParameters(mCameraId);
if (mCameraDevice != null && mParameters != null) {
mCurCameraDevice = new CameraDeviceExt(mCameraActivity, mCameraDevice, mParameters,
mCameraId, mPreferences);
} else {
Log.d(TAG, "[openCamera fail],mCameraDevice:" + mCameraDevice + ",mParameters:"
+ mParameters);
}
...
}
We will not trace how this Parameters object is obtained down in the lower layers; in short, as long as the camera has been opened successfully the parameters can be retrieved normally, and some basic camera configuration can also be initialized in the HAL layer. The log below shows the result after the client side applies its settings:
01-01 06:14:08.574 D/CameraClient( 353): setParameters (pid 14933) (3dnr-mode=off;3dnr-mode-values=off;afeng-max-focus-step=1023;afeng-min-focus-step=0;antibanding=auto;antibanding-values=off,50hz,60hz,auto;auto-exposure-lock-supported=true;auto-whitebalance-lock-supported=true;brightness=middle;brightness-values=low,middle,high;brightness_value=0;cap-mode-values=normal,face_beauty;capfname=/sdcard/DCIM/cap00;contrast=middle;contrast-values=low,middle,high;cshot-indicator=true;cshot-indicator-supported=true;disp-rot-supported=true;disp-rot-supported-values=true;dynamic-frame-rate=true;dynamic-frame-rate-supported=true;edge=middle;edge-values=low,middle,high;effect=none;effect-values=none,mono,negative,sepia,aqua,whiteboard,blackboard,posterize,nashville,hefe,valencia,xproll,lofi,sierra,walden;eng-mfll-e=false;eng-mfll-s=true;eng-s-shad-t=0;eng-shad-t=0;exposure-compensation=0;exposure-compensation-step=1.0;fb-enlarge-eye-max=4;fb-enlarge-eye-min=-4;fb-extreme-beauty-supported=false;fb-face-pos=-2000:-2000;fb-sharp=0;fb-sharp-max=12;fb-sharp-max-val
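The flattened key=value string in the log corresponds to entries in android.hardware.Camera.Parameters, which can be read and written with its generic get()/set() methods; a minimal sketch (the "antibanding" key and its values come from the log above, other keys are platform dependent):
import android.graphics.ImageFormat;
import android.hardware.Camera;

// Reading and writing individual parameters before pushing them back down.
class ParametersDemo {
    static void tweak(Camera camera) {
        Camera.Parameters params = camera.getParameters();
        String banding = params.get("antibanding");  // "auto" in the log above
        params.set("antibanding", "60hz");           // must be one of antibanding-values
        params.setPreviewFormat(ImageFormat.YV12);   // standard typed setter
        camera.setParameters(params);                // apply the whole set at once
    }
}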
As you can see, the camera exposes a very large number of configurable parameters; this article will not go through them individually. After the open succeeds, the following operations are performed:
...
mCameraAppUi = new CameraAppUiImpl(this);
mCameraAppUi.createCommonView();
initializeCommonManagers();
mCameraAppUi.initializeCommonView();
...
// Here should be lightweight functions!!!
private void initializeCommonManagers() {
mModePicker = new ModePicker(this);
mFileSaver = new FileSaver(this);
mFrameManager = new FrameManager(this);
mModePicker.setListener(mModeChangedListener);
mCameraAppUi.setSettingListener(mSettingListener);
mCameraAppUi.setPickerListener(mPickerListener);
mCameraAppUi.addFileSaver(mFileSaver);
mPowerManager = (PowerManager) getSystemService(Context.POWER_SERVICE);
Log.v(TAG, "getSystemService,mPowerManager =" + mPowerManager);
// For tablet
if (FeatureSwitcher.isSubSettingEnabled()) {
mCameraAppUi.setSubSettingListener(mSettingListener);
}
}
CameraAppUiImpl is the implementation of the whole camera UI. As mentioned before, the UI is concretely implemented by a number of manager views, which is what createCommonView builds; the subsequent initializeCommonView call then gathers these scattered UI pieces into one place so they can be managed uniformly:
public void initializeCommonView() {
mModePicker = mCameraActivity.getModePicker();
mCameraViewArray.put(CommonUiType.SHUTTER, new CameraViewImpl(mShutterManager));
mCameraViewArray.put(CommonUiType.MODE_PICKER, new CameraViewImpl(mModePicker));
mCameraViewArray.put(CommonUiType.THUMBNAIL, new CameraViewImpl(mThumbnailManager));
mCameraViewArray.put(CommonUiType.PICKER, new CameraViewImpl(mPickerManager));
mCameraViewArray.put(CommonUiType.INDICATOR, new CameraViewImpl(mIndicatorManager));
mCameraViewArray.put(CommonUiType.REMAINING, new CameraViewImpl(mRemainingManager));
mCameraViewArray.put(CommonUiType.INFO, new CameraViewImpl(mInfoManager));
mCameraViewArray.put(CommonUiType.REVIEW, new CameraViewImpl(mReviewManager));
mCameraViewArray.put(CommonUiType.ROTATE_PROGRESS, new CameraViewImpl(mRotateProgress));
mCameraViewArray.put(CommonUiType.ROTATE_DIALOG, new CameraViewImpl(mRotateDialog));
mCameraViewArray.put(CommonUiType.ZOOM, new CameraViewImpl(mZoomManager));
mCameraViewArray.put(CommonUiType.SETTING, new CameraViewImpl(mSettingManager));
if (mFaceBeautyEntryView != null) {
mCameraViewArray.put(CommonUiType.FACE_BEAUTY_ENTRY, new CameraViewImpl(
mFaceBeautyEntryView));
}
}
The CameraAppUiImpl is then attached to the device controller:
mCameraDeviceCtrl.setCameraAppUi(mCameraAppUi);
Judging from CameraDeviceCtrl's member variables, it contains essentially every element of the camera, which shows how important it is:
...
private CameraAppUiImpl mCameraAppUi;
private ISettingCtrl mISettingCtrl;
private ModuleManager mModuleManager;
private ICameraDeviceExt mDummyCameraDevice = new DummyCameraDevice();
private ICameraDeviceExt mCurCameraDevice = mDummyCameraDevice;
private ICameraDeviceExt mTopCameraDevice = mDummyCameraDevice;
private ICameraDeviceExt mOldTopCameraDevice = mDummyCameraDevice;
private RotateLayout mFocusAreaIndicator;
private FocusManager mFocusManager;
private CamcorderProfile mProfile;
private CameraActor mCameraActor;
private SurfaceTexture mSurfaceTexture;
private SurfaceTexture mTopCamSurfaceTexture;
private PreviewSurfaceView mSurfaceView;
private View mSurfaceViewCover;
private FrameLayout mCurSurfaceViewLayout;
private FrameLayout mLastSurfaceViewLayout;
private CameraStartUpThread mCameraStartUpThread;
...
CameraDeviceCtrl is initialized very early (before the open); its constructor directly starts a thread, which opens the camera and then starts the preview (after the open):
public CameraDeviceCtrl(CameraActivity activity, ComboPreferences preferences) {
mCameraActivity = activity;
mPreferences = preferences;
mIsFirstStartUp = true;
mMainHandler = new MainHandler(mCameraActivity.getMainLooper());
mCameraStartUpThread = new CameraStartUpThread();
mCameraStartUpThread.start();
}
private class CameraStartUpThread extends Thread {
...
@Override
public void run() {
...
applyFirstParameters();
...
}
}
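Conceptually, this startup thread performs the open flow described earlier and then hands control back to the UI thread; a rough sketch of that shape (illustrative only; CameraWrapper is the hypothetical holder of openCamera() used earlier, and the message id is a placeholder):
import android.os.Handler;

// Conceptual sketch of what the startup thread does, based on the flow this
// article describes; the real run() has many more steps and error handling.
class StartUpThreadSketch extends Thread {
    private static final int MSG_CAMERA_PARAMETERS_READY = 1; // placeholder id
    private final Handler mMainHandler;
    private final int mCameraId;

    StartUpThreadSketch(Handler mainHandler, int cameraId) {
        mMainHandler = mainHandler;
        mCameraId = cameraId;
    }

    @Override
    public void run() {
        ICamera device = CameraWrapper.openCamera(mCameraId);  // blocking open
        if (device == null) {
            return;                                            // open failed
        }
        device.getParameters();  // first Parameters fetch, cached by the app
        // ... apply the first batch of parameters (applyFirstParameters below),
        // then hand control back to the UI thread:
        mMainHandler.sendEmptyMessage(MSG_CAMERA_PARAMETERS_READY);
    }
}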
The entire open process described earlier is completed in this thread, and then comes the startPreview flow:
private void applyFirstParameters () {
Log.i(TAG, "applyFirstParameters");
CameraPerformanceTracker.onEvent(TAG,
CameraPerformanceTracker.NAME_APPLY_FIRST_PARAMS,
CameraPerformanceTracker.ISBEGIN);
mIsFirstOpenCamera = false;
mMainHandler.sendEmptyMessage(MSG_SET_PREVIEW_ASPECT_RATIO);
switchCameraPreview();
mCurCameraDevice.setJpegRotation(mOrientation);
mCameraAppUi.setZoomParameter();
mCurCameraDevice.setDisplayOrientation(true);
mCurCameraDevice.setPreviewFormat(ImageFormat.YV12);
// Camera do not open zsd mode launched by 3rd party.
if (!mCameraActivity.isImageCaptureIntent() && !mCameraActivity.isVideoCaptureIntent()) {
mCurCameraDevice.getParametersExt()
.setZSDMode(SettingUtils.getPreferenceValue(mCameraActivity,
mPreferences,SettingConstants.ROW_SETTING_ZSD, Util.OFF));
}
mCurCameraDevice.getParametersExt().set(ParametersHelper.KEY_FIRST_PREVIEW_FRAME,
Util.FIRST_PREVIEW_BLACK_ON);
mCurCameraDevice.applyParametersToServer();
// for launch performance
mMainHandler.sendEmptyMessageDelayed(MSG_REMOVE_PREVIEW_COVER, 150);
mCameraActor.onCameraParameterReady(true);
mCurCameraDevice.setOneShotPreviewCallback(mOneShotPreviewCallback);
mMainHandler.sendEmptyMessage(MSG_CAMERA_PARAMETERS_READY);
mMainHandler.sendEmptyMessage(MSG_CAMERA_PREVIEW_DONE);
}
After the various attributes are set, the PhotoActor is notified:
@Override
public void onCameraParameterReady(boolean startPreview) {
super.onCameraParameterReady(startPreview);
Log.i(TAG, "[onCameraParameterReady]startPreview = " + startPreview);
mModuleManager.onCameraParameterReady(startPreview);
if (startPreview) {
if (!mModuleManager.startPreview(true)) {
startPreview(true);
}
}
if (mCameraActivity.getISettingCtrl() != null && mCameraActivity.getISettingCtrl()
.getSettingValue(SettingConstants.KEY_SELF_TIMER) != null) {
String seflTimer = mCameraActivity.getISettingCtrl().getSettingValue(
SettingConstants.KEY_SELF_TIMER);
mSelfTimerManager.setSelfTimerDuration(seflTimer);
}
mCameraActivity.setCameraState(CameraActivity.STATE_IDLE);
mHandler.removeMessages(PARAMETER_CHANGE_DONE);
mHandler.sendEmptyMessage(PARAMETER_CHANGE_DONE);
}
private void startPreview(boolean needStop) {
Log.i(TAG, "[startPreview]needStop = " + needStop);
mCameraActivity.runOnUiThread(new Runnable() {
public void run() {
mCameraActivity.getFocusManager().resetTouchFocus();
}
});
// continuous shot need not stop preview after capture
if (needStop) {
stopPreview();
}
if (!mIsSnapshotOnIdle) {
// If the focus mode is continuous autofocus, call cancelAutoFocus to
// resume it because it may have been paused by autoFocus call.
if (Parameters.FOCUS_MODE_CONTINUOUS_PICTURE.equals(mCameraActivity.getFocusManager()
.getFocusMode())) {
mCameraActivity.getCameraDevice().cancelAutoFocus();
mCameraActivity.getCameraDevice().setAutoFocusMoveCallback(mAutoFocusMoveCallback);
}
mCameraActivity.getFocusManager().setAeLock(false); // Unlock AE and AWB.
mCameraActivity.getFocusManager().setAwbLock(false);
}
if (isPowerDebug()) {
if (SettingUtils.isSupported(Parameters.FOCUS_MODE_INFINITY, mCameraActivity
.getParameters().getSupportedFocusModes())) {
overrideFocusMode(Parameters.FOCUS_MODE_INFINITY);
mCameraActivity.getParameters().setFocusMode(
mCameraActivity.getFocusManager().getFocusMode());
//mCameraActivity.applyParametersToServer();
Log.i(TAG, "set debug focus FOCUS_MODE_INFINITY ");
}
} else {
setFocusParameters();
Log.i(TAG, "[startPreview]set setFocusParameters normal");
}
mCameraActivity.getCameraDevice().startPreviewAsync();
mCameraActivity.getFocusManager().onPreviewStarted();
}
CameraManager then uses its Handler to send a message, and in the end the ICamera object mentioned earlier performs the actual operation:
public void startPreviewAsync() {
mCameraHandler.sendEmptyMessage(START_PREVIEW_ASYNC);
waitDone();
}
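startPreviewAsync() only queues the message; the waitDone() call then blocks until the camera handler thread has drained everything queued before it. A common way to implement such a waitDone() (a sketch of the usual technique, not necessarily the exact MTK code):
import android.os.Handler;

// Blocks the caller until the handler thread has processed every message
// posted before this call, by posting a sentinel Runnable and waiting on it.
class WaitDoneSketch {
    static boolean waitDone(Handler cameraHandler) {
        final Object waitDoneLock = new Object();
        Runnable unlockRunnable = new Runnable() {
            @Override
            public void run() {
                synchronized (waitDoneLock) {
                    waitDoneLock.notifyAll();   // runs after all earlier messages
                }
            }
        };
        synchronized (waitDoneLock) {
            cameraHandler.post(unlockRunnable);
            try {
                waitDoneLock.wait();
            } catch (InterruptedException ex) {
                return false;                   // interrupted while waiting
            }
        }
        return true;
    }
}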
Next comes the initialization of PhotoActor; in its constructor, the mode type is determined from the mode argument:
public PhotoActor(CameraActivity context, ModuleManager moduleManager, int mode) {...}
private void prepareCurrentMode(int newMode) {
Log.i(TAG, "[prepareCurrentMode] mCurrentMode:" + mCurrentMode + ",newMode:" + newMode);
mCurrentMode = newMode;
CameraModeType mode = getCameraModeType(mCurrentMode);
if (mode == null) {
mode = CameraModeType.EXT_MODE_PHOTO;
}
mModuleManager.createMode(mode);
}
public void createMode(CameraModeType newMode) {
Log.i(TAG, "[createMode],newMode:" + newMode + ",mCurrentMode:" + mCurrentMode);
if (mCurrentMode == newMode) {
return;
}
mICameraMode.close();
mCurrentMode = newMode;
mICameraMode = ModeFactory.getInstance().createMode(newMode, mICameraContext);
mAdditionManager.setCurrentMode(newMode);
mICameraMode.open();
}
ICameraMode is an interface that defines the various mode types, action types and related operations; it is implemented by the abstract class CameraMode. The concrete implementation of the mICameraMode.open() call here is found in PhotoActor:
@Override
public boolean open() {
mAdditionManager.setListener(this);
mAdditionManager.open(true);
return true;
}
Following this open(true) call into AdditionManager:
public void open(boolean isMode) {
Log.i(TAG, "[open]isMode = " + isMode);
Vector<ICameraAddition> curAddition = mModeAddition;
if (!isMode) {
curAddition = mNormalAddition;
}
for (ICameraAddition addition : curAddition) {
if (addition.isSupport()) {
addition.open();
}
}
}
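From the way AdditionManager drives them, each ICameraAddition is expected to expose at least an isSupport/open lifecycle; a hedged sketch of that contract (the method set is inferred from the usage above, not copied from the source):
// Inferred contract of a camera "addition" (a pluggable feature);
// isSupport() and open() appear in the loop above, close() is assumed
// as the symmetric counterpart of open().
interface CameraAdditionSketch {
    boolean isSupport();   // is this feature available in the current state?
    void open();           // enable the feature when its mode becomes active
    void close();          // disable it when the mode is left
}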
Its implementation is again an abstract class, CameraAddition; let's look at one of its subclasses, RemoteCameraAddition:
@Override
public void open() {
Log.i(TAG, "[open], mIsMtkCameraApServiceLaunched:" + mIsMtkCameraApServiceLaunched);
if (mIsMtkCameraApServiceLaunched) {
int cameraId = mICameraDeviceManager.getCurrentCameraId();
ICameraDevice cameraDevice = mICameraDeviceManager.getCameraDevice(cameraId);
if (cameraDevice == null) {
Log.i(TAG, "cameraDevice is null, return.");
return;
}
Parameters parameters = cameraDevice.getParameters();
parameters.setPreviewFormat(ImageFormat.NV21);
if (mCameraService == null) {
bindCameraService();
}
mOpened = true;
}
}
We can see that a service is bound here; right after the camera starts up, this service is started:
private void bindCameraService() {
Log.i(TAG, "bindCameraService()");
mHasNotifyParameterReady = false;
Intent intent = new Intent(mActivity, MtkCameraService.class);
mActivity.bindService(intent, mCameraConnection, Context.BIND_AUTO_CREATE);
}
private void unBindCameraService() {
Log.i(TAG, "unBindCameraService()");
mActivity.unbindService(mCameraConnection);
}
private ServiceConnection mCameraConnection = new ServiceConnection() {
@Override
public void onServiceDisconnected(ComponentName name) {
// TODO Auto-generated method stub
Log.i(TAG, "CameraConnection, onServiceDisconnected()");
mCameraService = null;
}
@Override
public void onServiceConnected(ComponentName name, IBinder service) {
// TODO Auto-generated method stub
Log.i(TAG, "CameraConnection, onServiceConnected()");
mCameraService = (IMtkCameraService.Stub) service;
}
};
What is special about this Service? If its clients may live in a different process, it must at least support cross-process communication; let's take a look:
@Override
public IBinder onBind(Intent intent) {
Log.i(TAG, "intent:" + intent.getAction());
return new MtkCameraServiceImpl();
}
And this MtkCameraServiceImpl deserves a closer look:
public class MtkCameraServiceImpl extends IMtkCameraService.Stub {...}
Seeing the Stub here, AIDL comes to mind, and indeed the interface is defined in an AIDL file:
interface IMtkCameraService {
void openCamera();
void releaseCamera();
void capture();
void sendMessage(in Message msg);
void registerCallback(ICameraClientCallback cb);
void unregisterCallback(ICameraClientCallback cb);
void setFrameRate(int frameRate);
// add for release
void cameraServerExit();
String getSupportedFeatureList();
}
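On the client side, the standard way to use such an AIDL interface is to convert the IBinder delivered to onServiceConnected() with IMtkCameraService.Stub.asInterface() and then call its methods (the direct (IMtkCameraService.Stub) cast shown earlier only works when service and client run in the same process). A minimal sketch of the connection field inside the binding component:
import android.content.ComponentName;
import android.content.ServiceConnection;
import android.os.IBinder;
import android.os.RemoteException;

// Typical AIDL client usage; RemoteException must be handled because the
// service may live in another process.
private final ServiceConnection mConnectionSketch = new ServiceConnection() {
    @Override
    public void onServiceConnected(ComponentName name, IBinder binder) {
        IMtkCameraService service = IMtkCameraService.Stub.asInterface(binder);
        try {
            service.openCamera();
            service.setFrameRate(30);
        } catch (RemoteException e) {
            // the remote process died or the call failed
        }
    }

    @Override
    public void onServiceDisconnected(ComponentName name) {
        // connection lost; drop any cached reference
    }
};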
We will not discuss how this service works for now, but remember that this is where it gets started.
Finally, one more diagram to illustrate the relationships between the various camera-related classes (a diamond-headed arrow means the class holds an instance of the pointed-to class; a circle with a cross inside marks an inner class):
camerdevice.jpeg
As you can see, the relationships are fairly involved. At the top, CameraManager and AndroidCamera are the classes that actually hold the android.hardware.Camera object. CameraDeviceCtrl holds several ICameraDeviceExt objects; ICameraDeviceExt is implemented by CameraDeviceExt, which in turn holds an object of the inner class CameraProxy, so through CameraDeviceCtrl we can ultimately drive the real camera. Likewise, given an ICameraDeviceManager object, its first-level implementation is CameraDeviceManagerImpl, which holds an object of ICameraDeviceManager's inner interface ICameraDevice; ICameraDevice is implemented by CameraDeviceImpl, which again holds a CameraProxy object and thus ultimately calls the methods of the real Camera object.