Android – Taking a picture in a Service with the Camera2 API



I'm using the camera2 API. I need to take photos in a Service, without any preview. It works, but the photos are poorly exposed: the pictures come out dark or sometimes very washed out. How can I fix the code so that the photos are of better quality? I'm using the front-facing camera.

import android.Manifest;
import android.app.Service;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.graphics.ImageFormat;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.hardware.cam2.CaptureRequest;
import android.media.Image;
import android.media.ImageReader;
import android.os.Environment;
import android.os.IBinder;
import android.support.annotation.NonNull;
import android.support.v4.app.ActivityCompat;
import android.util.Log;

import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.Arrays;

public class Camera2Service extends Service {
    protected static final String TAG = "myLog";
    protected static final int CAMERACHOICE = CameraCharacteristics.LENS_FACING_BACK;
    protected CameraDevice cameraDevice;
    protected CameraCaptureSession session;
    protected ImageReader imageReader;

    protected CameraDevice.StateCallback cameraStateCallback = new CameraDevice.StateCallback() {
        @Override
        public void onOpened(@NonNull CameraDevice camera) {
            Log.d(TAG, "CameraDevice.StateCallback onOpened");
            cameraDevice = camera;
            actOnReadyCameraDevice();
        }

        @Override
        public void onDisconnected(@NonNull CameraDevice camera) {
            Log.w(TAG, "CameraDevice.StateCallback onDisconnected");
        }

        @Override
        public void onError(@NonNull CameraDevice camera, int error) {
            Log.e(TAG, "CameraDevice.StateCallback onError " + error);
        }
    };

    protected CameraCaptureSession.StateCallback sessionStateCallback = new CameraCaptureSession.StateCallback() {
        @Override
        public void onReady(CameraCaptureSession session) {
            Camera2Service.this.session = session;
            try {
                session.setRepeatingRequest(createCaptureRequest(), null, null);
            } catch (CameraAccessException e) {
                Log.e(TAG, e.getMessage());
            }
        }

        @Override
        public void onConfigured(CameraCaptureSession session) {
        }

        @Override
        public void onConfigureFailed(@NonNull CameraCaptureSession session) {
        }
    };

    protected ImageReader.OnImageAvailableListener onImageAvailableListener = new ImageReader.OnImageAvailableListener() {
        @Override
        public void onImageAvailable(ImageReader reader) {
            Log.d(TAG, "onImageAvailable");
            Image img = reader.acquireLatestImage();
            if (img != null) {
                processImage(img);
                img.close();
            }
        }
    };

    public void readyCamera() {
        CameraManager manager = (CameraManager) getSystemService(CAMERA_SERVICE);
        try {
            String pickedCamera = getCamera(manager);
            if (ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
                return;
            }
            manager.openCamera(pickedCamera, cameraStateCallback, null);
            imageReader = ImageReader.newInstance(1920, 1088, ImageFormat.JPEG, 2 /* images buffered */);
            imageReader.setOnImageAvailableListener(onImageAvailableListener, null);
            Log.d(TAG, "imageReader created");
        } catch (CameraAccessException e) {
            Log.e(TAG, e.getMessage());
        }
    }

    public String getCamera(CameraManager manager) {
        try {
            for (String cameraId : manager.getCameraIdList()) {
                CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
                int cOrientation = characteristics.get(CameraCharacteristics.LENS_FACING);
                if (cOrientation != CAMERACHOICE) {
                    return cameraId;
                }
            }
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
        return null;
    }

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        Log.d(TAG, "onStartCommand flags " + flags + " startId " + startId);
        readyCamera();
        return super.onStartCommand(intent, flags, startId);
    }

    @Override
    public void onCreate() {
        Log.d(TAG, "onCreate service");
        super.onCreate();
    }

    public void actOnReadyCameraDevice() {
        try {
            cameraDevice.createCaptureSession(Arrays.asList(imageReader.getSurface()), sessionStateCallback, null);
        } catch (CameraAccessException e) {
            Log.e(TAG, e.getMessage());
        }
    }

    @Override
    public void onDestroy() {
        try {
            session.abortCaptures();
        } catch (CameraAccessException e) {
            Log.e(TAG, e.getMessage());
        }
        session.close();
    }

    private void processImage(Image image) {
        // Process image data
        ByteBuffer buffer;
        byte[] bytes;
        boolean success = false;
        File file = new File(Environment.getExternalStorageDirectory() + "/Pictures/image.jpg");
        FileOutputStream output = null;
        if (image.getFormat() == ImageFormat.JPEG) {
            buffer = image.getPlanes()[0].getBuffer();
            bytes = new byte[buffer.remaining()]; // makes byte array large enough to hold image
            buffer.get(bytes); // copies image from buffer to byte array
            try {
                output = new FileOutputStream(file);
                output.write(bytes); // write the byte array to file
                success = true;
            } catch (FileNotFoundException e) {
                e.printStackTrace();
            } catch (IOException e) {
                e.printStackTrace();
            } finally {
                image.close(); // close this to free up buffer for other images
                if (null != output) {
                    try {
                        output.close();
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                }
            }
        }
    }

    protected CaptureRequest createCaptureRequest() {
        try {
            CaptureRequest.Builder builder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
            builder.addTarget(imageReader.getSurface());
            return builder.build();
        } catch (CameraAccessException e) {
            Log.e(TAG, e.getMessage());
            return null;
        }
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null;
    }
}

Solution:

Sergey, I copied your code and was in fact able to reproduce the problem: I got completely black pictures from a Google Pixel 2 (Android 8.1).

However, I managed to solve the black-picture problem as follows:

First, in case anyone is wondering: you do not actually need any Activity or any preview UI element, contrary to what many other threads about the Camera API claim. That used to be true for the deprecated Camera v1 API; with the new Camera v2 API, all I needed was a foreground service.
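For anyone unsure what "foreground service" implies in practice, here is a minimal sketch of how the question's Camera2Service could promote itself in onCreate. This is my own illustration, not code from the answer; the channel id, notification texts and icon are placeholder assumptions. On Android 9 and later the manifest also needs the android.permission.FOREGROUND_SERVICE permission.

// Requires: android.app.Notification, android.app.NotificationChannel,
// android.app.NotificationManager, android.os.Build
@Override
public void onCreate() {
    super.onCreate();
    Log.d(TAG, "onCreate service");
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
        // Channel id and user-visible texts are placeholder values.
        NotificationChannel channel = new NotificationChannel(
                "camera2_service", "Background capture", NotificationManager.IMPORTANCE_LOW);
        getSystemService(NotificationManager.class).createNotificationChannel(channel);
        Notification notification = new Notification.Builder(this, "camera2_service")
                .setContentTitle("Taking picture in background")
                .setSmallIcon(android.R.drawable.ic_menu_camera)
                .build();
        // Keeps the service alive in the foreground while the camera is in use.
        startForeground(1, notification);
    }
}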

To start the capture process, I used the following code:

CaptureRequest.Builder builder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_VIDEO_SNAPSHOT);
builder.set(CaptureRequest.CONTROL_MODE, CaptureRequest.CONTROL_MODE_AUTO);
builder.set(CaptureRequest.FLASH_MODE, CaptureRequest.FLASH_MODE_OFF);
builder.addTarget(imageReader.getSurface());
captureRequest = builder.build();
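As a rough illustration (my own adaptation, not code taken from the answer), the question's createCaptureRequest() could be rewritten to use this template and these settings:

protected CaptureRequest createCaptureRequest() {
    try {
        // TEMPLATE_VIDEO_SNAPSHOT with CONTROL_MODE_AUTO lets auto-exposure run on the repeating stream.
        CaptureRequest.Builder builder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_VIDEO_SNAPSHOT);
        builder.set(CaptureRequest.CONTROL_MODE, CaptureRequest.CONTROL_MODE_AUTO);
        builder.set(CaptureRequest.FLASH_MODE, CaptureRequest.FLASH_MODE_OFF);
        builder.addTarget(imageReader.getSurface());
        return builder.build();
    } catch (CameraAccessException e) {
        Log.e(TAG, e.getMessage());
        return null;
    }
}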

Then, in ImageReader.onImageAvailable, I skipped the first N images (that is, I did not save them). I let the session keep running, capturing further pictures without saving them.

This gives the camera enough time to gradually adjust the exposure parameters automatically. After the N ignored photos, I saved one picture, and that one was usually properly exposed rather than black.

The value of the constant N depends on your hardware, so you need to determine a good value of N experimentally. You could also automate this with a histogram-based heuristic. For your first experiments, don't be afraid to start saving only after several hundred milliseconds of calibration.
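A minimal sketch of this frame-skipping idea, plugged into the question's onImageAvailableListener (SKIP_FRAMES, frameCount and photoSaved are names I am assuming here; the actual value of SKIP_FRAMES has to be tuned per device as described above):

// Assumed constant: number of frames to discard while auto-exposure settles.
private static final int SKIP_FRAMES = 30;
private int frameCount = 0;
private boolean photoSaved = false;

protected ImageReader.OnImageAvailableListener onImageAvailableListener = new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        Image img = reader.acquireLatestImage();
        if (img == null) {
            return;
        }
        frameCount++;
        if (frameCount <= SKIP_FRAMES || photoSaved) {
            // Discard early frames so auto-exposure has time to converge,
            // and ignore everything after the photo has been saved.
            img.close();
            return;
        }
        photoSaved = true;
        processImage(img); // saves the JPEG and closes the image, as in the original code
    }
};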

Finally, in many similar threads people suggest waiting, e.g. 500 ms after creating the session, and then taking a single picture. That did not work for me. You have to let the camera run and take many pictures in quick succession (as fast as it can). To do that, simply use the setRepeatingRequest method (as the original code already does).

Hope this helps.
