Getting an error on Android 8 on a Samsung device using SCameraCaptureSession


It seems there is an issue with the surface configuration coming from MediaRecorder. If you pass a custom persistent surface instead, it should work; a rough sketch of these steps follows the note below.

  1. Instantiate a surface by calling MediaCodec.createPersistentInputSurface()

  2. Pass it using mediaRecorder.setInputSurface(yourSurface);

  3. Call yourSurface.release() once you are done using this surface.

NOTE: Do not use mediaRecorder.getSurface() if you decide to use this approach
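
A minimal sketch of those three steps (my own illustration, not from the original answer; the method names are placeholders, and I assume the rest of the MediaRecorder configuration happens where the comment indicates):

Surface persistentSurface;

void prepareRecorderWithPersistentSurface(MediaRecorder mediaRecorder) throws IOException {
    persistentSurface = MediaCodec.createPersistentInputSurface();  // step 1 (API 23+)
    mediaRecorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
    // ...setOutputFormat(), setVideoEncoder(), setOutputFile(), sizes, etc...
    mediaRecorder.setInputSurface(persistentSurface);               // step 2: must be called before prepare()
    mediaRecorder.prepare();
    // Use persistentSurface as the capture-session output target,
    // NOT mediaRecorder.getSurface().
}

void releaseRecorderSurface() {
    if (persistentSurface != null) {
        persistentSurface.release();                                // step 3: when no longer needed
        persistentSurface = null;
    }
}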

REFERENCES:

MediaRecorder: MediaRecorder - Android Docs

MediaCodec: MediaCodec - Android Docs


I had the same exception and solved it in my case. The root cause was that I re-created the Surface of the TextureView. Once I changed the code so the Surface is not re-created, the exception was gone.

My code also worked fine before Android 8.0.

My camera initialization looks like the following.

CameraDevice mCameraDevice;
CameraCaptureSession mCameraCaptureSession;
CaptureRequest mCaptureRequest;
Surface mTextureViewSurface;

public void updateCameraState(boolean run) {
    if (run) {
        if (mTextureView == null || !mTextureView.isAvailable()) {
            // wait until mTextureView is available,
            // then call updateCameraState() again via SurfaceTextureListener
            return;
        }
        if (mCameraDevice == null) {
            // open camera and wait until mCameraDevice is obtained,
            // then call updateCameraState() again via CameraDevice.StateCallback
            mCameraManager.openCamera(...);
            return;
        }
        if (mCameraCaptureSession == null) {
            // createCaptureSession and wait until mCameraCaptureSession is obtained,
            // then call updateCameraState() again via CameraCaptureSession.StateCallback
            mTextureViewSurface = new Surface(texture);
            List<Surface> surfaces = Arrays.asList(mTextureViewSurface, mImageReader.getSurface());
            mCameraDevice.createCaptureSession(surfaces, mSessionStateCallback, sHandler);
            return;
        }
        if (mCaptureRequest == null) {
            CaptureRequest.Builder builder =
                    mCameraCaptureSession.getDevice().createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            /* Put some values into builder */

            // *************************************************************************
            // POINT: my old code re-created the Surface here
            // *************************************************************************
            // Surface surface = new Surface(texture);
            // builder.addTarget(surface);
            builder.addTarget(mTextureViewSurface);
            mCameraCaptureSession.setRepeatingRequest(builder.build(), mCaptureCallback, sHandler);
        }
        // fin
    } else {
        if (mCaptureRequest != null) {
            mCaptureRequest = null;
        }
        // *************************************************************************
        // POINT: I am not sure whether release() is needed, but I add it here.
        // *************************************************************************
        if (mTextureViewSurface != null) {
            mTextureViewSurface.release();
            mTextureViewSurface = null;
        }
        if (mCameraCaptureSession != null) {
            mCameraCaptureSession.close();
            mCameraCaptureSession = null;
        }
        if (mCameraDevice != null) {
            mCameraDevice.close();
            mCameraDevice = null;
        }
    }
}
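
For reference, this is roughly how the callbacks mentioned in the comments above could be wired so that updateCameraState() is re-invoked; this wiring is my own assumption and is not part of the original answer:

// Hypothetical wiring for the TextureView callback referenced in the comments above.
mTextureView.setSurfaceTextureListener(new TextureView.SurfaceTextureListener() {
    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture texture, int width, int height) {
        updateCameraState(true);   // mTextureView is now available
    }

    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture texture, int width, int height) { }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture texture) {
        updateCameraState(false);  // release camera resources together with the view
        return true;
    }

    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture texture) { }
});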


I had the same symptoms. I solved it by using the SurfaceView class with the legacy android.hardware.Camera API instead of android.hardware.camera2.

Here are the relevant parts of the code:

@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    // encode the preview frame
    encoding(data);
}

/**
 * Encode byte data.
 * @param data
 */
private void encoding(byte[] data) {
    // needed for API levels below 21
    ByteBuffer[] inputBuffers = this.mediaCodec.getInputBuffers();
    ByteBuffer[] outputBuffers = this.mediaCodec.getOutputBuffers();

    int inputBufferIndex = this.mediaCodec.dequeueInputBuffer(TIMEOUT_USEC /* wait time, negative value means infinite */);
    // an input buffer is available for writing
    if (inputBufferIndex >= 0) {
        // data is null (last data)
        int length = 0, flags = MediaCodec.BUFFER_FLAG_END_OF_STREAM;
        if (data != null) {
            ByteBuffer inputBuffer = null;
            if (CameraUtils.isCamera2()) inputBuffer = this.mediaCodec.getInputBuffer(inputBufferIndex);
            else inputBuffer = inputBuffers[inputBufferIndex];
            inputBuffer.clear();
            inputBuffer.put(data);
            length = data.length;
            flags = 0;
        }
        /*
         - index : the index returned by dequeueInputBuffer.
         - offset : usually 0, but it lets you specify where the data written into the buffer starts.
         - size : the size of the data written into the buffer.
         - presentationTimeUs : for decoding, the playback time of the data (in microseconds).
         - flags : indicates whether the buffer holds codec config data (BUFFER_FLAG_CODEC_CONFIG)
           or the last data (BUFFER_FLAG_END_OF_STREAM). Usually 0; pass
           BUFFER_FLAG_END_OF_STREAM to signal the last data.
         */
        this.mediaCodec.queueInputBuffer(inputBufferIndex, 0, length, computePresentationTimeNsec(), flags);
    }

    MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
    int outputBufferIndex = this.mediaCodec.dequeueOutputBuffer(bufferInfo, TIMEOUT_USEC /* wait time, negative value means infinite */);
    switch (outputBufferIndex) {
        /*
         MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED
         - the output buffer set has changed.
         - deprecated and unnecessary from API 21 (Lollipop), but required on earlier APIs.
           When this happens, the ByteBuffer[] array obtained at the start changes.
         */
        case MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED:
            Log.i(TAG, "INFO_OUTPUT_BUFFERS_CHANGED");
            outputBuffers = this.mediaCodec.getOutputBuffers();
            break;

        /*
         MediaCodec.INFO_OUTPUT_FORMAT_CHANGED
         - notifies that the MediaFormat created at the beginning has changed.
         - mainly relevant for encoders; decoders normally don't need it.
         */
        case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
            if (this.isMuxerStart) throw new RuntimeException("Format changed twice");
            Log.d(TAG, "INFO_OUTPUT_FORMAT_CHANGED format : " + this.mediaCodec.getOutputFormat());
            this.trackId = this.mediaMuxer.addTrack(this.mediaCodec.getOutputFormat());
            this.mediaMuxer.start();
            this.isMuxerStart = true;
            break;

        /*
         MediaCodec.INFO_TRY_AGAIN_LATER
         - can safely be ignored.
         */
        case MediaCodec.INFO_TRY_AGAIN_LATER:
            break;

        /*
         outputBufferIndex >= 0
         - this is the case where actual encoded output data arrives.
         */
        default:
            while (outputBufferIndex >= 0 && this.mediaCodec != null && this.mediaMuxer != null) {
                ByteBuffer outputBuffer = null;
                if (CameraUtils.isCamera2()) outputBuffer = this.mediaCodec.getOutputBuffer(outputBufferIndex);
                else outputBuffer = outputBuffers[outputBufferIndex];
                // null check
                if (outputBuffer == null)
                    throw new RuntimeException("EncoderOutputBuffer " + outputBufferIndex + " was NULL");

                if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                    // The codec config data was pulled out and fed to the muxer when we got
                    // the INFO_OUTPUT_FORMAT_CHANGED status. Ignore it.
                    bufferInfo.size = 0;
                }
                if (bufferInfo.size > 0) {
                    if (!this.isMuxerStart) throw new RuntimeException("MediaMuxer hasn't started");
                    // write the frame's timestamp
                    bufferInfo.presentationTimeUs = computePresentationTimeNsec();
                    this.prevTime = bufferInfo.presentationTimeUs;
                    this.mediaMuxer.writeSampleData(this.trackId, outputBuffer, bufferInfo);
                }
                this.mediaCodec.releaseOutputBuffer(outputBufferIndex, false /* true would render to a surface */);
                outputBufferIndex = this.mediaCodec.dequeueOutputBuffer(bufferInfo, TIMEOUT_USEC /* wait time, negative value means infinite */);

                // end of stream
                if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    // release
                    releaseRecorder();
                    // saving complete
                    onCompleteEncoding(recordPath);
                    stopEncodingThread();
                    return;
                }
            }
            break;
    }
}
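
For context, here is a minimal sketch of the legacy preview setup that would drive onPreviewFrame() above. This part is my own assumption and was not shown in the original answer: the class is assumed to implement Camera.PreviewCallback, startLegacyPreview() is a placeholder name, and the SurfaceHolder comes from your SurfaceView.

private Camera camera;

private void startLegacyPreview(SurfaceHolder holder) throws IOException {
    camera = Camera.open();            // legacy android.hardware.Camera (deprecated since API 21)
    camera.setPreviewDisplay(holder);  // render the preview into the SurfaceView
    camera.setPreviewCallback(this);   // deliver frames to onPreviewFrame(byte[], Camera)
    camera.startPreview();
}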