
Custom byteArray data to WebRTC videoTrack #6

Open
sumitatflo opened this issue Jul 18, 2017 · 0 comments

I need to send a specific cropped (face) video to the VideoTrack. I was able to manipulate the Camera1Session class of WebRTC to get the face cropped. Right now I am setting it to an ImageView.
My listenForBytebufferFrames() of Camera1Session.java:

private void listenForBytebufferFrames() {
    this.camera.setPreviewCallbackWithBuffer(new PreviewCallback() {
        public void onPreviewFrame(byte[] data, Camera callbackCamera) {
            Camera1Session.this.checkIsOnCameraThread();
            if (callbackCamera != Camera1Session.this.camera) {
                Logging.e("Camera1Session", "Callback from a different camera. This should never happen.");
            } else if (Camera1Session.this.state != Camera1Session.SessionState.RUNNING) {
                Logging.d("Camera1Session", "Bytebuffer frame captured but camera is no longer running.");
            } else {
                mFrameProcessor.setNextFrame(data, callbackCamera);
                long captureTimeNs = TimeUnit.MILLISECONDS.toNanos(SystemClock.elapsedRealtime());
                if (!Camera1Session.this.firstFrameReported) {
                    int startTimeMs = (int) TimeUnit.NANOSECONDS.toMillis(System.nanoTime() - Camera1Session.this.constructionTimeNs);
                    Camera1Session.camera1StartTimeMsHistogram.addSample(startTimeMs);
                    Camera1Session.this.firstFrameReported = true;
                }

                ByteBuffer byteBuffer1 = ByteBuffer.wrap(data);
                Frame outputFrame = new Frame.Builder()
                        .setImageData(byteBuffer1,
                                Camera1Session.this.captureFormat.width,
                                Camera1Session.this.captureFormat.height,
                                ImageFormat.NV21)
                        .setTimestampMillis(mFrameProcessor.mPendingTimeMillis)
                        .setId(mFrameProcessor.mPendingFrameId)
                        .setRotation(3)
                        .build();
                int w = outputFrame.getMetadata().getWidth();
                int h = outputFrame.getMetadata().getHeight();
                SparseArray<Face> detectedFaces = mDetector.detect(outputFrame);
                if (detectedFaces.size() > 0) {
                    Face face = detectedFaces.valueAt(0);
                    ByteBuffer byteBufferRaw = outputFrame.getGrayscaleImageData();
                    byte[] byteBuffer = byteBufferRaw.array();
                    YuvImage yuvimage = new YuvImage(byteBuffer, ImageFormat.NV21, w, h, null);
                    ByteArrayOutputStream baos = new ByteArrayOutputStream();

                    // My crop logic to get face co-ordinates

                    yuvimage.compressToJpeg(new Rect(left, top, right, bottom), 80, baos);
                    final byte[] jpegArray = baos.toByteArray();
                    Bitmap bitmap = BitmapFactory.decodeByteArray(jpegArray, 0, jpegArray.length);

                    Activity currentActivity = getActivity();
                    if (currentActivity instanceof CallActivity) {
                        ((CallActivity) currentActivity).setBitmapToImageView(bitmap); // face on ImageView is set just fine
                    }
                    Camera1Session.this.events.onByteBufferFrameCaptured(Camera1Session.this, data, Camera1Session.this.captureFormat.width, Camera1Session.this.captureFormat.height, Camera1Session.this.getFrameOrientation(), captureTimeNs);
                    Camera1Session.this.camera.addCallbackBuffer(data);
                } else {
                    Camera1Session.this.events.onByteBufferFrameCaptured(Camera1Session.this, data, Camera1Session.this.captureFormat.width, Camera1Session.this.captureFormat.height, Camera1Session.this.getFrameOrientation(), captureTimeNs);
                    Camera1Session.this.camera.addCallbackBuffer(data);
                }
            }
        }
    });
}

jpegArray is the final byte array that I need to stream via WebRTC. I tried something like this:

Camera1Session.this.events.onByteBufferFrameCaptured(Camera1Session.this, jpegArray, (int) face.getWidth(), (int) face.getHeight(), Camera1Session.this.getFrameOrientation(), captureTimeNs);

Camera1Session.this.camera.addCallbackBuffer(jpegArray);
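For reference, the native size check in androidvideotracksource.cc expects a full NV21 frame for the reported dimensions, not a JPEG-compressed crop. A minimal sketch of that expected length (the class name Nv21Size is mine, not from WebRTC):

```java
// Sketch: the length the native check requires is
//   width*height + 2 * uvWidth * ((height+1)/2),
// i.e. a full Y plane plus an interleaved VU plane.
// A few kilobytes of JPEG data can never satisfy it.
public class Nv21Size {
    static int nv21Length(int width, int height) {
        int uvWidth = (width + 1) / 2;
        return width * height + 2 * uvWidth * ((height + 1) / 2);
    }

    public static void main(String[] args) {
        // 640x480 capture format, matching the 460800 in the error below
        System.out.println(nv21Length(640, 480)); // 460800
    }
}
```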

Setting them up like this gives me the following error:

../../webrtc/sdk/android/src/jni/androidvideotracksource.cc line 82
Check failed: length >= width * height + 2 * uv_width * ((height + 1) / 2) (2630 vs. 460800)

I assume this is because androidvideotracksource does not receive the byte array length it expects, since the frame is now cropped. Could someone point me in the right direction? Is this the correct way/place to manipulate the data and feed it into the videoTrack?
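One way to satisfy that check (a sketch, not verified against this WebRTC revision) would be to convert the cropped Bitmap's ARGB pixels back into a complete NV21 buffer before handing it to onByteBufferFrameCaptured, reporting the crop's width and height. The helper below operates on a plain int[] such as Bitmap.getPixels() returns; the class and method names are hypothetical:

```java
// Hypothetical helper: pack ARGB pixels into an NV21 (Y plane + interleaved
// VU plane) buffer whose length matches the native check for width x height.
public class ArgbToNv21 {
    static byte[] argbToNv21(int[] argb, int width, int height) {
        int frameSize = width * height;
        byte[] nv21 = new byte[frameSize + 2 * ((width + 1) / 2) * ((height + 1) / 2)];
        int uvIndex = frameSize;
        for (int j = 0; j < height; j++) {
            for (int i = 0; i < width; i++) {
                int p = argb[j * width + i];
                int r = (p >> 16) & 0xff, g = (p >> 8) & 0xff, b = p & 0xff;
                // BT.601 RGB -> YUV (studio swing), the usual Android recipe
                int y = ((66 * r + 129 * g + 25 * b + 128) >> 8) + 16;
                int u = ((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128;
                int v = ((112 * r - 94 * g - 18 * b + 128) >> 8) + 128;
                nv21[j * width + i] = (byte) Math.max(0, Math.min(255, y));
                // One VU pair per 2x2 block; NV21 stores V before U
                if (j % 2 == 0 && i % 2 == 0) {
                    nv21[uvIndex++] = (byte) Math.max(0, Math.min(255, v));
                    nv21[uvIndex++] = (byte) Math.max(0, Math.min(255, u));
                }
            }
        }
        return nv21;
    }

    public static void main(String[] args) {
        // A black 640x480 crop yields exactly the length the check demands
        byte[] out = argbToNv21(new int[640 * 480], 640, 480);
        System.out.println(out.length); // 460800
    }
}
```

The byte array returned this way could then be passed in place of jpegArray, with the matching dimensions, so the native length check holds.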

jenkins-pristine pushed a commit that referenced this issue Aug 3, 2017
…rectory. (patchset #6 id:100001 of https://codereview.webrtc.org/2992103002/ )

Reason for revert:
Broken in the internal projects.

Original issue's description:
> Break peerconnection_jni.cc into multiple files, in "pc" directory.
>
> This CL breaks peerconnection_jni.cc apart, into one file for each
> class. It also moves the methods for converting between C++/Java
> structs into "java_native_conversion.cc", and uses a consistent naming
> scheme ("JavaToNativeX, NativeToJavaX"). These files go into a new
> "pc" directory, of which deadbeef@ is added as an owner.
>
> It also moves some relevant files to the "pc" directory that belong
> there: ownedfactoryandthreads, androidnetworkmonitor_jni, and
> rtcstatscollectorcallbackwrapper. This directory is intended to hold
> all the files that deal with the PeerConnection API specifically, or
> related classes (like DataChannel, RtpSender, MediaStreamTrack) that
> are tied to it closely.
>
> deadbeef@webrtc.org is added as an owner of the new "pc" subdirectory.
>
> BUG=webrtc:8055
>
> Review-Url: https://codereview.webrtc.org/2992103002
> Cr-Commit-Position: refs/heads/master@{#19223}
> Committed: https://chromium.googlesource.com/external/webrtc/+/dd7d8f1b609d51bcf39e9585871967a694a856bb

TBR=magjed@webrtc.org,sakal@webrtc.org,deadbeef@webrtc.org
# Skipping CQ checks because original CL landed less than 1 day ago.
NOPRESUBMIT=true
NOTREECHECKS=true
NOTRY=true
BUG=webrtc:8055

Review-Url: https://codereview.webrtc.org/2989323002
Cr-Commit-Position: refs/heads/master@{#19226}