Live Stream to HTML5 Video Tag #598
Comments
FFmpeg supports the HTTP protocol, sure: https://ffmpeg.org/ffmpeg-protocols.html#http
I'm not sure what you mean by "range packets" though. Could you elaborate on why this isn't currently possible with FFmpegFrameRecorder?
FYI, there is some sample code here:
Thanks a lot, I will try it and let you know; this should be able to start an HTTP server running inside FFmpeg and do the streaming. My second question was whether I can use my own HTTP server or a web container like Tomcat. To do that, I will have to implement a servlet that responds to the Range requests that Chrome and some other browsers send to download video; see an example Range servlet here: https://gist.github.com/jsfeng/2629858 . The server advertises support with an "Accept-Ranges: bytes" response header, and the browser then splits the video download into many byte-range requests. So how can I get bytes from the FrameGrabber or the Recorder, i.e. obtain an encoded chunk of n bytes? What I am actually trying to achieve is to stream a sequence of images that I receive from a socket in JPEG format into an HTML Canvas element as a live video stream.
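The byte-range handshake described above can be sketched with plain JDK code. The class and method names below are illustrative (not from the linked gist), and only the common single-range form is handled:

```java
// Minimal sketch of HTTP Range header parsing for a progressive-download
// servlet. Handles the single-range form "bytes=start-end"; multi-range
// requests and most validation are omitted for brevity.
public class RangeParser {
    /** Returns {start, endInclusive} for a header like "bytes=0-499",
     *  or null if the header is absent or not satisfiable. */
    public static long[] parseRange(String rangeHeader, long contentLength) {
        if (rangeHeader == null || !rangeHeader.startsWith("bytes=")) {
            return null;
        }
        String spec = rangeHeader.substring("bytes=".length());
        int dash = spec.indexOf('-');
        if (dash < 0) {
            return null;
        }
        String startStr = spec.substring(0, dash);
        String endStr = spec.substring(dash + 1);
        long start, end;
        if (startStr.isEmpty()) {
            // Suffix range "bytes=-500": the last 500 bytes of the resource.
            long suffix = Long.parseLong(endStr);
            start = Math.max(0, contentLength - suffix);
            end = contentLength - 1;
        } else {
            start = Long.parseLong(startStr);
            // Open-ended "bytes=0-" means "to the end of the resource".
            end = endStr.isEmpty() ? contentLength - 1 : Long.parseLong(endStr);
        }
        if (start > end || start >= contentLength) {
            return null; // caller should answer 416 Range Not Satisfiable
        }
        return new long[]{start, Math.min(end, contentLength - 1)};
    }
}
```

A servlet would call this with the request's `Range` header, reply with status 206 and a `Content-Range: bytes start-end/total` header, and write only that slice of the body.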
I see. You can't expect FFmpeg to magically know how many bytes you're going to have before encoding happens...
I tried your suggestion, and I can't seem to start the FFmpeg HTTP server. I am getting the following exception: Output #0, webm, to 'http://localhost:9090': This is how I am trying to initialize the WebM HTTP streaming server:
Regarding the second question: the HTML5 video tag performs a progressive download, so if nothing has been encoded yet we will simply wait until there is data in the requested range, meaning that recording will have to be done in a separate thread somehow. From my understanding, it is not actually streaming but endless progressive downloading of byte-array pieces. So is there a way to get an encoded packet based on a range?
For some reason, it seems like the URLs need to start with "tcp" instead of "http". This works fine here:

```java
FFmpegFrameRecorder recorder = new FFmpegFrameRecorder("tcp://localhost:9090?listen", imageWidth, imageHeight);
recorder.setFormat("webm");
recorder.start();
```

For your questions concerning FFmpeg specifically, you should ask on FFmpeg's mailing list, not here.
Actually,
Hello, I cannot build javacv-master using mvn install:
[ERROR] Failed to execute goal on project javacv: Could not resolve dependencies for project org.bytedeco:javacv:jar:1.3.2-SNAPSHOT: The following artifacts could not be resolved: org.bytedeco:javacpp:jar:1.3.2-SNAPSHOT, org.bytedeco.javacpp-presets:opencv:jar:3.1.0-1.3.2-SNAPSHOT, org.bytedeco.javacpp-presets:ffmpeg:jar:3.2.1-1.3.2-SNAPSHOT, org.bytedeco.javacpp-presets:flycapture:jar:2.9.3.43-1.3.2-SNAPSHOT, org.bytedeco.javacpp-presets:libdc1394:jar:2.2.4-1.3.2-SNAPSHOT, org.bytedeco.javacpp-presets:libfreenect:jar:0.5.3-1.3.2-SNAPSHOT, org.bytedeco.javacpp-presets:libfreenect2:jar:0.2.0-1.3.2-SNAPSHOT, org.bytedeco.javacpp-presets:librealsense:jar:1.9.6-1.3.2-SNAPSHOT, org.bytedeco.javacpp-presets:videoinput:jar:0.200-1.3.2-SNAPSHOT, org.bytedeco.javacpp-presets:artoolkitplus:jar:2.3.1-1.3.2-SNAPSHOT, org.bytedeco.javacpp-presets:flandmark:jar:1.07-1.3.2-SNAPSHOT: Could not find artifact org.bytedeco:javacpp:jar:1.3.2-SNAPSHOT -> [Help 1]
Reset the version to 1.3 and try again. |
Changed pom.xml to 1.3. Now getting the same for 1.3:
[ERROR] Failed to execute goal on project javacv: Could not resolve dependencies for project org.bytedeco:javacv:jar:1.3: Failure to find org.bytedeco.javacpp-presets:libfreenect2:jar:0.2.0-1.3 in https://repo.maven.apache.org/maven2 was cached in the local repository, resolution will not be reattempted until the update interval of central has elapsed or updates are forced -> [Help 1]
Right, you can remove that dependency since you don't need it.
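Concretely, that means deleting the unresolvable dependency block from javacv's pom.xml. The coordinates below are taken from the Maven error above; the exact surrounding XML in the real pom may differ:

```xml
<!-- Delete (or comment out) this block in pom.xml if libfreenect2
     support is not needed; coordinates from the error message above. -->
<dependency>
  <groupId>org.bytedeco.javacpp-presets</groupId>
  <artifactId>libfreenect2</artifactId>
  <version>0.2.0-1.3</version>
</dependency>
```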
I've just released version 1.3.1 with the fix, so no need to build from source. Enjoy! |
Hello,
Is it possible to use JavaCV to live-stream a sequence of images from a FrameGrabber into an HTML5 Canvas as an HTML5 video stream (currently OGV, MP4, and WebM are supported by browsers)? I can see that FFmpeg supports HTTP streaming using an experimental HTTP server: https://ffmpeg.org/ffmpeg-protocols.html#http but how can that be done using JavaCV?
Similarly, could it be possible to use the FrameGrabber in a servlet and serve Range packets using the FFmpegFrameRecorder?