FFmpegFrameFilter with "eq" filter produces strange video #887

Closed
kahgoh opened this issue Jan 24, 2018 · 14 comments

kahgoh commented Jan 24, 2018

I am trying to adjust the brightness of a video file (I am using a small clip, on a Windows desktop computer).

My understanding from reading the FFmpeg documentation is that I should apply the eq filter with the appropriate brightness value, so I am trying to use FFmpegFrameFilter to apply it:

import static org.bytedeco.javacpp.avcodec.AV_CODEC_ID_H264;
import static org.bytedeco.javacpp.avutil.AV_PIX_FMT_YUV420P;

import java.nio.file.Path;
import java.nio.file.Paths;
import org.bytedeco.javacv.FFmpegFrameFilter;
import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.Frame;

public class VideoRecorder {

  public static void main(String args[]) throws Exception {

    Path inputFile = Paths.get("C:\\Users\\gohkah\\Videos\\SampleVideo_1280x720_5mb.mp4");
    Path outputFile = Paths.get("C:\\Users\\gohkah\\Videos\\filtered.mp4");

    try (FFmpegFrameGrabber frameGrabber = new FFmpegFrameGrabber(inputFile.toFile())) {
      frameGrabber.setFormat("mp4");
      frameGrabber.start();

      try (FFmpegFrameFilter frameFilter =
          new FFmpegFrameFilter(
              "eq=brightness=0.25",
              frameGrabber.getImageWidth(),
              frameGrabber.getImageHeight())) {
        frameFilter.start();

        try (FFmpegFrameRecorder recorder =
            new FFmpegFrameRecorder(
                outputFile.toFile(),
                frameFilter.getImageWidth(),
                frameFilter.getImageHeight(),
                0)) {

          recorder.setVideoCodec(AV_CODEC_ID_H264);
          recorder.setPixelFormat(AV_PIX_FMT_YUV420P);
          recorder.setFormat("mp4");
          recorder.setFrameRate(frameGrabber.getFrameRate());
          recorder.start();

          Frame frame = frameGrabber.grabImage();
          while (frame != null) {
            frameFilter.push(frame, frameGrabber.getPixelFormat());

            Frame filteredFrame = frameFilter.pull();
            if (filteredFrame != null) {
              recorder.record(filteredFrame);
            } else {
              System.out.println("Filter returned null frame!");
            }
            frame = frameGrabber.grabImage();
          }

          frameGrabber.flush();

          frameFilter.push(null);
          Frame filteredFrame = frameFilter.pull();
          recorder.record(filteredFrame);

          recorder.stop();
          recorder.release();
        }
        frameFilter.stop();
        frameFilter.release();
      }
      frameGrabber.stop();
      frameGrabber.release();
    }
  }
}

However, this produces a video that looks something like this:
[screenshot: vlcsnap-2018-01-24-09h21m51s16]

I have tried running it through FFplay and the video shows correctly with just the brightness changed.

Another thing I noticed is that if I simply change the filter in the above code to rotate=-PI/6, it produces the video correctly. Am I missing something?


saudet commented Jan 24, 2018

It might help to set the pixel format of the FFmpegFrameFilter before frameFilter.start():

frameFilter.setPixelFormat(frameGrabber.getPixelFormat());


saudet commented Jan 24, 2018

Also the output of the eq filter might change the pixel format. It's not currently exposed through the API though. Could you check if this is what is happening in pull()?


kahgoh commented Jan 24, 2018

> It might help to set the pixel format of the FFmpegFrameFilter before frameFilter.start():

Just tried adding that in. It didn't make a difference.

> Also the output of the eq filter might change the pixel format. It's not currently exposed through the API though. Could you check if this is what is happening in pull()?

I wasn't 100% sure how to do this, but I set a breakpoint at the end of the pull() method in FFmpegFrameFilter and compared image_frame.format() (which returned 3) with filt_frame.format() (which returned 5).


saudet commented Jan 24, 2018

Yes, 3 is AV_PIX_FMT_BGR24 and 5 is AV_PIX_FMT_YUV444P, so the pixel format does change. We'll need to add a way to query that.

saudet added a commit that referenced this issue Jan 24, 2018

saudet commented Jan 24, 2018

I've added an FFmpegFrameFilter.getPixelFormat() method in the latest commit above, so we can do recorder.record(filteredFrame, frameFilter.getPixelFormat()), but just calling recorder.record(filteredFrame, AV_PIX_FMT_YUV444P) should work for your code.
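With that change, the record loop from the code at the top of this issue would look something like this (a sketch, assuming a JavaCV build that includes the new getPixelFormat() method; the grabber, filter, and recorder variables are the same as in the original snippet):

```java
Frame frame = frameGrabber.grabImage();
while (frame != null) {
    frameFilter.push(frame, frameGrabber.getPixelFormat());

    Frame filteredFrame = frameFilter.pull();
    if (filteredFrame != null) {
        // Pass the filter's actual output pixel format (YUV444P here)
        // to the recorder, instead of letting it assume the Frame is
        // still in the grabber's format.
        recorder.record(filteredFrame, frameFilter.getPixelFormat());
    }
    frame = frameGrabber.grabImage();
}
```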


kahgoh commented Jan 25, 2018

> I've added an FFmpegFrameFilter.getPixelFormat() method in the latest commit above, so we can do recorder.record(filteredFrame, frameFilter.getPixelFormat()), but just calling recorder.record(filteredFrame, AV_PIX_FMT_YUV444P) should work for your code.

Yup, just tried it (using recorder.record(filteredFrame, AV_PIX_FMT_YUV444P)) and it works!


kahgoh commented Mar 6, 2018

I have just noticed that a similar issue exists with CanvasFrame. If I change the code to display frames in a CanvasFrame instead of writing them to a file, it produces video like the following:

[screenshot: canvas_frame]

Sample code to reproduce with the same video mentioned in the issue:

import java.nio.file.Path;
import java.nio.file.Paths;
import org.bytedeco.javacv.CanvasFrame;
import org.bytedeco.javacv.FFmpegFrameFilter;
import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.Frame;

public class TestPlayback {

  public static void main(String args[]) throws Exception {

    Path inputFile = Paths.get("C:\\Users\\gohkah\\Videos\\SampleVideo_1280x720_5mb.mp4");
    CanvasFrame display = new CanvasFrame("Test video");

    display.setSize(1280, 720);
    display.setVisible(true);

    try (FFmpegFrameGrabber frameGrabber = new FFmpegFrameGrabber(inputFile.toFile())) {
      frameGrabber.setFormat("mp4");
      frameGrabber.start();

      try (FFmpegFrameFilter frameFilter =
          new FFmpegFrameFilter(
              "eq=brightness=0.25", frameGrabber.getImageWidth(), frameGrabber.getImageHeight())) {
        frameFilter.start();

        Frame frame = frameGrabber.grabImage();
        while (frame != null) {
          frameFilter.push(frame, frameGrabber.getPixelFormat());

          Frame filteredFrame = frameFilter.pull();
          if (filteredFrame != null) {
            display.showImage(filteredFrame);
          } else {
            System.out.println("Filter returned null frame!");
          }
          frame = frameGrabber.grabImage();
        }

        frameGrabber.flush();

        frameFilter.stop();
        frameFilter.release();
      }
      frameGrabber.stop();
      frameGrabber.release();
    }
  }
}


saudet commented Mar 6, 2018

@kahgoh CanvasFrame doesn't support YUV images. Convert them to RGB images.


saudet commented Mar 31, 2018

Fix included in JavaCV 1.4.1! Thanks for reporting this issue.


JiGuangYuan commented Jun 20, 2019

> @kahgoh CanvasFrame doesn't support YUV images. Convert them to RGB images.

@saudet I have the same problem. How can I convert a YUV444P Frame into an RGB Frame?

I tried using OpenCV to convert it, but it doesn't work:

  Frame oldFrame = filter.pull();
  Mat mat = openCVFrameConverter.convertToMat(oldFrame);
  outMat = new Mat();
  opencv_imgproc.cvtColor(mat, outMat, opencv_imgproc.COLOR_YUV2BGR);
  outFrame = openCVFrameConverter.convert(outMat);


saudet commented Jun 20, 2019 via email


saudet commented Jun 21, 2019

Something like FFmpegFrameGrabber.setPixelFormat(AV_PIX_FMT_RGB24) before start() will do the trick.
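For completeness, a minimal sketch of that suggestion (the input path is a placeholder; this assumes the same JavaCV/JavaCPP imports used in the earlier snippets in this thread):

```java
import static org.bytedeco.javacpp.avutil.AV_PIX_FMT_RGB24;

import org.bytedeco.javacv.FFmpegFrameGrabber;

FFmpegFrameGrabber frameGrabber = new FFmpegFrameGrabber("input.mp4");
// Ask the grabber to deliver RGB frames directly, so no separate
// YUV-to-RGB conversion step is needed before display.
frameGrabber.setPixelFormat(AV_PIX_FMT_RGB24);
frameGrabber.start();
```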


JiGuangYuan commented Jun 21, 2019

@saudet, thank you for your reply.
I can't use setPixelFormat() here: I already have the Frame from filter.pull(), and I want to convert it to an RGB Frame for display in the UI.
So I tried to use FFmpeg directly instead, but I am not very familiar with FFmpeg's API. I wrote the following, but something is wrong and I can't get the right Buffer[].
Can you give me an example? Thank you so much.

private Frame convertYUV2RGBFrame(Frame frame) throws Exception {
        AVFrame yuv_picture = avutil.av_frame_alloc();
        AVFrame rgb_picture = avutil.av_frame_alloc();

        int width = frame.imageWidth;
        int height = frame.imageHeight;
        int depth = frame.imageDepth;
        int channels = frame.imageChannels;
        int stride = frame.imageStride;
        int yuv_pixelFormat = avutil.AV_PIX_FMT_YUV444P;
        int rgb_PixelFormat = avutil.AV_PIX_FMT_BGR24;

        int size = av_image_get_buffer_size(rgb_PixelFormat, width, height, 1);
        BytePointer rgb_data = new BytePointer(av_malloc(size));

        Buffer[] image = frame.image;
        int step = stride * Math.abs(depth) / 8;
        BytePointer yuv_data = image[0] instanceof ByteBuffer
                ? new BytePointer((ByteBuffer) image[0]).position(0)
                : new BytePointer(new Pointer(image[0]).position(0));

        SwsContext img_convert_ctx = swscale.sws_getContext(width, height, yuv_pixelFormat, width, height, rgb_PixelFormat, swscale.SWS_BILINEAR,
                null, null, (DoublePointer) null);
        if (img_convert_ctx == null) {
            throw new Exception("sws_getCachedContext() error: Cannot initialize the conversion context.");
        }
        avutil.av_image_fill_arrays(new PointerPointer(yuv_picture), yuv_picture.linesize(), yuv_data, yuv_pixelFormat, width, height, 1);
        avutil.av_image_fill_arrays(new PointerPointer(rgb_picture), rgb_picture.linesize(), rgb_data, rgb_PixelFormat, width, height, 1);
        yuv_picture.linesize(0, step);
        yuv_picture.format(yuv_pixelFormat);
        yuv_picture.width(width);
        yuv_picture.height(height);

        rgb_picture.format(rgb_PixelFormat);
        rgb_picture.width(width);
        rgb_picture.height(height);
        swscale.sws_scale(img_convert_ctx, new PointerPointer(yuv_picture), yuv_picture.linesize(), 0, height, new PointerPointer(rgb_picture), rgb_picture.linesize());
        Frame outFrame = new Frame(width, height, depth, channels, stride);
        outFrame.image = new Buffer[]{rgb_picture.data(0).asBuffer()};
        swscale.sws_freeContext(img_convert_ctx);
        avutil.av_frame_free(yuv_picture);
        avutil.av_frame_free(rgb_picture);
        return outFrame;
    }


saudet commented Jun 24, 2019

If you're using FFmpegFrameFilter, you can use the format filter on its output for that:
https://ffmpeg.org/ffmpeg-filters.html#format-1
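For example, appending the format filter to the filter string used earlier in this thread pins the graph's output pixel format (a sketch; frameGrabber is assumed to be a started FFmpegFrameGrabber as in the earlier snippets):

```java
// Force the filter graph to emit BGR24 frames, which CanvasFrame and
// OpenCV can consume directly, regardless of what pixel format the
// eq filter would otherwise produce.
FFmpegFrameFilter frameFilter = new FFmpegFrameFilter(
        "eq=brightness=0.25,format=pix_fmts=bgr24",
        frameGrabber.getImageWidth(),
        frameGrabber.getImageHeight());
frameFilter.setPixelFormat(frameGrabber.getPixelFormat());
frameFilter.start();
```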
