How to decode H.264 video frame in Java environment

Disclaimer: this page reproduces a popular StackOverflow question and its answers under the CC BY-SA 4.0 license. If you reuse or share it, you must do so under the same license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/2165593/

Tags: java, decode, h.264

Asked by

Does anyone know how to decode H.264 video frame in Java environment?

My network camera products support RTP/RTSP streaming.

My network camera serves standard RTP/RTSP, and it also supports "RTP/RTSP over HTTP".

RTSP: TCP 554
RTP start port: UDP 5000

Answer by ptsw

Take a look at the Java Media Framework (JMF) - http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/formats.html

I used it a while back and it was a bit immature, but they may have beefed it up since then.

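For what it's worth, basic playback in JMF goes through Manager and a MediaLocator. Here is a minimal sketch; the RTSP URL is a placeholder, and whether H.264 actually decodes depends on the codec plugins installed alongside JMF:

import javax.media.Manager;
import javax.media.MediaLocator;
import javax.media.Player;

public class JmfRtspPlayer {
    public static void main(String[] args) throws Exception {
        // Create a realized Player for the camera's RTSP URL and start playback
        Player player = Manager.createRealizedPlayer(
                new MediaLocator("rtsp://192.168.0.10:554/stream1"));
        player.start();
    }
}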

Answer by Art Clarke

Or use Xuggler. It works with RTP, RTMP, HTTP and other protocols, can decode and encode H.264 and most other codecs, and is actively maintained, free, and open source (LGPL).

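A minimal sketch of that approach using Xuggler's mediatool API (the RTSP URL is a placeholder):

import java.awt.image.BufferedImage;

import com.xuggle.mediatool.IMediaReader;
import com.xuggle.mediatool.MediaListenerAdapter;
import com.xuggle.mediatool.ToolFactory;
import com.xuggle.mediatool.event.IVideoPictureEvent;

public class XugglerDecodeExample {
    public static void main(String[] args) {
        // Open the stream by URL; Xuggler (via ffmpeg) handles the protocol
        IMediaReader reader = ToolFactory.makeReader("rtsp://192.168.0.10:554/stream1");
        reader.setBufferedImageTypeToGenerate(BufferedImage.TYPE_3BYTE_BGR);
        reader.addListener(new MediaListenerAdapter() {
            @Override
            public void onVideoPicture(IVideoPictureEvent event) {
                BufferedImage frame = event.getImage(); // one decoded H.264 frame
                // ... hand the frame to your rendering or processing code
            }
        });
        // readPacket() decodes and dispatches events until end of stream or error
        while (reader.readPacket() == null)
            ;
    }
}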

Answer by Stanislav Vitvitskyy

You can use a pure Java library called JCodec (http://jcodec.org).
Decoding one H.264 frame is as easy as:

ByteBuffer bb = ... // Your frame data is stored in this buffer
H264Decoder decoder = new H264Decoder();
Picture out = Picture.create(1920, 1088, ColorSpace.YUV_420); // Allocate output frame of max size
Picture real = decoder.decodeFrame(bb, out.getData());
BufferedImage bi = JCodecUtil.toBufferedImage(real); // If you prefer an AWT image

If you want to read a frame from a container (like MP4), you can use the handy helper class FrameGrab:

int frameNumber = 150;
BufferedImage frame = FrameGrab.getFrame(new File("filename.mp4"), frameNumber);
ImageIO.write(frame, "png", new File("frame_150.png"));

Finally, here's a complete, more sophisticated sample:

private static void avc2png(String in, String out) throws IOException {
    SeekableByteChannel sink = null;
    SeekableByteChannel source = null;
    try {
        source = readableFileChannel(in);
        sink = writableFileChannel(out);

        MP4Demuxer demux = new MP4Demuxer(source);

        H264Decoder decoder = new H264Decoder();

        Transform transform = new Yuv420pToRgb(0, 0);

        MP4DemuxerTrack inTrack = demux.getVideoTrack();

        VideoSampleEntry ine = (VideoSampleEntry) inTrack.getSampleEntries()[0];
        Picture target1 = Picture.create((ine.getWidth() + 15) & ~0xf, (ine.getHeight() + 15) & ~0xf,
                ColorSpace.YUV420);
        Picture rgb = Picture.create(ine.getWidth(), ine.getHeight(), ColorSpace.RGB);
        ByteBuffer _out = ByteBuffer.allocate(ine.getWidth() * ine.getHeight() * 6);
        BufferedImage bi = new BufferedImage(ine.getWidth(), ine.getHeight(), BufferedImage.TYPE_3BYTE_BGR);
        AvcCBox avcC = Box.as(AvcCBox.class, Box.findFirst(ine, LeafBox.class, "avcC"));

        decoder.addSps(avcC.getSpsList());
        decoder.addPps(avcC.getPpsList());

        Packet inFrame;
        int totalFrames = (int) inTrack.getFrameCount();
        for (int i = 0; (inFrame = inTrack.getFrames(1)) != null; i++) {
            ByteBuffer data = inFrame.getData();

            Picture dec = decoder.decodeFrame(splitMOVPacket(data, avcC), target1.getData());
            transform.transform(dec, rgb);
            _out.clear();

            AWTUtil.toBufferedImage(rgb, bi);
            ImageIO.write(bi, "png", new File(format(out, i)));
            if (i % 100 == 0)
                System.out.println((i * 100 / totalFrames) + "%");
        }
    } finally {
        if (sink != null)
            sink.close();
        if (source != null)
            source.close();
    }
}
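
(The helpers readableFileChannel, writableFileChannel and splitMOVPacket used above presumably come as static imports from JCodec utility classes such as NIOUtils and H264Utils, and format presumably builds a numbered output file name; exact package names vary between JCodec versions.)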

Answer by Yao

I think the best solution is to use "JNI + ffmpeg". In my current project, I need to play several full-screen videos at the same time in a Java OpenGL game based on libgdx. I tried almost all the free libraries, but none of them had acceptable performance. So finally I decided to write my own JNI C code to work with ffmpeg. Here is the final performance on my laptop:

  • Environment: CPU: Core i7 Q740 @ 1.73 GHz, Video: nVidia GeForce GT 435M, OS: Windows 7 64-bit, Java: Java 7u60 64-bit
  • Video: h264rgb / h264 encoded, no sound, resolution: 1366 * 768
  • Solution: decode with JNI + ffmpeg v2.2.2, upload to GPU by updating an OpenGL texture using LWJGL (see the sketch after this list)
  • Performance: decoding speed: 700-800 FPS, texture uploading: about 1 ms per frame
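
As a rough sketch of what that looks like on the Java side, the decoder reduces to a thin JNI facade over the native ffmpeg code; every name below is hypothetical, and the C side (not shown) would wrap avformat/avcodec:

import java.nio.ByteBuffer;

public class NativeVideoDecoder {
    static {
        // Hypothetical JNI library wrapping ffmpeg's avformat/avcodec
        System.loadLibrary("nativedecoder");
    }

    public native long open(String path);                                   // returns an opaque native handle
    public native boolean decodeNextFrame(long handle, ByteBuffer rgbaOut); // false at end of stream
    public native void close(long handle);
}

A render loop then allocates one direct ByteBuffer of width * height * 4 bytes, passes it to decodeNextFrame, and uploads it into an OpenGL texture each frame.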

It took me only a few days to complete the first version, but that version's decoding speed was only about 120 FPS and uploading took about 5 ms per frame. After several months of optimization, I reached this final performance and added some extra features. Now I can play several HD videos at the same time without any slowdown.

Most videos in my game have a transparent background. Such a transparent video is an MP4 file with two video streams: one stores h264rgb-encoded RGB data, the other stores h264-encoded alpha data. So to play an alpha video, I need to decode the two video streams, merge them together (as sketched below), and then upload the result to the GPU. As a result, I can play several transparent HD videos above an opaque HD video at the same time in my game.

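A sketch of that merge step (a hypothetical helper, not from the answer; it assumes the RGB frame is already in packed 3-byte form and the alpha stream's decoded luma plane is one byte per pixel):

// Interleave a decoded RGB frame and the alpha stream's luma plane into one RGBA buffer
static void mergeRgbAndAlpha(byte[] rgb, byte[] alphaLuma, byte[] rgbaOut, int pixelCount) {
    for (int i = 0; i < pixelCount; i++) {
        rgbaOut[i * 4]     = rgb[i * 3];      // R
        rgbaOut[i * 4 + 1] = rgb[i * 3 + 1];  // G
        rgbaOut[i * 4 + 2] = rgb[i * 3 + 2];  // B
        rgbaOut[i * 4 + 3] = alphaLuma[i];    // A, from the second video stream
    }
}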

Answer by Teocci

I found a very simple and straightforward solution based on JavaCV's FFmpegFrameGrabber class. This library lets you play streaming media by wrapping ffmpeg in Java.

How to use it?

First, download and install the library using Maven or Gradle.

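With Maven, for example, that means adding the javacv-platform artifact (the version below is only indicative; check for the current release):

<dependency>
    <groupId>org.bytedeco</groupId>
    <artifactId>javacv-platform</artifactId>
    <version>1.5.9</version>
</dependency>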

Here you have a StreamingClient class that calls a SimplePlayer class, which uses a thread to play the video.

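Both classes rely on a small GrabberListener callback interface that isn't shown here; a minimal sketch, with the method signatures inferred from how the classes below use them:

import javafx.scene.image.Image;
import javax.sound.sampled.FloatControl;

public interface GrabberListener
{
    void onMediaGrabbed(int width, int height);   // reports the grabbed video's dimensions
    void onImageProcessed(Image image);           // delivers each decoded frame as a JavaFX Image
    void onPlaying();
    void onGainControl(FloatControl gainControl);
}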

public class StreamingClient extends Application implements GrabberListener
{
    public static void main(String[] args)
    {
        launch(args);
    }

    private Stage primaryStage;
    private ImageView imageView;

    private SimplePlayer simplePlayer;

    @Override
    public void start(Stage stage) throws Exception
    {
        String source = "rtsp://184.72.239.149/vod/mp4:BigBuckBunny_115k.mov"; // the video is weird for 1 minute then becomes stable

        primaryStage = stage;
        imageView = new ImageView();

        StackPane root = new StackPane();

        root.getChildren().add(imageView);
        imageView.fitWidthProperty().bind(primaryStage.widthProperty());
        imageView.fitHeightProperty().bind(primaryStage.heightProperty());

        Scene scene = new Scene(root, 640, 480);

        primaryStage.setTitle("Streaming Player");
        primaryStage.setScene(scene);
        primaryStage.show();

        simplePlayer = new SimplePlayer(source, this);
    }

    @Override
    public void onMediaGrabbed(int width, int height)
    {
        primaryStage.setWidth(width);
        primaryStage.setHeight(height);
    }

    @Override
    public void onImageProcessed(Image image)
    {
        System.out.println("image: " + image);

        Platform.runLater(() -> {
            imageView.setImage(image);
        });
    }

    @Override
    public void onPlaying() {}

    @Override
    public void onGainControl(FloatControl gainControl) {}

    @Override
    public void stop() throws Exception
    {
        simplePlayer.stop();
    }
}

The SimplePlayer class uses FFmpegFrameGrabber to decode frames; each frame is converted into an image and displayed in your Stage.

public class SimplePlayer
{
    private static volatile Thread playThread;
    private AnimationTimer timer;

    private SourceDataLine soundLine;

    private int counter;

    public SimplePlayer(String source, GrabberListener grabberListener)
    {
        if (grabberListener == null) return;
        if (source.isEmpty()) return;

        counter = 0;

        playThread = new Thread(() -> {
            try {
                FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(source);
                grabber.start();

                grabberListener.onMediaGrabbed(grabber.getImageWidth(), grabber.getImageHeight());

                if (grabber.getSampleRate() > 0 && grabber.getAudioChannels() > 0) {
                    AudioFormat audioFormat = new AudioFormat(grabber.getSampleRate(), 16, grabber.getAudioChannels(), true, true);

                    DataLine.Info info = new DataLine.Info(SourceDataLine.class, audioFormat);
                    soundLine = (SourceDataLine) AudioSystem.getLine(info);
                    soundLine.open(audioFormat);
                    soundLine.start();
                }

                Java2DFrameConverter converter = new Java2DFrameConverter();

                while (!Thread.interrupted()) {
                    Frame frame = grabber.grab();
                    if (frame == null) {
                        break;
                    }
                    if (frame.image != null) {

                        Image image = SwingFXUtils.toFXImage(converter.convert(frame), null);
                        Platform.runLater(() -> {
                            grabberListener.onImageProcessed(image);
                        });
                    } else if (frame.samples != null) {
                        ShortBuffer channelSamplesFloatBuffer = (ShortBuffer) frame.samples[0];
                        channelSamplesFloatBuffer.rewind();

                        ByteBuffer outBuffer = ByteBuffer.allocate(channelSamplesFloatBuffer.capacity() * 2);

                        for (int i = 0; i < channelSamplesFloatBuffer.capacity(); i++) {
                            short val = channelSamplesFloatBuffer.get(i);
                            outBuffer.putShort(val);
                        }
                        // Play the decoded audio samples through the sound line
                        soundLine.write(outBuffer.array(), 0, outBuffer.capacity());
                    }
                }
                grabber.stop();
                grabber.release();
                Platform.exit();
            } catch (Exception exception) {
                exception.printStackTrace(); // surface the error instead of failing silently
                System.exit(1);
            }
        });
        playThread.start();
    }

    public void stop()
    {
        playThread.interrupt();
    }
}