Disclaimer: This page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you reuse or share it, you must likewise follow the CC BY-SA license, link to the original, and attribute it to the original authors (not me): StackOverflow
Original URL: http://stackoverflow.com/questions/11321825/
How to use hardware accelerated video decoding on Android?
Asked by Glenn Yu
I need hardware-accelerated H.264 decoding for a research project, to test a self-defined protocol.
Searching the web, I have found a few ways to perform hardware-accelerated video decoding on Android:
- Use ffmpeg with libstagefright (overview of libstagefright), or use libstagefright in the OS directly, like here.
- Use OpenMAX on a specific hardware platform, like here about Samsung devices and here about the Qualcomm Snapdragon series.
- Some people mentioned PVPlayer.
Some people "say" libstagefright is the only way, while the Qualcomm guys have obviously had success.
Currently I am not sure which way could work, and I am a little confused. If all of them could work, I would certainly prefer a hardware-independent method.
I have tested the H/W acceleration of a few video players (VLC, Mobo, Rock, vplayer) on a Galaxy Tab 7.7 (Android 3.2 & Exynos): Rock and Mobo work fine, VLC doesn't work, and vplayer seems to have a rendering bug that costs it performance.
Anyway, I did an 'operation' on Rockplayer and deleted all of its .so libs in data\data\com.redirecting\rockplayer: software decoding crashes, while hardware decoding still works fine! I wonder how they did that. It appears to me that hw acceleration could be independent of hardware platforms.
Can someone nail this problem down, or provide a reference with additional information or better details?
Answered by Oak Bytes
To answer the above question, let me introduce a few concepts related to Android.
OpenMAX
Android uses OpenMAX as its codec interface. Hence, all native codecs (hardware-accelerated or otherwise) provide an OpenMAX interface. This interface is used by StageFright (the player framework) for decoding media with a codec.
NDK
Android allows Java applications to interact with underlying C/C++ native libraries using the NDK. This requires using JNI (Java Native Interface).
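The Java side of such a JNI bridge looks roughly like this (a minimal sketch; the library name "nativedecoder" and the decodeFrame() method are hypothetical placeholders for whatever your NDK library actually exports):

```java
// Sketch of the Java half of a JNI bridge to a native decoder.
// "nativedecoder" corresponds to a libnativedecoder.so built with the NDK;
// the matching C symbol would be
// Java_NativeDecoder_decodeFrame(JNIEnv*, jobject, jbyteArray).
public class NativeDecoder {
    static {
        // Loads libnativedecoder.so from the APK's native library directory.
        System.loadLibrary("nativedecoder");
    }

    // Hands one encoded H.264 frame to native code; returns a status code.
    public native int decodeFrame(byte[] encodedData);
}
```

The native implementation is then free to talk to the platform decoder (e.g. via OMX) without the Java layer knowing the details.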
Now, coming to your question: how to tap the native decoder to decode a raw video bitstream?
In Android 4.0 and below, Android did not provide access to the underlying video decoders at the Java layer. You would need to write native code to interact directly with the OMX decoder. Though this is possible, it is not trivial, as it requires knowledge of how OMX works and how to map OMX to an application using the NDK.
In 4.1 (the Jelly Bean version), Android provides access to hardware-accelerated decoders at the application level through Java APIs. More details about the new APIs are at http://developer.android.com/about/versions/android-4.1.html#Multimedia
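With those 4.1 APIs, a decode loop can be sketched in a few Java calls. This is a minimal sketch, assuming API level 16, an H.264 ("video/avc") decoder, and hypothetical readAccessUnit()/computePts() helpers standing in for your custom protocol layer:

```java
// Sketch: hardware-accelerated H.264 decoding via MediaCodec (Android 4.1+).
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;

import java.nio.ByteBuffer;

public class HwDecoderSketch {
    public void decode(Surface surface, int width, int height) throws Exception {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
        MediaCodec codec = MediaCodec.createDecoderByType("video/avc");
        // Configuring with a Surface lets the hardware decoder render directly.
        codec.configure(format, surface, null, 0);
        codec.start();

        ByteBuffer[] inputBuffers = codec.getInputBuffers();
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        boolean eos = false;
        while (!eos) {
            int inIndex = codec.dequeueInputBuffer(10000);
            if (inIndex >= 0) {
                ByteBuffer buf = inputBuffers[inIndex];
                buf.clear();
                // Fill buf with one H.264 access unit from your custom protocol.
                int size = readAccessUnit(buf); // hypothetical helper
                if (size < 0) {
                    codec.queueInputBuffer(inIndex, 0, 0, 0,
                            MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                } else {
                    codec.queueInputBuffer(inIndex, 0, size, computePts(), 0);
                }
            }
            int outIndex = codec.dequeueOutputBuffer(info, 10000);
            if (outIndex >= 0) {
                eos = (info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0;
                // true = render this frame to the configured Surface.
                codec.releaseOutputBuffer(outIndex, true);
            }
        }
        codec.stop();
        codec.release();
    }

    // Hypothetical stand-ins for your protocol layer.
    private int readAccessUnit(ByteBuffer dst) { return -1; }
    private long computePts() { return 0; }
}
```

Whether you actually get a hardware decoder depends on the device; MediaCodec picks whatever codec the platform registers for "video/avc".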
Answered by Peter Tran
It's a Google-sponsored open-source project that replaces the platform's MediaPlayer. Each component in the pipeline is extensible, from the sample source (how the H.264 frames are extracted from your custom protocol) to the rendering (to a Surface, SurfaceTexture, etc.).
It includes a nice demo app showing usage.
Answered by GregoryK
You might want to try MediaExtractor and MediaCodec. (They are also available in the NDK as AMediaExtractor and AMediaCodec; see the native-codec sample for playing .mp4 files.)