Real Time Streaming to HTML5 (without WebRTC) just using the video tag
Note: this content comes from a popular StackOverflow question and is provided under the CC BY-SA 4.0 license. You are free to use/share it, but you must attribute it to the original authors (not me): StackOverflow
Original link: http://stackoverflow.com/questions/12257762/
Asked by Evren Bingöl
I would like to wrap real-time encoded data in webm or ogv and send it to an HTML5 browser.
Can webm or ogv do this? MP4 cannot, due to its MDAT atoms (one cannot wrap h264 and mp3 in real time, package it, and send it to the client). Say I am feeding the video from my webcam and the audio from my built-in mic. Fragmented MP4 can handle this, but it is a hassle to find libraries that do it.
I need to do this because I do not want to send audio and video separately.
If I did send them separately, with audio over an audio tag and video over a video tag (audio and video demuxed and sent on their own), can I sync them in the client browser with JavaScript? I saw some examples but am not sure yet.
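For reference, a minimal sketch of the kind of JavaScript sync logic this would need (the element selection, polling interval, and drift thresholds are assumptions, not anything from a tested setup):

// Slave a separate <audio> element to a <video> element's clock.
const video = document.querySelector('video');
const audio = document.querySelector('audio');

video.addEventListener('play', () => audio.play());
video.addEventListener('pause', () => audio.pause());

// Every 500 ms, measure drift and correct: hard seek on large drift,
// gentle playback-rate nudge on small drift.
setInterval(() => {
  const drift = audio.currentTime - video.currentTime;
  if (Math.abs(drift) > 0.3) {
    audio.currentTime = video.currentTime;
  } else if (Math.abs(drift) > 0.05) {
    audio.playbackRate = drift > 0 ? 0.95 : 1.05;
  } else {
    audio.playbackRate = 1.0;
  }
}, 500);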
Accepted answer by user1390208
Evren,
Since you originally asked this question, the Media Source Extensions (https://www.w3.org/TR/media-source/) have matured enough to be able to play very short (30 ms) ISO-BMFF video/mp4 segments with just a little buffering.
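For a rough idea of what that looks like in the browser, here is a minimal MSE sketch; the codec string and segment URLs are placeholders, not anything specific to this answer:

const video = document.querySelector('video');
const mime = 'video/mp4; codecs="avc1.42E01E, mp4a.40.2"'; // H.264 + AAC

if (window.MediaSource && MediaSource.isTypeSupported(mime)) {
  const ms = new MediaSource();
  video.src = URL.createObjectURL(ms);
  ms.addEventListener('sourceopen', async () => {
    const sb = ms.addSourceBuffer(mime);
    // The init segment (moov box) must be appended before any media segments.
    for (const url of ['/live/init.mp4', '/live/seg1.m4s', '/live/seg2.m4s']) {
      const data = await fetch(url).then(r => r.arrayBuffer());
      await new Promise(resolve => {
        sb.addEventListener('updateend', resolve, { once: true });
        sb.appendBuffer(data);
      });
    }
  });
}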
Refer to HTML5 live streaming
So your statement
(one cannot wrap h264 and mp3 in real time, package it, and send it to the client)
is now out of date. Yes, you can do it with H.264 + AAC.
There are several implementations out there; take a look at Unreal Media Server. From the Unreal Media Server FAQ: http://umediaserver.net/umediaserver/faq.html
How is Unreal HTML5 live streaming different from MPEG-DASH? Unlike MPEG-DASH, Unreal Media Server uses a WebSocket protocol for live streaming to the HTML5 MSE element in web browsers. This is much more efficient than fetching segments via HTTP requests per MPEG-DASH. Also, Unreal Media Server sends segments of minimal duration, as low as 30 ms. That allows for low, sub-second latency streaming, while MPEG-DASH, like other HTTP chunk-based live streaming protocols, cannot provide low latency live streaming.
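A rough sketch of that WebSocket-to-MSE pattern follows; the endpoint URL is a placeholder, and treating each binary message as one complete fMP4 segment is an assumption about the wire format:

const video = document.querySelector('video');
const ms = new MediaSource();
video.src = URL.createObjectURL(ms);

const queue = [];
let sb = null;

// Append the next queued segment once the SourceBuffer is idle.
function appendNext() {
  if (sb && !sb.updating && queue.length) sb.appendBuffer(queue.shift());
}

ms.addEventListener('sourceopen', () => {
  sb = ms.addSourceBuffer('video/mp4; codecs="avc1.42E01E, mp4a.40.2"');
  sb.addEventListener('updateend', appendNext);
  appendNext();
});

const ws = new WebSocket('wss://example.com/live'); // placeholder URL
ws.binaryType = 'arraybuffer';
ws.onmessage = (e) => { queue.push(e.data); appendNext(); };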
Their demos webpage has a live HTML5 feed from an RTSP camera: http://umediaserver.net/umediaserver/demos.html. Notice that the latency in the HTML5 player is comparable to that in the Flash player.
Answer by CoryG
I did this with ffmpeg/ffserver running on Ubuntu as follows for webm (mp4 and ogg are a bit easier, and should work in a similar manner from the same server, but you should use all 3 formats for compatibility across browsers).
First, build ffmpeg from source to include the libvpx drivers (even if you're using a version that has them, you need the newest ones (as of this month) to stream webm, because they only just added the functionality to include global headers). I did this on an Ubuntu server and desktop, and this guide showed me how; instructions for other OSes can be found here.
Once you've gotten the appropriate version of ffmpeg/ffserver, you can set them up for streaming; in my case this was done as follows.
On the video capture device:
ffmpeg -f video4linux2 -standard ntsc -i /dev/video0 http://<server_ip>:8090/0.ffm
- The "-f video4linux2 -standard ntsc -i /dev/video0" portion of that may change depending on your input source (mine is for a video capture card).
Relevant ffserver.conf excerpt:
Port 8090
#BindAddress <server_ip>
MaxHTTPConnections 2000
MaxClients 100
MaxBandwidth 1000000
CustomLog /var/log/ffserver
NoDaemon
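# Feed written to by the ffmpeg process on the capture machine (feeder_ip)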
<Feed 0.ffm>
File /tmp/0.ffm
FileMaxSize 5M
ACL allow <feeder_ip>
</Feed>
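# Feed written to by the local mpeg-to-webm transcoding ffmpeg process (see below)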
<Feed 0_webm.ffm>
File /tmp/0_webm.ffm
FileMaxSize 5M
ACL allow localhost
</Feed>
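# Intermediate MPEG-1 stream that the local transcoder pulls from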
<Stream 0.mpg>
Feed 0.ffm
Format mpeg1video
NoAudio
VideoFrameRate 25
VideoBitRate 256
VideoSize cif
VideoBufferSize 40
VideoGopSize 12
</Stream>
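# WebM stream served to browsers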
<Stream 0.webm>
Feed 0_webm.ffm
Format webm
NoAudio
VideoCodec libvpx
VideoSize 320x240
VideoFrameRate 24
AVOptionVideo flags +global_header
AVOptionVideo cpu-used 0
AVOptionVideo qmin 1
AVOptionVideo qmax 31
AVOptionVideo quality good
PreRoll 0
StartSendOnKey
VideoBitRate 500K
</Stream>
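# Status page, restricted to the client IP range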
<Stream index.html>
Format status
ACL allow <client_low_ip> <client_high_ip>
</Stream>
- Note this is configured for a machine at feeder_ip to execute the aforementioned ffmpeg command, and for the server at server_ip to serve clients from client_low_ip through client_high_ip while handling the mpeg-to-webm conversion on server_ip (continued below).
This ffmpeg command is executed on the machine previously referred to as server_ip (it handles the actual mpeg --> webm conversion and feeds it back into the ffserver on a different feed):
ffmpeg -i http://<server_ip>:8090/0.mpg -vcodec libvpx http://localhost:8090/0_webm.ffm
Once these have all been started up (first the ffserver, then the feeder_ip ffmpeg process, then the server_ip ffmpeg process), you should be able to access the live stream at http://<server_ip>:8090/0.webm and check the status at http://<server_ip>:8090/
Hope this helps.
Answer by av501
I am not 100% sure you can do this. HTML5 has not ratified any live streaming mechanism. You could use WebSockets and send data to the browser in real time to do this, but you would have to write the parsing logic yourself, and I do not know how you would feed the data to the player as it arrives.
As for the video and audio tags: the video tag can play container files that contain both audio and video, so wrap your content in a compatible container. If you modify your server to keep writing the live stream into this video file as the live content comes in, and to stream out that data for every byte the browser requests, this could be done. But it is definitely non-trivial.