Note: this page is a translation of a popular Stack Overflow question, provided under the CC BY-SA 4.0 license. If you reuse it, you must do so under the same license and attribute it to the original authors (not the translator), linking the source: http://stackoverflow.com/questions/21921790/

Best approach to real time http streaming to HTML5 video client

Tags: html, node.js, ffmpeg, streaming

Asked by deandob

I'm really stuck trying to understand the best way to stream real-time output of ffmpeg to an HTML5 client using node.js, as there are a number of variables at play and I don't have a lot of experience in this space, having spent many hours trying different combinations.

My use case is:

1) IP video camera RTSP H.264 stream is picked up by FFMPEG and remuxed into an mp4 container using the following FFMPEG settings in node, output to STDOUT. This is only run on the initial client connection, so that partial content requests don't try to spawn FFMPEG again.

liveFFMPEG = child_process.spawn("ffmpeg", [
                "-i", "rtsp://admin:[email protected]:554" , "-vcodec", "copy", "-f",
                "mp4", "-reset_timestamps", "1", "-movflags", "frag_keyframe+empty_moov", 
                "-"   // output to stdout
                ],  {detached: false});

2) I use the node http server to capture the STDOUT and stream that back to the client upon a client request. When the client first connects I spawn the above FFMPEG command line then pipe the STDOUT stream to the HTTP response.

liveFFMPEG.stdout.pipe(resp);

I have also used the stream event to write the FFMPEG data to the HTTP response, but it makes no difference:

liveFFMPEG.stdout.on("data", function(data) {
        resp.write(data);
});

I use the following HTTP headers (which also work when streaming pre-recorded files):

var total = 999999999         // fake a large file
var partialstart = 0
var partialend = total - 1

if (range !== undefined) {
    var parts = range.replace(/bytes=/, "").split("-"); 
    var partialstart = parts[0]; 
    var partialend = parts[1];
} 

var start = parseInt(partialstart, 10); 
var end = partialend ? parseInt(partialend, 10) : total;   // fake a large file if no range request

var chunksize = (end-start)+1; 

resp.writeHead(206, {
                  'Transfer-Encoding': 'chunked'
                 , 'Content-Type': 'video/mp4'
                 , 'Content-Length': chunksize // large size to fake a file
                 , 'Accept-Ranges': 'bytes ' + start + "-" + end + "/" + total
});
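
(As an aside: in a 206 response the byte range normally goes in Content-Range, with Accept-Ranges just set to 'bytes', and Content-Length shouldn't be combined with Transfer-Encoding: chunked. Here is a sketch of the usual parsing for a file of known size, with a helper name of my own; it only handles the simple single-range forms:)

```javascript
// Parse an HTTP Range header like "bytes=0-1023" against a known total size
// and return the values needed for a 206 response.
function parseRange(rangeHeader, total) {
  let start = 0;
  let end = total - 1;
  if (rangeHeader !== undefined) {
    const parts = rangeHeader.replace(/bytes=/, "").split("-");
    if (parts[0] !== "") start = parseInt(parts[0], 10);
    if (parts[1] !== "" && parts[1] !== undefined) end = parseInt(parts[1], 10);
  }
  return {
    start: start,
    end: end,
    chunksize: end - start + 1,
    headers: {
      "Content-Type": "video/mp4",
      "Content-Length": String(end - start + 1),
      "Content-Range": "bytes " + start + "-" + end + "/" + total,
      "Accept-Ranges": "bytes"
    }
  };
}
```

For a live stream there is no known total, which is exactly what makes faking one awkward.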

3) The client has to use HTML5 video tags.

I have no problems streaming playback (using fs.createReadStream with 206 HTTP partial content) to the HTML5 client of a video file previously recorded with the above FFMPEG command line (but saved to a file instead of STDOUT), so I know the FFMPEG stream is correct, and I can even correctly see the live video stream in VLC when connecting to the HTTP node server.

However, trying to stream live from FFMPEG via node HTTP seems to be a lot harder, as the client will display one frame then stop. I suspect the problem is that I am not setting up the HTTP connection to be compatible with the HTML5 video client. I have tried a variety of things like using HTTP 206 (partial content) and 200 responses, and putting the data into a buffer then streaming, with no luck, so I need to go back to first principles to ensure I'm setting this up the right way.

Here is my understanding of how this should work, please correct me if I'm wrong:

1) FFMPEG should be setup to fragment the output and use an empty moov (FFMPEG frag_keyframe and empty_moov mov flags). This means the client does not use the moov atom which is typically at the end of the file which isn't relevant when streaming (no end of file), but means no seeking possible which is fine for my use case.

2) Even though I use MP4 fragments and empty MOOV, I still have to use HTTP partial content, as the HTML5 player will wait until the entire stream is downloaded before playing, which with a live stream never ends so is unworkable.

3) I don't understand why piping the STDOUT stream to the HTTP response doesn't work when streaming live yet if I save to a file I can stream this file easily to HTML5 clients using similar code. Maybe it's a timing issue as it takes a second for the FFMPEG spawn to start, connect to the IP camera and send chunks to node, and the node data events are irregular as well. However the bytestream should be exactly the same as saving to a file, and HTTP should be able to cater for delays.

4) When checking the network log from the HTTP client when streaming a MP4 file created by FFMPEG from the camera, I see there are 3 client requests: A general GET request for the video, which the HTTP server returns about 40Kb, then a partial content request with a byte range for the last 10K of the file, then a final request for the bits in the middle not loaded. Maybe the HTML5 client once it receives the first response is asking for the last part of the file to load the MP4 MOOV atom? If this is the case it won't work for streaming as there is no MOOV file and no end of the file.

5) When checking the network log when trying to stream live, I get an aborted initial request with only about 200 bytes received, then a re-request again aborted with 200 bytes and a third request which is only 2K long. I don't understand why the HTML5 client would abort the request as the bytestream is exactly the same as I can successfully use when streaming from a recorded file. It also seems node isn't sending the rest of the FFMPEG stream to the client, yet I can see the FFMPEG data in the .on event routine so it is getting to the FFMPEG node HTTP server.

6) Although I think piping the STDOUT stream to the HTTP response buffer should work, do I have to build an intermediate buffer and stream that will allow the HTTP partial content client requests to properly work like it does when it (successfully) reads a file? I think this is the main reason for my problems however I'm not exactly sure in Node how to best set that up. And I don't know how to handle a client request for the data at the end of the file as there is no end of file.

7) Am I on the wrong track with trying to handle 206 partial content requests, and should this work with normal 200 HTTP responses? HTTP 200 responses works fine for VLC so I suspect the HTML5 video client will only work with partial content requests?

As I'm still learning this stuff, it's difficult to work through the various layers of this problem (FFMPEG, node, streaming, HTTP, HTML5 video), so any pointers will be greatly appreciated. I have spent hours researching on this site and the net, and I have not come across anyone who has been able to do real time streaming in node, but I can't be the first, and I think this should be able to work (somehow!).

Accepted answer by szatmary

EDIT 3: As of iOS 10, HLS will support fragmented mp4 files. The answer now is to create fragmented mp4 assets, with a DASH and HLS manifest. Pretend Flash, iOS 9 and below, and IE 10 and below don't exist.

Everything below this line is out of date. Keeping it here for posterity.


EDIT 2: As people in the comments are pointing out, things change. Almost all browsers now support the AVC/AAC codecs. iOS still requires HLS, but via adaptors like hls.js you can play HLS in MSE. The new answer is HLS+hls.js if you need iOS, or just fragmented MP4 (i.e. DASH) if you don't.
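
For the HLS+hls.js route, the client side is typically just a few lines (a sketch; it assumes an HLS playlist URL is served elsewhere and the hls.js script is already loaded):

```javascript
// Play an HLS stream in a <video> element: natively where supported
// (Safari/iOS), otherwise via hls.js on top of Media Source Extensions.
// The playlist URL is a placeholder.
function playHls(videoEl, playlistUrl) {
  if (videoEl.canPlayType("application/vnd.apple.mpegurl")) {
    videoEl.src = playlistUrl;           // native HLS (Safari / iOS)
  } else if (typeof Hls !== "undefined" && Hls.isSupported()) {
    const hls = new Hls();               // HLS via MSE everywhere else
    hls.loadSource(playlistUrl);
    hls.attachMedia(videoEl);
  }
}

if (typeof document !== "undefined") {
  // e.g. playHls(document.querySelector("video"), "/live/stream.m3u8");
}
```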

There are many reasons why video and, specifically, live video is very difficult. (Please note that the original question specified that HTML5 video is a requirement, but the asker stated Flash is possible in the comments. So immediately, this question is misleading)

First I will restate: THERE IS NO OFFICIAL SUPPORT FOR LIVE STREAMING OVER HTML5. There are hacks, but your mileage may vary.

EDIT: since I wrote this answer, Media Source Extensions have matured, and are now very close to becoming a viable option. They are supported on most major browsers. iOS continues to be a holdout.

Next, you need to understand that video on demand (VOD) and live video are very different. Yes, they are both video, but the problems are different, hence the formats are different. For example, if the clock in your computer runs 1% faster than it should, you will not notice on a VOD. With live video, you will be trying to play video before it happens. If you want to join a live video stream in progress, you need the data necessary to initialize the decoder, so it must be repeated in the stream, or sent out of band. With VOD, you can read the beginning of the file, then seek to whatever point you wish.

Now let's dig in a bit.

Platforms:

  • iOS
  • PC
  • Mac
  • Android

Codecs:

  • vp8/9
  • h.264
  • theora (vp3)

Common Delivery methods for live video in browsers:

  • DASH (HTTP)
  • HLS (HTTP)
  • flash (RTMP)
  • flash (HDS)

Common Delivery methods for VOD in browsers:

  • DASH (HTTP Streaming)
  • HLS (HTTP Streaming)
  • flash (RTMP)
  • flash (HTTP Streaming)
  • MP4 (HTTP pseudo streaming)
  • I'm not going to talk about MKV and OGG because I do not know them very well.

html5 video tag:

  • MP4
  • webm
  • ogg


Let's look at which browsers support which formats.

Safari:

  • HLS (iOS and mac only)
  • h.264
  • MP4

Firefox

  • DASH (via MSE but no h.264)
  • h.264 via Flash only!
  • VP9
  • MP4
  • OGG
  • Webm

IE

  • Flash
  • DASH (via MSE IE 11+ only)
  • h.264
  • MP4

Chrome

  • Flash
  • DASH (via MSE)
  • h.264
  • VP9
  • MP4
  • webm
  • ogg

MP4 cannot be used for live video (NOTE: DASH is a superset of MP4, so don't get confused with that). MP4 is broken into two pieces: moov and mdat. mdat contains the raw audio/video data. But it is not indexed, so without the moov, it is useless. The moov contains an index of all data in the mdat. But due to its format, it can not be 'flattened' until the timestamps and size of EVERY frame are known. It may be possible to construct a moov that 'fibs' the frame sizes, but it is very wasteful bandwidth-wise.
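
To make the moov/mdat layout concrete, here is a minimal parser for the top-level boxes of an MP4 buffer (each box header is a 32-bit big-endian size followed by a four-character type; 64-bit sizes are not handled in this sketch):

```javascript
// List the top-level boxes of an ISO-BMFF (MP4) buffer.
// Each box header is a 4-byte big-endian size followed by a 4-byte type.
function listBoxes(buf) {
  const boxes = [];
  let offset = 0;
  while (offset + 8 <= buf.length) {
    const size = buf.readUInt32BE(offset);
    const type = buf.toString("ascii", offset + 4, offset + 8);
    boxes.push({ type: type, size: size });
    if (size < 8) break;   // malformed, or 64-bit size (not handled here)
    offset += size;
  }
  return boxes;
}

// Build a synthetic buffer with an empty 'moov' and a tiny 'mdat' to try it.
function makeBox(type, payload) {
  const header = Buffer.alloc(8);
  header.writeUInt32BE(8 + payload.length, 0);
  header.write(type, 4, "ascii");
  return Buffer.concat([header, payload]);
}

const sample = Buffer.concat([
  makeBox("moov", Buffer.alloc(0)),
  makeBox("mdat", Buffer.from([1, 2, 3, 4]))
]);
// listBoxes(sample) → [{ type: 'moov', size: 8 }, { type: 'mdat', size: 12 }]
```

Fragmented MP4 works around the "moov needs every frame size" problem by splitting the stream into many small moof+mdat pairs instead of one big indexed mdat.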

So if you want to deliver everywhere, we need to find the least common denominator. You will see there is no LCD here without resorting to flash. For example:

  • iOS only supports h.264 video. and it only supports HLS for live.
  • Firefox does not support h.264 at all, unless you use flash
  • Flash does not work in iOS

The closest thing to an LCD is using HLS to get your iOS users, and flash for everyone else. My personal favorite is to encode HLS, then use flash to play HLS for everyone else. You can play HLS in flash via JW player 6, (or write your own HLS to FLV in AS3 like I did)

Soon, the most common way to do this will be HLS on iOS/Mac and DASH via MSE everywhere else (This is what Netflix will be doing soon). But we are still waiting for everyone to upgrade their browsers. You will also likely need a separate DASH/VP9 for Firefox (I know about open264; it sucks. It can't do video in main or high profile. So it is currently useless).

Answer by deandob

Thanks everyone, especially szatmary, as this is a complex question with many layers, all of which have to be working before you can stream live video. To clarify my original question and the HTML5 video vs Flash choice: my use case has a strong preference for HTML5 because it is generic, easy to implement on the client, and future-proof. Flash is a distant second best, so let's stick with HTML5 for this question.

I learnt a lot through this exercise and agree live streaming is much harder than VOD (which works well with HTML5 video). But I did get this to work satisfactorily for my use case, and the solution worked out to be very simple, after chasing down more complex options like MSE, Flash, and elaborate buffering schemes in Node. The problem was that FFMPEG was corrupting the fragmented MP4 and I had to tune the FFMPEG parameters; the standard node stream pipe redirection over http that I used originally was all that was needed.

In MP4 there is a 'fragmentation' option that breaks the mp4 into much smaller fragments, each of which has its own index, and that makes the mp4 live streaming option viable. But it is not possible to seek back into the stream (OK for my use case), and note that only later versions of FFMPEG support fragmentation.

Note timing can be a problem; with my solution I have a lag of between 2 and 6 seconds caused by the remuxing (effectively FFMPEG has to receive the live stream, remux it, then send it to node for serving over HTTP). Not much can be done about this; however, in Chrome the video does try to catch up as much as it can, which makes the video a bit jumpy but more current than in IE11 (my preferred client).

Rather than explaining how the code works in this post, check out the GIST with comments (the client code isn't included; it is a standard HTML5 video tag pointed at the node http server address). The GIST is here: https://gist.github.com/deandob/9240090

I have not been able to find similar examples of this use case, so I hope the above explanation and code helps others, especially as I have learnt so much from this site and still consider myself a beginner!

Although this is the answer to my specific question, I have selected szatmary's answer as the accepted one as it is the most comprehensive.

Answer by Michael Romanenko

Take a look at the JSMPEG project. There is a great idea implemented there: decoding MPEG in the browser using JavaScript. Bytes from the encoder (FFMPEG, for example) can be transferred to the browser using WebSockets or Flash. If the community catches up, I think it will be the best HTML5 live video streaming solution for now.

Answer by 131

I wrote an HTML5 video player around the Broadway h264 codec (emscripten) that can play live (no delay) h264 video in all browsers (desktop, iOS, ...).

The video stream is sent through a websocket to the client, decoded frame by frame, and displayed in a canvas (using WebGL for acceleration).

Check out https://github.com/131/h264-live-player on GitHub.

Answer by Jannis

One way to live-stream an RTSP-based webcam to an HTML5 client (this involves re-encoding, so expect quality loss, and it needs some CPU power):

  • Set up an icecast server (could be on the same machine you web server is on or on the machine that receives the RTSP-stream from the cam)
  • On the machine receiving the stream from the camera, don't use FFMPEG but gstreamer. It is able to receive and decode the RTSP-stream, re-encode it and stream it to the icecast server. Example pipeline (only video, no audio):

    gst-launch-1.0 rtspsrc location=rtsp://192.168.1.234:554 user-id=admin user-pw=123456 ! rtph264depay ! avdec_h264 ! vp8enc threads=2 deadline=10000 ! webmmux streamable=true ! shout2send password=pass ip=<IP_OF_ICECAST_SERVER> port=12000 mount=cam.webm
    

=> You can then use the <video> tag with the URL of the icecast stream (http://127.0.0.1:12000/cam.webm) and it will work in every browser and device that supports webm.

Answer by Kiki.J.Hu

How about a JPEG solution: just let the server distribute JPEGs one by one to the browser, then use a canvas element to draw them? http://thejackalofjavascript.com/rpi-live-streaming/

Answer by ankitr

Take a look at this solution. As far as I know, Flashphoner allows playing a live audio+video stream in a pure HTML5 page.

They use MPEG1 and G.711 codecs for playback. The hack is rendering the decoded video to an HTML5 canvas element and playing the decoded audio via an HTML5 audio context.

Answer by szatmary

This is a very common misconception. There is no live HTML5 video support (except for HLS on iOS and Mac Safari). You may be able to 'hack' it using a webm container, but I would not expect that to be universally supported. What you are looking for is included in the Media Source Extensions, where you can feed the fragments to the browser one at a time, but you will need to write some client-side javascript.
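
A minimal browser-side sketch of that MSE approach (the codec string and the chunk-fetching function are illustrative placeholders; real code needs error handling, buffer management, and a fragmented-MP4 source):

```javascript
// Codec string for baseline H.264 in fragmented MP4 (illustrative value).
const MIME = 'video/mp4; codecs="avc1.42E01E"';

// Feed fMP4 fragments to a <video> element one at a time via
// Media Source Extensions. Browser-only API, so the demo call is
// guarded to keep this file loadable outside a browser.
function attachLiveSource(videoEl, fetchChunk) {
  const ms = new MediaSource();
  videoEl.src = URL.createObjectURL(ms);
  ms.addEventListener("sourceopen", function () {
    const sb = ms.addSourceBuffer(MIME);
    function pump() {
      fetchChunk().then(function (chunk) { // chunk: ArrayBuffer of one fragment
        sb.appendBuffer(chunk);
      });
    }
    sb.addEventListener("updateend", pump); // append next fragment when ready
    pump();
  });
}

if (typeof window !== "undefined" && "MediaSource" in window) {
  // e.g. attachLiveSource(document.querySelector("video"), myChunkFetcher);
}
```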

Answer by Siddharth

Try BinaryJS. It's just like socket.io, but the one thing it does well is streaming audio and video. Google "binaryjs".
