Streaming live camera video from iOS (iPhone/iPad) to remote PC / server

Note: this page is taken from a popular Stack Overflow question and is provided under the CC BY-SA 4.0 license. If you reuse or share it, you must follow the same CC BY-SA terms and attribute it to the original authors (not me). Original: http://stackoverflow.com/questions/22263291/

ios, objective-c, video-streaming, avfoundation, rtmp

Asked by Medi The Jedi

I've been searching for a while on stackoverflow and around the web for a solution to my video-streaming problem. I need to stream live video captured from the camera (no high quality required) from an iOS device to a remote PC, one way only, i.e. the iOS device will be sending a video stream to the server/PC but not the other way around.

After some googling and documentation browsing, it appears that there are two main standards/protocols that can be used:

  • Apple's HTTP Live Streaming (HLS)
  • Adobe's RTMP

Again, my requirement is that the iPhone/iPad will be streaming the video. From what appears on Apple's website, I understand that HLS is to be used from an encoding perspective server-side, and a decoding perspective iOS side. As for RTMP, most libraries that allow iOS streaming have commercial licenses and closed code, or require you to go through their P2P infrastructure (for instance angl.tv or tokbox.com/opentok/quick-start). As for HLS, no encoding libraries seem to exist on the iOS side.

So my questions are:

  • Do you know of any SDK/library, preferably open and free, that I could integrate to stream captured video from within my app?
  • If not, do you think developing a custom library would be a risky jungle-crossing endeavour? My guess is to go through AVFoundation, capture camera frames, compress them frame by frame and send them over HTTP. Does that sound crazy performance- and bandwidth-wise? Note that in that case I would need an HLS or RTMP encoder either way (a rough sketch of that capture path follows this list).
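
To make that second bullet concrete, here is the rough capture-side skeleton I have in mind, using only standard AVFoundation classes (AVCaptureSession plus AVCaptureVideoDataOutput). The class name FrameCapturer is just a placeholder, and the per-frame compression and upload are left as comments because that is exactly the part I am unsure about:

    #import <AVFoundation/AVFoundation.h>

    // Placeholder class: grabs raw camera frames and delivers them to a delegate
    // callback, where each frame would be compressed and sent to the server.
    @interface FrameCapturer : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
    @property (nonatomic, strong) AVCaptureSession *session;
    @end

    @implementation FrameCapturer

    - (void)start {
        self.session = [[AVCaptureSession alloc] init];
        self.session.sessionPreset = AVCaptureSessionPresetMedium; // no high quality required

        AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        NSError *error = nil;
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
        if (input && [self.session canAddInput:input]) {
            [self.session addInput:input];
        }

        AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
        output.videoSettings = @{ (__bridge NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
        [output setSampleBufferDelegate:self
                                  queue:dispatch_queue_create("camera.frames", DISPATCH_QUEUE_SERIAL)];
        if ([self.session canAddOutput:output]) {
            [self.session addOutput:output];
        }

        [self.session startRunning];
    }

    // Called once per captured frame on the queue set above.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection {
        CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        // ... compress pixelBuffer here (JPEG per frame, or better H.264) and
        //     hand it to whatever transport (HTTP, RTMP, ...) is chosen ...
    }

    @end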

Thank you very much in advance, dear friends.

Mehdi.

Answered by jgh

I have developed such a library, and you can find it at github.com/jgh-/VideoCore

I am updating this answer because I have created a simplified iOS API that will allow you to easily set up a Camera/Mic RTMP session. You can find it at https://github.com/jgh-/VideoCore/blob/master/api/iOS/VCSimpleSession.h.

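A minimal usage sketch looks roughly like the following; the selector names are assumed from the project's README of that era and should be checked against VCSimpleSession.h, the import path depends on how VideoCore is integrated, and the RTMP URL and stream key are placeholders for your own ingest server:

    #import <UIKit/UIKit.h>
    #import "VCSimpleSession.h"   // adjust the header path to your integration

    @interface StreamViewController : UIViewController
    @property (nonatomic, strong) VCSimpleSession *session;
    @end

    @implementation StreamViewController

    - (void)viewDidLoad {
        [super viewDidLoad];
        // Assumed initializer from the README: 720p at 30 fps, ~1 Mbps bitrate.
        self.session = [[VCSimpleSession alloc] initWithVideoSize:CGSizeMake(1280, 720)
                                                        frameRate:30
                                                          bitrate:1000000];
        self.session.previewView.frame = self.view.bounds;   // local camera preview
        [self.view addSubview:self.session.previewView];
    }

    - (void)startStreaming {
        // Placeholder endpoint: point this at your own RTMP ingest server
        // (e.g. nginx-rtmp, Wowza, or another media server).
        [self.session startRtmpSessionWithURL:@"rtmp://example.com/live"
                                 andStreamKey:@"myStream"];
    }

    @end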

Additionally, VideoCore is now available in CocoaPods.
