C++ MJPEG streaming and decoding
Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use/share it, but you must attribute it to the original authors (not me): StackOverflow
Original URL: http://stackoverflow.com/questions/6022423/
MJPEG streaming and decoding
Asked by umair
I want to receive JPEG images from an IP camera (over RTSP). For this, I tried cvCreateFileCapture_FFMPEG
in OpenCV. But ffmpeg seems to have some problem with the MJPEG format of the streaming (since it automatically tries to detect the streaming info) and I end up with the following error
mjpeg: unsupported coding type
I, then, decided to use live555 for streaming. Till now, I can successfully establish streaming and capture (non-decoded) images through openRTSP.
The question is how I can do this in my application, e.g., in OpenCV. How can I use openRTSP in OpenCV to get images and save them in JPEG format?
I have heard that the data from openRTSP can be sent to a buffer (or a named pipe) and then read into OpenCV's IplImage, but I don't know how to do this.
I would really appreciate any help/suggestions on this problem. I need an answer to either of the following questions:
- How can I disable ffmpeg's automatic stream information detection and specify my own format (mjpeg), or
- How can I use openRTSP in OpenCV?
Regards,
Answered by enthusiasticgeek
Is this an Axis IP camera? Either way, most IP cameras provide an MPEG4 RTSP stream that can be decoded in OpenCV using cvCreateFileCapture_FFMPEG. However, ffmpeg's MJPEG decoder has widely known, unresolved issues. I am sure you would have received an error similar to
[ingenient @ 0x97d20c0]Could not find codec parameters (Video: mjpeg)
Option 1: Using opencv, libcurl and libjpeg
To view an MJPEG stream in OpenCV, take a look at the following implementation:
http://www.eecs.ucf.edu/~rpatrick/code/onelinksys.c or http://cse.unl.edu/~rpatrick/code/onelinksys.c
Option 2: Using gstreamer (no opencv)
I would recommend looking at gstreamer if your goal is just to view or save JPEG images.
To view the MJPEG stream, one may execute a media pipeline string as follows:
gst-launch -v souphttpsrc location="http://[ip]:[port]/[dir]/xxx.cgi" do-timestamp=true is_live=true ! multipartdemux ! jpegdec ! ffmpegcolorspace ! autovideosink
For RTSP
gst-launch -v rtspsrc location="rtsp://[user]:[pass]@[ip]:[port]/[dir]/xxx.amp" debug=1 ! rtpmp4vdepay ! mpeg4videoparse ! ffdec_mpeg4 ! ffmpegcolorspace ! autovideosink
To work with the C API, see:
For a simple example, take a look at my other post on RTSP for constructing a gstreamer C API media pipeline (this is the same as the gst-launch string, but implemented through the C API):
Playing RTSP with python-gstreamer
To save the MJPEG stream as multiple images, use the following pipeline (let us put a vertical-flip BIN in and connect its PADS to the previous and next BINs to make it fancier):
gst-launch souphttpsrc location="http://[ip]:[port]/[dir]/xxx.cgi" do-timestamp=true is_live=true ! multipartdemux ! jpegdec ! videoflip method=vertical-flip ! jpegenc ! multifilesink location=image-out-%05d.jpg
It may also be worthwhile to have a look at gst-opencv.
UPDATE:
Option 3: Using gstreamer, a named pipe and opencv
On Linux, one may get the mjpeg stream, convert it to mpeg4 and feed it to a named pipe, then read the data from the named pipe in OpenCV.
Step 1. Create Named Pipe
mkfifo stream_fifo
Step 2. Create opencvvideo_test.c
// compile with: gcc -ggdb `pkg-config --cflags --libs opencv` opencvvideo_test.c -o opencvvideo_test
#include <stdio.h>
#include <assert.h>
#include "highgui.h"
#include "cv.h"

int main( int argc, char** argv )
{
    IplImage *frame;
    int key = 0;  /* initialize, otherwise the first loop test reads garbage */

    /* supply the stream (FIFO or AVI file) to play */
    assert( argc == 2 );

    /* open the capture source */
    CvCapture *capture = cvCreateFileCapture( argv[1] ); /* or cvCaptureFromAVI( argv[1] ); */

    /* always check */
    if( !capture ) return 1;

    /* get fps, needed to set the delay */
    int fps = (int) cvGetCaptureProperty( capture, CV_CAP_PROP_FPS );
    int frameH = (int) cvGetCaptureProperty( capture, CV_CAP_PROP_FRAME_HEIGHT );
    int frameW = (int) cvGetCaptureProperty( capture, CV_CAP_PROP_FRAME_WIDTH );

    /* display video */
    cvNamedWindow( "video", CV_WINDOW_AUTOSIZE );

    while( key != 'q' ) {
        double t1 = (double) cvGetTickCount();
        /* get a frame */
        frame = cvQueryFrame( capture );
        double t2 = (double) cvGetTickCount();
        printf( "time: %gms fps: %.2g\n",
                (t2 - t1) / (cvGetTickFrequency() * 1000.),
                1000. / ((t2 - t1) / (cvGetTickFrequency() * 1000.)) );
        /* always check */
        if( !frame ) break;
        /* display frame */
        cvShowImage( "video", frame );
        /* quit if user presses 'q' */
        key = cvWaitKey( 1000 / fps );
    }

    /* free memory */
    cvReleaseCapture( &capture );
    cvDestroyWindow( "video" );
    return 0;
}
Step 3. Prepare to convert from MJPEG to MPEG4 using gstreamer (the rate of incoming frames is critical)
gst-launch -v souphttpsrc location="http://<ip>/cgi_bin/<mjpeg>.cgi" do-timestamp=true is_live=true ! multipartdemux ! jpegdec ! queue ! videoscale ! 'video/x-raw-yuv, width=640, height=480' ! queue ! videorate ! 'video/x-raw-yuv,framerate=30/1' ! queue ! ffmpegcolorspace ! 'video/x-raw-yuv,format=(fourcc)I420' ! ffenc_mpeg4 ! queue ! filesink location=stream_fifo
Step 4. Display Stream in OpenCV
./opencvvideo_test stream_fifo