Note: this page is a mirror of a popular StackOverflow question and its answers, provided under the CC BY-SA 4.0 license. If you reuse or share it, you must attribute it to the original authors (not the translator). Original question: http://stackoverflow.com/questions/1080545/

How to display a raw YUV frame in a Cocoa OpenGL program

Tags: cocoa, macos, opengl, rgb, yuv

Asked by ReachConnection

I have been assigned the task of writing a program that takes a sample raw YUV file and displays it in a Cocoa OpenGL program.


I am an intern at my job and I have little to no clue how to start. I have been reading Wikipedia and articles on YUV, but I couldn't find any good source code on how to open a raw YUV file, extract the data, convert it to RGB, and display it in the view window.


Essentially, I need help with the following aspects of the task:

- how to extract the YUV data from the sample YUV file
- how to convert the YUV data into the RGB color space
- how to display the RGB data in OpenGL

(The last one I think I can figure out with time, but I really need help with the first two points.)


Please either tell me the classes to use, or point me to places where I can learn about YUV graphics/video display.


Accepted answer by Adam Rosenfield

This answer is not correct, see the other answers and comments. Original answer left below for posterity.




You can't display it directly. You'll need to convert it to an RGB texture. As you may have gathered from Wikipedia, there are a bunch of variations on the YUV color space. Make sure you're using the right one.


For each pixel, the conversion from YUV to RGB is a straightforward linear transformation. You just do the same thing to each pixel independently.


Once you've converted the image to RGB, you can display it by creating a texture. You need to call glGenTextures() to allocate a texture handle, glBindTexture() to bind the texture to the render context, and glTexImage2D() to upload the texture data to the GPU. To render it, you again call glBindTexture(), followed by the rendering of a quad with texture coordinates set up properly.


// parameters: image:  pointer to raw YUV input data
//             width:  image width (must be a power of 2)
//             height: image height (must be a power of 2)
// returns: a handle to the resulting RGB texture
GLuint makeTextureFromYUV(const float *image, int width, int height)
{
    float *rgbImage = (float *)malloc(width * height * 3 * sizeof(float));  // check for NULL
    float *rgbImagePtr = rgbImage;

    // convert from YUV to RGB (floats used here for simplicity; it's a little
    // trickier with 8-bit ints)
    int y, x;
    for(y = 0; y < height; y++)
    {
        for(x = 0; x < width; x++)
        {
            float Y = *image++;
            float U = *image++;
            float V = *image++;
            *rgbImagePtr++ = Y                + 1.13983f * V;  // R
            *rgbImagePtr++ = Y - 0.39465f * U - 0.58060f * V;  // G
            *rgbImagePtr++ = Y + 2.03211f * U;                 // B
        }
    }

    // create texture
    GLuint texture;
    glGenTextures(1, &texture);

    // bind texture to render context
    glBindTexture(GL_TEXTURE_2D, texture);

    // upload texture data
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0, GL_RGB, GL_FLOAT, rgbImage);

    // don't use mipmapping (since we're not creating any mipmaps); the default
    // minification filter uses mipmapping.  Use linear filtering for minification
    // and magnification.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    // free data (it's now been copied onto the GPU) and return texture handle
    free(rgbImage);
    return texture;
}

To render:


glBindTexture(GL_TEXTURE_2D, texture);

glBegin(GL_QUADS);
    glTexCoord2f(0.0f, 0.0f); glVertex3f( 0.0f,  0.0f, 0.0f);
    glTexCoord2f(1.0f, 0.0f); glVertex3f(64.0f,  0.0f, 0.0f);
    glTexCoord2f(1.0f, 1.0f); glVertex3f(64.0f, 64.0f, 0.0f);
    glTexCoord2f(0.0f, 1.0f); glVertex3f( 0.0f, 64.0f, 0.0f);
glEnd();

And don't forget to call glEnable(GL_TEXTURE_2D) at some point during initialization, and glDeleteTextures(1, &texture) during shutdown.


Answer by Brad Larson

I've done this with YUV frames captured from a CCD camera. Unfortunately, there are a number of different YUV formats. I believe the one that Apple uses for the GL_YCBCR_422_APPLE texture format is technically 2VUY422. To convert an image from a YUV422 frame generated by an IIDC Firewire camera to 2VUY422, I've used the following:


void yuv422_2vuy422(const unsigned char *theYUVFrame, unsigned char *the422Frame, const unsigned int width, const unsigned int height) 
{
    int i = 0, j = 0;
    unsigned int numPixels = width * height;
    unsigned int totalNumberOfPasses = numPixels * 2;
    int y0, y1, y2, y3, u0, u2, v0, v2;  // must be signed: the -128 bias below makes u/v negative

    while (i < (totalNumberOfPasses) )
    {
        u0 = theYUVFrame[i++]-128;
        y0 = theYUVFrame[i++];
        v0 = theYUVFrame[i++]-128;
        y1 = theYUVFrame[i++];
        u2 = theYUVFrame[i++]-128;
        y2 = theYUVFrame[i++];
        v2 = theYUVFrame[i++]-128;
        y3 = theYUVFrame[i++];

        // U0 Y0 V0 Y1 U2 Y2 V2 Y3

        // Remap the values to 2VUY (YUYS?) (Y422) colorspace for OpenGL
        // Y0 U Y1 V Y2 U Y3 V

        // IIDC cameras are full-range y=[0..255], u,v=[-127..+127], where display is "video range" (y=[16..240], u,v=[16..236])

        the422Frame[j++] = ((y0 * 240) / 255 + 16);
        the422Frame[j++] = ((u0 * 236) / 255 + 128);
        the422Frame[j++] = ((y1 * 240) / 255 + 16);
        the422Frame[j++] = ((v0 * 236) / 255 + 128);
        the422Frame[j++] = ((y2 * 240) / 255 + 16);
        the422Frame[j++] = ((u2 * 236) / 255 + 128);
        the422Frame[j++] = ((y3 * 240) / 255 + 16);
        the422Frame[j++] = ((v2 * 236) / 255 + 128);
    }
}

For efficient display of a YUV video source, you may wish to use Apple's client storage extension, which you can set up using something like the following:


glEnable(GL_TEXTURE_RECTANGLE_EXT);
glBindTexture(GL_TEXTURE_RECTANGLE_EXT, 1);

glTextureRangeAPPLE(GL_TEXTURE_RECTANGLE_EXT, videoImageWidth * videoImageHeight * 2, videoTexture);
glTexParameteri(GL_TEXTURE_RECTANGLE_EXT, GL_TEXTURE_STORAGE_HINT_APPLE , GL_STORAGE_SHARED_APPLE);
glPixelStorei(GL_UNPACK_CLIENT_STORAGE_APPLE, GL_TRUE);

glTexParameteri(GL_TEXTURE_RECTANGLE_EXT, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_RECTANGLE_EXT, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_RECTANGLE_EXT, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_RECTANGLE_EXT, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glPixelStorei(GL_UNPACK_ROW_LENGTH, 0);

glTexImage2D(GL_TEXTURE_RECTANGLE_EXT, 0, GL_RGBA, videoImageWidth, videoImageHeight, 0, GL_YCBCR_422_APPLE, GL_UNSIGNED_SHORT_8_8_REV_APPLE, videoTexture);    

This lets you quickly change out the data stored within your client-side video texture before each frame to be displayed on the screen.


To draw, you could then use code like the following:


glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);         
glEnable(GL_TEXTURE_2D);

glViewport(0, 0, [self frame].size.width, [self frame].size.height);

glMatrixMode(GL_PROJECTION);
glLoadIdentity();
NSRect bounds = NSRectFromCGRect([self bounds]);
glOrtho( (GLfloat)NSMinX(bounds), (GLfloat)NSMaxX(bounds), (GLfloat)NSMinY(bounds), (GLfloat)NSMaxY(bounds), -1.0, 1.0);

glBindTexture(GL_TEXTURE_RECTANGLE_EXT, 1);
glTexSubImage2D (GL_TEXTURE_RECTANGLE_EXT, 0, 0, 0, videoImageWidth, videoImageHeight, GL_YCBCR_422_APPLE, GL_UNSIGNED_SHORT_8_8_REV_APPLE, videoTexture);

glMatrixMode(GL_TEXTURE);
glLoadIdentity();

glBegin(GL_QUADS);
    glTexCoord2f(0.0f, 0.0f);
    glVertex2f(0.0f, videoImageHeight);

    glTexCoord2f(0.0f, videoImageHeight);
    glVertex2f(0.0f, 0.0f);

    glTexCoord2f(videoImageWidth, videoImageHeight);
    glVertex2f(videoImageWidth, 0.0f);

    glTexCoord2f(videoImageWidth, 0.0f);
    glVertex2f(videoImageWidth, videoImageHeight);      
glEnd();

Answer by Jens Ayton

Adam Rosenfield's comment is incorrect. On Macs, you can display YCbCr (the digital equivalent to YUV) textures using the GL_YCBCR_422_APPLE texture format, as specified in the APPLE_ycbcr_422 extension.
