Where is the official documentation for CVOpenGLESTexture method types?
Original source: http://stackoverflow.com/questions/9544293/
Note: this content is provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must attribute it to the original authors (not me): Stack Overflow
Asked by Pochi
I tried Google and Stack Overflow, but I can't seem to find the official documentation for functions that start with CVOpenGLESTexture. I can see they are from Core Video, and I know they were added in iOS 5, but searching the documentation doesn't give me anything.
I am looking for information about the parameters: what they do, how to use them, and so on, like in the other Apple frameworks.
So far, all I can do is Command-click on a function to see this information, but that feels super weird. Is there a way to add it so it can be displayed in the Quick Help panel on the right in Xcode?
Thanks, and sorry if this is a stupid question.
PS: The Core Video reference guide doesn't seem to explain these either.
Answered by Brad Larson
Unfortunately, there really isn't any documentation on these new functions. The best you're going to find right now is in the CVOpenGLESTextureCache.h header file, where you'll see a basic description of the function parameters:
/*!
    @function   CVOpenGLESTextureCacheCreate
    @abstract   Creates a new Texture Cache.
    @param      allocator The CFAllocatorRef to use for allocating the cache. May be NULL.
    @param      cacheAttributes A CFDictionaryRef containing the attributes of the cache itself. May be NULL.
    @param      eaglContext The OpenGLES 2.0 context into which the texture objects will be created. OpenGLES 1.x contexts are not supported.
    @param      textureAttributes A CFDictionaryRef containing the attributes to be used for creating the CVOpenGLESTexture objects. May be NULL.
    @param      cacheOut The newly created texture cache will be placed here
    @result     Returns kCVReturnSuccess on success
*/
CV_EXPORT CVReturn CVOpenGLESTextureCacheCreate(
    CFAllocatorRef allocator,
    CFDictionaryRef cacheAttributes,
    void *eaglContext,
    CFDictionaryRef textureAttributes,
    CVOpenGLESTextureCacheRef *cacheOut) __OSX_AVAILABLE_STARTING(__MAC_NA,__IPHONE_5_0);
The more difficult elements are the attributes dictionaries, which unfortunately you need to find examples of in order to use these functions properly. Apple has the GLCameraRipple and RosyWriter examples that show off how to use the fast texture upload path with BGRA and YUV input color formats. Apple also provided the ChromaKey example at WWDC (which may still be accessible along with the videos) that demonstrated how to use these texture caches to pull information from an OpenGL ES texture.
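For illustration, the header does declare one cache attribute key, kCVOpenGLESTextureCacheMaximumTextureAgeKey. Here is a minimal sketch of passing it (my own example, not from Apple's samples; my understanding is that an age of zero disables age-based eviction, so you would flush the cache yourself):

// Hypothetical example: the only cache attribute key declared in
// CVOpenGLESTextureCache.h is the maximum texture age.
NSDictionary *cacheAttributes = @{
    (__bridge NSString *)kCVOpenGLESTextureCacheMaximumTextureAgeKey : @(0.0)
};

CVOpenGLESTextureCacheRef textureCache = NULL;
CVReturn createErr = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault,
                                                  (__bridge CFDictionaryRef)cacheAttributes,
                                                  (__bridge void *)eaglContext, // an EAGLContext configured for OpenGL ES 2.0
                                                  NULL, // default texture attributes
                                                  &textureCache);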
I just got this fast texture uploading working in my GPUImage framework (the source code for which is available at that link), so I'll lay out what I was able to parse out of this. First, I create a texture cache using the following code:
CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, (__bridge void *)[[GPUImageOpenGLESContext sharedImageProcessingOpenGLESContext] context], NULL, &coreVideoTextureCache);
if (err)
{
    NSAssert(NO, @"Error at CVOpenGLESTextureCacheCreate %d", err);
}
where the context referred to is an EAGLContext configured for OpenGL ES 2.0.
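If you aren't using GPUImage's shared context, a minimal sketch of creating a suitable context yourself (names here are my own) would be:

#import <OpenGLES/EAGL.h>

// A plain OpenGL ES 2.0 context; the texture cache does not accept 1.x contexts.
EAGLContext *eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
if (![EAGLContext setCurrentContext:eaglContext])
{
    NSLog(@"Could not make the OpenGL ES 2.0 context current");
}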
I use this to keep video frames from the iOS device camera in video memory, and I use the following code to do this:
CVPixelBufferLockBaseAddress(cameraFrame, 0);

CVOpenGLESTextureRef texture = NULL;
CVReturn err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, coreVideoTextureCache, cameraFrame, NULL, GL_TEXTURE_2D, GL_RGBA, bufferWidth, bufferHeight, GL_BGRA, GL_UNSIGNED_BYTE, 0, &texture);
if (!texture || err)
{
    NSLog(@"CVOpenGLESTextureCacheCreateTextureFromImage failed (error: %d)", err);
    return;
}

outputTexture = CVOpenGLESTextureGetName(texture);
glBindTexture(GL_TEXTURE_2D, outputTexture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

// Do processing work on the texture data here

CVPixelBufferUnlockBaseAddress(cameraFrame, 0);
CVOpenGLESTextureCacheFlush(coreVideoTextureCache, 0);
CFRelease(texture);
outputTexture = 0;
This creates a new CVOpenGLESTextureRef, representing an OpenGL ES texture, from the texture cache. This texture is based on the CVImageBufferRef passed in by the camera. That texture is then retrieved from the CVOpenGLESTextureRef and appropriate parameters set for it (which seemed to be necessary in my processing). Finally, I do my work on the texture and clean up when I'm done.
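A couple of related accessors from CVOpenGLESTexture.h can make that retrieval step more defensive; a short sketch (my own, not part of the code above):

// Query the texture target and orientation instead of assuming them.
GLenum target = CVOpenGLESTextureGetTarget(texture);
GLuint name   = CVOpenGLESTextureGetName(texture);
glBindTexture(target, name);

if (CVOpenGLESTextureIsFlipped(texture))
{
    // The image is vertically flipped; compensate in your texture coordinates.
}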
This fast upload process makes a real difference on iOS devices. It took the upload and processing of a single 640x480 video frame on an iPhone 4S from 9.0 ms down to 1.8 ms.
I've heard that this works in reverse as well, which might allow for the replacement of glReadPixels() in certain situations, but I've yet to try this.
Answered by pfleiner
Apple finally posted the documentation, a little over a week ago.