Xcode AVCapture appendSampleBuffer
Original URL: http://stackoverflow.com/questions/3846331/
Warning: this content is provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must attribute it to the original authors (not me): StackOverflow
AVCapture appendSampleBuffer
Asked by Michael O'Brien
I am going insane with this one - I have looked everywhere and tried anything and everything I can think of.
I am making an iPhone app that uses AVFoundation - specifically AVCapture to capture video using the iPhone camera.
I need to have a custom image that is overlaid on the video feed and included in the recording.
So far I have the AVCapture session set up, can display the feed, access the frame, save it as a UIImage, and merge the overlay image onto it. I then convert this new UIImage into a CVPixelBufferRef. And to double-check that the bufferRef is working, I converted it back to a UIImage and it still displays the image fine.
The trouble starts when I try to convert the CVPixelBufferRef into a CMSampleBufferRef to append to the AVCaptureSession's assetWriterInput. The CMSampleBufferRef always comes back NULL when I attempt to create it.
Here is the -(void)captureOutput function:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    UIImage *botImage   = [self imageFromSampleBuffer:sampleBuffer];
    UIImage *wheel      = [self imageFromView:wheelView];
    UIImage *finalImage = [self overlaidImage:botImage :wheel];
    //[previewImage setImage:finalImage]; <- works -- the image is being merged into one UIImage

    CVPixelBufferRef pixelBuffer = NULL;
    CGImageRef cgImage = CGImageCreateCopy(finalImage.CGImage);
    CFDataRef image = CGDataProviderCopyData(CGImageGetDataProvider(cgImage));
    int status = CVPixelBufferCreateWithBytes(NULL,
                                              self.view.bounds.size.width,
                                              self.view.bounds.size.height,
                                              kCVPixelFormatType_32BGRA,
                                              (void *)CFDataGetBytePtr(image),
                                              CGImageGetBytesPerRow(cgImage),
                                              NULL,
                                              0,
                                              NULL,
                                              &pixelBuffer);
    if (status == 0) {
        OSStatus result = 0;
        CMVideoFormatDescriptionRef videoInfo = NULL;
        result = CMVideoFormatDescriptionCreateForImageBuffer(NULL, pixelBuffer, &videoInfo);
        NSParameterAssert(result == 0 && videoInfo != NULL);

        CMSampleBufferRef myBuffer = NULL;
        result = CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault,
                                                    pixelBuffer, true, NULL, NULL,
                                                    videoInfo, NULL, &myBuffer);
        NSParameterAssert(result == 0 && myBuffer != NULL); // always null :S

        NSLog(@"Trying to append");
        if (!CMSampleBufferDataIsReady(myBuffer)) {
            NSLog(@"sampleBuffer data is not ready");
            return;
        }
        if (![assetWriterInput isReadyForMoreMediaData]) {
            NSLog(@"Not ready for data :(");
            return;
        }
        if (![assetWriterInput appendSampleBuffer:myBuffer]) {
            NSLog(@"Failed to append pixel buffer");
        }
    }
}
Another solution I keep hearing about is using an AVAssetWriterInputPixelBufferAdaptor, which eliminates the need to do the messy CMSampleBufferRef wrapping. However, I have scoured Stack Overflow and the Apple developer forums and docs and can't find a clear description or example of how to set it up or how to use it. If anyone has a working example of it, could you please show me or help me nut out the above issue - I have been working on this non-stop for a week and am at wits' end.
Let me know if you need any other info.
Thanks in advance,

Michael
Answered by Johnmph
You need an AVAssetWriterInputPixelBufferAdaptor. Here is the code to create it:
// Create dictionary for pixel buffer adaptor
NSDictionary *bufferAttributes = [NSDictionary dictionaryWithObjectsAndKeys:
                                  [NSNumber numberWithInt:kCVPixelFormatType_32BGRA],
                                  kCVPixelBufferPixelFormatTypeKey, nil];

// Create pixel buffer adaptor
m_pixelsBufferAdaptor = [[AVAssetWriterInputPixelBufferAdaptor alloc]
                         initWithAssetWriterInput:assetWriterInput
                      sourcePixelBufferAttributes:bufferAttributes];
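For context (not part of the original answer), the adaptor only works once it is attached to a running AVAssetWriter pipeline, and its pixelBufferPool stays NULL until startWriting has been called. A minimal setup sketch, where outputURL, width, and height are assumed placeholder values:

```objc
// Sketch of the surrounding writer setup; outputURL, width, and height
// are placeholders, not values from the original answer.
NSError *error = nil;
AVAssetWriter *writer = [[AVAssetWriter alloc] initWithURL:outputURL
                                                  fileType:AVFileTypeQuickTimeMovie
                                                     error:&error];

NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                               AVVideoCodecH264, AVVideoCodecKey,
                               [NSNumber numberWithInt:width], AVVideoWidthKey,
                               [NSNumber numberWithInt:height], AVVideoHeightKey, nil];
AVAssetWriterInput *assetWriterInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:videoSettings];
assetWriterInput.expectsMediaDataInRealTime = YES;
[writer addInput:assetWriterInput];

// Create m_pixelsBufferAdaptor as shown above, then start the session:
[writer startWriting];
[writer startSessionAtSourceTime:kCMTimeZero];
```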
And the code to use it:
// If ready for more media data
if (m_pixelsBufferAdaptor.assetWriterInput.readyForMoreMediaData) {
    // Create a pixel buffer from the adaptor's pool
    CVPixelBufferRef pixelsBuffer = NULL;
    CVPixelBufferPoolCreatePixelBuffer(NULL, m_pixelsBufferAdaptor.pixelBufferPool, &pixelsBuffer);

    // Lock the pixel buffer's base address
    CVPixelBufferLockBaseAddress(pixelsBuffer, 0);

    // Write your pixel data into the buffer (in your case, fill it with your finalImage data)
    [self yourFunctionToPutDataInPixelBuffer:CVPixelBufferGetBaseAddress(pixelsBuffer)];

    // Unlock the pixel buffer's base address
    CVPixelBufferUnlockBaseAddress(pixelsBuffer, 0);

    // Append the pixel buffer. Compute currentFrameTime to suit your needs;
    // the simplest way is to start at 0 and add one frame duration
    // (the inverse of your frame rate) each time you write a frame.
    [m_pixelsBufferAdaptor appendPixelBuffer:pixelsBuffer withPresentationTime:currentFrameTime];

    // Release the pixel buffer
    CVPixelBufferRelease(pixelsBuffer);
}
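Following the comment above, one way to compute currentFrameTime for a fixed frame rate is to build it from a running frame counter; the 30 fps rate and the frameCount variable are assumptions for illustration, not part of the original answer:

```objc
// At a fixed 30 fps, frame N is presented at N/30 seconds.
// frameCount is a running counter, incremented after each appended frame.
CMTime currentFrameTime = CMTimeMake(frameCount, 30);
frameCount++;
```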
And don't forget to release your pixelsBufferAdaptor.
Answered by Kongling Ouyang
I do it by using CMSampleBufferCreateForImageBuffer():
OSStatus ret = 0;
CMSampleBufferRef sample = NULL;
CMVideoFormatDescriptionRef videoInfo = NULL;
CMSampleTimingInfo timingInfo = kCMTimingInfoInvalid;
timingInfo.presentationTimeStamp = pts;
timingInfo.duration = duration;

ret = CMVideoFormatDescriptionCreateForImageBuffer(NULL, pixel, &videoInfo);
if (ret != 0) {
    NSLog(@"CMVideoFormatDescriptionCreateForImageBuffer failed! %d", (int)ret);
    goto done;
}

ret = CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, pixel, true, NULL, NULL,
                                         videoInfo, &timingInfo, &sample);
if (ret != 0) {
    NSLog(@"CMSampleBufferCreateForImageBuffer failed! %d", (int)ret);
    goto done;
}

done:
// The original answer omits the done: label; releasing the format description
// here covers both the success path (the sample buffer retains it) and the
// failure paths.
if (videoInfo != NULL) {
    CFRelease(videoInfo);
}
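The pts and duration values above come from the caller. If you are re-wrapping a frame received in captureOutput:didOutputSampleBuffer:fromConnection:, one option (an assumption, not stated in the original answer) is to copy the timing from the original sample buffer:

```objc
// Reuse the capture timestamps so the re-wrapped frame keeps its original timing.
CMTime pts      = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
CMTime duration = CMSampleBufferGetDuration(sampleBuffer);
```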