iOS: How to properly release an AVCaptureSession

Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must follow the same license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/3741121/


How to properly release an AVCaptureSession

Tags: iphone, multithreading, ios, camera, avfoundation

Asked by Codo

I'm using the AV Foundation classes to capture the live video stream from the camera and to process the video samples. This works nicely. However, I do have problems properly releasing the AV foundation instances (capture session, preview layer, input and output) once I'm done.

When I no longer need the session and all associated objects, I stop the capture session and release it. This works most of the time. However, sometimes the app crashes with an EXC_BAD_ACCESS signal raised in the second thread that was created by the dispatch queue (and where the video samples are processed). The crash is mainly due to my own class instance, which serves as the sample buffer delegate and is freed after I've stopped the capture session.

The Apple documentation mentions the problem: Stopping the capture session is an asynchronous operation. That is: it doesn't happen immediately. In particular, the second thread continues to process video samples and access different instances like the capture session or the input and output devices.

So how do I properly release the AVCaptureSession and all related instances? Is there a notification that reliably tells me that the AVCaptureSession has finished?

Here's my code:

Declarations:

AVCaptureSession* session;
AVCaptureVideoPreviewLayer* previewLayer;
UIView* view;

Setup of instances:

AVCaptureDevice* camera = [AVCaptureDevice defaultDeviceWithMediaType: AVMediaTypeVideo];
session = [[AVCaptureSession alloc] init];

AVCaptureDeviceInput* input = [AVCaptureDeviceInput deviceInputWithDevice: camera error: &error];
[session addInput: input];
AVCaptureVideoDataOutput* output = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
[session addOutput: output];

dispatch_queue_t queue = dispatch_queue_create("augm_reality", NULL);
[output setSampleBufferDelegate: self queue: queue];
dispatch_release(queue);

previewLayer = [[AVCaptureVideoPreviewLayer layerWithSession: session] retain];
previewLayer.frame = view.bounds;
[view.layer addSublayer: previewLayer];

[session startRunning];

Cleanup:

[previewLayer removeFromSuperlayer];
[previewLayer release];
[session stopRunning];
[session release];

Accepted answer by Codo

Here's the best solution I've found so far. The basic idea is to use the finalizer of the dispatch queue. When the dispatch queue quits, we can be sure that there won't be any more action in the second thread where the sample buffers are processed.

static void capture_cleanup(void* p)
{
    AugmReality* ar = (AugmReality *)p; // cast to original context instance
    [ar release];  // releases capture session if dealloc is called
}

...

dispatch_queue_t queue = dispatch_queue_create("augm_reality", NULL);
dispatch_set_context(queue, self);
dispatch_set_finalizer_f(queue, capture_cleanup);
[output setSampleBufferDelegate: self queue: queue];
dispatch_release(queue);
[self retain];

...

Unfortunately, I now have to explicitly stop capturing. Otherwise releasing my instance won't free it, because the second thread now increments and decrements the retain count as well.

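Under the finalizer approach, the explicit stop can be sketched as follows (a hypothetical -stopCapturing method; it assumes the output object from the setup code is also kept in an instance variable):

```objectivec
// Sketch only: session, previewLayer, output are assumed instance variables.
- (void)stopCapturing
{
    [session stopRunning];

    // Detach the delegate so the output lets go of its dispatch queue; once the
    // queue's retain count drops to zero, capture_cleanup runs and balances the
    // earlier [self retain].
    [output setSampleBufferDelegate:nil queue:NULL];

    [previewLayer removeFromSuperlayer];
    [previewLayer release];
    previewLayer = nil;

    [session release];
    session = nil;
}
```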

A further problem is that my class is now released from two different threads. Is this reliable or is it the next problem causing crashes?

Answered by Codo

I've posted a very similar question in the Apple Developer Forum and got an answer from an Apple employee. He says it's a known problem:

This is a problem with the AVCaptureSession / VideoDataOutput in iOS 4.0-4.1 that has been fixed and will appear in a future update. For the time being, you can work around it by waiting for a short period after stopping the AVCaptureSession, e.g. half a second, before disposing of the session and data output.

He/she proposes the following code:

dispatch_after(
    dispatch_time(0, 500000000),
    dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), // or main queue, or your own
    ^{
        // Do your work here.
        [session release];
        // etc.
    }
);

I still like the approach with the dispatch queue finalizer better because this code just guesses when the second thread might have finished.

Answered by Kiran

As per the current Apple docs, [AVCaptureSession stopRunning] is a synchronous operation that blocks until the receiver has completely stopped running, so these issues shouldn't happen any more.

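Given the synchronous behavior, a minimal teardown can be sketched like this (assuming ARC and the same session/previewLayer instance variables; since stopRunning blocks, it is dispatched off the main thread to avoid stalling the UI):

```objectivec
// Sketch only: stopRunning now returns only after the session has fully stopped.
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    [session stopRunning];  // blocks until capture has stopped
    dispatch_async(dispatch_get_main_queue(), ^{
        [previewLayer removeFromSuperlayer];  // layer work belongs on the main thread
        previewLayer = nil;
        session = nil;  // safe to drop the last reference now
    });
});
```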

Answered by VLegakis

Solved! Perhaps it is the sequence of actions when initializing the session. This one works for me:

NSError *error = nil;

if(session)
    [session release];

// Create the session
session = [[AVCaptureSession alloc] init];


// Configure the session to produce lower resolution video frames, if your 
// processing algorithm can cope. We'll specify medium quality for the
// chosen device.
session.sessionPreset = AVCaptureSessionPresetMedium;

// Find a suitable AVCaptureDevice
AVCaptureDevice *device = [AVCaptureDevice
                           defaultDeviceWithMediaType:AVMediaTypeVideo];

// Create a device input with the device and add it to the session.
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device 
                                                                    error:&error];
if (!input) {
    // Handling the error appropriately.
}
[session addInput:input];

// Create a VideoDataOutput and add it to the session
AVCaptureVideoDataOutput *output = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
[session addOutput:output];


// Configure your output.
dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
[output setSampleBufferDelegate:self queue:queue];
dispatch_release(queue);

// Specify the pixel format
output.videoSettings = 
[NSDictionary dictionaryWithObject:
 [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] 
                            forKey:(id)kCVPixelBufferPixelFormatTypeKey];

// If you wish to cap the frame rate to a known value, such as 15 fps, set 
// minFrameDuration.
output.minFrameDuration = CMTimeMake(1, 15);

previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
[delegate layerArrived:previewLayer];

NSNotificationCenter *notify =
[NSNotificationCenter defaultCenter];
[notify addObserver: self
            selector: @selector(onVideoError:)
            name: AVCaptureSessionRuntimeErrorNotification
            object: session];
[notify addObserver: self
            selector: @selector(onVideoStart:)
            name: AVCaptureSessionDidStartRunningNotification
            object: session];
[notify addObserver: self
            selector: @selector(onVideoStop:)
            name: AVCaptureSessionDidStopRunningNotification
            object: session];
[notify addObserver: self
            selector: @selector(onVideoStop:)
            name: AVCaptureSessionWasInterruptedNotification
            object: session];
[notify addObserver: self
            selector: @selector(onVideoStart:)
            name: AVCaptureSessionInterruptionEndedNotification
            object: session];

// Start the session running to start the flow of data
[session startRunning];

Btw this sequence seems to resolve the synchronous notifications problem :)

Answered by Mahyar

With the queue finalizers, you can use a dispatch_semaphore for each queue and then continue with your cleanup routine once you're done.

#define GCD_TIME(delayInSeconds) dispatch_time(DISPATCH_TIME_NOW, delayInSeconds * NSEC_PER_SEC)

static void vQueueCleanup(void* context) {
  VideoRecordingViewController *vc = (VideoRecordingViewController*)context;
  if (vc.vSema) dispatch_semaphore_signal(vc.vSema);
}

static void aQueueCleanup(void* context) {
  VideoRecordingViewController *vc = (VideoRecordingViewController*)context;
  if (vc.aSema) dispatch_semaphore_signal(vc.aSema);
}

//In your cleanup method:
vSema = dispatch_semaphore_create(0);
aSema = dispatch_semaphore_create(0);
self.avSession = nil;
if (vSema) dispatch_semaphore_wait(vSema, GCD_TIME(0.5));
if (aSema) dispatch_semaphore_wait(aSema, GCD_TIME(0.5));
[self.navigationController popViewControllerAnimated:YES];

Remember that you have to set your AVCaptureVideoDataOutput/AVCaptureAudioDataOutput objects' sample buffer delegates to nil, or they will never release their associated queues and thus never call their finalizers when you release your AVCaptureSession.

[avs removeOutput:vOut];
[vOut setSampleBufferDelegate:nil queue:NULL];

Answered by souvickcse

- (void)deallocSession
{
    [captureVideoPreviewLayer removeFromSuperlayer];
    for (AVCaptureInput *input1 in session.inputs) {
        [session removeInput:input1];
    }
    for (AVCaptureOutput *output1 in session.outputs) {
        [session removeOutput:output1];
    }
    [session stopRunning];
    session = nil;
    outputSettings = nil;
    device = nil;
    input = nil;
    captureVideoPreviewLayer = nil;
    stillImageOutput = nil;
    self.vImagePreview = nil;
}

I called this function before popping or pushing any other view. It solved my low memory warning issue.

Answered by VLegakis

After AVCaptureSession allocation you may use:

NSNotificationCenter *notify =
[NSNotificationCenter defaultCenter];
[notify addObserver: self
            selector: @selector(onVideoError:)
            name: AVCaptureSessionRuntimeErrorNotification
            object: session];
[notify addObserver: self
            selector: @selector(onVideoStart:)
            name: AVCaptureSessionDidStartRunningNotification
            object: session];
[notify addObserver: self
            selector: @selector(onVideoStop:)
            name: AVCaptureSessionDidStopRunningNotification
            object: session];
[notify addObserver: self
            selector: @selector(onVideoStop:)
            name: AVCaptureSessionWasInterruptedNotification
            object: session];
[notify addObserver: self
            selector: @selector(onVideoStart:)
            name: AVCaptureSessionInterruptionEndedNotification
            object: session];

These call back the relevant methods upon session.stopRunning, session.startRunning, etc.

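The onVideoStart:/onVideoStop:/onVideoError: handlers referenced by those selectors are not shown in the answer; minimal hypothetical implementations might look like:

```objectivec
// Hypothetical handlers for the observers registered above.
- (void)onVideoStart:(NSNotification *)note
{
    NSLog(@"Capture session running: %@", note.name);
}

- (void)onVideoStop:(NSNotification *)note
{
    NSLog(@"Capture session stopped/interrupted: %@", note.name);
}

- (void)onVideoError:(NSNotification *)note
{
    NSError *error = [note.userInfo objectForKey:AVCaptureSessionErrorKey];
    NSLog(@"Capture session runtime error: %@", error);
}
```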

There you should also implement some undocumented cleanup block:

AVCaptureInput* input = [session.inputs objectAtIndex:0];
[session removeInput:input];
AVCaptureVideoDataOutput* output = (AVCaptureVideoDataOutput*)[session.outputs objectAtIndex:0];
[session removeOutput:output];  

What I found confusing, though, is that upon calling session.stopRunning, onVideoStop: is called synchronously, despite Apple's documentation describing the operation as asynchronous.

It's working, but please let me know if you see any catch. I would prefer working with it asynchronously.

Thanks
