
Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use/share it, but you must attribute it to the original authors (not me): StackOverflow. Original question: http://stackoverflow.com/questions/41123346/

Date: 2020-09-15 09:38:38  Source: igfitidea

Unable to use AVCapturePhotoOutput to capture photo swift + xcode

ios swift xcode swift3 avfoundation

Asked by user2238284

I am working on a custom camera app, and the tutorial I'm following uses AVCaptureStillImageOutput, which is deprecated in iOS 10. I have set up the camera and am now stuck on how to take the photo.


Here is the full view controller where I have the camera:


import UIKit
import AVFoundation

var cameraPos = "back"

class View3: UIViewController,UIImagePickerControllerDelegate,UINavigationControllerDelegate {


@IBOutlet weak var clickButton: UIButton!
@IBOutlet var cameraView: UIView!
var session: AVCaptureSession?
var stillImageOutput: AVCapturePhotoOutput?
var videoPreviewLayer: AVCaptureVideoPreviewLayer?

override func viewDidLoad() {
    super.viewDidLoad()        
}

override func didReceiveMemoryWarning() {
    super.didReceiveMemoryWarning()
}

override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    clickButton.center.x = cameraView.bounds.width/2
    loadCamera()
}

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
 }

@IBAction func clickCapture(_ sender: UIButton) {

    if let videoConnection = stillImageOutput!.connection(withMediaType: AVMediaTypeVideo) {
       // This is where I need help 
        }
}

@IBAction func changeDevice(_ sender: UIButton) {
    if cameraPos == "back"
    {cameraPos = "front"}

    else
    {cameraPos = "back"}


    loadCamera()
}

func loadCamera()
{
    session?.stopRunning()
    videoPreviewLayer?.removeFromSuperlayer()

    session = AVCaptureSession()
    session!.sessionPreset = AVCaptureSessionPresetPhoto

    var backCamera = AVCaptureDevice.defaultDevice(withDeviceType: .builtInWideAngleCamera, mediaType: AVMediaTypeVideo, position: .front)

    if cameraPos == "back"
    {
        backCamera = AVCaptureDevice.defaultDevice(withDeviceType: .builtInWideAngleCamera, mediaType: AVMediaTypeVideo, position: .back)
    }

    var error: NSError?
    var input: AVCaptureDeviceInput!
    do {
        input = try AVCaptureDeviceInput(device: backCamera)
    } catch let error1 as NSError {
        error = error1
        input = nil
        print(error!.localizedDescription)
    }

    if error == nil && session!.canAddInput(input) {
        session!.addInput(input)

        stillImageOutput = AVCapturePhotoOutput()

        if session!.canAddOutput(stillImageOutput) {
            session!.addOutput(stillImageOutput)
            videoPreviewLayer = AVCaptureVideoPreviewLayer(session: session)
            videoPreviewLayer?.frame = cameraView.bounds
            videoPreviewLayer?.videoGravity = AVLayerVideoGravityResizeAspectFill
            videoPreviewLayer?.connection.videoOrientation = AVCaptureVideoOrientation.portrait

            cameraView.layer.addSublayer(videoPreviewLayer!)
            session!.startRunning()
        }
    }
}
}

This is where I need help:


@IBAction func clickCapture(_ sender: UIButton) {

if let videoConnection = stillImageOutput!.connection(withMediaType: AVMediaTypeVideo) {
   // This is where I need help 
    }
}

I have gone through the answer here: How to use AVCapturePhotoOutput, but I do not understand how to incorporate that code into this code, as it involves declaring a new class.


Answered by Bluewings

You are almost there.


For Output as AVCapturePhotoOutput


Check out the AVCapturePhotoOutput documentation for more help.


These are the steps to capture a photo.


  1. Create an AVCapturePhotoOutput object. Use its properties to determine supported capture settings and to enable certain features (for example, whether to capture Live Photos).
  2. Create and configure an AVCapturePhotoSettings object to choose features and settings for a specific capture (for example, whether to enable image stabilization or flash).
  3. Capture an image by passing your photo settings object to the capturePhoto(with:delegate:) method along with a delegate object implementing the AVCapturePhotoCaptureDelegate protocol. The photo capture output then calls your delegate to notify you of significant events during the capture process.

Add the code below to your clickCapture method, and don't forget to conform to and implement the AVCapturePhotoCaptureDelegate protocol in your class.


let settings = AVCapturePhotoSettings()
let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                     kCVPixelBufferWidthKey as String: 160,
                     kCVPixelBufferHeightKey as String: 160]
settings.previewPhotoFormat = previewFormat
// `cameraOutput` is the AVCapturePhotoOutput (`stillImageOutput` in your code)
self.cameraOutput.capturePhoto(with: settings, delegate: self)
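The delegate side is not shown above. A minimal sketch of what the conforming extension could look like, using the iOS 10 / Swift 3 delegate method capture(_:didFinishProcessingPhotoSampleBuffer:...); `captureImageView` is a hypothetical outlet, not something declared in the question's class:

```swift
import UIKit
import AVFoundation

// Sketch: AVCapturePhotoCaptureDelegate conformance for the View3 class above
// (iOS 10 / Swift 3 API). `captureImageView` is a hypothetical image-view outlet.
extension View3: AVCapturePhotoCaptureDelegate {

    func capture(_ captureOutput: AVCapturePhotoOutput,
                 didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?,
                 previewPhotoSampleBuffer: CMSampleBuffer?,
                 resolvedSettings: AVCaptureResolvedPhotoSettings,
                 bracketSettings: AVCaptureBracketedStillImageSettings?,
                 error: Error?) {
        if let error = error {
            print("Capture failed: \(error.localizedDescription)")
            return
        }
        // Convert the captured sample buffer to JPEG data, then to a UIImage.
        guard let sampleBuffer = photoSampleBuffer,
              let imageData = AVCapturePhotoOutput.jpegPhotoDataRepresentation(
                  forJPEGSampleBuffer: sampleBuffer,
                  previewPhotoSampleBuffer: previewPhotoSampleBuffer),
              let image = UIImage(data: imageData)
        else { return }

        // Use the image here, e.g.:
        // captureImageView.image = image
        _ = image
    }
}
```

Because you pass `delegate: self` to capturePhoto(with:delegate:), this method is called once the photo has been processed.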


For Output as AVCaptureStillImageOutput


If you intend to snap a photo from the video connection, you can follow the steps below.


Step 1: Get the connection


if let videoConnection = stillImageOutput!.connection(withMediaType: AVMediaTypeVideo) {
  // ...
  // Code for photo capture goes here...
}

Step 2: Capture the photo


  • Call the captureStillImageAsynchronously(from:completionHandler:) function (captureStillImageAsynchronouslyFromConnection in Swift 2) on the stillImageOutput.
  • The sampleBuffer represents the data that is captured.


stillImageOutput?.captureStillImageAsynchronously(from: videoConnection, completionHandler: { (sampleBuffer, error) -> Void in
  // ...
  // Process the image data (sampleBuffer) here to get an image file we can put in our captureImageView
})

Step 3: Process the Image Data


  • We will need to take a few steps to process the image data found in sampleBuffer in order to end up with a UIImage that we can insert into our captureImageView and easily use elsewhere in our app.


if sampleBuffer != nil {
  let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer!)
  let dataProvider = CGDataProvider(data: imageData! as CFData)
  let cgImageRef = CGImage(jpegDataProviderSource: dataProvider!, decode: nil, shouldInterpolate: true, intent: .defaultIntent)
  let image = UIImage(cgImage: cgImageRef!, scale: 1.0, orientation: .right)
  // ...
  // Add the image to captureImageView here...
}

Step 4: Save the image


Based on your needs, either save the image to the photo gallery or show it in an image view.

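For instance, a sketch of a helper that does both (handleCapturedImage is a hypothetical name; on iOS 10+, saving to the photo library also requires an NSPhotoLibraryUsageDescription entry in Info.plist):

```swift
import UIKit

// Sketch: hand the captured UIImage either to the photo library, an image
// view, or both. The function name and parameters are illustrative only.
func handleCapturedImage(_ image: UIImage, showIn imageView: UIImageView?) {
    // Save to the user's photo gallery (completion callback omitted here).
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)

    // And/or display it in an image view.
    imageView?.image = image
}
```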



For more details, check out the Create custom camera view guide, under "Snap a Photo".
