Custom camera view Swift iOS 8 iPhone Xcode 6.1
Original question: http://stackoverflow.com/questions/28768210/
Note: the content below is provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must attribute it to the original StackOverflow authors (not the translator) and link back to the original post.
Asked by Adam Smaka
I would like to use the camera on my iPhone inside a view. I don't want the typical full-screen camera view, but my own.
For example, I would like a 200x200 square in the middle of the screen showing a camera preview, and below that square a button to take a picture. How can I do this? I'm a Swift beginner.
Answered by Komran Ghahremani
You'll want to use the AVFoundation framework to create your own AVCaptureSession inside a view that you lay out in your storyboard. Here is a nice tutorial showing you how to find the camera and create a capture session:
http://jamesonquave.com/blog/taking-control-of-the-iphone-camera-in-ios-8-with-swift-part-1/
This tutorial uses the whole view as the capture view, so that is how big the camera preview will be if you model your code after it. To make a 200x200 square in the middle of the screen, you have to draw one on your view controller in your storyboard, link it to a variable in the Swift file where the code lives, and then change the part at the bottom that says,
previewLayer?.frame = self.view.layer.frame
to your200by200View.layer.frame (where your200by200View is the outlet connected to your 200x200 view), as sketched below.
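For example, here is a minimal sketch of that change, assuming the captureSession has already been configured with a camera input as in the tutorial, and that the 200x200 storyboard view is connected to an outlet I've called your200by200View (a hypothetical name, as is the class name):

import UIKit
import AVFoundation

class SquarePreviewViewController: UIViewController {

    // Hypothetical outlet for the 200x200 view drawn in the storyboard
    @IBOutlet weak var your200by200View: UIView!

    // Assumes the session has already been given a camera input,
    // as shown in the linked tutorial
    let captureSession = AVCaptureSession()
    var previewLayer: AVCaptureVideoPreviewLayer?

    func setupPreview() {
        let layer = AVCaptureVideoPreviewLayer(session: captureSession)
        layer.videoGravity = AVLayerVideoGravityResizeAspectFill
        // Attach the preview to the small view instead of self.view,
        // and size it to that view rather than the full screen
        layer.frame = your200by200View.layer.bounds
        your200by200View.layer.addSublayer(layer)
        previewLayer = layer
        captureSession.startRunning()
    }
}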
Hopefully this can help. If not, I can try to help some more, or someone can correct me.
Good luck!
Answered by Durul Dalkanat
Another code block, showing how you can do manual focus on the iPhone:
import UIKit

class ViewController: UIViewController {

    @IBOutlet var cameraView: CameraView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }

    @IBAction func sliderChanged(sender: UISlider) {
        cameraView.setFocusWithLensPosition(sender.value)
    }
}
import UIKit
import AVFoundation

class CameraView: UIView {

    // AVFoundation properties
    let captureSession = AVCaptureSession()
    var captureDevice: AVCaptureDevice!
    var captureDeviceFormat: AVCaptureDeviceFormat?
    let stillImageOutput = AVCaptureStillImageOutput()
    var cameraLayer: AVCaptureVideoPreviewLayer?

    required init?(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
        initCamera()
    }

    func initCamera() {
        captureSession.beginConfiguration()

        stillImageOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]

        // get the back camera
        if let device = cameraDeviceForPosition(AVCaptureDevicePosition.Back) {
            captureDevice = device
            captureDeviceFormat = device.activeFormat

            // lock the device so the focus mode can be fixed
            do {
                try captureDevice.lockForConfiguration()
                captureDevice.focusMode = AVCaptureFocusMode.Locked
                captureDevice.unlockForConfiguration()
            } catch let error as NSError {
                print("Could not lock the capture device: \(error)")
            }

            // add the camera input
            do {
                let deviceInput = try AVCaptureDeviceInput(device: captureDevice)
                captureSession.addInput(deviceInput)
            } catch let error as NSError {
                print("Could not create the device input: \(error)")
            }

            captureSession.addOutput(stillImageOutput)

            // use the high resolution photo preset
            captureSession.sessionPreset = AVCaptureSessionPresetPhoto

            // setup camera preview
            cameraLayer = AVCaptureVideoPreviewLayer(session: captureSession)
            if let player = cameraLayer {
                player.videoGravity = AVLayerVideoGravityResizeAspectFill
                self.layer.addSublayer(player)
                player.frame = self.layer.bounds
                player.connection.videoOrientation = AVCaptureVideoOrientation.LandscapeRight
            }
        }

        // commit and start capturing
        captureSession.commitConfiguration()
        captureSession.startRunning()
    }

    func setFocusWithLensPosition(pos: Float) {
        do {
            try captureDevice.lockForConfiguration()
            captureDevice.setFocusModeLockedWithLensPosition(pos, completionHandler: nil)
            captureDevice.unlockForConfiguration()
        } catch let error as NSError {
            print("Could not lock the capture device: \(error)")
        }
    }

    // return the camera device for a position
    func cameraDeviceForPosition(position: AVCaptureDevicePosition) -> AVCaptureDevice? {
        for device in AVCaptureDevice.devices() {
            if let camera = device as? AVCaptureDevice where camera.position == position {
                return camera
            }
        }
        return nil
    }
}
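The CameraView above wires up a stillImageOutput but never actually captures from it. Here is a minimal sketch of a capture helper you could add, assuming the configuration above; the method name and completion handler are my additions, not part of the original answer:

extension CameraView {
    // Hypothetical helper: grab a JPEG still from the configured stillImageOutput
    func takeStillPhoto(completion: (UIImage?) -> Void) {
        guard let connection = stillImageOutput.connectionWithMediaType(AVMediaTypeVideo) else {
            completion(nil)
            return
        }
        stillImageOutput.captureStillImageAsynchronouslyFromConnection(connection) { sampleBuffer, error in
            if sampleBuffer != nil {
                let jpegData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
                completion(UIImage(data: jpegData))
            } else {
                completion(nil)
            }
        }
    }
}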
Answered by Lyndsey Scott
An example of how you can add a cameraOverlayView to create a 200x200 square viewing window at the center of the screen:
@IBAction func takePhoto(sender: AnyObject) {
    if !UIImagePickerController.isSourceTypeAvailable(UIImagePickerControllerSourceType.Camera) {
        return
    }

    let imagePicker = UIImagePickerController()
    imagePicker.delegate = self
    imagePicker.sourceType = UIImagePickerControllerSourceType.Camera

    // Create the camera overlay: an opaque image with a 200x200 see-through square in the middle
    let pickerFrame = CGRectMake(0, UIApplication.sharedApplication().statusBarFrame.size.height, imagePicker.view.bounds.width, imagePicker.view.bounds.height - imagePicker.navigationBar.bounds.size.height - imagePicker.toolbar.bounds.size.height)
    let squareFrame = CGRectMake(pickerFrame.width/2 - 200/2, pickerFrame.height/2 - 200/2, 200, 200)

    UIGraphicsBeginImageContext(pickerFrame.size)
    let context = UIGraphicsGetCurrentContext()
    CGContextSaveGState(context)

    // Clip out the square with the even-odd rule so it stays transparent
    CGContextAddRect(context, CGContextGetClipBoundingBox(context))
    CGContextMoveToPoint(context, squareFrame.origin.x, squareFrame.origin.y)
    CGContextAddLineToPoint(context, squareFrame.origin.x + squareFrame.width, squareFrame.origin.y)
    CGContextAddLineToPoint(context, squareFrame.origin.x + squareFrame.width, squareFrame.origin.y + squareFrame.size.height)
    CGContextAddLineToPoint(context, squareFrame.origin.x, squareFrame.origin.y + squareFrame.size.height)
    CGContextAddLineToPoint(context, squareFrame.origin.x, squareFrame.origin.y)
    CGContextEOClip(context)

    // Fill everything outside the square with black
    CGContextMoveToPoint(context, pickerFrame.origin.x, pickerFrame.origin.y)
    CGContextSetRGBFillColor(context, 0, 0, 0, 1)
    CGContextFillRect(context, pickerFrame)
    CGContextRestoreGState(context)

    let overlayImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()

    let overlayView = UIImageView(frame: pickerFrame)
    overlayView.image = overlayImage
    imagePicker.cameraOverlayView = overlayView

    self.presentViewController(imagePicker, animated: true, completion: nil)
}
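Since the picker's delegate is set to self, the presenting view controller will also need to adopt UIImagePickerControllerDelegate and UINavigationControllerDelegate. A rough sketch of the delegate callback that receives the photo (not part of the original answer; cropping to the 200x200 window is only hinted at):

func imagePickerController(picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : AnyObject]) {
    if let photo = info[UIImagePickerControllerOriginalImage] as? UIImage {
        // `photo` is the full camera frame; cropping it to the 200x200
        // viewing window is left up to you
        print("Captured image of size \(photo.size)")
    }
    self.dismissViewControllerAnimated(true, completion: nil)
}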