
Capturing an image from an AVCaptureSession in Swift

Stack Overflow user
Asked 2015-04-28 21:58:49
1 answer · 9.2K views · 0 followers · 7 votes

I've created an AVCaptureSession to capture video output and display it to the user via a UIView. Now I want to be able to tap a button (the takePhoto method) and display a still image from the session in a UIImageView. I tried iterating over each device connection and saving the output, but it doesn't work. Here is my code:

let captureSession = AVCaptureSession()
var stillImageOutput: AVCaptureStillImageOutput!

@IBOutlet var imageView: UIImageView!
@IBOutlet var cameraView: UIView!


// If we find a device we'll store it here for later use
var captureDevice : AVCaptureDevice?

override func viewDidLoad() {
    // Do any additional setup after loading the view, typically from a nib.
    super.viewDidLoad()
    println("I AM AT THE CAMERA")
    captureSession.sessionPreset = AVCaptureSessionPresetLow
    self.captureDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
    if(captureDevice != nil){
        beginSession()
    }
}
    func beginSession() {

    self.stillImageOutput = AVCaptureStillImageOutput()
    self.captureSession.addOutput(self.stillImageOutput)
    var err : NSError? = nil
    self.captureSession.addInput(AVCaptureDeviceInput(device: self.captureDevice, error: &err))

    if err != nil {
        println("error: \(err?.localizedDescription)")
    }

    var previewLayer = AVCaptureVideoPreviewLayer(session: self.captureSession)
    self.cameraView.layer.addSublayer(previewLayer)
    previewLayer?.frame = self.cameraView.layer.frame
    captureSession.startRunning()
}

@IBAction func takePhoto(sender: UIButton) {
    self.stillImageOutput.captureStillImageAsynchronouslyFromConnection(self.stillImageOutput.connectionWithMediaType(AVMediaTypeVideo)) { (buffer:CMSampleBuffer!, error:NSError!) -> Void in
        var image = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(buffer)
        var data_image = UIImage(data: image)
        self.imageView.image = data_image
    }
}
}
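For reference, `AVCaptureStillImageOutput` was deprecated in iOS 10 in favor of `AVCapturePhotoOutput`. A minimal sketch of the same flow in current Swift follows; the `imageView`/`cameraView` outlets mirror the ones above, and this is illustrative rather than a drop-in replacement (error handling and permission checks are elided):

```swift
import AVFoundation
import UIKit

class CameraViewController: UIViewController, AVCapturePhotoCaptureDelegate {
    let captureSession = AVCaptureSession()
    let photoOutput = AVCapturePhotoOutput()
    @IBOutlet var imageView: UIImageView!
    @IBOutlet var cameraView: UIView!

    override func viewDidLoad() {
        super.viewDidLoad()
        captureSession.sessionPreset = .photo
        // Bail out quietly if there is no camera or the graph can't be built.
        guard let device = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: device),
              captureSession.canAddInput(input),
              captureSession.canAddOutput(photoOutput) else { return }
        captureSession.addInput(input)
        captureSession.addOutput(photoOutput)

        let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        previewLayer.frame = cameraView.layer.bounds
        previewLayer.videoGravity = .resizeAspectFill
        cameraView.layer.addSublayer(previewLayer)
        captureSession.startRunning()
    }

    @IBAction func takePhoto(_ sender: UIButton) {
        photoOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
    }

    // Delegate callback: convert the captured photo to a UIImage and show it.
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil, let data = photo.fileDataRepresentation() else { return }
        imageView.image = UIImage(data: data)
    }
}
```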

1 Answer

Stack Overflow user

Accepted answer

Answered 2015-04-29 03:50:46

Before anything else, you should try adding the inputs and outputs to the session on a separate thread. In Apple's documentation they say:

Important: The startRunning method is a blocking call which can take some time, therefore you should perform session setup on a serial queue so that the main queue isn't blocked (which keeps the UI responsive). See AVCam for iOS for the canonical implementation example.

Try using dispatch in your setup method, like so:

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), { // 1
        var err : NSError? = nil
        self.captureSession.addOutput(self.stillImageOutput)
        self.captureSession.addInput(AVCaptureDeviceInput(device: self.captureDevice, error: &err))
        self.captureSession.sessionPreset = AVCaptureSessionPresetPhoto
        if err != nil {
            println("error: \(err?.localizedDescription)")
        }
        var previewLayer = AVCaptureVideoPreviewLayer(session: self.captureSession)
        previewLayer?.frame = self.cameraView.layer.bounds
        previewLayer?.videoGravity = AVLayerVideoGravityResizeAspectFill
        dispatch_async(dispatch_get_main_queue(), { // 2
                    // 3
            self.cameraView.layer.addSublayer(previewLayer)
            self.captureSession.startRunning()
            });
        });
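On current Swift, the same off-main-queue pattern is written with `DispatchQueue`. A hedged sketch, assuming a dedicated serial queue as in Apple's AVCam sample (the queue label and function name are illustrative):

```swift
import AVFoundation

// A private serial queue keeps session setup and startRunning() off the
// main queue, so the UI stays responsive.
let sessionQueue = DispatchQueue(label: "com.example.camera.session")

func configureAndStart(session: AVCaptureSession,
                       device: AVCaptureDevice,
                       output: AVCapturePhotoOutput,
                       previewLayer: AVCaptureVideoPreviewLayer) {
    sessionQueue.async {
        session.beginConfiguration()
        if let input = try? AVCaptureDeviceInput(device: device),
           session.canAddInput(input) {
            session.addInput(input)
        }
        if session.canAddOutput(output) {
            session.addOutput(output)
        }
        session.sessionPreset = .photo
        session.commitConfiguration()
        // startRunning() is the blocking call, so it stays on this queue...
        session.startRunning()
        DispatchQueue.main.async {
            // ...while layer/UI work hops back to the main queue.
            previewLayer.videoGravity = .resizeAspectFill
        }
    }
}
```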
5 votes
Original page content provided by Stack Overflow.
Original link: https://stackoverflow.com/questions/29930699