
Detecting luminosity with AVKit

Asked by a Stack Overflow user on 2022-09-04 22:37:22
1 answer · 47 views · 0 followers · score 0

I am building a SwiftUI app that uses the device camera to detect luminosity, as described in the top answer to this post. The luminosity is computed with the `captureOutput(_:didOutput:from:)` function from that top answer. According to Apple's documentation, the purpose of this function is to notify the delegate that a new video frame was written, so I placed it in a `VideoDelegate` class. The delegate is then set up in a `VideoStream` class, which handles the logic of requesting permissions and configuring the `AVCaptureSession`. My question: how do I access the `luminosity` value computed inside the delegate from my SwiftUI view?

struct ContentView: View {
    @StateObject var videoStream = VideoStream()
    
    var body: some View {
        Text(videoStream.luminosityReading > 0 ? "\(videoStream.luminosityReading)" : "Detecting...")
            .padding()
    }
}
class VideoStream: ObservableObject {
    
    @Published var luminosityReading : Double = 0.0 // TODO get luminosity from VideoDelegate
    var session : AVCaptureSession!
    
    init() {
        authorizeCapture()
    }
    
    func authorizeCapture() {
        // permission logic and call to beginCapture()
    }
    
    func beginCapture() {        
        session = AVCaptureSession()
        session.beginConfiguration()
        let videoDevice = bestDevice() // func definition omitted for readability 
        guard
            let videoDeviceInput = try? AVCaptureDeviceInput(device: videoDevice),
            session.canAddInput(videoDeviceInput)
        else {
            print("Camera selection failed")
            return
        }
        
        let videoOutput = AVCaptureVideoDataOutput()
        guard
            session.canAddOutput(videoOutput)
        else {
            print("Error creating video output")
            return
        }
        
        session.sessionPreset = .high
        session.addOutput(videoOutput)
        
        let queue = DispatchQueue(label: "VideoFrameQueue")
        let delegate = VideoDelegate()
        videoOutput.setSampleBufferDelegate(delegate, queue: queue)
        
        session.commitConfiguration()
        session.startRunning()
    }
}
class VideoDelegate: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        // Retrieving EXIF data of the camera frame buffer
        let rawMetadata = CMCopyDictionaryOfAttachments(allocator: nil, target: sampleBuffer, attachmentMode: CMAttachmentMode(kCMAttachmentMode_ShouldPropagate))
        let metadata = CFDictionaryCreateMutableCopy(nil, 0, rawMetadata) as NSMutableDictionary
        let exifData = metadata.value(forKey: "{Exif}") as? NSMutableDictionary
        
        let FNumber : Double = exifData?["FNumber"] as! Double
        let ExposureTime : Double = exifData?["ExposureTime"] as! Double
        let ISOSpeedRatingsArray = exifData!["ISOSpeedRatings"] as? NSArray
        let ISOSpeedRatings : Double = ISOSpeedRatingsArray![0] as! Double
        let CalibrationConstant : Double = 50
        
        //Calculating the luminosity
        let luminosity : Double = (CalibrationConstant * FNumber * FNumber ) / ( ExposureTime * ISOSpeedRatings )
         
        // how to pass value of luminosity to `VideoStream`? 
    }
}
1 Answer

Stack Overflow user

Accepted answer

Posted on 2022-09-05 01:09:53

As discussed in the comments, the lowest-friction option is to have `VideoStream` itself conform to `AVCaptureVideoDataOutputSampleBufferDelegate` and implement the delegate method there.
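A minimal sketch of that approach, assuming the rest of the session setup stays as in the question. Note two details: conforming to the delegate protocol requires inheriting from `NSObject`, and the `@Published` property should be mutated on the main thread since SwiftUI observes it from the view hierarchy.

```swift
import AVFoundation
import Combine

class VideoStream: NSObject, ObservableObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    @Published var luminosityReading: Double = 0.0
    var session: AVCaptureSession!

    // authorizeCapture() and beginCapture() stay as in the question, except
    // the output's delegate is now `self`:
    //     videoOutput.setSampleBufferDelegate(self, queue: queue)

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Read the EXIF attachment of the frame, as in the original delegate,
        // but with guards instead of force unwraps so a frame without EXIF
        // data is simply skipped.
        guard
            let rawMetadata = CMCopyDictionaryOfAttachments(
                allocator: nil,
                target: sampleBuffer,
                attachmentMode: CMAttachmentMode(kCMAttachmentMode_ShouldPropagate)),
            let exifData = (rawMetadata as NSDictionary).value(forKey: "{Exif}") as? NSDictionary,
            let fNumber = exifData["FNumber"] as? Double,
            let exposureTime = exifData["ExposureTime"] as? Double,
            let isoRatings = exifData["ISOSpeedRatings"] as? [Double],
            let iso = isoRatings.first
        else { return }

        let calibrationConstant = 50.0
        let luminosity = (calibrationConstant * fNumber * fNumber) / (exposureTime * iso)

        // This callback runs on the capture queue; hop to the main thread
        // before mutating the @Published property that SwiftUI observes.
        DispatchQueue.main.async {
            self.luminosityReading = luminosity
        }
    }
}
```

With this change the separate `VideoDelegate` class is no longer needed, and `ContentView` can read `videoStream.luminosityReading` directly, exactly as in the question's `ContentView`.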

Score: 1
Original page content provided by Stack Overflow; translation by Tencent Cloud's IT-domain engine.
Original link:

https://stackoverflow.com/questions/73603233
