What I'm doing:
I pull the CMSampleBuffer out of didOutputSampleBuffer in AVFoundation, run a few filters every time the delegate emits a buffer, and output the result as a UIImage.
What works:
All of the filters work correctly and give me the output I want. On a newer phone (iPhone 6/6s/7) everything runs smoothly; on an iPhone 5s, however, it freezes after a few seconds.
Filters & UIImage output:
let inputImage = self.bufferImage!
let filter = CIFilter(name: "CIPixellate")
let beginImage = inputImage
filter!.setValue(beginImage, forKey: kCIInputImageKey)
let filter3 = CIFilter(name: "CIColorMonochrome")
filter3!.setValue(filter!.outputImage, forKey: kCIInputImageKey)
filter3!.setValue(CIColor(red: 1, green:0, blue: 0), forKey: kCIInputColorKey)
filter3!.setValue(200.0, forKey: kCIInputIntensityKey)
let filter2 = CIFilter(name: "CIMultiplyBlendMode")
filter2!.setValue(filter3!.outputImage, forKey: kCIInputImageKey)
filter2!.setValue(inputImage, forKey: kCIInputBackgroundImageKey)
let output2 = filter2!.outputImage
let cgimg = self.context.createCGImage(output2!, fromRect: output2!.extent)
let newImage = UIImage(CGImage: cgimg!)
dispatch_sync(dispatch_get_main_queue()) {
self.imageView?.image = newImage
}
self.context.clearCaches()
I create my CIContext as:
let context = CIContext(options: nil)
I also tried forcing the CIContext to render in software, and vice versa in hardware.
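For reference, forcing the software or hardware render path looks roughly like the following (a sketch in the same Swift 2 style as the question's code; the poster's exact context setup is not shown):

```swift
import CoreImage
import OpenGLES

// Sketch: force CPU (software) rendering via the documented
// kCIContextUseSoftwareRenderer option.
let softwareContext = CIContext(options: [kCIContextUseSoftwareRenderer: true])

// Sketch: GPU-backed context created from an EAGL context instead
// (the hardware path).
let eaglContext = EAGLContext(API: .OpenGLES2)
let gpuContext = CIContext(EAGLContext: eaglContext)
```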
My feeling is that it is running out of memory, or leaking, or something similar. However, when it freezes there is no error in Xcode, just the app stuck in a frozen state. I added self.context.clearCaches() at the end, but it did not really change the underlying problem.
This only happens on the slower device, the 5s in this case; it runs smoothly without any issues on the 6/6s/7.
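One common mitigation on slower hardware (an assumption; nothing in the question confirms the output configuration) is to let the capture output drop frames that arrive while a previous frame is still being filtered, instead of letting them queue up:

```swift
import AVFoundation

// Sketch: configure AVCaptureVideoDataOutput so a slow filter chain
// on older devices cannot back up the sample-buffer queue.
let videoOutput = AVCaptureVideoDataOutput()
videoOutput.alwaysDiscardsLateVideoFrames = true  // drop late frames rather than queueing them

// Deliver sample buffers on a dedicated serial queue, not the main queue.
let bufferQueue = dispatch_queue_create("video.filter.queue", DISPATCH_QUEUE_SERIAL)
videoOutput.setSampleBufferDelegate(self, queue: bufferQueue)
```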
My full didOutputSampleBuffer, for reference:
func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
connection.videoOrientation = .Portrait
let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
CVPixelBufferLockBaseAddress(imageBuffer!, CVPixelBufferLockFlags(rawValue: CVOptionFlags(0)))
let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer!)
let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer!)
let width = CVPixelBufferGetWidth(imageBuffer!)
let height = CVPixelBufferGetHeight(imageBuffer!)
let colorSpace = CGColorSpaceCreateDeviceRGB()
let bitmap = CGBitmapInfo(rawValue: CGBitmapInfo.ByteOrder32Little.rawValue|CGImageAlphaInfo.PremultipliedFirst.rawValue)
let context = CGBitmapContextCreate(baseAddress, width, height, 8,
bytesPerRow, colorSpace, bitmap.rawValue)
let quartzImage = CGBitmapContextCreateImage(context!)
CVPixelBufferUnlockBaseAddress(imageBuffer!,CVPixelBufferLockFlags(rawValue: CVOptionFlags(0)))
self.bufferImage = CIImage(CGImage: quartzImage!)
let inputImage = self.bufferImage!
let filter = CIFilter(name: "CIPixellate")
let beginImage = inputImage
filter!.setValue(beginImage, forKey: kCIInputImageKey)
let filter3 = CIFilter(name: "CIColorMonochrome")
filter3!.setValue(filter!.outputImage, forKey: kCIInputImageKey)
filter3!.setValue(CIColor(red: 1, green:0, blue: 0), forKey: kCIInputColorKey)
filter3!.setValue(200.0, forKey: kCIInputIntensityKey)
let filter2 = CIFilter(name: "CIMultiplyBlendMode")
filter2!.setValue(filter3!.outputImage, forKey: kCIInputImageKey)
filter2!.setValue(inputImage, forKey: kCIInputBackgroundImageKey)
let output2 = filter2!.outputImage
let cgimg = self.context.createCGImage(output2!, fromRect: output2!.extent)
let newImage = UIImage(CGImage: cgimg!)
dispatch_async(dispatch_get_main_queue()) {
self.imageView?.image = newImage
}
self.context.clearCaches()
}
Update
I was able to fix the freezing by changing how I turn the pixel buffer into a CIImage:
let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
self.bufferImage = CIImage(CVPixelBuffer: pixelBuffer)
That took out most of the code at the start of didOutputSampleBuffer.
However, the CPU usage is now very high! Xcode shows the Energy Impact gauge as High!
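One likely contributor to the CPU load (an assumption; the question does not confirm it) is that all three CIFilter objects and the filter graph are rebuilt on every frame. A cheap thing to try is creating the filters once and only swapping the input image per frame, keeping the same filter names and parameters as the code above:

```swift
import CoreImage

// Sketch: build the filter chain once, reuse it for every frame.
let pixellate = CIFilter(name: "CIPixellate")!
let monochrome = CIFilter(name: "CIColorMonochrome")!
let blend = CIFilter(name: "CIMultiplyBlendMode")!

func filteredImage(inputImage: CIImage) -> CIImage? {
    // Only the per-frame input changes; the filter objects are cached.
    pixellate.setValue(inputImage, forKey: kCIInputImageKey)
    monochrome.setValue(pixellate.outputImage, forKey: kCIInputImageKey)
    monochrome.setValue(CIColor(red: 1, green: 0, blue: 0), forKey: kCIInputColorKey)
    monochrome.setValue(200.0, forKey: kCIInputIntensityKey)
    blend.setValue(monochrome.outputImage, forKey: kCIInputImageKey)
    blend.setValue(inputImage, forKey: kCIInputBackgroundImageKey)
    return blend.outputImage
}
```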
Posted on 2016-10-30 20:08:15
You write:
dispatch_sync(dispatch_get_main_queue()) {
self.imageView?.image = newImage
}
You have no reason to wait for the result of this call. Use dispatch_async instead.
(Better yet: determine whether you are already on the main thread. If so, don't use dispatch at all.)
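That parenthetical suggestion can be sketched like this (Swift 2 style to match the question's code; the helper name is hypothetical):

```swift
import UIKit

// Sketch: only hop queues when not already on the main thread.
func updateImage(newImage: UIImage) {
    if NSThread.isMainThread() {
        self.imageView?.image = newImage
    } else {
        dispatch_async(dispatch_get_main_queue()) {
            self.imageView?.image = newImage
        }
    }
}
```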
https://stackoverflow.com/questions/40332403