I'm writing a long-exposure photography app.
I use func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) to get a CMSampleBuffer and apply a CIFilter using CILightenBlendMode.
The problem is that the blend takes too long and causes frames to drop. I tried copying the buffer:
var copiedBuffer: CMSampleBuffer?
CMSampleBufferCreateCopy(nil, sampleBuffer, &copiedBuffer)
blendImages(copiedBuffer!)

But that didn't help; frames are still being dropped.
Full code:
func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    if CameraService.longExposureRunning {
        var copiedBuffer: CMSampleBuffer?
        CMSampleBufferCreateCopy(nil, sampleBuffer, &copiedBuffer)
        blendImages(copiedBuffer!)
    }
}

func captureOutput(captureOutput: AVCaptureOutput!, didDropSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    print("Dropped")
}
func blendImages(buffer: CMSampleBuffer) {
    let priority = DISPATCH_QUEUE_PRIORITY_DEFAULT
    dispatch_async(dispatch_get_global_queue(priority, 0)) {
        let pixelBuffer = CMSampleBufferGetImageBuffer(buffer)
        let cameraImage = CIImage(CVPixelBuffer: pixelBuffer!)
        if let backgroundImage = self.lastImage {
            let blendEffect = CIFilter(name: "CILightenBlendMode")
            blendEffect?.setValue(backgroundImage, forKey: kCIInputBackgroundImageKey)
            blendEffect?.setValue(cameraImage, forKey: kCIInputImageKey)
            self.lastImage = blendEffect?.outputImage
            print("Blending")
        } else {
            self.lastImage = cameraImage
        }
        let filteredImage = UIImage(CIImage: self.lastImage!)
        dispatch_async(dispatch_get_main_queue()) {
            self.imageView.image = filteredImage
        }
    }
}

Posted on 2016-04-20 18:00:24
I suspect Core Image is concatenating all of your frames together into one enormous kernel. You may find that CIImageAccumulator helps you, but I was able to make your code work by forcing Core Image to render the chain and start afresh with each frame.
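For reference, the CIImageAccumulator route could look roughly like this. This is a minimal sketch, not tested code from the answer: the fixed extent and the accumulateFrame helper name are assumptions you would adapt to your camera format.

let accumulator = CIImageAccumulator(
    extent: CGRect(x: 0, y: 0, width: 1280, height: 720),
    format: kCIFormatARGB8)

func accumulateFrame(cameraImage: CIImage) {
    let blendEffect = CIFilter(name: "CILightenBlendMode")!
    // The accumulator's current contents act as the background image.
    blendEffect.setValue(accumulator.image(), forKey: kCIInputBackgroundImageKey)
    blendEffect.setValue(cameraImage, forKey: kCIInputImageKey)
    // setImage renders the blend into the accumulator's buffer,
    // so the filter chain stays one blend deep per frame.
    accumulator.setImage(blendEffect.outputImage!)
}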
I changed the type of the lastImage variable to an optional UIImage, and added a constant named context, which is a CIContext. With those in place, this works nicely:
Use let context: CIContext = CIContext(options: [kCIContextUseSoftwareRenderer: false]) for GPU rather than CPU rendering.
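As a sketch, the two declarations would sit alongside the function below like this (the lastImage type is inferred from the description above):

// Each frame is rendered to a UIImage, so no CIImage chain accumulates.
var lastImage: UIImage?
// Software rendering disabled: Core Image renders on the GPU.
let context: CIContext = CIContext(options: [kCIContextUseSoftwareRenderer: false])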
func blendImages(buffer: CMSampleBuffer) {
    let priority = DISPATCH_QUEUE_PRIORITY_DEFAULT
    dispatch_async(dispatch_get_global_queue(priority, 0)) {
        let pixelBuffer = CMSampleBufferGetImageBuffer(buffer)
        let cameraImage = CIImage(CVPixelBuffer: pixelBuffer!)
        if let backgroundImage = self.lastImage {
            let blendEffect = CIFilter(name: "CILightenBlendMode")!
            blendEffect.setValue(CIImage(image: backgroundImage), forKey: kCIInputBackgroundImageKey)
            blendEffect.setValue(cameraImage, forKey: kCIInputImageKey)
            // createCGImage forces Core Image to render the chain now,
            // so the filter graph does not keep growing from frame to frame.
            let imageRef = self.context.createCGImage(
                blendEffect.outputImage!,
                fromRect: blendEffect.outputImage!.extent)
            self.lastImage = UIImage(CGImage: imageRef)
            print("Blending")
        } else {
            let imageRef = self.context.createCGImage(
                cameraImage,
                fromRect: cameraImage.extent)
            self.lastImage = UIImage(CGImage: imageRef)
        }
        let filteredImage = self.lastImage
        dispatch_async(dispatch_get_main_queue()) {
            self.imageView.image = filteredImage
        }
    }
}

Groovy effect!
Simon
Posted on 2016-04-20 16:29:55
The most obvious thing I can think of to check is how you are setting up your output.
Make sure you set expectsMediaDataInRealTime to true on your AVAssetWriterInput.
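For context, that setup might look like the following sketch (the codec and dimensions are placeholder assumptions, not values from the question):

let writerInput = AVAssetWriterInput(
    mediaType: AVMediaTypeVideo,
    outputSettings: [
        AVVideoCodecKey: AVVideoCodecH264,
        AVVideoWidthKey: 1920,
        AVVideoHeightKey: 1080
    ])
// Without this the writer can stall the capture pipeline and cause dropped frames.
writerInput.expectsMediaDataInRealTime = true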
https://stackoverflow.com/questions/36739653