I'm developing a movie-maker app that applies effects to imported videos. I'm writing the app's code with AVAssetWriter. Everything works, but I have a big memory problem: the app consumes more than 500 MB of RAM during the buffering process. In short, the video-filtering algorithm looks like this:
1 - Import the video.
2 - Extract all of the video's frames as CMSampleBuffer objects.
3 - Convert each CMSampleBuffer into a UIImage.
4 - Apply the filter to the UIImage.
5 - Convert the UIImage back into a new CMSampleBuffer object.
6 - Append the new buffer to the writer input.
7 - Finally, save the new movie to the photo gallery.
The problem is in step 5: I have a function that converts a UIImage into a CVPixelBuffer object and returns it; the CVPixelBuffer is then converted into a CMSampleBuffer. This function keeps growing the memory footprint, and the app eventually crashes.
Here is my code:
- (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image andSize:(CGSize)size
{
    double height = CGImageGetHeight(image);
    double width = CGImageGetWidth(image);
    NSDictionary *options = @{(id)kCVPixelBufferCGImageCompatibilityKey : @YES,
                              (id)kCVPixelBufferCGBitmapContextCompatibilityKey : @YES};
    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width, size.height,
                                          kCVPixelFormatType_32ARGB,
                                          (__bridge CFDictionaryRef)options, &pxbuffer);
    if (status != kCVReturnSuccess) {
        return NULL;
    }
    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    // Use the buffer's actual row stride, not 4 * width: Core Video may pad rows.
    CGContextRef context = CGBitmapContextCreate(pxdata, size.width, size.height, 8,
                                                 CVPixelBufferGetBytesPerRow(pxbuffer),
                                                 rgbColorSpace, kCGImageAlphaNoneSkipFirst);
    // Center the image vertically if it is shorter than the target size.
    CGFloat Y = (height == size.height) ? 0 : (size.height / 2) - (height / 2);
    // Set the interpolation quality before drawing; setting it afterwards has no effect.
    CGContextSetInterpolationQuality(context, kCGInterpolationHigh);
    CGContextDrawImage(context, CGRectMake(0, Y, width, height), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);
    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
    return pxbuffer;
}
CGContextDrawImage grows memory by 2-5 MB on every frame conversion.
I have tried the following solutions:
1 - Releasing the pxbuffer buffer with CFRelease.
2 - Releasing the image reference with CGImageRelease.
3 - Wrapping the code in an @autoreleasepool block.
4 - Calling CGContextRelease.
5 - Calling UIGraphicsEndImageContext.
6 - Running Analyze in Xcode and fixing every issue it reported.
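Separately from any leak, per-frame CVPixelBufferCreate calls are expensive. A common alternative (a sketch, not the poster's code; it assumes `adaptor` is the AVAssetWriterInputPixelBufferAdaptor from the code below and that it was created with non-nil sourcePixelBufferAttributes, so its pool exists once writing has started) is to draw into buffers recycled from the adaptor's pixel buffer pool:

```objc
// Sketch: reuse pixel buffers from the adaptor's pool instead of
// allocating a fresh CVPixelBuffer for every frame.
CVPixelBufferRef pxbuffer = NULL;
CVReturn status = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                                     adaptor.pixelBufferPool,
                                                     &pxbuffer);
if (status != kCVReturnSuccess || pxbuffer == NULL) {
    return NULL;
}
// ...lock the base address, draw the frame into it, unlock,
// append via the adaptor, then CFRelease(pxbuffer) as before.
```

Pooled buffers are returned to the pool when released, which keeps the steady-state allocation flat instead of growing with the frame count.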
Here is the complete video-filtering code:
- (void)assetFilteringMethod:(FilterType)filterType AndAssetURL:(NSURL *)assetURL {
    CMSampleBufferRef sbuff;
    [areader addOutput:rout];
    [areader startReading];
    UIImage *bufferedImage;
    while ([areader status] != AVAssetReaderStatusCompleted) {
        sbuff = [rout copyNextSampleBuffer];
        if (sbuff == nil) {
            [areader cancelReading];
        } else {
            if (writerInput.readyForMoreMediaData) {
                @autoreleasepool {
                    bufferedImage = [self imageFromSampleBuffer:sbuff];
                    bufferedImage = [FrameFilterClass convertImageToFilterWithFilterType:filterType
                                                                                andImage:bufferedImage];
                    CVPixelBufferRef buffer = [self pixelBufferFromCGImage:[bufferedImage CGImage]
                                                                   andSize:CGSizeMake(320, 240)];
                    [adaptor appendPixelBuffer:buffer
                          withPresentationTime:CMSampleBufferGetPresentationTimeStamp(sbuff)];
                    CFRelease(buffer);
                    CFRelease(sbuff);
                }
            }
        }
    }
    // Finished buffering
    [videoWriter finishWritingWithCompletionHandler:^{
        if (videoWriter.status == AVAssetWriterStatusCompleted) {
            dispatch_async(dispatch_get_main_queue(), ^{
                ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
                if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:[NSURL fileURLWithPath:moviePath]]) {
                    [library writeVideoAtPathToSavedPhotosAlbum:[NSURL fileURLWithPath:moviePath]
                                                completionBlock:^(NSURL *assetURL, NSError *error) {
                    }];
                }
            });
        } else {
            NSLog(@"Video writing failed: %@", videoWriter.error);
        }
    }];
}
I have spent three or four days trying to solve this problem. Any help would be greatly appreciated.
Posted on 2016-06-22 09:00:46
You have to release the image with the following line:

CGImageRelease(image.CGImage);

https://stackoverflow.com/questions/37884663
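Applied to the question's loop, the release would go right after appending the frame (a sketch using the variables from the code above; note this assumes imageFromSampleBuffer hands back a UIImage whose CGImage carries an unbalanced +1 retain, which is the usual cause of this leak pattern -- otherwise the extra release would over-release):

```objc
@autoreleasepool {
    bufferedImage = [self imageFromSampleBuffer:sbuff];
    bufferedImage = [FrameFilterClass convertImageToFilterWithFilterType:filterType
                                                                andImage:bufferedImage];
    CVPixelBufferRef buffer = [self pixelBufferFromCGImage:[bufferedImage CGImage]
                                                   andSize:CGSizeMake(320, 240)];
    [adaptor appendPixelBuffer:buffer
          withPresentationTime:CMSampleBufferGetPresentationTimeStamp(sbuff)];
    CGImageRelease(bufferedImage.CGImage);  // balance the CGImage created per frame
    CFRelease(buffer);
    CFRelease(sbuff);
}
```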