I got this code from another Stack Overflow post, and it seems to do exactly what I want:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Copy the metadata attachments (including EXIF) from the sample buffer
    CFDictionaryRef metadataDict = CMCopyDictionaryOfAttachments(NULL, sampleBuffer, kCMAttachmentMode_ShouldPropagate);
    NSDictionary *metadata = [[NSMutableDictionary alloc] initWithDictionary:(__bridge NSDictionary *)metadataDict];
    CFRelease(metadataDict);

    // Pull the brightness value out of the EXIF dictionary
    NSDictionary *exifMetadata = [[metadata objectForKey:(NSString *)kCGImagePropertyExifDictionary] mutableCopy];
    float brightnessValue = [[exifMetadata objectForKey:(NSString *)kCGImagePropertyExifBrightnessValue] floatValue];
    NSLog(@"AVCapture: %f", brightnessValue);
}

But since I don't know much about AVFoundation, I have no idea how to use it... How do I obtain the AVCaptureOutput, CMSampleBufferRef, and AVCaptureConnection objects?
Or, to put it another way: how do I set up video input with the AVFoundation framework?
Posted on 2015-08-17 23:46:29
The following code should help you set up an AVCaptureSession with a sample buffer delegate:
NSError *error = nil;  // was used below but never declared in the original

AVCaptureSession *session = [[AVCaptureSession alloc] init];

// Use the default camera as the capture input
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (input == nil) {
    NSLog(@"Failed to create device input: %@", error);
    return;
}
[session addInput:input];

// Route video frames to the delegate on a serial background queue
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
[session addOutput:output];

dispatch_queue_t queue = dispatch_queue_create("VideoQueue", NULL);
[output setSampleBufferDelegate:self queue:queue];

NSDictionary *outputSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
[output setVideoSettings:outputSettings];
output.alwaysDiscardsLateVideoFrames = YES;

https://stackoverflow.com/questions/32053460
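To actually receive frames, the object passed as the delegate (`self` above) must conform to AVCaptureVideoDataOutputSampleBufferDelegate, and the session must be started with startRunning. A minimal sketch, assuming the setup code lives in a hypothetical wrapper class (the name BrightnessMonitor is an assumption, not from the original answer):

```objectivec
#import <AVFoundation/AVFoundation.h>
#import <ImageIO/ImageIO.h>  // for kCGImagePropertyExifDictionary and kCGImagePropertyExifBrightnessValue

// Hypothetical class name for illustration only
@interface BrightnessMonitor : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
@property (nonatomic, strong) AVCaptureSession *session;
@end

@implementation BrightnessMonitor

- (void)start {
    // ... run the session/input/output setup shown above, keeping the session in self.session ...

    // Once the session is running, AVFoundation calls
    // captureOutput:didOutputSampleBuffer:fromConnection: (the method from the
    // question) for each captured frame, on the queue given to the output.
    [self.session startRunning];
}

@end
```

Note that startRunning blocks until the session starts (or fails), so Apple recommends calling it off the main thread. On iOS 10 and later you also need an NSCameraUsageDescription entry in Info.plist, or the capture will fail.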