Abstract: This is planned as the first in a series of articles on iOS audio and video development, starting with an introduction to iOS video capture; later posts will keep expanding on each topic.
An AVCaptureSession object manages and coordinates the flow of data between input sources and output sources, and the captured picture is finally displayed through an AVCaptureVideoPreviewLayer. The main pipeline is as follows:

AVCaptureDevice: represents a hardware device, such as the microphone or a camera.

AVCaptureInput: wraps the data captured from an AVCaptureDevice. It is an abstract class, so a concrete subclass such as AVCaptureDeviceInput must be used. Other input classes exist as well; consult Apple's documentation to choose one for your scenario.

AVCaptureOutput: likewise an abstract class. Commonly used concrete subclasses include AVCaptureMovieFileOutput, AVCaptureVideoDataOutput, AVCaptureAudioDataOutput, and AVCaptureStillImageOutput; choose according to your needs.

AVCaptureConnection: maintains the link between an input and an output, and is used by AVCaptureSession.

AVCaptureSession: manages and coordinates the input and output streams; the capture resolution can be configured on it.

AVCaptureVideoPreviewLayer: provides the preview; simply add the AVCaptureVideoPreviewLayer to the target view's layer.
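The preview layer described above can be wired up roughly as follows. This is a sketch: `self.previewView` is an assumed placeholder view, and `avCaptureSession` is the session created in the setup code below.

```objectivec
// Create a preview layer backed by the capture session
AVCaptureVideoPreviewLayer *previewLayer =
    [AVCaptureVideoPreviewLayer layerWithSession:avCaptureSession];
// Fill the target view while preserving the video's aspect ratio
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
previewLayer.frame = self.previewView.bounds;
// Add the preview layer to the target view's layer
[self.previewView.layer addSublayer:previewLayer];
```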

```objectivec
// Set up the capture session and configure the resolution
- (void)setupSession {
    // In a real implementation, keep the session in a property
    // so that the later configuration code can reach it
    AVCaptureSession *avCaptureSession = [[AVCaptureSession alloc] init];
    // Configure the output resolution
    avCaptureSession.sessionPreset = AVCaptureSessionPreset1280x720;
}
```

For the available resolution values, see the session-preset constants in Apple's documentation.

```objectivec
// Get the front camera
AVCaptureDevice *captureDevice = nil;
NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
for (AVCaptureDevice *device in devices) {
    if (device.position == AVCaptureDevicePositionFront) {
        captureDevice = device;
    }
}

// Create the input and add it to the session
AVCaptureDeviceInput *videoInput =
    [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:nil];
// Set the frame rate: a minimum frame duration of 1/16 s caps capture at 16 fps.
// The device must be locked before its configuration is changed.
if ([captureDevice lockForConfiguration:nil]) {
    captureDevice.activeVideoMinFrameDuration = CMTimeMake(1, 16);
    [captureDevice unlockForConfiguration];
}
[avCaptureSession addInput:videoInput];

// Create the video data output
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
// Deliver NV12 (4:2:0 bi-planar, full-range) pixel buffers
videoOutput.videoSettings = @{(NSString *)kCVPixelBufferPixelFormatTypeKey :
                                  @(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)};
// Do not discard frames that arrive late
videoOutput.alwaysDiscardsLateVideoFrames = NO;
// Create a serial queue on which frames are delivered
dispatch_queue_t queue = dispatch_queue_create("captureSerialQueue", DISPATCH_QUEUE_SERIAL);
[videoOutput setSampleBufferDelegate:self queue:queue];
[avCaptureSession addOutput:videoOutput];

// Configure the connection between input and output
AVCaptureConnection *captureConnection = [videoOutput connectionWithMediaType:AVMediaTypeVideo];
captureConnection.videoOrientation = AVCaptureVideoOrientationPortrait;
captureConnection.videoMirrored = YES;

// Start capturing
[avCaptureSession startRunning];
// ...and stop when done
[avCaptureSession stopRunning];
```

The captured video data is delivered through the sample-buffer delegate callback:
```objectivec
// Video capture data callback
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Extract the raw pixel data as NSData
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    void *src_buff = CVPixelBufferGetBaseAddress(imageBuffer);
    NSData *data = [NSData dataWithBytes:src_buff length:bytesPerRow * height];
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // Or convert the buffer directly into a UIImage
    CIImage *ciImage = [CIImage imageWithCVImageBuffer:imageBuffer];
    UIImage *image = [UIImage imageWithCIImage:ciImage];
    dispatch_async(dispatch_get_main_queue(), ^{
        _captureImageView.image = image;
    });
}
```
Original-content statement: This article is published on the Tencent Cloud Developer Community with the author's authorization and may not be reproduced without permission.
In case of infringement, please contact cloudcommunity@tencent.com for removal.