Capturing video with AVCaptureSession — no visible output when drawing via EAGLContext
Stack Overflow user
Asked on 2014-03-25 17:32:22
1 answer · 1K views · 0 followers · Score: 1

I'm capturing live video with AVCaptureSession from the iPhone's back camera, applying some filters with Core Image, and then trying to display the final video with OpenGL ES. Most of the code comes from a sample in the WWDC 2012 session "Core Image Techniques".

Displaying the output of the filter chain with [UIImage imageWithCIImage:], or by creating a CGImageRef for each frame, works fine. However, when I try to display it with OpenGL, all I get is a black screen.
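For reference, the working CGImageRef-per-frame path looks roughly like this (a sketch; `cpuContext`, `filtered`, and `imageView` are assumed names for a CPU-backed CIContext, the filter chain's output, and a UIImageView — none of them appear in the poster's code):

```
// Sketch: render one filtered frame through a (CPU-backed) CIContext,
// wrap it in a UIImage, and hand it to a UIImageView.
CIImage *output = filtered.outputImage;
CGImageRef cgImage = [cpuContext createCGImage:output fromRect:[output extent]];
imageView.image = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);  // createCGImage: returns a +1 reference
```

This is much slower than drawing into the EAGLContext, since every frame takes a round trip through a CGImage, which is why the poster wants the OpenGL ES path working.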

In the session they use a custom view class to display the output, but the code for that class isn't available. My view controller class extends GLKViewController, and its view's class is set to GLKView.

I've searched for and downloaded every GLKit tutorial and sample I could find, but nothing has helped. In particular, I couldn't get any video output when I tried running the sample from here either. Can anyone point me in the right direction?

#import "VideoViewController.h"

@interface VideoViewController ()
{
    AVCaptureSession *_session;

    EAGLContext *_eaglContext;
    CIContext *_ciContext;

    CIFilter *_sepia;
    CIFilter *_bumpDistortion;
}

- (void)setupCamera;
- (void)setupFilters;

@end

@implementation VideoViewController

- (void)viewDidLoad
{
    [super viewDidLoad];

    GLKView *view = (GLKView *)self.view;

    _eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES3];
    [EAGLContext setCurrentContext:_eaglContext];

    view.context = _eaglContext;

    // Configure renderbuffers created by the view
    view.drawableColorFormat = GLKViewDrawableColorFormatRGBA8888;
    view.drawableDepthFormat = GLKViewDrawableDepthFormat24;
    view.drawableStencilFormat = GLKViewDrawableStencilFormat8;

    [self setupCamera];
    [self setupFilters];
}

- (void)setupCamera {
    _session = [AVCaptureSession new];
    [_session beginConfiguration];

    [_session setSessionPreset:AVCaptureSessionPreset640x480];

    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
    [_session addInput:input];

    AVCaptureVideoDataOutput *dataOutput = [AVCaptureVideoDataOutput new];
    [dataOutput setAlwaysDiscardsLateVideoFrames:YES];

    NSDictionary *options = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange) };

    [dataOutput setVideoSettings:options];

    [dataOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];

    [_session addOutput:dataOutput];
    [_session commitConfiguration];
}

#pragma mark Setup Filters
- (void)setupFilters {
    _sepia = [CIFilter filterWithName:@"CISepiaTone"];
    [_sepia setValue:@0.7 forKey:@"inputIntensity"];

    _bumpDistortion = [CIFilter filterWithName:@"CIBumpDistortion"];
    [_bumpDistortion setValue:[CIVector vectorWithX:240 Y:320] forKey:@"inputCenter"];
    [_bumpDistortion setValue:[NSNumber numberWithFloat:200] forKey:@"inputRadius"];
    [_bumpDistortion setValue:[NSNumber numberWithFloat:3.0] forKey:@"inputScale"];
}

#pragma mark Main Loop
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    // Grab the pixel buffer
    CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)CMSampleBufferGetImageBuffer(sampleBuffer);

    // null colorspace to avoid colormatching
    NSDictionary *options = @{ (id)kCIImageColorSpace : (id)kCFNull };
    CIImage *image = [CIImage imageWithCVPixelBuffer:pixelBuffer options:options];

    image = [image imageByApplyingTransform:CGAffineTransformMakeRotation(-M_PI/2.0)];
    CGPoint origin = [image extent].origin;
    image = [image imageByApplyingTransform:CGAffineTransformMakeTranslation(-origin.x, -origin.y)];

    // Pass it through the filter chain
    [_sepia setValue:image forKey:@"inputImage"];
    [_bumpDistortion setValue:_sepia.outputImage forKey:@"inputImage"];

    // Grab the final output image
    image = _bumpDistortion.outputImage;

    // draw to GLES context
    [_ciContext drawImage:image inRect:CGRectMake(0, 0, 480, 640) fromRect:[image extent]];

    // and present to screen
    [_eaglContext presentRenderbuffer:GL_RENDERBUFFER];

    NSLog(@"frame hatched");

    [_sepia setValue:nil forKey:@"inputImage"];
}

- (void)loadView {
    [super loadView];

    // Initialize the CIContext with a null working space
    NSDictionary *options = @{ (id)kCIContextWorkingColorSpace : (id)kCFNull };
    _ciContext = [CIContext contextWithEAGLContext:_eaglContext options:options];
}

- (void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];

    [_session startRunning];
}

- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

@end

1 Answer

Stack Overflow user

Accepted answer

Posted on 2014-03-26 10:25:35

Wow, I actually figured it out myself. (This line of work suits me after all ;)

First, for whatever reason, this code only works with OpenGL ES 2, not 3.

Second, I was setting up the CIContext in the loadView method, which apparently runs before viewDidLoad, so it was using a not-yet-initialized EAGLContext.
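Putting both fixes together, the initialization might look like this (a sketch based on the answer, not the poster's exact final code): request an ES 2 context, and build the CIContext only after the EAGLContext exists, e.g. at the end of viewDidLoad rather than in loadView:

```
- (void)viewDidLoad
{
    [super viewDidLoad];

    GLKView *view = (GLKView *)self.view;

    // Fix 1: request an OpenGL ES 2 context instead of ES 3
    _eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
    [EAGLContext setCurrentContext:_eaglContext];
    view.context = _eaglContext;

    view.drawableColorFormat = GLKViewDrawableColorFormatRGBA8888;

    // Fix 2: create the CIContext here, after _eaglContext is initialized,
    // instead of in loadView (which runs before viewDidLoad)
    NSDictionary *options = @{ (id)kCIContextWorkingColorSpace : (id)kCFNull };
    _ciContext = [CIContext contextWithEAGLContext:_eaglContext options:options];

    [self setupCamera];
    [self setupFilters];
}
```

With this ordering, the loadView override (and the nil-context CIContext it created) can be deleted entirely.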

Score: 2

Original page content provided by Stack Overflow; translation supported by Tencent Cloud's IT-domain engine.
Source:

https://stackoverflow.com/questions/22642140
