
CVOpenGLESTextureCacheCreateTextureFromImage fails to create IOSurface

Stack Overflow user
Asked on 2012-10-01 14:50:55
2 answers · 4.3K views · 0 followers · 17 votes

In my current project I am reading the output of the iPhone's main camera. I then convert the pixel buffer into a cached OpenGL texture through the method CVOpenGLESTextureCacheCreateTextureFromImage. This works great when processing camera frames used for the preview, tested in various combinations of iPhone 3GS, 4, 4S, iPod Touch (4th generation) and iOS 5 / iOS 6.

However, for the actual final image, which has a much higher resolution, this only works for the following combinations:

  • iPhone 3GS + iOS 5.1.1
  • iPhone 4 + iOS 5.1.1
  • iPhone 4S + iOS 6.0
  • iPod Touch (4th generation) + iOS 5.0

And it does not work for: iPhone 4 + iOS 6.

The exact error message in the console:

Failed to create IOSurface image (texture)
2012-10-01 16:24:30.663 GLCameraRipple[676:907] Error at CVOpenGLESTextureCacheCreateTextureFromImage -6683

I tried to track this down by modifying Apple's GLCameraRipple project. You can view my version here: http://lab.bitshiftcop.com/iosurface.zip

Here is how I add the still image output to the current capture session:

- (void)setupAVCapture
{
    //-- Create CVOpenGLESTextureCacheRef for optimal CVImageBufferRef to GLES texture conversion.
    CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, [EAGLContext currentContext], NULL, &_videoTextureCache);
    if (err) 
    {
        NSLog(@"Error at CVOpenGLESTextureCacheCreate %d", err);
        return;
    }

    //-- Setup Capture Session.
    _session = [[AVCaptureSession alloc] init];
    [_session beginConfiguration];

    //-- Set preset session size.
    [_session setSessionPreset:_sessionPreset];

    //-- Create a video device and input from that device.  Add the input to the capture session.
    AVCaptureDevice * videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if(videoDevice == nil)
        assert(0);

    //-- Add the device to the session.
    NSError *error;        
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
    if(error)
        assert(0);

    [_session addInput:input];

    //-- Create the output for the capture session.
    AVCaptureVideoDataOutput * dataOutput = [[AVCaptureVideoDataOutput alloc] init];
    [dataOutput setAlwaysDiscardsLateVideoFrames:YES]; // Probably want to set this to NO when recording

    //-- Set the output pixel format to 32BGRA.
    [dataOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                             forKey:(id)kCVPixelBufferPixelFormatTypeKey]]; // Necessary for manual preview

    // Set dispatch to be on the main thread so OpenGL can do things with the data
    [dataOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];


    // Add still output
    stillOutput = [[AVCaptureStillImageOutput alloc] init];
    [stillOutput setOutputSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
    if([_session canAddOutput:stillOutput]) [_session addOutput:stillOutput];

    [_session addOutput:dataOutput];
    [_session commitConfiguration];

    [_session startRunning];
}

Here is how I capture the still output and process it:

- (void)capturePhoto
{
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo] ) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }

    [stillOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:
     ^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
         // Process hires image
         [self captureOutput:stillOutput didOutputSampleBuffer:imageSampleBuffer fromConnection:videoConnection];
     }];
}

And here is how I create the texture:

- (void)captureOutput:(AVCaptureOutput *)captureOutput 
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
       fromConnection:(AVCaptureConnection *)connection
{
    CVReturn err;
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    size_t width = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);

    if (!_videoTextureCache)
    {
        NSLog(@"No video texture cache");
        return;
    }

    if (_ripple == nil ||
        width != _textureWidth ||
        height != _textureHeight)
    {
        _textureWidth = width;
        _textureHeight = height;

        _ripple = [[RippleModel alloc] initWithScreenWidth:_screenWidth 
                                              screenHeight:_screenHeight
                                                meshFactor:_meshFactor
                                               touchRadius:5
                                              textureWidth:_textureWidth
                                             textureHeight:_textureHeight];

        [self setupBuffers];
    }

    [self cleanUpTextures];

    NSLog(@"%zi x %zi", _textureWidth, _textureHeight);

    // RGBA texture
    glActiveTexture(GL_TEXTURE0);
    err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, 
                                                       _videoTextureCache,
                                                       pixelBuffer,
                                                       NULL,
                                                       GL_TEXTURE_2D,
                                                       GL_RGBA,
                                                       _textureWidth,
                                                       _textureHeight,
                                                       GL_BGRA,
                                                       GL_UNSIGNED_BYTE,
                                                       0,
                                                       &_chromaTexture);
    if (err) 
    {
        NSLog(@"Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err);
    }

    glBindTexture(CVOpenGLESTextureGetTarget(_chromaTexture), CVOpenGLESTextureGetName(_chromaTexture));
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE); 
}

Any suggestions for solving this problem?


2 Answers

Stack Overflow user

Posted on 2012-12-22 13:35:29

The iPhone 4 (as well as the iPhone 3GS and the iPod Touch 4th generation) uses a PowerVR SGX 535 GPU, whose maximum OpenGL ES texture size is 2048x2048. This value can be queried by calling

GLint maxTextureSize = 0;
glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxTextureSize);

The iPod Touch (4th generation) has a camera resolution of 720x960 and the iPhone 3GS 640x1136, but the iPhone 4's rear camera resolution is 1936x2592, which is too large to fit into a single texture.
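As a quick guard (a minimal sketch, not part of the original answer; the helper name is hypothetical and a current EAGLContext is assumed), the still buffer's dimensions can be checked against this limit before attempting the texture-cache upload:

- (BOOL)pixelBufferFitsInTexture:(CVPixelBufferRef)pixelBuffer
{
    // Query the GPU's maximum texture dimension; requires a current EAGLContext.
    GLint maxTextureSize = 0;
    glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxTextureSize);

    size_t width  = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);

    // On the PowerVR SGX 535 this limit is 2048, so the iPhone 4's 1936x2592 still fails here.
    return (width <= (size_t)maxTextureSize) && (height <= (size_t)maxTextureSize);
}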

You can always redraw the captured image at a smaller size while preserving the aspect ratio (1529x2048). Brad does this in his GPUImage framework, but it is fairly straightforward: redraw the data of the original pixel buffer using Core Graphics, then create another pixel buffer from the redrawn data. The rest of the framework is a great resource as well.
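A minimal sketch of that redraw step, assuming 32BGRA buffers as in the question's capture settings (this is not the GPUImage implementation; the helper name and the IOSurface-backed attributes are assumptions):

// Hypothetical helper: redraws a 32BGRA pixel buffer into a new, smaller 32BGRA
// pixel buffer using Core Graphics. The caller releases the result with CVPixelBufferRelease.
- (CVPixelBufferRef)createScaledPixelBuffer:(CVPixelBufferRef)sourceBuffer
                                      width:(size_t)newWidth
                                     height:(size_t)newHeight
{
    // Wrap the source pixels in a CGImage via a bitmap context.
    CVPixelBufferLockBaseAddress(sourceBuffer, kCVPixelBufferLock_ReadOnly);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef sourceContext = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(sourceBuffer),
                                                       CVPixelBufferGetWidth(sourceBuffer),
                                                       CVPixelBufferGetHeight(sourceBuffer),
                                                       8,
                                                       CVPixelBufferGetBytesPerRow(sourceBuffer),
                                                       colorSpace,
                                                       kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef sourceImage = CGBitmapContextCreateImage(sourceContext);
    CGContextRelease(sourceContext);
    CVPixelBufferUnlockBaseAddress(sourceBuffer, kCVPixelBufferLock_ReadOnly);

    // Create the destination pixel buffer; IOSurface backing lets the texture cache map it directly.
    NSDictionary *attributes = [NSDictionary dictionaryWithObject:[NSDictionary dictionary]
                                                            forKey:(id)kCVPixelBufferIOSurfacePropertiesKey];
    CVPixelBufferRef scaledBuffer = NULL;
    CVPixelBufferCreate(kCFAllocatorDefault, newWidth, newHeight, kCVPixelFormatType_32BGRA,
                        (__bridge CFDictionaryRef)attributes, &scaledBuffer);

    // Redraw the source image at the reduced size.
    CVPixelBufferLockBaseAddress(scaledBuffer, 0);
    CGContextRef destContext = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(scaledBuffer),
                                                     newWidth,
                                                     newHeight,
                                                     8,
                                                     CVPixelBufferGetBytesPerRow(scaledBuffer),
                                                     colorSpace,
                                                     kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGContextDrawImage(destContext, CGRectMake(0, 0, newWidth, newHeight), sourceImage);
    CGContextRelease(destContext);
    CVPixelBufferUnlockBaseAddress(scaledBuffer, 0);

    CGImageRelease(sourceImage);
    CGColorSpaceRelease(colorSpace);
    return scaledBuffer;
}

For the iPhone 4's 1936x2592 still, this could be called with 1529x2048 to stay under the 2048 limit while keeping the aspect ratio, and the scaled buffer then passed to CVOpenGLESTextureCacheCreateTextureFromImage in place of the original.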

Votes: 2

Stack Overflow user

Posted on 2012-12-21 10:22:01

We can't convert a still image texture into a CVOpenGLESTextureCacheRef. Core Video allows you to map video frames directly into OpenGL textures; when using a video buffer, Core Video creates the textures and gives them to us, already in video memory.

To create an OpenGL ES texture, this link may help you: link

Votes: 1
Original page content provided by Stack Overflow.
Original link:

https://stackoverflow.com/questions/12675655
