
AVCapture preview freezes/gets stuck when unlocking the phone

Stack Overflow user
Asked on 2016-08-20 15:08:01
2 answers · 1.3K views · 0 following · 7 votes

My iOS camera app, written in Objective-C, freezes the preview layer when returning from the lock screen / unlocking the phone.

All camera configuration settings are invoked in viewWillAppear. So far this has worked, with one remaining problem: the camera preview layer freezes or gets stuck when coming back from the lock screen. The camera portion of my code is shown below.

Any help is greatly appreciated. Thanks. P.S.: Please feel free to point out any mistakes in my code, since I am just a beginner.

- (void)viewWillAppear:(BOOL)animated
{
    [super viewWillAppear:animated];
    dispatch_async(dispatch_get_main_queue(), ^{
    [self setGUIBasedOnMode];
    });
}


-(void) setGUIBasedOnMode
{
if (![self isStreamStarted]) {
if (shutterActionMode == SnapCamSelectionModeLiveStream)
{
    _flashButton.hidden = true;
    _cameraButton.hidden = true;
    _liveSteamSession = [[VCSimpleSession alloc] initWithVideoSize:[[UIScreen mainScreen]bounds].size frameRate:30 bitrate:1000000 useInterfaceOrientation:YES];
    [_liveSteamSession.previewView removeFromSuperview];
    AVCaptureVideoPreviewLayer  *ptr;
    [_liveSteamSession getCameraPreviewLayer:(&ptr)];
    _liveSteamSession.previewView.frame = self.view.bounds;
    _liveSteamSession.delegate = self;
}
else{
    [_liveSteamSession.previewView removeFromSuperview];
    _liveSteamSession.delegate = nil;
    _cameraButton.hidden = false;
    if(flashFlag == 0){
        _flashButton.hidden = false;
    }
    else if(flashFlag == 1){
        _flashButton.hidden = true;
    }
    self.session = [[AVCaptureSession alloc] init];
    self.previewView.hidden = false;
    self.previewView.session = self.session;

    [self configureCameraSettings]; //All The Camera Configuration Settings.

    dispatch_async( self.sessionQueue, ^{
        switch ( self.setupResult )
        {
            case AVCamSetupResultSuccess:
            {
                [self addObservers];

                [self.session startRunning];

                self.sessionRunning = self.session.isRunning;
                if(loadingCameraFlag == false){
                    [self hidingView];
                }
                break;
            }
            case AVCamSetupResultCameraNotAuthorized:
            {
                dispatch_async( dispatch_get_main_queue(), ^{
                    NSString *message = NSLocalizedString( @"MyApp doesn't have permission to use the camera, please change privacy settings", @"Alert message when the user has denied access to the camera");
                    UIAlertController *alertController = [UIAlertController alertControllerWithTitle:@"AVCam" message:message preferredStyle:UIAlertControllerStyleAlert];
                    UIAlertAction *cancelAction = [UIAlertAction actionWithTitle:NSLocalizedString( @"OK", @"Alert OK button" ) style:UIAlertActionStyleCancel handler:nil];
                    [alertController addAction:cancelAction];

                    UIAlertAction *settingsAction = [UIAlertAction actionWithTitle:NSLocalizedString( @"Settings", @"Alert button to open Settings" ) style:UIAlertActionStyleDefault handler:^( UIAlertAction *action ) {
                        [[UIApplication sharedApplication] openURL:[NSURL URLWithString:UIApplicationOpenSettingsURLString]];
                    }];
                    [alertController addAction:settingsAction];
                    [self presentViewController:alertController animated:YES completion:nil];
                } );
                break;
            }
            case AVCamSetupResultSessionConfigurationFailed:
            {
                dispatch_async( dispatch_get_main_queue(), ^{
                    NSString *message = NSLocalizedString( @"Unable to capture media", @"Alert message when something goes wrong during capture session configuration" );
                    UIAlertController *alertController = [UIAlertController alertControllerWithTitle:@"MyApp" message:message preferredStyle:UIAlertControllerStyleAlert];
                    UIAlertAction *cancelAction = [UIAlertAction actionWithTitle:NSLocalizedString( @"OK", @"Alert OK button" ) style:UIAlertActionStyleCancel handler:nil];
                    [alertController addAction:cancelAction];
                    [self presentViewController:alertController animated:YES completion:nil];
                } );
                break;
            }
        }
    });
}
    }
}

-(void)configureCameraSettings
{
self.sessionQueue = dispatch_queue_create( "session queue", DISPATCH_QUEUE_SERIAL );
self.setupResult = AVCamSetupResultSuccess;
switch ( [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo] )
{
    case AVAuthorizationStatusAuthorized:
    {
        break;
    }
    case AVAuthorizationStatusNotDetermined:
    {
        dispatch_suspend( self.sessionQueue);
        [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^( BOOL granted ) {
            if ( ! granted ) {
                self.setupResult = AVCamSetupResultCameraNotAuthorized;
            }
            dispatch_resume( self.sessionQueue );
        }];
        break;
    }
    default:
    {
        self.setupResult = AVCamSetupResultCameraNotAuthorized;
        break;
    }
}

dispatch_async( self.sessionQueue, ^{
if ( self.setupResult != AVCamSetupResultSuccess ) {
    return;
}
self.backgroundRecordingID = UIBackgroundTaskInvalid;
NSError *error = nil;

AVCaptureDevice *videoDevice = [IPhoneCameraViewController deviceWithMediaType:AVMediaTypeVideo preferringPosition:AVCaptureDevicePositionBack];
AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];

[self.session beginConfiguration];

if ( [self.session canAddInput:videoDeviceInput] ) {
    [self.session addInput:videoDeviceInput];
    self.videoDeviceInput = videoDeviceInput;

    dispatch_async( dispatch_get_main_queue(), ^{
        UIInterfaceOrientation statusBarOrientation = [UIApplication sharedApplication].statusBarOrientation;
        AVCaptureVideoOrientation initialVideoOrientation = AVCaptureVideoOrientationPortrait;
        if ( statusBarOrientation != UIInterfaceOrientationUnknown ) {
            initialVideoOrientation = (AVCaptureVideoOrientation)statusBarOrientation;
        }
        AVCaptureVideoPreviewLayer *previewLayer = (AVCaptureVideoPreviewLayer *)self.previewView.layer;
        if (shutterActionMode == SnapCamSelectionModeVideo)
        {
            [previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
            if([self.session canSetSessionPreset:AVCaptureSessionPresetMedium]){
                [self.session setSessionPreset:AVCaptureSessionPresetMedium];
            }
        }
        previewLayer.connection.videoOrientation = initialVideoOrientation;
    } );
}
else {
    self.setupResult = AVCamSetupResultSessionConfigurationFailed;
}

AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
AVCaptureDeviceInput *audioDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];

if ( ! audioDeviceInput ) {
    NSLog( @"Could not create audio device input: %@", error );
}

if ( [self.session canAddInput:audioDeviceInput] ) {
    [self.session addInput:audioDeviceInput];
}
else {
    NSLog( @"Could not add audio device input to the session" );
}

AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];

Float64 TotalSeconds = 10*60;
int32_t preferredTimeScale = 30;
CMTime maxDuration = CMTimeMakeWithSeconds(TotalSeconds, preferredTimeScale);
movieFileOutput.maxRecordedDuration = maxDuration;
movieFileOutput.minFreeDiskSpaceLimit = 1024 * 1024 * 100;

if ( [self.session canAddOutput:movieFileOutput] ) {
    [self.session addOutput:movieFileOutput];
    AVCaptureConnection *connection = [movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    if ( connection.isVideoStabilizationSupported ) {
        connection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeAuto;
    }
    self.movieFileOutput = movieFileOutput;
}
else {
    self.setupResult = AVCamSetupResultSessionConfigurationFailed;
}

AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
if ( [self.session canAddOutput:stillImageOutput] ) {
    stillImageOutput.outputSettings = @{AVVideoCodecKey : AVVideoCodecJPEG};
    [self.session addOutput:stillImageOutput];
    self.stillImageOutput = stillImageOutput;
}
else {
    self.setupResult = AVCamSetupResultSessionConfigurationFailed;
}
[self.session commitConfiguration];
});
}

2 Answers

Stack Overflow user

Answered on 2016-08-20 16:37:42

Try observing the UIApplicationDidEnterBackgroundNotification / UIApplicationWillEnterForegroundNotification and UIApplicationWillResignActiveNotification / UIApplicationDidBecomeActiveNotification notifications, and stop/start the capture session accordingly.
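A minimal sketch of that approach, assuming the asker's existing `session`, `sessionQueue`, and `setupResult` properties (the observer selector names here are illustrative, not part of the original code):

```objectivec
// Register once, e.g. in viewDidLoad.
- (void)viewDidLoad
{
    [super viewDidLoad];
    NSNotificationCenter *center = [NSNotificationCenter defaultCenter];
    [center addObserver:self
               selector:@selector(appWillResignActive:)
                   name:UIApplicationWillResignActiveNotification
                 object:nil];
    [center addObserver:self
               selector:@selector(appDidBecomeActive:)
                   name:UIApplicationDidBecomeActiveNotification
                 object:nil];
}

// Stop the session when the screen is about to lock or the app loses focus.
- (void)appWillResignActive:(NSNotification *)note
{
    dispatch_async( self.sessionQueue, ^{
        if ( self.session.isRunning ) {
            [self.session stopRunning];
        }
    } );
}

// Restart the session when the user unlocks / returns to the app.
- (void)appDidBecomeActive:(NSNotification *)note
{
    dispatch_async( self.sessionQueue, ^{
        if ( self.setupResult == AVCamSetupResultSuccess && ! self.session.isRunning ) {
            [self.session startRunning];
        }
    } );
}
```

Stopping the session on resign-active and cleanly restarting it on the session queue when the app becomes active again should avoid resuming a stale preview connection, which is one common cause of the frozen preview after unlock.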

Score: 1

Stack Overflow user

Answered on 2016-08-20 15:35:36

ViewDidLoad, ViewWillAppear, and ViewDidAppear play very different roles in the application life cycle.

Creating UIViews or doing other heavy work causes freezes, so you should avoid doing it in the ViewWillAppear method as much as possible.

Take a look:

  1. ViewDidLoad: whenever I add controls to a view that should appear together with the view, I put that in the ViewDidLoad method. Basically, this method is called whenever the view is loaded into memory. For example, if my view is a form with 3 labels, I would add the labels here; the view will never exist without those forms.
  2. ViewWillAppear: I usually use ViewWillAppear just to update the data on the form. For the example above, I would use it to actually load the data from my domain into the form. Creating UIViews is fairly expensive, and you should avoid doing that in the ViewWillAppear method as much as possible, because when this method is called it means the iPhone is already ready to show the UIView to the user, and anything heavy you do here will hurt performance in a very visible way (such as animations being delayed, etc.).
  3. ViewDidAppear: finally, I use ViewDidAppear to start new threads for things that take a long time to execute, such as making a web service call to fetch extra data for the form above. The good thing is that because the view already exists and is being displayed to the user, you can show a nice "waiting" message while you fetch the data.
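Applied to the question, the split described above might look roughly like the following sketch (reusing the asker's `configureCameraSettings`, `setGUIBasedOnMode`, `sessionQueue`, and `session` names; this is an illustration of the advice, not a drop-in fix):

```objectivec
- (void)viewDidLoad
{
    [super viewDidLoad];
    // One-time, potentially expensive setup: create the session and views.
    [self configureCameraSettings];
}

- (void)viewWillAppear:(BOOL)animated
{
    [super viewWillAppear:animated];
    // Lightweight updates only: toggle buttons, refresh labels.
    [self setGUIBasedOnMode];
}

- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];
    // Long-running work off the main thread, once the view is visible.
    dispatch_async( self.sessionQueue, ^{
        [self.session startRunning];
    } );
}
```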
Score: 0
The original content of this page is provided by Stack Overflow.
Original link: https://stackoverflow.com/questions/39051497