In my app I have an object, CameraHandler, that uses GPUImage to detect certain motion from the camera. It is initialized in my GameViewController.

CameraHandler successfully detects the motion and fires the relevant method, but it locks up the GameViewController's view for quite a while (~5 to 10 seconds) before any change appears on screen. Once CameraHandler detects a change, it triggers a method that changes the background of the top view on the view controller and shows a UIAlertView (for testing purposes). As I said, this only happens 5-10 seconds after the method is invoked. I know the app itself is not frozen, because I get the relevant log output from the method. I have tried various approaches, but I have come up empty-handed for weeks.
In GameViewController (where CameraHandler is invoked and started):
-(void)startRound{
    dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0ul);
    dispatch_async(queue, ^{
        [_shotDetector captureStillImage];
        dispatch_sync(dispatch_get_main_queue(), ^{
            NSLog(@"finish capture still image thread");
        });
    });
}
/* this method gets called from CameraHandler once it detects movement */
-(void)shotLifted:(NSNumber*)afterTime{
    NSLog(@"shot lifted fired");
    UIAlertView *lost = [[UIAlertView alloc] initWithTitle:@"Good Job!" message:@"Shot lifted in time" delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil];
    [lost show];
    [_questionView setBackgroundColor:[UIColor whiteColor]];
    NSLog(@"shot lifted done");
}

CameraHandler.h:
@interface CameraHandler : NSObject <GPUImageVideoCameraDelegate>
@property (strong) GPUImageOutput<GPUImageInput> *filter,*emptyFilter;
@property (strong) GPUImageVideoCamera *videoCamera;
@property (strong) GPUImagePicture *sourcePicture;
@property (strong) GPUImageOutput *pictureOutput;
@property (strong) GPUImageStillCamera *stillCamera;
@property (strong) __block UILabel *shotRemovedLabel;
@property (strong) __block NSDate *startTime;
@property (strong) NSMutableArray *averageLum;
@property (strong) id delegate;
@property (strong) GPUImageLuminosity *lumin;

The relevant method from CameraHandler.m:
-(void)startBlackoutShotMotionAnalysis{
    NSLog(@"starting shot motion analysis");
    [_videoCamera addTarget:self.filter];
    [_sourcePicture processImage];
    [_sourcePicture addTarget:self.filter];
    [_videoCamera startCameraCapture];
    _lumin = [[GPUImageLuminosity alloc] init];
    [self.filter addTarget:_lumin];
    __block int i = 0;
    __unsafe_unretained GameViewController *weakDelegate = self.delegate;
    // begin luminosity detection on the live video
    [(GPUImageLuminosity *)_lumin setLuminosityProcessingFinishedBlock:^(CGFloat luminosity, CMTime frameTime) {
        if (i < 60) {
            if (i > 10) {
                _startTime = [NSDate date];
                [_averageLum addObject:[NSNumber numberWithFloat:luminosity]];
            }
            i++;
        }
        else {
            CGFloat average = [[_averageLum valueForKeyPath:@"@avg.floatValue"] floatValue];
            CGFloat difference = fabsf(luminosity - average);
            if (difference > 0.05) {
                NSTimeInterval liftedAfter = [_startTime timeIntervalSinceDate:[NSDate date]];
                [weakDelegate performSelector:@selector(shotLifted:) withObject:[NSNumber numberWithFloat:liftedAfter]];
                [_videoCamera stopCameraCapture];
                NSLog(@"should turn white now");
                return;
            }
        }
    }];
    NSLog(@"finished returning executing starBlackoutMotionAnalysis Method");
}

NSLog output:
2014-04-08 20:22:45.450 Groupy[2887:5c0f] starting shot motion analysis
2014-04-08 20:22:46.152 Groupy[2887:5c0f] finished returning executing starBlackoutMotionAnalysis Method
2014-04-08 20:22:48.160 Groupy[2887:1303] shot lifted fired
2014-04-08 20:22:48.221 Groupy[2887:1303] shot lifted done
2014-04-08 20:22:48.290 Groupy[2887:1303] should turn white now

Any pointer in the right direction would be great. I have been trying to figure this out for a while. Thanks!
Answered on 2014-04-18 15:20:42
When I see unusually delayed updates in the UI, the first thing I check is whether my UI-update code is actually executing on the main queue. With very few exceptions, you should always dispatch any UI-related code onto the main queue; otherwise you get odd behavior exactly like this.
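A quick way to confirm that diagnosis is to check the current thread at the top of the UI-updating method. This is a minimal diagnostic sketch using the question's shotLifted: as the example (the assertion message is mine, not part of the original code):

```objectivec
// Sketch: assert that a UIKit-touching method was entered on the main thread.
-(void)shotLifted:(NSNumber*)afterTime{
    NSAssert([NSThread isMainThread],
             @"shotLifted: called off the main thread -- the UIKit calls below are unsafe");
    // ... existing UIAlertView / background-color code ...
}
```

If the assertion fires, the method is being invoked from a background thread and the UIKit work inside it needs to be moved to the main queue.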
From what I can see, you perform the shotLifted: selector directly inside the luminosityProcessingFinishedBlock. We can safely assume GPUImage invokes that block off the main thread, which means the code that initializes and shows the alert view is also running off the main thread.

To change that, try wrapping the call to shotLifted: in a block and dispatching it onto the main queue:
dispatch_async(dispatch_get_main_queue(), ^{
    [weakDelegate performSelector:@selector(shotLifted:) withObject:obj];
});

Alternatively, you could do:

[weakDelegate performSelectorOnMainThread:@selector(shotLifted:) withObject:obj waitUntilDone:NO];

https://stackoverflow.com/questions/22950723
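Applied to the code in the question, the tail of the luminosity block would then look roughly like this (a sketch reusing the question's weakDelegate, _videoCamera, and liftedAfter; the dispatch_async wrapper is the only change):

```objectivec
if (difference > 0.05) {
    NSTimeInterval liftedAfter = [_startTime timeIntervalSinceDate:[NSDate date]];
    [_videoCamera stopCameraCapture];
    // Hop onto the main queue before any UIKit work happens in shotLifted:
    dispatch_async(dispatch_get_main_queue(), ^{
        [weakDelegate performSelector:@selector(shotLifted:)
                           withObject:[NSNumber numberWithFloat:liftedAfter]];
    });
    return;
}
```

Since the camera callback no longer blocks on UI work, the alert and background-color change should appear as soon as the motion is detected.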