
Getting an error like [Utility] +[AFAggregator logDictationFailedWithError:] error Domain=kAFAssistantErrorDomain Code=209 "(null)"

Stack Overflow user
Asked on 2017-02-09 19:22:38
1 answer · 1.9K views · 0 followers · 3 votes

I am trying to develop an app that has both speech-to-text and text-to-speech features. That is, I dictate text by voice, and while the app is playing the text back as speech, speech recognition should be turned off.

Steps:

1. Open the app — it should start recognizing speech.
2. Start saying something like "Hello".
3. Say "play" — on that command the app should speak the text shown on screen.
4. After the audio finishes playing, speech recognition should start again.

The steps above are the cycle I expect to repeat continuously; speech recognition should keep running. For speech-to-text I use Apple's Speech framework; for text-to-speech I use the AVFoundation and MediaPlayer frameworks.

Here is the problem I am facing: (1) I dictate "hello" and it is printed on screen; (2) I say "play" as a command and the app plays the audio; (3) afterwards I get the error [Utility] +[AFAggregator logDictationFailedWithError:] error Domain=kAFAssistantErrorDomain Code=209 "(null)". I don't know why this happens. I searched Google but did not find any working solution. Please help me fix this.

Here is my code:

@interface ViewController ()

@end

@implementation ViewController
{
    SFSpeechAudioBufferRecognitionRequest *recognitionRequest;
    SFSpeechRecognitionTask *recognitionTask;
    AVAudioEngine *audioEngine;
    NSMutableArray *speechStringsArray;
    BOOL SpeechToText;
    NSString* resultString;
    NSString *str ;
    NSString *searchString;
    AVAudioInputNode *inputNode;
    BOOL didStartSpeaking;
    NSString *textToSpeak;
    NSMutableArray *speechCommandArray;

}

- (void)viewDidLoad {
    [super viewDidLoad];

    //Speech To Text ****
    speechStringsArray = [[NSMutableArray alloc]init];
    [self.textView resignFirstResponder];
    [self.textView setDelegate:self];


       ////             *****
    // Initialize background audio session
    NSError *error = NULL;
    AVAudioSession *session = [AVAudioSession sharedInstance];
    [session setCategory:AVAudioSessionCategoryPlayback error:&error];
    if(error) {
        NSLog(@"@error: %@", error);
    }
    [session setActive:YES error:&error];
    if (error) {
        NSLog(@"@error: %@", error);
    }

    // Enabled remote controls
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];

    // Voice setup
    self.voicePicker.delegate = self;
    self.voice = [AVSpeechSynthesisVoice voiceWithLanguage:@"en-us"];
    self.voices = [NSMutableArray arrayWithObjects:
                   @{@"voice" : @"en-us", @"label" : @"American English (Female)"},
                   @{@"voice" : @"en-au", @"label" : @"Australian English (Female)"},
                   @{@"voice" : @"en-gb", @"label" : @"British English (Male)"},
                   @{@"voice" : @"en-ie", @"label" : @"Irish English (Female)"},
                   @{@"voice" : @"en-za", @"label" : @"South African English (Female)"},
                   nil];

    // Synthesizer setup
    self.synthesizer = [[AVSpeechSynthesizer alloc] init];
    self.synthesizer.delegate = self;

       // This notification is generated from the AppDelegate applicationDidBecomeActive method to make sure that if the play or pause state changes in the background, the button in the toolbar is updated accordingly
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(updateToolbar) name:@"updateToolbar" object:nil];

    if (self.textView.text.length >0) {

        [self.playPauseBarButtonItem setEnabled:YES];
     }
    else
    {
        [self.playPauseBarButtonItem setEnabled:NO];
        }}


-(void)viewDidAppear:(BOOL)animated
{

    self.speechRecognizer = [[SFSpeechRecognizer alloc]initWithLocale:[NSLocale localeWithLocaleIdentifier:@"en-US en-UK en-IN"]];



    self.speechRecognizer.delegate = self;

    audioEngine = [[AVAudioEngine alloc]init];

    self.textView.text = @"";

    [SFSpeechRecognizer requestAuthorization:^(SFSpeechRecognizerAuthorizationStatus authStatus) {
        switch (authStatus) {
            case SFSpeechRecognizerAuthorizationStatusAuthorized:
                //User gave access to speech recognition
                NSLog(@"Authorized");

                [self start_record];


                break;

            case SFSpeechRecognizerAuthorizationStatusDenied:
                //User denied access to speech recognition
                NSLog(@"AuthorizationStatusDenied");

                break;

            case SFSpeechRecognizerAuthorizationStatusRestricted:
                //Speech recognition restricted on this device
                NSLog(@"AuthorizationStatusRestricted");

                break;

            case SFSpeechRecognizerAuthorizationStatusNotDetermined:
                //Speech recognition not yet authorized

                break;

            default:
                NSLog(@"Default");
                break;
        }
    }];

    //MARK : Interface Builder Actions


}

 //methods for increase the speed    
- (IBAction)handleSpeedStepper:(UIStepper *)sender
{
    double speedValue = self.speedStepper.value;
    [self.speedValueLabel setText:[NSString stringWithFormat:@"%.1f", speedValue]];
}
//methods for increase the pitch
- (IBAction)handlePitchStepper:(UIStepper *)sender
{
    double pitchValue = self.pitchStepper.value;
    [self.pitchValueLabel setText:[NSString stringWithFormat:@"%.1f", pitchValue]];
}
//methods for play and pause the voice
- (IBAction)handlePlayPauseButton:(UIBarButtonItem *)sender
{

    if (self.synthesizer.speaking && !self.synthesizer.paused) {

        if (self.pauseSettingSegmentedControl.selectedSegmentIndex == 0) {
            // Stop immediately
            [self.synthesizer pauseSpeakingAtBoundary:AVSpeechBoundaryImmediate];
        }

        else
        {
            // Stop at end of current word
            [self.synthesizer pauseSpeakingAtBoundary:AVSpeechBoundaryWord];

        }


        [self updateToolbarWithButton:@"play"];
    }

    else if (self.synthesizer.paused) {
        [self.synthesizer continueSpeaking];
        [self updateToolbarWithButton:@"pause"];
    }


    else {

        if(self.textView.text.length>0)
        {


        [self speakUtterance];

        [self updateToolbarWithButton:@"pause"];
        }
        else
        {
            [self updateToolbarWithButton:@"play"];

        }

    }
}

// Speech recognition (speech-to-text) method for user speech

   -(void)start_record{


    NSError * outError;

    AVAudioSession *audioSession = [AVAudioSession sharedInstance];

    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&outError];

    [audioSession setMode:AVAudioSessionModeMeasurement error:&outError];
    [audioSession setActive:YES withOptions:AVAudioSessionSetActiveOptionNotifyOthersOnDeactivation  error:&outError];




    recognitionRequest = [[SFSpeechAudioBufferRecognitionRequest alloc]init];


    inputNode = audioEngine.inputNode;


    if (recognitionRequest  == nil) {
        NSLog(@"Unable to create a SFSpeechAudioBufferRecognitionRequest object");
    }
    if (inputNode == nil) {
        NSLog(@"Audio engine has no input node ");


    }

    //configure request so that results are returned before audio recording is finished


    [recognitionRequest setShouldReportPartialResults:YES];


    // A recognition task represents a speech recognition session.
    //We keep a reference to the task so that it can be cancelled .


    recognitionTask = [self.speechRecognizer recognitionTaskWithRequest:recognitionRequest resultHandler:^(SFSpeechRecognitionResult * result, NSError *  error1) {

        BOOL isFinal = false;

        if (result != nil) {


            NSString *speech = result.bestTranscription.formattedString;
            NSLog(@"the speech:%@",speech);

            //storing the results in array.
            for (int i = 0 ;i <speechStringsArray.count;i++)
            {

                str = [speechStringsArray objectAtIndex:i];

                NSRange range = [speech rangeOfString:str options:NSCaseInsensitiveSearch];
                NSLog(@"found: %@", (range.location != NSNotFound) ? @"Yes" : @"No");

                if (range.location != NSNotFound) {

                    NSLog(@"inside if");
                    resultString = [speech stringByReplacingCharactersInRange:range withString:@""];

                    speech = resultString;

                    NSLog(@" the result is : %@",resultString);

                }
                else
                {
                    NSLog(@"ifcondition fails");
                    //speech = resultString;
                    resultString = speech;

                }

            }


            //commands are used like play to play a voice
            if (resultString.length>0) {


                if ([resultString isEqualToString:@" play"]) {

                    [self play];


                }
                else if ([resultString isEqualToString:@" exit"])
                {
                    UIApplication *app = [UIApplication sharedApplication];
                    [self applicationWillTerminate:app];

                }

                else if ([speech isEqualToString:@" mute"])
                {
                    [[MPMusicPlayerController applicationMusicPlayer] setVolume:0];



                }
                else if ([speech isEqualToString:@" up"])
                {
                    float vol = [[AVAudioSession sharedInstance] outputVolume];
                    vol ++;
                    [[MPMusicPlayerController applicationMusicPlayer] setVolume:vol];

                }
                else if ([speech isEqualToString:@" down"])
                {
                    float vol = [[AVAudioSession sharedInstance] outputVolume];
                    vol --;
                    [[MPMusicPlayerController applicationMusicPlayer] setVolume:vol];
                }

                else if ([speech isEqualToString:@" speed"])
                {
                    double speedValue = self.speedStepper.value;
                    NSString *speedup = self.speedValueLabel.text;
                    double speed = [speedup doubleValue];

                    speed += 0.1;

                    [self.speedValueLabel setText:[NSString stringWithFormat:@"%.1f", speed]];
                }

                else
                {
                    NSLog(@"coming");
                    self.textView.text = [NSString stringWithFormat:@"%@%@",self.textView.text,resultString];


          if (![resultString isEqualToString:@" play"]) {

               [speechStringsArray addObject:resultString];

  }



                }



            }

            else
            {

                if ([speech isEqualToString:@"Play"]) {

                    [self play];


                }



               if ([speech isEqualToString:@"Speed"])

                {
                    double speedValue = self.speedStepper.value;
                    NSString *speedup = self.speedValueLabel.text;
                    double speed = [speedup doubleValue];

                    speed += 0.1;

                    [self.speedValueLabel setText:[NSString stringWithFormat:@"%.1f", speed]];
                }

                else if ([speech isEqualToString:@"Exit"])

                {
                   //[[NSThread mainThread] exit];
                    UIApplication *app = [UIApplication sharedApplication];
                    [self applicationWillTerminate:app];

                }

                else if ([speech isEqualToString:@"Mute"])
                {
                    [[MPMusicPlayerController applicationMusicPlayer] setVolume:0];



                }
                else if ([speech isEqualToString:@"Up"])
                {
                    float vol = [[AVAudioSession sharedInstance] outputVolume];
                  //  vol +=1.0;
                    vol ++;

                    [[MPMusicPlayerController applicationMusicPlayer] setVolume:vol];

                }
                else if ([speech isEqualToString:@"Down"])
                {
                    float vol = [[AVAudioSession sharedInstance] outputVolume];
                    //vol -= 1.0;
                    vol --;
                    [[MPMusicPlayerController applicationMusicPlayer] setVolume:vol];
                }
                else
                {


                    if (![speech isEqualToString:@"Play"]) {

                        [speechStringsArray addObject:speech];

                        if (self.textView.text.length > 0) {

                            self.textView.text = [NSString stringWithFormat:@"%@%@",self.textView.text,speech];


                        }
                        else

                        {
                            self.textView.text = speech;

                        }


                    }

                }


            }

            NSLog(@" array %@",speechStringsArray);


            isFinal = result.isFinal;

        }


        if (error1 != nil || isFinal) {


            recognitionRequest = nil;
            recognitionTask = nil;


            [audioEngine stop];
            [inputNode removeTapOnBus:0];
            [recognitionRequest endAudio];


            [self start_record];


        }

    }];

    AVAudioFormat *recordingFormat =  [inputNode outputFormatForBus:0];

    [inputNode installTapOnBus:0 bufferSize:1024 format:recordingFormat block:^(AVAudioPCMBuffer * _Nonnull buffer, AVAudioTime * _Nonnull when){

        [recognitionRequest appendAudioPCMBuffer:buffer];


    }



     ];
    NSError *error1;
    [audioEngine prepare];
    [audioEngine startAndReturnError:&error1];




}

-(void)play
{
    if (self.synthesizer.speaking && !self.synthesizer.paused) {

        if (self.pauseSettingSegmentedControl.selectedSegmentIndex == 0) {
            // Stop immediately
            [self.synthesizer pauseSpeakingAtBoundary:AVSpeechBoundaryImmediate];
        }

        else
        {
            // Stop at end of current word
            [self.synthesizer pauseSpeakingAtBoundary:AVSpeechBoundaryWord];

        }


        [self updateToolbarWithButton:@"play"];
    }

    else if (self.synthesizer.paused) {
        [self.synthesizer continueSpeaking];
        [self updateToolbarWithButton:@"pause"];
    }


    else {

        if(self.textView.text.length>0)
        {


            [self speakUtterance];

            [self updateToolbarWithButton:@"pause"];
        }
        else
        {
            [self updateToolbarWithButton:@"play"];

        }

    }

}
 - (void)applicationWillTerminate:(UIApplication *)application
{
    exit(0);

}

//text to speech method

- (void)speakUtterance
{


    if (audioEngine.isRunning) {

        NSLog(@"Running");
        recognitionRequest = nil;
        recognitionTask = nil;
        [speechStringsArray removeAllObjects];
        resultString = @"";



        [audioEngine stop];
        [inputNode removeTapOnBus:0];
        [recognitionRequest endAudio];


    }

    if (self.textView.text.length > 0) {
        NSLog(@"speakUtterance");
        didStartSpeaking = NO;
        textToSpeak = [NSString stringWithFormat:@"%@", self.textView.text];
        AVSpeechUtterance *utterance = [[AVSpeechUtterance alloc] initWithString:textToSpeak];
        utterance.rate = self.speedStepper.value;
       utterance.pitchMultiplier = self.pitchStepper.value;
        utterance.voice = self.voice;
        [self.synthesizer speakUtterance:utterance];
        [self displayBackgroundMediaFields];

      //  NSLog(@" after speaking:%@",self.textView.text);


    }

    else
    {
        [self updateToolbarWithButton:@"play"];

    }





}

- (void)displayBackgroundMediaFields
{
    MPMediaItemArtwork *artwork = [[MPMediaItemArtwork alloc] initWithImage:[UIImage imageNamed:@"Play"]];

    NSDictionary *info = @{ MPMediaItemPropertyTitle: self.textView.text,
                            MPMediaItemPropertyAlbumTitle: @"TextToSpeech App",
                            MPMediaItemPropertyArtwork: artwork};

    [MPNowPlayingInfoCenter defaultCenter].nowPlayingInfo = info;


}

- (void)updateToolbar
{
    if (self.synthesizer.speaking && !self.synthesizer.paused) {
        [self updateToolbarWithButton:@"pause"];
    }
    else {
        [self updateToolbarWithButton:@"play"];
    }


}

- (void)updateToolbarWithButton:(NSString *)buttonType
{


       NSLog(@"updateToolbarWithButton: %@", buttonType);
    UIBarButtonItem *audioControl;
    if ([buttonType isEqualToString:@"play"]) {
        // Play
        audioControl = [[UIBarButtonItem alloc]initWithBarButtonSystemItem:UIBarButtonSystemItemPlay target:self action:@selector(handlePlayPauseButton:)];
    }
    else {
        // Pause
        audioControl = [[UIBarButtonItem alloc]initWithBarButtonSystemItem:UIBarButtonSystemItemPause target:self action:@selector(handlePlayPauseButton:)];


    }

    UIBarButtonItem *flexibleItem = [[UIBarButtonItem alloc] initWithBarButtonSystemItem:UIBarButtonSystemItemFlexibleSpace target:nil action:nil];

    [self.toolbar setItems:@[flexibleItem, audioControl, flexibleItem]];
}



- (void)remoteControlReceivedWithEvent:(UIEvent *)receivedEvent
{
    NSLog(@"receivedEvent: %@", receivedEvent);
    if (receivedEvent.type == UIEventTypeRemoteControl) {

        switch (receivedEvent.subtype) {

            case UIEventSubtypeRemoteControlPlay:
                NSLog(@"UIEventSubtypeRemoteControlPlay");
                if (self.synthesizer.speaking) {
                    [self.synthesizer continueSpeaking];
                }
                else {
                    [self speakUtterance];
                }
                break;

            case UIEventSubtypeRemoteControlPause:
                NSLog(@"pause - UIEventSubtypeRemoteControlPause");

                if (self.pauseSettingSegmentedControl.selectedSegmentIndex == 0) {
                    // Pause immediately
                    [self.synthesizer pauseSpeakingAtBoundary:AVSpeechBoundaryImmediate];
                }
                else {
                    // Pause at end of current word
                    [self.synthesizer pauseSpeakingAtBoundary:AVSpeechBoundaryWord];
                }
                break;

            case UIEventSubtypeRemoteControlTogglePlayPause:
                if (self.synthesizer.paused) {
                    NSLog(@"UIEventSubtypeRemoteControlTogglePlayPause");
                    [self.synthesizer continueSpeaking];
                }
                else {
                    NSLog(@"UIEventSubtypeRemoteControlTogglePlayPause");
                    if (self.pauseSettingSegmentedControl.selectedSegmentIndex == 0) {
                        // Pause immediately
                        [self.synthesizer pauseSpeakingAtBoundary:AVSpeechBoundaryImmediate];
                    }
                    else {
                        // Pause at end of current word
                        [self.synthesizer pauseSpeakingAtBoundary:AVSpeechBoundaryWord];
                    }
                }
                break;

            case UIEventSubtypeRemoteControlNextTrack:
                NSLog(@"UIEventSubtypeRemoteControlNextTrack - appropriate for playlists");
                break;

            case UIEventSubtypeRemoteControlPreviousTrack:
                NSLog(@"UIEventSubtypeRemoteControlPreviousTrack - appropriate for playlists");
                break;

            default:
                break;
        }
    }
}

#pragma mark UIPickerViewDelegate Methods

- (NSInteger)numberOfComponentsInPickerView:(UIPickerView *)pickerView
{
    return 1;
}

- (NSInteger)pickerView:(UIPickerView *)pickerView numberOfRowsInComponent:(NSInteger)component
{
    return self.voices.count;
}

- (UIView *)pickerView:(UIPickerView *)pickerView viewForRow:(NSInteger)row forComponent:(NSInteger)component reusingView:(UIView *)view
{
    UILabel *rowLabel = [[UILabel alloc] init];
    NSDictionary *voice = [self.voices objectAtIndex:row];
    rowLabel.text = [voice objectForKey:@"label"];
    return rowLabel;
}

- (void)pickerView:(UIPickerView *)pickerView didSelectRow: (NSInteger)row inComponent:(NSInteger)component
{
    NSDictionary *voice = [self.voices objectAtIndex:row];
    NSLog(@"new picker voice selected with label: %@", [voice objectForKey:@"label"]);
    self.voice = [AVSpeechSynthesisVoice voiceWithLanguage:[voice objectForKey:@"voice"]];
}

#pragma mark SpeechSynthesizerDelegate methods

// In this delegate method, I call the speech-recognition method after the speech playback finishes.

- (void)speechSynthesizer:(AVSpeechSynthesizer *)synthesizer didFinishSpeechUtterance:(AVSpeechUtterance *)utterance
{
    // This is a workaround for a bug: the first time the voice is changed, speaking the utterance fails silently. We check that willSpeakRangeOfSpeechString: is called and set didStartSpeaking to YES there. If that method is not called (silent failure), we simply request to speak again.
    if (!didStartSpeaking) {
        [self speakUtterance];

    }
    else {
        [self updateToolbarWithButton:@"play"];


        // Here I check whether speech recognition is running or not; if not, I restart it after the voice (text-to-speech) has finished playing.

            if (!audioEngine.isRunning) {

                double delayInSeconds = 1.5;
                dispatch_time_t popTime = dispatch_time(DISPATCH_TIME_NOW, delayInSeconds * NSEC_PER_SEC);
                dispatch_after(popTime, dispatch_get_main_queue(), ^(void){


                  ///  NSLog(@"Not Running");

                    [self start_record];

                });


            }




    }


}

- (void)speechSynthesizer:(AVSpeechSynthesizer *)synthesizer willSpeakRangeOfSpeechString:(NSRange)characterRange utterance:(AVSpeechUtterance *)utterance
{
    didStartSpeaking = YES;

}

#pragma mark UITextViewDelegate Methods

#pragma mark Cleanup Methods

- (void)dealloc
{
    [[NSNotificationCenter defaultCenter] removeObserver:self name:@"updateToolbar" object:nil];
}

- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}
@end

In the code above, in the didFinishSpeechUtterance delegate method, after the speech (text-to-speech) finishes I call start_record, which starts speech recognition again. So after the audio has played, control reaches the record method, i.e.

         recognitionTask = [self.speechRecognizer recognitionTaskWithRequest:recognitionRequest resultHandler:^(SFSpeechRecognitionResult * result, NSError *  error1) {

At this line it throws an error like [Utility] +[AFAggregator logDictationFailedWithError:] error Domain=kAFAssistantErrorDomain Code=209 "(null)", and I don't know why.

Please help me solve this problem. Thanks in advance!


1 Answer

Stack Overflow user

Answered on 2021-03-14 00:32:32

I was getting this 209 error, and occasionally 203 as well. After a lot of experimenting I finally fixed it. What I did:

  1. Instead of calling [recogTask finish], I called [recogTask cancel].
  2. Then I waited in a while loop for recogTask to actually cancel before setting it to nil or taking any further steps. For example, I did: while (recogTask && recogTask.cancelled == NO) { [NSThread sleepForTimeInterval:0.1]; }

With this I no longer get any 209 or 203 error codes. I also stop recognition for 2 seconds every 20 seconds; see whether you can narrow that 2-second gap, though.

Thanks.
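[Editor's note] The two steps above can be sketched as follows. This is a hypothetical sketch, not code from the answer: the helper name restartRecognition is invented, and recogTask, audioEngine, inputNode, and start_record stand in for the corresponding variables and method in the question's code.

```objectivec
// Sketch of the cancel-and-wait restart described in the answer.
// Assumes recogTask is the current SFSpeechRecognitionTask and
// start_record re-creates the request and task, as in the question.
- (void)restartRecognition
{
    // 1. Cancel the task instead of finishing it.
    [recogTask cancel];

    // 2. Wait until the task reports it is actually cancelled before
    //    releasing it or starting a new session. (Busy-waiting on the
    //    main thread is crude; polling on a background queue or using
    //    the task delegate callbacks would be safer in production.)
    while (recogTask && recogTask.cancelled == NO) {
        [NSThread sleepForTimeInterval:0.1];
    }
    recogTask = nil;

    // 3. Only now tear down the audio tap and start a fresh session.
    [audioEngine stop];
    [inputNode removeTapOnBus:0];
    [self start_record];
}
```

The point of step 2 appears to be that error 209 occurs when a new recognition request is started while the previous task has not yet finished cancelling; waiting for cancelled to become YES before restarting avoids that race.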

Votes: 1
Original page content provided by Stack Overflow; translation supported by Tencent Cloud's engine.
Original link:

https://stackoverflow.com/questions/42145167
