
Why does my multi-channel mixer no longer play in iOS 8?

Stack Overflow user
Asked on 2015-01-05 10:30:43
1 answer · 424 views · 0 followers · 0 votes

I wrote some code that plays general multi-instrument MIDI files on iOS. It works fine on iOS 7 but stopped working on iOS 8.

I have stripped it down to its essentials here. Instead of creating 16 channels for my multi-channel mixer, I create just one sampler node and map all the tracks onto that one channel. It exhibits the same problem as the multi-sampler version. None of the Audio Toolbox calls return an error code on either iOS 7 or iOS 8 (they all return 0). On iOS 7 the sequence plays through the speakers, both in the Simulator and on iPhone/iPad hardware. Running exactly the same code on the iOS 8 Simulator or on iPhone/iPad hardware produces no sound.

If I comment out the call to [self initGraphFromMIDISequence], it plays the default sine-wave sound on iOS 8.

@implementation MyMusicPlayer {
    MusicPlayer _musicPlayer;
    MusicSequence _musicSequence;
    AUGraph _processingGraph;
}

- (void)playMidi:(NSURL*)midiFileURL {
    NewMusicSequence(&_musicSequence);
    // Use a __bridge cast here; CFBridgingRetain would leak the URL.
    MusicSequenceFileLoad(_musicSequence, (__bridge CFURLRef)midiFileURL, 0, 0);

    NewMusicPlayer(&_musicPlayer);
    MusicPlayerSetSequence(_musicPlayer, _musicSequence);

    [self initGraphFromMIDISequence];

    MusicPlayerPreroll(_musicPlayer);
    MusicPlayerStart(_musicPlayer);
}

// Sets up an AUGraph with one channel whose instrument is loaded from a sound bank.
// Maps all the tracks of the MIDI sequence onto that channel.  Basically this is a
// way to replace the default sine-wave sound with another (single) instrument.
- (void)initGraphFromMIDISequence {
    NewAUGraph(&_processingGraph);

    // Add one sampler unit to the graph.
    AUNode samplerNode;
    AudioComponentDescription cd = {};
    cd.componentManufacturer = kAudioUnitManufacturer_Apple;
    cd.componentType = kAudioUnitType_MusicDevice;
    cd.componentSubType = kAudioUnitSubType_Sampler;
    AUGraphAddNode(_processingGraph, &cd, &samplerNode);

    // Add a Mixer unit node to the graph
    cd.componentType = kAudioUnitType_Mixer;
    cd.componentSubType = kAudioUnitSubType_MultiChannelMixer;
    AUNode mixerNode;
    AUGraphAddNode(_processingGraph, &cd, &mixerNode);

    // Add the Output unit node to the graph
    cd.componentType = kAudioUnitType_Output;
    cd.componentSubType = kAudioUnitSubType_RemoteIO; // Output to speakers.
    AUNode ioNode;
    AUGraphAddNode(_processingGraph, &cd, &ioNode);

    AUGraphOpen(_processingGraph);

    // Obtain the mixer unit instance from its corresponding node, and set the bus count to 1.
    AudioUnit mixerUnit;
    AUGraphNodeInfo(_processingGraph, mixerNode, NULL, &mixerUnit);
    UInt32 const numChannels = 1;
    AudioUnitSetProperty(mixerUnit,
                         kAudioUnitProperty_ElementCount,
                         kAudioUnitScope_Input,
                         0,
                         &numChannels,
                         sizeof(numChannels));

    // Connect the sampler node's output 0 to the mixer node's input 0.
    AUGraphConnectNodeInput(_processingGraph, samplerNode, 0, mixerNode, 0);

    // Connect the mixer unit to the output unit.
    AUGraphConnectNodeInput(_processingGraph, mixerNode, 0, ioNode, 0);

    // Obtain reference to the audio unit from its node.
    AudioUnit samplerUnit;
    AUGraphNodeInfo(_processingGraph, samplerNode, NULL, &samplerUnit);
    MusicSequenceSetAUGraph(_musicSequence, _processingGraph);

    // Set the destination for each track to our single sampler node.
    UInt32 trackCount;
    MusicSequenceGetTrackCount(_musicSequence, &trackCount);
    MusicTrack track;
    for (UInt32 i = 0; i < trackCount; i++) {
      MusicSequenceGetIndTrack(_musicSequence, i, &track);
      MusicTrackSetDestNode(track, samplerNode);
    }

    // You can use either a DLS or an SF2 file bundled with your app; both work in iOS 7.
    //NSString *soundBankPath = [[NSBundle mainBundle] pathForResource:@"GeneralUserv1.44" ofType:@"sf2"];
    NSString *soundBankPath = [[NSBundle mainBundle] pathForResource:@"gs_instruments" ofType:@"dls"];
    NSURL *bankURL = [NSURL fileURLWithPath:soundBankPath];
    UInt8 instrumentNumber = 46;  // pick any GM instrument, 0-127
    AUSamplerBankPresetData bpdata;
    bpdata.bankURL  = (__bridge CFURLRef) bankURL;
    bpdata.bankMSB  = kAUSampler_DefaultMelodicBankMSB;
    bpdata.bankLSB  = kAUSampler_DefaultBankLSB;
    bpdata.presetID = instrumentNumber;
    AudioUnitSetProperty(samplerUnit,
                         kAUSamplerProperty_LoadPresetFromBank,
                         kAudioUnitScope_Global,
                         0,
                         &bpdata,
                         sizeof(bpdata));
}

I have some code, not included here, that polls the sequence by calling MusicPlayerGetTime on the MusicPlayer instance to see whether the sequence is still playing. On iOS 7, each call returns the number of seconds elapsed since playback started. On iOS 8, the call always returns 0, which presumably means the MusicPlayer never starts playing the sequence when MusicPlayerStart is called.
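The polling code is not included in the question; a minimal sketch of what it might look like follows (the method name pollPlayback and the logging are assumptions; MusicPlayerGetTime is the AudioToolbox call the question names):

```objc
// Hypothetical sketch, not from the question: poll the player's current
// position.  MusicPlayerGetTime reports the sequence time as a
// MusicTimeStamp; the question observes it advancing on iOS 7 and stuck
// at 0 on iOS 8.
- (void)pollPlayback {
    MusicTimeStamp now = 0;
    OSStatus err = MusicPlayerGetTime(_musicPlayer, &now);
    if (err == noErr) {
        NSLog(@"sequence time: %f", now);
    }
}
```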

The code above is highly order-dependent: certain calls must come before others. For example, the graph must be opened before calling AUGraphNodeInfo on a node, and the instrument must not be loaded before the tracks are assigned to the channel. I followed all the advice in other StackOverflow threads, and I verified that getting the order right makes the error codes go away.

Is there an iOS MIDI expert who knows what might have changed between iOS 7 and iOS 8 to make this code stop working?


1 Answer

Stack Overflow user

Answered on 2015-01-05 13:51:17

In iOS 8, Apple introduced a neat Objective-C abstraction over the Core Audio API: AVAudioEngine. You should probably check it out. https://developer.apple.com/videos/wwdc/2014/#502
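For the question's specific case (playing a MIDI file through a sound bank), the AVFoundation layer the answer points to also includes AVMIDIPlayer, new in iOS 8. A minimal sketch, assuming the same bundled files as the question (the file names are assumptions):

```objc
// Hypothetical sketch, not part of the answer: AVMIDIPlayer replaces the
// whole MusicPlayer/AUGraph setup for the "MIDI file + sound bank" case.
NSURL *midiURL = [[NSBundle mainBundle] URLForResource:@"song"
                                         withExtension:@"mid"];
NSURL *bankURL = [[NSBundle mainBundle] URLForResource:@"gs_instruments"
                                         withExtension:@"dls"];
NSError *error = nil;
AVMIDIPlayer *player = [[AVMIDIPlayer alloc] initWithContentsOfURL:midiURL
                                                      soundBankURL:bankURL
                                                             error:&error];
if (player) {
    [player prepareToPlay];
    [player play:^{ NSLog(@"playback finished"); }];
} else {
    NSLog(@"failed to create player: %@", error);
}
```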

Votes: 1
The original content of this page is provided by Stack Overflow; translation supported by Tencent Cloud.
Original link:

https://stackoverflow.com/questions/27772680
