I've built a video chat app for groups on iOS. I've been looking for a way to control the volume of each participant individually. I found that isPlaybackEnabled on RemoteAudioTrack lets me mute and unmute, but it can't control volume.
I was also wondering whether I could route the audio through AVAudioPlayer. I found addSink. Here is what I tried:
class Audio: NSObject, AudioSink {
    var a = 1

    func renderSample(_ audioSample: CMSampleBuffer!) {
        print("audio found", a)
        a += 1

        var audioBufferList = AudioBufferList()
        var data = Data()
        var blockBuffer: CMBlockBuffer?

        CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
            audioSample,
            bufferListSizeNeededOut: nil,
            bufferListOut: &audioBufferList,
            bufferListSize: MemoryLayout<AudioBufferList>.size,
            blockBufferAllocator: nil,
            blockBufferMemoryAllocator: nil,
            flags: 0,
            blockBufferOut: &blockBuffer)

        let buffers = UnsafeBufferPointer<AudioBuffer>(
            start: &audioBufferList.mBuffers,
            count: Int(audioBufferList.mNumberBuffers))

        for audioBuffer in buffers {
            let frame = audioBuffer.mData?.assumingMemoryBound(to: UInt8.self)
            data.append(frame!, count: Int(audioBuffer.mDataByteSize))
        }

        let player = try! AVAudioPlayer(data: data) // crash here
        player.play()
    }
}
But it crashes on let player = try! AVAudioPlayer(data: data).
Edit:
This is the error: Fatal error: 'try!' expression unexpectedly raised an error: Error Domain=NSOSStatusErrorDomain Code=-39 "(null)".
This is data, so I guess it isn't being converted:
▿ 0 bytes
- count : 0
▿ pointer : 0x000000016d7ae160
- pointerValue : 6131736928
- bytes : 0 elements
And this is audioSample:
<CMAudioFormatDescription 0x2815a3de0 [0x1bb2ef830]> {
mediaType:'soun'
mediaSubType:'lpcm'
mediaSpecific: {
ASBD: {
mSampleRate: 16000.000000
mFormatID: 'lpcm'
mFormatFlags: 0xc
mBytesPerPacket: 2
mFramesPerPacket: 1
mBytesPerFrame: 2
mChannelsPerFrame: 1
mBitsPerChannel: 16 }
cookie: {(null)}
ACL: {(null)}
FormatList Array: {(null)}
}
extensions: {(null)}
}
Posted on 2019-08-05 09:21:10
You can get the complete data buffer from the CMSampleBuffer and convert it to Data:
let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer)
let blockBufferDataLength = CMBlockBufferGetDataLength(blockBuffer!)
var blockBufferData = [UInt8](repeating: 0, count: blockBufferDataLength)
let status = CMBlockBufferCopyDataBytes(blockBuffer!, atOffset: 0, dataLength: blockBufferDataLength, destination: &blockBufferData)
guard status == noErr else { return }
let data = Data(bytes: blockBufferData, count: blockBufferDataLength)
Also see the AVAudioPlayer overview:
Use this class for audio playback unless you are playing audio captured from a network stream or require very low I/O latency.
So I don't think it will work for you. You are better off using AVAudioEngine or Audio Queue Services.
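The AVAudioEngine route also answers the original question about per-participant volume: give each remote track its own AVAudioPlayerNode and adjust that node's volume. Below is a minimal sketch, assuming the sink delivers 16 kHz mono 16-bit interleaved LPCM as shown in the format description above; ParticipantAudioRenderer and frameCount are hypothetical names I introduce here, not part of any SDK:

```swift
import AVFoundation

// Hypothetical per-participant renderer. One instance per remote track;
// the player node's volume gives independent volume control.
final class ParticipantAudioRenderer {
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()
    // Matches the ASBD in the question: 16 kHz, mono, 16-bit interleaved LPCM.
    private let format = AVAudioFormat(commonFormat: .pcmFormatInt16,
                                       sampleRate: 16_000,
                                       channels: 1,
                                       interleaved: true)!

    /// Per-participant volume, 0.0 ... 1.0 — the control the question asks for.
    var volume: Float {
        get { player.volume }
        set { player.volume = newValue }
    }

    init() throws {
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: format)
        try engine.start()
        player.play()
    }

    /// Pure helper: number of frames held in `byteCount` bytes of PCM.
    static func frameCount(forByteCount byteCount: Int, bytesPerFrame: Int) -> Int {
        byteCount / bytesPerFrame
    }

    /// Call this from the sink's renderSample(_:) instead of building an AVAudioPlayer.
    func render(_ sampleBuffer: CMSampleBuffer) {
        guard let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) else { return }
        let length = CMBlockBufferGetDataLength(blockBuffer)
        let frames = AVAudioFrameCount(
            Self.frameCount(forByteCount: length, bytesPerFrame: 2)) // mBytesPerFrame: 2
        guard let pcm = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frames) else { return }
        pcm.frameLength = frames
        // Copy the raw samples straight into the PCM buffer and schedule it.
        CMBlockBufferCopyDataBytes(blockBuffer, atOffset: 0,
                                   dataLength: length,
                                   destination: pcm.int16ChannelData![0])
        player.scheduleBuffer(pcm, completionHandler: nil)
    }
}
```

Unlike AVAudioPlayer, the player node consumes headerless PCM buffers directly, so no file container is needed.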
Posted on 2019-07-29 19:00:46
Try saving the audio to a file first and then playing the sound. That worked for me.
func playMusic() {
    // Modern Swift equivalents of the original NSBundle/NSData calls.
    let url = Bundle.main.url(forResource: "Audio", withExtension: "mp3")!
    let data = try! Data(contentsOf: url)
    try! AVAudioSession.sharedInstance().setCategory(.playback)
    try! AVAudioSession.sharedInstance().setActive(true)
    audioPlayer = try! AVAudioPlayer(data: data, fileTypeHint: AVFileType.mp3.rawValue)
    audioPlayer.prepareToPlay()
    audioPlayer.play()
}
https://stackoverflow.com/questions/57075466