I want to concatenate two AudioBuffers into one.
As far as I can tell, this should be possible without using MediaRecorder, since I don't need to record in real time.
This is how I do it at the moment:
```
concatAudio(buffers: AudioBuffer[]): void {
  const totalLength = buffers.reduce((acc, buffer) => acc + buffer.length, 0);
  const audioContext = new AudioContext();
  const audioBuffer = audioContext.createBuffer(1, totalLength, 48000);
  const src = audioContext.createBufferSource();
  const dst = audioContext.createMediaStreamDestination();
  let offset = 0;
  buffers.forEach((buffer) => {
    audioBuffer.getChannelData(0).set(buffer.getChannelData(0), offset);
    offset += buffer.length;
  });
  src.buffer = audioBuffer;
  src.connect(dst);
  src.start();
  this.recordAudio(dst.stream, audioBuffer.duration);
}
```

In recordAudio() I feed the stream into a MediaRecorder.
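The offset/`set` copy loop in concatAudio() can be exercised on plain typed arrays as well. A minimal sketch of the same logic (the helper name `concatFloat32` is mine, not from the post):

```javascript
// Concatenate several Float32Array chunks using the same
// running-offset pattern as concatAudio() above.
function concatFloat32(chunks) {
  const totalLength = chunks.reduce((acc, c) => acc + c.length, 0);
  const out = new Float32Array(totalLength);
  let offset = 0;
  for (const c of chunks) {
    out.set(c, offset); // copy chunk into place
    offset += c.length; // advance past the copied samples
  }
  return out;
}
```

In the browser, each chunk would be one buffer's `getChannelData(0)`.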
Recording the new stream takes as long as the two AudioBuffers take to play back in sequence, because MediaRecorder captures in real time.
Is there another way?
Thanks.
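For context: since MediaRecorder can only encode what it captures in real time, one way to avoid the wait entirely is to skip MediaRecorder and encode the concatenated PCM samples yourself, e.g. into a WAV container. This is an alternative approach not from the original post; the helper name is illustrative. A minimal sketch for mono 16-bit PCM:

```javascript
// Encode mono Float32 samples as a 16-bit PCM WAV file.
// Runs instantly on the raw samples, no real-time capture needed.
function encodeWav(samples, sampleRate) {
  const dataSize = samples.length * 2;           // 16-bit = 2 bytes per sample
  const buffer = new ArrayBuffer(44 + dataSize); // 44-byte RIFF/WAVE header
  const view = new DataView(buffer);
  const writeString = (offset, s) => {
    for (let i = 0; i < s.length; i++) view.setUint8(offset + i, s.charCodeAt(i));
  };
  writeString(0, 'RIFF');
  view.setUint32(4, 36 + dataSize, true);        // RIFF chunk size
  writeString(8, 'WAVE');
  writeString(12, 'fmt ');
  view.setUint32(16, 16, true);                  // fmt chunk size
  view.setUint16(20, 1, true);                   // audio format: PCM
  view.setUint16(22, 1, true);                   // channels: mono
  view.setUint32(24, sampleRate, true);
  view.setUint32(28, sampleRate * 2, true);      // byte rate
  view.setUint16(32, 2, true);                   // block align
  view.setUint16(34, 16, true);                  // bits per sample
  writeString(36, 'data');
  view.setUint32(40, dataSize, true);
  for (let i = 0; i < samples.length; i++) {
    const s = Math.max(-1, Math.min(1, samples[i])); // clamp to [-1, 1]
    view.setInt16(44 + i * 2, s < 0 ? s * 0x8000 : s * 0x7fff, true);
  }
  return buffer;
}
```

In the browser you would pass `audioBuffer.getChannelData(0)` and the buffer's sample rate, then wrap the result in `new Blob([wav], { type: 'audio/wav' })`. The trade-off is file size: WAV is uncompressed, unlike the Opus stream MediaRecorder produces.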
Posted on 2021-03-20 02:30:55
There is an experimental way: record the buffers in parallel, then glue the small chunks together. You can split the audio buffer into as many chunks as you need, down to the encoder's minimum chunk length. It works in Chrome, but not in Firefox.
Don't use this in production under any circumstances. (Shared purely in the spirit of experimentation.)
```
// try splitting the buffer into N parts, recording in parallel, generating a blob
async function recordParallel() {
  const audioContext = new AudioContext();
  const bufs = [], all = []
  for (let i = 0; i < 10; i++) {
    // 2705 is the min chunk length for the opus encoder in Chrome,
    // so we increase the block size to 4096 plus a silent header
    let buf = new AudioBuffer({ length: 4096, sampleRate: audioContext.sampleRate })
    bufs.push(buf)
    let data = buf.getChannelData(0)
    for (let j = 0; j < 4096; j++) data[j] = Math.sin(j / ((i + 1) * 2))
    // create recorders
    const source = audioContext.createBufferSource();
    source.buffer = buf;
    const chunks = []
    all.push(new Promise(r => {
      const dest = audioContext.createMediaStreamDestination();
      const recorder = new MediaRecorder(dest.stream);
      source.connect(dest)
      recorder.start(10)
      // delay is needed to shift encoding blocks
      source.start(0)
      recorder.ondataavailable = (e) => {
        const blob = e.data
        if (blob.size) chunks.push(blob)
      }
      recorder.onstop = e => {
        r(chunks)
      }
      source.onended = e => {
        recorder.stop()
      }
    }))
  }
  const blobs = await Promise.all(all);
  // combine the chunks from the multiple recorders back into one blob
  console.log(blobs)
  var blob = new Blob([...blobs[0], ...blobs.slice(1).map(b => b.slice(1)).flat()], { 'type': 'audio/ogg; codecs=opus' });
  let audio = document.createElement('audio')
  audio.src = URL.createObjectURL(blob);
  audio.play()
}
```

https://stackoverflow.com/questions/63686468