I'm currently building a browser-based audio editor, and I'm using ffmpeg.wasm (a pure WebAssembly/JavaScript port of FFmpeg) to do it.
I'm working from this excellent example, which lets you upload a video file and convert it to a gif:
import React, { useState, useEffect } from 'react';
import './App.css';
import { createFFmpeg, fetchFile } from '@ffmpeg/ffmpeg';

const ffmpeg = createFFmpeg({ log: true });

function App() {
  const [ready, setReady] = useState(false);
  const [video, setVideo] = useState();
  const [gif, setGif] = useState();

  const load = async () => {
    await ffmpeg.load();
    setReady(true);
  }

  useEffect(() => {
    load();
  }, [])

  const convertToGif = async () => {
    // Write the file to memory
    ffmpeg.FS('writeFile', 'test.mp4', await fetchFile(video));

    // Run the FFmpeg command
    await ffmpeg.run('-i', 'test.mp4', '-t', '2.5', '-ss', '2.0', '-f', 'gif', 'out.gif');

    // Read the result
    const data = ffmpeg.FS('readFile', 'out.gif');

    // Create a URL
    const url = URL.createObjectURL(new Blob([data.buffer], { type: 'image/gif' }));
    setGif(url)
  }

  return ready ? (
    <div className="App">
      { video && <video
        controls
        width="250"
        src={URL.createObjectURL(video)}>
      </video>}

      <input type="file" onChange={(e) => setVideo(e.target.files?.item(0))} />

      <h3>Result</h3>
      <button onClick={convertToGif}>Convert</button>
      { gif && <img src={gif} width="250" />}
    </div>
  ) : (
    <p>Loading...</p>
  );
}

export default App;

I modified the code above to take an mp3 file recorded in the browser (recorded with the npm package mic-recorder-to-mp3 and passed to this component as a blobURL in global state) and do some manipulation on it with ffmpeg.wasm:
import React, { useContext, useState, useEffect } from 'react';
import Context from '../../store/Context';
import Toolbar from '../Toolbar/Toolbar';
import AudioTranscript from './AudioTranscript';
import { createFFmpeg, fetchFile } from '@ffmpeg/ffmpeg';

//Create ffmpeg instance and set 'log' to true so we can see everything
//it does in the console
const ffmpeg = createFFmpeg({ log: true });

const AudioEditor = () => {
  //Setup global state and get the most recent recording
  const { globalState } = useContext(Context);
  const { blobURL } = globalState;

  //ready flag for when ffmpeg is loaded
  const [ready, setReady] = useState(false);
  const [outputFileURL, setOutputFileURL] = useState('');

  //Load FFmpeg asynchronously and set ready when it's done
  const load = async () => {
    await ffmpeg.load();
    setReady(true);
  }

  //Use useEffect to run the 'load' function on mount
  useEffect(() => {
    load();
  }, []);

  const ffmpegTest = async () => {
    //must first write the file to memory as test.mp3
    ffmpeg.FS('writeFile', 'test.mp3', await fetchFile(blobURL));

    //Run the FFmpeg command
    //in this case, trim the file down to 1.5s and save it to memory as output.mp3
    ffmpeg.run('-i', 'test.mp3', '-t', '1.5', 'output.mp3');

    //Read the result from memory
    const data = ffmpeg.FS('readFile', 'output.mp3');

    //Create a URL so it can be used in the browser
    const url = URL.createObjectURL(new Blob([data.buffer], { type: 'audio/mp3' }));
    setOutputFileURL(url);
  }

  return ready ? (
    <div>
      <AudioTranscript />
      <Toolbar />
      <button onClick={ffmpegTest}>
        Edit
      </button>
      {outputFileURL &&
        <audio
          controls="controls"
          src={outputFileURL || ""}
        />
      }
    </div>
  ) : (
    <div>
      Loading...
    </div>
  )
}

export default AudioEditor;

When I press the Edit button to call the ffmpegTest function, this code returns the following error:

I've experimented with this: when I change the culprit line to

    const data = ffmpeg.FS('readFile', 'test.mp3');

the function runs without error and simply returns the input file. So I assume something is wrong with the ffmpeg.run() line, and that 'output.mp3' is never being stored in memory? I can't for the life of me figure it out, so any help would be greatly appreciated!
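One general way to confirm a suspicion like this (a sketch, not from the original post): since ffmpeg.FS('readFile', ...) throws when a file is missing, you can wrap the read in a try/catch to probe whether the output file was ever written. The fileExists helper below is hypothetical; it takes any read function that throws on a missing file, such as (name) => ffmpeg.FS('readFile', name).

```javascript
// Hypothetical helper: probe whether a file exists in an in-memory
// filesystem whose read function throws when the file is missing
// (e.g. (name) => ffmpeg.FS('readFile', name) in ffmpeg.wasm).
function fileExists(readFn, name) {
  try {
    readFn(name);   // throws if the file was never written
    return true;
  } catch (e) {
    return false;
  }
}
```

Calling fileExists right after the ffmpeg.run() line in the component above would report false for 'output.mp3', which is exactly the symptom described.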
Posted on 2021-02-25 22:26:32
Fixed it...
It turns out I needed to put an 'await' before ffmpeg.run(). Without it, the next line,

    const data = ffmpeg.FS('readFile', 'output.mp3');

runs before output.mp3 has been generated and stored in memory.
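The timing issue can be reproduced without ffmpeg at all. In the sketch below (all names are made up for illustration), mockRun() stands in for ffmpeg.run(): it only writes its output file once its promise resolves, so reading immediately without await fails, while awaiting first succeeds.

```javascript
// Mock in-memory filesystem and an async "command" that writes its
// output only when its promise resolves, mimicking ffmpeg.run().
const memFS = {};

function mockRun(outName) {
  // Resolves on a later microtask, like a real async command.
  return Promise.resolve().then(() => {
    memFS[outName] = new Uint8Array([1, 2, 3]);
  });
}

function readFile(name) {
  if (!(name in memFS)) throw new Error(`${name}: file not found`);
  return memFS[name];
}

async function withoutAwait() {
  mockRun('output.mp3');          // fire-and-forget: promise still pending
  return readFile('output.mp3');  // runs first -> throws
}

async function withAwait() {
  await mockRun('output2.mp3');   // wait for the "command" to finish
  return readFile('output2.mp3'); // file exists now
}
```

withoutAwait() rejects with "file not found", just like the component before the fix; withAwait() resolves with the file's bytes.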
https://stackoverflow.com/questions/66373403