I have an API method that, when called with an array of file keys, downloads them from S3. Instead of downloading them to disk, I want to stream them, zip the files, and return the archive to the client.
This is what my current code looks like:
reports.get('/xxx/:filenames', async (req, res) => {
  const AWS = require('aws-sdk');
  const fs = require('fs');
  const s3 = new AWS.S3();
  const strArray = req.params.filenames.split(',');
  for (const name of strArray) {
    const filename = name.trim();
    const localFileName = './' + filename;
    const params = {
      Bucket: config.reportBucket,
      Key: filename
    };
    // Downloads each object to a local file -- this is what I want to avoid.
    const file = fs.createWriteStream(localFileName);
    s3.getObject(params).createReadStream()
      .on('error', (err) => console.error(err))
      .pipe(file);
  }
});
How can I stream the files instead of downloading them to disk, and how do I zip them to return them to the client?
Posted on 2019-01-28 23:53:22
The main problem is zipping multiple files.
More specifically, bulk-downloading them from Amazon Web Services S3.
I searched the AWS SDK and found no bulk S3 operation.
That brings us to a possible solution:
Load the files one by one, store them in a folder, and then compress that folder with the zip-folder package.
Here is a raw, untested example, but it should give you the idea:
// Always import packages at the beginning of the file.
const AWS = require('aws-sdk');
const fs = require('fs');
const zipFolder = require('zip-folder');
const s3 = new AWS.S3();

reports.get('/xxx/:filenames', async (req, res) => {
  const filesArray = req.params.filenames.split(',');
  for (const fileName of filesArray) {
    const key = fileName.trim();
    const localFileName = './' + key;
    const params = {
      Bucket: config.reportBucket,
      Key: key
    };
    // You'll probably need some Promise logic here, to wait for each
    // stream operation to end before moving on.
    const fileStream = fs.createWriteStream(localFileName);
    s3.getObject(params).createReadStream().pipe(fileStream);
  }
  // After that, all required files are in some target folder.
  // Now you need to compress the folder and send it back to the user.
  // We wrap the callback in a Promise so the code reads in a "sync" way.
  await new Promise((resolve, reject) =>
    zipFolder('/path/to/the/folder', '/path/to/archive.zip',
      (err) => (err ? reject(err) : resolve())));
  // And now you can send the zipped folder to the user (also using streams).
  fs.createReadStream('/path/to/archive.zip').pipe(res);
});
Note: because of how streams behave, you may run into some async-behaviour problems, so first check that all the files are actually stored in the folder before zipping it.
By the way, I haven't tested this code. So if anything goes wrong, let's debug it together.
https://stackoverflow.com/questions/54404046