JavaScript: Upload a binary file to S3 using the AWS SDK for Node.js
Disclaimer: This page is a translated copy of a popular StackOverflow question and its answers, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license and attribute it to the original authors (not me): StackOverflow
Original URL: http://stackoverflow.com/questions/13807339/
Upload a binary file to S3 using AWS SDK for Node.js
Asked by isNaN1247
Update: For future reference, Amazon has since updated the documentation from what was there at the time of asking. As per @Loren Segal's comment below:
We've corrected the docs in the latest preview release to document this parameter properly. Sorry about the mixup!
I'm trying out the developer preview of the AWS SDK for Node.js and want to upload a zipped tarball to S3 using putObject.
According to the documentation, the Body parameter should be...
Body - (Base64 Encoded Data)
...therefore, I'm trying out the following code...
var AWS = require('aws-sdk'),
    fs = require('fs');

// For dev purposes only
AWS.config.update({ accessKeyId: 'key', secretAccessKey: 'secret' });

// Read in the file, convert it to base64, store to S3
fs.readFile('myarchive.tgz', function (err, data) {
  if (err) { throw err; }

  var base64data = new Buffer(data, 'binary').toString('base64');

  var s3 = new AWS.S3();
  s3.client.putObject({
    Bucket: 'mybucketname',
    Key: 'myarchive.tgz',
    Body: base64data
  }).done(function (resp) {
    console.log('Successfully uploaded package.');
  });
});
Whilst I can then see the file in S3, if I download it and attempt to decompress it I get an error that the file is corrupted. Therefore it seems that my method for 'base64 encoded data' is off.
Can someone please help me to upload a binary file using putObject?
Accepted answer by AndyD
You don't need to convert the buffer to a base64 string. Just set Body to the buffer (the data argument from fs.readFile) and it will work.
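Applying that advice to the question's code gives a minimal sketch like the one below. The bucket and key names are the placeholders from the question, and the plain callback style used in the later answers is assumed here instead of the preview SDK's .done() helper.

var AWS = require('aws-sdk'),
    fs = require('fs');

fs.readFile('myarchive.tgz', function (err, data) {
  if (err) { throw err; }

  // Pass the raw Buffer straight through; no base64 conversion needed
  var s3 = new AWS.S3();
  s3.putObject({
    Bucket: 'mybucketname',
    Key: 'myarchive.tgz',
    Body: data
  }, function (err) {
    if (err) { throw err; }
    console.log('Successfully uploaded package.');
  });
});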
Answered by CaptEmulation
Here is a way to send a file using streams, which might be necessary for large files and will generally reduce memory overhead:
var AWS = require('aws-sdk'),
    fs = require('fs');

// For dev purposes only
AWS.config.update({ accessKeyId: 'key', secretAccessKey: 'secret' });

// Stream the file directly to S3 instead of buffering it in memory
var fileStream = fs.createReadStream('myarchive.tgz');
fileStream.on('error', function (err) {
  if (err) { throw err; }
});
fileStream.on('open', function () {
  var s3 = new AWS.S3();
  s3.putObject({
    Bucket: 'mybucketname',
    Key: 'myarchive.tgz',
    Body: fileStream
  }, function (err) {
    if (err) { throw err; }
  });
});
Answered by shaun
I was able to upload my binary file this way.
// fs and s3 are assumed to be set up as in the earlier answers;
// s3bucket and s3key hold the target bucket name and object key
var fileStream = fs.createReadStream("F:/directory/fileName.ext");
var putParams = {
  Bucket: s3bucket,
  Key: s3key,
  Body: fileStream
};
s3.putObject(putParams, function (putErr, putData) {
  if (putErr) {
    console.error(putErr);
  } else {
    console.log(putData);
  }
});

