Node.js: convert a stream into a buffer?
Note: this page is an English–Chinese translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you reuse it, you must follow the same CC BY-SA license, link to the original question, and attribute it to the original authors (not me): StackOverflow
原文地址: http://stackoverflow.com/questions/19906488/
Convert stream into buffer?
Asked by sam100rav
How do I convert a stream into a buffer in Node.js? Here is my code to parse an uploaded file in a POST request in Express.
app.post('/upload', express.multipart({
  defer: true
}), function(req, res) {
  req.form.on('part', function(part) {
    // Here I want to convert the streaming part into a buffer
    // and do some buffer-specific task.
    var out = fs.createWriteStream('image/' + part.filename);
    part.pipe(out);
  });
  req.form.on('close', function() {
    res.send('uploaded!');
  });
});
Accepted answer by Paul Mougel
You can use the stream-to module, which can convert a readable stream's data into an array or a buffer:
var streamTo = require('stream-to');

req.form.on('part', function (part) {
  streamTo.buffer(part, function (err, buffer) {
    // Insert your business logic here
  });
});
If you want a better understanding of what's happening behind the scenes, you can implement the logic yourself using a Writable stream. As a writable stream implementor, you only have to define one function: the _write method, which is called every time some data is written to the stream (and must invoke its callback once the chunk has been handled). When the input stream has finished emitting data, the writable side emits the finish event; we'll then create a single buffer using the Buffer.concat method.
var stream = require('stream');

var converter = new stream.Writable();
converter.data = []; // We'll store all the received chunks in this array
converter._write = function (chunk, encoding, callback) {
  this.data.push(chunk);
  callback(); // Signal that this chunk has been processed
};
converter.on('finish', function () { // Emitted when the input stream has ended, i.e. no more data will be provided
  var b = Buffer.concat(this.data); // Create a single buffer from all the received chunks
  // Insert your business logic here
});
part.pipe(converter); // Feed the incoming part into the converter
Answer by robertklep
Instead of piping, you can attach data and end event handlers to the part stream to read it:
var buffers = [];
part.on('data', function(buffer) {
  buffers.push(buffer);
});
part.on('end', function() {
  var buffer = Buffer.concat(buffers);
  // ...do your stuff...
  // write to file:
  fs.writeFile('image/' + part.filename, buffer, function(err) {
    // handle error, return response, etc...
  });
});
However, this will read the entire upload into memory. If that's an issue, you might want to consider creating a custom transform stream to process the incoming data chunk by chunk, but that might not be trivial.

