Node.js HTTP response streams

Disclaimer: this page is a mirror of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must attribute it to the original authors (not me): StackOverflow. Original question: http://stackoverflow.com/questions/19580305/


Node.js HTTP response streams

Tags: javascript, node.js, gzip

Asked by dzm

Using the native http.get() in Node.js, I'm trying to pipe an HTTP response to a stream that I can bind data and end events to.


I'm currently handling this for gzip data, using:


var zlib = require('zlib');

http.get(url, function(res) {
  if (res.headers['content-encoding'] === 'gzip') {
    var gunzip = zlib.createGunzip();
    res.pipe(gunzip);
    gunzip.on('data', dataCallback);
    gunzip.on('end', endCallback);
  }
});

Gunzip is a stream, and this just works. I've tried to create streams (write streams, then read streams) and pipe the response, but haven't had much luck. Any suggestions for replicating the same approach for non-gzipped content?


Answered by hexacyanide

The response object from an HTTP request is an instance of a readable stream. Therefore, you can collect the data with the data event, then use it when the end event fires.


var http = require('http');
var body = '';

http.get(url, function(res) {
  res.on('data', function(chunk) {
    body += chunk;
  });
  res.on('end', function() {
    // all data has been downloaded
  });
});

readable.pipe(dest) would do essentially the same thing, if body in the example above were a writable stream.


Answered by The Fool

Nowadays the recommended way to pipe is the pipeline function, which is designed to protect you from memory leaks by cleaning up the streams on error.


const { createReadStream } = require('fs');
const { pipeline } = require('stream');
const { createServer, get } = require('http');

const errorHandler = (err) => err && console.log(err.message);

const server = createServer((_, response) => {
  response.writeHead(200); // write the status line before piping the body
  pipeline(createReadStream(__filename), response, errorHandler);
}).listen(8080);

get('http://localhost:8080', (response) => {
  pipeline(response, process.stdout, errorHandler);
  response.on('close', () => server.close());
});

Another approach, which gives you more control, is to use an async iterator:


// Reuses `get` and `server` from the previous example.
async function handler(response) {
  let body = '';
  for await (const chunk of response) {
    const text = chunk.toString();
    console.log(text);
    body += text;
  }
  console.log(body.length);
  server.close();
}

get('http://localhost:8080', (response) => handler(response).catch(console.warn));