javascript - Node.js + request - saving a remote file and serving it in a response
Disclaimer: this page is a translation of a popular StackOverflow question and is provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must do so under the same CC BY-SA license, link to the original, and attribute it to the original authors (not me): StackOverflow
Original question: http://stackoverflow.com/questions/19237460/

Node.js + request - saving a remote file and serving it in a response
Asked by sidonaldson
I understand how to load a remote file with Node.js + request, and I can then read it and return the PNG binary blob. Is there an elegant way to do it with one request (or even a one-liner)?
Something like:
http.createServer(function(req, res) {
    res.writeHead(200, {
        'Content-Type': 'image/png'
    });
    var picWrite = fs.createWriteStream(local);
    var picFetch = fs.createReadStream(local);
    picStream.on('close', function() {
        console.log("file loaded");
    });
    request(remote).pipe(picWrite).pipe(picFetch).pipe(res);
})
To be clear: my aim is to load a remote file from a CDN, cache it locally on the server, and then return the file in the original request. In future requests I use fs.exists() to check that it exists first.
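Roughly, the caching flow I have in mind is sketched below (this is only an assumption of how it could be wired up: local is a placeholder path and fetchAndCache is a hypothetical helper for the download-and-cache step):

var fs = require('fs');
var http = require('http');

var local = '/tmp/cached.png'; // hypothetical local cache path

http.createServer(function(req, res) {
    // on each request, check whether the file has already been cached
    fs.exists(local, function(exists) {
        if (exists) {
            // cache hit: stream the cached copy straight from disk
            res.writeHead(200, { 'Content-Type': 'image/png' });
            fs.createReadStream(local).pipe(res);
        } else {
            // cache miss: download from the CDN, cache it, and respond
            fetchAndCache(req, res); // hypothetical helper
        }
    });
}).listen(8080);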
This is my best effort so far:
http.createServer(function(req, res) {
    var file = fs.createWriteStream(local);
    request.get(remote).pipe(file).on('close', function() {
        res.end(fs.readFileSync(local), 'binary');
    });
})
Answered by hexacyanide
Since the request will return a readable stream, we can listen on its data and end events to write to both the HTTP response and a writable stream.
var fs = require('fs');
var http = require('http');
var request = require('request');

http.createServer(function(req, res) {
    res.writeHead(200, { 'Content-Type': 'image/png' });
    var file = fs.createWriteStream(local);
    // request the file from a remote server
    var rem = request(remote);
    rem.on('data', function(chunk) {
        // instead of loading the file into memory
        // after the download, we can just pipe
        // the data as it's being downloaded
        file.write(chunk);
        res.write(chunk);
    });
    rem.on('end', function() {
        // close the file stream once the download completes
        file.end();
        res.end();
    });
});
The method that you showed first writes the data to disk, then reads it back into memory. This is rather pointless, since the data is already accessible while it is being written to disk.
If you use an event handler, you can write to both the HTTP response and the file stream without needlessly loading the file into memory again. This also solves the problem with using pipe(), because pipe() consumes the data from the readable stream and can only be done once.
This also avoids out-of-memory problems: buffering a large download would eventually exhaust your Node.js process's memory, whereas with streams only chunks of the file are in memory at any one time, so you don't have this problem.
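To make the contrast concrete, here is a rough, self-contained sketch (the file path is hypothetical): buffering the whole file with fs.readFile holds it entirely in memory before responding, whereas streaming it forwards one chunk at a time.

var fs = require('fs');
var http = require('http');

http.createServer(function(req, res) {
    res.writeHead(200, { 'Content-Type': 'application/octet-stream' });

    // buffered alternative (commented out): the entire file would sit
    // in memory before the response could start
    // fs.readFile('/tmp/large-file.bin', function(err, data) { res.end(data); });

    // streamed: only small chunks are in memory at any one time
    fs.createReadStream('/tmp/large-file.bin').pipe(res);
}).listen(8080);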