NodeJS JSON.stringify() bottleneck

Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/22046758/

Date: 2020-10-27 22:16:46  Source: igfitidea

NodeJS JSON.stringify() bottleneck

Tags: javascript, json, node.js, stringify

Asked by gosho_ot_pochivka

My service returns responses of very large JSON objects - around 60MB. After some profiling I have found that it spends almost all of its time in the JSON.stringify() call used to convert the object to a string and send it as the response. I have tried custom implementations of stringify, and they are even slower.

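A quick, isolated timing like the sketch below is enough to reproduce this kind of profile (the payload shape here is an illustrative stand-in for the real ~60MB object):

```javascript
// Time JSON.stringify() in isolation to confirm serialization dominates.
// The object below is a small stand-in for the real ~60MB payload.
const big = {
  rows: Array.from({ length: 100000 }, (_, i) => ({ id: i, value: 'x'.repeat(10) })),
};

console.time('stringify');
const json = JSON.stringify(big);
console.timeEnd('stringify');

console.log('payload bytes:', json.length);
```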

This is quite a bottleneck for my service. I want to be able to handle as many requests per second as possible - currently 1 request takes 700ms.

My questions are:
1) Can I optimize the response-sending part? Is there a more efficient way than stringify-ing the object and sending it as the response?


2) Will using the async module and performing the JSON.stringify() in a separate thread improve the overall number of requests/second (given that over 90% of the time is spent in that call)?

Accepted answer by Jason

You've got two options:

1) Find a JSON module that will allow you to stream the stringify operation and process it in chunks. I don't know if such a module is out there; if it's not, you'd have to build it. EDIT: Thanks to Reinard Mavronicolas for pointing out JSONStream in the comments. I've actually had it on my back burner to look for something like this, for a different use case.


2) async does not use threads. You'd need to use cluster or some other actual threading module to drop the processing into a separate thread. The caveat here is that you're still processing a large amount of data; you gain bandwidth by using threads, but depending on your traffic you still may hit a limit.


Answer by Manuel Spigolon

Some years later, this question has a new answer for the first question: the yieldable-json lib. As described in this talk by Gireesh Punathil (IBM India), this lib can process a 60MB JSON without blocking the event loop of node.js, letting you accept new requests and improve your throughput.

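A sketch of how yieldable-json is used (assumes `npm install yieldable-json`; the stub in the catch branch is only there to keep this sketch self-contained where the package isn't installed):

```javascript
// yieldable-json exposes parseAsync/stringifyAsync, which yield back to
// the event loop periodically instead of blocking it for the whole payload.
let yj;
try {
  yj = require('yieldable-json'); // the real library, if installed
} catch (e) {
  // Stub with the same callback shape, so this sketch runs anywhere.
  yj = {
    stringifyAsync: (obj, cb) => setImmediate(() => cb(null, JSON.stringify(obj))),
  };
}

// Stand-in for the real ~60MB response object.
const big = { items: Array.from({ length: 1000 }, (_, i) => ({ i })) };

yj.stringifyAsync(big, (err, json) => {
  if (err) throw err;
  // Other requests could be serviced while this string was being built.
  console.log('serialized', json.length, 'bytes without blocking');
});
```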

For the second question: with Node.js 11 you can use worker threads (still experimental in that version) in order to increase your web server throughput.
