postgresql - How to do a bulk insert with node-postgres

Warning: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use/share it, but you must follow the same CC BY-SA license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/42468723/


How to do a bulk insert with node-postgres

postgresql express node-postgres

Asked by Fabrizio Mazzoni

I am importing an Excel file into a Postgres database with express and node-pg.

Currently I am looping through the Excel rows and executing an insert for every row, but I feel it's not the right way:

workbook.xlsx.readFile(excel_file).then(function () {
    // get the first worksheet
    var worksheet = workbook.getWorksheet(1);
    var arrSQLParams = [];
    // Loop through all rows
    worksheet.eachRow(function (row, rowNumber) {
        // Commit to DB only from line 2 and up. We want to exclude headers from the Excel file
        if (rowNumber > 1) {
            // Loop through all values and build an array to pass to the DB function
            row.eachCell(function (cell, colNumber) {
                arrSQLParams.push(cell.value);
            });

            // Add the user id from the session to the array
            arrSQLParams.push(user);

            // Insert into DB
            db.query(strSQL, arrSQLParams, function (err, result) {
                if (err) {
                    console.log(err);
                    ret = false;
                }
            });

            // Empty the array for the next query
            arrSQLParams = [];
        }
    });
});

Is there a better way to do this to improve performance?

Answered by vitaly-t

Following the clarification provided by the author, to insert up to 1000 records at a time, the solution suggested in Multi-row insert with pg-promise is exactly what the author needs, in terms of both performance and flexibility.
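The core idea behind that solution is to send all rows to the server in a single `INSERT ... VALUES (...), (...), ...` statement instead of one query per row. pg-promise generates such a statement via its helpers; with plain node-postgres the same idea can be sketched by building the numbered placeholders by hand. A minimal sketch (the function name `buildMultiRowInsert` is illustrative, not part of any library):

```javascript
// Build a single multi-row INSERT with numbered placeholders ($1, $2, ...)
// so all rows travel to the server in one round trip.
function buildMultiRowInsert(table, columns, rows) {
    const values = [];
    const tuples = rows.map((row, i) => {
        const placeholders = row.map((val, j) => {
            values.push(val);
            // Placeholder indexes are global across all rows: $1, $2, $3, ...
            return '$' + (i * columns.length + j + 1);
        });
        return '(' + placeholders.join(', ') + ')';
    });
    return {
        text: `INSERT INTO ${table} (${columns.join(', ')}) VALUES ${tuples.join(', ')}`,
        values: values
    };
}
```

The result can then be passed straight to node-postgres, e.g. `const q = buildMultiRowInsert('users', ['name', 'email'], rows); db.query(q.text, q.values, callback);` — parameterized, so the cell values are never concatenated into the SQL string.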

UPDATE

A must-read article: Data Imports.

Answered by Zia Uddin

You can use the package https://www.npmjs.com/package/pg-essential. It applies a patch on node-postgres, and you just need to call its executeBulkInsertion function. Create an array of the objects to be inserted and pass it to executeBulkInsertion.

let bulkData = [];
for (const user of users) {
    bulkData.push(user);
}
await db.executeBulkInsertion(bulkData, [array of column names], [table name]);