How to efficiently write to file from SQL datareader in C#?

Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/9055521/

Tags: c#, sql, datareader, writetofile

Asked by Sam

I have a remote SQL connection in C# that needs to execute a query and save its results to the user's local hard disk. There is a fairly large amount of data this thing can return, so I need to think of an efficient way of storing it. I've read before that first putting the whole result into memory and then writing it is not a good idea, so if someone could help, that would be great!

I am currently storing the SQL result data in a DataTable, although I am thinking it could be better to do the work inside a while (myReader.Read()) { ... } loop. Below is the code that gets the results:

DataTable t = new DataTable();
string myQuery = QueryLoader.ReadQueryFromFileWithBdateEdate(
    @"Resources\qrs\qryssysblo.q", newdate, newdate);
using (SqlDataAdapter a = new SqlDataAdapter(myQuery, sqlconn.myConnection))
{
    a.Fill(t);
}

var result = string.Empty;
for (int i = 0; i < t.Rows.Count; i++)
{
    for (int j = 0; j < t.Columns.Count; j++)
    {
        result += t.Rows[i][j] + ",";
    }

    result += "\r\n";
}

So now I have this huge result string, and I have the DataTable. There has to be a much better way of doing it?

Thanks.

Accepted answer by Anders Abel

You are on the right track yourself. Use a while (myReader.Read()) { ... } loop and write each record to the text file inside the loop. The .NET framework and operating system will take care of flushing the buffers to disk in an efficient way.

using(SqlConnection conn = new SqlConnection(connectionString))
using(SqlCommand cmd = conn.CreateCommand())
{
  conn.Open();
  cmd.CommandText = QueryLoader.ReadQueryFromFileWithBdateEdate(
    @"Resources\qrs\qryssysblo.q", newdate, newdate);

  using(SqlDataReader reader = cmd.ExecuteReader())
  using(StreamWriter writer = new StreamWriter(@"c:\temp\file.txt"))
  {
    while(reader.Read())
    {
      // Using Name and Phone as example columns.
      writer.WriteLine("Name: {0}, Phone : {1}", 
        reader["Name"], reader["Phone"]);
    }
  }
}

Answered by Anders Abel

I agree that your best bet here would be to use a SqlDataReader. Something like this:

StreamWriter YourWriter = new StreamWriter(@"c:\testfile.txt");
SqlCommand YourCommand = new SqlCommand();
SqlConnection YourConnection = new SqlConnection(YourConnectionString);
YourCommand.Connection = YourConnection;
YourCommand.CommandText = myQuery;

YourConnection.Open();

using (YourConnection)
{
    using (SqlDataReader sdr = YourCommand.ExecuteReader())
    using (YourWriter)
    {
        while (sdr.Read())
            YourWriter.WriteLine(sdr[0].ToString() + sdr[1].ToString() + ",");
    }
}

Mind you, in the while loop, you can write that line to the text file in whatever format you see fit, using the column data from the SqlDataReader.

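For instance, here is a rough sketch (reusing the sdr and YourWriter names from the block above; the comma formatting is just an example) that writes every column of the current row as one line inside that loop:

// Inside while (sdr.Read()): build one comma-separated line from all columns
// of the current row instead of hard-coding specific column indexes.
var fields = new string[sdr.FieldCount];
for (int i = 0; i < sdr.FieldCount; i++)
{
    // Treat DBNull as an empty field; everything else uses its string form.
    fields[i] = sdr.IsDBNull(i) ? string.Empty : sdr.GetValue(i).ToString();
}
YourWriter.WriteLine(string.Join(",", fields));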

Answered by Rob Sedgwick

I came up with this, it's a better CSV writer than the other answers:

using System.Data;
using System.IO;
using System.Text;

public static class DataReaderExtension
{
    public static void ToCsv(this IDataReader dataReader, string fileName, bool includeHeaderAsFirstRow)
    {
        const string Separator = ",";

        StreamWriter streamWriter = new StreamWriter(fileName);

        StringBuilder sb = null;

        if (includeHeaderAsFirstRow)
        {
            // First line: the column names.
            sb = new StringBuilder();
            for (int index = 0; index < dataReader.FieldCount; index++)
            {
                if (dataReader.GetName(index) != null)
                    sb.Append(dataReader.GetName(index));

                if (index < dataReader.FieldCount - 1)
                    sb.Append(Separator);
            }
            streamWriter.WriteLine(sb.ToString());
        }

        while (dataReader.Read())
        {
            sb = new StringBuilder();
            for (int index = 0; index < dataReader.FieldCount; index++)
            {
                if (!dataReader.IsDBNull(index))
                {
                    string value = dataReader.GetValue(index).ToString();
                    if (dataReader.GetFieldType(index) == typeof(string))
                    {
                        // Escape embedded quotes and quote any value that
                        // contains the separator.
                        if (value.IndexOf("\"") >= 0)
                            value = value.Replace("\"", "\"\"");

                        if (value.IndexOf(Separator) >= 0)
                            value = "\"" + value + "\"";
                    }
                    sb.Append(value);
                }

                if (index < dataReader.FieldCount - 1)
                    sb.Append(Separator);
            }

            streamWriter.WriteLine(sb.ToString());
        }
        dataReader.Close();
        streamWriter.Close();
    }
}

usage: mydataReader.ToCsv("myfile.csv", true)

Answered by RWC

Rob Sedgwick's answer is more like it, but it can be improved and simplified. This is how I did it:

string separator = ";";
string fieldDelimiter = "";
bool useHeaders = true;

string connectionString = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx";

using (SqlConnection conn = new SqlConnection(connectionString))
{
     using (SqlCommand cmd = conn.CreateCommand())
     {
          conn.Open();
          string query = @"SELECT whatever";

          cmd.CommandText = query;

          using (SqlDataReader reader = cmd.ExecuteReader())
          {
                if (!reader.Read())
                {
                     return;
                }

                List<string> columnNames = GetColumnNames(reader);

                // Write headers if required
                if (useHeaders)
                {
                     first = true;
                     foreach (string columnName in columnNames)
                     {
                          response.Write(first ? string.Empty : separator);
                          line = string.Format("{0}{1}{2}", fieldDelimiter, columnName, fieldDelimiter);
                          response.Write(line);
                          first = false;
                     }

                     response.Write("\n");
                }

                // Write all records
                do
                {
                     first = true;
                     foreach (string columnName in columnNames)
                     {
                          response.Write(first ? string.Empty : separator);
                          string value = reader[columnName] == null ? string.Empty : reader[columnName].ToString();
                          line = string.Format("{0}{1}{2}", fieldDelimiter, value, fieldDelimiter);
                          response.Write(line);
                          first = false;
                     }

                     response.Write("\n");
                }
                while (reader.Read());
          }
     }
}

And you need to have a function GetColumnNames:

List<string> GetColumnNames(IDataReader reader)
{
    List<string> columnNames = new List<string>();
    for (int i = 0; i < reader.FieldCount; i++)
    {
         columnNames.Add(reader.GetName(i));
    }

    return columnNames;
}

Answered by David Coulter

Using the response object without a response.Close() causes, at least in some instances, the HTML of the page that writes out the data to be written to the file as well. If you use Response.Close(), the connection can be closed prematurely and cause an error while producing the file.

It is recommended to use HttpApplication.CompleteRequest() instead; however, this appears to always cause the HTML to be written to the end of the file.

I have tried the stream in conjunction with the response object and have had success in the development environment. I have not tried it in production yet.

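As a rough sketch of that combination (classic ASP.NET with System.Web and System.Data.SqlClient is assumed; the StreamCsvToResponse name, query handling, and plain comma formatting are made up for illustration), streaming the reader straight to the response and finishing with CompleteRequest() instead of Response.Close():

// Requires System.Web and System.Data.SqlClient.
static void StreamCsvToResponse(string connectionString, string query)
{
    HttpResponse response = HttpContext.Current.Response;
    response.Clear();
    response.BufferOutput = false;   // stream rows as they are written
    response.ContentType = "text/csv";
    response.AddHeader("Content-Disposition", "attachment; filename=result.csv");

    using (SqlConnection conn = new SqlConnection(connectionString))
    using (SqlCommand cmd = new SqlCommand(query, conn))
    {
        conn.Open();
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                string[] fields = new string[reader.FieldCount];
                for (int i = 0; i < reader.FieldCount; i++)
                    fields[i] = reader.IsDBNull(i) ? string.Empty : reader.GetValue(i).ToString();
                response.Write(string.Join(",", fields) + "\n");
            }
        }
    }

    response.Flush();
    // Finish the request without Response.Close(), which can cut the
    // connection off early, and without letting the page append its own HTML.
    HttpContext.Current.ApplicationInstance.CompleteRequest();
}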

Answered by Nicolas

Keeping your original approach, here is a quick win:

Instead of using a String as a temporary buffer, use a StringBuilder. That will allow you to use the .Append(string) method for concatenation instead of the += operator.

The += operator is especially inefficient: every concatenation allocates a new string, so if you put it in a loop that repeats (potentially) millions of times, performance will suffer.

The .Append(string) method modifies the StringBuilder's internal buffer instead of creating a new string object each time, so it's faster.

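As a minimal sketch, here is the loop from the question rewritten with a StringBuilder (same DataTable t, same output, only the buffer type changes):

// Same nested loop as in the question, but appending into a StringBuilder
// instead of growing a string with +=.
var sb = new System.Text.StringBuilder();
for (int i = 0; i < t.Rows.Count; i++)
{
    for (int j = 0; j < t.Columns.Count; j++)
    {
        sb.Append(t.Rows[i][j]).Append(',');
    }
    sb.Append("\r\n");
}
string result = sb.ToString();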

Answered by Nigje

I used CSV to export data from the database via a DataReader. In my project I read the DataReader and create the .CSV file manually: in a loop I read the DataReader and, for every row, append each cell value to a result string. To separate columns I use "," and to separate rows I use "\n". Finally, I save the result string as result.csv.

I suggest this high-performance extension; I tested it and it quickly exported 600,000 rows as .CSV.

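The extension itself isn't included on this page, but a rough sketch of the approach described above (cmd stands in for an initialized SqlCommand; requires System.Data.SqlClient) could look like this:

// Sketch: read the DataReader row by row, join cells with "," and rows with
// "\n", then save the accumulated text as result.csv. Note that, as described,
// the whole result is buffered in memory before it is written out.
var sb = new System.Text.StringBuilder();
using (SqlDataReader reader = cmd.ExecuteReader())
{
    while (reader.Read())
    {
        for (int i = 0; i < reader.FieldCount; i++)
        {
            if (i > 0) sb.Append(',');
            sb.Append(reader.IsDBNull(i) ? string.Empty : reader.GetValue(i).ToString());
        }
        sb.Append('\n');
    }
}
System.IO.File.WriteAllText("result.csv", sb.ToString());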