
Insert 2 million rows into SQL Server quickly


  1. I think it's better to read the data from the text file into a DataSet first (a loading sketch follows the SqlBulkCopy snippet below)

  2. Try out SqlBulkCopy - Bulk Insert into SQL from C# App

// connect to SQL
using (SqlConnection connection = new SqlConnection(connString))
{
    // make sure to enable triggers
    // more on triggers in next post
    SqlBulkCopy bulkCopy = new SqlBulkCopy(
        connection,
        SqlBulkCopyOptions.TableLock |
        SqlBulkCopyOptions.FireTriggers |
        SqlBulkCopyOptions.UseInternalTransaction,
        null
        );

    // set the destination table name
    bulkCopy.DestinationTableName = this.tableName;
    connection.Open();

    // write the data in the "dataTable"
    bulkCopy.WriteToServer(dataTable);
    connection.Close();
}
// reset
this.dataTable.Clear();
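
For step 1, a minimal sketch of loading the text file into the dataTable used above could look like this; the file path, comma delimiter, and column layout are assumptions, not part of the original answer:

// build the table schema up front (assumed columns; adjust to your file)
DataTable dataTable = new DataTable();
dataTable.Columns.Add("Id", typeof(int));
dataTable.Columns.Add("Name", typeof(string));

// read the text file line by line and split on the assumed delimiter
using (StreamReader reader = new StreamReader(@"C:\data\import.txt"))
{
    string line;
    while ((line = reader.ReadLine()) != null)
    {
        string[] fields = line.Split(',');
        dataTable.Rows.Add(int.Parse(fields[0]), fields[1]);
    }
}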

Or, after doing step 1 at the top:

  1. Create XML from the DataSet
  2. Pass the XML to the database and do a bulk insert (a sketch follows below)

You can check this article for details: Bulk Insertion of Data Using C# DataTable and SQL Server OpenXML function.
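
In outline, the C# side of that approach could look like the sketch below. The stored procedure name usp_BulkInsertFromXml and its @xml parameter are hypothetical; the procedure itself would use sp_xml_preparedocument/OPENXML server-side to shred the XML into an INSERT ... SELECT:

// serialize the DataSet to XML (assumes "dataSet" was filled in step 1)
string xml = dataSet.GetXml();

using (SqlConnection connection = new SqlConnection(connString))
using (SqlCommand command = new SqlCommand("usp_BulkInsertFromXml", connection))
{
    command.CommandType = CommandType.StoredProcedure;
    // hypothetical parameter; the procedure shreds it with OPENXML on the server
    command.Parameters.Add("@xml", SqlDbType.Xml).Value = xml;
    connection.Open();
    command.ExecuteNonQuery();
}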

This hasn't been tested with 2 million records, though. It will work, but it consumes memory on the machine, since you have to load all 2 million records before inserting them.


You can try the SqlBulkCopy class.

It lets you efficiently bulk load a SQL Server table with data from another source.

There is a cool blog post about how you can use it.
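
A minimal sketch of tuning it for a large load (the names here are placeholders; the same options apply whether the source is a DataTable or, as here, an IDataReader):

using (SqlConnection destination = new SqlConnection(destConnString))
{
    destination.Open();
    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(destination))
    {
        bulkCopy.DestinationTableName = "dbo.TargetTable";
        bulkCopy.BatchSize = 10000;           // send rows to the server in 10k batches
        bulkCopy.BulkCopyTimeout = 0;         // disable the timeout for large loads
        bulkCopy.EnableStreaming = true;      // stream from the reader instead of buffering
        bulkCopy.WriteToServer(sourceReader); // any IDataReader over your 2 million rows
    }
}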


Regarding the SqlBulkCopy solution:

I used a StreamReader to convert and process the text file. The result was a list of my objects.
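
The parsing step isn't shown; a rough sketch, assuming a comma-separated file and illustrative PuckDetection properties (the real ones aren't in the answer), would be:

var detections = new List<PuckDetection>();
using (var reader = new StreamReader(@"C:\data\detections.txt"))
{
    string line;
    while ((line = reader.ReadLine()) != null)
    {
        // field order and property names are assumptions for illustration
        var parts = line.Split(',');
        detections.Add(new PuckDetection
        {
            PuckId = int.Parse(parts[0]),
            X = double.Parse(parts[1]),
            Y = double.Parse(parts[2])
        });
    }
}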

I created a class that takes a DataTable or a List<T> and a buffer size (CommitBatchSize). It converts the list to a DataTable using an extension method (in the second class).

It works very fast: on my PC, I can insert more than 10 million complex records in less than 10 seconds.

Here is the class:

using System;
using System.Collections;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Data.SqlClient;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace DAL
{
    public class BulkUploadToSql<T>
    {
        public IList<T> InternalStore { get; set; }
        public string TableName { get; set; }
        public int CommitBatchSize { get; set; } = 1000;
        public string ConnectionString { get; set; }

        public void Commit()
        {
            if (InternalStore.Count > 0)
            {
                DataTable dt;
                int numberOfPages = (InternalStore.Count / CommitBatchSize)
                    + (InternalStore.Count % CommitBatchSize == 0 ? 0 : 1);
                for (int pageIndex = 0; pageIndex < numberOfPages; pageIndex++)
                {
                    dt = InternalStore.Skip(pageIndex * CommitBatchSize).Take(CommitBatchSize).ToDataTable();
                    BulkInsert(dt);
                }
            }
        }

        public void BulkInsert(DataTable dt)
        {
            using (SqlConnection connection = new SqlConnection(ConnectionString))
            {
                // make sure to enable triggers
                // more on triggers in next post
                SqlBulkCopy bulkCopy = new SqlBulkCopy(
                    connection,
                    SqlBulkCopyOptions.TableLock |
                    SqlBulkCopyOptions.FireTriggers |
                    SqlBulkCopyOptions.UseInternalTransaction,
                    null
                    );

                // set the destination table name
                bulkCopy.DestinationTableName = TableName;
                connection.Open();

                // write the data in the DataTable
                bulkCopy.WriteToServer(dt);
                connection.Close();
            }
            // reset
            //this.dataTable.Clear();
        }
    }

    public static class BulkUploadToSqlHelper
    {
        public static DataTable ToDataTable<T>(this IEnumerable<T> data)
        {
            PropertyDescriptorCollection properties =
                TypeDescriptor.GetProperties(typeof(T));
            DataTable table = new DataTable();
            foreach (PropertyDescriptor prop in properties)
                table.Columns.Add(prop.Name, Nullable.GetUnderlyingType(prop.PropertyType) ?? prop.PropertyType);
            foreach (T item in data)
            {
                DataRow row = table.NewRow();
                foreach (PropertyDescriptor prop in properties)
                    row[prop.Name] = prop.GetValue(item) ?? DBNull.Value;
                table.Rows.Add(row);
            }
            return table;
        }
    }
}

Here is an example of inserting a list of my custom object, List<PuckDetection> (ListDetections):

var objBulk = new BulkUploadToSql<PuckDetection>()
{
    InternalStore = ListDetections,
    TableName = "PuckDetections",
    CommitBatchSize = 1000,
    ConnectionString = "ENTER YOUR CONNECTION STRING"
};
objBulk.Commit();

The BulkInsert method can be modified to add column mappings if required, for example when the table has an identity key as its first column (this assumes that the column names in the DataTable match those in the database):

// ADD COLUMN MAPPING
foreach (DataColumn col in dt.Columns)
{
    bulkCopy.ColumnMappings.Add(col.ColumnName, col.ColumnName);
}
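
By default SqlBulkCopy maps columns by ordinal position; once you add explicit mappings, only the mapped columns are copied, so mapping every column by name like this keeps the data aligned even though the destination table has the extra identity column at position zero.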