Multiple concurrent calls to SqlCommand.BeginExecuteNonQuery using same SqlConnection


Well, at the extreme risk of receiving a lot of downvotes, I have to comment on this one. Firstly, this is a good question, well stated to address the specific potential issue you mentioned. However, you have neglected to discuss the "lengthy" process you're trying to accomplish.

My experience has taught me one thing...

If the question you're asking is hard to answer, change the question.

Though I know very little of your specific problem, I think this neatly applies to your predicament. As others have mentioned, temp tables are nasty, creating your own tables for a specific task is nastier still, and updating large quantities of data in SQL is expensive.

Ask yourself "Can you avoid it all?"

Quite often people choose to implement extremely complicated logic in databases because they believe SQL can do it faster. In practice this is a flawed concept: databases are storage/serialization devices. They are good at storing, updating, locating, and synchronizing access to data, but they are not well equipped for complex processing. Even after Microsoft's (and others') bastardization of the database by injecting full development languages into it, it cannot perform as well as a well-written client (depending on the complexity of the operations, which I suspect you have surpassed).

As an example, say you have a database holding around 2 GB of raw data and you want to produce a complex report or analysis over the entire data set. Simply put, 2 GB of memory is easy to come by: slurp the entire database (or the portion you need) into memory, using dictionaries or whatever else you need to build the look-ups. Depending on several factors, the whole thing will likely run several times faster than SQL, can easily be unit tested, and will be (IMHO) significantly easier to build, debug, and maintain than a nasty assortment of SPROCs constructing dynamic SQL. Even with more than 2 GB of raw data, client-side caches can easily be built with several existing technologies (B-trees, ISAM, or the like).
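To make that concrete, here's a rough sketch of the slurp-into-memory idea. The connection string, table, and column names are placeholders for illustration only:

```csharp
using System;
using System.Collections.Generic;
using System.Data.SqlClient;

class InMemoryReport
{
    static void Main()
    {
        // Load the working set once, then do all lookups/aggregation in memory.
        var ordersByCustomer = new Dictionary<int, List<decimal>>();

        using (var conn = new SqlConnection("Data Source=.;Initial Catalog=MyDb;Integrated Security=True"))
        using (var cmd = new SqlCommand("SELECT CustomerId, Amount FROM dbo.Orders", conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    int customerId = reader.GetInt32(0);
                    decimal amount = reader.GetDecimal(1);

                    if (!ordersByCustomer.TryGetValue(customerId, out var amounts))
                        ordersByCustomer[customerId] = amounts = new List<decimal>();
                    amounts.Add(amount);
                }
            }
        }

        // All subsequent "report" work is plain in-memory C#, trivially unit-testable.
        foreach (var pair in ordersByCustomer)
            Console.WriteLine($"Customer {pair.Key}: {pair.Value.Count} orders");
    }
}
```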

The product I work on today has 2.4 TB of data in the database, and we have not a single sproc, join statement, or even a non-equality WHERE clause.

But alas my advice may or may not be pertinent to your specific circumstances since I do not know your objectives or constraints. Hopefully, if nothing else, it will make you ask yourself:

"Am I asking the right question?"


You could use a producer-consumer pattern with two threads and two simultaneous but independent SQL connections.

The producer (first thread) owns the DataReader (first SQL connection) and writes its results to a blocking queue. The consumer (second thread) reads from the queue and uses ExecuteNonQuery (second SQL connection) to write to the temp table(s).
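A rough sketch of that layout using BlockingCollection<T> as the blocking queue; the query, the temp table shape, and the row type are assumptions for illustration:

```csharp
using System.Collections.Concurrent;
using System.Data.SqlClient;
using System.Threading.Tasks;

class ProducerConsumerCopy
{
    static void Main()
    {
        // Bounded queue so the producer cannot run arbitrarily far ahead of the consumer.
        var queue = new BlockingCollection<(int Id, string Name)>(boundedCapacity: 1000);

        var producer = Task.Run(() =>
        {
            using (var conn = new SqlConnection("Data Source=.;Initial Catalog=MyDb;Integrated Security=True"))
            using (var cmd = new SqlCommand("SELECT Id, Name FROM dbo.Source", conn))
            {
                conn.Open();
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                        queue.Add((reader.GetInt32(0), reader.GetString(1)));
                }
            }
            queue.CompleteAdding();   // signal the consumer that no more rows are coming
        });

        var consumer = Task.Run(() =>
        {
            using (var conn = new SqlConnection("Data Source=.;Initial Catalog=MyDb;Integrated Security=True"))
            {
                conn.Open();

                // Local temp tables are per-session, so create it on this connection.
                using (var create = new SqlCommand("CREATE TABLE #Work (Id INT, Name NVARCHAR(100))", conn))
                    create.ExecuteNonQuery();

                using (var cmd = new SqlCommand("INSERT INTO #Work (Id, Name) VALUES (@id, @name)", conn))
                {
                    var id = cmd.Parameters.Add("@id", System.Data.SqlDbType.Int);
                    var name = cmd.Parameters.Add("@name", System.Data.SqlDbType.NVarChar, 100);

                    foreach (var row in queue.GetConsumingEnumerable())
                    {
                        id.Value = row.Id;
                        name.Value = row.Name;
                        cmd.ExecuteNonQuery();
                    }
                }
            }
        });

        Task.WaitAll(producer, consumer);
    }
}
```

The bounded capacity keeps memory use in check, and note that the temp table must be created on the consumer's own connection, since local temp tables are only visible to the session that created them.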

Another idea, in case your ExecuteNonQuery commands are basically multiple INSERTs: the SMO ServerConnection class has an ExecuteNonQuery overload that takes a StringCollection, so you can send multiple SQL statements as one operation.
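If you go that route, a minimal sketch, assuming the SMO assemblies (e.g. Microsoft.SqlServer.ConnectionInfo) are referenced; the statements and table names are illustrative:

```csharp
using System.Collections.Specialized;
using System.Data.SqlClient;
using Microsoft.SqlServer.Management.Common;   // SMO: ServerConnection

class BatchedInserts
{
    static void Main()
    {
        // Statements and table/column names are made up for the example.
        var statements = new StringCollection
        {
            "INSERT INTO dbo.Target (Id, Name) VALUES (1, 'a')",
            "INSERT INTO dbo.Target (Id, Name) VALUES (2, 'b')",
            "INSERT INTO dbo.Target (Id, Name) VALUES (3, 'c')"
        };

        using (var sqlConnection = new SqlConnection("Data Source=.;Initial Catalog=MyDb;Integrated Security=True"))
        {
            var serverConnection = new ServerConnection(sqlConnection);
            serverConnection.ExecuteNonQuery(statements);   // sends the whole batch in one call
        }
    }
}
```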


There can be only one DataReader associated with a Command object, and there can be numerous Command objects associated with the same connection. The only thing you can't do here is use the same command object concurrently with different parameters.
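To put the question's scenario in code, here is a sketch of two commands sharing one connection. It assumes MARS (MultipleActiveResultSets=True) is enabled so the connection can carry more than one outstanding request, and the table names are made up:

```csharp
using System;
using System.Data.SqlClient;

class ConcurrentCommands
{
    static void Main()
    {
        // MARS lets a single connection carry more than one outstanding batch.
        // "Asynchronous Processing=true" is only needed on older .NET Framework versions.
        const string connectionString =
            "Data Source=.;Initial Catalog=MyDb;Integrated Security=True;" +
            "MultipleActiveResultSets=True;Asynchronous Processing=true";

        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();

            // Two separate SqlCommand objects, same SqlConnection.
            using (var cmd1 = new SqlCommand("UPDATE dbo.TableA SET Flag = 1 WHERE Id < 1000", conn))
            using (var cmd2 = new SqlCommand("UPDATE dbo.TableB SET Flag = 1 WHERE Id < 1000", conn))
            {
                IAsyncResult r1 = cmd1.BeginExecuteNonQuery();
                IAsyncResult r2 = cmd2.BeginExecuteNonQuery();

                Console.WriteLine("TableA rows: " + cmd1.EndExecuteNonQuery(r1));
                Console.WriteLine("TableB rows: " + cmd2.EndExecuteNonQuery(r2));
            }
        }
    }
}
```

Without MARS, the second BeginExecuteNonQuery on the same connection would fail while the first is still pending.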

However, when you start a database transaction (implicit if not explicit), the resources associated with that transaction are locked until the transaction is committed or rolled back, and all processes that want to query those resources are queued. SQL Server manages these queues pretty well. I had a few problems with deadlocks under high server load in SQL Server 2000, but there were no such problems with later versions.

It is strange that you actually got a performance improvement. That makes me think you have a very large amount of data, which takes time to transmit to SQL Server and to process there. When you send it in chunks, less total time is consumed because data transmission and data processing are performed concurrently.

Anyway, there shouldn't be any problem with that.

However, consider using CLR assemblies (if this option is available to you) to process the information directly inside the database engine, without sending it over TCP.
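If that route is open to you, here's a minimal sketch of a SQL CLR stored procedure using the in-process context connection; the table and procedure names are made up:

```csharp
using System.Data.SqlClient;
using Microsoft.SqlServer.Server;

public class Procedures
{
    [SqlProcedure]
    public static void ProcessChunks()
    {
        // "context connection=true" runs inside the engine, so the data
        // never crosses the network to an external client.
        using (var conn = new SqlConnection("context connection=true"))
        using (var cmd = new SqlCommand(
            "UPDATE dbo.Work SET Processed = 1 WHERE Processed = 0", conn))
        {
            conn.Open();
            int rows = cmd.ExecuteNonQuery();
            SqlContext.Pipe.Send("Rows processed: " + rows);
        }
    }
}
```

The assembly still has to be deployed with CREATE ASSEMBLY and CREATE PROCEDURE ... EXTERNAL NAME, and CLR integration has to be enabled on the server (sp_configure 'clr enabled', 1).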