Exception of type 'System.OutOfMemoryException' was thrown. C# when using IDataReader


Check that you are building a 64-bit process, not a 32-bit one, which is Visual Studio's default compilation mode. To change it, right-click your project, then Properties -> Build -> Platform target: x64. Like any 32-bit process, an application compiled in 32-bit mode has a virtual memory limit of 2 GB.

64-bit processes do not have this limitation: they use 64-bit pointers, so their theoretical maximum address space is 16 exabytes (2^64). In practice, Windows x64 limits a process's virtual memory to 8 TB. The solution to the memory limit problem is therefore to compile in 64-bit.
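As a quick sanity check after changing the Platform target, you can verify at runtime whether the process actually ended up 64-bit. This is a minimal sketch using the standard `Environment` properties:

```csharp
using System;

class BitnessCheck
{
    static void Main()
    {
        // True only if this process is running as 64-bit; a project built
        // as x86 (or with "Prefer 32-bit") will print False here even on
        // a 64-bit machine.
        Console.WriteLine($"64-bit process: {Environment.Is64BitProcess}");

        // True if the OS itself is 64-bit, regardless of how this
        // particular process was compiled.
        Console.WriteLine($"64-bit OS: {Environment.Is64BitOperatingSystem}");
    }
}
```

If the first line prints False on a 64-bit machine, the 2 GB virtual memory limit still applies to your process.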

However, a single object's size in .NET is still limited to 2 GB by default. You can create several arrays whose combined size exceeds 2 GB, but by default you cannot create a single array larger than 2 GB. Fortunately, if you still need arrays larger than 2 GB, you can enable them by adding the following to your app.config file:

<configuration>
  <runtime>
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>

I think you are simply running out of memory because your DataTable grows so large from all the rows you keep adding to it.

You may want to try a different pattern in this case.

Instead of buffering your rows in a list (or DataTable), can you simply yield each row to the caller as it arrives, so it can be used immediately?
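That streaming pattern might look like the sketch below. The `StreamRows` helper is hypothetical (not from the original post), and the in-memory `DataTable` merely stands in for a real `SqlCommand.ExecuteReader()` result so the example is self-contained:

```csharp
using System;
using System.Collections.Generic;
using System.Data;

class Program
{
    // Yield one row at a time instead of accumulating them all in a
    // DataTable. Memory use stays flat no matter how many rows arrive.
    static IEnumerable<object[]> StreamRows(IDataReader reader)
    {
        while (reader.Read())
        {
            var row = new object[reader.FieldCount];
            reader.GetValues(row);
            yield return row; // hand the row to the caller immediately
        }
    }

    static void Main()
    {
        // Demo source: a small in-memory table standing in for a real
        // database query result (hypothetical data).
        var table = new DataTable();
        table.Columns.Add("Id", typeof(int));
        table.Columns.Add("Name", typeof(string));
        table.Rows.Add(1, "alpha");
        table.Rows.Add(2, "beta");

        using (IDataReader reader = table.CreateDataReader())
        {
            foreach (var row in StreamRows(reader))
                Console.WriteLine($"{row[0]}: {row[1]}");
        }
    }
}
```

Because `yield return` makes `StreamRows` lazy, each row can be processed and then discarded before the next one is read, so only one row is held in memory at a time.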

Since you are using a DataTable, let me share a problem I once had with one. Check your build properties. I had a DataTable throwing an out-of-memory exception seemingly at random. It turned out the project's build Platform target was set to Prefer 32-bit. Once I unchecked that option, the random out-of-memory exceptions went away.