Exception of type 'System.OutOfMemoryException' was thrown. C# when using IDataReader


Solution 1

Check that you are building a 64-bit process, and not a 32-bit one, which is the default compilation mode in Visual Studio. To do this, right-click your project, then Properties -> Build -> Platform target: x64. Like any 32-bit process, applications compiled as 32-bit have a virtual address space limit of 2GB.

64-bit processes do not have this limitation, as they use 64-bit pointers, so their theoretical maximum address space is 16 exabytes (2^64). In practice, Windows x64 limits the virtual memory of a process to 8TB. The solution to the memory-limit problem is therefore to compile in 64-bit.
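If you prefer editing the project file directly, the same setting can be expressed in the .csproj (a sketch; the surrounding project structure is omitted):

```xml
<PropertyGroup>
  <PlatformTarget>x64</PlatformTarget>
</PropertyGroup>
```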

However, the size of a single .NET object is still limited to 2GB by default. You will be able to create several arrays whose combined size is greater than 2GB, but you cannot, by default, create a single array bigger than 2GB. Fortunately, if you still want to create arrays bigger than 2GB, you can do so by adding the following to your app.config file:

<configuration>
  <runtime>
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>

Solution 2

I think you are simply running out of memory because your DataTable gets so large from all the rows you keep adding to it.

You may want to try a different pattern in this case.

Instead of buffering the rows in a list (or DataTable), can you simply yield each row to the consumer as it arrives from the reader?
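A sketch of that pattern (assuming SQL Server via System.Data.SqlClient; the RowID/MyCol column names come from the question, everything else is illustrative):

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

static class RowStreamer
{
    // Yields rows one at a time as the reader produces them, so only the
    // current row is held in memory instead of the whole result set.
    public static IEnumerable<(string RowId, string MyCol)> StreamRows(
        string connectionString, string sql)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = conn.CreateCommand())
        {
            cmd.CommandText = sql;
            cmd.CommandTimeout = 0;
            conn.Open();
            using (IDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Control returns to the caller here for every row;
                    // nothing accumulates in a DataTable.
                    yield return (reader["RowID"].ToString(),
                                  reader["MyCol"].ToString());
                }
            }
        }
    }
}
```

The caller can then `foreach` over `StreamRows(...)` and process each row as it arrives; the connection stays open only for the duration of the enumeration.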

Solution 3

Since you are using a DataTable, let me share a problem I once had with one. Check your build properties. I had a DataTable throwing an OutOfMemoryException seemingly at random. As it turned out, the project's build platform target was set to Prefer 32-bit. Once I unchecked that option, the random out-of-memory exceptions went away.
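For reference, the equivalent .csproj setting looks roughly like this (a sketch; Prefer32Bit only has an effect when the platform target is AnyCPU):

```xml
<PropertyGroup>
  <PlatformTarget>AnyCPU</PlatformTarget>
  <Prefer32Bit>false</Prefer32Bit>
</PropertyGroup>
```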

Solution 4

You are storing a copy of the data in dt, and you are simply storing so much that the machine runs out of memory. So you have a few options:

  • Increase the available memory.
  • Reduce the amount of data you are retrieving.

To increase the available memory, you can add physical memory to the machine. Note, though, that a 32-bit .NET process will not be able to address more than 2GB of memory (3GB if you enable the /3GB switch in boot.ini), so you may need to switch to 64-bit (both machine and process) if you wish to address more memory than that.

Retrieving less data is probably the way to go. Depending upon what you are trying to achieve, you may be able to perform the task on subsets of the data (perhaps even on individual rows). If you are performing some kind of aggregation (e.g. producing a summary or report from the data) you may be able to employ Map-Reduce.
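For example, a summary can often be computed as a running aggregate while reading, so no row ever needs to be buffered. A sketch, assuming System.Data.SqlClient; the `Amount` column and query are hypothetical:

```csharp
using System.Data.SqlClient;

static class Aggregator
{
    // Computes a summary (count and sum) in a single streaming pass,
    // using constant memory regardless of how many rows the query returns.
    public static (long Count, decimal Total) Summarize(string connectionString)
    {
        long count = 0;
        decimal total = 0;
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = conn.CreateCommand())
        {
            cmd.CommandText = "SELECT Amount FROM MyTable"; // hypothetical query
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    count++;                       // "map": each row contributes one unit
                    total += reader.GetDecimal(0); // "reduce": fold into the running total
                }
            }
        }
        return (count, total);
    }
}
```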

Author: DA_Prog

Updated on March 24, 2020

Comments

  • DA_Prog
    DA_Prog about 4 years

    I have an application in which I have to fetch a large amount of data from the database. Since it failed to fetch all of the rows at once (close to 2,000,000 rows), I split the work into chunks and run the SQL query repeatedly, fetching only 200,000 rows each time.

    I use a DataTable into which I load all of the data (meaning all 2,000,000 rows should end up there).

    The first few runs are fine. Then it fails with the OutOfMemoryException.

    My code works as following:

    private static void RunQueryAndAddToDT(string sql, string lastRowID, SqlConnection conn, DataTable dt, int prevRowCount)
    {
        if (string.IsNullOrEmpty(sql))
        {
            sql = generateSqlQuery(lastRowID);
        }

        if (conn.State == ConnectionState.Closed)
        {
            conn.Open();
        }

        using (IDbCommand cmd2 = conn.CreateCommand())
        {
            cmd2.CommandType = CommandType.Text;
            cmd2.CommandText = sql;
            cmd2.CommandTimeout = 0;

            using (IDataReader reader = cmd2.ExecuteReader())
            {
                while (reader.Read())
                {
                    DataRow row = dt.NewRow();
                    row["RowID"] = reader["RowID"].ToString();
                    row["MyCol"] = reader["MyCol"].ToString();
                    // ... more column assignments; in one of these the exception is thrown.

                    dt.Rows.Add(row);
                }
            }
        }

        if (conn != null)
        {
            conn.Close();
        }

        if (dt.Rows.Count > prevRowCount)
        {
            lastRowID = dt.Rows[dt.Rows.Count - 1]["RowID"].ToString();
            sql = string.Empty;
            RunQueryAndAddToDT(sql, lastRowID, conn, dt, dt.Rows.Count);
        }
    }
    

    It seems to me as if the reader keeps collecting rows, and that's why it throws an exception only in the second or third round.

    Shouldn't the using block free the memory when it's done? What might solve my problem?

    Note: I should explain that I have no choice but to get all of these rows into the DataTable, since I do some manipulation on them later. The order of the rows is important, and I can't split the work up, because sometimes I have to take the data of several rows and merge it into one row, and so on. So I can't give that up.

    Thanks.

  • Marc Gravell
    Marc Gravell over 11 years
    Using doesn't really clear memory - it doesn't trigger collection. Also: a reader only has a small buffer
  • Paul Ruane
    Paul Ruane over 11 years
    @MarcGravell: good point, but it makes the memory the reader was using eligible for collection which would prevent the OOM in this case if the results were not being stored elsewhere.
  • Marc Gravell
    Marc Gravell over 11 years
    No, because the reader doesn't hold all the data: it is a streaming API.
  • Daniel Gee
    Daniel Gee about 6 years
    Worked for me! Thanks!
  • Natiq
    Natiq over 5 years
    Thank you. Worked for me
  • Anand
    Anand over 5 years
    I am using a 32-bit application, and when I try to add 1,700,000 records into the DataTable from the SQL reader, it throws an OutOfMemoryException. After applying the above-mentioned solution it still throws the same error. Please suggest how to solve this issue.