Exception of type 'System.OutOfMemoryException' was thrown

58,036

Solution 1

Most probably it's not about RAM as such, so increasing your RAM, or even compiling and running your code on a 64-bit machine, will not have a positive effect in this case.

I think it's related to the fact that a single .NET object is limited to a maximum of 2 GB (no difference between 32- and 64-bit), so a very large collection can hit that ceiling.

To resolve this, split your list into much smaller chunks, and most probably your problem will be gone.

Just one possible solution:

foreach (var result in query)
{
    // ... build temp and add it to the list as before ...
    if (logFilePathFileName.Count % 1000 == 0)
    {
        Console.WriteLine(temp + "." + logFilePathFileName.Count);
        // WRITE THE CHUNK WHEREVER YOU NEED
        logFilePathFileName = new List<string>(); // RESET THE LIST!
    }
}

EDIT

If you want to fragment a query, you can use Skip(...) and Take(...).

Just an explanatory example:

var first1000 = query.Skip(0).Take(1000);
var second1000 = query.Skip(1000).Take(1000);

...and so on.

Naturally, put it inside your iteration and parametrize it based on the bounds of the data you know or need.
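One possible sketch of such an iteration, assuming the query object from the question (the ordering key and output path here are hypothetical, chosen just for illustration):

```csharp
using System.IO;
using System.Linq;

const int pageSize = 1000;
int pageIndex = 0;

// LINQ to Entities requires an ordering before Skip/Take can be used.
var ordered = query.OrderBy(c => c.FileName); // hypothetical key; use any stable column

while (true)
{
    // Fetch one page of results at a time instead of the whole set.
    var page = ordered.Skip(pageIndex * pageSize).Take(pageSize).ToList();
    if (page.Count == 0)
        break;

    // Append the chunk to the log immediately so it never accumulates in memory.
    File.AppendAllLines(@"C:\log.txt", // hypothetical output path
        page.Select(r => r.FilePath + "." + r.FileName));

    pageIndex++;
}
```

Note that in Entity Framework, calling Skip on an unordered query throws, and without a stable OrderBy the page boundaries are unpredictable, hence the explicit ordering above.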

Solution 2

Why are you collecting the data in a List<string> if all you need to do is write it to a text file?

You might as well just:

  • Open the text file;
  • Iterate over the records, appending each string to the text file (without storing the strings in memory);
  • Flush and close the text file.

You will need far less memory than now, because you won't be keeping all those strings unnecessarily in memory.
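The steps above might look like this, assuming the query and the FilePath/FileName properties from the question (the output path is hypothetical):

```csharp
using System.IO;

// Stream each record straight to the file instead of buffering a List<string>.
using (var writer = new StreamWriter(@"C:\log.txt")) // hypothetical output path
{
    foreach (var result in query)
    {
        writer.WriteLine(result.FilePath + "." + result.FileName);
    }
} // Disposing the writer flushes and closes the file.
```

Memory usage stays roughly constant regardless of how many rows the query returns, because each string becomes garbage as soon as it is written.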

Solution 3

You probably need to give the process more memory. Also, look into writing straight to your file instead of holding everything in a List.

Solution 4

What Roy Dictus says sounds like the best way. You can also try adding a limit to your query, so your database result won't be so large.
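For example, with the query from the question, a limit can be expressed with Take, which Entity Framework translates into a TOP clause in the generated SQL:

```csharp
// Only the first 1000 rows are ever materialized on the client.
var limited = query.Take(1000);
```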

For more info, see: Limiting query size with Entity Framework.

Author: Admin

Updated on September 30, 2020

Comments

  • Admin
    Admin over 3 years

    Basically I use Entity Framework to query a huge database. I want to return a string list then log it to a text file.

    List<string> logFilePathFileName = new List<string>();
    var query = from c in DBContext.MyTable where condition = something select c;
    foreach (var result in query)
    {
        filePath = result.FilePath;
        fileName = result.FileName;
        string temp = filePath + "." + fileName;
        logFilePathFileName.Add(temp);
    if (logFilePathFileName.Count % 1000 == 0)
        Console.WriteLine(temp + "." + logFilePathFileName.Count);
    }
    

    However I got an exception when logFilePathFileName.Count=397000. The exception is:

    Exception of type 'System.OutOfMemoryException' was thrown.

    A first chance exception of type 'System.OutOfMemoryException' occurred in System.Data.Entity.dll

    UPDATE:

What I want is to use a different query, say select top 1000, then add to the list; but I don't know what to do after the first 1000.

    • Roy Dictus
      Roy Dictus over 11 years
      Hint: upgrade your RAM :-)
    • Hamlet Hakobyan
      Hamlet Hakobyan over 11 years
Can you provide the full info about the exception, with the inner exception if any, and also the stack trace?
    • Cédric Bignon
      Cédric Bignon over 11 years
      What length is temp typically?
    • Admin
      Admin over 11 years
      Memory is 4GB, it is enough...
    • Johnny Mopp
      Johnny Mopp over 11 years
    • Alex
      Alex over 11 years
      OutOfMemoryException has hardly ever anything to do with physical memory.
  • Admin
    Admin over 11 years
The problem is that the database seems to have died as well because of the heavy query.
  • Tigran
    Tigran over 11 years
So you get the exception on the server?
  • Admin
    Admin over 11 years
Not yet, maybe something else. Please see my update: I want to select the top 1000 as a bulk, then iterate the rest. How?
  • Tigran
    Tigran over 11 years
You can fragment your query using Skip(..) and Take(..), or get everything in one query but fragment the list.
  • Admin
    Admin over 11 years
I meant: after the first 1000, how do I select 1001-2000 in the query? You can't still use select top 1000...