Newtonsoft.Json - Out of memory exception while deserializing big object


According to the Newtonsoft.Json Performance Tips, your approach should work (you deserialize from a stream, so the file should be read in chunks rather than loaded whole). I can't figure out why your code doesn't work.

But you can try another approach, described in this article: Parsing Big Records with Json.NET.
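
Roughly, the idea there is to walk the JSON with a JsonTextReader and deserialize one record at a time instead of the whole document. A minimal sketch of that pattern, assuming the file holds one large JSON array and using a hypothetical Item class in place of your real element type:

    using System.Collections.Generic;
    using System.IO;
    using Newtonsoft.Json;

    // Hypothetical record type; substitute whatever your array elements actually contain.
    public class Item
    {
        public long Id { get; set; }
        public string Name { get; set; }
    }

    public static class BigJsonReader
    {
        // Walks the file token by token and deserializes one record at a time,
        // so only the current record has to fit in memory.
        public static IEnumerable<Item> ReadItems(string path)
        {
            using (FileStream stream = File.OpenRead(path))
            using (StreamReader reader = new StreamReader(stream))
            using (JsonTextReader jsonReader = new JsonTextReader(reader))
            {
                JsonSerializer serializer = new JsonSerializer();
                while (jsonReader.Read())
                {
                    // Each StartObject token marks the beginning of one array element.
                    if (jsonReader.TokenType == JsonToken.StartObject)
                        yield return serializer.Deserialize<Item>(jsonReader);
                }
            }
        }
    }

Each call to Deserialize<Item> consumes exactly one object from the stream, so memory usage stays proportional to a single record rather than to the whole 1GB file.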

Author: cmarlowe88

Updated on June 14, 2022

Comments

  • cmarlowe88 about 2 years

    I have a problem deserializing a JSON file of about 1GB. When I run the following code I get an out of memory exception:

    using (FileStream sr = new FileStream("myFile.json", FileMode.Open, FileAccess.Read))
    {
      using (StreamReader reader = new StreamReader(sr))
      {
        using (JsonReader jsReader = new JsonTextReader(reader))
        {
          JsonSerializer serializer = new JsonSerializer();
          dataObject = serializer.Deserialize<T>(jsReader);
        }
      }
    }
    

    The exception is thrown by

    Newtonsoft.Json.Linq.JTokenWriter.WriteValue(Int64 value)
    

    The serialization works well; here is the code I'm using:

    using (StreamWriter writer = new StreamWriter("myFile.json"))
    {
       using (JsonTextWriter jsonWriter = new JsonTextWriter(writer) { Formatting = Formatting.Indented })
       {
          JsonSerializer ser = new JsonSerializer();
          ser.Serialize(jsonWriter, dataObject, dataObject.GetType());
          jsonWriter.Flush();
       }
    }
    

    Am I doing something wrong in the deserialization? Can you suggest a way to deserialize a big JSON object?

    Thanks

  • Matteo Umili over 8 years
    Probably a 1GB JSON file deserialized into a single object simply does not "fit" in memory.