Split large file into smaller files by number of lines in C#?


Solution 1

using (System.IO.StreamReader sr = new System.IO.StreamReader("path"))
{
    int fileNumber = 0;

    while (!sr.EndOfStream)
    {
        int count = 0;

        using (System.IO.StreamWriter sw = new System.IO.StreamWriter("other path" + ++fileNumber))
        {
            sw.AutoFlush = true;

            // post-increment, so exactly 20,000 lines go into each file
            // (pre-increment would stop one line short at 19,999)
            while (!sr.EndOfStream && count++ < 20000)
            {
                sw.WriteLine(sr.ReadLine());
            }
        }
    }
}

Solution 2

int index = 0;
var groups = from line in File.ReadLines("myfile.csv")
             group line by index++ / 20000 into g
             select g.AsEnumerable();

int file = 0;
foreach (var group in groups)
    File.WriteAllLines((file++).ToString(), group.ToArray());
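On .NET 6 or later, the same lazy grouping is built in as `Enumerable.Chunk`, which avoids the side-effecting `index` variable that the comments below object to. A minimal sketch (the file names here are placeholders):

```csharp
using System.IO;
using System.Linq;

class ChunkSplit
{
    static void Main()
    {
        int part = 0;

        // Chunk (added in .NET 6) lazily yields arrays of up to 20,000 lines,
        // so only one block is materialized in memory at a time.
        foreach (string[] block in File.ReadLines("myfile.csv").Chunk(20000))
        {
            File.WriteAllLines($"myfile.part{part++}.csv", block);
        }
    }
}
```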

Solution 3

I'd do it like this:

// helper method to break up into blocks lazily
// (declared with "this" so it can be called as an extension method,
// as SplitFile below does; it must live in a static class)

public static IEnumerable<ICollection<T>> SplitEnumerable<T>
    (this IEnumerable<T> Sequence, int NbrPerBlock)
{
    List<T> Group = new List<T>(NbrPerBlock);

    foreach (T value in Sequence)
    {
        Group.Add(value);

        if (Group.Count == NbrPerBlock)
        {
            yield return Group;
            Group = new List<T>(NbrPerBlock);
        }
    }

    if (Group.Any()) yield return Group; // flush out any remaining
}

// now it's trivial; if you want to make smaller files, just foreach
// over this and write out the lines in each block to a new file

public static IEnumerable<ICollection<string>> SplitFile(string filePath)
{
    return File.ReadLines(filePath).SplitEnumerable(20000);
}
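The "trivial" foreach the comment describes might look like the sketch below. The helper is reproduced so the block compiles on its own, and the `.partN` output naming is just a placeholder choice:

```csharp
using System.Collections.Generic;
using System.IO;
using System.Linq;

static class FileSplitter
{
    // SplitEnumerable from above, reproduced so this compiles standalone.
    public static IEnumerable<ICollection<T>> SplitEnumerable<T>(
        this IEnumerable<T> sequence, int nbrPerBlock)
    {
        var group = new List<T>(nbrPerBlock);
        foreach (T value in sequence)
        {
            group.Add(value);
            if (group.Count == nbrPerBlock)
            {
                yield return group;
                group = new List<T>(nbrPerBlock);
            }
        }
        if (group.Any()) yield return group; // flush out any remaining
    }

    // One output file per 20,000-line block.
    public static void SplitFileToParts(string filePath)
    {
        int part = 0;
        foreach (ICollection<string> block in
                 File.ReadLines(filePath).SplitEnumerable(20000))
        {
            // e.g. "data.csv" -> "data.csv.part0", "data.csv.part1", ...
            File.WriteAllLines(filePath + ".part" + part++, block);
        }
    }
}
```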

Is that not sufficient for you? You mention moving from position to position, but I don't see why that's necessary.

Author: DDiVita

Updated on June 04, 2022

Comments

  • DDiVita
    DDiVita almost 2 years

    I am trying to figure out how to split a file by the number of lines in each file. The files are CSV and I can't do it by bytes. I need to do it by lines. 20k seems to be a good number per file. What is the best way to read a stream at a given position? Stream.BaseStream.Position? So if I read the first 20k lines I would start the position at 39,999? How do I know I am almost at the end of a file? Thanks all

  • mqp
    mqp over 13 years
    You need to use File.ReadLines instead of ReadAllLines -- ReadAllLines reads it all into memory at once. Also, using index in the grouping function like that freaks me out.
  • Jimmy Hoffa
    Jimmy Hoffa over 13 years
    While this is indeed interesting, there are enough cases that you don't want to read an entire file into memory that I would at least add the stipulation that you need to know the files won't be too large if you're going to use this method..
  • Jimmy Hoffa
    Jimmy Hoffa over 13 years
    This seems the most straightforward to me, though for memory's sake I would flush the write buffer with each write possibly. If each line is 100 bytes, that makes 1000 lines 100k, and 20000 2MB. Not a ton of memory, but an unnecessary footprint.
  • Jon B
    Jon B over 13 years
    @Jimmy - I added AutoFlush = True, which automatically flushes after each write.
  • Lasse V. Karlsen
    Lasse V. Karlsen over 13 years
    Won't the grouping method collect everything regardless of whether you use ReadLines or ReadAllLines?
  • mqp
    mqp over 13 years
    I assume so, but with ReadAllLines, you'd have the whole thing in memory twice instead of once.
  • DDiVita
    DDiVita over 13 years
    Never thought about using LINQ. Nice!
  • Tergiver
    Tergiver over 13 years
    AutoFlush is a bad idea on a StreamWriter as it will flush after every single character (I looked at the code). If you don't specify a buffer size when creating a StreamWriter it defaults to only 128 characters, but that's still better than no buffer at all.
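    Tergiver's concern can be addressed by dropping AutoFlush and instead passing an explicit buffer size to the StreamWriter constructor; disposing the writer flushes anything still pending. A sketch (the 64 KB size is an arbitrary choice, not a recommendation from this thread):

    ```csharp
    using System.IO;
    using System.Text;

    class BufferedWrite
    {
        static void Main()
        {
            // Rather than AutoFlush (which flushes after every write), give
            // the writer a larger explicit buffer; Dispose at the end of the
            // using block flushes whatever remains.
            using (var sw = new StreamWriter("out.txt", append: false,
                                             Encoding.UTF8, bufferSize: 64 * 1024))
            {
                for (int i = 0; i < 20000; i++)
                    sw.WriteLine("line " + i);
            }
        }
    }
    ```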