What is the maximum number of records I can see in Notepad++?


If you load it using a DTS package, it will process the file serially, a chunk at a time. It won't attempt to load everything into memory first, so you will get all your records.

It looks like Notepad++ has a limit of 2 GB and will load that much of the file without crashing. If you can see 2 lakh (200,000) rows, then each row probably consumes about 10 KB of space (2 GB ÷ 200,000 rows ≈ 10 KB per row).
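
For what it's worth, the record count can also be checked before the load without opening the file in an editor at all. Below is a minimal Python sketch (not part of the DTS package, just an illustration) that streams the file in fixed-size chunks, much as a bulk loader would, and counts newline-terminated records; the file name ABC.D110111 is taken from the question:

    # A sketch: count records in a huge file by streaming it in fixed-size
    # chunks instead of loading the whole file into an editor.
    # The file name "ABC.D110111" comes from the question above.
    def count_lines(path, chunk_size=1 << 20):
        """Count newline-terminated records without loading the file into memory."""
        count = 0
        with open(path, "rb") as f:
            while True:
                chunk = f.read(chunk_size)  # read 1 MB per iteration
                if not chunk:
                    break
                count += chunk.count(b"\n")
        return count

    print(count_lines("ABC.D110111"))

On a 10 GB file this takes roughly the time needed to read the file from disk, and memory use stays at a single chunk.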

Comments

  • chandra sekhar almost 2 years

    I have a file with the name ABC.D110111, which is 10 GB in size. I have to load this huge data into a database table, so I created a DTS package to load from the file into the table. Before loading, I just want to see how many records exist in the file, so I opened it in Notepad++. The file should contain around 2.1 million (21 lakh) records.

    Since it is a 10 GB file, it will definitely have more than 2.1 million records, but I am able to see only 0.2 million (2 lakh) records in Notepad++. Is there any row limitation in Notepad++? If yes, how many rows, or up to what file size?

    If I load the file as-is by running my DTS package, will I get all the records or only some of them?

  • chandra sekhar over 11 years
    If you do not mind, just for confirmation: let us say there are 10 lakh records in the file and I am able to see 1 lakh records in Notepad++. If I run the DTS package, will it load 1 lakh records or all 10 lakh records?
  • RichardTheKiwi over 11 years
    You will get all your records: all 10 lakh. By the way, try to use only English units of measure on SO!
  • Sourav Ghosh about 6 years
    I had a 44 MB text file with about 2,100,000 lines, and Notepad++ failed to open it with a 'too big' error. I used glogg instead (Windows 10, 64-bit).
  • Alessio Moraschini about 3 years
    This does not mean that working with that data will be fluent and easy. The Scintilla engine used by Notepad++ (like other engines that provide syntax highlighting and similar features) uses much more memory than the data itself and needs a lot of CPU power on big files. Try to load a file with one million lines, and often the application will crash or hang for seconds or minutes. To view logs, I suggest specific tools such as glogg or similar ones ;)
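
In the same spirit as the comment above: when the goal is only to inspect a huge file rather than edit it, the first few records can be streamed without loading the rest. A minimal Python sketch, again using the file name from the question:

    # A sketch: look at the first few records of a huge file without
    # loading it into an editor. "ABC.D110111" is the file from the question.
    import itertools

    def head(path, n=10):
        """Yield the first n lines of a file, reading lazily."""
        with open(path, "r", errors="replace") as f:
            for line in itertools.islice(f, n):
                yield line.rstrip("\n")

    for record in head("ABC.D110111", 5):
        print(record)

Because the file object is iterated lazily, only the requested lines are ever read, so this works the same on a 10 GB file as on a small one.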