Text editor to open big (giant, huge, large) text files
Solution 1
Free read-only viewers:
- Large Text File Viewer (Windows) – Fully customizable theming (colors, fonts, word wrap, tab size). Supports horizontal and vertical split views. Also supports file following and regex search. Very fast, simple, and has a small executable size.
- klogg (Windows, macOS, Linux) – A maintained fork of glogg. Its main feature is regular expression search. It supports monitoring file changes (like tail), bookmarks, highlighting patterns using different colors, and has serious optimizations built in. But from a UI standpoint, it's rather minimal.
- LogExpert (Windows) – "A GUI replacement for tail." It's really a log file analyzer, not a large file viewer, and in one test it required 10 seconds and 700 MB of RAM to load a 250 MB file. But its killer features are the columnizer (parse logs that are in CSV, JSONL, etc., and display them in a spreadsheet format) and the highlighter (show lines with certain words in certain colors). Also supports file following, tabs, multiple files, bookmarks, search, plugins, and external tools.
- Lister (Windows) – Very small and minimalist. It's one executable, barely 500 KB, but it still supports searching (with regexes), printing, a hex editor mode, and settings.
Free editors:
- Your regular editor or IDE. Modern editors can handle surprisingly large files. In particular, Vim (Windows, macOS, Linux), Emacs (Windows, macOS, Linux), Notepad++ (Windows), Sublime Text (Windows, macOS, Linux), and VS Code (Windows, macOS, Linux) support large (~4 GB) files, assuming you have the RAM.
- Large File Editor (Windows) – Opens and edits TB+ files, supports Unicode, uses little memory, has XML-specific features, and includes a binary mode.
- GigaEdit (Windows) – Supports searching, character statistics, and font customization. But it's buggy – with large files, it only allows overwriting characters, not inserting them; it doesn't respect LF as a line terminator, only CRLF; and it's slow.
Built-in programs (no installation required):
- less (macOS, Linux) – The traditional Unix command-line pager tool. Lets you view text files of practically any size. Can be installed on Windows, too.
- Notepad (Windows) – Decent with large files, especially with word wrap turned off.
- MORE (Windows) – This refers to the Windows MORE, not the Unix more. A console program that lets you view a file one screen at a time.
Web viewers:
- readfileonline.com – Another HTML5 large file viewer. Supports search.
Paid editors/viewers:
- 010 Editor (Windows, macOS, Linux) – Opens giant (as large as 50 GB) files.
- SlickEdit (Windows, macOS, Linux) – Opens large files.
- UltraEdit (Windows, macOS, Linux) – Opens files of more than 6 GB, but the configuration must be changed for this to be practical: Menu » Advanced » Configuration » File Handling » Temporary Files » Open file without temp file...
- EmEditor (Windows) – Handles very large text files nicely (officially up to 248 GB, but as much as 900 GB according to one report).
- BssEditor (Windows) – Handles large files and very long lines. Doesn't require installation. Free for non-commercial use.
- loxx (Windows) – Supports file following, highlighting, line numbers, huge files, regex, multiple files and views, and much more. The free version cannot process regexes, filter files, synchronize timestamps, or save changed files.
Solution 2
Tips and tricks
less
Why are you using editors to just look at a (large) file?
Under *nix or Cygwin, just use less. (There is a famous saying – "less is more, more or less" – because "less" replaced the earlier Unix command "more", with the addition that you could scroll back up.) Searching and navigating under less is very similar to Vim, but there is no swap file and little RAM used.
There is a Win32 port of GNU less. See the "less" section of the answer above.
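As a quick sketch of what this looks like in practice (the sample file below is created just for illustration; the key bindings are GNU less defaults):

```shell
# Create a stand-in for a big log file, just for illustration.
seq 1 100000 > sample.log

# -S disables word wrap; with very long lines this is the difference
# between instant paging and a crawl (see the comments further down).
# When stdout is not a terminal, less simply copies the file through,
# so piping into head makes this safe to run non-interactively.
less -S sample.log | head -n 3

# Handy keys inside an interactive less session:
#   /pattern   search forward          ?pattern   search backward
#   g / G      jump to start / end     F          follow appended data,
#                                                 like tail -f (Ctrl-C stops)
```

Because less pages through the file on demand instead of loading it whole, it stays responsive where editors choke.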
Perl
Perl is good for quick scripts, and its ".." (range flip-flop) operator makes for a nice selection mechanism to limit the crud you have to wade through.
For example:
$ perl -n -e 'print if ( 1000000 .. 2000000)' humongo.txt | less
This will extract everything from line 1 million to line 2 million, and allow you to sift the output manually in less.
Another example:
$ perl -n -e 'print if ( /regex one/ .. /regex two/)' humongo.txt | less
This starts printing when "regular expression one" finds something, and stops when "regular expression two" finds the end of an interesting block. It may find multiple blocks. Sift the output...
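If Perl isn't available, the same two selections can be sketched with POSIX sed (the file and ranges below are stand-ins, scaled down so the example is runnable as-is):

```shell
# Build a small stand-in for humongo.txt.
seq 1 20 | sed 's/^/line /' > humongo.txt

# Absolute line range, like Perl's (1000000 .. 2000000); the trailing
# 'q' makes sed quit right after the range, so it never reads the rest
# of a multi-gigabyte file.
sed -n '5,8p; 9q' humongo.txt

# Pattern-delimited blocks, like Perl's /regex one/ .. /regex two/:
sed -n '/line 12/,/line 14/p' humongo.txt
```

Pipe either command into less to sift the output, exactly as with the Perl versions.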
logparser
This is another useful tool you can use. To quote the Wikipedia article:
logparser is a flexible command line utility that was initially written by Gabriele Giuseppini, a Microsoft employee, to automate tests for IIS logging. It was intended for use with the Windows operating system, and was included with the IIS 6.0 Resource Kit Tools. The default behavior of logparser works like a "data processing pipeline", by taking an SQL expression on the command line, and outputting the lines containing matches for the SQL expression.
Microsoft describes Logparser as a powerful, versatile tool that provides universal query access to text-based data such as log files, XML files and CSV files, as well as key data sources on the Windows operating system such as the Event Log, the Registry, the file system, and Active Directory. The results of the input query can be custom-formatted in text based output, or they can be persisted to more specialty targets like SQL, SYSLOG, or a chart.
Example usage:
C:\>logparser.exe -i:textline -o:tsv "select Index, Text from 'c:\path\to\file.log' where Index > 1000 and Index < 2000"
C:\>logparser.exe -i:textline -o:tsv "select Index, Text from 'c:\path\to\file.log' where Text like '%pattern%'"
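For anyone avoiding the Windows-only tool, here are rough POSIX equivalents of those two queries in awk (with a tiny stand-in log so the commands run as-is; real queries would use the bigger line numbers):

```shell
# A tiny stand-in for c:\path\to\file.log.
printf 'alpha\nbeta pattern\ngamma\n' > file.log

# Line-range query ("where Index > N and Index < M"): NR is the line
# number, and the output is tab-separated index/text like -o:tsv.
awk 'NR > 1 && NR < 3 { print NR "\t" $0 }' file.log

# Substring query ("where Text like '%pattern%'"):
awk '/pattern/ { print NR "\t" $0 }' file.log
```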
The relativity of sizes
100 MB isn't too big. 3 GB is getting kind of big. I used to work at a print & mail facility that created about 2% of U.S. first class mail. One of the systems for which I was the tech lead accounted for 15+% of the pieces of mail. We had some big files to debug here and there.
And more...
Feel free to add more tools and information here. This answer is community wiki for a reason! We all need more advice on dealing with large amounts of data...
Yuvi
Updated on July 08, 2022

Comments
-
Yuvi almost 2 years
I mean 100+ MB big; such text files can push the envelope of editors.
I need to look through a large XML file, but cannot if the editor is buggy.
Any suggestions?
-
Anders Sandvig over 15 years – Actually, text files of 100+ MB or even 1+ GB are not as uncommon as you may think (e.g. log files from busy servers).
-
Rich over 14 yearsSneakyness: And not exactly text. I think the requirements of reading text files and reading binary files differ somewhat. You might pass it through base64 or uuencode, though.
-
Gabe about 14 yearsBack in 1995 I used WinWord to open 64MB files on a 16MB machine. I'm sure it would do just as well 15 years later.
-
indiv about 14 years – To generate random text files instead of binaries, use this: cat /dev/urandom | tr -dc 'A-z' | head -c 1000000, where the number after -c is the number of bytes in the file.
-
Niels Brinch almost 13 yearsMicrosoft Office Access can actually read and parse very large xml files, but will only make sense of it if the xml format fits with something that it can translate to a table.
-
daydreamer over 12 yearsif using vim :set binary superuser.com/questions/364012/…
-
ONDEV over 12 yearsThis should be at least a similar question or even linked as it was asked 18 months prior... stackoverflow.com/questions/102829/…
-
HorseloverFat almost 12 yearsI was also looking for the answer to this exact question in order to read some huge log files that I've generated!
-
Carl over 11 yearsHere's my fallback: GigaEdit (heliwave.com/GigaEdit.html). Nothing fancy, but small, portable, free and opens massive files in an instant.
-
Rodolfo over 10 years@BlairHippo I feel the same way, I'm almost nervous when asking a question because chances are high that someone will say "Close this, it should go in WhateverExchange instead"
-
user624558 over 10 years – @Sneakyness this can be used to generate large files in a matter of seconds as well: grep -r "someText" . > bigfile, assuming that there are some files in your dir containing lines that match the search criteria. Of course, you would need to stop grep forcefully, as it will enter an endless loop :)
-
Amol Pujari about 10 yearsamolnpujari.wordpress.com/2012/03/31/reading_huge_xml-rb its so simple to deal with large xml in ruby
-
Ivan Kuckir almost 10 yearsTo view files, I recommend to use this online viewer - readfileonline.com - you don't have to install any programming interface, it works in every device and OS.
-
dhysong almost 10 yearsOn a windows machine with powershell > Get-Content C:\Scripts\Test.txt -totalcount 3
-
Bernhard almost 10 yearswinasm.net/free-small-fast-text-editor.html Free and very fast
-
Jenson M John almost 10 yearsYou can try this online jenson.in/demos/open_giant_files_in_browser.php
-
rustyx over 9 yearsFirst ask yourself this: do you actually want to edit a file >1GB in size, or do you just want to view it quickly, and be able to edit other, "normal" files? In the latter case you'll have a much better choice of log viewers and text editors.
-
Mike Stone over 15 yearsVIM, or Emacs... pick your poison, both will handle any file you throw at them. I personally prefer Emacs, but both will beat notepad without so much as a hiccup.
-
leo7r about 15 yearsEmacs has a maximum buffer size, dependent on the underlying architecture (32 or 64 bits). I think that on 32 bit systems you get "maximum buffer size exceeded" error on files larger than 128 MB.
-
barfoon almost 15 yearsI just tried Notepad++ with a 561MB log file and it said it was too big
-
Nippysaurus almost 15 yearsI regularly open ~600mb files with gVIM ...
-
boxofrats almost 15 yearsI've been asked in the past to edit a couple of plain text files in the multi-GB range, which our users tried to edit with MS Word... well, most of you will know what happened. Just opened it in vim and searched and replaced with the user sitting next to me in a matter of seconds (after that huge file was finally read in of course).
-
baudtack almost 15 years@Rafal Interesting! Looks like on 64bit it is ~1024 petabytes. The reason has to do with the fact that emacs has to track buffer positions (such as the point)
-
Benno over 14 years – But be careful, vim will only work as long as the files in question have enough line breaks. I once had to edit a ca. 150 MB file without any line breaks, and had to resort to gedit because vim couldn't handle it.
-
Dave Kirby about 14 yearsIf you are going to use (g)vim then to improve performance you may want to turn off some features such as syntax highlighting, swapfile and undo. See vim.wikia.com/wiki/Faster_loading_of_large_files, vim.wikia.com/wiki/VimTip611 and vim.org/scripts/script.php?script_id=1506.
-
wasatz about 14 years+1, I recently had some really huge xml files (+1 gigabyte) that I needed to look at. I'm on windows and both vim, emacs, notepad++ and several other editors completely choked on the file to the point where my system almost became unusable when trying to open the file. After a while I realized how unnecessary it was to actually attempt to open the file in an -editor- when I just needed to -view- it. Using cygwin (and some clever grep/less/sed-magic) I easily found the part I was interested in and could read it without any hassle.
-
InfantPro'Aravind' almost 14 yearsI wonder if 5GB text files exist .. :-O .. if you don't mind, may I know .. (in practical world) where we are forced to use/edit these bulky text files.. (alternate way would have been to break the file and make a few of it.. usually larger files, of any file-type, make system cry to give performance)
-
Paul Nathan almost 14 years@Rafal: emacs buffer size can be boosted with emacs 23. I don't recall offhand how to do it.
-
schmoopy almost 14 yearsI tried all, gVim sucked in that it didnt even tell you it was loading a file - took forever to load the file (only 200k, 5 million lines). SlickEdit opened the entire file in about 3 seconds. Getting the trial license was a PIA tho. Thank you for listing these.
-
Joseph Garvin over 13 yearsEmacs definitely has a buffer size problem on 32-bit.
-
Joe Koberg over 13 yearsI want an editor that mmap()s the file and reads only the parts I am looking at... even gvim seems to load the whole thing into memory first, and even resizing the window freeezes it while it thinks...
-
Jayme Tosi Neto over 13 yearsI had a problem with a single character encoded by 6MB of code. Notepad++ and Netbeans couldn't handle it, but the 010 Editor did it easily! ;D
-
rogerdpack over 12 yearsgVim takes forever to load a 2GB file, and then seeking within it is painfully slow too. It seems to load the entire file into RAM. Maybe the large file plugin would help but the default doesn't seem optimal.
-
rogerdpack over 12 yearscygwin's less works for viewing a file > 2GB sweet
-
Dustin Davis over 12 yearsWasn't able to get any of these to work on a 1.6GB file, especially gVim or any other Vim for windows that I could find. Had to use filesplitter to break it into 100MB chunks then I used EditPlus to view them. The old Edit.com (DOS) can handle large files (a few hundred MB) but is not available in 64bit windows.
-
ChristophK over 12 yearsyou don't need cygwin for less, you can also use it under windows: gnuwin32.sourceforge.net/packages/less.htm
-
Zane over 12 years010Editor successfully opened my 3.3GB MySQL database dump.
-
FIre Panda over 12 yearsDoes 010 Editor has XML formatting option?
-
dash-tom-bang about 12 yearsThe 32-bit version of Vim 7.3 crashes at about 2G if the swap is on and just gives errors about an incorrect line count if swap is off. The 64-bit version however works ok; I'm currently "browsing" a 7.5G file and I can't resize the window with the mouse, seeking is a bit slow, but it works. (81M lines, a log of all memory allocations in an application.)
-
JavaAndCSharp almost 12 years@docgnome Aw, crap. What about my 1025 petabyte file? Ugh. I guess Emacs is so stuck in the past that it can't edit it. What does it think the date is, 1997?
-
Pixel Elephant over 11 yearsUsed a combination of Large Text Viewer and HxD. LTV for nice text view and line seeking, and HxD for actual editing + search and replace.
-
nuala over 11 years – I couldn't figure out how to edit files with less, but after getting the taste I used vim to edit a bulky JSON object which made the other editors (Brackets, Textmate <- note I'm running on OS X) throw up.
-
lichtfusion about 11 yearsThis XML editor here has also a large file viewer component and does provide syntax coloring also for huge files. The files are not loaded completely into memory so a multi-GB document shouldn't be a problem. In addition this tool can also validate those big XML documents ... In my opinion one of the best approaches to work with huge XML data.
-
HerrimanCoder over 10 yearsI had a 2+ gigabyte sql script that I tried to open with gVim. It folded like a pair of 2's, then sat down and cried. Then I tried 010Editor, which shined like a CHAMP. Scrap gvim, use 010.
-
user2097804 over 10 years – I use HxD to open large files. It works very well for me and has many other features. I can open and view my entire hard drive with it (~200 GB) in less than a second. Scrolling through files and editing them is very smooth as well.
-
Aaron R. over 10 years – "But be careful, vim will only work as long as the files in question have enough line breaks." @Benno, That's a configuration setting in vim, not a limitation. You can change it this way: :set display+=lastline. It is kind of weird to not have that as the default, though.
-
Martin Ba over 10 yearsIf you have been using LargeTextFileViewer, switch to glogg! LTF is smaller, but glogg seems to work much better.
-
oabarca about 10 yearsIf you only want to view the file contents, then I suggest a very nice tool online: readfileonline.com it works on all modern browsers.
-
Aravind Yarram about 10 yearsLarge Text File Viewer was able to open a 22GB file without any issues.
-
Millemila almost 10 years – Is there a way to see the file with the correct line breaks in HxD? I see weird "..." as separators, and not even at the end of the line. Thanks!
-
PaulBGD almost 10 yearsUsing Large Text File Viewer, I just opened a 30gb file with no issue. Finding a specific keyword took about 5 minutes though.
-
Mike almost 10 yearsgVim would open my 1.6 GB file but I couldn't really do much with it... I understand it's a big task, but I needed to do a find and replace about 150 million instances. gVim would only make it about half way.
-
miodf almost 10 yearsEmeditor opens fast "up to a 248 GB limit (or 2.1 billion lines)" emeditor.com/text-editor-features/large-file-support/… For csv files there is also Delimit ("Open data files up to 2 billion rows and 2 million columns large!") delimitware.com
-
Denilson Sá Maia almost 10 yearsWhen using gVim, use this plugin to automatically disable slow features on large files: github.com/vim-scripts/LargeFile
-
Roboprog about 9 yearsI'm going to "disavow" the logparser.exe thing that "the community" added to my answer, as I would prefer to stick to POSIX tools, rather than some Microsoft-only thing, which also apparently takes a longer command line to invoke than my other examples.
-
Paul Zahra almost 9 yearsgVim choked on an 8GB file and 'stopped working'.
-
goat almost 9 years – I tried the less packaged in the MinGW shell, and it complains about memory on a 400 MB file. No good.
-
Andy Brown almost 9 years – less is great as long as the lines aren't too long. I'm in this thread because less (Linux) is choking badly on debug log files that contain BIG serialized XML and I need something faster.
-
Andy Brown almost 9 years – OK so I just fixed my own issue. less with word wrap is slow. less -S without word wrap is lightning fast even on large lines. I'm happy again!
-
Earlz over 8 yearsI couldn't get 010 editor to search across a large 20G text file... it just choked and froze until I killed it
-
Hans Ginzel over 8 yearsSee LargeFile plugin to Vim.
-
Clemens over 8 yearsYou can browse, edit and search/replace in text files of almost any size with XML ValidatorBuddy. In addition you get syntax-coloring for XML documents. The editor lets you select the encoding too.
-
Dan over 8 yearsLogExpert fails on long lines. If a line is ~8000 chars, it will get cut off and the following line will also be cut off, much shorter. HxD worked fine.
-
Andrew Jens about 8 yearsI used 010 Editor to open a 42.7 Gb file (yes, Wikipedia XML file 27). It worked (and the file has 634,957,038 lines)! I'm using Windows 10 64-bit on a laptop with 16 Gb RAM, and v6.0.3 of 010 Editor. I could also search for a string that was at the end of the file (although 010 took almost 10 minutes to finish the search).
-
bmende about 8 yearsAs @user2070775 suggested, readfileonline.com is fantastic for a quick view.
-
transistor1 almost 8 years – Great answer. I want to note that if you have Git for Windows installed, you probably have Git Bash as well, which includes less.
-
ruffin almost 8 yearsLarge Text View Viewer still exists here.
-
Eric Bole-Feysot almost 8 yearsI use myself Editpad lite (free for non commercial use) which can edit files larger than 4 GB, even if your PC only has a few GB of RAM. Also, the maximum length of a single line is not limited, which is a problem with many editors claiming to support "unlimited" file sizes.
-
clg4 over 7 years – Windows cmd more is ideal for looking at and confirming file structure.
-
Pavin Joseph over 7 yearsSublime Text 3 works beautifully for large files; I don't have the rep to add it as answer but just tested with a 1.6G log file. It even shows a loading bar while reading the file, feels fluid to use unlike the others mentioned here. Its free and also has a portable version!
-
skan over 7 years – I don't know why I can't reply to the OP. The best options on Windows are: EmEditor, Delimit, Editpad Pro and Texpad, all commercial, and all can read and write and do many things with files much bigger than memory. EmEditor and Delimit offer the option to see CSV files as a spreadsheet. I've been having problems with EmEditor; it keeps reloading the file every few seconds, preventing you from working. You can also try SlickEdit and HippoEdit.
-
Ultranuke over 7 years010 Editor was perfect for my situation where I had to edit a 8GB mysql dump that failed and I had to resume, I also removed some GBs from log tables, this saved me a lot of time!
-
myuce almost 7 yearsAlso total commander's "view file" menu option (F3) is very good at this. What it does, I think, is virtual scrolling; not loading all the content.
-
gaborous over 6 yearsLargeTextFileViewer worked for me for 2 GB of text. Emurasoft's EmEditor claims to open up to 248GB, might be worth a look if other solutions did not work for you.
-
Rüdiger Schulz over 6 yearsLiquidSoft has a horrible installer which left me with a non-closable dialog.
-
GabrielBB over 6 years010 Editor worked like a charm :D
-
Rodolfo Velasco over 6 years – I have tried LogExpert and it worked well for a 1 GB text file. The others were not free or didn't work for me.
-
Ivan Akcheurov over 6 yearsLiquid Large File Editor does great job and is free (Community Edition): liquid-technologies.com/large-file-editor
-
smwikipedia over 6 yearsI tried LogExpert 1.5.5493. It always crash after opening a large txt file for less than 2 mins. The txt file is about 2.5G.
-
Ahmet Korkmaz over 6 yearsMixed bag experience with 010Editor on a 19GB JSON trace file. Takes several seconds to "scan for linefeeds", before showing the file. Crashed doing searches, and seems to look for all matches before moving to the first "closest" match (since took a while to show a match, with a full list of matches at that time). Otherwise still allowed me to examine my super-large file, so not a complete waste. FWIW.
-
Sina Madani almost 6 yearsNote that Liquid Studio requires you to sign up even for the "free" Community Edition (e-mail, name etc.) which is a bit dodgy. LogExpert tries to load the entire file into memory, so with a 30 GB file and 12 GB RAM it didn't work for me.
-
Desik almost 6 yearsLiquid Studio text search speed is mere 23 MB/s on my computer! Ridiculous. In comparison EmEditor speed is 312 MB/s and is limited by CPU speed. I have a SSD. Benchmarked on 06/19/2018.
-
Desik almost 6 years – Update: glogg has search performance in between Liquid Studio and EmEditor. EmEditor is still the fastest.
-
Violet Giraffe over 5 yearsOn 64-bit Windows, the standard Notepad.exe handled opening and searching in a 450 MB csv file perfectly while Sublime Text 3 hung forever and VS Code said it's too large.
-
Roboprog over 5 yearsWell, this thing has become a bit of an "Alan Smithe Production". My original answer said nothing about IIS logparser, since I avoid MS-Windows whenever possible. Happy Wiki-ing, I guess :-)
-
kd4ttc over 5 yearsThe locked questions are some of the best questions on StackOverflow.
-
Shane over 5 yearsGlogg is fantastic. I can easily spot trends and filter down to a log token. Thanks for the recommendation.
-
Faheem about 5 yearsI think the VS Code is a great option. Free and Open Source. Just added to the list.
-
Zeeshan Ahmad Khalil almost 4 years – Sublime Text loaded a 1.4 GB .sql file for me, and I can edit this file and save it. Saving takes a little bit of time, but you shouldn't click anywhere, as that could crash Sublime Text and stop the process.
-
Chidi almost 4 years – I recommend glogg, which is free and very fast... tested with an 8 GB text file. But it supports read-only.
-
Kelly Elton over 3 yearsVSCode no longer opens large text files.
-
JP-Hundhausen over 3 years – With a ~240 MB one-liner XML file, VIM, Emacs and Notepad++ don't work. LogExpert somewhat works, and with VS Code you can't click into the text or you get an Out of Memory exception. Liquid Studio (mentioned as Large File Editor) works, but the special features are turned off.
-
BornToCode about 3 yearsYou wrote Lister support regex search - how did you manage to do regex search with Lister? (Surprisingly I've found it the best to handle large files but I didn't find how to do regex search with it)
-
fav about 3 yearsAuthor of Klogg here. Klogg has switched to hyperscan regular expressions library in current dev builds. In our tests searching in big files is almost 2 times faster now. Any feedback is welcome github.com/variar/klogg
-
Bruce Adams over 2 yearsIf emacs performs poorly with a large file try M-x fundamental-mode to switch off highlighting.
-
Samir over 2 yearsI don't open enormous text files on a daily basis, but I have used Lister a couple of times now and I found it very easy to use, stable, and performant. No need to install it even, it just works; for every few times in a decade I need this kind of tool. So I cast my vote on Lister for Windows users. If you plan on doing this sort of thing on a regular basis though, I highly recommend that you switch to a more suitable OS to begin with, and when you do that you suddenly open the flood door to endless possibilities.
-
Samir over 2 years – @BornToCode To enable RegEx in Lister, you simply click to check the "RegEx" option in the search box, type in your pattern and click OK to start searching. Using only the keyboard you can press Ctrl+F, Alt+X, Tab, Tab, Tab, Tab, type in your pattern and press Enter to start searching. (You have to tab the focus back to the search input box after enabling RegEx.) Try something simple like searching for . (dot) with RegEx disabled vs. RegEx enabled to see the difference. Press F3 to continue to the next match.
-
golimar over 2 years – @AndyBrown -S is great, and if you're going to move to the end of the file, -Sn is even better.
-
Basj over 2 years – GigaEdit's link is dead, could someone update it? I didn't find the new homepage.
-
abhij89 almost 2 yearsUpdate 2022: EmEditor(Windows) crashed the system on opening a 17 GB file. Works well for anything less than 5 GB. Wouldn't recommend it for files larger than 5 GB.
-
beginner almost 2 years – klogg easily searched a 400 MB file with a regex in just 1 second! Awesome. Thank you for sharing🙏