Linux slows to a crawl when unarchiving a big file
Solution 1
The slowdown is probably happening because of iowait. The ionice command should allow you to continue working:
ionice -c3 command
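For instance, to wrap the unarchiver in the idle I/O class (the archive name below is hypothetical; with the original problem you would wrap unrar, e.g. `ionice -c3 unrar x big-archive.rar`):

```shell
# -c3 selects the "idle" I/O scheduling class: the wrapped command only
# gets disk time when no other process is asking for it.
# Demonstrated here with a harmless command so it runs anywhere:
ionice -c3 echo "running in the idle I/O class"
```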
Solution 2
Try running the command with a lower priority using the nice utility. Uncompressing large files is CPU-intensive, which is why compression tools often feature in CPU benchmarks and reviews.
Example:
$ nice -15 ./myprogram
The number you specify is an adjustment to the default nice level: -20 is the highest priority and 19 the lowest. Negative values are reserved for the root user.
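The modern `-n` syntax is equivalent to the bare `-15` above. Since nice with no arguments prints the current niceness, it can demonstrate the adjustment taking effect:

```shell
# Run 'nice' itself under 'nice -n 15': the child inherits the
# adjusted niceness and prints it (15 when the shell runs at the
# default niceness of 0).
nice -n 15 nice
```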
Solution 3
I also had this same phenomenon, but I think I have found the answer. I used to run unrar on one disk, reading and writing. Since that is a lot of I/O on a single drive, all other processes got no time on the disk. Now I have unrar place the unpacked files onto another disk: not another partition, but another physical disk. This has several advantages:
- it goes much faster, so the computer is slowed down for less time.
- the computer no longer comes to a halt as it did before, because it now reads from one disk and writes to the other. In other words, the disks divide the total amount of work between them.
It's a simple solution, but it works.
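The two-disk idea above can be sketched as follows. The paths are hypothetical stand-ins: in practice the destination would be a mount point on a second physical drive, and with unrar the equivalent would be `unrar x big-archive.rar /mnt/disk2/extracted/`. The sketch uses tar so it runs without unrar installed:

```shell
# Two temp directories stand in for two separate physical disks.
src=$(mktemp -d)    # "disk 1": holds the archive
dest=$(mktemp -d)   # "disk 2": receives the extracted output

# Build a tiny archive on "disk 1".
echo "payload" > "$src/file.txt"
tar -cf "$src/archive.tar" -C "$src" file.txt

# Extract it onto "disk 2": reads hit one device, writes hit the other.
tar -xf "$src/archive.tar" -C "$dest"
cat "$dest/file.txt"
```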
Solution 4
I had a similar problem with a big .zip file (1.4 GB) containing almost 200,000 small files. On my Ubuntu 13.10 64-bit it would have taken over 10 hours. No freezes, and the system doesn't really slow down; the uncompressing itself is just incredibly slow.
I tried the virtual machine solution mentioned above, with Virtualbox and W7 64. Here comes the surprise:
1) At first, I shared the folder with the virtual machine and tried to unzip it there, in the same location (a virtual F: drive in W7) with 7-Zip. No luck: the same crappy speed that would have taken forever. 7-Zip reported an initial output of 200 KB/s, but it kept slowing down until I stopped it (under 100 KB/s with an ETA of 7 hours, and it probably would have slowed down further and taken way longer).
2) Then I moved the .zip file inside the virtual machine's "hard disk drive" (what the VM believes to be an HDD), so the file was no longer in a folder shared with Ubuntu. Surprise, surprise: it worked great, at around 2000 KB/s, and took less than 15 minutes.
3) Anecdotally, a 32-bit Windows 7 system (not a virtual machine) with exactly the same hardware took around one hour, with a stable output of around 500 KB/s according to 7-Zip. I have no idea how the 32- to 64-bit change affects uncompressing; I just thought it was worth mentioning for comparison.
Ubuntu 13.10 64-bit uses ext4; W7 uses NTFS in both the 64-bit VM and the normal 32-bit system. What really bewilders me is that the W7 VM is ultimately using the underlying ext4 file system, because it is a VM, and it still achieves those speeds.
I hope some gurus read this and figure this out, this is extremely annoying and intriguing.
Solution 5
I found a solution. I have a Windows virtual machine already installed in Linux. I shared the folder containing the archive with the virtual machine. Then, inside Windows, I unarchived the file using 7-Zip and everything went smoothly. It took a long time, but I didn't see any noticeable difference in system performance. The official 7-Zip application isn't available for Linux (though the p7zip port provides its command-line tools). Windows can still be useful sometimes!
Phenom
Updated on September 17, 2022

Comments
-
Phenom over 1 year
I started unarchiving a RAR file that is several gigabytes big. The computer is now going really slow; it's almost frozen. Sometimes I can move the mouse a little, but that's it. The unarchiving process seems to have halted, so now all I can do is restart the system. I don't think I can unarchive this file in Linux.
I never had this problem in Windows. How can this be fixed?
-
Sasha Chedygov over 13 years: You mean -20 being the lowest, not highest. :) Fixed it for you.
-
Sasha Chedygov over 13 years: It's unlikely to be RAM; more likely a slow CPU.
-
Janne Pikkarainen over 13 years: Don't forget the ionice command.
-
Vishu over 13 years: Hmm. If it's not a RAM problem, perhaps it would be easy to simply open gnome-system-monitor and give the unarchiving process a lower priority (a higher nice number). Then it won't slow the whole system down.
-
John T over 13 years: @musicfreak I meant what I said: -20 is the highest priority you can give a process, not the lowest. That is why only root can assign negative priorities; they have the highest precedence.
-
Sasha Chedygov over 13 years: Oh, I see. You should reword that sentence, then, because it doesn't make much sense as it stands. (Nevertheless, reverted, my mistake.)
-
Phenom over 13 years: My CPU isn't slow. It has four cores.
-
John T over 13 years: Check out p7zip: p7zip.sourceforge.net
-
Phenom over 13 years: Does nice only work when launching programs, or does it also work on programs that are already running?
-
John T over 13 years: @Phenom if you want to change the nice level of a program that is already running, look into the renice command.
-
Happy over 3 years: ionice -c 3 -n 7 command is better, where -n means priority levels from 0 to 7, and 0 represents the highest priority level, 7 the lowest.
-
shaik moeed about 3 years: Can you share an example command?
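The renice and ionice commands mentioned in the comments can also be pointed at a job that is already running, via its PID. A minimal sketch, with a sleep process standing in for a long unarchive job:

```shell
# A long-running stand-in for an unarchive job that is already running.
sleep 30 &
pid=$!

# Lower its CPU priority (raising your own process's niceness
# needs no root privileges)...
renice -n 15 -p "$pid"

# ...and move its disk I/O to the idle scheduling class.
ionice -c3 -p "$pid"

kill "$pid"
```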