GNOME / Nautilus: copying files to USB stops at 100% (or near it)


Solution 1

The reason it happens that way is that the program says "write this data" and the Linux kernel copies it into a memory buffer that is queued to go to disk, then reports "OK, done". So the program thinks it has copied everything. When the program then closes the file, the kernel suddenly makes it wait while that buffer is pushed out to disk.

So, unfortunately the program can't tell you how long it will take to flush the buffer because it doesn't know.
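You can watch this write-back behaviour yourself. A minimal sketch, assuming the stick is mounted at /media/user/myusbstick (adjust the path; the file name is just an example):

```shell
# Write 100 MB to the stick; dd returns as soon as the data sits in the
# kernel's page cache, long before it reaches the USB hardware
dd if=/dev/zero of=/media/user/myusbstick/bigfile bs=1M count=100

# sync blocks until the kernel has flushed everything out to disk;
# on a slow USB stick this is where the real wait happens
time sync
```

The dd command typically finishes almost instantly, while the sync can take minutes on a slow stick — that gap is exactly what the progress bar in Nautilus fails to show.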

If you want to try some power-user tricks, you can reduce the size of the buffer that Linux uses by setting the kernel parameter vm.dirty_bytes to something like 15000000 (15 MB). This means the application can't get more than 15 MB ahead of its actual progress. (You can change kernel parameters on the fly with sudo sysctl vm.dirty_bytes=15000000, but making them persist across a reboot requires editing a config file such as /etc/sysctl.conf, which may be distro-specific.)
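For reference, the commands might look like this (the /etc/sysctl.d drop-in path is an assumption — some distros use /etc/sysctl.conf directly):

```shell
# Show the current value; 0 means the ratio-based vm.dirty_ratio applies instead
sysctl vm.dirty_bytes

# Cap the dirty write-back buffer at 15 MB until the next reboot
sudo sysctl vm.dirty_bytes=15000000

# Persist the setting across reboots (drop-in directory is distro-dependent)
echo 'vm.dirty_bytes = 15000000' | sudo tee /etc/sysctl.d/99-dirty-bytes.conf
```

Note that vm.dirty_bytes and vm.dirty_ratio are mutually exclusive: setting one zeroes the other.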

A side effect is that your computer might have lower data-writing throughput with this setting. On the whole, though, I find it helpful to see that a program is running a long time while it writes lots of data, rather than the confusion of a program appearing to be done with its job while the system lags badly as the kernel does the actual work. Setting dirty_bytes to a reasonably small value can also help prevent your system from becoming unresponsive when you're low on free memory and run a program that suddenly writes lots of data.

But, don't set it too small! I use 15MB as a rough estimate that the kernel can flush the buffer to a normal hard drive in 1/4 of a second or less. It keeps my system from feeling "laggy".

Solution 2

Late to the party, but a workaround I use to copy big files to a USB stick is rsync.

The basic syntax I always use successfully is the following:

rsync -avh /home/user/Documents /media/user/myusbstick

Warning: if you want to copy the whole Documents folder, the syntax above is fine. But if you want to copy only the folder's contents, not the folder itself, you have to add a trailing slash. Like this:

rsync -avh /home/user/Documents/ /media/user/myusbstick

Of course, if you want to copy a single file:

rsync -avh /home/user/Documents/file1 /media/user/myusbstick

For multiple files:

rsync -avh /home/user/Documents/file1 /home/user/Documents/file2 /media/user/myusbstick

The syntax works for any folder/file you want to copy.
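Since rsync knows in advance how much data it has to move, it can also show real progress while copying. A variant I find useful — this assumes rsync ≥ 3.1, which introduced --info=progress2 for overall (rather than per-file) progress:

```shell
# -a archive mode, -v verbose, -h human-readable sizes;
# --info=progress2 prints a single overall progress line for the transfer
rsync -avh --info=progress2 /home/user/Documents/ /media/user/myusbstick
```

On older rsync versions, plain --progress gives per-file progress instead.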

I'm aware this is not the real solution, but it's an easy and safe way to avoid these annoying issues.

Solution 3

Old question, but it seems the problem still comes up. Setting the buffer to 15 MB as suggested here did not work on Ubuntu 19.04, and it brought my system to a grinding halt.

I was trying to copy a 1.5GB file onto an empty (newly formatted) FAT32 16GB drive. I let it run for about 10 minutes just to see if it would finish, with no luck.

Reformatting to NTFS let the operation finish in less than 10 seconds. I don't know why this would matter, because FAT32 should allow any single file under 4 GB, but it worked just fine. Not an ideal fix for drives you want to use with macOS, but an easy workaround for all other use cases. I imagine exFAT would have worked similarly, but I did not test it.
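If you want to try the same reformat from the command line, a sketch might look like this. WARNING: this destroys all data on the stick, and /dev/sdX1 is a placeholder — check the real device name with lsblk before running anything:

```shell
# Identify the stick first (look for its size and mount point)
lsblk

# Unmount it, then quick-format the partition as NTFS
# (--fast skips zeroing; --label is optional)
sudo umount /media/user/myusbstick
sudo mkfs.ntfs --fast --label USBSTICK /dev/sdX1
```

mkfs.ntfs comes from the ntfs-3g package on most distros.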



Author: skiwi

Updated on September 18, 2022

Comments


    • Admin
      Admin over 9 years
Just wondering, is it possible at all to check the progress in that situation?
    • Admin
      Admin almost 7 years
Try formatting the pendrive with the "overwrite existing data with zeros" option. It worked on my Transcend 8 GB pendrive.
    • Admin
      Admin over 5 years
      For anyone coming across this issue, just format your drive to NTFS.
    • Admin
      Admin over 3 years
I experienced very low copy speed to a USB stick formatted as FAT32 and as ext4. It was solved by formatting it to NTFS.
    • Admin
      Admin about 3 years
      Seeing the accepted answer's conclusion, I feel encouraged to bring forth my solution: the real-time system-resource monitoring graphs on the desktop. Back in Ubuntu's Unity DE it was called indicator-applet and in contemporary Gnome it's called gnome-shell-extension-system-monitor. The "Disks" graph will clearly show the ongoing write process (in case of SDCard devices too), and reliably provides the clue when the actual writing is complete. An installation guide of the latter (for Gnome desktops): askubuntu.com/a/1306383/1157519
  • Sidahmed
    Sidahmed over 7 years
I was looking for a fix to this problem for a year or more; I thought it was just a bug in Linux. Thanks a lot.
  • Brofessor
    Brofessor almost 6 years
Linux noob here, could someone post how to change the dirty_bytes value?
  • dataless
    dataless almost 6 years
    @Brofessor Oh, sorry, I should have described it by the official name instead of /proc details. Answer is updated.
  • Rmano
    Rmano over 5 years
This is similar to unix.stackexchange.com/questions/107703/… --- it should have been fixed, but believe me, it's not. I had to add this setting on Ubuntu 18.04 to stop it behaving strangely...
  • sziraqui
    sziraqui over 4 years
    Works on Fedora 30 too. I am surprised to see such stupid behaviour even in modern Linux distros.
  • Mohith7548
    Mohith7548 over 4 years
Better read this: blog.programster.org/…
  • John
    John about 4 years
Same problem with 19.10 and an exFAT drive. Unable to copy a 40 GB VM disk over USB 3 Gen 2 to an M.2 drive (it always fails using Nautilus, and even cp/rsync). The system is a 3950X with 64 GB of memory, so not short on resources. I have to copy to a network drive (a CentOS box) and then mount my USB drive on my network server. This problem has been following Ubuntu like a dog with fleas.
  • Paul TIKI
    Paul TIKI almost 4 years
Curious: the quick-and-dirty change to NTFS is at least giving me an indication that the files are moving, and it's much faster than FAT32. I wonder what ext4 would do?
  • deFreitas
    deFreitas almost 4 years
    Still works on 'Ubuntu 18.04 LTS'
  • deFreitas
    deFreitas almost 4 years
    @Mohith7548 the relevant information from your referred post is already mentioned on this answer, your link is broken though, the fixed link
  • AdminBee
    AdminBee over 3 years
    Welcome to the site, and thank you for your contribution. You may want to suggest the --progress option to have a rough equivalent of the progress bar found on graphical tools.
  • endrias
    endrias about 3 years
I have seen this problem mentioned across forums dealing with various Debian flavors; the only apparent solution is disabling sync as a mount option.
  • RomuloPBenedetti
    RomuloPBenedetti about 3 years
For anyone interested in developing a complete solution, I've done some exploration here: github.com/RomuloPBenedetti/SaneFileTransfere. It is a crude investigation of this particular problem.
  • gianni
    gianni over 2 years
This is 2021, I'm using Ubuntu 20.04, and it's still impossible to copy a 1 gigabyte file to a USB key. This is crazy.
  • Devyzr
    Devyzr over 2 years
Same problem in 2022 on Ubuntu 20.04: copying a 350 MB file to a FAT USB stick takes almost a minute. Format it to NTFS and it's done in a few seconds...