Millions of files in PHP's tmp folder - how to delete them?

5,220

Solution 1

First of all, the errors about not being able to write to

/var/www/clients/client1/web1/tmp/

don't mean that this directory holds all the files, only that it is where PHP was trying to write when the error was logged. But you have located the files, and are about to remove them.

  • stop the web server (if possible) to prevent creation of more files and to stop the flood of error messages
  • clean up
  • restart the web server
  • observe whether it starts again

For the cleanup step, assuming the files to clean are in /var/www/clients/client1/web1/tmp, first become the same effective user as the one creating the session files (probably apache, httpd, or www-data), then:

  • cd /var/www/clients/client1/web1/tmp
  • ls -f | grep '^sess_' | xargs rm -f
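A minimal runnable sketch of those two steps, using a hypothetical throwaway directory (demo_tmp and the tiny file set are illustrative assumptions; the -r flag of xargs, which suppresses running rm on empty input, is a GNU extension):

```shell
#!/bin/sh
# Set up a disposable directory to act on (illustrative only).
mkdir -p demo_tmp
touch demo_tmp/sess_1 demo_tmp/sess_2 demo_tmp/keepme

cd demo_tmp
# ls -f skips sorting, so names are streamed instead of held in memory;
# xargs batches them into argv-sized chunks, sidestepping the kernel's
# argument-length limit that a plain "rm sess_*" would hit.
ls -f | grep '^sess_' | xargs -r rm -f

ls   # only "keepme" remains
```

The key point is that nothing in the pipeline ever holds the full file list at once, which is why it works where globbing fails.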

Solution 2

I'm using this method to delete 2.3 million files - it looks like it will finish in about 10-15 minutes:

http://www.binarysludge.com/2012/01/01/how-to-delete-millions-of-files-linux/

Solution 3

find /tmp -name "sess_*" -exec rm {} \;
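A side note of mine, not from the original answer: -exec rm {} \; forks one rm process per file, which is painfully slow at millions of files. POSIX find also supports a batched + terminator, sketched here against a hypothetical demo_find directory:

```shell
#!/bin/sh
# Throwaway directory for illustration (not from the original answer).
mkdir -p demo_find
touch demo_find/sess_a demo_find/sess_b demo_find/other

# Batched form: "+" lets find pack many names into each rm invocation,
# so it forks rm once per argv-full of files instead of once per file.
find demo_find -name 'sess_*' -type f -exec rm -f {} +

# Built-in alternative on GNU and modern BSD find (no rm processes at all):
#   find demo_find -name 'sess_*' -type f -delete

ls demo_find   # only "other" remains
```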



Author: Ganeshkumar SR

Designer and developer.

Updated on September 17, 2022

Comments

  • Ganeshkumar SR over 1 year

    I've got a tmp-folder with 14 million php session files in my home directory. At least that's what I think it is, it's not like I could ls it or anything.

    I've tried using find with the -exec rm {} \; command, but that didn't work. Neither did ls 'sess_0*' | xargs rm.

    I'm currently running rm -rf tmp but after two hours the folder appears to be the same size.

    How can I empty this folder?

    Does anyone have a clue what caused it in the beginning? I don't remember changing anything critical lately.


    REFERENCE INFO:

    I suddenly encountered an error where PHP sessions could no longer be written to disk:

    [Mon Apr 19 19:58:32 2010] [warn] mod_fcgid: stderr: PHP Warning: Unknown: open(/var/www/clients/client1/web1/tmp/sess_8e12742b62aa68a3f9476ec80222bbfb, O_RDWR) failed: No space left on device (28) in Unknown on line 0

    [Mon Apr 19 19:58:32 2010] [warn] mod_fcgid: stderr: PHP Warning: Unknown: Failed to write session data (files). Please verify that the current setting of session.save_path is correct (/var/www/clients/client1/web1/tmp) in Unknown on line 0

    I ran:

    $ df -h
    Filesystem            Size  Used Avail Use% Mounted on
    /dev/md0              457G  126G  308G  29% /
    tmpfs                 1.8G     0  1.8G   0% /lib/init/rw
    udev                   10M  664K  9.4M   7% /dev
    tmpfs                 1.8G     0  1.8G   0% /dev/shm
    

    But as you can see, the disk isn't full.

    So I had a look in the syslog which says the following 20 times per second:

    kernel: [19570794.361241] EXT3-fs warning (device md0): ext3_dx_add_entry: Directory index full!

    This led me to suspect a full folder, but since my web folder only has 60k files (I counted them), I guessed it was the tmp folder (the local one, for this instance of PHP) that messed things up.

    Some commands I ran:

    $ sudo ls sess_a* | xargs rm -f
    bash: /usr/bin/sudo: Argument list too long

    $ find . -exec rm {} \;
    rm: cannot remove directory '.'
    find: cannot fork: Cannot allocate memory

    I'm running Debian Lenny, php5, ISPConfig, SuEXEC and Fast-CGI.

  • Ganeshkumar SR about 14 years
    As I said, I've already tried that. I tried sess_0* which is supposed to be an even smaller subset than the one you mentioned, and that didn't work. Also a small note, it isn't the global /tmp (in which case a simple reboot would've remounted tmpfs and cleared the folder).
  • solefald about 14 years
    you said ls 'sess_0*' | xargs rm, which makes the shell expand the glob into an oversized argument list and bomb. This find command works flawlessly for me when I have to delete hundreds of thousands of amavis/spamassassin quarantine files...
  • Ganeshkumar SR about 14 years
    You're right. But do remember I did run another find which failed. But perhaps that was because the number of files was too large? I thought -exec would execute as each file was read, not after they'd all been run through once.
  • Scott Pack over 11 years
    Answers that just consist of links to some other page are not generally considered good answers as they cease to be useful if the link ever dies. Please consider expanding your answer to contain enough detail to stand on its own without the external reference. Thanks!
  • Ryan about 4 years
    This answer worked for me. Here is an example based on the above link that deletes all files in the /var/lib/php5 directory whose names contain 'sess_':

    perl -e 'chdir "/var/lib/php5" or die; opendir D, "."; while ($n = readdir D) { if (index($n, "sess_") != -1) { print $n."\n"; unlink($n); } }'
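To tie the thread together (my addition, with illustrative names): the repeated "Argument list too long" failures come from the kernel's ARG_MAX cap on the total size of arguments passed to execve(); a glob like sess_a* is expanded by the shell into one giant argument list before the command even starts, and millions of names cannot fit. Streaming approaches (ls -f | xargs, find, or the Perl readdir loop above) never build that list. A runnable sketch against a hypothetical demo_sess directory:

```shell
#!/bin/sh
# ARG_MAX caps the bytes of argv + environment one execve() may carry;
# a glob over millions of names blows past it before rm ever runs.
echo "ARG_MAX on this system: $(getconf ARG_MAX) bytes"

# Throwaway directory (an illustrative assumption, not from the thread).
mkdir -p demo_sess
touch demo_sess/sess_x demo_sess/sess_y demo_sess/config

# The readdir loop from the comment above, pointed at the demo directory:
# it streams one entry at a time, so memory stays flat at any file count.
perl -e 'chdir "demo_sess" or die;
         opendir D, ".";
         while ($n = readdir D) { unlink($n) if index($n, "sess_") != -1 }'

ls demo_sess   # only "config" remains
```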