Disk usage per user in Linux / Unix


Solution 1

Is this a one-time thing, or is this information you want to be able to extract regularly? If it is the latter, one option is to apply quotas on your filesystem. The system then continuously keeps track of the amount of data used by each user, and the information is merely a query to the quota database away.
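
If quotas are not already enabled, the rough workflow on a typical Linux system might look like the sketch below. It assumes the quota tools are installed and that /home is mounted with the usrquota option; adjust the mount point to your environment.

# one-time setup: build the quota database and switch quotas on
sudo quotacheck -cum /home
sudo quotaon /home

# per-user usage report for the filesystem (-s prints human-readable sizes)
sudo repquota -s /home

# an individual user can check their own usage
quota -s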

Solution 2

Here is a simple and quick solution that I believe meets your requirement.

I assume that all your users have home directories under /home. All you need to do is change to the /home directory and then run du at a depth of 1.

cd /home
sudo du -d 1 -h

Your output will look something like this:

kcyow@linux-server:/home$ sudo du -d 1 -h
7.8M    ./user932
52G     ./user575
20K     ./user329
98G     ./user323
48G     ./user210
148G    ./user44
12M     ./kcyow
362G    ./user28
24G     ./user774
6.2M    ./user143
730G    .
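
If you would rather see the list ordered by size, GNU sort can compare the human-readable suffixes that du -h prints, so (assuming GNU coreutils) a small variation is:

sudo du -d 1 -h | sort -rh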

Solution 3

Another nice solution I found here. Navigate to the directory of interest and run the following (alternatively, replace . with whichever directory interests you, e.g., /home/):

find . -type f -printf "%u  %s\n" \
  | awk '{user[$1]+=$2}; END{for(i in user) print i,user[i]}'
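
That prints totals in raw bytes. If you prefer human-readable sizes (as requested in the comments), one possible variation, assuming GNU findutils and coreutils, pipes the per-user totals through numfmt; using a tab separator also keeps usernames containing spaces intact:

find . -type f -printf "%u\t%s\n" \
  | awk -F'\t' '{user[$1]+=$2} END{for (i in user) printf "%s\t%d\n", i, user[i]}' \
  | numfmt --field=2 --to=iec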

Solution 4

Or, for finding the problem users (and directories too):

du -xk | sort -n | tail -25

and for Solaris:

du -dk | sort -n | tail -25   

This gives you a list of the 25 largest directories. Not quite what you asked for, but I use it all the time.
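
On Linux with GNU coreutils you can get the same top-25 list with human-readable sizes, since sort -h understands the unit suffixes that du -h produces; a possible variation:

du -xh | sort -h | tail -25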

Solution 5

What we do in many places is use the quota system, but set absurdly high quotas. This way you get the benefit of fast reporting. At one site, each user has 1 TB of "quota" space.

We periodically bump the quota higher as the serviceable disk grows; initially it was 30 GB per user, which was absurdly high at the time.
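
As a rough sketch of how such a high, effectively non-enforcing limit might be set (the username, the 1 TiB figure, and the /home filesystem below are only placeholders):

# soft limit 0 = unlimited; hard limit of roughly 1 TiB, expressed in 1 KiB blocks
sudo setquota -u alice 0 1073741824 0 0 /home

Usage against those limits can then be reported quickly with repquota, as in Solution 1.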


Comments

  • Escualo
    Escualo over 1 year

    I need to find out how much disk space is being occupied by each user on the network. I am aware of the df and du commands: I could list the entire filesystem and AWK the output, but I wonder if there is a more standard command.

    The output I am looking for is:

    usr1  xMb
    usr2  yMb
    [...]
    Total zMb
    

    Any ideas?

    Thanks!

    PS. Red Hat Linux EE

  • ThorstenS
    ThorstenS over 14 years
    +1 quota is the solution!
  • Escualo
    Escualo over 14 years
    A one-time thing; possibly a solution which can be stored in a small script so users can compute their usage if they want to. We cannot limit the amount of data because the type of work we do does not accommodate hard limits.
  • Escualo
    Escualo over 14 years
    @ThorstenS: We do technical computing and we need to generate tons of information which may or may not be removed after a run is made. I don't think quotas help in our situation.
  • elmo
    elmo over 14 years
    @Arrieta: You don't have to limit their usage. Simply give each user a ridiculously high quota. Also, every user can query the quota database themselves and see how much data they are currently storing.
  • Daniel
    Daniel over 14 years
    You don't even need to set the quota to a big number; if you leave it unset (i.e., 0), it will not be enforced, but the usage will still be recorded.
  • Hennes
    Hennes almost 10 years
    Inefficient. You do not need to run find several times if you log the information at the same time. Save that information during your first run, either in a file or in an associative array.
  • Hennes
    Hennes almost 10 years
    +1. Maybe add a -type f if you are really only looking for files?
  • Hennes
    Hennes almost 10 years
    +1 for using quota as a permanent solution. find and awk are the answer if you only need to ask it once or if you have to answer this during a technical job interview. (I was asked; quota was disabled on that system.)
  • TheDudeAbides
    TheDudeAbides over 4 years
    Good answer. Use -printf "%u\t%s\n" and awk -v OFS="\t" if you think you might ever have a username with a space in it.
  • djdomi
    djdomi almost 3 years
    Simple, basic, but good solution. By the way, I suggest user28 reduce his video download consumption :-)
  • Kin-Choong Yow
    Kin-Choong Yow almost 3 years
    Ha ha! You bet I will!
  • Admin
    Admin almost 2 years
    Nice, but could you print the output in GB or another human-readable format?