How to log GPU load?


Solution 1

It's all there. You just didn't read carefully :) Use the following Python script, which takes an optional delay and repeat count like iostat and vmstat:

https://gist.github.com/matpalm/9c0c7c6a6f3681a0d39d

You can also use nvidia-settings:

nvidia-settings -q GPUUtilization -q useddedicatedgpumemory

...and wrap it in a simple bash loop, set up a cron job, or just use watch:

watch -n0.1 "nvidia-settings -q GPUUtilization -q useddedicatedgpumemory"
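
For logging rather than live monitoring, that simple bash loop could look something like this (a minimal sketch; the gpu_load.log file name and the one-second interval are arbitrary choices):

while true; do
    { date '+%F %T'; nvidia-settings -q GPUUtilization -q useddedicatedgpumemory; } >> gpu_load.log
    sleep 1
done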

Solution 2

Use

nvidia-smi dmon -i 0 -s mu -d 5 -o TD

Then you can easily dump this into a log file. This is the GPU usage for device 0, sampled at an interval of 5 seconds:

 #Date       Time        gpu    fb  bar1    sm   mem   enc   dec   pwr  temp
#YYYYMMDD   HH:MM:SS    Idx    MB    MB     %     %     %     %     W     C
 20170212   14:23:15      0   144     4     0     0     0     0    62    36
 20170212   14:23:20      0   144     4     0     0     0     0    62    36
 20170212   14:23:25      0   144     4     0     0     0     0    62    36

See the man page for details on the flags.
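
To keep a log, redirect the dmon output to a file, for example (a sketch; nohup keeps the sampler running after you log out, and gpu_dmon.log is just an example name):

nohup nvidia-smi dmon -i 0 -s mu -d 5 -o TD >> gpu_dmon.log &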

Solution 3

You can use (tested with nvidia-smi 352.63):

while true; 
do nvidia-smi --query-gpu=utilization.gpu --format=csv >> gpu_utillization.log; sleep 1; 
done 

The output will be (if 3 GPUs are attached to the machine):

utilization.gpu [%]
96 %
97 %
92 %
utilization.gpu [%]
97 %
98 %
93 %
utilization.gpu [%]
87 %
96 %
89 %
utilization.gpu [%]
93 %
91 %
93 %
utilization.gpu [%]
95 %
95 %
93 %

Theoretically, you could simply use nvidia-smi --query-gpu=utilization.gpu --format=csv --loop=1 --filename=gpu_utillization.csv, but it doesn't seem to work for me (the flag -f or --filename is supposed to log the output to a specified file).
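
If --filename misbehaves, the --loop flag combined with an ordinary redirect, or with tee to watch and log at the same time, is a possible workaround (a sketch, assuming your nvidia-smi supports --loop):

nvidia-smi --query-gpu=utilization.gpu --format=csv --loop=1 | tee -a gpu_utillization.log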

To log more information:

while true; 
do nvidia-smi --query-gpu=utilization.gpu,utilization.memory,memory.total,memory.free,memory.used --format=csv >> gpu_utillization.log; sleep 1; 
done

outputs:

utilization.gpu [%], utilization.memory [%], memory.total [MiB], memory.free [MiB], memory.used [MiB]
98 %, 15 %, 12287 MiB, 10840 MiB, 1447 MiB
98 %, 16 %, 12287 MiB, 10872 MiB, 1415 MiB
92 %, 5 %, 12287 MiB, 11919 MiB, 368 MiB
utilization.gpu [%], utilization.memory [%], memory.total [MiB], memory.free [MiB], memory.used [MiB]
90 %, 2 %, 12287 MiB, 11502 MiB, 785 MiB
92 %, 4 %, 12287 MiB, 11180 MiB, 1107 MiB
92 %, 6 %, 12287 MiB, 11919 MiB, 368 MiB
utilization.gpu [%], utilization.memory [%], memory.total [MiB], memory.free [MiB], memory.used [MiB]
97 %, 15 %, 12287 MiB, 11705 MiB, 582 MiB
94 %, 7 %, 12287 MiB, 11540 MiB, 747 MiB
93 %, 5 %, 12287 MiB, 11920 MiB, 367 MiB
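
If the repeated header lines make the log awkward to parse, nvidia-smi can also emit a timestamp per row and suppress the header (a sketch, assuming your driver exposes the timestamp query field; see nvidia-smi --help-query-gpu):

while true; 
do nvidia-smi --query-gpu=timestamp,utilization.gpu,utilization.memory,memory.total,memory.free,memory.used --format=csv,noheader >> gpu_utillization.log; sleep 1; 
done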

Comments

  • Franck Dernoncourt
    Franck Dernoncourt over 1 year

    I wonder how to log GPU load. I use Nvidia graphics cards with CUDA.

    Not a duplicate: I want to log.

    • Admin
      Admin over 8 years
      @monsune Thanks, it looks like the other question is about real-time monitoring while mine is about logging. I'll try the Python code; that might do the trick.
  • Franck Dernoncourt
    Franck Dernoncourt over 8 years
    Thanks, I ended up using: while true; do nvidia-smi --query-gpu=utilization.gpu --format=csv >> gpu_utillization.log; sleep 1; done. I wish the format were truly CSV :) and I haven't found a way to obtain per-user GPU usage on a Linux machine (CUDA).
  • Franck Dernoncourt
    Franck Dernoncourt over 8 years
    As a side note, nvidia-settings -q GPUUtilization is giving me this error: ERROR: The control display is undefined; please run nvidia-settings --help for usage information. Maybe nvidia-settings changed. (A note on the control display follows the comments below.)
  • monsune
    monsune over 8 years
    Right. That's the simple bash loop I mentioned above ;) There are so many ways to skin this cat, and it's a nice task, too. Thanks for that useful per-user GPU usage link, BTW. Not sure about that error yet, but it's most likely solvable. I'm on 352.63 and it works fine. Do you run that as root? If so, try as a regular user and it should work.
  • Franck Dernoncourt
    Franck Dernoncourt over 8 years
    Out of curiosity, which nvidia-smi version do you use? (I use 352.63)
  • aquagremlin
    aquagremlin about 8 years
    With 352.79 I just get 'Not Supported': utilization.gpu [%] [Not Supported] [Not Supported], repeated for every sample.
  • Franck Dernoncourt
    Franck Dernoncourt about 8 years
    @aquagremlin this may be due to the GPU model. askubuntu.com/q/701222/44876
  • Stephen Rauch
    Stephen Rauch over 7 years
    When giving an answer, it is preferable to give some explanation as to WHY your answer is the right one.
  • curio17
    curio17 over 7 years
    It's okay now, I hope.
  • Sahil Chaudhary
    Sahil Chaudhary about 6 years
    Wow, that is it! It works in headless mode as well (when the X driver is not used).
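
Regarding the control display error mentioned in the comments: nvidia-settings talks to the X server, so it needs a display to query. A possible workaround (a sketch, assuming an X server is running on display :0 and your user owns that session):

DISPLAY=:0 nvidia-settings -q GPUUtilization -q useddedicatedgpumemory

or, equivalently, with the --ctrl-display option:

nvidia-settings --ctrl-display=:0 -q GPUUtilization -q useddedicatedgpumemory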