How to use CUDA with NVIDIA Prime


Solution 1

I believe I've found at least a cursory solution to this, as described in the update to my original post. There are really two solutions I found, though I'm sure there are others.

1 - With Prime in Intel mode, re-enable the NVIDIA card via bbswitch, then run modprobe nvidia to load the module and create the device nodes.

2 - Use Bumblebee optirun to launch a bash session from where you can do all your CUDA stuff.

Both of these solutions allow you to use the onboard graphics for your display while using the NVIDIA card for compute loads. The optirun solution seems more versatile, but I prefer the first one for its minimalism.
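As a rough sketch, option 1 might look like the following on a system where bbswitch exposes its usual proc interface (the path and module name here are common defaults, not something verified on every setup):

```shell
# Power the discrete NVIDIA card back on via bbswitch (requires root).
echo ON | sudo tee /proc/acpi/bbswitch

# Load the NVIDIA kernel module, which creates the /dev/nvidia* device nodes.
sudo modprobe nvidia

# Check that the driver now sees the card.
nvidia-smi
```

For option 2, running `optirun bash` opens a shell from which CUDA programs can be launched normally.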

I'm hoping someone with more understanding will improve on this answer.

Solution 2

In my case I found that the NVIDIA card was not actually turned off, and the only thing I needed to do to run CUDA code was:

export LD_LIBRARY_PATH=/usr/lib/nvidia-352

in the shell where I want to run it (I am assuming that globally changing the alternatives setting would break compiz, etc.)
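Note that the export only affects the current shell session. If you'd rather not carry the variable around, you can also scope it to a single command; the echo below just demonstrates that the variable is set for that one invocation only (the library path is the one from this answer):

```shell
# Scope LD_LIBRARY_PATH to a single command rather than the whole session.
LD_LIBRARY_PATH=/usr/lib/nvidia-352 sh -c 'echo "$LD_LIBRARY_PATH"'
# prints: /usr/lib/nvidia-352

# In practice you would run your CUDA binary in place of the sh -c demo.
```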

To get to this point (on a Dell Optiplex 7010, with Ubuntu 14.04, CUDA 7.5, and a GTX 980) I believe the steps were:

  1. Use the PRIME Profiles tab to select Intel
  2. Reboot, and select Intel as the default in the BIOS
  3. Shut down the computer
  4. Plug the monitors into the onboard video :)

Everything seems to be working fine so far (nvidia-smi sees the card, cuda samples run, theano uses the card, etc...)

Solution 3

I use the NVIDIA card only for CUDA work, and I found the following approach.

By default I use the Intel card, which is confirmed by the command lspci | grep -E "VGA|3D":

00:02.0 VGA compatible controller: Intel Corporation Skylake Integrated Graphics (rev 06)
01:00.0 3D controller: NVIDIA Corporation GM107M [GeForce GTX 960M] (rev ff)

In the row for the NVIDIA card, (rev ff) means the card is turned off.

To turn the card on and use it for CUDA computations, I use the following two commands:

sudo prime-select nvidia
sudo prime-switch

After that, lspci | grep -E "VGA|3D" reports:

00:02.0 VGA compatible controller: Intel Corporation Skylake Integrated Graphics (rev 06)
01:00.0 3D controller: NVIDIA Corporation GM107M [GeForce GTX 960M] (rev a2)

Notice that the NVIDIA row now shows (rev a2) instead of (rev ff). The card is now ready for computation.
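If you want to check the power state from a script, grepping the revision field works. This is just a sketch, and the PCI address 01:00.0 comes from the output above and will differ between machines:

```shell
# Report whether the discrete card at PCI address 01:00.0 is powered down.
if lspci -s 01:00.0 | grep -q '(rev ff)'; then
    echo "NVIDIA card is powered off"
else
    echo "NVIDIA card is active"
fi
```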

After the computations, I reverse the process:

sudo prime-select intel
sudo prime-switch

And lspci | grep -E "VGA|3D" reports:

00:02.0 VGA compatible controller: Intel Corporation Skylake Integrated Graphics (rev 06)
01:00.0 3D controller: NVIDIA Corporation GM107M [GeForce GTX 960M] (rev ff)
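The whole cycle could be wrapped in a small helper script (a hypothetical sketch, assuming prime-select and prime-switch behave as described in this answer):

```shell
#!/bin/sh
# cuda-run.sh (hypothetical name): switch to the NVIDIA card, run one
# CUDA workload, then switch back to the Intel card.
set -e

sudo prime-select nvidia
sudo prime-switch

# Run the command passed as arguments, e.g. ./deviceQuery
"$@"

sudo prime-select intel
sudo prime-switch

# The NVIDIA row should show (rev ff) again once the card powers down.
lspci | grep -E "VGA|3D"
```

Invoked as, say, sh cuda-run.sh ./my_cuda_app.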

Author: orodbhen

Updated on September 18, 2022

Comments

  • orodbhen
    orodbhen over 1 year

I've found half a dozen posts on this all over the web, but none of them really answers the question.

    I want to set up my nvidia GPU to only do computations, not drive the display. But when I switch to using the Intel GPU in the nvidia-prime configuration, I can no longer load the nvidia module.

    modprobe: ERROR: could not insert 'nvidia_352': No such device
    

    Without the module, CUDA doesn't work, obviously.

So what exactly is nvidia-prime doing that makes it impossible to load the module? It's not blacklisted. There's no xorg.conf file, so how does the system know to use the Intel GPU instead of the discrete one?

    I'm on a Dell 5510 Precision with Ubuntu 14.04 factory installed, and my GPU is Quadro M1000M.

    Some suggest using bumblebee, but that shouldn't be necessary for pure compute loads.

    Also, apparently bumblebee is able to load the module. So what exactly is it doing?

Update: Why does it always seem that I find the answer only after I finally post a question, after hours of trying to figure it out? This is actually only a partial answer, but I'm on to something.

    So far I've determined that prime does at least two things:

    • Switches the GPU off using bbswitch.
    • Changes the alternatives for /etc/ld.so.conf.d/x86_64-linux-gnu_GL.conf.

    By using bbswitch to turn the GPU back on, I'm now able to load the NVIDIA module.

    But the question still remains: What's the best way to configure the system to use the NVIDIA card only for computations?

    Should I set nvidia-prime to use the Intel GPU, and try to manually unravel what that did to get CUDA working?

    How do I ensure that the system still uses the Intel GPU for the display?

    How would I go about simply disabling NVIDIA prime, and configuring it all manually?

    Or should I just give in and use Bumblebee and optirun? What are the disadvantages of this, if any?

    Any recommendations?

  • orodbhen
    orodbhen about 8 years
    Yeah, unfortunately there doesn't seem to be a consistent solution, which is why I really wanted to understand what was going on under the hood.
  • Abonec
    Abonec about 7 years
    Can you explain the first option more clearly? How do you turn the card on and off using bbswitch?
  • orodbhen
    orodbhen about 7 years
    So, basically, you only need to log back in after switching with prime if you want to switch which GPU is running the display? I hadn't thought of that, but it makes sense. Of course, you need to make sure you switch back before logging out or restarting.
  • orodbhen
    orodbhen about 7 years
    I actually figured out how to get bumblebee working well, and I'll update my answer when I get time. I basically followed this. It's frustrating that this stuff is so poorly documented, because it's really quite simple.
  • nhorning
    nhorning over 6 years
    Also... I have bbswitch blacklisted as suggested above, because I would get a continual error with it on boot. I was able to boot into Intel mode using nvidia-384 once or twice, but I had trouble doing the mining with that, and thought it could be because 1070 Ti support was added in nvidia-387.
  • Epimetheus
    Epimetheus about 6 years
    I use this approach. I log in with the intel profile active so Xorg and firefox are not using the GPU. Then I switch to the nvidia profile, and all my CUDA work goes on the GPU and doesn't have to compete with firefox and Xorg for memory :) !