How to configure the iGPU for the X server and the NVIDIA GPU for CUDA work

Solution 1

I first installed the NVIDIA drivers and CUDA packages following this guide. However, after a reboot I ended up with /usr/lib/xorg/Xorg showing up in the output of nvidia-smi. This wasn't good, since I needed all of the NVIDIA GPU's RAM available for my work.

After some research I found a solution to my problem:

I created /etc/X11/xorg.conf with the following content:

Section "Device"
    Identifier      "intel"
    Driver          "intel"
    BusId           "PCI:0:2:0"
EndSection

Section "Screen"
    Identifier      "intel"
    Device          "intel"
EndSection

(If you try to do the same, make sure to check where your iGPU actually is. Mine was at 00:02.0, which translates to PCI:0:2:0. Note that lspci reports bus numbers in hexadecimal, while the Xorg BusId expects decimal values.)

% lspci  | grep VGA
00:02.0 VGA compatible controller: Intel Corporation Device 3e92
01:00.0 VGA compatible controller: NVIDIA Corporation GP104 (rev a1)

After rebooting, Xorg and other programs no longer appeared in the output of nvidia-smi, and I was able to use PyTorch with CUDA 10.0.
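If you want to double-check the result, a quick sanity check might look like the following (a minimal sketch; glxinfo comes from the mesa-utils package, and the exact nvidia-smi output varies between driver versions):

% nvidia-smi                        # the process table should report "No running processes found"
% glxinfo | grep "OpenGL renderer"  # should name the Intel iGPU, not the NVIDIA card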

Note that I still have all the NVIDIA drivers installed, but they don't interfere.


Update: for Ubuntu 20.04 some extra changes are needed for this to work. You will find the full details here.

Solution 2

Let me share the recipe that worked for me on a Razer Blade 15 laptop with Arch Linux and the GNOME desktop environment.

Initially I started GNOME in a Wayland session, which at that time was incompatible with the NVIDIA driver, so naturally the integrated graphics adapter handled the display and the NVIDIA GPU was left for deep learning. But after a recent update the GDM session started to fall back to Xorg with the NVIDIA GPU as the primary GPU. The problem was that:

  • it reduced available GPU RAM
  • it bogged down the whole system during neural network training
  • it increased power consumption (= less battery life)

I ran nvidia-smi after startup. I expected to see No running processes found, but instead I saw a list of Xorg processes using my NVIDIA GPU. That meant GNOME Display Manager was running an Xorg session with the NVIDIA GPU as the primary GPU.

I examined /var/log/Xorg.0.log:

(II) xfree86: Adding drm device (/dev/dri/card1)
(II) systemd-logind: got fd for /dev/dri/card1 226:1 fd 11 paused 0
(II) xfree86: Adding drm device (/dev/dri/card0)
(II) systemd-logind: got fd for /dev/dri/card0 226:0 fd 12 paused 0
(**) OutputClass "nvidia" ModulePath extended to "/usr/lib/nvidia/xorg,/usr/lib/xorg/modules,/usr/lib/xorg/modules"
(**) OutputClass "nvidia" setting /dev/dri/card1 as PrimaryGPU

(**) means that the setting was read from a config file! I found out that the config file was /usr/share/X11/xorg.conf.d/10-nvidia-drm-outputclass.conf. I changed it so that the Intel integrated graphics adapter becomes the primary GPU:

Section "OutputClass"
    Identifier "intel"
    MatchDriver "i915"
    Driver "modesetting"
    Option "PrimaryGPU" "yes"                   # <<<<<< add this string
EndSection

Section "OutputClass"
    Identifier "nvidia"
    MatchDriver "nvidia-drm"
    Driver "nvidia"
    Option "AllowEmptyInitialConfiguration"
#   Option "PrimaryGPU" "yes"                   # <<<<<< comment out this line
    ModulePath "/usr/lib/nvidia/xorg"
    ModulePath "/usr/lib/xorg/modules"
EndSection
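
After saving the file, the display manager has to be restarted for the change to take effect. A minimal sketch of how to apply and verify it (assuming a systemd-based setup with GDM; a plain reboot works just as well):

sudo systemctl restart gdm   # ends the current session, so save your work first
nvidia-smi                   # after logging back in, the process list should be empty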

Solution 3

As I don't have the reputation to comment, I'll share here my results related to Maksym Ganenko's answer: I tried the solution on my Ubuntu 18.04 system, where I run gdm3 with KDE Plasma or Ubuntu (GNOME). The file mentioned, /usr/share/X11/xorg.conf.d/10-nvidia-drm-outputclass.conf, is called /usr/share/X11/xorg.conf.d/11-nvidia-prime.conf on my system, probably because I had nvidia-prime installed for some time. The problem with editing this file is that on my installation /usr/bin/gpu-manager regenerates it whenever a new X session starts, so all edits are lost. As described in "avoid using nvidia card for Xorg with plasma", and following the advice given in "gpu-manager overwrites xorg.conf", the solution is to protect the generated file against changes by means of

chattr +i /usr/share/X11/xorg.conf.d/11-nvidia-prime.conf

Perhaps a chmod 444 would do the same thing, but I simply used the solution proposed in "gpu-manager overwrites xorg.conf".
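
For reference, the immutable flag can be inspected and removed again later (for example before a driver upgrade); a short sketch using the standard e2fsprogs tools:

lsattr /usr/share/X11/xorg.conf.d/11-nvidia-prime.conf      # an 'i' in the flags means the file is immutable
chattr -i /usr/share/X11/xorg.conf.d/11-nvidia-prime.conf   # removes the flag so the file can be edited again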

Solution 4

I would like to add another way in which I am currently preventing the NVIDIA card from handling my display: I simply boot into GNOME by selecting Wayland instead of Xorg at the login screen. Since NVIDIA does not support Wayland, nvidia-smi shows no running processes after logging in.

However, I can still use the NVIDIA GPU for things like TensorFlow.
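
A quick way to confirm that the session actually came up under Wayland rather than falling back to Xorg (see the comments below) is to check the session type:

% echo $XDG_SESSION_TYPE
wayland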


Updated on September 18, 2022

Comments

  • stason
    stason almost 2 years

    I have an Intel onboard GPU and NVIDIA GPU. I am running Ubuntu 18.04.

    How do I configure a dual GPU setup so that Intel onboard iGPU will drive the monitor, leaving NVIDIA GPU exclusively for Machine Learning CUDA work?

    • mook765
      mook765 over 5 years
      You should split this up into a question and an answer and mark your own answer as accepted.
    • stason
      stason over 5 years
      Thank you, @mook765. I did so and also updated the info for cuda-10.0.
  • Maksym Ganenko
    Maksym Ganenko about 5 years
    It worked for me too by default, but since the last update GNOME Display Manager falls back to Xorg and so the NVIDIA GPU started being used as the primary GPU (Arch Linux on Razer Blade 15).
  • RedEyed
    RedEyed about 5 years
    Confirmed: it works on Ubuntu 19.04 with nvidia-driver-418 and an NVIDIA Quadro GV100.
  • Richard_wth
    Richard_wth about 5 years
    Bro, you saved my day... I spent like ten hours trying to figure this out on Ubuntu 19.04. I uninstalled the NVIDIA drivers and could never get them back properly, so I re-installed Ubuntu 19.04 and used your method. Works like magic!
  • Bastiaan Quast
    Bastiaan Quast about 5 years
    This was very helpful for me. I was doing basically the same things (Antergos Arch, not vanilla). I have no processes now with nvidia-smi.
  • Bastiaan Quast
    Bastiaan Quast about 5 years
    However, echo $XDG_SESSION_TYPE still returns x11 instead of wayland. Ideas?
  • Christoph Henkelmann
    Christoph Henkelmann over 4 years
    This works fine on my XNG NEO 15 using Linux Mint 19.3, thank you. However, when using this method external monitors are not recognized - do you have any additional info on that?
  • stason
    stason over 4 years
    @ChristophHenkelmann, I shared all the bits I used. I know other solutions are available, for some extra ideas please see this thread.
  • Pandian Le
    Pandian Le over 3 years
    @ChristophHenkelmann did you manage to solve the external monitor not being recognized?
  • Christoph Henkelmann
    Christoph Henkelmann over 3 years
    @ThejKiran Regrettably, I think that in my case it is not possible to use an external monitor while excluding the NVIDIA GPU from rendering output. I do not have the link anymore, but I found info stating that many laptops with strong GPUs are optimized for gaming. In that case the NVIDIA card needs to be able to control things like refresh rates, which means it is physically placed after the Intel GPU, so switching it off does not allow the external output to be used. I am no expert in these matters, so I do not know if that is actually true, but it sounds reasonable...