How to make Unity 3D work with Bumblebee using the Intel chipset


The problem was with libGL.so.1 - apparently, the Nvidia installer also removed the Intel one and replaced it with its own.

So I retrieved it from libgl1-mesa-glx and changed the symlink in /usr/lib to point to it.
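Concretely, the fix looked something like this. (The Mesa path below is what 64-bit Ubuntu 12.04 typically uses; check where libgl1-mesa-glx put libGL.so.1 on your own system first.)

```shell
# Repoint /usr/lib/libGL.so.1 at the Mesa (Intel) driver from
# libgl1-mesa-glx instead of the copy the Nvidia installer left behind.
sudo ln -sf /usr/lib/x86_64-linux-gnu/mesa/libGL.so.1 /usr/lib/libGL.so.1

# Verify that the symlink now resolves to the Mesa library:
readlink -f /usr/lib/libGL.so.1
```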

I'm not going to go into the details of how to set up Bumblebee, but there are some key steps that aren't really documented:

  1. The Nvidia driver installer is likely to destroy your existing libglx.so (in /usr/lib/xorg/modules/extensions) and libGL.so (in /usr/lib). Back those up before installing the driver. If you have already lost them, you can get them back by reinstalling xserver-xorg-core and libgl1-mesa-glx, but when I first tried that, it left my laptop in a bad state (black screen after login; I had to go into recovery), so I would recommend extracting the files manually via dpkg-deb.
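If you go the dpkg-deb route, a sketch of the extraction looks like this. (The exact filename under /var/cache/apt/archives depends on the package version you have installed; if the .deb is no longer cached, `apt-get download libgl1-mesa-glx` should fetch a fresh copy into the current directory.)

```shell
# Extract a cached package into a scratch directory instead of
# reinstalling it system-wide.
mkdir -p /tmp/mesa-extract
dpkg-deb -x /var/cache/apt/archives/libgl1-mesa-glx_*.deb /tmp/mesa-extract

# The pristine libGL files are now available for copying back manually:
ls -l /tmp/mesa-extract/usr/lib/x86_64-linux-gnu/mesa/
```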

  2. After installing the Nvidia driver, you'll notice that Nvidia replaced the two files listed above with symlinks to the Nvidia libraries. In my case, for example, libglx.so is a symlink to libglx.so.304.22.

  3. Since Bumblebee needs both drivers, and since it can only differentiate between them by directory, let's move the drivers for each graphics adapter into a separate directory. The Nvidia installer I used put some drivers into /usr/lib/nvidia-current, so let's use that. Keep in mind that we need to separate both the libraries and the Xorg modules. Here is where I put the relevant files:

In /usr/lib:

libGL.so.304.22   -- Nvidia's driver (unchanged from where the installer put it)
libGL.so          -- symlink to libGL.so.1
libGL.so.1        -- symlink to /usr/lib/x86_64-linux-gnu/mesa/libGL.so.1,
                     i.e. the libgl1-mesa-glx driver

In /usr/lib/x86_64-linux-gnu/mesa:

libGL.so.1   -- symlink to libGL.so.1.2
libGL.so.1.2 -- The normal driver from libgl1-mesa-glx 

In /usr/lib/nvidia-current:

libGL.so   -- symlink to libGL.so.1
libGL.so.1 -- symlink to /usr/lib/libGL.so.304.22 (i.e. Nvidia's driver)

In /usr/lib/nvidia-current/xorg:

libglx.so -- symlink to /usr/lib/xorg/modules/extensions/libglx.so.304.22

In /usr/lib/xorg/modules/extensions:

libglx.so.1       -- symlink to libglx.so.xserver
libglx.so.304.22  -- Nvidia's driver (unchanged from where the installer put it)
libglx.so.xserver -- I renamed the original libglx.so to that and put it here
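The layout above can be reproduced with commands along these lines. This is only a sketch mirroring my listing: the paths assume the 304.22 driver and a 64-bit install, and you should back everything up before touching any of these files.

```shell
# Split the drivers per adapter: Mesa stays the default in /usr/lib,
# Nvidia's copies live under /usr/lib/nvidia-current.
sudo mkdir -p /usr/lib/nvidia-current/xorg

# /usr/lib: the default libGL points at the Mesa (Intel) driver.
sudo ln -sf /usr/lib/x86_64-linux-gnu/mesa/libGL.so.1 /usr/lib/libGL.so.1
sudo ln -sf libGL.so.1 /usr/lib/libGL.so

# /usr/lib/nvidia-current: libGL points at Nvidia's driver instead.
sudo ln -sf /usr/lib/libGL.so.304.22 /usr/lib/nvidia-current/libGL.so.1
sudo ln -sf libGL.so.1 /usr/lib/nvidia-current/libGL.so

# Xorg modules: keep the stock libglx.so (renamed) for the Intel X server,
# and expose Nvidia's libglx.so only in the nvidia-specific directory.
cd /usr/lib/xorg/modules/extensions
sudo mv libglx.so libglx.so.xserver
sudo ln -sf libglx.so.xserver libglx.so.1
sudo ln -sf /usr/lib/xorg/modules/extensions/libglx.so.304.22 \
            /usr/lib/nvidia-current/xorg/libglx.so
```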

Finally, we need to modify /etc/bumblebee/bumblebee.conf to tell the system where to find the Nvidia drivers. I'm using the nvidia driver (as opposed to nouveau), so in the nvidia section, I'm using this:

KernelDriver=nvidia
Module=nvidia
PMMethod=auto
# colon-separated path to the nvidia libraries
LibraryPath=/usr/lib/nvidia-current:/usr/lib32/nvidia-current
# comma-separated path of the directory containing nvidia_drv.so and the
# default Xorg modules path
XorgModulePath=/usr/lib/nvidia-current/xorg,/usr/lib/xorg/modules

Note how LibraryPath and XorgModulePath point to the Nvidia drivers, so when Bumblebee looks for them, it will find them. When Unity looks for them, it will look in /usr/lib and /usr/lib/xorg/modules/extensions, and we made sure that those symlink to the non-Nvidia ones.
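With the configuration in place, restarting the Bumblebee daemon and re-running the tests should confirm that both paths work. (I'm assuming the daemon is registered as the `bumblebeed` service on 12.04; adjust if your install named it differently.)

```shell
# Restart Bumblebee so it picks up the new library paths.
sudo service bumblebeed restart

# Intel path: Unity's own support test should now pass.
/usr/lib/nux/unity_support_test -p

# Nvidia path: render on the discrete GPU via optirun.
optirun glxspheres
```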

This is possibly a bit clumsy in places, but what can I say? It works perfectly:

Unity 3D works with full desktop effects; they're lightning fast, yet still running on the energy-saving (and slower) Intel chipset. Normal 3D acceleration is available to apps like VirtualBox. optirun works, and I have the choice of running "glxspheres" (at ~60 fps) or "optirun glxspheres" (at ~120 fps).



Author: EboMike

Updated on September 18, 2022

Comments

  • EboMike over 1 year

    I have a Sony VAIO S laptop with the dreaded Optimus and finally managed to get Bumblebee to work fully on Ubuntu 12.04, so that I can utilize both the hardware acceleration of the Intel chipset and the Nvidia one via optirun and/or bumble-app-settings.

    However, the desktop effects don't work. But they should, I vaguely remember that they worked for a while before I had Bumblebee installed.

    This is what I get with the support test:

    :~$ /usr/lib/nux/unity_support_test -p
    Xlib:  extension "NV-GLX" missing on display ":0".
    OpenGL vendor string:   Tungsten Graphics, Inc
    OpenGL renderer string: Mesa DRI Intel(R) Ivybridge Mobile 
    OpenGL version string:  1.4 (2.1 Mesa 8.0.2)
    
    Not software rendered:    yes
    Not blacklisted:          yes
    GLX fbconfig:             yes
    GLX texture from pixmap:  yes
    GL npot or rect textures: yes
    GL vertex program:        yes
    GL fragment program:      yes
    GL vertex buffer object:  no
    GL framebuffer object:    yes
    GL version is 1.4+:       yes
    
    Unity 3D supported:       no
    

    First of all, I kind of doubt that the chipset doesn't support VBOs (essentially a standard feature in GL).

    Neither Xorg.0.log nor Xorg.8.log show any particular errors.

    As for the Nvidia drivers: In order to get them to work, I had to install the 304.22 drivers (older ones wouldn't work). They clobbered libglx.so, so I reinstated the xserver-xorg-core libglx.so in its original place, moved Nvidia's libglx.so to an nvidia-specific folder, and specified that folder in bumblebee.conf. That seems to work and shouldn't cause the problem I see here.

    For fun, I tried to use the Nvidia chipset for Unity, but that didn't fly either:

    ~$ optirun /usr/lib/nux/unity_support_test -p
    OpenGL vendor string:   NVIDIA Corporation
    OpenGL renderer string: GeForce GT 640M LE/PCIe/SSE2
    OpenGL version string:  4.2.0 NVIDIA 304.22
    
    Not software rendered:    yes
    Not blacklisted:          yes
    GLX fbconfig:             yes
    GLX texture from pixmap:  no
    GL npot or rect textures: yes
    GL vertex program:        yes
    GL fragment program:      yes
    GL vertex buffer object:  yes
    GL framebuffer object:    yes
    GL version is 1.4+:       yes
    
    Unity 3D supported:       no