X.Org/nVidia Optimus
Overview
This feature is advertised to boost both performance and battery life in laptops, but as of version 2.6.39 it has no direct support in the Linux kernel. Attempts to configure a single X.Org server with both cards and switch between them have failed: setting the nVidia card as the primary device produces a clean server startup but a blank screen, while the Intel GMA works fine out of the box. It is, however, possible to set up two X.Org servers and use both the Intel GMA and the nVidia card at the same time.
Basic Installation
First, drivers for both cards need to be installed.
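On Gentoo this typically means emerging the Intel DDX driver and the proprietary nVidia driver (the exact package atoms below are the usual ones, adjust as needed):

```shell
# Intel driver for the primary server, nVidia binary driver for the secondary one
emerge x11-drivers/xf86-video-intel x11-drivers/nvidia-drivers
```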
To have hardware acceleration enabled, the Intel driver should be set up as the primary OpenGL interface.
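On Gentoo this is done with eselect (assuming the standard xorg-x11 OpenGL implementation is installed):

```shell
# Make the Mesa/Intel libraries the system-wide OpenGL implementation;
# the nVidia libraries will be pointed to explicitly for the second server
eselect opengl set xorg-x11
```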
Server Configuration
The primary server is responsible for rendering to the Monitor:
Code: /etc/X11/xorg.conf

Section "Module"
    Disable "dri"
EndSection

Section "ServerFlags"
    Option "AllowEmptyInput" "no"
EndSection

Section "Monitor"
    Identifier  "Monitor0"
    VendorName  "Unknown"
    ModelName   "Unknown"
    HorizSync   28.0 - 73.0
    VertRefresh 43.0 - 72.0
    Option      "DPMS"
EndSection

Section "Device"
    Identifier "Device1"
    Driver     "intel"
    VendorName "onboard"
    BusID      "PCI:0:2:0"
    #Screen 1
EndSection

Section "Screen"
    Identifier   "Screen0"
    Device       "Device1"
    Monitor      "Monitor0"
    DefaultDepth 24
    SubSection "Display"
        Depth 24
    EndSubSection
EndSection
The second server is responsible for the nVidia card, and needs a separate config file.
Code: /etc/X11/xorg.nvidia.conf

Section "DRI"
    Mode 0666
EndSection

Section "ServerLayout"
    Identifier "Layout0"
    Screen     "Screen1"
    Option     "AutoAddDevices" "false"
EndSection

Section "Module"
    Load "dbe"
    Load "extmod"
    Load "glx"
    Load "record"
    Load "freetype"
    Load "type1"
EndSection

Section "Files"
EndSection

Section "Device"
    Identifier "Device1"
    Driver     "nvidia"
    VendorName "NVIDIA Corporation"
    BusID      "PCI:01:00:0"
    Option     "IgnoreEDID"
    Option     "ConnectedMonitor" "CRT-0"
EndSection

Section "Screen"
    Identifier   "Screen1"
    Device       "Device1"
    Monitor      "Monitor0"
    DefaultDepth 24
    SubSection "Display"
        Depth 24
    EndSubSection
EndSection

Section "Extensions"
    Option "Composite" "Enable"
EndSection

Section "Monitor"
    Identifier  "Monitor0"
    VendorName  "Unknown"
    ModelName   "Unknown"
    HorizSync   28.0 - 73.0
    VertRefresh 43.0 - 72.0
    Option      "DPMS"
EndSection
Server Startup
The primary server can be started by default, as in a standard installation.
The secondary server needs to be started manually or via a special init script. It must load the nVidia driver and OpenGL libraries, so their location has to be passed on the command line with the '-modulepath' option, since the xorg-x11 implementation was selected for the whole system earlier. It also has to keep running even after all of its client processes have exited ('-noreset' option): xdm runs on the primary server, so nothing keeps the secondary server alive at startup, and it would otherwise terminate immediately. To avoid conflicts with the primary server, listening on the TCP socket should be disabled ('-nolisten tcp' option). Finally, the display number matters: as this is the second instance, it should use display ':1'.
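Putting these options together, a complete command line looks like this (paths matching the configuration above):

```shell
X -ac -config /etc/X11/xorg.nvidia.conf -sharevts \
  -modulepath /usr/lib/opengl/nvidia,/usr/lib/xorg/modules \
  -nolisten tcp -noreset :1 vt9
```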
And here is a sample init script for BaseLayout 2.x.
Code: /etc/init.d/optimus

#!/sbin/runscript

depend() {
    need xdm
    after xdm
}

start() {
    ebegin "Starting Optimus X Server"
    export LD_LIBRARY_PATH="/usr/lib/opengl/nvidia/lib:${LD_LIBRARY_PATH}"
    start-stop-daemon --start --background --pidfile /tmp/.X1-lock --exec /usr/bin/X \
        -- -ac -config /etc/X11/xorg.nvidia.conf -sharevts \
        -modulepath /usr/lib/opengl/nvidia,/usr/lib/xorg/modules -nolisten tcp -noreset :1 vt9
    eend $?
}

stop() {
    ebegin "Stopping Optimus X Server"
    start-stop-daemon --stop --exec /usr/bin/X \
        --pidfile /tmp/.X1-lock
    eend $?
}
Strictly speaking, the depend section does not need to require xdm itself, only its dependencies, in order to work standalone. Still, as the nVidia card is not physically connected to a monitor, the primary server must be running for its display to be usable. Remember to make the script executable.
Now, start the optimus service.
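With BaseLayout 2.x this amounts to:

```shell
# Make the init script executable and start the secondary server
chmod +x /etc/init.d/optimus
/etc/init.d/optimus start
```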
VirtualGL
To stream frames rendered by the secondary X server, which drives the nVidia card, to the primary display, VirtualGL is needed. Unfortunately, it is not available through Portage, so it needs to be downloaded and installed manually, either by compiling from source or by unpacking an RPM package.
Manual Compilation
Assuming a viable build environment is available, a static version of libjpeg-turbo is needed first.
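A sketch of such a build (the version number is only an example; libjpeg-turbo's autotools-based source build is assumed):

```shell
# Build and install a static libjpeg-turbo
tar xzf libjpeg-turbo-1.1.1.tar.gz
cd libjpeg-turbo-1.1.1
./configure --enable-static --disable-shared
make && make install
```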
Download and extract a VirtualGL source package from its SourceForge download page. To compile, run make in the source folder, followed by make install.
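For example (version and directory name are placeholders, adjust them to the tarball actually downloaded):

```shell
tar xzf VirtualGL-2.2.tar.gz
cd VirtualGL-2.2    # the extracted directory name may differ
make
make install
```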
RPM Installation
First, a suitable VirtualGL RPM package needs to be downloaded from its SourceForge download page. To extract the files, install rpm2targz and convert the package.
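For example (the package file name and version are placeholders):

```shell
# rpm2targz converts an RPM into a plain tarball
emerge app-arch/rpm2targz
rpm2targz VirtualGL-2.2.x86_64.rpm
```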
Then copy all the files to their intended locations and update the library directory cache.
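Assuming the tarball produced in the previous step:

```shell
# Unpack over the root filesystem, then refresh the dynamic linker cache
tar xzf VirtualGL-2.2.x86_64.tar.gz -C /
ldconfig
```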
Running Applications
First, create a configuration file for VirtualGL, specifying the display used by the nVidia card, the compression method, and optionally a log file.
Code: /etc/default/optimus

# VirtualGL Defaults

# Display for the nVidia X Server
VGL_DISPLAY=:1

# Image transport xv|yuv
VGL_COMPRESS=xv

# Readback mode
VGL_READBACK=fbo

# Logging
VGL_LOG=/var/log/vgl.log
To run applications on the nVidia card a simple shell script needs to be created.
Code: /usr/local/bin/optirun

#!/bin/bash

if [ ! -f /tmp/.X1-lock ]; then
    echo "Optimus X Server is not running!"
    exit 1
fi

source /etc/default/optimus
export VGL_READBACK
export VGL_LOG

vglrun -c $VGL_COMPRESS -d $VGL_DISPLAY -ld /usr/lib/opengl/nvidia/lib "$@"
To check that everything works, run glxgears through the wrapper.
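For example:

```shell
optirun glxgears
# Optionally verify which GPU is actually rendering:
optirun glxinfo | grep "OpenGL renderer"
```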
You can also check the performance increase using the glxspheres and glxspheres64 programs supplied with VirtualGL.
Here is some sample output:
fred@iguana /opt/VirtualGL/bin $ optirun ./glxspheres64
Polygons in scene: 62464
Visual ID of window: 0x21
OpenGL Renderer: GeForce GT 520M/PCI/SSE2
47.626504 frames/sec - 42.166602 Mpixels/sec
44.310995 frames/sec - 39.231182 Mpixels/sec
46.325948 frames/sec - 41.015141 Mpixels/sec
46.685491 frames/sec - 41.333467 Mpixels/sec
fred@iguana /opt/VirtualGL/bin $ ./glxspheres64
Polygons in scene: 62464
Visual ID of window: 0x92
OpenGL Renderer: Mesa DRI Intel(R) Sandybridge Mobile
30.748989 frames/sec - 27.223925 Mpixels/sec
29.870514 frames/sec - 26.446158 Mpixels/sec
31.471540 frames/sec - 27.863642 Mpixels/sec
29.858524 frames/sec - 26.435543 Mpixels/sec