External display without restarting X!
If you use linux on your Thinkpad W520, you have undoubtedly been frustrated by the graphics situation. With a default install, it's impossible to enjoy both the battery life of the integrated intel graphics card and the use of external monitors. The situation is documented in this earlier blog post, along with a partial solution to the problem. The biggest drawback to the solution presented in that post is that you must log out of your X session to disconnect or connect an external display. Fortunately, with some hacking from Liskni_si, we now have a better option:
A dynamic external display in nVidia Optimus Mode without having to restart the X-Server!
Notes:
- As far as I know, this should also work on some T420/520 models to get the display port output and/or tri-head display.
- You may use this with bumblebee installed; however, you MUST follow ALL the steps that Lekensteyn lays out here, except instead of $ export DISPLAY=:8, run $ screenclone -d :8
- There are some perhaps easier-to-follow instructions to do this with bumblebee and 3D support on the intel card here: http://sagark.org/optimal-ubuntu-graphics-setup-for-thinkpads/
So, here is how the system works:
1. Use bbswitch to turn the nVidia card on and off when you need it.
2. All of the rendering is done on the intel card. This requires an extra virtual screen inside the intel card which can be dynamically activated and configured via xrandr.
3. The content of the intel card's virtual screen is copied to the nvidia card via Liskni_si's screenclone for displaying on the external display.
Here are some basic instructions to get it to work:
1. Download bbswitch via your package manager (for ubuntu it is in the bumblebee ppa). Source and instructions on how to use it are here: https://github.com/Bumblebee-Project/bbswitch (you don't NEED to install all of bumblebee)
$ sudo apt-add-repository ppa:bumblebee/stable
$ sudo apt-get update
$ sudo apt-get install bbswitch-dkms
Learn how to use it. You may also have to manually insert the bbswitch module for it to work; a quick sketch follows.
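For reference, here is a minimal sketch of driving bbswitch by hand (this is the /proc interface described in the bbswitch README; the module must be loaded first):
$ sudo modprobe bbswitch
$ cat /proc/acpi/bbswitch
$ echo OFF | sudo tee /proc/acpi/bbswitch
$ echo ON | sudo tee /proc/acpi/bbswitch
The cat shows whether the card is currently ON or OFF; the two tee lines power it down and back up.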
2. Patch your intel driver to give it an extra virtual display.
First download the source for the intel driver
$ apt-get build-dep xserver-xorg-video-intel
$ apt-get source xserver-xorg-video-intel
$ cd <wherever you put the source>/xserver-xorg-...
Then, download and apply the patch to create the extra virtual screen
$ wget https://raw.github.com/liskin/patches/master/hacks/xserver-xorg-video-intel-2.18.0_virtual_crtc.patch
$ patch -p1 < xserver-xorg-video-intel-2.18.0_virtual_crtc.patch
[EDIT - there is a newer update of the patch here: https://github.com/liskin/patches/blob/master/hacks/xserver-xorg-video-intel-2.20.14_virtual_crtc.patch. This new patch may be required for newer systems such as ubuntu 12.10. It also contains support for 2 virtual screens, although support for two external screens in screenclone may not be available yet]
Finally, build and install the new intel driver
$ dpkg-buildpackage -b
$ cd ..
$ sudo dpkg --install xserver-xorg-video-intel_2.17.0-1ubuntu4_amd64.deb
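Note that the exact .deb filename depends on the driver version your distribution ships, so check what dpkg-buildpackage actually produced rather than copying the filename above. A quick sanity check (a sketch - your version strings will differ):
$ ls xserver-xorg-video-intel_*.deb
$ apt-cache policy xserver-xorg-video-intel
The second command should report your locally built version as installed - installing the wrong .deb is a common mistake (see the comments below).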
Now you probably have to restart X, but after you do, if you run xrandr, you should see an entry called "VIRTUAL" - that's what we need.
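A quick way to check:
$ xrandr | grep VIRTUAL
If nothing comes back, the patched driver is probably not the one X is actually using.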
3. Download and build the screenclone source
$ git clone git://github.com/liskin/hybrid-screenclone.git
$ cd hybrid-screenclone
$ make
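If make fails with something like "X11/extensions/record.h: No such file or directory" (a reader hit exactly this - see the comments below), you are missing X development headers. On Ubuntu, my best guess at the packages you need is:
$ sudo apt-get install libx11-dev libxext-dev libxdamage-dev libxtst-dev
libxtst-dev is the one that provides record.h; the others may already be on your system.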
Usage
To attach an external monitor, first start the X server on the nvidia device with the xorg.conf (xorg.conf.nvidia) included with the hybrid-screenclone source (the nvidia card must be on, of course - you can check this with bbswitch).
$ /usr/bin/X -ac -audit 0 -config xorg.conf.nvidia -sharevts -modulepath /usr/lib/nvidia/current,/usr/lib/xorg/modules -nolisten tcp -noreset :1
(If you have bumblebee installed, skip the step above because bumblebee controls the nvidia X server. Instead just run $ optirun true to get the X server going)
Then turn on the virtual screen in the intel chip
$ xrandr --output LVDS1 --auto --output VIRTUAL --mode 1920x1080 --left-of LVDS1
And finally, send the intel output to the nvidia card (you'll need to put the screenclone binary in your path)
$ screenclone
(If you have bumblebee installed, use $ screenclone -d :8 instead)
If you don't see the right screen on your external monitor, try running screenclone with the argument -x n where n is 0, 1, or 2.
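When you are done with the external display, reverse the steps (a rough sketch - adapt it to your setup):
$ xrandr --output VIRTUAL --off
Then stop screenclone and the nvidia X server (Ctrl-C or kill them), and power the card back down:
$ echo OFF | sudo tee /proc/acpi/bbswitch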
Make some scripts out of this (you can see the scripts I have made for my use here), and enjoy your new-found freedom!
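To give you an idea, a hypothetical connect script might look like this (this is NOT my exact script - the paths, resolution, and module name are assumptions you will need to adapt):
#!/bin/sh
# power up the nvidia card and load its module
echo ON | sudo tee /proc/acpi/bbswitch
sudo modprobe nvidia    # may be nvidia_current depending on your install
# start the second X server on the nvidia card
/usr/bin/X -ac -audit 0 -config xorg.conf.nvidia -sharevts -modulepath /usr/lib/nvidia/current,/usr/lib/xorg/modules -nolisten tcp -noreset :1 &
sleep 2    # give the server a moment to come up
# extend the desktop onto the virtual screen and clone it to the nvidia card
xrandr --output LVDS1 --auto --output VIRTUAL --mode 1920x1080 --left-of LVDS1
screenclone &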
And maybe one day, Liskni_si will even extend his code so that we W520 users can have triple head without needing to restart X every time ; )
[If you are looking for a way to enable 3D acceleration on the Intel card, have a look at Christoph Gritschenberger's comment below]
Hi,
I do not have VIRTUAL in xrandr,
I did all the things in the manual.
I have installed bbswitch, and I have downloaded the source of the intel driver and applied the patch. When I was applying the patch, I had to download it manually from the site because wget downloaded the html header and a lot of other things too. Here is the patch process:
patching file src/intel_display.c
Hunk #1 succeeded at 113 (offset 1 line).
Hunk #2 succeeded at 1354 (offset 6 lines).
Hunk #3 succeeded at 1419 (offset -2 lines).
Hunk #4 succeeded at 1487 (offset -3 lines).
Hunk #5 succeeded at 1623 (offset -3 lines).
patching file src/virtual_display.c
I guess that was alright.
Then I ran dpkg-buildpackage -b, which created the package I installed. Then I rebooted the system and ran xrandr. There is no "Virtual" in it. I have a T520 thinkpad with nvidia optimus. Do you know where the problem could be?
Sorry for my bad English and thanks for any help.
Solved.... wrong package installed after dpkg
I have the same problem. After installing the correct package, I can't see VIRTUAL in the xrandr output.
I have a Thinkpad T420 with optimus technology.
I am asking again because: When I try to run
/usr/bin/X -ac -audit 0 -config xorg.conf.nvidia -sharevts -modulepath /usr/lib/nvidia/current,/usr/lib/xorg/modules -nolisten tcp -noreset :1
I get this error:
(EE) Failed to load module "nvidia" (module does not exist, 0)
But I have optirun fully functional, bbswitch is functional too. When I use just discrete graphic it loads the nvidia module with no problem.
Do you have any idea where the problem could be? T520 with UBUNTU 12.04
Sorry for my bad English, thanks for any help.
Your nvidia graphics driver is not properly loaded in a state where it can be used to start the X server.
> But I have optirun fully functional...
It appears that you have bumblebee installed. I personally do not have bumblebee installed - as far as I know it will not help you to use an external monitor via display port. I have not used external monitors with bumblebee installed.
Personally, I think it is best not to have bumblebee installed unless you often need the nvidia graphics card to drive the Thinkpad display. Uninstalling bumblebee seems to mess some stuff up with your nvidia driver. Last time I came back from bumblebee, the only way I could get the nvidia drivers working was by installing them straight from the nvidia website (also make sure nouveau is blacklisted in /etc/modprobe.d).
However, there may be a way to get the nvidia xserver working with bumblebee using this:
https://github.com/Bumblebee-Project/Bumblebee/issues/77
Note: I have also been able to use the nouveau driver by changing xorg.conf.nvidia, so you might try that if you can't get nvidia working.
If anyone gets this to work with Bumblebee, please let me know so that I can update the instructions
It's me again :-D I am stuck in some kind of weird circle. When I do not have bumblebee installed, bbswitch does not work - it says: tee: /proc/acpi/bbswitch: No such file or directory
But when I have bumblebee installed and bbswitch working, the nvidia module is not working :-D I even tried a clean install. How did you get bbswitch working without bumblebee?
Ok,so from a clean install, you should be able to do:
$ sudo apt-add-repository ppa:bumblebee/stable
$ sudo apt-get update
$ sudo apt-get install bbswitch-dkms
This should ONLY install bbswitch, not bumblebee; you only want the bbswitch-dkms package to be installed, not all of bumblebee. bbswitch-dkms should NOT depend on bumblebee. I like to use aptitude to see info about packages.
$ aptitude show bumblebee
should tell you whether or not bumblebee is actually installed
watch the output of
$ sudo apt-get install bbswitch-dkms
closely for errors. Maybe you have to install the module separately ($ sudo modprobe bbswitch) or something
use $ lsmod | grep bbswitch
to figure out whether the module is installed correctly. Read up on dkms.
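For example (assuming dkms is installed):
$ dkms status
should list bbswitch along with the kernel versions it has been built for; if your running kernel is not in that list, the module never got built for it.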
Yes, modprobe bbswitch made it. Could you please post a working dropbox link to your scripts here? The actual one sends me to my own dropbox. And could you share xorg.conf.nvidia? It would be really helpful. Also, how do you manage the intel card? I have a weird feeling that it can't handle unity 3d.
Sorry for my bad English.
fixed the link to the scripts. I use the xorg.conf.nvidia that is included in the hybrid-screenclone git repository: https://github.com/liskin/hybrid-screenclone
As far as the intel card goes, it definitely has the power to run unity 3D, but there can be a problem when both the intel and nvidia drivers are installed on the system. The problem is that the nvidia driver is not open source and runs a different implementation of 3D acceleration. I don't know much about this, but keywords to search for more info would be "mesa", "GLX", "OpenGL nvidia and intel", etc.
Personally, I use KDE, which does not require 3D acceleration for most of its useful features. I stick with the nvidia driver because it offers very good 3D acceleration in the rare cases where I do need it (I usually just start a separate X server or reboot into discrete bios mode when I do 3D stuff).
There may be a way to have both the nvidia and intel 3D acceleration working on your system, but I don't know how to do that. Also, you might have better luck using the nouveau driver instead of nvidia. Since it is open source, it might not mess up the intel 3D acceleration. I can't really guide you on the exact steps for switching to nouveau, but it may be as simple as uninstalling nvidia-current, installing xserver-xorg-video-nouveau, switching "nvidia" to "nouveau" in xorg.conf.nvidia, and restarting (don't know if this is all correct, but I have used screenclone successfully with nouveau in the past)
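In other words, something like this (an untested sketch of the steps I just described):
$ sudo apt-get remove nvidia-current
$ sudo apt-get install xserver-xorg-video-nouveau
then edit xorg.conf.nvidia, change Driver "nvidia" to Driver "nouveau", and restart.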
Good luck! And there's no need to apologise about your English! It's fine.
backman, just for you, I figured out how to get all of this working with bumblebee (the post is now updated to include the bumblebee steps)
With bumblebee installed, you have to follow ALL the steps Lekensteyn suggests here
https://github.com/Bumblebee-Project/Bumblebee/issues/122
except you DON'T have to do export DISPLAY=:8
then use
$ screenclone -d :8
instead of just screenclone
With bumblebee installed, you should definitely be able to use Unity 3D with the intel card
Well, after a lot of work, it works, but not in a way I like. So I am going back to your previous solutions, and I did a startup script that replaces xorg.conf based on whether the notebook is on the charger or not (I have a dock station where the external monitors are plugged in). And after that it turns the nvidia card on or off.
But anyway, thank you. And btw, the unity 3D issue is a problem with the intel driver in 12.04, not with this solution.
I can clone the image, but I can't get an extended screen to work - the xrandr command seems to not completely work - I get some flicker on my LVDS1 output, compressing the image, then reverting back. When I run screenclone -d :8, I get a cloned image of my LVDS1, but not at the resolution I selected using xrandr, and also, like I mentioned, not an extended screen like twinview. Do I need to set up xinerama for that?
This is a W520, nVidia Quadro 1000M, Ubuntu 12.04 (Unity), with bumblebee installed. Everything else works fine.
Thanks for any suggestions!
No, you don't need Xinerama to get the twin-view-like connected desktop. xrandr automatically takes care of everything that Xinerama would (including providing the "xinerama screen" that the screenclone documentation speaks of). I don't have bumblebee installed, so I can't tell you if something with bumblebee is happening. You may not have installed the intel patch correctly. After you run the xrandr command, what is the output of
$ xrandr
without any arguments? Also, can you move your cursor off of one of the sides of your LVDS1, or are you constrained on all sides?
I suspect that the fix will involve using
$ screenclone -d :8 -x n
where n is 0 or 1 or 2 or something
The command 'screenclone -d :8 -x 1' works for me; thanks for the tip! I applied the patch correctly, I was pretty sure. This is awesome.
Are there any other constraints, e.g. are there outputs which don't work with this solution? What about DisplayPort?
Thanks!
Cool man, I'm glad it works for you. I'm assuming you're using a W520 rather than T-series. DisplayPort will work, but it might be a little weird if you plan on switching between VGA and DisplayPort often. The nvidia X server should autodetect which one is plugged in when it is started (via optirun in your case). Just make sure you have the one you want plugged in when you start the nvidia X server and you should be fine. If it doesn't work how you want it to, you may have to mess with your /etc/bumblebee/xorg.conf.nvidia
Yes, using W520. Eagerly awaiting the day when all of this autodetects, and 'just works'. About that patch: Tomas wrote in the comment section of the patch that his setup is _triple head with hotplug_ (!!)... I wonder how that works - I only get one VIRTUAL CRTC going, I wonder if there's a parameter you can pass or something like that.
He has a T420 or T520. On those models the intel card is connected directly to the VGA output. So on those models, the VGA output does "just work" without any messing around. Hence, to have triple-head, he just needs one virtual crtc to output to his DisplayPort. Really, the T series is a better machine for day-to-day linux use because of this.
Hi, I found your post encouraging, but I am having trouble with step 2.
2. Patch your intel driver to give it an extra virtual display.
First download the source for the intel driver
$ apt-get build-dep xserver-xorg-video-intel
$ apt-get source xserver-xorg-video-intel
Then, download and apply the patch to create the extra virtual screen
$ cd
$ wget https://github.com/liskin/patches/blob/master/hacks/xserver-xorg-video-intel-2.18.0_virtual_crtc.patch
$ patch -p1 < xserver-xorg-video-intel-2.18.0_virtual_crtc.patch
Finally, build the new intel driver
$ dpkg-buildpackage -b
Here is my output:
$ patch -p1 < xserver-xorg-video-intel-2.18.0_virtual_crtc.patch
patching file src/intel_display.c
Hunk #1 FAILED at 112.
Hunk #2 FAILED at 1348.
Hunk #3 FAILED at 1421.
Hunk #4 FAILED at 1490.
Hunk #5 FAILED at 1626.
5 out of 5 hunks FAILED -- saving rejects to file src/intel_display.c.rej
patching file src/virtual_display.c
Also after doing: “$ apt-get source xserver-xorg-video-intel” and going into the “xserver-xorg-video-intel-2.9.1/src” directory I do not see the file “intel_display.c”
Any idea what I have done wrong?
Thanks for the help.
hmm, it seems like your intel driver package might be too old. In my version (2.17.0), intel_display.c is in xserver-xorg-video-intel-2.17.0/src. I'm a noob and I don't really know what the best thing to do is, but my guess is that you need to try to find a version of the intel drivers that is closer to 2.18.0, for which the patch was designed. See if your package manager has a newer version; perhaps it is locked at 2.9.1 for some reason.
Hi again, sorry if this is a bad question to ask. I have now solved the problems with step 2 and have run "dpkg-buildpackage -b" without errors. I now have a new build dir in my xserver-xorg-video-intel-2.17.0 dir. But I do not see the package to install. Should this produce a .deb? Where should I look/what should the file be called that I can install? Also, as long as I am at it, what did you do to install your recompiled driver?
Thanks
Nathan
OK, I figured it out. The .deb file from step 2 was in the parent directory from where I downloaded the source. It created two packages; I installed the non-debug one. Then in step three, I used Google to search for the missing .h files to see what libraries I needed to install. Now I have a successfully compiled screenclone application.
Also a note from my troubles: I originally tried to do this on Ubuntu 10.04; to get xserver-xorg-video-intel-2.17.0 to work, I upgraded to Ubuntu 12.04.
Thanks all for your helpful comments on this blog.
Cool, glad you got it working!
OK, weird problem... even though the version numbers are the same, apt-get upgrade insists on replacing my custom package with the official one from the repos.
I looked at pinning, but that assumes different version numbers. How can I change the version numbers so that this doesn't happen all the time?
Thanks,
Stephan
OK sorry for the noise. I managed to get rid of it by increasing the version number using dch -l[your version identifier] and a file in /etc/apt/preferences.d/. For the interested, do this:
(1) Before running dpkg-buildpackage -b, run
$ dch -l[your version identifier]
Then continue as normal and install the package.
(2) Create a file called "/etc/apt/preferences.d/xserver-xorg-video-intel" or similar, containing the following:
Package: xserver-xorg-video-intel
Pin: version 2:2.17.0-1ubuntu4[your version identifier]
Pin-Priority: 1001
Note that this will break future upgrades (i.e. they won't be incorporated and may break other things down the line), so you should frequently check whether a new version of the xserver-xorg-video-intel package is available by running
$ apt-cache policy xserver-xorg-video-intel
which should show all available package versions.
Cheers,
Stephan
Hey any idea what's wrong? I managed to get to the part where I run screenclone, and I get this error when I do screenclone -d :8
terminate called after throwing an instance of 'std::runtime_error'
what(): screenclone.cc:78 display
Thanks!
I believe this means that your nvidia x-server was not started properly. Since you are using -d :8, I am assuming that you are using bumblebee (If you don't have bumblebee, you shouldn't use -d :8). I don't have bumblebee installed, so I can't help you troubleshoot much. Start out by making sure you followed all the instructions at https://github.com/Bumblebee-Project/Bumblebee/issues/122
you might also try
$ cat /var/log/Xorg.8.log | grep EE
to see if you can track down the error in the x server
Thanks for the response, I did a clean install, didn't install bumblebee, and am now following the guide
When I try to start the nvidia xserver I get this error:
(WW) warning, (EE) error, (NI) not implemented, (??) unknown.
(==) Log file: "/var/log/Xorg.1.log", Time: Tue May 15 12:52:04 2012
(EE) Unable to locate/open config file: "xorg.conf.nvidia"
(==) Using system config directory "/usr/share/X11/xorg.conf.d"
(EE) intel(0): [drm] failed to set drm interface version.
(EE) intel(0): Failed to become DRM master.
DRM_IOCTL_I915_GEM_APERTURE failed: Bad file descriptor
Assuming 131072kB available aperture size.
May lead to reduced performance or incorrect rendering.
get chip id failed: -1 [9]
param: 4, val: 0
X: ../../intel/intel_bufmgr_gem.c:2783: drm_intel_bufmgr_gem_init: Assertion `0' failed.
Any ideas?
Thanks!
Oh and some system specs and stuff:
W520 with Quadro 2000M
Running in Optimus mode with OS Detection disabled
Started following this guide immediately after a clean install of Ubuntu 12.04 x64
(EE) Unable to locate/open config file: "xorg.conf.nvidia"
is your problem. You should have gotten the xorg.conf.nvidia file when you downloaded screenclone via git. In the command to start the X server, after the "-config" option, you may have to adjust the path to wherever the xorg.conf.nvidia file is, or just run it from the directory that contains xorg.conf.nvidia
I did run the commands from the hybrid-screenclone folder (when I typed it in manually, it autocompleted the xorg.conf.nvidia, so it's definitely there). However, I noticed that I do not have a /usr/lib/nvidia/current folder. Is this normal?
Sorry to keep bothering you and thanks again
I don't have anything show up on a clean install under additional drivers. Did you have to install it manually?
Sorry, I didn't look at the link that I sent you above very closely, and it was probably not very helpful for what we're doing. Be patient. Once you understand how everything is working together, it will be worth it.
If you were able to install the nvidia driver in Discrete mode like you mention below, that's great. They should be installed even when you are back in Optimus, but they won't be running.
If you cannot install it in discrete mode, the brute force way to do it is to download the latest driver from http://www.nvidia.com/object/unix.html, make the file executable, and run it. (this is brute force because your package manager probably will not know what's going on - in my opinion that is ok in this case)
DO NOT try to do nvidia-xconfig - it won't help you in Optimus mode
Either way, once you have the driver installed, it is a dynamic kernel module, meaning that it can be inserted or deleted from the kernel while the system is running.
To see if the module is inserted, run
$ lsmod | grep nvidia
if a module called nvidia comes up, the module is inserted correctly. If nothing comes up, insert the module manually by running
$ sudo modprobe nvidia
(or it might be $ sudo modprobe nvidia_current depending on how you installed)
Once the module is inserted, make sure the card is turned on with bbswitch. Then you should be able to start the X server on the nvidia card.
Ok so I did all of this, I get a result for lsmod | grep nvidia, and bbswitch appears to be working, but now I get these errors while trying to start the xserver:
(EE) Failed to load module "nvidia" (module does not exist, 0)
(EE) Failed to load module "kbd" (module does not exist, 0)
vesa: Ignoring device with a bound kernel driver
(EE) FBDEV(0): FBIOPUTCMAP: Invalid argument
(EE) Failed to load module "kbd" (module does not exist, 0)
(EE) No input driver matching `kbd'
It seems that I'm getting a bit closer every time thanks to your help :)
hmm. this is really confusing to me... it should work. It must be frustrating for you. Note that if you restart the card with bbswitch, you have to reinsert the nvidia module with modprobe. But, if you detect nvidia with lsmod immediately before you try to start the x server, then I am not sure what's going on. Do you have nouveau installed and running (lsmod | grep nouveau)? If nouveau is loaded, it can prevent the nvidia module from loading (the "vesa: Ignoring device with a bound kernel driver" message makes me think of this). The nvidia installer should take care of that though if you selected the right settings during the install. I think I remember running into something like this at one point, but I don't remember exactly what fixed it. Installing the nvidia driver from their website usually takes care of these things.
Awesome, the driver from nvidia's site did the trick. I've got everything working great now with some scripts to automate it. Thanks for all your help! Now I can use ubuntu as my daily driver :)
Hmm, trying it again with bumblebee, and I can get screenclone to run without errors after leaving a window of (optirun) glxspheres running, but I don't get anything on the external monitor.
ReplyDeleteI have had no issues with disper and disper-indicator to manage external monitors without restarting X.
Maybe everyone here is doing something more technical than I am, but I run external monitors with 16 workspaces & compiz cube all day long.
Are you using Bumblebee and running in Optimus Mode? I cannot even get the exact fix here working on mine. It seems that I am unable to install the Nvidia drivers when I'm on Optimus. The only way I've been able to install the nvidia drivers through "Additional Drivers" is in Discrete Mode with VT-x disabled (because of the BIOS bug).
Unknown - It sounds like you are using Discrete Mode in the bios. In discrete mode, managing displays is indeed simple with disper; however, this gives you about 30% less battery life than Optimus mode in my experience. (I'm assuming you have a W520 instead of a T520.)
Got this working without bumblebee, but has anyone gotten 3d accel to work on the intel card without bumblebee? I can't use any of compiz's features (particularly keyboard shortcuts and workspaces, don't really care for the other stuff) since setting up this method.
The problem is that some of the nvidia driver's hardware acceleration software conflicts with intel's. Bumblebee somehow allows them to be used in parallel, so if you do this with bumblebee, you should be able to get it to work. You might also be able to get it to work by using nouveau instead of nvidia to drive your nvidia card because I believe the nouveau driver is designed to work on the same DRI framework as intel's.
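One way to see which GL stack your main session ended up with (assuming the mesa-utils package is installed):
$ glxinfo | grep -i "opengl renderer"
If it errors out or reports something unexpected, the libGL conflict described above is the likely culprit.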
Hi,
thanks for this useful post! I'm running Kubuntu 11.10 with bumblebee on a W520 and am stuck. I've successfully activated the VIRTUAL output, but I don't get any output when I run screenclone -d :8 (-x 0 or -x 1 do not help either). I don't get any errors anywhere (screenclone, syslog, dmesg); everything seems to work, but the external monitor connected to the DP gets no signal and stays in standby. I figured it might have something to do with the VIRTUAL output because it gets listed as having an "unknown connection":
Screen 0: minimum 320 x 200, current 3840 x 1080, maximum 8192 x 8192
LVDS1 connected 1920x1080+1920+0 (normal left inverted right x axis y axis) 344mm x 193mm
1920x1080 60.0*+ 59.9 50.0
1680x1050 60.0 59.9
1600x1024 60.2
1400x1050 60.0
1280x1024 60.0
1440x900 59.9
1280x960 60.0
1360x768 59.8 60.0
1152x864 60.0
1024x768 60.0
800x600 60.3 56.2
640x480 59.9
VGA1 disconnected (normal left inverted right x axis y axis)
VIRTUAL unknown connection 1920x1080+0+0 (normal left inverted right x axis y axis) 0mm x 0mm
1920x1200 60.0
1920x1080 59.9*
1600x1200 60.0
1680x1050 60.0 59.9
1400x1050 60.0
1440x900 59.9
1280x960 60.0
1360x768 59.8 60.0
1152x864 60.0
800x600 56.2
640x480 59.9
Do you have any advice about this?
Replying to myself to clarify and to document my solution:
After adding
Option "ConnectedMonitor" "DFP-1"
to
/etc/bumblebee/xorg.conf.nvidia
Everything works now! Thanks again!
For completeness my config looks like this:
Section "ServerLayout"
Identifier "Layout0"
Screen "Screen0"
#Option "AutoAddDevices" "false"
EndSection
Section "Device"
Identifier "Device0"
Driver "nvidia"
VendorName "NVIDIA Corporation"
Option "ConnectedMonitor" "DFP-1"
EndSection
Section "Screen"
Identifier "Screen0"
Device "Device0"
DefaultDepth 24
SubSection "Display"
Depth 24
EndSubSection
EndSection
Hi Zachary,
thx for your post. I really need the activation of the second display port, mainly for customer presentations, so I followed the steps described above.
after building the Intel driver, I restarted my computer, but xrandr doesn't show any VIRTUAL entry as expected....
The difference I see from your description above is the version of the Intel drivers downloaded and built. Yours is 2.17.0, mine is 2.15.0. But I am on a 32bit system (with PAE). The package you're installing for the Intel driver is xserver-xorg-video-intel_2.17.0-1ubuntu4_amd64.deb (meaning you are on 64bit); mine is xserver-xorg-video-intel_2.15.901-1ubuntu2.2_i386.deb. Any hints on what I could have done wrong?
xrandr says "Failed to get size of gamma for output default", but I think that has nothing to do with my problem...
Regards, Marc
output of xrandr
Marc@MaGa-w520:~$ xrandr
xrandr: Failed to get size of gamma for output default
Screen 0: minimum 320 x 175, current 1920 x 1080, maximum 1920 x 1080
default connected 1920x1080+0+0 0mm x 0mm
1920x1080 50.0* 51.0 52.0 53.0
1680x1050 54.0
1600x1024 55.0
1440x900 56.0
1400x1050 57.0
1360x768 58.0 59.0
1280x1024 60.0 61.0
1280x960 62.0
1152x864 63.0 64.0 65.0 66.0 67.0 68.0
1024x768 69.0 70.0 71.0 72.0 73.0 74.0
960x720 75.0
960x600 76.0
960x540 77.0
928x696 78.0
896x672 79.0 80.0
840x525 81.0 82.0 83.0 84.0 85.0
832x624 86.0
800x600 87.0 88.0 89.0 90.0 91.0 92.0 93.0 94.0 95.0 96.0
800x512 97.0
720x450 98.0
720x400 99.0
700x525 100.0 101.0 102.0 103.0
680x384 104.0 105.0
640x512 106.0 107.0 108.0
640x480 109.0 110.0 111.0 112.0 113.0 114.0
640x400 115.0
640x350 116.0
576x432 117.0 118.0 119.0 120.0 121.0 122.0 123.0
512x384 124.0 125.0 126.0 127.0 128.0
416x312 129.0
400x300 130.0 131.0 132.0 133.0 134.0
360x200 135.0
320x240 136.0 137.0 138.0 139.0
320x200 140.0
320x175 141.0
Marc,
I'm not sure what would have happened. In order for the virtual screen patch to work, your version of the intel driver needs to be close to the 2.18.0 that the patch was designed for. If 2.15.0 is significantly different, it won't work exactly right. What is the output of
$ patch -p1 < xserver-xorg-video-intel-2.18.0_virtual_crtc.patch
? (the patch should work correctly for 32 bit or 64 bit btw)
If you are having trouble with this, a possibly more reliable way to be able to use two screens is to use only the discrete card (the only downside is worse battery life). See the instructions for option 2 at my other post on this subject: http://zachstechnotes.blogspot.com/2012/01/tri-head-display-on-linux-thinkpad-w520.html
Hi Zachary, thx for your quick response! I really appreciate it!
I understand that the version 2.15 I have may be too far/different from the 2.17 you have for the patch to be applied. But I cannot figure out why you get version 2.17 and I only get 2.15, as we are both using a w520. I already removed the Intel driver using synaptic, refreshed the database and reinstalled it, but it is always version 2.15 that gets reinstalled. Are you perhaps using another repository? Perhaps an unstable one? Or do you know where I can find the driver to install it manually?
I already read your other (very good) post about this, and I will give it a try if I am not successful with updating the Intel driver.
Anyway, thank you for your help!
I am running precise (kubuntu 12.04) now which has the 2.17 driver. It looks like oneiric (11.10) is still using the 2.15 driver. https://launchpad.net/ubuntu/+source/xserver-xorg-video-intel
You can either upgrade to precise, or you might be able to get the new packages by enabling backports. https://help.ubuntu.com/community/UbuntuBackports (I think that is what I was doing when I was still running oneiric, but I'm not 100% sure).
I tried this today on ubuntu 12.04 with intel-driver v2.17 on my T420.
I had to change the "ConnectedMonitor" in xorg.conf.nvidia to "DFP-1".
In order to get 3D-accel working with the intel-card I had to
update-alternatives --config x86_64-linux-gnu_gl_conf
* 2 /usr/lib/x86_64-linux-gnu/mesa/ld.so.conf 500 manual mode
This however will cause the secondary xserver to fail to load.
(EE) Failed to load /usr/lib/nvidia-current/xorg/libglx.so: libnvidia-tls.so.295.40: cannot open shared object file: No such file or directory
I now use this command to start the xserver:
LD_LIBRARY_PATH=/usr/lib/nvidia-current/ /usr/bin/X -ac -audit 0 -config xorg.conf.nvidia -sharevts -modulepath /usr/lib/nvidia-current,/usr/lib/xorg/modules -nolisten tcp -noreset :1
Also note that I had to change /usr/lib/nvidia/current to /usr/lib/nvidia-current.
The only thing I'm missing now is to define the VIRTUAL-monitor as primary.
when I do
xrandr --output VIRTUAL --mode 1680x1050 --output LVDS1 --auto --right-of VIRTUAL
screenclone -x 1
It works as expected (extended to the left). However when I try to define VIRTUAL as primary like this:
xrandr --output VIRTUAL --mode 1680x1050 --primary --output LVDS1 --mode 1600x900 --right-of VIRTUAL
I lose the VIRTUAL monitor.
Thanks for the info on the 3D stuff! I noted it at the end of the blog post.
I'm not sure why the --primary is not working. All I can tell you is that I use
xrandr --output LVDS1 --auto --output VIRTUAL --primary --mode 1920x1080 --right-of LVDS1
all the time and it works. If you think that it is a bug, you might want to contact the guy who wrote the patch that gives you the VIRTUAL screen: https://github.com/liskin
OK, it does not work in gnome 3 (with or without effects).
It works on XFCE but it kind of ignores the primary anyways.
It works perfectly in KDE.
Maybe I can find something in the gnome-code
Nice work. I was kind of enjoying being the hub for information on this issue, but it looks like your instructions might be a little easier to follow. I'll put a link in the actual post. If you want to give people an idea of what's going on, you might want to link back to my other post http://zachstechnotes.blogspot.com/2012/01/tri-head-display-on-linux-thinkpad-w520.html
It has some nice fun ascii art drawings to illustrate what's going on.
Done. And thanks again for your great work, if it weren't for this post, I would never have been able to switch to Ubuntu :p
ReplyDeleteI'm having trouble with screenclone:
ReplyDeleteafter "make" I get:
screenclone.cc:23:35: fatal error: X11/extensions/record.h: No such file or directory
compilation terminated.
make: *** [screenclone] Error 1
everything up to that point went flawless.
Thanks!
Mguel
PS: W520, lubuntu 12.04 fresh install. No bumblebee
ok, installed libxtst-dev and could "make" screenclone with no errors
very helpful. I too could not build.
~/GIT/showkeys $ sudo apt-get install libxtst-dev
I only got a warning after this.
showkeys.c: In function ‘update_key_ring’:
showkeys.c:113:2: warning: ‘XKeycodeToKeysym’ is deprecated (declared at /usr/include/X11/Xlib.h:1695) [-Wdeprecated-declarations]
showkeys.c:122:2: warning: ‘XKeycodeToKeysym’ is deprecated (declared at /usr/include/X11/Xlib.h:1695) [-Wdeprecated-declarations]
Thanks
Hi!
So I have it working.
I am running scripts like this
xrandr --output LVDS1 --auto --output VIRTUAL --mode 1920x1200 --left-of LVDS1
optirun screenclone -d :8 -x 1
xrandr --output VIRTUAL --off
but everything I have on the Thinkpad display is transferred to the external one - how do I avoid that?
And how do I run screenclone on startup?
Thanks!
Hmm... unfortunately I don't have bumblebee installed (because I normally roll with three screens), so I can't test this. You might try messing around with the xrandr command and using the --primary argument (I think you might want --output LVDS1 --primary), also, you might need to change the number after -x when you call screenclone.
As for using it at startup, I think that most desktop environments (gnome, ubuntu unity, KDE, xfce, etc.) have a folder where you can put scripts to run at startup. You might be able to accomplish this by putting the commands in a script and putting the script in this folder so that it runs when the desktop environment starts. Specific instructions will depend on which DE you are using.
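In environments that follow the freedesktop autostart convention, a small .desktop file in ~/.config/autostart also works - a minimal sketch (the script path is hypothetical):
[Desktop Entry]
Type=Application
Name=external-display
Exec=/home/you/bin/external-display.sh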
Also, you might want to use Sagar Karandikar's tool found here: http://sagark.org/thinkdisp-about-installation/ . That might have a feature to automatically run at startup, or it might just make it easier to add the display manually after you have started it up. I have never personally tried it.
I assume since you are using "optirun screenclone" you have seen the other guides on this subject
http://sagark.org/optimal-ubuntu-graphics-setup-for-thinkpads/
http://blog.gordin.de/post/optimus-guide
Those may also be of help on either problem.
Sorry I can't offer more specific help.
Hi,
first - thanks for a great blog and post.
I've been trying to setup the following 'stage': One external monitor (which I would like to use for 3D stuff, running a V.M. on top of KVM-Qemu 'connected' to the NVidia graphic card) and the laptop's lcd ('connected' to the integrated Intel chip).
I got the following so far: the VM (Linux also) uses the NVidia graphic card (I checked by running lsmod | grep nvidia on the guest) and the host - in virtualization parlance - uses the integrated Intel, checked by lsmod | grep i915.
So, in contrast to the tutorial you posted above, I did not (and am not planning to) fiddle around [too much ;D] with the intel-xorg driver, nor with screenclone/full bumblebee, if possible...
Any ideas on how to just plug in the external monitor and (either with or without restarting X), using the triple- (in my case dual-) head monitor Xorg file which you linked above, be able to have my regular desktop (running on the Intel chip) on my laptop screen and the other Linux (in the V.M.) displayed on the external monitor using the NVidia driver?
xrandr shows: VGA-0 not connected.
Thanks a lot
I too use a W520 Lenovo laptop.
In order to use the other xorg.conf file, get completely out of X to a tty terminal and run the startx -- -config script. You won't be able to use xrandr to add a display in optimus mode without doing the whole bumblebee/screenclone thing. Also, if the virtual machine needs to have the whole nvidia card devoted to itself (i.e. the on-hardware linux installation can't simultaneously use it), you won't be able to run it on an external display.
I don't know much about what virtual machines need to use different cards, but to minimize and isolate errors, here are the actions that I would take:
1. Start out by booting and switching to Discrete graphics mode in the bios. This will use the nvidia card for both the thinkpad display and the external display, and you can make sure both that the nvidia drivers are installed correctly and that your VM works with the nvidia card. In this mode, you have to use nvidia-settings instead of xrandr if you're using the nvidia driver.
2. Next, I would switch to the nvidia optimus bios mode, and create an xorg.conf file with only *one* external screen running off the nvidia card. For this, you could start with the xorg.conf here: https://www.dropbox.com/s/elcz258bjbfl8ik/xorg.conf.triple.txt and comment out everything but either the DisplayPort Screen or VGA Screen section and the corresponding device and monitor section. Again, make sure your VM and everything works.
3. Finally, modify an xorg.conf to use your thinkpad display on the intel device and the external display on the nvidia device. If you're only using the external display for your virtual machine, you probably *don't* even want to use Xinerama - then you won't be able to drag windows between screens, but you won't need to, and the graphics will probably work better without Xinerama.
It sounds like you already saw the other post http://zachstechnotes.blogspot.com/2012/01/tri-head-display-on-linux-thinkpad-w520.html - that one is more relevant to what you're doing.
I've successfully followed this guide for 12.04 but am now trying to adapt it for 12.10 and keep failing.
I'm currently trying to use intel 2:2.20.2 for this, but was wondering if anyone has had any success getting this or something similar to work in 12.10?
Thanks Zack for the post, excellent work!
Hi,
I got nearly everything to work on 12.10, but you need the newer patch for the driver: xserver-xorg-video-intel-2.20.14_virtual_crtc.patch instead of the old one. Then, for me, one detail of the patch failed, but that was possible to fix by hand. Let me know if you need the details.
Thanks for the guide! Got my W530 working with two monitors (and laptop display) perfectly now, and can even change setup of two monitors in nvidia-settings.
Thanks for answering Patrick's question. I'll add a note in the instructions. I may have time in the next few weeks to do a major revision of these instructions, and fix up everything/ make it easier to follow.
It looks like the patch no longer applies to the X driver source for 12.10. I stared at it for a while but it looks like the code was re-architected and the patch would need to be redesigned.
Did you try the newer patch available here: https://github.com/liskin/patches/blob/master/hacks/xserver-xorg-video-intel-2.20.14_virtual_crtc.patch ?
Zach: confirmed that that patch applies, although the version I happened to have today was xserver-xorg-video-intel-2.20.9 and there was one diff rejected: intel_display.c.rej. I applied the simple portion of the patch. The driver built but due to time constraints I didn't get a chance to get it working. I'm not sure why my results from "apt-get source xserver-xorg-video-intel" don't match exactly, but this patch was closer than the last patch and likely works (just didn't have time to figure out why the new driver didn't load at reboot, probably just something on my end).
ReplyDeleteI was able to perfectly make on of my monitors work with this setup. Although I have screen resolution problems.
ReplyDelete1. Is there a way I could change the screen resolution so that it could support the actual resolution of the monitor?
2. Also, as I said earlier I was able to recreate this with my external monitor which is 19 inches. I have another which is 30 inches which I use mainly for my work. This monitor turns on when the Nvidia card is activated but I not able to render any image. It something to do with the power, display port? Any ideas on this would be really helpful. Thank You!
To change the resolution, you will have to modify the xrandr command when you start the virtual display on the intel card. The example xrandr command is for a 1080p display - if yours is not 1920 x 1080, you should change it to whatever the monitor is.
I'm not sure about the second problem - you will have to give more details about what point there is a failure (is it when you are running screenclone?) You may be able to find more help here: http://sagark.org/optimal-ubuntu-graphics-setup-for-thinkpads/
Hey Zach,
For the first problem, I am testing it on a 19" monitor, and its max resolution is 1280x1024. I am trying to see if I can render that resolution, but that does not seem to happen. This is the sequence:
optirun true
xrandr --output LVDS1 --auto --output VIRTUAL --mode 1280x1024 --right-of LVDS1
And I get -> xrandr: cannot find mode 1280x1024
So I was wondering if I could configure the Virtual device to render 1280x1024.
For the second problem, I use a Dell3007WFP monitor which is WQXGA. Maximum resolution of 2560x1600.
I do
optirun true
xrandr --output LVDS1 --auto --output VIRTUAL --mode 1920x1080 --right-of LVDS1 (lower than the maximum resolution)
screenclone -d :8 -x 1
This turns the monitor on, because up until that point the monitor is not turned on. But after that there is nothing rendered on the monitor. Let me know if you have any ideas or if you need any more info. I would be glad to provide it. I shall also try and post the second problem on Sagark's blog. Thanks again!
For the first problem, you probably need to add a new custom mode to xrandr (there is evidently not a 1280x1024 mode by default). See this to get started: http://www.arunviswanathan.com/node/53
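Roughly like this (a sketch - take the exact modeline numbers from cvt's own output rather than from here):
$ cvt 1280 1024 60
$ xrandr --newmode "1280x1024_60.00" 109.00 1280 1360 1496 1712 1024 1027 1034 1063 -hsync +vsync
$ xrandr --addmode VIRTUAL 1280x1024_60.00
and then use 1280x1024_60.00 as the --mode in your xrandr command.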
Unfortunately, regarding the second problem, I don't know exactly what's going on, and I don't have screenclone installed right now to test it. Sorry I can't be of more help.
Ashrith, I think the problem is that you have to run "optirun screenclone -d :8 -x 1". I had the same issue and this solved it. Though now that I've updated linux to 3.5.0.26.32, I've lost the ability to output to my secondary monitor.
This stopped working for the intel 2.20 driver (it was working for 2.19); I get low graphics mode on reboot.
Above steps worked great on Ubuntu 12.10. Thanks Zackary!
On Ubuntu raring 13.04, though, the intel driver at 2.21.6 has advanced quite a bit beyond the last virtual_crtc patch available (2.20.14). Has anybody by chance got an updated patch and/or figured out a new way to get dual-screen on raring? thanks!
I use the old 2.20 driver with the patch in raring and it works well.
If you want to use the new 2.21 driver I saw someone who got it working by switching from SNA to UXA.
I guess a third option is to figure out what needs to be changed to get it working with SNA. :)
Hi, the version of my xserver-xorg-video-intel package is 2:2.19.0-6, so I don't know how I could apply the patch. I tried it by hand, but I cannot find all the lines which should be changed. Or does my package version include this feature already?
I own a Thinkpad W530 with a Nvidia Quadro K2000M and I am using Linux Mint Debian Edition. Currently I use Bumblebee, but I want to use at least one external monitor. :-/
Hi Nicolas,
It seems that the two versions of the patch available at https://github.com/liskin/patches/tree/master/hacks are for versions 2.18 and 2.20, so that seems to be why it won't work on 2.19. It is very unlikely that your version includes this feature already.
You may be able to roll back your xserver-xorg-video-intel to version 2.18 and apply the patch, using apt like this: http://blog.andrewbeacock.com/2007/03/how-to-install-specific-version-of.html (I've never done this, but I think it should work)
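Something along these lines (a sketch - the version string is a placeholder; list what is actually available first):
$ apt-cache showpkg xserver-xorg-video-intel
$ sudo apt-get install xserver-xorg-video-intel=<one of the 2.18.x versions listed>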
Your other option is to take the xinerama approach that I posted about here: http://zachstechnotes.blogspot.com/2012/01/tri-head-display-on-linux-thinkpad-w520.html . That solution has its own set of problems (for example, it may not work with unity), but it doesn't require you to patch anything.
Hi,
after several years of running a setup which is based on this post and Sagar's, I wonder if some progress was made and newer Linuxen don't require patching drivers and installing Bumblebee anymore.
Do you still use this? Did you make good experience with newer distributions? From what I read in the recent optimus guides, one needs to log out and in again quite often which obviously is not acceptable.
Kind regards and thanks for the post in the first place
Hi! Yes, I would say the new distributions are better, but not perfect. I usually leave my laptop at my desk these days, so I use the discrete graphics bios mode because I don't need the battery life. I believe in optimus mode you still have to restart X to turn the nvidia card on and off, so that is a pain, but now it is officially supported, so it's MUCH less buggy (but not bug-free). I haven't tried tri-head display since installing 14.04, but I think it may work in optimus mode.
Thanks for getting back, even though it didn't sound very encouraging.
Best regards
Hello guys,
I also have a Lenovo T420 and I would like to know if it is possible to use the external monitor only - as primary - connected through the DisplayPort (DP->DVI cable), running on the internal Intel HD Graphics 3000?
Right now I'm on Windows 10, where it seems impossible. After connecting the monitor to the DisplayPort, it always starts the nVidia card, and it looks like a hardware issue. I would be glad if you could clarify whether this works in Linux, so it is maybe not a hardware issue at all.
More on this: http://superuser.com/questions/1057044/force-using-internal-graphic-card-with-displayport-instead-of-external-nvidia-o
Hi, can I try it on my Lenovo ThinkPad E565???
Hi Laila, I would guess that this is probably not applicable (or necessary) for newer models like the E565. The newest versions of ubuntu should have support - also, since the E565 is AMD-based rather than intel/nvidia, it may be much different than my W520
DeleteWe all are accustomed to the cosmetic foundation boxes showcased at the cosmetic counters in malls. Moisturizing and makeup foundations are packaged in boxes to protect them from getting tampered by heat and water. The material used in the manufacturing of foundation boxes can be customized to favored shapes, sizes and colors. Having an inspirational custom packaging for your cosmetic foundations aids you in endorsing your products efficiently. Usually, glamorous images and terse content is used for the foundation boxes. However simple yet classic artworks are also in vogue.
ReplyDeletePar excellence products: Our high-tech digital and offset printing equipment plus latest techniques ensure unrivaled finished products. Custom Boxes takes pride in providing its clients the best at affordable prices. Fast and Fruitful: We value customer retention. Our production team makes sure that all the orders are custom printed boxes within the time frames defined by our valuable clients. Timely shipment is one of the core values of our company.
Does this work for the Lenovo ThinkPad T480? I have the same device and I need to connect external displays.
Great work. Keep it up,