The Agony, the Ecstasy, the Dual Monitors
I am composing this blog entry on the right-hand screen of a brand shiny new dual-monitor rig, one that took me the best part of a week to get working. I am going to describe what I went through to get here, because I think it contains some useful tips and cautions for the unwary.
I started thinking seriously about upgrading to dual monitors when A&D regular HedgeMage turned me on to the i3 tiling window manager. The thing I like about tiling window managers is that your screen is nearly all working surface; this makes me a bit unlike many of their advocates, who seem more focused on interfaces that are entirely keyboard-driven and allow one to unplug one's mouse. The thing I like about i3 is that it seems to be the best in class for UI polish and documentation. And one of the things she told me was that i3 does multi-monitor very well.
So, Monday morning I went out and bought a twin of the Auria EQ276W flatscreen I already have. I like this display a lot; it's bright, crisp, and high-contrast. HedgeMage had recommended a particular Radeon-7750-based card available from Newegg under the ungainly designation "VGA HIS|H775FS2G", but I didn't want to wait the two days for shipping, so I asked the tech at my friendly local computer shop to recommend something. After googling for Linux compatibility, I bought an nVidia GeForce GT640.
That was my first mistake. And my most severe. I’m going to explain how I screwed up so you won’t make the same error.
For years I’ve been listening to lots of people sing hosannahs about how much better the nVidia proprietary blobs are than their open-source competition – enough better that you shouldn’t really mind that they’re closed-source and taint your kernel. And so much easier to configure because of the nvidia-settings tool, and generally shiny.
So when the tech pushed an nVidia card at me and I had googled to find reports of Linux people using it, I thought “OK, how bad can it be?”. He didn’t have any ATI dual-head cards. I wanted instant gratification. I didn’t listen to the well-honed instincts that said “closed source – do not trust”, in part because I like to think of myself as a reasonable guy rather than an ideologue and closed-source graphics drivers are low on my harm scale. I took it.
Then I went home and descended into hell.
I'm still not certain I understand all the causal relationships among the symptoms I saw during the next three days. There is a post, with comments, on G+ about these events; I won't rehash them all here, but do look at the picture.
That bar-chart-like crud on the left-hand flatscreen? For a day and a half I thought it was the result of some sort of configuration error, a mode mismatch or something. It had appeared right after I installed the GT640. I mean immediately on first powerup.
Then, after giving up on the GT640, because nothing I could do would make it do anything with the second head but echo the first, I dropped my single-head card back in. And saw the same garbage.
From the timing, the least hypothesis is that the first time the GT640 powered up, it somehow trashed my left-hand flatscreen. How, I don't know – overvoltage on some critical pin, maybe? Everything else, including my complete inability to get the setup to enter any dual-head mode over the next 36 hours no matter how ingeniously I poked at it with xrandr, follows logically. I should have smelled a bigger rat when I noticed that xrandr wasn't reporting a 2560×1440 mode for one of the displays – I think after the left one got trashed it was reporting invalid EDID data.
But I kept assuming I was seeing a software-level problem that, given sufficient ingenuity, I could configure my way out of. Until I dropped back to my single-head card and still saw the garbage.
Should I also mention that the much-vaunted nvidia-settings utility was completely useless? It thought I wasn't running the nVidia drivers and refused to do a damn thing. It has since been suggested that I wasn't in fact running the nVidia drivers, but if that's so it's because nVidia's own installation package didn't push nouveau (the open-source driver) properly out of the way. Either way, nVidia FAIL.
So, I ordered the Radeon card off Newegg (paying $20 for next-day shipping), got my monitor exchanged, got a refund on the never-to-be-sufficiently-damned GT640, and waited.
The combination of an unfried monitor and a graphics card that isn't an insidiously destructive hell-bitch worked much better. But it still took a little hackery to get things really working. The major problem was that the combined pixel size of the two 2560×1440 displays won't fit in X's default 2560×2560 virtual screen size; this side-by-side configuration needs a (2560×2)×1440 = 5120×1440 virtual screen.
OK, so three questions immediately occur. First, if X’s default virtual screen is going to be larger than 2560×1440, why is it not 2x that size already? It’s not like 2560×1440 displays are rare creatures any more.
Second, why doesn’t xrandr just set the virtual-screen size larger itself when it needs to? It’s not like computing a bounding box for the layout is actually difficult.
Third, if there's some bizarre but valid reason for xrandr not to do this, why doesn't it have an option to let you force the virtual-screen size?
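To underline how small an ask the second point is: the bounding-box computation xrandr would need is a few lines. Here's a minimal sketch in Python (the layout tuples are illustrative, not anything xrandr actually emits):

```python
# Compute the virtual-screen (framebuffer) size needed to cover a set of
# monitors, each described as (x_offset, y_offset, width, height) in pixels.
def virtual_screen_size(monitors):
    width = max(x + w for (x, y, w, h) in monitors)
    height = max(y + h for (x, y, w, h) in monitors)
    return (width, height)

# Two 2560x1440 panels side by side, as in the setup described here:
layout = [(0, 0, 2560, 1440), (2560, 0, 2560, 1440)]
print(virtual_screen_size(layout))  # (5120, 1440)
```

That's the whole job: take the maximum right edge and the maximum bottom edge over all outputs.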
But no. You have to edit your xorg.conf, or create a custom one, to up that size to the required value. Here’s what I ended up with:
# Config file for snark using a VGA HIS|H775FS2G and two Auria EQ276W
# displays.
#
# Unless the virtual screen size is increased, X cannot map both
# monitors onto screen 0.
#
# The card is dual-head.
# DFP1 goes out the card's DVI jack, DFP2 out the HDMI jack.
#
Section "Screen"
    Identifier "Screen0"
    Device     "Card0"
    SubSection "Display"
        Virtual 5120 1440
    EndSubSection
EndSection

Section "Monitor"
    Identifier "Monitor0"
EndSection

Section "Monitor"
    Identifier "Monitor1"
    Option "RightOf" "Monitor0"
EndSection

Section "Device"
    Identifier "Card0"
    Option "Monitor-DFP2" "Monitor0"
    Option "Monitor-DFP1" "Monitor1"
EndSection
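For completeness: xrandr does have a --fb option that sets the framebuffer size at runtime, but as I understand it that size cannot exceed the maximum the server allocated at startup, which is why the Virtual line above is still required. Once the server is started with a big enough virtual screen, the layout can be expressed as an xrandr invocation. This is a sketch, not my actual setup; the output names DVI-0 and HDMI-0 are guesses, so check `xrandr --query` for yours, and drop the leading echo to actually run it:

```shell
# Runtime equivalent of the xorg.conf layout above (hypothetical output names).
W=2560; H=1440
FB="$((2 * W))x${H}"    # combined framebuffer: 5120x1440

echo xrandr --fb "$FB" \
    --output DVI-0  --mode "${W}x${H}" --pos 0x0 \
    --output HDMI-0 --mode "${W}x${H}" --pos "${W}x0"
```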
That finally got things working the way I want them.
What are our lessons for today, class?
Here’s the big one: I will never again install an nVidia card unless forced at gunpoint, and if that happens I will find a way to make my assailant eat the fucking gun afterwards. I had lots better uses for 3.5 days than tearing my hair out over this.
When your instincts tell you not to trust closed source, pay attention. Even if it means you don’t get instant gratification.
While X is 10,000 percent more autoconfiguring than it used to be, it still has embarrassing gaps. The requirement that I manually adjust the virtual-screen size was stupid.
UPDATE: My friend Paula Matuszek rightly comments: “You missed a lesson: When you have a problem in a complex system, the first thing to do is check each component individually, in isolation from as much else as possible. Yes, even if they were working before.”
Now I must get back to doing real work.
Eric S. Raymond's Blog