April 4, 2010
As far as I'm concerned, you can never be too rich, too thin, or have too much screen space. By "screen", I mean not just large monitors, but multiple large monitors. I've been evangelizing multiple monitors since the dark days of Windows Millennium Edition:
If you're a long-time reader, you're probably sick of hearing about this stuff by now, but something rather wonderful has happened since I last wrote about it:
If you're only using one monitor, you are cheating yourself out of potential productivity. Two monitors is a no-brainer. It's so fundamental that I included it as a part of the Programmer's Bill of Rights.
But you can do better.
As good as two monitors is, three monitors is even better. With three monitors, there's a "center" to focus on. And 50% more display area. While there's certainly a point of diminishing returns for additional monitors, I think three is the sweet spot. Even Edward Tufte, in the class I recently attended, explicitly mentioned multiple monitors. I don't care how large a single display can be; you can never have enough desktop space.
Normally, to achieve three monitors, you have to either:
- Buy an exotic video card that has more than 2 monitor connections.
- Install a second video card.
Fortunately, that is no longer true. I was excited to learn that the latest ATI video cards have gone from two to three video outputs. Which means you can now achieve triple monitors with a single video card upgrade! They call this "eyefinity", but it's really just shorthand for "raising the standard from two display outputs to three".
But, there is a (small) catch. The PC ecosystem is in the middle of shifting display output standards. For evidence of this, you need look no further than the back panel of one of these newfangled triple display capable ATI video cards:
I suspect part of this odd connector layout is due to space restrictions (DVI is awfully chunky), but I've always understood DisplayPort to be the new, improved DVI connector for computer monitors, and HDMI to be the new, improved s-video/component connector for televisions. Of course these worlds are blurring, as modern high-definition TVs make surprisingly effective computer monitors, too.
Anyway, since all my monitors have only DVI inputs, I wasn't sure what to do with the other output. So I asked on Super User. The helpful answers led me to discover that, as I suspected, the third output has to be DisplayPort. So to connect my third monitor, I needed to convert DisplayPort to DVI, and there are two ways:
- a passive, analog DisplayPort to DVI conversion cable for ~$30 that supports up to 1920x1200
- an active, digital DisplayPort to DVI converter for $110 that supports all resolutions
I ended up going with the active converter, which has mixed reviews, but it's worked well for me over the last few weeks.
Note that this adapter requires USB power, and given the spotty results others have had with it, some theorize that it needs quite a bit of juice to work reliably. I plugged it into my system's nearby rear USB ports which do tend to deliver more power (they're closer to the power supply, and have short cable paths). Now, I have gotten the occasional very momentary black screen with it, but nothing severe enough to be a problem or frequent enough to become a pattern. If you have DisplayPort compatible monitors, of course, this whole conversion conundrum is a complete non-issue. But DisplayPort is fairly new, and even my new-ish LCD monitors don't support it yet.
The cool thing about this upgrade, besides feeding my video card addiction, is that I was able to simplify my hardware configuration. That's always good. I went from two video cards to one, which means less power consumption, simpler system configuration, and fewer overall driver oddities. Basically, it makes triple monitors -- dare I say it -- almost a mainstream desktop configuration. How could I not be excited about that?
I was also hoping that Nvidia would follow ATI's lead here and make three display outputs the standard for all their new video cards, too, but sadly that's not the case. It turns out their new GTX 480 fails in other ways, in that it's basically the Pentium 4 of video cards -- generating ridiculous amounts of heat for very little performance gain. Based on those two facts, I am comfortable endorsing ATI wholeheartedly at this point. But, do be careful, because not all ATI cards support triple display outputs (aka "eyefinity"). These are the ones that I know do:
Unless you're a gamer, there's no reason to care about anything other than the least expensive model here, which will handily crush any 2D or 3D desktop GUI acceleration needs you might have. As an addict, of course I bought the high end model and it absolutely did not disappoint -- more than doubling my framerates in the excellent game Battlefield: Bad Company 2 over the GTX 280 I had before.
I'm excited that a triple monitor setup is now, thanks to ATI, so easily attainable for desktop users -- as long as you're aware of the DisplayPort caveat I discussed above. I'd encourage anyone who is even remotely interested in the (many) productivity benefits of a triple monitor setup to seriously consider an ATI video card upgrade.
Posted by Jeff Atwood
@codinghorror I've got 3 external monitors hooked up to my laptop. 2 through USB. http://bit.ly/cHPrQ8
Sent you this tweet a bit ago. USB monitors are more than good enough for developing on; they stutter on fullscreen 1920x1080 video though ^_^
Monoprice has some fantastic prices for cables. ~13 bucks for a DisplayPort to DVI? I'm not sure about resolution, but I doubt I'd need to pay $30 for a cable (http://bit.ly/Fkx1O).
And Eyefinity really means 3-6 (so far) on a single card. Sure, 6 is kinda crazy, but if you need it, it's available (http://bit.ly/a2jh1I).
I use three monitors across two workstations (using the same keyboard+mouse via a free utility called Synergy). The downside is that you can't drag windows from one workstation to the next, but the plus is that you have more computing power to work with for heavy multitasking.
I use 3 monitors (http://bit.ly/bob2z2). If you have a PCIe card with two outputs, you can use the onboard graphics card on your motherboard to power the third monitor. I think that the chipset of both cards must be the same (for me, both are nVidia).
Low-level video cards don't pull that much juice, you know. Entry-level, passively cooled cards cost peanuts and still offer DVI + HDMI output support.
- Quieter working setup
- It's easy to fall back to a single card in case of hardware malfunction
- You might want to try a 4-display setup, yay
- You don't have to put up with a clumsy $100 external block which has occasional problems
- Too bad if a simplified hardware configuration has intrinsic value for you
- 4 monitors take an insane amount of actual desktop real estate
According to anandtech.com, ATI has a six-output card that will be out soon. And that will be the default at the chipset level (their customers want it for notebooks).
I have three 1600x1200 displays, and I could put a 4th to good use. I have a monitoring app that is normally hidden, but that really should be where I can see it.
I don't think all these hardware or software features that are supposed to increase your productivity matter that much.
At least when I code, I spend much of the time thinking about how to solve the problem, how to augment the class design, and so on (not to mention all the other time spent *not* coding, like attending meetings).
Also, the little time I lose when having to bring up another window or something like that can actually be used to let something I just wrote "sink in". I am actually not sure even such highly regarded tools as code completion increase productivity.
An annoyance with these newfangled ports (HDMI and DisplayPort) for me is that to actually get a monitor which has those ports in it, you have to cough up 100-300 USD more.
And what do you get for that kind of money? More ports and maybe some height/tilt adjustments for the monitor.
Sure, the monitor probably has a better image too, but I'm perfectly happy with the cheap ones with "just" DVI and VGA connectors, so why would I bother?
Somehow I never really got the hang of multiple monitors. I had a dual-head setup even back when CRTs were still fashionable, but it seems that switching virtual desktops is easier for me than craning my neck.
Even now, with an external monitor on the macbook I need to make a conscious effort to actually use the internal screen.
I find the user interface (and how used one is to it) much more of a productivity factor. I still feel more at ease with my XWindows setup than with MacOS and the macbook keyboard.
(May I add that having the commenter's name link to a profile instead of a proper homepage is somewhat lame? We'll see where my name will point to. -- Why do I have to make do with TypePad?)
Jeff, this ID stuff is seriously stupid or seriously misleading. There isn't even a simple way to trace it to my identity. :-(
Rocking as multiple monitors are I've always found virtual displays to be rocking-er? Same effect without having to have a desk full of monitors/move your head when switching?
FYI, DisplayPort and HDMI are pretty much the same; the only difference is, DisplayPort is royalty-free and doesn't have any encryption like HDCP. There are even simple little plugs you can buy so they can interconnect.
BTW, HDMI has better electrical compatibility with DVI than DisplayPort does; it's just that the card cannot power all three :/ DisplayPort uses lower voltages, and that's why it's not an issue.
BTW, eyefinity is something else. That has to do with multi monitor setups to display a single image, for instance, using the HD 5870 Eyefinity model (6 mini DisplayPorts out, usually 100 bucks extra) to run a game over six monitors (e.g. http://vimeo.com/10425661), fairly pointless, three monitors work better. I've actually become so used to working on one and having virtual desktops.
Is it possible to set up three virtual desktops and have them rotate? As in, I have one in the centre, then hit a shortcut and it switches with the one to the right, and then I hit something else and it switches with the left (preferably I would have a switch so the main [currently right] goes left, right goes right, and left goes centre). Now something like that would be awesome, since it would just blend into my current work style. I'd use the monitors as reference, but when something needs focus, it needs to be on the main.
BTW, you forgot the HD 5970. It's a dual GPU (downclocked 5870s) for 700. Oh and I'd always check newegg, it's got some great prices on the cards. 5850s for sub 300, 5870 sub 400.
Correct me if I'm wrong, guys, but isn't HDMI just like DVI, a digital connector, except that it also transfers sound?
If so, why would you invest in an expensive displayport converter?
just buy one of these: http://www.octavainc.com/HDMI%20DVI%20adapter.htm
3 monitors on a single desktop is great, but you can go one better and have multiple computers using a software KM. It still feels like a single system in that you're using one keyboard/mouse, but you're controlling multiple computers.
Input Director ( http://www.inputdirector.com ) is great for Windows-only setups. Another option is Synergy (a little buggy but still works okay) for mixed O/S.
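If you go the Synergy route, the setup is just a small text config on the server. Here's a minimal sketch of a `synergy.conf` for two side-by-side machines; the screen names `desktop` and `laptop` are hypothetical placeholders, and each must match the corresponding machine's hostname (or an alias):

```
section: screens
    desktop:
    laptop:
end

section: links
    desktop:
        right = laptop
    laptop:
        left = desktop
end
```

With this, moving the mouse off the right edge of `desktop` puts it on `laptop`, and vice versa.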
Jeff, passive DVI to Displayport adapters will not work with Eyefinity. The reason is that the video card, as far as I understand it, has only two timer units to power two DVI/hdmi outputs, respectively. The DisplayPort gets processed in the DisplayPort capable monitor (or in an active adapter through a powered converter unit that translates the DisplayPort data stream back into a timed DVI signal).
Of course, I only learned and researched this after buying a passive adapter and noticing it didn't work. So this is here telling people to save those 20 bucks and go for the active adapter or a real DisplayPort capable monitor instead. Personally, I then proceeded to buy a Lenovo L2440x monitor as my new "center piece", and have been happy using three screens ever since.
Btw. - while Eyefinity requires all screens to have the same resolution when you "span" the screen over all three displays (instead of using them as three individual displays as usual), playing Dragon Age: Origins at 4800x1200 totally rocks. Runs reasonably fast on a 5770, too, so for the financially conscious, you get fair enough 3D speed even across three screens at $150 here.
For those of you who don't have the hardware skills or are laptop bound, there is a nice multiple display technology called DisplayLink. You add displays by just plugging an adapter into your USB port. It's fine for day to day use or PowerPoint but the technology is not quite capable of hi-def video.
I use an IOGear device that allows me to connect a shared DVI device (projector or HDTV) to USB or an Ethernet port. A small TSR allows network users to take turns sharing the display device, either duplicating or extending their desktop.
(Mac users are supported too!)
Jeff, are you gonna sell your old card on eBay ?
How do I find you on eBay ? I'd like to be a bidder.
Am I the only one here who ever programmed without a monitor?
I started on TTYs with scrolling yellow paper. When I got to college it was punched cards and line printers. I remember what an improvement it was to move to a system in which I could submit the deck myself, using a remote card reader, instead of handing the deck in at the counter, and waiting for an operator to 1: load the deck into the system, 2: schedule it to run, and 3: burst the output and put it in the pickup bin.
How is one video card plus an ugly external powersucking dongle more "simple" than two video cards?
I have 10 engineers working for me and I got them all dual monitors, which they swear by for productivity, but personally I have to admit that I just don't get it. I'm using a single 22" monitor for development, and even that seems too big for me. I'm happiest with a 14" laptop. Whenever I have tried dual monitors I end up with a sore neck from looking side to side, and sometimes I even feel motion sickness (especially on sites like Wikipedia where they don't limit the width of the text across the screen). I can't imagine using dual or triple 30" monitors. I'd have to put them on the other side of the room. I'm pretty sure I'm the only software engineer on the planet who would turn down extra screen real-estate.
Hey Now Jeff,
Great to read about 'As good as two monitors is, three monitors is even better.'
Coding Horror Fan,
"I plugged it into my system's nearby rear USB ports which do tend to deliver more power (they're closer to the power supply, and have short cable paths)."
Jeff, I would love for you to explain to us how a "shorter cable path" delivers "more power".
Be careful with Synergy: its protocol is not secure. You can secure it by sending it over an SSH tunnel, though (which is what I do for my Linux/Linux/XP/OS X setup).
Jeff, I think you've said before that you use 2048 x 1152 pixel resolution monitors. Is that correct?
When I first heard of Eyefinity, I was excited that card makers were finally taking triple monitor setups seriously.
But, then I learned of the DisplayPort nonsense.
Newegg seems to sell only one DisplayPort monitor. It's $185 more than a comparable monitor without DisplayPort! So that's $555 more to get three matching monitors.
The $110 DisplayPort-to-DVI active converter hassle seems to negate the benefits of simplicity and cost-savings that I'd hoped to achieve with an eyefinity card.
>Unless you're a gamer, there's no reason to care
Whoa.....or unless you want to get into OpenCL or DirectCompute programming (which I suspect a lot of your readers will end up wanting to learn, even if they don't realize it right now). If you are one of us, you *definitely* want to pay special attention to which triple output video card you upgrade to.
For those questioning dual monitors: it's awesome for developers.
The main monitor is used for the code editor/development environment, and the second monitor lets me bring up API documentation, which I can read at the same time as viewing my code.
There is a disconnect when you have to switch windows back and forth.
You can also run the program on the second screen allowing you to look at the code while the program is running, and of course, use the debugger on the main screen.
As a teacher I have to write a lot of documentation, and having my editor open at the same time has also improved my productivity.
I haven't seen as much value with a 3rd screen yet and probably never will as my desks are not large enough to hold that many.
Resistance in a wire increases as the length increases. See Ohm's Law.
a) Ok, tell me then how much of a difference in impedance you expect to see in 6" worth of 24 AWG wire as compared to 12" of the same wire.
b) You should probably look at Ohm's law. It has nothing to do with wire impedance. Or perhaps a lesson in sentence structure and context would suffice?
USB operates on DC power, so impedance is the same as resistance. Based on a table of resistivity for copper wire I found on Google, resistance would increase from 0.0125 ohms to 0.025 ohms for those lengths.
And what kind of differences in voltage drop do you expect to find across those two lengths of wire? What effect would they have on current?
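The numbers in this exchange are easy to check. A back-of-envelope sketch, assuming 24 AWG copper (~0.0842 ohms per meter) and a 500 mA load -- the maximum a standard USB 2.0 port supplies; both figures are my assumptions, not from the thread:

```python
# Rough voltage drop across one conductor of a USB cable,
# for the 6" vs 12" lengths debated above.

OHMS_PER_METER = 0.0842  # 24 AWG copper, approximate
CURRENT_A = 0.5          # 500 mA, max standard USB 2.0 load

def drop_mv(length_inches):
    """Voltage drop in millivolts across the given wire length."""
    length_m = length_inches * 0.0254
    return length_m * OHMS_PER_METER * CURRENT_A * 1000

for inches in (6, 12):
    print(f'{inches}" of wire: {drop_mv(inches):.1f} mV drop')
```

Either way the drop is a few millivolts against a 5 V rail, which supports the point that cable length inside a PC case is unlikely, by itself, to explain a flaky adapter.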
Those people talking about how virtual desktops are better or even that a 14" laptop is optimal must not have done the type of development where multiple monitors are so useful. In my setup I normally have my editor, a browser, a few shells, and a DB gui all up and visible at the same time, with many other apps popping in and out over the day as needed. When doing web development there are so many times when I'm rapidly cycling between the editor and the browser that there's no way that someone with less screen real estate could possibly be more productive than me.
HDMI == DVI. We have a number of dual-monitor workstations that use the HDMI out for one of the displays. An HDMI to DVI cable is approx $5.
Jan- For example, the HP 2338h costs $239, and has HDMI, and 1920x1080 resolution.
That is not $100 more than the competition (especially factoring in HP's industrial design and build quality - you can get a very cheap no-name for about $80 less... and exactly what you deserve along with it).
HDMI is not an expensive thing. HDMI and DVI-D are signal-compatible, as Duncansmart says.
On Jeff's post - HDMI is indeed aimed at the AV market as well as the computer market, but note that HDMI shouldn't really be thought of as a replacement for S-Video or Component; it's a digital signal at every level, lacking even DVI's (in the DVI-A flavor) analog capability.
Pure digital, which is why DVI-D/Mini-DVI and DisplayPort/Mini-DisplayPort adapters are cheap and easy.
(And on the DP-DVI adapters... it's kinda odd to call the cable "analog", since DP and DVI-D are digital data. There's no analog conversion process.
I suspect the cheap one simply requires the DP provide a DVI/HDMI signal, which DP is capable of. I suspect also that the expensive one has it use the native DP signal and processes it into the dual-link DVI.)
Here is what I do with my three 1600x1200 displays. I'm a Unix sysadmin for a company that manages thousands of systems for a few dozen customers. I have my web browser (documentation) on one; my monitoring program, ticket management program, and as many as 12 terminals (and when I RDP to a Windows box, I usually put it here) on the second; and chat, email, and Notepad++ for notes and such on the third display. I really could use a 4th display so that I could leave the monitoring app visible all the time.
So yes, 6MP, and I know how I could use 8.
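Those megapixel figures check out as rough rounding; a throwaway calculation, nothing more:

```python
# Desktop area in megapixels for N 1600x1200 displays.
def megapixels(n, w=1600, h=1200):
    return n * w * h / 1e6

print(megapixels(3))  # three displays: "6MP" after rounding up
print(megapixels(4))  # a hoped-for fourth: roughly "8"
```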
I have long been a fan of dual monitors for software development. Recently though I started working with a single 30" monitor and I like it much more than the dual monitor configuration.
I really like being able to have two things open side by side, an IDE window and a browser window, Xcode and the iPhone Simulator, etc. You just can't do this on a single monitor smaller than 30" but at this size and resolution I find it's easy to fit two windows side by side. For me it has the same benefits as dual monitors but also allows for a simpler hardware setup. I also end up spending some days using my laptop and some using my desktop and it's very convenient to be able to use the same display setup with both. This was something that was not workable with the dual monitors as the laptop doesn't have a great way to support two external displays.
Going to three monitors is very simple and cheap using the following USB solution: http://www.evga.com/products/moreInfo.asp?pn=100-U2-UV16-A1&family=USB
It works well although I strongly suggest connecting digitally. Currently I am using this device to drive my "primary" desktop monitor and it keeps up with everything ... including very intensive graphics.
The different ports are very annoying; there is currently no out-and-out best between DisplayPort and HDMI.
DisplayPort has the most bandwidth and therefore supports higher resolutions at higher framerates; this may be important for 3D TV (120Hz). It is also usable internally for things like laptops, and has an Ethernet channel, but it doesn't have an audio return channel (for TV to AV receiver or similar), the xvYCC colour space, or Consumer Electronics Control signals -- things that HDMI has. HDMI, however, is more expensive in royalties and manufacture, and is not usable internally.
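The bandwidth gap is easy to quantify. A sketch of the raw pixel rate for a few modes, assuming 24 bits per pixel and ignoring blanking intervals and link-layer overhead (so real links need somewhat more than these figures):

```python
# Raw video bandwidth for a few display modes, 24 bits per pixel,
# ignoring blanking and protocol overhead.

def raw_gbps(width, height, hz, bpp=24):
    return width * height * hz * bpp / 1e9

modes = [
    ("1920x1200 @ 60Hz", 1920, 1200, 60),
    ("1920x1080 @ 120Hz (3D TV)", 1920, 1080, 120),
    ("2560x1600 @ 60Hz", 2560, 1600, 60),
]
for name, w, h, hz in modes:
    print(f"{name}: {raw_gbps(w, h, hz):.2f} Gbit/s")
```

For context, single-link DVI tops out at a 165 MHz pixel clock (about 3.96 Gbit/s of pixel data), while DisplayPort 1.1 carries up to 8.64 Gbit/s of video over four lanes -- which is why the 120Hz and 2560-wide modes need DisplayPort or dual-link DVI.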
The latest generation of DisplayLink USB-to-DVI adapters, which came out in November 2009, handles multiple monitors up to 1920x1200 with 32-bit color and no stutter whatsoever (that I can tell). They're plug-and-play and work with laptops and Macs. I have these Kensington adapters http://us.kensington.com/html/17534.html driving two big monitors, with a third on DVI. I'm not a gamer, but I can't tell the difference performance-wise between the screen that's on DVI and the two that are on USB.
I have a 5x1 grid of 1920x1200 24" Samsung monitors pivoted vertically, which makes a 6000x1920 11.5 megapixels workspace, running off very cheap and silent video cards: 2x Asus EAH3650 PCI-E, 1x Asus EAH4350 PCI (five DVI's all up).
Five of these monitors cost less (in Australia) than one 30" Dell which only has 4.1 megapixels. The only problem with portrait monitors is ClearType doesn't work properly. Tuning does improve it, but it's not as good as a landscape setup.
Visual Studio is so much better at 2400x1920, and having dedicated locations for Firefox, Windows Explorer, SQL Management Studio, AutoCAD, Skype and Messenger with room to spare definitely saves me time every day.
HDMI is NOT the same as DVI, as it does not support all the resolutions, eg 1920x1200. With the ATI cards I believe you have to use the DisplayPort if you want to enable a third screen; if you use the HDMI port, you lose one of the DVI ports.
@Felonhead - "How is one video card plus an ugly external powersucking dongle more "simple" than two video cards?" Same thing I was thinking, and not much price or power consumption difference with the active adapter.
@Masterplansoftware - I'm thinking the same you are, 0.0125 ohms to 0.025 is not enough to make very much difference in USB current between front and rear ports, unless the PC case is over 5 meters long. ;-)
That new card is very cool though.
I'm an nvidia user and I've owned pretty much every series up to my 9600 GSO. They've all been 2 (or 1) output. But every GeForce 210 I've seen has 3 outputs. My first 3-output card was an ATI HD4350 for my HTPC and "Detect Displays" in Win7 actually showed 3 boxes. So 3 monitors with 1 card on a cheap budget is doable nowadays. I wouldn't pay more than $30 for a workstation card (or HTPC card) but that's just cheap me.
I am working on a 4-VGA workstation with Ubuntu and nvidia cards at the moment. 3 PCI 8400GS @ $40/ea and 1 PCI-E 8400GS @ $15, each with 1 DVI + 1 VGA. So there's room for more DVI/VGA monitors. Link: http://rwong.wordpress.com/2010/03/30/i-caved-in-to/
I use multiple screens, but on multiple systems at once.
On the MacBook Pro I always have a second screen attached. Then I sometimes use ScreenRecycler http://retep.net/aT0qA9 to add a third screen - it works via vnc and works pretty well.
Then most of the time I also use synergy http://retep.net/aqJ2RP to connect multiple linux boxes/laptops from the mac.
All this really means is that I've always got at least 4 screens accessible at any one time.
I have 2 wide screen monitors at work and it is nice. I don't know how essential it is though.
At home I have a single monitor and use the compiz cube to switch between workspaces and that works pretty darn good. The cube adds a wow factor too.
Definitely, wide screen is better than square.
I ran into the multiple monitor conundrum with my MacBook Pro. I was incredibly hesitant to use the USB -> DVI solutions, but I found one in particular that works pretty well. I'm able to watch Netflix (not quite full screen), Code with XCode, and display my IRC chats and Adium contact list without running into noticeable frame drop issues.
It is my recommendation for those of you who don't want to/can't buy a new-fangled video card.
Just don't expect to play any 3D games on that monitor.
If you get sick reading long lines on Wikipedia: there is no law that says that you always have to maximize your browser window. In fact, one advantage of having a big monitor (or two or three) is that you can actually see more than one window at a time!
This is absurd. I only need 1 physical monitor and 8 to 10 virtual desktops. I can't read 3 or even 2 monitors simultaneously, so hitting ctl-f1, ctl-f2, etc. to switch between different "screens" is perfect. I don't even have to turn my head.
I'll also argue that you are less productive because you're trying to focus on more than one thing at a time.
Michael: HDMI interfaces can support resolutions well beyond 1920x1200. (Not all of them do, of course, just as some sources still only output 720p... but that's not in the HDMI spec itself, it's a limitation of the source.)
Some monitors and video cards and driver combinations don't auto-negotiate resolutions like that properly (requiring futzing and manual setup), but there's nothing about the HDMI interface, signaling, or cabling that makes it incompatible or not supported.
The HDMI spec (and the fact that a cable adapter alone will work) assures us that the video signal from an HDMI port is identical to that from a DVI-D port.
The HDMI spec, in fact, requires full compatibility with DVI 1.0 (HDMI spec v1.3, Appendix C).
So I think it really is fair to say that "HDMI is the same as DVI" in this context... with the single caveat that some drivers for video cards aren't very good at detecting resolution capabilities of some HDMI sinks. But it's not a limitation of HDMI itself.
You actually CAN be too thin, it's called anorexia.
Apart from that, I agree: you can't have too much screen real estate.
My workplace sports two 30-inch Dells, and the only things keeping me from getting a third one are a) the heat another 30-inch would produce, b) the cost, and c) the fact that I would lose desk real estate that is currently occupied by a phone, a MacBook, and some audio hardware.
Jeff: Here is what to do.
Plug your stuff in like this:
DVI-1: DVI cable to LCD1
DVI-2: DVI cable to LCD2
HDMI: HDMI-to-DVI adapter at the card, then a DVI cable to LCD3
HDMI: or an HDMI cable from the card, then an HDMI-to-DVI adapter at LCD3
Use an HDMI to DVI cable or adaptor; they are dirt cheap, ranging from $0 to $20. Check your LCD's DVI cable end to choose the proper adaptor.
HDMI mainly exists to allow easy plug-in to an HDTV, so that your HDTV setup isn't using 2 cables to connect digital sound and digital video.
HDMI acts as a DVI cable and an SPDIF cable in one.
DVI is a mute digital signal, without sound, at a certain resolution and refresh rate.
VGA, meanwhile, is an analogue signal at a certain resolution and refresh rate. RCA, S-video, and Composite carry the same analogue NTSC/PAL signal over different cables.
Your LCD normally receives a muted digital signal at a certain resolution from the DVI port.
The resolutions you can choose on an HDMI port depend on the receiving hardware's compatibility. I mean that if you select an output resolution for your HDMI port that's compatible with your LCD monitor, it will work.
Some projectors only have HDMI ports but don't use HDTV-compatible resolutions. Some HDTVs accept 1024 by 768, others only understand HD resolutions, etc., so your video card should be versatile.
Hope it helps.
There is a lot of video blur going on.
This story reminds me of the words of a New York City investment banking vice-president (quoted on page 16 in the ICP Interface Administrative & Accounting August 1983 issue) as he explains how using a Compaq portable, an IBM PC, and an Apple III all at once provides him with truly multiple windows and literally split screens: "It's like Lisa, only separate machines are better".
Sigivald: That's a good explanation, thanks.
David Durham: Don't knock it till you try it. The point with multiple screens is that you don't need to hit Ctrl-F1 or Alt-Tab. I often have two instances of API help open, two or three source files, some sample code on a website, and several other programs all at once to solve one problem. Clearly, your eyes don't focus on everything all at once, but it's like driving. In a car you have a windscreen, two side windows and three mirrors all providing different but very important information. You might even have to turn your head, which I don't see a problem with, it's good for you. But, you're still really only focussed on one task: driving.
1) DisplayPort, HDMI, DVI1, DVI2 - I've counted four outputs. Which two are for the same signal?
2) Do new monitors have DisplayPort?
Interesting article. I'm stuck with "only" two monitors at the moment but one thing that I have found is: If you are designing an application for end users (either web based or desktop) then one of the monitors should be no bigger than the average size used by the target audience. UIs that look great on a 1600x1200 monitor often just don't cut it on a 1280x800 screen, and many users still have smaller screens.
My understanding is that the evolution of display connectors is something like this:
Originally there was VGA, which is a purely analog RGB signal. DVI is essentially a digital version of a VGA signal, where analog voltage changes are replaced with digital values. This is due to the rise in prominence of digital flat panel displays, where the digital signal is "native" to the underlying hardware in the same way that an analog signal is "native" to a CRT. DVI also includes pins for analog VGA signal for backwards compatibility.
HDMI is basically DVI with analog VGA compatibility removed, audio channels added, and a more consumer-friendly (some would say flimsy) connector. DVI and HDMI are nearing the end of their evolution because the technology cannot support further increases in the maximum resolution.
DisplayPort is significantly different. It is a high bandwidth packetized digital signal with significantly more headroom to support higher maximum resolutions. Also the packet approach allows for alternate transmission media like fiber optics, as well as additional data streams such as multiple video signals, audio streams, or even internet packets.
That said, it is going to take a while for DisplayPort to gather momentum. Considering how long it took for USB to be fully established as the standard connector for input devices (introduced in 1996; computers as late as 2002 often still had only a single USB port and came with PS/2 mice), new standards take a long time to displace the existing installed base.
"Dual Monitors" since 1983 when I hooked up a terminal as "glass printer" and extended the BIOS to hot-key output between PRN and COM1. See the first photo here. http://bit.ly/dcmYXu (The primary monitor is off photo.) The "2nd monitor" was only slightly less functional than the primary because my DOS 1.0 Eagle 1610 was pre-graphics. Be sure to note all the space available to spread out listings and documentation. The home-built bins for wide printouts behind me are unusually empty. BTW I still have the hat. It's quite a strange feeling to work with people younger than some of my clothing... ;-) (I still have the son too but he doesn't sit on my lap and watch me code anymore.)
Crude as it was, the 2nd monitor was wonderfully productive. I'd typically Print-Screen the C variables for reference, then scroll down and edit. The "glass printer" helped me get more done in less time as well as stay in flow. Even in 1983 I knew more screen space would improve my productivity. Lackus Fundus prevented a 3rd monitor -- for mainframe access I'd already spent $1000 on the terminal and $300 on a used 300 baud modem.
No photos, unfortunately, but I had dual monitors in Win 3.x. A card made the computer think it had one large monitor but the card knew it had two monitors. The monitors had to have identical specs.
The four right-hand monitors in the 2nd photo belong to my main dev box of the era. It took a long time to get its 2 framegrabbers and VGA adapters to work at the same time, but the pixels were worth the hassle. Pre-USB imaging was a pain and I miss it not at all.
The last photo is of my 7-monitor laptop. I find 2*(1920x1200) + 5*(1600x1200) a very productive development platform (for a laptop). The photo is not a setup for the camera; it's early in a workday while there is still a little free space on my "reference" monitors for non-maximized windows. Similar to how I keep my clean undies and socks in a fixed location and can usually dress successfully before that first A.M. cup of coffee, I know exactly the position of the perhaps-overlapped window I want and can click it to the front before anyone could alt-tab there. It's great to have the space for a separate Explorer window for each folder of interest. I admit that almost never having to scroll the Locals window ain't bad either.
No photos of my current 10-monitor system, but I wrote about it when it had but 9 for my woodworking buddies. http://lumberjocks.com/projects/28014
UltraMon profiles let me enable and disable monitors in bulk. I don't use more than I need. Other than testing that all 10 monitors work I have yet to use more than 6. When I get to the documentation / VM testing phase no doubt I'll use them all and wish I had more. ;-)
Mark Jerde: And they tell me 11,520,000 pixels is excessive! I concur with the overlapped non-maximised windows on the miscellaneous screen; that's exactly what I do. I'm glad you qualified 7 monitors being a very productive development platform for a laptop. (LOL!)
You might like to try the QTTabBar tabs plugin for Windows Explorer. (Last time I checked it didn't work properly with Windows 7 Libraries, but for XP and Vista it works great.)
I'm with David Durham. At my last job we got two monitors and I thought I could never live without them. So when I switched jobs and only got one monitor, it took some getting used to again. However, we use Macs at my current job. Among OS X's many great features is Spaces (yes, I'm aware Windows has a similar feature).
Since I started using Spaces, the need for multiple monitors has virtually disappeared; switching between spaces has taken the place of multiple monitors. Web browser and Terminal on 1, TextMate on 2, database on 3, all in fullscreen mode or close to it. It's all convenient keyboard shortcuts, so switching takes basically no effort, either physically or mentally. One advantage of this setup is that when I am using my laptop without a monitor, I don't feel like I'm sacrificing anything. Plus, you can't physically focus on two monitors at once anyway.
After experience with both, and liking both, I would choose virtual desktops over multiple monitors.
I use three monitors with two cards without a problem. Very simple setup.
I did buy one of these SAPPHIRE TECHNOLOGY RADEON HD 4850X2 1GB GDDR3 PCI-E QUAD-DVI
Which in theory was great, and the price was not really different from two cards. However, my workstation was unable to power it. It has no problem with two individual 4850s, but this monster needed so many power cables and just caused random crashes because of the juice it pulled on my standard PSU. Bear in mind this is a high-efficiency PSU in a fairly new HP workstation. I wasn't about to upgrade the PSU, so I went back to two cards!
I personally like 4 monitors...perfect for coding.
I've got a 4 monitor setup running on my Win7 machine with a single video card. No cables or other stuff. I've got an ATI Radeon HD 4850 X2 that has 4 DVI outputs. Does the job rather nicely.
I've been using multiple monitors since I put a monochrome card in alongside the CGA card - they used different address spaces.
I now have three displays on my MacBook Pro, the third running off a DisplayLink DL 195 adapter. I also use Spaces. The difference is:
three monitors allow complex and multiple related displays, e.g. score on the monitor over the keyboard, DAW on the large monitor, and property sheets on the MBP monitor.
Spaces - use for unrelated human multi-tasking, one space for office stuff (email 3tc), one for studio, one for programming.
As a couple of others have already noted, these little USB adapters do a pretty good job. But note that DisplayLink is an OEM chip company; the actual devices you will buy are sold under other names. Also be aware that there are two distinct flavors: the cheap ones based on the DL 165 and the slightly pricier ones based on the DL 195. The difference is in maximum resolution. Many options on eBay or Amazon; search on "USB Display Adapter". Be sure to buy only from a seller that includes the specs and the chip type used - not all sellers do.
Thank you for telling me about the USB adapters. I was going to buy a nettop and install a Radeon 5440, but if I can use the adapters I can just use the Mac Mini I already have to run it all. :)
And while Spaces is good, it doesn't replace what I used multiple monitors for before, which is just having code on one screen and the API on the other, or work on one screen and random internet browsing on the other, or OpenOffice on one screen and the internet on the other, etc. I do use Spaces a lot, though I tend to max out at 4 spaces (GarageBand, OpenOffice, Xcode, assorted small programs like AIM/iTunes), and at certain times I'm flipping between spaces a bunch and not able to read on one screen and do things on the other like I'd like.
Don't some motherboards come with embedded video and DVI out yet? But I guess that gets into running different video cards and thus drivers, which can get messy.
I have two extra monitors plugged into my laptop using the Matrox DualHead2Go break out box.
I find it great for developing database driven web applications because you can have your IDE, Connection Manager and web browser all on separate monitors; my productivity shot up as soon as I had more screen space.
I use three monitors: laptop monitor, second monitor for the laptop on the right and my Linux box hooked up to the third monitor on the left. I have a script on my laptop that lets me copy and paste a URL in and it opens on the Linux box. I'd eventually like to make it into a little drag-and-drop icon so I can just drag on a file or a URL or a load of text and it'd display it on the third monitor.
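For anyone wanting to build something similar, here's a minimal sketch of the copy-a-URL-to-the-other-box idea. It assumes passwordless SSH to the Linux machine (the hostname "linuxbox" is a placeholder) and that `xdg-open` is available in its X session:

```python
# Sketch: open a URL on a second machine's display over SSH. The hostname
# and DISPLAY value are assumptions; adapt them to your own setup.
import shlex
import subprocess
import sys

def remote_open_cmd(url, host="linuxbox", display=":0"):
    """Build the ssh command. DISPLAY must name the remote box's own X
    session (not a forwarded one) so the window appears on its monitor."""
    return ["ssh", host, f"DISPLAY={display} xdg-open {shlex.quote(url)}"]

def open_on_remote(url, host="linuxbox"):
    subprocess.run(remote_open_cmd(url, host), check=True)

if __name__ == "__main__" and len(sys.argv) > 1:
    open_on_remote(sys.argv[1])
```

The same command wrapped in a small drag-and-drop shell or desktop launcher would give the icon version the commenter describes.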
It won't be long until all programmers get issued with this kind of setup upon graduation/initial employment:
The Eyefinity ATIs go all the way down to the 5450 for a truly entry-level 3-monitor video card ($45). Since I am only using it for coding, it works for me. You still need to use the DisplayPort output to get all three monitors going, though.
I've been a proponent of Three Monikers For Every Developer, since the 90's
You can call me g0thX, PHB, or of course, Sam
Hooray, there's finally a GFX board capable of connecting more than two monitors!
What about Matrox? They've been in the business of triple-head graphics solutions for years now. Last time I looked, they had a 4-DisplayPort card on offer. No, wait, now I see they've got an *8*-port card...
At work, I only have a laptop so in order to implement three monitors (2 external) I picked up the Sewell Minideck (USB-to-DVI):
This device uses an internal DisplayLink chip to enable resolutions up to 1920 x 1200. Since most of these devices stop short of that res., I opted to spend the extra for it.
My current setup at work: http://twitpic.com/168fak
They issue the external 19" monitor to all developers, but I opted to pick up the 28" on sale last Black Friday as my new primary. I should have picked up 2.
I also bought an ATI 5970 for the new home desktop, but I've only the one monitor there so far.
Michael Csikos: My tone may have seemed a bit off. I'm just like that. Have you tried virtual workspaces? I don't know how it is on windows, I'm a linux guy. In the land of X, virtual workspaces have been around since "the beginning," when smart people realized they needed "multiple monitors" but the things were so darned expensive and heavy that it just was not practical to do so. Some things have changed, some things have not.
I'm certainly not going to buy 8 monitors, and that's how many "spaces" I need. So what would I do with 2 monitors? 1,2,3,4 are on monitor 1 and 5,6,7,8 are on monitor 2? Maybe. It could work. Maybe this isn't absurd :)
After I ditched my 19" CRT that I bought for gaming a few years back... my Sammy 22" was lonely... so last week I decided to order 2x Sammy 943's!
http://twitpic.com/1ixims (Currently have them the other way around)
I also ordered a 8400GS for my old 22" (226BW) but the brightness didn't match with the new 943's... so I've decided to just use the 2 for the time being. Multiple monitors FTW.
So, along with ATI SurroundView, would I get a total of 5 displays (2 on builtin + 3 on external) this way? :) Sounds interesting!
I realise that the chances of you reading this comment are slim, but I'm going to try anyway. I'm a software developer and for years I've worked with only one screen. Years ago on 1024x768 without any problems, then on 1280x1024 and now on 1680x1050. Both at work and at home, with no problems at all.
I once tried to work with two screens but I found that it took a lot of space on my desk and I didn't really like that.
But most of all, I didn't know how to work with multiple screens productively. When you have the time, could you please dedicate a post to *how* to actually *work with* multiple screens productively?
Multiple monitor setups work best, in my opinion, with 4:3 monitors. I don't like widescreen for these kinds of setups. I have a dual monitor setup at home (2x Syncmaster 930bf @1280x1024) and a triple monitor setup at work (same monitors, but triple).
I tend to work on the right one, not the middle one (for the triple setup, you smart-asses). My VS2010 window layout looks like this: http://slash14.nl/cndsa
For applications that don't make extensive use of multiple monitors (like word processing, browsers etc.) I usually use a monitor to keep my Email open all the time (Outlook, left screen), some application I regularly switch to but am not working on primarily on the middle screen and the application I'm working with on the right screen.
As an electrical engineering student, my mouth kinda fell agape at the 'USB gets more power closer to the PSU' comment. Then I lol'd.
Then I realized that I, an electrical engineer, belong to one of the worst yet most prolific groups of programmers out there. We write device drivers, which are known to cause many if not most kernel panics across all OSes, so really I have no room to talk here on a programming blog.
Still, we fail at programming (I'm reading this blog to try to get better), and sometimes programmers fail at electrical engineering :-P
(PS the power delivered to USB devices is standardized in the specification)
Like Dale Brett already mentioned, Matrox has had Triple- and DualHead2Go for some years now: http://www.matrox.com/graphics/en/products/gxm/th2go/
Personally, 2 monitors usually suffice. Recently I started to make use of my Nvidia graphics card's nView feature: multiple desktops. I divide my desktop into workspaces - one for design stuff and one for programming, for example. It's working out pretty well thus far.
For developing with Delphi I also wanted to use three monitors, but the old graphics card would only support two.
So I chose the Radeon 5770, paired with the Apple active adapter (http://store.apple.com/us/product/MB570Z/A). All well and good except for one thing.
Apple has chosen to use Mini DisplayPort, but the Radeon 5770 only has the normal-size DisplayPort interface. After searching around the net for an adapter, I finally came across this one: http://www.dinodirect.com/Forum/SKUA17220000A
When it arrived I was a little worried whether it would work or not, but after plugging it in, it has been working flawlessly ever since.
One minor issue is that the monitor connected with the Apple adapter sometimes needs to be turned off and on to pick up the signal from the computer.
Another option would be the active adapter from Dell, which will do the same without the extra adapter, but where is the challenge in that :o)
I tried for many hours to achieve the above setup with older systems but hit a lot of problems, and when I sought assistance on the web I found out many others had hit problems too. For the benefit of others, this is how I set up an MSI MS-6728 (MSI 865g/p/pe neo2) with 3 monitors – using NVIDIA AGP FX 5600 & PCI FX5200 video cards. I hope this helps.
A pdf of the solution, with pictures and screen dumps, is here ... http://tinyurl.com/25b79dl
The outline is:
2. Video cards
1. Have only the AGP video card installed, with 2 monitors connected.
2. Delete all references to the existing Nvidia drivers (use Control Panel > Add/Remove Programs, or CCleaner).
3. Restart the computer – do not let Windows search for Plug-and-Play drivers.
4. Restart the computer and enter the BIOS with the DEL key. Make sure PNP/PCI Configuration – Init. Graphics Adapter Priority is set to AGP/PCI as per this photo… Save on exit by pressing F10.
5. Restart the computer – do not let Windows search for Plug-and-Play drivers.
6. Download and install Nvidia driver 93.71 (93.71_forceware_winxp2k_english_whql.exe) from this site: http://www.nvidia.com/content/DriverDownload-March2009/confirmation.php?url=/Windows/175.16/175.16_geforce_winxp_32bit_english_whql.exe&lang=us&type=GeForce. DO NOT USE THE LATEST DRIVERS – THEY WILL NOT WORK.
7. Restart the computer – set each monitor to its native resolution (desktop / right-click – Properties – Settings).
8. Turn OFF the computer.
9. Install the PCI video card and connect the 3rd monitor.
10. Restart the computer – do not let Windows search for Plug-and-Play drivers.
11. Install Nvidia driver 93.71 (93.71_forceware_winxp2k_english_whql.exe) again. This will detect the additional PCI video card.
12. Restart the computer – set the 3rd monitor to its native resolution (desktop / right-click – Properties – Settings).
13. Set the monitor order to suit by click / hold / drag with the mouse from this screen…
14. Restart the computer. All should now be working as per the "Desired Result" picture.
If you use virtual desktops, three monitors aren't necessary...
There are exceptions. For instance, I'd never use anything less than two monitors if I'm doing electrical design work or any serious graphics work, because of the number of tools that you need to use simultaneously.
For programming OTOH, one high quality display can easily be enough if you have a good virtual desktop setup.
I currently use Linux Mint with 4 virtual desktops setup. One for coding, one for internet browsing (research/code reference), one for revision control, one for unit testing. If 4 isn't enough I can add more as I need them.
The trick to using them effectively is having a good key combination setup. I use:
* Ctrl-Left / Ctrl-Right to cycle back and forth through the desktops (I think this is the default in *nix).
* Ctrl-Up for the Compiz 'Scale' plugin, which is the equivalent of Exposé on the Mac.
* Ctrl-Down for the Compiz 'Desktop Cube', which shows you the desktops in relation to the current one in a break-out view, so it's easier to see where everything is.
That on a 15" laptop screen with 1920x1080 is perfect for my needs. If I had multi monitors visible at any given time it would only distract me from writing code. It's much easier to focus on one desktop at a time.
My monitor doesn't have DVI input (although it has everything else), so I just bought an $8 cable off Newegg that has DVI on one end and HDMI on the other end.
HDMI is basically just DVI + audio.
Is that a Dell 1800FP in "Multiple Monitors and Productivity"?
It's 2010 and I just got rid of mine. Opted for one 30 inch dell widescreen
Wow... I was just musing on this topic this morning and stumbled on this accidentally. Thanks! My dual monitor setup (two 1600x1200s) is just getting too crowded, since I need both full monitors to display code on, plus another to display the results (i.e. the application I'm building). So I either need to go with one large monitor (1920x1200+) and one modest-sized (1600x1200), or split it up into three displays, since I really don't like switching back and forth. Grrr... dilemma, dilemma.
I'm with @Vincent O'Sullivan - having an extra (2nd or 3rd) "Joe User" monitor running at 1280x800 can be very useful to ensure that your design works well on a constrained browser. This is also a great use for a netbook - they take up very little real estate, have a 1024px width, and their constrained height makes you really focus on what's "above the fold".
As a primarily Windows developer, I find this to be a great use for my Mini 9 Hackintosh - I design for my Windows box and sanity-test in Safari on the Hackintosh. If it looks good on a 24" screen in Windows and also on a 9" Mac browser, then that's good enough for me.
I use three monitors across two workstations (using the same keyboard+mouse via a free utility called Synergy).
Hopefully, this helps someone out.
I had an XFX 5850 for several months, but never used triple monitors because of not wanting to buy a $100 cable to make it possible. Eventually, I figured that $100 wasn't so bad, and purchased a set of 3 matching monitors.
Initially, I purchased this single-link active Displayport adapter.
It's not USB-powered, and as I came to learn, the USB-powered dual-link active DisplayPort adapters are only needed when the monitor hooked up to them is going to run a resolution greater than 1920x1080.
However, my monitor wasn't recognized when using this cable. I figured the cable was faulty, RMAed, and it still didn't work with the replacement cable. On the third attempt, I bought a different brand of cable, an Accell, and still it didn't work. It was then that I contacted XFX, and it turns out the DisplayPort on my videocard was faulty. XFX sent me a replacement, and I can now run Eyefinity setups.
So, just to be clear (as the countless forums and comments I read while I was trying to figure this out were cryptic at best). I have a single XFX 5850 hooked up to 3 monitors (2 with DVI, and 1 DP->DVI) using a single-link, non-usb powered, Accell DP->DVI adapter (http://www.amazon.com/Accell-B087B-005B-DisplayPort-Single-Link-Certified/dp/B004071ZX0/ref=sr_1_1?ie=UTF8&s=electronics&qid=1295730037&sr=8-1) and I am able to run Eyefinity @ 5760x1080, or the typical 3 monitor setup.
I couldn't agree with you more. The last company that I worked for provided me with a computer that had two monitors and it was perfect for running all the reports (I was a data analyst). At my new place, though, the company is a start up and doesn't have the means to provide me with a dual monitor system. I miss having my double system! At any rate, I feel like I'm in a progression puzzle now, trying to figure out how to make everything fit!
I'm going to jump on to the end of a fairly old thread to add - getting your monitors off of your desk makes just as much of a difference.
I've had two monitors at work for quite a while, and just the other day set up a second monitor at home. At work, I have one of those long, L-shaped cubicle desks. At home, I have a 1930s gray steel office desk. With two large monitors, there was far too little desk space.
So I stopped by the local computer store and came home with a pair of flat-screen wall-mount brackets. The ones I got have a jointed extension arm that lets me position them anywhere from 4-12 inches out from the wall, and provides for considerable lateral adjustment.
So now I have them positioned about 8" in from the back of my desk, about where they were before, but with the entire surface of the desk as a usable work area, underneath.
The difference is huge. With the monitors hung from the wall, wireless keyboard and mouse, and the computer itself on the floor in the corner, the only piece of computer hardware that is on my desk is my USB Doomsday Device, which serves as a convenient USB hub for my assorted mobile toys.
It's been a very long time since I had a clean desk, without unavoidable electronic clutter, to work on.