December 13, 2006
After revisiting my
ongoing three monitor obsession recently, I was compelled to upgrade my current
mongrel mix of varying LCD monitor brands and sizes. I settled on three 20"
Samsung 204B panels.
Standardizing on a single type of monitor in a multiple monitor configuration has obvious
advantages in color consistency and uniform sizing. But LCD technology has also
advanced quite a bit since my last LCD purchase. Here's a small chart outlining the relevant
specs of every LCD panel I've ever owned:
| Response time | 25 ms |
Yes, there are minor brightness and contrast bumps. But more importantly, there's
been a huge improvement in
response time. This addresses one of the
key criticisms of LCD monitors:
The liquid crystals in an LCD have to change their
molecular structure to manipulate light, and that's not a speedy process. As a result,
LCD pixels respond much slower than what you may be used to on a CRT monitor, and
that can cause ghosting and streaking, especially at high frame rates.
The pixel response time of LCDs has improved dramatically over the years, but CRTs
still have the edge. What's most worrying about pixel response times, however, is
that LCDs with similar pixel response time specs don't always show the same performance
in the real world. It's really something you have to check for yourself.
That was written in 2002. LCDs are larger and cheaper now, and getting
larger and cheaper every day. The improvement in response time makes watching video and gaming
on LCDs nearly equivalent to a CRT. Worries about dead pixels are generally a thing of the past, too. It's
fair to say that LCDs have won the hearts, minds, and wallets of the public in 2006. It's difficult to find places that even sell CRTs these days.
I'm glad the era of the CRT is finally over. Not just because LCDs are more svelte
and power efficient, but also because LCD monitors are much less finicky than CRT
monitors. Getting great image quality with an LCD is dead simple.
You only need to do two things:
- Always use the DVI port to connect your LCD.
The DVI port is digital, so you get a perfect connection to the LCD every time. Using a VGA port with an LCD is downright pathological-- it means you're converting
the digital bits inside your video card to an analog video signal, then converting
the analog video signal back
to digital bits inside the LCD. You open yourself
up to a whole host of nasty analog conversion issues along the way. Don't do this!
Use the DVI port! If you own a video card
that doesn't have two DVI ports, it's time for a new video card.
- Set your monitor's refresh rate to 60 Hz.
Refresh rate has no real meaning on an LCD, but it can still cause problems if it's set to anything higher than 60 Hz. This ought to be set automatically when you connect an LCD panel, but in my experience it never is. So go in and lock the refresh rate down to 60 Hz yourself.
- Set the display to the maximum native resolution.
LCDs work best at their native resolution-- when there is one pixel on the LCD for every pixel on the screen. If you run a 1600x1200 LCD panel with a 1280x1024 desktop, you'll get a blurry, upsized image instead of the perfect digital clarity LCDs are capable of. The maximum native resolution is usually the maximum resolution available in the display dialog, but double-check your monitor's manual if you're unsure. LCDs should look perfect. If the image looks blurry at all, either you're using an analog VGA input, or you're using the wrong resolution.
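As a rough illustration of why non-native resolutions blur, here's a minimal Python sketch. The helper names are hypothetical-- this isn't anything your video driver actually exposes-- but the arithmetic is the point: a 1280x1024 desktop on a 1600x1200 panel forces non-integer scale factors, so most desktop pixels smear across panel pixels.

```python
def scale_factors(desktop, panel):
    """Horizontal and vertical scale factors from desktop to panel."""
    return (panel[0] / desktop[0], panel[1] / desktop[1])

def is_pixel_perfect(desktop, panel):
    """Only a 1:1 mapping (or exact integer scaling) avoids
    interpolation blur on an LCD."""
    sx, sy = scale_factors(desktop, panel)
    return sx == sy and sx == int(sx)

# Native resolution: one desktop pixel per panel pixel -- sharp.
print(is_pixel_perfect((1600, 1200), (1600, 1200)))  # True
# 1280x1024 stretched to 1600x1200: 1.25x by ~1.17x -- blurry.
print(is_pixel_perfect((1280, 1024), (1600, 1200)))  # False
```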
The digital connection between PC and LCD is about as good as it gets, right out
of the box. Contrast this with the experience of hooking up a new CRT: to get the
best image quality, you had to tweak the refresh rate, the image sizing, the pincushion
adjustment, and a half-dozen other tedious, fiddly little analog settings.
But even with the refresh rate issue largely addressed, LCDs do have a few
other display peculiarities that linger on:
Viewing angle. When viewed from the side, above, or below, images on LCD
monitors become noticeably darker, and colors start to get washed out. CRTs, on
the other hand, can be viewed from extreme angles with little loss in actual picture
quality. Admittedly, there are few areas where viewing angle makes a big difference
for end users, but the limitation is worth noting. If, for example, you want to
watch a DVD on your LCD with a group of friends, everyone is going to have to get
real cozy with each other on the couch to see things properly. Limited viewing angles
might not be a bad excuse to get a little closer to your date, but your buddy who's just over to watch Office Space may object to you rubbing up against his leg.
Color reproduction. Although LCD screens claim support for 32-bit color, the displays themselves often aren't capable of accurately reproducing all 16.7 million colors common to 32-bit graphics modes. With a properly calibrated LCD, a casual user probably won't notice the difference, but the limitation will give graphic designers fits.
Contrast ratio. LCDs are back-lit whenever they're on, which means that
TFT panels have to orient the liquid crystals to block light if they want to display
black. Some light inevitably manages to seep through the cracks, which limits a
screen's ability to display a true black.
Some of these peculiarities are only of interest to hardcore graphic designers.
I'm assuming that most modern LCDs are capable of displaying the full 32-bit range
of color by now. Regardless, color calibration remains
as much an art as a science, and adjusting colors is difficult on LCDs.
I've also noticed backlight problems on every LCD I've owned, including the new
models that just arrived. Blacks are never quite as deep as they would be on a CRT.
And the backlighting is never perfectly uniform. I tend to see gradations in color
and brightness on LCDs where there shouldn't be any. As for viewing angle, this
is more of a problem for LCD televisions than monitors.
In computing scenarios,
we tend to sit much closer to the monitors, and in a fixed location. If you're a
hardcore graphic designer, you can buy
extremely high end, extremely expensive LCD monitors which attempt to resolve
the color and uniformity problems endemic to most LCDs. But these specialty monitors
do nothing for viewing angle, and they tend to sacrifice response time, making the ghosting and streaking problems even worse.
Is it possible to calibrate an LCD? You can get some ideas of what
you might want to calibrate in
CNET's description of their LCD monitor testing methodology. Software like
MonitorTest can even walk you through the process. But the earlier good
news-- that LCD displays generally don't need to be adjusted to produce good image
quality -- is also the bad news here. There's not much you can adjust on
a digitally connected LCD, other than the brightness and color temperatures. But
that's generally enough to
calibrate the essentials: brightness and gamma.
After you're done calibrating, it's time to clean all that dust off your LCD. I've
been wary of cleaning my LCD panels, because I'm afraid of damaging the anti-glare
or glossy (laptop) coatings. Soap and water leaves massive streaks, and caustic
window cleaners are clearly out. I was recently turned on to the Monster ScreenClean display cleaning kit,
which works wonderfully. It cleans almost effortlessly without streaks, and it's
a screen-friendly non-alcohol formulation. It's almost like screen lube.
LCDs still have a way to go before they're perfect substitutes for CRTs.
With the recent industry advances in response time, at least LCDs no longer have any glaring weaknesses. Here's hoping that improvements in viewing angle and backlighting continue to keep pace.
Posted by Jeff Atwood
Which video card(s) are you using to get all three monitors hooked up with DVI?
GeForce FX5200 PCI
GeForce 7600GT PCIe
ATI X1900XTX PCIe
ATI X600 PCIe
It's generally a good idea to use cards from the same vendor so you can install *one* video driver instead of two. Windows Vista requires this to get the hardware accelerated GUI across all three monitors.
Also, if you don't want to futz around with multiple video cards, you can try the Matrox TripleHead2Go.
To clean my LCD's of dust, I just use one of those disposable electrostatic dust cloth thingees. One lasts a long time, and sits quietly in a drawer until I need it. A light wipe, and I'm immediately done -- no drying time, no residue.
I'm partial to the Pledge brand, perhaps foolishly thinking that Johnson is slightly less evil than most of the other players in the market.
Make sure you get a flavor that does not contain a polish, scent or whatever that would leave a residue.
I have a Radeon X700 and two NEC 70GX2 panels. I run one panel from DVI and one from VGA (that's all it has). I can't tell the difference in the image quality on those panels - and I'm *very* picky. I used to believe the DVI is better myth, but I have one anecdotal example sitting on my desk that says otherwise.
What a decade, by 2010 we should've successfully replaced all bulky CRT's with LCD's ...
By 2010 I hope we have SED displays, if they're as good and as cheap to produce as they supposedly are. I'm a stickler for CRT color quality compared to LCD, and I can't wait for a technology (SED) with the advantages of both.
* Note: It's worth mentioning I use 20" LCDs all day at work and at home (general, games, and movies) and I quite like them, but with a 19" CAD-quality CRT as a secondary (soon to be tertiary I hope) display at home. I just want to represent all fronts.
Great article. Why not run an LCD monitor at 75 Hz?
In short because refresh rate has to do with the electron gun in a CRT; LCDs have no electron gun and therefore no refresh rate.
In a CRT, the scan rate is controlled by the vertical sync signal generated by the video controller, ordering the monitor to position the electron gun at the upper left corner of the raster, ready to paint another frame. It is limited by the monitor's maximum horizontal scan rate and the resolution, since higher resolution means more scan lines. Increasing the refresh rate decreases flickering, reducing eye strain.
Much of the discussion of refresh rate does not apply to LCD monitors [because they have no electron gun]. A phosphor on a CRT will begin to dim as soon as the electron beam passes over it. LCD cells open to pass a continuous stream of light, and do not dim until instructed to produce a darker color.
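The CRT limit described in that comment reduces to a one-line formula: maximum vertical refresh is the horizontal scan rate divided by the number of scan lines per frame. Here's a minimal Python sketch; the 96 kHz scan rate is just an illustrative figure, and real CRTs also spend time in vertical blanking, so usable numbers run a few percent lower.

```python
def max_refresh_hz(horizontal_scan_khz, vertical_lines):
    """Upper bound on vertical refresh rate for a CRT, ignoring
    vertical blanking overhead."""
    return horizontal_scan_khz * 1000 / vertical_lines

# A CRT with a 96 kHz horizontal scan rate running 1600x1200
# (1200 visible scan lines per frame):
print(max_refresh_hz(96, 1200))  # 80.0 (Hz)
```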
LCDs do have a refresh rate. To avoid confusion, you might call it an update rate: the rate at which new pixel values are sent to the screen so it can update the display. (You still have to wait for the liquid crystals to respond.) If the monitor is on a digital output you may crank your refresh rate as high as your monitor supports-- in theory everything is smoother. In practice it's hardly noticeable, especially if you're not a gamer. On analog outputs it may be better to use a lower refresh rate to get a crisper signal. But again, if your equipment and cables are good quality, the difference isn't all that noticeable.
Some other comments:
Be very sceptical about stated response time, viewing angle, and contrast numbers. There is no established standard method to measure them, so they may be off by quite a bit. For example, for some manufacturers the viewing angle is the angle at which the monitor can still maintain at least 1:10 contrast; for others it's 1:5. In reality, if you care about colour integrity, the minimum should be something like 1:50 or 1:100. Response times are usually black-white-black transition lengths, with transitions starting and ending at 5/95% or 10/90% luminosity (depending on who's measuring). Transitions from light grey to dark grey can take several times longer.
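The threshold dependence that comment describes is easy to demonstrate. Here's a minimal Python sketch; the luminosity samples are made up for illustration, but they show how the same transition yields a shorter "response time" when measured between 10% and 90% than between 5% and 95%.

```python
def transition_time(samples, lo=0.10, hi=0.90):
    """Time (in sample ticks) for luminosity to rise from the lo
    threshold to the hi threshold. samples: values in [0, 1]."""
    start = next(i for i, v in enumerate(samples) if v >= lo)
    end = next(i for i, v in enumerate(samples) if v >= hi)
    return end - start

# A hypothetical slow ramp, sampled once per millisecond:
ramp = [0.0, 0.05, 0.2, 0.4, 0.6, 0.8, 0.95, 1.0]
print(transition_time(ramp, 0.10, 0.90))  # 4 (ms) with 10/90 thresholds
print(transition_time(ramp, 0.05, 0.95))  # 5 (ms) with 5/95 thresholds
```

Same panel, same ramp-- different spec sheet number, depending on who's measuring.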
Your assumption about most displays supporting 24-bit colour is wrong. Most displays sold today are Twisted Nematic panels, with extremely fast black-white-black response rates. Unfortunately, fast TN panels usually have only 6 bits per colour channel of accuracy and extremely poor vertical viewing angles. Even though the colour inaccuracy is somewhat masked through temporal dithering, if you move your eyes at the right speed you can see pretty annoying dithering. And don't get me started on the viewing angles: if you need to do any colour matching, TN displays are basically useless. Most TN displays have noticeable colour changes over the height of the display thanks to the changing viewing angle.
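The temporal dithering mentioned in that comment can be sketched in a few lines of Python. The framing below (quarters of a 6-bit step spread over 4 frames) is a simplified illustration, not any particular panel's actual algorithm: a 6-bit panel fakes an 8-bit value by alternating between the two nearest 6-bit levels so that the time-average lands in between.

```python
def dither_frames(value8, frames=4):
    """Per-frame 6-bit levels whose average approximates the
    requested 8-bit value."""
    lo = value8 >> 2          # nearest 6-bit level at or below
    frac = value8 & 0b11      # remainder, in quarters of a 6-bit step
    # Show 'lo + 1' on `frac` of every 4 frames, 'lo' on the rest.
    pattern = [lo + 1] * frac + [lo] * (4 - frac)
    return (pattern * (frames // 4 + 1))[:frames]

# 8-bit value 130 = 6-bit level 32 plus a remainder of 2/4:
print(dither_frames(130))  # [33, 33, 32, 32] -- averages to 32.5
```

Flicker between those alternating levels is exactly what you can catch by moving your eyes at the right speed.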
I find panels using S-IPS and PVA technology a much better compromise. They have usable viewing angles and 8-bit colour channels. In the last couple of years the response times have also gotten acceptable to anyone but hardcore gamers. Unfortunately, most manufacturers don't bother to specify the technology used. Nevertheless, you can probably identify S-IPS and PVA panels by their ridiculously high (due to the useless measuring standard) viewing angles of 176 or 178 degrees.
I just spotted that Samsung monitor the other night and finally think I've found an LCD worthy of me. (I'm terribly picky, and colors and refresh rates are really bothersome. Most LCDs just don't look good enough.)
Now I want one.
Are you using all three monitors through DVI?
If the human eye is capable of distinguishing about one million* colors, what's the point of worrying about whether a monitor can display 16.7 million different hues? Isn't that just an excuse to sell bigger disks and scanners that can't be differentiated on actual, you know, *features*?
Admittedly, I'm a programmer, not a graphics and color-theory geek, so maybe I'm missing something. But obsessing about color depths beyond 24 bits strikes me as one of those "if Superman raced the Flash..." pursuits.
* (I've seen this figure a number of places; here's a 30-second Google find: http://www.visualexpert.com/Resources/color.html)
I don't care if my monitor has one million, 16.7 million, or 24 zillion colours. But I do care that I see clearly visible dithering and/or banding with 6-bit displays. 7-bit is not as annoying, but you do need 8 bits per colour channel and a reasonable gamma curve to avoid noticeable artifacts. Actually you need lots more than that if you take into account the full dynamic range of the human eye. But current displays can't display 1:1000000 contrast yet, so for now 8 bits is enough.
Admittedly, I'm a programmer, not a graphics and color-theory geek,
If your job title is "graphics designer", color matching is a critical and essential part of your job. But you're right that for most people, the current level of LCD color fidelity is good enough.
What a decade, by 2010 we should've successfully replaced all bulky CRT's with LCD's, many mechanical HD's with solid state and burning hot single-cores with moderately warm quad-cores. The BIOS (or replacement for it) should support virtualization such that we will be able to use whatever operating system seamlessly next to each other, there will be no cables due to wireless USB and.. and… ok I’ll stop.
I used to believe the DVI is better myth
The analog conversion circuitry in your average LCD is fairly good by now, but DVI is still the way to go. Try the monitor test application I linked and do a side-by side comparison of analog to digital. I think you'll be surprised.
The last time I tried this was with the two 19" Rosewill panels (at the time, my secondary video card had only analog VGA out), and I could easily tell which was which by eyeballing it.
"If the human eye is capable of distinguishing about one million* colors"
My color theory prof told me that, in addition to that, the million colors we can see are not a subset of the 16 million that the monitor displays. It is an intersecting set.
Apparently there are colors that no additive-model display technology will be able to faithfully reproduce.
I always research monitors on Tom's Hardware when I can, because they're the only ones I know of that take a rigorous, scientific approach to measuring all aspects of LCD monitors. That way when you're dithering between five with similar specs, you can quickly weed out two or three that are junk.
Of course, with the huge proliferation of LCDs it's impossible to test more than a fraction of them; they concentrate mainly on high-end, high-quality ones. (Or at least those that claim to be such.) I just have to hope that their close cousins are similar and just slightly worse; in my case VG930 instead of VP930.
I'm a Viewsonic bigot, myself. ;)
HD15 (analog) definitely doesn't have the color exactness of DVI, but you'd only notice if you push pixels for fun and profit. When I switched, I had to swap back and forth in photoshop a few times to find any difference.
How do you orient your 3 20" LCDs? All in landscape mode?
All landscape. Be very careful with portrait mode; rotating the display also rotates the RGB pixel matrix, which tends to break ClearType. Or at least the 3x resolution improvement is now *vertical* and not *horizontal*. Not sure if Vista can deal with this or not, but I doubt it.
There's more (somewhat dated, but accurate) info on the portrait vs. landscape ClearType issue here:
Ian Griffiths entered a comment in that thread, I wonder if he'd want to follow up on that here..
"I used to believe the DVI is better myth, but I have one anecdotal example sitting on my desk that says otherwise."
Further to what Jeff said - if you're running any sort of high resolution monitors (say, 20" or larger?) - you will notice issues. If you're using 17" or smaller LCDs, and you don't have a whole lot of other power cables running around near your VGA cables, you should be fine for most applications.
I've got a Dell 24" LCD Panel and it is very obvious if I try to run this over a VGA link.
Oh, and on the viewing angle thing: Only the cheap/older panels out now will have anything resembling a viewing angle issue.
We've got a "moderately" priced 32" LCD TV at home (and I tried out a whole bunch more in store). The viewing angle is well over 120 degrees side-to-side. Not sure about the up-down angle, but it's well and truly enough that I can lie on the floor about 3 feet from the TV (it's on a side table, about the same height as a regular desk), or stand up, and still watch just fine.
How did you decide on the 204B? IMHO, it has quite a few drawbacks for a programmer's monitor.
We bought ~10 LCDs at work recently and we went with the HP LP2065, at a slightly better price. We considered the 204B and while an excellent monitor for gaming, we found it unacceptable in our tests for development work at our firm.
The main reason is the panel type: 204B is a TN, while the LP2065 is a S-IPS.
TN panels like the 204B have awful angles (especially vertical) which is really obvious when two developers are working together on something on a single screen. Additionally, inaccurate color reproduction means unpredictable results in any GUI / web type of work.
Most of the glowing reviews of the 204B you'll find online come from gamers, for which the fast response time is very important. That is unfortunately close to worthless in programming-type work.
Look for reviews from photographers, artists and web designers, they shun the 204B. Some choose other VERY expensive panels, but others choose the HP - same price, much better image.
The LCD peculiarities you list would've been much reduced if you'd have chosen a S-IPS panel.
And if "always use DVI", how about your previous choice of Matrox's TripleHead2Go? I've been always astonished by people using TripleHead, considering it doesn't work with DVI! Who wastes 3 LCD panels on analog connections?!
For the love of god, don't "calibrate" your monitors "by eye". Buy a hardware calibration device for $100, make a profile once and sell it on ebay. LCDs have no phosphors, they don't burn out, you're unlikely to even need to re-profile. I'm looking at SyncMaster 204T right now and the difference between calibrated and uncalibrated monitor is noticeable with even untrained eye. As a rule, on laptops the difference is DRASTIC. The only way to bring a laptop screen to life is by calibrating it. Choose Gamma 2.2 and native white point for LCDs. With rare exceptions, you can't change their whitepoint anyway, and most panels come close to gamma 2.2 from the factory with minor (but, again, noticeable) deviations.
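The gamma 2.2 target that comment recommends has a simple form: a display with gamma 2.2 maps a normalized input level v to relative brightness v^2.2. Here's a minimal Python sketch of that curve (the function name is just for illustration):

```python
def display_brightness(v, gamma=2.2):
    """Relative output brightness for a normalized input level
    v in [0, 1], on a display with the given gamma."""
    return v ** gamma

# A 50% input level comes out much darker than 50% brightness:
print(round(display_brightness(0.5), 3))  # 0.218
```

This is why calibration matters: a panel whose real curve deviates from 2.2 will crush shadows or wash out midtones relative to what the content expects.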
"If the human eye is capable of distinguishing about one million* colors, what's the point of worrying about whether a monitor can display 16.7 million different hues? Isn't that just an excuse to sell bigger disks and scanners that can't be differentiated on actual, you know, *features*?"
I suspect that the 1,000,000 colors claim is somehow flawed, similarly to the widely claimed "fact" that 300dpi is the highest resolution the eye can distinguish.
The "16.7 million color" LCD monitor spec is based on 24 bits per pixel (bpp), or 8 bits each of red, green, and blue. (IIRC, the extra byte in 32-bit color mode doesn't give any extra color depth; it's essentially just padding for speed.) So 16.7M colors = 256 red * 256 green * 256 blue. That adds up to a lot of colors, but think about this: you only get a range of 256 levels from the purest, brightest red to complete black, and similarly for green, blue, and grayscale.
Take grayscale: 256 colors from pure white to pure black is a good range, but banding is still quite visible, to me at least, on the CRTs I was using back when I was doing some ray tracing graphics programming.
Per Wikipedia, the human eye can (adjusting over time) distinguish about 20 f-stops of brightness (note: brightness, not color-- and that's the 1:1000000 contrast that Ants Aasma mentions). That would require 20 bits of grayscale to cover the full range-- or, in other words, 20 bits each of red, green, and blue: 60bpp.
By that measure, the 18bpp dithered up to fake 24bpp that the TN LCD panels produce seems a bit shabby. FYI: I've noted a trend that the LCD monitors that support true 24 bit color are specced as "16.7M colors", while the dithered 18bpp ones are specced as "16M colors". I don't know if that's always true, though.
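The arithmetic in the comment above is easy to check in Python:

```python
def total_colors(bits_per_channel):
    """Distinct colors an RGB display can natively produce."""
    return (2 ** bits_per_channel) ** 3

print(total_colors(8))  # 16777216 -- the "16.7M" spec (true 24bpp)
print(total_colors(6))  # 262144   -- what an 18bpp TN panel natively shows

# 20 f-stops of brightness is a 2**20 contrast range,
# i.e. roughly the 1:1,000,000 figure mentioned earlier:
print(2 ** 20)          # 1048576
```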
All that said, I have a Samsung 214T myself, and think it looks great, but if I had any kind of choice in reasonably priced 21" CRTs, I probably would have stuck with a CRT.
We considered the 204B and while an excellent monitor for gaming, we found it unacceptable in our tests for development work at our firm.
You're right, and these monitors are at my home, so they need to be good at gaming too.
TN panels like the 204B have awful angles (especially vertical) which is really obvious when two developers are working together on something on a single screen
This is true. The 204B has restrictive vertical view angles. You have to look at the monitor dead on to avoid any subtle vertical color casting. I noticed this with my console window, which has a background of dark green; it appears darker at the top or bottom depending on the vertical angle.
You forgot to mention native resolution. It's important to set your PC to output the native resolution of the LCD display. (And to make sure your video card is capable of driving all your displays simultaneously at those resolutions; this information is often surprisingly hard to come by. Video card manufacturers typically list the resolutions they support but omit limitations such as being able to drive one monitor at 1600x1200 but another at only 1280x1024.)
Also, if you use your display with non-PC consumer electronics devices, be aware that some displays limit which input/resolution combinations work, and sometimes even fail to handshake properly (usually caused by failing to correctly implement standards such as EDID). Like the recent brouhaha with some Sony LCDs failing to provide 1080p as an option over component or at all (until the CE device it's being used with gets a firmware update to special-case the monitor's crappy handshake). There's basically no way to find out this information unless you've implemented video at the hardware level.
Another thing that's great about LCDs is that they don't burn your eyeballs out, since you're looking at a bunch of crystals instead of a tube with an electron gun shooting beams at a screen.
I've spent hours and hours in front of computers since I was 13 and that's probably why I have limited field of view (even though my eye doctor never tells me this or helps).
Plus electricity savings, less space, and better resolution. I'm never buying a CRT ever again. I usually buy my monitors in small sizes due to lack of space, and I think using a 17" monitor is obscene even though I've used one for quite a while-- I still think it's crazy to be looking at a CRT that big so close to your eyes. MM radiation. I was told not to sit too close to the TV when I was little, yet I was glued to something almost like a TV. Does it fuck up your vision? I think so. That's why I'm glad LCDs are so cheap now!
Samsung has really nice monitors. I actually have four of the same LCD model you have on a quad stand. I got it at http://Multi-Monitors.com in 2006. Of course I had to buy a SUPER PC multi-monitor computer as well, to support all of the monitors. Anyway, great website you have here. I've already found some very helpful code snippets!
Me too. "I used to believe the DVI is better myth, but I have one anecdotal example sitting on my desk that says otherwise."
Further to what Jeff said - if you're running any sort of high resolution monitors (say, 20 or larger?) - you will notice issues. If you're using 17 or smaller LCDs, and you don't have a whole lot of other power cables running around near your VGA cables, you should be fine for most applications.
Also, if your office temperature is set at 65 during the winter, a CRT also increases the ambient temperature by a few degrees, which in my opinion is a plus.
Although it's hard to tote around a pair of 21" CRT monitors unless you're Hans or Franz, LCD monitors are much easier to relocate.
How do you orient your 3 20" LCDs? All in landscape mode?
I've been using dual Samsung 204Ts (20") for about a year now, and my PC now has 3 DVI ports, so I'm thinking of adding another LCD. Either another 20" standard-ratio monitor, or a 20" widescreen. Still deciding.
I'm not sure the size/resolution of a NEC 70GX2, but with 20" LCD I do notice a difference between DVI and VGA, I just can't get the VGA to look good.
When I had 17" LCDs, there was no visible difference.
I did spend a day with one 20" LCD in portrait mode and tried coding. 1) The font quality wasn't as good, which turned me off pretty quickly (this is under Linux). 2) I just couldn't get used to it. Even with 2 monitors, I like to have 2 code windows side by side on the same screen, which just can't be done in portrait mode (at least not comfortably).
My wife isn't too happy with your post here, as now I'm shopping for a 3rd large LCD :)