June 13, 2007
In a recent post, Dave Shea documented his love/hate relationship with the pixel grid:
Here's the caveat though -- high resolution displays. At 100dpi, ClearType wins out, but we're not going to be stuck here much longer. Give it a few years, let's do this comparison again when 200dpi is standard. I suspect the pixel grid won't matter nearly so much then.
I was somewhat curious about Dave's claim that in "a few years" displays with 200 DPI will be standard fare. So I did some research to document how far we've come in display resolution over the last twenty years.
I used the Tag studios Monitor DPI calculator to arrive at the DPI numbers in the above table. I couldn't quite figure out what the actual displayable area of those early CRT monitors was, so I estimated about 5% non-displayable area based on the diagonal measurement.
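Here's a rough sketch of the same calculation in Python; the 5% viewable-area adjustment is my own estimate for old CRTs, not the calculator's exact method:

    import math

    def ppi(width_px, height_px, diagonal_in, non_displayable=0.0):
        # Pixels per inch from the pixel dimensions and the diagonal in inches.
        # non_displayable shaves a fraction off the quoted diagonal, e.g. 0.05
        # for an old CRT whose tube is larger than its viewable area.
        viewable = diagonal_in * (1.0 - non_displayable)
        return math.hypot(width_px, height_px) / viewable

    print(round(ppi(512, 342, 9, 0.05), 1))   # 1984 Macintosh, 9" CRT   -> ~72 ppi
    print(round(ppi(2560, 1600, 30), 1))      # 2004 30" Apple Cinema HD -> ~100.6 ppi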
Regardless, it's sobering to consider that the resolution of computer displays has increased by less than a factor of two over the last twenty years. Sure, displays have gotten larger-- much larger-- but actual display resolution in terms of pixels per inch has only gone up by a factor of about 1.6.
I can't think of any other piece of computer hardware that has improved so little since 1984.
Some manufacturers do make high resolution displays, but they're far from common, and very few get anywhere close to 200 DPI. Here's one model ViewSonic was demonstrating in 2002:
This 22.2-inch LCD panel being sold by Viewsonic uses the same panel developed and marketed by IBM last year (T220/T221). The difference is that IBM charged nearly $20,000 for its version; Viewsonic plans on selling this one for around $8,000. That's still pretty pricey -- what makes this panel so special?
Try 9.2 million pixels, for one thing. This 16x9 aspect panel has a native resolution of 3840x2400 pixels. That translates to roughly 200 dots per inch. In fact, you have to put your nose up to the screen to really notice the pixels. Scanned topographical maps could be easily read, even down to the smallest typeface. The monitor is targeted towards specialized image processing and CAD applications, and offers a 400:1 contrast ratio. Driving 9.2 megapixels requires a graphics card with twin TMDS transmitters.
High pixel density monitors are far outside the mainstream. The large versions are prohibitively expensive; the small versions can't justify their price premium over the lower-resolution competition with larger physical size. It's telling that today, in 2007, the Apple store doesn't even sell a single standalone LCD offering over 100 DPI. Nor can I find a single high resolution LCD of any type on newegg. I have no doubt that if I had $10,000 burning a hole in my pocket, I could buy a 200 DPI display somewhere, but at consumer prices and through consumer outlets, high resolution displays simply don't exist.
Most of the time, you see high resolution display options on laptops, where the notebook form factor physically precludes the display from getting any larger. Manufacturers are forced to pack more and more pixels into an LCD panel of a fixed size:
When I purchased my notebook I had a choice of three monitor resolutions - the standard 1280 x 800, 1680 x 1050, and 1920 x 1200. The diagonal screen size is 15.4" giving me the three corresponding pixel densities of 98, 129, and a whopping 147 ppi!
It's hard to see this choice of display resolutions as anything other than a side-effect of laptop size restrictions. If notebook vendors could somehow fit a folding 30" LCD panel into a laptop, they absolutely would. But even at 147 DPI, we're only halfway to our goal. To reach 200 DPI, that same 15.4" laptop display would have to pack in 2560 x 1600 pixels. Imagine a 30" Apple Cinema HD display shrunken by half, and you'll get the idea.
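Running the numbers the other way shows where that figure comes from; this is a rough sketch that assumes a 16:10 panel and ignores any bezel:

    import math

    def pixels_needed(diagonal_in, target_dpi, aspect=(16, 10)):
        # Pixel dimensions needed to hit target_dpi on a panel with the
        # given diagonal and aspect ratio.
        w, h = aspect
        diag_units = math.hypot(w, h)
        return (round(diagonal_in * w / diag_units * target_dpi),
                round(diagonal_in * h / diag_units * target_dpi))

    # A 15.4" 16:10 panel at a true 200 DPI needs roughly 2600 x 1630 pixels;
    # 2560 x 1600 on the same panel works out to about 196 DPI.
    print(pixels_needed(15.4, 200))   # (2612, 1632)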
Short of some kind of miraculous technological breakthrough, I can't see computer displays reaching 200 DPI in "a few years". It's unlikely we'll even get there in ten years. I'd love to be proven wrong, but all the evidence of history-- not to mention typical consumer "bigger is better" behavior-- is overwhelming.
Posted by Jeff Atwood
What is the big deal about DPI?
Don't get me wrong, I can definitely see the benefits when it comes to manipulating images or watching something like an HD DVD, but what about the majority of users who simply sit down at a computer to browse the web, type a paper, view an Excel spreadsheet, etc.? You know, that "average user". What will they do if this becomes standard?
What about the normal "power users", a category in which I include programmers? Is our code supposed to look better at this resolution?
In the end, I totally agree with Jeff here. I do not see ultra-high DPI monitors becoming any form of common any time soon.
"I can't think of any other piece of computer hardware that has improved so little since 1984."
The keyboard? :)
All of the things you mentioned will look better and be easier on the eyes. ClearType and that thing Apple does are far from perfect; things would be a lot easier to read if the DPI were increased. Anything you read should be easier on the eyes; anything you just look at should look nicer, too.
This is a real shame, but the advantage to raising the pixel density is clear - we can remove all the workarounds for low-resolution displays. MS ClearType vs Apple is the current one, but anti-aliasing in general is a workaround for the fact that the smallest display element, the pixel, can easily be resolved by the human eye.
Sitting in front of my 1280x800 laptop screen (13") and my 1280x1024 LCD (19"), the laptop gives a much clearer picture. Scale that up to 19", and I'd either have more pixels to work with, or screen elements of the same size that just looked much sharper... One of my big hopes would be that people would increase their font sizes and save their eyes :)
The users who'll benefit most from super high DPI monitors are graphic designers and photographers.
With digital camera megapixel counts going higher and higher, viewing a picture at 100% without scrolling is impossible, even on the Apple 30" screen. If you've seen digital photos on one of those high DPI screens you'll notice the difference, and you'd dream of having one of them at home :)
How many DPI can we actually see? Check this page out.
Basically it depends on the viewing distance: the closer you are to the screen, the higher the DPI you'll need. The bigger the monitor, the further you sit, hence higher DPI is needed to see a crisp, smooth image.
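A back-of-the-envelope way to put numbers on that, assuming roughly 20/20 acuity (about one arc-minute per pixel); the viewing distances below are just typical guesses:

    import math

    def max_useful_ppi(viewing_distance_in, acuity_arcmin=1.0):
        # PPI beyond which a viewer with the given acuity can no longer
        # resolve individual pixels at this viewing distance.
        pixel_angle = math.radians(acuity_arcmin / 60.0)
        return 1.0 / (viewing_distance_in * math.tan(pixel_angle))

    for d in (12, 20, 28):   # inches: handheld, laptop, desktop distances
        print(d, round(max_useful_ppi(d)))   # ~286, ~172, ~123 ppi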
I'd hope that they can stuff 2560*1600 on a 22" LCD, that would be good enough for me :)
Interesting point. In some regards we have even gone backwards. 5 years ago I purchased a mid range laptop with a 15.4"/1920 x 1200 screen (on sale) for exactly $1,100. I've wanted to replace it for a while now, but every other screen I try out seems hopelessly clunky in comparison. Right now there is only 1 machine on newegg with this combination and it is $2,000. (How did it go up in price?) To get 1920 on a Mac you have to spend $3,000 for the 17", which I'd never spend on one machine. You can argue that 17" is a better size for it, but I like the lesser weight and size of 15.4" and I'm sorry to see that it didn't become the standard.
"The keyboard? :)"
I'd say this one has actually gone backwards. Used to be, you could find completely standard keyboards all day for $10. Nowadays, you have to scour far and wide to find a keyboard that doesn't have some bizarre cursor keypad rearrangement abomination, or uses the small backspace key, or sticks the "sleep, power, hibernate" buttons right next to the cursor keys, or throws a ton of useless "function buttons" at you, or...
Now get off my lawn.
Where Are The High Resolution Displays?
- New MacBook Pro, 133 DPI
- iPhone, 160 DPI
Jeff Atwood wrote:
"the resolution of computer displays has increased by less than a factor of two over the last thirty years"
1984 was way less than thirty years ago (I don't have the equipment to do the exact calculation right now).
The Apple II in 1978 had a resolution of 280x192 on a regular television. On a typical 15" television of the day, that would be about 20 dpi. That was less than thirty years ago also.
I'm all over higher DPI. However, what I am NOT all over is the hurt it is going to put on my system for gaming.
Already takes a 400+ dollar GPU to run on my display with all the eye candy, and even then I see fps dropping as low as 20 even on games that are a couple years old.
A resolution of 3840x2400 makes any current gaming PC cry... even the very top end.
Sure, AA won't matter as much when you have that much pixel density, but even without the burden of anti-aliasing, I'm sure we'd need much more advanced hardware to get playable framerates on games.
One thing that no one has mentioned yet: Prior to Vista, Windows wasn't exactly friendly to high DPI monitors. Let's face it, there is only so small a character can physically be before you can't see it. Since Windows was essentially pixel-based in all things (dialog box positioning, font sizing, etc), working with a display of significantly higher DPI than the average means that you are going to have to deal, on a daily basis, with all the software out there that doesn't do sizing correctly.
Or go turn on 'Large Fonts' or 'Extra Large Fonts' in the Appearance tab on the desktop. Things work, but there's plenty of software where stuff will be out of position, dialogs will look funny, buttons and fields will have chopped off text, etc.
Heck, just go around to some bad websites where the nitwit specified his font or table or frame in pixels. It's bad enough on a normal monitor - what will it look like on a 200 dpi one?
Until Vista takes hold and most software starts to use the APIs correctly enough, we're not going to see really high res monitors, because it will be too painful to use them, day to day.
I think one good point is the fact that in conventional operating systems everything gets smaller at a higher resolution. Many non-power users do not want everything so small that they can barely read it. OS X 10.5 should fix this with its resolution independence. I do not know much about Vista, but I did notice an "increase font size" option that changed its DPI setting so that things became larger to allow for higher resolutions.
Hmm, that's what I get for not reloading my page before posting. Sorry for the similar comment.
To argue the other side: desktop display sizes *are* approaching the limit of what people want on their desk. (Ok, I'll admit, I'd still like a giant monitor. But for your average man-on-the-street, the Mac 30" display is TOO big.) It's not as hard a limit as in laptops, but there's definitely some back pressure there.
Something I found out when researching High-DPI screens:
For a given technology, higher-DPI screens are typically dimmer than lower-DPI screens. This is because LCD-based displays have a matrix of thin black lines that define the edges of the pixels. (Use a magnifying glass to look at your LCD to see the matrix.) When you increase the DPI, the matrix lines don't get any thinner, so the total area not covered by the matrix lines decreases, causing less light to pass through the LCD.
You can compensate for this by increasing the intensity of the back-light, but that wastes more power and generates more heat.
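A toy calculation of that effect, with made-up but plausible numbers; the black-matrix line width and pixel pitches below are illustrative assumptions, not real panel specs:

    def aperture_ratio(pixel_pitch_mm, matrix_line_mm=0.02):
        # Fraction of each pixel cell that passes light, assuming the black
        # matrix lines keep a fixed width as the pixel pitch shrinks.
        open_side = pixel_pitch_mm - matrix_line_mm
        return (open_side / pixel_pitch_mm) ** 2

    print(round(aperture_ratio(25.4 / 100), 2))  # ~100 DPI pitch -> ~0.85 open area
    print(round(aperture_ratio(25.4 / 200), 2))  # ~200 DPI pitch -> ~0.71 open area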
1984 was way less than thirty years ago (I don't have the equipment to do the exact calculation right now).
Heh. :) Corrected.
for your average man-on-the-street, the Mac 30" display is TOO big
Interesting, because I've seen people use 37" HDTVs as computer monitors..
Comparing an original Macintosh with today's screens is a bit misleading, as that used a CRT while today's computers almost always use an LCD. My office replaced all its CRTs with LCDs a year or so ago, and it was definitely a step backwards. The equivalently priced and sized LCDs to our old 1600x1200 CRTs only do 1280x1024. So yes, there's been a bit of a move backwards as the LCD replaced the CRT. If LCDs had never been invented, then we'd probably all be looking at screens with a higher DPI.
Plus, Dell has sold 130+ DPI laptops for a while now.
I'm surprised you didn't mention gaming, the one force that actually keeps the DPI from going up.
DPI is measured only in one direction, so when the DPI doubles, the number of pixels that have to be addressed quadruples.
Simply rendering at a lower resolution and then stretching the image clearly defeats the point. So faster graphics hardware is the only alternative.
So even if you get your hands on a nice 30" 200 DPI display (with a resolution of, say, 5120 x 3200), then we're talking about more than 10 times as many pixels as a normal 1280x1024 display (common for 19" TFTs).
When our graphics cards have no trouble dealing with such resolutions, then the DPI may increase. But currently, large screens with low DPI are the only possibility with the slow graphics hardware we have.
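The arithmetic behind that "more than 10 times" figure, as a quick sketch:

    def megapixels(w, h):
        return w * h / 1e6

    # Doubling the DPI at a fixed physical size quadruples the pixel count.
    print(megapixels(2560, 1600))   # a 30" panel at ~100 DPI -> ~4.1 MP
    print(megapixels(5120, 3200))   # the same 30" at ~200 DPI -> ~16.4 MP
    print(round(megapixels(5120, 3200) / megapixels(1280, 1024), 1))  # ~12.5x a 19" TFT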
I just want it for my tired old eyes. I program for hours and hours each day. The company just sent out an ergonomic questionnaire. One of the questions was "Do you spend four hours or less per day working on the computer?". Correct answer..."Ha!"
I've tweaked as best as I can and my 22 inch is pretty sharp, but with that amount of time on the screen, I would LOVE to have it easier on the eyeballs.
And as a non-game programmer (and pretty much a non-gamer) the horsepower needed to drive it isn't the problem. It's just the lack of availability....oh....and the cost if it were.
If you allow for laptop screens, that 100 DPI isn't really the top of the consumer monitor spectrum.
I'm the proud owner (sort of) of a Dell Latitude D820 with a 15.4 inch WUXGA display. That's 1920x1200, and _150 DPI_ for those playing along at home. I bought it almost exactly a year ago, and at the time, Dell was the only company I could find selling WUXGA laptops at 15.4 inches.
Now with a year under my belt on this bad boy, I know why they're so hard to find. Everything is absolutely _tiny_. If I slouch back in my chair, I can't read anything. Smaller images on websites essentially become thumbnails. (Thank goodness for the Firefox Mouse Gestures plugin that lets me scale up images with a quick mouse stroke.) And the natural dimness of laptop displays really doesn't help things.
Yes, I could turn up the DPI settings in Windows, but let's face it, most apps don't do very well, aesthetically, if you mess with the DPI settings. Even some of Microsoft's own.
Anyway, if I had the choice to make again, I'd go with a 14.1" SXGA+ (1280x1024, 116 DPI) screen, for two big reasons. For one, everything would be a little easier to see. For two, the widescreen format just isn't good for editing code or documents. Better to just go with letterbox and get multiple monitors. If you're someone who needs widescreen though, you might be better off with the more modest WSXGA+ (1680x1050, 128 DPI), or bumping up to a "desktop replacement" 17-incher for WUXGA (133 DPI).
(BTW, for anyone who wants to dismiss my complaints and is interested in getting one of their own, Dell discontinued this screen format for a while, but you can get it again now on the Latitude D830.)
Interesting, because I've seen people use 37" HDTVs as computer monitors.
*Drool* I would love to have my computer hooked up to this bad boy: http://www.samsung.com/Products/TV/DLPTV/HLT6189SXXAA.asp. All my multimedia and computing needs served to me right in the comfortable butt-groove of my couch. =)
If you allow for laptop screens, that 100 DPI isn't really the top of the consumer monitor spectrum.
Doh! I guess on my first read-through I missed the blockquote in your post that said essentially exactly this.
Hopefully displays never do become any more high resolution. 90% of users simply browse the web, watch (poor quality) web videos, and already their monitors are left on wasting energy. In order to have inexpensive high resolution monitors available, they would have to be affordable to the masses so that mass production would be profitable. Seeing as how Microsoft loves chewing up resources, I can see them running out of pixels in the not-too-distant future. Do users really need seamless text with detail only visible under magnifying glasses? I am more excited to see high contrast (100000:1) SED displays, and I anxiously wait to be dazzled by them.
Floppy disks haven't improved either. I don't know the exact year, but once we reached 1.44/2.88, development completely stopped (in favor of CD, DVD, zip disk, etc). Still, most PCs I see have a floppy drive (aka existing PCs, not new ones) and they pretty much use the same hardware that we had decade(s) ago.
You said it yourself - laptops.
More and more people are moving to laptops. Especially laypeople who don't really need the grunt of a huge desktop computer any more. As laptops take over you're going to see average DPI go up on monitors by simple virtue of the fact that most computers will be laptops.
Lots of things to comment on. I just got a 24", 1920x1200. I don't really want anything physically larger than this, so the only way anyone's going to sell me a new monitor is if this breaks or they offer higher DPI. I want higher DPI - after all, that's why it's more pleasant to read text on paper. I know some people will always want larger displays, but I think when 24"ers come down in price to say $300-400, higher DPI might be the next step.
Games: double the DPI. This shouldn't be a big deal - set the game at half the native res and the monitor should have an easy job of scaling. It should have the same quality as a display with half the DPI. Or the video card could scale - this shouldn't be much work compared to the other rendering the card has to do.
Programming on wide-screen: turn it sideways - now you have 1200x1920. I actually find x1200 to be enough - I like the extra space to the sides for icons or docks.
*Effective* dpi. Perhaps it makes more sense to measure the eye-to-display distance, and compare effective resolution in pixels per degree.
Windows has "large fonts" and screen resolution settings. OS X Leopard will have a resolution-independent interface. Soon it will be convenient to sit as close or as far from your display as you are comfortable.
And if you *have* been looking at computer displays since 1984, I bet you are sitting farther back, so your effective resolution is increasing at better than industry standard. Cheers.
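One way to compute that "effective resolution", as a rough sketch; the viewing distances are assumptions for illustration:

    import math

    def pixels_per_degree(ppi, viewing_distance_in):
        # Angular pixel density: how many pixels subtend one degree of the
        # viewer's visual field at the given distance.
        return ppi * 2 * viewing_distance_in * math.tan(math.radians(0.5))

    print(round(pixels_per_degree(72, 18)))    # 1984 Mac at ~18"       -> ~23 px/degree
    print(round(pixels_per_degree(100, 28)))   # 100 DPI desktop at 28" -> ~49 px/degree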
I will mention yet again, laptops. I have a 1920x1200 display on my notebook, and absolutely love it. This comes out to something like 147 dpi.
Additionally, I have an external older CRT monitor at 1600x1200. I love all of the real estate, but could definitely use more.
I will be attending RPI in the fall, and my only concern about the laptop that they will have for the students to buy is that it will not be high resolution, i.e. 1920x1200 (or better). (They haven't yet released the specs for it.)
Just count how much memory you'll need and how many CPU cycles will be wasted.
Although the solution to this is clear - send images compressed, and letters as vector data, to the screen. Some tricks will be needed to make vector rasterisation computationally cheaper, but for characters this could work well, I think.
By the way, as far as I can remember, there was a solution in the CRT world that used horizontal line phosphors instead of dots, so you get infinite horizontal resolution.
Amen, someone who does care about resolution!
Last year, when buying an LCD monitor, I had basically two options (I didn't want to go over expensive widescreen stuff): a 17" running at 1280x1024 or a 19" running at 1280x1024. I said "pay more for the same screen space, actually resulting in an inferior resolution?" and bought the 17" one. I actually prefer my pixels smaller and setting my OS to a large DPI setting (making text bigger and subpixel anti-aliasing really smooth), but sadly very few people realize this.
Now, about Dave Shea's comment and ClearType vs Apple's Display PDF (or whatever their text rendering engine is called or is part of): he's right that on higher resolutions you don't need so much to align to the pixel grid. BUT fonts only apply hinting at small sizes! As your resolution gets better, your fonts will naturally get bigger (in pixels), and ClearType's hinting and aligning will be greatly reduced, making it closer to Apple's rendering (while still being *arguably* a tiny bit clearer to read).
I'm guessing that by the time we get to 200 dpi, the graphics hardware will be phenomenal, and we'll still be doing anti-aliasing and ClearType because why not? The technology is there, we've got cycles to burn, and it makes it just that much better. Now if DPI suddenly jumped up without a corresponding boost in GPU power, I can understand if we dropped anti-aliasing and ClearType for a few years.
"1920x1200 doesnt really help me see more code - really not any better thant the 1600x1200 of most 4:3 19 or 20" displays. developers need more vertical for editing code."
Am I the only one who keeps several emacs buffers open side-by-side while coding? I can easily fit two files on screen at the same time, by splitting the window vertically into two tall views (or as emacs calls them, "frames"), even on my small laptop display. "C-x 5 2" is your friend. :)
I just wanted to tell everyone that the T221 monitor is really great; the 200dpi in a 22 inch monitor is really wonderful. You can barely see the pixels and the rendering is really very smooth.
I don't mind that the UI does not scale, as I have good eyes. Having a 3840x2400 pixel desktop is really a marvelous experience.
For the curious, there is an active group on yahoo (http://tech.groups.yahoo.com/group/IBM_T2X_LCD/messages?o=1) dedicated to the T221 family of displays.
Sometimes, you'll find one of the T221 variants or maybe a Viewsonic VP2290b appear on e-bay. Sadly, these displays are no longer manufactured. IBM stopped selling them through their official channels in early 2006, which is a shame for such a wonderful piece of technology. But I understand that having to pay about $8000 for a monitor is a lot of money and that there never was much demand for it. Vista should change this, but people are so used to large pixels that I doubt that there would be enough consumers ready to invest in high DPI displays.
But imagine having 9 million pixels with zero defects! It's like having a Toshiba Libretto with a 22 inch display :-)
Pierre (author of www.creativedocs.net)
I remember distinctly (though without evidence) that the original Macintosh screen was 72 DPI.
Doubling the DPI will quadruple the number of pixels, which I think should be the factor to look at if you are trying to make a "Moore's law" comparison. But then "Moore's law" was about doubling the number of transistors at the same price (if I remember correctly). Monitors aren't really just about more pixels but also about the "quality" of the pixels: update rates, contrast, brightness, colour space, etc.
It's particularly ironic that the DPI Wikipedia article that Jeff linked to clearly has a section entitled "Misuses of DPI measurement" and talks about how people mistakenly apply the term to refer to monitors, instead of using the proper term (which Chris and others have pointed out to be PPI).
"The keyboard? :)"
I'd say this one has actually gone backwards.
Agreed. You also used to be able to find mechanical keyboards that did the job right.
Now turn that awful music down...
Another issue is that a big display + high DPI = a hell of a lot of pixels.
Imagine that fast operations your OS does will now be noticeably slow.
Moving a window around the screen, redrawing it, fading the screen etc. will need 10 times more processing (and more video memory too).
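For a sense of scale, here's the raw framebuffer memory alone; this counts one 32-bit buffer per screen, and a compositing desktop keeps several more per window, so treat it as a lower bound:

    def framebuffer_mib(w, h, bytes_per_pixel=4):
        # Memory for a single 32-bit framebuffer at the given resolution.
        return w * h * bytes_per_pixel / 2**20

    print(round(framebuffer_mib(1280, 1024), 1))  # ~5.0 MiB
    print(round(framebuffer_mib(3840, 2400), 1))  # ~35.2 MiB, about 7x as much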
BTW, I watched a video on Channel9 about ClearType. Microsoft have invested millions in the technology, and hired an independent benchmarking organization, which found serious productivity gains from reading ClearTyped text.
I am amazed that some people will argue like crazy to defend OS X's superiority to Windows in all respects. Don't forget that Apple has a smaller OS budget and it is understandable that they need to 'cut corners' here and there.
100 DPI ought to be enough for anybody.
The best screen resolution I've ever had was from my Palm's 320x240 pixel screen.
So, there's broad agreement that with much software setting large fonts and large icons just doesn't work. And broad agreement about small print and layout issues with websites.
What about those of us with LOW resolution vision? I don't know if high res monitors will help fix these things, or make it much worse.
Do head-mounted displays offer the possibility of high res over an apparently large area? I don't know where to go to find out about those displays: price, availability, performance...
These discussions are a bit frustrating. Even though Joel Spolsky pointed out that *different* is likely to look *wrong* when it's new, many of the participants aren't getting that this might apply to them.
I see a lot of long-time Windows users having a glance at Safari for the first time, and declaring that Mac vs Windows type rendering equals accurate representation of typefaces vs better readability. A discussion about different design philosophies ensues.
This is a strawman position, because I haven't seen a shred of evidence that ClearType is more readable than Quartz text. We can't even realistically compare the two systems, based on a single pair of screenshots of a Google page set in Microsoft's bastard Arial font. Hinting is a feature of some fonts and not others, and varies in quality, and OS X does make some use of it.
To compare the two font engines, we need to see a variety of screenshot comparisons, with a variety of fonts.
And as Dave Shea pointed out,
"Marginalizing type designers is a pretty poor way to make any sort of point about typography, given that entire careers are based on an understanding of legibility and facilitating ease of reading. A statement like that one almost veers into dangerous “programmers knowing better than experts in their respective fields” territory, which I can’t imagine was his goal."
Readability is also affected a great deal by the selection and quality of fonts, as well as the line length and leading the text is set on—this is true for both print and screen. An evaluation of readability should consider at least some samples set by professional designers experienced in working with type.
Who says that more exaggerated hinting improves readability better than proper use of typefaces? Who has demonstrated that ClearType's readability is not *worse* than Quartz's?
My 12" Dell with 1280x1024 yields a DPI of 136.6. Honestly, with subpixel rendering I don't think a higher DPI would be very beneficial to me.
The Neo1973 phone from FIC will have a 2.8" diagonal (43mm x 58mm) 480x640 LCD screen. This corresponds to about 283 DPI and the developers who already have it says it's gorgeous. (More at http://wiki.openmoko.org)
I think I already have a 200 DPI display on my desk. My Dell Axim x51v has a 3.7 inch VGA display. Using the same calculator, it yields 216.2 DPI. And it looks great, it's almost impossible to see the pixels. The fonts, etc., are adjusted so that everything is readable. This PDA was a little more expensive but still affordable to a fair number of consumers (IMHO).
I'm still waiting for my desktop version.
When comparing DPI you have to take into account more than just the stated resolution and the screen size. A display rated at 1920 pixels wide may not actually have that many physical pixels available. Until recently most "high res" plasmas could accept 1920 but had a native resolution of only 1366. Check the specs first -- pixel pitch tells you as much as a 'resolution' number.
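Pixel pitch converts directly to pixel density, so it's an easy check; the pitches below are typical examples, not any particular model's spec:

    def dpi_from_pitch(pixel_pitch_mm):
        # Physical pixel density implied by a panel's pixel pitch spec.
        return 25.4 / pixel_pitch_mm

    print(round(dpi_from_pitch(0.294)))  # typical 19" 1280x1024 desktop LCD -> ~86 DPI
    print(round(dpi_from_pitch(0.125)))  # a 0.125 mm pitch panel            -> ~203 DPI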
The benefit of a high DPI monitor is that you can fit more stuff in the same space, the same way you can put much, much more readable text on a piece of paper than on a monitor of the same size.
It is refreshing to see concern about the quality of the graphics.
The story is incomplete without looking at COLOR.
Yes, the early machines might have been able to achieve some reasonably high resolutions, but only by abandoning colour. If you wanted 256 colours you had to drop the resolution.
You are not comparing like with like.
@ Mike Johnson
The white space is from a website designed for the majority of the www users that don't have large (15") high-res (800x600) displays.
While bigger and better displays may be "standard" in the US, they are not necessarily worldwide.
I appreciate Jeff's KISS blog design, as well as the content.
The post needs a mention of the iPhone. We're up to 160 DPI.
I don't understand why, after a certain screen size (Apple Cinema HD), monitors drop in resolution to 1080p or 1080i. The price of a DPI gain rises unnecessarily steeply.
I'm a firm believer in higher resolution displays; they offer more room to do things. My friend and I were always dreaming of tearing apart 4 19" 1280x1024 LCD monitors and making the gap between them as small as possible to emulate a super high resolution display (5120 x 4096). I believe this is possible with a Crossfire or SLI setup, correct me if I am wrong.
At 100dpi, ClearType wins out
Whereas, at 80dpi, ClearType looks like I'm drunk.
Even when bigger displays are standard, most people don't modify their resolution, and many people still are running CRTs, meaning that they're running at the default resolution of Windows (800x600 if they're running XP, 640x480 before that). For many people with vision problems it's really not acceptable to run at higher resolutions without scaling the font size up, and many people have already noted the problems with that.
One of the things I love about running at high resolutions on wide screen monitors is opening documents side-by-side, or using the 2-page reading view in Word.
Someone previously mentioned HD movies, but realistically we're already beyond the realm of HD when we're on a computer. A higher DPI would mean that the HD movies would be smaller on the screen if they're run in their native resolution than they are now, and most current computer monitors can run at a higher resolution than HD anyway (1080p? Let's see 1600p).
One of the issues with current gaming performance has to do with antialiasing anyway. As cards became capable of acceptable performance at the high end of monitor resolutions, and manufacturers realized that most of their users were running games at low resolutions (1024x768 or 800x600 was common at the time, and some people used to run even lower resolutions to squeeze out maximum performance), they started using FSAA methods that often render at sub-pixel levels to improve the quality of the images on the screen. In other words, graphics card manufacturers have been working in a 200dpi (or 400, sometimes more) world and sending 100dpi to the screen for some time now, and you could easily disable the FSAA to get better performance on a higher resolution screen. The need for higher end video cards is primarily driven by the desire to use all possible features of any given game. None of my systems have top-of-the-line video cards, and I haven't had much trouble running games in quite a while unless I had GPU-intensive features enabled (that usually can be disabled).
One Laptop Per Child (OLPC) has a 200DPI display. And the machine clocks in at $150 at the moment, so the technology can't be that extortionate.
They say the screen specs are:
Viewing area: 152.4 mm x 114.3 mm (7.5" diagonal)
Resolution: 1200 (H) x 900 (V) (200 DPI)
Technically, a 15" monitor is a quarter of a 30" monitor, not a half.
I have a Viewsonic VP2290b, the 200dpi monster that used to sell for $10k. They can be had on ebay cheaper than that now (still more than a New Dell 30").
It has been difficult to live with. I'm tied to a particular video card now that has dual-output dual-link DVI, and I still only get 30Hz refresh (which is actually just fine with this screen). Fonts are a big problem - all the fonts that work well on 75dpi screens tend to suck on 200dpi. It draws 150 watts and has a fan inside it. Unresized 10 megapixel photos are eye-opening: the sharpness of consumer LCDs is gone, but it is replaced by a depth of fine detail normally only realised when viewing a professional print on photo paper.
But it sure has its advantages when coding. I fit three 132x132 terminal windows side by side. A 1280x1024 browser window only takes up one corner of the display. I can work on 5 source code files, have 3 man pages open, and an SQL console, with each window being full-sized and capable of doing normal work.
Apparently they aren't making these displays any more. Not only are they expensive to make, they're not very popular with the majority of people, who have to squint to read the small fonts it is capable of displaying. 200dpi was too big a jump; there needs to be organic growth in resolution so that operating systems have a chance to catch up.
"100 DPI ought to be enough for anybody.
Haacked on June 15, 2007 02:33 AM"
Update your .sigs now, people...
Vizeroth wrote, "Someone previously mentioned HD movies, but realistically we're already beyond the realm of HD when we're on a computer. [...] most current computer monitors can run at a higher resolution than HD anyway (1080p? Let's see 1600p)."
I think you're mistaken about how HDTV resolutions are denoted. The number is the vertical resolution, so 1080p is actually 1920 x 1080, a resolution that only a handful of "current computer monitors" support.
Aren't we talking about PPI and not DPI?
"I can't think of any other piece of computer hardware that has improved so little since 1984."
What a non sequitur. Video has improved where it's mattered; just think about CGA graphics and the primitive graphics modes of some years ago. Why don't you complain about still using a mere three primary colours? We should be up to 30 by now. Or why don't you complain about sound, or the number of keys on a keyboard?
Looking at long-term trends, you're missing the causes of those trends and drastically missing the near-term trend.
Yes, computer displays have been stuck at about 75dpi for a very long time, only recently advancing to 100dpi and beyond. Why?
1. Too many pixels. A 600dpi display would be awesome to behold. However, given the standard DVI connector and a 60-75Hz refresh rate, the max size of that beast would be about 5" diagonal (don't even get me started on the VGA limitations we lived with until about seven years ago!). Now we have dual-DVI connectors, of course, but even that only brings the max screen size up to 10". Given a choice between a crystal-clear 10" and a good-enough 40-45" display, a significant portion of the market would choose the giganto-screen. And, honestly, you get a lot more usable data on that than on a 10" printed-paper-quality screen.
2. Dumb OSes. Until OS X Tiger (prototype) and Leopard on the Apple side, and until Vista on the Windows side, having a DPI of twice the "norm" meant that all your screen targets were half the size. Having a 600dpi screen would mean your on-screen square-inch icon target becomes one-36th the size (6x resolution == 36x reduction in area). People can't deal with targets that small, and until very recently OSes couldn't reliably provide larger targets.
3. Increased processing needed for larger screens. Obviously it doesn't matter if you have a 30" 200-pixel-wide screen or if you have a 20" 2000-pixel-wide screen here. However, given that the average person will be able to view significantly less information on the 20" screen relative to the 30" screen, the extra processing power needed isn't offset by added utility.
4. Disappointing lack of human evolution. The constant here is human eyesight and hand/eye coordination. That hasn't changed at all. Yes, 600dpi is ideal, but 75dpi is "reasonable". The quality increase with increased dpi moving from, say, 100dpi to 200dpi is significant, but not so large that it can outweigh the negatives above.
4 isn't changing any time soon, but 2 and 3 have already changed drastically, and 1 might see further improvements in the near future.
Looking at near-term trends, we've gone from 75-ish dpi to 100dpi fairly slowly, and went from 100dpi to 133dpi pretty darned quickly.
Now, obviously given point (1) we're more likely to see 200dpi screens in less-than-gargantuan proportions in the next few years. But, honestly, if you're using a 30" monitor to read type at 6 or 8 points, you're a little off your rocker.
Overall, in the average-size screen range, what's *not* holding us back is the ability to pack LCD pixels in tighter. We can make 200dpi screens. What is holding us back from that is that there aren't enough people willing to *buy* the higher-dpi screens, primarily because of (2) above.
I love my 1920x1200 17" laptop (133dpi?).
Most desktop LCDs look blurry when I'm forced to use them. The last company I worked for wanted to purchase 19" LCDs. I could not find higher res units available through normal channels at any price! Why are these not available? Therefore we bought on price.
I don't believe 133dpi is enough yet, and hope my next machine goes higher.
That said, I don't recommend high res displays to just anyone right now, only power users. As stated by others, you have to know how to make Windows XP and a lot of websites jump through hoops. Hopefully the new versions of Windows and the Mac OS address this.
Some applications don't even work properly (under XP). The one that comes to mind first is Google Earth! The driving directions box assumes a fixed DPI, causing the bottom row of fields to barely show up. Making changes to system settings simply to run different software is not fun.
I'm not a fan of the wide format displays. I believe a more square ratio screen has more functionality for the majority of users outside of niche activities such as movie editing and programming.
In a related argument: I once operated a small portrait studio on the side (I'm a left brain guy by nature, so it was a challenge). When going digital from medium format film cameras, the long format was really awkward to compose with. How do you tell Nikon or Canon to make a 4:5 format digital camera for its pro users?
I think the important thing we need to focus on is that we need choice. I have a Dell Inspiron 8600 with the WUXGA screen and I love the little thing to death. Yes, some people can't read type that small. Some people don't want their OS windows to scale. But I do, and there are more like me. Some people have bad vision, I have 20/10. Do you know how much it pisses me off that I can't even *buy* a good screen like what I want? Do you know how much it pisses me off that I can't buy a 19" 1600x1200 LCD for my desktop?
And yes, I am a gamer. I don't understand all those whiners who complain about low resolution gaming. The LCDs usually upscale anyway, and frankly they do a good job of it. UT2004 still looks fantastic at 1280x800 on my laptop, World of Warcraft looks great, and I'm sure a lot of other newer games would look great too if they actually ran on my (now) puny Mobility Radeon 9600.
Somewhat off-topic, but:
As I read your latest post I couldn't help but think of an analogy with respect to music synthesisers. Not one manufacturer today builds a keyboard bed with polyphonic aftertouch, despite the fact that this feature is highly desired by true performance musicians who value the expressive power of poly AT.
Even KORG's top of the line, no feature unimplemented, ultimate keyboard, the $8000 OASYS, does not feature poly AT. The reason always seems to be, it's too expensive to justify the limited market. New model keyboards will sell without it. The "less good" monophonic aftertouch that they have is "good enough".
I'm wondering if Hi-DPI displays are in a similar position.
Great post Jeff. You really opened my eyes as to why my laptop display looks sooo much better than my desktop display. As I type this, I'm running a 15.4 inch Asus G1 laptop with a 1680x1050 resolution that's hooked up to a 17 inch LCD monitor at 1280x1024.
Needless to say, the picture on the laptop blows away the picture on the external monitor. At first I thought it was the resolution + glossy screen, but your post makes it clear that it's actually the DPI that makes the difference.
See http://daringfireball.net/2007/06/high_res_macbook_pro for a comparison of current Apple offerings. 200 DPI ThinkPads have been around for years, but haven't been popular.
It's a chicken-and-egg issue, and one that's changing. People don't buy high-res displays, because the software is designed for low-DPI displays. If you use one of those 200 DPI ThinkPads, you have to either have really good eyesight, or set all your fonts to huge, plus have a web browser that will scale images up.
Apple has been telling developers for a few years now that they need to start migrating away from bitmaps and resolution assumptions. And, from what I understand, Microsoft wants people to use a resolution-independent API that's new with Vista.
The iPhone and high-DPI MacBook Pro are only Apple's first volley. Once OS X 10.5 is out, I would expect Apple to introduce more high-DPI devices in quick succession.
One more thing...
I have a friend who just replaced his ancient 200 DPI ThinkPad with the new MacBook Pro. When he bought the ThinkPad, it was pricey, but not $20k pricey. He's been holding on to the ThinkPad for years because, once he got it configured right, he didn't want to give it up.
The growth rate is considerably less than even you have indicated. In late 1991, I bought my first computer (first with my own money anyway). It had a 14" (13.3" viewable) screen because that was pretty much all I could afford, but I got a pretty expensive ATI graphics card. I ran at 1152x864 resolution on that itty bitty display. According to TAG, that's a DPI of 108, better even than your 2004 Apple Cinema HD display.
Basically, in the last fifteen years, the cheaper monitors have done around 96 DPI, and the more expensive monitors have done 110, lately up to slightly over 120, with few exceptions.
Those numbers were true 15 years ago. They're true now. Based on our growth rate, I'd expect mainstream 200 dpi monitors, oh...never. At least not in my lifetime.
I do hope that the growth rate accelerates, but I'm not counting on it.
I read a post up a ways about computer hardware lacking support for high resolution with regard to gaming. By the time 200 DPI is standard, hardware and software will be leaps and bounds ahead of where we are now. Plus, look at the rate of change in resolution over the years, compare it to the advances in processing power and memory, and then draw a conclusion about what the problem will really be.
Perhaps mobile phone displays will drive high DPI development. I believe that the Openmoko is projected to use a 280 DPI display, and I don't think it's being custom made for it or anything.
"To reach 200 DPI, that same 15.4" laptop display would have to pack in 2560 x 1600 pixels. Imagine a 30" Apple Cinema HD display shrunken by half, and you'll get the idea."
The size is measured diagonally so going from 30" to 15.4" isn't shrinking it by a half.
Going from 30" to 15.4" is a reduction of around 75% in display area: a roughly 50% reduction on both the horizontal and vertical axes.
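The arithmetic, for the record:

    ratio = 15.4 / 30                    # linear scale factor, ~0.51
    print(round(ratio, 2))               # each axis shrinks to ~51%
    print(round(1 - ratio ** 2, 2))      # ~74% of the display area is gone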
Yesterday I read Jakob Nielsen's "Designing Web Usability" (http://www.useit.com/jakob/webusability/). He notes that reading from a screen is much more tiresome than from the printed page because of screen flicker (a problem that has been eliminated in LCDs) and its _lower resolution_.
I think that switching to a 200 DPI (or even better, a 300 DPI) monitor can seriously improve a person's efficiency. Imagine working with your code when the characters look like they have been printed on a laser printer.
There is an 'electronic paper' technology developed at E Ink Corporation (http://www.eink.com/), but unfortunately, for now they can only manage at most a 9.7" display (http://www.eink.com/products/matrix/High_Res.html), and yes, they are grayscale only.
Creating an actual panel with 200 DPI is not a problem. Think about it: 200 dpi is really 600 dpi (counting the RGB components separately). That means a single sub-pixel is 25.4 mm / 600 = 0.0423 mm, which is enormous, given that it's 10,000 times larger than the features on a modern CPU.
Sure it's a different technology, but the point is that making smaller pixels is not the bottleneck. It's far harder to get the other bits working: A 21", 16:9, 200 dpi screen means driving a whopping 65 million sub-pixels. With a 60 Hertz refresh rate, this requires a data-stream of 31 Gbit/s. That's completely out of the question with analog connections, and the digital standards are only emerging now. Even the latest HDMI spec defines a clock of "only" 340 MHz, which results in bandwidth of 10.2 Gbit/s. Still not enough for a reasonably sized 200 dpi screen.
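A back-of-the-envelope check, counting whole pixels at 24 bits each rather than sub-pixels and ignoring blanking overhead, so if anything it understates the real link requirement:

    def raw_bitrate_gbps(width_px, height_px, refresh_hz=60, bits_per_pixel=24):
        # Uncompressed video data rate, ignoring blanking intervals.
        return width_px * height_px * refresh_hz * bits_per_pixel / 1e9

    # A 21" 16:9 panel at 200 DPI is roughly 3660 x 2060 pixels.
    print(round(raw_bitrate_gbps(3660, 2060), 1))   # ~10.9 Gbit/s
    # Dual-link DVI carries roughly 8 Gbit/s of pixel data, and even the
    # 10.2 Gbit/s quoted above for HDMI 1.3 falls short.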
According to the calculator, my laptop has a resolution of 168 dpi which would explain why some things on it look particularly crisp.
For most things, I find that portrait orientation is much better and so normally have my LCDs rotated. The growing popularity of widescreen makes this harder as I find 1200 to be the minimum useful horizontal resolution.
So now we have 3 days of articles about why you think ClearType is better? Seriously, get over it; you're sounding like a child who's upset people didn't agree with him the first time.
Regarding if ClearType is better or not:
I think it's fairly obvious that aligned vertical/horizontal lines are more distinct with ClearType. While I'm all for font designers handling this themselves: correct me if I'm wrong, but I don't believe that the 9-point Arial font is specified separately from the 10-point Arial font, right? And I also don't believe that the 9-point Arial font on a low-resolution (say, 100dpi) display is specified differently from the 9-point Arial font on a high-resolution (say, printed) display. Which, to me, says that font designers, as wonderful and intelligent as they are, can not hope to make up for display pixelation. Right?
In any case, there is still the argument that ClearType makes some text *less* readable, not more readable. An obvious example was given (then pooh-poohed by a few other commenters) in the first posting here: when the under-baseline parts of "g" and "j" get absorbed into the underline because both get aligned to the same pixel row, you definitely *lose* readability. Of course there are solutions to this: don't vertically hint below the baseline, or move the default underline so that it can not align with below-baseline lines, or make underlining additive instead of masking (i.e., if the underline coincides with another line the pair "thickens"). But, as best I can tell, ClearType doesn't do that yet.
we have 3 days of articles about why you think clear type is better
We have 3 days of articles because I think it's an interesting topic that touches on multiple aspects of software and hardware engineering. I said up front that neither approach is "right". I think it's important that people have enough knowledge to know why they are seeing what they are seeing rather than just accepting it as fact.
Also, here are a few other citations on the vast difference in resolution between print and screen:
It's something that Edward Tufte talks about quite a bit.
It's a bit unfair to say screens have improved by a factor of 1.6 over the past 20 years.
If you measure the number of pixels in each screen:
1984 Original Macintosh 512 x 342 NPix=175104
1984 IBM PC AT 640 x 350 NPix=224000
1994 Apple Multiple Scan 17 1024 x 768 NPix=786432
2004 Apple Cinema HD display 2560 x 1600 NPix=4096000
You see that the number of pixels has increased by a factor of over 23 in the same time.
The number of pixels determines how much information you can really put on the screen, not DPI. Also, high DPI gets less and less useful the higher it gets, because of our eyes' limitations.
The Sony UX180P through UX390 have 4.5" diagonal screens with 1024x600 pixels. Doing the math, this is:
sqrt((1024^2) + (600^2)) / 4.5 = 264 dpi (color)
Applying Clear Type or other subpixel technology leverages 791 sppi.
(Note: the otherwise impressive OQO O2 has only 800x480.)
Terretta got there before me! Yes, the Sonys advertise up to 285dpi (ppi?) on the recent models. They have an enlarge function to dynamically enlarge and scroll through the now larger-than-screen page. At the multimedia shows in Tokyo you often see mobile phones with similarly high-res displays, albeit small. The problem of quality production is clearly being solved, but making larger versions is hard. I'm looking forward to Sony and other Japanese makers coming out with 10" versions in the near future. Time to buy a real portable laptop then!
I use a Dell 17" that's 1900x1200 which I think is 130 dpi or more. I love it... but then, like so many other computer users, I'm nearsighted and can easily see tiny text clearly. I use my Dell for software development and find that the 1900x1200 format is PERFECT for development because I find a lot more value in having a wide array of tool palettes surrounding the code than I find in having 200 lines of code visible on screen at once.
I do think that my display approaches the limit of diminishing marginal returns. I wouldn't want much more information (or much smaller text) on the screen.
I agree with the poster above about distance of eye to display. I had an original Mac, and you sit a lot closer to that little 72 dpi screen than when sitting in front of a 100 dpi 24" monitor. I like to sit back.
For that matter, I have a 60" 1080p DLP TV and a computer connected to it that drives it at full 1900x1200. At that resolution, sitting on the couch 10' away, the text is about the same "size" as when sitting right in front of my Dell notebook. Now that's a developer's workstation.
What is the big deal about DPI? ...what about the majority of users that simply set down at a computer to browse the web, type a paper, view an Excel spreadsheet, etc? You know that "average user". What will they do if this becomes standard?
They'll enjoy how their eyes no longer feel tired after viewing Excel spreadsheets and browsing the web all day.
Looking at words printed on paper is less hard on the eyes than looking at a screen. Partially, this is because a screen is a big light shining in your eyes, and until we get digital paper we're stuck with that.
But the other part of the reason is DPI. As Jeff linked to above, consumer printers have been able to print at 1200 DPI and above for years. It's nicer looking at text like this because you can't see the dots. Your eye doesn't try and focus on the tiny little details that make up the image, because it can't. They're too small.
When computer screens get better resolution, they'll be more like paper: our eyes won't be able to see the dots, so text will get much less tiring to look at.
Every user with eyes will benefit.
Dell Dimension 9200, 20" widescreen at 1680 x 1050 - can't see a darn thing text-wise and I already have my glasses on. Other than watching movies, the point of high resolution is what? I'm trading this in and going back to pens.
Hello, I just stumbled on this discussion as I am in the process of buying a new 15.4" laptop, and I'm really hesitating between 1920 and 1680, which equates to 147 dpi vs 128 dpi.
Of course if everything scaled properly (especially the web; for the OS and most apps like Word or Excel or editing tools it is now OK) I would go for 1920, but for now I think I will go with 1680... What do you think?
Otherwise I fully agree that higher DPI could change -- a lot -- the way computers are perceived, especially on the web.
After all, for text, quality printing on paper goes up to 3200 dpi, no?
For example, the Hitachi TX36D58VC1CAA laptop panel is 14" and 1600x1200, about $450. If you search ebay.com for vga lvds you will find controllers for about $50. Do you want to pay $500 for a 14" monitor?
My understanding of some of this discussion is that it is mostly about having very high resolution screens and then scaling the dpi to make it readable. I find the results completely unacceptable, as it only changes the font size in some places, and leaves a mixed result that's hard to read, as well as page design that's all out of whack. At least on XP. Maybe it works better on Vista. Do you expect to be able to read a 14" screen with 1680 or 1920 resolution? I doubt you'll be able to see anything.
Alain75, in response to yours, my conclusion is that it's best to sit in front of whatever you're thinking of buying so you can see for yourself.
I made some calculations showing comparable sizes (see my website; the last two articles are about my recent experience with monitors), although that doesn't cover everything. A 20" widescreen monitor I purchased at 1680 x 1050 res might be OK (although smaller for reading than I would like) except the text is so faint I can't see it at all without straining. The text seems to be medium grey instead of black, so it tends to disappear into the white background. Someone suggested this had to do with different technologies used in manufacturing monitors. Someone else said not.
I didn't know 15" laptops came with 1680 and 1920 resolution. It sounds very high to me. Have you seen that in person?
Cool article dude, too many comments though; I'll try to read them all!
I've loved it, especially as I'm trying to buy a screen with the same pixel density as the screen of my VAIO (17", 1920x1200, 133 dpi)! This sucks!
The best resolution screen I've used is the 254ppi one on a Nokia N90. Strangely, it's the only phone that uses that 352x416 display. Subsequent Nokias use a 320x240 display in the same size.
At 100dpi ClearType looks great.
On a shopping channel I saw a guy peddling a projector. He had a standard laptop, which I would guess was running at 1280px across - 1900 absolute max - projecting it onto a wall, and claiming 'you've increased your screen resolution 30 times! think how many more windows you can fit into the screen now!'. Funny in an infuriating way.
I think the total display resolution is as important as the DPI. The situation is gradually turning ridiculous: compare a digital SLR resolution of 20 Mpix (or even more) with that of a standard computer display, a humble 2 Mpix!
Manufacturers don't seem interested in making high resolution displays, for economic reasons. While consumers continue to buy old-fashioned displays, there is no point in spending money to build new plants and develop new technologies.
Though at present there is an inexpensive technical way to increase the total resolution of existing computer displays by a factor of 10 (i.e., DPI resolution by a factor of about 3), e.g. by using ferroelectric LCDs along with a double addressing technique and modulated amplitude illumination.
Not sure if it was mentioned before, but the Thinkpad R50p had a 4:3 QXGA (2048x1536) display running at 171 DPI. I would love to get myself one of those, as the 1920x1200 in my current 15.4" laptop is still kind of limiting. I'd be happy with 2560x1600, likely with a little font bump though (Windows Classic theme is already small enough at 147 dpi; maybe 170 would be OK, but 200 would be tiny).
More than a year has passed since this thread was started. Still no higher resolution monitors available.
It's funny that so few people recognize that insufficient screen resolution is why people prefer to read printouts rather than reading directly from the screen. I suffer when I have to read nonsense like "a 9 point font is too small to read on a 200 dpi display". A 9 point font should be (roughly speaking) 9/72 inch high, so it has the same size on a 3200 dpi printer as on a 72 dpi 1984 Macintosh. Everybody will agree that the 3200 dpi version is easier to read than the 72 dpi version. It's just OSes unable to scale their output that have hampered progress.
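To make the point concrete, here's how many pixels a 9-point glyph spans at various densities; the physical size never changes:

    def glyph_height_px(point_size, dpi):
        # Nominal pixel height of a glyph of the given point size
        # (1 point = 1/72 inch), independent of its physical size on screen.
        return point_size / 72 * dpi

    for dpi in (72, 100, 200, 3200):
        print(dpi, round(glyph_height_px(9, dpi), 1))   # 9, 12.5, 25, 400 px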
I have Vista on my 1400x1050 12.1" display at the 120 dpi setting, and essentially all programs and web pages work now. So, one obstacle is gone.
Concerning screen size: for reading, I guess 30-50" distance is optimum. A column of text should not be wider than about 5 inches, and the font should be something like 7 to 10 points (open a book or a newspaper, if you do not believe that). The resolution should be at least 300 dpi, but 3200 is not too much. There is not much sense in a single document wider than 8 inches. For sure we want to be able to place several documents on the desktop, but if I have 2000 pixels available I'd rather save my eyes by concentrating them on 10 inches and do without documents in parallel.
By the way, progress is even slower than assumed in this article. I had almost forgotten that: as a student, I encountered an (at that time already old) Tek 4014 display (see http://www.science.uva.nl/museum/tek4014.html). 4096x4096 pixels, on roughly 12x12 inches. Technical drawings looked as clear as on a pen plotter (anybody around who remembers what a plotter is?).
What do we learn from this? Umph; also the Concorde commenced service in 1976, and since then we've been going downhill, because we spend too much time with computer games.
I can't think of any other piece of computer hardware that has improved so little since 1984.
The keyboard? :)
Indeed. I'm still using an IBM model M keyboard and I wouldn't have it any other way.
The new Sony Vaio P UMPC has a density of 222 dpi: 1600x768 at 8". Naturally, it's getting the standard mix of thrashing from people who consider it unusably small without ever having used it, and excitement from those who are intrigued by it. It does seem that there is a more reasonable balance now than previously, although threads about putting a WUXGA panel into a previous generation 15.4" MacBook Pro (sadly, no longer possible, as the new panel is manufactured into the lid in a way that it cannot be replaced without breaking both) tend to be overrun by people claiming it's just a bad idea, who must have nothing better to do.
I swapped out the display panel in my 15.4" MacBook Pro for WUXGA, and still found it insufficient (for coding and document editing, primarily), but better than any option Apple offers. And at home, I have a 30" Dell panel which frustrates me every time I think about it. I can comfortably VNC from my 15" laptop to my home display with only slight downscaling, see the entire thing on my laptop, and work comfortably on it, even with the minimum terminal font sizes reasonably usable on the Dell.
I keep hoping for a reasonable pixel density on any 19" display, so I can replace this low density 30" monster with several of them. That the density is increasing on portables, albeit very slowly, gives me reason to hope.