June 12, 2007
I've finally determined What's Wrong With Apple's Font Rendering. As it turns out, there actually wasn't anything wrong with Apple's font rendering, per se. Apple simply chose a different font rendering philosophy, as Joel Spolsky explains:
Apple generally believes that the goal of the algorithm should be to preserve the design of the typeface as much as possible, even at the cost of a little bit of blurriness.
Microsoft generally believes that the shape of each letter should be hammered into pixel boundaries to prevent blur and improve readability, even at the cost of not being true to the typeface.
So we answer the question with another question. What do you respect more: the pixel grid, or the font designer? It's not surprising that Apple would side with the font designer, because Steve Jobs thinks Microsoft has no taste. But me, I'm a pragmatist. Given the ubiquity of relatively low DPI displays, I'm with Dave Shea. I side with the pixel grid.
Joel talks about the pixel grid, and how Microsoft's type rendering pays more attention to it. Speaking as someone who thinks a lot about the pixel grid, I have to say I think I'm coming around to the idea that Microsoft's ClearType simply works better.
Alright, I'd better qualify that quickly. Think about it this way – as a designer, you don't just set type in Photoshop and let it go, right? You tweak. You kern. You match the letters to the pixel grid as closely as possible to reduce blurriness. Sometimes spacing suffers, and you have to choose between a slightly blurry letter with perfect spacing, or a more precise fit within the pixel grid with slightly off spacing. I can't be the only one who leans toward the latter most of the time.
And that's the difference here. ClearType is a closer match to what I already do manually. Yes, I prefer the way type on OS X looks; ClearType seems too sharp and blocky, and the subtleties of the curves are lost. But for the medium in which it's being rendered, it seems like the more suitable solution.
Dave's opinion carries a lot of weight here, not just because he's a well-known designer, but because the three citations he provides demonstrate just how common it is for designers to do exactly the kind of manual, per-pixel tweaks that ClearType does for us automatically. And it's not just an aesthetic choice, either – there's plenty of hard data to support the assertion that snapping fonts to the pixel grid improves reading accuracy.
A fascinating greyscale-only variant of this rendering technique, FontFocus, illustrates beautifully how subtle tweaks can "snap" fonts to the pixel grid for better readability:
Typography, if you haven't figured this out by now, is really complicated. It's one of the few areas of "computer science" that actually justifies the title. I highly recommend reading the entire FontFocus article, as it's very instructive.
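To make the snapping idea concrete, here's a toy sketch in C – my own illustration, not FontFocus's actual algorithm (which, as I understand the article, nudges whole groups of stems while preserving overall spacing). It computes the greyscale coverage of a single vertical stem before and after rounding its edges to pixel boundaries:

    /* Toy illustration of pixel-grid snapping -- not FontFocus itself.
       A vertical stem spanning [left, right] (in pixel units) is rendered
       by greyscale coverage; snapping its edges to the grid turns two
       fuzzy grey pixels into one solid black one. */
    #include <stdio.h>
    #include <math.h>

    /* Fraction of pixel column [i, i+1) covered by the stem [left, right]. */
    static double coverage(int i, double left, double right) {
        double lo = fmax((double)i, left);
        double hi = fmin((double)(i + 1), right);
        return hi > lo ? hi - lo : 0.0;
    }

    int main(void) {
        double left = 2.3, right = 3.7;             /* ideal stem edges */
        double s_left = floor(left + 0.5);          /* snap left edge   */
        double s_width = floor(right - left + 0.5); /* round stem width */

        printf("pixel  unsnapped  snapped\n");
        for (int i = 1; i <= 4; i++)
            printf("%5d       %.2f     %.2f\n", i,
                   coverage(i, left, right),
                   coverage(i, s_left, s_left + s_width));
        return 0;
    }

Unsnapped, pixels 2 and 3 each end up 70% grey – a blurry double edge. Snapped, pixel 2 is solid black and its neighbors are clean white, at the cost of shifting the stem by a fraction of a pixel.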
Dave Shea thinks the pixel grid will be moot once high resolution displays become ubiquitous. I wholeheartedly agree, although I'm unsure when exactly that will be. Display resolution increases have been quite modest so far. Ten years ago I was using a single 17" 1024x768 display; now I'm using three 20" 1600x1200 displays. So you'll forgive me if I'm not overly optimistic about this theoretical jump from 100 DPI to 200 DPI.
I don't understand why Apple is asking us to sacrifice the present at the altar of the future. Can't we have hinting at low resolutions, and accuracy at high resolutions, too? Snapping fonts to a pixel grid may very well be irrelevant when everyone is luxuriating in the glow of their 200 DPI monitors. Until that glorious day arrives, respecting the pixel grid certainly makes text a lot more readable for those of us stuck in the here and now.
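To put rough numbers on that – strictly my own back-of-the-envelope math – a point is 1/72 of an inch, so a glyph's size in device pixels is points × DPI / 72. At today's ~100 DPI, the stem of a 10pt letter (assume roughly 1pt wide, for illustration) lands at about 1.3 pixels, so rounding it to the grid changes it drastically; at 200 DPI the same rounding is a much smaller relative error, which is why hinting matters less there:

    /* Back-of-the-envelope: device pixels per glyph at various screen
       densities, using pixels = points * dpi / 72. A stem width of ~1pt
       for 10pt body text is an assumption for illustration. */
    #include <stdio.h>

    int main(void) {
        const double dpi[] = {72.0, 96.0, 133.0, 200.0};
        for (int i = 0; i < 4; i++)
            printf("%3.0f dpi: 10pt em = %4.1f px, 1pt stem = %4.2f px\n",
                   dpi[i], 10.0 * dpi[i] / 72.0, dpi[i] / 72.0);
        return 0;
    }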
Posted by Jeff Atwood
It might also be related to the graphical subsystem and the monitor. I have an Apple Cinema Display and I'm using Mac OS X. When I looked at your original post with the two images, before I even knew which one was which(!), I saw the first and it looked very appealing to my eyes. Then I saw the second one and only thought, "OMG, what is this? A screenshot of a Linux browser running on X11 with only pixel fonts (no TTF support) and without any kind of font smoothing?" I had to look twice to actually see that the lower image uses font smoothing as well.
I'm not sure if this is because I view the image on an Apple system (hardware and software) or because I'm just used to Apple's font smoothing. While the second one might indeed be slightly more readable than the first, the letters are so horribly "thin" and a lot more blocky, which leaves far too much space between the individual letters for my taste and, as has been pointed out here, probably destroys the font.
Okay, what font was that? Times or Arial? Not much to destroy there in the first place :P However, try a webpage that uses a much more detailed font, where the letters have much finer-grained details. While Apple might make the font appear a bit unclear and very smooth, you will still recognize that this font is special (and not just Arial or Times). I'm afraid IE will lose most of the difference, and below a certain font size, every font will look the same in IE. That's not necessarily bad as long as the text is always readable – it's actually good – but it destroys the whole purpose of using different fonts at all.
Would you test Linux?
It seems there is something like ClearType built in (though fine-tuning it in GNOME's font settings seems quite different from the Windows ClearType Tuner PowerToy).
So I wonder: which side does Linux (or perhaps we'd better say X.Org) take?
I wish there were a way to replace the whole font-rendering engine on Windows. I don't like how typefaces look so synthetic and digital on my Windows machine.
I much prefer OSX's rendering, but I can see the Windows argument. I think it comes down to design purity vs. design practicality - with strong arguments for both sides. Take yer pick folks.
I'd like to get some feedback on the use of pixel grids for image analysis – specifically to track textual changes in an image as a result of image manipulation versus textual changes as a result of saving a bitmap image as a compressed JPG.
Yes, Safari's fonts have always looked pleasant to me, but they are only beautiful above about 12px; below that, everything looks awful.
Bah! I turn off ClearType in both Windows XP and IE7. Yes, I do use LCD monitors, and they are at native resolution. I just prefer crisp-edged text for reading.
P.S. Does anybody remember the old Mac System 6 and 7 days, when the same letter would render with different stroke widths depending on how it aligned to the grid? I used to call these the blobbies (cf. jaggies). You'd insert text and the rest of the paragraph would jitter as it moved across the screen.
Remember fonts like Venice and Chicago that were designed specifically for the 72-dpi screen and dot-matrix printing? (Making both the same was an Apple stroke of genius.) All the angles on the letter shapes were 45 degrees. I can only imagine how much fun they must have had converting those from bitmaps when TrueType was first introduced.
We've come a long way, but the design-for-low-res vs. design-for-hi-res choice won't disappear. Neither will the problem of readability vs. accurate font metrics.
Well, I am certainly not a designer. But I know that in the beginning, I disliked any and all font smoothing. I just wanted clear edges. Somewhere down the road, I started liking ClearType. Now I can't go back to a PC without ClearType.
Currently I am fiddling with my wife's MacBook, and I have to say that I just can't get past my dislike for the OS X smoothing, although I have approached it from different angles. That is definitely one of the main things keeping me from switching.
Some strange observations. I am far-sighted. For regular vision, I use glasses that are +5 strong. But I also use reading glasses that are +7.5 strong. These have their clear focus at about 20 inches, and they make every tiny detail sharp and clear at that distance. And guess what: the fonts rendered on the MacBook look quite odd through those. Not so much that I see a blur, but I see all the tiny bits that make that blur possible. One pixel leaks here, another one there. When I put my +5 glasses back on, things turn more into a blur, BUT readability increases, since the optical system in my brain is probably cancelling out the leaking pixels and rendering the font image into a block. In the end, to my eyes through my reading glasses, OS X displays fonts like a cheap inkjet would print them, and ClearType displays fonts like a laser would print them. That's at least what I can say from my own experience.
Something tells me that everybody's eyes have a different effective resolution at the distance they use their computers, and this is probably one of the contributing factors in this debate as well.
In the end, though, I really wish the choice were left to the user and his or her circumstances.
A follow-up on my post of yesterday.
I did a full reading of the article at antigrain.com (http://www.antigrain.com/research/font_rasterization/index.html) which has been referred to above. Granted, I probably only really understood half of it, but there was this argument about Apple's approach in that article:
But it looks like they also rigidly snap symbols to pixels, no matter how blurry they look. So, what is their mission? To render blurry text only in order to make people buy higher resolution displays?! It's an unfair game!
So I went to the Apple store on 5th Ave today to really test this one with my own vision. I have to say my findings agree with the above statement. As soon as you start browsing on one of the 23" or 30" ACDs, you get the point. Text appears very uniform and crisp on those displays, as opposed to all other Apple offerings. I say this with some degree of confidence in my vision when it is armed with my reading glasses. Once they are on, I'm 100% there in the very detail of the pixels when I am at the correct focus distance, much like a camera in sharp focus. The larger ACDs were of course sitting right next to the full Mac line, so I did get to compare them with all Apple offerings. To me, the 20" iMac with the TN-film display was the worst – worse, I thought, than even the white MacBook. Nonetheless, I admit this whole thing is an impression, since I did not have the chance to really sit down and do an all-out comparison, installing Firefox and other non-Apple software or tinkering with the font smoothing settings on each system, although most of them were set to auto.
Another test I did was to see how ClearType compares to Quartz through my wife's vision. This was an attempt to test my previous post. I know that she had LASIK for her near-sightedness, and she also has some amount of astigmatism, but she usually does not wear her glasses. And guess what: Quartz looked fuller to her on the MacBook. One could argue that it is a matter of preference. But I am curious about what shapes that preference, and I highly suspect that vision really does play a role. I suspect that her astigmatism introduces some amount of blur to the visual information presented to her, so that when she looks at ClearType, it looks thin and undefined to her. On the other hand, when she looks at Quartz, the emboldening of regular typefaces by that technology looks fuller to her. Also, the blur that is already there in her vision probably cancels out some of the blur introduced by Quartz.
Of course, now I am curious to see how ClearType would fare with different displays as well, although my experience with it does seem more uniform than with Quartz up to now.
Although a 20" ACD is on its way to my residence at the moment, I agree that this seems to be an unfair game that Apple plays. I'm sure that if the business will were there to compensate for today's display technology while at the same time preparing for the higher resolution displays of tomorrow, the engineering skill would be found to do it.
On a last note, when compared to the new LED-backlit 24" display, the older ACDs rendered fonts better, I thought. I suspect display technology could be at play here. I am not sure what type the new display is (IPS, SVA, TN), but the fact that there is some amount of grain that comes with the older S-IPS ACDs may be contributing to smoother Quartz performance on the older ACDs. Did Apple engineers design Quartz only while sitting in front of higher resolution S-IPS monitors? Beats me.
There is an excellent book about typography by the German typographer Jan Tschichold, called Typographische Gestaltung.
I'm a programmer, and I have to say that for programming, ClearType is just much better to read. I wish ClearType were selectable on OS X, so that when I program on OS X it would look as good as on Windows. It's the one thing I envy Windows for.
Respecting the SHAPE of a font is not always respecting the designer. When metal-forged typefaces were designed, there was a separate design for each size, because type design is an optical science, and a shape that looks good at 12pt does not necessarily look good at 72pt (actually, it doesn't).
Typefaces are meant to work as a system and serve a purpose: be readable. ANYTHING that stands in the way of that purpose should be removed.
I agree with Brian. Mac needs to offer programmers better font rendering. I work for a large programming house, and we all have the same complaint about OS X – it's simply too blurry when anti-aliased.
In Windows, ClearType is vastly superior for reading code – it's clean and crisp and designed for reading, perfect for coding. On the Mac I have to turn off anti-aliasing in the IDE, it's that blurry, and it gives me a headache.
Now, at least with Windows you have the option – you can turn ClearType off and have blurry fonts if you want. On the Mac you just have to accept their argument that they do rendering best, and that's that. But for programmers they don't do it best, not by a long chalk. Which is a shame, because just about everything else about OS X is superior to Windows.
To be honest, Ubuntu (or Linux in general) slaughters both Microsoft and Apple in the font rendering field. In Ubuntu, users get to fine-tune font rendering, from DPI to subpixel/greyscale/no-AA options.
Vonnick, FreeType does a great job at larger sizes, but it falls flat on its face at small sizes, particularly because it doesn't support TrueType hinting (patent issues). Instead, FreeType uses autohinting, which is generally quite good. Unfortunately, autohinting falls apart at about 9pt or below; you may note that the default GNOME font size is 10pt for this reason.
FreeType does quite well in World of Warcraft, though.
Honestly, Microsoft's classic (GDI) font rendering is quite good at small sizes, but it sucks at large sizes. ClearType does a poor job of antialiasing vertical features (in fact, it doesn't antialias them at all), compared with just about any competent rendering engine, Mac OS X included.
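To illustrate the horizontal-only part, here's a crude sketch – my own toy, not Microsoft's algorithm, and it omits the color filtering real ClearType applies to tame fringing. Coverage is sampled three times per pixel horizontally (once per R/G/B stripe) but only once vertically, so a vertical stem edge picks up intermediate shades while an edge running in the vertical direction changes abruptly from row to row:

    /* Toy subpixel rasterizer in the spirit of ClearType -- my own sketch.
       Three coverage samples per pixel horizontally (R, G, B stripes),
       one vertically. The hypothetical "glyph" is a shallow diagonal edge. */
    #include <stdio.h>

    static int inside(double x, double y) { return x + 0.5 * y < 8.0; }

    int main(void) {
        for (int py = 0; py < 4; py++) {                /* one sample row per pixel */
            for (int px = 0; px < 12; px++) {
                int r = inside(px + 1.0 / 6, py + 0.5); /* red stripe   */
                int g = inside(px + 3.0 / 6, py + 0.5); /* green stripe */
                int b = inside(px + 5.0 / 6, py + 0.5); /* blue stripe  */
                putchar(" .:#"[r + g + b]);             /* 0..3 stripes covered */
            }
            putchar('\n');
        }
        return 0;
    }

Each row shows a gradual transition across the edge, but from one row to the next the edge simply jumps – there is no vertical smoothing at all.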
Microsoft's new (WPF / Silverlight) font rendering is absolute crap. WPF fonts are somehow simultaneously aliased and blurry, which seems like it shouldn't be possible. WPF makes it possible.
The BEST font rendering, in my opinion, is actually in Flash. Flash manages to do well at all sizes. Incidentally, Flash also apparently doesn't use native TTF hints, but it does a much better job than FreeType at autohinting.
OK. It's been about 6.5 months since my last post on Nov 06, 2008. Meanwhile, I did switch to a Mac and have been using it for about 5 months now.
In all honesty, today I find that I like the OS X font smoothing better. Just as adamantly as I sided with ClearType back then, today I would vote for Quartz over ClearType in a heartbeat. As in my previous ClearType-loving posts, today I could write just as much about the benefits of Quartz. Inconsistent, right?
Well, all I can say today in light of my experience is that it is mostly about habit. Today I'm not even sure there are absolutely better design choices regarding font smoothing. Unless some design group teams up with a group of optometrists and neurologists, samples the population to determine the best working middle ground in font smoothing, and only then applies that knowledge back to the font rendering algorithms, I don't think there will be a clear winner in this department. Our brains are creatures of habit. So what happens down the line when you use an OS? I guess your brain starts adjusting and compensating for what appears to be a bad design in the beginning. Once your optic processing does the switch, what you were used to to begin with starts looking worse...
If you are an obsessive geek like me who is considering a switch to a Mac and has issues with Quartz font smoothing, fear not. You'll get used to it. Like me, you may even start liking it more than ClearType.
FreeType, which is the font renderer used in all Linux distros, offers different hinting and rasterization methods, which should cover all the variants currently available:
Anti-aliasing can be disabled, or used in subpixel or greyscale mode, with any hinting style.
"no hinting": preserves the font's face, but hurts readability (OS X style, but with a different gamma contrast).
"slight hinting": hints only vertically, so the font keeps its face but looks less muddled. This is also the default setting in Ubuntu 9.04. (Vista style, though not exactly – a bit thicker.)
"TrueType hinting": works pretty much like legacy GDI, hinting the font according to the TrueType specification. (Windows 95 style, not perfect.)
"autohinting": an algorithm tries to work out the best way to hint the font. This one sucks because all fonts tend to look the same at low point sizes, but it's the most pragmatic – for font haters. I haven't seen any distro using this lately.
At least on Linux, any taste in font rasterization can be fulfilled; the sketch below shows roughly how those styles map onto FreeType's load flags. End of sales pitch :D
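A minimal sketch of that mapping – the flag names are FreeType's real API, but "DejaVuSans.ttf" and the size are placeholder assumptions, and error handling is trimmed for brevity:

    /* Mapping the hinting styles above onto FreeType load flags. */
    #include <ft2build.h>
    #include FT_FREETYPE_H

    int main(void) {
        FT_Library lib;
        FT_Face face;
        if (FT_Init_FreeType(&lib) ||
            FT_New_Face(lib, "DejaVuSans.ttf", 0, &face))  /* placeholder font */
            return 1;
        FT_Set_Char_Size(face, 0, 10 * 64, 96, 96);        /* 10pt at 96 dpi */

        /* "no hinting"       */ FT_Load_Char(face, 'g', FT_LOAD_RENDER | FT_LOAD_NO_HINTING);
        /* "slight hinting"   */ FT_Load_Char(face, 'g', FT_LOAD_RENDER | FT_LOAD_TARGET_LIGHT);
        /* "TrueType hinting" */ FT_Load_Char(face, 'g', FT_LOAD_RENDER | FT_LOAD_NO_AUTOHINT);
        /* "autohinting"      */ FT_Load_Char(face, 'g', FT_LOAD_RENDER | FT_LOAD_FORCE_AUTOHINT);

        /* After each load, face->glyph->bitmap holds greyscale coverage;
           use FT_LOAD_TARGET_LCD as the render target for subpixel output. */
        FT_Done_Face(face);
        FT_Done_FreeType(lib);
        return 0;
    }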
For crazy people: I found a Japanese site where someone offered an injector that replaces the GDI font renderer in Windows with FreeType – you can launch any app with it, and it looks Linux-ish. It's splendid :D
I'm a Windows user with Safari as my main browser for READING text. The difference, to me, is not about keeping the font shape versus the pixel grid, but the burden on your eyes when reading text (such as Wikipedia articles) at ordinary sizes (such as Arial size 11) for more than 4 hours.
Don't worry if you disagree with me – just try Safari 3 and Firefox 3 (or IE 6) on Windows XP with a matte LCD, and actually read Wikipedia for 30 minutes. You'll find the difference.
Subpixel rendering was not invented by Microsoft, and I'm not even sure they claimed it was, but they did see the advantages of applying the technology to font rendering. Also, Apple did not "abandon" the idea; it is used in their current font rendering tech as well. The unique invention in ClearType is how it adjusts the pixels to improve screen readability, while losing some of the WYSIWYG quality when printed. As someone nicely phrased it above: "Take yer pick folks."
What I want to know is why we can't have Mac rendering for word processors and publishing apps that make documents meant for print (e.g., Word, Adobe InDesign, etc.), and ClearType rendering for everything meant for the screen (websites/webapps, PowerPoint presentations, etc.).
Wouldn't the world be a better place if monitors rendered for functionality rather than for preference or style?
Maybe it's because I have a laptop, but I really like the Safari font rendering. That said, it's not really a huge deal. I am surprised so many people have noticed such a big difference. Again, maybe it's because I have a laptop.
Has anyone tried out the new Safari 3.0.2? The release notes say that changes were made to the text rendering. Does the issue get any better?
I've been playing around with Safari for the last day.
I prefer its font rendering to Windows'; my resolution is 1680x1050, and it's far easier to read text within Safari.
I'd usually move Firefox over to my secondary monitor, running at a lower resolution, in order to comfortably read a site.
For me, this font rendering alone is reason enough to ditch Firefox for Safari.
(Apologies if this has already been mentioned.)
When Microsoft released ClearType, I recall Steve Wozniak (Apple's co-founder) writing that he used an identical method 20-odd years ago on the Apple ][.
ClearType was not the result of superior Microsoft R&D, as some have suggested here. It's old technology which Apple abandoned (http://www.grc.com/ctwho.htm ) and which Microsoft later independently reinvented.
IIRC, Apple chose to avoid ad-hoc solutions like ClearType and instead develop a device-independent rendering (DIR) system that could render text or graphics on any display hardware – from projection screens to phone displays to 2400 dpi printers.
Quartz largely solves the resolution-independence part of a DIR system: clearly some people like what Quartz produces on their screens, while others prefer the device-dependent results offered by Microsoft. That's fine, but note that ClearType will *not* scale, so what you see on your screen won't be what you get – even on a printer.