June 12, 2007
I've finally determined What's Wrong With Apple's Font Rendering. As it turns out, there actually wasn't anything wrong with Apple's font rendering, per se. Apple simply chose a different font rendering philosophy, as Joel Spolsky explains:
Apple generally believes that the goal of the algorithm should be to preserve the design of the typeface as much as possible, even at the cost of a little bit of blurriness.
Microsoft generally believes that the shape of each letter should be hammered into pixel boundaries to prevent blur and improve readability, even at the cost of not being true to the typeface.
So we answer the question with another question. What do you respect more: the pixel grid, or the font designer? It's not surprising that Apple would side with the font designer, because Steve Jobs thinks Microsoft has no taste. But me, I'm a pragmatist. Given the ubiquity of relatively low DPI displays, I'm with Dave Shea. I side with the pixel grid.
Joel talks about the pixel grid, and how Microsoft's type rendering pays more attention to it. Speaking as someone who thinks a lot about the pixel grid, I have to say I think I'm coming around to the idea that Microsoft's ClearType simply works better.
Alright, I'd better qualify that quickly. Think about it this way – as a designer, you don't just set type in Photoshop and let it go, right? You tweak. You kern. You attempt to match the letters to the pixel grid as closely as possible to reduce the blurriness. Sometimes spacing suffers, and you have to choose between a slightly blurry letter with perfect spacing, or a more precise fit within the pixel grid with just slightly off spacing. I can't be the only one that leans toward the latter most times.
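The tweak-by-hand process described above can be sketched in code. A toy illustration (not Apple's or Microsoft's actual algorithm; the advance widths are made up): lay glyphs out at their true fractional advance widths, or snap each pen position to the nearest pixel, trading up to half a pixel of spacing error for sharper stems.

```python
# Toy glyph layout: faithful fractional positioning vs. pixel-grid snapping.

def layout(advances, snap):
    """Return the x position of each glyph origin for a run of glyphs."""
    positions, pen = [], 0.0
    for adv in advances:
        # Snapping rounds each origin to the pixel grid; the fractional
        # pen position is still accumulated, so error never exceeds 0.5px.
        positions.append(round(pen) if snap else pen)
        pen += adv
    return positions

# Hypothetical fractional advance widths for five glyphs at a small size.
advances = [6.4] * 5

print([round(p, 1) for p in layout(advances, snap=False)])
# [0.0, 6.4, 12.8, 19.2, 25.6] -- true spacing, blurry stems
print(layout(advances, snap=True))
# [0, 6, 13, 19, 26] -- crisp stems, spacing off by at most half a pixel
```

The snapped run drifts from the true positions by no more than half a pixel per glyph, which is exactly the "slightly off spacing" trade-off described above.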
And that's the difference here. ClearType is a closer match to what I do manually already. Yes, I prefer the way type on OS X looks; ClearType seems too sharp and overly blocky, the subtleties of the curves are lost and it's overly chunky. But, for the medium in which it's being rendered, it seems like a more ideal solution.
Dave's opinion carries a lot of weight here, not just because he's a well-known designer, but because the three citations he provides demonstrate just how common it is for designers to do exactly the kind of manual, per-pixel tweaks that ClearType does for us automatically. And it's not just an aesthetic choice, either – there's plenty of hard data to support the assertion that snapping fonts to the pixel grid improves reading accuracy.
A fascinating greyscale-only variant of this rendering technique, FontFocus, illustrates beautifully how subtle tweaks can "snap" fonts to the pixel grid for better readability.
Typography, if you haven't figured this out by now, is really complicated. It's one of the few areas of "computer science" that actually justifies the title. I highly recommend reading the entire FontFocus article, as it's very instructive.
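As a rough sketch of the idea, not the FontFocus algorithm itself: treat each pixel's grey value as the fraction of that pixel covered by an ideal black stem. A one-pixel stem straddling a pixel boundary smears into two half-grey columns, while the same stem aligned to the grid renders as one solid column.

```python
# Greyscale coverage for an ideal black stem spanning [left, right],
# in pixel coordinates, across a short row of pixels.

def coverage(left, right, n_pixels=6):
    row = []
    for px in range(n_pixels):
        # Overlap of the stem with the pixel covering [px, px+1).
        overlap = max(0.0, min(right, px + 1) - max(left, px))
        row.append(round(overlap, 2))
    return row

# A 1px-wide stem straddling a pixel boundary: two blurry 50%-grey columns.
print(coverage(1.5, 2.5))  # [0.0, 0.5, 0.5, 0.0, 0.0, 0.0]

# The same stem "snapped" to the grid: one crisp solid column.
print(coverage(2.0, 3.0))  # [0.0, 0.0, 1.0, 0.0, 0.0, 0.0]
```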
Dave Shea thinks the pixel grid will be moot once high resolution displays become ubiquitous. I wholeheartedly agree, although I'm unsure when exactly that will be. The history of display resolution increases has been quite modest so far. Ten years ago I was using a single 17" 1024x768 display; now I'm using three 20" 1600x1200 displays. So you'll forgive me if I'm not overly optimistic about this theoretical jump from 100 DPI to 200 DPI.
I don't understand why Apple is asking us to sacrifice the present at the altar of the future. Can't we have hinting at low resolutions, and accuracy at high resolutions, too? Snapping fonts to a pixel grid may very well be irrelevant when everyone is luxuriating in the glow of their 200 DPI monitors. Until that glorious day arrives, respecting the pixel grid certainly makes text a lot more readable for those of us stuck in the here and now.
Posted by Jeff Atwood
microsoft will feel significantly stupid when the average person has a monitor with 5 times the DPI that's around now. pixel... grid... riiiight... no one will care in a couple years.
for now, i find the osx fonts a lot more pleasing to the eye, if they're blurry i just make them bigger, it's better for your eyes anyway.
Ever print a Word Doc that you've perfectly squeezed into 2 pages? Ever have it print on more than 2 pages? Is this still a problem in Vista and Office 10?
I haven't used Office in years but generally I go from Doc-PDF-Printer to work around this problem.
Is this a problem in Mac OS X?
So there are two issues:
1) Layout: Picking the pixel grid that will be rendered to (kerning, leading, tracking)
2) Rasterization: Picking pixels to fill (grid-fitting, subpixel rendering, anti-aliasing, etc)
If you are more conservative with text layout you usually need to be more aggressive filling pixels during rasterization to make your fonts look good.
@Michael Graham Richard. FreeType is generally used in the Linux world. To my eye, it looks closer to the Apple way of rendering.
I've worked on projects that used FreeType for video games in the PC world. However, we rasterized glyphs out to a texture that was then hardware rendered.
Hardware rendering has its own quirks. It's called a pixel shader in the DirectX world and a fragment shader in the OpenGL world. Fragment shader is the more correct term because multisample anti-aliasing (MSAA) happens after the fragment shader.
MSAA is user controlled at the driver level. So basically all of the text that you've gone and painstakingly anti-aliased gets blurred, because the hardware attempts to anti-alias it again.
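The double-filtering the commenter describes is easy to see in one dimension. A toy simulation (made-up numbers, not a real GPU pipeline): a hard edge filtered once has a narrow ramp, but an edge that was already antialiased and is then filtered again spreads across more pixels.

```python
# Toy stand-in for the hardware's extra anti-aliasing pass: a 3-tap
# box blur, clamped at the borders.

def box_filter(row):
    out = []
    for i in range(len(row)):
        window = row[max(0, i - 1):i + 2]
        out.append(round(sum(window) / len(window), 2))
    return out

aliased     = [0, 0, 0, 1, 1, 1]    # hard edge, never antialiased
antialiased = [0, 0, 0.5, 1, 1, 1]  # edge already smoothed once

print(box_filter(aliased))      # [0.0, 0.0, 0.33, 0.67, 1.0, 1.0] -- blurred once
print(box_filter(antialiased))  # [0.0, 0.17, 0.5, 0.83, 1.0, 1.0] -- blurred twice: a wider, fuzzier ramp
```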
The One Laptop Per Child has a 200 DPI display (1200x900 on a 6"x4.5" screen).
I think I prefer the Apple version, the fonts in windows just seem too thin really. And it makes boldness _far_ too extreme. But overall I'd never really noticed it even when switching between my os x to windows. Maybe it works better on Apple's displays somehow? (Although I can't possibly think how)
Interesting fact though, Ubuntu gives you four options of different font rendering methods to pick from. Has examples of the effect each has on it too, which is interesting to look at in the light of all of this debate. You can see it on tim-perry.co.uk/fonts.png .
Man, posting anything about the Mac really brings the morons out of the woodwork.
See, this is why I don't have comments on my blog, Jeff. ;)
Sounds like the old form vs. function debate to me. I think it is just a philosophical difference between MS and Apple. And then there is the *nix crowd which has neither.
Camz: “when you look a bit deeper into OS X you discover that their display is managed with display postscript.”
Not true. That was true for NeXT, OS X is not using it in any way. The display model of OS X is similar to some (early version) of PDF, but Quartz is no “display PDF” even if some have used that term.
Fred: “Apple has had WebKit running on Windows for years, it's what renders the iTunes Music Store inside iTunes.”
Not true, at least not a few years ago according to a leading Webkit developer. http://weblogs.mozillazine.org/hyatt/archives/2004_06.html#005666
That could change over time, though. If it had, I would guess the Apple blogosphere would have noticed it in a heartbeat :)
Then about the fonts: they are highly subjective. Color perception, the panel, antialiasing settings, personal preferences, personal history with aliased text, etc. all have a *big* impact on the outcome.
In some ways I find the ClearType way better, but in some ways the Apple way... I frankly don't like either very much. OS X rendering is much too fuzzy; ClearType is too light, and its quality seems to depend more on the panel and other issues than OS X's does. Maybe one of them will some day be tuned more to my liking, but that would mean finding some kind of a middle ground.
BTW, I’m a Mac user too, sorry for posting a moronic comment.
Ever print a Word Doc that you've perfectly squeezed into 2 pages?
Ever have it print on more than 2 pages? Is this still a problem in
Vista and Office 10?
It's not really an OS-related issue, and I haven't used Office 10 at home since Office 2003 came out (and I don't print at work unless someone's thick-headed; I don't even use Word for much work-related stuff other than my resume, because they want it in a .doc file with a specific format). Office 2007 (12) hasn't caused this problem for me, but I have a tendency to preview my documents to make sure I've set up all of the margins and so forth before I print, especially if I'm planning on turning them over to someone else. I really tend to think that Word has some issues when used for text layout, but it seems to have gotten better in the last decade. Word also opens in the "Print Layout" view (at least for me) rather than "Draft", so it has a tendency to be more accurate with page breaks and margins than it used to be (simply because people expected it to be in the layout mode when it wasn't, and didn't always know there even was a separate mode).
One thing I wanted to point out for those who primarily use Macs and only occasionally use Windows - are you sure ClearType was turned on? It's on by default in Vista, but in Windows XP it had to be explicitly enabled. It's possible your bad experiences with font rendering on Windows has to do with the un-antialiased Windows XP text.
The trouble is that non-96-dpi displays still suck. The fonts may scale (depending on whether the developer implemented the call to CreateFont correctly!) but the pixel-fitting algorithm then means that the spacing changes subtly and controls may not be large enough for the expanded text. Also, the taskbar icons are scaled using the StretchBlt API's pixel-doubling mechanism rather than picking a large icon and scaling it down.
Windows Vista has a new scheme for scaling the display but that results in really fuzzy results because ClearType is applied *before* scaling, and you get the fuzzy icons problem too. Also, it's turned off by default (at least it was in RC2, I don't believe I've checked this in RTM yet) and XP's system is used by default. I think the new scheme requires hardware acceleration (Aero) to be turned on too.
The new scheme probably works great on 300-400dpi displays. It sucks at 120dpi.
Remember PIXEL FONTS? That's right. PIXEL FONTS. Google them. Get Flash. Enjoy.
"What do you respect more: the pixel grid, or the font designer?"
You went with the Grid? Time to ditch that Matrix screensaver, Jeff.
I think the font rendering of Safari on Mac OS X is better than that of Safari on Windows.
I prefer the Mac way. Some days I think Windows still looks like the old Geos for CBM64 ;)
A thought, if Safari has been ported to Windows did they port Cocoa? That would be a great scoop I think :)
Monitors with large numbers of dots per inch simply have no market at the moment, because we lack an operating system and related software written in a resolution-independent manner.

Take icons as an example. Most icon work I've seen is done with pixel maps. This means that the ideal display for icons has one pixel of icon image per pixel on the screen. Sure, you can scale pixel images, but it doesn't look ideal.

If you were to take software that looks good on a 100 dpi monitor and run it on a 200 dpi monitor, the icons would look best at half the size they had on the 100 dpi monitor. This is most likely not the effect you were designing for. Your goal was most likely to create an inch-wide graphic that appears an inch wide on anything: different monitors, printouts, whatever.

The dream would be to encode your data in a resolution-independent manner, with something like vector graphics, so that regardless of the screen resolution you would still see your icon taking up the same amount of space on the screen.

We need an OS and software that are resolution independent before we can see higher resolution monitors. Because most GUIs are designed in a resolution-dependent manner (all GUIs I've worked on at some point specify something in terms of a number of pixels rather than in units of inches), upping the resolution would suddenly make a lot of software look bad.

We're not seeing high resolution monitors for a simple reason: there are a lot of negative effects if they become widely used.

What we're seeing with Apple's fonts is Apple trying to move to a resolution-independent world. It's a hard step to take, as there is so much resolution-dependent legacy code out there. It's hard to add new stuff like resolution independence while not breaking everything else on the planet.
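The resolution-independent approach described above boils down to specifying sizes in physical units and converting to device pixels only at draw time. A minimal sketch (the 0.32-inch icon size is a made-up example):

```python
# Convert a physical size to device pixels for the current display.

def to_pixels(inches, dpi):
    return round(inches * dpi)

ICON_SIZE_IN = 0.32  # icon specified as about a third of an inch, not "32 pixels"

for dpi in (100, 200):
    print(f"{dpi} dpi: {to_pixels(ICON_SIZE_IN, dpi)} px")
# 100 dpi: 32 px -- same as a hard-coded 32px bitmap
# 200 dpi: 64 px -- still a third of an inch; a fixed 32px bitmap would shrink to half size
```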
"The iPhone, which runs OS X, the version of Safari that was released as Public Beta on Monday and uses the same font rendering technology, has a 160 DPI display. The future is on June 29th."
Y'all speaking about a magic leap from 100 to 200dpi! Please be aware that classic fonts were designed when there was no "resolution" concept - typesets were actually lead castings. These days, professional print quality begins at 1200dpi, going up to 3600 dpi. A 600 dpi laser printer is barely enough for printing proofs, and any fonts on 300dpi lasers look like crap.
So, let's face it, there is no point to preserve the print-oriented font designs on displays. I mentioned "greek text below..." setting before, to emphasize the fact that designing page layouts and reading text are two completely different tasks.
For monitors, we need a totally different design approach, which ClearType achieved just fine. In Print Preview, though, rendering as close to the printed shapes as possible is really a good idea.
if I am not mistaken, a 17" screen would give you a resolution of 1280x1024 (which is what I use), while it is a 15" that would give you the resolution you spoke of, 1024x768.
It's funny to see all those blurry fonts, and that people tolerate them. There's still nothing as crisp as good old X11 bitmapped fonts, and those are what I prefer in my text editor or terminal window. In the browser, whatever freetype is giving me with those Bitstream Vera fonts on Ubuntu is fine.
"Steve Jobs thinks Microsoft has no taste. "
Not only does Steve Jobs think that, Bill Gates admitted that he and (by association) Microsoft have no taste.
Months, nearly a year, ago O'Reilly listed "Fonts Encoding" on their 'be out real soon' list. It's still on that list; July or August. I've kept wondering: a) why the subject deserves its own text, and b) why anybody would care enough to buy it. Well, I guess it's difficult enough that they can't get it out the door. And this thread demonstrates that some folks care way more than I do. So long as Proggy fonts are supported.
@John and The Other One
John is right! YEAH! There is absolutely nothing wrong with raster fonts if you're writing code or running some terminal windows, that is, doing some useful work instead of hanging out in blogs. Even more, you'd most likely want these fonts monospaced!
Raster fonts are very fast and never blurry -- they are designed for display's pixel grid. You may not get 11pt size, but the choice between 10 and 12pt is enough.
Is it plausible that some of the reason for the difference traces back to some kind of patent issue? I can't directly recall or reconstruct the status of the controversy, but I recall one from when ClearType was first announced.
Maybe if Apple did it the "the better way," they'd be risking patent expenses for licensing or lawyers?
I don't think Apple ported Cocoa, they just rewrote the GUI in Microsoft's C++ (which makes it freak out in Wine). Apple has had WebKit running on Windows for years, it's what renders the iTunes Music Store inside iTunes.
not knowing anything about programming with Quartz, would altering the way it renders fonts be something that a low-level hacker could possibly change? Or are we doomed with blurriness until 2012 or whenever 300dpi displays finally show up?
Not trying to defend Apple on their decision, but when you look a bit deeper into OS X you discover that their display is managed with display postscript. They are using a render engine that has been designed for print, and the techniques that they use DO look good in print, where we have much higher DPI available (600 on low-end devices, 1200 on mid-level, and 2400 on high-end). It should be no surprise that they wind up using the techniques for print, since their display is using a print-based render engine. The same core is used for printing from OS X as well, so to make the display behave/render differently would require changes to that core, something I suspect they are not willing to pursue.
I noticed that you can turn the anti-aliasing off totally on the screen (i.e., "defaults write NSGlobalDomain AppleAntiAliasingThreshold 100"), but when you go to "print" it renders the print preview in its full, smoothed glory regardless of whether or not on-screen antialiasing has been forced off.
So on some level, the on-screen display is handled differently than the rendering that's being used to print. I think the question really comes down to whether Apple wants to invest the resources to tune for low DPI screens and provide end-users with better tools to adjust the rendering to their own tastes.
I'm curious how the font rendering in Linux compares? I haven't used Linux in a couple years, but I remember being impressed by how nice fonts looked on Slackware (version 9 or 10, can't remember).
Font rendering in windows is Beautiful...So Clear, so sharp......It takes my breath away. The font rendering on OS X gives me a headache.
As a recent Windows to Mac convert, I have to say the one thing that really strikes me when I have to use a PC again, is how lovely the fonts are on the Mac.
Everything looks a lot less harsh and jagged. I can see what you mean about the blurriness, but it has never caused me reading problems, and if anything it's a lot more restful on the eye.
Also, I used to hate the way that when using Word on the PC, if you increased the font size of the whole document, it would suddenly appear to be bolder or lighter, seemingly at random.
Font rendering on Windows looks like crap. It's too sharp. It feels abrasive. Mac font rendering is much smoother and nicer.
Use zoom, people. Use zoom.
User of Mac, Windows, Linux
I concur with Spolsky's explanation: it all comes down to whether you respect the font designer, or the pixel grid. As a daily user of both OS X and Windows, there are times when I appreciate each different approach.
By the way, Camz, OS X is not built on Display PostScript per se. NextStep and OpenStep used to be based on Display PostScript, but Apple ripped all that out (due to licensing issues with Adobe) and rewrote it themselves into what might be called "Display PDF". Having said that, though, your main point still applies: the whole display architecture boils down to something that operates in "real space" and is not necessarily aligned to the pixel grid.
For me, when I'm writing or doing page layout (especially for something that's going to be printed), I appreciate the Apple approach, because in this context ClearType's inaccuracies drive me crazy. When I'm reading a lot (and I'm not so interested in the aesthetic appearance of the text, just its content thank-you-very-much), then I appreciate ClearType. Frankly I wish both approaches were available in the same OS, and that I could switch between them depending on what I was doing at the time.
Here's the thing, Jeff. When dealing with anything that has a hint of Apple vs. Microsoft you better expect a good chunk of logic, reason and objectivity to be thrown right out the window. The relationship of Apple and Microsoft fanboys to their respective companies is like that of a parent to a child.
For instance, how many parents, in a public discussion, would admit that their child is a bad student, sucks at sports, is lazy, has a bad attitude, and is uglier than a frog's butt? I've noticed fanboys are the same way --- which is funny --- because in a line of work where logic, reason and objectivity are so important, there can be an alarmingly substantial lack of it sometimes among a lot of programmers.
Anyhoo, I'm not airing my opinion on this topic. I just think it'd be nice if some people would peel back their built in or cultivated prejudices and really, really take a deeper look. Always look deep, always question.
I recently purchased a Dell Latitude D820. It has a 15.4" LCD and I was able to choose a WUXGA display (1920x1200). This works out to a 147DPI. Initially at 96 dpi all of the fonts were impossibly tiny. I was afraid I would be slouching the entire time using my laptop. I bumped it up to 120DPI setting in Control Panel and I could read!
Then, I dug around, and asked some people and determined that it was ~147DPI and so I went home to the laptop determined to set its DPI. When I picked Vista's custom setting I was greeted with a ruler! So I asked my wife for a ruler and I dragged until it looked right. Lo and Behold the number looked best at 147DPI.
Fonts now are actually LARGE and VERY readable. I'm VERY impressed. Vista got this VERY right. I love it. and I love my near 150DPI display. So I am 1/2 way to that 100DPI to 200DPI leap.
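The ruler result checks out: DPI is just the diagonal pixel count divided by the diagonal size in inches. A quick sanity check:

```python
import math

def dpi(width_px, height_px, diagonal_in):
    # Pixels along the diagonal, divided by inches along the diagonal.
    return math.hypot(width_px, height_px) / diagonal_in

print(round(dpi(1920, 1200, 15.4)))  # 147 -- matches the ruler measurement
print(round(dpi(1024, 768, 17.0)))   # 75  -- the old 17" 1024x768 display from the post
                                     #        (nominal diagonal; CRT viewable area is a bit less)
```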
Apple's Viewpoint: Garbage in, Garbage out
Microsoft's Viewpoint: Garbage in, (arbitrary deterministic magic on behalf of Microsoft), Something nice
I would like to point out that Apple's viewpoint is shared by Unix hackers, disillusioned programmers, and anyone who validates their webpages as XHTML Strict. Microsoft's viewpoint is responsible for many instances of IE poorly rendering CSS2.
It would make sense to have everything align to the printing grid if you're creating something for print.
Personally, I have created very little in my career, or even my personal life, that is intended for print. Almost everything I've written since I last attended school was meant to be read on the screen, or was meant to be interpreted by a compiler. Therefore, I prefer that most things conform to the screen I'm reading on, rather than a page I'll never print.
Furthermore, a web browser is not a design interface, and things designed for the web (and displayed in a browser) should look good on the screen.
That being said, I find it interesting that the images displayed under the Safari discussion look worse on my 1600x1200 CRT display than Safari itself looks on my lower resolution (1280x768? I don't remember off-hand the resolution of the 17" widescreen laptop) LCD display. Of course, for the most part, most font-smoothing techniques look better on LCD than CRT. I think the problem for me may simply be that when something looks fuzzy on the screen, my eyes try to adjust focus to bring the words into focus, and this eventually gives me a headache.
Interesting. And it definitely is complicated.
I don't think it's true that Mac font rendering uses no hinting at all. I duplicated some of the FontFocus samples in TextEdit. OS X's 11-px Times pixel-grid alignment is almost identical to the FontFocus antialiasing, but the OS X display also benefits from subpixel smoothing, and I think it is superior. The 9-px Helvetica sample wasn't as good on the Mac, though.
You can see my samples at http://zajac.ca/fonts/ .
Also note that the fontblog study you link to does *not* indicate that snapping to pixels improves readability, but only that ClearType subpixel rendering improves readability over no antialiasing. The summary does not indicate whether the aliased text in the study was hinted (which snaps to the pixel grid, regardless of smoothing) or not.
What's interesting about this is often, for professional printing, True Color is preserved over Shape. For example, even in the latest version, Photoshop's less-than-100% zoom does not do any anti-aliasing. The reason is anti-aliasing introduces colors that are not really in the image, and this is generally frowned upon by many photo and print professionals who need exacting color accuracy between the monitor and the output. Photoshop's zoom will make finely-detailed images appear to be broken -- it will sacrifice the true shape of the image to preserve the true color. Photoshop does have many levels of optional font anti-aliasing, but these "bake" the anti-aliasing into the image -- it is NOT just for display on a monitor -- so I believe this is mostly for web work.
Color accuracy really isn't an issue when it comes to something mundane as rendering a web page, but it's an example where "respecting the font designer" isn't necessarily the "more professional" way of doing things.
Actually I like my Mac's text rendering far better than that of any PC I'm using.
You make the whole issue sound much more dramatic than it is, I think.
Apple's font rendering doesn't scale, and that's why I am quite happy in saying that Microsoft has got a design decision right, for once. If an OS has to remove a styling under a certain pixel size, then it just can't be the right way to do things. I'm still keeping my brand new MacBook Pro, though.
Apple's Viewpoint: Garbage in, Garbage out
Microsoft's Viewpoint: Garbage in, (arbitrary deterministic magic on behalf of Microsoft), Something nice
LOL, it's funny because it's true..
You can see my samples at http://zajac.ca/fonts/
This is very cool, and I agree with your conclusions on that page. I don't think a greyscale-only approach can truly compete with the 3x effective increase in horizontal resolution that RGB subpixel approaches deliver.
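A much-simplified sketch of where that 3x gain comes from (real ClearType also filters the result to tame color fringing, which this omits): compute coverage at three horizontal samples per pixel, then let each triple of samples drive one pixel's R, G, B stripes.

```python
# Map 3x-horizontal coverage samples onto RGB subpixel intensities
# (0 = fully inked, 1 = background white), for black text on white.

def subpixel_row(samples):
    """samples: ink coverage at 3x horizontal resolution, length a multiple of 3.
    Returns one (r, g, b) tuple per output pixel."""
    assert len(samples) % 3 == 0
    return [tuple(1 - s for s in samples[i:i + 3])
            for i in range(0, len(samples), 3)]

# An edge that falls a third of the way into a pixel: greyscale rendering
# would show uniform grey, but the subpixels record where the edge sits.
samples = [0, 1, 1, 1, 0, 0]  # coverage for two output pixels
print(subpixel_row(samples))  # [(1, 0, 0), (0, 1, 1)]
```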
Not having actually used Safari on Windows yet, I can't comment on how the Apple rendering works on Windows compared to my impression of it on the Mac.
While I can understand differences in taste and approach, there is one area in which Windows has not made an effort at all, and that is double-byte font rendering. I work in both English and Japanese on both Windows and Mac environments. The Japanese on Windows looks like something out of dot-matrix days--almost unbearably bad. Turning ClearType on only makes the western language fonts look better. It does nothing for Japanese (or other double-byte languages). I was actually looking forward to Vista in the hope that they would do something about that, but from all the Japanese Vista PCs I've seen so far, it appears they still don't care. How hard can it possibly be? The underlying font technology is the same for both western and double-byte fonts. So now (and for the foreseeable future) I will continue whenever possible to do all of my Japanese computing tasks on my Mac, which renders the same with all languages and fonts and looks very nice, in my opinion.
Please stop ignoring the fact that the iPhone has a 160dpi screen, multiple-times zoom functionality on webpages in Safari (and maybe other things like Mail), and will be in real customers' hands in two weeks.
I use both a PC and a Mac, and I have to say I definitely prefer the font rendering on Windows over OS X, because it is _SO_ much easier to read...
I'll give one example... A friend of mine was working on a script on a BSD box from his Mac, and literally spent hours and hours trying to figure out why it wasn't working. He asked me to look it over, and I found the problem in under a minute... It turned out that he had typed a - instead of a . in a domain name and on his Mac the two looked virtually identical -- both gray blobs, so he couldn't find the problem. From my PC I spotted the problem a mile away.
Not only that, but when you work in small fonts like I do (to maximize screen space) things become pretty much illegible below about 9 pt on OS X because it ends up as a garbled gray mess, but you can go to 7 pt (or sometimes lower) on Windows and still have readable text.
I agree that in software like page layout programs that the Mac approach may be better, but how many of us are actually doing that? Shouldn't readability be the primary goal for 99%+ of the population?
He asked me to look it over, and I found the problem in under a
minute... It turned out that he had typed a - instead of a .
On my previous job, a year or two ago, I was working (not willingly, though) in an OS X environment... and you have no idea how many hours I've spent trying to figure out a problem, just to find out that it was a typo similar to the one your mate had. It was hell for the eyes too... changing IDEs (native or Java based) didn't help, while turning on antialiasing would eat up the memory, which eventually would slow down the system. Increasing the font size also didn't help, as it meant giving more attention to the scrollbar than to the work.
if I am not mistaken, a 17" screen would give you a resolution of
1280x1024 (which is what I use), while it is a 15" that would give
you the resolution you spoke of, 1024x768.
unless you're using a widescreen monitor, or a higher resolution monitor. Personally, I thought my 17" widescreen was 1280x800, but it turns out that it's 1440x900 (16:10 instead of 16:9). I've only been using widescreen monitors for a short time so I'm not as familiar with the resolutions, whereas 1024x768 and 1152x864 are resolutions I've used for years on 4:3 CRTs (and 1600x1200 for the last 5 years or so).
Unfortunately, none of it really matters all that much. I can't run Safari on the system that has the CRT on it for a more direct comparison (XP and Vista only, supposedly), and the program is only useful for browsing the built-in bookmarks on my Vista machine with the LCD screen. Maybe I'll try again when the program comes out of beta, although I still have issues with the use of the brushed metal interface on Vista. Of course, I also still think that a browser's font rendering should be optimized for reading on the screen, rather than reading on a page I'll never print.
Yikes! I'm still on a single 17" 1024x768 monitor, though my laptop screen is significantly better. :)
Actually, Steve Jobs knows enough typography to know that we do not read letter-by-letter but read more like one word-pattern at a time.
Keeping the typography correct is not a matter of "pleasing the (font) designer" - it is a matter of readability. Unless you do actually spell your way through letter-by-letter when you read something.
Since there are clearly different opinions on this ClearType/grid and Apple/typeface issue, Microsoft and Apple should offer a setting the user can change, instead of just forcing it to look how they want it to, so the customer can decide.
The problem with designing a system that does two things at once is that it will suck at doing at least one of the things. Especially when that thing is font rendering, it seems.
The issues are not just limited to computer typography. They show up in all kinds of visual media where imaging technology is used. But let's start with an instance where non-computer media and typography collide...
A long time ago, a company that licenses and sells subtitled videos used to boast that their English-subtitled Japanese videos were better than their competitor's English-subtitled Japanese videos because they used italicized antialiased fonts that were optimized for on-screen readability on a standard TV set (and painted in a color called "optical yellow"). As strange as it is for a company to be trying to sell subtitling (as opposed to, say, the artistic qualities of the actual video), they were actually quite right: for quite some time they really were much easier to read than other subtitled videos.
Unfortunately, the company with the obscure but real technical advantage lost it when DVD replaced VHS and laserdisc. The custom subtitle overlays generated for the analog media failed to translate well to DVD's limited pixel depth for subtitles, and this combined with the loss of the smooth horizontal filtering from the analog processes in the video tapes turned the subtitles into jagged, blocky, garishly colored horrors.
If you watch a lot of subtitled movies, the technical quality of the subtitles can be important. Movies imported from Hong Kong are infamous for exasperated English-speaking audiences calling out "Speak darker!" at showings where the plain white text just disappears into some white object in the lower foreground during a critical piece of dialogue. If neither you nor your date speak Cantonese, you'll just have to guess what was said.
The VHS-then-DVD subtitlers designed for a system that had one pixel grid, then failed to redesign when the pixel grid changed (not just resolution, but interaction rules between adjacent pixels too). The Hong Kong subtitlers didn't even bother dealing with trivial and obvious signal saturation issues, let alone pixel alignment.
Movies are routinely edited for their delivery media, because all delivery media have different properties. The one technique we've probably all seen is "pan and scan", where about 1/3 of the image is simply sliced away to fit on a TV screen. Sometimes separate 5.1 DTS and stereo audio tracks are provided, instead of simply remixing the 5.1-channel to 2-channel in the DVD player. Entire scenes are sometimes reshot for VHS (IIRC there's a night scene in one of the Starship Troopers movies which is dark blue on DVD, but was reshot in green and on a different set for the VHS version). Most of the work in DVD mastering involves stretching color response curves, performing spatial filtering, or tweaking more obscure MPEG encoder parameters, because one scene or another is chewed up and spit out by the defaults. TV networks routinely use different graphics on analog carriers vs. digital ones.
So (getting back to the point now ;-) it seems obvious to me that we should expect all kinds of nasty artifacts from trying to mix media and intent in arbitrary ways. There's just no way to solve this problem by technical means. Sure, we could eliminate all the other font rendering systems, and all the other fonts, and standardize on one true display DPI, but those are political and economic problems, not technical ones.
The problem won't go away with 300DPI displays either, at least not until everyone with a lower resolution display goes away. 75DPI web pages will look like crap on 300DPI web browsers, and 300DPI web pages will look like crap on 75DPI web browsers.
In Ubuntu you can change to whatever anti-aliasing style you want. It's quite handy. Want an OS X look? Just use less hinting. Want an XP look? Just use more hinting.
I found that just changing the font in Safari's options (to Calibri) fixed most of the problems I had with reading text in Safari.
Just a follow-up to my earlier comments on the lack of Japanese font smoothing for Windows.
I installed Safari on Windows and was pleased to see that the font rendering in that browser carries through to all languages. Japanese looks just as good as it does on the Mac. It's too bad that Windows doesn't want to bother doing it itself. But I think the reason they don't is because of the adjustments to the pixel grid that have been discussed so much here. Japanese (and Chinese, of course) characters are often very complex. Adjusting lines to fit a pixel grid could potentially make them look like an illegible blob--a crisp blob, but a blob nonetheless. The dot-matrix style they continue to use today (I wish I could post screenshots here to show you) actually cheats in the way it displays the characters, often leaving out less critical lines at lower point sizes.
Although I can't side completely with either method where western text is concerned, for double-byte languages Apple's way of doing things is hands down the best (if for no other reason than that Microsoft doesn't do anything at all). I can't tell you how many times Japanese natives I work with who use only Windows have come over to my desk and expressed near awe at how good a page of Japanese text looks on my Mac (the majority of whom have since bought Macs for personal use).
While I probably won't use Safari on Windows as my main browser until its official release, I'll definitely be using it for Japanese web browsing on that platform.
Why is this even a question, and not simply an option?
I can't say more than that, because it honestly seems far too simple a concept to warrant all the brain power being spent debating this.
Way up near the start of this thread someone was commenting about Vista's new scaling algorithm for monitors that are not 96dpi.
The reason that some apps look nasty is that Vista requires apps to explicitly declare that they are resolution-independent. If they don't, they are given a bitmap display surface at 96dpi and then that surface is scaled appropriately to make it fit in on an alternative-dpi display. Because this scaling is done during compositing, any text the app rendered will have been rasterized already at 96dpi, and so if cleartype is enabled it's liable to look terrible. Microsoft really should have disabled cleartype for apps where this scaling is being applied, but whatever.
This is also why this only works if you have the fancy graphical effects turned on: the scaling is done by the compositor in hardware.
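The arithmetic behind that compositor scaling can be sketched in a few lines of Python (the function name is mine, purely illustrative; the mechanism is as described above):

```python
def virtualized_size(logical_px, display_dpi, base_dpi=96):
    """Vista-style DPI virtualization: a non-DPI-aware app draws into a
    surface laid out at base_dpi (96), and the compositor scales the
    finished bitmap by display_dpi / base_dpi at composition time.
    Any ClearType sub-pixel detail baked into that bitmap gets scaled
    along with it, which is why the text ends up looking smeared."""
    scale = display_dpi / base_dpi
    return round(logical_px * scale), scale

# A 100 px-wide control on a 120 dpi display is blown up by 1.25x.
print(virtualized_size(100, 120))  # (125, 1.25)
```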
When I first started using Safari, I wholeheartedly agreed with your earlier observation, but after a few days I have to admit I have defected, especially after seeing Safari's font rendering with fonts other than Arial and, to some extent, Verdana. Take Georgia: go to Wired.com and click on any article. Not only is Safari's rendering superior for larger fonts (IE7 has a lot of jagged edges), but even at smaller sizes (the article text font) Safari is a delight to read, while IE7 just elongates the characters vertically a little bit, and the RGB noise is also visible to some extent. Or take Trebuchet, or any other font. I have to admit, the consistency of Apple's font rendering technology has put me in their camp. (Come to think of it, the copyright message on your blog is shouting RGB noise in IE7, but looks pretty clean and more readable in Safari.)
I much prefer OSX's rendering, but I can see the Windows argument. I think it comes down to design purity vs. design practicality - with strong arguments for both sides. Take yer pick folks.
I personally think that one of the absolute strongest reasons to use OS X instead of Windows is that it has WAY more beautiful typography (that's what this is about). This makes an especially large difference on the web. I've always been fascinated that anyone using Windows can do serious web design; after all, text is very central to web design, so designing for the web on Windows must be like designing with a blindfold on. It's a bit like trying to write a cookbook when you only have foul ingredients to try the recipes with.
I can see why you would want to respect the grid more at small font sizes*, but at any larger size it's just silly, and almost disrespectful to both the font designer and the user. Legibility is not only about making the letters clear and distinct with high contrast; it is also (even more) about making it easy to perceive the nuances in their shapes, which, in a well designed font, create words instead of just a sequence of letters. ClearType changes and ruins the shape with cold mathematical algorithms; OS X preserves it pretty much the way it's supposed to be. About 99% of the work any font designer has put into their font is completely in vain when the font is used on Windows.
So I actually expected Windows users to be happy to finally get a browser with more beautiful typography. But I'm not really expecting a Windows user to even notice that it actually does look better. ;)
* This only applies to a certain frame of font sizes where the lines of the letters are about 1 px wide. There, you want to make sure that you don't get 2 px of grey blur instead of 1 px of black (as demonstrated by your example), but both above AND below this, proper AA not respecting the grid is much better for legibility.
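The footnote's point about 1 px stems can be made concrete with a small, hypothetical Python illustration of plain greyscale coverage: a stem aligned to the grid fills one pixel solidly, while the same stem shifted half a pixel smears into two 50% grey pixels.

```python
def stem_coverage(x0, width=1.0, pixels=4):
    """Fraction of each pixel column covered by a vertical stem
    spanning [x0, x0 + width) -- i.e. ideal greyscale anti-aliasing
    with no grid fitting applied."""
    cov = []
    for p in range(pixels):
        left, right = max(p, x0), min(p + 1, x0 + width)
        cov.append(max(0.0, right - left))
    return cov

# Grid-aligned stem: one solid black pixel.
print(stem_coverage(1.0))   # [0.0, 1.0, 0.0, 0.0]
# Stem straddling a pixel boundary: two 50% grey pixels ("2 px of grey blur").
print(stem_coverage(1.5))   # [0.0, 0.5, 0.5, 0.0]
```

Grid fitting (hinting) effectively nudges the stem from the second case to the first at these sizes; above and below that size range, as the footnote says, the nudging buys you less.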
"ClearType changes and ruin the shape with cold mathematical algorithms, OS X preserves it pretty much the way it's supposed to be."
That's true. But while ClearType ruins the shape with cold mathematical algorithms, it makes the letters more readable, on. the. screen.
This says it all.
The idea behind it is to make text more readable on screen.
You like the exactness of the font? Print it on paper. Use a 600 dpi printer. Use a 1200 dpi printer.
But don't impose the "beauty of typography" on me on a medium with a mere 72 dots per inch. No, thanks!
Because what I get then is blurred letters that are supposed to be more beautiful. Beautiful, yes, but from a farther distance.
For small font sizes on screen, give me ClearType. For long screen reading, give me ClearType.
It's not a battle Microsoft vs. Apple.
It's a question of readability. On. The. Screen.
And for me, this time MS did it better:)
A good argument for switching off font anti-aliasing on screen is actually this website. All I can see are blurry letters and words.
As a long time Windows user I must also say I much prefer the Mac philosophy of rendering fonts. It took a little while to grow on me, but now that it has, I can't go back. Often I look at a web page in Safari and I'm not even sure if I'm looking at HTML text or text in an inlined JPG.
It's only now that I really appreciate the aesthetic pleasures of good typography.
One poster mentioned that Japanese fonts in Windows are only bitmapped; that was true up through Windows XP. Windows Vista includes a ClearType-enabled Japanese font called "Meiryo". That being said, despite being a nice improvement over the ubiquitous MS Gothic, it still loses out IMHO to Safari's Japanese font rendering, which is simply stunning.
The two technologies ARE "available in the same OS". I work in InDesign (creating magazine pages for print) and it, of course, does not at all use Windows' built-in on-screen type rendering – it has its own engine. The text is not as legible, but the WYSIWYG is better. Which is great, and as it should be.
So for print, all this is a complete non-issue. There is no "excuse" for Apple's method; they really go for that look on the screen. It has nothing to do with "printing from Mac OS X"; that's a nonsense red herring that can only come from non-professionals.
I cannot stand Windows font rendering. Microsoft renders to the pixel grid. The result is that every diagonal line shows the pixels. There are lots of sharp black and white contrasts in those diagonal lines that don't belong to the character originally. Especially bad with a large 'S'. The pixel-grid-based appearance of Windows fonts confuses me. This is also an issue when you read both on-screen and from paper.
Apple renders to an overall grey impression. This approximation is easier to read. The visual system inside the head can reconstruct the characters from there. This also allows very fine grey approximations on screens with high DPI.
I guess there are people who prefer the Microsoft way of rendering fonts. The visual system of some people might benefit from the sharper contrasts. But really deciding which is better for the reader should be done only after several days of using them. Viewing habits are not easy to change in a few minutes. Many people are used to Microsoft's rendering, and Apple's rendering might look strange for the first hours, but after some time (hours or days) you might or might not like it. Then the decision is no longer based on mere viewing habits. Try using Apple's rendering for some time and then go back to Microsoft's. You might be surprised.
I'm not a print expert, but I am a programming and language expert. Never have I been the victim of a two-hour search for a dot-vs.-dash or similar character mixup at a small font size. First and foremost, the right tool, compiler, or lint would flush that kind of thing out immediately, so don't blame it on the font rendering system. That is the tool user's problem, or a lack of tool use. Second, the argument about reading on screen vs. print is total BS. I work on a screen, and not once since about 1989 or 1990 have I printed out volumes of code listings to review; it's total nonsense to argue that's a valid angle on this silly dispute. I want screen display to be easy to read, and I find sizing fonts up to a reasonable 14 point instead of 10 or 12 solves all problems. Zooming in or out of PDF-type documents removes that kind of problem too. The iPhone is a good example of what is to come: expect 200-300 dpi sub-5" devices in the next 12-24 months, and in 36-60 months you can expect that to ship in all form factors, making grid and pixelated solutions rather meaningless. At that point all design will embrace vectors, not rasters. Today, this 'half baked' OS X approach to rendering works great. If ya don't like it, turn it off in preferences.
most commonly used fonts on the web have specific hinting and spacings for small pixel sizes intended to increase readability and reduce reading fatigue. You lose the font designers intentions when you start distorting the letters just to flip pixels. Word shape recognition is reduced. etc.
not to mention that the ms way only produces acceptable results on blocky fonts (arial, verdana, tahoma) it is horrible with serifed or accented fonts as well as non western fonts like asian and arabic letterforms.
i could see how one could confuse individual letter clarity preferences with actual readability statistics, like wpm over x/time etc.
i prefer to see the letters as the font designers intended.
"most commonly used fonts on the web have specific hinting and spacings for small pixel sizes intended to increase readability and reduce reading fatigue. You lose the font designers intentions when you start distorting the letters just to flip pixels."
I'm sorry, but this is a contradiction. Who do you think created and intended the "specific hinting and spacings"? I recently spoke with a renowned font designer, and he complained that Apple ignores the hinting he (and other designers) specifically puts into his fonts. Mind you, I am a proponent of Apple's font rendering (for the most part), but this plain ignoring of pixel hinting gives Apple a bad name in font designer circles. At least, they could make it an option. I also do not like the way ClearType looks, but something along the line of FontFocus, which you linked to, seems to be a balanced approach, at least for smaller type.
An addition: In an article by John Gruber which was linked to before in the comments on the previous article (http://daringfireball.net/2003/11/panther_text_rendering), it is shown that Apple did change its rasterization technique to align fonts more closely on the pixel grid *vertically*. The "blurring" problems we are seeing are mostly horizontally between letters. What that means for hinting, I do not know.
There's a new article, "Texts Rasterization Exposures". It's rather long, but I hope there is some interest in it. I tried to summarize my experience and observations concerning the situation with text rasterization in Windows and Linux. The article also contains demo applications to play with my method of RGB sub-pixel text rendering. I admit some statements may sound questionable, so I appreciate any comments, criticism, and discussion.
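For readers unfamiliar with the general idea, here is a rough Python sketch of RGB sub-pixel rendering (this is not the article's code, just an illustration of the basic principle): sample glyph coverage at 3x horizontal resolution and map each triple of samples onto a pixel's red, green, and blue stripes.

```python
def subpixel_row(coverage, width):
    """Sample a 1-D coverage function at 3x horizontal resolution and
    map each triple of samples onto one pixel's R, G, B sub-stripes
    (black text on white: 255 = no ink, 0 = full ink)."""
    samples = [coverage((i + 0.5) / 3.0) for i in range(3 * width)]
    return [tuple(round(255 * (1 - samples[3 * p + c])) for c in range(3))
            for p in range(width)]

# A grid-aligned 1 px stem stays a clean black pixel...
stem = lambda x: 1.0 if 1.0 <= x < 2.0 else 0.0
print(subpixel_row(stem, 4))     # pixel 1 is (0, 0, 0)
# ...but shift it half a pixel and the edges land on individual
# sub-stripes, producing the color fringes people complain about.
shifted = lambda x: 1.0 if 1.5 <= x < 2.5 else 0.0
print(subpixel_row(shifted, 4))  # pixels 1 and 2 fringe red/cyan
```

Real implementations (ClearType included) then filter adjacent sub-stripe values to trade some sharpness for less visible color fringing; this sketch omits that step.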
Hammer it to the grid! I actually *prefer* pixel-fonts when I can get them.
"Not only that, but when you work in small fonts like I do (to maximize screen space) things become pretty much illegible below about 9 pt on OS X because it ends up as a garbled gray mess, but you can go to 7 pt (or sometimes lower) on Windows and still have readable text."
Apple Menu > System Preferences > Appearance > "Turn off text smoothing for font sizes [n] and smaller" /dumbass
I prefer the rendering on Macs, the spacing between letters on PCs looks too weird for me. Another thing that will affect your preference is the system you use more often. If you typically use a Mac and then look at a PC ... yuk and the other way around. In the magnified pics above, you were lucky to get letters that ended up with one pixel between each letter on the PC version. In some fonts on Windows, some letter spacings jump between, let's say, 2 pixels and 1 pixel. To me that looks horrible.
Disclaimer: I haven't tried Vista to see if there's any difference.
Leopard is going to be resolution independent.
Look for Resolution Independence on this page.
Here's what it says on the linked page.
"The old assumption that displays are 72dpi has been rendered obsolete by advances in display technology. Macs now ship with displays that sport native resolutions of 100dpi or better. Furthermore, the number of pixels per inch will continue to increase dramatically over the next few years. This will make displays crisper and smoother, but it also means that interfaces that are pixel-based will shrink to the point of being unusable. The solution is to remove the 72dpi assumption that has been the norm. In Leopard, the system will be able to draw user interface elements using a scale factor. This will let the user interface maintain the same physical size while gaining resolution and crispness from high dpi displays.
The introduction of resolution independence may mean that there is work that you’ll need to do in order to make your application look as good as possible. For modern Cocoa applications, most of the work will center around raster-based resources. For older applications that use QuickDraw, more work will be required to replace QuickDraw-based calls with Quartz ones."
I much prefer Apple's font rendering.
Just following up to Mr. G Williams post above --
As he noted, users of Japanese (and probably Chinese and Korean versions) of Windows even in XP do not benefit from ClearType. Font rendering in XP remains for those users just as backwards as it was in Windows 3.1.
I recently remembered this problem existed when I reinstalled Windows XP the other day -- most Japanese users are greeted with a UI close to the first screenshot in this page : http://www.geocities.jp/poe99/CAT/XP/page08/index.htm
For some reason I can't remember (an MSDN blogger even addressed this at one point), the native Japanese fonts, MS Gothic and MS Mincho, are instructed to ignore ClearType rendering at all costs.
There was a mod listed at one point that more or less required you to decompile the Microsoft-provided fonts and recompile them without the "Ignore ClearType" flags. Once you have your freshly un-castrated files, you simply replace the originals, and you're greeted with the much more pleasant reading experience found in the second screenshot of the LOVELY XP! page linked above.
However, there's been a new development: with the release of Vista, there's also a new typeface! A ClearType-enhanced Corbel-alike called Meiryo, for Japanese users (and possibly other language versions too).
You can see the differences in the last screenshot of LOVELY XP! The author was able to get his hands on the Meiryo typeface. The implementation is not perfect due to the way Windows handles and replaces fonts on-the-fly. Windows, according to the registry, uses FontLink to replace Tahoma with the appropriate font when Tahoma cannot accommodate the character set (i.e., displaying, say, a Japanese word in the middle of an English sentence).
It appears that if you simply replace MS UI Gothic with Meiryo, the typeface becomes bolded twice and muddled (last screenshot; left), versus when you simply relink Meiryo with Tahoma in the registry (last screenshot; right). I'll look into this more when I get home. I still don't fully understand the process or what causes the difference.
I find that _all_ anti-aliasing is awful, and from what I know, it has become very difficult to disable anti-aliasing on all common operating systems, Windows XP being the last one where it was easy.
- Vista doesn't let you disable AA for all fonts, some remaining blurred. Also some of the fonts it uses by default rasterise very poorly, probably missing hinting.
- OS X demands extra TinkerTool hacks, and even after that the fonts are rasterised poorly, probably missing or not using hinting, from what I've read. (I haven't actually used OS X.) Hinting is absolutely necessary for unblurred fonts.
- On Linux, the bytecode interpreter needed for hinting is often disabled for patent reasons, and the "autohinter" is a joke in the case of unblurred fonts, and not that good with blurred ones either. Furthermore, even if the bytecode interpreter is compiled in, it is very difficult to write the hundreds of lines of XML needed to disable blurring, allow beautiful old X11 bitmap fonts (which the AA/TTF fundamentalists distributing the software block in general, and some in particular, substituting worse, poorly-rasterising TTF fonts), and so on. (Gnome and KDE let you configure some, but not all, of these aspects for themselves, but not for other applications; and if one has to even touch Gnome or KDE, one could just as well use Windows or OS X. Some distributions provide shortcuts to some of these operations, but demand root/superuser access that you might not have, or might prefer not to use for tasks that should be trivial.)
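For what it's worth, the basic "disable blurring" part of the fontconfig XML can be fairly short; a minimal sketch (the file location, typically ~/.fonts.conf, and the exact property set vary by distribution and fontconfig version) that turns anti-aliasing off and full hinting on for all fonts:

```xml
<?xml version="1.0"?>
<!DOCTYPE fontconfig SYSTEM "fonts.dtd">
<!-- Applies to every fontconfig-aware application for this user. -->
<fontconfig>
  <match target="font">
    <edit name="antialias" mode="assign"><bool>false</bool></edit>
    <edit name="hinting"   mode="assign"><bool>true</bool></edit>
    <edit name="hintstyle" mode="assign"><const>hintfull</const></edit>
  </match>
</fontconfig>
```

The commenter's larger wish list (bitmap font whitelisting, per-toolkit overrides) does take considerably more configuration than this.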
300dpi probably isn't good enough to not see the pixels--the blur or rainbow colouring in case of anti-aliased fonts, irrespective of method, on the typical flat high contrast background situation necessary for reading comfort (what little can be had on bright computer displays as opposed to non-reflective paper). Maybe 600dpi would be enough. In any case, anti-aliasing has defeated itself at a resolution where the method used -- or not used -- no longer matters in personal ergonomics. But the trend in computing seems to be that "OS designers know better", and therefore personal choice is made difficult, and you're expected to "sacrifice present on the altar of the future", quoting the original post.
I just tested out the Safari browser on an XP machine. I LOVE the speed of the browser. I sure wish they would fix the fonts, because that makes it unusable to me. I opened Firefox and Safari side-by-side on my desktop, and the difference is amazing. In Safari the text is blurry and colorful. I found this page searching for a solution, because I really WANT to use Safari. It's great in most other respects but completely unusable due to the fonts.
I feel I'm pretty impartial here. I have one server running Debian Etch, a desktop running Ubuntu and use XP occasionally at home and daily at work. I'm generally anti-microsoft (and hate IE 7).
I've tested this on several quality monitors and it's the same on each: undeniably blurry.
It seems that Safari's font rendering on Mac OS X is better than Safari's on Windows. Whether the "blurring" problems are due to this or something else, I can't say.
As for me, this time MS did it better.
I agree with Iman: the font rendering of Safari on Mac OS X is better than in Safari on Windows.
"The old assumption that displays are 72dpi has been rendered obsolete by advances in display technology. Macs now ship with displays that sport native resolutions of 100dpi or better. Furthermore, the number of pixels per inch will continue to increase dramatically over the next few years. This will make displays crisper and smoother, but it also means that interfaces that are pixel-based will shrink to the point of being unusable. The solution is to remove the 72dpi assumption that has been the norm. In Leopard, the system will be able to draw user interface elements using a scale factor. This will let the user interface maintain the same physical size while gaining resolution and crispness from high dpi displays." No comment,here you are http://xopca.com/search.php?q=Bjorkx=27y=14 here and there
I always use the BCI for my fonts in KDE. I feel sick when I'm using a Windows computer with ClearType! It's just wrong!
Sorry, you lost me there:
"Typography, if you haven't figured this out by now, is really complicated. It's one of the few areas of "computer science" that actually justifies the title."
First, (computer) typography is not an "area of computer science", and second, there are many areas of computer science (complexity theory, programming language semantics, parsing, etc.) that definitely justify the title. Computer science is a branch of mathematics dealing with the well-defined, mathematical notion of computability, and has little to nothing to do with "coding".
Can you please use title instead of, or together with, alt for images? Some info is lost in the alt text when images are displayed, e.g. the "imagination, 9ppem Helvetica, FontFocus on" here.
Apple's approach to anti-aliasing works better for animated text. Since ClearType fiddles with the shape of the glyphs, you get a weird "glittering" effect when the text moves in increments of less than a pixel-width. Apple's rendering results in glyph shapes that are perceptually more consistent in this situation, so the animation looks smoother. I don't know whether this had anything to do with their decision or not, though.
I tried some modifications: as well as setting font smoothing to LIGHT, I increased the font size and then chose fonts with wide spacing as defaults, Lucida Console 24 and Bitstream Vera Sans Mono 24. I haven't tried it on many websites; I guess I will need to go back to a smaller font for most sites.
I'm still on a single 17" 1024x768 monitor, though my laptop screen is significantly better. :)
When I first moved to OS X from Windows (which wasn't too long ago, about a year), the difference in the font rendering didn't bother me much. So I chose to remain neutral on that issue.
Then when I first tried viewing Safari on Windows side-by-side with IE/Firefox, I realised that the font rendering differences are huge. But which did I prefer? No idea. So I chose to remain neutral in that issue.
A couple of months later, when I found myself doing web development and web design, I played around with all the different fonts (Helvetica, Palatino, Georgia) available on OS X and Windows. Suddenly, and unexpectedly, I found myself picking a side. OS X is the clear winner.
Windows' rendering engine completely trashed the original typefaces, especially some of my favourite fonts, like Palatino. I nearly cried when I saw how ugly a lot of my fonts look on Windows. I have no idea how a beautiful font like Palatino ends up looking so different.
But yeah, if you don't care about fonts, then you most likely won't understand how I feel. I know, because I didn't care about fonts 1 year ago.
OK, enough of this "it looks ugly" crap that people use to judge fonts. ClearType DOES blurify fonts, but it makes them a hell of a lot more readable. Try reading a book on a computer screen without ClearType and you'll have to get up every 5 minutes from eyestrain. (1280 x 800 LCD laptop screen.)
Leaving the differences in font rendering philosophy aside, Microsoft's font rendering technology is more developed, more researched, and superior to Apple's. Microsoft built the technology to allow font designers to develop fonts for on-screen reading. With Microsoft's technology, font designers can specify instructions built into the font on how each character should be fitted to the grid to look best on screen. Apple's technology cannot do that.
Thus it is wrong to say that Microsoft does not respect the font design in rendering. The company's technology only gave designers more tools and more power in optimizing font rendering for on-screen reading. It is still the designer who decides how fonts should look on the screen. In contrast, Apple does not allow font designers to take the screen into account when they define fonts.
Put it another way, it would be very easy to change Microsoft's rendering to look like Apple's (by tuning a few internal Windows parameters), but it would be impossible for Apple to emulate Windows font rendering without a complete redesign of MacOS font rendering. This really shows which technology is better.
Instead of redesigning its font rendering system, Apple chose to play marketing, and claim that it is just a matter of font rendering philosophy -- people are free to choose what philosophy they like. However, it is also a matter of technology, and Microsoft's font rendering technology is by far more developed and mature.
@Adrian: Sorry, but your claims are simply wrong – I really wonder about your (dis-)information sources… E.g. font designers are able to "specify instructions built into the font on how each character should be fitted to the grid to look best on screen" equally on both Windows and OS X (Fabrizio Schiavi's excellent Pragmata font might serve as sufficient proof). AFAIK there is no such thing as OS-specific TrueType instructions.
As a designer with strong roots in typography (who happens to have some CS education as well) I strongly oppose your (Adrian's) notion of MS's technology being "far more developed". As a matter of fact, typography cannot simply be reduced to the "optimal" way of rendering single glyphs in one medium or another. ClearType has some advantages over OS X in rendering single glyphs with higher contrast on low-res devices – at least for fonts that have not been purposefully designed for low resolutions (i.e. 96 dpi). OTOH that advantage is lost again when taking into account other important typographical parameters like spacing, gray tone, etc., in which cases ClearType happens to behave more like a bull in a china shop. Quite acceptable if you are, e.g., writing code – definitely not so acceptable when dealing with text on a "page". People just don't read letter by letter. And spacing matters more in that case. In the end, I currently prefer OS X's font rendering, among other reasons because it behaves more predictably and scales better. Still, I would sometimes prefer ClearType for code work – at least if I weren't using Pragmata (with antialiasing turned off in my editor of choice). ;-)
Well, to me Apple is a clear winner here. The difference is that the guys at Apple give more importance to design, aesthetics and also (arguably) to users' comfort.
I guess it is a matter of personal preferences, but when it comes to reading text, Apple's font rendering to me is so much more pleasant that I can't even imagine why someone could prefer MS' one. Seriously, no possible comparison. And notice I am NOT a mac user.
I was directed to this site upon trying out Safari and receiving a massive dose of pain from the chubby fuzzy font. I questioned the Apple user who reminded me to try it out (I have no brand loyalty) as to why the hell Safari was making me feel like I had had my eyes jabbed. He couldn't tell the difference. Eventually we figured out the problem: He's near-sighted, and more or less blind, while I'm far-sighted, and have overall supreme vision unless something is closer than six inches from my eye. :P
Too bad, too. I was hoping to eventually get a MacBook, but I'll have to wait for their technology to match their philosophy so I can look at the screen without being in pain, or else wait until I start losing my sight in thirty years or so.
well, i came to this site when looking for a way to improve windows font rendering. i was sitting here on my mac thinking how darn beautiful the fonts look, wishing i could have the same on my vista machine in the other room.
it is a subjective choice, but i vote for beauty every time.
Incenjucar - was that win safari or mac safari? win safari seems to be trying to render fonts as if it were on a mac, but it just doesn't work properly. so i agree it makes your eyes hurt on a win box, but on a mac it looks great imho. but then, i'm using mac firefox right now, and that looks beautiful too. it's the OS that determines the quality of the font rendering more than the application, i would say. win safari tries but fails to get a 'mac look' on a win box. i also think it's buggy as hell and doesn't respect web standards, but that's another story.
in general, fonts on the mac look more like they're on paper than on a screen. for me, that's a good thing.
as a web designer, i wish i could get the beauty of mac-rendered fonts for people viewing on windows. but there's no way apart from using images.
The Japanese on Windows looks like something out of dot-matrix days--almost unbearably bad.
It Seems that you don't have taste either :D