December 16, 2008
Are you familiar with the uncanny valley?
No, not that uncanny valley. Well, on second thought, yes, that uncanny valley.
In 1970, the Japanese roboticist Masahiro Mori noticed something interesting: The more humanlike his robots became, the more people were attracted to them, but only up to a point. If an android became too realistic and lifelike, suddenly people were repelled and disgusted.
The problem, Mori realized, is in the nature of how we identify with robots. When an android, such as R2-D2 or C-3PO, barely looks human, we cut it a lot of slack. It seems cute. We don't care that it's only 50 percent humanlike. But when a robot becomes 99 percent lifelike -- so close that it's almost real -- we focus on the missing 1 percent. We notice the slightly slack skin, the absence of a truly human glitter in the eyes. The once-cute robot now looks like an animated corpse. Our warm feelings, which had been rising the more vivid the robot became, abruptly plunge downward. Mori called this plunge "the Uncanny Valley," the paradoxical point at which a simulation of life becomes so good it's bad.
This phenomenon has also been noted in cartoons.
McCloud's book Understanding Comics was the first place I ran into a concept which is a sort of corollary to the Uncanny Valley. Call it Lake Empathy: If a character is very simple, more iconic than realistic, it's much easier for people to pour themselves into it -- to view it not as a third party, but instead as a personal avatar.
For example, you probably see more of yourself in the character to the left than in the characters to the right.
The seminal Understanding Comics was where I first encountered this concept, too. It's a sort of digital Zeno's Paradox: the more accurate your digital representation of a person, the more visible the subtle imperfections become. This is why the computer-generated people in recent movies like The Polar Express feel even more unnatural than the highly abstract people in 1995's Toy Story. (The current state of the art, at least by some accounts, is The Emily Project. You be the judge.)
But does the uncanny valley effect apply to software user interfaces, too? Bill Higgins thinks it does.
The problem is that our minds have a model of how humans should behave and the pseudo-humans, whether robotic or computer-generated images, don't quite fit this model, producing a sense of unease - in other words, we know that something's not right - even if we can't precisely articulate what's wrong.
There's a lesson here for software designers, and one that I've talked about recently -- we must ensure that we design our applications to remain consistent with the environment in which our software runs. In more concrete terms: a Windows application should look and feel like a Windows application, a Mac application should look and feel like a Mac application, and a web application should look and feel like a web application.
Bill extends this to web applications: a web app that apes the conventions of a desktop application is attempting to cross the uncanny valley of user interface design. This is a bad idea for all the same reasons; the tiny flaws and imperfections of the simulation will be grossly magnified for users. Consider the Zimbra web-based email that Bill refers to.
It's pretty obvious that their inspiration was Microsoft Outlook, a desktop application.
In my experience, shoehorning desktop conventions into web applications rarely ends well. I was never able to articulate exactly why, but the uncanny valley theory goes a long way towards explaining it:
If you're considering or actively building Ajax/RIA applications, you should consider the Uncanny Valley of user interface design. When you build a "desktop in the web browser"-style application, you're violating users' unwritten expectations of how a web application should look and behave. This choice may have significant negative impact on learnability, pleasantness of use, and adoption.
As I've mentioned before, one of the great strengths of web applications is that they aren't bound by the crusty old conventions of desktop applications. They're free to do things differently -- and hopefully better. Web applications should play to their strengths, instead of attempting to clone desktop applications.
If you end up anywhere near the uncanny valley of user interface, that sense of unease you feel is perfectly normal. You're clearly in the wrong place.
Posted by Jeff Atwood
There is a similar issue with dolls. In Western culture we grow up with dolls, so we're used to them. But I once read a story about an Indian girl who was given a doll; she threw it away, saying it was a demon.
Not necessarily... Web apps will definitely challenge desktop apps in due course.
Google Docs, like any other web Office application, is mostly easy to use because it leverages the user experience we all got used to in the past through interaction with desktop applications. Because Google Docs is web-based, it adds rich online sharing and collaboration features to the expected set of desktop features ("File > Save As", etc.).
Will users who are not experts, and who don't even want to become experts, understand that they should have different expectations about the behavior of web-based Office suites versus desktop solutions, especially when it comes to controlling access to their documents?
Google Docs presentations serve as a good example of this. After you have given a presentation on Google Docs to an audience:
* people will no longer have to ask you for access to the latest version of your presentation, and
* you will not even be able to notice that others are accessing your presentation; Google Docs will instead tell you that there are no viewers for that presentation.
He's an experienced database developer (who considers coders a lower lifeform :) ) who finds cynical amusement in young-uns who prove the adage that those who ignore history are condemned to repeat it. And the browser is still the sh**ty paradigm, as the Ajax, et al folk are proving. As good an idea as Ajax is, to claim that it's new technology (or that anything about the Web is new technology beyond the pretty pixels) is delusional. As to punch cards, I work with 20-somethings who insist on code being 80 columns wide. Stunning. The vacuum tubes are in my amplifiers, which I built.
The real new frontier is the solid state disk database machine; it will change how software is built more than any other development in decades. Moore's law, so far, has been used to support old paradigms by making it possible to keep doing the same old thing. (See the 18 Dec thread.) The SSD machine will empower parsimonious data structures (read: 3NF and 4NF) run by inherently parallel database engines. This will destroy bloated flat-file and XML and even cloud thingees for transactional applications.
You have my word on that.
What all this has to do with uncanny valleys is the following: the most useful UI will win. That the disconnected, block-mode, local-edit UI is the WORST UI is not in dispute. Jeff, IIRC, has said so on this site, while passing it off (paraphrasing) as "that's the way it is and it's good enough." It's not good enough, and the Ajax movement is proof. And Silverlight and LINQ and...
The UI for the SSD database will be logical CRUD. Habanero (without SSD driving it) is an example. As simple as it gets.
Explain it to me in Star Wars
...you're violating users' unwritten expectations of how a web application should look and behave.
I think that if we follow the users' expectations in design, we will not end up with as vibrant or as beautiful a design as possible. The average end user, who would never read this, is slow and thinks that robots are R2-D2 or something on Junkyard Wars. Currently they wouldn't comprehend what an advancement Masahiro Mori's robots are. They expect designs that are outdated because their views of the internet and computers are usually outdated. We could explain it to them, but they would only hear it, and the GIGO part of their brains would take effect and get rid of the information.
We need to push the boundaries by building things for the community that is up to date and capable of the understanding and comprehension necessary to appreciate the works of art and advancements that we put forward.
Tgeahre, you think too much of yourself.
I use GMail exclusively over IMAP. eyeOS and Mozilla Prism disgust me.
I write web scrapers to use web app stuff from desktop apps (if I had any free time, I'd do a desktop UI for Facebook).
Spot on, Jeff, and you've given cause to why I'm so quick to cuss out an AI voice. When they were sluggish and unrealistic, I cut 'em some slack as I thought they were incapable. But when they speak smoothly (and use that condescending tone), I just wanna reach out and smack 'em for not letting me have the human being I asked for. I KNOW they were coded to ignore that, to try to force me to use the automated system that I also know is incapable of resolving the problem. Because they're realistic, I get annoyed far more quickly. Good catch. Thanks!
Forgive me, but I am lost as to how I am thinking too much of myself?
I think the thesis of this post is quite far-fetched.
the paradoxical point at which a simulation of life becomes so good it's bad.
might explain why nowadays graphically cool video games are considered less fun than old 90's games...
This reminds me a lot of an old webhosting that I inherited from a predecessor. The interface looked *exactly* like Windows 98. Each thing was a window, and the options were icons. It was largely useless, due to the fact that Windows 98 isn't a webhosting frontend...
Pointless, and ugly.
Wow. This article completely embodies what I have been saying for years. In some of the posts already listed, I see several people saying things like they never thought of the web application as a desktop app.
As developers, we have to remember that the majority of our users are not at the same level of computer literacy that we are. Ajax applications have their own risk: they run in a browser and appear to be like any other web page, yet users still try to use the back button, and wonder why the page resets every time they go back to it when it's an Ajax app.
Desktop applications are not dead, and you can't cram everything in a browser despite what management thinks at a lot of companies.
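The back-button complaint above is concrete enough to sketch. Here is a minimal, hypothetical example (not from the post or from any of these comments) of one way an Ajax app can keep the back button meaningful: store the current view in location.hash, so every view the user reaches becomes a real history entry. The element ID and view names are made up for illustration.

```typescript
// Minimal sketch: keep the current view in location.hash so the back button
// still works in an Ajax-style app. Element IDs and view names are hypothetical.

function renderView(view: string): void {
  // Swap in the content for the requested view; a real app would fetch it.
  const main = document.getElementById("content");
  if (main) {
    main.textContent = "Now showing: " + view;
  }
}

function navigateTo(view: string): void {
  // Changing the hash adds a history entry, so Back and Forward keep working.
  window.location.hash = view;
}

// Fired on Back, Forward, and navigateTo(); re-render whatever the URL says.
window.addEventListener("hashchange", () => {
  renderView(window.location.hash.replace(/^#/, "") || "inbox");
});

// Render the initial view on page load (this also covers bookmarks and refreshes).
renderView(window.location.hash.replace(/^#/, "") || "inbox");
```

With hash-based navigation like this, the page no longer "resets" when users hit Back; it simply returns to the previous view, which is what they expected all along.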
I'm such a has-been, my website looks like a desktop.
( http://ove.fi )
I would like to see a follow-up post on this topic in, let's say, 5 years and see how things have progressed.
Hmmm, that's pretty interesting. Not sure I buy it completely: I think a user's intuition when browsing a website also has a lot to do with their prior experiences browsing other websites, so a user can become accustomed to AJAX interfaces, and at that point you're no longer violating their expectations, but forming them. This should, however, make designers question whether their AJAX actions really do improve the user experience, or simply emulate other experiences that are better.
What the hell does the uncanny valley have to do with Zeno's Paradoxes? The paradoxes are about algebra's insufficiency to describe the real world; Newton solved them with his invention of calculus.
So if I introduce you to my friend and say he's an android you'll be repulsed? Then I reveal to you he's not an android? Then what? Are you trying to say people are repulsed by each other?
If this were true, a lot of people would hate Star Trek's character Data.
I agree. When an application attempts to transcend its role, users generally are put off.
For example, I am a big fan of iTunes. But iTunes on my Windows machine is odd. It looks exactly the same as iTunes on my Mac (which I love), and perhaps that is why I'm put off. The application attempts to redesign the look and feel of well-known and established Windows widgets.
It's offensive when someone thinks you should be fooled by their design, and you aren't.
-Bad hairpieces are worse than no hairpieces.
-Indiana Jones getting bounced around in a lead fridge.
I find the new Yahoo webmail to be the worst at this. It has its own tabs! Why?
I wish more of the decision makers would understand this concept.
I had to port a PDA application to a website once, and the manager wanted it to look and act just like the PDA version, because that's what everyone was used to. This included a tap-and-hold to bring up a context menu or to scroll through a page. It finally took a ton of bad responses from a beta release to convince him otherwise.
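For the curious, here is a rough, hypothetical sketch (not the actual code from that project) of the kind of tap-and-hold handler such a port requires. Note that it has to suppress the browser's native context menu and rely on a long-press timer, which is exactly the convention-breaking behavior the beta users pushed back on.

```typescript
// Hypothetical sketch of a PDA-style tap-and-hold context menu in the browser.
// Holding the mouse button for ~700 ms opens a custom menu; the native
// right-click menu is suppressed so the custom one can take over.

const HOLD_MS = 700;
let holdTimer: number | undefined;

function showCustomMenu(x: number, y: number): void {
  // Stand-in for the PDA-style menu; a real app would position a menu element here.
  console.log("custom context menu at " + x + ", " + y);
}

document.addEventListener("mousedown", (e) => {
  holdTimer = window.setTimeout(() => showCustomMenu(e.clientX, e.clientY), HOLD_MS);
});

document.addEventListener("mouseup", () => {
  if (holdTimer !== undefined) {
    window.clearTimeout(holdTimer);
  }
});

// Suppressing the browser's own context menu: the part web users notice and dislike.
document.addEventListener("contextmenu", (e) => e.preventDefault());
```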
Anyone who uses a screenshot from a zombie video game has my vote, not to mention the good read!
~ Aaron I
I <3 Zombies