June 6, 2007
I'm currently reading through Peter Morville's excellent book Ambient Findability. It cites some papers that attempt to explain the search behavior of web users, starting with the berrypicking model:
In a 1989 article entitled "The Design of Browsing and Berrypicking Techniques for the Online Search Interface," Marcia Bates exposed the inadequacy of the classic information retrieval model characterized by a single query.
Instead, she proposed a berrypicking model that recognizes the iterative and interactive nature of the information seeking process. Bates understood that the query and the information need itself evolve as users interact with documents and search systems. She also recognized that since relevant documents (like berries) tend to be scattered, users move fluidly between search and browse modes, relying on a rich variety of strategies including footnote chasing, area scanning, and citation, subject, and author searching.
In short, Bates described information seeking behavior on today's Web, back in 1989. Google relies on the citations we call "inbound links." Blogs support "backward chaining" through trackbacks. Flickr and del.icio.us allow us to pivot on subject or author. The Web allows our information seeking to grow more iterative and interactive with each innovation. The berrypicking model is more relevant today than ever.
Bates' research was picked up by Peter Pirolli and Stuart Card and folded into their 1995 paper titled Information Foraging in Information Access Environments:
We use the term Information Foraging both to conjure up the metaphor of organisms browsing for sustenance and to indicate a connection to the more technical optimal foraging theory found in biology and anthropology. Animals adapt their behavior and their structure through evolution to survive and reproduce in their circumstances. Animals adapt, among other reasons, to increase their rate of energy intake. To do this they evolve different methods: a wolf hunts ("forages") for prey, but a spider builds a web and allows the prey to come to it. Humans seeking information also adopt different strategies, sometimes with striking parallels to those of animal foragers. The wolf-prey strategy bears some resemblance to classic information retrieval, and the spider-web strategy is like information filtering.
So if you've ever wondered why users behave like animals online, now you know. There's real science behind it in information foraging. Instead of hunting for food, users hunt for information, ruthlessly, and without compunction.
In practice, what this means is that users pursue "information scent". Users will click the back button nearly instantly when they don't catch a whiff of the right information from the current page. Jakob Nielsen explains:
Information foraging's most famous concept is information scent: users estimate a given hunt's likely success from the spoor: assessing whether their path exhibits cues related to the desired outcome. Informavores will keep clicking as long as they sense (to mix metaphors) that they're "getting warmer" -- the scent must keep getting stronger and stronger, or people give up. Progress must seem rapid enough to be worth the predicted effort required to reach the destination.
If you think this is all a bunch of trumped-up academic terminology for the basic principle of human laziness, well... you're right.
Humans are under less evolutionary pressure to improve their Web use, but basic laziness is a human characteristic that might be survival-related (don't exert yourself unless you have to). In any case, people like to get maximum benefit for minimum effort. That's what makes information foraging a useful tool for analyzing online media.
Whether you call it "information foraging" or the rather more honest "maximum benefit for minimum effort", it's a powerful model of the way people actually work online. There are billions of web pages, and only a tiny fraction are worth the users' time. That's why informavores are unforgiving. They will..
The last point is noted by Nielsen in his new book Prioritizing Web Usability:
In recent years, highly improved search engines have reversed [the idea of "sticky" web sites] by emphasizing quality in their sorting of search results. It is now extremely easy for users to find other good sites. Information foraging predicts that the easier it is to find good patches [of information], the quicker users will leave a patch. Thus, the better search engines get at highlighting quality sites, the less time users will spend on any one site. This theoretical prediction was amply confirmed by the empirical data we collected for this book: People left the sites they found useless within less than two minutes.
Frankly, I'm surprised it's a whole two minutes. You'd better hope the information scent is strong on your page, because informavores' fingers are always hovering over the back button. And they have very itchy trigger fingers.
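Nielsen's patch-leaving prediction is an instance of the marginal value theorem from optimal foraging theory: a forager should abandon a patch once its diminishing rate of return drops below what moving on would yield, so cheaper travel between patches means earlier departure. Here's a toy sketch of that idea (the gain curve and all numbers are hypothetical, chosen only to illustrate the shape of the prediction):

```python
import math

def optimal_patch_time(travel_cost, total_gain=1.0, tau=30.0):
    """Return the time-in-patch (seconds) that maximizes overall gain rate.

    The patch yields diminishing returns: gain(t) = total_gain * (1 - e^(-t/tau)).
    travel_cost is the time to reach the next patch (i.e., the next site).
    """
    best_t, best_rate = 0.0, -1.0
    for i in range(1, 6001):  # scan t from 0.1s to 600s in 0.1s steps
        t = i * 0.1
        gain = total_gain * (1 - math.exp(-t / tau))
        rate = gain / (travel_cost + t)  # gain per second of total effort
        if rate > best_rate:
            best_rate, best_t = rate, t
    return best_t

# Better search engines mean a lower travel cost between patches,
# so the optimal time spent on any one site drops:
print(optimal_patch_time(travel_cost=60))  # slow search: linger on each site
print(optimal_patch_time(travel_cost=5))   # fast search: leave much sooner
```

Under this (entirely made-up) gain curve, cutting the travel cost shortens the optimal stay substantially, which is exactly the "better search engines, less time per site" effect Nielsen reports.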
Posted by Jeff Atwood
I don't understand the 2 minutes bit -- that seems like a very long time. In the BBC article you link to, it says, "75% of the 1,058 people asked would not return to websites that took longer than four seconds to load." That's more like it.
salas - That's per site, assuming the first page loads within 4 seconds. They probably read that page, maybe one or two more, and then leave. If you have to sign up for a new account within that two minutes, and it's not instantaneous, you've probably lost the user.
I wouldn't say it has anything to do with the "basic principle of human laziness", but with efficiency. If a particular search path doesn't smell fruitful (so to speak), why should I keep following it? I need to Get Shit Done, not waste time searching. The sooner I find my information, the sooner I can get back to work.
"Users will click the back button nearly instantly when they don't catch a whiff of the right information from the current page."
This may be true for 'Informavores' who are still some steps behind evolution. As far as I can tell by observing my own online behaviour, I rarely need to use the back button, because I stick to the few sites (mostly forums) where useful information density is very high. The second option is always google groups for me, where the search is broader but sometimes more accurate. Detailed search result links already tell you a lot about the site behind it.
That said, your site's one of the few I visit frequently, Jeff, for the information scent is very strong on your page ;)
Excellent article, thanks for all of the information presented. This FINALLY explains something I noticed myself doing a while back... in the era of tabbed browsing, now I can follow the scent of something via multiple tabs and not even worry about using the "back" button. I've often found myself with 10 tabs open, a little piece of information on each one that pertained to whatever subject I was reading about.
It would be interesting to somehow map or graph how many tabs, length of time spent reading each site, etc. until the thirst for information is finally quenched.
The back button? More like the X.
Doesn't everyone shift-click on google results? The back button skips a beat even in the best case, and punishment increases for each click you get away from google.
I find myself in a number of annoying situations when I use click/back instead of shift-click/close.
The Jakob Nielsen comment "Progress must seem rapid enough to be worth the predicted effort required to reach the destination." reminds me of a paper that I read on artificial intelligence. It defines curiosity as behavior which is expected to maintain or increase the rate of learning over an extended period (i.e. avoiding things that look easy to learn or that look impossible to learn). Even if human curiosity does not quite work this way, it does seem like a handy metaphor.
I agree that Flash, heavy scripting, and other BS triggering warnings about popups, etc. will drive me away from a site FAST. Next up is somebody's whizzy CSS layout that falls apart if you aren't using the CrapMaster 3.14159 web browser the idiots tested on. And of course the opaque and useless home page, or worse yet a redirect that throws you to the stinking home page, leads to a big BACK click (or X) too.
Sometimes I wonder why more companies don't have their web team's product independently audited on a regular basis.
Another great post for reference, thanks Jeff.
Lucky for you I already have you on my iGoogle page... *sniff-sniff*
The back button?
Evolved Informavores use mouse rocker navigation :D
Yes, I am also an alpha male when it comes to this. I have reached a stage where I can tell from the Google result whether the prey will be satisfactory for my hunting.
There's a lot of utility in the predictive and/or corrective power of well-designed search engines. When you browse a blog or forum that uses a custom search engine, you quickly notice how much of a difference there is.
For example, when I browse and search in Wikipedia, I don't even bother with Wikipedia's search box. Overall, it's much faster for me to return to my browser's address bar, search for a term with a single letter shortcut ("g VMWare wikipedia"), wait for Google's results, and select the first or second one. The alternative via Wikipedia's own search engine is both slower and inferior, in that I have to read and browse through the results.
I think there's something negative to be said about websites that continue to stubbornly stick to their own homebrew, off-the-shelf search engine even though it creates an inferior overall search experience. These webmasters need to realize that they should consider the best tools for every task. There's an ROI to every decision and sticking with inferior tools "just because" is a poor return for users.
2 Minutes is a long time for a speed reader.
Patrick, which sites are you referring to exactly? Wikipedia is a very impressive site, BUT it has been "befriended" by google, which is why it's quicker and more relevant to search on google than on wikipedia for a wikipedia entry. Try searching in google for something from one of the emails in your hotmail account, and you'll find you get no results. It's horses for courses.
Yes, it's true "google" is awesome (everyone bow down to the god-gle -- and don't mention china!! http://news.bbc.co.uk/1/hi/technology/4645596.stm), but it's also creating a monopoly in the searching environment, which isn't always a good thing. Creating your own search on your site creates a more complete and professional experience. Personally, if I use a site that sells its soul to the google pop-up, I lose a little interest and confidence in the site itself. More and more these days, AJAX is powering very fast and clever searches which don't require "google" intervention. I have written spiders, bots, and search engines -- and I have never done anything "just because", and quite frankly I resent being called inferior just because I'm not using google pop-ups on my site.
I'm afraid I'm left with the fact that I don't quite understand your point -- what exactly is a homebrew, off-the-shelf search engine? Is it like a clever, stupid comment?
This seems to reflect my experience as well: use tabs to surf, not the back button...
Site fails to load, has masses of flash/adverts, or the layout is broken within a few seconds? Then it's on to the next site.
Site does not seem relevant within less than a second? Then it's on to the next site...
The ones that annoy are the sites that load quickly enough, look promising, and then fail to deliver... those are the "two minutes" sites. These are often news-type sites that cite a story (no longer available) about an article on another site that might be relevant, and you have to read half the page to find out they are never going to say anything useful.
As others have mentioned already, between tabbed browsing and mouse options for the back function (I have back and forward for the 3rd and 4th buttons on my trackball, and miss them dearly when I don't have the trackball connected to the laptop), the time spent on a seemingly irrelevant site is significantly shorter for a large number of users (even if it is still a small percentage of the overall user base). Besides that, with Google's results page I only visit sites that don't present seemingly relevant information when I'm really having a hard time finding what I'm looking for.
More often than not I'll refine my search terms before I get to the second page of a Google search result.
Of course, tabbed browsing could also explain the 2 minutes per page: if I'm having a lot of trouble finding relevant results, I'll open the next 4 or 5 links in new tabs and go through them, one after the other, trying to find something useful. So, it's completely possible for a page to be loaded into my browser for several minutes without me looking at it at all. This is even worse when I'm actually reading information, as I'll open links within whatever I'm reading in new tabs and let them sit until I finish the page I'm on, only to find later that I'm not really interested in the linked-to content.
Great stuff. I find this particularly true since much of my time involves researching minor things that I have no knowledge of (like macros for Excel or quirky oddity errors that occur on servers I am involved in), and there are usually just so many paths to go down to find exactly what will help. At the first sign it's a dead end, I go back to the search.
Frankly, my left hand is almost always over the Tab and Alt keys, and my right hand (or at least my thumb) can quickly jump off the trackball/mouse and hit the left arrow key to go Back.
I also have found myself instead using new tabs to open each possible search result and then closing out each "dead-end" tab instead of going back. Makes it easier to open several different results - especially when multiple results are saying the same thing.
So that's why I am always hovering over the back button while I am on this blog...haha
Joking aside, I would admit I am a victim of the informavore bug...
I think you're right on the money, Jeff.
I know when I'm looking in Google, I use 'open in a new window' for anything that looks like a good hit. If I don't see something worthwhile right away, I hit the middle mouse button (which is mapped to close application).
My attention span when I'm seriously looking for something and/or shopping is about zero.
I think many large e-commerce sites take this very science into consideration. Land's End is one of the best out there (not a plug and I'm not affiliated in any way... don't even buy their stuff) because of their search. Try it and see how boss it is.
I like the spider web idea. I know that now, with all the RSS feeds you can subscribe to, almost all the information I want just comes to me.
But when I do have to search, I'll google something, click the first 1-5 results, and if I do not find what I am looking for without even having to scroll down the page, I am back to google and on to the next link. This works well because usually one of the 5 pages will have what I need in the first visible part of the page.
I would have to say that 2 minutes is WAY too high. Myself, I usually give a page 10-15 seconds to show me what I want to see with a minimum of 2 clicks.
"because Informavores' fingers are always hovering over the back button. And they have very itchy trigger fingers."
quite literally :), my middle mouse button is my back button.
I read this entry and the only thing I could think of was:
"May the scent be with you..."
this theory is wrong. web user behavior is vegetal, not animal.
"Users will click the back button nearly instantly when they don't catch a whiff of the right information from the current page"
A smaller but good point. It also explains why the back button's importance is so high -- it's a natural and essential part of the information-seeking process. If you know beforehand that you cannot jump back, you are less willing to sneak into suspicious holes.
Anything that destroys the back button (ill-written Ajax, Flash-based navigation) is wicked and evil! ;)
The book you mention - "Ambient Findability" - is an excellent read - very thought provoking. I bought it because I was amazed O'Reilly would publish something so different, but it is very cool and very interesting. Funky phrases like an 'Internet of Things' pop up every other page.
This is very interesting and aligns with my own search habits. Rather than spend time reading the entire page, I usually skim the first paragraph or two to establish relevance and either continue skimming or back out to the search results.
If the page is annoyingly flashy (flash animation, animated advertisements, blinking text) or difficult to read (gray text on a black background), I back out without bothering to look at the text. This one is somewhat interesting because I do assess the appeal of the page layout in about 50 milliseconds and form an immediate impression. If the page looks amateurish or garish, I'm outta here.
I also immediately back out if I am greeted with a table of contents, such as a long listing of links to forum topics. My reasoning is that I'm conducting a search and I need information quickly; I haven't the time or the inclination to skim a long list of links to find a topic that may or may not be relevant. If I'm desperate, I might Ctrl-F in Firefox and perform a quick page search... but this is rare.
Taking me to a page with an instant registration requirement is another fast and easy way to make me leave a site. I immediately assume registration means I am providing the organization with marketing information and that they will immediately begin sending me newsletters, advertisements, and marketing information via email. Perhaps this isn't the case, but I'm not taking time to research the site to find out. Perhaps more important, I have no idea whether the information is even remotely relevant to my needs, so why bother?
One such site is infamous for posting the question, but requiring the reader to register to see the answer. I haven't bothered because the organization has yet to prove they can provide relevant, credible, and accurate responses. Earn my trust with your information and I'll consider registration, but not before.
We are currently doing a study on average load times for our homepage at my work. I've been advocating this for a while. Sadly, the designers rule the roost here, and their flash- and script-heavy pages load slowly even on my high-speed connection.
With the user's two-minute attention span, if you use one minute of that just loading the page, then you're really isolating your intended audience.
Great article, Jeff. Thanks for putting it together.