March 19, 2012
What was Microsoft's original mission?
In 1975, Gates and Allen form a partnership called Microsoft. Like most startups, Microsoft begins small, but has a huge vision – a computer on every desktop and in every home.
The existential crisis facing Microsoft is that they achieved their mission years ago, at least as far as the developed world is concerned. When was the last time you saw a desktop or a home without a computer? 2001? 2005? We're long since past the point where Microsoft's original BHAG was met, and even exceeded. PCs are absolutely ubiquitous. When you wake up one day to discover that you've completely conquered the world … what comes next?
Apparently, the Post PC era.
Microsoft never seemed to recover from the shock of achieving their original 1975 goal. Or perhaps they thought that they hadn't quite achieved it, that there would always be some new frontier for PCs to conquer. But Steve Jobs certainly saw the Post PC era looming as far back as 1996:
The desktop computer industry is dead. Innovation has virtually ceased. Microsoft dominates with very little innovation. That's over. Apple lost. The desktop market has entered the dark ages, and it's going to be in the dark ages for the next 10 years, or certainly for the rest of this decade.
If I were running Apple, I would milk the Macintosh for all it's worth – and get busy on the next great thing. The PC wars are over. Done. Microsoft won a long time ago.
What's more, Jobs did something about it. Apple is arguably the biggest (and in terms of financials, now literally the biggest) enemy of general purpose computing with the iPhone and iPad. These days, their own general purpose Mac operating system, OS X, largely plays second fiddle to the iOS juggernaut powering the iPhone and iPad.
The slope of this graph is the whole story. The complicated general purpose computers are at the bottom, and the simpler specialized computers are at the top.
I'm incredibly conflicted, because as much as I love the do-anything computer …
- I'm not sure that many people in the world truly need a general purpose computer that can do anything and install any kind of software. Simply meeting the core needs of browsing the web and email and maybe a few other basic things covers a lot of people.
- I believe the kitchen-sink-itis baked into the general purpose computing foundations of PCs, Macs, and Unix makes them fundamentally incompatible with our brave new Post PC world. Updates. Toolbars. Service Packs. Settings. Anti-virus. Filesystems. Control panels. All the stuff you hate when your Mom calls you for tech support? It's deeply embedded in the culture and design of every single general purpose computer. Doing potentially "anything" comes at a steep cost in complexity.
- Very, very small PCs – the kind you could fit in your pocket – are starting to have the same amount of computing grunt as a high end desktop PC of, say, 5 years ago. And that was plenty, even back then, for a relatively inefficient general purpose operating system.
But the primary wake up call, at least for me, is that the new iPad finally delivered an innovation that general purpose computing has been waiting on for thirty years: a truly high resolution display at a reasonable size and price. In 2007 I asked where all the high resolution displays were. Turns out, they're only on phones and tablets.
That's why I didn't just buy the iPad 3 (sorry, The New iPad). I bought two of them. And I reserve the right to buy more!
iPad 3 reviews that complain "all they did was improve the display" are clueless bordering on stupidity. Tablets are pretty much by definition all display; nothing is more fundamental to the tablet experience than the quality of the display. These are the first iPads I've ever owned (and I'd argue, the first worth owning), and the display is as sublime as I always hoped it would be. The resolution and clarity are astounding, a joy to read on, and give me hope that one day we could potentially achieve near print resolution in computing. The new iPad screen is everything I've always wanted on my desktops and laptops for the last 5 years, but I could never get.
Don't take my word for it. Consider what screen reading pioneer, and inventor of ClearType, Bill Hill has to say about it:
The 3rd Generation iPad has a display resolution of 264ppi. And still retains a ten-hour battery life (9 hours with wireless on). Make no mistake. That much resolution is stunning. To see it on a mainstream device like the iPad - rather than a $13,000 exotic monitor - is truly amazing, and something I've been waiting more than a decade to see.
It will set a bar for future resolution that every other manufacturer of devices and PCs will have to jump.
And the display calibration experts at DisplayMate have the measurements and metrics to back these claims up, too:
… the new iPad’s picture quality, color accuracy, and gray scale are not only much better than any other Tablet or Smartphone, it’s also much better than most HDTVs, laptops, and monitors. In fact with some minor calibration tweaks the new iPad would qualify as a studio reference monitor.
Granted, this is happening on tiny 4" and 10" screens first due to sheer economics. It will take time for it to trickle up. I shudder to think what a 24 or 27 inch display using the same technology as the current iPad would cost right now. But until the iPhone and iPad, near as I can tell, nobody else was even trying to improve resolution on computer displays – even though all the existing HCI research tells us that higher resolution displays are a deep fundamental improvement in computing.
At the point where these simple, fixed function Post-PC era computing devices are not just "enough" computer for most folks, but also fundamentally innovating in computing as a whole …
… well, all I can say is bring on the post-PC era.
Posted by Jeff Atwood
Is there a magic DPI which makes it qualify as a "post-pc" era device? Would 1920x1080 on a 10" device be "worse" than an iPad 3?
I think at a certain point, those are just numbers that the general public wouldn't care about or even notice unless directly pointed out to them. I would simply propose that the post-pc era began with the iPad 1 and that the iPad 3's higher resolution simply improved upon it.
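For what it's worth, pixel density is easy to compute from resolution and diagonal size, so the question has a concrete answer. A quick sketch (the 10-inch diagonal is the hypothetical from the question above; the iPad figures are Apple's published 2048×1536 at 9.7 inches):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(2048, 1536, 9.7)))   # iPad 3: 264 ppi
print(round(ppi(1920, 1080, 10.0)))  # hypothetical 10" 1080p tablet: 220 ppi
```

So yes, by this measure 1080p stretched over 10 inches is noticeably less dense than the iPad 3's display.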
I hate being put in a position to defend Microsoft, but history demands it.
Have you forgotten the viral Project Origami video from back in the day? Just look at it:
What MS promised, it couldn't fulfill. But Apple has -- and also gone beyond it with that iPad Retina Display.
Probably 250 pixels per inch or higher density for your average screen at reading distance (about 30cm).
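That ~250 ppi figure lines up with a common rule of thumb: a person with 20/20 vision resolves roughly one arcminute of visual angle, so the density at which individual pixels disappear depends on viewing distance. A rough sketch (the one-arcminute acuity figure is my assumption, not established above):

```python
import math

def retina_ppi(distance_cm: float, arcmin: float = 1.0) -> float:
    """PPI at which one pixel subtends the given visual angle (in arcminutes)
    at the given viewing distance."""
    distance_in = distance_cm / 2.54
    pixel_in = distance_in * math.tan(math.radians(arcmin / 60))
    return 1 / pixel_in

print(round(retina_ppi(30)))  # ~291 ppi at a 30cm reading distance
```

Which is why phones (held closer) want even higher densities than tablets, and a desktop monitor at arm's length could get away with less.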
I think they augment more than supplant, but we'll see. Maybe when they add a reliable voice-to-text feature to overcome this major shortcoming: typing anything of length.
The iPad 4, yeah the next one, might be my first Apple computer purchase ever. :O
I'm not sure that very high resolution displays are, in fact, _any_ part of "computing", let alone a "deep fundamental improvement" in "computing". Perhaps you meant that high res displays are a "deep fundamental improvement in computer displays"? As you probably realize, the display only shows you the output of the "computing". It actually has nothing to do with the "computing" part. If you were referring to the user experience, then there are words that reflect that milieu, as well. I don't mean to pick nits, but your confusion that equates a display's resolution with "computing" makes it difficult to take your more technical arguments at face value. It's like saying a better modem improves "computing".
While I'm quite pleased that Apple is pushing tablet hardware forward, I'm firmly in the camp that apathetically says they "just" improved the display. The higher-resolution display offers a more pleasant experience, but the same experience as with the iPads 1 and 2.
Anyone who didn't have a use for the iPad 2 won't develop new use cases due to a higher-resolution display that displays the same amount of content at once, so I don't place much significance on the development.
So now we get busy figuring out how to make tablets into our developer platform for the early 21st century. I love a good challenge.
"existing HCI research tells us that higher resolution displays are a deep fundamental improvement in computing"
Can you share any references for this? Sounds like interesting reading.
Presumably you had to buy two because in this amazing post PC era operating systems don't support multi-user accounts?
Dan Booth, I don't "share" my laptop with my wife. She has her own laptop. If she uses mine she uses Guest or if I'm already logged in it's not that big of a deal for her to just look something up. Same goes vice versa. The ability to have multiple accounts doesn't mean you have to share the device.
Now, in Jeff's case you may be right. I won't speak for him specifically…
I agree with Jeff. The experience of computing on an iPad wasn't complete until this revision. Personally, I hated my iPad1 and didn't see the reason to get an iPad2 as there was no change in screen resolution. Like Jeff, I now have two iPadNew.
I heard today that the Xbox outsells all other kit and is now ranked as the most common device in the home; presumably this is part of the "existential crisis" you mention...
I lived and worked through the early laser-printer era. 200dpi was a technological marvel but not actually useful for real work. 300dpi was a game-changer: you knew you were looking at computer printout, but you could actually work with it. TeX came into its own, math formulas and all. When 600dpi came in, you could work with the printouts without really noticing that they were computer printouts.
I also had a chance one summer to work with an APS-5 phototypesetter having, I believe, 7200dpi of resolution. The letterforms were absolutely gorgeous, but it was clearly ridiculously high end and far beyond what was needed for usability.
I have seen the new Retina display, and I welcome the new standard. But I will be even happier when we get another factor of 4 up to 500dpi. Pixels rule!
In 2007 I asked where all the high resolution displays were. Turns out, they're only on phones and tablets.
Oh, absolutely. I can't find a high-resolution computer monitor - anything above 1920 × 1200, at any size/DPI - for less than the price of a whole iPad 3. This is quite insane.
"all the existing HCI research tells us that higher resolution displays are a deep fundamental improvement in computing"
Mind providing some citations on that?
@A Facebook User
Go back to Facebook, kid. And take your nits with you.
I look at those Gmail icons and say "Whoopdeedoo!". Do I really care that the space on which I pressed my index finger is packed with more pixels? No. I get my email either way. My iPad 1 is just as useful a book reader or Netflix viewer. (Sure the camera is nice but...)
It's like when you're reclining in a theater seat watching the latest blockbuster. The great new CGI effects will wow you up to a point, but there still has to be a story. Those super crisp pages of the ebook I'm reading still have to convey some useful content.
What made tablets great, even revolutionary, wasn't the awesome display. It was a new form of human interface. Fingertips replaced the mouse and the keyboard... well, to be honest I couldn't have created this post without a keyboard. I agree with Castaa. Once they perfect voice to text, we'll really be rocking.
Wrong. The desktop is far from dead. I don't know anyone who can produce on a tablet or phone like they can produce on a desktop, and that is the only reason it will never go away. Look at that Windows 8 "metro" disaster waiting to occur...... put that on a desktop and it's certainly dead. The PC will always be the place where people will generate content and build the code that runs those phones and tablets and whatever is next. Not vice-versa. It's nice for games and for photos and browsing maps and looking up stuff - sure, but it's clumsy as heck, so give me my desk with my keyboard and mouse and y'all can keep playin' with your tablets.
Great post. I do agree that the new display drives greater usability and a more satisfying user experience. I find myself reaching for my new iPad more because it is such a pleasure to use. With snappy response, great reading quality, and maximum flexibility with networking (4G Verizon), I have a machine that meets 55 to 60 percent of my computing needs. Email and content viewing can be handled by the device. My computing experience is "blended" among 3 devices: laptop, iPad and iPhone. It will be interesting to see how the competition responds. Current proponents of other platforms tout unique app management, "open" systems, minimal app oversight. If that is the best differentiation, Apple will continue to rule the post-PC world.
You really should give credit for images that you "borrow" for your post. The macro shots are the ones from The Verge (http://www.theverge.com/) and you really should give credit (especially if you didn't get permission to use it).
It's quite smart how Apple has marketed their displays relative to the resolution of the human eye. By emphasizing the 'Retina' name, they claim the resolution title, and simultaneously nip any further competition in the bud by ingraining consumers that more pixels aren't going to be visible anyway, so they won't bother paying more for it. (Of course that assumes any other manufacturer can even match this display... someday perhaps :)
This sort of reminds me of the debate over how high you should set the bit rate when ripping CDs. Audiophiles will insist they can tell the difference between 128 and 256, but the average person can't. It's great that the new iPad finally has the resolution it should have had from the beginning, but I don't think it's exactly a hallelujah moment. What can you actually *do* with a New iPad that you couldn't do with an iPad 1 or 2? I would think the practical advantages are pretty much that you can read text a bit faster or squeeze a bit more content on the screen without having to squint. Perhaps photos look a bit sharper. But all of these are incremental benefits, not a revelation.
The real revelation was the difference between the iPad 1 and everything that came before it...
As an old programmer, reading the comments here reminds me of the discussions about the first graphical user interfaces. What was a mouse good for? I could do the same work much faster on my 80x25 screen.
Indeed, I think the new iPad display is an incredible step forward for computer users. The device feels so natural, I almost forget that it is a computer.
But the post-PC device will leave room for desktop PCs, mainly for content creation. Though to be honest, I spend most of my time on content consumption, not content creation.
@Takkun the image is already linked to the source
@Axbm great reference. I loved this quote from http://www.useit.com/alertbox/9703b.html
> The screen readability problem will be solved in the future, since screens with 300 dpi resolution have been invented and have been found to have as good readability as paper. High-resolution screens are currently too expensive (high-end monitors in commercial use have about 110 dpi), but will be available in a few years and common ten years from now.
Heh, not quite. That was written in 1997, and "in ten years" would have been 2007. Only now in 2012 are we getting the very first truly hi-res mass-produced screen. But it's attached to this Post PC tablet, you see...
The Samsung Galaxy Tab 11.6 will have an 11.6" display with 2560 x 1600 WQXGA resolution -- and be open. I'm waiting a few more months for that. Because while you're right about post-PC, you're not right about the closed world of Apple. Innovation doesn't play well with boundaries.
Can you use it on the beach when the sun's out?
So what we really need now is to connect those tablets to the PC to use them as the new human interface. Something like they started doing with iPad and Photoshop.
Resolution on most devices is a novelty, not a necessity. The idea of increasing resolution stems from the need to increase screen real estate so more information can be displayed. The current trend of increasing pixel density (which many manufacturers are pursuing, not just Apple) simply makes the display "prettier" but no more useful (unless you look at a lot of photos). You can't add any more info to the already tiny 10" screen because of physical limits. Who wants text that's 1mm high?
Secondly, as the iPad 3 has shown, it requires a lot more memory and more processing power just to drive the display. It needs a bigger CPU and a bigger battery without any true benefit to processing power.
Lastly, like many fellow tech enthusiasts, I truly hope that the future is NOT Apple. I don't believe it is, but what a nightmare that would be.
I still want to run real software, and real multiple displays that are useable from a distance, and have a CHOICE, etc etc
AND I want more processing power than a 10 year old desktop system - 1 Gig CPU with a whole 1 GB RAM - and 4 core graphics. - Wow - my old P4 had more than that!!!!
I want 3 to 4 Gig CPU, 4 to 8 Gig Ram and 100's of graphic processing cores and true multitasking and terabytes of storage etc etc etc
I remember when I first got HD TV - I was amazed at the difference in quality. I'd spend ages regaling everyone I knew about how great it was. However, in reality a lot of broadcast content is still supplied in SD format and I soon realised that when a program was good then I never really cared or thought about "the pixels" because I was engrossed in it. (Probably my favourite TV of the year was an SD broadcast of a Danish series ("The Killing") - with subtitles).
So, yeah, a great display is nice - but it is content that defines the experience. A funny video of a cat riding a tortoise on YouTube won't be funnier because it's in high definition (and, in fact, won't be in high definition on the new iPad as a lot of content will have to be upscaled). Certain things like readability will definitely be improved, but backlit screens will always be more of a strain on the eyes than low-tech paper and won't work great in bright light.
Personally, I'd like to see Apple innovate a bit more in terms of the UI. Does a home screen full of icons become a better user experience because those icons are a little bit sharper? Or is the problem that a screen full of static icons is actually awful UI (equivalent to a Windows 98 desktop of an elderly relative packed full of shortcuts)?
I don't see a move from the PC to a "Post PC Era". To me it looks as if the computing world constantly tries to figure out what the ratio between computers and users should be.
Every ten years or so somebody notices that it is inefficient or somehow not good enough if [only one user uses a given computer|a given computer is used by more than one user] and advocates that instead [many users should use one computer|every user should get his own computer].
In the 60s and 70s we called it mainframe, now we call it cloud. The clients are more sophisticated (the iPads and phones and the like, not the users) and the cloud consists of many networked computers rather than one big one, but the effect is the same: many users use one computer (system) again. (In the 90s the expected shift from many computers to one computer system failed because of a mismatch between user expectations and available connectivity technology. But we did get the Web.)
I think that perhaps the entire history of computers can be explained as the constant struggle to make one computer system support more than one user whenever every user had his own computer and to give every user his own computer whenever one computer system was used by many users. This resulted in more computers and more users because we always added but never subtracted.
"Jakob Nielsen wrote about 300 DPI displays back in 1997: http://www.useit.com/alertbox/9703b.html"
This is off-topic, but just too interesting. From the article:
"Use hypertext to split up long information into multiple pages"
That is bad. That is really really bad. He got it so terribly wrong even though he wrote the article at a time when PCs were as badly connected to the net as mobile devices are now.
I absolutely hate loading a Web page for a minute or two (depending on location) and then find that after a minute of reading I have to load _another_ page (and another, and another). The ridiculous habit of pretending that a Web page is like a page in a book or magazine is, imho, one of the worst features of the Web today.
Good thing Jeff's blog is not like that.
Interesting article, although I'm still not quite sold on the whole "tablet" as a form of computing.
The trouble with high resolution PC displays is that Windows 7 and earlier don't scale up well. You can increase the font size, but that doesn't make everything scale up, so some things you need to click on are very small. Also, not everyone tests their applications with different font sizes, so you get strange effects with bigger fonts.
Great post! I had read your post about high resolution monitors around 2007. Back then 24 inch screens had a resolution of 1920x1200; fast forward 5 years (bloody 5 years!) and we still have 24 inch monitors with exactly the same resolution. Worse still, with these HD marketing gimmicks the resolution has actually come down to 1920x1080 on other displays! Can you fucking believe that? Laptops come with that eyesore of a 1366x768 display. I have avoided buying a laptop for the last 5 years just because of this. When the rumours were going around about the new iPad having a retina display, I thought it was more wishful thinking than anything else, but they actually did it. I truly hope this innovation will be replicated across the board as far as displays are concerned. I still can't get my head around the fact that the 10 inch iPad has more pixels than my 24 inch monitor. This has to change.
It's an interesting article, but I'm not convinced.
Ignoring the current leveraging going on around the environments that these new devices exist in (because that's arguably a separate though important discussion e.g. I'd never buy anything from Apple), tablets & phones only augment the tech landscape. They don't replace it. We're still going to have servers and all that that entails. We're still going to have legions of workstations for content creation.
We haven't even begun to see the effects of long term PC use, let alone heavy phone & tablet usage. How long before we see the next set of health effects of working too long around a cramped too-small tablet trying to type long documents for example? 'oh but I use my iPad for everything; I bought it instead of a PC'. Maybe they'll call it iPad Shoulder to match the Nintendo Thumb.
It's lovely that the iPad has that resolution - but is there any user value to it? Communication is still more important than watching high resolution Youtube of a cat. Hasn't the entire crux of the article been that the tablet & phone aren't aimed at general purpose computing, but email, browsing, what 'the general public' want? Beyond occasional hits like Angry Birds, the gaming market is still firmly on consoles & the PC and for some good reasons. So I'm still not sure I see what the benefit is beyond the wow cool & current gadget factor. I applaud the innovation and maybe it will drive something by itself, but I don't think the average consumer (which in fairness doesn't really describe Apple's target market anyway) would be that concerned.
If I were a betting man, I'd still look to the mobile phone market first simply because more people care about making phone calls than having some very expensive paperweight that has a bigger display, and more people will be able to afford one GSM SIM than two (and so they're going to have a phone first either way). Internet access just isn't as ubiquitous as people like to claim, either.
I don't think these devices are 'there' yet. When cheap tablets are being given away in third world countries to help jumpstart their access to the Internet (which happens with PCs), then we're in the post-PC era. I'm not even going to touch on things like the reuse/recycle value of a general purpose PC vs the number of locked mobiles & tablets that get thrown away each year because the new big thing is out.
Thank you for pointing out this truth which should be obvious, but obviously isn't. Screen DPI is the one thing that has not improved in the last ten years, despite making such a big difference on usability.
As for "but is there any user value to it?":
Yes. We can finally do away with all the annoying anti-aliasing algorithms, with all the blurry true-type fonts and ugly pixel-perfect fonts, we can skip calibrating monitor and graphics card to get proper sub-pixel AA, and most of all: Make all the different software do this right.
I spend all my day staring at a relatively blurry, big but low-res screen (~26" at 1920x1080), and I don't think it's doing my eyes a favour. This is the biggest jump in user-friendliness we've seen in a decade. XP was very usable already; Windows 7 is a lot more polished, but it's essentially the same thing. I'm still using a mouse, I'm still using a (mechanical) keyboard based on an ancient design by IBM. I can't wait to get a 300 DPI screen, or two.
I'm genuinely amazed there's not a single comment here about how it isn't possible/practical to write code on a "post-pc" device.
Bar a few specialised industries, how many people can really do all their work on an iPad?
When can I get a damn 2x resolution 27" imac? I'm guessing apple will be first there too (as in double the pixels on the desktop but only use them to make stuff clearer, not make everything smaller), probably on the MacBook air first. Bring it on!
@Mattola Pollardo why would you want to take it to the beach? So those billions of fine grains of sand can blow into every nook and cranny - especially the charging port?
"I'm genuinely amazed there's not a single comment here about how it isn't possible/practical to write code on a "post-pc" device.
Bar a few specialised industries, how many people can really do all their work on an iPad?"
@Jack, I wrote one above. It's absurd to think that I can truly be creative on those things, and it's even more absurd that the marketing einsteins at Apple or Microsoft or Samsung or wherever try to tell me that I could be productive and creative on them. They're toys, and I do have one, but it's just a toy. It's clumsy; don't try to hand someone the iPad so you can show them something, because they're going to grab it and touch the screen and activate SOMETHING - but what you're trying to show them will be gone.
It is what it is, but it's not a true-blue desktop workhorse and it won't be for quite some time - if ever.
Surely not really post-PC, simply post-laptop. Desktops have a reason to exist, and will do so. It's laptops that don't, really. The half dozen or so people worldwide who really do write a novel in a coffee shop don't really constitute much of a market...
Why on earth would I want to buy a new system that I would use most of the time with about as much power as a 5 year old (now outdated) PC and a display that's even smaller? Granted, the higher PPI sure looks good, but I like bigger displays of sufficient resolution more than smaller displays with insane resolution. I remember reading somewhere: why am I watching netflix on a tiny ipad when there's a 50" TV in front of me?
Again, hard to imagine any kind of serious content creation on these things. I've never heard of anyone editing video on an ipad. Coding might end up with me stabbing someone, and virtualization certainly isn't happening either.
And speaking of post-pc sales numbers, why are post-pc devices being thrown into pc sales (therefore making Apple the #1 pc company in terms of sales)? Either they are PCs, or they are post-pcs and don't throw them in with pcs. Just goes to show that the entire news culture is part of the Apple cult. It sickens me that it's headline making news when there's a new Macbook pro/air/imac/mac pro (or even if there's a shortage of them), but no one gives a crap whenever there's a new Dell XPS or HP notebook.
I am excited for the next-gen MacBook Pro that has this same display. Seems like the next natural progression. Thanks for the perspective Jeff. I had considered buying several iPad 2 units for my wife and kids, but you are correct. Why? A hundred bucks buys more than double the experience.
But until the iPhone and iPad, near as I can tell, nobody else was even trying to improve resolution on computer displays – even though all the existing HCI research tells us that higher resolution displays are a deep fundamental improvement in computing.
Actually, the original Droid, released in October 2009, had a 265ppi display. I attended a seminar by Edward Tufte in April 2010, and he specifically pointed out the high resolution display of the Droid. The iPhone 3GS had a 163ppi display.
The funny thing is that Motorola didn't have Apple's marketing skills, and the high resolution display wasn't given much fanfare.
The iPhone 4 and its 326ppi Retina Display™ wouldn't come out until June 2010.
Ajbrehm: the reason websites split content up into multiple pages now isn't because they're following his accessibility guidelines - they're doing it to increase page & ad views.
I'm surprised nobody has mentioned the one thing about that graph that bugs me: iPhones, iPods and iPads, in general, cost *significantly* less than Mac devices. I think that has a fairly significant impact on number of sales. Not to mention, people will replace their iPhone generally every 2-3 years or so, whereas people are going to go a bit longer before they replace their Mac (also due to that difference in price). And you also really need to be looking at the slope of the Mac graph over the last 5 or 10 years. It's fun to compare to the start of life for the Mac line, but let's be realistic: the iPhone, iPod touch and iPad were all born into a VASTLY different consumer tech environment than was the Mac.
iOS hasn't supplanted Unix - it's derived from Unix, and validates its good design.
As impressive as these devices are, I'm glad I still can have my clunky old PC's where I can remove and add hardware and software as I like.
The graph, although interesting, tries to compare devices which are not comparable.
For one, the relative price of the various "computers" is very different, I have no doubt that if the Apple II cost the equivalent of 400 USD when it was launched way more people would have bought it.
Then comes the intended usage. Apple II had a very limited use (either pure office work, or some nebulous programming). What is today the use of mass spectrometers in the general population? Pretty abysmal. They are very useful devices, though (especially if you watch NCIS).
The iDevices graphs have roughly the same slope, the iPhone being also a phone (which adds functionality = expected increased consumption).
So all in all the results are not that surprising.
It would only be post-pc if we could sell all our computers and still get our jobs done. Not even close. I still need three monitors (and a few more would be better) as well as a large keyboard and a mouse.
Granted, I am a developer, so I need more specialized hardware, but even our secretaries would struggle terribly trying to look up phone numbers and addresses on a relative clunker like the ipad3.
At home, I have the same problem. I can't do anything useful without an actual computer. I have a tablet, and I love it, but I still need the PC. Until tablets get more useful so that you can actually do real work with them, we are still going to need to the pc.
I don't know that you will ever be able to do real work with such a small screen.
As a sidebar, when printing went digital in the late 80s/early 90s, images that were processed digitally were "high" resolution at 300dpi. Those high-resolution images were used when printing in 175lpi glossy magazines and travel brochures. With the retina display of the new iPad approaching this resolution, it's "close enough" to the same quality as a high-quality glossy magazine, and designed to be held and used at the same distance.
I'm sure Samsung will also have an HR display tablet soon™ (although I don't follow the tech that much) for the Galaxy line. The "post PC era" is true for the general public, but (at the moment) you still need a desktop/laptop/workstation for specific tasks that aren't going away anytime soon, including programming applications for the very devices coming up in the "post PC" era.
You're making the classic mistake of thinking that just because something's not right for you, it's not right for anyone.
To completely rip off Steve Jobs: the desktop is a pickup truck, the iPad is a tiny car.
Some people need a pickup truck because they do certain things - in the IT world, say, compiling code (although with the iPad you can just SSH to a server and code just fine in Vim, but that's beside the point).
But most people out there just need the "small car": browsing, or maybe reading and making changes to MS Office (or equivalent) documents, all of which is incredibly easy on the iPad.
I code across multiple languages for a living, so I need a pickup truck and I'm cool with that, but that doesn't mean I have to force my parents to drive one when all they want to do is surf the net, email and look at pictures of their grand kids.
Re: stuff about "open". Come on now, we all know Android isn't "open"; even the most staunch Google supporters would agree with that. Also, what innovation has Android given us lately? Every Android tablet so far has been much worse than the iPad, can't be updated, has horrible battery life and a low customer satisfaction rating.
Yes, that beautiful new display is completely necessary when half of it is filled with the image of a keyboard. Try as we might, we can barely find a use for the iPad in our household, regardless of how crystal clear the display becomes.
Having a 4th-gen iPod Touch w/ Retina, the pixelly mess of the 1st/2nd-gen iPads was distractingly obvious. If I owned both, I'd probably get used to the iPad because I'd use it more frequently for the expanded space, but given a choice between a Retina and a non-Retina iPad? That upgrade alone is easily worth $100.
Now if we could get PC display manufacturers to make Retina-class 3840x2560 displays...
The Post-PC world was declared back in 1998 as a result of the Palm Pilot. I first saw the term in InfoWorld. I found this posting of a Red Herring article from back then about the top 10 trends of post-PC computing.
Odd, no mention of Apple. You'd think if Apple invented this term... :-)
I like my iPad, but the reality is it's a complementary device to my PC. It doesn't replace it. To make it a replacement would involve making it into a PC.
So from my point of view, mobile phones and tablets have expanded my computing abilities. I'm able to do things today I didn't use to be able to do. That's what is meant by the Post-PC world. It doesn't mean the PC is going away, it just means the PC is no longer the center of your computing power.
I agree with part of what Castaa said. I will always use a PC if I need to compose anything longer than my email address. In fact, having to type my email address on screen is painful. The next major problem that must be solved is input. I still think innovations like keyboards on the back of a tablet have promise. However, I still hope for a brain-wave interface :p
There's one thing left before the world goes into the post-PC era, I guess: building apps on the tablet/smartphone. It would be cool if we could write code, add a server and test our apps, all on our tablet/phone. That would make the post-PC transition complete.
But until the iPhone and iPad, near as I can tell, nobody else was even trying to improve resolution on computer displays
There's a reason for this: pixel density above ~100ppi is irrelevant on desktops and laptops. These devices are typically viewed 25" from your eyes, which is double the viewing distance of tablets! At this distance, one can't distinguish individual pixels, and a typical 92ppi 24" monitor qualifies as "Retina".
Even the low ppi on HDTVs is fine. Since you're typically sitting 6-8 ft away, your eyes can't distinguish pixels on a 1080p 50" display, and thus these devices qualify as "Retina" too.
So, relatively speaking, the new iPad really didn't achieve anything astounding in terms of display technology -- in fact, computers and TVs have had "Retina" displays for years. The new iPad just brought tablets up to speed.
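The viewing-distance argument above can be made concrete with a little geometry. Here's a quick sketch in Python using the common one-arcminute acuity rule of thumb; real thresholds vary by person and by the criterion chosen, so the exact ppi figures are illustrative only:

```python
import math

# Rule of thumb: 20/20 vision resolves about one arcminute of visual angle.
ARCMIN_RAD = math.radians(1 / 60)

def retina_ppi(viewing_distance_in):
    """Approximate ppi beyond which individual pixels blend together
    at the given viewing distance (in inches)."""
    return 1 / (viewing_distance_in * math.tan(ARCMIN_RAD))

for d in (12, 25, 96):  # roughly: tablet, desktop monitor, living-room TV
    print(f'{d}" away -> ~{retina_ppi(d):.0f} ppi')
```

The threshold halves each time the viewing distance doubles, which is why a living-room TV gets away with far fewer ppi than a tablet held at arm's length.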
I think a lot of commenters here are missing a couple of key points. First, "Post-PC" doesn't mean that PCs disappear, it only means that the world ceases to revolve around them anymore. Everything you create assumes an audience of tablets and phones, and you only limit yourself to a PC when you have to. Second, the retina display isn't about just being nicer - it crosses a threshold. Like crossing the threshold from really cheap to free, it removes a perceptual barrier.
Actually, the original Droid, released in October 2009, had a 265ppi display.
Yes - at a resolution of 854×480.
Are calibration tweaks possible on the iPad 3, as mentioned above?
To flip the coin over, I am so glad the tablet era has arrived. So much easier for lots of simple things. Email, movies, books, news and most of my little games—on the tablet.
Still. I think it's too soon to declare the desktop dead. Maybe dead for being exciting and new, but still very useful and very relevant for most of the work being done in industry.
That said, the PC could be killed by a few feature additions:
- Really fast and complete tablet docking stations
- Really effective voice to text
- Better gestures to do more complicated work
All these exist, but they are not of high enough quality yet. Make high-quality versions of all three, and suddenly the PC is unnecessary.
Wojtek Swiatek: "The graph, although interesting, tries to compare devices which are not comparable."
Agreed, but even more so, I'm really skeptical that you can draw any real conclusions from that graph (let alone the conclusions the author makes) based on the differences in time periods measured.
The Apple ]['s starting point is 1977, and the Mac's starting point is 1984. You can't compare the speed of market penetration for an expensive and exotic personal computer in 1977, when nobody really knew what a personal computer could do and there was no established software industry, with the potential of selling a much less expensive and well-understood device to a marketplace of highly tech-savvy people.
You may as well put the automobile and the television set on that chart too - you'd find that their curves are much less steep than the personal computer's, even though those products were absolutely revolutionary and became ubiquitous in our everyday lives. America is just a different consumer environment now than it was when the PC was new, or when the TV and automobile were new.
Specifically, America in the last decade is far quicker to learn about and embrace new technology of any kind than it was receptive to computers in 1977-1984. The whole marketplace is way too different to assume that the difference in the curves can be attributed to the factors called out in the article.
Overall I liked and enjoyed the article and the comments, but that really stood out as a poor interpretation of data.
Since this is a "programming" blog, I'll comment on "PC".
To me, as a "programmer", the PC represents a "programmable computer." This is the origin of my attraction to it.
The iPad is not a "PC" because it is not directly "programmable" - you need to get a "PC", more specifically an Apple "PC", to create a program for the iPad, then distribute this program to the iPad through a strictly regulated application signing and distribution mechanism controlled by Apple. I have done it - loads of fun. Fascinating that the hardest part of writing an iPad app is getting the thing digitally verified and running on the iPad.
I find the iPad an amusing device for "consuming" media; as a "programmer", I tend to spend more time producing software.
Now if there were some sort of verbal compiler for the iPad...
as Shakespeare might have put it today:
For integer I equals zero. I, less than one thousand? I, plus plus!
Your posts are always so insightful. I love reading your blog.
Let's have a look at what I use my computer for:
Can I do this with an iPad? Yes
Can I do it as fast as on a PC? No
Can I do this with an iPad? No
Can I do this with an iPad? No
Am I living in the post-PC era? No
...You're telling me that you bought two oversized and overpriced iPod Touches (because, let's face it, that's exactly what the iPad is) just because they have a high resolution?
Is this one of those "PC gaming is dead" kind of speeches?
These devices cannot fully replace a PC unless they become flexible enough to be called PCs themselves. Not in the strict technical sense, but in the original sense of the word. Once I hook up some decent input devices to an iPad and run a sensible operating system on it (iOS doesn't qualify), then I have no fear of calling it a Personal Computer.
so, I assume you wrote this post on your new iPads, right?
I have to disagree with you here in some respects. The tablet is very useful; I've been waiting for decades to see them come to fruition. Apple, however, hasn't yet won the game, nor are they likely to. Their operating system is piss-poor, especially in the UI department (seriously? fields of infinite icons?), and its methods of language input are a nightmare (on-screen keyboards can GO TO HELL >:|).
I type at between 80 and 100 WPM; I'm a writer and programmer. I talk at maybe 30-40 WPM, so even if we got perfect speech-to-text it would be a pain. Not to mention punctuation and grammar through speech-to-text are a double pain.
Also, I game, and I'm sorry, but neither the iPad nor any Android devices out there are quite powerful enough to handle my favorite games. This is more a symptom of lazy and inefficient coding than anything else, but that has become standard.
I admit, completely, that the iPad 3's display is amazing: crisp, clear, and high DPI. I've been waiting for a display like that for eons. However, that isn't enough; they do not do enough with it. Not to mention that trying to use your finger to touch many of those small buttons is a trial of patience. Not everyone has fingers that small!
I partially disagree. At least with the terminology.
What does "Post PC" mean? The fact is that now your phone is a "PC". More than "Post PC", we are in the era of "Pervasive PC".
That said, I believe there is still space in the world for conventional computers. Both the client devices and the servers hosting the services need to be programmed somehow. You don't want to call it a "PC"? More than welcome - let's call it a "Workstation" then.
Moreover, I do not believe that people doing CAD will move to tablets and touch screens anytime soon: they still need large monitors and a precise pointing device (and fingers are not one).
On the other hand, I think a tablet is everything my mom needs to do whatever she does with a PC today, i.e., writing some emails, reading recipes and checking what other people are doing on Facebook.
To go back to the point of your post, many things are "dead" according to the blogs these days: mainframes are dead, Fortran is dead and even C is dead (in the words of a friend of mine). They are not. They just represent a smaller slice of the market, and they work *very* well in their niche. The fact that the market share of these products is smaller is not because fewer people use them, but because a lot more people use other (newer?) technologies. The number of Fortran programmers did not decrease with time; there are just more PHP programmers around.
"they achieved their mission years ago"...Right.
So, what do you think they use most in Somalia? Vaio or Apple?
Jeff, I usually read your posts with great pleasure, but it makes me sad that even smart, educated people like you mistake (Western Europe + USA) for "the world".
Let me remind you: 80% of the Earth's population does NOT live the same way you do. I can't be bothered to look up numbers right now, but a very, very large percentage of those people don't own a computer, and a sizable number of them wouldn't even know what to do with one.
"When was the last time you saw a desktop or a home without a computer?"
Every day, everywhere, here in Lebanon, although it is a fairly rich country (richer than other underdeveloped countries). In the capital, Beirut, a good number of people own a desktop, but still, in many cases, they don't. Outside the capital, you sometimes have one desktop per street, and people knock on their neighbor's door to use it, or go to network centers.
In these countries, most email checking and communication goes through mobiles - but not tablets, mind you, and not iPhones either.
And if they own another machine, it HAS to be all-purpose, because there just isn't enough money to get yet another box for playing and another to do office work. That one desktop has to be able to cater to the daughter's school research needs, the son's office work, the father's porn browsing and the mother's Facebook, plus the occasional game or specialized software install.
So next time you plan to get all prophet on us and make a prognosis, be sure to specify that you are talking about your own geographical surroundings, because obviously you have no idea how the rest of us live.
I think this is the "second" post PC era. The first was after the fad with cheap home computers---the Vic 20, the Apple II, the PCjr, ah the days! They went away "when people got tired of boxes that go 'bing'".
The second PC era began with web/multimedia/etc.
Perhaps there will be a third PC era---I don't know, because of in-home manufacturing through the follow-on to 3d printing? Telepresence that doesn't have the problems of teleconferencing today? Will we all lock ourselves in our homes Asimov-style?
Todd Vance: "I think this is the "second" post PC era. The first was after the fad with cheap home computers---the Vic 20, the Apple II, the PCjr, ah the days! They went away "when people got tired of boxes that go 'bing'". "
I have to disagree with this. Yeah, once the novelty wore off, there was a burnout in the early 1980s on the absolute lowest price point computers that didn't really do much (the Timex Sinclair and some of the TRS-80s come to mind, maybe the TI-99), but of the computers you describe above only the VIC-20 might qualify as a computer that just goes 'bing,' and the VIC was still pretty capable for its time. The Apple II was an extremely versatile computer that spawned the first spreadsheet program and some of the first word processors, and with development was a viable competitive product into the 1980s. The PC Jr. had its issues, but the very similar Tandy machines were successful, as obviously was the full-strength IBM PC and the dissimilar but still very capable Commodore 64.
You seem to be implying that PC buying hit a big trough after the early 1980s, until the second PC era began in the early 1990s with the Internet and multimedia, and I just don't think that's true. It's not as if people purchased most early PCs as a fad, and subsequently got rid of computers altogether until the 1990s. Instead, they upgraded. People who bought VIC-20s may have replaced them with a Commodore 64. People who bought IBM machines may have replaced them with 286s, then 386s. People who bought Apple IIs may have subsequently moved on to Macs. Customer adoption of the PC just kept building through the 1980s as technology improved, but that doesn't mean that the early PCs were a fad.
I do agree that the web and multimedia accelerated the process, though. Access to amazing new quality and quantity of content really helped increase adoption.
I apologize, but I had to stop reading at your 3-point bullet list.
Any IT-literate person cannot possibly desire to be living in a world with devices that are highly specialized to perform ONLY basic tasks and by design lack the capability of expansion (for any purpose that is).
How can you possibly advocate this kind of opinion?
...I feel inclined to conclude this comment with a very cheap but compelling citation from the realm of /b/ or /g/, but I am bigger than that.
> The trouble with high-resolution PC displays is that Windows 7 and earlier don't scale up well. You can increase the font size, but that doesn't make everything scale up, so some things you need to click on are very small. Also, not everyone tests their applications with different font sizes, so you get strange effects with bigger fonts.
Well-behaved Windows applications should scale up all user interface elements, and in general the OS itself has been pretty good at doing this since Windows Vista. Regrettably, making programs that look good at multiple DPIs is harder than it should be: try getting pixel-perfect bitmap images in a WPF application at multiple DPIs, for example. WinRT at least provides built-in support for multiple-DPI bitmap images, but until that framework is made to work on the desktop, it's useless for many classes of programs.
As for being in a post-PC era, can we keep the high-resolution screens and leave the walled gardens?
Things have been improving, bit by little bit. I bought a 27" monitor in 2007 with a 1920x1200 resolution, and in 2011 the best available at that size was upgraded to 2560x1440 - 30% more in each direction. I'm thinking about it. Compared to the doubled Apple resolution, it's not much, but it's not like manufacturers are completely forgetting that aspect.
As an aside, reading text for an hour on the iPad 2 hurt my eyes. Reading on an iPad 3 does not, as long as the room is adequately bright. It does make a big difference in how much you have to squint.
> Heh, not quite. That was written in 1997, and "in ten years" would have been 2007. Only now in 2012 are we getting the very first truly hi-res mass-produced screen. But it's attached to this Post PC tablet, you see...
I bought a Toshiba u820 in 2008, so he was pretty much spot on.
It sucked! Windows isn't really built for being used on high dpi-devices (try setting the dpi to anything but the default and be prepared for a world of hurt) and Toshiba wasn't/isn't really prepared to make a high profile product with ubuntu (or another distro).
@A Facebook User Thank you, thank you so much for pointing that out!!!!!
A better display is not computing. Didn't we have fun back in the day with our Amigas and 486 CPUs? We sure did! Jeff, I'm sorry to say that I'm disappointed in your post. Perhaps you wrote it just to gain popularity, since the new iPad keywords are extremely popular these days? :S
Congratulations on having good near vision. For the farsighted folks out there, of which there are many, reading anything up close can be a considerable challenge which no level of pixel density will help.
I'm blessed/cursed with mild nearsightedness, and I do miss the pixel density of my iPhone 4 - that's about all I miss about it.
Saying that those who complain "all they did was upgrade the screen" are bordering on stupidity is harsh and subjective. What one person considers a must-have feature, the next can hardly be bothered by. And then there's the fact that not everyone is a gadget junkie. And on the other end of the spectrum are the technophobes who are scared of specs.
For many, a device with a "retina display" is something they can't do without, because Apple has spent over a billion marketing dollars to convince them of such. When it comes down to it, many can't tell the difference: http://www.pcmag.com/article2/0,2817,2401726,00.asp
The placebo effect is a real thing.
Smartphones are faster and have more memory than mainframes of the 1980s. Everyone now has the problem of figuring out what to do with them. The computer industry has the problem of figuring out how to make money with them or off the users.
How many useless variations in operating systems do we need? How often do we need to upgrade? We are supposed to buy a computer because it is thin? LOL I would rather have a thicker computer that would run for 24 hours.
(from twitter) @clipperhouse: I wish there was a product that combined an iPad, a keyboard, and a some way to prop up a screen
Neither Windows nor Mac OS X scales that well - too many assumptions in too many programs would be broken. Even iOS with its retina displays does not really scale. The reason why Apple exactly doubles the resolution, while keeping the size the same, is that iOS now treats a square of four pixels as one pixel. Oh, the text is properly anti-aliased, and you can display icons, photos and videos accurately, but the default coordinate system is still the same as on the original iPhone or iPad. That makes it easier for applications to run unmodified. Of course, unmodified apps have low-res icons and other graphics, so they only benefit when rendering text and vector graphics.
Fortunately, you can change the scaling in Quartz 2D and optimize your code for retina displays. But you do that, again, by knowing what the screen resolution is; the next step has to double the resolution again, and the programs have to be adapted again.
Apart from KDE 4 and Gnome 3, no UI has made the effort to go fully scalable, i.e. using vector graphics for the icons (pixel graphics will always have scaling issues), and actually honor the dpi information of the screen (Android has screen classes, so while Android apps are not automatically scaling to whatever screen there is, the situation is good enough). So at the moment, a high-res screen for a desktop PC would be only usable on a Linux system, which probably explains why nobody is making them - the market is too small.
My hope is that the new iPad puts enough pressure to make high-res screens with whatever stopgap technology you need (the iOS approach to just double the resolution, but don't tell legacy programs should be good enough for Windows, too).
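The point-versus-pixel split described above is easy to model. Here's a toy sketch in Python; the names are mine for illustration, not Apple's actual API:

```python
# Apps lay out against a logical point grid that never changes.
LOGICAL_POINTS = (1024, 768)  # the size every iPad reports to apps

def backing_pixels(points, scale):
    """Physical pixels backing a logical size at a given scale factor."""
    width_pt, height_pt = points
    return (width_pt * scale, height_pt * scale)

# Scale 1: original iPad. Scale 2: retina iPad - 4x the pixels,
# but unmodified apps still see 1024x768 points.
print(backing_pixels(LOGICAL_POINTS, 1))  # (1024, 768)
print(backing_pixels(LOGICAL_POINTS, 2))  # (2048, 1536)
```

Because the logical grid is constant, old apps keep working untouched; only code that opts in (e.g. by supplying higher-resolution assets) sees the extra pixels.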
I think many, if not most, folks here are missing the fundamental point of the higher resolution display:
Couple this with a stylus, and you bet we're in a whole new world of computing.
Current displays are not close enough to paper resolution to make them good for anything other than sticky notes. But with the iPad 3... and similarly spec'd devices... I think we're on to something.
The title is misleading. While the rise of different (more mobile) devices is evident, the static multi-purpose machine is still in great need. In fact, cell phones are getting bigger and more feature-rich to mimic what we get in a PC. So the PC metaphor is more valid than ever; the problem lies in the actual implementation, the actual PC machines.
About screens: hardcore gamers try to partially solve the problem by using multiple monitors. Video cards like the AMD/ATI Radeon HD 6850 are designed to use all the space combined.
Monitors in laptops are in need of a refresh across the board - I fully agree, and one good takeaway from the launch of the iPad 3 is that Apple has put resolution centre-stage as a key metric by which consumers should measure devices. What features the masses are concerned with, manufacturers will build, and hopefully this push for higher resolution will trickle down across the rest of the digital device ecosphere over the next few years.
Thanks for the link, and for using a quote from my blog in this well-written and accurate post. Only inaccuracy I could spot was my name. It's Bill Hill, not Bill Hills :-)
I think you're absolutely right, no matter what some of the commenters here believe. Higher resolution is a key computing advance.
Since personal computers first appeared, humans have had to adapt to their idiosyncrasies - and one of the least noticeable impacts of low-res was that our brains had to perform lots of extra work, doing the pixel interpolations needed to turn blocky assemblies of coarse pixels into text and pictures our brains understand.
With this breakthrough into higher resolution, and a much easier and more intuitive UI, Apple has adapted the computer to humans, instead of the other way around.
Of course, higher resolution has nothing to do with some arbitrary number like 1024 x 768, or 2560 x 1920. It's about the number of pixels you pack into an inch.
Human vision has a vernier acuity of 600 pixels per inch (ppi). That's edge detection. However, in practice, there's a strong law of diminishing returns, which means that the improvement a user sees starts to fall off dramatically by around 200ppi. Throwing more ppi at the screen brings scarcely noticeable improvement. And the math is a killer. To go from 100ppi to 200ppi means four times as many pixels to compute. You need a much faster, harder-working graphics card, and that uses a lot more power. Going to 300ppi means 9x the pixels; to 600ppi, 36x!
This killer math is why high-res displays made it onto mobile phones long before a 10" iPad.
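The "killer math" above follows from the fact that, at a fixed physical size, pixel count grows with the square of density. A quick check in Python (the display dimensions are an arbitrary example, not any real device):

```python
def pixel_count(ppi, width_in, height_in):
    """Total pixels on a display of the given physical size and density."""
    return round(ppi * width_in) * round(ppi * height_in)

SIZE = (8.0, 6.0)  # hypothetical display, in inches
base = pixel_count(100, *SIZE)
for ppi in (200, 300, 600):
    factor = pixel_count(ppi, *SIZE) / base
    print(f"{ppi} ppi needs {factor:.0f}x the pixels of 100 ppi")
```

Every one of those pixels has to be computed, stored, and pushed to the panel each frame, which is where the GPU and power cost comes from.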
I'm amazed that Apple managed to double the resolution of the iPad display and still retain the same battery life (which I think is also key - a student or worker can use it for the entire day without requiring a power cord and an outlet).
Yes, more pixels per inch would be nice. But Apple has broken through a threshold with the new iPad, and I'm prepared to stay at this level forever if need be. And I'm certainly never going back!
I've written about all of these issues in other posts on my blog, The Future of Reading:
As you kindly say, I've been pioneering readability onscreen for a long time; I created my first eBook in 1985, when I wrote the user manual (remember those?) for Guide, the first Macintosh hypertext authoring program.
I've been a writer for some 56 years. And I'm writing this now on my iPad, which has just become the best and most flexible writing system I've ever used in my whole life - with the addition of an Apple wireless keyboard and an Origami iPad stand/keyboard case costing a total of $110.
I write about this in my latest post:
Once again, thanks for focusing on this topic.
I am not 100% in agreement with the stated reason for the iPhone/iPad sales growth. It is not just because people felt PCs were dead. The rapid adoption of the iPhone and iPad compared to the MacBooks is also due to people's tight global social connections. Word travels more quickly than in earlier times; hence adoption rates are also faster.
Mac OS X Lion has one of the fastest adoption rates. That is because Apple used the power of the internet to get the product easily to the consumer. Previous OSes depended on physical CDs and DVDs to reach the consumer.
To echo and paraphrase the other comments: Saying "An iPad isn't a valid substitute for a PC because you can't write apps on it" is like saying "A Mini isn't a valid way to commute because you can't haul other cars with it." Not everyone needs a computer to write other computer programs. The days when computers were primarily used by computer programmers are long gone.
Yes, I don't know what game Windows or Microsoft is playing; that's why so many people go to online repair sites like Techie Now, PC Ninja and whatnot to fix their computer problems. I also own an iPad 3, and the icons are stunning and alive. I was also amazed by Apple, since they don't waste time and give consumers what they want - their ideas plus Apple's elegant innovation make it a true titan in the world of technology, thanks also to the man with the vision for it all, the late Steve Jobs (RIP). His legacy will continue, and his contribution to the world of technology has surpassed all expectations. A true prodigy.
So now we get busy figuring out how to make tablets into our developer platform for the early 21st century. That'd be so nice.
I see many people today looking at this from a very black-and-white perspective - thundering headlines about the PC/desktop being dead. The traditional PC won't be going away for a few years yet. Anyone who works any kind of data entry job - programming, finance, statistics of any type... the people that do the boots-on-the-ground entry and work for it - won't be using a pad or mobile device of any type to do their work very soon. If they do, it will probably have a monitor, keyboard, and mouse attached to it, which would make it... yes, you guessed it, a desktop-like computing machine :)
I am sitting in an office next to a rack room that has a couple hundred computers in it. Not tablets. Not mobile phones. Servers in rackmount cases, many of which are KVM'd to other parts of this building running specialized software. The software won't be created anytime soon for the mobile platforms, and those platforms don't have the reliability and power yet anyway. My accounting department down the hall won't be giving up their workstations anytime soon, at most sometime in the future they might be migrated to a cloud situation.
There is a lot more going on under the hood of our pretty little internet, WWW, and 'connected world' than a lot of people realize. The desktop computer will mutate and change eventually, once something appropriate to replace it has come along. People say keyboards are dead. Again, I say ask anyone doing data entry. A good typist will blow the doors off anyone using voice recognition, and even with really good voice recognition software, privacy and confidentiality are an issue.
To the average user out there at the shallow end of the pool who uses an application or two, gets their email, and starts their vehicle from their mobile device, things may seem like a grand revolution. For those of us who have an idea of what is behind the scenes, we know it takes a little longer for things to really change. The real power is in the back room. It just isn't visible to everyone.
I cannot believe that it was already so long ago - 1975 - when Gates started his empire... and its power is so large now, like Google, another very big company having a great effect on global economics too.
I now own an iPhone 5, several retina iPads, and a Nexus 7. I'm sure there are many more of these devices on the way. In the calculus of deciding what kind of computing device I want with me, even the most awesome ultraportable laptop I can find is no longer enough.