July 3, 2007
Riding the waves of technology in the computer industry is exhilarating when you're twenty, but there's a certain emptiness that begins to creep in around the edges by the time you're forty. When you've spent the last twenty years doing nothing but frantically hanging ten on the latest, biggest, coolest waves of technology, fatigue inevitably begins to set in. There's an increasing sense of déjà vu -- of doing the same thing over and over, with only small improvements to show for it each time. On a bad day, you can feel like you're living the movie Groundhog Day, and you've just woken up to the melodic strains of Sonny and Cher singing "I Got You Babe". Again.
Don't get me wrong. As a child, I believed that computers would change the world for the better. I still believe computers are changing the world for the better. But that doesn't mean we should accept them unquestioningly into our lives, either. Software developers are almost by definition technologists. So we love this stuff. To us, technology is its own reward. But sometimes it's healthy, even for us technologists, to push back and ask hard questions about whether a particular technology is making our lives better -- or worse. For every single TiVo or iPod, there are hundreds of also-ran Microsoft Bobs and iSmells. People found entire careers on the shifting sands of technology, often on technologies that eventually become utterly obsolete. It's easy to make the wrong choice, and devilishly hard to predict what will still matter ten years from now.
Instead of investing so much time in technology, as Rick Strahl points out, why not invest in the one technology guaranteed to pay dividends -- ourselves?
Is your quality of life really better because of the gadgetry? Cell phones are bringing connectivity to us anywhere and everywhere. Good because you can be in contact whenever necessary. But bad because you can in fact be in contact anywhere and everywhere. It takes all sorts of self-restraint to implement the 'just say NO' policy on cell phones and turn them off. Always on, always connected, always surrounded by the constant media buzzsaw. When's the last time you connected with -- oh I don't know -- yourself? Or nature?
A similar sentiment is echoed by a technology worker in this recent San Francisco Chronicle article; don't neglect the human beings behind all these computers.
Ray Carlson, 46, of Hayward worked as a help desk analyst for software companies and found that "technology for its own sake" was the unquestioned workplace edict. "There was always a constant learning curve that could only be fully ascended by those whose first love was technology," he said. "One had to be willing to allow one's career to run roughshod over any sane boundaries between work and home. People bragged about how many hours they worked and that they had no life outside of work."
In retrospect, he said, "The dot-com bust was the best thing that ever happened to me up to that point. It caused me to recognize that people, not machines, are my passion.
Perhaps the original purveyor of this "tune in, turn off, and drop out" policy is Cliff Stoll. His book Silicon Snake Oil dates back to 1996, the veritable dark ages of the internet. He followed it up in 1999 with High Tech Heretic: Why Computers Don't Belong in the Classroom and Other Reflections by a Computer Contrarian. Lest you think Mr. Stoll is some kind of computer-hating Luddite, consider that he's an astronomer and a hard-core UNIX hacker from way back. I remember reading excerpts from his book The Cuckoo's Egg in Byte. It's a gripping narrative of Mr. Stoll tracking a wily KGB hacker who infiltrated the Lawrence Berkeley Lab systems -- along with hundreds of other military and education sites -- in the mid 1980s.
Mr. Stoll's reservations are based on extensive use of the internet, all the way back to its formative years; he's been online since 1976. Familiarity, in this case, breeds contempt. You can get a sense of Stoll's position in this 1996 interview:
One of the lies of the Internet is that it is an information superhighway and that we need lots more information. But I have never met anyone standing on a street corner, sign in hand, saying we need more information. Just the opposite, many of us, especially those of us working in technical fields, say, "I've got all the information I need. Give me less, but give me higher quality information." And that's what's missing from the Internet, quality. When it doesn't cost anything to post the stuff, people naturally post anything they wish. As a result, when I need quality information, I turn to that which is published on paper for the obvious reason that it costs money to publish on paper. Because of that, there is a built-in filter. They are called editors. Because it costs money, they will only allow that which has quality content. So when I want quality, I look on a piece of paper. I look at that which has been edited. And that's what is grossly and desperately missing from the World Wide Web: editors, critics, reviewers, reporters.
The answer to me is self-evident. It's economic. You get what you pay for. When it's cheap or free to publish something on the World Wide Web, you will naturally publish that which costs the least and has the least economic value. If you have a catalog or parts list, you put it online. When you have something you want people to study and think hard about, you'll put it on paper. Quality writing takes time. Somebody who puts time and money and effort into it... are they going to give it away for free? Maybe, but I doubt it.
Some of his criticisms are prescient. Still, I don't think the internet is anywhere near as destructive and devoid of value today as Mr. Stoll imagined it would be ten years ago. If anything, quite the opposite. But it's not the specific criticisms that matter; it's his spirit of healthy skepticism that I admire most:
This much is certain: Unless we debate these questions in public, we move blindly. We listen to some cyberguru who says this is the way the future is, close your eyes and trust me. I don't believe in gurus. I believe in skepticism, in discussion, in public debate. It's our responsibility as citizens, as technologists, to debate where this stuff is likely to go and to ask difficult questions.
Too often, we get so heads-down in the digital rat race that we forget to get off the treadmill for a moment and ask those hard questions. We lose perspective and don't bother questioning our assumptions. We forget to take time out to be analog for a little while. As a person who all-too-willingly spends nearly all his waking life in front of a computer*, it might be a little disingenuous for me to talk about using technology in moderation. But I think that's exactly what the doctor ordered: moderation in all things, including moderation. The analog world is an essential part of a balanced digital information diet.
Now if you'll excuse me, I have to go check my email.
* I spend the remaining time dreaming about computers.
Posted by Jeff Atwood
Nice post Jeff, I agree that it is good to switch off from technology and to get back to knowing yourself and nature once again. My way of doing this is to go hiking once a year to a remote location for at least a week (this year it is Iceland), where I will have no mobile phone, no computers, no TV, no electricity...
When you step off the technology treadmill, even for a week or two, you step back on again feeling a lot more invigorated about the whole thing.
"Riding the waves of technology in the computer industry is exhilarating when you're twenty, but there's a certain emptiness that begins to creep in around the edges by the time you're forty."
Very good post. I'm 35 and I've been programming since I've been around 10 and the emptiness started to creep in around 32 on bad days.
And when arguing with younger developers or seeing the next hype, I most often hear "I've Got You, Babe".
Stephan Schmidt :: email@example.com
Reposita Open Source - Monitor your software development
Blog at http://stephan.reposita.org - No signal. No noise.
Personally, I only read information that's been hand-chiselled onto stone tablets. Because it takes so much time, there's a built-in filter and I know it's really important information. And that totally outweighs the increased difficulty of accessing the information.
Corker of a post, Jeff.
Having just returned from a three-day trip to Cornwall, during which I experienced my first full day away from the Web in six months -- I must say, I didn't find the disconnection easy.
One day into the trip I found a great pub that provided free wifi for paying customers - it's a good thing their lattes were nice. But I'm a fresh graduate who's incredibly hungry to get into business, so it's understandable that I can't bear being without the Web at this point in my life, right? I want to generate an income asap, and I want to do it on the Web. Why should people expect me to disconnect?
My brain wants information that is relevant to me, and up-to-date too. Your average UK newspaper doesn't provide this, so I turn to RSS feeds collated by Newsgator. To get RSS feeds, I need the Web. That's just the way it is. I don't care how information is delivered, I just want to find it interesting.
The funny thing is, Coding Horror is kinda guilty of this phenomenon itself.
You post several times a week, often interesting and insightful stuff, always with lots of links which take me around an hour to follow through completely. Could I live without them? Sure. Yet Coding Horror is one of the blogs that survived when I purged my Bloglines account from 95 to 35 feeds.
I thought this passage from the Stoll interview was very interesting:
You're not saying there are dangers to the Internet equal to the dangers of nuclear energy?
Oh no ... I'd say it's much closer to the promises and reality of a highway system in the 1970s. The argument then was that high-speed roads would be good for the country, good for the cities, good for farmers, good for defense. They will bring us closer to one another. All of these promises are similar to promises of the Internet. But no one asked the obvious question: Might this highway system be bad for the country? Might it create a civilization where people waste hours every day commuting because they have moved to the suburbs? Might the highway system make the U.S. dependent on foreign oil?
Similar grand promises were also made for television in the 1940s. They said it will inform and entertain us; it will make us a closer nation. It will be good for the family by providing a place for all of us to gather in the evening. These promises are also surprisingly similar to the promises made for the Internet. The reality is that television has helped devastate society. But no one asked the obvious question: Do we want or need television?
Do we have the choice of accepting or rejecting a new technology? Doesn't it just get thrown into our laps, and we don't find out its ills until much later, when it may be too late?
As the stone-chiseled joke above alludes to, it's a bit absurd to claim that printing on paper implies higher quality publications. We need only look at... almost everything printed on paper.
Great article... and I hope it won't be that bad in the end...
I think the worst example of the addiction to technology is the Blackberry. I live in Europe where they're not so prevalent yet, and every time I go to NYC for work I'm amazed. There are people checking their mail in meetings, in restaurants, in the movies, in nightclubs. It's ridiculous. When was the last time you received a piece of mail that absolutely 100% could not wait until tomorrow? I'll tell you... never, that's when.
I swore I'd never get a phone that checked my email, but when they release the iPhone with 3G in Europe, it's going to be difficult :-)
I'm 50 in a couple of months, programming since my late teens, and I'm still finding continual delight in learning -- and playing with -- new stuff. Being held back by the dead weight of the old, and not being able to make real use of new stuff -- that's the drag.
Being able to abstract from the particular to the general is one of the key skills of the software trade -- and as much for getting the kernel of value from any particular new and transient tool-set, as for writing the code. That way you don't end up nailing your colours to the mast and going down with whatever particular tool you are using at the moment.
The internet may not have been on the roadmap of the future when I was a kid (where are my skiing holidays on Mars?) but it's something that I find serendipitous and wonderful. If nothing else, it is a marvellous mechanism for finding affinity groups without regard to distance. I never got on with the telephone (something for serious use only in childhood); and gave the TV the boot 20 years ago -- by contrast, on the 'net, I feel at home.
When I'm out of connectivity, it's like half my memory and 95% of my social contact has been rudely cut away. A week spent walking or cycling may be good for the body (depending on how many beer stops and hearty meals are involved); but I find myself intellectually starved at the end of such a holiday.
Top tip, spend time in front of your wife for some more analog experiences ;)
Paul Graham wrote a great essay about the difference between 'wise' and 'smart'. http://www.paulgraham.com/wisdom.html
After I read it I realized that the huge effort I made acquiring transient knowledge about computers was making me smart, when I really wanted to be wise. I haven't figured out how to be wise yet so I have settled for trying to be smart about things that don't change every other year.
By the way Jeff, this is easily the most interesting and useful blog I have found on computers. I really look forward to checking in every morning. Thanks for the effort.
Stoll had some interesting conclusions in Silicon Snake Oil, but there are also some statements that are funny today, like: pictures will never catch on on the internet, because of modem speeds...
Professional Writers and Editors don't help
Newspapers: they have professional writers and professional editors, and are often less reliable than a blog by an amateur.
The thing that really makes information reliable is accountability and authority. An article in Nature is more likely to be correct than one in a tabloid newspaper, because Nature has a reputation for being reliable and wants to keep it, whereas the tabloid does not care...
This also applies to the Internet - some sites have a reputation for being correct and try to keep that reputation to survive... Some don't care.
Thanks for writing such a good blog. I've been reading for a few months now and really enjoy it. Definitely part of the "quality" side of the internet.
I went to a computer conference a while ago and dutifully took notes all through the sessions and keynotes. A more experienced teammate of mine walked around without a notepad or pen. He says, "If they talk about something new and interesting that is worth remembering, I'll remember it."
There is the voice of wisdom.
The debate of digital veracity highlights what seems, to me, to be the problem with ideas like Wikipedia. Sure, it's great that the entire world can contribute to Wikipedia and make the articles as accurate as possible. But it's also possible for loons to post a bunch of inaccurate crap that smells like truth. We all know that it's already happened. Sure, I use Wikipedia to satisfy curiosity about a particular topic. But would I use it to research something that is important? Nah. Then it's time for the books, or the digital versions thereof.
I liken it a bit to MTV in the mid-80s vs. today, before it became completely unwatchable. The music videos of today have camera shots that last maybe half a second. Twenty years ago the camera lingered -- you could really watch the guitar player or vocalist.
Even looking at some old movies it is amazing to see how long a camera shot would last and how the actors had to learn their lines and carry on for sometimes a minute or more without a cut.
For me it has become a challenge every day to lengthen my attention span in a world that seems to work at shortening it.
"Because it costs money, they will only allow that which has quality content. "
Two words: National Enquirer.
In fact the sole metric for printed publication has nothing whatsoever to do with "quality content". It's "The cost of printing this will be $x. Will I be able to sell it for $y, where y > x?"
Incidentally, in "Silicon Snake Oil" the author _explicitly_ stated that information was more useful and valuable in printed form than on a computer screen. Information wasn't "real" on a computer. I watched him jabber this nonsense on C-SPAN2 and get his butt handed to him.
How about the massive number of software programmers and designers who are totally against voting machines? They know what happens when a computer goes wrong, and I suspect they know (but aren't saying) that they have no trust that the industry could produce the level of competence required at the prices we are willing to pay.
Steve, tell me about it. I miss the days before the frenetic camera shot.
One of my favorite quotes about computers is from Dijkstra: “Computer science is no more about computers than astronomy is about telescopes.” For a long time I didn't really understand that quote, but now I realize that as a computer scientist, it isn't the computer that I study. The computer is merely a tool - an indispensable one, but still just a tool. My real passion is information and knowledge. Taxonomy, the semantic web, artificial intelligence, the Internet, Wikipedia, information asymmetries... This is incredibly exciting stuff to ponder!
People might not realize it, but we are in the middle of the Information Revolution - which is changing the world as much or more than the Industrial Revolution before it.
If you ever get the chance to see Cliff Stoll speak in person - do it. I got to attend one of his presentations back in the Cuckoo's Egg days, and he's a great speaker. Very entertaining as well as very incisive.
"The thought of expelling sickness is itself sickness. Sickness will be expelled by abandoning yourself to it, and carrying on within its midst." -The Life-Giving Sword
I would claim to have been a programmer for some time now, but I don't think of technology for its own sake, perhaps because I know that I have the ability to, as you suggest, 'unplug' at any given moment and walk away from it all. I think that it's quite possible to work with passion and still know that nothing about the technology that we work on will change the world. It will always come down to those human factors that you're so prone to write about. And so it seems a little easier to let myself go and drop into the maelstrom of whatever the acronym of the month is, try to fully catch the zen of it, and know that the thing that emerges on the other side will not be technology, but myself. As someone said recently, "...how only lasts about 5 years, but why is forever."
Now who could that have been... :)
Great Post Jeff
I definitely echo the 40-something sentiment. I'm much more cautious than I was 10 years ago in terms of adopting new technology. It's fun to read about and see what's happening, but I no longer feel the need to "get it now!"
In a field where the "hot new item" changes each year, it's easy to watch people get quickly led down the path of chasing the latest fad... anyone heard of the iPhone? That's so yesterday's news now.
I've also read Stoll's books - they were excellent (and included a great cookie recipe, as I seem to remember). I think the point about the quality of information is key. We seem to have many sources of information today... 175,000+ new blogs still being created each day. Yet the trick is finding good, authoritative sources.
Yeah I just recently read an article called "Don't let technology take over" about this and commented on my blog about it (http://harmons.blogspot.com/2007/06/is-technology-taking-over-our-lives.html)
I'm probably about as slow as it gets in adopting new technology for my personal use - but I like to at least keep knowledgeable about the up-and-coming, even though I usually don't use it for a while.
It's interesting that in software development, there's actually a pretty strong market (it's not the entire market obviously but they have their share) in NOT going with the latest technology - like those that are AS400 programmers or COBOL programmers, etc. I'm not in this field - but I guess there's comfort in knowing that if you want it, you *can* have the same for quite some time.
Regarding the minimal barriers to publishing on the internet vs. printing on paper, and any implications for quality of content, I think it's important to remember that Stoll's comments are from the era of hampsterdance. Much of the content on the internet was questionable. There's surely more content of questionable value on the internet today than 11 years ago, but it's easier to find the higher quality stuff. Search technology, for one, worked differently. AltaVista was good at raking bits out of the muck, but not at separating/ranking what was useful vs. what wasn't (maybe Google is better...)
All in all, at the time of his comments (mid-'90s) I'd have to agree that the internet was a valuable but not always trustworthy resource. There was also a lot less content that could truly replace good printed technical sources. Today - of course - I don't think twice about googling for answers, solutions, etc.
Another great article.
I check Coding Horror every day, but despite using different browsers, clearing my cache and general techie wizardry I seem to get days in clumps.
I got July 3rd and July 2nd today (6th), but on checking the site on the 2nd and 3rd I didn't have any new articles.
Am I missing something here?
Is there some sort of rip in the US/UK boundary? :)
I've been programming for almost a decade and I still haven't mastered compilers or programming languages, a fascinating frontier for me.
I could see how you'd get burned out on a C#/IIS/SQL Server/ASP.Net-only diet, though.
Learn Haskell or Lisp or Erlang or something, mang.
Amen! You just read my mind.
"Much of the content on the internet was questionable. There's surely more content of questionable value on the internet today than 11 years ago, but it's easier to find the higher quality stuff. Search technology, for one, worked differently."
I would say that the #1 thing Cliff Stoll didn't see coming was Google and PageRank. To be fair, I don't think anyone did.
Since 99.99% of everything on the internet is crap (and that might be an optimistic figure), and the volume of content on the internet keeps growing exponentially, having a tool that can point you to that .01% is absolutely critical.
A good example of technology getting in the way...
A few months ago, when I was at a live rock concert, I noticed most people were recording the show with their digital cameras, all the time. There were even people with a digital camera in one hand and a cell phone in the other (sending pictures to friends).
Yeah! It is great to be in a live show and watch it through a mini LCD screen!
I think that for a lot of people it is more important to tell and prove you were there than just enjoy the damn (and expensive) show.
Great post. I turn 40 next year and I've noticed that I've been looking backward in history for inspiration instead of forward like I did in the 80's. Technology overload, I suppose. The simplicity of 80's tech had an appeal that's lost today...just plug in the Atari 800XL to a TV and you're good to go. BASIC was a snap to learn. $6.95/hr CompuServe at 300-baud, though...bleccch!
We've sure seen quite a tech evolution in our lifetimes, haven't we?
re. iSmell, <a href="http://www.nataliedee.com/070107/its-all-cool-until-you-get-to-the-sauerkraut-track.jpg">It's all good until you get to the sauerkraut track.</a>
One of the best things my parents did in raising me was to refuse to answer the phone during dinner time. This was before answering machines, and before telemarketers were prevalent.
I can kind of relate to what you're talking about here. I do like computers for their own sake, but at the same time I really like it when I see them help people. That's what really matters to me.
I'm in my late 30s, and last year I had an epiphany of sorts. I realized that while I had managed to derive enjoyment out of what I had been doing, I didn't enjoy it fully. There was always something missing. I remembered really liking programming when I was in public school, and in college. I had a few years of enjoyment out in the work world, where I got to "geek out" and try some things, but then it began to fade from there. Over time I realized that there are some deep flaws in our industry. There's a disconnect between what computers can do and how people looking in from the outside understand them. This often leads to a bit of a mess (I'm being polite), and there follows a struggle with inefficient technology that just leads to frustration and angst. It also leads to computers not being used to their full potential. I don't mean CPU cycles or memory being wasted. I'm talking about the fact that expectations of the benefit that can be derived from them just isn't that high.
This finally came to a head for me last year. On the one hand I love the vision of what computers can be in our society, and on the other it's kind of depressing that most people don't see it, and don't expect to. I've found that I really like hanging out with the people who hold a higher vision of what computing is and can be. It's really changed my perspective. That's the career part.
I agree that every once in a while it's nice to get away from it all as well, and get out in nature--to get out of my head, and more in touch with my heart.
As for Cliff Stoll, I wish I heard more from him. I used to see him on C-SPAN and on MSNBC occasionally, back 10 years ago when it was actually an interesting channel to watch. I remember his rants against computers in schools. He said they were coming at the expense of the creative arts, an important part of education. He also complained about how they were being used with kids, that their methods for using them were stifling creativity, and creating a rigidity in the students. I can sympathize with that view, though I don't agree that computers don't belong in schools. I think if there's a problem, it's with the teachers and administrators who don't understand how to truly use them well (and I'm not talking about basic literacy). The computer is just a tool. It's all in how it's used. It really comes down to people's understanding of what they're good for.
Thought-provoking post and comments. I only had computer experience on the job briefly in the mid 1980s and since 1995, until I got my first personal computer about 7 months ago. Wow.
Stoll's commentary about TVs and freeways and computers is salient today. It leads directly to questions about science itself and how or whether people view it as a tool or as an end in itself. Those are indeed unpopular questions.
At this point there are still many workers in many fields who have earned or retained some degree of prominence, even in "empirical" disciplines, without any critical support from computers. This phase of scientific knowledge may be passing us as we speak. The study of anything requires exposure to the object studied, and it should be a greater, better understanding of these objects or phenomena that is the goal of science and technology rather than an (over)emphasis on data manipulation and methodology itself. If it is true that a picture is worth a thousand words, then a walk in the forest (with keenly trained senses) is worth a thousand datasets from the many applicable disciplines.
It is tragic that there are students in biology who only know their organisms by their gene sequence profiles. Why are so many students more attracted by theoretical approaches that require elaborate, computer-aided algorithms than they are by studying the frogs or birds themselves in the field? The amount of field work to be done just to have a respectable grasp of what is happening in vulcanology, oceanography, ichthyology, etc., would keep thousands of scientists busy for millennia. In these cases computers are certainly an important resource, but are secondary to direct observations in nature. Today we seem to be training young minds to look to the computer first as a *representation* of knowledge instead of looking around in the world outside.
Very interesting and thoughtful post. I am always amazed to see how many people are thinking in a similar way to mine. Many times it is surprising.
Somewhat in the spirit of your post, i am now trying to promote an idea of a Commfree Day - one day every month to be spent away from media and information technology. I have a post with more explanation about it and will be glad to hear what you and others think about it: http://thinkmacro.wordpress.com/2007/07/08/commfree-day/
I agree the computer should not be the first thing students observe about nature, but rather secondary. If they could take what they've observed in nature and try to reproduce it in the computer, that would be a representation of knowledge, but it would be of the *student's* knowledge, not of whoever wrote the software. Through this process of constructing a representation of the real world, the students might actually learn something about how nature really works.
What I'm seeing is that each different technology that gets adopted is a small improvement over what came before in terms of robustness. There are some languages that have been around for decades, so called "research languages", that are sitting on the sidelines, growing old waiting for the mainstream languages to catch up to what they've had all along.
I think C#/Java are an improvement over C/C++ in terms of memory management and ease of programming, but idiomatically, as far as the programmer is concerned, they're just a little better. For example, when I was building web applications in C#, I was thankful to be using that instead of C or C++. I can't imagine trying to manage the complexity of the kind of apps I was creating in those languages. Garbage collection definitely saved the day. Still, C# was somewhat inadequate for the complexity I was dealing with.
I think that the reason we keep chasing the technology "du jour" is that fundamentally none of the core problems have been solved. We keep jumping at the next big thing, thinking that it will finally make systems/applications easy to develop. By the time you're 40 you've been through this cycle enough times to know that C#/.NET [insert the latest hot thing here] is really not that much easier than C and a set of good libraries was, or COBOL, or Pascal, or Visual-Whatever, etc.
You didn't count the time you spent doing the following:
1) Researching motherboards, power supplies, video cards, compatibility, and the like = 40+ hours
2) Hunting for the best price online for EACH component = 8 hours
3) Downloading all of the latest drivers = 4 hours
4) The cost of the OS = $150
So how much did you save again?!?!?
Technology for the sake of technology is bad and all too common. :( There are some places I don't want to see technology, things that technology/computers can make worse. Voting is at the top of the list.
I started programming only a decade ago and am in my early twenties, and this article still rings a bell. As of now I get totally awestruck by something new coming along every few months, and I keep chasing it till I find something more glamorous to chase.
But this article makes me think: is all of this going to last and pay off later?
Dunno... Still chasing the answer!!
"why not invest in the one technology guaranteed to pay dividends"
One word: COBOL.
50 years and still going. 50 years and still recognizable to its original authors and programmers.