June 26, 2007
One of my most eye-opening early experiences was a tour of a local manufacturing plant during high school. One of our tour guides was an MIT-trained engineer who accompanied us, explaining how everything worked. At the end of the tour, he gave each of us a picture of a spider he had taken under one of the electron microscopes at the facility. He labeled it "Boris the Spider" after the Who song. I kept that photo in my school locker for months.
As a college-bound high school junior, I was impressed. I thought my Apple II was the neatest tool ever, but this guy had a freaking electron microscope. He was articulate, intelligent, and on top of that, one of the coolest people I had ever met. And he graduated from MIT, one of the best engineering schools in the country. During lunch, I asked him how much of his schoolwork applied to his current engineering job. His response?
I can't think of a single thing from my MIT classes I've used on the job.
This blew my mind. What's the value of a marquee college degree if none of the skills you learn are useful on the job?
At first, I was incredulous. But after considering my own high school educational experience, it started to make more sense. And certainly after attending college for a year, I knew exactly what he meant. The value of education isn't in the specific material you learn-- it's in learning how to learn. In Knowledge Access as a Public Good, danah boyd presents Wikipedia as a perfect example of the latter:
Why are we telling our students not to use Wikipedia rather than educating them about how Wikipedia works? Sitting in front of us is an ideal opportunity to talk about how knowledge is produced, how information is disseminated, how ideas are shared. Imagine if we taught the "history" feature so that students would have the ability to track how a Wikipedia entry is produced and assess for themselves what the authority of the author is. You can't do this with an encyclopedia. Imagine if we taught students how to fact check claims in Wikipedia and, better yet, to add valuable sources to a Wikipedia entry so that their work becomes part of the public good.
Passively reading the material in an encyclopedia or textbook is learning, in a sense. But learning how to research and question the material you read-- and, as in Wikipedia, how to update it so you're adding to the communal wealth of knowledge-- is a far more valuable skill. This kind of participatory, hands-on experience outstrips any kind of traditional classroom textbook. Why read textbooks when you can help write one? There's no substitute for learning on the battlefield.
Nowhere is the importance of learning how to learn more critical than in the field of software development. Programming is, almost by definition, continuously learning: your entire career will be one long, unbroken string of learning one new bit of technology after another. Every ten years the software development field reinvents itself, and it's our job to keep up.
If you don't like learning new things, you will despise software engineering. It's all we do. That's why learning how to learn is such an important skill for software engineers. In our field, how only lasts about five years, but why is forever.
Posted by Jeff Atwood
I agree with Steve. Software is no different intellectually from other professions, but some people in the industry think they are doing more significant stuff than engineers, scientists, surgeons, etc. Helping a surgeon with some software tool is not very interesting; the surgeon's work itself is still far more important than the tool.
I think software, at present, is a fashion industry, trumped up by the most unashamed marketing effort and BS around. When it gets back to the sober intellectual environment of say most engineering or medical science, it will be worth writing about with the same pen.
"Nowhere is the importance of learning how to learn more critical than in the field of software development."
I dunno... I hope my medical doctor is learning about all the latest medical research.
Well, I'd have to agree. Having an Associate's in Psychology, a Bachelor's in Philosophy, a Bachelor's in Psychology, and a Bachelor's in Classical Studies, I didn't learn much about scripting and automation, but I'd say there were more than a few times that the classics degree crept in while writing about automation in my blog (http://www.TheMacroHook.blogspot.com). I've always been one to look down the overgrown paths to see what's there, and now that I'm programming, I haven't lost the desire.
My mum works as an LSA and I talk to her quite a lot about school and where the curriculum is heading. It's interesting to note that here in the UK it's becoming a lot less about what you learn and more about how you learn. Soon students won't be learning all about Tudors and Victorians, but will be told to pick a subject and research it.
It should be noted that, at least at any decent CS department, computer science should not be considered computer programming. They aren't the same, and all the useful programming learning usually comes a couple of years before graduation.
I taught intro courses and I always used to tell my students that their next 3 or 4 years would be spent learning mostly, as this post said, how to learn. Programming, more than most fields, is tool-dependent and, more than most fields, the tools change rapidly.
(I also told them that a lot of what would be useful to them they wouldn't be able to identify. Things that are useful to a programmer in courses like algorithms and automata aren't readily apparent but they instill a sort of subconscious problem solving pathway.)
This coming fall I will be a sophomore Computer Science major at Virginia Tech. Currently, I'm working as an intern for Lockheed Martin in Northern Virginia, and I've found this blog to be very entertaining and highly relevant to my [future] work and interests. I've done most of my learning in Java and a little C++, so I've taken to heart your comments (and those of everyone else I talk to) about learning many languages and keeping up with them.
More specifically, in regards to your thoughts on using nothing from college in your job, I've found that a lot of coworkers agree, but I would disagree. As of now, I've done a little bit of coding, a bunch of diagrams (UML, etc.), and a lot of testing code. However, 95% of this I learned at least a portion of in my classes from the past two semesters. If you had asked me to do any of this a year ago, I would've been completely lost. However, a lot of what I am doing is applying the knowledge I've learned in different ways than exactly how I was taught, and learning new tools with which to do it (we only covered UML class diagrams over a class or two, but half of my work here has been with UML). In that sense, I suppose that they are teaching me how to learn through what I'm learning. So... yes, they're teaching me how to learn by giving me the necessary tools (both knowledge and application skills) to succeed.
I understand that my perspective will be much different from yours and many of my coworkers, but as a current college student, I'm using a lot of what I've learned for *both* the how and the why.
Anyway, sorry this is a little long, but I fit perfectly into what this post is talking about, and I hope these comments bring up some good discussion.
This is starting to sound a lot like a flame-war and after re-reading this post several times, I am really struggling to see where everyone is getting this from.
Taking a single sentence out of context is no way to formulate a coherent argument. Did I miss the part where Jeff said "no other profession is as important as software development" or "software development is the only industry that requires continued education"?
1. Has software altered just about every industry and profession and how it is performed (from warehousing, medical, travel, etc)?
2. As new software tools are developed, do those industries not continue to expand and increase in ability and offerings?
Software development and *just about* everything else are inextricably connected now. If programmers stop learning to remain on the cutting edge, how can said tools advance? It brings the whole machine to a stand-still, or at least a slower pace.
Phil, you wrote "I hope my medical doctor is learning about all the latest medical research". Me too. That is for sure. I was horrified to read last year about surgeons "googling" topics and items to make educated medical opinions (sorry, I don't have a link for that story). However, I also later read that several new tools are being offered (possibly in conjunction with deep web search technology of non-public secure information) to assist doctors in finding medical information not readily indexed by web bots.
Do you think a doctor realized what he was missing by Googling? Maybe. But while brilliant, doctors are still end-users. He knew there could be something better but not how to get to that information. A problem was identified and a tool was delivered.
This cycle of identification and development is what makes good software engineers valuable, and why learning can't stop in software development. It's not about software vs. medical professions. It's that if software developers stop learning, tools stop advancing for engineering and science too, so it's imperative and downright essential for software devs to keep up.
Interesting article once again. I'm not terribly surprised you're now including advertisements in the articles, but would you mind explaining why you've chosen to, and why now?
And perhaps this question will be answered in the above, but will this be a future occurrence?
It never ceases to amaze me how you manage to find time to post a well thought out, well written, and well researched and cited article almost daily - and you still program.
You should get paid for this :P
Keep up the good work Jeff! Always a pleasure to read!
(Cool spidey pic too!)
Ian mentioned Doctors Googling for information and making educated guesses.
The above article touches on various subjects, but specifically covers what I think Ian was referring to (and, more importantly, why doctors can now use Google as a reliable information-gathering tool).
I too would find myself agape to know my physician was using Google-fu in preparation for a surgery, diagnosis, or the like, but I also have a world-renowned mechanic working down the street (literally) who uses Google and forums to flesh out ideas and tinkerings before working on a car. The guy's an absolute genius, and if he's using Google for help, honestly, who shouldn't be?
At least these doctors are smart enough to use all the resources available to them :)
Should someone coin the term "learnology" = learning to learn? Maybe there's a better one?
I think this whole "your profession wouldn't exist without mine"-game is dumb.
Oh by the way, I'm an electrician. Nobody would get *any* work done if it wasn't for me!
Seriously though, change happens in most fields. In some fields, more rapidly than others. My colleague has 40 years behind him as an electrician, and he likes to tell stories about how they worked back then. Much of it can be likened to writing hex and working with 100k memory, compared to how we work now.
Couldn't agree more. I may be slightly off topic, but, as the father of a 2 year old and another on the way, I want the education machine AND PARENTS to do more to get kids to see learning as an enjoyable experience. Knowing how to learn without the desire is also a travesty.
Does anyone else see this as incompatible with the whole standardized testing/rote learning of recent years?
I learned the web at MIT in one of Philip Greenspun's bootcamps. It was an incredible learning process, and what I remember most is when Philip told us that the only difference between most of the bootcamp students and the regular MIT students was that they (MIT students) had learned to break a problem down into small, solvable pieces. Whenever I get stuck, whether it is a programming problem or something that I'm trying to learn, I just break it down more and more until I get it.
I wish I knew about flash cards in high school and college. I went back to get my CIS degree and got straight A's and was working driving as a legal courier at the same time. I could hold up a flash card while barreling down the freeway.
But I digress. I think underlining stuff in a book overwhelms me when I try to go back over it, and I don't focus on each item correctly. But a flash card accepts no excuses-- you either answer it right or you don't. AND, once you are sure you are going to answer it correctly, you can remove that card from the stack instead of having it waste your time like an underlined item in a book.
Very true. This doesn't hold, of course, for lower schooling, but it is true for everything else.
one of the best engineering schools in the country. (MIT)
It is the best, and not just in the country, but in the world.
"Boris the Spider" - hahaha. That's real '70s humor there.
You know, I completely agree with you regarding "Learning, or Learning How To Learn" but I'd say College can be completely useless, just for the points that you mentioned.
I am a young guy, and I work for a development firm. By working, I mean getting paid, and not as an intern.
Guess what? Interns have trouble getting my job, and they come with a degree after 4 years of Computer Science.
I dropped out of high school, and I never even went to middle school. Throughout those years I simply read books, and taught myself. For the most part, you could call it homestudy.
I'd say, you're right, it is very important to learn how to learn; it shouldn't be in college though. I learned at a young age, and I believe that should be the standard. When I did attend High School, it was completely useless, and I am glad I dropped out. I now make more than anyone I know my age, and a lot of people that are older than me.
I really think college is useless as well, it's only going to put me in debt, and anything I'd study would be practically useless.
Anyway, I'm only giving my two bits of experience and how well I'm doing because of it.
I think this is a brilliant post.
In high school I took many stabs at learning different things, including programming, only to fail miserably. Once I got done with college it was very easy to pick stuff up and learn lots of new things. I also have not used anything from school in my job, but going through 4 years of Chemical Engineering certainly made it easy to teach myself things, or at the very least to know what tools I need to learn new things and how to seek them out and apply them.
Interestingly, there is a great deal of similarity between Chemical Engineering and programming, which are basically just two approaches to problem solving, a very important skill to have. If you ever need something to stimulate your mind, get a textbook on any subject, turn to the problem sets first, and learn how to solve them from there. You will learn very quickly and can learn to bypass the stuff that you won't need, kind of like when your assignment is to solve 60 math problems and you have figured out the method after problem 20.
Earlier this year I tried to articulate why I thought learning to learn (though that term is much more apt than any I managed to come up with) is more important than simply learning some skill. It's certainly been invaluable to me.
Yikes, how did I miss so many mistakes when I wrote that post ...
Unlike your tour guide, I can think of quite a number of things I learned in college that have been and are applicable to my job. It was college where pointers "clicked", I figured out recursion, I learned all about big-O notation and how to organize my code. I learned about the principle of DRY though we didn't call it that and I was encouraged to break things down into manageable units - back then it was Pascal and C and the idea was functions, then it was C++ and classes. I learned about search and sort algorithms, reusability, the fundamentals of computer graphics and event driven programming. These days I don't have to write my own linked list implementation, but I could if I needed to. But there was and still is value in understanding all of that.
I agree that college can teach you how to learn. I also know that I applied quite a bit of my college CS knowledge to my work and still do today, though the languages have mostly changed. And that is just the CS stuff - I have used some of the math I learned in college too. College was anything but a waste of time for me despite my best efforts at the time to slack off.
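(For readers who never got the assignment the commenter above mentions: here is a minimal, hypothetical C# sketch of the kind of hand-rolled singly linked list CS courses traditionally have you build-- the class and method names are illustrative, not from anyone's actual coursework-- with the big-O costs such courses drill in.)

    // A hand-rolled singly linked list, the classic data structures exercise.
    class Node<T>
    {
        public T Value;
        public Node<T> Next;
        public Node(T value) { Value = value; }
    }

    class SinglyLinkedList<T>
    {
        private Node<T> head;

        // O(1): the new node simply becomes the head.
        public void Prepend(T value)
        {
            Node<T> node = new Node<T>(value);
            node.Next = head;
            head = node;
        }

        // O(n): walk the chain until a match or the end.
        public bool Contains(T value)
        {
            for (Node<T> cur = head; cur != null; cur = cur.Next)
                if (Equals(cur.Value, value))
                    return true;
            return false;
        }
    }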
Programming is not an industry of any particular innate social significance. All the talk of software being necessary for various other fields is just rationalization that could be applied just as easily by other fields -- medical advances are expensive to finance, so you could even argue that investment bankers are the real heroes.
Surely Jeff's original context is that keeping one's knowledge up to date is at least as vital to one's career in software as it is in pretty much any other field. Heart surgery is a more valuable field to society, but AFAIK, every technique you learn as a surgeon doesn't get thrown out the window every 5 or 10 years the way it does in software.
Learning how to learn is definitely a valuable skill in the field of software development. Without it we'd all be a big bunch of cut-and-pasters. One of the reasons why it may not be perceived as most critical in IT could be that we are not really in the business of saving lives here. At least not in the field of business IT. But in one indirect way or another, I'd like to think that we actually are.
Playing by the rules we invented? LOL!!! You must be on crack! Let me let you in on a little secret, buddy: we are still constrained by the rules of physics: space, time, and matter, not to go any deeper (see computational theory, NP-hard problems, the very foundation of all the crap we build).
Now if you're referring to the rules by which we must play to program in a framework that some dimwit came up with, then yes... you would be fighting against the rules the dimwit's limited creative capability was able to conceive of...
I agree that learning how to learn is essential.
However, almost none of my schooling taught me how to learn. What it taught me was:
* Learning is tedious and mainly involves memorizing apparently random and pointless facts.
* Work, which you'll be doing the rest of your life, is similarly tedious; that's why we give you tedious homework to prepare you for it.
* It doesn't matter what you learn, or especially if you learn more than you "have" to. What matters is the grade you get.
And I went to "good" public schools, mind you. It took a few years after I quit going to school before I could enjoy learning again.
My goal is to home-school my kid so he won't have to grow up thinking learning is boring and pointless like I did.
I was just saying the other day:
* The IT guy at work is learning to play the guitar
* My brother who works in a Network Operations Center is learning French
* I am STILL learning Software Engineering
Every day it changes. And every day I want to know how and why.
- Until the von Neumann architecture is replaced (in commonly used computers), we sure do play by the rules.
- A professor of mine pointed out that the difference between a professional and a laborer was that the professional had, and used, a library.
- My bestest boss ever kept a copy of Snedecor to hand, and referred to it regularly.
- If that guide never used anything he learned at MIT, neither have Click and Clack the Tappet Brothers.
Kyralessa, you summed it up perfectly. That's exactly what I would have said for public high schools/universities.
In reply to the post of John A. Davis on June 28, 2007 03:34 PM:
And so much time wasted making the flashcards.
Underlining items in the book works great if your memory can keep up and a quick glance reminds you which parts don't need to be read any more. And it doesn't waste time; you can underline the stuff very quickly, unlike making flashcards or writing down summaries.
to "Ben on June 28, 2007 12:38 PM"
heheh. your niece was right. what you were saying to her was probably worded in a very overcomplicated way. that is not for kids. they will not learn that way, they need other ways for learning stuff.
One of my profs said "getting your undergraduate degree is just proof that you can be trained". A Masters and doctorate, in most disciplines, do require actual critical thinking and new approaches. For most physical sciences at least, a B.S. = What, an M.S. = How, and a PhD = Why. In your undergraduate courses you learn the frame of reference and terms for your major, in your master's thesis and courses you learn how those things interact, and in your dissertation you explain why things work the way they do.
Great Post Jeff,
After teaching college for the past 23 years I couldn't agree more.
When I started in this industry my favorite programming language was solder, and PROMs had an amazing 256 bytes of storage. Everyone programmed in assembler, and we struggled to write self-modifying code, spending hours optimizing and rewriting to try to save even 1 byte of space.
EVERYTHING fit on one floppy: O/S, development system, application and data.
The entire documentation for everything fit in one slim three ring binder which could easily be read in a single sitting.
Modern systems have expanded complexity by what, 10,000 times? (Pulling a number out of my ..... hat).
No other discipline approaches the amount of change computing has seen, and in the same time frame-- not even close.
Og, the caveman doctor, was testing drugs and experimental surgery techniques long ago, and nothing much has really changed since then. Test it on a patient, see what happens, shrug.
Architecture and engineering haven’t changed all that much in the last 1,000 years either. Not really. Fancier slide rules, better materials.
And here we are programming systems on machines that would have been considered supercomputers 25 years ago in a global network that was unthinkable 20 years ago in a language that didn’t exist 12 years ago using methodologies that were unheard of 10 years ago exchanging data using technologies that didn’t exist 5 years ago and displaying it with techniques that didn’t exist 3 years ago.
25% of the material I teach gets replaced every year, a major platform shift every 4 years. I teach very little (if anything) of what I taught 5 years ago. Everything gets pushed down and taught in courses and schools previous to mine.
Every year brings a new layer, what started originally as a single layer small application now has over 20 layers looking at all of the hardware and software involved.
I specialize in disciplines that did not exist 10 years ago: application development security and development methodologies (UP, Agile, etc.) in a hostile environment where bringing up an unpatched server will result in your machine being overtaken by 10 year old hackers and turned into an IRC porn server within 48 hours of turning it on.
My last few projects would have been in a Bond movie 5 years ago: wireless RFID tracking of a manufacturing process utilizing fingerprint scanners, PDAs, GPS, and GIS, with all the data being captured by SAP.
The only thing that has been consistent at my job is how much stuff has changed.
When I first started in this field I was proud of how much I knew, now I am all too aware of how much I don’t know …..
Yeah, I agree... sometimes we have to 'unlearn' some knowledge in order to learn new stuff. Our life is a continuous cycle of learning and 'unlearning' knowledge as we move ahead in life and gain experience with the many things we encounter on the journey.
Jeff, where does the picture in "Learning On the Battlefield" come from? Good post as usual.
There are several folks here where I work who have Masters Degrees and such, and almost all of them are subpar programmers at best...
Jeff, you may have Jumped the Shark, seriously.
"Nowhere is the importance of learning how to learn more critical than in the field of software development"? Are you mad? What about open heart surgery, or dozens of other professions?
You elevate software development to some sort of mystical status that is somehow saving the world or something, but it's not. Men who had 100K of memory and had to swap things in and out of it and who had to write hex, now those guys I could respect.
Great post. I recently tried to explain this to my niece who hates history. I told her every smart person I know has knowledge of and opinions on historical events that have no importance to their lives. I tried to tell her that history itself isn't as important as learning to retain, organize, and evaluate information that may not be immediately relevant.
She rolled her eyes and stopped telling me about school. Kids these days . . .
I don't think that's what Jeff is saying. He's pointing out that software engineering can be pretty faddish, where new things replace old. Object-oriented programming, Java, version control, design patterns, ORM, etc., etc. If you like Java, and find it tough to learn a new programming language, then you're going to be unhappy.
Whether this kind of learning is meaningful is doubtful. Someone may argue that they should learn one language and be done with it, and work on something that requires real smarts. Instead, in this field, a great deal of time is devoted to mastering things that quickly become obsolete.
Yet, if you aren't keeping up, then you might find yourself out of a job.
Nice post, Jeff. Enjoyable as always.
Steve - I think what Jeff means is that in very few professions do the primitives change so often as in software development. Medical technology may advance, but human organs are pretty much the same as they were thousands of years ago. I think the guys we should really respect are the ones who at one time wrote things in hex, but have adapted to new technology over the years and are now at home in .NET, Java, etc.
"You elevate software development to some sort of mystical status that is somehow saving the world or something"
Software touches EVERY OTHER INDUSTRY, or at least should. Writing good software means creating tools that help surgeons learn to be surgeons, and later on how to be better ones. Software advances fields of study. That is a fact.
Two years ago, I had major skull and jaw surgery. Three dimensional x-rays were taken and imported into an application that allowed my surgeon to perform the exact procedure virtually a week before cutting into me.
That's not just a model. That is a model with my exact bone and muscle structure. I was not allowed to see the actual software in use, but the doctor was more than willing to answer my technical questions afterwards, since I was obviously almost more amazed by the software used than by the actual procedure itself. He knew which teeth had awkward roots into my jawline without making a slice.
"I can't imagine my life before this modeling thing."
Hmmmm. I hear a lot of that from people using good software. "I can't imagine my life without [insert software/hardware product]".
Maybe doctor-programmers worked on the development of this application, but more likely than not it was some awesome programmers working closely with doctors. Nobody dismisses software development as easy, and Jeff was not saying it was more important than medicine either.
However, name one single field of expertise in the world today, other than hardware/software (including firmware) development, that touches so many other industries with a direct effect on their advancement, and I will, I dunno, become a plumber.
You elevate software development to some sort of mystical status that is somehow saving the world or something
That's not my intent. The rules are a little different in software engineering. Most other fields play by God's rules (physics, biology, chemistry, etcetera). We're playing by rules we invented-- and keep re-inventing year after year! That's why it's a particular challenge for us to be very efficient learners.
Yes, software is amazing, but to say "Nowhere is the importance of learning how to learn *more* critical than in the field of software development" is a bit preposterous and pompous.
This is a classic post. Thank you. I will keep this one bookmarked and point my teacher friends to it next time they tell me how much of the internet they block at the school router.
A lot of people are mentioning that they sure hope their doctors are learning something every year and keeping up with the latest research.
Me too. Does the doctor lose his or her job if they don't? Probably not.
Programmers have a huge incentive to keep up. It's called money. Your doctor probably has to be forced into it, and resents the time away from their practice.
Doctors learn how to learn too, and get damn good at it if they last through med school. Afterwards...?
You're completely missing the point. You skipped middle school and dropped out of high school, and you never went to a college/university. So, how can you judge its usefulness if you haven't experienced it?
Graduating high school demonstrates your ability to go through a general academic program (math, history, science, English, phys ed/home ec/shop) and acquire a wide range of knowledge. Any four year degree at a college or university will do the same, but it will offer additional classes in the subject matter of your choosing.
Also, the most important component of a quality education is the social aspect. I learned more about people of all types and skill levels at college, and I learned more about how the world really works by getting to know a lot of people from different countries. You'd be surprised at how many I keep in contact with 5+ years out.
Developmentally, the "college experience" is all about balance. You learn to balance the learning with the fun. I argue that it's where I learned who I really am.
So depending on the school, you may be in debt, sure. But I argue it was worth every penny.
There are honest educators and dishonest educators.
For dishonest educators, education is about indoctrination and control.
For honest educators, education is about institutional endurance tests. Surviving four (or more) years of degradation for its own sake proves that you deserve a job.
Fortunately it is still possible to educate oneself. You have to do that first, THEN go to school and endure the degradation.
I think this post proves itself.
What about learning new ways... of learning to learn? :-)
Learning IN ANY FIELD comes when you have:
- Maturity (relative to your age, whatever it is)
- Astonishment capacity
I think that if we could teach or learn how to keep these "skills" growing, learning to learn will be easier. Of course, they are necessary, but not enough.
"Men who had 100K of memory and had to swap things in and out of it and who had to write hex, now those guys I could respect."
I have to agree. There are many areas, not only in CS, where you need to learn more than a simple programmer ever has to. With good logic, you can write programs after learning the basic commands for a few hours.
Even a normal sysadmin has to learn 5x as much as a programmer. Basically, in programming you have to learn how to think, and in school they try to help you with this through examples and exercises; the theory is bullshit. I've seen many of my friends mindlessly memorize things for exams; they have never become any kind of programmer.
Hi F Davison, imagine a doctor in the middle of a high risk surgery searching for something in Wikipedia because he doesn't know (but of course, he knows how to find out about something...!)
I got a lot out of my undergraduate and graduate education. And most of my professors who were focused on teaching saw their job as teaching students how to learn. Or so they claimed.
I don't believe it. At least I don't believe they were as serious as they thought they were.
The fact is, they taught us how to learn through practice, with a few techniques ("use flashcards") and practically no theory. Not even the psychology professors spent much time teaching us about learning-- at least in the sense of how to intentionally absorb information.
There is plenty of research on the effects of sleep on learning, the relationship of alertness and attention to learning, and so on. And one can glean from this a variety of practical techniques that could be taught.
If professors are serious about teaching students how to learn, they should keep abreast of research not only in their own field, but also research that has practical applications on how to learn their field.
Very good post. I was troubled by the value of the education we receive from school too, but I was surprised in a good way when reading your post. It's now become inspiring. Thank you.
This is why my motto is...
"If you didn't learn anything today, you weren't paying attention!"
" imagine a doctor in the middle of a high risk surgery searching for something in Wikipedia because he doesn't know (but of course, he knows how to find out about something...!)"
It is impossible for a doctor to know every surgical procedure that they might have to perform. The emphasis in medical school is on reinforcing the fundamentals-- biochemistry, endocrinology, anatomy, etc.-- and laying down a pattern for learning. Surgeons learn basic techniques so they can apply them towards learning new procedures.
Personally, I'd be more worried if my surgeon or primary care doc WASN'T looking things up.
I have a BS in computer science, and at least five fairly distinct 'careers' under my belt-- all having something to do with computers, but none 'computer science'.
I agree completely.
To steal a comment from Digg:
"you'd realize Computer Science is a math degree. It's a product of pure mathematics in fact, study algorithms/turing/etc and you'll understand. Computer Science isn't about learning how to write in a programming language, that's what you learn as a product of learning the entire CS curriculum. The point is to understand all the underlying complexity that helps you understand how all programming languages are the same (how the machine actually works, and to manipulate it), not including the minor deviations which give certain languages specific intent. Yet even those processes can be still applied in any language not designed with those conveniences in place. This understanding enables you to develop well designed algorithms of your own for science applications. A CS degree is not for the IT field, it's for any field of science so you can help scientist visualize and analyze data from testing, etc. You're a functioning component of real scientific investigation with a CS degree. You have to know your math, and very well, in order to develop the tools necessary for all the various fields of research out there that you could potentially contribute to. Too many people go in to CS thinking it's about programming only, and maybe about networking. That's not the fundamental point of Computer Science. It's a real science."
Also, E. W. Dijkstra once said, “Computer science is no more about computers [and programming] than astronomy is about telescopes.”
You have talked about hiring before (FizzBuzz etc) and it seems to me that you are not testing recruits for how well they learnt to learn... you are testing their _actual_ skills.
I suspect that you would argue that the most talented graduates learnt the applicable stuff in their own time outside the course; nevertheless, if 'learning to learn' was your biggest expectation of an education, I in turn would expect you to test your future employees only on natural-language-logic ability and not actual programming *grins*
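(For anyone who missed the FizzBuzz posts being referenced: the screening question asks candidates to print the numbers 1 to 100, substituting "Fizz" for multiples of 3, "Buzz" for multiples of 5, and "FizzBuzz" for multiples of both. A minimal C# version, purely for illustration-- any language works:)

    using System;

    class FizzBuzz
    {
        static void Main()
        {
            for (int i = 1; i <= 100; i++)
            {
                if (i % 15 == 0)                 // multiple of both 3 and 5
                    Console.WriteLine("FizzBuzz");
                else if (i % 3 == 0)
                    Console.WriteLine("Fizz");
                else if (i % 5 == 0)
                    Console.WriteLine("Buzz");
                else
                    Console.WriteLine(i);
            }
        }
    }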
I can't believe you have put that monster's picture on your blog.
It keeps me from viewing your blog! I'm afraid of spiders!!! I can't help but stare at this horrific monstrosity and wonder what kind of nightmares I'll have to deal with tonight.
I am surprised to see that nobody has commented on the wikipedia part of the post.
Personally, I have found the experience of contributing to wikipedia to be very educative. I wonder if it can be used as a learning tool.
Nobody has mentioned lawyers. Yet.
As a programmer, all I seem to do is learn. A few years ago I knew very little about any type of programming, and with the new career I have had to "learn how to learn". I could not agree more with this post. Having had to teach myself entire languages with little to no help, I have even had to learn to teach what I don't know.
I just read the comment about the "pompousness" of Jeff's claim that the ability to learn is of most importance in software development. Unlike many other technical fields, software development requires a monthly/weekly/daily update cycle for your knowledge-- both theoretical and hands-on. Being able to pick up new knowledge quickly is *key* to staying on top of the field in software development. If you skip a beat, you are obsolete within 6 months.
A surgeon has a very demanding task-- it is probably the most demanding profession on Earth. However, the knowledge update cycle in that field is different, for obvious reasons. A surgeon is also at a lower risk of having his knowledge become outdated within 6 months.
Great post, Jeff. I can tell you from firsthand experience that almost none of my schooling has directly related to programming, but the training I received in research and critical thinking has allowed me to quickly research and pick up new languages as I go along. I only have a year of college under my belt, and I didn't actually take ANY programming classes, only general education. When I started my job, which is primarily ASP programming, I didn't even know ASP, but I was able to pick it up very quickly, and two years later I've had many battlefield learning experiences and I am the dominant developer in my organization, all because I was taught how to learn.
Woaw! That's a scary pic of a spider!
I don't really like spiders or any kind of bugs.
It looks giant in the pic. Freaky!
that's a cool looking spider (laughing hysterically)
Made me shake! The spider is creepy!!! How could you look at that up close?!
Interesting post- that certainly explains colleges. The only thing I find sad is that so many comments are on the picture of the spider. They don't even realize that it's an intelligently written blog.
I think a lot of you have grievously missed the point here. Without wasting any more time on the comments here citing quotes and whatnot: those of you who are taking the stance that he is pompous, or blowing the importance/difficulty of software engineering out of proportion, are the ones being absurd.
His points are not only valid, they are fact. Coming from a family of doctors, and being a med student myself, and a computer nerd... I can tell you that, from my friends with programming degrees to my father and uncles with their plethora of doctoral specialties, software advancement (which helps push hardware advancement) is far more dynamic. Have any idea how often truly relevant, field-changing medical breakthroughs are made? Not very often. Any real idea of how often something is discovered or created that changes how you can, or have to, develop? Pretty damn often.
So he is right. And for the one above who made the comment about the surgeon's work being more interesting than the software/hardware used to aid him: you are rather opinionated, aren't you? Who are you to mandate another person's point of interest? I find myself wowed at my father's steady hands and beautiful work every time I have seen him in action as a dentist; does that mean I shouldn't be really astounded when I see a new x$ contraption he has that creates 3D virtual models of a person's entire face? Or the newest addition, a big gray box that you feed an impression to, and a finished filling or cap rolls out 5 minutes later with a 90% guaranteed fit?
If you can read something as well written as this, and the only things you can say about it lack substance-- just hollow diatribe, attempts to sound smarter than you are by challenging the writing with 5-point words-- do us all a favor and quietly comment to yourself at home without sharing.
Wow, this spider is great, it looks well weird.
Hi, I'm the celebrated hacker, the war hacker.
I speak French, and the hacker war has begun.
lol lol lol
THIS IS THE DUMBEST ARTICLE I HAVE EVER READ. I CAN'T BELIEVE I ACTUALLY READ ALL OF IT!!! THANK YOU FOR WASTING MY TIME. YOU WASTED 5 MINS OF MY LIFE THAT I WILL NEVER GET BACK, THANKS TO SOME BULLSHIT ARTICLE :(
This spider is so cool!!! When I saw it I had to buy one!!
"Yes, software is amazing, but to say "Nowhere is the importance of learning how to learn *more* critical than in the field of software development" is a bit preposterous and pompous."
It is more necessary to know how to keep learning in the field of software development than it is in the field of, for example, surgery,
because people still have pretty much the same anatomy as they have had for thousands of years. Surgeons do of course need to know how to learn-- it would be ridiculous to say otherwise-- but the ability to keep learning new things is needed more continuously in the field of software development. I have friends whose parents have been surgeons for years, and they have been doing the same types of surgeries using the same basic procedures all those years... whereas software developers have to keep changing the way they think if they want to stay in the business. As has been stated by most other people here.
You do seem to have a valid point though... maybe you just expressed it in the wrong way? So instead of sharing that valid point with us, you just sound pompous and ignorant.
To JEFF SUCKS:
Why the hell did you read it if you thought it sucked so bad? Also, you just sound like a douche bag with that comment of yours. If you don't have anything valid to say, then bugger off... it's not that hard.
In my opinion, you are not learning, you are digesting information. Even though I've migrated over from VB to C# over the past few years, I can't say I've learned anything new from a programming standpoint except for C# features like structured exception handling, generic lists, the using statement, and the like (see the sketch below). Yes, those are interesting items to learn, but was there anything wrong with my previous VB programs besides that they were, well, old?
I just had to learn how to do it in the new language. Could my program have run OK without structured error handling (try/catch)? Sure.
So yes, learning is important, but many brain cycles are wasted on "upgrading" existing knowledge.
It seems like this industry is always coming up with new ways to make 1+1=2, some good, some bad. I think a lot of this has nothing to do with learning but with how quickly you can rifle through information.
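(A small illustrative sketch of the three C# features the commenter names-- try/catch structured exception handling, a generic List<T>, and the using statement. The file name "numbers.txt" is made up for the example; none of this is the commenter's actual code.)

    using System;
    using System.Collections.Generic;
    using System.IO;

    class Demo
    {
        static void Main()
        {
            // Generic list: strongly typed at compile time, no casts needed.
            List<int> numbers = new List<int>();
            numbers.Add(1);
            numbers.Add(2);

            try
            {
                // using guarantees the reader is disposed, even on error.
                using (StreamReader reader = new StreamReader("numbers.txt"))
                {
                    string line = reader.ReadLine();
                    if (line != null)
                        numbers.Add(int.Parse(line));
                }
            }
            catch (IOException ex)
            {
                // Structured exception handling, versus classic VB's On Error GoTo.
                Console.Error.WriteLine("Could not read input: " + ex.Message);
            }

            foreach (int n in numbers)
                Console.WriteLine(n);
        }
    }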
Great post. My university degree (English Literature) provides me with no knowledge to do my job as a developer/technical project manager. However, completing it gave me 1) the confidence that I could learn, and 2) techniques I could use to find out what knowledge was relevant to any given domain and to filter out the irrelevant stuff.
I regret the time I took out of earning money to do my degree, yet without my degree experience, sometimes I don't think I would have the skills to learn and reference the ever-changing and even obscure technical points my job seems to require.
Despite that, I still believe it mostly comes down to how much you want to learn something as motivation cannot be bought. There are so many claims on our attention these days that only the information that is truly relevant and useful will stick. And that's always the information you personally consider important. Whether you find it in school or through personal study is irrelevant, if you want to learn it, you'll learn it.
My degree subject was Philosophy. It was all about learning critical thinking and analysing difficult problems. There hasn't been a day that has passed that I don't use it in some way. I can't say the same of any of the computer science and programming qualifications I have...
My sixth grade teacher once said "It's not so important to know something as to know how to find out about something". That has stuck with my whole life.
I agree. I spent many years at Webster University Thailand, but at the end of the day, it's your attitude towards the material, the subject, and the teacher presenting them that will make the biggest difference.
I can't say everything I learned was helpful, but it definitely gave a stable base for what was to come. And the real world is always harsher than the safety of the classroom.