March 28, 2007
I occasionally get emails from people asking how to prepare for a career in software development. Some are students wondering what classes they should take; others have been bitten by the programming bug and are considering their next steps.
I always answer with the same advice. There's no substitute for learning on the battlefield.
It appears to me that software development is happening in industry, not in the universities. Universities are great for problems that can be solved by sitting alone and thinking or experimenting for months on end. Universities were great for giving us automata theory, complexity analysis, compilers and the like. But universities are not at all well suited to understanding what is happening during software development.
Software development at the moment is much more like the early manufacture of samurai swords, shields, and battlefield tactics. You make a pile of swords or war tactics, send them onto the battlefield, and see which ones worked better. Then you make different swords and tactics, and so on. You have to be on the battlefield.
I can't imagine learning the things I've learned while sitting peacefully in my office reflecting. Most of my original reflections and predictions were just wrong. So any one of you who is interested in this topic probably has to work as a developer or consultant, so you can see the moment-to-moment action and get raw data.
Of course, software development only teaches you how to talk to your computer. Higher education is still worthwhile because it teaches you how to talk to people. With a good educational background, you'll learn how to read effectively, how to write coherently, and how to think critically amongst your peers.
If I were founding a university I would found first a smoking room; then when I had a little more money in hand I would found a dormitory; then after that, or more probably with it, a decent reading room and a library. After that, if I still had more money that I couldn't use, I would hire a professor and get some textbooks. (Stephen Leacock)
For a fast-moving field like computer science, the work you're doing is far more relevant than any classes you're taking. If you must choose between formal schooling and work experience, always choose work. If you're in school, aggressively pursue real-world experience that complements your schoolwork.
Fortunately, this is a battle you can fight on multiple fronts:
- If you're a student, seek out internships like your life depends on it. Some of the best programmers I've ever met have been college interns. Intern somewhere that you can absorb and learn as much as possible. You won't make much money, but the experience will be priceless.
- Participate in local user groups. User groups are an unbeatable resource for people just starting out in their careers; they're an excellent source of advice and mentorship.
- Contribute to an open-source project. There are thousands, so pick whatever strikes your fancy. But pick one and really dig in; become an active contributor. Absolutely nothing is more practical than working collaboratively with software developers all over the globe, from all walks of life.
- Publish articles. The cleverest code in the world won't help you if you can't clearly communicate how that code works, or what it's for. Try your hand at writing. CodeProject is an excellent sandbox to practice in. Publish an article and the large, active CodeProject community will let you know how you're doing with ratings and comments.
- Start a blog. Pick a writing schedule and stick with it; I recommend once a week at minimum. Select a general theme for your blog and write on topics related (at least tangentially) to that theme. And don't be an echo chamber.
You don't have to do all these things, but if you're serious about your career, pick at least two and follow through. For more detailed advice, I highly recommend Rob's advice on how to become a programmer.
In software development, you learn by doing. As long as you're out on the battlefield fighting the good fight, you're bound to improve.
Posted by Jeff Atwood
I'm currently going to school for a CS degree... however, it seems to be a fairly useless thing. I've done a lot of programming outside of school, and I feel that has helped me far more than school ever will. The classes I take seem pretty useless... I taught myself threading the summer before I took any classes on it. Even then, the class didn't really go into any of the problems that occur when dealing with threads.
I had a job last summer working with databases... that probably taught me more than I'll learn in my upcoming "Database Systems" class.
I'll be starting my third year of classes in a few months... So far, I haven't learned anything relating to decent development methods (source control? No one here knows it exists). Nor have I learned anything about program design (other than hearing about the advantages of object-oriented code on a continuous basis).
Schools are not taking the correct approach to things. At my school, students going for an Architecture degree spend a ton of time working with the latest tools in their field. They have people come in from industry to grade their projects. By the end of school, they will probably know what they are doing. In contrast, CS majors have two classes where Java/C++ is taught. These two classes are easily the majority of the programming that will be done in school... yet they only have students programming for 1.5 hrs/week. I'm no longer surprised when students who have only done what's required of them in school have major problems understanding what's going on in class, and problems doing the pretty simple programs assigned to them.
If we stick with the battlefield analogy for a moment, I would claim that you had better be prepared for the battle. Entering the battlefield without proper preparation will only get you killed. If someone else relies on you winning the battle to survive, you might get them into trouble too. Don't get me wrong, I strongly believe that experience is critical to improve, but IMHO it is important to enter the "battlefield" with a solid foundation. You need to spend some time learning and exploring outside the battlefield to get to know your strengths and weaknesses. Once that is done, head for the field.
Kim, the consequences for software failure almost never involve loss of life-- only loss of money. This is strikingly different from other engineering disciplines, such as the oft-quoted "bridge building" metaphor.
While I don't think people should enter the work force unprepared, most of the nitty-gritty learning will be done on the job anyway. Military rigidity is wholly inappropriate for the fluid nature of software.
In software, you never stop learning how to learn. Some might argue that you have to run as fast as you can to merely stay in the same place...
"Learning how to learn is vital to being able to work, and *that's* something a good university will teach you."
I often recall myself saying the exact same thing. What's the point in learning everything about a certain subject. You'll end up forgetting it in the end anyway. What hopefully sticks is some kind of "intuition" that guides you through similar stuff in the future.
I'm not sure that university can make a programmer, but I do believe a good degree can instill good habits. I think degrees can also filter out those who aren't inclined to work in the industry. That's not to say that there aren't good programmers without degrees, as the best programmer I've worked with didn't have one.
Oddly enough, the best and worst programmers I know are the ones 'bitten by the bug'. The best ones know how to channel what they have learnt, and have strived to learn through practice and exploration. The worst ones get idealistic about the small amount they have learnt, and will push that along (a lot like attacking someone with a blunt sword).
I think the balance requires a solid education, and a large amount of trial and error - think of the samurai who trains but never fights, versus the samurai who charges into battle not knowing how to conduct himself.
Loved the article (as usual) but I don't have a response.
Just wanted to point out that images aren't showing up on any of the blog articles in Firefox... works fine in IE. Not sure if it's just me, or if anyone else experiences this as well.
I have just finished my CS degree which I have been doing on and off for the last 10 years part time. For the last 5 of those years, I've been working full-time as well.
I have to say that almost nothing I have been taught *directly* at university has been of use to me in my professional life. Having said that, being at university exposed me to several different ways of thinking which I have found to be of use.
If you aren't going to go to university, I think the best thing you can do is to play with as many languages as you can. Learn the "weird" ones like Scheme, Prolog, Haskell, and Lisp. Pick up Python, Perl and Ruby. Try out a few different environments such as ASP.NET, CGI, Ruby on Rails, PHP, Windows Forms, Tcl/Tk, GTK and command line interfaces. Each one will teach you something new about the way programmer intent can be expressed or how to interact with your intended audience.
Never stop coding, never stop learning and try not to make the same mistake over and over. Being a programmer is about passion.
As an employer, I much much much prefer to hire people with at least 2 years of a comp.sci degree under their belt. That's enough for them to have learnt the basic structures and algorithms. We've had people who were self-taught (a lot of them, as we initially favoured that), but we've found that it leads to some awkward blind spots. Better by far people have a formal grounding, _then_ go out and learn the messier bits of reality.
In terms of office dynamics, it also helps to have a mix of people - the more varied the experiences of the programmers, the more they have to share with each other, which is good all round. We have programmers who were sysadmins, programmers who are formally taught, and programmers whose experience comes mostly from the field - and they all have a slightly different view on any given problem.
Overall, though, I would really strongly recommend _against_ picking work over school, at least until you're past the first year or two of uni. Oh, and from here, a uni degree means you can travel and find jobs - that's not to be discounted lightly.
My recommendation is to learn another subject in addition to programming. A solid understanding of the problem domain goes a long way.
The main goal of a college education is to increase the mental maturity of students and soak their minds in the basics of a subject. Although college may cover only the basics, remember that it is an important foundation for the rest of your life. Perry's scheme of intellectual development is worth a read.
"It appears to me that software development is happening in industry, not in the universities."
This is like saying that cars aren't built by the materials science people. Damn right they aren't.
Universities have a focus on research. The industry has a focus on producing products. The fundamental mistake people make is thinking that a university education is preparation for work. It isn't, and it can't be. In order to expose a student to everything in a software development company, it would have to *be* one.
What a university can do is allow you to widen your horizons. You will explore many areas of computer science - different languages, paradigms, and aspects of computers.
Many (but not all) of those who go straight to work end up with a narrower view of programming. For example, if they get a job as a PHP coder, then they will never ever learn anything except PHP, and when PHP is "out" and something else is "in", well, they're out of a job. (In the same way, many navel-gazing researchers forget that the wonderful new thing they have invented can't be built in practice.)
Universities can provide you with breadth. They can keep you from being stuck in one small part of the programming world, and will let you see what may come out of the labs. That's why you should stay in touch with the CS research community even after you've graduated.
Remember, the samurai who fought with swords were eventually slaughtered by the ones who got guns.
University = research
Software development company = commerce
Battlefield = you, commerce, research, social skills, coding, balancing work and free time, your family, life itself, ...
For me it's about solving real problems. By real I mean actually filling a need right now. In University, you tend either to solve non-real problems, or to solve very narrow slices of real problems.
Quite often in industry you find yourself taking a week to solve the core of a problem and 6 months to actually finish the application. This is because in the real world you have many other peripheral and non-functional requirements to deal with. The thing needs to be usable, fast, scalable, secure, deployable, upgradable, pretty, stable, profitable, maintainable, etc etc. The devil is in the details, so to speak. But solving these problems efficiently and cooperatively is the mark of a great developer (and development process).
Agree 100%. Baptism By Fire is the only way.
Get data structures and algorithms down in school. After that, you learn best by coding, and by being exposed to those with experience who code well. You'll find such people most frequently in environments where code is moved to production quickly, and frequently.
I like this analogy. The point, to me, is that you cannot get better at writing AJAX-enabled web "applications" by reading about what others have done. I will tell you 100% that there are non-obvious problems you WILL encounter in any non-trivial application.
I was once tasked with writing an example application. The only requirement was that it "uses struts". This was the only application I've worked on that I would consider an (almost) complete failure. Every time I tried to inject a functional requirement, we got tied up with architecture details (should it be an interface or an abstract base class, explain) and never actually delivered anything.
It was amazing to me that the "leads" on the team were 100% happy to sit around for 10 hours per day and talk about the reasons why one solution or another might be better, but actually refused to TRY anything. In my experience, if you don't actually try to implement a design, you have no idea whether it's going to work or not.
I ended up getting my job without a degree. Granted, I only make 50k (this is my first year there, though). And not having the CS degree has actually made it easier. The guys I work with who have them always seem to make things MUCH more complicated than they need to be... thus logging more hours... thus making more money... damn, is that what college taught them? :p
You mentioned CodeProject, but you forgot an important caveat: don't actually use anything that you find on CodeProject. It may be alright for getting your own stuff rated, but due to the quality of most code there, Alex at Worse Than Failure had to make a special exception that code from that site not normally be featured on his site; otherwise he wouldn't need any outside contributions for years.
I learnt a lot on the job as well as in Uni. To say one replaces or supersedes the other is nonsense.
Get practical experience while in Uni, it'll help you learn better.
Get theoretical experience while at work, it'll help you code better.
Excesses of one lead to mental m@sturbation and analysis paralysis, excesses of the other lead to shoddy copy+paste code that lacks expression. Life's a balance.
One must really want to become a great developer to invest so much energy in such a small, meaningless part of one's life.
Anyone here ever considered what's the cost?
My view resides with those who see college as a filter...
(I agree with the 'research' aspects as well, but I'm not hiring the research folks for what I need).
If you are developing some software and dont know what the head of a queue is, then you need to learn the terminology.
If you went to school but didn't 'succeed' well enough to graduate, then you're fighting the establishment too much, you're not able to learn, or you have no discipline.
If you never went to school, you could still know the basics, could still learn stuff, and still have discipline. I've hired and worked with such folks with great success... but not often.
You know, for some reason I thought you were advocating a career in the military as a programmer...
I can learn from university what I can not learn from the "battlefield".
I couldn't agree more that the only way to really learn is by doing. And failing. Failure is the ONLY real teacher. I think that R. Buckminster Fuller once said that "The purpose of Engineering is to fail as quickly and spectacularly as possible".
I just came back from a training seminar. I "won" enough software books to fill almost a linear foot of shelf space. I won't read them. I may, just may, refer to them if I run into a problem. If I want to learn something in a new language, programming paradigm or methodology, the best I can do is read just enough about it to dive in and start working. I am not afraid to throw away code that is not beautiful. I have written less code overall by throwing away already-written code than I would "save" by patching poor code with more poor code.
"Just Do It!" isn't just for sneakers.
I went to school for Economics and Drinking.
Been working as a programmer since I was 15, though. Persistence and determination pays off.
Perhaps I'm fortunate that my lecturers actually worked for software companies so they knew what was actually useful. Granted that real world experience is what counts, but having a guide was important.
I have to say that you are right, Jeff. Learning to do by doing is extremely important for a programmer.
I was taught through a college where the classes were designed to teach us through code. Instead of simply learning advanced theories like binary trees in a classroom, we coded it. In order to code it, we had to learn it. Not only that, but we had to do a large percentage of our class work in groups. This brought in the social requirements of working as a team.
When I finished school, I came to work with an excellent ability to learn and adapt. I am not coding in a language that I learned in school and I was able to start coding the second day of work without any difficulty because I already learned multiple languages in the classroom. Our classes forced us to work and learn as teams in many different situations, coding many different types of problems and languages. You learn as a team and you excel as a team.
I have yet to find a single University where this method of learning is used. I may not have a degree, but in many ways, my education far surpasses anything taught in the universities I have looked into.
When hiring Software Engineers I look for a CS or IS degree. No degree, no job. Next I look at personality and social skills. If I don't think they can fit in with me and my team then no job. Finally I look at experience (I include certifications in this category) and compare that with the position's requirements, salary level and the expected learning curve. I am much more flexible on experience since I know that with the degree they already have the basic skills they need to do well in the work place and it is just a matter of good management and mentoring (i.e. this goes back to personality and social skills) to bring out the best in them.
Absolutely agree with you.
At this time I'm 17 years old, a student at the Moscow Aviation Institute, and I already work as a programmer (part-time).
Dunno about the US, but here in Russia higher education is decent only in theoretical fields. IT classes are frozen in time, at the end of the eighties probably :). We study ancient artifacts like FORTRAN and Pascal :).
Most graduates don't have a clue about where to start.
Conclusion: you cannot teach someone to code; they have to learn it. I mean, "If you want to learn programming, you should program."
Most people think it's hard, but it's hard only without passion.
Sorry for my poor English :).
while i understand the need to get on the battlefield (a poor comparison in my opinion - i hardly think my work has anything to do with a war mindset), there's nothing wrong with a little bootstrapping. read relevant journals and books. get certified. keep a project idea journal.
you'll be surprised how far a little ingenuity will get you.
I have to agree with both;
I am sure most places have the self-taught person who "wrote Pascal while you were in diapers" and couldn't care less about that "extensibiwhatis" mumbo jumbo. This is the way I have always done it.
And then there is the old saw about the recent PhD graduate who thought he was doing a great job normalizing a database schema to the umpteenth degree in Oracle, when the requirements called for about 4 kilobytes of data persistence that could have been put in XML...
Don't neglect one for the other, but then I guess that could be applied to many things in life...
"When hiring Software Engineers I look for a CS or IS degree. No degree, no job."
Sadly Phil, this is the attitude that most people take when hiring their employees.
What separates the CS or IS degree holders from college diploma holders? A piece of paper.
I've heard of only one job interview where the ability to learn was one of the things tested. They sat the guy down in front of a computer, gave him a book about the company's proprietary programming language and gave him one hour to code a task in that language.
This is what separates the men from the boys. When you can sit down in front of a computer, learn a new language, analyze the problem given, and code a solution for that problem in one hour, you are a programmer. No piece of paper (a degree or diploma) will ever tell an employer if a person can learn and adapt. The only thing a degree or diploma can tell you is that the person can memorize facts for an exam.
You simply can't beat on-the-job experience. Even teachers who don't code regularly get lost in the business of theory. Don't get me wrong: theory is good to know and can be extremely beneficial, but I've noticed many teachers have lost their edge as 'working professionals'.
The best teacher is your peer who is better than you and who writes inspiring code and shows you the way.
Hobby programming is nice but you won't lose sleep if you never get around to finishing your project and how many of us never finish?
I don't care how, but you must get your foot in the door somewhere even if the pay is shit and get to coding. And when you hit your first wall of not knowing how to program something your boss asks for...well only the strong survive...and if you're a survivor then you are a programmer. Because to program is to survive.
There are two kinds of programmers: Those with a degree and those without one. They don't do the same work. For the stuff we do where I work, we would quite simply not consider somebody without a CS degree from a prestigious university.
The CS degree gives you the theory. Work experience gives you knowledge of APIs and experience. For some programming jobs, you only need the latter two. For some, you need all of them.
"We study ancient artifacts like FORTRAN and Pascal :)."
This is a perfect example of somebody who does not yet understand. You don't study a language. You study concepts. The language often does not matter at all. In fact, it's probably a good thing you see a bit of FORTRAN and Pascal (or Oberon, or Eiffel) to learn about the history of procedural languages. That tells you a lot about why we do the things we do the way we do them. In the same vein, you'll probably not learn (much) about functional programming or about logic programming if you aren't forced to do so when getting your CS degree.
I find that academic research runs 5-30+ years ahead of industry. Even stuff like relational databases took a decade to move from some "ivory tower" paper to a commercial product (paper in 1969 to the product that became Oracle in 1978). Now, SQL and relational databases are everywhere. It wasn't always that way. Our profession has a very short memory and part of that is because of the "use and discard" mentality, where folks over 30 tend to be discriminated against. Functional programming? That is ivory tower academic stuff that is only starting to become known outside of CS departments - more than 30 years after being "discovered."
Do you learn how to use things like "source code control" in school? Most likely not. Less than half the companies I have worked for as a developer in the past decade have used source code control. Why learn something that half the companies in the US won't use?
Theory is important for understanding how one can get better. It isn't a replacement for actually going out and "doing it."
Don't confuse "talent" with "skill." When describing the difference, I use a sloping line, of the "y = ax + b" sort, with practice on the x-axis and skill on the y-axis. Talent sets the slope of the line; skill is the height you actually reach. A talented person who practices not at all will end up with very low skill, while someone with low talent who practices vigorously will end up with a higher skill level than the talented person. Learning also has a second-order component: the more you learn, the faster you learn more, like compound interest. Richard Hamming gave a speech on this two decades ago, and it is still true today.
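That sloping-line model is easy to sketch in a few lines of code. This is only a toy illustration of the commenter's analogy; the function names and all the numbers are invented for the example:

```python
def skill(talent, practice, base=0.0):
    """The linear model from the comment: skill = base + talent * practice (y = ax + b)."""
    return base + talent * practice

def skill_compounding(talent, practice, rate=0.1):
    """Second-order variant: existing skill accelerates new learning, like interest."""
    s = 0.0
    for _ in range(practice):
        s = s * (1 + rate) + talent  # prior skill compounds before new practice is added
    return s

# A high-talent person who barely practices...
natural = skill(talent=3.0, practice=1)   # -> 3.0
# ...ends up behind a low-talent person who practices vigorously.
grinder = skill(talent=1.0, practice=10)  # -> 10.0
assert grinder > natural
```

With the compounding variant the gap widens further: `skill_compounding(1.0, 10)` comes out to roughly 15.9, since each practice session builds on the last.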
As an analogy, I like to point to the club band scene. I play guitar (not in front of others), so I can tell who has better chops than another. You can see who has had "classical" training almost immediately. There is an upper level that self-trained guitarists can reach, while one who has had formal training can reach much higher levels of skill.
I agree with Jeff. In my case I specialized in networking, but by turns in life I ended up as a system developer. Yes, sometimes the lack of experience can get you in tight spots, but if you get mentored and prepare yourself on the fly, you can end up a successful intern or employee. Both have been true in my case.
Summary of thread so far:
1. People who haven't got a CS degree think degrees are overrated, because they didn't go to a university and became great coders.
2. People who have a CS degree think degrees are vital, because they did go to a university and became great coders.
There are obviously many paths to wisdom.
Academia isn't always years ahead of industry.
Two recent notable examples: graphics hardware / rendering and data center scale engineering / operations.
Of course, if you insist that academia is 5-30 years ahead, then I'll agree... it's just that often times they're years ahead in an impractical ("wrong") direction :)
The Canadian Academy of Engineering requires a year's worth of internship before it allows you to graduate with an engineering degree (which comp sci degrees are considered to be in Canada). I wish this were a requirement of more programs in the US.
I'm with Jamie on this topic. I have a diploma from a technical college. The course work was very practical. We did learn the theory, but we also applied it through labs and assignments. There wasn't a single data structure or advanced algorithm we didn't write by hand. After graduating I was able to go head to head with CS grads who had several years of experience. I have worked for at least 2 companies that actually give preference to graduates from the college I attended over applicants with a CS degree, mainly because we can produce right from the get-go. I have had, on several occasions, fresh CS grads put on my teams and found, with a couple of exceptions, that they took longer to come up to speed with the technology used and took longer to produce quality results, even after a year or two on the job.
I've been working as a developer now for 18 years, and have never had a problem finding work - in fact I turn it down on a fairly regular basis. I went independent (contracting) 7 years ago and it's been steady ever since.
Don't get me wrong. I'm not trying to knock or slag CS degrees. I have worked with many CS grads who were quite excellent. I have also worked with diploma holders from "lesser" technical colleges who were totally useless. I considered going back to uni to get a CS degree myself (all courses from my tech college are directly transferable to the uni in my city - no other tech college here can do that - so it wouldn't take long). After a couple of years working, though, I just didn't see the point. In this city, at least, there doesn't seem to be any extra benefit. I think those of you who exclude people because they don't have a "degree" are likely missing out on some fantastic resources. Not all tech colleges are bad (though I agree there are a lot that are). Some are very good. A degree, in and of itself, does not mean the person holding it is a better developer. It just means they have a degree. /end rant.
I have to say I disagree with you on many aspects of this article, but first, I do agree that there is no substitute for experience. However:
Trying and failing does no good if you don't know what you're doing in the first place. In a scholastic environment, you can get the help you need to figure out what's going on, so your failures can be corrected and you can show what you know. In a professional environment, this scenario would result in you finding a new job.
There has to be a foundation for any job. You don't become CEO of a bank without knowing business math, and you don't become a programmer without knowing software principles such as memory usage, syntax, and the like. If you just jump right into the industry with no prior knowledge, odds are (as I have seen) your code is going to be very inefficient, ugly, and buggy.
"the work you're doing is far more relevant than any classes you're taking" - wrong. Most of my classes at my school took the principles of computers and built on them to make me able to use any programming language I want. From learning about basic data structures to sorting algorithms to compilers, I gained a total understanding of how software works, and with that foundation, I can currently use C, C++, C#, and Delphi to write any application I want, and any other language I want to pick up I can do so in about a week as compared to months for someone who's never had any formal training. Training is vital to being able to work.
"seek out internships like your life depends on it" - probably the best piece of advice for students out there. However, schools offer something that you can't get anywhere else: a pre-existing network that you're automatically in. In the workplace, you have to build a network of contacts, friends, and supervisors so that if something happens, you can find another job fairly easily. Typically, that network is quite small for a long time. However, through schools, that network is built through the instructors, your peers, and the support faculty (secretaries, career services, etc). It's already in place and ready to direct you to a great job *if* you have the skills and knowledge.
I won't go into the rest of it, but will close by saying: experience is great, but lack of skills will get you nowhere. It's 100 times harder to get those skills with experience only, and a good school will set you up for great success.
LKM, you're absolutely right. I messed up :D
There are paradigms and concepts that will stay the same for a long, if not infinite, period of time (procedural, functional, OO, and so on). But there are technologies that are constantly evolving. If you know the concepts, you can do anything theoretically. But in practice you ought to know more than just concepts.
In ancient Japan, sword masters tested katanas on wood... and peasants. It's blood that makes a sword a weapon.
Same in CS. Swords stay swords. Steel changes, shapes evolve, and so on. But you still need to spill the blood.
Once you know the concepts, you are a blacksmith. And when you kill, you are a sword master.
So by that I mean one thing: a good programmer is always in the process of learning and applying knowledge in practice (yeah, nothing new here :D), not just waiting for something or someone to teach him.
Training is vital to being able to work.
I'm sorry, but I do not agree with this. Learning how to learn is vital to being able to work, and *that's* something a good university will teach you.
But the courseware itself? Largely irrelevant in our field.
For best results, get out there and immerse yourself in the thick of it with your peers. Make mistakes!
tcliu -- You are my hero.
Both of your comments are insightful... the first, into our industry. The second, into the people that make up our industry.
No way. You're saying that experience is important? That's unheard of. Theory and practice are the same thing, aren't they? Universities produce all or almost all of the software that's used by millions of people around the world, don't they? Evil corporations don't innovate, they just steal the ideas from universities, right?
I'm one of those who were misled into thinking they should go to school first, get a degree, and then find a job. I never dared to apply for a programming job before completing my degree. So, I learned the science and management of programming. I made my code clean and maintainable, I understood the importance of good requirements and well-designed models, I even solved a few nasty puzzles and had a peek at a few related fields. Then, I was dropped onto the battlefield.
I didn't have the acronyms. I didn't have any specs. Nobody had any kind of methodology whatsoever. All I was expected to care about were "implementation details", as my teachers called them. Nobody had the faintest idea of what professional software development was about. They just hired programmers to do the technical stuff.
In the years that I fought, I acquired a few language skills, learned to configure system services, followed some of the hype and got a feel for the type of applications I'm developing. I don't really model. I don't test much either. Who would care anyways? In some ways, what I'm doing today professionally is much closer in nature to what I was doing in my basement as a teen in BASIC than what I learned to do in school.
Sometimes, I wonder if I haven't regressed.
Jeff, you say - "As long as you're out on the battlefield fighting the good fight, you're bound to improve."
But what if you're out on the battlefield, but fighting the BAD fight? The wrong fight? You're bound to...
I disagree with the notion that it is necessary to attend a university to prepare one's self for real work as a software developer. I taught myself C++ by writing code and reading books. It took about a year in between stocking shelves at the local supermarket to master the basics. After realizing that I wanted to write code for a living, I decided that I should probably go to school.
I wouldn't say it was a mistake, because it opened a door to an internship that grew into a full time position. But, I learned far less _useful_ information through my 4 years of (part-time) college than I did in that first year of actual programming. After working full-time and taking night classes for 4 years, I decided to put school on hold and haven't looked back since.
If you lack the ability to learn complex new concepts without the structure of a university environment, you will not survive in this field. Going to a university may teach you how to learn, but if you already can, teach yourself the skills necessary to build something impressive. Then do it. Contribute to an open source project, or create your own. In my book, a background of excellent work is far more impressive than a college degree.
When hiring Software Engineers I look for a CS or IS degree. No degree, no job...
Phil, I think you are a moron, truly. I find this method of hiring really discriminatory, especially concerning programming. I left uni, where I was studying computer science. Why? Because my professors were idiots. If I have the discipline to teach myself, at a level and pace that supersedes uni, how am I lesser than the other programmer who stayed in class? That degree requirement of yours seems too rigid in my view, and if I had to guess, you're probably a bad programmer.
Ultimately, as Jeff says in another post, you have to have the mindset to be a programmer. I don't believe that a CS or IS degree is any more important than a mathematics degree. Programming requires logical thinking, and a college/university course can filter out the least suitable.
One thing that I don't think has been mentioned enough is that CS courses tend to favour Open Source software, which means zero in terms of candidate suitability in a Microsoft house.
In my company we tend to hire people from a numerate science discipline (physics, chemistry, bioinformatics, ...), mainly because our customers are scientists, engineers, etc. Learning SW engineering/development is done on the job plus specific training. We use few CS graduates since they wouldn't fit our customers. Yes, some more formal education may be helpful but I agree with the notion that the more interesting bits are happening in industry, not at university: look at agile/XP, test-driven, refactoring, patterns, ...
Do Generals make analogies to software when talking about Battles? Doubt it.
Battlefield analogy is cute but isn't great. Analogies in general bite because someone always picks a hole in some aspect and they're never perfect.
Custer's last stand, anyone? Er, actually it reminds me of many software projects.
The battlefield analogy is poor because, the way it was presented, it was basically saying "experience is good; don't let your lack of schooling stop you from getting practical experience before you finish your degree."
However there are so many more aspects to battlefields that are being conveniently sidestepped for the sake of a good article. If you think armies just wake up one day, run to a field, and start fighting you're an idiot. Those that do, die. Battles are planning, and planning takes skill. Skill is learnt through a combination of experience and revision (of history, tactics, etc). Soldiers train, practice manoeuvres, etc etc etc.
Battles themselves are short, explosive bursts of activity following orders and adapting to situations.
Writing software is talking to clients about features, prioritising, coding, testing, prototyping, meeting, talking about more features, reprioritizing, changing code, maintenance, more testing, regression testing, more meetings, and the odd successful, on-time delivery ....
So how exactly is a battle in any way similar to the process of delivering software?
Because one should never criticise ideas without offering their own, here's my newbie advice:
Don't expect to be an architect or know everything out the door. You won't solve things the best way the first time. Failure is ok. Have a goal.
AVOID analysis paralysis. Coding and testing something will give you a better indication of how things work than looking at a sheet of paper all day and drawing pretty pictures. Don't just throw away those prototypes though, refactor them into something useful.
Write tests first.
DO find a good 'commander' (ie mentor) and get in the trenches. Listen to them, take what they say, but don't believe everything you hear. Think critically.
DO move around. You don't want to get stuck in a job for more than 2 years. (it's better for your salary as well as for your experience)
DO NOT feel bad about quitting.
DO say NO. Avoiding bad experience is better than no experience, so don't just take anything that comes along. It's your life, and if someone asks you to write bs code for their horrible WTF of a system you should pause, take a look around you, and if you see your future spiralling into the drain then leave. It's just not worth it. You don't have kids or a mortgage yet, and you can manage eating KD for a few more weeks.
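The "write tests first" point in the list above can be sketched with a minimal example. This is just an illustration of the workflow, not anyone's actual code; the function name `slugify` and its behavior are hypothetical choices made for the demo:

```python
# Tests written first: they pin down the behavior we want before any
# implementation exists (the function name `slugify` is hypothetical).
def test_slugify():
    assert slugify("Hello World") == "hello-world"
    assert slugify("  Trim Me  ") == "trim-me"
    assert slugify("One") == "one"

# Only after the tests exist do we write the simplest code that passes them.
def slugify(title: str) -> str:
    """Turn a title into a lowercase, hyphen-separated slug."""
    return "-".join(title.strip().lower().split())

test_slugify()  # passes silently; a failure would raise AssertionError
```

The point of the ordering is that the tests become an executable spec: you know you're done when they pass, and refactoring the prototype later is safe because the tests catch regressions.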
I'll just echo many of the previous comments by saying that education is a crucial aspect of any programmer's career. However, without any experience "on the job", a well-trained programmer often knows just enough to be dangerous.
I've seen that some people consider certifications to fall under "experience". Personally, I'd consider them a proof of an education. Somebody who has a Sun Java Certification most likely knows what they're doing. Being Sun Certified says leaps and bounds more to me than a mention of passing a Java course or two at some state college somewhere.
That depends highly on the faculty that "gets" the Comp Sci program. You'd be correct if it's the engineering faculty (or you're confusing it with a software engineering degree).
Many universities have the Science faculty responsible for Comp Sci, in which case Comp Sci students escape with a BSc, not an engineering degree (and so are not engineers). There's also the weird case of the University of Waterloo, which has a Math faculty (which gets the CS program, obviously).
You also need to stay on the battlefield long enough to learn from your mistakes. E.g., if you design a system and then leave before it goes live, you've only seen part of the battle. You don't know what happened in the end, so you don't know the ultimate outcome of your decisions. Naturally, you'll assume that your contribution was successful, but was it? What were the final consequences of your decisions? Did you make a mistake? Did you omit something important? If you don't hang around until the end, you'll never know.
Too much "experience" in this industry consists of people doing stuff, but then moving on to other projects before they even have a chance to identify their mistakes, never mind learn from them.
I read somewhere (but can't find the link) that proficiency in software development is not correlated with years of experience, but with the number of times a person has been through the whole lifecycle of a project, right through to completion.
Also, you learn faster if you can get accurate feedback faster: http://www.agilekiwi.com/making_better_programmers.htm
Personally, I agree that talent and work experience are paramount, and that most programming jobs can be done without an academic background (though I think one adds a lot of value).
However, I take issue with the tone from the article and some of the commentators, as if universities are a thing of the past ("universities were great...") or as if they are behind the times in research.
As with most industries, the software industry is, and has always been, years behind the research front in academia.
Some examples (also from electrical engineering): distributed computing, garbage collection, communications, voice and video compression, image analysis and processing, control systems, parallel processing. I could go on.
This should not come as a surprise. The purpose of academia is to research. Even if the area is obscure. Even if it seems of little practical use. This is the difference between science and engineering.
Later come the engineers in industry, who have a problem and need a solution (or saw a solution and are looking for a problem). These _academically trained_ engineers keep on top of current (or older) research. Or maybe they vaguely remember something from their studies. Perhaps they have friends in academia who remember reading this or that article.
The result is that these engineers read the research, then apply the solution (for example, an algorithm) to the problem. However, research is seldom interested in practical considerations. So the engineers need to understand the solution, and then adapt it to the real world. Of course, that takes a lot of time.
This is the norm in most disciplines, and not that different in many areas of software engineering.
This is also why it pays to have academic background. You know so much more, you have a wide base of knowledge to draw from. You have a strong theoretical background, meaning you can quickly become familiar with new subjects, just by reading the basic book on whatever subject you are currently working on. Some jobs are so hard that you need an actual academic expert on the subject. Just try developing a V.17 modem yourself.
Of course, there are exceptions. They are exceptions.
I would tend to agree with this whole battlefield analogy. From my experiences in college and then the past 7+ years in the field, I feel I didn't become an effective developer until several years after leaving college. I even wrote about it on my blog (http://www.billrowell.com/2007/03/08/what-did-you-really-learn-in-college/) a while back, reflecting on what I had really learned.
@Phil: "When hiring Software Engineers I look for a CS or IS degree. No degree, no job."
And that's why you miss out on some really good programmers working for you.
The best programmer I've ever known was a Russian immigrant who Americanized his name. He had a Soviet high school education, and no higher learning whatsoever. He was a legend at a very large corporation in Houston, so it wasn't just me that thought he was good.
I myself don't have a CS/IS degree (no college degree at all, actually - I entered the military right after high school), and I've been programming since 1986. Since I've sold commercially (including a library with source for other developers which sold more than 10K copies worldwide), I must do OK as well. I work primarily in Delphi, but also work in C#, C++, and ASM.
I actually have a pretty good reputation among Delphi programmers, and the company I currently work for typically requires a degree. They actually invented a new position in order to be able to get me here; the new position required the degree or X years of experience as an equivalent. If they were willing to go to those lengths so I could work for them, I must do OK.
When I was interviewing programmers for the above-mentioned large Houston corporation, I looked at the resumes for experience first, and then education. I did a phone screening looking for knowledge levels and an indication of their personality; those who were obviously not knowledgeable enough (like the guy who coughed a couple of times before answering each question in order to hide the sound of his keyboard as he googled the answer) or seemed a poor fit with the people they'd be working with, didn't get an in-person interview. Neither did the gentleman who, about halfway through the 30 minute interview, flushed the toilet. :-)
I've had interesting experiences with college-educated people. I've met some who were really, really intelligent. I've also met people with Masters degrees who didn't have the common sense you'd expect from a new teenager. I watched "Are You Smarter Than A 5-Year Old" for a couple of minutes last night, and saw a 3.5 GPA college graduate who thought that the sentence "In Oklahoma, Oprah met our ostrich Ozzie." only contained one proper noun (Oklahoma), and that Oprah and Ozzie were pronouns. The 10-year old she was paired with at the time saved her. :-) How much is that 3.5 GPA worth now, and what does her college degree prove?
It's the underlying intelligence that's most important, not the degree. If the smarts are there, they can learn now what they need to and didn't get to go to college to learn.
Reviewing what I wrote above, I'd like to clarify one thing:
I have no problem with the guy who was using Google to find answers; that's actually good. The problem was that the questions he was looking for answers to were basic knowledge that every Delphi programmer knows; if they didn't, they'd never get anything done. We hadn't even gotten to any of the tougher questions.
But the courseware itself? Largely irrelevant in our field.
Depends on what you do. I was actually surprised how much I was able to use, for example when writing search engines or compilers.
As I said, some jobs require it, some don't.
I think the main issue is that those of you who never got a CS degree don't really understand what is being taught there.
For example, Paul wrote:
"One thing that I don't think has been mentioned enough is that CS courses tend to favour Open Source software, which means zero in terms of candidate suitability in a Microsoft house."
I think that is a perfect example of somebody who simply does not understand what students actually learn. They don't learn "open source software," they don't learn specific examples ("instances" :-) of technology. They learn concepts. It does not matter one bit to Microsoft whether they learned how to design and write, say, compilers using open source or closed source. The important part is that they've learned it.
Anyway, my point is this: Most of the people here claiming that CS degrees are useless or antiquated or that experience gives you everything you get during a CS degree probably have no idea what students actually learn when they get CS degrees.
Any paper qualification (degree included) only tells you the minimum of what a person has achieved or could achieve. That is, it only tells you this person can do/knows at least this and this and this....
To know the most the person can achieve, you still need to probe some more; i.e., interviews...
BTW, having basics such as a bachelor's degree helps in understanding the concepts along the way, along with giving a broader understanding of the field. If the person in question doesn't use that to their advantage, they can easily lose to someone who works harder to gain the knowledge. No surprises here.
In other words, as other posters have noted, there are many paths to wisdom. Pick yours.
"With a good educational background, you'll learn how to read effectively, how to write coherently, and how to think critically amongst your peers."
Have you spoken to a college grad lately? I tend to disagree with the above statement.
I really like what you said. Can you provide a little more detail for a starter like me, who only knows how to use loops in C to write 10-20 line programs? I am currently in my first year of computer science at an Indian college, where the only aim set by uneducated teachers is to clear all the exams and have above 75% attendance.
I can study from books, I can work hard. And I can also press my close uncle to allow me entry into his software company. I am craving success; it is making me sick.
The only things that scare me about leaving college are:
1. I might miss the basic knowledge (as people say).
2. I will get stuck in the same kind of work at the office and will be deprived of the core knowledge needed to evolve into a real software creator. (I am not going to be mediocre.)
3. I have given a huge amount of money as a donation (that's the only way in India), so failure will literally kill me.
I beg for your sincere, and maybe long, advice.
The thing is complicated, but the truth is that experience plays a very important role.
And sometimes it is better than a degree.
By the way, the article is very good.
Computing is more and more like speaking: necessary to get a lot of things done. Contributions are made at different technical levels. Just as effective human communication is done very differently by battlefield commanders and movie scriptwriters, different computer users and programmers will make contributions very differently.
Some artists will start with Photoshop, move through scripting, and invent something new from that vantage point. Others will follow in the path of Hoare and Knuth and invent algorithms. Computing is becoming multicultural in its own techie way.
The Warren Buffetts and Mozarts of the future will be using computer languages and other tools. Just not necessarily the same languages and tools.
So there are no one or two right backgrounds.
Apparently, those who didn't go to college insist that the things being taught in universities are useless on the battlefield. I think getting a degree (at least a bachelor's) and then spending the rest of your life on the battlefield is the best way to go. If you go straight to the battlefield without a degree, you will end up trying to convince your peers that a degree is unnecessary until one day you finally realize that a degree is important and you drop everything in order to get one.
University is an absolute waste of time and money. You learn outdated ways of writing software from people who have worked mostly on their own, and in university, their entire lives. I had a lecturer teaching about working on team projects when he had never worked professionally in a team. On another topic, the lecturer who taught data structures and algorithms was an associate professor of biology who had never written much code. And this is coming from one of the famed Group of 8 universities. I was a tutor while studying, and I learnt that not only is there no incentive to teach students, lecturers specifically do not want to teach them! The reason being they get paid for doing research and publishing papers, not for teaching. And you wonder why there are more and more dodgy developers...
Learning from the battlefield from day 1 is the way to go. Nothing beats practical experience and learning from your mistakes.
My computer science degree was not from one of the top universities and I found it all rather easy. Fortunately, when I graduated I was able to get into a reasonable job which provided a stepping stone into my second job. That's the one where I started to improve and so on.
I think my degree has helped in that it covered a lot of different topics. I learned about hardware architecture, electronics, lots of different programming languages, different operating systems, networking issues, etc. This makes for a well-rounded IT professional who knows their way around the IT field. Focusing completely and solely on programming is fine, but it misses some important stuff that could be more difficult to learn in industry.
I do agree that getting internships and getting involved in open-source is good.
You seem to be using "software development" and "programming" interchangeably - "software development only teaches you how to talk to your computer. Higher education is still worthwhile because it teaches you how to talk to people."
Programming only teaches you how to talk to your computer; to be a good developer, you'd better know how to talk to people as well. From that point of view, one of the major goals of modern education (socialization) dovetails nicely with one of the things that will separate a good software developer from the programmer in the corner cubicle with the fluorescent bulbs overhead turned off.
The only thing that I would add is try to find a job/internship where you will be challenged and learn. This is coming from personal experience where from the job aspect I haven't learned anything that I didn't know 3 years ago, but at the same time it allows me venues to learn the different languages without having a high outlay of money.
As for whether a person has a degree or not, I really don't think that should be a deciding factor. I majored in IT with a minor in Comp Sci, and honestly I didn't learn much except how to drink and how to work within a group more efficiently. I don't mean to bring up FizzBuzz again, but I think tests like that are more beneficial for hiring than a rule where, if you have a degree, you get looked at, and if you don't, you are SOL. There are plenty of great programmers and developers who have no college, and on the other side there are people who have degrees but can't program a Hello World application. Personally, I'd interview as many as you can without deciding based on whether they have a degree or not; that way you have a better chance of getting the best fit.
That is all so true. Once you learn the concepts, languages are just syntax. If you get a person with a strong understanding of software development concepts, I would be willing to bet money that you could have them play around with a language and, within a short time, that person would become productive enough to build small, if not large, applications in it.
Ah yes, the battlefield, where we wage war against the 1000s of lines of code in the applications we maintain or create. The debugger is my friend and exceptions are my enemies. Send in the Asserts followed by the Debug.TraceLines. Unhandled exception, have no fear, stack dump is here.
CS degree required? Maybe; it will get you in the door, but university in general is overrated. I don't have a CS degree, although I have an engineering degree. I took a few CS classes and have 40 credits toward a CS degree, but I doubt I will ever formally finish one. Real experience counts much more.
I think to be a good programmer you have to be smart, with probably above-average intelligence. You have to know how to solve problems that are sometimes complex in nature. Handling pressure is required as well, because when you don't know what you're doing and are trying to get something to work, it will be a pressure-filled situation.
So, programmers don't know what we are doing? Oh yeah, because many times you will be in the dark, coming up with a solution. Keeping with Jeff's military theme, Adapt, Improvise, Overcome.
Start with small, simple programs or chunks of program code, and then move on to more complex systems.
Also, one last thing: don't just write code; debug it as well. Test it, beat on it, performance test it, measure its complexity. This will help in the understanding of your code and others'.
I have found (in my admittedly limited field experience) that there is a distinct difference between university-taught programmers and "I just started doing it" programmers. That difference is knowledge of the basic underlying principles: the reasons why you would use a for loop in one instance and a while loop in another, how to solve the basic building-block problems, and how to deduce the more elegant solution from the vast array of possibilities. No "Learn <insert language> in 24 Hours" book will teach that sufficiently.
I have seen "real world" programmers who don't know, and really don't care because what they do "works". It's ugly, and brittle but it works. I argue that even though lives aren't lost (most of the time) the reputation of the software industry is tarnished by things that "just work".
Certainly you cannot learn to program just by studying the textbook, any more than you can in any other field of engineering. But that does not preclude learning the basics before working on larger problems.
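The for-vs-while distinction mentioned above is exactly the kind of basic principle in question, so here is a minimal sketch of the usual reasoning (Python chosen arbitrarily; the function names are made up for illustration):

```python
# A for loop fits when the number of iterations is known up front:
# here, exactly one pass per element of a fixed collection.
def total(prices):
    result = 0
    for p in prices:   # bounded iteration over a known sequence
        result += p
    return result

# A while loop fits when the stopping condition depends on computed
# state rather than a fixed count: we can't know the iteration count
# without doing the work.
def count_halvings(n):
    steps = 0
    while n > 1:       # unbounded in advance; depends on n's value
        n //= 2
        steps += 1
    return steps
```

For example, `total([1, 2, 3])` returns `6` after exactly three iterations, while `count_halvings(8)` returns `3` because the loop runs until the state (not a counter) says to stop. That "which construct expresses the intent" judgment is what formal grounding drills in.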
Interesting theory. College is way behind industry, so just forgo college and go straight to work. Sort of a catch-22, as most companies offering internships require that the applicant be an upper-level undergraduate. Also, this article seems like it was written by an undisciplined mind, in that it offers many superb and bright opinions but doesn't allude to any studies, evidence, or facts backing its claims. It's articles like these that put the "art" in computer "science." Basically, I'll believe the guy when he backs his claims with evidence.