January 12, 2008
Greg Wilson recently emailed me the following question:
I'm teaching a software engineering class to third-year students at the University of Toronto starting in January, and would like to include at least one hour on deployment --- [deployment] never came up in any of my classes, and it's glossed over pretty quickly in most software engineering textbooks, but I have learned the hard way that it's often as big a challenge as getting the application written in the first place.
Deployment is a huge hurdle. It's a challenge even for the best software development teams, and it's incredibly important: if users can't get past the install step, none of the code you've written matters! And yet, as Greg notes, existing software engineering textbooks give this crucial topic only cursory treatment. Along the same lines, a few weeks ago, a younger coworker noted to me in passing that he never learned anything about source control in any of his computer science classes. How could that be? Source control is the very bedrock of software engineering.
If we aren't teaching fundamental software engineering skills like deployment and source control in college today, we're teaching computer science the wrong way. What good is learning to write code in the abstract if you can't work on that code as a team in a controlled environment, and you can't deploy the resulting software? As so many computer science graduates belatedly figure out after landing their first real programming job, it isn't any good at all.
Today's computer science students should develop software under conditions as close as possible to the real world, or the best available approximation thereof. Every line of code should be written under source control at all times. This is not negotiable. When it's time to deploy the code, try deploying to a commercial shared web host, and discover everything that entails. If it's an executable, create a standalone installer package that users have to download and install, and give them some mechanism to file bug reports when they inevitably can't get it to work. Students should personally follow up on each bug filed against the software they've written.
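To make the installer half of that concrete, here's a minimal sketch using Python's distutils, just one packaging route among many; the project name, version, and file names are all hypothetical:

    # setup.py -- minimal sketch of packaging a student project for
    # distribution using Python's distutils (one option among many).
    # The project name, version, and file names are hypothetical.
    from distutils.core import setup

    setup(
        name="gradebook",
        version="0.1",
        description="Student project: a simple gradebook",
        py_modules=["gradebook"],    # the module being shipped
        scripts=["bin/gradebook"],   # a launcher placed on the user's PATH
    )

Running python setup.py sdist produces a source archive, and python setup.py bdist_wininst a clickable Windows installer. Even a toy like this forces the questions deployment always raises: what exactly ships, where it lands on the user's machine, and how version 0.2 will replace it.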
Will this be painful? Boy, oh boy, will it ever. It'll be excruciating. Students will hate it. They'll begin to question why anyone in their right mind would want to write software.
Welcome to the real world.
After I wrote my response to Greg, Joel Spolsky posted an entry on computer science education that, at least to my eye, seemed hauntingly similar to the advice I offered:
I think the solution would be to create a programming-intensive BFA in Software Development -- a Juilliard for programmers. Such a program would consist of a practical studio requirement developing significant works of software on teams with very experienced teachers, with a sprinkling of liberal arts classes for balance.
When I said BFA, Bachelor of Fine Arts, I meant it: software development is an art, and the existing Computer Science education, where you're expected to learn a few things about NP-completeness and Quicksort, is singularly inadequate to training students how to develop software.
Imagine instead an undergraduate curriculum that consists of 1/3 liberal arts, and 2/3 software development work. The teachers are experienced software developers from industry. The studio operates like a software company. You might be able to major in Game Development and work on a significant game title, for example, and that's how you spend most of your time, just like a film student spends a lot of time actually making films and the dance students spend most of their time dancing.
This is not to say that computer science programs should neglect theory. Fundamental concepts such as algorithms and data structures are still important. My algorithms class was my favorite and by far the most useful class I took for my own computer science degree. But teaching these things while neglecting more prosaic real-world software engineering skills -- skills that you'll desperately need as a practicing software developer -- is a colossal mistake. It's what Steve Yegge was alluding to in his fantastical Wizard School essay... I think.
There is the concern that all those highfalutin' computer science degrees could degenerate into little more than vocational school programs, something Joel mentioned in his excellent Yale address:
At Ivy League institutions, everything is Unix, functional programming, and theoretical stuff about state machines. As you move down the chain to less and less selective schools Java starts to appear. Move even lower and you literally start to see classes in topics like Microsoft Visual Studio 2005 101, three credits. By the time you get to the 2 year institutions, you see the same kind of SQL-Server-in-21-days "certification" courses you see advertised on the weekends on cable TV. Isn't it time to start your career in (different voice) Java Enterprise Beans!
You can have it both ways, and that's why I'm so gung-ho about internships. College CS classes tend to be so dry and academic that you must spend your summers working in industry; otherwise you won't have the crucial software engineering skills you'll need to survive once you graduate. Unimportant little things like, say, source control, deployment, and learning to deal with users. I constantly harp on internships whenever I meet college students pursuing a computer science degree. It's for your own good.
It does strike me as a bit unfair to force students to rely on internships to complete their education in computer science. Or, perhaps, on something even worse: "Want to learn computer science? No college necessary! Just download some ISO images and found your own social networking startup!" Unleashing the naked greed of the TechCrunch crowd on tender young programming minds seems downright cruel.
So how should we teach computer science? The more cynical among us might say you can't. I think that's a cop-out. If students want to prepare themselves for a career in software development, they need to go beyond the theory and spend a significant portion of their time creating software, with all the warty, prickly, unglamorous bits included. Half of software engineering is pain mitigation. If you aren't cursing your web hosting provider every week, fighting with your source control system every day, deciphering angry bug reports from your users every hour -- you aren't being taught computer science.
Posted by Jeff Atwood
Algorithms and data structures, the very core of CS, power all of today's top-notch software products and services. Without them, we would have no search engines that crawl billions of documents, no databases, no sophisticated network routing protocols, no iPods, no video games, no distributed computing, etc. Basically, all that's left is the logic that powers today's crappy applications that your boss wants to outsource.
Without a basic 4-year education in CS, you're left with a college graduate who really has no hope of ever producing anything of merit, unless they're the rare exception who is self-driven and has at least studied the fundamentals on his or her own.
What's left is software engineering, as you mention, which is quite important, but not to the detriment of a proper CS education. As a variety of other posters have mentioned, the university isn't really a good place to learn this. A university will never replicate the politics, the stress, the budgets, the variety of coworkers (all with different life goals), the customers, etc. -- all of which, in my humble opinion, form the majority of the problems and issues you'll face as a software engineer (see your blog a few entries back). Any attempt at a simulation will be just that. I can land a 747 in Microsoft's Flight Simulator, but do you trust me to do the same with your life at stake?
As for technical topics like source control, buy your employees a good book (in this case: http://www.scmpatterns.com/book/), have them read it, discuss the concepts as a team, and try to relate the material to your current projects. The command line switches/etc. can be learned with a man page.
I really hope you post a follow up blog entry.
Well, I agree that there should be courses that teach source control, but it is not that hard to understand and learn by yourself.
I think it's more critical to teach the programmer what the user thinks and wants.
You're making the common mistake here. Computer science is not software engineering. Believe it or not, some people do want to just learn the abstract, and there is reason to learn that and that's what computer science is. If you want to learn development, take up software engineering.
I personally believe that the best programmers understand things from the bottom up, starting with logic gates, microprocessors, and assembly language, and then moving to natively compiled high-level languages like C++ or D. If you have written a native Windows application in C or C++, programming a .NET Windows app is a walk in the park, and you know what's going on under the hood if you need P/Invoke to work around the framework's limitations. After a few years of learning how everything works under the hood, I think the focus should shift to learning real-world application development, from the requirements gathering and usability stages through coding, deployment, and documentation. The high-level languages are great, but a computer science program needs to start with the basics. It should most definitely end with real-world application development, including internships or mentoring programs. I think companies would love to bring in students at a low hourly rate for mentoring, because finding loyal, experienced help is really hard these days.
If there is one thing I wish they had taught me more about, it is human interaction. To get a feeling for what I mean, there is an excellent website: http://www.managinghumans.com/ .
I've done so many technically correct projects that still failed, due to a lack of attention to the human aspects.
Also, interaction is often not learned when taught ex cathedra.
My 2 eurocents.
I'm on a computer games course, and I've found that nobody knows anything about versioning or repositories; there seems to be no intention of teaching it, either.
Imagine the look of horror on my group's faces when I mentioned the word 'Subversion'.
The ANU's Bachelor of Software Engineering and Bachelor of Information Technology programs do this to a limited extent, but not for all assignments. In second year, there's a project worked on in pairs which is submitted through SVN. Third year (and also fourth year for BSEng, who do the projects a second time but manage a team instead of just working in one) include a year-long team project which is all managed with SVN. The third/fourth year projects are sourced from groups in industry, which means that the final product had better be deployable if it's to be useful.
College CS classes tend to be so dry and academic that you must spend your summers working in industry
Today, with the advent of bachelor's/master's degrees and fees (in Germany), most people are earning money over their summers. It's somewhat different at Fachhochschulen (more practice, more specialization). Furthermore, you have to consider something important: a college (in the USA) is not the same as a university. Some colleges are quality-wise similar to a university, but some are closer in level to a German Gymnasium! At the moment I'm approaching a second 'career', or rather a second qualification (CS with a B.Sc.), so I'm confronted with these differences between the 'traditional' way of studying (Diplom, Magister) and the 'new' way (Bachelor, Master). You don't have the time anymore for such things (but you have to do it somehow).
This is not to say that computer science programs should neglect theory.
Apart from practical considerations like above, there are certain things which have to change. Computer science needs more bandwidth, more focus on 'reality' without losing its foundations.
Getting a literature degree doesn't prepare you for a job in publishing. A chemistry degree doesn't prepare you for a job at Glaxo. So why should a CS degree prepare you for a job in the software industry?
You're confusing academia, which deals solely with the thirst for knowledge, with the vocational.
I think the major problem is that there are very rarely vocational training courses available for programmers starting a new job. You can bet that the chemists at Glaxo, particularly those taken on as graduates, are sent on extensive training courses on how to apply their knowledge and work with the company's equipment. The programmers are probably shoved in the basement and told to get on with it. They're expected to pick up SVN as they go.
I agree with Sauron?! here. Software Engineering and Computer Science are two very different subjects. I'm in the fortunate position of learning both, and I've been in the industry for two years, so these concepts aren't new to me.
But even so, while the Software Engineering course has us doing group projects, we opted to use SourceSafe ourselves; we weren't taught it. Deployment was also not taught, or got only a very brief glance.
University courses seem to lean towards academic pursuits, not industry. If I wanted to learn programming, I would take a tech course.
Yeah, don't call it computer science if what you're really teaching is computer engineering. Us computer scientists get all uppity about that...
Seriously, I've taken probably 3/4 of the _theoretical_ computer science classes that Caltech has to offer. One of the missing classes in that 1/4 was algorithms, which I dropped a few weeks in, because it was just too alien to my thinking to actually be writing pseudocode as opposed to proving things about complexity classes and computability. Hmm, maybe I'm more of a "computer mathematician," but still, my point stands.
(Wait, did I have a point? Now I'm not sure.)
I'm a computer science student, attending a local liberal arts college that just happens to have a decent CS department. While the department is pretty standard, they do have an absolute requirement that you intern at a company for at least a year before graduation.
In my case this presents some problems, since I already have a rather lucrative job in contract web development. I can program circles around most of the people in the course, but I'm still going to be forced to take time out of my well-paid and enjoyable contract work to work at an "official company".
However annoying this may be, the college does do one thing right: they have one course per semester devoted to "special topics"--and that can be any serious topic the students vote for. This semester the course topic is C#/ASP.NET. Last semester, it was PHP. Every semester the course will be updated and a new topic picked, keeping it relevant and fresh.
What properties should a single line of code have, and what do those properties require of the programmer? At a minimum (a short sketch follows the list):
* secure - the programmer should know about security, best practices, their environment, and other factors;
* efficient - the programmer shouldn't write an O(n^6) algorithm if they can help it;
* readable - the programmer writing the code won't be the only one to ever read it;
* tested - the line has to be right, right?;
* documented - some record should exist of who wrote the line and why;
* commented - not the same as 'documented', since the 'why' here is different; and
* styled - the programmer should follow best practices, use appropriate libraries, and follow convention.
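To make the list concrete, here's a minimal sketch in Python; the function and its billing-report context are invented purely for illustration:

    def median(values):
        """Return the median of a non-empty sequence of numbers.

        Written for a (hypothetical) billing report; callers guarantee
        non-empty input -- the docstring records the who and the why.

        >>> median([3, 1, 2])   # tested: the line has to be right
        2
        """
        ordered = sorted(values)      # readable; O(n log n) is efficient enough here
        mid = len(ordered) // 2
        if len(ordered) % 2:          # odd count: take the single middle element
            return ordered[mid]
        return (ordered[mid - 1] + ordered[mid]) / 2.0  # even: average the two middles

    if __name__ == "__main__":
        import doctest
        doctest.testmod()             # run the embedded test above

A dozen lines, and they already carry documentation, a test, comments, and an efficiency judgment.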
From this perspective it takes both disciplines and lots of education to write a single line of correct, functioning code. But that's only part of the problem, since after writing that line you have to worry about people, process, requirements, design, and constraints, as well as tools, languages, and environments.
I guess my core point is this: there's a lot (~10 years worth of education) you have to know to write a single line of code, and there's even more (taking the rest of your life) to know about dealing with the people that pay you for that line.
In the end arguing over Computer Science and Software Engineering seems silly to me, especially once you add in the people and managements skills you need. It's like taking a cup and chopping it in half, and then arguing over which half is more important. You need both to get a drink!
Being in charge of a software engineering course, I have been wondering about the same question. In my experience, the key element is the "boredom factor" of the course.
I discussed this a while ago.
For example, it takes roughly 30 minutes to get started with Revision Control, but it takes far more effort to learn all the good practices that come along with it. The job of the teacher is to open the path; students, if interested, will make their own way.
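For instance, the mechanics really do fit in a handful of commands. A minimal sketch, assuming the Subversion command-line client is installed; the repository URL and file names are made up:

    # Scripting the basic Subversion workflow (a sketch; URL and paths hypothetical).
    import subprocess

    REPO = "http://svn.example.edu/class/assignment1/trunk"

    def svn(*args):
        """Run one svn subcommand, raising an error if it fails."""
        subprocess.check_call(["svn"] + list(args))

    svn("checkout", REPO, "assignment1")  # get a working copy
    svn("add", "assignment1/solver.py")   # put new work under version control
    svn("commit", "assignment1",          # record the change, and why
        "-m", "Add brute-force solver as a baseline")
    svn("update", "assignment1")          # pick up teammates' changes

The commands are the 30-minute part; knowing when to branch, what belongs in a single commit, and what makes a good log message is what takes real practice.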
My 2cts on the question,
Universities are good at teaching abstract concepts and at evaluating whether those concepts have been learned. They are not so great at teaching the practical aspects of software engineering. However, everybody seems to observe this and think "Oh my god, universities are useless! We must make them better at teaching people exactly what it is like to work in industry." I contend, however, that this is a waste of time.
Practice can be learned on the job, very effectively. Theory, however, is much harder to learn without lectures, workshops and assessments, which universities are well suited to provide. If someone doesn't get a strong understanding of the theory at university, they may never get it at all. If they don't learn the practice, but at least have the theory, they will certainly learn on the job. I believe that university courses should focus on what they can do well, the theory part, and not the day to day experience of being a software engineer, for which they are particularly *un*-suited.
Source control is an example of something that, while indisputably critical to the practice of software engineering, is *hard* to teach at university, but *easy* to learn on the job. Hard to teach because you really need to see it working on a sizable project with many members and multiple branches and merges going on. Easy to learn on the job because there are no particularly difficult concepts involved, and no particular creativity is required in the process.
What I find much more important to teach is how to learn and figure out new things. Tell the students: "To solve today's homework, you'll have to find out how to do xxx. It's not in the books or the lecture notes for this lesson. Good luck." Because that's how it works in real life. Better to learn how to Google and how to use your results.
Easy to learn [source control] on the job because there are no particularly difficult concepts involved, and no particular creativity is required in the process.
This has not been my experience when working with clients. Most programmers never fully grasp source control beyond the most absolute basic of "I've got a lock" concepts. I also think source control is a much deeper and far more complex subject than you allow.
I also dispute the idea that you can't "work on a sizable project with many members and multiple branches and merges" in a university setting. Perhaps they could try having students contribute to an existing open source project of some kind, even in a small way?
I recently developed a very small app for a friend of mine in visual C++. The app took me about 3 hours to create, debug, and test. It took me about 2 weeks to deploy.
I realised then that I'd spent 3.5 years at university (2 separate courses) and never learnt how to deploy an application properly. What's worse, there seems to be no great literature or step-by-step instructions on deployment! Why .NET doesn't have a simple 'deploy' button is beyond me.
University is so behind the times. Recently, in an Internet Client-Side Computing unit, we were taught that there are over 1,000,000 web pages. NO WAY!! THAT MANY?!
However, I'm looking forward to this year: teams of 4-6 are to develop an application over the whole year for actual clients. Industry grade. All documentation, user guides, manuals, charts, etc. Let's just hope they require source control and really teach us deployment while they're at it! Oh, and finally... a Design Patterns subject! About time.
I agree with many of the comments here, in that the science of computing has nothing to do with competently writing software.
What the bejesus does "web hosting ... source control system ... angry bug reports from your users every hour" have to do with science? That all sounds like some kind of practical software writing to me. Having a firm grasp of the science of computation would certainly help to develop your algorithms, but source control, bug tracking, and all that gubbins are not part of the science. If that's the way the student wants to go, they should take software engineering instead.
To paraphrase Dijkstra, calling it computer science is like calling surgery knife science, and gives completely the wrong idea. Perhaps it should have been called "Computational Mathematics"; take the emphasis away from this tool called a computer.
Computer science and software engineering are (or should be) as different as physics and civil engineering. Software engineers need to learn some computer science in the same way civil engineers need to learn some physics, but they aren't the same thing at all. We need some computer scientists in academia to research new languages and technologies, but people wanting to be commercial software developers should study software engineering. I believe it has been a huge mistake to think that a degree in computer science is a good grounding for commercial software development.
My university BSc(Hons) Computing class is exactly what you describe.
You learn about how (and why) source control is vital
You learn how to deploy software for different environments (from user installs to network-based installations)
You learn three application development languages, three scripting languages.
You also have more science-y modules, where you learn about Software Execution Models, Compiler Theory, and the obligatory Algorithm Design, but all in all it focuses, as much as is sensible, on application development (web and desktop).
I've no real opinion on which is the better approach for new students to take, but I think that's because, for a successful project, you need all the different skill sets.
If for example, you were developing a client/server application to handle some kind of user input, it might be sensible to have a Software Engineer to develop the client-side UI code, and the core of the server, but you still need someone to come in and say "This can be made 20% faster if we do X", or to design the communications protocols and be able to actually methodically analyse the workings of the program.
I'm finishing my 4th year overall (my 1st year of a Master's degree in Portugal, now with the Bologna naming), and yes, there were no classes on subjects and tools like CVS and deployment. But I guess that depends on the master's you choose; there are people going into AI (myself into networking and systems), information systems, and software engineering. I really don't know the specs for the classes given in software engineering, so I can't really say whether those subjects are discussed. Although I think it's important, I don't think it's strictly necessary, mostly because, as many have mentioned before, the university environment is academic. For instance, I will now go on an internship for my last year, where I will get the real-world treatment and assignments that (supposedly) will complement what I learned at university.
Also, in my 3rd year at a previous university, I was involved in a 9-person project with full CVS control and deployment of a Java-based web services system. Let me tell you, it was not pretty, not only because we were rushed into the subject with no previous preparation, but also because the teachers didn't pay much attention to the subject in classes. We were also in no way prepared to manage a 9-man team :)
Well, i agree that there should be courses that teach source control, but it is not that hard to understand and learn it by yourself.
Not really my point; source control should be seamlessly integrated into all the code you write in CS classes. It's not necessarily a class you take; it should be more like the air you breathe.
[Source control is easy to learn and not a creative process]
This has not been my experience when working with clients. Most programmers never fully grasp source control beyond the most absolute basic of "I've got a lock" concepts. I also think source control is a much deeper and far more complex subject than you allow.
I may not have sufficient experience on the subject. I had limited exposure to source control at university, but where I work now we use Perforce (and mostly cvs for our own projects, with little instruction), and it didn't seem particularly hard to get up to speed on the basics of it, including making branches and integrating between them. I would have thought that a company with a good source control system will not have difficulty teaching graduates to use it, and a company with a bad or non-existent source control system isn't likely to be the sort of company that will care whether graduates know about source control. Of course, if graduates are forming their own start-ups, then they will definitely benefit from learning a bit about source control at uni. But surely such individuals would have already found out about such things on their own initiative?
I also dispute the idea that you can't "work on a sizable project with many members and multiple branches and merges" in a university setting. Perhaps they could try having students contribute to an existing open source project of some kind, even in a small way?
I don't claim that it's impossible, but I believe that it is not easy to oversee, and that it's not representative of source control in a corporate setting. For one thing, many open source projects don't give source control access to just anyone who comes along; contributors must instead construct patches and send them to committers. And even if they do, how does the university evaluate its students' contributions to the project? Students already complain of unfairness in team projects when all the team members are students on the same course! Again, it's not impossible, and I would certainly be interested to hear stories of how it works in practice, but I still feel that overall it's a lot more effort than it would be for the graduates to learn the same things on the job.
I've scanned the comments, so if I repeat anything, sorry about that.
I'm going to draw on my experience as a recent graduate of NC State University with a Comp. Sci. degree. While NC State isn't a perfect school, it does come close to what you're looking for, Jeff.
We had a range of classes that covered all sorts of topics, and we were certainly required to learn our share of theory. But there were two courses required for all CS majors, Software Engineering and Senior Design, which emphasized software engineering over theory.
In Software Engineering (the class), the focus was a group project, and each group was required to use SVN and to write JUnit tests (NC State is primarily a Java school, but I also wrote in C, C++, C#, and assembly while I was there. And Prolog - how I hate Prolog.). It was also the class in which we were introduced to design patterns. Now, I admit that I didn't internalize much of the material at the time, as about the only design pattern I came out of the class recognizing was MVC, but it was an introduction to it all.
Senior Design was the capstone for it all. Small groups of students were formed, and we were assigned to complete projects for companies such as Intel, Nortel, Northrop Grumman, Microsoft, and Red Hat (it helps when their headquarters are located 100 feet from the new Comp Sci building). In Senior Design we were given the freedom to complete the project in almost any way we saw fit, but the expectation was that we used what we had learned while satisfying the requirements given to us by the company that had provided the project.
NC State also seriously stressed internships and co-ops.
Looking back, I think that because of the emphasis on building software, I did lose out on some theory. Once I entered the workforce I also realized there was a lot I didn't know - but my school had at least given me an introduction to it all, with the exception of deployment. I don't think that was ever really covered.
I even had a professor who encouraged us to read Code Complete. He gave it away as a prize to students who did the best on the coding assignments in his class. Of course, I didn't manage to follow his advice and pick up the book until after I was out of school.
Now, is NC State's model the one model that should be followed? I don't necessarily think so. I think having a diverse array of models that produces Comp. Sci. graduates with a range of skills is the best. But as a student, it is nice to have an option of what kind of academic model you would like to follow and to know which schools follow which model.
So there's my experience, for what it might be worth.
How long does it take to learn to use source control? Thirty minutes? Sixty? So what if they don't cover it in school. I suspect most schools devote part of a lecture to it somewhere along the line. I got fifteen minutes on the basics of vi back in the early nineties. I can't really say I needed an entire course on editing. I work with a number of software engineering majors, and for the most part I find the computer science majors to be more useful. For whatever reason the CS majors have more programming experience. Regardless of major, the people who don't read books and try new technologies are useless before long. So hand new hires a book on source control and see if they read it. If not, let 'em go.
I'm a current third-year CS student at Florida State. I've been programming since I got my first computer, and I've always been a diligent student of the subject instead of just throwing things together, so you could say I know more than some of my peers. Either way, though, the amount of information they give us in classes is ludicrously small, to say the least. I'm obviously not done yet, but if the trend in my classes continues, I'm going to graduate knowing absolutely nothing more than what I started off with.
Universities aren't about teaching how to apply tools; they're about theory. If tools are required to teach theory, all the better, but the tool isn't taught, the theory is. If you want to learn how to use a tool, e.g. a programming language, .NET, VS.NET, or SVN, just buy a book or read the manual. If you want to learn the theory behind concurrent usage of text files in a versioning system, attend the course at the university, where they MIGHT use SVN to teach the theory, but perhaps also their own in-house system.
Above, there's a comment from someone who claims universities are so far behind. I couldn't disagree more. Perhaps that person has looked at a bad university, but if I check what, for example, Dutch universities are doing in the field of CS, it's awesome stuff, and when I then look at the 'real world', it's the real world that's lagging, which is how it should be.
I always find it funny when some student whines about the languages taught at a university: "Why do we have to learn Prolog, or Miranda? No one uses these languages in projects!" It's not the language that's being taught; it's the theory. The language is just used to get the material across.
What's wrong with how the ACM thinks CS should be taught? They lay out a very nice curriculum, with the understanding that it is a "science" and should be taught as such. I chose my university not only because of the price, but because it follows the ACM's guidelines to a "T". I like the idea of a governing body such as the ACM providing a model for universities to follow, much like accreditation. If the school you were attending wasn't accredited, would you stay?
Experience will be a contributing factor when CS graduates look for work; that isn't much different from other fields if you want the job you've been dreaming of. Pretty much old news.
Let's remember the reason for higher education: to instill in its students the tools they need for /learning/ for the rest of their lives, not to teach two-year-old tools that will be deprecated or gone three years from now. Higher education is more a process of learning how to learn. By the same token, learning SVN on the job shouldn't be that difficult, considering it takes no more than a few hours to learn.
I agree with the first comment - there is a distinct difference between computer science and software engineering though there is a good degree of overlap. The former is concerned with the advancement of theory, the latter with its application.
Regarding source control, Jeff is right. Many developers never get beyond the "I have a lock" phase. Some never have to. Most do, or rather should, but it's amazing how new the concepts of merging and branching are to many developers, and how many code shops' lives could be made easier if they got to grips with source control (and configuration management as a whole).
Well, it's just silly to think that a computer science program should be all theory and no real-world, practical training. Perhaps some students take psych courses for the theory - the abstract concepts - but I doubt very seriously that anyone really takes a course in the computer science roster out of pure academic curiosity. There has to be a certain degree of vocational-type training involved.
Back on the deployment tangent, students must also be made aware that software deployment doesn't occur via physical media or on a single PC at a time anymore. If your app can't be pushed out via Group Policy, SMS, or distributed via some other desktop management tool across my LAN/WAN to 1000 users in one fell swoop, I'll be looking for an alternative app.
Studying CS is like studying architecture. Even if you finish, it doesn't mean you can build a house on your own. For one thing, nobody taught you how to use a hammer.
I'm gonna go with the first commenter. Comp sci isn't software engineering, any more than mechanics is bridge building.
Anyway, I studied a five-year master's in comp. sci. and comp. eng. (what we in Sweden call Civilingenjör, Datateknik), which is basically a mix between EE and CS.
The way they've chosen to teach "real life" is pretty decent, I think. We have one course covering the theory of software engineering (processes and project models, SCM, etc.) and a few project courses testing this, where we're put in groups of six to ten people, given a project at a company, a methodology to follow (agile), and a problem we have to develop software to solve. Source control, structured testing, etc., are mandatory. The projects usually last five weeks, although those five weeks are spread over a semester.
It's a good model that I think more schools should try. Two or three projects really grow your strengths in software engineering.
I actually consider that I had a pretty fantastic education. I attended East Tennessee State University, and was actually advised by a gentleman who worked at Oak Ridge National Lab. He told me that if I wanted a very practical education with lots of hands on experience that was the best place in the area.
The first two years are what I would call the theory years with heavy emphasis on data structures, file processing, computer organization and math. In fact I really think the first three semesters were designed to make you quit. If after staying up all night implementing a sorting algorithm in assembly, or a B+ tree in C++, you still have the warm fuzzies about being a programmer then you are probably born to do this stuff.
However, we were in for a real shock. After all, the aforementioned stuff is candy to CS geeks like ourselves. Year three rang in the dreaded "Software Engineering I" and "Software Engineering II": a year-long adventure in writing software, start to finish, for a real client (usually), and all the resulting headaches thereafter. The course was a laughable 3 credit hours per semester, but it took up at least 20 hours per week of my time. Team dynamics, configuration management problems, and issues with learning new technologies (PHP/MySQL -- we learned with C++ and Oracle) made us all hate the project and each other for a while. I seriously considered my choice of profession during the final month of the project, when a handful of people programmed day and night to get the project out by the deadline, all the while trying to keep up with the "theory" that the class material was based on.
We all wanted to slit our wrists, but in the end we learned valuable lessons that would stick with us in our professional careers.
Things that I never learned, or even heard of, in school:
That's a pretty big gap of need-to-know-stuff that I was missing.
I even realized the need for good collaborative tools when working on a 4-person senior design project and... set up a wiki. Because that's all I knew about. In the end the project was a complete failure, mainly because we didn't know about any of the things in the above list, and neither did our advisor.
DigiPen works pretty much exactly like that. You get your SVN by your second class.
(Also from Sweden) How old is the oldest programmer at your office? I'm 40 now and still coding (I have an MSc in CS). I can still work as a programmer because I have my own company and work as a consultant. But most commonly, I think, people only work 5-8 years coding after their studies and then become some kind of manager. And this is because most programmers treat it as just a job. I think you put your finger on that once: http://www.codinghorror.com/blog/archives/001002.html
So what is left for us who love to program? How many people do you know who have written a whole program by themselves? And can we be sure the teachers really know what they should teach? I'm a bit cynical, and I always seem to get that quote in my head about teachers being those "who can't cut it; if they could, they'd be doing it."
Having just completed an engineering/computer science degree, I suppose I can't complain.
Most classes were split into theory and labs. Algorithm theory would explain how red-black trees work; the lab would let us implement them in C++ in the editor of our choice. GUI design would explain how to create a good GUI (yeah...); the lab let us implement one in Visual Studio with VB.NET/C#. OO was in Java/NetBeans, networking in C/TextPad, Windows scripting in VBScript (the horror), Unix in Perl, etc.
And there were some combined projects with multiple people, which would use CVS and combine classes: a website with a database, or GUIs with computer graphics theory. And of course it ended with a "now install this website on the Apache server".
So yes, I did get all the basics like algorithms, OS, networks, databases, compilers, hardware, computer graphics, OO, security... but I also got to see multiple languages (Java, VB.NET, C, C++, C#, Perl, VBScript, bash script...) and some environments to use them in.
An interesting note that's getting repeated here a lot: CS is not SE. Nor is it any sort of practical degree. It's a science, based on algorithms and proofs.
While I agree that most CS degrees are too theoretical and should include more 'lab' time, and should cover things like robust development, that is not their focus.
Jeff, what you are asking for is a software engineering degree, a BSc or a BA; either way, it's not CS. Software engineering should cover debugging, testing, deployment, proper development practices, etc. It should cross over with CS in data structures and such, and avoid getting bogged down in the more theoretical aspects. But don't dilute a CS degree. They're just not the same.
It's the same way that nowadays many universities offer both mathematics degrees and applied math degrees, where the latter focus on real-world aspects.
Confusing the two (CS and SE) will just bring more problems to the industry.
I remember looking at course outlines for my CS degree and noting that much of what is under "Achievements" is generic skills such as analysis skills and problem solving skills.
DON'T DISCOUNT THESE!
I was taught in Uni that the human brain can fit at most 7 things in it at once. Practicing trying to fit more than this in your mind at once is a skill that can be used across all of the IT industry. Across many other industries as well I imagine.
Given the time constraints of a 3 year CS degree I think the intense focus on generic skills is justifiable. Claiming that students are ready for the workforce is NOT. They are as ready as they can be in 3 years.
NO! NO!
Yes, get the students to develop as many non-trivial projects as possible, but NEVER EVER DICTATE.
A fundamental part of learning is DOING.
ALL my problems with Computer Science at university came down to this: I HAD THE EXPERIENCE - THEY MADE ME DO IT WRONG!
Another thing that they should teach in computer science classes is how to stand up to the client when they ask for a system that is a coding horror.
No pun intended.
Echoing the rest - computer science is not software engineering, and the constant conflation of the two makes this post awkward reading.
I've been told it was Dijkstra who said (although wikiquote has it as unsourced) that "Computer Science is no more about computers than astronomy is about telescopes." My personal take is that 'computer science' is something we're stuck with for historical reasons: a terrible name for a subject that is really a branch of mathematics.
By all means let us do as Joel suggests, and have universities teach **software development** (not 'engineering', note - engineers have to get things right first time). But let's not pretend that source control is anything to do with computer science.
I can really relate to this article. I don't think I had a single class on deployment or on source control. And I should have. I really needed it.
When I got my first programming job I had to learn how to best deal with version control and with creating decent user friendly install scripts. Not anything I had ever learned in school.
CompSci or SoftwareEngineering, plus variations on the theme.
Once upon a time, a long, long time ago, I worked for what is now a major player in the desktop computer manufacturing business. At that time computers were sold via independent dealers. The dealers had a problem: how to train a high school graduate to fix a computer such that the fix was effective, but the tech was unemployable anywhere else, hence keeping his wages low.
The BigComputerCompany hired a PhD in education to develop such a curriculum. And he did, and he and the company were extremely proud of the results. That, it would appear, became the model for tech support for every company that has to service and support a computer.
This same model appears to be getting into educational philosophy driven by industry everywhere. The local community college teaches a "network technician" course, for example. I talked to an instructor on the course. He told me that industry, via the state's "Employment Development" department, spec'd out the program. It is exactly what was in place at BigComputerCompany: teach 'em exactly how to plug in an RJ45; how to put a switch in a rack. Nothing more. Whatever you do, don't teach 'em to think.
It saddens me.
I am not sure about the premise of the article. Particularly today, I think most of the things mentioned (deployment, bug tracking, source control) can easily be learned outside pure academia, for example by working on open source projects. Source control systems are easy, I think: anybody who can follow theoretical courses on computer science should be able to understand the concepts behind cvs/svn/mercurial in an hour. The practice is then... well, a question of practice.
Actually, using open source as an excuse to teach the concepts of source control and co could be the way to go, IMHO. That's where I learned all this stuff myself.
I switched to a physics course halfway through my third year of a CS degree. I was doing well in the abstract classes, but the heavy reliance on the Visual Studio/Eclipse IDE environments early in the CS program made it harder to handle the programming during the abrupt switch to the Unix environment and command line in year 3. One of the big deciding factors in switching majors was the Computer Scientist vs. Computer Programmer discussion I had with my advisor. I like solving problems, but frankly many computer problems aren't as interesting to me as other areas of mathematics or science. There is a nice bit of overlap, though, and I have had many chances to enhance my programming skills working on models. The best programming lessons I received came from trial and error and lots of reading on my own - books like Code Complete, g++ for Linux, sed & awk, etc. - which has led me to a position writing code for an aerospace consulting firm that I am enjoying very much. Programming is like an art or a sport: the more you practice, the better you get. Oops, I didn't mean to pontificate that much. I did have a cool internship during school building an installer in NSIS for an application; it was a great introduction to the real-world issues of software development and delivery.
I'm biased, because I moved straight into full-time software engineering at a big defense contractor at the age of 18, having only a couple of years of part-time community college. I've been profoundly underwhelmed since then with the preparation for real-world coding that college gives to new programmers. Even PhDs in CS often have no appreciation for the pragmatics involved in delivering software.
My opinion? You can't teach software engineering in a classroom, period. It should be taught the way crafts used to be taught, with apprenticeship.
Jeff, I agree with most of your commenters here. College is not vocational school. The purpose of college or university education is to prepare the student to teach himself on the job, to give him enough of a background in a subject to understand why we do things the way we do, so that he can (in time) suggest improvements that will better achieve our goals. A college graduate's value is in the ability to question the way we operate and devise better processes. That is the reason why 1/2 of a college course (at the B.A./B.S. level) is general education.
In nearly any job field, the only way to really learn the practical applications is on the job. Have you ever gone into a supermarket and seen a new employee shadowing an experienced one? While you may think he should have learned everything he needed in high school, it takes weeks on the job before that employee is fully productive. This is surely true in other fields as well. Why would anyone make the mistake of thinking that "software engineering" is any different?
Like many of the others, there's a huge difference between computer science and software engineering. However, I'll go one step further and say there's also a huge difference between software engineering and software development.
Think about the terms "Scientist", "Engineer", and "Developer".
In the physical world, scientists are the people who perform experiments, record the results, write papers, etc. They are the people at the leading edge of the field. They are the people who push the limits.
Now, think about an "Engineer". Think of a Civil Engineer, or an Electrical Engineer. These are people who take well-known concepts and *design* real-world things. I say design, because a Civil Engineer doesn't pick up a shovel or a welding torch, and an Electrical Engineer doesn't actually build the product. They just create the blueprints.
Now think of "Developer". In the real world, so-called "Software Engineers" are really "Developers", but they often have to wear the hats of "Engineer" and "Scientist". A Developer would be someone like a "Lab Technician" or "General Contractor". But we have a much more rigorous separation of job activities in the physical world than we do in the virtual one.
A Software Engineer, unlike his physical counterparts, also has to do the heavy lifting. It's often as if a whole bunch of architects were building a skyscraper, with each working on, and doing all the physical labor for, their own little piece.
How stupid is that?
It seems that I won't be the first to mention that university is *not* about preparation for the real world. That is outside the scope of what university does.
The goal of university is to teach people to *think*. It doesn't matter if you're doing an English BA or a Physics BSc: undergrad is about learning critical thinking skills, with a focus on something that interests the student.
As such, the practical aspects of being an in-the-trenches programmer are irrelevant to the goals of university. Of course, any good professor will mention that things like source control and unit testing are good, important things, but those are not things you need to learn in order to solve a problem.
Do I unit test? Hells, yes. Do I use source control? Hells, yes. Did I need a university course to tell me to? Hells, no.
Universities should teach C, they should teach algorithms, they should teach the concepts of object oriented techniques and they should teach about problem solving. Anything else is really the realm of community college programs and work-prep schools, which is where real-world programming is taught.
Now, the real question is should employers place such a high level of importance on a CompSci BSc over a computer studies diploma from DeVry? It's going to depend on the employer. The BSc grad will have been taught critical thinking skills, the community college grad will have been taught practical technique. It really will depend on the employer's needs.
I have to take some exception to Joel's statements about the quality of institutions. I started off at a 2-year school and had proper courses in C++ and other languages. After transferring to RIT, I found I already knew most of what they were trying to teach me...
That said, we were introduced to source control at RIT, and the co-ops (paid internships) helped round out the rest of the stuff I needed to know. I actually spent my first co-op creating an installer package for HP print drivers. (The Mac version, not the Windows version, which I've heard was horrible... so no blaming me!)
Why do you guys *always* use Computer Science and Software Engineering interchangeably? They're not the same thing. Computer Science is like Mathematics, while Software Engineering is like Mechanical Engineering. They're different, but you do have to teach engineers some science for them to understand and apply.
There is a separate argument to be had about why the top schools don't have (good) software engineering courses, or about how those software engineering courses are taught, or about why people expect Computer Science courses to produce good software engineers automatically.
Remember that there are lots of excellent software engineers who never studied computer science or software engineering, like Phil Haack who was a Math major (I think). I suspect it's more of a case of "Two types of programmers" than the exact content of an individual's education.
I feel that there is an extreme importance in CS students learning how to work in the real world. For example, an architect never sees a lesson called "Pencils 101", yet because the pencil (in a generic sense) is one of the architect's most important tools, he/she has to learn how to use it, and the university "teaches" that. Not through a course, but he/she does learn it, and professors make sure students learn the differences between a 2H pencil and an HB, and how that affects the results.
Now, pencils are not required to know and understand architecture; in fact, for the theory of architecture, pencils are completely irrelevant. But the world expects an architect to know how to use a pencil to deliver.
The same is true for CS students. The world at large expects us to produce something. It can be algorithms to solve problems, but it can also be programs to solve more complex problems. We are expected to be able to do both. And when that happens, you need to know how to use the tools required for the job. Just as college teaches you how to use Big-O notation (which is a tool) to analyze an algorithm, it should teach you the theory behind things like deployment or source control (and theory must be reinforced by practice), because it is something that will provide you with more ways to see, understand, and present solutions to a problem.
That said, the emphasis has to be on the theory behind these concepts (source control, deployment), because that's the only thing that will be useful in the long term (as opposed to a specific tool, which has a shorter lifetime).
But not teaching the theory and uses of these tools (and these tools are orders of magnitude more complex than the pencil is for architects) greatly limits a student's capabilities to innovate and provide new ideas in those areas, areas that fall within the scope of CS and that are required for any kind of work.
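To put a concrete edge on the Big-O point: the theory tells you what to expect from a tool before you ever run it. A tiny sketch, in Python purely for illustration:

    # Membership test: O(n) in a list vs. average O(1) in a set.
    import timeit

    n = 100000
    setup = "data_list = list(range(%d)); data_set = set(data_list)" % n

    # Look for the worst-case element 100 times in each structure.
    print(timeit.timeit("%d in data_list" % (n - 1), setup=setup, number=100))
    print(timeit.timeit("%d in data_set" % (n - 1), setup=setup, number=100))

The analysis predicts that the list lookup grows with n while the set lookup doesn't; running the script merely confirms what the theory already told you.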
I do some research work for a history professor who studies how academic disciplines have changed on account of booms and busts. One interesting thing is that as an academic discipline becomes really popular and commercial -- as computer science did during its boom time in the 1990s -- the focus shifts from intellectual issues to social and business issues. Textbooks start having things in them about working with others, not solving theoretical problems. I only did some preliminary work on CS for him (most of it was on physics, where the effect is very clear in the boom of the 1950s-1960s), but it seemed to follow the same sort of trends. Anyway, I think this is an interesting post to follow up on the Knuth post from yesterday; from what I have read of him, he's very much of the pre-boom generation in his writing, which to my knowledge doesn't contain a thing about working with others or getting jobs.
"Unleashing the naked greed of the TechCrunch crowd on tender young programming minds seems downright cruel"
Jeff, I know you didn't mean anything... :)
Even though I agree with your theme, I also agree with the first comment to this post.
Certainly we should come to common terms.
And: The width of the text is tooooooooo long.
Anything more than 550px is bad.
I do know some classes that have actually involved contributing to open source projects, or working in real-world (or close-to-real-world) environments, and everything I've heard about them has been positive.
I think open source offers a great educational opportunity, once somebody has reached the level where they are up to contributing.
The question here is not what do students want out of a degree program centered around computers. It is about what benefits the field as a whole.
People tend to mistake abstract math, logic, and other bits of computer science as useless in the "real world". I would argue that a Computer Science program (especially a Bachelor of Science) should stay heavily based on theory.
At the University of Minnesota, they have two computer science degrees, the Bachelor of Arts and the Bachelor of Science; the differences between the two are very minor (http://www.cs.umn.edu/xml_handbooks/handbook.php?page=ugrad&section=CS%20Requirements).
Now, the thing about an undergraduate degree, and a common misconception, is that a Bachelor's degree just teaches you how to "speak" the language of the discipline. One could get into a HUGE debate (that I wish to avoid) about the difference between discipline and philosophy, but my argument is that I would rather people learn the proper discipline of Computer Science than any philosophical aspects underneath it.
I personally would rather not de-value the worth of a Bachelor's degree by making it "business" specific. If you made the degree based on "programming concepts" then you are teaching a philosophy of a discipline that will become worthless when that language or concept dies out. Teach a person about how that language was created and maybe they can connect the differences and develop their own theories about it.
I guess that will always be the difference between academia and business. Business wants trained drones, and academia wants to solve the world's problems. The two don't mix. If you really want drones, go after the junior colleges.
Just my 2 cents.
Computer Science and Software Engineering are really different. However, what I still can't understand is why employers give more emphasis to Computer Science graduates, when what they actually require is people with practical programming skills.
I feel a change should come from the industry, not academia.
Got to agree with the difference between computer science and software engineering. I don't know about the situation in America, but in the UK it is harder to find a software engineering degree than a computer science degree. The distinction is often not even made at the top universities, or not explained very well in the literature -- and it's something you really need to be told about at that age.
I personally took a computer science degree, and luckily mine did have quite a lot of programming and team programming modules. I think an offer of extra modules teaching something like a Microsoft or Java certification would be a great advantage when looking for a job after you finish.
I completely agree. My Software Engineering course was great, and I did learn lots of useful stuff that served me well in practice, but I think deployment is an underdeveloped topic.
Even in business it's commonly overlooked as a serious activity.
But I doubt very seriously that anyone really takes a course in the computer science roster out of pure academic curiosity.
Quite simply, you are wrong.
I was accepted to several "higher ranked" programs for CS and ended up attending the University of Cincinnati because of their stellar Cooperative Education program. Five years, with the middle three spent on a rotation of six months of school, six months of work. I spent six months learning the theory and pursuing my own interests, then six months working and learning in a real development environment. During my first co-op, I was mostly expected to learn things like source control, deployment, et cetera, and to contribute on a theoretical level (brainstorming sessions and such). By the end I was basically able to contribute at every level of the project.
If you're looking at attending school for CS, don't be blinded by rankings and lists. Programs like this one will be more useful in the long run, in my opinion, because there's no class you can take that will give you work experience.
I think you're confusing computer science with software engineering. Many computer/software engineering programs teach concepts like source control, as mine did. They often include a fair bit of internship/co-op experience as a requisite for graduation. I had already worked in industry for six semesters at three different companies by the time I graduated. That is where I learned the majority of these concepts.
You teach computer science like you teach anything else in life: by teaching core concepts and having the student practice, make mistakes, and learn from their mistakes.
Internships are a great way to get real-world practice. When I was an undergrad I did a "co-op" program, which was an eight-month job with a computer consulting company. The college gave some credit; I spent eight months living in another city, working a "real" job, getting a salary, etc.
That was where I learned about ASP, databases, working in groups, working with clients, writing functional specs, and all the other stuff that happens in the real world, but sadly isn't covered very much in academic settings.
IMO, all schools should offer a co-op program and make it mandatory for an engineering degree.
Internships. This is what makes my alma mater, RIT, a great CS school IMHO.
Is there a good definition of what a CS or SE education should be? Most people build stuff with software, which takes at least some analysis.
It's too big of a world. Data structures, algorithms, building classes, scalability, memory management, I/O, languages, etc. Not to mention the types of things computers are used for, which can be so different.
I was a math major and did numerical analysis programming in FORTRAN and other stuff in Pascal. Went to a large corp. after college and learned databases, which I learned nothing about in college (no PCs then either).
I guess the real things I noticed I was missing after school were concepts about memory, binary-level knowledge, and at least some assembler, which is valuable since it makes you think about the computer in a careful way.
As someone who has already earned a bachelor's in Computer Science, looking back on my courses I can see why small details like source control were never covered. In upper division I was required to take only one Software Engineering class, which amounted to four months of discussion of core SE topics, including assigned team development projects. SE is a large field with many high-level concepts, including development models (especially RUP), use cases, design patterns, and others. There is only so much you can stuff into a one-semester SE class -- and keep in mind that even offering one SE class is quite gracious in itself, as a CS degree certainly does not have to cover that ground at all. So little details like source versioning, while important, can be learned easily through a variety of means outside the classroom.
In fact, I knew little of source versioning in a team environment until I graduated and got my first (and current) job as a software engineer. It took me all of about two or three days of using source control and applying diffs to learn all the basics. The heart of the issue, as I see it, is whether we want Computer Science to evolve into a more practical SE-like discipline or let the recent growth of SE degrees offered by universities across the world fill this requirement. Personally I opt for the latter. I believe CS should not even approach some kind of hybrid CS-SE degree unless the classic CS curriculum is also offered.
This is something that McConnell covered in Professional Software Development. He (and Steve Tockey) worked on a Master's in Software Engineering program at the University of Washington.
Drop the ubiquitous math minor requirement, which is only there as a problem-solving exercise and to weed out people with short attention spans anyway, and teach actual SOFTWARE ENGINEERING in that credit space. All the fun stuff like using source control, writing to a commercial API, interfacing with a (non-trivial) database, and so on -- all the things that the classes on matrix algebra and differential equations, or on assembly language and fork(), aren't doing squat to help you with.
"All the fun stuff like using source control, writing to a commercial API, interfacing with a (non-trivial) database, and so on."
That's what I consider the boring part of software development. I liked the linear (matrix) algebra in particular, because it led directly to an understanding of 3D rendering (perhaps my favorite field of CS).
The thing is, CS is such a huge field. Some people study it because they enjoy learning about algorithms, compilers, AI, graphics, etc... - stuff that ordinarily is useless in 99% of the industry.
Jeff, you bring up some interesting points.
I graduated with a BS in CS from Carnegie Mellon, and I was surprised to learn from my peers at Drexel University how different attitudes can be about what a Computer Science education entails. At CMU, one can complete one's four years without ever touching Visual Studio. In fact, if you twist it right, you can even avoid ever touching C++.
I spent a semester interning out in "the real world" for a small software company and came back humbled about how little I knew regarding practical software development. At one point, I asked my undergraduate advisor whether it was wise to send students out into the world without having used C++. His response was pretty simple: ideally, the CMU education teaches students enough to grasp C++ when they see it, but also enough to understand why C++ should go away. The danger of teaching "practical" skills (as opposed to high-level abstractions) is that you risk creating a generation of COBOL programmers who can't see the forest for the trees---and who are obsolete when the industry moves onto the next big thing.
It's also worth noting that a lot of people in Computer Science may never be planning to write installable software. CS (at CMU at least) teaches students to know Computer Science with an eye towards academia, not necessarily to go out into industry. There is a wide field of work to be done in algorithm design, cryptography, and various other fields that need never even consider an installer. It's important for students to realize that if they plan to go out into the working world they will need such skills, but it's less important to roll them into an already-crowded 4-year program.
I think we ignore the ins-and-outs of writing installers for the same reason that we teach introductory programming in Java instead of C++. While memory management is an important subject, it clouds the core issues being taught in introductory courses. Similarly, the topic of making an end-to-end software distribution solution isn't as important as writing the software in the first place because---especially in the academic world---people CAN use your software without an installer, but they CAN'T install software that was never written. A good CS education doesn't need to include an in-depth study of a problem-space as specific as "distributing software to a client's machine"---there are a lot of scenarios that completely obviate the need for such a skill.
I think Jeff has it right when he says it should be the air you breathe.
You often need to be *taught* theory; what you need from tools is *experience*. There is little point doing a course on source control (except for people who actually want to *write* one). It should be a quick practical session very close to the beginning of the course, where you are given your own personal repository (if you want it), plus teaching of the basics. After that, RTFM -- you need to learn how to do that, so get used to it. From that point on, all your practicals should use it (and enforce it: submitting them should simply mean providing the tag and the location of the build/make/run scripts).
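To make that concrete, here's a minimal sketch of the sort of workflow being described, using Subversion -- the course repository URL, the username, and the assignment names are all hypothetical, purely for illustration:

    # check out your personal working copy (hypothetical course repository)
    svn checkout https://svn.example.edu/students/jsmith/trunk practical3
    cd practical3

    # work as usual: add files, commit early and often
    svn add matrix.c Makefile
    svn commit -m "Practical 3: matrix multiply plus build script"

    # when finished, create the tag you hand in
    svn copy https://svn.example.edu/students/jsmith/trunk \
             https://svn.example.edu/students/jsmith/tags/practical3-final \
             -m "Tag practical 3 for submission"

The marker checks out the tag and runs the build/make/run scripts from a clean working copy; if it doesn't build that way, it doesn't count.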
It would be like them teaching you how to script... you should damn well know how to learn it yourself (but they should make sure you have access to the documentation).
My course was pretty theoretical and academic, but it did two things very well. A lecturer creates a mailing list every year to which everyone is subscribed; he is on the list too but rarely delurks. The list was immensely useful for collaborating on tough problems, areas which were unclear, and bugs in the practicals (they did some things badly :( )
Secondly, if you wanted to explore some aspect in more detail for your own enjoyment/experience, every single tutor/lecturer (bar one, who was rubbish) was more than happy to help you with it.
A university course should, above all else, teach you how to learn. If they're really good they should teach you how to teach (at some level since it requires far greater understanding of the subject to teach it than to use it)
That said, they never mentioned source control once at mine. That would have been useful, but I've never had a problem with it.
For the record, they used Java, C, Haskell, CSP, Prolog, and more. The only two they gave any serious tutoring on were Haskell and Java (as the bases for teaching functional programming and OOP). Everything else you were expected to learn yourself. I think Sun paid them a lot of money to use Java :)
Diego, thanks for including that link to my site.
I'm constantly amazed at what people in high tech don't know how to do. I always end up becoming the go-to guy for esoteric tech problems.
When you graduate you know nothing and usually can't write maintainable code -- I've seen stuff written by new graduates in a commercial setting where they were trying to get something out quickly, and it was plain awful. That didn't stop them being bright or good, but without a decent mentor you are lost. You've spent your time writing narrow code that is looked at once by a lecturer and thrown away -- sensible variable names, clear paths through the default case, using exceptions properly: all very hard to do until you've had to unpick someone else's mess and realised how bad your own code is.
I've been doing this stuff 20 years now and there's still lots of stuff I don't know, but at least I know I don't know it.
I also did a sandwich degree in the UK, where our third year was spent in industry and we went back to finish our degree off. Most of us, to be honest, considered the last year to be a waste of time. But we went back to get that magic door-opening piece of paper.
You need to *study* "Code Complete" and "Writing Solid Code", plus Jon Bentley's stuff, and you need to be humble enough to realise you know nothing. This is hard when the ink on that degree you worked so hard for is still wet...
"The high level languages are great, but a computer science program needs to start with the basics."
From a computer scientist perspective I would say start from the maths and work down.
"A university course should, above all else, teach you how to learn. If they're really good they should teach you how to teach (at some level since it requires far greater understanding of the subject to teach it than to use it)" - matt
Bah, I already know how to learn; I need the courses to train me in the stuff I need to know to get a job. Computer Engineering for practicality, Wikipedia for theory.
"If you aren't cursing your web hosting provider every week, fighting with your source control system every day, deciphering angry bug reports from your users every hour -- you aren't being taught computer science."
Or rather, you aren't being taught how it often is.
What about teaching about how it should be?
Tools that work?
A support team who can identify bugs?
Not really my point; source control should be seamlessly integrated into all the code you write in CS classes. It's not necessarily a class you take; it should be more like the air you breathe.
Source control isn't that necessary. Seriously, it's just a backup. Maybe you're an idiot not to use it, but that doesn't make it necessary. Ref: seatbelts.
Similarly, deployment is a platform issue. Tired of hard deployments? Blame the platform! Why doesn't the language/platform vendor make it easier on you? It's not (always) that difficult.
Request: I want a blog/wiki of software best practices that covers anything beyond Hello-World-complexity applications.
I'm with you on the sentiment, but I agree with others: computer science is not the same thing as software development. University isn't purely job training, either.
I had to laugh at the quote from Joel Spolsky that there should be Fine Arts degrees in software development. People love to say that what they do is "an art," but there is no way that software development is fine art. Are fields with similar challenges, such as auto repair or aeronautical engineering, fine art? Of course not. Nothing wrong with that.
I've taught four semesters of Software Engineering (working on the fifth right now) and I've stressed deployment (as well as source control) almost from the beginning.
The main reason? IT MAKES GRADING A HECK OF A LOT EASIER. If you don't specifically require a one-click installer as part of their grade, you can end up spending 15 minutes just getting a student project installed and configured -- this is before you even get around to evaluating the quality of the work.
I would say the hackery involved to get some student projects up and running would surprise you, but it wouldn't. We were all students once.
Lots of commenters are slamming others for not drawing the distinction between "computer science" and "software engineering." However, many, many colleges and universities do not offer separate courses of study - they lump it all into their CS track. So, in those situations especially, there must be some vocational training included with the abstract theory.
I think computer science and the programming profession need to be decoupled on the conceptual level. Computer Science should be understood to be an advanced, academic discipline, while workaday business programming should be understood to be a vocation. If you are a genius, you go to a university to study Computer Science. If you just want a good job, you go to community college to learn programming.
I'm just starting to learn Subversion after reading many programmer blogs and realizing it is an important job skill. But don't just whine that programmers don't know anything about source control -- write an easy-to-follow tutorial on how to set up and use Subversion on Windows for .NET projects. I found enough information to get me started on the DMB Consulting Blog.
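Until such a tutorial exists, here's a bare-bones sketch of the local-repository route -- all paths and project names below are hypothetical, and a real team setup would put the repository on a server instead:

    rem create a local repository and import an existing .NET solution
    svnadmin create C:\repos\MyApp
    svn import C:\projects\MyApp file:///C:/repos/MyApp/trunk -m "Initial import"

    rem check out a working copy and do day-to-day work from it
    svn checkout file:///C:/repos/MyApp/trunk C:\work\MyApp
    cd C:\work\MyApp
    svn status
    svn commit -m "Fix null reference on startup"
    svn update

You'd also typically tell Subversion to ignore the bin and obj build-output folders (see the svn:ignore property) so compiled artifacts stay out of the repository.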
At my college, CS and SE were different... They started similarly enough, with algorithms and programming basics and concepts, but diverged in the second year into the things you speak of (game development, source control, UI, life cycle, etcetera). CS students had to take only one SE class, Software Engineering.
When I graduated and started my job I didn't know much about source control (merging especially), or about autoconf and makefiles. This job has been an eye-opener, since it relates to scientific computing (clusters) and runs on Linux, Mac, and Windows. We have to do a lot of testing and SE work, but we also do a ton of CS, solving NP-hard problems and doing math and statistics! It's great fun to actually do CS (stressing theory) and, on the side, advanced SE (lots of environments, testing, and distribution).
Source control? Are you kidding me? What's next, learning how to use Visual Studio? Students should not "develop software under conditions as close as possible to the real world" when they study computer science. They should do that in their spare time. Studying computer science means learning the hard stuff. Write compilers, write an OS, learn about concepts and ideas. Design patterns, memory management, chip design, object orientation (in an object-oriented language like Eiffel, not a crutch like Java), designing software. That's what students should learn. Not how to check in code into SVN. That'll take a day or two on the job, or when they do an internship.
I have to disagree with Joel Spolsky. Software development is not an art. It is an engineering science.
The worst programs I've seen were from people who considered themselves artists.
Since I happen to have a BFA, and I started life as a BS major, I understand Jeff's stance completely, and couldn't agree more. It's not about the definition of Computer Science; it's that the "assumption" outside the Computer Science field is that CS majors are qualified software developers.
My degree is in Studio Art, from a run-of-the-mill state university. Fine Arts teaches both theory and practice: history and composition are about advancing the art, and medium-specific classes are about execution -- and about MAKING A LIVING PRODUCING ART. The goal is to NOT SUCK.
A fair parallel to version control would be Drawing from Life Using Perspective. Perspective takes about 30 minutes to teach, but hundreds of hours to master. There are all kinds of props, techniques, mnemonics, and gizmos that will help you master perspective. You need the "applied" courses like Drawing 300 to explore each of them with someone who's made a living at it. (My instructor had a PhD in drawing!!!)
CS could easily accommodate this kind of "applied" degree.
College will hopefully get the framework to be a decent programmer laid out. With some extra work and some years of experience, that framework may help you to become a decent developer.
I wrote about this on my blog the other day ( http://48klocs.blogspot.com/2008/01/meaningful-certification-is-hard.html ). I don't know where the idea that students spring forth from college with their BA in comp sci fully-formed and ready to develop came from - lawyers and doctors both have long periods of time where they're mentored AFTER additional years of education. If you believe that developers are cogs that are easily replaced, I can see how the belief that education can somehow be tweaked to deliver shiny new cogs makes sense. If you believe that development is as much an intellectual pursuit as the product it delivers (like me) then the notion that there's any signifier that universally guarantees a solid developer is a pretty laughable one.
Unfortunately, as capitalism slowly rolls over and dies of its cancer in the US, most private colleges couldn't care less about how well they are teaching future programmers, as long as that tuition keeps rolling in.
Advert: The evolution of my dept's offerings has been in the direction of many degrees: a traditional BS, a set of BAs with "practical" options -- web, graphics, games, ... Plus a Computer Engineering program and Bioinformatics.
And we have been running internships since 1983.
It's been tried. There are a few problems with this style of approach. Take a look at Neumont University. It's a relatively new school that focuses entirely on project-based development. From the first quarter until graduation, you're working with teams of 4-8 developers on quarter-long projects, required to use source control and unit testing software, with a variety of oddities thrown in based on the teachers' whims. For example, during one quarter, halfway through, each team was moved to another team's project and thus had to read through the code, figure out what was working and what wasn't, and finish it. Other quarters involved such oddities as randomly switching team members across groups (getting teams to make sure a new developer can get into the project as quickly as possible).
The school has run into significant problems. The first was accreditation. Since it takes five years of operation before a college can actually give out degrees, Neumont bought another college (Morrison University), made its own campus the main campus, and tried to sidestep the issue. However, being relatively unknown, and given the way the college is designed, degrees from it aren't entirely useful.
Second, the dropout rate. The entire college is practically a proof of the post you linked (http://www.codinghorror.com/blog/archives/000635.html). The first quarter has a dropout rate over 50%. You can tell which rooms are dedicated to first-quarter students because they're two to three times the size of other classrooms. I've made a point of getting to know as many students as possible, and I've worked with a lot of first-quarter students. Some of these people have been quite intelligent, but for some reason could just never wrap their heads around some of the key concepts of programming. While I'm sure they'll go on to great jobs in other fields, it's proof that you can't just use intelligence as a guide for computer programmers; it requires a specific mindset.
On the other hand, when I've visited friends at more normal colleges, I've snuck into a few CS classes with them to see what kind of education they were getting. Friends in the second half of their junior year of a CS degree were learning concepts that I learned in my first quarter. It seems the state of CS teaching is in horrible disarray, and while revolutionary new ideas are at least better than what we have, they're nowhere near what we need.
I wish our profession were like the medical field, with time in the field as part of the graduation requirements. I knew a hell of a lot about theory when I graduated. I knew nothing about how to actually program in the real world.
It's taken me a year and a half of self-teaching and making blunders to pick up what little I have about good practices with source control and deployment. No one I've worked with has been able to mentor me in these areas.
I graduated from Rose-Hulman, which is one of the schools Joel mentioned on his blog. I graduated just after they decided to split the "Computer Science / Software Engineering" curriculum into two different degrees. Computer Science as I know it is closer to mathematics. In fact, when I started at Rose, they wouldn't let a CS major take a minor in math, even though you only had to take TWO extra math classes to get it. You were forced to double major if you really wanted that math degree.
The CS curriculum I took did have a few Software Engineering courses, but I don't remember them. We weren't focused on that aspect. I do wish I had learned a bit more, as I'm in a SE position now, but I think splitting the curriculum in to two degrees was the right decision.
I do agree, however, that even in CS there should be more use of standard SE tools in projects. Some of those techniques, like version control, are useful to everyone, whether they're doing research programming or product programming.
I am a second-year student at a community college chasing my AAS in computer science. Yes, SCIENCE. The idea of having to walk into the fine arts building fills me with dread. How can there be a computer degree in fine arts? Logic is the basis of computer science; how would one learn logic in fine arts? I think this has to be one of the most horrendous ideas I have ever heard.
We are lucky at Walters State; the head of the department worked in the field for twenty-some-odd years before he started teaching. We learn theory by coding, coding, and more coding. In my two years here I have learned (or am learning now) five computer languages as well as Cisco networking.
True, I spent last summer as an intern at Oak Ridge National Labs, and none of the languages I had learned was being used, but the experience of constant coding and problem solving served me well. I was able to jump in and do !research! after my freshman year. Find someone seeking a degree in fine arts who can do that.
I'd like to give some kudos to Drexel University. I did not attend Drexel, but I have interviewed, hired and worked with a number of Drexel grads.
For those who don't know, Drexel is a University based in Philadelphia that has a very strong internship requirement. Every Drexel student (at least in Comp Sci) has to complete an internship.
As a result, each of these students has had exposure to a real-world environment. Some had IT-focused internships and learned a lot about customer service and what the day-to-day job is like. Those who had software development internships got exposure to source control, documentation, and sometimes even deployment.
I've seen every Drexel grad achieve a lot of success and be very valuable additions to a team in a very short period of time.
Oh... and Drexel has a Ph.D. program and does a lot of research as well. So, it's not just a "learn to work" program.
Nice discussion. I graduated in CS 5 years ago, and agree that classes are for learning theory, summers for writing code / getting your hands dirty.
A few assignments really stick out, which helped me:
* Our sophomore-level programming classes *required* all code be checked into RCS. The actual version control system doesn't matter: understanding the concepts (check in, check out, revert, history, etc.) is what's important. I don't think using version control is an engineering detail -- it's as important as a physicist knowing how to use a calculator. It's a crucial tool for the job.
* A great programming class had us learn a new language each week. One week it'd be Perl and doing file manipulations. Another was Java for some graphics. Another was implementing grep in C [did you realize a regular expression engine is surprisingly small?]. Another assignment was looking at the sed source code and adding features. Each assignment exposed us to a new aspect of programming.
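On that bracketed aside: the classic demonstration is the tiny matcher Rob Pike wrote for The Practice of Programming, which handles ^, $, . and * in roughly thirty lines of C. A sketch along those lines (the main() test driver is my own addition):

    #include <stdio.h>

    int matchhere(const char *re, const char *text);

    /* matchstar: search for c*re at beginning of text */
    static int matchstar(int c, const char *re, const char *text)
    {
        do {  /* a * matches zero or more instances */
            if (matchhere(re, text))
                return 1;
        } while (*text != '\0' && (*text++ == c || c == '.'));
        return 0;
    }

    /* matchhere: search for re at beginning of text */
    int matchhere(const char *re, const char *text)
    {
        if (re[0] == '\0')
            return 1;
        if (re[1] == '*')
            return matchstar(re[0], re + 2, text);
        if (re[0] == '$' && re[1] == '\0')
            return *text == '\0';
        if (*text != '\0' && (re[0] == '.' || re[0] == *text))
            return matchhere(re + 1, text + 1);
        return 0;
    }

    /* match: search for re anywhere in text */
    int match(const char *re, const char *text)
    {
        if (re[0] == '^')
            return matchhere(re + 1, text);
        do {  /* must look even if string is empty */
            if (matchhere(re, text))
                return 1;
        } while (*text++ != '\0');
        return 0;
    }

    int main(void)
    {
        printf("%d\n", match("^a.*b$", "axxxb"));  /* prints 1 */
        printf("%d\n", match("c*d", "abd"));       /* prints 1: c* matches empty */
        return 0;
    }

No character classes, no escapes, no grouping -- but it's a real, working regular expression engine, and it fits on one screen.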
Computers are an interesting mix of engineering and theory, and you need both. I went to one of the Ivy league institutions, and though it does focus on a lot of theory you get your hands dirty too. It depends on the teacher.
The last time the accreditation team came to my department, the Computer Science department at BYU, I was invited to give my input on how our school is doing. Of all the students in the room, the overwhelming responses were "We do not have the chance to write" and "We do not know how it is in the 'Industry'." In the Computer Science Department there is a major split between the Theorists and the Industrialists, with the Theorists holding all the power.
Luckily for us, the students, our voices rang out. We are now required to write reports in all Junior/Senior-level classes. We have also added a required course at the Junior level, "Software Design and Test," which requires the use of SVN. This was extracted from a senior-level class, "Software Development Processes," which runs the class as a company building software for a 'typical' customer with vague requirements, etc.
I have really enjoyed these courses taught by excellent professors. But what's more, I have enjoyed my internships at Amazon.com, which also drove home the _need_ for me to speak up on these subjects to our college curriculum committee. Thank heavens they listened.
There's deployment, and then there's deployment. It seems most of you are discussing deployment in the context of packaging a standalone application and making it available for customers to install by themselves. This is indeed a time-consuming task, but IMO not a difficult one, and I think packaging is a more appropriate term for it.
The kind of deployment I have seen and worked with, and which does take knowledge and precision, is deployment of enterprise-level distributed applications. Look at the SOA trend. Sure, it's all fine and dandy to develop services and have thin clients consuming them under some kind of domain model with a strictly controlled document format (as web services should be), but how do you deploy this, and how do you make sure applications keep working while you upgrade?
I used to work at the IT department of a very well-known Swedish furniture company which has just such an infrastructure: all business logic is developed as Java Enterprise Applications which are deployed on WebSphere app servers. Thin clients developed mainly in VB.NET run on Citrix terminal servers or PCs around the world. Everything needs to run 24/7 with minimal downtime, and redundancy is required in everything.
Did every developer know how to deploy things into this infrastructure? Heck no! Dedicated deployment managers took care of scheduling update work packages for the operations people, etc. Other companies I've worked for, with more customer-packaged applications, also have this: dedicated integrators and packagers make sure that things are configured, built, and packaged correctly and then made available through the correct channels. I believe it's a waste of good programmers to have them spend a lot of time on these issues, when it's not really what they are good at, and often not what they want to do.
Should these things be part of CS? I think not. To me, CS is, and should be, the science of computer logic. It should not be a vocational education for programmers.
Over here in Sweden, there are really three (well, one might say four) master's programmes that commonly lead to a position in IT. There is the Computer Science programme, which is a theoretical education about algorithms, compilers, etc. -- you know it. There is the Computer Engineering programme, which has a bit in common with CS but is much more vocational and includes more of the practicalities of software development. There is also the System Analysis programme, which has very little technical focus, concentrating instead on business analysis, mapping requirements, IT project management, documentation, and architecture. The fourth one is Cognitive Science, generally focusing on UI design. Most professionals come from the Computer Engineering programme; I myself did that plus System Analysis. I think this is a good spread, making for people with focused education in specific areas of software development, rather than trying to create one general education for all IT professionals.
The problem I encounter at my school is that when this is brought up to the administration, they assure us that the course is taught this way!
I think there will always be a disconnect between industry and education because schools are run by educators, not software developers. The people who run the schools are simply incapable of understanding. If you try to explain that the curriculum is insufficient in some areas and needlessly in-depth in others, you are likely to be rebutted with quotes from the syllabus and course catalogue. When you try to talk about industry requirements, they will counter by pointing out prerequisites.
I think all students, no matter what their field of study is, would be wise to cultivate relationships with people working in the industries they hope to find their own place in. I wonder how many computer science students actually ask someone who codes for a living, “What skills do you use the most at work?” or even “What did they ask you when you interviewed for the job?” The best thing you can do is seek out a mentor who is doing the kind of job you some day hope to do.
It is useless to teach the very practical aspects of software engineering, e.g. "Visual Studio 2005" or "Windows API", in university, because these things are outdated by the time students get into a real job.
For example, I took a course on GPU programming two years ago, and although the basic concepts haven't changed, pretty much everything else has.
Therefore, universities should focus on the fundamental concepts, not on the implementations thereof. Which is exactly what they do.