January 12, 2008
Greg Wilson recently emailed me the following question:
I'm teaching a software engineering class to third-year students at the University of Toronto starting in January, and would like to include at least one hour on deployment --- [deployment] never came up in any of my classes, and it's glossed over pretty quickly in most software engineering textbooks, but I have learned the hard way that it's often as big a challenge as getting the application written in the first place.
Deployment is a huge hurdle. It's a challenge even for the best software development teams, and it's incredibly important: if users can't get past the install step, none of the code you've written matters! And yet, as Greg notes, existing software engineering textbooks give this crucial topic only cursory treatment. Along the same lines, a few weeks ago, a younger coworker noted to me in passing that he never learned anything about source control in any of his computer science classes. How could that be? Source control is the very bedrock of software engineering.
If we aren't teaching fundamental software engineering skills like deployment and source control in college today, we're teaching computer science the wrong way. What good is learning to write code in the abstract if you can't work on that code as a team in a controlled environment, and you can't deploy the resulting software? As so many computer science graduates belatedly figure out after landing their first real programming job, it isn't any good at all.
Today's computer science students should develop software under conditions as close as possible to the real world, or the best available approximation thereof. Every line of code should be written under source control at all times. This is not negotiable. When it's time to deploy the code, try deploying to a commercial shared web host, and discover everything that entails. If it's an executable, create a standalone installer package that users must download and install, and give them some mechanism to file bug reports when they inevitably can't get it to work. Students should personally follow up on each bug filed against the software they've written.
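To make the deployment half of this concrete, here is a minimal sketch of the kind of deploy script a student team might end up writing for a shared web host, assuming the host exposes plain FTP. The host name, account, build step, and flat file layout are all invented for illustration; real hosts and class projects will differ.

# deploy.py -- a hypothetical end-of-semester deploy script for a class
# project. Host, credentials, and paths are made up; a real shared host
# and project layout would differ.
import ftplib
import pathlib
import subprocess

HOST = "ftp.example-host.com"     # hypothetical shared web host
USER = "student"                  # hypothetical account
PASSWORD = "not-a-real-secret"
BUILD_DIR = pathlib.Path("build")

def build() -> None:
    # Run the test suite, then the build step, failing loudly on any error.
    subprocess.run(["python", "-m", "pytest"], check=True)
    subprocess.run(["python", "build.py"], check=True)  # hypothetical build step

def upload() -> None:
    # Push every file in the (flat, for brevity) build directory to the host.
    with ftplib.FTP(HOST, USER, PASSWORD) as ftp:
        for path in BUILD_DIR.iterdir():
            if path.is_file():
                with path.open("rb") as fh:
                    ftp.storbinary(f"STOR {path.name}", fh)

if __name__ == "__main__":
    build()
    upload()
    print("Deployed. Now wait for the bug reports.")

Even a toy script like this forces the questions deployment always raises: what gets built, what gets uploaded, and what happens when the upload fails halfway through.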
Will this be painful? Boy, oh boy, will it ever. It'll be excruciating. Students will hate it. They'll begin to question why anyone in their right mind would want to write software.
Welcome to the real world.
After I wrote my response to Greg, Joel Spolsky posted an entry on computer science education that, at least to my eye, seemed hauntingly similar to the advice I offered:
I think the solution would be to create a programming-intensive BFA in Software Development -- a Juilliard for programmers. Such a program would consist of a practical studio requirement developing significant works of software on teams with very experienced teachers, with a sprinkling of liberal arts classes for balance.
When I said BFA, Bachelor of Fine Arts, I meant it: software development is an art, and the existing Computer Science education, where you're expected to learn a few things about NP-completeness and Quicksort, is singularly inadequate for training students how to develop software.
Imagine instead an undergraduate curriculum that consists of 1/3 liberal arts, and 2/3 software development work. The teachers are experienced software developers from industry. The studio operates like a software company. You might be able to major in Game Development and work on a significant game title, for example, and that's how you spend most of your time, just like a film student spends a lot of time actually making films and the dance students spend most of their time dancing.
This is not to say that computer science programs should neglect theory. Fundamental concepts such as algorithms and data structures are still important. My algorithms class was my favorite, and by far the most useful class I took for my own computer science degree. But teaching these things at the expense of more prosaic real-world software engineering skills -- skills that you'll desperately need as a practicing software developer -- is a colossal mistake. It's what Steve Yegge was alluding to in his fantastical Wizard School essay... I think.
There is the concern that all those highfalutin' computer science degrees could degenerate into little more than vocational school programs, something Joel mentioned in his excellent Yale address:
At Ivy League institutions, everything is Unix, functional programming, and theoretical stuff about state machines. As you move down the chain to less and less selective schools Java starts to appear. Move even lower and you literally start to see classes in topics like Microsoft Visual Studio 2005 101, three credits. By the time you get to the 2 year institutions, you see the same kind of SQL-Server-in-21-days "certification" courses you see advertised on the weekends on cable TV. Isn't it time to start your career in (different voice) Java Enterprise Beans!
You can have it both ways. That's why I'm so gung-ho about internships. College CS classes tend to be so dry and academic that you must spend your summers working in industry; otherwise you won't have the crucial software engineering skills you'll need to survive once you graduate. Unimportant little things like, say, source control, deployment, and learning to deal with users. I constantly harp on internships whenever I meet college students pursuing a computer science degree. It's for your own good.
It does strike me as a bit unfair to force students to rely on internships to complete their education in computer science. Or, perhaps, something even worse. "Want to learn computer science? No college necessary! Just download some ISO images and found your own social networking startup!" Unleashing the naked greed of the TechCrunch crowd on tender young programming minds seems downright cruel.
So how should we teach computer science? The more cynical among us might say you can't. I think that's a cop-out. If students want to prepare themselves for a career in software development, they need to shed the theory and spend a significant portion of their time creating software with all the warty, prickly, unglamorous bits included. Half of software engineering is pain mitigation. If you aren't cursing your web hosting provider every week, fighting with your source control system every day, deciphering angry bug reports from your users every hour-- you aren't being taught computer science.
Posted by Jeff Atwood
When I went to uni we did deployments, except we just called them Makefiles. :-)
On the assertion that CS students need to breathe version control:
Considering that most of the time you're working alone at uni, the value of source control is lost; it just becomes a burden. As I'm sure everyone can attest: when you're a team of one, you have no infrastructure and you don't use source control, continuous integration, etc. Especially for something as short-term as an assignment that you work on for maybe two weeks. Total overkill.
Many Computing BSc courses in the UK are now 'sandwich' courses. This means that your degree is extended to four years, with the third being an 'internship'. This is incredibly beneficial because:
i) Eventually you have to practice the theory, so why not get a real grasp of it before your final exams?
ii) You need inspiration, to find both what you're good at and what you enjoy, so when you return to university you know
iii) which modules to select for your all-important final year and what to write your dissertation on.
iv) You're more attractive to some of your peers because you've made your initial mistakes elsewhere!
Good article - I found myself continually nodding my head throughout. As a recent graduate with an IT degree, I became a software engineer and database programmer, which is where I've been for the two years since graduation. I would have benefited from a focus on deployment in my curriculum. However, we did learn to develop software projects as a team.
College is for learning how to learn. If you graduate from a 4 year CS program, and then you can't figure out source control...then I agree that there's a problem...but it's with the person. If you want to learn skills, then you should be able to do that easily after you graduate. After all, the software field is probably the most dynamic...I constantly have to learn new concepts. In fact, even if they taught source control when I went to school, it's so different now that it wouldn't matter.
Source control is not just a computer engineering skill. It's basic computer literacy. It ought to be taught in middle school, in the same class that teaches touch typing.
Jeff, I also never learned anything about source control in any of my computer science classes, even after five years of formal training. Everything I know about revision control systems I learned in industry, much of it working on open source projects. I (probably) studied in a whole different country, so it seems this is a widespread problem.
Computer science students really should be encouraged to participate in open source projects during school (some will like it so much that they will continue to do so afterwards). There is so much to learn there. Heck, they can learn just by watching, if they want.
You should make students submit all work in a source-controlled environment in any programming-style module. Then it's just part of the way they work; they're used to it. As for deployment, I would have loved a course on my degree that taught just a little bit of Ant. I remember in my EJB module it was just: 'this is Ant, type ant, and the supplied build.xml will make you a war file.'
I'm talking about teaching the basics of Ant all the way up to using Ant to check out from CVS, build, and deploy to environments such as QA and Live (see the sketch below).
This sort of experience is what people look for when recruiting.
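As a rough illustration of the kind of pipeline described above, here is a sketch that simply shells out to cvs and ant from Python; in practice Ant itself (or a build server) would own these steps, and the module name, buildfile, and deploy paths here are all hypothetical.

# pipeline.py -- a rough sketch of a checkout/build/deploy pipeline.
# The CVS module, Ant buildfile, and target paths are all invented.
import shutil
import subprocess

def checkout() -> None:
    # Fetch a fresh working copy of the (hypothetical) module from CVS.
    subprocess.run(["cvs", "checkout", "myproject"], check=True)

def build() -> None:
    # Run the supplied Ant buildfile; assume its default target makes a war.
    subprocess.run(["ant", "-f", "myproject/build.xml"], check=True)

def deploy(environment: str) -> None:
    # Copy the built war into the chosen environment's deploy directory.
    targets = {
        "qa": "/srv/qa/webapps/myproject.war",      # hypothetical path
        "live": "/srv/live/webapps/myproject.war",  # hypothetical path
    }
    shutil.copy("myproject/dist/myproject.war", targets[environment])

if __name__ == "__main__":
    checkout()
    build()
    deploy("qa")  # promote to "live" only after QA signs off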
Every class on source control or deployment will push out a class on a topic you will never have another chance to learn in quite the same way as you will when you are in college.
I most regret blowing off the graduate course in advanced cryptography theory because I was spending outrageous hours on my Master's thesis. Will I get a chance to learn how non-interactive zero-knowledge proofs work from some of the people who did the fundamental work? ... No. I could read the papers, but I won't be told in a problem set that I am mistaken, and I won't be able to ask a question in class.
Can I learn about deployment and source control? ... I actually don't have a choice. I learn about it again every time we change it.
I am not the least bit upset that college classes don't teach workmanlike skills in software engineering. I am more upset that they might skip teaching fundamental concepts.
I'd be fine with two types of classes to address your concerns: a software engineering lab that takes a workable stack and emulates real-world experience on larger projects, and an essay course that requires students to explore real-world problems and write papers. In fact, every Comp Sci major should be required to take that second class. Writing a proposal to do XXX (or an essay on topic XXX) is a core skill. Having a class that forces you to write proposal after proposal builds a useful skill and broadens your scope.
I do not, however, agree that practical real-world skills have a large place in Computer Science education. So what if new recruits don't gain practical experience in college? Who cares? Every skill you are talking about should be taught by an employer in the context of the environment where it will be used.
If the recruit is not smart enough to master practical skills in the workplace, then they are not a good recruit. But for goodness' sake, let them have an actual college experience rather than this 'prepare for the workplace' mistake of pushing professional development into the college curriculum.
30 years ago at MIT I earned a BS in Computer Science from the EE department. The coursework, by department policy, was strictly theoretical. If you wanted to learn OS 360 and how to program in BAL, you had to go over to the business school. The justification for the policy was that any attempt to teach with a particular programming language, operating system, or development environment was doomed to be obsolete almost as soon as you graduated. By teaching theory, we could hope to learn "classical" knowledge which would endure.
The broad survey of the field they taught gave me a solid underpinning for layering on the practical applications and vocational experience I learned later in the workplace.
A vocational (or shudder, "Credentials") school teaches you HOW. MIT taught me WHO, WHAT, WHERE, and especially, WHY. Where my peers try to look up answers in a book (or online), I can derive a method from base principles. Where my peers at their best can only cut and paste others' open source code, I can apply classical algorithms and bend and extend them to my will. I am weary of those who whine that a project is impossible unless they can beg, borrow, or steal baseline code off the shelf.
At a shallow level, academia may appear to lack relevance. At a deeper level, it supplies the foundation for significant innovation, real breakthroughs, and original contributions. Further, I'll match the half-life value of my educational experience against that which can be expected from any vocational school you can name.
At school I learned the most important lesson of how - HOW to learn. So many of my competitors in the workplace may be out of work in 5-8 years solely because they never picked up a book, read an online tutorial, or listened to a technical podcast once they left the schoolroom.
Dear reader - never give up. Never surrender (-GalaxyQuest).
I'm a Computer Engineering student at the University of Waterloo. Our school is very co-op oriented. Every 4 months we switch between a co-op term and a school term.
Most of the classes I have taken to date involve learning sorting algorithms and how to build linked lists, trees, hash tables, etc., from scratch.
I remember on my very first development internship, I was set up in front of a computer with Visual Studio and SourceSafe (neither of which I had heard of in my life). When my manager asked me to work on part of the code, I started to try to build a hash table... from scratch... not realizing it was already there in the libraries that ship with Visual Studio (see the sketch below). I was so lost that term, but I learned A LOT.
I am very grateful for the co-op program at my school because it provides me with valuable work experience. However, I also feel a little cheated by my school since I am paying so much for an education that can't even be applied in the real world.
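For anyone who hasn't hit that moment themselves, the sketch below makes the point (in Python rather than the .NET tools from the anecdote above): the hash table you painstakingly build from scratch in class is a one-liner in any mainstream standard library.

# The classroom version takes weeks of buckets, hash functions, and
# collision handling. The workplace version, in most languages:
phone_book = {}                    # Python's built-in dict IS a hash table
phone_book["alice"] = "555-0100"   # insert: O(1) on average
phone_book["bob"] = "555-0199"
print(phone_book.get("alice"))     # lookup: O(1) on average -> 555-0100
print("carol" in phone_book)       # membership test -> False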
In my country, if you want to be a very specialized surgeon, you have to study general medicine and all the theory, no matter what specialization you will take. So if you want to be a brain surgeon, you need to study the anatomy of the little bones in the toes, even though you will never have to bother with toe bones as a brain surgeon. The same connection that exists between the narrow range of very important, vital activities a brain surgeon performs during an operation and broad theoretical medical knowledge also exists between the software engineer and the computer scientist. Don't rush to say it is not a proper example; the analogy holds. You know what the difference is? The consequences of ignoring these connections are unacceptable in the case of medicine, while in the case of software they are treated as acceptable whenever maintaining them costs too much in time and money (studying, tuition, etc.). And this is a mistake, or a necessary evil, because:
The reason a specialized surgeon is obliged to learn pretty much everything about the body, although he will be dealing with a tiny centimeter of a specific body part for his whole life, is that a wrong assessment or intervention in one part of the body can trigger a wide spectrum of effects anywhere else in the body. Does that ring any bell regarding a project, or an algorithm, or a theorem in general?
So the question is: would an experienced software engineer with 15 years' experience in the field trade 4-5 of them for academic experience? I say definitely yes. Not because it would benefit him; it might not. But it would benefit the art.
Look up the Threads program at the Georgia Institute of Technology. Those people have the right idea.
I had to take labs for typing and Linux, and a class on computer operating systems (DOS, Windows, *nix); non-CS students are often required to show competency in word processing and spreadsheets (usually Word and Excel), and possibly PowerPoint and various internet applications (browser, email, etc.). Of course, this is a combination of 2 or 3 schools; none of them required every part of it.
It struck me within a couple of weeks of starting my first "professional" program, especially working with a (small) team, that the classes I took were only so useful for actually doing the work. I can only imagine what it's like for someone who goes through 4 years of school seeing little more than 2 years of learning to write code in Java and, hopefully, some data structures and algorithms (still taught in Java...).
Worse yet, the older I get, the more I feel like I need a stack of books (or good online help) to write even a simple program in a language I used extensively in school. Of course, it would probably help if I didn't have to write in a different language every time I switched projects (and you want to spend how much time converting this Pascal code to C? I thought that OS came with a compiler for Pascal...).
School subjects are much more interesting than the real world. Boring everyday development and deployment is what I do during the summer.
Haha, you sound like an old man! If it doesn't hurt, it's not good for ya! Well, I don't think that is necessary. However, I unfortunately could not tell you how to teach Computer Science. I sure as hell didn't get taught right.
I believe the misconception that CS graduates make good programmers forms a large part of the problem. For instance, software development and programming job ads often require CS degrees.
As an enterprise security practitioner, I see a lot of the crap that development shops pump out. In one case, a security company (which also has a CS degree as a hiring criterion) wrote custom software laced with cross-site scripting vulnerabilities for my client's web authentication module. When this was pointed out to them, the "programmer" reluctantly went back to rewrite the code. The rewritten code had the same security vulnerability. Clearly, the programmer's CS background didn't help produce secure code... but that appeared to be an acceptable standard for the security firm he worked for.
It's this sort of stuff that shows the uselessness of a CS degree in many areas of applied practice. For that matter, it's that sort of stuff in IT in general that makes me roll my eyes. Sure, CS has value and serves its purpose in some areas. But just as there are good lawyers and bad lawyers, or good doctors and bad doctors, the quality of applied IT comes down to individuals (or proven, repeatable processes) -- as opposed to degrees alone.
"Perhaps they could try having students contribute to an existing open source project of some kind, even in a small way?"
This is the kind of thing that Google has been encouraging with its Summer of Code initiative, and, following on from that, its Highly Open Participation Contest, which is targeted at even younger folks.
This is really an excellent idea. In my experience, the vast majority of "professional" software engineers could learn a lot by becoming involved with an open source project or two.
It is for this reason that I LOVED my senior design class in college. We culminated everything with a team design project for a COMPUTER GAME!
Students are given the game to write and split into teams with team leaders; everything else is their choice - languages to use, the name of their game, the name of their teams, their own team logos, etc.
Certain requirements are there - Each team's game has to be able to play network games (host and join), each game has to have an AI with 3 levels of difficulty, everything must be documented, etc.
My only problem with it is that it's only a one-semester class, but MAN it was GREAT!
If only students knew that earning their CS degree meant becoming Dilberts - complete with the cubicles, pointy-haired bosses, and all that WTF insanity - IT would be a far less popular field!
Maybe that's why universities are so focused on the fun, but useless aspects of computer science?
I think most people will find that the proponents of universities as purely academic institutions, with little focus on the applicable nature of computer science, are most often those currently employed by those institutions or attending them. The truth of the matter is that Computer Science is almost entirely an applied science; otherwise it would be a mathematics degree focusing on computability and the like.
Re. the main article: a solution that has fortunately been found by a professor of mine is a large-scale research/implementation component in each course he teaches at my university. The student must research a concept from the course material in depth and provide both a research paper and a 'practical implementation' of the concept (where 'practical implementation' is flexible due to the sometimes complex or obscure material).
I've been teaching computer science for ages. I mention deployment in the context that it is one of the things you should sort out as early in a project as possible. After that you sort out uninstall. That way you can get programs out into the field for users to play with, and send out new versions with no tears.
I also stress that the "out of the box" experience for a customer is absolutely crucial and that if you foul this up you are on the defensive before you start.
With regard to the teaching of computer science, I've been doing this for a while, and have decided that it is time for a bit of silliness.
You're making the common mistake here. Computer science is not software engineering. Believe it or not, some people do want to just learn the abstract, and there is reason to learn that and that's what computer science is. If you want to learn development, take up software engineering.
Computer Science and Software Engineering are two *entirely different disciplines*. Computer Science is more related to computational theory than it is to programming. Software Engineering is the course you should be taking if you actually want to learn how to develop software. Any good college/university that offers Software Engineering will cover Source Control and Deployment -- I know mine does.
I've been wondering why I enjoyed my CS studies, but feel turned off by any "real world" aspects of CS. 3D graphics and design of algorithms were my favorite subjects at univ, but I don't care at all about gathering requirements, dealing with incompetent users, creating data driven ASP.NET websites / web services, using AJAX to make things "look pretty", etc... Learning new things is fun, but hacking together "business-y" products is just a drag. Maybe I'm just one of those few who love true computer science, but not software engineering?
"You're making the common mistake here. Computer science is not software engineering. Believe it or not, some people do want to just learn the abstract, and there is reason to learn that and that's what computer science is. If you want to learn development, take up software engineering.
Sauron on January 13, 2008 03:01 AM"
The Dark Lord is correct - what is needed is almost a double major in Computer Science and Software Engineering. Two sides of the same coin and both are ultimately needed to produce a solid developer.
Dijkstra used only a pen and paper for his programs. What sort of a source control tool do you suggest for this method?
Please, I want you to send me the answer to the question I am about to send you through this e-mail: email@example.com. Question: why do you think that students of computer science need to be taught software engineering? Thank you.
from: birma ishaku
dept: computer science
University of Maiduguri
Why don't schools teach A-level computing?
Firstly, I would like to thank you for writing this enlightening blog post. I agree that what we have been teaching as part of the Computer Science curriculum needs to be closer to the real world. But I would like to point out that computer science degrees these days impart more practical knowledge and are more focused on engineering skills. Industrial exposure and the latest technologies, now an integral part of the computer science curriculum, add to this change.
Perhaps we need a major in software development that's not as theory-oriented as computer science or software engineering, and certainly more hands-on than information science.
There are a lot of overlapping fields here, and it's much like complaining that a computer engineer should know more about being an electrician; they're different things.
So TRUE... I took database classes back in the day where we didn't even open up an SQL prompt, much less integrate SQL into program code. Sigh... academia, why are you so useless?
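For contrast, here is roughly what integrating SQL into program code looks like at its simplest, sketched with Python's standard-library sqlite3 module; the table and data are invented for illustration.

# A minimal taste of SQL driven from program code, using an in-memory
# SQLite database so the example is fully self-contained.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE students (name TEXT, grade TEXT)")
# Parameterized inserts: the ? placeholders keep data out of the SQL text.
conn.execute("INSERT INTO students VALUES (?, ?)", ("Ada", "A"))
conn.execute("INSERT INTO students VALUES (?, ?)", ("Linus", "B"))
for name, grade in conn.execute("SELECT name, grade FROM students ORDER BY name"):
    print(name, grade)
conn.close()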
Keep in mind that this is coming from a USA perspective and I’m not intending to be a *nit*, but a Computer Engineering Program is not the same as a Software Engineering Program. They aren’t even close. The interchanging of these terms in the comments denigrates the efforts that Computer Engineer Degree recipients have achieved.
A Software Engineering Program is akin to a Sanitation Engineering Program. It uses the word *engineering* to sound like something more than it is. The bottom line is, if the degree did not prepare you to take AND pass the Engineer in Training (EIT) Exam, then it is not an engineering degree or program. Engineering has a specific meaning at reputable academic institutions, culminating in taking the EIT Exam. From my brief investigation, none of the Software Engineering Programs I found online prepare the graduate for this exam.
The type of education discussed in this article, which some would call a Software Engineering degree, is more appropriately called an Information Systems Degree. The emphasis in this program is on "the application of computing".
Whereas, Computer Science “focuses on the theory of computation and computer organization”.
Computer Engineering emphasizes the application of engineering principles to the design of computer hardware and software. It has basically the same computer science courses as the Computer Science curriculum.
Where Computer Science and Computer Engineering differ is that Computer Engineering has more required math courses and more Engineering/Physical Sciences/Electrical/Computer Hardware courses (e.g. Statics, Dynamics, Thermodynamics, Circuits, Computer Systems Design, and so on). Computer Science replaces these engineering classes with a broader selection of Liberal Arts electives and a couple more elective Computer Science courses.
While it may not be obvious at first glance, the Computer Engineering Degree is definitely the more rigorous and difficult to obtain degree of those listed. It requires more credit hours to complete and the required classes are harder.
The difference between the degree seekers was quite obvious when taking courses that overlapped. The Information Systems people had more experience actually doing something, but struggled mightily with problems that required more advanced conceptual skills. That is probably why one poster said the Software Engineering students haven't been very useful in his experience. The Computer Science students, meanwhile, were the equal of the Computer Engineering students at thinking up solutions, but lacked the applied mathematical skills the Computer Engineering students had. Thus, it was typical that the Comp E's got A's, Comp Sci's got B's, and IS's got the C's.
Like I said, I don’t want to be a *nit* but people pick the Computer Engineering Degree because that is what interests them, even though they know full well that it will mean a lot of Friday and Saturday nights that they will have work to do while their friends who picked Information Systems (or Software Engineering) go out partying. I don’t want the extra effort made by those who chose to get their Computer Engineering Degree to be diminished by being lumped in with a quasi-engineering degree whose classes don’t even begin to compare with the type of classes required in a Computer Engineering Degree.
P.S. Also, if you hire someone that can’t grasp the concept of a Source Control Tool in fairly short order (with or without learning it in school) then that is a person that you probably should find a way to get rid of as they are obviously in the wrong business.
A Question to ponder: How do you teach deployment to someone in school that they can actually use on the job? Deployment is so specific to whatever type of environment you are working in that you can almost guarantee that nobody in that class is going to ever use the specific deployment method that they are taught in school on a real job. That is why the person went to school, to learn how to learn new skills like “How do I deploy my application?”
A final remark, almost all software development shops do development in different ways. This is why many companies prefer a *blank slate* graduate rather than a pseudo-experienced one. That way they get to teach the new employee THEIR WAY without bad habits getting in the way. The vocational training approach would create many bad habits. Yes, experience gained is usually good, but bad experience is worse than no experience.
I would just like to add my voice to this discussion. I agree with Sauron, bob and the others making the distinction between academic education and practical training.
It is almost, or rather it should be, academic (pardon the pun) what degree someone has. The fact is that they have demonstrated that they are capable of understanding and learning at degree level. I don't think it matters whether they ever actually use the knowledge they learn. My undergrad degree is in Philosophy, but I have never needed to discuss Plato in my job, though the critical and analytical skills I developed have proved extremely useful in my life. That I can read a book, understand it quickly, and therefore teach myself what I need to know is the skill a degree has furnished me with. To me, that is the point of an education: to help you with your life, not just your job.
That is not to say that someone without the academic education could not teach themselves from the same book but a degree says that the person has demonstrated that they can over 3 years or so.
Source control and working in a team are undeniably essential skills for the real world, but an induction process should ensure new employees have those skills, rather than relying on universities to do workplace training on employers' behalf.
Finally wouldn't the world be a dull place if we were all just the set of skills we needed for our jobs. Where would innovation come from?
There is something to be said for employing people who do not major in CS but in some other subject area and then apply their domain skills from that area in software instead.
This is the path I took into software and it has worked out well for me and my employers. It does have the disadvantage, however, of leaving gaps in my software knowledge. I know lots about CVS and SVN but almost nothing about databases in either the abstract or specific. I have, however, deliberately avoided DBs in my work to date.
I have hired several people over the years (I am now 43). I prefer those with another string to their bow over those with a pure CS background. I do this because I find that pure CS people often find it hard to talk with other disciplines in non-CS terms. Admittedly, this problem lessens with experience. For many, however, it lingers a long time.
I should emphasize that I have spent a lot of time on projects with multidisciplinary engineering teams. This is not your classic software vs. business issue. If the engineers find it hard to develop a common language to talk amongst themselves, then the product is almost certainly doomed.
FYI, the recent _agile_ movement seems designed to piss engineers in other disciplines off. All design suffers the same problem as software design. The only thing that makes software special, in any way whatsoever, is that software is one of the few things that can continue to evolve long after the _physical_ product is in the market.
Anyway, I detect a certain haughtiness in some CS courses. A looking down upon the non-academic world. CS used to be the only way to learn about computers at university. This is no longer the case. Other options exist now. Those options should reside in the Engineering faculty. CS should remain in the Science faculty along with Physics, Chemistry and Biology.
It is time for software engineering, the thing that most of us really do, to separate itself from its science parent in much the same way that chemical engineering has been separated from chemistry.
Maybe it is too early to do this, but it will happen; indeed, it is already happening in many institutions.
I think this is a good thing for both CS and SE.
I served on a committee to create a new CS degree program at a top public university. To my amazement, at the first meeting, the chair of the CS department told us that she views every student who graduates with a BS in CompSci and becomes a professional software developer as a _failure_.
In her mind, the purpose of her department is to produce CompSci PhDs who can "advance the field of computing". This kind of elitist attitude is what we're up against and is why the American university system is not producing enough capable software engineers.
[Anonymous to protect the guilty.]
Today's computer science students should develop software under conditions as close as possible to the real world, or the best available approximation thereof.
This is so true. So many times I have to reconfigure software to make it work in a truly managed enterprise environment. When will developers learn that users generally don't, and can't, have admin rights on their machines for commercial applications in a real enterprise environment?
A major portion of my job is either reconfiguring software myself or working with the vendor to plug the admin-rights holes in their offerings.
Understanding Computer Science theory is important for a developer to analyze problems and develop quality solutions. As such, fundamental theory should always be part of a quality Computer Science education.
Because of the relatively young age of the computer field, most universities and colleges are trying to figure out how to grow their Computer Science programs. Eventually, we should see a more general separation of Software Engineering (application-based) programs from Computer Science (theory-based) programs. And as the field ages, more specialized degrees should emerge (security, graphics, etc.), just as Renaissance science has diverged into Biology, Chemistry, Physics, etc.
While schools hang on to a single Computer Science degree program, it will be increasingly hard for them to strike a balance between theory and application. Unfortunately, in such cases, in-depth coverage of vital application skills will continue to be overlooked.
It's a bit ridiculous to expect computer science to focus on the applied end of things. You wouldn't expect a math program to train you to be an accountant. Why would you expect anything different from computer science? Maybe a better solution would be to offer separate degree plans, one for science and one for development.
The only thing I ever learned at College was how to learn--anything.
Before I say anything else, I am a huge fan of yours. On to my criticism: you seem to forget that Computer Science has "science" in the title while Software Engineering has "engineering" in the title; that is, the first is a science while the second is an applied science. I am not saying that the two should be kept cleanly separated.
Each one should have 20% of the other in its curriculum (e.g. SCM), and computer science should have a quota, since the world does not have room for that many computer scientists, and they end up becoming software engineers who have to learn their craft the hard way.