October 2, 2006
Steve Yegge's scathing criticism of agile methodologies takes a page from Joel Spolsky's book. It's not merely an indictment of Agile, it's also a celebration of how his company does business. Just substitute "Google" for "Fog Creek Software" and you'll get the idea.
It's a long post, so I'll save you the effort of reading it: if you're practicing Agile and you don't work at Google, you're probably doing it wrong. Here's how Google does it:
- There are managers, sort of, but most of them code at least half-time, making them more like tech leads.
- Developers can switch teams and/or projects any time they want, no questions asked; just say the word and the movers will show up the next day to put you in your new office with your new team.
- Google has a philosophy of not ever telling developers what to work on, and they take it pretty seriously.
- Developers are strongly encouraged to spend 20% of their time (and I mean their M-F, 8-5 time, not weekends or personal time) working on whatever they want, as long as it's not their main project.
- There aren't very many meetings. I'd say an average developer attends perhaps 3 meetings a week, including their 1:1 with their lead.
- It's quiet. Engineers are quietly focused on their work, as individuals or sometimes in little groups of 2 to 5.
- There aren't Gantt charts or date-task-owner spreadsheets or any other visible project-management artifacts in evidence, not that I've ever seen.
- Even during the relatively rare crunch periods, people still go get lunch and dinner, which are (famously) always free and tasty, and they don't work insane hours unless they want to.
- Google drives behavior through incentives. Engineers working on important projects are, on average, rewarded more than those on less-important projects. The rewards and incentives are too numerous to talk about here, but the financial incentives range from gift certificates and massage coupons up through giant bonuses and stock grants.
- Google is a peer-review oriented culture, and earning the respect of your peers means a lot there. More than it does at other places, I think. [..] your actual performance review is almost entirely based on your peer reviews, so it has an indirect financial impact on you.
- [Google] has a long all-hands in which they show every single project that launched to everyone, and put up the names and faces of the teams (always small) who launched each one, and everyone applauds.
If nothing else, it's an interesting window into Google's software development process.
The whole FogCreek/Google-Is-So-Totally-Awesome thing is an annoyance. But I have a deeper problem with this post. I think Steve's criticisms of agile are hysterical and misplaced; attacking Agile is a waste of time because most developers haven't even gotten there yet! The real enemy isn't Agile, it's Waterfall and Big Design Up Front. Even "bad" Agile is a huge quality of life improvement for developers stuck in the dark ages of BDUF. I know because I've been there.
Rather than wasting time and effort on discriminating between "good" and "bad" Agile, we should be banding together in the name of Anything But Waterfall. The fact that some maladjusted developer or project manager could use Steve's well-written, reasonable sounding rant as a justification to keep their project in the dark ages of Waterfall and BDUF absolutely kills me. Who is the real enemy here?
I'm not the only person with criticisms of Steve's rant. Dare Obasanjo notes that the talent meritocracy at Google sounds disturbingly similar to the one outlined in Malcolm Gladwell's The Talent Myth:
This "talent mind-set" is the new orthodoxy of American management. It is the intellectual justification for why such a high premium is placed on degrees from first-tier business schools, and why the compensation packages for top executives have become so lavish. In the modern corporation, the system is considered only as strong as its stars, and, in the past few years, this message has been preached by consultants and management gurus all over the world. None, however, have spread the word quite so ardently as McKinsey, and, of all its clients, one firm took the talent mind-set closest to heart. It was a company where McKinsey conducted twenty separate projects, where McKinsey's billings topped ten million dollars a year, where a McKinsey director regularly attended board meetings, and where the C.E.O. himself was a former McKinsey partner. The company, of course, was Enron.
Read the rest of the article; the similarities are truly startling. It's not very reassuring to think that the only difference between Enron and Google is their "Don't be evil" motto. We now have laws in place to protect us from Enron, e.g. Sarbanes-Oxley. I'm not aware of any pending motto enforcement acts. Yet.
calls it "a great rant", but also cleverly juxtaposes two of Steve's sentences to illustrate a point:
This nickel-a-line-of-code gig is lame. You know where the real money is at? You start your own religion. [...] There is nothing like it on the face of this earth. I could talk for hours, days about how amazing it is to work at Google, and I wouldn't be done. And they're not done either. Every week it seems like there's a new perk, a new benefit, a new improvement, a new survey asking us all if there's any possible way in which life at Google could be better.
One wonders if the "good" agile at Google isn't just as much of a religion as the "bad" Agile.
Posted by Jeff Atwood
Great post as usual. I think there is middle ground between BDUF and XP. That's probably where most of us spend our time.
Having been doing this stuff for 20+ years, I've seen very good, hopeless, and in between.
- you need to think stuff through as best you can
- you need to use a language/tool that lets you get things done relatively quickly (whatever that means)
- your building blocks need to be such that they can be modified and/or jiggled around relatively easily
- build a kick-ass library of reusable functions and make people use them
- code review, and mentor
- estimate carefully, work on fixed price, and when the customer has a change say "no problem, but it's time and money"
- prototype with the user at regular intervals
This stuff stays pretty much the same year after year. Tools get better (and sometimes worse). Programmers many times write code without thinking very much beforehand (and may call that XP). Event driven programming made things more complex in some ways.
I don't think I really believe all of this Google story. At some point something has a priority, some boundaries (so you know when it's done), etc., and enough people working on it that the staffing actually makes sense from week to week.
For the anti-Waterfall movement to catch on, it needs a catchy name.
I was really surprised that there wasn't a YouTube link to TLC's "don't go chasing waterfalls"...
The comparison to religion is an apt one. Google's development philosophy seems to boil down to "hire a bunch of really smart people, treat them well, and have faith that they'll come up with cool stuff".
In Google's case, it probably mostly works because they really only have one product - an ad firehose. All of their other "products" have as their only goal to widen the stream. The cost of failure (and they've had them) is low, and even modest success is purely accretive.
Hardly your typical development organization.
The truth is often somewhere in the middle. There's no silver bullet for the problems of software development. You'll get far by planning ahead, by communicating a lot (more), by not reinventing what someone else has already invented, and so on. Look at other mature engineering disciplines and learn from them; software development really isn't that unique.
Like Kevin says, what works at Google won't necessarily work elsewhere. In many ways, Google creates its own weather patterns.
I'm all for no *artificial* deadlines (http://haacked.com/archive/2006/03/12/ArtificialDeadlinesAreTheDevilsWork.aspx) but no deadlines at all would seem to run afoul of Parkinson's Law (http://en.wikipedia.org/wiki/Parkinson's_law).
I'd be all for no-deadlines honestly, but I have this little problem of clients and their whiny demands. Something Google doesn't have to worry about apparently.
But you gotta admit, part of you would like to work in that developer's Utopia, if it were found to be true. ;)
There was an email discussion of this at work in response to us using Scrum, one of the "Evil Agiles." Here was my response, with comparisons to scrum [in brackets]. Sorry for the length, it's mostly due to copy-and-paste of Stevey's stuff.
Although he bashes Agile religion and associates that with Scrum, he actually describes many Scrummish things in his description of the kind of “Good Agile” that works well at Google.
- Google has a philosophy of not ever telling developers what to work on, and they take it pretty seriously. [scrum is about self-organizing teams]
- there aren't very many meetings. I'd say an average developer attends perhaps 3 meetings a week, including their 1:1 with their lead. [short daily scrums]
- there aren't Gantt charts or date-task-owner spreadsheets or any other visible project-management artifacts in evidence, not that I've ever seen. [note: he later mentions using a work queue which is basically your product/sprint backlog]
Google is a peer-review oriented culture [daily scrums, code reviews]
Another incentive is that every quarter, without fail, they have a long all-hands in which they show every single project that launched to everyone [this is basically the sprint review meeting where you show off your working, potentially-shippable code]
Google takes launching very seriously, and I think that being recognized for launching something cool might be the strongest incentive across the company. [similar to scrum’s focus on a potentially shippable product each month]
Teams are always situated close together in fishbowl-style open seating, so that pair programming happens exactly when it's needed (say 5% of the time), and never otherwise. [self-organizing teams]
They take things like unit testing, design documents and code reviews more seriously than any other company I've even heard about. They work hard to keep their house in order at all times [continual testing, refactoring, and keeping the quality high are crucial to delivering working software each month]
Google isn't foolish enough or presumptuous enough to claim to know how long stuff should take. So the only company-wide dates I'm ever aware of are the ends of each quarter, because everyone's scrambling to get on that big launch screen [scrum is about adapting to changes as you go, including changes in estimates. The burndown graph helps to make intelligent trade-offs for delivering something meaningful each month]
Google gives you access to any resources you need in order to get your job done, or to learn how to get your job done. [scrum is about giving your teams everything they need to do their jobs well, including removing obstacles]
Then all you need is a work queue. That's it. [backlogs]
With nothing more than a work queue (a priority queue, of course), you immediately attain most of the supposedly magical benefits of Agile Methodologies. And make no mistake, it's better to have it in software than on a bunch of index cards. If you're not convinced, then I will steal your index cards. [backlogs!]
With a priority queue, you have a dumping-ground for any and all ideas (and bugs) that people suggest as the project unfolds. No engineer is ever idle, unless the queue is empty, which by definition means the project has launched. [backlogs!! Notice that the backlog drops to zero in order to launch]
You always know how much work is left, and if you like, you can make time estimates based on the remaining tasks. [more backlogging]
A work queue is completely transparent, so there is minimal risk of accidental duplication of work. [visibility is one of the 3 main ideas behind scrum. The other two are inspection and adaptation]
There are other things he mentions that are different from scrum, but in general I think the things he likes most about working at Google are the same things that Scrum encourages, as long as it doesn’t become a blindly followed religion.
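The "work queue" that both Steve and Scrum describe really is just a priority queue. Here's a minimal sketch of the idea in Python's built-in heapq module -- a hedged illustration only, with all names hypothetical, not Google's or any Scrum tool's actual code:

```python
import heapq

class WorkQueue:
    """A minimal priority-based backlog: lower number = higher priority."""

    def __init__(self):
        self._heap = []
        self._counter = 0  # tie-breaker so equal-priority tasks keep insertion order

    def add(self, priority, task):
        """Dump any idea or bug into the queue as it comes up."""
        heapq.heappush(self._heap, (priority, self._counter, task))
        self._counter += 1

    def next_task(self):
        """Pop the highest-priority task; no engineer is ever idle."""
        return heapq.heappop(self._heap)[2]

    def __len__(self):
        # Work remaining; by Steve's definition, 0 means "launched".
        return len(self._heap)

queue = WorkQueue()
queue.add(2, "polish UI")
queue.add(1, "fix login bug")
queue.add(3, "write docs")
print(queue.next_task())  # -> fix login bug
```

Everyone can see the whole heap at any time, which is the "transparency" property he credits with preventing duplicated work -- and `len(queue)` is all you need for the rough time estimates he mentions.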
I find that the Waterfall method would be a step up from the way most shops (that I've seen) are run.
You folks need to get state of the art. If you'd been to Waterfall 2006 you'd be singing a different tune. The developments are profound and inspiring.
Are you sure that you really like to go down the "Guilt by Association" road?
Google employees work in a building.
Enron employees worked in a building.
Enron was evil.
Google is evil.
You also simply assert that Sarbanes-Oxley protects us. Assertion is not proof.
Methodologies are just like religions. They don't apply to everyone and each has some factor of error in logic/implementation.
Holistically you cannot box your product into any given pre-existing process. The process must enable the product, it's not the other way around. This is why most methodologies fail.
It's like picking up a wrench, and finding out you need to drive in a nail. You can use the wrench to bang on the nail head, but at the end of the day you exhausted your effort using the wrong process to complete the product. If you had examined the product you would have known that a hammer would be much more useful. The problem is knowing up front that you are building a garage and not changing a tire. CLEAR Objectification of your product and determination of scope up front... I don't know what methodologies support that, I have been privy to a few, but maybe the "priest" was more interested in the end result than managing the congregation.
The Enron-Google comparison is actually much more than just guilt by association. Having read "The Smartest Guys in the Room" just before reading Yegge's blog rant, I was struck by a number of similarities.
Enron's problem was actually NOT that it was evil. Enron was simply unprofitable. Or rather, it had only one profitable business (trading) which supported all its other businesses, most of which were huge money sinks. They spent much of their time desperately searching for the "next big thing", but never had the patience to allow any of these initiatives time to grow into real businesses. Finally, when their one profitable business hit a rough patch, the company imploded.
All the accounting scandals and so on which prompted the creation of SOX merely delayed the inevitable. SOX would not have prevented Enron's implosion, it would only have made it happen sooner and with less collateral damage.
Anyway, the parallels with Google should be clear. Google is mostly supported by its search business, which is massively profitable, but it remains mostly a one-trick pony. None of its other businesses come close to replicating their success in search, in fact many of them are total busts. Much of the cultural stuff described by Yegge looks very much like a desperate hunt for the "next big thing" - but just like Enron, Google appears to lack the stamina to see any of them through.
Fortunately for Google, it looks highly unlikely that their search product will implode in the same way as Enron's trading did. After all, trading is intrinsically volatile. But if a big competitor (e.g. Yahoo or MS) managed to eat enough of their market in search, I think they would quickly have to switch to a much more conventional (i.e. profit-driven) way of doing business.
Hmm, nice post, good links. Trouble is, I'm wondering whether I ought to tell my tutor that I don't want to use UP in my software model and ask if I may use agile instead. I wonder what he would say. I suppose you need to know the rules before you can break them ....
I agree with Neil mostly. Google lives off its search, and essentially has a pile of other little services that just need to cover costs and create public attention. A person who uses a Google image viewer, a Google mini search, and other such tools is, IMHO, likely to use Google for searching too!
IMHO, a service is a winner for Google if it covers its development costs...
Nice post as usual, but how do you push your company to change their methods? I mean, I could do it, but what about the top-level management, project managers, and a bunch of other people who just can't live without BDUF?
Should I tell them to read codinghorror.com? :D
Having one big source of revenue is the way most big companies work; it's specialization. One company builds cars, another owns houses in Manhattan. I don't see Google being much more vulnerable than any other company in this respect. The whole "I'm not aware of any pending motto enforcement acts" sounds very much like "it's just a matter of time until they do something evil" to me, and the whole thing sounds very much like an attempt to use a company with a very bad name to smear another. Why choose Enron? There are plenty of companies with a similar strategy.
I liked Steve's article but only because I had my own ideas about Agile and they tend to err on the side of Steve's. Also, I liked hearing about Google.
As for Agile... there's not a lot I like about it. Note cards up in your cube? Why not use *gasp* a real bug tracking/task managing system. Daily meetings? Gag. Pair programming? Put me in a dark room, give me my tasks and leave me alone and I'll have them done quicker than if I have to sit in a cramped cube with someone else and watch them code.
Yes, Waterfall sucks but Agile also sucks. There are other ways to do it besides Waterfall and Agile too.
And for those that think that Google is evil... honestly, if they do what Steve said they did, I want to freaking work there. That sounds a lot better than all the companies I've worked for.
"There aren't Gantt charts..."
Gantt charts are evil. Every project I've worked on that had a "project manager" or used Gantt charts was a massive failure.
To Hendra Saputra:
It's a hard, long road. "Fearless Change" by Mary Lynn Manns and Linda Rising is a good place to start :)
Best of luck.
Parkinson's Law is not a law. It has been demonstrated in extensive studies (for example, read Peopleware) that the highest productivity comes from a lack of deadlines. There are numerous theories as to why this is the case. The next best productivity comes when a deadline is set by a third-party guru (not your supervisor, not yourself, not you plus your supervisor).
Somebody said this (or something close to it) a few topics ago: "Why do development discussions always polarize to the extremes".
The problem I see here is that people are saying "BDUF didn't work on project X therefore no design is preferable". Or they are saying that "Waterfall, a rigid weakly iterative approach, doesn't work therefore we must have no process unless it is highly iterative (and of course, the most iterative approach is to hack)".
These arguments are completely flawed since they lack any kind of logical support.
I am a proponent of design. I believe that very few software projects do anywhere near enough design. In other words, I am adamantly opposed to the "vague idea forms; hack it out" approach. However, I am not a proponent of waterfall - I do not believe that one can or should completely design a system up front without some actual coding/experimentation. This is a middle ground attitude - not a polarization or extreme attitude. Although I regularly attend the Agile meetings in my home town and have listened to and spoken with Agilists such as Alistair Cockburn, Mary Poppendieck, Steve Adolph and Philippe Kruchten, I'm not even convinced Agile is the correct route. One problem with Agile approaches is that every last one of them gets somehow corrupted by developers to mean "vague idea forms; hack it out". This is what happened to XP, it is what happened to Scrum at the last software company I worked for, etc.
The enemy is NOT waterfall, it is NOT BDUF, it is not even BHWNP (Blind Hacking With No Plan) it is those who would drive any _reasonable_ iterative process out of balance toward one of the extremes because they don't understand the needs of others on the project. Business people, managers, customers need vision into the project to verify their requirements are being met - C# code does NOT provide that vision. Developers need to verify that ideas they have in mind will work and need to prototype. The middle ground that ALL should be happy with is the various layers of design so well described in the UML and in Cockburn's Writing Effective Use Cases.
Although it is not fair to condemn Steve Yegge's comments based on only the knowledge gleaned from Jeff Atwood's initial post here, I would suggest that Yegge is claiming Agile to be too much process - he is taking an extreme position of "let the developers discover cool project ideas without business/management interference". This is likely the exact reason that Google has only one highly recognizable/differentiated product - the search engine that was driven by a clear business need so long ago.
While it's probably not relevant to the discussion, Tom S., that site is pure genius.
Johan Tibell wrote
Why choose Enron? There are plenty of companies with a similar strategy.
Did you read the linked articles? The comparison to Enron was that Steve Yegge's descriptions of the culture at Google are eerily similar to the descriptions in a 2002 article of the culture at Enron.
Neil gives a good example of other similarities in his comment.
Woh, way too many comments to digest, so I will just say what I would have said if I had been the first to comment... we are talking methodologies here, and there have been many in the past, many in use now, and more to come... I have personally worked using upwards of 20 different methodologies or variants over the years, have lived through introductions of new methodologies several times, and led a project myself to implement a new methodology... the issue is not which methodology is the best, it is which one is best for you, your project, your company. If Waterfall or RUP or IEM or whatever is working for you, don't change to a new methodology just to be trendy... if your current methodology is NOT working for you, then turf it for sure, but be aware that this takes some time... The point of a methodology, in the end, is not control or command or standards or rules, its about communication, within the team, out to customers, and on; a methodology gives a set of terms, methods and guides so that everyone on the project knows what it means when you talk about Requirements, or Design docs, or Specifications, or Phases, or Iteration, or Prototype, and on and on.
How else can I say this? If Waterfall is, in many people's view, the equivalent of a Mack Truck, and your project just needs a VW Bug, don't buy a Mack Truck... but sometimes you need a Mack Truck.
I think it's tempting to fall into what Eugene Volokh called the "reverse Mussolini fallacy" (http://www.volokh.com/posts/1133568627.shtml) -- that since Mussolini made the trains run on time, and Mussolini was bad, then making the trains run on time is bad.
In this case, Enron had a star culture, Enron was bad, ergo star cultures are bad.
But I think the thing about Enron was that the star culture was built around perceived talent and potential as manifested by degrees. It seems to me that Google, and the software industry in general, measures stars by what they've actually accomplished, rather than what diplomas are hanging in their office.
Mr. Wright is definitely Mr. Right this time.
Whether BDUF, waterfall, agile, or foobar is best depends on external circumstances.
If you're working internally directly with the direct users (or a group who can officially take the rap), then Agile can work. Not "only will work" or "must work", but "can actually be done effectively".
Otherwise, with any other user type, it can't work. With any other user type, there has to be a paper trail of culpability. Has to be, alas. Note that Joel's business fits this circumstance.
The other externality that really matters is the nature of the application. If you're doing something that's amenable to taking some input, storing it, and regurgitating it in some form; then you'll make a better application (no matter the development paradigm) if you take database design seriously. If you're doing a data based application and take the attitude that any knucklehead coder can specify a bunch of SQL tables; well, you'll fail. This can be done iteratively, and so on. If data isn't the issue (you're doing something that really is all code, like a device driver), then do what you want. Again, the paradigm is secondary.
And, lastly, there will always be iteration, if only fixing bugs. It's merely a question of local feedback vs. global feedback (for those with an electronics bent).
You see, I don't get to work with talented people.
So, most unit tests I've seen are the worst form of cherry-picking, and are used as an excuse to ignore reports of logic errors.
Most iterations are just the regular old "death march" in fast forward, and result in nothing ever getting merged, even if it's important.
Pair programming is social terrorism; if your colleague is too shallow to even try to understand your code, it's "stinky", even if you're right and can prove it.
The items on the index cards have meaning only to a tiny minority of the team; attempts to refactor and simplify can and will be shot down because they might require a reorganization of work units.
Agile in the hands of idiots is worse than Waterfall. Much, much worse.
"You folks need to get state of the art. If you'd been to Waterfall 2006 you'd be singing a different tune."
I do look forward to Waterfall 2007. The theme of "Over the falls chained inside a barrel" just grabs you and tosses you over the edge.
Whew! Seems like a lotta Agile bashing going on lately. I can't help but to wonder whether those who bash it intensely have practiced it in earnest. Ron Jeffries' "Extreme Programming Adventures in C#" gives a great insight into just how it can work.
I don't practice it currently, but have, and know it does work when done properly.
Someone commented, "Look at other mature engineering disciplines and learn from them; software development really isn't that unique." Well, I just happened to be reading about Denver's massive transportation project (trexproject.com), and it sure sounds a lot like agile: "... design-build... allows the project to begin construction while completing design".
Nonetheless, I agree with Stuart; "Do as much design as is needed to get you started."
Good post. I read Steve's post last week and I have to agree. It's more a rant about how great Google is as opposed to how bad agile is. I mostly read it for the Google tid-bits.
Too much of anything is bad for you.. agile or waterfall.
Well, I liked Steve's article even if it was a bit one-sided. Having said that I rarely see pragmatic Agilists writing, I only ever seem to encounter the [apparently] rabid sort. I suspect this is because the pragmatists are just getting on with things and leaving everyone to work things out for themselves.
I don't believe in BDUF and have never worked in a Waterfall environment, even though many of the other engineering disciplines around me seem to think they do. The hard fact is that whenever a design activity happens, Waterfall goes out the window. Having said that, once you commit an electronic circuit to a PCB you don't have much opportunity for refactoring your design. The only tools available to you are the ability to break a circuit with a knife and the ability to solder wires directly onto pins which should have been connected to a component but weren't, because you screwed up the net names on your schematic and didn't pick it up in testing.
Having said that, once you get to rev 3 of your PCB, it looks pretty good. That doesn't stop mgmt from wanting rev 1 to be perfect.
It does show, however, that design is hard and even with the greatest tools in the world you can still make simple mistakes.
So what has this got to do with Agile?
If Agile is about people and relationships I am all for it. If it is about some mystical process which I have to breathe in and become converted to, I think I'll pass.
I have managed to avoid most religions up to now. Some might consider my use of tabs/spaces and my (rather unique) alignment policy to be bordering on religion, but I do not; at least not until I write a book about it!
Personally, I am all for adaptability. I have never seen a specification set in stone or even concrete. I expect it all to change.
Just don't expect me to keep to my original schedule if you change what you want.
And even then ... watch out! I am notoriously bad at estimating things I haven't done before.
Unfortunately, I live in the world of the schedule. We need to bring something to market by X because that is when the trade show is. And it is one of those every-second-year shows, so it is big *and* important. Well, X means finishing by W, which means feature-complete by V. That's only a few months away. We'll hardly be getting started.
So let's talk about reality.
You don't really want an estimate, do you? You want me to hit a target.
So let's talk about scope.
OK - I guess I feel compelled to defend BDUF - in certain circumstances. The company at which I work produces software for safety critical embedded systems. We have verification teams separated from design/implementation teams, not merely because we want to, but because we *have* to - you need to ensure independence of verification from implementation. So...how are you going to communicate the intent of a piece of software without a design? Look at the code? Don't think so. In addition, when you have to demonstrate traceability from customer requirements down to code, you need intermediate forms to facilitate that.
Still, we do do iterative delivery (always have done, really), so that makes us agile-ish :-)
And we're big on unit testing - so much so that a Paul Hogan paraphrase is needed when looking at TDD - 'you call *that* unit testing'? :-)
And when I write non safety critical software (which is most of the time), BDUF goes out the window. Do as much design as is needed to get you started. Leave it that.
Couple of points here:
1. Joel, Steve, Microsoft, and Google are right: talent matters hugely, and trying to disprove decades of evidence in several industries by citing one counterexample - Enron - is a pathetic argument. It's pure argumentum ad Hitlerum. Enron didn't fail BECAUSE it hired talented people; it failed because (a) it was thoroughly corrupt at the upper levels, and (b) it used MBAs as the sole indicator of talent.
2. I personally don't subscribe to the BDUF or the NDUF (No Design Up Front) methodologies; I am a firm believer in KHTDYFJ (Know How To Do Your F$cking Job).
Any developer worth his salt knows that the right amount of up-front design (as well as the right amount of up-front specs, prototyping, architecture, etc.) depends very heavily on context.
It's so easy to claim that XP/Scrum/Agile/etc. is "still better than the waterfall" because:
[a] Programmers get to do more of what they like (code) and less of what they hate (plan / communicate);
[b] One catastrophic failure can be disguised as hundreds of smaller "enhancement requests"; and
[c] The constant motion ("release cycle") proves that things are getting done, which they are, except that they should have been done months ago.
Agile is merely a nuisance if practiced by talented programmers with good managers. It's a disaster when used by average developers and profit-hungry execs. Either way, it's much, much worse than the waterfall. It seems to succeed in the short term, just barely, but hurts much more in the long term. I say "Anything but Agile", even if that means waterfall, but like I said before, I'll take KHTDYFJ any day.
Sometimes, you ARE gonna need it, and if you lack the ability to think ahead, your code is eventually going to end up on The Daily WTF.
The waterfall, although it has flaws, was used for many years for a reason. It works. Agile is the new thing, and it works too, but differently. I think that we need to look at both and really find out WHY each one works to get a real improvement.
Waterfall works because of a few things:
1) You know where you are going before you start
2) Separation of responsibilities has some advantages (at least between developer and tester)
Agile works because:
1) Requirements change; agile handles it.
My theory is a little different.
1) The designer, developer and business analyst should all be the same person. If you don't understand a problem, you won't make a very good solution.
2. The tester, requirements provider, and end user should be the same person. Who better to test whether it met the requirements?
3) Design is upfront, but from a what to do, not a how to do it standpoint. (Requirements and design are really just the developer discussing how technology can fit the business need with the user)
4) Parts are small enough to be completed quickly so that they can be tested before the whole thing heads off in the wrong direction.
Keep the user involved so that changes can be handled as they arise (agile), and keep the parts small enough that change is not catastrophic (agile), but know where you are going on each deliverable before you start it, and don't call it done until the tester (user) is happy with it (waterfall). Knowing what you are going to do, and what the project should accomplish, is the only way to set reasonable delivery dates or costs (waterfall).
1. Almost every tool the agile programmer uses was developed with waterfall. Waterfall therefore can't be the almost guaranteed failure they insist it is.
2. Most agile arguments are made against a caricature of waterfall that even places like IBM didn't practice. The author of this post makes that same mistake.
I'm at a loss to understand your "anything but Waterfall" mentality. Waterfall seems to be the only programming "methodology" [I hate the word] that pays anything more than lip service to the idea that software, like any other complex product, needs to be DESIGNED. And in my 34 years of computing experience, I've found that nothing leads to more productivity and responsiveness in a software project than having great design. Conversely, nothing slows down an IT department's response rate more than having to deal constantly with the instability, bugs, and other difficulties that are inherent in poorly-designed or non-designed software.
For many years I successfully delivered very complex systems using a practice that I call "schedule-driven Waterfall". Every quarter the business agrees to the requirements for the coming quarter's development. After locking in something that can be reasonably designed and implemented in 3 months, developers proceed to build what was specified, with the first month of each quarter dedicated to pure design, and with no need to be shooting at a moving target. The business (or the client) may request changes at any time, but it is understood up front that no such "late" changes will be implemented until the following quarter at the earliest. The reward for this maturity? Quarter after quarter my clients got something that I've never seen delivered by any project using any other methodology: a thoroughly tested product with zero known bugs, and a near-zero bug incidence thereafter.
This process works, but it requires discipline on the part of the business/client - i.e. the understanding that you simply cannot conduct successful software development in an environment of continually changing requirements. It is an adult, real-world view based upon real-world limitations on what can actually be done, and it requires the presence of grownups on both sides of the IT table. Which is perhaps why it will never be the fad du jour.
I read the Agile Manifesto, which supposedly defines the Agile mindset, and it seems like some sort of back-to-the-60s hippie thing. "Individuals and interactions over processes and tools?" "Responding to change over following a plan?" "Customer collaboration over contract negotiation?" Yes, and we're all going to have peace and love and get ourselves back to the Garden just because we want to. Right.
Sorry, I've been there, done that. Any project that isn't sitting on top of a great design and whose programmers aren't obsessively and constantly concerned with delivering great design is doomed to an eternity of low productivity.