August 4, 2008
Although I love reading programming books, I find software project management books to be some of the most mind-numbingly boring reading I've ever attempted. I suppose this means I probably shouldn't be a project manager. The bad news for the Stack Overflow team is that I effectively am one.
That's not to say that all software project management books are crap. Just most of them. One of the few that I've found compelling enough to finish is Johanna Rothman's Behind Closed Doors: Secrets of Great Management. She co-wrote it with Esther Derby.
After reading it, you'll realize this is the book they should be handing out to every newly minted software project manager. And you'll be deeply depressed because you don't work with any software project managers who apparently have read it.
I originally discovered Johanna when one of her pieces was cited in the original Spolsky Best Software Writing book. Her article on team compensation (pdf) basically blew my mind; it forced me to rethink my entire perspective on being paid to work at a job. You should read it. If you have a manager, you should get him or her to read it, too. (Update: this essay is actually by Mary Poppendieck, who is also great. I'm leaving it in the post because it's fantastic reading, even if it's a little off topic.)
Since then, I've touched on her work briefly in Schedule Games and You Are Not Your Job. But I'd like to focus on a specific aspect of project management that I'm apparently not very good at. A caller in Podcast #16 took me to task for my original Stack Overflow schedule claims way back in late April. What was supposed to be "6 to 8 weeks" became... well, something more like three months.
My problem is that I'm almost pathologically bad about writing things down. Unless I'm writing a blog entry, I suppose. I prefer to keep track of what I'm doing in my head, only anticipating as far ahead as the next item I plan to work on, while proceeding forward as quickly as I can. I think I fell prey, at least a little bit, to this scenario:
"Look, Mike," Tomas said. "I can hand off my code today and call it 'feature complete', but I've probably got three weeks of cleanup work to do once I hand it off." Mike asked what Tomas meant by "cleanup." "I haven't gotten the company logo to show up on every page, and I haven't gotten the agent's name and phone number to print on the bottom of every page. It's little stuff like that. All of the important stuff works fine. I'm 99-percent done."
Do you see the problem here? I know, there are so many it's difficult to know where to begin listing them all, but what's the deepest, most fundamental problem at work here?
This software developer does not have a detailed list of all the things he needs to do. Which means, despite adamantly claiming that he is 99 percent done -- he has no idea how long development will take! There's simply no factual basis for any of his schedule claims.
It is the job of a good software project manager to recognize the tell-tale symptoms of this classic mistake and address them head on before they derail the project. How? By (forcing) encouraging developers to create a detailed list of everything they need to do. And then breaking that list down into subitems. And then adding all the subitems they inevitably forgot because they didn't think that far ahead. Once you have all those items on a list, then -- and only then -- can you begin to estimate how long the work will take.
Until you've got at least the beginnings of a task list, any concept of scheduling is utter fantasy. A very pleasant fantasy, to be sure, but the real world can be extremely unforgiving to such dreams.
Johanna Rothman makes the same point in a recent email newsletter, and offers specific actions you can take to avoid being stuck 90% done:
- List everything you need to do to finish the big chunk of work. I include any infrastructure work such as setting up branches in the source control system.
- Estimate each item on that list. This initial estimate will help you see how long it might take to complete the entire task.
- Now, look to see how long each item on that list will take to finish. If you have a task longer than one day, break that task into smaller pieces. Breaking larger tasks into these inch-pebbles is critical for escaping the 90% Done syndrome.
- Determine a way to show visible status to anyone who's interested. If you're the person doing the work, what would you have to do to show your status to your manager? If you're the manager, what do you need to see? You might need to see lists of test cases or a demo or something else that shows you visible progress.
- Since you've got one-day or smaller tasks, you can track your progress daily. I like to keep a chart or list of the tasks, my initial estimated end time and the actual end time for each task. This is especially important for you managers, so you can see if the person is being interrupted and therefore is multitasking. (See the article about the Split Focus schedule game.)
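The chart Rothman describes can be as simple as a spreadsheet or a few lines of code. As a minimal sketch of the idea -- the task names and numbers below are invented for illustration -- progress is computed only from items that are truly finished, weighted by their estimates:

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    estimate_days: float   # initial estimate; break up anything over one day
    actual_days: float = 0.0
    done: bool = False

def percent_done(tasks):
    # Credit is given only for tasks that are completely finished,
    # weighted by estimate -- no "99% done" hand-waving allowed.
    total = sum(t.estimate_days for t in tasks)
    finished = sum(t.estimate_days for t in tasks if t.done)
    return 100.0 * finished / total if total else 0.0

tasks = [
    Task("set up source control branch", 0.5, actual_days=0.5, done=True),
    Task("company logo on every page", 1.0),
    Task("agent name/phone in page footer", 0.5),
]
print(f"{percent_done(tasks):.0f}% done")  # 25% done
```

Keeping the actual days next to the estimate, as Rothman suggests, is what lets a manager spot multitasking and interruptions after the fact.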
I'm not big on scheduling -- or lists -- but without the latter, I cannot have the former. It's like trying to defy the law of gravity. Thus, on our project, we're always 90% done. If you'd like to escape the 90% done ghetto on your software project, don't learn this the hard way, like I did. Every time someone asks you what your schedule is, you should be able to point to a list of everything you need to do. And if you can't -- the first item on your task list should be to create that list.
Posted by Jeff Atwood
It's almost scary how often people seem to forget that in order to know when you are done, you need to know what being done means. There is no way you can say you are 83% done if you don't know what 100% done would be. That's like asking a contractor how long it takes to build a house without showing them any plans or drawings. Also, if you're not the only one working on the project, everyone else needs to know what 100% done means as well. The fact that the definition of done tends to change over the course of a project (new features, anyone?) doesn't help either.
I think this is one of the well-known problems of software development that Extreme Programming, Scrum and other agile methodologies and techniques are trying to solve. Even if they can't help you predict the future, at least they can help you see where you are, and hopefully where you are headed--or should be headed--in the near future.
Listing what is actually to be done, in detail, of course helps one plan better.
But I continue to be amazed, year after year, how managers think that writing down a list with times to completion actually makes it so.
I am likewise amazed that most managers don't understand the inherent lower-bound bias in initial time estimates.
Jeff you are right about software project management books. EXTREMELY BORING almost all of them. I'll check this book out next time I'm at the bookstore.
Some people at my company are perpetual "we are 99% done" people.
It seems that after being wrong time and time again, people would say to themselves, "Hmm, I seem to be overly confident a lot; I should think about that the next time I give an estimate."
Of course, they don't. They continue to try to please management (but really just frustrate them) by making estimates that are unrealistic and overconfident.
Never estimate based on the best case scenario.
Attempting to put software to a schedule is analogous to solving Turing's halting problem for a given input.
Sometimes you can tell very quickly how long something will take. Other times it is so nearly impossible to plan ahead that you might as well just start programming. Sometimes it's somewhere in the middle. Sometimes the schedule shifts midway from trivial to NP-Hard, or vice versa.
Actually, it's quite normal that there are more requirements -- more stuff to do -- than the team is able to finish before the deadline. The Scrum approach would be to ask the client to prioritize the stories (tasks) and deliver the most valuable ones. Projects are usually evolving, users ask for new features... it's never 100% done from this perspective.
How are you supposed to know what to write on the list?
The only way you can write a detailed list is to design the system. And designing the system should take about half the time in the project.
That's not to say that a list of tasks isn't useful, but getting to the point where you can write a list of concrete, completable development tasks may just be the biggest part of the project.
I see really two threads to the responses here:
1. You're right and here's some more stuff on how to do lists and
2. I don't do lists, because it's too hard
I do think that there's some truth to the second, although I feel like a troll saying so. The problem is that a lot of the time, when you're writing exploratory software, when you sit down to write up a list, you really, honestly don't know what needs to be done. And you can't flesh that list out to tasks that you can clearly say are done before you start doing some things. The "define the I/O behavior of module X" task can't be drilled down, because it involves list items like "design the Y data model", and you don't yet know that you need a Y data structure.
I'm working on a project that's trying to use scrum now, and I'd have to say that while it seemed really agile in the beginning, it's now seeming like same old bad project management with generation of tons of tasks with ill-defined estimates for completion, etc. Precisely because we're doing tasks that need more work before they can be turned into lists of actionable items.
I'm starting to think that this is one of the software engineering dilemmas for which, as Fred Brooks said, there is no silver bullet.
Huh? The article you cite in paragraph 4 was written by Mary Poppendieck, not Johanna Rothman...
I don't understand why this is so difficult for most developers to do.
We're all so logical, right? Lists!
The other problem is analysis paralysis: you get so busy trying to make decisions that you never make any. Sometimes it is best to just get on with stuff and then use your understanding of the problem to do the estimate properly.
Jeff - have you looked at any of the anti-patterns stuff? I read the book years ago and it made sense then; the main thrust was to identify what kind of problem you suffer from and how to fix it. Worth a look.
I wonder what it is about software developers (or our managers) that makes us so prone to being 90% done?
I chalk it up to the desire to please, both ourselves and our managers. The way I see it, this will persist so long as the boss is more excited to hear "it's 90% done" than she is to hear "I've detailed the task and I estimate I'm really only 25% done." The latter means your deadline is going to slip, and that's a reality that a lot of people will resist.
It's surely more immediately satisfying to convince yourself you're 90% done than face the reality of the situation. I think for most people, getting past that rush will require having a manager that's honestly appreciative of being presented reality, and displeased when presented a fantasy.
Create a list of tasks? You must be joking. I've got all this real work to do. ;)
Jeremy, stacks are obviously most programmers' favourite. You get paperwork, dump it in a pile, dump some more on top of it. I always tell my boss it's date ordered. :)
How's that list coming?
It's 90% done... just the little things I haven't thought of and don't know about remain.
Have to agree. I only ever track tasks at 0% or 100% complete.
If anything more fine-grain is needed then the task can be broken down further.
The Modified Pareto rule for software projects:
When you are 80% done, you will spend 80% of the time used up to this point on the remaining 20%.
This is a recursive rule, so you will actually never be done - although you probably will get to a point where you consider yourself close enough to done to actually ship.
Trouble is, you have to maintain the list, adding all the things you didn't realize you hadn't thought of.
It might well be that new stuff comes up, that has to be added. I find it has a tendency to add about 10% to the overall estimate.
One day, I'll get myself and my users well-enough organised to be able to use stories and iterations (like XP) or sprints (Scrum) or something similar.
Yeah, let's map out all the features and break each feature down into tasks. Whack that into a plan and what do we have? Waterfall!
The real problem is that software development quite often involves invention. As Cockburn puts it: A cooperative game of communication and invention.
Often, the only way we know everything we need to do is to actually do it.
Not wishing to label everyone the same way, I think the dislike of lists and scheduling is fairly usual with software developers.
This might just be down to the idea of "If I could write down everything it needed to do, then I'd have done it already!"
The advantage of Software as a Service is that a 90% complete solution can often be put live with the last 10% being mopped up as required.
This ties in well with a developer's dislike of lists, as the 90% list is a lot easier to produce. Ask for a list of everything needed to go live and a list of everything needed to be complete, and one will happen a lot quicker than the other.
As long as everyone at all stages is well prepared for this 90% mentality then it's actually a good morale and performance boost. It usually falls into the possible side of things rather than the impossible.
Then when it comes to hitting the final 10% as a separate entity it also now falls into the possible side of things.
It's a Divide and Conquer mechanism for software development that I've always found extremely productive and nice to work with!
If you do want to create a list, make sure you know what 'done' means for each item. If not, your list is just as useless as not having a list at all. Worse, you've tricked yourself into believing that you have one.
I am really tired of seeing such useless lists from project managers...
I find one way to have a better idea of how finished I am is to try to completely finish a task before I move on to the next one (or finish it as much as I can before I need to wait for something). I remember feeling blind panic when I was working on something once and I'd done all the major work but none of finishing up and had no idea where I was in relation to the Gantt chart I'd drawn up with my manager at the beginning.
lists are good but they suffer the same flaw as memory - you can forget to put something on there. :)
I'm surprised this (making a list) is considered anything more than common sense, to be frank. If someone has to be told this, they probably lack the natural aptitude to manage a project... still, I guess we must respect the training courses etc. It's the way of the world.
Although I will say the biggest project I work on with a list of objectives has slipped and slipped and slipped due to my failure to anticipate how much programming I am really going to do in my spare time.
Moral: bad programmers can still run late with good lists. :)
I started having the same problem in my side project -- we were concentrating on the big items, forgetting about the smaller ones. I tried writing things down, and after two months of trying we ended up on Trac, integrated with our Subversion source control.
Very nice way to go; you can add things to be done in a flash, from anywhere, so that you don't forget them.
Yep, a list of tasks or a to-do list really helps me do my work and know how far I am from finishing. I keep it on paper for now and always add items as I find out about them.
It's a good incentive to keep on working too when you see the list getting smaller and smaller.
Having a list is excellent, but you have to make sure your team understands that you cannot check off a single item until it is truly done.
Not "code complete", not "done but still needs tests", or "done but needs refactoring". Not done means you delivered zero business value, and thus you cannot show any progress (doing so would be lying). If you show the illusion of progress, then you aren't painting the right picture for your stakeholders.
It's amazing how many programmers consider something done and then still work on it for hours (or days) longer to polish it.
I'm 90% done reading this blog.
Well done, Jeff.
I'm glad you owned up to your project management blooper. While I agree with Joel that this project's slippage is not a major issue in this case, the average developer is often faced with projects where schedule slips are disastrous. Thanks for setting a good example to all the shoot-from-the-hip estimators.
I think many a developer's pain in this area comes from being poorly managed. They are asked for a rough estimate on how long it would take to do x. They provide an overly optimistic estimate, and the project is approved on that basis. If they develop a realistic schedule and present it at this point, they are often harshly criticised for their inaccurate rough estimate, and the criticism continues through to the inevitably late delivery. If there is no schedule, however, the developer can continue to claim that he is 90% finished, and defend any unforeseen problems that may occur.
I have learned that the only way to counteract this pattern is to refuse to be drawn into giving a rough estimate for anything, and always asking for a complete requirements spec, whereupon I make the schedule and give the schedule as the estimate.
Keep on blogging, young Jedi.
Don't forget to include a line item in your time estimate for BSaP.
That's B.S. and Politics. All of the things that suck up your time, but have nothing to do with actually producing the end product. Meetings, convincing execs that the tool they heard about in some tech journal may not be the best thing for this project, etc.
Ha. I work at Nucor (referenced in the Poppendieck article). It's a great place to work for the most part. There are problems with the incentive system, but it's far better than the alternatives. Basically, what Nucor has done is get all of its people focused on making more money for the company. If the company makes more money, we (the employees) all make more money. It's clear how to make more money... produce more steel! It's not a principle that would be easily applied elsewhere.
I agree; this is the only way to have accurate estimates and prevent the 90% done problem. I believe she wrote an article a few years ago and coined the term "inch-pebbles" for these tasks.
And once you get your estimate, you still need to factor in the unknowable unknowns that pop up as you code.
You're one awesome dude!
I listened to the podcast where that caller called you out for being behind schedule. I listened to Joel being the prick he sometimes is when he teaches others. Although I agreed with both of them -- you should have kept track of your tasks -- I don't think they taught the principle effectively. In spite of them, however, you had the guts to recognize your mistakes. I wish you success in trying to correct them.
I'd like to see a success story of this system being used on a real project.
If you don't have a list, you never know when you are done or if you're late.
Most people don't update their schedule when they update their list.
If I may make one suggestion....
First, Jeff... I loved your article about quantity vs. quality, and I think it applies here.
I'm a developer, like most people here, and I build a lot of software. However, unlike probably most of the developers commenting on this blog, I have a team of ten people that I employ and I'm the head developer, PLUS I'm the guy at the end of the day going "Criminee... I have $30,000 in payroll due tomorrow."
So the challenge for me has been: how do I get organized and sharpen my skills at the same time? I build software that keeps me organized!
I'll tell you what; build a to do list or two that meets how YOU like to work and it'll be worth it. As my skills and needs have changed, so has my to do list program. My program meets MY methods of organization and as those needs change so does the feature set.
Just a suggestion.
I'm surprised people gripe about the delays. Anyone who has ever worked in software should know that delays are practically a way of life.
List or no list, software is still a tricky beast!
(Yes, you can minimize them, but come on!!! they still happen!!)
Keep up the hard work Jeff. I'm excited to see the site whenever it's ready.
I usually tell my team that if the word "done" is followed by a "but", then it's not complete!
I usually hear things like "it's done, but..."
I think that the problem with all this project management is that you'll be stuck at 0% for a long time trying to map out everything -- time that could have been spent doing a POC or base implementation, which in the end will give you a better idea of the actual requirements. "Oh, I'm 100% done, I already have the requirements and implementation plan" -- and then BAM, you encounter the issues that only come up in development and have to replan again.
I'm not defending all action and no planning, just saying that too much planning will kill a project.
BTW, i think this post of yours is actually in conflict with the previous one.
I don't know if you noticed, but on the first article you linked to, it specifically says that it is outdated, and no longer valid, and specifically says not to read it.
this is also one of the cardinal rules of GTD, without a to-do list you can't finish anything, software development is not an exception.
I found Software Estimation: Demystifying the Black Art a good read on this topic. Although by definition boring to practitioners, Steve McConnell's emphasis on empirical data points and the avoidance of guesstimation makes the content far more applicable than similar textbooks I've come across.
My lists tend to have a status entry for each actionable item (ie my actual assignments, rather than individual tasks that lead to the assignment being complete). In my current position, the status reads as follows:
User Acceptance Testing
and then if it passes all of those steps, I label it Production and move it to a different list, where I can pull back the details and notes I took on that particular item if something slips past testing. I also keep a list of every file I change while I'm working on that item, because I don't want to forget to check some little change to one file into source control and then spend more time looking for it later when it won't pass test or peer review.
Yes, I did notice.
But, I think that the first has a few interesting ideas that aren't in the second. It seems more targeted towards individual developers -- it tells you how you can manage your own time and come up with your own estimates more accurately than before, all with a _really simple_ spreadsheet.
The second, evidence-based scheduling article seems to be more targeted towards project managers who oversee a number of developers, are far enough away from the estimates to do meta-analyses and the like, and have the time and incentives to create the interesting graphs and velocity measurements. I think a lot of individual developers would look at the second article and think that it's a lot of work for them, more work than they can justify, if they just want to improve their own work.
So while Joel would like you to not read the first article, I still think it is worth reading, especially for individuals. If you've got a manager who's willing to implement the ideas from the second article, great. If you don't have such a manager but want to improve your own workflow anyway, then I'd say the first is more relevant.
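For what it's worth, the kernel of the evidence-based approach can be sketched in a few lines: keep a history of (estimate, actual) pairs for your own finished tasks, then simulate new schedules by dividing fresh estimates by randomly sampled historical velocities. The numbers below are invented for illustration:

```python
import random

# Hypothetical history of (estimated_hours, actual_hours) for finished tasks.
history = [(4, 6), (2, 2), (8, 16), (1, 1.5), (3, 3)]
velocities = [est / act for est, act in history]  # < 1.0 means you ran late

def simulate_ship_hours(new_estimates, trials=10_000, seed=42):
    """Monte Carlo: divide each fresh estimate by a randomly sampled
    historical velocity, then report the median simulated total."""
    rng = random.Random(seed)
    totals = sorted(
        sum(est / rng.choice(velocities) for est in new_estimates)
        for _ in range(trials)
    )
    return totals[trials // 2]

# The raw estimates sum to 16 hours; this history says that's optimistic.
print(simulate_ship_hours([5, 3, 8]))
```

The point is that the correction comes from your own track record, not from a multiplier you made up on the spot, which is exactly what makes it feasible for a single developer with a simple spreadsheet.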
Well, I just gave the link to one of my PMs and he said, "There's a problem: I've never read a book!"
Why does that not surprise me? Does it come in an audio version!?
From what I have seen the reward cycle in IT has a number of problems:
1. The manager normally has no concept of how subordinates do what they do, so is normally incapable of realistically measuring success. This is NORMAL in IT, because the manager is busy managing, not becoming a technical expert. So judgements on rewards almost always come down to perceptions, and perceptions are almost never accurate.
2. Teams are more concerned about lack of performance by other members than they are about individually being singled out for praise.
In both situations, a 360-degree process helps, where customers/clients/users, managers and team members all get to rate others. This way a manager may "love" a sycophant, but team members are able to report contribution and actual skill. Similarly, this gives the subordinates a chance to rate their leader. It is all done anonymously to avoid reprisal. In my experience this is the most accurate measure, as it removes one person's perceptions as the basis of assessment.
On the point of money, a couple of bonuses/raises in a row can become expected, so that a cycle of no raise can actually be seen as punishment.
Way to man up and own the responsibility, Jeff! This was not only a good post in its own right, but it was very humble and honest of you to respond to being called to task on your scheduling. Thanks for putting yourself out there.
Part of the problem I've had with making lists of what I need to do is the hubris of either "I *know* what needs to be done, I don't need a list" or the aforementioned "by the time I make the list I could be DONE with it!" But, honestly, there are some things I probably don't want to know that I have to do, and writing them down in a list is not only acknowledging them, but is a form of committing to them :)
However, going from development to management to development again, I find myself WANTING a list of prioritized tasks. Partially because I want to know that what I'm working on is important (even the small stuff) and that my leadership (manager, VP, CEO, etc.) have all *THOUGHT* about what needs to be done, given a value to it and consciously decided what is more important than what.
That frees me up to not be worried about the stability of my job environment and to just code like a mad dog (...who's focused and directed and deliberately working on important things. Bad analogy, I know).
Anyway, excellent and thanks!
A slave's work is never done -- jam tomorrow, never today.
My rule is to make the most sober and pessimistic estimate you can and multiply it by two. Jeff, I'm sure you thought your initial estimate was realistic. If you had used the x2 rule you would have been dead on.
I did this once for a client I was working for. When I told him the estimate he said it was way too long and cut the estimate in half to what he felt was more realistic. Guess how long it took? :) Even when I'm right, I'm wrong.
I always work from lists when I'm trying to finish up a project. Once I feel like I've done all the major stuff, I start listing all those annoying little fit-and-finish jobs in order to wrap it all up. The funny thing is that if I start with a list of 100 items, by the time I've got 50 done I've accumulated 50 more. So even that estimate is crap. "Yes boss, I did half the work, but I'm only 33% done." That's about when I start triage -- next release for you and you, etc.
At least if it's on the punchlist it gets done (eventually).
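That percentage isn't a typo; it falls straight out of the arithmetic once the denominator grows underneath you. A trivial illustration, using the numbers from the anecdote above:

```python
def fraction_done(completed, discovered_total):
    # Completion is measured against everything known *now*,
    # not against the original list.
    return completed / discovered_total

# Start with 100 known items; by the time 50 are finished,
# 50 new ones have surfaced, so the denominator is now 150.
finished, initial, newly_found = 50, 100, 50
done = fraction_done(finished, initial + newly_found)
print(f"{done:.0%}")  # 33%
```

Tracking the discovered items explicitly, rather than silently absorbing them, is what keeps the reported percentage honest.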
Jeff, I was surprised to learn you were 37 (from the stack overflow site) -- many of the articles you write and things you say (here and the podcast) make you come across as a much (much!) younger, inexperienced developer. For example, I'm almost pathologically bad about writing things down. How'd you get this far in the development world without being called out on that by your bosses?
Would somebody tell me how to estimate the time needed to debug?
@dwj Obviously the wisdom that comes with age that you refer to doesn't include tact.
I was once taught (by my former programming teacher; he worked for big software companies in Germany and Japan) the most essential truth about software development:
You cannot estimate how long a programming task will take; period!
I know that there are millions of people who disagree, but I think this is absolutely true. Let me quote some more:
Programming is not like building a house over and over again. If you build your first house, you have no idea how long it will take to build a wall, make the roof, or lay one square meter of floor. Once you have done all this, you know how long it took, and when you build your second house, you know that it will take about as long as it did the first time, if not less, since you get better at a task the more often you do it.
Every piece of code you write is a *new* piece of code; you are never repeating the same task again. Why would you repeat the same task? If you ever need the same code again, you copy and paste it; that will take you about 5 seconds. If you ever need the same application again, you just copy the one you wrote last time.
A software developer faces the problem that he needs to write new code every day, as there is no point in writing *old* code again. So every time the task is a new task for you. It's like you are building a house today, a car tomorrow, and creating a lovely garden the day after tomorrow; and you are always doing it for the first time. How can you estimate in advance how long you will need for it?
This is so very true! After years of development I can assure you, I'm doing something new every day. I'll continue to quote:
Thinking you can estimate a task you have never done is just an illusion. You can make a vague guess, sometimes it's good, sometimes it's horrible. Sometimes you need twice the time and sometimes only half the time you estimated.
Actually, all my time estimations are... well, I just make up these numbers. I have no basis for them. I could roll a die and the numbers would be as accurate as they are today. Quoting again:
Programming is like art. Ask an artist how long his next painting will take. Do you think you can get a reliable estimate? His picture is done when he considers it done. He can't say in advance how long it will take. He might be 99% done and then decide some things need to be repainted, and within a couple of minutes the state falls back from 99% to below 80%.
There will be a day when software development companies realize that. That's why open source software is often better in aspects like performance or security; it's not time based. It's done when it's done. The developers themselves decide when it is called 1.0, not some marketing department or some supervisor. If they consider something a nasty hack, they'll fix it, even if this means it takes another 3 weeks for the software to be done.
As odd as this might sound, I personally would prefer the time frame to be set from above. I can surely make a list of all tasks to be done and show it to my supervisor every day, but I can't say how long each of these tasks will take.
It's a different situation if my supervisor says, "Okay, for that, you have one day; for this one, it's very important, you may take up to three days; and for this tiny one, investing more than half a day won't pay off." If I know that I have only one day for a task and I start in the morning, I know it needs to be done before I call it a day. That means at the end of the day it's done. Whether it's good or not, whether *I* would consider it done or not, doesn't matter. It's done, because the time frame says it has to be done, and if it's far from perfect, far from good, far from reliable... it's not my problem. The time frame dictates when it has to be done.
If my supervisor then looks at the result and says, "I'm not happy with it. This is too unstable, too slow, looks too much like a hack," I will reply, "Well, you only gave me one day; that's the best I can come up with in one day. Give me more time and I will make it better." Then it's up to him to give me another day, or maybe two, to make it really good; or maybe not, and it will stay as it is right now.
When you try and determine how long a project will take to complete, you must also factor in how long your analysis of how long the project will take to complete will take to complete.
I like id's approach... When it's done.
Or if you're forced to make a projection, project high and look like a hero when it comes in early (or worst case, a good projector when it comes in on time).
I must suk.
I'm a relatively junior developer (read: poor architecture skills), working with a beta technology, with no real guidance. It's no wonder the project I'm on is taking forever. I mean, I guess I kinda know what I'm supposed to be building.
I think I'm gonna sit down and make a list of all the stuff I at least know I gotta do. Then figure out how to do it.
Many times when someone asks how long something will take, I want to respond with "how should I know?" Mostly because in the ugly 3 years of a career I've had, I've never had a project manager or technical lead who had any idea what the architecture / end product was supposed to be.
When I pull a number out of thin air, no one calls me on it. It's really bad on my part. Then again, when I struggle there has never been anyone to help... so it's probably no wonder this project is failing and my last one did too.
Anyway thanks for this post. It reminds me that I don't want to suk and motivates me to be better.
Very nice post.
It is the job of a good software project manager to recognize the tell-tale symptoms of this classic mistake and address them head on before they derail the project. How? By (forcing) encouraging developers to create a detailed list of everything they need to do.
I agree. My first thought, however, was to take Tomas up on his offer to hand it off as only 99-percent done. I don't know him well enough to know how well he normally estimates. (In fact, I couldn't know him very well given he'd been allowed to get into this situation.) But I do know that handing-off, or releasing, is the most effective way of shedding light on a situation. Once he has handed-off his code, other eyes can be added to check for feature-completeness, and double-check his own residue list.
Further, rather than focusing on detailed lists (which is also necessary), I would restructure Tomas's work so that he is forced to hand-off, or at least demo, every day. Especially, as he moves forward to do his cleanup, as these are all mini-features that should be independent and immediately obvious.
What I used to tell my students is to try and find a way to organize tasks so as to be able to show your manager some visible improvement every few hours (e.g., every half day) in case he or she walks into your office and asks how it's going. However, I told them that the main reason for organizing your work this way is for your own benefit. There's nothing like seeing new features (or new spec paragraphs, or help doc pages) every few hours that you can point to to build up your own morale (even if the boss doesn't walk in the door), and hence improve your own productivity. And it has the added benefit of getting you to think in terms of a list of things that you need to do.
My best friend is MS OneNote. Every time I think of something else that needs doing, I add it there with a checkbox. I've got all my projects organized nicely in one place. I also note all the steps I took during construction or bug fixing. This kind of organization and documentation is something I can't live w/o.
I think you're missing an extremely important component, and that's making sure the items on the list are all tangible things that can be verified as complete.
So instead of

"Finish creating database tables"

write

"Create User, Customer and Address tables according to spec 1.1, and select a demo record from each using Query Analyzer."
The difference is very subtle, but critical. Programmers (and I've been guilty of this too) tend to generalize things way too much. We end up with huge buckets like "Finish adding corporate logo to website." Well, what does that mean? In all the headers? And footers? And that Flash movie showing how cool we are? And the Apply Now page? And the Directions page?
It's never clear when you're done. And if it isn't clear when a task is done, then it'll never be done; it will always exist in that 99% finished state until someone just arbitrarily closes it.
Granted, you'll completely forget certain tasks, so they won't be in the original estimate, but that's how software managers grow and develop: they become better at breaking down the complex tasks involved in software development into granular, verifiable deliverables that can be tracked and benchmarked against.
I think programmers are inherently bad at time estimation for all computer tasks. Usually a programmer has in their head how they are going to solve a problem, fix a bug, or finish what they have left. This automatically puts it in your mind that it is an easy fix and will take no time at all.
One of the things I find myself particularly bad at is report formatting, such as in Crystal Reports. The reason is you find yourself going "I will just move this," then you test; then you add a field, then you test. I can find myself taking an hour formatting a report where in my head I had just said "I just need to tidy up that report a bit and make sure everything looks OK." I suppose this applies to most GUI stuff. This is one of the bits of the job I find tedious; I usually feel I have done the bulk of the work, and these are those last 10% jobs.
I stopped reading that .pdf at agile development.
It's all fun and games until you actually have to meet an estimate.
Programming is often like a bad RPG. You go to the castle just to find out the princess is in another castle. You go to the other castle to find it locked, but the key is held by the guardsman. You talk to the guardsman to find that he loaned the key to his son, who hasn't returned from the deep dungeon for a long time. You go to the dungeon and rescue the boy from the scary monster to find out he dropped the key on the way to the dungeon. You backtrack to find a note on the ground by someone who has taken the key back to the castle; the key is in the castle so you can't get in. You talk to the guardsman to find out another way into the castle, but it requires explosives. The explosives got stolen yesterday by a thief who lives in a cave by the lake. This thief would love to trade the explosives for these really wild berries that are located in the poisonous forest. In order to survive entering the forest you must first get magic serum, which is gained by ....
I think you get the picture. The point is, you don't often know the real task list right away. This is especially true if you are integrating a familiar system with an unfamiliar one. If both systems are unfamiliar, the task may very well be impossible.
Skip this true experience of me using ANTLR if you're not interested:
I was excited to find a parser generator to properly break up code for my project. I found out there was a book for it, so I ordered it through Amazon. I begin by downloading ANTLR, which is archived. I find I have to update my Java to properly extract it from the archive. All I need ANTLR to do is replace the function that breaks up code into a syntax tree. I download a grammar and run ANTLR using the example given. Instead of getting a tree, I get a flat tree of tokens (a linked list). I read about generating 2D syntax trees using rewrite rules; however, after trying to get the tree from one rule, I find problems with the rules and ultimately decide this won't work. I search some more and find an example of an ANTLR parse tree generated from -debug mode. I find this actually meets my needs, but the parser puts every rule it tries to match (as opposed to the ones it actually does match) into its tree. This means I have to weed out duplicates, based on the start character and end character of the token. ANTLR looks like it has functionality to give you those values for all tokens; however, on parser tokens it will just return 0: you have to calculate the start and end of a parser token yourself (and put the parser token into a container with those values, as the set function for those values does nothing). Armed with all this, now I can properly integrate the ANTLR output. However, it is in a slightly different format now that it uses nodes instead of raw strings like before, so I have to do some more work. What I had originally estimated at 1 month (including testing) is now 4 months in, and I am finally testing the last part of the integration.
ANTLR is a fine tool, but I felt like I was going through a bad RPG in order to use it. I can't think of a way to write an estimate that takes adventures like these into account.
We've been using the Scrum agile methods for about 5 months now.
We've got to split up our work into tasks that are at most a day's worth. If you do fail to estimate the time a task will take, at worst you'll only be off by a few hours.
"Once you have all those items on a list..."
If only that was possible. The thing is that the list keeps changing. This is not necessarily a problem - that's what agile techniques are for.
The real problem is thinking there is such a thing as an accurate estimate for a big software project.
IME the best you can do is estimate user stories or use cases in story points (**not** days - programmers are notoriously bad at estimating time), track your velocity in terms of story points/day, and predict from that how long to do all the things that are currently in your list. If you haven't already, look up burndown charts.
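The velocity arithmetic this commenter describes is simple enough to sketch. Here's a minimal illustration in Python; the function name and all the numbers are made up for the example, not taken from any real project:

```python
def forecast_days(remaining_points, completed_points, elapsed_days):
    """Project the days remaining from observed velocity (points/day).

    Velocity is measured from work already done, so the forecast
    improves as the history grows; it says nothing about a list
    that keeps changing underneath you.
    """
    velocity = completed_points / elapsed_days
    return remaining_points / velocity

# e.g. 30 story points finished in 20 working days, 45 points left:
print(forecast_days(45, 30, 20))  # → 30.0 more working days
```

The point of estimating in story points rather than days is that the conversion to calendar time happens through the measured velocity, not through anyone's optimism.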
The whole issue of what "done" means for a task needs to be addressed too. Too often programmers think it's just delivering code, and don't consider testing, code reviews, documentation, etc.
Would somebody tell me how to estimate the time needed to debug?
You don't. Debugging should be a part of the construction/programming activity. Unit tests and continuous integration will help you on the way, but if the task takes longer than expected--because you spend a lot of time debugging--chances are the task is more complicated than you first estimated, or you chose the wrong implementation or design approach. Either way, you should re-estimate the task or break it down into new, smaller tasks and estimate them instead.
I have never been able to understand where this percentage-completion idea comes from. I worked for Nationwide Insurance for 7 months, and those guys kept asking what percentage you had done. Obviously, I was never able to answer. I mean, how can I tell that this home page I am designing is 50% done when I have just started coding? Software development is not like baking a cookie, where you can precisely tell how much more time it will take in your microwave.
In Scrum (mentioned by a few others), the list of things to do is called the backlog, and it works wonders. I work in a development center where all projects use scrum, and it's working very well for us.
I guess it's a developer's disease not to plan ahead, because it takes the time that it takes and we don't like to feel pressured to complete the work we've committed to. ;)
Actually, you can give a percentage estimate! Not precisely, though... In the company I work for, we've used Scrum for nearly 10 months now, and besides some teething troubles, it serves us well. We're forced to think of the tasks a user story needs in order to be completed (that is, the code is tested, documented, possibly reviewed and demonstrated). Tasks are estimated in hours, considering possible know-how transfer and pair programming. If a task takes longer than 16 hours on the first estimation, it has to be broken down into smaller pieces where applicable. We tried to use story points for estimating the user stories, but gave that up in favor of ideal man-days. (We found it really hard to measure complexity, workload, etc. across 3 teams with different scopes.)
Since adopting those agile methods, seeing the tasks on our board and checking on them daily, we have massively increased our effectiveness!
I like the analogy - entertaining and pretty accurate.
The challenge with making a complete list is that you can't anticipate everything, and it takes too much time anyway. It's waterfall, as someone else pointed out.
In agile methods you tend to have a backlog. The key is NOT to list ALL the tasks at the beginning.
Requirements basically come in as coarse-grained requirements (CGRs), which are broken down into fine-grained requirements (FGRs), which are broken down into tasks. CGRs can typically be completed in a couple of weeks, FGRs in days, and tasks in hours (this may vary by project and temperament).
Now, at the beginning of the project it is relatively simple to list all your CGRs, it shouldn't be too long a list.
You then take about 10% of your CGRs and go to town on breaking them right down into FGRs and tasks and then you estimate it all carefully. Yes, this will probably take a few days to do but it is worth it.
This will normally allow you to estimate the remaining 90% of CGRs reasonably accurately without actually doing the detailed break down.
This allows you to do the detail break down of each CGR later, when you are just about to start the work. Just in time estimation :)
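The sample-and-extrapolate step described above amounts to averaging the detailed breakdown of a few CGRs and scaling up. A toy sketch, with hypothetical numbers and an illustrative function name, assuming the sampled CGRs are roughly representative of the rest:

```python
def extrapolate_total(sample_estimates_hours, total_cgr_count):
    """Estimate the whole backlog from a detailed breakdown of a sample.

    sample_estimates_hours: carefully estimated hours for the ~10% of
    CGRs that were broken down into FGRs and tasks.
    """
    avg_per_cgr = sum(sample_estimates_hours) / len(sample_estimates_hours)
    return avg_per_cgr * total_cgr_count

# Broke down 3 of 30 CGRs in detail at 80, 120, and 100 hours each:
print(extrapolate_total([80, 120, 100], 30))  # → 3000.0 hours overall
```

The detailed breakdown of each remaining CGR still happens later, just in time, as the comment says; the extrapolation only gives you a defensible overall number early on.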
The most important tool we use is the burndown chart and the run-rate.
So, if you estimate all your work and then measure how much work you actually get done then you can apply your run rate to make your remaining schedule much more accurate. For example, if it consistently takes you 60 hours to do what you estimated to take 40 hours, then you can apply that formula. We do that with our development teams and it means we get fairly realistic delivery timescales over time. But, this only works once you build up a history of data, of course.
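The run-rate correction described here is just a ratio applied to new raw estimates. A minimal sketch (illustrative values; the 60-vs-40 figures echo the example in the comment):

```python
def adjusted_estimate(raw_estimate_hours, historical_actual, historical_estimated):
    """Scale a raw estimate by the team's measured run rate.

    If work historically takes 60 hours for every 40 estimated,
    the run rate is 1.5 and every new estimate gets stretched by it.
    """
    run_rate = historical_actual / historical_estimated
    return raw_estimate_hours * run_rate

# Team estimates a new feature at 40 hours; history says 60 actual per 40 estimated:
print(adjusted_estimate(40, 60, 40))  # → 60.0 hours
```

As the comment notes, this only works once you have a history of estimated-vs-actual data to compute the ratio from.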
In addition, the burndown gives you a constantly updated estimate of the release date as well, driven by developers without a project management overhead.
If all the above sounds like theory - well, it's how we work with our onshore and offshore delivery team. It is not without problems but it beats any other method I have ever used in terms of transparency and accuracy.
I like software project management books. They're really awesome.
In my recent experiences at startups in Silicon Valley, it is entirely too common that development teams receive only verbal requirements and, at best, confusing direction. Product managers, development leads, and decision makers simply must take appropriate time to clarify the desired product and its features (and actually write them down coherently rather than in sound bites).
Developers and especially offshore teams must have the ability to contemplate requirements in written form as a stake in the ground upon which to ask questions and at least have a starting point to establish reasonable expectations.
Or I could be living in a dreamland.
Doh. I missed the point before I hit Post.
Taking time to capture remaining work and measuring against completed tasks allows for more precise time estimates, and raises the integrity of the developer. This completes the loop on the project management.
P.S. With all due respect to agile methods, it is perverted by those who claim that the code is the documentation. Those who say such things are, in my opinion, too lazy to document at least the APIs between major areas of interacting code which multiple developers are working on together.
Predictability is better than speed. In a perfect world you'd be able to honestly communicate that you're done early, or that xyz happened, or just that it was a lot harder than anticipated, and people would understand.
So then we want to overestimate, and give a 'worst case' deadline. We're honest and quite proud that we're done early and tell everyone. The problem is that if you do that too often, then there's an expectation that you're always done early. If your estimates get better, it doesn't matter, because people expect you to be early.
So, you lie. You pessimistically estimate 3 weeks, and if you're done in 1, or 2.5 hours, it doesn't matter. Just test the code more and read Dilbert. When the deadline approaches, announce you're done just a shade before the estimate. Try to estimate better the next time around.
In the end, you'll be known as the guy who always delivers on time. Your schedules are predictable, and everyone is happy, including you, because you have less stress in your life. Predictability also helps your manager, because he trusts your estimates, which allows him to be confident talking to his bosses because his timelines are better too.
Hey Jeff GREAT POST
Usually I come here for a laugh and to challenge my ideas, because you are, well, sometimes (mostly) a bit off the track. But this one is different: one of the rarities.
keep your opinions opinionating!
About this time you begin to realise that no matter what you say (Jeff that is) or have said, most of the responses are from people that cannot read or cannot comprehend, no matter what. How very depressing. But still, lively debate, as they say, can only improve our understanding of the human condition. Keep up the good work!
Try some English humour (apologies for the correct spelling!)
I guess I'd categorize this as a nice try, but I don't see *anything* that can get rid of the month-long 90%.
I'll give you a current example. I'm working on porting some code, and after a *lot* of fits and starts, have all of it working except the byte swapping on some of the variables transmitted to another machine is wrong. How long of a task does that sound?
Well, it turns out, after a 3-day investigation into the old code, that the code my predecessor on this project wrote to handle parsing of the executable's debugger STABS wasn't picking up the FLOATs for some reason. They were all coming out as INTs. How long of a task does it sound now?
Well, it turns out, after another week and a half diving through the 10 KSLOC that parse STABS, that all 10 KSLOC are doing exactly *nothing*. I can comment it out at the top and the program works exactly the same. It's pretty clear to me that it's going to be easier to rewrite the STABS parsing code myself than to try to get a totally nonfunctional 10 KSLOC routine working the way I need it to. I've written parsers before. Wrote my master's thesis on it, actually. And I have all this old code that at least shows me what the proper calls to make are. So how long of a task does it sound now?
Well, it turns out that, outside of the creators of gdb, almost nobody ever has to write a parser for STABS. There's only one piece of documentation I can find anywhere. It also turns out that the format is really contorted, and is totally designed around the implementation that the linker (and in some case gdb) writers happened to be using way back when. Still, after a week I have it working for some basic types. In another week, I have it working for the complicated types too. The only remaining issue is that it isn't quite handling some of the C includes and Fortran scopes right. So how long of a task does it sound now?
Well, it turns out that Fortran's scoping rules are different than C's, and rather than encode that difference in STABS, they figured they'd let the parsers handle it. So now I have to go back and recognize what language each object file was coded in and handle the parser's scopes totally different for different languages. There goes 4 days.
It also turns out that they decided to economize on file size by not spitting out the entire contents of identical includes. Instead they just referenced them by name and a hash key. After a couple more days I figure out that you need the hash key because #ifdefs can make the same include file really a different file in different compilations. They still cheated a bit on this in the cases where some include files in include files aren't reincluded. Tack on another 2 weeks to deal with this.
If you have some estimating method that can prevent this rather typical example of the perpetual 90%, then in my book you are now only 2 miracles shy of your sainthood qualification.
That's kind of long. Sorry.
I guess what I'm really getting at is that a lot of the time, estimating a software task before you actually start working on it is like estimating how many dolls are in a matryoshka doll without lifting any of them. You can try to eyeball it, but really it's just going to be a guess. Often a horribly wrong one.
Thanks for this article! To-do lists are everyone's best friend, and it's great to see you advocating them.
@Andrew Peters: Amen! A 30 day sprint where you are supposed to execute exactly what's been perfectly planned out in advance in a requirements document isn't agile and certainly isn't iterative, even if you have scrum meetings every morning.
I keep a personal list of issues that I discover need to be worked on, and I've found it's useful to keep this separate from any bug tracking, sprint backlog, or other team/manager-owned artifacts. Sometimes items from the personal list translate into new issues or revised estimates to bring up with the team, but often they are details that just get resolved in the course of work; the important thing is that you have a way to capture issues as they arise so you can figure out what you need to do later. This is very loosely based on ideas in David Allen's _Getting Things Done_.
I knew a guy who said that the proper way to estimate time needed to finish a programming job was to take the time you think you will need, double it and move it up to the next time level, that is, 5 minutes is ten hours, 2 hours is four days, and three days is 6 weeks...
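For fun, that double-and-bump rule is easy enough to write down. A tongue-in-cheek sketch; the unit ladder is my own guess at what "the next time level" means:

```python
# Ordered ladder of time units; the rule bumps an estimate one rung up.
UNITS = ["minutes", "hours", "days", "weeks", "months"]

def pessimist_estimate(value, unit):
    """Double the number and move it up to the next-larger time unit."""
    next_unit = UNITS[UNITS.index(unit) + 1]
    return 2 * value, next_unit

print(pessimist_estimate(5, "minutes"))  # → (10, 'hours')
print(pessimist_estimate(3, "days"))     # → (6, 'weeks')
```

Which reproduces the examples in the comment: 5 minutes becomes 10 hours, and 3 days becomes 6 weeks.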
In the olden days (Windows 286) I worked at a place where we did fixed price work. The design doc was a paid deliverable, then we had to estimate the # of hours per screen, etc.
There was a precursor to MS Project, a DOS app called Timeline, that was in some ways quite superior. I tracked the burn rate and physical completion % of each piece.
It was (and is) doable. But in today's world it isn't the norm to think things through perhaps? And, when our client wanted a change it was always time and money -- but maybe we don't hold our clients accountable as often as we should?
If there is no completed spec then you can never be done, and there is no delivery date that really can be promised.
So, you lie. You pessimistically estimate 3 weeks, and if you're
done in 1, or 2.5 hours, it doesn't matter. ... Try to estimate
better the next time around.
You seem to be buying into the myth that your estimate can *possibly* be anything better than a SWAG. What's worse, you seem to be beating yourself up over not doing it better somehow. Forgive yourself for not being clairvoyant, give your guesses, and move on.
I thought the caller in Podcast #16 was a bit harsh really. It sounds like Jeff has done a good job so far getting out a beta quickly and getting early feedback.
I've realised I've fallen into the 90% trap myself. I've blogged about it here:
Planning seems such an obvious thing, but as a programmer it's too easy to get caught up in features.
It is a bit pretentious to hold up a blog project as a measure of project management on any software development project. Also, Jeff's project looks a lot like a Digg-style web site (or any web site with Digg functionality). The true challenges on every software development project are the requirements, integration, configuration, and (smooth) rollout. By and large, the biggest amount of time in the software development process is eaten up by requirements and the back-and-forth between business sponsors and middle management. Coding is usually done in whatever time is left after that, with more requirements changes along the way.
If your software development is slipping even though you had detailed requirements, you need to update your team. Senior and middle management also need to update their technical skills, to fully understand the implications of each technology used.
The advantage of Software as a Service is that a 90% complete solution can often be put live with the last 10% being mopped up as required.
That creates a pretty useful standard for "90% done": if a developer says something is 90% done, you can ask the question "if you had to ship it, could you?" If the answer is yes, perhaps it really is 90% done, with a fair amount of cleanup work left. If the answer is no, then you aren't 90% done. If the core functionality isn't there or it flat out doesn't work, you aren't really 90% done.
Of course, this is one area where test-driven development helps a ton: you know exactly what the target is (documented via the tests) and you know exactly when you meet the target (tests pass). Makes the line between done and not done far clearer.
Yeah, I feel the same pain always. Well, I admit it may be the developer's fault, in that he or she may overestimate his or her own capability. However, it is not unusual that even when you have a clear plan at the beginning, feature requests and changes from the customer quickly eat up your buffer time. Sure, you can allocate two or three times more time to prevent that from happening. Then you will find that you lose the customer, as somebody else claims they can do it in a much shorter time. What do you say? Perhaps that is the reason Ruby on Rails has become so popular.
Speaking from experience, for a long time I stayed away from making task lists to organize my time.
It wasn't until I forced myself to use them for a particular project that I realized how damn useful they were. My productivity improved and I really felt like I was getting something done (since I could actually see it on the screen), as opposed to being clueless and taking much longer to get anything done.
I just wish I had figured that out sooner. :)
"This software developer does not have a detailed list of all the things he needs to do. Which means, despite adamantly claiming that he is 99 percent done -- he has no idea how long development will take! There's simply no factual basis for any of his schedule claims."
First off, knowing that you are 100% done is relative. Are you talking about 100% meeting the functionality for the business requirements or do you mean that -plus- time added for unit testing AND some sort of initial cleanup? Be clear when you ask that question to your developers and don't pressure them to give you the perfect answer you expect because you just want to get your CIO off your back.
Now, I'm not advocating letting your developers give you a grey answer every time; I'm just saying listen to your developers and don't always expect 100% perfect estimates. It's your job to build in a cushion for testing, and I feel some for cleanup too. Be direct with the business that this is the standard, and push back a little when the business tells you to skip that part of the project as unimportant.
I find this cleanup portion (aside from just functional code for the business requirements checklist) very interesting (sarcastically).
Oftentimes your manager doesn't give a damn about the cleanup you talk about when he asks how much you have done for a, b, and c in the project plan, and just wants the stuff out the door because they are pressured. If you tell him or her "well, it's functional," quite often they'll give you the all-too-common "ah, that's good enough, don't clean anything up right now, we are not looking for perfection, so let's just get it to QA." To me that's piss-poor management. The developer is also not looking for perfection. He has a life, and he usually just wants to do a little cleanup, so chill... give some time for that.
There should always be a -short- period of time built into your project plan (1-2 SOLID days or so) for cleanup by your developers, in addition to time for GOOD unit testing. Being 100% sure you are done is relative, but by the same token, you can't skimp on quality just because you want to force them to hand in a hack job and call it 100% functional even though it has not been tested well. There is often important cleanup that would affect the team or the application, and therefore the business down the road, which you know will probably never be refactored given the lack of time in most IT shops run on poor management and unrealistic business timelines and expectations.
I think not polishing some of your code before it goes to QA is code-and-run, and just a half-assed approach to development and management, no matter how much the business is breathing down your back. If your developer does not feel comfortable pushing it to QA, step back; let him have a day or two more to do some of that testing or cleanup so he can feel confident the application is fairly solid, and so will you, because you trust your developers. Maybe they had a long night (wife got into a car accident the night before) and they really need that extra day to clean up. Think about the human aspect as you address your developers, not just the business needs every damn time.
Forcing works better than encouraging.
If the source code is written decently, I find that the biggest factor is the developer's perception of what done is. So, I always ask if their estimate includes issue mitigation. Many times it does not.
Then, I ask them how many cycles of mitigation the change will take (usually based on complexity of the changes or the developer's knowledge of the code). From this conversation, the developer almost always adjusts their estimate of completion. Over time, the developer just gets better at estimating since we are always asking pretty much the same set of questions.
Areas that need refactoring are typically the worst for trusting anyone's estimate of any changes. I try to factor this in when setting expectations with the customer.