September 26, 2006
Michael Hunter's blog byline is unapologetically over-the-top: making developers cry since 1995.
That's probably why he's such an awesome tester. Well, that, and the braids. Never before in the history of testing professionals have the top and bottom halves of a man's head been so mismatched.*
The absolute worst testers you can possibly have are developers. They're better than nothing. But barely. Even a mediocre tester will make your application better, and by proxy, encourage you to become a better developer. The very best testers will drag you, kicking and screaming if necessary, across the bug-bar threshold. Professional testers force you to become a better developer. Sometimes it's painful. But in a good way, like a heavy workout.
To get an idea of how gnarly the work of a test professional actually is, take a look at Michael's Did I Remember To (test) list. I can barely read the first page without wincing in sympathetic pain. And the list goes on, and on, and on.
Michael recently expanded that list into an entire series of blog entries for DDJ titled "You are not done yet", which are now captured in handy PDF form -- Michael Hunter's You Are Not Done Yet Checklist.
Pick something. Anything. A feature in your favorite software application, your favorite toy, your favorite piece of furniture. Now start brainstorming things you could do to test it. Think of as many different things to do to that object as you can. Come back and continue reading when you’re done.
What’s that? You’re back already? There are test cases you haven’t thought of, I guarantee it. How do I know? Because for even the tiniest bit of something – the Find dialog box in your web browser, say, there are billions of possible test cases. Some of them are likely to find interesting issues and some of them aren’t. Some of them we execute because we want to confirm that certain functionality works correctly. These latter cases are the basis of my You Are Not Done Yet list.
This list is large and can be overwhelming at first. Fear not. You have probably already covered many of these cases. Others won’t be applicable to your situation. Some may be applicable yet you will decide to pass on them for some reason or other. Verifying you have executed each of these test cases is not the point of the list. The point is to get you thinking about all of the testing you have and have not done, and to point out areas you meant to cover which you haven’t yet.
So don’t quail at the thought of all this testing you haven’t done yet. Instead, customize this list to your context. Scratch off items which do not apply. Use the list as a launch point for finding items not on it which do apply. Use it to organize your testing before you start. Use it as a last-minute checklist before you finish. How you use it is not nearly as important as that you use it in the first place.
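The combinatorial explosion Hunter is pointing at is easy to see concretely. Here is a hypothetical Python sketch — every input class and option name in it is invented for illustration — that enumerates just a handful of equivalence classes for a Find dialog, and still lands well past a hundred cases:

```python
# A rough sketch of how quickly test cases multiply, even for a tiny
# feature like a Find dialog. All names here are hypothetical.
import itertools

# A few equivalence classes for just the search string itself.
search_strings = ["", "a", "word", "two words", "a" * 10_000,
                  "\t", "café", "<script>", "line\nbreak"]

# A few of the dialog's options; each one doubles the case count.
options = {
    "match_case":       [False, True],
    "whole_word":       [False, True],
    "search_backwards": [False, True],
    "wrap_around":      [False, True],
}

# Cartesian product: every string against every option combination.
cases = [
    (s, dict(zip(options, combo)))
    for s in search_strings
    for combo in itertools.product(*options.values())
]

print(len(cases))  # 9 strings x 2^4 option combos = 144 cases already
```

And that is before considering document contents, selection state, undo, localization, or anything else on the list.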
Brrr. It's enough to make you hang up your Tab and Fritos to become a console developer. But if you can pass that testing gauntlet, you've definitely earned your stripes as a seasoned software developer.
* I kid! I kid because I love! please don't test my app
Posted by Jeff Atwood
I can definitely attest to the fact that, as a professional developer, I hate testing. Debugging, fixing, no problem. Testing, ugh. I just find it so darn tedious. Though I have to admit, I have only begun to play with some of the sophisticated test suites available now. Maybe those really do help alleviate some tedium. But I also worry that they will just result in a false sense of security.
Oh, and also....
Holy buckets. That terrifying photo will surely haunt my nightmares for the rest of my days.
I'm all over TDD; unit tests rock. But the kind of testing you're talking about? I'm awful at it. I get bored filling out the same UI, so I just plug in the same values I used the last 20 times. (=
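One cheap fix for the "same values as the last 20 times" rut is to generate varied but reproducible inputs. A sketch of the idea, assuming a made-up form with name/age/email fields — the point is the seeded generator, not the specific fields:

```python
# A sketch of one way to stop typing the same 20 values into a form:
# generate varied but reproducible test inputs. The fields are made up.
import random
import string

def form_inputs(seed, count=20):
    """Yield varied (name, age, email) tuples; same seed, same data."""
    rng = random.Random(seed)  # seeded, so a failing run can be replayed
    for _ in range(count):
        name = "".join(rng.choices(string.ascii_letters,
                                   k=rng.randint(1, 12)))
        age = rng.choice([-1, 0, 17, 18, 65, 120, 999])  # boundary-ish ages
        email = rng.choice([f"{name}@example.com", "", "not-an-email"])
        yield name, age, email

# The same seed reproduces the exact run that exposed a bug.
assert list(form_inputs(42)) == list(form_inputs(42))
```

Record the seed alongside any failure and the tedium of manual variation mostly disappears.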
Bane of my existence : Testing.
I cried when I saw that list. But I have to admit... it's true. It does help - but that does NOT mean that I have to like it...
I'm with Jerry. I have never (repeat, never) worked with an excellent tester. Plenty of average ones, lots of below average ones, and a few good ones. (The same can be said for developers, now I think of it.)
That's in 10 years of professional development and about a dozen organisations, from blue chip to small software houses. Have I been unlucky? I doubt it.
I agree that developers don't make excellent testers (that's what users are for ;) but I'm darn sure that I test a lot of stuff that most testers wouldn't.
Good testers are very hard to find. I think a lot of it is down to there being no solid route into testing. No one says "I want to be a software tester when I grow up" (at least no one I've met). All the testers I've ever worked with have sort of fallen into the job after not having a clear career path. This makes for bored employees (hey, they're testing!) and a low level of interest in their work. Hence, poor testing and a high turnover of staff.
Nor does there seem to be an agreed approach or methodology to testing (again, not that I've seen, I'm sure they exist) so each tester or group of testers just do their own thing.
Maybe Microsoft (or other) needs to come up with an industry standard certification for testers?
Thanks for the great link. I think it is great that Michael is working on the Expression suite. The beta of Expression Web Designer is awesome, and there's something more than testing in the mix.
So I just added three RSS feeds: Michael's MSDN blog, his DDJ blog, and HMK's Spurious Thoughts, stumbled over in passing. I need this. I do. It is good for me.
I would definitely want Michael and kindred spirits testing any product of mine.
The best software testers are women :-) Their actions are unpredictable. Uh... uh... painful!
Absolutely dead on.
I've only known a couple of really good testers in my life; the best had been a field engineer, so he knew what developers are capable of doing to users. We had a good relationship because I'd also done field work and knew how important testing was - we joked all the time that my job was to develop new code and his job was to make my life hell.
I'm a tester/verifier. Thanks for the photo, Jeff. Though I'm not sure I understand its underlying implications.
TESTER PRIDE!!! XD
At the big M, testers are hired at the same pay level as developers, and have the same potential career growth. As a result of this, MS has hundreds, if not thousands of testers with skills similar to Michael. It's a good place to be a tester, and our developers really don't cry *that* much.
I'm a software tester/QA and occasional developer at the same job. I care more for coding than testing, but testing is my primary responsibility. So even though I'm a developer at heart, I take a great deal of pride in seeing software go out and work right the first time due to my efforts.
It's been mentioned before, but I get paid a fraction of what the full devs get--maybe half. It's not a good motivator.
Pay peanuts. Get monkeys. The old adage applies.
It's different where I work. Testers are paid at the same level as developers, and all of the test scripters also program in C++, some flavor of VB or a .net language. We've also found something interesting. The very best *manual* testers are also scripters, not the domain experts.
We program our scripts from scratch (no recording), so we're forced to think logically about the software and our workflow.
It's unfortunately true that bean counters try to cut testing first. Fortunately, those companies tend to be self-solving problems. Unfortunately, bean counters aren't accountable and can get a new job more easily than the development and testing staff, who are perceived as the source of the failure. They are not, usually, in my experience. The causes of software failure are almost always rooted in the irrational behavior of uninformed and lazy management.
I'm lucky to work where I work now, but I know many other poor souls out there testing some sort of CRM app for a WWW company (Widgets to Wankers on the Web), who hire uneducated people onshore or offshore, treat them as throwaways, and wonder why they're having trouble.
You mentioned that "Even a mediocre tester will make your application better..."
Perhaps, but my experience has been that there are many, many sub-mediocre testers. Deficient tests and test cases actually increase the development workload with minimal associated increase in quality.
So, in dysfunctional organizations, testing may actually have a negative effect on product development.
The relationship should not be adversarial, however, I think testers need to stand up for their rights, too. They don't get enough respect, either in salary or the project plan.* Testers should push a little!
* Evidently unless you're a tester at Microsoft, which is certainly a good sign.
Hey Jeff, let me test your app? Pppppplease?
Ralph and Jeff are dead-on that the tester-dev relationship should not be adversarial, and I go to great lengths to prove to my devs that I am their friend, not their enemy. If they cry anyway, well, that's because they still check in bugs for me to find. &lt;g&gt;
Looking at the list above, the printing and setup options in particular sound like bad use-case planning to me.
I took a stab at beta testing a few months ago, and one of my most important conclusions was that good test cases can kill most really obvious bugs.
In addition, eating your own dog food and code reviews should be OK for any one-man micro-ISV.
This whole "you're not done yet" attitude suggests that some of these tests are testing implicit or unstated requirements.
If the developers had this list of requirements up front, there would be no excuse to cry about the tests later. And there would be nobody asking "are we done now?"
As someone in development, I remember meeting an interaction designer whose first words to me were: "I'm going to make you cry." I replied, "Okay, I can do that if that's what you want." I don't understand why QA and development allow themselves to be set up as adversaries. I think more can be accomplished when both sides share a common goal of shipping the correct features with high quality.
the trouble is... nobody likes someone saying their app is shit. even when it is.
The relationship should not be adversarial, but friendly competition is how I think of it. My goal is to check in software that you can't break. Your goal is to break it. Who wins? Obviously my job is harder, but the better I do my job, the harder I make your job. The more we compete with each other, the more the customer wins.
If I do a poor job, my QA should come in with a laundry list and just kick my butt. If I'm a developer worth my salt, I'll either fight the bogus claims or hang my head in shame for sucking so bad.
My company has been bought by bigger companies, but back in the day when we had what I consider a proper developer-QA relationship we used to have the slogan, "It's not cool till QA says it's cool."
Don't get me wrong. We loved our QA. Still do. I don't know a single member from the original crew that wouldn't kill to get one of our original awesome testers. Yeah... you get rework, but rework is much better on the ego than bug reports from the field.
One more note:
When I say 'Obviously my job is harder', I mean that only in the sense that creating an unbreakable thing is harder than breaking something. That is not to say that QA is a lesser job or that the pay should be less... on the contrary, QA is a difficult job. A skilled QA analyst must think outside the box and think of all the things a developer _didn't_ think of. After creating a viable, repeatable test plan, a QA analyst must perform the rather monotonous task of acting out that test plan. At a 3-1 Dev-QA ratio, I think a Senior QA analyst is worth the same price as a Senior Developer.
Done now. Thanks!
It is tough when a dream meets reality. As another commenter said, I've had good relationships with testers, and actually started looking at writing stuff with minimal bugs as a challenge... wish I could win ;)
I've been testing for nearly 6 years. For equivalent seniority, pay is somewhat less, but I'm pulling in over 80k. Respect is always an issue, and timelines are always a problem. I will fight like a wildcat if a PM tries to compress my timeline, because inevitably I will be blamed, whether implicitly or directly, for defects in production. On the other hand, I do my best to befriend developers. Still, sometimes their egos get the best of them, and I'll show my pimp hand so they don't think they can roll me over.
The job can actually require quite a lot of creativity and latitude, but it has so many drawbacks that I just can't seem to learn to enjoy it. There is endemic disrespect for someone who doesn't produce anything tangible besides what is perceived as friction in the project plan. What's that saying, credit the developer, debit the tester? It's true. I have worked my NUTS off to get a build out that had virtually zero defects, and barely got a "thanks", but I've missed monster bugs before and been fried on a pan for it. It just doesn't add up. That's why I make a point to thank my junior testers every chance I get.
Most PMs and execs are slaves to the timeline, and most of the ones I've worked for are clueless about the value QA brings. Just today I had a spat with a developer over my desire to delay our release by 5 days so I could finish my job correctly. I've warned my PM a dozen times about the risks, but it just doesn't seem to sink in. Basically, I have to show up and pretend to be a team player, but I've grown to absolutely detest my job. I can write circles in SQL around most of the people I work with, but the future feels bleak right now... QA entombed, I fear. I need another job; I'm burnt out and getting too uppity.
Garret, when you say 'Obviously my job is harder,' I don't know whether to laugh or to cry. Can you see that from a macro view it's a sad state of affairs that something "a developer worked so hard on" can be "broken with such ease" by a tester (your assertion; my quotes)?
Having been a developer in a previous life (deadLanguage=COBOL), I have developed a sense that the real issue is what I call "Happy-path Development(TM)" wherein all elements conspire to allow time only to program/unit-test the 'here's what it does' requirements with no time left for anticipating 'stupid user tricks' and other lethal corner cases. Obviously we need to all hang together on this or we're likely to all hang separately.
At any rate, should the opportunity present itself you should consider taking a 'walk on the wild (tester) side;' you'll learn the joy of trying to prove to upper management that, "there are no needles in this haystack."
Early in my career I had the pleasure of working with two dynamite testers. These ladies could break a rock-solid application in a matter of minutes, and as a result forced me to become a much better (and slightly more humble) coder.
Professional testers rock, but they are very hard to find!
IME, there are two reasons why testers are so mediocre: testers are paid far less than programmers, and the time they have to test the system is almost always compressed by management.
Let's face it, no software developer ever wants to become a tester - your earning potential goes into the can. If you pay people, say, half of what you pay your software developers, then you just are not going to get quality people. Many of the skills needed to test modern software applications require more programming-like skills and/or more work from the programmers. Spending over $50K on tools like Rational Functional Tester or WinRunner is money that could have been spent on training people with the skills they need to test the application, or on hiring talented people.
In general, the bean-counters in management have absolutely no respect for the testers and what they are doing. Almost always, because of overruns in development, the testing schedule becomes compressed. Then once the application is in test, it becomes a pressure cooker to finish. I know I do not like working in that environment, and if I were getting paid about half of what I make now, I would certainly think about leaving after the whole ordeal was over.
Whilst I generally agree, there are some cases where developers make excellent testers, and those cases tend to be related to security issues.
For instance, let's take a web application. A non-technical tester is probably not going to try XSS or SQL injection attacks, hack around with the query string, or edit hidden fields. As a developer you often have a good idea what the weak spots of an application are - the buttons that can be pushed to crash it. Whenever I see a query string I just can't help hacking it! "Ummm, so there are ten pages of search results - what happens if I change ?page=10 to ?page=11, or to ?page=ABC?"
You might say this is the job of a specialised security consultant, but many small firms can't afford that luxury. That is where a developer's knowledge comes in handy to complement other test paths.
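The ?page= probing described above boils down to a handful of boundary checks. Here is a hypothetical Python sketch - parse_page is a stand-in for whatever the application actually does with the parameter, not any real framework API:

```python
# A sketch of the "hack the query string" checks described above.
# parse_page is a made-up stand-in for an app's ?page=N handling.
def parse_page(raw, total_pages):
    """Clamp a raw ?page= value to a valid page, defaulting to 1."""
    try:
        page = int(raw)
    except (TypeError, ValueError):
        return 1                           # ?page=ABC -> first page, no crash
    return min(max(page, 1), total_pages)  # ?page=11 of 10 -> last page

# The probes a developer-tester reaches for almost by reflex:
assert parse_page("10", total_pages=10) == 10   # last valid page
assert parse_page("11", total_pages=10) == 10   # one past the end
assert parse_page("ABC", total_pages=10) == 1   # non-numeric garbage
assert parse_page("-5", total_pages=10) == 1    # negative page
assert parse_page(None, total_pages=10) == 1    # parameter missing entirely
```

Whether the right behavior is clamping, a 404, or an error page is a design decision; the tester's job is to make sure it is *a* decision, not an unhandled exception.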
Maybe I'm an unusual case, but I am a developer that does quite well at testing. Now to be fair, I am not the best developer out there, but I can develop my way out of a wet paper bag. :)
As a developer, when I was testing a bug fix I had just implemented, I would typically find 2 to 3 more bugs. That happened with pretty much every bug fix. My managers saw this and decided I would be better off in the QA department, and I felt the same way.
Maybe the statement "Great developers make bad testers" is better. I suspect that even great developers could make great testers too; they just don't feel it's worth their time to do both. That may be a true statement, but in the age of TDD, developers (good or bad) need to become great testers too.