Tuesday, December 8, 2009

Why Blog?

Gus Mueller led me to Dan Wood's article on blogging the other day. I didn't quite get around to reading it until just now, but in the meantime, I semi-subconsciously thought about why I think that blogging is a good thing to do. Dan's reasons, although very good reasons, are not my reasons.

  1. Blogging pushes me to develop my thoughts from un-examined opinions to defensible theses.
  2. Dan hopes future customers will read his blog. I hope other people find mine in the future: mentors, co-authors of papers, peers, critics, skeptics, co-workers, maybe people I can mentor?
  3. Blogging exercises my communication skills. A big part of testing is detective work. But another big part is story-telling. I want to get better at telling clear, convincing, entertaining stories.

Wednesday, December 2, 2009

Testing for Experts, or When a Spec is Worth the Bother

@jonbell's Ignite Seattle talk collided with a recent testing adventure of mine. I was really frustrated testing a new feature for one of our apps. About the time I read his talk (oops, forgot to go in person!), I stopped trying to test the feature and started writing up how I thought this feature was supposed to work. Now I'm not so grumpy anymore.

I used to think there was just testing. Then I broke it out into two types of testing: learning and checking. Now I think there are three.

I still think there's learning. And I still think there's checking. But there's also learning A to check B.

For many of the features I'm supposed to test, the only written spec is the bug. Sometimes it only has a sentence or two. And that's enough for the engineer to write the feature (mostly) correctly and enough for me to (mostly) test it.[0] Writing more details would just lead to time-wasting, abandoned specs.[1] For example, "Add a 'new folder' toolbar button." The engineers I work with all know how a toolbar button typically works, and so do I. We can make reasonable guesses about what the user expects. I check that it feels predictable[2], shake out a few corner cases, and off we go.

But sometimes we're building expert features. I am not an expert in our users' field. I can't trust my gut to know which operation ought to take precedence when two of them interact. When I realize I'm in this situation, I need to learn first, and then test. This is when I really need a spec. Maybe the engineer needed a spec too, and the bug is already full of notes from him and our PM clarifying how the feature needs to work. Or more often, I need to write my own spec.

"But you just said you don't know this stuff. How can you write the spec?"

I'd say I'm *exactly* the person to write the spec, because I'm the one who knows what questions it needs to answer. If I ask the engineer "How is this supposed to work?" I might get part of what I need to know, but I tend to feel like I'm wasting their time if I camp out in their offices too long. On the other hand, if I write up a short wiki page on how I *think* it's supposed to work, they're often quick to jump in with corrections and clarifying comments.

Developing the spec is the learning step. Now I can pretend to be an expert in the field, and properly check the behavior of the new expert feature.

[0] Honestly, now, when was the last time you *fully* tested something?
[1] Thinking of http://www.joelonsoftware.com/articles/fog0000000033.html
[2] borrowing from Jon

Saturday, November 14, 2009

Knitting & Testing

This afternoon, I dropped by while James Bach was signing copies of his latest book, The Buccaneer Scholar. Somehow I mentioned that I knit, and he explained one of his new teaching ideas, that test script writers could learn a lot from the way knitters write patterns. It's common to write test scripts so that anyone could read and execute them. But a knitting pattern is written for the (more-or-less) knowledgeable knitter, who is already familiar with a number of abbreviations and other conventions in knitting patterns. Just as it would be incredibly tedious to re-explain how to ssk (slip-slip-knit) every time that stitch is required, we could perhaps save a lot of tedium in our test scripts by developing a common jargon within a testing community[1].

[Edit: In the following paragraph, I completely failed to mention the contributions of @superpuppy, who shares with me an interest both in knitting and in breaking software.]

Then James sent me on a mission to find 2 books -- one that just assumed you knew knitting and used its jargon, and another that provided the introduction needed to understand and implement the patterns in the more advanced book. I found some books for him (5, since I'm an over-achiever), but the process sent me in 3 other directions as well:

1) The books on that shelf were not the intro ones I would have recommended if I were picking from all knitting books ever. While the Borders' collection of knitting books was respectable for a mall-sized chain bookstore, the selection felt limited compared to a yarn store or the internet. The main book I referenced while learning to knit was the Vogue Knitting Quick Reference. For crochet, I used 10-20-30 Minutes To Learn To Crochet.[2] The books I did offer James were still reasonable intro guides, covering the jargon and formatting of patterns. The crochet intro explained not only the cryptic terminology, but also the standard way of diagramming crochet. (The knitting intro may have covered how to read knitting charts as well, though I didn't notice it.)

2) There are a lot of parallels between knitting books and software testing texts. There are the plain old pattern collections, like test scripts. Then there are books that talk about what to do when things go wrong. And just as a tester may turn to hardware documentation while testing software that runs on it, so a knitter might find a book exclusively about various types of fibers instructive. At first I thought this was because the two crafts are similar, but perhaps it's more because human beings learn different skills in similar ways.

3) It feels like there are a lot more connections between knitting (and/or crochet) and testing than just the idea of how to write up scripts that James identified. I'm going to start a list here, and I hope you'll join in by commenting below, or tweeting me.

- A pattern is a test case/script. There's a set of steps, and an expected output.

- Keeping track of exactly what you've done so far is important. Count your rows, log your actions.

- Following the pattern/script exactly is a reasonable first step. But variation is good too! Try a different yarn and finishing stitch. Try it using the mouse instead of the keyboard, or with a bigger input file.

- When you diverge from the beaten path, keep track of what you did, so you or someone else can do it again later.

- There are books that can teach you how to do this stuff. But a workshop (at Weaving Works, or PNSQC) can be so much more effective. There are also videos (how to knit a moëbius, how to test an Easy button).

- There are heuristics. The Knitting Stitch Bible is nothing but stitch pattern heuristics -- no patterns for whole garments, no 'how to knit and purl' at the beginning, but 39 ribbing variations and 37 cables to choose from. I have yet to find such a concise book of heuristics for software testing, though I do find individual heuristics scattered all over the internet, and James gives great detail about a few in The Buccaneer Scholar. (Why isn't there an encyclopedia of testing heuristics?)

- There are holy wars, based on over-generalizations, imprecise terms, and uninspected biases. Knitting is more refined than crochet. Automated testing is more rigorous than exploratory testing. Well, what do you mean by 'refined', and is that necessarily a good thing? What do you mean by rigor? Or automated?

[1] A community could be a project team, department, company, formal or informal intellectual group, etc.

[2] This book deserves special note as the only intro book for crochet I've ever found that respects left-handed students. Most books show the first diagram or two redrawn for a left-handed context, and then expect you to just reverse the right-handed diagram for all the other techniques they teach. 10-20-30 includes diagrams for many (all?) of the techniques where hand positioning is important.

Monday, August 24, 2009

Upstairs, they call that a Liz

This is the best analogy I've seen yet for how I became a software tester:

I didn't accidentally kill anyone while getting my CS degree, but I sure wrote a lot of bugs and crashes into my homework assignments. Spending all that time chasing down segfaults seems to have trained me to look for similar mistakes in other people's applications.