Friday, 4 March 2011

The Cross-Discipline Pirates and the Canon of Test Techniques

I used to consider incorporating techniques from other disciplines into testing as something a little different. It felt right, but since the ‘industry’ didn’t do that, it seemed like a way of individually revealing our personal approach to testing.
But testing has a secret history. The building of the Traditional Testing Canon has remained shrouded in mystery until now. So for all testers following tradition, set yourself free: continue to follow Tradition, just follow the one true one, which I reveal here as “Thee True And Aythentic Historee of Software Testing - A tale of Action and Adventure”.


Consider a Traditional Testing Canon (say each of these in an authoritative voice):
  • State Transition Analysis
  • Path Analysis
  • Data Flow Testing
  • Boundary Value Analysis
  • Equivalence Classes
  • Domain Analysis
  • Decision Tables
  • Classification Trees
  • Use Case Testing
  • Orthogonal Array Testing
The ‘canon’ will categorise these in different ways – structural, functional, non-functional, etc. But we can ignore all that stuff.
So where would I go to learn these ‘things’? Books from the late 80s (Software Testing Techniques, by Beizer), the mid-90s (Software Testing – A Craftsman’s Approach by Jorgensen), and on, and on, and on. At least 20 years of the same stuff in Software Testing books and Certification Schemes.
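To make a couple of the canon entries concrete, here is a toy sketch of my own (the `accept_age` rule is invented, not from any of those books): boundary value analysis probes at and around each edge of a range, while equivalence classes shrink an infinite input space down to a few representative partitions.

```python
# Hypothetical rule: an application accepts ages 18 to 65 inclusive.
def accept_age(age: int) -> bool:
    return 18 <= age <= 65

# Boundary value analysis: test at, just below, and just above each boundary.
boundary_cases = {17: False, 18: True, 19: True, 64: True, 65: True, 66: False}

# Equivalence classes: one representative value per partition
# (below the range / inside it / above it).
equivalence_cases = {5: False, 40: True, 90: False}

for age, expected in {**boundary_cases, **equivalence_cases}.items():
    assert accept_age(age) == expected
print("all cases pass")
```

Nine test values instead of every possible age - which is the whole trick of both techniques.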
But most of this does not stem from testing.
  • For State Transition Analysis, Path Analysis, Data Flow Testing try Graph Theory, Automata Theory, Network Theory
  • For Boundary Value Analysis and Equivalence Classes try Goldilocks and the Three Bears, the Berenstain Bears, and Dr Seuss
  • For Domain Analysis, Classification Trees, Orthogonal Array Testing try Set Theory, Statistics, Data Mining
  • For Use Case Testing try Soft Systems Analysis
  • For Decision Tables try Logic
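The Logic lineage is the easiest to see: a decision table is just propositional logic laid out as a grid, one column of condition values per rule. A minimal sketch with a made-up shipping rule (the `shipping` function and its conditions are my invention for illustration):

```python
# Hypothetical rule: express shipping for members with orders of 50 or more;
# free shipping if the customer is a member OR the total is at least 50;
# otherwise standard shipping.
def shipping(member: bool, total: float) -> str:
    if member and total >= 50:
        return "express"
    if member or total >= 50:
        return "free"
    return "standard"

# Decision table: every combination of the two conditions, one row per rule.
# member | total >= 50 | expected action
table = [
    (True,  True,  "express"),
    (True,  False, "free"),
    (False, True,  "free"),
    (False, False, "standard"),
]

for member, big_order, expected in table:
    total = 60 if big_order else 10
    assert shipping(member, total) == expected
print("decision table covered")
```

With n boolean conditions the full table has 2^n rules, which is exactly why the technique forces you to notice the combinations you forgot to specify.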
Suddenly Testing looks like a field set up by pirates. But clearly, since they were on the side of ‘good’, they must have been privateers.
How liberating. Testing as a tale of Action and Adventure.

Thee True And Aythentic Historee of Software Testing - A tale of Action and Adventure

Act One

The field of testing started by looting and pillaging a bunch of other fields to take things that it could apply. And then applied them.
We once terrorised the ocean of intellectual investigation. Studying. Learning. Applying. Simplifying. Nothing was safe from brutal inquiry.

Act Two

Then things went wrong.
Somehow it got canonised. Standardised. Fixed.
The flow stopped. It got stagnant.
We stopped sailing, we took our ‘earnings’ and retired to the Caribbean.
We drank our wine, and rum, and got fat (or at least I did).
We hired out our ships as amusement rides.

Act Three

But not all testers stopped looking.
The canon just stopped expanding.
Some testers continued to loot and pillage. It just didn’t seem very ‘respectable’.
But testers bring in techniques from other disciplines all the time. Sometimes they talk about it. Sometimes they just use it and get results.
We should view this as normal.
Indeed, we should have an expectation. Every man-tester, woman-tester, other-tester has a duty to the testing field to Loot, Pillage and Plunder all disciplines of inquiry and knowledge.
Bring back your booty. And tell your tales of adventure.

Testers, embrace your inner pirate. Coincidentally, a tester wrote a book about that kind of thing.

3 comments:

  1. Alan - This is very impressive. A history of testing from the 80s-90s into 2010 - great story.

    The fact that good testers learn from other disciplines and look for a broader approach to the testing challenges of today - is a secret of testers that did not retire and settle on Caribbean beaches sipping beer.

    Learning from manufacturing has been a very dominant pattern of cross-discipline learning in software testing which you seem to have missed. Many like to quote Deming, Juran, F.W. Taylor and other proponents of the Japanese quality revolution rather than quote Plato, Descartes, Galileo, Kuhn, Feynman, Taleb, Whitehead, Carl Sagan, Russell, Kant, Hume, Popper, Weinberg (a group of philosophers, scientists, critical thinkers and historians of science and economics).

    Manufacturing as a metaphor for software (and software testing) is really a bad one.

    Shrini

    Thanks Shrini,

    I missed out a lot of cross discipline stuff. Seems like an unbounded set to me.

    I still get a lot of value from reading Deming, he seems like a systems thinker to me. I also found value in some of the lean manufacturing writings, although I never took the step of using manufacturing as a metaphor for software development (including testing).

  2. >>> I still get a lot of value from reading Deming, he seems like a systems thinker to me. I also found value in some of the lean manufacturing writings,

    The problem I see with people like Deming is that their domain of work is /primarily/ "manufacturing" - the world of concrete things - touch-feel-precise. In Lean, for example, the statement "identify waste and eliminate it" looks pretty common-sensical. I always struggled with Lean/Six Sigma because concepts like "waste", "re-work", "inspection" and "statistical variability" become vague when imported into software. A lean process guy can say "let's apply Pareto or fishbone analysis to identify waste and rearrange the process so that we eliminate it". Making software is a lot different from making a car.



    Unfortunately, as you mention in your other comment on the models, that problem does not reside with Deming. It resides with people taking other people's applications of his work and applying them to software in possibly sub-optimal ways.

    From my reading of Deming, it seemed that he taught people how to deal with systems. And taught statistical experimentation for an industry where variability cost a lot of money.

    You mixed a lot of concepts together in the above paragraph. Deming was pre-lean, and pre-the-ninjas-of-sigma. Deming created few tools and taught many concepts.

    You might enjoy John Seddon who applies Deming's work to service systems.

    I gained a lot by reading Deming. I still find value in some 'lean' books and a lot of nonsense in others. I found no value in six sigma. I found value in many of the early books on the TPS. When I read them I remember that they created their tools to solve their problem - I try and find out about their thought processes, not their tools, so I find more value in the early books. I still have much to read about that period and that industry.

    At some point I'll post a list of the books I got value from and what I gained.

    Thanks for all the comments,

    Alan

  3. The whole trouble with mindlessly copying ideas rooted in physical manufacturing is that a physical machine, and thus a physical process of any kind, is considerably simpler than a software system.

    With software, "anything else, anywhere else, anytime else" has the potential of screwing things up, whereas with a physical machine "the thing that smashed into something else must have been at the same place and at the same time, and so the other thing which was the root cause of the smash must have been nearby in both space and time." A considerably smaller number of possibilities.

    That's pretty much why I pooh-pooh the pundits who, I think, take their trademarked ideas away from a square-hole where perhaps they do apply (mechanical things and processes), and who then take those same ideas and shove them into the round-hole of software. Like it or not, we can never be "completely sure."

    We can, however, be "sure enough." Even though software systems may have a prodigious number of ways that they can screw-up, there's only a much smaller number of ways in which they actually will do so in real life. A good tester seems (to me) to be someone who knows how to write the tests that actually matter, in the scenarios that are actually reasonably likely to occur. And, a good tester knows how to explain and to defend her (it usually is a "her") decisions. You can run the test and understand clearly what the outcome of that test will ... and won't ... tell you.

    Yes, in a very abstract sense, the process of "testing anything" can be described, and certain statements can be made likewise "in a very abstract sense." But there's such a thing as too much abstraction. To be useful, those abstractions must be grounded in reality. And "physical systems" vs. "software systems" have two very-different sets of constraints that they each live by. One is apples; the other is toothpaste.



    Thanks for the comments Mike. "The whole trouble with mindlessly copying" might boil down to the "mindlessly" part.
