Wednesday, 20 February 2008

Some Notes on Software Testing Waste - my lean perspective

One key 'lean', or 'TPS', concept covers 'muda', or waste. Over my years in Software Testing I have tried to make my testing processes more 'Agile' or 'Lean', but when I started I didn't have the concept of 'muda'; I just had the knowledge that things weren't working, so I changed them and got rid of the things that I thought added no, or little, value. Now that I know a little about Agile and a little about Lean, I can try to apply the concept of 'muda' to testing.



The 7 types of waste

  1. Overproduction
  2. Waiting
  3. Transportation
  4. Inventory
  5. Motion
  6. Over-processing
  7. Defects

Overproduction - the production of 'things' before another process requires them. This seems like the very definition of a scripted testing process. In my experience as a scripted tester, when I held the belief that "all tests must have scripts", I indulged in this form of waste as standard practice.

Overproduction led to a lot of Inventory and Inventory led to Defects in the scripts because the system under test would obviously change as its development team continued to construct it.

One method of countering overproduction involved not writing test scripts. Instead I made sure that I learned how to use the system and exercise it in such a way that I could cover the test ideas that I created. I found that I could document my understanding using models and diagrams instead of sequential lists of steps.

I still created Inventory, in the form of a lot of test ideas, neatly categorised and cross-referenced to documented requirements, because our test process led to a lot of Over-processing. As a tester I created a lot of documentation that languished unread, because I believed that I had to tell my 'customers' as much as possible, rather than as much (or as little) as they needed to know to help them make a decision.

I found that the test teams I worked on had rhythms where we would end up Waiting (for requirements and specifications, environments and systems), and then a flurry of overwork would erupt because we had too much to do on the delivered 'thing'.

Transportation and Motion resonate less with my memories of software projects, but I do remember having to move test cases, defects, metrics etc. from test tools into 'reports' (sadly, reports which generally went unread).

Some strategies that I have used to deal with waste on test processes:

  1. Automatically generate reports and metrics
  2. Automatically generate test scripts from models
  3. Automatically maintain cross references
  4. Stop writing test scripts
  5. Writing clear high level 'test ideas'
  6. Cut down report contents to the important points
  7. View my 'test reports' as tools for me so that I gain value from them
  8. Talk people through 'diagrams' rather than writing a report

These were strategies in response to a process that, for one reason or another, I couldn't change, or didn't believe I could change.

'Fear' drove some of the wastes:

  1. Testing will get the blame if we 'miss' something so we need to document everything and get it approved
  2. We have to caveat our test plans and get them 'approved' so that people can't override us later
  3. If I don't write a script I might not cover all the 'things' I have to cover when I execute the 'test'

Some of the things these 'fears' tried to protect me from happened anyway, as a result of the 'relationships' in place. I now try to view the processes and people in place as a system, and examine the relationships in that system on a project-by-project basis.

Unjustifiable 'beliefs' about testing drove other wastes:

  1. All tests must have a test script
  2. Test plans must follow the IEEE format
  3. Users must sign off on test scripts
  4. Testing needs 'complete' requirements before we start test design
  5. We do not test unstable systems
  6. Repeat all tests when the system changes
  7. Testing must remain 'independent' from development

Most of these beliefs (and there are plenty more I could list) seem like excuses for not thinking, and for not identifying the most relevant approach to my particular environment.

'Chances are' you don't need to know the 7 wastes, you just need to listen to your gut and the unease which you feel for your current process and approach and take steps to reduce that unease.

'Chances are' you just need to look at the 'value' that each of your documents and processes adds to the overall development effort and cut out, as far as possible, those that add no, or little, value. If, during your analysis, you end up as the only identifiable person getting value from the 'thing', then make sure you produce just enough to maximise that value.

1 comment:

  1. Transportation is interesting. In testing, and indeed in all software development, we deal with the transportation of ideas. A requirements document is a vehicle to transport ideas, as is a 'test case', a test plan, etc.

    "Each time a product is moved it stands the risk of being damaged, lost, delayed, etc. as well as being a cost for no added value. Transportation does not make any transformation to the product that the consumer is willing to pay for." (http://en.wikipedia.org/wiki/Muda_%28Japanese_term%29). Every time we take an idea, put it in a requirements document, and then write a test based on the requirements document, the idea is transported several times, each time risking damage or loss. In fact, you often see that by the time an idea has gone through several documents, from high-level requirements, to low-level requirements, to specifications, to test cases, to sign-off reports, the original ideas which these documents serve to transport have been maimed beyond all recognition! Which may be why we end up with software that meets its specifications, but doesn't actually solve the problem it was designed to solve. Or doesn't do it well.