Strategy is not where you are heading, but how you are getting somewhere in the long run. That goes for all strategies, test strategies included. Yet with test strategies we often get so caught up in the mechanics of selector strategies, testing types and techniques that we lose track of the higher purpose: moving the business towards a vision.
Explore, Code and Business
There are plenty of contexts where testing happens without formally involving testers, or without a formal testing phase. Still, contexts like those have testing as an implicit activity.
We neglect the business know-how when all we talk about with regard to smarter testing is either the practice of exploration or of coding. Let's bring smarter testing to the business and coding sides too.
One Test Case is All You Need
[Originally on Ministry of Testing, November 20, 2013: One Test Case is All You Need. Reposted from the web archives]
Test Master S1:E1
In Taskmaster (the TV show) celebrities are rated on solving challenges. At Test Bash Manchester 2020 the participants got a challenge too. A good interactive challenge, where you had to think outside the box. Encore MoT Team!
When Subject Matter Experts test
Teaser for [Test Bash Brighton 2020] : How to Coach Subject Matter Experts to Do Testing
In recent years I have been working on projects with no dedicated testers but plenty of testing. The testing has primarily been performed by subject matter experts. This is where it gets interesting, as my role on these projects has been to lead the testing being performed by people with limited experience in testing. They have no desire to be testing specialists either; after all, they are already specialists in their own subjects. Yet everyone agrees, and insists, that the testing needs doing. So how do we ensure that the testing being done is done well?
Having worked on several very different projects, all with subject matter experts doing the testing, I have been able to get both public process clerks and technology specialists to perform excellent testing. This talk is about the approaches that I have found work well:
- One approach is for me to prepare the test cases, but only as headlines. Sometimes preparing the tests as open questions helps too.
- Another approach is to lead the experts as if their project participation is voluntary. It probably is, but either way it helps to respect where they are coming from.
The lessons, though (good and bad), are relevant to many testers in other situations, especially when being the only "tester" on the team. The story applies equally to developers and business end users doing most of the testing, and you will have them contributing great testing in no time!
What you will know after the talk:
- An understanding of how testing looks when done by subject matter experts
- How to lead a testing activity with an appreciative and motivating style
- Examples of how teams can do great testing without dedicated testers

A Track down History
Currently I am doing a large V-model waterfall project with a three month testing period and 500+ requirements. To track the progress I want to dust off my old “test progress tracking” method that I matured and described in 2011 and 2014.
The approach was documented in two articles for "The Testing Planet", a paper magazine by the Software Testing Club. The "STC" later became the now world-famous Ministry of Testing. Unfortunately the original articles are no longer available on the MoT site – they are on the WayBackMachine. Rather than sending you down that path, here's a recap of:
- A Little Track History that goes a Long Way [The Testing Planet by Ministry of Testing, July 2011] The purpose of this tracking tool is to collect just enough data to answer the frequent question “Will we finish on time” [https://web.archive.org/web/20160410200818/https://www.ministryoftesting.com/2011/07/a-little-track-history-that-goes-a-long-way/]
- The Daily Defect Count and the Image of a Camel [The Testing Planet by The Ministry of Testing, April 2014] Count the defects daily – the ones that are part of the project work load. The number goes up and down during the cycle – why and what can you learn? [https://web.archive.org/web/20140718052546/https://www.ministryoftesting.com/2014/04/daily-defect-count-image-camel/]
A Little Track History that goes a Long Way
For large (enterprise, waterfall) projects, tracking test progress is important, as it helps us understand whether we will finish a given test scope on time. Tracking many projects has given me one key performance indicator: the daily percentage of passed tests compared to an S-curve. The S-curve is built from the following data points, drawn from numerous projects:
Time progress | Expected passed progress
10%           | 0%
20%           | 5%
30%           | 10%
40%           | 30%
50%           | 45%
60%           | 60%
70%           | 75%
80%           | 90%
90%           | 95%
100%          | 100%

If the "Actual" curve is flat over several days, or falls below the blue trend line, then investigate what is blocking the test progress… defects, perhaps:
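The trend-line comparison can be sketched in a few lines of Python. This is a minimal illustration, not the article's own tooling: the function names are made up, and the expected curve is simply linearly interpolated between the data points in the table above.

```python
# (time progress %, expected passed %) taken from the S-curve table above
S_CURVE = [(10, 0), (20, 5), (30, 10), (40, 30), (50, 45),
           (60, 60), (70, 75), (80, 90), (90, 95), (100, 100)]

def expected_passed(time_pct: float) -> float:
    """Linearly interpolate the expected passed % at a given time progress %."""
    if time_pct <= S_CURVE[0][0]:
        return float(S_CURVE[0][1])
    for (t0, p0), (t1, p1) in zip(S_CURVE, S_CURVE[1:]):
        if time_pct <= t1:
            return p0 + (p1 - p0) * (time_pct - t0) / (t1 - t0)
    return 100.0

def behind_schedule(time_pct: float, actual_passed_pct: float) -> bool:
    """Flag when the actual curve has fallen below the expected trend line."""
    return actual_passed_pct < expected_passed(time_pct)
```

For example, halfway through the test period (50% time progress) roughly 45% of the tests should be passing; an actual figure well below that is the cue to investigate blockers.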
The Defect Count and Camel Image
During a large project like this the active bug count goes up and down. No one can tell what we will find tomorrow, or how many. In my experience, tracking the daily count of active defects (i.e. not resolved) is key, and the count will oscillate like the humps of a camel:

If the curve doesn't bend down after a few days, there are bottlenecks in the timely resolution of the defects found. When the count goes up, it usually means a new (buggy) area is being tested. Over time the tops of the humps should get lower and lower, and by the end of the project drop steeply to 0.
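The daily count behind the camel curve can be computed from very little data. A minimal sketch, assuming each defect is recorded as an (opened, resolved) pair of dates, with resolved left as None while the defect is still active; the data shape and function name are illustrative:

```python
from datetime import date, timedelta

def active_counts(defects, start: date, end: date):
    """For each day in [start, end], count defects open (not yet resolved) that day."""
    counts = []
    day = start
    while day <= end:
        open_today = sum(
            1 for opened, resolved in defects
            if opened <= day and (resolved is None or resolved > day)
        )
        counts.append((day, open_today))
        day += timedelta(days=1)
    return counts
```

Plotting the second element of each pair over the test period gives the humps: rises when a buggy area is entered, falls when resolution keeps pace.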
And thus a little track history has come a long way.

A 30 Days Agile Experience
In September 2017 the Ministry of Testing ran a crowd-based knowledge-sharing event called "30 Days of Agile Testing", with a small learning activity for each day of the month. As with the similar security event, I set up a weekly schedule at work to meet for a time-boxed hour and discuss 3-5 selected topics each time.
Our score was 17 topics discussed – some more discussed than actually tried out, hence the half marks on the poster on the window below. My coworkers and I work on many different teams, so digging into specific team tools and processes was out of scope.
Here are a few of our findings:
- Day 3 – YouTube is full of videos on agile, but our Chinese colleague cannot view them. They can, though, view videos locally on the Dojo and TestHuddle.
- Day 4 – We discussed the Agile Manifesto in relation to GxP/regulated projects. GxP and agile can work together, when you control your deliveries with for instance ATDD. See DevOpsDays Seattle 2017: Continuous Delivery Sounds Great But … and How to build agility into a regulated project?
- Day 12/26 – We need the test cases in many contexts, but do we need the Test Plan specifically? Perhaps we can reduce the test plan to "Requirements/scope", "test cases" and "staffing", as in Plutora Test.
- Day 13 – Perhaps not an IDE, but then learn some other technical tools that are not automation: see 30 Days Of Agile Testing! Day Thirteen and 6 Technical Testing Skills that Aren't Automation
- Day 18 – Put the testers and test management tasks on the sprint board. Adjust the burn down for part time people. Keep all testers and other people in the project in the loop.
Links to “the Club” on some of the topics we selected:
- 30 Days of Agile Testing, Day 1: Agile testing books
- 30 Days of Agile testing, Day 2: Mindmap of what agile testing is
- 30 Days of Testing – Day 3 Videos
- 30 Days of Agile testing, Day 4: The agile manifesto
- 30 Days of Agile testing, Day 8: Talk to a developer, rather than creating a ticket
- 30 Days of Agile testing, Day 12: Test documentation
- 30 days of Agile Testing, Day 26: What does your Test Plan look like?