WeTest Workshop: Regression Obsession


31 May 2013 • Written by Adam Howard, Aaron Hodder & Katrina Clokie

Michael Bolton was the special guest at the Wellington WeTest Workshop on 18 April, a session focused on 'Regression Obsession'.

With everyone settled, beverages in hand and notebooks at the ready, Michael ripped into a 20-minute summary of a presentation on Regression Obsession that usually takes an hour.

Michael set about challenging the disproportionate focus placed on regression testing, queried how many bugs it really finds and asked why regression merits its own testing phase. This was the perfect opening to the evening’s main event – Open Season – a moderated discussion of the topic on the table.

One attendee started the debate by describing how he felt that the obsession with regression generally comes from the top of an organisation and is prompted by fear. He pointed out that it is more embarrassing for a manager to release a bug in a piece of functionality that used to work than to release one in a piece of functionality that has never worked.

This idea of re-testing what’s known to work led us to a discussion of an important distinction: regression and repetition are different things. That is, we don't necessarily need to look for regression problems in the same way each time.

Michael stated that automated checks should be used to ask "Is there a problem here?" rather than "Does it work?" A passing test doesn't mean that the product is ready to ship. It means only that the specific problem we are looking for, in the particular place the check is made, was not found.

Checks should be used to free testers up to investigate the product, rather than having them spend their time on the confirmatory testing activities that automated checks are well suited to.
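
To make that framing concrete (a sketch of our own rather than anything shown on the night), a check written in this spirit targets one specific, nameable problem instead of claiming that a feature 'works'. The calculate_discount function and the rounding defect it guards against are hypothetical; the check is written in pytest style.

```python
# Hypothetical pytest-style check that asks "Is there a problem here?"
# rather than "Does it work?". It looks for one specific, previously
# observed problem: a discount calculation that truncated fractions of
# a cent instead of rounding them.

def calculate_discount(price_cents: int, percent: int) -> int:
    """Return the discount in whole cents, rounded to the nearest cent."""
    return round(price_cents * percent / 100)


def test_discount_is_rounded_not_truncated():
    # 999 cents at 15% is 149.85 cents; the old defect truncated this to 149.
    # A pass tells us only that this one problem, in this one place, was not
    # found. It says nothing about whether the product is ready to ship.
    assert calculate_discount(999, 15) == 150
```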

Another member of the group had also experienced a regression obsession from higher levels – in her case a distinct two-week phase of regression testing was mandated by the Steering Committee within the organisation.

The consensus was that although regression testing is often prescribed as a distinct phase, it is actually an activity that occurs – or should occur – within all testing.

Assurity’s Aaron Hodder presented the analogy of a roast dinner. The diner doesn't approach the plate with a phased schedule: one minute for eating peas, three minutes for carrots, two minutes for chicken. What if they want more peas? Or would like to eat the chicken and carrots together?

Instead of specifying the items on a plate, a person would simply say that they are ‘eating a roast’. Similarly, testers should label their activity as testing, performing the activities they attribute most value to at any given time, rather than allocating time against perceived phases.

Towards the end of the evening, the topic of cost vs. benefit for regression testing was raised. This had already been touched upon, with Michael stating that regression testing found very few defects and Nigel Charman of Assurity presenting the ‘Goldilocks problem’ of determining the right amount of regression testing to run, but the passion of the room sparked a red card or two and took the debate to another level.

The consensus was that the earlier in the SDLC automated functional checks are present, the better. Unit testing as a means of basic regression was seen as cost effective and essential, but it was more difficult to pin down definite answers around costs vs. benefits later in the lifecycle and at the systems testing level.

If regression testing is finding a large number of bugs, this may be an indicator of an endemic development issue that would be very valuable to fix. Alternatively, regression testing may find very few bugs – but ones of high importance – and so would still be a valuable exercise.

As always, it was interesting to hear how different testers from different organisations do things, as well as their struggles and triumphs. The animated discussion during the break and after the event – even with the double distraction of pizza and beer – was testament to the energy and passion of the attendees, leaving everyone buzzing and deep in thought.

Thanks again to Assurity for sponsoring the event, allowing testing practitioners to share their stories with one another for free and to continue discovering and progressing the craft of software testing.

Adam Howard with Katrina Edgar and Aaron Hodder
