How Custom Testing Traps Emerge

Back in 2009, at a software testing conference, I saw a talk entitled "Traps of Custom Testing." I liked the talk but noticed that some things were missing. So in 2010, at the next conference, I gave a presentation entitled "How Custom Testing Traps Emerge."
But first things first.

In the first presentation, the speaker pointed to three stereotypes in customers' behavior: greediness, laziness, and bluntness.


Greediness

The software testing team asks for additional time to conduct tests. The customer objects, saying: "You just want to blow the budget." The testing team's reasons – the increased scope of functionality, delays in development, unclear requirements, and a high defect density – are ignored.

The presenter's explanation is that the customer lacks an understanding of the situation, process transparency, and certainty of outcomes. She suggests explaining to the customer the risks and the resulting decrease in quality, prioritizing test areas, and defining the criteria for test completion and acceptance.

I think the customer has every right to make such objections since in the previous projects:

  • Quality assurance (not only testing!) costs were not estimated separately
  • More efficient development ensured lower costs of testing
  • Testing costs did not ensure acceptable product quality (although they were high) due to low general process culture
  • Tenders were organized so that one of the selection criteria was a justified total cost of the project


Laziness

The software testing team does not plan to write product test documentation. The customer objects, saying that they are just lazy! The testing team's reasons are ignored: we did not write test plans and test cases because we had no time; we did not do it because it had not been agreed; we will not write a test report because activities were not recorded, and the customer saw that and agreed.

Her explanation is, again, that the customer lacks an understanding of the situation, process transparency, and certainty of outcomes. And she suggests explaining to the customer that a test plan and a test strategy should always be in place and approved, that there should be customer representatives in charge of the business domain, that scenarios should always be written, and that the customer should be warned about risks.

I think the customer has every right to make such objections since in the previous projects:

  • It was the absence of those actions/artifacts that led to the project failure
  • The software testing team was indeed lazy and did not provide the promised deliverables (for example, test plans) which had been paid for
  • There was no objective evaluation of project quality and no possibility to control its progress
  • Tenders were organized so that one of the selection criteria was a justified degree of following the customer’s processes


Bluntness

The software testing team insists that they are all high-class experts and can do their job well. The customer argues that they are no more than "high-class mouse clickers" and asks them to show their certificates. The testing team's reasons – that testing a specific domain requires particular experts, and that they have no certificates but enough experience and skills – are ignored.

The presenter's explanation is that the customer lacks an understanding of the situation, process transparency, and certainty of outcomes. And she suggests explaining to the customer that the team needs to create test artifacts and maintain a portfolio, and that manual testing is very important.

I think the customer has every right to make such objections since in the previous projects:

  • In failed projects, the teams also insisted that they had the required expertise and skills
  • The project team’s declared experience and expertise were not confirmed by any artifacts (results of successfully completed projects)
  • They had successful projects implemented by a highly professional team
  • Tenders were organized so that one of the selection criteria was the existence of confirmed expertise of the project team



Roots

For all the stereotypes, the presenter pointed out that the customer lacks an understanding of the situation, process transparency, and certainty of outcomes.

But first things first.

  • Understanding the situation – why does the customer need to understand it?
  • Process transparency – why does the customer need to see it?
  • Certainty of outcomes – the customer has reasons to be uncertain

Even the names of these stereotypes can be interpreted differently:

Greediness
  • The customer - reasonable economy
  • The project team - profitability

Laziness
  • The customer - saving efforts
  • The project team - saving costs

Bluntness
  • The customer - a need for guarantees
  • The project team - why not obtain certificates confirming the existence of experience and expertise

Conclusion:
  • Stereotypes are mostly formed by project teams themselves
  • The goal is not to overcome stereotypes that have already formed (or are forming) but to prevent them from forming in the first place
  • Different project teams reap the “benefits” of each other’s work
  • Efforts should be coordinated, primarily in terms of managing the customer’s expectations
  • You should not try to re-educate the customer but work on projects that meet the customer’s reasonable expectations

Check out our software testing training courses and start developing or expanding your software testing skills.

Come learn with us!

Alexandr Alexandrov
Software Testing Consultant
Still have questions?
Get in touch with us