Test Case Calamity

One question. Do you manually create your test cases? If the answer is yes, you might want to turn away now, because our research has turned up some pretty ugly conclusions.

We looked at a large multinational financial services company and examined how its development teams approached test case design across a number of projects. As with most organizations, they tackled testing manually.

Digging a little deeper, we uncovered five failures that can easily derail projects and increase project costs.
We also contrasted the existing method with an automated route, and the comparison was something of a wake-up call:

Time wasting

  • The financial services company created 11 test cases in 6 hours with 16% requirement coverage – nowhere near adequate
  • Agile Designer automatically created 17 test cases in 2 hours with 100% coverage

Overstaffing

  • When a requirement changed, it took 2 testers 2 days to manually check and update ALL of their existing test cases
  • Agile Designer took 5 minutes

Overtesting

  • On one project, it took the company 5 hours to manually create 150 test cases, which delivered 80% coverage – 18x more testing than was needed. With each test case costing about $200 to run, that was $26K wasted
  • It took Agile Designer 40 minutes to create 19 test cases with 95% coverage, at a fraction of the cost

Undertesting

  • Another project relied on 3 test cases providing just 5% coverage; this resulted in bugs making it into production, which are expensive to fix
  • Agile Designer generated 12 test cases with 100% coverage in 30 minutes

Case overload

  • In one project, the possible number of cases identified was 326
  • Agile Designer identified that only 17 were needed for 100% coverage – the sketch after this list shows why such a reduction is possible
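
To give a sense of how hundreds of candidate cases can collapse to a handful, here is a minimal sketch of a greedy pairwise-coverage heuristic in Python. The parameter names, values and counts are made-up assumptions, and this is not a description of how Agile Designer works internally; it only illustrates why requiring every pair of values to be exercised, rather than every full combination, shrinks the test set so dramatically.

    # Illustrative only: a tiny greedy pairwise-coverage generator.
    # The parameters below are hypothetical assumptions, not taken from the
    # project above. Exhaustively combining them gives 3 x 3 x 2 x 3 x 2 = 108
    # test cases; covering every PAIR of values needs only roughly 9-12.
    from itertools import combinations, product

    parameters = {
        "account_type": ["current", "savings", "business"],
        "currency":     ["GBP", "USD", "EUR"],
        "channel":      ["online", "branch"],
        "amount_band":  ["low", "medium", "high"],
        "customer":     ["new", "existing"],
    }
    names = list(parameters)

    def pairs_covered(case):
        """All (parameter, value) pairs exercised by a single test case."""
        items = list(zip(names, case))
        return {frozenset(p) for p in combinations(items, 2)}

    all_cases = list(product(*parameters.values()))
    print("Exhaustive combinations:", len(all_cases))   # 108

    # Every (parameter, value) pair that must be covered somewhere
    required = set().union(*(pairs_covered(c) for c in all_cases))

    # Greedily pick the case covering the most still-uncovered pairs
    chosen, uncovered = [], set(required)
    while uncovered:
        best = max(all_cases, key=lambda c: len(pairs_covered(c) & uncovered))
        chosen.append(best)
        uncovered -= pairs_covered(best)

    print("Cases needed for full pairwise coverage:", len(chosen))

The exact numbers will differ by project, but the principle is the same: the space of possible combinations grows multiplicatively, while the set of cases needed for a given coverage target stays small.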

Any organization that relies on manual test case design has to accept that it is a surefire way to build in project failure, and that expensive rework is inevitable.

So, if your status quo is too difficult to challenge, carry on.

Alternatively, automate your test case design and demand the following:

  • A 95% reduction in test case creation time
  • An 80% improvement in functional coverage – from 20% to close to 100%
  • A fourfold reduction in overtesting – the average we found
  • A 20% to 30% reduction in project cost

The Orasi webinar on May 28th demonstrated the Grid-Tools data masking and subsetting products and their usage within a test cycle.
Participants gained an understanding of the following:

  • When data masking is needed and how to use Grid-Tools Enterprise Data Masking (a generic masking sketch follows this list)
  • Why subsetting is beneficial and how to use Grid-Tools Data Subset
  • How to integrate data masking and subsetting into your Test Data Management Strategy
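
For readers unfamiliar with data masking, here is a minimal, generic sketch of the core idea in Python. It is an illustrative assumption, not Grid-Tools Enterprise Data Masking: sensitive values are replaced with deterministic, realistic-looking substitutes so that relationships in the test data still hold while real values never leave production.

    # Illustrative only: a generic data-masking sketch, not the Grid-Tools
    # product. Each sensitive value is mapped to a deterministic alias, so
    # the same input always masks to the same output and joins stay intact.
    import hmac, hashlib

    # Assumption: this key is held outside the test environment
    SECRET_KEY = b"replace-with-a-key-kept-out-of-test"

    def mask_value(value: str, alias_pool: list) -> str:
        """Deterministically map a sensitive value to an alias from a pool."""
        digest = hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).digest()
        return alias_pool[int.from_bytes(digest[:4], "big") % len(alias_pool)]

    aliases = ["Alice", "Bob", "Carol", "Dev", "Erin", "Farid"]
    rows = [
        {"customer_id": "C-1001", "first_name": "Margaret", "balance": 1520.00},
        {"customer_id": "C-1002", "first_name": "Johannes", "balance": 87.50},
        {"customer_id": "C-1001", "first_name": "Margaret", "balance": -42.10},
    ]

    for row in rows:
        print(dict(row, first_name=mask_value(row["first_name"], aliases)))
    # "Margaret" masks to the same alias in both of its rows, so the masked
    # data behaves like production without exposing the real name.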

Access the recording here.


By Huw Price, Managing Director, Grid-Tools
