Automatic Creation of Acceptance Tests by Extracting Conditionals from Requirements: NLP Approach and Case Study

Daniel Mendez, Jannik Fischbach, Julian Frattini, Andreas Vogelsang, Michael Unterkalmsteiner, Parisa Yousefi, Pablo Restrepo Henao, Tedi Juricic, Carsten Wiecher and Jeannette Radduenz

Journal of Systems and Software, 2022 · DOI: https://doi.org/10.48550/arXiv.2202.00932

Abstract

Acceptance testing is crucial to determine whether a system fulfills end-user requirements. However, the creation of acceptance tests is a laborious task entailing two major challenges: (1) practitioners need to determine the right set of test cases that fully covers a requirement, and (2) they need to create test cases manually due to insufficient tool support. Existing approaches for automatically deriving test cases require semi-formal or even formal notations of requirements, though unrestricted natural language is prevalent in practice. In this paper, we present our tool-supported approach CiRA (Conditionals in Requirements Artifacts) capable of creating the minimal set of required test cases from conditional statements in informal requirements. We demonstrate the feasibility of CiRA in a case study with three industry partners. In our study, out of 578 manually created test cases, 71.8 % can be generated automatically. Additionally, CiRA discovered 80 relevant test cases that were missed in manual test case design. CiRA is publicly available at this http URL.
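To make the idea more concrete, the following minimal sketch illustrates the kind of derivation the abstract describes: turning a conditional requirement into a small test suite. It is not the authors' implementation and does not perform any NLP; the causes and effect (e.g., "user is logged in", "cart is not empty", "checkout button is enabled") are hypothetical and assumed to have been extracted already. The selection strategy shown (all causes true, plus one case per negated cause) is only one plausible way to keep the suite minimal for a conjunctive conditional.

```python
# Illustrative sketch only: CiRA uses NLP to detect and extract conditionals
# from informal requirements; here the extracted causes/effect are hard-coded.

def minimal_test_cases(causes, effect):
    """Derive a small test suite for a conjunctive conditional:
    one positive case with all causes true, plus one negative case
    per cause (that cause false, all others true)."""
    cases = []
    positive = {c: True for c in causes}
    cases.append((positive, f"expect: {effect}"))
    for c in causes:
        negative = {k: (k != c) for k in causes}
        cases.append((negative, f"expect: NOT {effect}"))
    return cases

# Hypothetical requirement: "If the user is logged in and the cart is not
# empty, then the checkout button is enabled."
causes = ["user is logged in", "cart is not empty"]
effect = "checkout button is enabled"

for inputs, expected in minimal_test_cases(causes, effect):
    print(inputs, "->", expected)
```

With two causes, this yields three test cases instead of the four a full truth table would produce, which reflects the abstract's point about generating the minimal set of required test cases rather than all combinations.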

URL: https://arxiv.org/abs/2202.00932