Evaluating guidelines for empirical software engineering studies

Research output: Chapters, Conference Papers, Creative and Literary Works (RGC: 12, 32, 41, 45)
Output type: 32_Refereed conference paper (with ISBN/ISSN)
Review status: Peer-reviewed

24 Scopus Citations

Author(s)

  • Barbara Kitchenham
  • Hiyam Al-Khilidar
  • Muhammad Ali Babar
  • Mike Berry
  • Karl Cox
  • Jacky Keung
  • Felicia Kurniawati
  • Mark Staples
  • He Zhang
  • Liming Zhu

Detail(s)

Original language: English
Title of host publication: ISESE'06 - Proceedings of the 5th ACM-IEEE International Symposium on Empirical Software Engineering
Pages: 38-47
Volume: 2006
Publication status: Published - 2006
Externally published: Yes

Publication series

Volume: 2006

Conference

Title: ISESE'06 - 5th ACM-IEEE International Symposium on Empirical Software Engineering
Place: Brazil
City: Rio de Janeiro
Period: 21 - 22 September 2006

Abstract

Background. Several researchers have criticized the standards of performing and reporting empirical studies in software engineering. To address this problem, Andreas Jedlitschka and Dietmar Pfahl have produced reporting guidelines for controlled experiments in software engineering. They pointed out that their guidelines needed evaluation. We agree that guidelines need to be evaluated before they can be widely adopted; if guidelines are flawed, they will cause more problems than they solve.

Aim. The aim of this paper is to present the method we used to evaluate the guidelines and to report the results of our evaluation exercise. We suggest our evaluation process may be of more general use if reporting guidelines are developed for other types of empirical study.

Method. We used perspective-based inspections to perform a theoretical evaluation of the guidelines. A separate inspection was performed for each perspective. The perspectives used were: Researcher, Practitioner/Consultant, Meta-analyst, Replicator, Reviewer and Author. Apart from the Author perspective, the inspections were based on a set of questions derived by brainstorming. The inspection using the Author perspective reviewed each section of the guidelines sequentially.

Results. The question-based perspective inspections detected 42 issues where the guidelines would benefit from amendment or clarification, and 8 defects.

Conclusions. Reporting guidelines need to specify what information goes into which section and to avoid excessive duplication. Software engineering researchers need to be cautious about adopting reporting guidelines that differ from those used by other disciplines. The current guidelines need to be revised, and the revised guidelines need to be subjected to further theoretical and empirical validation. Perspective-based inspection is a useful validation method, but the practitioner/consultant perspective presents difficulties.

Copyright 2006 ACM.

Research Area(s)

  • Controlled experiments, Guidelines, Perspective-based inspection, Software engineering

Citation Format(s)

Evaluating guidelines for empirical software engineering studies. / Kitchenham, Barbara; Al-Khilidar, Hiyam; Babar, Muhammad Ali; Berry, Mike; Cox, Karl; Keung, Jacky; Kurniawati, Felicia; Staples, Mark; Zhang, He; Zhu, Liming.

ISESE'06 - Proceedings of the 5th ACM-IEEE International Symposium on Empirical Software Engineering. Vol. 2006, 2006, p. 38-47.
