Summary
Background: We previously devised and published a guideline for reporting health informatics
evaluation studies named STARE-HI, which is formally endorsed by IMIA and EFMI.
Objective: To develop a prioritization framework of ranked reporting items to assist authors
reporting health informatics evaluation studies in space-restricted conference
papers, and to apply this prioritization framework to measure the quality of recent
health informatics conference papers on evaluation studies.
Method: We deconstructed the STARE-HI guideline to identify reporting items. In a web-based
survey, we invited 111 authors of health informatics evaluation studies, and reviewers
and editors of health informatics conference proceedings, to score these reporting
items on a scale from “0 – not necessary in a conference paper” to “10 – essential
in a conference paper”. From the responses we derived a mean priority score for each
item. All evaluation papers published in the proceedings of MIE2006, Medinfo2007,
MIE2008 and AMIA2008 were rated on these items by two reviewers, and from these
ratings a priority-adjusted completeness score was computed for each paper.
Results: We identified 104 reporting items from the STARE-HI guideline. The response rate
for the survey was 59% (66 out of 111). The most important reporting items (mean score
≥ 9) were “Interpret the data and give an answer to the study question – (in Discussion)”,
“Whether it is a laboratory, simulation or field study – (in Methods-study design)”
and “Description of the outcome measure/evaluation criteria – (in Methods-study design)”.
Within each reporting area, the statistically significantly more important reporting
items were distinguished from the less important ones. Four reporting items had a mean score ≤ 6.
The mean priority-adjusted completeness of evaluation papers from recent health informatics
conferences was 48% (range 14–78%).
Conclusion: We produced a list of STARE-HI reporting items ranked by their relevance for inclusion
in space-restricted conference papers. The priority-adjusted completeness scores
showed room for improvement in the analyzed conference papers. We believe this
prioritization framework will help improve the quality and utility of conference
papers on health informatics evaluation studies.
Keywords
Publication - standards - guidelines as topic - evaluation studies - congresses