
DOI: 10.1055/s-0044-1792083
Comparison of Two Quality Analysis Checklists Used to Appraise Studies Regarding the Assessment of Auditory Processing Disorder in Older Adults
Authors Vipin Ghosh, Asha Yathiraj, Darshan Devananda
Funding The authors declare that they did not receive financial support from agencies in the public, private, or non-profit sectors to conduct the present study.
Abstract
Introduction A meta-analysis of published articles is usually performed using standard scales and checklists. Several such scales and checklists are reported in the literature; however, there is little information regarding their relative utility to guide the selection of the most appropriate one, especially in the field of audiology.
Objective The current study aimed to compare quality analyses carried out using the standard quality assessment criteria (SQAC) for evaluating primary research papers from a variety of fields and the Modified Downs and Black Checklist (MDBC) on a set of articles in the area of auditory processing deficits (APDs) in older adults.
Methods Two published checklists suitable for the field of audiology (the SQAC and the MDBC) were used for a quality analysis of articles on APD in older adults. The two checklists were compared after their items were categorized into five subsections. Two audiologists rated the articles according to both checklists.
Results The interrater reliability was found to be good for both checklists. Significant differences between the checklists were observed for specific subsections. However, there was no significant correlation between the two checklists.
Conclusion It is inferred that the selection of an appropriate quality assessment checklist depends on the objective of the study. If the aim of a quality analysis is to differentiate articles based on their overall caliber, or primarily based on the subsections, the SQAC is recommended. However, if the aim is to distinguish research articles primarily based on the control of variables, or to differentiate intervention-based studies, the MDBC is recommended.
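
The abstract reports inter-rater reliability for each checklist and the correlation between the two checklists. As a rough illustration only, the following Python sketch shows how such statistics are commonly computed: an intraclass correlation coefficient (ICC) per checklist, following the guidance of Koo and Li (reference 12), and a Spearman correlation between the checklists' per-article scores. The example data, column names, and the use of the pingouin and scipy libraries are illustrative assumptions, not the authors' analysis code.

```python
# Hedged sketch: ICC-based inter-rater reliability and a rank correlation
# between two quality checklists. All values and column names are hypothetical.
import pandas as pd
import pingouin as pg
from scipy.stats import spearmanr

# Hypothetical long-format ratings: one row per article x rater,
# with each checklist's total expressed as a percentage of its maximum.
ratings = pd.DataFrame({
    "article":  [1, 1, 2, 2, 3, 3, 4, 4, 5, 5],
    "rater":    ["A", "B"] * 5,
    "sqac_pct": [82, 80, 65, 68, 91, 89, 73, 75, 58, 61],
    "mdbc_pct": [74, 76, 60, 58, 88, 85, 79, 77, 55, 59],
})

# Inter-rater reliability of each checklist (ICC2: two-way random effects,
# absolute agreement, single rater), as recommended by Koo and Li (2016).
for checklist in ("sqac_pct", "mdbc_pct"):
    icc = pg.intraclass_corr(data=ratings, targets="article",
                             raters="rater", ratings=checklist)
    print(checklist, icc.loc[icc["Type"] == "ICC2", ["ICC", "CI95%"]].to_string(index=False))

# Correlation between the two checklists, using each article's mean score
# across the two raters.
means = ratings.groupby("article")[["sqac_pct", "mdbc_pct"]].mean()
rho, p = spearmanr(means["sqac_pct"], means["mdbc_pct"])
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```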
Publication History
Received: 03 October 2023
Accepted: 03 September 2024
Article published online: 29 May 2025
© 2025. The Author(s). This is an open access article published by Thieme under the terms of the Creative Commons Attribution 4.0 International License, permitting copying and reproduction so long as the original work is given appropriate credit (https://creativecommons.org/licenses/by/4.0/).
Thieme Revinter Publicações Ltda.
Rua do Matoso 170, Rio de Janeiro, RJ, CEP 20270-135, Brazil
Vipin Ghosh, Asha Yathiraj, Darshan Devananda. Comparison of Two Quality Analysis Checklists Used to Appraise Studies Regarding the Assessment of Auditory Processing Disorder in Older Adults. Int Arch Otorhinolaryngol 2025; 29: s00441792083.
References
- 1 Haidich AB. Meta-analysis in medical research. Hippokratia 2010; 14 (Suppl. 01) 29-37
- 2 La Torre G, Chiaradia G, Gianfagna F. Quality assessment in meta-analysis. Ital J Public Health 2006; 3 (02) 44-50
- 3 Luchini C, Veronese N, Nottegar A. et al. Assessing the quality of studies in meta-research: Review/guidelines on the most important quality assessment tools. Pharm Stat 2021; 20 (01) 185-195
- 4 Sterne JAC, Savović J, Page MJ. et al. RoB 2: a revised tool for assessing risk of bias in randomised trials. BMJ 2019; 366: l4898
- 5 Detsky AS, Naylor CD, O'Rourke K, McGeer AJ, L'Abbé KA. Incorporating variations in the quality of individual randomized trials into meta-analysis. J Clin Epidemiol 1992; 45 (03) 255-265
- 6 Begg C, Cho M, Eastwood S. et al. Improving the quality of reporting of randomized controlled trials. The CONSORT statement. JAMA 1996; 276 (08) 637-639
- 7 Jadad AR, Moore RA, Carroll D. et al. Assessing the quality of reports of randomized clinical trials: is blinding necessary?. Control Clin Trials 1996; 17 (01) 1-12
- 8 Wells G, Shea B, O'Connell D, Peterson J, Welch V. The Newcastle-Ottawa Scale (NOS) for assessing the quality of nonrandomised studies in meta-analyses. https://scholar.archive.org/work/zuw33wskgzf4bceqgi7opslsre/access/wayback/http://www3.med.unipmn.it/dispense_ebm/2009-2010/Corso%20Perfezionamento%20EBM_Faggiano/NOS_oxford.pdf
- 9 von Elm E, Altman DG, Egger M, Pocock SJ, Gøtzsche PC, Vandenbroucke JP. STROBE Initiative. The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. Ann Intern Med 2007; 147 (08) 573-577
- 10 Kmet L, Cook L, Lee R. Standard quality assessment criteria for evaluating primary research papers from a variety of fields. https://era.library.ualberta.ca/items/48b9b989-c221-4df6-9e35-af782082280e/download/a1cffdde-243e-41c3-be98-885f6d4dcb29
- 11 Downs SH, Black N. The feasibility of creating a checklist for the assessment of the methodological quality both of randomised and non-randomised studies of health care interventions. J Epidemiol Community Health 1998; 52 (06) 377-384
- 12 Koo TK, Li MY. A Guideline of Selecting and Reporting Intraclass Correlation Coefficients for Reliability Research. J Chiropr Med 2016; 15 (02) 155-163 https://doi.org/10.1016/j.jcm.2016.02.012

