CC BY 4.0 · Int J Angiol
DOI: 10.1055/s-0043-1761280
Original Article

Validity of Routinely Reported Rutherford Scores Reported by Clinicians as Part of Daily Clinical Practice

1 Department of Vascular Surgery, University Hospitals Leuven, Leuven, Belgium
2 Department of Biomedical Data Sciences, Medical Decision Making, Leiden University Medical Centre, Leiden, The Netherlands

Perla J. Marang-van de Mheen
2 Department of Biomedical Data Sciences, Medical Decision Making, Leiden University Medical Centre, Leiden, The Netherlands

Louis Thielman
1 Department of Vascular Surgery, University Hospitals Leuven, Leuven, Belgium

Pieter Stijnen
3 Management Information and Reporting, University Hospitals Leuven, Leuven, Belgium

Jaap F. Hamming
4 Department of Vascular Surgery, Leiden University Medical Centre, Leiden, The Netherlands

Inge Fourneau
1 Department of Vascular Surgery, University Hospitals Leuven, Leuven, Belgium
Funding: This research was funded by the joint PhD program of Leiden University Medical Center (reference 18-1921) and the Department of Cardiovascular Sciences, KU Leuven (Belgium).

Abstract

Routinely reported structured data from the electronic health record (EHR) are frequently used for secondary purposes. However, it is unknown how valid routinely reported data are for reuse.

This study aimed to assess the validity of Rutherford scores routinely reported by clinicians, as an indicator of the validity of structured data in the EHR.

This observational study compared clinician-reported Rutherford scores with Rutherford scores assigned by medical record review for all visits to the vascular surgery department between April 1, 2016, and December 31, 2018. Free-text fields containing clinical information were extracted for each visit to assign the medical record review Rutherford score, after which agreement with the clinician-reported Rutherford score was assessed using Fleiss' kappa.
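As a rough illustration of this agreement analysis (a minimal sketch only; the paper does not state which software was used, and the visit-level scores below are invented for demonstration), Fleiss' kappa can be computed from a visits-by-raters table of categorical Rutherford scores, for example with statsmodels:

```python
# Sketch: Fleiss' kappa for agreement between two "raters"
# (clinician-reported vs. medical-record-review Rutherford score).
# Hypothetical data; the study's actual pipeline is not published here.
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# One row per visit, one column per rater; entries are Rutherford
# categories 0-6 (missing scores would be handled before this step).
scores = np.array([
    [0, 0],
    [3, 3],
    [2, 1],  # one disagreement between the two raters
    [5, 5],
    [4, 4],
])

# aggregate_raters converts raw ratings into a visits x categories count table
table, categories = aggregate_raters(scores)

kappa = fleiss_kappa(table, method='fleiss')
print(f"Fleiss' kappa: {kappa:.2f}")
```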

A total of 6,633 visits were included for medical record review. Substantial agreement was found between clinician-reported and medical record review Rutherford scores for the left (κ = 0.62, confidence interval [CI]: 0.60–0.63) and right leg (κ = 0.62, CI: 0.60–0.64). This increased to almost perfect agreement for the left (κ = 0.84, CI: 0.82–0.86) and right leg (κ = 0.85, CI: 0.83–0.87) when missing clinician-reported Rutherford scores were excluded. Expert judgment was rarely required as the deciding factor (11 of 6,633 visits).
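The qualitative labels "substantial" and "almost perfect" follow the widely used Landis and Koch (1977) bands for interpreting kappa. The helper below (illustrative naming, not from the paper) makes that mapping explicit and reproduces the labels applied to the reported values:

```python
# Sketch: the Landis-Koch interpretation bands behind terms such as
# "substantial" and "almost perfect" agreement. Function name is ours.
def interpret_kappa(kappa: float) -> str:
    """Map a kappa value to its Landis & Koch (1977) qualitative label."""
    if kappa < 0.00:
        return "poor"
    if kappa <= 0.20:
        return "slight"
    if kappa <= 0.40:
        return "fair"
    if kappa <= 0.60:
        return "moderate"
    if kappa <= 0.80:
        return "substantial"
    return "almost perfect"

print(interpret_kappa(0.62))  # "substantial" (all visits, missing included)
print(interpret_kappa(0.84))  # "almost perfect" (missing scores excluded)
```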

Substantial agreement between clinician-reported Rutherford scores and medical record review Rutherford scores was found, which could be an indicator of the validity of routinely reported data. Depending on the purpose, secondary use of routinely collected Rutherford scores is a viable option.



Publication History

Article published online: 25 February 2023

© 2023. International College of Angiology. This is an open access article published by Thieme under the terms of the Creative Commons Attribution License, permitting unrestricted use, distribution, and reproduction so long as the original work is properly cited. (https://creativecommons.org/licenses/by/4.0/)

Thieme Medical Publishers, Inc.
333 Seventh Avenue, 18th Floor, New York, NY 10001, USA

 