CC BY-NC-ND 4.0 · Rev Bras Ortop (Sao Paulo) 2019; 54(06): 644-648
DOI: 10.1016/j.rbo.2018.04.002
Original Article
Sociedade Brasileira de Ortopedia e Traumatologia. Published by Thieme Revinter Publicações Ltda, Rio de Janeiro, Brazil

Intra- and Interobserver Agreement Regarding the Walch Classification System for Shoulder Joint Arthritis[*]

Article in several languages: Portuguese | English
Lauro José Rocchetti Pajolli, Isabella Ferrari, Fábio Teruo Matsunaga, Nicola Archetti Netto, Marcel Jun Sugawara Tamaoki

1  Departamento de Ortopedia e Traumatologia, Escola Paulista de Medicina, Universidade Federal de São Paulo (Unifesp), São Paulo, SP, Brasil

Publication History

Received: 13 October 2017

Accepted: 03 April 2018

Publication Date:
13 December 2019 (online)

Abstract

Objective To evaluate the inter- and intraobserver agreement regarding the Walch classification system for shoulder arthritis.

Methods Computed tomography scans of the shoulder joints of adult patients, obtained between 2012 and 2016, were selected and classified by physicians with different levels of expertise in orthopedics. The images were examined at three different times, and the analyses were evaluated with the Fleiss kappa index to verify the intra- and interobserver agreement.
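
For reference, Fleiss' kappa compares the mean observed agreement across the rated images with the agreement expected by chance. The abstract does not restate the computation, so the following is a minimal Python sketch of the conventional formula; the rating counts and the five Walch categories in the example are hypothetical illustrations, not the study's data.

import numpy as np

def fleiss_kappa(counts):
    """counts: (n_subjects, n_categories) matrix in which each cell holds
    the number of observers who assigned that subject to that category;
    every row must sum to the same number of observers."""
    counts = np.asarray(counts, dtype=float)
    n_subjects = counts.shape[0]
    n_raters = counts[0].sum()
    # Proportion of all assignments falling in each category.
    p_cat = counts.sum(axis=0) / (n_subjects * n_raters)
    # Observed agreement for each subject.
    p_subj = ((counts ** 2).sum(axis=1) - n_raters) / (n_raters * (n_raters - 1))
    p_bar = p_subj.mean()      # mean observed agreement
    p_e = (p_cat ** 2).sum()   # chance-expected agreement
    return (p_bar - p_e) / (1 - p_e)

# Hypothetical example: 4 CT scans, 5 Walch categories (A1, A2, B1, B2, C),
# 6 observers rating each scan.
ratings = [
    [4, 1, 1, 0, 0],
    [0, 5, 0, 1, 0],
    [1, 1, 3, 1, 0],
    [0, 0, 0, 2, 4],
]
print(f"Fleiss kappa = {fleiss_kappa(ratings):.3f}")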

Results The kappa index for the intraobserver agreement ranged from 0.305 to 0.545. The interobserver agreement was very low at the end of the three evaluations (κ = 0.132).

Conclusion The intraobserver agreement regarding the modified Walch classification ranged from poor to moderate. The interobserver agreement was low.

* Work developed at Hospital da Pontifícia Universidade Católica de Campinas, Campinas, SP, Brazil. Originally published by Elsevier.