Ultraschall Med 2022; 43(S 01): S9
DOI: 10.1055/s-0042-1749501
Abstracts
Gynecology

Artificial Intelligence Algorithm for the automatic classification of anterior/posterior/transverse fetal occiput positions during labor

Ruben Ramirez Zegarra 1, 2, Andrea Dall'Asta 1, Francesco Conversano 3, M.G. Di Trani 4, R. Morello 3, P. Pisani 3, M. Di Paola 3, Sergio Casciaro 3, Tullio Ghi 1

Author Affiliations
1   Department of Medicine and Surgery, Unit of Surgical Sciences, Obstetrics and Gynecology, University of Parma
2   Technical University of Munich, Hospital rechts der Isar
3   National Research Council, Institute of Clinical Physiology
4   Amolab srl
 

Objective To develop an Artificial Intelligence (AI) algorithm that automatically classifies intrapartum ultrasound (US) images into fetal Occiput Anterior (OA), Occiput Posterior (OP) or Occiput Transverse (OT) position using two Convolutional Neural Networks (CNNs) working in sequence.

Methods Multicenter international prospective study including 21 maternity units, conducted on singleton term pregnancies with a fetus in cephalic presentation in the second stage of labor. Transperineal US images of the fetal head on the axial plane were selected, classified as fetal OA, OP or OT position, and archived on a cloud platform for remote analysis. Two CNNs were independently trained to classify the fetal head position into OA/non-OA (CNNA/nA) and OP/OT (CNNP/T), respectively, and a balanced dataset was created for each CNN. Both CNNs were trained on labeled data (training dataset) during the training phase. During the testing phase, we evaluated the diagnostic accuracy of the two CNNs working in sequence on unlabeled data (testing dataset), as follows: 1) the image is first classified by CNNA/nA as OA or non-OA position; 2) if the image is classified as OA position, the algorithm ends; 3) if the image is classified as non-OA position, CNNP/T classifies it as OP or OT position.
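The sequential decision rule applied during the testing phase can be outlined as follows. This is a minimal illustrative sketch in Python, not the authors' implementation: the callables cnn_a_na and cnn_p_t are hypothetical stand-ins for the trained CNNA/nA and CNNP/T classifiers, whose architectures are not specified in the abstract.

from typing import Any, Callable

def classify_occiput_position(
    image: Any,
    cnn_a_na: Callable[[Any], str],  # hypothetical stand-in for CNNA/nA: returns "OA" or "non-OA"
    cnn_p_t: Callable[[Any], str],   # hypothetical stand-in for CNNP/T: returns "OP" or "OT"
) -> str:
    """Two-CNN cascade: stage 1 separates OA from non-OA; only non-OA images
    reach stage 2, which separates OP from OT."""
    if cnn_a_na(image) == "OA":
        return "OA"            # classified as OA: the algorithm ends here
    return cnn_p_t(image)      # classified as non-OA: CNNP/T decides OP vs. OT

In this scheme only images flagged as non-OA by the first network are passed to the second network, mirroring steps 1-3 above.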

Results A total of 1191 transperineal US images of the fetal head on the axial plane were analyzed. CNNA/nA correctly classified the fetal occiput position into OA or non-OA position in 98.3% of the cases. CNNP/T correctly classified the fetal head position into OP or OT position in 90.7% of the cases. The overall accuracy of the AI algorithm for the classification of OA, OP or OT position was 94.9%.

Conclusion An AI algorithm for the automatic assessment of the fetal head position at transperineal ultrasound has been developed; it can accurately distinguish between OA, OP and OT positions starting from ultrasound images acquired on the transperineal axial plane. This indicates that CNNs can be successfully used for the automatic classification of intrapartum US images. Moreover, our work suggests that CNNs could be employed for the identification of further fetal occiput positions.



Publication History

Article published online:
20 June 2022

© 2022. Thieme. All rights reserved.

Georg Thieme Verlag
Rüdigerstraße 14, 70469 Stuttgart, Germany