Novel artificial intelligence approach for automatic differentiation of fetal occiput anterior and non-occiput anterior positions during labor

Ultrasound Obstet Gynecol. 2022 Jan;59(1):93-99. doi: 10.1002/uog.23739.

Abstract

Objectives: To describe a newly developed machine-learning (ML) algorithm for the automatic recognition of fetal head position using transperineal ultrasound (TPU) during the second stage of labor and to describe its performance in differentiating between occiput anterior (OA) and non-OA positions.

Methods: This was a prospective cohort study including singleton term (> 37 weeks of gestation) pregnancies in the second stage of labor, with a non-anomalous fetus in cephalic presentation. Transabdominal ultrasound was performed to determine whether the fetal head position was OA or non-OA. For each case, one sonographic image of the fetal head was then acquired in an axial plane using TPU and saved for later offline analysis. Using the transabdominal sonographic diagnosis as the gold standard, an ML algorithm based on a pattern-recognition feed-forward neural network was trained on the TPU images to discriminate between OA and non-OA positions. In the training phase, the model tuned its parameters on the training dataset so that it would correctly identify the fetal head position, by exploiting geometric, morphological and intensity-based features of the images. In the testing phase, the algorithm was blinded to the occiput position as determined by transabdominal ultrasound. Using the test dataset, the ability of the ML algorithm to differentiate OA from non-OA fetal positions was assessed in terms of diagnostic accuracy. The F1-score and precision-recall area under the curve (PR-AUC) were calculated to assess the algorithm's performance, and Cohen's kappa (κ) was calculated to evaluate the agreement between the algorithm and the gold standard.
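
The article does not publish code, so the following is only a minimal sketch of the described pipeline (a feature-based feed-forward classifier trained on a subset of cases and evaluated on a held-out test set). The feature count, network architecture and scikit-learn components are assumptions for illustration, not the authors' implementation.

    # Minimal, illustrative sketch only: the authors' actual network, features and
    # hyperparameters are not published. Assumes each TPU image has already been
    # reduced to a fixed-length vector of geometric/morphological/intensity features.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier
    from sklearn.metrics import (accuracy_score, f1_score,
                                 average_precision_score, cohen_kappa_score)

    rng = np.random.default_rng(0)
    n_cases, n_features = 1219, 32              # cohort size from the paper; feature count is hypothetical
    X = rng.normal(size=(n_cases, n_features))  # placeholder feature vectors (one per TPU image)
    y = rng.integers(0, 2, size=n_cases)        # 1 = non-OA, 0 = OA (transabdominal gold standard)

    # ~70% of cases for training, the rest held out for testing, as in the study
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=0, stratify=y)

    # Pattern-recognition feed-forward neural network (architecture is an assumption)
    clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
    clf.fit(X_train, y_train)

    # Evaluation on the blinded test set, mirroring the metrics named in the abstract
    y_pred = clf.predict(X_test)
    y_score = clf.predict_proba(X_test)[:, 1]
    print("accuracy :", accuracy_score(y_test, y_pred))
    print("F1-score :", f1_score(y_test, y_pred))
    print("PR-AUC   :", average_precision_score(y_test, y_score))
    print("kappa    :", cohen_kappa_score(y_test, y_pred))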

Results: Over a period of 24 months (February 2018 to January 2020), at 15 maternity hospitals affiliated to the International Study group on Labor ANd Delivery Sonography (ISLANDS), we enrolled 1219 women in the second stage of labor into the study. On the basis of transabdominal ultrasound, they were classified as OA (n = 801 (65.7%)) or non-OA (n = 418 (34.3%)). From the entire cohort (OA and non-OA), approximately 70% (n = 824) of the patients were assigned randomly to the training dataset and the rest (n = 395) were used as the test dataset. The ML-based algorithm correctly classified the fetal occiput position in 90.4% (357/395) of the test dataset, including 224/246 (91.1%) with OA and 133/149 (89.3%) with non-OA fetal head position. Evaluation of the algorithm's performance gave an F1-score of 88.7% and a PR-AUC of 85.4%. The algorithm showed a balanced performance in the recognition of both OA and non-OA positions. The robustness of the algorithm was confirmed by high agreement with the gold standard (κ = 0.81; P < 0.0001).
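
As a worked illustration of how the agreement statistic follows from the reported test-set counts (224/246 OA and 133/149 non-OA correctly classified), the sketch below tallies the implied 2 × 2 table and recomputes overall accuracy and Cohen's kappa; small discrepancies with the published figures can arise from rounding of the reported counts.

    # Worked check from the counts reported in the abstract (not the authors' code).
    oa_correct, oa_total = 224, 246        # OA cases correctly classified / total OA in test set
    nonoa_correct, nonoa_total = 133, 149  # non-OA cases correctly classified / total non-OA

    n = oa_total + nonoa_total                    # 395 test cases
    accuracy = (oa_correct + nonoa_correct) / n   # 357/395 ≈ 0.904, matching the reported 90.4%

    # Cohen's kappa: observed agreement versus agreement expected by chance
    pred_oa = oa_correct + (nonoa_total - nonoa_correct)   # cases the algorithm called OA
    pred_nonoa = nonoa_correct + (oa_total - oa_correct)   # cases the algorithm called non-OA
    p_o = accuracy
    p_e = (oa_total * pred_oa + nonoa_total * pred_nonoa) / n**2
    kappa = (p_o - p_e) / (1 - p_e)        # ≈ 0.80, in line with the reported κ = 0.81

    print(f"accuracy = {accuracy:.3f}, kappa = {kappa:.3f}")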

Conclusions: This newly developed ML-based algorithm for the automatic assessment of fetal head position using TPU can differentiate accurately, in most cases, between OA and non-OA positions in the second stage of labor. This algorithm has the potential to support not only obstetricians but also midwives and accoucheurs in the clinical use of TPU to determine fetal occiput position in the labor ward. © 2021 International Society of Ultrasound in Obstetrics and Gynecology.

Keywords: artificial intelligence; fetal occiput position; intrapartum ultrasound; transperineal ultrasound.

MeSH terms

  • Adult
  • Area Under Curve
  • Artificial Intelligence*
  • Female
  • Fetus / diagnostic imaging
  • Fetus / embryology
  • Head / diagnostic imaging
  • Head / embryology
  • Humans
  • Labor Presentation*
  • Labor Stage, Second
  • Obstetric Labor Complications / diagnostic imaging*
  • Pregnancy
  • Prospective Studies
  • Ultrasonography, Prenatal / methods*