Automatic standard plane and diagnostic usability classification in obstetric ultrasounds

Bibliographic Details
Main Authors: Adam Lim, Mohamed Abdalla, Farbod Abolhassani, Wyanne Law, Benjamin Fine, Dafna Sussman
Format: Article
Language: English
Published: Elsevier, 2024-12-01
Series: WFUMB Ultrasound Open
Subjects: Deep learning; Obstetric ultrasound; Convolutional neural network; Image classification
Online Access: http://www.sciencedirect.com/science/article/pii/S2949668324000181
author Adam Lim
Mohamed Abdalla
Farbod Abolhassani
Wyanne Law
Benjamin Fine
Dafna Sussman
collection DOAJ
description Objective: This study introduces an end-to-end deep learning pipeline that automatically classifies and orders fetal ultrasound standard planes in alignment with the guidelines of the Canadian Association of Radiologists, while also assessing the diagnostic usability of each view. The primary objective is to reduce the manual, cumbersome steps that interpreting radiologists face in the existing obstetric ultrasound workflow. Methods: We compiled a diverse dataset of 33,561 de-identified two-dimensional obstetric ultrasound images acquired between January 1, 2010, and June 1, 2020. The dataset was categorized into 19 standard-plane classes and partitioned into training, validation, and testing subsets using a 60:20:20 stratified split. Both the standard plane and diagnostic usability networks are built on a convolutional neural network architecture and use transfer learning. Results: The standard plane classification network achieved 99.4% accuracy and a 98.7% F1 score. The diagnostic usability network also performed strongly, reaching 80% accuracy and an 82% F1 score. Notably, this study is the first to investigate whether deep learning methods can surpass sonographers at the standard plane labeling task, with several instances in which the algorithm corrected planes that sonographers had mislabeled. Conclusion: These results highlight the algorithm's potential to be integrated into clinical settings as a reliable assistive tool, alleviating the cognitive workload faced by radiologists and improving efficiency and diagnostic outcomes in the current obstetric ultrasound process.
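The methods summary above names three concrete steps: a 60:20:20 stratified split, a transfer-learning convolutional network for 19 standard-plane classes, and accuracy/F1 evaluation. The sketch below is a minimal, hypothetical illustration of those steps only, not the authors' implementation; the ResNet-50 backbone, the PyTorch/scikit-learn tooling, the macro-F1 averaging, and the `image_paths`/`labels` inputs are all assumptions.

```python
# Hypothetical sketch only -- not the authors' code. It illustrates the three
# methods steps named in the abstract: a 60:20:20 stratified split, a
# transfer-learning CNN head for 19 standard-plane classes, and accuracy/F1
# evaluation. Backbone choice, frameworks, and variable names are assumptions.
import torch.nn as nn
from torchvision import models
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, f1_score

NUM_CLASSES = 19  # standard-plane classes reported in the abstract


def stratified_60_20_20(image_paths, labels, seed=42):
    """Split samples 60/20/20 while preserving per-class proportions."""
    train_x, rest_x, train_y, rest_y = train_test_split(
        image_paths, labels, test_size=0.4, stratify=labels, random_state=seed)
    val_x, test_x, val_y, test_y = train_test_split(
        rest_x, rest_y, test_size=0.5, stratify=rest_y, random_state=seed)
    return (train_x, train_y), (val_x, val_y), (test_x, test_y)


def build_transfer_model(num_classes=NUM_CLASSES):
    """ImageNet-pretrained backbone with a new classification head.

    ResNet-50 is an illustrative stand-in; the paper's backbone may differ.
    """
    model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
    for param in model.parameters():      # freeze the pretrained feature extractor
        param.requires_grad = False
    model.fc = nn.Linear(model.fc.in_features, num_classes)  # trainable head
    return model


def evaluate(y_true, y_pred):
    """Accuracy and F1, the metrics quoted in the abstract (macro F1 assumed)."""
    return {
        "accuracy": accuracy_score(y_true, y_pred),
        "f1": f1_score(y_true, y_pred, average="macro"),
    }
```

A second head of the same form, with two output classes instead of 19, could serve the diagnostic usability task described in the abstract.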
format Article
id doaj-art-2ad6bbd0464f40f3bfb7164f4b2e139b
issn 2949-6683
language English
publishDate 2024-12-01
publisher Elsevier
series WFUMB Ultrasound Open
spelling WFUMB Ultrasound Open, Elsevier, 2024-12-01, Volume 2, Issue 2, Article 100050
Adam Lim: Department of Electrical, Computer and Biomedical Engineering, Toronto Metropolitan University, Faculty of Engineering and Architectural Sciences, Toronto, Ontario, Canada; Institute for Biomedical Engineering, Science and Technology (iBEST), Toronto Metropolitan University and St. Michael's Hospital, Toronto, Ontario, Canada
Mohamed Abdalla: Institute for Better Health, Mississauga, Ontario, Canada
Farbod Abolhassani: Institute for Better Health, Mississauga, Ontario, Canada; Dalla Lana School of Public Health, University of Toronto, Toronto, Ontario, Canada
Wyanne Law: Department of Radiology, Memorial Sloan Kettering Cancer Center, New York, NY, United States
Benjamin Fine: Institute for Better Health, Mississauga, Ontario, Canada; Department of Medical Imaging, University of Toronto, Toronto, Ontario, Canada
Dafna Sussman (corresponding author): Department of Electrical, Computer and Biomedical Engineering, Toronto Metropolitan University, Faculty of Engineering and Architectural Sciences, Toronto, Ontario, Canada; Institute for Biomedical Engineering, Science and Technology (iBEST), Toronto Metropolitan University and St. Michael's Hospital, Toronto, Ontario, Canada; Department of Obstetrics and Gynecology, Faculty of Medicine, University of Toronto, Toronto, Ontario, Canada
title Automatic standard plane and diagnostic usability classification in obstetric ultrasounds
topic Deep learning
Obstetric ultrasound
Convolutional neural network
Image classification
url http://www.sciencedirect.com/science/article/pii/S2949668324000181