Deep learning using contrast-enhanced ultrasound images to predict the nuclear grade of clear cell renal cell carcinoma

Purpose: To assess the effectiveness of a deep learning model using contrast-enhanced ultrasound (CEUS) images in distinguishing between low-grade (grade I and II) and high-grade (grade III and IV) clear cell renal cell carcinoma (ccRCC).

Methods: A retrospective study was conducted using CEUS images of 177 Fuhrman-graded ccRCCs (93 low-grade and 84 high-grade) collected from May 2017 to December 2020. A total of 6412 CEUS images were captured from the videos and normalized for subsequent analysis. A deep learning model based on the RepVGG architecture was proposed to differentiate between low-grade and high-grade ccRCC. The model's performance was evaluated in terms of sensitivity, specificity, positive predictive value, negative predictive value, and area under the receiver operating characteristic curve (AUC). Class activation mapping (CAM) was used to visualize the image regions that contribute to the model's predictions.

Results: In discriminating high-grade from low-grade ccRCC, the deep learning model achieved a sensitivity of 74.8%, specificity of 79.1%, accuracy of 77.0%, and an AUC of 0.852 on the test set.

Conclusion: The deep learning model based on CEUS images can accurately differentiate between low-grade and high-grade ccRCC in a non-invasive manner.
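The evaluation metrics reported above (sensitivity, specificity, accuracy, AUC) can all be derived from per-case model scores and ground-truth grades. The following is a minimal, stdlib-only sketch of how such metrics are computed for a binary classifier; the labels and scores are hypothetical illustration data, not the study's results.

```python
def binary_metrics(y_true, y_score, threshold=0.5):
    """Compute sensitivity, specificity, accuracy, and AUC for a
    binary classifier. y_true: 1 = high-grade, 0 = low-grade;
    y_score: model-predicted probability of high-grade."""
    y_pred = [1 if s >= threshold else 0 for s in y_score]
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    sensitivity = tp / (tp + fn)        # true-positive rate
    specificity = tn / (tn + fp)        # true-negative rate
    accuracy = (tp + tn) / len(y_true)
    # AUC via the Mann-Whitney U statistic: the probability that a
    # randomly chosen positive case is scored above a randomly chosen
    # negative case (ties count as 0.5).
    pos = [s for t, s in zip(y_true, y_score) if t == 1]
    neg = [s for t, s in zip(y_true, y_score) if t == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    auc = wins / (len(pos) * len(neg))
    return sensitivity, specificity, accuracy, auc

# Hypothetical example: 4 high-grade and 4 low-grade test cases
sens, spec, acc, auc = binary_metrics(
    [1, 1, 1, 1, 0, 0, 0, 0],
    [0.9, 0.8, 0.6, 0.4, 0.7, 0.3, 0.2, 0.1])
```

Note that sensitivity, specificity, and accuracy depend on the chosen decision threshold, while the AUC summarizes ranking performance across all thresholds, which is why abstracts typically report it alongside the thresholded metrics.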

Media type:

E-article

Year of publication:

2024

Published:

2024

Contained in:

World journal of urology - 42 (2024), issue 1, dated 21 March

Language:

English

Contributors:

Bai, Yun [Author]
An, Zi-Chen [Author]
Li, Fan [Author]
Du, Lian-Fang [Author]
Xie, Tian-Wu [Author]
Zhang, Xi-Peng [Author]
Cai, Ying-Yu [Author]

Links:

Full text [license required]

BKL:

44.88

Subjects:

Artificial intelligence
Classification
Contrast-enhanced ultrasound
Deep learning
Kidney neoplasms
Nuclear grade

Notes:

© The Author(s), under exclusive licence to Springer-Verlag GmbH Germany, part of Springer Nature 2024. Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

doi:

10.1007/s00345-024-04889-3

Funding:

Funding institution / project title:

PPN (catalogue ID):

SPR055235425