Deep learning-based segmentation of multisite disease in ovarian cancer
Purpose: To determine whether pelvic/ovarian and omental lesions of ovarian cancer can be reliably segmented on computed tomography (CT) using fully automated deep learning-based methods.

Methods: A deep learning model for the two most common disease sites of high-grade serous ovarian cancer (pelvis/ovaries and omentum) was developed and compared against the well-established "no-new-Net" framework and against unrevised trainee radiologist segmentations. A total of 451 CT scans collected from four different institutions were used for training (n = 276), evaluation (n = 104), and testing (n = 71) of the methods. Performance was evaluated using the Dice similarity coefficient (DSC) and compared using a Wilcoxon test.

Results: Our model significantly outperformed no-new-Net for pelvic/ovarian lesions in cross-validation, on the evaluation set, and on the test set (p = 4 × $10^{-7}$, 3 × $10^{-4}$, and 4 × $10^{-2}$, respectively), and for omental lesions on the evaluation set (p = 1 × $10^{-3}$). For pelvic/ovarian lesions, the model did not differ significantly from a trainee radiologist (p = 0.371). On an independent test set, the model achieved a DSC of 71 ± 20 (mean ± standard deviation) for pelvic/ovarian lesions and 61 ± 24 for omental lesions.

Conclusion: Automated ovarian cancer segmentation on CT scans using deep neural networks is feasible and achieves performance close to that of a trainee-level radiologist for pelvic/ovarian lesions.

Relevance statement: Automated segmentation of ovarian cancer may be used by clinicians for CT-based volumetric assessments and by researchers for building complex analysis pipelines.

Key points
• The first automated approach for pelvic/ovarian and omental ovarian cancer lesion segmentation on CT images has been presented.
• Automated segmentation of ovarian cancer lesions can be comparable to manual segmentation by trainee radiologists.
• Careful hyperparameter tuning can yield models that significantly outperform strong state-of-the-art baselines.
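The reported 71 ± 20 and 61 ± 24 values are Dice similarity coefficients on a 0–100 scale. As an illustration only (not the authors' implementation), a minimal sketch of the per-scan DSC computation on binary segmentation masks:

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice similarity coefficient between two binary masks, on a 0-100 scale."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    denom = pred.sum() + truth.sum()
    if denom == 0:
        return 100.0  # both masks empty: treat as perfect agreement
    return 100.0 * 2.0 * intersection / denom

# Toy example: two overlapping 2-D "lesion" masks
pred = np.zeros((4, 4), dtype=bool)
truth = np.zeros((4, 4), dtype=bool)
pred[1:3, 1:3] = True    # 4 voxels predicted
truth[1:3, 1:4] = True   # 6 voxels in the reference annotation
print(dice_coefficient(pred, truth))  # 2*4/(4+6)*100 -> 80.0
```

In practice the masks would be 3-D volumes loaded from the CT annotations, with one DSC computed per disease site per scan before averaging across the test set.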
Media type: E-Article
Year of publication: 2023
Published: 2023
Contained in: See the serial record - volume:7
Contained in: European radiology experimental - 7(2023), 1, dated 07 Dec.
Language: English
Contributors: Buddenkotte, Thomas [Author]
Links: Full text [open access]
Subjects: Deep learning
Notes: © The Author(s) 2023
DOI: 10.1186/s41747-023-00388-z
PPN (catalogue ID): SPR054005388
LEADER 01000naa a22002652 4500
001    SPR054005388
003    DE-627
005    20231207064647.0
007    cr uuu---uuuuu
008    231207s2023    xx |||||o 00| ||eng c
024 7  |a 10.1186/s41747-023-00388-z |2 doi
035    |a (DE-627)SPR054005388
035    |a (SPR)s41747-023-00388-z-e
040    |a DE-627 |b ger |c DE-627 |e rakwb
041    |a eng
100 1  |a Buddenkotte, Thomas |e verfasserin |4 aut
245 10 |a Deep learning-based segmentation of multisite disease in ovarian cancer
264  1 |c 2023
336    |a Text |b txt |2 rdacontent
337    |a Computermedien |b c |2 rdamedia
338    |a Online-Ressource |b cr |2 rdacarrier
500    |a © The Author(s) 2023
520    |a Purpose To determine if pelvic/ovarian and omental lesions of ovarian cancer can be reliably segmented on computed tomography (CT) using fully automated deep learning-based methods. Methods A deep learning model for the two most common disease sites of high-grade serous ovarian cancer lesions (pelvis/ovaries and omentum) was developed and compared against the well-established "no-new-Net" framework and unrevised trainee radiologist segmentations. A total of 451 CT scans collected from four different institutions were used for training (n = 276), evaluation (n = 104) and testing (n = 71) of the methods. The performance was evaluated using the Dice similarity coefficient (DSC) and compared using a Wilcoxon test. Results Our model outperformed no-new-Net for the pelvic/ovarian lesions in cross-validation, on the evaluation and test set by a significant margin (p values being 4 × $10^{-7}$, 3 × $10^{-4}$, 4 × $10^{-2}$, respectively), and for the omental lesions on the evaluation set (p = 1 × $10^{-3}$). Our model did not perform significantly differently in segmenting pelvic/ovarian lesions (p = 0.371) compared to a trainee radiologist. On an independent test set, the model achieved a DSC performance of 71 ± 20 (mean ± standard deviation) for pelvic/ovarian and 61 ± 24 for omental lesions. Conclusion Automated ovarian cancer segmentation on CT scans using deep neural networks is feasible and achieves performance close to a trainee-level radiologist for pelvic/ovarian lesions. Relevance statement Automated segmentation of ovarian cancer may be used by clinicians for CT-based volumetric assessments and researchers for building complex analysis pipelines. Key points • The first automated approach for pelvic/ovarian and omental ovarian cancer lesion segmentation on CT images has been presented. • Automated segmentation of ovarian cancer lesions can be comparable with manual segmentation of trainee radiologists. • Careful hyperparameter tuning can provide models significantly outperforming strong state-of-the-art baselines.
650  4 |a Deep learning |7 (dpeaa)DE-He213
650  4 |a Omentum |7 (dpeaa)DE-He213
650  4 |a Ovarian Neoplasms |7 (dpeaa)DE-He213
650  4 |a Tomography (x-ray computed) |7 (dpeaa)DE-He213
650  4 |a Pelvis |7 (dpeaa)DE-He213
700 1  |a Rundo, Leonardo |4 aut
700 1  |a Woitek, Ramona |4 aut
700 1  |a Escudero Sanchez, Lorena |4 aut
700 1  |a Beer, Lucian |4 aut
700 1  |a Crispin-Ortuzar, Mireia |4 aut
700 1  |a Etmann, Christian |4 aut
700 1  |a Mukherjee, Subhadip |4 aut
700 1  |a Bura, Vlad |4 aut
700 1  |a McCague, Cathal |4 aut
700 1  |a Sahin, Hilal |4 aut
700 1  |a Pintican, Roxana |4 aut
700 1  |a Zerunian, Marta |4 aut
700 1  |a Allajbeu, Iris |4 aut
700 1  |a Singh, Naveena |4 aut
700 1  |a Sahdev, Anju |4 aut
700 1  |a Havrilesky, Laura |4 aut
700 1  |a Cohn, David E. |4 aut
700 1  |a Bateman, Nicholas W. |4 aut
700 1  |a Conrads, Thomas P. |4 aut
700 1  |a Darcy, Kathleen M. |4 aut
700 1  |a Maxwell, G. Larry |4 aut
700 1  |a Freymann, John B. |4 aut
700 1  |a Öktem, Ozan |4 aut
700 1  |a Brenton, James D. |4 aut
700 1  |a Sala, Evis |4 aut
700 1  |a Schönlieb, Carola-Bibiane |4 aut
773 08 |i Enthalten in |t European radiology experimental |d [Cham] : Springer International Publishing, 2017 |g 7(2023), 1 vom: 07. Dez. |w (DE-627)SPR038294699 |w (DE-600)2905812-0 |x 2509-9280 |7 nnns
773 18 |g volume:7 |g year:2023 |g number:1 |g day:07 |g month:12
856 40 |u https://dx.doi.org/10.1186/s41747-023-00388-z |z kostenfrei |3 Volltext
912    |a GBV_USEFLAG_A
912    |a GBV_SPRINGER
951    |a AR
952    |d 7 |j 2023 |e 1 |b 07 |c 12