Exploiting the Dixon Method for a Robust Breast and Fibro-Glandular Tissue Segmentation in Breast MRI

Automatic breast and fibro-glandular tissue (FGT) segmentation in breast MRI allows for efficient and accurate calculation of breast density. The U-Net architecture, either 2D or 3D, has already been shown to be effective at addressing the segmentation problem in breast MRI. However, the lack of publicly available datasets for this task has forced several authors to rely on internal datasets composed of either acquisitions without fat suppression (WOFS) or with fat suppression (FS), limiting the generalization of the approach. To solve this problem, we propose a data-centric approach that makes efficient use of the available data. By collecting a dataset of T1-weighted breast MRI acquisitions obtained with the Dixon method, we train a network on both T1 WOFS and FS acquisitions while utilizing the same ground-truth segmentation. Using the "plug-and-play" framework nnUNet, we achieve, on our internal test set, a Dice Similarity Coefficient (DSC) of 0.96 and 0.91 for WOFS breast and FGT segmentation, and 0.95 and 0.86 for FS breast and FGT segmentation, respectively. On an external, publicly available dataset, a panel of breast radiologists rated the quality of our automatic segmentation at an average of 3.73 on a four-point scale, with an average percentage agreement of 67.5%.
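The Dice Similarity Coefficient reported above is the standard overlap metric for comparing a predicted segmentation mask with the ground truth. As a minimal illustrative sketch (not the authors' evaluation code), it can be computed over binary masks with NumPy; the toy arrays below are stand-ins for real breast/FGT masks:

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, target: np.ndarray, eps: float = 1e-8) -> float:
    """Dice Similarity Coefficient (DSC) between two binary masks.

    DSC = 2 * |pred ∩ target| / (|pred| + |target|); eps avoids
    division by zero when both masks are empty.
    """
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return float(2.0 * intersection / (pred.sum() + target.sum() + eps))

# Toy 2D masks (illustrative only, not data from the paper)
pred = np.array([[1, 1, 0],
                 [0, 1, 0]])
target = np.array([[1, 1, 0],
                   [0, 0, 0]])
print(round(dice_coefficient(pred, target), 2))  # → 0.8
```

A DSC of 1.0 indicates perfect overlap and 0.0 no overlap, so the reported values of 0.96 (WOFS breast) down to 0.86 (FS FGT) indicate high agreement with the manual ground truth.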

Media type:

E-article

Year of publication:

2022

Published:

2022

Contained in:

Diagnostics (Basel, Switzerland) - 12(2022), 7, dated 11 July

Language:

English

Contributors:

Samperna, Riccardo [Author]
Moriakov, Nikita [Author]
Karssemeijer, Nico [Author]
Teuwen, Jonas [Author]
Mann, Ritse M [Author]

Links:

Full text

Subjects:

Breast
Data-centric AI
Deep learning
Journal Article
MRI
Segmentation

Notes:

Date Revised 31.07.2022

published: Electronic

Citation Status PubMed-not-MEDLINE

doi:

10.3390/diagnostics12071690

funding:

Funding institution / project title:

PPN (catalog ID):

NLM344053857