Brain tumor image segmentation based on improved Polyp-PVT

Abstract Deep learning models have shown powerful feature extraction capabilities in brain tumor segmentation tasks, using medical image information to help doctors diagnose disease. However, because brain tumors vary widely in shape and brain imaging itself is highly variable, the segmentation performance of such models still leaves room for improvement. The Polyp-PVT network is adopted as the base model because of its fast running speed and high segmentation accuracy. The attentional selective fusion (ASF) module is introduced to replace the CFM module, and a channel shuffle module is inserted before the ASF module; the resulting network is called PVT-ASF-CS. These changes improve the network's feature fusion capability and segmentation performance, while the channel shuffle mixes information between channels without increasing computation or parameter count. Ablation experiments on the PVT-ASF-CS network were conducted on the BraTS2019 dataset and compared with other models, using the Dice coefficient and Hausdorff distance (HD) as evaluation metrics. The experimental results show that the proposed PVT-ASF-CS model extracts richer and more detailed features and achieves better segmentation of brain tumors.
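
The abstract's claim that channel shuffle mixes information between channels without adding parameters or computation refers to a reshape-and-transpose operation of the kind introduced in ShuffleNet. The PyTorch sketch below illustrates that operation only; the group count, module name, and its exact placement before the ASF module are assumptions, since this record does not give implementation details of PVT-ASF-CS.

```python
import torch
import torch.nn as nn


class ChannelShuffle(nn.Module):
    """ShuffleNet-style channel shuffle: mixes information across channel
    groups using only reshape/transpose ops, so it adds no parameters and
    essentially no extra computation."""

    def __init__(self, groups: int = 4):  # group count is an assumption
        super().__init__()
        self.groups = groups

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        assert c % self.groups == 0, "channel count must be divisible by groups"
        # (B, C, H, W) -> (B, G, C//G, H, W) -> swap group/channel axes -> flatten back
        x = x.view(b, self.groups, c // self.groups, h, w)
        x = x.transpose(1, 2).contiguous()
        return x.view(b, c, h, w)


if __name__ == "__main__":
    # Hypothetical usage: shuffle a feature map before handing it to a fusion module.
    feats = torch.randn(2, 64, 32, 32)
    shuffled = ChannelShuffle(groups=4)(feats)
    print(shuffled.shape)  # torch.Size([2, 64, 32, 32])
```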

Media type:

Article

Year of publication:

2023

Published:

2023

Contained in:

See complete record - volume: 17

Contained in:

Signal, Image and Video Processing - 17(2023), issue 8, 1 June, pages 4019-4027

Language:

English

Contributors:

Lan, Chaofeng [Author]
Yu, Xinyu [Author]
Zhang, Lei [Author]
Xia, Yan [Author]
Mao, Xiuhuan [Author]
Zhang, Meng [Author]

Links:

Full text [license required]

BKL:

54.74 Computer vision (Maschinelles Sehen)

53.73 Communications engineering (Nachrichtenübertragung)

Subjects:

ASF
Brain tumors segmentation
Channel shuffle
Polyp-PVT

Notes:

© The Author(s), under exclusive licence to Springer-Verlag London Ltd., part of Springer Nature 2023. Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

doi:

10.1007/s11760-023-02632-w

Funding:

Funding institution / project title:

PPN (catalog ID):

OLC2145518266