A robust and lightweight deep attention multiple instance learning algorithm for predicting genetic alterations
Copyright © 2023 Elsevier Ltd. All rights reserved.
Self-attention-based algorithms are attractive in digital pathology because of their interpretability, but they suffer from high computational complexity. This paper presents a novel, lightweight Attention-based Multiple Instance Mutation Learning (AMIML) model that enables small-scale attention operations for predicting gene mutations. Compared with a standard self-attention model, AMIML reduces the number of model parameters by approximately 70%. Using data for 24 clinically relevant genes from four cancer cohorts in the TCGA studies (UCEC, BRCA, GBM, and KIRC), we compare AMIML with a standard self-attention model, five other deep learning models, and four traditional machine learning models. The results show that AMIML is highly robust and outperforms all baseline algorithms for the vast majority of the tested genes. In contrast, the performance of the reference deep learning and machine learning models varies across genes and is suboptimal for certain genes. Furthermore, with its flexible and interpretable attention-based pooling mechanism, AMIML can zero in on and detect predictive image patches.
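This record does not spell out the AMIML architecture, but the attention-based pooling the abstract describes follows the general multiple-instance-learning pattern: each whole-slide image is treated as a bag of patch embeddings, and a small learned attention head scores every patch so the bag-level representation is an attention-weighted average of patches. A minimal NumPy sketch of such a pooling step, with all names, shapes, and parameters hypothetical:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_mil_pool(H, V, w):
    """Attention-based MIL pooling (illustrative sketch, not AMIML itself).

    H: (n_patches, d) patch embeddings for one whole-slide image (the "bag")
    V: (d, L) learned projection of the attention head
    w: (L,)  learned attention vector

    Returns the bag embedding (d,) and per-patch attention weights (n_patches,).
    """
    scores = np.tanh(H @ V) @ w   # one scalar relevance score per patch
    a = softmax(scores)           # weights are non-negative and sum to 1
    z = a @ H                     # bag embedding = attention-weighted average
    return z, a
```

Because the attention head adds only the small `V` and `w` (d*L + L parameters) on top of the patch features, and the returned weights `a` directly rank patches by relevance, a pooling mechanism of this kind is both far lighter than full pairwise self-attention over all patches and interpretable enough to "zero in" on predictive image patches, which is plausibly the kind of saving behind the ~70% parameter reduction reported in the abstract.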
| Field | Value |
|---|---|
| Media type | E-Article |
| Year of publication | 2023 |
| Published | 2023 |
| Contained in | Link to complete record - volume:105 |
| Contained in | Computerized medical imaging and graphics : the official journal of the Computerized Medical Imaging Society - 105(2023), 05 Apr., page 102189 |
| Language | English |
| Contributors | Guo, Bangwei [author] |
| Topics | Attention mechanism |
| Notes | Date Completed 06.03.2023; Date Revised 07.04.2023; published: Print-Electronic; Citation Status MEDLINE |
| DOI | 10.1016/j.compmedimag.2023.102189 |
| PPN (catalog ID) | NLM352497610 |
LEADER 01000naa a22002652 4500
001    NLM352497610
003    DE-627
005    20231226053847.0
007    cr uuu---uuuuu
008    231226s2023 xx |||||o 00| ||eng c
024 7  |a 10.1016/j.compmedimag.2023.102189 |2 doi
028 52 |a pubmed24n1174.xml
035    |a (DE-627)NLM352497610
035    |a (NLM)36739752
035    |a (PII)S0895-6111(23)00007-1
040    |a DE-627 |b ger |c DE-627 |e rakwb
041    |a eng
100 1  |a Guo, Bangwei |e verfasserin |4 aut
245 12 |a A robust and lightweight deep attention multiple instance learning algorithm for predicting genetic alterations
264  1 |c 2023
336    |a Text |b txt |2 rdacontent
337    |a Computermedien |b c |2 rdamedia
338    |a Online-Ressource |b cr |2 rdacarrier
500    |a Date Completed 06.03.2023
500    |a Date Revised 07.04.2023
500    |a published: Print-Electronic
500    |a Citation Status MEDLINE
520    |a Copyright © 2023 Elsevier Ltd. All rights reserved.
520    |a Self-attention mechanism-based algorithms are attractive in digital pathology due to their interpretability, but suffer from computation complexity. This paper presents a novel, lightweight Attention-based Multiple Instance Mutation Learning (AMIML) model to allow small-scale attention operations for predicting gene mutations. Compared to the standard self-attention model, AMIML reduces the number of model parameters by approximately 70%. Using data for 24 clinically relevant genes from four cancer cohorts in TCGA studies (UCEC, BRCA, GBM, and KIRC), we compare AMIML with a standard self-attention model, five other deep learning models, and four traditional machine learning models. The results show that AMIML has excellent robustness and outperforms all the baseline algorithms in the vast majority of the tested genes. Conversely, the performance of the reference deep learning and machine learning models vary across different genes, and produce suboptimal prediction for certain genes. Furthermore, with the flexible and interpretable attention-based pooling mechanism, AMIML can further zero in and detect predictive image patches
650  4 |a Journal Article
650  4 |a Research Support, Non-U.S. Gov't
650  4 |a Attention mechanism
650  4 |a Deep learning
650  4 |a Gene Mutation
650  4 |a Multiple Instance Learning
650  4 |a Whole slide images
700 1  |a Li, Xingyu |e verfasserin |4 aut
700 1  |a Yang, Miaomiao |e verfasserin |4 aut
700 1  |a Zhang, Hong |e verfasserin |4 aut
700 1  |a Xu, Xu Steven |e verfasserin |4 aut
773 08 |i Enthalten in |t Computerized medical imaging and graphics : the official journal of the Computerized Medical Imaging Society |d 1996 |g 105(2023) vom: 05. Apr., Seite 102189 |w (DE-627)NLM012943649 |x 1879-0771 |7 nnns
773 18 |g volume:105 |g year:2023 |g day:05 |g month:04 |g pages:102189
856 40 |u http://dx.doi.org/10.1016/j.compmedimag.2023.102189 |3 Volltext
912    |a GBV_USEFLAG_A
912    |a GBV_NLM
951    |a AR
952    |d 105 |j 2023 |b 05 |c 04 |h 102189