Big in Japan: Regularizing Networks for Solving Inverse Problems
© The Author(s) 2019.
Deep learning and (deep) neural networks are emerging tools for addressing inverse problems and image reconstruction tasks. Despite their outstanding performance, the mathematical analysis of solving inverse problems with neural networks is mostly missing. In this paper, we introduce and rigorously analyze families of deep regularizing neural networks (RegNets) of the form B_α + N_{θ(α)} ∘ B_α, where B_α is a classical regularization and the network N_{θ(α)} ∘ B_α is trained to recover the missing part Id_X − B_α not found by the classical regularization. We show that these regularizing networks yield a convergent regularization method for solving inverse problems. Additionally, we derive convergence rates (quantitative error estimates) assuming a sufficient decay of the associated distance function. We demonstrate that our results recover existing convergence and convergence-rate results for filter-based regularization methods, as well as the recently introduced null space network, as special cases. Numerical results are presented for a tomographic sparse-data problem and clearly demonstrate that the proposed RegNets improve on classical regularization as well as on the null space network.
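The RegNet form B_α + N_{θ(α)} ∘ B_α described in the abstract can be sketched in a few lines. In this sketch, a Tikhonov spectral filter stands in for the classical regularization B_α, and an untrained linear map stands in for the trained network N_{θ(α)}; both stand-ins are illustrative assumptions, not the paper's actual architecture or training procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(20, 30))       # underdetermined forward operator
x_true = rng.normal(size=30)
y = A @ x_true                      # noise-free data, for simplicity

def tikhonov(A, y, alpha):
    """Classical regularization B_alpha: Tikhonov-filtered pseudoinverse."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    filt = s / (s**2 + alpha)       # Tikhonov spectral filter
    return Vt.T @ (filt * (U.T @ y))

alpha = 1e-2
x_classical = tikhonov(A, y, alpha)             # B_alpha y

# Stand-in "network": any map acting on the classical reconstruction.
# In the paper, N_{theta(alpha)} is trained to recover the part of the
# signal that B_alpha misses; here it is just an untrained linear layer.
W = 0.01 * rng.normal(size=(30, 30))

def N_theta(z):
    return W @ z

# RegNet reconstruction: (B_alpha + N_{theta(alpha)} . B_alpha) y
x_regnet = x_classical + N_theta(x_classical)
```

With W = 0 the sketch reduces exactly to the classical reconstruction B_α y, which mirrors how the RegNet family contains classical filter-based regularization as a special case.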
Media type: E-Article
Year of publication: 2020
Published: 2020
Contained in: Complete record - volume:62
Contained in: Journal of mathematical imaging and vision - 62(2020), 3, pages 445-455
Language: English
Contributors: Schwab, Johannes [Author]
Topics: Convergence analysis
Notes: Date Revised 08.11.2023; published: Print-Electronic; Citation Status: PubMed-not-MEDLINE
DOI: 10.1007/s10851-019-00911-1
PPN (catalog ID): NLM30893024X
LEADER 01000naa a22002652 4500
001 NLM30893024X
003 DE-627
005 20231225132811.0
007 cr uuu---uuuuu
008 231225s2020 xx |||||o 00| ||eng c
024 7_ |a 10.1007/s10851-019-00911-1 |2 doi
028 52 |a pubmed24n1029.xml
035 __ |a (DE-627)NLM30893024X
035 __ |a (NLM)32308256
040 __ |a DE-627 |b ger |c DE-627 |e rakwb
041 __ |a eng
100 1_ |a Schwab, Johannes |e verfasserin |4 aut
245 10 |a Big in Japan |b Regularizing Networks for Solving Inverse Problems
264 _1 |c 2020
336 __ |a Text |b txt |2 rdacontent
337 __ |a Computermedien |b c |2 rdamedia
338 __ |a Online-Ressource |b cr |2 rdacarrier
500 __ |a Date Revised 08.11.2023
500 __ |a published: Print-Electronic
500 __ |a Citation Status PubMed-not-MEDLINE
520 __ |a © The Author(s) 2019.
520 __ |a Deep learning and (deep) neural networks are emerging tools for addressing inverse problems and image reconstruction tasks. Despite their outstanding performance, the mathematical analysis of solving inverse problems with neural networks is mostly missing. In this paper, we introduce and rigorously analyze families of deep regularizing neural networks (RegNets) of the form B_α + N_{θ(α)} ∘ B_α, where B_α is a classical regularization and the network N_{θ(α)} ∘ B_α is trained to recover the missing part Id_X − B_α not found by the classical regularization. We show that these regularizing networks yield a convergent regularization method for solving inverse problems. Additionally, we derive convergence rates (quantitative error estimates) assuming a sufficient decay of the associated distance function. We demonstrate that our results recover existing convergence and convergence-rate results for filter-based regularization methods, as well as the recently introduced null space network, as special cases. Numerical results are presented for a tomographic sparse-data problem and clearly demonstrate that the proposed RegNets improve on classical regularization as well as on the null space network
650 _4 |a Journal Article
650 _4 |a Convergence analysis
650 _4 |a Convergence rates
650 _4 |a Convolutional neural networks
650 _4 |a Inverse problems
650 _4 |a Null space networks
700 1_ |a Antholzer, Stephan |e verfasserin |4 aut
700 1_ |a Haltmeier, Markus |e verfasserin |4 aut
773 08 |i Enthalten in |t Journal of mathematical imaging and vision |d 2006 |g 62(2020), 3 vom: 02., Seite 445-455 |w (DE-627)NLM191877727 |x 0924-9907 |7 nnns
773 18 |g volume:62 |g year:2020 |g number:3 |g day:02 |g pages:445-455
856 40 |u http://dx.doi.org/10.1007/s10851-019-00911-1 |3 Volltext
912 __ |a GBV_USEFLAG_A
912 __ |a GBV_NLM
951 __ |a AR
952 __ |d 62 |j 2020 |e 3 |b 02 |h 445-455