Residential College | false
Status | Published
Title | Deep Parameterized Neural Networks for Hyperspectral Image Denoising
Author | Xiong, Fengchao1,2; Zhou, Jun3; Zhou, Jiantao1,2; Lu, Jianfeng1; Qian, Yuntao4
Date Issued | 2023-09
Source Publication | IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING |
ISSN | 0196-2892 |
Volume | 61
Pages | 5525015
Abstract | Sparse representation (SR)-based hyperspectral image (HSI) denoising methods normally average the local denoising results of multiple overlapped cubes to recover the whole HSI. Though interpretable, they rely on cumbersome hyperparameter settings and ignore the relationship between overlapped cubes, leading to poor denoising performance. This article combines SR and convolutional neural networks and introduces a deep parameterized sparse neural network (DPNet-S) to address the above issues. DPNet-S parameterizes the SR-based HSI denoising model with two modules: 1) sparse optimizer to extract sparse feature maps from noisy HSIs via recurrent usage of convolution, deconvolution, and soft shrinkage operations; and 2) image reconstructor to recover the denoised HSI from its sparse feature maps via deconvolution operations. We further replace the soft shrinkage operator with U-Net architecture to account for general HSI priors and more effectively capture the complex structures of HSIs, resulting in DPNet-U. Both networks directly learn the parameters from data and perform denoising on the whole HSI, which overcomes the limitations of SR-based methods. Moreover, our networks are generated from the denoising model and optimization procedures, thus leveraging the knowledge embedded and relying less on the number of training samples. Extensive experiments on both synthetic and real-world HSIs show that our DPNet-S and DPNet-U achieve remarkable results when compared with state-of-the-art methods. The codes will be publicly available at https://github.com/bearshng/dpnets for reproducible research. |
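Note: the "soft shrinkage" step mentioned in the abstract is the standard element-wise soft-thresholding operator used in sparse-coding iterations, soft(x, theta) = sign(x) * max(|x| - theta, 0). The short NumPy sketch below is illustrative only and is not the authors' released implementation (their code is at the GitHub link in the abstract).

    import numpy as np

    def soft_shrinkage(x, theta):
        # Element-wise soft thresholding: sign(x) * max(|x| - theta, 0).
        # Small-magnitude entries are zeroed, which promotes sparse feature maps.
        return np.sign(x) * np.maximum(np.abs(x) - theta, 0.0)

    # Example: shrinking a noisy coefficient vector toward sparsity.
    coeffs = np.array([0.9, -0.05, 0.3, -1.2])
    print(soft_shrinkage(coeffs, theta=0.1))  # -> [ 0.8 -0.   0.2 -1.1]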
Keyword | Convolutional neural networks; Hyperspectral image (HSI) denoising; Learning to optimize (L2O); Sparse representation (SR)
DOI | 10.1109/TGRS.2023.3318001 |
URL | https://doi.org/10.1109/TGRS.2023.3318001
Indexed By | SCIE |
Language | English
WOS Research Area | Geochemistry & Geophysics ; Engineering ; Remote Sensing ; Imaging Science & Photographic Technology |
WOS Subject | Geochemistry & Geophysics ; Engineering, Electrical & Electronic ; Remote Sensing ; Imaging Science & Photographic Technology |
WOS ID | WOS:001119655900030 |
Publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 445 HOES LANE, PISCATAWAY, NJ 08855-4141 |
Scopus ID | 2-s2.0-85174502348 |
Document Type | Journal article |
Collection | Faculty of Science and Technology |
Corresponding Author | Zhou, Jiantao |
Affiliation | 1. Nanjing University of Science and Technology, School of Computer Science and Engineering, Nanjing, 210094, China
2. University of Macau, State Key Laboratory of Internet of Things for Smart City, Department of Computer and Information Science, Macao
3. Griffith University, School of Information and Communication Technology, Nathan, 4111, Australia
4. Zhejiang University, College of Computer Science, Hangzhou, 310027, China
First Author Affiliation | University of Macau
Corresponding Author Affiliation | University of Macau
Recommended Citation GB/T 7714 | Xiong, Fengchao,Zhou, Jun,Zhou, Jiantao,et al. Deep Parameterized Neural Networks for Hyperspectral Image Denoising[J]. IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2023, 61, 5525015. |
APA | Xiong, Fengchao., Zhou, Jun., Zhou, Jiantao., Lu, Jianfeng., & Qian, Yuntao (2023). Deep Parameterized Neural Networks for Hyperspectral Image Denoising. IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 61, 5525015. |
MLA | Xiong, Fengchao, et al. "Deep Parameterized Neural Networks for Hyperspectral Image Denoising." IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING 61 (2023): 5525015.
Files in This Item: | There are no files associated with this item. |