Residential College | false
Status | Published
Title | Dual-Hybrid Attention Network for Specular Highlight Removal
Authors | Guo, Xiaojiao (1,4); Chen, Xuhang (2,5,6); Luo, Shenghong (1); Wang, Shuqiang (3); Pun, Chi Man (1)
Date Issued | 2024-11
Conference Name | 32nd ACM International Conference on Multimedia, MM 2024 |
Source Publication | MM 2024 - Proceedings of the 32nd ACM International Conference on Multimedia |
Pages | 10173-10181 |
Conference Date | 28 October 2024 - 1 November 2024 |
Conference Place | Melbourne |
Country | Australia |
Publication Place | New York, NY, USA |
Publisher | Association for Computing Machinery, Inc |
Abstract | Specular highlight removal plays a pivotal role in multimedia applications, as it enhances the quality and interpretability of images and videos, ultimately improving the performance of downstream tasks such as content-based retrieval, object recognition, and scene understanding. Despite significant advances in deep learning-based methods, current state-of-the-art approaches often rely on additional priors or supervision, limiting their practicality and generalization capability. In this paper, we propose the Dual-Hybrid Attention Network for Specular Highlight Removal (DHAN-SHR), an end-to-end network that introduces novel hybrid attention mechanisms to effectively capture and process information across different scales and domains without relying on additional priors or supervision. DHAN-SHR consists of two key components: the Adaptive Local Hybrid-Domain Dual Attention Transformer (L-HD-DAT) and the Adaptive Global Dual Attention Transformer (G-DAT). The L-HD-DAT captures local inter-channel and inter-pixel dependencies while incorporating spectral domain features, enabling the network to effectively model the complex interactions between specular highlights and the underlying surface properties. The G-DAT models global inter-channel relationships and long-distance pixel dependencies, allowing the network to propagate contextual information across the entire image and generate more coherent and consistent highlight-free results. To evaluate the performance of DHAN-SHR and facilitate future research in this area, we compile a large-scale benchmark dataset comprising a diverse range of images with varying levels of specular highlights. Through extensive experiments, we demonstrate that DHAN-SHR outperforms 18 state-of-the-art methods both quantitatively and qualitatively, setting a new standard for specular highlight removal in multimedia applications. The code and dataset are available at https://github.com/CXH-Research/DHAN-SHR. |
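The dual attention the abstract describes combines inter-channel dependencies with inter-pixel (spatial) dependencies. As a rough illustration of that general idea only, the following is a minimal numpy sketch of a generic dual-attention block; the function names and the single-branch design are hypothetical and are not taken from the authors' code, which additionally incorporates spectral-domain features and adaptive local/global windowing not shown here.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def channel_attention(feat):
    # feat: (C, H, W); global average pool gives one score per channel,
    # which is turned into a gate over channels (inter-channel dependencies)
    pooled = feat.mean(axis=(1, 2))              # (C,)
    gate = softmax(pooled)                       # attention weights over channels
    return feat * gate[:, None, None]

def pixel_attention(feat):
    # feat: (C, H, W); plain self-attention over the H*W pixel positions
    # (inter-pixel / long-distance dependencies)
    C, H, W = feat.shape
    tokens = feat.reshape(C, H * W).T            # (HW, C) pixel tokens
    scores = tokens @ tokens.T / np.sqrt(C)      # (HW, HW) similarity
    attn = softmax(scores, axis=-1)
    out = (attn @ tokens).T.reshape(C, H, W)
    return out

def dual_attention(feat):
    # fuse the two branches; a sum is one simple fusion choice
    return channel_attention(feat) + pixel_attention(feat)

x = np.random.default_rng(0).standard_normal((8, 4, 4))
y = dual_attention(x)
print(y.shape)  # (8, 4, 4)
```

The output keeps the input's shape, so such a block can be dropped between convolutional stages; in the paper the local and global variants of this idea (L-HD-DAT and G-DAT) differ in the scope over which the pixel branch attends.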
Keyword | Dual-Hybrid Attention; Spatial and Spectral; Specular Highlight Removal
DOI | 10.1145/3664647.3680745 |
Language | English
Scopus ID | 2-s2.0-85209802423 |
Document Type | Conference paper |
Collection | Faculty of Science and Technology > Department of Computer and Information Science
Corresponding Author | Wang, Shuqiang; Pun, Chi Man |
Affiliation | 1. University of Macau, Macao; 2. Huizhou University, Huizhou, China; 3. Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China; 4. Baoshan University, China; 5. University of Macau, Macao; 6. Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, China
First Author Affiliation | University of Macau
Corresponding Author Affiliation | University of Macau
Recommended Citation GB/T 7714 | Guo, Xiaojiao, Chen, Xuhang, Luo, Shenghong, et al. Dual-Hybrid Attention Network for Specular Highlight Removal[C]//MM 2024 - Proceedings of the 32nd ACM International Conference on Multimedia. New York, NY, USA: Association for Computing Machinery, 2024: 10173-10181.
APA | Guo, Xiaojiao, Chen, Xuhang, Luo, Shenghong, Wang, Shuqiang, & Pun, Chi Man. (2024). Dual-Hybrid Attention Network for Specular Highlight Removal. MM 2024 - Proceedings of the 32nd ACM International Conference on Multimedia, 10173-10181.
Files in This Item: | There are no files associated with this item. |
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.