Residential College | false
Status | Published
Title | Semantic-based conditional generative adversarial hashing with pairwise labels
Author | Li, Qi (1,4); Wang, Weining (2,4); Tang, Yuanyan (5); Xu, Chengzhong (6); Sun, Zhenan (1,3,4)
Date Issued | 2023-02-25
Source Publication | Pattern Recognition |
ISSN | 0031-3203 |
Volume | 139
Pages | 109452
Abstract | Hashing has been widely exploited in recent years due to the rapid growth of image and video data on the web. Benefiting from recent advances in deep learning, deep hashing methods have achieved promising results with supervised information. However, collecting supervised information is usually expensive. To exploit both labeled and unlabeled data samples, many semi-supervised hashing methods based on Generative Adversarial Networks (GANs) have been proposed. Most of them still require conditional information, which is usually produced by pre-trained neural networks or by random binary vectors. A natural question about these methods is: how can better conditional information be generated from the semantic similarity information? In this paper, we propose a general two-stage conditional GAN hashing framework based on pairwise label information. Both labeled and unlabeled data samples are exploited to learn hash codes under our framework. In the first stage, the conditional information is generated via a general Bayesian approach; it has a much lower-dimensional representation and preserves the semantic information of the original data samples. In the second stage, a semi-supervised approach is presented to learn hash codes based on the conditional information. Both a pairwise cross-entropy loss and an adversarial loss are introduced to make full use of labeled and unlabeled data samples. Extensive experiments show that the proposed algorithm outperforms current state-of-the-art methods on three benchmark image datasets, which demonstrates the effectiveness of our method.
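The pairwise cross-entropy term the abstract mentions is commonly formulated as a likelihood over inner products of relaxed hash codes (as in DPSH-style deep hashing with pairwise labels). The sketch below is a minimal illustration under that assumption; the function name `pairwise_hash_loss` and the exact weighting are hypothetical and not taken from the paper's implementation.

```python
# A minimal PyTorch sketch of a pairwise cross-entropy loss for deep hashing
# with pairwise labels. This is an assumed DPSH-style formulation, not the
# paper's own code.
import torch
import torch.nn.functional as F

def pairwise_hash_loss(u: torch.Tensor, s: torch.Tensor) -> torch.Tensor:
    # u: (n, k) real-valued relaxed hash codes from the encoder network.
    # s: (n, n) pairwise labels, s[i, j] = 1 if samples i and j are similar.
    theta = 0.5 * (u @ u.t())  # scaled inner products between code pairs
    # Negative log-likelihood of the observed similarities:
    #   -sum_{ij} [ s_ij * theta_ij - log(1 + exp(theta_ij)) ]
    # F.softplus(theta) computes log(1 + exp(theta)) in a numerically stable way.
    return -(s * theta - F.softplus(theta)).mean()

# Usage with random stand-ins for network outputs and labels:
u = torch.randn(8, 32, requires_grad=True)  # 8 samples, 32-bit relaxed codes
s = (torch.rand(8, 8) > 0.5).float()        # hypothetical similarity matrix
loss = pairwise_hash_loss(u, s)
loss.backward()
b = torch.sign(u.detach())                  # binarize only at inference time
```

Binarizing with sign() only after training keeps the loss differentiable; in a semi-supervised GAN setting such as the one the abstract describes, this supervised term would be combined with an adversarial loss over the unlabeled samples.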
Keyword | Generative Adversarial Networks; Hashing With Pairwise Labels; Semantic-based Conditional Information
DOI | 10.1016/j.patcog.2023.109452 |
Indexed By | SCIE |
Language | English
Funding Project | Research on Key Simulation and Testing Technologies for Connected Intelligent Driving Vehicles |
WOS Research Area | Computer Science ; Engineering |
WOS Subject | Computer Science, Artificial Intelligence ; Engineering, Electrical & Electronic |
WOS ID | WOS:000949911600001 |
Publisher | ELSEVIER SCI LTD, THE BOULEVARD, LANGFORD LANE, KIDLINGTON, OXFORD OX5 1GB, OXON, ENGLAND |
Scopus ID | 2-s2.0-85149395380 |
Document Type | Journal article |
Collection | THE STATE KEY LABORATORY OF INTERNET OF THINGS FOR SMART CITY (UNIVERSITY OF MACAU); Faculty of Science and Technology; DEPARTMENT OF COMPUTER AND INFORMATION SCIENCE
Corresponding Author | Wang,Weining; Sun,Zhenan |
Affiliation | 1. National Key Laboratory of Multimodal Artificial Intelligence System; 2. The Laboratory of Cognition and Decision Intelligence for Complex Systems; 3. School of Artificial Intelligence, University of Chinese Academy of Sciences; 4. Institute of Automation, Chinese Academy of Sciences, Beijing 100190, China; 5. Zhuhai UM Science and Technology Research Institute, FST, University of Macau, Macau; 6. State Key Laboratory of IoTSC, Department of Computer and Information Science, University of Macau, Macau SAR 999078, China
Recommended Citation GB/T 7714 | Li Qi, Wang Weining, Tang Yuanyan, et al. Semantic-based conditional generative adversarial hashing with pairwise labels[J]. Pattern Recognition, 2023, 139: 109452.
APA | Li, Qi, Wang, Weining, Tang, Yuanyan, Xu, Chengzhong, & Sun, Zhenan. (2023). Semantic-based conditional generative adversarial hashing with pairwise labels. Pattern Recognition, 139, 109452.
MLA | Li, Qi, et al. "Semantic-based conditional generative adversarial hashing with pairwise labels." Pattern Recognition 139 (2023): 109452.