Residential College | false |
Status | Published |
Title | Structure Suture Learning-Based Robust Multiview Palmprint Recognition |
Authors | Zhao, Shuping1; Fei, Lunke2; Wen, Jie3; Zhang, Bob1; Zhao, Pengyang4; Li, Shuyi1 |
Year | 2022 |
Source Publication | IEEE Transactions on Neural Networks and Learning Systems |
ISSN | 2162-237X |
Volume | 35
Issue | 6
Pages | 8401-8413
Abstract | Low-quality palmprint images degrade recognition performance when they are captured under open, unconstrained, and low-illumination conditions. Moreover, traditional single-view palmprint representation methods struggle to strongly express the characteristics of each palm, so the palmprint characteristics become weak. To tackle these issues, in this article we propose a structure suture learning-based robust multiview palmprint recognition method (SSL_RMPR), which comprehensively presents the salient palmprint features from multiple views. Unlike existing multiview palmprint representation methods, SSL_RMPR introduces a structure suture learning strategy to produce an elastic nearest neighbor graph (ENNG) on the reconstruction errors, which simultaneously exploits the label information and the latent consensus structure of the multiview data, such that the discriminant palmprint representation can be adaptively enhanced. Meanwhile, a low-rank reconstruction term integrated with projection matrix learning is proposed, such that the robustness of the projection matrix is improved. In particular, since no extra structure-capture term is imposed on the proposed model, the complexity of the model is greatly reduced. Experimental results demonstrate the superiority of the proposed SSL_RMPR, which achieves the best recognition performance on a number of real-world palmprint databases.
Keyword | Elastic Nearest Neighbor Graph (ENNG); Electronic Mail; Face Recognition; Feature Extraction; Image Reconstruction; Learning Systems; Multiview Learning; Neural Networks; Palmprint Recognition; Structure Suture Learning
DOI | 10.1109/TNNLS.2022.3227473 |
URL | View the original |
Indexed By | SCIE |
Language | English
WOS Research Area | Computer Science ; Engineering |
WOS Subject | Computer Science, Artificial Intelligence ; Computer Science, Hardware & Architecture ; Computer Science, Theory & Methods ; Engineering, Electrical & Electronic |
WOS ID | WOS:000903572100001 |
Publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 445 HOES LANE, PISCATAWAY, NJ 08855-4141 |
Scopus ID | 2-s2.0-85146257479 |
Document Type | Journal article |
Collection | DEPARTMENT OF COMPUTER AND INFORMATION SCIENCE |
Corresponding Author | Wen, Jie; Zhang, Bob |
Affiliation | 1. Department of Computer and Information Science, PAMI Research Group, University of Macau, Taipa, China; 2. School of Computer Science and Technology, Guangdong University of Technology, Guangzhou, China; 3. Shenzhen Key Laboratory of Visual Object Detection and Recognition, Harbin Institute of Technology, Shenzhen, China; 4. Department of Electronic Engineering, Tsinghua University, Beijing, China
First Author Affiliation | University of Macau
Corresponding Author Affiliation | University of Macau
Recommended Citation GB/T 7714 | Zhao, Shuping, Fei, Lunke, Wen, Jie, et al. Structure Suture Learning-Based Robust Multiview Palmprint Recognition[J]. IEEE Transactions on Neural Networks and Learning Systems, 2022, 35(6): 8401-8413.
APA | Zhao, Shuping, Fei, Lunke, Wen, Jie, Zhang, Bob, Zhao, Pengyang, & Li, Shuyi (2022). Structure Suture Learning-Based Robust Multiview Palmprint Recognition. IEEE Transactions on Neural Networks and Learning Systems, 35(6), 8401-8413.
MLA | Zhao, Shuping, et al. "Structure Suture Learning-Based Robust Multiview Palmprint Recognition." IEEE Transactions on Neural Networks and Learning Systems 35.6 (2022): 8401-8413.
Files in This Item: | There are no files associated with this item. |
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.