Residential College: false
Status: Published
Efficient Human Motion Retrieval via Temporal Adjacent Bag of Words and Discriminative Neighborhood Preserving Dictionary Learning
Liu, Xin [1]; He, Gao-Feng [1]; Peng, Shu-Juan [1]; Cheung, Yiu-ming [1,2]; Tang, Yuan Yan [3]
Date Issued: 2017-12
Source Publication: IEEE TRANSACTIONS ON HUMAN-MACHINE SYSTEMS
ISSN: 2168-2291
Volume: 47  Issue: 6  Pages: 763-776
Abstract

Human motion retrieval from motion capture data forms a fundamental basis for computer animation. In this paper, the authors propose an efficient human motion retrieval approach via temporal adjacent bag of words (TA-BoW) and discriminative neighborhood preserving dictionary learning (DNP-DL). The retrieval process comprises two phases: offline training and online retrieval. In the offline phase, the original skeleton model is simplified and pairwise joint distances are computed to characterize each motion frame. A novel motion descriptor, TA-BoW, is then proposed to discriminatively code the motion appearances, greatly reducing the articulated complexity and spatiotemporal dimensionality. Subsequently, by considering the neighborhood relationships of the intraclass structure and the advantage of the Fisher criterion, a DNP-DL method is derived, through which each human action can be discriminatively and sparsely represented by a linear combination of the learned dictionary atoms. In the online phase, a hierarchical retrieval mechanism incorporating sparse classification and chi-square ranking significantly reduces the search range. The experimental results show that the proposed human motion retrieval approach outperforms state-of-the-art competing approaches.
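The following is a minimal illustrative sketch in Python of two building blocks the abstract describes: characterizing a motion frame by its pairwise joint distances, and ranking candidate motions by chi-square distance between bag-of-words histograms. It is not the authors' implementation; the joint count, vocabulary size, and all function names are assumptions for illustration, and the TA-BoW coding and DNP-DL stages are not reproduced here.

```python
import numpy as np

def pairwise_joint_distances(frame):
    """Describe one motion frame by its pairwise joint distances.

    `frame` is an (n_joints, 3) array of 3-D joint positions on a
    simplified skeleton (the paper's exact joint set is not given here).
    Returns the n_joints * (n_joints - 1) / 2 unique distances.
    """
    diffs = frame[:, None, :] - frame[None, :, :]   # (n, n, 3) offsets
    dists = np.linalg.norm(diffs, axis=-1)          # (n, n) distance matrix
    iu = np.triu_indices(frame.shape[0], k=1)       # keep each pair once
    return dists[iu]

def chi_square_distance(h1, h2, eps=1e-10):
    """Chi-square distance between two normalized histograms, as used
    in the final ranking stage of the hierarchical retrieval."""
    return 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))

def rank_candidates(query_hist, candidate_hists):
    """Order candidate motions by increasing chi-square distance to the query."""
    d = np.array([chi_square_distance(query_hist, h) for h in candidate_hists])
    return np.argsort(d)

if __name__ == "__main__":
    rng = np.random.default_rng(0)

    # A hypothetical 15-joint simplified skeleton -> 105 pairwise distances.
    frame = rng.standard_normal((15, 3))
    print(pairwise_joint_distances(frame).shape)    # (105,)

    # Hypothetical 64-word BoW histograms: one query and ten candidates.
    hists = rng.random((11, 64))
    hists /= hists.sum(axis=1, keepdims=True)
    print(rank_candidates(hists[0], hists[1:]))     # candidate indices, best first
```

In the paper's pipeline, sparse classification over the learned dictionary would first prune the candidate set before a ranking step of this kind is applied, which is what reduces the search range.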

Keywords: Hierarchical Retrieval Mechanism; Human Motion Retrieval; Neighborhood Preserving Dictionary; Pairwise Distance; Temporal Adjacent Bag of Words (TA-BoW)
DOI: 10.1109/THMS.2017.2675959
Indexed By: SCIE
Language: English
WOS Research Area: Computer Science
WOS Subject: Computer Science, Artificial Intelligence; Computer Science, Cybernetics
WOS ID: WOS:000415153100002
Publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
The Source to Article: WOS
Scopus ID: 2-s2.0-85016420840
Document Type: Journal article
Collection: University of Macau
Corresponding Author: Liu, Xin; He, Gao-Feng; Peng, Shu-Juan; Cheung, Yiu-ming; Tang, Yuan Yan
Affiliation:
1. Department of Computer Science, Huaqiao University, Xiamen 361021, China
2. HKBU Institute of Research and Continuing Education, Shenzhen 518057, China
3. Department of Computer and Information Science, University of Macau, Macau 999078, China
Corresponding Author Affiliation: University of Macau
Recommended Citation
GB/T 7714
Liu, Xin, He, Gao-Feng, Peng, Shu-Juan, et al. Efficient Human Motion Retrieval via Temporal Adjacent Bag of Words and Discriminative Neighborhood Preserving Dictionary Learning[J]. IEEE TRANSACTIONS ON HUMAN-MACHINE SYSTEMS, 2017, 47(6): 763-776.
APA Liu, Xin, He, Gao-Feng, Peng, Shu-Juan, Cheung, Yiu-ming, & Tang, Yuan Yan. (2017). Efficient Human Motion Retrieval via Temporal Adjacent Bag of Words and Discriminative Neighborhood Preserving Dictionary Learning. IEEE TRANSACTIONS ON HUMAN-MACHINE SYSTEMS, 47(6), 763-776.
MLA Liu, Xin, et al. "Efficient Human Motion Retrieval via Temporal Adjacent Bag of Words and Discriminative Neighborhood Preserving Dictionary Learning." IEEE TRANSACTIONS ON HUMAN-MACHINE SYSTEMS 47.6 (2017): 763-776.
Files in This Item:
There are no files associated with this item.

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.