Status: Published
Novel Efficient RNN and LSTM-Like Architectures: Recurrent and Gated Broad Learning Systems and Their Applications for Text Classification
Du, Jie (1); Vong, Chi Man (2); Philip Chen, C. L. (3)
2021-03-01
Source Publication: IEEE Transactions on Cybernetics
ABS Journal Level: 3
ISSN: 2168-2267
Volume: 51; Issue: 3; Pages: 1586-1597
Abstract

High accuracy in text classification can be achieved by simultaneously learning multiple kinds of information, such as sequence information and word importance. In this article, a flat neural network called the broad learning system (BLS) is employed to derive two novel learning methods for text classification: recurrent BLS (R-BLS) and a long short-term memory (LSTM)-like architecture, gated BLS (G-BLS). The proposed methods possess three advantages: 1) higher accuracy due to the simultaneous learning of multiple kinds of information, even compared to a deep LSTM, which extracts deeper but only a single kind of information; 2) significantly faster training due to the noniterative learning in BLS, compared to LSTM; and 3) easy integration with other discriminant information for further improvement. The proposed methods have been evaluated over 13 real-world datasets spanning various types of text classification. In the experiments, the proposed methods achieve higher accuracies than LSTM while taking significantly less training time on most evaluated datasets, especially when the LSTM uses a deep architecture. Compared to R-BLS, G-BLS has an extra forget gate that controls the flow of information (similar to LSTM) to further improve accuracy on text classification, so G-BLS is more effective while R-BLS is more efficient.
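The "noniterative learning" credited for BLS's speed advantage can be illustrated with a minimal sketch: random mapped-feature and enhancement nodes, followed by a single closed-form ridge-regression solve for the output weights instead of iterative backpropagation. All layer sizes, activations, and the regularization strength below are illustrative assumptions, not the paper's settings.

```python
# Minimal sketch of noniterative BLS training (assumed toy configuration,
# not the paper's architecture or hyperparameters).
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 100 samples, 20 input features, 3 classes (one-hot targets).
X = rng.standard_normal((100, 20))
Y = np.eye(3)[rng.integers(0, 3, size=100)]

# Random mapped-feature nodes: Z = phi(X @ Wf + bf).
Wf = rng.standard_normal((20, 30))
Z = np.tanh(X @ Wf + rng.standard_normal(30))

# Random enhancement nodes: H = xi(Z @ We + be).
We = rng.standard_normal((30, 40))
H = np.tanh(Z @ We + rng.standard_normal(40))

# Concatenate both node groups and solve for the output weights W in one
# shot via regularized least squares -- no iterative gradient descent.
A = np.hstack([Z, H])
lam = 1e-2  # ridge regularization strength (assumed)
W = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ Y)

# Predictions come from a single linear readout of the broad layer.
pred = (A @ W).argmax(axis=1)
train_acc = (pred == Y.argmax(axis=1)).mean()
```

Because training reduces to one linear solve, the cost is dominated by a single matrix factorization, which is the source of the training-time advantage over iteratively trained LSTMs reported in the abstract.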

Keywords: Broad Learning System (BLS); Sequence Information; Simultaneous Learning; Text Classification; Word Importance
DOI: 10.1109/TCYB.2020.2969705
Indexed By: SCIE
Language: English
WOS Research Area: Automation & Control Systems; Computer Science
WOS Subject: Automation & Control Systems; Computer Science, Artificial Intelligence; Computer Science, Cybernetics
WOS ID: WOS:000619376300040
Scopus ID: 2-s2.0-85101145440
Document Type: Journal article
Collection: Department of Computer and Information Science
Corresponding Author: Vong, Chi Man
Affiliation:
1. National-Regional Key Technology Engineering Laboratory for Medical Ultrasound, Guangdong Key Laboratory for Biomedical Measurements and Ultrasound Imaging, School of Biomedical Engineering, Health Science Center, Shenzhen University, Shenzhen, 518060, China
2. Department of Computer and Information Science, University of Macau, Macau, Macao
3. School of Computer Science and Engineering, South China University of Technology, Guangzhou, 510641, China
Corresponding Author Affiliation: University of Macau
Recommended Citation
GB/T 7714: Du, Jie, Vong, Chi Man, Philip Chen, C. L. Novel Efficient RNN and LSTM-Like Architectures: Recurrent and Gated Broad Learning Systems and Their Applications for Text Classification[J]. IEEE Transactions on Cybernetics, 2021, 51(3): 1586-1597.
APA: Du, Jie, Vong, Chi Man, & Philip Chen, C. L. (2021). Novel Efficient RNN and LSTM-Like Architectures: Recurrent and Gated Broad Learning Systems and Their Applications for Text Classification. IEEE Transactions on Cybernetics, 51(3), 1586-1597.
MLA: Du, Jie, et al. "Novel Efficient RNN and LSTM-Like Architectures: Recurrent and Gated Broad Learning Systems and Their Applications for Text Classification." IEEE Transactions on Cybernetics 51.3 (2021): 1586-1597.
Files in This Item:
There are no files associated with this item.

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.