Status | Published
Title | Novel Efficient RNN and LSTM-like architectures: Recurrent and Gated Broad Learning Systems and Their Applications to Text Classification
Authors | Du, J.; Vong, C. M.; Chen, C. L.
Date Issued | 2021-03-01
Source Publication | IEEE Transactions on Cybernetics (SCI-E) |
ISSN | 2168-2267 |
Pages | 1586-1597 |
Abstract | High accuracy of text classification can be achieved through simultaneous learning of multiple types of information, such as sequence information and word importance. In this article, a flat neural network called the broad learning system (BLS) is employed to derive two novel learning methods for text classification: recurrent BLS (R-BLS) and a long short-term memory (LSTM)-like architecture, gated BLS (G-BLS). The two proposed methods possess three advantages: 1) higher accuracy due to the simultaneous learning of multiple types of information, even compared to a deep LSTM that extracts deeper but single information only; 2) significantly faster training than LSTM due to the noniterative learning in BLS; and 3) easy integration with other discriminant information for further improvement. The proposed methods have been evaluated over 13 real-world datasets covering various types of text classification. In the experiments, the proposed methods achieve higher accuracies than LSTM while taking significantly less training time on most evaluated datasets, especially when the LSTM has a deep architecture. Compared to R-BLS, G-BLS has an extra forget gate to control the flow of information (similar to LSTM), which further improves the accuracy on text classification, so that G-BLS is more effective while R-BLS is more efficient.
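The "noniterative learning" the abstract credits for BLS's speed refers to solving the output weights of a wide, randomly expanded layer in closed form by ridge regression, instead of training by gradient descent. A minimal NumPy sketch of this idea follows; the layer sizes, random mappings, and regularization value are illustrative assumptions, not the paper's actual R-BLS/G-BLS configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 100 samples, 20 input features, 3 classes (one-hot targets).
X = rng.standard_normal((100, 20))
Y = np.eye(3)[rng.integers(0, 3, 100)]

# Broad expansion (illustrative): random feature nodes and enhancement
# nodes, concatenated into one wide layer A -- the "broad" part of BLS.
W_feat = rng.standard_normal((20, 30))   # assumed random feature weights
Z = np.tanh(X @ W_feat)                  # feature nodes
W_enh = rng.standard_normal((30, 40))    # assumed random enhancement weights
H = np.tanh(Z @ W_enh)                   # enhancement nodes
A = np.hstack([Z, H])                    # broad layer, shape (100, 70)

# Noniterative learning: output weights in closed form via ridge
# regression, W = (A^T A + lam*I)^{-1} A^T Y. No iterative gradient
# descent is needed, which is the source of BLS's training speed.
lam = 1e-2                               # assumed regularization strength
W = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ Y)

scores = A @ W                           # class scores for all samples
print(scores.shape)                      # (100, 3)
```

R-BLS and G-BLS extend this scheme with recurrent connections and (in G-BLS) a forget gate over the feature flow, while keeping the same closed-form solution for the output weights.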
Keyword | Broad learning system (BLS); sequence information; simultaneous learning; text classification; word importance
Language | English
The Source to Article | PB_Publication |
PUB ID | 62171 |
Document Type | Journal article |
Collection | DEPARTMENT OF COMPUTER AND INFORMATION SCIENCE |
Corresponding Author | Vong, C. M. |
Recommended Citation GB/T 7714 | Du, J., Vong, C. M., Chen, C. L. Novel Efficient RNN and LSTM-like architectures: Recurrent and Gated Broad Learning Systems and Their Applications to Text Classification[J]. IEEE Transactions on Cybernetics, 2021: 1586-1597.
APA | Du, J., Vong, C. M., & Chen, C. L. (2021). Novel Efficient RNN and LSTM-like architectures: Recurrent and Gated Broad Learning Systems and Their Applications to Text Classification. IEEE Transactions on Cybernetics, 1586-1597.
MLA | Du, J., et al. "Novel Efficient RNN and LSTM-like architectures: Recurrent and Gated Broad Learning Systems and Their Applications to Text Classification." IEEE Transactions on Cybernetics (2021): 1586-1597.
Files in This Item: | There are no files associated with this item. |
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.