Residential College | false
Status | Published
Title | Broad Learning System: An Effective and Efficient Incremental Learning System Without the Need for Deep Architecture
Authors | Chen, C. L. Philip; Liu, Zhulin
Date Issued | 2018-01
Source Publication | IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS |
ISSN | 2162-237X |
Volume | 29
Issue | 1
Pages | 10-24
Abstract | The Broad Learning System (BLS), which aims to offer an alternative to learning in deep structures, is proposed in this paper. Deep structures and deep learning suffer from a time-consuming training process because of the large number of connecting parameters in filters and layers; moreover, they require a complete retraining process if the structure is insufficient to model the system. The BLS is established as a flat network, where the original inputs are transferred and placed as "mapped features" in feature nodes, and the structure is expanded in the wide sense through "enhancement nodes." Incremental learning algorithms are developed for fast remodeling in broad expansion, without retraining, when the network is deemed to need expansion. Two incremental learning algorithms are given: one for the increment of feature nodes (or filters in a deep structure) and one for the increment of enhancement nodes. The designed model and algorithms are very versatile for selecting a model rapidly. In addition, another incremental learning algorithm is developed for a system that has already been modeled and encounters new incoming inputs; specifically, the system can be remodeled incrementally without retraining from the beginning. Model reduction using singular value decomposition is also conducted to simplify the final structure, with satisfactory results. Compared with existing deep neural networks, experimental results on the Modified National Institute of Standards and Technology (MNIST) database and the NYU NORB object recognition benchmark demonstrate the effectiveness of the proposed BLS.
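Editor's note: the abstract describes the BLS recipe at a high level (random mapped features, random enhancement nodes, output weights from a pseudoinverse, and an incremental update when enhancement nodes are added). The NumPy sketch below is an illustration of that recipe only, not the authors' code: the function names, hyperparameters, and use of plain Gaussian random weights are assumptions of this sketch (the paper refines the feature mappings with a sparse autoencoder and uses a ridge-regularized pseudoinverse).

import numpy as np

def build_bls(X, Y, n_feature_groups=10, nodes_per_group=10, n_enhance=100, seed=0):
    """Fit a minimal BLS-style flat network (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    Xb = np.hstack([X, np.ones((len(X), 1))])                  # input with bias column

    # Mapped features Z = [Z_1, ..., Z_n]: random linear maps of the input
    Wf = [rng.standard_normal((Xb.shape[1], nodes_per_group))
          for _ in range(n_feature_groups)]
    Z = np.hstack([Xb @ W for W in Wf])

    # Enhancement nodes: a random nonlinear expansion of all mapped features
    Zb = np.hstack([Z, np.ones((len(Z), 1))])
    Wh = rng.standard_normal((Zb.shape[1], n_enhance))
    H = np.tanh(Zb @ Wh)

    # State matrix A = [Z | H]; output weights from the pseudoinverse solution
    A = np.hstack([Z, H])
    A_pinv = np.linalg.pinv(A)
    W = A_pinv @ Y
    return Wf, Wh, A, A_pinv, W

def add_enhancement_nodes(A, A_pinv, W, Y, H_new):
    """Absorb new enhancement nodes H_new without retraining from scratch.

    Block pseudoinverse update: only the (thin) residual block of H_new
    is inverted, and the existing output weights W are reused.
    """
    D = A_pinv @ H_new
    C = H_new - A @ D                       # residual of H_new w.r.t. the span of A
    if np.allclose(C, 0):
        B = np.linalg.solve(np.eye(D.shape[1]) + D.T @ D, D.T @ A_pinv)
    else:
        B = np.linalg.pinv(C)
    A_new = np.hstack([A, H_new])
    A_new_pinv = np.vstack([A_pinv - D @ B, B])
    W_new = np.vstack([W - D @ (B @ Y), B @ Y])
    return A_new, A_new_pinv, W_new

For classification, Y would hold one-hot label targets; new inputs are scored by regenerating Z and H with the stored random weights Wf and Wh and multiplying the resulting state matrix by W.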
Keyword | Big Data; Big Data Modeling; Broad Learning System (BLS); Deep Learning; Incremental Learning; Random Vector Functional-Link Neural Networks (RVFLNN); Single Layer Feedforward Neural Networks (SLFN); Singular Value Decomposition (SVD)
DOI | 10.1109/TNNLS.2017.2716952 |
Indexed By | SCIE |
Language | English
WOS Research Area | Computer Science ; Engineering |
WOS Subject | Computer Science, Artificial Intelligence ; Computer Science, Hardware & Architecture ; Computer Science, Theory & Methods ; Engineering, Electrical & Electronic |
WOS ID | WOS:000419558900002 |
Publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC |
The Source to Article | WOS |
Scopus ID | 2-s2.0-85028939310 |
Document Type | Journal article |
Collection | University of Macau |
Recommended Citation GB/T 7714 | Chen, C. L. Philip, Liu, Zhulin. Broad Learning System: An Effective and Efficient Incremental Learning System Without the Need for Deep Architecture[J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2018, 29(1): 10-24.
APA | Chen, C. L. Philip, & Liu, Zhulin. (2018). Broad Learning System: An Effective and Efficient Incremental Learning System Without the Need for Deep Architecture. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 29(1), 10-24.
MLA | Chen, C. L. Philip, et al. "Broad Learning System: An Effective and Efficient Incremental Learning System Without the Need for Deep Architecture." IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS 29.1 (2018): 10-24.