Status | Published
Title | Dynamic Network Structure: Doubly Stacking Broad Learning Systems With Residuals and Simpler Linear Model Transmission
Authors | Runshan Xie 1,2; Chi-Man Vong 3; C. L. Philip Chen 4; Shitong Wang 1,2
Date Issued | 2022-02-16
Source Publication | IEEE Transactions on Emerging Topics in Computational Intelligence |
ISSN | 2471-285X |
Volume | 6
Issue | 6
Pages | 1378-1395
Abstract | While the broad learning system (BLS) has demonstrated distinctive performance thanks to its solid theoretical foundation, strong generalization capability and fast learning speed, a relatively large network structure (i.e., a large number of enhancement nodes) is often required to assure satisfactory performance, especially on challenging datasets, which may inevitably deteriorate its generalization capability due to overfitting. In this study, by stacking several broad learning sub-systems, a doubly stacked broad learning system with residuals and simpler linear model transmission, called RST&BLS, is presented to improve BLS in terms of network size, generalization capability and learning speed. Using shared feature nodes and simpler linear models between stacked layers, the design methodology of RST&BLS is motivated by three facets: 1) analogous to human neural behavior, in which certain common neuron blocks are always activated to handle correlated problems, an enhanced ensemble of BLS sub-systems results; 2) humans prefer a simple model over a complicated one (as a component of the final model); 3) extra overfitting-avoidance capability between the shared feature nodes and the remaining hidden nodes from the second layer onward can be assured in theory. Beyond its performance advantage over comparative methods, experimental results on twenty-one classification/regression datasets indicate the superiority of RST&BLS in terms of smaller network structure (i.e., fewer adjustable parameters), better generalization capability and lower computational burden.
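For orientation, the sketch below illustrates the general idea described in the abstract: each broad learning sub-system combines random feature nodes and enhancement nodes with a ridge-regression readout, and stacked sub-systems are fitted to the residuals left by the stack so far. This is a minimal illustration under stated assumptions, not the authors' RST&BLS implementation; all names and hyperparameters (fit_bls_layer, n_feat, n_enh, lam) are hypothetical.

    # Minimal residual-stacked BLS sketch (illustrative only, not RST&BLS itself).
    import numpy as np

    rng = np.random.default_rng(0)

    def fit_bls_layer(X, Y, n_feat=20, n_enh=50, lam=1e-2):
        """One BLS sub-system: random feature nodes Z, random enhancement
        nodes H = tanh(Z @ We), and a ridge readout over A = [Z, H]."""
        Wf = rng.standard_normal((X.shape[1], n_feat))
        Z = np.tanh(X @ Wf)                      # feature nodes
        We = rng.standard_normal((n_feat, n_enh))
        H = np.tanh(Z @ We)                      # enhancement nodes
        A = np.hstack([Z, H])
        # Ridge-regression output weights: (A^T A + lam I)^{-1} A^T Y
        W = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ Y)
        return (Wf, We, W), A @ W

    def predict_layer(params, X):
        Wf, We, W = params
        Z = np.tanh(X @ Wf)
        return np.hstack([Z, np.tanh(Z @ We)]) @ W

    # Residual stacking: each new sub-system is trained on the residual of
    # the current stack, so the final prediction is the sum of layer outputs.
    X = rng.standard_normal((200, 5))
    Y = np.sin(X[:, :1]) + 0.1 * rng.standard_normal((200, 1))
    layers, residual, pred = [], Y.copy(), np.zeros_like(Y)
    for _ in range(3):
        params, out = fit_bls_layer(X, residual)
        layers.append(params)
        pred += out
        residual = Y - pred
    print("training MSE:", float(np.mean((Y - pred) ** 2)))

Note that the paper's actual design additionally shares feature nodes across layers and transmits simpler linear models between them; the sketch above shows only the residual-stacking skeleton.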
Keyword | Broad Learning System (BLS); Stacked Structure; Simple Linear Models; Learning Algorithms; Overfitting; Co-adaptation; Generalization
Subject Area | Computer Science |
MOST Discipline Catalogue | Computer Science, Artificial Intelligence |
DOI | 10.1109/TETCI.2022.3146983 |
URL | https://doi.org/10.1109/TETCI.2022.3146983
Indexed By | SCIE |
Language | English
WOS Research Area | Computer Science |
WOS Subject | Computer Science, Artificial Intelligence |
WOS ID | WOS:000761350100001 |
Publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 445 HOES LANE, PISCATAWAY, NJ 08855-4141 |
Scopus ID | 2-s2.0-85124843388 |
Document Type | Journal article |
Collection | Faculty of Science and Technology > Department of Computer and Information Science
Corresponding Author | Shitong Wang |
Affiliation | 1. School of AI and Computer Science, Jiangnan University, Wuxi 214122, China; 2. Taihu Jiangsu Key Construction Laboratory of IoT Application Technologies, Jiangsu 214122, China; 3. Department of Computer and Information Science, University of Macau, Macau 999078, China; 4. School of Computer Science and Engineering, South China University of Technology, Guangzhou 510006, China
Recommended Citation GB/T 7714 | Runshan Xie, Chi-Man Vong, C. L. Philip Chen, et al. Dynamic Network Structure: Doubly Stacking Broad Learning Systems With Residuals and Simpler Linear Model Transmission[J]. IEEE Transactions on Emerging Topics in Computational Intelligence, 2022, 6(6): 1378-1395.
APA | Xie, R., Vong, C.-M., Chen, C. L. P., & Wang, S. (2022). Dynamic Network Structure: Doubly Stacking Broad Learning Systems With Residuals and Simpler Linear Model Transmission. IEEE Transactions on Emerging Topics in Computational Intelligence, 6(6), 1378-1395.
MLA | Xie, Runshan, et al. "Dynamic Network Structure: Doubly Stacking Broad Learning Systems With Residuals and Simpler Linear Model Transmission." IEEE Transactions on Emerging Topics in Computational Intelligence 6.6 (2022): 1378-1395.
Files in This Item: | There are no files associated with this item. |
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.