Residential College | false
Status | Published
Title | BNGBS: An efficient network boosting system with triple incremental learning capabilities for more nodes, samples, and classes
Author | Feng, Liangjun (1); Zhao, Chunhui (1); Chen, C. L. Philip (2); Li, Yuan Long (3); Zhou, Min (4); Qiao, Honglin (3); Fu, Chuan (3)
Issue Date | 2020-10-28
Source Publication | Neurocomputing |
ISSN | 0925-2312 |
Volume | 412
Pages | 486-501
Abstract | As an ensemble algorithm, network boosting enjoys powerful classification ability but suffers from a tedious and time-consuming training process. To tackle this problem, this paper develops a broad network gradient boosting system (BNGBS) by integrating the gradient boosting machine with broad networks, in which the classification loss incurred by a base broad network is learned and eliminated by subsequent networks in a cascade manner. The proposed system is constructed as an additive model and can be optimized by a greedy strategy instead of the tedious back-propagation algorithm, resulting in a more efficient learning process. Meanwhile, triple incremental learning capabilities are designed: the increment of feature nodes, the increment of input samples, and the increment of target classes. The system can therefore be efficiently updated and expanded from its current status, rather than entirely retrained, when more feature nodes, input samples, or target classes are required. The node-increment ability allows more feature nodes to be added to the built system if the current structure is not effective for learning. The sample-increment ability allows the model to keep learning from incoming batches of data. The class-increment ability tackles the issue that incoming batch data may contain unseen categories. Comprehensive results on eight benchmark datasets illustrate the effectiveness of the proposed system for classification in comparison with existing popular machine learning methods. (An illustrative code sketch of this boosting scheme appears after the bibliographic record below.)
Keyword | Additive Model; Broad Network; Cascade Model; Gradient Boosting Machine; Greedy Strategy; Incremental Learning
DOI | 10.1016/j.neucom.2020.06.100 |
Indexed By | SCIE |
Language | English
WOS Research Area | Computer Science |
WOS Subject | Computer Science, Artificial Intelligence |
WOS ID | WOS:000571878800002 |
Publisher | Elsevier, Radarweg 29, 1043 NX Amsterdam, Netherlands
Scopus ID | 2-s2.0-85088373354 |
Document Type | Journal article |
Collection | DEPARTMENT OF COMPUTER AND INFORMATION SCIENCE |
Corresponding Author | Zhao, Chunhui |
Affiliation | 1. State Key Laboratory of Industrial Control Technology, College of Control Science and Engineering, Zhejiang University, Hangzhou, 310027, China
2. Department of Computer and Information Science, University of Macau, Macau, 999078, China
3. Alibaba Group, Hangzhou, 310024, China
4. Pangang Group Xichang Steel and Vanadium Co., Ltd., Xichang, 615000, China
Recommended Citation GB/T 7714 | Feng, Liangjun, Zhao, Chunhui, Chen, C. L. Philip, et al. BNGBS: An efficient network boosting system with triple incremental learning capabilities for more nodes, samples, and classes[J]. Neurocomputing, 2020, 412: 486-501.
APA | Feng, Liangjun, Zhao, Chunhui, Chen, C. L. Philip, Li, Yuan Long, Zhou, Min, Qiao, Honglin, & Fu, Chuan (2020). BNGBS: An efficient network boosting system with triple incremental learning capabilities for more nodes, samples, and classes. Neurocomputing, 412, 486-501.
MLA | Feng, Liangjun, et al. "BNGBS: An efficient network boosting system with triple incremental learning capabilities for more nodes, samples, and classes." Neurocomputing 412 (2020): 486-501.
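
Illustrative code sketch. The following is a minimal Python sketch of the boosting scheme described in the abstract: an additive ensemble whose base learners are small random-feature ("broad") networks with closed-form ridge readouts, each fitted greedily to the residual of the current ensemble under a squared loss. It is not the authors' BNGBS implementation; all class names, hyperparameters (number of feature nodes, learning rate, regularization), and the toy data are assumptions made for illustration, and the triple incremental-learning capabilities are not reproduced here.

```python
# Illustrative sketch only: a gradient-boosting ensemble of random-feature
# "broad" base learners with closed-form ridge readouts. This is NOT the
# authors' BNGBS implementation; names and hyperparameters are hypothetical.
import numpy as np


class BroadBaseLearner:
    """Random feature expansion + ridge readout (no back-propagation)."""

    def __init__(self, n_features_in, n_nodes=200, reg=1e-2, rng=None):
        rng = np.random.default_rng(rng)
        self.W = rng.standard_normal((n_features_in, n_nodes))
        self.b = rng.standard_normal(n_nodes)
        self.reg = reg
        self.beta = None  # readout weights, solved in closed form

    def _expand(self, X):
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, R):
        # Fit the readout to the residual R by ridge regression.
        H = self._expand(X)
        A = H.T @ H + self.reg * np.eye(H.shape[1])
        self.beta = np.linalg.solve(A, H.T @ R)
        return self

    def predict(self, X):
        return self._expand(X) @ self.beta


def fit_boosted_ensemble(X, y, n_classes, n_learners=5, lr=0.5, seed=0):
    """Greedy stage-wise fitting: each learner models the current residual."""
    Y = np.eye(n_classes)[y]          # one-hot targets
    F = np.zeros_like(Y)              # current ensemble output
    learners = []
    for t in range(n_learners):
        residual = Y - F              # negative gradient of the squared loss
        learner = BroadBaseLearner(X.shape[1], rng=seed + t).fit(X, residual)
        F += lr * learner.predict(X)
        learners.append(learner)
    return learners


def predict_ensemble(learners, X, lr=0.5):
    F = sum(lr * m.predict(X) for m in learners)
    return F.argmax(axis=1)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((300, 10))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)   # toy two-class problem
    models = fit_boosted_ensemble(X, y, n_classes=2)
    acc = (predict_ensemble(models, X) == y).mean()
    print(f"training accuracy: {acc:.2f}")
```

The closed-form readout mirrors the abstract's point that a greedy, stage-wise strategy avoids back-propagation: each stage only solves a ridge regression against the current residual before being added to the cascade.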