Residential College | false
Status | Published
Title | RIFLE: Backpropagation in depth for deep transfer learning through re-initializing the fully-connected layer
Authors | Li, Xingjian (1,2); Xiong, Haoyi (1); An, Haozhe (1); Xu, Chengzhong (2,3); Dou, Dejing (1)
Date Issued | 2020
Conference Name | 37th International Conference on Machine Learning, ICML 2020 |
Source Publication | 37th International Conference on Machine Learning, ICML 2020 |
Volume | PartF168147-8 |
Pages | 5966-5975 |
Conference Date | 13 July 2020 - 18 July 2020 |
Conference Place | Virtual, Online |
Abstract | Fine-tuning a deep convolutional neural network (CNN) from a pre-trained model helps transfer knowledge learned from larger datasets to the target task. While accuracy can be largely improved even when the training dataset is small, the transfer learning outcome is usually constrained by the pre-trained model, as CNN weights stay close to their pre-trained values (Liu et al., 2019) and backpropagation brings smaller updates to deeper CNN layers. In this work, we propose RIFLE, a simple yet effective strategy that deepens backpropagation in transfer learning settings by periodically Re-Initializing the Fully-connected LayEr with random scratch during the fine-tuning procedure. RIFLE brings meaningful updates to the weights of deep CNN layers and improves low-level feature learning, while the effects of randomization can easily converge over the overall learning procedure. Experiments show that RIFLE significantly improves deep transfer learning accuracy on a wide range of datasets, outperforming known tricks for the same purpose, such as Dropout, DropConnect, Stochastic Depth, Disturb Label and Cyclic Learning Rate, with 0.5%-2% higher testing accuracy under the same settings. Empirical cases and ablation studies further indicate that RIFLE brings meaningful updates to deep CNN layers, with accuracy improved.
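The following is a minimal sketch of the re-initialization strategy described in the abstract, written as a PyTorch-style fine-tuning loop. The backbone choice, the reinit_period parameter, and the training utilities are illustrative assumptions and not the authors' released implementation.

```python
# Minimal sketch of the RIFLE idea from the abstract, assuming a PyTorch-style
# fine-tuning loop; backbone, hyperparameters, and helpers are illustrative.
import torch
import torch.nn as nn
from torchvision import models

def fine_tune_with_rifle(train_loader, num_classes, num_epochs=30, reinit_period=10):
    # Start from an ImageNet pre-trained backbone and replace its classifier head.
    model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
    model.fc = nn.Linear(model.fc.in_features, num_classes)
    model.train()

    optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
    criterion = nn.CrossEntropyLoss()

    for epoch in range(num_epochs):
        # RIFLE: periodically re-initialize the fully-connected layer from scratch,
        # which forces backpropagation to send larger updates into deeper CNN layers.
        if epoch > 0 and epoch % reinit_period == 0:
            nn.init.kaiming_normal_(model.fc.weight)
            nn.init.zeros_(model.fc.bias)

        for inputs, labels in train_loader:
            optimizer.zero_grad()
            loss = criterion(model(inputs), labels)
            loss.backward()
            optimizer.step()
    return model
```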
Language | English
Scopus ID | 2-s2.0-85105596195 |
Document Type | Conference paper |
Collection | Faculty of Science and Technology |
Corresponding Author | Li,Xingjian |
Affiliation | 1. Big Data Lab, Baidu Research, Beijing, China; 2. Faculty of Science and Technology, University of Macau, Macao; 3. State Key Lab of IOTSC, Department of Computer Science, University of Macau, Macao
First Author Affiliation | Faculty of Science and Technology
Corresponding Author Affiliation | Faculty of Science and Technology
Recommended Citation GB/T 7714 | Li, Xingjian, Xiong, Haoyi, An, Haozhe, et al. RIFLE: Backpropagation in depth for deep transfer learning through re-initializing the fully-connected layer[C], 2020, 5966-5975.
APA | Li, Xingjian, Xiong, Haoyi, An, Haozhe, Xu, Chengzhong, & Dou, Dejing (2020). RIFLE: Backpropagation in depth for deep transfer learning through re-initializing the fully-connected layer. 37th International Conference on Machine Learning, ICML 2020, PartF168147-8, 5966-5975.
Files in This Item: | There are no files associated with this item. |