Residential College | false
Status | Published
Research on the Great Multi-model Pyramid Training Framework and Enhanced Loss Function for Fine-grained Classification
Guo, Pengze1
2024-10-04
Conference Name | AIAHPC 2024: International Conference on Artificial Intelligence, Automation and High Performance Computing |
Source Publication | Proceedings of the 2024 4th International Conference on Artificial Intelligence, Automation and High Performance Computing
Pages | 414-422 |
Conference Date | July 19-21, 2024
Conference Place | Zhuhai, China |
Country | CHINA |
Publisher | Association for Computing Machinery |
Abstract | Knowledge distillation aims to improve the performance of a smaller student model by transferring knowledge from a larger teacher model. Traditional methods focus on training the student to mimic the teacher's output activations. In this paper, we propose a multi-model training framework, MPTF (multi-model pyramid training framework), and a novel loss function, C-LOSS, to enhance the student model's ability to distinguish between similar classes. At the cost of additional storage and computation time, our framework merges categories with similar features into super categories during the initial training phase and then refines the classification within these super categories. The C-LOSS function optimizes intra-class and inter-class distances to improve discriminative ability. Experiments on the CIFAR-10, CIFAR-100, and EuroSAT datasets show that our method significantly improves both overall accuracy and confusing-class accuracy. Despite these advancements, further research is needed to refine the segmentation of super categories and to enhance the scalability of the framework. This study demonstrates the potential of staged training strategies and optimized loss functions for improving the performance of small models on fine-grained classification tasks.
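The abstract outlines two components: a two-stage pipeline (MPTF) that first classifies into super categories of easily confused classes and then refines within each, and a loss (C-LOSS) that shrinks intra-class distances while enlarging inter-class distances. The paper itself defines both precisely; as a rough, non-authoritative illustration, the PyTorch sketches below show one plausible reading. All names (`mptf_predict`, `CLossSketch`), the grouping structure, and the loss weights are assumptions, not the authors' implementation.

```python
import torch

def mptf_predict(coarse_model, refiners, x):
    """Sketch of two-stage MPTF inference under an assumed structure:
    stage 1 picks a super category with a coarse classifier; stage 2 hands
    the sample to that super category's refiner for the fine-grained label.
    `coarse_model` and each refiner are ordinary classifiers returning logits."""
    super_pred = coarse_model(x).argmax(dim=1)               # stage 1: super category
    preds = []
    for i, s in enumerate(super_pred.tolist()):
        fine = refiners[s](x[i:i + 1]).argmax(dim=1).item()  # stage 2: refine within s
        preds.append((s, fine))  # (super category, fine class within it)
    return preds
```

For C-LOSS, the abstract says only that it "optimizes intra-class and inter-class distances"; a center-loss-style compactness term plus a margin between class centers is one common way to realize that, shown here purely as an assumed stand-in:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CLossSketch(nn.Module):
    """Assumed stand-in for C-LOSS: cross-entropy plus an intra-class
    compactness term (pull features toward a learned class center) and an
    inter-class separation term (push distinct centers at least `margin`
    apart). The published C-LOSS formulation may differ."""

    def __init__(self, num_classes, feat_dim,
                 lam_intra=0.1, lam_inter=0.1, margin=1.0):
        super().__init__()
        self.centers = nn.Parameter(torch.randn(num_classes, feat_dim))
        self.lam_intra = lam_intra
        self.lam_inter = lam_inter
        self.margin = margin

    def forward(self, features, logits, labels):
        ce = F.cross_entropy(logits, labels)
        # Intra-class distance: squared distance to the ground-truth center.
        intra = ((features - self.centers[labels]) ** 2).sum(dim=1).mean()
        # Inter-class distance: hinge on pairwise distances between centers,
        # masking the diagonal so a center is not compared with itself.
        d = torch.cdist(self.centers, self.centers)
        d = d + 1e6 * torch.eye(d.size(0), device=d.device)
        inter = F.relu(self.margin - d).mean()
        return ce + self.lam_intra * intra + self.lam_inter * inter
```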
Keyword | Confusing Class Accuracy; Enhanced Loss Function (C-LOSS); Fine-grained Classification; Knowledge Distillation; Multi-model Pyramid Training Framework (MPTF)
DOI | 10.1145/3690931.3691001 |
URL | View the original |
Language | English
Scopus ID | 2-s2.0-85212588654 |
Document Type | Conference paper |
Collection | Faculty of Science and Technology |
Corresponding Author | Guo, Pengze
Affiliation | 1. Faculty of Science and Technology, University of Macau, Macao
First Author Affiliation | Faculty of Science and Technology
Corresponding Author Affiliation | Faculty of Science and Technology
Recommended Citation GB/T 7714 | Guo, Pengze. Research on the Great Multi-model Pyramid Training Framework and Enhanced Loss Function for Fine-grained Classification[C]//Proceedings of the 2024 4th International Conference on Artificial Intelligence, Automation and High Performance Computing. Association for Computing Machinery, 2024: 414-422.
APA | Guo, Pengze. (2024). Research on the Great Multi-model Pyramid Training Framework and Enhanced Loss Function for Fine-grained Classification. Proceedings of the 2024 4th International Conference on Artificial Intelligence, Automation and High Performance Computing, 414-422.
Files in This Item: | There are no files associated with this item. |
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.