Residential College: false
Status: Published
Research on the Great Multi-model Pyramid Training Framework and Enhanced Loss Function for Fine-grained Classification
Guo, Pengze (1); Guo, Pengze (2)
2024-10-04
Conference Name: AIAHPC 2024: International Conference on Artificial Intelligence, Automation and High Performance Computing
Source Publication: Proceedings of the 2024 4th International Conference on Artificial Intelligence, Automation and High Performance Computing
Pages: 414-422
Conference Date: July 19-21, 2024
Conference Place: Zhuhai, China
Country: China
Publisher: Association for Computing Machinery
Abstract

Knowledge distillation aims to improve the performance of a smaller student model by transferring knowledge from a larger teacher model. Traditional methods train the student to mimic the teacher's output activations. In this paper, we propose a multi-model training framework, MPTF (Multi-model Pyramid Training Framework), and a novel loss function, C-LOSS, to enhance the student model's ability to distinguish between similar classes. At a modest cost in additional storage and computation time, our framework merges categories with similar features into super categories during the initial training phase and then refines the classification within these super categories. The C-LOSS function optimizes intra-class and inter-class distances to improve discriminative ability. Experiments on the CIFAR-10, CIFAR-100, and EuroSAT datasets show that our method significantly improves both overall accuracy and confusing-class accuracy. Further research is still needed to refine the segmentation of super categories and to improve the scalability of the framework. This study demonstrates the potential of staged training strategies and optimized loss functions for improving the performance of small models on fine-grained classification tasks.
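The abstract describes two components: merging confusable classes into super categories for an initial coarse training stage, and a loss that tightens intra-class distances while widening inter-class distances. Below is a minimal sketch of both, assuming a pairwise contrastive-margin form for C-LOSS; the paper's exact formulation is not reproduced in this record, so all names (`to_super_label`, `c_loss`) and the `margin` parameter are illustrative.

```python
import numpy as np

# Hedged sketch only: the names and the exact loss form are assumptions,
# not the paper's definition.

def to_super_label(fine_label, groups):
    """Stage 1 of the pyramid: map a fine-grained class to the super
    category it was merged into (groups is a list of member-label lists)."""
    for super_id, members in enumerate(groups):
        if fine_label in members:
            return super_id
    raise ValueError(f"label {fine_label} is not in any super category")

def c_loss(embeddings, labels, margin=1.0):
    """Contrastive-style stand-in for C-LOSS: shrink intra-class pair
    distances and penalize inter-class pairs closer than `margin`."""
    embeddings = np.asarray(embeddings, dtype=float)
    intra, inter = [], []
    for i in range(len(labels)):
        for j in range(i + 1, len(labels)):
            d = np.linalg.norm(embeddings[i] - embeddings[j])
            if labels[i] == labels[j]:
                intra.append(d ** 2)                     # pull same-class pairs together
            else:
                inter.append(max(0.0, margin - d) ** 2)  # push close cross-class pairs apart
    pairs = len(intra) + len(inter)
    return (sum(intra) + sum(inter)) / pairs if pairs else 0.0
```

With this form, a batch whose same-class embeddings coincide and whose cross-class embeddings sit beyond the margin yields zero loss, while cross-class pairs inside the margin contribute a hinged penalty.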

Keywords: Confusing Class Accuracy; Enhanced Loss Function (C-LOSS); Fine-grained Classification; Knowledge Distillation; Multi-model Pyramid Training Framework (MPTF)
DOI: 10.1145/3690931.3691001
Language: English
Scopus ID: 2-s2.0-85212588654
Document Type: Conference paper
Collection: Faculty of Science and Technology
Corresponding Author: Guo, Pengze
Affiliation: 1. Faculty of Science and Technology, University of Macau, Macao; 2. Faculty of Science and Technology, University of Macau, Macao
First Author Affiliation: Faculty of Science and Technology
Corresponding Author Affiliation: Faculty of Science and Technology
Recommended Citation
GB/T 7714
Guo, Pengze, Guo, Pengze. Research on the Great Multi-model Pyramid Training Framework and Enhanced Loss Function for Fine-grained Classification[C]//Proceedings of the 2024 4th International Conference on Artificial Intelligence, Automation and High Performance Computing. Association for Computing Machinery, 2024: 414-422.
APA: Guo, Pengze, & Guo, Pengze. (2024). Research on the Great Multi-model Pyramid Training Framework and Enhanced Loss Function for Fine-grained Classification. Proceedings of the 2024 4th International Conference on Artificial Intelligence, Automation and High Performance Computing, 414-422.
Files in This Item:
There are no files associated with this item.
 

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.