Status: Published
A novel sequential structure for lightweight multi-scale feature learning under limited available images
Liu, Peng¹; Du, Jie²; Vong, Chi Man¹
2023-04-26
Source Publication: Neural Networks
ISSN: 0893-6080
Volume: 164
Pages: 124-134
Abstract

Although multi-scale feature learning can improve the performance of deep models, its parallel structure quadratically increases the model parameters, causing deep models to grow ever larger as the receptive fields (RFs) are enlarged. As a result, deep models easily suffer from over-fitting in many practical applications where the available training samples are insufficient or limited. Moreover, in this limited-data setting, although lightweight models (with fewer parameters) can effectively reduce over-fitting, they may suffer from under-fitting because the training data is insufficient for effective feature learning. In this work, a lightweight model called Sequential Multi-scale Feature Learning Network (SMF-Net) is proposed to alleviate both issues simultaneously through a novel sequential structure for multi-scale feature learning. Compared to both deep and lightweight models, the proposed sequential structure in SMF-Net can extract features with larger RFs for multi-scale feature learning using only a few, linearly increasing model parameters. Experimental results on both classification and segmentation tasks demonstrate that SMF-Net has only 1.25M parameters (5.3% of Res2Net50) with 0.7G FLOPs (14.6% of Res2Net50) for classification, and 1.54M parameters (8.9% of UNet) with 3.35G FLOPs (10.9% of UNet) for segmentation, yet achieves higher accuracy than SOTA deep and lightweight models, even when the available training data is very limited.
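The quadratic-versus-linear parameter growth claimed in the abstract can be illustrated with a simple counting exercise. The sketch below is not the authors' exact SMF-Net design; it is a minimal, hypothetical comparison assuming a parallel structure uses one k×k branch per target receptive field, while a sequential structure stacks 3×3 convolutions (after n such convolutions the RF is 2n + 1) and taps intermediate outputs:

```python
# Illustrative sketch (assumed channel width C = 64; not the paper's exact model):
# compare weight counts of parallel vs. sequential multi-scale feature extraction.

def conv_params(c_in, c_out, k):
    """Number of weights in a k x k convolution (bias terms ignored)."""
    return c_in * c_out * k * k

C = 64                    # hypothetical channel width
target_rfs = [3, 5, 7, 9]  # receptive fields we want features at

# Parallel structure: one dedicated branch per RF, each with a k x k kernel,
# so parameters grow with k^2 (quadratically in the RF).
parallel = sum(conv_params(C, C, k) for k in target_rfs)

# Sequential structure: one chain of 3 x 3 convolutions; after n convolutions
# the RF is 2n + 1, so tapping intermediate outputs covers every target RF
# while parameters grow linearly with the largest RF.
n_convs = (max(target_rfs) - 1) // 2
sequential = n_convs * conv_params(C, C, 3)

print(parallel, sequential)  # sequential needs far fewer weights for the same RFs
```

With these assumptions the parallel design needs 9 + 25 + 49 + 81 = 164 kernel weights per channel pair, while the sequential chain needs only 4 × 9 = 36, which is the linear-versus-quadratic gap the abstract refers to.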

Keywords: Image Classification and Segmentation; Lightweight Model; Multi-scale Feature; Sequential Structure
DOI: 10.1016/j.neunet.2023.04.023
Indexed By: SCIE
Language: English
WOS Research Area: Computer Science; Neurosciences & Neurology
WOS Subject: Computer Science, Artificial Intelligence; Neurosciences
WOS ID: WOS:001005775700001
Publisher: PERGAMON-ELSEVIER SCIENCE LTD, THE BOULEVARD, LANGFORD LANE, KIDLINGTON, OXFORD OX5 1GB, ENGLAND
Scopus ID: 2-s2.0-85156152118
Document Type: Journal article
Collection: Department of Computer and Information Science
Corresponding Author: Du, Jie; Vong, Chi Man
Affiliation: 1. Department of Computer and Information Science, University of Macau, 999078, China
2. National-Regional Key Technology Engineering Laboratory for Medical Ultrasound, Guangdong Key Laboratory for Biomedical Measurements and Ultrasound Imaging, School of Biomedical Engineering, Shenzhen University Medical School, Shenzhen University, Shenzhen, 518060, China
First Author Affiliation: University of Macau
Corresponding Author Affiliation: University of Macau
Recommended Citation
GB/T 7714
Liu, Peng, Du, Jie, Vong, Chi Man. A novel sequential structure for lightweight multi-scale feature learning under limited available images[J]. Neural Networks, 2023, 164: 124-134.
APA: Liu, P., Du, J., & Vong, C. M. (2023). A novel sequential structure for lightweight multi-scale feature learning under limited available images. Neural Networks, 164, 124-134.
MLA: Liu, Peng, et al. "A novel sequential structure for lightweight multi-scale feature learning under limited available images." Neural Networks 164 (2023): 124-134.
Files in This Item:
There are no files associated with this item.
 

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.