UM  > Faculty of Science and Technology
Residential College: false
Status: Published
DeIL: Direct-and-Inverse CLIP for Open-World Few-Shot Learning
Shao, Shuai (1,4); Bai, Yu (2); Wang, Yan (3); Liu, Baodi (2); Zhou, Yicong (4)
2024-09
Conference Name: 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
Source Publication: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
Pages: 28505-28514
Conference Date: 16-22 June 2024
Conference Place: Seattle, WA, USA
Country: USA
Publisher: IEEE Computer Society
Abstract

Open-World Few-Shot Learning (OFSL) is a critical field of research, concentrating on the precise identification of target samples in environments with scarce data and unreliable labels, thus possessing substantial practical significance. Recently, the evolution of foundation models like CLIP has revealed their strong capacity for representation, even in settings with restricted resources and data. This development has led to a significant shift in focus, transitioning from the traditional method of "building models from scratch" to a strategy centered on "efficiently utilizing the capabilities of foundation models to extract relevant prior knowledge tailored for OFSL and apply it judiciously". Amidst this backdrop, we unveil the Direct-and-Inverse CLIP (DeIL), an innovative method leveraging our proposed "Direct-and-Inverse" concept to activate CLIP-based methods for addressing OFSL. This concept transforms conventional single-step classification into a nuanced two-stage process: initially filtering out less probable categories, followed by accurately determining the specific category of samples. DeIL comprises two key components: a pretrainer (frozen) for data denoising, and an adapter (tunable) for achieving precise final classification. In experiments, DeIL achieves SOTA performance on 11 datasets. https://github.com/The-Shuai/DeIL.
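The abstract's two-stage "filter improbable categories, then decide among the survivors" idea can be illustrated with a minimal, hypothetical sketch in plain NumPy. The function name `two_stage_classify`, the `keep_k` parameter, and the toy logits are all illustrative assumptions, not the paper's actual implementation; in DeIL the scores would come from CLIP image-text similarities and the second stage would involve the tunable adapter.

```python
import numpy as np

def two_stage_classify(logits, keep_k=3):
    """Sketch of a two-stage 'filter then decide' classification.

    Stage 1 (inverse): discard the least probable categories,
    keeping only the top-`keep_k` candidates.
    Stage 2 (direct): pick the single best category among the survivors.
    """
    # Softmax over category scores (stand-in for CLIP similarities).
    shifted = np.exp(logits - logits.max())
    probs = shifted / shifted.sum()
    # Stage 1: indices of the keep_k most probable categories.
    candidates = np.argsort(probs)[-keep_k:]
    # Stage 2: final decision restricted to the surviving candidates.
    return int(candidates[np.argmax(probs[candidates])])

# Toy similarity scores for 6 categories.
logits = np.array([0.1, 2.0, 0.3, 1.5, -0.5, 0.8])
print(two_stage_classify(logits))  # prints 1 (highest-scoring survivor)
```

In this toy form the two stages collapse to an ordinary argmax; the point of the decomposition in DeIL is that each stage can use a different source of evidence (e.g. prompts describing what a sample is *not* for the filtering stage).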

Keywords: Computer Vision; Filtering; Noise Reduction; Transforms; Pattern Recognition; Few-Shot Learning; CLIP; Open-World Few-Shot Learning
DOI: 10.1109/CVPR52733.2024.02693
Language: English
Scopus ID: 2-s2.0-85204816865
Document Type: Conference paper
Collection: Faculty of Science and Technology > Department of Computer and Information Science
Corresponding Author: Liu, Baodi; Zhou, Yicong
Affiliations:
1. Zhejiang Lab, China
2. China University of Petroleum (East China), China
3. Beihang University, China
4. University of Macau, Macao
First Author Affiliation: University of Macau
Corresponding Author Affiliation: University of Macau
Recommended Citation
GB/T 7714
Shao, Shuai, Bai, Yu, Wang, Yan, et al. DeIL: Direct-and-Inverse CLIP for Open-World Few-Shot Learning[C]. IEEE Computer Society, 2024: 28505-28514.
APA Shao, Shuai, Bai, Yu, Wang, Yan, Liu, Baodi, & Zhou, Yicong (2024). DeIL: Direct-and-Inverse CLIP for Open-World Few-Shot Learning. Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 28505-28514.
Files in This Item:
There are no files associated with this item.
 

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.