Residential College: false
Status: Published
Double Correction Framework for Denoising Recommendation
He, Zhuangzhuang1; Wang, Yifan2; Yang, Yonghui1; Sun, Peijie2; Wu, Le3; Bai, Haoyue1; Gong, Jinqi4; Hong, Richang3; Zhang, Min2
2024-08
Conference Name: KDD '24: The 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining
Source Publication: KDD '24: Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining
Pages: 1062-1072
Conference Date: August 25-29, 2024
Conference Place: Barcelona
Country: Spain
Publication Place: New York, NY, USA
Publisher: Association for Computing Machinery
Abstract

Owing to its availability and generality in online services, implicit feedback is widely used in recommender systems. However, implicit feedback in real-world recommendation scenarios often contains noisy samples (such as misclicks or non-preferential behaviors), which hinder precise user preference learning. A popular solution to this problem drops noisy samples during model training, following the observation that noisy samples have higher training losses than clean samples. Despite its effectiveness, we argue that this solution still has limits. (1) High training losses can result from model optimization instability or hard samples, not just noisy samples. (2) Completely dropping noisy samples aggravates data sparsity and fails to fully exploit the data.

To tackle the above limitations, we propose a Double Correction Framework for Denoising Recommendation (DCF), which contains two correction components: one for more precise sample dropping and one for avoiding sparser data. In the sample dropping correction component, we use each sample's loss values over time, rather than a single epoch, to determine whether it is noisy, increasing dropping stability. Instead of averaging these losses directly, we apply a damping function to reduce the bias introduced by outliers. Furthermore, because hard samples exhibit higher loss variance, we derive a lower bound for the loss via a concentration inequality to identify and reuse them. In the progressive label correction component, we iteratively re-label highly deterministic noisy samples and retrain on them to further improve performance. Finally, extensive experimental results on three datasets and four backbones demonstrate the effectiveness and generalization of our proposed framework. Our code is available at https://github.com/bruno686/DCF.
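The sample dropping correction described in the abstract can be illustrated with a minimal sketch. This is a hypothetical reconstruction, not the paper's implementation: the Huber-style damping weight, the Hoeffding-style lower bound, and the threshold values are all assumptions made for illustration; the actual damping function, concentration inequality, and hyperparameters used by DCF are given in the paper and its repository.

```python
# Hypothetical sketch of loss-history-based sample classification.
# The damping weight, concentration bound, and thresholds are assumed,
# not taken from the DCF paper.
import math

def damped_mean(losses, c=1.0):
    """Average per-epoch losses, down-weighting outliers far from the
    plain mean (a Huber-style damping, assumed for illustration)."""
    m = sum(losses) / len(losses)
    weights = [1.0 if abs(l - m) <= c else c / abs(l - m) for l in losses]
    return sum(w * l for w, l in zip(weights, losses)) / sum(weights)

def loss_lower_bound(losses, loss_range=1.0, delta=0.05):
    """Hoeffding-style lower confidence bound on the expected loss:
    mean - range * sqrt(ln(1/delta) / (2n))."""
    n = len(losses)
    mean = sum(losses) / n
    return mean - loss_range * math.sqrt(math.log(1.0 / delta) / (2.0 * n))

def classify(losses, drop_threshold=0.8):
    """Drop a sample only if even the lower bound of its loss exceeds
    the threshold; samples whose damped mean is high but whose bound is
    not (few observations or high variance) are kept as hard samples."""
    if loss_lower_bound(losses) > drop_threshold:
        return "noisy"   # consistently, confidently high loss
    if damped_mean(losses) > drop_threshold:
        return "hard"    # high but statistically uncertain loss: reuse
    return "clean"
```

The point of the bound is the abstract's first limitation: a high loss in a few epochs is weak evidence of noise, so a sample is only dropped when the loss history is long and stable enough that the lower confidence bound itself clears the threshold.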

Keywords: Denoising; Implicit Feedback; Recommendation
DOI: 10.1145/3637528.3671692
Language: English
Scopus ID: 2-s2.0-85203713958
Document Type: Conference paper
Collection: Faculty of Science and Technology
Corresponding Authors: Wu, Le; Zhang, Min
Affiliations:
1. Hefei University of Technology, Hefei, China
2. Tsinghua University, Beijing, China
3. Hefei University of Technology, Institute of Dataspace, Hefei Comprehensive National Science Center, Hefei, China
4. University of Macau, Macao
Recommended Citation
GB/T 7714
He, Zhuangzhuang, Wang, Yifan, Yang, Yonghui, et al. Double Correction Framework for Denoising Recommendation[C]. New York, NY, USA: Association for Computing Machinery, 2024: 1062-1072.
APA He, Zhuangzhuang., Wang, Yifan., Yang, Yonghui., Sun, Peijie., Wu, Le., Bai, Haoyue., Gong, Jinqi., Hong, Richang., & Zhang, Min (2024). Double Correction Framework for Denoising Recommendation. KDD '24: Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, 1062-1072.
Files in This Item:
There are no files associated with this item.
Google Scholar
Similar articles in Google Scholar
[He, Zhuangzhuang]'s Articles
[Wang, Yifan]'s Articles
[Yang, Yonghui]'s Articles
Baidu academic
Similar articles in Baidu academic
[He, Zhuangzhuang]'s Articles
[Wang, Yifan]'s Articles
[Yang, Yonghui]'s Articles
Bing Scholar
Similar articles in Bing Scholar
[He, Zhuangzhuang]'s Articles
[Wang, Yifan]'s Articles
[Yang, Yonghui]'s Articles

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.