Residential College: false
Status: Published
Focused quantization for sparse CNNs
Authors: Zhao, Yiren (1); Gao, Xitong (2); Bates, Daniel (1); Mullins, Robert (1); Xu, Cheng Zhong (3)
2019
Conference Name: 33rd Conference on Neural Information Processing Systems (NeurIPS)
Source Publication: Advances in Neural Information Processing Systems
Volume: 32
Conference Date: December 8-14, 2019
Conference Place: Vancouver, Canada
Abstract

Deep convolutional neural networks (CNNs) are powerful tools for a wide range of vision tasks, but the enormous memory and compute resources they require make them challenging to deploy on constrained devices. Existing compression techniques, while excelling at reducing model sizes, struggle to be computationally friendly. In this paper, we attend to the statistical properties of sparse CNNs and present focused quantization, a novel quantization strategy based on power-of-two values that exploits the weight distributions after fine-grained pruning. The proposed method dynamically discovers the most effective numerical representation for weights in layers with varying sparsities, significantly reducing model sizes. Multiplications in quantized CNNs are replaced with much cheaper bit-shift operations for efficient inference. Coupled with lossless encoding, we built a compression pipeline that provides CNNs with high compression ratios (CR), low computation cost and minimal loss in accuracy. On ResNet-50, we achieved an 18.08× CR with only a 0.24% loss in top-5 accuracy, outperforming existing compression methods. We fully compressed a ResNet-18 and found that it is not only higher in CR and top-5 accuracy but also more hardware-efficient, as it requires fewer logic gates to implement than other state-of-the-art quantization methods at the same throughput.
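The key mechanism described above, quantizing weights to signed powers of two so that multiplications reduce to bit shifts, can be illustrated with a short sketch. This is a minimal Python illustration, not the paper's full focused-quantization pipeline (which additionally fits the post-pruning weight distribution of each layer to choose the most effective exponent range); the function name, bit width and exponent clipping below are assumptions made for the example.

import numpy as np

def quantize_power_of_two(weights, n_bits=4):
    """Quantize nonzero weights to signed powers of two (illustrative only).

    The paper's focused quantization selects the exponent range per layer
    from the post-pruning weight distribution; here we simply round each
    weight to the nearest power of two within a fixed exponent range.
    """
    signs = np.sign(weights)
    magnitudes = np.abs(weights)
    # Round the log2 magnitude to the nearest integer exponent
    # (zeros are temporarily mapped to 1.0 to keep log2 finite).
    exponents = np.round(np.log2(np.where(magnitudes > 0, magnitudes, 1.0)))
    # Clip exponents to a representable range; assumes |w| <= 1.
    max_exp = 0
    min_exp = max_exp - (2 ** (n_bits - 1)) + 1
    exponents = np.clip(exponents, min_exp, max_exp)
    quantized = signs * (2.0 ** exponents)
    # Preserve pruned (exactly zero) weights.
    quantized[magnitudes == 0] = 0.0
    return quantized, exponents.astype(int), signs.astype(int)

# With power-of-two weights, w * x becomes a bit shift of x's fixed-point
# representation: (x << e) for e >= 0, or (x >> -e) for e < 0.
w = np.array([0.31, -0.07, 0.0, 0.52])
q, exps, signs = quantize_power_of_two(w)
print(q)  # [ 0.25  -0.0625  0.      0.5  ]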

Indexed By: CPCI-S
Language: English
WOS Research Area: Computer Science
WOS Subject: Computer Science, Artificial Intelligence
WOS ID: WOS:000534424305056
Scopus ID: 2-s2.0-85084832480
Document Type: Conference paper
Collection: Faculty of Science and Technology
Corresponding Author: Zhao, Yiren; Gao, Xitong
Affiliation:
1. University of Cambridge, United Kingdom
2. Shenzhen Institutes of Advanced Technology, China
3. University of Macau, Macao
Recommended Citation
GB/T 7714: Zhao, Yiren, Gao, Xitong, Bates, Daniel, et al. Focused quantization for sparse CNNs[C], 2019.
APA: Zhao, Y., Gao, X., Bates, D., Mullins, R., & Xu, C. Z. (2019). Focused quantization for sparse CNNs. Advances in Neural Information Processing Systems, 32.
 

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.