Status: Published
Leveraging GANs via Non-local Features
Peng, Xuyang1; Liu, Weifeng2; Liu, Baodi2; Zhang, Kai3; Lu, Xiaoping4; Zhou, Yicong5
2021-09
Conference Name: 30th International Conference on Artificial Neural Networks, ICANN 2021
Source Publication: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 12892 LNCS
Pages: 551-562
Conference Date: September 14-17, 2021
Conference Place: Bratislava, Slovakia
Country: Slovakia
Author of Source: Farkaš I., Masulli P., Otte S., Wermter S.
Publication Place: Berlin, Germany
Publisher: Springer Science and Business Media Deutschland GmbH
Abstract

In recent years, Generative Adversarial Networks (GANs) have achieved tremendous success in image synthesis; they usually employ convolutional operations to extract image features. However, most existing convolutional GANs only extract features from a local neighborhood at a time, which often leaves out non-local information and can result in generating the wrong semantic object in the wrong position. In this paper, we propose a Graph Convolutional Architecture (GCA) for GANs to tackle this problem. GCA constructs a pixel-level graph structure between image regions through an attention mechanism and leverages Graph Convolutional Networks (GCNs) to extract non-local features. GCA extracts the connections between different regions of the image through GCNs, which is a more effective way of using relationship information than directly adding long-range dependencies to the model. We integrate GCA into Deep Convolutional Generative Adversarial Networks (DCGAN), Self-Attention Generative Adversarial Networks (SAGAN), and Concurrent-Single-Image-GAN (ConSinGAN). Extensive experiments are conducted to verify the performance of GCA. The results demonstrate that GCA can significantly boost the quality of the generated images by incorporating more non-local features.
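The abstract describes GCA as building a pixel-level graph over image regions from attention scores and then applying a graph convolution to aggregate non-local features. The sketch below is a minimal, hypothetical PyTorch reconstruction of that idea for illustration only; it is not the authors' published implementation, and the module name, layer sizes, and residual scaling are assumptions.

```python
# Illustrative sketch only: build a soft, pixel-level adjacency matrix from
# attention scores, then apply a graph convolution (A X W) to mix non-local
# features. Names and hyperparameters are hypothetical, not from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GraphConvAttentionBlock(nn.Module):
    def __init__(self, in_channels, reduction=8):
        super().__init__()
        inter = max(in_channels // reduction, 1)
        self.query = nn.Conv2d(in_channels, inter, kernel_size=1)
        self.key = nn.Conv2d(in_channels, inter, kernel_size=1)
        # Shared linear transform applied after neighborhood aggregation.
        self.graph_weight = nn.Conv1d(in_channels, in_channels, kernel_size=1)
        self.gamma = nn.Parameter(torch.zeros(1))  # residual scaling

    def forward(self, x):
        b, c, h, w = x.shape
        n = h * w
        # Pairwise attention scores between spatial positions act as a
        # row-normalized adjacency matrix A of shape (b, n, n).
        q = self.query(x).view(b, -1, n).permute(0, 2, 1)   # (b, n, c')
        k = self.key(x).view(b, -1, n)                        # (b, c', n)
        adjacency = F.softmax(torch.bmm(q, k), dim=-1)        # (b, n, n)

        # Graph convolution: aggregate features over the learned graph,
        # then apply the shared transform.
        feats = x.view(b, c, n)                                        # (b, c, n)
        aggregated = torch.bmm(feats, adjacency.transpose(1, 2))       # (b, c, n)
        out = self.graph_weight(aggregated).view(b, c, h, w)

        # Residual connection so the block can be dropped into an
        # existing convolutional generator or discriminator.
        return x + self.gamma * out
```

Under these assumptions, such a block could be inserted after an intermediate feature map of a DCGAN-, SAGAN-, or ConSinGAN-style generator; initializing gamma to zero keeps the original network's behavior at the start of training.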

Keywords: Generative Adversarial Networks; Non-local Features; Attention Mechanism
DOI: 10.1007/978-3-030-86340-1_44
Indexed By: CPCI-S
Language: English
WOS Research Area: Computer Science
WOS Subject: Computer Science, Artificial Intelligence; Computer Science, Theory & Methods
WOS ID: WOS:000711922300044
Scopus ID: 2-s2.0-85115703099
Document Type: Conference paper
Collection: Faculty of Science and Technology
Corresponding Author: Liu, Weifeng
Affiliation:
1. College of Oceanography and Space Informatics, China University of Petroleum (East China), Qingdao, China
2. College of Control Science and Engineering, China University of Petroleum (East China), Qingdao, China
3. School of Petroleum Engineering, China University of Petroleum (East China), Qingdao, China
4. Haier Industrial Intelligence Institute Co., Ltd., Qingdao, China
5. University of Macau, Macao
Recommended Citation
GB/T 7714
Peng, Xuyang, Liu, Weifeng, Liu, Baodi, et al. Leveraging GANs via Non-local Features[C]. Farkaš I., Masulli P., Otte S., Wermter S., Berlin, Germany: Springer Science and Business Media Deutschland GmbH, 2021, 551-562.
APA: Peng, Xuyang, Liu, Weifeng, Liu, Baodi, Zhang, Kai, Lu, Xiaoping, & Zhou, Yicong (2021). Leveraging GANs via Non-local Features. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 12892 LNCS, 551-562.
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.