Residential College: false
Status: Published
Extract, attend, predict: Aspect-based sentiment analysis with deep self-attention network
Yiwei Lv1; Minghao Hu2; Chao Yang3; YuanYan Tang1; Hongjun Wang4
2019-08
Conference Name: 21st IEEE International Conference on High Performance Computing and Communications, 17th IEEE International Conference on Smart City and 5th IEEE International Conference on Data Science and Systems, HPCC/SmartCity/DSS 2019
Source Publication: Proceedings - 21st IEEE International Conference on High Performance Computing and Communications, 17th IEEE International Conference on Smart City and 5th IEEE International Conference on Data Science and Systems, HPCC/SmartCity/DSS 2019
Pages: 297-304
Conference Date: 10-12 August 2019
Conference Place: Zhangjiajie, China
Country: China
Publisher: IEEE
Abstract

Aspect-based sentiment analysis aims to predict sentiment polarities for given aspect terms in a sentence. Previous work typically encodes the aspect and the sentence separately, with either RNNs or CNNs along with sophisticated attention mechanisms. However, CNNs and RNNs suffer from restricted local receptive fields and difficulty in modeling long-term dependencies, respectively. Besides, encoding aspects and sentences separately also leads to problems: the aspect has no context information, and neighboring aspects are not considered. To address these problems, we propose a novel approach that conducts an extract-attend-predict process with deep self-attention for aspect-based sentiment analysis. Unlike previous methods that use either RNNs or CNNs as the basic encoder, we utilize a pre-trained deep self-attention encoder to avoid the difficulty in capturing long-distance words. Moreover, instead of encoding them separately, our model directly extracts the aspect representation from contextualized sentence representations based on the span boundary of the target aspect. A multi-granularity attending mechanism is further applied to capture the interaction between aspects and sentences, which is later used to predict the sentiment polarity. We conduct experiments on two benchmark datasets, and the results show that our approach outperforms previous state-of-the-art models.
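The sketch below illustrates the extract-attend-predict flow described in the abstract; it is an assumption-laden reconstruction, not the authors' released code. It assumes a PyTorch setup in which sent_repr stands in for the output of a pre-trained deep self-attention encoder (e.g. BERT), and it replaces the paper's multi-granularity attending with a single pooled-span extraction plus one word-level attention for brevity.

```python
import torch
import torch.nn as nn

class ExtractAttendPredict(nn.Module):
    """Illustrative sketch of the extract-attend-predict idea.
    Encoder choice, dimensions, and the exact attending scheme are assumptions."""

    def __init__(self, hidden=768, num_classes=3):
        super().__init__()
        # Word-level attention: score each sentence token against the pooled aspect.
        self.word_attn = nn.Linear(hidden * 2, 1)
        self.classifier = nn.Linear(hidden * 2, num_classes)

    def forward(self, sent_repr, aspect_start, aspect_end):
        # sent_repr: (batch, seq_len, hidden) contextualized sentence representation
        # from a pre-trained self-attention encoder (assumed, e.g. BERT).
        batch, seq_len, _ = sent_repr.shape

        # Extract: slice the aspect representation out of the contextualized
        # sentence representation using the aspect's span boundaries.
        aspect_vecs = []
        for b in range(batch):
            span = sent_repr[b, aspect_start[b]:aspect_end[b] + 1]  # (span_len, hidden)
            aspect_vecs.append(span.mean(dim=0))                    # pooled (coarse) granularity
        aspect = torch.stack(aspect_vecs)                           # (batch, hidden)

        # Attend: let every sentence token interact with the aspect, then
        # aggregate the sentence with the resulting attention weights.
        aspect_exp = aspect.unsqueeze(1).expand(-1, seq_len, -1)    # (batch, seq_len, hidden)
        scores = self.word_attn(torch.cat([sent_repr, aspect_exp], dim=-1)).squeeze(-1)
        weights = torch.softmax(scores, dim=-1)                     # (batch, seq_len)
        context = torch.bmm(weights.unsqueeze(1), sent_repr).squeeze(1)  # (batch, hidden)

        # Predict: classify sentiment polarity from aspect + attended context.
        return self.classifier(torch.cat([aspect, context], dim=-1))

# Toy usage with random tensors standing in for encoder outputs.
model = ExtractAttendPredict()
sent = torch.randn(2, 10, 768)
logits = model(sent, aspect_start=torch.tensor([2, 5]), aspect_end=torch.tensor([3, 6]))
print(logits.shape)  # torch.Size([2, 3])
```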

Keyword: Aspect-based Sentiment Analysis; Deep Self-attention; Extract-attend-predict; Multi-granularity Attending
DOI: 10.1109/HPCC/SmartCity/DSS.2019.00054
Language: English
Scopus ID: 2-s2.0-85073509731
Document Type: Conference paper
Collection: Faculty of Science and Technology, Department of Computer and Information Science
Affiliation:
1. University of Macau, Macau, China
2. National University of Defense Technology, Changsha, China
3. Hunan University, Changsha, China
4. Beijing TRS Information Technology Co., Ltd., Beijing, China
First Author Affiliation: University of Macau
Recommended Citation
GB/T 7714
Yiwei Lv, Minghao Hu, Chao Yang, et al. Extract, attend, predict: Aspect-based sentiment analysis with deep self-attention network[C]. IEEE, 2019: 297-304.
APA Yiwei Lv, Minghao Hu, Chao Yang, YuanYan Tang, & Hongjun Wang (2019). Extract, attend, predict: Aspect-based sentiment analysis with deep self-attention network. Proceedings - 21st IEEE International Conference on High Performance Computing and Communications, 17th IEEE International Conference on Smart City and 5th IEEE International Conference on Data Science and Systems, HPCC/SmartCity/DSS 2019, 297-304.
Files in This Item:
There are no files associated with this item.

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.