Faculties & Institutes
Faculty of Scie... [11]
Faculty of Arts ... [2]
Authors
WONG FAI [11]
LIU SIYOU [3]
SUN YUQI [1]
YUAN YULIN [1]
Document Type
Conference pape... [11]
Journal article [3]
Date Issued
2024 [1]
2023 [2]
2021 [5]
2019 [2]
2014 [2]
2013 [1]
Language
English [14]
Source Publication
Findings of the ... [2]
NAACL HLT 2019 -... [2]
2024 Joint Inter... [1]
CoNLL 2013 - 17t... [1]
CoNLL 2014 - 18t... [1]
Findings of the ... [1]
Indexed By
SCIE [2]
CPCI-S [1]
CPCI-SSH [1]
ESCI [1]
Browse/Search Results: 1-10 of 14
A Paradigm Shift: The Future of Machine Translation Lies with Large Language Models
Conference paper
Lyu, Chenyang, Du, Zefeng, Xu, Jitao, Duan, Yitao, Wu, Minghao, Lynn, Teresa, Aji, Alham Fikri, Wong, Derek F., Liu, Siyou, Wang, Longyue. A Paradigm Shift: The Future of Machine Translation Lies with Large Language Models[C]:European Language Resources Association (ELRA), 2024, 1339-1352.
Authors: Lyu, Chenyang; Du, Zefeng; Xu, Jitao; Duan, Yitao; Wu, Minghao; et al.
TC[Scopus]: 1 | Submit date: 2024/07/04
Keywords: Large Language Models; Machine Translation; New Trends
Findings of the WMT 2023 Shared Task on Discourse-Level Literary Translation: A Fresh Orb in the Cosmos of LLMs
Conference paper
Wang, Longyue, Tu, Zhaopeng, Gu, Yan, Liu, Siyou, Yu, Dian, Ma, Qingsong, Lyu, Chenyang, Zhou, Liting, Liu, Chao Hong, Ma, Yufeng, Chen, Weiyu, Graham, Yvette, Webber, Bonnie, Koehn, Philipp, Way, Andy, Yuan, Yulin, Shi, Shuming. Findings of the WMT 2023 Shared Task on Discourse-Level Literary Translation: A Fresh Orb in the Cosmos of LLMs[C]. Philipp Koehn, Barry Haddow, Tom Kocmi, Christof Monz:Association for Computational Linguistics, 2023, 55-67.
Authors: Wang, Longyue; Tu, Zhaopeng; Gu, Yan; Liu, Siyou; Yu, Dian; et al.
TC[Scopus]: 5 | Submit date: 2024/02/22
A Benchmark Dataset and Evaluation Methodology for Chinese Zero Pronoun Translation
Journal article
Xu, Mingzhou, Wang, Longyue, Liu, Siyou, Wong, Derek F., Shi, Shuming, Tu, Zhaopeng. A Benchmark Dataset and Evaluation Methodology for Chinese Zero Pronoun Translation[J]. Language Resources and Evaluation, 2023, 57, 1263–1293.
Authors: Xu, Mingzhou; Wang, Longyue; Liu, Siyou; Wong, Derek F.; Shi, Shuming; et al.
TC[WOS]: 1 | TC[Scopus]: 2 | IF: 1.7 / 2.0 | Submit date: 2023/08/03
Keywords: Benchmark Dataset; Discourse; Evaluation Metric; Machine Translation; Zero Pronoun
Recent Advances in Dialogue Machine Translation
Journal article
Liu, Siyou, Sun, Yuqi, Wang, Longyue. Recent Advances in Dialogue Machine Translation[J]. Information (Switzerland), 2021, 12(11), 484.
Authors: Liu, Siyou; Sun, Yuqi; Wang, Longyue
TC[WOS]: 2 | TC[Scopus]: 6 | IF: 2.4 / 2.6 | Submit date: 2022/08/28
Keywords: Dialogue; Neural Machine Translation; Discourse Issue; Benchmark Data; Existing Approaches; Real-life Applications; Building Advanced System
Context-aware Self-Attention Networks for Natural Language Processing
Journal article
Yang, Baosong, Wang, Longyue, Wong, Derek F., Shi, Shuming, Tu, Zhaopeng. Context-aware Self-Attention Networks for Natural Language Processing[J]. Neurocomputing, 2021, 458, 157-169.
Authors: Yang, Baosong; Wang, Longyue; Wong, Derek F.; Shi, Shuming; Tu, Zhaopeng
TC[WOS]: 34 | TC[Scopus]: 40 | IF: 5.5 / 5.5 | Submit date: 2021/12/08
Keywords: Context Modeling; Inductive Bias; Natural Language Processing; Self-attention Networks
On the Complementarity between Pre-Training and Back-Translation for Neural Machine Translation
Conference paper
Liu, Xuebo, Wang, Longyue, Wong, Derek F., Ding, Liang, Chao, Lidia S., Shi, Shuming, Tu, Zhaopeng. On the Complementarity between Pre-Training and Back-Translation for Neural Machine Translation[C]. Moens M.-F., Huang X., Specia L., Yih S.W.-T.:Association for Computational Linguistics (ACL), 2021, 2900-2907.
Authors: Liu, Xuebo; Wang, Longyue; Wong, Derek F.; Ding, Liang; Chao, Lidia S.; et al.
TC[Scopus]: 22 | Submit date: 2022/05/13
On the Copying Behaviors of Pre-Training for Neural Machine Translation
Conference paper
Liu, Xuebo, Wang, Longyue, Wong, Derek F., Ding, Liang, Chao, Lidia S., Shi, Shuming, Tu, Zhaopeng. On the Copying Behaviors of Pre-Training for Neural Machine Translation[C]. Zong C., Xia F., Li W., Navigli R.:Association for Computational Linguistics (ACL), 2021, 4265-4275.
Authors: Liu, Xuebo; Wang, Longyue; Wong, Derek F.; Ding, Liang; Chao, Lidia S.; et al.
TC[Scopus]: 23 | Submit date: 2022/05/13
Progressive Multi-Granularity Training for Non-Autoregressive Translation
Conference paper
Ding, Liang, Wang, Longyue, Liu, Xuebo, Wong, Derek F., Tao, Dacheng, Tu, Zhaopeng. Progressive Multi-Granularity Training for Non-Autoregressive Translation[C], 2021, 2797-2803.
Authors: Ding, Liang; Wang, Longyue; Liu, Xuebo; Wong, Derek F.; Tao, Dacheng; et al.
TC[Scopus]: 29 | Submit date: 2022/05/13
Convolutional self-attention networks
Conference paper
Yang, Baosong, Wang, Longyue, Wong, Derek F., Chao, Lidia S., Tu, Zhaopeng. Convolutional self-attention networks[C], 2019, 4040-4045.
Authors: Yang, Baosong; Wang, Longyue; Wong, Derek F.; Chao, Lidia S.; Tu, Zhaopeng
TC[WOS]: 51 | TC[Scopus]: 88 | Submit date: 2021/03/11
Modeling recurrence for transformer
Conference paper
Hao, Jie, Wang, Xing, Yang, Baosong, Wang, Longyue, Zhang, Jinfeng, Tu, Zhaopeng. Modeling recurrence for transformer[C], 2019, 1198-1207.
Authors: Hao, Jie; Wang, Xing; Yang, Baosong; Wang, Longyue; Zhang, Jinfeng; et al.
TC[WOS]: 29 | TC[Scopus]: 48 | Submit date: 2022/05/23