Context Gates for Neural Machine Translation
| Main Authors: | Zhaopeng Tu, Yang Liu, Zhengdong Lu, Xiaohua Liu, Hang Li |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | The MIT Press, 2021-03-01 |
| Series: | Transactions of the Association for Computational Linguistics |
| Online Access: | http://dx.doi.org/10.1162/tacl_a_00048 |
Similar Items
- Modeling Past and Future for Neural Machine Translation
  by: Zaixiang Zheng, et al.
  Published: (2021-03-01)
- Based on Gated Dynamic Encoding Optimization, the LGE-Transformer Method for Low-Resource Neural Machine Translation
  by: Zhizhan Xu, et al.
  Published: (2024-01-01)
- Learning to Remember Translation History with a Continuous Cache
  by: Zhaopeng Tu, et al.
  Published: (2021-03-01)
- Multilingual Denoising Pre-training for Neural Machine Translation
  by: Yinhan Liu, et al.
  Published: (2021-03-01)
- Sublemma-Based Neural Machine Translation
  by: Thien Nguyen, et al.
  Published: (2021-01-01)