Papers related to Machine Translation (continuously updated; Stars/Forks/PRs welcome)

alphadl/inspiring_papers

Papers about machine translation and other inspiring papers

[1] attention is all you need

[2] RvNN preordering En-Jp MT

[3] Curriculum Learning for Natural Answer Generation

[4] Extracting Relational Facts by an End-to-End Neural Model with Copy Mechanism

[5] Generating Natural Answers by Incorporating Copying and Retrieving Mechanisms in Sequence-to-Sequence Learning

[6] Close to Human Quality TTS with Transformer

[7] Cross-lingual Knowledge Projection Using Machine Translation and Target-side Knowledge Base Completion

[7-appendix] Poster for paper [7]

[8] Unsupervised Cross-lingual Transfer of Word Embedding Spaces

[9] Commonsense Knowledge Base Completion

[10] ConceptNet 5.5: An Open Multilingual Graph of General Knowledge

[11] Adversarial learning meets graphs

[12] Answering Cloze-style Software Questions Using Stack Overflow

[13] Neural Machine Translation and Sequence-to-sequence Models: A Tutorial

[14] Google’s Multilingual Neural Machine Translation System: Enabling Zero-Shot Translation

[15] Zero-Shot Dual Machine Translation

[16] Commonsense Knowledge Base Completion

[17] Unsupervised Cross-lingual Transfer of Word Embedding Spaces

[18] An Empirical Study on Development Set Selection Strategy for Machine Translation Learning

[19] Bagging-based System Combination for Domain Adaptation

[20] Cross-Sentence N-ary Relation Extraction with Graph LSTMs

[21] Distant supervision for relation extraction without labeled data

[22] NMT-Keras

[23] Fine-Tuning for Neural Machine Translation with Limited Degradation across In- and Out-of-Domain Data

[24] Improving Neural Machine Translation with Conditional Sequence Generative Adversarial Nets

[25 undocumented paper] Zhirui Zhang, Shujie Liu, Mu Li, Ming Zhou and Enhong Chen, Bidirectional Generative Adversarial Networks for Neural Machine Translation, The SIGNLL Conference on Computational Natural Language Learning (CoNLL 2018).

[26] Unsupervised Neural Machine Translation with Weight Sharing

[27] Phrase-Based Attentions

[28] Multilingual Neural Machine Translation with Knowledge Distillation

[29] A Smorgasbord of Features to Combine Phrase-Based and Neural Machine Translation

[30] Unveiling the Linguistic Weaknesses of Neural Machine Translation

[31] Pre-Translation for Neural Machine Translation

[32] Improving Lexical Choice in Neural Machine Translation

[33] Improving Neural Machine Translation through Phrase-based Forced Decoding

[34] Guiding Neural Machine Translation with Retrieved Translation Pieces

[35] Sentence Weighting for Neural Machine Translation Domain Adaptation

[36] Instance Weighting for Neural Machine Translation Domain Adaptation

[37] Sentence Embedding for Neural Machine Translation Domain Adaptation

[38] Cost Weighting for Neural Machine Translation Domain Adaptation

[39] Stanford Neural Machine Translation Systems for Spoken Language Domains

[40] Sequence to Sequence Learning with Neural Networks

[41] Ensemble Distillation for Neural Machine Translation

[42] Automatic Evaluation of Machine Translation Quality Using Longest Common Subsequence and Skip-Bigram Statistics

[43] The Best Lexical Metric for Phrase-Based Statistical MT System Optimization

[44] Efficient Extraction of Oracle-best Translations from Hypergraphs

[45] Distilling the Knowledge in a Neural Network

[46 slide] GAN and its application to NLP

[47 slide] Knowledge Distillation via GAN

[48] Adversarial Generation of Natural Language

[49] Refining Source Representations with Relation Networks for Neural Machine Translation

[50] Neural Machine Translation of Rare Words with Subword Units

[51] Fully Character-Level Neural Machine Translation without Explicit Segmentation

[52] Improving Zero-Shot Translation of Low-Resource Languages

[53] Findings of the Second Shared Task on Multimodal Machine Translation and Multilingual Image Description

[54] Multilingual Image Description with Neural Sequence Models

[55] WMT 2016 Multimodal Translation: A Shared Task on Multimodal Machine Translation and Crosslingual Image Description

[56] Show, Attend and Tell: Neural Image Caption Generation with Visual Attention

[57] Show and Tell: A Neural Image Caption Generator

[58] CMU system report for the WMT 2017 multimodal task (first place)

[59] PhD thesis of Raj, Kyoto University: low-resource multilingual translation

[60] Multi-Task Learning for Multiple Language Translation

[61] THUMT: An Open Source Toolkit for Neural Machine Translation

[62] Zero-Resource Translation with Multi-Lingual Neural Machine Translation

[63] Multi-Source Neural Translation

[64 Msc_thesis] Domain Adaptation for Multilingual Neural Machine Translation

[65] Transfer Learning for Low-Resource Neural Machine Translation

[66] Multi-Source Neural Machine Translation with Missing Data

[67] A Tree-based Decoder for Neural Machine Translation

[68 dynet NMT] XNMT: The eXtensible Neural Machine Translation Toolkit

[69] (reinforcement learning for NMT) Sequence Level Training with Recurrent Neural Networks

[70] A Study of Reinforcement Learning for Neural Machine Translation

[71] Training Tips for the Transformer Model

[72] Bidirectional Generative Adversarial Networks for Neural Machine Translation

[73] Dual Learning for Machine Translation

[74] A Teacher-Student Framework for Zero-Resource Neural Machine Translation

[75] Dual Transfer Learning for Neural Machine Translation with Marginal Distribution Regularization

[76] When and Why are Pre-trained Word Embeddings Useful for Neural Machine Translation?

[77] Meta-Learning for Low-Resource Neural Machine Translation

[78] Phrase-Based & Neural Unsupervised Machine Translation

[79] Model-Level Dual Learning

[80] You May Not Need Attention

[81] An Analysis of Encoder Representations in Transformer-Based Machine Translation

[82] An Introductory Survey on Attention Mechanisms in NLP Problems

[83] On Zero-shot Cross-lingual Transfer of Multilingual Neural Machine Translation

[84] Bilingual-GAN: Neural Text Generation and Neural Machine Translation as Two Sides of the Same Coin

[85] GraphSeq2Seq: Graph-Sequence-to-Sequence for Neural Machine Translation

[86] Simplifying Neural Machine Translation with Addition-Subtraction Twin-Gated Recurrent Networks

[87] Recurrent Additive Networks

[88] Factored Neural Language Models

[89] Neural Machine Translation By Generating Multiple Linguistic Factors

[90] Deep Architectures for Neural Machine Translation

[91] A Context-Aware Recurrent Encoder for Neural Machine Translation

[92] Regularization techniques for fine-tuning in neural machine translation

[93] Effective Domain Mixing for Neural Machine Translation

[94] Multi-Domain Neural Machine Translation with Word-Level Domain Context Discrimination

[95] Towards Linear Time Neural Machine Translation with Capsule Networks

[96] Agreement on Target Bidirectional LSTMs for Sequence-to-Sequence Learning

[97] Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks

[98] Does String-Based Neural MT Learn Source Syntax?

[99] Graph Convolutional Networks for Text Classification

[100] Incorporating Structural Alignment Biases into an Attentional Neural Translation Model

[101] The Importance of Being Recurrent for Modeling Hierarchical Structure

[102] An Empirical Exploration of Skip Connections for Sequential Tagging

[103] Incorporating Copying Mechanism in Sequence-to-Sequence Learning

[104] Attention Focusing for Neural Machine Translation by Bridging Source and Target Embeddings

[105] Highway Networks

[106] Extreme Adaptation for Personalized Neural Machine Translation

[107] Improved Neural Machine Translation with a Syntax-Aware Encoder and Decoder

[108] Rico Sennrich, "NMT: what's linguistics got to do with it?"

[110] A robust self-learning method for fully unsupervised cross-lingual mappings of word embeddings

[111] Unsupervised Neural Machine Translation

[112] Unsupervised Statistical Machine Translation

[113] An Effective Approach to Unsupervised Machine Translation

[114] (Lukasz Kaiser slides) Tensor2Tensor Transformers: New Deep Models for NLP

[115] Temporal dynamics of semantic relations in word embeddings: an application to predicting armed conflict participants

[116] A Tutorial on Deep Latent Variable Models of Natural Language

[117] Latent Alignment and Variational Attention

[118] Pervasive Attention: 2D Convolutional Neural Networks for Sequence-to-Sequence Prediction

[119] Semi-Autoregressive Neural Machine Translation

[120] Insertion Transformer
