Sequence to sequence encoder-decoder model with Attention for Neural Machine Translation

The Google Colaboratory notebook RathiKashi_PrashanthiR_Eng-Hin-tensorflow-nmt.ipynb in this repository implements an LSTM encoder-decoder model with attention for machine translation from English to Hindi, built with TensorFlow.
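At each decoding step, such a model scores every encoder output against the current decoder state, softmaxes the scores into weights, and takes a weighted sum as the context vector. The following is a minimal NumPy sketch of that additive (Bahdanau-style) attention step, not the notebook's actual TensorFlow code; all shapes, weight names, and dimensions here are illustrative:

```python
import numpy as np

def bahdanau_attention(query, values, W1, W2, v):
    """Additive attention: one decoder step attending over encoder outputs.

    query:  decoder hidden state, shape (units,)
    values: encoder outputs, shape (seq_len, units)
    Returns the context vector (units,) and attention weights (seq_len,).
    """
    # score_i = v . tanh(W1 @ values_i + W2 @ query)
    scores = np.tanh(values @ W1.T + query @ W2.T) @ v   # (seq_len,)
    # softmax over source positions
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # context vector: attention-weighted sum of encoder outputs
    context = weights @ values                           # (units,)
    return context, weights

# Illustrative usage with random parameters
rng = np.random.default_rng(0)
units, seq_len = 8, 5
W1 = rng.normal(size=(units, units))
W2 = rng.normal(size=(units, units))
v = rng.normal(size=units)
context, weights = bahdanau_attention(rng.normal(size=units),
                                      rng.normal(size=(seq_len, units)),
                                      W1, W2, v)
```

The decoder then concatenates this context vector with its own input embedding before the next LSTM step, which lets it focus on different source words at each output position.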

Additional related topics, including a literature review of neural machine translation, implementation details, results, the BLEU score, and drawbacks of the model, are discussed in Course_Project_Advanced_Machine_Learning_Prashanthi_Rathi.pdf.
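For readers unfamiliar with the metric: BLEU combines modified n-gram precisions (usually up to 4-grams) with a brevity penalty. A simplified sentence-level sketch is below; the scores in the report were presumably computed with a standard toolkit, and this single-reference version is only meant to show the metric's shape:

```python
import math
from collections import Counter

def sentence_bleu(reference, candidate, max_n=4):
    """Simplified single-reference sentence BLEU with uniform n-gram weights."""
    precisions = []
    for n in range(1, max_n + 1):
        ref_ngrams = Counter(tuple(reference[i:i + n])
                             for i in range(len(reference) - n + 1))
        cand_ngrams = Counter(tuple(candidate[i:i + n])
                              for i in range(len(candidate) - n + 1))
        # Clipped n-gram matches: each candidate n-gram counts at most
        # as often as it appears in the reference.
        overlap = sum(min(c, ref_ngrams[g]) for g, c in cand_ngrams.items())
        total = max(sum(cand_ngrams.values()), 1)
        precisions.append(overlap / total)
    if min(precisions) == 0:
        return 0.0
    # Geometric mean of the n-gram precisions
    log_avg = sum(math.log(p) for p in precisions) / max_n
    # Brevity penalty discourages overly short translations
    bp = 1.0 if len(candidate) > len(reference) else \
        math.exp(1 - len(reference) / max(len(candidate), 1))
    return bp * math.exp(log_avg)

ref = "the cat sat on the mat".split()
print(sentence_bleu(ref, ref))  # identical sentences score 1.0
```

Production evaluations typically average clipped counts over a whole corpus and may use multiple references per sentence, so corpus BLEU is not simply the mean of sentence scores like this one.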

Contributors to this project and report:

Prashanthi R and Rathi Kashi

Key references:

  1. https://www.tensorflow.org/tutorials/text/nmt_with_attention
  2. https://stackabuse.com/python-for-nlp-neural-machine-translation-with-seq2seq-in-keras/