
# seq2seq-model-using-attention

A sequence-to-sequence (seq2seq) model with an attention layer, applicable to tasks such as machine translation, summarization, and question answering. In this repository, I use the model to perform summarization on the Amazon food reviews dataset.
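For orientation, here is a minimal sketch of the attention computation a seq2seq decoder performs at each step. This is not the code from this repository; the additive (Bahdanau-style) scoring and all names and dimensions (`W_dec`, `W_enc`, `v`) are illustrative assumptions:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def additive_attention(decoder_state, encoder_states, W_dec, W_enc, v):
    """Additive (Bahdanau-style) attention over encoder states.

    decoder_state:  (hidden,)         current decoder hidden state
    encoder_states: (src_len, hidden) all encoder hidden states
    W_dec, W_enc:   (attn, hidden)    learned projections (random here)
    v:              (attn,)           learned scoring vector
    """
    # Score each encoder state against the current decoder state.
    scores = np.tanh(encoder_states @ W_enc.T + decoder_state @ W_dec.T) @ v
    # Normalize scores into an attention distribution over source positions.
    weights = softmax(scores)
    # The context vector is the attention-weighted sum of encoder states.
    context = weights @ encoder_states
    return context, weights

# Toy example with random parameters.
rng = np.random.default_rng(0)
hidden, attn, src_len = 50, 32, 7
context, weights = additive_attention(
    rng.normal(size=hidden),
    rng.normal(size=(src_len, hidden)),
    rng.normal(size=(attn, hidden)),
    rng.normal(size=(attn, hidden)),
    rng.normal(size=attn),
)
print(weights.round(3), context.shape)
```

At decoding time the context vector is combined with the decoder state to predict the next output token, which lets the model focus on different parts of the source review while writing the summary.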

Extract the glove.6B.50d.zip file (pre-trained GloVe embeddings, from https://nlp.stanford.edu/projects/glove/) into the data folder.
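Once extracted, the embeddings can be read into memory along these lines (a minimal sketch, assuming the extracted file sits at data/glove.6B.50d.txt; prepare.py presumably does something similar when building the embedding matrix):

```python
import numpy as np

def load_glove(path="data/glove.6B.50d.txt", dim=50):
    """Read GloVe vectors into a {word: vector} dict.

    Each line of the file is: word followed by `dim` floats,
    all separated by single spaces.
    """
    embeddings = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            vec = np.asarray(parts[1:], dtype=np.float32)
            if vec.shape[0] == dim:  # skip any malformed lines
                embeddings[parts[0]] = vec
    return embeddings

embeddings = load_glove()
print(len(embeddings), embeddings["the"][:5])
```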

Then run the following commands:

```
python3 prepare.py
python3 train.py
python3 generate.py
```