
seq2seq-model-using-attention

A sequence-to-sequence (seq2seq) model with an attention layer for tasks such as machine translation, summarization, and question answering. In this repository, I use the seq2seq model to perform summarization on the Amazon food reviews dataset.
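For orientation, here is a minimal PyTorch sketch of what such an encoder-decoder with additive (Bahdanau-style) attention can look like. The class names and hyperparameters below are illustrative assumptions, not code from this repository; the 50-dimensional embedding size only mirrors the glove.6B.50d vectors used later.

```python
# Hypothetical sketch of a seq2seq model with additive (Bahdanau) attention.
# Names and hyperparameters are illustrative, not taken from this repository.
import torch
import torch.nn as nn
import torch.nn.functional as F


class Encoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=50, hid_dim=128):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.gru = nn.GRU(emb_dim, hid_dim, batch_first=True)

    def forward(self, src):                        # src: (batch, src_len)
        embedded = self.embedding(src)              # (batch, src_len, emb_dim)
        outputs, hidden = self.gru(embedded)        # outputs: (batch, src_len, hid_dim)
        return outputs, hidden


class Attention(nn.Module):
    """Additive attention: scores each encoder state against the decoder state."""
    def __init__(self, hid_dim=128):
        super().__init__()
        self.W = nn.Linear(hid_dim * 2, hid_dim)
        self.v = nn.Linear(hid_dim, 1, bias=False)

    def forward(self, dec_hidden, enc_outputs):     # dec_hidden: (batch, hid_dim)
        src_len = enc_outputs.size(1)
        dec_rep = dec_hidden.unsqueeze(1).repeat(1, src_len, 1)
        energy = torch.tanh(self.W(torch.cat((dec_rep, enc_outputs), dim=2)))
        scores = self.v(energy).squeeze(2)          # (batch, src_len)
        return F.softmax(scores, dim=1)             # attention weights over source


class Decoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=50, hid_dim=128):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.attention = Attention(hid_dim)
        self.gru = nn.GRU(emb_dim + hid_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, token, hidden, enc_outputs):  # token: (batch, 1)
        embedded = self.embedding(token)             # (batch, 1, emb_dim)
        attn = self.attention(hidden[-1], enc_outputs)          # (batch, src_len)
        context = torch.bmm(attn.unsqueeze(1), enc_outputs)     # (batch, 1, hid_dim)
        output, hidden = self.gru(torch.cat((embedded, context), dim=2), hidden)
        return self.out(output.squeeze(1)), hidden   # logits: (batch, vocab_size)
```

At each decoding step the attention weights form a context vector from the encoder outputs, which is fed into the decoder GRU together with the previous target token.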

Extract the glove.6B.50d.zip file into the data folder.
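prepare.py presumably turns these pre-trained GloVe vectors into an embedding matrix for the model's vocabulary. A minimal sketch of that step, assuming the standard GloVe text format and a generic word_index mapping (both assumptions, not repository code):

```python
# Illustrative loading of GloVe vectors into an embedding matrix.
# The file path and the word_index mapping are assumptions, not repository code.
import numpy as np

def load_glove_embeddings(word_index, path="data/glove.6B.50d.txt", dim=50):
    # Parse "word v1 v2 ... v50" lines into a {word: vector} lookup.
    vectors = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            vectors[parts[0]] = np.asarray(parts[1:], dtype="float32")

    # Rows follow the tokenizer's word index; unknown words stay zero-initialised.
    matrix = np.zeros((len(word_index) + 1, dim), dtype="float32")
    for word, idx in word_index.items():
        vec = vectors.get(word)
        if vec is not None:
            matrix[idx] = vec
    return matrix
```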

Then run the following commands:

!python3 prepare.py

!python3 train.py

!python3 generate.py
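generate.py presumably produces summaries by decoding one token at a time. A generic greedy-decoding loop over the sketch above could look roughly like this; the start/end token IDs and length limit are assumptions, not values from the repository:

```python
# Hypothetical greedy decoding loop using the Encoder/Decoder sketch above.
# SOS_ID, EOS_ID and max_len are assumptions, not values from generate.py.
import torch

@torch.no_grad()
def greedy_summarize(encoder, decoder, src_ids, sos_id=1, eos_id=2, max_len=30):
    # src_ids: (1, src_len) tensor of token IDs; assumes a single review (batch size 1).
    enc_outputs, hidden = encoder(src_ids)            # encode the full review
    token = torch.full((src_ids.size(0), 1), sos_id, dtype=torch.long)
    summary = []
    for _ in range(max_len):
        logits, hidden = decoder(token, hidden, enc_outputs)
        token = logits.argmax(dim=1, keepdim=True)    # pick the most likely word
        if token.item() == eos_id:                    # stop at end-of-sequence
            break
        summary.append(token.item())
    return summary                                    # list of predicted token IDs
```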
