Seq2Seq with Attention Mechanism

This is a Seq2Seq (encoder-decoder) model implemented in PyTorch, supporting both Bahdanau (additive) attention and Luong (multiplicative) attention.
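As a rough orientation, Bahdanau attention scores each encoder state with a small feed-forward network over the decoder state and the encoder state. The sketch below is a minimal, self-contained illustration of that scoring; the class name, dimensions, and layer layout are assumptions for the example, not the repo's actual code (which lives in modules/module1.py).

```python
import torch
import torch.nn as nn

class BahdanauAttention(nn.Module):
    """Additive attention: score(s, h) = v^T tanh(W_s s + W_h h)."""
    def __init__(self, hid_dim):
        super().__init__()
        self.W_s = nn.Linear(hid_dim, hid_dim, bias=False)  # projects decoder state
        self.W_h = nn.Linear(hid_dim, hid_dim, bias=False)  # projects encoder states
        self.v   = nn.Linear(hid_dim, 1, bias=False)        # collapses to a scalar score

    def forward(self, dec_hidden, enc_outputs):
        # dec_hidden: (batch, hid_dim); enc_outputs: (batch, src_len, hid_dim)
        scores = self.v(torch.tanh(
            self.W_s(dec_hidden).unsqueeze(1) + self.W_h(enc_outputs)
        )).squeeze(-1)                                      # (batch, src_len)
        weights = torch.softmax(scores, dim=-1)             # attention distribution
        context = torch.bmm(weights.unsqueeze(1), enc_outputs).squeeze(1)
        return context, weights

# toy usage with random tensors
batch, src_len, hid = 2, 5, 8
ctx, w = BahdanauAttention(hid)(torch.randn(batch, hid),
                                torch.randn(batch, src_len, hid))
```

The context vector `ctx` has shape `(batch, hid)` and the weights `w` sum to 1 over the source positions.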

Datasets:

Models:

Data Process

PYTHONPATH=. python dataprocess/process.py

Unit Test

For the loader:

PYTHONPATH=. python loaders/loader1.py

For the module:

# Seq2Seq with Bahdanau Attention
PYTHONPATH=. python modules/module1.py --attention_type bahdanau

# Seq2Seq with Luong Attention, dot alignment
PYTHONPATH=. python modules/module1.py --attention_type luong --align_method dot

# Seq2Seq with Luong Attention, general alignment
PYTHONPATH=. python modules/module1.py --attention_type luong --align_method general

# Seq2Seq with Luong Attention, concat alignment
PYTHONPATH=. python modules/module1.py --attention_type luong --align_method concat
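The three `--align_method` choices correspond to the three score functions from Luong et al.: dot, general, and concat. A minimal sketch of how a single module might dispatch on them is below; the class and parameter names are hypothetical, and the repo's real implementation in modules/module1.py may be organized differently.

```python
import torch
import torch.nn as nn

class LuongAttention(nn.Module):
    """Luong attention with the three alignment methods:
    dot:     score = s^T h
    general: score = s^T W h
    concat:  score = v^T tanh(W [s; h])
    """
    def __init__(self, hid_dim, align_method='dot'):
        super().__init__()
        self.align_method = align_method
        if align_method == 'general':
            self.W = nn.Linear(hid_dim, hid_dim, bias=False)
        elif align_method == 'concat':
            self.W = nn.Linear(hid_dim * 2, hid_dim, bias=False)
            self.v = nn.Linear(hid_dim, 1, bias=False)

    def score(self, dec_hidden, enc_outputs):
        # dec_hidden: (batch, hid); enc_outputs: (batch, src_len, hid)
        if self.align_method == 'dot':
            return torch.bmm(enc_outputs, dec_hidden.unsqueeze(-1)).squeeze(-1)
        if self.align_method == 'general':
            return torch.bmm(self.W(enc_outputs), dec_hidden.unsqueeze(-1)).squeeze(-1)
        # concat: repeat the decoder state along the source length
        s = dec_hidden.unsqueeze(1).expand(-1, enc_outputs.size(1), -1)
        return self.v(torch.tanh(
            self.W(torch.cat([s, enc_outputs], dim=-1))
        )).squeeze(-1)

    def forward(self, dec_hidden, enc_outputs):
        weights = torch.softmax(self.score(dec_hidden, enc_outputs), dim=-1)
        context = torch.bmm(weights.unsqueeze(1), enc_outputs).squeeze(1)
        return context, weights

# toy usage: same call works for 'dot', 'general', and 'concat'
att = LuongAttention(8, align_method='concat')
ctx, w = att(torch.randn(2, 8), torch.randn(2, 5, 8))
```

Note that `dot` requires the decoder and encoder hidden sizes to match, while `general` and `concat` introduce learned projections.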

Main Process

python main.py

You can change the configuration either on the command line or in the file utils/parser.py.

Here are some examples:

python main.py --attention_type bahdanau

python main.py --attention_type luong --align_method dot

python main.py --attention_type luong --align_method general

python main.py --attention_type luong --align_method concat
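For reference, the two options used above could be defined with argparse roughly as follows. This is only a hypothetical sketch of the flags shown in the commands; the actual option set and defaults live in utils/parser.py and may differ.

```python
import argparse

def get_parser():
    # Hypothetical mirror of the flags used in the examples above.
    parser = argparse.ArgumentParser(description='Seq2Seq with Attention')
    parser.add_argument('--attention_type', type=str, default='bahdanau',
                        choices=['bahdanau', 'luong'])
    parser.add_argument('--align_method', type=str, default='dot',
                        choices=['dot', 'general', 'concat'],
                        help='only used when --attention_type is luong')
    return parser

# parse a sample command line
args = get_parser().parse_args(['--attention_type', 'luong',
                                '--align_method', 'general'])
```

Passing an unknown value (e.g. `--align_method cosine`) would make argparse exit with an error listing the valid choices.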
