We use a modified fork of huggingface transformers for our experiments.
If you are using conda, create the environment with:

`conda env create -f environment.yml`
Otherwise, create a Python environment and install the dependencies with:

`pip install -r requirements.txt`
- We used the dataset released in the MuP 2022 shared task.
- Make sure to create `train`, `dev`, and `test` CSV files with column names "text" and "summary".
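As an illustration, the expected CSV layout can be produced with the Python standard library. The file names follow the split names above; the example rows are placeholders, not real data:

```python
import csv

# Each split ("train", "dev", "test") is a CSV with exactly two columns:
# "text" (the source document) and "summary" (the reference summary).
splits = {
    "train": [("Full paper text goes here.", "A short summary.")],
    "dev": [("Another paper body.", "Its summary.")],
    "test": [("A held-out paper.", "Reference summary.")],
}

for split, rows in splits.items():
    with open(f"{split}.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["text", "summary"])  # required column names
        writer.writerows(rows)
```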
To fine-tune any huggingface model, use the `run.sh` script. When running the different models described in the paper, make sure to pass the appropriate arguments:

`sh run.sh`
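For reference, a script like `run.sh` typically wraps a seq2seq training entry point. The sketch below is hypothetical and uses the argument names of the standard `run_summarization.py` example from huggingface transformers; the actual script and arguments in this repo's modified fork may differ:

```shell
#!/bin/sh
# Hypothetical sketch -- the real run.sh may use a different entry point
# and extra arguments specific to the modified transformers fork.
python run_summarization.py \
    --model_name_or_path facebook/bart-large-cnn \
    --do_train \
    --do_eval \
    --train_file train.csv \
    --validation_file dev.csv \
    --text_column text \
    --summary_column summary \
    --output_dir ./output \
    --overwrite_output_dir \
    --per_device_train_batch_size 2 \
    --predict_with_generate
```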
You can download BART-large-cnn fine-tuned on the MuP 2022 dataset.