Repo for arXiv preprint "Gradient-based Adversarial Attacks against Text Transformers"

jakobd16/CS269-Final-Project

Gradient-based Adversarial Attacks against Text Transformers

Install Dependencies and Download Data:

  1. Install the HuggingFace dependencies:

         conda install -c huggingface transformers
         pip install datasets

  2. (Optional) For attacks against DBPedia14, download the dataset from Kaggle and set up a data directory containing:

         <data_dir>/dbpedia_csv/
             train.csv
             test.csv

Citation

Please cite [1] if you find the resources in this repository useful.

[1] C. Guo*, A. Sablayrolles*, H. Jégou, D. Kiela. Gradient-based Adversarial Attacks against Text Transformers. EMNLP 2021. (* equal contribution)

@article{guo2021gradientbased,
  title={Gradient-based Adversarial Attacks against Text Transformers},
  author={Guo, Chuan and Sablayrolles, Alexandre and Jégou, Hervé and Kiela, Douwe},
  journal={arXiv preprint arXiv:2104.13733},
  year={2021}
}

License

This project is CC-BY-NC 4.0 licensed, as found in the LICENSE file.
