Recurrent-Text-Editing

This repository accompanies the paper Recurrent Inference in Text Editing, published in Findings of the Association for Computational Linguistics: EMNLP 2020, pages 1758–1769, Online. Association for Computational Linguistics.

[arXiv] [Poster] [Slides] [Video]

Methods

  • End2end - directly maps the unedited text to the edited text
  • Tagging - maps the unedited text to a sequence of editing operations
  • Recurrence - the proposed method; iteratively applies short, fixed-length editing actions
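As described in the abstract below, Recurrence re-encodes the partially edited text at every step, decodes one short, fixed-length action, and applies it to complete a single edit. A minimal toy sketch of that inference loop (the action format and the toy model here are illustrative assumptions, not the paper's actual interface):

```python
def apply_action(tokens, action):
    """Apply one short, fixed-length edit action: insert `token` at `pos`."""
    pos, token = action
    return tokens[:pos] + [token] + tokens[pos:]

def recurrent_inference(model, tokens, max_iters=50):
    """Recurrence-style loop: re-encode the partially edited sequence,
    predict one action, apply it, and repeat until the model stops."""
    for _ in range(max_iters):
        action = model(tokens)
        if action is None:  # model signals that editing is complete
            break
        tokens = apply_action(tokens, action)
    return tokens

# Toy "model": restore '+' and '=' into the operator-free sequence 1 2 3.
def toy_model(tokens):
    if "+" not in tokens:
        return (1, "+")
    if "=" not in tokens:
        return (3, "=")
    return None

print(recurrent_inference(toy_model, ["1", "2", "3"]))
# → ['1', '+', '2', '=', '3']
```

Each iteration narrows the problem to a single short action, rather than decoding the whole edited sequence at once.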

Models

  • Naive GRU RNN
  • Naive LSTM RNN
  • Bi-directional GRU RNN
  • Bi-directional LSTM RNN
  • Bi-directional GRU RNN with Attention
  • Bi-directional LSTM RNN with Attention
  • Transformer

Data

  • Arithmetic Operators Restoration (AOR)
  • Arithmetic Equation Simplification (AES)
  • Arithmetic Equation Correction (AEC)
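For illustration, an AOR example pair can be built by stripping the operators from a valid equation, leaving the model to restore them. This is a toy sketch of one possible generation scheme (the token format, operator set, and function name are assumptions; the actual generators live under code/data/):

```python
import random

OPS = ["+", "-", "*"]

def make_aor_pair(n_operands=3, lo=1, hi=9, rng=random):
    """Build one hypothetical AOR example: the target is a valid
    arithmetic equation; the source is the same equation with its
    operators removed, which the model must restore."""
    operands = [str(rng.randint(lo, hi)) for _ in range(n_operands)]
    ops = [rng.choice(OPS) for _ in range(n_operands - 1)]
    expr = operands[0]
    for op, x in zip(ops, operands[1:]):
        expr += f" {op} {x}"
    result = eval(expr)  # safe here: expr contains only digits and OPS
    target = f"{expr} = {result}"
    source = " ".join(t for t in target.split() if t not in OPS)
    return source, target

src, tgt = make_aor_pair(rng=random.Random(0))
print(src, "->", tgt)
```

The AES and AEC datasets would follow the same pattern, with simplification and correction edits in place of operator removal.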

Directory

  • data - scripts for generating the raw datasets
  • main - code and resources for training
  • exp_log - validation and test performance recorded during training for the experiments in the original work
code/
├── README.md
├── data
├── main
├── exp_log
└── reference

Process

  1. Generate raw datasets under code/data/
  2. Copy raw datasets from code/data/ to code/main/res/data/
  3. Pre-process datasets under code/main/res/data/
  4. Start training under code/main/
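Steps 1, 3, and 4 run the scripts in the respective directories; step 2 is plain file copying, which can be sketched as below (the *.txt extension for the raw dataset files is an assumption; match whatever the generation step actually writes):

```python
import shutil
from pathlib import Path

src = Path("code/data")
dst = Path("code/main/res/data")
dst.mkdir(parents=True, exist_ok=True)

# Copy each generated raw dataset file into the training resource folder.
# The *.txt pattern is an assumption about the raw file format.
for f in src.glob("*.txt"):
    shutil.copy(f, dst / f.name)
```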

Authors

BibTex

@inproceedings{shi-etal-2020-recurrent,
    title = "Recurrent Inference in Text Editing",
    author = "Shi, Ning  and
      Zeng, Ziheng  and
      Zhang, Haotian  and
      Gong, Yichen",
    booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: Findings",
    month = nov,
    year = "2020",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://www.aclweb.org/anthology/2020.findings-emnlp.159",
    pages = "1758--1769",
    abstract = "In neural text editing, prevalent sequence-to-sequence based approaches directly map the unedited text either to the edited text or the editing operations, in which the performance is degraded by the limited source text encoding and long, varying decoding steps. To address this problem, we propose a new inference method, Recurrence, that iteratively performs editing actions, significantly narrowing the problem space. In each iteration, encoding the partially edited text, Recurrence decodes the latent representation, generates an action of short, fixed-length, and applies the action to complete a single edit. For a comprehensive comparison, we introduce three types of text editing tasks: Arithmetic Operators Restoration (AOR), Arithmetic Equation Simplification (AES), Arithmetic Equation Correction (AEC). Extensive experiments on these tasks with varying difficulties demonstrate that Recurrence achieves improvements over conventional inference methods.",
}
