
# Intent Detection and Slot Filling

Intent Detection and Slot Filling is the task of interpreting user commands or queries by detecting the overall intent and extracting the relevant slots.

Example (from ATIS):

Query: What flights are available from pittsburgh to baltimore on thursday morning

Intent: flight info

Slots:

- from_city: pittsburgh
- to_city: baltimore
- depart_date: thursday
- depart_time: morning

## ATIS

ATIS (Air Travel Information System) (Hemphill et al.) is a dataset distributed with Microsoft CNTK and available from its GitHub page. The slots are labeled in the BIO (Beginning, Inside, Outside) format, similar to NER. This dataset contains only air-travel-related commands. Most of the ATIS results are based on the work here.
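
To make the BIO labeling concrete, here is a minimal Python sketch of how the example query above could be represented as one slot tag per token plus a single utterance-level intent label (slot names follow the example above and are illustrative; the real ATIS label set uses its own slot names):

```python
# One BIO slot tag per token, plus one utterance-level intent label,
# for the ATIS-style example query above (illustrative label names).
tokens = ["What", "flights", "are", "available", "from", "pittsburgh",
          "to", "baltimore", "on", "thursday", "morning"]

slot_tags = ["O", "O", "O", "O", "O", "B-from_city",
             "O", "B-to_city", "O", "B-depart_date", "B-depart_time"]

intent = "flight info"

assert len(tokens) == len(slot_tags)  # one BIO tag per token

# Recover (slot, value) pairs from the BIO sequence.
slots, current = [], None
for token, tag in zip(tokens, slot_tags):
    if tag.startswith("B-"):          # start of a new slot chunk
        current = [tag[2:], token]
        slots.append(current)
    elif tag.startswith("I-") and current is not None:
        current[1] += " " + token     # continuation of the current chunk
    else:
        current = None                # outside any slot

print(intent, [(name, value) for name, value in slots])
# flight info [('from_city', 'pittsburgh'), ('to_city', 'baltimore'),
#              ('depart_date', 'thursday'), ('depart_time', 'morning')]
```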

| Model | Slot F1 Score | Intent Accuracy | Paper / Source | Code |
| ----- | ------------- | --------------- | -------------- | ---- |
| Bi-model with decoder | 96.89 | 98.99 | A Bi-model based RNN Semantic Frame Parsing Model for Intent Detection and Slot Filling | |
| CTRAN | 98.46 | 98.07 | CTRAN: CNN-Transformer-based Network for Natural Language Understanding | Official |
| SlotRefine + BERT | 96.16 | 97.74 | SlotRefine: A Fast Non-Autoregressive Model for Joint Intent Detection and Slot Filling | Official |
| SlotRefine | 96.22 | 97.11 | SlotRefine: A Fast Non-Autoregressive Model for Joint Intent Detection and Slot Filling | Official |
| Stack-Propagation + BERT | 96.10 | 97.50 | A Stack-Propagation Framework with Token-level Intent Detection for Spoken Language Understanding | Official |
| JointBERT-CAE | 96.10 | 97.50 | CAE: Mechanism to Diminish the Class Imbalanced in SLU Slot Filling Task | Official |
| Co-interactive Transformer | 95.90 | 97.70 | A Co-Interactive Transformer for Joint Slot Filling and Intent Detection | Official |
| Heterogeneous Attention | 95.58 | 97.76 | Joint agricultural intent detection and slot filling based on enhanced heterogeneous attention mechanism | |
| Stack-Propagation | 95.90 | 96.90 | A Stack-Propagation Framework with Token-level Intent Detection for Spoken Language Understanding | Official |
| Attention Encoder-Decoder NN | 95.87 | 98.43 | Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling | |
| SF-ID (BLSTM) network | 95.80 | 97.76 | A Novel Bi-directional Interrelated Model for Joint Intent Detection and Slot Filling | Official |
| Context Encoder | 95.80 | NA | Improving Slot Filling by Utilizing Contextual Information | |
| Capsule-NLU | 95.20 | 95.00 | Joint Slot Filling and Intent Detection via Capsule Neural Networks | Official |
| Joint GRU model (W) | 95.49 | 98.10 | A Joint Model of Intent Determination and Slot Filling for Spoken Language Understanding | |
| Slot-Gated BLSTM with Attention | 95.20 | 94.10 | Slot-Gated Modeling for Joint Slot Filling and Intent Prediction | Official |
| Joint model with recurrent slot label context | 94.64 | 98.40 | Joint Online Spoken Language Understanding and Language Modeling with Recurrent Neural Networks | Official |
| Recursive NN | 93.96 | 95.40 | Joint Semantic Utterance Classification and Slot Filling with Recursive Neural Networks | |
| Encoder-labeler Deep LSTM | 95.66 | NA | Leveraging Sentence-level Information with Encoder LSTM for Natural Language Understanding | |
| RNN with Label Sampling | 94.89 | NA | Recurrent Neural Network Structured Output Prediction for Spoken Language Understanding | |
| Hybrid RNN | 95.06 | NA | Using Recurrent Neural Networks for Slot Filling in Spoken Language Understanding | |
| RNN-EM | 95.25 | NA | Recurrent Neural Networks with External Memory for Language Understanding | |
| CNN-CRF | 94.35 | NA | Convolutional Neural Network Based Triangular CRF for Joint Intent Detection and Slot Filling | |
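
The Slot F1 Score in these tables is typically computed at the span level (CoNLL-style): a predicted slot counts as correct only if both its boundaries and its label match the gold annotation, while Intent Accuracy is plain utterance-level classification accuracy. A minimal sketch of both metrics, assuming the `seqeval` package and toy gold/predicted labels:

```python
# Minimal sketch of the two leaderboard metrics, assuming the seqeval
# package (pip install seqeval) for CoNLL-style span-level slot F1.
from seqeval.metrics import f1_score

# Gold and predicted BIO tag sequences, one list per utterance (toy data).
true_slots = [["O", "B-from_city", "O", "B-to_city", "B-depart_date"]]
pred_slots = [["O", "B-from_city", "O", "O", "B-depart_date"]]

# Gold and predicted intent labels, one per utterance (toy data).
true_intents = ["flight info"]
pred_intents = ["flight info"]

# Span-level F1 over slot chunks (boundaries and label must both match).
slot_f1 = f1_score(true_slots, pred_slots)

# Utterance-level intent classification accuracy.
intent_acc = sum(t == p for t, p in zip(true_intents, pred_intents)) / len(true_intents)

print(f"Slot F1: {slot_f1:.2%}, Intent accuracy: {intent_acc:.2%}")
```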

## SNIPS

SNIPS is a dataset by Snips.ai for intent detection and slot filling benchmarking, available from its GitHub page. This dataset contains several day-to-day user command categories (e.g. play a song, book a restaurant).

| Model | Slot F1 Score | Intent Accuracy | Paper / Source | Code |
| ----- | ------------- | --------------- | -------------- | ---- |
| CTRAN | 98.30 | 99.42 | CTRAN: CNN-Transformer-based Network for Natural Language Understanding | Official |
| SlotRefine + BERT | 97.05 | 99.04 | SlotRefine: A Fast Non-Autoregressive Model for Joint Intent Detection and Slot Filling | Official |
| Stack-Propagation + BERT | 97.00 | 99.00 | A Stack-Propagation Framework with Token-level Intent Detection for Spoken Language Understanding | Official |
| JointBERT-CAE | 97.00 | 98.30 | CAE: Mechanism to Diminish the Class Imbalanced in SLU Slot Filling Task | Official |
| Heterogeneous Attention | 96.32 | 98.29 | Joint agricultural intent detection and slot filling based on enhanced heterogeneous attention mechanism | |
| Co-interactive Transformer | 95.90 | 98.80 | A Co-Interactive Transformer for Joint Slot Filling and Intent Detection | Official |
| Stack-Propagation | 94.20 | 98.00 | A Stack-Propagation Framework with Token-level Intent Detection for Spoken Language Understanding | Official |
| SlotRefine | 93.72 | 97.44 | SlotRefine: A Fast Non-Autoregressive Model for Joint Intent Detection and Slot Filling | Official |
| Context Encoder | 93.60 | NA | Improving Slot Filling by Utilizing Contextual Information | |
| SF-ID (BLSTM) network | 92.23 | 97.43 | A Novel Bi-directional Interrelated Model for Joint Intent Detection and Slot Filling | Official |
| Capsule-NLU | 91.80 | 97.70 | Joint Slot Filling and Intent Detection via Capsule Neural Networks | Official |
| Slot-Gated BLSTM with Attention | 88.80 | 97.00 | Slot-Gated Modeling for Joint Slot Filling and Intent Prediction | Official |