Bi-Directional Attention Flow (BiDAF) question answering model enhanced by multi-layer convolutional neural network character embeddings.
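A minimal PyTorch sketch of the CNN character-embedding idea mentioned above (embed characters, convolve, max-pool per word); all dimensions and the class name are illustrative, not taken from any specific repository here:

```python
import torch
import torch.nn as nn

class CharCNNEmbedding(nn.Module):
    """Builds a word vector from its characters: char embedding -> Conv1d -> max-pool."""
    def __init__(self, num_chars=100, char_dim=16, out_dim=64, kernel_size=5):
        super().__init__()
        self.char_emb = nn.Embedding(num_chars, char_dim, padding_idx=0)
        self.conv = nn.Conv1d(char_dim, out_dim, kernel_size, padding=kernel_size // 2)

    def forward(self, char_ids):
        # char_ids: (batch, seq_len, word_len) integer character indices
        b, s, w = char_ids.shape
        x = self.char_emb(char_ids.view(b * s, w))   # (b*s, word_len, char_dim)
        x = self.conv(x.transpose(1, 2))             # (b*s, out_dim, word_len)
        x = torch.relu(x).max(dim=2).values          # max-pool over character positions
        return x.view(b, s, -1)                      # (batch, seq_len, out_dim)

emb = CharCNNEmbedding()
out = emb(torch.randint(1, 100, (2, 7, 12)))  # 2 sentences, 7 words, 12 chars each
print(out.shape)  # torch.Size([2, 7, 64])
```

In BiDAF-style models this per-word character vector is typically concatenated with a pretrained word embedding before the contextual encoder.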
We implemented QANet from scratch and improved baseline BiDAF. We also used an ensemble of BiDAF and QANet models to achieve EM/F1 of 69.47/71.96, ranking #3 on the leaderboard as of Mar 4, 2022.
BiDAF reading comprehension model with Answer Pointer head.
Question answering on the SQuAD dataset, for NLP class at UNIBO
Bi-Directional Attention Flow for Machine Comprehension.
Implementation of a BiDAF-based question answering network without using any pretrained language representations.
Implementation of the Bi-Directional Attention Flow Model (BiDAF) in Python using Keras
ML Projects and Experience in Industry and Academia.
Implementation of the machine comprehension model in our ACL 2019 paper: Augmenting Neural Networks with First-order Logic.
Answering a query about a given context paragraph using a model based on recurrent neural networks and attention.
The Multiple Sentences Bi-Directional Attention Flow (Multi-BiDAF) network adapts the BiDAF model of Seo et al. (2017) to the MultiRC dataset. This implementation is built on the AllenNLP library.
State-of-the-art neural question answering using PyTorch.
CS224N, Stanford, Winter 2018
Implementation of the Bi-Directional Attention Flow model using PyTorch.
Usage example for the AllenNLP BiDAF pre-trained model