[NeurIPS 2023 Main Track] This is the repository for the paper titled "Don’t Stop Pretraining? Make Prompt-based Fine-tuning Powerful Learner"
Repository for My HuggingFace Natural Language Processing Projects
PyTorch implementation for "Training and Inference on Any-Order Autoregressive Models the Right Way", NeurIPS 2022 Oral, TPM 2023 Best Paper Honorable Mention
11th place solution of NeurIPS 2024 - Predict New Medicines with BELKA competition on Kaggle: https://www.kaggle.com/competitions/leash-BELKA
Code for a WWW'22 publication
Evaluation of zero-shot classification models on Turkish datasets.
Transformer for Automatic Speech Recognition
Use BERTRAM to get single-token embeddings for idioms on the MAGPIE dataset.
🗨️ This repository contains a collection of notebooks and resources for various NLP tasks using different architectures and frameworks.
Pre-training a Transformer from scratch.
Course materials for the Machine Learning for NLP course taught by Sameer Singh for the Cognitive Science summer school 2022.
Measuring Biases in Masked Language Models for PyTorch Transformers. Support for multiple social biases and evaluation measures.
Pre-train a custom BERT model on unlabeled Persian text with a masked language modeling objective
A Pre-trained Language Model for Semantic Similarity Measurement of Persian Informal Short Texts
External Knowledge Infusion using INCIDecoder into BERT for Chemical Mapping
Comparing Data-Driven Techniques for Enhancing Negation Sensitivity in MLM-Based Language Models
Customized Pretraining for NLG Tasks
Source codes and materials of Advanced Spelling Error Correction project.
Masked Language Modeling demo using XLM-RoBERTa + Gradio/FastAPI
Implementation of Transformer Encoders / Masked Language Modeling Objective
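Several of the repos above implement the masked language modeling objective used to pre-train BERT-style encoders. As a minimal sketch of the standard masking recipe (select ~15% of positions; of those, replace 80% with `[MASK]`, 10% with a random token, and leave 10% unchanged) — the token strings, `VOCAB`, and `mask_tokens` below are illustrative and not taken from any listed repository:

```python
import random

MASK = "[MASK]"
VOCAB = ["hello", "world", "the", "cat", "sat"]  # toy vocabulary for random replacement

def mask_tokens(tokens, mask_prob=0.15, rng=None):
    """BERT-style MLM masking.

    Returns (masked, labels): labels[i] holds the original token at
    selected positions (the prediction target) and None elsewhere.
    """
    rng = rng or random.Random()
    masked, labels = list(tokens), [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:          # select ~15% of positions
            labels[i] = tok                   # model must recover the original
            r = rng.random()
            if r < 0.8:
                masked[i] = MASK              # 80%: replace with [MASK]
            elif r < 0.9:
                masked[i] = rng.choice(VOCAB) # 10%: replace with a random token
            # remaining 10%: keep the original token unchanged
    return masked, labels

# Example: mask a short corpus with a fixed seed for reproducibility
masked, labels = mask_tokens(["the", "cat", "sat", "on", "the", "mat"],
                             rng=random.Random(0))
```

Real implementations (e.g. in PyTorch) operate on token-id tensors and compute the cross-entropy loss only at the selected positions, but the selection logic is the same.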