A Pre-trained Language Model for Semantic Similarity Measurement of Persian Informal Short Texts
11th place solution of NeurIPS 2024 - Predict New Medicines with BELKA competition on Kaggle: https://www.kaggle.com/competitions/leash-BELKA
Implementation of Transformer Encoders / Masked Language Modeling Objective
Customized Pretraining for NLG Tasks
🗨️ This repository contains a collection of notebooks and resources for various NLP tasks using different architectures and frameworks.
Pre-training a Transformer from scratch.
Transformer for Automatic Speech Recognition
Measuring Biases in Masked Language Models for PyTorch Transformers. Support for multiple social biases and evaluation measures.
[NeurIPS 2023 Main Track] This is the repository for the paper titled "Don’t Stop Pretraining? Make Prompt-based Fine-tuning Powerful Learner"
Pre-train a custom BERT model on unlabeled Persian text with a masked language modeling objective
Source code and materials for an Advanced Spelling Error Correction project.
Repository for My HuggingFace Natural Language Processing Projects
External Knowledge Infusion using INCIDecoder into BERT for Chemical Mapping
PyTorch implementation for "Training and Inference on Any-Order Autoregressive Models the Right Way", NeurIPS 2022 Oral, TPM 2023 Best Paper Honorable Mention
Masked Language Modeling demo using XLM-RoBERTa + Gradio/FastAPI
Comparing Data-Driven Techniques for Enhancing Negation Sensitivity in MLM-Based Language Models
Use BERTRAM to get single-token embeddings for idioms on the MAGPIE dataset.
Evaluation of zero-shot classification models on Turkish datasets.
Course materials for the Machine Learning for NLP course taught by Sameer Singh for the Cognitive Science summer school 2022.
Code for the publication of WWW'22
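Most of the repositories above build on the masked language modeling objective introduced with BERT. As a minimal, library-free sketch of the core data-corruption step (the 80/10/10 rule applied to roughly 15% of tokens) — the function name and interface here are illustrative, not taken from any repository above:

```python
import random

def mask_tokens(tokens, mask_token="[MASK]", vocab=None, mask_prob=0.15, seed=0):
    """BERT-style MLM corruption: select ~`mask_prob` of positions; of those,
    80% become `mask_token`, 10% become a random vocabulary token, and 10%
    are left unchanged. Returns (corrupted, labels), where labels[i] is the
    original token at selected positions and None elsewhere (excluded from loss)."""
    rng = random.Random(seed)
    vocab = vocab if vocab is not None else tokens  # fallback: sample replacements from the input
    corrupted, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            labels.append(tok)           # the model must recover this token
            r = rng.random()
            if r < 0.8:
                corrupted.append(mask_token)      # 80%: replace with [MASK]
            elif r < 0.9:
                corrupted.append(rng.choice(vocab))  # 10%: random token
            else:
                corrupted.append(tok)             # 10%: keep unchanged
        else:
            labels.append(None)          # unselected: no prediction target
            corrupted.append(tok)
    return corrupted, labels
```

In practice, toolkits such as Hugging Face `transformers` implement this masking inside their MLM data collators; the sketch only shows the corruption logic itself.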