JayaswalVivek/BERT_Based_Regression


DistilBERT for performing an NLP task.

Data Source: Kaggle
URL: https://www.kaggle.com/competitions/commonlitreadabilityprize
Problem Definition: Rate the complexity of literary passages for grades 3-12 classroom use
Problem Type: Regression using unstructured data

An implementation of DistilBERT for Kaggle's "CommonLit Readability Prize" challenge. The source datasets can be downloaded from Kaggle's website.
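As a rough sketch of how DistilBERT can be set up for regression (this is an illustration using the Hugging Face `transformers` API, not necessarily the exact configuration used in this repository), the model can be given a single-output head and trained with an MSE objective. The config is built from scratch here so no pretrained weights are downloaded; in practice one would load `distilbert-base-uncased`.

```python
# Hypothetical sketch: DistilBERT with a single-output regression head.
# Built from a fresh DistilBertConfig (random weights, no download) purely
# for illustration; real training would start from pretrained weights.
import torch
from transformers import DistilBertConfig, DistilBertForSequenceClassification

config = DistilBertConfig(num_labels=1, problem_type="regression")
model = DistilBertForSequenceClassification(config)

# Dummy batch: token ids for two passages plus made-up readability targets
input_ids = torch.randint(0, config.vocab_size, (2, 16))
labels = torch.tensor([[-0.34], [1.12]])

out = model(input_ids=input_ids, labels=labels)
print(out.logits.shape)  # one predicted score per passage: (2, 1)
print(out.loss)          # MSE loss, used because problem_type="regression"
```

With `num_labels=1` and `problem_type="regression"`, `transformers` automatically switches the head's loss to mean squared error, which matches the competition's continuous readability targets.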

Evaluation Summary
RMSE: 0.606
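The reported score uses root mean squared error, the competition's evaluation metric. A minimal helper shows how such a score is computed; the passage targets and predictions below are made up for illustration.

```python
import math

def rmse(y_true, y_pred):
    """Root mean squared error over paired targets and predictions."""
    n = len(y_true)
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n)

# Hypothetical readability targets and model predictions for three passages
targets = [-0.34, 1.12, 0.05]
preds = [-0.10, 0.90, 0.20]
print(round(rmse(targets, preds), 3))  # → 0.207
```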
