tchayintr/thbert

Yet another pre-trained model for Thai BERT

thbert

BERT is an unsupervised, pre-trained natural language processing model that can be fine-tuned to perform downstream NLP tasks effectively.

To enable research opportunities in a field with very few Thai computational linguistics resources, we introduce a fundamental language resource, Thai BERT, built from scratch for researchers and enthusiasts.

Pre-trained models

Each .zip file contains three items:

  • A TensorFlow checkpoint (thbert_model.ckpt) containing the pre-trained weights (3 files).
  • A vocab file (vocab.txt) mapping WordPiece tokens to token ids.
  • A config file (bert_config.json) which specifies the hyperparameters of the model.
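As a sketch of how the vocab file is typically consumed: in standard BERT releases, vocab.txt contains one WordPiece token per line, and the line index serves as the token id. The helper name below is illustrative, not part of this repository:

```python
def load_vocab(path):
    """Read a BERT-style vocab.txt into a token -> id dict.

    Assumes the standard BERT format: one WordPiece per line,
    line index = token id.
    """
    with open(path, encoding="utf-8") as f:
        return {line.rstrip("\n"): idx for idx, line in enumerate(f)}
```

The resulting dict is what a tokenizer uses to convert WordPiece tokens into the integer ids fed to the model.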

Pre-training data

Source

Tokenization
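The thbert-specific tokenization details are not stated in this section. For reference, standard BERT applies greedy longest-match-first WordPiece tokenization to each word, marking non-initial pieces with a "##" prefix; a minimal sketch, assuming that scheme:

```python
def wordpiece_tokenize(word, vocab, unk="[UNK]"):
    """Greedy longest-match-first WordPiece, as in standard BERT.

    `vocab` is a set (or dict) of known pieces; non-initial pieces
    carry a "##" prefix. Returns [unk] if the word cannot be split.
    """
    tokens, start = [], 0
    while start < len(word):
        end, match = len(word), None
        while start < end:
            piece = word[start:end]
            if start > 0:
                piece = "##" + piece
            if piece in vocab:
                match = piece
                break
            end -= 1
        if match is None:
            return [unk]
        tokens.append(match)
        start = end
    return tokens
```

For example, with a vocab containing "un", "##aff", and "##able", the word "unaffable" splits into ["un", "##aff", "##able"].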
