Quickly fine-tune language models for your downstream NLP tasks.
AWD-LSTM and ULMFiT reproduction from scratch
Method Development for Predicting Protein Subcellular Localization Based on Deep Learning
Supplementary scripts and data for my thesis
Filipino pretrained BERT & ULMFiT models, plus large unlabeled text corpora
Deploy FastAI Trained PyTorch Model in TorchServe and Host in GCP's AI-Platform Prediction.
Deep learning (DL) approaches use various processing layers to learn hierarchical representations of data. Recently, many methods and designs of natural language processing (NLP) models have shown significant development, especially in text mining and analysis. For learning vector-space representations of text, there are famous models like Word2…
One-Stop Solution to encode sentence to fixed length vectors from various embedding techniques
Chinese ULMFiT for sentiment analysis and text classification