[NeurIPS'22 Spotlight] A Contrastive Framework for Neural Text Generation
MixText: Linguistically-Informed Interpolation of Hidden Space for Semi-Supervised Text Classification
Workshop material for the AMLD 2020 workshop on "Meet your Artificial Self: Generate text that sounds like you"
A dataset intended to train an LLM on completely CVE-focused input and output.
R Interface to OpenAI's GPT-2 model
Machine learning + LSTM implementation to capture sentiments surrounding the 2018 Colombian elections. Sentiment analysis performed entirely on Spanish tweets, with interesting external data sourcing.
Implementing a scalable content team with AI involves creating a framework that blends the strengths of AI technologies with the creative and supervisory capabilities of human team members, aiming to improve efficiency, creativity, and content quality.
Generate fake restaurant reviews with GPT-2 using Yelp Dataset
Code for "Planning and Generating Natural and Diverse Disfluent Texts as Augmentation for Disfluency Detection"
Codes for "NAST: A Non-Autoregressive Generator with Word Alignment for Unsupervised Text Style Transfer" (ACL 2021 findings)
Autocomplete anything using a GGUF model
Code for "Semi-supervised Formality Style Transfer using Language Model Discriminator and Mutual Information Maximization"
📰 Must-read papers on Diffusion Models for Text Generation 🔥
A Bidirectional LSTM Model for lyrics generation
This repository contains code for generating blog content using the Llama 2 language model. It integrates with Streamlit for easy user interaction: simply input your blog topic, desired word count, and writing style to generate engaging blog content.
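A minimal sketch of how such a topic/word-count/style interface might be assembled into a model prompt. The function name, template wording, and parameters below are illustrative assumptions, not the repository's actual code.

```python
def build_blog_prompt(topic: str, n_words: int, style: str) -> str:
    """Assemble an instruction string for the language model.
    The exact template here is a hypothetical stand-in."""
    return (
        f"Write a blog post about {topic} "
        f"in a {style} style, in roughly {n_words} words."
    )

# Example: parameters a Streamlit form might collect from the user.
prompt = build_blog_prompt("edge computing", 300, "conversational")
```

In the app, a string like this would be sent to the Llama 2 model, and the completion rendered back in the Streamlit page.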
A question-answering chatbot for PDFs using local large language and embedding models. Please read the README fully before use.
Discord Bot that combines functionalities from Eleven Labs and OpenAI API.
By following the TensorFlow tutorial, I generated text with a character-based recurrent neural network. The dataset is Shakespeare's writing. Given a sequence of characters from the data, an LSTM-based model is trained to predict the next character in the sequence.
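The data preparation behind such next-character training can be sketched without any framework: map characters to integer indices, then slide a fixed-length window over the text so each input sequence is paired with the character that follows it. The toy corpus and sequence length below are illustrative, not taken from the tutorial.

```python
# Toy stand-in for the Shakespeare corpus used in the tutorial.
text = "to be or not to be"

# Map each unique character to an integer index, and back.
chars = sorted(set(text))
char2idx = {c: i for i, c in enumerate(chars)}
idx2char = {i: c for c, i in char2idx.items()}

def make_examples(text, seq_len):
    """Slide a window over the text: each input is seq_len characters,
    the target is the single character that immediately follows."""
    encoded = [char2idx[c] for c in text]
    inputs, targets = [], []
    for i in range(len(encoded) - seq_len):
        inputs.append(encoded[i:i + seq_len])
        targets.append(encoded[i + seq_len])
    return inputs, targets

inputs, targets = make_examples(text, seq_len=4)
```

An LSTM is then trained on these (input sequence, next character) pairs; at generation time it is fed its own predictions back as input, one character at a time.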