Team project - generate a recipe based on the ingredients available in the fridge
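The listing gives no implementation details, so here is a minimal sketch of the general idea: prompt a GPT-2 style model with the fridge contents. The prompt format, ingredient list, and `build_prompt` helper are illustrative assumptions, not taken from the project itself.

```python
# Minimal sketch: prompt a GPT-2 style model with the fridge contents.
# The prompt format and ingredient list are illustrative, not from the project.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

def build_prompt(ingredients):
    return "Ingredients: " + ", ".join(ingredients) + "\nRecipe:"

ingredients = ["eggs", "spinach", "feta"]
result = generator(build_prompt(ingredients), max_new_tokens=80, do_sample=True)
print(result[0]["generated_text"])
```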
A library that leverages the pre-trained openai-community/gpt2 model for multilingual text classification (French, German, and English), providing straightforward fine-tuning capabilities for sequence classification tasks.
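The library's own API isn't shown in the listing; below is a minimal sketch of the transformers setup it presumably wraps, with the three-way label set (French, German, English) taken from the description. GPT-2 ships without a pad token, so the EOS token is reused for padding.

```python
# Sketch of GPT-2 set up for 3-way sequence classification (fr/de/en).
# GPT-2 has no pad token by default, so the EOS token is reused.
from transformers import GPT2ForSequenceClassification, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("openai-community/gpt2")
tokenizer.pad_token = tokenizer.eos_token

model = GPT2ForSequenceClassification.from_pretrained(
    "openai-community/gpt2", num_labels=3  # French, German, English
)
model.config.pad_token_id = tokenizer.pad_token_id

inputs = tokenizer("Bonjour tout le monde", return_tensors="pt", padding=True)
logits = model(**inputs).logits  # fine-tune with Trainer or a manual loop
```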
Training transformer models (e.g. RoBERTa, GPT2 and GPT-J) from scratch.
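"From scratch" here means random initialization from a config rather than loading pretrained weights; a short sketch of that distinction using transformers:

```python
# "From scratch" = random weights from a config, not a pretrained checkpoint.
from transformers import GPT2Config, GPT2LMHeadModel, RobertaConfig, RobertaForMaskedLM

gpt2 = GPT2LMHeadModel(GPT2Config())           # randomly initialized GPT-2
roberta = RobertaForMaskedLM(RobertaConfig())  # randomly initialized RoBERTa
# Contrast: GPT2LMHeadModel.from_pretrained("gpt2") would load trained weights.
```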
MindSpore online courses: Step into LLM
Chinese NLP solutions (large models, data, models, training, inference)
The "Build a Large Language Model (From Scratch)" book and fine-tuned models
Welcome to this repository showcasing the power of ChatGPT-2! In this project, we use API access from Hugging Face to integrate ChatGPT-2 into a user-friendly interface. With officially supported UI elements, users can interact with ChatGPT-2 in a visually appealing environment. https://yonatankinfe.github.io/ChatGPT2/
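The project serves a web UI, but the underlying call is a plain request to the Hugging Face hosted Inference API; a Python sketch of that call, with the HF_TOKEN environment variable assumed to hold a valid access token:

```python
# Sketch of a call to the Hugging Face hosted Inference API for gpt2.
# HF_TOKEN is an assumed environment variable holding a valid access token.
import os
import requests

API_URL = "https://api-inference.huggingface.co/models/gpt2"
headers = {"Authorization": f"Bearer {os.environ['HF_TOKEN']}"}

response = requests.post(API_URL, headers=headers,
                         json={"inputs": "Hello, my name is"})
print(response.json())  # [{"generated_text": "..."}]
```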
Emotionally Conditioned Melody Generation in ABC Notation
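The listing doesn't say how the emotional conditioning works; one common approach is a control-token prefix on the prompt. In the sketch below, the `<happy>` tag, and a model fine-tuned to respect it, are assumptions for illustration only:

```python
# Hypothetical sketch of control-token conditioning: prepend an emotion tag
# to an ABC-notation prompt. The "<happy>" tag and a model fine-tuned to
# understand it are assumptions, not details from the project.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")  # stand-in model
prompt = "<happy>\nX:1\nT:Untitled\nM:4/4\nK:C\n"
melody = generator(prompt, max_new_tokens=64, do_sample=True)
print(melody[0]["generated_text"])
```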
This project involves building a chatbot for a clothing store using a fine-tuned GPT-2 model. The chatbot is designed to answer customer queries related to products, services, and store policies, and to handle out-of-scope questions gracefully.
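The project's actual scope detection isn't described; the sketch below shows one simple way such a guard could sit in front of the fine-tuned model, with `my-store-gpt2` as a placeholder checkpoint name and the keyword list as a stand-in for real intent detection:

```python
# Illustrative out-of-scope guard in front of a fine-tuned GPT-2 chatbot.
# "my-store-gpt2" is a placeholder checkpoint name, and the keyword check is
# a simple stand-in for whatever scope detection the project actually uses.
from transformers import pipeline

chatbot = pipeline("text-generation", model="my-store-gpt2")
IN_SCOPE = ("return", "size", "shipping", "price", "stock", "refund")

def answer(query: str) -> str:
    if not any(word in query.lower() for word in IN_SCOPE):
        return "I can only help with questions about our store and products."
    reply = chatbot(f"Customer: {query}\nAssistant:", max_new_tokens=60)
    return reply[0]["generated_text"]
```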
NLP (Natural Language Processing)
AI-driven video analysis system that extracts and transcribes audio with Whisper, detects objects using YOLO, and generates comprehensive scene descriptions with GPT-2. The project combines transcriptions and object detections to produce detailed, context-aware video narratives.
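A condensed sketch of the described pipeline: a Whisper transcript and YOLO detections from one sampled frame are folded into a GPT-2 prompt. The file names and prompt format are assumptions; the whisper, ultralytics, and transformers calls are standard:

```python
# Sketch of the described pipeline: Whisper transcript + YOLO detections
# fed into GPT-2 as a prompt. File names and prompt format are assumptions.
import whisper
from ultralytics import YOLO
from transformers import pipeline

transcript = whisper.load_model("base").transcribe("video.mp4")["text"]
detections = YOLO("yolov8n.pt")("frame.jpg")  # one sampled frame
objects = {r.names[int(b.cls)] for r in detections for b in r.boxes}

prompt = (f"Transcript: {transcript}\nObjects seen: {', '.join(objects)}\n"
          "Scene description:")
narrator = pipeline("text-generation", model="gpt2")
print(narrator(prompt, max_new_tokens=80)[0]["generated_text"])
```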
ruGPT3Large advanced Telegram chatbot
Pretraining the 124M GPT-2 model
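The default GPT2Config corresponds to the 124M model (12 layers, 12 heads, 768-dim embeddings). A minimal single pretraining step, with a dummy token batch standing in for real data:

```python
# The default GPT2Config is the 124M model (12 layers, 12 heads, d_model=768).
# Minimal single pretraining step; the token batch is a dummy placeholder.
import torch
from transformers import GPT2Config, GPT2LMHeadModel

model = GPT2LMHeadModel(GPT2Config())
print(sum(p.numel() for p in model.parameters()))  # ~124M

optimizer = torch.optim.AdamW(model.parameters(), lr=6e-4)
batch = torch.randint(0, 50257, (8, 128))  # (batch, seq_len) token ids
loss = model(batch, labels=batch).loss     # labels are shifted internally
loss.backward()
optimizer.step()
optimizer.zero_grad()
```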
A Python-based chatbot project built on the autogen and tinygrad foundation, utilizing advanced agents for dynamic conversations and function orchestration, enhancing and expanding traditional chatbot capabilities.
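As a rough sketch of the autogen side only (the tinygrad-backed model serving and custom function orchestration are not reproduced), assuming the pyautogen AssistantAgent/UserProxyAgent interface, with placeholder llm_config values:

```python
# Sketch of a two-agent autogen conversation (pyautogen-style interface).
# The llm_config values are placeholders; the project's tinygrad-backed
# serving and function orchestration are not reproduced here.
import autogen

llm_config = {"config_list": [{"model": "gpt-4", "api_key": "YOUR_KEY"}]}

assistant = autogen.AssistantAgent(name="assistant", llm_config=llm_config)
user_proxy = autogen.UserProxyAgent(name="user", human_input_mode="NEVER",
                                    code_execution_config=False)

user_proxy.initiate_chat(assistant, message="Summarize what autogen agents do.")
```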
Generative Pre-trained Transformer (GPT) in JAX. A step-by-step guide to training LLMs on large datasets from scratch.
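The core pattern such a guide builds toward is a jitted loss-and-gradient step; in this toy version a single logits table stands in for the full transformer stack:

```python
# Toy version of the jit/grad training-step pattern a JAX GPT guide builds on.
# A single logits table stands in for the full transformer stack.
import jax
import jax.numpy as jnp

VOCAB = 256

def loss_fn(params, tokens):
    logits = params["logits_table"][tokens[:-1]]  # predict the next token
    logp = jax.nn.log_softmax(logits)
    return -jnp.mean(logp[jnp.arange(len(tokens) - 1), tokens[1:]])

@jax.jit
def train_step(params, tokens, lr=1e-2):
    grads = jax.grad(loss_fn)(params, tokens)
    return jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)

params = {"logits_table": jnp.zeros((VOCAB, VOCAB))}
tokens = jnp.array([72, 101, 108, 108, 111])  # "Hello" as byte values
params = train_step(params, tokens)
```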
This repository contains the replication material and the Appendix for our final project for the course "Deep Learning for the Social Sciences" (SuSe 2024) at the University of Konstanz.
GPT2 made from scratch and trained on a small text corpus
Training GPT-2 on FineWeb-Edu in JAX/Flax
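A sketch of streaming FineWeb-Edu batches toward a Flax training step; the "sample-10BT" subset name is assumed from the dataset card on the Hub:

```python
# Sketch of streaming FineWeb-Edu batches toward a JAX/Flax training loop.
# The "sample-10BT" subset name is assumed from the dataset card on the Hub.
import jax.numpy as jnp
from datasets import load_dataset
from transformers import GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
stream = load_dataset("HuggingFaceFW/fineweb-edu", name="sample-10BT",
                      split="train", streaming=True)

for example in stream.take(1):
    ids = tokenizer(example["text"], truncation=True, max_length=1024)["input_ids"]
    batch = jnp.array([ids])  # feed into a Flax GPT-2 train step
    print(batch.shape)
```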