Implementation of RLHF (Reinforcement Learning with Human Feedback) on top of the PaLM architecture. Basically ChatGPT but with PaLM
A comprehensive paper list on Vision Transformers/Attention, including papers, code, and related websites
Implementation of AudioLM, a SOTA Language Modeling Approach to Audio Generation out of Google Research, in Pytorch
Implementation of MusicLM, Google's new SOTA model for music generation using attention networks, in Pytorch
Implementation of Make-A-Video, new SOTA text to video generator from Meta AI, in Pytorch
Awesome List of Attention Modules and Plug&Play Modules in Computer Vision
Implementation of Toolformer, Language Models That Can Use Tools, by MetaAI
Implementation of Muse: Text-to-Image Generation via Masked Generative Transformers, in Pytorch
Implementation of Phenaki Video, which uses MaskGIT to produce text-guided videos of up to 2 minutes in length, in Pytorch
Implementation of Alphafold 3 in Pytorch
Implementation of plug-and-play Attention from "LongNet: Scaling Transformers to 1,000,000,000 Tokens"
PyTorch Dual-Attention LSTM-Autoencoder For Multivariate Time Series
Implementation of MeshGPT, SOTA Mesh generation using Attention, in Pytorch
Learning the YOLOv3 code from scratch
Implementation of MEGABYTE, Predicting Million-byte Sequences with Multiscale Transformers, in Pytorch
Official PyTorch Implementation for "Rotate to Attend: Convolutional Triplet Attention Module." [WACV 2021]
An implementation of local windowed attention for language modeling
Sparse and structured neural attention mechanisms
Multi heads attention for image classification
Implementation of ChatGPT, but tailored towards primary care medicine, with rewards for collecting patient histories thoroughly and efficiently and arriving at a reasonable differential diagnosis
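Most repositories above implement variants of the same core operation, scaled dot-product attention. A minimal NumPy sketch of that operation (illustrative only, not taken from any of the listed repos; shapes and names are assumptions):

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """softmax(QK^T / sqrt(d)) V over the last two axes."""
    d = q.shape[-1]
    # similarity scores between queries and keys, scaled by sqrt(d)
    scores = q @ k.swapaxes(-2, -1) / np.sqrt(d)  # (..., seq_q, seq_k)
    # numerically stable row-wise softmax
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # weighted sum of values
    return weights @ v

rng = np.random.default_rng(0)
q = rng.standard_normal((2, 4, 8))  # (batch, seq, dim)
k = rng.standard_normal((2, 4, 8))
v = rng.standard_normal((2, 4, 8))
out = scaled_dot_product_attention(q, k, v)
print(out.shape)  # (2, 4, 8)
```

The listed repos build on this primitive, e.g. by restricting the score matrix to a local window, sparsifying it, or running many such heads in parallel.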