Implementation of Alphafold 3 in Pytorch
Implementation of Band Split Roformer, a SOTA attention network for music source separation from ByteDance AI Labs
Explorations into the recently proposed Taylor Series Linear Attention (a minimal sketch of the idea appears after this list)
Implementation of the conditionally routed attention in the CoLT5 architecture, in Pytorch
An implementation of local windowed attention for language modeling (sketched after this list)
A comprehensive paper list on Vision Transformers and attention, including papers, code, and related websites
Implementation of Phenaki Video, which uses Mask GIT to produce text guided videos of up to 2 minutes in length, in Pytorch
PyTorch Dual-Attention LSTM-Autoencoder For Multivariate Time Series
Implementation of MagViT2 Tokenizer in Pytorch
Implementation of Toolformer, Language Models That Can Use Tools, by MetaAI
Code implementations accompanying papers on Korean voice phishing detection
Implementation of the paper "LongRoPE: Extending LLM Context Window Beyond 2 Million Tokens"
Implementation of MambaFormer in Pytorch and Zeta, from the paper "Can Mamba Learn How to Learn? A Comparative Study on In-Context Learning Tasks"
Implementation of Agent Attention in Pytorch (sketched after this list)
Implementation of MeshGPT, SOTA Mesh generation using Attention, in Pytorch
Implementation of Diffusion Policy, Toyota Research's supposed breakthrough in leveraging DDPMs for learning policies for real-world Robotics
Implementation of a single layer of the MMDiT, proposed in Stable Diffusion 3, in Pytorch
Transformer models such as T5 and MarianMT enable effective understanding and generation of complex program code; consequently, they can help in the data security field. Let's see how!
This repository implements the paper "Revolutionizing Age, Gender, and Ethnicity Recognition with Multi-Modal Vision Transformers," using Vision Transformers (ViT) with CBAM, Coordinate Attention, and Self-Attention. Leveraging the UTK-Face dataset, it improves recognition accuracy for age, gender, and ethnicity.
Zeta implementation of "Rethinking Attention: Exploring Shallow Feed-Forward Neural Networks as an Alternative to Attention Layers in Transformers"
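To make the Taylor Series Linear Attention entry above concrete, here is a minimal non-causal sketch of the core idea: the softmax kernel exp(q·k) is replaced by its second-order Taylor expansion 1 + q·k + (q·k)²/2, which factorizes into a feature map φ so attention runs in time linear in sequence length. Function names and shapes here are illustrative assumptions, not the repository's actual API.

```python
import torch

def taylor_feature_map(x: torch.Tensor) -> torch.Tensor:
    # phi(x) = [1, x, vec(x ⊗ x) / sqrt(2)], chosen so that
    # phi(q) · phi(k) = 1 + q·k + (q·k)^2 / 2, the 2nd-order Taylor of exp(q·k).
    ones = x.new_ones(*x.shape[:-1], 1)
    second = (x.unsqueeze(-1) * x.unsqueeze(-2)).flatten(-2) / 2 ** 0.5
    return torch.cat((ones, x, second), dim=-1)

def taylor_linear_attention(q, k, v):
    # q, k, v: (batch, seq_len, dim) — hypothetical shapes for illustration
    q = q * q.shape[-1] ** -0.5                    # usual softmax-style scaling
    q, k = taylor_feature_map(q), taylor_feature_map(k)
    kv = torch.einsum('bnd,bne->bde', k, v)        # sum_j phi(k_j) v_j^T, O(n)
    z = q @ k.sum(dim=1, keepdim=True).transpose(-2, -1)  # normalizer, > 0
    return torch.einsum('bnd,bde->bne', q, kv) / z.clamp(min=1e-6)
```

The normalizer is always positive because 1 + t + t²/2 > 0 for all real t, so no explicit exponentiation is needed.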
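Likewise, a minimal sketch of causal local windowed attention, as used for language modeling: each position attends only to itself and the previous window_size - 1 positions. For clarity this materializes the full n × n score matrix; practical implementations bucket the sequence into blocks to avoid the quadratic memory. Names are illustrative, not the repository's API.

```python
import torch
import torch.nn.functional as F

def local_windowed_attention(q, k, v, window_size: int = 64):
    # q, k, v: (batch, heads, seq_len, dim_head)
    n = q.shape[-2]
    scores = q @ k.transpose(-2, -1) * q.shape[-1] ** -0.5

    i = torch.arange(n, device=q.device)
    causal = i[:, None] >= i[None, :]                    # no attending ahead
    in_window = (i[:, None] - i[None, :]) < window_size  # stay within window
    scores = scores.masked_fill(~(causal & in_window), float('-inf'))

    return F.softmax(scores, dim=-1) @ v
```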
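Finally, a minimal non-causal sketch of agent attention, which replaces full softmax attention softmax(QKᵀ)V with two cheaper attentions mediated by a small set of agent tokens A, i.e. softmax(QAᵀ) · softmax(AKᵀ) · V. Forming the agents by pooling the queries follows the paper; the paper's additional depthwise-convolution residual and positional biases are omitted, and all names and shapes are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def agent_attention(q, k, v, num_agents: int = 49):
    # q, k, v: (batch, heads, seq_len, dim_head)
    b, h, n, d = q.shape
    scale = d ** -0.5

    # agent tokens: queries pooled down to num_agents positions
    a = F.adaptive_avg_pool1d(q.reshape(b * h, n, d).transpose(-2, -1), num_agents)
    a = a.transpose(-2, -1).reshape(b, h, num_agents, d)

    # agent aggregation: agents attend to all keys/values, softmax(A Kᵀ) V
    agent_ctx = F.softmax(a @ k.transpose(-2, -1) * scale, dim=-1) @ v
    # agent broadcast: queries attend to the agents, softmax(Q Aᵀ) · agent_ctx
    return F.softmax(q @ a.transpose(-2, -1) * scale, dim=-1) @ agent_ctx
```

Since num_agents is small and fixed, both attention steps cost O(n · num_agents) instead of O(n²).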