
Attention Mechanisms

An implementation of multiple notable attention mechanisms using TensorFlow 2.

ReZero

A residual connection gated by a trainable scalar initialized to zero, from the paper ReZero is All You Need.

from attention_mechanism.utils import ReZero
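
A minimal sketch of the ReZero idea as a standalone Keras layer: the residual branch is scaled by a trainable scalar that starts at zero, so every block begins as the identity. The wrapper below is illustrative and is not this repository's ReZero implementation.

```python
import tensorflow as tf

class ReZeroSketch(tf.keras.layers.Layer):
    """Illustrative ReZero residual: y = x + alpha * f(x), with alpha initialized to 0."""
    def __init__(self, sublayer, **kwargs):
        super().__init__(**kwargs)
        self.sublayer = sublayer
        # Trainable scalar gate; starting at zero makes the block the identity at init.
        self.alpha = self.add_weight(name="alpha", shape=(), initializer="zeros")

    def call(self, x):
        return x + self.alpha * self.sublayer(x)

# Wrap any sublayer, e.g. a feed-forward block.
block = ReZeroSketch(tf.keras.layers.Dense(64, activation="relu"))
y = block(tf.random.normal((2, 10, 64)))  # same shape as the input
```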

Reversible Sequence

Part of the Reformer architecture: reversible layers trade recomputation for memory by reconstructing activations during the backward pass instead of storing them.

from attention_mechanism.sequence import ReversibleSequence
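
A minimal sketch of the reversible residual coupling behind such layers: because the inputs can be reconstructed exactly from the outputs, activations need not be kept in memory. Function names below are illustrative, not this repository's API.

```python
import tensorflow as tf

def reversible_step(x1, x2, f, g):
    """One reversible block over a pair of activation streams."""
    y1 = x1 + f(x2)
    y2 = x2 + g(y1)
    return y1, y2

def reversible_step_inverse(y1, y2, f, g):
    """Recover the inputs exactly from the outputs."""
    x2 = y2 - g(y1)
    x1 = y1 - f(x2)
    return x1, x2

f = tf.keras.layers.Dense(16)
g = tf.keras.layers.Dense(16)
x1, x2 = tf.random.normal((2, 16)), tf.random.normal((2, 16))
y1, y2 = reversible_step(x1, x2, f, g)
r1, r2 = reversible_step_inverse(y1, y2, f, g)  # r1 == x1, r2 == x2 up to float error
```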

Traditional multi-head attention with scaled dot-product attention

from attention_mechanism.attention import MultiHeadAttention
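
For reference, scaled dot-product attention computes softmax(QKᵀ/√d_k)V. A single-head sketch of that core operation (illustrative, not this repository's MultiHeadAttention):

```python
import tensorflow as tf

def scaled_dot_product_attention(q, k, v):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = tf.cast(tf.shape(k)[-1], q.dtype)
    scores = tf.matmul(q, k, transpose_b=True) / tf.sqrt(d_k)
    weights = tf.nn.softmax(scores, axis=-1)  # attention weights over key positions
    return tf.matmul(weights, v)

q = tf.random.normal((2, 8, 64))  # (batch, seq_len, d_k)
k = tf.random.normal((2, 8, 64))
v = tf.random.normal((2, 8, 64))
out = scaled_dot_product_attention(q, k, v)  # (2, 8, 64)
```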

Standalone self-attention module with linear complexity in sequence length, using the FAVOR+ mechanism

from attention_mechanism.attention import SelfAttention
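
The linear complexity comes from replacing the softmax with a kernel feature map φ and exploiting associativity: computing φ(Q)(φ(K)ᵀV) never materializes the N×N attention matrix. The sketch below uses a simple ReLU feature map for illustration; FAVOR+ proper uses random positive features.

```python
import tensorflow as tf

def linear_attention(q, k, v, eps=1e-6):
    """Kernelized attention phi(Q) (phi(K)^T V), O(N) in sequence length."""
    phi_q = tf.nn.relu(q)  # illustrative feature map; FAVOR+ uses random features
    phi_k = tf.nn.relu(k)
    kv = tf.einsum("bnd,bne->bde", phi_k, v)  # (batch, d, e) context summary
    # Normalizer: phi(q_n) dotted with the sum of phi(k) over all positions.
    z = 1.0 / (tf.einsum("bnd,bd->bn", phi_q, tf.reduce_sum(phi_k, axis=1)) + eps)
    return tf.einsum("bnd,bde,bn->bne", phi_q, kv, z)

q = tf.random.normal((2, 1024, 64))
k = tf.random.normal((2, 1024, 64))
v = tf.random.normal((2, 1024, 64))
out = linear_attention(q, k, v)  # (2, 1024, 64), no 1024 x 1024 matrix formed
```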

The PerformerLM module is designed for language modeling, while Performer can be applied to more general high-dimensional inputs such as images

from attention_mechanism.performer import Performer, PerformerLM
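
A hedged usage sketch: every constructor argument below is an assumption for illustration only, not this repository's documented signature; check the source for the actual parameters.

```python
from attention_mechanism.performer import Performer, PerformerLM

# Hypothetical arguments, shown only to illustrate the intended split:
# PerformerLM consumes token IDs for language modeling, while Performer
# accepts general dense inputs such as flattened image patches.
lm = PerformerLM(num_tokens=20000, max_seq_len=1024, dim=512, depth=6, heads=8)
```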

Lambda layer from the Lambda Networks paper

from attention_mechanism.lamda_layer import LambdaLayer
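
A simplified sketch of the content-lambda path from Lambda Networks (position lambdas and multi-query heads omitted): keys are softmax-normalized over context positions, summarized into a single linear map shared by all queries. Illustrative only, not this repository's LambdaLayer.

```python
import tensorflow as tf

def content_lambda_layer(x, dim_k=16, dim_v=64):
    """Content-only lambda: each query applies a shared linear function
    ('lambda') that summarizes the entire context."""
    q = tf.keras.layers.Dense(dim_k, use_bias=False)(x)  # queries (b, n, k)
    k = tf.keras.layers.Dense(dim_k, use_bias=False)(x)  # keys    (b, n, k)
    v = tf.keras.layers.Dense(dim_v, use_bias=False)(x)  # values  (b, n, v)
    k = tf.nn.softmax(k, axis=1)                         # normalize over positions
    content_lambda = tf.einsum("bnk,bnv->bkv", k, v)     # (b, k, v), shared by all queries
    return tf.einsum("bnk,bkv->bnv", q, content_lambda)  # (b, n, v)

x = tf.random.normal((2, 64, 32))
y = content_lambda_layer(x)  # (2, 64, 64)
```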

Vision Transformer

from attention_mechanism.vision_transformer import VisionTransformer
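
The characteristic ViT preprocessing step, sketched below: split each image into fixed-size patches and linearly embed them into a token sequence for a transformer encoder. Illustrative only, not this repository's VisionTransformer.

```python
import tensorflow as tf

def patch_embed(images, patch_size=16, dim=128):
    """Split images into non-overlapping patches and project each to `dim`."""
    b = tf.shape(images)[0]
    patches = tf.image.extract_patches(
        images,
        sizes=[1, patch_size, patch_size, 1],
        strides=[1, patch_size, patch_size, 1],
        rates=[1, 1, 1, 1],
        padding="VALID",
    )
    # (b, h/p, w/p, p*p*c) -> (b, num_patches, p*p*c)
    patches = tf.reshape(patches, (b, -1, patch_size * patch_size * images.shape[-1]))
    return tf.keras.layers.Dense(dim)(patches)  # (b, num_patches, dim)

imgs = tf.random.normal((2, 224, 224, 3))
tokens = patch_embed(imgs)  # (2, 196, 128)
```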

Incoming...
