Sequence to Sequence and attention from scratch using Tensorflow
Updated Sep 21, 2017 - Jupyter Notebook
An active vision system which builds a 3D environment map autonomously using visual attention mechanisms.
Multi-head attention for image classification
Neat (Neural Attention) Vision is a visualization tool for the attention mechanisms of deep-learning models for Natural Language Processing (NLP) tasks. (framework-agnostic)
This repository contains the code for my master's thesis project. The main goal of the project was to study and improve attention mechanisms for trajectory prediction of moving agents.
A collection of my custom TensorFlow-Keras 2.0+ layers, utilities, and more
A collection of layers, ops, utilities and more for TensorFlow 2.0 high-level API Keras
Computer-aided diagnosis in histopathological images of the Endometrium
Sparse and structured neural attention mechanisms
Joint Multi-label Attention Network (JMAN)
This repository contains various types of attention mechanisms, such as Bahdanau, soft, additive, and hierarchical attention, in PyTorch, TensorFlow, and Keras
Official PyTorch Implementation for "Rotate to Attend: Convolutional Triplet Attention Module." [WACV 2021]
Hierarchical probabilistic 3D U-Net, with attention mechanisms (Attention U-Net, SEResNet) and a nested decoder structure with deep supervision (UNet++). Built in TensorFlow 2.5. Configured for voxel-level clinically significant prostate cancer detection in multi-channel 3D bpMRI scans.
Learning the YOLOv3 code from scratch
Implementation of Transframer, DeepMind's U-Net + Transformer architecture for video generation of up to 30 seconds, in PyTorch
"Make-A-Video", the new SOTA text-to-video model by Meta-FAIR, in TensorFlow
Master Project on Image Captioning using Supervised Deep Learning Methods
Implementation of Denoising Diffusion for protein design, but using the new Equiformer (successor to SE3 Transformers) with some additional improvements
Implementation of fused cosine similarity attention in the same style as Flash Attention
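The core idea behind cosine-similarity attention, as in the fused implementation listed above, is to L2-normalize queries and keys so the pre-softmax logits are bounded cosine similarities rather than unbounded dot products. A minimal NumPy sketch of that idea (the fixed scale of 10 and all shapes are illustrative assumptions, not values from any listed repository):

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def cosine_sim_attention(q, k, v, scale=10.0, eps=1e-8):
    """Attention where logits are scaled cosine similarities.

    q: (n_q, d), k: (n_k, d), v: (n_k, d_v)
    """
    # L2-normalize queries and keys so q @ k.T is a cosine similarity
    q = q / (np.linalg.norm(q, axis=-1, keepdims=True) + eps)
    k = k / (np.linalg.norm(k, axis=-1, keepdims=True) + eps)
    sim = scale * (q @ k.T)        # logits bounded in [-scale, scale]
    attn = softmax(sim, axis=-1)   # each row sums to 1
    return attn @ v                # (n_q, d_v) weighted sum of values
```

Because the logits are bounded, this variant avoids the logit blow-up that can destabilize standard dot-product attention, at the cost of a learnable or fixed temperature (`scale`).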
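Several of the repositories above implement Bahdanau-style (additive) attention. A minimal NumPy sketch of the scoring function, with hypothetical weight shapes chosen for illustration (the keys double as values here, as in the original encoder-decoder setting):

```python
import numpy as np

def additive_attention(query, keys, W_q, W_k, v):
    """Bahdanau-style additive attention.

    query: (d_q,) decoder state; keys: (T, d_k) encoder states
    W_q: (d_q, d_a), W_k: (d_k, d_a), v: (d_a,) are learned parameters
    """
    # score_i = v^T tanh(W_q q + W_k k_i), one scalar per encoder position
    scores = np.tanh(query @ W_q + keys @ W_k) @ v   # (T,)
    scores = scores - scores.max()                   # stabilize softmax
    weights = np.exp(scores) / np.exp(scores).sum()  # (T,), sums to 1
    context = weights @ keys                         # weighted sum of states
    return context, weights
```

Unlike dot-product attention, the additive form passes the query and each key through a small feed-forward layer, so it does not require the query and key dimensions to match.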