Deep learning methods for sentiment classification of COVID-19 vaccination tweets
A collection of my various custom TensorFlow-Keras 2.0+ layers, utilities, and the like
This repository implements the paper "Revolutionizing Age, Gender, and Ethnicity Recognition with Multi-Modal Vision Transformers," using Vision Transformers (ViT) with CBAM, Coordinate Attention, and Self-Attention. Leveraging the "UTK-Face" dataset, it enhances detection accuracy for age, gender, and ethnicity.
Transformers such as T5 and MarianMT enable effective understanding and generation of complex programming code. Consequently, they can help us in the data-security field. Let's see how!
An active vision system which builds a 3D environment map autonomously using visual attention mechanisms.
✨ The Tensorflow implementation for the IEEE Access paper: "ARERec: Attentive Local Interaction Model for Sequential Recommendation".
👻 The PyTorch implementation for the IEEE Access paper: "PAC-MAN: Multi-Relation Network in Social Community for Personalized Hashtag Recommendation".
Master Project on Image Captioning using Supervised Deep Learning Methods
A repository for training transformers to access longer context in causal language models; most of these methods are still in testing. Try them out if you'd like, but please let me know your results so we don't duplicate work :)
A PyTorch implementation of the Multi-Mode CNN to reconstruct Chlorophyll-a time series in the global ocean from oceanic and atmospheric physical drivers
Implementations of all the code from Korean voice-phishing detection papers
Omni-Modality Processing, Understanding, and Generation
Zeta implementation of "Rethinking Attention: Exploring Shallow Feed-Forward Neural Networks as an Alternative to Attention Layers in Transformers"
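The core idea of that paper is that a shallow feed-forward network, trained to mimic an attention layer's outputs, can stand in for it at inference time. A minimal NumPy sketch of the substitute module is below; the flattened-sequence mapping, the single hidden ReLU layer, and all weight names are illustrative assumptions, not the paper's exact architecture or the Zeta API.

```python
import numpy as np

def shallow_ffn_attention_substitute(x, w1, b1, w2, b2):
    """Illustrative stand-in for an attention layer (hypothetical shapes).

    x:  (seq, d) input sequence, flattened to a single vector.
    w1: (seq*d, hidden), w2: (hidden, seq*d) -- one hidden ReLU layer maps the
    flattened sequence to the flattened output, with no attention computation.
    """
    h = np.maximum(x.reshape(-1) @ w1 + b1, 0.0)  # single hidden ReLU layer
    return (h @ w2 + b2).reshape(x.shape)         # reshape back to (seq, d)
```

In the paper's setting, such a network would be trained by distillation against a pretrained attention layer; the sketch only shows the shape of the replacement.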
Implementation of the model "Hedgehog" from the paper: "The Hedgehog & the Porcupine: Expressive Linear Attentions with Softmax Mimicry"
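Hedgehog's contribution is a feature map φ designed so that φ(q)·φ(k) mimics the spiky, non-negative behavior of softmax attention while keeping the O(n) linear-attention form. A NumPy sketch of generic linear attention with a softmax-over-features map in that spirit is below; `hedgehog_like_phi` and its fixed projection `w` are simplifying assumptions, not the paper's learned MLP feature map.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def hedgehog_like_phi(x, w):
    # Softmax over the feature dimension of a linear projection: non-negative,
    # low-entropy features in the spirit of softmax mimicry. `w` stands in for
    # a learned projection (hypothetical here).
    return softmax(x @ w, axis=-1)

def linear_attention(q, k, v, phi):
    """out_i = phi(q_i) @ (sum_j phi(k_j) v_j^T) / (phi(q_i) @ sum_j phi(k_j)).

    Costs O(n * r * d) and never forms the n x n attention matrix.
    """
    fq, fk = phi(q), phi(k)      # (n, r) feature-mapped queries / keys
    kv = fk.T @ v                # (r, d) summed key-value outer products
    z = fq @ fk.sum(axis=0)      # (n,) per-query normalizers
    return (fq @ kv) / z[:, None]
```

By construction, the implicit attention weights `(fq @ fk.T) / z[:, None]` are non-negative and each row sums to 1, matching the normalization of softmax attention.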
An open-source implementation of grouped-query attention from the paper "GQA: Training Generalized Multi-Query Transformer Models from Multi-Head Checkpoints"
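In grouped-query attention, the query heads are split into groups that share a single key/value head, interpolating between multi-head (one K/V head per query head) and multi-query (one K/V head total) attention. A minimal NumPy sketch of the forward pass, with illustrative shapes and names only:

```python
import numpy as np

def grouped_query_attention(q, k, v, num_kv_heads):
    """GQA sketch: query heads share K/V heads in groups (hypothetical shapes).

    q: (num_q_heads, seq, d); k, v: (num_kv_heads, seq, d),
    with num_q_heads divisible by num_kv_heads.
    """
    num_q_heads, seq, d = q.shape
    group = num_q_heads // num_kv_heads        # query heads per K/V head
    out = np.empty_like(q)
    for h in range(num_q_heads):
        kv = h // group                        # shared K/V head for this group
        scores = q[h] @ k[kv].T / np.sqrt(d)
        scores -= scores.max(axis=-1, keepdims=True)  # stable softmax
        w = np.exp(scores)
        w /= w.sum(axis=-1, keepdims=True)
        out[h] = w @ v[kv]
    return out
```

With `num_kv_heads == num_q_heads` this reduces to ordinary multi-head attention; with `num_kv_heads == 1` it is multi-query attention. The memory saving comes from storing only `num_kv_heads` K/V caches.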
A collection of layers, ops, utilities and more for TensorFlow 2.0 high-level API Keras
A simple python package for question answering.
Joint Multi-label Attention Network (JMAN)
A simple PyTorch implementation of Flash multi-head attention
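The trick behind FlashAttention is the online (streaming) softmax: scores are processed one key/value block at a time while a running max, denominator, and weighted-value accumulator are rescaled, so the full seq x seq score matrix is never materialized. A single-head NumPy sketch of that recurrence (block size and names are illustrative; the real kernel also tiles queries and runs on-chip):

```python
import numpy as np

def naive_attention(q, k, v):
    s = q @ k.T / np.sqrt(q.shape[-1])
    s -= s.max(axis=-1, keepdims=True)
    w = np.exp(s)
    w /= w.sum(axis=-1, keepdims=True)
    return w @ v

def flash_attention(q, k, v, block=2):
    """Flash-style attention: stream over K/V blocks with an online softmax."""
    n, d = q.shape
    scale = 1.0 / np.sqrt(d)
    m = np.full(n, -np.inf)            # running row-wise max of scores
    l = np.zeros(n)                    # running softmax denominator
    acc = np.zeros((n, d))             # running weighted-value accumulator
    for j in range(0, k.shape[0], block):
        kb, vb = k[j:j + block], v[j:j + block]
        s = q @ kb.T * scale                        # scores for this block only
        m_new = np.maximum(m, s.max(axis=-1))
        alpha = np.exp(m - m_new)                   # rescale previous stats
        p = np.exp(s - m_new[:, None])
        l = l * alpha + p.sum(axis=-1)
        acc = acc * alpha[:, None] + p @ vb
        m = m_new
    return acc / l[:, None]
```

The blocked result matches the naive computation exactly (up to floating-point rounding); the payoff in the real kernel is memory traffic, not arithmetic.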