Compact Convolution Transformers

This repository contains an unofficial PyTorch implementation of Compact Convolution Transformers (CCT) with the Convolutional Block Attention Module (CBAM).

Content

Model Architecture

Convolutions with CBAM attention are used to tokenize the input image, and the resulting token sequence is then processed by a transformer encoder.

Compact Convolution Transformer (CCT)

Convolutional Block Attention Module (CBAM)
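The tokenizer described above can be sketched in plain PyTorch. This is an illustrative sketch, not the repository's actual code: the class names (`ChannelAttention`, `SpatialAttention`, `CBAM`, `ConvTokenizer`) and parameters (`reduction`, `embed_dim`) are assumptions chosen to mirror the CBAM paper and the CCT tokenizer design, not this package's API.

```python
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """CBAM channel attention: squeeze spatial dims with avg- and max-pooling,
    pass both through a shared MLP, and gate channels with a sigmoid."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        avg = self.mlp(x.mean(dim=(2, 3)))   # global average pool -> MLP
        mx = self.mlp(x.amax(dim=(2, 3)))    # global max pool -> shared MLP
        scale = torch.sigmoid(avg + mx).view(b, c, 1, 1)
        return x * scale


class SpatialAttention(nn.Module):
    """CBAM spatial attention: pool over channels, convolve, sigmoid-gate."""

    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)    # channel-wise average map
        mx = x.amax(dim=1, keepdim=True)     # channel-wise max map
        scale = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * scale


class CBAM(nn.Module):
    """Channel attention followed by spatial attention, as in the CBAM paper."""

    def __init__(self, channels: int, reduction: int = 16, kernel_size: int = 7):
        super().__init__()
        self.channel = ChannelAttention(channels, reduction)
        self.spatial = SpatialAttention(kernel_size)

    def forward(self, x):
        return self.spatial(self.channel(x))


class ConvTokenizer(nn.Module):
    """Illustrative CCT-style tokenizer: conv -> CBAM -> ReLU -> max-pool,
    then flatten the spatial grid into a token sequence."""

    def __init__(self, in_channels: int = 3, embed_dim: int = 128):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(in_channels, embed_dim, kernel_size=3, stride=1, padding=1),
            CBAM(embed_dim),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2, padding=1),
        )

    def forward(self, x):
        # (B, C, H, W) -> (B, H'*W', embed_dim): each spatial position
        # after pooling becomes one token for the transformer encoder.
        return self.block(x).flatten(2).transpose(1, 2)
```

Because the tokens come from a convolution rather than fixed patch slicing, the transformer encoder downstream needs no class token and can use sequence pooling, which is what makes CCT compact.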

Installation

git clone https://github.com/Cloud-Tech-AI/cct.git
cd cct
pip install .

Usage

import torch

from cct import CCT

# Build a CCT-2 variant with CBAM enabled in the convolutional tokenizer
model = CCT(
    model_name='cct_2',
    tokenizer_config={'cbam': True}
)
img = torch.randn(1, 3, 224, 224)  # batch of one RGB 224x224 image
output = model(img)                # class logits for the batch

References

  • The official implementation of CCT here
  • The official implementation of CBAM here

Citation

@article{DBLP:journals/corr/abs-2104-05704,
  author  = {Ali Hassani and Steven Walton and Nikhil Shah and Abulikemu Abuduweili and Jiachen Li and Humphrey Shi},
  title   = {Escaping the Big Data Paradigm with Compact Transformers},
  journal = {CoRR},
  year    = {2021},
  url     = {https://arxiv.org/abs/2104.05704},
}
@article{DBLP:journals/corr/abs-1807-06521,
  author  = {Sanghyun Woo and Jongchan Park and Joon{-}Young Lee and In So Kweon},
  title   = {{CBAM:} Convolutional Block Attention Module},
  journal = {CoRR},
  year    = {2018},
  url     = {http://arxiv.org/abs/1807.06521},
}
