
Exploring the Application of Attention Mechanisms in Conjunction with Baseline Models on the COVID-19-CT Dataset

License: MIT


Team

  • Chia-Chun Hsu (d4nh5u)
    • Department of BioMedical Engineering (BME)
    • National Cheng Kung University (NCKU)
    • Feel free to reach out → Email
  • Chao-Chun Cheng
    • Department of Computer Science and Information Engineering (CSIE)
    • National Cheng Kung University (NCKU)

Abstract

This project investigates the efficacy of attention mechanisms integrated with baseline models for analyzing the COVID-19-CT dataset. Our team explored several attention mechanisms, namely SE (Squeeze-and-Excitation), CBAM (Convolutional Block Attention Module), and ECA (Efficient Channel Attention), and compared their impact on baseline models such as ResNet-50, ResNeXt-50, DenseNet-121, DenseNet-169, and ConvNeXt-Tiny. The study involved rigorous evaluation with metrics including accuracy, precision, recall, and F1-score, along with Grad-CAM for visualization. Our findings contribute valuable insights into the application of attention mechanisms in medical image analysis, specifically for COVID-19 CT scans.
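To make the channel-attention idea concrete, here is a minimal NumPy sketch of the SE (Squeeze-and-Excitation) operation. The function name and the explicit weight arguments (`w1`, `w2`) are illustrative only; the actual experiments would use learned layers inside a deep-learning framework such as PyTorch, not hand-passed matrices.

```python
import numpy as np

def squeeze_excite(x, w1, w2):
    """Illustrative Squeeze-and-Excitation gate on one feature map.

    x  : (C, H, W) feature map
    w1 : (C // r, C) first FC layer (channel reduction by ratio r)
    w2 : (C, C // r) second FC layer (channel expansion back to C)
    """
    # Squeeze: global average pooling over the spatial dimensions
    z = x.mean(axis=(1, 2))                 # (C,)
    # Excitation: bottleneck MLP, ReLU then sigmoid gating
    s = np.maximum(w1 @ z, 0.0)             # (C // r,)
    g = 1.0 / (1.0 + np.exp(-(w2 @ s)))     # (C,) channel weights in (0, 1)
    # Recalibrate: rescale each channel of the input by its weight
    return x * g[:, None, None]
```

CBAM extends this idea with a spatial-attention branch, and ECA replaces the bottleneck MLP with a lightweight 1-D convolution across channels; the squeeze/gate/rescale pattern above is the common core.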

📣 Experimental results and in-depth analysis will be presented in a formal paper in the future. 📣
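For readers unfamiliar with the visualization step mentioned above, the following is a minimal NumPy sketch of how a Grad-CAM heatmap is typically computed. It assumes the activations and gradients of the target convolutional layer have already been extracted (in practice via framework hooks); the function name is illustrative, not the project's actual code.

```python
import numpy as np

def grad_cam(activations, gradients):
    """Compute a Grad-CAM heatmap from one conv layer's outputs.

    activations : (C, H, W) forward activations of the target layer
    gradients   : (C, H, W) gradients of the class score w.r.t. those activations
    """
    # Channel importance: global average pooling of the gradients
    weights = gradients.mean(axis=(1, 2))               # (C,)
    # Weighted sum of activation maps across channels
    cam = np.tensordot(weights, activations, axes=1)    # (H, W)
    # ReLU: keep only features with positive influence on the class
    cam = np.maximum(cam, 0.0)
    # Normalize to [0, 1] for overlaying on the CT slice
    if cam.max() > 0:
        cam = cam / cam.max()
    return cam
```

The resulting (H, W) map is upsampled to the input resolution and overlaid on the CT slice to show which regions drove the prediction.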

Acknowledgements

This project makes use of the COVID-19-CT dataset, generously made available by UCSD-AI4H on GitHub. We extend our sincere gratitude to UCSD-AI4H for their hard work and contribution to the open-source community. 🙇🙇

License

This project is licensed under the MIT License - see the LICENSE file for details.

Copyright (c) 2024 d4nh5u (Chia-Chun Hsu)