SegFormer-tf

This repository contains a TensorFlow implementation of the paper "SegFormer: Simple and Efficient Design for Semantic Segmentation with Transformers".

SegFormer is a Transformer-based framework for semantic segmentation that unifies Transformers with lightweight multilayer perceptron (MLP) decoders.
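
The all-MLP decoder idea can be sketched in a few lines of TensorFlow. The snippet below is only an illustration, not this repository's exact code (the helper name, embed_dim value, and fusion details are assumptions): each multi-scale feature map from the hierarchical encoder is linearly projected to a common width, upsampled to the largest feature resolution, concatenated, fused, and mapped to per-pixel class logits.

import tensorflow as tf
from tensorflow.keras import layers

def all_mlp_decode_head(features, num_classes, embed_dim=256):
    # features: list of encoder outputs, e.g. [(B, H/4, W/4, C1), ..., (B, H/32, W/32, C4)]
    target_size = tf.shape(features[0])[1:3]           # spatial size of the largest map
    projected = []
    for feat in features:
        x = layers.Dense(embed_dim)(feat)              # per-pixel linear (MLP) projection
        x = tf.image.resize(x, target_size, method="bilinear")
        projected.append(x)
    x = layers.Concatenate(axis=-1)(projected)         # merge all scales
    x = layers.Conv2D(embed_dim, 1, use_bias=False)(x) # fuse the concatenated features
    x = layers.BatchNormalization()(x)
    x = layers.Activation("relu")(x)
    return layers.Conv2D(num_classes, 1)(x)            # per-pixel class logits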

Model Architecture:

Detailed overview of MiT (Mix Transformer encoder):

Usage:

Clone the GitHub repo:

$ git clone https://github.com/IMvision12/SegFormer-tf
$ cd SegFormer-tf

Then import and instantiate the model:

import tensorflow as tf
from models import SegFormer_B3
model = SegFormer_B3(input_shape=(224, 224, 3), num_classes=19)
model.summary()
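
As a quick sanity check (illustrative only; it assumes the model returns raw logits and reads the output spatial size from the prediction, since it may differ from the input size), you can run a forward pass on random data and fit the model on matching random labels:

import numpy as np

images = np.random.rand(2, 224, 224, 3).astype("float32")
logits = model(images, training=False)
print(logits.shape)  # (2, out_h, out_w, 19)

model.compile(
    optimizer=tf.keras.optimizers.Adam(1e-4),
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)
labels = np.random.randint(0, 19, size=(2, logits.shape[1], logits.shape[2])).astype("int32")
model.fit(images, labels, epochs=1, batch_size=2)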

References

[1] SegFormer paper: https://arxiv.org/pdf/2105.15203

[2] Official SegFormer Repo: https://github.com/NVlabs/SegFormer
