Exchanger4SITS: Rethinking the Encoding of Satellite Image Time Series

The official code repository for the paper "Rethinking the Encoding of Satellite Image Time Series".

arXiv GitHub Stars GitHub Forks

News

  • The preprint is under review.
  • The codebase is still under construction and is therefore subject to further modifications.
  • The paper has been accepted to BMVC 2023 as an oral presentation.
  • The model weights have been made available on Zenodo.
  • The slides, poster, and accompanying video will be released after BMVC on a separate project page.
  • I have been focusing on expanding this work into a journal paper, so the code remains subject to further modifications.

Schematic Overview of Collect–Update–Distribute

schematic illustration
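The figure above summarises the Collect–Update–Distribute message-passing pattern built around a small set of learnable exchange tokens. As a rough orientation for readers browsing the code, here is a minimal PyTorch-style sketch of that general pattern; it is not the exact layer implemented in this repository, and the class name `CollectUpdateDistribute`, the token count, and the choice of `nn.MultiheadAttention`/`nn.TransformerEncoderLayer` blocks are illustrative assumptions.

```python
# Minimal, illustrative sketch of a Collect--Update--Distribute step with
# learnable exchange tokens. This is NOT the exact layer used in the paper,
# only a simplified approximation of the general pattern.
import torch
import torch.nn as nn


class CollectUpdateDistribute(nn.Module):
    def __init__(self, dim: int = 128, num_tokens: int = 16, num_heads: int = 4):
        super().__init__()
        # A small set of learnable "exchange" tokens shared across all inputs.
        self.exchange_tokens = nn.Parameter(torch.randn(1, num_tokens, dim))
        # Collect: exchange tokens attend to the input sequence (cross-attention).
        self.collect = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        # Update: exchange tokens interact with each other (self-attention + MLP).
        self.update = nn.TransformerEncoderLayer(dim, num_heads, dim_feedforward=4 * dim, batch_first=True)
        # Distribute: the input sequence attends back to the updated tokens.
        self.distribute = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim), e.g. per-pixel temporal features of a SITS sample.
        tokens = self.exchange_tokens.expand(x.size(0), -1, -1)
        tokens = tokens + self.collect(tokens, x, x, need_weights=False)[0]   # Collect
        tokens = self.update(tokens)                                          # Update
        x = x + self.distribute(x, tokens, tokens, need_weights=False)[0]     # Distribute
        return x


if __name__ == "__main__":
    feats = torch.randn(2, 30, 128)  # 2 samples, 30 time steps, 128 channels
    print(CollectUpdateDistribute()(feats).shape)  # torch.Size([2, 30, 128])
```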

Qualitative Results from Exchanger+Mask2Former on PASTIS

qualitative results

New SOTA Results on PASTIS Benchmark Dataset

PASTIS - Semantic Segmentation

PWC

| Model Name             | mIoU | #Params (M) | FLOPs |
|------------------------|------|-------------|-------|
| U-TAE                  | 63.1 | 1.09        | 47G   |
| TSViT                  | 65.4 | 2.16        | 558G  |
| Exchanger+Unet         | 66.8 | 8.08        | 300G  |
| Exchanger+Mask2Former  | 67.9 | 24.59       | 329G  |

PASTIS - Panoptic Segmentation

PWC

| Model Name             | SQ   | RQ   | PQ   | #Params (M) | FLOPs |
|------------------------|------|------|------|-------------|-------|
| UConvLSTM+PaPs         | 80.2 | 43.9 | 35.6 | 2.50        | 55G   |
| U-TAE+PaPs             | 81.5 | 53.2 | 43.8 | 1.26        | 47G   |
| Exchanger+Unet+PaPs    | 80.3 | 58.9 | 47.8 | 9.99        | 301G  |
| Exchanger+Mask2Former  | 84.6 | 61.6 | 52.6 | 24.63       | 332G  |

License

License: MIT

Notes

  • The panoptic segmentation model Exchanger+Mask2Former was trained by splitting the input into four 64x64 patches and stitching the prediction results back together. I later found that this trick is crucial for reproducing the reported results (see the sketch below).
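The following is a minimal sketch of that tiling trick, assuming a model that maps a (B, T, C, 64, 64) window of a satellite image time series to dense per-pixel logits; the function name `tiled_predict` and the model interface are assumptions for illustration, and the repository's actual training/inference code may differ.

```python
# Hedged sketch (not the repository's inference script): split a 128x128 PASTIS
# tile into four 64x64 patches, run the model on each patch, and stitch the
# dense predictions back into a full-resolution output.
import torch


@torch.no_grad()
def tiled_predict(model, x: torch.Tensor, patch: int = 64) -> torch.Tensor:
    """x: (B, T, C, 128, 128) image time series; returns stitched (B, K, 128, 128) logits."""
    B, T, C, H, W = x.shape
    out = None
    for i in range(0, H, patch):
        for j in range(0, W, patch):
            logits = model(x[..., i:i + patch, j:j + patch])  # (B, K, patch, patch)
            if out is None:
                out = x.new_zeros(B, logits.size(1), H, W)
            out[..., i:i + patch, j:j + patch] = logits       # write patch back in place
    return out
```

For a 128x128 PASTIS tile this amounts to four 64x64 forward passes whose outputs are written back into the full-resolution prediction.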

Citation

If you find our work or code useful in your research, please consider citing the following BibTeX entry:

@article{cai2023rethinking,
  title={Rethinking the Encoding of Satellite Image Time Series},
  author={Cai, Xin and Bi, Yaxin and Nicholl, Peter and Sterritt, Roy},
  journal={arXiv preprint arXiv:2305.02086},
  year={2023}
}

Acknowledgements

The codebase is built upon the following great work:

I would like to thank Zenodo for hosting the model weights, and I appreciate the constructive and insightful comments from the BMVC reviewers.