StrongSORT with OSNet for YOLOv7

This repository is a modified fork of StrongSORT-YOLO.

Introduction

This repository contains a highly configurable two-stage tracker that adjusts to different deployment scenarios. The detections generated by YOLOv7 are passed to StrongSORT, which combines motion and appearance information based on OSNet in order to track the objects. It can track any object class that your YOLOv7 model was trained to detect. The tracker uses a forked version of YOLOv7, since the original repository is no longer maintained.
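
The overall flow is a per-frame detect-then-track loop: YOLOv7 produces detections, StrongSORT associates them across frames. The sketch below only illustrates that two-stage structure; detect_objects and StrongSortTracker are hypothetical placeholders, not this repository's actual API (track.py is the real entry point).

# Illustrative two-stage loop. Placeholder names, not this repo's real API.
import cv2

def detect_objects(frame):
    # Placeholder for YOLOv7 inference; would return [x1, y1, x2, y2, conf, cls] rows.
    return []

class StrongSortTracker:
    # Placeholder for StrongSORT: fuses motion (Kalman filter) with
    # OSNet appearance embeddings to associate detections across frames.
    def update(self, detections, frame):
        return []  # would return [x1, y1, x2, y2, track_id, cls] rows

tracker = StrongSortTracker()
cap = cv2.VideoCapture("test.mp4")
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    detections = detect_objects(frame)          # stage 1: detection
    tracks = tracker.update(detections, frame)  # stage 2: association / tracking
cap.release()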

Before you run the tracker

  1. Clone the repository recursively:

git clone --recurse-submodules https://github.com/nelioasousa/strongsort-yolo.git

If you already cloned the repository and forgot to use --recurse-submodules, you can run git submodule update --init

  2. Make sure that you fulfill all the requirements: Python 3.8 or 3.9; torch (>=1.7.0, !=1.12.0) and torchvision (>=0.8.1, !=0.13.0) with a compatible CUDA driver; and all the remaining dependencies. Check requirements.txt for more information (see the install sketch below).
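
One way to satisfy the torch constraints with pip, assuming a PyTorch build that matches your CUDA driver (adjust the pins to your setup):

$ pip install "torch>=1.7.0,!=1.12.0" "torchvision>=0.8.1,!=0.13.0"
$ pip install -r requirements.txt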

Tracking sources

Tracking can be run on any video that OpenCV's VideoCapture class can open. Image sequences are also supported, as long as the individual images can be read with cv2.imread().
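
If you want to check whether a given source will be readable before launching the tracker, you can probe it with OpenCV directly; the standalone helper below is just a convenience sketch, not part of track.py.

# Quick compatibility probe for a tracking source using OpenCV.
import cv2

def is_readable(source):
    """Return True if `source` opens as a video or as a single image."""
    cap = cv2.VideoCapture(source)
    if cap.isOpened() and cap.read()[0]:
        cap.release()
        return True
    cap.release()
    return cv2.imread(source) is not None

print(is_readable("test.mp4"))           # video file
print(is_readable("frames/0001.jpg"))    # one image from a sequence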

Select YOLOv7 object detector checkpoint

There is a clear trade-off between model inference speed and accuracy.

$ python track.py ... --yolo-weights weights/yolov7-tiny.pt --img 640
                                             yolov7.pt            1280
                                             yolov7x.pt           ...
                                             yolov7-w6.pt
                                             yolov7-e6.pt
                                             yolov7-d6.pt
                                             yolov7-e6e.pt
                                             ...

Filter tracked classes

By default, the tracker tracks all classes. To track only a subset of the classes, add the corresponding class indices after the --classes flag. The indexing is zero-based.

python track.py ... --classes 16 17  # tracks only the classes with indices 16 and 17
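
The indices follow the order of the class names stored in your YOLOv7 checkpoint. A rough way to list them is sketched below; it assumes the forked YOLOv7 code is importable (so torch can unpickle the model) and that the checkpoint stores the full model under a 'model' key, which is the usual YOLOv7 layout.

# List the class indices and names of a YOLOv7 checkpoint.
# Assumes the YOLOv7 repository is on PYTHONPATH and the usual checkpoint layout.
import torch

ckpt = torch.load("weights/yolov7.pt", map_location="cpu")
model = ckpt["model"] if isinstance(ckpt, dict) else ckpt
for idx, name in enumerate(model.names):
    print(idx, name)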

Draw Objects Trajectories

To draw the trajectory lines of the tracked objects in the output video, pass both the --save-vid and --draw-trajectory flags. Without --draw-trajectory, the output video will contain only the tracking bounding boxes. Customize the bounding box labels with --hide-labels, --hide-conf and --hide-class.

$ python track.py --source test.mp4 --yolo-weights weights/*.pt --save-vid --draw-trajectory
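
Under the hood, drawing a trajectory amounts to remembering each track's past box centers and connecting them with a polyline on every frame. The snippet below is a generic illustration of that idea with OpenCV, not the exact routine used by track.py.

# Generic trajectory drawing: accumulate each track's box centers and
# connect them with a polyline. Illustration only.
from collections import defaultdict
import cv2
import numpy as np

history = defaultdict(list)  # track_id -> list of (x, y) centers

def draw_trajectories(frame, tracks, max_len=50):
    # tracks: iterable of (x1, y1, x2, y2, track_id) tuples
    for x1, y1, x2, y2, track_id in tracks:
        center = (int((x1 + x2) / 2), int((y1 + y2) / 2))
        history[track_id].append(center)
        pts = np.array(history[track_id][-max_len:], dtype=np.int32)
        cv2.polylines(frame, [pts], isClosed=False, color=(0, 255, 0), thickness=2)
    return frame

# Example on a blank frame with one fake track:
frame = np.zeros((480, 640, 3), dtype=np.uint8)
draw_trajectories(frame, [(100, 100, 150, 160, 1)])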

MOT compliant results

MOT compliant results can be saved to project/name/labels/example.txt by passing both the --save-txt and --mot-format flags.

python track.py ... --source example.mp4 --project results --name exp1 --save-txt --mot-format

The above command will save the MOT compliant annotations to results/exp1/labels/example.txt.
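
Each line of the file holds one object for one frame as comma-separated values. The reader below assumes the standard MOTChallenge column order (frame, id, bb_left, bb_top, bb_width, bb_height, conf, x, y, z); double-check it against a generated file before relying on it.

# Read MOT-style tracking results, one comma-separated row per object per frame.
# Assumed column order: frame, track_id, bb_left, bb_top, bb_width, bb_height, conf, x, y, z
import csv

with open("results/exp1/labels/example.txt", newline="") as f:
    for row in csv.reader(f):
        frame, track_id = int(row[0]), int(row[1])
        left, top, width, height = map(float, row[2:6])
        print(frame, track_id, left, top, width, height)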

Thanks to

This project was only possible thanks to the efforts of the following:

@article{wang2022yolov7,
  title={{YOLOv7}: Trainable bag-of-freebies sets new state-of-the-art for real-time object detectors},
  author={Wang, Chien-Yao and Bochkovskiy, Alexey and Liao, Hong-Yuan Mark},
  journal={arXiv preprint arXiv:2207.02696},
  year={2022}
}

@article{du2023strongsort,
  title={{StrongSORT}: Make {DeepSORT} Great Again},
  author={Du, Yunhao and Zhao, Zhicheng and Song, Yang and Zhao, Yanyun and Su, Fei and Gong, Tao and Meng, Hongying},
  journal={IEEE Transactions on Multimedia},
  year={2023},
  publisher={IEEE}
}

@article{torchreid,
  title={Torchreid: A Library for Deep Learning Person Re-Identification in Pytorch},
  author={Zhou, Kaiyang and Xiang, Tao},
  journal={arXiv preprint arXiv:1910.10093},
  year={2019}
}

@misc{luiten2020trackeval,
  author={Jonathon Luiten and Arne Hoffhues},
  title={TrackEval},
  howpublished={https://github.com/JonathonLuiten/TrackEval},
  year={2020}
}
