
# A Unified Animal Perception Model via Few-shot Learning

## Setup

1. Download the datasets and arrange them in the following directory structure:
   ```
   <Root>
   |--<AnimalKingdom>
   |   |--<animal1>_<rgb>
   |   |   ...
   |   |--<animal2>_<label>
   |   |   ...
   |
   |--<APT-36K>
   |   |--<animal1>_<rgb>
   |   |   ...
   |   |--<animal2>_<label>
   |   |   ...
   |
   |--<AnimalPose>
   |   |--<animal1>_<rgb>
   |   |   ...
   |   |--<animal2>_<label>
   |   |   ...
   |
   |--<Oxford-IIITPet>
   |   |--<animal1>_<rgb>
   |   |   ...
   |   |--<animal2>_<label>
   |   |   ...
   |
   |...
   ```
2. Create a `data_paths.yaml` file and write the root directory path (`<Root>` in the structure above) as `UniASET: PATH_TO_YOUR_UniASET`.

3. Install the prerequisites with `pip install -r requirements.txt`.

4. Create a `model/pretrained_checkpoints` directory and download the BEiT pre-trained checkpoint into it.

   - We used the `beit_base_patch16_224_pt22k` checkpoint for our experiments.
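Steps 2 and 4 can be sketched in a few lines of Python. This is only an illustration: `/data/UniASET` is a placeholder for your own dataset root, and the checkpoint itself must still be downloaded from the official BEiT release.

```python
# Sketch of setup steps 2 and 4; /data/UniASET is a placeholder dataset root.
import os

# Step 4: create the directory the code expects for pre-trained checkpoints.
os.makedirs("model/pretrained_checkpoints", exist_ok=True)

# Step 2: point data_paths.yaml at the dataset root (<Root> in the tree above).
with open("data_paths.yaml", "w") as f:
    f.write("UniASET: /data/UniASET\n")

# Then download beit_base_patch16_224_pt22k from the official BEiT release and
# place it in model/pretrained_checkpoints/.
```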

## Usage

### Training

```shell
python main.py --stage 0 --task_id [0/1/2/3]
```

- To train universally on all tasks, set `task_id=3`.
- To train on a specific task, use `task_id=0` for pose estimation, `task_id=1` for semantic segmentation, or `task_id=2` for classification.
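The `task_id` semantics above can be captured in a small helper. This is a hypothetical sketch built only from the flags documented here, not code from the repository; `main.py`'s actual argument handling may differ.

```python
# Map each task_id to its task, per the training flags documented above.
TASKS = {
    0: "pose estimation",
    1: "semantic segmentation",
    2: "classification",
    3: "all tasks (universal training)",
}

def training_command(task_id):
    """Build the stage-0 training invocation for a given task_id."""
    if task_id not in TASKS:
        raise ValueError(f"task_id must be one of {sorted(TASKS)}")
    return ["python", "main.py", "--stage", "0", "--task_id", str(task_id)]
```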

### Fine-tuning

```shell
python main.py --stage 1 --task [kp/mask/cls]
```

- To fine-tune on a specific task, use `task=kp` for pose estimation, `task=mask` for semantic segmentation, or `task=cls` for classification.

### Evaluation

```shell
python main.py --stage 2 --task [kp/mask/cls]
```

- To evaluate on a specific task, use `task=kp` for pose estimation, `task=mask` for semantic segmentation, or `task=cls` for classification.
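Fine-tuning (stage 1) and evaluation (stage 2) share the same task codes, so the two invocations can be sketched with one hypothetical helper (again, an illustration of the documented flags, not repository code):

```python
# Task codes shared by fine-tuning (stage 1) and evaluation (stage 2).
TASK_CODES = {"kp": "pose estimation", "mask": "semantic segmentation", "cls": "classification"}

def stage_command(stage, task):
    """Build the main.py invocation; stage 1 fine-tunes, stage 2 evaluates."""
    if stage not in (1, 2) or task not in TASK_CODES:
        raise ValueError("stage must be 1 or 2 and task one of kp/mask/cls")
    return ["python", "main.py", "--stage", str(stage), "--task", task]
```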

## Acknowledgements

Our code refers to the following repositories: