The official implementation of the CVPR 2025 paper [Learning Occlusion-Robust Vision Transformers for Real-Time UAV Tracking].
Models & Raw Results: Baidu Drive | Google Drive
Create and activate a conda environment:
conda create -n ORTrack python=3.8
conda activate ORTrack
Install the required packages:
pip install -r requirements.txt
Put the tracking datasets in ./data. It should look like:
${PROJECT_ROOT}
 -- data
     -- lasot
         |-- airplane
         |-- basketball
         |-- bear
         ...
     -- got10k
         |-- test
         |-- train
         |-- val
     -- coco
         |-- annotations
         |-- images
     -- trackingnet
         |-- TRAIN_0
         |-- TRAIN_1
         ...
         |-- TRAIN_11
         |-- TEST
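If the datasets already live elsewhere on disk, symlinking them into ./data avoids copying. A minimal sketch, where the source paths are placeholders you replace with your actual dataset locations:

```python
import os

# Placeholder source paths -- point these at where your datasets actually live.
links = {
    "/path/to/lasot": "data/lasot",
    "/path/to/got10k": "data/got10k",
    "/path/to/coco": "data/coco",
    "/path/to/trackingnet": "data/trackingnet",
}

os.makedirs("data", exist_ok=True)
for src, dst in links.items():
    if not os.path.exists(dst):
        os.symlink(src, dst)  # link each dataset folder into ./data
```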
Run the following command to set paths:
cd <PATH_of_ORTrack>
python tracking/create_default_local_file.py --workspace_dir . --data_dir ./data --save_dir ./output
You can also modify the paths by editing these two files:
./lib/train/admin/local.py # paths for training
./lib/test/evaluation/local.py # paths for testing
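For reference, the generated files follow the usual PyTracking/OSTrack pattern of a single settings object whose attributes are absolute paths. A rough sketch of what lib/test/evaluation/local.py may look like after running the script above; the exact attribute names depend on the generated file, so treat the ones below as assumptions and check your own copy:

```python
from lib.test.evaluation.environment import EnvSettings

def local_env_settings():
    settings = EnvSettings()
    # Illustrative values -- the script fills in your actual workspace/data/output dirs.
    settings.prj_dir = '/path/to/ORTrack'
    settings.save_dir = '/path/to/ORTrack/output'
    settings.results_path = '/path/to/ORTrack/output/test/tracking_results'
    settings.got10k_path = '/path/to/ORTrack/data/got10k'
    settings.lasot_path = '/path/to/ORTrack/data/lasot'
    settings.trackingnet_path = '/path/to/ORTrack/data/trackingnet'
    return settings
```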
Download the pre-trained DeiT-Tiny weights, EVA02-Tiny weights, and ViT-Tiny weights, and put them under `$USER_ROOT$/.cache/torch/hub/checkpoints/`.
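One convenient way to fetch a backbone checkpoint into that cache directory is torch.hub, which saves downloads to ~/.cache/torch/hub/checkpoints/ by default. A sketch for DeiT-Tiny (the URL is the public facebookresearch/deit release and may need adjusting if it has moved; the EVA02-Tiny and ViT-Tiny checkpoints can be fetched the same way from their respective releases or via timm):

```python
import torch

# Saves the file to ~/.cache/torch/hub/checkpoints/ by default.
torch.hub.load_state_dict_from_url(
    "https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth",
    map_location="cpu",
)
```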
# Training ORTrack-DeiT
python tracking/train.py --script ortrack --config deit_tiny_patch16_224 --save_dir ./output --mode single
# Training ORTrack-D-DeiT
# You need to download the model weights of ORTrack-DeiT and place them under the directory $PROJECT_ROOT$/teacher_model/deit_tiny_patch16_224.
python tracking/train.py --script ortrack --config deit_tiny_distilled_patch16_224 --save_dir ./output --mode single
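A small sketch for staging the teacher weights before the distilled run, assuming a checkpoint named ORTrack_ep0300.pth.tar from the ORTrack-DeiT training; the file name is an assumption, so use whatever your run or the released weights actually provide:

```python
import shutil
from pathlib import Path

# Assumed checkpoint name -- replace with the actual file from your ORTrack-DeiT run.
src = Path("output/checkpoints/train/ortrack/deit_tiny_patch16_224/ORTrack_ep0300.pth.tar")
dst_dir = Path("teacher_model/deit_tiny_patch16_224")
dst_dir.mkdir(parents=True, exist_ok=True)
shutil.copy(src, dst_dir / src.name)  # copy the teacher checkpoint into place
```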
Download the model weights from Google Drive or BaiduNetDisk
Put the downloaded weights under <PATH_of_ORTrack>/output/checkpoints/train/ortrack/deit_tiny_patch16_224
Change the corresponding values in lib/test/evaluation/local.py to the actual paths where the benchmarks are saved.
Testing examples:
- VisDrone2018 or other offline-evaluated benchmarks (modify `--dataset` accordingly):
python tracking/test.py ortrack deit_tiny_patch16_224 --dataset visdrone2018 --threads 4 --num_gpus 1
python tracking/analysis_results.py # need to modify tracker configs and names; see the sketch after these examples
- BioDrone
python tracking/test.py ortrack deit_tiny_patch16_224 --dataset biodrone --threads 4 --num_gpus 1
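For the offline-evaluated benchmarks, tracking/analysis_results.py follows the usual OSTrack/PyTracking pattern of registering trackers with trackerlist and printing metrics over a dataset. A sketch of the kind of edit it needs, assuming those OSTrack-style helpers are present in this repo:

```python
from lib.test.analysis.plot_results import print_results
from lib.test.evaluation import get_dataset, trackerlist

dataset_name = 'visdrone2018'
trackers = []
# Register the tracker/config pair you just evaluated; display_name is free-form.
trackers.extend(trackerlist(name='ortrack', parameter_name='deit_tiny_patch16_224',
                            dataset_name=dataset_name, run_ids=None,
                            display_name='ORTrack-DeiT'))

dataset = get_dataset(dataset_name)
print_results(trackers, dataset, dataset_name, merge_results=True,
              plot_types=('success', 'norm_prec', 'prec'))
```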
# Profiling ORTrack-DeiT
python tracking/profile_model.py --script ortrack --config deit_tiny_patch16_224
- This repo is based on OSTrack and the PyTracking library, which are excellent works that helped us quickly implement our ideas.
- We use the implementations of DeiT, EVA02, and ViT from the timm repo.
If our work is useful for your research, please consider citing:
@inproceedings{wu2025ortrack,
title={Learning Occlusion-Robust Vision Transformers for Real-Time UAV Tracking},
author={Wu, You and Wang, Xucheng and Yang, Xiangyang and Liu, Mengyuan and Zeng, Dan and Ye, Hengzhou and Li, Shuiwang},
booktitle={CVPR},
year={2025}
}