Ruijie Zhu1,2,
Mulin Yu2,
Linning Xu3,
Lihan Jiang1,2,
Yixuan Li3,
Tianzhu Zhang1,
Jiangmiao Pang2,
Bo Dai4
1 USTC
2 Shanghai AI Lab
3 CUHK
4 HKU
ICCV 2025
The overall architecture of ObjectGS. We first use a 2D segmentation pipeline to assign object IDs and lift them to 3D. We then initialize anchors and use them to generate object-aware neural Gaussians. To provide semantic guidance, we model the Gaussian semantics and construct classification-based constraints. As a result, our method enables both object-level and scene-level reconstruction.
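The classification-based semantic constraint can be illustrated with a minimal sketch. Assuming the rasterizer alpha-blends a per-object ID map (random tensors stand in for renderer output here; function and variable names are hypothetical, not the repo's API), the rendered maps are treated as logits and supervised with cross-entropy against the 2D object-ID masks:

```python
import torch
import torch.nn.functional as F

def semantic_classification_loss(rendered_id_maps, gt_id_mask, num_objects):
    """Sketch of a classification-based semantic constraint.

    rendered_id_maps: (num_objects, H, W) alpha-blended per-object contributions,
                      treated here as logits over object IDs.
    gt_id_mask:       (H, W) integer object-ID mask from the 2D segmentation pipeline.
    """
    logits = rendered_id_maps.permute(1, 2, 0).reshape(-1, num_objects)  # (H*W, K)
    target = gt_id_mask.reshape(-1).long()                               # (H*W,)
    return F.cross_entropy(logits, target)

# Toy usage with random tensors (shapes only; not real renderer output).
if __name__ == "__main__":
    K, H, W = 8, 64, 64
    loss = semantic_classification_loss(torch.randn(K, H, W),
                                        torch.randint(0, K, (H, W)), K)
    print(loss.item())
```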
To train ObjectGS, first download the following datasets:
Or you can use our processed subsets: Google Drive
We organize the datasets as follows:
├── data
│   ├── 3dovs
│   │   ├── bed
│   │   ├── bench
│   │   ├── ...
│   ├── lerf_mask
│   │   ├── figurines
│   │   ├── ramen
│   │   ├── teatime
│   ├── replica
│   │   ├── office_0
│   │   ├── office_1
│   │   ├── ...
│   └── scannetpp_ovs
│       ├── 09bced689e
│       ├── 0d2ee665be
│       └── ...
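As a quick sanity check, a small hypothetical helper can verify this layout before training (the scene names below mirror the tree above; adjust them to the scenes you actually downloaded):

```python
from pathlib import Path

# Expected benchmark -> scene folders under ./data (subset of the tree above).
EXPECTED = {
    "3dovs": ["bed", "bench"],
    "lerf_mask": ["figurines", "ramen", "teatime"],
    "replica": ["office_0", "office_1"],
    "scannetpp_ovs": ["09bced689e", "0d2ee665be"],
}

def check_layout(root: str = "data") -> None:
    """Print which expected scene folders are present or missing."""
    for benchmark, scenes in EXPECTED.items():
        for scene in scenes:
            scene_dir = Path(root) / benchmark / scene
            status = "ok" if scene_dir.is_dir() else "MISSING"
            print(f"{scene_dir}: {status}")

if __name__ == "__main__":
    check_layout()
```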
- Clone this repo:
git clone git@github.com:RuijieZhu94/ObjectGS.git --recursive
- Install dependencies:
cd ObjectGS
conda env create --file environment.yml
# data preprocessing: prepare the initial point cloud
python ply_preprocessing.py
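The script name suggests it prepares the point cloud by lifting the 2D object IDs onto the initial 3D points. As an illustration only (not the repo's exact implementation), a majority-vote assignment over the views in which each point is visible might look like:

```python
import numpy as np

def vote_point_ids(point_votes, num_objects):
    """Toy majority vote: point_votes[i] holds the object IDs observed for
    3D point i across the views in which it is visible."""
    point_ids = np.zeros(len(point_votes), dtype=np.int64)
    for i, votes in enumerate(point_votes):
        if len(votes) == 0:
            point_ids[i] = 0  # background / unlabeled
        else:
            point_ids[i] = np.bincount(votes, minlength=num_objects).argmax()
    return point_ids

# Toy usage: three points seen in a few views each.
print(vote_point_ids([[2, 2, 1], [0], []], num_objects=4))  # -> [2 0 0]
```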
# 3dgs version
bash train_3d.sh /path/to/your/dataset
# 2dgs version
bash train_2d.sh /path/to/your/dataset
# rendering
bash render.sh path/to/your/training/folder
# visualization
python vis_ply.py
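For inspecting the exported PLY, mapping object IDs to distinct colors is a handy trick; the sketch below is hypothetical and not the code in vis_ply.py:

```python
import numpy as np

def id_to_color(object_ids, num_objects=256, seed=0):
    """Map integer object IDs to distinct RGB colors for visualization."""
    rng = np.random.default_rng(seed)
    palette = rng.integers(0, 256, size=(num_objects, 3), dtype=np.uint8)
    palette[0] = np.array([128, 128, 128], dtype=np.uint8)  # neutral color for ID 0
    return palette[np.asarray(object_ids) % num_objects]

# Toy usage: four points with three distinct IDs.
print(id_to_color([0, 3, 3, 17]))
```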
Please refer to Gaussian-Grouping for OVS evaluation.
# exporting scene mesh
python export_mesh.py -m path/to/your/training/folder
# exporting object mesh
python export_object_mesh.py -m path/to/your/training/folder --query_label_id -1
# query_label_id: specify an object ID (0~255); use -1 for all objects
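Conceptually, exporting a single object's mesh amounts to keeping only the primitives whose semantic ID matches query_label_id. A toy NumPy sketch (hypothetical names, not the exporter's code):

```python
import numpy as np

def select_by_label(positions, semantic_logits, query_label_id=-1):
    """Keep points whose argmax semantic ID matches query_label_id;
    -1 keeps everything, mirroring the flag of the script above."""
    if query_label_id == -1:
        return positions
    labels = semantic_logits.argmax(axis=1)
    return positions[labels == query_label_id]

# Toy usage: 4 points, 3 object classes.
pts = np.random.rand(4, 3)
logits = np.random.rand(4, 3)
print(select_by_label(pts, logits, query_label_id=1).shape)
```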
If you find our work useful, please cite:
@inproceedings{zhu2025objectgs,
title={ObjectGS: Object-aware Scene Reconstruction and Scene Understanding via Gaussian Splatting},
author={Zhu, Ruijie and Yu, Mulin and Xu, Linning and Jiang, Lihan and Li, Yixuan and Zhang, Tianzhu and Pang, Jiangmiao and Dai, Bo},
booktitle={International Conference on Computer Vision (ICCV)},
year={2025}
}
Our code is based on Scaffold-GS, HorizonGS, GSplat and Gaussian-Grouping. We thank the authors for their excellent work!