- Clone the repository and move into the top-level directory:

  ```bash
  cd SGM
  ```
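If the repository has not been cloned yet, the full sequence might look like the following; the repository URL below is a placeholder, so substitute the actual one.

```bash
# Placeholder URL -- replace with the actual SGM repository location.
git clone https://github.com/<user>/SGM.git
cd SGM
```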
- Create the conda environment.

  ```bash
  conda env create -f environment.yml
  ```

- Activate the environment.

  ```bash
  conda activate sgm
  ```
- We provide pre-trained models for sgm and frontier_prediction. For evaluation, download the sgm model to `$SGM_ROOT/pretrained_models/frontier.ckpt` and the frontier_prediction model to `$SGM_ROOT/pretrained_models/MAE-checkpoint-199.pth`.
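The checkpoint download links are not listed in this section. As a minimal sketch, assuming you have obtained the two links (the URLs below are placeholders, not real addresses), the files can be placed as follows:

```bash
mkdir -p $SGM_ROOT/pretrained_models
# Placeholder URLs -- replace with the actual download links for each checkpoint.
wget -O $SGM_ROOT/pretrained_models/frontier.ckpt <URL_FOR_FRONTIER_CKPT>
wget -O $SGM_ROOT/pretrained_models/MAE-checkpoint-199.pth <URL_FOR_MAE_CHECKPOINT>
```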
We use a modified version of the Gibson ObjectNav evaluation setup from SemExp.
- Download the Gibson ObjectNav dataset to `$SGM_ROOT/data/datasets/objectnav/gibson`.

  ```bash
  cd $SGM_ROOT/data/datasets/objectnav
  wget -O gibson_objectnav_episodes.tar.gz https://utexas.box.com/shared/static/tss7udt3ralioalb6eskj3z3spuvwz7v.gz
  tar -xvzf gibson_objectnav_episodes.tar.gz && rm gibson_objectnav_episodes.tar.gz
  ```
- Download the image segmentation model [URL] to `$SGM_ROOT/pretrained_models`.
- To visualize episodes with the semantic map and potential function predictions, add the arguments `--print_images 1 --num_pf_maps 3` in the evaluation script, as sketched below.
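As an illustration only (the actual evaluation command lives inside `experiment_scripts/gibson/eval_sgm.sh`; the entry point shown here is a placeholder, not a real file in the repository), the flags are appended like this:

```bash
# Hypothetical sketch: append the visualization flags to the command in eval_sgm.sh.
python <evaluation_entry_point> \
    --print_images 1 \
    --num_pf_maps 3
```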
The `data` folder should look like this:

```
data/
├── datasets/objectnav/gibson/v1.1/
│   ├── train/
│   │   ├── content/
│   │   ├── train_info.pbz2
│   │   └── train.json.gz
│   └── val/
│       ├── content/
│       ├── val_info.pbz2
│       └── val.json.gz
├── scene_datasets/
│   └── gibson_semantic/
│       ├── Allensville_semantic.ply
│       ├── Allensville.glb
│       ├── Allensville.ids
│       ├── Allensville.navmesh
│       ├── Allensville.scn
│       └── ...
└── semantic_maps/
    └── gibson/semantic_maps/
        ├── semmap_GT_info.json
        ├── Allensville_0.png
        ├── Allensville.h5
        └── ...
```
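Before running the evaluation, a quick spot check of the layout can save a failed run. This is a minimal sketch that only tests a few of the files listed above:

```bash
# Spot-check a few expected files from the layout above (run from $SGM_ROOT).
for p in \
    data/datasets/objectnav/gibson/v1.1/val/val.json.gz \
    data/scene_datasets/gibson_semantic/Allensville.glb \
    data/semantic_maps/gibson/semantic_maps/semmap_GT_info.json
do
    [ -e "$p" ] && echo "OK      $p" || echo "MISSING $p"
done
```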
- Run the evaluation script:

  ```bash
  sh experiment_scripts/gibson/eval_sgm.sh
  ```
- Download the training_gibson dataset to `$SGM_ROOT/SGM_train/mapdata/gibson`.
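The download link for the training data is not listed in this section. As a sketch, assuming you have the link (the URL and archive name below are placeholders), the data can be placed with:

```bash
mkdir -p $SGM_ROOT/SGM_train/mapdata
cd $SGM_ROOT/SGM_train/mapdata
# Placeholder URL and archive name -- replace with the actual training_gibson download.
wget -O training_gibson.tar.gz <TRAINING_GIBSON_URL>
tar -xvzf training_gibson.tar.gz && rm training_gibson.tar.gz
```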
- Create the conda environment for training.

  ```bash
  conda env create -f SGM_train/environment.yml
  ```

- Activate the environment.

  ```bash
  conda activate sgm_train
  ```

- Run the training script:

  ```bash
  sh SGM_train/run_train.sh
  ```