
UniDoorManip: Learning Universal Door Manipulation Policy Over Large-scale and Diverse Door Manipulation Environments

This is the official repository of UniDoorManip: Learning Universal Door Manipulation Policy Over Large-scale and Diverse Door Manipulation Environments.

Project | Paper | ArXiv | Video Overview

Mechanism

Dataset

Our dataset consists of door parts (bodies and handles) and integrated doors across six categories (Interior, Window, Car, Safe, StorageFurniture, Refrigerator). The door parts are selected from PartNet-Mobility and 3D Warehouse. We annotate the part axes and poses and develop a method to integrate the parts into complete doors.

We have already constructed some door examples; you can download the package here.

For more details about the door parts and the method of parts integration, please refer to the DatasetGeneration directory.

Simulation

We provide individual simulation environments for each category and mechanism. Here we show some task demos with a movable robot arm.

How to use our code

Here we introduce the procedure to run the simulation with the doors we have integrated.

Downloading

Please download the integrated door packages here and unzip them in the repository directory.

Installation

This repository has been developed and tested with Ubuntu 20.04 and CUDA 11.7. To set up the environments, please follow these steps:

  1. Create a new Anaconda environment named unidoormanip. We recommend Python 3.8.
    conda create -n unidoormanip python=3.8
    conda activate unidoormanip
  2. Install the dependencies.
    pip install torch==1.13.1 torchvision==0.14.1 ipdb trimesh
  3. Install the simulator IsaacGym following the official guide.
  4. Install the PointNet++
   git clone --recursive https://github.com/erikwijmans/Pointnet2_PyTorch
   cd Pointnet2_PyTorch
   pip install -r requirements.txt
   pip install -e .

Note that building PointNet++ requires matching CUDA and PyTorch versions.
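Before building the PointNet++ CUDA extensions, it can save time to confirm that the CUDA version PyTorch was compiled against matches the toolkit installed on your system. A minimal sketch of such a check (the helper names are ours, and the `nvcc` lookup assumes the CUDA toolkit is on your `PATH`):

```python
import re
import subprocess


def cuda_major_minor(version_string):
    """Extract (major, minor) from a CUDA version string such as
    '11.7' or 'release 11.7, V11.7.99'."""
    match = re.search(r"(\d+)\.(\d+)", version_string)
    if match is None:
        raise ValueError(f"no CUDA version found in {version_string!r}")
    return int(match.group(1)), int(match.group(2))


def versions_match(torch_cuda, system_cuda):
    """The extensions build cleanly when the major.minor versions agree."""
    return cuda_major_minor(torch_cuda) == cuda_major_minor(system_cuda)


if __name__ == "__main__":
    try:
        import torch  # only available inside the unidoormanip environment
        torch_cuda = torch.version.cuda or "cpu-only build"
    except ImportError:
        torch_cuda = "torch not installed"
    try:
        nvcc_out = subprocess.run(
            ["nvcc", "--version"], capture_output=True, text=True
        ).stdout
    except FileNotFoundError:
        nvcc_out = "nvcc not found"
    print("PyTorch CUDA:", torch_cuda)
    print("System CUDA:", nvcc_out.strip().splitlines()[-1] if nvcc_out else "unknown")
```

If the two versions disagree, reinstall PyTorch with the wheel built for your local CUDA toolkit before running `pip install -e .` in Pointnet2_PyTorch.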

Run the simulation

We provide ready-made shell scripts; run one of them to launch the corresponding simulation environment.

  cd [path_to_this_repo]/Simulation
  bash scripts/franka_slider_open_[category].sh

For example, to run the lever door simulation, execute the following commands:

  cd [path_to_this_repo]/Simulation
  bash scripts/franka_slider_open_lever_door.sh

The result is shown below.
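Since all launch scripts share one naming pattern, a small wrapper can select the right script from a category name. A sketch (the `door_script` helper is ours, and category names other than `lever_door` are assumptions based on the provided scripts):

```shell
# Map a category name to its launch script, following the
# franka_slider_open_<category>.sh naming pattern used in this repo.
door_script() {
    printf 'scripts/franka_slider_open_%s.sh' "$1"
}

# Launch the simulation for the requested category (defaults to lever_door).
category="${1:-lever_door}"
script="$(door_script "$category")"
if [ -f "$script" ]; then
    bash "$script"
else
    echo "script not found: $script (run this from the Simulation directory)"
fi
```

Run it from the `Simulation` directory, e.g. `sh run_door.sh lever_door` if saved as `run_door.sh` (a hypothetical filename).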

Citation

If you find our project useful, please consider citing our paper:

@article{li2024unidoormanip,
  title={UniDoorManip: Learning Universal Door Manipulation Policy Over Large-scale and Diverse Door Manipulation Environments},
  author={Li, Yu and Zhang, Xiaojie and Wu, Ruihai and Zhang, Zilong and Geng, Yiran and Dong, Hao and He, Zhaofeng},
  journal={arXiv preprint arXiv:2403.02604},
  year={2024}
}

Please also consider citing our motivating projects PartNet-Mobility and VAT-MART:

@inproceedings{xiang2020sapien,
  title={Sapien: A simulated part-based interactive environment},
  author={Xiang, Fanbo and Qin, Yuzhe and Mo, Kaichun and Xia, Yikuan and Zhu, Hao and Liu, Fangchen and Liu, Minghua and Jiang, Hanxiao and Yuan, Yifu and Wang, He and others},
  booktitle={Proceedings of the IEEE/CVF conference on computer vision and pattern recognition},
  pages={11097--11107},
  year={2020}
}

@inproceedings{wu2022vatmart,
  title={{VAT}-Mart: Learning Visual Action Trajectory Proposals for Manipulating 3D {ART}iculated Objects},
  author={Ruihai Wu and Yan Zhao and Kaichun Mo and Zizheng Guo and Yian Wang and Tianhao Wu and Qingnan Fan and Xuelin Chen and Leonidas Guibas and Hao Dong},
  booktitle={International Conference on Learning Representations},
  year={2022},
  url={https://openreview.net/forum?id=iEx3PiooLy}
}

Contact

If you have any questions, please feel free to contact Yu Li at brucelee_at_bupt_edu_cn, Xiaojie Zhang at sectionz_at_bupt_edu_cn, or Ruihai Wu at wuruihai_at_pku_edu_cn.