
Kitchen Worlds

A library of long-horizon Task-and-Motion-Planning (TAMP) problems in kitchen and household scenes, as well as planning algorithms to solve them

  • procedurally generate scenes with rigid and articulated objects
  • visualize a scene in LISDF format (an extension to SDF that includes URDF)
  • solve a TAMP problem defined by a scene.lisdf and problem.pddl, using an existing domain.pddl and stream.pddl, with PDDLStream
  • visualize the robot trajectory in pybullet

Installation

  1. Clone the repo along with the submodules. It may take a few minutes.
git clone git@github.com:Learning-and-Intelligent-Systems/kitchen-worlds.git --recursive
In case the submodules are not up-to-date, check out the most recent changes:
(cd pybullet_planning; git checkout master; git pull); \
  (cd pddlstream; git checkout caelan/diverse); \
  (cd assets/models; git checkout main); 
  2. Install dependencies. It may take ten minutes or more.
conda env create -f environment.yml
conda activate kitchen
## sudo apt-get install graphviz graphviz-dev  ## on Ubuntu
  3. Build FastDownward, the task planner used by PDDLStream.
## sudo apt install cmake g++ git make python3  ## if not already installed
(cd pddlstream; ./downward/build.py)
  4. Build IK solvers (needed for mobile manipulators; skip this if you're only using the floating gripper). A quick import check follows these sub-steps.
  • TracIK (recommended on Ubuntu) for whole-body IK that solves for the base, torso, and arm together:

    sudo apt-get install libeigen3-dev liborocos-kdl-dev libkdl-parser-dev liburdfdom-dev libnlopt-dev libnlopt-cxx-dev swig
    pip install git+https://github.com/mjd3/tracikpy.git
  • IKFast solver for arm planning (the default IK), which needs to be compiled for each robot type. Here's an example for the PR2:

    ## sudo apt-get install python-dev
    (cd pybullet_planning/pybullet_tools/ikfast/pr2; python setup.py)
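
To verify that the optional TracIK backend imports correctly, a minimal check like the one below can be run inside the conda environment. This is only a sketch and only covers tracikpy; IKFast is compiled per robot and exercised through pybullet_planning rather than imported directly.

# quick sanity check for the optional TracIK backend (a sketch; run inside the `kitchen` env)
try:
    import tracikpy  # installed above via pip from mjd3/tracikpy
    print("tracikpy imported from:", tracikpy.__file__)
except ImportError as err:
    print("tracikpy not available:", err)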

Test Installation

conda activate kitchen
python examples/test_parse_lisdf.py
python examples/test_data_generation.py

Tutorial

Here are some example scripts to help you understand the scene generation and task and motion planning tools. Once they are all working for you, we recommend you follow the next section to set up your own data generation pipeline with custom config files.

Generate Worlds, Problems, and Plans

Generating data involves creating data folders that include the scene layout scene.lisdf, problem.pddl, plan.json, and the trajectory commands.pkl. Generation can be run without a GUI (faster) and in parallel. Note that planning is not guaranteed to return a solution within the timeout, depending on the domain.
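
For reference, a generated run folder can be inspected with standard-library tools. Below is a minimal sketch, not part of the repo: the folder name is just an example, the exact contents of plan.json vary, and unpickling commands.pkl may require the repo's modules on your Python path.

import json
import pickle
from pathlib import Path

run_dir = Path("outputs/custom_pr2_kitchen_full/241007_233942")  # example timestamped run folder

# plan.json stores the symbolic plan found by the planner
plan = json.loads((run_dir / "plan.json").read_text())
print(plan)

# commands.pkl stores the low-level trajectory commands used for replay
with open(run_dir / "commands.pkl", "rb") as f:
    commands = pickle.load(f)
print(type(commands))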

There are two scripts for collecting data. Note that both scripts may produce a failed output folder or stop the simulation early when the problem can't be solved, e.g. when the world generation script fails to find an initial world configuration for the randomly sampled objects.

  1. The first is simpler, cleaner, and more adaptable to your tasks.
python examples/test_data_generation.py --config_name kitchen_full_pr2.yaml  ## PR2 with extended torso range
python examples/test_data_generation.py --config_name kitchen_full_feg.yaml  ## floating franka gripper
python examples/test_data_generation.py --config_name kitchen_full_feg.yaml --simulate  ## simulation mode broken as of Oct 17, 2024
python examples/test_data_generation.py --config_path {path/to/your/custom_data_config.yaml}
  2. The other uses a more general set of classes and processes.
python examples/test_data_generation_pigi.py  ## PR2 with extended torso range

Once a plan is generated and the console has stopped printing logs, press Enter to visualize the execution.

Quick Start Your Scene or Trajectory Generation Pipeline

(Last tested 7 Oct, 2024)

We suggest putting your custom data generation code and config files inside a directory at the same level as kitchen-worlds/pybullet_planning in the project repo, for example kitchen-worlds/your_project_folder.

Step 1a) To generate PIGINet data

PIGINet data uses a specific procedure to generate scenes with randomized clutter. We recommend following step 1b for your own purposes.

The argument is the name of your custom configuration file in kitchen-worlds/your_project_folder/configs:

Output will be in outputs/{config.data.exp_subdir}/{timestamped_data_name}.

## generates data folders with scene, problem, plan, trajectory
python your_project_folder/run_generation_pigi_custom.py

## render images, can run in parallel
python your_project_folder/render_images_custom.py --task custom_piginet_data --parallel

Step 1b) To generate custom data (different world layout, goals, robots, etc.)

The argument is the name of your custom configuration file in kitchen-worlds/your_project_folder/configs:

Data can be generated in parallel on the CPU (set the flag in the config yaml file).

Output will be in outputs/{config.data.out_dir}/{timestamped_data_names}.

## generates data folders with scene, problem, plan, trajectory
python your_project_folder/run_generation_custom.py --config_name config_generation.yaml

Step 2) To render images for the generated scene

To train vision-language models on the generated data, we render images for all runs in one output subdirectory associated with a task / data generation batch.

For example, if data is generated using step 1b with the default config, it will be written to outputs/custom_pr2_kitchen_full, where custom_pr2_kitchen_full is the task name.

Some camera poses are defined in code, some in outputs/{task_name}/{timestamped_data_name}/planning_config.json (see the sketch after the command below).

Output will be in each individual outputs/{task_name}/{timestamped_data_name}/

python your_project_folder/render_images_custom.py --task {task_name}
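
To check which camera poses a particular run recorded, its planning_config.json can be read directly. The snippet below is only a sketch: the run folder is an example and the key names inside the file are not guaranteed, so it simply scans for camera-related entries.

import json
from pathlib import Path

run_dir = Path("outputs/custom_pr2_kitchen_full/241007_233942")  # example run folder

config = json.loads((run_dir / "planning_config.json").read_text())

# print whatever camera-related entries the file happens to contain
for key, value in config.items():
    if "camera" in key.lower():
        print(key, value)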

Step 3) To render video from successful planning runs

Replay the trajectory for review or rendering.

Given a path, for example timestamped_data_dir = 'custom_pr2_kitchen_full/241007_233942', the output will be in {timestamped_data_dir}/.

python your_project_folder/run_replay_custom.py -p {timestamped_data_dir}

Acknowledgements

Part of the development was performed during an internship at NVIDIA Research, Seattle Robotics Lab.

This repo works thanks to the tools provided by LIS lab members and alumni:

  • the pybullet_tools package is an awesome set of tools developed by Caelan Garrett. A forked version is included with my own helper functions.
  • pddlstream is a planning framework developed by Caelan Garrett.
  • the lisdf package is an input/output specification for TAMP problems developed by William Shen, Nishanth Kumar, Aidan Curtis, and Jiayuan Mao.

All the object models and URDF files are downloaded for free from the following sources:

  • most articulated object models are downloaded from the PartNet-Mobility dataset (Mo, Kaichun, et al. "PartNet: A large-scale benchmark for fine-grained and hierarchical part-level 3D object understanding." Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2019.)
  • most kitchen object models are downloaded from Free3D.
