Official Implementation of "PMP: Learning to Physically Interact with Environments using Part-wise Motion Priors" (SIGGRAPH 2023) (paper, video, talk)

Status

Released

  • Assets

    • Deepmimic-MPL Humanoid
    • Objects for interaction
    • Retargeted motion data (check the license)
  • Simulation Configuration Files

    • .yaml files for whole-body and hand-only gyms
    • Documentation of the details (see the config-loading sketch after this list)
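
The exact keys in this repo's .yaml files are repo-specific; the sketch below only illustrates loading a config whose fields follow common IsaacGymEnvs conventions (env.numEnvs, env.episodeLength, sim.dt, sim.substeps). These field names are assumptions for illustration, not this repo's actual schema.

import yaml  # requires PyYAML

# Hypothetical config text mirroring IsaacGymEnvs-style task files;
# the actual keys in the whole-body/hand-only configs may differ.
cfg_text = """
env:
  numEnvs: 4096
  episodeLength: 300
sim:
  dt: 0.0166
  substeps: 2
"""

cfg = yaml.safe_load(cfg_text)
print(cfg["env"]["numEnvs"], cfg["sim"]["dt"])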

Todo

  • Shell script to install all external dependencies

  • Retargeting pipeline (Mixamo to Deepmimic-MPL Humanoid)

  • Whole-body Gym: training the hand-equipped humanoid

    • Model (Train / Test)
      • pretrained weights
    • Environments
  • Hand-only Gym: training a single hand to grab a bar

    • Model (Train / Test)
      • pretrained weight
      • expert trajectories
    • Environment

Note: I'm currently focusing mainly on other projects, so this repo will be updated slowly. If you need early access to the full implementation, please contact me through my personal website.

Installation

This code is based on Isaac Gym Preview 4. Please install it and create a conda environment by following the instructions included with Isaac Gym Preview 4. We assume the conda environment is named pmp_env. Then run the following commands.

conda activate pmp_env
cd pmp
pip install -e .
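
To verify the setup, a minimal import check can help (a sketch; sanity_check.py is a hypothetical file name, not part of this repo). Note that Isaac Gym must be imported before torch.

# sanity_check.py (hypothetical) -- quick environment check
import isaacgym  # Isaac Gym must be imported before torch
import torch

# Isaac Gym simulation requires a CUDA-capable GPU
assert torch.cuda.is_available(), "No CUDA device found"
print("isaacgym and torch imported successfully")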

Acknowledgement

Codebase

This code is based on the official release of IsaacGymEnvs. In particular, it largely borrows the AMP implementation from that codebase (paper, code).
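
For context, AMP trains a discriminator on state transitions from reference motion and converts its output into a style reward. The sketch below shows the least-squares reward form from the AMP paper; PMP applies this kind of prior per body part, but this function is an illustration, not this repo's exact code.

import torch

def amp_style_reward(disc_out: torch.Tensor) -> torch.Tensor:
    # AMP's least-squares GAN formulation: reward is highest when the
    # discriminator scores a transition close to the "real" target of 1.
    # r = max(0, 1 - 0.25 * (D(s, s') - 1)^2)
    return torch.clamp(1.0 - 0.25 * (disc_out - 1.0) ** 2, min=0.0)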

Humanoid

Our whole-body agent is modified from the humanoid in DeepMimic. We replace the sphere-shaped hands of the original humanoid with hands from the Modular Prosthetic Limb (MPL).

Motion data

We use Mixamo animation data to train the part-wise motion priors. We retarget the Mixamo animations onto our whole-body humanoid using a process similar to the one used in the original codebase.
