Expressive Whole-Body Control for Humanoid Robots

Apr 4, 2022 | Data Science

Xuxin Cheng · Yandong Ji · Junming Chen
Ge Yang · Xiaolong Wang

Website | arXiv | Video | Summary

Introduction

The control of humanoid robots has reached new expressive heights, allowing for dynamic motion and advanced interaction in various environments. This guide will walk you through the installation and usage of a powerful framework that enables whole-body control for humanoid robots.

Installation

Follow these steps to set up your development environment:

```bash
conda create -n humanoid python=3.8
conda activate humanoid
pip3 install torch==1.10.0+cu113 torchvision==0.11.1+cu113 torchaudio==0.10.0+cu113 -f https://download.pytorch.org/whl/cu113/torch_stable.html
git clone git@github.com:chengxuxin/expressive_humanoid.git
cd isaacgym   # Isaac Gym is distributed separately by NVIDIA
pip install -e .
cd ~/expressive_humanoid/rsl_rl
pip install -e .
cd ~/expressive_humanoid/legged_gym
pip install -e .
pip install numpy==1.24 pydelatin wandb tqdm opencv-python ipdb pyfqmr flask dill gdown
```
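Once the packages are in place, a quick version check can catch mismatches early. This is a hedged sketch, not part of the repo; the `check_version` helper is illustrative:

```shell
#!/usr/bin/env bash
# Sanity-check sketch: compares installed versions against the pins above.
# Run it inside the `humanoid` conda environment.
check_version() {
  name="$1"; actual="$2"; expected="$3"
  if [ "$actual" = "$expected" ]; then
    echo "$name OK ($actual)"
  else
    echo "$name MISMATCH: got '$actual', expected '$expected'"
  fi
}

# With the env active you would pass live values, e.g.:
#   check_version torch "$(python -c 'import torch; print(torch.__version__)')" "1.10.0+cu113"
check_version torch "1.10.0+cu113" "1.10.0+cu113"
```

Any `MISMATCH` line points at a package to reinstall before moving on.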

Prepare Dataset

  • Download the dataset: extract it into ASE/ase/poselib/data/cmu_fbx_all.
  • Generate the .yaml file: run the script below; edit it to add more motions if needed:
    ```bash
    cd ASE/ase/poselib
    python parse_cmu_mocap_all.py
    ```
  • Import motions:
    ```bash
    cd ASE/ase/poselib
    python fbx_importer_all.py
    ```
  • Retarget motions:
    ```bash
    cd ASE/ase/poselib
    mkdir pkl
    python retarget_motion_h1_all.py
    ```
  • Generate key body positions: this step requires running a simulation:
    ```bash
    cd legged_gym
    python legged_gym/scripts/train.py debug --task h1_view --motion_name motions_debug.yaml --debug
    ```
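The dataset-preparation bullets above can be strung together into one script. This is a hedged sketch that assumes the ASE/ase/poselib layout shown above; by default it only prints the commands, so nothing runs by accident:

```shell
#!/usr/bin/env bash
# Sketch of the dataset-preparation steps as one script.
# By default it is a dry run that only prints the commands; set DRY_RUN=0
# to execute for real from the expressive_humanoid repo root.
set -u

run() {
  if [ "${DRY_RUN:-1}" = "1" ]; then
    echo "+ $*"    # dry run: show the command only
  else
    "$@"           # real run: execute it
  fi
}

run cd ASE/ase/poselib
run python parse_cmu_mocap_all.py      # generate the motions .yaml file
run python fbx_importer_all.py         # import the CMU .fbx motions
run mkdir -p pkl
run python retarget_motion_h1_all.py   # retarget motions to the H1 robot
run cd ../../../legged_gym
run python legged_gym/scripts/train.py debug --task h1_view --motion_name motions_debug.yaml --debug
```

Running it once in dry-run mode is a cheap way to confirm the order of operations before committing to the full pipeline.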

Usage

After installation and preparation, you’re ready to train and play with humanoid models.

  • To train a new policy:
    ```bash
    python train.py xxx-xx-some_descriptions_of_run --device cuda:0 --entity WANDB_ENTITY
    ```
  • To play a policy:
    ```bash
    python play.py xxx-xx
    ```
  • To play with example pretrained models:
    ```bash
    python play.py 060-40 --delay --motion_name motions_debug.yaml
    ```
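For repeated experiments, the train command above can be wrapped in a small helper that fills in the device and Weights & Biases entity. A hedged sketch; `train_run` and its defaults are illustrative, not part of the repo:

```shell
#!/usr/bin/env bash
# Helper sketch: composes the training command from the Usage section.
# The "xxx-xx-description" run-id format follows the examples above.
train_run() {
  local run_id="$1" device="${2:-cuda:0}" entity="${3:-WANDB_ENTITY}"
  # echo the command so it can be inspected; pipe to `bash` to execute
  echo python train.py "$run_id" --device "$device" --entity "$entity"
}

train_run 000-01-first_experiment
```

Echoing rather than executing keeps the helper inspectable; once the printed command looks right, run it directly or pipe it to `bash`.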

Understanding the Code with an Analogy

Think of building a humanoid robot as crafting a complex dance performance. Each stage of the setup serves a specific purpose, similar to how a choreographer designs a dance. The installation process establishes the necessary dance floor and lighting, while the dataset preparation aligns the right music and movements. Training the model is akin to rehearsing; the robot learns its steps, checks which moves flow, and solidifies its choreography. Every command, from train.py to play.py, represents cues that guide the robot’s performance on stage, ensuring it delivers an engaging and responsive act each time.

Troubleshooting

If you encounter issues, consider the following:

  • Ensure you have the correct Python version installed (3.8).
  • Check the compatibility of your environment with the specified package versions.
  • Verify the success of dataset downloading and extraction.
  • Make sure that any dependencies, particularly PyTorch and Isaac Gym, are correctly installed.
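The first checklist item can be verified mechanically. A hedged sketch; it assumes `python3` is on the PATH (set `PYTHON=python` inside the conda env, where the interpreter is simply `python`):

```shell
#!/usr/bin/env bash
# Diagnostic sketch: confirm the interpreter is Python 3.8 as required.
PYTHON="${PYTHON:-python3}"
py_version="$("$PYTHON" -c 'import sys; print("%d.%d" % sys.version_info[:2])')"
if [ "$py_version" = "3.8" ]; then
  echo "Python version OK (3.8)"
else
  echo "Expected Python 3.8, found $py_version"
fi
```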

For more insights, updates, or to collaborate on AI development projects, stay connected with fxis.ai.

Conclusion

At fxis.ai, we believe that such advancements are crucial for the future of AI, as they enable more comprehensive and effective solutions. Our team is continually exploring new methodologies to push the envelope in artificial intelligence, ensuring that our clients benefit from the latest technological innovations.
