How to Use MoveNet with Ryzen AI

Jan 25, 2024 | Educational

Welcome to the fast-paced world of pose estimation! Today, we’re diving into MoveNet — an ultra-fast model capable of detecting 17 keypoints on the human body with impressive accuracy. The ease of integrating this model with the AMD Ryzen AI environment makes it an even more attractive option. Let’s break down the steps to get this powerful model set up on your system.
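Before diving in, it helps to know what those 17 keypoints are. MoveNet uses the standard COCO keypoint order, so a small lookup table (the names below are the COCO convention; the `KEYPOINT_INDEX` helper is just an illustration) makes downstream code readable:

```python
# MoveNet predicts 17 COCO-style keypoints, in this fixed order.
MOVENET_KEYPOINTS = [
    "nose", "left_eye", "right_eye", "left_ear", "right_ear",
    "left_shoulder", "right_shoulder", "left_elbow", "right_elbow",
    "left_wrist", "right_wrist", "left_hip", "right_hip",
    "left_knee", "right_knee", "left_ankle", "right_ankle",
]

# Map each name to its row in the model output, so you can index
# a keypoint by name instead of remembering its position.
KEYPOINT_INDEX = {name: i for i, name in enumerate(MOVENET_KEYPOINTS)}
print(KEYPOINT_INDEX["left_wrist"])  # 9
```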

Installation

To get started with MoveNet, we first need to prepare the Ryzen AI environment. This involves installing essential prerequisites that the model needs to function optimally.

  • Follow the complete installation guide provided in Ryzen AI Installation.
  • Run the following script to install the necessary packages:
pip install -r requirements.txt

Data Preparation (Optional: For Accuracy Evaluation)

If you’re aiming to evaluate accuracy, you’ll want to prepare the COCO dataset. Here’s how you can do that:

  1. Download COCO Dataset 2017

    Visit cocodataset.org to download the dataset. You’ll need train2017.zip, val2017.zip, and the person keypoints annotations (included in annotations_trainval2017.zip).

    After downloading, unzip the files into a folder structure like below:

    data/
    ├── annotations
    │   ├── person_keypoints_train2017.json
    │   └── person_keypoints_val2017.json
    ├── train2017
    │   ├── xx.jpg
    │   └── ...
    └── val2017
        ├── xx.jpg
        └── ...
    
  2. Convert Data to Required Format

    Modify the paths in lines 282 to 287 in make_coco_data_17keypoints.py if necessary.

    Run the following command to preprocess the dataset:

    python make_coco_data_17keypoints.py

    The required data format is a JSON file structured as follows:

    Keypoints order: [nose, left_eye, right_eye, ...]
    One item: [img_name: '0.jpg', keypoints: [x0,y0,z0,x1,y1,z1,...], center: [x,y], bbox: [x0,y0,x1,y1], ...]
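As a concrete illustration of the format above, here is a hedged sketch of how one record could be assembled. The field names come from the format description; the helper function itself and the interpretation of z as a COCO-style visibility flag are assumptions, not code from the repository:

```python
def make_item(img_name, keypoints_xyz, bbox):
    """Build one dataset record in the format described above.

    keypoints_xyz: flat list [x0, y0, z0, x1, y1, z1, ...] with 17 triples,
    where z is assumed to be the COCO visibility flag.
    bbox: [x0, y0, x1, y1] person bounding box.
    """
    assert len(keypoints_xyz) == 17 * 3
    x0, y0, x1, y1 = bbox
    return {
        "img_name": img_name,
        "keypoints": keypoints_xyz,
        "center": [(x0 + x1) / 2, (y0 + y1) / 2],  # center of the person box
        "bbox": bbox,
    }

item = make_item("0.jpg", [0.0] * 51, [10, 20, 110, 220])
print(item["center"])  # [60.0, 120.0]
```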
    

Testing and Evaluation

Now that you have your data prepared, it’s time to evaluate the model’s performance.

  • Modify the DATASET_PATH in eval_onnx.py as needed.
  • To test the accuracy of the quantized model, run:
python eval_onnx.py --ipu --provider_config PathToVaip_config.json
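Under the hood, flags like these are typically wired into an ONNX Runtime session that targets the Vitis AI execution provider. The sketch below shows one plausible way that wiring could look; the exact flag handling inside eval_onnx.py may differ, so treat this as an assumption:

```python
import argparse

def build_provider_args(argv):
    """Translate CLI flags into ONNX Runtime provider settings (assumed wiring)."""
    parser = argparse.ArgumentParser()
    parser.add_argument("--ipu", action="store_true",
                        help="run on the Ryzen AI IPU instead of the CPU")
    parser.add_argument("--provider_config", default="vaip_config.json",
                        help="path to the Vitis AI EP configuration file")
    args = parser.parse_args(argv)
    if args.ipu:
        providers = ["VitisAIExecutionProvider"]
        provider_options = [{"config_file": args.provider_config}]
    else:
        providers = ["CPUExecutionProvider"]
        provider_options = [{}]
    return providers, provider_options

providers, opts = build_provider_args(["--ipu", "--provider_config", "vaip_config.json"])
print(providers)  # ['VitisAIExecutionProvider']
```

These `providers` and `provider_options` lists would then be passed to `onnxruntime.InferenceSession(model_path, providers=providers, provider_options=provider_options)` when creating the session.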

Performance Metrics

After running the evaluation, the script reports an accuracy figure. For reference, an achievable accuracy on the IPU is around:

Metric     Value
Accuracy   79.745%

Troubleshooting

While setting up MoveNet might be straightforward, you could face a few hiccups along the way. Here are some common troubleshooting ideas:

  • If the installation fails, ensure that all dependencies in requirements.txt are satisfied.
  • For data preparation issues, double-check your folder structure and file paths in the Python scripts.
  • If accuracy seems off, revisit the data processing script and ensure you have correctly followed the instructions.
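The folder-structure check above can be automated. The small helper below (hypothetical, not part of the repository) verifies that the COCO layout from the Data Preparation section is in place and reports anything missing:

```python
from pathlib import Path

# Paths required by the COCO layout shown in the Data Preparation section.
REQUIRED = [
    "annotations/person_keypoints_train2017.json",
    "annotations/person_keypoints_val2017.json",
    "train2017",
    "val2017",
]

def missing_dataset_parts(root):
    """Return the relative paths under root that are absent."""
    root = Path(root)
    return [rel for rel in REQUIRED if not (root / rel).exists()]

# Example: report whatever is missing under ./data before preprocessing.
print(missing_dataset_parts("data"))
```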

In case you encounter difficulties you can’t resolve, reach out for support or more insights on the community pages of fxis.ai.

Conclusion

At fxis.ai, we believe that such advancements are crucial for the future of AI, as they enable more comprehensive and effective solutions. Our team is continually exploring new methodologies to push the envelope in artificial intelligence, ensuring that our clients benefit from the latest technological innovations.

Now, roll up your sleeves and start leveraging MoveNet with Ryzen AI for your pose estimation projects! Happy coding!
