How to Implement SARL*: A Deep Reinforcement Learning Approach for Human-Aware Robot Navigation

Dec 19, 2023 | Data Science

If you’re venturing into the realm of robotics, particularly with mobile robots like the TurtleBot2, you’ve stumbled upon an exciting yet challenging project: implementing the SARL* algorithm from the paper SARL*: Deep Reinforcement Learning based Human-Aware Navigation for Mobile Robot in Indoor Environments. This blog post will serve as your compass to successfully navigate through the intricacies of this implementation.

Introduction

The SARL* algorithm enhances social awareness in robotic navigation, offering an advanced solution to real-world navigation challenges. The algorithm integrates multiple technologies including SLAM, path planning, pedestrian detection, and deep reinforcement learning. In this article, let’s walk through the framework, the system setup, experiments, installation, and how to get your mobile robot navigating effectively.

Method Overview

To simplify the understanding of the SARL* process, imagine sending a child into a busy playground filled with other kids (dynamic obstacles) and benches (static obstacles). The goal is for the child to reach a specific spot without colliding with anyone. The SARL* agent plays the role of the child in this analogy, learning to navigate the playground of indoor environments by continuously adjusting its path in response to moving people. Just as the child learns from each trip to the playground, the SARL* agent improves with every navigation experience through reinforcement learning.

System Setup

For the setup, ensure that you have the following:

  • Laser scanner: Hokuyo UTM-30LX or RPLIDAR-A2.
  • Robot platform: TurtleBot 2.
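
Before moving on, it's worth confirming that the lidar and the TurtleBot's Kobuki base are actually visible to your PC. A minimal check, assuming both connect over USB serial (device names vary by system):

    # List serial devices after plugging in the base and the lidar
    ls /dev/ttyUSB* /dev/ttyACM* /dev/kobuki 2>/dev/null
    # For the RPLIDAR, you may need to grant read/write permissions
    sudo chmod 666 /dev/ttyUSB0   # replace ttyUSB0 with your lidar's device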

Experiments

Before diving into code, let's look at what you'll need to get started and which experiments to run. For full details and the context of these experiments, please refer to the original paper.

Code Structure

The SARL* implementation comprises several critical components:

  • Python-RVO2: A crowd simulator based on Optimal Reciprocal Collision Avoidance (ORCA).
  • laser_filters: A package for filtering laser scans (optional).
  • navigation: A modified ROS navigation stack.
  • people: A stack for detecting and tracking humans.
  • rplidar_ros: A ROS package for RPLIDAR sensor use.
  • sarl_star_ros: The core ROS package for SARL* navigation.
  • turtlebot_apps: Stack for using ROS with the TurtleBot.
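
Once you have cloned the repository (step 2 of the build instructions below), you can quickly verify that these components are all present. A minimal check, assuming the default clone location used in this guide:

    # List the cloned components (names taken from the repository layout above)
    ls ~/sarl_ws/src/sarl_star
    # Expect Python-RVO2, laser_filters, navigation, people, rplidar_ros,
    # sarl_star_ros and turtlebot_apps, among others.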

Build and Installation

To ensure a smooth build and installation process, follow these steps (a quick sanity check is sketched after the list):

  1. Install ROS Kinetic.
  2. Create and build a catkin workspace, then clone the repository into its src directory:
    mkdir -p ~/sarl_ws/src
    cd ~/sarl_ws
    catkin_make
    source devel/setup.bash
    cd src
    git clone https://github.com/LeeKeyu/sarl_star.git
  3. Install other dependencies:
    sudo apt-get install libbullet-dev libsdl-image1.2-dev libsdl-dev ros-kinetic-bfl ros-kinetic-tf2-sensor-msgs \
    ros-kinetic-turtlebot ros-kinetic-turtlebot-apps ros-kinetic-turtlebot-interactions ros-kinetic-turtlebot-simulator \
    ros-kinetic-kobuki-ftdi ros-kinetic-ar-track-alvar-msgs
    pip install empy configparser
  4. Install Python-RVO2:
    cd sarl_star/Python-RVO2
    pip install -r requirements.txt
    python setup.py build
    python setup.py install
  5. Install CrowdNav (this version includes modifications to the original SARL implementation):
    cd sarl_star/sarl_star_ros/CrowdNav
    pip install -e .
  6. Finally, build the catkin workspace again:
    cd ~/sarl_ws
    catkin_make
    source devel/setup.bash
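
Once the final catkin_make succeeds, a quick sanity check can confirm that the Python pieces and the core ROS package are discoverable. This is a minimal sketch, assuming the module names rvo2 and crowd_nav installed by the steps above:

    # Verify the Python-RVO2 and CrowdNav installs (module names assumed)
    python -c "import rvo2; print('Python-RVO2 OK')"
    python -c "import crowd_nav; print('CrowdNav OK')"
    # Verify that ROS can locate the core SARL* package
    rospack find sarl_star_ros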

Start the Navigation

Once the setup is complete, let's get the TurtleBot ready for navigation (an example of sending a goal from the terminal follows these steps):

  1. Ensure that your PC is connected to TurtleBot2 and the lidar sensor.
  2. Bring up the TurtleBot using the command:
    roslaunch turtlebot_bringup minimal.launch
  3. Build a map of your environment. Depending on the lidar, use:
    • Hokuyo:
      roslaunch turtlebot_navigation hokuyo_gmapping_movebase.launch
    • RPlidar:
      roslaunch turtlebot_navigation rplidar_gmapping_movebase.launch
  4. To save the map, use:
    mkdir -p ~/sarl_ws/src/sarl_star/sarl_star_ros/map
    rosrun map_server map_saver -f ~/sarl_ws/src/sarl_star/sarl_star_ros/map/new_map
  5. Now, start navigating using the SARL* policy:
    roslaunch sarl_star_ros sarl_star_navigation.launch
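
With the SARL* stack running, you would normally set a goal by clicking "2D Nav Goal" in RViz. As an alternative sketch, assuming the stack exposes the standard move_base_simple/goal topic, you can publish a goal from the terminal; the frame and coordinates below are placeholders for your own map:

    # Publish a single navigation goal (coordinates are placeholders)
    rostopic pub -1 /move_base_simple/goal geometry_msgs/PoseStamped \
      '{header: {frame_id: "map"}, pose: {position: {x: 2.0, y: 1.0, z: 0.0}, orientation: {w: 1.0}}}'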

Troubleshooting

If you face any issues during the setup, consider the following (a few diagnostic commands are sketched after the list):

  • Ensure all dependencies are installed correctly.
  • Double-check the connections between the PC, TurtleBot2, and the lidar sensor.
  • Verify that the correct launch files are executed based on the lidar used.
  • Make sure you have rebuilt the workspace (catkin_make) and re-sourced devel/setup.bash after every change.
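
A few generic ROS diagnostics can also help narrow problems down; the topic names below assume a standard TurtleBot and lidar setup:

    roswtf                    # general ROS graph health check
    rostopic list             # confirm the expected topics exist
    rostopic hz /scan         # verify the lidar is actually publishing
    rostopic echo -n 1 /odom  # confirm the Kobuki base is up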

Customizing Parameters

For fine-tuning the SARL* algorithm parameters, navigate to the configuration files located in sarl_star/sarl_star_ros/CrowdNav/crowd_nav/configs. You can customize the local goal distance by changing the value in sarl_star/navigation/base_local_planner/src/goal_functions.cpp.
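
As a sketch of where to look, assuming the config file names shipped with the CrowdNav repository (typically env.config, policy.config and train.config):

    cd ~/sarl_ws/src/sarl_star/sarl_star_ros/CrowdNav/crowd_nav/configs
    ls   # e.g. env.config, policy.config, train.config
    # Example: inspect the human-comfort distance used in the reward function
    grep -n "discomfort_dist" env.config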

Conclusion

Implementing SARL* for human-aware navigation is an innovative venture that combines technology and robotic intelligence. By following the steps outlined in this guide, you should be on your way to creating a mobile robot capable of navigating complex indoor environments with consideration for human presence.

At fxis.ai, we believe that such advancements are crucial for the future of AI, as they enable more comprehensive and effective solutions. Our team is continually exploring new methodologies to push the envelope in artificial intelligence, ensuring that our clients benefit from the latest technological innovations.
