Socially-aware robot navigation strives to find efficient methods to autonomously navigate known environments while incorporating social metrics derived from human behavior. Several methods built for navigation in dynamic and uncertain environments have been adapted to resemble human navigation, but they fail to adequately capture the social characteristics of human decision making.
A reliable solution to this problem is found in model-based methods acting as local or global planners. Model-based research on human-aware navigation focuses on two alternatives. The first corresponds to models that apply social psychology and cognitive science to produce human-like behavior, with the Social Force Model as the predominant approach. The second uses machine learning to transfer human-like characteristics into mathematical models.
The former approach has proven efficient in human-aware navigation, but further studies are required to extend it to more complex, human-interactive navigation. In this thesis project, we test the Modified Extended Social Force Model (MESFM) to implement a guiding behavior on a humanoid robot. The MESFM incorporates a new force linked to the guided person that maintains a natural distance between the two subjects while generating smooth navigation maneuvers.
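As an illustrative sketch only (the symbols and the spring-like form below are assumptions, not the exact formulation of this thesis), an MESFM-style force balance adds a companion term to the classic social-force sum:

```latex
% Illustrative MESFM-style force balance (symbols assumed):
%   F_goal : attraction toward the navigation goal
%   F_obs  : repulsion from obstacle/bystander j
%   F_comp : companion force keeping the guided person near distance d_0
\mathbf{F}_{\text{total}} = \mathbf{F}_{\text{goal}} + \sum_{j}\mathbf{F}^{\,j}_{\text{obs}} + \mathbf{F}_{\text{comp}},
\qquad
\mathbf{F}_{\text{comp}} = k\left(\lVert \mathbf{r}_p - \mathbf{r}_r \rVert - d_0\right)\hat{\mathbf{e}}_{rp},
```

where \(\mathbf{r}_r\) and \(\mathbf{r}_p\) are the robot and guided-person positions, \(\hat{\mathbf{e}}_{rp}\) is the unit vector from robot to person, and \(k\) a gain: the term pulls the robot toward the person when the separation exceeds \(d_0\) and pushes it away when they are too close, which is what yields the natural guiding distance.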
In addition, whether caused by sensor failure or occlusion, losing track of the guided person is a likely event. This scenario is addressed by extending the high-level control of the robotic system with a searching mode.
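A minimal sketch of such a high-level mode switch is shown below. All names (the mode labels, the grace-period parameter) are hypothetical illustrations, not the actual controller of this thesis:

```python
# Hypothetical sketch of a guiding/searching mode switch.
# The mode names and the lost_timeout grace period are assumptions
# for illustration, not the thesis' actual high-level controller.
GUIDING, SEARCHING = "guiding", "searching"

class GuideController:
    def __init__(self, lost_timeout=2.0):
        self.mode = GUIDING
        self.lost_timeout = lost_timeout   # seconds without a detection before searching
        self.time_since_seen = 0.0

    def update(self, person_detected, dt):
        """Advance the controller by dt seconds and return the active mode."""
        if person_detected:
            self.time_since_seen = 0.0
            self.mode = GUIDING
        else:
            self.time_since_seen += dt
            if self.time_since_seen >= self.lost_timeout:
                # e.g. rotate in place and scan with the head camera
                self.mode = SEARCHING
        return self.mode
```

The grace period avoids oscillating into the searching mode on every brief occlusion; only a sustained loss of the person track triggers the search behavior.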
The architecture proposed to control the guiding behavior and the searching mode exploits the design of the humanoid robot Pepper, from SoftBank Robotics, to incorporate human-like gestures and to enrich the interaction between robot and human.
We aim to deploy this behavior as an office-guide robot and to evaluate the solution, studying the reactions of the people involved in the guiding task through subjective and objective metrics. Our system is developed on top of the open-source Robot Operating System (ROS), with features extracted from Pepper's NAOqi framework.