Human-Like Behavior Generation Based on Head-Arms Model for Robot Tracking External Targets and Body Parts
Journal:
IEEE Transactions on Cybernetics
Publication Date:
Aug 1, 2015
Abstract
Facing and pointing toward moving targets is a common and natural behavior in daily life. Social robots should be able to display such coordinated behaviors in order to interact naturally with people; for instance, a robot should be able to point at and look toward specific objects. Accordingly, this paper proposes a scheme to generate coordinated head-arm motion for a humanoid robot with two degrees of freedom in the head and seven in each arm. Specifically, a virtual plane approach is employed to derive an analytical solution for the head motion, and the coordinated dual-arm motion is formulated as a quadratic program (QP). To obtain the optimal solution, a simplified recurrent neural network is used to solve the QP problem. The effectiveness of the proposed scheme is demonstrated through both computer simulations and physical experiments.
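The abstract does not give the network equations, but the general idea of solving a redundancy-resolution QP with a recurrent neural network can be illustrated with a minimal sketch. The example below uses the standard dual-network formulation for the velocity-level problem (minimize a weighted joint-velocity norm subject to a Jacobian equality constraint), which is an assumption, not necessarily the paper's exact "simplified" network; the function name rnn_qp_solver, the weighting matrix W, and the gain eps are illustrative choices.

```python
import numpy as np

# A minimal sketch, assuming the standard dual recurrent network for
# the equality-constrained QP
#     minimize   0.5 * qdot^T W qdot
#     subject to J qdot = rdot
# i.e., velocity-level redundancy resolution. The network state is the
# Lagrange-multiplier vector lam; its dynamics drive the constraint
# residual to zero, and the joint velocity is read out as
# qdot = W^{-1} J^T lam.

def rnn_qp_solver(J, rdot, W, eps=1e-2, dt=1e-4, steps=20000):
    """Integrate the recurrent dynamics eps * dlam/dt = rdot - J W^{-1} J^T lam."""
    W_inv = np.linalg.inv(W)
    lam = np.zeros(J.shape[0])                   # network state (dual variables)
    for _ in range(steps):
        qdot = W_inv @ J.T @ lam                 # output layer: primal readout
        lam += (dt / eps) * (rdot - J @ qdot)    # recurrent state update (Euler step)
    return W_inv @ J.T @ lam

# Hypothetical example: a 7-DOF arm tracking a 3-D end-effector velocity.
rng = np.random.default_rng(0)
J = rng.standard_normal((3, 7))                  # Jacobian (assumed full row rank)
rdot = np.array([0.1, -0.05, 0.02])              # desired end-effector velocity
W = np.eye(7)                                    # identity weighting -> minimum-norm solution

qdot = rnn_qp_solver(J, rdot, W)
print("constraint residual:", np.linalg.norm(J @ qdot - rdot))
print("matches pseudoinverse:", np.allclose(qdot, np.linalg.pinv(J) @ rdot, atol=1e-4))
```

With the identity weighting, the equilibrium of this network coincides with the pseudoinverse (minimum-norm) solution, which the final check confirms; the appeal of the recurrent formulation is that the same dynamics extend to QPs with joint-limit and inequality constraints that have no closed-form solution.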