Physics-based motion planning is a challenging task, since it requires computing the robot motions while allowing possible interactions with (some of) the obstacles in the environment. Kinodynamic motion planners equipped with a dynamic engine acting as state propagator are usually used for this purpose. The difficulties arise in setting adequate forces for the interactions, and because these interactions may change the pose of the manipulatable obstacles, thus either facilitating or preventing the finding of a solution path. The use of knowledge can alleviate these difficulties. This paper proposes the use of an enhanced state propagator composed of a dynamic engine and a low-level geometric reasoning process that determines how to interact with the objects, i.e. from where and with which forces. The proposal, called κ-PMP, can be used with any kinodynamic planner, thus giving rise to, e.g., κ-RRT. The approach also includes a preprocessing step that infers, from semantic abstract knowledge described in terms of an ontology, the manipulation knowledge required by the reasoning process. The proposed approach has been validated with several examples involving a holonomic mobile robot, a robot with differential constraints and a serial manipulator, and benchmarked using several state-of-the-art kinodynamic planners. The results showed a significant difference in power consumption with respect to simple physics-based planning, as well as an improvement in the success rate and in the quality of the solution paths.
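The idea of an enhanced state propagator can be illustrated with a minimal, self-contained sketch (not the authors' implementation): a kinodynamic propagator for a 1D point mass, where a stand-in for the low-level reasoning step bounds the commanded force by the maximum push force that the manipulation knowledge allows for the object being contacted. All names and the force limit are illustrative assumptions.

```python
def reason_force(force, max_push_force):
    """Stand-in for the geometric reasoning step: bound the
    interaction force to what the manipulation knowledge permits."""
    return max(-max_push_force, min(max_push_force, force))

def propagate(state, force, dt, mass=1.0, max_push_force=5.0):
    """Euler step of a 1D point mass, playing the role of the
    dynamic engine inside the state propagator."""
    x, v = state
    f = reason_force(force, max_push_force)   # reasoning before simulation
    a = f / mass
    return (x + v * dt, v + a * dt)
```

A kinodynamic planner such as RRT would call `propagate` repeatedly with sampled controls; the reasoning step filters the controls before the dynamics are simulated.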
The paper deals with the problem of motion planning for anthropomorphic dual-arm robots. It introduces a measure of the similarity of the movements needed to solve two given tasks. Using this measure during planning to select the proper arm synergies for a given task improves both the planning performance and the quality of the resulting plan.
Current approaches do not allow robots to execute a task and simultaneously convey emotions to users using their body motions. This paper explores the capabilities of the Jacobian null space of a humanoid robot to convey emotions. A task priority formulation has been implemented in a Pepper robot which allows the specification of a primary task (waving gesture, transportation of an object, etc.) and exploits the kinematic redundancy of the robot to convey emotions to humans as a lower priority task. The emotions, defined by Mehrabian as points in the pleasure–arousal–dominance space, generate intermediate motion features (jerkiness, activity and gaze) that carry the emotional information. A map from these features to the joints of the robot is presented. A user study has been conducted in which emotional motions have been shown to 30 participants. The results show that happiness and sadness are very well conveyed to the user, calm is moderately well conveyed, and fear is not well conveyed. An analysis of the dependencies between the motion features and the emotions perceived by the participants shows that activity correlates positively with arousal, jerkiness is not perceived by the user, and gaze conveys dominance when activity is low. The results indicate a strong influence of the most energetic motions of the emotional task and point out new directions for further research. Overall, the results show that the null space approach can be regarded as a promising means to convey emotions as a lower priority task.
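The task priority formulation behind the null-space approach can be sketched as follows (a generic textbook construction, not the authors' code): the primary task velocity is resolved through the Jacobian pseudoinverse, and a secondary joint motion, here the one carrying the emotional information, is projected into the null space of the primary task Jacobian so that it does not perturb the primary task.

```python
import numpy as np

def task_priority_velocities(J, x_dot, q_dot_secondary):
    """Joint velocities that track the primary task velocity x_dot
    while executing q_dot_secondary in the null space of J."""
    J_pinv = np.linalg.pinv(J)
    n = J.shape[1]
    N = np.eye(n) - J_pinv @ J            # null-space projector of J
    return J_pinv @ x_dot + N @ q_dot_secondary
```

Because `J @ N` vanishes by the pseudoinverse property `J @ J_pinv @ J = J`, the secondary (emotional) motion leaves the primary task velocity unchanged.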
This work presents a knowledge-based task and motion planning framework based on a version of the FastForward task planner. A reasoning process on symbolic literals, in terms of knowledge and geometric information about the workspace, together with the use of a physics-based motion planner, is proposed to evaluate the applicability and feasibility of manipulation actions and to compute the heuristic values that guide the search. The proposal results in low-cost, physically-feasible plans.