RAP Group

Head: Michel Devy
Secretary: Natacha Ouwanssi
The scientific themes studied in the Robotics, Action and Perception (RAP) research group continue thirty years of work at LAAS-CNRS on the functional and decisional autonomy of robotic systems executing tasks in dynamic or changing environments. Within the Robotics and Artificial Intelligence department of the lab, the RAP group focuses its work mainly on the functional layer integrated on robotic systems: indoor or outdoor robots, generally equipped with several sensors (vision, audio, laser scanners, RFID readers…), or humanoid robots fitted only with visual and audio sensors. The RAP group is concerned with the integration of advanced functions on a physical robot executing tasks in a passive environment, with interaction between humans, robots and the environment taking advantage of Ambient Intelligence, and with applications of robotics technologies in other domains:
  • A physical robot has an embedded system providing all the functionalities required to execute tasks at a given autonomy level, interacting with humans, either remote operators (e.g. space exploration) or users (e.g. a personal robot at home, or service robots in public areas). These functionalities enable the robot to control sensors and actuators, to generate plans (navigation and/or manipulation) from models of the environment and of robot actions, and to control the execution of these plans so that the robot can adapt its behaviour (1) to the current state of the environment, including changes with respect to its initial model, and (2) to the activities of other mobile objects sharing the same space (humans, other robots or vehicles…).
  • Many applications involve networks of sensors and robots, e.g. fleets of dedicated robots (guide or cleaning robots, person movers…) executing joint tasks in smart environments (shopping centers, pedestrian streets, airports…). Robots exchange data about their current states (positions, configurations…) and also receive information from sensors mounted in the environment (monitoring cameras, RFID readers, optical barriers…) or dropped during a deployment step. Every sensor and robot detects events and generates interpretations (object or activity recognition): data fusion must be undertaken at the lower (signal) or higher (object, activity) levels, in order to make the perception of every subsystem faster, more meaningful and more robust.
  • Technologies developed for robots can also be exploited in other application domains: visual monitoring, transport, control of industrial processes, 3D modelling, video analysis… The RAP group participates in collaborative projects that aim to study specific problems emerging from these applications.
As might be expected in robotics, every research work carried out in the RAP group has two objectives: a theoretical contribution in control or in perception, and an experimental validation. The RAP group therefore uses numerous experimental facilities: robots managed by the LAAS robotics platform, as well as the specific sensors and actuators presented hereafter.

  • The FESTO Cartesian robot
  • Infrared stereovision (8-12 μm)
  • The Smart Camera developed by Delta Technologies Sud Ouest
  • The Optinum 3D scanner from NOOMEO

The RAP group has research activities in four interleaved themes: