i.Drive

i.Drive is an interdepartmental laboratory where AIRLab is the technology provider for robotics.

The laboratory aims to develop the interdisciplinary expertise required to analyse and model the behavioral aspects arising from the interaction between driver, vehicle, infrastructure, and environment, through:

  • A fixed structural component based on a virtual reality simulator, aimed at the ex-ante testing of expected behavioral models, the joint optimization of vehicle and road infrastructure, and the increase of the ex-post and in-itinere statistical significance of experiments carried out on roads;
  • A mobile component based on an instrumented vehicle, aimed at measuring the on-road performance and reactions of drivers in different driving conditions, and at collecting environmental data to be reproduced ex-post in simulation.

Contact: Matteo Matteucci

For additional details: http://www.idrive.polimi.it/

MADROB + BEAST

MADROB (Modular Active Door for RObot Benchmarking) and BEAST (Benchmark-Enabling Active Shopping Trolley) are benchmarks for autonomous robots aimed at measuring their capabilities and performance when dealing with devices that are common in human environments.

MADROB is focused on opening doors; BEAST considers the problem of pushing a shopping trolley. Both make use of a device with the same features as its real counterpart, fitted with sensors (to assess the actions of the robot on it: e.g., the force applied to the door handle, or the precision in following a trajectory with the trolley) and actuators (to introduce disturbances simulating real-world phenomena: e.g., wind pushing the door panel, or a stone under the trolley's wheel).
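
As an illustration of this sense-and-disturb design, the sketch below shows how such a benchmark device could be driven in software. It is a minimal, hypothetical example: the class and method names are invented for this sketch and do not come from the actual MADROB or BEAST software.

    import random
    import time

    class DoorBenchmark:
        """Hypothetical interface to a MADROB-like device (illustrative only)."""

        def __init__(self):
            self.log = []  # (elapsed time, handle force, applied torque)
            self.applied_torque = 0.0

        def read_handle_force(self):
            # Stand-in for the real force sensor on the door handle.
            return random.uniform(0.0, 5.0)  # newtons (simulated)

        def set_wind_torque(self, torque):
            # Stand-in for the actuator simulating wind on the door panel.
            self.applied_torque = torque

        def run_trial(self, duration_s=10.0, wind_torque=1.5, rate_hz=50):
            t0 = time.time()
            while time.time() - t0 < duration_s:
                self.set_wind_torque(wind_torque)  # inject disturbance
                force = self.read_handle_force()   # assess the robot's action
                self.log.append((time.time() - t0, force, self.applied_torque))
                time.sleep(1.0 / rate_hz)
            return self.log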

Beyond the hardware and software, MADROB and BEAST also comprise procedures and performance metrics that enable an objective evaluation of robot performance, as well as comparisons between different robots and between robots and humans.
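
For instance, one trolley-related metric could be expressed as the root-mean-square deviation between the commanded trajectory and the one the robot actually followed. The function below is a minimal sketch of such a metric, written for illustration only; it is not the official EUROBENCH metric definition.

    import math

    def rms_trajectory_error(reference, executed):
        """RMS distance between matched waypoints of a reference path
        and the path actually followed with the trolley (illustrative,
        not the official EUROBENCH definition)."""
        assert len(reference) == len(executed), "paths must be matched pointwise"
        squared = [(rx - ex) ** 2 + (ry - ey) ** 2
                   for (rx, ry), (ex, ey) in zip(reference, executed)]
        return math.sqrt(sum(squared) / len(squared))

    # Example: a robot drifting slightly sideways along a 1 m straight segment.
    reference = [(x / 10.0, 0.0) for x in range(11)]
    executed = [(x / 10.0, 0.02 * x / 10.0) for x in range(11)]
    print(f"RMS error: {rms_trajectory_error(reference, executed):.4f} m")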

Contact: Matteo Matteucci

For additional details: http://eurobench2020.eu/developing-the-framework/modular-active-door-for-robot-benchmarking-madrob/, http://eurobench2020.eu/developing-the-framework/benchmark-enabling-active-shopping-trolley-beast/

Personal Mobility Kit

The PMK is an add-on for commercial electric wheelchairs that uses robotic technology to provide two new functionalities:

  • autonomous driving, where the user only has to select her goal and the PMK drives the wheelchair safely to the destination;
  • assisted driving, where the user is in charge of driving and the PMK only intervenes to ensure safety (e.g., slowing down to avoid a collision with a child jumping in front of the wheelchair) or to provide help in difficult maneuvers (e.g., while approaching doorways); a minimal sketch of such a safety filter follows this list.
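
The following sketch illustrates the assisted-driving idea as a simple safety filter on the user's speed command. It is a hypothetical, minimal example, not the actual PMK controller: the function name, thresholds, and the linear slow-down ramp are all assumptions made for illustration.

    def assisted_speed(user_cmd, obstacle_dist, stop_dist=0.5, slow_dist=2.0):
        """Scale the user's speed command based on the nearest obstacle.

        Minimal shared-autonomy sketch (hypothetical, not the actual PMK
        controller): the user stays in charge, and the filter only slows
        the wheelchair down when an obstacle gets close.
        """
        if obstacle_dist <= stop_dist:
            return 0.0  # imminent collision: stop
        if obstacle_dist < slow_dist:
            # Linear ramp: full speed at slow_dist, zero at stop_dist.
            scale = (obstacle_dist - stop_dist) / (slow_dist - stop_dist)
            return user_cmd * scale
        return user_cmd  # no nearby obstacle: obey the user

    # A child jumps in front at 1.2 m while the user commands 1.0 m/s:
    print(assisted_speed(1.0, 1.2))  # ~0.47 m/s: slowed, not stopped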

The PMK has been developed in collaboration with disabled people. Its design and implementation are centered on the principle of shared autonomy: the robotic part of the wheelchair intervenes only when its intervention actually makes the user feel more empowered, augmenting the user's autonomy and independence.

Contact: Matteo Matteucci

Bedmover

This is an example of collaboration between AIRLab and industry for the development of innovative products. In this case, the product is a device for the assisted transportation of hospital beds (with patients) that employs robotic technology to enable flexible movement in tight spaces and to ensure safety.

This project, a collaboration between AIRLab, Università di Milano – Bicocca and Info Solution SpA financed by Regione Lombardia, led to the development of international patents and a commercial product (BeN).

Contact: Matteo Matteucci

Emotional robots

Expressing emotions with a robot that may have any shape and set of abilities requires studying the expressive power of the different interaction channels and the different features involved in emotional expression: shape, movement, colors, sound, light, timing, input channels, etc.

Along this line we have developed, among others, the robots listed below. Some of them exploit emotion expression in their work (robots for people with disabilities, artist robots); others are experiments to evaluate the basic components of emotion expression.

OIMI, an emotional soft robot

OIMI is built on a holonomic base (Triskarino), free to move on the ground and equipped with infrared distance sensors, on which a kind of soft egg is mounted. The main body also includes tactile sensors that make it possible to detect whether OIMI is caressed, hugged, or hit. Under the base there is a circle of colored LEDs, and a loudspeaker enables it to emit sounds that express its emotional state. On the upper part of the body, five touch-sensitive sections are used to communicate explicitly with the robot. On the front of the body, an area enables the user to compose faces from given elements.
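
A natural way to implement this kind of reaction is a mapping from the detected tactile event to an emotional state, rendered through the LED circle and the loudspeaker. The sketch below illustrates this idea; all names, colors, and sound files are invented for the example and are not the actual OIMI software.

    # Hypothetical mapping from tactile events to emotional reactions
    # (illustrative only; not the actual OIMI software).
    TOUCH_TO_EMOTION = {
        "caress": ("happy", (0, 255, 0), "purr.wav"),
        "hug":    ("content", (0, 128, 255), "coo.wav"),
        "hit":    ("sad", (255, 0, 0), "whine.wav"),
    }

    def set_led_ring(rgb):
        print(f"LED ring -> {rgb}")  # stand-in for the LED circle driver

    def play_sound(path):
        print(f"playing {path}")     # stand-in for the loudspeaker driver

    def react(touch_event):
        emotion, led_rgb, sound = TOUCH_TO_EMOTION.get(
            touch_event, ("neutral", (255, 255, 255), None))
        set_led_ring(led_rgb)
        if sound:
            play_sound(sound)
        return emotion

    print(react("caress"))  # -> happy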

OIMI is intended to be used as an agent, other than the therapist, with whom children, especially those affected by Autism Spectrum Disorder (ASD), can interact. We have implemented some autonomous reactions, mainly related to how the robot is manipulated and to the user's distance. OIMI can also be driven by an operator or by programs running locally. OIMI has been implemented in six versions. The first three, under the name Teo, were developed within the Polisocial KROG project as a mobile element that interacts with a large screen and a Kinect system in play activities. The fourth version was provided as operational support for therapy and free play at the "Il Sogno" ONLUS Association in Castelnuovo di Garfagnana (LU), where it has been operational since 2018. The fifth version exploits novel touch sensors and improved interaction capabilities. A sixth version considers the subject's movement to drive the interaction. Further versions are under development, integrating sound direction and analysis, and goal-directed actions (e.g., to improve attention while playing).

More information on robots and disabilities at AIRLab is available at: http://playbot4all.polimi.it