Personal page: https://www.deib.polimi.it/eng/people/details/100410
Personal page: https://www.deib.polimi.it/eng/people/details/267262
OIMI is built on a holonomic base (Triskarino), free to move on the ground and equipped with infrared distance sensors, on which a soft egg-shaped body is mounted. The main body also includes tactile sensors that make it possible to detect whether OIMI is caressed, hugged, or hit. Under the base there is a ring of colored LEDs, and a loudspeaker enables the robot to emit sounds that express its emotional state. On the upper part of the body, five touch-sensitive sections are used to communicate explicitly with the robot. On the front of the body, an area enables the user to compose faces from given elements.
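How the raw tactile readings are turned into these three contact types is not detailed here; the following is a minimal sketch of one possible classification, assuming normalized pressure readings and a measured contact duration (the function name, thresholds, and labels are illustrative assumptions, not OIMI's actual firmware).

```python
def classify_touch(pressures, duration_s,
                   light=0.2, strong=0.7, spike_s=0.3):
    """Classify a contact from normalized tactile readings (0..1) and
    its duration in seconds. Returns 'caress', 'hug', 'hit' or None.
    All thresholds are illustrative placeholders."""
    active = [p for p in pressures if p > light]
    if not active:
        return None                      # no meaningful contact
    peak = max(active)
    if peak >= strong and duration_s < spike_s:
        return "hit"                     # brief, intense impact
    if peak >= strong and len(active) >= len(pressures) // 2:
        return "hug"                     # firm contact over many sensors
    if duration_s >= spike_s:
        return "caress"                  # gentle, prolonged contact
    return None
```

With these placeholder thresholds, two sensors reading 0.8 and 0.75 for 0.2 s would be classified as a hit, while a single 0.4 reading lasting 2 s would be classified as a caress.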
OIMI is intended to be used as an agent, besides the therapist, with whom children, especially those affected by Autism Spectrum Disorder (ASD), can interact. We have implemented some autonomous reactions, mainly related to how the robot is manipulated and to the user's distance. OIMI can also be driven by an operator or by programs running locally. OIMI has been implemented in six versions. The first three, under the name Teo, were developed within the Polisocial KROG project as a mobile element that interacts with a large screen and a Kinect system in play activities. The fourth version was provided as operational support for therapies and free play at the "Il Sogno" ONLUS Association in Castelnuovo di Garfagnana (LU), where it has been operational since 2018. The fifth version exploits novel touch sensors and improved interaction capabilities. A sixth version considers the movement of the subject to drive the interaction. Other versions are under development, integrating sound direction detection and analysis, and goal-directed actions (e.g. to improve attention while playing).
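An autonomous reaction layer of this kind could map the last classified touch and the user's distance (estimated from the infrared sensors) to the LED ring and the loudspeaker. The sketch below continues the one above; the colour/sound pairings and the set_led_color / play_sound helpers are hypothetical placeholders, not OIMI's actual behaviour set.

```python
def react(touch, user_distance_m, set_led_color=print, play_sound=print):
    """Pick an expressive reaction from the last touch classification
    ('caress', 'hug', 'hit' or None) and the user's distance in metres.
    Colours and sound names are illustrative placeholders."""
    if touch == "hit":
        set_led_color("red")             # signal distress
        play_sound("sad.wav")
    elif touch in ("caress", "hug"):
        set_led_color("green")           # signal contentment
        play_sound("happy.wav")
    elif user_distance_m < 0.5:
        set_led_color("yellow")          # someone has come close
        play_sound("curious.wav")
    else:
        set_led_color("blue")            # idle, nobody interacting
```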
More information on robots and disabilities at AIRLab is available at: http://playbot4all.polimi.it