Sprayin’ with Brain

Sprayin’ with Brain is a line of research projects dealing with applications of Artificial Intelligence and Robotics to agriculture. Collaboration with external experts (such as researchers in agriculture) and interaction with manufacturers of agricultural machinery and with farmers keep Sprayin’ with Brain focused on real-world problems.

Contact: Matteo Matteucci

GRAPE

GRAPE (Ground Robot for vineyArd Monitoring and ProtEction) explores the use of autonomous robots as a means of pesticide-free pest control in vineyards. A robot fitted with an arm and a dispenser of pheromone-coated devices has the task of navigating through the vineyard, finding suitable target locations for the devices, and placing them there.

Contact: Matteo Matteucci

For additional details: http://www.echord.eu/grape.html
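
As a rough illustration of the task described above, the sketch below shows one possible mission loop (navigate the rows, find target locations, place the pheromone-coated devices). The interfaces and names (navigator, detector, arm) are hypothetical and are not part of the actual GRAPE software.

    # Hypothetical sketch of a GRAPE-style mission loop; the navigator,
    # detector, and arm interfaces are illustrative, not the actual GRAPE stack.

    def run_mission(navigator, detector, arm, devices_left):
        """Place pheromone-coated devices along the vineyard rows."""
        for row in navigator.plan_rows():               # route covering the vineyard
            for target in detector.targets_along(row):  # suitable spots on the canes
                navigator.drive_to(target)              # approach the target pose
                if arm.place_device(target):            # clip a device in place
                    devices_left -= 1
                if devices_left == 0:
                    return                              # dispenser empty: mission over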

Emotional robots

Expressing emotions with a robot that could have any shape and abilities requires studying the expressive power of different interaction channels and the different features involved in emotional expression: shape, movement, colors, sound, light, timing, input channels, etc.

Along this line we have developed, among others, the robots listed below. Some of them exploit emotion expression in their job (robots for people with disabilities, artist robots); others are experiments to evaluate the basic components of emotion expression.

OIMI, an emotional soft robot

OIMI is built on a holonomic base (Triskarino), free to move on the ground and equipped with infrared distance sensors, on which a kind of soft egg is fixed. The main body also includes tactile sensors that make it possible to detect whether OIMI is caressed, hugged, or hit. Under the base there is a circle of colored LEDs, and a loudspeaker allows the robot to emit sounds that express its emotional state. On the upper part of the body, five touch-sensitive sections are used to communicate explicitly with the robot. On the front of the body, an area enables the user to compose faces with given elements.
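
As a rough illustration of how contact sensing of this kind can work, the sketch below classifies a short window of tactile-sensor readings into the three contact classes mentioned above (caress, hug, hit) using simple pressure and duration thresholds. The thresholds and names are hypothetical and are not taken from the actual OIMI software.

    # Hypothetical sketch: classify a short window of tactile readings into
    # "caress", "hug", or "hit" using pressure and duration heuristics.
    # Thresholds and names are illustrative, not the actual OIMI values.

    def classify_contact(pressures, duration_s,
                         hit_pressure=0.8, hug_pressure=0.4, caress_min_s=0.5):
        """pressures: normalized readings in [0, 1] from the tactile sensors."""
        peak = max(pressures)
        mean = sum(pressures) / len(pressures)
        if peak >= hit_pressure and duration_s < 0.3:
            return "hit"        # short, strong impact
        if mean >= hug_pressure and duration_s >= caress_min_s:
            return "hug"        # sustained pressure over a wide area
        if duration_s >= caress_min_s:
            return "caress"     # light, prolonged contact
        return "none"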

OIMI is intended to be used as an additional agent, besides the therapist, with whom children, especially those affected by Autism Spectrum Disorder (ASD), can interact. We have implemented some autonomous reactions, mainly related to how the robot is manipulated and to the user's distance from it. OIMI can also be driven by an operator or by programs running locally. OIMI has been implemented in six versions. The first three, under the name Teo, were developed within the Polisocial KROG project as a mobile element that interacts with a large screen and a Kinect system in play activities. The fourth version was provided as operational support for therapies and free play at the “Il Sogno” ONLUS Association in Castelnuovo di Garfagnana (LU), where it has been operational since 2018. The fifth version exploits novel touch sensors and improved interaction capabilities. A sixth version uses the subject's movement to drive the interaction. Other versions are under development, integrating sound direction detection and analysis, and goal-directed actions (e.g., to improve attention while playing).
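
A minimal sketch of how such autonomous reactions could be organized is shown below: the touch class and the user's distance (from the infrared sensors) are mapped to an emotional state, which in turn selects an LED color and a sound. The mapping, names, and values are hypothetical and only illustrate the idea, not the actual OIMI implementation.

    # Hypothetical sketch of an autonomous reaction policy: touch class and
    # user distance are mapped to an emotional state expressed through the
    # LED ring and the loudspeaker. All values are illustrative only.

    REACTIONS = {
        "caress": ("happy",   (0, 255, 0),     "purr.wav"),
        "hug":    ("calm",    (0, 128, 255),   "hum.wav"),
        "hit":    ("scared",  (255, 0, 0),     "whimper.wav"),
        "none":   ("neutral", (255, 255, 255), None),
    }

    def react(touch_class, user_distance_m, leds, speaker):
        state, color, sound = REACTIONS.get(touch_class, REACTIONS["none"])
        if touch_class == "none" and user_distance_m < 0.5:
            state, color, sound = "curious", (255, 200, 0), "chirp.wav"  # someone is close
        leds.set_color(color)        # LED ring under the base
        if sound is not None:
            speaker.play(sound)      # loudspeaker in the main body
        return state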

More information on Robots and disabilities at AIRLab is available from: http://playbot4all.polimi.it