The robot toddler that teaches diabetic children about their condition

Ask people to consider artificial emotional intelligence and chances are they’ll imagine super servants and workers, but another application is in medicine. We explore the relationship between AEI and health.

At the ‘Feeling Emotional’ Late Spectacular event at London’s Wellcome Collection, researchers, scientists, artists and performers hosted a series of events exploring human emotion – how we feel and express ourselves through art and science.

If you were lucky enough to gain entry, you may have seen the robot infant Robin in his playpen. Robin is an autonomous robot programmed by Dr Lola Cañamero and Dr Matthew Lewis at the University of Hertfordshire to have diabetes and demonstrate certain behaviours associated with the illness.

This Emotion Modeling research project uses play and bonding activities to educate diabetic children about managing their condition.

Cañamero and Lewis invite young children to come and play with Robin and, while artificial emotional intelligence (AEI) might seem unnecessary for a healthcare project, Cañamero explains: “Emotions are an essential component in humans and they affect pretty much everything we do: our way of thinking, our way of moving, the way we look at things. What we’re interested in is how that occurs throughout the body.”

Child caregivers

Robin is a standard off-the-shelf Aldebaran NAO robot, a model designed with emotional capabilities, but his unique personality has been created by the research team. He has been programmed so that as his blood sugar levels fluctuate, his behaviour changes, and he requires food, a drink or a virtual shot of insulin to regulate his glucose.
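The article does not publish the team’s code, but the mechanism it describes (a glucose value that drifts over time, rises with food and sugary drinks, falls with insulin, and drives behaviour depending on whether it sits inside a healthy band) maps onto a very small simulation. Here is a minimal sketch in Python; every name, number and unit is an illustrative assumption, not the researchers’ actual implementation.

    # Minimal sketch of a simulated blood-glucose variable; all names,
    # numbers and units are illustrative assumptions.

    HEALTHY_RANGE = (4.0, 8.0)  # assumed healthy band, in mmol/L

    class SimulatedGlucose:
        def __init__(self, level=6.0):
            self.level = level

        def tick(self):
            """Glucose drifts downwards over time as energy is used."""
            self.level -= 0.2

        def eat(self, carbs):
            """Food and sugary drinks push the level up."""
            self.level += carbs

        def insulin(self, units):
            """A virtual insulin shot brings a high level back down."""
            self.level -= units

        def state(self):
            low, high = HEALTHY_RANGE
            if self.level < low:
                return "hypoglycaemia"   # Robin sits down and moans
            if self.level > high:
                return "hyperglycaemia"  # Robin dances and whoops
            return "ok"

    robin = SimulatedGlucose()
    robin.eat(4.0)        # a sugary drink sends him high
    print(robin.state())  # hyperglycaemia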

Image courtesy of Ceri Jones

In addition to his diabetes, Lewis coded Robin to have toddler qualities; he is affectionate and playful, has bouts of energy where he will dance and wander around, and displays curiosity about his surroundings.

“You can make a robot that is a bit like a puppet, and many of the robots that we see are like that, with very expressive faces,” says Cañamero. “But we can actually programme the robot by giving them motivations, which are really numbers that have to be kept within a range, so there’s no magic.”

“By giving it that we can have the robot do things on its own, decide what to do, what it wants to do and likes to do.” This was evident during the demonstration, as Robin tottered around freely, staring up at the crowd, asking for food, drinks, and lots of cuddles.
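Cañamero’s description, motivations as numbers that must be kept within a range, corresponds to a classic homeostatic control loop: each drive drifts over time, and the robot acts on whichever drive has strayed furthest from its comfort band, playing freely when all are satisfied. A minimal sketch of that idea, with all names, levels and thresholds invented for illustration:

    import random

    # Minimal sketch of motivation-driven action selection: each motivation
    # is a number kept within a range, as Cañamero describes. All names and
    # values here are invented for illustration.

    class Motivation:
        def __init__(self, name, level, low, high, decay):
            self.name, self.level = name, level
            self.low, self.high, self.decay = low, high, decay

        def urgency(self):
            """How far the level has strayed outside its comfort range."""
            if self.level < self.low:
                return self.low - self.level
            if self.level > self.high:
                return self.level - self.high
            return 0.0

    def step(motivations):
        # Levels drift over time, like hunger growing between meals.
        for m in motivations:
            m.level -= m.decay
        # Act on the most out-of-range motivation; otherwise play freely.
        urgent = max(motivations, key=lambda m: m.urgency())
        if urgent.urgency() > 0:
            urgent.level = (urgent.low + urgent.high) / 2  # e.g. Robin eats
            return "satisfy " + urgent.name
        return random.choice(["dance", "wander", "ask for a cuddle"])

    robin = [
        Motivation("hunger", level=5.0, low=3.0, high=8.0, decay=0.4),
        Motivation("affection", level=6.0, low=4.0, high=9.0, decay=0.2),
    ]
    for _ in range(10):
        print(step(robin))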

This naïve and curious character is essential for the study, says Lewis, explaining: “By putting the child in a situation where they’re looking after Robin, it’s a sort of playful version of managing themselves, but because it’s a toddler, it’s very much the child who becomes the caretaker and is in charge.”

“It’s really their decisions. They’re not following instructions or anything; they are the person who makes those decisions.”

Initiative and independence

Developing a singular AEI is an expensive and complex process, so why is this preferable to simply programming a normal robot? “The robots have both motivations and emotions, but these give them their own values and reasons for things,” says Cañamero.

We wanted an agent which had its own motivations and maybe didn’t want to eat the correct food

“They want to eat, for example, or when they have satisfied their hunger they might want to play, like Robin. Emotions in addition [to motivations] make them like or dislike the things they do, or the way people interact with them.”

The project focuses on children between seven and 12 years old, an age at which most children are gaining greater independence from their parents and so need the tools to deal with their condition.

“We wanted to have a situation that felt like something in the real world. And when you’re managing diabetes things don’t always go right,” explains Lewis.

“Rather than have a script where the child knows we do a certain thing and then the results are as expected, we wanted an agent which had its own motivations and maybe didn’t want to eat the correct food.”

The playpen holds a variety of healthy foods along with sweets and sugary drinks, and with no adults present, Robin could become unwell. Lewis adds: “The child was put in a situation where they say, ‘no, you should eat this. This is good for you, you need to eat it’, which should reinforce the value that they put in diabetes management for themselves.”

Learning for life

Children respond to Robin as naturally as they would to any high-tech toy, with fascination and excitement, while also enjoying a rare positive experience at a clinic. Cañamero feels that being endearing yet unpredictable lifts Robin beyond toy status, making him seem more like a vulnerable younger sibling.

Image courtesy of Russell Dornan

But the valuable medical insight is gained through the realism of Robin’s behaviour. Children recognise from their own experience the contrast between his dancing and whooping during a glucose high (hyperglycaemia) and his tendency to sit down and moan during a low (hypoglycaemia), so they can relate to him.

Cañamero says: “They identify so they feel, ‘Okay, Robin is tired. I remember that’s very important for me and I find that very difficult’, and they want to help the robot. It makes them think how to apply the knowledge that they learn in books about diabetes.”

The Emotion Modeling project is successfully helping researchers connect with children, offering them a new and essential type of learning experience. Cañamero has been using robots in her research for many years and says that, although adults may have reservations when dealing with robots, “For children, it’s a natural thing. It’s part of their world now.”

For more information about Dr Cañamero and Dr Lewis’s Emotion Modeling project at the University of Hertfordshire, please visit www.emotion-modeling.info/robin. Or, to explore what it means to be human through medicine, art and science visit Wellcome Collection, London, UK. 

Soviet report detailing lunar rover Lunokhod-2 released for first time

Russian space agency Roskosmos has released a scientific report on the lunar rover Lunokhod-2 for the first time, revealing previously unknown details about the rover and how it was controlled from Earth.

The report, written entirely in Russian, was originally penned in 1973 following the Lunokhod-2 mission, which began in January of the same year. It had remained accessible to only a handful of experts at the space agency prior to its release today, timed to mark the 45th anniversary of the mission.

Bearing the names of some 55 engineers and scientists, the report details the systems that were used both to remotely control the lunar rover from a base on Earth and to capture images and data about the Moon’s surface and Lunokhod-2’s position on it. This information, and in particular the carefully documented issues and solutions that the report contains, went on to be used in many later unmanned missions to other parts of the solar system.

As a result, it provides a unique insight into this era of space exploration and the technical challenges that scientists faced, such as the low-frame-rate television system that functioned as the ‘eyes’ of the Earth-based rover operators.

A NASA depiction of the Lunokhod mission. Above: an image of the rover, courtesy of NASA, overlaid onto a panorama of the Moon taken by Lunokhod-2, courtesy of Ruslan Kasmin.

One detail that may be of particular interest to space enthusiasts and experts is the operation of a unique system called Seismas, which was tested during the mission for the first time anywhere in the world.

Designed to determine the precise location of the rover at any given time, the system involved transmitting laser pulses from ground-based telescopes to a photodetector on board the lunar rover. When a laser pulse was detected, the rover emitted a radio signal back to Earth, which provided the rover’s coordinates.
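As described, the exchange is a simple trigger-and-reply loop: a laser pulse from a ground telescope reaches the rover’s photodetector, and the rover answers by radio so that the ground station can fix its position. A minimal sketch of that logic follows; the sensor and transmitter are stand-ins, and every name and threshold is an assumption rather than detail from the report.

    import time

    # Illustrative trigger-and-reply loop in the spirit of Seismas: a laser
    # pulse detected on board triggers a radio reply to the ground station.
    # The sensor, transmitter and threshold below are all stand-ins.

    LASER_THRESHOLD = 0.8  # assumed normalised photodetector reading

    def photodetector_reading():
        """Stand-in for the onboard sensor; returns a normalised intensity."""
        return 0.9  # pretend a laser pulse has just arrived

    def emit_radio_reply(timestamp):
        """Stand-in for the rover's radio transmitter."""
        print("radio reply sent at t = {:.6f}".format(timestamp))

    def seismas_loop():
        while True:
            if photodetector_reading() > LASER_THRESHOLD:
                # The reply, combined with the known telescope pointing,
                # lets the ground station compute the rover's coordinates.
                emit_radio_reply(time.time())
                break
            time.sleep(0.001)

    seismas_loop()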

Other details, while technical, also give some insight into the culture of the mission, such as the careful work to eliminate issues in the long-range radio communication system. One issue, for example, was worked on so thoroughly that one of the devices ended up using more resources than it had been allocated, a problem that is outlined in the report.

The document also provides insight into the technological capabilities of the time. While it is mostly typed, certain mathematical symbols had to be written in by hand, and the report also features a number of painstakingly hand-drawn diagrams and graphs.

A hand-drawn graph from the report, showing temperature changes during one of the monitoring sessions during the mission

Lunokhod-2 was the second of two unmanned lunar rovers to be landed on the Moon by the Soviet Union within the Lunokhod programme, having been delivered via a soft landing by the unmanned Luna 21 spacecraft in January 1973.

In operation between January and June of that year, the robot covered a distance of 39km, meaning it still holds the lunar distance record to this day.

One of only four rovers ever deployed on the lunar surface, Lunokhod-2 was the last to visit the Moon until December 2013, when the Chinese lunar rover Yutu arrived.

Robot takes first steps towards building artificial lifeforms

A robot equipped with sophisticated AI has successfully simulated the creation of artificial lifeforms, in a key first step towards the eventual goal of creating true artificial life.

The robot, which was developed by scientists at the University of Glasgow, was able to model the creation of artificial lifeforms using unstable oil-in-water droplets. These droplets effectively played the role of living cells, demonstrating the potential for future research to develop living cells from building blocks that cannot be found in nature.

Significantly, the robot also successfully predicted their properties before they were created, even though this could not be achieved using conventional physical models.

The robot, which was designed by Glasgow University’s Regius Chair of Chemistry, Professor Lee Cronin, is driven by machine learning and the principles of evolution.

It has been developed to autonomously create oil-in-water droplets with a host of different chemical makeups and then use image recognition to assess their behaviour.

Using this information, the robot was able to engineer droplets to have different properties. Those which were found to be desirable could then be recreated at any time, using a specific digital code.
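The loop described here (make candidate droplets, score them by image recognition, keep the fittest recipes, and recreate any recipe later from its digital code) is in essence an evolutionary search. A minimal sketch follows, with the chemistry and the vision system stubbed out; the recipe format, scoring function and every parameter are assumptions for illustration, not Cronin’s platform.

    import random

    # Minimal sketch of an evolve-and-score loop: recipes stand in for
    # droplet formulations and a stub replaces the image-recognition
    # scoring. All names and parameters are illustrative assumptions.

    N_OILS = 4

    def random_recipe():
        """A recipe is the 'digital code' for a droplet: one ratio per oil."""
        mix = [random.random() for _ in range(N_OILS)]
        total = sum(mix)
        return [x / total for x in mix]

    def score(recipe):
        """Stand-in for image recognition assessing droplet behaviour
        (movement, division and so on); here an arbitrary smooth function."""
        return -sum((x - 1.0 / N_OILS) ** 2 for x in recipe)

    def mutate(recipe, rate=0.1):
        """Perturb a parent recipe to produce a child formulation."""
        mix = [max(0.0, x + random.gauss(0, rate)) for x in recipe]
        total = sum(mix) or 1.0
        return [x / total for x in mix]

    population = [random_recipe() for _ in range(20)]
    for generation in range(30):
        population.sort(key=score, reverse=True)
        parents = population[:5]                    # keep the fittest recipes
        children = [mutate(random.choice(parents)) for _ in range(15)]
        population = parents + children

    best = max(population, key=score)
    print("best recipe (digital code):", [round(x, 3) for x in best])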

“This work is exciting as it shows that we are able to use machine learning and a novel robotic platform to understand the system in ways that cannot be done using conventional laboratory methods, including the discovery of ‘swarm’ like group behaviour of the droplets, akin to flocking birds,” said Cronin.

“Achieving lifelike behaviours such as this are important in our mission to make new lifeforms, and these droplets may be considered ‘protocells’ – simplified models of living cells.”

One of the oil droplets created by the robot

The research, which is published today in the journal PNAS, is one of several research projects being undertaken by Cronin and his team within the field of artificial lifeforms.

While the overarching goal is to move towards the creation of lifeforms from new and unprecedented building blocks, the research may also have more immediate applications.

The team believes that their work could also have applications in several practical areas, including the development of new methods for drug delivery or even innovative materials with functional properties.