The robot toddler that teaches diabetic children about their condition

Ask people to consider artificial emotional intelligence and chances are they'll imagine super servants and workers, but another application is in medicine. We explore the relationship between AEI and health.

At the ‘Feeling Emotional’ Late Spectacular event at London’s Wellcome Collection, researchers, scientists, artists and performers hosted a series of events exploring human emotion – how we feel and express ourselves through art and science.

If you were lucky enough to gain entry, you may have seen the robot infant Robin in his playpen. Robin is an autonomous robot programmed by Dr Lola Cañamero and Dr Matthew Lewis at the University of Hertfordshire to have diabetes and demonstrate certain behaviours associated with the illness.

This Emotion Modeling research project uses play and bonding activities to educate diabetic children about managing their condition.

Cañamero and Lewis invite young children to come and play with Robin and, while artificial emotional intelligence (AEI) might seem unnecessary for a healthcare project, Cañamero explains: “Emotions are an essential component in humans and they affect pretty much everything we do: our way of thinking, our way of moving, the way we look at things, and what we’re interested in is how that occurs throughout the body.”

Child caregivers

Robin is a standard off-the-shelf Aldebaran NAO robot, a model designed with emotional capabilities, but his unique personality has been created by the research team. He has been programmed so that as his blood sugar levels fluctuate, his behaviour changes, and he requires food, a drink, or a virtual shot of insulin to regulate his glucose.


Image courtesy of Ceri Jones

In addition to his diabetes, Lewis coded Robin to have toddler qualities; he is affectionate and playful, has bouts of energy where he will dance and wander around, and displays curiosity about his surroundings.

“You can make a robot that is a bit like a puppet, and many of the robots that we see are like that, with very expressive faces,” says Cañamero. “But we can actually programme the robot by giving them motivations, which are really numbers that have to be kept within a range, so there’s no magic.”

“By giving it that we can have the robot do things on its own, decide what to do, what it wants to do and likes to do.” This was evident during the demonstration, as Robin tottered around freely, staring up at the crowd, asking for food, drinks, and lots of cuddles.
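Cañamero's description of motivations as "numbers that have to be kept within a range" is essentially a homeostatic control loop. As a rough illustration only (hypothetical variable names, ranges and actions, not the team's actual code), such a loop might be sketched like this:

```python
# Minimal sketch of a homeostatic motivation loop, in the spirit of
# Cañamero's description. All names and thresholds are illustrative.

IDEAL = {"glucose": (4.0, 7.0), "energy": (0.5, 1.0)}  # target ranges

def urgency(name, value):
    """How far a variable has drifted outside its ideal range."""
    low, high = IDEAL[name]
    if value < low:
        return low - value
    if value > high:
        return value - high
    return 0.0

def choose_action(state):
    """Act on the most out-of-range variable; otherwise behave freely."""
    name = max(state, key=lambda n: urgency(n, state[n]))
    if urgency(name, state[name]) == 0.0:
        return "play"  # nothing urgent: dance, wander, explore
    if name == "glucose":
        # Too low: eat; too high: a (virtual) shot of insulin
        return "eat" if state[name] < IDEAL[name][0] else "insulin"
    return "rest"      # low energy

state = {"glucose": 3.2, "energy": 0.8}  # a hypoglycaemic moment
print(choose_action(state))              # -> eat
```

Because the robot's next action falls out of whichever variable is most urgent, rather than a fixed script, its behaviour changes moment to moment, which is what lets Robin "decide what to do" on his own.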

This naïve and curious character is essential for the study, Lewis explains: “By putting the child in a situation where they’re looking after Robin, it’s a sort of playful version of managing themselves, but because it’s a toddler, it’s very much the child who becomes the caretaker and is in charge.”

“It’s really their decisions. They’re not following instructions or anything; they are the person who makes those decisions.”

Initiative and independence

Developing a singular AEI is an expensive and complex process, so why is this preferable to simply programming a normal robot? “The robots have both motivations and emotions but these give them their own values and reasons for things,” says Cañamero.

We wanted an agent which had its own motivations and maybe didn’t want to eat the correct food

“They want to eat, for example, or when they have satisfied their hunger they might want to play, like Robin. Emotions in addition [to motivations] make them like or dislike the things they do, or the way people interact with them.”

The project focuses on children between seven and 12 years old, an age where most children are gaining greater independence from their parents, and so need the tools to deal with their condition.

“We wanted to have a situation that felt like something in the real world. And when you’re managing diabetes things don’t always go right,” explains Lewis.

“Rather than have a script where the child knows we do a certain thing and then the results are as expected, we wanted an agent which had its own motivations and maybe didn’t want to eat the correct food.”

The playpen holds a variety of healthy foods along with sweets and sugary drinks, and with no adults present, Robin could become unwell. As Lewis adds: “The child was put in a situation where they say, ‘no, you should eat this. This is good for you, you need to eat it’, which should reinforce the value that they put in diabetes management for themselves.”

Learning for life

Children respond to Robin as naturally as they would to any high-tech toy, with fascination and excitement, and he offers them a rare positive experience at a clinic. Cañamero feels that being endearing and also unpredictable makes Robin transcend the robot’s toy status and makes him seem more like a vulnerable younger sibling.


Image courtesy of Russell Dornan

But the valuable medical insight is gained through the realism of Robin’s behaviour. Children recognise the contrast between his dancing and whooping during a glucose high (hyperglycaemia) and his tendency to sit down and moan due to a low (hypoglycaemia) from their own experiences, so they can relate to him.

Cañamero says: “They identify so they feel, ‘Okay, Robin is tired. I remember that’s very important for me and I find that very difficult’, and they want to help the robot. It makes them think how to apply the knowledge that they learn in books about diabetes.”

The Emotion Modeling project is successfully helping researchers connect with children, offering them a new and essential type of learning experience. Cañamero has been using robots in her research for many years and says that, although adults may have reservations when dealing with robots, “For children, it’s a natural thing. It’s part of their world now.”

For more information about Dr Cañamero and Dr Lewis’s Emotion Modeling project at the University of Hertfordshire, please visit www.emotion-modeling.info/robin. Or, to explore what it means to be human through medicine, art and science visit Wellcome Collection, London, UK. 

Atari tells fans its new Ataribox console will arrive in late 2018

Atari has revealed more details about its Ataribox videogame console, disclosing that the console will ship in late 2018 for somewhere between $249 and $299.

Atari says that it will launch the Ataribox on Indiegogo this autumn.

The company said it chose to launch the console in this way because it wants fans to be part of the launch and to have access to early and special editions, as well as to make the Atari community “active partners” in the rollout of Ataribox.

“I was blown away when a 12-year-old knew every single game Atari had published. That’s brand magic. We’re coming in like a startup with a legacy,” said Ataribox creator and general manager Feargal Mac in an interview with VentureBeat.

“We’ve attracted a lot of interest, and AMD showed a lot of interest in supporting us and working with us. With Indiegogo, we also have a strong partnership.”

Images courtesy of Atari

Atari also revealed that its new console will come loaded with “tons of classic Atari retro games”, and the company is also working on developing current titles with a range of studios.

The Ataribox will be powered by an AMD customised processor, with Radeon Graphics technology, and will run Linux, with a customised, easy-to-use user interface.

The company believes this approach will mean that, as well as being a gaming device, the Ataribox will also be able to serve as a complete entertainment unit that delivers a full PC experience for the TV, bringing users streaming, applications, social, browsing and music.

“People are used to the flexibility of a PC, but most connected TV devices have closed systems and content stores,” Mac said. “We wanted to create a killer TV product where people can game, stream and browse with as much freedom as possible, including accessing pre-owned games from other content providers.”

In previous releases, Atari has said that it would make two editions of its new console available: a wood edition and a black and red version.

After being asked by many fans, the company has revealed that the wood edition will be made from real wood.

Atari has asked that fans let it know what they think of the new console via its social channels.

Scientists, software developers and artists have begun using VR to visualise genes and predict disease

A group of scientists, software developers and artists have taken to using virtual reality (VR) technology to visualise complex interactions between genes and their regulatory elements.

The team, which comprises members from Oxford University, Università di Napoli and Goldsmiths, University of London, has been using VR to visualise simulations combining data from genome sequencing, data on the interactions of DNA, and microscopy data.

When all this data is combined, the team is provided with an interactive 3D image that shows where different regions of the genome sit relative to others, and how they interact with each other.

“Being able to visualise such data is important because the human brain is very good at pattern recognition – we tend to think visually,” said Stephen Taylor, head of the Computational Biology Research Group at Oxford’s MRC Weatherall Institute of Molecular Medicine (WIMM).

“It began at a conference back in 2014 when we saw a demonstration by researchers from Goldsmiths who had used software called CSynth to model proteins in three dimensions. We began working with them, feeding in seemingly incomprehensible information derived from our studies of the human alpha globin gene cluster and we were amazed that what we saw on the screen was an instantly recognisable model.”

The team believe that being able to visualise the interactions between genes and their regulatory elements will allow them to understand the basis of human genetic diseases, and are currently applying their techniques to study genetic diseases such as diabetes, cancer and multiple sclerosis.

“Our ultimate aim in this area is to correct the faulty gene or its regulatory elements and be able to re-introduce the corrected cells into a patient’s bone marrow: to perfect this we have to fully understand how genes and their regulatory elements interact with one another,” said Professor Doug Higgs, a principal researcher at the WIMM.

“Having virtual reality tools like this will enable researchers to efficiently combine their data to gain a much broader understanding of how the organisation of the genome affects gene expression, and how mutations and variants affect such interactions.”

There are around 37 trillion cells in the average adult human body, and each cell contains two metres of DNA tightly packed into its nucleus.

While the technology to sequence genomes is well established, it has been shown that the manner in which DNA is folded within each cell affects how genes are expressed.

“There are more than three billion base pairs in the human genome, and a change in just one of these can cause a problem. As a model we’ve been looking at the human alpha globin gene cluster to understand how variants in genes and their regulatory elements may cause human genetic disease,” said Prof Jim Hughes, associate professor of Genome Biology at Oxford University.