Talking with robots: Training future robot chefs with speech

Without detailed knowledge of coding and programming languages, it is almost impossible for the average person to interact with most robots.

It is one of the biggest hurdles that needs to be overcome if our future is to include living with robots.

Thankfully, researchers at Cornell University, US, are teaching robots to understand instructions that are given to them by voice – and are teaching one robot to cook.

The robot scans the room it is in with a 3D camera, and has been trained to associate the objects it sees with what they can do.

For example, it understands that water can be put in a pan and that the pan can be heated up on a stove.

It has a built-in programming language with commands that allow it to find, grasp, carry or fill objects such as a pan.

The researchers have made the robot clever enough to perform the same actions at a later time – even if the pan has been moved in the meantime.
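To make the idea concrete, here is a minimal sketch in Python of how an affordance table and a handful of primitive commands could be combined. The object names, affordances and the plan for boiling water are illustrative assumptions, not the Cornell team's actual code.

```python
# Illustrative sketch only: a tiny affordance table plus a primitive-command
# plan, loosely modelled on the behaviour described above. None of these
# names come from the Cornell system itself.

# What the robot believes each object can be used for.
AFFORDANCES = {
    "pan":   {"graspable", "fillable", "heatable"},
    "water": {"pourable"},
    "stove": {"heats"},
}

def scan_room():
    """Pretend 3D-camera scan: where each object is right now.

    Re-scanning before planning is what lets the same plan still work
    after the pan has been moved."""
    return {"pan": (1.2, 0.4), "water": (0.3, 0.9), "stove": (2.0, 0.0)}

def boil_water():
    locations = scan_room()                  # find objects wherever they are now
    assert "fillable" in AFFORDANCES["pan"]  # check the pan can hold water
    plan = [
        ("grasp", "pan", locations["pan"]),
        ("fill",  "pan", "water"),
        ("carry", "pan", locations["stove"]),
        ("heat",  "pan", "stove"),
    ]
    for step in plan:
        print("executing primitive:", step)

boil_water()
```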

While the work is in its early stages, it shows the promise of a future with robots that can learn for themselves from instructions given in human language.

On the researchers’ website, the three-person team primarily behind the work says it wants a robot to be able to take generic instructions, understand them and act on them.

“In order for robots to perform tasks in real world, they need to be able to understand our natural language commands,” the team wrote on its website.

“While there is a lot of past research that went into the task of language parsing, they often require the instructions to be spelled out in full detail which makes it difficult to use them in real world situations.

“Our goal is to enable a robot to even take an ill-specified instruction as generic as ‘Make a cup of coffee’ and be able to figure out how to fill a cup with milk or use one if it already has milk etc. depending on how the environment looks.”
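One way to picture that kind of environment-dependent planning is the minimal sketch below, which only adds the steps that are still missing given the current scene. The scene flags and step names are invented for illustration and are not the team's actual representation.

```python
# Illustrative sketch: turning an under-specified instruction into a plan
# that depends on what the robot currently sees. The scene flags and step
# names are invented for this example.

def plan_make_coffee(scene):
    plan = []
    if not scene.get("cup_present"):
        plan.append("find_and_grasp_cup")
    if not scene.get("cup_has_milk"):   # skip filling if milk is already there
        plan.append("fill_cup_with_milk")
    plan.append("add_coffee")
    return plan

# A cup that already contains milk needs fewer steps.
print(plan_make_coffee({"cup_present": True, "cup_has_milk": True}))
# -> ['add_coffee']
print(plan_make_coffee({"cup_present": True, "cup_has_milk": False}))
# -> ['fill_cup_with_milk', 'add_coffee']
```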

The team is also collecting data to help with the robot’s development and has created an online simulator that anyone can try out.

Understandably, the robot is not perfect yet: during testing, the researchers found that it performed correctly 64% of the time.

That figure includes cases where the commands were changed and the robot was able to fill in the missing step itself.

If we are to utilise the potential of robots to aid us in everyday tasks such as cooking a meal, they need to be able to understand what we say to them in our native language.

Then, combined with artificial intelligence, they need to learn over time.

Japanese juggernaut SoftBank has recently announced that from 2015 it will be selling Pepper, a robot nanny that can respond to the emotions of humans.

It is mainly being touted as a companion for children, and will report children’s positive emotional responses to their mothers.


Featured image courtesy of Cornell University


Robot nanny to be sold in Japanese stores from 2015

SoftBank, a Japanese technology megacorporation, has announced the launch of an emotionally responsive, human-like robot for home use.

The robot, known as Pepper, has been described by the company as the “world’s first personal robot that reads emotions”, and is designed to target Japan’s home care market, which faces a significant shortfall of workers.

The robot is primarily being pushed as a companion for children, with SoftBank suggesting Pepper could read and interact with children, later reporting the children’s positive emotional responses to their mothers.

At birthdays, Pepper could be found encouraging fun by initiating singing and dancing, a prospect that is sure to add an extra air of cringe to any family gathering.

Other, more serious possible uses for the robot include work as a nurse or emergency medical worker. It could also prove an effective companion for elderly people.

Speaking at a press conference in Tokyo this morning, Masayoshi Son, SoftBank CEO, said: “People describe others as being robots because they have no emotions, no heart. For the first time in human history, we’re giving a robot a heart, emotions.”


The robot is designed to ‘learn’ by recognising positive emotions and adjusting its behaviour autonomously in response.

This learning approach is accelerated through an interconnected cloud AI: the habits and likes of a family that owns a Pepper robot are learnt by the unit and shared with its fellow robots to provide an overall improvement in emotional responsiveness.

Although SoftBank admits that Pepper will initially make mistakes, it believes that over time this cloud AI should result in more empathetic robots that can more accurately read emotions and situations.

With large, round eyes and a build that is highly reminiscent of the NAO robot popular with robotics researchers working with children, Pepper clearly supports the mantra that faces make robots more trustworthy. And given that parents are being encouraged to treat the robot as a babysitter, this is a vital component of its design.


Pepper will be showcased in SoftBank’s Ginza and Omotesando stores in Tokyo from tomorrow as a greeter, so we are likely to hear early feedback about its effectiveness within weeks.

We’ll have to wait longer to learn how it fares in homes, though, as it won’t be available for sale until February 2015.

Once it does go on sale, however, it could prove a runaway success. Pepper will retail at the shockingly reasonable price of ¥198,000 (£1,150/$1,900) plus tax, putting it within reach of typical families, as well as schools and care homes.


Images courtesy of SoftBank.