Without detailed knowledge of coding and programming languages, it is almost impossible for the average person to interact with most robots.
It is one of the biggest hurdles to overcome if our future is to include living with robots.
Thankfully, researchers at Cornell University, US, are teaching robots to understand instructions that are given to them by voice – and are teaching one robot to cook.
The robot is able to scan the room it is in with a 3D camera, and has been trained to associate the objects it sees with what they can do.
For example, it understands that water can be put in a pan and that the pan can be heated up on a stove.
It has a built-in programming language that includes commands allowing it to find, grasp, carry or fill objects such as a pan.
The researchers have made the robot clever enough to be able to perform the same actions at a different time – even if the pan has been moved.
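The steps above describe grounding a spoken instruction in a set of primitive commands and object affordances. As a rough illustration only, the sketch below shows one way such grounding could work in Python. The primitives (find, grasp, fill, carry, heat) echo those named in the article, but the data structures, function names and the fixed plan for "boil water" are hypothetical assumptions, not Cornell's actual system.

```python
# Illustrative sketch: expanding a high-level instruction into primitive
# robot commands, checked against object affordances. All names here are
# hypothetical; the real Cornell system learns these mappings from data.

PRIMITIVES = {"find", "grasp", "carry", "fill", "heat"}

# Affordances: what the robot has learned each recognised object can do.
# E.g. it understands water can go in a pan, and a pan can be heated.
AFFORDANCES = {
    "pan": {"grasp", "carry", "fill", "heat"},
    "stove": {"heat"},
}

def plan(instruction):
    """Expand a high-level instruction into a sequence of (verb, object) steps."""
    if instruction == "boil water":
        steps = [("find", "pan"), ("grasp", "pan"),
                 ("fill", "pan"), ("carry", "pan"), ("heat", "pan")]
    else:
        raise ValueError(f"unknown instruction: {instruction}")
    # Validate each step: the verb must be a known primitive, and the
    # object must afford it ("find" is allowed for any visible object).
    for verb, obj in steps:
        allowed = AFFORDANCES.get(obj, set()) | {"find"}
        if verb not in PRIMITIVES or verb not in allowed:
            raise RuntimeError(f"{obj} does not afford {verb}")
    return steps

print(plan("boil water"))
```

Because the plan refers to objects by role rather than position, re-running it after the pan has moved would only require the `find` step to relocate it, which matches the behaviour the researchers describe.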
While the work is in its early stages, it shows the promise of a future in which robots can learn for themselves from human language.
On their website, the three researchers primarily behind the work say they want to make a robot that can take generic instructions and both understand and act on them.
“In order for robots to perform tasks in real world, they need to be able to understand our natural language commands,” the team wrote on its website.
“While there is a lot of past research that went into the task of language parsing, they often require the instructions to be spelled out in full detail which makes it difficult to use them in real world situations.
“Our goal is to enable a robots to even take an ill-specified instruction as generic as ‘Make a cup of coffee’ and be able to figure out how to fill a cup with milk or use one if it already has milk etc. depending on how the environment looks.”
The team is also collecting data to help with the robot’s development, and has created an online simulator that anyone can try out.
Understandably, the robot is not yet perfect: during testing the researchers found that it performed correctly 64% of the time.
That figure includes cases where the commands were changed and the robot had to fill in a missing step.
For us to utilise the potential of robots to aid us in everyday tasks such as cooking a meal, they need to understand what we say to them in our native language and, combined with artificial intelligence, learn over time.
Japanese juggernaut SoftBank recently announced that from 2015 it will sell a robot nanny called Pepper, which can respond to human emotions.
It is mainly being touted as a companion for children, and will report children’s positive emotional responses to their mothers.
Featured image courtesy of Cornell University