Video: “Robots won’t replace human jobs”

The next wave of robotics to enter mass use will be service robots working autonomously with or for humans, according to Nick Hawes, senior lecturer in robotics at the University of Birmingham.

Speaking at last week’s AI & Robotics Innovation Forum in London, UK, Hawes outlined the benefits for service industries such as care, where simple tasks such as cleaning and monitoring could be undertaken by robots.

This would free up humans to perform more complex roles and give them more time to attend to the needs of their charges, something that would be very welcome in an industry that is under pressure from tight budgets and an ageing population.

For Hawes, roboticists need to consider which markets existing robotics technology will have the biggest impact in, so that they can develop technologies that can be used in real-world situations.

However, he highlighted the need to ensure the acceptability of robots in working environments: people must not see robots as replacing them, but as helpers that take on the most mundane tasks and free up people’s time for more complex and subjective work.

One of the biggest challenges in making mobile, autonomous robots is enabling them to respond safely and effectively to the wide range of environments and situations found in human spaces.

“I see enabling robust and reliable autonomy in human environments as a key enabler for mobile robots,” Hawes said.

In his talk at the forum, Hawes outlined the three ingredients needed in an autonomous system: perception, decision making and action.

Perception is the area in which robotics has achieved the most, with technologies such as Kinect making the jump to consumer use. However, decision making – how the robot chooses its next move – and action – how the robot affects the world around it – still have some way to go.
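Hawes did not present any implementation details, but the three ingredients he names map onto the classic sense–decide–act control loop. As a loose illustration only – the sensor fields, thresholds, and action names below are all invented for this sketch – the division of labour might look like:

```python
from dataclasses import dataclass

@dataclass
class Observation:
    """A structured world model produced by perception."""
    obstacle_ahead: bool
    battery_low: bool

def perceive(sensor_reading: dict) -> Observation:
    # Perception: turn raw sensor values into a structured model.
    return Observation(
        obstacle_ahead=sensor_reading.get("range_cm", 999) < 50,
        battery_low=sensor_reading.get("battery_pct", 100) < 15,
    )

def decide(obs: Observation) -> str:
    # Decision making: choose the next action from the current model.
    if obs.battery_low:
        return "return_to_dock"
    if obs.obstacle_ahead:
        return "turn"
    return "move_forward"

def act(action: str) -> str:
    # Action: here we only report the choice; a real robot
    # would drive motors or actuators at this step.
    return f"executing: {action}"

# One iteration of the loop: sense, decide, act.
result = act(decide(perceive({"range_cm": 30, "battery_pct": 80})))
```

The point of the separation is that each stage can be improved independently – which is why, as Hawes notes, perception can be relatively mature while decision making and action lag behind.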


Hawes is currently working on a project with security megafirm G4S to create night watch robots.

Called STRANDS, the project aims to teach robots the normal patterns of daily life in an office environment to detect variations in behaviour that may indicate a security issue.

At present the trial robot, affectionately known as Bob, is being taught daily patterns by continually patrolling set spaces at different times of the day.
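The article does not describe how STRANDS actually models these patterns, but the general idea – learn what is normal for each place and time, then flag deviations – can be sketched with a toy frequency model. Everything here (the class, its thresholds, the location names) is hypothetical and purely illustrative:

```python
from collections import defaultdict

class PatrolModel:
    """Toy model of 'normal' activity: tracks how often each
    (location, hour) pair is observed as occupied during patrols."""

    def __init__(self):
        self.occupied_counts = defaultdict(int)
        self.patrol_counts = defaultdict(int)

    def observe(self, location: str, hour: int, occupied: bool):
        # Called once per patrol visit to build up the baseline.
        self.patrol_counts[(location, hour)] += 1
        if occupied:
            self.occupied_counts[(location, hour)] += 1

    def occupancy_rate(self, location: str, hour: int) -> float:
        n = self.patrol_counts[(location, hour)]
        return self.occupied_counts[(location, hour)] / n if n else 0.0

    def is_anomalous(self, location: str, hour: int, occupied: bool,
                     threshold: float = 0.1) -> bool:
        # Flag occupancy somewhere that is almost never occupied
        # at this hour -- e.g. movement in an office at 3am.
        return occupied and self.occupancy_rate(location, hour) < threshold
```

After enough patrols showing an office empty overnight, a single occupied observation at that hour would be flagged as a potential security issue.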

Although considerable work is still needed to teach Bob how to respond to environments that a human would have no problem with, he could lead to a robot that can spot security issues or behavioural shifts that a human might have missed.

Additional robot footage courtesy of IPA320 and fccysf.

Why Robots Need Faces: the Rise of Service Robotics

Roboticists have been putting faces on robots for a long time, but it’s only recently that scientists have started to understand how beneficial this is, according to Plymouth University professor of cognitive systems and robotics Tony Belpaeme.

Speaking at the RE.WORK AI & Robotics Innovation Forum in London today, Belpaeme explained how faces on robots are not just cute, but actually serve an important function.

People connect with robots far more easily if they can anthropomorphise them, which is going to become particularly important as robots enter more and more service roles.

“Faces are necessary [for robots in service roles],” said Belpaeme. “Without them, they are seen as less positive and less trustworthy.”


At present the majority of robots are in roles that do not require human interaction, such as manufacturing. However, Belpaeme sees significant potential for robots in roles where interaction with humans is paramount.

One area where robots could make a real difference is education.

The benefits of one-to-one tutoring are significant: Belpaeme referenced Bloom’s 2 sigma problem – the observation that children who are privately tutored perform two standard deviations better than those taught only in classroom sessions, which is equivalent to outperforming 98% of conventionally taught children.
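The 98% figure follows directly from the standard normal distribution: a score two standard deviations above the mean sits at roughly the 97.7th percentile, which can be checked with nothing more than the error function:

```python
from math import erf, sqrt

def normal_cdf(z: float) -> float:
    # Standard normal CDF expressed via the error function.
    return 0.5 * (1 + erf(z / sqrt(2)))

# A score two standard deviations above the mean beats roughly
# 97.7% of the population -- the "98%" in Bloom's 2 sigma result.
percentile = normal_cdf(2.0) * 100
```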

Although this effect is well known, it has never been acted on because of the costs involved: private tutoring is simply too expensive to provide on a large scale. Robots, however, could bring tutoring to all children at an affordable cost.

Undoubtedly some sceptics will question the effectiveness of robot tutors, and there is research going on at present to assess this. One such project is the EU-based Emote project, which is researching the use of empathy-based robotic tutors (pictured above).

However, there is already some evidence to show the benefits of robots as tutors. Belpaeme cited a study where onscreen learning was assessed and compared on its own, with an onscreen robot and with a physical robot providing tutoring. The physical robot was found to be much better than the onscreen robot, suggesting that a suitably friendly-feeling robot could provide an acceptable alternative to a real-life tutor.


Education isn’t the only area where robots with faces can play a role. Baxter, a manufacturing and factory robot made by Rethink Robotics (pictured at the top), has been given a face to make it intuitive to use, with a variety of expressions that make its operation self-explanatory.

Elsewhere anthropomorphic robots have been used for health purposes. They show significant promise in “compliance” roles, such as encouraging patients to stick to specific diets, and have been found to be beneficial for autistic children.

A particularly remarkable example that Belpaeme discussed was the case of an eight-year-old boy undergoing rehabilitation after a severe stroke. Having shown very limited response to therapy, the boy was introduced to a NAO robot (the same type used by the Emote project), which practitioners used to conduct his physiotherapy.

The NAO robot demonstrated the physical exercises that the boy emulated, and just six days later he had recovered enough to be discharged from hospital.

Featured image courtesy of Rethink Robotics.

First body image courtesy of Emote.

Second body image courtesy of Aldebaran Robotics.