All posts by Lucy Ingham

Driverless Cars: Could Vehicle Ownership be Relegated to History?

Driverless cars could eventually result in the death of car ownership, according to Phil Williams, project manager of the Technology Strategy Board's Robotics and Autonomous Systems special interest group.

Williams, who was speaking at RE.WORK’s AI & Robotics Innovation Forum, said: “The likelihood is that we will see a reduction of car ownership”.

He said that transport was likely to move towards being a service. “Today it’s called a taxi,” he added. “In 10 or 15 years’ time it might be an easy car.”

This change would be likely to come about because payment-per-trip would increasingly become the most affordable option. Williams highlighted how insurance, MOT and road tax are already proving too expensive for some, and suggested that as driverless cars became mainstream they would make the cost of individual journeys much cheaper.

The percentage of young people learning to drive in developed countries has been declining over the past few years, a trend that experts attribute to the rising costs of car ownership and driving lessons, as well as the growth of online activities.


Hugo Elias, senior engineer at Shadow Robot Company, also identified a likely move towards driverless taxis and away from ownership.

“By 2020 every car manufactured in the past 10 years will be driverless; 10 years after that perhaps all cars will be driverless,” he said.

“Perhaps at some point in the future almost nobody will own their own cars.”

He argued that this could result in fewer cars in operation. Unlike today, when most cars spend a large percentage of their service life sitting in garages or driveways, driverless taxis could run almost all of the time, meaning fewer vehicles would be needed to serve the same number of people.

This, Elias believes, could have an impact on the design of cities. There would be a move away from car-centric cities such as Los Angeles, and a rise in smaller cities built to accommodate pedestrians and bikes, such as Amsterdam, the Netherlands.


However, Paul Newman, BP professor of information engineering at the University of Oxford, was keen to stress that fully driverless cars that would operate completely autonomously were a long way from being a reality.

“This is a technology that’s going to blend over time. It’s not going to be a step change,” he said.

Newman, who is involved in the development of the first road-legal driverless car in the UK, argued that the technology under development at present is “hands-free driving” that still requires drivers to be alert and ready to take control.

“Insurance will disable the car if you sleep in it,” he said.

He did concede that truly driverless technology could eventually be possible, but argued that this was a very long time away. Newman said: “Maybe many, many, many, many years down the line you may not be facing forwards.”

Images courtesy of Mike and Maaike.

Why Robots Need Faces: the Rise of Service Robotics

Roboticists have been putting faces on robots for a long time, but it’s only recently that scientists have started to understand how beneficial this is, according to Plymouth University professor of cognitive systems and robotics Tony Belpaeme.

Speaking at the RE.WORK AI & Robotics Innovation Forum in London today, Belpaeme explained how faces on robots are not just cute, but actually serve an important function.

People connect with robots far more easily if they can anthropomorphise them, which is going to become particularly important as robots enter more and more service roles.

“Faces are necessary [for robots in service roles],” said Belpaeme. “Without them, they are seen as less positive and less trustworthy.”


At present the majority of robots are in roles that do not require human interaction, such as manufacturing. However Belpaeme sees significant potential for robots in roles where interaction with humans is paramount.

One area where robots could make a real difference is education.

The benefits of one-to-one tutoring are significant: Belpaeme referenced Bloom’s 2 sigma problem – the observation that children who are privately tutored perform two standard deviations better than those taught only in classroom sessions, which is equivalent to performing above 98% of children taught conventionally.

But although this is well known, it has never been acted on because of the costs involved: private tutoring is simply too expensive to provide on a large scale. Robots, however, could bring tutoring to all children at an affordable cost.

Undoubtedly some sceptics will question the effectiveness of robot tutors, and there is research going on at present to assess this. One such project is the EU-based Emote project, which is researching the use of empathy-based robotic tutors (pictured above).

However, there is already some evidence of the benefits of robots as tutors. Belpaeme cited a study that compared three forms of tutoring: onscreen learning on its own, tutoring from an onscreen robot, and tutoring from a physical robot. The physical robot was found to be much more effective than the onscreen robot, suggesting that a suitably friendly-feeling robot could provide an acceptable alternative to a real-life tutor.


Education isn’t the only area where robots with faces can play a role. Baxter, a manufacturing and factory robot made by Rethink Robotics (pictured at the top), has been given a face to make it intuitive to use, with a variety of expressions that make its operation self-explanatory.

Elsewhere, anthropomorphic robots have been used for health purposes. They show significant promise in “compliance” roles, such as encouraging patients to stick to specific diets, and have been found to be beneficial for autistic children.

A particularly remarkable example that Belpaeme discussed was the case of an eight-year-old boy undergoing rehabilitation after a severe stroke. Having shown very limited response to therapy, the boy was introduced to a NAO robot (the same type used by the Emote project), which practitioners used to conduct his physiotherapy.

The NAO robot demonstrated the physical exercises that the boy emulated, and just six days later he had recovered enough to be discharged from hospital.

Featured image courtesy of Rethink Robotics.

First body image courtesy of Emote.

Second body image courtesy of Aldebaran Robotics.