Games industry veteran developing “wearable second brain” for Alzheimer’s and dementia sufferers

A wearable memory aid tool that forms part of a “web of care” for sufferers of dementia and Alzheimer’s is being developed by games industry veteran Martin Kenwright through his recently founded digital technology company Starship.

Kenwright is best known for 3D real-time strategy game Wargasm and flight simulator game F-22: Air Dominance Fighter, both developed at his first studio Digital Image Design, as well as PlayStation 3 launch title MotorStorm, developed by Evolution Studios.

After selling Evolution to Sony in 2007, Kenwright took time off from the industry to care for family members suffering from dementia.

“I kind of realised there’s something from my business, there’s something that I could do that would not be a magic bullet, but really make a difference,” Kenwright said in an interview with Factor.

“Before I knew it, I’d filed two patent applications, and the idea essentially was wearable memory. It’s a bit of a sweeping statement and I don’t say it lightly, but it was almost to create the world’s first Alzheimer’s and dementia memory tool,” he explained.


Kenwright was keen not to reveal too many details of the project, known as Forget Me Not, which is still under development, but explained that this would be a wearable product – a “wearable second brain” – that would help the memory-afflicted.

“We could create a technology that could allow people to remember all kinds of objects in the world in-situ, in a way that is completely natural. The idea is of them in a moment of panic being able to recall what they were looking for,” he said.

“It wasn’t that we’ve created some big wearable VR helmet thing, but a proven, de-risked, patented, game-changing proposition – game technology.”

Kenwright believes that this product, although conceptually strange now, will become a standard wearable for people with memory problems in the future.

“I do generally believe that memory aids will become as common as hearing aids in 5 to 10 years,” he said.


Starship has taken its time developing the technology, in part to allow chips to become more energy-efficient – a key requirement for the device.

“I can’t really go into hardware and software, but hardware companies have seen it and think it’s brilliant,” he explained. “A lot of people have been coming up with things like Google Glass that are very Orwellian. Finally someone’s looking at something of genuine use and value and need in this sector that can create a win-win: profit and salvation in one, as it were.”

However, Kenwright is keen to stress that this isn’t some miracle solution, and will have limitations.

“What we’ve learned with people with a lot of these afflictions is that if you don’t know how to use some of these devices before you are diagnosed, you’ll never be able to learn again,” he explained.

“I think when things do become ubiquitous like Glass and wearables, in just the same way smartphones and smart devices have, it will be part of the furniture and hence become as common as hearing aids. Memory aids will be one and the same.”

Kenwright sees huge potential for wearables as an aid for Alzheimer’s patients.

“You think Alzheimer’s, memory issues; it’s like the biggest state of concern in the US,” he said. “33 million a year afflicted people, $40bn a year being spent on all these wonder drugs and you’ve kind of got to think, does it all have to be about drugs? Can’t we create tools and devices?”


Body image 1 courtesy of Starship Group.


Talking with robots: Training future robot chefs with speech

Without a detailed knowledge of programming languages, it is almost impossible for the average person to interact with most robots.

It is one of the biggest hurdles that needs to be overcome if our future is to include living with robots.

Thankfully, researchers at Cornell University, US, are teaching robots to understand instructions that are given to them by voice – and are teaching one robot to cook.

The robot is able to scan the room it is in with a 3D camera, and has been trained to associate the objects it sees with what they can do.

For example, it understands that water can be put in a pan and that the pan can be heated up on a stove.

It has a built-in programming language that includes commands allowing it to find, grasp, carry or fill objects such as a pan.

The researchers have made the robot clever enough to be able to perform the same actions at a different time – even if the pan has been moved.
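
To make that concrete, the behaviour described above amounts to grounding a spoken instruction in whatever objects the 3D camera currently sees, then emitting a sequence of the robot’s primitive commands. The Python sketch below is purely illustrative – the primitive names, affordance labels and scene model are invented here, not the Cornell team’s actual software:

from dataclasses import dataclass

@dataclass
class SceneObject:
    name: str
    affordances: set   # what the robot has learned this object can do
    location: tuple    # (x, y, z) from the 3D camera, re-scanned each run

def plan_heat_water(scene):
    """Ground the instruction 'heat some water' against whatever the
    camera currently sees, rather than against fixed coordinates."""
    pan = next(o for o in scene if "hold-liquid" in o.affordances and "heatable" in o.affordances)
    stove = next(o for o in scene if "heat-source" in o.affordances)
    tap = next(o for o in scene if "dispense-water" in o.affordances)
    # The plan is a sequence of the robot's built-in primitive commands.
    return [
        ("find", pan.location),            # locations are looked up at plan time,
        ("grasp", pan.name),               # so the same plan works if the pan has moved
        ("carry", pan.name, tap.location),
        ("fill", pan.name, "water"),
        ("carry", pan.name, stove.location),
        ("heat", stove.name),
    ]

# Example: the pan can be anywhere the camera happens to find it.
scene = [
    SceneObject("pan", {"hold-liquid", "heatable", "graspable"}, (0.4, 1.2, 0.9)),
    SceneObject("stove", {"heat-source"}, (1.0, 0.2, 0.9)),
    SceneObject("sink", {"dispense-water"}, (2.1, 0.5, 0.9)),
]
for step in plan_heat_water(scene):
    print(step)

Because the plan is built from affordances and freshly scanned locations rather than hard-coded positions, moving the pan simply changes the coordinates that are filled in, not the plan itself.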

While the work is in its early stages, it shows promise for a future of robots that can learn for themselves in a human language.

On the researchers’ website, the three team members primarily behind the work say they want a robot to be able to take generic instructions, understand them and act on them.

“In order for robots to perform tasks in real world, they need to be able to understand our natural language commands,” the team wrote on its website.

“While there is a lot of past research that went into the task of language parsing, they often require the instructions to be spelled out in full detail which makes it difficult to use them in real world situations.

“Our goal is to enable a robot to even take an ill-specified instruction as generic as ‘Make a cup of coffee’ and be able to figure out how to fill a cup with milk or use one if it already has milk etc. depending on how the environment looks.”
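
The coffee example describes planning that branches on the observed state of the environment. A minimal, hypothetical sketch of that idea (the has_milk flag and step names are invented for illustration, not taken from the team’s code):

def plan_coffee(cups):
    """Choose a different primitive sequence depending on what the
    camera reports about the cups in the scene (illustrative only)."""
    cup = cups[0]
    steps = []
    if not cup.get("has_milk"):                    # state observed from the 3D scan
        steps.append(("fill", cup["name"], "milk"))
    steps.append(("carry", cup["name"], "coffee-machine"))
    steps.append(("brew", cup["name"]))
    return steps

print(plan_coffee([{"name": "cup1", "has_milk": True}]))   # skips the fill step
print(plan_coffee([{"name": "cup2", "has_milk": False}]))  # adds the fill step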

The team are also collecting data to help with the robot’s development and have created an online simulator that anyone can try out.

Understandably, the robot is not yet perfect: during testing the researchers found that it performed correctly 64% of the time.

This included cases where the commands were changed and the robot had to fill in the missing step.

For us to make use of robots in everyday tasks such as cooking a meal, they need to understand what we say to them in our native language and then, combined with artificial intelligence, learn over time.

Japanese juggernaut SoftBank recently announced that from 2015 it will sell Pepper, a robot nanny that can respond to the emotions of humans.

It is mainly being touted as a companion for children, and will report children’s positive emotional responses to their mothers.


Featured image courtesy of Cornell University