In-ear computing: The hearables heralding the dawn of a screen-free future

Hearables have been touted as the next big thing for wearables for some time, but will they really have a meaningful impact on our lives? We hear from Bragi CMO Jarrod Jordan about how the technology could transform the way we communicate

For the past few decades, computing has advanced at an incredible pace. Within a single generation we’ve gone from cumbersome desktops to devices that are effectively pocket-sized supercomputers, so it comes as no surprise that technology manufacturers and consumers alike are hungry for the next step.

For many this step is known as ‘the fourth platform’: something that takes us beyond our current screen-based lives, and pushes computing and communication seamlessly into the background.

And while we’re not quite there yet, hearables may very well be that something.

“The hearable is actually the precipice of the very beginning of the start of fourth-platform computing, where people can put a device inside of their ear, they can actually become more productive, they can move more places and they can do more things,” explains Jarrod Jordan, CMO of Bragi, speaking at the Wearable Technology Show.

“This isn’t just happening five years from now; we are talking about 18 months from now, and it’s starting to become more and more prominent. People are starting to look at this in the same way they did when, for example, the iPhone first came out.”

Bragi is arguably in a very good place for this oncoming breakthrough. The Germany-based company is behind one of the first true hearables, the Dash, which has developed from a 2014 Kickstarter success story into a fully fledged product now available in stores. And with an SDK (software development kit) on its way to allow third parties to develop apps for the device, it has all the makings of a truly useful platform.

Beyond the tethered smartphone

Wearable technology has long been touted as the game-changing space from which the next generation of computing will come, but so far much of what’s been developed has failed to live up to that claim. Most devices remain entirely reliant on smartphones, making them peripherals to existing devices rather than technologies that truly push things forward in their own right.

Images courtesy of Bragi

Which raises the question: what does a true fourth-platform device need to offer?

“A few things need to happen in order for that fourth platform to exist, or to have that device or item exist on the fourth platform,” Jordan explains. “It has to make the user more integrated into real-world scenarios; it has to make the user more productive; and it has to be automated to help them with predictive behaviours – in other words, it has to start doing things on behalf of the user without the user having to think about it.”

For some, virtual reality could be that platform, but Jordan argues that it so far fails to achieve these goals.

“As much as I love it as a form of entertainment, the idea that you have an integration with the real world, or that you can become automated or more productive with a device currently over your head [is wrong],” he says. “[VR] actually brings you out of the world and distracts you from what’s happening around you.”

Another option is voice-enabled devices such as Amazon Echo, which are arguably much closer to being true fourth-platform devices, but fall short in that they are typically in fixed locations with little ability to gather data about their users.

“What’s great about this is it does do a lot of the things I just mentioned: you can actually walk in the room and get things ordered, you can have things turn on or turn off etc,” Jordan says. “But there’s a couple of things: it doesn’t actually integrate with you as a human, it doesn’t understand what your body or your biometrics are telling it and it can go with you but it doesn’t travel with you per se.”

The logical step for some, then, is implanted computers. They’re always there; they can gather data and provide unseen feedback and assistance; and they don’t need to rely on a smartphone. But they come with a rather significant problem: how many of us are really up for having tech surgically implanted inside us?

“To a lot of people that bothers them; it even bothers me,” says Jordan. “I don’t necessarily want a device inside of me, but I do need a device that can somehow get inside of me, grab different parts of my biometrics and help me become more productive or more active.”

When does a headphone become a hearable?

For Jordan, true fourth-platform devices will combine the best of these nearly-there technologies into something consumers will actually want to use.

“The way I look at it, there are three ways that these things need to come together to make that fourth platform,” he says. “It needs to be embedded yet detachable: I think if it’s inside of you then that’s a problem, I just don’t think adoption of that by the masses is really there.

“It needs to leverage multiple sensors, so it’s not only voice, it’s not only eyes, it’s not only touch: it’s taking in several different components of your body and being able to give output from that. It needs to be able to augment your behaviour and predict your behaviour as well.”

Hearables, he argues, are that device, although he is keen to stress that not everything you can put in your ears is a true hearable.

“It is not as simple as a truly wireless in-ear device. Many of them are excellent: wonderful sound, fun to use, etc, but they are not a computer,” he explains.

“If you cannot update your device with firmware, just like you get with your iPhone through those OS updates, if you cannot do that with your hearable it is not by definition a hearable. It is a headphone and it may be excellent, it may be fun to use, but not exactly a hearable.

“The second thing is it must be intelligent; the device must be able to pick up what you are doing and give you a feedback loop in order to make you more productive.”

Bragi Dash: in-ear computers

Whether Bragi’s own device, the Dash, fulfils these needs will ultimately be decided by its users, but it makes a compelling case: while the Dash looks like just a regular set of wireless earbuds, it is in fact a complete computer in two parts, housed entirely inside the minimal casing.

“We did not go out to build headphones. We actually went out to build in-ear computers: a binary computer, with a left and right side, allowing us to make even more datasets and even more predictions,” says Jordan.

“In building the device we were challenged – first of all we had nanotechnology: how do we push all of these things into very, very little space? We put 27 sensors inside of our device; we put infrared, we put accelerometers, a gyroscope, a 32-bit processor and a 4GB hard drive, all in a thing the size of a dime that sits inside your ear.”

And that means the Dash can do pretty much all the things you’d expect from conventional wearable technology, without needing to hook up to a phone or plant sensors across your body.

“We are tracking heart rate, respiration, acceleration, temperature, rotation, absolute direction: all of these types of things can be gathered through the device. All of those things can also be put into datasets,” he says.
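
To picture what one of those datasets might contain, here is a minimal Python sketch of a single reading from an in-ear device. The field names, units and aggregation helper are illustrative assumptions, not Bragi’s actual data format:

```python
from dataclasses import dataclass

@dataclass
class BiometricSample:
    """One hypothetical reading from an in-ear device.
    Field names and units are illustrative assumptions."""
    timestamp: float         # seconds since epoch
    heart_rate_bpm: float    # beats per minute
    respiration_rate: float  # breaths per minute
    acceleration_g: tuple    # (x, y, z) acceleration in g
    temperature_c: float     # in-ear temperature, Celsius
    heading_deg: float       # absolute direction, 0-360 degrees

def average_heart_rate(samples: list[BiometricSample]) -> float:
    """Aggregate a stream of readings into one dataset statistic."""
    return sum(s.heart_rate_bpm for s in samples) / len(samples)
```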

“You have a headset that functions similarly to how you make normal telephone calls. You have an earphone microphone – that means the microphone is actually inside your ear, not outside. You have noise cancellation with audio transparency: that means that you can hear the world around you as well as what’s in your device, so you’re actually able to have an augmented situation there. And speech control via the ambient microphone: again, those are things that allow you to sit there and make things more productive.”

The Dash also solves the interaction problem – in-ear wearables usually rely on smartphones for control – with a mixture of gestures and voice commands.

“Right now on our device you can actually nod your head and answer a call. You can say no and reject that call. You can shake your head three times and make music shuffle; you can double-tap your face and say ‘tell me how to get home’ and the device will tell you how to get home,” Jordan explains.
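
Bragi’s SDK was still on its way at the time of writing, so as a rough sketch of how a third-party app might map such gestures to actions, here is a hypothetical Python handler. Every name in it – the Gesture values, the handler itself – is an illustrative stand-in, not Bragi’s actual API:

```python
from enum import Enum, auto

class Gesture(Enum):
    # Hypothetical gesture events, not Bragi's actual SDK.
    NOD = auto()           # nod the head
    HEAD_SHAKE = auto()    # shake the head
    TRIPLE_SHAKE = auto()  # shake the head three times
    DOUBLE_TAP = auto()    # double-tap the cheek

def handle_gesture(gesture: Gesture, call_incoming: bool) -> str:
    """Map a detected gesture to an action, mirroring the
    behaviours Jordan describes."""
    if call_incoming and gesture is Gesture.NOD:
        return "answer_call"
    if call_incoming and gesture is Gesture.HEAD_SHAKE:
        return "reject_call"
    if gesture is Gesture.TRIPLE_SHAKE:
        return "shuffle_music"
    if gesture is Gesture.DOUBLE_TAP:
        return "listen_for_voice_command"
    return "ignore"

# Example: an incoming call is answered with a nod.
assert handle_gesture(Gesture.NOD, call_incoming=True) == "answer_call"
```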

But that’s not all Bragi has planned for the device. The company is already working with IBM Watson and several automotive companies to build on the Dash’s capabilities, and hopes to be able to utilise the data collected to significantly advance how the device can help you in your day-to-day life.

“We are collecting biometric data: we know your red and white blood cell counts; we know the difference between your scared heart rate, your nervous heart rate and your exercise-induced heart rate,” he says. “We can see the difference between all of those, so we can actually look to a world where we can start to build apps on top of those behaviours to allow you to then become more productive based on exactly what’s happening with your body, as well as starting to predict what may happen to you in the future.”
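
As a crude illustration of how such contexts could be told apart, an app might combine heart rate with motion: an elevated heart rate while moving suggests exercise, while an elevated heart rate at rest suggests stress or fear. This is a naive assumed heuristic, not how Bragi’s classification actually works:

```python
def classify_heart_rate_context(heart_rate_bpm: float,
                                movement_g: float,
                                resting_bpm: float = 60.0) -> str:
    """Naive heuristic with assumed thresholds: a high heart rate
    plus strong movement reads as exercise; a high heart rate while
    still reads as stress or fear."""
    if heart_rate_bpm < resting_bpm * 1.3:
        return "resting"
    if movement_g > 1.2:  # sustained motion beyond gravity alone
        return "exercise"
    return "stressed_or_scared"

# Example: 150 bpm while barely moving is flagged as stress.
print(classify_heart_rate_context(150.0, movement_g=1.0))
```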

A post-screen world

The true promise of hearables lies in their ability to interact with the increasingly connected world around us, and to remove many of the now-ubiquitous screens that carve up our lives. Computers would remain ever-present, but in a manner that is less intrusive and more able to respond to our needs without us having to tell them to.

“By integrating with you and into the Internet of Things, think about all those gestures, think about all that biofeedback you’re getting, and imagine being able to control the devices around you,” Jordan enthuses. “So you yourself become the trackpad.

“Imagine being able to walk into a room and simply control the devices. Let’s say you walk home and you just lift your arms up and the lights turn on, a double snap and a little Barry White starts playing.

“Your temperature is high, you’re on your way home and all of a sudden that air conditioner at home knows to turn on. You can do things that are very different by having the computer integrated into you.”
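
A toy version of the kind of predictive rule Jordan sketches might look like this; the thresholds and the smart-home command are made-up assumptions rather than any real product API:

```python
ELEVATED_TEMP_C = 37.8  # assumed threshold for "running hot"
HOME_RADIUS_KM = 0.5    # assumed "almost home" distance

def should_precool_home(body_temp_c: float,
                        distance_to_home_km: float,
                        heading_home: bool) -> bool:
    """Decide from biometrics plus context whether to switch the
    air conditioner on before the wearer arrives."""
    return (body_temp_c >= ELEVATED_TEMP_C
            and heading_home
            and distance_to_home_km <= HOME_RADIUS_KM)

# Example: a warm wearer 400m from home triggers the air conditioner.
if should_precool_home(38.1, 0.4, heading_home=True):
    print("turn_on_air_conditioner")  # stand-in for a smart-home command
```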

Adding stem cells to the brains of mice “slowed or reversed” ageing

Albert Einstein College of Medicine scientists “slowed or reversed” ageing in mice by injecting stem cells into their brains.

The study, published online in the journal Nature, saw the scientists implant stem cells into the hypothalamus of mice, causing molecules called microRNAs (miRNAs) to be released.

The miRNA molecules were then extracted from the hypothalamic stem cells and injected into the cerebrospinal fluid of two groups of mice: middle-aged mice whose hypothalamic stem cells had been destroyed and normal middle-aged mice.

This treatment significantly slowed ageing in both groups of animals, as measured by tissue analysis and behavioural testing that assessed changes in the animals’ muscle endurance, coordination, social behaviour and cognitive ability.

“Our research shows that the number of hypothalamic neural stem cells naturally declines over the life of the animal, and this decline accelerates ageing,” said senior author Dongsheng Cai, M.D., Ph.D., professor of molecular pharmacology at Einstein.

“But we also found that the effects of this loss are not irreversible. By replenishing these stem cells or the molecules they produce, it’s possible to slow and even reverse various aspects of ageing throughout the body.”

To reach the conclusion that stem cells in the hypothalamus held the key to ageing, the scientists first looked at the fate of cells in the hypothalamus as healthy mice got older.

The number of hypothalamic stem cells began to diminish when the mice reached about 10 months, which is several months before the usual signs of ageing start appearing. “By old age – about two years of age in mice – most of those cells were gone,” said Dr. Cai.

Images courtesy of the Mayo Clinic.

The researchers next wanted to learn whether this progressive loss of stem cells was actually causing ageing, rather than just being associated with it.

To do this, the scientists observed what happened when they selectively disrupted the hypothalamic stem cells in middle-aged mice.

“This disruption greatly accelerated ageing compared with control mice, and those animals with disrupted stem cells died earlier than normal,” said Dr. Cai.

Finally, to work out whether adding stem cells to the hypothalamus counteracted ageing, the scientists injected hypothalamic stem cells into the brains of middle-aged mice whose stem cells had been destroyed, as well as into the brains of normal old mice.

In both groups of animals, the treatment slowed or reversed various measures of ageing.

The scientists are now trying to identify the particular populations of microRNAs responsible for the anti-ageing effects seen in mice – perhaps the first step toward slowing the ageing process and successfully treating age-related diseases in humans.

Self-driving delivery cars coming to UK roads by 2018

A driverless vehicle designed to deliver goods to UK homes is set to take to the road next year after the successful conclusion of an equity crowdfunding campaign.

Developed by engineers at the Aberystwyth University-based startup the Academy of Robotics, the vehicle, Kar-Go, is road-legal and capable of driving without human intervention on roads that lack specific markings.

Kar-Go has successfully raised £321,000 through Crowdcube – 107% of its goal – meaning the company now has the funds to build its first commercially ready vehicles. This amount will also, according to William Sachiti, Academy of Robotics founder and CEO, be matched by “one of the largest tech companies” in the world.

Images courtesy of Academy of Robotics

The Academy of Robotics has already built and tested a prototype version of Kar-Go, and is working with UK car manufacturer Pilgrim to produce the fully street-legal version.

The duo has already gained legal approval from the UK government’s Centre for Connected and Autonomous Vehicles, meaning the cars will be able to operate on UK roads as soon as they are built.

The aim of Kar-Go is to partner with suppliers of everyday consumer goods to significantly reduce the cost of deliveries, and the company’s goal in this area is ambitious: Sachiti believes Kar-Go could reduce delivery costs by as much as 98%.

Whether companies go for the offering remains to be seen, but the Academy of Robotics says it is in early-stage discussions with several of the largest fast-moving consumer goods companies in Europe, which would likely include the corporations behind some of the most recognisable brands found in UK supermarkets.

While some will be sceptical, Sachiti is keen to drive the company to success, and he already has an impressive track record in future-focused business development. He previously founded Clever Bins – the solar-powered digital advertising bins found in many of the nation’s cities – and the digital concierge service MyCityVenue, now part of SecretEscapes.

“As a CEO, it is one of my primary duties to make sure Kar-go remains a fantastic investment; this can only be achieved by our team producing spectacular results. We can’t wait to show the world what we produce,” he said.

“We have a stellar team who are excited to have begun working on what we believe will probably be the best autonomous delivery vehicle in the world. For instance, our multi-award-winning lead vehicle designer is part of the World Championship-winning Brabham Formula One design team, and also spent years as a Design Engineer at McLaren.”