In-ear computing: The hearables heralding the dawn of a screen-free future

Hearables have been touted as the next big thing for wearables for some time, but will they really have a meaningful impact on our lives? We hear from Bragi CMO Jarrod Jordan about how the technology could transform the way we communicate

For the past few decades, computing has advanced at an incredible pace. Within a single generation we’ve gone from cumbersome desktops to devices that are effectively pocket-sized supercomputers, so it comes as no surprise that technology manufacturers and consumers alike are hungry for the next step.

For many this step is known as ‘the fourth platform’: something that takes us beyond our current screen-based lives, and pushes computing and communication seamlessly into the background.

And while we’re not quite there yet, hearables may very well be that something.

“The hearable is actually the precipice of the very beginning of the start of fourth-platform computing, where people can put a device inside of their ear, they can actually become more productive, they can move more places and they can do more things,” explains Jarrod Jordan, CMO of Bragi, speaking at the Wearable Technology Show.

“This isn’t just happening five years from now; we are talking about 18 months from now, it’s starting to become more and more prominent. People are starting to look at this in the same way they did when they were looking at, for example, when the iPhone first came out.”

Bragi is arguably in a very good place for this oncoming breakthrough. The Germany-based company is behind one of the first true hearables, the Dash, which has developed from a Kickstarter success story back in 2014 into a fully fledged product now available in stores. And with an SDK (software development kit) on its way that will allow third parties to develop apps for the device, it has all the makings of a truly useful device.

Beyond the tethered smartphone

Wearable technology has long been touted as a game-changing space from which the next generation of computing will come, but so far much of what’s been developed has failed to live up to that claim. Most devices remain entirely reliant on smartphones, making them peripherals to existing devices rather than technologies that truly push things forward in their own right.

Images courtesy of Bragi

Which raises the question: what does a true fourth-platform device need to offer?

“A few things need to happen in order for that fourth platform to exist, or to have that device or item exist on the fourth platform,” Jordan explains. “It has to make the user more integrated into real-world scenarios; it has to make the user be more productive and it has to be automated to help them with predictive behaviours – in other words it has to start doing things on behalf of the user without the user having to think about it.”

For some, virtual reality could be that platform, but Jordan argues that it so far fails to achieve these goals.

“As much as I love it as a form of entertainment, the idea that you have an integration with the real world, or that you can become automated or more productive with a device currently over your head [is wrong],” he says. “[VR] actually brings you out of the world and distracts you from what’s happening around you.”

Another option is voice-enabled devices such as the Amazon Echo, which are arguably much closer to being true fourth-platform devices, but fall short in that they typically sit in fixed locations with little ability to gather data about their users.

“What’s great about this is it does do a lot of the things I just mentioned: you can actually walk in the room and get things ordered, you can have things turn on or turn off etc,” Jordan says. “But there’s a couple of things: it doesn’t actually integrate with you as a human, it doesn’t understand what your body or your biometrics are telling it and it can go with you but it doesn’t travel with you per se.”

The logical step for some, then, is implanted computers. They’re always there, they can gather data and provide unseen feedback and assistance and they don’t need to rely on a smartphone. But they come with a rather significant problem: how many of us are really up for having tech surgically implanted inside us?

“To a lot of people that bothers them; it even bothers me,” says Jordan. “I don’t necessarily want a device inside of me, but I do need a device that can somehow get inside of me, grab different parts of my biometrics and help me become more productive or more active.”

When does a headphone become a hearable?

For Jordan, true fourth-platform devices will combine the best of these nearly-there technologies into something consumers will actually want to use.

“The way I look at it, there are three ways that these things need to come together to make that fourth platform,” he says. “It needs to be embedded yet detachable: I think if it’s inside of you then that’s a problem, I just don’t think adoption of that by the masses is really there.

“It needs to leverage multiple sensors so it’s not only voice, it’s not only eyes, it’s not only touch: it’s taking in several different components of your body and being able to give output from that. It needs to be able to augment your behaviour and predict your behaviour as well.”

Hearables, he argues, are that device, although he is keen to stress that not all technology you can put in your ears is a true hearable.

“It is not simply a truly wireless in-ear device. Many of them are excellent: wonderful sound, fun to use etc, but they are not a computer,” he explains.

“If you cannot update your device with firmware, just like you get with your iPhone through those OS updates, if you cannot do that with your hearable it is not by definition a hearable. It is a headphone and it may be excellent, it may be fun to use, but not exactly a hearable.

“The second thing is it must be intelligent; the device must be able to pick up what you are doing and give you a feedback loop in order to make you more productive.”

Bragi Dash: in-ear computers

Whether Bragi’s own device, the Dash, fulfils these needs will ultimately be decided by its users, but it does make a compelling case. Because while the Dash looks like just a regular set of wireless earbuds, it is in fact a complete computer in two parts, housed entirely inside the minimal casing.

“We did not go out to build headphones. We actually went out to build in-ear computers; a binary computer, with a left and right side allowing us to make even more datasets and even more predictions,” says Jordan.

“In building the device we were challenged – first of all we had nanotechnology: how do we push all of these things into very, very little space? We put 27 sensors inside of our device, we put infrared, we put accelerometers, a gyroscope, a 32-bit processor and a 4GB hard drive all in a thing the size of a dime that sits inside your ear.”

And that means that Dash can do pretty much all the things you’d expect from conventional wearable technology, without needing to hook up to a phone or plant sensors across your body.

“We have tracking: heart rate, respiration, acceleration, temperature, rotation, absolute direction – all of these types of things can be gathered through the device. All of those things can also be gathered and put into datasets,” he says.

“You have a headset that functions similarly to how you make normal telephone calls. You have an earphone microphone – that means the microphone is actually inside your ear not outside. You have noise cancellation with audio transparency: that means that you can hear the world around you as well as what’s in your device, so you’re actually able to have an augmented situation there. Speech control in the ambient microphone: again, those are things that allow you to sit there and make things more productive.”
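
To make the idea of turning those signals into datasets concrete, here is a minimal sketch in Python of what a single reading from such a device might look like. The field names and units are our own illustration, not Bragi’s actual API:

from dataclasses import dataclass
from datetime import datetime

@dataclass
class BiometricSample:
    """One reading from an in-ear sensor suite (illustrative names only)."""
    timestamp: datetime
    heart_rate_bpm: float     # optical heart-rate sensor
    respiration_rate: float   # breaths per minute
    temperature_c: float      # in-ear temperature
    acceleration_g: tuple     # (x, y, z) accelerometer reading
    rotation_dps: tuple       # (x, y, z) gyroscope reading, degrees per second
    heading_deg: float        # absolute direction, 0-360

def to_dataset(samples):
    """Flatten a stream of samples into rows ready for analysis."""
    return [(s.timestamp.isoformat(), s.heart_rate_bpm, s.respiration_rate,
             s.temperature_c, s.heading_deg) for s in samples]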

Dash also solves the interaction problem – usually in-ear wearables rely on smartphones – with a mixture of gestures and voice commands.

“Right now on our device you can actually nod your head and answer a call. You can say no and reject that call. You can shake your head three times and make music shuffle; you can double tap your face twice and say ’tell me how to get home’ and the device will tell you how to get home,” Jordan explains.
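
As a rough illustration of how such controls might be wired up – the gesture names and actions below are invented for the example, not taken from Bragi’s SDK – the interaction model amounts to a simple dispatch table:

def answer_call():   print("Call answered")
def reject_call():   print("Call rejected")
def shuffle_music(): print("Shuffling music")
def navigate_home(): print("Starting navigation home")

GESTURE_ACTIONS = {
    "head_nod": answer_call,          # nod to answer an incoming call
    "voice_no": reject_call,          # say 'no' to reject it
    "head_shake_x3": shuffle_music,   # shake your head three times
    "tap_plus_voice": navigate_home,  # double tap, then ask for directions home
}

def handle_gesture(gesture):
    """Run the action mapped to a detected gesture, if any."""
    action = GESTURE_ACTIONS.get(gesture)
    if action:
        action()

handle_gesture("head_nod")  # -> Call answered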

But that’s not all Bragi has planned for the device. The company is already working with IBM Watson and several automotive companies to build on the Dash’s capabilities, and hopes to be able to utilise the data collected to significantly advance how the device can help you in your day-to-day life.

“We are collecting biometric data: we know your red and white blood cell counts; we know the difference between your scared heart rate and your nervous heart rate and your exercise-induced heart rate,” he says. “We can see the difference between all of those, so we can actually look to a world where we can start to build apps on top of those behaviours to allow you to then become more productive based on exactly what’s happening with your body, as well as starting to predict what may happen to you in the future.”

A post-screen world

The true promise of hearables lies in their ability to interact with the increasingly connected world around us, and to remove many of the screens that now carve through our lives. Computers would remain ever-present, but in a manner that is less intrusive, and more able to respond to our needs without us having to tell them to.

“By integrating with you and into the Internet of Things, think about all those gestures, think about all that biofeedback you’re getting, and imagine being able to control the devices around you,” Jordan enthuses. “So you yourself become the trackpad.

“Imagine being able to walk into a room and simply control the devices. Let’s say you walk home and you just lift your arms up and the lights turn on, a double snap and a little Barry White starts playing.

“Your temperature is high, you’re on your way home and all of a sudden that air conditioner at home knows to turn on. You can do things that are very different by having the computer integrated into you.”
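
In code, the air-conditioner scenario reduces to a simple predictive rule. This is a hedged sketch with an invented threshold and device name, not any real smart-home API:

class AirConditioner:
    def turn_on(self):
        print("Air conditioner on")

HIGH_TEMP_C = 37.5  # illustrative 'running hot' threshold

def on_sample(body_temp_c, heading_home, aircon):
    """Pre-emptively cool the house when the wearer is hot and heading home."""
    if body_temp_c >= HIGH_TEMP_C and heading_home:
        aircon.turn_on()

on_sample(38.1, True, AirConditioner())  # -> Air conditioner on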

Soviet report detailing lunar rover Lunokhod-2 released for first time

Russian space agency Roskosmos has released the scientific report into the lunar rover Lunokhod-2 for the first time, revealing previously unknown details about the rover and how it was controlled back on Earth.

The report, written entirely in Russian, was originally penned in 1973 following the Lunokhod-2 mission, which began in January of that year. It had remained accessible to only a handful of experts at the space agency prior to its release today, which marks the 45th anniversary of the mission.

Bearing the names of some 55 engineers and scientists, the report details the systems that were used both to remotely control the lunar rover from a base on Earth, and to capture images and data about the Moon’s surface and Lunokhod-2’s place on it. This information, and in particular the carefully documented issues and solutions the report contains, went on to be used in many later unmanned missions to other parts of the solar system.

As a result, it provides a unique insight into this era of space exploration and the technical challenges that scientists faced, such as the low-frame-rate television system that functioned as the ‘eyes’ of the Earth-based rover operators.

A NASA depiction of the Lunokhod mission. Above: an image of the rover, courtesy of NASA, overlaid onto a panorama of the Moon taken by Lunokhod-2, courtesy of Ruslan Kasmin.

One detail that may be of particular interest to space enthusiasts and experts is the operation of a unique system called Seismas, which was tested for the first time in the world during the mission.

Designed to determine the precise location of the rover at any given time, the system involved transmitting laser pulses from ground-based telescopes, which were picked up by a photodetector onboard the lunar rover. When the laser was detected, the rover emitted a radio signal back to Earth, which provided the rover’s coordinates.
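
The ranging principle can be illustrated with a little arithmetic. In this simplified Python sketch – the report’s actual method, which also depended on the telescope’s pointing direction, is not reproduced here – the distance falls out of the interval between firing the laser and receiving the radio reply:

C_KM_PER_S = 299_792.458  # speed of light

def one_way_distance_km(round_trip_s, rover_delay_s=0.0):
    """Telescope-to-rover distance from the laser-out, radio-back interval."""
    return C_KM_PER_S * (round_trip_s - rover_delay_s) / 2

# At the Moon's mean distance (~384,400km) the reply arrives roughly 2.56s
# after the laser pulse is fired:
print(one_way_distance_km(2.564))  # ~384,334 km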

Other details, while technical, also give some insight into the culture of the mission, such as the careful work to eliminate issues in the long-range radio communication system. One issue, for example, was worked on with such thoroughness that the fix left one of the devices using more resources than it had been allocated – a problem that was itself outlined in the report.

The document also provides insight into the technological capabilities of the time back on Earth. While it is mostly typed, certain mathematical symbols had to be written in by hand, and the report also features a number of diagrams and graphs that have been painstakingly hand-drawn.

A hand-drawn graph from the report, showing temperature changes during one of the monitoring sessions during the mission

Lunokhod-2 was the second of two unmanned lunar rovers to be landed on the Moon by the Soviet Union within the Lunokhod programme, having been delivered via a soft landing by the unmanned Luna 21 spacecraft in January 1973.

In operation between January and June of that year, the robot covered a distance of 39km, meaning it still holds the lunar distance record to this day.

One of only four rovers to be deployed on the lunar surface, Lunokhod-2 was the last rover to visit the Moon until December 2013, when Chinese lunar rover Yutu made its maiden visit.

Robot takes first steps towards building artificial lifeforms

A robot equipped with sophisticated AI has successfully simulated the creation of artificial lifeforms, in a key first step towards the eventual goal of creating true artificial life.

The robot, which was developed by scientists at the University of Glasgow, was able to model the creation of artificial lifeforms using unstable oil-in-water droplets. These droplets effectively played the role of living cells, demonstrating the potential of future research to develop living cells based on building blocks that cannot be found in nature.

Significantly, the robot also successfully predicted their properties before they were created, even though this could not be achieved using conventional physical models.

The robot, which was designed by Glasgow University’s Regius Chair of Chemistry, Professor Lee Cronin, is driven by machine learning and the principles of evolution.

It has been developed to autonomously create oil-in-water droplets with a host of different chemical makeups and then use image recognition to assess their behaviour.

Using this information, the robot was able to engineer droplets to have different properties. Those which were found to be desirable could then be recreated at any time, using a specific digital code.
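
The underlying loop is classic evolutionary search: propose chemical recipes, score the resulting droplets, keep the best and mutate them. The toy Python sketch below – with a stand-in fitness function in place of the robot’s image-recognition scoring, and not the Glasgow team’s actual code – shows the shape of that process:

import random

def evaluate(recipe):
    """Stand-in for the image-recognition step that scored real droplet
    behaviour (movement, division and so on) from video."""
    return -sum((x - 0.5) ** 2 for x in recipe)

def mutate(recipe, rate=0.1):
    """Nudge each ingredient ratio, keeping it within [0, 1]."""
    return [min(1.0, max(0.0, x + random.uniform(-rate, rate))) for x in recipe]

def evolve(n_ingredients=4, population=20, generations=50):
    pool = [[random.random() for _ in range(n_ingredients)]
            for _ in range(population)]
    for _ in range(generations):
        pool.sort(key=evaluate, reverse=True)
        survivors = pool[:population // 2]
        pool = survivors + [mutate(r) for r in survivors]
    # The winning recipe is the 'digital code' from which a droplet
    # with the desired behaviour can be recreated at any time.
    return max(pool, key=evaluate)

print(evolve())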

“This work is exciting as it shows that we are able to use machine learning and a novel robotic platform to understand the system in ways that cannot be done using conventional laboratory methods, including the discovery of ‘swarm’ like group behaviour of the droplets, akin to flocking birds,” said Cronin.

“Achieving lifelike behaviours such as this is important in our mission to make new lifeforms, and these droplets may be considered ‘protocells’ – simplified models of living cells.”

One of the oil droplets created by the robot

The research, which is published today in the journal PNAS, is one of several research projects being undertaken by Cronin and his team within the field of artificial lifeforms.

While the overarching goal is moving towards the creation of lifeforms using new and unprecedented building blocks, the research may also have more immediate potential applications.

The team believes that their work could also have applications in several practical areas, including the development of new methods for drug delivery or even innovative materials with functional properties.