In-ear computing: The hearables heralding the dawn of a screen-free future

Hearables have been touted as the next big thing for wearables for some time, but will they really have a meaningful impact on our lives? We hear from Bragi CMO Jarrod Jordan about how the technology could transform the way we communicate

For the past few decades, computing has advanced at an incredible pace. Within a single generation we’ve gone from cumbersome desktops to devices that are effectively pocket-sized supercomputers, so it comes as no surprise that technology manufacturers and consumers alike are hungry for the next step.

For many this step is known as ‘the fourth platform’: something that takes us beyond our current screen-based lives, and pushes computing and communication seamlessly into the background.

And while we’re not quite there yet, hearables may very well be that something.

“The hearable is actually the precipice of the very beginning of the start of fourth-platform computing, where people can put a device inside of their ear, they can actually become more productive, they can move more places and they can do more things,” explains Jarrod Jordan, CMO of Bragi, speaking at the Wearable Technology Show.

“This isn’t just happening five years from now; we are talking about 18 months from now, it’s starting to become more and more prominent. People are starting to look at this in the same way they did when they were looking at, for example, when the iPhone first came out.”

Bragi is arguably in a very good place for this oncoming breakthrough. The Germany-based company is behind one of the first true hearables, the Dash, which has developed from a Kickstarter success story back in 2014 into a fully fledged product now available in stores. And with an SDK (software development kit) on its way to allow third parties to develop apps for the device, it has all the makings of a truly useful piece of technology.

Beyond the tethered smartphone

Wearable technology has long been touted as a game-changing space from which the next generation of computing will come, but so far much of what’s been developed has failed to live up to that claim. Most devices remain entirely reliant on smartphones, making them peripherals to existing devices rather than technologies that truly push things forward in their own right.

Images courtesy of Bragi

Which begs the question: what does a true fourth-platform device need to offer?

“A few things need to happen in order for that fourth platform to exist, or to have that device or item exist on the fourth platform,” Jordan explains. “It has to make the user more integrated into real-world scenarios; it has to make the user be more productive and it has to be automated to help them with predictive behaviours – in other words it has to start doing things on behalf of the user without the user having to think about it.”

For some, virtual reality could be that platform, but Jordan argues that it so far fails to achieve these goals.

“As much as I love it as a form of entertainment, the idea that you have an integration with the real world, or that you can become automated or more productive with a device currently over your head [is wrong],” he says. “[VR] actually brings you out of the world and distracts you from what’s happening around you.”

Another option is voice-enabled devices such as the Amazon Echo, which are arguably much closer to being true fourth-platform devices, but fail in that they are typically in fixed locations with little ability to gather data about their users.

“What’s great about this is it does do a lot of the things I just mentioned: you can actually walk in the room and get things ordered, you can have things turn on or turn off etc,” Jordan says. “But there’s a couple of things: it doesn’t actually integrate with you as a human, it doesn’t understand what your body or your biometrics are telling it and it can go with you but it doesn’t travel with you per se.”

The logical step for some, then, is implanted computers. They’re always there, they can gather data and provide unseen feedback and assistance and they don’t need to rely on a smartphone. But they come with a rather significant problem: how many of us are really up for having tech surgically implanted inside us?

“To a lot of people that bothers them; it even bothers me,” says Jordan. “I don’t necessarily want a device inside of me, but I do need a device that can somehow get inside of me, grab different parts of my biometrics and help me become more productive or more active.”

When does a headphone become a hearable?

For Jordan, true fourth-platform devices will combine the best of these nearly-there technologies into something consumers will actually want to use.

“The way I look at it, there are three ways that these things need to come together to make that fourth platform,” he says. “It needs to be embedded yet detachable: I think if it’s inside of you then that’s a problem, I just don’t think adoption of that by the masses is really there.

It needs to leverage multiple sensors so it’s not only voice, it’s not only eyes, it’s not only touch: it’s taking in several different components of your body and being able to give output from that

“It needs to leverage multiple sensors so it’s not only voice, it’s not only eyes, it’s not only touch: it’s taking in several different components of your body and being able to give output from that. It needs to be able to augment your behaviour and predict your behaviour as well.”

Hearables, he argues, are this device, although he is keen to stress that not all technology you can put in your ears is really a true hearable.

“It is not simply a truly wireless in-ear device. Many of them are excellent – wonderful sound, fun to use etc – but they are not a computer,” he explains.

“If you cannot update your device with firmware, just like you get with your iPhone through those OS updates, if you cannot do that with your hearable it is not by definition a hearable. It is a headphone and it may be excellent, it may be fun to use, but not exactly a hearable.

“The second thing is it must be intelligent; the device must be able to pick up what you are doing and give you a feedback loop in order to make you more productive.”

Bragi Dash: in-ear computers

Whether Bragi’s own device, the Dash, fulfils these needs will ultimately be decided by its users, but it does make a compelling case. Because while the Dash looks like just a regular set of wireless earbuds, it is in fact a complete computer in two parts, housed entirely inside the minimal casing.

“We did not go out to build headphones. We actually went out to build in-ear computers; a binary computer, with a left and right side allowing us to make even more datasets and even more predictions,” says Jordan.

“In building the device we were challenged – first of all we had nanotechnology: how do we push all of these things into very, very little space? We put 27 sensors inside of our device; we put infrared, accelerometers, a gyroscope, a 32-bit processor and a 4GB hard drive, all in a thing the size of a dime that sits inside your ear.”

And that means that Dash can do pretty much all the things you’d expect from conventional wearable technology, without needing to hook up to a phone or plant sensors across your body.

We did not go out to build headphones. We actually went out to build in-ear computers; a binary computer, with a left and right side allowing us to make even more datasets and even more predictions

“We have tracking: heart rate, respiration, acceleration, temperature, rotation, absolute direction – all of these types of things can be gathered through the device. All of those things can also be gathered and put into datasets,” he says.

“You have a headset that functions similarly to how you make normal telephone calls. You have an earphone microphone – that means the microphone is actually inside your ear not outside. You have noise cancellation with audio transparency: that means that you can hear the world around you as well as what’s in your device, so you’re actually able to have an augmented situation there. Speech control in the ambient microphone: again, those are things that allow you to sit there and make things more productive.”

Dash also solves the interaction problem – usually in-ear wearables rely on smartphones – with a mixture of gestures and voice commands.

“Right now on our device you can actually nod your head and answer a call. You can say no and reject that call. You can shake your head three times and make music shuffle; you can double tap your face twice and say ’tell me how to get home’ and the device will tell you how to get home,” Jordan explains.
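The interaction model Jordan describes – a discrete gesture recognised by the device and mapped to a command – can be pictured as a simple dispatch table. The sketch below is purely illustrative: the gesture names and handler functions are hypothetical and are not Bragi’s actual SDK, which had not been released at the time of the interview.

```python
# Illustrative sketch of gesture-to-command dispatch, as described in the
# interview. Gesture names and handlers are hypothetical, NOT Bragi's SDK.

def answer_call():
    return "call answered"

def reject_call():
    return "call rejected"

def shuffle_music():
    return "music shuffled"

# Map each recognised gesture to the action it should trigger.
GESTURE_ACTIONS = {
    "nod": answer_call,            # nod your head to answer a call
    "head_shake": reject_call,     # shake your head to reject it
    "triple_shake": shuffle_music, # shake three times to shuffle music
}

def handle_gesture(gesture: str) -> str:
    """Look up a detected gesture and run its associated action."""
    action = GESTURE_ACTIONS.get(gesture)
    if action is None:
        return "unrecognised gesture"
    return action()
```

The appeal of this pattern is that adding a new gesture is just another dictionary entry, which is broadly how an app-facing SDK would let third parties extend the device’s behaviour.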

But that’s not all Bragi has planned for the device. The company is already working with IBM Watson and several automotive companies to build on the Dash’s capabilities, and hopes to be able to utilise the data collected to significantly advance how the device can help you in your day-to-day life.

“We are collecting biometric data: we know your red and white blood cell counts; we know the difference between your scared heart rate and your nervous heart rate and your exercise-induced heart rate,” he says. “We can see the difference between all of those so we can actually look to a world where we can start to build apps on top of those behaviours to allow you to then become more productive based on exactly what’s happening with your body, as well as starting to predict what may happen to you in the future.”

A post-screen world

The true promise of hearables lies in their ability to interact with the increasingly connected world around us, and to remove many of the ever-more-prevalent screens that carve up our lives. Computers would remain ever-present, but in a manner that would be less intrusive, and more able to respond to our needs without us having to tell them to.

“By integrating with you and into the Internet of Things, think about all those gestures, think about all that biofeedback you’re getting, and imagine being able to control the devices around you,” Jordan enthuses. “So you yourself become the trackpad.

“Imagine being able to walk into a room and simply control the devices. Let’s say you walk home and you just lift your arms up and the lights turn on, a double snap and a little Barry White starts playing.

“Your temperature is high, you’re on your way home and all of a sudden that air conditioner at home knows to turn on. You can do things that are very different by having the computer integrated into you.”

School will use facial analysis to identify students who are dozing off

In September the ESG business school in Paris will begin using artificial intelligence and facial analysis to determine whether students are paying attention in class. The school says the technology will be used to improve the performance of students and professors.

Source: The Verge

Company offers free training for coal miners to become wind farmers

A Chinese wind-turbine maker wants American workers to retrain and become wind farmers. The training program was announced at an energy conference in Wyoming, where the American arm of Goldwind, the Chinese wind-turbine manufacturer, is located.

Source: Quartz

Google AI defeats human Go champion

Google's DeepMind AI AlphaGo has defeated the world's number one Go player Ke Jie. AlphaGo secured the victory after winning the second game in a three-game match. DeepMind founder Demis Hassabis said Ke Jie "pushed AlphaGo right to the limit".

Source: BBC

Vegan burgers that taste like real meat to hit Safeway stores

Beyond Meat, which promises its plant-based burgers bleed and sizzle like real ground beef and is backed by investors including Bill Gates, will begin distributing them in more than 280 Safeway stores in California, Hawaii and Nevada.

Source: Bloomberg

The brain starts to eat itself after chronic sleep deprivation

Brain cells that destroy and digest worn-out cells and debris go into overdrive in mice that are chronically sleep-deprived. The discovery could explain why a chronic lack of sleep puts people at risk of neurological disorders like Alzheimer’s disease.

Source: New Scientist

"We can still act and it won’t be too late," says Obama

Former US President Barack Obama has written an op-ed piece in the Guardian giving his views on some of the greatest challenges facing the world – food and climate change – and what we can do about them. "We can still act and it won’t be too late," writes Obama.

Source: The Guardian

Juno mission: Jupiter’s magnetic field is even weirder than expected

It has long been known that Jupiter has the most intense magnetic field in the solar system, but the first round of results from NASA’s Juno mission has revealed that it is far stronger and more misshapen than scientists predicted.

Announcing the findings of the spacecraft’s first data-collection pass, which saw Juno fly within 2,600 miles (4,200km) of Jupiter on 27th August 2016, NASA mission scientists revealed that the planet far surpassed the expectations of models.

Measuring Jupiter’s magnetosphere using Juno’s magnetometer investigation (MAG) tool, they found that the planet’s magnetic field is even stronger than models predicted, at 7.766 gauss: around 10 times stronger than the strongest fields found on Earth.
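The scale of that measurement is easier to appreciate against a terrestrial benchmark. A quick back-of-the-envelope check bears out the comparison; note that the ~0.66 gauss peak surface field for Earth is a commonly cited approximate figure, not one given in the mission announcement.

```python
# Sanity check on the Jupiter-vs-Earth field comparison.
# 7.766 gauss is the Juno measurement; 0.66 gauss is an assumed,
# commonly cited approximate peak for Earth's surface field.

jupiter_gauss = 7.766
earth_max_gauss = 0.66  # assumption: approximate peak surface field on Earth

ratio = jupiter_gauss / earth_max_gauss
print(f"Jupiter's field is about {ratio:.0f}x Earth's strongest")
```

The result lands around 12x, consistent at order-of-magnitude level with the mission team’s “10 times stronger” characterisation.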

Furthermore, it is far more irregular in shape, prompting a re-think about how it could be generated.

“Juno is giving us a view of the magnetic field close to Jupiter that we’ve never had before,” said Jack Connerney, Juno deputy principal investigator and magnetic field investigation lead at NASA’s Goddard Space Flight Center in Greenbelt, Maryland.

“Already we see that the magnetic field looks lumpy: it is stronger in some places and weaker in others.”

An enhanced colour view of Jupiter’s south pole. Image courtesy of NASA/JPL-Caltech/SwRI/MSSS/Gabriel Fiset. Featured image courtesy of NASA/SWRI/MSSS/Gerald Eichstädt/Seán Doran

At present, scientists cannot say for certain why or how Jupiter’s magnetic field is so peculiar, but they do already have a theory: that the field is not generated from the planet’s core, but in a layer closer to its surface.

“This uneven distribution suggests that the field might be generated by dynamo action closer to the surface, above the layer of metallic hydrogen,” said Connerney.

However, with many more flybys planned, the scientists will have considerable opportunities to learn more about this phenomenon, and more accurately pinpoint the bizarre magnetic field’s cause.

“Every flyby we execute gets us closer to determining where and how Jupiter’s dynamo works,” added Connerney.

With each flyby, which occurs every 53 days, the scientists are treated to a 6MB haul of newly collected information, which takes around 1.5 days to transfer back to Earth.
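Those figures imply a strikingly modest effective downlink rate, which a quick calculation makes concrete (the 6MB and 1.5-day figures come from the text; the arithmetic assumes 1MB = 1,000,000 bytes):

```python
# Effective data rate of Juno's downlink, from the per-flyby figures above.
# Assumes 1 MB = 1,000,000 bytes.

data_bytes = 6 * 10**6          # ~6MB of science data per flyby
transfer_seconds = 1.5 * 86400  # 1.5 days, in seconds

rate = data_bytes / transfer_seconds
print(f"{rate:.1f} bytes per second")  # roughly 46 bytes/s
```

At around 46 bytes per second, each flyby’s “fire hose of Jovian science” reaches Earth at a trickle far slower than a dial-up modem.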

“Every 53 days, we go screaming by Jupiter, get doused by a fire hose of Jovian science, and there is always something new,” said Scott Bolton, Juno principal investigator from the Southwest Research Institute in San Antonio.

A newly released image of Jupiter’s stormy south pole. Image courtesy of NASA/JPL-Caltech/SwRI/MSSS/Betsy Asher Hall/Gervasio Robles

An unexpected magnetic field was not the only surprise from the first data haul. The mission also provided a first-look at Jupiter’s poles, which are unexpectedly covered in swirling, densely clustered storms the size of Earth.

“We’re puzzled as to how they could be formed, how stable the configuration is, and why Jupiter’s north pole doesn’t look like the south pole,” said Bolton. “We’re questioning whether this is a dynamic system, and are we seeing just one stage, and over the next year, we’re going to watch it disappear, or is this a stable configuration and these storms are circulating around one another?”

Juno’s Microwave Radiometer (MWR) also threw up some surprises, with some of the planet’s belts appearing to penetrate down to its surface, while others seem to evolve into other structures. It’s a curious phenomenon, and one which the scientists hope to better explore on future flybys.

“On our next flyby on July 11, we will fly directly over one of the most iconic features in the entire solar system – one that every school kid knows – Jupiter’s Great Red Spot,” said Bolton.

“If anybody is going to get to the bottom of what is going on below those mammoth swirling crimson cloud tops, it’s Juno and her cloud-piercing science instruments.”