Using mixed realities in space will mean astronauts don’t need as much training

It costs millions of dollars to train an astronaut and put them into space, but as we explore more of the universe and send more people up, the resources aren’t always going to be there. Holograms and augmented reality are going to step up to the plate.

Microsoft’s latest creation, the HoloLens, is being sent to space, and one of its perceived advantages is that astronauts won’t need as much training as they previously did.

The headset will give astronauts assistance when they need it, for example by allowing a skilled member of staff on Earth to see what the space explorer sees and draw annotations and instructions onto the wearer’s view.

This approach may mean that astronauts don’t need to be trained in as many specific areas – someone in space could be coached through a repair by an experienced engineer who is located elsewhere.

Officials from NASA see it as a possible way to teach astronauts more skills.
“HoloLens and other virtual and mixed reality devices are cutting-edge technologies that could help drive future exploration and provide new capabilities to the men and women conducting critical science on the International Space Station,” said NASA’s Sam Scimemi. “This new technology could also empower future explorers requiring greater autonomy on the journey to Mars.”

The HoloLens has already been tested on board NASA’s reduced-gravity flights and is due to be sent to the International Space Station on 28 June, on the next SpaceX resupply mission, as part of NASA’s Project Sidekick.

The project works in two ways: the first is a ‘remote expert mode’ where experts based on Earth use Skype to communicate with and provide real-time guidance to someone in space, as well as being able to annotate the space environment.

The second is a ‘procedure mode’ that augments individual procedures with animated holographic illustrations displayed on top of the objects that the astronaut is interacting with. The approach would mean that the astronaut is directed by their headset and doesn’t need to be trained in what they are doing.
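The idea behind procedure mode, a sequence of instructions each anchored to the object the astronaut is working on, can be sketched in miniature. Everything below is hypothetical illustration: the `Step`/`run_procedure` names and the sample repair procedure are invented for this sketch and are not part of Project Sidekick.

```python
from dataclasses import dataclass


@dataclass
class Step:
    target: str       # object the holographic overlay would be anchored to
    instruction: str  # instruction shown alongside the animated illustration


def run_procedure(steps):
    """Yield one formatted instruction at a time, as a headset might display them."""
    for i, step in enumerate(steps, start=1):
        yield f"Step {i}: {step.instruction} (overlay on: {step.target})"


# A made-up repair procedure for illustration
repair = [
    Step("filter housing", "Unscrew the four retaining bolts"),
    Step("air filter", "Slide out the used filter cartridge"),
    Step("filter housing", "Insert the replacement and re-seat the cover"),
]

for line in run_procedure(repair):
    print(line)
```

The point of the structure is that the knowledge lives in the procedure data, not in the astronaut’s training: swapping in a different list of steps guides a different task.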

Alex Kipman from Microsoft said that incorporating mixed realities would help astronauts to “unlock new potential”.


Image courtesy of Microsoft. Featured image courtesy of NASA

When the headsets arrive in space the software and hardware will be tested in the standalone mode, before the remote mode is tested with a second set of devices.

When Microsoft announced the HoloLens earlier this year, it said it had already partnered with NASA to put humans ‘on Mars’.

The augmented reality headset will let those on Earth see what the Curiosity rover on Mars can see, allowing them to explore the planet in a new way, with visual information and data overlaid onto the wearer’s view.

Fitness trackers are surging in popularity and they present us with the information we need to make changes to our lives. How long will it be until brain-based technology can unlock the key to our decision making?

Our brains drive everything that we do, and one day we will be able to use them to control the screens in front of us. Before then, though, this technology will help us make better life decisions.

Trevor Coleman, co-founder and chief product officer at InteraXon, has said that technology will be developed that helps us to adjust our lives based on information collected from our brains.

His company specialises in brain-sensing technology and has developed Muse, a wearable headband that aims to help wearers improve their brain activity.

Coleman said that ‘control technology’ which may involve a thought interface will take a long time to develop before it is able to replace a keyboard or a mouse, but in the meantime the amount of information we can gather about ourselves will change the way we live.

“What is really powerful, and what is happening right now and is going to continue for a number of years, is that we’re developing ways of giving people this astonishing new wealth of information about themselves,” he said.

“This is about being able to have technologies that understand how you’re feeling and are able to respond to those feelings and thoughts almost as they happen.”

Coleman, who has helped to develop the Muse headband which launched at the UK’s Wearable Technology Show today, said that in the future an interface could offer help when it detects that the wearer is frustrated with a problem they are trying to tackle.

“Or, if you were deep in focus, if you had a big deadline coming up and you’re working on a document, that your phone might only let through calls from people you’ve marked as really important without you having to constantly be adjusting the notification preferences on your phone and managing this ever expanding fleet of devices and how they all talk to each other,” he said.
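The call-filtering behaviour Coleman describes amounts to a simple rule: when a focus signal is high, hold back everything except contacts you have marked as important. This is a minimal sketch under assumed inputs; the `should_deliver` function, the 0-to-1 `focus_score` and the notification fields are all invented for illustration, not Muse’s actual API.

```python
def should_deliver(notification, focus_score, vip_contacts, threshold=0.7):
    """Deliver everything when focus is low; only VIP calls when deep in focus."""
    if focus_score < threshold:
        return True  # not focused: no need to filter
    return notification["type"] == "call" and notification["sender"] in vip_contacts


vips = {"boss", "partner"}
print(should_deliver({"type": "call", "sender": "boss"}, 0.9, vips))        # True
print(should_deliver({"type": "email", "sender": "newsletter"}, 0.9, vips))  # False
print(should_deliver({"type": "email", "sender": "newsletter"}, 0.3, vips))  # True
```

The appeal of such a rule is exactly what Coleman notes: the user never has to adjust notification preferences by hand, because the brain signal does the adjusting.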

Stopping the notification rot

On average we unlock and check our phones 110 times a day, and it is no surprise considering the number of notifications we are being bombarded with.

The technological world has become increasingly noisy and it can be difficult to concentrate on the things that we are trying to achieve. One study found that its 15 users received an average of 63.5 mobile notifications a day.

“I think maintaining focus is a critical skill in today’s world. So often there’s notifications and every app is clamouring for your attention and trying to get at you, so your ability to tune out the world and focus is really important to being able to get anything significant done in your life,” Coleman said.


This technological attention-seeking can cause us to neglect our brains and the demands being placed on them.

Coleman said that the aims behind the Muse headband are to help “reconnect to yourself and to develop stronger focus”.

To help achieve this the wearable uses EEG technology to passively detect changes in the brain – from outside the head.

It then provides a range of scenarios and exercises intended to help the wearer monitor their attention span and develop ways to improve it. In essence, it provides data about our brains that we can monitor and watch change over time.

Analytics for the mind

Learning more about how our bodies, and in particular our minds, work gives us a greater ability to change our habits and live in a healthier way. The rise of fitness trackers and the daily monitoring of our steps and food intake has stemmed from this logic.

A device that focuses on the brain could monitor study habits, for example, over a long period of time and then suggest a break when it knows your focus is going to drop, thus maximising productivity and reducing fatigue.
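One simple way such a device could decide when to suggest a break is to keep a rolling average of recent focus readings and flag when it dips below a threshold. The sketch below assumes made-up 0-to-1 focus samples and an invented `BreakAdvisor` class; it illustrates the idea, not any real product’s algorithm.

```python
from collections import deque


class BreakAdvisor:
    """Track a rolling average of focus samples and flag when it dips."""

    def __init__(self, window=5, threshold=0.5):
        self.samples = deque(maxlen=window)  # keeps only the last `window` readings
        self.threshold = threshold

    def add(self, focus):
        self.samples.append(focus)

    def suggest_break(self):
        if len(self.samples) < self.samples.maxlen:
            return False  # not enough data yet
        return sum(self.samples) / len(self.samples) < self.threshold


advisor = BreakAdvisor(window=3, threshold=0.5)
for f in [0.8, 0.6, 0.4, 0.3, 0.3]:  # focus trending downward
    advisor.add(f)
print(advisor.suggest_break())  # True: the last three samples average about 0.33
```

The windowed average matters: a single distracted moment does not trigger a break suggestion, only a sustained decline, which is roughly what a human coach would look for too.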


Image courtesy of Muse

Coleman says this could be much like an athlete being trained by a coach who knows how they perform and what’s needed to get the best out of them.

Technology like this is already being developed in some formats. Argus Labs has been working on algorithm-based technology which gathers social media data and analyses how you use your mobile phone, to help predict the songs that you want to listen to.

Researchers in Bangladesh have even developed technology that can interpret your mood based on how you are typing on a keyboard.
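Systems like this typically start from the timing of keystrokes: longer or more erratic gaps between presses can correlate with hesitation or stress. The sketch below computes those basic timing features; the `typing_features` helper and sample data are invented for illustration and are not the Bangladeshi researchers’ method.

```python
import statistics


def typing_features(timestamps):
    """Compute inter-keystroke gaps and simple summary statistics from keypress times (seconds)."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return {
        "mean_gap": statistics.mean(gaps),     # overall typing speed
        "gap_stdev": statistics.pstdev(gaps),  # rhythm regularity
    }


# Keypress times in seconds: perfectly steady typing
steady = [0.0, 0.2, 0.4, 0.6, 0.8]
print(typing_features(steady))
```

A mood classifier would then map such features, gathered over many typing sessions, to labels the user supplies, which is the easy-to-collect, hard-to-interpret kind of signal this whole field trades in.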

Coleman believes this sort of approach will grow and it will allow us to achieve things that we might not believe we could.

“If you have software or devices in your life that can be coaching you like that, whether it is about how you want to feel, or how you want to perform in emotional or physical tasks, the technology in your life is like a supportive coach – helping you get to where you want to be,” he said.

“We call it responsive technology but it is humanistic, it is supportive, and it is helping you achieve something that you might not have thought possible for yourself.”

You can read more about how brain technology is changing in the latest issue of Factor Magazine, which is out later this week.