Is the world ready to start experiencing music in virtual reality?

From Paul McCartney to Avenged Sevenfold, artists of all genres are beginning to explore the possibilities of VR. But is VR the next stage in performance technology, or is it just another flash-in-the-pan gimmick?

From 360° music videos to live shows streamed directly to your headset, virtual reality has become a hot topic for the music industry.

The technology has even piqued the interest of some of music’s biggest hitters. In 2016, Paul McCartney released a VR documentary that allowed viewers to learn about the Beatles while standing a few feet away from the man himself. More recently, animated funk troupe Gorillaz released their trippy video for ‘Saturnz Barz’, which has since amassed nearly 9 million views on YouTube.

Nevertheless, some critics are still unconvinced by VR, and the question now is whether the format can overcome the challenges to become a truly revolutionary force in music.

Creating videos with a 360° viewpoint in mind

In the saturated music market, artists are trying to find new ways to stand out. VR has therefore been used to add a fresh twist to music videos, making them more replayable and shareable online.

Most notably, musicians have been stitching together 360° videos that allow headset users to look around their immediate environment during the track, enhancing their immersion and making them feel closer to their favourite artists than ever before. A memorable example is the video for ‘Crown’ by American hip-hop act Run the Jewels, in which the viewer is able to turn around to look at various eccentric characters while the performers rap in the background.

At this year’s VR World event in London, digital marketing expert Mattie Bennett spoke about the way that VR helps musicians to reclaim listeners’ attention from their eternal internet binge.

“VR is really exciting for me because it makes me feel actually this is something that will make people focus, rather than listening to an album and start scrolling through Facebook and looking at cat videos on Instagram or whatever,” he says. “They are kind of lost in the music again, so that’s exciting.”

From Bennett’s perspective, the use of movement and spatial sound (which changes depending on where you are looking) could help add a sensorial aspect to videos: “What I see VR being able to do is create these environments where audio enhances the experience. So, for example, imagine instead of looking at the Abbey Road album cover, you are actually walking down Abbey Road while listening to it, and the environment around you changes.”
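
As a rough illustration of the head-tracked audio Bennett describes, the toy sketch below pans a single sound source across a stereo mix as the listener turns their head. The angles and the simple constant-power pan law are our own assumptions for illustration; real VR audio engines use head-related transfer functions and room modelling rather than a basic pan.

```python
import math

def stereo_gains(source_azimuth_deg: float, head_yaw_deg: float):
    """Toy head-tracked panning: the source sits at a fixed azimuth, and the
    left/right gains shift as the listener turns towards or away from it."""
    # Angle of the source relative to where the listener is currently facing
    relative = math.radians(source_azimuth_deg - head_yaw_deg)
    # Map that angle to a pan position: 0 = centre, -1 = hard left, +1 = hard right
    pan = max(-1.0, min(1.0, math.sin(relative)))
    # Constant-power pan law keeps perceived loudness steady while panning
    left = math.cos((pan + 1.0) * math.pi / 4.0)
    right = math.sin((pan + 1.0) * math.pi / 4.0)
    return left, right

# A source 90° to the listener's right while they face straight ahead...
print(stereo_gains(90, 0))    # almost entirely in the right channel
# ...and the same source once the listener has turned to face it
print(stereo_gains(90, 90))   # roughly equal in both channels
```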

Bennett’s example hints at an interactive element that several artists have already explored. Last year, VR producer Tyler Hurd created a bizarre yet brilliant music video for ‘Old Friend’ by synthpop group Future Islands. During the song, the headset wearer uses an HTC Vive to flail their limbs about in a mad animated dreamscape, trying to keep time with the cartoonish dancers around them. The video creates an irresistible urge to party as it plays, prompting Wired to comment that it was “the best example of VR’s potential so far”.

Hurd’s VR creation achieves something normal music videos can’t, which according to many commentators is exactly what it should be doing to truly hit the mainstream. Ryan Pulliam, Co-Founder of Specular Theory, told Electronic Beats Magazine that VR needs to be the focus of the music video, not just a gimmick. “If what you’re trying to do can also be done in 2D, it probably won’t make for a great story in 360°,” he said. “Creators must approach the concept with a full 360° viewpoint in mind to enhance the story, not simply enlarge it.”

Taking people inside a sweaty rave

VR has helped make artists’ digital content stand out from the crowd. However, some believe that the technology’s real future is in live music.

Last October, Avenged Sevenfold live-streamed a VR gig in 360° from the top of the Capitol Records Building in Los Angeles. The event was a ground-breaking experiment that allowed audiences across the world to don headsets and watch the heavy metal band perform on stage around them.

This proximity to the stars could be VR’s major USP, and something many consumers might be willing to pay for. The concept of being up close and personal with stars has been explored in the hugely successful Rock Band and Guitar Hero games, as well as Queen’s lauded ‘Bohemian Rhapsody Experience’, in which Freddie Mercury is brought to life as a neon-lit avatar.

VR would also offer concert promoters whole new revenue streams to explore. For example, the UK’s Glastonbury Festival sells out every year without fail. But adding VR elements could allow fans who’ve missed out on tickets to join in from the comfort of their own bedroom. The technology could have logistical implications too; fans could use VR to check their seats before booking, for instance.

One company capitalising on the opportunity has been online broadcasting platform Boiler Room, which announced it would be opening the world’s first VR music venue in 2017. Using a specialised recording space developed alongside Inception VR, the company intends to film gigs that can be streamed directly to viewers’ devices.

Though the plan is ambitious, Boiler Room founder Blaise Belville sees it as filling a real niche, saying in a press release that the venue will provide “immersive online experiences that bring people even closer to what it’s like being at a sweaty rave or an amazing concert half-way across the world”.

Are VR events a good enough substitute for the real thing?

The potential of VR in music has been hinted at, but the format is still far from mainstream. Opinions are divided about just how much headsets and waggly wands will shape the industry in the future.

Many commentators have argued that live-streaming gigs through VR will never be a good enough substitute for the real thing. Independent Venue Week founder Sybil Bell told the BBC that the format loses the “romance” of going to see your favourite band perform in the flesh, and that “you can’t get that atmosphere through a screen”.

Such staunch criticism is supported by the fact that, technology-wise, VR simply isn’t there yet. On YouTube’s Engineering and Developers Blog, software engineer Anjali Wheeler wrote that 360° music videos require huge numbers of pixels per video frame in order to match humans’ visual acuity, and that developers need to enhance the projection methods of their VR tech to make their videos that much more immersive.
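
To give a sense of the numbers Wheeler is describing, here is a back-of-the-envelope sum. The 60-pixels-per-degree figure is a commonly quoted approximation for 20/20 vision, used here as an assumption of our own rather than a figure from the YouTube post.

```python
# Estimate the resolution a full 360° video frame would need to match
# normal visual acuity, assuming roughly 60 pixels per degree of vision.
PIXELS_PER_DEGREE = 60

width = 360 * PIXELS_PER_DEGREE    # full horizontal sweep
height = 180 * PIXELS_PER_DEGREE   # pole-to-pole vertical sweep
megapixels = width * height / 1e6

print(f"{width} x {height} pixels ≈ {megapixels:.0f} megapixels per frame")
print("compared with roughly 8 megapixels for a standard 4K (3840 x 2160) frame")
```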

What’s more, internet speeds need to improve to keep up with the insane demands of VR content. A stable, low-resolution 360° livestream in VR requires users to have bandwidth of around 25Mbit/s, which can jump to between 80Mbit/s and 100Mbit/s if you want to play the same video in HD quality. Talk of 5G networks is promising, as they could provide faster connections that prevent the laggy, stop-start experiences that VR users currently contend with, but their implementation is still a long way off.
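
Those bitrates translate into heavy data usage. The short calculation below, based only on the figures quoted above, estimates how much a viewer would download over an hour of continuous streaming at each rate.

```python
# Rough data-usage arithmetic for the bitrates quoted above.
def gigabytes_per_hour(mbit_per_s: float) -> float:
    bytes_per_second = mbit_per_s * 1_000_000 / 8   # megabits -> bytes
    return bytes_per_second * 3600 / 1e9            # one hour, in gigabytes

for label, rate in [("Low-resolution 360°", 25),
                    ("HD 360° (low end)", 80),
                    ("HD 360° (high end)", 100)]:
    print(f"{label}: {rate} Mbit/s ≈ {gigabytes_per_hour(rate):.1f} GB per hour")
```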

While the technology needs to improve, prices need to come down. Cheaper solutions such as Google Cardboard have made VR more accessible to the masses, but more ambitious musical projects could require consumers to have access to high-end headsets, such as the Oculus Rift (currently retailing at a whopping £549 / $499).

Maintaining such a high cost of entry will leave consumers using a mouse to scroll tediously around 360° videos forever, instead of engaging with the experience as intended. This could be the biggest barrier to VR’s progress. If consumers aren’t buying it, then VR looks like a less exciting investment to record labels, who will be the ones bankrolling their artists’ move into the field.

These problems will need to be addressed before VR can be a major revenue raiser for the music industry. However, even if the format isn’t set to shape the scene any time soon, according to Bennett it has given visual music content a fresh lease of life that should inspire more amazing VR projects in the future.

“Some artists will want to stick to what works traditionally, for lack of a better word,” he says. “But I think the likes of more experimental VR artists, that see music not just as sound but as an extension of sound, they’re not looking at it in the same way that labels are looking at it.

“Artists are able to create music and environments that enhance it. I think more artists are going down that (VR) route.”

Scientists, software developers and artists have begun using VR to visualise genes and predict disease

A group of scientists, software developers and artists have taken to using virtual reality (VR) technology to visualise complex interactions between genes and their regulatory elements.

The team, which comprises members from Oxford University, the Università di Napoli and Goldsmiths, University of London, has been using VR to visualise simulations built from a composite of genome sequencing data, data on DNA interactions and microscopy data.

When all this data is combined, the team is presented with an interactive, 3D image that shows where different regions of the genome sit relative to others, and how they interact with each other.
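
As a simplified sketch of the general idea (not the team’s actual CSynth pipeline), the snippet below takes a toy matrix of contact frequencies between genomic regions, converts it into distances and uses multidimensional scaling to place each region in 3D, producing the kind of coordinates an interactive or VR viewer could then render. The contact values are invented purely for illustration.

```python
# Regions of the genome that contact each other frequently should sit close
# together in 3D, so convert a contact matrix into distances and let
# multidimensional scaling place each region in space.
import numpy as np
from sklearn.manifold import MDS

# Toy symmetric contact matrix for six genomic regions (higher = more contacts)
contacts = np.array([
    [0, 9, 4, 1, 1, 1],
    [9, 0, 8, 2, 1, 1],
    [4, 8, 0, 7, 2, 1],
    [1, 2, 7, 0, 8, 3],
    [1, 1, 2, 8, 0, 9],
    [1, 1, 1, 3, 9, 0],
], dtype=float)

# Frequent contact implies a short distance; add 1 to avoid dividing by zero
distances = 1.0 / (contacts + 1.0)
np.fill_diagonal(distances, 0.0)

# Embed the regions in three dimensions; these coordinates are what a 3D or
# VR viewer would render for researchers to explore interactively
mds = MDS(n_components=3, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(distances)
print(coords)   # one (x, y, z) position per genomic region
```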

“Being able to visualise such data is important because the human brain is very good at pattern recognition – we tend to think visually,” said Stephen Taylor, head of the Computational Biology Research Group at Oxford’s MRC Weatherall Institute of Molecular Medicine (WIMM).

“It began at a conference back in 2014 when we saw a demonstration by researchers from Goldsmiths who had used software called CSynth to model proteins in three dimensions. We began working with them, feeding in seemingly incomprehensible information derived from our studies of the human alpha globin gene cluster and we were amazed that what we saw on the screen was an instantly recognisable model.”

The team believe that being able to visualise the interactions between genes and their regulatory elements will allow them to understand the basis of human genetic diseases, and are currently applying their techniques to study genetic diseases such as diabetes, cancer and multiple sclerosis.

“Our ultimate aim in this area is to correct the faulty gene or its regulatory elements and be able to re-introduce the corrected cells into a patient’s bone marrow: to perfect this we have to fully understand how genes and their regulatory elements interact with one another,” said Professor Doug Higgs, a principal researcher at the WIMM.

“Having virtual reality tools like this will enable researchers to efficiently combine their data to gain a much broader understanding of how the organisation of the genome affects gene expression, and how mutations and variants affect such interactions.”

There are around 37 trillion cells in the average adult human body, and each cell contains two metres of DNA tightly packed into its nucleus.
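
Multiplying those two figures together gives a sense of the scale involved; the quick sum below is our own arithmetic based purely on the numbers above.

```python
# Rough arithmetic from the figures above: total length of DNA in one body
cells = 37e12             # ~37 trillion cells
dna_per_cell_m = 2.0      # ~2 metres of DNA per cell

total_metres = cells * dna_per_cell_m
total_billion_km = total_metres / 1_000 / 1e9

print(f"{total_metres:.1e} metres, or about {total_billion_km:.0f} billion kilometres of DNA")
```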

While the technology to sequence genomes is well established, it has been shown that the manner in which DNA is folded within each cell affects how genes are expressed.

“There are more than three billion base pairs in the human genome, and a change in just one of these can cause a problem. As a model we’ve been looking at the human alpha globin gene cluster to understand how variants in genes and their regulatory elements may cause human genetic disease,” said Prof Jim Hughes, associate professor of Genome Biology at Oxford University.

Using CRISPR, UK scientists edit DNA of human embryos

For the first time in the UK, scientists have altered human embryos. Using the gene-editing tool CRISPR, the scientists switched off a gene that produces the protein OCT4, which is thought to be important in early embryo development. As a result, cells that normally go on to form the placenta, yolk sac and foetus failed to develop.

Source: BBC

Tesla and AMD developing AI chip for self-driving cars

Tesla has partnered with AMD to develop a dedicated chip that will handle autonomous driving tasks in its cars. Tesla's Autopilot programme is currently headed by former AMD chip architect Jim Keller, and it is said that more than 50 people are working on the initiative under his leadership.

Source: CNBC

Synthetic muscle developed that can lift 1,000 times its own weight

Scientists have used a 3D printing technique to create an artificial muscle that can lift 1,000 times its own weight. "It can push, pull, bend, twist, and lift weight. It's the closest artificial material equivalent we have to a natural muscle," said Dr Aslan Miriyev, from the Creative Machines lab.

Source: Telegraph

Head of AI at Google criticises "AI apocalypse" scaremongering

John Giannandrea, the senior vice president of engineering at Google, has condemned the AI scaremongering promoted by people like Elon Musk. "I just object to the hype and the sort of sound bites that some people have been making," said Giannandrea. "I am definitely not worried about the AI apocalypse."

Source: CNBC

Scientists engineer antibody that attacks 99% of HIV strains

Scientists have engineered an antibody that attacks 99% of HIV strains. It is built to target three critical parts of the virus, making it harder for HIV to resist its effects. The International Aids Society said it was an "exciting breakthrough". Human trials will begin in 2018.

Source: BBC

Facebook has a plan to stop fake news from influencing elections

Mark Zuckerberg has outlined nine steps that Facebook will take to "protect election integrity". "I care deeply about the democratic process and protecting its integrity," he said during a live broadcast on his Facebook page. "I don't want anyone to use our tools to undermine our democracy."