Pressure-sensitive artificial skin signals brain cells when touched

Artificial skin that sends pressure sensations directly to brain cells has been developed for the first time, bringing the eventual goal of flexible, healing and feeling artificial skin a step closer.

Developed by Zhenan Bao, professor of chemical engineering at Stanford, the skin is able to detect the level of pressure applied to it, be it a light touch or a hard press.

Bao, who has been working on the development of artificial skin for a decade, led a team of 17 researchers to create the technology, which has been detailed in an article published today in the journal Science.

“This is the first time a flexible, skin-like material has been able to detect pressure and also transmit a signal to a component of the nervous system,” she said.

Bao aims for the skin, which is designed to fit over a prosthetic limb, to eventually be able to heal, signal pain and detect touch and temperature.

The artificial skin is made up of two layers of plastic, with the top layer providing sensing capabilities and the bottom sending the data to nerve cells as electrical signals.

The top layer’s sensing abilities are achieved by giving the plastic a waffle pattern, which makes the plastic very sensitive to pressure, and then dispersing billions of carbon nanotubes throughout it.

These nanotubes conduct electricity when squeezed together, so when pressure on the skin increases, the nanotubes are pushed closer together, and more electricity is conducted.
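As a rough illustration of that relationship, the short Python sketch below models conductance rising and then saturating as pressure pushes the nanotubes together. The saturating shape and the g_max and k parameters are assumptions made purely for illustration, not figures from the Stanford paper.

```python
import math

# Illustrative model only: the article does not give the sensor's actual
# response curve, so this uses a simple saturating function in which
# conductance rises as pressure pushes the nanotubes closer together.

def conductance_from_pressure(pressure_kpa: float,
                              g_max: float = 1.0e-3,   # assumed max conductance (S)
                              k: float = 0.05) -> float:
    """Map applied pressure (kPa) to sensor conductance (S)."""
    if pressure_kpa < 0:
        raise ValueError("pressure cannot be negative")
    # Higher pressure -> nanotubes closer together -> more conduction.
    return g_max * (1.0 - math.exp(-k * pressure_kpa))

if __name__ == "__main__":
    for p in (0.0, 5.0, 50.0, 200.0):   # light touch ... hard press
        print(f"{p:6.1f} kPa -> {conductance_from_pressure(p):.6f} S")
```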

The second layer, which takes the form of a flexible electronic circuit, then transmits this electricity to nerve cells in pulses, allowing the level of pressure to be determined.

This is designed to mimic the way real human skin works, as our own awareness of pressure is the result of our brain interpreting short pulses of electricity in a similar manner.
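A minimal sketch of that pulse-based encoding, assuming a simple pressure-to-rate mapping: firmer pressure produces a higher pulse rate, much as mechanoreceptors fire faster under stronger touch. The rate range and the shape of the mapping are illustrative assumptions, not values from the published work.

```python
import math

# Illustrative sketch: the circuit layer converts sensed pressure into short
# pulses, with stronger pressure producing a higher pulse rate. The rates and
# the mapping below are assumed for illustration.

def pulse_times(pressure_kpa: float,
                duration_s: float = 1.0,
                min_rate_hz: float = 2.0,
                max_rate_hz: float = 200.0) -> list[float]:
    """Return the times (s) of pulses emitted over `duration_s`."""
    # Saturating pressure-to-rate map (assumed shape).
    rate = min_rate_hz + (max_rate_hz - min_rate_hz) * (1.0 - math.exp(-0.05 * pressure_kpa))
    period = 1.0 / rate
    return [i * period for i in range(int(duration_s * rate))]

if __name__ == "__main__":
    for p in (1.0, 20.0, 150.0):
        print(f"{p:6.1f} kPa -> {len(pulse_times(p))} pulses in 1 s")
```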

The researchers have not yet directly tested the skin by hooking it up to a human brain, however. Instead, they took inspiration from a field known as optogenetics – where optics and genetics meet – to generate an artificial version of part of the human nervous system, which they stimulated by converting the electrical signals into pulses of light.

While this was an effective proof of concept, in the long run the researchers plan to use a different approach to directly stimulate human nerves with the electrical pulses. They are confident this can be achieved as other researchers have already found ways to stimulate neurons directly with such pulses.

Images courtesy of Bao Research Group, Stanford University

There is still considerable work ahead before Bao’s dream of fully sensory artificial skin can be realised, but this work is an exceptionally important step along the way.

“We have a lot of work to take this from experimental to practical applications,” she said. “But after spending many years in this work, I now see a clear path where we can take our artificial skin.”

Because the current system consists of just two layers, the researchers believe it will be straightforward to add further sensors as they are developed.

Among those the researchers want to create are sensors to determine different textures, allowing the wearer to differentiate between fabrics, for example, and sensors to determine the temperature of an object.

Soviet report detailing lunar rover Lunokhod-2 released for first time

Russian space agency Roskosmos has released a scientific report into the lunar rover Lunokhod-2 for the first time, revealing previously unknown details about the rover and how it was controlled back on Earth.

The report, written entirely in Russian, was originally penned in 1973 following the Lunokhod-2 mission, which began in January of that year. It had remained accessible to only a handful of experts at the space agency prior to its release today, which marks the 45th anniversary of the mission.

Bearing the names of some 55 engineers and scientists, the report details the systems that were used both to remotely control the lunar rover from a base on Earth and to capture images and data about the Moon’s surface and Lunokhod-2’s place on it. This information, and in particular the carefully documented issues and solutions the report contains, went on to be used in many later unmanned missions to other parts of the solar system.

As a result, it provides a unique insight into this era of space exploration and the technical challenges that scientists faced, such as the low-frame-rate television system that served as the ‘eyes’ of the rover’s Earth-based operators.

A NASA depiction of the Lunokhod mission. Above: an image of the rover, courtesy of NASA, overlaid onto a panorama of the Moon taken by Lunokhod-2, courtesy of Ruslan Kasmin.

One detail that may be of particular interest to space enthusiasts and experts is the operation of a unique system called Seismas, which was tested for the first time in the world during the mission.

Designed to determine the precise location of the rover at any given time, the system involved firing lasers from ground-based telescopes at a photodetector on board the lunar rover. When the laser was detected, the rover emitted a radio signal back to Earth, which allowed its coordinates to be determined.
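As a rough sketch of the timing idea behind such a laser-out, radio-back scheme, the snippet below estimates range from the measured round-trip delay. The onboard_delay_s parameter and the example delay are hypothetical, and the actual Seismas processing described in the report is certainly more involved.

```python
# Illustrative sketch only: a laser pulse travels to the rover, the onboard
# photodetector triggers a radio reply, and the measured round-trip delay
# bounds the rover's range from the ground station.

C = 299_792_458.0  # speed of light, m/s

def range_from_round_trip(round_trip_s: float, onboard_delay_s: float = 0.0) -> float:
    """Estimate one-way distance (m) from a laser-out / radio-back round trip.

    `onboard_delay_s` is the (assumed known) time the rover takes to fire
    its radio reply after detecting the laser.
    """
    return C * (round_trip_s - onboard_delay_s) / 2.0

if __name__ == "__main__":
    # ~2.56 s is roughly the light round-trip time over the Earth-Moon distance.
    print(f"{range_from_round_trip(2.563, onboard_delay_s=0.001) / 1000:.0f} km")
```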

Other details, while technical, also give some insight into the culture of the mission, such as the careful work to eliminate issues in the long-range radio communication system. One issue, for example, was investigated so thoroughly that one of the devices ended up using more resources than it had been allocated – a problem that is itself documented in the report.

The document also provides insight into the technological capabilities of the time back on Earth. While it is mostly typed, certain mathematical symbols have had to be written in by hand, and the report also features a number of painstakingly hand-drawn diagrams and graphs.

A hand-drawn graph from the report, showing temperature changes during one of the monitoring sessions during the mission

Lunokhod-2 was the second of two unmanned lunar rovers to be landed on the Moon by the Soviet Union within the Lunokhod programme, having been delivered via a soft landing by the unmanned Luna 21 spacecraft in January 1973.

In operation between January and June of that year, the robot covered a distance of 39km, meaning it still holds the lunar distance record to this day.

One of only four rovers to be deployed on the lunar surface, Lunokhod-2 was the last rover to visit the Moon until December 2013, when the Chinese lunar rover Yutu arrived.

Robot takes first steps towards building artificial lifeforms

A robot equipped with sophisticated AI has successfully simulated the creation of artificial lifeforms, in a key first step towards the eventual goal of creating true artificial life.

The robot, which was developed by scientists at the University of Glasgow, was able to model the creation of artificial lifeforms using unstable oil-in-water droplets. These droplets effectively played the role of living cells, demonstrating the potential of future research to develop living cells based on building blocks that cannot be found in nature.

Significantly, the robot also successfully predicted their properties before they were created, even though this could not be achieved using conventional physical models.

The robot, which was designed by Glasgow University’s Regius Chair of Chemistry, Professor Lee Cronin, is driven by machine learning and the principles of evolution.

It has been developed to autonomously create oil-in-water droplets with a host of different chemical makeups and then use image recognition to assess their behaviour.

Using this information, the robot was able to engineer droplets to have different properties. Those which were found to be desirable could then be recreated at any time, using a specific digital code.
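A toy sketch of that generate-evaluate-select loop is shown below in Python: droplet ‘recipes’ are proposed, scored and mutated over successive generations. The recipe representation and the fitness function here are stand-ins invented for illustration – in the real platform the score comes from image recognition of actual droplets, not from a formula.

```python
import random

# Illustrative evolutionary loop: propose droplet "recipes" (mixing ratios),
# score their behaviour, keep the best and mutate them. The fitness function
# is a stand-in for the image-recognition-based assessment described above.

def random_recipe(n_components: int = 4) -> list[float]:
    """A recipe is a normalised list of mixing ratios for n oil components."""
    raw = [random.random() for _ in range(n_components)]
    total = sum(raw)
    return [x / total for x in raw]

def mutate(recipe: list[float], sigma: float = 0.05) -> list[float]:
    """Perturb a recipe slightly and re-normalise it."""
    raw = [max(1e-6, x + random.gauss(0.0, sigma)) for x in recipe]
    total = sum(raw)
    return [x / total for x in raw]

def fitness(recipe: list[float]) -> float:
    """Stand-in score for 'desirable behaviour' (e.g. droplet movement)."""
    target = [0.4, 0.3, 0.2, 0.1]                      # hypothetical optimum
    return -sum((a - b) ** 2 for a, b in zip(recipe, target))

def evolve(generations: int = 30, pop_size: int = 20) -> list[float]:
    population = [random_recipe() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 4]        # keep the best quarter
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(pop_size - len(survivors))]
    return max(population, key=fitness)

if __name__ == "__main__":
    best = evolve()
    print("best recipe (ratios):", [round(x, 3) for x in best])
```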

“This work is exciting as it shows that we are able to use machine learning and a novel robotic platform to understand the system in ways that cannot be done using conventional laboratory methods, including the discovery of ‘swarm’ like group behaviour of the droplets, akin to flocking birds,” said Cronin.

“Achieving lifelike behaviours such as this are important in our mission to make new lifeforms, and these droplets may be considered ‘protocells’ – simplified models of living cells.”

One of the oil droplets created by the robot

The research, which is published today in the journal PNAS, is one of several research projects being undertaken by Cronin and his team within the field of artificial lifeforms.

While the overarching goal is to move towards the creation of lifeforms using new and unprecedented building blocks, the research may also have more immediate potential applications.

The team believes that their work could also have applications in several practical areas, including the development of new methods for drug delivery or even innovative materials with functional properties.