Eight-year study casts serious doubt on future food security

An extensive study designed to simulate the growing conditions of the future has cast significant doubt on widely held assumptions about the impact of climate change on food production, suggesting that we could face major crop failures far sooner than previously thought.

The study, which is published today in the journal Nature Plants, saw researchers from the University of Illinois grow soybeans outdoors for eight years in a carbon dioxide-rich atmosphere, designed to mimic the higher atmospheric CO₂ concentrations that we are projected to experience by 2050.

It had been thought that the increased levels of CO₂ would offset future water shortages by prompting the plants to reduce the size of the pores in their leaves, and so reduce gaseous exchange with the atmosphere. This would reduce the amount of water the plants needed from the soil, resulting in crops that were only minimally affected by climate change.

“If you read the most recent Intergovernmental Panel on Climate Change reports and if you read the scientific literature on the subject for the last 30 years, the concluding statement is nearly always that elevated carbon dioxide will ameliorate drought stress in crops,” explained lead author Andrew Leakey, an associate professor of plant biology at the University of Illinois.

However, the study found a flaw in that premise: it only holds true in wetter growing seasons.

“[The theory] was consistent with what we saw with our own experiments the first four years, the relatively wet years,” added Leakey. “But when the growing seasons were hot and dry, that pattern broke down.”

The Soybean Free Air Concentration Enrichment facility, which allowed researchers to simulate the CO₂-rich environment of 2050. Image courtesy of Don Hamerman

The researchers created the CO₂-rich environment in real farm fields using a technology known as the Soybean Free Air Concentration Enrichment facility. This featured sensors that measure wind speed and direction, prompting the regulated release of gases to simulate higher concentrations of CO₂.
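
In broad terms, such a facility works as a feedback loop: wind readings determine which side of the plot to release gas from, so it drifts across the crop, and the measured concentration determines how much to release. The snippet below is a purely illustrative Python sketch of that control step; the segment layout, target concentration and gain are assumptions made for the example, not details of the facility's actual control software.

```python
import math

# Illustrative only: an idealized control step for a FACE-style ring of CO2
# emitters. Segment positions, target concentration and gain are assumptions
# for this sketch, not values from the real facility.

TARGET_PPM = 585          # assumed 2050-level target concentration
N_SEGMENTS = 16           # emitter segments spaced evenly around the plot

def upwind_segments(wind_dir_deg, width_deg=90):
    """Return indices of segments facing into the wind, so released CO2
    drifts across the crop rather than away from it."""
    chosen = []
    for i in range(N_SEGMENTS):
        seg_angle = i * 360 / N_SEGMENTS
        # smallest angular difference between segment and wind direction
        diff = abs((seg_angle - wind_dir_deg + 180) % 360 - 180)
        if diff <= width_deg / 2:
            chosen.append(i)
    return chosen

def release_rate(measured_ppm, wind_speed_ms, gain=0.5):
    """Scale release with the concentration shortfall and the wind speed
    (stronger wind dilutes the gas faster, so more must be released)."""
    shortfall = max(0.0, TARGET_PPM - measured_ppm)
    return gain * shortfall * max(wind_speed_ms, 0.1)

# One example control step with made-up sensor readings
segments = upwind_segments(wind_dir_deg=225)      # wind from the south-west
rate = release_rate(measured_ppm=480, wind_speed_ms=3.2)
print(f"Open segments {segments} at rate {rate:.1f} (arbitrary units)")
```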

This allowed the researchers to determine that plants grown in a hot, dry, CO₂-rich environment needed more water than plants grown under the same conditions but with current atmospheric CO₂ levels, the opposite of what previous research had suggested.

“All of the model predictions up to this point were assuming that in 2050, elevated CO₂ was going to give us a 15% increase in yield over what we had at the beginning of this century,” Leakey said. “And what we’re seeing is that as it gets hotter and drier, that number diminishes to zero. No gain.

“What we think is happening is that early in the growing season, when the plant has enough water, it’s able to photosynthesize more as a result of the higher CO₂ levels. It’s got more sugars to play with, it grows more, it creates all this extra leaf area. But when it gets dry, the plant has overextended itself, so later in the season it’s now using more water.”

The research has significant implications for the management of future food security, as soybeans are the fourth-largest food crop in the world by area harvested.

In addition to providing a valuable source of protein for non-meat eaters, they are used in a wide array of foods, oils and sauces, particularly in East Asia, where the crop has formed a significant part of the diet since at least 7,000 BC.

Soybeans are also used extensively for livestock feed, making their importance for food security even greater.

Researchers discover remains of “Triassic Jaws” that dominated the seas after Earth’s most severe mass extinction event

Researchers have discovered the fossil remains of a previously unknown large predatory fish called Birgeria: an approximately 1.8-meter-long primitive bony fish with long jaws and sharp teeth that swallowed its prey whole.

Swiss and US researchers, led by the Paleontological Institute and Museum of the University of Zurich, say Birgeria dominated the sea that once covered present-day Nevada just one million years after the mass extinction.

Its period of dominance began following “the most catastrophic mass extinction on Earth”, which took place about 252 million years ago – at the boundary between the Permian and Triassic geological periods.

Image courtesy of UZH. Featured image courtesy of Nadine Bösch

Up to 90% of the marine species of that time were annihilated, and before the discovery of the Birgeria, palaeontologists had assumed that the first predators at the top of the food chain did not appear until the Middle Triassic epoch about 247 to 235 million years ago.

“The surprising find from Elko County in northeastern Nevada is one of the most completely preserved vertebrate remains from this time period ever discovered in the United States,” emphasises Carlo Romano, lead author of the study.

Species of Birgeria existed worldwide, however. The most recent discovery belongs to a previously unknown species, named Birgeria americana, and is the earliest example of a large-sized Birgeria species, being about one and a half times longer than its geologically older relatives.

The researchers say the discovery of Birgeria is proof that food chains recovered more quickly than previously thought from Earth’s most devastating mass extinction event.

According to earlier studies, marine food chains were shortened after the mass extinction event and recovered only slowly and stepwise.

However, finds such as the newly discovered Birgeria species and the fossils of other vertebrates now show that so-called apex predators (animals at the very top of the food chain) were already present soon after the mass extinction.

“The vertebrates from Nevada show that previous interpretations of past biotic crises and associated global changes were too simplistic,” said Romano.

Revolutionary DNA sunscreen gives better protection the longer it’s worn

Researchers have developed a ground-breaking sunscreen made of DNA that offers significant improvements over conventional versions.

Unlike current sunscreens, which need to be reapplied regularly to remain effective, the DNA sunscreen improves over time, offering greater protection the longer it is exposed to the sun.

It also keeps the skin hydrated, meaning it could be beneficial as a treatment for wounds in extreme or adverse environments.

Developed by researchers from Binghamton University, State University of New York, the innovative sunscreen could prove essential as temperatures climb and more people are put at risk of conditions caused by excessive UV exposure, such as skin cancer.

“Ultraviolet (UV) light can actually damage DNA, and that’s not good for the skin,” said Guy German, assistant professor of biomedical engineering at Binghamton University.

“We thought, let’s flip it. What happens instead if we actually used DNA as a sacrificial layer? So instead of damaging DNA within the skin, we damage a layer on top of the skin.”

The DNA sunscreen has the potential to become a standard, significantly improving the safety of spending time in the sun

The research, which is published today in the journal Scientific Reports, involved the development of thin crystalline DNA films.

These films are transparent in appearance but able to absorb UV light; when the researchers exposed the film to UV light, they found that its absorption rate improved, meaning the more UV it was exposed to, the more it absorbed.

“If you translate that, it means to me that if you use this as a topical cream or sunscreen, the longer that you stay out on the beach, the better it gets at being a sunscreen,” said German.

The film will no doubt attract the attention of sunscreen manufacturers, who will likely be keen to commercialise such a promising product. However, the researchers have not said whether there is any such interest as yet, or whether there is a clear timeline for it to become a commercial product.

 

The film’s properties are not just limited to sun protection, however. The DNA film can also store far more water than uncoated skin, limiting water evaporation and increasing the skin’s hydration.

As a result, the film is also being explored as a wound covering, as it would allow the wound to be protected from the sun, keep it moist – an important factor for improved healing – and allow the wound to be monitored without needing to remove the dressing.

“Not only do we think this might have applications for sunscreen and moisturizers directly, but if it’s optically transparent and prevents tissue damage from the sun and it’s good at keeping the skin hydrated, we think this might be potentially exploitable as a wound covering for extreme environments,” said German.