Heat stress: the climate is putting European forests under sustained pressure

No year since weather records began was as hot and dry as 2018. A first comprehensive analysis of the consequences of this drought and heat event shows that central European forests sustained long-term damage. Even tree species considered drought-resistant, such as beech, pine and silver fir, suffered. The international study was directed by the University of Basel, which is conducting a forest experiment unique in Europe.

In a forest near Basel researchers study the effects of climate change on the most important and sensitive part of the trees – the canopy. A total of 450 trees between 50 and 120 years old grow on the 1.6 hectare research area. Credits: University of Basel

Until now, 2003 was the driest and hottest year since regular weather records began. That record has now been broken. A comparison of climate data from Germany, Austria and Switzerland shows that 2018 was significantly warmer. The average temperature during the vegetation period was 1.2°C above the 2003 value and as much as 3.3°C above the average of the years 1961 to 1990.

Part of the analysis, which has now been published, includes measurements taken at the Swiss Canopy Crane II research site in Basel, where extensive physiological investigations were carried out in tree canopies. The goal of these investigations is to better understand how and when trees are affected by a lack of water in order to counter the consequences of climate change through targeted management measures.

When trees die of thirst

Trees lose a lot of water through their surfaces. If the soil also dries out, the tree cannot replace this water, which shows up as increasingly negative suction tension in the wood’s vascular tissue. Trees can reduce their water consumption, but once the soil water reservoir is used up, it is ultimately only a matter of time until cell dehydration causes the death of the tree.

Physiological measurements at the Basel research site have shown the researchers that the negative suction tension and water shortage in trees occurred earlier than usual. In particular, this shortage was more severe throughout all of Germany, Austria and Switzerland than ever measured before. Over the course of the summer, severe drought-related stress symptoms therefore appeared in many tree species important to forestry. Leaves wilted, aged and were shed prematurely.

Death of a beech tree in a forest near Basel: during the 2018 heatwave the leaves died prematurely, the following year the tree stopped forming new shoots. Credits: Urs Weber, University of Basel

Spruce, pine and beech most heavily affected

The true extent of the summer heatwave became evident in 2019: many trees no longer formed new shoots; they were partially or wholly dead. Others had survived the stress of the drought and heat of the previous year, but were increasingly vulnerable to bark beetle infestation or fungi. Trees whose canopies had partially died back, reducing their ability to recover from the damage, were particularly affected.

“Spruce was most heavily affected. But it was a surprise for us that beech, silver fir and pine were also damaged to this extent,” says lead researcher Professor Ansgar Kahmen. Beech in particular had until then been classified as the “tree of the future”, although its supposed drought resistance has been subject to contentious discussion since the 2003 heatwave.

Future scenarios to combat heat and drought

According to the latest projections, precipitation in Europe will decline by up to a fifth by 2085, and drought and heat events will become more frequent. Redesigning forests is therefore essential. “Mixed woodland is often advocated,” explains plant ecologist Kahmen, “and it certainly has many ecological and economic advantages. But whether mixed woodland is also more drought-resistant has not yet been clearly proven. We still need to study which tree species are good in which combinations, including from a forestry perspective. That will take a long time.”

Another finding of the study is that conventional methods can capture the impacts of extreme climate events on European forests only to a limited extent, and new analytical approaches are therefore needed. “The damage is obvious. More difficult is precisely quantifying it and drawing the right conclusions for the future,” says Kahmen. Earth observation data from satellites could help track tree mortality at small spatial scales. Spatial patterns that contain important ecological and forestry-related information can be derived from such data: which tree species were heavily impacted, when and at which locations, and which survived without damage? “A system like this already exists in some regions of the US, but central Europe still lacks one.”

Original source

Schuldt, Bernhard; Buras, Allan; Arend, Matthias; Vitasse, Yann; Beierkuhnlein, Carl; Damm, Alexander; Gharun, Mana; Grams, Thorsten; Hauck, Markus; Hajek, Peter; Hartmann, Henrik; Hilbrunner, Erika; Hoch, Günter; Holloway-Phillips, Meisha; Körner, Christian; Larysch, Elena; Luebbe, Torben; Nelson, Daniel; Rammig, Anja; Kahmen, Ansgar.

A first assessment of the impact of the extreme 2018 summer drought on Central European forests.
Basic and Applied Ecology (April 2020); doi: 10.1016/j.baae.2020.04.003

 

Press release on heat stress in European forests from the University of Basel.

What started out as a hunt for ice lurking in polar lunar craters turned into an unexpected finding that could help clear some muddy history about the Moon’s formation.

Team members of the Miniature Radio Frequency (Mini-RF) instrument on NASA’s Lunar Reconnaissance Orbiter (LRO) spacecraft found new evidence that the Moon’s subsurface might be richer in metals, like iron and titanium, than researchers thought. That finding, published July 1 in Earth and Planetary Science Letters, could aid in drawing a clearer connection between Earth and the Moon.

“The LRO mission and its radar instrument continue to surprise us with new insights about the origins and complexity of our nearest neighbor,” said Wes Patterson, Mini-RF principal investigator from the Johns Hopkins Applied Physics Laboratory (APL) in Laurel, Maryland, and a study coauthor.

This image based on data from NASA’s Lunar Reconnaissance Orbiter spacecraft shows the face of the Moon we see from Earth. The more we learn about our nearest neighbor, the more we begin to understand the Moon as a dynamic place with useful resources that could one day even support human presence. Credits: NASA / GSFC / Arizona State University

Substantial evidence points to the Moon as the product of a collision between a Mars-sized protoplanet and young Earth, forming from the gravitational collapse of the remaining cloud of debris. Consequently, the Moon’s bulk chemical composition closely resembles that of Earth.

Look in detail at the Moon’s chemical composition, however, and that story turns murky. For example, in the bright plains of the Moon’s surface, called the lunar highlands, rocks contain smaller amounts of metal-bearing minerals relative to Earth. That finding might be explained if Earth had fully differentiated into a core, mantle and crust before the impact, leaving the Moon largely metal-poor. But turn to the Moon’s maria — the large, darker plains — and the metal abundance becomes richer than that of many rocks on Earth.

This discrepancy has puzzled scientists, leading to numerous questions and hypotheses regarding how much the impacting protoplanet may have contributed to the differences. The Mini-RF team found a curious pattern that could lead to an answer.

Using Mini-RF, the researchers sought to measure an electrical property within lunar soil piled on crater floors in the Moon’s northern hemisphere. This electrical property is known as the dielectric constant, a number that compares the relative abilities of a material and the vacuum of space to transmit electric fields, and could help locate ice lurking in the crater shadows. The team, however, noticed this property increasing with crater size.

For craters approximately 1 to 3 miles (2 to 5 kilometers) wide, the dielectric constant of the material steadily increased as the craters grew larger, but for craters 3 to 12 miles (5 to 20 kilometers) wide, the property remained constant.

“It was a surprising relationship that we had no reason to believe would exist,” said Essam Heggy, coinvestigator of the Mini-RF experiments from the University of Southern California in Los Angeles and lead author of the published paper.

Discovery of this pattern opened a door to a new possibility. Because meteors that form larger craters also dig deeper into the Moon’s subsurface, the team reasoned that the increasing dielectric constant of the dust in larger craters could be the result of meteors excavating iron and titanium oxides that lie below the surface. Dielectric properties are directly linked to the concentration of these metal minerals.

If their hypothesis were true, it would mean only the first few hundred meters of the Moon’s surface is scant in iron and titanium oxides, but below the surface, there’s a steady increase to a rich and unexpected bonanza.
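This reasoning can be illustrated with a toy calculation (all numbers below are invented for illustration; this is not the team's actual analysis): if metal oxides are scarce in the upper few hundred meters but saturate at depth, and larger impacts excavate proportionally deeper material, then the dielectric constant of the excavated material first rises with crater size and then levels off, matching the observed pattern.

```python
# Toy illustration with made-up numbers: larger impacts dig deeper, and if
# metal-oxide content rises with depth and then saturates, the dielectric
# constant of the excavated material rises with crater size, then plateaus.

def excavation_depth_km(diameter_km):
    # Assumption: excavation depth is roughly a tenth of the crater diameter.
    return 0.1 * diameter_km

def oxide_fraction(depth_km):
    # Assumed profile: metal-poor near the surface, saturating below ~0.5 km.
    return min(0.05 + 0.6 * depth_km, 0.35)

def dielectric_constant(oxide):
    # Assumption: dielectric constant grows linearly with oxide content.
    return 3.0 + 10.0 * oxide

for d_km in (2, 3, 5, 10, 20):
    eps = dielectric_constant(oxide_fraction(excavation_depth_km(d_km)))
    print(f"{d_km:>2} km crater -> dielectric constant {eps:.2f}")
```

With these assumed numbers, the dielectric constant climbs for craters of roughly 2 to 5 km and stays flat from 5 to 20 km, the same qualitative shape the Mini-RF team reported.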

Comparing crater floor radar images from Mini-RF with metal oxide maps from the LRO Wide-Angle Camera, Japan’s Kaguya mission and NASA’s Lunar Prospector spacecraft, the team found exactly what it had suspected. The larger craters, with their increased dielectric material, were also richer in metals, suggesting that more iron and titanium oxides had been excavated from the depths of 0.3 to 1 mile (0.5 to 2 kilometers) than from the upper 0.1 to 0.3 miles (0.2 to 0.5 kilometers) of the lunar subsurface.

“This exciting result from Mini-RF shows that even after 11 years in operation at the Moon, we are still making new discoveries about the ancient history of our nearest neighbor,” said Noah Petro, the LRO project scientist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. “The MINI-RF data is incredibly valuable for telling us about the properties of the lunar surface, but we use that data to infer what was happening over 4.5 billion years ago!”

These results follow recent evidence from NASA’s Gravity Recovery and Interior Laboratory (GRAIL) mission that suggests a significant mass of dense material exists just a few tens to hundreds of kilometers beneath the Moon’s enormous South Pole-Aitken basin, indicating that dense materials aren’t uniformly distributed in the Moon’s subsurface.

The team emphasizes that the new study can’t directly answer the outstanding questions about the Moon’s formation, but it does reduce the uncertainty in the distribution of iron and titanium oxides in the lunar subsurface and provide critical evidence needed to better understand the Moon’s formation and its connection to Earth.

“It really raises the question of what this means for our previous formation hypotheses,” Heggy said.

Anxious to uncover more, the researchers have already started examining crater floors in the Moon’s southern hemisphere to see if the same trends exist there.

LRO is managed by NASA’s Goddard Space Flight Center in Greenbelt, Maryland for the Science Mission Directorate at NASA Headquarters in Washington. Mini-RF was designed, built and tested by a team led by APL, Naval Air Warfare Center, Sandia National Laboratories, Raytheon and Northrop Grumman.

For more information on LRO, visit:

https://www.nasa.gov/lro

 

Press release from NASA’s Goddard Space Flight Center, by Jeremy Rehm

Why are plants green?

UC Riverside-led research team’s model to explain photosynthesis lays out the next challenging phase of research on how green plants transform light energy into chemical energy

UC Riverside-led research team’s model to explain photosynthesis lays out the next challenging phase of research on how green plants transform light energy into chemical energy. Credits: Gabor lab, UC Riverside

When sunlight shining on a leaf changes rapidly, plants must protect themselves from the ensuing sudden surges of solar energy. To cope with these changes, photosynthetic organisms — from plants to bacteria — have developed numerous tactics. Scientists have been unable, however, to identify the underlying design principle.

An international team of scientists, led by physicist Nathaniel M. Gabor at the University of California, Riverside, has now constructed a model that reproduces a general feature of photosynthetic light harvesting, observed across many photosynthetic organisms.

Nathaniel Gabor is an associate professor of physics at UC Riverside. Credits: CIFAR

Light harvesting is the collection of solar energy by protein-bound chlorophyll molecules. In photosynthesis — the process by which green plants and some other organisms use sunlight to synthesize foods from carbon dioxide and water — light energy harvesting begins with sunlight absorption.

The researchers’ model borrows ideas from the science of complex networks, a field of study that explores efficient operation in cellphone networks, brains, and the power grid. The model describes a simple network that is able to input light of two different colors, yet output a steady rate of solar power. This unusual choice of only two inputs has remarkable consequences.

“Our model shows that by absorbing only very specific colors of light, photosynthetic organisms may automatically protect themselves against sudden changes — or ‘noise’ — in solar energy, resulting in remarkably efficient power conversion,” said Gabor, an associate professor of physics and astronomy, who led the study appearing today in the journal Science. “Green plants appear green and purple bacteria appear purple because only specific regions of the spectrum from which they absorb are suited for protection against rapidly changing solar energy.”
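A toy numerical sketch can illustrate the intuition (all numbers invented; this is a stand-in for the idea, not the published model): place two absorption channels on opposite slopes of a fluctuating input spectrum, so that a spectral shift that raises one channel's intake lowers the other's, and the summed output stays comparatively steady.

```python
import math
import random

def spectrum(wavelength_nm, shift_nm):
    # Toy "solar" spectrum: a bell curve whose peak position jitters (noise).
    return math.exp(-((wavelength_nm - (550 + shift_nm)) / 80.0) ** 2)

random.seed(1)
shifts = [random.gauss(0, 15) for _ in range(2000)]  # spectral fluctuations

# One absorber sitting on a slope of the peak vs. a pair of absorbers on
# opposite slopes, whose fluctuations are anti-correlated and largely cancel.
single = [spectrum(480, s) for s in shifts]
paired = [0.5 * (spectrum(480, s) + spectrum(620, s)) for s in shifts]

def variance(xs):
    mean = sum(xs) / len(xs)
    return sum((x - mean) ** 2 for x in xs) / len(xs)

print(f"output variance, one channel  : {variance(single):.5f}")
print(f"output variance, two channels : {variance(paired):.5f}")
```

In this sketch the paired channels cancel the first-order effect of the spectral jitter, so the output variance drops by roughly two orders of magnitude compared to a single channel on one slope.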

Gabor first began thinking about photosynthesis research more than a decade ago, when he was a doctoral student at Cornell University. He wondered why plants rejected green light, the most intense solar light.  Over the years, he worked with physicists and biologists worldwide to learn more about statistical methods and the quantum biology of photosynthesis.

Richard Cogdell, a renowned botanist at the University of Glasgow in the United Kingdom and a coauthor on the research paper, encouraged Gabor to extend the model to include a wider range of photosynthetic organisms that grow in environments where the incident solar spectrum is very different.

“Excitingly, we were then able to show that the model worked in other photosynthetic organisms besides green plants, and that the model identified a general and fundamental property of photosynthetic light harvesting,” he said. “Our study shows how, by choosing where you absorb solar energy in relation to the incident solar spectrum, you can minimize the noise on the output — information that can be used to enhance the performance of solar cells.”

Coauthor Rienk van Grondelle, an influential experimental physicist at Vrije Universiteit Amsterdam in the Netherlands who works on the primary physical processes of photosynthesis, said the team found the absorption spectra of certain photosynthetic systems select certain spectral excitation regions that cancel the noise and maximize the energy stored.

“This very simple design principle could also be applied in the design of human-made solar cells,” said van Grondelle, who has vast experience with photosynthetic light harvesting.

Gabor explained that plants and other photosynthetic organisms have a wide variety of tactics to prevent damage due to overexposure to the sun, ranging from molecular mechanisms of energy release to physical movement of the leaf to track the sun. Plants have even developed effective protection against UV light, just as in sunscreen.

“In the complex process of photosynthesis, it is clear that protecting the organism from overexposure is the driving factor in successful energy production, and this is the inspiration we used to develop our model,” he said. “Our model incorporates relatively simple physics, yet it is consistent with a vast set of observations in biology. This is remarkably rare. If our model holds up to continued experiments, we may find even more agreement between theory and observations, giving rich insight into the inner workings of nature.”

To construct the model, Gabor and his colleagues applied straightforward physics of networks to the complex details of biology, and were able to make clear, quantitative, and generic statements about highly diverse photosynthetic organisms.

“Our model is the first hypothesis-driven explanation for why plants are green, and we give a roadmap to test the model through more detailed experiments,” Gabor said.

Photosynthesis may be thought of as a kitchen sink, Gabor added, where a faucet flows water in and a drain allows the water to flow out. If the flow into the sink is much bigger than the outward flow, the sink overflows and the water spills all over the floor.

“In photosynthesis, if the flow of solar power into the light harvesting network is significantly larger than the flow out, the photosynthetic network must adapt to reduce the sudden over-flow of energy,” he said. “When the network fails to manage these fluctuations, the organism attempts to expel the extra energy. In doing so, the organism undergoes oxidative stress, which damages cells.”

The researchers were surprised by how general and simple their model is.

“Nature will always surprise you,” Gabor said. “Something that seems so complicated and complex might operate based on a few basic rules. We applied the model to organisms in different photosynthetic niches and continue to reproduce accurate absorption spectra. In biology, there are exceptions to every rule, so much so that finding a rule is usually very difficult. Surprisingly, we seem to have found one of the rules of photosynthetic life.”

Gabor noted that over the last several decades, photosynthesis research has focused mainly on the structure and function of the microscopic components of the photosynthetic process.

“Biologists know well that biological systems are not generally finely tuned given the fact that organisms have little control over their external conditions,” he said. “This contradiction has so far been unaddressed because no model exists that connects microscopic processes with macroscopic properties. Our work represents the first quantitative physical model that tackles this contradiction.”

Next, supported by several recent grants, the researchers will design a novel microscopy technique to test their ideas and advance the technology of photo-biology experiments using quantum optics tools.

“There’s a lot out there to understand about nature, and it only looks more beautiful as we unravel its mysteries,” Gabor said.

Gabor, Cogdell, and van Grondelle were joined in the research by Trevor B. Arp, Jed Kistner-Morris, and Vivek Aji at UCR.

The research was supported by the Air Force Office of Scientific Research Young Investigator Program, the National Science Foundation, and through a U.S. Department of the Navy’s Historically Black Colleges and Universities/Minority Institutions award. Gabor was also supported through a Cottrell Scholar Award and a Canadian Institute for Advanced Research Azrieli Global Scholar Award. Other sources of funding were the NASA MUREP Institutional Research Opportunity program, the U.S. Department of Energy, the Biotechnological and Biological Sciences Research Council, the Royal Netherlands Academy of Arts and Sciences, and the Canadian Institute for Advanced Research.

The research paper is titled, “Quieting a noisy antenna reproduces photosynthetic light harvesting spectra.”


Press release from the University of California, Riverside

Traffic density, wind and air stratification influence concentrations of air pollutant NO2

Leipzig researchers use a calculation method to remove weather influences from air pollution data

Traffic density, wind and air stratification influence the pollution with the air pollutant nitrogen dioxide, according to the conclusion of a TROPOS study commissioned by the LfULG. Credits: Burkhard Lehmann, LfULG

Leipzig/Dresden. In connection with the effects of the COVID-19 pandemic, satellite measurements made headlines showing how much the air pollutant nitrogen dioxide (NO2) had decreased in China and northern Italy. In Germany, traffic density is the most important factor. However, weather also has an influence on NO2 concentrations, according to a study by the Leibniz Institute for Tropospheric Research (TROPOS), which evaluated the influence of weather conditions on nitrogen dioxide concentrations in Saxony from 2015 to 2018 on behalf of the Saxon State Office for Environment, Agriculture and Geology (LfULG). It showed that wind speed and the height of the lowest air layer are the most important factors determining how much pollution can accumulate locally.

In order to determine the influence of various weather factors on air quality, the team used a statistical method that allows meteorological fluctuations to be mathematically removed from long-term measurements. Air quality fluctuates, in some cases very strongly, due to different emissions and the influence of the weather. Until now, however, it has been difficult to estimate what share legal measures such as low-emission zones or diesel driving bans have in actual air quality, and what share is due to weather influences. With the method used here, this will become easier in the future.

Nitrogen dioxide (NO2) is an irritant gas which attacks the mucous membranes of the respiratory tract, causes inflammatory reactions as an oxidant and increases the effect of other air pollutants. As a precursor substance, it can also contribute to the formation of particulate matter. Limit values have been set in the EU to protect the population: for nitrogen dioxide, an annual average value of 40 micrograms per cubic metre of air (µg/m³) applies. If these limit values are not complied with, measures must be taken to protect the health of the population. In 2018/2019, for example, various measures were taken in Germany, ranging from a reduction in the number of lanes (e.g. in Leipzig) to driving bans for older diesel vehicles (e.g. in Stuttgart).
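The compliance rule reduces to comparing an annual average against the 40 µg/m³ threshold; a minimal sketch with invented monthly means (a real assessment would average the full set of hourly measurements):

```python
NO2_ANNUAL_LIMIT = 40.0  # EU annual mean limit value in µg/m³

# Hypothetical monthly mean NO2 readings at a traffic station, in µg/m³
monthly_means = [52, 48, 45, 41, 38, 35, 33, 36, 40, 46, 50, 53]

annual_mean = sum(monthly_means) / len(monthly_means)
print(f"annual mean: {annual_mean:.1f} µg/m³")
if annual_mean > NO2_ANNUAL_LIMIT:
    print("limit value exceeded -> measures must be taken")
```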

To evaluate the effectiveness of such measures, it would be helpful to determine the exact influence of weather conditions. The Saxon State Office for Environment, Agriculture and Geology (LfULG) therefore commissioned TROPOS to carry out a study on the influence of weather factors on NO2 concentrations and provided its measurement data from the Saxon air quality measurement network and meteorological data for this purpose. The researchers were thus able to evaluate data from 29 stations in Saxony over a period of four years, which represent a cross-section of air pollution – from stations at traffic centres to urban and rural background stations and stations on the ridge of the Erzgebirge mountains. They also calculated the height of the lowest layer in the atmosphere and incorporated data from traffic counting stations in Leipzig and Dresden into the study. A method from the field of machine learning was used for the statistical modelling, the application of which in the field of air quality was first published by British researchers in 2009.

In this way, the study was able to demonstrate that the traffic density at all traffic stations is most significantly responsible for nitrogen oxide concentrations. However, two weather parameters also have a significant influence on nitrogen dioxide concentrations: wind speed and the height of the so-called mixing layer. The latter is a meteorological parameter that indicates the height to which the lowest layer of air, where the emissions mix, extends. “It was also shown that high humidity can also reduce the concentration of nitrogen dioxide, which could be due to the fact that the pollutants deposit more strongly on moist surfaces. However, the exact causes are still unclear,” says Dominik van Pinxteren.

The statistical analysis has also enabled the researchers to remove the influence of the weather from the time series of pollutant concentrations: Adjusted for the weather, the concentration of nitrogen oxides (NOx) decreased by a total of 10 micrograms per cubic meter between 2015 and 2018 on average over all traffic stations in Saxony. In urban and rural areas and on the ridge of the Erzgebirge, however, NOx concentrations tend to remain at the same level. Even though there have been some improvements in air quality in recent years, there are good scientific arguments for further reducing air pollution.
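The weather-normalization idea can be sketched in pure Python. This is an illustrative toy with invented data, and a simple nearest-neighbour regressor stands in for the machine-learning model the study actually used: fit a model of concentration as a function of weather and traffic, then average its predictions over many weather conditions resampled from the measurement period, so that only non-weather drivers remain.

```python
import random

random.seed(0)

# Hypothetical ground truth (invented): NO2 falls with wind speed and
# mixing-layer height and rises with traffic. Real data would replace this.
def true_no2(wind_ms, mix_m, traffic):
    return 80.0 * traffic / (1.0 + 0.5 * wind_ms) / (1.0 + mix_m / 1000.0)

# One year of invented daily observations: ((wind, mixing height, traffic), NO2)
observations = []
for _ in range(365):
    wind, mix, traffic = (random.uniform(0.5, 8.0),
                          random.uniform(200.0, 1500.0),
                          random.uniform(0.5, 1.5))
    observations.append(((wind, mix, traffic), true_no2(wind, mix, traffic)))

def predict(obs, wind, mix, traffic, k=5):
    """Toy k-nearest-neighbour regressor standing in for the ML model."""
    def dist(f):
        return (((f[0] - wind) / 8.0) ** 2 + ((f[1] - mix) / 1500.0) ** 2
                + (f[2] - traffic) ** 2)
    nearest = sorted(obs, key=lambda o: dist(o[0]))[:k]
    return sum(y for _, y in nearest) / k

def weather_normalized(obs, traffic, n=200):
    """Average predictions over weather resampled from the whole period,
    holding traffic fixed, so that only non-weather drivers remain."""
    total = 0.0
    for _ in range(n):
        wind, mix, _ = random.choice(obs)[0]  # resample a weather condition
        total += predict(obs, wind, mix, traffic)
    return total / n

print(f"weather-normalized NO2 at average traffic: "
      f"{weather_normalized(observations, traffic=1.0):.1f} µg/m³")
```

Tracking this weather-normalized value over the years, instead of the raw measurements, is what lets trends such as the 10 µg/m³ decrease at Saxon traffic stations be attributed to emissions rather than to weather.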

In a way, this also applies to premature conclusions from the corona crisis: in order to find out how strong the influence of the initial restrictions on air quality actually was, the influence of the weather would have to be statistically removed in a longer series of measurements. To this end, investigations for the Leipzig area are currently underway at TROPOS, as is a Europe-wide study of the EU research infrastructure for short-lived atmospheric constituents such as aerosol, clouds and trace gases (ACTRIS), the German contribution to which is coordinated by TROPOS.

Publication:

Dominik van Pinxteren, Sebastian Düsing, Alfred Wiedensohler, Hartmut Herrmann (2020): Meteorological influences on nitrogen dioxide: Influence of weather conditions and weathering on nitrogen dioxide concentrations in outdoor air 2015 to 2018. Series of publications of the LfULG, issue 2/2020 (in German only)
https://publikationen.sachsen.de/bdb/artikel/35043
This study was commissioned by the State Office for Environment, Agriculture and Geology (LfULG).

Project:

LfULG project “Meteorologische Einflüsse auf Stickstoffdioxid” (Meteorological influences on nitrogen dioxide):
https://www.luft.sachsen.de/Inhalt_FuE_Projekt_Witterung_NOx_Ozon.html

 

Press release on how traffic density, wind and air stratification influence concentrations of the air pollutant NO2, by Tilo Arnhold from the Leibniz Institute for Tropospheric Research (TROPOS)

Addressing the infodemic around the COVID-19 pandemic, decision-making simulation game wins first ComplexityJam

Addressing the onslaught of conflicting information and fake news surrounding the coronavirus pandemic, ComplexityJam #survivetheinfodemic challenged participants to represent the complexity of the situation through games and interactive digital narratives. The online international game jam, coordinated by the INDCOR EU COST Action and MOME University, ended on June 13 with a virtual award ceremony.

The main award went to “Temp in Charge”.

The ComplexityJam international online game development competition was initiated by the INDCOR COST Action, which stands for Interactive Digital Narratives for Complexity Representations. The jam was launched on May 29 with almost 80 participants attending from 12 different countries, including the US, the UK, Sweden, the Netherlands, Hungary, and Turkey. 11 entries were developed before the June 5 deadline. The resulting works addressed issues of social distancing, information overload, fake news identification, successful collaboration and responsible decision-making during the pandemic. The main task was to provide orientation during the pandemic and an outlet for playful creativity through the creation of complex representations.

The winners were selected by a five-member international jury of acclaimed scholars and award-winning professionals: Janet Murray (Professor and Associate Dean for Research and Faculty Affairs, Georgia Institute of Technology, US), Lindsay Grace (Professor and Knight Chair for Interactive Media, University of Miami, US), Szabolcs Józsa (Founder, Nemesys Games, HU), Odile Limpach (Professor, Cologne Game Lab, DE) and Simon Meek (Creative Director, The Secret Experiment, BAFTA winner, UK).

1. “Temp in Charge” wins the ComplexityJam main award: https://rocinante.itch.io/temp-in-charge
The president has contracted pneumonia and you are his temporary stand-in for just one week. You must make decisions about the current pandemic and economic situation. There is no need to panic when you have the EasyGuv 6000 application at hand, with which running a government becomes an easy task. Find solutions to your problems with just a few clicks.

Team: Resul Alıcı (Bahcesehir University Game Design graduate and Unico Studio game designer), Burak Karakas (Bahcesehir University).
The jury found this work to offer the most complete experience. It directly addresses the question of difficult decisions based on competing pieces of information. “Temp in Charge” makes us aware of the complexity of political decision-making via a friendly, easy-to use interface.

2. “Trial Day” wins the Runner-up Award for development: https://erencaylak.itch.io/trial-day
Trial Day is a game about information overload in the age of a pandemic, of post-truth and fake news. You are an aspiring journalist. Welcome to your new job’s trial day! Your ultimate goal? Play their game as best as you can and identify which news pieces to trust! But be aware: your choices and behavior are being monitored!

Team: Eren Çaylak, Sid Chou, Glenn Curtis, Yiting Liu, Dimitra Mavrogonatou, Kirstin McLellan. (This team was assembled by the ComplexityJam organizers and included participants from New York University, Turkey, Greece, and Glasgow School of Art)
The jury particularly liked the trial aspect and its rapid-fired approach that challenges the interactor to make quick decisions. The simple, yet effective graphic depiction of the trial elements adds considerably to the experience.

3. “Essential Workers” wins the Special Award for “the most potential for further development”: https://aanupam3.itch.io/essential-workers
Essential Workers is a cooperative online multiplayer game about a community working together to overcome the COVID-19 pandemic. Players must balance their personal safety against the necessities of the community. If anyone loses, everyone loses.

Team: Aditya Anupam, Jordan Graves, Marian Dominquez Mirazo, Colin Stricklin, Kevin Tang, Michael Vogel (all Georgia Institute of Technology, USA)
The jury was impressed by this entry and how it translates an underlying scientific model into accessible game play. In addition, it raises awareness of “essential workers” – people in important jobs who are too often underpaid and underappreciated.

4. Honorary mention: RAWRER, a Cretaceous-era dinosaur version of Twitter, is a game in which the dinosaur community circulates news about the ongoing volcano crisis and tries to spread the word on how best to address the situation: https://noha-morte.itch.io/rawrer-mobile-game

Team: Olga Chatzifoti (Glasgow School of Art), Christina Chrysanthopoulou (Game Developer)
According to the jury, this entry addresses the infodemic via a fantasy world in which a population of dinosaurs discusses the severity of an impending threat in a manner analogous to the discussion around the COVID-19 pandemic. The developers created an impressive and detailed system for the interactor to explore.

5. Honorary mention: Rabid is a point-and-click adventure game where, as a mayor, you need to make decisions that will influence the lives of the anthropomorphic animals living in your town. You can decide which information to rely on and which to investigate further, but the issues you have to face might not be black and white, and sometimes you need to set priorities or choose the lesser evil. https://kuvasz013.itch.io/rabid

Team: Ágnes Fábián, Viktória Fehér, Ádám Kovács, Rebeka Kovács, Miklós Levente Papp, Noémi Rózsa, Eszter Szabó-Zichy
According to the jury, this experience was created with great attention to detail and description. The interactor experiences the complexity of decision-making in a friendly environment that could also work for younger audiences.

All games are available at: https://itch.io/jam/complexityjam
The event was supported by the COST Action INDCOR, COST – European Cooperation in Science and Technology (indcor.eu, cost.eu), MOME – Moholy-Nagy University of Art and Design (mome.hu), and the National Research, Development and Innovation Office, Hungary.

Press release. INDCOR (1, 2)

Ultracold atoms trapped in appropriately prepared optical traps can arrange themselves into surprisingly complex, hitherto unobserved structures, according to scientists from the Institute of Nuclear Physics of the Polish Academy of Sciences in Cracow. In line with their most recent predictions, matter in optical lattices should form inhomogeneous quantum rings in a controlled manner.

Ultracold atoms caught in an optical trap form surprisingly complex structures. Depending on the mutual interactions between particles with opposite spins, phases with various properties can be created locally. Credits: IFJ PAN

An optical lattice is a structure built of light, i.e. electromagnetic waves. Lasers play a key role in the construction of such lattices. Each laser generates an electromagnetic wave with strictly defined, constant parameters that can be modified almost arbitrarily. When the laser beams are matched properly, it is possible to create a lattice with well-known properties. Where the waves overlap, potential minima form, and their arrangement makes it possible to simulate systems and models well known from solid-state physics. The advantage of systems prepared in this way is that the positions of these minima can be modified relatively easily, which in practice means that various types of lattices can be prepared.

If we introduce appropriately selected atoms into an area of space prepared in this way, they will congregate at the locations of the potential minima. However, there is an important condition: the atoms must be cooled to ultra-low temperatures. Only then will their energy be small enough that they do not break out of the subtly prepared trap, explains Dr. Andrzej Ptok from the Institute of Nuclear Physics of the Polish Academy of Sciences (IFJ PAN) in Cracow.

Structures formed by atoms (or groups of atoms) trapped in an optical lattice resemble crystals. Depending on the configuration of the laser beams, they can be one-, two- or three-dimensional. Unlike crystals, they are defect-free. What’s more, while the structure of a crystal lattice can hardly be modified at all, optical lattices are quite easy to configure: all that’s needed is to change the properties of the laser light or the crossing angles of the beams. These features make optical lattices popular as quantum simulators. They can be used to reproduce various spatial configurations of atoms or groups of atoms, including even those that do not exist in nature.

In their research, the scientists from IFJ PAN worked with atoms trapped in optical lattices. Groups of fermions, i.e. atoms with spin 1/2 (spin is a quantum property related to a particle’s intrinsic rotation), were placed in the lattice sites. In each site a certain number of atoms had their spin oriented in one direction (up) and the rest in the opposite direction (down). Tuning the interaction between the atoms so that it becomes attractive leads to the formation of atom pairs, which correspond to Cooper pairs in superconductors: pairs of electrons with opposite spins in the same lattice site.

The parameters of the optical lattice can be used to influence the interaction between atoms of different spins trapped in individual sites. Moreover, a state can be prepared in this way that mimics an external magnetic field applied to the system; this is achieved by controlling the proportion between the numbers of atoms with different spins, says Dr. Konrad J. Kapcia from IFJ PAN, noting that systems prepared in this way can reproduce the effects of relatively large magnetic fields without the need to actually apply them. This is possible because we know how a given magnetic field would affect the difference between the numbers of particles with opposite spins, the researchers explain.
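The setup described here – spin-up and spin-down fermions on lattice sites, an attractive on-site interaction that binds them into pairs, and a trapping potential – is conventionally modelled by an attractive Hubbard Hamiltonian. The form below is a standard textbook sketch of that model, not a formula quoted from the press release:

```latex
\hat{H} = -t \sum_{\langle i,j \rangle,\sigma}
          \left( \hat{c}^{\dagger}_{i\sigma}\hat{c}_{j\sigma} + \mathrm{h.c.} \right)
        + U \sum_{i} \hat{n}_{i\uparrow}\hat{n}_{i\downarrow}
        + \sum_{i,\sigma} \left( V_{i} - \mu_{\sigma} \right)\hat{n}_{i\sigma},
        \qquad U < 0
```

Here $t$ is the hopping amplitude between neighbouring sites, $U<0$ is the attractive on-site interaction responsible for the Cooper-like pairing, $V_i$ is the trapping potential, and the spin-dependent chemical potentials $\mu_\uparrow$, $\mu_\downarrow$ encode the spin imbalance: their difference, $h=(\mu_\uparrow-\mu_\downarrow)/2$, plays the role of the effective magnetic field mentioned above.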

According to the predictions of the Cracow-based physicists, an interesting phase separation should take place in systems prepared in this manner. As a result, matter trapped in the optical lattice will spontaneously form a core-shell structure: a core of paired atoms in one phase, surrounded by a shell of paired atoms in the second phase.

The whole situation can be illustrated with a tasty example. Imagine a plate of rice with a thick sauce. By preparing the plate properly, we can control the relative position of the rice and the sauce. For example, we can prepare the system so that the rice sits in the center while the sauce forms a ring around it. From the same ingredients we can also construct the reverse arrangement: in the middle of the plate there is sauce, surrounded by a ring of rice. In our case, the plate is the optical trap with the atoms and their pairs, and the rice and sauce are the two phases, each grouping a different type of atom pair, Dr. Ptok describes.

The work of the physicists from IFJ PAN, published in Scientific Reports, is of a theoretical nature. Due to their simplicity, however, the described systems of ultracold atoms in optical traps can be quickly verified in laboratory experiments. Physicists from the IFJ PAN predicted that ultracold atoms trapped in optical lattices can form quantum rings with an inhomogeneous structure.

 

Scientific papers:

“Superfluidity of fermionic pairs in a harmonic trap. Comparative studies: Local Density Approximation and Bogoliubov-de Gennes solutions”
A. Cichy, A. Ptok;
Journal of Physics Communications 4, 055006 (2020);
DOI: 10.1088/2399-6528/ab8f02

“Phase separations induced by a trapping potential in one-dimensional fermionic systems as a source of core-shell structures”
A. Cichy, K. J. Kapcia, A. Ptok;
Scientific Reports 9, 6719 (2019);
DOI: 10.1038/s41598-019-42044-w

 

Press release from the Institute of Nuclear Physics Polish Academy of Sciences

According to new research, black holes could be like a hologram, where all the information is amassed in a two-dimensional surface able to reproduce a three-dimensional image. The study that demonstrates this, and which unites two discordant theories, has recently been published in Physical Review X.

What the researchers have done is apply the holographic principle to black holes. In this way, their mysterious thermodynamic properties become more understandable: focusing on the prediction that these bodies have a great entropy, and observing them in terms of quantum mechanics, you can describe them just like a hologram: they have two dimensions, in which gravity disappears, but they reproduce an object in three dimensions. Credits: Gerd Altmann for Pixabay

We can all picture that incredible image of a black hole that travelled around the world about a year ago. Yet, according to new research by SISSA, ICTP and INFN, black holes could be like a hologram, where all the information is amassed in a two-dimensional surface able to reproduce a three-dimensional image. In this way, these cosmic bodies, as affirmed by quantum theories, could be incredibly complex and concentrate an enormous amount of information inside themselves, like the largest hard disk that exists in nature, in two dimensions.

This idea aligns with Einstein’s theory of relativity, which describes black holes as three dimensional, simple, spherical and smooth, as they appear in that famous image. In short, black holes “appear” three dimensional, just like holograms. The study that demonstrates this, and which unites two discordant theories, has recently been published in Physical Review X.

The mystery of black holes

For scientists, black holes are a big question mark for many reasons. They are, for example, excellent representatives of the great difficulties of theoretical physics in putting together the principles of Einstein’s general theory of relativity with those of quantum physics when it comes to gravity. According to the first theory, they would be simple bodies without information. According to the other, as claimed by Jacob Bekenstein and Stephen Hawking, they would be “the most complex existing systems” because they would be characterised by an enormous “entropy”, which measures the complexity of a system, and consequently would have a lot of information inside them.
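The entropy in question is the Bekenstein-Hawking entropy, which is proportional to the area $A$ of the event horizon rather than to the enclosed volume; this area scaling is the original hint that the information content of a black hole behaves two-dimensionally:

```latex
S_{\mathrm{BH}} = \frac{k_{B}\, c^{3} A}{4\, G \hbar}
```

where $k_B$ is Boltzmann’s constant, $G$ Newton’s gravitational constant, $\hbar$ the reduced Planck constant and $c$ the speed of light.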

The holographic principle applied to black holes

To study black holes, the two authors of the research, Francesco Benini (SISSA Professor, ICTP scientific consultant and INFN researcher) and Paolo Milan (SISSA and INFN researcher), used an idea almost 30 years old, but still surprising, called the “holographic principle”. The researchers say: “This revolutionary and somewhat counterintuitive principle proposes that the behavior of gravity in a given region of space can alternatively be described in terms of a different system, which lives only along the edge of that region and therefore in one dimension less. And, more importantly, in this alternative description (called holographic) gravity does not appear explicitly. In other words, the holographic principle allows us to describe gravity using a language that does not contain gravity, thus avoiding friction with quantum mechanics”.

What Benini and Milan have done “is apply the theory of the holographic principle to black holes. In this way, their mysterious thermodynamic properties have become more understandable: focusing on predicting that these bodies have a great entropy and observing them in terms of quantum mechanics, you can describe them just like a hologram: they have two dimensions, in which gravity disappears, but they reproduce an object in three dimensions”.

From theory to observation

“This study,” explain the two scientists, “is only the first step towards a deeper understanding of these cosmic bodies and of the properties that characterise them when quantum mechanics crosses with general relativity. All this is even more important now, at a time when observations in astrophysics are experiencing incredible development. Just think of the observation of gravitational waves from the merger of black holes, the result of the collaboration between LIGO and Virgo, or, indeed, of the image of the black hole produced by the Event Horizon Telescope. In the near future, we may be able to test our theoretical predictions regarding quantum gravity, such as those made in this study, by observation. And this, from a scientific point of view, would be something absolutely exceptional”.

 

Press release from the Scuola Internazionale Superiore di Studi Avanzati

Alien frog invasion wreaks havoc on natural habitat

A warning on World Environment Day

The spotted-thighed frog is easily identified by the distinct spots on its thighs. Credits: UniSA/Christine Taylor

Indiscriminate feeding by an alien population of the carnivorous spotted-thighed frog could severely affect the native biodiversity of southern Australia, according to a new study by the University of South Australia.

The invasive amphibian – Litoria cyclorhyncha – which has hitchhiked across the Nullarbor from Western Australia – has now established a community of 1000-plus in Streaky Bay, South Australia, with sightings also confirmed on the Eyre Peninsula and at the Adelaide airport.

This is the first study of the spotted-thighed frog’s diet in its invaded range with the findings providing important biological information about the impact of the alien species on natural ecosystems.

Ecology experts, UniSA’s Associate Professor Gunnar Keppel and Christine Taylor, say the potential of the spotted-thighed frog spreading to other parts of Australia is very concerning given its destructive eating patterns.

“This frog is an indiscriminate eating machine that will devour just about anything it can fit into its mouth,” Taylor says.

“We’re talking about a relatively large, predatory tree frog that, as a species, is alien to South Australia, and it could have a devastating impact on invaded habitats.

“As it eats away at local species, it’s impacting the natural ecosystem, which can displace or destroy local food webs, outcompete native birds, reptiles and mammals for resources, and potentially change natural biodiversity.”

Biodiversity is the theme of this year’s United Nations World Environment Day.

Published in the Australian Journal of Zoology, the study examined the stomach contents of 76 spotted-thighed frogs across three habitats – an artificial wetland, seminatural bushland and an urban setting.

The carnivorous spotted-thighed frog will indiscriminately devour just about anything it can fit into its mouth. Credits: UniSA/Christine Taylor

On average, each frog had at least six prey items in its stomach, with prey estimated to include 200 different species, 60 per cent of which were beetles, spiders and insects. Native geckos, young frogs and mice were also identified as prey.

Introduced species can have terrible outcomes for Australia if they are not well understood. The infamous introduction of the cane toad in the 1930s as a mechanism to control sugar cane beetles is just one example. The failure of that initiative continues to ravage Australia’s ecology, with the cane toad now listed as a threatening pest under the Environment Protection and Biodiversity Conservation Act.

Assoc Prof Keppel says it is important that people understand how detrimental introduced species can be for whole environments. He warns that if the spread of the spotted-thighed frog is not kept under control, it could dominate many ecosystems in south-east Australia at the expense of the local flora and fauna.

“The spotted-thighed frog is obviously very mobile. Already it’s managed to travel more than 2000 kilometres and set up a colony in Streaky Bay. But its considerable tolerance of salinity and potential ability to withstand high temperatures could lead to further geographic spread, and if not controlled, it could extend further eastward into the Murray-Darling Basin,” Assoc Prof Keppel says.

“It’s vital that we continue to protect Australia’s biodiversity. Preventing further dispersal of the spotted-thighed frog is a high conservation priority.

“The state government should consider managing the invasive population of spotted-thighed frogs at Streaky Bay. This should include education programs to inform people about what to do if they find a frog, as well as the feasibility of exterminating the population in South Australia.

“Importantly, if you do see one of these critters in your travels – leave it be. We don’t want it hitchhiking any further.”

The spotted-thighed frog is native to southwestern Australia. Credits: Christine Taylor

Press release from the University of South Australia

Scientists identify a temperature tipping point for tropical forests

An aerial view of a tropical forest along the eastern Pacific Ocean shoreline of Barro Colorado Island, Panama. Credit: Smithsonian Tropical Research Institute photo

A study in Science by 225 researchers working with data from 590 forest sites around the world concludes that tropical forests release much more carbon into the atmosphere at high temperatures.

All living things have tipping points: points of no return, beyond which they cannot thrive. A new report in Science shows that maximum daily temperatures above 32.2 degrees Celsius (about 90 degrees Fahrenheit) cause tropical forests to lose stored carbon more quickly. To prevent this escape of carbon into the atmosphere, the authors, including three scientists affiliated with the Smithsonian Tropical Research Institute in Panama, recommend immediate steps to conserve tropical forests and stabilize the climate.

Carbon dioxide is an important greenhouse gas, released as we burn fossil fuels. It is absorbed by trees as they grow and stored as wood. When trees get too hot and dry they may close the pores in their leaves to save water, but that also prevents them from taking in more carbon. And when trees die, they release stored carbon back into the atmosphere.

Tropical forests hold about 40 percent of all the carbon stored by land plants. For this study, researchers measured the ability of tropical forests in different sites to store carbon.

“Tropical forests grow across a wide range of climate conditions,” said Stuart Davies, director of Smithsonian ForestGEO, a worldwide network of 70 forest study sites in 27 countries. “By examining forests across the tropics, we can assess their resilience and responses to changes in global temperatures. Many other studies explored how individual forests respond to short-term climatic fluctuations. This study takes a novel approach by exploring the implications of thermal conditions currently experienced by all tropical forests.”

By comparing carbon storage in trees at almost 600 sites around the world that are part of several different forest monitoring initiatives (RAINFOR, AfriTRON, T-FORCES and the Smithsonian’s ForestGEO), the huge research team led by Martin Sullivan from the University of Leeds and Manchester Metropolitan University found major differences in the amount of carbon stored by tropical forests in South America, Africa, Asia and Australia. South American forests store less carbon than forests in the Old World, perhaps due to evolutionary differences in which tree species grow there.

They also found that the two most important factors predicting how much carbon is lost by forests are the maximum daily temperature and the amount of precipitation during the driest times of the year.

As temperatures reach 32.2 degrees Celsius, carbon is released much faster. Trees can deal with increases in the minimum nighttime temperature (a global warming phenomenon observed at some sites), but not with increases in maximum daytime temperature.

They predict that South American forests will be the most affected by global warming because temperatures there are already higher than on other continents and the projections for future warming are also highest for this region. Increasing carbon in the atmosphere may counterbalance some of this loss but would also exacerbate warming.

Forests can adapt to warming temperatures, but it takes time. Tree species that cannot take the heat die and are gradually replaced by more heat-tolerant species. But that may take several human generations.

“This study highlights the importance of protecting tropical forests and stabilizing the Earth’s climate,” said Jefferson Hall, co-author and director of the Smithsonian’s Agua Salud Project in Panama. “One important tool will be to find novel ways to restore degraded land, like planting tree species that help make tropical forests more resilient to the realities of the 21st century.” The Agua Salud project asks how native tree species adapted to an area can be used to manage water, store carbon and promote biodiversity conservation at a critical point where North and South America connect.

A relevant note:

One of the oldest permanent tropical forest study sites, located on Barro Colorado Island in Panama, is not being monitored for the first time in 40 years as a result of the COVID-19 pandemic, giving scientists less of a handle on any climate change effects that may be taking place.

Steve Paton, director of STRI’s physical monitoring program, notes that in 2019 there were 32 days with maximum temperatures over 32 degrees Celsius at a weather station in the forest canopy on the island, and a first glance at his data indicates that these exceptionally hot days are becoming more common.

The Smithsonian Tropical Research Institute, headquartered in Panama City, Panama, is a unit of the Smithsonian Institution. The Institute furthers the understanding of tropical biodiversity and its importance to human welfare, trains students to conduct research in the tropics and promotes conservation by increasing public awareness of the beauty and importance of tropical ecosystems.

The paper Long-term thermal sensitivity of Earth’s tropical forests is published in Science 22 May 2020 (DOI: 10.1126/science.aaw7578)

 

Press release from the Smithsonian Tropical Research Institute

World can likely capture and store enough carbon dioxide to meet climate targets

The world is currently on track to fulfil scenarios for diverting atmospheric CO2 to underground reservoirs, according to a new study by Imperial.

The capture and storage of carbon dioxide (CO2) underground is one of the key components of the Intergovernmental Panel on Climate Change’s (IPCC) scenarios for keeping global warming to less than 2°C above pre-industrial levels by 2100.

Carbon capture and storage (CCS) would be used alongside other interventions such as renewable energy, energy efficiency, and electrification of the transportation sector.

Picture by Gerd Altmann

The IPCC used models to create around 1,200 technology scenarios whereby climate change targets are met using a mix of these interventions, most of which require the use of CCS.

Their reports are available here and here.

Now a new analysis from Imperial College London suggests that just 2,700 gigatonnes (Gt) of carbon dioxide (CO2) storage would be sufficient to meet the IPCC’s global warming targets. This is far less than leading estimates by academic and industry groups of what is available, which suggest there is more than 10,000 Gt of CO2 storage space globally.

It also found that the current rate of growth in the installed capacity of CCS is on track to meet some of the targets identified in IPCC reports, and that research and commercial efforts should focus on maintaining this growth while identifying enough underground space to store this much CO2.

The findings are published in Energy & Environmental Science.

Capturing carbon

CCS involves trapping CO2 at its emission source, such as fossil-fuel power stations, and storing it underground to keep it from entering the atmosphere. Together with other climate change mitigation strategies, CCS could help the world reach the climate change mitigation goals set out by the IPCC.

However, until now the amount of storage needed has not been specifically quantified.

The research team, led by Dr Christopher Zahasky at Imperial’s Department of Earth Science and Engineering, found that worldwide, there has been 8.6 per cent growth in CCS capacity over the past 20 years, putting us on a trajectory to meet many climate change mitigation scenarios that include CCS as part of the mix.

Dr Zahasky, who is now an assistant professor at the University of Wisconsin-Madison but conducted the work at Imperial, said: “Nearly all IPCC pathways to limit warming to 2°C require tens of Gts of CO2 stored per year by mid-century. However, until now, we didn’t know if these targets were achievable given historic data, or how these targets related to subsurface storage space requirements.

“We found that even the most ambitious scenarios are unlikely to need more than 2,700 Gt of CO2 storage resource globally, much less than the 10,000 Gt of storage resource that leading reports suggest is possible. Our study shows that if climate change targets are not met by 2100, it won’t be for a lack of carbon capture and storage space.”
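The arithmetic behind this kind of trajectory argument can be sketched with a simple compound-growth projection. The numbers below are illustrative assumptions only – a starting capacity of roughly 0.04 Gt/yr and the 8.6 per cent figure read as an annual growth rate – not values quoted from the paper:

```python
def project_capacity(c0_gt_per_year, annual_rate, years):
    """Installed CCS capacity after `years` of constant exponential growth."""
    return c0_gt_per_year * (1.0 + annual_rate) ** years

def cumulative_stored(c0_gt_per_year, annual_rate, years):
    """Total CO2 stored over the period, summing year-by-year capacity."""
    return sum(project_capacity(c0_gt_per_year, annual_rate, t)
               for t in range(years))

if __name__ == "__main__":
    c0, rate = 0.04, 0.086  # assumed starting capacity (Gt/yr) and growth rate
    print(f"Capacity in 30 years: {project_capacity(c0, rate, 30):.2f} Gt/yr")
    print(f"Cumulative storage:   {cumulative_stored(c0, rate, 30):.1f} Gt")
```

Sustained exponential growth compounds quickly: under these assumptions capacity grows roughly twelvefold in three decades, while cumulative storage remains far below the 2,700 Gt ceiling the study identifies, which is the sense in which the storage resource, rather than the storage space, is the binding constraint.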

Study co-author Dr Samuel Krevor, also from the Department of Earth Science and Engineering, said: “Rather than focus our attention on looking at how much storage space is available, we decided for the first time to evaluate how much subsurface storage resource is actually needed, and how quickly it must be developed, to meet climate change mitigation targets.”

Speed matters

The study has shown for the first time that the maximum storage space needed is only around 2,700 Gt, but that this amount will grow if CCS deployment is delayed. The researchers worked this out by combining data on the past 20 years of growth in CCS, information on historical rates of growth in energy infrastructure, and models commonly used to monitor the depletion of natural resources.

The researchers say that the rate at which CO2 is stored is important to its success in climate change mitigation. The faster CO2 is stored, the less total subsurface storage resource is needed to meet storage targets. This is because it becomes harder to find new reservoirs or make further use of existing reservoirs as they fill up.

They found that storing faster and sooner than current deployment might be needed to help governments meet the most ambitious climate change mitigation scenarios identified by the IPCC.

The study also demonstrates how using growth models, a common tool in resource assessment, can help industry and governments to monitor short-term CCS deployment progress and long-term resource requirements.

However, the researchers point out that meeting CCS storage requirements will not be sufficient on its own to meet the IPCC climate change mitigation targets.

Dr Krevor said: “Our analysis shows good news for CCS if we keep up with this trajectory – but there are many other factors in mitigating climate change and its catastrophic effects, like using cleaner energy and transport as well as significantly increasing the efficiency of energy use.”

Funding for this work was provided by ACT ELEGANCY, DETEC (CH), BMWi (DE), RVO (NL), Gassnova (NO), BEIS (UK), Gassco, Equinor and Total, the European Commission under the Horizon 2020 programme, the UK CCS Research Centre and EPSRC.

“Global geologic carbon storage requirements of climate change mitigation scenarios” by Christopher Zahasky and Samuel Krevor, published 21 May 2020 in Energy & Environmental Science.

 

 

 

Press release by Caroline Brogan, from the Imperial College London