
Ultracold atoms trapped in appropriately prepared optical traps can arrange themselves in surprisingly complex, hitherto unobserved structures, according to scientists from the Institute of Nuclear Physics of the Polish Academy of Sciences in Cracow. In line with their most recent predictions, matter in optical lattices should form tensile and inhomogeneous quantum rings in a controlled manner.

Ultracold atoms caught in an optical trap form surprisingly complex structures. Depending on the mutual interactions between particles with opposite spins, phases with various properties can be created locally. Credits: IFJ PAN

An optical lattice is a structure built of light, i.e. electromagnetic waves. Lasers play a key role in the construction of such lattices. Each laser generates an electromagnetic wave with strictly defined, constant parameters that can be modified almost arbitrarily. When the laser beams are matched properly, it is possible to create a lattice with well-known properties. By overlapping the waves, minima of the potential can be obtained whose arrangement enables the simulation of systems and models well known from solid-state physics. The advantage of systems prepared in this way is the relative ease of shifting the positions of these minima, which in practice means the possibility of preparing various types of lattices.
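To make this concrete, the standing-wave potential of a one-dimensional optical lattice is commonly written V(x) = V0·sin²(kx), with trapping minima spaced half a wavelength apart. The short sketch below is purely illustrative (invented units, not taken from the study) and just checks where the minima and maxima fall:

```python
import math

def lattice_potential(x, v0=1.0, wavelength=1.0):
    """1D optical-lattice potential V(x) = V0 * sin^2(k x)."""
    k = 2 * math.pi / wavelength  # wave number of the standing wave
    return v0 * math.sin(k * x) ** 2

# Minima (the sites where cold atoms collect) repeat every half wavelength
for n in range(3):
    assert lattice_potential(n * 0.5) < 1e-12

# Halfway between two minima the potential reaches its maximum V0
assert abs(lattice_potential(0.25) - 1.0) < 1e-12
```

Changing the wavelength or the relative angles of the interfering beams moves these minima, which is what makes the lattice geometry so easy to reconfigure.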

“If we introduce appropriately selected atoms into an area of space that has been prepared in this way, they will congregate at the locations of the potential minima. However, there is an important condition: the atoms must be cooled to ultra-low temperatures. Only then will their energy be small enough not to break out of the subtly prepared trap,” explains Dr. Andrzej Ptok from the Institute of Nuclear Physics of the Polish Academy of Sciences (IFJ PAN) in Cracow.

Structures formed by atoms (or groups of atoms) trapped in an optical lattice resemble crystals. Depending on the configuration of the laser beams, they can be one-, two- or three-dimensional. Unlike crystals, they are defect-free. What’s more, while in crystals the possibility of modifying the lattice structure is negligible, optical lattices are quite easy to configure: all that’s needed is to change the properties of the laser light or the angles at which the beams intersect. These features make optical lattices popular as quantum simulators. They can be used to reproduce various spatial configurations of atoms or groups of atoms, including even those that do not exist in nature.

In their research, the scientists from the IFJ PAN worked with atoms trapped in optical lattices. Groups of fermions, i.e. atoms with spin 1/2 (spin is a quantum property describing the rotation of particles), were placed in the lattice sites. In each site, a certain number of atoms had their spin oriented in one direction (up), and the rest in the opposite direction (down). Tuning the interactions between the atoms so that they become attractive leads to the formation of pairs of atoms, which correspond to Cooper pairs in superconductors: pairs of electrons with opposite spins in the same lattice site.

“The parameters of the optical lattice can be used to influence the interaction between atoms of different spins trapped in individual sites. Moreover, in this way a state can be prepared that mimics an external magnetic field applied to the system. This is achieved by controlling the proportion between the numbers of atoms with different spins,” says Dr. Konrad J. Kapcia from IFJ PAN, noting that systems prepared in this way can reproduce the effects of relatively large magnetic fields without needing to use those fields. “This is possible because we know how a given magnetic field would affect the difference between the numbers of particles with opposite spins,” the researcher explains.

According to the predictions of the Cracow-based physicists, an interesting phase separation should take place in systems prepared in this manner. As a result, matter trapped in an optical lattice will spontaneously form a core-shell structure: a core of paired atoms of one phase surrounded by a shell of paired atoms of the second phase.

“The whole situation can be illustrated with a tasty example. Imagine a plate of rice with a thick sauce. By preparing the plate properly, we can control the relative position of the rice and the sauce. For example, we can prepare the system in such a way that the rice sits in the center while the sauce forms a ring around it. From the same ingredients we can also construct the reverse system: in the middle of the plate there will be the sauce, surrounded by a ring of rice. In our case, the plate is the optical trap with atoms and their pairs, and the rice and sauce are the two phases, grouping the different types of atom pairs,” Dr. Ptok describes.

The work of the physicists from IFJ PAN, published in Scientific Reports, is of a theoretical nature. Due to their simplicity, however, the described systems of ultracold atoms in optical traps can be quickly verified in laboratory experiments. Physicists from the IFJ PAN predicted that ultracold atoms trapped in optical lattices can form quantum rings with an inhomogeneous structure.

 

Scientific papers:

“Superfluidity of fermionic pairs in a harmonic trap. Comparative studies: Local Density Approximation and Bogoliubov-de Gennes solutions”
A. Cichy, A. Ptok;
Journal of Physics Communications 4, 055006 (2020);
DOI: 10.1088/2399-6528/ab8f02

“Phase separations induced by a trapping potential in one-dimensional fermionic systems as a source of core-shell structures”
A. Cichy, K. J. Kapcia, A. Ptok;
Scientific Reports 9, 6719 (2019);
DOI: 10.1038/s41598-019-42044-w

 

Press release from the Institute of Nuclear Physics Polish Academy of Sciences

According to new research, black holes could be like a hologram, where all the information is amassed in a two-dimensional surface able to reproduce a three-dimensional image. The study that demonstrates this, and that unites two discordant theories, has recently been published in Physical Review X.

What researchers have done is apply the theory of the holographic principle to black holes. In this way, their mysterious thermodynamic properties have become more understandable: focusing on predicting that these bodies have enormous entropy and observing them in terms of quantum mechanics, they can be described just like a hologram: they have two dimensions, in which gravity disappears, but they reproduce an object in three dimensions. Credits: Gerd Altmann for Pixabay

We can all picture that incredible image of a black hole that travelled around the world about a year ago. Yet, according to new research by SISSA, ICTP and INFN, black holes could be like a hologram, where all the information is amassed in a two-dimensional surface able to reproduce a three-dimensional image. In this way, these cosmic bodies, as affirmed by quantum theories, could be incredibly complex and concentrate an enormous amount of information inside themselves, like the largest hard disk that exists in nature, but in two dimensions.

This idea aligns with Einstein’s theory of relativity, which describes black holes as three dimensional, simple, spherical, and smooth, as they appear in that famous image. In short, black holes “appear” three dimensional, just like holograms. The study that demonstrates this, and that unites two discordant theories, has recently been published in Physical Review X.

The mystery of black holes

For scientists, black holes are a big question mark for many reasons. They are, for example, excellent representatives of the great difficulties of theoretical physics in putting together the principles of Einstein’s general theory of relativity with those of quantum physics when it comes to gravity. According to the first theory, they would be simple bodies without information. According to the other, as claimed by Jacob Bekenstein and Stephen Hawking, they would be “the most complex existing systems” because they would be characterised by an enormous “entropy”, which measures the complexity of a system, and consequently would have a lot of information inside them.

The holographic principle applied to black holes

To study black holes, the two authors of the research, Francesco Benini (SISSA Professor, ICTP scientific consultant and INFN researcher) and Paolo Milan (SISSA and INFN researcher), used an idea almost 30 years old, but still surprising, called the “holographic principle”. The researchers say: “This revolutionary and somewhat counterintuitive principle proposes that the behavior of gravity in a given region of space can alternatively be described in terms of a different system, which lives only along the edge of that region and therefore in one dimension less. And, more importantly, in this alternative description (called holographic) gravity does not appear explicitly. In other words, the holographic principle allows us to describe gravity using a language that does not contain gravity, thus avoiding friction with quantum mechanics”.

What Benini and Milan have done “is apply the theory of the holographic principle to black holes. In this way, their mysterious thermodynamic properties have become more understandable: focusing on predicting that these bodies have enormous entropy and observing them in terms of quantum mechanics, you can describe them just like a hologram: they have two dimensions, in which gravity disappears, but they reproduce an object in three dimensions”.

From theory to observation

“This study,” explain the two scientists, “is only the first step towards a deeper understanding of these cosmic bodies and of the properties that characterise them when quantum mechanics crosses with general relativity. All this is even more important now, at a time when observations in astrophysics are experiencing an incredible development. Just think of the observation of gravitational waves from the merger of black holes, the result of the collaboration between LIGO and Virgo, or, indeed, of the black hole observed by the Event Horizon Telescope, which produced that extraordinary image. In the near future, we may be able to test our theoretical predictions regarding quantum gravity, such as those made in this study, by observation. And this, from a scientific point of view, would be something absolutely exceptional”.

 

Press release from the Scuola Internazionale Superiore di Studi Avanzati

Alien frog invasion wreaks havoc on natural habitat

A warning on World Environment Day

The spotted-thighed frog is easily identified by the distinct spots on its thighs. Credits: UniSA/Christine Taylor

Indiscriminate feeding by an alien population of the carnivorous spotted-thighed frog could severely affect the native biodiversity of southern Australia, according to a new study by the University of South Australia.

The invasive amphibian – Litoria cyclorhyncha – which has hitchhiked across the Nullarbor from Western Australia – has now established a community of 1000-plus in Streaky Bay, South Australia, with sightings also confirmed on the Eyre Peninsula and at the Adelaide airport.

This is the first study of the spotted-thighed frog’s diet in its invaded range with the findings providing important biological information about the impact of the alien species on natural ecosystems.

Ecology experts, UniSA’s Associate Professor Gunnar Keppel and Christine Taylor, say the potential of the spotted-thighed frog spreading to other parts of Australia is very concerning given its destructive eating patterns.

“This frog is an indiscriminate eating machine that will devour just about anything it can fit into its mouth,” Taylor says.

“We’re talking about a relatively large, predatory tree frog that, as a species, is alien to South Australia, and it could have a devastating impact on invaded habitats.

“As it eats away at local species, it’s impacting the natural ecosystem, which can displace or destroy local food webs, outcompete native birds, reptiles and mammals for resources, and potentially change natural biodiversity.”

Biodiversity is the theme of this year’s United Nations World Environment Day.

Published in the Australian Journal of Zoology, the study examined the stomach contents of 76 spotted-thighed frogs across three habitats – an artificial wetland, seminatural bushland and an urban setting.

The carnivorous spotted-thighed frog will indiscriminately devour just about anything it can fit into its mouth. Credits: UniSA/Christine Taylor

On average, each frog had at least six prey items in its stomach, with prey estimated to include 200 different species, 60 per cent of which were beetles, spiders and insects. Native geckos, young frogs and mice were also identified as prey.

Introduced species can have terrible outcomes for Australia if they are not well understood. The infamous introduction of the cane toad in the 1930s as a mechanism to control sugar cane beetles is just one example. The failure of that initiative continues to ravage Australia’s ecology, with the cane toad now listed as a threatening pest under the Environment Protection and Biodiversity Conservation Act.

Assoc Prof Keppel says it is important that people understand how detrimental introduced species can be for whole environments. He warns that if the spread of the spotted-thighed frog is not kept under control, it could dominate many ecosystems in south-east Australia at the expense of the local flora and fauna.

“The spotted-thighed frog is obviously very mobile. Already it’s managed to travel more than 2000 kilometres and set up a colony in Streaky Bay. But its considerable tolerance of salinity and potential ability to withstand high temperatures could lead to further geographic spread, and if not controlled, it could extend further eastward into the Murray-Darling Basin,” Assoc Prof Keppel says.

“It’s vital that we continue to protect Australia’s biodiversity. Preventing further dispersal of the spotted-thighed frog is a high conservation priority.

“The state government should consider managing the invasive population of spotted-thighed frogs at Streaky Bay. This should include education programs to inform people about what to do if they find a frog, as well as the feasibility of exterminating the population in South Australia.

“Importantly, if you do see one of these critters in your travels – leave it be. We don’t want it hitchhiking any further.”

The spotted-thighed frog is native to southwestern Australia. Credits: Christine Taylor

Press release from the University of South Australia

Scientists identify a temperature tipping point for tropical forests

An aerial view of a tropical forest along the eastern Pacific Ocean shoreline of Barro Colorado Island, Panama. Credit: Smithsonian Tropical Research Institute photo

A study in Science by 225 researchers working with data from 590 forest sites around the world concludes that tropical forests release much more carbon into the atmosphere at high temperatures.

All living things have tipping points: points of no return, beyond which they cannot thrive. A new report in Science shows that maximum daily temperatures above 32.2 degrees Celsius (about 90 degrees Fahrenheit) cause tropical forests to lose stored carbon more quickly. To prevent this escape of carbon into the atmosphere, the authors, including three scientists affiliated with the Smithsonian Tropical Research Institute in Panama, recommend immediate steps to conserve tropical forests and stabilize the climate.

Carbon dioxide is an important greenhouse gas, released as we burn fossil fuels. It is absorbed by trees as they grow and stored as wood. When trees get too hot and dry they may close the pores in their leaves to save water, but that also prevents them from taking in more carbon. And when trees die, they release stored carbon back into the atmosphere.

Tropical forests hold about 40 percent of all the carbon stored by land plants. For this study, researchers measured the ability of tropical forests in different sites to store carbon.

“Tropical forests grow across a wide range of climate conditions,” said Stuart Davies, director of Smithsonian ForestGEO, a worldwide network of 70 forest study sites in 27 countries. “By examining forests across the tropics, we can assess their resilience and responses to changes in global temperatures. Many other studies explored how individual forests respond to short-term climatic fluctuations. This study takes a novel approach by exploring the implications of thermal conditions currently experienced by all tropical forests.”

By comparing carbon storage in trees at almost 600 sites around the world that are part of several different forest monitoring initiatives – RAINFOR, AfriTRON, T-FORCES and the Smithsonian’s ForestGEO – the huge research team led by Martin Sullivan from the University of Leeds and Manchester Metropolitan University found major differences in the amount of carbon stored by tropical forests in South America, Africa, Asia and Australia. South American forests store less carbon than forests in the Old World, perhaps due to evolutionary differences in which tree species grow there.

They also found that the two most important factors predicting how much carbon is lost by forests are the maximum daily temperature and the amount of precipitation during the driest times of the year.

As temperatures reach 32.2 degrees Celsius, carbon is released much faster. Trees can deal with increases in the minimum nighttime temperature (a global warming phenomenon observed at some sites), but not with increases in maximum daytime temperature.

They predict that South American forests will be the most affected by global warming because temperatures there are already higher than on other continents and the projections for future warming are also highest for this region. Increasing carbon in the atmosphere may counterbalance some of this loss but would also exacerbate warming.

Forests can adapt to warming temperatures, but it takes time. Tree species that cannot take the heat die and are gradually replaced by more heat-tolerant species. But that may take several human generations.

“This study highlights the importance of protecting tropical forests and stabilizing the Earth’s climate,” said Jefferson Hall, co-author and director of the Smithsonian’s Agua Salud Project in Panama. “One important tool will be to find novel ways to restore degraded land, like planting tree species that help make tropical forests more resilient to the realities of the 21st century.” The Agua Salud project asks how native tree species adapted to an area can be used to manage water, store carbon and promote biodiversity conservation at a critical point where North and South America connect.

A relevant note:

One of the oldest permanent tropical forest study sites, located on Barro Colorado Island in Panama, is going unmonitored for the first time in 40 years as a result of the COVID-19 pandemic, giving scientists less of a handle on any climate change effects that may be taking place.

Steve Paton, director of STRI’s physical monitoring program, notes that in 2019 there were 32 days with maximum temperatures over 32 degrees Celsius at a weather station in the forest canopy on the island, and a first glance at his data indicates that these exceptionally hot days are becoming more common.

The Smithsonian Tropical Research Institute, headquartered in Panama City, Panama, is a unit of the Smithsonian Institution. The Institute furthers the understanding of tropical biodiversity and its importance to human welfare, trains students to conduct research in the tropics and promotes conservation by increasing public awareness of the beauty and importance of tropical ecosystems.

The paper “Long-term thermal sensitivity of Earth’s tropical forests” was published in Science on 22 May 2020 (DOI: 10.1126/science.aaw7578).

 

Press release from the Smithsonian Tropical Research Institute

World can likely capture and store enough carbon dioxide to meet climate targets

The world is currently on track to fulfil scenarios for diverting atmospheric CO2 into underground reservoirs, according to a new study by Imperial.

The capture and storage of carbon dioxide (CO2) underground is one of the key components of the Intergovernmental Panel on Climate Change’s (IPCC) scenarios for keeping global warming to less than 2°C above pre-industrial levels by 2100.

Carbon capture and storage (CCS) would be used alongside other interventions such as renewable energy, energy efficiency, and electrification of the transportation sector.

Picture by Gerd Altmann

The IPCC used models to create around 1,200 technology scenarios whereby climate change targets are met using a mix of these interventions, most of which require the use of CCS.


Now a new analysis from Imperial College London suggests that just 2,700 Gigatonnes (Gt) of CO2 storage would be sufficient to meet the IPCC’s global warming targets. This is far less than leading estimates by academic and industry groups of what is available, which suggest there is more than 10,000 Gt of CO2 storage space globally.

It also found that the current rate of growth in the installed capacity of CCS is on track to meet some of the targets identified in IPCC reports, and that research and commercial efforts should focus on maintaining this growth while identifying enough underground space to store this much CO2.

The findings are published in Energy & Environmental Science.

Capturing carbon

CCS involves trapping CO2 at its emission source, such as fossil-fuel power stations, and storing it underground to keep it from entering the atmosphere. Together with other climate change mitigation strategies, CCS could help the world reach the climate change mitigation goals set out by the IPCC.

However, until now the amount of storage needed has not been specifically quantified.

The research team, led by Dr Christopher Zahasky at Imperial’s Department of Earth Science and Engineering, found that worldwide, there has been 8.6 per cent growth in CCS capacity over the past 20 years, putting us on a trajectory to meet many climate change mitigation scenarios that include CCS as part of the mix.

Dr Zahasky, who is now an assistant professor at the University of Wisconsin-Madison but conducted the work at Imperial, said: “Nearly all IPCC pathways to limit warming to 2°C require tens of Gts of CO2 stored per year by mid-century. However, until now, we didn’t know if these targets were achievable given historic data, or how these targets related to subsurface storage space requirements.

“We found that even the most ambitious scenarios are unlikely to need more than 2,700 Gt of CO2 storage resource globally, much less than the 10,000 Gt of storage resource that leading reports suggest is possible. Our study shows that if climate change targets are not met by 2100, it won’t be for a lack of carbon capture and storage space.”

Study co-author Dr Samuel Krevor, also from the Department of Earth Science and Engineering, said: “Rather than focus our attention on looking at how much storage space is available, we decided for the first time to evaluate how much subsurface storage resource is actually needed, and how quickly it must be developed, to meet climate change mitigation targets.”

Speed matters

The study has shown for the first time that the maximum storage space needed is only around 2,700 Gt, but that this amount will grow if CCS deployment is delayed. The researchers worked this out by combining data on the past 20 years of growth in CCS, information on historical rates of growth in energy infrastructure, and models commonly used to monitor the depletion of natural resources.

The researchers say that the rate at which CO2 is stored is important in its success in climate change mitigation. The faster CO2 is stored, the less total subsurface storage resource is needed to meet storage targets. This is because it becomes harder to find new reservoirs or make further use of existing reservoirs as they become full.
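To see how this kind of growth-model bookkeeping works, here is a minimal sketch: an annual storage rate that compounds each year, with the cumulative total compared against a storage-resource budget. Every number below is invented for illustration; this is not the paper’s model or its data.

```python
def cumulative_storage(initial_rate_gt, annual_growth, years):
    """Cumulative Gt of CO2 stored if the annual storage rate
    compounds by `annual_growth` each year (a simple geometric series)."""
    total, rate = 0.0, initial_rate_gt
    for _ in range(years):
        total += rate
        rate *= 1.0 + annual_growth
    return total

# Illustrative assumptions: ~0.04 Gt/yr stored today, 8.6% annual growth
# sustained for 80 years (to ~2100), checked against a 2,700 Gt budget.
stored_by_2100 = cumulative_storage(0.04, 0.086, 80)
print(f"illustrative cumulative storage: {stored_by_2100:.0f} Gt")
assert stored_by_2100 < 2700  # stays within the budget in this toy scenario
```

The key qualitative point the sketch captures is the one in the paragraph above: the later growth starts, the more years sit at a low rate, so a delayed deployment needs a larger total resource to hit the same cumulative target.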

They found that storing faster and sooner than current deployment might be needed to help governments meet the most ambitious climate change mitigation scenarios identified by the IPCC.

The study also demonstrates how using growth models, a common tool in resource assessment, can help industry and governments to monitor short-term CCS deployment progress and long-term resource requirements.

However, the researchers point out that meeting CCS storage requirements will not be sufficient on its own to meet the IPCC climate change mitigation targets.

Dr Krevor said: “Our analysis shows good news for CCS if we keep up with this trajectory – but there are many other factors in mitigating climate change and its catastrophic effects, like using cleaner energy and transport as well as significantly increasing the efficiency of energy use.”

Funding for this work was provided by ACT ELEGANCY, DETEC (CH), BMWi (DE), RVO (NL), Gassnova (NO), BEIS (UK), Gassco, Equinor and Total, the European Commission under the Horizon 2020 programme, the UK CCS Research Centre and EPSRC.

“Global geologic carbon storage requirements of climate change mitigation scenarios” by Christopher Zahasky and Samuel Krevor, published 21 May 2020 in Energy & Environmental Science.


Press release by Caroline Brogan, from the Imperial College London

A rising tide of marine disease? How parasites respond to a warming world

Sea star wasting disease, pictured here, is likely caused by the sea star associated densovirus. Credits: Oregon State Parks

Warming events are increasing in magnitude and severity, threatening many ecosystems worldwide. As global temperatures continue to climb, uncertainty also grows about the relationships, prevalence, and spread of parasites and disease.

A recent study from the University of Washington explores the ways parasitism will respond to climate change, providing researchers new insights into disease transmission. The paper was published in May in Trends in Ecology and Evolution.

The review builds upon previous research by adding nearly two decades’ worth of new evidence to build a framework describing the parasite–host relationship under climate oscillations. Traditionally, climate-related research is done over long time scales; this approach instead examines how increasingly frequent “pulse warming” events alter parasite transmission.

“Much of what is known about how organisms and ecosystems can respond to climate change has focused on gradual warming,” said lead author Danielle Claar, a postdoctoral researcher at the UW School of Aquatic and Fishery Sciences. “Climate change causes not only gradual warming over time, but also increases the frequency and magnitude of extreme events, like heat waves.”

Claar explains that both gradual warming and pulse warming can and have influenced ecosystems, but do so in different ways. Organisms may be able to adapt and keep pace with the gradual warming, but an acute pulse event can have sudden and profound impacts.

A sea star ravaged by sea star wasting disease. Credits: Alison Leigh Lilly

The 2013-2015 “blob” is one such extreme heat pulse event which has been linked to a massive die-off of sea stars along the Pacific coast of the U.S. and Canada. Many species of sea stars, including the large sunflower sea star, were decimated by a sudden epidemic of wasting disease. Five years later, populations in the region are still struggling to recover. The abnormally warm waters associated with the blob are thought to have favored the spread of the sea star-associated densovirus, the suggested cause of the disease.

The authors compare the prevalence of these marine diseases to a rising tide, an ebbing tide, or a tsunami. Disease transmission can rise or ebb in concert with gradual warming or a series of pulse warming events. However, a severe pulse warming event could result in a tsunami, “initiating either a deluge or drought of disease,” as was observed with sea stars along the Pacific Northwest.

However, not all pulse heat events will cause the same response. What may benefit a particular parasite or host in one system can be detrimental in another. Warming can alter a parasite’s life cycle, limit the range of suitable host species, or even impair the host’s immune response. Some flatworms which target wildlife and humans cannot survive as long in warmer waters, decreasing their window for infecting a host. Another recent UW study shows parasites commonly found in sushi are on the rise with their numbers increasing 283-fold in the past 40 years, though the relationship between heat pulse events and their abundance is not yet clear.

 

“The relationships between hosts, parasites, and their corresponding communities are complex and depend on many factors, making outcomes difficult to predict,” said Claar, who recommends researchers make predictions on a case-by-case basis for their individual systems.

The authors conclude that rather than a straightforward tidal prediction, they would expect pulse warming to cause “choppy seas with the occasional rogue wave.”

“It is important that we are able to understand and predict how parasitism and disease might respond to climate change, so we can prepare for, and mitigate, potential impacts to human and wildlife health,” said Claar.

The paper’s co-author is Chelsea Wood, a UW assistant professor of aquatic and fishery sciences.

This research was supported by the NOAA Climate and Global Change Postdoctoral Fellowship Program, administered by UCAR’s Cooperative Programs for the Advancement of Earth System Science (CPAESS); the US National Science Foundation; a Sloan Research Fellowship from the Alfred P. Sloan Foundation; a UW Innovation Award from the UW President’s Innovation Imperative; and a UW Royalty Research Fund Award.

Press release by Dan Nicola from the School of Aquatic and Fishery Sciences of the University of Washington.

In recent years, the concept of Ecosystem Services (ES) – the benefits people obtain from ecosystems, such as pollination provided by bees for crop growing, timber provided by forests or recreation enabled by appealing landscapes – has been greatly popularised, especially in the context of impending ecological crises and constantly degrading natural environments.

There has been an increasing need for robust and practical methodologies to assess ecosystem services: the benefits people obtain from ecosystems. Credits: Pensoft, CC-BY 4.0

Hence, there has been an increasing need for robust and practical methodologies to assess ES, in order to provide key stakeholders and decision-makers with crucial information. One such method to map and assess ES – the ES Matrix approach – has been increasingly used in the last decade.

The ES Matrix approach is based on a lookup table consisting of geospatial units (e.g. types of ecosystems, habitats, land uses) and sets of ES to be assessed for a specific study area, which means that the selection of a particular study area is the starting point of the assessment. Only then can suitable indicators and methods for ES quantification be defined. Based on this information, a score for each of the ES considered is generated, referring to ES potential, ES supply, ES flow/use or demand for ES.
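To make the idea concrete, here is a minimal sketch of such a lookup table together with an area-weighted score for a study area. The land-cover types, services and 0–5 scores below are invented for illustration and are not taken from any published matrix.

```python
# Rows: geospatial units (land-cover types); columns: ecosystem services.
# Scores run from 0 (no relevant capacity) to 5 (very high capacity).
es_matrix = {
    "forest":   {"timber": 5, "pollination": 2, "recreation": 4},
    "cropland": {"timber": 0, "pollination": 3, "recreation": 1},
    "wetland":  {"timber": 0, "pollination": 1, "recreation": 3},
}

def area_weighted_score(areas_ha, service):
    """Average score of one ES over a study area, weighted by the
    share of each land-cover type in the total area."""
    total_area = sum(areas_ha.values())
    weighted = sum(es_matrix[cover][service] * area
                   for cover, area in areas_ha.items())
    return weighted / total_area

# Hypothetical study area: 120 ha forest, 300 ha cropland, 30 ha wetland
study_area = {"forest": 120.0, "cropland": 300.0, "wetland": 30.0}
print(area_weighted_score(study_area, "pollination"))  # prints 2.6
```

In real applications the scores in the table come from expert scoring, measured indicators or previously published matrices, which is exactly where the review’s concerns about transparency and reused scores arise.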

Originally developed in a 2009 paper by a team, led by Prof Dr Benjamin Burkhard (Leibniz University Hannover and Leibniz Centre for Agricultural Landscape Research ZALF), the ES Matrix allows the assessment of the capacity of particular ecosystem types or geospatial units to provide ES.

Ten years later, a study led by Dr C. Sylvie Campagne (Leibniz University Hannover, Germany), Dr Philip Roche (INRAE, France), Prof. Dr Felix Müller (University of Kiel, Germany) and Prof. Dr Benjamin Burkhard reviewed 109 published studies applying the ES Matrix approach, in order to find out how the method had been applied and whether it had been used in an oversimplified way.

In their recent paper, published in the open-access, peer-reviewed journal One Ecosystem, the review confirms the method’s flexibility, appropriateness and utility for decision-making, as well as its ability to increase awareness of ES. Nevertheless, the authors conclude that the ES Matrix approach has often been applied in a “quick and dirty” way, and that it requires more transparency and the integration of variability analyses.

“We analysed the diversity of application contexts, highlighted trends of uses and proposed future recommendations for improved applications of the ES matrix. Amongst the main patterns observed, the ES matrix approach allows for the assessment of a higher number of ES than other ES assessment methods. ES can be jointly assessed with indicators for ecosystem condition and biodiversity in the ES matrix,” explains Campagne.

“Although the ES matrix allows us to consider many data sources to achieve the assessment scores for the individual ES, these were mainly used together with expert-based scoring (73%) and/or ES scores that were based on an already-published ES matrix or deduced by information found in related scientific publications (51%),” she elaborates.

In 29% of the studies, an already existing matrix was used as an initial matrix for the assessment and in 16% no other data were used for the matrix scores or no adaptation of the existing matrix used was made.

“Nevertheless, we recommend using only scores assessed for a specific study or, if one wishes to use pre-existing scores from another study, revising them in depth, taking into account the local context of the new assessment,” she points out.

The researchers also note that 27% of the reviewed studies did not clearly explain their methodology, underlining a lack of clarity about how the data had been used and where the scores came from. Although some studies addressed the need to consider variabilities and uncertainties in ES assessments, only a minority (15%) did so. Thus, the team also recommends systematically reporting and considering variabilities and uncertainties in each ES assessment.

“We emphasise the need for all scientific studies to describe clearly and extensively the whole methodology used to score or evaluate ES, in order to be able to rate the quality of the scores obtained. The increasing number of studies that use the ES matrix approach confirms its success, appropriateness, flexibility and utility to generate information for decision-making, as well as its ability to increase awareness of ES, but the application of the ES matrix has to become more transparent and integrate more variability analyses,” they conclude.

###

Original source:

Campagne CS, Roche P, Müller F, Burkhard B (2020) Ten years of ecosystem services matrix: Review of a (r)evolution. One Ecosystem 5: e51103. https://doi.org/10.3897/oneeco.5.e51103

 

Press release from Pensoft Publishers

A roadmap for effective treatment of COVID-19

Study outlines key immunological factors underlying COVID-19 disease progression and proposes a range of drugs that may be repurposed to treat the disease

Picture by Steve Buissinne

Due to the devastating worldwide impact of COVID-19, the illness caused by the SARS-CoV-2 virus, there have been unprecedented efforts by clinicians and researchers around the world to quickly develop safe and effective treatments and vaccines. Given that COVID-19 is a complex new disease with no existing vaccine or specific treatment, much effort is being made to investigate the repurposing of approved and available drugs, as well as those under development.

In Frontiers in Immunology, a team of researchers from the U.S. Food and Drug Administration review all of the COVID-19 clinical and research findings to date. They provide a breakdown of key immunological factors underlying the clinical stages of COVID-19 illness that could potentially be targeted by existing therapeutic drugs.

Dr. Montserrat Puig of the U.S. Food and Drug Administration, senior author of the review, stated that “there are multiple factors involved in determining if the patient’s immune response will be insufficient or successful in combating the infection. Our review is an overview of these factors and how they can be considered to define the context in which medications currently used for other diseases, or development of novel agents, can be utilized to prevent, ameliorate or cure COVID-19.”

We know that during the early stage of COVID-19 people can show no symptoms or mild symptoms, and for many the disease resolves.

For others it can be catastrophic. The illness can progress to a severe stage with manifestations including Acute Respiratory Distress Syndrome, accompanied by severe lung inflammation and damage. Patients with severe COVID-19 are often admitted to intensive care units and require life support with mechanical ventilation.

This review compiles and summarizes up-to-date published studies unraveling the factors leading to the cytokine storm and its consequences observed in COVID-19, including the immunological events underlying the severe manifestations of the disease.

The analysis is further supplemented with knowledge previously acquired from other coronaviruses including SARS-CoV and MERS-CoV.

The authors underscore key immunological events that might tip the balance from a protective to a hyperinflammatory response leading to life-threatening conditions. They outline a promising list of currently available drugs that are either under study or under consideration for use in COVID-19 based on their potential to influence these key immunological events.

These drugs include those that could inhibit SARS-CoV-2 entry into host cells, antivirals with the potential to block SARS-CoV-2 replication or factors that could boost the antiviral response, monoclonal antibodies targeting pro-inflammatory cytokines that drive the hyperinflammatory response and therapeutics that could improve the function of the lungs.

Puig states that “approaches to therapy in the early stage of the disease will differ from those in its severe late stage,” adding that “as the results of clinical trials become available, it may become increasingly clear that there is likely no single magic bullet to resolve the disease; rather, a combination of several interventions that target different key factors of COVID-19 may well be required.”

Puig cautions that “the research and data obtained from COVID-19 studies are rapidly evolving and continuously updated. Thus, as clearly stated in our review, the information provided is a ‘lessons learned’ to date and describes the knowledge available at the time of the publication of the review.”

The description of the immunological profile of the clinical stages of COVID-19 provided in this review will enable more informed decisions about the type and timing of treatments to be evaluated in clinical trials.

Puig explains that “our hope is that the information contained in our review will help professionals in COVID-19 research develop new tools and agents to better treat those at high risk of severe COVID-19.”

 

Press release on the roadmap for the COVID-19 treatment from Frontiers

When pollen is in short supply, bumblebees damage plant leaves in a way that accelerates flower production, as an ETH research team headed up by Consuelo De Moraes and Mark Mescher has demonstrated.

Spring has sprung earlier than ever before this year, accompanied by temperatures more typical of early summertime. Many plants were already in full bloom by mid-April, about three to four weeks earlier than normal. These types of seasonal anomalies are becoming increasingly frequent due to climate change, and the resulting uncertainty threatens to disrupt the timing of mutualistic relationships between plants and their insect pollinators.

A research team led by ETH Professors Consuelo De Moraes and Mark Mescher has now discovered that one peculiar bumblebee behaviour may help to overcome such challenges by facilitating coordination between the bees and the plants they pollinate. The group has found that bumblebee workers use their mouth parts to pinch into the leaves of plants that haven’t flowered yet, and that the resulting damage stimulates the production of new flowers that bloom earlier than those on plants that haven’t been given this “nudge”.

Their study has just been published in the journal Science. “Previous work has shown that various kinds of stress can induce plants to flower, but the role of bee-inflicted damage in accelerating flower production was unexpected,” Mescher says.

bumblebees pollen
If bumblebees find too little pollen, they pierce the leaves of non-flowering plants in order to force them to produce flowers more quickly. Credits: Hannier Pulido / ETH Zurich

Surprising behaviour from bumblebees

The researchers first noticed the behaviour during other experiments being undertaken by one of the authors, Foteini Pashalidou: pollinators were biting the leaves of test plants in the greenhouse. “On further investigation, we found that others had also observed such behaviours, but no one had explored what the bees were doing to the plants or reported an effect on flower production,” Mescher explains.

Following up on their observations, the ETH researchers devised several new laboratory experiments, and also conducted outdoor studies using commercially available bumblebee colonies – typically sold for the pollination of agricultural crops – and a variety of plant species.

Based on their lab studies, the researchers were able to show that the bumblebees’ propensity to damage leaves has a strong correlation with the amount of pollen they can obtain: Bees damage leaves much more frequently when there is little or no pollen available to them. They also found that damage inflicted on plant leaves had dramatic effects on flowering time in two different plant species. Tomato plants subjected to bumblebee biting flowered up to 30 days earlier than those that hadn’t been targeted, while mustard plants flowered about 14 days earlier when damaged by the bees.

“The bee damage had a dramatic influence on the flowering of the plants – one that has never been described before,” De Moraes says. She also suggests that the developmental stage of the plant when it is bitten by bumblebees may influence the degree to which flowering is accelerated, a factor the investigators plan to explore in future work.

The researchers tried to manually replicate the damage patterns caused by bees to see if they could reproduce the effect on flowering time. But, while this manipulation did lead to somewhat earlier flowering in both plant species, the effect was not nearly as strong as that caused by the bees themselves. This leads De Moraes to suggest that some chemical or other cue may also be involved. “Either that or our manual imitation of the damage wasn’t accurate enough,” she says. Her team is currently trying to identify the precise cues responsible for inducing flowering and to characterise the molecular mechanisms involved in the plant response to bee damage.

 

Phenomenon also observed in the field

The ETH research team was also able to observe the bees’ damaging behaviour under more natural conditions, with doctoral student Harriet Lambert leading follow-up studies on the rooftops of two ETH buildings in central Zurich. In these experiments, the researchers again observed that hungry bumblebees with insufficient pollen supplies frequently damaged the leaves of non-blooming plants. But the damaging behaviour was consistently reduced when the researchers made more flowers available to the bees.

Furthermore, it was not only captive-bred bumblebees from the researchers’ experimental colonies that damaged plant leaves. The investigators also observed wild bees from at least two additional bumblebee species biting the leaves of plants in their experimental plots. Other pollinating insects, such as honeybees, did not exhibit such behaviour, however: they seemed to ignore the non-flowering plants entirely, despite being frequent visitors to nearby patches of flowering plants.

Delicate balance starting to tip

“Bumblebees may have found an effective method of mitigating local shortages of pollen,” De Moraes says. “Our open fields are abuzz with other pollinators, too, which may also benefit from the bumblebees’ efforts.” But it remains to be seen whether this mechanism is sufficient to overcome the challenges of changing climate. Insects and flowering plants have evolved together, sharing a long history that strikes a delicate balance between efflorescence and pollinator development. However, global warming and other anthropogenic environmental changes have the potential to disrupt the timing of these and other ecologically important interactions among species. Such rapid environmental change could result in insects and plants becoming increasingly out of sync in their development, for example. “And that’s something from which both sides stand to lose,” Mescher says.

 

Reference paper on bumblebees when pollen is in short supply

 

Pashalidou FG, Lambert H, Peybernes T, Mescher MC, De Moraes CM. Bumble bees damage plant leaves and accelerate flower production when pollen is scarce. Science, published online May 21st, 2020. DOI: 10.1126/science.aay0496

 

Press release from ETH Zürich

Astronomers using the Atacama Large Millimeter/submillimeter Array (ALMA) have found quasi-periodic flickers in millimeter waves from Sagittarius (Sgr) A*, at the center of the Milky Way. The team interpreted these blinks as being due to the rotation of radio spots circling the supermassive black hole with an orbit radius smaller than that of Mercury. This is an interesting clue for investigating space-time under extreme gravity.

“It has been known that Sgr A* sometimes flares up at millimeter wavelengths,” says Yuhei Iwata, lead author of the paper published in the Astrophysical Journal Letters and a graduate student at Keio University, Japan. “This time, using ALMA, we obtained high-quality data on the radio-wave intensity variation of Sgr A* for 10 days, 70 minutes per day. We then found two trends: quasi-periodic variations with a typical time scale of 30 minutes and hour-long slow variations.”

The different color dots show the flux at different frequencies (blue: 234.0 GHz, green: 219.5 GHz, red: 217.5 GHz). Variations with about a 30-minute period are seen in the diagram. Credits: Y. Iwata et al./Keio University

Astronomers presume that Sgr A* harbors a supermassive black hole with a mass of 4 million Suns. Flares of Sgr A* have been observed not only at millimeter wavelengths, but also in infrared light and X-rays. However, the variations detected with ALMA are much smaller than those previously detected, and it is possible that small variations at these levels always occur in Sgr A*.

ALMA Milky Way
Hot spots circling around the black hole could produce the quasi-periodic millimeter emission detected with ALMA. Credits: Keio University

The black hole itself does not produce any kind of emission; the source of the emission is the scorching gaseous disk around it. The gas does not fall straight into the gravitational well, but rotates around the black hole, forming an accretion disk.

The team focused on short-timescale variations and found that the 30-minute variation period is comparable to the orbital period of the innermost edge of the accretion disk, which has a radius of 0.2 astronomical units (1 astronomical unit is the distance between the Earth and the Sun: 150 million kilometers). For comparison, Mercury, the solar system’s innermost planet, circles the Sun at a distance of 0.4 astronomical units. Given the colossal mass of the black hole, gravitational effects in the accretion disk are also extreme.
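The quoted numbers can be checked with a quick back-of-the-envelope Keplerian estimate. The sketch below is purely Newtonian, so it is only rough (at 0.2 astronomical units from a 4-million-solar-mass black hole, general-relativistic corrections are significant), but it recovers an orbital period of a few tens of minutes, consistent with the observed ~30-minute variations:

```python
import math

G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30  # solar mass, kg
AU = 1.496e11     # astronomical unit, m

M = 4.0e6 * M_SUN  # Sgr A* mass, ~4 million Suns
r = 0.2 * AU       # inner-edge radius quoted in the press release

# Newtonian (Keplerian) orbital period: T = 2*pi*sqrt(r^3 / (G*M)).
T = 2 * math.pi * math.sqrt(r**3 / (G * M))
print(T / 60)  # period in minutes: a few tens of minutes
```

The Newtonian answer comes out around 24 minutes, i.e. the same order as the 30-minute quasi-period; the exact observed value depends on the strong-field dynamics near the horizon, which this estimate deliberately ignores.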

“This emission could be related with some exotic phenomena occurring at the very vicinity of the supermassive black hole,” says Tomoharu Oka, a professor at Keio University.

Their scenario is as follows: hot spots sporadically form in the disk and circle around the black hole, emitting strong millimeter waves. According to Einstein’s theory of special relativity, the emission is greatly amplified when the source moves toward the observer at a speed comparable to that of light. The rotation speed of the inner edge of the accretion disk is large enough for this extraordinary effect to arise, and the astronomers believe this is the origin of the short-term variation of the millimeter emission from Sgr A*.
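The size of this beaming effect can be illustrated with the standard relativistic Doppler factor. The orbital speed β = 0.5 below is an assumed value chosen only for illustration, and the δ³ flux scaling is the usual approximation for a compact moving blob, not a result from the paper:

```python
import math

def doppler_factor(beta, cos_theta):
    """Relativistic Doppler factor: delta = 1 / (gamma * (1 - beta*cos_theta))."""
    gamma = 1.0 / math.sqrt(1.0 - beta**2)
    return 1.0 / (gamma * (1.0 - beta * cos_theta))

beta = 0.5  # assumed orbital speed, half the speed of light
approaching = doppler_factor(beta, 1.0)   # spot moving toward the observer
receding = doppler_factor(beta, -1.0)     # spot moving away

# For a compact emitting blob, observed flux scales roughly as delta**3,
# so the approaching side of the orbit appears far brighter.
print(round(approaching**3 / receding**3, 1))  # -> 27.0
```

Even at half the speed of light, the approaching side of the orbit appears 27 times brighter than the receding side, which is why a hot spot circling the black hole modulates the observed millimeter flux so strongly.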

The team supposes that the variation might affect efforts to image the supermassive black hole with the Event Horizon Telescope. “In general, the faster the movement is, the more difficult it is to take a photo of the object,” says Oka. “Instead, the variation of the emission itself provides compelling insight into the gas motion. We may witness the very moment of gas absorption by the black hole with a long-term monitoring campaign with ALMA.” The researchers aim to draw out independent information to understand the mystifying environment around the supermassive black hole.

###

The research team members are: Yuhei Iwata (Keio University), Tomoharu Oka (Keio University), Masato Tsuboi (Japan Aerospace Exploration Agency/The University of Tokyo), Makoto Miyoshi (National Astronomical Observatory of Japan/SOKENDAI), and Shunya Takekawa (National Astronomical Observatory of Japan).

Press release on ALMA spotting twinkling heart of Milky Way from the National Institutes of Natural Sciences