Compiled Reports
  • Abstract:

    Monitoring covers industry news, highly cited articles, leading scholars in the field, research project plans, and more. Stay tuned!

  • Abstract:

    One of the central tenets of biology is that information flows from DNA to RNA to encode proteins, which carry out the cell's functions. Arguably just as critical as the genetic code itself is the timing of this information flow. By producing the right RNA and the right proteins at the right time, a cell can effectively strategize its survival and success. One such regulatory element, the riboswitch, has excited interest as a potential target for antibiotics. After more than 10 years of research, Prof. Harald Schwalbe's research group at Goethe University, in collaboration with the Landick group at the University of Wisconsin, Prof. Jens Wöhnert from Goethe University's Biology Department and the Süß group at the Technische Universität Darmstadt, has put together the puzzle pieces of a riboswitch-based regulatory process in the bacterium Bacillus subtilis, presenting the most extensive model to date of the timing of riboswitch action.

    A riboswitch is a short piece of RNA that can fold into different structures, depending on whether a small messenger molecule binds to it. In transcriptional riboswitches, these different structures signal the nearby RNA polymerase to continue producing RNA or to stop. In their recent publication in eLife, the Schwalbe group and their collaborators presented molecular structures of the xpt-pbuX riboswitch in the off-position after synthesis and in the on-position upon binding of the small messenger molecule guanine. They also demonstrated that switching to the on-position takes a measurable amount of time, which imposes a timing requirement on the regulatory process.

    As RNA polymerase moves along a DNA strand, producing the corresponding RNA, it reaches the code for the xpt-pbuX switch, makes the riboswitch, and continues on. If guanine is not around, the RNA polymerase detects the default off-position and halts synthesis. If guanine binds the riboswitch, however, the riboswitch needs to refold into the on-position, and RNA polymerase has to wait long enough to detect the new conformation. Otherwise, it would always read "off," and that gene would never be read. Schwalbe and coworkers found that just such a pause does exist, and it is encoded in the DNA. After producing the xpt-pbuX switch, the RNA polymerase encounters this "pause site" on the DNA and slows down, allowing the right amount of time for the riboswitch to refold.
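
    The timing argument above is essentially a race between ligand-induced refolding and the polymerase leaving its pause site. A minimal kinetic sketch, assuming first-order folding and purely illustrative rate constants (not values from the study):

    import numpy as np

    # Assumed refolding rate of the guanine-bound riboswitch into the on-state (1/s).
    k_fold = 0.5
    # Assumed polymerase dwell times at the pause site (s).
    pause_times = np.array([1.0, 5.0, 20.0])

    # For first-order folding, the probability that the switch reaches the
    # on-state before RNA polymerase resumes is 1 - exp(-k_fold * t_pause).
    p_on = 1.0 - np.exp(-k_fold * pause_times)
    for t, p in zip(pause_times, p_on):
        print(f"pause {t:5.1f} s -> P(refolded in time) = {p:.2f}")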

    Story Source:

    Materials provided by Goethe University Frankfurt. Note: Content may be edited for style and length.

    Journal Reference:

    Hannah Steinert, Florian Sochor, Anna Wacker, Janina Buck, Christina Helmling, Fabian Hiller, Sara Keyhani, Jonas Noeske, Steffen Grimm, Martin M Rudolph, Heiko Keller, Rachel Anne Mooney, Robert Landick, Beatrix Suess, Boris Fürtig, Jens Wöhnert, Harald Schwalbe. Pausing guides RNA folding to populate transiently stable RNA structures for riboswitch-based transcription regulation. eLife, 2017; 6 DOI: 10.7554/eLife.21297

  • Abstract:

    The key to producing better crops to meet the needs of the world's growing population may lie in combining the traditional knowledge of subsistence farmers of the Ethiopian highlands with plant genomics. Researchers in Italy and Ethiopia demonstrated that the indigenous knowledge of traditional farmers, passed down from one generation to the next for hundreds of years, can be measured quantitatively and used with advanced genomic and statistical methods to identify genes responsible for farmers' preferences in wheat.

    The livelihood of hundreds of millions of people living in smallholder farming systems depends on the products they obtain from marginal fields. Smallholder farmers are very knowledgeable about what they grow, because they must be efficient in selecting the crop varieties that will ensure the subsistence of their household. With an innovative approach that overturns the classical scheme of modern plant breeding, the authors of this research developed a method to extract traditional knowledge from smallholder farmers and use it to inform modern, genomics-driven breeding.

    The result of this original research, conducted by scientists from the Institute of Life Sciences of Scuola Superiore Sant'Anna in Pisa and from Bioversity International, with the participation of the University of Bologna, the Amhara Agricultural Research Center, and Mekelle University in Ethiopia, was published in Frontiers in Plant Science. Researchers worked with 60 farmers from two smallholder farming communities in the Ethiopian highlands. For two straight weeks, farmers evaluated 400 wheat varieties for traits of interest, side by side with geneticists measuring agronomic traits on the same plots. The evaluation produced more than 200,000 data points, which the researchers related to 30 million molecular data points derived from the genomic characterization of the wheat varieties. This approach made it possible, for the first time, to identify genomic regions responding to smallholder farmers' traditional knowledge, demonstrating that farmers may identify genomic regions relevant for breeding beyond those identified by classic metric measurements of traits alone.
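
    The per-marker association scan described above can be illustrated with a toy example: regress a preference score on each marker independently and look for outlying p-values. Everything below is synthetic; the real study used about 400 varieties and some 30 million markers, and its statistical model is described in the paper.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_varieties, n_markers = 400, 1000
    # Synthetic genotypes coded as 0/1/2 allele counts.
    genotypes = rng.integers(0, 3, size=(n_varieties, n_markers))
    # Synthetic farmer-preference score with a planted effect at marker 123.
    preference = 0.8 * genotypes[:, 123] + rng.normal(size=n_varieties)

    # One simple linear regression per marker.
    pvals = np.empty(n_markers)
    for j in range(n_markers):
        result = stats.linregress(genotypes[:, j], preference)
        pvals[j] = result.pvalue

    print("strongest association at marker", pvals.argmin())  # recovers 123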

    "This study is a milestone in modern crop breeding" says Matteo Dell'Acqua, geneticist at the Scuola Sant'Anna and coordinator of the research "as it is the first in demonstrating that the traditional knowledge of smallholder farmers has a genetic basis that can be extracted with methods already available to the scientific community. These farmers can teach us how to produce crop varieties adapted to local agriculture, fighting food insecurity in farming systems most exposed to climate change"

    Story Source:

    Materials provided by Frontiers. Note: Content may be edited for style and length.

    Journal Reference:

    Yosef G. Kidane, Chiara Mancini, Dejene K. Mengistu, Elisabetta Frascaroli, Carlo Fadda, Mario Enrico Pè, Matteo Dell'Acqua. Genome Wide Association Study to Identify the Genetic Base of Smallholder Farmer Preferences of Durum Wheat Traits. Frontiers in Plant Science, 2017; 8 DOI: 10.3389/fpls.2017.01230

  • Abstract:

    Heavy rainfall can cause rivers and drainage systems to overflow or dams to break, leading to flood events that damage property and road systems, as well as potential loss of human life.

    One such event in 2008 cost the state of Iowa $10 billion in damages. After the flood, the Iowa Flood Center (IFC) at the University of Iowa (UI) was established as the first center in the United States for advanced flood-related research and education.

    Today, simplified 2-D flood models are the state of the art for predicting flood wave propagation, or how floods spread across land. A team at IFC, led by UI Professor George Constantinescu, is creating 3-D non-hydrostatic flood models that can more accurately simulate flood wave propagation and account for the interaction between the flood wave and large obstacles such as dams or floodplain walls. These 3-D models also can be used to assess and improve the predictive capabilities of the 2-D models that government agencies and consulting companies use for predicting how floods will spread and the associated risks and hazards.

    Using one of the world's most powerful supercomputers -- Titan, the 27-petaflop Cray XK7 at the Oak Ridge Leadership Computing Facility (OLCF) -- Constantinescu's team performed one of the first highly resolved, 3-D, volume-of-fluid Reynolds-averaged Navier-Stokes (RANS) simulations of a dam break in a natural environment. The simulation allowed the team to map precise water levels for actual flood events over time. RANS is a widely used method for modeling turbulent flows.

    "Flood events, like those generated by dam breaks, can be computationally very expensive to simulate," Constantinescu said. "Previously, there wasn't enough computer power to do these kinds of time-accurate simulations in large computational domains, but with the power of high-performance computing [HPC] and Titan, we are achieving more than was previously thought possible."

    The project was supported in 2015 and 2016 within the OLCF's Director's Discretionary user program. The OLCF, a US Department of Energy (DOE) Office of Science User Facility located at DOE's Oak Ridge National Laboratory, provides HPC resources for research and development projects to advance scientific discovery.

    The team's 3-D simulations showed that commonly used 2-D models may inaccurately predict some aspects of flooding, such as the time over which dangerous flood levels last at certain locations and the amount of surface area flooded. Simulation results also demonstrated that 2-D models may underestimate the speed at which floods spread and overestimate the time at which flood waves reach their highest point.

    When the water sources that empty into a river rise simultaneously, they can trigger one or more successive flood waves. Accuracy of the 1-D, 2-D, or 3-D flood models that track how these waves move is crucial for predicting maximum flood depth, hazardous conditions, and other variables.

    "We need to know what's going to happen for situations in which a dam breaks," Constantinescu said. "We need to know who's going to be affected, how much time they will have to evacuate, and what else might happen to the environment as a result."

    Because 2-D models make simplified assumptions about some aspects of the flow, they can't account for changes in the flow, such as when the flood wave moves around large obstacles, changes rapidly in direction, or fully immerses bridge decks. The team needed a leadership-class supercomputer to run the 3-D simulations and accurately capture these changes.

    Titan Changes the Current

    Using a fully non-hydrostatic 3-D RANS solver, the team performed the first simulations of the hypothetical failure of two Iowa dams: the Coralville Dam in Iowa City and the Saylorville Dam in Des Moines. Each used a computational grid of about 30-50 million cells and covered a physical area of about 20 miles by 5 miles.

    The team used the state-of-the-art computational fluid dynamics software STAR-CCM+. This software features a volume-of-fluid method to track the position of the water's free surface -- the areas where water meets the air. In a scalability study, the team determined the peak performance of the code for the dam break simulations. The researchers used 2,500 of Titan's CPU processors for peak performance in each simulation.

    The researchers also computed the same dam break test cases using a standard 2-D model commonly used by IFC. When they compared the 2-D results against those of the 3-D simulations, they found that the 2-D model underestimated how quickly the flood wave moved across land and overestimated the time at which the maximum flood occurred. This finding is important because government agencies and consulting companies use 2-D shallow flow models to predict dam breaks and floods, as well as to estimate flood hazards.
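
    For intuition on what these solvers compute, the classical 1-D idealization of a dam break (Ritter's analytical solution for an instantaneous failure over a dry, frictionless bed) is often used as a sanity check for shallow-water codes. The sketch below uses assumed parameters and is not the team's STAR-CCM+ setup:

    import numpy as np

    g, h0 = 9.81, 10.0                   # gravity (m/s^2); assumed reservoir depth (m)
    c0 = np.sqrt(g * h0)                 # initial wave celerity
    t = 60.0                             # time after failure (s)
    x = np.linspace(-2000.0, 2000.0, 9)  # distance from the dam (m)

    # Ritter's solution: undisturbed reservoir behind the rarefaction,
    # a parabolic depth profile in the fan, dry bed ahead of the front.
    h = np.where(x <= -c0 * t, h0,
        np.where(x >= 2 * c0 * t, 0.0,
                 (2 * c0 - x / t) ** 2 / (9 * g)))
    for xi, di in zip(x, h):
        print(f"x = {xi:7.1f} m  depth = {di:5.2f} m")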

    "By performing these 3-D simulations, we provided a huge data set that can be used to improve the accuracy of existing 2-D and 1-D flood models," Constantinescu said. "We can also examine the effectiveness of deploying flood protection structures for different flooding scenarios." The team ultimately showed that HPC can be used successfully to answer engineering questions related to the consequences of structural failure of dams and related hazards.

    Constantinescu said that as computers become faster and more powerful, simulations of full flooding events over larger physical regions will be possible. Summit, the OLCF's next-generation supercomputer scheduled to come online in 2018, will open new possibilities for Constantinescu's research.

    "Advances in numerical algorithms, automatic grid generation, and increased supercomputer power will eventually make the simulations of flood waves over large durations of time possible using Titan, and even more so with Summit," Constantinescu said. "Eventually, things we previously had to do by hand, such as generating a high-quality computational grid, will just be part of the typical software package."

    Story Source:

    Materials provided by DOE/Oak Ridge National Laboratory. Note: Content may be edited for style and length.

  • Abstract:

    Radioactive contamination is the unwanted presence of radioactive substances in the environment. Our environment contains naturally occurring and anthropogenic radionuclides, unstable isotopes that release radiation as they decay into more stable forms, and these are present in the air, soil, rain and so on. These radionuclides can be transferred along the food chain until they reach humans, posing a potential health risk.

    Until now, research on the presence of radionuclides in products for human consumption, and on their subsequent transfer, has focused fundamentally on foods such as meat, fish or milk, without considering a foodstuff like fungi, which are well known for accumulating some radionuclides in their fruiting bodies.

    In response, the Environmental Radioactivity Laboratory of the University of Extremadura (LARUEX) has carried out a study to quantify the radioactive presence in this foodstuff. The study's author, Javier Guillén, explains that "this quantification is made using transfer coefficients that compare the radioactive content in the receptor compartment of the radioactive contamination, that is to say in the fungi, to that existing in the transmitter compartment, which in this case would be the soil."
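
    In other words, the transfer coefficient is a ratio of activity concentrations. A worked example with made-up numbers (not the study's measurements):

    # Assumed activity concentrations, in Bq/kg dry weight.
    activity_fungi = 40.0   # receptor compartment: fruiting bodies
    activity_soil = 200.0   # transmitter compartment: soil

    transfer_coefficient = activity_fungi / activity_soil
    print(f"TF = {transfer_coefficient:.2f}")  # 0.20: fungi hold 20% of the soil concentration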

    To conduct this research, the authors considered the base level of radionuclides established in ecosystems with low radioactive content, such as their own region, and then used software called the ERICA Tool which, as the researcher explains, "allows one to enter the transfer coefficient from the soil to the organism -- in this case the fungus -- thus calculating the dose of radionuclides a non-human organism receives."

    From the study, we may conclude that the estimated dose rates for fungi in Spain are similar to those determined for other organisms (animals and plants), and therefore fungi can be used when assessing the presence or absence of radioactive contamination in the soil. As the researcher asserts, "even though it is not strictly necessary to include fungi amongst the existing instruments and frameworks of assessment, they can be used in ecosystems which may require them, based on criteria such as biodiversity."

    Moreover, the fungi analysed, which are concentrated in the Mediterranean area, do not contain a high dose of radionuclides, meaning there is no environmental contamination and they are perfectly suitable for human consumption.

    Story Source:

    Materials provided by University of Extremadura. Note: Content may be edited for style and length.

    Journal Reference:

    J. Guillén, A. Baeza, N.A. Beresford, M.D. Wood. Do fungi need to be included within environmental radiation protection assessment models? Journal of Environmental Radioactivity, 2017; 175-176: 70 DOI: 10.1016/j.jenvrad.2017.04.014

  • Abstract:

    Harvard University researchers have resolved a conflict in estimates of how much the Earth will warm in response to a doubling of carbon dioxide in the atmosphere.

    That conflict -- between temperature ranges based on global climate models and paleoclimate records and ranges generated from historical observations -- prevented the United Nations' Intergovernmental Panel on Climate Change (IPCC) from providing a best estimate in its most recent report for how much the Earth will warm as a result of a doubling of atmospheric CO2.

    The researchers found that the low range of temperature increase -- between 1 and 3 degrees Celsius -- offered by the historical observations did not take into account long-term warming patterns. When these patterns are taken into account, the researchers found that not only do temperatures fall within the canonical range of 1.5 to 4.5 degrees Celsius but that even higher ranges, perhaps up to 6 degrees, may also be possible.

    The research is published in Science Advances.

    It's well documented that different parts of the planet warm at different speeds. The land over the northern hemisphere, for example, warms significantly faster than water in the Southern Ocean.

    "The historical pattern of warming is that most of the warming has occurred over land, in particular over the northern hemisphere," said Cristian Proistosescu, PhD '17, and first author of the paper. "This pattern of warming is known as the fast mode -- you put CO2 in the atmosphere and very quickly after that, the land in the northern hemisphere is going to warm."

    But there is also a slow mode of warming, which can take centuries to realize. That warming, which is most associated with the Southern Ocean and the Eastern Equatorial Pacific, comes with positive feedback loops that amplify the process. For example, as the oceans warm, cloud cover decreases and a white reflecting surface is replaced with a dark absorbent surface.

    The researchers developed a mathematical model to parse the two different modes within different climate models.
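
    A minimal sketch of such a two-mode model: treat the warming after a step increase in CO2 forcing as the sum of a fast and a slow exponential response. The amplitudes and timescales below are illustrative placeholders, not the paper's fitted values.

    import numpy as np

    tau_fast, tau_slow = 4.0, 250.0   # response timescales in years (assumed)
    a_fast, a_slow = 2.0, 2.0         # equilibrium warming per mode in K (assumed)

    def warming(t_years):
        """Warming after a step CO2 doubling: fast mode plus slow mode."""
        return (a_fast * (1.0 - np.exp(-t_years / tau_fast))
                + a_slow * (1.0 - np.exp(-t_years / tau_slow)))

    for t in (10, 50, 150, 1000):
        print(f"year {t:4d}: {warming(t):.2f} K")
    # A fit to the early decades sees mostly the fast mode and so
    # underestimates the equilibrium response, the bias discussed above.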

    "The models simulate a warming pattern like today's, but indicate that strong feedbacks kick in when the Southern Ocean and Eastern Equatorial Pacific eventually warm, leading to higher overall temperatures than would simply be extrapolated from the warming seen to date," said Peter Huybers, Professor of Earth and Planetary Sciences and of Environmental Science and Engineering at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) and co-author of the paper.

    Huybers and Proistosescu found that while the slow mode of warming contributes a great deal to the ultimate amount of global warming, it is barely present in present-day warming patterns. "Historical observations give us a lot of insight into how climate changes and are an important test of our climate models," said Huybers, "but there is no perfect analogue for the changes that are coming."

    Story Source:

    Materials provided by Harvard John A. Paulson School of Engineering and Applied Sciences. Note: Content may be edited for style and length.

    Journal Reference:

    Cristian Proistosescu, Peter J. Huybers. Slow climate mode reconciles historical and model-based estimates of climate sensitivity. Science Advances, 2017; 3 (7): e1602821 DOI: 10.1126/sciadv.1602821

  • Abstract:

    In addition to the life on Earth's surface and in its oceans, ecosystems have evolved deep beneath us, in a realm dubbed the "deep biosphere" that stretches several kilometers down into the bedrock. Down there, conditions are harsh, and life is forced to adjust to a lifestyle we at the surface would call extreme. One major difference from surface conditions is the lack of oxygen, a molecule we take for granted and consider a prerequisite for survival, but which subsurface life has to cope without.

    Knowledge about ancient life in this deep environment is extremely scarce, and most studies so far have focused on prokaryotes. A new study by an international team of researchers, led by Dr Henrik Drake of Linnaeus University and Dr Magnus Ivarsson of the Swedish Museum of Natural History, sheds light on eukaryotes in this deep setting. They present the first in situ finding of fungi at great depth in the bedrock: ancient life found 740 m below the ground surface, representing a new piece in the deep-biosphere puzzle.

    Henrik Drake, lead author of the study, explains the discovery: "In a cavity hidden within a vein in a drill core I was examining, there were beautiful mineral crystals and abundant mycelium of fungal hyphae. To me this was like observing a small community frozen in time."

    Magnus Ivarsson tells more about the fungi: "Our detailed synchrotron-based investigations clearly prove that these are fungi adapted to anaerobic conditions. The fungi are partly mineralized and partly organically preserved, which is a rare find that tells how organisms in this environment are fossilized and preserved."

    High-spatial-resolution isotope analysis of the minerals that occur along with the fungi revealed that a variety of microbial processes had occurred in the cavity, including methane consumption and sulfate reduction. The fungi could not be dated precisely, but proxies point to an age of tens of millions of years.

    The study confirms a previously hypothesized consortium between fungi and sulfate-reducing bacteria, a coupling that until now lacked direct evidence in nature. As the fungi provide hydrogen gas that fuels the prokaryotes, the findings suggest a re-evaluation of energy cycling within the energy-poor deep continental biosphere. Eukaryotes have been neglected in deep-biosphere research; this new finding suggests they may be key players in this globally vast realm.

    Studies of subterranean life-forms have implications for early life on our planet and for life on other planets, where hostile conditions may have inhibited colonization of the surface.

    Story Source:

    Materials provided by Linnaeus University. Note: Content may be edited for style and length.

    Journal Reference:

    Henrik Drake, Magnus Ivarsson, Stefan Bengtson, Christine Heim, Sandra Siljeström, Martin J. Whitehouse, Curt Broman, Veneta Belivanova, Mats E. Åström. Anaerobic consortia of fungi and sulfate reducing bacteria in deep granite fractures. Nature Communications, 2017; 8 (1) DOI: 10.1038/s41467-017-00094-6

  • Abstract:

    JRC scientists have proposed a new approach for identifying the impacts of climate change and extreme weather on the variability of global and regional wheat production. The study analysed the effect of heat and water anomalies on crop losses over a 30-year period.

    JRC scientists studied the relative importance of heat stress and drought on wheat yields between 1980 and 2010. They developed a new Combined Stress Index in order to better understand the effects of concurrent heat and water stress events.
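
    As a rough illustration of the idea behind a combined index (the paper defines the actual Combined Stress Index; everything below is synthetic and purely schematic): standardize heat and water anomalies over the sensitive growing window, combine them, and ask how much of the year-to-year yield variability the combination explains.

    import numpy as np

    rng = np.random.default_rng(1)
    years = 30
    heat = rng.normal(size=years)    # standardized heat-stress anomaly
    water = rng.normal(size=years)   # standardized water anomaly (+ wet / - dry)
    # Synthetic yields hurt by heat concurrent with drought or water excess.
    yield_anom = -0.5 * heat * np.abs(water) + rng.normal(scale=0.5, size=years)

    csi = heat * np.abs(water)       # toy combined-stress term
    r = np.corrcoef(csi, yield_anom)[0, 1]
    print(f"variance explained (r^2) = {r**2:.2f}")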

    The study 'Wheat yield loss attributable to heat waves, drought and water excess at the global, national and subnational scales' was published in Environmental Research Letters earlier this month. It finds that heat stress concurrent with drought or water excess can explain about 40% of the changes in wheat yields from one year to another.

    One finding is that, contrary to common perception, water excess affects wheat production more than drought in several countries. Excessive precipitation and greater cloud cover, especially during sensitive development stages of the crop, are major contributors to reduced yields, as they help pests and diseases proliferate and make it harder for the plants to get the oxygen and light they need.

    In 2010, wheat contributed 20% of all dietary calories worldwide. It therefore plays a major role in global food security, with some countries particularly reliant on it. As climate change increases the duration, frequency and severity of extreme weather events, it has become increasingly urgent to identify their effects and provide early warnings, in order to ensure market stability and global food security.

    This study helps to better understand the role of weather factors in wheat production and global yield anomalies. It shows, for the first time, the effect of individual extreme events and their impacts on particular development stages of the crop (for example, the effect of drought during key development periods such as flowering and grain-filling).

    Compared to previous approaches, the Combined Stress Index has the advantage of being able to calculate the effect of single weather anomalies on total crop output at global and regional levels. Moreover, it offers a simple and practical tool to compute yield anomalies using seasonal climate forecasts and projections, thereby allowing better adaptation and mitigation strategies to be established.

    The model explicitly accounts for the effects of temperature and soil moisture changes (positive and negative) on global and regional wheat production fluctuations.

    A specific case study was carried out at subnational level in France, where wheat was found to be more sensitive to overly wet conditions. In other countries, heat stress and drought are the most important predictors of crop losses. For instance, in Mediterranean countries, drought has a bigger detrimental effect on wheat yields than heat stress.

    Story Source:

    Materials provided by European Commission, Joint Research Centre (JRC). Note: Content may be edited for style and length.

    Journal Reference:

    M Zampieri, A Ceglar, F Dentener, A Toreti. Wheat yield loss attributable to heat waves, drought and water excess at the global, national and subnational scales. Environmental Research Letters, 2017; 12 (6): 064008 DOI: 10.1088/1748-9326/aa723b

  • Abstract:

    The use of untreated wastewater from cities to irrigate crops downstream is 50 percent more widespread than previously thought, according to a new study published this week in the journal Environmental Research Letters.

    The study relies on advanced modeling methods to provide the first truly comprehensive estimate of the global extent to which farmers use urban wastewater on irrigated cropland. Researchers analyzed data with geographic information systems (GIS) rather than depending on case study results, as in previous studies.

    The researchers also assessed for the first time 'indirect reuse', which occurs when wastewater gets diluted but still remains a dominant component of surface water flows. Such situations account for the majority of agricultural water reuse worldwide, but have been difficult to quantify on a global level due to different views of what constitutes diluted wastewater versus polluted water.

    Considering consumer safety the foremost priority, study authors highlight the need to mitigate public health risks through measures taken along the entire food supply chain. This includes improved wastewater treatment, but also preventive steps on farms and in food handling, since capacity for water treatment is increasing only slowly in developing countries.

    According to the study, farmers' use of wastewater is most prevalent in regions where there is significant wastewater generation and water pollution. In these circumstances, and where safer water is in short supply, wastewater offers a consistent and reliable means of irrigating fields, including high-value crops, such as vegetables, which often require more water than staple foods. Where raw wastewater is available, farmers may tend to prefer it because of its high concentrations of nutrients, which can lessen the need to apply purchased fertilizers. In most cases, however, farmers' use of this water is motivated by basic needs; they simply do not have alternatives.

    "The de facto reuse of urban wastewater is understandable, given the combination of increasing water pollution and declining freshwater availability, as seen in many developing countries," said Anne Thebo, a recent graduate at the University of California, Berkeley in the USA and lead author of the study. "As long as investment in wastewater treatment lags far behind population growth, large numbers of consumers eating raw produce will face heightened threats to food safety."

    Results show that 65 percent of all irrigated areas are within 40 km downstream of urban centers and are affected by wastewater flows to a large degree. Of the total area of 35.9 million hectares, 29.3 million hectares are in countries with very limited wastewater treatment, exposing 885 million urban consumers as well as farmers and food vendors to serious health risks. Five countries -- China, India, Pakistan, Mexico and Iran -- account for most of this cropland. These new findings supersede a widely cited 2004 estimate, based on case studies in some 70 countries and expert opinion, which had put the cropland area irrigated with wastewater at a maximum of 20 million hectares.
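
    A quick arithmetic check on the figures above (values copied from the study as reported here):

    total_mha = 35.9              # wastewater-influenced irrigated cropland, Mha
    limited_treatment_mha = 29.3  # the portion in countries with very limited treatment
    print(f"{limited_treatment_mha / total_mha:.0%} of the affected area")  # ~82%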

    "Gaining a better grasp of where, why and to what extent farmers use wastewater for irrigation is an important step toward addressing the problem," said second author Pay Drechsel of the International Water Management Institute (IWMI), who leads the CGIAR Research Program on Water, Land and Ecosystems. "While actions aimed at protecting human health are the first priority, we can also limit the hazards through a variety of tested approaches aimed at safely recovering and reusing valuable resources from wastewater. These include the water itself but also energy, organic matter and nutrients, all of which agriculture needs. Through such approaches, we have been helping the World Health Organisation (WHO) respond to the wastewater challenge over the years."

    "We hope this new study will focus the attention of policy makers and sanitation experts on the need to fulfill Sustainable Development Goal 6, particularly target 3, which calls for halving the proportion of untreated wastewater, and increasing recycling and safe water reuse," added Drechsel.

    "One major challenges is to cultivate behavior change from farm to fork, especially where risk awareness is low. Another consists of larger scale efforts to put the recovery and reuse of resources from wastewater and other waste on a business footing to make its management more attractive for the public and private sectors. Safe resource recovery and reuse have significant potential to address the health and environmental risks, while at the same time making cities more resilient and agriculture more sustainable, contributing to more circular economies."

    Story Source:

    Materials provided by IOP Publishing. Note: Content may be edited for style and length.

    Journal Reference:

    A L Thebo, P Drechsel, E F Lambin, K L Nelson. A global, spatially-explicit assessment of irrigated croplands influenced by urban wastewater flows. Environmental Research Letters, 2017; 12 (7): 074008 DOI: 10.1088/1748-9326/aa75d1

  • Abstract:

    Green infrastructure is an attractive concept, but there is concern surrounding its effectiveness. Researchers at the University of Illinois at Urbana-Champaign are using a mathematical technique traditionally used in earthquake engineering to determine how well green infrastructure works and to communicate with urban planners, policymakers and developers.

    Green roofs are flat, vegetated surfaces on the tops of buildings that are designed to capture and retain rainwater and filter any that is released back into the environment.

    "The retention helps ease the strain that large amounts of rain put on municipal sewer systems, and filtration helps remove any possible contaminants found in the stormwater," said Reshmina William, a civil and environmental engineering graduate student who conducted the study with civil and environmental engineering professor Ashlynn Stillwell.

    A good-for-the-environment solution to mitigating stormwater runoff may seem like a no-brainer, but a common concern regarding green roofs is the variability of their performance. One challenge is figuring out how well the buildings that hold them up will respond to the increased and highly variable weight between wet and dry conditions. Another challenge is determining how well they retain and process water given storms of different intensity, duration and frequency, William said.

    While studying reliability analysis in one of her courses, William came up with the idea to use a seemingly unrelated mathematical concept called fragility curves to confront this problem.

    "Earthquake engineering has a similar problem because it is tough to predict what an earthquake is going to do to a building," William said. "Green infrastructure has a lot more variability, but that is what makes fragility curves ideal for capturing and defining the sort of dynamics involved."

    William and Stillwell chose to study green roofs over other forms of green infrastructure for a very simple reason: There was one on campus fitted with the instrumentation needed to measure soil moisture, rainfall amount, temperature, humidity and many other variables that are plugged into their fragility curve model.

    "This is a unique situation because most green roofs don't have monitoring equipment, so it is difficult for scientists to study what is going on," Stillwell said. "We are very fortunate in that respect."

    William said the primary goal of this research is to facilitate communication between scientists, policymakers, developers and the general public about the financial risk and environmental benefit of taking on such an expense.

    "One of the biggest barriers to the acceptance of green infrastructures is the perception of financial risk," William said. "People want to know if the benefit of a green roof is going to justify the cost, but that risk is mitigated by knowing when an installation will be most effective, and that is where our model comes in."

    The results of their model and risk analysis, which appear in the Journal of Sustainable Water in the Built Environment, provide a snapshot of green infrastructure performance for this particular green roof. The results from a single model do not yield a one-size-fits-all approach to green infrastructure evaluation, and William and Stillwell said that is one of the strengths of their technique. Adaptability across different technologies and environments is essential to any green infrastructure analysis.

    Story Source:

    Materials provided by University of Illinois at Urbana-Champaign. Original written by Lois Yoksoulian. Note: Content may be edited for style and length.

    Journal Reference:

    Reshmina William, Ashlynn S. Stillwell. Use of Fragility Curves to Evaluate the Performance of Green Roofs. Journal of Sustainable Water in the Built Environment, 2017; 3 (4): 04017010 DOI: 10.1061/JSWBAY.0000831
