The key to producing better crops to meet the needs of the world's growing population may lie in combining the traditional knowledge of subsistence farmers of the Ethiopian highlands with plant genomics. Researchers in Italy and Ethiopia have demonstrated that the indigenous knowledge of traditional farmers, passed from one generation to the next for hundreds of years, can be measured quantitatively and combined with advanced genomic and statistical methods to identify the genes underlying farmers' preferences in wheat.
The livelihood of hundreds of millions of people living in smallholder farming systems depends on the products they obtain from marginal fields. Smallholder farmers are very knowledgeable about what they grow, because they must be efficient in selecting the crop varieties that will ensure the subsistence of their household. With an innovative approach that overturns the classical scheme of modern plant breeding, the authors of this research developed a method to extract traditional knowledge from smallholder farmers and use it to inform modern, genomics-driven breeding.
The research, conducted by scientists from the Institute of Life Sciences of Scuola Superiore Sant'Anna in Pisa and from Bioversity International, with the participation of the University of Bologna, the Amhara Agricultural Research Center, and Mekelle University in Ethiopia, was published in Frontiers in Plant Science. Researchers worked with 60 farmers from two smallholder farming communities in the Ethiopian highlands. For two straight weeks, farmers evaluated 400 wheat varieties for traits of interest, side by side with geneticists measuring agronomic traits on the same plots. The evaluation yielded more than 200,000 data points, which the researchers related to 30 million molecular markers derived from the genomic characterization of the wheat varieties. This approach made it possible, for the first time, to identify genomic regions associated with smallholder farmers' traditional knowledge, demonstrating that farmers can point to genomic regions relevant for breeding beyond those identified by classic metric measurements of traits alone.
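The scan behind this result can be illustrated with a toy genome-wide association study: regress a trait or preference score on the genotype at each marker and rank markers by strength of association. The sketch below is a minimal Python illustration on simulated data using a simple linear-regression test; the marker counts, effect size, and model are assumptions for the example, and the actual study worked at far larger scale with more sophisticated statistics.

```python
import numpy as np

rng = np.random.default_rng(0)
n_varieties, n_markers = 400, 1000    # toy scale; the study used ~30 million markers

# Hypothetical genotype matrix: allele counts (0, 1, 2) per variety and marker.
genotypes = rng.integers(0, 3, size=(n_varieties, n_markers)).astype(float)

# Simulate a preference score driven by one causal marker plus noise.
causal = 123
score = 0.8 * genotypes[:, causal] + rng.normal(0.0, 1.0, n_varieties)

def assoc_t(y, g):
    """t-statistic for the slope of a simple regression of y on genotype g."""
    g_c = g - g.mean()
    y_c = y - y.mean()
    beta = (g_c @ y_c) / (g_c @ g_c)
    resid = y_c - beta * g_c
    se = np.sqrt((resid @ resid) / (len(y) - 2) / (g_c @ g_c))
    return beta / se

t_stats = np.array([assoc_t(score, genotypes[:, j]) for j in range(n_markers)])
top = int(np.abs(t_stats).argmax())
print(top)   # the simulated causal marker ranks first
```

Ranking markers by association strength in this way is what lets a quantitative preference score, like a measured agronomic trait, point to specific genomic regions.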
"This study is a milestone in modern crop breeding," says Matteo Dell'Acqua, geneticist at the Scuola Sant'Anna and coordinator of the research, "as it is the first to demonstrate that the traditional knowledge of smallholder farmers has a genetic basis that can be extracted with methods already available to the scientific community. These farmers can teach us how to produce crop varieties adapted to local agriculture, fighting food insecurity in the farming systems most exposed to climate change."
Materials provided by Frontiers. Note: Content may be edited for style and length.
Yosef G. Kidane, Chiara Mancini, Dejene K. Mengistu, Elisabetta Frascaroli, Carlo Fadda, Mario Enrico Pè, Matteo Dell'Acqua. Genome Wide Association Study to Identify the Genetic Base of Smallholder Farmer Preferences of Durum Wheat Traits. Frontiers in Plant Science, 2017; 8 DOI: 10.3389/fpls.2017.01230
Heavy rainfall can cause rivers and drainage systems to overflow or dams to break, leading to flood events that bring damage to property and road systems as well as potential loss of human life.
One such event in 2008 cost $10 billion in damages for the entire state of Iowa. After the flood, the Iowa Flood Center (IFC) at the University of Iowa (UI) was established as the first center in the United States for advanced flood-related research and education.
Today, simplified 2-D flood models are the state of the art for predicting flood wave propagation, or how floods spread across land. A team at IFC, led by UI Professor George Constantinescu, is creating 3-D non-hydrostatic flood models that can more accurately simulate flood wave propagation and account for the interaction between the flood wave and large obstacles such as dams or floodplain walls. These 3-D models also can be used to assess and improve the predictive capabilities of the 2-D models that government agencies and consulting companies use for predicting how floods will spread and the associated risks and hazards.
Using one of the world's most powerful supercomputers -- Titan, the 27-petaflop Cray XK7 at the Oak Ridge Leadership Computing Facility (OLCF) -- Constantinescu's team performed one of the first highly resolved, 3-D, volume-of-fluid Reynolds-averaged Navier-Stokes (RANS) simulations of a dam break in a natural environment. The simulation allowed the team to map precise water levels for actual flood events over time. RANS is a widely used method for modeling turbulent flows.
"Flood events, like those generated by dam breaks, can be computationally very expensive to simulate," Constantinescu said. "Previously, there wasn't enough computer power to do these kinds of time-accurate simulations in large computational domains, but with the power of high-performance computing [HPC] and Titan, we are achieving more than was previously thought possible."
The project was supported in 2015 and 2016 within the OLCF's Director's Discretionary user program. The OLCF, a US Department of Energy (DOE) Office of Science User Facility located at DOE's Oak Ridge National Laboratory, provides HPC resources for research and development projects to advance scientific discovery.
The team's 3-D simulations showed that commonly used 2-D models may inaccurately predict some aspects of flooding, such as the time over which dangerous flood levels last at certain locations and the amount of surface area flooded. Simulation results also demonstrated that 2-D models may underestimate the speed at which floods spread and overestimate the time at which flood waves reach their highest point.
When the water sources that empty into a river rise simultaneously, they can trigger one or more successive flood waves. Accuracy of the 1-D, 2-D, or 3-D flood models that track how these waves move is crucial for predicting maximum flood depth, hazardous conditions, and other variables.
"We need to know what's going to happen for situations in which a dam breaks," Constantinescu said. "We need to know who's going to be affected, how much time they will have to evacuate, and what else might happen to the environment as a result."
Because 2-D models make simplified assumptions about some aspects of the flow, they can't account for changes in the flow, such as when the flood wave moves around large obstacles, changes rapidly in direction, or fully immerses bridge decks. The team needed a leadership-class supercomputer to run the 3-D simulations and accurately capture these changes.
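To give a flavor of what the simplified flood models compute, the sketch below solves the 1-D shallow-water equations for an idealized dam break with a basic Lax-Friedrichs finite-volume scheme. This is a toy illustration of the class of shallow-flow models discussed here, not the team's STAR-CCM+ setup or the IFC 2-D code; the geometry, depths, and scheme are all assumptions for the example.

```python
import numpy as np

g = 9.81                              # gravity (m/s^2)
nx, L = 400, 1000.0                   # number of cells, domain length (m)
dx = L / nx
x = (np.arange(nx) + 0.5) * dx        # cell centers

# Idealized dam break: deep reservoir upstream of x = 500 m, shallow water downstream.
h = np.where(x < 500.0, 10.0, 1.0)    # water depth (m)
hu = np.zeros(nx)                     # discharge per unit width (m^2/s)
mass0 = h.sum() * dx                  # total water volume per unit width

def flux(h, hu):
    """Physical flux of the 1-D shallow-water equations."""
    u = hu / h
    return np.array([hu, hu * u + 0.5 * g * h**2])

t, t_end = 0.0, 20.0
while t < t_end:
    c = np.abs(hu / h) + np.sqrt(g * h)       # fastest wave speed in each cell
    dt = 0.4 * dx / c.max()                   # CFL-limited time step
    q = np.array([h, hu])
    f = flux(h, hu)
    # Lax-Friedrichs flux at each cell interface
    f_half = 0.5 * (f[:, :-1] + f[:, 1:]) - 0.5 * (dx / dt) * (q[:, 1:] - q[:, :-1])
    q[:, 1:-1] -= (dt / dx) * (f_half[:, 1:] - f_half[:, :-1])
    h, hu = q
    t += dt

# After 20 s the flood wave has spread downstream: depth stays between the
# initial shallow and deep levels, and total water volume is conserved.
print(h.min(), h.max())
```

Even this 1-D toy shows the depth-averaged character of shallow-flow models: the solver tracks only water depth and discharge per cell, which is why fully 3-D effects around obstacles or submerged bridge decks need the non-hydrostatic treatment described above.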
Titan Changes the Current
Using a fully non-hydrostatic 3-D RANS solver, the team performed the first simulations of the hypothetical failure of two Iowa dams: the Coralville Dam in Iowa City and the Saylorville Dam in Des Moines. Each used a computational grid of about 30-50 million cells and covered a physical area of about 20 miles by 5 miles.
The team used the state-of-the-art computational fluid dynamics software STAR-CCM+. This software features a volume-of-fluid method to track the position of the water's free surface -- the areas where water meets the air. In a scalability study, the team determined the peak performance of the code for the dam break simulations. The researchers used 2,500 of Titan's CPU processors for peak performance in each simulation.
The researchers also computed the same dam break test cases using a standard 2-D model commonly used by IFC. When they compared the 2-D results against those of the 3-D simulations, they found that the 2-D model underestimated how quickly the flood wave moved across land and overestimated the time at which the maximum flood occurred. This finding is important because government agencies and consulting companies use 2-D shallow flow models to predict dam breaks and floods, as well as to estimate flood hazards.
"By performing these 3-D simulations, we provided a huge data set that can be used to improve the accuracy of existing 2-D and 1-D flood models," Constantinescu said. "We can also examine the effectiveness of deploying flood protection structures for different flooding scenarios." The team ultimately showed that HPC can be used successfully to answer engineering questions related to the consequences of structural failure of dams and related hazards.
Constantinescu said that as computers become faster and more powerful, simulations of full flooding events over larger physical regions will be possible. Summit, the OLCF's next-generation supercomputer that is scheduled to come online in 2018, will unearth new possibilities for Constantinescu's research.
"Advances in numerical algorithms, automatic grid generation, and increased supercomputer power will eventually make the simulations of flood waves over large durations of time possible using Titan, and even more so with Summit," Constantinescu said. "Eventually, things we previously had to do by hand, such as generating a high-quality computational grid, will just be part of the typical software package."
Materials provided by DOE/Oak Ridge National Laboratory.
One of the central tenets of biology is that information flows from DNA to RNA in order to encode proteins, which function in the cell. Arguably just as critical as the genetic code is the timing of this information flow. By producing the right RNA and right proteins at the right time, a cell can effectively strategize its survival and success. One such regulatory element, the riboswitch, has excited interest as a potential target for antibiotics. After over 10 years of research, Prof. Harald Schwalbe's research group at Goethe University, in collaboration with the Landick group at the University of Wisconsin, Prof. Jens Wöhnert from Goethe University's Biology Department, and the Süß group at the Technische Universität Darmstadt, has put together the puzzle pieces of a riboswitch-based regulatory process in the bacterium Bacillus subtilis, presenting the most extensive model of the timing of riboswitch action to date.
A riboswitch is a short piece of RNA that can fold into different structures, depending on whether or not a small messenger molecule binds to it. In transcriptional riboswitches, these different structures signal the nearby RNA polymerase to continue producing RNA or to stop. In their recent publication in eLife, the Schwalbe group and their collaborators released molecular structures of the xpt-pbuX riboswitch in the off-position after synthesis and in the on-position upon binding of the small messenger molecule guanine. They also demonstrated that the switch to the on-position takes a certain amount of time, which places a timing requirement on this regulatory process.
As RNA polymerase moves along a DNA strand, producing the corresponding RNA, it reaches the code for the xpt-pbuX switch, makes the riboswitch, and continues on. If guanine is not present, the RNA polymerase detects the default off-position and halts synthesis. If guanine binds the riboswitch, however, the riboswitch must refold into the on-position, and RNA polymerase must wait long enough to detect the new conformation; otherwise, it would always read "off," and the gene would never be read. Schwalbe and coworkers found that just such a pause does exist, and that it is encoded in the DNA. After producing the xpt-pbuX switch, the RNA polymerase encounters this "pause site" in the DNA code and slows down, allowing the right amount of time for the riboswitch to refold.
Materials provided by Goethe University Frankfurt.
Hannah Steinert, Florian Sochor, Anna Wacker, Janina Buck, Christina Helmling, Fabian Hiller, Sara Keyhani, Jonas Noeske, Steffen Grimm, Martin M Rudolph, Heiko Keller, Rachel Anne Mooney, Robert Landick, Beatrix Suess, Boris Fürtig, Jens Wöhnert, Harald Schwalbe. Pausing guides RNA folding to populate transiently stable RNA structures for riboswitch-based transcription regulation. eLife, 2017; 6 DOI: 10.7554/eLife.21297
Radioactive contamination is the unwanted presence of radioactive substances in the environment. Our environment is contaminated by naturally occurring and anthropogenic radionuclides (unstable isotopes that release radiation as they decay into more stable forms), which are present in the air, soil, rain, etc. These radionuclides can be transferred along the food chain until they reach humans, posing a potential health risk.
Until now, research on the presence of radionuclides in products for human consumption and their subsequent transfer has focused mainly on foods such as meat, fish, or milk, without considering foodstuffs like fungi, which are well known for accumulating certain radionuclides in their fruiting bodies.
As a result, the Environmental Radioactivity Laboratory of the University of Extremadura (LARUEX) has carried out a study to quantify radioactive presence in this foodstuff. Thus, the author of the study, Javier Guillén, explains that "this quantification is made using transfer coefficients that compare the radioactive content in the receptor compartment of the radioactive contamination, that is to say in the fungi, to that existing in the transmitter compartment, which in this case would be the soil."
To conduct this research, the authors considered the base level of radionuclides established in ecosystems with low radioactive content, such as Extremadura, and then used software called the ERICA Tool which, as the researcher explains, "allows one to enter the transfer coefficient from the soil to the organism -- in this case the fungus -- thus calculating the dose of radionuclides a non-human organism receives."
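The transfer coefficient described above is simply a ratio of activity concentrations in the receptor and transmitter compartments. A minimal sketch, with made-up numbers that are not taken from the study:

```python
# Soil-to-fungus transfer coefficient: ratio of the activity concentration
# in the fungus (receptor compartment) to that in the soil (transmitter
# compartment). The values below are invented for illustration only.
def transfer_coefficient(c_fungus_bq_per_kg, c_soil_bq_per_kg):
    return c_fungus_bq_per_kg / c_soil_bq_per_kg

tf = transfer_coefficient(c_fungus_bq_per_kg=25.0, c_soil_bq_per_kg=100.0)
print(tf)  # 0.25
```

A coefficient well below 1, as in this toy case, would mean the fungus accumulates only a fraction of the soil's activity concentration; it is this kind of ratio that tools like ERICA take as input when estimating dose rates to organisms.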
From the study, we may conclude that the estimated dose rates for fungi in Spain are similar to those determined for other organisms (animals and plants), and that fungi can therefore be used when assessing the presence or absence of radioactive contamination in the soil. As the researcher asserts, "even though it is not strictly necessary to include fungi amongst the existing instruments and frameworks of assessment, they can be used in ecosystems which may require them, based on criteria such as biodiversity."
Moreover, the fungi analysed, which are concentrated in the Mediterranean area, do not contain high doses of radionuclides, indicating no significant environmental contamination; they are therefore perfectly suitable for human consumption.
Materials provided by University of Extremadura.
J. Guillén, A. Baeza, N.A. Beresford, M.D. Wood. Do fungi need to be included within environmental radiation protection assessment models? Journal of Environmental Radioactivity, 2017; 175-176: 70 DOI: 10.1016/j.jenvrad.2017.04.014