The FLUKA code was employed to simulate pulse-height spectra of a NaI(Tl) detector (excluding its resolution), considering radioactive sources of naturally occurring and artificial radionuclides present in the marine environment. For this purpose, a user-defined routine was developed for the proper simulation of the emitted γ-rays. The results were compared with simulations performed using the MCNP-CP code. The comparison of the recorded counts in the full-energy peaks, for the high-intensity (emission probability >15%) γ-rays emitted by each radionuclide, yielded satisfactory agreement (calculated ratios from 0.93 ± 0.05 to 1.07 ± 0.02) in all studied cases.
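The quoted ratios carry statistical uncertainties. Under the usual Poisson-counting assumption (an assumption here; the abstract does not state how the errors were obtained), the uncertainty of a ratio of two full-energy-peak counts propagates as a minimal sketch like the following, with purely illustrative count values:

```python
import math

def peak_ratio(n_a, n_b):
    """Ratio of two full-energy-peak counts with Poisson error propagation.

    Assumes purely statistical (sqrt(N)) uncertainties on each count;
    background subtraction and systematic effects are ignored.
    """
    r = n_a / n_b
    # Relative errors add in quadrature for a ratio of independent counts
    sigma = r * math.sqrt(1.0 / n_a + 1.0 / n_b)
    return r, sigma

# Hypothetical peak counts, chosen only to illustrate the propagation:
r, s = peak_ratio(9300, 10000)
print(f"{r:.2f} +/- {s:.2f}")
```

The comparison in the abstract checks whether such ratios are consistent with unity within their quoted uncertainties.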
Measurement is the foundation of exposure science. Associations between illness and environmental agents have been observed for millennia, but the ability to quantify exposure and dose has been possible only in the last century. Improved means of measurement and refined concepts of who, what, when, where, and why to measure have been the seminal contributions of exposure science to the study of disease causation and prevention. This paper examines critical advancements in exposure assessment associated with workplace health and safety, and the groundbreaking work of the US Public Health Service. Many of the key concepts of modern exposure science have their origin in these early studies. Occupational hygiene scientists have conducted receptor-based exposure analyses for more than 80 years, evaluating indoor air, defining microenvironments, and developing personal sampling techniques. Biological monitoring of community populations including children, dermal exposure monitoring, duplicate diet studies, and multi-pathway, aggregate exposure assessments can be traced to early public health studies. As we look to the future, we see that new technologies and techniques are expanding the scope of exposure science dramatically. We need to ensure that the highest of scientific standards are maintained, make a greater effort to include occupational hygiene scientists, microbiologists, and behavioral scientists in the field, and promote new sources of training and research support. Exposure science has a critical role to play in the prevention strategy that is central to public health. Journal of Exposure Science and Environmental Epidemiology (2010) 20, 493-502; doi: 10.1038/jes.2010.26; published online 28 April 2010
Starch has long been the most popular and economical sizing material. Synthetic binders have also been developed as sizing materials to improve weaving loom efficiency, although some synthetic sizes are restricted in use, mainly for ecological reasons. In recent years, many modifications of starch as a sizing agent have emerged; different modifications impart different properties, suited to particular applications. In the present study, different varieties of natural starch, modified starch, and synthetic size materials were evaluated comparatively, and mechanical properties such as cohesion power, adhesion power, abrasion resistance, and bending rigidity were studied. PVA shows the best properties among all the size materials; among the modified starches, starch ester shows the better properties.
Models of species distributions are increasingly being used to address a variety of problems in conservation biology. In many applications, perfect or constant detectability of species, given presence, is assumed. While this problem has been acknowledged and addressed through the development of occupancy models, we still know little regarding whether addressing the potential for imperfect detection improves the predictive performance of species distribution models in nature. Here, we contrast logistic regression models of species occurrence that do not correct for detectability with hierarchical occupancy models that explicitly estimate and adjust for detectability, and with maximum entropy (Maxent) models that attempt to circumvent the detectability problem by using data from known presence locations only. We use a large-scale, long-term monitoring database across western Montana and northern Idaho to contrast these models for nine landbird species that cover a broad spectrum in detectability. Overall, occupancy models were similar to or better than other approaches in terms of predictive accuracy, as measured by the Area Under the ROC Curve (AUC) and Kappa, with Maxent tending to provide the lowest predictive accuracy. Models varied in the types of errors associated with predictions, such that some model approaches may be preferred over others in certain situations. As expected, predictive performance varied across a gradient in species detectability, with logistic regression providing lower relative performance for less detectable species and Maxent providing lower performance for highly detectable species. We conclude by discussing the advantages and limitations of each approach for developing large-scale species distribution models.
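The detectability issue the abstract describes can be made concrete: if a site is occupied with probability ψ and each of k independent survey visits detects the species with probability p, the probability of at least one detection is 1 − (1 − p)^k, so a detection-naive model estimates ψ·(1 − (1 − p)^k) rather than true occupancy ψ. A minimal sketch with illustrative numbers (not values from the study):

```python
def naive_occupancy(psi, p, k):
    """Apparent occupancy under imperfect detection.

    psi: true occupancy probability
    p:   per-visit detection probability (assumed constant)
    k:   number of independent survey visits
    Returns the occupancy a model ignoring detection would estimate.
    """
    # P(detected at least once | occupied) = 1 - (1 - p)^k
    return psi * (1.0 - (1.0 - p) ** k)

# A poorly detectable species: true occupancy 0.6, detection 0.2, 3 visits
print(round(naive_occupancy(0.6, 0.2, 3), 3))  # well below the true 0.6
```

This bias is largest for poorly detectable species, which is consistent with the abstract's finding that logistic regression performed relatively worse for less detectable species.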
The March/April 2016 issue of Foreign Affairs, published by the Council on Foreign Relations, is devoted in large part to the topic of economic stagnation. The editorial by Jonathan Tepperman, the journal's managing editor, declares: "Today, with China slumping, energy prices collapsing, and nervous consumers sitting on their hands, growth has ground to a halt almost everywhere, and economists, investors, and ordinary citizens are starting to confront a grim new reality: the world is stuck in the slow lane and nobody seems to know what to do about it." This is followed by eight articles on stagnation, only one of which ("The Age of Secular Stagnation" by Lawrence H. Summers) is, in our opinion, of any real importance... Summers heavily criticizes those like Robert J. Gordon, in The Rise and Fall of American Growth (2016), who attribute stagnation to supply-side "headwinds"...blocking productivity growth... Likewise, Summers dispatches those like Kenneth Rogoff who see stagnation as merely the product of a debt supercycle associated with periodic financial crises... Despite such sharp criticisms of other mainstream interpretations of stagnation, Summers's own analysis can be faulted for being superficial and vague, lacking historical concreteness... In fact, the current mainstream debate on secular stagnation is so superficial and circumspect that one cannot help but wonder whether the main protagonists (figures like Summers, Gordon, Paul Krugman, and Tyler Cowen) are not deliberately tiptoeing around the matter, worried that if they get too close or make too much noise they might awaken some sleeping giant (the working class?), as in the days of the Great Depression and the New Deal.