Background: Numerous studies have investigated the relationship between the COX-2 8473 T > C polymorphism and cancer susceptibility; however, the results remain controversial. Therefore, we carried out the present meta-analysis to obtain a more accurate assessment of this potential association. Methods: This meta-analysis included 79 case-control studies with a total of 38,634 cases and 55,206 controls. We searched PubMed, EMBASE, OVID, Web of Science, CNKI and Wanfang Data for all relevant articles published up to September 29, 2017. Pooled odds ratios (ORs) with 95% confidence intervals (CIs) were used to evaluate the strength of the association. We performed subgroup analyses according to ethnicity, source of controls, genotyping method and cancer type. Moreover, trial sequential analysis (TSA) was implemented to decrease the risk of type I error and to estimate whether the current evidence was sufficient and conclusive. Results: Overall, our results indicated that the 8473 T > C polymorphism was not associated with cancer susceptibility. However, stratified analysis showed that the polymorphism was associated with a statistically significant decreased risk of nasopharyngeal and bladder cancer, but an increased risk of esophageal and skin cancer. Notably, TSA demonstrated that the evidence for this result was sufficient. Conclusion: No significant association between the COX-2 8473 T > C polymorphism and overall cancer risk was detected.
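The pooled ORs and CIs described above are conventionally obtained by inverse-variance weighting of per-study log odds ratios. A minimal sketch of the fixed-effect version follows; the 2×2 counts are invented for illustration and are not data from the included studies.

```python
import math

def pooled_or(studies):
    """Inverse-variance fixed-effect pooled odds ratio with a 95% CI.

    Each study is a 2x2 table: (cases exposed, cases unexposed,
    controls exposed, controls unexposed)."""
    w_sum = wlog_sum = 0.0
    for a, b, c, d in studies:
        log_or = math.log((a * d) / (b * c))
        var = 1 / a + 1 / b + 1 / c + 1 / d  # variance of log(OR)
        w = 1 / var                          # inverse-variance weight
        w_sum += w
        wlog_sum += w * log_or
    pooled_log = wlog_sum / w_sum
    se = math.sqrt(1 / w_sum)
    return (math.exp(pooled_log),
            math.exp(pooled_log - 1.96 * se),
            math.exp(pooled_log + 1.96 * se))

# Two hypothetical case-control studies
or_, lo, hi = pooled_or([(30, 70, 25, 75), (45, 55, 40, 60)])
```

With these invented counts the CI spans 1, i.e. no significant association, mirroring the overall null result; a real meta-analysis would also test heterogeneity before choosing between fixed- and random-effects pooling.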
Solar cells that convert sunlight into electrical power have demonstrated large and consistent growth over several decades. This growth has spawned research on new technologies that could enable much faster, less costly, and more environmentally friendly manufacture from earth-abundant materials. Here we review carbon-based solar cells through a comprehensive analysis of the data reported so far, highlight what can be expected from carbon-based technologies, and outline scenarios for how they can be put to immediate use.
NMR spectroscopy is currently a premier technique for structural elucidation of organic molecules. Quantitative NMR (qNMR) methodology has developed more slowly but is now widely accepted, especially in the areas of natural product and medicinal chemistry. However, many undergraduate students are not routinely exposed to this important concept. This article describes a simple and practical lab experiment that has been successfully performed by students in a Quantitative Analysis class for several years and is based on a comparison of relative integration areas of species present in spectra of compound mixtures. In this experiment, NMR spectroscopy is used to determine the purity of common organic solvents using dimethyl sulfoxide as an internal standard in D2O. Groups of students analyze unknown samples containing one of the following solvents: methanol, ethanol, 2-propanol, tetrahydrofuran, or acetone, to which water has been added as an impurity. Over a period of five years, 54 students analyzed samples ranging from 60% to 99% purity with an average error of 2.64%. This experiment fills a niche in the initial portion of a standard Quantitative Analysis lab sequence by differentiating between qualitative and quantitative analysis, providing exposure to equipment not usually encountered in an introductory analytical lab, and generating numerical data for students to analyze and evaluate.
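The purity determination in this experiment follows the standard qNMR internal-standard relation, in which the analyte's integral is scaled by proton counts, molar masses, and weighed masses. The integrals and masses below are hypothetical numbers for an ethanol/DMSO pair, not the class data.

```python
def qnmr_purity(I_a, N_a, M_a, I_std, N_std, M_std, m_std, m_sample,
                P_std=100.0):
    """Weight-percent purity of an analyte from qNMR integrals relative
    to an internal standard of purity P_std (%):
    P_a = (I_a/I_std) * (N_std/N_a) * (M_a/M_std) * (m_std/m_sample) * P_std
    where I = integral, N = number of protons in the integrated signal,
    M = molar mass, m = weighed mass."""
    return (I_a / I_std) * (N_std / N_a) * (M_a / M_std) \
        * (m_std / m_sample) * P_std

# Hypothetical example: ethanol CH3 triplet (3H, M = 46.07 g/mol)
# against the DMSO singlet (6H, M = 78.13 g/mol), equal weighed masses
purity = qnmr_purity(I_a=0.80, N_a=3, M_a=46.07,
                     I_std=1.00, N_std=6, M_std=78.13,
                     m_std=50.0, m_sample=50.0)   # ~ 94.3 %
```

The same relation applies to any of the solvents listed; only the integrated signal's proton count and the molar mass change.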
In this study we evaluated heavy metal contamination in sediments from the Tisza River and its tributaries using the sequential extraction method, geochemical normalization, calculation of the enrichment factor (EF), and statistical analysis. The chemical fractionation of Ni, Cu, Zn, Cr, Pb, Fe, and Mn, carried out using the modified Tessier method, points to different substrates and binding mechanisms of Cu, Zn, and Pb in sediments of the tributaries and sediments of the Tisza River. The similarities in the distributions of Fe and Ni in all types of sediments result from geochemical similarity as well as from the fact that natural sources mainly determine the concentration levels of these elements. The calculated enrichment factors (EF, measured metal vs. background concentrations) indicated that metal contamination (Cu, Pb, Zn, and Cr) was recorded in the sediments of the Tisza River, while no indications of pollution were detected in its tributaries and the surrounding pools. The maximum EF values were close to 6 for Cu and Pb (moderately severe enrichment) and close to 4.5 for Zn (moderate enrichment). The Tisza River can thus be described as slightly to moderately severely polluted with Cu, Zn, and Pb, and only slightly polluted with Cr. It is concluded that sediments of the Tisza serve as a repository for heavy metals accumulated from adjacent urban and industrial areas.
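The enrichment factor used in such studies is the double ratio of a metal to a conservative reference element (commonly Fe, which also serves as the normalizer here) in the sample versus the background. A minimal sketch, with illustrative concentrations chosen so the EF lands near the ~6 reported for Cu:

```python
def enrichment_factor(metal_sample, ref_sample,
                      metal_background, ref_background):
    """EF = (metal / reference)_sample / (metal / reference)_background.

    EF near 1 suggests a crustal (natural) origin; progressively larger
    values suggest increasing anthropogenic enrichment."""
    return (metal_sample / ref_sample) / (metal_background / ref_background)

# Hypothetical Cu concentrations in mg/kg, with Fe as the reference element
ef_cu = enrichment_factor(metal_sample=90.0, ref_sample=30000.0,
                          metal_background=15.0, ref_background=30000.0)
# ef_cu == 6.0, i.e. "moderately severe enrichment" on the usual EF scale
```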
The validity of parametric functional magnetic resonance imaging (fMRI) analysis has previously been assessed only on simulated data. Recent advances in computer science and data sharing make it possible to analyze large amounts of real fMRI data. In this study, 1484 rest datasets were analyzed in SPM8 to estimate true familywise error rates. For a familywise significance threshold of 5%, significant activity was found in 1%–70% of the 1484 rest datasets, depending on repetition time, paradigm, and parameter settings. This means that parametric significance thresholds in SPM can be either conservative or very liberal. The main reason for the high familywise error rates seems to be that the global AR(1) autocorrelation correction in SPM fails to model the spectra of the residuals, especially for short repetition times. These findings cannot be generalized to parametric fMRI analysis in general; other software packages may give different results. By using the computational power of the graphics processing unit (GPU), the 1484 rest datasets were also analyzed with a random permutation test. Significant activity was then found in 1%–19% of the datasets. These findings speak to the need for a better model of temporal correlations in fMRI time series.
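The familywise error control behind such permutation tests is typically based on the distribution of the maximum statistic across voxels. A toy sketch on synthetic white noise follows; this illustrates only the thresholding idea, not the SPM8/GPU pipeline of the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def max_stat_threshold(null_maps, alpha=0.05):
    """Familywise-corrected threshold: the (1 - alpha) quantile of the
    maximum statistic over voxels, taken across permutations."""
    return np.quantile(null_maps.max(axis=1), 1 - alpha)

# Toy null data: 1000 "permutations" x 500 "voxels" of white noise
null_maps = rng.standard_normal((1000, 500))
thr = max_stat_threshold(null_maps)

# Empirical familywise error rate: fraction of null maps in which any
# voxel exceeds the threshold (close to 0.05 by construction here)
fwer = float(np.mean(null_maps.max(axis=1) > thr))
```

On real rest data the permutation distribution absorbs the true temporal and spatial correlation structure, which is precisely what the parametric AR(1) model fails to capture for short repetition times.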
Understanding emotions is an important aspect of personal development and growth, and as such is a key building block for the emulation of human intelligence. Besides being important for the advancement of AI, emotion processing also matters for the closely related task of polarity detection. The opportunity to automatically capture the general public's sentiments about social events, political movements, marketing campaigns, and product preferences has raised interest both in the scientific community, for the exciting open challenges, and in the business world, for the remarkable payoffs in marketing and financial market prediction. This has led to the emerging fields of affective computing and sentiment analysis, which leverage human-computer interaction, information retrieval, and multimodal signal processing to distill people's sentiments from the ever-growing amount of online social data.
Many studies have indicated that industrialization and urbanization have caused serious heavy metal pollution of soils since the industrial age. However, few previous studies have jointly analyzed landscape pattern, urbanization, industrialization, and heavy metal pollution. This paper aimed to explore the relationships of heavy metals in the soil (Pb, Cu, Ni, As, Cd, Cr, Hg, and Zn) with landscape pattern, industrialization, and urbanization in Taiyuan city using multivariate analysis, including correlation analysis, analysis of variance (ANOVA), independent-sample t tests, and principal component analysis (PCA). A geographic information system (GIS) was also applied to map the spatial distribution of the heavy metals. The spatial distribution maps showed that heavy metal pollution of the soil was most serious in the centre of the study area. The multivariate analysis indicated that the correlations among heavy metals were significant and that industrialization significantly affected the concentrations of some heavy metals. Landscape diversity showed a significant negative correlation with the heavy metal concentrations. PCA showed that a two-factor model for heavy metal pollution, industrialization, and landscape pattern could effectively demonstrate the relationships between these variables; the model explained 86.71% of the total variance of the data. The first factor was mainly loaded with the comprehensive pollution index (P), and the second factor was primarily loaded with landscape diversity and dominance (H and D). An ordination of the 80 samples revealed the pollution pattern across all samples. The results indicated that local industrialization caused heavy metal pollution of the soil, but that such pollution was negatively associated with landscape diversity. These results could provide a basis for agricultural, suburban, and urban planning.
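The "two-factor model explaining 86.71% of the total variance" is the standard PCA variance-explained computation. A minimal numpy sketch on synthetic data (random matrices standing in for the 80 soil samples; none of the study's variables are used):

```python
import numpy as np

def pca_variance_explained(X, n_components=2):
    """Fraction of total variance captured by the first n principal
    components, via SVD of the mean-centered data matrix."""
    Xc = X - X.mean(axis=0)
    s = np.linalg.svd(Xc, compute_uv=False)  # singular values
    var = s ** 2                             # per-component variances
    return float(var[:n_components].sum() / var.sum())

rng = np.random.default_rng(1)
# Synthetic rank-2 data: 80 "samples" x 6 "variables" driven by two
# latent factors, so two components capture essentially all the variance
X = rng.standard_normal((80, 2)) @ rng.standard_normal((2, 6))
ratio = pca_variance_explained(X, n_components=2)   # ~ 1.0
```

With real, noisy data the ratio falls below 1; a two-component share around 0.87, as in the study, indicates that two latent factors dominate the pollution-landscape structure.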
The comprehensive approach for the lipidomic characterization of human breast cancer and surrounding normal tissues is based on hydrophilic interaction liquid chromatography (HILIC)–electrospray ionization mass spectrometry (ESI-MS) quantitation of polar lipid classes in total lipid extracts, followed by multivariate data analysis using unsupervised principal component analysis (PCA) and supervised orthogonal partial least squares (OPLS) analysis. This methodology is applied to the detailed lipidomic characterization of ten patients with the goal of finding statistically relevant differences between tumor and normal tissues. This strategy is selected for better visualization of differences: because the breast cancer tissue is compared with the surrounding healthy tissue of the same patient, changes in the lipidome are caused predominantly by tumor growth. A large increase in total concentrations of several lipid classes is observed, including phosphatidylinositols, phosphatidylethanolamines, phosphatidylcholines, and lysophosphatidylcholines. Concentrations of individual lipid species within these classes also change, and in some cases these differences are statistically significant. PCA and OPLS analyses enable a clear differentiation of tumor and normal tissues based on changes in their lipidome. A notable decrease in the relative abundances of ether and vinyl ether (plasmalogen) lipid species is detected for phosphatidylethanolamines, but no difference is apparent for phosphatidylcholines.
Background: High-throughput DNA sequencing technologies are generating vast amounts of data. Fast, flexible, and memory-efficient implementations are needed in order to facilitate analyses of thousands of samples simultaneously. Results: We present a multithreaded program suite called ANGSD. This program can calculate various summary statistics and perform association mapping and population genetic analyses, utilizing the full information in next generation sequencing data by working directly on the raw sequencing data or by using genotype likelihoods. Conclusions: The open source C/C++ program ANGSD is available at http://www.popgen.dk/angsd. The program is tested and validated on GNU/Linux systems. It supports multiple input formats, including BAM and imputed Beagle genotype probability files, allows the user to choose between combinations of existing methods, and can perform analyses that are not implemented elsewhere.
Wafer-scale, CMOS-compatible graphene transfer has been established for device fabrication and can be integrated into a conventional CMOS process flow at the back end of the line. In Part I of this paper, a statistical analysis of graphene FET (GFET) devices fabricated at wafer scale is presented. Device yield is approximately 75% (for 4500 devices), measured in terms of the quality of the top gate, oxide layer, and graphene channel. Statistical evaluation of the device yield reveals that device failure occurs primarily during the graphene transfer step. In Part II, the device statistics are further examined to reveal the primary mechanism behind device failure; this analysis suggests that significant improvements to device yield, variability, and performance can be achieved by mitigating the compressive strain introduced in the graphene layer during the transfer process. The combined analyses from Parts I and II present an overview of the mechanisms influencing GFET behavior and device yield, including residues on the graphene surface, tears, cracks, contact resistance at the graphene/metal interface, gate leakage, and the effects of postprocessing.