Background Random error may produce misleading evidence in meta-analyses. The required number of participants in a meta-analysis (i.e. the information size) should be at least as large as that of an adequately powered single trial. Trial sequential analysis (TSA) may reduce the risk of random error caused by repetitive testing of accumulating data by applying monitoring boundaries, analogous to sequential monitoring boundaries in a single trial, to meta-analyses that have not yet reached the information size. Methods We selected apparently conclusive (P ≤ 0.05) Cochrane neonatal meta-analyses. We applied heterogeneity-adjusted and unadjusted TSA to these meta-analyses by calculating the information size, the monitoring boundaries, and the cumulative Z-statistic after each trial. We identified the proportion of meta-analyses that did not reach the required information size and, among these, the proportion in which the Z-curve did not cross the monitoring boundaries. Results Of 54 apparently conclusive meta-analyses, 39 (72%) did not reach the heterogeneity-adjusted information size required to accept or reject an intervention effect of a 25% relative risk reduction. Of these 39 meta-analyses, 19 (49%) were considered inconclusive because the cumulative Z-curve did not cross the monitoring boundaries. The median number of participants needed to reach the information size was 1591 (range, 339-6149). TSA without heterogeneity adjustment largely confirmed these results. Conclusions Many apparently conclusive Cochrane neonatal meta-analyses may become inconclusive when the statistical analyses take into account the risk of random error due to repetitive testing.
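The boundary logic described above can be sketched numerically. The following is a minimal, illustrative sketch (not the authors' actual TSA software): it computes an unadjusted information size for a binary outcome under a 25% relative risk reduction, then compares hypothetical cumulative Z-statistics against approximate O'Brien-Fleming-type monitoring boundaries. The control event rate and the cumulative trial data are made up for illustration.

```python
from math import sqrt
from statistics import NormalDist

alpha, beta = 0.05, 0.20                      # two-sided alpha, power = 80%
z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
z_beta = NormalDist().inv_cdf(1 - beta)

def information_size(p_control, rrr):
    """Approximate required participants (both arms) for a binary outcome
    and a relative risk reduction rrr, unadjusted for heterogeneity."""
    p_exp = p_control * (1 - rrr)
    p_bar = (p_control + p_exp) / 2
    n_per_arm = (2 * (z_alpha + z_beta) ** 2
                 * p_bar * (1 - p_bar) / (p_control - p_exp) ** 2)
    return 2 * n_per_arm

def of_boundary(info_fraction):
    """Approximate O'Brien-Fleming monitoring boundary at a given
    fraction of the required information size."""
    return z_alpha / sqrt(info_fraction)

IS = information_size(p_control=0.30, rrr=0.25)
# cumulative participants and Z-statistics after each trial (invented data)
for n, z in zip([300, 750, 1400], [1.1, 1.9, 2.3]):
    frac = min(n / IS, 1.0)
    crossed = abs(z) > of_boundary(frac)
    print(f"n={n}, fraction={frac:.2f}, "
          f"boundary={of_boundary(frac):.2f}, crossed={crossed}")
```

With these invented numbers the Z-curve crosses the boundary only once the information size is reached, illustrating how an apparently significant early result (|Z| > 1.96) can still be inconclusive under sequential monitoring.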
Cognitive regulation of emotions is a fundamental prerequisite for intact social functioning, affecting both well-being and psychopathology. The neural underpinnings of this process have been studied intensively in recent years, without, however, reaching a general consensus. Here we quantitatively summarize the published fMRI and PET literature on cognitive emotion regulation (23 studies/479 subjects) using activation likelihood estimation. In addition, we assessed the particular functional contribution of the identified regions and their interactions using quantitative functional inference and meta-analytic connectivity modeling, respectively. In doing so, we developed a model of the core brain network involved in the regulation of emotional reactivity. According to this model, the superior temporal gyrus, angular gyrus and (pre-)supplementary motor area should be involved in the execution of regulation initiated by frontal areas. The dorsolateral prefrontal cortex may be related to the regulation of cognitive processes such as attention, while the ventrolateral prefrontal cortex may not reflect the regulatory process per se, but rather signal salience and therefore the need to regulate. We also identified a cluster in the anterior middle cingulate cortex, a region that is anatomically and functionally in an ideal position to influence behavior and the subcortical structures related to affect generation. Hence, this area may play a central, integrative role in emotion regulation. By focusing on regions commonly active across multiple studies, the proposed model should provide important a priori information for the assessment of dysregulated emotion regulation in psychiatric disorders. (C) 2013 Elsevier Inc. All rights reserved.
Plant oils are an important renewable resource, and seed oil content is a key agronomical trait that is in part controlled by the metabolic processes within developing seeds. A large-scale model of cellular metabolism in developing embryos of Brassica napus (bna572) was used to predict biomass formation and to analyze metabolic steady states by flux variability analysis under different physiological conditions. Predicted flux patterns are highly correlated with results from prior 13C metabolic flux analysis of B. napus developing embryos. Minor differences from the experimental results arose because bna572 always selected only one sugar and one nitrogen source from the available alternatives, and failed to predict the use of the oxidative pentose phosphate pathway. Flux variability, indicative of alternative optimal solutions, revealed alternative pathways that can provide pyruvate and NADPH to plastidic fatty acid synthesis. The nutritional values of different medium substrates were compared based on the overall carbon conversion efficiency (CCE) for the biosynthesis of biomass. Although bna572 has a functional nitrogen assimilation pathway via glutamate synthase, the simulations predict an unexpected role of glycine decarboxylase operating in the direction of NH4+ assimilation. Analysis of the light-dependent improvement of carbon economy predicted two metabolic phases. At very low light levels, small reductions in CO2 efflux can be attributed to enzymes of the tricarboxylic acid cycle (oxoglutarate dehydrogenase, isocitrate dehydrogenase) and glycine decarboxylase. At higher light levels relevant to the 13C flux studies, ribulose-1,5-bisphosphate carboxylase activity is predicted to account fully for the light-dependent changes in carbon balance.
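As a small illustration of the carbon conversion efficiency (CCE) measure used above to compare substrates, the sketch below computes CCE as carbon incorporated into biomass divided by carbon taken up from the medium. The uptake and biomass figures are invented for illustration and are not outputs of bna572.

```python
# Carbon conversion efficiency: fraction of carbon taken up from the
# medium that ends up in biomass; the remainder is lost as CO2 efflux.

def cce(carbon_uptake_mmol, carbon_in_biomass_mmol):
    return carbon_in_biomass_mmol / carbon_uptake_mmol

# Hypothetical example: 10 mmol glucose (6 carbons each) taken up,
# 48 mmol of that carbon fixed in biomass.
uptake_c = 10 * 6          # 60 mmol carbon taken up
biomass_c = 48             # mmol carbon in biomass
co2_efflux = uptake_c - biomass_c

print(f"CCE = {cce(uptake_c, biomass_c):.0%}, CO2 efflux = {co2_efflux} mmol")
```

In this framing, the light-dependent improvement in carbon economy described above corresponds to a reduction of the CO2 efflux term and hence a higher CCE.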
There is a general consensus that supports the need for standardized reporting of metadata, the information describing large-scale metabolomics and other functional genomics data sets. Reporting of standard metadata provides a biological and empirical context for the data, facilitates experimental replication, and enables the re-interrogation and comparison of data by others. Accordingly, the Metabolomics Standards Initiative is building a general consensus concerning the minimum reporting standards for metabolomics experiments; the Chemical Analysis Working Group (CAWG) is part of this community effort. This article proposes the minimum reporting standards related to the chemical analysis aspects of metabolomics experiments, including: sample preparation, experimental analysis, quality control, metabolite identification, and data pre-processing. These minimum standards currently focus mostly upon mass spectrometry and nuclear magnetic resonance spectroscopy because of the popularity of these techniques in metabolomics. However, additional input concerning other techniques is welcomed and can be provided via the CAWG on-line discussion forum at http://msi-workgroups.sourceforge.net/ or http://Msiemail@example.com. Further, community input related to this document can also be provided via this electronic forum.
Abstract Background Budget impact analyses (BIAs) are an essential part of a comprehensive economic assessment of a health care intervention and are increasingly required by reimbursement authorities as part of a listing or reimbursement submission. Objectives The objective of this report was to present updated guidance on methods for those undertaking such analyses or for those reviewing the results of such analyses. This update was needed, in part, because of developments in BIA methods as well as a growing interest, particularly in emerging markets, in matters related to affordability and population health impacts of health care interventions. Methods The Task Force was approved by the International Society for Pharmacoeconomics and Outcomes Research Health Sciences Policy Council and appointed by its Board of Directors. Members were experienced developers or users of BIAs; worked in academia and industry and as advisors to governments; and came from several countries in North America and South America, Oceania, Asia, and Europe. The Task Force solicited comments on the drafts from a core group of external reviewers and, more broadly, from the membership of the International Society for Pharmacoeconomics and Outcomes Research. Results The Task Force recommends that the design of a BIA for a new health care intervention should take into account relevant features of the health care system, possible access restrictions, the anticipated uptake of the new intervention, and the use and effects of the current and new interventions. The key elements of a BIA include estimating the size of the eligible population, the current mix of treatments and the expected mix after the introduction of the new intervention, the cost of the treatment mixes, and any changes expected in condition-related costs. Where possible, the BIA calculations should be performed by using a simple cost calculator approach because of its ease of use for budget holders. 
Where the changes in eligible population size, disease severity mix, or treatment patterns cannot be credibly captured with the cost calculator approach, however, a cohort or patient-level condition-specific model may be used to estimate the budget impact of the new intervention, accounting appropriately for those entering and leaving the eligible population over time. In either case, the BIA should use data that reflect values specific to a particular decision maker's population. Sensitivity analysis should examine alternative scenarios chosen from the perspective of the decision maker. The validation of the model should include at least face validity with decision makers and verification of the calculations. Data sources for the BIA should include published clinical trial estimates and comparator studies for the efficacy and safety of the current and new interventions, as well as the decision maker's own population for the other parameter estimates, where possible. Other data sources include published data, well-recognized local or national statistical information, and, in special circumstances, expert opinion. Reporting of the BIA should provide information about the input parameter values and calculations at a level of detail that would allow another modeler to replicate the analysis. The outcomes of the BIA should be presented in the format of interest to health care decision makers. In a computer program, options should be provided for different categories of costs to be included in or excluded from the analysis. Conclusions We recommend a framework for the BIA, provide guidance on the acquisition and use of data, and offer a common reporting format that will promote standardization and transparency.
Adherence to these good research practice principles would not necessarily supersede jurisdiction-specific BIA guidelines but may support and enhance local recommendations or serve as a starting point for payers wishing to promulgate methodology guidelines.
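The simple cost calculator approach recommended above can be illustrated with a toy example: the budget impact is the cost of the anticipated treatment mix after the new intervention is introduced minus the cost of the current mix, for the eligible population. All population sizes, market shares, and treatment costs below are hypothetical; a real BIA would draw them from the decision maker's own data.

```python
# Minimal cost-calculator sketch for a budget impact analysis (BIA).

def mix_cost(population, shares, annual_costs):
    """Total annual cost of a treatment mix; shares must sum to 1."""
    assert abs(sum(shares.values()) - 1.0) < 1e-9
    return sum(population * share * annual_costs[t]
               for t, share in shares.items())

eligible = 10_000                                      # eligible patients
costs = {"current_drug": 1_200, "new_drug": 2_000}     # annual cost/patient
current_mix = {"current_drug": 1.0, "new_drug": 0.0}
new_mix = {"current_drug": 0.7, "new_drug": 0.3}       # 30% anticipated uptake

budget_impact = (mix_cost(eligible, new_mix, costs)
                 - mix_cost(eligible, current_mix, costs))
print(f"Annual budget impact: {budget_impact:,.0f}")
```

Changes in condition-related costs (e.g. fewer hospitalizations with the new intervention) would enter the same calculation as additional cost categories, which is why the recommendations above ask for options to include or exclude cost categories.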
Abstract Background The application of conjoint analysis (including discrete-choice experiments and other multiattribute stated-preference methods) in health has increased rapidly over the past decade. A wider acceptance of these methods is limited by an absence of consensus-based methodological standards. Objective The International Society for Pharmacoeconomics and Outcomes Research (ISPOR) Good Research Practices for Conjoint Analysis Task Force was established to identify good research practices for conjoint-analysis applications in health. Methods The task force met regularly to identify the important steps in a conjoint analysis, to discuss good research practices for conjoint analysis, and to develop and refine the key criteria for identifying good research practices. ISPOR members contributed to this process through an extensive consultation process. A final consensus meeting was held to revise the article using these comments and those of a number of international reviewers. Results Task force findings are presented as a 10-item checklist covering: 1) research question; 2) attributes and levels; 3) construction of tasks; 4) experimental design; 5) preference elicitation; 6) instrument design; 7) data-collection plan; 8) statistical analyses; 9) results and conclusions; and 10) study presentation. A primary question relating to each of the 10 items is posed, and three sub-questions examine finer issues within each item. Conclusions Although the checklist should not be interpreted as endorsing any specific methodological approach to conjoint analysis, it can facilitate future training activities and discussions of good research practices for the application of conjoint-analysis methods in health care studies.
As bariatric surgery becomes ever more popular, so does body-contouring surgery to remove excess skin after radical weight loss. To date, the literature has described a number of risk factors affecting the postoperative outcome. Our study aimed to define those factors more closely, focusing on abdominoplasty ("tummy tuck") patients who suffered intra- and postoperative complications. The study cohort included 205 patients over 5 years (2001-2006) who underwent dermolipectomy at our department; the mean follow-up was 5.94 years. Every abdominoplasty was performed under general anesthesia with a single intraoperative dose of antibiotic. The analysis included a complete review of all medical records, and statistical analysis was performed with the R 2.5.0 software for Windows. The overall rate of major complications requiring operative revision and/or antibiotics was 10.2%, including 2.9% infections. Forty-one percent of patients had minor complications, such as seromas, hematomas, wound-healing problems, and wound dehiscences. Logistic regression models demonstrated that smoking combined with age, a BMI above 30 kg/m², and the amount of removed tissue (measured in g) led to significantly more wound-healing problems in nearly all age groups. The probability of infection correlated with later drain removal. Regardless of the amount of tissue removed, no single main risk factor for complications could be identified. A complication-free course and a good outcome are best achieved with careful patient selection and preoperative planning.
Marine echinoderms are filter-feeding invertebrates widely distributed along the coasts and therefore extensively exposed to anthropogenic xenobiotics; they can serve as good sentinels for monitoring a large variety of contaminants in marine ecosystems. In this context, a multi-residue analytical method has been validated and applied to Holothuria tubulosa specimens and marine sediments for the determination of 36 organic compounds belonging to some of the most problematic groups of emerging and priority pollutants (perfluoroalkyl compounds, estrogens, parabens, benzophenones, plasticizers, surfactants, brominated flame retardants and alkylphenols). Lyophilization of the samples prior to solvent extraction and clean-up of the extracts with C18, followed by liquid chromatography-tandem mass spectrometry analysis, is proposed. A Box-Behnken design was used to optimize the most influential variables affecting the extraction and clean-up steps. For validation, matrix-matched calibration and recovery assays were applied. Linearity (r², expressed as a percentage) above 99%, recoveries between 80% and 114% (except for LAS and NP1EO), RSD (precision) below 15% and limits of quantification between 0.03 and 12.5 ng g⁻¹ dry weight (d.w.) were achieved. The method was applied to nine samples of Holothuria collected along the coast of Granada (Spain), and to the marine sediments around the animals. The results demonstrated high bioaccumulation of certain pollutants. A total of 25 of the 36 studied compounds were quantified, with surfactants, alkylphenols, perfluoroalkyl compounds, triclocarban and parabens being the most frequently detected. Nonylphenol was found at the highest concentration (340 and 323 ng g⁻¹ d.w. in sediment and Holothuria samples, respectively).
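The validation metrics reported above (recovery and relative standard deviation, RSD) can be illustrated with a short sketch; the spiking level and replicate measurements below are invented for illustration.

```python
from statistics import mean, stdev

def recovery_pct(measured, spiked):
    """Recovery: measured concentration as a percentage of the spiked level."""
    return 100 * measured / spiked

def rsd_pct(values):
    """Relative standard deviation (precision) of replicate measurements."""
    return 100 * stdev(values) / mean(values)

# Hypothetical replicate measurements (ng/g) for a 10 ng/g spiked sample
replicates = [9.8, 10.4, 10.1]
spike_level = 10.0

print(f"Recovery = {recovery_pct(mean(replicates), spike_level):.1f}%")
print(f"RSD = {rsd_pct(replicates):.1f}%")
```

Values such as these would fall within the acceptance ranges reported above (recovery 80-114%, RSD below 15%).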
This paper contributes to the AC small-signal modeling and analysis of the Z-source converter (ZSC) in continuous conduction mode. The AC small-signal model accounts for the dynamics introduced by the Z network, which is unique to the ZSC. The model is derived, and computer simulation results are used to validate the small-signal modeling method. Various applications of the AC small-signal model to ZSC design, along with experimental verification, are presented.
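As background to the small-signal model, the sketch below evaluates the well-known steady-state relations of the Z-source network in continuous conduction: the capacitor voltage and boost factor as functions of the shoot-through duty ratio D0. These define the DC operating point around which a small-signal model is obtained by perturbation; the input voltage and duty ratios used here are illustrative.

```python
# Steady-state operating point of the Z-source network (CCM):
#   Vc/Vin = (1 - D0) / (1 - 2*D0),  boost factor B = 1 / (1 - 2*D0),
# valid for shoot-through duty ratio 0 <= D0 < 0.5.

def z_source_capacitor_voltage(v_in, d0):
    assert 0 <= d0 < 0.5, "shoot-through duty ratio must be below 0.5"
    return v_in * (1 - d0) / (1 - 2 * d0)

def boost_factor(d0):
    assert 0 <= d0 < 0.5
    return 1 / (1 - 2 * d0)

for d0 in (0.0, 0.2, 0.4):
    print(f"D0={d0}: Vc={z_source_capacitor_voltage(100, d0):.1f} V, "
          f"B={boost_factor(d0):.2f}")
```

The steep growth of both quantities as D0 approaches 0.5 is part of what makes the Z-network dynamics, and hence a dedicated small-signal model, important for controller design.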