
Showing papers by "University of Birmingham" published in 2015


Journal ArticleDOI
Mohsen Naghavi, Haidong Wang, Rafael Lozano, Adrian Davis, +728 more (294 institutions)
TL;DR: The Global Burden of Disease Study 2013 (GBD 2013) used the GBD 2010 methods, with some refinements to improve accuracy, applied to an updated database of vital registration, survey, and census data.

5,792 citations


Journal ArticleDOI
28 Aug 2015-Science
TL;DR: A large-scale assessment suggests that experimental reproducibility in psychology leaves a lot to be desired, and correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams.
Abstract: Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available. Replication effects were half the magnitude of original effects, representing a substantial decline. Ninety-seven percent of original studies had statistically significant results. Thirty-six percent of replications had statistically significant results; 47% of original effect sizes were in the 95% confidence interval of the replication effect size; 39% of effects were subjectively rated to have replicated the original result; and if no bias in original results is assumed, combining original and replication results left 68% with statistically significant effects. Correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams.

5,532 citations
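One of the replication criteria quoted above is whether the original effect size lies inside the replication's 95% confidence interval. A minimal sketch of such a check for correlation effect sizes, using the standard Fisher z transformation (illustrative only; not the project's actual analysis code):

import math

def original_r_in_replication_ci(r_orig, r_rep, n_rep, z_crit=1.96):
    """Check whether an original correlation lies in the replication's 95% CI.

    Uses the Fisher z transformation; n_rep is the replication sample size.
    Illustrative only -- not the Reproducibility Project's analysis code.
    """
    z_rep = math.atanh(r_rep)              # Fisher z of the replication effect
    se = 1.0 / math.sqrt(n_rep - 3)        # standard error of Fisher z
    lo, hi = z_rep - z_crit * se, z_rep + z_crit * se
    return lo <= math.atanh(r_orig) <= hi

# Example: original r = 0.40, replication r = 0.21 with n = 120
print(original_r_in_replication_ci(0.40, 0.21, 120))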


Journal ArticleDOI
Theo Vos, Ryan M Barber, Brad Bell, Amelia Bertozzi-Villa, +686 more (287 institutions)
TL;DR: The Global Burden of Disease Study 2013 (GBD 2013) estimated the burden of acute and chronic diseases and injuries for 188 countries between 1990 and 2013.

4,510 citations


Journal ArticleDOI
TL;DR: Recent advances in understanding of the mechanisms by which bacteria are either intrinsically resistant or acquire resistance to antibiotics are reviewed, including the prevention of access to drug targets, changes in the structure and protection of antibiotic targets and the direct modification or inactivation of antibiotics.
Abstract: Antibiotic-resistant bacteria that are difficult or impossible to treat are becoming increasingly common and are causing a global health crisis. Antibiotic resistance is encoded by several genes, many of which can transfer between bacteria. New resistance mechanisms are constantly being described, and new genes and vectors of transmission are identified on a regular basis. This article reviews recent advances in our understanding of the mechanisms by which bacteria are either intrinsically resistant or acquire resistance to antibiotics, including the prevention of access to drug targets, changes in the structure and protection of antibiotic targets and the direct modification or inactivation of antibiotics.

2,837 citations


Journal ArticleDOI
TL;DR: The third generation of the Sloan Digital Sky Survey (SDSS-III) took data from 2008 to 2014 using the original SDSS wide-field imager, the original and an upgraded multi-object fiber-fed optical spectrograph, a new near-infrared high-resolution spectrograph, and a novel optical interferometer.
Abstract: The third generation of the Sloan Digital Sky Survey (SDSS-III) took data from 2008 to 2014 using the original SDSS wide-field imager, the original and an upgraded multi-object fiber-fed optical spectrograph, a new near-infrared high-resolution spectrograph, and a novel optical interferometer. All the data from SDSS-III are now made public. In particular, this paper describes Data Release 11 (DR11) including all data acquired through 2013 July, and Data Release 12 (DR12) adding data acquired through 2014 July (including all data included in previous data releases), marking the end of SDSS-III observing. Relative to our previous public release (DR10), DR12 adds one million new spectra of galaxies and quasars from the Baryon Oscillation Spectroscopic Survey (BOSS) over an additional 3000 sq. deg of sky, more than triples the number of H-band spectra of stars as part of the Apache Point Observatory (APO) Galactic Evolution Experiment (APOGEE), and includes repeated accurate radial velocity measurements of 5500 stars from the Multi-Object APO Radial Velocity Exoplanet Large-area Survey (MARVELS). The APOGEE outputs now include measured abundances of 15 different elements for each star. In total, SDSS-III added 2350 sq. deg of ugriz imaging; 155,520 spectra of 138,099 stars as part of the Sloan Exploration of Galactic Understanding and Evolution 2 (SEGUE-2) survey; 2,497,484 BOSS spectra of 1,372,737 galaxies, 294,512 quasars, and 247,216 stars over 9376 sq. deg; 618,080 APOGEE spectra of 156,593 stars; and 197,040 MARVELS spectra of 5,513 stars. Since its first light in 1998, SDSS has imaged over 1/3 of the Celestial sphere in five bands and obtained over five million astronomical spectra.

2,471 citations


Journal ArticleDOI
TL;DR: The design of the hologram integrates a ground metal plane with a geometric metasurface that enhances the conversion efficiency between the two circular polarization states, leading to high diffraction efficiency without complicating the fabrication process.
Abstract: Using a metasurface comprising an array of nanorods with different orientations and a backreflector, a hologram image can be obtained in the visible and near-infrared with limited loss of light intensity.

2,075 citations


Journal ArticleDOI
TL;DR: The Global Burden of Diseases, Injuries, and Risk Factors Study 2013 (GBD 2013) provides a timely opportunity to update the comparative risk assessment with new data on exposure, relative risks, and evidence on the appropriate counterfactual risk distribution.

1,656 citations


Journal ArticleDOI
TL;DR: The ALBI grade offers a simple, evidence-based, objective, and discriminatory method of assessing liver function in HCC that has been extensively tested in an international setting and eliminates the need for subjective variables such as ascites and encephalopathy, a requirement in the conventional C-P grade.
Abstract: Purpose Most patients with hepatocellular carcinoma (HCC) have associated chronic liver disease, the severity of which is currently assessed by the Child-Pugh (C-P) grade. In this international collaboration, we identify objective measures of liver function/dysfunction that independently influence survival in patients with HCC and then combine these into a model that could be compared with the conventional C-P grade. Patients and Methods We developed a simple model to assess liver function, based on 1,313 patients with HCC of all stages from Japan, that involved only serum bilirubin and albumin levels. We then tested the model using similar cohorts from other geographical regions (n = 5,097) and other clinical situations (patients undergoing resection [n = 525] or sorafenib treatment for advanced HCC [n = 1,132]). The specificity of the model for liver (dys)function was tested in patients with chronic liver disease but without HCC (n = 501). Results The model, the Albumin-Bilirubin (ALBI) grade, performed...

1,617 citations
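The abstract above describes the ALBI grade as a function of serum bilirubin and albumin only. A minimal sketch of computing such a score, using the coefficients and cut-points reported for the published ALBI model; treat the exact numbers as an assumption to verify against the paper before any use:

import math

def albi_grade(bilirubin_umol_l, albumin_g_l):
    """Compute the Albumin-Bilirubin (ALBI) score and grade.

    Score = 0.66 * log10(bilirubin [umol/L]) - 0.085 * albumin [g/L],
    with grade cut-points at -2.60 and -1.39 (values as reported for the
    published ALBI model; verify against the original paper before use).
    """
    score = 0.66 * math.log10(bilirubin_umol_l) - 0.085 * albumin_g_l
    if score <= -2.60:
        grade = 1
    elif score <= -1.39:
        grade = 2
    else:
        grade = 3
    return score, grade

# Example: bilirubin 10 umol/L, albumin 45 g/L -> score about -3.17, grade 1
print(albi_grade(10, 45))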


Journal ArticleDOI
Georges Aad, Brad Abbott, Jalal Abdallah, Ovsat Abdinov, +5117 more (314 institutions)
TL;DR: A measurement of the Higgs boson mass is presented based on the combined data samples of the ATLAS and CMS experiments at the CERN LHC in the H→γγ and H→ZZ→4ℓ decay channels.
Abstract: A measurement of the Higgs boson mass is presented based on the combined data samples of the ATLAS and CMS experiments at the CERN LHC in the H→γγ and H→ZZ→4l decay channels. The results are obtained from a simultaneous fit to the reconstructed invariant mass peaks in the two channels and for the two experiments. The measured masses from the individual channels and the two experiments are found to be consistent among themselves. The combined measured mass of the Higgs boson is mH=125.09±0.21 (stat)±0.11 (syst) GeV.

1,567 citations
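The quoted mass carries separate statistical and systematic uncertainties. As a quick worked example, the two components can be combined in quadrature (the actual ATLAS+CMS result comes from a full simultaneous fit, not this shortcut):

import math

m_h, stat, syst = 125.09, 0.21, 0.11       # GeV, as quoted in the abstract
total = math.sqrt(stat**2 + syst**2)       # combine uncorrelated components in quadrature
print(f"mH = {m_h} +/- {total:.2f} GeV")   # ~ 125.09 +/- 0.24 GeV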


Journal ArticleDOI
28 Apr 2015-JAMA
TL;DR: PRISMA-IPD provides guidelines for reporting systematic reviews and meta-analyses of IPD, and although developed primarily for reviews of randomized trials, many items will apply in other contexts, including reviews of diagnosis and prognosis.
Abstract: Importance Systematic reviews and meta-analyses of individual participant data (IPD) aim to collect, check, and reanalyze individual-level data from all studies addressing a particular research question and are therefore considered a gold standard approach to evidence synthesis. They are likely to be used with increasing frequency as current initiatives to share clinical trial data gain momentum and may be particularly important in reviewing controversial therapeutic areas. Objective To develop PRISMA-IPD as a stand-alone extension to the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) Statement, tailored to the specific requirements of reporting systematic reviews and meta-analyses of IPD. Although developed primarily for reviews of randomized trials, many items will apply in other contexts, including reviews of diagnosis and prognosis. Design Development of PRISMA-IPD followed the EQUATOR Network framework guidance and used the existing standard PRISMA Statement as a starting point to draft additional relevant material. A web-based survey informed discussion at an international workshop that included researchers, clinicians, methodologists experienced in conducting systematic reviews and meta-analyses of IPD, and journal editors. The statement was drafted and iterative refinements were made by the project, advisory, and development groups. The PRISMA-IPD Development Group reached agreement on the PRISMA-IPD checklist and flow diagram by consensus. Findings Compared with standard PRISMA, the PRISMA-IPD checklist includes 3 new items that address (1) methods of checking the integrity of the IPD (such as pattern of randomization, data consistency, baseline imbalance, and missing data), (2) reporting any important issues that emerge, and (3) exploring variation (such as whether certain types of individual benefit more from the intervention than others). A further additional item was created by reorganization of standard PRISMA items relating to interpreting results. Wording was modified in 23 items to reflect the IPD approach. Conclusions and Relevance PRISMA-IPD provides guidelines for reporting systematic reviews and meta-analyses of IPD.

1,392 citations


Journal ArticleDOI
J. Aasi, J. Abadie, B. P. Abbott, Richard J. Abbott, +884 more (98 institutions)
TL;DR: In this paper, the authors review the performance of the LIGO instruments during this epoch, the work done to characterize the detectors and their data, and the effect that transient and continuous noise artefacts have on the sensitivity of the detectors to a variety of astrophysical sources.
Abstract: In 2009–2010, the Laser Interferometer Gravitational-Wave Observatory (LIGO) operated together with international partners Virgo and GEO600 as a network to search for gravitational waves (GWs) of astrophysical origin. The sensitivity of these detectors was limited by a combination of noise sources inherent to the instrumental design and its environment, often localized in time or frequency, that couple into the GW readout. Here we review the performance of the LIGO instruments during this epoch, the work done to characterize the detectors and their data, and the effect that transient and continuous noise artefacts have on the sensitivity of LIGO to a variety of astrophysical sources.

Journal ArticleDOI
01 Oct 2015-Europace
TL;DR: The current manuscript is an update of the original Practical Guide, published in June 2013, and lists 15 topics of concrete clinical scenarios for which practical answers were formulated, based on available evidence.
Abstract: The current manuscript is an update of the original Practical Guide, published in June 2013 [Heidbuchel H, Verhamme P, Alings M, Antz M, Hacke W, Oldgren J, et al. European Heart Rhythm Association Practical Guide on the use of new oral anticoagulants in patients with non-valvular atrial fibrillation. Europace 2013;15:625-51; Heidbuchel H, Verhamme P, Alings M, Antz M, Hacke W, Oldgren J, et al. EHRA practical guide on the use of new oral anticoagulants in patients with non-valvular atrial fibrillation: executive summary. Eur Heart J 2013;34:2094-106]. Non-vitamin K antagonist oral anticoagulants (NOACs) are an alternative for vitamin K antagonists (VKAs) to prevent stroke in patients with non-valvular atrial fibrillation (AF). Both physicians and patients have to learn how to use these drugs effectively and safely in clinical practice. Many unresolved questions on how to optimally use these drugs in specific clinical situations remain. The European Heart Rhythm Association set out to coordinate a unified way of informing physicians on the use of the different NOACs. A writing group defined what needs to be considered as 'non-valvular AF' and listed 15 topics of concrete clinical scenarios for which practical answers were formulated, based on available evidence. The 15 topics are (i) practical start-up and follow-up scheme for patients on NOACs; (ii) how to measure the anticoagulant effect of NOACs; (iii) drug-drug interactions and pharmacokinetics of NOACs; (iv) switching between anticoagulant regimens; (v) ensuring adherence of NOAC intake; (vi) how to deal with dosing errors; (vii) patients with chronic kidney disease; (viii) what to do if there is a (suspected) overdose without bleeding, or a clotting test is indicating a risk of bleeding?; (ix) management of bleeding complications; (x) patients undergoing a planned surgical intervention or ablation; (xi) patients undergoing an urgent surgical intervention; (xii) patients with AF and coronary artery disease; (xiii) cardioversion in a NOAC-treated patient; (xiv) patients presenting with acute stroke while on NOACs; and (xv) NOACs vs. VKAs in AF patients with a malignancy. Additional information and downloads of the text and anticoagulation cards in >16 languages can be found on a European Heart Rhythm Association web site (www.NOACforAF.eu).

Journal ArticleDOI
TL;DR: This work assembles the Escherichia coli K-12 MG1655 chromosome de novo in a single 4.6-Mb contig using only nanopore data; the assembly reconstructs gene order and has 99.5% nucleotide identity.
Abstract: We have assembled de novo the Escherichia coli K-12 MG1655 chromosome in a single 4.6-Mb contig using only nanopore data. Our method has three stages: (i) overlaps are detected between reads and then corrected by a multiple-alignment process; (ii) corrected reads are assembled using the Celera Assembler; and (iii) the assembly is polished using a probabilistic model of the signal-level data. The assembly reconstructs gene order and has 99.5% nucleotide identity.
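The abstract outlines a three-stage workflow whose first stage detects overlaps between reads. A toy, drastically simplified stand-in for that first stage, reporting read pairs that share a seed k-mer (the real method uses sensitive alignment followed by multiple-alignment correction):

from collections import defaultdict
from itertools import combinations

def candidate_overlaps(reads, k=15):
    """Toy seed-based overlap detection: report read pairs sharing any k-mer.

    A drastically simplified stand-in for stage (i) of the described pipeline;
    the real method uses sensitive alignment and multiple-alignment correction.
    """
    index = defaultdict(set)
    for rid, seq in reads.items():
        for i in range(len(seq) - k + 1):
            index[seq[i:i + k]].add(rid)   # map each k-mer to the reads containing it
    pairs = set()
    for rids in index.values():
        pairs.update(frozenset(p) for p in combinations(sorted(rids), 2))
    return pairs

reads = {"r1": "ACGTACGTACGTACGTTTT", "r2": "CGTACGTACGTACGTTTTA", "r3": "GGGGGGGGGGGGGGGGGGG"}
print(candidate_overlaps(reads, k=12))   # only r1 and r2 share a seed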


Journal ArticleDOI
Roel Aaij, Bernardo Adeva, Marco Adinolfi, A. A. Affolder, +700 more (63 institutions)
TL;DR: In this paper, the performance of the various LHCb sub-detectors and the trigger system are described, using data taken from 2010 to 2012, and it is shown that the design criteria of the experiment have been met.
Abstract: The LHCb detector is a forward spectrometer at the Large Hadron Collider (LHC) at CERN. The experiment is designed for precision measurements of CP violation and rare decays of beauty and charm hadrons. In this paper the performance of the various LHCb sub-detectors and the trigger system are described, using data taken from 2010 to 2012. It is shown that the design criteria of the experiment have been met. The excellent performance of the detector has allowed the LHCb collaboration to publish a wide range of physics results, demonstrating LHCb's unique role, both as a heavy flavour experiment and as a general purpose detector in the forward region.

Journal ArticleDOI
06 Feb 2015-BMJ
TL;DR: The stepped wedge cluster randomised trial is a novel research study design that is increasingly being used in the evaluation of service delivery type interventions and is particularly suited to evaluations that do not rely on individual patient recruitment.
Abstract: The stepped wedge cluster randomised trial is a novel research study design that is increasingly being used in the evaluation of service delivery type interventions. The design involves random and sequential crossover of clusters from control to intervention until all clusters are exposed. It is a pragmatic study design which can reconcile the need for robust evaluations with political or logistical constraints. While not exclusively for the evaluation of service delivery interventions, it is particularly suited to evaluations that do not rely on individual patient recruitment. As in all cluster trials, stepped wedge trials with individual recruitment and without concealment of allocation (or blinding of the intervention) are at risk of selection biases. In a stepped wedge design more clusters are exposed to the intervention towards the end of the study than in its early stages. This implies that the effect of the intervention might be confounded with any underlying temporal trend. A result that initially might seem suggestive of an effect of the intervention may therefore transpire to be the result of a positive underlying temporal trend. Sample size calculations and analysis must make allowance for both the clustered nature of the design and the confounding effect of time. The stepped wedge cluster randomised trial is an alternative to traditional parallel cluster studies, in which the intervention is delivered in only half the clusters with the remainder functioning as controls. When the clusters are relatively homogeneous (that is, the intra-cluster correlation is small), parallel studies tend to deliver better statistical performance than a stepped wedge trial. However, if substantial cluster-level effects are present (that is, larger intra-cluster correlations) or the clusters are large, the stepped wedge design will be more powerful than a parallel design, even one in which the intervention is preceded by a period of baseline control observations.
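The design described above rolls clusters over from control to intervention one step at a time until all are exposed. A minimal sketch of building such an exposure matrix (clusters by periods), assuming one randomly chosen cluster crosses over per step:

import random

def stepped_wedge_design(n_clusters, seed=0):
    """Return a clusters-by-periods 0/1 exposure matrix for a stepped wedge trial.

    Period 0 is an all-control baseline; one randomly chosen cluster crosses
    over to the intervention at each subsequent period until all are exposed.
    """
    rng = random.Random(seed)
    order = list(range(n_clusters))
    rng.shuffle(order)                                   # random crossover order
    n_periods = n_clusters + 1                           # baseline plus one step per cluster
    design = [[0] * n_periods for _ in range(n_clusters)]
    for step, cluster in enumerate(order, start=1):
        for period in range(step, n_periods):
            design[cluster][period] = 1                  # exposed from its step onwards
    return design

for row in stepped_wedge_design(5):
    print(row)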

Journal ArticleDOI
TL;DR: In this paper, the influence of selective laser melting (SLM) process parameters (laser power, scan speed, scan spacing, and island size) on the porosity development in AlSi10Mg alloy builds has been investigated using a statistical design of experiments approach, correlated with the energy density model.
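The TL;DR refers to an energy density model. A commonly used volumetric energy density for SLM parameter studies is E = P / (v · h · t); whether this exact form matches the paper's model is an assumption:

def volumetric_energy_density(power_w, scan_speed_mm_s, hatch_spacing_mm, layer_thickness_mm):
    """Volumetric energy density E = P / (v * h * t) in J/mm^3.

    A commonly used lumped parameter in selective laser melting studies;
    the paper's exact energy density model may differ.
    """
    return power_w / (scan_speed_mm_s * hatch_spacing_mm * layer_thickness_mm)

# Example: 200 W, 1300 mm/s scan speed, 0.1 mm hatch spacing, 0.03 mm layer
print(volumetric_energy_density(200, 1300, 0.1, 0.03))   # ~ 51.3 J/mm^3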

Journal ArticleDOI
Roel Aaij, Bernardo Adeva, Marco Adinolfi, A. A. Affolder, +719 more (49 institutions)
TL;DR: In this article, the pentaquark-charmonium states were observed in the J/ψp channel in Λ0b→J/ψK−p decays, and the significance of each of these resonances is more than 9 standard deviations.
Abstract: Observations of exotic structures in the J/ψp channel, that we refer to as pentaquark-charmonium states, in Λ0b→J/ψK−p decays are presented. The data sample corresponds to an integrated luminosity of 3/fb acquired with the LHCb detector from 7 and 8 TeV pp collisions. An amplitude analysis is performed on the three-body final-state that reproduces the two-body mass and angular distributions. To obtain a satisfactory fit of the structures seen in the J/ψp mass spectrum, it is necessary to include two Breit-Wigner amplitudes that each describe a resonant state. The significance of each of these resonances is more than 9 standard deviations. One has a mass of 4380±8±29 MeV and a width of 205±18±86 MeV, while the second is narrower, with a mass of 4449.8±1.7±2.5 MeV and a width of 39±5±19 MeV. The preferred JP assignments are of opposite parity, with one state having spin 3/2 and the other 5/2.
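The abstract describes the two structures with Breit-Wigner amplitudes of the quoted masses and widths. A schematic non-relativistic Breit-Wigner line shape for the two states, purely illustrative and not the LHCb amplitude model:

def breit_wigner(m, m0, gamma):
    """Schematic non-relativistic Breit-Wigner intensity at mass m (MeV)."""
    return 1.0 / ((m - m0) ** 2 + (gamma / 2.0) ** 2)

# Central mass and width values quoted in the abstract (MeV)
states = {"Pc(4380)": (4380.0, 205.0), "Pc(4450)": (4449.8, 39.0)}

for name, (m0, gamma) in states.items():
    peak = breit_wigner(m0, m0, gamma)
    half = breit_wigner(m0 + gamma / 2.0, m0, gamma)
    print(name, round(half / peak, 2))   # 0.5: the width is the FWHM of the line shape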

Journal ArticleDOI
TL;DR: A review of the current state of scientific knowledge of definitions, processes, and quantification of hydrological drought is given in this paper, in which the influence of climate and terrestrial properties (geology, land use) on hydrological drought characteristics and the role of storage are discussed.
Abstract: Drought is a complex natural hazard that impacts ecosystems and society in many ways. Many of these impacts are associated with hydrological drought (drought in rivers, lakes, and groundwater). It is, therefore, crucial to understand the development and recovery of hydrological drought. In this review an overview is given of the current state of scientific knowledge of definitions, processes, and quantification of hydrological drought. Special attention is given to the influence of climate and terrestrial properties (geology, land use) on hydrological drought characteristics and the role of storage. Furthermore, the current debate about the use and usefulness of different drought indicators is highlighted and recent advances in drought monitoring and prediction are mentioned. Research on projections of hydrological drought for the future is summarized. This review also briefly touches upon the link of hydrological drought characteristics with impacts and the issues related to drought management. Finally, four challenges for future research on hydrological drought are defined that relate international initiatives such as the Intergovernmental Panel on Climate Change (IPCC) and the ‘Panta Rhei’ decade of the International Association of Hydrological Sciences (IAHS). WIREs Water 2015, 2:359–392. doi: 10.1002/wat2.1085 For further resources related to this article, please visit the WIREs website.

Journal ArticleDOI
TL;DR: This work describes the LALInference software library for Bayesian parameter estimation of compact binary signals, which builds on several previous methods to provide a well-tested toolkit which has already been used for several studies.
Abstract: The Advanced LIGO and Advanced Virgo gravitational-wave (GW) detectors will begin operation in the coming years, with compact binary coalescence events a likely source for the first detections. The gravitational waveforms emitted directly encode information about the sources, including the masses and spins of the compact objects. Recovering the physical parameters of the sources from the GW observations is a key analysis task. This work describes the LALInference software library for Bayesian parameter estimation of compact binary signals, which builds on several previous methods to provide a well-tested toolkit which has already been used for several studies. We show that our implementation is able to correctly recover the parameters of compact binary signals from simulated data from the advanced GW detectors. We demonstrate this with a detailed comparison on three compact binary systems: a binary neutron star, a neutron star–black hole binary and a binary black hole, where we show a cross comparison of results obtained using three independent sampling algorithms. These systems were analyzed with nonspinning, aligned spin and generic spin configurations respectively, showing that consistent results can be obtained even with the full 15-dimensional parameter space of the generic spin configurations. We also demonstrate statistically that the Bayesian credible intervals we recover correspond to frequentist confidence intervals under correct prior assumptions by analyzing a set of 100 signals drawn from the prior. We discuss the computational cost of these algorithms, and describe the general and problem-specific sampling techniques we have used to improve the efficiency of sampling the compact binary coalescence parameter space.
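The abstract mentions verifying that Bayesian credible intervals have the expected frequentist coverage when signals are drawn from the prior. A toy version of that kind of coverage check for a conjugate Gaussian model (not the LALInference pipeline itself):

import random

def coverage_fraction(n_trials=1000, credible=0.9, sigma_prior=1.0, sigma_noise=0.5, seed=1):
    """Fraction of trials where the true parameter falls in the credible interval.

    Model: theta ~ N(0, sigma_prior^2); data = theta + N(0, sigma_noise^2).
    With the correct prior, the fraction should approach `credible`.
    """
    rng = random.Random(seed)
    z = 1.6449 if credible == 0.9 else 1.96          # normal quantile for the interval
    hits = 0
    for _ in range(n_trials):
        theta = rng.gauss(0.0, sigma_prior)          # draw signal parameter from the prior
        data = theta + rng.gauss(0.0, sigma_noise)   # noisy observation
        # Conjugate Gaussian posterior for theta given the data
        post_var = 1.0 / (1.0 / sigma_prior**2 + 1.0 / sigma_noise**2)
        post_mean = post_var * data / sigma_noise**2
        half_width = z * post_var**0.5
        hits += post_mean - half_width <= theta <= post_mean + half_width
    return hits / n_trials

print(coverage_fraction())   # expected to be close to 0.9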

Journal ArticleDOI
TL;DR: The proposed system for defining and recording perioperative complications associated with esophagectomy provides an infrastructure to standardize international data collection and facilitate future comparative studies and quality improvement projects.
Abstract: Introduction: Perioperative complications influence long- and short-term outcomes after esophagectomy. The absence of a standardized system for defining and recording complications and quality measures after esophageal resection has meant that there is wide variation in evaluating their impact on these outcomes. Methods: The Esophageal Complications Consensus Group comprised 21 high-volume esophageal surgeons from 14 countries, supported by all the major thoracic and upper gastrointestinal professional societies. Delphi surveys and group meetings were used to achieve a consensus on standardized methods for defining complications and quality measures that could be collected in institutional databases and national audits. Results: A standardized list of complications was created to provide a template for recording individual complications associated with esophagectomy. Where possible, these were linked to preexisting international definitions. A Delphi survey facilitated production of specific definitions for anastomotic leak, conduit necrosis, chyle leak, and recurrent nerve palsy. An additional Delphi survey documented consensus regarding critical quality parameters recommended for routine inclusion in databases. These quality parameters were documentation on mortality, comorbidities, completeness of data collection, blood transfusion, grading of complication severity, changes in level of care, discharge location, and readmission rates. Conclusions: The proposed system for defining and recording perioperative complications associated with esophagectomy provides an infrastructure to standardize international data collection and facilitate future comparative studies and quality improvement projects.

Journal ArticleDOI
TL;DR: The global HCC BRIDGE study was a multiregional, large‐scale, longitudinal cohort study undertaken to improve understanding of real‐life management of patients with HCC, from diagnosis to death.
Abstract: Background & Aims Hepatocellular carcinoma (HCC) is the second most common cause of cancer deaths worldwide. The global HCC BRIDGE study was a multiregional, large-scale, longitudinal cohort study undertaken to improve understanding of real-life management of patients with HCC, from diagnosis to death.

Journal ArticleDOI
TL;DR: The demonstrated helicity multiplexed metasurface hologram with its high performance opens avenues for future applications with functionality switchable optical devices.
Abstract: Metasurfaces are engineered interfaces that contain a thin layer of plasmonic or dielectric nanostructures capable of manipulating light in a desirable manner. Advances in metasurfaces have led to various practical applications ranging from lensing to holography. Metasurface holograms that can be switched by the polarization state of incident light have been demonstrated for achieving polarization multiplexed functionalities. However, practical application of these devices has been limited by their capability for achieving high efficiency and high image quality. Here we experimentally demonstrate a helicity multiplexed metasurface hologram with high efficiency and good image fidelity over a broad range of frequencies. The metasurface hologram features the combination of two sets of hologram patterns operating with opposite incident helicities. Two symmetrically distributed off-axis images are interchangeable by controlling the helicity of the input light. The demonstrated helicity multiplexed metasurface hologram with its high performance opens avenues for future applications with functionality switchable optical devices.

Journal ArticleDOI
TL;DR: A complete review of the different techniques that have been developed to recycle fibre-reinforced polymers is presented in this paper, focusing on the reuse of the valuable products recovered, in particular the way that fibres have been reincorporated into new materials or applications and the main technological issues encountered.

Journal ArticleDOI
TL;DR: In this paper, a clinical classification is used to stratify management based on simple (non-perforated) and complex (gangrenous or perforated) disease, although many patients present with an equivocal diagnosis, which is one of the most challenging dilemmas.

Journal ArticleDOI
TL;DR: In this paper, the development of surface structure and porosity of Ti-6Al-4V samples fabricated by selective laser melting under different laser scanning speeds and powder layer thicknesses has been studied and correlated with the melt flow behavior through both experimental and modelling approaches.

Proceedings ArticleDOI
07 Dec 2015
TL;DR: The Visual Object Tracking challenge 2015, VOT2015, aims at comparing short-term single-object visual trackers that do not apply pre-learned models of object appearance, and presents a new VOT2015 dataset twice as large as that of VOT2014, with full annotation of targets by rotated bounding boxes and per-frame attributes.
Abstract: The Visual Object Tracking challenge 2015, VOT2015, aims at comparing short-term single-object visual trackers that do not apply pre-learned models of object appearance. Results of 62 trackers are presented. The number of tested trackers makes VOT 2015 the largest benchmark on short-term tracking to date. For each participating tracker, a short description is provided in the appendix. Features of the VOT2015 challenge that go beyond its VOT2014 predecessor are: (i) a new VOT2015 dataset twice as large as in VOT2014 with full annotation of targets by rotated bounding boxes and per-frame attributes, (ii) extensions of the VOT2014 evaluation methodology by introduction of a new performance measure. The dataset, the evaluation kit as well as the results are publicly available at the challenge website.
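VOT-style evaluations score trackers partly by region overlap between predicted and ground-truth regions. A simplified axis-aligned intersection-over-union sketch; VOT2015 annotates targets with rotated bounding boxes, which would require polygon intersection instead:

def iou_axis_aligned(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes given as (x, y, w, h).

    A simplified overlap measure; VOT2015 annotates targets with rotated
    bounding boxes, which require polygon intersection instead.
    """
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    ix = max(0.0, min(ax + aw, bx + bw) - max(ax, bx))
    iy = max(0.0, min(ay + ah, by + bh) - max(ay, by))
    inter = ix * iy
    union = aw * ah + bw * bh - inter
    return inter / union if union > 0 else 0.0

# Example: predicted vs ground-truth box with partial overlap
print(iou_axis_aligned((0, 0, 10, 10), (5, 5, 10, 10)))   # 25 / 175 ~ 0.143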

Journal ArticleDOI
Marnix H. Medema, Renzo Kottmann, Pelin Yilmaz, +161 more (84 institutions)
TL;DR: This work proposes the Minimum Information about a Biosynthetic Gene cluster (MIBiG) data standard, to facilitate consistent and systematic deposition and retrieval of data on biosynthetic gene clusters.
Abstract: A wide variety of enzymatic pathways that produce specialized metabolites in bacteria, fungi and plants are known to be encoded in biosynthetic gene clusters. Information about these clusters, pathways and metabolites is currently dispersed throughout the literature, making it difficult to exploit. To facilitate consistent and systematic deposition and retrieval of data on biosynthetic gene clusters, we propose the Minimum Information about a Biosynthetic Gene cluster (MIBiG) data standard.

Journal ArticleDOI
Colm O'Dushlaine, Lizzy Rossin, Phil Lee, Laramie E. Duncan, +401 more (115 institutions)
TL;DR: It is indicated that risk variants for psychiatric disorders aggregate in particular biological pathways and that these pathways are frequently shared between disorders.
Abstract: Genome-wide association studies (GWAS) of psychiatric disorders have identified multiple genetic associations with such disorders, but better methods are needed to derive the underlying biological mechanisms that these signals indicate. We sought to identify biological pathways in GWAS data from over 60,000 participants from the Psychiatric Genomics Consortium. We developed an analysis framework to rank pathways that requires only summary statistics. We combined this score across disorders to find common pathways across three adult psychiatric disorders: schizophrenia, major depression and bipolar disorder. Histone methylation processes showed the strongest association, and we also found statistically significant evidence for associations with multiple immune and neuronal signaling pathways and with the postsynaptic density. Our study indicates that risk variants for psychiatric disorders aggregate in particular biological pathways and that these pathways are frequently shared between disorders. Our results confirm known mechanisms and suggest several novel insights into the etiology of psychiatric disorders.

Journal ArticleDOI
TL;DR: A survey of MaOEAs is reported, in which the proposed many-objective evolutionary algorithms are categorized into seven classes: relaxed dominance based, diversity based, aggregation based, indicator based, reference set based, preference based, and dimensionality reduction approaches.
Abstract: Multiobjective evolutionary algorithms (MOEAs) have been widely used in real-world applications. However, most MOEAs based on Pareto-dominance handle many-objective problems (MaOPs) poorly due to a high proportion of incomparable and thus mutually nondominated solutions. Recently, a number of many-objective evolutionary algorithms (MaOEAs) have been proposed to deal with this scalability issue. In this article, a survey of MaOEAs is reported. According to the key ideas used, MaOEAs are categorized into seven classes: relaxed dominance based, diversity-based, aggregation-based, indicator-based, reference set based, preference-based, and dimensionality reduction approaches. Several future research directions in this field are also discussed.
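The abstract attributes the scalability problem to the high proportion of mutually nondominated solutions as the number of objectives grows. A short sketch that checks Pareto dominance and estimates that proportion for random objective vectors (assuming minimization; illustrative only):

import random

def dominates(a, b):
    """True if solution a Pareto-dominates b (minimization of all objectives)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated_fraction(n_points=200, n_objectives=10, seed=0):
    """Fraction of random points not dominated by any other point."""
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(n_objectives)] for _ in range(n_points)]
    count = sum(not any(dominates(other, p) for other in pop if other is not p) for p in pop)
    return count / n_points

# With many objectives almost every random point is nondominated, which is why
# pure Pareto-dominance selection loses its discriminating power.
print(nondominated_fraction(n_objectives=2), nondominated_fraction(n_objectives=10))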