
Showing papers from Durham University published in 2019


Journal ArticleDOI
TL;DR: This work provides a comprehensive overview of fundamental principles that underpin blockchain technologies, such as system architectures and distributed consensus algorithms, and discusses opportunities, potential challenges and limitations for a number of use cases, ranging from emerging peer-to-peer energy trading and Internet of Things applications, to decentralised marketplaces, electric vehicle charging and e-mobility.
Abstract: Blockchains or distributed ledgers are an emerging technology that has drawn considerable interest from energy supply firms, startups, technology developers, financial institutions, national governments and the academic community. Numerous sources coming from these backgrounds identify blockchains as having the potential to bring significant benefits and innovation. Blockchains promise transparent, tamper-proof and secure systems that can enable novel business solutions, especially when combined with smart contracts. This work provides a comprehensive overview of fundamental principles that underpin blockchain technologies, such as system architectures and distributed consensus algorithms. Next, we focus on blockchain solutions for the energy industry and inform the state-of-the-art by thoroughly reviewing the literature and current business cases. To our knowledge, this is one of the first academic, peer-reviewed works to provide a systematic review of blockchain activities and initiatives in the energy sector. Our study reviews 140 blockchain research projects and startups from which we construct a map of the potential and relevance of blockchains for energy applications. These initiatives were systematically classified into different groups according to the field of activity, implementation platform and consensus strategy used. Opportunities, potential challenges and limitations for a number of use cases are discussed, ranging from emerging peer-to-peer (P2P) energy trading and Internet of Things (IoT) applications, to decentralised marketplaces, electric vehicle charging and e-mobility. For each of these use cases, our contribution is twofold: first, in identifying the technical challenges that blockchain technology can solve for that application as well as its potential drawbacks, and second in briefly presenting the research and industrial projects and startups that are currently applying blockchain technology to that area.
The paper ends with a discussion of challenges and market barriers the technology needs to overcome to get past the hype phase, prove its commercial viability and finally be adopted in the mainstream.
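The tamper-proof property the abstract highlights comes from hash-linking: each block stores the hash of its predecessor, so altering any historical record breaks every later link. A minimal Python sketch, with a hypothetical P2P energy-trade payload (toy data, not any production chain design):

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Deterministic SHA-256 over the block's canonical JSON form."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def make_block(prev_hash: str, payload: str) -> dict:
    return {"prev": prev_hash, "payload": payload}

def chain_is_valid(chain: list) -> bool:
    """Each block must reference the hash of its predecessor."""
    return all(chain[i]["prev"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

# Build a three-block toy ledger of hypothetical P2P energy trades.
genesis = make_block("0" * 64, "genesis")
b1 = make_block(block_hash(genesis), "meter A sells 5 kWh to meter B")
b2 = make_block(block_hash(b1), "meter B sells 2 kWh to meter C")
chain = [genesis, b1, b2]

assert chain_is_valid(chain)
b1["payload"] = "meter A sells 50 kWh to meter B"  # tamper with history
assert not chain_is_valid(chain)  # detected: b2 no longer links to b1
```

A real system layers distributed consensus on top of this structure so that no single party can rewrite the chain and re-hash the links.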

1,399 citations


Journal ArticleDOI
TL;DR: An overview of basic and clinical MRSA research is provided and the expansive body of literature on the epidemiology, transmission, genetic diversity, evolution, surveillance and treatment of MRSA is explored.
Abstract: Methicillin-resistant Staphylococcus aureus (MRSA) is one of the most successful modern pathogens. The same organism that lives as a commensal and is transmitted in both health-care and community settings is also a leading cause of bacteraemia, endocarditis, skin and soft tissue infections, bone and joint infections and hospital-acquired infections. Genetically diverse, the epidemiology of MRSA is primarily characterized by the serial emergence of epidemic strains. Although its incidence has recently declined in some regions, MRSA still poses a formidable clinical threat, with persistently high morbidity and mortality. Successful treatment remains challenging and requires the evaluation of both novel antimicrobials and adjunctive aspects of care, such as infectious disease consultation, echocardiography and source control. In this Review, we provide an overview of basic and clinical MRSA research and summarize the expansive body of literature on the epidemiology, transmission, genetic diversity, evolution, surveillance and treatment of MRSA.

870 citations


Journal ArticleDOI
TL;DR: The full public release of all data from the TNG100 and TNG300 simulations of the IllustrisTNG project is presented in this article; TNG includes a comprehensive model for galaxy formation physics, and each TNG simulation self-consistently solves for the coupled evolution of dark matter, cosmic gas, luminous stars, and supermassive black holes from early times to the present day.
Abstract: We present the full public release of all data from the TNG100 and TNG300 simulations of the IllustrisTNG project. IllustrisTNG is a suite of large volume, cosmological, gravo-magnetohydrodynamical simulations run with the moving-mesh code Arepo. TNG includes a comprehensive model for galaxy formation physics, and each TNG simulation self-consistently solves for the coupled evolution of dark matter, cosmic gas, luminous stars, and supermassive black holes from early time to the present day, $z=0$. Each of the flagship runs—TNG50, TNG100, and TNG300—is accompanied by halo/subhalo catalogs, merger trees, lower-resolution and dark-matter-only counterparts, all available with 100 snapshots. We discuss scientific and numerical cautions and caveats relevant when using TNG. The data volume now directly accessible online is ∼750 TB, including 1200 full volume snapshots and ∼80,000 high time-resolution subbox snapshots. This will increase to ∼1.1 PB with the future release of TNG50. Data access and analysis examples are available in IDL, Python, and Matlab. We describe improvements and new functionality in the web-based API, including on-demand visualization and analysis of galaxies and halos, exploratory plotting of scaling relations and other relationships between galactic and halo properties, and a new JupyterLab interface. This provides an online, browser-based, near-native data analysis platform enabling user computation with local access to TNG data, alleviating the need to download large datasets.
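The web-based API mentioned above addresses objects by simulation name, snapshot number, and subhalo ID. The sketch below only constructs the request URL and headers in the pattern used in the project's public examples; the API key and IDs are placeholders, the exact paths should be checked against the documentation, and no network request is actually sent:

```python
# Sketch of forming a query against the TNG web API (assumed endpoint layout).
base_url = "http://www.tng-project.org/api/"

def subhalo_url(simulation: str, snapshot: int, subhalo_id: int) -> str:
    """URL of a single subhalo's metadata endpoint."""
    return f"{base_url}{simulation}/snapshots/{snapshot}/subhalos/{subhalo_id}/"

headers = {"api-key": "YOUR_API_KEY"}  # placeholder: obtain a key by registering
url = subhalo_url("TNG100-1", 99, 0)   # snapshot 99 corresponds to z = 0
# An actual request would be: requests.get(url, headers=headers).json()
```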

588 citations


Journal ArticleDOI
TL;DR: Enzalutamide with ADT significantly reduced the risk of metastatic progression or death over time versus placebo plus ADT in men with mHSPC, including those with low-volume disease and/or prior docetaxel, with a safety analysis that seems consistent with the safety profile of enzalutamide in previous clinical trials in castration-resistant prostate cancer.
Abstract: PURPOSEEnzalutamide, a potent androgen-receptor inhibitor, has demonstrated significant benefits in metastatic and nonmetastatic castration-resistant prostate cancer. We evaluated the efficacy and ...

560 citations


Journal ArticleDOI
TL;DR: With extended follow-up, nivolumab plus ipilimumab remained superior to sunitinib in overall survival in intermediate-risk or poor-risk patients, with durable responses and a manageable safety profile.
Abstract: Summary Background In the ongoing phase 3 CheckMate 214 trial, nivolumab plus ipilimumab showed superior efficacy over sunitinib in patients with previously untreated intermediate-risk or poor-risk advanced renal cell carcinoma, with a manageable safety profile. In this study, we aimed to assess efficacy and safety after extended follow-up to inform the long-term clinical benefit of nivolumab plus ipilimumab versus sunitinib in this setting. Methods In the phase 3, randomised, controlled CheckMate 214 trial, patients aged 18 years and older with previously untreated, advanced, or metastatic histologically confirmed renal cell carcinoma with a clear-cell component were recruited from 175 hospitals and cancer centres in 28 countries. Patients were categorised by International Metastatic Renal Cell Carcinoma Database Consortium risk status into favourable-risk, intermediate-risk, and poor-risk subgroups and randomly assigned (1:1) to open-label nivolumab (3 mg/kg intravenously) plus ipilimumab (1 mg/kg intravenously) every 3 weeks for four doses, followed by nivolumab (3 mg/kg intravenously) every 2 weeks; or sunitinib (50 mg orally) once daily for 4 weeks (6-week cycle). Randomisation was done through an interactive voice response system, with a block size of four and stratified by risk status and geographical region. The co-primary endpoints for the trial were overall survival, progression-free survival per independent radiology review committee (IRRC), and objective responses per IRRC in intermediate-risk or poor-risk patients. Secondary endpoints were overall survival, progression-free survival per IRRC, and objective responses per IRRC in the intention-to-treat population, and adverse events in all treated patients. In this Article, we report overall survival, investigator-assessed progression-free survival, investigator-assessed objective response, characterisation of response, and safety after extended follow-up. 
Efficacy outcomes were assessed in all randomly assigned patients; safety was assessed in all treated patients. This study is registered with ClinicalTrials.gov, number NCT02231749, and is ongoing but now closed to recruitment. Findings Between Oct 16, 2014, and Feb 23, 2016, of 1390 patients screened, 1096 (79%) eligible patients were randomly assigned to nivolumab plus ipilimumab or sunitinib (550 vs 546 in the intention-to-treat population; 425 vs 422 intermediate-risk or poor-risk patients, and 125 vs 124 favourable-risk patients). With extended follow-up (median follow-up 32·4 months [IQR 13·4–36·3]), in intermediate-risk or poor-risk patients, results for the three co-primary efficacy endpoints showed that nivolumab plus ipilimumab continued to be superior to sunitinib in terms of overall survival (median not reached [95% CI 35·6–not estimable] vs 26·6 months [22·1–33·4]; hazard ratio [HR] 0·66 [95% CI 0·54–0·80], p Interpretation The results suggest that the superior efficacy of nivolumab plus ipilimumab over sunitinib was maintained in intermediate-risk or poor-risk and intention-to-treat patients with extended follow-up, and show the long-term benefits of nivolumab plus ipilimumab in patients with previously untreated advanced renal cell carcinoma across all risk categories. Funding Bristol-Myers Squibb and ONO Pharmaceutical.

527 citations


Journal ArticleDOI
A. Abada, Marcello Abbrescia, Shehu S. AbdusSalam, +1491 more (239 institutions)
TL;DR: In this article, the authors present the second volume of the Future Circular Collider Conceptual Design Report, devoted to the electron-positron collider FCC-ee, and present the accelerator design, performance reach, a staged operation scenario, the underlying technologies, civil engineering, technical infrastructure, and an implementation plan.
Abstract: In response to the 2013 Update of the European Strategy for Particle Physics, the Future Circular Collider (FCC) study was launched, as an international collaboration hosted by CERN. This study covers a highest-luminosity high-energy lepton collider (FCC-ee) and an energy-frontier hadron collider (FCC-hh), which could, successively, be installed in the same 100 km tunnel. The scientific capabilities of the integrated FCC programme would serve the worldwide community throughout the 21st century. The FCC study also investigates an LHC energy upgrade, using FCC-hh technology. This document constitutes the second volume of the FCC Conceptual Design Report, devoted to the electron-positron collider FCC-ee. After summarizing the physics discovery opportunities, it presents the accelerator design, performance reach, a staged operation scenario, the underlying technologies, civil engineering, technical infrastructure, and an implementation plan. FCC-ee can be built with today’s technology. Most of the FCC-ee infrastructure could be reused for FCC-hh. Combining concepts from past and present lepton colliders and adding a few novel elements, the FCC-ee design promises outstandingly high luminosity. This will make the FCC-ee a unique precision instrument to study the heaviest known particles (Z, W and H bosons and the top quark), offering great direct and indirect sensitivity to new physics.

526 citations


Journal ArticleDOI
Arjun Dey, David J. Schlegel, Dustin Lang, +162 more (52 institutions)
TL;DR: The DESI Legacy Imaging Surveys (http://legacysurvey.org/) as mentioned in this paper is a combination of three public projects (the Dark Energy Camera Legacy Survey, the Beijing-Arizona Sky Survey, and the Mayall z-band Legacy Survey) that will jointly image ≈14,000 deg2 of the extragalactic sky visible from the northern hemisphere in three optical bands (g, r, and z) using telescopes at the Kitt Peak National Observatory and the Cerro Tololo Inter-American Observatory.
Abstract: The DESI Legacy Imaging Surveys (http://legacysurvey.org/) are a combination of three public projects (the Dark Energy Camera Legacy Survey, the Beijing–Arizona Sky Survey, and the Mayall z-band Legacy Survey) that will jointly image ≈14,000 deg2 of the extragalactic sky visible from the northern hemisphere in three optical bands (g, r, and z) using telescopes at the Kitt Peak National Observatory and the Cerro Tololo Inter-American Observatory. The combined survey footprint is split into two contiguous areas by the Galactic plane. The optical imaging is conducted using a unique strategy of dynamically adjusting the exposure times and pointing selection during observing that results in a survey of nearly uniform depth. In addition to calibrated images, the project is delivering a catalog, constructed by using a probabilistic inference-based approach to estimate source shapes and brightnesses. The catalog includes photometry from the grz optical bands and from four mid-infrared bands (at 3.4, 4.6, 12, and 22 μm) observed by the Wide-field Infrared Survey Explorer satellite during its full operational lifetime. The project plans two public data releases each year. All the software used to generate the catalogs is also released with the data. This paper provides an overview of the Legacy Surveys project.
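Legacy Surveys catalogs report fluxes in linear nanomaggy units, which convert to AB magnitudes through the survey zero point of 22.5. A one-line helper (the conversion formula is standard for these catalogs; the example values are purely illustrative):

```python
import math

def nanomaggies_to_mag(flux: float) -> float:
    """AB magnitude from a flux in nanomaggies, the catalog's flux unit:
    m = 22.5 - 2.5 * log10(flux)."""
    return 22.5 - 2.5 * math.log10(flux)

# A 1 nanomaggy source is 22.5 mag by construction of the zero point:
assert nanomaggies_to_mag(1.0) == 22.5
# A source 100x brighter is 5 magnitudes brighter:
assert nanomaggies_to_mag(100.0) == 17.5
```

Working in linear flux units rather than magnitudes lets the catalogs represent low-significance and negative flux measurements without special cases.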

517 citations


Journal ArticleDOI
TL;DR: An international panel of clinicians and laboratory-based scientists convened by Cancer Research UK identifies and discusses seven challenges that must be overcome to cure all patients with a brain tumour.
Abstract: Despite decades of research, brain tumours remain among the deadliest of all forms of cancer. The ability of these tumours to resist almost all conventional and novel treatments relates, in part, to the unique cell-intrinsic and microenvironmental properties of neural tissues. In an attempt to encourage progress in our understanding and ability to successfully treat patients with brain tumours, Cancer Research UK convened an international panel of clinicians and laboratory-based scientists to identify challenges that must be overcome if we are to cure all patients with a brain tumour. The seven key challenges summarized in this Position Paper are intended to serve as foci for future research and investment.

466 citations


Journal ArticleDOI
TL;DR: Identification and management of patients at high bleeding risk undergoing percutaneous coronary intervention are of major importance, but definitions of this population lack standardization.
Abstract: Identification and management of patients at high bleeding risk undergoing percutaneous coronary intervention are of major importance, but a lack of standardization in defining this population limi ...

450 citations


Journal ArticleDOI
A. Abada, Marcello Abbrescia, Shehu S. AbdusSalam, +1496 more (238 institutions)
TL;DR: In this paper, the authors describe the detailed design and preparation of a construction project for a post-LHC circular energy frontier collider in collaboration with national institutes, laboratories and universities worldwide, and enhanced by a strong participation of industrial partners.
Abstract: Particle physics has arrived at an important moment of its history. The discovery of the Higgs boson, with a mass of 125 GeV, completes the matrix of particles and interactions that has constituted the “Standard Model” for several decades. This model is a consistent and predictive theory, which has so far proven successful at describing all phenomena accessible to collider experiments. However, several experimental facts do require the extension of the Standard Model and explanations are needed for observations such as the abundance of matter over antimatter, the striking evidence for dark matter and the non-zero neutrino masses. Theoretical issues such as the hierarchy problem, and, more generally, the dynamical origin of the Higgs mechanism, likewise point to the existence of physics beyond the Standard Model. This report contains the description of a novel research infrastructure based on a highest-energy hadron collider with a centre-of-mass collision energy of 100 TeV and an integrated luminosity of at least a factor of 5 larger than the HL-LHC. It will extend the current energy frontier by almost an order of magnitude. The mass reach for direct discovery will reach several tens of TeV, and will allow, for example, the production of new particles whose existence could be indirectly exposed by precision measurements during the preceding e+e– collider phase. This collider will also precisely measure the Higgs self-coupling and thoroughly explore the dynamics of electroweak symmetry breaking at the TeV scale, to elucidate the nature of the electroweak phase transition. WIMPs as thermal dark matter candidates will be discovered, or ruled out. As a single project, this particle collider infrastructure will serve the world-wide physics community for about 25 years and, in combination with a lepton collider (see FCC conceptual design report volume 2), will provide a research tool until the end of the 21st century.
Collision energies beyond 100 TeV can be considered when using high-temperature superconductors. The European Strategy for Particle Physics (ESPP) update 2013 stated “To stay at the forefront of particle physics, Europe needs to be in a position to propose an ambitious post-LHC accelerator project at CERN by the time of the next Strategy update”. The FCC study has implemented the ESPP recommendation by developing a long-term vision for an “accelerator project in a global context”. This document describes the detailed design and preparation of a construction project for a post-LHC circular energy frontier collider “in collaboration with national institutes, laboratories and universities worldwide”, and enhanced by a strong participation of industrial partners. Now, a coordinated preparation effort can be based on a core of an ever-growing consortium of already more than 135 institutes worldwide. The technology for constructing a high-energy circular hadron collider can be brought to the technology readiness level required for construction within the coming ten years through a focused R&D programme. The FCC-hh concept comprises, in the baseline scenario, a power-saving, low-temperature superconducting magnet system based on an evolution of the Nb3Sn technology pioneered at the HL-LHC, an energy-efficient cryogenic refrigeration infrastructure based on a neon-helium (Nelium) light gas mixture, a high-reliability and low-loss cryogen distribution infrastructure based on Invar, high-power distributed beam transfer using superconducting elements and local magnet energy recovery and re-use technologies that are already gradually introduced at other CERN accelerators.
On a longer timescale, high-temperature superconductors can be developed together with industrial partners to achieve an even more energy-efficient particle collider or to reach even higher collision energies. The re-use of the LHC and its injector chain, which also serve for a concurrently running physics programme, is an essential lever to come to an overall sustainable research infrastructure at the energy frontier. Strategic R&D for FCC-hh aims at minimising construction cost and energy consumption, while maximising the socio-economic impact. It will mitigate technology-related risks and ensure that industry can benefit from an acceptable utility. Concerning the implementation, a preparatory phase of about eight years is both necessary and adequate to establish the project governance and organisation structures, to build the international machine and experiment consortia, to develop a territorial implantation plan in agreement with the host-states’ requirements, to optimise the disposal of land and underground volumes, and to prepare the civil engineering project. Such a large-scale, international fundamental research infrastructure, tightly involving industrial partners and providing training at all education levels, will be a strong motor of economic and societal development in all participating nations. The FCC study has implemented a set of actions towards a coherent vision for the world-wide high-energy and particle physics community, providing a collaborative framework for topically complementary and geographically well-balanced contributions. This conceptual design report lays the foundation for a subsequent infrastructure preparatory and technical design phase.
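The headline combination of 100 TeV collisions and 16 T Nb3Sn dipoles follows from the standard magnetic-rigidity relation p ≈ 0.3 B ρ. A rough sketch with round FCC-hh-like figures (16 T field, roughly 10.4 km bending radius; the precise lattice parameters are in the CDR):

```python
def beam_momentum_tev(b_field_tesla: float, bending_radius_m: float) -> float:
    """Beam momentum from magnetic rigidity:
    p [TeV/c] = 0.2998 * B [T] * rho [m] / 1000."""
    return 0.2998 * b_field_tesla * bending_radius_m / 1000.0

# Illustrative FCC-hh-like numbers, not the official lattice values:
p = beam_momentum_tev(16.0, 10_400.0)
print(f"per-beam energy ~ {p:.0f} TeV, centre-of-mass ~ {2 * p:.0f} TeV")
```

The same relation shows why higher-field magnets (e.g. high-temperature superconductors) raise the reachable collision energy for a tunnel of fixed size.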

425 citations


Journal ArticleDOI
A. Abada, Marcello Abbrescia, Shehu S. AbdusSalam, +1501 more (239 institutions)
TL;DR: In this article, the physics opportunities of the Future Circular Collider (FCC) were reviewed, covering its e+e-, pp, ep and heavy ion programmes, and the measurement capabilities of each FCC component, addressing the study of electroweak, Higgs and strong interactions.
Abstract: We review the physics opportunities of the Future Circular Collider, covering its e+e-, pp, ep and heavy ion programmes. We describe the measurement capabilities of each FCC component, addressing the study of electroweak, Higgs and strong interactions, the top quark and flavour, as well as phenomena beyond the Standard Model. We highlight the synergy and complementarity of the different colliders, which will contribute to a uniquely coherent and ambitious research programme, providing an unmatchable combination of precision and sensitivity to new physics.

Journal ArticleDOI
TL;DR: Improved genome assemblies of allotetraploid cotton species Gossypium hirsutum and Gossypium barbadense provide insights into cotton evolution and inform the construction of introgression lines used to identify loci associated with fiber quality.
Abstract: Allotetraploid cotton species (Gossypium hirsutum and Gossypium barbadense) have long been cultivated worldwide for natural renewable textile fibers. The draft genome sequences of both species are available but they are highly fragmented and incomplete [1-4]. Here we report reference-grade genome assemblies and annotations for G. hirsutum accession Texas Marker-1 (TM-1) and G. barbadense accession 3-79 by integrating single-molecule real-time sequencing, BioNano optical mapping and high-throughput chromosome conformation capture techniques. Compared with previously assembled draft genomes [1,3], these genome sequences show considerable improvements in contiguity and completeness for regions with high content of repeats such as centromeres. Comparative genomics analyses identify extensive structural variations that probably occurred after polyploidization, highlighted by large paracentric/pericentric inversions in 14 chromosomes. We constructed an introgression line population to introduce favorable chromosome segments from G. barbadense to G. hirsutum, allowing us to identify 13 quantitative trait loci associated with superior fiber quality. These resources will accelerate evolutionary and functional genomic studies in cotton and inform future breeding programs for fiber improvement.
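The "contiguity" improvements cited above are conventionally quantified with N50: the contig length at which contigs of at least that length cover half the assembly. A small sketch on toy contig lengths (illustrative data only, not the cotton assemblies):

```python
def n50(contig_lengths):
    """N50: the length L such that contigs of length >= L together cover
    at least half the total assembly size (a standard contiguity metric)."""
    total = sum(contig_lengths)
    running = 0
    for length in sorted(contig_lengths, reverse=True):
        running += length
        if running * 2 >= total:
            return length

# Toy assembly, total 100: sorted desc 40, 30, 15, 10, 5.
# Cumulative 40 < 50, then 70 >= 50, so N50 is 30.
assert n50([10, 40, 5, 30, 15]) == 30
```

A higher N50 means fewer, longer contigs, which is why long-read sequencing and optical mapping improve it, especially across repeat-rich regions such as centromeres.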

Journal ArticleDOI
TL;DR: Treatment was well tolerated, and the efficacy of durvalumab plus tremelimumab therapy and durvalUMab monotherapy reflected a population of patients with mPDAC who had poor prognoses and rapidly progressing disease.
Abstract: Importance New therapeutic options for patients with metastatic pancreatic ductal adenocarcinoma (mPDAC) are needed. This study evaluated dual checkpoint combination therapy in patients with mPDAC. Objective To evaluate the safety and efficacy of the anti-PD-L1 (programmed death-ligand 1) antibody using either durvalumab monotherapy or in combination with the anticytotoxic T-lymphocyte antigen 4 antibody using durvalumab plus tremelimumab therapy in patients with mPDAC. Design, Setting, and Participants Part A of this multicenter, 2-part, phase 2 randomized clinical trial was a lead-in safety, open-label study with planned expansion to part B pending an efficacy signal from part A. Between November 26, 2015, and March 23, 2017, 65 patients with mPDAC who had previously received only 1 first-line fluorouracil-based or gemcitabine-based treatment were enrolled at 21 sites in 6 countries. Efficacy analysis included the intent-to-treat population; safety analysis included patients who received at least 1 dose of study treatment and for whom any postdose data were available. Interventions Patients received durvalumab (1500 mg every 4 weeks) plus tremelimumab (75 mg every 4 weeks) combination therapy for 4 cycles followed by durvalumab therapy (1500 mg every 4 weeks) or durvalumab monotherapy (1500 mg every 4 weeks) for up to 12 months or until the onset of progressive disease or unacceptable toxic effects. Main Outcomes and Measures Safety and efficacy were measured by objective response rate, which was used to determine study expansion to part B. The threshold for expansion was an objective response rate of 10% for either treatment arm. Results Among 65 randomized patients, 34 (52%) were men and median age was 61 (95% CI, 37-81) years. Grade 3 or higher treatment-related adverse events occurred in 7 of 32 patients (22%) receiving combination therapy and in 2 of 32 patients (6%) receiving monotherapy; 1 patient randomized to the monotherapy arm did not receive treatment owing to worsened disease. Fatigue, diarrhea, and pruritus were the most common adverse events in both arms. Overall, 4 of 64 patients (6%) discontinued treatment owing to treatment-related adverse events. Objective response rate was 3.1% (95% CI, 0.08-16.22) for patients receiving combination therapy and 0% (95% CI, 0.00-10.58) for patients receiving monotherapy. Low patient numbers limited observation of the associations between treatment response and PD-L1 expression or microsatellite instability status. Conclusion and Relevance Treatment was well tolerated, and the efficacy of durvalumab plus tremelimumab therapy and durvalumab monotherapy reflected a population of patients with mPDAC who had poor prognoses and rapidly progressing disease. Patients were not enrolled in part B because the threshold for efficacy was not met in part A. Trial Registration ClinicalTrials.gov identifier: NCT02558894

Journal ArticleDOI
TL;DR: Bile acids (BAs), products of cholesterol metabolism and clearance, are produced in the liver, are further metabolized by gut bacteria, and appear dysregulated in Alzheimer's disease (AD).
Abstract: Introduction Increasing evidence suggests a role for the gut microbiome in central nervous system disorders and a specific role for the gut-brain axis in neurodegeneration. Bile acids (BAs), products of cholesterol metabolism and clearance, are produced in the liver and are further metabolized by gut bacteria. They have major regulatory and signaling functions and seem dysregulated in Alzheimer's disease (AD). Methods Serum levels of 15 primary and secondary BAs and their conjugated forms were measured in 1464 subjects including 370 cognitively normal older adults, 284 with early mild cognitive impairment, 505 with late mild cognitive impairment, and 305 AD cases enrolled in the AD Neuroimaging Initiative. We assessed associations of BA profiles including selected ratios with diagnosis, cognition, and AD-related genetic variants, adjusting for confounders and multiple testing. Results In AD compared to cognitively normal older adults, we observed significantly lower serum concentrations of a primary BA (cholic acid [CA]) and increased levels of the bacterially produced, secondary BA, deoxycholic acid, and its glycine and taurine conjugated forms. An increased ratio of deoxycholic acid:CA, which reflects 7α-dehydroxylation of CA by gut bacteria, strongly associated with cognitive decline, a finding replicated in serum and brain samples in the Rush Religious Orders and Memory and Aging Project. Several genetic variants in immune response–related genes implicated in AD showed associations with BA profiles. Discussion We report for the first time an association between altered BA profile, genetic variants implicated in AD, and cognitive changes in disease using a large multicenter study. These findings warrant further investigation of gut dysbiosis and possible role of gut-liver-brain axis in the pathogenesis of AD.

Journal ArticleDOI
TL;DR: It is found that implicit measures can be changed, but effects are often relatively weak (|ds| < .30), and changes in implicit measures did not mediate changes in explicit measures or behavior.
Abstract: Using a novel technique known as network meta-analysis, we synthesized evidence from 492 studies (87,418 participants) to investigate the effectiveness of procedures in changing implicit measures, which we define as response biases on implicit tasks. We also evaluated these procedures' effects on explicit and behavioral measures. We found that implicit measures can be changed, but effects are often relatively weak (|ds| < .30). Most studies focused on producing short-term changes with brief, single-session manipulations. Procedures that associate sets of concepts, invoke goals or motivations, or tax mental resources changed implicit measures the most, whereas procedures that induced threat, affirmation, or specific moods/emotions changed implicit measures the least. Bias tests suggested that implicit effects could be inflated relative to their true population values. Procedures changed explicit measures less consistently and to a smaller degree than implicit measures and generally produced trivial changes in behavior. Finally, changes in implicit measures did not mediate changes in explicit measures or behavior. Our findings suggest that changes in implicit measures are possible, but those changes do not necessarily translate into changes in explicit measures or behavior. (PsycINFO Database Record (c) 2019 APA, all rights reserved).
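The effect sizes quoted above (|ds| < .30) are standardized mean differences; a pooled-SD Cohen's d can be computed as below. The score vectors are hypothetical, invented for illustration, not the paper's data:

```python
import math

def cohens_d(group1, group2):
    """Standardized mean difference (pooled-SD Cohen's d)."""
    n1, n2 = len(group1), len(group2)
    m1 = sum(group1) / n1
    m2 = sum(group2) / n2
    # Sample variances (Bessel-corrected), pooled across both groups.
    v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Hypothetical implicit-task scores for two conditions:
d = cohens_d([0.52, 0.48, 0.61, 0.55], [0.45, 0.50, 0.47, 0.44])
# By the paper's benchmark, |d| < .30 would count as a relatively weak effect.
```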

Journal ArticleDOI
TL;DR: In this article, an integrative analysis spanning a broad spectrum of literature distinguishes two research lines in the field of entrepreneurship; drawing on articles from journals indexed in the Web of Science database, the findings enable an analysis of the interaction among institutions, entrepreneurship and economic growth.
Abstract: This paper analyzes an emergent stream of research shedding light on the institutional factors shaping entrepreneurial activity and its effect on economic growth. This integrative analysis spanning a broad spectrum of diverse literature enables a distinction between two different research lines in the field of entrepreneurship. The findings of this study, based on articles from the journals included in the Web of Science database, facilitate a broader comprehension of two separate lines of research, which allows an analysis of the interaction among institutions, entrepreneurship, and economic growth. The systematic analysis of the last 25 years (1992–2016) of research reveals that institutions could be related to economic growth through entrepreneurship, which opens new research questions about which institutional factors are conducive to entrepreneurship, which in turn spurs economic growth. Thus, not only is understanding both complex relationships and their possible sequence useful for planning strategies and public policies, but it is also helpful for advancing and providing new insights in these research fields, which could be complementary and interdisciplinary.

Journal ArticleDOI
22 Aug 2019
TL;DR: Drug-induced liver injury (DILI) is an adverse reaction to drugs or other xenobiotics that occurs either as a predictable event when an individual is exposed to toxic doses of some compounds or as an unpredictable event with many drugs in common use as discussed by the authors.
Abstract: Drug-induced liver injury (DILI) is an adverse reaction to drugs or other xenobiotics that occurs either as a predictable event when an individual is exposed to toxic doses of some compounds or as an unpredictable event with many drugs in common use. Drugs can be harmful to the liver in susceptible individuals owing to genetic and environmental risk factors. These risk factors modify hepatic metabolism and excretion of the DILI-causative agent leading to cellular stress, cell death, activation of an adaptive immune response and a failure to adapt, with progression to overt liver injury. Idiosyncratic DILI is a relatively rare hepatic disorder but can be severe and, in some cases, fatal, presenting with a variety of phenotypes, which mimic other hepatic diseases. The diagnosis of DILI relies on the exclusion of other aetiologies of liver disease as specific biomarkers are still lacking. Clinical scales such as CIOMS/RUCAM can support the diagnostic process but need refinement. A number of clinical variables, validated in prospective cohorts, can be used to predict a more severe DILI outcome. Although no pharmacological therapy has been adequately tested in randomized clinical trials, corticosteroids can be useful, particularly in the emergent form of DILI related to immune-checkpoint inhibitors in patients with cancer.

Journal ArticleDOI
18 Sep 2019
TL;DR: Sherpa as discussed by the authors is a general-purpose Monte Carlo event generator for the simulation of particle collisions in high-energy collider experiments, which is heavily used for event generation in the analysis and interpretation of LHC Run 1 and Run 2 data.
Abstract: Sherpa is a general-purpose Monte Carlo event generator for the simulation of particle collisions in high-energy collider experiments. We summarize essential features and improvements of the Sherpa 2.2 release series, which is heavily used for event generation in the analysis and interpretation of LHC Run 1 and Run 2 data. We highlight a decade of developments towards ever higher precision in the simulation of particle-collision events.

Journal ArticleDOI
TL;DR: Although most patients had no alterations in medical therapy, multiple clinical factors were independently associated with medication changes, and quality-improvement efforts are urgently needed to improve guideline-directed medication titration for HFrEF.

Journal ArticleDOI
TL;DR: A synthesis of empirical data and numerical modelling results related to pre-LGM ice sheets, compiled to produce new hypotheses regarding their extent in the Northern Hemisphere at 17 time-slices that span the Quaternary, shows pronounced ice-sheet asymmetry within the last glacial cycle and significant variations in ice-marginal positions between older glacial cycles.
Abstract: Our understanding of how global climatic changes are translated into ice-sheet fluctuations and sea-level change is currently limited by a lack of knowledge of the configuration of ice sheets prior to the Last Glacial Maximum (LGM). Here, we compile a synthesis of empirical data and numerical modelling results related to pre-LGM ice sheets to produce new hypotheses regarding their extent in the Northern Hemisphere (NH) at 17 time-slices that span the Quaternary. Our reconstructions illustrate pronounced ice-sheet asymmetry within the last glacial cycle and significant variations in ice-marginal positions between older glacial cycles. We find support for a significant reduction in the extent of the Laurentide Ice Sheet (LIS) during MIS 3, implying that global sea levels may have been 30–40 m higher than most previous estimates. Our ice-sheet reconstructions illustrate the current state-of-the-art knowledge of pre-LGM ice sheets and provide a conceptual framework to interpret NH landscape evolution. How global climatic changes are translated into ice-sheet fluctuations and sea-level change is not well understood. Here the authors present a compilation of empirical data and numerical modelling results of pre-LGM Northern Hemisphere ice sheet changes and show pronounced ice-sheet asymmetry within the last glacial cycle and significant variations in ice-marginal positions between older glacial cycles.

Journal ArticleDOI
TL;DR: In this paper, the potential for observing gravitational waves from cosmological phase transitions with LISA was investigated, based on current state-of-the-art simulations of sound waves in the cosmic fluid after the phase transition completes.
Abstract: We investigate the potential for observing gravitational waves from cosmological phase transitions with LISA in light of recent theoretical and experimental developments. Our analysis is based on current state-of-the-art simulations of sound waves in the cosmic fluid after the phase transition completes. We discuss the various sources of gravitational radiation, the underlying parameters describing the phase transition and a variety of viable particle physics models in this context, clarifying common misconceptions that appear in the literature and identifying open questions requiring future study. We also present a web-based tool, PTPlot, that allows users to obtain up-to-date detection prospects for a given set of phase transition parameters at LISA.

Journal ArticleDOI
TL;DR: This work underlines the continuum from normal to aberrant perception, encouraging a more empathic approach to clinical hallucinations, and highlights the role of prior beliefs as a critical elicitor of hallucinations.

Journal ArticleDOI
04 Sep 2019-Nature
TL;DR: A trans-chromatin regulatory pathway is revealed that connects aberrant intergenic CpG methylation to human neoplastic and developmental overgrowth and NSD1-mediated H3K36me2 is required for the recruitment of DNMT3A and maintenance of DNA methylation at intergenic regions.
Abstract: Enzymes that catalyse CpG methylation in DNA, including the DNA methyltransferases 1 (DNMT1), 3A (DNMT3A) and 3B (DNMT3B), are indispensable for mammalian tissue development and homeostasis1–4. They are also implicated in human developmental disorders and cancers5–8, supporting the critical role of DNA methylation in the specification and maintenance of cell fate. Previous studies have suggested that post-translational modifications of histones are involved in specifying patterns of DNA methyltransferase localization and DNA methylation at promoters and actively transcribed gene bodies9–11. However, the mechanisms that control the establishment and maintenance of intergenic DNA methylation remain poorly understood. Tatton–Brown–Rahman syndrome (TBRS) is a childhood overgrowth disorder that is defined by germline mutations in DNMT3A. TBRS shares clinical features with Sotos syndrome (which is caused by haploinsufficiency of NSD1, a histone methyltransferase that catalyses the dimethylation of histone H3 at K36 (H3K36me2)8,12,13), which suggests that there is a mechanistic link between these two diseases. Here we report that NSD1-mediated H3K36me2 is required for the recruitment of DNMT3A and maintenance of DNA methylation at intergenic regions. Genome-wide analysis shows that the binding and activity of DNMT3A colocalize with H3K36me2 at non-coding regions of euchromatin. Genetic ablation of Nsd1 and its paralogue Nsd2 in mouse cells results in a redistribution of DNMT3A to H3K36me3-modified gene bodies and a reduction in the methylation of intergenic DNA. Blood samples from patients with Sotos syndrome and NSD1-mutant tumours also exhibit hypomethylation of intergenic DNA. The PWWP domain of DNMT3A shows dual recognition of H3K36me2 and H3K36me3 in vitro, with a higher binding affinity towards H3K36me2 that is abrogated by TBRS-derived missense mutations. 
Together, our study reveals a trans-chromatin regulatory pathway that connects aberrant intergenic CpG methylation to human neoplastic and developmental overgrowth. H3K36me2 targets DNMT3A to intergenic regions and this process, together with H3K36me3-mediated recruitment of DNMT3B, has a key role in establishing and maintaining genomic DNA methylation landscapes.

Journal ArticleDOI
TL;DR: In this article, the authors present measurements of the expansion rate of the universe based on a Hubble diagram of quasars, whose distances are estimated from their X-ray and ultraviolet emission.
Abstract: The concordance model (Λ cold dark matter (ΛCDM) model, where Λ is the cosmological constant) reproduces the main current cosmological observations1–4 assuming the validity of general relativity at all scales and epochs and the presence of CDM and of Λ, equivalent to dark energy with a constant density in space and time. However, the ΛCDM model is poorly tested in the redshift interval between the farthest observed type Ia supernovae5 and the cosmic microwave background. We present measurements of the expansion rate of the Universe based on a Hubble diagram of quasars. Quasars are the most luminous persistent sources in the Universe, observed up to redshifts of z ≈ 7.5 (refs. 6,7). We estimate their distances following a method developed by our group8–10, based on the X-ray and ultraviolet emission of the quasars. The distance modulus/redshift relation of quasars at z < 1.4 is in agreement with that of supernovae and with the concordance model. However, a deviation from the ΛCDM model emerges at higher redshift, with a statistical significance of ~4σ. If an evolution of the dark energy equation of state is allowed, the data suggest dark energy density increasing with time. The concordance cosmology model is poorly tested at high redshifts. Here the expansion rate of the Universe in the range 0.5 < z < 5.1 is measured based on a Hubble diagram of quasars, whose distances are estimated from their X-ray and ultraviolet emission.
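The deviation reported above is measured against the flat ΛCDM prediction for the distance modulus/redshift relation. The following is a minimal numerical sketch of that baseline relation, μ(z) = 5 log₁₀(d_L / 10 pc), not the authors' X-ray/ultraviolet distance calibration; the H0 and Ωm values are illustrative assumptions.

```python
import math

C_KM_S = 299792.458  # speed of light, km/s

def distance_modulus(z, h0=70.0, omega_m=0.3, n_steps=100_000):
    """mu(z) = 5*log10(d_L / 10 pc) for a flat LambdaCDM cosmology
    (assumed fiducial parameters, illustrative only)."""
    dz = z / n_steps
    integral = 0.0
    for i in range(n_steps):  # midpoint rule for the comoving-distance integral
        zi = (i + 0.5) * dz
        e_z = math.sqrt(omega_m * (1.0 + zi) ** 3 + (1.0 - omega_m))
        integral += dz / e_z
    d_l_mpc = (1.0 + z) * (C_KM_S / h0) * integral  # luminosity distance, Mpc
    return 5.0 * math.log10(d_l_mpc) + 25.0         # 25 = 5*log10(1 Mpc / 10 pc)

# z = 1.4 is where the quasar Hubble diagram starts to deviate from this curve.
print(round(distance_modulus(1.4), 2))
```

Comparing quasar distance moduli at z > 1.4 against this monotonically rising baseline is what yields the reported ~4σ tension.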

Journal ArticleDOI
TL;DR: The proposed ARC-HBR consensus document represents the first pragmatic approach to a consistent definition of high bleeding risk in clinical trials evaluating the safety and effectiveness of devices and drug regimens for patients undergoing percutaneous coronary intervention.
Abstract: Identification and management of patients at high bleeding risk undergoing percutaneous coronary intervention are of major importance, but a lack of standardization in defining this population limits trial design, data interpretation, and clinical decision-making. The Academic Research Consortium for High Bleeding Risk (ARC-HBR) is a collaboration among leading research organizations, regulatory authorities, and physician-scientists from the United States, Asia, and Europe focusing on percutaneous coronary intervention-related bleeding. Two meetings of the 31-member consortium were held in Washington, DC, in April 2018 and in Paris, France, in October 2018. These meetings were organized by the Cardiovascular European Research Center on behalf of the ARC-HBR group and included representatives of the US Food and Drug Administration and the Japanese Pharmaceuticals and Medical Devices Agency, as well as observers from the pharmaceutical and medical device industries. A consensus definition of patients at high bleeding risk was developed that was based on review of the available evidence. The definition is intended to provide consistency in defining this population for clinical trials and to complement clinical decision-making and regulatory review. The proposed ARC-HBR consensus document represents the first pragmatic approach to a consistent definition of high bleeding risk in clinical trials evaluating the safety and effectiveness of devices and drug regimens for patients undergoing percutaneous coronary intervention.

Journal ArticleDOI
TL;DR: The authors provide an overview of the rapidly developing field of climate change vulnerability assessment (CCVA) and describe key concepts, terms, steps and considerations, and stress the importance of identifying the full range of pressures, impacts and their associated mechanisms that species face and using this as a basis for selecting the appropriate assessment approaches for quantifying vulnerability.
Abstract: Assessing species' vulnerability to climate change is a prerequisite for developing effective strategies to conserve them. The last three decades have seen exponential growth in the number of studies evaluating how, how much, why, when, and where species will be impacted by climate change. We provide an overview of the rapidly developing field of climate change vulnerability assessment (CCVA) and describe key concepts, terms, steps and considerations. We stress the importance of identifying the full range of pressures, impacts and their associated mechanisms that species face and using this as a basis for selecting the appropriate assessment approaches for quantifying vulnerability. We outline four CCVA assessment approaches, namely trait-based, correlative, mechanistic and combined approaches and discuss their use. Since any assessment can deliver unreliable or even misleading results when incorrect data and parameters are applied, we discuss finding, selecting, and applying input data and provide examples of open-access resources. Because rare, small-range, and declining-range species are often of particular conservation concern while also posing significant challenges for CCVA, we describe alternative ways to assess them. We also describe how CCVAs can be used to inform IUCN Red List assessments of extinction risk. Finally, we suggest future directions in this field and propose areas where research efforts may be particularly valuable.

Journal ArticleDOI
TL;DR: A combination prevention intervention with ART provided according to local guidelines resulted in a 30% lower incidence of HIV infection than standard care and the lack of effect with universal ART was unanticipated and not consistent with the data on viral suppression.
Abstract: Background A universal testing and treatment strategy is a potential approach to reduce the incidence of human immunodeficiency virus (HIV) infection, yet previous trial results are inconsistent.

Journal ArticleDOI
TL;DR: In this article, various applications of CRISPR/Cas9 in a range of important crops are discussed, the technique is compared with other GE tools, and its mechanism, limitations, and future possibilities are reviewed.

Journal ArticleDOI
TL;DR: The value in evaluating boundaries between components of geomorphic systems as transition zones and examining the fluxes across them to understand landscape functioning is emphasized.
Abstract: Connectivity describes the efficiency of material transfer between geomorphic system components such as hillslopes and rivers or longitudinal segments within a river network. Representations of geomorphic systems as networks should recognize that the compartments, links, and nodes exhibit connectivity at differing scales. The historical underpinnings of connectivity in geomorphology involve management of geomorphic systems and observations linking surface processes to landform dynamics. Current work in geomorphic connectivity emphasizes hydrological, sediment, or landscape connectivity. Signatures of connectivity can be detected using diverse indicators that vary from contemporary processes to stratigraphic records or a spatial metric such as sediment yield that encompasses geomorphic processes operating over time and space. One approach to measuring connectivity is to determine the fundamental temporal and spatial scales for the phenomenon of interest and to make measurements at a sufficiently large multiple of the fundamental scales to capture reliably a representative sample. Another approach seeks to characterize how connectivity varies with scale, by applying the same metric over a wide range of scales or using statistical measures that characterize the frequency distributions of connectivity across scales. Identifying and measuring connectivity is useful in basic and applied geomorphic research, and we explore the implications of connectivity for river management. 
Common themes and ideas that merit further research include: increased understanding of the importance of capturing landscape heterogeneity and connectivity patterns; the potential to use graph and network theory metrics in analyzing connectivity; the need to understand which metrics best represent the physical system and its connectivity pathways, and to apply these metrics to the validation of numerical models; and the need to recognize the importance of low levels of connectivity in some situations. We emphasize the value in evaluating boundaries between components of geomorphic systems as transition zones and examining the fluxes across them to understand landscape functioning.
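One of the research themes named above, applying graph and network theory metrics to connectivity, can be illustrated with a toy sketch: landscape units as nodes, potential sediment-transfer pathways as directed edges, and structural connectivity as the fraction of units with a path to the catchment outlet. The node names and the metric are illustrative assumptions, not taken from the paper.

```python
from collections import deque

# Toy directed graph: hillslope and channel units draining toward an outlet.
# Edges point in the direction of potential sediment transfer (hypothetical example).
edges = {
    "hillslope_A": ["channel_1"],
    "hillslope_B": ["channel_1"],
    "hillslope_C": [],            # decoupled unit, e.g. isolated behind a terrace
    "channel_1":   ["channel_2"],
    "channel_2":   ["outlet"],
    "outlet":      [],
}

def connected_fraction(graph, sink):
    """Fraction of units (excluding the sink) with a directed path to the sink:
    a simple structural-connectivity index."""
    reverse = {node: [] for node in graph}   # reverse the graph, then BFS from the sink
    for src, dsts in graph.items():
        for dst in dsts:
            reverse[dst].append(src)
    seen, queue = {sink}, deque([sink])
    while queue:
        node = queue.popleft()
        for upstream in reverse[node]:
            if upstream not in seen:
                seen.add(upstream)
                queue.append(upstream)
    return (len(seen) - 1) / (len(graph) - 1)

print(connected_fraction(edges, "outlet"))  # hillslope_C is disconnected
```

Swapping the binary edges for weighted ones (e.g. transfer probabilities) would turn the same graph structure into a functional-connectivity measure, which is where the frequency-distribution approaches mentioned in the abstract become relevant.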

Journal ArticleDOI
19 Mar 2019-JAMA
TL;DR: Among US adults, higher consumption of dietary cholesterol or eggs was significantly associated with higher risk of incident CVD and all-cause mortality in a dose-response manner and should be considered in the development of dietary guidelines and updates.
Abstract: Importance Cholesterol is a common nutrient in the human diet and eggs are a major source of dietary cholesterol. Whether dietary cholesterol or egg consumption is associated with cardiovascular disease (CVD) and mortality remains controversial. Objective To determine the associations of dietary cholesterol or egg consumption with incident CVD and all-cause mortality. Design, Setting, and Participants Individual participant data were pooled from 6 prospective US cohorts using data collected between March 25, 1985, and August 31, 2016. Self-reported diet data were harmonized using a standardized protocol. Exposures Dietary cholesterol (mg/day) or egg consumption (number/day). Main Outcomes and Measures Hazard ratio (HR) and absolute risk difference (ARD) over the entire follow-up for incident CVD (composite of fatal and nonfatal coronary heart disease, stroke, heart failure, and other CVD deaths) and all-cause mortality, adjusting for demographic, socioeconomic, and behavioral factors. Results This analysis included 29 615 participants (mean [SD] age, 51.6 [13.5] years at baseline) of whom 13 299 (44.9%) were men and 9204 (31.1%) were black. During a median follow-up of 17.5 years (interquartile range, 13.0-21.7; maximum, 31.3), there were 5400 incident CVD events and 6132 all-cause deaths. The associations of dietary cholesterol or egg consumption with incident CVD and all-cause mortality were monotonic (all P values for nonlinear terms, .19-.83). Each additional 300 mg of dietary cholesterol consumed per day was significantly associated with higher risk of incident CVD (adjusted HR, 1.17 [95% CI, 1.09-1.26]; adjusted ARD, 3.24% [95% CI, 1.39%-5.08%]) and all-cause mortality (adjusted HR, 1.18 [95% CI, 1.10-1.26]; adjusted ARD, 4.43% [95% CI, 2.51%-6.36%]). 
Each additional half an egg consumed per day was significantly associated with higher risk of incident CVD (adjusted HR, 1.06 [95% CI, 1.03-1.10]; adjusted ARD, 1.11% [95% CI, 0.32%-1.89%]) and all-cause mortality (adjusted HR, 1.08 [95% CI, 1.04-1.11]; adjusted ARD, 1.93% [95% CI, 1.10%-2.76%]). The associations between egg consumption and incident CVD (adjusted HR, 0.99 [95% CI, 0.93-1.05]; adjusted ARD, −0.47% [95% CI, −1.83% to 0.88%]) and all-cause mortality (adjusted HR, 1.03 [95% CI, 0.97-1.09]; adjusted ARD, 0.71% [95% CI, −0.85% to 2.28%]) were no longer significant after adjusting for dietary cholesterol consumption. Conclusions and Relevance Among US adults, higher consumption of dietary cholesterol or eggs was significantly associated with higher risk of incident CVD and all-cause mortality in a dose-response manner. These results should be considered in the development of dietary guidelines and updates.
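Because the abstract reports the dose-response as monotonic, the HR per 300 mg/day can be rescaled to other increments under a log-linear assumption, HR(x) = 1.17^(x/300). This rescaling formula is an illustrative assumption, as is the ~186 mg cholesterol content of one large egg (a USDA figure, not stated in the abstract).

```python
import math

HR_PER_300_MG = 1.17  # adjusted HR for incident CVD per 300 mg/day (reported above)

def scaled_hr(delta_mg, hr_per_300=HR_PER_300_MG):
    """Implied hazard ratio for a delta_mg/day increment of dietary cholesterol,
    assuming a log-linear dose-response (an interpretive assumption)."""
    return math.exp(math.log(hr_per_300) * delta_mg / 300.0)

# Implied CVD hazard ratio for the ~186 mg of cholesterol in one large egg
# (hypothetical worked example, not a result from the study).
print(round(scaled_hr(186), 3))
```

Note this log-linear extrapolation cannot reproduce the abstract's egg-specific estimates exactly, since those were adjusted separately; it only illustrates how a per-300-mg HR rescales to other daily increments.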