
Showing papers by "Rutgers University" published in 2017


Journal ArticleDOI
TL;DR: Patients with refractory large B‐cell lymphoma who received CAR T‐cell therapy with axi‐cel had high levels of durable response, with a safety profile that included myelosuppression, the cytokine release syndrome, and neurologic events.
Abstract: Background: In a phase 1 trial, axicabtagene ciloleucel (axi-cel), an autologous anti-CD19 chimeric antigen receptor (CAR) T-cell therapy, showed efficacy in patients with refractory large B-cell lymphoma after the failure of conventional therapy. Methods: In this multicenter, phase 2 trial, we enrolled 111 patients with diffuse large B-cell lymphoma, primary mediastinal B-cell lymphoma, or transformed follicular lymphoma who had refractory disease despite undergoing recommended prior therapy. Patients received a target dose of 2×10^6 anti-CD19 CAR T cells per kilogram of body weight after receiving a conditioning regimen of low-dose cyclophosphamide and fludarabine. The primary end point was the rate of objective response (calculated as the combined rates of complete response and partial response). Secondary end points included overall survival, safety, and biomarker assessments. Results: Among the 111 patients who were enrolled, axi-cel was successfully manufactured for 110 (99%) and administered to 101 (91%)....

3,363 citations


Proceedings ArticleDOI
01 Oct 2017
TL;DR: This paper proposes Stacked Generative Adversarial Networks (StackGAN) to generate 256×256 photo-realistic images conditioned on text descriptions and introduces a novel Conditioning Augmentation technique that encourages smoothness in the latent conditioning manifold.
Abstract: Synthesizing high-quality images from text descriptions is a challenging problem in computer vision and has many practical applications. Samples generated by existing text-to-image approaches can roughly reflect the meaning of the given descriptions, but they fail to contain necessary details and vivid object parts. In this paper, we propose Stacked Generative Adversarial Networks (StackGAN) to generate 256×256 photo-realistic images conditioned on text descriptions. We decompose the hard problem into more manageable sub-problems through a sketch-refinement process. The Stage-I GAN sketches the primitive shape and colors of the object based on the given text description, yielding Stage-I low-resolution images. The Stage-II GAN takes Stage-I results and text descriptions as inputs, and generates high-resolution images with photo-realistic details. It is able to rectify defects in Stage-I results and add compelling details with the refinement process. To improve the diversity of the synthesized images and stabilize the training of the conditional-GAN, we introduce a novel Conditioning Augmentation technique that encourages smoothness in the latent conditioning manifold. Extensive experiments and comparisons with state-of-the-art methods on benchmark datasets demonstrate that the proposed method achieves significant improvements on generating photo-realistic images conditioned on text descriptions.
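The Conditioning Augmentation technique mentioned above can be sketched in a few lines. This is an illustrative sketch (names and shapes are mine, not the authors' code): rather than conditioning on a fixed text embedding, the generator samples a latent code from a Gaussian whose mean and log-variance are derived from the embedding, using the reparameterization trick, with a KL regularizer toward a standard normal that smooths the conditioning distribution.

```python
import math
import random

def conditioning_augmentation(mu, log_var, rng):
    # Reparameterization: c = mu + sigma * eps, with eps ~ N(0, I),
    # so sampling stays differentiable with respect to mu and log_var.
    return [m + math.exp(0.5 * lv) * rng.gauss(0.0, 1.0)
            for m, lv in zip(mu, log_var)]

def kl_to_standard_normal(mu, log_var):
    # KL(N(mu, sigma^2) || N(0, I)) = 0.5 * sum(mu^2 + sigma^2 - log sigma^2 - 1):
    # the regularizer that keeps the conditioning manifold smooth.
    return 0.5 * sum(m * m + math.exp(lv) - lv - 1.0
                     for m, lv in zip(mu, log_var))

rng = random.Random(0)
mu, log_var = [0.2, -0.1], [math.log(0.25)] * 2  # toy 2-d "text embedding" stats
c = conditioning_augmentation(mu, log_var, rng)
print(c, kl_to_standard_normal(mu, log_var))
```

In the paper this Gaussian is learned from the text embedding; here the mean and variance are fixed toy values purely to show the sampling and the penalty term.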

2,486 citations


Journal ArticleDOI
TL;DR: This comprehensive review summarizes the topical developments in the field of luminescent MOF and MOF-based photonic crystals/thin film sensory materials.
Abstract: Metal–organic frameworks (MOFs) or porous coordination polymers (PCPs) are open, crystalline supramolecular coordination architectures with porous facets. These chemically tailorable framework materials are the subject of intense and expansive research, and are particularly relevant in the fields of sensory materials and device engineering. As the subfield of MOF-based sensing has developed, many diverse chemical functionalities have been carefully and rationally implanted into the coordination nanospace of MOF materials. MOFs with widely varied fluorometric sensing properties have been developed using the design principles of crystal engineering and structure–property correlations, resulting in a large and rapidly growing body of literature. This work has led to advancements in a number of crucial sensing domains, including biomolecules, environmental toxins, explosives, ionic species, and many others. Furthermore, new classes of MOF sensory materials utilizing advanced signal transduction by devices based on MOF photonic crystals and thin films have been developed. This comprehensive review summarizes the topical developments in the field of luminescent MOF and MOF-based photonic crystals/thin film sensory materials.

2,239 citations


Book
06 Jun 2017
TL;DR: Each component of emotional intelligence is discussed, with examples showing how to recognize it in potential leaders, how and why it leads to measurable business results, and how it can be learned.
Abstract: Superb leaders have very different ways of directing a team, a division, or a company. Some are subdued and analytical; others are charismatic and go with their gut. And different situations call for different types of leadership. Most mergers need a sensitive negotiator at the helm, whereas many turnarounds require a more forceful kind of authority. Psychologist and noted author Daniel Goleman has found, however, that effective leaders are alike in one crucial way: they all have a high degree of what has come to be known as emotional intelligence. In fact, Goleman's research at nearly 200 large, global companies revealed that emotional intelligence--especially at the highest levels of a company--is the sine qua non for leadership. Without it, a person can have first-class training, an incisive mind, and an endless supply of good ideas, but he still won't make a great leader. The components of emotional intelligence--self-awareness, self-regulation, motivation, empathy, and social skill--can sound unbusinesslike. But exhibiting emotional intelligence at the workplace does not mean simply controlling your anger or getting along with people. Rather, it means understanding your own and other people's emotional makeup well enough to move people in the direction of accomplishing your company's goals. In this article, the author discusses each component of emotional intelligence and shows through examples how to recognize it in potential leaders, how and why it leads to measurable business results, and how it can be learned. It takes time and, most of all, commitment. But the benefits that come from having a well-developed emotional intelligence, both for the individual and the organization, make it worth the effort.

1,396 citations


Journal ArticleDOI
TL;DR: OpenMM is a molecular dynamics simulation toolkit with a unique focus on extensibility, which makes it an ideal tool for researchers developing new simulation methods, and also allows those new methods to be immediately available to the larger community.
Abstract: OpenMM is a molecular dynamics simulation toolkit with a unique focus on extensibility. It allows users to easily add new features, including forces with novel functional forms, new integration algorithms, and new simulation protocols. Those features automatically work on all supported hardware types (including both CPUs and GPUs) and perform well on all of them. In many cases they require minimal coding, just a mathematical description of the desired function. They also require no modification to OpenMM itself and can be distributed independently of OpenMM. This makes it an ideal tool for researchers developing new simulation methods, and also allows those new methods to be immediately available to the larger community.
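The extensibility model described above, where a new force needs only "a mathematical description of the desired function", can be illustrated conceptually. The sketch below is not the OpenMM API; it mimics the idea of OpenMM's custom forces in plain Python: the user supplies an energy expression as a string, and derivatives (forces) are obtained automatically, here by a finite difference.

```python
import math

def make_custom_bond_force(expression):
    """Return an energy(r, **params) callable built from a user-supplied
    expression, e.g. "0.5*k*(r - r0)**2", where r is the bond length and
    any other names are per-bond parameters."""
    def energy(r, **params):
        return eval(expression, {"math": math, "r": r, **params})
    return energy

def force(energy, r, eps=1e-6, **params):
    # The force is the negative derivative of the energy with respect to
    # the bond length, computed here by a central finite difference.
    return -(energy(r + eps, **params) - energy(r - eps, **params)) / (2 * eps)

harmonic = make_custom_bond_force("0.5*k*(r - r0)**2")
print(force(harmonic, 1.0, k=100.0, r0=1.0))  # ~0: no force at the equilibrium length
```

In OpenMM itself the expression is parsed once and differentiated analytically, then compiled for CPU or GPU execution; this sketch only conveys why a mathematical description alone can suffice.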

1,364 citations


Journal ArticleDOI
TL;DR: Clinical diagnostic criteria, published in 1996 by the National Institute of Neurological Disorders and Stroke/Society for PSP, have excellent specificity, but their sensitivity is limited for variant PSP syndromes with presentations other than Richardson's syndrome.
Abstract: Background: PSP is a neuropathologically defined disease entity. Clinical diagnostic criteria, published in 1996 by the National Institute of Neurological Disorders and Stroke/Society for PSP, have excellent specificity, but their sensitivity is limited for variant PSP syndromes with presentations other than Richardson's syndrome. Objective: We aimed to provide an evidence- and consensus-based revision of the clinical diagnostic criteria for PSP. Methods: We searched the PubMed, Cochrane, Medline, and PSYCInfo databases for articles published in English since 1996, using postmortem diagnosis or highly specific clinical criteria as the diagnostic standard. Second, we generated retrospective standardized clinical data from patients with autopsy-confirmed PSP and control diseases. On this basis, diagnostic criteria were drafted, optimized in two modified Delphi evaluations, submitted to structured discussions with consensus procedures during a 2-day meeting, and refined in three further Delphi rounds. Results: Defined clinical, imaging, laboratory, and genetic findings serve as mandatory basic features, mandatory exclusion criteria, or context-dependent exclusion criteria. We identified four functional domains (ocular motor dysfunction, postural instability, akinesia, and cognitive dysfunction) as clinical predictors of PSP. Within each of these domains, we propose three clinical features that contribute different levels of diagnostic certainty. Specific combinations of these features define the diagnostic criteria, stratified by three degrees of diagnostic certainty (probable PSP, possible PSP, and suggestive of PSP). Clinical clues and imaging findings represent supportive features. Conclusions: Here, we present new criteria aimed to optimize early, sensitive, and specific clinical diagnosis of PSP on the basis of currently available evidence.

1,247 citations


Journal ArticleDOI
29 Jun 2017
TL;DR: Vitamin B12 deficiency causes reversible megaloblastic anemia, demyelinating disease, or both; current assays have insufficient sensitivity and specificity; methylmalonic acid levels are useful to confirm diagnosis.
Abstract: Vitamin B12 deficiency causes reversible megaloblastic anemia, demyelinating disease, or both. Current assays have insufficient sensitivity and specificity; methylmalonic acid levels are useful to confirm diagnosis. Parenteral or high-dose oral vitamin B12 is effective therapy.

1,066 citations


Journal ArticleDOI
02 Nov 2017-Nature
TL;DR: It is found that lactate can be a primary source of carbon for the TCA cycle and thus of energy, and during the fasted state, the contribution of glucose to tissue TCA metabolism is primarily indirect (via circulating lactate) in all tissues except the brain.
Abstract: Mammalian tissues are fuelled by circulating nutrients, including glucose, amino acids, and various intermediary metabolites. Under aerobic conditions, glucose is generally assumed to be burned fully by tissues via the tricarboxylic acid cycle (TCA cycle) to carbon dioxide. Alternatively, glucose can be catabolized anaerobically via glycolysis to lactate, which is itself also a potential nutrient for tissues and tumours. The quantitative relevance of circulating lactate or other metabolic intermediates as fuels remains unclear. Here we systematically examine the fluxes of circulating metabolites in mice, and find that lactate can be a primary source of carbon for the TCA cycle and thus of energy. Intravenous infusions of 13C-labelled nutrients reveal that, on a molar basis, the circulatory turnover flux of lactate is the highest of all metabolites and exceeds that of glucose by 1.1-fold in fed mice and 2.5-fold in fasting mice; lactate is made primarily from glucose but also from other sources. In both fed and fasted mice, 13C-lactate extensively labels TCA cycle intermediates in all tissues. Quantitative analysis reveals that during the fasted state, the contribution of glucose to tissue TCA metabolism is primarily indirect (via circulating lactate) in all tissues except the brain. In genetically engineered lung and pancreatic cancer tumours in fasted mice, the contribution of circulating lactate to TCA cycle intermediates exceeds that of glucose, with glutamine making a larger contribution than lactate in pancreatic cancer. Thus, glycolysis and the TCA cycle are uncoupled at the level of lactate, which is a primary circulating TCA substrate in most tissues and tumours.

961 citations


Journal ArticleDOI
TL;DR: In this paper, the authors study how to optimally manage the freshness of information updates sent from a source node to a destination via a channel and develop efficient algorithms to find the optimal update policy among all causal policies and establish sufficient and necessary conditions for the optimality of the zero-wait policy.
Abstract: In this paper, we study how to optimally manage the freshness of information updates sent from a source node to a destination via a channel. A proper metric for data freshness at the destination is the age-of-information, or simply age, which is defined as how old the freshest received update is, since the moment that this update was generated at the source node (e.g., a sensor). A reasonable update policy is the zero-wait policy, i.e., the source node submits a fresh update once the previous update is delivered, which achieves the maximum throughput and the minimum delay. Surprisingly, this zero-wait policy does not always minimize the age. This counter-intuitive phenomenon motivates us to study how to optimally control information updates to keep the data fresh and to understand when the zero-wait policy is optimal. We introduce a general age penalty function to characterize the level of dissatisfaction on data staleness and formulate the average age penalty minimization problem as a constrained semi-Markov decision problem with an uncountable state space. We develop efficient algorithms to find the optimal update policy among all causal policies and establish sufficient and necessary conditions for the optimality of the zero-wait policy. Our investigation shows that the zero-wait policy is far from the optimum if: 1) the age penalty function grows quickly with respect to the age; 2) the packet transmission times over the channel are positively correlated over time; or 3) the packet transmission times are highly random (e.g., following a heavy-tail distribution).
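The zero-wait policy described above is easy to simulate. In this minimal sketch (the function name and setup are mine, not the paper's), each update's service time Y is i.i.d. and a fresh update is generated the instant the previous one is delivered; under that policy the time-average age works out to E[Y] + E[Y^2] / (2 E[Y]).

```python
import random

def zero_wait_average_age(service_times):
    """Time-average age under the zero-wait policy: update i is generated
    at the delivery of update i-1, so the age resets to Y_i at delivery i
    and then grows linearly until delivery i+1."""
    area = 0.0   # area under the sawtooth age curve
    total = 0.0  # elapsed time
    prev_y = service_times[0]
    for y in service_times[1:]:
        # Trapezoid from age prev_y up to prev_y + y over an interval of length y.
        area += prev_y * y + 0.5 * y * y
        total += y
        prev_y = y
    return area / total

random.seed(0)
samples = [random.expovariate(1.0) for _ in range(200_000)]
# For exponential(1) service times, E[Y] = 1 and E[Y^2] = 2,
# so the analytic average age is 1 + 2/2 = 2.
print(zero_wait_average_age(samples))
```

Reproducing the paper's counterintuitive result, that waiting before sending can beat zero-wait when service times are heavy-tailed, would only require adding a waiting rule before each generation in the same loop.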

857 citations


Journal ArticleDOI
TL;DR: The roles of APA in diverse cellular processes, including mRNA metabolism, protein diversification and protein localization, and more generally in gene regulation, are discussed, along with the molecular mechanisms underlying APA.
Abstract: Alternative polyadenylation (APA) is an RNA-processing mechanism that generates distinct 3' termini on mRNAs and other RNA polymerase II transcripts. It is widespread across all eukaryotic species and is recognized as a major mechanism of gene regulation. APA exhibits tissue specificity and is important for cell proliferation and differentiation. In this Review, we discuss the roles of APA in diverse cellular processes, including mRNA metabolism, protein diversification and protein localization, and more generally in gene regulation. We also discuss the molecular mechanisms underlying APA, such as variation in the concentration of core processing factors and RNA-binding proteins, as well as transcription-based regulation.

758 citations


Journal ArticleDOI
TL;DR: A comprehensive census of class 2 types and class 2 subtypes in complete and draft bacterial and archaeal genomes is presented, evolutionary scenarios for the independent origin of different class 2 CRISPR–Cas systems from mobile genetic elements are outlined, and an amended classification and nomenclature of CRISPR–Cas is proposed.
Abstract: Class 2 CRISPR-Cas systems are characterized by effector modules that consist of a single multidomain protein, such as Cas9 or Cpf1. We designed a computational pipeline for the discovery of novel class 2 variants and used it to identify six new CRISPR-Cas subtypes. The diverse properties of these new systems provide potential for the development of versatile tools for genome editing and regulation. In this Analysis article, we present a comprehensive census of class 2 types and class 2 subtypes in complete and draft bacterial and archaeal genomes, outline evolutionary scenarios for the independent origin of different class 2 CRISPR-Cas systems from mobile genetic elements, and propose an amended classification and nomenclature of CRISPR-Cas.

Journal ArticleDOI
19 Jan 2017-Nature
TL;DR: Using a transcriptome-wide map of m6Am, it is found that m6Am-initiated transcripts are markedly more stable than mRNAs that begin with other nucleotides and that m6Am is selectively demethylated by fat mass and obesity-associated protein (FTO).
Abstract: Internal bases in mRNA can be subjected to modifications that influence the fate of mRNA in cells. One of the most prevalent modified bases is found at the 5' end of mRNA, at the first encoded nucleotide adjacent to the 7-methylguanosine cap. Here we show that this nucleotide, N6,2'-O-dimethyladenosine (m6Am), is a reversible modification that influences cellular mRNA fate. Using a transcriptome-wide map of m6Am we find that m6Am-initiated transcripts are markedly more stable than mRNAs that begin with other nucleotides. We show that the enhanced stability of m6Am-initiated transcripts is due to resistance to the mRNA-decapping enzyme DCP2. Moreover, we find that m6Am is selectively demethylated by fat mass and obesity-associated protein (FTO). FTO preferentially demethylates m6Am rather than N6-methyladenosine (m6A), and reduces the stability of m6Am mRNAs. Together, these findings show that the methylation status of m6Am in the 5' cap is a dynamic and reversible epitranscriptomic modification that determines mRNA stability.

Journal ArticleDOI
Albert M. Sirunyan, Armen Tumasyan, Wolfgang Adam, Ece Aşılar, and 2,212 more authors (157 institutions)
TL;DR: A fully-fledged particle-flow reconstruction algorithm tuned to the CMS detector was developed and has been consistently used in physics analyses for the first time at a hadron collider.
Abstract: The CMS apparatus was identified, a few years before the start of the LHC operation at CERN, to feature properties well suited to particle-flow (PF) reconstruction: a highly-segmented tracker, a fine-grained electromagnetic calorimeter, a hermetic hadron calorimeter, a strong magnetic field, and an excellent muon spectrometer. A fully-fledged PF reconstruction algorithm tuned to the CMS detector was therefore developed and has been consistently used in physics analyses for the first time at a hadron collider. For each collision, the comprehensive list of final-state particles identified and reconstructed by the algorithm provides a global event description that leads to unprecedented CMS performance for jet and hadronic τ decay reconstruction, missing transverse momentum determination, and electron and muon identification. This approach also allows particles from pileup interactions to be identified and enables efficient pileup mitigation methods. The data collected by CMS at a centre-of-mass energy of 8 TeV show excellent agreement with the simulation and confirm the superior PF performance at least up to an average of 20 pileup interactions.

Journal ArticleDOI
TL;DR: Recent developments have improved the user experience, including the high-speed NGL Viewer that provides 3D molecular visualization in any web browser, improved support for data file download and enhanced organization of website pages for query, reporting and individual structure exploration.
Abstract: The Research Collaboratory for Structural Bioinformatics Protein Data Bank (RCSB PDB, http://rcsb.org), the US data center for the global PDB archive, makes PDB data freely available to all users, from structural biologists to computational biologists and beyond. New tools and resources have been added to the RCSB PDB web portal in support of a 'Structural View of Biology'. Recent developments have improved the user experience, including the high-speed NGL Viewer that provides 3D molecular visualization in any web browser, improved support for data file download and enhanced organization of website pages for query, reporting and individual structure exploration. Structure validation information is now visible for all archival entries. PDB data have been integrated with external biological resources, including chromosomal position within the human genome; protein modifications; and metabolic pathways. PDB-101 educational materials have been reorganized into a searchable website and expanded to include new features such as the Geis Digital Archive.

Journal ArticleDOI
TL;DR: A real-time, context-aware collaboration framework is envisioned that lies at the edge of the RAN, comprising MEC servers and mobile devices, and amalgamates the heterogeneous resources at the edge.
Abstract: MEC is an emerging paradigm that provides computing, storage, and networking resources within the edge of the mobile RAN. MEC servers are deployed on a generic computing platform within the RAN, and allow for delay-sensitive and context-aware applications to be executed in close proximity to end users. This paradigm alleviates the backhaul and core network and is crucial for enabling low-latency, high-bandwidth, and agile mobile services. This article envisions a real-time, context-aware collaboration framework that lies at the edge of the RAN, comprising MEC servers and mobile devices, and amalgamates the heterogeneous resources at the edge. Specifically, we introduce and study three representative use cases ranging from mobile edge orchestration, collaborative caching and processing, and multi-layer interference cancellation. We demonstrate the promising benefits of the proposed approaches in facilitating the evolution to 5G networks. Finally, we discuss the key technical challenges and open research issues that need to be addressed in order to efficiently integrate MEC into the 5G ecosystem.

Journal ArticleDOI
TL;DR: It is shown that the ability of Cpf1 to process its own CRISPR RNA (crRNA) can be used to simplify multiplexed genome editing.
Abstract: Targeting of multiple genomic loci with Cas9 is limited by the need for multiple or large expression constructs. Here we show that the ability of Cpf1 to process its own CRISPR RNA (crRNA) can be used to simplify multiplexed genome editing. Using a single customized CRISPR array, we edit up to four genes in mammalian cells and three in the mouse brain, simultaneously.

Journal ArticleDOI
30 Jun 2017-Science
TL;DR: The authors develop a flexible architecture for computing damages that integrates climate science, econometric analyses, and process models, and use it to construct spatially explicit, probabilistic, and empirically derived estimates of economic damage in the United States from climate change.
Abstract: Estimates of climate change damage are central to the design of climate policies. Here, we develop a flexible architecture for computing damages that integrates climate science, econometric analyses, and process models. We use this approach to construct spatially explicit, probabilistic, and empirically derived estimates of economic damage in the United States from climate change. The combined value of market and nonmarket damage across analyzed sectors (agriculture, crime, coastal storms, energy, human mortality, and labor) increases quadratically in global mean temperature, costing roughly 1.2% of gross domestic product per +1°C on average. Importantly, risk is distributed unequally across locations, generating a large transfer of value northward and westward that increases economic inequality. By the late 21st century, the poorest third of counties are projected to experience damages between 2 and 20% of county income (90% chance) under business-as-usual emissions (Representative Concentration Pathway 8.5).

Journal ArticleDOI
TL;DR: In this article, the integrability of 2D Integrable Quantum Field Theories (IQFTs) with finite ultraviolet cutoff was studied and the authors showed that for any such IQFT there are infinitely many integrable deformations generated by scalar local fields XsXs, which are in one-to-one correspondence with the local integrals of motion.

Journal ArticleDOI
TL;DR: A global synthesis of 2,200 root observations of >1,000 species along biotic and abiotic gradients demonstrates that soil hydrology is a globally prevalent force driving landscape to global patterns of plant rooting depth, and underscores a fundamental plant–water feedback pathway that may be critical to understanding plant-mediated global change.
Abstract: Plant rooting depth affects ecosystem resilience to environmental stress such as drought. Deep roots connect deep soil/groundwater to the atmosphere, thus influencing the hydrologic cycle and climate. Deep roots enhance bedrock weathering, thus regulating the long-term carbon cycle. However, we know little about how deep roots go and why. Here, we present a global synthesis of 2,200 root observations of >1,000 species along biotic (life form, genus) and abiotic (precipitation, soil, drainage) gradients. Results reveal strong sensitivities of rooting depth to local soil water profiles determined by precipitation infiltration depth from the top (reflecting climate and soil), and groundwater table depth from below (reflecting topography-driven land drainage). In well-drained uplands, rooting depth follows infiltration depth; in waterlogged lowlands, roots stay shallow, avoiding oxygen stress below the water table; in between, high productivity and drought can send roots many meters down to the groundwater capillary fringe. This framework explains the contrasting rooting depths observed under the same climate for the same species but at distinct topographic positions. We assess the global significance of these hydrologic mechanisms by estimating root water-uptake depths using an inverse model, based on observed productivity and atmosphere, at 30″ (∼1-km) global grids to capture the topography critical to soil hydrology. The resulting patterns of plant rooting depth bear a strong topographic and hydrologic signature at landscape to global scales. They underscore a fundamental plant–water feedback pathway that may be critical to understanding plant-mediated global change.

Journal ArticleDOI
30 Mar 2017-Nature
TL;DR: It is reported that many MPAs failed to meet thresholds for effective and equitable management processes, with widespread shortfalls in staff and financial resources; continued global expansion of MPAs without adequate investment in human and financial capacity is likely to lead to sub-optimal conservation outcomes.
Abstract: Marine protected areas (MPAs) are increasingly being used globally to conserve marine resources. However, whether many MPAs are being effectively and equitably managed, and how MPA management influences substantive outcomes remain unknown. We developed a global database of management and fish population data (433 and 218 MPAs, respectively) to assess: MPA management processes; the effects of MPAs on fish populations; and relationships between management processes and ecological effects. Here we report that many MPAs failed to meet thresholds for effective and equitable management processes, with widespread shortfalls in staff and financial resources. Although 71% of MPAs positively influenced fish populations, these conservation impacts were highly variable. Staff and budget capacity were the strongest predictors of conservation impact: MPAs with adequate staff capacity had ecological effects 2.9 times greater than MPAs with inadequate capacity. Thus, continued global expansion of MPAs without adequate investment in human and financial capacity is likely to lead to sub-optimal conservation outcomes.

Journal ArticleDOI
TL;DR: In this article, the authors identify major challenges to managing biodiversity in urban green spaces and important topics warranting further investigation, including governance, economics, social networks, multiple stakeholders, individual preferences, and social constraints.
Abstract: Cities play important roles in the conservation of global biodiversity, particularly through the planning and management of urban green spaces (UGS). However, UGS management is subject to a complex assortment of interacting social, cultural, and economic factors, including governance, economics, social networks, multiple stakeholders, individual preferences, and social constraints. To help deliver more effective conservation outcomes in cities, we identify major challenges to managing biodiversity in UGS and important topics warranting further investigation. Biodiversity within UGS must be managed at multiple scales while accounting for various socioeconomic and cultural influences. Although the environmental consequences of management activities to enhance urban biodiversity are now beginning to be addressed, additional research and practical management strategies must be developed to balance human needs and perceptions while maintaining ecological processes.

Journal ArticleDOI
TL;DR: The diverse metabolic fuel sources that can be produced by autophagy provide tumors with metabolic plasticity and can allow them to thrive in what can be an austere microenvironment; understanding how autophagy can fuel cellular metabolism will enable more effective combinatorial therapeutic strategies.


Journal ArticleDOI
TL;DR: This paper presents the Sport Concussion Assessment Tool 5th Edition (SCAT5), which is the most recent revision of a sport concussion evaluation tool for use by healthcare professionals in the acute evaluation of suspected concussion.
Abstract: This paper presents the Sport Concussion Assessment Tool 5th Edition (SCAT5), which is the most recent revision of a sport concussion evaluation tool for use by healthcare professionals in the acute evaluation of suspected concussion. The revision of the SCAT3 (first published in 2013) culminated in the SCAT5. The revision was based on a systematic review and synthesis of current research, public input and expert panel review as part of the 5th International Consensus Conference on Concussion in Sport held in Berlin in 2016. The SCAT5 is intended for use in those who are 13 years of age or older. The Child SCAT5 is a tool for those aged 5–12 years, which is discussed elsewhere.

Journal ArticleDOI
TL;DR: This paper describes the CMS trigger system, which consists of two levels designed to select events of potential physics interest from a GHz (MHz) interaction rate of proton-proton (heavy ion) collisions.
Abstract: This paper describes the CMS trigger system and its performance during Run 1 of the LHC. The trigger system consists of two levels designed to select events of potential physics interest from a GHz (MHz) interaction rate of proton-proton (heavy ion) collisions. The first level of the trigger is implemented in hardware, and selects events containing detector signals consistent with an electron, photon, muon, tau lepton, jet, or missing transverse energy. A programmable menu of up to 128 object-based algorithms is used to select events for subsequent processing. The trigger thresholds are adjusted to the LHC instantaneous luminosity during data taking in order to restrict the output rate to 100 kHz, the upper limit imposed by the CMS readout electronics. The second level, implemented in software, further refines the purity of the output stream, selecting an average rate of 400 Hz for offline event storage. The objectives, strategy and performance of the trigger system during the LHC Run 1 are described.
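The two-stage rate reduction described above, a coarse hardware level followed by a refining software level, can be caricatured in a few lines. This is a toy sketch of my own, not CMS software: events carry a fast "energy" score cut at level 1 and a finer "quality" score cut at the software level, so each stage keeps only a small fraction of its input.

```python
import random

def two_level_trigger(events, l1_cut, hlt_cut):
    # Level 1: a fast, coarse threshold on a hardware-style quantity.
    l1_accepted = [e for e in events if e["et"] > l1_cut]
    # Level 2: slower software selection, applied only to L1-accepted events.
    stored = [e for e in l1_accepted if e["quality"] > hlt_cut]
    return len(l1_accepted), len(stored)

rng = random.Random(42)
events = [{"et": rng.expovariate(1 / 20.0), "quality": rng.random()}
          for _ in range(100_000)]
l1, hlt = two_level_trigger(events, l1_cut=60.0, hlt_cut=0.9)
print(l1, hlt)  # each stage passes only a small fraction of its input onward
```

The real system's proportions are far more extreme (GHz interactions down to 100 kHz at level 1 and about 400 Hz for storage), but the cascade structure is the same: cheap decisions first, expensive decisions only on survivors.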

Proceedings ArticleDOI
01 Oct 2017
TL;DR: A novel method called Contextual Pyramid CNN (CP-CNN) for generating high-quality crowd density and count estimation by explicitly incorporating global and local contextual information of crowd images is presented.
Abstract: We present a novel method called Contextual Pyramid CNN (CP-CNN) for generating high-quality crowd density and count estimation by explicitly incorporating global and local contextual information of crowd images. The proposed CP-CNN consists of four modules: Global Context Estimator (GCE), Local Context Estimator (LCE), Density Map Estimator (DME) and a Fusion-CNN (F-CNN). GCE is a VGG-16 based CNN that encodes global context and is trained to classify input images into different density classes, whereas LCE is another CNN that encodes local context information and is trained to perform patch-wise classification of input images into different density classes. DME is a multi-column architecture-based CNN that aims to generate high-dimensional feature maps from the input image, which are fused with the contextual information estimated by GCE and LCE using F-CNN. To generate high-resolution and high-quality density maps, F-CNN uses a set of convolutional and fractionally-strided convolutional layers and is trained along with the DME in an end-to-end fashion using a combination of adversarial loss and pixel-level Euclidean loss. Extensive experiments on highly challenging datasets show that the proposed method achieves significant improvements over the state-of-the-art methods.
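The fusion step described in the abstract, where image-level global context and patch-wise local context are combined with the DME's feature maps before the F-CNN, can be sketched with plain arrays: tile the image-level class scores over the spatial grid and concatenate everything channel-wise. The shapes and channel counts below are assumed for illustration; they are not the paper's exact dimensions.

```python
import numpy as np

def fuse_context(feature_maps, global_context, local_context):
    """Channel-wise concatenation in the spirit of CP-CNN's F-CNN input.
    feature_maps:   (C, H, W) high-dimensional maps from the DME
    global_context: (K,)      image-level density-class scores from the GCE
    local_context:  (K, H, W) patch-wise density-class scores from the LCE
    """
    _, h, w = feature_maps.shape
    # Tile the image-level scores to every spatial location.
    global_map = np.broadcast_to(global_context[:, None, None],
                                 (global_context.shape[0], h, w))
    return np.concatenate([feature_maps, global_map, local_context], axis=0)

# Assumed sizes: 64 DME channels, 5 density classes, 32x32 spatial grid.
fused = fuse_context(np.zeros((64, 32, 32)),
                     np.ones(5),
                     np.zeros((5, 32, 32)))
```

The fused tensor then carries 64 + 5 + 5 channels per location, which the F-CNN's convolutional and fractionally-strided layers would map to a single-channel density map.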

Journal ArticleDOI
TL;DR: An overview of research on one particular cognitive measure, the Symbol Digit Modalities Test (SDMT), recognized as being particularly sensitive to slowed processing of information that is commonly seen in MS, is provided.
Abstract: Cognitive and motor performance measures are commonly employed in multiple sclerosis (MS) research, particularly when the purpose is to determine the efficacy of treatment. The increasing focus of new therapies on slowing progression or reversing neurological disability makes the utilization of sensitive, reproducible, and valid measures essential. Processing speed is a basic elemental cognitive function that likely influences downstream processes such as memory. The Multiple Sclerosis Outcome Assessments Consortium (MSOAC) includes representatives from advocacy organizations, Food and Drug Administration (FDA), European Medicines Agency (EMA), National Institute of Neurological Disorders and Stroke (NINDS), academic institutions, and industry partners along with persons living with MS. Among the MSOAC goals is acceptance and qualification by regulators of performance outcomes that are highly reliable and valid, practical, cost-effective, and meaningful to persons with MS. A critical step for these neuroperformance metrics is elucidation of clinically relevant benchmarks, well-defined degrees of disability, and gradients of change that are deemed clinically meaningful. This topical review provides an overview of research on one particular cognitive measure, the Symbol Digit Modalities Test (SDMT), recognized as being particularly sensitive to slowed processing of information that is commonly seen in MS. The research in MS clearly supports the reliability and validity of this test and recently has supported a responder definition of SDMT change approximating 4 points or 10% in magnitude.

Journal ArticleDOI
Khachatryan, Albert M. Sirunyan, Armen Tumasyan, Wolfgang Adam +2285 more · Institutions (147)
TL;DR: In this paper, improved jet energy scale corrections, based on a data sample corresponding to an integrated luminosity of 19.7 fb^(-1) collected by the CMS experiment in proton-proton collisions at a center-of-mass energy of 8 TeV, are presented.
Abstract: Improved jet energy scale corrections, based on a data sample corresponding to an integrated luminosity of 19.7 fb^(-1) collected by the CMS experiment in proton-proton collisions at a center-of-mass energy of 8 TeV, are presented. The corrections as a function of pseudorapidity η and transverse momentum p_T are extracted from data and simulated events combining several channels and methods. They account successively for the effects of pileup, uniformity of the detector response, and residual data-simulation jet energy scale differences. Further corrections, depending on the jet flavor and distance parameter (jet size) R, are also presented. The jet energy resolution is measured in data and simulated events and is studied as a function of pileup, jet size, and jet flavor. Typical jet energy resolutions at the central rapidities are 15–20% at 30 GeV, about 10% at 100 GeV, and 5% at 1 TeV. The studies exploit events with dijet topology, as well as photon+jet, Z+jet and multijet events. Several new techniques are used to account for the various sources of jet energy scale corrections, and a full set of uncertainties, and their correlations, are provided. The final uncertainties on the jet energy scale are below 3% across the phase space considered by most analyses (p_T > 30 GeV and |η| < 5.0). In the barrel region, an uncertainty below 1% for p_T > 30 GeV is reached, when excluding the jet flavor uncertainties, which are provided separately for different jet flavors. A new benchmark for jet energy scale determination at hadron colliders is achieved with 0.32% uncertainty for jets with p_T of the order of 165–330 GeV, and |η| < 0.8.
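The quoted resolution values (15–20% at 30 GeV, ~10% at 100 GeV, 5% at 1 TeV) follow the usual pattern of a three-term calorimeter resolution parameterization: a noise term falling as 1/p_T, a stochastic term falling as 1/√p_T, and a constant term, added in quadrature. The parameter values below are chosen only to reproduce the quoted magnitudes; they are not the CMS fit results.

```python
import math

def jet_energy_resolution(pt, noise=3.0, stochastic=0.9, constant=0.04):
    """Relative jet energy resolution sigma(pT)/pT from the standard
    noise (+) stochastic (+) constant parameterization, terms summed in
    quadrature. Parameter values are illustrative, tuned to match the
    magnitudes quoted in the abstract, not CMS measurements."""
    return math.sqrt((noise / pt) ** 2
                     + stochastic ** 2 / pt
                     + constant ** 2)

# Roughly 20% at 30 GeV, 10% at 100 GeV, and 5% at 1 TeV:
resolutions = {pt: jet_energy_resolution(pt) for pt in (30, 100, 1000)}
```

The noise term dominates at low p_T, the stochastic term at intermediate p_T, and the constant term sets the ~5% floor at TeV energies.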

Book ChapterDOI
TL;DR: The Worldwide Protein Data Bank partners are working closely with experts in related experimental areas to establish a federation of data resources that will support sustainable archiving and validation of 3D structural models and experimental data derived from integrative or hybrid methods.
Abstract: The Protein Data Bank (PDB), the single global repository of experimentally determined 3D structures of biological macromolecules and their complexes, was established in 1971, becoming the first open-access digital resource in the biological sciences. The PDB archive currently houses ~130,000 entries (May 2017). It is managed by the Worldwide Protein Data Bank organization (wwPDB; wwpdb.org), which includes the RCSB Protein Data Bank (RCSB PDB; rcsb.org), the Protein Data Bank Japan (PDBj; pdbj.org), the Protein Data Bank in Europe (PDBe; pdbe.org), and BioMagResBank (BMRB; www.bmrb.wisc.edu). The four wwPDB partners operate a unified global software system that enforces community-agreed data standards and supports data Deposition, Biocuration, and Validation of ~11,000 new PDB entries annually (deposit.wwpdb.org). The RCSB PDB currently acts as the archive keeper, ensuring disaster recovery of PDB data and coordinating weekly updates. wwPDB partners disseminate the same archival data from multiple FTP sites, while operating complementary websites that provide their own views of PDB data with selected value-added information and links to related data resources. At present, the PDB archives experimental data, associated metadata, and 3D atomic-level structural models derived from three well-established methods: crystallography, nuclear magnetic resonance spectroscopy (NMR), and electron microscopy (3DEM). wwPDB partners are working closely with experts in related experimental areas (small-angle scattering, chemical cross-linking/mass spectrometry, Förster resonance energy transfer or FRET, etc.) to establish a federation of data resources that will support sustainable archiving and validation of 3D structural models and experimental data derived from integrative or hybrid methods.
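The archival files behind these entries use the legacy fixed-width PDB format, whose ATOM/HETATM records place each field in specified column ranges. A minimal reader, sketched below, extracts only a few common fields (the sample coordinate line is fabricated for the example, not a real entry):

```python
def parse_atom_record(line):
    """Parse one fixed-width ATOM/HETATM record from a PDB-format file.
    Column ranges follow the wwPDB format specification; only a minimal
    subset of fields is extracted here."""
    if not line.startswith(("ATOM", "HETATM")):
        return None
    return {
        "name": line[12:16].strip(),      # atom name, cols 13-16
        "res_name": line[17:20].strip(),  # residue name, cols 18-20
        "chain": line[21],                # chain identifier, col 22
        "res_seq": int(line[22:26]),      # residue sequence number, cols 23-26
        "x": float(line[30:38]),          # orthogonal coordinates in angstroms,
        "y": float(line[38:46]),          # cols 31-54
        "z": float(line[46:54]),
    }

# Fabricated example record for illustration:
record = parse_atom_record(
    "ATOM      1  N   MET A   1      38.428  13.104   6.364  1.00 54.69"
)
```

Production code would instead use the PDBx/mmCIF format that the wwPDB now treats as the master format, but the fixed-column layout above is what the classic `.pdb` files distributed from the FTP sites contain.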

Journal ArticleDOI
TL;DR: Overall effect size (ES) and moderator effects were assessed using multilevel modeling to address ES dependency that is common, but typically not modeled, in meta-analyses; only youth-focused behavioral therapies showed similar and robust effects across youth, parent, and teacher reports.
Abstract: Across 5 decades, hundreds of randomized trials have tested psychological therapies for youth internalizing (anxiety, depression) and externalizing (misconduct, attention deficit and hyperactivity disorder) disorders and problems. Since the last broad-based youth meta-analysis in 1995, the number of trials has almost tripled and data-analytic methods have been refined. We applied these methods to the expanded study pool (447 studies; 30,431 youths), synthesizing 50 years of findings and identifying implications for research and practice. We assessed overall effect size (ES) and moderator effects using multilevel modeling to address ES dependency that is common, but typically not modeled, in meta-analyses. Mean posttreatment ES was 0.46; the probability that a youth in the treatment condition would fare better than a youth in the control condition was 63%. Effects varied according to multiple moderators, including the problem targeted in treatment: Mean ES at posttreatment was strongest for anxiety (0.61), weakest for depression (0.29), and nonsignificant for multiproblem treatment (0.15). ESs differed across control conditions, with "usual care" emerging as a potent comparison condition, and across informants, highlighting the need to obtain and integrate multiple perspectives on outcome. Effects of therapy type varied by informant; only youth-focused behavioral therapies (including cognitive-behavioral therapy) showed similar and robust effects across youth, parent, and teacher reports. Effects did not differ for Caucasian versus minority samples, but more diverse samples are needed. The findings underscore the benefits of psychological treatments as well as the need for improved therapies and more representative, informative, and rigorous intervention science.
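The 63% figure quoted alongside the mean ES of 0.46 is the common-language effect size (probability of superiority), which for normally distributed outcomes follows directly from Cohen's d as P = Φ(d/√2). A quick check of that conversion:

```python
import math

def probability_of_superiority(d):
    """Common-language effect size: probability that a randomly chosen
    treated individual outperforms a randomly chosen control individual,
    assuming normal outcome distributions with equal variances.
    P = Phi(d / sqrt(2)), computed here via the error function."""
    return 0.5 * (1.0 + math.erf(d / 2.0))  # Phi(d/sqrt(2)) since Phi(x)=0.5(1+erf(x/sqrt(2)))

# The mean posttreatment ES of 0.46 corresponds to roughly 63%:
p = probability_of_superiority(0.46)
```

With d = 0, the probability is exactly 50% (no treatment advantage), and d = 0.46 yields approximately 0.63, matching the abstract.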