
Showing papers by "York University" published in 2012


Journal ArticleDOI
Georges Aad1, T. Abajyan2, Brad Abbott3, Jalal Abdallah4  +2964 moreInstitutions (200)
TL;DR: In this article, a search for the Standard Model Higgs boson in proton-proton collisions with the ATLAS detector at the LHC is presented; an excess of events over the expected background is observed with a significance of 5.9 standard deviations, corresponding to a background fluctuation probability of 1.7×10−9.

9,282 citations


Journal ArticleDOI
TL;DR: These guidelines are presented for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes.
Abstract: In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. A key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process vs. those that measure flux through the autophagy pathway (i.e., the complete process); thus, a block in macroautophagy that results in autophagosome accumulation needs to be differentiated from stimuli that result in increased autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular autophagy assays, we hope to encourage technical innovation in the field.

4,316 citations


Journal ArticleDOI
TL;DR: After an integrated review of the neural mechanisms involved in contour grouping, border ownership, and figure-ground perception, the authors conclude by evaluating what modern vision science has offered compared to traditional Gestalt psychology, whether one can speak of a Gestalt revival, and where the remaining limitations and challenges lie.
Abstract: In 1912, Max Wertheimer published his paper on phi motion, widely recognized as the start of Gestalt psychology. Because of its continued relevance in modern psychology, this centennial anniversary is an excellent opportunity to take stock of what Gestalt psychology has offered and how it has changed since its inception. We first introduce the key findings and ideas in the Berlin school of Gestalt psychology, and then briefly sketch its development, rise, and fall. Next, we discuss its empirical and conceptual problems, and indicate how they are addressed in contemporary research on perceptual grouping and figure–ground organization. In particular, we review the principles of grouping, both classical (e.g., proximity, similarity, common fate, good continuation, closure, symmetry, parallelism) and new (e.g., synchrony, common region, element and uniform connectedness), and their role in contour integration and completion. We then review classic and new image-based principles of figure–ground organization, how it is influenced by past experience and attention, and how it relates to shape and depth perception. After an integrated review of the neural mechanisms involved in contour grouping, border ownership, and figure–ground perception, we conclude by evaluating what modern vision science has offered compared to traditional Gestalt psychology, whether we can speak of a Gestalt revival, and where the remaining limitations and challenges lie. A better integration of this research tradition with the rest of vision science requires further progress regarding the conceptual and theoretical foundations of the Gestalt approach, which is the focus of a second review article.

1,047 citations


Proceedings ArticleDOI
25 Mar 2012
TL;DR: The proposed CNN architecture is applied to speech recognition within the framework of a hybrid NN-HMM model, using local filtering and max-pooling in the frequency domain to normalize speaker variance and achieve higher multi-speaker speech recognition performance.
Abstract: Convolutional Neural Networks (CNN) have shown success in achieving translation invariance for many image processing tasks. The success is largely attributed to the use of local filtering and max-pooling in the CNN architecture. In this paper, we propose to apply CNN to speech recognition within the framework of the hybrid NN-HMM model. We propose to use local filtering and max-pooling in the frequency domain to normalize speaker variance and achieve higher multi-speaker speech recognition performance. In our method, a pair of local filtering and max-pooling layers is added at the lowest end of the neural network (NN) to normalize spectral variations of speech signals. In our experiments, the proposed CNN architecture is evaluated in a speaker-independent speech recognition task using the standard TIMIT data sets. Experimental results show that the proposed CNN method can achieve over 10% relative error reduction on the core TIMIT test sets when compared with a regular NN using the same number of hidden layers and weights. Our results also show that the best result of the proposed CNN model is better than previously published results on the same TIMIT test sets that use a pre-trained deep NN model.

901 citations
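The abstract above describes adding a pair of local filtering (convolution) and max-pooling layers along the frequency axis below the fully connected layers of a hybrid NN-HMM acoustic model. The sketch below is a minimal Python/PyTorch illustration of that idea; the layer sizes, the 40-bin log-mel input, the 11-frame context window, and the number of HMM states are assumptions for illustration, not the paper's exact configuration.

```python
# Minimal sketch (assumed sizes): hybrid NN-HMM acoustic model whose lowest
# layers do local filtering and max-pooling along the frequency axis only.
import torch
import torch.nn as nn

class FreqCNNAcousticModel(nn.Module):
    def __init__(self, n_freq_bins=40, n_frames=11, n_hmm_states=183):
        super().__init__()
        # Local filtering along frequency only: kernel spans 8 bins, 1 frame.
        self.conv = nn.Conv2d(in_channels=1, out_channels=64, kernel_size=(8, 1))
        # Max-pooling along frequency to absorb speaker-dependent spectral shifts.
        self.pool = nn.MaxPool2d(kernel_size=(3, 1))
        pooled_bins = (n_freq_bins - 8 + 1) // 3  # frequency bins left after conv + pool
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * pooled_bins * n_frames, 1024),
            nn.Sigmoid(),
            nn.Linear(1024, n_hmm_states),  # posteriors over tied HMM states
        )

    def forward(self, x):
        # x: (batch, 1, n_freq_bins, n_frames) log-mel filterbank context window
        h = torch.relu(self.conv(x))
        h = self.pool(h)
        return self.classifier(h)

# Example: a batch of 4 context windows, 40 mel bins x 11 frames each.
logits = FreqCNNAcousticModel()(torch.randn(4, 1, 40, 11))
print(logits.shape)  # torch.Size([4, 183])
```

Pooling only along the frequency dimension is what gives the front end some tolerance to speaker-dependent spectral shifts, which is the effect the abstract highlights.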


Journal ArticleDOI
TL;DR: In this article, the authors test hypotheses regarding differences in brand-related user-generated content (UGC) between Twitter (a microblogging site), Facebook (a social network) and YouTube (a content community) using data from a content analysis of 600 UGC posts for two retail-apparel brands.

780 citations


Journal ArticleDOI
TL;DR: PEPPSI (pyridine-enhanced precatalyst preparation, stabilization, and initiation) palladium precatalysts with bulky NHC ligands have established themselves as successful alternatives to palladium phosphine complexes.
Abstract: Palladium-catalyzed cross-coupling reactions enable organic chemists to form C-C bonds in targeted positions and under mild conditions. Although phosphine ligands have been intensively researched, in the search for even better cross-coupling catalysts attention has recently turned to the use of N-heterocyclic carbene (NHC) ligands, which form a strong bond to the palladium center. PEPPSI (pyridine-enhanced precatalyst preparation, stabilization, and initiation) palladium precatalysts with bulky NHC ligands have established themselves as successful alternatives to palladium phosphine complexes. This Review shows the success of these species in Suzuki-Miyaura, Negishi, and Stille-Migita cross-couplings as well as in amination and sulfination reactions.

740 citations


Posted Content
TL;DR: Corporate Social Responsibility (CSR) has become a pervasive topic in the business literature but has largely neglected the role of institutions; as discussed by the authors, this suggests going beyond grounding CSR in the voluntary behaviour of companies and understanding the larger historical and political determinants of whether and in what forms corporations take on social responsibilities.
Abstract: Corporate Social Responsibility (CSR) has become a pervasive topic in the business literature, but has largely neglected the role of institutions. This introductory article to the Special Issue of Socio-Economic Review examines the potential contributions of institutional theory to understanding CSR as a mode of governance. This perspective suggests going beyond grounding CSR in the voluntary behaviour of companies, and understanding the larger historical and political determinants of whether and in what forms corporations take on social responsibilities. Historically, the prevailing notion of CSR emerged through the defeat of more institutionalized forms of social solidarity in liberal market economies. Meanwhile, CSR is more tightly linked to formal institutions of stakeholder participation or state intervention in other advanced economies. The tensions between business-driven and multi-stakeholder forms of CSR extend to the transnational level, where the form and meaning of CSR remain highly contested. CSR research and practice thus rest on a basic paradox between a liberal notion of voluntary engagement and a contrary implication of socially binding responsibilities. Institutional theory seems to be a promising avenue to explore how the boundaries between business and society are constructed in different ways, and improve our understanding of the effectiveness of CSR within the wider institutional field of economic governance.

717 citations


Journal ArticleDOI
TL;DR: Recognizing the “holobiont”—the multicellular eukaryote plus its colonies of persistent symbionts—as a critically important unit of anatomy, development, physiology, immunology, and evolution opens up new investigative avenues and conceptually challenges the ways in which the biological subdisciplines have heretofore characterized living entities.
Abstract: The notion of the "biological individual" is crucial to studies of genetics, immunology, evolution, development, anatomy, and physiology. Each of these biological subdisciplines has a specific conception of individuality, which has historically provided conceptual contexts for integrating newly acquired data. During the past decade, nucleic acid analysis, especially genomic sequencing and high-throughput RNA techniques, has challenged each of these disciplinary definitions by finding significant interactions of animals and plants with symbiotic microorganisms that disrupt the boundaries that heretofore had characterized the biological individual. Animals cannot be considered individuals by anatomical or physiological criteria because a diversity of symbionts are both present and functional in completing metabolic pathways and serving other physiological functions. Similarly, these new studies have shown that animal development is incomplete without symbionts. Symbionts also constitute a second mode of genetic inheritance, providing selectable genetic variation for natural selection. The immune system also develops, in part, in dialogue with symbionts and thereby functions as a mechanism for integrating microbes into the animal-cell community. Recognizing the "holobiont"--the multicellular eukaryote plus its colonies of persistent symbionts--as a critically important unit of anatomy, development, physiology, immunology, and evolution opens up new investigative avenues and conceptually challenges the ways in which the biological subdisciplines have heretofore characterized living entities.

694 citations


Journal ArticleDOI
TL;DR: In this article, the potential contributions of institutional theory to understanding corporate social responsibility as a mode of governance are examined; this perspective suggests going beyond grounding CSR in the voluntary behaviour of companies and considering the larger historical and political determinants of whether and in what forms corporations take on social responsibilities.
Abstract: Corporate Social Responsibility (CSR) has become a pervasive topic in the business literature, but has largely neglected the role of institutions. This introductory article to the Special Issue of Socio-Economic Review examines the potential contributions of institutional theory to understanding CSR as a mode of governance. This perspective suggests going beyond grounding CSR in the voluntary behaviour of companies, and understanding the larger historical and political determinants of whether and in what forms corporations take on social responsibilities. Historically, the prevailing notion of CSR emerged through the defeat of more institutionalized forms of social solidarity in liberal market economies. Meanwhile, CSR is more tightly linked to formal institutions of stakeholder participation or state intervention in other advanced economies. The tensions between business-driven and multi-stakeholder forms of CSR extend to the transnational level, where the form and meaning of CSR remain highly contested. CSR research and practice thus rest on a basic paradox between a liberal notion of voluntary engagement and a contrary implication of socially binding responsibilities. Institutional theory seems to be a promising avenue to explore how the boundaries between business and society are constructed in different ways, and improve our understanding of the effectiveness of CSR within the wider institutional field of economic governance.

660 citations


Journal ArticleDOI
Perry Sadorsky1
TL;DR: In this paper, multivariate GARCH models are used to model conditional correlations and to analyze the volatility spillovers between oil prices and the stock prices of clean energy companies and technology companies.

606 citations
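The TL;DR above names multivariate GARCH models of conditional correlations. As background, one widely used specification (not necessarily the exact one estimated in the paper) is the DCC-GARCH model, which writes the conditional covariance of the return vector as:

```latex
% DCC-GARCH: univariate GARCH variances plus a dynamic conditional correlation matrix
\begin{aligned}
r_t &= \mu_t + \varepsilon_t, \qquad \varepsilon_t \mid \mathcal{F}_{t-1} \sim N(0, H_t), \qquad H_t = D_t R_t D_t,\\
D_t &= \operatorname{diag}\!\big(\sqrt{h_{1,t}},\ldots,\sqrt{h_{n,t}}\big), \qquad
h_{i,t} = \omega_i + \alpha_i \varepsilon_{i,t-1}^2 + \beta_i h_{i,t-1},\\
Q_t &= (1-a-b)\,\bar{Q} + a\, z_{t-1} z_{t-1}^{\top} + b\, Q_{t-1}, \qquad
R_t = \operatorname{diag}(Q_t)^{-1/2}\, Q_t \,\operatorname{diag}(Q_t)^{-1/2},
\end{aligned}
```

where z_t = D_t^{-1} ε_t are the standardized residuals and Q̄ is their unconditional correlation matrix; the off-diagonal dynamics of H_t carry the conditional-correlation and volatility-spillover information the paper analyzes.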


Journal ArticleDOI
Georges Aad1, Brad Abbott2, Jalal Abdallah3, S. Abdel Khalek  +3081 moreInstitutions (197)
TL;DR: A combined search for the Standard Model Higgs boson with the ATLAS experiment at the LHC, using datasets corresponding to integrated luminosities from 1.04 fb(-1) to 4.9 fb(-1) of pp collisions, is described in this paper.

Journal ArticleDOI
31 Aug 2012-Science
TL;DR: Direct measurements show that ambient atmospheric particulate black carbon absorbs less solar radiation than theory predicted, suggesting that many climate models may be overestimating the amount of warming caused by black carbon emissions.
Abstract: Atmospheric black carbon (BC) warms Earth's climate, and its reduction has been targeted for near-term climate change mitigation. Models that include forcing by BC assume internal mixing with non-BC aerosol components that enhance BC absorption, often by a factor of ~2; such model estimates have yet to be clearly validated through atmospheric observations. Here, direct in situ measurements of BC absorption enhancements (Eabs) and mixing state are reported for two California regions. The observed Eabs is small, 6% on average at 532 nm, and increases weakly with photochemical aging. The Eabs is less than predicted from observationally constrained theoretical calculations, suggesting that many climate models may overestimate warming by BC. These ambient observations stand in contrast to laboratory measurements that show substantial Eabs values for BC are possible.
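For readers outside aerosol science, the absorption enhancement reported above is conventionally defined as the ratio of absorption by internally mixed (coated) BC to absorption by the bare BC core; this is the standard definition, not a formula quoted from the paper:

```latex
% Absorption enhancement of black carbon due to internal mixing (coatings)
E_{\mathrm{abs}}(\lambda) = \frac{b_{\mathrm{abs}}^{\,\mathrm{mixed}}(\lambda)}{b_{\mathrm{abs}}^{\,\mathrm{bare\ BC}}(\lambda)},
```

where b_abs is the absorption coefficient. The reported 6% average enhancement at 532 nm corresponds to Eabs ≈ 1.06, compared with the factor of ~2 often assumed in models.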

Journal ArticleDOI
TL;DR: This work argues that boredom is universally conceptualized as “the aversive experience of wanting, but being unable, to engage in satisfying activity,” and proposes that boredom be defined in terms of attention.
Abstract: Our central goal is to provide a definition of boredom in terms of the underlying mental processes that occur during an instance of boredom. Through the synthesis of psychodynamic, existential, arousal, and cognitive theories of boredom, we argue that boredom is universally conceptualized as "the aversive experience of wanting, but being unable, to engage in satisfying activity." We propose to map this conceptualization onto underlying mental processes. Specifically, we propose that boredom be defined in terms of attention. That is, boredom is the aversive state that occurs when we (a) are not able to successfully engage attention with internal (e.g., thoughts or feelings) or external (e.g., environmental stimuli) information required for participating in satisfying activity, (b) are focused on the fact that we are not able to engage attention and participate in satisfying activity, and (c) attribute the cause of our aversive state to the environment. We believe that our definition of boredom fully accounts for the phenomenal experience of boredom, brings existing theories of boredom into dialogue with one another, and suggests specific directions for future research on boredom and attention.

Journal ArticleDOI
TL;DR: In this paper, a structural vector autoregression model is proposed to investigate the dynamic relationship between oil prices, exchange rates and emerging market stock prices, and the model also captures stylized facts regarding movements in oil prices.
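As a hedged sketch of the kind of model named above (a recursively identified structural VAR on synthetic data, not the paper's exact specification, identification scheme, or dataset), the snippet below estimates a reduced-form VAR and recovers structural shocks via a Cholesky factor of the residual covariance:

```python
# Recursive (Cholesky-identified) structural VAR sketch on three synthetic
# return series standing in for oil prices, exchange rates, and emerging
# market stock prices. Illustrative only.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
data = pd.DataFrame(rng.standard_normal((500, 3)) * 0.01,
                    columns=["oil", "fx", "stocks"])

var_res = VAR(data).fit(2)                    # reduced-form VAR(2)
sigma_u = np.asarray(var_res.sigma_u)         # residual covariance matrix
P = np.linalg.cholesky(sigma_u)               # lower-triangular impact matrix
structural_shocks = var_res.resid.values @ np.linalg.inv(P).T

irf = var_res.irf(10)                         # impulse response analysis
print(P)                                      # contemporaneous impacts of the shocks
print(irf.orth_irfs.shape)                    # (11, 3, 3): horizon x variable x shock
```

The recursive ordering above (oil first, stocks last) is one common identification choice; the paper's own identification restrictions may differ.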

Journal ArticleDOI
Georges Aad, B. Abbott1, Jalal Abdallah2, A. A. Abdelalim3  +3013 moreInstitutions (174)
TL;DR: In this article, detailed measurements of the electron performance of the ATLAS detector at the LHC were reported, using decays of the Z, W and J/psi particles.
Abstract: Detailed measurements of the electron performance of the ATLAS detector at the LHC are reported, using decays of the Z, W and J/psi particles. Data collected in 2010 at root s = 7 TeV are used, corresponding to an integrated luminosity of almost 40 pb(-1). The inter-alignment of the inner detector and the electromagnetic calorimeter, the determination of the electron energy scale and resolution, and the performance in terms of response uniformity and linearity are discussed. The electron identification, reconstruction and trigger efficiencies, as well as the charge misidentification probability, are also presented.
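As general background for the resolution studies mentioned above (the standard parameterization used for electromagnetic calorimeters, not numbers taken from the paper), the fractional energy resolution is usually written as:

```latex
% Standard electromagnetic calorimeter energy resolution parameterization
\frac{\sigma_E}{E} = \frac{a}{\sqrt{E}} \;\oplus\; \frac{b}{E} \;\oplus\; c,
```

where a is the sampling (stochastic) term, b the noise term, c the constant term, and ⊕ denotes addition in quadrature; the in situ measurements with Z, W and J/psi decays constrain such terms together with the overall electron energy scale.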

Proceedings ArticleDOI
12 Aug 2012
TL;DR: This is the first study of rumor analysis and detection on Sina Weibo, China's leading micro-blogging service provider; it examines an extensive set of features that can be extracted from the microblogs and trains a classifier to automatically detect rumors from a mixed set of true and false information.
Abstract: The problem of gauging information credibility on social networks has received considerable attention in recent years. Most previous work has chosen Twitter, the world's largest micro-blogging platform, as the premise of research. In this work, we shift the premise and study the problem of information credibility on Sina Weibo, China's leading micro-blogging service provider. With eight times more users than Twitter, Sina Weibo is more of a Facebook-Twitter hybrid than a pure Twitter clone, and exhibits several important characteristics that distinguish it from Twitter. We collect an extensive set of microblogs which have been confirmed to be false rumors based on information from the official rumor-busting service provided by Sina Weibo. Unlike previous studies on Twitter where the labeling of rumors is done manually by the participants of the experiments, the official nature of this service ensures the high quality of the dataset. We then examine an extensive set of features that can be extracted from the microblogs, and train a classifier to automatically detect the rumors from a mixed set of true information and false information. The experiments show that some of the new features we propose are indeed effective in the classification, and even the features considered in previous studies have different implications with Sina Weibo than with Twitter. To the best of our knowledge, this is the first study on rumor analysis and detection on Sina Weibo.
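A minimal sketch of the pipeline the abstract describes: hand-crafted features extracted from microblogs feed a supervised classifier. The feature names, the synthetic data, and the choice of a random forest below are illustrative assumptions; they are not the authors' feature set, dataset, or classifier.

```python
# Illustrative rumor-detection sketch: microblog-level features -> classifier.
# Features and labels are synthetic stand-ins for the Sina Weibo data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(42)
n = 1000
X = np.column_stack([
    rng.normal(5, 2, n),      # log follower count of the posting account
    rng.integers(0, 2, n),    # verified-account flag
    rng.normal(3, 1.5, n),    # log repost count of the microblog
    rng.random(n),            # fraction of the account's posts containing URLs
    rng.normal(0, 1, n),      # sentiment score of the text
])
y = rng.integers(0, 2, n)     # 1 = rumor, 0 = non-rumor (synthetic labels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```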

Journal ArticleDOI
Daniele S. M. Alves1, Nima Arkani-Hamed, S. Arora2, Yang Bai1, Matthew Baumgart3, Joshua Berger4, Matthew R. Buckley5, Bart Butler1, Spencer Chang6, Spencer Chang7, Hsin-Chia Cheng6, Clifford Cheung8, R. Sekhar Chivukula9, Won Sang Cho10, R. Cotta1, Mariarosaria D'Alfonso11, Sonia El Hedri1, Rouven Essig12, Jared A. Evans6, Liam Fitzpatrick13, Patrick J. Fox5, Roberto Franceschini14, Ayres Freitas15, James S. Gainer16, James S. Gainer17, Yuri Gershtein2, R. N.C. Gray2, Thomas Gregoire18, Ben Gripaios19, J.F. Gunion6, Tao Han20, Andy Haas1, P. Hansson1, JoAnne L. Hewett1, Dmitry Hits2, Jay Hubisz21, Eder Izaguirre1, Jared Kaplan1, Emanuel Katz13, Can Kilic2, Hyung Do Kim22, Ryuichiro Kitano23, Sue Ann Koay11, Pyungwon Ko24, David Krohn25, Eric Kuflik26, Ian M. Lewis20, Mariangela Lisanti27, Tao Liu11, Zhen Liu20, Ran Lu26, Markus A. Luty6, Patrick Meade12, David E. Morrissey28, Stephen Mrenna5, Mihoko M. Nojiri, Takemichi Okui29, Sanjay Padhi30, Michele Papucci31, Michael Park2, Myeonghun Park32, Maxim Perelstein4, Michael E. Peskin1, Daniel J. Phalen6, Keith Rehermann33, Vikram Rentala34, Vikram Rentala35, Tuhin S. Roy36, Joshua T. Ruderman27, Veronica Sanz37, Martin Schmaltz13, S. Schnetzer2, Philip Schuster38, Pedro Schwaller17, Pedro Schwaller39, Pedro Schwaller40, Matthew D. Schwartz25, Ariel Schwartzman1, Jing Shao21, J. Shelton41, David Shih2, Jing Shu10, Daniel Silverstein1, Elizabeth H. Simmons9, Sunil Somalwar2, Michael Spannowsky7, Christian Spethmann13, Matthew J. Strassler2, Shufang Su35, Shufang Su34, Tim M. P. Tait34, Brooks Thomas42, Scott Thomas2, Natalia Toro38, Tomer Volansky8, Jay G. Wacker1, Wolfgang Waltenberger43, Itay Yavin44, Felix Yu34, Yue Zhao2, Kathryn M. Zurek26 
TL;DR: A collection of simplified models relevant to the design of new-physics searches at the Large Hadron Collider (LHC) and the characterization of their results is presented in this paper.
Abstract: This document proposes a collection of simplified models relevant to the design of new-physics searches at the Large Hadron Collider (LHC) and the characterization of their results. Both ATLAS and CMS have already presented some results in terms of simplified models, and we encourage them to continue and expand this effort, which supplements both signature-based results and benchmark model interpretations. A simplified model is defined by an effective Lagrangian describing the interactions of a small number of new particles. Simplified models can equally well be described by a small number of masses and cross-sections. These parameters are directly related to collider physics observables, making simplified models a particularly effective framework for evaluating searches and a useful starting point for characterizing positive signals of new physics. This document serves as an official summary of the results from the 'Topologies for Early LHC Searches' workshop, held at SLAC in September of 2010, the purpose of which was to develop a set of representative models that can be used to cover all relevant phase space in experimental searches. Particular emphasis is placed on searches relevant for the first ~50-500 pb(-1) of data and those motivated by supersymmetric models. This note largely summarizes material posted at http://lhcnewphysics.org/, which includes simplified model definitions, Monte Carlo material, and supporting contacts within the theory community. We also comment on future developments that may be useful as more data is gathered and analyzed by the experiments.
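To make the notion of "an effective Lagrangian describing the interactions of a small number of new particles" concrete, here is one generic example of a simplified model, a Dirac dark-matter candidate chi coupled to quarks through a vector mediator Z'; this is an illustration of the concept, not a model taken from the workshop document:

```latex
% Generic simplified-model interaction terms: quarks and a Dirac fermion \chi
% coupled to a vector mediator Z'
\mathcal{L}_{\mathrm{int}} = g_q \, Z'_{\mu} \sum_{q} \bar{q}\,\gamma^{\mu} q
  \;+\; g_{\chi} \, Z'_{\mu}\, \bar{\chi}\,\gamma^{\mu}\chi ,
```

so that collider signatures are controlled by only four parameters, {m_{Z'}, m_chi, g_q, g_chi}, matching the paper's point that a simplified model is specified by a small number of masses and cross-sections.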

Journal ArticleDOI
TL;DR: CMC make multiple transitions across providers and care settings, and CMC with TA have higher costs and home care use; initiatives to improve their health outcomes and decrease costs need to focus on the entire continuum of care.
Abstract: BACKGROUND AND OBJECTIVE: Health care use of children with medical complexity (CMC), such as those with neurologic impairment or other complex chronic conditions (CCCs) and those with technology assistance (TA), is not well understood. The objective of the study was to evaluate health care utilization and costs in a population-based sample of CMC in Ontario, Canada. METHODS: Hospital discharge data from 2005 through 2007 identified CMC. Complete health system use and costs were analyzed over the subsequent 2-year period. RESULTS: The study identified 15 771 hospitalized CMC (0.67% of children in Ontario); 10 340 (65.6%) had single-organ CCC, 1063 (6.7%) multiorgan CCC, 4368 (27.6%) neurologic impairment, and 1863 (11.8%) had TA. CMC saw a median of 13 outpatient physicians and 6 distinct subspecialists. Thirty-six percent received home care services. Thirty-day readmission varied from 12.6% (single CCC without TA) to 23.7% (multiple CCC with TA). CMC accounted for almost one-third of child health spending. Rehospitalization accounted for the largest proportion of subsequent costs (27.2%), followed by home care (11.3%) and physician services (6.0%). Home care costs were a much larger proportion of costs in children with TA. Children with multiple CCC with TA had costs 3.5 times higher than children with a single CCC without TA. CONCLUSIONS: Although a small proportion of the population, CMC account for a substantial proportion of health care costs. CMC make multiple transitions across providers and care settings and CMC with TA have higher costs and home care use. Initiatives to improve their health outcomes and decrease costs need to focus on the entire continuum of care.

Journal ArticleDOI
TL;DR: This study builds an understanding of bullying experiences among children with ASD based on parent reports by examining rates of various forms of bullying, exploring the association between victimization and mental health problems, and investigating individual and contextual variables as correlates of victimization.
Abstract: Few studies have investigated bullying experiences among children diagnosed with autism spectrum disorders (ASD); however, preliminary research suggests that children with ASD are at greater risk for being bullied than typically developing peers. The aim of the current study was to build an understanding of bullying experiences among children with ASD based on parent reports by examining rates of various forms of bullying, exploring the association between victimization and mental health problems, and investigating individual and contextual variables as correlates of victimization. Victimization was related to child age, internalizing and externalizing mental health problems, communication difficulties, and number of friends at school, as well as parent mental health problems. Bullying prevention and intervention strategies are discussed.

Journal ArticleDOI
TL;DR: In this article, the authors investigate the rapid internationalization of many multinationals from emerging economies through acquisition in advanced economies, and they conceptualize these acquisitions as an act and form of entrepreneurship, aimed to overcome the "liability of emergingness" incurred by these firms and to serve as a mechanism for competitive catch-up through opportunity seeking and capability transformation.
Abstract: We investigate the rapid internationalization of many multinationals from emerging economies through acquisition in advanced economies. We conceptualize these acquisitions as an act and form of entrepreneurship, aimed to overcome the ‘liability of emergingness’ incurred by these firms and to serve as a mechanism for competitive catch-up through opportunity seeking and capability transformation. Our explanation emphasizes (1) the unique asymmetries (and not necessarily advantages) distinguishing emerging multinationals from advanced economy multinationals due to their historical and institutional differences, as well as (2) a search for advantage creation when firms possess mainly ordinary resources. The argument shifts the central focus from advantage to asymmetries as the starting point for internationalization and, additionally, highlights the role of learning agility rather than ability as a potential ‘asset of emergingness.’

Journal ArticleDOI
Georges Aad1, Brad Abbott2, J. Abdallah3, S. Abdel Khalek4  +3073 moreInstitutions (193)
TL;DR: In this paper, a Fourier analysis of the charged particle pair distribution in relative azimuthal angle (Delta phi = phi(a) - phi(b)) is performed to extract the coefficients v(n,n) = <cos(n Delta phi)>.
Abstract: Differential measurements of charged particle azimuthal anisotropy are presented for lead-lead collisions at root s(NN) = 2.76 TeV with the ATLAS detector at the LHC, based on an integrated luminosity of approximately 8 mu b(-1). This anisotropy is characterized via a Fourier expansion of the distribution of charged particles in azimuthal angle relative to the reaction plane, with the coefficients v(n) denoting the magnitude of the anisotropy. Significant v(2)-v(6) values are obtained as a function of transverse momentum (0.5 < p(T) < 20 GeV), pseudorapidity (|eta| < 2.5), and centrality, using an event-plane method. The v(n) values for n >= 3 are found to vary weakly with both eta and centrality, and their p(T) dependencies are found to follow an approximate scaling relation, v(n)(1/n)(p(T)) proportional to v(2)(1/2)(p(T)), except in the top 5% most central collisions. A Fourier analysis of the charged particle pair distribution in relative azimuthal angle (Delta phi = phi(a) - phi(b)) is performed to extract the coefficients v(n,n) = <cos(n Delta phi)>. For pairs of charged particles with a large pseudorapidity gap (|Delta eta| = |eta(a) - eta(b)| > 2) and one particle with p(T) < 3 GeV, the v(2,2)-v(6,6) values are found to factorize as v(n,n)(p(T)(a), p(T)(b)) approximate to v(n)(p(T)(a))v(n)(p(T)(b)) in central and midcentral events. Such factorization suggests that these values of v(2,2)-v(6,6) are primarily attributable to the response of the created matter to the fluctuations in the geometry of the initial state. A detailed study shows that the v(1,1)(p(T)(a), p(T)(b)) data are consistent with the combined contributions from a rapidity-even v(1) and global momentum conservation. A two-component fit is used to extract the v(1) contribution. The extracted v(1) is observed to cross zero at p(T) approximate to 1.0 GeV, reaches a maximum at 4-5 GeV with a value comparable to that for v(3), and decreases at higher p(T).
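Written out in the standard notation of heavy-ion flow analyses, the Fourier characterization summarized above is:

```latex
% Single-particle azimuthal distribution and two-particle Fourier coefficients
\frac{dN}{d\phi} \propto 1 + 2\sum_{n=1}^{\infty} v_n \cos\!\big[n(\phi - \Psi_n)\big],
\qquad
v_{n,n} = \big\langle \cos\, n\,\Delta\phi \big\rangle, \quad \Delta\phi = \phi_a - \phi_b,
```

with Psi_n the n-th order event-plane angle; the factorization result quoted in the abstract then reads v_{n,n}(p_T^a, p_T^b) ≈ v_n(p_T^a) v_n(p_T^b).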

Journal ArticleDOI
TL;DR: It is shown that the use of multiple molecules leads to reduced error rate in a manner akin to diversity order in wireless communications, and that the additive inverse Gaussian noise channel model is appropriate for molecular communication in fluid media.
Abstract: In this paper, we consider molecular communication, with information conveyed in the time of release of molecules. These molecules propagate from the transmitter to the receiver through a fluid medium, propelled by a positive drift velocity and Brownian motion. The main contribution of this paper is the development of a theoretical foundation for such a communication system; specifically, the additive inverse Gaussian noise (AIGN) channel model. In such a channel, the information is corrupted by noise that follows an IG distribution. We show that such a channel model is appropriate for molecular communication in fluid media. Taking advantage of the available literature on the IG distribution, upper and lower bounds on channel capacity are developed, and a maximum likelihood receiver is derived. Results are presented which suggest that this channel does not have a single quality measure analogous to signal-to-noise ratio in the additive white Gaussian noise channel. It is also shown that the use of multiple molecules leads to reduced error rate in a manner akin to diversity order in wireless communications. Finally, some open problems are discussed that arise from the IG channel model.
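For context, the inverse Gaussian law referred to above is the first-passage-time distribution of Brownian motion with positive drift; its density and the standard first-passage parameterization (a textbook result, not a formula transcribed from the paper) are:

```latex
% Inverse Gaussian density: first-passage time of a Wiener process with drift v > 0
% and variance parameter \sigma^2 to a receiver at distance d from the transmitter
f(t;\mu,\lambda) = \sqrt{\frac{\lambda}{2\pi t^{3}}}\,
\exp\!\left(-\frac{\lambda (t-\mu)^{2}}{2\mu^{2} t}\right), \quad t>0,
\qquad \mu = \frac{d}{v}, \quad \lambda = \frac{d^{2}}{\sigma^{2}},
```

so in the AIGN model the arrival time of a molecule is its release time plus IG-distributed noise with these parameters.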

Journal ArticleDOI
Georges Aad1, Georges Aad2, Brad Abbott1, Brad Abbott3  +5592 moreInstitutions (189)
TL;DR: The ATLAS trigger system as discussed by the authors selects events by rapidly identifying signatures of muon, electron, photon, tau lepton, jet, and B meson candidates, as well as using global event signatures, such as missing transverse energy.
Abstract: Proton-proton collisions at root s = 7 TeV and heavy ion collisions at root s(NN) = 2.76 TeV were produced by the LHC and recorded using the ATLAS experiment's trigger system in 2010. The LHC is designed with a maximum bunch crossing rate of 40 MHz and the ATLAS trigger system is designed to record approximately 200 of these per second. The trigger system selects events by rapidly identifying signatures of muon, electron, photon, tau lepton, jet, and B meson candidates, as well as using global event signatures, such as missing transverse energy. An overview of the ATLAS trigger system, the evolution of the system during 2010, and the performance of the trigger system components and selections based on the 2010 collision data are shown. A brief outline of plans for the trigger system in 2011 is presented.
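The design rates quoted above imply an overall online rejection factor of roughly:

```latex
% Rejection factor implied by the quoted ATLAS trigger design rates
\frac{40\ \mathrm{MHz}\ \text{(bunch crossings)}}{\approx 200\ \mathrm{Hz}\ \text{(recorded events)}} \approx 2 \times 10^{5},
```

i.e., only about one in every two hundred thousand bunch crossings can be kept, which is why the fast multi-level signature selections described in the paper are needed.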

Journal ArticleDOI
TL;DR: The entropy model of uncertainty (EMU), an integrative theoretical framework that applies the idea of entropy to the human information system to understand uncertainty-related anxiety, is proposed; in this model, uncertainty is experienced subjectively as anxiety.
Abstract: Entropy, a concept derived from thermodynamics and information theory, describes the amount of uncertainty and disorder within a system. Self-organizing systems engage in a continual dialogue with the environment and must adapt themselves to changing circumstances to keep internal entropy at a manageable level. We propose the entropy model of uncertainty (EMU), an integrative theoretical framework that applies the idea of entropy to the human information system to understand uncertainty-related anxiety. Four major tenets of EMU are proposed: (a) Uncertainty poses a critical adaptive challenge for any organism, so individuals are motivated to keep it at a manageable level; (b) uncertainty emerges as a function of the conflict between competing perceptual and behavioral affordances; (c) adopting clear goals and belief structures helps to constrain the experience of uncertainty by reducing the spread of competing affordances; and (d) uncertainty is experienced subjectively as anxiety and is associated with activity in the anterior cingulate cortex and with heightened noradrenaline release. By placing the discussion of uncertainty management, a fundamental biological necessity, within the framework of information theory and self-organizing systems, our model helps to situate key psychological processes within a broader physical, conceptual, and evolutionary context.
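For readers unfamiliar with the information-theoretic sense of the term, the entropy the model borrows is Shannon's measure of the uncertainty of a distribution over possible states (a textbook definition, not an equation from the paper):

```latex
% Shannon entropy of a discrete distribution p over possible states x
H(X) = -\sum_{x} p(x)\,\log_2 p(x)\ \ \text{bits},
```

which is largest when all states, here the competing perceptual and behavioral affordances, are equally likely, and zero when a single option dominates; EMU's claim is that keeping this quantity at a manageable level is what clear goals and belief structures accomplish.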

Journal ArticleDOI
TL;DR: An overview of the scientific capabilities of WRFDA is provided, together with results from sample operational implementations at the U.S. and international levels, and the challenges associated with balancing academic, research, and operational data assimilation requirements in the context of the WRF effort to date are discussed.
Abstract: Data assimilation is the process by which observations are combined with short-range NWP model output to produce an analysis of the state of the atmosphere at a specified time. Since its inception in the late 1990s, the multiagency Weather Research and Forecasting (WRF) model effort has had a strong data assimilation component, dedicating two working groups to the subject. This article documents the history of the WRF data assimilation effort, and discusses the challenges associated with balancing academic, research, and operational data assimilation requirements in the context of the WRF effort to date. The WRF Model's Community Variational/Ensemble Data Assimilation System (WRFDA) has evolved over the past 10 years, and has resulted in over 30 refereed publications to date, as well as implementation in a wide range of real-time and operational NWP systems. This paper provides an overview of the scientific capabilities of WRFDA, and together with results from sample operation implementations at the U.S. ...
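The variational analysis at the core of a system like WRFDA minimizes the standard variational cost function, shown here as general background (WRFDA also offers ensemble and hybrid options, as its name indicates):

```latex
% Variational data assimilation cost function: background term plus observation term
J(\mathbf{x}) = \tfrac{1}{2}\,(\mathbf{x}-\mathbf{x}_b)^{\top}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
 + \tfrac{1}{2}\,\big(\mathbf{y}-H(\mathbf{x})\big)^{\top}\mathbf{R}^{-1}\big(\mathbf{y}-H(\mathbf{x})\big),
```

where x_b is the short-range forecast (background), y the observations, H the observation operator, and B and R the background- and observation-error covariances; the minimizing x is the analysis described in the abstract's first sentence.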

Journal ArticleDOI
TL;DR: The results achieved by different methods are compared and analysed to identify both promising strategies for automatic urban object extraction from current airborne sensor data and common problems of state-of-the-art methods.
Abstract: For more than two decades, many efforts have been made to develop methods for extracting urban objects from data acquired by airborne sensors. In order to make the results of such algorithms more comparable, benchmarking data sets are of paramount importance. Such a data set, consisting of airborne image and laser scanner data, has been made available to the scientific community. Researchers were encouraged to submit results of urban object detection and 3D building reconstruction, which were evaluated based on reference data. This paper presents the outcomes of the evaluation for building detection, tree detection, and 3D building reconstruction. The results achieved by different methods are compared and analysed to identify both promising strategies for automatic urban object extraction from current airborne sensor data and common problems of state-of-the-art methods.

Journal ArticleDOI
I. Pĝris1, I. Pĝris2, Patrick Petitjean1, Éric Aubourg, Stephen Bailey3, Nicholas P. Ross3, Adam D. Myers4, Adam D. Myers5, Michael A. Strauss6, Scott F. Anderson7, Eduard Arnau, Julian E. Bautista, D. V. Bizyaev8, Adam S. Bolton9, Jo Bovy, W. N. Brandt10, Howard Brewington8, J. R. Browstein9, Nicolás G. Busca, Daniel M. Capellupo11, Daniel M. Capellupo12, William Carithers3, Rupert A. C. Croft13, Kyle S. Dawson9, T. Delubac14, Garrett Ebelke8, Daniel J. Eisenstein15, P. Engelke16, Xiaohui Fan17, N. Filiz Ak18, N. Filiz Ak10, Hayley Finley1, Andreu Font-Ribera3, Andreu Font-Ribera19, Jian Ge11, R. R. Gibson7, Patrick B. Hall20, Fred Hamann11, Joseph F. Hennawi5, Shirley Ho13, David W. Hogg21, Å Ivezić7, Linhua Jiang17, Amy Kimball7, Amy Kimball22, D. Kirkby23, Jessica A. Kirkpatrick3, Khee-Gan Lee6, Khee-Gan Lee5, J. M. Le Goff14, Britt Lundgren16, Chelsea L. MacLeod, Elena Malanushenko8, Viktor Malanushenko8, Claudia Maraston24, Ian D. McGreer17, Richard G. McMahon25, Jordi Miralda-Escudé, Demitri Muna21, Pasquier Noterdaeme1, Daniel Oravetz8, Nathalie Palanque-Delabrouille14, Kaike Pan8, Ismael Perez-Fournon26, Ismael Perez-Fournon27, Matthew M. Pieri24, Gordon T. Richards28, Emmanuel Rollinde1, Erin Sheldon29, David J. Schlegel3, Donald P. Schneider10, Anže Slosar29, Alaina Shelden8, Yue Shen15, A. Simmons8, S. A. Snedden8, Nao Suzuki30, Nao Suzuki3, Jeremy L. Tinker21, M. Viel, Benjamin A. Weaver21, David H. Weinberg31, Martin White3, W. M. Wood-Vasey32, C. Yeche14 
TL;DR: The Data Release 9 Quasar (DR9Q) catalog from the Baryon Oscillation Spectroscopic Survey (BOSS) of the Sloan Digital Sky Survey III is presented in this article.
Abstract: We present the Data Release 9 Quasar (DR9Q) catalog from the Baryon Oscillation Spectroscopic Survey (BOSS) of the Sloan Digital Sky Survey III. The catalog includes all BOSS objects that were targeted as quasar candidates during the survey, are spectroscopically confirmed as quasars via visual inspection, have luminosities Mi[z = 2] < -20.5 (in a ΛCDM cosmology with H0 = 70 km s-1 Mpc-1, ΩM = 0.3, and ΩΛ = 0.7), and either display at least one emission line with full width at half maximum (FWHM) larger than 500 km s-1 or, if not, have interesting/complex absorption features. It also includes known quasars (mostly from SDSS-I and II) that were reobserved by BOSS. This catalog contains 87 822 quasars (78 086 are new discoveries) detected over 3275 deg2 with robust identification and redshift measured by a combination of principal component eigenspectra newly derived from a training set of 8632 spectra from SDSS-DR7. The number of quasars with z > 2.15 (61 931) is ~2.8 times larger than the number of z > 2.15 quasars previously known. Redshifts and FWHMs are provided for the strongest emission lines (C iv, C iii], Mg ii). The catalog identifies 7533 broad absorption line quasars and gives their characteristics. For each object the catalog presents five-band (u, g, r, i, z) CCD-based photometry with typical accuracy of 0.03 mag, and information on the morphology and selection method. The catalog also contains X-ray, ultraviolet, near-infrared, and radio emission properties of the quasars, when available, from other large-area surveys. The calibrated digital spectra cover the wavelength region 3600-10 500 Å at a spectral resolution in the range 1300 < R < 2500; the spectra can be retrieved from the SDSS Catalog Archive Server. We also provide a supplemental list of an additional 949 quasars that have been identified among galaxy targets of the BOSS or among quasar targets after DR9 was frozen.

Journal ArticleDOI
TL;DR: Fourteen specific emission control measures targeting BC and methane would have substantial co-benefits for air quality and public health worldwide, potentially reversing trends of increasing air pollution concentrations and mortality in Africa and South, West, and Central Asia.
Abstract: Background: Tropospheric ozone and black carbon (BC), a component of fine particulate matter (PM ≤ 2.5 µm in aerodynamic diameter; PM2.5), are associated with premature mortality and they disrupt g...

Journal ArticleDOI
06 Jul 2012-Vaccine
TL;DR: Interventions aimed at improving education about, and access to, analgesic interventions during immunization injections performed in childhood are recommended in order to prevent the development of needle fears and vaccine non-compliance.

Journal ArticleDOI
01 Nov 2012-Diabetes
TL;DR: Both AE and RE alone are effective for reducing abdominal fat and intrahepatic lipid in obese adolescent boys and RE but not AE is also associated with significant improvements in insulin sensitivity.
Abstract: The optimal exercise modality for reductions of abdominal obesity and risk factors for type 2 diabetes in youth is unknown. We examined the effects of aerobic exercise (AE) versus resistance exercise (RE) without caloric restriction on abdominal adiposity, ectopic fat, and insulin sensitivity and secretion in youth. Forty-five obese adolescent boys were randomly assigned to one of three 3-month interventions: AE, RE, or a nonexercising control. Abdominal fat was assessed by magnetic resonance imaging, and intrahepatic lipid and intramyocellular lipid were assessed by proton magnetic resonance spectroscopy. Insulin sensitivity and secretion were evaluated by a 3-h hyperinsulinemic-euglycemic clamp and a 2-h hyperglycemic clamp. Both AE and RE prevented the significant weight gain that was observed in controls. Compared with controls, significant reductions in total and visceral fat and intrahepatic lipid were observed in both exercise groups. Compared with controls, a significant improvement in insulin sensitivity (27%) was observed in the RE group. Collapsed across groups, changes in visceral fat were associated with changes in intrahepatic lipid (r = 0.72) and insulin sensitivity (r = -0.47). Both AE and RE alone are effective for reducing abdominal fat and intrahepatic lipid in obese adolescent boys. RE but not AE is also associated with significant improvements in insulin sensitivity.