
Showing papers by "Stony Brook University published in 2013"


Journal ArticleDOI
01 Sep 2013-Stroke
TL;DR: A multidisciplinary panel of neurointerventionalists, neuroradiologists, and stroke neurologists with extensive experience in neuroimaging and IAT convened at the “Consensus Meeting on Revascularization Grading Following Endovascular Therapy” with the goal of addressing heterogeneity in cerebral angiographic revascularization grading.
Abstract: See related article, p 2509 Intra-arterial therapy (IAT) for acute ischemic stroke (AIS) has dramatically evolved during the past decade to include aspiration and stent-retriever devices. Recent randomized controlled trials have demonstrated the superior revascularization efficacy of stent-retrievers compared with the first-generation Merci device.1,2 Additionally, the Diffusion and Perfusion Imaging Evaluation for Understanding Stroke Evolution (DEFUSE) 2, the Mechanical Retrieval and Recanalization of Stroke Clots Using Embolectomy (MR RESCUE), and the Interventional Management of Stroke (IMS) III trials have confirmed the importance of early revascularization for achieving better clinical outcome.3–5 Despite these data, the current heterogeneity in cerebral angiographic revascularization grading (CARG) poses a major obstacle to further advances in stroke therapy. To date, several CARG scales have been used to measure the success of IAT.6–14 Even when the same scale is used in different studies, it is applied using varying operational criteria, which further confounds the interpretation of this key metric.10 The lack of a uniform grading approach limits comparison of revascularization rates across clinical trials and hinders the translation of promising, early phase angiographic results into proven, clinically effective treatments.6–14 For these reasons, it is critical that CARG scales be standardized and end points for successful revascularization be refined.6 This will lead to a greater understanding of the aspects of revascularization that are strongly predictive of clinical response. The optimal grading scale must demonstrate (1) a strong correlation with clinical outcome, (2) simplicity and feasibility of scale interpretation while ensuring characterization of relevant angiographic findings, and (3) high inter-rater reproducibility. To address these issues, a multidisciplinary panel of neurointerventionalists, neuroradiologists, and stroke neurologists with extensive experience in neuroimaging and IAT convened at the “Consensus Meeting on Revascularization Grading Following Endovascular Therapy” with the goal …

1,162 citations


Journal ArticleDOI
TL;DR: TERT and ATRX mutations were mutually exclusive, suggesting that these two genetic mechanisms confer equivalent selective growth advantages and provide a biomarker that may be useful for the early detection of urinary tract and liver tumors and aid in the classification and prognostication of brain tumors.
Abstract: Malignant cells, like all actively growing cells, must maintain their telomeres, but genetic mechanisms responsible for telomere maintenance in tumors have only recently been discovered. In particular, mutations of the telomere binding proteins alpha thalassemia/mental retardation syndrome X-linked (ATRX) or death-domain associated protein (DAXX) have been shown to underlie a telomere maintenance mechanism not involving telomerase (alternative lengthening of telomeres), and point mutations in the promoter of the telomerase reverse transcriptase (TERT) gene increase telomerase expression and have been shown to occur in melanomas and a small number of other tumors. To further define the tumor types in which this latter mechanism plays a role, we surveyed 1,230 tumors of 60 different types. We found that tumors could be divided into types with low (<15%) and high (≥15%) frequencies of TERT promoter mutations. The nine TERT-high tumor types almost always originated in tissues with relatively low rates of self-renewal, including melanomas, liposarcomas, hepatocellular carcinomas, urothelial carcinomas, squamous cell carcinomas of the tongue, medulloblastomas, and subtypes of gliomas (including 83% of primary glioblastoma, the most common brain tumor type). TERT and ATRX mutations were mutually exclusive, suggesting that these two genetic mechanisms confer equivalent selective growth advantages. In addition to their implications for understanding the relationship between telomeres and tumorigenesis, TERT mutations provide a biomarker that may be useful for the early detection of urinary tract and liver tumors and aid in the classification and prognostication of brain tumors.
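To make the grouping rule concrete, here is a minimal sketch of the <15% / ≥15% frequency split described above, assuming per-type mutation counts as input; the counts below are illustrative placeholders (only the 83% glioblastoma figure is taken from the abstract).

```python
# Sketch of the TERT-promoter frequency split: tumor types are grouped into
# TERT-low (<15%) and TERT-high (>=15%). Counts are illustrative placeholders.
surveyed = {
    # tumor type: (samples with TERT promoter mutation, samples surveyed)
    "primary glioblastoma": (83, 100),   # 83% figure quoted in the abstract
    "melanoma": (29, 42),                # hypothetical counts
    "colorectal carcinoma": (1, 35),     # hypothetical counts
}

def classify(mutant, total, threshold=0.15):
    freq = mutant / total
    return ("TERT-high" if freq >= threshold else "TERT-low"), freq

for tumor, (m, n) in surveyed.items():
    group, freq = classify(m, n)
    print(f"{tumor}: {freq:.0%} -> {group}")
```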

1,143 citations


Journal ArticleDOI
04 Jan 2013-Science
TL;DR: A global map of zoogeographic regions is generated by combining data on the distributions and phylogenetic relationships of 21,037 species of amphibians, birds, and mammals, and it is shown that spatial turnover in the phylogenetic composition of vertebrate assemblages is higher in the Southern than in the Northern Hemisphere.
Abstract: Modern attempts to produce biogeographic maps focus on the distribution of species, and the maps are typically drawn without phylogenetic considerations. Here, we generate a global map of zoogeographic regions by combining data on the distributions and phylogenetic relationships of 21,037 species of amphibians, birds, and mammals. We identify 20 distinct zoogeographic regions, which are grouped into 11 larger realms. We document the lack of support for several regions previously defined based on distributional data and show that spatial turnover in the phylogenetic composition of vertebrate assemblages is higher in the Southern than in the Northern Hemisphere. We further show that the integration of phylogenetic information provides valuable insight on historical relationships among regions, permitting the identification of evolutionarily unique regions of the world.
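The turnover measure behind such maps compares assemblages by the phylogenetic branches their species span, not just by shared species. A minimal sketch of one pβsim-style index follows; the tree branches, lengths, and regions are hypothetical, and the paper's exact metric may differ.

```python
# Toy phylogenetic turnover: each assemblage is the set of tree branches its
# species span; turnover is one minus the shared branch length (pbeta-sim style).
def phylo_beta_sim(branches_a, branches_b, lengths):
    shared = sum(lengths[b] for b in branches_a & branches_b)
    only_a = sum(lengths[b] for b in branches_a - branches_b)
    only_b = sum(lengths[b] for b in branches_b - branches_a)
    return 1.0 - shared / (shared + min(only_a, only_b))

lengths = {"b1": 4.0, "b2": 2.0, "b3": 1.5, "b4": 3.0}   # branch lengths (made up)
region_x = {"b1", "b2", "b3"}   # branches spanned by region X's vertebrates
region_y = {"b1", "b4"}         # branches spanned by region Y's vertebrates
print(f"turnover(X, Y) = {phylo_beta_sim(region_x, region_y, lengths):.2f}")
```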

1,014 citations


Journal ArticleDOI
TL;DR: It is shown how to generate random symmetric structures, and how to introduce 'smart' variation operators that learn about preferable local environments; these substantially improve the efficiency of the evolutionary algorithm USPEX and allow reliable prediction of structures with up to ∼200 atoms in the unit cell.
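For readers unfamiliar with this class of method, a generic evolutionary-search skeleton is sketched below. It is not USPEX itself, which couples such a loop to ab initio structure relaxation and symmetry-aware operators; every function here is a stand-in.

```python
# Generic evolutionary structure search: random (symmetry-constrained) seeds,
# then selection plus heredity/mutation operators. All components are stand-ins.
import random

def random_symmetric_structure(n_atoms=8):
    # Placeholder: a "structure" is just a coordinate vector here.
    return [random.uniform(0.0, 1.0) for _ in range(n_atoms)]

def energy(structure):                       # stand-in objective to minimize
    return sum((x - 0.5) ** 2 for x in structure)

def heredity(parent_a, parent_b):            # combine slabs of two parents
    cut = len(parent_a) // 2
    return parent_a[:cut] + parent_b[cut:]

def mutate(structure, scale=0.05):           # a "smart" operator would bias this
    return [x + random.gauss(0.0, scale) for x in structure]

population = [random_symmetric_structure() for _ in range(20)]
for generation in range(50):
    population.sort(key=energy)
    survivors = population[:10]
    children = [mutate(heredity(*random.sample(survivors, 2))) for _ in range(10)]
    population = survivors + children
print(f"best energy after search: {energy(min(population, key=energy)):.4f}")
```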

1,010 citations


Journal ArticleDOI
08 Feb 2013-Science
TL;DR: A phylogenetic tree shows that crown clade Placentalia and placental orders originated after the K-Pg boundary, but phenomic signals overturn molecular signals to show Sundatheria (Dermoptera + Scandentia) as the sister taxon of Primates, a close link between Proboscidea and Sirenia (sea cows), and the monophyly of echolocating Chiroptera (bats).
Abstract: To discover interordinal relationships of living and fossil placental mammals and the time of origin of placentals relative to the Cretaceous-Paleogene (K-Pg) boundary, we scored 4541 phenomic characters de novo for 86 fossil and living species. Combining these data with molecular sequences, we obtained a phylogenetic tree that, when calibrated with fossils, shows that crown clade Placentalia and placental orders originated after the K-Pg boundary. Many nodes discovered using molecular data are upheld, but phenomic signals overturn molecular signals to show Sundatheria (Dermoptera + Scandentia) as the sister taxon of Primates, a close link between Proboscidea (elephants) and Sirenia (sea cows), and the monophyly of echolocating Chiroptera (bats). Our tree suggests that Placentalia first split into Xenarthra and Epitheria; extinct New World species are the oldest members of Afrotheria.

1,003 citations


Journal ArticleDOI
TL;DR: It is revealed that stable microvasculature constitutes a dormant niche, whereas sprouting neovasculature sparks micrometastatic outgrowth, a surprising result confirmed in dormancy models and in zebrafish.
Abstract: In a significant fraction of breast cancer patients, distant metastases emerge after years or even decades of latency. How disseminated tumour cells (DTCs) are kept dormant, and what wakes them up, are fundamental problems in tumour biology. To address these questions, we used metastasis assays in mice and showed that dormant DTCs reside on microvasculature of lung, bone marrow and brain. We then engineered organotypic microvascular niches to determine whether endothelial cells directly influence breast cancer cell (BCC) growth. These models demonstrated that endothelial-derived thrombospondin-1 induces sustained BCC quiescence. This suppressive cue was lost in sprouting neovasculature; time-lapse analysis showed that sprouting vessels not only permit, but accelerate BCC outgrowth. We confirmed this surprising result in dormancy models and in zebrafish, and identified active TGF-β1 and periostin as tumour-promoting factors derived from endothelial tip cells. Our work reveals that stable microvasculature constitutes a dormant niche, whereas sprouting neovasculature sparks micrometastatic outgrowth.

901 citations


Journal ArticleDOI
TL;DR: Revised criteria are proposed for pediatric acute disseminated encephalomyelitis, pediatric clinically isolated syndrome, pediatric neuromyelitis optica and pediatric MS to incorporate advances in delineating the clinical and neuroradiologic features of these disorders.
Abstract: Background: There has been tremendous growth in research in pediatric multiple sclerosis (MS) and immune mediated central nervous system demyelinating disorders since operational definitions for these conditions were first proposed in 2007. Further, the International Pediatric Multiple Sclerosis Study Group (IPMSSG), which proposed the criteria, has expanded substantially in membership and in its international scope. Objective: The purpose of this review is to revise the 2007 definitions in order to incorporate advances in delineating the clinical and neuroradiologic features of these disorders. Methods: Through a consensus process, in which input was sought from the 150 members of the Study Group, criteria were drafted, revised and finalized. Final approval was sought through a web survey. Results: Revised criteria are proposed for pediatric acute disseminated encephalomyelitis, pediatric clinically isolated syndrome, pediatric neuromyelitis optica and pediatric MS. These criteria were approved by 93% or more of the 56 Study Group members who responded to the final survey. Conclusions: These definitions are proposed for clinical and research purposes. Their utility will depend on the outcomes of their application in prospective research.

830 citations


Journal ArticleDOI
TL;DR: A two-step solid-state reaction for preparing cobalt molybdenum nitride with a nanoscale morphology has been used to produce a highly active and stable electrocatalyst for the hydrogen evolution reaction (HER) under acidic conditions that achieves an iR-corrected current density of 10 mA cm⁻² at −0.20 V vs RHE at low catalyst loadings.
Abstract: A two-step solid-state reaction for preparing cobalt molybdenum nitride with a nanoscale morphology has been used to produce a highly active and stable electrocatalyst for the hydrogen evolution reaction (HER) under acidic conditions that achieves an iR-corrected current density of 10 mA cm⁻² at −0.20 V vs RHE at low catalyst loadings of 0.24 mg/cm² in rotating disk experiments under a H2 atmosphere. Neutron powder diffraction and pair distribution function (PDF) studies have been used to overcome the insensitivity of X-ray diffraction data to different transition-metal nitride structural polytypes and show that this cobalt molybdenum nitride crystallizes in space group P6₃/mmc with lattice parameters of a = 2.85176(2) Å and c = 10.9862(3) Å and a formula of Co0.6Mo1.4N2. This space group results from the four-layered stacking sequence of a mixed close-packed structure with alternating layers of transition metals in octahedral and trigonal prismatic coordination and is a structure type for which HER activ...
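The iR correction mentioned above is a standard ohmic-drop adjustment, E_corrected = E_measured − i·R_u. A small sketch follows; the electrode area, measured potential, and uncompensated resistance are hypothetical values chosen only so the output matches the −0.20 V figure quoted in the abstract.

```python
# iR correction of a measured electrode potential (ohmic drop i*R_u).
def ir_corrected_potential(e_measured_V, current_A, r_uncompensated_ohm):
    return e_measured_V - current_A * r_uncompensated_ohm

area_cm2 = 0.196        # rotating-disk electrode area, cm^2 (hypothetical)
i = -10e-3 * area_cm2   # current at -10 mA/cm^2 (cathodic, hence negative), A
e_meas = -0.215         # measured potential vs RHE, V (hypothetical)
r_u = 7.5               # uncompensated solution resistance, ohm (hypothetical)

print(f"E(iR-corrected) = {ir_corrected_potential(e_meas, i, r_u):.3f} V vs RHE")
```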

828 citations


Journal ArticleDOI
TL;DR: The proposed system to automatically generate natural language descriptions from images is very effective at producing relevant sentences for images and generates descriptions that are notably more true to the specific image content than previous work.
Abstract: We present a system to automatically generate natural language descriptions from images. This system consists of two parts. The first part, content planning, smooths the output of computer vision-based detection and recognition algorithms with statistics mined from large pools of visually descriptive text to determine the best content words to use to describe an image. The second step, surface realization, chooses words to construct natural language sentences based on the predicted content and general statistics from natural language. We present multiple approaches for the surface realization step and evaluate each using automatic measures of similarity to human generated reference descriptions. We also collect forced choice human evaluations between descriptions from the proposed generation system and descriptions from competing approaches. The proposed system is very effective at producing relevant sentences for images. It also generates descriptions that are notably more true to the specific image content than previous work.
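A toy rendering of the two-stage split may help orient readers. The detection scores, co-occurrence statistics, and template below are made up; the actual system smooths vision outputs with large text-mined statistics and uses learned language models rather than a fixed template.

```python
# Toy content-planning / surface-realization pipeline.
detections = {"dog": 0.92, "frisbee": 0.55, "sofa": 0.12}   # vision scores (made up)
corpus_cooccurrence = {("dog", "frisbee"): 0.8, ("dog", "sofa"): 0.3}

def plan_content(dets, cooc, keep=2):
    # Rescore each word by detector confidence times average co-occurrence
    # support from the other candidates (mined from descriptive text).
    def score(w):
        support = [cooc.get(tuple(sorted((w, o))), 0.1) for o in dets if o != w]
        return dets[w] * sum(support) / len(support)
    return sorted(dets, key=score, reverse=True)[:keep]

def realize(words):
    # Minimal surface realization: slot the planned content into a template.
    return f"A {words[0]} with a {words[1]}." if len(words) > 1 else f"A {words[0]}."

print(realize(plan_content(detections, corpus_cooccurrence)))  # "A dog with a frisbee."
```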

791 citations


Journal ArticleDOI
M. G. Aartsen, Rasha Abbasi, Y. Abdou, Markus Ackermann +284 more (36 institutions)
TL;DR: These two neutrino-induced events could be a first indication of an astrophysical neutrino flux; the moderate significance, however, does not permit a definitive conclusion at this time.
Abstract: We report on the observation of two neutrino-induced events which have an estimated deposited energy in the IceCube detector of 1.04 ± 0.16 and 1.14 ± 0.17 PeV, respectively, the highest neutrino energies observed so far. These events are consistent with fully contained particle showers induced by neutral-current ν_{e,μ,τ} (ν̄_{e,μ,τ}) or charged-current ν_e (ν̄_e) interactions within the IceCube detector. The events were discovered in a search for ultrahigh energy neutrinos using data corresponding to 615.9 days effective live time. The expected number of atmospheric background events is 0.082 ± 0.004(stat) +0.041/−0.057(syst). The probability of observing two or more candidate events under the atmospheric background-only hypothesis is 2.9 × 10⁻³ (2.8σ), taking into account the uncertainty on the expected number of background events. These two events could be a first indication of an astrophysical neutrino flux; the moderate significance, however, does not permit a definitive conclusion at this time.
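The quoted significance is essentially a Poisson tail probability. Under the background-only expectation μ = 0.082 from the abstract, the chance of two or more events works out as below; the paper's 2.9 × 10⁻³ additionally folds in the background uncertainty, so the plain Poisson number differs slightly.

```python
# Probability of >= 2 events for a Poisson background of mu = 0.082.
import math

mu = 0.082                                   # expected background events (abstract)
p_ge_2 = 1.0 - math.exp(-mu) * (1.0 + mu)    # 1 - P(0) - P(1)
print(f"P(N >= 2 | mu = {mu}) = {p_ge_2:.2e}")   # ~3.2e-3, cf. quoted 2.9e-3
```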

786 citations


Proceedings ArticleDOI
27 Aug 2013
TL;DR: SIMPLE, an SDN-based policy enforcement layer for efficient middlebox-specific "traffic steering", is presented; it is a significant step toward addressing industry concerns surrounding the ability of SDN to integrate with existing infrastructure and support L4-L7 capabilities.
Abstract: Networks today rely on middleboxes to provide critical performance, security, and policy compliance capabilities. Achieving these benefits and ensuring that the traffic is directed through the desired sequence of middleboxes requires significant manual effort and operator expertise. In this respect, Software-Defined Networking (SDN) offers a promising alternative. Middleboxes, however, introduce new aspects (e.g., policy composition, resource management, packet modifications) that fall outside the purview of traditional L2/L3 functions that SDN supports (e.g., access control or routing). This paper presents SIMPLE, an SDN-based policy enforcement layer for efficient middlebox-specific "traffic steering". In designing SIMPLE, we take an explicit stance to work within the constraints of legacy middleboxes and existing SDN interfaces. To this end, we address algorithmic and system design challenges to demonstrate the feasibility of using SDN to simplify middlebox traffic steering. In doing so, we also take a significant step toward addressing industry concerns surrounding the ability of SDN to integrate with existing infrastructure and support L4-L7 capabilities.
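One way to picture policy-driven steering of this kind: compile a required middlebox chain into per-hop match-action rules, using a tag to track progress through the chain (since middleboxes may rewrite headers, the tag rather than the header encodes position). The chain, switch placement, and rule format below are hypothetical simplifications, not SIMPLE's actual rule layout.

```python
# Compile a middlebox policy chain into tag-based forwarding rules.
chain = ["firewall", "ids", "proxy"]                       # required sequence
location = {"firewall": "s1", "ids": "s2", "proxy": "s2"}  # middlebox -> switch

def compile_rules(chain, location, egress="s3"):
    hops = [(location[m], m) for m in chain] + [(egress, None)]
    rules = []
    for tag, (sw, mbox) in enumerate(hops):
        action = f"steer to {mbox} at {sw}" if mbox else f"deliver via {sw}"
        rules.append({"match_tag": tag, "action": action, "set_tag": tag + 1})
    return rules

for rule in compile_rules(chain, location):
    print(rule)
```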

Journal ArticleDOI
TL;DR: It is proposed that this MRI approach may provide the basis for a wholly new strategy to evaluate Alzheimer's disease susceptibility and progression in the live human brain.
Abstract: The glymphatic system is a recently defined brain-wide paravascular pathway for cerebrospinal fluid (CSF) and interstitial fluid (ISF) exchange that facilitates efficient clearance of solutes and waste from the brain. CSF enters the brain along para-arterial channels to exchange with ISF, which is in turn cleared from the brain along para-venous pathways. Because soluble amyloid β clearance depends on glymphatic pathway function, we proposed that failure of this clearance system contributes to amyloid plaque deposition and Alzheimer’s disease progression. Here we provide proof of concept that glymphatic pathway function can be measured using a clinically relevant imaging technique. Dynamic contrast-enhanced MRI was used to visualize CSF-ISF exchange across the rat brain following intrathecal paramagnetic contrast agent administration. Key features of glymphatic pathway function were confirmed, including visualization of para-arterial CSF influx and molecular size-dependent CSF-ISF exchange. Whole-brain imaging allowed the identification of two key influx nodes at the pituitary and pineal gland recesses, while dynamic MRI permitted the definition of simple kinetic parameters to characterize glymphatic CSF-ISF exchange and solute clearance from the brain. We propose that this MRI approach may provide the basis for a wholly new strategy to evaluate Alzheimer’s disease susceptibility and progression in the live human brain.
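To illustrate the sort of "simple kinetic parameters" such dynamic imaging yields, here is a sketch that extracts peak enhancement, time to peak, and a crude influx rate from a time-intensity curve; the curve below is synthetic, and the paper's parameter definitions may differ.

```python
# Simple kinetic parameters from a synthetic contrast time-intensity curve.
times = [0, 10, 20, 30, 40, 50, 60, 80, 100, 120]   # minutes
signal = [0, 2, 9, 21, 30, 34, 33, 28, 20, 14]      # enhancement, arbitrary units

peak = max(signal)
t_peak = times[signal.index(peak)]
influx_rate = (peak - signal[0]) / (t_peak - times[0])   # mean slope to peak

print(f"peak enhancement = {peak} a.u. at t = {t_peak} min")
print(f"mean influx rate = {influx_rate:.2f} a.u./min")
```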

Journal ArticleDOI
TL;DR: PED offers a reasonably safe and effective treatment of large or giant intracranial internal carotid artery aneurysms, demonstrated by high rates of complete aneurysm occlusion and low rates of adverse neurologic events, even in aneurysms failing previous alternative treatments.
Abstract: The Pipeline for Uncoilable or Failed Aneurysms study demonstrated a high rate (78 of 108, 73.6%) of complete occlusion of large and giant wide-necked aneurysms of the internal carotid artery and a reasonably low rate of major safety events (6 of 107, 5.6% rate of major stroke or neurologic death).

Journal ArticleDOI
12 Jun 2013-PLOS ONE
TL;DR: This study presents a framework for assessing three dimensions of climate change vulnerability, namely sensitivity, exposure and adaptive capacity, and finds that high concentration areas for species with traits conferring highest sensitivity and lowest adaptive capacity differ from those of highly exposed species.
Abstract: Climate change will have far-reaching impacts on biodiversity, including increasing extinction rates. Current approaches to quantifying such impacts focus on measuring exposure to climatic change and largely ignore the biological differences between species that may significantly increase or reduce their vulnerability. To address this, we present a framework for assessing three dimensions of climate change vulnerability, namely sensitivity, exposure and adaptive capacity; this draws on species’ biological traits and their modeled exposure to projected climatic changes. In the largest such assessment to date, we applied this approach to each of the world’s birds, amphibians and corals (16,857 species). The resulting assessments identify the species with greatest relative vulnerability to climate change and the geographic areas in which they are concentrated, including the Amazon basin for amphibians and birds, and the central Indo-west Pacific (Coral Triangle) for corals. We found that high concentration areas for species with traits conferring highest sensitivity and lowest adaptive capacity differ from those of highly exposed species, and we identify areas where exposure-based assessments alone may over- or under-estimate climate change impacts. We found that 608–851 bird (6–9%), 670–933 amphibian (11–15%), and 47–73 coral species (6–9%) are both highly climate change vulnerable and already threatened with extinction on the IUCN Red List. The remaining highly climate change vulnerable species represent new priorities for conservation. Fewer species are highly climate change vulnerable under lower IPCC SRES emissions scenarios, indicating that reducing greenhouse emissions will reduce climate change driven extinctions. Our study answers the growing call for a more biologically and ecologically inclusive approach to assessing climate change vulnerability. By facilitating independent assessment of the three dimensions of climate change vulnerability, our approach can be used to devise species- and area-specific conservation interventions and indices. The priorities we identify will strengthen global strategies to mitigate climate change impacts.
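The three-dimension logic can be stated compactly: a species is flagged highly vulnerable only when sensitivity and exposure are high and adaptive capacity is low. The sketch below assumes normalized scores and arbitrary thresholds; the study itself derives these dimensions from biological traits and modeled exposure.

```python
# Flag species as highly climate change vulnerable when all three dimensions align.
species = [
    {"name": "frog A",  "sensitivity": 0.9, "exposure": 0.8, "adaptive_capacity": 0.2},
    {"name": "bird B",  "sensitivity": 0.4, "exposure": 0.9, "adaptive_capacity": 0.7},
    {"name": "coral C", "sensitivity": 0.8, "exposure": 0.7, "adaptive_capacity": 0.1},
]

def highly_vulnerable(sp, hi=0.66, lo=0.33):
    return (sp["sensitivity"] >= hi and sp["exposure"] >= hi
            and sp["adaptive_capacity"] <= lo)

for sp in species:
    flag = "highly vulnerable" if highly_vulnerable(sp) else "not flagged"
    print(f'{sp["name"]}: {flag}')
```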


Journal ArticleDOI
TL;DR: The proximate causes of climate-change-related extinctions and their empirical support are reviewed; the evidence supports the idea that changing species interactions are an important cause of documented population declines and extinctions related to climate change.
Abstract: Anthropogenic climate change is predicted to be a major cause of species extinctions in the next 100 years. But what will actually cause these extinctions? For example, will it be limited physiological tolerance to high temperatures, changing biotic interactions or other factors? Here, we systematically review the proximate causes of climate-change related extinctions and their empirical support. We find 136 case studies of climatic impacts that are potentially relevant to this topic. However, only seven identified proximate causes of demonstrated local extinctions due to anthropogenic climate change. Among these seven studies, the proximate causes vary widely. Surprisingly, none show a straightforward relationship between local extinction and limited tolerances to high temperature. Instead, many studies implicate species interactions as an important proximate cause, especially decreases in food availability. We find very similar patterns in studies showing decreases in abundance associated with climate change, and in those studies showing impacts of climatic oscillations. Collectively, these results highlight our disturbingly limited knowledge of this crucial issue but also support the idea that changing species interactions are an important cause of documented population declines and extinctions related to climate change. Finally, we briefly outline general research strategies for identifying these proximate causes in future studies.

Journal ArticleDOI
TL;DR: This review emphasizes biochemical, structural, cell biological, and genetic studies since 2005 that have shed light on many aspects of the NER pathway.
Abstract: Nucleotide excision repair (NER) is the main pathway used by mammals to remove bulky DNA lesions such as those formed by UV light, environmental mutagens, and some cancer chemotherapeutic adducts from DNA. Deficiencies in NER are associated with the extremely skin cancer-prone inherited disorder xeroderma pigmentosum. Although the core NER reaction and the factors that execute it have been known for some years, recent studies have led to a much more detailed understanding of the NER mechanism, how NER operates in the context of chromatin, and how it is connected to other cellular processes such as DNA damage signaling and transcription. This review emphasizes biochemical, structural, cell biological, and genetic studies since 2005 that have shed light on many aspects of the NER pathway.

Proceedings ArticleDOI
19 May 2013
TL;DR: The current knowledge about various protection techniques is systematized by setting up a general model for memory corruption attacks; the model shows which policies can stop which attacks and is used to analyze the reasons why protection mechanisms implementing stricter policies are not deployed.
Abstract: Memory corruption bugs in software written in low-level languages like C or C++ are one of the oldest problems in computer security. The lack of safety in these languages allows attackers to alter the program's behavior or take full control over it by hijacking its control flow. This problem has existed for more than 30 years and a vast number of potential solutions have been proposed, yet memory corruption attacks continue to pose a serious threat. Real-world exploits show that all currently deployed protections can be defeated. This paper sheds light on the primary reasons for this by describing attacks that succeed on today's systems. We systematize the current knowledge about various protection techniques by setting up a general model for memory corruption attacks. Using this model we show what policies can stop which attacks. The model identifies weaknesses of currently deployed techniques, as well as other proposed protections enforcing stricter policies. We analyze the reasons why protection mechanisms implementing stricter policies are not deployed. To achieve wide adoption, protection mechanisms must support a multitude of features and must satisfy a host of requirements. Especially important is performance, as experience shows that only solutions whose overhead is in reasonable bounds get deployed. A comparison of different enforceable policies helps designers of new protection mechanisms in finding the balance between effectiveness (security) and efficiency. We identify some open research problems, and provide suggestions on improving the adoption of newer techniques.
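The systematization can be imagined as a policy-versus-attack matrix: each protection policy blocks certain attack steps, and whatever no deployed policy blocks survives. The entries below are drastic simplifications for illustration, not the paper's actual taxonomy.

```python
# Toy policy-vs-attack matrix: which attacks survive a given deployment?
blocks = {
    "stack canaries":  {"stack-smashing return-address overwrite"},
    "DEP/W^X":         {"injected code execution"},
    "ASLR":            {"hardcoded-address code reuse"},
}
attacks = {
    "stack-smashing return-address overwrite",
    "injected code execution",
    "hardcoded-address code reuse",
    "information-leak-assisted ROP",   # not stopped by the policies listed above
}

def surviving(deployed):
    stopped = set().union(*(blocks[p] for p in deployed))
    return attacks - stopped

print(surviving({"stack canaries", "DEP/W^X", "ASLR"}))
```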

Journal ArticleDOI
TL;DR: High-resolution multinuclear/multidimensional solid-state NMR techniques are used with in situ synchrotron-based techniques to study the prototype conversion material RuO2, demonstrating a protocol for studying the structure and spatial proximities of nanostructures formed in this system, including the amorphous solid electrolyte interphase that grows on battery electrodes.
Abstract: Metal fluorides/oxides (MFx/MxOy) are promising electrodes for lithium-ion batteries that operate through conversion reactions. These reactions are associated with much higher energy densities than intercalation reactions. The fluorides/oxides also exhibit additional reversible capacity beyond their theoretical capacity through mechanisms that are still poorly understood, in part owing to the difficulty in characterizing structure at the nanoscale, particularly at buried interfaces. This study employs high-resolution multinuclear/multidimensional solid-state NMR techniques, with in situ synchrotron-based techniques, to study the prototype conversion material RuO2. The experiments, together with theoretical calculations, show that a major contribution to the extra capacity in this system is due to the generation of LiOH and its subsequent reversible reaction with Li to form Li2O and LiH. The research demonstrates a protocol for studying the structure and spatial proximities of nanostructures formed in this system, including the amorphous solid electrolyte interphase that grows on battery electrodes.
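A back-of-envelope check shows why conversion chemistry promises high capacity, and why capacity beyond it demands explanation. For the textbook full conversion RuO2 + 4Li → Ru + 2Li2O, the theoretical gravimetric capacity is Q = nF/(3.6·M); the constants below are standard values, not figures from the paper.

```python
# Theoretical gravimetric capacity of the RuO2 conversion reaction, in mAh/g.
F = 96485.0                    # Faraday constant, C/mol
M_RuO2 = 101.07 + 2 * 16.00    # molar mass of RuO2, g/mol
n = 4                          # electrons for RuO2 + 4Li -> Ru + 2Li2O

Q_theory = n * F / (3.6 * M_RuO2)
print(f"theoretical conversion capacity of RuO2: {Q_theory:.0f} mAh/g")  # ~806
# Reversible capacity observed beyond this value is what the
# LiOH + 2Li <-> Li2O + LiH mechanism identified in the study accounts for.
```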

Journal ArticleDOI
Georges Aad, T. Abajyan, Brad Abbott, Jalal Abdallah +2942 more (201 institutions)
TL;DR: In this paper, the spin and parity quantum numbers of the Higgs boson were studied based on the collision data collected by the ATLAS experiment at the LHC, and the results showed that the standard model spin-parity J(...

Journal ArticleDOI
TL;DR: This review describes the Procrustes paradigm and the current methodological toolkit of geometric morphometrics, and highlights some of the theoretical advances that have occurred over the past ten years since the prior review (Adams et al., 2004).
Abstract: Twenty years ago, Rohlf and Marcus proclaimed that a “revolution in morphometrics” was underway, where classic analyses based on sets of linear distances were being supplanted by geometric approaches making use of the coordinates of anatomical landmarks. Since that time the field of geometric morphometrics has matured into a rich and cohesive discipline for the study of shape variation and covariation. The development of the field is identified with the Procrustes paradigm, a methodological approach to shape analysis arising from the intersection of the statistical shape theory and analytical procedures for obtaining shape variables from landmark data. In this review we describe the Procrustes paradigm and the current methodological toolkit of geometric morphometrics. We highlight some of the theoretical advances that have occurred over the past ten years since our prior review (Adams et al., 2004), what types of anatomical structures are amenable to these approaches, and how they extend the reach of geometric morphometrics to more specialized applications for addressing particular biological hypotheses. We end with a discussion of some possible areas that are fertile ground for future development in the field.
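The core operation of the Procrustes paradigm, removing location, scale, and rotation so that only shape differences remain, fits in a few lines of NumPy. The sketch below is a plain ordinary Procrustes superimposition on toy landmarks, not any package's full generalized-Procrustes routine.

```python
# Ordinary Procrustes superimposition of two landmark configurations.
import numpy as np

def procrustes_align(ref, target):
    # Remove location (center) and scale (unit centroid size).
    A = ref - ref.mean(axis=0)
    B = target - target.mean(axis=0)
    A /= np.linalg.norm(A)
    B /= np.linalg.norm(B)
    # Optimal rotation from the SVD of the cross-covariance matrix.
    U, _, Vt = np.linalg.svd(B.T @ A)
    return A, B @ (U @ Vt)

square = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
moved = square @ np.array([[0, -1], [1, 0]]) * 2.0 + 5.0   # rotated, scaled, shifted
ref, aligned = procrustes_align(square, moved)
print("Procrustes distance:", np.linalg.norm(ref - aligned))   # ~0: same shape
```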

Journal ArticleDOI
TL;DR: In this paper, the authors presented ~kiloparsec spatial resolution maps of the CO-to-H2 conversion factor (α_CO) and dust-to-gas ratio (DGR) in 26 nearby, star-forming galaxies.
Abstract: We present ~kiloparsec spatial resolution maps of the CO-to-H2 conversion factor (α_CO) and dust-to-gas ratio (DGR) in 26 nearby, star-forming galaxies. We have simultaneously solved for α_CO and the DGR by assuming that the DGR is approximately constant on kiloparsec scales. With this assumption, we can combine maps of dust mass surface density, CO-integrated intensity, and H I column density to solve for both α_CO and the DGR with no assumptions about their value or dependence on metallicity or other parameters. Such a study has just become possible with the availability of high-resolution far-IR maps from the Herschel key program KINGFISH, ¹²CO J = (2-1) maps from the IRAM 30 m large program HERACLES, and H I 21 cm line maps from THINGS. We use a fixed ratio between the (2-1) and (1-0) lines to present our α_CO results on the more typically used ¹²CO J = (1-0) scale and show using literature measurements that variations in the line ratio do not affect our results. In total, we derive 782 individual solutions for α_CO and the DGR. On average, α_CO = 3.1 M_☉ pc⁻² (K km s⁻¹)⁻¹ for our sample with a standard deviation of 0.3 dex. Within galaxies, we observe a generally flat profile of α_CO as a function of galactocentric radius. However, most galaxies exhibit a lower α_CO value in the central kiloparsec, a factor of ~2 below the galaxy mean, on average. In some cases, the central α_CO value can be factors of 5-10 below the standard Milky Way (MW) value of α_CO,MW = 4.4 M_☉ pc⁻² (K km s⁻¹)⁻¹. While for α_CO we find only weak correlations with metallicity, the DGR is well-correlated with metallicity, with an approximately linear slope. Finally, we present several recommendations for choosing an appropriate α_CO for studies of nearby galaxies.
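The simultaneous solution described above admits a compact schematic: for each trial α_CO, compute DGR = Σ_dust / (Σ_HI + α_CO·I_CO) in every pixel of a region and keep the α_CO that makes the DGR most nearly constant. The pixel values below are synthetic, and the real analysis involves more careful weighting.

```python
# Schematic alpha_CO / DGR solver: pick alpha_CO minimizing log-DGR scatter.
import numpy as np

sigma_dust = np.array([0.30, 0.45, 0.24, 0.60, 0.38])  # dust surface density (synthetic)
sigma_HI   = np.array([6.0, 8.0, 5.0, 9.0, 7.0])       # atomic gas (synthetic)
I_CO       = np.array([2.0, 3.5, 1.5, 5.5, 2.8])       # CO intensity, K km/s (synthetic)

def best_alpha_co(alphas=np.logspace(-0.5, 1.5, 200)):
    scatter = [np.std(np.log10(sigma_dust / (sigma_HI + a * I_CO))) for a in alphas]
    return alphas[int(np.argmin(scatter))]

print(f"alpha_CO minimizing DGR scatter: {best_alpha_co():.2f} (synthetic data)")
```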

Journal ArticleDOI
TL;DR: Global total shark mortality needs to be reduced drastically in order to rebuild depleted populations and restore marine ecosystems with functional top predators.

Journal ArticleDOI
Georges Aad, T. Abajyan, Brad Abbott, Jalal Abdallah +2942 more (200 institutions)
TL;DR: In this article, the production properties and couplings of the recently discovered Higgs boson were measured using its decays into boson pairs, based on the complete pp collision data sample recorded by the ATLAS experiment at the CERN Large Hadron Collider at centre-of-mass energies of 7 TeV and 8 TeV, corresponding to an integrated luminosity of about 25/fb.

Proceedings ArticleDOI
27 Aug 2013
TL;DR: A proof-of-concept design of an incrementally deployable ICN architecture is presented; it is found that pervasive caching and nearest-replica routing are not fundamentally necessary, and that most of the performance benefits can be achieved with simpler caching architectures.
Abstract: Information-Centric Networking (ICN) has seen a significant resurgence in recent years. ICN promises benefits to users and service providers along several dimensions (e.g., performance, security, and mobility). These benefits, however, come at a non-trivial cost as many ICN proposals envision adding significant complexity to the network by having routers serve as content caches and support nearest-replica routing. This paper is driven by the simple question of whether this additional complexity is justified and if we can achieve these benefits in an incrementally deployable fashion. To this end, we use trace-driven simulations to analyze the quantitative benefits attributed to ICN (e.g., lower latency and congestion). Somewhat surprisingly, we find that pervasive caching and nearest-replica routing are not fundamentally necessary---most of the performance benefits can be achieved with simpler caching architectures. We also discuss how the qualitative benefits of ICN (e.g., security, mobility) can be achieved without any changes to the network. Building on these insights, we present a proof-of-concept design of an incrementally deployable ICN architecture.
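The paper's central question, how much of ICN's caching benefit survives with far simpler deployments, is the kind of thing a trace-driven cache simulation answers. Below is a miniature version with a single LRU edge cache and a made-up request trace; the actual study replays real traces over richer topologies.

```python
# Miniature trace-driven caching experiment: hit rate of one LRU edge cache.
from collections import OrderedDict

def lru_hit_rate(trace, capacity):
    cache, hits = OrderedDict(), 0
    for obj in trace:
        if obj in cache:
            hits += 1
            cache.move_to_end(obj)          # refresh recency
        else:
            cache[obj] = True
            if len(cache) > capacity:
                cache.popitem(last=False)   # evict least recently used
    return hits / len(trace)

# Zipf-like synthetic trace: a few popular objects dominate requests.
trace = [f"obj{i}" for i in (1, 2, 1, 3, 1, 2, 4, 1, 5, 2, 1, 3, 1, 2, 6, 1)]
print(f"edge-cache hit rate: {lru_hit_rate(trace, capacity=3):.0%}")
```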

Journal ArticleDOI
Georges Aad, T. Abajyan, Brad Abbott, J. Abdallah +2897 more (184 institutions)
TL;DR: In this article, the luminosity calibration for the ATLAS detector at the LHC during pp collisions at √s = 7 TeV in 2010 and 2011 is presented, and a luminosity uncertainty of δL/L = ±3.5% is obtained.
Abstract: The luminosity calibration for the ATLAS detector at the LHC during pp collisions at √s = 7 TeV in 2010 and 2011 is presented. Evaluation of the luminosity scale is performed using several luminosity-sensitive detectors, and comparisons are made of the long-term stability and accuracy of this calibration applied to the pp collisions at √s = 7 TeV. A luminosity uncertainty of δL/L = ±3.5% is obtained for the 47 pb⁻¹ of data delivered to ATLAS in 2010, and an uncertainty of δL/L = ±1.8% is obtained for the 5.5 fb⁻¹ delivered in 2011.

Proceedings Article
14 Aug 2013
TL;DR: This is the first work to apply CFI to complex shared libraries such as glibc; the evaluation demonstrates that the CFI implementation is effective against control-flow hijack attacks and eliminates the vast majority of ROP gadgets.
Abstract: Control-Flow Integrity (CFI) has been recognized as an important low-level security property. Its enforcement can defeat most injected and existing code attacks, including those based on Return-Oriented Programming (ROP). Previous implementations of CFI have required compiler support or the presence of relocation or debug information in the binary. In contrast, we present a technique for applying CFI to stripped binaries on x86/Linux. Ours is the first work to apply CFI to complex shared libraries such as glibc. Through experimental evaluation, we demonstrate that our CFI implementation is effective against control-flow hijack attacks, and eliminates the vast majority of ROP gadgets. To achieve this result, we have developed robust techniques for disassembly, static analysis, and transformation of large binaries. Our techniques have been tested on over 300 MB of binaries (executables and shared libraries).
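Abstractly, the property being enforced is that every indirect control transfer lands in a statically computed set of valid targets. The sketch below only models that check; real enforcement rewrites machine code, and the sites and addresses shown are invented.

```python
# Abstract model of a CFI check: indirect transfers must hit allowed targets.
valid_targets = {
    "call *%rax @0x4011f0": {"0x401800", "0x401950"},   # allowed function entries
    "ret @0x401822":        {"0x4011f5"},               # allowed return site
}

def cfi_check(site, target):
    return target in valid_targets.get(site, set())

print(cfi_check("call *%rax @0x4011f0", "0x401950"))    # True: legitimate target
print(cfi_check("call *%rax @0x4011f0", "0x402abc"))    # False: hijack blocked
```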

Journal ArticleDOI
TL;DR: Research designs are described that discriminate among the remaining models, and a plea is made for the deconstruction of neuroticism; neuroticism is found to be not yet etiologically informative, but useful as an efficient marker of non-specified general risk.

Journal ArticleDOI
TL;DR: The results suggest that more caution should be exercised in genomic medicine settings when analyzing individual genomes, including interpreting positive and negative findings with scrutiny, especially for indels.
Abstract: Background: To facilitate the clinical implementation of genomic medicine by next-generation sequencing, it will be critically important to obtain accurate and consistent variant calls on personal genomes. Multiple software tools for variant calling are available, but it is unclear how comparable these tools are or what their relative merits in real-world scenarios might be. Methods: We sequenced 15 exomes from four families using commercial kits (Illumina HiSeq 2000 platform and Agilent SureSelect version 2 capture kit), with approximately 120X mean coverage. We analyzed the raw data using near-default parameters with five different alignment and variant-calling pipelines (SOAP, BWA-GATK, BWA-SNVer, GNUMAP, and BWA-SAMtools). We additionally sequenced a single whole genome using the sequencing and analysis pipeline from Complete Genomics (CG), with 95% of the exome region being covered by 20 or more reads per base. Finally, we validated 919 single-nucleotide variations (SNVs) and 841 insertions and deletions (indels), including similar fractions of GATK-only, SOAP-only, and shared calls, on the MiSeq platform by amplicon sequencing with approximately 5000X mean coverage. Results: SNV concordance between five Illumina pipelines across all 15 exomes was 57.4%, while 0.5 to 5.1% of variants were called as unique to each pipeline. Indel concordance was only 26.8% between three indel-calling pipelines, even after left-normalizing and intervalizing genomic coordinates by 20 base pairs. There were 11% of CG variants falling within targeted regions in exome sequencing that were not called by any of the Illumina-based exome analysis pipelines. Based on targeted amplicon sequencing on the MiSeq platform, 97.1%, 60.2%, and 99.1% of the GATK-only, SOAP-only and shared SNVs could be validated, but only 54.0%, 44.6%, and 78.1% of the GATK-only, SOAP-only and shared indels could be validated. Additionally, our analysis of two families (one with four individuals and the other with seven) demonstrated additional accuracy gained in variant discovery by having access to genetic data from a multi-generational family. Conclusions: Our results suggest that more caution should be exercised in genomic medicine settings when analyzing individual genomes, including interpreting positive and negative findings with scrutiny, especially for indels. We advocate for renewed collection and sequencing of multi-generational families to increase the overall accuracy of whole genomes.
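The concordance numbers come from straightforward set arithmetic over normalized variant calls: the fraction of the union that every pipeline agrees on, plus per-pipeline unique calls. The call sets below are tiny placeholders standing in for normalized VCF records.

```python
# Variant-call concordance across pipelines via set arithmetic.
calls = {
    "BWA-GATK":     {"chr1:1000A>G", "chr1:2500C>T", "chr2:300G>A", "chr3:77T>C"},
    "SOAP":         {"chr1:1000A>G", "chr2:300G>A", "chr5:12C>G"},
    "BWA-SAMtools": {"chr1:1000A>G", "chr1:2500C>T", "chr2:300G>A"},
}

union = set().union(*calls.values())
shared = set.intersection(*calls.values())
print(f"concordance: {len(shared)}/{len(union)} = {len(shared) / len(union):.1%}")
for name, c in calls.items():
    others = set().union(*(v for k, v in calls.items() if k != name))
    print(f"{name}: {len(c - others)} pipeline-unique call(s)")
```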

Journal ArticleDOI
Jonathan Sievers, Renée Hlozek, Michael R. Nolta, Viviana Acquaviva, Graeme E. Addison, Peter A. R. Ade, Paula Aguirre, Mandana Amiri, John W. Appel, L. Felipe Barrientos, Elia S. Battistelli, Nick Battaglia, J. Richard Bond, Ben Brown, B. Burger, Erminia Calabrese, Jay Chervenak, Devin Crichton, Sudeep Das, Mark J. Devlin, Simon Dicker, W. Bertrand Doriese, Joanna Dunkley, Rolando Dünner, Thomas Essinger-Hileman, David Faber, R. P. Fisher, Joseph W. Fowler, Patricio A. Gallardo, Michael S. Gordon, Megan Gralla, Amir Hajian, Mark Halpern, Matthew Hasselfield, Carlos Hernández-Monteagudo, J. Colin Hill, Gene C. Hilton, Matt Hilton, Adam D. Hincks, Dave Holtz, Kevin M. Huffenberger, David H. Hughes, John P. Hughes, Leopoldo Infante, Kent D. Irwin, David Jacobson, Brittany Johnstone, Jean Baptiste Juin, Madhuri Kaul, Jeff Klein, Arthur Kosowsky, Judy M. Lau, Michele Limon, Yen-Ting Lin, Thibaut Louis, Robert H. Lupton, Tobias A. Marriage, Danica Marsden, Krista Martocci, Philip Daniel Mauskopf, Michael R. McLaren, Felipe Menanteau, Kavilan Moodley, Harvey Moseley, Calvin B. Netterfield, Michael D. Niemack, Lyman A. Page, William A. Page, Lucas Parker, Bruce Partridge, Reed Plimpton, Hernan Quintana, Erik D. Reese, Beth Reid, Felipe Rojas, Neelima Sehgal, Blake D. Sherwin, Benjamin L. Schmitt, David N. Spergel, Suzanne T. Staggs, O. R. Stryzak, Daniel S. Swetz, Eric R. Switzer, Robert Thornton, Hy Trac, Carole Tucker, Masao Uehara, Katerina Visnjic, Ryan Warne, Grant W. Wilson, Edward J. Wollack, Yue Zhao, Caroline Zunckel
TL;DR: In this article, a model of primary cosmological and secondary foreground parameters is fit to the map power spectra and lensing deflection power spectrum, including contributions from both the thermal Sunyaev-Zeldovich (tSZ) effect and the kinematic SZ effect, Poisson and correlated anisotropy from unresolved infrared sources, radio sources and the correlation between the tSZ effect and infrared sources.
Abstract: We present constraints on cosmological and astrophysical parameters from high-resolution microwave background maps at 148 GHz and 218 GHz made by the Atacama Cosmology Telescope (ACT) in three seasons of observations from 2008 to 2010. A model of primary cosmological and secondary foreground parameters is fit to the map power spectra and lensing deflection power spectrum, including contributions from both the thermal Sunyaev-Zeldovich (tSZ) effect and the kinematic Sunyaev-Zeldovich (kSZ) effect, Poisson and correlated anisotropy from unresolved infrared sources, radio sources, and the correlation between the tSZ effect and infrared sources. The power ℓ²Cℓ/2π of the thermal SZ power spectrum at 148 GHz is measured to be 3.4±1.4 μK² at ℓ = 3000, while the corresponding amplitude of the kinematic SZ power spectrum has a 95% confidence level upper limit of 8.6 μK². Combining ACT power spectra with the WMAP 7-year temperature and polarization power spectra, we find excellent consistency with the ΛCDM model. We constrain the number of effective relativistic degrees of freedom in the early universe to be Neff = 2.79±0.56, in agreement with the canonical value of Neff = 3.046 for three massless neutrinos. We constrain the sum of the neutrino masses to be Σmν < 0.39 eV at 95% confidence when combining ACT and WMAP 7-year data with BAO and Hubble constant measurements. We constrain the amount of primordial helium to be Yp = 0.225±0.034, and measure no variation in the fine structure constant α since recombination, with α/α0 = 1.004±0.005. We also find no evidence for any running of the scalar spectral index, dn_s/d ln k = −0.004±0.012.