
Showing papers by "Stanford University published in 2009"


Journal ArticleDOI
01 Feb 2009
TL;DR: In this paper, the authors describe the process of inducting theory using case studies, from specifying the research questions to reaching closure; some features of the process are similar to hypothesis-testing research, while others are unique to the inductive, case-oriented approach.
Abstract: Building Theories From Case Study Research - This paper describes the process of inducting theory using case studies from specifying the research questions to reaching closure. Some features of the process, such as problem definition and construct validation, are similar to hypothesis-testing research. Others, such as within-case analysis and replication logic, are unique to the inductive, case-oriented process. Overall, the process described here is highly iterative and tightly linked to data. This research approach is especially appropriate in new topic areas. The resultant theory is often novel, testable, and empirically valid. Finally, framebreaking insights, the tests of good theory (e.g., parsimony, logical coherence), and convincing grounding in the evidence are the key criteria for evaluating this type of research.

40,005 citations


Proceedings Article
01 Jan 2009
TL;DR: This paper discusses how ROS relates to existing robot software frameworks, and briefly overviews some of the available application software which uses ROS.
Abstract: This paper gives an overview of ROS, an open-source robot operating system. ROS is not an operating system in the traditional sense of process management and scheduling; rather, it provides a structured communications layer above the host operating systems of a heterogeneous compute cluster. In this paper, we discuss how ROS relates to existing robot software frameworks, and briefly overview some of the available application software which uses ROS.

8,387 citations
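As an illustration of the publish/subscribe model the paper describes, here is a minimal "talker" node written against the rospy client library; the topic name, message type, rate, and queue_size argument follow the standard ROS 1 tutorials rather than anything specific to this paper.

import rospy
from std_msgs.msg import String

def talker():
    # Advertise a topic by name; subscribers connect through the ROS
    # graph rather than via direct host addresses.
    pub = rospy.Publisher('chatter', String, queue_size=10)
    rospy.init_node('talker', anonymous=True)
    rate = rospy.Rate(10)  # publish at 10 Hz
    while not rospy.is_shutdown():
        pub.publish(String(data='hello world'))
        rate.sleep()

if __name__ == '__main__':
    try:
        talker()
    except rospy.ROSInterruptException:
        pass

With a roscore running, `rostopic echo /chatter` in another shell shows the messages arriving.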


Journal ArticleDOI
08 Oct 2009-Nature
TL;DR: This paper examined potential sources of missing heritability and proposed research strategies, including and extending beyond current genome-wide association approaches, to illuminate the genetics of complex diseases and enhance its potential to enable effective disease prevention or treatment.
Abstract: Genome-wide association studies have identified hundreds of genetic variants associated with complex human diseases and traits, and have provided valuable insights into their genetic architecture. Most variants identified so far confer relatively small increments in risk, and explain only a small proportion of familial clustering, leading many to question how the remaining, 'missing' heritability can be explained. Here we examine potential sources of missing heritability and propose research strategies, including and extending beyond current genome-wide association approaches, to illuminate the genetics of complex diseases and enhance its potential to enable effective disease prevention or treatment.

7,797 citations


Proceedings ArticleDOI
Craig Gentry
31 May 2009
TL;DR: This work proposes a fully homomorphic encryption scheme that allows one to evaluate circuits over encrypted data without being able to decrypt, and describes a public key encryption scheme using ideal lattices that is almost bootstrappable.
Abstract: We propose a fully homomorphic encryption scheme -- i.e., a scheme that allows one to evaluate circuits over encrypted data without being able to decrypt. Our solution comes in three steps. First, we provide a general result -- that, to construct an encryption scheme that permits evaluation of arbitrary circuits, it suffices to construct an encryption scheme that can evaluate (slightly augmented versions of) its own decryption circuit; we call a scheme that can evaluate its (augmented) decryption circuit bootstrappable. Next, we describe a public key encryption scheme using ideal lattices that is almost bootstrappable. Lattice-based cryptosystems typically have decryption algorithms with low circuit complexity, often dominated by an inner product computation that is in NC1. Also, ideal lattices provide both additive and multiplicative homomorphisms (modulo a public-key ideal in a polynomial ring that is represented as a lattice), as needed to evaluate general circuits. Unfortunately, our initial scheme is not quite bootstrappable -- i.e., the depth that the scheme can correctly evaluate can be logarithmic in the lattice dimension, just like the depth of the decryption circuit, but the latter is greater than the former. In the final step, we show how to modify the scheme to reduce the depth of the decryption circuit, and thereby obtain a bootstrappable encryption scheme, without reducing the depth that the scheme can evaluate. Abstractly, we accomplish this by enabling the encrypter to start the decryption process, leaving less work for the decrypter, much like the server leaves less work for the decrypter in a server-aided cryptosystem.

5,770 citations
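To make "evaluating circuits over encrypted data" concrete, here is a toy symmetric somewhat-homomorphic scheme over the integers in Python. This is a deliberately simplified sketch in the spirit of the later "FHE over the integers" line of work, not the ideal-lattice construction of this paper; it handles only low-depth circuits (the noise grows with each operation) and is not bootstrappable.

import random

def keygen(bits=64):
    # Secret key: a large odd integer.
    return random.getrandbits(bits) | (1 << (bits - 1)) | 1

def encrypt(m, p):
    # Ciphertext c = p*q + 2*r + m; the "noise" 2*r + m must stay below p.
    q = random.getrandbits(128)
    r = random.getrandbits(8)
    return p * q + 2 * r + m

def decrypt(c, p):
    return (c % p) % 2

p = keygen()
c0, c1 = encrypt(0, p), encrypt(1, p)
# Adding ciphertexts XORs the plaintext bits; multiplying ANDs them,
# as long as the accumulated noise remains smaller than p/2.
assert decrypt(c0 + c1, p) == 0 ^ 1
assert decrypt(c0 * c1, p) == 0 & 1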


Journal ArticleDOI
TL;DR: A series of improvements to the spectroscopic reductions are described, including better flat fielding and improved wavelength calibration at the blue end, better processing of objects with extremely strong narrow emission lines, and an improved determination of stellar metallicities.
Abstract: This paper describes the Seventh Data Release of the Sloan Digital Sky Survey (SDSS), marking the completion of the original goals of the SDSS and the end of the phase known as SDSS-II. It includes 11,663 deg^2 of imaging data, with most of the ~2000 deg^2 increment over the previous data release lying in regions of low Galactic latitude. The catalog contains five-band photometry for 357 million distinct objects. The survey also includes repeat photometry on a 120° long, 2°.5 wide stripe along the celestial equator in the Southern Galactic Cap, with some regions covered by as many as 90 individual imaging runs. We include a co-addition of the best of these data, going roughly 2 mag fainter than the main survey over 250 deg^2. The survey has completed spectroscopy over 9380 deg^2; the spectroscopy is now complete over a large contiguous area of the Northern Galactic Cap, closing the gap that was present in previous data releases. There are over 1.6 million spectra in total, including 930,000 galaxies, 120,000 quasars, and 460,000 stars. The data release includes improved stellar photometry at low Galactic latitude. The astrometry has all been recalibrated with the second version of the USNO CCD Astrograph Catalog, reducing the rms statistical errors at the bright end to 45 milliarcseconds per coordinate. We further quantify a systematic error in bright galaxy photometry due to poor sky determination; this problem is less severe than previously reported for the majority of galaxies. Finally, we describe a series of improvements to the spectroscopic reductions, including better flat fielding and improved wavelength calibration at the blue end, better processing of objects with extremely strong narrow emission lines, and an improved determination of stellar metallicities.

5,665 citations


Journal ArticleDOI
TL;DR: In this article, first-principles electronic structure calculations of the layered, stoichiometric crystals Sb2Te3, Sb2Se3, Bi2Te3 and Bi2Se3 were performed.
Abstract: Topological insulators are new states of quantum matter in which surface states residing in the bulk insulating gap of such systems are protected by time-reversal symmetry. The study of such states was originally inspired by the robustness to scattering of conducting edge states in quantum Hall systems. Recently, such analogies have resulted in the discovery of topologically protected states in two-dimensional and three-dimensional band insulators with large spin–orbit coupling. So far, the only known three-dimensional topological insulator is BixSb1−x, which is an alloy with complex surface states. Here, we present the results of first-principles electronic structure calculations of the layered, stoichiometric crystals Sb2Te3, Sb2Se3, Bi2Te3 and Bi2Se3. Our calculations predict that Sb2Te3, Bi2Te3 and Bi2Se3 are topological insulators, whereas Sb2Se3 is not. These topological insulators have robust and simple surface states consisting of a single Dirac cone at the Γ point. In addition, we predict that Bi2Se3 has a topologically non-trivial energy gap of 0.3 eV, which is larger than the energy scale of room temperature. We further present a simple and unified continuum model that captures the salient topological features of this class of materials. First-principles calculations predict that Bi2Se3, Bi2Te3 and Sb2Te3 are topological insulators—three-dimensional semiconductors with unusual surface states generated by spin–orbit coupling—whose surface states are described by a single gapless Dirac cone. The calculations further predict that Bi2Se3 has a non-trivial energy gap larger than the energy scale kBT at room temperature.

4,982 citations
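For reference, the single surface Dirac cone at the Γ point described here is commonly written as a two-band effective Hamiltonian (up to basis and sign conventions, which vary between references):

H_surf(k_x, k_y) = ħ v_F (σ_x k_y − σ_y k_x), with dispersion E_±(k) = ±ħ v_F |k|,

where σ_x, σ_y are Pauli matrices acting in the spin basis and v_F is the surface Fermi velocity.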


Journal ArticleDOI
30 Jul 2009-Blood
TL;DR: The classification of myeloid neoplasms and acute leukemia is highlighted with the aim of familiarizing hematologists, clinical scientists, and hematopathologists not only with the major changes in the classification but also with the rationale for those changes.

4,274 citations


Journal ArticleDOI
25 Nov 2009-Cell
TL;DR: Reduction of lysyl oxidase-mediated collagen crosslinking prevented MMTV-Neu-induced fibrosis, decreased focal adhesions and PI3K activity, impeded malignancy, and lowered tumor incidence, and data show how collagen crosslinking can modulate tissue fibrosis and stiffness to force focal adhesion, growth factor signaling and breast malignancies.

3,396 citations


Proceedings ArticleDOI
02 Aug 2009
TL;DR: This work investigates an alternative paradigm that does not require labeled corpora, avoiding the domain dependence of ACE-style algorithms, and allowing the use of corpora of any size.
Abstract: Modern models of relation extraction for tasks like ACE are based on supervised learning of relations from small hand-labeled corpora. We investigate an alternative paradigm that does not require labeled corpora, avoiding the domain dependence of ACE-style algorithms, and allowing the use of corpora of any size. Our experiments use Freebase, a large semantic database of several thousand relations, to provide distant supervision. For each pair of entities that appears in some Freebase relation, we find all sentences containing those entities in a large unlabeled corpus and extract textual features to train a relation classifier. Our algorithm combines the advantages of supervised IE (combining 400,000 noisy pattern features in a probabilistic classifier) and unsupervised IE (extracting large numbers of relations from large corpora of any domain). Our model is able to extract 10,000 instances of 102 relations at a precision of 67.6%. We also analyze feature performance, showing that syntactic parse features are particularly helpful for relations that are ambiguous or lexically distant in their expression.

2,965 citations
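A schematic of the distant-supervision pipeline in Python; the toy knowledge base and sentences below are illustrative (the first mirrors an example from the paper), and the bag-of-words features plus scikit-learn classifier stand in for the paper's lexical and syntactic features and its multiclass logistic-regression learner.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# 1. Entity pairs with known relations, e.g. drawn from Freebase.
kb = {("Steven Spielberg", "Saving Private Ryan"): "film-director",
      ("Barack Obama", "Honolulu"): "place-of-birth"}

# 2. Any unlabeled sentence mentioning both entities of a pair becomes
#    a (noisy) training example for that pair's relation.
corpus = [
    ("Steven Spielberg's film Saving Private Ryan is loosely based on the brothers' story.",
     ("Steven Spielberg", "Saving Private Ryan")),
    ("Barack Obama was born in Honolulu, Hawaii.",
     ("Barack Obama", "Honolulu")),
]
texts = [sent for sent, pair in corpus if pair in kb]
labels = [kb[pair] for sent, pair in corpus if pair in kb]

# 3. Train a relation classifier on features pooled across sentences.
X = CountVectorizer().fit_transform(texts)
clf = LogisticRegression(max_iter=1000).fit(X, labels)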


Journal ArticleDOI
29 Jan 2009-Nature
TL;DR: An initial analysis of the ∼730-megabase Sorghum bicolor (L.) Moench genome is presented, placing ∼98% of genes in their chromosomal context using whole-genome shotgun sequence validated by genetic, physical and syntenic information.
Abstract: Sorghum, an African grass related to sugar cane and maize, is grown for food, feed, fibre and fuel. We present an initial analysis of the approximately 730-megabase Sorghum bicolor (L.) Moench genome, placing approximately 98% of genes in their chromosomal context using whole-genome shotgun sequence validated by genetic, physical and syntenic information. Genetic recombination is largely confined to about one-third of the sorghum genome with gene order and density similar to those of rice. Retrotransposon accumulation in recombinationally recalcitrant heterochromatin explains the approximately 75% larger genome size of sorghum compared with rice. Although gene and repetitive DNA distributions have been preserved since palaeopolyploidization approximately 70 million years ago, most duplicated gene sets lost one member before the sorghum-rice divergence. Concerted evolution makes one duplicated chromosomal segment appear to be only a few million years old. About 24% of genes are grass-specific and 7% are sorghum-specific. Recent gene and microRNA duplications may contribute to sorghum's drought tolerance.

2,809 citations


Proceedings ArticleDOI
14 Jun 2009
TL;DR: The convolutional deep belief network is presented, a hierarchical generative model which scales to realistic image sizes, is translation-invariant, and supports efficient bottom-up and top-down probabilistic inference.
Abstract: There has been much interest in unsupervised learning of hierarchical generative models such as deep belief networks. Scaling such models to full-sized, high-dimensional images remains a difficult problem. To address this problem, we present the convolutional deep belief network, a hierarchical generative model which scales to realistic image sizes. This model is translation-invariant and supports efficient bottom-up and top-down probabilistic inference. Key to our approach is probabilistic max-pooling, a novel technique which shrinks the representations of higher layers in a probabilistically sound way. Our experiments show that the algorithm learns useful high-level visual features, such as object parts, from unlabeled images of objects and natural scenes. We demonstrate excellent performance on several visual recognition tasks and show that our model can perform hierarchical (bottom-up and top-down) inference over full-sized images.
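The probabilistic max-pooling construction can be sketched in a few lines: within a pooling block at most one detector unit is active, and the pooled unit is off only when all of them are, so the block's units together with an extra "all off" state form a softmax over the bottom-up inputs. A minimal numpy sketch on synthetic inputs:

import numpy as np

def prob_max_pool(block_inputs):
    """block_inputs: bottom-up inputs I(h_k) for one pooling block.
    Returns (P(h_k = 1) for each unit, P(pooled unit = 0))."""
    e = np.concatenate([block_inputs.ravel(), [0.0]])  # extra "off" state
    e = np.exp(e - e.max())                            # numerically stable softmax
    p = e / e.sum()
    return p[:-1], p[-1]

p_units, p_off = prob_max_pool(np.array([[0.3, 1.2], [-0.5, 0.1]]))
assert np.isclose(p_units.sum() + p_off, 1.0)
# The pooled unit is on with probability 1 - p_off.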

Journal ArticleDOI
24 Apr 2009
TL;DR: This information-theoretic survey provides guidelines for the spectral efficiency gains possible through cognitive radios, as well as practical design ideas to mitigate the coexistence challenges in today's crowded spectrum.
Abstract: Cognitive radios hold tremendous promise for increasing spectral efficiency in wireless systems. This paper surveys the fundamental capacity limits and associated transmission techniques for different wireless network design paradigms based on this promising technology. These paradigms are unified by the definition of a cognitive radio as an intelligent wireless communication device that exploits side information about its environment to improve spectrum utilization. This side information typically comprises knowledge about the activity, channels, codebooks, and/or messages of other nodes with which the cognitive node shares the spectrum. Based on the nature of the available side information as well as a priori rules about spectrum usage, cognitive radio systems seek to underlay, overlay, or interweave the cognitive radios' signals with the transmissions of noncognitive nodes. We provide a comprehensive summary of the known capacity characterizations in terms of upper and lower bounds for each of these three approaches. The increase in system degrees of freedom obtained through cognitive radios is also illuminated. This information-theoretic survey provides guidelines for the spectral efficiency gains possible through cognitive radios, as well as practical design ideas to mitigate the coexistence challenges in today's crowded spectrum.

Journal ArticleDOI
04 Jun 2009-Nature
TL;DR: The timing of a sensory input relative to a gamma cycle determined the amplitude and precision of evoked responses, and the results provide the first causal evidence that distinct network activity states can be induced in vivo by cell-type-specific activation.
Abstract: Cortical gamma oscillations (20–80 Hz) predict increases in focused attention, and failure in gamma regulation is a hallmark of neurological and psychiatric disease. Current theory predicts that gamma oscillations are generated by synchronous activity of fast-spiking inhibitory interneurons, with the resulting rhythmic inhibition producing neural ensemble synchrony by generating a narrow window for effective excitation. We causally tested these hypotheses in barrel cortex in vivo by targeting optogenetic manipulation selectively to fast-spiking interneurons. Here we show that light-driven activation of fast-spiking interneurons at varied frequencies (8–200 Hz) selectively amplifies gamma oscillations. In contrast, pyramidal neuron activation amplifies only lower frequency oscillations, a cell-type-specific double dissociation. We found that the timing of a sensory input relative to a gamma cycle determined the amplitude and precision of evoked responses. Our data directly support the fast-spiking-gamma hypothesis and provide the first causal evidence that distinct network activity states can be induced in vivo by cell-type-specific activation.

01 Jan 2009
TL;DR: This work designs a somewhat homomorphic "bootstrappable" encryption scheme that works when the function f is the scheme's own decryption function, and shows how, through recursive self-embedding, bootstrappable encryption gives fully homomorphic encryption.
Abstract: We propose the first fully homomorphic encryption scheme, solving an old open problem. Such a scheme allows one to compute arbitrary functions over encrypted data without the decryption key—i.e., given encryptions E(m1), ..., E(mt) of m1, ..., mt, one can efficiently compute a compact ciphertext that encrypts f(m1, ..., mt) for any efficiently computable function f. Fully homomorphic encryption has numerous applications. For example, it enables encrypted search engine queries—i.e., a search engine can give you a succinct encrypted answer to your (boolean) query without even knowing what your query was. It also enables searching on encrypted data; you can store your encrypted data on a remote server, and later have the server retrieve only files that (when decrypted) satisfy some boolean constraint, even though the server cannot decrypt the files on its own. More broadly, it improves the efficiency of secure multiparty computation. In our solution, we begin by designing a somewhat homomorphic "bootstrappable" encryption scheme that works when the function f is the scheme's own decryption function. We then show how, through recursive self-embedding, bootstrappable encryption gives fully homomorphic encryption.

Journal ArticleDOI
TL;DR: A simple, costless modification to iterative thresholding, inspired by belief propagation in graphical models, is introduced, making the sparsity–undersampling tradeoff of the new algorithms equivalent to that of the corresponding convex optimization procedures.
Abstract: Compressed sensing aims to undersample certain high-dimensional signals yet accurately reconstruct them by exploiting signal characteristics. Accurate reconstruction is possible when the object to be recovered is sufficiently sparse in a known basis. Currently, the best known sparsity–undersampling tradeoff is achieved when reconstructing by convex optimization, which is expensive in important large-scale applications. Fast iterative thresholding algorithms have been intensively studied as alternatives to convex optimization for large-scale problems. Unfortunately known fast algorithms offer substantially worse sparsity–undersampling tradeoffs than convex optimization. We introduce a simple costless modification to iterative thresholding making the sparsity–undersampling tradeoff of the new algorithms equivalent to that of the corresponding convex optimization procedures. The new iterative-thresholding algorithms are inspired by belief propagation in graphical models. Our empirical measurements of the sparsity–undersampling tradeoff for the new algorithms agree with theoretical calculations. We show that a state evolution formalism correctly derives the true sparsity–undersampling tradeoff. There is a surprising agreement between earlier calculations based on random convex polytopes and this apparently very different theoretical formalism.
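A minimal sketch of the modified iterative-thresholding loop in Python: soft thresholding plus a correction term added back into the residual, which is what distinguishes the message-passing-inspired iteration from plain iterative thresholding. The threshold schedule below is a simple heuristic, an assumption for illustration rather than the paper's tuned rule.

import numpy as np

def soft(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def amp(A, y, n_iter=30):
    n, N = A.shape
    delta = n / N
    x, z = np.zeros(N), y.copy()
    for _ in range(n_iter):
        pseudo = x + A.T @ z                  # effective observation
        theta = 2.0 * np.std(z)               # heuristic threshold (assumption)
        x = soft(pseudo, theta)
        onsager = (z / delta) * np.mean(np.abs(x) > 0)  # <eta'> correction term
        z = y - A @ x + onsager
    return x

rng = np.random.default_rng(0)
N, n, k = 200, 80, 10
A = rng.normal(size=(n, N)) / np.sqrt(n)
x0 = np.zeros(N); x0[rng.choice(N, k, replace=False)] = rng.normal(size=k)
y = A @ x0
print(np.linalg.norm(amp(A, y) - x0) / np.linalg.norm(x0))  # relative error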

Journal ArticleDOI
04 Jun 2009-Nature
TL;DR: Optogenetics opens the door to a new kind of informational analysis of brain function, permitting quantitative delineation of the functional significance of individual elements in the emergent operation and function of intact neural circuitry.
Abstract: Synchronized oscillations and inhibitory interneurons have important and interconnected roles within cortical microcircuits. In particular, interneurons defined by the fast-spiking phenotype and expression of the calcium-binding protein parvalbumin have been suggested to be involved in gamma (30-80 Hz) oscillations, which are hypothesized to enhance information processing. However, because parvalbumin interneurons cannot be selectively controlled, definitive tests of their functional significance in gamma oscillations, and quantitative assessment of the impact of parvalbumin interneurons and gamma oscillations on cortical circuits, have been lacking despite potentially enormous significance (for example, abnormalities in parvalbumin interneurons may underlie altered gamma-frequency synchronization and cognition in schizophrenia and autism). Here we use a panel of optogenetic technologies in mice to selectively modulate multiple distinct circuit elements in neocortex, alone or in combination. We find that inhibiting parvalbumin interneurons suppresses gamma oscillations in vivo, whereas driving these interneurons (even by means of non-rhythmic principal cell activity) is sufficient to generate emergent gamma-frequency rhythmicity. Moreover, gamma-frequency modulation of excitatory input in turn was found to enhance signal transmission in neocortex by reducing circuit noise and amplifying circuit signals, including inputs to parvalbumin interneurons. As demonstrated here, optogenetics opens the door to a new kind of informational analysis of brain function, permitting quantitative delineation of the functional significance of individual elements in the emergent operation and function of intact neural circuitry.

Journal ArticleDOI
TL;DR: The aim of this paper is to introduce a few key notions and applications connected to sparsity, targeting newcomers interested in either the mathematical aspects of this area or its applications.
Abstract: A full-rank matrix ${\bf A}\in \mathbb{R}^{n\times m}$ with $n<m$ generates an underdetermined system of linear equations ${\bf Ax} = {\bf b}$ having infinitely many solutions. Suppose we seek the sparsest solution, i.e., the one with the fewest nonzero entries.
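For a concrete instance, the minimum-l1-norm (basis pursuit) solution of such an underdetermined system can be found by linear programming via the split x = u - v with u, v >= 0. The sketch below uses SciPy's linprog on synthetic data and is purely illustrative, not code from the survey.

import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
n, m = 10, 30                          # n < m: infinitely many solutions
A = rng.normal(size=(n, m))
x_true = np.zeros(m); x_true[[3, 17]] = [1.5, -2.0]   # a 2-sparse solution
b = A @ x_true

# minimize ||x||_1 = sum(u) + sum(v)  subject to  A(u - v) = b, u, v >= 0
res = linprog(c=np.ones(2 * m), A_eq=np.hstack([A, -A]), b_eq=b,
              bounds=[(0, None)] * (2 * m))
x_hat = res.x[:m] - res.x[m:]
print(np.linalg.norm(x_hat - x_true))  # near zero: l1 recovers the sparse x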

Journal ArticleDOI
TL;DR: The immense economic and social impact of wounds calls for allocation of a higher level of attention and resources to understand biological mechanisms underlying cutaneous wound complications.
Abstract: In the United States, chronic wounds affect 6.5 million patients. An estimated excess of US$25 billion is spent annually on treatment of chronic wounds and the burden is rapidly growing due to increasing health care costs, an aging population and a sharp rise in the incidence of diabetes and obesity worldwide. The annual wound care products market is projected to reach $15.3 billion by 2010. Chronic wounds are rarely seen in individuals who are otherwise healthy. In fact, chronic wound patients frequently suffer from "highly branded" diseases such as diabetes and obesity. This seems to have overshadowed the significance of wounds per se as a major health problem. For example, NIH's Research Portfolio Online Reporting Tool (RePORT; http://report.nih.gov/), directed at providing access to estimates of funding for various disease conditions, does list several rare diseases but does not list wounds. Forty million inpatient surgical procedures were performed in the United States in 2000, followed closely by 31.5 million outpatient surgeries. The need for post-surgical wound care is sharply on the rise. Emergency wound care in an acute setting has major significance not only in a war setting but also in homeland preparedness against natural disasters as well as against terrorism attacks. An additional burden of wound healing is the problem of skin scarring, a $12 billion annual market. The immense economic and social impact of wounds in our society calls for allocation of a higher level of attention and resources to understand biological mechanisms underlying cutaneous wound complications.

Journal ArticleDOI
09 Apr 2009-Nature
TL;DR: It is shown that normal mammary epithelial stem cells contain lower concentrations of ROS than their more mature progeny cells, and subsets of CSCs in some tumours contain lower ROS levels and enhanced ROS defences compared to their non-tumorigenic progeny, which may contribute to tumour radioresistance.
Abstract: The metabolism of oxygen, although central to life, produces reactive oxygen species (ROS) that have been implicated in processes as diverse as cancer, cardiovascular disease and ageing. It has recently been shown that central nervous system stem cells and haematopoietic stem cells and early progenitors contain lower levels of ROS than their more mature progeny, and that these differences are critical for maintaining stem cell function. We proposed that epithelial tissue stem cells and their cancer stem cell (CSC) counterparts may also share this property. Here we show that normal mammary epithelial stem cells contain lower concentrations of ROS than their more mature progeny cells. Notably, subsets of CSCs in some human and murine breast tumours contain lower ROS levels than corresponding non-tumorigenic cells (NTCs). Consistent with ROS being critical mediators of ionizing-radiation-induced cell killing, CSCs in these tumours develop less DNA damage and are preferentially spared after irradiation compared to NTCs. Lower ROS levels in CSCs are associated with increased expression of free radical scavenging systems. Pharmacological depletion of ROS scavengers in CSCs markedly decreases their clonogenicity and results in radiosensitization. These results indicate that, similar to normal tissue stem cells, subsets of CSCs in some tumours contain lower ROS levels and enhanced ROS defences compared to their non-tumorigenic progeny, which may contribute to tumour radioresistance.

Posted Content
TL;DR: This paper proposes the instructional manipulation check (IMC), a new tool for detecting participants who are not following instructions, and demonstrates how the inclusion of an IMC can increase the statistical power and reliability of a dataset.
Abstract: Participants are not always as diligent in reading and following instructions as experimenters would like them to be. When participants fail to follow instructions, this increases noise and decreases the validity of their data. This paper presents and validates a new tool for detecting participants who are not following instructions – the Instructional manipulation check (IMC). We demonstrate how the inclusion of an IMC can increase statistical power and reliability of a dataset.

Journal ArticleDOI
TL;DR: This paper will discuss how geometry and topology can be applied to make useful contributions to the analysis of various kinds of data, particularly high throughput data from microarray or other sources.
Abstract: An important feature of modern science and engineering is that data of various kinds is being produced at an unprecedented rate. This is so in part because of new experimental methods, and in part because of the increase in the availability of high powered computing technology. It is also clear that the nature of the data we are obtaining is significantly different. For example, it is now often the case that we are given data in the form of very long vectors, where all but a few of the coordinates turn out to be irrelevant to the questions of interest, and further that we don't necessarily know which coordinates are the interesting ones. A related fact is that the data is often very high-dimensional, which severely restricts our ability to visualize it. The data obtained is also often much noisier than in the past and has more missing information (missing data). This is particularly so in the case of biological data, particularly high throughput data from microarray or other sources. Our ability to analyze this data, both in terms of quantity and the nature of the data, is clearly not keeping pace with the data being produced. In this paper, we will discuss how geometry and topology can be applied to make useful contributions to the analysis of various kinds of data. Geometry and topology are very natural tools to apply in this direction, since geometry can be regarded as the study of distance functions, and what one often works with are distance functions on large finite sets of data. The mathematical formalism which has been developed for incorporating geometric and topological techniques deals with point clouds, i.e. finite sets of points equipped with a distance function. It then adapts tools from the various branches of geometry to the study of point clouds. The point clouds are intended to be thought of as finite samples taken from a geometric object, perhaps with noise. Here are some of the key points which come up when applying these geometric methods to data analysis:
• Qualitative information is needed: One important goal of data analysis is to allow the user to obtain knowledge about the data, i.e. to understand how it is organized on a large scale. For example, if we imagine that we are looking at a data set constructed somehow from diabetes patients, it would be important to develop the understanding that there are two types of the disease, namely the juvenile and adult onset forms. Once that is established, one of course wants to develop quantitative methods for distinguishing them, but the first insight about the distinct forms of the disease is key.
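As a small concrete example of these ideas, degree-0 persistent homology (how the connected components of a point cloud merge as the scale grows) reduces to a union-find pass over edges sorted by length, with each merge recording the death of one component. A minimal sketch on synthetic data, not the machinery of the paper:

import numpy as np

def persistence0(points):
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    edges = sorted((d[i, j], i, j) for i in range(n) for j in range(i + 1, n))
    parent = list(range(n))
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]   # path halving
            a = parent[a]
        return a
    deaths = []
    for w, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:                        # two components merge:
            parent[ri] = rj                 # one of them dies at scale w
            deaths.append(w)
    return deaths                           # n-1 finite bars; one bar lives forever

rng = np.random.default_rng(2)
pts = np.vstack([rng.normal(0, 0.1, (5, 2)), rng.normal(5, 0.1, (5, 2))])
print(persistence0(pts))   # one death near 5 reveals the two clusters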


Journal ArticleDOI
TL;DR: The results demonstrate that resting-state functional connectivity reflects structural connectivity and that combining modalities can enrich the understanding of these canonical brain networks.
Abstract: Resting-state functional connectivity magnetic resonance imaging (fcMRI) studies constitute a growing proportion of functional brain imaging publications. This approach detects temporal correlations in spontaneous blood oxygen level-dependent (BOLD) signal oscillations while subjects rest quietly in the scanner. Although distinct resting-state networks related to vision, language, executive processing, and other sensory and cognitive domains have been identified, considerable skepticism remains as to whether resting-state functional connectivity maps reflect neural connectivity or simply track BOLD signal correlations driven by nonneural artifact. Here we combine diffusion tensor imaging (DTI) tractography with resting-state fcMRI to test the hypothesis that resting-state functional connectivity reflects structural connectivity. These 2 modalities were used to investigate connectivity within the default mode network, a set of brain regions—including medial prefrontal cortex (MPFC), medial temporal lobes (MTLs), and posterior cingulate cortex (PCC)/retrosplenial cortex (RSC)—implicated in episodic memory processing. Using seed regions from the functional connectivity maps, the DTI analysis revealed robust structural connections between the MTLs and the retrosplenial cortex whereas tracts from the MPFC contacted the PCC (just rostral to the RSC). The results demonstrate that resting-state functional connectivity reflects structural connectivity and that combining modalities can enrich our understanding of these canonical brain networks.
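A minimal sketch of the seed-based correlation computation on synthetic data; real resting-state analyses add preprocessing (motion regression, temporal filtering, nuisance covariates) that is omitted here.

import numpy as np

rng = np.random.default_rng(4)
T, V = 240, 1000                       # timepoints, voxels
shared = rng.normal(size=T)            # a slow shared fluctuation
bold = rng.normal(size=(T, V))
bold[:, :100] += shared[:, None]       # first 100 voxels form a "network"

seed = bold[:, :100].mean(axis=1)      # mean time course of the seed region
seed_z = (seed - seed.mean()) / seed.std()
bold_z = (bold - bold.mean(axis=0)) / bold.std(axis=0)
fc_map = bold_z.T @ seed_z / T         # Pearson r of each voxel with the seed
print(fc_map[:100].mean(), fc_map[100:].mean())   # high vs. near-zero correlation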

Journal ArticleDOI
TL;DR: Sherpa, presented in this paper in its version 1.1 release, is a general-purpose tool for the simulation of particle collisions at high-energy colliders; it contains a very flexible tree-level matrix-element generator for the calculation of hard scattering processes within the Standard Model and various new physics models.
Abstract: In this paper the current release of the Monte Carlo event generator Sherpa, version 1.1, is presented. Sherpa is a general-purpose tool for the simulation of particle collisions at high-energy colliders. It contains a very flexible tree-level matrix-element generator for the calculation of hard scattering processes within the Standard Model and various new physics models. The emission of additional QCD partons off the initial and final states is described through a parton-shower model. To consistently combine multi-parton matrix elements with the QCD parton cascades the approach of Catani, Krauss, Kuhn and Webber is employed. A simple model of multiple interactions is used to account for underlying events in hadron-hadron collisions. The fragmentation of partons into primary hadrons is described using a phenomenological cluster-hadronisation model. A comprehensive library for simulating tau-lepton and hadron decays is provided. Where available form-factor models and matrix elements are used, allowing for the inclusion of spin correlations; effects of virtual and real QED corrections are included using the approach of Yennie, Frautschi and Suura.

Journal ArticleDOI
TL;DR: In this paper, the authors use a spatially explicit modeling tool, Integrated Valuation of Ecosystem Services and Tradeoffs (InVEST), to predict changes in ecosystem services, biodiversity conservation, and commodity production levels.
Abstract: Nature provides a wide range of benefits to people. There is increasing consensus about the importance of incorporating these “ecosystem services” into resource management decisions, but quantifying the levels and values of these services has proven difficult. We use a spatially explicit modeling tool, Integrated Valuation of Ecosystem Services and Tradeoffs (InVEST), to predict changes in ecosystem services, biodiversity conservation, and commodity production levels. We apply InVEST to stakeholder-defined scenarios of land-use/land-cover change in the Willamette Basin, Oregon. We found that scenarios that received high scores for a variety of ecosystem services also had high scores for biodiversity, suggesting there is little tradeoff between biodiversity conservation and ecosystem services. Scenarios involving more development had higher commodity production values, but lower levels of biodiversity conservation and ecosystem services. However, including payments for carbon sequestration alleviates this tradeoff. Quantifying ecosystem services in a spatially explicit manner, and analyzing tradeoffs between them, can help to make natural resource decisions more effective, efficient, and defensible.

Journal ArticleDOI
31 Jul 2009-Science
TL;DR: Current trends in world fisheries are analyzed from a fisheries and conservation perspective, finding that 63% of assessed fish stocks worldwide still require rebuilding, and even lower exploitation rates are needed to reverse the collapse of vulnerable species.
Abstract: After a long history of overexploitation, increasing efforts to restore marine ecosystems and rebuild fisheries are under way. Here, we analyze current trends from a fisheries and conservation perspective. In 5 of 10 well-studied ecosystems, the average exploitation rate has recently declined and is now at or below the rate predicted to achieve maximum sustainable yield for seven systems. Yet 63% of assessed fish stocks worldwide still require rebuilding, and even lower exploitation rates are needed to reverse the collapse of vulnerable species. Combined fisheries and conservation objectives can be achieved by merging diverse management actions, including catch restrictions, gear modification, and closed areas, depending on local context. Impacts of international fleets and the lack of alternatives to fishing complicate prospects for rebuilding fisheries in many poorer regions, highlighting the need for a global perspective on rebuilding marine resources.
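For context on the exploitation-rate benchmark, the classical Schaefer surplus-production model (standard fisheries background, not a result of this paper) gives the maximum-sustainable-yield reference point in closed form:

dB/dt = rB(1 − B/K) − uB,    equilibrium yield Y(u) = uK(1 − u/r),    u_MSY = r/2,    MSY = rK/4,

where B is stock biomass, r the intrinsic growth rate, K the carrying capacity, and u the exploitation rate; fishing "at MSY" thus corresponds to u ≈ r/2.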

Journal ArticleDOI
08 May 2009-Science
TL;DR: An n-type graphene field-effect transistor that operates at room temperature is fabricated, and X-ray photoelectron spectroscopy confirms the carbon-nitrogen species in graphene thermally annealed in ammonia.
Abstract: Graphene is readily p-doped by adsorbates, but for device applications, it would be useful to access the n-doped material. Individual graphene nanoribbons were covalently functionalized by nitrogen species through high-power electrical joule heating in ammonia gas, leading to n-type electronic doping consistent with theory. The formation of the carbon-nitrogen bond should occur mostly at the edges of graphene where chemical reactivity is high. X-ray photoelectron spectroscopy and nanometer-scale secondary ion mass spectroscopy confirm the carbon-nitrogen species in graphene thermally annealed in ammonia. We fabricated an n-type graphene field-effect transistor that operates at room temperature.

ReportDOI
TL;DR: The authors measure sizable gaps in marginal products of labor and capital across plants within narrowly defined industries in China and India compared with the United States, and calculate manufacturing TFP gains of 30%–50% in China and 40%–60% in India from hypothetically equalizing marginal products to the extent observed in the United States.
Abstract: Resource misallocation can lower aggregate total factor productivity (TFP). We use microdata on manufacturing establishments to quantify the potential extent of misallocation in China and India versus the United States. We measure sizable gaps in marginal products of labor and capital across plants within narrowly defined industries in China and India compared with the United States. When capital and labor are hypothetically reallocated to equalize marginal products to the extent observed in the United States, we calculate manufacturing TFP gains of 30%–50% in China and 40%–60% in India.
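A toy version of the reallocation counterfactual in Python; a CES aggregator with fixed total labor compresses the paper's model to its core logic, and the productivities and elasticity value below are synthetic.

import numpy as np

sigma = 3.0                              # elasticity of substitution (assumed)
A = np.array([1.0, 2.0, 4.0])            # plant productivities (synthetic)

def agg_tfp(L):
    """Aggregate TFP = Y / total labor, where Y is a CES aggregate of
    plant outputs Y_i = A_i * L_i."""
    rho = (sigma - 1) / sigma
    Y = ((A * L) ** rho).sum() ** (1 / rho)
    return Y / L.sum()

L_distorted = np.array([1/3, 1/3, 1/3])  # distortions force equal plant sizes
w = A ** (sigma - 1)
L_efficient = w / w.sum()                # equalizes marginal products
gain = agg_tfp(L_efficient) / agg_tfp(L_distorted) - 1
print(f"TFP gain from reallocation: {100 * gain:.1f}%")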

Journal ArticleDOI
16 Apr 2009-Neuron
TL;DR: It is shown that five different neurodegenerative syndromes cause circumscribed atrophy within five distinct, healthy, human intrinsic functional connectivity networks, and a direct link between intrinsic connectivity and gray matter structure is discovered.

Journal ArticleDOI
21 May 2009-Nature
TL;DR: G-protein-coupled receptors mediate most of the authors' physiological responses to hormones, neurotransmitters and environmental stimulants, and so have great potential as therapeutic targets for a broad spectrum of diseases.
Abstract: G-protein-coupled receptors (GPCRs) mediate most of our physiological responses to hormones, neurotransmitters and environmental stimulants, and so have great potential as therapeutic targets for a broad spectrum of diseases. They are also fascinating molecules from the perspective of membrane-protein structure and biology. Great progress has been made over the past three decades in understanding diverse GPCRs, from pharmacology to functional characterization in vivo. Recent high-resolution structural studies have provided insights into the molecular mechanisms of GPCR activation and constitutive activity.