
Showing papers from the University of British Columbia published in 2009


Journal ArticleDOI
TL;DR: mothur is used as a case study to trim, screen, and align sequences; calculate distances; assign sequences to operational taxonomic units; and describe the α and β diversity of eight marine samples previously characterized by pyrosequencing of 16S rRNA gene fragments.
Abstract: mothur aims to be a comprehensive software package that allows users to use a single piece of software to analyze community sequence data. It builds upon previous tools to provide a flexible and powerful software package for analyzing sequencing data. As a case study, we used mothur to trim, screen, and align sequences; calculate distances; assign sequences to operational taxonomic units; and describe the alpha and beta diversity of eight marine samples previously characterized by pyrosequencing of 16S rRNA gene fragments. This analysis of more than 222,000 sequences was completed in less than 2 h with a laptop computer.

17,350 citations


Journal ArticleDOI
TL;DR: In this article, the Wilkinson Microwave Anisotropy Probe (WMAP) 5-year data were used to constrain the physics of cosmic inflation via Gaussianity, adiabaticity, the power spectrum of primordial fluctuations, gravitational waves, and spatial curvature.
Abstract: The Wilkinson Microwave Anisotropy Probe (WMAP) 5-year data provide stringent limits on deviations from the minimal, six-parameter Λ cold dark matter model. We report these limits and use them to constrain the physics of cosmic inflation via Gaussianity, adiabaticity, the power spectrum of primordial fluctuations, gravitational waves, and spatial curvature. We also constrain models of dark energy via its equation of state, parity-violating interaction, and neutrino properties, such as mass and the number of species. We detect no convincing deviations from the minimal model. The six parameters and the corresponding 68% uncertainties, derived from the WMAP data combined with the distance measurements from the Type Ia supernovae (SN) and the Baryon Acoustic Oscillations (BAO) in the distribution of galaxies, are: Ω_b h² = 0.02267 +0.00058/−0.00059, Ω_c h² = 0.1131 ± 0.0034, Ω_Λ = 0.726 ± 0.015, n_s = 0.960 ± 0.013, τ = 0.084 ± 0.016, and Δ²_R at k = 0.002 Mpc⁻¹. From these, we derive σ₈ = 0.812 ± 0.026, H₀ = 70.5 ± 1.3 km s⁻¹ Mpc⁻¹, Ω_b = 0.0456 ± 0.0015, Ω_c = 0.228 ± 0.013, Ω_m h² = 0.1358 +0.0037/−0.0036, z_reion = 10.9 ± 1.4, and t₀ = 13.72 ± 0.12 Gyr. With the WMAP data combined with BAO and SN, we find the limit on the tensor-to-scalar ratio of r < 0.22 (95% CL), and that n_s > 1 is disfavored even when gravitational waves are included, which constrains the models of inflation that can produce significant gravitational waves, such as chaotic or power-law inflation models, or a blue spectrum, such as hybrid inflation models. We obtain tight, simultaneous limits on the (constant) equation of state of dark energy and the spatial curvature of the universe: −0.14 < 1 + w < 0.12 (95% CL) and −0.0179 < Ω_k < 0.0081 (95% CL). We provide a set of WMAP distance priors, to test a variety of dark energy models with spatial curvature. We test a time-dependent w with a present value constrained as −0.33 < 1 + w₀ < 0.21 (95% CL).
Temperature and dark matter fluctuations are found to obey the adiabatic relation to within 8.9% and 2.1% for the axion-type and curvaton-type dark matter, respectively. The power spectra of TB and EB correlations constrain a parity-violating interaction, which rotates the polarization angle and converts E to B. The polarization angle could not be rotated more than −5.9° < Δα < 2.4° (95% CL) between the decoupling and the present epoch. We find a limit on the total mass of massive neutrinos of ∑m_ν < 0.67 eV (95% CL), which is free from the uncertainty in the normalization of the large-scale structure data. The number of relativistic degrees of freedom, expressed in units of the effective number of neutrino species, is constrained as N_eff = 4.4 ± 1.5 (68%), consistent with the standard value of 3.04. Finally, quantitative limits on physically motivated primordial non-Gaussianity parameters are −9 < f_NL^local < 111 (95% CL) and −151 < f_NL^equil < 253 (95% CL) for the local and equilateral models, respectively.

5,904 citations


Journal ArticleDOI
TL;DR: In this large, international, randomized trial, intensive glucose control increased mortality among adults in the ICU: a blood glucose target of 180 mg or less per deciliter resulted in lower mortality than did a target of 81 to 108 mg per deciliter.
Abstract: Background: The optimal target range for blood glucose in critically ill patients remains unclear. Methods: Within 24 hours after admission to an intensive care unit (ICU), adults who were expected to require treatment in the ICU on 3 or more consecutive days were randomly assigned to undergo either intensive glucose control, with a target blood glucose range of 81 to 108 mg per deciliter (4.5 to 6.0 mmol per liter), or conventional glucose control, with a target of 180 mg or less per deciliter (10.0 mmol or less per liter). We defined the primary end point as death from any cause within 90 days after randomization. Results: Of the 6104 patients who underwent randomization, 3054 were assigned to undergo intensive control and 3050 to undergo conventional control; data with regard to the primary outcome at day 90 were available for 3010 and 3012 patients, respectively. The two groups had similar characteristics at baseline. A total of 829 patients (27.5%) in the intensive-control group and 751 (24.9%) in the conventional-control group died (odds ratio for intensive control, 1.14; 95% confidence interval, 1.02 to 1.28; P=0.02). The treatment effect did not differ significantly between operative (surgical) patients and nonoperative (medical) patients (odds ratio for death in the intensive-control group, 1.31 and 1.07, respectively; P=0.10). Severe hypoglycemia (blood glucose level, < or = 40 mg per deciliter [2.2 mmol per liter]) was reported in 206 of 3016 patients (6.8%) in the intensive-control group and 15 of 3014 (0.5%) in the conventional-control group (P<0.001). There was no significant difference between the two treatment groups in the median number of days in the ICU (P=0.84) or hospital (P=0.86) or the median number of days of mechanical ventilation (P=0.56) or renal-replacement therapy (P=0.39). 
Conclusions: In this large, international, randomized trial, we found that intensive glucose control increased mortality among adults in the ICU: a blood glucose target of 180 mg or less per deciliter resulted in lower mortality than did a target of 81 to 108 mg per deciliter. (ClinicalTrials.gov number, NCT00220987.)
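The reported odds ratio follows directly from the counts in the abstract. A minimal sketch using a standard Wald confidence interval on the log odds ratio (an illustration of the arithmetic, not necessarily the trial's exact method):

```python
import math

# Deaths / evaluable patients at day 90, from the abstract.
deaths_int, n_int = 829, 3010    # intensive-control group
deaths_conv, n_conv = 751, 3012  # conventional-control group

# Odds ratio for intensive control and Wald 95% CI on the log-odds scale.
a, b = deaths_int, n_int - deaths_int
c, d = deaths_conv, n_conv - deaths_conv
or_ = (a / b) / (c / d)
se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo = math.exp(math.log(or_) - 1.96 * se)
hi = math.exp(math.log(or_) + 1.96 * se)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # 1.14 1.02 1.28
```

This reproduces the abstract's odds ratio of 1.14 (95% CI, 1.02 to 1.28).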

4,241 citations


Posted Content
TL;DR: In this article, the authors provide an introduction and user guide to regression discontinuity (RD) designs for empirical researchers, covering the basic theory behind the design and when RD is likely to be valid or invalid given economic incentives.
Abstract: This paper provides an introduction and "user guide" to Regression Discontinuity (RD) designs for empirical researchers. It presents the basic theory behind the research design, details when RD is likely to be valid or invalid given economic incentives, explains why it is considered a "quasi-experimental" design, and summarizes different ways (with their advantages and disadvantages) of estimating RD designs and the limitations of interpreting these estimates. Concepts are discussed using examples drawn from the growing body of empirical research using RD.
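The estimand behind a sharp RD design can be illustrated on simulated data: fit a local linear regression on each side of the cutoff and take the jump in the fitted values at the cutoff. A minimal sketch (all numbers, the bandwidth, and the true effect of 2.0 are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated sharp RD: treatment switches on at the cutoff x = 0,
# with a true treatment effect of 2.0 on the outcome.
n = 5000
x = rng.uniform(-1, 1, n)
treated = (x >= 0).astype(float)
y = 1.0 + 0.5 * x + 2.0 * treated + rng.normal(0, 0.5, n)

def rd_estimate(x, y, cutoff=0.0, bandwidth=0.25):
    """Local linear RD: fit a line on each side of the cutoff within
    the bandwidth; the effect is the gap between fits at the cutoff."""
    left = (x >= cutoff - bandwidth) & (x < cutoff)
    right = (x >= cutoff) & (x <= cutoff + bandwidth)
    fit_left = np.polyfit(x[left], y[left], 1)
    fit_right = np.polyfit(x[right], y[right], 1)
    return np.polyval(fit_right, cutoff) - np.polyval(fit_left, cutoff)

effect = rd_estimate(x, y)  # should recover roughly 2.0
```

Bandwidth choice and functional form are exactly the estimation issues the paper's "advantages and disadvantages" discussion covers.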

3,455 citations


Journal ArticleDOI
TL;DR: Both overweight and obesity are associated with the incidence of multiple co-morbidities including type II diabetes, cancer and cardiovascular diseases, and maintenance of a healthy weight could be important in the prevention of the large disease burden in the future.
Abstract: Background Overweight and obese persons are at risk of a number of medical conditions which can lead to further morbidity and mortality. The primary objective of this study is to provide an estimate of the incidence of each co-morbidity related to obesity and overweight using a meta-analysis.

3,006 citations


Proceedings Article
01 Jan 2009
TL;DR: A system that answers the question, “What is the fastest approximate nearest-neighbor algorithm for my data?” and a new algorithm that applies priority search on hierarchical k-means trees, which is found to provide the best known performance on many datasets.
Abstract: For many computer vision problems, the most time consuming component consists of nearest neighbor matching in high-dimensional spaces. There are no known exact algorithms for solving these high-dimensional problems that are faster than linear search. Approximate algorithms are known to provide large speedups with only minor loss in accuracy, but many such algorithms have been published with only minimal guidance on selecting an algorithm and its parameters for any given problem. In this paper, we describe a system that answers the question, “What is the fastest approximate nearest-neighbor algorithm for my data?” Our system will take any given dataset and desired degree of precision and use these to automatically determine the best algorithm and parameter values. We also describe a new algorithm that applies priority search on hierarchical k-means trees, which we have found to provide the best known performance on many datasets. After testing a range of alternatives, we have found that multiple randomized k-d trees provide the best performance for other datasets. We are releasing public domain code that implements these approaches. This library provides about one order of magnitude improvement in query time over the best previously available software and provides fully automated parameter selection.
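The flavor of tree-based approximate search can be conveyed with a one-level k-means partition of the dataset: scan only the clusters whose centroids are closest to the query. This is a toy stand-in for the paper's hierarchical k-means trees with priority search; the data, dimensions, and parameters are all illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.standard_normal((2000, 16))

# One-level k-means partition of the dataset (Lloyd's algorithm).
k = 25
centroids = data[rng.choice(len(data), k, replace=False)]
for _ in range(10):
    assign = np.argmin(((data[:, None, :] - centroids) ** 2).sum(-1), axis=1)
    for j in range(k):
        members = data[assign == j]
        if len(members):
            centroids[j] = members.mean(axis=0)

def nearest(q, n_probe):
    """Approximate 1-NN: scan only the n_probe clusters whose
    centroids are closest to the query; n_probe = k is exact."""
    probe = np.argsort(((centroids - q) ** 2).sum(-1))[:n_probe]
    cand = np.flatnonzero(np.isin(assign, probe))
    return cand[np.argmin(((data[cand] - q) ** 2).sum(-1))]

q = rng.standard_normal(16)
exact = int(np.argmin(((data - q) ** 2).sum(-1)))  # linear search
approx = int(nearest(q, n_probe=3))  # scans only a fraction of the data
```

The precision/speed trade-off the paper automates corresponds here to choosing `n_probe`: more probed clusters means higher recall but more distance computations.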

2,934 citations


Journal ArticleDOI
TL;DR: The 2-locus combination of rbcL+matK will provide a universal framework for the routine use of DNA sequence data to identify specimens and contribute toward the discovery of overlooked species of land plants.
Abstract: DNA barcoding involves sequencing a standard region of DNA as a tool for species identification. However, there has been no agreement on which region(s) should be used for barcoding land plants. To provide a community recommendation on a standard plant barcode, we have compared the performance of 7 leading candidate plastid DNA regions (atpF–atpH spacer, matK gene, rbcL gene, rpoB gene, rpoC1 gene, psbK–psbI spacer, and trnH–psbA spacer). Based on assessments of recoverability, sequence quality, and levels of species discrimination, we recommend the 2-locus combination of rbcL+matK as the plant barcode. This core 2-locus barcode will provide a universal framework for the routine use of DNA sequence data to identify specimens and contribute toward the discovery of overlooked species of land plants.

2,255 citations


Journal ArticleDOI
03 Jun 2009-JAMA
TL;DR: A scientific consensus is emerging that the origins of adult disease are often found among developmental and biological disruptions occurring during the early years of life, and that these early experiences can affect adult health in 2 ways: cumulative damage over time or the biological embedding of adversities during sensitive developmental periods.
Abstract: A scientific consensus is emerging that the origins of adult disease are often found among developmental and biological disruptions occurring during the early years of life. These early experiences can affect adult health in 2 ways—either by cumulative damage over time or by the biological embedding of adversities during sensitive developmental periods. In both cases, there can be a lag of many years, even decades, before early adverse experiences are expressed in the form of disease. From both basic research and policy perspectives, confronting the origins of disparities in physical and mental health early in life may produce greater effects than attempting to modify health-related behaviors or improve access to health care in adulthood.

2,065 citations


Journal ArticleDOI
TL;DR: In this paper, the authors use a spatially explicit modeling tool, integrated valuation of ecosystem services and tradeoffs (InVEST), to predict changes in ecosystem services, biodiversity conservation, and commodity production levels.
Abstract: Nature provides a wide range of benefits to people. There is increasing consensus about the importance of incorporating these “ecosystem services” into resource management decisions, but quantifying the levels and values of these services has proven difficult. We use a spatially explicit modeling tool, Integrated Valuation of Ecosystem Services and Tradeoffs (InVEST), to predict changes in ecosystem services, biodiversity conservation, and commodity production levels. We apply InVEST to stakeholder-defined scenarios of land-use/land-cover change in the Willamette Basin, Oregon. We found that scenarios that received high scores for a variety of ecosystem services also had high scores for biodiversity, suggesting there is little tradeoff between biodiversity conservation and ecosystem services. Scenarios involving more development had higher commodity production values, but lower levels of biodiversity conservation and ecosystem services. However, including payments for carbon sequestration alleviates this tradeoff. Quantifying ecosystem services in a spatially explicit manner, and analyzing tradeoffs between them, can help to make natural resource decisions more effective, efficient, and defensible.

2,056 citations


Journal ArticleDOI
31 Jul 2009-Science
TL;DR: Current trends in world fisheries are analyzed from a fisheries and conservation perspective, finding that 63% of assessed fish stocks worldwide still require rebuilding, and even lower exploitation rates are needed to reverse the collapse of vulnerable species.
Abstract: After a long history of overexploitation, increasing efforts to restore marine ecosystems and rebuild fisheries are under way. Here, we analyze current trends from a fisheries and conservation perspective. In 5 of 10 well-studied ecosystems, the average exploitation rate has recently declined and is now at or below the rate predicted to achieve maximum sustainable yield for seven systems. Yet 63% of assessed fish stocks worldwide still require rebuilding, and even lower exploitation rates are needed to reverse the collapse of vulnerable species. Combined fisheries and conservation objectives can be achieved by merging diverse management actions, including catch restrictions, gear modification, and closed areas, depending on local context. Impacts of international fleets and the lack of alternatives to fishing complicate prospects for rebuilding fisheries in many poorer regions, highlighting the need for a global perspective on rebuilding marine resources.

2,009 citations


Journal ArticleDOI
TL;DR: Luminal B and luminal–HER2-positive breast cancers were statistically significantly associated with poor breast cancer recurrence-free and disease-specific survival in all adjuvant systemic treatment categories.
Abstract: Background Gene expression profiling of breast cancer has identified two biologically distinct estrogen receptor (ER)-positive subtypes of breast cancer: luminal A and luminal B. Luminal B tumors have higher proliferation and poorer prognosis than luminal A tumors. In this study, we developed a clinically practical immunohistochemistry assay to distinguish luminal B from luminal A tumors and investigated its ability to separate tumors according to breast cancer recurrence-free and disease-specific survival. Methods Tumors from a cohort of 357 patients with invasive breast carcinomas were subtyped by gene expression profile. Hormone receptor status, HER2 status, and the Ki67 index (percentage of Ki67-positive cancer nuclei) were determined immunohistochemically. Receiver operating characteristic curves were used to determine the Ki67 cut point to distinguish luminal B from luminal A tumors. The prognostic value of the immunohistochemical assignment for breast cancer recurrence-free and disease-specific survival was investigated with an independent tissue microarray series of 4046 breast cancers by use of Kaplan–Meier curves and multivariable Cox regression. Results Gene expression profiling classified 101 (28%) of the 357 tumors as luminal A and 69 (19%) as luminal B. The best Ki67 index cut point to distinguish luminal B from luminal A tumors was 13.25%. In an independent cohort of 4046 patients with breast cancer, 2847 had hormone receptor–positive tumors. When HER2 immunohistochemistry and the Ki67 index were used to subtype these 2847 tumors, we classified 1530 (59%, 95% confidence interval [CI] = 57% to 61%) as luminal A, 846 (33%, 95% CI = 31% to 34%) as luminal B, and 222 (9%, 95% CI = 7% to 10%) as luminal–HER2 positive. Luminal B and luminal–HER2-positive breast cancers were statistically significantly associated with poor breast cancer recurrence-free and disease-specific survival in all adjuvant systemic treatment categories. 
Of particular relevance are women who received tamoxifen as their sole adjuvant systemic therapy, among whom the 10-year breast cancer–specific survival was 79% (95% CI = 76% to 83%) for luminal A, 64% (95% CI = 59% to 70%) for luminal B, and 57% (95% CI = 47% to 69%) for the luminal–HER2 subtype. Conclusion Expression of ER, progesterone receptor, and HER2 proteins and the Ki67 index appear to distinguish luminal B from luminal A tumors.
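The ROC-based cut-point step can be sketched on simulated data by maximizing Youden's J over candidate thresholds. A minimal version (the simulated Ki67 distributions are invented for illustration, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical Ki67 indices (%) for gene-expression-defined subtypes.
lum_a = rng.normal(8, 4, 300).clip(min=0)    # luminal A: lower proliferation
lum_b = rng.normal(25, 10, 200).clip(min=0)  # luminal B: higher proliferation
scores = np.concatenate([lum_a, lum_b])
is_b = np.concatenate([np.zeros(300, bool), np.ones(200, bool)])

# Choose the cut point maximizing Youden's J = sensitivity + specificity - 1.
best_j, best_cut = -1.0, None
for cut in np.unique(scores):
    pred_b = scores >= cut
    sensitivity = pred_b[is_b].mean()
    specificity = (~pred_b)[~is_b].mean()
    j = sensitivity + specificity - 1
    if j > best_j:
        best_j, best_cut = j, cut
```

With well-separated distributions the chosen threshold falls between the two group means, which is the spirit of the study's 13.25% Ki67 cut point.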

Journal ArticleDOI
07 Jan 2009-JAMA
TL;DR: Selenium or vitamin E, alone or in combination at the doses and formulations used, did not prevent prostate cancer in this population of relatively healthy men.
Abstract: Context Secondary analyses of 2 randomized controlled trials and supportive epidemiologic and preclinical data indicated the potential of selenium and vitamin E for preventing prostate cancer. Objective To determine whether selenium, vitamin E, or both could prevent prostate cancer and other diseases with little or no toxicity in relatively healthy men. Design, Setting, and Participants A randomized, placebo-controlled trial (Selenium and Vitamin E Cancer Prevention Trial [SELECT]) of 35 533 men from 427 participating sites in the United States, Canada, and Puerto Rico randomly assigned to 4 groups (selenium, vitamin E, selenium + vitamin E, and placebo) in a double-blind fashion between August 22, 2001, and June 24, 2004. Baseline eligibility included age 50 years or older (African American men) or 55 years or older (all other men), a serum prostate-specific antigen level of 4 ng/mL or less, and a digital rectal examination not suspicious for prostate cancer. Interventions Oral selenium (200 μg/d from L-selenomethionine) and matched vitamin E placebo, vitamin E (400 IU/d of all-rac-α-tocopheryl acetate) and matched selenium placebo, selenium + vitamin E, or placebo + placebo for a planned follow-up of a minimum of 7 years and a maximum of 12 years. Main Outcome Measures Prostate cancer and prespecified secondary outcomes, including lung, colorectal, and overall primary cancer. Results As of October 23, 2008, median overall follow-up was 5.46 years (range, 4.17-7.33 years). Hazard ratios (99% confidence intervals [CIs]) for prostate cancer were 1.13 (99% CI, 0.95-1.35; n = 473) for vitamin E, 1.04 (99% CI, 0.87-1.24; n = 432) for selenium, and 1.05 (99% CI, 0.88-1.25; n = 437) for selenium + vitamin E vs 1.00 (n = 416) for placebo. There were no significant differences (all P>.15) in any other prespecified cancer end points. 
There were statistically nonsignificant increased risks of prostate cancer in the vitamin E group (P = .06) and type 2 diabetes mellitus in the selenium group (relative risk, 1.07; 99% CI, 0.94-1.22; P = .16) but not in the selenium + vitamin E group. Conclusion Selenium or vitamin E, alone or in combination at the doses and formulations used, did not prevent prostate cancer in this population of relatively healthy men. Trial Registration clinicaltrials.gov identifier: NCT00006392. Published online December 9, 2008 (doi:10.1001/jama.2008.864).

Journal ArticleDOI
TL;DR: A literature review has been performed on the measurements of five key concepts in human–robot interaction (HRI): anthropomorphism, animacy, likeability, perceived intelligence, and perceived safety, distilled into five consistent questionnaires using semantic differential scales.
Abstract: This study emphasizes the need for standardized measurement tools for human robot interaction (HRI). If we are to make progress in this field then we must be able to compare the results from different studies. A literature review has been performed on the measurements of five key concepts in HRI: anthropomorphism, animacy, likeability, perceived intelligence, and perceived safety. The results have been distilled into five consistent questionnaires using semantic differential scales. We report reliability and validity indicators based on several empirical studies that used these questionnaires. It is our hope that these questionnaires can be used by robot developers to monitor their progress. Psychologists are invited to further develop the questionnaires by adding new concepts, and to conduct further validations where it appears necessary.
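A standard reliability indicator for such multi-item scales is Cronbach's α, which is straightforward to compute. A minimal sketch on simulated ratings (the respondents, the five-item scale, and the latent construct here are all hypothetical):

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) matrix of scale ratings.
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
# Simulate 200 respondents rating 5 semantic-differential items that
# all tap one latent construct (say, perceived intelligence).
latent = rng.normal(0, 1, (200, 1))
ratings = latent + rng.normal(0, 0.5, (200, 5))
alpha = cronbach_alpha(ratings)  # high, since items share one latent factor
```

Items that share a common latent factor yield a high α; unrelated items drive it toward zero, which is why internal-consistency checks accompany questionnaire validation.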

Journal ArticleDOI
TL;DR: The Wilkinson Microwave Anisotropy Probe (WMAP) is a Medium-Class Explorer (MIDEX) satellite aimed at elucidating cosmology through full-sky observations of the cosmic microwave background (CMB).
Abstract: The Wilkinson Microwave Anisotropy Probe (WMAP) is a Medium-Class Explorer (MIDEX) satellite aimed at elucidating cosmology through full-sky observations of the cosmic microwave background (CMB). The WMAP full-sky maps of the temperature and polarization anisotropy in five frequency bands provide our most accurate view to date of conditions in the early universe. The multi-frequency data facilitate the separation of the CMB signal from foreground emission arising both from our Galaxy and from extragalactic sources. The CMB angular power spectrum derived from these maps exhibits a highly coherent acoustic peak structure which makes it possible to extract a wealth of information about the composition and history of the universe, as well as the processes that seeded the fluctuations. WMAP data have played a key role in establishing ΛCDM as the new standard model of cosmology (Bennett et al. 2003; Spergel et al. 2003; Hinshaw et al. 2007; Spergel et al. 2007): a flat universe dominated by dark energy, supplemented by dark matter and atoms, with density fluctuations seeded by a Gaussian, adiabatic, nearly scale-invariant process. The basic properties of this universe are determined by five numbers: the density of matter, the density of atoms, the age of the universe (or equivalently, the Hubble constant today), the amplitude of the initial fluctuations, and their scale dependence. By accurately measuring the first few peaks in the angular power spectrum, WMAP data have enabled the following accomplishments: Showing that the dark matter must be non-baryonic and interact only weakly with atoms and radiation. The WMAP measurement of the dark matter density puts important constraints on supersymmetric dark matter models and on the properties of other dark matter candidates. With five years of data and a better determination of our beam response, this measurement has been significantly improved. Precise determination of the density of atoms in the universe. 
The agreement between the atomic density derived from WMAP and the density inferred from the deuterium abundance is an important test of the standard big bang model. Determination of the acoustic scale at redshift z = 1090. Similarly, the recent measurement of baryon acoustic oscillations (BAO) in the galaxy power spectrum (Eisenstein et al. 2005) has determined the acoustic scale at redshift z ≈ 0.35. When combined, these standard rulers accurately measure the geometry of the universe and the properties of the dark energy. These data require a nearly flat universe dominated by dark energy consistent with a cosmological constant. Precise determination of the Hubble constant, in conjunction with BAO observations. Even when allowing curvature (Ω₀ ≠ 1) and a free dark energy equation of state (w ≠ −1), the acoustic data determine the Hubble constant to within 3%. The measured value is in excellent agreement with independent results from the Hubble Key Project (Freedman et al. 2001), providing yet another important consistency test for the standard model. Significant constraints on the basic properties of the primordial fluctuations. The anticorrelation seen in the temperature/polarization (TE) correlation spectrum on 4° scales implies that the fluctuations are primarily adiabatic and rules out defect models and isocurvature models as the primary source of fluctuations (Peiris et al. 2003).

Journal ArticleDOI
TL;DR: The effect of shale composition and fabric upon pore structure and CH₄ sorption is investigated for potential shale gas reservoirs in the Western Canadian Sedimentary Basin (WCSB).

Journal ArticleDOI
TL;DR: This model validates components of the MSKCC model with the addition of platelet and neutrophil counts and can be incorporated into patient care and into clinical trials that use VEGF-targeted agents.
Abstract: Purpose There are no robust data on prognostic factors for overall survival (OS) in patients with metastatic renal cell carcinoma (RCC) treated with vascular endothelial growth factor (VEGF) –targeted therapy. Methods Baseline characteristics and outcomes on 645 patients with anti-VEGF therapy–naive metastatic RCC were collected from three US and four Canadian cancer centers. Cox proportional hazards regression, followed by bootstrap validation, was used to identify independent prognostic factors for OS. Results The median OS for the whole cohort was 22 months (95% CI, 20.2 to 26.5 months), and the median follow-up was 24.5 months. Overall, 396, 200, and 49 patients were treated with sunitinib, sorafenib, and bevacizumab, respectively. Four of the five adverse prognostic factors according to the Memorial Sloan-Kettering Cancer Center (MSKCC) were independent predictors of short survival: hemoglobin less than the lower limit of normal (P < .0001), corrected calcium greater than the upper limit of normal (U...

Journal ArticleDOI
TL;DR: In this paper, the authors present cosmological constraints from the Wilkinson Microwave Anisotropy Probe (WMAP) alone, for both the ΛCDM model and a set of possible extensions.
Abstract: The Wilkinson Microwave Anisotropy Probe (WMAP), launched in 2001, has mapped out the cosmic microwave background with unprecedented accuracy over the whole sky. Its observations have led to the establishment of a simple concordance cosmological model for the contents and evolution of the universe, consistent with virtually all other astronomical measurements. The WMAP first-year and three-year data have allowed us to place strong constraints on the parameters describing the ΛCDM model, a flat universe filled with baryons, cold dark matter, neutrinos, and a cosmological constant, with initial fluctuations described by nearly scale-invariant power law fluctuations, as well as placing limits on extensions to this simple model (Spergel et al. 2003, 2007). With all-sky measurements of the polarization anisotropy (Kogut et al. 2003; Page et al. 2007), two orders of magnitude smaller than the intensity fluctuations, WMAP has not only given us an additional picture of the universe as it transitioned from ionized to neutral at redshift z ≈ 1100, but also an observation of the later reionization of the universe by the first stars. In this paper we present cosmological constraints from WMAP alone, for both the ΛCDM model and a set of possible extensions. We also consider the consistency of WMAP constraints with other recent astronomical observations. This is one of seven five-year WMAP papers. Hinshaw et al. (2008) describe the data processing and basic results, Hill et al. (2008) present new beam models and window functions, Gold et al. (2008) describe the emission from Galactic foregrounds, and Wright et al. (2008) the emission from extragalactic point sources. The angular power spectra are described in Nolta et al. (2008), and Komatsu et al. (2008) present and interpret cosmological constraints based on combining WMAP with other data. 
WMAP observations are used to produce full-sky maps of the CMB in five frequency bands centered at 23, 33, 41, 61, and 94 GHz (Hinshaw et al. 2008). With five years of data, we are now able to place better limits on the ΛCDM model, as well as to move beyond it to test the composition of the universe, details of reionization, sub-dominant components, characteristics of inflation, and primordial fluctuations. We have more than doubled the amount of polarized data used for cosmological analysis, allowing a better measure of the large-scale E-mode signal (Nolta et al. 2008). To this end we describe an alternative way to remove Galactic foregrounds from low resolution polarization maps in which Galactic emission is marginalized over, providing a cross-check of our results. With longer integration we also better probe the second and third acoustic peaks in the temperature angular power spectrum, and have many more year-to-year difference maps available for cross-checking systematic effects (Hinshaw et al. 2008).

Journal ArticleDOI
23 Jan 2009-Science
TL;DR: Analysis of longitudinal data from unmanaged old forests in the western United States showed that background (noncatastrophic) mortality rates have increased rapidly in recent decades, with doubling periods ranging from 17 to 29 years among regions.
Abstract: Persistent changes in tree mortality rates can alter forest structure, composition, and ecosystem services such as carbon sequestration. Our analyses of longitudinal data from unmanaged old forests in the western United States showed that background (noncatastrophic) mortality rates have increased rapidly in recent decades, with doubling periods ranging from 17 to 29 years among regions. Increases were also pervasive across elevations, tree sizes, dominant genera, and past fire histories. Forest density and basal area declined slightly, which suggests that increasing mortality was not caused by endogenous increases in competition. Because mortality increased in small trees, the overall increase in mortality rates cannot be attributed solely to aging of large trees. Regional warming and consequent increases in water deficits are likely contributors to the increases in tree mortality rates.
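A doubling period maps directly to an annual rate of increase: for a mortality rate growing as m(t) = m₀·exp(k·t), the doubling period is T = ln 2 / k. A quick check of what the reported periods imply:

```python
import math

# Doubling periods (years) reported for the regions in the abstract.
for T in (17, 29):
    k = math.log(2) / T  # implied exponential growth rate per year
    print(T, round(100 * k, 1))  # annual increase in %: 4.1 and 2.4
```

So the reported 17- to 29-year doubling periods correspond to background mortality rates rising by roughly 2 to 4 percent per year.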

Proceedings ArticleDOI
26 Apr 2009
TL;DR: In this paper, the performance of non-graphics applications written in NVIDIA's CUDA programming model is evaluated on a microarchitecture performance simulator that runs NVIDIA's parallel thread execution (PTX) virtual instruction set.
Abstract: Modern graphics processing units (GPUs) provide sufficiently flexible programming models that understanding their performance can provide insight in designing tomorrow's manycore processors, whether those are GPUs or otherwise. The combination of multiple, multithreaded, SIMD cores makes studying these GPUs useful in understanding tradeoffs among memory, data, and thread level parallelism. While modern GPUs offer orders of magnitude more raw computing power than contemporary CPUs, many important applications, even those with abundant data level parallelism, do not achieve peak performance. This paper characterizes several non-graphics applications written in NVIDIA's CUDA programming model by running them on a novel detailed microarchitecture performance simulator that runs NVIDIA's parallel thread execution (PTX) virtual instruction set. For this study, we selected twelve non-trivial CUDA applications demonstrating varying levels of performance improvement on GPU hardware (versus a CPU-only sequential version of the application). We study the performance of these applications on our GPU performance simulator with configurations comparable to contemporary high-end graphics cards. We characterize the performance impact of several microarchitecture design choices including choice of interconnect topology, use of caches, design of memory controller, parallel workload distribution mechanisms, and memory request coalescing hardware. Two observations we make are (1) that for the applications we study, performance is more sensitive to interconnect bisection bandwidth rather than latency, and (2) that, for some applications, running fewer threads concurrently than on-chip resources might otherwise allow can improve performance by reducing contention in the memory system.

Journal ArticleDOI
TL;DR: Evidence from animal models suggests that a time-limited window of neuroplasticity opens following a stroke, during which the greatest gains in recovery occur, and how to optimally engage and modify surviving neuronal networks is studied.
Abstract: Reductions in blood flow to the brain of sufficient duration and extent lead to stroke, which results in damage to neuronal networks and the impairment of sensation, movement or cognition. Evidence from animal models suggests that a time-limited window of neuroplasticity opens following a stroke, during which the greatest gains in recovery occur. Plasticity mechanisms include activity-dependent rewiring and synapse strengthening. The challenge for improving stroke recovery is to understand how to optimally engage and modify surviving neuronal networks, to provide new response strategies that compensate for tissue lost to injury.

Journal ArticleDOI
TL;DR: An fMRI study that used experience sampling to provide an online measure of mind wandering during a concurrent task revealed a number of crucial aspects of the neural recruitment associated with mind wandering, highlighting the value of combining subjective self-reports with online measures of brain function for advancing the understanding of the neurophenomenology of subjective experience.
Abstract: Although mind wandering occupies a large proportion of our waking life, its neural basis and relation to ongoing behavior remain controversial. We report an fMRI study that used experience sampling to provide an online measure of mind wandering during a concurrent task. Analyses focused on the interval of time immediately preceding experience sampling probes demonstrate activation of default network regions during mind wandering, a finding consistent with theoretical accounts of default network functions. Activation in medial prefrontal default network regions was observed both in association with subjective self-reports of mind wandering and an independent behavioral measure (performance errors on the concurrent task). In addition to default network activation, mind wandering was associated with executive network recruitment, a finding predicted by behavioral theories of off-task thought and its relation to executive resources. Finally, neural recruitment in both default and executive network regions was strongest when subjects were unaware of their own mind wandering, suggesting that mind wandering is most pronounced when it lacks meta-awareness. The observed parallel recruitment of executive and default network regions—two brain systems that so far have been assumed to work in opposition—suggests that mind wandering may evoke a unique mental state that may allow otherwise opposing networks to work in cooperation. The ability of this study to reveal a number of crucial aspects of the neural recruitment associated with mind wandering underscores the value of combining subjective self-reports with online measures of brain function for advancing our understanding of the neurophenomenology of subjective experience.

Journal ArticleDOI
TL;DR: In this article, the authors investigated the global patterns of such impacts by projecting the distributional ranges of a sample of 1066 exploited marine fish and invertebrates for 2050 using a newly developed dynamic bioclimate envelope model.
Abstract: Climate change can impact the pattern of marine biodiversity through changes in species’ distributions. However, global studies on climate change impacts on ocean biodiversity have not been performed so far. Our paper aims to investigate the global patterns of such impacts by projecting the distributional ranges of a sample of 1066 exploited marine fish and invertebrates for 2050 using a newly developed dynamic bioclimate envelope model. Our projections show that climate change may lead to numerous local extinctions in the sub-polar regions, the tropics and semi-enclosed seas. Simultaneously, species invasion is projected to be most intense in the Arctic and the Southern Ocean. Together, they result in dramatic species turnovers of over 60% of the present biodiversity, implying ecological disturbances that potentially disrupt ecosystem services. Our projections can be viewed as a set of hypotheses for future analytical and empirical studies.

Journal ArticleDOI
06 Feb 2009-Science
TL;DR: Tests of parallel evolution of reproductive isolation, trait-based assortative mating, and reproductive isolation by active selection have demonstrated that ecological speciation is a common means by which new species arise.
Abstract: Natural selection commonly drives the origin of species, as Darwin initially claimed. Mechanisms of speciation by selection fall into two broad categories: ecological and mutation-order. Under ecological speciation, divergence is driven by divergent natural selection between environments, whereas under mutation-order speciation, divergence occurs when different mutations arise and are fixed in separate populations adapting to similar selection pressures. Tests of parallel evolution of reproductive isolation, trait-based assortative mating, and reproductive isolation by active selection have demonstrated that ecological speciation is a common means by which new species arise. Evidence for mutation-order speciation by natural selection is more limited and has been best documented by instances of reproductive isolation resulting from intragenomic conflict. However, we still have not identified all aspects of selection, and identifying the underlying genes for reproductive isolation remains challenging.

Journal ArticleDOI
29 Jan 2009-Nature
TL;DR: In this paper, the authors reported frequent mutations in the heterotrimeric G protein alpha-subunit, GNAQ, in blue naevi (83%), and ocular melanoma of the uvea (46%).
Abstract: BRAF and NRAS are common targets for somatic mutations in benign and malignant neoplasms that arise from melanocytes situated in epithelial structures, and lead to constitutive activation of the mitogen-activated protein (MAP) kinase pathway. However, BRAF and NRAS mutations are absent in a number of other melanocytic neoplasms in which the equivalent oncogenic events are currently unknown. Here we report frequent somatic mutations in the heterotrimeric G protein alpha-subunit, GNAQ, in blue naevi (83%) and ocular melanoma of the uvea (46%). The mutations occur exclusively in codon 209 in the Ras-like domain and result in constitutive activation, turning GNAQ into a dominant acting oncogene. Our results demonstrate an alternative route to MAP kinase activation in melanocytic neoplasia, providing new opportunities for therapeutic intervention.

Journal ArticleDOI
01 May 2009
TL;DR: In this paper, the authors proposed a new regression method to evaluate the impact of changes in the distribution of the explanatory variables on quantiles of the unconditional (marginal) distribution of an outcome variable.
Abstract: We propose a new regression method to evaluate the impact of changes in the distribution of the explanatory variables on quantiles of the unconditional (marginal) distribution of an outcome variable. The proposed method consists of running a regression of the (recentered) influence function (RIF) of the unconditional quantile on the explanatory variables. The influence function, a widely used tool in robust estimation, is easily computed for quantiles, as well as for other distributional statistics. Our approach, thus, can be readily generalized to other distributional statistics.
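The method described above (Firpo, Fortin, and Lemieux's RIF regression) is concrete enough to sketch. For the τ-th quantile, the recentered influence function is RIF(y; q_τ) = q_τ + (τ − 1{y ≤ q_τ}) / f(q_τ), where f is the density of the outcome; regressing this on the covariates by OLS gives the unconditional quantile partial effects. A minimal numpy sketch, assuming a Gaussian kernel density estimate with Silverman's rule-of-thumb bandwidth (the authors' implementation choices may differ):

```python
import numpy as np

def rif_quantile(y, tau, bandwidth=None):
    """Recentered influence function of the tau-th unconditional quantile:
    RIF(y; q_tau) = q_tau + (tau - 1{y <= q_tau}) / f(q_tau),
    with f(q_tau) estimated by a Gaussian kernel density."""
    y = np.asarray(y, dtype=float)
    q = np.quantile(y, tau)
    if bandwidth is None:
        # Silverman's rule-of-thumb bandwidth (an assumption, not from the paper)
        bandwidth = 1.06 * y.std() * len(y) ** (-1 / 5)
    # Gaussian KDE evaluated at the quantile q
    f_q = np.mean(np.exp(-0.5 * ((y - q) / bandwidth) ** 2)) / (
        bandwidth * np.sqrt(2 * np.pi)
    )
    return q + (tau - (y <= q)) / f_q

def rif_regression(y, X, tau):
    """OLS of the RIF on covariates. The slope coefficients approximate the
    effect of a small location shift in each covariate's distribution on the
    tau-th quantile of the unconditional distribution of y."""
    rif = rif_quantile(y, tau)
    Z = np.column_stack([np.ones(len(rif)), X])  # add intercept
    beta, *_ = np.linalg.lstsq(Z, rif, rcond=None)
    return beta
```

In a pure location-shift model such as y = 1 + 2x + ε, the unconditional quantile effect of x equals 2 at every quantile, so the median RIF-regression slope should recover roughly 2 on simulated data.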

Journal ArticleDOI
TL;DR: A review of the current state of the art in the research field of cold and ultracold molecules, based on recent experimental and theoretical work and concluding with a summary of anticipated future directions and open questions in this rapidly expanding field.
Abstract: This paper presents a review of the current state of the art in the research field of cold and ultracold molecules. It serves as an introduction to the focus issue of New Journal of Physics on Cold and Ultracold Molecules and describes new prospects for fundamental research and technological development. Cold and ultracold molecules may revolutionize physical chemistry and few-body physics, provide techniques for probing new states of quantum matter, allow for precision measurements of both fundamental and applied interest, and enable quantum simulations of condensed-matter phenomena. Ultracold molecules offer promising applications such as new platforms for quantum computing, precise control of molecular dynamics, nanolithography and Bose-enhanced chemistry. The discussion is based on recent experimental and theoretical work and concludes with a summary of anticipated future directions and open questions in this rapidly expanding research field.

Journal ArticleDOI
24 Apr 2009-Science
TL;DR: To understand the biology and evolution of ruminants, the cattle genome was sequenced to about sevenfold coverage and provides a resource for understanding mammalian evolution and accelerating livestock genetic improvement for milk and meat production.
Abstract: To understand the biology and evolution of ruminants, the cattle genome was sequenced to about sevenfold coverage. The cattle genome contains a minimum of 22,000 genes, with a core set of 14,345 orthologs shared among seven mammalian species of which 1217 are absent or undetected in noneutherian (marsupial or monotreme) genomes. Cattle-specific evolutionary breakpoint regions in chromosomes have a higher density of segmental duplications, enrichment of repetitive elements, and species-specific variations in genes associated with lactation and immune responsiveness. Genes involved in metabolism are generally highly conserved, although five metabolic genes are deleted or extensively diverged from their human orthologs. The cattle genome sequence thus provides a resource for understanding mammalian evolution and accelerating livestock genetic improvement for milk and meat production.

Journal ArticleDOI
TL;DR: It is concluded that divergent selection makes diverse contributions to heterogeneous genomic divergence, and the number, size, and distribution of genomic regions affected by selection varied substantially among studies, leading us to discuss the potential role of Divergent selection in the growth of regions of differentiation (i.e. genomic islands of divergence), a topic in need of future investigation.
Abstract: Levels of genetic differentiation between populations can be highly variable across the genome, with divergent selection contributing to such heterogeneous genomic divergence. For example, loci under divergent selection and those tightly physically linked to them may exhibit stronger differentiation than neutral regions with weak or no linkage to such loci. Divergent selection can also increase genome-wide neutral differentiation by reducing gene flow (e.g. by causing ecological speciation), thus promoting divergence via the stochastic effects of genetic drift. These consequences of divergent selection are being reported in recently accumulating studies that identify: (i) ‘outlier loci’ with higher levels of divergence than expected under neutrality, and (ii) a positive association between the degree of adaptive phenotypic divergence and levels of molecular genetic differentiation across population pairs [‘isolation by adaptation’ (IBA)]. The latter pattern arises because as adaptive divergence increases, gene flow is reduced (thereby promoting drift) and genetic hitchhiking increased. Here, we review and integrate these previously disconnected concepts and literatures. We find that studies generally report 5–10% of loci to be outliers. These selected regions were often dispersed across the genome, commonly exhibited replicated divergence across different population pairs, and could sometimes be associated with specific ecological variables. IBA was not infrequently observed, even at neutral loci putatively unlinked to those under divergent selection. Overall, we conclude that divergent selection makes diverse contributions to heterogeneous genomic divergence. Nonetheless, the number, size, and distribution of genomic regions affected by selection varied substantially among studies, leading us to discuss the potential role of divergent selection in the growth of regions of differentiation (i.e. genomic islands of divergence), a topic in need of future investigation.
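The "outlier loci" idea in (i) can be sketched numerically: compute a per-locus differentiation statistic such as Wright's F_ST from allele frequencies in two populations, then flag loci in the upper tail. The toy scan below uses the empirical 95th percentile as the cutoff purely for illustration; the studies reviewed compare against a neutral expectation derived from coalescent simulations (e.g. FDIST or BayeScan), not the empirical quantile.

```python
import numpy as np

def fst_per_locus(p1, p2):
    """Wright's F_ST per biallelic locus, (H_T - H_S) / H_T, from allele
    frequencies p1 and p2 in two equal-sized populations."""
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    p_bar = (p1 + p2) / 2
    h_t = 2 * p_bar * (1 - p_bar)                 # expected heterozygosity, pooled
    h_s = p1 * (1 - p1) + p2 * (1 - p2)           # mean within-population heterozygosity
    with np.errstate(invalid="ignore", divide="ignore"):
        fst = np.where(h_t > 0, (h_t - h_s) / h_t, 0.0)  # fixed loci -> F_ST = 0
    return fst

def flag_outliers(fst, quantile=0.95):
    """Crude empirical scan: flag loci in the upper tail of the observed
    F_ST distribution (a placeholder for a simulation-based neutral null)."""
    fst = np.asarray(fst, float)
    return fst > np.quantile(fst, quantile)
```

Identical allele frequencies give F_ST = 0, fixation for alternative alleles gives F_ST = 1, and intermediate divergence (e.g. 0.1 vs 0.9) falls in between, so strongly selected loci stand out against a background of weakly differentiated neutral loci.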