
Showing papers by "University of Oxford" published in 2002



Journal ArticleDOI
Robert H. Waterston, Kerstin Lindblad-Toh, Ewan Birney, Jane Rogers, +219 more (26 institutions)
05 Dec 2002-Nature
TL;DR: The results of an international collaboration to produce a high-quality draft sequence of the mouse genome are reported and an initial comparative analysis of the mouse and human genomes is presented, describing some of the insights that can be gleaned from the two sequences.
Abstract: The sequence of the mouse genome is a key informational tool for understanding the contents of the human genome and a key experimental tool for biomedical research. Here, we report the results of an international collaboration to produce a high-quality draft sequence of the mouse genome. We also present an initial comparative analysis of the mouse and human genomes, describing some of the insights that can be gleaned from the two sequences. We discuss topics including the analysis of the evolutionary forces shaping the size, structure and sequence of the genomes; the conservation of large-scale synteny across most of the genomes; the much lower extent of sequence orthology covering less than half of the genomes; the proportions of the genomes under selection; the number of protein-coding genes; the expansion of gene families related to reproduction and immunity; the evolution of proteins; and the identification of intraspecies polymorphism.

6,643 citations


Journal ArticleDOI
21 Jun 2002-Science
TL;DR: It is shown that the human genome can be parsed objectively into haplotype blocks: sizable regions over which there is little evidence for historical recombination and within which only a few common haplotypes are observed.
Abstract: Haplotype-based methods offer a powerful approach to disease gene mapping, based on the association between causal mutations and the ancestral haplotypes on which they arose. As part of The SNP Consortium Allele Frequency Projects, we characterized haplotype patterns across 51 autosomal regions (spanning 13 megabases of the human genome) in samples from Africa, Europe, and Asia. We show that the human genome can be parsed objectively into haplotype blocks: sizable regions over which there is little evidence for historical recombination and within which only a few common haplotypes are observed. The boundaries of blocks and specific haplotypes they contain are highly correlated across populations. We demonstrate that such haplotype frameworks provide substantial statistical power in association studies of common genetic variation across each region. Our results provide a foundation for the construction of a haplotype map of the human genome, facilitating comprehensive genetic association studies of human disease.

5,634 citations


Journal ArticleDOI
TL;DR: The multipoint engine for rapid likelihood inference (Merlin) is a computer program that uses sparse inheritance trees for pedigree analysis; it performs rapid haplotyping, genotype error detection and affected pair linkage analyses and can handle more markers than other pedigree analysis packages.
Abstract: Efforts to find disease genes using high-density single-nucleotide polymorphism (SNP) maps will produce data sets that exceed the limitations of current computational tools. Here we describe a new, efficient method for the analysis of dense genetic maps in pedigree data that provides extremely fast solutions to common problems such as allele-sharing analyses and haplotyping. We show that sparse binary trees represent patterns of gene flow in general pedigrees in a parsimonious manner, and derive a family of related algorithms for pedigree traversal. With these trees, exact likelihood calculations can be carried out efficiently for single markers or for multiple linked markers. Using an approximate multipoint calculation that ignores the unlikely possibility of a large number of recombinants further improves speed and provides accurate solutions in dense maps with thousands of markers. Our multipoint engine for rapid likelihood inference (Merlin) is a computer program that uses sparse inheritance trees for pedigree analysis; it performs rapid haplotyping, genotype error detection and affected pair linkage analyses and can handle more markers than other pedigree analysis packages.

3,455 citations


Posted Content
TL;DR: In this paper, the authors studied the impact of trade and foreign direct investment on the productivity of domestic firms in the manufacturing sector in Lithuania and found that a 10 percent increase in the foreign presence in downstream sectors is associated with a 0.38 percent rise in the output of each domestic firm in the supplying industry.
Abstract: Many countries compete against one another in attracting foreign investors by offering ever more generous incentive packages and justifying their actions with the productivity gains that are expected to accrue to domestic producers from knowledge externalities generated by foreign affiliates. Despite this being hugely important to public policy choices, there is little conclusive evidence indicating that domestic firms benefit from foreign presence in their sector. It is possible, though, that researchers have been looking for foreign direct investment (FDI) spillovers in the wrong place. Multinationals have an incentive to prevent information leakage that would enhance the performance of their local competitors in the same industry but at the same time may want to transfer knowledge to their local suppliers in other sectors. Spillovers from FDI may therefore be more likely to take place through backward linkages - that is, contacts between domestic suppliers of intermediate inputs and their multinational clients - and thus would not have been captured by the earlier literature. This paper focuses on the understudied issue of FDI spillovers through backward linkages and goes beyond existing studies by shedding some light on the factors driving this phenomenon. It also improves on the existing literature by addressing several econometric problems that may have biased the results of earlier research. Based on a firm-level panel data set from Lithuania, the estimation results are consistent with the existence of productivity spillovers. They suggest that a 10 percent increase in the foreign presence in downstream sectors is associated with a 0.38 percent rise in the output of each domestic firm in the supplying industry. The data indicate that these spillovers are not restricted geographically, since local firms seem to benefit from the operation of downstream foreign affiliates in their own region as well as in other regions.
The results further show that greater productivity benefits are associated with domestic-market, rather than export-oriented, foreign affiliates. But no difference is detected between the effects of fully-owned foreign firms and those with joint domestic and foreign ownership. The findings of a positive correlation between productivity growth of domestic firms and the increase in multinational presence in downstream sectors should not, however, be interpreted as a call for subsidizing FDI. These results are consistent with the existence of knowledge spillovers from foreign affiliates to their local suppliers, but they may also be a result of increased competition in upstream sectors. While the former case would call for offering FDI incentive packages, it would not be the optimal policy in the latter. Certainly more research is needed to disentangle these two effects. This paper - a product of Trade, Development Research Group - is part of a larger effort in the group to study the contribution of trade and foreign direct investment to technology transfer.

3,013 citations


Journal ArticleDOI
Q. R. Ahmad, R. C. Allen, T. C. Andersen, J. D. Anglin, +202 more (18 institutions)
TL;DR: Observations of neutral-current ν interactions on deuterium in the Sudbury Neutrino Observatory are reported, providing strong evidence for solar ν_e flavor transformation.
Abstract: Observations of neutral-current ν interactions on deuterium in the Sudbury Neutrino Observatory are reported. Using the neutral-current (NC), elastic-scattering, and charged-current reactions and assuming the standard ⁸B shape, the ν_e component of the ⁸B solar flux is φ_e = 1.76 ± 0.05 (stat) ± 0.09 (syst) × 10⁶ cm⁻² s⁻¹ for a kinetic energy threshold of 5 MeV. The non-ν_e component is φ_μτ = 3.41 ± 0.45 (stat) +0.48/−0.45 (syst) × 10⁶ cm⁻² s⁻¹, which is 5.3σ greater than zero and provides strong evidence for solar ν_e flavor transformation. The total flux measured with the NC reaction is φ_NC = 5.09 +0.44/−0.43 (stat) +0.46/−0.43 (syst) × 10⁶ cm⁻² s⁻¹, consistent with solar models.

2,732 citations


Journal ArticleDOI
TL;DR: An analytic framework is described to identify and distinguish between moderators and mediators in RCTs when outcomes are measured dimensionally, and it is recommended that RCTs routinely include and report such analyses.
Abstract: Randomized clinical trials (RCTs) not only are the gold standard for evaluating the efficacy and effectiveness of psychiatric treatments but also can be valuable in revealing moderators and mediators of therapeutic change. Conceptually, moderators identify on whom and under what circumstances treatments have different effects. Mediators identify why and how treatments have effects. We describe an analytic framework to identify and distinguish between moderators and mediators in RCTs when outcomes are measured dimensionally. Rapid progress in identifying the most effective treatments and understanding on whom treatments work and do not work and why treatments work or do not work depends on efforts to identify moderators and mediators of treatment outcome. We recommend that RCTs routinely include and report such analyses.

2,378 citations


Journal ArticleDOI
TL;DR: Simulations show λ to be a statistically powerful index for measuring whether data exhibit phylogenetic dependence and show that it has low rates of Type I error; λ is also robust to incomplete phylogenetic information, which demonstrates that even partial information on phylogeny will improve the accuracy of phylogenetic analyses.
Abstract: The question is often raised whether it is statistically necessary to control for phylogenetic associations in comparative studies. To investigate this question, we explore the use of a measure of phylogenetic correlation, λ, introduced by Pagel (1999), that normally varies between 0 (phylogenetic independence) and 1 (species' traits covary in direct proportion to their shared evolutionary history). Simulations show λ to be a statistically powerful index for measuring whether data exhibit phylogenetic dependence and show that it has low rates of Type I error. Moreover, λ is robust to incomplete phylogenetic information, which demonstrates that even partial information on phylogeny will improve the accuracy of phylogenetic analyses. To assess whether traits generally show phylogenetic associations, we present a quantitative review of 26 published phylogenetic comparative data sets. The data sets include 103 traits and were chosen from the ecological literature in which debate about the need for phylogenetic correction has been most acute. Eighty-eight percent of data sets contained at least one character that displayed significant phylogenetic dependence, and 60% of characters overall (pooled across studies) showed significant evidence of phylogenetic association. In 16% of tests, phylogenetic correlation could be neither supported nor rejected. However, most of these equivocal results were found in small phylogenies and probably reflect a lack of power. We suggest that the parameter λ be routinely estimated when analyzing comparative data, since it can also be used simultaneously to adjust the phylogenetic correction in a manner that is optimal for the data set, and we present an example of how this may be done.

2,333 citations
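The λ transform described in the abstract above amounts to scaling the off-diagonal (shared-history) entries of the expected trait covariance matrix while leaving the diagonal untouched; λ = 0 removes all phylogenetic covariance and λ = 1 leaves the full Brownian-motion covariance. A minimal sketch with a hypothetical three-species covariance matrix (the branch-length values are assumed for illustration):

```python
import numpy as np

def lambda_transform(C, lam):
    """Scale the off-diagonal (shared-history) covariances by lambda,
    leaving the diagonal variances unchanged."""
    C = np.asarray(C, dtype=float)
    out = C * lam
    np.fill_diagonal(out, np.diag(C))
    return out

# Hypothetical covariance implied by shared branch lengths on a 3-species tree.
C = np.array([[1.0, 0.6, 0.2],
              [0.6, 1.0, 0.2],
              [0.2, 0.2, 1.0]])

print(lambda_transform(C, 0.0))  # off-diagonals zeroed: phylogenetic independence
print(lambda_transform(C, 1.0))  # unchanged: full Brownian-motion covariance
```

In practice λ is estimated by maximizing a multivariate-normal likelihood of the trait data under the transformed covariance; the sketch only shows the transform itself.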


Journal ArticleDOI
12 Sep 2002-Nature
TL;DR: It will be substantially harder to quantify the range of possible changes in the hydrologic cycle than in global-mean temperature, both because the observations are less complete and because the physical constraints are weaker.
Abstract: What can we say about changes in the hydrologic cycle on 50-year timescales when we cannot predict rainfall next week? Eventually, perhaps, a great deal: the overall climate response to increasing atmospheric concentrations of greenhouse gases may prove much simpler and more predictable than the chaos of short-term weather. Quantifying the diversity of possible responses is essential for any objective, probability-based climate forecast, and this task will require a new generation of climate modelling experiments, systematically exploring the range of model behaviour that is consistent with observations. It will be substantially harder to quantify the range of possible changes in the hydrologic cycle than in global-mean temperature, both because the observations are less complete and because the physical constraints are weaker.

2,267 citations


Journal ArticleDOI
TL;DR: In this paper, the moments and the asymptotic distribution of the realized volatility error, the difference between realized volatility and the discretized integrated volatility (which is called actual volatility), are derived under the assumption of a rather general stochastic volatility model.
Abstract: Summary. The availability of intraday data on the prices of speculative assets means that we can use quadratic variation-like measures of activity in financial markets, called realized volatility, to study the stochastic properties of returns. Here, under the assumption of a rather general stochastic volatility model, we derive the moments and the asymptotic distribution of the realized volatility error—the difference between realized volatility and the discretized integrated volatility (which we call actual volatility). These properties can be used to allow us to estimate the parameters of stochastic volatility models without recourse to the use of simulation-intensive methods.

2,207 citations
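The quantity the abstract calls realized volatility is the sum of squared intraday returns, which approximates the integrated variance over the interval. A minimal numerical sketch, assuming constant spot volatility (a degenerate special case of the general stochastic volatility model) so that the integrated variance is known in closed form:

```python
import numpy as np

rng = np.random.default_rng(0)

def realized_volatility(returns):
    # Realized volatility: the sum of squared intraday returns.
    return np.sum(returns ** 2)

sigma = 0.2      # annualized spot volatility (assumed constant for the sketch)
T = 1.0 / 252    # one trading day, in years; integrated variance = sigma^2 * T

# As the intraday sampling frequency M grows, the realized volatility error
# (realized minus integrated variance) shrinks.
for M in (10, 100, 10_000):
    r = rng.normal(0.0, sigma * np.sqrt(T / M), size=M)
    rv = realized_volatility(r)
    print(M, rv, sigma**2 * T)
```

Under a genuine stochastic volatility model, sigma would itself follow a random process and the comparison target would be the (discretized) integral of the spot variance path.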


Journal ArticleDOI
TL;DR: In this paper, the construction of a 10' latitude/longitude data set of mean monthly surface climate over global land areas, excluding Antarctica, is described. The data set includes 8 climate elements: precipitation, wet-day frequency, temperature, diurnal temperature range, relative humidity, sunshine duration, ground frost frequency and windspeed.
Abstract: We describe the construction of a 10' latitude/longitude data set of mean monthly surface climate over global land areas, excluding Antarctica. The climatology includes 8 climate elements—precipitation, wet-day frequency, temperature, diurnal temperature range, relative humidity, sunshine duration, ground frost frequency and windspeed—and was interpolated from a data set of station means for the period centred on 1961 to 1990. Precipitation was first defined in terms of the parameters of the Gamma distribution, enabling the calculation of monthly precipitation at any given return period. The data are compared to an earlier data set at 0.5° latitude/longitude resolution and show added value over most regions. The data will have many applications in applied climatology, biogeochemical modelling, hydrology and agricultural meteorology and are available through the International Water Management Institute World Water and Climate Atlas (http://www.iwmi.org) and the Climatic Research Unit (http://www.cru.uea.ac.uk).
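The return-period calculation the abstract mentions reduces to evaluating the fitted Gamma distribution's quantile function at probability 1 − 1/R for a return period of R years. A minimal sketch in which the shape and scale parameters are hypothetical (not from the data set) and a Monte Carlo quantile stands in for the analytic Gamma quantile function:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical fitted Gamma parameters for one grid cell and calendar month.
shape, scale = 2.0, 40.0          # shape (dimensionless), scale in mm

# Monte Carlo stand-in for the Gamma quantile function: the monthly
# precipitation total with return period R years is the (1 - 1/R) quantile.
sample = rng.gamma(shape, scale, size=200_000)
for return_period in (2, 10, 100):
    p = 1.0 - 1.0 / return_period
    print(return_period, round(float(np.quantile(sample, p)), 1))
```

With a library such as SciPy available, `gamma.ppf(p, a=shape, scale=scale)` would give the quantile exactly instead of by simulation.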

Journal ArticleDOI
26 Sep 2002-Nature
TL;DR: It is shown that the Saccharomyces cerevisiae Set1 protein can catalyse di- and tri-methylation of K4 and stimulate the activity of many genes, establishing the concept of methyl status as a determinant for gene activity and extending considerably the complexity of histone modifications.
Abstract: Lysine methylation of histones in vivo occurs in three states: mono-, di- and tri-methyl. Histone H3 has been found to be di-methylated at lysine 4 (K4) in active euchromatic regions but not in silent heterochromatic sites. Here we show that the Saccharomyces cerevisiae Set1 protein can catalyse di- and tri-methylation of K4 and stimulate the activity of many genes. Using antibodies that discriminate between the di- and tri-methylated state of K4 we show that di-methylation occurs at both inactive and active euchromatic genes, whereas tri-methylation is present exclusively at active genes. It is therefore the presence of a tri-methylated K4 that defines an active state of gene expression. These findings establish the concept of methyl status as a determinant for gene activity and thus extend considerably the complexity of histone modifications.

Journal ArticleDOI
TL;DR: Although there was substantial heterogeneity among studies (especially for antisocial personality disorder), only a small proportion was explained by differences in prevalence rates between detainees and sentenced inmates.

Journal ArticleDOI
24 Oct 2002-Nature
TL;DR: A framework for detecting the genetic imprint of recent positive selection by analysing long-range haplotypes in human populations is introduced, and the core haplotypes carrying the proposed protective mutation stand out and show significant evidence of selection.
Abstract: The ability to detect recent natural selection in the human population would have profound implications for the study of human history and for medicine. Here, we introduce a framework for detecting the genetic imprint of recent positive selection by analysing long-range haplotypes in human populations. We first identify haplotypes at a locus of interest (core haplotypes). We then assess the age of each core haplotype by the decay of its association to alleles at various distances from the locus, as measured by extended haplotype homozygosity (EHH). Core haplotypes that have unusually high EHH and a high population frequency indicate the presence of a mutation that rose to prominence in the human gene pool faster than expected under neutral evolution. We applied this approach to investigate selection at two genes carrying common variants implicated in resistance to malaria: G6PD and CD40 ligand. At both loci, the core haplotypes carrying the proposed protective mutation stand out and show significant evidence of selection. More generally, the method could be used to scan the entire genome for evidence of recent positive selection.
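The EHH statistic described above can be computed directly from its definition: the probability that two randomly chosen carriers of the core haplotype are identical over the interval from the core out to a given marker. A minimal sketch on toy haplotypes (the 0/1 strings below are hypothetical, ordered by increasing distance from the core):

```python
from collections import Counter
from math import comb

def ehh(haplotypes, end):
    """Extended haplotype homozygosity over positions [0, end): the
    probability that two randomly drawn carriers of the core haplotype
    are identical over that interval."""
    n = len(haplotypes)
    counts = Counter(h[:end] for h in haplotypes)
    return sum(comb(c, 2) for c in counts.values()) / comb(n, 2)

# Four toy carriers of one core haplotype; alleles at increasing distance.
carriers = ["0001", "0001", "0011", "0111"]
print([round(ehh(carriers, k), 3) for k in (1, 2, 3, 4)])
# EHH starts at 1 at the core and decays with distance as recombination
# breaks up the extended haplotypes.
```

A haplotype under recent positive selection rose in frequency too fast for recombination to erode it, so its EHH stays unusually high at long distances relative to other core haplotypes at the same frequency.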

Journal ArticleDOI
TL;DR: Levels of endogenous sex hormones are strongly associated with breast cancer risk in postmenopausal women, and SHBG was associated with a decrease in breast cancer risk.
Abstract: Background: Reproductive and hormonal factors are involved in the etiology of breast cancer, but there are only a few prospective studies on endogenous sex hormone levels and breast cancer risk. We reanalyzed the worldwide data from prospective studies to examine the relationship between the levels of endogenous sex hormones and breast cancer risk in postmenopausal women. Methods: We analyzed the individual data from nine prospective studies on 663 women who developed breast cancer and 1765 women who did not. None of the women was taking exogenous sex hormones when their blood was collected to determine hormone levels. The relative risks (RRs) for breast cancer associated with increasing hormone concentrations were estimated by conditional logistic regression on case–control sets matched within each study. Linear trends and heterogeneity of RRs were assessed by two-sided tests or chi-square tests, as appropriate. Results: The risk for breast cancer increased statistically significantly with increasing concentrations of all sex hormones examined: total estradiol, free estradiol, non-sex hormone-binding globulin (SHBG)-bound estradiol (which comprises free and albumin-bound estradiol), estrone, estrone sulfate, androstenedione, dehydroepiandrosterone, dehydroepiandrosterone sulfate, and testosterone. The RRs for women with increasing quintiles of estradiol concentrations, relative to the lowest quintile, were 1.42 (95% confidence interval [CI] = 1.04 to 1.95), 1.21 (95% CI = 0.89 to 1.66), 1.80 (95% CI = 1.33 to 2.43), and 2.00 (95% CI = 1.47 to 2.71; Ptrend < .001); the RRs for women with increasing quintiles of free estradiol were 1.38 (95% CI = 0.94 to 2.03), 1.84 (95% CI = 1.24 to 2.74), 2.24 (95% CI = 1.53 to 3.27), and 2.58 (95% CI = 1.76 to 3.78; Ptrend < .001). The magnitudes of risk associated with the other estrogens and with the androgens were similar. SHBG was associated with a decrease in breast cancer risk (Ptrend = .041). The increases in risk associated with increased levels of all sex hormones remained after subjects who were diagnosed with breast cancer within 2 years of blood collection were excluded from the analysis. Conclusion: Levels of endogenous sex hormones are strongly associated with breast cancer risk in postmenopausal women. [J Natl Cancer Inst 2002;94:606–16]

Journal ArticleDOI
TL;DR: The present paper provides a description of the EPIC study, with the aim of simplifying reference to it in future papers reporting substantive or methodological studies carried out in the EPIC cohort.
Abstract: The European Prospective Investigation into Cancer and Nutrition (EPIC) is an ongoing multi-centre prospective cohort study designed to investigate the relationship between nutrition and cancer, with the potential for studying other diseases as well. The study currently includes 519 978 participants (366 521 women and 153 457 men, mostly aged 35-70 years) in 23 centres located in 10 European countries, to be followed for cancer incidence and cause-specific mortality for several decades. At enrollment, which took place between 1992 and 2000 at each of the different centres, information was collected through a non-dietary questionnaire on lifestyle variables and through a dietary questionnaire addressing usual diet. Anthropometric measurements were performed and blood samples taken, from which plasma, serum, red cells and buffy coat fractions were separated and aliquoted for long-term storage, mostly in liquid nitrogen. To calibrate dietary measurements, a standardised, computer-assisted 24-hour dietary recall was implemented at each centre on stratified random samples of the participants, for a total of 36 900 subjects. EPIC represents the largest single resource available today world-wide for prospective investigations on the aetiology of cancers (and other diseases) that can integrate questionnaire data on lifestyle and diet, biomarkers of diet and of endogenous metabolism (e.g. hormones and growth factors) and genetic polymorphisms. First results of case-control studies nested within the cohort are expected early in 2003. The present paper provides a description of the EPIC study, with the aim of simplifying reference to it in future papers reporting substantive or methodological studies carried out in the EPIC cohort.

Journal ArticleDOI
04 Apr 2002-Nature
TL;DR: A mechanism by which progenitor cells of the central nervous system can give rise to non-neural derivatives is defined, and it is proposed that transdetermination consequent to cell fusion could underlie many observations otherwise attributed to an intrinsic plasticity of tissue stem cells.
Abstract: Recent reports have suggested that mammalian stem cells residing in one tissue may have the capacity to produce differentiated cell types for other tissues and organs (1–9). Here we define a mechanism by which progenitor cells of the central nervous system can give rise to non-neural derivatives. Cells taken from mouse brain were co-cultured with pluripotent embryonic stem cells. Following selection for a transgenic marker carried only by the brain cells, undifferentiated stem cells are recovered in which the brain cell genome has undergone epigenetic reprogramming. However, these cells also carry a transgenic marker and chromosomes derived from the embryonic stem cells. Therefore the altered phenotype does not arise by direct conversion of brain to embryonic stem cell but rather through spontaneous generation of hybrid cells. The tetraploid hybrids exhibit full pluripotent character, including multilineage contribution to chimaeras. We propose that transdetermination consequent to cell fusion (10) could underlie many observations otherwise attributed to an intrinsic plasticity of tissue stem cells (9).

Journal ArticleDOI
TL;DR: In this article, the authors give simple mass-matrices leading to tri-bimaximal mixing, and discuss its relation to the Fritzsch-Xing democratic ansatz.

Journal ArticleDOI
TL;DR: A new cognitive model of the maintenance of insomnia is presented, suggesting that individuals who suffer from insomnia tend to be overly worried about their sleep and about the daytime consequences of not getting enough sleep, and this excessive negatively toned cognitive activity triggers both autonomic arousal and emotional distress.

Journal ArticleDOI
27 Dec 2002-Cell
TL;DR: Antigen presenting cells (macrophages and dendritic cells) express pattern recognition molecules that are thought to recognize foreign ligands during early phases of the immune response, suggesting that they play a dual role in normal tissue function and host defense.

Journal ArticleDOI
TL;DR: In this paper, a review of the literature on poor households' use of risk management and risk-coping strategies is presented, which identifies the constraints on their effectiveness and discusses policy options.
Abstract: Poor rural and urban households in developing countries face substantial risks, which they handle with risk-management and risk-coping strategies, including self-insurance through savings and informal insurance mechanisms. Despite these mechanisms, however, vulnerability to poverty linked to risk remains high. This article reviews the literature on poor households’ use of risk-management and risk-coping strategies. It identifies the constraints on their effectiveness and discusses policy options. It shows that risk and lumpiness limit the opportunities to use assets as insurance, that entry constraints limit the usefulness of income diversification, and that informal risk-sharing provides only limited protection, leaving some of the poor exposed to very severe negative shocks. Public safety nets are likely to be beneficial, but their impact is sometimes limited, and they may have negative externalities on households that are not covered. Collecting more information on households’ vulnerability to poverty through both quantitative and qualitative methods can help inform policy.

Journal ArticleDOI
15 Jun 2002-BMJ
TL;DR: In some specialties there are numerous measures of quality of life and little standardisation, and recommendations for the selection of patient assessed measures of health outcome are needed.
Abstract: Objectives: To assess the growth of quality of life measures and to examine the availability of measures across specialties. Design: Systematic searches of electronic databases to identify developmental and evaluative work relating to health outcome measures assessed by patients. Main outcome measures: Types of measures: disease or population specific, dimension specific, generic, individualised, and utility. Specialties in which measures have been developed and evaluated. Results: 3921 reports that described the development and evaluation of patient assessed measures met the inclusion criteria. Of those that were classifiable, 1819 (46%) were disease or population specific, 865 (22%) were generic, 690 (18%) were dimension specific, 409 (10%) were utility, and 62 (1%) were individualised measures. During 1990-9 the number of new reports of development and evaluation rose from 144 to 650 per year. Reports of disease specific measures rose exponentially. Over 30% of evaluations were in cancer, rheumatology and musculoskeletal disorders, and older people's health. The generic measures—SF-36, sickness impact profile, and Nottingham health profile—accounted for 612 (16%) reports. Conclusions: In some specialties there are numerous measures of quality of life and little standardisation. Primary research through the concurrent evaluation of measures and secondary research through structured reviews of measures are prerequisites for standardisation. Recommendations for the selection of patient assessed measures of health outcome are needed.

Journal ArticleDOI
TL;DR: The role of the dystrophin complex and protein family in muscle is discussed and the physiological processes that are affected in Duchenne muscular dystrophy are described.
Abstract: The X-linked muscle-wasting disease Duchenne muscular dystrophy is caused by mutations in the gene encoding dystrophin. There is currently no effective treatment for the disease; however, the complex molecular pathology of this disorder is now being unravelled. Dystrophin is located at the muscle sarcolemma in a membrane-spanning protein complex that connects the cytoskeleton to the basal lamina. Mutations in many components of the dystrophin protein complex cause other forms of autosomally inherited muscular dystrophy, indicating the importance of this complex in normal muscle function. Although the precise function of dystrophin is unknown, the lack of protein causes membrane destabilization and the activation of multiple pathophysiological processes, many of which converge on alterations in intracellular calcium handling. Dystrophin is also the prototype of a family of dystrophin-related proteins, many of which are found in muscle. This family includes utrophin and α-dystrobrevin, which are involved in the maintenance of the neuromuscular junction architecture and in muscle homeostasis. New insights into the pathophysiology of dystrophic muscle, the identification of compensating proteins, and the discovery of new binding partners are paving the way for novel therapeutic strategies to treat this fatal muscle disease. This review discusses the role of the dystrophin complex and protein family in muscle and describes the physiological processes that are affected in Duchenne muscular dystrophy.

Journal ArticleDOI
22 Feb 2002-Cell
TL;DR: The phosphorylated CTD of RNA polymerase II provides key molecular contacts with these mRNA processing reactions throughout transcriptional elongation and termination.

Journal ArticleDOI
TL;DR: Layered double hydroxides (LDHs) have been investigated for many years as host materials for a range of anion exchange intercalation reactions, and have been used extensively as ion-exchange materials, catalysts, sorbents and halogen absorbers.
Abstract: Layered double hydroxides (LDHs) have been investigated for many years as host materials for a range of anion exchange intercalation reactions. In this role they have been used extensively as ion-exchange materials, catalysts, sorbents and halogen absorbers. More recently, there has been a tremendous number of new developments using these materials to store and deliver biologically active materials in vivo. Significant advances have also been made recently on the characterisation of these materials, including structural studies and studies of the mechanism of intercalation using in situ techniques.

Journal ArticleDOI
TL;DR: These observations suggest important roles for NAD(P)H oxidases, endothelial NO synthase uncoupling, and protein kinase C signaling in mediating increased vascular superoxide production and endothelial dysfunction in human diabetes mellitus.
Abstract: Background— Increased superoxide production contributes to reduced vascular nitric oxide (NO) bioactivity and endothelial dysfunction in experimental models of diabetes. We characterized the sources and mechanisms underlying vascular superoxide production in human blood vessels from diabetic patients with coronary artery disease compared with nondiabetic patients. Methods and Results— Vascular superoxide production was quantified in both saphenous veins and internal mammary arteries from 45 diabetic and 45 matched nondiabetic patients undergoing coronary artery bypass surgery. NAD(P)H-dependent oxidases were important sources of vascular superoxide in both diabetic and nondiabetic patients, but both the activity of this enzyme system and the levels of NAD(P)H oxidase protein subunits (p22phox, p67phox, and p47phox) were significantly increased in diabetic veins and arteries. In nondiabetic vessels, endothelial NO synthase produced NO that scavenged superoxide. However, in diabetic vessels, the endothelium...

Journal ArticleDOI
TL;DR: The Blu-Ice and Distributed Control System software packages were developed to provide unified control over the disparate hardware resources available at a macromolecular crystallography beamline to increase the ease of use, the level of automation and the remote accessibility of beamlines.
Abstract: The Blu-Ice and Distributed Control System (DCS) software packages were developed to provide unified control over the disparate hardware resources available at a macromolecular crystallography beamline. Blu-Ice is a user interface that provides scientific experimenters and beamline support staff with intuitive graphical tools for collecting diffraction data and configuring beamlines for experiments. Blu-Ice communicates with the hardware at a beamline via DCS, an instrument-control and data-acquisition package designed to integrate hardware resources in a highly heterogeneous networked computing environment. Together, Blu-Ice and DCS provide a flexible platform for increasing the ease of use, the level of automation and the remote accessibility of beamlines. Blu-Ice and DCS are currently installed on four Stanford Synchrotron Radiation Laboratory crystallographic beamlines and are being implemented at sister light sources.
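The abstract describes a two-tier architecture: a graphical client (Blu-Ice) issuing commands to an instrument-control layer (DCS) that routes them to heterogeneous hardware. The sketch below is purely illustrative and hypothetical (the actual DCS message protocol and device names are not given in the abstract); it shows the unified-dispatch idea with an in-process command router standing in for the networked control server.

```python
# Hypothetical sketch of the client/server split the abstract describes:
# a Blu-Ice-style client sends text commands; a DCS-style server routes
# each command to the handler registered for that device. Device names
# and the command syntax here are invented for illustration.

class ControlServer:
    """Routes text commands like 'gonio move 90.0' to device handlers."""

    def __init__(self):
        self.handlers = {}

    def register(self, device, handler):
        self.handlers[device] = handler

    def dispatch(self, command):
        device, *args = command.split()
        if device not in self.handlers:
            return f"error: unknown device {device}"
        return self.handlers[device](*args)

# A stand-in for one hardware resource (e.g. a goniometer motor).
positions = {"gonio": 0.0}

def gonio(action, *args):
    if action == "move":
        positions["gonio"] = float(args[0])
        return "ok"
    if action == "get":
        return str(positions["gonio"])
    return "error: bad action"

server = ControlServer()
server.register("gonio", gonio)

print(server.dispatch("gonio move 90.0"))  # ok
print(server.dispatch("gonio get"))        # 90.0
```

In the real system the dispatch layer runs over the network, which is what lets multiple clients (local or remote) share unified control of one beamline.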

Journal ArticleDOI
TL;DR: In this paper, it was shown that spiral waves churn the stars and gas in a manner that largely preserves the overall angular momentum distribution and leads to little increase in random motion, and that changes in the angular momenta of individual stars are typically as large as ∼50 per cent over the lifetime of the disc.
Abstract: We show that spiral waves in galaxy discs churn the stars and gas in a manner that largely preserves the overall angular momentum distribution and leads to little increase in random motion. Changes in the angular momenta of individual stars are typically as large as ∼50 per cent over the lifetime of the disc. The changes are concentrated around the corotation radius for an individual spiral wave, but since transient waves with a wide range of pattern speeds develop in rapid succession, the entire disc is affected. This behaviour has profound consequences for the metallicity gradients with radius in both stars and gas, since the interstellar medium is also stirred by the same mechanism. We find observational support for stirring, propose a simple model for the distribution of stars over metallicity and age, and discuss other possible consequences.

Journal ArticleDOI
TL;DR: In this paper, the authors explored determinants of innovation capability in small UK electronics and software firms and found that the importance of R&D, the key role played by the regional science base in nurturing high-tech spin-offs, and proximity to suppliers.

Journal ArticleDOI
TL;DR: In this article, the authors show how quantum information theory extends traditional information theory by exploring the limits imposed by quantum, rather than classical, mechanics on information storage and transmission, and outline optimal bounds, derived from information-theoretic arguments, on the speedup that quantum computers can achieve over their classical counterparts.
Abstract: Quantum mechanics and information theory are among the most important scientific discoveries of the last century. Although these two areas initially developed separately, it has emerged that they are in fact intimately related. In this review the author shows how quantum information theory extends traditional information theory by exploring the limits imposed by quantum, rather than classical, mechanics on information storage and transmission. The derivation of many key results differentiates this review from the usual presentation in that they are shown to follow logically from one crucial property of relative entropy. Within the review, optimal bounds on the enhanced speed that quantum computers can achieve over their classical counterparts are outlined using information-theoretic arguments. In addition, important implications of quantum information theory for thermodynamics and quantum measurement are intermittently discussed. A number of simple examples and derivations, including quantum superdense coding, quantum teleportation, and Deutsch's and Grover's algorithms, are also included.
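Of the examples the review names, superdense coding is the most compact: Alice and Bob share a Bell pair, Alice applies one of four local Pauli operations to her qubit to encode two classical bits, and Bob recovers both bits with a Bell-basis measurement. A minimal NumPy sketch (pure state vectors, no quantum library assumed):

```python
import numpy as np

# Single-qubit identity and Pauli operators
I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

# Shared Bell state |Phi+> = (|00> + |11>)/sqrt(2)
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)

# Alice encodes two classical bits by acting on her qubit alone
encodings = {
    (0, 0): I,       # leaves |Phi+>
    (0, 1): X,       # produces |Psi+>
    (1, 0): Z,       # produces |Phi->
    (1, 1): X @ Z,   # produces |Psi-> (up to global phase)
}

# The four Bell states form Bob's measurement basis
bell_basis = {
    (0, 0): np.array([1, 0, 0, 1]) / np.sqrt(2),
    (0, 1): np.array([0, 1, 1, 0]) / np.sqrt(2),
    (1, 0): np.array([1, 0, 0, -1]) / np.sqrt(2),
    (1, 1): np.array([0, 1, -1, 0]) / np.sqrt(2),
}

def decode(state):
    # Bob's Bell measurement: the outcome with unit overlap
    for bits, b in bell_basis.items():
        if np.isclose(abs(np.vdot(b, state)), 1.0):
            return bits

for bits, U in encodings.items():
    sent = np.kron(U, I) @ bell   # Alice's local operation on her qubit
    assert decode(sent) == bits   # Bob recovers both classical bits
```

One qubit in transit thus carries two classical bits, which is the doubling of classical capacity that the review's information-theoretic bounds formalize.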