
Showing papers by "University of Oxford" published in 2001


Journal ArticleDOI
Eric S. Lander, Lauren Linton, Bruce W. Birren, Chad Nusbaum, and 245 more (29 institutions)
15 Feb 2001-Nature
TL;DR: The results of an international collaboration to produce and make freely available a draft sequence of the human genome are reported and an initial analysis is presented, describing some of the insights that can be gleaned from the sequence.
Abstract: The human genome holds an extraordinary trove of information about human development, physiology, medicine and evolution. Here we report the results of an international collaboration to produce and make freely available a draft sequence of the human genome. We also present an initial analysis of the data, describing some of the insights that can be gleaned from the sequence.

22,269 citations


Journal ArticleDOI
TL;DR: A new statistical method is presented, applicable to genotype data at linked loci from a population sample, that improves substantially on current algorithms and performs well in absolute terms, suggesting that reconstructing haplotypes experimentally or by genotyping additional family members may be an inefficient use of resources.
Abstract: Current routine genotyping methods typically do not provide haplotype information, which is essential for many analyses of fine-scale molecular-genetics data. Haplotypes can be obtained, at considerable cost, experimentally or (partially) through genotyping of additional family members. Alternatively, a statistical method can be used to infer phase and to reconstruct haplotypes. We present a new statistical method, applicable to genotype data at linked loci from a population sample, that improves substantially on current algorithms; often, error rates are reduced by >50%, relative to its nearest competitor. Furthermore, our algorithm performs well in absolute terms, suggesting that reconstructing haplotypes experimentally or by genotyping additional family members may be an inefficient use of resources.

7,482 citations
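The paper's Bayesian method is not reproduced here, but the classic baseline it improves on — EM estimation of haplotype frequencies from unphased two-locus genotypes, in the spirit of the Excoffier–Slatkin algorithm — can be sketched in a few lines. The encoding and all names below are illustrative, not the paper's:

```python
def em_haplotype_freqs(genotypes, n_iter=100):
    """EM estimation of two-locus haplotype frequencies from
    unphased genotypes (Excoffier-Slatkin-style baseline).

    genotypes: list of (g1, g2), each the count (0, 1 or 2) of
    allele 'A' at locus 1 / allele 'B' at locus 2.
    Haplotype indices: 0=AB, 1=Ab, 2=aB, 3=ab."""
    expand = {2: (1, 1), 1: (1, 0), 0: (0, 0)}
    p = [0.25] * 4
    for _ in range(n_iter):
        counts = [0.0] * 4
        for g1, g2 in genotypes:
            if g1 == 1 and g2 == 1:
                # double heterozygote: phase is ambiguous, so split the
                # expected counts between AB/ab ("cis") and Ab/aB ("trans")
                w_cis, w_trans = p[0] * p[3], p[1] * p[2]
                tot = (w_cis + w_trans) or 1.0
                counts[0] += w_cis / tot
                counts[3] += w_cis / tot
                counts[1] += w_trans / tot
                counts[2] += w_trans / tot
            else:
                # at most one locus is heterozygous: phase is unambiguous
                for a, b in zip(expand[g1], expand[g2]):
                    counts[(0 if a else 2) + (0 if b else 1)] += 1
        p = [c / (2 * len(genotypes)) for c in counts]
    return p
```

With complete LD in the sample (only AB and ab haplotypes present), the double heterozygotes are resolved almost entirely to the cis phase within a few iterations.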


Journal ArticleDOI
TL;DR: The authors propose a novel hidden Markov random field (HMRF) model, which is a stochastic process generated by an MRF whose state sequence cannot be observed directly but which can be indirectly estimated through observations.
Abstract: The finite mixture (FM) model is the most commonly used model for statistical segmentation of brain magnetic resonance (MR) images because of its simple mathematical form and the piecewise constant nature of ideal brain MR images. However, being a histogram-based model, the FM has an intrinsic limitation: no spatial information is taken into account. This causes the FM model to work only on well-defined images with low levels of noise; unfortunately, this is often not the case due to artifacts such as partial volume effect and bias field distortion. Under these conditions, FM model-based methods produce unreliable results. Here, the authors propose a novel hidden Markov random field (HMRF) model, which is a stochastic process generated by an MRF whose state sequence cannot be observed directly but can be indirectly estimated through observations. Mathematically, it can be shown that the FM model is a degenerate version of the HMRF model. The advantage of the HMRF model derives from the way in which the spatial information is encoded through the mutual influences of neighboring sites. Although MRF modeling has been employed in MR image segmentation by other researchers, most reported methods are limited to using MRF as a general prior in an FM model-based approach. To fit the HMRF model, an EM algorithm is used. The authors show that by incorporating both the HMRF model and the EM algorithm into an HMRF-EM framework, an accurate and robust segmentation can be achieved. More importantly, the HMRF-EM framework can easily be combined with other techniques. As an example, the authors show how the bias field correction algorithm of Guillemaud and Brady (1997) can be incorporated into this framework to achieve a three-dimensional fully automated approach for brain MR image segmentation.

6,335 citations
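The interplay between the Gaussian data term and the spatial prior can be illustrated with a deliberately tiny 1-D stand-in: Potts neighbour penalties, ICM label updates, and re-estimation of class means. This is not the paper's framework (no bias field, fixed variance, MAP/ICM in place of the expectation step); every constant below is an assumption:

```python
def hmrf_segment(signal, n_classes=2, beta=1.0, n_iter=20):
    """Toy 1-D hidden-MRF segmentation: ICM label updates under a
    Gaussian data term plus a Potts neighbour penalty, alternating
    with re-estimation of the class means. Variance is held fixed."""
    lo, hi = min(signal), max(signal)
    mus = [lo + (k + 0.5) * (hi - lo) / n_classes for k in range(n_classes)]
    sigma = ((hi - lo) / (4 * n_classes)) or 1.0
    # initialise labels by nearest class mean
    labels = [min(range(n_classes), key=lambda k: abs(x - mus[k])) for x in signal]
    for _ in range(n_iter):
        for i, x in enumerate(signal):
            def energy(k):
                e = (x - mus[k]) ** 2 / (2 * sigma ** 2)   # data term
                if i > 0 and labels[i - 1] != k:
                    e += beta                               # left neighbour disagrees
                if i < len(signal) - 1 and labels[i + 1] != k:
                    e += beta                               # right neighbour disagrees
                return e
            labels[i] = min(range(n_classes), key=energy)
        for k in range(n_classes):                          # re-estimate class means
            members = [x for x, l in zip(signal, labels) if l == k]
            if members:
                mus[k] = sum(members) / len(members)
    return labels
```

With a sufficiently strong neighbour penalty, isolated noisy samples are absorbed into the surrounding segment — exactly the spatial robustness the histogram-only FM model lacks.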


Journal ArticleDOI
20 Apr 2001-Science
TL;DR: It is shown that the interaction between human pVHL and a specific domain of the HIF-1α subunit is regulated through hydroxylation of a proline residue by an enzyme the authors have termed HIF-α prolyl-hydroxylase (HIF-PH).
Abstract: Hypoxia-inducible factor (HIF) is a transcriptional complex that plays a central role in the regulation of gene expression by oxygen. In oxygenated and iron-replete cells, HIF-α subunits are rapidly destroyed by a mechanism that involves ubiquitylation by the von Hippel-Lindau tumor suppressor (pVHL) E3 ligase complex. This process is suppressed by hypoxia and iron chelation, allowing transcriptional activation. Here we show that the interaction between human pVHL and a specific domain of the HIF-1α subunit is regulated through hydroxylation of a proline residue (HIF-1α P564) by an enzyme we have termed HIF-α prolyl-hydroxylase (HIF-PH). An absolute requirement for dioxygen as a cosubstrate and iron as a cofactor suggests that HIF-PH functions directly as a cellular oxygen sensor.

5,186 citations


MonographDOI
TL;DR: In this book, Flyvbjerg argues that the strength of social science is in its rich, reflexive analysis of values and power, essential to the social and economic development of any society.
Abstract: Making Social Science Matter presents an exciting new approach to social science, including theoretical argument, methodological guidelines, and examples of practical application. Why has social science failed in attempts to emulate natural science and produce normal theory? Bent Flyvbjerg argues that the strength of social science is in its rich, reflexive analysis of values and power, essential to the social and economic development of any society. Richly informed, powerfully argued, and clearly written, this book provides essential reading for all those in the social and behavioral sciences.

3,523 citations


Journal ArticleDOI
05 Oct 2001-Cell
TL;DR: Direct modulation of recombinant enzyme activity by graded hypoxia, iron chelation and cobaltous ions mirrors the characteristics of HIF induction in vivo, fulfilling the requirements for these enzymes to act as the oxygen sensors that regulate HIF.

3,188 citations


Journal ArticleDOI
TL;DR: Anion recognition chemistry has grown from its beginnings with positively charged ammonium cryptand receptors for halide binding to a plethora of charged and neutral, cyclic and acyclic, inorganic and organic supramolecular host systems for the selective complexation, detection, and separation of anionic guest species.
Abstract: Anion recognition chemistry has grown from its beginnings in the late 1960s with positively charged ammonium cryptand receptors for halide binding to, at the end of the millennium, a plethora of charged and neutral, cyclic and acyclic, inorganic and organic supramolecular host systems for the selective complexation, detection, and separation of anionic guest species. Solvation effects and pH values have been shown to play crucial roles in the overall anion recognition process. More recent developments include exciting advances in anion-templated syntheses and directed self-assembly, ion-pair recognition, and the function of anions in supramolecular catalysis.

3,145 citations


Journal ArticleDOI
TL;DR: Estimation is improved by using nonlinear spatial filtering to smooth the estimated autocorrelation, but only within tissue type, reducing bias to close to zero at probability levels as low as 1 × 10⁻⁵.

2,655 citations


Journal ArticleDOI
TL;DR: In this article, the authors investigated supplier and customer integration strategies in a global sample of 322 manufacturers and found that the widest arc of integration, with both suppliers and customers, had the strongest association with performance improvement.

2,423 citations


Journal ArticleDOI
TL;DR: The 2dF Galaxy Redshift Survey (2dFGRS) uses the 2dF multifibre spectrograph on the Anglo-Australian Telescope, which is capable of observing 400 objects simultaneously over a 2° diameter field.
Abstract: The 2dF Galaxy Redshift Survey (2dFGRS) is designed to measure redshifts for approximately 250 000 galaxies. This paper describes the survey design, the spectroscopic observations, the redshift measurements and the survey data base. The 2dFGRS uses the 2dF multifibre spectrograph on the Anglo-Australian Telescope, which is capable of observing 400 objects simultaneously over a 2° diameter field. The source catalogue for the survey is a revised and extended version of the APM galaxy catalogue, and the targets are galaxies with extinction-corrected magnitudes brighter than b_J = 19.45. The main survey regions are two declination strips, one in the southern Galactic hemisphere spanning 80° × 15° around the SGP, and the other in the northern Galactic hemisphere spanning 75° × 10° along the celestial equator; in addition, there are 99 fields spread over the southern Galactic cap. The survey covers 2000 deg² and has a median depth of z = 0.11. Adaptive tiling is used to give a highly uniform sampling rate of 93 per cent over the whole survey region. Redshifts are measured from spectra covering 3600–8000 Å at a two-pixel resolution of 9.0 Å and a median S/N of 13 per pixel. All redshift identifications are visually checked and assigned a quality parameter Q in the range 1–5; Q ≥ 3 redshifts are 98.4 per cent reliable and have an rms uncertainty of 85 km s⁻¹. The overall redshift completeness for Q ≥ 3 redshifts is 91.8 per cent, but this varies with magnitude from 99 per cent for the brightest galaxies to 90 per cent for objects at the survey limit. The 2dFGRS data base is available on the World Wide Web at http://www.mso.anu.edu.au/2dFGRS.

2,296 citations


Journal ArticleDOI
TL;DR: An integrated approach to fitting psychometric functions, assessing the goodness of fit, and providing confidence intervals for the function’s parameters and other estimates derived from them, for the purposes of hypothesis testing is described.
Abstract: The psychometric function relates an observer’s performance to an independent variable, usually some physical quantity of a stimulus in a psychophysical task. This paper, together with its companion paper (Wichmann & Hill, 2001), describes an integrated approach to (1) fitting psychometric functions, (2) assessing the goodness of fit, and (3) providing confidence intervals for the function’s parameters and other estimates derived from them, for the purposes of hypothesis testing. The present paper deals with the first two topics, describing a constrained maximum-likelihood method of parameter estimation and developing several goodness-of-fit tests. Using Monte Carlo simulations, we deal with two specific difficulties that arise when fitting functions to psychophysical data. First, we note that human observers are prone to stimulus-independent errors (or lapses). We show that failure to account for this can lead to serious biases in estimates of the psychometric function’s parameters and illustrate how the problem may be overcome. Second, we note that psychophysical data sets are usually rather small by the standards required by most of the commonly applied statistical tests. We demonstrate the potential errors of applying traditional χ² methods to psychophysical data and advocate use of Monte Carlo resampling techniques that do not rely on asymptotic theory. We have made available the software to implement our methods.
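As a rough illustration of why the lapse parameter matters, here is a minimal maximum-likelihood fit of a logistic psychometric function with a lapse rate, done by brute-force grid search rather than the constrained optimisation the paper describes; the stimulus range, grids and parameter names are all assumptions:

```python
import math

def psychometric(x, alpha, beta, gamma=0.5, lam=0.02):
    """psi(x) = gamma + (1 - gamma - lam) * F(x; alpha, beta), logistic F;
    gamma is the 2AFC guess rate, lam the stimulus-independent lapse rate."""
    F = 1.0 / (1.0 + math.exp(-(x - alpha) / beta))
    return gamma + (1.0 - gamma - lam) * F

def neg_log_lik(data, alpha, beta, lam):
    """Binomial negative log-likelihood of (x, n_correct, n_total) triples."""
    nll = 0.0
    for x, n_correct, n_total in data:
        p = min(max(psychometric(x, alpha, beta, lam=lam), 1e-9), 1 - 1e-9)
        nll -= n_correct * math.log(p) + (n_total - n_correct) * math.log(1 - p)
    return nll

def fit_psychometric(data):
    """Brute-force ML fit over a coarse grid of threshold (alpha),
    slope (beta) and a handful of candidate lapse rates."""
    best = None
    for ai in range(0, 101):                    # alpha in [0, 10]
        for bi in range(1, 31):                 # beta in (0, 3]
            for lam in (0.0, 0.01, 0.02, 0.05):
                alpha, beta = ai * 0.1, bi * 0.1
                nll = neg_log_lik(data, alpha, beta, lam)
                if best is None or nll < best[0]:
                    best = (nll, alpha, beta, lam)
    return best[1:]
```

Fitting synthetic data generated with a 2% lapse rate recovers the threshold and slope closely; forcing lam = 0 on the same data is the bias the paper warns about.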

Journal ArticleDOI
TL;DR: In this paper, the problem of consistently separating the total correlations in a bipartite quantum state into a quantum and a purely classical part is discussed, a measure of classical correlations is proposed, and its properties are explored.
Abstract: We discuss the problem of separating consistently the total correlations in a bipartite quantum state into a quantum and a purely classical part. A measure of classical correlations is proposed and its properties are explored.
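The quantity being split is the total correlation, conventionally measured by the quantum mutual information I(A:B) = S(A) + S(B) − S(AB). A small numpy sketch evaluates it for a Bell pair; the paper's proposed measure of the classical part requires an optimisation over local measurements and is not attempted here:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]        # drop zero eigenvalues (0 log 0 = 0)
    return float(-np.sum(evals * np.log2(evals)))

def partial_trace(rho, keep):
    """Reduce a two-qubit density matrix to subsystem A (keep=0) or B (keep=1)."""
    r = rho.reshape(2, 2, 2, 2)         # indices (a, b, a', b')
    return np.trace(r, axis1=1, axis2=3) if keep == 0 else np.trace(r, axis1=0, axis2=2)

# Bell state (|00> + |11>)/sqrt(2): a maximally entangled pure state
phi = np.zeros(4)
phi[0] = phi[3] = 1.0 / np.sqrt(2.0)
rho = np.outer(phi, phi)

# total correlations: I(A:B) = S(A) + S(B) - S(AB)
mutual_info = (von_neumann_entropy(partial_trace(rho, 0))
               + von_neumann_entropy(partial_trace(rho, 1))
               - von_neumann_entropy(rho))
```

Here I(A:B) = 2 bits; for pure states the proposed classical-correlation measure reduces to the subsystem entropy, so the 2 bits split evenly into one classical bit and one bit of entanglement.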

Journal ArticleDOI
TL;DR: Findings indicate that one emotional involvement of the human orbitofrontal cortex is its representation of the magnitudes of abstract rewards and punishments, such as receiving or losing money.
Abstract: The orbitofrontal cortex (OFC) is implicated in emotion and emotion-related learning. Using event-related functional magnetic resonance imaging (fMRI), we measured brain activation in human subjects doing an emotion-related visual reversal-learning task in which choice of the correct stimulus led to a probabilistically determined 'monetary' reward and choice of the incorrect stimulus led to a monetary loss. Distinct areas of the OFC were activated by monetary rewards and punishments. Moreover, in these areas, we found a correlation between the magnitude of the brain activation and the magnitude of the rewards and punishments received. These findings indicate that one emotional involvement of the human orbitofrontal cortex is its representation of the magnitudes of abstract rewards and punishments, such as receiving or losing money.

Journal ArticleDOI
10 May 2001-Nature
TL;DR: The results illuminate human history, suggesting that LD in northern Europeans is shaped by a marked demographic event about 27,000–53,000 years ago, implying that LD mapping is likely to be practical in this population.
Abstract: With the availability of a dense genome-wide map of single nucleotide polymorphisms (SNPs), a central issue in human genetics is whether it is now possible to use linkage disequilibrium (LD) to map genes that cause disease. LD refers to correlations among neighbouring alleles, reflecting 'haplotypes' descended from single, ancestral chromosomes. The size of LD blocks has been the subject of considerable debate. Computer simulations and empirical data have suggested that LD extends only a few kilobases (kb) around common SNPs, whereas other data have suggested that it can extend much further, in some cases greater than 100 kb. It has been difficult to obtain a systematic picture of LD because past studies have been based on only a few (1-3) loci and different populations. Here, we report a large-scale experiment using a uniform protocol to examine 19 randomly selected genomic regions. LD in a United States population of north-European descent typically extends 60 kb from common alleles, implying that LD mapping is likely to be practical in this population. By contrast, LD in a Nigerian population extends markedly less far. The results illuminate human history, suggesting that LD in northern Europeans is shaped by a marked demographic event about 27,000-53,000 years ago.

Journal ArticleDOI
01 Nov 2001-Nature
TL;DR: It is shown that these large droplets form by virtue of the insect's bumpy surface, which consists of alternating hydrophobic, wax-coated and hydrophilic, non-waxy regions; the design may find application in water-trapping tent and building coverings, for example, or in water condensers and engines.
Abstract: This insect has a tailor-made covering for collecting water from early-morning fog. Some beetles in the Namib Desert collect drinking water from fog-laden wind on their backs1. We show here that these large droplets form by virtue of the insect's bumpy surface, which consists of alternating hydrophobic, wax-coated and hydrophilic, non-waxy regions. The design of this fog-collecting structure can be reproduced cheaply on a commercial scale and may find application in water-trapping tent and building coverings, for example, or in water condensers and engines.

Book
27 Sep 2001
TL;DR: This book presents the theory of chaotic dynamical systems, from one-dimensional maps and multidimensional flows through Lyapunov exponents, strange attractors and Hamiltonian chaos to the calculation of fractal dimension.
Abstract: Preface 1. Introduction 2. One-dimensional maps 3. Nonchaotic multidimensional flows 4. Dynamical systems theory 5. Lyapunov exponents 6. Strange attractors 7. Bifurcations 8. Hamiltonian chaos 9. Time-series properties 10. Nonlinear prediction and noise reduction 11. Fractals 12. Calculation of fractal dimension 13. Fractal measure and multifractals 14. Nonchaotic fractal sets 15. Spatiotemporal chaos and complexity A. Common chaotic systems B. Useful mathematical formulas C. Journals with chaos and related papers Bibliography Index
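The central quantity of the Lyapunov-exponents chapter can be estimated for the one-dimensional logistic map of Chapter 2 by averaging ln|f′(x)| along an orbit. A minimal sketch; the seed, transient length and iteration count are arbitrary choices:

```python
import math

def lyapunov_logistic(r, x0=0.1, n_transient=1000, n_iter=100000):
    """Largest Lyapunov exponent of the logistic map f(x) = r*x*(1-x),
    estimated as the orbit average of ln|f'(x)| = ln|r*(1-2x)|;
    a positive value indicates chaos."""
    x = x0
    for _ in range(n_transient):        # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n_iter):
        x = r * x * (1 - x)
        # floor the derivative to avoid log(0) at the rare x = 1/2
        total += math.log(max(abs(r * (1 - 2 * x)), 1e-300))
    return total / n_iter
```

At r = 4 the estimate converges to the known value ln 2 ≈ 0.693, while on the period-2 attractor at r = 3.2 the exponent is negative.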

Journal ArticleDOI
11 Jan 2001-Nature
TL;DR: The identification of the acute phase-regulated and signal-inducing macrophage protein, CD163, as a receptor that scavenges haemoglobin by mediating endocytosis of haptoglobin–haemoglobin complexes is reported.
Abstract: Intravascular haemolysis is a physiological phenomenon as well as a severe pathological complication when accelerated in various autoimmune, infectious (such as malaria) and inherited (such as sickle cell disease) disorders1. Haemoglobin released into plasma is captured by the acute phase protein haptoglobin, which is depleted from plasma during elevated haemolysis1. Here we report the identification of the acute phase-regulated and signal-inducing macrophage protein, CD163, as a receptor that scavenges haemoglobin by mediating endocytosis of haptoglobin–haemoglobin complexes. CD163 binds only haptoglobin and haemoglobin in complex, which indicates the exposure of a receptor-binding neoepitope. The receptor–ligand interaction is Ca2+-dependent and of high affinity. Complexes of haemoglobin and multimeric haptoglobin (the 2-2 phenotype) exhibit higher functional affinity for CD163 than do complexes of haemoglobin and dimeric haptoglobin (the 1-1 phenotype). Specific CD163-mediated endocytosis of haptoglobin–haemoglobin complexes is measurable in cells transfected with CD163 complementary DNA and in CD163-expressing myelo-monocytic lymphoma cells.

Journal ArticleDOI
Q. R. Ahmad, R. C. Allen, T. C. Andersen, J. D. Anglin, and 202 more (17 institutions)
TL;DR: In this paper, the total flux of ⁸B neutrinos was determined to be (5.44 ± 0.99) × 10⁶ cm⁻² s⁻¹, in close agreement with the predictions of solar models.
Abstract: Solar neutrinos from the decay of ⁸B have been detected at the Sudbury Neutrino Observatory (SNO) via the charged current (CC) reaction on deuterium and by the elastic scattering (ES) of electrons. The CC reaction is sensitive exclusively to νe, while the ES reaction also has a small sensitivity to νμ and ντ. The flux of νe from ⁸B decay measured by the CC reaction rate is φCC(νe) = [1.75 ± 0.07 (stat.) +0.12/−0.11 (syst.) ± 0.05 (theor.)] × 10⁶ cm⁻² s⁻¹. Assuming no flavor transformation, the flux inferred from the ES reaction rate is φES(νx) = [2.39 ± 0.34 (stat.) +0.16/−0.14 (syst.)] × 10⁶ cm⁻² s⁻¹. Comparison of φCC(νe) to the Super-Kamiokande collaboration’s precision value of φES(νx) yields a 3.3σ difference, assuming the systematic uncertainties are normally distributed, providing evidence that there is a nonelectron flavor active neutrino component in the solar flux. The total flux of active ⁸B neutrinos is thus determined to be (5.44 ± 0.99) × 10⁶ cm⁻² s⁻¹, in close agreement with the predictions of solar models.

Journal ArticleDOI
TL;DR: With the discovery of massive numbers of genetic markers and the development of better tools for genotyping, association studies will inevitably proliferate; now is the time to consider critically the design of such studies, to avoid the mistakes of the past and to maximize their potential to identify new components of disease.
Abstract: Assessing the association between DNA variants and disease has been used widely to identify regions of the genome and candidate genes that contribute to disease. However, there are numerous examples of associations that cannot be replicated, which has led to skepticism about the utility of the approach for common conditions. With the discovery of massive numbers of genetic markers and the development of better tools for genotyping, association studies will inevitably proliferate. Now is the time to consider critically the design of such studies, to avoid the mistakes of the past and to maximize their potential to identify new components of disease.

Journal ArticleDOI
23 Mar 2001-Science
TL;DR: Almost all of the key molecules involved in the innate and adaptive immune response are glycoproteins, and specific glycoforms are involved in recognition events.
Abstract: Almost all of the key molecules involved in the innate and adaptive immune response are glycoproteins. In the cellular immune system, specific glycoforms are involved in the folding, quality control, and assembly of peptide-loaded major histocompatibility complex (MHC) antigens and the T cell receptor complex. Although some glycopeptide antigens are presented by the MHC, the generation of peptide antigens from glycoproteins may require enzymatic removal of sugars before the protein can be cleaved. Oligosaccharides attached to glycoproteins in the junction between T cells and antigen-presenting cells help to orient binding faces, provide protease protection, and restrict nonspecific lateral protein-protein interactions. In the humoral immune system, all of the immunoglobulins and most of the complement components are glycosylated. Although a major function for sugars is to contribute to the stability of the proteins to which they are attached, specific glycoforms are involved in recognition events. For example, in rheumatoid arthritis, an autoimmune disease, agalactosylated glycoforms of aggregated immunoglobulin G may induce association with the mannose-binding lectin and contribute to the pathology.

Journal ArticleDOI
TL;DR: The case is articulated for a top-down approach to theory building, in which scale is addressed explicitly and different response variables are clearly distinguished, as the route to a general theory of diversity that must necessarily cover many disparate phenomena at various scales of analysis.
Abstract: Aim Current weaknesses of diversity theory include: a failure to distinguish different biogeographical response variables under the general heading of diversity; and a general failure of ecological theory to deal adequately with geographical scale. Our aim is to articulate the case for a top-down approach to theory building, in which scale is addressed explicitly and in which different response variables are clearly distinguished. Location The article draws upon both theoretical contributions and empirical analyses from all latitudes, focusing on terrestrial ecosystems and with some bias towards (woody) plants. Methods We review current diversity theory and terminology in relation to scale of applicability. As a starting point in developing a general theory, we take the issue of geographical gradients in species richness as a main theme and evaluate the extent to which commonly cited theories are likely to operate at scales from the macro down to the local. Results A degree of confusion surrounds the use of the terms alpha, beta and gamma diversity, and the terms local, landscape and macro-scale are preferred here as a more intuitive framework. The distinction between inventory and differentiation diversity is highlighted as important as, in terms of scale of analysis, are the concepts of focus and extent. The importance of holding area constant in analysis is stressed, as is the notion that different environmental factors exhibit measurable heterogeneity at different scales. Evaluation of several of the most common diversity theories put forward for the grand clines in species richness, indicates that they can be collapsed to dynamic hypotheses based on climate or historical explanations. The importance of the many ecological/ biological mechanisms that have been proposed is evident mainly at local scales of analysis, whilst at the macro-scale they are dependent largely upon climatic controls for their operation. 
Local communities have often been found not to be saturated, i.e. to be non-equilibrial. This is argued, perhaps counter-intuitively, to be entirely compatible with the persistence through time of macro-scale patterns of richness that are climatically determined. The review also incorporates recent developments in macroecology, Rapoport’s rule, trade-offs, and the importance of isolation, landscape impedance and geometric constraints on richness (the mid-domain effect) in generating richness patterns; highlighting those phenomena that are contributory to the first-order climatic pattern, and those, such as the geometric constraints, that may confound or obscure these patterns. Main conclusions A general theory of diversity must necessarily cover many disparate phenomena, at various scales of analysis, and cannot therefore be expressed in a simple formula, but individual elements of this general theory may be. In particular, it appears possible to capture in a dynamic climate-based model and ‘capacity rule’, the form of the

Journal ArticleDOI
29 Mar 2001-Nature
TL;DR: Successful copying of the spider's internal processing and precise control over protein folding, combined with knowledge of the gene sequences of its spinning dopes, could permit industrial production of silk-based fibres with unique properties under benign conditions.
Abstract: Spider silk has outstanding mechanical properties despite being spun at close to ambient temperatures and pressures using water as the solvent. The spider achieves this feat of benign fibre processing by judiciously controlling the folding and crystallization of the main protein constituents, and by adding auxiliary compounds, to create a composite material of defined hierarchical structure. Because the 'spinning dope' (the material from which silk is spun) is liquid crystalline, spiders can draw it during extrusion into a hardened fibre using minimal forces. This process involves an unusual internal drawdown within the spider's spinneret that is not seen in industrial fibre processing, followed by a conventional external drawdown after the dope has left the spinneret. Successful copying of the spider's internal processing and precise control over protein folding, combined with knowledge of the gene sequences of its spinning dopes, could permit industrial production of silk-based fibres with unique properties under benign conditions.

Journal ArticleDOI
06 Sep 2001-Nature
TL;DR: This work identifies the previously unknown β-glucan receptor on macrophages as dectin-1 (ref. 2), a finding that provides new insights into the innate immune recognition of β-glucans.
Abstract: The carbohydrate polymers known as β-1,3-d-glucans exert potent effects on the immune system — stimulating antitumour and antimicrobial activity, for example — by binding to receptors on macrophages and other white blood cells and activating them. Although β-glucans are known to bind to receptors, such as complement receptor 3 (ref. 1), there is evidence that another β-glucan receptor is present on macrophages. Here we identify this unknown receptor as dectin-1 (ref. 2), a finding that provides new insights into the innate immune recognition of β-glucans.

Journal ArticleDOI
TL;DR: A multiscale algorithm for the selection of salient regions of an image is introduced and its application to matching-type problems such as tracking, object recognition and image retrieval is demonstrated.
Abstract: Many computer vision problems can be considered to consist of two main tasks: the extraction of image content descriptions and their subsequent matching. The appropriate choice of type and level of description is of course task dependent, yet it is generally accepted that the low-level or so-called early vision layers in the Human Visual System are context independent. This paper concentrates on the use of low-level approaches for solving computer vision problems and discusses three inter-related aspects of this: saliency, scale selection and content description. In contrast to many previous approaches, which separate these tasks, we argue that these three aspects are intrinsically related. Based on this observation, a multiscale algorithm for the selection of salient regions of an image is introduced and its application to matching-type problems such as tracking, object recognition and image retrieval is demonstrated.
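A drastically simplified, 1-D echo of entropy-based scale saliency: only the histogram-entropy term, with none of the inter-scale weighting of the actual algorithm, and arbitrary bin count and radii:

```python
import math
from collections import Counter

def local_entropy(signal, center, radius, n_bins=8):
    """Shannon entropy (bits) of the intensity histogram in a window."""
    window = signal[max(0, center - radius): center + radius + 1]
    lo, hi = min(signal), max(signal)
    scale = (hi - lo) or 1.0
    bins = Counter(min(int((v - lo) / scale * n_bins), n_bins - 1) for v in window)
    n = len(window)
    return -sum((c / n) * math.log2(c / n) for c in bins.values())

def most_salient(signal, radii=(2, 4, 8)):
    """Return the (position, radius) with the highest local histogram
    entropy -- flat regions score zero, varied regions score high."""
    best = max((local_entropy(signal, i, r), i, r)
               for i in range(len(signal)) for r in radii)
    return best[1], best[2]
```

On a signal that is flat except for one varied stretch, the selected position lands in or near that stretch, which is the intuition behind entropy-based saliency.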

Journal ArticleDOI
TL;DR: This review describes recent empirical and theoretical work on the extent of linkage disequilibrium in the human genome, comparing the predictions of simple population-genetic models to available data and some implications that the emerging patterns of LD have for association-mapping strategies.
Abstract: In this review, we describe recent empirical and theoretical work on the extent of linkage disequilibrium (LD) in the human genome, comparing the predictions of simple population-genetic models to available data. Several studies report significant LD over distances longer than those predicted by standard models, whereas some data from short, intergenic regions show less LD than would be expected. The apparent discrepancies between theory and data present a challenge—both to modelers and to human geneticists—to identify which important features are missing from our understanding of the biological processes that give rise to LD. Salient features may include demographic complications such as recent admixture, as well as genetic factors such as local variation in recombination rates, gene conversion, and the potential segregation of inversions. We also outline some implications that the emerging patterns of LD have for association-mapping strategies. In particular, we discuss what marker densities might be necessary for genomewide association scans.
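The basic pairwise LD summaries discussed throughout the review reduce to a few lines of arithmetic on haplotype frequencies; the allele labels below are arbitrary:

```python
def ld_stats(p_AB, p_Ab, p_aB, p_ab):
    """D, D' and r^2 between two biallelic loci, computed from the
    four haplotype frequencies (which must sum to 1)."""
    pA = p_AB + p_Ab                    # allele frequency at locus 1
    pB = p_AB + p_aB                    # allele frequency at locus 2
    D = p_AB - pA * pB                  # raw disequilibrium coefficient
    if D >= 0:
        d_max = min(pA * (1 - pB), (1 - pA) * pB)
    else:
        d_max = min(pA * pB, (1 - pA) * (1 - pB))
    d_prime = 0.0 if d_max == 0 else D / d_max   # signed D'
    denom = pA * (1 - pA) * pB * (1 - pB)
    r2 = 0.0 if denom == 0 else D * D / denom
    return D, d_prime, r2
```

Complete LD (only AB and ab haplotypes) gives D′ = r² = 1, while frequencies at linkage equilibrium give D = 0; real data fall between these extremes, which is what the review's distance-versus-LD comparisons quantify.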

Journal ArticleDOI
TL;DR: Recent findings in TR cell biology are reviewed, a model for how TR cells may inhibit the development of immune pathology is outlined and potential therapeutic benefits that may arise from the manipulation of TR cell function are discussed.
Abstract: It is now well established that regulatory T (T(R)) cells can inhibit harmful immunopathological responses directed against self or foreign antigens. However, many key aspects of T(R) cell biology remain unresolved, especially with regard to their antigen specificities and the cellular and molecular pathways involved in their development and mechanisms of action. We will review here recent findings in these areas, outline a model for how T(R) cells may inhibit the development of immune pathology and discuss potential therapeutic benefits that may arise from the manipulation of T(R) cell function.

Journal ArticleDOI
TL;DR: An explicit model for the evolution of complex disease loci is proposed, incorporating mutation, random genetic drift, and the possibility of purifying selection against susceptibility mutations, showing that, for the most plausible range of mutation rates, neutral susceptibility alleles are unlikely to be at intermediate frequencies and contribute little to the overall genetic variance for the disease.
Abstract: Little is known about the nature of genetic variation underlying complex diseases in humans. One popular view proposes that mapping efforts should focus on identification of susceptibility mutations that are relatively old and at high frequency. It is generally assumed—at least for modeling purposes—that selection against complex disease mutations is so weak that it can be ignored. In this article, I propose an explicit model for the evolution of complex disease loci, incorporating mutation, random genetic drift, and the possibility of purifying selection against susceptibility mutations. I show that, for the most plausible range of mutation rates, neutral susceptibility alleles are unlikely to be at intermediate frequencies and contribute little to the overall genetic variance for the disease. Instead, it seems likely that the bulk of genetic variance underlying diseases is due to loci where susceptibility mutations are mildly deleterious and where there is a high overall mutation rate to the susceptible class. At such loci, the total frequency of susceptibility mutations may be quite high, but there is likely to be extensive allelic heterogeneity at many of these loci. I discuss some practical implications of these results for gene mapping efforts.
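The deterministic core of the argument — that even mild selection pins susceptibility alleles at low equilibrium frequency — can be checked with the textbook mutation–selection recursion. This is a haploid sketch that ignores the drift the paper models; the values of u and s are arbitrary:

```python
def equilibrium_freq(u, s, n_gen=20000, q0=0.0):
    """Deterministic haploid mutation-selection recursion: the
    susceptibility allele has fitness 1 - s, with one-way mutation
    toward it at total rate u per generation. Drift is ignored."""
    q = q0
    for _ in range(n_gen):
        q_sel = q * (1 - s) / (1 - s * q)   # after selection
        q = q_sel + u * (1 - q_sel)         # after mutation
    return q
```

The recursion settles at the classical balance q ≈ u/s, so a high locus-wide mutation rate of 10⁻⁴ against 1% selection still holds the susceptible class near 1% frequency — consistent with the paper's conclusion that such alleles are individually rare but collectively heterogeneous.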

Journal ArticleDOI
25 Oct 2001-Nature
TL;DR: The genome of a multiple-drug-resistant S. typhi strain (CT18) is sequenced, revealing the presence of hundreds of insertions and deletions compared with the Escherichia coli genome, ranging in size from single genes to large islands.
Abstract: Salmonella enterica serovar Typhi (S. typhi) is the aetiological agent of typhoid fever, a serious invasive bacterial disease of humans with an annual global burden of approximately 16 million cases, leading to 600,000 fatalities. Many S. enterica serovars actively invade the mucosal surface of the intestine but are normally contained in healthy individuals by the local immune defence mechanisms. However, S. typhi has evolved the ability to spread to the deeper tissues of humans, including liver, spleen and bone marrow. Here we have sequenced the 4,809,037-base pair (bp) genome of a S. typhi (CT18) that is resistant to multiple drugs, revealing the presence of hundreds of insertions and deletions compared with the Escherichia coli genome, ranging in size from single genes to large islands. Notably, the genome sequence identifies over two hundred pseudogenes, several corresponding to genes that are known to contribute to virulence in Salmonella typhimurium. This genetic degradation may contribute to the human-restricted host range for S. typhi. CT18 harbours a 218,150-bp multiple-drug-resistance IncHI1 plasmid (pHCM1), and a 106,516-bp cryptic plasmid (pHCM2), which shows recent common ancestry with a virulence plasmid of Yersinia pestis.

Journal ArticleDOI
TL;DR: The model is diabetes-specific and incorporates glycaemia, systolic blood pressure and lipid levels as risk factors, in addition to age, sex, ethnic group, smoking status and time since diagnosis of diabetes, which provides the estimates ofCHD risk required by current guidelines for the primary prevention of CHD in Type II diabetes.
Abstract: A definitive model for predicting absolute risk of coronary heart disease (CHD) in male and female people with Type II diabetes is not yet available. This paper provides an equation for estimating the risk of new CHD events in people with Type II diabetes, based on data from 4540 U.K. Prospective Diabetes Study male and female patients. Unlike previously published risk equations, the model is diabetes-specific and incorporates glycaemia, systolic blood pressure and lipid levels as risk factors, in addition to age, sex, ethnic group, smoking status and time since diagnosis of diabetes. All variables included in the final model were statistically significant (P<0.001, except smoking for which P=0.0013) in likelihood ratio testing. This model provides the estimates of CHD risk required by current guidelines for the primary prevention of CHD in Type II diabetes.

Journal ArticleDOI
TL;DR: In this paper, it was shown that the prediction for BR(μ → eγ) is in general larger than the experimental upper bound, especially if the largest Yukawa coupling is O(1) and the solar data are explained by a large-angle MSW effect, which recent analyses suggest as the preferred scenario.