
Showing papers by "University of Cambridge published in 2001"


Journal ArticleDOI
TL;DR: In this paper, the authors developed a new approach to the problem of testing the existence of a level relationship between a dependent variable and a set of regressors, when it is not known with certainty whether the underlying regressors are trend- or first-difference stationary.
Abstract: This paper develops a new approach to the problem of testing the existence of a level relationship between a dependent variable and a set of regressors, when it is not known with certainty whether the underlying regressors are trend- or first-difference stationary. The proposed tests are based on standard F- and t-statistics used to test the significance of the lagged levels of the variables in a univariate equilibrium correction mechanism. The asymptotic distributions of these statistics are non-standard under the null hypothesis that there exists no level relationship, irrespective of whether the regressors are I(0) or I(1). Two sets of asymptotic critical values are provided: one when all regressors are purely I(1) and the other if they are all purely I(0). These two sets of critical values provide a band covering all possible classifications of the regressors into purely I(0), purely I(1) or mutually cointegrated. Accordingly, various bounds testing procedures are proposed. It is shown that the proposed tests are consistent, and their asymptotic distribution under the null and suitably defined local alternatives are derived. The empirical relevance of the bounds procedures is demonstrated by a re-examination of the earnings equation included in the UK Treasury macroeconometric model. Copyright © 2001 John Wiley & Sons, Ltd.
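The mechanics of the test can be illustrated with a toy sketch. The following is a minimal, hypothetical version of the bounds-test F-statistic on synthetic data (one regressor, one short-run term); the actual procedure involves lag selection and the paper's tabulated critical-value bounds, which are not reproduced here:

```python
import numpy as np

# Sketch: regress the first difference of y on the lagged levels of y and x
# plus a short-run term, then test the joint significance of the lagged
# levels against the restricted model that drops them. Data are synthetic.
rng = np.random.default_rng(0)
T = 200
x = np.cumsum(rng.normal(size=T))      # an I(1) regressor
y = 0.5 * x + rng.normal(size=T)       # a level relationship with noise

dy = np.diff(y)
ylag, xlag = y[:-1], x[:-1]
dx = np.diff(x)                        # contemporaneous short-run term

def ssr(X, z):
    beta, *_ = np.linalg.lstsq(X, z, rcond=None)
    return float(np.sum((z - X @ beta) ** 2))

ones = np.ones_like(dy)
unrestricted = np.column_stack([ones, ylag, xlag, dx])
restricted = np.column_stack([ones, dx])   # lagged levels excluded under H0

q = 2                                      # number of restrictions
k = unrestricted.shape[1]
F = ((ssr(restricted, dy) - ssr(unrestricted, dy)) / q) / (
    ssr(unrestricted, dy) / (len(dy) - k))
# The F value is compared with the tabulated I(0) and I(1) bounds: above the
# upper bound rejects "no level relationship", below the lower bound fails
# to reject, and in between the test is inconclusive.
print(round(F, 1))
```

Because the asymptotic distribution is non-standard, the computed F is only interpretable against the published bounds; a value above the upper bound rejects the null of no level relationship regardless of the regressors' integration order.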

13,898 citations


Journal ArticleDOI
TL;DR: The revised criteria facilitate the diagnosis of MS in patients with a variety of presentations, including “monosymptomatic” disease suggestive of MS, disease with a typical relapsing‐remitting course, and disease with insidious progression, without clear attacks and remissions.
Abstract: The International Panel on MS Diagnosis presents revised diagnostic criteria for multiple sclerosis (MS). The focus remains on the objective demonstration of dissemination of lesions in both time and space. Magnetic resonance imaging is integrated with clinical and other paraclinical diagnostic methods. The revised criteria facilitate the diagnosis of MS in patients with a variety of presentations, including "monosymptomatic" disease suggestive of MS, disease with a typical relapsing-remitting course, and disease with insidious progression, without clear attacks and remissions. Previously used terms such as "clinically definite" and "probable MS" are no longer recommended. The outcome of a diagnostic evaluation is either MS, "possible MS" (for those at risk for MS, but for whom diagnostic evaluation is equivocal), or "not MS."

6,720 citations


Journal ArticleDOI
TL;DR: The Autism-Spectrum Quotient is a valuable instrument for rapidly quantifying where any given individual is situated on the continuum from autism to normality, and its potential for screening for autism spectrum conditions in adults of normal intelligence remains to be fully explored.
Abstract: Currently there are no brief, self-administered instruments for measuring the degree to which an adult with normal intelligence has the traits associated with the autistic spectrum. In this paper, we report on a new instrument to assess this: the Autism-Spectrum Quotient (AQ). Individuals score in the range 0-50. Four groups of subjects were assessed: Group 1: 58 adults with Asperger syndrome (AS) or high-functioning autism (HFA); Group 2: 174 randomly selected controls; Group 3: 840 students at Cambridge University; and Group 4: 16 winners of the UK Mathematics Olympiad. The adults with AS/HFA had a mean AQ score of 35.8 (SD = 6.5), significantly higher than Group 2 controls (M = 16.4, SD = 6.3). 80% of the adults with AS/HFA scored 32+, versus 2% of controls. Among the controls, men scored slightly but significantly higher than women. No women scored extremely highly (AQ score 34+) whereas 4% of men did so. Twice as many men (40%) as women (21%) scored at intermediate levels (AQ score 20+). Among the AS/HFA group, male and female scores did not differ significantly. The students at Cambridge University did not differ from the randomly selected control group, but scientists (including mathematicians) scored significantly higher than both humanities and social sciences students, confirming an earlier study that autistic conditions are associated with scientific skills. Within the sciences, mathematicians scored highest. This was replicated in Group 4, the Mathematics Olympiad winners scoring significantly higher than the male Cambridge humanities students. 6% of the student sample scored 32+ on the AQ. On interview, 11 out of 11 of these met three or more DSM-IV criteria for AS/HFA and all were studying sciences/mathematics; 7 of the 11 met threshold on these criteria. Test-retest and interrater reliability of the AQ was good.
The AQ is thus a valuable instrument for rapidly quantifying where any given individual is situated on the continuum from autism to normality. Its potential for screening for autism spectrum conditions in adults of normal intelligence remains to be fully explored.
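As a rough illustration of how such a self-administered instrument is scored and used for screening, here is a hypothetical sketch (the item keying and responses below are invented, not the published AQ key); the 32+ cutoff comes from the abstract:

```python
def aq_score(responses, agree_keyed):
    """Score a 50-item questionnaire: each item contributes 1 when the
    response ('agree' or 'disagree') matches the trait-keyed direction.
    The keying here is illustrative, not the published AQ scoring key."""
    assert len(responses) == len(agree_keyed) == 50
    return sum(1 for r, k in zip(responses, agree_keyed)
               if (r == 'agree') == k)

def screen_positive(score, cutoff=32):
    # 80% of the AS/HFA group scored 32+, versus 2% of controls.
    return score >= cutoff

keyed = [True] * 25 + [False] * 25            # hypothetical item keying
answers = ['agree'] * 40 + ['disagree'] * 10  # hypothetical responses
s = aq_score(answers, keyed)
print(s, screen_positive(s))
```

A score at or above the cutoff flags someone for further assessment; as the abstract notes, the AQ is a screening aid, not a diagnostic instrument.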

4,988 citations


Journal ArticleDOI
TL;DR: The Revised Eyes Test has improved power to detect subtle individual differences in social sensitivity and was inversely correlated with the Autism Spectrum Quotient (the AQ), a measure of autistic traits in adults of normal intelligence.
Abstract: In 1997 in this Journal we published the "Reading the Mind in the Eyes" Test, as a measure of adult "mentalising". Whilst that test succeeded in discriminating a group of adults with Asperger syndrome (AS) or high-functioning autism (HFA) from controls, it suffered from several psychometric problems. In this paper these limitations are rectified by revising the test. The Revised Eyes Test was administered to a group of adults with AS or HFA (N = 15) and again discriminated these from a large number of normal controls (N = 239) drawn from different samples. In both the clinical and control groups the Eyes Test was inversely correlated with the Autism Spectrum Quotient (the AQ), a measure of autistic traits in adults of normal intelligence. The Revised Eyes Test has improved power to detect subtle individual differences in social sensitivity.

4,858 citations


Journal ArticleDOI
TL;DR: In this paper, the final results of the Hubble Space Telescope (HST) Key Project to measure the Hubble constant are presented, and the results are based on a Cepheid calibration of several secondary distance methods applied over the range of about 60-400 Mpc.
Abstract: We present here the final results of the Hubble Space Telescope (HST) Key Project to measure the Hubble constant. We summarize our method, the results, and the uncertainties, tabulate our revised distances, and give the implications of these results for cosmology. Our results are based on a Cepheid calibration of several secondary distance methods applied over the range of about 60-400 Mpc. The analysis presented here benefits from a number of recent improvements and refinements, including (1) a larger LMC Cepheid sample to define the fiducial period-luminosity (PL) relations, (2) a more recent HST Wide Field and Planetary Camera 2 (WFPC2) photometric calibration, (3) a correction for Cepheid metallicity, and (4) a correction for incompleteness bias in the observed Cepheid PL samples. We adopt a distance modulus to the LMC (relative to which the more distant galaxies are measured) of μ0 = 18.50 ± 0.10 mag, or 50 kpc. New, revised distances are given for the 18 spiral galaxies for which Cepheids have been discovered as part of the Key Project, as well as for 13 additional galaxies with published Cepheid data. The new calibration results in a Cepheid distance to NGC 4258 in better agreement with the maser distance to this galaxy. Based on these revised Cepheid distances, we find values (in km s-1 Mpc-1) of H0 = 71 ± 2 ± 6 (systematic) (Type Ia supernovae), H0 = 71 ± 3 ± 7 (Tully-Fisher relation), H0 = 70 ± 5 ± 6 (surface brightness fluctuations), H0 = 72 ± 9 ± 7 (Type II supernovae), and H0 = 82 ± 6 ± 9 (fundamental plane). We combine these results for the different methods with three different weighting schemes, and find good agreement and consistency with H0 = 72 ± 8 km s-1 Mpc-1. Finally, we compare these results with other, global methods for measuring H0.
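Two of the abstract's numbers can be checked directly: the distance-modulus-to-distance conversion, and a naive inverse-variance combination of the five method-specific H0 values. This simple weighting treats the quoted random and systematic errors as independent across methods, so it understates the paper's quoted ±8 total uncertainty:

```python
import math

def modulus_to_kpc(mu):
    # d[pc] = 10^(1 + mu/5); mu0 = 18.50 gives the adopted ~50 kpc LMC distance
    return 10 ** (1 + mu / 5) / 1e3

# (H0, random error, systematic error) for the five secondary methods
methods = [(71, 2, 6), (71, 3, 7), (70, 5, 6), (72, 9, 7), (82, 6, 9)]

# Naive inverse-variance weights, adding errors in quadrature (an assumption;
# the paper's own combination treats correlated systematics more carefully)
weights = [1 / (r**2 + s**2) for _, r, s in methods]
h0 = sum(w * h for w, (h, _, _) in zip(weights, methods)) / sum(weights)
err = 1 / math.sqrt(sum(weights))

print(round(modulus_to_kpc(18.50)), round(h0, 1), round(err, 1))
```

The weighted mean lands close to the paper's combined H0 = 72 km s⁻¹ Mpc⁻¹, and the modulus conversion reproduces the 50 kpc LMC distance.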

3,397 citations


Journal ArticleDOI
TL;DR: The estimates of energy usage predict the use of distributed codes, with ≤15% of neurons simultaneously active, to reduce energy consumption and allow greater computing power from a fixed number of neurons.
Abstract: Anatomic and physiologic data are used to analyze the energy expenditure on different components of excitatory signaling in the grey matter of rodent brain. Action potentials and postsynaptic effects of glutamate are predicted to consume much of the energy (47% and 34%, respectively), with the resting potential consuming a smaller amount (13%), and glutamate recycling using only 3%. Energy usage depends strongly on action potential rate: an increase in activity of 1 action potential/cortical neuron/s will raise oxygen consumption by 145 mL/100 g grey matter/h. The energy expended on signaling is a large fraction of the total energy used by the brain; this favors the use of energy efficient neural codes and wiring patterns. Our estimates of energy usage predict the use of distributed codes, with ≤15% of neurons simultaneously active, to reduce energy consumption and allow greater computing power from a fixed number of neurons.
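The quoted figures can be collected into a small sanity check. The itemized components sum to ~97% (the abstract does not break down the remainder), and extrapolating the 145 mL figure linearly to other rate changes is an assumption:

```python
# Signaling energy budget from the abstract (fractions of signaling energy)
budget = {
    'action potentials': 0.47,
    'postsynaptic glutamate effects': 0.34,
    'resting potential': 0.13,
    'glutamate recycling': 0.03,
}
print(sum(budget.values()))   # ~0.97; the abstract does not itemize the rest

def extra_oxygen(delta_rate_hz):
    """Marginal oxygen consumption (mL O2 / 100 g grey matter / h) for an
    increase of delta_rate_hz action potentials per cortical neuron per
    second, linearly extrapolating the abstract's 145 mL per 1 Hz figure."""
    return 145 * delta_rate_hz

print(extra_oxygen(0.5))
```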

2,912 citations


Journal ArticleDOI
01 Mar 2001-Nature
TL;DR: A stepwise model for the formation of a transcriptionally silent heterochromatin is provided: SUV39H1 places a ‘methyl marker’ on histone H3, which is then recognized by HP1 through its chromo domain, which may also explain the stable inheritance of theheterochromatic state.
Abstract: Heterochromatin protein 1 (HP1) is localized at heterochromatin sites where it mediates gene silencing. The chromo domain of HP1 is necessary for both targeting and transcriptional repression. In the fission yeast Schizosaccharomyces pombe, the correct localization of Swi6 (the HP1 equivalent) depends on Clr4, a homologue of the mammalian SUV39H1 histone methylase. Both Clr4 and SUV39H1 methylate specifically lysine 9 of histone H3 (ref. 6). Here we show that HP1 can bind with high affinity to histone H3 methylated at lysine 9 but not at lysine 4. The chromo domain of HP1 is identified as its methyl-lysine-binding domain. A point mutation in the chromo domain, which destroys the gene silencing activity of HP1 in Drosophila, abolishes methyl-lysine-binding activity. Genetic and biochemical analysis in S. pombe shows that the methylase activity of Clr4 is necessary for the correct localization of Swi6 at centromeric heterochromatin and for gene silencing. These results provide a stepwise model for the formation of a transcriptionally silent heterochromatin: SUV39H1 places a 'methyl marker' on histone H3, which is then recognized by HP1 through its chromo domain. This model may also explain the stable inheritance of the heterochromatic state.

2,811 citations


Journal ArticleDOI
TL;DR: The new model outperforms the Dayhoff and JTT models with respect to maximum-likelihood values for a large majority of the protein families in the authors' database and suggests that it provides a better overall fit to the evolutionary process in globular proteins and may lead to more accurate phylogenetic tree estimates.
Abstract: Phylogenetic inference from amino acid sequence data uses mainly empirical models of amino acid replacement and is therefore dependent on those models. Two of the more widely used models, the Dayhoff and JTT models, are estimated using similar methods that can utilize large numbers of sequences from many unrelated protein families but are somewhat unsatisfactory because they rely on assumptions that may lead to systematic error and discard a large amount of the information within the sequences. The alternative method of maximum-likelihood estimation may utilize the information in the sequence data more efficiently and suffers from no systematic error, but it has previously been applicable to relatively few sequences related by a single phylogenetic tree. Here, we combine the best attributes of these two methods using an approximate maximum-likelihood method. We implemented this approach to estimate a new model of amino acid replacement from a database of globular protein sequences comprising 3,905 amino acid sequences split into 182 protein families. While the new model has an overall structure similar to those of other commonly used models, there are significant differences. The new model outperforms the Dayhoff and JTT models with respect to maximum-likelihood values for a large majority of the protein families in our database. This suggests that it provides a better overall fit to the evolutionary process in globular proteins and may lead to more accurate phylogenetic tree estimates. Potentially, this matrix, and the methods used to generate it, may also be useful in other areas of research, such as biological sequence database searching, sequence alignment, and protein structure prediction, for which an accurate description of amino acid replacement is required.
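The role such an empirical model plays in likelihood calculations can be sketched with a toy 3-letter alphabet (the matrix below is invented for illustration; the paper's model is a 20x20 amino acid matrix). Replacement probabilities over a branch of length t come from the matrix exponential of the rate matrix:

```python
import numpy as np
from scipy.linalg import expm

# Toy replacement model: a symmetric exchangeability matrix S and equilibrium
# frequencies pi define the rate matrix Q with q_ij = s_ij * pi_j and rows
# summing to zero (same construction as empirical amino acid models).
S = np.array([[0.0, 1.0, 2.0],
              [1.0, 0.0, 0.5],
              [2.0, 0.5, 0.0]])
pi = np.array([0.5, 0.3, 0.2])

Q = S * pi
np.fill_diagonal(Q, -Q.sum(axis=1))

def transition_probs(t):
    # Probability of replacing letter i by j over branch length t
    return expm(Q * t)

P = transition_probs(0.1)
assert np.allclose(P.sum(axis=1), 1.0)       # rows are distributions
assert np.allclose(pi @ Q, 0.0, atol=1e-12)  # pi is stationary

# Log-likelihood of observing aligned letter pairs (i, j) on one branch
pairs = [(0, 0), (0, 1), (2, 2)]
loglik = sum(np.log(pi[i] * P[i, j]) for i, j in pairs)
print(loglik)
```

Maximum-likelihood model estimation, as in the paper, amounts to choosing the entries of S (and branch lengths) that maximize such likelihoods summed over many families.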

2,647 citations


Journal ArticleDOI
TL;DR: In this article, the dispersion of peak positions and intensities with excitation wavelength is used to understand the nature of resonant Raman scattering in carbon and how to derive the local bonding and disorder from the Raman spectra.
Abstract: The Raman spectra of a wide range of disordered and amorphous carbons have been measured under excitation from 785 to 229 nm. The dispersion of peak positions and intensities with excitation wavelength is used to understand the nature of resonant Raman scattering in carbon and how to derive the local bonding and disorder from the Raman spectra. The spectra show three basic features: the D and G peaks, around 1350 and 1600 cm⁻¹ for visible excitation, and an extra T peak, for UV excitation, at ~1060 cm⁻¹. The G peak, due to the stretching motion of sp² pairs, is a good indicator of disorder. It shows dispersion only in amorphous networks, with a dispersion rate proportional to the degree of disorder. Its shift well above 1600 cm⁻¹ under UV excitation indicates the presence of sp² chains. The dispersion of the D peak is strongest in ordered carbons. It shows little dispersion in amorphous carbon, so that in UV excitation it becomes like a density-of-states feature of vibrations of sp² ringlike structures. The intensity ratio I(D)/I(G) falls with increasing UV excitation in all forms of carbon, with a faster decrease in more ordered carbons, so that it is generally small for UV excitation. The T peak, due to sp³ vibrations, only appears in UV Raman, lying around 1060 cm⁻¹ for H-free carbons and around 980 cm⁻¹ in hydrogenated carbons. In hydrogenated carbons, the sp³ C-Hx stretching modes around 2920 cm⁻¹ can be clearly detected for UV excitation. This assignment is confirmed by deuterium substitution.
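A typical way an I(D)/I(G) ratio like the one discussed above is extracted in practice is by fitting two peaks to a spectrum. Here is a sketch on synthetic data (the peak positions, widths, and the choice of two Lorentzians are illustrative assumptions, and the ratio is taken from fitted peak heights):

```python
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(w, A, w0, gamma):
    return A * gamma**2 / ((w - w0)**2 + gamma**2)

def two_peaks(w, Ad, wd, gd, Ag, wg, gg):
    # D peak near 1350 cm^-1 plus G peak near 1600 cm^-1
    return lorentzian(w, Ad, wd, gd) + lorentzian(w, Ag, wg, gg)

# Synthetic visible-excitation spectrum (illustrative, not measured data)
w = np.linspace(1000, 1800, 400)
rng = np.random.default_rng(1)
spec = two_peaks(w, 0.8, 1350, 40, 1.0, 1590, 30) \
       + 0.01 * rng.normal(size=w.size)

p0 = [1, 1340, 50, 1, 1600, 40]          # rough initial guesses
popt, _ = curve_fit(two_peaks, w, spec, p0=p0)
Ad, Ag = popt[0], popt[3]
print(Ad / Ag)                            # I(D)/I(G) from peak heights
```

The fit recovers the ratio built into the synthetic spectrum; on real data, line-shape choice (Lorentzian vs. Breit-Wigner-Fano) affects the extracted ratio.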

2,553 citations


Journal ArticleDOI
TL;DR: It is proposed that the epidemiological associations between poor fetal and infant growth and the subsequent development of type 2 diabetes and the metabolic syndrome result from the effects of poor nutrition in early life, which produces permanent changes in glucose-insulin metabolism.
Abstract: The thrifty phenotype hypothesis proposes that the epidemiological associations between poor fetal and infant growth and the subsequent development of type 2 diabetes and the metabolic syndrome result from the effects of poor nutrition in early life, which produces permanent changes in glucose-insulin metabolism. These changes include reduced capacity for insulin secretion and insulin resistance which, combined with effects of obesity, ageing and physical inactivity, are the most important factors in determining type 2 diabetes. Since the hypothesis was proposed, many studies world-wide have confirmed the initial epidemiological evidence, although the strength of the relationships has varied from one study to another. The relationship with insulin resistance is clear at all ages studied. Less clear is the relationship with insulin secretion. The relative contribution of genes and environment to these relationships remains a matter of debate. The contributions of maternal hyperglycaemia and the trajectory of postnatal growth need to be clarified.

2,520 citations


Journal ArticleDOI
TL;DR: Recent progress is described in understanding of how cells detect and signal the presence and repair of one particularly important form of DNA damage induced by ionizing radiation—the DNA double-strand break (DSB).
Abstract: To ensure the high-fidelity transmission of genetic information, cells have evolved mechanisms to monitor genome integrity. Cells respond to DNA damage by activating a complex DNA-damage-response pathway that includes cell-cycle arrest, the transcriptional and post-transcriptional activation of a subset of genes including those associated with DNA repair, and, under some circumstances, the triggering of programmed cell death. An inability to respond properly to, or to repair, DNA damage leads to genetic instability, which in turn may enhance the rate of cancer development. Indeed, it is becoming increasingly clear that deficiencies in DNA-damage signaling and repair pathways are fundamental to the etiology of most, if not all, human cancers. Here we describe recent progress in our understanding of how cells detect and signal the presence and repair of one particularly important form of DNA damage induced by ionizing radiation-the DNA double-strand break (DSB). Moreover, we discuss how tumor suppressor proteins such as p53, ATM, Brca1 and Brca2 have been linked to such pathways, and how accumulating evidence is connecting deficiencies in cellular responses to DNA DSBs with tumorigenesis.

Journal ArticleDOI
TL;DR: In this paper, the authors studied the dark-matter halo density profiles in a high-resolution N-body simulation of a ΛCDM cosmology and found that the median concentration cvir = Rvir/rs scales with redshift as cvir ∝ (1 + z)⁻¹.
Abstract: We study dark-matter halo density profiles in a high-resolution N-body simulation of a ΛCDM cosmology. Our statistical sample contains ~5000 haloes in the range 10¹¹-10¹⁴ h⁻¹ M⊙ and the resolution allows a study of subhaloes inside host haloes. The profiles are parameterized by an NFW form with two parameters, an inner radius rs and a virial radius Rvir, and we define the halo concentration cvir = Rvir/rs. We find that, for a given halo mass, the redshift dependence of the median concentration is cvir ∝ (1 + z)⁻¹. This corresponds to rs(z) ≈ constant, and is contrary to earlier suspicions that cvir does not vary much with redshift. The implications are that high-redshift galaxies are predicted to be more extended and dimmer than expected before. Second, we find that the scatter in halo profiles is large, with a 1σ Δ(log cvir) = 0.18 at a given mass, corresponding to a scatter in maximum rotation velocities of ΔVmax/Vmax = 0.12. We discuss implications for modelling the Tully-Fisher relation, which has a smaller reported intrinsic scatter. Third, subhaloes and haloes in dense environments tend to be more concentrated than isolated haloes, and show a larger scatter. These results suggest that cvir is an essential parameter for the theory of galaxy modelling, and we briefly discuss implications for the universality of the Tully-Fisher relation, the formation of low surface brightness galaxies, and the origin of the Hubble sequence. We present an improved analytic treatment of halo formation that fits the measured relations between halo parameters and their redshift dependence, and can thus serve semi-analytic studies of galaxy formation.
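The two quantities at the center of the analysis, the NFW profile and the concentration's redshift scaling, can be written down in a few lines (the z = 0 concentration and virial radius below are illustrative values, not the paper's fits):

```python
def nfw_density(r, rho_s, r_s):
    # NFW profile: rho(r) = rho_s / [(r/r_s) * (1 + r/r_s)^2]
    x = r / r_s
    return rho_s / (x * (1 + x) ** 2)

def concentration(c0, z):
    # Median scaling found in the paper: cvir ∝ (1 + z)^-1,
    # i.e. r_s roughly constant while cvir = Rvir/r_s falls with z
    return c0 / (1 + z)

c0 = 9.0                       # illustrative z=0 median for a galaxy-mass halo
for z in (0.0, 1.0, 2.0):
    print(z, concentration(c0, z))

# Since cvir = Rvir / r_s, a lower concentration at fixed Rvir means a
# larger scale radius, hence more extended high-redshift galaxies.
Rvir = 200.0                   # kpc, illustrative
r_s = Rvir / concentration(c0, 1.0)
print(r_s)
```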

Journal ArticleDOI
TL;DR: The 2dF Galaxy Redshift Survey (2dFGRS) as mentioned in this paper uses the 2dF multifibre spectrograph on the Anglo-Australian Telescope, which is capable of observing 400 objects simultaneously over a 2° diameter field.
Abstract: The 2dF Galaxy Redshift Survey (2dFGRS) is designed to measure redshifts for approximately 250 000 galaxies. This paper describes the survey design, the spectroscopic observations, the redshift measurements and the survey data base. The 2dFGRS uses the 2dF multifibre spectrograph on the Anglo-Australian Telescope, which is capable of observing 400 objects simultaneously over a 2° diameter field. The source catalogue for the survey is a revised and extended version of the APM galaxy catalogue, and the targets are galaxies with extinction-corrected magnitudes brighter than bJ = 19.45. The main survey regions are two declination strips, one in the southern Galactic hemisphere spanning 80° × 15° around the SGP, and the other in the northern Galactic hemisphere spanning 75° × 10° along the celestial equator; in addition, there are 99 fields spread over the southern Galactic cap. The survey covers 2000 deg² and has a median depth of z = 0.11. Adaptive tiling is used to give a highly uniform sampling rate of 93 per cent over the whole survey region. Redshifts are measured from spectra covering 3600-8000 Å at a two-pixel resolution of 9.0 Å and a median S/N of 13 pixel⁻¹. All redshift identifications are visually checked and assigned a quality parameter Q in the range 1-5; Q ≥ 3 redshifts are 98.4 per cent reliable and have an rms uncertainty of 85 km s⁻¹. The overall redshift completeness for Q ≥ 3 redshifts is 91.8 per cent, but this varies with magnitude from 99 per cent for the brightest galaxies to 90 per cent for objects at the survey limit. The 2dFGRS data base is available on the World Wide Web at http://www.mso.anu.edu.au/2dFGRS.

Journal ArticleDOI
10 Aug 2001-Science
TL;DR: Self-organization of liquid crystalline and crystalline-conjugated materials has been used to create, directly from solution, thin films with structures optimized for use in photodiodes, demonstrating that complex structures can be engineered from novel materials by means of simple solution-processing steps and may enable inexpensive, high-performance, thin-film photovoltaic technology.
Abstract: Self-organization of liquid crystalline and crystalline-conjugated materials has been used to create, directly from solution, thin films with structures optimized for use in photodiodes. The discotic liquid crystal hexa-peri-hexabenzocoronene was used in combination with a perylene dye to produce thin films with vertically segregated perylene and hexabenzocoronene, with large interfacial surface area. When incorporated into diode structures, these films show photovoltaic response with external quantum efficiencies of more than 34 percent near 490 nanometers. These efficiencies result from efficient photoinduced charge transfer between the hexabenzocoronene and perylene, as well as from effective transport of charges through vertically segregated perylene and hexabenzocoronene pi systems. This development demonstrates that complex structures can be engineered from novel materials by means of simple solution-processing steps and may enable inexpensive, high-performance, thin-film photovoltaic technology.

Journal ArticleDOI
TL;DR: HERWIG as mentioned in this paper is a general-purpose Monte Carlo event generator, which includes the simulation of hard lepton-lepton, lepton-hadron and hadron-hadron scattering and soft hadron-hadron collisions in one package.
Abstract: HERWIG is a general-purpose Monte Carlo event generator, which includes the simulation of hard lepton-lepton, lepton-hadron and hadron-hadron scattering and soft hadron-hadron collisions in one package. It uses the parton-shower approach for initial- and final-state QCD radiation, including colour coherence effects and azimuthal correlations both within and between jets. This article updates the description of HERWIG published in 1992, emphasising the new features incorporated since then. These include, in particular, the matching of first-order matrix elements with parton showers, a more correct treatment of heavy quark decays, and a wide range of new processes, including many predicted by the Minimal Supersymmetric Standard Model, with the option of R-parity violation. At the same time we offer a brief review of the physics underlying HERWIG, together with details of the input and control parameters and the output data, to provide a self-contained guide for prospective users of the program.

Journal ArticleDOI
TL;DR: The results suggest that recent trends in agriculture have had deleterious and measurable effects on bird populations on a continental scale and predict that the introduction of EU agricultural policies into former communist countries hoping to accede to the EU in the near future will result in significant declines in the important bird populations there.
Abstract: The populations of farmland birds in Europe declined markedly during the last quarter of the 20th century, representing a severe threat to biodiversity. Here, we assess whether declines in the populations and ranges of farmland birds across Europe reflect differences in agricultural intensity, which arise largely through differences in political history. Population and range changes were modelled in terms of a number of indices of agricultural intensity. Population declines and range contractions were significantly greater in countries with more intensive agriculture, and significantly higher in the European Union (EU) than in former communist countries. Cereal yield alone explained over 30% of the variation in population trends. The results suggest that recent trends in agriculture have had deleterious and measurable effects on bird populations on a continental scale. We predict that the introduction of EU agricultural policies into former communist countries hoping to accede to the EU in the near future will result in significant declines in the important bird populations there.
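The kind of country-level regression behind the "cereal yield explains over 30% of the variation" result can be sketched as follows; the data here are invented for illustration, so the slope and R² are not the study's values:

```python
import numpy as np

# Hypothetical country-level data: cereal yield (t/ha, an intensity index)
# against mean farmland-bird population trend (% per year).
yield_t_ha = np.array([2.1, 3.0, 4.2, 5.5, 6.1, 6.8, 7.4])
bird_trend = np.array([-0.2, -0.8, -1.1, -1.9, -2.0, -2.6, -3.1])

# Ordinary least squares with an intercept, plus the R^2 of the fit
X = np.column_stack([np.ones_like(yield_t_ha), yield_t_ha])
beta, *_ = np.linalg.lstsq(X, bird_trend, rcond=None)
resid = bird_trend - X @ beta
tss = (bird_trend - bird_trend.mean()) @ (bird_trend - bird_trend.mean())
r2 = 1 - (resid @ resid) / tss

print(beta[1], r2)   # negative slope: higher yields, steeper declines
```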

Journal ArticleDOI
TL;DR: In this article, a dual tree of wavelet filters is proposed to obtain the real and imaginary parts of the wavelet transform, and the dual tree can be extended to image and other multidimensional signals.
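The payoff of pairing two real filter trees into one complex (approximately analytic) wavelet can be illustrated without a full dual-tree implementation: the magnitude of an analytic response is nearly shift-invariant, while a purely real response oscillates under small shifts. This toy uses a plain analytic signal as a stand-in for the two trees:

```python
import numpy as np
from scipy.signal import hilbert

# The dual tree runs two real wavelet filter banks whose wavelets form an
# approximate Hilbert pair, so (real tree) + j*(imaginary tree) behaves like
# an analytic wavelet. Here we mimic that with an exact analytic signal.
n = np.arange(256)
f = 16 / 256                                  # integer cycles, clean FFT bins
sig = np.cos(2 * np.pi * f * n)
shifted = np.cos(2 * np.pi * f * (n - 3))     # small 3-sample shift

# Real-valued coefficients change a lot under the shift...
real_change = np.max(np.abs(sig - shifted))
# ...while the complex magnitude barely moves
mag_change = np.max(np.abs(np.abs(hilbert(sig)) - np.abs(hilbert(shifted))))
print(real_change, mag_change)
```

This near shift-invariance of the magnitude, obtained with only 2x (or 2^d-fold in d dimensions) redundancy, is the main motivation for the dual-tree construction.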

Journal ArticleDOI
TL;DR: The specific RET codon mutation correlates with the MEN2 syndromic variant, the age of onset of M TC, and the aggressiveness of MTC; consequently, that mutation should guide major management decisions, such as whether and when to perform thyroidectomy.
Abstract: This is a consensus statement from an international group, mostly of clinical endocrinologists. MEN1 and MEN2 are hereditary cancer syndromes. The commonest tumors secrete PTH or gastrin in MEN1, and calcitonin or catecholamines in MEN2. Management strategies improved after the discoveries of their genes. MEN1 has no clear syndromic variants. Tumor monitoring in MEN1 carriers includes biochemical tests yearly and imaging tests less often. Neck surgery includes subtotal or total parathyroidectomy, parathyroid cryopreservation, and thymectomy. Proton pump inhibitors or somatostatin analogs are the main management for oversecretion of entero-pancreatic hormones, except insulin. The roles for surgery of most entero-pancreatic tumors present several controversies: exclusion of most operations on gastrinomas and indications for surgery on other tumors. Each MEN1 family probably has an inactivating MEN1 germline mutation. Testing for a germline MEN1 mutation gives useful information, but rarely mandates an intervention. The most distinctive MEN2 variants are MEN2A, MEN2B, and familial medullary thyroid cancer (MTC). They vary in aggressiveness of MTC and spectrum of disturbed organs. Mortality in MEN2 is greater from MTC than from pheochromocytoma. Thyroidectomy, during childhood if possible, is the goal in all MEN2 carriers to prevent or cure MTC. Each MEN2 index case probably has an activating germline RET mutation. RET testing has replaced calcitonin testing to diagnose the MEN2 carrier state. The specific RET codon mutation correlates with the MEN2 syndromic variant, the age of onset of MTC, and the aggressiveness of MTC; consequently, that mutation should guide major management decisions, such as whether and when to perform thyroidectomy.

Journal ArticleDOI
TL;DR: It is reported that embryonic angiogenesis in mice was not affected by deficiency of PlGF, andTransplantation of wild-type bone marrow rescued the impairedAngiogenesis and collateral growth in Pgf−/− mice, indicating that PlGF might have contributed to vessel growth in the adult by mobilizing bone-marrow–derived cells.
Abstract: Vascular endothelial growth factor (VEGF) stimulates angiogenesis by activating VEGF receptor-2 (VEGFR-2). The role of its homolog, placental growth factor (PlGF), remains unknown. Both VEGF and PlGF bind to VEGF receptor-1 (VEGFR-1), but it is unknown whether VEGFR-1, which exists as a soluble or a membrane-bound type, is an inert decoy or a signaling receptor for PlGF during angiogenesis. Here, we report that embryonic angiogenesis in mice was not affected by deficiency of PlGF (Pgf-/-). VEGF-B, another ligand of VEGFR-1, did not rescue development in Pgf-/- mice. However, loss of PlGF impaired angiogenesis, plasma extravasation and collateral growth during ischemia, inflammation, wound healing and cancer. Transplantation of wild-type bone marrow rescued the impaired angiogenesis and collateral growth in Pgf-/- mice, indicating that PlGF might have contributed to vessel growth in the adult by mobilizing bone-marrow-derived cells. The synergism between PlGF and VEGF was specific, as PlGF deficiency impaired the response to VEGF, but not to bFGF or histamine. VEGFR-1 was activated by PlGF, given that anti-VEGFR-1 antibodies and a Src-kinase inhibitor blocked the endothelial response to PlGF or VEGF/PlGF. By upregulating PlGF and the signaling subtype of VEGFR-1, endothelial cells amplify their responsiveness to VEGF during the 'angiogenic switch' in many pathological disorders.

Journal ArticleDOI
16 Feb 2001-Science
TL;DR: Modulation of DNA repair should lead to clinical applications including improvement of radiotherapy and treatment with anticancer drugs and an advanced understanding of the cellular aging process.
Abstract: Cellular DNA is subjected to continual attack, both by reactive species inside cells and by environmental agents. Toxic and mutagenic consequences are minimized by distinct pathways of repair, and 130 known human DNA repair genes are described here. Notable features presently include four enzymes that can remove uracil from DNA, seven recombination genes related to RAD51, and many recently discovered DNA polymerases that bypass damage, but only one system to remove the main DNA lesions induced by ultraviolet light. More human DNA repair genes will be found by comparison with model organisms and as common folds in three-dimensional protein structures are determined. Modulation of DNA repair should lead to clinical applications including improvement of radiotherapy and treatment with anticancer drugs and an advanced understanding of the cellular aging process.


Journal ArticleDOI
TL;DR: The current state of research concerning acyl HSL-mediated quorum-sensing is reviewed and two non-acyl HSL-based systems utilised by the phytopathogens Ralstonia solanacearum and Xanthomonas campestris are described.
Abstract: It has become increasingly and widely recognised that bacteria do not exist as solitary cells, but are colonial organisms that exploit elaborate systems of intercellular communication to facilitate their adaptation to changing environmental conditions. The languages by which bacteria communicate take the form of chemical signals, excreted from the cells, which can elicit profound physiological changes. Many types of signalling molecules, which regulate diverse phenotypes across distant genera, have been described. The most common signalling molecules found in Gram-negative bacteria are N-acyl derivatives of homoserine lactone (acyl HSLs). Modulation of the physiological processes controlled by acyl HSLs (and, indeed, many of the non-acyl HSL-mediated systems) occurs in a cell density- and growth phase-dependent manner. Therefore, the term ‘quorum-sensing’ has been coined to describe this ability of bacteria to monitor cell density before expressing a phenotype. In this paper, we review the current state of research concerning acyl HSL-mediated quorum-sensing. We also describe two non-acyl HSL-based systems utilised by the phytopathogens Ralstonia solanacearum and Xanthomonas campestris.

Journal ArticleDOI
TL;DR: In this article, a local exchange functional called OPTX was proposed to approximate the Hartree-Fock (HF) energies of the 18 first and second row atoms. It was also shown that neither ∇²ρ nor τ = Σi |∇φi|² improves the fit to these atomic energies.
Abstract: We first attempt to determine a local exchange functional Ex[ρ] which accurately reproduces the Hartree-Fock (HF) energies of the 18 first and second row atoms. Ex[ρ] is determined from ρ and |∇ρ|, and we find that we can improve significantly upon Becke's original generalized gradient approximation functional (commonly called B88X) by allowing the coefficient of the Dirac exchange term to be optimized (it is argued that molecules do not behave like the uniform electron gas). We call this new two-parameter exchange functional OPTX. We find that neither ∇²ρ nor τ = Σi |∇φi|² improves the fit to these atomic energies. These exchange functionals include not only exchange, but also left-right correlation. It is therefore proposed that this functional provides a definition for exchange energy plus left-right correlation energy when used in Kohn-Sham (KS) calculations. We call this energy the Kohn-Sham exchange (or KSX) energy. It is shown that for nearly all molecules studied these KSX energies are lower than the ...
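The two-parameter gradient-corrected form described in this abstract can be sketched as follows. This is a reconstruction from the published OPTX literature, not from the abstract itself; the parameter values are the commonly quoted fitted ones and should be treated as illustrative:

```latex
\begin{align}
E_x^{\text{OPTX}} &= -\sum_{\sigma}\int \rho_\sigma^{4/3}
  \left(a_1\,C_x + a_2\,u_\sigma^2\right)\,\mathrm{d}\mathbf{r},\\
u_\sigma &= \frac{\gamma x_\sigma^2}{1+\gamma x_\sigma^2},\qquad
x_\sigma = \frac{|\nabla\rho_\sigma|}{\rho_\sigma^{4/3}},\qquad
C_x = \frac{3}{2}\left(\frac{3}{4\pi}\right)^{1/3},
\end{align}
```

with fitted values reported as a1 ≈ 1.05151, a2 ≈ 1.43169 and γ = 0.006. Setting a1 = 1 and a2 = 0 recovers pure Dirac (LDA) exchange; the optimized a1 > 1 is the "molecules do not behave like the uniform electron gas" point made in the abstract.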

Journal ArticleDOI
30 Nov 2001-Science
TL;DR: The densities of seedlings and saplings of canopy trees are severely reduced on herbivore-affected islands, providing evidence of a trophic cascade unleashed in the absence of top-down regulation.
Abstract: The manner in which terrestrial ecosystems are regulated is controversial. The "top-down" school holds that predators limit herbivores and thereby prevent them from overexploiting vegetation. "Bottom-up" proponents stress the role of plant chemical defenses in limiting plant depredation by herbivores. A set of predator-free islands created by a hydroelectric impoundment in Venezuela allows a test of these competing world views. Limited area restricts the fauna of small (0.25 to 0.9 hectare) islands to predators of invertebrates (birds, lizards, anurans, and spiders), seed predators (rodents), and herbivores (howler monkeys, iguanas, and leaf-cutter ants). Predators of vertebrates are absent, and densities of rodents, howler monkeys, iguanas, and leaf-cutter ants are 10 to 100 times greater than on the nearby mainland, suggesting that predators normally limit their populations. The densities of seedlings and saplings of canopy trees are severely reduced on herbivore-affected islands, providing evidence of a trophic cascade unleashed in the absence of top-down regulation.

Journal ArticleDOI
TL;DR: Prenatal exposure to famine, especially during late gestation, is linked to decreased glucose tolerance in adults, and this effect of famine on glucose tolerance is especially important in people who become obese as adults.

Proceedings ArticleDOI
01 Jan 2001
TL;DR: Both optimal and suboptimal Bayesian algorithms for nonlinear/non-Gaussian tracking problems, with a focus on particle filters are reviewed.
Abstract: Increasingly, for many application areas, it is becoming important to include elements of nonlinearity and non-Gaussianity in order to model accurately the underlying dynamics of a physical system. Moreover, it is typically crucial to process data on-line as it arrives, both from the point of view of storage costs as well as for rapid adaptation to changing signal characteristics. In this paper, we review both optimal and suboptimal Bayesian algorithms for nonlinear/non-Gaussian tracking problems, with a focus on particle filters. Particle filters are sequential Monte Carlo methods based on point mass (or “particle”) representations of probability densities, which can be applied to any state-space model and which generalize the traditional Kalman filtering methods. Several variants of the particle filter such as SIR, ASIR, and RPF are introduced within a generic framework of the sequential importance sampling (SIS) algorithm. These are discussed and compared with the standard EKF through an illustrative example.
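The SIR (sampling importance resampling, or "bootstrap") variant mentioned in this abstract can be sketched in a few lines. The state-space model, parameter values and function name below are illustrative assumptions for a minimal 1-D random-walk example, not details from the paper:

```python
import math
import random

def sir_particle_filter(observations, n_particles=500, q=0.25, r=1.0, seed=0):
    """Bootstrap (SIR) particle filter for the 1-D model
       x_t = x_{t-1} + N(0, q),   y_t = x_t + N(0, r).
       Returns the posterior-mean estimate of x_t at each step."""
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]  # prior draw
    estimates = []
    for y in observations:
        # 1. Propagate each particle through the state-transition prior.
        particles = [x + rng.gauss(0.0, math.sqrt(q)) for x in particles]
        # 2. Weight by the Gaussian measurement likelihood p(y | x).
        weights = [math.exp(-0.5 * (y - x) ** 2 / r) for x in particles]
        total = sum(weights) or 1.0
        weights = [w / total for w in weights]
        # 3. Posterior mean of the weighted point masses is the estimate.
        estimates.append(sum(w * x for w, x in zip(weights, particles)))
        # 4. Multinomial resampling to curb weight degeneracy.
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return estimates

# Track a slowly drifting state from noisy measurements.
rng = random.Random(1)
truth, x = [], 0.0
for _ in range(60):
    x += rng.gauss(0.0, 0.5)
    truth.append(x)
obs = [t + rng.gauss(0.0, 1.0) for t in truth]
est = sir_particle_filter(obs)
rmse = math.sqrt(sum((e - t) ** 2 for e, t in zip(est, truth)) / len(truth))
```

The resampling in step 4 is what distinguishes SIR from plain sequential importance sampling (SIS): without it, after a few steps nearly all weight concentrates on a handful of particles and the point-mass approximation of the density collapses.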

Journal ArticleDOI
TL;DR: The Indian Ocean Experiment (INDOEX) documented this Indo-Asian haze at scales ranging from individual particles to its contribution to the regional climate forcing as discussed by the authors, and integrated the multiplatform observations (satellites, aircraft, ships, surface stations, and balloons) with one-and four-dimensional models to derive the regional aerosol forcing resulting from the direct, the semidirect and the two indirect effects.
Abstract: Every year, from December to April, anthropogenic haze spreads over most of the North Indian Ocean, and South and Southeast Asia. The Indian Ocean Experiment (INDOEX) documented this Indo-Asian haze at scales ranging from individual particles to its contribution to the regional climate forcing. This study integrates the multiplatform observations (satellites, aircraft, ships, surface stations, and balloons) with one- and four-dimensional models to derive the regional aerosol forcing resulting from the direct, the semidirect and the two indirect effects. The haze particles consisted of several inorganic and carbonaceous species, including absorbing black carbon clusters, fly ash, and mineral dust. The most striking result was the large loading of aerosols over most of the South Asian region and the North Indian Ocean. The January to March 1999 visible optical depths were about 0.5 over most of the continent and reached values as large as 0.2 over the equatorial Indian Ocean due to long-range transport. The aerosol layer extended as high as 3 km. Black carbon contributed about 14% to the fine particle mass and 11% to the visible optical depth. The single-scattering albedo estimated by several independent methods was consistently around 0.9 both inland and over the open ocean. Anthropogenic sources contributed as much as 80% (±10%) to the aerosol loading and the optical depth. The in situ data, which clearly support the existence of the first indirect effect (increased aerosol concentration producing more cloud drops with smaller effective radii), are used to develop a composite indirect effect scheme. The Indo-Asian aerosols impact the radiative forcing through a complex set of heating (positive forcing) and cooling (negative forcing) processes. Clouds and black carbon emerge as the major players. The dominant factor, however, is the large negative forcing (-20±4 W m^(−2)) at the surface and the comparably large atmospheric heating. Regionally, the absorbing haze decreased the surface solar radiation by an amount comparable to 50% of the total ocean heat flux and nearly doubled the lower tropospheric solar heating. We demonstrate with a general circulation model how this additional heating significantly perturbs the tropical rainfall patterns and the hydrological cycle, with implications for global climate.

Journal ArticleDOI
01 May 2001-Brain
TL;DR: It is predicted that the resolution of questions concerning the functional neuroanatomical subdivisions of the frontal cortex will ultimately depend on a fuller cognitive psychological fractionation of memory control processes, an enterprise that will be guided and tested by experimentation.
Abstract: The new functional neuroimaging techniques, PET and functional MRI (fMRI), offer sufficient experimental flexibility and spatial resolution to explore the functional neuroanatomical bases of different memory stages and processes. They have had a particular impact on our understanding of the role of the frontal cortex in memory processing. We review the insights that have been gained, and attempt a synthesis of the findings from functional imaging studies of working memory, encoding in episodic memory and retrieval from episodic memory. Though these different aspects of memory have usually been studied in isolation, we suggest that there is sufficient convergence with respect to frontal activations to make such a synthesis worthwhile. We concentrate in particular on three regions of the lateral frontal cortex-ventro-lateral, dorsolateral and anterior-that are consistently activated in these studies, and attribute these activations to the updating/maintenance of information, the selection/manipulation/monitoring of that information, and the selection of processes/subgoals, respectively. We also acknowledge a number of empirical inconsistencies associated with this synthesis, and suggest possible reasons for these. More generally, we predict that the resolution of questions concerning the functional neuroanatomical subdivisions of the frontal cortex will ultimately depend on a fuller cognitive psychological fractionation of memory control processes, an enterprise that will be guided and tested by experimentation. We expect that the neuroimaging techniques will provide an important part of this enterprise.

Journal ArticleDOI
TL;DR: In this paper, the effect of residual stresses on fatigue lifetimes and structural integrity are first summarised, followed by the definition and measurement of residual stress, which are characterised according to the characteristic length scale over which they self-equilibrate.
Abstract: Residual stress is that which remains in a body that is stationary and at equilibrium with its surroundings. It can be very detrimental to the performance of a material or the life of a component. Alternatively, beneficial residual stresses can be introduced deliberately. Residual stresses are more difficult to predict than the in-service stresses on which they superimpose. For this reason, it is important to have reliable methods for the measurement of these stresses and to understand the level of information they can provide. In this paper, which is the first part of a two-part overview, the effect of residual stresses on fatigue lifetimes and structural integrity is first summarised, followed by the definition and measurement of residual stresses. Different types of stress are characterised according to the characteristic length scale over which they self-equilibrate. By comparing this length to the gauge volume of each technique, the capability of a range of techniques is assessed. In the sec...

Journal ArticleDOI
TL;DR: The GAIA astrometric mission has recently been approved as one of the next two ``cornerstones'' of ESA's science programme, with a launch date target of not later than mid-2012 as discussed by the authors.
Abstract: The GAIA astrometric mission has recently been approved as one of the next two ``cornerstones'' of ESA's science programme, with a launch date target of not later than mid-2012. GAIA will provide positional and radial velocity measurements with the accuracies needed to produce a stereoscopic and kinematic census of about one billion stars throughout our Galaxy (and into the Local Group), amounting to about 1 per cent of the Galactic stellar population. GAIA's main scientific goal is to clarify the origin and history of our Galaxy, from a quantitative census of the stellar populations. It will advance questions such as when the stars in our Galaxy formed, when and how it was assembled, and its distribution of dark matter. The survey aims for completeness to V = 20 mag, with accuracies of about 10 μas at 15 mag. Combined with astrophysical information for each star, provided by on-board multi-colour photometry and (limited) spectroscopy, these data will have the precision necessary to quantify the early formation, and subsequent dynamical, chemical and star formation evolution of our Galaxy. Additional products include detection and orbital classification of tens of thousands of extra-solar planetary systems, and a comprehensive survey of some 10^5–10^6 minor bodies in our Solar System, through galaxies in the nearby Universe, to some 500 000 distant quasars. It will provide a number of stringent new tests of general relativity and cosmology. The complete satellite system was evaluated as part of a detailed technology study, including a detailed payload design, corresponding accuracy assessments, and results from a prototype data reduction development.