
Showing papers from "University of Oxford" published in 1996


Journal ArticleDOI
TL;DR: An instrument to assess the quality of reports of randomized clinical trials (RCTs) in pain research is described, along with its use to determine the effect of rater blinding on assessments of quality.

15,740 citations


Journal ArticleDOI
Claude Amsler1, Michael Doser2, Mario Antonelli, D. M. Asner3 +173 more · Institutions (86)
TL;DR: This biennial Review summarizes much of particle physics, using data from previous editions.

12,798 citations


Journal ArticleDOI
13 Jan 1996-BMJ
TL;DR: Evidence Based Medicine (EBM) as discussed by the authors is the conscientious, explicit, and judicious use of current best evidence in making decisions about the care of individual patients, and is a hot topic for clinicians, public health practitioners, purchasers, planners and the public.
Abstract: It's about integrating individual clinical expertise and the best external evidence. Evidence based medicine, whose philosophical origins extend back to mid-19th century Paris and earlier, remains a hot topic for clinicians, public health practitioners, purchasers, planners, and the public. There are now frequent workshops in how to practice and teach it (one sponsored by the BMJ will be held in London on 24 April); undergraduate1 and postgraduate2 training programmes are incorporating it3 (or pondering how to do so); British centres for evidence based practice have been established or planned in adult medicine, child health, surgery, pathology, pharmacotherapy, nursing, general practice, and dentistry; the Cochrane Collaboration and Britain's Centre for Review and Dissemination in York are providing systematic reviews of the effects of health care; new evidence based practice journals are being launched; and it has become a common topic in the lay media. But enthusiasm has been mixed with some negative reaction.4 5 6 Criticism has ranged from evidence based medicine being old hat to it being a dangerous innovation, perpetrated by the arrogant to serve cost cutters and suppress clinical freedom. As evidence based medicine continues to evolve and adapt, now is a useful time to refine the discussion of what it is and what it is not. Evidence based medicine is the conscientious, explicit, and judicious use of current best evidence in making decisions about the care of individual patients. The …

12,134 citations


Book
01 Jan 1996
TL;DR: Professor Ripley brings together two crucial ideas in pattern recognition; statistical methods and machine learning via neural networks in this self-contained account.
Abstract: From the Publisher: Pattern recognition has long been studied in relation to many different (and mainly unrelated) applications, such as remote sensing, computer vision, space research, and medical imaging. In this book Professor Ripley brings together two crucial ideas in pattern recognition: statistical methods and machine learning via neural networks. Unifying principles are brought to the fore, and the author gives an overview of the state of the subject. Many examples are included to illustrate real problems in pattern recognition and how to overcome them. This is a self-contained account, ideal both as an introduction for non-specialist readers, and also as a handbook for the more expert reader.

5,632 citations



Journal ArticleDOI
TL;DR: The uses of such internal models for solving several fundamental computational problems in motor control are outlined and the evidence for their existence and use by the central nervous system is reviewed.

2,033 citations


Book
John Cardy1
26 Apr 1996
TL;DR: In this article, the authors provide a thoroughly modern graduate-level introduction to the theory of critical behavior, including phase diagrams, fixed points, cross-over behavior, finite-size scaling, perturbative renormalization methods, low-dimensional systems, surface critical behaviour, random systems, percolation, polymer statistics, critical dynamics and conformal symmetry.
Abstract: This text provides a thoroughly modern graduate-level introduction to the theory of critical behaviour. Beginning with a brief review of phase transitions in simple systems and of mean field theory, the text then goes on to introduce the core ideas of the renormalization group. Following chapters cover phase diagrams, fixed points, cross-over behaviour, finite-size scaling, perturbative renormalization methods, low-dimensional systems, surface critical behaviour, random systems, percolation, polymer statistics, critical dynamics and conformal symmetry. The book closes with an appendix on Gaussian integration, a selected bibliography, and a detailed index. Many problems are included. The emphasis throughout is on providing an elementary and intuitive approach. In particular, the perturbative method introduced leads, among other applications, to a simple derivation of the epsilon expansion in which all the actual calculations (at least to lowest order) reduce to simple counting, avoiding the need for Feynman diagrams.

1,728 citations


Journal ArticleDOI
26 Dec 1996-Nature
TL;DR: In this article, the authors visualized the airflow around the wings of the hawkmoth Manduca sexta and a 'hovering' large mechanical model, and found an intense leading-edge vortex on the downstroke, of sufficient strength to explain the high-lift forces.
Abstract: INSECTS cannot fly, according to the conventional laws of aerodynamics: during flapping flight, their wings produce more lift than during steady motion at the same velocities and angles of attack1–5. Measured instantaneous lift forces also show qualitative and quantitative disagreement with the forces predicted by conventional aerodynamic theories6–9. The importance of high-lift aerodynamic mechanisms is now widely recognized but, except for the specialized fling mechanism used by some insect species1,10–13, the source of extra lift remains unknown. We have now visualized the airflow around the wings of the hawkmoth Manduca sexta and a 'hovering' large mechanical model—the flapper. An intense leading-edge vortex was found on the downstroke, of sufficient strength to explain the high-lift forces. The vortex is created by dynamic stall, and not by the rotational lift mechanisms that have been postulated for insect flight14–16. The vortex spirals out towards the wingtip with a spanwise velocity comparable to the flapping velocity. The three-dimensional flow is similar to the conical leading-edge vortex found on delta wings, with the spanwise flow stabilizing the vortex.

1,663 citations


Journal ArticleDOI
TL;DR: A method (HOLE) that allows the analysis of the dimensions of the pore running through a structural model of an ion channel is presented and can be used to predict the conductance of channels using a simple empirically corrected ohmic model.

1,390 citations
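The "simple empirically corrected ohmic model" mentioned in the HOLE entry can be illustrated with a toy calculation. This is an assumption-laden sketch, not HOLE's actual algorithm or its empirical correction: treat the pore as a stack of thin cylindrical slabs along its axis and sum their ohmic resistances; the radius profile below is made up.

```python
import math

def ohmic_conductance(radii_nm, dz_nm=0.1, kappa_S_per_nm=1.1e-8):
    # Uncorrected ohmic model: each slab of thickness dz and radius r
    # contributes resistance dz / (kappa * pi * r^2); conductance g = 1/R.
    # kappa ~ bulk conductivity of 1 M KCl (about 11 S/m) expressed per nm.
    R = sum(dz_nm / (kappa_S_per_nm * math.pi * r ** 2) for r in radii_nm)
    return 1.0 / R  # conductance in siemens

# Hypothetical profile: a 5-nm pore narrowing to a ~0.3-nm constriction.
profile = [0.6 - 0.3 * math.exp(-((i - 25) / 8.0) ** 2) for i in range(50)]
print(f"{ohmic_conductance(profile) * 1e12:.0f} pS")
```

The narrowest slabs dominate the sum, which is why pore-radius profiles are predictive of conductance at all.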


Journal ArticleDOI
19 Apr 1996-Cell
TL;DR: This review summarizes the current understanding of the control mechanisms revealed by crystal structure determination of cAPK, including the structural importance of Thr-197 phosphorylation and the possible roles of domains that may function in response to second messengers.

1,356 citations


Gavin Lowe1
01 Jan 1996
TL;DR: This paper uses FDR, a refinement checker for CSP, to discover an attack upon the Needham-Schroeder Public-Key Protocol that allows an intruder to impersonate another agent; the protocol is then adapted, and FDR is used to show that the new protocol is secure, at least for a small system.
Abstract: In this paper we analyse the well known Needham-Schroeder Public-Key Protocol using FDR, a refinement checker for CSP. We use FDR to discover an attack upon the protocol, which allows an intruder to impersonate another agent. We adapt the protocol, and then use FDR to show that the new protocol is secure, at least for a small system. Finally we prove a result which tells us that if this small system is secure, then so is a system of arbitrary size.
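The attack FDR uncovered is Lowe's well-known man-in-the-middle. The toy Python model below (symbolic encryption, not the paper's CSP/FDR analysis) replays it, and shows how Lowe's fix, including the responder's identity in message 2, lets the initiator detect the attack:

```python
# Symbolic model: a message encrypted for X can only be opened by X.
def enc(recipient, payload):
    return ("enc", recipient, payload)

def dec(agent, msg):
    kind, recipient, payload = msg
    assert kind == "enc" and recipient == agent, "cannot decrypt"
    return payload

def run(fixed=False):
    # A starts a session with the intruder I, who re-uses it against B.
    Na, Nb = "Na", "Nb"
    m1 = enc("I", (Na, "A"))             # 1. A -> I : {Na, A}_pk(I)
    na, claimed = dec("I", m1)
    m1_replay = enc("B", (na, claimed))  # 1'. I(A) -> B : {Na, A}_pk(B)
    na2, _ = dec("B", m1_replay)
    # 2'. B -> I(A) : {Na, Nb}_pk(A)  (fixed: {Na, Nb, B}_pk(A))
    m2 = enc("A", (na2, Nb, "B") if fixed else (na2, Nb))
    # I cannot open m2, but forwards it to A as step 2 of the first session.
    payload = dec("A", m2)
    if fixed and payload[2] != "I":      # A is talking to I, not B: abort.
        return "attack detected"
    nb = payload[1]
    m3 = enc("I", (nb,))                 # 3. A -> I : {Nb}_pk(I)
    (nb_leak,) = dec("I", m3)            # I learns Nb...
    dec("B", enc("B", (nb_leak,)))       # ...and completes B's session as "A".
    return "attack succeeds"

print(run(fixed=False))  # attack succeeds
print(run(fixed=True))   # attack detected
```

The fix works because message 2 now binds the responder's identity under A's public key, where the intruder cannot alter it.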

Book ChapterDOI
15 Apr 1996
TL;DR: The Condensation algorithm combines factored sampling with learned dynamical models to propagate an entire probability distribution for object position and shape, over time, and is markedly superior to what has previously been attainable from Kalman filtering.
Abstract: The problem of tracking curves in dense visual clutter is a challenging one. Trackers based on Kalman filters are of limited use; because they are based on Gaussian densities which are unimodal, they cannot represent simultaneous alternative hypotheses. Extensions to the Kalman filter to handle multiple data associations work satisfactorily in the simple case of point targets, but do not extend naturally to continuous curves. A new, stochastic algorithm is proposed here, the Condensation algorithm — Conditional Density Propagation over time. It uses ‘factored sampling’, a method previously applied to interpretation of static images, in which the distribution of possible interpretations is represented by a randomly generated set of representatives. The Condensation algorithm combines factored sampling with learned dynamical models to propagate an entire probability distribution for object position and shape, over time. The result is highly robust tracking of agile motion in clutter, markedly superior to what has previously been attainable from Kalman filtering. Notwithstanding the use of stochastic methods, the algorithm runs in near real-time.
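The propagation step described above (resample by weight, predict through a dynamical model, reweight by the observation likelihood) can be sketched as a generic particle filter. The 1-D state, linear dynamics and Gaussian likelihood below are illustrative assumptions, not the paper's curve-tracking model:

```python
import random, math

def condensation_step(particles, observe, noise=0.5, drift=1.0):
    n = len(particles)
    # 1. Factored sampling: resample proportionally to the previous weights.
    xs = random.choices([x for x, _ in particles],
                        weights=[w for _, w in particles], k=n)
    new = []
    for x in xs:
        # 2. Predict with the (assumed) dynamical model x' = x + drift + noise.
        x = x + drift + random.gauss(0.0, noise)
        # 3. Reweight by the measurement likelihood.
        new.append((x, observe(x)))
    total = sum(w for _, w in new) or 1.0
    return [(x, w / total) for x, w in new]

# Track a target moving 1 unit per step, observed with Gaussian noise.
random.seed(0)
true_pos = 0.0
particles = [(random.gauss(0.0, 2.0), 1.0) for _ in range(500)]
for _ in range(20):
    true_pos += 1.0
    z = true_pos + random.gauss(0.0, 0.5)
    lik = lambda x: math.exp(-0.5 * ((x - z) / 0.5) ** 2)
    particles = condensation_step(particles, lik)
estimate = sum(x * w for x, w in particles)
print(round(estimate, 1), "vs true", true_pos)  # estimate should track ~20
```

Because the posterior is a sample set rather than a single Gaussian, the same machinery can carry several competing hypotheses at once, which is exactly what the Kalman filter cannot do.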

Journal ArticleDOI
TL;DR: In this paper, the stability of a quantum superposition of two different stationary mass distributions is examined, where the perturbing effect of each distribution on the space-time structure is taken into account, in accordance with the principles of general relativity.
Abstract: The stability of a quantum superposition of two different stationary mass distributions is examined, where the perturbing effect of each distribution on the space-time structure is taken into account, in accordance with the principles of general relativity. It is argued that the definition of the time-translation operator for the superposed space-times involves an inherent ill-definedness, leading to an essential uncertainty in the energy of the superposed state which, in the Newtonian limit, is proportional to the gravitational self-energy EΔ of the difference between the two mass distributions. This is consistent with a suggested finite lifetime of the order of ħ/EΔ for the superposed state, in agreement with a certain proposal made by the author for a gravitationally induced spontaneous quantum state reduction, and with closely related earlier suggestions by Diosi and by Ghirardi et al.

Journal ArticleDOI
TL;DR: Breast cancer and hormonal contraceptives: collaborative reanalysis of individual data on 53,297 women with breast cancer and 100,239 women without breast cancer from 54 epidemiological studies, as mentioned in this paper.

Journal Article
TL;DR: Analysis of macrophage infiltration in invasive breast carcinomas indicates a role for macrophages in angiogenesis and prognosis in breast cancer and that this cell type may represent an important target for immunoinhibitory therapy in Breast cancer.
Abstract: Angiogenesis is a key process in tumor growth and metastasis and is a major independent prognostic factor in breast cancer. A range of cytokines stimulate the tumor neovasculature, and tumor-associated macrophages have been shown recently to produce several important angiogenic factors. We have quantified macrophage infiltration using Chalkley count morphometry in a series of invasive breast carcinomas to investigate the relationship between tumor-associated macrophage infiltration and tumor angiogenesis, and prognosis. There was a significant positive correlation between high vascular grade and increased macrophage index (P = 0.03), and a strong relationship was observed between increased macrophage counts and reduced relapse-free survival (P = 0.006) and reduced overall survival (P = 0.004) as an independent prognostic variable. These data indicate a role for macrophages in angiogenesis and prognosis in breast cancer and that this cell type may represent an important target for immunoinhibitory therapy in breast cancer.

Journal ArticleDOI
05 Apr 1996-Science
TL;DR: A simple mathematical approach is developed to explore the relation between antiviral immune responses, virus load, and virus diversity in infections with the human T cell leukemia virus and the human immunodeficiency virus.
Abstract: Mathematical models, which are based on a firm understanding of biological interactions, can provide nonintuitive insights into the dynamics of host responses to infectious agents and can suggest new avenues for experimentation. Here, a simple mathematical approach is developed to explore the relation between antiviral immune responses, virus load, and virus diversity. The model results are compared to data on cytotoxic T cell responses and viral diversity in infections with the human T cell leukemia virus (HTLV-1) and the human immunodeficiency virus (HIV-1).
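A minimal model in the spirit of this paper can be integrated in a few lines. The equations and parameter values below are illustrative assumptions, not the authors' published model: virus load v replicates at rate r and is cleared by an immune response x, which is itself driven by antigen.

```python
def simulate(r=2.0, p=1.0, c=0.1, b=0.01, v0=1e-3, x0=0.0, dt=1e-3, T=30.0):
    # Forward-Euler integration of:
    #   dv/dt = (r - p*x) * v   (replication minus immune-mediated clearance)
    #   dx/dt = c*v - b*x       (antigen-driven proliferation, decay rate b)
    v, x, t = v0, x0, 0.0
    while t < T:
        dv = (r - p * x) * v
        dx = c * v - b * x
        v, x, t = v + dv * dt, x + dx * dt, t + dt
    return v, x

v, x = simulate()
print(f"v = {v:.3f}, x = {x:.3f}")
```

Even this toy version reproduces the qualitative story: virus load first grows exponentially, then an expanding immune response drives it back down, with the equilibrium set by the ratio of the rate constants.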

Journal ArticleDOI
TL;DR: The details of a lattice Boltzmann approach to phase separation in nonideal one- and two-component fluids are presented and the kinetics of the approach to equilibrium lie within the expected universality classes.
Abstract: We present the details of a lattice Boltzmann approach to phase separation in nonideal one- and two-component fluids. The collision rules are chosen such that the equilibrium state corresponds to an input free energy and the bulk flow is governed by the continuity, Navier-Stokes, and, for the binary fluid, a convection-diffusion equation. Numerical results are compared to simple analytic predictions to confirm that the equilibrium state is indeed thermodynamically consistent and that the kinetics of the approach to equilibrium lie within the expected universality classes. The approach is compared to other lattice Boltzmann simulations of nonideal systems. © 1996 The American Physical Society.
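The stream-and-collide structure common to all lattice Boltzmann methods can be shown with a scheme far simpler than the paper's free-energy binary-fluid model: a D1Q2 sketch for 1-D diffusion, offered only to illustrate the two-step update (relax distributions toward a local equilibrium, then propagate them to neighbouring sites).

```python
def lbm_diffusion(rho, steps, tau=1.0):
    # Two populations per site: right-moving f_r and left-moving f_l.
    f_r = [0.5 * r for r in rho]
    f_l = [0.5 * r for r in rho]
    for _ in range(steps):
        # Collide: relax toward equilibrium f_eq = rho/2 at rate 1/tau.
        rho = [a + b for a, b in zip(f_r, f_l)]
        f_r = [f - (f - 0.5 * r) / tau for f, r in zip(f_r, rho)]
        f_l = [f - (f - 0.5 * r) / tau for f, r in zip(f_l, rho)]
        # Stream: shift each population one site (periodic boundaries).
        f_r = [f_r[-1]] + f_r[:-1]
        f_l = f_l[1:] + [f_l[0]]
    return [a + b for a, b in zip(f_r, f_l)]

# A density spike spreads out while total mass is conserved.
rho0 = [0.0] * 32
rho0[16] = 1.0
rho = lbm_diffusion(rho0, steps=50)
print(round(sum(rho), 6), max(rho) < 1.0)
```

The paper's scheme differs in choosing collision rules whose equilibrium encodes an input free energy, so that the macroscopic limit is the Navier-Stokes and convection-diffusion equations rather than plain diffusion.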

Book
01 Jan 1996
TL;DR: The book discusses data Refinement, Relaxing and Unwinding Data Refinement and Z, and the importance of Equality and Definite Description in the application of data refinement.
Abstract: * Introduction. * Propositional Logic. * Predicate Logic. * Equality and Definite Description. * Sets. * Definitions. * Relations. * Functions. * Sequences. * Free Types. * Schemas Schema Operators. * Promotion, Preconditons. * Data Refinement. * Relaxing and Unwinding Data Refinement and Z. * Applications of Data Refinement. * The Refinement Calculus. * A File System. * A Telecommunications Protocol. * An Operating System Scheduler: A Bounded Buffer Module. * An Unordered Set Module. * A Save Area. * Solutions to Exercises. * Appendices. * Bibliography. * Index.

Journal ArticleDOI
Gregory D. Schuler1, Mark S. Boguski1, Elizabeth A. Stewart2, Lincoln Stein3, Gabor Gyapay, Kate Rice4, Robert E. White5, P. Rodriguez-Tomé6, Amita Aggarwal2, Eva Bajorek2, S. Bentolila, B. B. Birren3, Adam Butler4, Andrew B. Castle3, N. Chiannilkulchai, Angela M. Chu2, C M Clee4, Sid Cowles2, P. J. R. Day5, T. Dibling4, N. Drouot, Ian Dunham4, Simone Duprat, C. East4, C A Edwards4, Jun Fan2, Nicole Y. Fang7, Cécile Fizames, Christine Garrett4, L. Green4, David Hadley2, Midori A. Harris2, Paul Harrison4, Shannon T. Brady2, Andrew A. Hicks4, E. Holloway4, L. Hui3, S. Hussain2, C. Louis-Dit-Sully5, J. Ma3, A. MacGilvery4, Christopher Mader2, A. Maratukulam2, Tara C. Matise8, K. B. McKusick2, Jean Morissette9, Andrew J. Mungall4, Delphine Muselet, H. C. Nusbaum3, David C. Page3, Ammon B. Peck4, Shanti M. Perkins2, Mark Piercy2, Fawn Qin2, John Quackenbush2, S A Ranby4, Tim Reif2, Steve Rozen3, C. Sanders2, X. She2, James Silva3, Donna K. Slonim3, Carol Soderlund4, W.-L. Sun2, P. Tabar2, T. Thangarajah5, Nathalie Vega-Czarny, Douglas Vollrath2, S. Voyticky2, T. E. Wilmer4, Xiao-Yu Wu3, Mark Raymond Adams10, Charles Auffray11, Nicole A.R. Walter12, Rhonda Brandon10, Anindya Dehejia1, Peter N. Goodfellow13, R. Houlgatte11, James R. Hudson1, Susan E. Ide1, K. R. Iorio10, Wha‐Young Lee, N. Seki, Takahiro Nagase, K. Ishikawa, N. Nomura, Cheryl Phillips10, Mihael H. Polymeropoulos1, Mina Sandusky10, Karin Schmitt13, Richard Berry12, K. Swanson, R. Torres1, J. C. Venter10, James M. Sikela12, Jacques S. Beckmann, Jean Weissenbach, Richard M. Myers2, David R. Cox2, Michael R. James5, David Bentley4, Panos Deloukas4, Eric S. Lander3, Thomas J. Hudson3, Thomas J. Hudson14 
25 Oct 1996-Science
TL;DR: The gene map unifies the existing genetic and physical maps with the nucleotide and protein sequence databases in a fashion that should speed the discovery of genes underlying inherited human disease.
Abstract: The human genome is thought to harbor 50,000 to 100,000 genes, of which about half have been sampled to date in the form of expressed sequence tags. An international consortium was organized to develop and map gene-based sequence tagged site markers on a set of two radiation hybrid panels and a yeast artificial chromosome library. More than 16,000 human genes have been mapped relative to a framework map that contains about 1000 polymorphic genetic markers. The gene map unifies the existing genetic and physical maps with the nucleotide and protein sequence databases in a fashion that should speed the discovery of genes underlying inherited human disease. The integrated resource is available through a site on the World Wide Web at http://www.ncbi.nlm.nih.gov/SCIENCE96/.

Journal ArticleDOI
TL;DR: The concept of quantum privacy amplification and a cryptographic scheme incorporating it which is provably secure over a noisy channel is introduced and implemented using technology that is currently being developed.
Abstract: Existing quantum cryptographic schemes are not, as they stand, operable in the presence of noise on the quantum communication channel. Although they become operable if they are supplemented by classical privacy-amplification techniques, the resulting schemes are difficult to analyze and have not been proved secure. We introduce the concept of quantum privacy amplification and a cryptographic scheme incorporating it which is provably secure over a noisy channel. The scheme uses an "entanglement purification" procedure which, because it requires only a few quantum controlled-NOT and single-qubit operations, could be implemented using technology that is currently being developed. [S0031-9007(96)01288-4] Quantum cryptography [1‐3] allows two parties (traditionally known as Alice and Bob) to establish a secure random cryptographic key if, first, they have access to a quantum communication channel, and second, they can exchange classical public messages which can be monitored but not altered by an eavesdropper (Eve). Using such a key, a secure message of equal length can be transmitted over the classical channel. However, the security of quantum cryptography has so far been proved only for the idealized case where the quantum channel, in the absence of eavesdropping, is noiseless. That is because, under existing protocols, Alice and Bob detect eavesdropping by performing certain quantum measurements on transmitted batches of qubits and then using statistical tests to determine, with any desired degree of confidence, that the transmitted qubits are not entangled with any third system such as Eve. The problem is that there is in principle no way of distinguishing entanglement with an eavesdropper (caused by her measurements) from entanglement with the environment caused by innocent noise, some of which is presumably always present.
This implies that all existing protocols are, strictly speaking, inoperable in the presence of noise, since they require the transmission of messages to be suspended whenever an eavesdropper (or, therefore, noise) is detected. Conversely, if we want a protocol that is secure in the presence of noise, we must find one that allows secure transmission to continue even in the presence of eavesdroppers. To this end, one might consider modifying the existing protocols.

Journal ArticleDOI
TL;DR: In this paper, the development of strategic thinking since the 1960s is mapped together with an emerging perspective on strategy as "practice", focusing on strategists and strategizing rather than on organizations and strategies.

Journal ArticleDOI
TL;DR: The International Classification of Childhood Cancer (ICCC) updates the widely used Birch and Marsden classification scheme to accommodate important changes in recognition of different types of neoplasms, while preserving continuity with the original classification.
Abstract: The International Classification of Childhood Cancer (ICCC) updates the widely used Birch and Marsden classification scheme. ICCC is based on the second edition of the International Classification of Diseases for Oncology (ICD-O-2). The purpose of the new classification is to accommodate important changes in recognition of different types of neoplasms, while preserving continuity with the original classification. The grouping of neoplasms into 12 main diagnostic groups is maintained. The major changes are: (1) intracranial and intraspinal germ-cell tumours now constitute a separate subgroup within germ-cell tumours; (2) histiocytosis X (Langerhans-cell histiocytosis) is excluded from ICCC; (3) Kaposi's sarcoma is a separate subgroup within soft-tissue sarcomas; (4) skin carcinoma is a separate subgroup within epithelial neoplasms; (5) "other specified" and "unspecified" neoplasms are now usually separate sub-categories within the main diagnostic groups. Draft copies of the ICCC were distributed to some 200 professionals with interest and expertise in the field and their comments are considered in this final version. This classification will be used for presentation of data in the second volume of the IARC Scientific Publication "International Incidence of Childhood Cancer." A computer programme for automated classification of childhood tumours coded according to ICD-O-1 or ICD-O-2 is now available from IARC.

Journal ArticleDOI
TL;DR: The total daily production of plasma virus is, on average, higher in chronic HBV carriers than in HIV-infected patients, but the half-life of virus-producing cells is much shorter in HIV.
Abstract: Treatment of chronic hepatitis B virus (HBV) infections with the reverse transcriptase inhibitor lamivudine leads to a rapid decline in plasma viremia and provides estimates for crucial kinetic constants of HBV replication. We find that in persistently infected patients, HBV particles are cleared from the plasma with a half-life of approximately 1.0 day, which implies a 50% daily turnover of the free virus population. Total viral release into the periphery is approximately 10^11 virus particles per day. Although we have no direct measurement of the infected cell mass, we can estimate the turnover rate of these cells in two ways: (i) by comparing the rate of viral production before and after therapy or (ii) from the decline of hepatitis B antigen during treatment. These two independent methods give equivalent results: we find a wide distribution of half-lives for virus-producing cells, ranging from 10 to 100 days in different patients, which may reflect differences in rates of lysis of infected cells by immune responses. Our analysis provides a quantitative understanding of HBV replication dynamics in vivo and has implications for the optimal timing of drug treatment and immunotherapy in chronic HBV infection. This study also represents a comparison for recent findings on the dynamics of human immunodeficiency virus (HIV) infection. The total daily production of plasma virus is, on average, higher in chronic HBV carriers than in HIV-infected patients, but the half-life of virus-producing cells is much shorter in HIV. Most strikingly, there is no indication of drug resistance in HBV-infected patients treated for up to 24 weeks.
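The half-life arithmetic in this abstract is easy to reproduce: a plasma-virus half-life t½ implies an exponential clearance rate u = ln 2 / t½, and the fraction of the free virus pool replaced per day is 1 − e^(−u · 1 day).

```python
import math

def daily_turnover(t_half_days):
    u = math.log(2) / t_half_days   # clearance rate per day
    return 1 - math.exp(-u)          # fraction of pool replaced per day

print(f"{daily_turnover(1.0):.0%}")  # half-life of 1.0 day -> 50%
```

The same relation, run in reverse on the observed decline slope, is how such studies convert a viral-load decay curve into a turnover estimate.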

Journal ArticleDOI
01 Dec 1996-Pain
TL;DR: Antidepressants are effective in relieving neuropathic pain, but with very similar results for anticonvulsants it is still unclear which drug class should be first choice.
Abstract: The objective of this study was to review the effectiveness and safety of antidepressants in neuropathic pain. In a systematic review of randomised controlled trials, the main outcomes were global judgements, pain relief or fall in pain intensity which approximated to more than 50% pain relief, and information about minor and major adverse effects. Dichotomous data for effectiveness and adverse effects were analysed using odds ratio and number needed-to-treat (NNT) methods. Twenty-one placebo-controlled treatments in 17 randomised controlled trials were included, involving 10 antidepressants. In six of 13 diabetic neuropathy studies the odds ratios showed significant benefit compared with placebo. The combined odds ratio was 3.6 (95% CI 2.5-5.2), with a NNT for benefit of 3 (2.4-4). In two of three postherpetic neuralgia studies the odds ratios showed significant benefit, and the combined odds ratio was 6.8 (3.5-14.3), with a NNT of 2.3 (1.7-3.3). In two atypical facial pain studies the combined odds ratio for benefit was 4.1 (2.3-7.5), with a NNT of 2.8 (2-4.7). Only one of three central pain studies had analysable dichotomous data. The NNT point estimate was 1.7. Comparisons of tricyclic antidepressants did not show any significant difference between them; they were significantly more effective than benzodiazepines in the three comparisons available. Paroxetine and mianserin were less effective than imipramine. For 11 of the 21 placebo-controlled treatments there was dichotomous information on minor adverse effects; combining across pain syndromes the NNT for minor (noted in published report) adverse effects was 3.7 (2.9-5.2). Information on major (drug-related study withdrawal) adverse effects was available from 19 reports; combining across pain syndromes the NNT for major adverse effects was 22 (13.5-58). Antidepressants are effective in relieving neuropathic pain. 
Compared with placebo, of 100 patients with neuropathic pain who are given antidepressants, 30 will obtain more than 50% pain relief, 30 will have minor adverse reactions and four will have to stop treatment because of major adverse effects. With very similar results for anticonvulsants it is still unclear which drug class should be first choice. Treatment would be improved if we could harness the dramatic improvement seen on placebo in some of the trials.
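The NNT figures quoted above come from standard arithmetic: the number needed to treat is the reciprocal of the absolute difference in response rates between treatment and control. The counts below are hypothetical illustrations, not the review's data.

```python
def nnt(events_treat, n_treat, events_control, n_control):
    # Absolute risk reduction (here, absolute benefit increase), then NNT.
    arr = events_treat / n_treat - events_control / n_control
    return 1 / arr

# Hypothetical counts: 60/100 respond on antidepressant, 30/100 on placebo
# -> 30 extra responders per 100 treated, so NNT ~ 3.3.
print(round(nnt(60, 100, 30, 100), 1))  # 3.3
```

This is why a statement like "of 100 patients given antidepressants, 30 will obtain more than 50% pain relief compared with placebo" corresponds to an NNT point estimate of about 3.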

Journal ArticleDOI
09 Nov 1996-BMJ
TL;DR: An indicator of skewness can be used when there are data for several groups of individuals and deviations from the normal distribution and a relation between the standard deviation and mean across groups often go together.
Abstract: PIP: Many statistical methods of analysis assume that the data have a normal distribution. When the data do not, they can often be changed to make them more normal. However, readers of published papers may wish to be certain that the authors have conducted a proper analysis. One can clearly see whether the distributional assumption is met when data are presented in the form of a histogram or scatter diagram. However, when only summary statistics are presented, the task becomes far more difficult. An idea of the distribution can be gleaned if the summary statistics include the range of the data. For example, a range from 7 to 41 around a mean of 15 suggests that the data are positively skewed. Belief in that assumption may be unreliable because the range is based upon the two most extreme, and atypical, values. Similar asymmetry affecting the lower and upper quartiles would better indicate a skewed distribution. It is suggested that for measurements which must be positive, if the mean is smaller than twice the standard deviation, the data are likely to be skewed. A second indicator of skewness can be used when there are data for several groups of individuals. Deviations from the normal distribution and a relation between the standard deviation and mean across groups often go together. A standard deviation which increases as the mean increases is a strong indication of positively skewed data, and specifically that a log transformation may be needed.
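The note's rule of thumb is easy to try on synthetic data: for a measurement that must be positive, a mean smaller than twice the standard deviation suggests positive skew, and a log transformation may normalise the data. The log-normal sample below is an illustration, not data from the paper.

```python
import math, random, statistics

random.seed(1)
# Log-normal data: positive and positively skewed by construction.
data = [math.exp(random.gauss(1.0, 0.8)) for _ in range(1000)]

mean, sd = statistics.mean(data), statistics.stdev(data)
print("skew suspected (mean < 2*sd):", mean < 2 * sd)

# After logging, the data are approximately normal with mean ~1.0, sd ~0.8.
logged = [math.log(x) for x in data]
lm, lsd = statistics.mean(logged), statistics.stdev(logged)
print("log-scale mean/sd:", round(lm, 2), round(lsd, 2))
```

For genuinely normal positive data the standard deviation is small relative to the mean, so the mean comfortably exceeds twice the standard deviation and the indicator stays quiet.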

Journal ArticleDOI
15 Feb 1996-Nature
TL;DR: In this article, the authors show that exposing hyperaccumulator species of Alyssum to nickel elicits a large and proportional increase in the levels of free histidine, which is shown to be coordinated with nickel in vivo.
Abstract: A NUMBER of terrestrial plants accumulate large quantities of metals such as zinc, manganese, nickel, cobalt and copper in their shoots1. The largest group of these so-called 'metal hyperaccumulators' is found in the genus Alyssum, in which nickel concentrations can reach 3% of leaf dry biomass2,3. Apart from their intrinsic interest, plants exhibiting this trait could be of value in the decontamination of metal-polluted soils4–6. However, the biochemical basis of the capacity for metal accumulation has not been elucidated. Here we report that exposing hyperaccumulator species of Alyssum to nickel elicits a large and proportional increase in the levels of free histidine, which is shown to be coordinated with nickel in vivo. Moreover, supplying histidine to a non-accumulating species greatly increases both its nickel tolerance and capacity for nickel transport to the shoot, indicating that enhanced production of histidine is responsible for the nickel hyperaccumulation phenotype in Alyssum.

Journal ArticleDOI
TL;DR: Cold shock elicits an immediate rise in cytosolic free calcium concentration ([Ca2+]cyt) in both chilling-resistant Arabidopsis and chilling-sensitive tobacco and this suggests that acclimation involves modification of plant calcium signaling to provide a "cold memory."
Abstract: Cold shock elicits an immediate rise in cytosolic free calcium concentration ([Ca2+]cyt) in both chilling-resistant Arabidopsis and chilling-sensitive tobacco (Nicotiana plumbaginifolia). In Arabidopsis, lanthanum or EGTA caused a partial inhibition of both cold shock [Ca2+]cyt elevation and cold-dependent kin1 gene expression. This suggested that calcium influx plays a major role in the cold shock [Ca2+]cyt response and that an intracellular calcium source also might be involved. To investigate whether the vacuole (the major intracellular calcium store in plants) is involved, we targeted the calcium-dependent photoprotein aequorin to the cytosolic face of the vacuolar membrane. Cold shock calcium kinetics in this microdomain were consistent with a cold-induced vacuolar release of calcium. Treatment with neomycin or lithium, which interferes with phosphoinositide cycling, resulted in cold shock [Ca2+]cyt kinetics consistent with the involvement of inositol trisphosphate and inositide phosphate signaling in this response. We also investigated the effects of repeated and prolonged low temperature on cold shock [Ca2+]cyt. Differences were observed between the responses of Arabidopsis and N. plumbaginifolia to repeated cold stimulation. Acclimation of Arabidopsis by pretreatment with cold or hydrogen peroxide caused a modified calcium signature to subsequent cold shock. This suggests that acclimation involves modification of plant calcium signaling to provide a "cold memory."

Journal ArticleDOI
14 Jun 1996-Cell
TL;DR: The two PDGF null phenotypes reveal analogous morphogenetic functions for myofibroblast-type cells in lung and kidney organogenesis, and show that PDGF-B is required in the ontogeny of kidney mesangial cells.

Journal ArticleDOI
TL;DR: Antibody kinetics showed that more than half of the AIDS–KS patients who were examined IgG-seroconverted before KS development and that antibody levels did not decline after seroconversion, suggesting that the rate of infection was constant and that the risk of developing KS once infected with KSHV is not highly dependent on the duration of infection.
Abstract: A major controversy regarding Kaposi's sarcoma–associated herpesvirus (KSHV or HHV8)1,2 is whether or not it is a ubiquitous infection of humans3,4. Immunoassays based on KSHV– and Epstein–Barr virus (EBV)–coinfected cell lines show that most US AIDS–KS patients have specific antibodies to KSHV–related antigens2,5,6. We have developed a sensitive indirect immunofluorescence assay (IFA) based on an EBV–negative, KSHV–infected cell line, BCP–1. When we used this IFA assay, KSHV–related antibodies were found in 71–88% of serum samples from US, Italian and Ugandan AIDS–KS patients, as well as all serum samples examined from HIV–seronegative KS patients. Although none of the US blood donors examined were KSHV seropositive by IFA, intermediate and high seroprevalence rates were found in Italian and Ugandan control populations. Antibody kinetics showed that more than half of the AIDS–KS patients who were examined IgG–seroconverted before KS development, and antibody levels did not decline after seroconversion. For these patients, seropositivity rates increased linearly with time, suggesting that the rate of infection was constant and that the risk of developing KS once infected with KSHV is not highly dependent on the duration of infection. These data strongly suggest that KSHV is not ubiquitous in most populations and that the virus may be under strict immunologic control in healthy KSHV–infected persons.

Journal ArticleDOI
TL;DR: In this paper, it was shown that Laurentia and Baltica shared a common drift history for the time interval 750-600 Ma, rotating clockwise and drifting southward from an equatorial position during the opening of the Proto-Pacific along the Laurentian margin; over this interval the two continents witnessed the break-up of one supercontinent, Rodinia, and the formation of another, but less long-lived, Pangea.