
Showing papers by "Imperial College London", published in 1990


Journal ArticleDOI
TL;DR: In this paper, the authors review the problem of making numerical predictions of turbulent flow, and advocate that computational economy, range of applicability, and physical realism are best served by turbulence models in which the magnitudes of two turbulence quantities, the turbulence kinetic energy k and its dissipation rate ϵ, are calculated from transport equations solved simultaneously with those governing the mean flow behaviour.

11,866 citations


Journal ArticleDOI
TL;DR: In this paper, three sampling-based approaches, namely stochastic substitution, the Gibbs sampler, and the sampling-importance-resampling algorithm, are compared and contrasted in relation to various joint probability structures frequently encountered in applications.
Abstract: Stochastic substitution, the Gibbs sampler, and the sampling-importance-resampling algorithm can be viewed as three alternative sampling- (or Monte Carlo-) based approaches to the calculation of numerical estimates of marginal probability distributions. The three approaches will be reviewed, compared, and contrasted in relation to various joint probability structures frequently encountered in applications. In particular, the relevance of the approaches to calculating Bayesian posterior densities for a variety of structured models will be discussed and illustrated.

6,294 citations



Journal ArticleDOI
TL;DR: A system (OR-Library) that distributes test problems by electronic mail (e-mail) that has available test problems drawn from a number of different areas of operational research.
Abstract: In this note we present a system (OR-Library) that distributes test problems by electronic mail (e-mail). This system currently has available test problems drawn from a number of different areas of...

1,939 citations


Journal ArticleDOI
TL;DR: It is shown that the optimal buying and selling policies are the local times of the two-dimensional process of bank and stock holdings at the boundaries of a wedge-shaped region which is determined by the solution of a nonlinear free boundary problem.
Abstract: In this paper, optimal consumption and investment decisions are studied for an investor who has available a bank account paying a fixed rate of interest and a stock whose price is a log-normal diffusion. This problem was solved by Merton and others when transactions between bank and stock are costless. Here we suppose that there are charges on all transactions equal to a fixed percentage of the amount transacted. It is shown that the optimal buying and selling policies are the local times of the two-dimensional process of bank and stock holdings at the boundaries of a wedge-shaped region which is determined by the solution of a nonlinear free boundary problem. An algorithm for solving the free boundary problem is given.

1,320 citations


Book ChapterDOI
01 Jan 1990
TL;DR: The UHV chamber can also contain an array of techniques for cleaning the surface (provision for heating the sample, ion bombardment) as well as some means of detecting impurities at the surface, usually by detection of Auger signals from adsorbed atoms as mentioned in this paper.
Abstract: The essential elements are an ultrahigh vacuum (UHV) chamber to preserve surface cleanliness, an electron gun to produce a collimated beam of electrons in the energy range 0 to 500 eV, a crystal holder and manipulator, and some means of observing the diffracted electrons, typically a fluorescent screen. Further details may be found elsewhere.(1–3) The major difficulty is common to all surface experiments, namely, to keep the surface clean. The UHV chamber will normally contain an array of techniques for cleaning the surface (provision for heating the sample, ion bombardment) as well as some means of detecting impurities at the surface, usually by detection of Auger signals from adsorbed atoms. LEED is very sensitive to cleanliness of the surface and small amounts of contaminant can produce quite spurious results. Experiments conducted on clean, perfect surfaces can produce a large amount of structural information of high precision. Obviously it is only possible to produce precise data for surfaces which are well defined in the first place.

1,068 citations


Journal ArticleDOI
TL;DR: In southern Italy, 44 contacts of hepatitis B virus carriers, including infants of carrier mothers, became HBsAg positive despite passive and active immunisation according to standard protocols, and infection was confirmed by the presence of additional markers of viral replication.

1,043 citations


Journal ArticleDOI
01 Feb 1990-Brain
TL;DR: The results suggest diminished neuronal firing or decreased dendritic synaptic density with age, as well as a strict coupling between CMRO2 and CBF, and between CBF and CBV was found, while OER was constant and independent ofCBF andCMRO2.
Abstract: Regional cerebral blood flow (CBF), oxygen extraction ratio (OER), oxygen utilization (CMRO2) and blood volume (CBV) were measured in a group of 34 healthy volunteers (age range 22–82 yrs) using the 15O steady-state inhalation method and positron emission tomography. Between subjects CBF correlated positively with CMRO2, although the interindividual variability of the measured values was large. OER was not dependent on CMRO2, but highly negatively correlated with CBF. CBV correlated positively with CBF. When considering the values of all the regions of interest within a single subject, a strict coupling between CMRO2 and CBF, and between CBF and CBV was found, while OER was constant and independent of CBF and CMRO2. In ‘pure’ grey and white matter regions CMRO2, CBF and CBV decreased with age approximately 0.50% per year. In other regions the decline was less evident, most likely due to partial volume effects. OER did not change or showed a slight increase with age (maximum in the grey matter region 0.35%/yr). The results suggest diminished neuronal firing or decreased dendritic synaptic density with age.

1,038 citations


Journal ArticleDOI
TL;DR: The use of the Gibbs sampler as a method for calculating Bayesian marginal posterior and predictive densities is reviewed and illustrated with a range of normal data models, including variance components, unordered and ordered means, hierarchical growth curves, and missing data in a crossover trial.
Abstract: The use of the Gibbs sampler as a method for calculating Bayesian marginal posterior and predictive densities is reviewed and illustrated with a range of normal data models, including variance components, unordered and ordered means, hierarchical growth curves, and missing data in a crossover trial. In all cases the approach is straightforward to specify distributionally and to implement computationally, with output readily adapted for required inference summaries.

1,020 citations


Journal ArticleDOI
TL;DR: The receding horizon control strategy as mentioned in this paper provides a relatively simple method for determining feedback control for linear or nonlinear systems and is especially useful for the control of slow non-linear systems, such as chemical batch processes, where it is possible to solve, sequentially, open-loop fixed-horizon, optimal control problems online.
Abstract: The receding horizon control strategy provides a relatively simple method for determining feedback control for linear or nonlinear systems. The method is especially useful for the control of slow nonlinear systems, such as chemical batch processes, where it is possible to solve, sequentially, open-loop fixed-horizon, optimal control problems online. The method has been shown to yield a stable closed-loop system when applied to time-invariant or time-varying linear systems. It is shown that the method also yields a stable closed-loop system when applied to nonlinear systems.
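As a hedged sketch of the idea (not the paper's formulation; the scalar plant and all numbers are invented for illustration), the receding horizon strategy can be mimicked for a linear system by solving a finite-horizon LQ problem at every step, via a backward Riccati recursion, and applying only the first control before re-solving:

```python
def first_step_gain(a, b, q, r, horizon):
    """Backward Riccati recursion for x+ = a*x + b*u with stage cost
    q*x^2 + r*u^2; returns the optimal feedback gain for the first step."""
    p = q          # terminal cost weight (an arbitrary choice here)
    k = 0.0
    for _ in range(horizon):
        k = a * b * p / (r + b * b * p)
        p = q + a * a * p - k * a * b * p
    return k

def receding_horizon(a, b, q, r, horizon, x0, steps):
    """Apply u = -k*x, where k solves the fixed-horizon problem afresh
    at every step (the receding horizon strategy)."""
    x, traj = x0, [x0]
    for _ in range(steps):
        k = first_step_gain(a, b, q, r, horizon)
        x = a * x + b * (-k * x)   # plant update using the first move only
        traj.append(x)
    return traj

# open-loop unstable plant (a > 1) stabilised by the receding horizon law
traj = receding_horizon(a=1.2, b=1.0, q=1.0, r=1.0, horizon=10, x0=1.0, steps=30)
```

For a time-invariant plant the gain is the same at every step, so re-solving is redundant here; the point of the strategy, as the abstract notes, is that the same recipe extends to time-varying and nonlinear problems where an open-loop solve is feasible online.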

926 citations


Journal ArticleDOI
20 Apr 1990-Science
TL;DR: A series of peptide derivatives based on the transition-state mimetic concept has been designed that inhibit the proteinase from the human immunodeficiency virus, and antiviral activity was observed in the nanomolar range in three different cell systems.
Abstract: A series of peptide derivatives based on the transition-state mimetic concept has been designed that inhibit the proteinase from the human immunodeficiency virus (HIV). The more active compounds inhibit both HIV-1 and HIV-2 proteinases in the nanomolar range with little effect at 10 micromolar against the structurally related human aspartic proteinases. Proteolytic cleavage of the HIV-1 gag polyprotein (p55) to the viral structural protein p24 was inhibited in chronically infected CEM cells. Antiviral activity was observed in the nanomolar range (with one compound active below 10 nanomolar) in three different cell systems, as assessed by p24 antigen and syncytium formation. Cytotoxicity was not detected at 10 and 5 micromolar in C8166 and JM cells, respectively, indicating a high therapeutic index for this new class of HIV proteinase inhibitors.

Journal ArticleDOI
TL;DR: A model for dynamic change management which separates structural concerns from component application concerns is presented and is applied to an example problem, 'evolving philosophers', which has been implemented and tested in the Conic environment for distributed systems.
Abstract: A model for dynamic change management which separates structural concerns from component application concerns is presented. This separation of concerns permits the formulation of general structural rules for change at the configuration level without the need to consider application state, and the specification of application component actions without prior knowledge of the actual structural changes which may be introduced. In addition, the changes can be applied in such a way so as to leave the modified system in a consistent state, and cause no disturbance to the unaffected part of the operational system. The model is applied to an example problem, 'evolving philosophers'. The principles of this model have been implemented and tested in the Conic environment for distributed systems.

Journal ArticleDOI
TL;DR: An account is given of work published during the past 10 years incriminating species of phlebotomine sandflies as vectors of Leishmania species which infect man.
Abstract: An account is given of work published during the past 10 years incriminating species of phlebotomine sandflies as vectors of Leishmania species which infect man. An assessment is made of the degrees of certainty of the vectorial roles of eighty-one species and subspecies of sandflies (thirty-seven Old World and forty-four New World) in the transmission of twenty-nine leishmanial parasites of mammals. At least one species of sandfly is considered to be a proven vector of each of ten parasites. Of the eighty-one sandfly taxa, evidence is judged to be sufficient to incriminate nineteen as proven vectors (eleven Phlebotomus species and eight Lutzomyia species or subspecies) and evidence for a further fourteen (nine Phlebotomus species and five Lutzomyia species or subspecies) is considered to be strong. The suggested criteria for incrimination of a vector are anthropophily and common infection with the same leishmanial parasite as that found in man in the same place. More weight should be given to natural infections persisting after the digestion of a bloodmeal than those in the presence of blood. Supporting evidence is a concordance in the distribution of the fly and the disease in man, proof that the fly feeds regularly on the reservoir host, a flourishing development of the parasite in infected flies and the experimental transmission of the parasite by the bite of the fly.

Journal ArticleDOI
TL;DR: A family of explicit Runge-Kutta formulas that contains imbedded formulas of all orders 1 through 4 is derived, which is very efficient for problems with smooth solution as well as problems having rapidly varying solutions.
Abstract: Explicit Runge-Kutta methods (RKMs) are among the most popular classes of formulas for the approximate numerical integration of nonstiff, initial value problems. However, high-order Runge-Kutta methods require more function evaluations per integration step than, for example, Adams methods used in PECE mode, and so, with RKMs, it is especially important to avoid rejected steps. Steps are often rejected when certain derivatives of the solutions are very large for part of the region of integration. This corresponds, for example, to regions where the solution has a sharp front or, in the limit, some derivative of the solution is discontinuous. In these circumstances the assumption that the local truncation error is changing slowly is invalid, and so any step-choosing algorithm is likely to produce an unacceptable step. In this paper we derive a family of explicit Runge-Kutta formulas. Each formula is very efficient for problems with smooth solutions as well as problems having rapidly varying solutions. Each member of this family consists of a fifth-order formula that contains imbedded formulas of all orders 1 through 4. By computing solutions at several different orders, it is possible to detect sharp fronts or discontinuities before all the function evaluations defining the full Runge-Kutta step have been computed. We can then either accept a lower order solution or abort the step, depending on which course of action seems appropriate. The efficiency of the new algorithm is demonstrated on the DETEST test set as well as on some difficult test problems with sharp fronts or discontinuities.
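The embedding idea can be sketched with the simplest possible pair (Euler inside Heun; this illustrates the mechanism only, not the fifth-order family derived in the paper): the difference between the two orders gives a free local error estimate that drives step acceptance.

```python
import math

def step_heun_euler(f, t, y, h):
    """One step of an embedded Heun(order 2) / Euler(order 1) pair.
    Returns the order-2 solution and an error estimate from the
    difference between the two orders."""
    k1 = f(t, y)
    k2 = f(t + h, y + h * k1)
    y_low = y + h * k1                 # first-order (Euler) result
    y_high = y + h * (k1 + k2) / 2.0   # second-order (Heun) result
    return y_high, abs(y_high - y_low)

def integrate(f, t0, y0, t_end, tol=1e-7):
    """Adaptive driver: reject and halve the step when the estimate
    exceeds tol, grow the step modestly after an accepted step."""
    t, y = t0, y0
    h = (t_end - t0) / 100.0
    while t < t_end:
        h = min(h, t_end - t)
        y_new, err = step_heun_euler(f, t, y, h)
        if err <= tol:
            t, y = t + h, y_new        # accept the step
            h *= 1.5
        else:
            h *= 0.5                   # reject and retry with smaller h
    return y

# y' = y with y(0) = 1, so the exact answer at t = 1 is e
approx = integrate(lambda t, y: y, 0.0, 1.0, 1.0)
```

A production pair would also use the error estimate to choose the next step size proportionally rather than by fixed factors, but the accept/reject logic is the same.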

Journal ArticleDOI
TL;DR: An elevated level of lipop protein(a) is a strong risk factor for CHD in patients with familial hypercholesterolemia, and the increase in risk is independent of age, sex, smoking status, and serum levels of total cholesterol, triglyceride, or high-density lipoprotein cholesterol.
Abstract: Familial hypercholesterolemia carries a marked increase in the risk of coronary heart disease (CHD), but there is considerable variation between individuals in susceptibility to CHD. To investigate the possible role of lipoprotein(a) as a risk factor for CHD, we studied the association between serum lipoprotein(a) levels, genetic types of apolipoprotein(a) (which influence lipoprotein(a) levels), and CHD in 115 patients with heterozygous familial hypercholesterolemia. The median lipoprotein(a) level in the 54 patients with CHD was 57 mg per deciliter, which is significantly higher than the corresponding value of 18 mg per deciliter in the 61 patients without CHD. According to discriminant-function analysis, the lipoprotein(a) level was the best discriminator between the two groups (as compared with all other lipid and lipoprotein levels, age, sex, and smoking status). Phenotyping for apolipoprotein(a) was performed in 109 patients. The frequencies of the apolipoprotein(a) phenotypes and alleles d...

Journal ArticleDOI
TL;DR: A general approach to hierarchical Bayes changepoint models is presented, including an application to changing regressions, changing Poisson processes and changing Markov chains, which avoids sophisticated analytic and numerical high dimensional integration procedures.
Abstract: SUMMARY A general approach to hierarchical Bayes changepoint models is presented. In particular, desired marginal posterior densities are obtained utilizing the Gibbs sampler, an iterative Monte Carlo method. This approach avoids sophisticated analytic and numerical high dimensional integration procedures. We include an application to changing regressions, changing Poisson processes and changing Markov chains. Within these contexts we handle several previously inaccessible problems.

Journal ArticleDOI
TL;DR: In this article, the use of high speed, high capacity vector computers allows the resultant finite-difference equations to be factored in-place, allowing inversions to be generated using data from a very large number of source positions.
Abstract: Frequency-domain methods are well suited to the imaging of wide-aperture cross-hole data. However, although the combination of the frequency domain with the wavenumber domain has facilitated the development of rapid algorithms, such as diffraction tomography, this has also required linearization with respect to homogeneous reference media. This restriction, and associated restrictions on source-receiver geometries, are overcome by applying inverse techniques that operate in the frequency-space domain. In order to incorporate the rigorous modelling technique of finite differences into the inverse procedure a nonlinear approach is used. To reduce computational costs the method of finite differences is applied directly to the frequency-domain wave equation. The use of high speed, high capacity vector computers allows the resultant finite-difference equations to be factored in-place. In this way wavefields can be computed for additional source positions at minimal extra cost, allowing inversions to be generated using data from a very large number of source positions. Synthetic studies show that where weak scatter approximations are valid, diffraction tomography performs slightly better than a single iteration of nonlinear inversion. However, if the background velocities increase systematically with depth, diffraction tomography is ineffective whereas nonlinear inversion yields useful images from one frequency component of the data after a single iteration. Further synthetic studies indicate the efficacy of the method in the time-lapse monitoring of injection fluids in tertiary hydrocarbon recovery projects.

Journal ArticleDOI
TL;DR: The authors studied the interrelationships in the strategic profile of a sample of small firms and, by using cross-sectional analysis, tried to identify any evidence to support the "stages of growth" theories.
Abstract: This paper studies the interrelationships in the strategic profile of a sample of small firms, and, by using cross-sectional analysis, attempts to identify any evidence to support the ‘stages of growth’ theories. Three surrogates for comparative growth were used in the analysis: number of employees, sales turnover, and profitability. A cluster analysis identified eight different ‘types’ of small firms characterized by ‘internal’ variables of ownership, management, and product structure; and by ‘external’ variables of product/market positioning. Analysis of variance tests found no significant differences between the clusters with regard to size. The results suggest that firms do change, but not necessarily in any prescribed sequence. Indeed, the evidence presented in this paper suggests that future research should be focused on developing theories which better describe the heterogeneity of the sector by analyzing the development within clusters of firms rather than seeking generalized overarching theories.

Journal ArticleDOI
TL;DR: A method, based on the use of proportions, for restricting weight flexibility in data envelopment analysis when the decision-making units being evaluated have multiple inputs and outputs is presented.
Abstract: In this paper we present a method, based on the use of proportions, for restricting weight flexibility in data envelopment analysis. This method is applicable when the decision-making units being evaluated have multiple inputs and outputs.
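A toy numerical sketch (the data, the two-output setup, and the grid search standing in for the linear program are all invented for illustration) shows how a proportion-based restriction on the ratio of output weights can change a DEA score: a unit that looks efficient under free weights loses its rating once the weight ratio is confined.

```python
def dea_efficiency(unit, units, ratio_bounds=None, grid=1000):
    """Toy single-input, two-output DEA score for `unit`.

    Efficiency = max over output weights (u1, u2), u1 + u2 = 1, of the
    unit's weighted-output/input ratio, normalised so the best unit under
    those weights scores 1. `ratio_bounds = (lo, hi)` restricts u1/u2,
    mimicking proportion-based weight restrictions; a grid search stands
    in for the linear program of the full method.
    """
    best = 0.0
    for i in range(grid + 1):
        u1 = i / grid
        u2 = 1.0 - u1
        if ratio_bounds is not None:
            lo, hi = ratio_bounds
            if u2 == 0.0 or not (lo <= u1 / u2 <= hi):
                continue  # this weight vector violates the restriction
        # normalise so the top-scoring unit gets exactly 1 at these weights
        scale = max((u1 * y1 + u2 * y2) / x for x, y1, y2 in units)
        x, y1, y2 = unit
        best = max(best, (u1 * y1 + u2 * y2) / (x * scale))
    return best

# (input, output1, output2) for three hypothetical decision-making units
units = [(1.0, 4.0, 1.0), (1.0, 1.0, 4.0), (1.0, 2.0, 2.0)]
free = dea_efficiency(units[1], units)                          # free weights
restricted = dea_efficiency(units[1], units, ratio_bounds=(2.0, 4.0))
```

Under free weights the second unit can place all weight on its strong second output and score 1; forcing u1/u2 into [2, 4] drops its score to about 2/3.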

Journal ArticleDOI
TL;DR: In this paper, a co-rotational formulation for three-dimensional beams is presented, in which both the internal force vector and tangent stiffness matrix are consistently derived from the adopted "strain measures".

Journal ArticleDOI
01 Jun 1990-Science
TL;DR: The data indicate that the APP gene is tightly linked to HCHWA-D and therefore, in contrast to familial Alzheimer's disease, cannot be excluded as the site of mutation in HCH WA-D.
Abstract: Human hereditary cerebral hemorrhage with amyloidosis of the Dutch type (HCHWA-D), an autosomal dominant form of cerebral amyloid angiopathy (CAA), is characterized by extensive amyloid deposition in the small leptomeningeal arteries and cortical arterioles, which lead to an early death of those afflicted in their fifth or sixth decade. Immunohistochemical and biochemical studies have indicated that the amyloid subunit in HCHWA-D is antigenically related to and homologous in sequence with the amyloid beta protein isolated from brains of patients with Alzheimer's disease and Down syndrome. The amyloid beta protein is encoded by the amyloid beta protein precursor (APP) gene located on chromosome 21. Restriction fragment length polymorphisms detected by the APP gene were used to examine whether this gene is a candidate for the genetic defect in HCHWA-D. The data indicate that the APP gene is tightly linked to HCHWA-D and therefore, in contrast to familial Alzheimer's disease, cannot be excluded as the site of mutation in HCHWA-D.

Journal ArticleDOI
18 May 1990-Cell
TL;DR: It is concluded that sorting of internalized receptor for degradation or recycling can occur through spatial segregation within the MVB, and sorting of EGF-R is controlled by tyrosine kinase activity.

Journal ArticleDOI
TL;DR: This analysis shows that forward running wavelets dominate during both the acceleration and deceleration phases of blood flow in the aorta, and is a time domain analysis which can be applied to nonperiodic or transient flow.
Abstract: The one-dimensional equations of flow in the elastic arteries are hyperbolic and admit nonlinear, wavelike solutions for the mean velocity, U, and the pressure, P. Neglecting dissipation, the solutions can be written in terms of wavelets defined as differences of the Riemann invariants across characteristics. This analysis shows that the product, dUdP, is positive definite for forward running wavelets and negative definite for backward running wavelets allowing the determination of the net magnitude and direction of propagating wavelets from pressure and velocity measured at a point in the artery. With the linearizing assumption that intersecting wavelets are additive, the forward and backward running wavelets can be separately calculated. This analysis, applied to measurements made in the ascending aorta of man, shows that forward running wavelets dominate during both the acceleration and deceleration phases of blood flow in the aorta. The forward and backward running waves calculated using the linearized analysis are similar to the results of an impedance analysis of the data. Unlike the impedance analysis, however, this is a time domain analysis which can be applied to nonperiodic or transient flow.
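The separation step can be sketched directly (a synthetic illustration; the density and wave-speed values are assumed, not taken from the paper): under the linearizing assumption the forward and backward pressure wavelets are dP± = (dP ± ρc·dU)/2, so a purely forward-running wave, for which dP = ρc·dU, yields zero backward components.

```python
import math

RHO = 1050.0  # assumed blood density, kg/m^3
C = 5.0       # assumed wave speed, m/s

def separate_wavelets(p, u, rho=RHO, c=C):
    """Linearized wavelet separation from sampled pressure p and velocity u:
    dP+ = (dP + rho*c*dU)/2 and dP- = (dP - rho*c*dU)/2."""
    dp = [b - a for a, b in zip(p, p[1:])]
    du = [b - a for a, b in zip(u, u[1:])]
    fwd = [(x + rho * c * y) / 2.0 for x, y in zip(dp, du)]
    bwd = [(x - rho * c * y) / 2.0 for x, y in zip(dp, du)]
    return fwd, bwd, dp, du

# synthetic purely forward-running wave: P and U locked by dP = rho*c*dU
u = [0.1 * math.sin(2.0 * math.pi * t / 50.0) for t in range(100)]
p = [RHO * C * ui for ui in u]
fwd, bwd, dp, du = separate_wavelets(p, u)
```

For this signal every product dU·dP is non-negative, matching the statement that dUdP is positive definite for forward-running wavelets; measured aortic data would contain a mixture and both lists would be non-trivial.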

Journal ArticleDOI
D. Decamp1, B. Deschizeaux1, J. P. Lees1, M-N Minard1  +471 moreInstitutions (24)
TL;DR: This paper presents a solution to support strategic processes in a PSEE by providing a flexible guidance during process enactment and shows that supporting processes is more concerned with the flexibility of guidance offered during the process performance than with enforcement of a collection of predefined process models.
Abstract: Process-centred Software Engineering Environments (PSEE) are the most recent generation of environments supporting software development activities. Most PSEEs are based on mechanisms promoting enforcement and automation of process activities, in which process models are prescribed in a detailed and complete way. But experience shows that supporting processes is more concerned with the flexibility of guidance offered during process performance than with the enforcement of a collection of predefined process models. In this paper, we present a solution to support strategic processes in a PSEE by providing flexible guidance during process enactment.

Journal ArticleDOI
26 Jul 1990-Nature
TL;DR: Fluorescence microscopy and video recording of living cells to trace the passage of ligand–receptor complexes has identified the endosomal compartment as an extensive network of tubular cisternae.
Abstract: Complexes of cell-surface receptors and their ligands are commonly internalized by endocytosis and enter a prelysosomal endosomal pathway for further processing. Fluorescence microscopy and video recording of living cells to trace the passage of ligand–receptor complexes has identified the endosomal compartment as an extensive network of tubular cisternae. Endocytosed material entering this reticulum enters discrete swellings, identified as multivesicular bodies by electron microscopy, which move along the reticulum towards the pericentriolar area.

Journal ArticleDOI
13 Sep 1990-Nature
TL;DR: The inheritance of five polymorphic DNA markers from the proximal long arm of chromosome 21 in a large unselected series of pedi-grees with familial Alzheimer's disease suggests that Alzheimer's Disease is not a single entity, but rather results from genetic defects on chromosome 21 and from other genetic or nongenetic factors.
Abstract: Alzheimer's disease, a fatal neurodegenerative disorder of unknown aetiology, is usually considered to be a single disorder because of the general uniformity of the disease phenotype. Two recent genetic linkage studies revealed co-segregation of familial Alzheimer disease with the D21S1/S11 and D21S16 loci on chromosome 21. But two other studies, one of predominantly multiplex kindreds with a late age-of-onset, the other of a cadre of kindreds with a unique Volga German ethnic origin, found absence of linkage at least to D21S1/S11. So far it has not been possible to discern whether these conflicting reports reflect aetiological heterogeneity, differences in methods of pedigree selection, effects of confounding variables in the analysis (for example, diagnostic errors, assortative matings), or true non-replication. To resolve this issue, we have now examined the inheritance of five polymorphic DNA markers from the proximal long arm of chromosome 21 in a large unselected series of pedigrees with familial Alzheimer's disease. Our data suggest that Alzheimer's disease is not a single entity, but rather results from genetic defects on chromosome 21 and from other genetic or nongenetic factors.

Journal ArticleDOI
TL;DR: Two important shortcomings of traditional plant demography are emphasized; the dearth of simple manipulative experiments on such issues as seed limitation, and the tendency to locate study plots around existing mature individuals (the omission of ‘empty quadrats’ may introduce serious bias into the estimation of plant recruitment rates).
Abstract: Long-term studies of plant populations are reviewed, and their dynamics summarized in three categories. Many short-lived plants have ephemeral, pulsed dynamics lasting only a single generation, with recruitment determined almost entirely by germination biology and by the frequency and intensity of disturbance. Such populations are not amenable to traditional population models. At the other extreme, some long-lived plants have such protracted tenancy of their microsites that it is impossible to establish what pattern of dynamics (if any) their populations exhibit. A relatively small number of species show what we would traditionally regard as population dynamics at a given point in space (i.e. reasonably predictable trajectories that can be modelled by N_{t+1} = f(N_t)). A major difficulty in generalizing about plant dynamics is that the majority of species are successional; their recruitment depends upon the death, through senescence or disturbance, of the dominant plants. Where we do have data spanning several generations, it is clear that: (i) the populations are regulated by density dependent processes; (ii) in contrast to some animal populations, numbers appear to vary less from year to year in places where mean density is higher, and less from place to place in years when mean density is high than when density is low; (iii) few, if any, plant populations show persistent cyclic or chaotic dynamics, but (iv) there are several robust generalizations that stem from the immobility and phenotypic plasticity of plants (the law of constant yield; self-thinning rules, etc.). These generalizations are analysed in the context of simple theoretical models of plant dynamics, and the patterns observed in long-term studies are compared with similar data from animal populations.
Two important shortcomings of traditional plant demography are emphasized; (i) the dearth of simple manipulative experiments on such issues as seed limitation, and (ii) the tendency to locate study plots around existing mature individuals (the omission of ‘empty quadrats’ may introduce serious bias into the estimation of plant recruitment rates).
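The recurrence N_{t+1} = f(N_t) mentioned in the abstract can be illustrated with a standard density-dependent choice, the Ricker map (an illustrative model, not one fitted in the review): for modest growth rates the population is regulated to its carrying capacity.

```python
import math

def ricker(n, r, k):
    """Ricker map: N_{t+1} = N_t * exp(r * (1 - N_t / K)), a classic
    density-dependent form of the recurrence N_{t+1} = f(N_t)."""
    return n * math.exp(r * (1.0 - n / k))

def trajectory(n0, r, k, steps):
    """Iterate the map from n0, returning the whole trajectory."""
    ns = [n0]
    for _ in range(steps):
        ns.append(ricker(ns[-1], r, k))
    return ns

# modest growth rate: density dependence regulates N toward K = 100
settled = trajectory(10.0, 0.5, 100.0, 200)[-1]
```

For 0 < r < 2 the equilibrium N = K is stable; larger r produces the cycles and chaos that, as the review notes, real plant populations rarely seem to exhibit.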

Journal ArticleDOI
TL;DR: In this article, a detailed study of the methods of analysing the experimental data obtained from fracture mechanics tests using double-cantilever beam, end loaded split and end notched flexure specimens.
Abstract: One of the most important mechanical properties of a fibre-polymer composite is its resistance to delamination. The presence of delaminations may lead not only to complete fracture but even partial delaminations will lead to a loss of stiffness, which can be a very important design consideration. Because delamination may be regarded as crack propagation then an obvious scheme for characterizing this phenomenon has been via a fracture mechanics approach. There is, therefore, an extensive literature on the use of fracture mechanics to ascertain the interlaminar fracture energies, Gc, for various fibre-polymer composites using different test geometries to yield mode I, mode II and mixed mode I/II values of Gc. Nevertheless, problems of consistency and discussions on the accuracy of such results abound. This paper describes a detailed study of the methods of analysing the experimental data obtained from fracture mechanics tests using double-cantilever beam, end loaded split and end notched flexure specimens. It is shown that to get consistent and accurate values of Gc it is necessary to consider aspects of the tests such as the end rotation and deflection of the crack tip, the effective shortening of the beam due to large displacements of the arms, and the stiffening of the beam due to the presence of the end blocks bonded to the specimens. Analytical methods for ascertaining the various correction constants and factors are described and are successfully applied to the results obtained from three different fibre-polymer composites. These composites exhibit different types of fracture behaviour and illustrate the wide range of effects that must be considered when values of the interlaminar fracture energies, free from artefacts from the test method and the analysis method, are required.

Journal ArticleDOI
29 Nov 1990
TL;DR: Various approaches to estimating what the total number of species on Earth might be are outlined: these approaches include extrapolation of past trends; direct assessments based on the overall fraction previously recorded among newly studied groups of tropical insects; indirect assessment derived from recent studies of arthropods in the canopies of tropical trees.
Abstract: This paper begins with a survey of the patterns in discovering and recording species of animals and plants, from Linnaeus' time to the present. It then outlines various approaches to estimating what the total number of species on Earth might be: these approaches include extrapolation of past trends; direct assessments based on the overall fraction previously recorded among newly studied groups of tropical insects; indirect assessment derived from recent studies of arthropods in the canopies of tropical trees (giving special attention to the question of what fraction of the species found on a given host-tree are likely to be ‘effectively specialized’ on it); and estimates inferred from theoretical and empirical patterns in species-size relations or in food web structure. I conclude with some remarks on the broader implications of our ignorance about how many species there are.

Journal ArticleDOI
TL;DR: The increased frequency of hirsutism in obese compared with lean women with PCOS is associated with increased bio‐availability of androgens to peripheral tissues and enhanced activity of 5α‐reductase in obese subjects.
Abstract: Two hundred and sixty-three women with ultrasound-diagnosed polycystic ovary syndrome were studied of whom 91 (35%) were obese (BMI greater than 25 kg/m2). Obese women with PCOS had a greater prevalence of hirsutism (73% compared with 56%) and menstrual disorders than non-obese subjects. Total testosterone and androstenedione concentrations in serum were similar in the two subgroups but SHBG concentrations were significantly lower, and free testosterone levels higher, in obese compared with lean subjects. In addition, concentrations of androsterone glucuronide, a marker of peripheral 5 alpha-reductase activity, were higher in obese than in non-obese women with PCOS. There were no significant correlations of either SHBG or free testosterone with androsterone glucuronide suggesting that obesity has independent effects on transport and on metabolism of androgen. There were no significant differences between the subgroups in either baseline gonadotrophin concentrations or the pulsatile pattern of LH and FSH secretion studied over an 8-h period. There was, however, an inverse correlation of FSH with BMI, but only in the obese subgroup. In conclusion, the increased frequency of hirsutism in obese compared with lean women with PCOS is associated with increased bio-availability of androgens to peripheral tissues and enhanced activity of 5 alpha-reductase in obese subjects. The mechanism underlying the higher prevalence of anovulation in obese women remains unexplained.