
Showing papers on "Population published in 2001"


Book
01 Jan 2001
TL;DR: This is the essential companion to Jeffrey Wooldridge's widely-used graduate text Econometric Analysis of Cross Section and Panel Data (MIT Press, 2001).
Abstract: The second edition of this acclaimed graduate text provides a unified treatment of two methods used in contemporary econometric research, cross section and panel data methods. By focusing on assumptions that can be given behavioral content, the book maintains an appropriate level of rigor while emphasizing intuitive thinking. The analysis covers both linear and nonlinear models, including models with dynamics and/or individual heterogeneity. In addition to general estimation frameworks (particularly method of moments and maximum likelihood), specific linear and nonlinear methods are covered in detail, including probit and logit models and their multivariate extensions, Tobit models, models for count data, censored and missing data schemes, causal (or treatment) effects, and duration analysis. Econometric Analysis of Cross Section and Panel Data was the first graduate econometrics text to focus on microeconomic data structures, allowing assumptions to be separated into population and sampling assumptions. This second edition has been substantially updated and revised. Improvements include a broader class of models for missing data problems; more detailed treatment of cluster problems, an important topic for empirical researchers; expanded discussion of "generalized instrumental variables" (GIV) estimation; new coverage (based on the author's own recent research) of inverse probability weighting; a more complete framework for estimating treatment effects with panel data; and a firmly established link between econometric approaches to nonlinear panel data and the "generalized estimating equation" literature popular in statistics and other fields. New attention is given to explaining when particular econometric methods can be applied; the goal is not only to tell readers what does work, but why certain "obvious" procedures do not. The numerous included exercises, both theoretical and computer-based, allow the reader to extend methods covered in the text and discover new insights.

28,298 citations


Journal ArticleDOI
TL;DR: This study provides a potential standardized definition for frailty in community-dwelling older adults and offers concurrent and predictive validity for the definition, and finds that there is an intermediate stage identifying those at high risk of frailty.
Abstract: Background: Frailty is considered highly prevalent in old age and to confer high risk for falls, disability, hospitalization, and mortality. Frailty has been considered synonymous with disability, comorbidity, and other characteristics, but it is recognized that it may have a biologic basis and be a distinct clinical syndrome. A standardized definition has not yet been established. Methods: To develop and operationalize a phenotype of frailty in older adults and assess concurrent and predictive validity, the study used data from the Cardiovascular Health Study. Participants were 5,317 men and women 65 years and older (4,735 from an original cohort recruited in 1989-90 and 582 from an African American cohort recruited in 1992-93). Both cohorts received almost identical baseline evaluations and 7 and 4 years of follow-up, respectively, with annual examinations and surveillance for outcomes including incident disease, hospitalization, falls, disability, and mortality. Results: Frailty was defined as a clinical syndrome in which three or more of the following criteria were present: unintentional weight loss (10 lbs in past year), self-reported exhaustion, weakness (grip strength), slow walking speed, and low physical activity. The overall prevalence of frailty in this community-dwelling population was 6.9%; it increased with age and was greater in women than men. Four-year incidence was 7.2%. Frailty was associated with being African American, having lower education and income, poorer health, and having higher rates of comorbid chronic diseases and disability. There was overlap, but not concordance, in the co-occurrence of frailty, comorbidity, and disability. This frailty phenotype was independently predictive (over 3 years) of incident falls, worsening mobility or ADL disability, hospitalization, and death, with hazard ratios ranging from 1.82 to 4.46, unadjusted, and 1.29 to 2.24, adjusted for a number of health, disease, and social characteristics predictive of 5-year mortality. Intermediate frailty status, as indicated by the presence of one or two criteria, showed intermediate risk of these outcomes as well as increased risk of becoming frail over 3-4 years of follow-up (odds ratios for incident frailty = 4.51 unadjusted and 2.63 adjusted for covariates, compared to those with no frailty criteria at baseline). Conclusions: This study provides a potential standardized definition for frailty in community-dwelling older adults and offers concurrent and predictive validity for the definition. It also finds that there is an intermediate stage identifying those at high risk of frailty. Finally, it provides evidence that frailty is not synonymous with either comorbidity or disability, but comorbidity is an etiologic risk factor for, and disability is an outcome of, frailty. This provides a potential basis for clinical assessment for those who are frail or at risk, and for future research to develop interventions for frailty based on a standardized ascertainment of frailty.
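
To make the classification rule above concrete, the sketch below counts the five criteria and assigns frail / intermediate / non-frail status. It is a minimal illustration only: the numeric thresholds used here are hypothetical placeholders, whereas the study's actual cut points for grip strength, walking speed, and activity were quintile-based and stratified by sex and body size.

```python
# Illustrative sketch of the frailty phenotype classification rule:
# three or more criteria = frail, one or two = intermediate, none = not frail.
# Thresholds below are hypothetical placeholders, not the study's exact cut points.

def frailty_status(weight_loss_lbs_past_year, exhausted, grip_strength_kg,
                   walk_speed_m_per_s, activity_kcal_per_week,
                   grip_cutoff=20.0, speed_cutoff=0.65, activity_cutoff=270.0):
    criteria = [
        weight_loss_lbs_past_year >= 10,            # unintentional weight loss
        exhausted,                                  # self-reported exhaustion
        grip_strength_kg < grip_cutoff,             # weakness
        walk_speed_m_per_s < speed_cutoff,          # slow walking speed
        activity_kcal_per_week < activity_cutoff,   # low physical activity
    ]
    n = sum(criteria)
    if n >= 3:
        return "frail", n
    if n >= 1:
        return "intermediate", n
    return "not frail", n

print(frailty_status(12, True, 18.0, 0.8, 500))  # ('frail', 3)
```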

16,255 citations


Book
13 Dec 2001
TL;DR: The Content Analysis Guidebook provides an accessible core text for upper-level undergraduates and graduate students across the social sciences that unravels the complicated aspects of content analysis.
Abstract: List of Boxes List of Tables and Figures Foreword Acknowledgments 1. Defining Content Analysis Is Content Analysis "Easy"? Is It Something That Anyone Can Do? A Six-Part Definition of Content Analysis 2. Milestones in the History of Content Analysis The Growing Popularity of Content Analysis Milestones of Content Analysis Research 3. Beyond Description: An Integrative Model of Content Analysis The Language of the Scientific Method How Content Analysis Is Done: Flowchart for the Typical Process of Content-Analysis Research Approaches to Content Analysis The Integrative Model of Content Analysis Evaluation With the Integrative Model of Content Analysis 4. Message Units and Sampling Units Defining the Population Archives Medium Management Sampling Sample Size 5. Variables and Predictions Identifying Critical Variables Hypotheses, Predictions, and Research Questions 6. Measurement Techniques Defining Measurement Validity, Reliability, Accuracy, and Precision Types of Validity Assessment Operationalization Computer Coding Selection of a Computer Text Content Analysis Program Human Coding Index Construction in Content Analysis 7. Reliability Intercoder Reliability Standards and Practices Issues in the Assessment of Reliability Pilot and Final Reliabilities Intercoder Reliability Coefficients: Issues and Comparisons Calculating Intercoder Reliability Coefficients Treatment of Variables That Do Not Achieve an Acceptable Level of Reliability The Use of Multiple Coders Advanced and Specialty Issues in Reliability Coefficient Selection 8. Results and Reporting Data Handling and Transformations Hypothesis Testing Selecting the Appropriate Statistical Tests Frequencies Co-Occurrences and In-Context Occurrences Time Lines Bivariate Relationships Multivariate Relationships 9. Contexts Psychometric Applications of Content Analysis Open-Ended Written and Pictorial Responses Linguistics and Semantic Networks Stylometrics and Computer Literary Analysis Interaction Analysis Other Interpersonal Behaviors Violence in the Media Gender Roles Minority Portrayals Advertising News Political Communication Web Analyses Other Applied Contexts Commercial and Other Client-Based Applications of Content Analysis Future Directions Resource 1: Message Archives - P.D. Skalski General Collections Film, Television and Radio Archives Literary and General Corpora Other Archives Resource 2: Using NEXIS for Text Acquisition for Content Analysis Resource 3: Computer Content Analysis Software - P.D. Skalski Part I. Quantitative Computer Text Analysis Programs Part II. VBPro How-To Guide and Executional Flowchart Resource 4: An Introduction to PRAM--A Program for Reliability Assessment With Multiple Coders Resource 5: The Content Analysis Guidebook Online Content Analysis Resources Bibliographies Message Archives and Corpora Reliability Human Coding Sample Materials Computer Content Analysis References Author Index Subject Index About the Authors

7,877 citations


Journal ArticleDOI
TL;DR: A new statistical method is presented, applicable to genotype data at linked loci from a population sample, that improves substantially on current algorithms and performs well in absolute terms, suggesting that reconstructing haplotypes experimentally or by genotyping additional family members may be an inefficient use of resources.
Abstract: Current routine genotyping methods typically do not provide haplotype information, which is essential for many analyses of fine-scale molecular-genetics data. Haplotypes can be obtained, at considerable cost, experimentally or (partially) through genotyping of additional family members. Alternatively, a statistical method can be used to infer phase and to reconstruct haplotypes. We present a new statistical method, applicable to genotype data at linked loci from a population sample, that improves substantially on current algorithms; often, error rates are reduced by >50%, relative to its nearest competitor. Furthermore, our algorithm performs well in absolute terms, suggesting that reconstructing haplotypes experimentally or by genotyping additional family members may be an inefficient use of resources.

7,482 citations


Journal ArticleDOI
Pavel Kroupa
TL;DR: In this paper, the uncertainty inherent in any observational estimate of the IMF is investigated by studying the scatter introduced by Poisson noise and the dynamical evolution of star clusters, and it is found that this apparent scatter reproduces quite well the observed scatter in power-law index determinations, thus defining the fundamental limit within which any true variation becomes undetectable.
Abstract: A universal initial mass function (IMF) is not intuitive, but so far no convincing evidence for a variable IMF exists. The detection of systematic variations of the IMF with star-forming conditions would be the Rosetta Stone for star formation. In this contribution an average or Galactic-field IMF is defined, stressing that there is evidence for a change in the power-law index at only two masses: near 0.5 M⊙ and near 0.08 M⊙. Using this supposed universal IMF, the uncertainty inherent in any observational estimate of the IMF is investigated by studying the scatter introduced by Poisson noise and the dynamical evolution of star clusters. It is found that this apparent scatter reproduces quite well the observed scatter in power-law index determinations, thus defining the fundamental limit within which any true variation becomes undetectable. The absence of evidence for a variable IMF means that any true variation of the IMF in well-studied populations must be smaller than this scatter. Determinations of the power-law indices α are subject to systematic errors arising mostly from unresolved binaries. The systematic bias is quantified here, with the result that the single-star IMFs for young star clusters are systematically steeper by Δα≈0.5 between 0.1 and 1 M⊙ than the Galactic-field IMF, which is populated by, on average, about 5-Gyr-old stars. The MFs in globular clusters appear to be, on average, systematically flatter than the Galactic-field IMF (Piotto & Zoccali; Paresce & De Marchi), and the recent detection of ancient white-dwarf candidates in the Galactic halo and the absence of associated low-mass stars (Ibata et al.; Mendez & Minniti) suggest a radically different IMF for this ancient population. Star formation in higher metallicity environments thus appears to produce relatively more low-mass stars. While still tentative, this is an interesting trend, being consistent with a systematic variation of the IMF as expected from theoretical arguments.
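
A worked sketch of the kind of two-break power-law IMF described above, with index changes near 0.08 and 0.5 M⊙. The segment indices used below (0.3, 1.3, 2.3) are the commonly quoted Kroupa values and are an assumption for illustration, not taken from this abstract; the mass limits are likewise illustrative.

```python
import numpy as np

# Sketch of a broken power-law IMF xi(m) ∝ m^-alpha with index changes near
# 0.08 and 0.5 solar masses. Alphas and mass limits are illustrative assumptions.
BREAKS = [0.01, 0.08, 0.5, 150.0]   # segment edges in solar masses
ALPHAS = [0.3, 1.3, 2.3]            # power-law index within each segment

def imf(m):
    """Un-normalised xi(m), kept continuous across the segment breaks."""
    xi, scale = np.zeros_like(m, dtype=float), 1.0
    for lo, hi, a in zip(BREAKS[:-1], BREAKS[1:], ALPHAS):
        inside = (m >= lo) & (m < hi)
        xi[inside] = scale * (m[inside] / lo) ** (-a)
        scale *= (hi / lo) ** (-a)   # match the next segment at its lower edge
    return xi

m = np.logspace(-2, 2, 5)            # 0.01 to 100 solar masses
print(dict(zip(m.round(3), imf(m).round(4))))
```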

6,784 citations


Journal ArticleDOI
TL;DR: In this paper, a method for estimating the effect of household economic status on educational outcomes without direct survey information on income or expenditures is proposed and defended, which uses an index based on household asset ownership indicators.
Abstract: This paper has an empirical and overtly methodological goal. The authors propose and defend a method for estimating the effect of household economic status on educational outcomes without direct survey information on income or expenditures. They construct an index based on indicators of household assets, solving the vexing problem of choosing the appropriate weights by allowing them to be determined by the statistical procedure of principal components. While the data for India cannot be used to compare alternative approaches, they use data from Indonesia, Nepal, and Pakistan, which have both expenditure and asset variables for the same households. With these data the authors show not only that there is a correspondence between a classification of households based on the asset index and consumption expenditures, but also that the evidence is consistent with the asset index being a better proxy for predicting enrollments - apparently less subject to measurement error for this purpose - than consumption expenditures. The relationship between household wealth and educational enrollment of children can be estimated without expenditure data. A method for doing so - which uses an index based on household asset ownership indicators - is proposed and defended in this paper. In India, children from the wealthiest households are over 30 percentage points more likely to be in school than those from the poorest households.
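
A minimal sketch of the weighting procedure described above: the household wealth index is scored from the first principal component of standardized asset-ownership indicators. The toy asset matrix below is invented for illustration; the paper's actual indicator list and survey data are not reproduced here.

```python
import numpy as np

# Asset-index sketch: weights from the first principal component of household
# asset indicators. Rows = households; columns = owns radio / TV / bicycle / land
# (a hypothetical toy matrix, not the paper's data).
assets = np.array([
    [1, 0, 1, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
    [1, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

standardized = (assets - assets.mean(axis=0)) / assets.std(axis=0)
cov = np.cov(standardized, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
weights = eigvecs[:, -1]            # first principal component (largest eigenvalue)
index = standardized @ weights      # household wealth index (sign is arbitrary;
                                    # flip it if wealthier households score lower)
print(np.round(index, 2))           # rank households from poorest to richest
```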

4,661 citations


Journal Article
TL;DR: A new individual entering a population may be said to have a reproductive probability distribution as discussed by the authors, where the reproductive probability is zero from zygote to reproductive maturity, i.e., the individual will have no reproductive capability from birth to maturity.
Abstract: A new individual entering a population may be said to have a reproductive probability distribution. The reproductive probability is zero from zygote to reproductive maturity. Later, perhaps shortly...

3,800 citations


Journal ArticleDOI
13 Apr 2001-Science
TL;DR: Should past dependences of the global environmental impacts of agriculture on human population and consumption continue, 10^9 hectares of natural ecosystems would be converted to agriculture by 2050, accompanied by 2.4- to 2.7-fold increases in nitrogen- and phosphorus-driven eutrophication of terrestrial, freshwater, and near-shore marine ecosystems.
Abstract: During the next 50 years, which is likely to be the final period of rapid agricultural expansion, demand for food by a wealthier and 50% larger global population will be a major driver of global environmental change. Should past dependences of the global environmental impacts of agriculture on human population and consumption continue, 10^9 hectares of natural ecosystems would be converted to agriculture by 2050. This would be accompanied by 2.4- to 2.7-fold increases in nitrogen- and phosphorus-driven eutrophication of terrestrial, freshwater, and near-shore marine ecosystems, and comparable increases in pesticide use. This eutrophication and habitat destruction would cause unprecedented ecosystem simplification, loss of ecosystem services, and species extinctions. Significant scientific advances and regulatory, technological, and policy changes are needed to control the environmental impacts of agricultural expansion.

3,606 citations


Journal ArticleDOI
TL;DR: The number of people exposed to environmental tobacco smoke in California seems to have decreased over the same time period, where exposure is determined by the reported time spent with a smoker.
Abstract: Because human activities impact the timing, location, and degree of pollutant exposure, they play a key role in explaining exposure variation. This fact has motivated the collection of activity pattern data for their specific use in exposure assessments. The largest of these recent efforts is the National Human Activity Pattern Survey (NHAPS), a 2-year probability-based telephone survey (n = 9386) of exposure-related human activities in the United States (U.S.) sponsored by the U.S. Environmental Protection Agency (EPA). The primary purpose of NHAPS was to provide comprehensive and current exposure information over broad geographical and temporal scales, particularly for use in probabilistic population exposure models. NHAPS was conducted on a virtually daily basis from late September 1992 through September 1994 by the University of Maryland's Survey Research Center using a computer-assisted telephone interview instrument (CATI) to collect 24-h retrospective diaries and answers to a number of personal and exposure-related questions from each respondent. The resulting diary records contain beginning and ending times for each distinct combination of location and activity occurring on the diary day (i.e., each microenvironment). Between 340 and 1713 respondents of all ages were interviewed in each of the 10 EPA regions across the 48 contiguous states. Interviews were completed in 63% of the households contacted. NHAPS respondents reported spending an average of 87% of their time in enclosed buildings and about 6% of their time in enclosed vehicles. These proportions are fairly constant across the various regions of the U.S. and Canada and for the California population between the late 1980s, when the California Air Resources Board (CARB) sponsored a state-wide activity pattern study, and the mid-1990s, when NHAPS was conducted. However, the number of people exposed to environmental tobacco smoke (ETS) in California seems to have decreased over the same time period, where exposure is determined by the reported time spent with a smoker. In both California and the entire nation, the most time spent exposed to ETS was reported to take place in residential locations.

3,400 citations


Journal ArticleDOI
TL;DR: Estimates of the number of new cancer cases and deaths expected in the US in the current year and the most recent data on cancer incidence, mortality, and survival reveal large disparities in cancer incidence and mortality across racial/ethnic groups.
Abstract: Each year the American Cancer Society compiles estimates of the number of new cancer cases and deaths expected in the US in the current year and the most recent data on cancer incidence, mortality, and survival. An estimated 1,268,000 new cases of cancer will be diagnosed in the year 2001 and an estimated 553,400 Americans will die from cancer. Overall cancer incidence and death rates have continued to decrease in men and women since the early 1990s, and the decline in overall cancer mortality has been greater in recent years. Despite reductions in age-adjusted rates of cancer death, the total number of recorded cancer deaths in the US continues to increase, due to an aging and expanding population. Large disparities in cancer incidence and mortality across racial/ethnic groups continue. Black men and women experience higher incidence of cancer and poorer survival than white men and women. The disparity in survival reflects both diagnosis of cancer at later disease stages, and poorer survival within each stage of diagnosis.

3,346 citations


Journal ArticleDOI
TL;DR: In this article, the authors track some of the major myths on driving forces of land cover change and propose alternative pathways of change that are better supported by case study evidence, concluding that neither population nor poverty alone constitute the sole and major underlying causes of land-cover change worldwide.
Abstract: Common understanding of the causes of land-use and land-cover change is dominated by simplifications which, in turn, underlie many environment-development policies. This article tracks some of the major myths on driving forces of land-cover change and proposes alternative pathways of change that are better supported by case study evidence. Cases reviewed support the conclusion that neither population nor poverty alone constitute the sole and major underlying causes of land-cover change worldwide. Rather, peoples’ responses to economic opportunities, as mediated by institutional factors, drive land-cover changes. Opportunities and

Book ChapterDOI
TL;DR: The most consistent and pervasive effect is an increase in impervious surface cover within urban catchments, which alters the hydrology and geomorphology of streams and results in predictable changes in stream habitat, as discussed by the authors.
Abstract: The world’s population is concentrated in urban areas. This change in demography has brought landscape transformations that have a number of documented effects on stream ecosystems. The most consistent and pervasive effect is an increase in impervious surface cover within urban catchments, which alters the hydrology and geomorphology of streams. This results in predictable changes in stream habitat. In addition to imperviousness, runoff from urbanized surfaces as well as municipal and industrial discharges result in increased loading of nutrients, metals, pesticides, and other contaminants to streams. These changes result in consistent declines in the richness of algal, invertebrate, and fish communities in urban streams. Although understudied in urban streams, ecosystem processes are also affected by urbanization. Urban streams represent opportunities for ecologists interested in studying disturbance and contributing to more effective landscape management.

Journal ArticleDOI
TL;DR: The proportion of firms paying cash dividends falls from 66.5% in 1978 to 20.8% in 1999, due in part to the changing characteristics of publicly traded firms as discussed by the authors.

Book
30 Mar 2001
TL;DR: This book presents continuous and discrete single-species population models, models for interacting species, and structured population models, including epidemic and age-structured models.
Abstract: Preface * Acknowledgments * Prologue * Part I: Simple Single-Species Models * 1 Continuous Population Models * 2 Discrete Population Models * 3 Continuous Single-Species Population Models with Delays * Part II: Models for Interacting Species * 4 Introduction and Mathematical Preliminaries * 5 Continuous Models for Two Interacting Populations * 6 Harvesting in Two-Species Models * Part III: Structured Population Models * 7 Basic Ideas of Mathematical Epidemiology * 8 Models for Populations with Age Structure * Epilogue * Answers to Selected Exercises * References
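
To make the contrast between the continuous and discrete single-species models of Chapters 1 and 2 concrete, here is a minimal sketch of the logistic model in both forms; the growth rate, carrying capacity, and step sizes are arbitrary illustrative values, not taken from the book.

```python
# Minimal sketch contrasting a continuous and a discrete single-species
# logistic model; r, K, and the time step are arbitrary illustrative values.
r, K = 0.5, 100.0

# Continuous: dN/dt = r N (1 - N/K), integrated with a simple Euler step.
def continuous_logistic(n0, t_end, dt=0.01):
    n, t = n0, 0.0
    while t < t_end:
        n += dt * r * n * (1 - n / K)
        t += dt
    return n

# Discrete: N_{t+1} = N_t + r N_t (1 - N_t / K).
def discrete_logistic(n0, steps):
    n = n0
    for _ in range(steps):
        n = n + r * n * (1 - n / K)
    return n

# Both trajectories approach the carrying capacity K.
print(round(continuous_logistic(10, 20), 1), round(discrete_logistic(10, 20), 1))
```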

Journal ArticleDOI
TL;DR: In this article, a large-scale CO survey of the first and second Galactic quadrants and the nearby molecular cloud complexes in Orion and Taurus, obtained with the CfA 1.2 m telescope, was combined with 31 other surveys obtained over the past two decades with that instrument and a similar telescope on Cerro Tololo in Chile, to produce a new composite CO survey.
Abstract: New large-scale CO surveys of the first and second Galactic quadrants and the nearby molecular cloud complexes in Orion and Taurus, obtained with the CfA 1.2 m telescope, have been combined with 31 other surveys obtained over the past two decades with that instrument and a similar telescope on Cerro Tololo in Chile, to produce a new composite CO survey of the entire Milky Way. The survey consists of 488,000 spectra that sample the entire Galactic plane, at Nyquist or beamwidth spacing, over a strip 4°-10° wide in latitude, and nearly all large local clouds at higher latitudes at beamwidth or coarser spacing. Compared with the previous composite CO survey of Dame et al. (1987), the new survey has 16 times more spectra, up to 3.4 times higher angular resolution, and up to 10 times higher sensitivity per unit solid angle. Each of the component surveys was integrated individually using clipping or moment masking to produce composite spatial and longitude-velocity maps of the Galaxy that display nearly all of the statistically significant emission in each survey but little noise. The composite maps provide detailed information on individual molecular clouds, suggest relationships between clouds and regions widely separated on the sky, and clearly display the main structural features of the molecular Galaxy. In addition, since the gas, dust, and Population I objects associated with molecular clouds contribute to the Galactic emission in every major wavelength band, the precise kinematic information provided by the present survey will form the foundation for many large-scale Galactic studies. A map of molecular column density predicted from complete and unbiased far-infrared and 21 cm surveys of the Galaxy was used both to determine the completeness of the present survey and to extrapolate it to the entire sky. At higher latitudes (|b| > 5°), the CO-to-H2 conversion factor X shows little systematic variation with latitude from a mean value of (1.8 ± 0.3) × 10^20 cm^-2 K^-1 km^-1 s. Given the large sky area and large quantity of CO data analyzed, we conclude that this is the most reliable measurement to date of the mean X value in the solar neighborhood.
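
As a worked example of how the quoted X value is used, the snippet below converts a velocity-integrated CO intensity into an H2 column density under the standard assumption N(H2) = X · W_CO; the line-of-sight intensity chosen is hypothetical.

```python
# Worked example of the CO-to-H2 conversion implied by the X value above,
# assuming the standard relation N(H2) = X * W_CO.
X = 1.8e20          # cm^-2 (K km/s)^-1, the mean value quoted in the survey
W_CO = 5.0          # integrated CO intensity in K km/s (hypothetical sightline)
N_H2 = X * W_CO
print(f"N(H2) = {N_H2:.2e} cm^-2")   # 9.00e+20 cm^-2
```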

Journal ArticleDOI
27 Apr 2001-Science
TL;DR: The unprecedented rates of climate changes anticipated to occur in the future, coupled with land use changes that impede gene flow, can be expected to disrupt the interplay of adaptation and migration, likely affecting productivity and threatening the persistence of many species.
Abstract: Tree taxa shifted latitude or elevation range in response to changes in Quaternary climate. Because many modern trees display adaptive differentiation in relation to latitude or elevation, it is likely that ancient trees were also so differentiated, with environmental sensitivities of populations throughout the range evolving in conjunction with migrations. Rapid climate changes challenge this process by imposing stronger selection and by distancing populations from environments to which they are adapted. The unprecedented rates of climate changes anticipated to occur in the future, coupled with land use changes that impede gene flow, can be expected to disrupt the interplay of adaptation and migration, likely affecting productivity and threatening the persistence of many species.

Journal ArticleDOI
TL;DR: The discussion is confined to three elementary measures of cancer frequency: incidence, mortality, and prevalence.
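
To make the denominators of the three measures explicit, here is a small worked example with invented counts and person-time; the formulas are the standard epidemiological definitions, not figures from the article.

```python
# The three elementary frequency measures, computed from hypothetical counts.
new_cases        = 120          # cases newly diagnosed during the year
deaths           = 45           # deaths from the cancer during the year
existing_cases   = 800          # people alive with the cancer at a point in time
person_years     = 250_000      # person-time at risk during the year
population_size  = 260_000      # population at the prevalence date

incidence  = new_cases / person_years          # new cases per person-year
mortality  = deaths / person_years             # deaths per person-year
prevalence = existing_cases / population_size  # proportion alive with the disease

print(f"incidence  {incidence * 1e5:.1f} per 100,000 person-years")
print(f"mortality  {mortality * 1e5:.1f} per 100,000 person-years")
print(f"prevalence {prevalence * 100:.2f}%")
```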

Journal ArticleDOI
11 Jul 2001-JAMA
TL;DR: The sex-specific Framingham CHD prediction functions perform well among whites and blacks in different settings and can be applied to other ethnic groups after recalibration for differing prevalences of risk factors and underlying rates of CHD events.
Abstract: Context The Framingham Heart Study produced sex-specific coronary heart disease (CHD) prediction functions for assessing risk of developing incident CHD in a white middle-class population. Concern exists regarding whether these functions can be generalized to other populations. Objective To test the validity and transportability of the Framingham CHD prediction functions per a National Heart, Lung, and Blood Institute workshop organized for this purpose. Design, Setting, and Subjects Sex-specific CHD functions were derived from Framingham data for prediction of coronary death and myocardial infarction. These functions were applied to 6 prospectively studied, ethnically diverse cohorts (n = 23 424), including whites, blacks, Native Americans, Japanese American men, and Hispanic men: the Atherosclerosis Risk in Communities Study (1987-1988), Physicians' Health Study (1982), Honolulu Heart Program (1980-1982), Puerto Rico Heart Health Program (1965-1968), Strong Heart Study (1989-1991), and Cardiovascular Health Study (1989-1990). Main Outcome Measures The performance, or ability to accurately predict CHD risk, of the Framingham functions compared with the performance of risk functions developed specifically from the individual cohorts' data. Comparisons included evaluation of the equality of relative risks for standard CHD risk factors, discrimination, and calibration. Results For white men and women and for black men and women the Framingham functions performed reasonably well for prediction of CHD events within 5 years of follow-up. Among Japanese American and Hispanic men and Native American women, the Framingham functions systematically overestimated the risk of 5-year CHD events. After recalibration, taking into account different prevalences of risk factors and underlying rates of developing CHD, the Framingham functions worked well in these populations. Conclusions The sex-specific Framingham CHD prediction functions perform well among whites and blacks in different settings and can be applied to other ethnic groups after recalibration for differing prevalences of risk factors and underlying rates of CHD events.
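
A hedged sketch of the recalibration idea described in the conclusions: the Framingham regression coefficients are retained while the target population's mean risk-factor levels and baseline event-free survival are substituted. The coefficients, means, and survival values below are invented placeholders, not the published Framingham function.

```python
import math

# Sketch of recalibrating a Cox-type risk function: keep the original
# coefficients, swap in the target population's mean risk-factor levels and
# baseline event-free survival S0. All numbers are hypothetical.
betas    = {"age": 0.04, "sbp": 0.015, "chol": 0.01, "smoker": 0.5}
x_person = {"age": 55,   "sbp": 140,   "chol": 230,  "smoker": 1}

def predicted_risk(x, x_mean, s0):
    """Risk over the follow-up window: 1 - S0 ** exp(sum beta_k * (x_k - xbar_k))."""
    lp = sum(betas[k] * (x[k] - x_mean[k]) for k in betas)
    return 1 - s0 ** math.exp(lp)

framingham_means = {"age": 49, "sbp": 131, "chol": 221, "smoker": 0.38}
target_means     = {"age": 53, "sbp": 125, "chol": 205, "smoker": 0.20}

print(round(predicted_risk(x_person, framingham_means, s0=0.95), 3))  # original calibration
print(round(predicted_risk(x_person, target_means,     s0=0.97), 3))  # recalibrated to target cohort
```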

Journal ArticleDOI
TL;DR: In this article, the authors present a data set that improves the measurement of educational attainment for a broad group of countries, and they extend their previous estimates to 1995 for the population over ages 15 and 25.
Abstract: This paper presents a data set that improves the measurement of educational attainment for a broad group of countries. We extend our previous estimates to 1995 for educational attainment for the population over ages 15 and 25. We also provide projections for 2000. We discuss the estimation method for the measures of educational attainment and relate our estimates to alternative international measures of human capital stocks.

Journal ArticleDOI
TL;DR: The ability to quantify the variance of the human brain as a function of age in a large population of subjects for whom data is also available about their genetic composition and behaviour will allow for the first assessment of cerebral genotype-phenotype-behavioural correlations in humans to take place in a population this large.
Abstract: Motivated by the vast amount of information that is rapidly accumulating about the human brain in digital form, we embarked upon a program in 1992 to develop a four-dimensional probabilistic atlas and reference system for the human brain. Through an International Consortium for Brain Mapping (ICBM) a dataset is being collected that includes 7000 subjects between the ages of eighteen and ninety years and including 342 mono- and dizygotic twins. Data on each subject include detailed demographic, clinical, behavioural and imaging information. DNA has been collected for genotyping from 5800 subjects. A component of the programme uses post-mortem tissue to determine the probabilistic distribution of microscopic cyto- and chemoarchitectural regions in the human brain. This, combined with macroscopic information about structure and function derived from subjects in vivo, provides the first large scale opportunity to gain meaningful insights into the concordance or discordance in micro- and macroscopic structure and function. The philosophy, strategy, algorithm development, data acquisition techniques and validation methods are described in this report along with database structures. Examples of results are described for the normal adult human brain as well as examples in patients with Alzheimer's disease and multiple sclerosis. The ability to quantify the variance of the human brain as a function of age in a large population of subjects for whom data is also available about their genetic composition and behaviour will allow for the first assessment of cerebral genotype-phenotype-behavioural correlations in humans to take place in a population this large. This approach and its application should provide new insights and opportunities for investigators interested in basic neuroscience, clinical diagnostics and the evaluation of neuropsychiatric disorders in patients.

Journal ArticleDOI
TL;DR: A successful feeder-free hES culture system in which undifferentiated cells can be maintained for at least 130 population doublings and are suitable for scaleup production is demonstrated.
Abstract: Previous studies have shown that maintenance of undifferentiated human embryonic stem (hES) cells requires culture on mouse embryonic fibroblast (MEF) feeders. Here we demonstrate a successful feeder-free hES culture system in which undifferentiated cells can be maintained for at least 130 population doublings. In this system, hES cells are cultured on Matrigel or laminin in medium conditioned by MEF. The hES cells maintained on feeders or off feeders express integrin alpha6 and beta1, which may form a laminin-specific receptor. The hES cell populations in feeder-free conditions maintained a normal karyotype, stable proliferation rate, and high telomerase activity. Similar to cells cultured on feeders, hES cells maintained under feeder-free conditions expressed OCT-4, hTERT, alkaline phosphatase, and surface markers including SSEA-4, Tra 1-60, and Tra 1-81. In addition, hES cells maintained without direct feeder contact formed teratomas in SCID/beige mice and differentiated in vitro into cells from all three germ layers. Thus, the cells retain fundamental characteristics of hES cells in this culture system and are suitable for scaleup production.

Journal ArticleDOI
TL;DR: Screening different groups of elderly individuals in a general or specialty practice would be beneficial in detecting dementia, and persons with memory impairment who were not demented were characterized in the literature as having mild cognitive impairment.
Abstract: Article abstract—Objective: The goal of this project was to determine whether screening different groups of elderly individuals in a general or specialty practice would be beneficial in detecting dementia. Background: Epidemiologic studies of aging and dementia have demonstrated that the use of research criteria for the classification of dementia has yielded three groups of subjects: those who are demented, those who are not demented, and a third group of individuals who cannot be classified as normal or demented but who are cognitively (usually memory) impaired. Methods: The authors conducted computerized literature searches and generated a set of abstracts based on text and index words selected to reflect the key issues to be addressed. Articles were abstracted to determine whether there were sufficient data to recommend the screening of asymptomatic individuals. Other research studies were evaluated to determine whether there was value in identifying individuals who were memory-impaired beyond what one would expect for age but who were not demented. Finally, screening instruments and evaluation techniques for the identification of cognitive impairment were reviewed. Results: There were insufficient data to make any recommendations regarding cognitive screening of asymptomatic individuals. Persons with memory impairment who were not demented were characterized in the literature as having mild cognitive impairment. These subjects were at increased risk for developing dementia or AD when compared with similarly aged individuals in the general population. Recommendations: There were sufficient data to recommend the evaluation and clinical monitoring of persons with mild cognitive impairment due to their increased risk for developing dementia (Guideline). Screening instruments, e.g., Mini-Mental State Examination, were found to be useful to the clinician for assessing the degree of cognitive impairment (Guideline), as were neuropsychologic batteries (Guideline), brief focused cognitive instruments (Option), and certain structured informant interviews (Option). Increasing attention is being paid to persons with mild cognitive impairment for whom treatment options are being evaluated that may alter the rate of progression to dementia.

Journal ArticleDOI
12 Jul 2001-Headache
TL;DR: The prevalence, sociodemographic profile, and the burden of migraine in the United States in 1999 and to compare results with the original American Migraine Study, a 1989 population‐based study employing identical methods are described.
Abstract: Objective.—To describe the prevalence, sociodemographic profile, and the burden of migraine in the United States in 1999 and to compare results with the original American Migraine Study, a 1989 population-based study employing identical methods. Methods.—A validated, self-administered questionnaire was mailed to a sample of 20 000 households in the United States. Each household member with severe headache was asked to respond to questions about symptoms, frequency, and severity of headaches and about headache-related disability. Diagnostic criteria for migraine were based on those of the International Headache Society. This report is restricted to individuals 12 years and older. Results.—Of the 43 527 age-eligible individuals, 29 727 responded to the questionnaire for a 68.3% response rate. The prevalence of migraine was 18.2% among females and 6.5% among males. Approximately 23% of households contained at least one member suffering from migraine. Migraine prevalence was higher in whites than in blacks and was inversely related to household income. Prevalence increased from aged 12 years to about aged 40 years and declined thereafter in both sexes. Fifty-three percent of respondents reported that their severe headaches caused substantial impairment in activities or required bed rest. Approximately 31% missed at least 1 day of work or school in the previous 3 months because of migraine; 51% reported that work or school productivity was reduced by at least 50%. Conclusions.—Two methodologically identical national surveys in the United States conducted 10 years apart show that the prevalence and distribution of migraine have remained stable over the last decade. Migraine-associated disability remains substantial and pervasive. The number of migraineurs has increased from 23.6 million in 1989 to 27.9 million in 1999 commensurate with the growth of the population. Migraine is an important target for public health interventions because it is highly prevalent and disabling.

Journal ArticleDOI
TL;DR: This paper examined a model of dynamic price adjustment based on the assumption that information disseminates slowly throughout the population and found that the change in inflation is positively correlated with the level of economic activity.
Abstract: This paper examines a model of dynamic price adjustment based on the assumption that information disseminates slowly throughout the population. Compared to the commonly used sticky-price model, this sticky-information model displays three related properties that are more consistent with accepted views about the effects of monetary policy. First, disinflations are always contractionary (although announced disinflations are less contractionary than surprise ones). Second, monetary policy shocks have their maximum impact on inflation with a substantial delay. Third, the change in inflation is positively correlated with the level of economic activity.
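
The mechanism summarised above can be written as a price level that averages the desired prices expected by information vintages of different ages, p_t = lam * sum_j (1 - lam)^j * E_{t-j}[p*_t], with a fraction lam of firms updating each period. The sketch below evaluates that weighted average for a hypothetical lam and expectation series; it illustrates the aggregation only, not the paper's full model.

```python
# Minimal sketch of the sticky-information price level: a fraction lam of firms
# updates its information each period, so the price level is a geometrically
# weighted average of the desired price expected on past dates,
#     p_t = lam * sum_j (1 - lam)**j * E_{t-j}[p*_t].
# The lam value and the expectation series are hypothetical.
lam = 0.25
# E_{t-j}[p*_t] for j = 0, 1, 2, ...: older information sets expected a lower
# desired price (e.g. they had not yet observed a monetary expansion).
expected_desired_price = [1.00, 1.00, 0.95, 0.90, 0.90, 0.90]

weights = [lam * (1 - lam) ** j for j in range(len(expected_desired_price))]
weights[-1] /= lam   # treat all older vintages as sharing the oldest expectation, so weights sum to 1
p_t = sum(w * e for w, e in zip(weights, expected_desired_price))
print(round(p_t, 4))   # 0.9508: the price level adjusts only partway toward the new desired price
```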

Journal ArticleDOI
23 Mar 2001-Science
TL;DR: In response to viral or bacterial infection, antigen-specific CD8 T cells migrated to nonlymphoid tissues and were present as long-lived memory cells, pointing to the existence of a population of extralymphoid effector memory T cells poised for immediate response to infection.
Abstract: Many intracellular pathogens infect a broad range of host tissues, but the importance of T cells for immunity in these sites is unclear because most of our understanding of antimicrobial T cell responses comes from analyses of lymphoid tissue. Here, we show that in response to viral or bacterial infection, antigen-specific CD8 T cells migrated to nonlymphoid tissues and were present as long-lived memory cells. Strikingly, CD8 memory T cells isolated from nonlymphoid tissues exhibited effector levels of lytic activity directly ex vivo, in contrast to their splenic counterparts. These results point to the existence of a population of extralymphoid effector memory T cells poised for immediate response to infection.

Journal ArticleDOI
25 Oct 2001-Nature
TL;DR: The distribution of close homologues of S. typhimurium LT2 genes in eight related enterobacteria was determined using previously completed genomes of three related bacteria, sample sequencing of both S. enterica serovar Paratyphi A and Klebsiella pneumoniae as mentioned in this paper.
Abstract: Salmonella enterica subspecies I, serovar Typhimurium (S. typhimurium), is a leading cause of human gastroenteritis, and is used as a mouse model of human typhoid fever. The incidence of non-typhoid salmonellosis is increasing worldwide, causing millions of infections and many deaths in the human population each year. Here we sequenced the 4,857-kilobase (kb) chromosome and 94-kb virulence plasmid of S. typhimurium strain LT2. The distribution of close homologues of S. typhimurium LT2 genes in eight related enterobacteria was determined using previously completed genomes of three related bacteria, sample sequencing of both S. enterica serovar Paratyphi A (S. paratyphi A) and Klebsiella pneumoniae, and hybridization of three unsequenced genomes to a microarray of S. typhimurium LT2 genes. Lateral transfer of genes is frequent, with 11% of the S. typhimurium LT2 genes missing from S. enterica serovar Typhi (S. typhi), and 29% missing from Escherichia coli K12. The 352 gene homologues of S. typhimurium LT2 confined to subspecies I of S. enterica (which contains most mammalian and bird pathogens) are useful for studies of epidemiology, host specificity and pathogenesis. Most of these homologues were previously unknown, and 50 may be exported to the periplasm or outer membrane, rendering them accessible as therapeutic or vaccine targets.

Journal ArticleDOI
TL;DR: The results suggest that recent trends in agriculture have had deleterious and measurable effects on bird populations on a continental scale and predict that the introduction of EU agricultural policies into former communist countries hoping to accede to the EU in the near future will result in significant declines in the important bird populations there.
Abstract: The populations of farmland birds in Europe declined markedly during the last quarter of the 20th century, representing a severe threat to biodiversity. Here, we assess whether declines in the populations and ranges of farmland birds across Europe reflect differences in agricultural intensity, which arise largely through differences in political history. Population and range changes were modelled in terms of a number of indices of agricultural intensity. Population declines and range contractions were significantly greater in countries with more intensive agriculture, and significantly higher in the European Union (EU) than in former communist countries. Cereal yield alone explained over 30% of the variation in population trends. The results suggest that recent trends in agriculture have had deleterious and measurable effects on bird populations on a continental scale. We predict that the introduction of EU agricultural policies into former communist countries hoping to accede to the EU in the near future will result in significant declines in the important bird populations there.
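
A toy sketch of the kind of cross-country regression summarised above, relating national population trends to cereal yield as an intensity index; the data are invented, and the snippet only shows how a variance-explained figure like the "over 30%" statistic is computed.

```python
import numpy as np

# Toy illustration (invented data) of regressing national farmland-bird
# population trends on cereal yield, an agricultural-intensity index.
cereal_yield = np.array([2.1, 3.0, 4.5, 5.8, 6.4, 7.2])        # t/ha, hypothetical
bird_trend   = np.array([0.1, -0.2, -0.9, -1.4, -1.1, -2.0])   # % per year, hypothetical

slope, intercept = np.polyfit(cereal_yield, bird_trend, 1)
predicted = slope * cereal_yield + intercept
r_squared = 1 - ((bird_trend - predicted) ** 2).sum() / ((bird_trend - bird_trend.mean()) ** 2).sum()
print(round(slope, 3), round(r_squared, 3))   # steeper declines at higher yields
```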

Journal ArticleDOI
Kim Lewis
TL;DR: This minireview examines the nature of bacterial biofilm resistance to antimicrobials, which largely reflects an increased resistance of cells to killing rather than an increased MIC.
Abstract: A biofilm is a population of cells growing on a surface and enclosed in an exopolysaccharide matrix. Biofilms are notoriously difficult to eradicate and are a source of many recalcitrant infections. The nature of bacterial biofilm resistance to antimicrobials is the subject of the present minireview. Pathogenic yeast such as Candida albicans also form recalcitrant biofilms, and this topic has recently been reviewed (5). Resistance is an ability of a microorganism to grow in the presence of an elevated level of an antimicrobial. In short, a strain for which the MIC is increased is resistant. By this conventional criterion, biofilm cells do not necessarily show increased resistance. With some exceptions, biofilm cells do not grow better than planktonic cells in the presence of a broad range of antimicrobials. This is evident from examination of susceptibility data in the biofilm literature (33). However, in most biofilm susceptibility studies, only survival of cells in a preformed biofilm rather than the ability of a biofilm to grow is recorded. Accordingly, the reported “resistance” describes an increased resistance of cells to killing. This is indeed what biofilms are good at: they are not easily eradicated by cidal antimicrobials. The ability of antimicrobials to inhibit biofilm growth indicates that they are able to diffuse through the biofilm and act normally against their targets. Why, then, do biofilm cells not die? This is the crux of the problem and the riddle that needs to be solved.

Journal ArticleDOI
TL;DR: A high-resolution analysis of the haplotype structure across 500 kilobases on chromosome 5q31 using 103 single-nucleotide polymorphisms (SNPs) in a European-derived population offers a coherent framework for creating a haplotype map of the human genome.
Abstract: Linkage disequilibrium (LD) analysis is traditionally based on individual genetic markers and often yields an erratic, non-monotonic picture, because the power to detect allelic associations depends on specific properties of each marker, such as frequency and population history. Ideally, LD analysis should be based directly on the underlying haplotype structure of the human genome, but this structure has remained poorly understood. Here we report a high-resolution analysis of the haplotype structure across 500 kilobases on chromosome 5q31 using 103 single-nucleotide polymorphisms (SNPs) in a European-derived population. The results show a picture of discrete haplotype blocks (of tens to hundreds of kilobases), each with limited diversity punctuated by apparent sites of recombination. In addition, we develop an analytical model for LD mapping based on such haplotype blocks. If our observed structure is general (and published data suggest that it may be), it offers a coherent framework for creating a haplotype map of the human genome.
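
For readers unfamiliar with the LD statistics that underlie block definitions like the one above, the sketch below computes the standard pairwise measures D' and r² between two biallelic SNPs from known two-locus haplotypes; the haplotype counts are hypothetical.

```python
# Minimal sketch of pairwise LD between two biallelic SNPs computed from known
# two-locus haplotypes (counts below are hypothetical). Within a haplotype block
# D' is expected to be near 1; it drops across apparent recombination sites.
haplotypes = ["AG"] * 40 + ["AT"] * 5 + ["CG"] * 5 + ["CT"] * 50   # alleles at SNP1, SNP2

n = len(haplotypes)
pA = sum(h[0] == "A" for h in haplotypes) / n      # allele A frequency at SNP1
pG = sum(h[1] == "G" for h in haplotypes) / n      # allele G frequency at SNP2
pAG = haplotypes.count("AG") / n                   # A-G haplotype frequency

D = pAG - pA * pG
D_max = min(pA * (1 - pG), (1 - pA) * pG) if D > 0 else min(pA * pG, (1 - pA) * (1 - pG))
d_prime = D / D_max
r_squared = D ** 2 / (pA * (1 - pA) * pG * (1 - pG))
print(round(d_prime, 3), round(r_squared, 3))      # ~0.8 and ~0.64 for these counts
```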

Journal ArticleDOI
10 May 2001-Nature
TL;DR: The results illuminate human history, suggesting that LD in northern Europeans is shaped by a marked demographic event about 27,000–53,000 years ago, implying that LD mapping is likely to be practical in this population.
Abstract: With the availability of a dense genome-wide map of single nucleotide polymorphisms (SNPs), a central issue in human genetics is whether it is now possible to use linkage disequilibrium (LD) to map genes that cause disease. LD refers to correlations among neighbouring alleles, reflecting 'haplotypes' descended from single, ancestral chromosomes. The size of LD blocks has been the subject of considerable debate. Computer simulations and empirical data have suggested that LD extends only a few kilobases (kb) around common SNPs, whereas other data have suggested that it can extend much further, in some cases greater than 100 kb. It has been difficult to obtain a systematic picture of LD because past studies have been based on only a few (1-3) loci and different populations. Here, we report a large-scale experiment using a uniform protocol to examine 19 randomly selected genomic regions. LD in a United States population of north-European descent typically extends 60 kb from common alleles, implying that LD mapping is likely to be practical in this population. By contrast, LD in a Nigerian population extends markedly less far. The results illuminate human history, suggesting that LD in northern Europeans is shaped by a marked demographic event about 27,000-53,000 years ago.