
Showing papers by "University of Washington published in 2001"


Journal ArticleDOI
Eric S. Lander, Lauren Linton, Bruce W. Birren, Chad Nusbaum, +245 more (29 institutions)
15 Feb 2001-Nature
TL;DR: The results of an international collaboration to produce and make freely available a draft sequence of the human genome are reported and an initial analysis is presented, describing some of the insights that can be gleaned from the sequence.
Abstract: The human genome holds an extraordinary trove of information about human development, physiology, medicine and evolution. Here we report the results of an international collaboration to produce and make freely available a draft sequence of the human genome. We also present an initial analysis of the data, describing some of the insights that can be gleaned from the sequence.

22,269 citations


Journal ArticleDOI
TL;DR: A new statistical method is presented, applicable to genotype data at linked loci from a population sample, that improves substantially on current algorithms and performs well in absolute terms, suggesting that reconstructing haplotypes experimentally or by genotyping additional family members may be an inefficient use of resources.
Abstract: Current routine genotyping methods typically do not provide haplotype information, which is essential for many analyses of fine-scale molecular-genetics data. Haplotypes can be obtained, at considerable cost, experimentally or (partially) through genotyping of additional family members. Alternatively, a statistical method can be used to infer phase and to reconstruct haplotypes. We present a new statistical method, applicable to genotype data at linked loci from a population sample, that improves substantially on current algorithms; often, error rates are reduced by >50%, relative to its nearest competitor. Furthermore, our algorithm performs well in absolute terms, suggesting that reconstructing haplotypes experimentally or by genotyping additional family members may be an inefficient use of resources.
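
As a purely illustrative aside (not the paper's algorithm, which is a Bayesian method guided by coalescent theory), the combinatorial ambiguity that phase inference must resolve can be made concrete with a short sketch: a genotype with h heterozygous sites is consistent with 2^(h-1) distinct haplotype pairs. The function below simply enumerates them; the names and data layout are hypothetical.

```python
from itertools import product

def compatible_haplotype_pairs(genotype):
    """Enumerate all haplotype pairs consistent with an unphased genotype.

    `genotype` is a list of per-locus allele sets, e.g. {"A"} for a
    homozygote or {"A", "G"} for a heterozygote (hypothetical layout).
    Returns the set of distinct unordered haplotype pairs.
    """
    het_sites = [i for i, g in enumerate(genotype) if len(g) == 2]
    pairs = set()
    # Each heterozygous site can be resolved two ways -> 2^h assignments,
    # which collapse to 2^(h-1) distinct unordered haplotype pairs.
    for assignment in product([0, 1], repeat=len(het_sites)):
        hap1, hap2 = [], []
        k = 0
        for g in genotype:
            alleles = sorted(g)
            if len(alleles) == 1:
                hap1.append(alleles[0]); hap2.append(alleles[0])
            else:
                a, b = alleles
                if assignment[k] == 0:
                    hap1.append(a); hap2.append(b)
                else:
                    hap1.append(b); hap2.append(a)
                k += 1
        pairs.add(frozenset([tuple(hap1), tuple(hap2)]))
    return pairs

# Three heterozygous SNPs -> 2^(3-1) = 4 possible phasings.
g = [{"A", "G"}, {"C", "T"}, {"A", "C"}]
print(len(compatible_haplotype_pairs(g)))  # 4
```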

7,482 citations


Journal ArticleDOI
TL;DR: Two computational modeling studies are reported that articulate the conflict monitoring hypothesis and examine its implications; the second implements a feedback loop connecting conflict monitoring to cognitive control and uses it to simulate a number of important behavioral phenomena.
Abstract: A neglected question regarding cognitive control is how control processes might detect situations calling for their involvement. The authors propose here that the demand for control may be evaluated in part by monitoring for conflicts in information processing. This hypothesis is supported by data concerning the anterior cingulate cortex, a brain area involved in cognitive control, which also appears to respond to the occurrence of conflict. The present article reports two computational modeling studies, serving to articulate the conflict monitoring hypothesis and examine its implications. The first study tests the sufficiency of the hypothesis to account for brain activation data, applying a measure of conflict to existing models of tasks shown to engage the anterior cingulate. The second study implements a feedback loop connecting conflict monitoring to cognitive control, using this to simulate a number of important behavioral phenomena.
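
A minimal sketch of one common way to formalize "conflict" in such models, as a Hopfield-style energy over mutually inhibitory response units; the activation values and the inhibition weight here are illustrative assumptions, not parameters from the reported simulations.

```python
import numpy as np

def response_conflict(activations, inhibition=1.0):
    """Hopfield-style energy measure of conflict among response units.

    Conflict is high when mutually incompatible responses are active at the
    same time, and near zero when only one response is active.  With
    w_ij = -inhibition for all i != j, conflict is the negative energy:
    sum over pairs i < j of inhibition * a_i * a_j.
    """
    a = np.asarray(activations, dtype=float)
    pairwise = np.outer(a, a)
    off_diag_sum = pairwise.sum() - np.trace(pairwise)  # sum over i != j
    return inhibition * off_diag_sum / 2.0              # count each pair once

# Illustrative Stroop-like response layer: [correct response, competing response]
print(response_conflict([0.9, 0.10]))   # low conflict (congruent-like trial)
print(response_conflict([0.6, 0.55]))   # high conflict (incongruent-like trial)
```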

6,385 citations


Journal ArticleDOI
TL;DR: In economics and management theories, scholars have traditionally assumed the existence of artifacts such as firms/organizations and markets; this paper argues that an explanation for the creation of such artifacts requires the notion of effectuation.
Abstract: In economics and management theories, scholars have traditionally assumed the existence of artifacts such as firms/organizations and markets. I argue that an explanation for the creation of such artifacts requires the notion of effectuation. Causation rests on a logic of prediction, effectuation on the logic of control. I illustrate effectuation through business examples and realistic thought experiments, examine its connections with existing theories and empirical evidence, and offer a list of testable propositions for future empirical work.

4,438 citations


Journal ArticleDOI
26 Oct 2001-Science
TL;DR: Larger numbers of species are probably needed to reduce temporal variability in ecosystem processes in changing environments, and a major future challenge is to determine how biodiversity dynamics, ecosystem processes, and abiotic factors interact.
Abstract: The ecological consequences of biodiversity loss have aroused considerable interest and controversy during the past decade. Major advances have been made in describing the relationship between species diversity and ecosystem processes, in identifying functionally important species, and in revealing underlying mechanisms. There is, however, uncertainty as to how results obtained in recent experiments scale up to landscape and regional levels and generalize across ecosystem types and processes. Larger numbers of species are probably needed to reduce temporal variability in ecosystem processes in changing environments. A major future challenge is to determine how biodiversity dynamics, ecosystem processes, and abiotic factors interact.

4,070 citations


Journal ArticleDOI
TL;DR: Changing the use of tetracyclines in human and animal health as well as in food production is needed if this class of broad-spectrum antimicrobials is to continue to be used through the present century.
Abstract: Tetracyclines were discovered in the 1940s and exhibited activity against a wide range of microorganisms including gram-positive and gram-negative bacteria, chlamydiae, mycoplasmas, rickettsiae, and protozoan parasites. They are inexpensive antibiotics, which have been used extensively in the prophylaxis and therapy of human and animal infections and also at subtherapeutic levels in animal feed as growth promoters. The first tetracycline-resistant bacterium, Shigella dysenteriae, was isolated in 1953. Tetracycline resistance now occurs in an increasing number of pathogenic, opportunistic, and commensal bacteria. The presence of tetracycline-resistant pathogens limits the use of these agents in treatment of disease. Tetracycline resistance is often due to the acquisition of new genes, which code for energy-dependent efflux of tetracyclines or for a protein that protects bacterial ribosomes from the action of tetracyclines. Many of these genes are associated with mobile plasmids or transposons and can be distinguished from each other using molecular methods including DNA-DNA hybridization with oligonucleotide probes and DNA sequencing. A limited number of bacteria acquire resistance by mutations, which alter the permeability of the outer membrane porins and/or lipopolysaccharides in the outer membrane, change the regulation of innate efflux systems, or alter the 16S rRNA. New tetracycline derivatives are being examined, although their role in treatment is not clear. Changing the use of tetracyclines in human and animal health as well as in food production is needed if we are to continue to use this class of broad-spectrum antimicrobials through the present century.

3,647 citations


Journal ArticleDOI
26 Apr 2001-Nature
TL;DR: It is reported that mammalian TLR5 recognizes bacterial flagellin from both Gram-positive and Gram-negative bacteria, and that activation of the receptor mobilizes the nuclear factor NF-κB and stimulates tumour necrosis factor-α production; the data suggest that TLR5, a member of the evolutionarily conserved Toll-like receptor family, has evolved to permit mammals specifically to detect flagellated bacterial pathogens.
Abstract: The innate immune system recognizes pathogen-associated molecular patterns (PAMPs) that are expressed on infectious agents, but not on the host. Toll-like receptors (TLRs) recognize PAMPs and mediate the production of cytokines necessary for the development of effective immunity. Flagellin, a principal component of bacterial flagella, is a virulence factor that is recognized by the innate immune system in organisms as diverse as flies, plants and mammals. Here we report that mammalian TLR5 recognizes bacterial flagellin from both Gram-positive and Gram-negative bacteria, and that activation of the receptor mobilizes the nuclear factor NF-kappaB and stimulates tumour necrosis factor-alpha production. TLR5-stimulating activity was purified from Listeria monocytogenes culture supernatants and identified as flagellin by tandem mass spectrometry. Expression of L. monocytogenes flagellin in non-flagellated Escherichia coli conferred on the bacterium the ability to activate TLR5, whereas deletion of the flagellin genes from Salmonella typhimurium abrogated TLR5-stimulating activity. All known TLRs signal through the adaptor protein MyD88. Mice challenged with bacterial flagellin rapidly produced systemic interleukin-6, whereas MyD88-null mice did not respond to flagellin. Our data suggest that TLR5, a member of the evolutionarily conserved Toll-like receptor family, has evolved to permit mammals specifically to detect flagellated bacterial pathogens.

3,575 citations


Journal ArticleDOI
TL;DR: Tests of general relativity at the post-Newtonian level have reached high precision, including the light deflection, the Shapiro time delay, the perihelion advance of Mercury, the Nordtvedt effect in lunar motion, and frame-dragging.
Abstract: The status of experimental tests of general relativity and of theoretical frameworks for analyzing them is reviewed and updated. Einstein’s equivalence principle (EEP) is well supported by experiments such as the Eötvös experiment, tests of local Lorentz invariance and clock experiments. Ongoing tests of EEP and of the inverse square law are searching for new interactions arising from unification or quantum gravity. Tests of general relativity at the post-Newtonian level have reached high precision, including the light deflection, the Shapiro time delay, the perihelion advance of Mercury, the Nordtvedt effect in lunar motion, and frame-dragging. Gravitational wave damping has been detected in an amount that agrees with general relativity to better than half a percent using the Hulse-Taylor binary pulsar, and a growing family of other binary pulsar systems is yielding new tests, especially of strong-field effects. Current and future tests of relativity will center on strong gravity and gravitational waves.
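
As a small worked example of one of the post-Newtonian tests mentioned, the snippet below evaluates the general-relativistic prediction for light deflection at the solar limb, delta = (1 + gamma)/2 * 4GM/(c^2 R) with gamma = 1; the constants are standard values and the calculation is independent of the review's own analysis.

```python
import math

# Predicted deflection of starlight grazing the Sun:
# delta = (1 + gamma)/2 * 4*G*M / (c^2 * R), with gamma = 1 in general relativity.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_sun = 1.989e30   # solar mass, kg
R_sun = 6.957e8    # solar radius, m
c = 2.998e8        # speed of light, m s^-1

deflection_rad = 4 * G * M_sun / (c**2 * R_sun)   # gamma = 1
deflection_arcsec = math.degrees(deflection_rad) * 3600
print(f"{deflection_arcsec:.2f} arcsec")          # ~1.75 arcsec
```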

3,394 citations


Journal ArticleDOI
TL;DR: In this article, the authors track some of the major myths on driving forces of land cover change and propose alternative pathways of change that are better supported by case study evidence, concluding that neither population nor poverty alone constitute the sole and major underlying causes of land-cover change worldwide.
Abstract: Common understanding of the causes of land-use and land-cover change is dominated by simplifications which, in turn, underlie many environment-development policies. This article tracks some of the major myths on driving forces of land-cover change and proposes alternative pathways of change that are better supported by case study evidence. Cases reviewed support the conclusion that neither population nor poverty alone constitute the sole and major underlying causes of land-cover change worldwide. Rather, people’s responses to economic opportunities, as mediated by institutional factors, drive land-cover changes. Opportunities and ...

3,330 citations


Journal ArticleDOI
TL;DR: This survey and taxonomy of location systems for mobile-computing applications describes a spectrum of current products and explores the latest in the field to help developers of location-aware applications better evaluate their options when choosing a location-sensing system.
Abstract: This survey and taxonomy of location systems for mobile-computing applications describes a spectrum of current products and explores the latest in the field. To make sense of this domain, we have developed a taxonomy to help developers of location-aware applications better evaluate their options when choosing a location-sensing system. The taxonomy may also aid researchers in identifying opportunities for new location-sensing techniques.

3,237 citations


Journal ArticleDOI
TL;DR: Genetic evidence is presented that different mutations of the human gene FOXP3, the ortholog of the gene mutated in scurfy mice (Foxp3), cause IPEX syndrome.
Abstract: IPEX is a fatal disorder characterized by immune dysregulation, polyendocrinopathy, enteropathy and X-linked inheritance (MIM 304930). We present genetic evidence that different mutations of the human gene FOXP3, the ortholog of the gene mutated in scurfy mice (Foxp3), cause IPEX syndrome. Recent linkage analysis studies mapped the gene mutated in IPEX to an interval of 17–20 cM at Xp11.23–Xq13.3.

Proceedings ArticleDOI
26 Aug 2001
TL;DR: It is proposed to also model the customer's network value: the expected profit from sales to other customers she may influence to buy, the customers they may influence, and so on recursively, taking advantage of the availability of large relevant databases.
Abstract: One of the major applications of data mining is in helping companies determine which potential customers to market to. If the expected profit from a customer is greater than the cost of marketing to her, the marketing action for that customer is executed. So far, work in this area has considered only the intrinsic value of the customer (i.e., the expected profit from sales to her). We propose to model also the customer's network value: the expected profit from sales to other customers she may influence to buy, the customers they may influence, and so on recursively. Instead of viewing a market as a set of independent entities, we view it as a social network and model it as a Markov random field. We show the advantages of this approach using a social network mined from a collaborative filtering database. Marketing that exploits the network value of customers---also known as viral marketing---can be extremely effective, but is still a black art. Our work can be viewed as a step towards providing a more solid foundation for it, taking advantage of the availability of large relevant databases.
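
A toy sketch of the "network value" recursion under a deliberately simplified linear influence-propagation model (the paper itself uses a Markov random field solved by relaxation labeling); the graph, intrinsic profits, and damping factor below are hypothetical.

```python
import numpy as np

def network_value(influence, intrinsic, damping=0.5, iters=50):
    """Toy recursive 'network value' on a social graph.

    influence[i, j]: chance that customer j's purchase nudges customer i to buy.
    intrinsic[i]:    expected profit from selling to customer i directly.
    A customer's value is her intrinsic profit plus a damped share of the
    value of the customers she influences, their customers, and so on.
    (Linear stand-in for the paper's Markov-random-field formulation.)
    """
    intrinsic = np.asarray(intrinsic, dtype=float)
    value = intrinsic.copy()
    for _ in range(iters):
        value = intrinsic + damping * (influence.T @ value)
    return value

# Three customers: customer 2 has low intrinsic value but strongly
# influences the other two, so most of her value is "network value".
influence = np.array([[0.0, 0.0, 0.8],
                      [0.0, 0.0, 0.6],
                      [0.0, 0.0, 0.0]])
intrinsic = [5.0, 4.0, 0.5]
print(network_value(influence, intrinsic))  # ~[5.0, 4.0, 3.7]
```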

Journal ArticleDOI
01 Aug 2001-Diabetes
TL;DR: The hypothesis that ghrelin plays a physiological role in meal initiation in humans is supported by the clear preprandial rise and postprandial fall in plasma ghrelin levels.
Abstract: The recently discovered orexigenic peptide ghrelin is produced primarily by the stomach and circulates in blood at levels that increase during prolonged fasting in rats. When administered to rodents at supraphysiological doses, ghrelin activates hypothalamic neuropeptide Y/agouti gene-related protein neurons and increases food intake and body weight. These findings suggest that ghrelin may participate in meal initiation. As a first step to investigate this hypothesis, we sought to determine whether circulating ghrelin levels are elevated before the consumption of individual meals in humans. Ghrelin, insulin, and leptin were measured by radioimmunoassay in plasma samples drawn 38 times throughout a 24-h period in 10 healthy subjects provided meals on a fixed schedule. Plasma ghrelin levels increased nearly twofold immediately before each meal and fell to trough levels within 1 h after eating, a pattern reciprocal to that of insulin. Intermeal ghrelin levels displayed a diurnal rhythm that was exactly in phase with that of leptin, with both hormones rising throughout the day to a zenith at 0100, then falling overnight to a nadir at 0900. Ghrelin levels sampled during the troughs before and after breakfast correlated strongly with 24-h integrated area under the curve values (r = 0.873 and 0.954, respectively), suggesting that these convenient, single measurements might serve as surrogates for 24-h profiles to estimate overall ghrelin levels. Circulating ghrelin also correlated positively with age (r = 0.701). The clear preprandial rise and postprandial fall in plasma ghrelin levels support the hypothesis that ghrelin plays a physiological role in meal initiation in humans.

Book
01 Feb 2001
TL;DR: The book develops the physical principles governing forces in the cell, describes the structure, mechanics, and polymerization of the cytoskeleton, and treats motor proteins, ending with motility models that connect crossbridges to motion.
Abstract: Preface - Introduction - PART I: PHYSICAL PRINCIPLES - Mechanical Forces - Mass, Stiffness, and Damping of Proteins - Thermal Forces and Diffusion - Chemical Forces - Polymer Mechanics - PART II: CYTOSKELETON - Structures of Cytoskeletal Filaments - Mechanics of the Cytoskeleton - Polymerization of Cytoskeletal Filaments - Force Generation by Cytoskeletal Filaments - Active Polymerization - PART III: MOTOR PROTEINS - Structures of Motor Proteins - Speeds of Motors - ATP Hydrolysis - Steps and Forces - Motility Models: From Crossbridges to Motion - Afterword - Appendix - Bibliography - Index

Journal ArticleDOI
TL;DR: Surprisingly, despite little or no sequence homology, both type IA and type IIA topoisomerases from prokaryotes and the typeIIA enzymes from eukaryotes share structural folds that appear to reflect functional motifs within critical regions of the enzymes.
Abstract: DNA topoisomerases solve the topological problems associated with DNA replication, transcription, recombination, and chromatin remodeling by introducing temporary single- or double-strand breaks in the DNA. In addition, these enzymes fine-tune the steady-state level of DNA supercoiling both to facilitate protein interactions with the DNA and to prevent excessive supercoiling that is deleterious. In recent years, the crystal structures of a number of topoisomerase fragments, representing nearly all the known classes of enzymes, have been solved. These structures provide remarkable insights into the mechanisms of these enzymes and complement previous conclusions based on biochemical analyses. Surprisingly, despite little or no sequence homology, both type IA and type IIA topoisomerases from prokaryotes and the type IIA enzymes from eukaryotes share structural folds that appear to reflect functional motifs within critical regions of the enzymes. The type IB enzymes are structurally distinct from a...

Journal ArticleDOI
03 May 2001-Nature
TL;DR: The results of a numerical climate-model experiment support the argument that the stages in evolution of Asian monsoons are linked to phases of Himalaya–Tibetan plateau uplift and to Northern Hemisphere glaciation.
Abstract: The climates of Asia are affected significantly by the extent and height of the Himalayan mountains and the Tibetan plateau1,2,3,4. Uplift of this region began about 50 Myr ago, and further significant increases in altitude of the Tibetan plateau are thought to have occurred about 10–8 Myr ago4,5, or more recently. However, the climatic consequences of this uplift remain unclear. Here we use records of aeolian sediments from China6,7 and marine sediments from the Indian8,9,10 and North Pacific oceans11 to identify three stages of evolution of Asian climates: first, enhanced aridity in the Asian interior and onset of the Indian and east Asian monsoons, about 9–8 Myr ago; next, continued intensification of the east Asian summer and winter monsoons, together with increased dust transport to the North Pacific Ocean11, about 3.6–2.6 Myr ago; and last, increased variability and possible weakening of the Indian and east Asian summer monsoons and continued strengthening of the east Asian winter monsoon since about 2.6 Myr ago. The results of a numerical climate-model experiment, using idealized stepwise increases of mountain–plateau elevation, support the argument that the stages in evolution of Asian monsoons are linked to phases of Himalaya–Tibetan plateau uplift and to Northern Hemisphere glaciation.

Journal ArticleDOI
TL;DR: In this paper, a new construct, called job embeddedness, is introduced, which includes individuals' links to other people, teams, and groups, perceptions of their fit with job, organization, and community, and what they say they would have to sacrifice if they left their jobs.
Abstract: A new construct, entitled “job embeddedness,” is introduced. It includes individuals' (1) links to other people, teams, and groups, (2) perceptions of their fit with job, organization, and community, and (3) what they say they would have to sacrifice if they left their jobs. We developed a measure of job embeddedness with two samples. The results show that job embeddedness predicts the key outcomes of both intent to leave and 'voluntary turnover' and explains significant incremental variance over and above job satisfaction, organizational commitment, job alternatives, and job search.

Proceedings ArticleDOI
10 Dec 2001
TL;DR: This measurement study seeks to precisely characterize the population of end-user hosts that participate in Napster and Gnutella, and shows that there is significant heterogeneity and lack of cooperation across peers participating in these systems.
Abstract: The popularity of peer-to-peer multimedia file sharing applications such as Gnutella and Napster has created a flurry of recent research activity into peer-to-peer architectures. We believe that the proper evaluation of a peer-to-peer system must take into account the characteristics of the peers that choose to participate. Surprisingly, however, few of the peer-to-peer architectures currently being developed are evaluated with respect to such considerations. In this paper, we remedy this situation by performing a detailed measurement study of the two popular peer-to-peer file sharing systems, namely Napster and Gnutella. In particular, our measurement study seeks to precisely characterize the population of end-user hosts that participate in these two systems. This characterization includes the bottleneck bandwidths between these hosts and the Internet at large, IP-level latencies to send packets to these hosts, how often hosts connect and disconnect from the system, how many files hosts share and download, the degree of cooperation between the hosts, and several correlations between these characteristics. Our measurements show that there is significant heterogeneity and lack of cooperation across peers participating in these systems.

Journal ArticleDOI
04 May 2001-Science
TL;DR: An integrated approach to build, test, and refine a model of a cellular pathway, in which perturbations to critical pathway components are analyzed using DNA microarrays, quantitative proteomics, and databases of known physical interactions, suggests hypotheses about the regulation of galactose utilization and physical interactions between this and a variety of other metabolic pathways.
Abstract: We demonstrate an integrated approach to build, test, and refine a model of a cellular pathway, in which perturbations to critical pathway components are analyzed using DNA microarrays, quantitative proteomics, and databases of known physical interactions. Using this approach, we identify 997 messenger RNAs responding to 20 systematic perturbations of the yeast galactose-utilization pathway, provide evidence that approximately 15 of 289 detected proteins are regulated posttranscriptionally, and identify explicit physical interactions governing the cellular response to each perturbation. We refine the model through further iterations of perturbation and global measurements, suggesting hypotheses about the regulation of galactose utilization and physical interactions between this and a variety of other metabolic pathways.

Journal ArticleDOI
TL;DR: CYP3A5 was more frequently expressed in livers of African Americans than in those of Caucasians, and may be the most important genetic contributor to interindividual and interracial differences in CYP3A-dependent drug clearance and in responses to many medicines.
Abstract: Variation in the CYP3A enzymes, which act in drug metabolism, influences circulating steroid levels and responses to half of all oxidatively metabolized drugs. CYP3A activity is the sum activity of the family of CYP3A genes, including CYP3A5, which is polymorphically expressed at high levels in a minority of Americans of European descent and Europeans (hereafter collectively referred to as 'Caucasians'). Only people with at least one CYP3A5*1 allele express large amounts of CYP3A5. Our findings show that single-nucleotide polymorphisms (SNPs) in CYP3A5*3 and CYP3A5*6 that cause alternative splicing and protein truncation result in the absence of CYP3A5 from tissues of some people. CYP3A5 was more frequently expressed in livers of African Americans (60%) than in those of Caucasians (33%). Because CYP3A5 represents at least 50% of the total hepatic CYP3A content in people polymorphically expressing CYP3A5, CYP3A5 may be the most important genetic contributor to interindividual and interracial differences in CYP3A-dependent drug clearance and in responses to many medicines.

Journal ArticleDOI
TL;DR: This work develops a class of models where the probability of a relation between actors depends on the positions of individuals in an unobserved “social space,” and proposes Markov chain Monte Carlo procedures for making inference on latent positions and the effects of observed covariates.
Abstract: Network models are widely used to represent relational information among interacting units. In studies of social networks, recent emphasis has been placed on random graph models where the nodes usually represent individual social actors and the edges represent the presence of a specified relation between actors. We develop a class of models where the probability of a relation between actors depends on the positions of individuals in an unobserved “social space.” We make inference for the social space within maximum likelihood and Bayesian frameworks, and propose Markov chain Monte Carlo procedures for making inference on latent positions and the effects of observed covariates. We present analyses of three standard datasets from the social networks literature, and compare the method to an alternative stochastic blockmodeling approach. In addition to improving on model fit for these datasets, our method provides a visual and interpretable model-based spatial representation of social relationships and improv...
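
A minimal sketch of the latent distance variant of such models, assuming the simplest form logit P(y_ij = 1) = alpha - ||z_i - z_j|| with no covariates, together with a bare-bones Metropolis update for the positions; the toy network and tuning constants are made up for illustration.

```python
import numpy as np

def log_likelihood(Y, Z, alpha):
    """Latent distance model: logit P(y_ij = 1) = alpha - ||z_i - z_j||."""
    D = np.linalg.norm(Z[:, None, :] - Z[None, :, :], axis=-1)
    eta = alpha - D
    ll = Y * eta - np.log1p(np.exp(eta))       # Bernoulli log-likelihood
    iu = np.triu_indices(len(Y), k=1)          # each dyad once, no self-ties
    return ll[iu].sum()

def metropolis_positions(Y, dim=2, alpha=1.0, steps=2000, scale=0.1, seed=0):
    """Very small Metropolis sampler for latent positions (alpha held fixed)."""
    rng = np.random.default_rng(seed)
    n = len(Y)
    Z = rng.normal(size=(n, dim))
    ll = log_likelihood(Y, Z, alpha)
    for _ in range(steps):
        i = rng.integers(n)
        Z_prop = Z.copy()
        Z_prop[i] += rng.normal(scale=scale, size=dim)
        ll_prop = log_likelihood(Y, Z_prop, alpha)
        if np.log(rng.uniform()) < ll_prop - ll:   # flat prior on positions
            Z, ll = Z_prop, ll_prop
    return Z

# Tiny symmetric example network (two loose clusters).
Y = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 0, 1],
              [0, 0, 0, 0, 1],
              [0, 0, 1, 1, 0]])
Z_hat = metropolis_positions(Y)
print(np.round(Z_hat, 2))   # estimated 2-D "social space" positions
```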

Journal ArticleDOI
TL;DR: In this article, the convergence properties of a block coordinate descent method applied to minimize a non-convex function f(x_1, ..., x_N) with certain separability and regularity properties are studied.
Abstract: We study the convergence properties of a (block) coordinate descent method applied to minimize a nondifferentiable (nonconvex) function f(x_1, ..., x_N) with certain separability and regularity properties. Assuming that f is continuous on a compact level set, the subsequence convergence of the iterates to a stationary point is shown when either f is pseudoconvex in every pair of coordinate blocks from among N-1 coordinate blocks or f has at most one minimum in each of N-2 coordinate blocks. If f is quasiconvex and hemivariate in every coordinate block, then the assumptions of continuity of f and compactness of the level set may be relaxed further. These results are applied to derive new (and old) convergence results for the proximal minimization algorithm, an algorithm of Arimoto and Blahut, and an algorithm of Han. They are applied also to a problem of blind source separation.
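
To make the alternating scheme concrete, here is a minimal two-block coordinate descent on a smooth, strictly convex quadratic, where each block update is an exact minimization; this toy case does not exercise the nondifferentiable, nonconvex setting the paper's convergence results actually address.

```python
import numpy as np

def block_coordinate_descent(A, b, n1, iters=100):
    """Minimize f(x) = 0.5 * x^T A x - b^T x by alternating exact
    minimization over two coordinate blocks x = (x1, x2).

    A must be symmetric positive definite; n1 is the size of the first block.
    Each block update solves the corresponding linear system exactly.
    """
    n = len(b)
    x = np.zeros(n)
    A11, A12 = A[:n1, :n1], A[:n1, n1:]
    A21, A22 = A[n1:, :n1], A[n1:, n1:]
    b1, b2 = b[:n1], b[n1:]
    for _ in range(iters):
        # minimize over block 1 with block 2 fixed: A11 x1 = b1 - A12 x2
        x[:n1] = np.linalg.solve(A11, b1 - A12 @ x[n1:])
        # minimize over block 2 with block 1 fixed: A22 x2 = b2 - A21 x1
        x[n1:] = np.linalg.solve(A22, b2 - A21 @ x[:n1])
    return x

A = np.array([[4.0, 1.0, 0.5],
              [1.0, 3.0, 0.2],
              [0.5, 0.2, 2.0]])
b = np.array([1.0, 2.0, 3.0])
x_bcd = block_coordinate_descent(A, b, n1=2)
print(np.allclose(x_bcd, np.linalg.solve(A, b), atol=1e-8))  # True
```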

Journal ArticleDOI
TL;DR: In this article, a variety of composite quasar spectra using a homogeneous data set of over 2200 spectra from the Sloan Digital Sky Survey (SDSS) was created, and the median composite covers a rest-wavelength range from 800 to 8555 A and reaches a peak signal-to-noise ratio of over 300 per 1 A resolution element in the rest frame.
Abstract: We have created a variety of composite quasar spectra using a homogeneous data set of over 2200 spectra from the Sloan Digital Sky Survey (SDSS). The quasar sample spans a redshift range of 0.044 ≤ z ≤ 4.789 and an absolute r' magnitude range of -18.0 to -26.5. The input spectra cover an observed wavelength range of 3800–9200 A at a resolution of 1800. The median composite covers a rest-wavelength range from 800 to 8555 A and reaches a peak signal-to-noise ratio of over 300 per 1 A resolution element in the rest frame. We have identified over 80 emission-line features in the spectrum. Emission-line shifts relative to nominal laboratory wavelengths are seen for many of the ionic species. Peak shifts of the broad permitted and semiforbidden lines are strongly correlated with ionization energy, as previously suggested, but we find that the narrow forbidden lines are also shifted by amounts that are strongly correlated with ionization energy. The magnitude of the forbidden line shifts is 100 km s^-1, compared with shifts of up to 550 km s^-1 for some of the permitted and semiforbidden lines. At wavelengths longer than the Lyα emission, the continuum of the geometric mean composite is well fitted by two power laws, with a break at ≈5000 A. The frequency power-law index, α_ν, is -0.44 from ≈1300 to 5000 A and -2.45 redward of ≈5000 A. The abrupt change in slope can be accounted for partly by host-galaxy contamination at low redshift. Stellar absorption lines, including higher order Balmer lines, seen in the composites suggest that young or intermediate-age stars make a significant contribution to the light of the host galaxies. Most of the spectrum is populated by blended emission lines, especially in the range 1500–3500 A, which can make the estimation of quasar continua highly uncertain unless large ranges in wavelength are observed. An electronic table of the median quasar template is available.
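
A small sketch of how the two continuum slopes on either side of a fixed break can be recovered by linear fits in log-log space, using synthetic data built with the quoted indices; this is not the SDSS pipeline, just the arithmetic behind a broken power-law characterization.

```python
import numpy as np

def powerlaw_slopes(wavelength, flux_nu, break_wavelength=5000.0):
    """Fit f_nu ~ nu^alpha separately blueward and redward of a fixed break.

    Since nu is proportional to 1/wavelength, the slope of log f_nu versus
    log nu is the negative of the slope versus log wavelength.
    Returns (alpha_blue, alpha_red)."""
    logw = np.log10(wavelength)
    logf = np.log10(flux_nu)
    blue = wavelength <= break_wavelength
    alpha_blue = -np.polyfit(logw[blue], logf[blue], 1)[0]
    alpha_red = -np.polyfit(logw[~blue], logf[~blue], 1)[0]
    return alpha_blue, alpha_red

# Synthetic continuum with the slopes quoted in the abstract (-0.44 / -2.45),
# made continuous at the 5000 A break.
w = np.linspace(1300, 8500, 500)
nu = 1.0 / w                                 # arbitrary frequency units
f = np.where(w <= 5000,
             nu**-0.44,
             nu**-2.45 * (1 / 5000.0)**(-0.44 + 2.45))
print(powerlaw_slopes(w, f))                 # ~(-0.44, -2.45)
```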

Journal ArticleDOI
TL;DR: A more robust algorithm called MixtureMCL is developed, which integrates two complementary ways of generating samples in the estimation, and is applied to mobile robots equipped with range finders.
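
For context, a bare-bones sketch of the standard Monte Carlo Localization cycle (predict, weight, resample) on a 1-D toy corridor; the paper's Mixture-MCL additionally draws samples from the measurement model, which is omitted here, and all noise parameters below are invented.

```python
import numpy as np

def mcl_step(particles, control, measurement, world_size=10.0,
             motion_noise=0.1, meas_noise=0.3, rng=None):
    """One predict / weight / resample cycle of plain Monte Carlo Localization
    for a robot on a 1-D circular corridor with a noisy position sensor.
    (Mixture-MCL also samples from the measurement model; omitted here.)
    """
    rng = rng or np.random.default_rng(0)
    # 1. motion update: shift particles by the control input, plus noise
    particles = (particles + control +
                 rng.normal(0.0, motion_noise, size=len(particles))) % world_size
    # 2. measurement update: weight by likelihood of the sensor reading
    w = np.exp(-0.5 * ((particles - measurement) / meas_noise) ** 2)
    w /= w.sum()
    # 3. importance resampling
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]

rng = np.random.default_rng(1)
particles = rng.uniform(0, 10, size=1000)      # start with global uncertainty
for true_pos, control in [(2.0, 0.0), (2.5, 0.5), (3.0, 0.5)]:
    z = true_pos + rng.normal(0, 0.3)          # simulated sensor reading
    particles = mcl_step(particles, control, z, rng=rng)
print(round(particles.mean(), 2))              # close to the true position, 3.0
```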

Journal ArticleDOI
TL;DR: Simvastatin plus niacin provides marked clinical and angiographically measurable benefits in patients with coronary disease and low HDL levels, and the use of antioxidant vitamins in this setting must be questioned.
Abstract: Background Both lipid-modifying therapy and antioxidant vitamins are thought to have benefit in patients with coronary disease. We studied simvastatin–niacin and antioxidant-vitamin therapy, alone and together, for cardiovascular protection in patients with coronary disease and low plasma levels of high-density lipoprotein (HDL) cholesterol. Methods In a three-year, double-blind trial, 160 patients with coronary disease, low HDL cholesterol levels, and normal low-density lipoprotein (LDL) cholesterol levels were randomly assigned to receive one of four regimens: simvastatin plus niacin, antioxidants, simvastatin–niacin plus antioxidants, or placebos. The end points were arteriographic evidence of a change in coronary stenosis and the occurrence of a first cardiovascular event (death, myocardial infarction, stroke, or revascularization). Results The mean levels of LDL and HDL cholesterol were unaltered in the antioxidant group and the placebo group; these levels changed substantially (by –42 percent and +2...

Journal ArticleDOI
TL;DR: It is suggested that IC may be a crucial enabling factor for ToM development, possibly affecting both the emergence and expression of mental state knowledge.
Abstract: This research examined the relation between individual differences in inhibitory control (IC; a central component of executive functioning) and theory-of-mind (ToM) performance in preschool-age children. Across two sessions, 3- and 4-year-old children (N = 107) were given multitask batteries measuring IC and ToM. Inhibitory control was strongly related to ToM, r = .66, p < .001. This relation remained significant controlling for age, gender, verbal ability, motor sequencing, family size, and performance on pretend-action and mental state control tasks. Inhibitory tasks requiring a novel response in the face of a conflicting prepotent response (Conflict scale) and those requiring the delay of a prepotent response (Delay scale) were significantly related to ToM. The Conflict scale, however, significantly predicted ToM performance over and above the Delay scale and control measures, whereas the Delay scale was not significant in a corresponding analysis. These findings suggest that IC may be a crucial enabling factor for ToM development, possibly affecting both the emergence and expression of mental state knowledge. The implications of the findings for a variety of executive accounts of ToM are discussed.

Proceedings ArticleDOI
01 Aug 2001
TL;DR: This paper describes a new framework for processing images by example, called “image analogies,” based on a simple multi-scale autoregression, inspired primarily by recent results in texture synthesis.
Abstract: This paper describes a new framework for processing images by example, called “image analogies.” The framework involves two stages: a design phase, in which a pair of images, with one image purported to be a “filtered” version of the other, is presented as “training data”; and an application phase, in which the learned filter is applied to some new target image in order to create an “analogous” filtered result. Image analogies are based on a simple multi-scale autoregression, inspired primarily by recent results in texture synthesis. By choosing different types of source image pairs as input, the framework supports a wide variety of “image filter” effects, including traditional image filters, such as blurring or embossing; improved texture synthesis, in which some textures are synthesized with higher quality than by previous approaches; super-resolution, in which a higher-resolution image is inferred from a low-resolution source; texture transfer, in which images are “texturized” with some arbitrary source texture; artistic filters, in which various drawing and painting styles are synthesized based on scanned real-world examples; and texture-by-numbers, in which realistic scenes, composed of a variety of textures, are created using a simple painting interface.
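
A heavily simplified, single-scale, brute-force rendition of the core lookup (for each pixel of B, find the best-matching neighborhood in A and copy the corresponding pixel of A'), with no multiscale pyramid, feature channels, approximate-nearest-neighbor search, or coherence term; the 8x8 "images" and the inversion filter are toy assumptions.

```python
import numpy as np

def image_analogy_single_scale(A, Ap, B, patch=3):
    """Toy single-scale image analogy: for every pixel of B, find the pixel of
    A with the most similar neighborhood and copy the corresponding pixel of
    Ap (the "filtered" A) into the output Bp.

    A, Ap, B are 2-D grayscale float arrays.  Brute-force O(|A|*|B|) search;
    the published method adds a Gaussian pyramid, richer features, fast
    approximate nearest neighbors, and a coherence term, all omitted here.
    """
    r = patch // 2
    pad = lambda X: np.pad(X, r, mode="edge")
    A_pad, B_pad = pad(A), pad(B)

    h, w = A.shape
    feats = np.stack([A_pad[i:i + patch, j:j + patch].ravel()
                      for i in range(h) for j in range(w)])

    Bp = np.zeros_like(B, dtype=float)
    H, W = B.shape
    for i in range(H):
        for j in range(W):
            q = B_pad[i:i + patch, j:j + patch].ravel()
            best = np.argmin(((feats - q) ** 2).sum(axis=1))
            Bp[i, j] = Ap.flat[best]
    return Bp

# Training pair: A and its "filtered" version Ap (brightness inversion).
A = np.linspace(0.0, 1.0, 64).reshape(8, 8)
Ap = 1.0 - A
# Sanity check: analogizing A itself reproduces Ap exactly; a genuinely new
# image B would receive the learned effect only approximately.
print(np.allclose(image_analogy_single_scale(A, Ap, A), Ap))  # True
```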

Proceedings ArticleDOI
26 Aug 2001
TL;DR: An efficient algorithm for mining decision trees from continuously-changing data streams, based on the ultra-fast VFDT decision tree learner is proposed, called CVFDT, which stays current while making the most of old data by growing an alternative subtree whenever an old one becomes questionable, and replacing the old with the new when the new becomes more accurate.
Abstract: Most statistical and machine-learning algorithms assume that the data is a random sample drawn from a stationary distribution. Unfortunately, most of the large databases available for mining today violate this assumption. They were gathered over months or years, and the underlying processes generating them changed during this time, sometimes radically. Although a number of algorithms have been proposed for learning time-changing concepts, they generally do not scale well to very large databases. In this paper we propose an efficient algorithm for mining decision trees from continuously-changing data streams, based on the ultra-fast VFDT decision tree learner. This algorithm, called CVFDT, stays current while making the most of old data by growing an alternative subtree whenever an old one becomes questionable, and replacing the old with the new when the new becomes more accurate. CVFDT learns a model which is similar in accuracy to the one that would be learned by reapplying VFDT to a moving window of examples every time a new example arrives, but with O(1) complexity per example, as opposed to O(w), where w is the size of the window. Experiments on a set of large time-changing data streams demonstrate the utility of this approach.
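
A short sketch of the Hoeffding-bound split test that VFDT-style learners (and hence CVFDT) rely on; the sliding window, alternative subtrees, and other CVFDT machinery are not shown, and the delta, tie threshold, and gain values are illustrative.

```python
import math

def hoeffding_bound(value_range, delta, n):
    """Hoeffding bound: with probability 1 - delta, the observed mean of n
    samples of a variable with range `value_range` lies within epsilon of
    the true mean.  VFDT-style learners split a leaf when the observed gap
    between the best and second-best attribute exceeds this epsilon."""
    return math.sqrt(value_range ** 2 * math.log(1.0 / delta) / (2.0 * n))

def should_split(gain_best, gain_second, n, delta=1e-6,
                 value_range=1.0, tie_threshold=0.05):
    eps = hoeffding_bound(value_range, delta, n)
    return (gain_best - gain_second > eps) or (eps < tie_threshold)

# After 5,000 examples, the best attribute beats the runner-up by 0.08 bits
# of information gain (gain range taken as 1.0 for a two-class problem):
print(round(hoeffding_bound(1.0, 1e-6, 5000), 3))   # ~0.037
print(should_split(0.30, 0.22, 5000))               # True -> split now
```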

Journal ArticleDOI
TL;DR: Comparisons of estimated linear selection gradients and differentials suggest that indirect components of phenotypic selection were usually modest relative to direct components, and there is no evidence that stabilizing selection is stronger or more common than disruptive selection in nature.
Abstract: How strong is phenotypic selection on quantitative traits in the wild? We reviewed the literature from 1984 through 1997 for studies that estimated the strength of linear and quadratic selection in terms of standardized selection gradients or differentials on natural variation in quantitative traits for field populations. We tabulated 63 published studies of 62 species that reported over 2,500 estimates of linear or quadratic selection. More than 80% of the estimates were for morphological traits; there is very little data for behavioral or physiological traits. Most published selection studies were unreplicated and had sample sizes below 135 individuals, resulting in low statistical power to detect selection of the magnitude typically reported for natural populations. The absolute values of linear selection gradients |β| were exponentially distributed with an overall median of 0.16, suggesting that strong directional selection was uncommon. The values of |β| for selection on morphological and o...
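
Standardized linear selection gradients of the kind tabulated here are conventionally obtained by regressing relative fitness on traits standardized to unit variance (the Lande-Arnold approach); the sketch below applies that regression to synthetic data with a true gradient near the reported median of 0.16. The sample size and noise level are arbitrary.

```python
import numpy as np

def linear_selection_gradients(traits, fitness):
    """Standardized linear selection gradients (Lande-Arnold style):
    multiple regression of relative fitness on traits standardized to
    zero mean and unit variance.  Returns one beta per trait."""
    Z = (traits - traits.mean(axis=0)) / traits.std(axis=0, ddof=1)
    w_rel = fitness / fitness.mean()               # relative fitness
    X = np.column_stack([np.ones(len(w_rel)), Z])  # intercept + traits
    coef, *_ = np.linalg.lstsq(X, w_rel, rcond=None)
    return coef[1:]                                # drop the intercept

rng = np.random.default_rng(42)
n = 200
traits = rng.normal(size=(n, 2))                   # two hypothetical traits
# Directional selection on trait 1 only, with multiplicative noise.
fitness = np.exp(0.16 * traits[:, 0] + rng.normal(scale=0.3, size=n))
print(np.round(linear_selection_gradients(traits, fitness), 2))
# roughly [0.16, 0.0]: recovers the imposed gradient near the reported median
```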

Journal ArticleDOI
TL;DR: A maximum likelihood estimator based on the coalescent for unequal migration rates and different subpopulation sizes is developed and used to estimate gene flow in the Nile valley by using mtDNA data from three human populations.
Abstract: A maximum likelihood estimator based on the coalescent for unequal migration rates and different subpopulation sizes is developed. The method uses a Markov chain Monte Carlo approach to investigate possible genealogies with branch lengths and with migration events. Properties of the new method are shown by using simulated data from a four-population n-island model and a source–sink population model. Our estimation method as coded in migrate is tested against genetree; both programs deliver a very similar likelihood surface. The algorithm converges to the estimates fairly quickly, even when the Markov chain is started from unfavorable parameters. The method was used to estimate gene flow in the Nile valley by using mtDNA data from three human populations.
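
For contrast with the likelihood approach, the classical moment-based shortcut for migration under Wright's symmetric island model (diploid nuclear data) is F_ST ≈ 1/(1 + 4Nm); the helper below just inverts that formula, which rests on exactly the equal-rate, equal-size assumptions the coalescent maximum likelihood estimator relaxes.

```python
def nm_from_fst(fst):
    """Classical symmetric island-model approximation: F_ST ~ 1 / (1 + 4Nm),
    so the effective number of migrants per generation is
    Nm ~ (1/F_ST - 1) / 4.  Coalescent maximum-likelihood methods (such as
    the one described above) allow unequal migration rates and unequal
    subpopulation sizes instead of assuming this symmetric model."""
    if not 0 < fst < 1:
        raise ValueError("F_ST must lie strictly between 0 and 1")
    return (1.0 / fst - 1.0) / 4.0

for fst in (0.05, 0.10, 0.25):
    print(f"F_ST = {fst:.2f}  ->  Nm ~ {nm_from_fst(fst):.2f}")
```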