
Showing papers published by the University of California, Santa Cruz in 2003


Book
01 Jan 2003
TL;DR: In this article, the authors present concepts and ways of understanding the cultural nature of human development and the transformation of participation in cultural activities in families and communities, as well as the transition in individuals' roles in their communities.
Abstract: 1. Orienting concepts and ways of understanding the cultural nature of human development 2. Development as transformation of participation in cultural activities 3. Individuals, generations and dynamic cultural communities 4. Child rearing in families and communities 5. Developmental transitions in individuals' roles in their communities 6. Interdependence and autonomy 7. Thinking with the tools and institutions of culture 8. Learning through guided participation in cultural endeavours 9. Cultural change and relations among communities

5,048 citations


Journal ArticleDOI
TL;DR: The University of California Santa Cruz (UCSC) Genome Browser Database is an up-to-date source for genome sequence data integrated with a large collection of related annotations that is optimized to support fast interactive performance with the web-based UCSC Genome Browser.
Abstract: The University of California Santa Cruz (UCSC) Genome Browser Database is an up-to-date source for genome sequence data integrated with a large collection of related annotations. The database is optimized to support fast interactive performance with the web-based UCSC Genome Browser, a tool built on top of the database for rapid visualization and querying of the data at many levels. The annotations for a given genome are displayed in the browser as a series of tracks aligned with the genomic sequence. Sequence data and annotations may also be viewed in a text-based tabular format or downloaded as tab-delimited flat files. The Genome Browser Database, browsing tools and downloadable data files can all be found on the UCSC Genome Bioinformatics website (http://genome.ucsc.edu), which also contains links to documentation and related technical information.

2,103 citations


Journal ArticleDOI
TL;DR: In this article, the authors discuss how metallicity affects the evolution and final fate of massive stars, and derive the relative populations as a function of metallicity.
Abstract: How massive stars die-what sort of explosion and remnant each produces-depends chiefly on the masses of their helium cores and hydrogen envelopes at death. For single stars, stellar winds are the only means of mass loss, and these are a function of the metallicity of the star. We discuss how metallicity, and a simplified prescription for its effect on mass loss, affects the evolution and final fate of massive stars. We map, as a function of mass and metallicity, where black holes and neutron stars are likely to form and where different types of supernovae are produced. Integrating over an initial mass function, we derive the relative populations as a function of metallicity. Provided that single stars rotate rapidly enough at death, we speculate on stellar populations that might produce gamma-ray bursts and jet-driven supernovae.
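The step from the mass/metallicity map to "relative populations" is a weighted integral over the initial mass function. A minimal worked form, assuming a Salpeter-like power-law IMF purely for illustration (the abstract does not specify the IMF used):

```latex
% Fraction of massive stars ending as outcome X (e.g., neutron star, black hole,
% a given supernova type), assuming an illustrative Salpeter IMF phi(M) ~ M^{-2.35}:
\[
  f_X(Z) \;=\;
  \frac{\int_{M_{\min,X}(Z)}^{M_{\max,X}(Z)} M^{-2.35}\, \mathrm{d}M}
       {\int_{M_{\rm low}}^{M_{\rm up}} M^{-2.35}\, \mathrm{d}M},
\]
% where the boundaries M_{min,X}(Z) and M_{max,X}(Z) delimiting outcome X in the
% mass-metallicity map shift with metallicity Z through the wind mass-loss prescription.
```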

2,007 citations


Journal ArticleDOI
TL;DR: In this paper, the authors argue against the common assumption that regularities are static and that general traits of individuals are attributable categorically to ethnic group membership, and suggest that a cultural-historical approach can be used to help move beyond this assumption by focusing researchers' and practitioners' attention on variations in individuals' and groups' histories of engagement in cultural practices.
Abstract: This article addresses a challenge faced by those who study cultural variation in approaches to learning: how to characterize regularities of individuals’ approaches according to their cultural background. We argue against the common approach of assuming that regularities are static, and that general traits of individuals are attributable categorically to ethnic group membership. We suggest that a cultural-historical approach can be used to help move beyond this assumption by focusing researchers’ and practitioners’ attention on variations in individuals’ and groups’ histories of engagement in cultural practices because the variations reside not as traits of individuals or collections of individuals, but as proclivities of people with certain histories of engagement with specific cultural activities. Thus, individuals’ and groups’ experience in activities—not their traits—becomes the focus. Also, we note that cultural-historical work needs to devote more attention to researching regularities in the variat...

1,805 citations


Journal ArticleDOI
19 Jun 2003-Nature
TL;DR: In this article, the authors reported evidence for a very energetic supernova (a hypernova), temporally and spatially coincident with a gamma-ray burst at redshift z=0.1685.
Abstract: Over the past five years evidence has mounted that long-duration (greater than 2 s) gamma-ray bursts (GRBs), the most brilliant of all astronomical explosions, signal the collapse of massive stars in our Universe. This evidence, originally based on the probable association of one unusual GRB with a supernova, now includes the association of GRBs with regions of massive star formation in distant galaxies, tantalizing evidence of supernova-like light-curve 'bumps' in the optical afterglows of several bursts, and lines of freshly synthesized elements in the spectra of a few X-ray afterglows. These observations support, but do not yet conclusively validate, models based upon the deaths of massive stars, presumably associated with core collapse. Here we report evidence for a very energetic supernova (a hypernova), temporally and spatially coincident with a GRB at redshift z = 0.1685. The timing of the supernova indicates that it exploded within a few days of the GRB, strongly suggesting that core-collapse events can give rise to GRBs. Amongst the GRB central engine models proposed to date, the properties of this supernova thus favour the collapsar model.

1,415 citations


Proceedings ArticleDOI
05 Nov 2003
TL;DR: The traffic-adaptive medium access protocol (TRAMA) is introduced for energy-efficient collision-free channel access in wireless sensor networks and is shown to be fair and correct, in that no idle node is an intended receiver and no receiver suffers collisions.
Abstract: The traffic-adaptive medium access protocol (TRAMA) is introduced for energy-efficient collision-free channel access in wireless sensor networks. TRAMA reduces energy consumption by ensuring that unicast, multicast, and broadcast transmissions have no collisions, and by allowing nodes to switch to a low-power, idle state whenever they are not transmitting or receiving. TRAMA assumes that time is slotted and uses a distributed election scheme based on information about the traffic at each node to determine which node can transmit at a particular time slot. TRAMA avoids the assignment of time slots to nodes with no traffic to send, and also allows nodes to determine when they can become idle and not listen to the channel using traffic information. TRAMA is shown to be fair and correct, in that no idle node is an intended receiver and no receiver suffers collisions. The performance of TRAMA is evaluated through extensive simulations using both synthetic- as well as sensor-network scenarios. The results indicate that TRAMA outperforms contention-based protocols (e.g., CSMA, 802.11 and S-MAC) as well as scheduling-based protocols (e.g., NAMA) with significant energy savings.
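The core of the slot election can be sketched compactly. The snippet below is a hedged illustration of a hash-based, traffic-aware election in the TRAMA/NAMA family, not the protocol's actual specification: the priority function, the two-hop neighborhood bookkeeping, and the schedule-exchange messages are all simplified, and the names are invented for this example.

```python
import hashlib

def priority(node_id, slot):
    """Pseudo-random priority for (node, slot), computable identically by every
    node that knows node_id and the slot number (the election's key property)."""
    digest = hashlib.sha256(f"{node_id}:{slot}".encode()).digest()
    return int.from_bytes(digest[:4], "big")

def slot_owner(slot, two_hop_neighbors, has_traffic):
    """Elect the transmitter for `slot` within a two-hop neighborhood.
    Nodes that announced no pending traffic do not contend for the slot, so
    idle nodes can switch to the low-power state described in the abstract."""
    contenders = [n for n in two_hop_neighbors if has_traffic.get(n, False)]
    if not contenders:
        return None  # nobody transmits; the neighborhood may sleep
    return max(contenders, key=lambda n: priority(n, slot))

# Toy usage: nodes 1-4 share a two-hop neighborhood; only nodes 1 and 3 have packets.
print(slot_owner(42, {1, 2, 3, 4}, {1: True, 2: False, 3: True, 4: False}))
```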

1,287 citations


Journal ArticleDOI
TL;DR: In this paper, the accuracy of interpolated δD and δ18O estimates made using four methods is tested using resampling, and the best method lowers estimation error by 10-15% relative to others tested and gives an average error, using all available data, of 2.5% of the global range.
Abstract: An accurate representation of the spatial distribution of stable hydrogen and oxygen isotopes in modern precipitation is required for many hydrological, paleoclimate, and ecological applications. No standardized method for achieving such a representation exists, and potential errors associated with previously employed methods are not understood. Using resampling, we test the accuracy of interpolated δD and δ18O estimates made using four methods. Prediction error for all methods is strongly related to number of data and will likely decline with the addition of new data. The best method lowers estimation error by 10-15% relative to others tested and gives an average error, using all available data, of 2.5% of the global range. We present and interpret global maps of interpolated δD, δ18O, and deuterium excess in precipitation and the 95% confidence intervals for these values created using the optimal method. These depict global and regional patterns, make evident the robustness of interpolated isotopic patterns, and highlight target areas for future precipitation sampling.
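The resampling test described here can be mimicked with a leave-one-out loop: hold out each station, predict it from the rest, and accumulate the prediction error. The sketch below uses inverse-distance weighting as a stand-in interpolator and synthetic station data; the paper's four methods and its station network are not reproduced here.

```python
import numpy as np

def idw_predict(xy_train, z_train, xy_test, power=2.0):
    """Inverse-distance-weighted prediction at one location (a generic stand-in,
    not one of the four interpolation methods compared in the paper)."""
    d = np.linalg.norm(xy_train - xy_test, axis=1)
    if np.any(d == 0):
        return z_train[d == 0][0]
    w = 1.0 / d ** power
    return np.sum(w * z_train) / np.sum(w)

def loo_rmse(xy, z):
    """Leave-one-out resampling: predict each station from all others and
    report the root-mean-square prediction error."""
    errs = [idw_predict(np.delete(xy, i, axis=0), np.delete(z, i), xy[i]) - z[i]
            for i in range(len(z))]
    return float(np.sqrt(np.mean(np.square(errs))))

# Synthetic stations: (lat, lon) pairs and delta-18O values (per mil) with a crude
# latitude dependence; purely illustrative numbers.
rng = np.random.default_rng(0)
xy = np.column_stack([rng.uniform(-80, 80, 60), rng.uniform(-180, 180, 60)])
z = -0.3 * np.abs(xy[:, 0]) + rng.normal(0, 1.5, 60)
print(f"leave-one-out RMSE: {loo_rmse(xy, z):.2f} per mil")
```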

1,071 citations


Journal ArticleDOI
TL;DR: In this article, a model for the assembly of supermassive black holes (SMBHs) at the center of galaxies that trace their hierarchical buildup far up in the dark halo merger tree is presented.
Abstract: We assess models for the assembly of supermassive black holes (SMBHs) at the center of galaxies that trace their hierarchical buildup far up in the dark halo merger tree. Motivated by the recent discovery of luminous quasars around redshift z ≈ 6—suggesting a very early assembly epoch—and by numerical simulations of the fragmentation of primordial molecular clouds in cold dark matter (CDM) cosmogonies, we assume that the first seed black holes (BHs) had intermediate masses and formed in (mini)halos collapsing at z ~ 20 from high-σ density fluctuations. As these pregalactic holes become incorporated through a series of mergers into larger and larger halos, they sink to the center because of dynamical friction, accrete a fraction of the gas in the merger remnant to become supermassive, form a binary system, and eventually coalesce. The merger history of dark matter halos and associated BHs is followed by cosmological Monte Carlo realizations of the merger hierarchy from early times until the present in a ΛCDM cosmology. A simple model, where quasar activity is driven by major mergers and SMBHs accrete at the Eddington rate a mass that scales with the fifth power of the circular velocity of the host halo, is shown to reproduce the observed luminosity function of optically selected quasars in the redshift range 1 < z < 5. A scheme for describing the hardening of a BH binary in a stellar background with core formation due to mass ejection is applied, where the stellar cusp proportional to r^-2 is promptly regenerated after every major merger event, replenishing the mass displaced by the binary. Triple BH interactions will inevitably take place at early times if the formation route for the assembly of SMBHs goes back to the very first generation of stars, and we follow them in our merger tree. The assumptions underlying our scenario lead to the prediction of a population of massive BHs wandering in galaxy halos and the intergalactic medium at the present epoch and contributing 10% to the total BH mass density, ρ_SMBH = 4 × 10^5 M☉ Mpc^-3 (h = 0.7). The fraction of binary SMBHs in galaxy nuclei is on the order of 10% today, and it increases with redshift so that almost all massive nuclear BHs at early epochs are in binary systems. The fraction of binary quasars (both members brighter than 0.1L*) instead is less than 0.3% at all epochs. The nuclear SMBH occupation fraction is unity (0.6) at the present epoch if the first seed BHs were as numerous as the 3.5 σ (4 σ) density peaks at z = 20.
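Two standard relations make the growth channel in this scenario concrete: Eddington-limited accretion gives exponential growth of the seed, and the model's accreted mass per major merger scales as the fifth power of the host halo's circular velocity (the scaling stated in the abstract). Written in conventional notation, with symbols not taken from the paper itself:

```latex
% Eddington luminosity and Eddington-limited growth of a seed of mass M_0 with
% radiative efficiency epsilon (standard relations, symbols not the paper's):
\[
  L_{\rm Edd} = \frac{4\pi G M m_p c}{\sigma_T}, \qquad
  M(t) = M_0 \exp\!\left[\frac{1-\epsilon}{\epsilon}\,\frac{t}{t_{\rm Edd}}\right],
  \qquad t_{\rm Edd} = \frac{\sigma_T c}{4\pi G m_p} \simeq 0.45~\mathrm{Gyr}.
\]
% The accreted mass per major merger in the model scales as
% \Delta M_{\rm acc} \propto V_c^{5}, with V_c the circular velocity of the host halo.
```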

985 citations


Book ChapterDOI
08 Jan 2003
TL;DR: The notion of "certain answers" in indefinite databases is adopted as the semantics for query answering in data exchange, and the computational complexity of computing the certain answers in this context is investigated.
Abstract: Data exchange is the problem of taking data structured under a source schema and creating an instance of a target schema that reflects the source data as accurately as possible. In this paper, we address foundational and algorithmic issues related to the semantics of data exchange and to query answering in the context of data exchange. These issues arise because, given a source instance, there may be many target instances that satisfy the constraints of the data exchange problem. We give an algebraic specification that selects, among all solutions to the data exchange problem, a special class of solutions that we call universal. A universal solution has no more and no less data than required for data exchange and it represents the entire space of possible solutions. We then identify fairly general, and practical, conditions that guarantee the existence of a universal solution and yield algorithms to compute a canonical universal solution efficiently. We adopt the notion of "certain answers" in indefinite databases for the semantics for query answering in data exchange. We investigate the computational complexity of computing the certain answers in this context and also study the problem of computing the certain answers of target queries by simply evaluating them on a canonical universal solution.
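A toy chase over a single source-to-target dependency shows what a canonical universal solution with labeled nulls looks like. The schema, dependency, and code below are hypothetical illustrations, not the paper's formalism; in particular, the sketch reuses one null per department, which is closer to the core of the canonical solution than to a literal chase run.

```python
import itertools

# Hypothetical source-to-target dependency:
#   Emp(name, dept)  ->  EXISTS M : Works(name, dept) AND Mgr(dept, M)
# The chase copies source data into the target and invents labeled nulls (N1, N2, ...)
# for existentially quantified values the source does not determine.
_nulls = itertools.count(1)

def chase(emp_tuples):
    works, mgr = [], {}
    for name, dept in emp_tuples:
        works.append((name, dept))
        if dept not in mgr:
            mgr[dept] = f"N{next(_nulls)}"   # fresh labeled null for the unknown manager
    return works, sorted(mgr.items())

works, mgr = chase([("alice", "cs"), ("bob", "cs"), ("carol", "math")])
print(works)   # [('alice', 'cs'), ('bob', 'cs'), ('carol', 'math')]
print(mgr)     # [('cs', 'N1'), ('math', 'N2')]
# "Which names appear in Works?" has the same answer in every solution, so it is a
# certain answer and can be read off this single canonical universal solution.
```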

916 citations


Journal ArticleDOI
TL;DR: This study compiled available information on the dispersal distance of the propagules of benthic marine organisms and used this information in the development of criteria for the design of marine reserves, suggesting that reserves be designed large enough to contain the short-distance dispersing propagules and be spaced far enough apart that long-distance dispersed propagules released from one reserve can settle in adjacent reserves.
Abstract: This study compiled available information on the dispersal distance of the propagules of benthic marine organisms and used this information in the development of criteria for the design of marine reserves. Many benthic marine organisms release propagules that spend time in the water column before settlement. During this period, ocean currents transport or disperse the propagules. When considering the size of a marine reserve and the spacing between reserves, one must consider the distance which propagules disperse. We could find estimates of dispersal distance for 32 taxa; for 25 of these, we were also able to find data on the time the propagules spent dispersing. Dispersal distance ranged from meters to thousands of kilometers, and time in the plankton ranged from minutes to months. A significant positive correlation was found between the log-transformed duration in the plankton and the log-transformed dispersal distance (r = 0.7776, r² = 0.60, df = 1, 25, P = 0.000); the more time propagules spend in the water column, the further they tend to be dispersed. The frequency distribution of the log-transformed dispersal distance is bimodal (kurtosis = -1.29, t = -4.062, P < 0.001) with a gap between 1 and 20 km. Propagules that dispersed <1 km spent less time in the plankton (<100 h), or if they remained in the plankton for a longer period, they tended to remain in the waters near the bottom. Propagules that dispersed >20 km spent more than 300 h in the plankton. The bimodal nature of the distribution suggests that evolutionary constraints may reduce the likelihood of evolving mid-range dispersal strategies (i.e., dispersal between 1 and 20 km), resulting in two evolutionarily stable dispersal strategies: dispersal <1 km or >~20 km. We suggest that reserves be designed large enough to contain the short-distance dispersing propagules and be spaced far enough apart that long-distance dispersing propagules released from one reserve can settle in adjacent reserves. A reserve 4-6 km in diameter should be large enough to contain the larvae of short-distance dispersers, and reserves spaced 10-20 km apart should be close enough to capture propagules released from adjacent reserves.
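The reported correlation is a straight-line fit in log-log space between time in the plankton and dispersal distance. A minimal sketch of that fit on invented numbers (the paper's compiled estimates for its 25 taxa are not reproduced here):

```python
import numpy as np

# Invented (duration in plankton [h], dispersal distance [km]) pairs,
# purely to illustrate the shape of the analysis.
duration_h = np.array([0.5, 10, 50, 200, 400, 800, 1500])
distance_km = np.array([0.01, 0.1, 0.8, 15, 60, 300, 1200])

# Fit log10(distance) = a + b * log10(duration), the relationship behind the
# reported correlation between the two log-transformed variables.
b, a = np.polyfit(np.log10(duration_h), np.log10(distance_km), deg=1)
r = np.corrcoef(np.log10(duration_h), np.log10(distance_km))[0, 1]
print(f"slope = {b:.2f}, intercept = {a:.2f}, r = {r:.3f}, r^2 = {r * r:.2f}")
```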

901 citations


Journal ArticleDOI
TL;DR: Analysis of the distribution of nifH phylotypes among habitats indicates that there are characteristic patterns of nitrogen fixing microorganisms in termite guts, sediment and soil environments, estuaries and salt marshes, and oligotrophic oceans.
Abstract: Biological nitrogen fixation is an important source of fixed nitrogen for the biosphere. Microorganisms catalyse biological nitrogen fixation with the enzyme nitrogenase, which has been highly conserved through evolution. Cloning and sequencing of one of the nitrogenase structural genes, nifH, has provided a large, rapidly expanding database of sequences from diverse terrestrial and aquatic environments. Comparison of nifH phylogenies to ribosomal RNA phylogenies from cultivated microorganisms shows little conclusive evidence of lateral gene transfer. Sequence diversity far outstrips representation by cultivated representatives. The phylogeny of nitrogenase includes branches that represent phylotypic groupings based on ribosomal RNA phylogeny, but also includes paralogous clades including the alternative, non-molybdenum, non-vanadium containing nitrogenases. Only a few alternative or archaeal nitrogenase sequences have as yet been obtained from the environment. Extensive analysis of the distribution of nifH phylotypes among habitats indicates that there are characteristic patterns of nitrogen fixing microorganisms in termite guts, sediment and soil environments, estuaries and salt marshes, and oligotrophic oceans. The distribution of nitrogen-fixing microorganisms, although not entirely dictated by the nitrogen availability in the environment, is non-random and can be predicted on the basis of habitat characteristics. The ability to assay for gene expression and investigate genome arrangements provides the promise of new tools for interrogating natural populations of diazotrophs. The broad analysis of nitrogenase genes provides a basis for developing molecular assays and bioinformatics approaches for the study of nitrogen fixation in the environment.

Journal ArticleDOI
TL;DR: The degree to which DOM penetrates the class of objects reflects the tension between two types of principles: one involves iconicity: the more marked a direct object qua object, the more likely it is to be overtly case-marked.
Abstract: A formal approach to the typology of differential object marking (DOM) is developed within the framework of Optimality Theory. The functional/typological literature has established that variation in DOM is structured by the dimensions of animacy and definiteness, with degree of prominence on these dimensions directly correlated with the likelihood of overt case-marking. In the present analysis, the degree to which DOM penetrates the class of objects reflects the tension between two types of principles. One involves iconicity: the more marked a direct object qua object, the more likely it is to be overtly case-marked. The other is a principle of economy: avoid case-marking. The tension between the two principles is resolved differently in different languages, as determined by language-particular ranking of the corresponding constraints. Constraints expressing object markedness are derived through harmonic alignment of prominence scales. Harmonic alignment predicts a corresponding phenomenon of differential subject marking. This too exists, though in a less articulated form.

ReportDOI
TL;DR: The authors argue that the normal evolution of the international monetary system involves the emergence of a periphery for which the development strategy is export-led growth supported by undervalued exchange rates, capital controls and official capital outflows in the form of accumulation of reserve asset claims on the center country.
Abstract: The economic emergence of a fixed exchange rate periphery in Asia has reestablished the United States as the center country in the Bretton Woods international monetary system. We argue that the normal evolution of the international monetary system involves the emergence of a periphery for which the development strategy is export-led growth supported by undervalued exchange rates, capital controls and official capital outflows in the form of accumulation of reserve asset claims on the center country. The success of this strategy in fostering economic growth allows the periphery to graduate to the center. Financial liberalization, in turn, requires floating exchange rates among the center countries. But there is a line of countries waiting to follow the Europe of the 1950s/60s and Asia today sufficient to keep the system intact for the foreseeable future.

Journal ArticleDOI
TL;DR: New alignment techniques that can handle large gaps in a robust fashion and discriminate between orthologous and paralogous alignments are developed and provide evidence that ≈2% of the genes in the human/mouse common ancestor have been deleted or partially deleted in the mouse.
Abstract: This study examines genomic duplications, deletions, and rearrangements that have happened at scales ranging from a single base to complete chromosomes by comparing the mouse and human genomes. From whole-genome sequence alignments, 344 large (>100-kb) blocks of conserved synteny are evident, but these are further fragmented by smaller-scale evolutionary events. Excluding transposon insertions, on average in each megabase of genomic alignment we observe two inversions, 17 duplications (five tandem or nearly tandem), seven transpositions, and 200 deletions of 100 bases or more. This includes 160 inversions and 75 duplications or transpositions of length >100 kb. The frequencies of these smaller events are not substantially higher in finished portions in the assembly. Many of the smaller transpositions are processed pseudogenes; we define a “syntenic” subset of the alignments that excludes these and other small-scale transpositions. These alignments provide evidence that ≈2% of the genes in the human/mouse common ancestor have been deleted or partially deleted in the mouse. There also appears to be slightly less nontransposon-induced genome duplication in the mouse than in the human lineage. Although some of the events we detect are possibly due to misassemblies or missing data in the current genome sequence or to the limitations of our methods, most are likely to represent genuine evolutionary events. To make these observations, we developed new alignment techniques that can handle large gaps in a robust fashion and discriminate between orthologous and paralogous alignments.

Journal ArticleDOI
TL;DR: In this article, the authors analyzed the halo occupation distribution, the probability for a halo of mass M to host a number of subhalos N, and two-point correlation function of galaxy-size dark matter halos using high-resolution dissipationless simulations of the concordance flat LCDM model.
Abstract: We analyze the halo occupation distribution (HOD), the probability for a halo of mass M to host a number of subhalos N, and two-point correlation function of galaxy-size dark matter halos using high-resolution dissipationless simulations of the concordance flat LCDM model. The halo samples include both the host halos and the subhalos, distinct gravitationally-bound halos within the virialized regions of larger host systems. We find that the first moment of the HOD, (M), has a complicated shape consisting of a step, a shoulder, and a power law high-mass tail. The HOD can be described by a Poisson statistics at high halo masses but becomes sub-Poisson for (M). We find that ~M^b with b~1 for a wide range of number densities, redshifts, and different power spectrum normalizations. This formulation provides a simple but accurate model for the halo occupation distribution found in simulations. At z=0, the two-point correlation function (CF) of galactic halos can be well fit by a power law down to ~100/h kpc with an amplitude and slope similar to those of observed galaxies. At redshifts z>~1, we find significant departures from the power-law shape of the CF at small scales. If the deviations are as strong as indicated by our results, the assumption of the single power law often used in observational analyses of high-redshift clustering is likely to bias the estimates of the correlation length and slope of the correlation function.
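In conventional notation, the two quantities discussed are the first moment of the occupation distribution and the power-law fit to the two-point correlation function; the approximate scaling below is the one quoted in the abstract.

```latex
% First moment of the halo occupation distribution and the power-law fit to the
% two-point correlation function (standard notation; the scaling is the one quoted above):
\[
  \langle N(M) \rangle \;=\; \sum_{N} N\, P(N \mid M) \;\propto\; M^{\,b},
  \qquad b \approx 1,
\]
\[
  \xi(r) \;=\; \left(\frac{r}{r_0}\right)^{-\gamma},
\]
% where r_0 is the correlation length and gamma the slope; both estimates are biased
% if the power-law form breaks down at small scales, as found at z >~ 1.
```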

Journal ArticleDOI
TL;DR: This article examines how people learn by actively observing and "listening-in" on ongoing activities as they participate in shared endeavors, and contrasts these two traditions of organizing learning in terms of their participation structure.
Abstract: This article examines how people learn by actively observing and “listening-in” on ongoing activities as they participate in shared endeavors. Keen observation and listening-in are especially valued and used in some cultural communities in which children are part of mature community activities. This intent participation also occurs in some settings (such as early language learning in the family) in communities that routinely segregate children from the full range of adult activities. However, in the past century some industrial societies have relied on a specialized form of instruction that seems to accompany segregation of children from adult settings, in which adults “transmit” information to children. We contrast these two traditions of organizing learning in terms of their participation structure, the roles of more- and less-experienced people, distinctions in motivation and purpose, sources of learning (observation in ongoing activity versus lessons), forms of communication, and the role of assessment.

Journal ArticleDOI
TL;DR: Findings reveal that in their initial attention, older adults avoid negative information, consistent with older adults' generally better emotional well-being and their tendency to remember negative less well than positive information.
Abstract: We examined age differences in attention to and memory for faces expressing sadness, anger, and happiness. Participants saw a pair of faces, one emotional and one neutral, and then a dot probe that appeared in the location of one of the faces. In two experiments, older adults responded faster to the dot if it was presented on the same side as a neutral face than if it was presented on the same side as a negative face. Younger adults did not exhibit this attentional bias. Interactions of age and valence were also found for memory for the faces, with older adults remembering positive better than negative faces. These findings reveal that in their initial attention, older adults avoid negative information. This attentional bias is consistent with older adults' generally better emotional well-being and their tendency to remember negative less well than positive information.

Journal ArticleDOI
TL;DR: In this article, the authors define agroecology as the ecology of food systems and provide a framework that will guide research, education, and action in the multiple and interacting facets of an increasingly complex global agriculture and food system.
Abstract: We present a compelling rationale for defining agroecology as the ecology of food systems. Our purpose is to provide a framework that will guide research, education, and action in the multiple and interacting facets of an increasingly complex global agriculture and food system. To accomplish such goals, it is essential to build bridges and connections among and beyond our disciplines in production agriculture, as well as beyond the farm gate into the rural landscape and community. Fields of sociology, anthropology, environmental sciences, ethics, and economics are crucial to the mix. They provide additional vantage points from which we can view the food system anew, as well as insights on how to establish valuation criteria beyond neoclassical economics. Examples from Mexico, California, and the Nordic Region are used to illustrate the successful implementation of this educational strategy in universities. Design of individual farms using principles of ecology is expanded to the levels of landsca...

Journal ArticleDOI
TL;DR: In this paper, the relative distribution of the galaxy pixel flux values (the Gini coefficient or G) and the second-order moment of the brightest 20% of a galaxy's flux (M20) were measured.
Abstract: We present two new non-parametric methods for quantifying galaxy morphology: the relative distribution of the galaxy pixel flux values (the Gini coefficient or G) and the second-order moment of the brightest 20% of the galaxy's flux (M20). We test the robustness of G and M20 to decreasing signal-to-noise and spatial resolution, and find that both measures are reliable to within 10% at average signal-to-noise per pixel greater than 3 and resolutions better than 1000 pc and 500 pc, respectively. We have measured G and M20, as well as concentration (C), asymmetry (A), and clumpiness (S), in the rest-frame near-ultraviolet/optical wavelengths for 150 bright local "normal" Hubble-type galaxies (E-Sd) and 104 0.05 < z < 0.25 ultra-luminous infrared galaxies (ULIRGs). We find that most local galaxies follow a tight sequence in G-M20-C, where early types have high G and C and low M20 and late-type spirals have lower G and C and higher M20. The majority of ULIRGs lie above the normal galaxy G-M20 sequence, due to their high G and M20 values. Their high Gini coefficients arise from very bright nuclei, while the high second-order moments are produced by multiple nuclei and bright tidal tails. All of these features are signatures of recent and on-going mergers and interactions. We also find that in combination with A and S, G is more effective than C at distinguishing ULIRGs from the "normal" Hubble types. Finally, we measure the morphologies of 45 1.7 < z < 3.8 galaxies from HST NICMOS observations of the Hubble Deep Field North. We find that many of the z ~ 2 galaxies possess G and A higher than expected from degraded images of local elliptical and spiral galaxies, and have morphologies more like low-redshift single-nucleus ULIRGs.
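Both statistics have compact definitions that can be computed directly from a segmented galaxy image. The sketch below follows the standard formulas for a Gini coefficient of pixel fluxes and for M20; the segmentation map and the moment-minimizing center used in practice are simplified here (a flux-weighted centroid stands in for the center), so treat it as illustrative.

```python
import numpy as np

def gini(flux):
    """Gini coefficient of the pixel flux values (standard form of the statistic;
    assumes the galaxy pixels have already been segmented out)."""
    x = np.sort(np.abs(np.ravel(flux)))
    n = x.size
    i = np.arange(1, n + 1)
    return np.sum((2 * i - n - 1) * x) / (np.mean(x) * n * (n - 1))

def m20(flux):
    """log10 of the second-order moment of the brightest 20% of the flux,
    normalized by the total second-order moment. A flux-weighted centroid is a
    simplification of the moment-minimizing center used in practice."""
    f = np.asarray(flux, dtype=float)
    ys, xs = np.indices(f.shape)
    xc = np.sum(xs * f) / np.sum(f)
    yc = np.sum(ys * f) / np.sum(f)
    mi = (f * ((xs - xc) ** 2 + (ys - yc) ** 2)).ravel()
    order = np.argsort(f.ravel())[::-1]                 # brightest pixels first
    cum = np.cumsum(f.ravel()[order])
    k = int(np.searchsorted(cum, 0.2 * cum[-1])) + 1    # pixels holding 20% of the flux
    return np.log10(np.sum(mi[order[:k]]) / np.sum(mi))

# Toy usage on a synthetic single-component "galaxy" (a Gaussian blob).
ys, xs = np.indices((64, 64))
img = np.exp(-((xs - 32.0) ** 2 + (ys - 32.0) ** 2) / (2 * 5.0 ** 2))
print(f"G = {gini(img):.2f}, M20 = {m20(img):.2f}")
```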

Journal ArticleDOI
TL;DR: In this article, the authors provide an empirical investigation of the medium-term determinants of current accounts for a large sample of industrial and developing countries, utilizing an approach that highlights macroeconomic determinants for longer-term saving and investment balances.

Journal ArticleDOI
TL;DR: It is shown that expression of a human equivalent to Drosophila's C4 pol II in human cultured cells affects alternative splicing of the fibronectin EDI exon and adenovirus E1a pre-mRNA, and that splicing of the Hox gene Ultrabithorax is stimulated, which demonstrates the transcriptional control of alternative splicing on an endogenous gene.

Journal ArticleDOI
TL;DR: A null model is developed that quantifies the impact of regulatory behaviors on body temperature and on performance of ectotherms and supports the alternative view that behavior has diverse—and sometimes conflicting—effects on the directions and rates at which other traits evolve.
Abstract: Some biologists embrace the classical view that changes in behavior inevitably initiate or drive evolutionary changes in other traits, yet others note that behavior sometimes inhibits evolutionary changes. Here we develop a null model that quantifies the impact of regulatory behaviors (specifically, thermoregulatory behaviors) on body temperature and on performance of ectotherms. We apply the model to data on a lizard (Anolis cristatellus) and show that thermoregulatory behaviors likely inhibit selection for evolutionary shifts in thermal physiology with altitude. Because behavioral adjustments are commonly used by ectotherms to regulate physiological performance, regulatory behaviors should generally constrain rather than drive evolution, a phenomenon we call the "Bogert effect." We briefly review a few other examples that contradict the classical view of behavior as the inevitable driving force in evolution. Overall, our analysis and brief review challenge the classical view that behavior is invariably the driving force in evolution, and instead our work supports the alternative view that behavior has diverse—and sometimes conflicting—effects on the directions and rates at which other traits evolve.
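The logic of the null model can be illustrated with a toy comparison between a thermoconformer, whose body temperature simply tracks the operative environmental temperature, and a behavioral thermoregulator, whose body temperature is pulled into a preferred range when the environment allows. The performance curve, preferred range, and site temperatures below are invented for illustration and are not the paper's values for Anolis cristatellus.

```python
import numpy as np

def performance(tb, topt=30.0, width=5.0):
    """Generic bell-shaped thermal performance curve (illustrative shape only)."""
    return np.exp(-((tb - topt) ** 2) / (2 * width ** 2))

def body_temperature(op_t, regulate, t_pref=(28.0, 32.0)):
    """Thermoconformer: body temperature equals operative temperature.
    Thermoregulator: body temperature is clamped into the preferred range."""
    return np.clip(op_t, *t_pref) if regulate else op_t

# Operative temperatures at a hypothetical cool, high-altitude site.
op_t = np.random.default_rng(1).normal(24.0, 4.0, size=1000)

for regulate in (False, True):
    tb = body_temperature(op_t, regulate)
    label = "thermoregulating" if regulate else "thermoconforming"
    print(f"{label}: mean Tb = {tb.mean():.1f} C, mean performance = {performance(tb).mean():.2f}")
# Regulation keeps performance near its optimum even in a cool environment, so the
# performance deficit (and hence selection on thermal physiology) shrinks: the Bogert effect.
```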

Journal ArticleDOI
14 Aug 2003-Nature
TL;DR: The generation and analysis of over 12 megabases of sequence from 12 species, all derived from the genomic region orthologous to a segment of about 1.8 Mb on human chromosome 7 containing ten genes, show conservation reflecting both functional constraints and the neutral mutational events that shaped this genomic region.
Abstract: The systematic comparison of genomic sequences from different organisms represents a central focus of contemporary genome analysis. Comparative analyses of vertebrate sequences can identify coding and conserved non-coding regions, including regulatory elements, and provide insight into the forces that have rendered modern-day genomes. As a complement to whole-genome sequencing efforts, we are sequencing and comparing targeted genomic regions in multiple, evolutionarily diverse vertebrates. Here we report the generation and analysis of over 12 megabases (Mb) of sequence from 12 species, all derived from the genomic region orthologous to a segment of about 1.8 Mb on human chromosome 7 containing ten genes, including the gene mutated in cystic fibrosis. These sequences show conservation reflecting both functional constraints and the neutral mutational events that shaped this genomic region. In particular, we identify substantial numbers of conserved non-coding segments beyond those previously identified experimentally, most of which are not detectable by pair-wise sequence comparisons alone. Analysis of transposable element insertions highlights the variation in genome dynamics among these species and confirms the placement of rodents as a sister group to the primates.

Journal ArticleDOI
TL;DR: Alternative food initiatives are appearing in many places. Observers suggest that they share a political agenda: to oppose the structures that coordinate and globalize the current food system and to create alternative systems of food production that are environmentally sustainable, economically viable, and socially just, as discussed by the authors.

Journal ArticleDOI
TL;DR: The synthesis of semiconductor nanocrystals of PbS, ZnS, CdS, and MnS through a facile and inexpensive synthetic process is reported, and all of the synthesized nanocrystals were highly crystalline.
Abstract: We report on the synthesis of semiconductor nanocrystals of PbS, ZnS, CdS, and MnS through a facile and inexpensive synthetic process. Metal−oleylamine complexes, which were obtained from the reaction of metal chloride and oleylamine, were mixed with sulfur. The reaction mixture was heated under appropriate experimental conditions to produce metal sulfide nanocrystals. Uniform cube-shaped PbS nanocrystals with particle sizes of 6, 8, 9, and 13 nm were synthesized. The particle size was controlled by changing the relative amount of PbCl2 and sulfur. Uniform 11 nm sized spherical ZnS nanocrystals were synthesized from the reaction of zinc chloride and sulfur, followed by one cycle of size-selective precipitation. CdS nanocrystals that consist of rods, bipods, and tripods were synthesized from a reaction mixture containing a 1:6 molar ratio of cadmium to sulfur. Spherical CdS nanocrystals (5.1 nm sized) were obtained from a reaction mixture with a cadmium to sulfur molar ratio of 2:1. MnS nanocrystals with v...


Journal ArticleDOI
Bernard Aubert, R. Barate, D. Boutigny, J. M. Gaillard, and 580 more (75 institutions)
TL;DR: In this paper, the authors observed a narrow state near 2.32 GeV/c² in the inclusive D_s^+ π^0 invariant mass distribution from e+e- annihilation data at energies near 10.6 GeV.
Abstract: We have observed a narrow state near 2.32 GeV/c² in the inclusive D_s^+ π^0 invariant mass distribution from e+e- annihilation data at energies near 10.6 GeV. The observed width is consistent with the experimental resolution. The small intrinsic width and the quantum numbers of the final state indicate that the decay violates isospin conservation. The state has natural spin-parity and the low mass suggests a J^P = 0^+ assignment. The data sample corresponds to an integrated luminosity of 91 fb^-1 recorded by the BABAR detector at the SLAC PEP-II asymmetric-energy e+e- storage ring.
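For reference, the distribution in which the peak appears is the invariant mass of each D_s^+ plus π^0 combination, written here in standard notation (natural units with c = 1):

```latex
% Invariant mass of each D_s^+ pi^0 combination (natural units, c = 1):
\[
  m\!\left(D_s^{+}\pi^{0}\right) \;=\;
  \sqrt{\left(E_{D_s^{+}} + E_{\pi^{0}}\right)^{2}
        - \left\lVert \vec{p}_{D_s^{+}} + \vec{p}_{\pi^{0}} \right\rVert^{2}},
\]
% the distribution in which the narrow state near 2.32 GeV/c^2 is observed.
```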

Journal ArticleDOI
TL;DR: This unit describes how to use the Genome Browser and Table Browser for genome analysis, download the underlying database tables, and create and display custom annotation tracks.
Abstract: The University of California Santa Cruz (UCSC) Genome Browser is a popular Web-based tool for quickly displaying a requested portion of a genome at any scale, accompanied by a series of aligned annotation "tracks." The annotations generated by the UCSC Genome Bioinformatics Group and external collaborators include gene predictions, mRNA and expressed sequence tag alignments, simple nucleotide polymorphisms, expression and regulatory data, phenotype and variation data, and pairwise and multiple-species comparative genomics data. All information relevant to a region is presented in one window, facilitating biological analysis and interpretation. The database tables underlying the Genome Browser tracks can be viewed, downloaded, and manipulated using another Web-based application, the UCSC Table Browser. Users can upload personal datasets in a wide variety of formats as custom annotation tracks in both browsers for research or educational purposes. This unit describes how to use the Genome Browser and Table Browser for genome analysis, download the underlying database tables, and create and display custom annotation tracks.
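A custom annotation track of the kind mentioned above can be as simple as a BED file with a track line. The sketch below writes such a file; the coordinates, names, and track settings are made up for illustration.

```python
# Write a minimal custom annotation track in BED format: a "track" settings line
# followed by tab-separated records with 0-based, half-open chromStart/chromEnd.
# All coordinates and labels here are invented examples.
records = [
    ("chr7", 127471196, 127472363, "regionA"),
    ("chr7", 127473530, 127474697, "regionB"),
]

with open("custom_track.bed", "w") as fh:
    fh.write('track name="myRegions" description="Example custom track" visibility=2\n')
    for chrom, start, end, name in records:
        fh.write(f"{chrom}\t{start}\t{end}\t{name}\n")
```

The resulting file can then be uploaded through either browser's custom track interface, as described in the unit.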

Journal ArticleDOI
TL;DR: This article proposes a model that employs inductive learning to discover multiple rules, and assigns them confidence scores based on their performance in the lexicon, and concludes that speakers extend morphological patterns based on abstract structural properties, of a kind appropriately described with rules.
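One way to read "confidence scores based on their performance in the lexicon" is as a rule's hit rate over the forms it could apply to, discounted when that scope is small. The sketch below implements that reading with a simple normal-approximation lower bound; the counts and the exact statistical adjustment are illustrative, not the paper's.

```python
import math

def rule_confidence(hits, scope, z=0.75):
    """Reliability of a candidate rule in the lexicon (hits / scope), adjusted
    downward for small scopes with a simple normal-approximation lower bound.
    The adjustment is a stand-in for the paper's statistical correction."""
    if scope == 0:
        return 0.0
    p = hits / scope
    margin = z * math.sqrt(p * (1 - p) / scope)
    return max(0.0, p - margin)

# Invented counts for two hypothetical English past-tense rules:
#   a broad "add -ed" rule that applies to almost everything in its scope,
#   and a narrow "i -> a" rule (sing -> sang) with a tiny scope.
print(rule_confidence(hits=4000, scope=4100))   # high reliability, large scope
print(rule_confidence(hits=6, scope=9))         # decent reliability, small scope, penalized more
```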

Proceedings ArticleDOI
18 Jun 2003
TL;DR: This work is inspired by recent progress in natural image statistics showing that the priors of image primitives can be well represented by examples, and proposes a Bayesian approach to image hallucination in which primal sketch priors are constructed and used to enhance the quality of the hallucinated high-resolution image.
Abstract: We propose a Bayesian approach to image hallucination. Given a generic low resolution image, we hallucinate a high resolution image using a set of training images. Our work is inspired by recent progress in natural image statistics showing that the priors of image primitives can be well represented by examples. Specifically, primal sketch priors (e.g., edges, ridges and corners) are constructed and used to enhance the quality of the hallucinated high resolution image. Moreover, a contour smoothness constraint enforces consistency of primitives in the hallucinated image by a Markov-chain based inference algorithm. A reconstruction constraint is also applied to further improve the quality of the hallucinated image. Experiments demonstrate that our approach can hallucinate high quality super-resolution images.
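Read as a MAP estimation problem, the approach combines an example-based prior on image primitives with the reconstruction constraint tying the high-resolution estimate back to the input. The notation below is generic (D for the blur-and-downsample operator, lambda for the trade-off weight) rather than the paper's own:

```latex
% MAP reading of the hallucination step: I_H is the high-resolution estimate,
% I_L the low-resolution input, D a blur-and-downsample operator, lambda a weight:
\[
  I_H^{*} \;=\; \arg\max_{I_H}\; p(I_L \mid I_H)\, p(I_H)
  \;=\; \arg\min_{I_H}\; \lVert D I_H - I_L \rVert^{2}
        \;+\; \lambda\, E_{\mathrm{prior}}(I_H),
\]
% where the first term is the reconstruction constraint and E_prior scores agreement
% with the example-based primal-sketch priors and the contour-smoothness constraint.
```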