
Showing papers published by the University of California, Santa Cruz in 2004


Journal ArticleDOI
TL;DR: In this paper, a new calibration curve for the conversion of radiocarbon ages to calibrated (cal) ages has been constructed and internationally ratified to replace IntCal98, which extended from 0-24 cal kyr BP (Before Present, 0 cal BP = AD 1950).
Abstract: A new calibration curve for the conversion of radiocarbon ages to calibrated (cal) ages has been constructed and internationally ratified to replace IntCal98, which extended from 0-24 cal kyr BP (Before Present, 0 cal BP = AD 1950). The new calibration data set for terrestrial samples extends from 0-26 cal kyr BP, but with much higher resolution beyond 11.4 cal kyr BP than IntCal98. Dendrochronologically-dated tree-ring samples cover the period from 0-12.4 cal kyr BP. Beyond the end of the tree rings, data from marine records (corals and foraminifera) are converted to the atmospheric equivalent with a site-specific marine reservoir correction to provide terrestrial calibration from 12.4-26.0 cal kyr BP. A substantial enhancement relative to IntCal98 is the introduction of a coherent statistical approach based on a random walk model, which takes into account the uncertainty in both the calendar age and the 14C age to calculate the underlying calibration curve (Buck and Blackwell, this issue). The tree-ring data sets, sources of uncertainty, and regional offsets are discussed here. The marine data sets and calibration curve for marine samples from the surface mixed layer (Marine04) are discussed in brief, but details are presented in Hughen et al. (this issue a). We do not make a recommendation for calibration beyond 26 cal kyr BP at this time; however, potential calibration data sets are compared in another paper (van der Plicht et al., this issue).
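A calibration curve like this one is applied by comparing a measured 14C age against the curve at every candidate calendar age. As a rough illustration of that application step (not of how IntCal04 itself was constructed), here is a minimal Python sketch; the curve arrays below are placeholders, and real use would load the ratified IntCal04 table.

```python
import numpy as np

def calibrate(rc_age, rc_err, cal_bp, curve_mu, curve_sigma):
    """Probability density over calendar ages for one radiocarbon date.

    cal_bp, curve_mu, curve_sigma describe the calibration curve:
    calendar-age grid, corresponding 14C ages, and curve uncertainty.
    """
    # Combine measurement and curve uncertainties in quadrature.
    var = rc_err**2 + curve_sigma**2
    # Likelihood of the measured 14C age at each calendar age.
    pdf = np.exp(-0.5 * (rc_age - curve_mu)**2 / var) / np.sqrt(2 * np.pi * var)
    # Normalize on the discrete calendar-age grid.
    return pdf / pdf.sum()

# Placeholder curve; substitute the IntCal04 table in practice.
cal_bp = np.arange(0.0, 26000.0, 5.0)
curve_mu = 0.97 * cal_bp                  # hypothetical smooth curve
curve_sigma = np.full_like(cal_bp, 30.0)  # hypothetical 1-sigma band

pdf = calibrate(rc_age=3000.0, rc_err=35.0, cal_bp=cal_bp,
                curve_mu=curve_mu, curve_sigma=curve_sigma)
print("posterior mode:", cal_bp[np.argmax(pdf)], "cal BP")
```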

3,737 citations


Journal ArticleDOI
TL;DR: This work has implemented k-means clustering, hierarchical clustering and self-organizing maps in a single multipurpose open-source library of C routines, callable from other C and C++ programs.
Abstract: Summary: We have implemented k-means clustering, hierarchical clustering and self-organizing maps in a single multipurpose open-source library of C routines, callable from other C and C++ programs. Using this library, we have created an improved version of Michael Eisen's well-known Cluster program for Windows, Mac OS X and Linux/Unix. In addition, we generated a Python and a Perl interface to the C Clustering Library, thereby combining the flexibility of a scripting language with the speed of C. Availability: The C Clustering Library and the corresponding Python C extension module Pycluster were released under the Python License, while the Perl module Algorithm::Cluster was released under the Artistic License. The GUI code Cluster 3.0 for Windows, Macintosh and Linux/Unix, as well as the corresponding command-line program, were released under the same license as the original Cluster code. The complete source code is available at http://bonsai.ims.u-tokyo.ac.jp/mdehoon/software/cluster. Alternatively, Algorithm::Cluster can be downloaded from CPAN, while Pycluster is also available as part of the Biopython distribution.
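As a usage illustration, k-means clustering through the library's Python interface looks roughly like the sketch below. The kcluster call follows the documented Pycluster interface; the toy expression matrix is invented.

```python
import numpy as np
from Pycluster import kcluster  # also exposed as Bio.Cluster in Biopython

# Toy expression matrix: 6 genes (rows) x 4 conditions (columns).
data = np.array([[1.0, 1.1, 0.9, 1.0],
                 [0.9, 1.0, 1.1, 1.0],
                 [5.0, 5.2, 4.9, 5.1],
                 [5.1, 5.0, 5.2, 4.8],
                 [9.0, 9.1, 8.8, 9.2],
                 [8.9, 9.0, 9.1, 9.0]])

# npass: number of random restarts; dist='e' selects Euclidean distance.
clusterid, error, nfound = kcluster(data, nclusters=3, npass=10, dist='e')
print(clusterid)  # one cluster label per gene, e.g. [0 0 1 1 2 2]
```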

2,815 citations


Journal ArticleDOI
Elise A. Feingold, Peter J. Good, Mark S. Guyer, S. Kamholz, +193 more (19 institutions)
22 Oct 2004-Science
TL;DR: The ENCyclopedia Of DNA Elements (ENCODE) Project is organized as an international consortium of computational and laboratory-based scientists working to develop and apply high-throughput approaches for detecting all sequence elements that confer biological function.
Abstract: The ENCyclopedia Of DNA Elements (ENCODE) Project aims to identify all functional elements in the human genome sequence. The pilot phase of the Project is focused on a specified 30 megabases (∼1%) of the human genome sequence and is organized as an international consortium of computational and laboratory-based scientists working to develop and apply high-throughput approaches for detecting all sequence elements that confer biological function. The results of this pilot phase will guide future efforts to analyze the entire human genome.

2,248 citations


Journal ArticleDOI
TL;DR: The University of California Santa Cruz (UCSC) Table Browser provides text-based access to a large collection of genome assemblies and annotation data stored in the Genome Browser Database and offers an enhanced level of query support that includes restrictions based on field values, free-form SQL queries and combined queries on multiple tables.
Abstract: The University of California Santa Cruz (UCSC) Table Browser (http://genome.ucsc.edu/cgi-bin/hgText) provides text-based access to a large collection of genome assemblies and annotation data stored in the Genome Browser Database. A flexible alternative to the graphical-based Genome Browser, this tool offers an enhanced level of query support that includes restrictions based on field values, free-form SQL queries and combined queries on multiple tables. Output can be filtered to restrict the fields and lines returned, and may be organized into one of several formats, including a simple tab-delimited file that can be loaded into a spreadsheet or database as well as advanced formats that may be uploaded into the Genome Browser as custom annotation tracks. The Table Browser User's Guide located on the UCSC website provides instructions and detailed examples for constructing queries and configuring output.
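Since the plain tab-delimited format is the simplest output, consuming it downstream takes only a few lines. A sketch, assuming a hypothetical snippet of Table Browser output whose first line is a '#'-prefixed header naming the fields:

```python
import csv
import io

# Hypothetical Table Browser output (tab-delimited, '#'-prefixed header).
output = io.StringIO(
    "#chrom\ttxStart\ttxEnd\tname\n"
    "chr7\t127471196\t127495720\tuc003abc.1\n"
    "chr7\t127505000\t127512000\tuc003abd.1\n"
)

header = output.readline().lstrip("#").strip().split("\t")
for row in csv.DictReader(output, fieldnames=header, delimiter="\t"):
    print(row["name"], int(row["txEnd"]) - int(row["txStart"]), "bp")
```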

2,223 citations


Journal ArticleDOI
TL;DR: This paper proposes an alternate approach using L1-norm minimization and robust regularization based on a bilateral prior to deal with different data and noise models and demonstrates its superiority to other super-resolution methods.
Abstract: Super-resolution reconstruction produces one or a set of high-resolution images from a set of low-resolution images. In the last two decades, a variety of super-resolution methods have been proposed. These methods are usually very sensitive to their assumed model of data and noise, which limits their utility. This paper reviews some of these methods and addresses their shortcomings. We propose an alternate approach using L1-norm minimization and robust regularization based on a bilateral prior to deal with different data and noise models. This computationally inexpensive method is robust to errors in motion and blur estimation and results in images with sharp edges. Simulation results confirm the effectiveness of our method and demonstrate its superiority to other super-resolution methods.
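The flavor of the proposed estimator can be sketched in a few lines of numpy: steepest descent on an L1 data-fidelity term plus a bilateral total-variation prior. To stay short, the sketch assumes the warp, blur, and decimation operators are identity (i.e. the frames are noisy observations already on the high-resolution grid), so it illustrates the robust-estimation idea rather than the paper's full pipeline; all names and parameter values are invented.

```python
import numpy as np

def btv_grad(x, p=2, alpha=0.7):
    """Gradient of a bilateral total-variation prior:
    sum over shifts (l, m) of alpha**(|l|+|m|) * ||x - shift(x, l, m)||_1."""
    g = np.zeros_like(x)
    for l in range(-p, p + 1):
        for m in range(-p, p + 1):
            if l == 0 and m == 0:
                continue
            s = np.sign(x - np.roll(np.roll(x, l, axis=0), m, axis=1))
            # sign term minus its back-shifted ("transposed") counterpart
            g += alpha ** (abs(l) + abs(m)) * (
                s - np.roll(np.roll(s, -l, axis=0), -m, axis=1))
    return g

def sr_step(x, frames, lam=0.05, beta=0.1):
    """One steepest-descent step on sum_k ||x - y_k||_1 + lam * BTV(x)."""
    data_grad = sum(np.sign(x - y) for y in frames)
    return x - beta * (data_grad + lam * btv_grad(x))

# Toy usage: median-like fusion of noisy frames with an edge-preserving prior.
rng = np.random.default_rng(0)
truth = np.zeros((32, 32))
truth[8:24, 8:24] = 1.0
frames = [truth + rng.laplace(scale=0.1, size=truth.shape) for _ in range(8)]
x = np.mean(frames, axis=0)
for _ in range(50):
    x = sr_step(x, frames)
print(f"MAE vs truth: {np.abs(x - truth).mean():.3f}")
```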

2,175 citations


Journal ArticleDOI
16 Jul 2004-Science
TL;DR: Pikitch et al. describe the potential benefits of implementing ecosystem-based fishery management that, in their view, far outweigh the difficulties of making the transition from a management system based on maximizing individual species.
Abstract: Ecosystem-based fishery management (EBFM) is a new direction for fishery management, essentially reversing the order of management priorities so that management starts with the ecosystem rather than a target species. EBFM aims to sustain healthy marine ecosystems and the fisheries they support. Pikitch et al. describe the potential benefits of implementation of EBFM that, in their view, far outweigh the difficulties of making the transition from a management system based on maximizing individual species.

2,011 citations


Journal ArticleDOI
01 Apr 2004-Nature
TL;DR: The genome sequence of the Brown Norway (BN) rat strain, the third complete mammalian genome to be deciphered, is reported; three-way comparisons with the human and mouse genomes resolve details of mammalian evolution.
Abstract: The laboratory rat (Rattus norvegicus) is an indispensable tool in experimental medicine and drug development, having made inestimable contributions to human health. We report here the genome sequence of the Brown Norway (BN) rat strain. The sequence represents a high-quality 'draft' covering over 90% of the genome. The BN rat sequence is the third complete mammalian genome to be deciphered, and three-way comparisons with the human and mouse genomes resolve details of mammalian evolution. This first comprehensive analysis includes genes and proteins and their relation to human disease, repeated sequences, comparative genome-wide studies of mammalian orthologous chromosomal regions and rearrangement breakpoints, reconstruction of ancestral karyotypes and the events leading to existing species, rates of variation, and lineage-specific and lineage-independent evolutionary events such as expansion of gene families, orthology relations and protein evolution.

1,964 citations


Journal ArticleDOI
01 Oct 2004-Science
TL;DR: The 34-million-base-pair draft nuclear genome of the marine diatom Thalassiosira pseudonana and its 129-thousand-base-pair plastid and 44-thousand-base-pair mitochondrial genomes were reported in this article.
Abstract: Diatoms are unicellular algae with plastids acquired by secondary endosymbiosis. They are responsible for approximately 20% of global carbon fixation. We report the 34-million-base-pair draft nuclear genome of the marine diatom Thalassiosira pseudonana and its 129-thousand-base-pair plastid and 44-thousand-base-pair mitochondrial genomes. Sequence and optical restriction mapping revealed 24 diploid nuclear chromosomes. We identified novel genes for silicic acid transport and formation of silica-based cell walls, high-affinity iron uptake, biosynthetic enzymes for several types of polyunsaturated fatty acids, use of a range of nitrogenous compounds, and a complete urea cycle, all attributes that allow diatoms to prosper in aquatic environments.

1,945 citations


Journal ArticleDOI
TL;DR: A complete and consistent reanalysis of broad emission-line reverberation-mapping data for 35 active galactic nuclei (AGNs) shows that the highest precision measure of the virial product cτΔV²/G is obtained by using the cross-correlation function centroid (rather than its peak) for the time delay and the line dispersion (rather than FWHM) for the line width.
Abstract: We present improved black hole masses for 35 active galactic nuclei (AGNs) based on a complete and consistent reanalysis of broad emission-line reverberation-mapping data. From objects with multiple line measurements, we find that the highest precision measure of the virial product cτΔV²/G, where τ is the emission-line lag relative to continuum variations and ΔV is the emission-line width, is obtained by using the cross-correlation function centroid (as opposed to the cross-correlation function peak) for the time delay and the line dispersion (as opposed to FWHM) for the line width, and by measuring the line width in the variable part of the spectrum. Accurate line-width measurement depends critically on avoiding contaminating features, in particular the narrow components of the emission lines. We find that the precision (or random component of the error) of reverberation-based black hole mass measurements is typically around 30%, comparable to the precision attained in measurement of black hole masses in quiescent galaxies by gas or stellar dynamical methods. Based on results presented in a companion paper by Onken et al., we provide a zero-point calibration for the reverberation-based black hole mass scale by using the relationship between black hole mass and host-galaxy bulge velocity dispersion. The scatter around this relationship implies that the typical systematic uncertainties in reverberation-based black hole masses are smaller than a factor of 3. We present a preliminary version of a mass-luminosity relationship that is much better defined than any previous attempt. Scatter about the mass-luminosity relationship for these AGNs appears to be real and could be correlated with either Eddington ratio or object inclination.
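Once a lag and a line width are measured, the virial product itself is simple arithmetic. A toy evaluation in Python (values illustrative; f of roughly 5.5 for the line dispersion is the zero point associated with the Onken et al. companion calibration):

```python
# Virial black hole mass from a reverberation lag and line width:
#   M_BH = f * c * tau * dV**2 / G
C = 2.998e10      # speed of light, cm/s
G = 6.674e-8      # gravitational constant, cm^3 g^-1 s^-2
MSUN = 1.989e33   # solar mass, g
DAY = 86400.0     # seconds per day

def virial_mass(tau_days, dv_kms, f=5.5):
    """f is the dimensionless virial factor; ~5.5 is the zero point
    appropriate for the line dispersion, per the companion calibration."""
    return f * C * (tau_days * DAY) * (dv_kms * 1e5) ** 2 / (G * MSUN)

# e.g. a 20-day lag and a 2000 km/s line dispersion:
print(f"{virial_mass(20.0, 2000.0):.1e} Msun")  # ~8.6e+07 Msun
```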

1,893 citations


Journal ArticleDOI
28 May 2004-Science
TL;DR: There are 481 segments longer than 200 base pairs that are absolutely conserved between orthologous regions of the human, rat, and mouse genomes; these represent a class of genetic elements whose functions and evolutionary origins are yet to be determined, but which are more highly conserved between these species than are proteins.
Abstract: There are 481 segments longer than 200 base pairs (bp) that are absolutely conserved (100% identity with no insertions or deletions) between orthologous regions of the human, rat, and mouse genomes. Nearly all of these segments are also conserved in the chicken and dog genomes, with an average of 95 and 99% identity, respectively. Many are also significantly conserved in fish. These ultraconserved elements of the human genome are most often located either overlapping exons in genes involved in RNA processing or in introns or nearby genes involved in the regulation of transcription and development. Along with more than 5000 sequences of over 100 bp that are absolutely conserved among the three sequenced mammals, these represent a class of genetic elements whose functions and evolutionary origins are yet to be determined, but which are more highly conserved between these species than are proteins and appear to be essential for the ontogeny of mammals and other vertebrates.
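Finding such segments reduces to scanning alignment columns for maximal runs of perfect, gap-free identity across all species. A simplified sketch on toy sequences (the published analysis of course ran over whole-genome alignments; min_len=200 mirrors the 200-bp threshold):

```python
def ultraconserved(rows, min_len=200):
    """Yield (start, end) spans of alignment columns that are identical in
    every row, with no gaps, and at least min_len columns long."""
    n = len(rows[0])
    run_start = None
    for i in range(n + 1):
        ok = i < n and rows[0][i] != '-' and all(r[i] == rows[0][i] for r in rows)
        if ok and run_start is None:
            run_start = i
        elif not ok and run_start is not None:
            if i - run_start >= min_len:
                yield run_start, i
            run_start = None

# Toy "alignment": 240 perfectly conserved columns, then divergence.
human = "ACGT" * 60 + "TTTT"
mouse = "ACGT" * 60 + "CCCC"
rat   = "ACGT" * 60 + "GGGG"
print(list(ultraconserved([human, mouse, rat])))  # [(0, 240)]
```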

1,690 citations


Journal ArticleDOI
TL;DR: The GOODS survey is based on multiband imaging data obtained with the Hubble Space Telescope and the Advanced Camera for Surveys (ACS) and covers roughly 320 arcmin² in the ACS F435W, F606W, F814W, and F850LP bands, divided into two well-studied fields.
Abstract: This special issue of the Astrophysical Journal Letters is dedicated to presenting initial results from the Great Observatories Origins Deep Survey (GOODS) that are primarily, but not exclusively, based on multiband imaging data obtained with the Hubble Space Telescope and the Advanced Camera for Surveys (ACS). The survey covers roughly 320 arcmin² in the ACS F435W, F606W, F814W, and F850LP bands, divided into two well-studied fields. Existing deep observations from the Chandra X-Ray Observatory and ground-based facilities are supplemented with new, deep imaging in the optical and near-infrared from the European Southern Observatory and from the Kitt Peak National Observatory. Deep observations with the Space Infrared Telescope Facility are scheduled. Reduced data from all facilities are being released worldwide within 3-6 months of acquisition. Together, this data set provides two deep reference fields for studies of distant normal and active galaxies, supernovae, and faint stars in our own Galaxy. This Letter serves to outline the survey strategy and describe the specific data that have been used in the accompanying letters, summarizing the reduction procedures and sensitivity limits.

Journal ArticleDOI
TL;DR: A review of the successes and problems of both the classical dynamical theory and the standard theory of magnetostatic support, from both observational and theoretical perspectives, is given in this paper.
Abstract: Understanding the formation of stars in galaxies is central to much of modern astrophysics. However, a quantitative prediction of the star formation rate and the initial distribution of stellar masses remains elusive. For several decades it has been thought that the star formation process is primarily controlled by the interplay between gravity and magnetostatic support, modulated by neutral-ion drift (known as ambipolar diffusion in astrophysics). Recently, however, both observational and numerical work has begun to suggest that supersonic turbulent flows rather than static magnetic fields control star formation. To some extent, this represents a return to ideas popular before the importance of magnetic fields to the interstellar gas was fully appreciated. This review gives a historical overview of the successes and problems of both the classical dynamical theory and the standard theory of magnetostatic support, from both observational and theoretical perspectives. The outline of a new theory relying on control by driven supersonic turbulence is then presented. Numerical models demonstrate that, although supersonic turbulence can provide global support, it nevertheless produces density enhancements that allow local collapse. Inefficient, isolated star formation is a hallmark of turbulent support, while efficient, clustered star formation occurs in its absence. The consequences of this theory are then explored for both local star formation and galactic-scale star formation. It suggests that individual star-forming cores are likely not quasistatic objects, but dynamically collapsing. Accretion onto these objects varies depending on the properties of the surrounding turbulent flow; numerical models agree with observations showing decreasing rates. The initial mass distribution of stars may also be determined by the turbulent flow. Molecular clouds appear to be transient objects forming and dissolving in the larger-scale turbulent flow, or else quickly collapsing into regions of violent star formation. Global star formation in galaxies appears to be controlled by the same balance between gravity and turbulence as small-scale star formation, although modulated by cooling and differential rotation. The dominant driving mechanism in star-forming regions of galaxies appears to be supernovae, while elsewhere coupling of rotation to the gas through magnetic fields or gravity may be important.

Journal ArticleDOI
TL;DR: Experimental evaluation of alignment quality, using a program that simulates evolutionary change in genomic sequences, indicates that TBA is more accurate than earlier programs.
Abstract: We define a "threaded blockset," which is a novel generalization of the classic notion of a multiple alignment. A new computer program called TBA (for "threaded blockset aligner") builds a threaded blockset under the assumption that all matching segments occur in the same order and orientation in the given sequences; inversions and duplications are not addressed. TBA is designed to be appropriate for aligning many, but by no means all, megabase-sized regions of multiple mammalian genomes. The output of TBA can be projected onto any genome chosen as a reference, thus guaranteeing that different projections present consistent predictions of which genomic positions are orthologous. This capability is illustrated using a new visualization tool to view TBA-generated alignments of vertebrate Hox clusters from both the mammalian and fish perspectives. Experimental evaluation of alignment quality, using a program that simulates evolutionary change in genomic sequences, indicates that TBA is more accurate than earlier programs. To perform the dynamic-programming alignment step, TBA runs a stand-alone program called MULTIZ, which can be used to align highly rearranged or incompletely sequenced genomes. We describe our use of MULTIZ to produce the whole-genome multiple alignments at the Santa Cruz Genome Browser.
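The consistency guarantee can be illustrated on a single gapped alignment block: skipping reference gaps maps every reference base to a position (or an alignment gap) in each other row, so projections onto different references describe the same orthology. A toy sketch, not TBA's actual code:

```python
def project(ref_row, other_row, ref_start):
    """Map reference coordinates to a second species' (ungapped) sequence
    positions within one gapped alignment block; None marks a deletion."""
    mapping = {}
    ref_pos, other_pos = ref_start, 0
    for r, o in zip(ref_row, other_row):
        if r != '-':
            mapping[ref_pos] = other_pos if o != '-' else None
            ref_pos += 1
        if o != '-':
            other_pos += 1
    return mapping

# Hypothetical block: reference (human) vs. a second species with an indel.
human = "ACG-TA"
mouse = "AC-GTA"
print(project(human, mouse, ref_start=100))
# {100: 0, 101: 1, 102: None, 103: 3, 104: 4}
```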

Journal ArticleDOI
TL;DR: This paper showed that US prestige-press coverage of global warming from 1988 to 2002 has contributed to a significant divergence of popular discourse from scientific discourse, resulting from an accumulation of tactical media responses and practices guided by widely accepted journalistic norms.
Abstract: This paper demonstrates that US prestige-press coverage of global warming from 1988 to 2002 has contributed to a significant divergence of popular discourse from scientific discourse. This failed discursive translation results from an accumulation of tactical media responses and practices guided by widely accepted journalistic norms. Through content analysis of US prestige press—meaning the New York Times , the Washington Post , the Los Angeles Times , and the Wall Street Journal —this paper focuses on the norm of balanced reporting, and shows that the prestige press's adherence to balance actually leads to biased coverage of both anthropogenic contributions to global warming and resultant action.

Journal ArticleDOI
TL;DR: In this paper, new radiocarbon calibration curves, IntCal04 and Marine04, have been constructed and internationally ratified to replace the terrestrial and marine components of IntCal98.
Abstract: New radiocarbon calibration curves, IntCal04 and Marine04, have been constructed and internationally ratified to replace the terrestrial and marine components of IntCal98. The new calibration data sets extend an additional 2000 yr, from 0-26 cal kyr BP (Before Present, 0 cal BP = AD 1950), and provide much higher resolution, greater precision, and more detailed structure than IntCal98. For the Marine04 curve, dendrochronologically-dated tree-ring samples, converted with a box diffusion model to marine mixed-layer ages, cover the period from 0-10.5 cal kyr BP. Beyond 10.5 cal kyr BP, high-resolution marine data become available from foraminifera in varved sediments and U/Th-dated corals. The marine records are corrected with site-specific 14C reservoir age information to provide a single global marine mixed-layer calibration from 10.5-26.0 cal kyr BP. A substantial enhancement relative to IntCal98 is the introduction of a random walk model, which takes into account the uncertainty in both the calendar age and the 14C age to calculate the underlying calibration curve (Buck and Blackwell, this issue). The marine data sets and calibration curve for marine samples from the surface mixed layer (Marine04) are discussed here. The tree-ring data sets, sources of uncertainty, and regional offsets are presented in detail in a companion paper by Reimer et al. (this issue).
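As a small worked example of the reservoir-correction idea in its conventional form: the global marine curve absorbs the mean mixed-layer reservoir age, leaving only a site-specific offset (ΔR) to subtract, with its uncertainty added in quadrature. The numbers below are invented.

```python
# Measured marine 14C age (e.g. a shell) and a hypothetical local offset.
measured_age, measured_err = 2400, 40   # 14C yr BP
delta_r, delta_r_err = 250, 50          # site-specific Delta-R, yr

corrected_age = measured_age - delta_r
corrected_err = (measured_err**2 + delta_r_err**2) ** 0.5
print(corrected_age, round(corrected_err, 1))  # 2150 64.0, then calibrate
                                               # against Marine04
```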

Journal ArticleDOI
TL;DR: In this review, recent findings are surveyed to illustrate that protein fibrillogenesis requires a partially folded conformation that is relatively unfolded and shares many structural properties with the pre-molten globule state.

Journal ArticleDOI
10 Jun 2004-Nature
TL;DR: It is shown that RUE decreases across biomes as mean annual precipitation increases, and during the driest years at each site, there is convergence to a common maximum RUE (RUEmax) that is typical of arid ecosystems.
Abstract: Water availability limits plant growth and production in almost all terrestrial ecosystems. However, biomes differ substantially in sensitivity of aboveground net primary production (ANPP) to between-year variation in precipitation. Average rain-use efficiency (RUE; ANPP/precipitation) also varies between biomes, supposedly because of differences in vegetation structure and/or biogeochemical constraints. Here we show that RUE decreases across biomes as mean annual precipitation increases. However, during the driest years at each site, there is convergence to a common maximum RUE (RUEmax) that is typical of arid ecosystems. RUEmax was also identified by experimentally altering the degree of limitation by water and other resources. Thus, in years when water is most limiting, deserts, grasslands and forests all exhibit the same rate of biomass production per unit rainfall, despite differences in physiognomy and site-level RUE. Global climate models predict increased between-year variability in precipitation, more frequent extreme drought events, and changes in temperature. Forecasts of future ecosystem behaviour should take into account this convergent feature of terrestrial biomes.
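Because RUE is simply ANPP divided by precipitation, the convergence result is easy to illustrate with invented numbers: site-mean RUE falls from desert to forest, yet the driest year at each site lands near a common RUEmax.

```python
# Illustrative (ANPP in g m^-2, precipitation in mm) pairs per year.
sites = {
    "desert":    [(60, 100), (30, 60)],
    "grassland": [(280, 700), (120, 240)],
    "forest":    [(600, 2000), (550, 1100)],
}
for name, years in sites.items():
    rue = [anpp / ppt for anpp, ppt in years]
    anpp_dry, ppt_dry = min(years, key=lambda yr: yr[1])  # driest year
    print(f"{name:9s} mean RUE {sum(rue) / len(rue):.2f}, "
          f"driest-year RUE {anpp_dry / ppt_dry:.2f}")
```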

Proceedings ArticleDOI
04 Jun 2004
TL;DR: This work presents the fast recovery mechanism (FARM), a distributed recovery approach that exploits excess disk capacity and reduces data recovery time, and examines essential factors that influence system reliability, performance, and costs by simulating system behavior under disk failures.
Abstract: Storage clusters consisting of thousands of disk drives are now being used both for their large capacity and high throughput. However, their reliability is far worse than that of smaller storage systems due to the increased number of storage nodes. RAID technology is no longer sufficient to guarantee the necessary high data reliability for such systems, because disk rebuild time lengthens as disk capacity grows. We present the fast recovery mechanism (FARM), a distributed recovery approach that exploits excess disk capacity and reduces data recovery time. FARM works in concert with replication and erasure-coding redundancy schemes to dramatically lower the probability of data loss in large-scale storage systems. We have examined essential factors that influence system reliability, performance, and costs, such as failure detection, disk bandwidth usage for recovery, disk space utilization, disk drive replacement, and system scale, by simulating system behavior under disk failures. Our results show the reliability improvement from FARM and demonstrate the impacts of various factors on system reliability. Using our techniques, system designers will be better able to build multipetabyte storage systems with much higher reliability at lower cost than previously possible.
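A back-of-envelope sketch of why distributed recovery helps, far simpler than the paper's simulations: rebuilding in parallel across many disks shrinks the window during which a further failure can destroy the last redundant copy. The capacities, bandwidth, parallelism factor, and failure model below are all invented for illustration.

```python
disk_gb, disk_mb_s = 250, 50.0   # assumed 2004-era capacity and bandwidth
mtbf_hours = 1.0e5               # assumed per-disk mean time between failures

def window_loss_prob(rebuild_hours, replicas_at_risk=1):
    """Probability that a disk holding the last redundant copy fails
    during the rebuild window (constant-hazard approximation)."""
    return 1 - (1 - rebuild_hours / mtbf_hours) ** replicas_at_risk

raid_hours = disk_gb * 1024 / disk_mb_s / 3600  # whole disk onto one spare
farm_hours = raid_hours / 100                   # ~100 disks rebuild in parallel

print(f"single-spare rebuild: {raid_hours:5.2f} h, "
      f"loss prob per window {window_loss_prob(raid_hours):.1e}")
print(f"distributed rebuild:  {farm_hours:5.2f} h, "
      f"loss prob per window {window_loss_prob(farm_hours):.1e}")
```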

Journal ArticleDOI
TL;DR: In this article, the authors examine the accretion of cores of giant planets from planetesimals, gas accretion onto the cores, and their orbital migration and show that the mass and semimajor axis distributions generated in their simulations for the gas giants are consistent with those of the known extrasolar planets.
Abstract: In an attempt to develop a deterministic theory for planet formation, we examine the accretion of cores of giant planets from planetesimals, gas accretion onto the cores, and their orbital migration. We adopt a working model for nascent protostellar disks with a wide variety of surface density distributions in order to explore the range of diversity among extrasolar planetary systems. We evaluate the cores' mass growth rate through runaway planetesimal accretion and oligarchic growth. The accretion rate of cores is estimated with a two-body approximation. In the inner regions of disks, the cores' eccentricity is effectively damped by their tidal interaction with the ambient disk gas and their early growth is stalled by isolation. In the outer regions, the cores' growth rate is much smaller. If some cores can acquire more mass than a critical value of several Earth masses during the persistence of the disk gas, they would be able to rapidly accrete gas and evolve into gas giant planets. The gas accretion process is initially regulated by the Kelvin-Helmholtz contraction of the planets' gas envelope. Based on the assumption that the exponential decay of the disk gas mass occurs on timescales of ~10⁶-10⁷ yr and that the disk mass distribution is comparable to those inferred from the observations of circumstellar disks of T Tauri stars, we carry out simulations to predict the distributions of masses and semimajor axes of extrasolar planets. In disks as massive as the minimum-mass disk for the solar system, gas giants can form only slightly outside the ice boundary at a few AU. However, cores can rapidly grow above the critical mass inside the ice boundary in protostellar disks with 5 times more heavy elements than those of the minimum-mass disk. Thereafter, these massive cores accrete gas prior to its depletion and evolve into gas giants. The limited persistence of the disk gas and the decline in the stellar gravity prevent the formation of cores capable of efficient gas accretion outside 20-30 AU. Unimpeded dynamical accretion of gas is a runaway process that is terminated when the residual gas is depleted either globally or locally in the form of a gap in the vicinity of their orbits. Since planets' masses grow rapidly from 10 to 100 M⊕, gas giant planets rarely form with asymptotic masses in this intermediate range. Our model predicts a paucity of extrasolar planets with mass in the range 10-100 M⊕ and semimajor axis less than 3 AU. We refer to this deficit as a planet desert. We also examine the dynamical evolution of protoplanets by considering the effect of orbital migration of giant planets due to their tidal interactions with the gas disks, after they have opened up gaps in the disks. The effect of migration is to sharpen the boundaries and to enhance the contrast of the planet desert. It also clarifies the separation between the three populations of rocky, gas giant, and ice giant planets. Based on our results, we suggest that the planets' mass versus semimajor axis diagram can provide strong constraints on the dominant formation processes of planets, analogous to the implications of the color-magnitude diagram for the paths of stellar evolution. We show that the mass and semimajor axis distributions generated in our simulations for the gas giants are consistent with those of the known extrasolar planets. Our results also indicate that a large fraction (90%-95%) of the planets that have migrated to within 0.05 AU must have perished. Future observations can determine the existence and the boundaries of the planet desert in this diagram, which can be used to extrapolate the ubiquity of rocky planets around nearby stars. Finally, the long-term dynamical interaction between planets of various masses can lead to both eccentricity excitation and scattering of planets to large semimajor axes. These effects are to be included in future models.

Journal ArticleDOI
TL;DR: In this article, the authors analyzed the halo occupation distribution and two-point correlation function of galaxy-size dark matter halos using high-resolution dissipationless simulations of the concordance flat ΛCDM model.
Abstract: We analyze the halo occupation distribution (HOD) and two-point correlation function of galaxy-size dark matter halos using high-resolution dissipationless simulations of the concordance flat ΛCDM model. The halo samples include both the host halos and the "subhalos," distinct gravitationally bound halos within the virialized regions of larger host systems. We find that the HOD, the probability distribution for a halo of mass M to host a number of subhalos N, is similar to that found in semianalytic and N-body+gasdynamics studies. Its first moment, ⟨N⟩M, has a complicated shape consisting of a step, a shoulder, and a power-law high-mass tail. The HOD can be described by Poisson statistics at high halo masses but becomes sub-Poissonian for ⟨N⟩M ≲ 4. We show that the HOD can be understood as a combination of the probability for a halo of mass M to host a central galaxy and the probability to host a given number Ns of satellite galaxies. The former can be approximated by a steplike function, while the latter can be well approximated by a Poisson distribution, fully specified by its first moment. The first moment of the satellite HOD can be well described by a simple power law ⟨Ns⟩ ∝ M^β with β ≈ 1 for a wide range of number densities, redshifts, and different power spectrum normalizations. This formulation provides a simple but accurate model for the halo occupation distribution found in simulations. At z = 0, the two-point correlation function (CF) of galactic halos can be well fitted by a power law down to ~100 h⁻¹ kpc with an amplitude and slope similar to those of observed galaxies. The dependence of correlation amplitude on the number density of objects is in general agreement with results from the Sloan Digital Sky Survey. At redshifts z ≳ 1, we find significant departures from the power-law shape of the CF at small scales, where the CF steepens because of a more pronounced one-halo component. The departures from the power law may thus be easier to detect in high-redshift galaxy surveys than at the present-day epoch. They can be used to put useful constraints on the environments and formation of galaxies. If the deviations are as strong as indicated by our results, the assumption of the single power law often used in observational analyses of high-redshift clustering is dangerous and is likely to bias the estimates of the correlation length and slope of the correlation function.
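The decomposition described here, a step-function central term plus Poisson satellites with first moment proportional to M^β, turns directly into a mock-catalog sampler. A sketch with illustrative parameter values, not the paper's fitted ones:

```python
import numpy as np

rng = np.random.default_rng(1)

def hod_sample(m_halo, m_min=1e12, m_1=2e13, beta=1.0):
    """Sample galaxy counts per halo: a step-function central galaxy above
    m_min plus Poisson satellites with mean (M / m_1)**beta.
    Parameter values are illustrative, not the paper's fits."""
    central = (m_halo >= m_min).astype(int)
    mean_sat = np.where(central == 1, (m_halo / m_1) ** beta, 0.0)
    return central + rng.poisson(mean_sat)

masses = 10 ** rng.uniform(11.0, 15.0, size=10000)  # Msun/h, illustrative
counts = hod_sample(masses)
massive = masses > 1e14
print("mean N for M > 1e14:", counts[massive].mean())
```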

Journal ArticleDOI
01 Oct 2004-Science
TL;DR: Evidence now supports the idea that humans contributed to extinction on some continents, but human hunting was not solely responsible for the pattern of extinction everywhere; instead, evidence suggests that the intersection of human impacts with pronounced climatic change drove the precise timing and geography of extinction in the Northern Hemisphere.
Abstract: One of the great debates about extinction is whether humans or climatic change caused the demise of the Pleistocene megafauna. Evidence from paleontology, climatology, archaeology, and ecology now supports the idea that humans contributed to extinction on some continents, but human hunting was not solely responsible for the pattern of extinction everywhere. Instead, evidence suggests that the intersection of human impacts with pronounced climatic change drove the precise timing and geography of extinction in the Northern Hemisphere. The story from the Southern Hemisphere is still unfolding. New evidence from Australia supports the view that humans helped cause extinctions there, but the correlation with climate is weak or contested. Firmer chronologies, more realistic ecological models, and regional paleoecological insights still are needed to understand details of the worldwide extinction pattern and the population dynamics of the species involved.

Journal ArticleDOI
TL;DR: Analyses of the means suggest that collective action tendencies become stronger the more fellow group members "put their money where their mouth is."
Abstract: Insights from appraisal theories of emotion are used to integrate elements of theories on collective action. Three experiments with disadvantaged groups systematically manipulated procedural fairness (Study 1), emotional social support (Study 2), and instrumental social support (Study 3) to examine their effects on collective action tendencies through group-based anger and group efficacy. Results of structural equation modeling showed that procedural fairness and emotional social support affected the group-based anger pathway (reflecting emotion-focused coping), whereas instrumental social support affected the group efficacy pathway (reflecting problem-focused coping), constituting 2 distinct pathways to collective action tendencies. Analyses of the means suggest that collective action tendencies become stronger the more fellow group members "put their money where their mouth is." The authors discuss how their dual pathway model integrates and extends elements of current approaches to collective action.

Journal ArticleDOI
TL;DR: A detailed study of several very important aspects of Super-Resolution, often ignored in the literature, is presented, and robustness, treatment of color, and dynamic operation modes are discussed.
Abstract: Super-Resolution reconstruction produces one or a set of high-resolution images from a sequence of low-resolution frames. This article reviews a variety of Super-Resolution methods proposed in the last 20 years, and provides some insight into, and a summary of, our recent contributions to the general Super-Resolution problem. In the process, a detailed study of several very important aspects of Super-Resolution, often ignored in the literature, is presented. Specifically, we discuss robustness, treatment of color, and dynamic operation modes. Novel methods for addressing these issues are accompanied by experimental results on simulated and real data. Finally, some future challenges in Super-Resolution are outlined and discussed.

Journal ArticleDOI
TL;DR: In this article, the meanings of academically useful words were taught together with strategies for using information from context, from morphology, from knowledge about multiple meanings, and from cognates to infer word meaning.
Abstract: Gaps in reading performance between Anglo and Latino children are associated with gaps in vocabulary knowledge. An intervention was designed to enhance fifth graders' academic vocabulary. The meanings of academically useful words were taught together with strategies for using information from context, from morphology, from knowledge about multiple meanings, and from cognates to infer word meaning. Among the principles underlying the intervention were that new words should be encountered in meaningful text, that native Spanish speakers should have access to the text's meaning through Spanish, that words should be encountered in varying contexts, and that word knowledge involves spelling, pronunciation, morphology, and syntax as well as depth of meaning. Fifth graders in the intervention group showed greater growth than the comparison group on knowledge of the words taught, on depth of vocabulary knowledge, on understanding multiple meanings, and on reading comprehension. The intervention effects were as large for the English-language learners (ELLs) as for the English-only speakers (EOs), though the ELLs scored lower on all pre- and posttest measures. The results show the feasibility of improving comprehension outcomes for students in mixed ELL-EO classes, by teaching word analysis and vocabulary learning strategies.

Journal ArticleDOI
TL;DR: In this paper, two new nonparametric measures of galaxy morphology, the relative distribution of the galaxy pixel flux values (the Gini coefficient, or G) and the second-order moment of the brightest 20% of a galaxy's flux (M20), are presented, and their robustness to decreasing signal-to-noise ratio (S/N) and spatial resolution is tested.
Abstract: We present two new nonparametric methods for quantifying galaxy morphology: the relative distribution of the galaxy pixel flux values (the Gini coefficient or G) and the second-order moment of the brightest 20% of the galaxy's flux (M20). We test the robustness of G and M20 to decreasing signal-to-noise ratio (S/N) and spatial resolution and find that both measures are reliable to within 10% for images with average S/N per pixel greater than 2 and resolutions better than 1000 and 500 pc, respectively. We have measured G and M20, as well as concentration (C), asymmetry (A), and clumpiness (S) in the rest-frame near-ultraviolet/optical wavelengths for 148 bright local normal Hubble-type (E–Sd) galaxies, 22 dwarf irregulars, and 73 0.05 < z < 0.25 ultraluminous infrared galaxies (ULIRGs). We find that most local galaxies follow a tight sequence in G-M20-C, where early types have high G and C and low M20 and late-type spirals have lower G and C and higher M20. The majority of ULIRGs lie above the normal galaxy G-M20 sequence because of their high G and M20 values. Their high Gini coefficients arise from very bright nuclei, while the high second-order moments are produced by multiple nuclei and bright tidal tails. All of these features are signatures of recent and on-going mergers and interactions. We also find that in combination with A and S, G is more effective than C at distinguishing ULIRGs from the normal Hubble types. Finally, we measure the morphologies of 49 1.7 < z < 3.8 galaxies from HST NICMOS observations of the Hubble Deep Field North. We find that many of the z ~ 2 galaxies possess G and A higher than expected from degraded images of local elliptical and spiral galaxies and have morphologies more like low-redshift ULIRGs.
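For concreteness, here is one common form of the two estimators in numpy, following the definitions summarized above; the toy image, center handling, and 20% threshold bookkeeping are simplified.

```python
import numpy as np

def gini(flux):
    """Gini coefficient of the (sorted, absolute) pixel flux values."""
    f = np.sort(np.abs(flux.ravel()))
    n = f.size
    i = np.arange(1, n + 1)
    return ((2 * i - n - 1) * f).sum() / (f.mean() * n * (n - 1))

def m20(img, xc, yc):
    """log10 of the second-order moment of the brightest pixels holding
    20% of the total flux, normalized by the total moment."""
    y, x = np.indices(img.shape)
    moment = (img * ((x - xc) ** 2 + (y - yc) ** 2)).ravel()
    order = np.argsort(img.ravel())[::-1]           # brightest pixels first
    csum = np.cumsum(img.ravel()[order])
    k = np.searchsorted(csum, 0.2 * img.sum()) + 1  # pixels reaching 20%
    return np.log10(moment[order[:k]].sum() / moment.sum())

# Toy galaxy: a centrally concentrated blob gives high G and very negative
# M20, as for the early types on the G-M20 sequence.
y, x = np.indices((64, 64))
img = np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / 20.0)
print(f"G = {gini(img):.2f}, M20 = {m20(img, 32, 32):.2f}")
```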

Journal ArticleDOI
TL;DR: The first analysis of the FOS database is presented and provides essential baseline data against which the effects of enzyme replacement can be measured.
Abstract: Background: Fabry disease is a rare X-linked disorder caused by deficient activity of the lysosomal enzyme α-galactosidase A. Progressive accumulation of the substrate globotriaosylceramide in cells throughout the body leads to major organ failure and premature death. In response to the recent introduction of enzyme replacement therapy, the Fabry Outcome Survey (FOS) was established to pool data from European clinics on the natural history of this little-known disease and to monitor the long-term efficacy and safety of treatment. This paper presents the first analysis of the FOS database and provides essential baseline data against which the effects of enzyme replacement can be measured. Design: Baseline data from a cohort of 366 patients from 11 European countries were analysed in terms of demography and clinical manifestations of Fabry disease. Results: Misdiagnosis of Fabry disease is common, and the mean delay from onset of symptoms to correct diagnosis was 13.7 and 16.3 years in males and females, respectively. Although previously thought to have serious manifestations only in hemizygous men, the FOS database has confirmed that females heterozygous for Fabry disease are similarly affected. Furthermore, signs and symptoms of Fabry disease may be present from early childhood. Conclusions: With the advent of enzyme replacement therapy, it is important that general practitioners and physicians in a range of specialties recognize the signs and symptoms of Fabry disease so that effective treatment can be given. Baseline data from FOS demonstrate that enzyme replacement therapy should not be restricted to hemizygous men, but should be considered for both heterozygous females and children.

Journal ArticleDOI
TL;DR: Recent research suggesting that an old-growth age structure, combined with a broad spatial distribution of spawning and recruitment, is at least as important as spawning biomass in maintaining long-term sustainable population levels is summarized.
Abstract: Numerous groundfish stocks in both the Atlantic and Pacific are considered overfished, resulting in large-scale fishery closures. Fishing, in addition to simply removing biomass, also truncates the age and size structure of fish populations and often results in localized depletions. We summarize recent research suggesting that an old-growth age structure, combined with a broad spatial distribution of spawning and recruitment, is at least as important as spawning biomass in maintaining long-term sustainable population levels. In particular, there is evidence that older, larger female rockfishes produce larvae that withstand starvation longer and grow faster than the offspring of younger fish, that stocks may actually consist of several reproductively isolated units, and that recruitment may come from only a small and different fraction of the spawning population each year. None of these phenomena is accounted for in current management programs. We examine alternative management measures that addre...

Journal ArticleDOI
TL;DR: Marine reserves are a promising tool for fisheries management and conservation of biodiversity, but, as the authors discuss, they are not a panacea for fishery management problems, and their successful use requires a case-by-case understanding of the spatial structure of impacted fisheries, ecosystems, and human communities.

Journal ArticleDOI
TL;DR: These compounds meet predicted criteria for disrupting Tcf/beta-catenin complexes and define a general standard to establish mechanism-based activity of small molecule inhibitors of this pathogenic protein-protein interaction.

Journal ArticleDOI
Daniela S. Gerhard, Lukas Wagner, Elise A. Feingold, Carolyn M. Shenmen, Lynette H. Grouse, Greg Schuler, Steven L. Klein, Susan Old, Rebekah S. Rasooly, Peter J. Good, Mark S. Guyer, Allison M. Peck, Jeffery G. Derge, David J. Lipman, Francis S. Collins, Wonhee Jang, Steven Sherry, Mike Feolo, Leonie Misquitta, Eduardo Lee, Kirill Rotmistrovsky, Susan F. Greenhut, Carl F. Schaefer, Kenneth H. Buetow, Tom I. Bonner, David Haussler, Jim Kent, Mark Diekhans, Terry Furey, Michael R. Brent, Christa Prange, Kirsten Schreiber, Nicole Shapiro, Narayan K. Bhat, Ralph F. Hopkins, Florence Hsie, Tom Driscoll, M. Bento Soares, Maria de Fatima Bonaldo, Thomas L. Casavant, Todd E. Scheetz, Michael J. Brownstein, Ted B. Usdin, Shiraki Toshiyuki, Piero Carninci, Yulan Piao, Dawood B. Dudekula, Minoru S.H. Ko, Koichi Kawakami, Yutaka Suzuki, Sumio Sugano, C. E. Gruber, M. R. Smith, Blake A. Simmons, Troy Moore, Richard C. Waterman, Stephen L. Johnson, Yijun Ruan, Chia-Lin Wei, Sinnakaruppan Mathavan, Preethi H. Gunaratne, Jia Qian Wu, Angela M. Garcia, Stephen W. Hulyk, Edwin Fuh, Ye Yuan, Anna Sneed, Carla Kowis, Anne Hodgson, Donna M. Muzny, John Douglas Mcpherson, Richard A. Gibbs, Jessica Fahey, Erin Helton, Mark Ketteman, Anuradha Madan, Stephanie Rodrigues, Amy Sanchez, Michelle Whiting, Anup Madan, Alice C. Young, Keith Wetherby, Steven J. Granite, Peggy N. Kwong, Charles P. Brinkley, Russell L. Pearson, Gerard G. Bouffard, Robert W. Blakesly, Eric D. Green, Mark Dickson, Alex Rodriguez, Jane Grimwood, Jeremy Schmutz, Richard M. Myers, Yaron S.N. Butterfield, Malachi Griffith, Obi L. Griffith, Martin Krzywinski, Nancy Y. Liao, Ryan Morrin, Diana L. Palmquist, Anca Petrescu, Ursula Skalska, Duane E. Smailus, Jeff M. Stott, Angelique Schnerch, Jacqueline E. Schein, Steven J.M. Jones, Robert A. Holt, Agnes Baross, Marco A. Marra, Sandra W. Clifton, Kathryn A. Makowski, Stephanie Bosak, Joel A. Malek
TL;DR: Comparison of the sequence of the MGC clones to reference genome sequences reveals that most cDNA clones are of very high sequence quality, although it is likely that some cDNAs may carry missense variants as a consequence of experimental artifact, such as PCR, cloning, or reverse transcriptase errors.
Abstract: The National Institutes of Health's Mammalian Gene Collection (MGC) project was designed to generate and sequence a publicly accessible cDNA resource containing a complete open reading frame (ORF) for every human and mouse gene. The project initially used a random strategy to select clones from a large number of cDNA libraries from diverse tissues. Candidate clones were chosen based on 5'-EST sequences, and then fully sequenced to high accuracy and analyzed by algorithms developed for this project. Currently, more than 11,000 human and 10,000 mouse genes are represented in MGC by at least one clone with a full ORF. The random selection approach is now reaching a saturation point, and a transition to protocols targeted at the missing transcripts is now required to complete the mouse and human collections. Comparison of the sequence of the MGC clones to reference genome sequences reveals that most cDNA clones are of very high sequence quality, although it is likely that some cDNAs may carry missense variants as a consequence of experimental artifact, such as PCR, cloning, or reverse transcriptase errors. Recently, a rat cDNA component was added to the project, and ongoing frog (Xenopus) and zebrafish (Danio) cDNA projects were expanded to take advantage of the high-throughput MGC pipeline.