
Showing papers by "Stanford University" published in 2002


Journal ArticleDOI
08 Aug 2002-Nature
TL;DR: A doubling in global food demand projected for the next 50 years poses huge challenges for the sustainability both of food production and of terrestrial and aquatic ecosystems and the services they provide to society.
Abstract: A doubling in global food demand projected for the next 50 years poses huge challenges for the sustainability both of food production and of terrestrial and aquatic ecosystems and the services they provide to society. Agriculturalists are the principal managers of global useable lands and will shape, perhaps irreversibly, the surface of the Earth in the coming decades. New incentives and policies for ensuring the sustainability of agriculture and ecosystem services will be crucial if we are to meet the demands of improving yields without compromising environmental integrity or public health.

6,569 citations


Book
31 Oct 2002
TL;DR: A student or researcher working in mathematics, computer graphics, science, or engineering interested in any dynamic moving front, which might change its topology or develop singularities, will find this book interesting and useful.
Abstract: This book is an introduction to level set methods and dynamic implicit surfaces. These are powerful techniques for analyzing and computing moving fronts in a variety of different settings. While it gives many examples of the utility of the methods to a diverse set of applications, it also gives complete numerical analysis and recipes, which will enable users to quickly apply the techniques to real problems. The book begins with a description of implicit surfaces and their basic properties, then devises the level set geometry and calculus toolbox, including the construction of signed distance functions. Part II adds dynamics to this static calculus. Topics include the level set equation itself, Hamilton-Jacobi equations, motion of a surface normal to itself, re-initialization to a signed distance function, extrapolation in the normal direction, the particle level set method and the motion of co-dimension two (and higher) objects. Part III is concerned with topics taken from the fields of Image Processing and Computer Vision. These include the restoration of images degraded by noise and blur, image segmentation with active contours (snakes), and reconstruction of surfaces from unorganized data points. Part IV is dedicated to Computational Physics. It begins with one phase compressible fluid dynamics, then two-phase compressible flow involving possibly different equations of state, detonation and deflagration waves, and solid/fluid structure interaction. Next it discusses incompressible fluid dynamics, including a computer graphics simulation of smoke, free surface flows, including a computer graphics simulation of water, and fully two-phase incompressible flow. Additional related topics include incompressible flames with applications to computer graphics and coupling a compressible and incompressible fluid. Finally, heat flow and Stefan problems are discussed. A student or researcher working in mathematics, computer graphics, science, or engineering interested in any dynamic moving front, which might change its topology or develop singularities, will find this book interesting and useful.
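For orientation, the core equations the book's toolbox is built around can be summarized (stated here from general knowledge of level set methods, not quoted from the book). A front is tracked as the zero isocontour of a function $\phi$, advected by an external velocity field $\mathbf{v}$ via the level set equation

$$\phi_t + \mathbf{v}\cdot\nabla\phi = 0,$$

motion of the interface normal to itself with speed $a$ obeys $\phi_t + a\,|\nabla\phi| = 0$, and re-initialization toward a signed distance function iterates $\phi_\tau + S(\phi_0)\,(|\nabla\phi| - 1) = 0$, where $S$ is a smoothed sign function.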

5,526 citations


Journal ArticleDOI
TL;DR: The calculation of the q-value, the pFDR analogue of the p-value, is discussed; it eliminates the need to set the error rate beforehand as is traditionally done, and the approach can yield an increase of over eight times in power compared with the Benjamini–Hochberg FDR method.
Abstract: Summary. Multiple-hypothesis testing involves guarding against much more complicated errors than single-hypothesis testing. Whereas we typically control the type I error rate for a single-hypothesis test, a compound error rate is controlled for multiple-hypothesis tests. For example, controlling the false discovery rate (FDR) traditionally involves intricate sequential p-value rejection methods based on the observed data. Whereas a sequential p-value method fixes the error rate and estimates its corresponding rejection region, we propose the opposite approach—we fix the rejection region and then estimate its corresponding error rate. This new approach offers increased applicability, accuracy and power. We apply the methodology to both the positive false discovery rate (pFDR) and FDR, and provide evidence for its benefits. It is shown that pFDR is probably the quantity of interest over FDR. Also discussed is the calculation of the q-value, the pFDR analogue of the p-value, which eliminates the need to set the error rate beforehand as is traditionally done. Some simple numerical examples are presented that show that this new approach can yield an increase of over eight times in power compared with the Benjamini–Hochberg FDR method.
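As a rough illustration of the fixed-rejection-region idea, the sketch below (assuming NumPy; the function and parameter names are made up, and it follows the spirit of a Storey-type estimator rather than the paper's exact procedure) estimates the null proportion from the upper tail of the p-value distribution and converts p-values into q-values:

```python
import numpy as np

def qvalues(p, lam=0.5):
    """Illustrative q-value estimate from a vector of p-values.

    pi0 is estimated from the fraction of p-values above `lam`
    (assumed roughly uniform under the null); q-values are then a
    pi0-scaled, monotonicity-enforced version of the BH ratios.
    """
    p = np.asarray(p, dtype=float)
    m = p.size
    # Estimate the proportion of true null hypotheses.
    pi0 = min(1.0, np.mean(p > lam) / (1.0 - lam))
    order = np.argsort(p)
    ranked = p[order]
    # BH-style ratios scaled by pi0.
    q = pi0 * m * ranked / np.arange(1, m + 1)
    # Enforce monotonicity from the largest p-value downwards.
    q = np.minimum.accumulate(q[::-1])[::-1]
    out = np.empty(m)
    out[order] = np.clip(q, 0, 1)
    return out

# Example: 1000 tests, 100 of which carry a real effect.
rng = np.random.default_rng(0)
p = np.concatenate([rng.uniform(size=900), rng.beta(0.5, 10, size=100)])
print((qvalues(p) < 0.05).sum(), "tests called significant at q < 0.05")
```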

5,414 citations


Journal ArticleDOI
TL;DR: It is shown that both the approximation accuracy and execution speed of gradient boosting can be substantially improved by incorporating randomization into the procedure.
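A minimal way to see the row-subsampling idea in practice, assuming scikit-learn is available; this uses the library's generic subsample option, which draws a random fraction of the training data for each boosting stage, as a stand-in rather than the paper's own implementation:

```python
from sklearn.datasets import make_friedman1
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

X, y = make_friedman1(n_samples=1000, noise=1.0, random_state=0)

# Deterministic boosting vs. boosting with row subsampling at each stage.
for frac in (1.0, 0.5):
    model = GradientBoostingRegressor(n_estimators=200, subsample=frac,
                                      random_state=0)
    score = cross_val_score(model, X, y, cv=5).mean()
    print(f"subsample={frac}: mean CV R^2 = {score:.3f}")
```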

5,355 citations


01 Jan 2002
TL;DR: An ontology defines a common vocabulary for researchers who need to share information in a domain that includes machine-interpretable definitions of basic concepts in the domain and relations among them.
Abstract: Why develop an ontology? In recent years the development of ontologies—explicit formal specifications of the terms in the domain and relations among them (Gruber 1993)—has been moving from the realm of Artificial-Intelligence laboratories to the desktops of domain experts. Ontologies have become common on the World-Wide Web. The ontologies on the Web range from large taxonomies categorizing Web sites (such as on Yahoo!) to categorizations of products for sale and their features (such as on Amazon.com). The WWW Consortium (W3C) is developing the Resource Description Framework (Brickley and Guha 1999), a language for encoding knowledge on Web pages to make it understandable to electronic agents searching for information. The Defense Advanced Research Projects Agency (DARPA), in conjunction with the W3C, is developing DARPA Agent Markup Language (DAML) by extending RDF with more expressive constructs aimed at facilitating agent interaction on the Web (Hendler and McGuinness 2000). Many disciplines now develop standardized ontologies that domain experts can use to share and annotate information in their fields. Medicine, for example, has produced large, standardized, structured vocabularies such as SNOMED (Price and Spackman 2000) and the semantic network of the Unified Medical Language System (Humphreys and Lindberg 1993). Broad general-purpose ontologies are emerging as well. For example, the United Nations Development Program and Dun & Bradstreet combined their efforts to develop the UNSPSC ontology which provides terminology for products and services (www.unspsc.org). An ontology defines a common vocabulary for researchers who need to share information in a domain. It includes machine-interpretable definitions of basic concepts in the domain and relations among them. Why would someone want to develop an ontology? Some of the reasons are:

4,838 citations


Journal ArticleDOI
25 Jul 2002-Nature
TL;DR: It is shown that previously known and new genes are necessary for optimal growth under six well-studied conditions: high salt, sorbitol, galactose, pH 8, minimal medium and nystatin treatment, and that less than 7% of genes that exhibit a significant increase in messenger RNA expression are also required for optimal growth in four of the tested conditions.
Abstract: Determining the effect of gene deletion is a fundamental approach to understanding gene function. Conventional genetic screens exhibit biases, and genes contributing to a phenotype are often missed. We systematically constructed a nearly complete collection of gene-deletion mutants (96% of annotated open reading frames, or ORFs) of the yeast Saccharomyces cerevisiae. DNA sequences dubbed 'molecular bar codes' uniquely identify each strain, enabling their growth to be analysed in parallel and the fitness contribution of each gene to be quantitatively assessed by hybridization to high-density oligonucleotide arrays. We show that previously known and new genes are necessary for optimal growth under six well-studied conditions: high salt, sorbitol, galactose, pH 8, minimal medium and nystatin treatment. Less than 7% of genes that exhibit a significant increase in messenger RNA expression are also required for optimal growth in four of the tested conditions. Our results validate the yeast gene-deletion collection as a valuable resource for functional genomics.

4,328 citations


Journal ArticleDOI
TL;DR: Focusing on using probabilistic metrics such as average values or variance to quantify design objectives such as performance and power will lead to a major change in SoC design methodologies.
Abstract: On-chip micronetworks, designed with a layered methodology, will meet the distinctive challenges of providing functionally correct, reliable operation of interacting system-on-chip components. A system on chip (SoC) can provide an integrated solution to challenging design problems in the telecommunications, multimedia, and consumer electronics domains. Much of the progress in these fields hinges on the designers' ability to conceive complex electronic engines under strong time-to-market pressure. Success will require using appropriate design and process technologies, as well as interconnecting existing components reliably in a plug-and-play fashion. Focusing on using probabilistic metrics such as average values or variance to quantify design objectives such as performance and power will lead to a major change in SoC design methodologies. Overall, these designs will be based on both deterministic and stochastic models. Creating complex SoCs requires a modular, component-based approach to both hardware and software design. Despite numerous challenges, the authors believe that developers will solve the problems of designing SoC networks. At the same time, they believe that a layered micronetwork design methodology will likely be the only path to mastering the complexity of future SoC designs.

3,852 citations


Journal ArticleDOI
TL;DR: A common pattern of phylogenetic conservatism in ecological character is recognized, the challenges of using phylogenies of partial lineages are highlighted, and phylogenetic approaches to three emergent properties of communities are reviewed: species diversity, relative abundance distributions, and range sizes.
Abstract: As better phylogenetic hypotheses become available for many groups of organisms, studies in community ecology can be informed by knowledge of the evolutionary relationships among coexisting species. We note three primary approaches to integrating phylogenetic information into studies of community organization: 1. examining the phylogenetic structure of community assemblages, 2. exploring the phylogenetic basis of community niche structure, and 3. adding a community context to studies of trait evolution and biogeography. We recognize a common pattern of phylogenetic conservatism in ecological character and highlight the challenges of using phylogenies of partial lineages. We also review phylogenetic approaches to three emergent properties of communities: species diversity, relative abundance distributions, and range sizes. Methodological advances in phylogenetic supertree construction, character reconstruction, null models for community assembly and character evolution, and metrics of community ...

3,615 citations


Journal ArticleDOI
TL;DR: This review focuses on two commonly used strategies for down-regulating emotion, reappraisal and suppression, and concludes with a consideration of five important directions for future research on emotion regulation processes.
Abstract: One of life's great challenges is successfully regulating emotions. Do some emotion regulation strategies have more to recommend them than others? According to Gross's (1998, Review of General Psychology, 2, 271-299) process model of emotion regulation, strategies that act early in the emotion-generative process should have a different profile of consequences than strategies that act later on. This review focuses on two commonly used strategies for down-regulating emotion. The first, reappraisal, comes early in the emotion-generative process. It consists of changing the way a situation is construed so as to decrease its emotional impact. The second, suppression, comes later in the emotion-generative process. It consists of inhibiting the outward signs of inner feelings. Experimental and individual-difference studies find reappraisal is often more effective than suppression. Reappraisal decreases emotion experience and behavioral expression, and has no impact on memory. By contrast, suppression decreases behavioral expression, but fails to decrease emotion experience, and actually impairs memory. Suppression also increases physiological responding for suppressors and their social partners. This review concludes with a consideration of five important directions for future research on emotion regulation processes.

3,555 citations


Journal ArticleDOI
TL;DR: Exercise capacity is known to be an important prognostic factor in patients with cardiovascular disease, but it is uncertain whether it predicts mortality equally well among healthy persons; there is also uncertainty regarding the predictive power of exercise capacity relative to other clinical and exercise-test variables.
Abstract: Background Exercise capacity is known to be an important prognostic factor in patients with cardiovascular disease, but it is uncertain whether it predicts mortality equally well among healthy persons. There is also uncertainty regarding the predictive power of exercise capacity relative to other clinical and exercise-test variables. Methods We studied a total of 6213 consecutive men referred for treadmill exercise testing for clinical reasons during a mean (±SD) of 6.2±3.7 years of follow-up. Subjects were classified into two groups: 3679 had an abnormal exercise-test result or a history of cardiovascular disease, or both, and 2534 had a normal exercise-test result and no history of cardiovascular disease. Overall mortality was the end point. Results There were a total of 1256 deaths during the follow-up period, resulting in an average annual mortality of 2.6 percent. Men who died were older than those who survived and had a lower maximal heart rate, lower maximal systolic and diastolic blood pressure, a...

3,418 citations


Journal ArticleDOI
TL;DR: Incremental dynamic analysis (IDA) is a parametric analysis method that has recently emerged in several different forms to estimate structural performance under seismic loads more thoroughly. It involves subjecting a structural model to one or more ground motion records, each scaled to multiple levels of intensity, thus producing one or more curves of response parameterized versus intensity level.
Abstract: Incremental dynamic analysis (IDA) is a parametric analysis method that has recently emerged in several different forms to estimate more thoroughly structural performance under seismic loads. It involves subjecting a structural model to one (or more) ground motion record(s), each scaled to multiple levels of intensity, thus producing one (or more) curve(s) of response parameterized versus intensity level. To establish a common frame of reference, the fundamental concepts are analysed, a unified terminology is proposed, suitable algorithms are presented, and properties of the IDA curve are looked into for both single-degree-of-freedom and multi-degree-of-freedom structures. In addition, summarization techniques for multi-record IDA studies and the association of the IDA study with the conventional static pushover analysis and the yield reduction R-factor are discussed. Finally, in the framework of performance-based earthquake engineering, the assessment of demand and capacity is viewed through the lens of an IDA study. Copyright © 2001 John Wiley & Sons, Ltd.
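A compact sketch of the basic IDA loop for a linear-elastic single-degree-of-freedom oscillator, assuming NumPy; the synthetic "record", the scale factors, and the integrator below are illustrative placeholders rather than anything taken from the paper:

```python
import numpy as np

def sdof_peak_disp(ag, dt, T=1.0, zeta=0.05):
    """Peak displacement of a linear SDOF oscillator (unit mass) under
    ground acceleration ag, via the central-difference method."""
    m = 1.0
    wn = 2 * np.pi / T
    k = m * wn**2
    c = 2 * zeta * m * wn
    u_prev, u = 0.0, 0.0                    # u_{-1} and u_0 (at-rest start)
    a0 = m / dt**2 + c / (2 * dt)
    b0 = k - 2 * m / dt**2
    c0 = m / dt**2 - c / (2 * dt)
    peak = 0.0
    for p in -m * ag:                       # effective load -m * a_g(t)
        u_next = (p - b0 * u - c0 * u_prev) / a0
        peak = max(peak, abs(u_next))
        u_prev, u = u, u_next
    return peak

# Synthetic "record": band-limited noise standing in for a real accelerogram.
rng = np.random.default_rng(1)
dt = 0.01
record = rng.normal(size=2000) * np.hanning(2000)

# IDA loop: scale the record to increasing intensity levels and collect the
# response measure at each level, producing one IDA curve.
for scale in (0.25, 0.5, 1.0, 1.5, 2.0):
    print(scale, sdof_peak_disp(scale * record, dt))
```

For a linear model the resulting curve is simply a straight line; the method becomes informative when the structural model is nonlinear, so that different intensity levels can produce qualitatively different response.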

Journal ArticleDOI
20 Nov 2002-JAMA
TL;DR: Self-management education complements traditional patient education in supporting patients to live the best possible quality of life with their chronic condition, and may soon become an integral part of high-quality primary care.
Abstract: Patients with chronic conditions make day-to-day decisions about—self-manage—their illnesses. This reality introduces a new chronic disease paradigm: the patient-professional partnership, involving collaborative care and self-management education. Self-management education complements traditional patient education in supporting patients to live the best possible quality of life with their chronic condition. Whereas traditional patient education offers information and technical skills, self-management education teaches problem-solving skills. A central concept in self-management is self-efficacy—confidence to carry out a behavior necessary to reach a desired goal. Self-efficacy is enhanced when patients succeed in solving patient-identified problems. Evidence from controlled clinical trials suggests that (1) programs teaching self-management skills are more effective than information-only patient education in improving clinical outcomes; (2) in some circumstances, self-management education improves outcomes and can reduce costs for arthritis and probably for adult asthma patients; and (3) in initial studies, a self-management education program bringing together patients with a variety of chronic conditions may improve outcomes and reduce costs. Self-management education for chronic illness may soon become an integral part of high-quality primary care.

Journal ArticleDOI
TL;DR: Experimental results showing that employing the active learning method can significantly reduce the need for labeled training instances in both the standard inductive and transductive settings are presented.
Abstract: Support vector machines have met with significant success in numerous real-world learning tasks. However, like most machine learning algorithms, they are generally applied using a randomly selected training set classified in advance. In many settings, we also have the option of using pool-based active learning. Instead of using a randomly selected training set, the learner has access to a pool of unlabeled instances and can request the labels for some number of them. We introduce a new algorithm for performing active learning with support vector machines, i.e., an algorithm for choosing which instances to request next. We provide a theoretical motivation for the algorithm using the notion of a version space. We present experimental results showing that employing our active learning method can significantly reduce the need for labeled training instances in both the standard inductive and transductive settings.
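A bare-bones sketch of pool-based active learning with the "query the point closest to the hyperplane" heuristic, assuming scikit-learn; this follows the spirit of the version-space argument but is not the authors' code:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=600, n_features=20, random_state=0)
# Small seed set with both classes represented.
labeled = list(np.where(y == 0)[0][:5]) + list(np.where(y == 1)[0][:5])
pool = [i for i in range(len(y)) if i not in labeled]

clf = SVC(kernel="linear")
for _ in range(40):                          # 40 label requests
    clf.fit(X[labeled], y[labeled])
    # Query the pool instance closest to the current separating hyperplane.
    margins = np.abs(clf.decision_function(X[pool]))
    labeled.append(pool.pop(int(np.argmin(margins))))

clf.fit(X[labeled], y[labeled])
print("accuracy on the remaining pool:", clf.score(X[pool], y[pool]))
```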

Journal ArticleDOI
TL;DR: The association between insomnia and major depressive episodes has been consistently reported: individuals with insomnia are more likely to have a major depressive illness, and longitudinal studies have shown that the persistence of insomnia is associated with the appearance of a new depressive episode.

Journal ArticleDOI
TL;DR: The authors examine how firms search, or solve problems, to create new products and find that firms position themselves in a unidimensional search space that spans a spectrum from local to distant search.
Abstract: We examine how firms search, or solve problems, to create new products. According to organizational learning research, firms position themselves in a unidimensional search space that spans a spectrum from local to distant search. Our findings in the global robotics industry suggest that firms' search efforts actually vary across two distinct dimensions: search depth, or how frequently the firm reuses its existing knowledge, and search scope, or how widely the firm explores new knowledge.

Journal ArticleDOI
TL;DR: The method of “nearest shrunken centroids” identifies subsets of genes that best characterize each class; it was highly efficient in finding genes for classifying small round blue cell tumors and leukemias.
Abstract: We have devised an approach to cancer class prediction from gene expression profiling, based on an enhancement of the simple nearest prototype (centroid) classifier. We shrink the prototypes and hence obtain a classifier that is often more accurate than competing methods. Our method of "nearest shrunken centroids" identifies subsets of genes that best characterize each class. The technique is general and can be used in many other classification problems. To demonstrate its effectiveness, we show that the method was highly efficient in finding genes for classifying small round blue cell tumors and leukemias.
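A stripped-down sketch of the shrinkage step, assuming NumPy: soft-threshold the standardized class-versus-overall centroid differences, then classify to the nearest surviving centroid. The scale factors here are simplified and the paper's s0 offset and class priors are omitted; all names are illustrative.

```python
import numpy as np

def fit_shrunken_centroids(X, y, delta=1.0):
    classes = np.unique(y)
    overall = X.mean(axis=0)
    s = X.std(axis=0) + 1e-8                     # per-feature scale (simplified)
    centroids = {}
    for k in classes:
        Xk = X[y == k]
        mk = np.sqrt(1.0 / len(Xk) + 1.0 / len(X))   # rough per-class scale
        d = (Xk.mean(axis=0) - overall) / (mk * s)
        d = np.sign(d) * np.maximum(np.abs(d) - delta, 0.0)   # soft threshold
        centroids[k] = overall + mk * s * d
    return s, centroids

def predict(X, centroids, s):
    classes = list(centroids)
    # Standardized squared distance to each shrunken centroid.
    dists = np.stack([(((X - centroids[k]) / s) ** 2).sum(axis=1)
                      for k in classes], axis=1)
    return np.array(classes)[dists.argmin(axis=1)]

# Tiny demo: 60 samples, 50 "genes", only the first 5 genes differ by class.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 50))
X[30:, :5] += 2.0
y = np.array([0] * 30 + [1] * 30)
s, cents = fit_shrunken_centroids(X, y, delta=1.0)
print("training accuracy:", (predict(X, cents, s) == y).mean())
```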

Proceedings ArticleDOI
03 Jun 2002
TL;DR: The paper motivates the need for, and research issues arising from, a new model of data processing in which data does not take the form of persistent relations but rather arrives in multiple continuous, rapid, time-varying data streams.
Abstract: In this overview paper we motivate the need for and research issues arising from a new model of data processing. In this model, data does not take the form of persistent relations, but rather arrives in multiple, continuous, rapid, time-varying data streams. In addition to reviewing past work relevant to data stream systems and current projects in the area, the paper explores topics in stream query languages, new requirements and challenges in query processing, and algorithmic issues.
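A toy contrast with one-shot queries over stored relations (not taken from the paper): a continuous sliding-window aggregate that is re-evaluated as elements arrive on an unbounded stream, sketched in plain Python:

```python
from collections import deque
import random
import time

def sensor_stream():
    """Unbounded stream of (timestamp, value) tuples."""
    while True:
        yield (time.time(), random.gauss(20.0, 2.0))

# Continuous query: the average of the last 100 readings, re-evaluated as
# each new element arrives (a sliding window over the stream).
window = deque(maxlen=100)
for i, (ts, value) in enumerate(sensor_stream()):
    window.append(value)
    if i % 100 == 0:
        print(f"t={ts:.0f}  windowed avg={sum(window) / len(window):.2f}")
    if i >= 1000:          # stop the demo; a real stream never ends
        break
```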

01 Jan 2002
TL;DR: Social cognitive theory analyzes social diffusion of new styles of behavior in terms of the psychosocial factors governing their acquisition and adoption and the social networks through which they spread and are supported.
Abstract: Social cognitive theory provides an agentic conceptual framework within which to analyze the determinants and psychosocial mechanisms through which symbolic communication influences human thought, affect and action. Communications systems operate through two pathways. In the direct pathway, they promote changes by informing, enabling, motivating, and guiding participants. In the socially mediated pathway, media influences link participants to social networks and community settings that provide natural incentives and continued personalized guidance, for desired change. Social cognitive theory analyzes social diffusion of new styles of behavior in terms of the psychosocial factors governing their acquisition and adoption and the social networks through which they spread and are supported. Structural interconnectedness provides potential diffusion paths; sociocognitive factors largely determine what diffuses through those paths.

Journal ArticleDOI
TL;DR: An SQP algorithm that uses a smooth augmented Lagrangian merit function and makes explicit provision for infeasibility in the original problem and the QP subproblems is discussed.
Abstract: Sequential quadratic programming (SQP) methods have proved highly effective for solving constrained optimization problems with smooth nonlinear functions in the objective and constraints. Here we consider problems with general inequality constraints (linear and nonlinear). We assume that first derivatives are available and that the constraint gradients are sparse. We discuss an SQP algorithm that uses a smooth augmented Lagrangian merit function and makes explicit provision for infeasibility in the original problem and the QP subproblems. SNOPT is a particular implementation that makes use of a semidefinite QP solver. It is based on a limited-memory quasi-Newton approximation to the Hessian of the Lagrangian and uses a reduced-Hessian algorithm (SQOPT) for solving the QP subproblems. It is designed for problems with many thousands of constraints and variables but a moderate number of degrees of freedom (say, up to 2000). An important application is to trajectory optimization in the aerospace industry. Numerical results are given for most problems in the CUTE and COPS test collections (about 900 examples).
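SNOPT itself is a commercial package; purely to illustrate the class of problem an SQP method addresses, the sketch below solves a small nonlinearly constrained problem with SciPy's SLSQP routine, a different and much simpler dense SQP implementation used here only as a stand-in:

```python
import numpy as np
from scipy.optimize import minimize

# minimize  (x0 - 1)^2 + (x1 - 2.5)^2
# subject to one nonlinear and one linear inequality constraint, x >= 0.
objective = lambda x: (x[0] - 1.0) ** 2 + (x[1] - 2.5) ** 2
constraints = [
    {"type": "ineq", "fun": lambda x: x[0] ** 2 + x[1] ** 2 - 1.0},  # >= 0
    {"type": "ineq", "fun": lambda x: 3.0 - x[0] - x[1]},            # >= 0
]
result = minimize(objective, x0=np.array([2.0, 0.0]), method="SLSQP",
                  constraints=constraints, bounds=[(0, None), (0, None)])
print(result.x, result.fun)   # expect roughly x = (0.75, 2.25)
```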

Journal ArticleDOI
28 Feb 2002-Nature
TL;DR: It is reported that newly generated cells in the adult mouse hippocampus have neuronal morphology and can display passive membrane properties, action potentials and functional synaptic inputs similar to those found in mature dentate granule cells.
Abstract: There is extensive evidence indicating that new neurons are generated in the dentate gyrus of the adult mammalian hippocampus, a region of the brain that is important for learning and memory1,2,3,4,5. However, it is not known whether these new neurons become functional, as the methods used to study adult neurogenesis are limited to fixed tissue. We use here a retroviral vector expressing green fluorescent protein that only labels dividing cells, and that can be visualized in live hippocampal slices. We report that newly generated cells in the adult mouse hippocampus have neuronal morphology and can display passive membrane properties, action potentials and functional synaptic inputs similar to those found in mature dentate granule cells. Our findings demonstrate that newly generated cells mature into functional neurons in the adult mammalian brain.

Journal ArticleDOI
20 Dec 2002-Science
TL;DR: General agreement of genetic and predefined populations suggests that self-reported ancestry can facilitate assessments of epidemiological risks but does not obviate the need to use genetic information in genetic association studies.
Abstract: We studied human population structure using genotypes at 377 autosomal microsatellite loci in 1056 individuals from 52 populations. Within-population differences among individuals account for 93 to 95% of genetic variation; differences among major groups constitute only 3 to 5%. Nevertheless, without using prior information about the origins of individuals, we identified six main genetic clusters, five of which correspond to major geographic regions, and subclusters that often correspond to individual populations. General agreement of genetic and predefined populations suggests that self-reported ancestry can facilitate assessments of epidemiological risks but does not obviate the need to use genetic information in genetic association studies.

Journal ArticleDOI
TL;DR: The growing literature that supports a critical role for AMPA receptor trafficking in LTP and LTD is reviewed, focusing on the roles proposed for specific AMPA receptor subunits and their interacting proteins.
Abstract: Activity-dependent changes in synaptic function are believed to underlie the formation of memories. Two prominent examples are long-term potentiation (LTP) and long-term depression (LTD), whose mechanisms have been the subject of considerable scrutiny over the past few decades. Here we review the growing literature that supports a critical role for AMPA receptor trafficking in LTP and LTD, focusing on the roles proposed for specific AMPA receptor subunits and their interacting proteins. While much work remains to understand the molecular basis for synaptic plasticity, recent results on AMPA receptor trafficking provide a clear conceptual framework for future studies.

Journal ArticleDOI
TL;DR: In this article, the authors show that the hierarchy of scales can be fixed by a choice of Ramond-Ramond and Neveu-Schwarz fluxes in the compact manifold, and give examples involving orientifold compactifications of type IIB string theory and F-theory compactifications on Calabi-Yau fourfolds.
Abstract: Warped compactifications with significant warping provide one of the few known mechanisms for naturally generating large hierarchies of physical scales. We demonstrate that this mechanism is realizable in string theory, and give examples involving orientifold compactifications of type-IIB string theory and F-theory compactifications on Calabi-Yau fourfolds. In each case, the hierarchy of scales is fixed by a choice of Ramond-Ramond and Neveu-Schwarz fluxes in the compact manifold. Our solutions involve compactifications of the Klebanov-Strassler gravity dual to a confining $\mathcal{N}=1$ supersymmetric gauge theory, and the hierarchy reflects the small scale of chiral symmetry breaking in the dual gauge theory.
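For orientation (a standard statement about this construction from general knowledge, not quoted from the abstract): with $M$ units of Ramond-Ramond flux and $K$ units of Neveu-Schwarz flux on the relevant conifold cycles, the minimal warp factor is exponentially small in the flux ratio,

$$e^{A_{\min}} \sim \exp\!\left(-\frac{2\pi K}{3\, g_s M}\right),$$

so modest integer flux quanta can generate hierarchies of many orders of magnitude.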

Journal ArticleDOI
TL;DR: Functional magnetic resonance imaging findings support the hypothesis that prefrontal cortex is involved in constructing reappraisal strategies that can modulate activity in multiple emotion-processing systems.
Abstract: The ability to cognitively regulate emotional responses to aversive events is important for mental and physical health. Little is known, however, about neural bases of the cognitive control of emotion. The present study employed functional magnetic resonance imaging to examine the neural systems used to reappraise highly negative scenes in unemotional terms. Reappraisal of highly negative scenes reduced subjective experience of negative affect. Neural correlates of reappraisal were increased activation of the lateral and medial prefrontal regions and decreased activation of the amygdala and medial orbito-frontal cortex. These findings support the hypothesis that prefrontal cortex is involved in constructing reappraisal strategies that can modulate activity in multiple emotion-processing systems.

Journal ArticleDOI
TL;DR: An analytic framework is described to identify and distinguish between moderators and mediators in RCTs when outcomes are measured dimensionally, and it is recommended that RCTs routinely include and report such analyses.
Abstract: Randomized clinical trials (RCTs) not only are the gold standard for evaluating the efficacy and effectiveness of psychiatric treatments but also can be valuable in revealing moderators and mediators of therapeutic change. Conceptually, moderators identify on whom and under what circumstances treatments have different effects. Mediators identify why and how treatments have effects. We describe an analytic framework to identify and distinguish between moderators and mediators in RCTs when outcomes are measured dimensionally. Rapid progress in identifying the most effective treatments and understanding on whom treatments work and do not work and why treatments work or do not work depends on efforts to identify moderators and mediators of treatment outcome. We recommend that RCTs routinely include and report such analyses.
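As a schematic of the two kinds of regression evidence such a framework distinguishes, the sketch below (assuming the pandas and statsmodels packages; the variable names and simulated data are invented) fits a treatment-by-baseline interaction for moderation and two of the classic path regressions for mediation:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "treat": rng.integers(0, 2, n),        # randomized arm
    "severity": rng.normal(size=n),        # baseline (candidate moderator)
})
df["adherence"] = 0.8 * df.treat + rng.normal(size=n)        # candidate mediator
df["outcome"] = (0.5 * df.treat + 0.3 * df.treat * df.severity
                 + 0.6 * df.adherence + rng.normal(size=n))

# Moderation: does the treatment effect depend on baseline severity?
print(smf.ols("outcome ~ treat * severity", df).fit().summary().tables[1])

# Mediation (two of the classic path regressions): treatment -> mediator,
# and mediator -> outcome controlling for treatment.
print(smf.ols("adherence ~ treat", df).fit().params)
print(smf.ols("outcome ~ treat + adherence", df).fit().params)
```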

Journal ArticleDOI
TL;DR: In this article, the authors investigate the hypothesis that the combination of three related innovations (i.e., information technology, complementary workplace reorganization, and new products and services) constitute a significant skill-biased technical change affecting labor demand in the United States.
Abstract: We investigate the hypothesis that the combination of three related innovations—1) information technology (IT), 2) complementary workplace reorganization, and 3) new products and services—constitute a significant skill-biased technical change affecting labor demand in the United States. Using detailed firm-level data, we find evidence of complementarities among all three of these innovations in factor demand and productivity regressions. In addition, firms that adopt these innovations tend to use more skilled labor. The effects of IT on labor demand are greater when IT is combined with the particular organizational investments we identify, highlighting the importance of IT-enabled organizational change.

Journal ArticleDOI
TL;DR: In this paper, the authors describe approximate digital implementations of two new mathematical transforms, namely, the ridgelet transform and the curvelet transform, which offer exact reconstruction, stability against perturbations, ease of implementation, and low computational complexity.
Abstract: We describe approximate digital implementations of two new mathematical transforms, namely, the ridgelet transform and the curvelet transform. Our implementations offer exact reconstruction, stability against perturbations, ease of implementation, and low computational complexity. A central tool is Fourier-domain computation of an approximate digital Radon transform. We introduce a very simple interpolation in the Fourier space which takes Cartesian samples and yields samples on a rectopolar grid, which is a pseudo-polar sampling set based on a concentric squares geometry. Despite the crudeness of our interpolation, the visual performance is surprisingly good. Our ridgelet transform applies to the Radon transform a special overcomplete wavelet pyramid whose wavelets have compact support in the frequency domain. Our curvelet transform uses our ridgelet transform as a component step, and implements curvelet subbands using a filter bank of à trous wavelet filters. Our philosophy throughout is that transforms should be overcomplete, rather than critically sampled. We apply these digital transforms to the denoising of some standard images embedded in white noise. In the tests reported here, simple thresholding of the curvelet coefficients is very competitive with "state of the art" techniques based on wavelets, including thresholding of decimated or undecimated wavelet transforms and also including tree-based Bayesian posterior mean methods. Moreover, the curvelet reconstructions exhibit higher perceptual quality than wavelet-based reconstructions, offering visually sharper images and, in particular, higher quality recovery of edges and of faint linear and curvilinear features. Existing theory for curvelet and ridgelet transforms suggests that these new approaches can outperform wavelet methods in certain image reconstruction problems. The empirical results reported here are in encouraging agreement.
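Curvelet transforms are not part of the common scientific Python stack, but the thresholding recipe itself is easy to sketch with an ordinary wavelet transform standing in for the curvelet transform (assuming the PyWavelets package; this is an analogue of the denoising step, not the authors' implementation):

```python
import numpy as np
import pywt

rng = np.random.default_rng(0)
clean = np.outer(np.hanning(256), np.hanning(256))      # smooth toy "image"
sigma = 0.05
noisy = clean + sigma * rng.normal(size=clean.shape)

# Transform, hard-threshold the detail coefficients, invert.
coeffs = pywt.wavedec2(noisy, "db4", level=4)
thr = 3 * sigma                                          # ~3-sigma threshold
denoised_coeffs = [coeffs[0]] + [
    tuple(pywt.threshold(c, thr, mode="hard") for c in level)
    for level in coeffs[1:]
]
denoised = pywt.waverec2(denoised_coeffs, "db4")[:256, :256]

print("noisy MSE:   ", np.mean((noisy - clean) ** 2))
print("denoised MSE:", np.mean((denoised - clean) ** 2))
```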

Journal ArticleDOI
Robert L. Strausberg, Elise A. Feingold1, Lynette H. Grouse1, Jeffery G. Derge2, Richard D. Klausner1, Francis S. Collins1, Lukas Wagner1, Carolyn M. Shenmen1, Gregory D. Schuler1, Stephen F. Altschul1, Barry R. Zeeberg1, Kenneth H. Buetow1, Carl F. Schaefer1, Narayan K. Bhat1, Ralph F. Hopkins1, Heather Jordan1, Troy Moore3, Steve I Max3, Jun Wang3, Florence Hsieh, Luda Diatchenko, Kate Marusina, Andrew A Farmer, Gerald M. Rubin4, Ling Hong4, Mark Stapleton4, M. Bento Soares5, Maria de Fatima Bonaldo5, Thomas L. Casavant5, Todd E. Scheetz5, Michael J. Brownstein1, Ted B. Usdin1, Shiraki Toshiyuki, Piero Carninci, Christa Prange6, Sam S Raha7, Naomi A Loquellano7, Garrick J Peters7, Rick D Abramson7, Sara J Mullahy7, Stephanie Bosak, Paul J. McEwan, Kevin McKernan, Joel A. Malek, Preethi H. Gunaratne8, Stephen Richards8, Kim C. Worley8, Sarah Hale8, Angela M. Garcia8, Stephen W. Hulyk8, Debbie K Villalon8, Donna M. Muzny8, Erica Sodergren8, Xiuhua Lu8, Richard A. Gibbs8, Jessica Fahey9, Erin Helton9, Mark Ketteman9, Anuradha Madan9, Stephanie Rodrigues9, Amy Sanchez9, Michelle Whiting9, Anup Madan9, Alice C. Young1, Yuriy O. Shevchenko1, Gerard G. Bouffard1, Robert W. Blakesley1, Jeffrey W. Touchman1, Eric D. Green1, Mark Dickson10, Alex Rodriguez10, Jane Grimwood10, Jeremy Schmutz10, Richard M. Myers10, Yaron S.N. Butterfield11, Martin Krzywinski11, Ursula Skalska11, Duane E. Smailus11, Angelique Schnerch11, Jacqueline E. Schein11, Steven J.M. Jones11, Marco A. Marra11 
TL;DR: The National Institutes of Health Mammalian Gene Collection (MGC) Program is a multi-institutional effort to identify and sequence a cDNA clone containing a complete ORF for each human and mouse gene.
Abstract: The National Institutes of Health Mammalian Gene Collection (MGC) Program is a multi-institutional effort to identify and sequence a cDNA clone containing a complete ORF for each human and mouse gene. ESTs were generated from libraries enriched for full-length cDNAs and analyzed to identify candidate full-ORF clones, which then were sequenced to high accuracy. The MGC has currently sequenced and verified the full ORF for a nonredundant set of >9,000 human and >6,000 mouse genes. Candidate full-ORF clones for an additional 7,800 human and 3,500 mouse genes also have been identified. All MGC sequences and clones are available without restriction through public databases and clone distribution networks (see http://mgc.nci.nih.gov).

Journal ArticleDOI
TL;DR: Infants who developed late-onset sepsis had a significantly prolonged hospital stay and were significantly more likely to die than those who were uninfected, especially if they were infected with Gram-negative organisms or fungi.
Abstract: Objective. Late-onset sepsis (occurring after 3 days of age) is an important problem in very low birth weight (VLBW) infants. To determine the current incidence of late-onset sepsis, risk factors for disease, and the impact of late-onset sepsis on subsequent hospital course, we evaluated a cohort of 6956 VLBW (401–1500 g) neonates admitted to the clinical centers of the National Institute of Child Health and Human Development Neonatal Research Network over a 2-year period (1998–2000). Methods. The National Institute of Child Health and Human Development Neonatal Research Network maintains a prospective registry of all VLBW neonates admitted to participating centers within 14 days of birth. Expanded infection surveillance was added in 1998. Results. Of 6215 infants who survived beyond 3 days, 1313 (21%) had 1 or more episodes of blood culture-proven late-onset sepsis. The vast majority of infections (70%) were caused by Gram-positive organisms, with coagulase-negative staphylococci accounting for 48% of infections. Rate of infection was inversely related to birth weight and gestational age. Complications of prematurity associated with an increased rate of late-onset sepsis included patent ductus arteriosus, prolonged ventilation, prolonged intravascular access, bronchopulmonary dysplasia, and necrotizing enterocolitis. Infants who developed late-onset sepsis had a significantly prolonged hospital stay (mean length of stay: 79 vs 60 days). They were significantly more likely to die than those who were uninfected (18% vs 7%), especially if they were infected with Gram-negative organisms (36%) or fungi (32%). Conclusions. Late-onset sepsis remains an important risk factor for death among VLBW preterm infants and for prolonged hospital stay among VLBW survivors. Strategies to reduce late-onset sepsis and its medical, social, and economic toll need to be addressed urgently.

Proceedings ArticleDOI
23 Jul 2002
TL;DR: A complementary approach is proposed, applicable in any domain with object-to-object relationships, that measures similarity of the structural context in which objects occur, based on their relationships with other objects.
Abstract: The problem of measuring "similarity" of objects arises in many applications, and many domain-specific measures have been developed, e.g., matching text across documents or computing overlap among item-sets. We propose a complementary approach, applicable in any domain with object-to-object relationships, that measures similarity of the structural context in which objects occur, based on their relationships with other objects. Effectively, we compute a measure that says "two objects are similar if they are related to similar objects." This general similarity measure, called SimRank, is based on a simple and intuitive graph-theoretic model. For a given domain, SimRank can be combined with other domain-specific similarity measures. We suggest techniques for efficient computation of SimRank scores, and provide experimental results on two application domains showing the computational feasibility and effectiveness of our approach.
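A direct, unoptimized implementation of the basic SimRank recurrence on a toy graph (the paper's efficiency techniques are not shown, and the example graph is invented):

```python
import numpy as np

def simrank(in_neighbors, C=0.8, iters=10):
    """Naive SimRank: s(a,b) = C/(|I(a)||I(b)|) * sum of s over in-neighbor
    pairs, with s(a,a) = 1 and s(a,b) = 0 when either in-neighbor set is empty."""
    n = len(in_neighbors)
    S = np.eye(n)
    for _ in range(iters):
        S_new = np.eye(n)
        for a in range(n):
            for b in range(n):
                if a == b or not in_neighbors[a] or not in_neighbors[b]:
                    continue
                total = sum(S[i, j] for i in in_neighbors[a]
                                    for j in in_neighbors[b])
                S_new[a, b] = C * total / (len(in_neighbors[a]) * len(in_neighbors[b]))
        S = S_new
    return S

# Toy graph given as node -> list of in-neighbors.
in_neighbors = [[1, 2], [4], [3], [1], [2]]
print(np.round(simrank(in_neighbors), 3))
```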