
Showing papers from Stanford University published in 1999


Book
28 May 1999
TL;DR: This foundational text is the first comprehensive introduction to statistical natural language processing (NLP) to appear and provides broad but rigorous coverage of mathematical and linguistic foundations, as well as detailed discussion of statistical methods, allowing students and researchers to construct their own implementations.
Abstract: Statistical approaches to processing natural language text have become dominant in recent years. This foundational text is the first comprehensive introduction to statistical natural language processing (NLP) to appear. The book contains all the theory and algorithms needed for building NLP tools. It provides broad but rigorous coverage of mathematical and linguistic foundations, as well as detailed discussion of statistical methods, allowing students and researchers to construct their own implementations. The book covers collocation finding, word sense disambiguation, probabilistic parsing, information retrieval, and other applications.
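One of the topics listed, collocation finding, can be illustrated with a pointwise mutual information (PMI) score over adjacent word pairs; the sketch below uses a toy corpus and a made-up frequency cutoff, and is not the book's own notation or code:

```python
import math
from collections import Counter

def pmi_collocations(tokens, min_count=2):
    """Score adjacent word pairs by pointwise mutual information:
    PMI(x, y) = log2( P(x, y) / (P(x) * P(y)) ).
    Pairs that co-occur far more often than chance get high scores,
    flagging likely collocations."""
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))
    n_uni = sum(unigrams.values())
    n_bi = sum(bigrams.values())
    scores = {}
    for (x, y), c in bigrams.items():
        if c < min_count:          # ignore rare pairs (unreliable estimates)
            continue
        p_xy = c / n_bi
        p_x = unigrams[x] / n_uni
        p_y = unigrams[y] / n_uni
        scores[(x, y)] = math.log2(p_xy / (p_x * p_y))
    return scores

tokens = ("new york is big and new york is busy "
          "and the city is big and the city is busy and the end").split()
scores = pmi_collocations(tokens)
# "new york" always co-occurs, so it outscores looser pairs like "the city".
best = max(scores, key=scores.get)
```

On this toy corpus the top-scoring pair is the rigid collocation ("new", "york"), while frequent-but-loose pairs score lower.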

9,295 citations


Journal ArticleDOI
TL;DR: The authors hope this report will generate further thought about ways to improve the quality of reports of meta-analyses of RCTs, and that interested readers, reviewers, researchers, and editors will use the QUOROM statement and generate ideas for its improvement.

4,767 citations


Journal ArticleDOI
TL;DR: This study reports the first disease-causing mutations in RTT and points to abnormal epigenetic regulation as the mechanism underlying the pathogenesis of RTT.
Abstract: Rett syndrome (RTT, MIM 312750) is a progressive neurodevelopmental disorder and one of the most common causes of mental retardation in females, with an incidence of 1 in 10,000-15,000 (ref. 2). Patients with classic RTT appear to develop normally until 6-18 months of age, then gradually lose speech and purposeful hand use, and develop microcephaly, seizures, autism, ataxia, intermittent hyperventilation and stereotypic hand movements. After initial regression, the condition stabilizes and patients usually survive into adulthood. As RTT occurs almost exclusively in females, it has been proposed that RTT is caused by an X-linked dominant mutation with lethality in hemizygous males. Previous exclusion mapping studies using RTT families mapped the locus to Xq28 (refs 6,9,10,11). Using a systematic gene screening approach, we have identified mutations in the gene (MECP2 ) encoding X-linked methyl-CpG-binding protein 2 (MeCP2) as the cause of some cases of RTT. MeCP2 selectively binds CpG dinucleotides in the mammalian genome and mediates transcriptional repression through interaction with histone deacetylase and the corepressor SIN3A (refs 12,13). In 5 of 21 sporadic patients, we found 3 de novo missense mutations in the region encoding the highly conserved methyl-binding domain (MBD) as well as a de novo frameshift and a de novo nonsense mutation, both of which disrupt the transcription repression domain (TRD). In two affected half-sisters of a RTT family, we found segregation of an additional missense mutation not detected in their obligate carrier mother. This suggests that the mother is a germline mosaic for this mutation. Our study reports the first disease-causing mutations in RTT and points to abnormal epigenetic regulation as the mechanism underlying the pathogenesis of RTT.

4,503 citations


Journal ArticleDOI
06 Aug 1999-Science
TL;DR: A total of 6925 Saccharomyces cerevisiae strains were constructed, by a high-throughput strategy, each with a precise deletion of one of 2026 ORFs (more than one-third of the ORFs in the genome), finding that 17 percent were essential for viability in rich medium.
Abstract: The functions of many open reading frames (ORFs) identified in genome-sequencing projects are unknown. New, whole-genome approaches are required to systematically determine their function. A total of 6925 Saccharomyces cerevisiae strains were constructed, by a high-throughput strategy, each with a precise deletion of one of 2026 ORFs (more than one-third of the ORFs in the genome). Of the deleted ORFs, 17 percent were essential for viability in rich medium. The phenotypes of more than 500 deletion strains were assayed in parallel. Of the deletion strains, 40 percent showed quantitative growth defects in either rich or minimal medium.

4,051 citations


Journal ArticleDOI
TL;DR: The authors show that the perception of time is malleable, and social goals change in both younger and older people when time constraints are imposed and suggest potential implications for multiple subdisciplines and research interests.
Abstract: Socioemotional selectivity theory claims that the perception of time plays a fundamental role in the selection and pursuit of social goals. According to the theory, social motives fall into 1 of 2 general categories--those related to the acquisition of knowledge and those related to the regulation of emotion. When time is perceived as open-ended, knowledge-related goals are prioritized. In contrast, when time is perceived as limited, emotional goals assume primacy. The inextricable association between time left in life and chronological age ensures age-related differences in social goals. Nonetheless, the authors show that the perception of time is malleable, and social goals change in both younger and older people when time constraints are imposed. The authors argue that time perception is integral to human motivation and suggest potential implications for multiple subdisciplines and research interests in social, developmental, cultural, cognitive, and clinical psychology.

3,874 citations


Proceedings Article
07 Sep 1999
TL;DR: Experimental results indicate that the novel hashing-based scheme for approximate similarity search scales well even for a relatively large number of dimensions, and show that it improves running time over other methods for searching in high-dimensional spaces based on hierarchical tree decomposition.
Abstract: The nearest- or near-neighbor query problems arise in a large variety of database applications, usually in the context of similarity searching. Of late, there has been increasing interest in building search/index structures for performing similarity search over high-dimensional data, e.g., image databases, document collections, time-series databases, and genome databases. Unfortunately, all known techniques for solving this problem fall prey to the "curse of dimensionality." That is, the data structures scale poorly with data dimensionality; in fact, if the number of dimensions exceeds 10 to 20, searching in k-d trees and related structures involves the inspection of a large fraction of the database, thereby doing no better than brute-force linear search. It has been suggested that since the selection of features and the choice of a distance metric in typical applications is rather heuristic, determining an approximate nearest neighbor should suffice for most practical purposes. In this paper, we examine a novel scheme for approximate similarity search based on hashing. The basic idea is to hash the points from the database so as to ensure that the probability of collision is much higher for objects that are close to each other than for those that are far apart. We provide experimental evidence that our method gives significant improvement in running time over other methods for searching in high-dimensional spaces based on hierarchical tree decomposition. Experimental results also indicate that our scheme scales well even for a relatively large number of dimensions (more than 50). (Proceedings of the 25th VLDB Conference, Edinburgh, Scotland, 1999.)
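The hashing idea can be sketched with the random-hyperplane LSH family for angular similarity; this is an illustrative variant, not necessarily the paper's exact hash functions, and the table counts, plane counts, and data are all assumptions:

```python
import random

def make_hash(dim, n_planes, rng):
    """One hash table's function: the sign pattern of projections onto
    random hyperplanes. Points at a small angle to each other fall on the
    same side of most planes, so close points collide with high probability."""
    planes = [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(n_planes)]
    def h(p):
        return tuple(sum(a * b for a, b in zip(plane, p)) >= 0
                     for plane in planes)
    return h

def build_index(points, dim, n_tables=8, n_planes=6, seed=0):
    rng = random.Random(seed)
    tables = []
    for _ in range(n_tables):
        h = make_hash(dim, n_planes, rng)
        buckets = {}
        for i, p in enumerate(points):
            buckets.setdefault(h(p), []).append(i)
        tables.append((h, buckets))
    return tables

def query(tables, points, q):
    """Approximate nearest neighbor: inspect only the candidates that
    collide with q in some table, instead of a brute-force linear scan."""
    cand = set()
    for h, buckets in tables:
        cand.update(buckets.get(h(q), []))
    def dist(i):
        return sum((a - b) ** 2 for a, b in zip(points[i], q))
    return min(cand, key=dist) if cand else None

rng = random.Random(1)
dim = 20
points = [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(200)]
q = [x + rng.gauss(0, 0.05) for x in points[17]]  # near-duplicate of point 17
nearest = query(build_index(points, dim), points, q)
```

With a slightly perturbed copy of a stored point as the query, the index almost surely returns that point while examining only a small bucket of candidates.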

3,705 citations


Journal ArticleDOI
TL;DR: Standardized guidelines for response assessment are needed to ensure comparability among clinical trials in non-Hodgkin's lymphomas (NHL), and two meetings were convened among United States and international lymphoma experts to develop a uniform set of criteria for assessing response in clinical trials.
Abstract: Standardized guidelines for response assessment are needed to ensure comparability among clinical trials in non-Hodgkin's lymphomas (NHL). To achieve this, two meetings were convened among United States and international lymphoma experts representing medical hematology/oncology, radiology, radiation oncology, and pathology to review currently used response definitions and to develop a uniform set of criteria for assessing response in clinical trials. The criteria that were developed include anatomic definitions of response, with normal lymph node size after treatment of 1.5 cm in the longest transverse diameter by computer-assisted tomography scan. A designation of complete response/unconfirmed was adopted to include patients with a greater than 75% reduction in tumor size after therapy but with a residual mass, to include patients-especially those with large-cell NHL-who may not have residual disease. Single-photon emission computed tomography gallium scans are encouraged as a valuable adjunct to assessment of patients with large-cell NHL, but such scans require appropriate expertise. Flow cytometric, cytogenetic, and molecular studies are not currently included in response definitions. Response rates may be the most important objective in phase II trials where the activity of a new agent is important and may provide support for approval by regulatory agencies. However, the goals of most phase III trials are to identify therapies that will prolong the progression-free survival, if not the overall survival, of the treated patients. We hope that these guidelines will serve to improve communication among investigators and comparability among clinical trials until clinically relevant laboratory and imaging studies are identified and become more widely available.

3,495 citations


Proceedings Article
12 Nov 1999
TL;DR: Snort provides a layer of defense that monitors network traffic for predefined suspicious activity or patterns and alerts system administrators when potentially hostile traffic is detected.
Abstract: Network intrusion detection systems (NIDS) are an important part of any network security architecture. They provide a layer of defense which monitors network traffic for predefined suspicious activity or patterns, and alert system administrators when potential hostile traffic is detected. Commercial NIDS have many differences, but Information Systems departments must face the commonalities that they share such as significant system footprint, complex deployment and high monetary cost. Snort was designed to address these issues.
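The signature-matching idea behind such a NIDS can be sketched in a few lines; the rule format and rule contents below are hypothetical simplifications for illustration, not Snort's actual rule language:

```python
# Hypothetical rules: (alert name, required protocol, destination port,
# payload substring). This mimics signature-based NIDS matching in spirit
# only; it is not Snort's real rule syntax or engine.
RULES = [
    ("possible telnet probe",   "tcp", 23, b""),
    ("web cgi-bin access",      "tcp", 80, b"/cgi-bin/"),
    ("suspicious shell string", "tcp", 80, b"/bin/sh"),
]

def match_packet(proto, dport, payload, rules=RULES):
    """Return the names of all rules this packet triggers."""
    alerts = []
    for name, r_proto, r_port, needle in rules:
        if proto == r_proto and dport == r_port and needle in payload:
            alerts.append(name)
    return alerts

alerts = match_packet("tcp", 80, b"GET /cgi-bin/test.cgi?x=;/bin/sh HTTP/1.0")
```

A packet is compared against every rule; each matching rule produces an alert for the administrator, which is the "layer of defense" the abstract describes.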

3,490 citations


Journal ArticleDOI
18 Feb 1999-Nature
TL;DR: In this paper, an experimental demonstration of electromagnetically induced transparency in an ultracold gas of sodium atoms is presented, in which the optical pulses propagate twenty million times slower than the speed of light in a vacuum.
Abstract: Techniques that use quantum interference effects are being actively investigated to manipulate the optical properties of quantum systems1. One such example is electromagnetically induced transparency, a quantum effect that permits the propagation of light pulses through an otherwise opaque medium2,3,4,5. Here we report an experimental demonstration of electromagnetically induced transparency in an ultracold gas of sodium atoms, in which the optical pulses propagate at twenty million times slower than the speed of light in a vacuum. The gas is cooled to nanokelvin temperatures by laser and evaporative cooling6,7,8,9,10. The quantum interference controlling the optical properties of the medium is set up by a ‘coupling’ laser beam propagating at a right angle to the pulsed ‘probe’ beam. At nanokelvin temperatures, the variation of refractive index with probe frequency can be made very steep. In conjunction with the high atomic density, this results in the exceptionally low light speeds observed. By cooling the cloud below the transition temperature for Bose–Einstein condensation11,12,13 (causing a macroscopic population of alkali atoms in the quantum ground state of the confining potential), we observe even lower pulse propagation velocities (17 m s−1) owing to the increased atom density. We report an inferred nonlinear refractive index of 0.18 cm2 W−1 and find that the system shows exceptionally large optical nonlinearities, which are of potential fundamental and technological interest for quantum optics.
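The link between a steep index slope and slow pulses is the standard group-velocity relation v_g = c / (n + ω·dn/dω); the numbers below are illustrative assumptions chosen to land near the reported 17 m/s, not the experiment's measured parameters:

```python
# Group velocity in a dispersive medium: v_g = c / (n + omega * dn/domega).
# The steep variation of refractive index with probe frequency near the
# transparency resonance is what makes v_g tiny. Values are illustrative.
C = 299_792_458.0  # speed of light in vacuum, m/s

def group_velocity(n, omega, dn_domega):
    return C / (n + omega * dn_domega)

omega = 3.2e15       # optical angular frequency, rad/s (assumed)
dn_domega = 5.5e-9   # extremely steep index slope near resonance, s/rad (assumed)
vg = group_velocity(1.0, omega, dn_domega)
ratio = C / vg       # how many times slower than c
```

With these assumed values the group velocity comes out around 17 m/s, i.e. more than ten million times slower than c, matching the scale reported in the abstract.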

3,438 citations


Journal ArticleDOI
22 Jan 1999-Science
TL;DR: The synthesis of massive arrays of monodispersed carbon nanotubes that are self-oriented on patterned porous silicon and plain silicon substrates is reported and the mechanisms of nanotube growth and self-orientation are elucidated.
Abstract: The synthesis of massive arrays of monodispersed carbon nanotubes that are self-oriented on patterned porous silicon and plain silicon substrates is reported. The approach involves chemical vapor deposition, catalytic particle size control by substrate design, nanotube positioning by patterning, and nanotube self-assembly for orientation. The mechanisms of nanotube growth and self-orientation are elucidated. The well-ordered nanotubes can be used as electron field emission arrays. Scaling up of the synthesis process should be entirely compatible with the existing semiconductor processes, and should allow the development of nanotube devices integrated into silicon technology.

3,093 citations


Journal ArticleDOI
TL;DR: This article found that when the test was described as producing gender differences and stereotype threat was high, women performed substantially worse than equally qualified men did on difficult (but not easy) math tests among a highly selected sample of men and women.

Journal ArticleDOI
TL;DR: This review describes different forms of trust found in organizations and the antecedent conditions that produce them, and concludes by examining some of the psychological, social, and institutional barriers to the production of trust.
Abstract: Scholarly interest in the study of trust and distrust in organizations has grown dramatically over the past five years. This interest has been fueled, at least in part, by accumulating evidence that trust has a number of important benefits for organizations and their members. A primary aim of this review is to assess the state of this rapidly growing literature. The review examines recent progress in conceptualizing trust and distrust in organizational theory, and also summarizes evidence regarding the myriad benefits of trust within organizational systems. The review also describes different forms of trust found in organizations, and the antecedent conditions that produce them. Although the benefits of trust are well-documented, creating and sustaining trust is often difficult. Accordingly, the chapter concludes by examining some of the psychological, social, and institutional barriers to the production of trust.

Journal ArticleDOI
09 Jul 1999-Science
TL;DR: Immunological synapse formation is now shown to be an active and dynamic mechanism that allows T cells to distinguish potential antigenic ligands; formation of a stable central cluster was a determinative event for T cell proliferation.
Abstract: The specialized junction between a T lymphocyte and an antigen-presenting cell, the immunological synapse, consists of a central cluster of T cell receptors surrounded by a ring of adhesion molecules. Immunological synapse formation is now shown to be an active and dynamic mechanism that allows T cells to distinguish potential antigenic ligands. Initially, T cell receptor ligands were engaged in an outermost ring of the nascent synapse. Transport of these complexes into the central cluster was dependent on T cell receptor–ligand interaction kinetics. Finally, formation of a stable central cluster at the heart of the synapse was a determinative event for T cell proliferation.

Journal ArticleDOI
TL;DR: In this paper, the authors model a market populated by two groups of boundedly rational agents: "newswatchers" and "momentum traders" and provide a unified account of under- and overreactions.
Abstract: We model a market populated by two groups of boundedly rational agents: “newswatchers” and “momentum traders.” Each newswatcher observes some private information, but fails to extract other newswatchers' information from prices. If information diffuses gradually across the population, prices underreact in the short run. The underreaction means that the momentum traders can profit by trend-chasing. However, if they can only implement simple (i.e., univariate) strategies, their attempts at arbitrage must inevitably lead to overreaction at long horizons. In addition to providing a unified account of under- and overreactions, the model generates several other distinctive implications.
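The mechanism can be caricatured in a few lines: news diffuses gradually across newswatchers (short-run underreaction), and trend-chasing momentum demand then pushes the price past fundamental value (long-horizon overreaction). The dynamics and parameters below are my own toy construction, not the paper's model or calibration:

```python
# One piece of news of value 1 diffuses over `diffusion_periods` periods;
# momentum traders add demand proportional to the last price change.
def simulate(diffusion_periods=5, momentum_strength=0.6, horizon=12):
    prices = [0.0]
    for t in range(1, horizon + 1):
        fundamental = min(t / diffusion_periods, 1.0)  # gradual diffusion of news
        trend = prices[-1] - (prices[-2] if len(prices) > 1 else 0.0)
        prices.append(fundamental + momentum_strength * trend)
    return prices

prices = simulate()
underreacts = prices[2] < 1.0     # short run: price still below full value 1
overshoots = max(prices) > 1.0    # trend-chasing pushes price past 1
reverts = abs(prices[-1] - 1.0) < abs(max(prices) - 1.0)  # eventual correction
```

Even this crude toy reproduces the paper's qualitative pattern: underreaction while information diffuses, overshoot once trend-chasers pile in, and reversion afterward.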

Journal ArticleDOI
TL;DR: A multimethod field study of 92 workgroups explored the influence of three types of workgroup diversity (social category diversity, value diversity, and informational diversity) and two moderators.
Abstract: A multimethod field study of 92 workgroups explored the influence of three types of workgroup diversity (social category diversity, value diversity, and informational diversity) and two moderators ...

Journal ArticleDOI
TL;DR: Given the many mechanisms for disengaging moral control, civilized life requires, in addition to humane personal standards, safeguards built into social systems that uphold compassionate behavior and renounce cruelty.
Abstract: Moral agency is manifested in both the power to refrain from behaving inhumanely and the proactive power to behave humanely. Moral agency is embedded in a broader sociocognitive self theory encompassing self-organizing, proactive, self-reflective, and self-regulatory mechanisms rooted in personal standards linked to self-sanctions. The self-regulatory mechanisms governing moral conduct do not come into play unless they are activated, and there are many psychosocial maneuvers by which moral self-sanctions are selectively disengaged from inhumane conduct. The moral disengagement may center on the cognitive restructuring of inhumane conduct into a benign or worthy one by moral justification, sanitizing language, and advantageous comparison; disavowal of a sense of personal agency by diffusion or displacement of responsibility; disregarding or minimizing the injurious effects of one's actions; and attribution of blame to, and dehumanization of, those who are victimized. Many inhumanities operate through a supportive network of legitimate enterprises run by otherwise considerate people who contribute to destructive activities by disconnected subdivision of functions and diffusion of responsibility. Given the many mechanisms for disengaging moral control, civilized life requires, in addition to humane personal standards, safeguards built into social systems that uphold compassionate behavior and renounce cruelty.

Journal ArticleDOI
17 Sep 1999-Science
TL;DR: A simple model is described that unifies much of the previously contradictory data about the molecular mechanisms of this long-lasting increase in synaptic strength in the hippocampus.
Abstract: Long-term potentiation of synaptic transmission in the hippocampus is the leading experimental model for the synaptic changes that may underlie learning and memory. This review presents a current understanding of the molecular mechanisms of this long-lasting increase in synaptic strength and describes a simple model that unifies much of the data that previously were viewed as contradictory.

Journal ArticleDOI
TL;DR: It is found that Bagging improves when probabilistic estimates in conjunction with no-pruning are used, as well as when the data was backfit, and that Arc-x4 behaves differently than AdaBoost if reweighting is used instead of resampling, indicating a fundamental difference.
Abstract: Methods for voting classification algorithms, such as Bagging and AdaBoost, have been shown to be very successful in improving the accuracy of certain classifiers for artificial and real-world datasets. We review these algorithms and describe a large empirical study comparing several variants in conjunction with a decision tree inducer (three variants) and a Naive-Bayes inducer. The purpose of the study is to improve our understanding of why and when these algorithms, which use perturbation, reweighting, and combination techniques, affect classification error. We provide a bias and variance decomposition of the error to show how different methods and variants influence these two terms. This allowed us to determine that Bagging reduced variance of unstable methods, while boosting methods (AdaBoost and Arc-x4) reduced both the bias and variance of unstable methods but increased the variance for Naive-Bayes, which was very stable. We observed that Arc-x4 behaves differently than AdaBoost if reweighting is used instead of resampling, indicating a fundamental difference. Voting variants, some of which are introduced in this paper, include: pruning versus no pruning, use of probabilistic estimates, weight perturbations (Wagging), and backfitting of data. We found that Bagging improves when probabilistic estimates in conjunction with no-pruning are used, as well as when the data was backfit. We measure tree sizes and show an interesting positive correlation between the increase in the average tree size in AdaBoost trials and its success in reducing the error. We compare the mean-squared error of voting methods to non-voting methods and show that the voting methods lead to large and significant reductions in the mean-squared errors. Practical problems that arise in implementing boosting algorithms are explored, including numerical instabilities and underflows. 
We use scatterplots that graphically show how AdaBoost reweights instances, emphasizing not only “hard” areas but also outliers and noise.
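A minimal sketch of the Bagging idea on an unstable learner (a one-dimensional decision stump): each model is trained on a bootstrap resample and predictions are combined by majority vote, which smooths out the variance caused by noisy points. The data and parameters are toy assumptions, not the paper's experimental setup:

```python
import random

def stump_fit(data):
    """Fit a 1-D decision stump (threshold classifier, labels +1/-1)
    by minimizing training error over thresholds at the data points."""
    best = None
    for x, _ in data:
        for sign in (1, -1):
            err = sum(1 for xi, yi in data
                      if (1 if sign * (xi - x) >= 0 else -1) != yi)
            if best is None or err < best[0]:
                best = (err, x, sign)
    _, thr, sign = best
    return lambda x: 1 if sign * (x - thr) >= 0 else -1

def bagged_fit(data, n_rounds=25, seed=0):
    """Bagging: train each stump on a bootstrap resample of the data,
    then predict by majority vote over all stumps."""
    rng = random.Random(seed)
    models = [stump_fit([rng.choice(data) for _ in data])
              for _ in range(n_rounds)]
    return lambda x: 1 if sum(m(x) for m in models) >= 0 else -1

# Toy 1-D data: class +1 to the right of 0.5, with two mislabeled noise points.
data = [(i / 20, 1 if i / 20 >= 0.5 else -1) for i in range(20)]
data[3] = (data[3][0], 1)     # noise
data[17] = (data[17][0], -1)  # noise
vote = bagged_fit(data)
pred_left, pred_right = vote(0.1), vote(0.9)
```

Individual stumps fitted to different resamples can place the threshold erratically near the noise points; the majority vote recovers the underlying split at 0.5, which is the variance reduction the study measures.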

Book ChapterDOI
TL;DR: The Zimbardo Time Perspective Inventory (ZTPI) as mentioned in this paper is a measure assessing personal variations in time perspective profiles and specific time perspective biases, and it has been shown to have convergent, divergent, discriminant, and predictive validity.
Abstract: Time perspective (TP), a fundamental dimension in the construction of psychological time, emerges from cognitive processes partitioning human experience into past, present, and future temporal frames. The authors’ research program proposes that TP is a pervasive and powerful yet largely unrecognized influence on much human behavior. Although TP variations are learned and modified by a variety of personal, social, and institutional influences, TP also functions as an individual-differences variable. Reported is a new measure assessing personal variations in TP profiles and specific TP biases. The five factors of the Zimbardo Time Perspective Inventory were established through exploratory and confirmatory factor analyses and demonstrate acceptable internal and test–retest reliability. Convergent, divergent, discriminant, and predictive validity are shown by correlational and experimental research supplemented by case studies.

Journal ArticleDOI
TL;DR: In this paper, a reduced-form model of the valuation of contingent claims subject to default risk is presented, focusing on applications to the term structure of interest rates for corporate or sovereign bonds and the parameterization of losses at default in terms of the fractional reduction in market value that occurs at default.
Abstract: This article presents convenient reduced-form models of the valuation of contingent claims subject to default risk, focusing on applications to the term structure of interest rates for corporate or sovereign bonds. Examples include the valuation of a credit-spread option. This article presents a new approach to modeling term structures of bonds and other contingent claims that are subject to default risk. As in previous "reduced-form" models, we treat default as an unpredictable event governed by a hazard-rate process. Our approach is distinguished by the parameterization of losses at default in terms of the fractional reduction in market value that occurs at default. Specifically, we fix some contingent claim that, in the event of no default, pays X at time T. We take as given an arbitrage-free setting in which all securities are priced in terms of some short-rate process r and equivalent martingale measure Q [see Harrison and Kreps (1979) and Harrison and Pliska (1981)]. Under this "risk-neutral" probability measure, we let h_t denote the hazard rate for default at time t and let L_t denote the expected fractional loss in market value if default were to occur at time t.
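The convenience of the fractional-recovery-of-market-value parameterization is that the defaultable claim can be priced by discounting the promised payoff at a default-adjusted short rate R = r + h·L. The constant-parameter sketch below illustrates this with assumed numbers (the model itself allows r, h, L to be stochastic processes):

```python
import math

def defaultable_zero_price(r, h, L, T, payoff=1.0):
    """Time-0 price of a claim paying `payoff` at T if no default occurs,
    with constant short rate r, hazard rate h, and fractional loss L:
    discount the promised payoff at R = r + h * L."""
    R = r + h * L
    return payoff * math.exp(-R * T)

# Illustrative parameters (assumptions, not calibrated values):
r, h, L, T = 0.05, 0.02, 0.6, 5.0
risky = defaultable_zero_price(r, h, L, T)
riskless = math.exp(-r * T)
spread = h * L  # continuously compounded credit spread implied by the model
```

The defaultable bond is cheaper than its riskless counterpart, and the entire effect of default risk collapses into the spread h·L added to the discount rate.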

Journal ArticleDOI
TL;DR: A systematic set of statistical algorithms, based on whole-genome mRNA data, partitional clustering and motif discovery, is applied to identify transcriptional regulatory sub-networks in yeast—without any a priori knowledge of their structure or any assumptions about their dynamics.
Abstract: Technologies to measure whole-genome mRNA abundances1,2,3 and methods to organize and display such data4,5,6,7,8,9,10 are emerging as valuable tools for systems-level exploration of transcriptional regulatory networks. For instance, it has been shown that mRNA data from 118 genes, measured at several time points in the developing hindbrain of mice, can be hierarchically clustered into various patterns (or 'waves') whose members tend to participate in common processes5. We have previously shown that hierarchical clustering can group together genes whose cis-regulatory elements are bound by the same proteins in vivo6. Hierarchical clustering has also been used to organize genes into hierarchical dendograms on the basis of their expression across multiple growth conditions7. The application of Fourier analysis to synchronized yeast mRNA expression data has identified cell-cycle periodic genes, many of which have expected cis-regulatory elements8. Here we apply a systematic set of statistical algorithms, based on whole-genome mRNA data, partitional clustering and motif discovery, to identify transcriptional regulatory sub-networks in yeast—without any a priori knowledge of their structure or any assumptions about their dynamics. This approach uncovered new regulons (sets of co-regulated genes) and their putative cis-regulatory elements. We used statistical characterization of known regulons and motifs to derive criteria by which we infer the biological significance of newly discovered regulons and motifs. Our approach holds promise for the rapid elucidation of genetic network architecture in sequenced organisms in which little biology is known.
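The partitional-clustering step can be sketched with a plain k-means over toy expression profiles; in the paper's pipeline, motif discovery within each resulting cluster would follow. The profiles, k, and the deterministic initialization below are illustrative assumptions:

```python
def kmeans(profiles, k, iters=50):
    """Plain k-means: assign each profile to its nearest center (squared
    Euclidean distance), then recompute centers as cluster means."""
    centers = profiles[:k]  # deterministic init for reproducibility: first k profiles
    assign = [0] * len(profiles)
    for _ in range(iters):
        for i, p in enumerate(profiles):
            assign[i] = min(range(k),
                            key=lambda c: sum((a - b) ** 2
                                              for a, b in zip(p, centers[c])))
        for c in range(k):
            members = [profiles[i] for i in range(len(profiles)) if assign[i] == c]
            if members:
                centers[c] = [sum(col) / len(members) for col in zip(*members)]
    return assign

# Made-up expression profiles over 4 conditions: two induced, two repressed.
profiles = [
    [0.1, 0.2, 0.9, 1.0],  # geneA: induced late
    [0.0, 0.1, 1.0, 0.9],  # geneB: induced late
    [1.0, 0.9, 0.1, 0.0],  # geneC: repressed late
    [0.9, 1.0, 0.2, 0.1],  # geneD: repressed late
]
labels = kmeans(profiles, k=2)
same_regulon = labels[0] == labels[1] and labels[2] == labels[3]
distinct = labels[0] != labels[2]
```

Genes with similar profiles land in the same cluster (a candidate regulon); searching each cluster's upstream sequences for shared motifs is then the candidate cis-regulatory element step.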

Journal ArticleDOI
01 Mar 1999-Neuron
TL;DR: A genetic mosaic system in Drosophila is described, in which a dominant repressor of a cell marker is placed in trans to a mutant gene of interest, which allows for the study of gene functions in neuroblast proliferation, axon guidance, and dendritic elaboration in the complex central nervous system.

Journal ArticleDOI
TL;DR: In this paper, an integrative model of the relationships among diversity, conflict, and performance is presented, and the authors test that model with a sample of 45 teams and find that diversity shapes conflict and that conflict, in turn, shapes performance, but these linkages have subtleties.
Abstract: In this paper we present an integrative model of the relationships among diversity, conflict, and performance, and we test that model with a sample of 45 teams. Findings show that diversity shapes conflict and that conflict, in turn, shapes performance, but these linkages have subtleties. Functional background diversity drives task conflict, but multiple types of diversity drive emotional conflict. Race and tenure diversity are positively associated with emotional conflict, while age diversity is negatively associated with such conflict. Task routineness and group longevity moderate these relationships. Results further show that task conflict has more favorable effects on cognitive task performance than does emotional conflict. Overall, these patterns suggest a complex link between work group diversity and work group functioning.

Journal ArticleDOI
TL;DR: The Berlin Questionnaire was evaluated for its usefulness in identifying patients likely to have sleep apnea in primary care settings.
Abstract: Although sleep apnea is common, it often goes undiagnosed in primary care encounters. The Berlin Questionnaire was found to be a means of identifying patients who are likely to have sleep apnea.

Journal ArticleDOI
06 Aug 1999-Cell
TL;DR: It is determined that canine narcolepsy is caused by disruption of the hypocretin (orexin) receptor 2 gene (Hcrtr2); this result identifies hypocretins as major sleep-modulating neurotransmitters and opens novel potential therapeutic approaches for narcoleptic patients.

Journal ArticleDOI
TL;DR: Exploration of the genome using DNA microarrays and other genome-scale technologies should narrow the gap in the knowledge of gene function and molecular biology between the currently-favoured model organisms and other species.
Abstract: Thousands of genes are being discovered for the first time by sequencing the genomes of model organisms, an exhilarating reminder that much of the natural world remains to be explored at the molecular level. DNA microarrays provide a natural vehicle for this exploration. The model organisms are the first for which comprehensive genome-wide surveys of gene expression patterns or function are possible. The results can be viewed as maps that reflect the order and logic of the genetic program, rather than the physical order of genes on chromosomes. Exploration of the genome using DNA microarrays and other genome-scale technologies should narrow the gap in our knowledge of gene function and molecular biology between the currently-favoured model organisms and other species.

Journal ArticleDOI
TL;DR: An intervention designed specifically to meet the needs of a heterogeneous group of chronic disease patients, including those with comorbid conditions, was feasible and beneficial beyond usual care in terms of improved health behaviors and health status.
Abstract: Objectives.This study evaluated the effectiveness (changes in health behaviors, health status, and health service utilization) of a self-management program for chronic disease designed for use with a heterogeneous group of chronic disease patients. It also explored the differential effectiveness of

Journal ArticleDOI
TL;DR: In this paper, the authors proposed a solution to the hierarchy problem not relying on low-energy supersymmetry or technicolor, instead, the problem is nullified by bringing quantum gravity down to the TeV scale.
Abstract: We recently proposed a solution to the hierarchy problem not relying on low-energy supersymmetry or technicolor. Instead, the problem is nullified by bringing quantum gravity down to the TeV scale. This is accomplished by the presence of n ≥ 2 new dimensions of submillimeter size, with the SM fields localized on a 3-brane in the higher dimensional space. In this paper we systematically study the experimental viability of this scenario. Constraints arise both from strong quantum gravitational effects at the TeV scale, and more importantly from the production of massless higher dimensional gravitons with TeV suppressed couplings. Theories with n > 2 are safe due mainly to the infrared softness of higher dimensional gravity. For n = 2, the six dimensional Planck scale must be pushed above ~30 TeV to avoid cooling SN 1987A and distortions of the diffuse photon background. Nevertheless, the particular implementation of our framework within type I string theory can evade all constraints, for any n ≥ 2, with string scale m_s ~ 1 TeV. We also explore novel phenomena resulting from the existence of new states propagating in the higher dimensional space. The Peccei-Quinn solution to the strong CP problem is revived with a weak scale axion in the bulk. Gauge fields in the bulk can mediate repulsive forces 10^6-10^8 times stronger than gravity at submillimeter distances, as well as help stabilize the proton. Higher-dimensional gravitons produced on our brane and captured on a different "fat" brane can provide a natural dark matter candidate.
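The scenario's basic relation (standard in this class of models) ties the observed 4-d Planck scale to the fundamental scale M_* and the size R of the n extra dimensions, M_Planck^2 ~ M_*^(n+2) R^n, with order-one prefactors dropped; the sketch below solves for R to show why n = 2 gives millimeter-scale dimensions:

```python
# Order-of-magnitude only: prefactors of order one (and the distinction
# between the full and reduced Planck mass) are deliberately ignored.
HBARC_TEV_M = 1.9733e-19  # hbar*c in TeV*m, converts 1/TeV to meters
M_PLANCK_TEV = 1.22e16    # 4-d Planck scale in TeV

def radius_m(n, m_star_tev):
    """Size R of n extra dimensions from M_Planck^2 ~ M_*^(n+2) * R^n."""
    r_inv_tev = m_star_tev * (m_star_tev / M_PLANCK_TEV) ** (2.0 / n)
    return HBARC_TEV_M / r_inv_tev

r2 = radius_m(2, 1.0)  # n = 2, M_* = 1 TeV: millimeter scale
r6 = radius_m(6, 1.0)  # n = 6: far below any submillimeter gravity test
```

For n = 2 with M_* ~ 1 TeV the naive estimate lands at the millimeter scale (hence the submillimeter constraints and the ~30 TeV bound discussed above), while larger n drives R far below the reach of direct gravity experiments.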

Journal ArticleDOI
01 Jan 1999-Science
TL;DR: The temporal program of gene expression during a model physiological response of human cells, the response of fibroblasts to serum, was explored with a complementary DNA microarray representing 8600 different human genes.
Abstract: The temporal program of gene expression during a model physiological response of human cells, the response of fibroblasts to serum, was explored with a complementary DNA microarray representing about 8600 different human genes. Genes could be clustered into groups on the basis of their temporal patterns of expression in this program. Many features of the transcriptional program appeared to be related to the physiology of wound repair, suggesting that fibroblasts play a larger and richer role in this complex multicellular response than had previously been appreciated.
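Grouping genes "on the basis of their temporal patterns of expression," as described above, is typically done with a correlation-based similarity between time-course profiles. The following is a minimal illustrative sketch of that idea (hypothetical gene names and toy data; the paper's actual analysis used hierarchical clustering of the microarray measurements, not this greedy procedure):

```python
# Hedged sketch (not the authors' pipeline): group genes whose
# temporal expression profiles are strongly correlated, using the
# Pearson correlation of their time courses as the similarity.
from math import sqrt

def pearson(x, y):
    """Pearson correlation between two equal-length profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def cluster_by_profile(profiles, threshold=0.9):
    """Greedy single-pass clustering: each gene joins the first
    cluster whose seed profile it correlates with above `threshold`,
    otherwise it seeds a new cluster."""
    clusters = []  # list of (seed_name, member_names)
    for name, prof in profiles.items():
        for seed, members in clusters:
            if pearson(profiles[seed], prof) >= threshold:
                members.append(name)
                break
        else:
            clusters.append((name, [name]))
    return [members for _, members in clusters]

# Toy time courses (hypothetical genes, 6 time points after serum):
profiles = {
    "geneA": [0, 1, 3, 5, 4, 2],   # early induction
    "geneB": [0, 2, 6, 10, 8, 4],  # same shape, scaled 2x
    "geneC": [5, 4, 3, 2, 1, 0],   # steadily repressed
}
print(cluster_by_profile(profiles))  # -> [['geneA', 'geneB'], ['geneC']]
```

Correlation-based distances are insensitive to the overall magnitude of induction (geneB is geneA scaled by two yet clusters with it), which matches the goal of grouping genes by the shape of their temporal response rather than its amplitude.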

Journal ArticleDOI
27 Oct 1999-JAMA
TL;DR: In this article, the effects of reducing television, videotape, and video game use on changes in adiposity, physical activity, and dietary intake were evaluated in a randomized controlled school-based trial.
Abstract: Context: Some observational studies have found an association between television viewing and child and adolescent adiposity. Objective: To assess the effects of reducing television, videotape, and video game use on changes in adiposity, physical activity, and dietary intake. Design: Randomized controlled school-based trial conducted from September 1996 to April 1997. Setting: Two sociodemographically and scholastically matched public elementary schools in San Jose, Calif. Participants: Of 198 third- and fourth-grade students who were given parental consent to participate, 192 students (mean age, 8.9 years) completed the study. Intervention: Children in 1 elementary school received an 18-lesson, 6-month classroom curriculum to reduce television, videotape, and video game use. Main Outcome Measures: Changes in measures of height, weight, triceps skinfold thickness, waist and hip circumferences, and cardiorespiratory fitness; self-reported media use, physical activity, and dietary behaviors; and parental report of child and family behaviors. The primary outcome measure was body mass index, calculated as weight in kilograms divided by the square of height in meters. Results: Compared with controls, children in the intervention group had statistically significant relative decreases in body mass index (intervention vs control change: 18.38 to 18.67 kg/m2 vs 18.10 to 18.81 kg/m2, respectively; adjusted difference, −0.45 kg/m2 [95% confidence interval {CI}, −0.73 to −0.17]; P=.002), triceps skinfold thickness (intervention vs control change: 14.55 to 15.47 mm vs 13.97 to 16.46 mm, respectively; adjusted difference, −1.47 mm [95% CI, −2.41 to −0.54]; P=.002), waist circumference (intervention vs control change: 60.48 to 63.57 cm vs 59.51 to 64.73 cm, respectively; adjusted difference, −2.30 cm [95% CI, −3.27 to −1.33]; P<.001), and waist-to-hip ratio (intervention vs control change: 0.83 to 0.83 vs 0.82 to 0.84, respectively; adjusted difference, −0.02 [95% CI, −0.03 to −0.01]; P<.001).
Relative to controls, intervention group changes were accompanied by statistically significant decreases in children's reported television viewing and meals eaten in front of the television. There were no statistically significant differences between groups for changes in high-fat food intake, moderate-to-vigorous physical activity, and cardiorespiratory fitness. Conclusions: Reducing television, videotape, and video game use may be a promising, population-based approach to prevent childhood obesity.
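The primary outcome above, body mass index, is weight in kilograms divided by the square of height in meters; the raw between-group contrast implied by the reported mean changes can be checked with a short sketch. (Illustrative only: the trial's −0.45 kg/m2 figure is a covariate-adjusted estimate from the authors' model, not this raw subtraction.)

```python
# BMI as defined in the abstract: weight (kg) / height (m)^2.
def bmi(weight_kg, height_m):
    return weight_kg / height_m ** 2

# Mean BMI changes reported in the Results (kg/m^2):
intervention_change = 18.67 - 18.38   # +0.29 over the school year
control_change      = 18.81 - 18.10   # +0.71 over the school year

# Raw difference in changes; the published adjusted difference
# (-0.45) additionally controls for baseline covariates.
raw_difference = intervention_change - control_change
print(f"raw between-group difference: {raw_difference:.2f} kg/m^2")
```

The raw contrast (about −0.42 kg/m2) is close to, but not identical with, the adjusted −0.45 kg/m2 reported in the abstract, which is expected given the covariate adjustment.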