
Showing papers by "Stanford University" published in 2001


Journal ArticleDOI
Eric S. Lander, Lauren Linton, Bruce W. Birren, Chad Nusbaum, +245 more (29 institutions)
15 Feb 2001-Nature
TL;DR: The results of an international collaboration to produce and make freely available a draft sequence of the human genome are reported and an initial analysis is presented, describing some of the insights that can be gleaned from the sequence.
Abstract: The human genome holds an extraordinary trove of information about human development, physiology, medicine and evolution. Here we report the results of an international collaboration to produce and make freely available a draft sequence of the human genome. We also present an initial analysis of the data, describing some of the insights that can be gleaned from the sequence.

22,269 citations


Journal ArticleDOI
TL;DR: A general gradient descent boosting paradigm is developed for additive expansions based on any fitting criterion, and specific algorithms are presented for least-squares, least absolute deviation, and Huber-M loss functions for regression, and multiclass logistic likelihood for classification.
Abstract: Function estimation/approximation is viewed from the perspective of numerical optimization in function space, rather than parameter space. A connection is made between stagewise additive expansions and steepest-descent minimization. A general gradient descent “boosting” paradigm is developed for additive expansions based on any fitting criterion. Specific algorithms are presented for least-squares, least absolute deviation, and Huber-M loss functions for regression, and multiclass logistic likelihood for classification. Special enhancements are derived for the particular case where the individual additive components are regression trees, and tools for interpreting such “TreeBoost” models are presented. Gradient boosting of regression trees produces competitive, highly robust, interpretable procedures for both regression and classification, especially appropriate for mining less than clean data. Connections between this approach and the boosting methods of Freund and Schapire and of Friedman, Hastie and Tibshirani are discussed.

17,764 citations
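
For the least-squares case, the boosting recipe described above is compact: each stage fits a regression tree to the current residuals (the negative gradient of squared-error loss) and takes a damped step. Below is a minimal sketch assuming scikit-learn's DecisionTreeRegressor; the function and parameter names are illustrative, not the paper's.

```python
# Minimal sketch of least-squares gradient boosting with regression trees.
# Assumes scikit-learn; names and defaults are illustrative.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def fit_boosted_trees(X, y, n_stages=100, learning_rate=0.1, max_depth=3):
    """Fit F(x) = F0 + nu * sum_m h_m(x) by steepest descent in function space."""
    F0 = float(np.mean(y))            # best constant fit under squared error
    F = np.full(len(y), F0)
    trees = []
    for _ in range(n_stages):
        residual = y - F              # negative gradient of (y - F)^2 / 2
        h = DecisionTreeRegressor(max_depth=max_depth).fit(X, residual)
        F = F + learning_rate * h.predict(X)   # damped stagewise update
        trees.append(h)
    return F0, trees

def predict_boosted(F0, trees, X, learning_rate=0.1):
    return F0 + learning_rate * sum(t.predict(X) for t in trees)
```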


Journal ArticleDOI
TL;DR: A method that assigns a score to each gene on the basis of change in gene expression relative to the standard deviation of repeated measurements is described, suggesting that this repair pathway for UV-damaged DNA might play a previously unrecognized role in repairing DNA damaged by ionizing radiation.
Abstract: Microarrays can measure the expression of thousands of genes to identify changes in expression between different biological states. Methods are needed to determine the significance of these changes while accounting for the enormous number of genes. We describe a method, Significance Analysis of Microarrays (SAM), that assigns a score to each gene on the basis of change in gene expression relative to the standard deviation of repeated measurements. For genes with scores greater than an adjustable threshold, SAM uses permutations of the repeated measurements to estimate the percentage of genes identified by chance, the false discovery rate (FDR). When the transcriptional response of human cells to ionizing radiation was measured by microarrays, SAM identified 34 genes that changed at least 1.5-fold with an estimated FDR of 12%, compared with FDRs of 60 and 84% by using conventional methods of analysis. Of the 34 genes, 19 were involved in cell cycle regulation and 3 in apoptosis. Surprisingly, four nucleotide excision repair genes were induced, suggesting that this repair pathway for UV-damaged DNA might play a previously unrecognized role in repairing DNA damaged by ionizing radiation.

12,102 citations
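
A rough sketch of the SAM idea: score each gene with a regularized t-like statistic, then estimate the false discovery rate by permuting the sample labels. The fudge factor s0 and all names below are illustrative assumptions, not the authors' exact tuning.

```python
# SAM-like analysis sketch: regularized per-gene scores plus permutation FDR.
import numpy as np

def sam_scores(A, B, s0=0.1):
    """A, B: (genes x replicates) expression matrices for two conditions."""
    diff = A.mean(1) - B.mean(1)
    s = np.sqrt(A.var(1, ddof=1) / A.shape[1] + B.var(1, ddof=1) / B.shape[1])
    return diff / (s + s0)            # s0 damps genes with tiny variance

def permutation_fdr(A, B, threshold, n_perm=200, seed=0):
    """Estimate FDR at a score threshold by permuting column labels."""
    rng = np.random.default_rng(seed)
    X, nA = np.hstack([A, B]), A.shape[1]
    observed = np.sum(np.abs(sam_scores(A, B)) > threshold)
    null = [np.sum(np.abs(sam_scores(X[:, p[:nA]], X[:, p[nA:]])) > threshold)
            for p in (rng.permutation(X.shape[1]) for _ in range(n_perm))]
    return np.mean(null) / max(observed, 1)
```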


Journal ArticleDOI
TL;DR: Social cognitive theory distinguishes among three modes of agency: direct personal agency, proxy agency that relies on others to act at one's behest to secure desired outcomes, and collective agency exercised through socially coordinative and interdependent effort.
Abstract: The capacity to exercise control over the nature and quality of one's life is the essence of humanness. Human agency is characterized by a number of core features that operate through phenomenal and functional consciousness. These include the temporal extension of agency through intentionality and forethought, self-regulation by self-reactive influence, and self-reflectiveness about one's capabilities, quality of functioning, and the meaning and purpose of one's life pursuits. Personal agency operates within a broad network of sociostructural influences. In these agentic transactions, people are producers as well as products of social systems. Social cognitive theory distinguishes among three modes of agency: direct personal agency, proxy agency that relies on others to act at one's behest to secure desired outcomes, and collective agency exercised through socially coordinative and interdependent effort. Growing transnational embeddedness and interdependence are placing a premium on collective efficacy to exercise control over personal destinies and national life.

11,235 citations


Journal ArticleDOI
TL;DR: Survival analyses on a subcohort of patients with locally advanced breast cancer uniformly treated in a prospective study showed significantly different outcomes for the patients belonging to the various groups, including a poor prognosis for the basal-like subtype and a significant difference in outcome for the two estrogen receptor-positive groups.
Abstract: The purpose of this study was to classify breast carcinomas based on variations in gene expression patterns derived from cDNA microarrays and to correlate tumor characteristics to clinical outcome. A total of 85 cDNA microarray experiments representing 78 cancers, three fibroadenomas, and four normal breast tissues were analyzed by hierarchical clustering. As reported previously, the cancers could be classified into a basal epithelial-like group, an ERBB2-overexpressing group and a normal breast-like group based on variations in gene expression. A novel finding was that the previously characterized luminal epithelial/estrogen receptor-positive group could be divided into at least two subgroups, each with a distinctive expression profile. These subtypes proved to be reasonably robust by clustering using two different gene sets: first, a set of 456 cDNA clones previously selected to reflect intrinsic properties of the tumors and, second, a gene set that highly correlated with patient outcome. Survival analyses on a subcohort of patients with locally advanced breast cancer uniformly treated in a prospective study showed significantly different outcomes for the patients belonging to the various groups, including a poor prognosis for the basal-like subtype and a significant difference in outcome for the two estrogen receptor-positive groups.

10,791 citations
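
The pipeline described above, hierarchical clustering of tumors on a selected gene set, can be sketched generically. Random data stand in for the study's 85 arrays and 456 intrinsic genes; correlation distance and average linkage are assumed here as a common microarray choice, not taken from the paper.

```python
# Generic hierarchical-clustering sketch for an arrays-x-genes matrix.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)
X = rng.normal(size=(85, 456))        # stand-in: 85 arrays x 456 genes

D = pdist(X, metric="correlation")    # 1 - Pearson correlation per array pair
Z = linkage(D, method="average")
labels = fcluster(Z, t=5, criterion="maxclust")   # cut the tree into 5 groups
print(np.bincount(labels)[1:])        # group sizes
```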


Journal ArticleDOI
01 Nov 2001-Nature
TL;DR: Stem cell biology has come of age: Unequivocal proof that stem cells exist in the haematopoietic system has given way to the prospective isolation of several tissue-specific stem and progenitor cells, the initial delineation of their properties and expressed genetic programmes, and the beginnings of their utility in regenerative medicine.
Abstract: Stem cell biology has come of age. Unequivocal proof that stem cells exist in the haematopoietic system has given way to the prospective isolation of several tissue-specific stem and progenitor cells, the initial delineation of their properties and expressed genetic programmes, and the beginnings of their utility in regenerative medicine. Perhaps the most important and useful property of stem cells is that of self-renewal. Through this property, striking parallels can be found between stem cells and cancer cells: tumours may often originate from the transformation of normal stem cells, similar signalling pathways may regulate self-renewal in stem cells and cancer cells, and cancer cells may include 'cancer stem cells' - rare cells with indefinite potential for self-renewal that drive tumorigenesis.

8,999 citations


Book ChapterDOI
19 Aug 2001
TL;DR: This work proposes a fully functional identity-based encryption scheme (IBE) based on the Weil pairing that has chosen ciphertext security in the random oracle model assuming an elliptic curve variant of the computational Diffie-Hellman problem.
Abstract: We propose a fully functional identity-based encryption scheme (IBE). The scheme has chosen ciphertext security in the random oracle model assuming an elliptic curve variant of the computational Diffie-Hellman problem. Our system is based on the Weil pairing. We give precise definitions for secure identity-based encryption schemes and give several applications for such systems.

7,083 citations
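
The BasicIdent flow (Setup, Extract, Encrypt, Decrypt) can be shown structurally with a toy bilinear map in which group elements are integers and e(a, b) = g^(ab) mod p. This map is bilinear but offers no security at all, and every parameter below is an illustrative stand-in for the paper's Weil-pairing instantiation.

```python
# Structural sketch of Boneh-Franklin BasicIdent with a TOY pairing.
# e(a, b) = g^(a*b) mod p is bilinear but trivially breakable: this is
# for illustrating the message flow only, never for real use.
import hashlib, secrets

p = 2**127 - 1                 # toy modulus (a Mersenne prime)
q = p - 1                      # exponent group order
g = 3

def e(a, b):                   # toy bilinear map on "points" a, b
    return pow(g, (a * b) % q, p)

def H1(identity):              # hash an identity string to a "point"
    return int.from_bytes(hashlib.sha256(identity.encode()).digest(), "big") % q

def H2(x):                     # hash a pairing value to a keystream integer
    return int.from_bytes(hashlib.sha256(str(x).encode()).digest(), "big")

s = secrets.randbelow(q)       # Setup: master secret s, public Ppub = s*P (P = 1)
Ppub = s % q

d_id = (s * H1("alice@example.com")) % q   # Extract: private key for an identity

m = 1234567                    # Encrypt m to the identity
r = secrets.randbelow(q)
U = r % q                                               # U = r*P
V = m ^ H2(pow(e(H1("alice@example.com"), Ppub), r, p))

assert (V ^ H2(e(d_id, U))) == m           # Decrypt with d_id recovers m
```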


Journal ArticleDOI
TL;DR: This work gives examples exhibiting several advantages over MOF, MP, and BOB, including better sparsity and superresolution, and obtains reasonable success with a primal-dual logarithmic barrier method and conjugate-gradient solver.
Abstract: The time-frequency and time-scale communities have recently developed a large number of overcomplete waveform dictionaries---stationary wavelets, wavelet packets, cosine packets, chirplets, and warplets, to name a few. Decomposition into overcomplete systems is not unique, and several methods for decomposition have been proposed, including the method of frames (MOF), matching pursuit (MP), and, for special dictionaries, the best orthogonal basis (BOB). Basis pursuit (BP) is a principle for decomposing a signal into an "optimal" superposition of dictionary elements, where optimal means having the smallest l1 norm of coefficients among all such decompositions. We give examples exhibiting several advantages over MOF, MP, and BOB, including better sparsity and superresolution. BP has interesting relations to ideas in areas as diverse as ill-posed problems, abstract harmonic analysis, total variation denoising, and multiscale edge denoising. BP in highly overcomplete dictionaries leads to large-scale optimization problems. With signals of length 8192 and a wavelet packet dictionary, one gets an equivalent linear program of size 8192 by 212,992. Such problems can be attacked successfully only because of recent advances in linear and quadratic programming by interior-point methods. We obtain reasonable success with a primal-dual logarithmic barrier method and conjugate-gradient solver.

4,387 citations
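
The linear program mentioned above is direct to write down: min ||x||_1 subject to Ax = s becomes a standard LP after splitting x = u - v with u, v >= 0. The sketch below solves it with SciPy's "highs" solver rather than the paper's primal-dual log-barrier method; the dictionary and signal are made up.

```python
# Basis pursuit as a linear program: minimize ||x||_1 subject to A x = s.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m = 20, 50                                  # n samples, m-atom dictionary
A = rng.normal(size=(n, m))
x_true = np.zeros(m)
x_true[[3, 17, 41]] = [2.0, -1.5, 0.7]         # sparse ground truth
s = A @ x_true

c = np.ones(2 * m)                             # objective: sum(u) + sum(v)
res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=s,
              bounds=[(0, None)] * (2 * m), method="highs")
x_hat = res.x[:m] - res.x[m:]
print(np.flatnonzero(np.abs(x_hat) > 1e-6))    # typically recovers [3 17 41]
```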


Journal ArticleDOI
TL;DR: In this paper, the authors proposed a method called the "gap statistic" for estimating the number of clusters (groups) in a set of data, which uses the output of any clustering algorithm (e.g. K-means or hierarchical), comparing the change in within-cluster dispersion with that expected under an appropriate reference null distribution.
Abstract: We propose a method (the ‘gap statistic’) for estimating the number of clusters (groups) in a set of data. The technique uses the output of any clustering algorithm (e.g. K-means or hierarchical), comparing the change in within-cluster dispersion with that expected under an appropriate reference null distribution. Some theory is developed for the proposal and a simulation study shows that the gap statistic usually outperforms other methods that have been proposed in the literature.

4,283 citations
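
The procedure is only a few lines: for each k, compare the log within-cluster dispersion of the data to its average over reference data drawn uniformly on the data's bounding box (one of the reference choices the paper discusses). The sketch assumes scikit-learn's KMeans and uses the within-cluster sum of squares as the dispersion.

```python
# Gap statistic sketch with K-means and a uniform-box reference distribution.
import numpy as np
from sklearn.cluster import KMeans

def dispersion(X, k, seed=0):
    return KMeans(n_clusters=k, n_init=10, random_state=seed).fit(X).inertia_

def gap(X, k, n_ref=10, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = X.min(0), X.max(0)
    ref = [np.log(dispersion(rng.uniform(lo, hi, X.shape), k))
           for _ in range(n_ref)]
    return np.mean(ref) - np.log(dispersion(X, k))

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(c, 0.3, (50, 2)) for c in ([0, 0], [3, 3], [0, 3])])
print([round(gap(X, k), 2) for k in range(1, 6)])   # typically largest near k=3
```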


Proceedings ArticleDOI
01 May 2001
TL;DR: An implementation is demonstrated that is able to align two range images in a few tens of milliseconds, assuming a good initial guess, and has potential application to real-time 3D model acquisition and model-based tracking.
Abstract: The ICP (Iterative Closest Point) algorithm is widely used for geometric alignment of three-dimensional models when an initial estimate of the relative pose is known. Many variants of ICP have been proposed, affecting all phases of the algorithm from the selection and matching of points to the minimization strategy. We enumerate and classify many of these variants, and evaluate their effect on the speed with which the correct alignment is reached. In order to improve convergence for nearly-flat meshes with small features, such as inscribed surfaces, we introduce a new variant based on uniform sampling of the space of normals. We conclude by proposing a combination of ICP variants optimized for high speed. We demonstrate an implementation that is able to align two range images in a few tens of milliseconds, assuming a good initial guess. This capability has potential application to real-time 3D model acquisition and model-based tracking.

4,059 citations
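
The core loop that all of these variants modify is short: match each source point to its nearest target point, solve the rigid motion in closed form via the SVD, apply it, and repeat. Below is the generic point-to-point version, not the authors' optimized high-speed combination.

```python
# Generic point-to-point ICP sketch: nearest-neighbor correspondences plus
# a closed-form SVD (Procrustes) rigid-alignment step per iteration.
import numpy as np
from scipy.spatial import cKDTree

def best_rigid(P, Q):
    """Least-squares rotation R and translation t mapping points P onto Q."""
    cP, cQ = P.mean(0), Q.mean(0)
    U, _, Vt = np.linalg.svd((P - cP).T @ (Q - cQ))
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cQ - R @ cP

def icp(source, target, n_iters=30):
    tree = cKDTree(target)
    P = source.copy()
    for _ in range(n_iters):
        _, idx = tree.query(P)        # closest-point matching
        R, t = best_rigid(P, target[idx])
        P = P @ R.T + t               # apply the estimated motion
    return P
```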


Journal ArticleDOI
TL;DR: The ultimate goal of this work is to establish a standard for recording and reporting microarray-based gene expression data, which will in turn facilitate the establishment of databases and public repositories and enable the development of data analysis tools.
Abstract: Microarray analysis has become a widely used tool for the generation of gene expression data on a genomic scale. Although many significant results have been derived from microarray studies, one limitation has been the lack of standards for presenting and exchanging such data. Here we present a proposal, the Minimum Information About a Microarray Experiment (MIAME), that describes the minimum information required to ensure that microarray data can be easily interpreted and that results derived from its analysis can be independently verified. The ultimate goal of this work is to establish a standard for recording and reporting microarray-based gene expression data, which will in turn facilitate the establishment of databases and public repositories and enable the development of data analysis tools. With respect to MIAME, we concentrate on defining the content and structure of the necessary information rather than the technical format for capturing it.
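
As a rough illustration of the kind of record the proposal calls for, the snippet below arranges the six content areas MIAME describes (experimental design, array design, samples, hybridizations, measurements, and normalization controls) into a simple record; the field names and values are invented for illustration and are not a normative schema.

```python
# Illustrative MIAME-style record: the six content areas, with made-up
# field names and values (not a normative schema).
experiment = {
    "experimental_design": {"type": "dose response", "n_hybridizations": 12},
    "array_design":        {"platform": "cDNA", "n_features": 6400},
    "samples":             {"organism": "Homo sapiens", "treatment": "IR, 5 Gy"},
    "hybridizations":      {"protocol": "two-channel, dye swap"},
    "measurements":        {"raw_images": True, "quantification": "mean ratio"},
    "normalization":       {"method": "global median", "controls": "spike-ins"},
}
for section, fields in experiment.items():
    print(f"{section}: {fields}")
```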

Book ChapterDOI
09 Dec 2001
TL;DR: A short signature scheme based on the Computational Diffie-Hellman assumption on certain elliptic and hyperelliptic curves is introduced, designed for systems where signatures are typed in by a human or signatures are sent over a low-bandwidth channel.
Abstract: We introduce a short signature scheme based on the Computational Diffie-Hellman assumption on certain elliptic and hyperelliptic curves. The signature length is half the size of a DSA signature for a similar level of security. Our short signature scheme is designed for systems where signatures are typed in by a human or signatures are sent over a low-bandwidth channel.
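
Structurally the scheme is tiny: a signature is x*H(m), and verification checks a single pairing equation. The sketch reuses the same toy bilinear map as the IBE sketch above (integers with e(a, b) = g^(ab) mod p), which is bilinear but trivially breakable; the paper's real construction lives on particular elliptic and hyperelliptic curves.

```python
# Structural sketch of a BLS-style short signature with a TOY pairing.
# For illustrating the equations only; offers no security.
import hashlib, secrets

p, q, g = 2**127 - 1, 2**127 - 2, 3

def e(a, b):                          # toy bilinear map
    return pow(g, (a * b) % q, p)

def H(msg):                           # hash a message to a "point"
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % q

x = secrets.randbelow(q)              # private key
pk = x % q                            # public key: x*P with P = 1

sig = (x * H(b"hello")) % q           # sign: sigma = x * H(m)
assert e(sig, 1) == e(H(b"hello"), pk)   # verify: e(sigma, P) == e(H(m), pk)
```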

Journal ArticleDOI
TL;DR: It is shown that KNNimpute appears to provide a more robust and sensitive method for missing value estimation than SVDimpute, and both SVDimpute and KNNimpute surpass the commonly used row average method (as well as filling missing values with zeros).
Abstract: Motivation: Gene expression microarray experiments can generate data sets with multiple missing expression values. Unfortunately, many algorithms for gene expression analysis require a complete matrix of gene array values as input. For example, methods such as hierarchical clustering and K-means clustering are not robust to missing data, and may lose effectiveness even with a few missing values. Methods for imputing missing data are needed, therefore, to minimize the effect of incomplete data sets on analyses, and to increase the range of data sets to which these algorithms can be applied. In this report, we investigate automated methods for estimating missing data. Results: We present a comparative study of several methods for the estimation of missing values in gene microarray data. We implemented and evaluated three methods: a Singular Value Decomposition (SVD) based method (SVDimpute), weighted K-nearest neighbors (KNNimpute), and row average. We evaluated the methods using a variety of parameter settings and over different real data sets, and assessed the robustness of the imputation methods to the amount of missing data over the range of 1–20% missing values. We show that KNNimpute appears to provide a more robust and sensitive method for missing value estimation than SVDimpute, and both SVDimpute and KNNimpute surpass the commonly used row average method (as well as filling missing values with zeros). We report results of the comparative experiments and provide recommendations and tools for accurate estimation of missing microarray data under a variety of conditions. Availability: The software is available at http://smi-web.
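
The weighted KNN estimator compared above admits a short sketch: for each missing entry, average the corresponding column over the k genes most similar on the jointly observed columns, weighting by inverse distance. This illustrates the idea rather than reproducing the authors' released tool; scikit-learn's KNNImputer is a packaged analogue.

```python
# Sketch of weighted KNN imputation for a genes-x-arrays matrix with NaNs.
import numpy as np

def knn_impute(X, k=10):
    X = X.copy()
    for i, j in zip(*np.where(np.isnan(X))):
        mask = ~np.isnan(X[i])                    # columns observed for gene i
        donors = np.where(~np.isnan(X[:, j]))[0]  # genes observed at column j
        donors = donors[donors != i]
        d = np.array([np.sqrt(np.nanmean((X[i, mask] - X[g, mask]) ** 2))
                      for g in donors])           # distance on shared columns
        order = np.argsort(d)[:k]
        w = 1.0 / (d[order] + 1e-9)               # inverse-distance weights
        X[i, j] = np.average(X[donors[order], j], weights=w)
    return X
```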


Journal ArticleDOI
TL;DR: The number of people exposed to environmental tobacco smoke in California seems to have decreased between the late 1980s and the mid-1990s, where exposure is determined by the reported time spent with a smoker.
Abstract: Because human activities impact the timing, location, and degree of pollutant exposure, they play a key role in explaining exposure variation. This fact has motivated the collection of activity pattern data for their specific use in exposure assessments. The largest of these recent efforts is the National Human Activity Pattern Survey (NHAPS), a 2-year probability-based telephone survey (n = 9386) of exposure-related human activities in the United States (U.S.) sponsored by the U.S. Environmental Protection Agency (EPA). The primary purpose of NHAPS was to provide comprehensive and current exposure information over broad geographical and temporal scales, particularly for use in probabilistic population exposure models. NHAPS was conducted on a virtually daily basis from late September 1992 through September 1994 by the University of Maryland's Survey Research Center using a computer-assisted telephone interview instrument (CATI) to collect 24-h retrospective diaries and answers to a number of personal and exposure-related questions from each respondent. The resulting diary records contain beginning and ending times for each distinct combination of location and activity occurring on the diary day (i.e., each microenvironment). Between 340 and 1713 respondents of all ages were interviewed in each of the 10 EPA regions across the 48 contiguous states. Interviews were completed in 63% of the households contacted. NHAPS respondents reported spending an average of 87% of their time in enclosed buildings and about 6% of their time in enclosed vehicles. These proportions are fairly constant across the various regions of the U.S. and Canada and for the California population between the late 1980s, when the California Air Resources Board (CARB) sponsored a state-wide activity pattern study, and the mid-1990s, when NHAPS was conducted. However, the number of people exposed to environmental tobacco smoke (ETS) in California seems to have decreased over the same time period, where exposure is determined by the reported time spent with a smoker. In both California and the entire nation, the most time spent exposed to ETS was reported to take place in residential locations.

Proceedings ArticleDOI
22 Jun 2001
TL;DR: This paper introduces the concept of on-chip networks, sketches a simple network, and discusses some challenges in the architecture and design of these networks.
Abstract: Using on-chip interconnection networks in place of ad-hoc global wiring structures the top-level wires on a chip and facilitates modular design. With this approach, system modules (processors, memories, peripherals, etc.) communicate by sending packets to one another over the network. The structured network wiring gives well-controlled electrical parameters that eliminate timing iterations and enable the use of high-performance circuits to reduce latency and increase bandwidth. The area overhead required to implement an on-chip network is modest; we estimate 6.6%. This paper introduces the concept of on-chip networks, sketches a simple network, and discusses some challenges in the architecture and design of these networks.
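
To give the flavor of such a network, the sketch below implements dimension-ordered (XY) routing on a 2D mesh, one of the simplest routing disciplines used in on-chip networks; it is a generic illustration, not the specific network sketched in the paper.

```python
# Dimension-ordered (XY) routing on a 2D mesh NoC: route fully in X, then
# in Y. Deadlock-free on a mesh. Generic illustration.
def xy_route(src, dst):
    """Return the hop-by-hop path of (x, y) router coordinates."""
    (x, y), (dx, dy) = src, dst
    path = [(x, y)]
    while x != dx:                  # X dimension first
        x += 1 if dx > x else -1
        path.append((x, y))
    while y != dy:                  # then Y
        y += 1 if dy > y else -1
        path.append((x, y))
    return path

print(xy_route((0, 0), (2, 3)))
# [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (2, 3)]
```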

Proceedings Article
30 Jul 2001
TL;DR: The overall structure of the ontology, the service profile for advertising services, and the process model for the detailed description of the operation of services are described, and DAML-S is compared with several industry efforts to define standards for characterizing services on the Web.
Abstract: The Semantic Web should enable greater access not only to content but also to services on the Web. Users and software agents should be able to discover, invoke, compose, and monitor Web resources offering particular services and having particular properties. As part of the DARPA Agent Markup Language program, we have begun to develop an ontology of services, called DAML-S, that will make these functionalities possible. In this paper we describe the overall structure of the ontology, the service profile for advertising services, and the process model for the detailed description of the operation of services. We also compare DAML-S with several industry efforts to define standards for characterizing services on the Web.

Book
01 Jan 2001
TL;DR: In recent decades the study of social movements, revolution, democratization and other non-routine politics has flourished as mentioned in this paper, and yet research on the topic remains highly fragmented, reflecting the influence of at least three traditional divisions.
Abstract: In recent decades the study of social movements, revolution, democratization and other non-routine politics has flourished. And yet research on the topic remains highly fragmented, reflecting the influence of at least three traditional divisions. The first of these reflects the view that various forms of contention are distinct and should be studied independently of the others. Separate literatures have developed around the study of social movements, revolutions and industrial conflict. A second approach to the study of political contention denies the possibility of general theory in deference to a grounding in the temporal and spatial particulars of any given episode of contention. The study of contentious politics is left to 'area specialists' and/or historians with a thorough knowledge of the time and place in question. Finally, overlaid on these two divisions are stylized theoretical traditions - structuralist, culturalist, and rationalist - that have developed largely in isolation from one another. This book was first published in 2001.

Journal ArticleDOI
TL;DR: The authors find that vertical specialization accounts for 21% of these countries' exports and grew almost 30% between 1970 and 1990, and that growth in vertical specialization accounted for 30% of the growth in these countries' exports.
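
The measure behind these numbers has a simple form: a sector's vertical-specialization exports are its imported-input share of gross output times its exports, summed over sectors. A toy calculation, assuming that definition and using made-up numbers:

```python
# Toy vertical-specialization (VS) calculation, assuming
# VS = (imported intermediates / gross output) * exports per sector.
sectors = [
    # (imported intermediates, gross output, exports)
    (30.0, 200.0, 80.0),
    (10.0, 150.0, 40.0),
    (5.0, 100.0, 10.0),
]
vs = sum(m / go * x for m, go, x in sectors)
total_exports = sum(x for _, _, x in sectors)
print(f"VS share of exports: {vs / total_exports:.1%}")   # 11.7% here
```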

Journal ArticleDOI
TL;DR: The previously reported benefits of carvedilol with regard to morbidity and mortality in patients with mild-to-moderate heart failure were also apparent in the patients with severe heart failure who were evaluated in this trial.
Abstract: Background: Beta-blocking agents reduce the risk of hospitalization and death in patients with mild-to-moderate heart failure, but little is known about their effects in severe heart failure. Methods: We evaluated 2289 patients who had symptoms of heart failure at rest or on minimal exertion, who were clinically euvolemic, and who had an ejection fraction of less than 25 percent. In a double-blind fashion, we randomly assigned 1133 patients to placebo and 1156 patients to treatment with carvedilol for a mean period of 10.4 months, during which standard therapy for heart failure was continued. Patients who required intensive care, had marked fluid retention, or were receiving intravenous vasodilators or positive inotropic drugs were excluded. Results: There were 190 deaths in the placebo group and 130 deaths in the carvedilol group. This difference reflected a 35 percent decrease in the risk of death with carvedilol (95 percent confidence interval, 19 to 48 percent; P=0.00013, unadjusted; P=0.0014, adjusted for interim analyses). A total of 507 patients died or were hospitalized in the placebo group, as compared with 425 in the carvedilol group. This difference reflected a 24 percent decrease in the combined risk of death or hospitalization with carvedilol (95 percent confidence interval, 13 to 33 percent; P<0.001). The favorable effects on both end points were seen consistently in all the subgroups we examined, including patients with a history of recent or recurrent cardiac decompensation. Fewer patients in the carvedilol group than in the placebo group withdrew because of adverse effects or for other reasons (P=0.02). Conclusions: The previously reported benefits of carvedilol with regard to morbidity and mortality in patients with mild-to-moderate heart failure were also apparent in the patients with severe heart failure who were evaluated in this trial.
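
The reported counts can be sanity-checked with crude proportions; the trial's 35 percent figure comes from a time-to-event analysis, so the simple relative risk below only approximates it.

```python
# Crude mortality proportions from the reported counts (approximation only;
# the trial's 35% reduction is from a survival analysis).
deaths_placebo, n_placebo = 190, 1133
deaths_carvedilol, n_carvedilol = 130, 1156

rr = (deaths_carvedilol / n_carvedilol) / (deaths_placebo / n_placebo)
print(f"crude relative risk: {rr:.2f} (~{1 - rr:.0%} risk reduction)")
# crude relative risk: 0.67 (~33% risk reduction), near the reported 35%
```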

Journal ArticleDOI
TL;DR: Social cognitive theory analyzes social diffusion of new styles of behavior in terms of the psychosocial factors governing their acquisition and adoption and the social networks through which they spread and are supported as discussed by the authors.
Abstract: Social cognitive theory provides an agentic conceptual framework within which to analyze the determinants and psychosocial mechanisms through which symbolic communication influences human thought, affect and action. Communications systems operate through two pathways. In the direct pathway, they promote changes by informing, enabling, motivating, and guiding participants. In the socially mediated pathway, media influences link participants to social networks and community settings that provide natural incentives and continued personalized guidance, for desired change. Social cognitive theory analyzes social diffusion of new styles of behavior in terms of the psychosocial factors governing their acquisition and adoption and the social networks through which they spread and are supported. Structural interconnectedness provides potential diffusion paths; sociocognitive factors largely determine what diffuses through those paths.

Journal ArticleDOI
TL;DR: Single-walled carbon nanotubes are molecular wires that exhibit interesting structural, mechanical, electrical, and electromechanical properties that make for an ideal miniaturized sensor.
Abstract: Single-walled carbon nanotubes (SWNTs) are molecular wires that exhibit interesting structural, mechanical, electrical, and electromechanical properties [1-3]. A SWNT is unique among solid-state materials in that every atom is on the surface. Surface chemistry could therefore be critical to the physical properties of SWNTs and their applications [3-10]. SWNT sidewall functionalization is important to soluble nanotubes [4-6], self-assembly on surfaces, and chemical sensors [8-10]. For these purposes, it is imperative to functionalize the sidewalls of SWNTs in noncovalent ways to preserve the sp2 nanotube structure and thus their electronic characteristics. Immobilization of biomolecules on carbon nanotubes has been pursued in the past, motivated by the prospects of using nanotubes as new types of biosensor materials [11-15]. The electronic properties of nanotubes coupled with the specific recognition properties of the immobilized biosystems would indeed make for an ideal miniaturized sensor. A prerequisite for research in this area is the development of chemical methods to immobilize biological molecules onto carbon nanotubes in a reliable manner. Thus far, only limited work has been carried out with multiwalled carbon nanotubes (MWNTs) [11-15]. Metallothionein proteins were trapped inside and placed onto the outer surfaces of open-ended MWNTs [11-14]. Streptavidin was found to adsorb on MWNTs presumably via hydrophobic interactions between the nanotubes and hydrophobic domains of the proteins [15]. DNA molecules adsorbed on MWNTs via nonspecific interactions were also observed [12-14].

Journal ArticleDOI
TL;DR: It is predicted that understanding the pathways that lead to the development of androgen-independent prostate cancer will pave the way to effective therapies for these, at present, untreatable cancers.
Abstract: The normal prostate and early-stage prostate cancers depend on androgens for growth and survival, and androgen ablation therapy causes them to regress. Cancers that are not cured by surgery eventually become androgen independent, rendering anti-androgen therapy ineffective. But how does androgen independence arise? We predict that understanding the pathways that lead to the development of androgen-independent prostate cancer will pave the way to effective therapies for these, at present, untreatable cancers.

Journal ArticleDOI
08 Feb 2001-Nature
TL;DR: Simulation of the evolution of the chemical composition of aerosols finds that the mixing state and direct forcing of the black-carbon component approach those of an internal mixture, largely due to coagulation and growth of aerosol particles.
Abstract: Aerosols affect the Earth's temperature and climate by altering the radiative properties of the atmosphere. A large positive component of this radiative forcing from aerosols is due to black carbon--soot--that is released from the burning of fossil fuel and biomass, and, to a lesser extent, natural fires, but the exact forcing is affected by how black carbon is mixed with other aerosol constituents. From studies of aerosol radiative forcing, it is known that black carbon can exist in one of several possible mixing states: distinct from other aerosol particles (externally mixed), incorporated within them (internally mixed), or as a black-carbon core surrounded by a well-mixed shell. But so far it has been assumed that aerosols exist predominantly as an external mixture. Here I simulate the evolution of the chemical composition of aerosols, finding that the mixing state and direct forcing of the black-carbon component approach those of an internal mixture, largely due to coagulation and growth of aerosol particles. This finding implies a higher positive forcing from black carbon than previously thought, suggesting that the warming effect from black carbon may nearly balance the net cooling effect of other anthropogenic aerosol constituents. The magnitude of the direct radiative forcing from black carbon itself exceeds that due to CH4, suggesting that black carbon may be the second most important component of global warming after CO2 in terms of direct forcing.

Journal ArticleDOI
TL;DR: It is proved that if S is representable as a highly sparse superposition of atoms from this time-frequency dictionary, then there is only one such highly sparse representation of S, and it can be obtained by solving the convex optimization problem of minimizing the l1 norm of the coefficients among all decompositions.
Abstract: Suppose a discrete-time signal S(t), 0 ≤ t ...

Journal ArticleDOI
TL;DR: The level set method is coupled to a wide variety of problems involving external physics, such as compressible and incompressible flow, Stefan problems, kinetic crystal growth, epitaxial growth of thin films, vortex-dominated flows, and extensions to multiphase motion.
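
To make the coupling concrete: a level set function phi tracks the interface as its zero crossing and is advected by an externally supplied velocity field. Below is a minimal 1D upwind sketch, a generic illustration rather than any of the specific couplings reviewed.

```python
# Minimal 1D level-set sketch: the interface is the zero crossing of phi,
# advected by velocity u via first-order upwind differencing.
import numpy as np

n, dx, dt = 200, 0.005, 0.002
x = np.arange(n) * dx
phi = x - 0.3                    # signed distance; interface starts at x = 0.3
u = np.full(n, 0.5)              # externally supplied velocity (constant here)

for _ in range(300):             # solve phi_t + u * phi_x = 0
    dphi = np.where(u > 0,
                    (phi - np.roll(phi, 1)) / dx,    # backward difference
                    (np.roll(phi, -1) - phi) / dx)   # forward difference
    phi -= dt * u * dphi

print(f"interface near x = {x[np.argmin(np.abs(phi))]:.3f}")
# ~0.600: the zero crossing moved 0.5 * 300 * 0.002 = 0.3 to the right
```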

Journal ArticleDOI
TL;DR: SIMPLIcity (semantics-sensitive integrated matching for picture libraries), an image retrieval system, which uses semantics classification methods, a wavelet-based approach for feature extraction, and integrated region matching based upon image segmentation to improve retrieval.
Abstract: We present here SIMPLIcity (semantics-sensitive integrated matching for picture libraries), an image retrieval system, which uses semantics classification methods, a wavelet-based approach for feature extraction, and integrated region matching based upon image segmentation. An image is represented by a set of regions, roughly corresponding to objects, which are characterized by color, texture, shape, and location. The system classifies images into semantic categories. Potentially, the categorization enhances retrieval by permitting semantically-adaptive searching methods and narrowing down the searching range in a database. A measure for the overall similarity between images is developed using a region-matching scheme that integrates properties of all the regions in the images. The application of SIMPLIcity to several databases has demonstrated that our system performs significantly better and faster than existing ones. The system is fairly robust to image alterations.

Journal ArticleDOI
TL;DR: The authors propose the markup of Web services in the DAML family of Semantic Web markup languages, which enables a wide variety of agent technologies for automated Web service discovery, execution, composition and interoperation.
Abstract: The authors propose the markup of Web services in the DAML family of Semantic Web markup languages. This markup enables a wide variety of agent technologies for automated Web service discovery, execution, composition and interoperation. The authors present one such technology for automated Web service composition.

Journal ArticleDOI
TL;DR: A structural model of the network of sociocognitive influences that shape children's career aspirations and trajectories is tested, and analyses of gender differences reveal that perceived occupational self-efficacy predicts traditionality of career choice.
Abstract: This prospective study tested with 272 children a structural model of the network of sociocognitive influences that shape children's career aspirations and trajectories. Familial socioeconomic status is linked to children's career trajectories only indirectly through its effects on parents' perceived efficacy and academic aspirations. The impact of parental self-efficacy and aspirations on their children's perceived career efficacy and choice is, in turn, entirely mediated through the children's perceived efficacy and academic aspirations. Children's perceived academic, social, and self-regulatory efficacy influence the types of occupational activities for which they judge themselves to be efficacious both directly and through their impact on academic aspirations. Perceived occupational self-efficacy gives direction to the kinds of career pursuits children seriously consider for their life's work and those they disfavor. Children's perceived efficacy rather than their actual academic achievement is the key determinant of their perceived occupational self-efficacy and preferred choice of worklife. Analyses of gender differences reveal that perceived occupational self-efficacy predicts traditionality of career choice.

Journal ArticleDOI
TL;DR: A simple nonparametric empirical Bayes model is introduced, which is used to guide the efficient reduction of the data to a single summary statistic per gene, and also to make simultaneous inferences concerning which genes were affected by the radiation.
Abstract: Microarrays are a novel technology that facilitates the simultaneous measurement of thousands of gene expression levels. A typical microarray experiment can produce millions of data points, raising serious problems of data reduction, and simultaneous inference. We consider one such experiment in which oligonucleotide arrays were employed to assess the genetic effects of ionizing radiation on seven thousand human genes. A simple nonparametric empirical Bayes model is introduced, which is used to guide the efficient reduction of the data to a single summary statistic per gene, and also to make simultaneous inferences concerning which genes were affected by the radiation. Although our focus is on one specific experiment, the proposed methods can be applied quite generally. The empirical Bayes inferences are closely related to the frequentist false discovery rate (FDR) criterion.
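
The two-groups logic of the method can be sketched compactly: compare the density of observed per-gene scores with a null density estimated from permutations; their ratio approximates the local false discovery rate. Simulated scores stand in for real data and permutation nulls below; this is an illustration, not the authors' exact model.

```python
# Two-groups empirical Bayes sketch: local fdr(z) ~ p0 * f0(z) / f(z),
# with p0 taken as 1 (most genes null) and densities from histograms.
import numpy as np

def local_fdr(z_obs, z_null, bins=30):
    edges = np.histogram_bin_edges(np.concatenate([z_obs, z_null]), bins)
    f, _ = np.histogram(z_obs, edges, density=True)    # mixture density
    f0, _ = np.histogram(z_null, edges, density=True)  # null density
    with np.errstate(divide="ignore", invalid="ignore"):
        return edges, np.clip(f0 / f, 0, 1)

rng = np.random.default_rng(0)
z_obs = np.concatenate([rng.normal(0, 1, 6800),    # null genes
                        rng.normal(3, 1, 200)])    # affected genes
z_null = rng.normal(0, 1, 7000)                    # stand-in permutation null
edges, fdr = local_fdr(z_obs, z_null)
centers = (edges[:-1] + edges[1:]) / 2
print(np.round(fdr[(centers > 2.5) & (centers < 4.0)], 2))  # small in the tail
```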