
Showing papers by "Carnegie Mellon University" published in 1998


Journal ArticleDOI
TL;DR: In this article, the authors adopt a multidisciplinary view of trust within and between firms, in an effort to synthesize and give insight into a fundamental construct of organizational science, while recognizing that the differing meanings scholars bring to the study of trust also can add value.
Abstract: Our task is to adopt a multidisciplinary view of trust within and between firms, in an effort to synthesize and give insight into a fundamental construct of organizational science. We seek to identify the shared understandings of trust across disciplines, while recognizing that the divergent meanings scholars bring to the study of trust also can add value.

8,886 citations


Proceedings ArticleDOI
24 Jul 1998
TL;DR: A PAC-style analysis is provided for a problem setting motivated by the task of learning to classify web pages, in which the description of each example can be partitioned into two distinct views, to allow inexpensive unlabeled data to augment a much smaller set of labeled examples.
Abstract: We consider the problem of using a large unlabeled sample to boost performance of a learning algorithm when only a small set of labeled examples is available. In particular, we consider a problem setting motivated by the task of learning to classify web pages, in which the description of each example can be partitioned into two distinct views. For example, the description of a web page can be partitioned into the words occurring on that page, and the words occurring in hyperlinks that point to that page. We assume that either view of the example would be sufficient for learning if we had enough labeled data, but our goal is to use both views together to allow inexpensive unlabeled data to augment a much smaller set of labeled examples. Specifically, the presence of two distinct views of each example suggests strategies in which two learning algorithms are trained separately on each view, and then each algorithm's predictions on new unlabeled examples are used to enlarge the training set of the other. Our goal in this paper is to provide a PAC-style analysis for this setting, and, more broadly, a PAC-style framework for the general problem of learning from both labeled and unlabeled data. We also provide empirical results on real web-page data indicating that this use of unlabeled examples can lead to significant improvement of hypotheses in practice. This research was supported in part by the DARPA HPKB program under contract F30602-97-1-0215 and by NSF National Young Investigator grant CCR-9357793.
COLT '98, Madison, WI, USA. Tom Mitchell, School of Computer Science, Carnegie Mellon University, Pittsburgh, PA 15213-3891, mitchell+@cs.cmu.edu

5,840 citations
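The co-training strategy described in the abstract above, two learners trained on separate views, each labeling unlabeled data for the other, can be illustrated on a toy two-view problem. This is a hedged sketch with invented data and a trivial threshold learner, not the paper's algorithm or experiments:

```python
import random

random.seed(0)

# Toy data: each example carries two redundant views (x1, x2) of a hidden sign z.
def make_example():
    z = random.choice([-1.0, 1.0])
    return (z + random.uniform(-0.4, 0.4),
            z + random.uniform(-0.4, 0.4),
            1 if z > 0 else 0)

labeled = [make_example() for _ in range(4)]          # small labeled set
unlabeled = [e[:2] for e in (make_example() for _ in range(200))]

def fit(pairs):
    # Trivial per-view learner: threshold at the midpoint of the class means.
    pos = [v for v, y in pairs if y == 1] or [0.0]
    neg = [v for v, y in pairs if y == 0] or [0.0]
    thr = (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2
    return lambda v: 1 if v > thr else 0

for _ in range(5):                                     # co-training rounds
    h1 = fit([(x1, y) for x1, _, y in labeled])
    h2 = fit([(x2, y) for _, x2, y in labeled])
    batch, unlabeled = unlabeled[:20], unlabeled[20:]
    # Each hypothesis pseudo-labels half of the batch for the shared pool.
    labeled += [(x1, x2, h1(x1)) for x1, x2 in batch[:10]]
    labeled += [(x1, x2, h2(x2)) for x1, x2 in batch[10:]]

h1 = fit([(x1, y) for x1, _, y in labeled])
test_set = [make_example() for _ in range(100)]
accuracy = sum(h1(x1) == y for x1, _, y in test_set) / len(test_set)
```

Because the two views are (conditionally) redundant, each learner's confident guesses expand the other's training data; that intuition is what the paper formalizes in its PAC-style analysis.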


Proceedings ArticleDOI
25 Oct 1998
TL;DR: The results of a detailed packet-level simulation comparing four multi-hop wireless ad hoc network routing protocols that cover a range of design choices, DSDV, TORA, DSR, and AODV, are presented.
Abstract: An ad hoc network is a collection of wireless mobile nodes dynamically forming a temporary network without the use of any existing network infrastructure or centralized administration. Due to the limited transmission range of wireless network interfaces, multiple network "hops" may be needed for one node to exchange data with another across the network. In recent years, a variety of new routing protocols targeted specifically at this environment have been developed, but little performance information on each protocol and no realistic performance comparison between them is available. This paper presents the results of a detailed packet-level simulation comparing four multi-hop wireless ad hoc network routing protocols that cover a range of design choices: DSDV, TORA, DSR, and AODV. We have extended the ns-2 network simulator to accurately model the MAC and physical-layer behavior of the IEEE 802.11 wireless LAN standard, including a realistic wireless transmission channel model, and present the results of simulations of networks of 50 mobile nodes.

5,147 citations


Journal ArticleDOI
TL;DR: A neural network-based upright frontal face detection system that arbitrates between multiple networks to improve performance over a single network, and a straightforward procedure for aligning positive face examples for training.
Abstract: We present a neural network-based upright frontal face detection system. A retinally connected neural network examines small windows of an image and decides whether each window contains a face. The system arbitrates between multiple networks to improve performance over a single network. We present a straightforward procedure for aligning positive face examples for training. To collect negative examples, we use a bootstrap algorithm, which adds false detections into the training set as training progresses. This eliminates the difficult task of manually selecting nonface training examples, which must be chosen to span the entire space of nonface images. Simple heuristics, such as using the fact that faces rarely overlap in images, can further improve the accuracy. Comparisons with several other state-of-the-art face detection systems are presented, showing that our system has comparable performance in terms of detection and false-positive rates.

4,105 citations
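The bootstrap idea for collecting negative examples, run the detector on face-free data and feed its false detections back in as nonface training examples, can be sketched on a one-dimensional toy problem. The scalar feature, data, and threshold "detector" here are illustrative assumptions, not the paper's neural network:

```python
def fit(train):
    # Toy "detector": threshold at the midpoint of the class means.
    pos = [x for x, y in train if y == 1]
    neg = [x for x, y in train if y == 0]
    return (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2

faces = [0.9, 1.0, 1.1]                    # scalar "face" features
easy_nonfaces = [0.0, 0.1]
hard_nonfaces = [0.6, 0.65, 0.7]           # face-like clutter, absent from training

train = [(x, 1) for x in faces] + [(x, 0) for x in easy_nonfaces]
thr = fit(train)
for _ in range(3):                          # bootstrap rounds
    # Run the detector on face-free data; its false alarms become negatives.
    false_dets = [x for x in hard_nonfaces if x > thr]
    train += [(x, 0) for x in false_dets]
    thr = fit(train)
```

Each round pushes the decision boundary toward the hard, face-like clutter, which is exactly the manual-selection burden the bootstrap procedure removes.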


Journal ArticleDOI
TL;DR: Greater use of the Internet was associated with declines in participants' communication with family members in the household, declines in the size of their social circle, and increases in their depression and loneliness.
Abstract: The Internet could change the lives of average citizens as much as did the telephone in the early part of the 20th century and television in the 1950s and 1960s. Researchers and social critics are debating whether the Internet is improving or harming participation in community life and social relationships. This research examined the social and psychological impact of the Internet on 169 people in 73 households during their first 1 to 2 years on-line. We used longitudinal data to examine the effects of the Internet on social involvement and psychological well-being. In this sample, the Internet was used extensively for communication. Nonetheless, greater use of the Internet was associated with declines in participants' communication with family members in the household, declines in the size of their social circle, and increases in their depression and loneliness. These findings have implications for research, for public policy and for the design of technology.

4,091 citations


Book
28 Oct 1998
TL;DR: In this book, the authors present a feedback-control account of human behavior, covering discrepancy-reducing and discrepancy-enlarging feedback loops, goals and hierarchy, affect, expectancies and disengagement, and applications to problems in living.
Abstract: 1. Introduction and plan 2. Principles of feedback control 3. Discrepancy reducing feedback processes in behavior 4. Discrepancy enlarging loops, and three further issues 5. Goals and behavior 6. Goals, hierarchicality, and behavior: further issues 7. Public and private aspects of the self 8. Control processes and affect 9. Affect: issues and comparisons 10. Expectancies and disengagement 11. Disengagement: issues and comparisons 12. Applications to problems in living 13. Hierarchicality and problems in living 14. Chaos and dynamic systems 15. Catastrophe theory 16. Further applications to problems in living 17. Is behavior controlled or does it emerge? 18. Goal engagement, life and death.

3,943 citations


Journal ArticleDOI
TL;DR: In this article, the authors investigate the role of trust in inter-organizational exchange at two levels of analysis and assess its effects on negotiation costs, conflict, and ultimately performance.
Abstract: A conceptual challenge in exploring the role of trust in interorganizational exchange is translating an inherently individual-level concept-trust-to the organizational-level outcome of performance. We define interpersonal and interorganizational trust as distinct constructs and draw on theories of interorganizational relations to derive a model of exchange performance. Specifically, we investigate the role of trust in interfirm exchange at two levels of analysis and assess its effects on negotiation costs, conflict, and ultimately performance. Propositions were tested with data from a sample of 107 buyer-supplier interfirm relationships in the electrical equipment manufacturing industry using a structural equation model. The results indicate that interpersonal and interorganizational trust are related but distinct constructs, and play different roles in affecting negotiation processes and exchange performance. Further, the hypotheses linking trust to performance receive some support, although the precise nature of the link is somewhat different than initially proposed. Overall, the results show that trust in interorganizational exchange relations clearly matters.

3,927 citations


Proceedings Article
01 Jan 1998
TL;DR: It is found that the multi-variate Bernoulli model performs well with small vocabulary sizes, but that the multinomial model usually performs even better at larger vocabulary sizes, providing on average a 27% reduction in error over the multi-variate Bernoulli model at any vocabulary size.
Abstract: Recent work in text classification has used two different first-order probabilistic models for classification, both of which make the naive Bayes assumption. Some use a multi-variate Bernoulli model, that is, a Bayesian network with no dependencies between words and binary word features (e.g., Larkey and Croft 1996; Koller and Sahami 1997). Others use a multinomial model, that is, a unigram language model with integer word counts (e.g., Lewis and Gale 1994; Mitchell 1997). This paper aims to clarify the confusion by describing the differences and details of these two models, and by empirically comparing their classification performance on five text corpora. We find that the multi-variate Bernoulli model performs well with small vocabulary sizes, but that the multinomial model usually performs even better at larger vocabulary sizes, providing on average a 27% reduction in error over the multi-variate Bernoulli model at any vocabulary size.

3,601 citations
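The two event models being compared can be made concrete in a few lines: the multi-variate Bernoulli scores the presence and absence of every vocabulary word, while the multinomial scores each token occurrence. The tiny one-class corpus and Laplace smoothing below are illustrative assumptions, not the paper's setup:

```python
import math

vocab = ["ball", "game", "court"]
docs = [["ball", "ball", "game"], ["game", "court"]]   # training docs, one class

# Multi-variate Bernoulli: P(w present | class), Laplace-smoothed over documents.
p_bern = {w: (sum(w in d for d in docs) + 1) / (len(docs) + 2) for w in vocab}

# Multinomial: P(w | class) from token counts, Laplace-smoothed over the vocabulary.
tokens = [w for d in docs for w in d]
p_mult = {w: (tokens.count(w) + 1) / (len(tokens) + len(vocab)) for w in vocab}

def loglik_bern(doc):
    # Scores presence AND absence of every vocabulary word; repeats are ignored.
    return sum(math.log(p_bern[w] if w in doc else 1 - p_bern[w]) for w in vocab)

def loglik_mult(doc):
    # Scores every token occurrence; repeats count.
    return sum(math.log(p_mult[w]) for w in doc)
```

Note that repeating "ball" in a document changes the multinomial score but not the Bernoulli one, which is one reason the two models diverge as vocabularies grow.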


Journal ArticleDOI
19 Feb 1998-Nature
TL;DR: An illusion in which tactile sensations are referred to an alien limb is reported, which reveals a three-way interaction between vision, touch and proprioception, and may supply evidence concerning the basis of bodily self-identification.
Abstract: Illusions have historically been of great use to psychology for what they can reveal about perceptual processes. We report here an illusion in which tactile sensations are referred to an alien limb. The effect reveals a three-way interaction between vision, touch and proprioception, and may supply evidence concerning the basis of bodily self-identification.

3,422 citations


Journal ArticleDOI
01 May 1998-Science
TL;DR: Results confirm that this region shows activity during erroneous responses, but activity was also observed in the same region during correct responses under conditions of increased response competition, which suggests that the ACC detects conditions under which errors are likely to occur rather than errors themselves.
Abstract: An unresolved question in neuroscience and psychology is how the brain monitors performance to regulate behavior. It has been proposed that the anterior cingulate cortex (ACC), on the medial surface of the frontal lobe, contributes to performance monitoring by detecting errors. In this study, event-related functional magnetic resonance imaging was used to examine ACC function. Results confirm that this region shows activity during erroneous responses. However, activity was also observed in the same region during correct responses under conditions of increased response competition. This suggests that the ACC detects conditions under which errors are likely to occur rather than errors themselves.

3,236 citations


Journal ArticleDOI
01 Dec 1998
TL;DR: New reactive behaviors that implement formations in multirobot teams are presented and evaluated and demonstrate the value of various types of formations in autonomous, human-led and communications-restricted applications, and their appropriateness in different types of task environments.
Abstract: New reactive behaviors that implement formations in multirobot teams are presented and evaluated. The formation behaviors are integrated with other navigational behaviors to enable a robotic team to reach navigational goals, avoid hazards and simultaneously remain in formation. The behaviors are implemented in simulation, on robots in the laboratory and aboard DARPA's HMMWV-based unmanned ground vehicles. The technique has been integrated with the autonomous robot architecture (AuRA) and the UGV Demo II architecture. The results demonstrate the value of various types of formations in autonomous, human-led and communications-restricted applications, and their appropriateness in different types of task environments.
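A reactive formation behavior of the kind described above can be sketched as a steering rule that moves each follower toward an assigned offset from a leader. The gain, offset, and point-robot kinematics are invented for illustration and are not the AuRA implementation:

```python
def formation_step(leader, follower, offset, gain=0.5):
    # Steer the follower a fraction of the way toward (leader + offset).
    goal = (leader[0] + offset[0], leader[1] + offset[1])
    return (follower[0] + gain * (goal[0] - follower[0]),
            follower[1] + gain * (goal[1] - follower[1]))

pos = (0.0, 0.0)
for _ in range(20):
    pos = formation_step((5.0, 5.0), pos, offset=(-1.0, -1.0))
```

In a full behavior-based controller this steering vector would be blended with move-to-goal and avoid-hazard vectors, matching the integration the abstract describes.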

Book
01 Jun 1998
TL;DR: This book presents the ACT-R theory of cognition, with chapters by J.R. Anderson, C. Lebiere, and colleagues on knowledge representation, performance, learning, perception and action, list memory, choice, cognitive arithmetic, analogy, and scientific discovery.
Abstract: Contents: Preface. J.R. Anderson, C. Lebiere, Introduction. J.R. Anderson, C. Lebiere, Knowledge Representation. J.R. Anderson, C. Lebiere, M. Lovett, Performance. J.R. Anderson, C. Lebiere, Learning. J.R. Anderson, M. Matessa, C. Lebiere, Visual Interface. M.D. Byrne, J.R. Anderson, Perception and Action. J.R. Anderson, D. Bothell, C. Lebiere, M. Matessa, List Memory. M. Lovett, Choice. C. Lebiere, J.R. Anderson, Cognitive Arithmetic. D.D. Salvucci, J.R. Anderson, Analogy. C.D. Schunn, J.R. Anderson, Scientific Discovery. J.R. Anderson, C. Lebiere, Reflections.

Book ChapterDOI
01 Jan 1998
TL;DR: In this chapter, the authors explore questions of existence and uniqueness for solutions to stochastic differential equations and offer a study of their properties; this endeavor is really a study of diffusion processes, i.e., Markov processes with continuous sample paths.
Abstract: We explore in this chapter questions of existence and uniqueness for solutions to stochastic differential equations and offer a study of their properties. This endeavor is really a study of diffusion processes. Loosely speaking, the term diffusion is attributed to a Markov process which has continuous sample paths and can be characterized in terms of its infinitesimal generator.
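In the textbook notation usually used for this material (an assumption here, not a quotation from the chapter), the objects under study are solutions of a stochastic differential equation, characterized by an infinitesimal generator built from the drift and dispersion coefficients:

```latex
% Standard SDE and generator (textbook notation; not quoted from the chapter)
dX_t = b(X_t)\,dt + \sigma(X_t)\,dW_t,
\qquad
(\mathcal{A}f)(x) = \sum_i b_i(x)\,\partial_{x_i} f(x)
  + \tfrac{1}{2}\sum_{i,j} \bigl(\sigma\sigma^{\top}\bigr)_{ij}(x)\,\partial^2_{x_i x_j} f(x).
```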

Journal ArticleDOI
01 Aug 1998
TL;DR: A method for combining query-relevance with information-novelty in the context of text retrieval and summarization; preliminary results indicate some benefits for MMR diversity ranking in document retrieval and in single-document summarization.
Abstract: This paper presents a method for combining query-relevance with information-novelty in the context of text retrieval and summarization. The Maximal Marginal Relevance (MMR) criterion strives to reduce redundancy while maintaining query relevance in re-ranking retrieved documents and in selecting appropriate passages for text summarization. Preliminary results indicate some benefits for MMR diversity ranking in document retrieval and in single document summarization. The latter are borne out by the recent results of the SUMMAC conference in the evaluation of summarization systems. However, the clearest advantage is demonstrated in constructing non-redundant multi-document summaries, where MMR results are clearly superior to non-MMR passage selection.
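The MMR criterion is compact enough to state in code: each step greedily selects the unranked document maximizing λ·Sim1(d, q) − (1 − λ)·max over already-selected s of Sim2(d, s). The word-overlap similarity and toy documents below are illustrative assumptions, not the paper's Sim1/Sim2:

```python
def mmr_rank(query, docs, sim, lam=0.7, k=3):
    # Greedy MMR: next pick maximizes relevance minus redundancy w.r.t. picks so far.
    selected, candidates = [], list(docs)
    while candidates and len(selected) < k:
        best = max(candidates,
                   key=lambda d: lam * sim(d, query)
                   - (1 - lam) * max((sim(d, s) for s in selected), default=0.0))
        selected.append(best)
        candidates.remove(best)
    return selected

def sim(a, b):
    # Toy similarity: word-overlap ratio (stands in for the paper's Sim1/Sim2).
    A, B = set(a.split()), set(b.split())
    return len(A & B) / len(A | B)

docs = ["apple pie recipe", "apple pie baking", "apple stock price"]
ranked = mmr_rank("apple pie", docs, sim, lam=0.3, k=2)
```

With a low λ the redundancy penalty dominates, so after one "apple pie" document the off-topic but novel "apple stock price" outranks the near-duplicate.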

Proceedings ArticleDOI
24 Jul 1998
TL;DR: A cloth simulation system that can stably take large time steps is described; it is significantly faster than previously reported cloth simulation systems.
Abstract: The bottle-neck in most cloth simulation systems is that time steps must be small to avoid numerical instability. This paper describes a cloth simulation system that can stably take large time steps. The simulation system couples a new technique for enforcing constraints on individual cloth particles with an implicit integration method. The simulator models cloth as a triangular mesh, with internal cloth forces derived using a simple continuum formulation that supports modeling operations such as local anisotropic stretch or compression; a unified treatment of damping forces is included as well. The implicit integration method generates a large, unbanded sparse linear system at each time step which is solved using a modified conjugate gradient method that simultaneously enforces particles’ constraints. The constraints are always maintained exactly, independent of the number of conjugate gradient iterations, which is typically small. The resulting simulation system is significantly faster than previous accounts of cloth simulation systems in the literature.
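The core numerical point, that implicit integration lets the simulator take large stable steps where explicit integration diverges, can be seen on a one-dimensional stiff spring. This is a generic backward-Euler illustration, not the paper's cloth model or its modified conjugate gradient solver:

```python
# Backward Euler on a stiff 1-D spring stays bounded at a step size where
# forward Euler blows up.
k_over_m, h = 1000.0, 0.1            # stiff spring, deliberately large step
x_f, v_f = 1.0, 0.0                  # forward (explicit) Euler state
x_b, v_b = 1.0, 0.0                  # backward (implicit) Euler state
for _ in range(100):
    x_f, v_f = x_f + h * v_f, v_f - h * k_over_m * x_f
    # Implicit velocity update: solve v' = v - h*(k/m)*(x + h*v') for v'.
    v_b = (v_b - h * k_over_m * x_b) / (1.0 + h * h * k_over_m)
    x_b = x_b + h * v_b
explicit_mag = max(abs(x_f), abs(v_f))
```

In the paper the implicit update is a large sparse linear system over all cloth particles rather than a scalar division, solved with a modified conjugate gradient method that also enforces the particle constraints.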


Journal ArticleDOI
TL;DR: An algorithm for generating provably passive reduced-order N-port models for linear RLC interconnect circuits is described; in addition to macromodel stability, passivity is needed to guarantee overall circuit stability once driver/load models are connected.
Abstract: This paper describes an algorithm for generating provably passive reduced-order N-port models for RLC interconnect circuits. It is demonstrated that, in addition to macromodel stability, macromodel passivity is needed to guarantee the overall circuit stability once the active and passive driver/load models are connected. The approach proposed here, PRIMA, is a general method for obtaining passive reduced-order macromodels for linear RLC systems. In this paper, PRIMA is demonstrated in terms of a simple implementation which extends the block Arnoldi technique to include guaranteed passivity while providing superior accuracy. While the same passivity extension is not possible for MPVL, comparable accuracy in the frequency domain for all examples is observed.
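The block Arnoldi step that PRIMA builds on amounts to orthonormalizing a Krylov sequence. A minimal single-vector version, with a small symmetric matrix chosen purely for illustration and none of PRIMA's passivity machinery, looks like:

```python
def matvec(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def arnoldi_basis(A, b, k):
    # Build an orthonormal basis of the Krylov subspace span{b, Ab, ..., A^(k-1) b}.
    norm = lambda v: sum(x * x for x in v) ** 0.5
    basis = [[x / norm(b) for x in b]]
    for _ in range(k - 1):
        w = matvec(A, basis[-1])
        for u in basis:                       # Gram-Schmidt orthogonalization
            h = sum(a * c for a, c in zip(u, w))
            w = [a - h * c for a, c in zip(w, u)]
        basis.append([x / norm(w) for x in w])
    return basis

A = [[2.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]]
Q = arnoldi_basis(A, [1.0, 0.0, 0.0], 3)
```

PRIMA's contribution is not the Arnoldi recursion itself but projecting the RLC system matrices with such a basis in a way that provably preserves passivity.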

BookDOI
01 May 1998
TL;DR: An edited volume on learning to learn, with contributions on theoretical models of learning to learn, multitask and lifelong learning, task relatedness and selective knowledge transfer, continual learning, reinforcement learning with self-modifying policies, and advice-taking reinforcement learners.
Abstract: Preface. Part I: Overview Articles. 1. Learning to Learn: Introduction and Overview S. Thrun, L. Pratt. 2. A Survey of Connectionist Network Reuse Through Transfer L. Pratt, B. Jennings. 3. Transfer in Cognition A. Robins. Part II: Prediction. 4. Theoretical Models of Learning to Learn J. Baxter. 5. Multitask Learning R. Caruana. 6. Making a Low-Dimensional Representation Suitable for Diverse Tasks N. Intrator, S. Edelman. 7. The Canonical Distortion Measure for Vector Quantization and Function Approximation J. Baxter. 8. Lifelong Learning Algorithms S. Thrun. Part III: Relatedness. 9. The Parallel Transfer of Task Knowledge Using Dynamic Learning Rates Based on a Measure of Relatedness D.L. Silver, R.E. Mercer. 10. Clustering Learning Tasks and the Selective Cross-Task Transfer of Knowledge S. Thrun, J. O'Sullivan. Part IV: Control. 11. CHILD: A First Step Towards Continual Learning M.B. Ring. 12. Reinforcement Learning with Self-Modifying Policies J. Schmidhuber, et al. 13. Creating Advice-Taking Reinforcement Learners R. Maclin, J.W. Shavlik. Contributing Authors. Index.

Posted Content
TL;DR: In this article, the authors use a dynamic model to show that optimal investment choices make changes in a firm's systematic risk, and hence its expected return, predictable; the model simultaneously reproduces the time-series relation between the book-to-market ratio and asset returns, the cross-sectional relation between book-to-market, market value, and return, contrarian effects at short horizons, momentum effects at longer horizons, and the inverse relation between interest rates and the market risk premium.
Abstract: As a consequence of optimal investment choices, firms' assets and growth options change in predictable ways. Using a dynamic model, we show that this imparts predictability to changes in a firm's systematic risk, and its expected return. Simulations show that the model simultaneously reproduces: (i) the time series relation between the book-to-market ratio and asset returns, (ii) the cross-sectional relation between book to market, market value and return, (iii) contrarian effects at short horizons, (iv) momentum effects at longer horizons, and (v) the inverse relation between interest rates and the market risk premium.

Proceedings ArticleDOI
19 Oct 1998
TL;DR: An end-to-end method for extracting moving targets from a real-time video stream, classifying them into predefined categories according to image-based properties, and then robustly tracking them is described.
Abstract: This paper describes an end-to-end method for extracting moving targets from a real-time video stream, classifying them into predefined categories according to image-based properties, and then robustly tracking them. Moving targets are detected using the pixel-wise difference between consecutive image frames. A classification metric is applied to these targets with a temporal consistency constraint to classify them into three categories: human, vehicle, or background clutter. Once classified, targets are tracked by a combination of temporal differencing and template matching. The resulting system robustly identifies targets of interest, rejects background clutter, and continually tracks over large distances and periods of time despite occlusions, appearance changes, and cessation of target motion.
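The temporal-differencing step at the heart of the detector is simply a thresholded absolute difference between consecutive frames. The tiny 2x3 "frames" and threshold below are illustrative, not the system's parameters:

```python
def moving_mask(prev, cur, thresh=10):
    # Temporal differencing: flag pixels whose frame-to-frame change is large.
    return [[abs(c - p) > thresh for p, c in zip(row_p, row_c)]
            for row_p, row_c in zip(prev, cur)]

prev = [[0, 0, 0], [0, 0, 0]]
cur  = [[0, 50, 0], [0, 0, 0]]     # one bright "target" pixel appeared
mask = moving_mask(prev, cur)
```

In the full system, connected regions of such flagged pixels become candidate targets, which are then classified and handed to the template-matching tracker.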

Journal ArticleDOI
TL;DR: In this article, a computationally efficient and rigorous thermodynamic model that predicts the physical state and composition of inorganic atmospheric aerosol is presented; one of its main features is the implementation of mutual deliquescence of multicomponent salt particles, which lowers the deliquescence point of the aerosol phase.
Abstract: A computationally efficient and rigorous thermodynamic model that predicts the physical state and composition of inorganic atmospheric aerosol is presented. One of the main features of the model is the implementation of mutual deliquescence of multicomponent salt particles, which lowers the deliquescence point of the aerosol phase. The model is used to examine the behavior of four types of tropospheric aerosol (marine, urban, remote continental, and non-urban continental), and the results are compared with the predictions of two other models currently in use. The results of all three models were generally in good agreement. Differences were found primarily in the mutual deliquescence humidity regions, where the new model predicted the existence of water and the other two did not. Differences in the behavior (speciation and water-absorbing properties) between the aerosol types are pointed out. The new model also needs considerably less CPU time and consistently shows stability and robust convergence.

Journal ArticleDOI
TL;DR: This paper presents several solutions to the problem of task allocation among autonomous agents, and suggests that the agents form coalitions in order to perform tasks or improve the efficiency of their performance.

Journal ArticleDOI
TL;DR: This paper describes an approach that integrates the grid-based and topological mapping paradigms, gaining the advantages of both worlds: accuracy/consistency and efficiency.

Journal ArticleDOI
TL;DR: In this paper, the authors examined innovative activity using a database of innovations in the UK and found that firms located in strong industrial clusters or regions are more likely to innovate than firms outside these regions.

Journal ArticleDOI
TL;DR: The authors propose a "double-entry" mental accounting model of the reciprocal interactions between the pleasure of consumption and the pain of paying, and draw out its implications for consumer behavior and hedonics; among other predictions, consumers will find it less painful to pay for, and hence will prefer, flat-rate pricing schemes such as unlimited Internet access at a fixed monthly price, even if this means paying more for the same usage.
Abstract: In the standard economic account of consumer behavior the cost of a purchase takes the form of a reduction in future utility when expenditures that otherwise could have been made are forgone. The reality of consumer hedonics is different. When people make purchases, they often experience an immediate pain of paying, which can undermine the pleasure derived from consumption. The ticking of the taxi meter, for example, reduces one's pleasure from the ride. We propose a "double-entry" mental accounting theory that describes the nature of these reciprocal interactions between the pleasure of consumption and the pain of paying and draws out their implications for consumer behavior and hedonics. A central assumption of the model, which we call prospective accounting, is that consumption that has already been paid for can be enjoyed as if it were free and that the pain associated with payments made prior to consumption but not after is buffered by thoughts of the benefits that the payments will finance. Another important concept is coupling, which refers to the degree to which consumption calls to mind thoughts of payment, and vice versa. Some financing methods, such as credit cards, tend to weaken coupling, whereas others, such as cash payment, produce tight coupling. Our model makes a variety of predictions that are at variance with economic formulations. Contrary to the standard prediction that people will finance purchases to minimize the present value of payments, our model predicts strong debt aversion-that they should prefer to prepay for consumption or to get paid for work after it is performed. Such pay-before sequences confer hedonic benefits because consumption can be enjoyed without thinking about the need to pay for it in the future. Likewise, when paying beforehand, the pain of paying is mitigated by thoughts of future consumption benefits. 
Contrary to the economic prediction that consumers should prefer to pay, at the margin, for what they consume, our model predicts that consumers will find it less painful to pay for, and hence will prefer, flat-rate pricing schemes such as unlimited Internet access at a fixed monthly price, even if it involves paying more for the same usage. Other predictions concern spending patterns with cash, charge, or credit cards, and preferences for the earmarking of purchases. We test these predictions in a series of surveys and in a conjoint-like analysis that pitted our double-entry mental accounting model against a standard discounting formulation and another benchmark that did not incorporate hedonic interactions between consumption and payments. Our model provides a better fit of the data for 60% of the subjects; the discounting formulation provides a better fit for only 29% of the subjects even when allowing for positive and negative discount rates. The pain of paying, we argue, plays an important role in consumer self-regulation, but is hedonically costly. From a hedonic perspective the ideal situation is one in which payments are tightly coupled to consumption so that paying evokes thoughts about the benefits being financed but consumption is decoupled from payments so that consumption does not evoke thoughts about payment. From an efficiency perspective, however, it is important for consumers to be aware of what they are paying for consumption. This creates a tension between hedonic efficiency and what we call decision efficiency. Various institutional arrangements, such as financing of public parks through taxes or usage fees, play into this tradeoff. A producer developing a pricing structure for their product or service should be aware of these two conflicting objectives, and should try to devise a structure that reconciles them.

DOI
01 Jan 1998
TL;DR: Topic Detection and Tracking (TDT) is a DARPA-sponsored initiative to investigate the state of the art in finding and following new events in a stream of broadcast news stories.
Abstract: Topic Detection and Tracking (TDT) is a DARPA-sponsored initiative to investigate the state of the art in finding and following new events in a stream of broadcast news stories. The TDT problem consists of three major tasks: (1) segmenting a stream of data, especially recognized speech, into distinct stories; (2) identifying those news stories that are the first to discuss a new event occurring in the news; and (3) given a small number of sample news stories about an event, finding all following stories in the stream.
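Task (2), first-story detection, is often approached as a novelty test: flag a story whose best match against all earlier stories falls below a threshold. The similarity measure, threshold, and toy headlines below are illustrative assumptions, not the TDT pilot-study systems:

```python
def sim(a, b):
    # Toy similarity: word-overlap ratio between headlines.
    A, B = set(a.split()), set(b.split())
    return len(A & B) / len(A | B)

def first_stories(stories, sim, thresh=0.3):
    # Flag a story as a new event if it is novel w.r.t. everything seen so far.
    new, seen = [], []
    for s in stories:
        if not seen or max(sim(s, t) for t in seen) < thresh:
            new.append(s)
        seen.append(s)
    return new

stream = ["quake hits city", "quake hits coast", "election results announced"]
events = first_stories(stream, sim)
```

The same max-similarity machinery, run with a per-event set of sample stories instead of the full history, gives a rough picture of task (3), tracking.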

Journal ArticleDOI
TL;DR: In this paper, the authors describe a supply chain modeling framework designed to reduce the time and effort required to develop models with sufficient fidelity to the actual supply chain of interest; such models are essential for risk-benefit analysis of reengineering alternatives before a final decision is made.
Abstract: A global economy and increase in customer expectations in terms of cost and services have put a premium on effective supply chain reengineering. It is essential to perform risk-benefit analysis of reengineering alternatives before making a final decision. Simulation provides an effective pragmatic approach to detailed analysis and evaluation of supply chain design and management alternatives. However, the utility of this methodology is hampered by the time and effort required to develop models with sufficient fidelity to the actual supply chain of interest. In this paper, we describe a supply chain modeling framework designed to overcome this difficulty. Using our approach, supply chain models are composed from software components that represent types of supply chain agents (e.g., retailers, manufacturers, transporters), their constituent control elements (e.g., inventory policy), and their interaction protocols (e.g., message types). The underlying library of supply chain modeling components has been derived from analysis of several different supply chains. It provides a reusable base of domain-specific primitives that enables rapid development of customized decision support tools.
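The componentized style the framework describes, agents with pluggable control elements such as an inventory policy, can be suggested in a few lines. The class, the (s, Q) reorder policy, and the instant replenishment are illustrative assumptions, not the paper's component library:

```python
class Agent:
    # Toy supply chain agent with a pluggable (s, Q) inventory policy:
    # reorder Q units whenever stock falls to the reorder point s.
    def __init__(self, name, reorder_point, order_qty, stock=20):
        self.name, self.stock = name, stock
        self.reorder_point, self.order_qty = reorder_point, order_qty

    def sell(self, demand):
        shipped = min(self.stock, demand)
        self.stock -= shipped
        if self.stock <= self.reorder_point:
            self.stock += self.order_qty      # toy instant replenishment
        return shipped

retailer = Agent("retailer", reorder_point=5, order_qty=15)
served = sum(retailer.sell(d) for d in [8, 8, 8, 8])
```

In the framework, such agents would instead exchange typed messages with manufacturers and transporters, and the control elements would be swapped to compare reengineering alternatives.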

Journal ArticleDOI
TL;DR: A review of the development of controlled/living radical polymerization methods; the authors give a brief overview of recent developments in controlled radical polymerizations and describe in more depth the progress that has been made in atom transfer radical polymerization (ATRP).
Abstract: The development of new polymeric materials is based on the availability of methods, principally living polymerizations, that allow well-defined polymers to be prepared. Living polymerizations are chain-growth polymerizations that proceed in the absence of irreversible chain transfer and chain termination. Provided that initiation is complete and exchange between species of various reactivities is fast, one can adjust the final average molecular weight of the polymer by varying the initial monomer-to-initiator ratio (DPn = Δ[M]/[I]0) while maintaining a narrow molecular weight distribution (1.0 < Mw/Mn < 1.5). [8,9] Also, one has control over the chemistry and structure of the initiator and active end group, so polymers can be end-functionalized and block copolymerized with other monomers. Thus, using only a few monomers and a living polymerization, one can create many new materials with vastly differing properties simply by varying the topology of the polymer (i.e., comb, star, dendritic, etc.), the composition of the polymer (i.e., random, periodic, graft, etc.), or the functional groups at various sites on the polymer (i.e., end, center, side, etc.) (Fig. 1). Examples of such materials prepared by atom transfer radical polymerization (ATRP) are shown later in this review. Much of the academic and industrial research on materials development has focused on coordination, cationic, anionic, and ring-opening polymerizations due to the availability of controlled/living polymerizations of these types. Free-radical polymerizations accounted for approximately half of the total production of polymers in the United States in 1995. Despite its tremendous utility, a significant drawback to free-radical polymerization is the lack of macromolecular structure control due to near diffusion-controlled radical coupling and disproportionation. Therefore, the development of controlled/living radical polymerization methods has been a long-standing goal in polymer chemistry. 
The last five years have seen the realization of this goal and the rapid growth in the development and understanding of new controlled radical polymerizations. In this discussion, we give a brief overview of recent developments in controlled radical polymerizations and describe in more depth the progress that has been made in the development of ATRP.
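The degree-of-polymerization relation quoted in the abstract can be made concrete with a worked calculation (the concentrations below are hypothetical, chosen only to show the arithmetic):

```latex
\[
\mathrm{DP}_n \;=\; \frac{\Delta[M]}{[I]_0}
\]
For example, polymerizing styrene from $[M]_0 = 5.0\,\mathrm{M}$ to full
conversion ($\Delta[M] = 5.0\,\mathrm{M}$) with $[I]_0 = 0.05\,\mathrm{M}$ gives
\[
\mathrm{DP}_n = \frac{5.0}{0.05} = 100,
\qquad
M_n \approx 100 \times 104\,\mathrm{g\,mol^{-1}}
      \approx 1.04 \times 10^{4}\,\mathrm{g\,mol^{-1}},
\]
where $104\,\mathrm{g\,mol^{-1}}$ is the molar mass of the styrene repeat unit.
```

Halving $[I]_0$ doubles the target molecular weight, which is the control the living-polymerization condition buys.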

Journal ArticleDOI
TL;DR: This paper addresses the problem of building large-scale geometric maps of indoor environments with mobile robots as a constrained, probabilistic maximum-likelihood estimation problem, and devises a practical algorithm for generating the most likely map from data, along with the best path taken by the robot.
Abstract: This paper addresses the problem of building large-scale geometric maps of indoor environments with mobile robots. It poses the map building problem as a constrained, probabilistic maximum-likelihood estimation problem. It then devises a practical algorithm for generating the most likely map from data, along with the most likely path taken by the robot. Experimental results in cyclic environments of size up to 80 by 25 meters illustrate the appropriateness of the approach.
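The maximum-likelihood formulation can be illustrated in one dimension: with Gaussian measurement noise, the most likely poses minimize the sum of squared constraint residuals. A minimal sketch with made-up odometry and loop-closure measurements (not the paper's algorithm, which handles full 2-D maps):

```python
# Three poses on a line; x0 is fixed at 0. Two odometry measurements
# (z01, z12) and one loop-closure-style measurement (z02) disagree
# slightly, so the ML estimate must reconcile them.
def grad(x1, x2, z01=1.0, z12=1.0, z02=1.9):
    """Gradient of the negative log-likelihood (sum of squared residuals)."""
    r01 = x1 - z01          # residual of constraint x1 - x0 = z01
    r12 = x2 - x1 - z12     # residual of constraint x2 - x1 = z12
    r02 = x2 - z02          # residual of constraint x2 - x0 = z02
    g1 = 2 * r01 - 2 * r12
    g2 = 2 * r12 + 2 * r02
    return g1, g2

# Gradient descent to the most likely poses.
x1, x2 = 0.0, 0.0
for _ in range(2000):
    g1, g2 = grad(x1, x2)
    x1 -= 0.1 * g1
    x2 -= 0.1 * g2
print(round(x1, 3), round(x2, 3))  # → 0.967 1.933
```

The estimate splits the 0.1 m of loop-closure disagreement across both odometry links rather than blaming one of them, which is the qualitative behavior that makes the probabilistic formulation work in cyclic environments.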

Journal ArticleDOI
TL;DR: In this paper, the deformation of a curved interface between solid phases was studied under the assumption of small strains in the bulk phases and neglecting accretion at the interfaces, and the authors showed that the free energy of the interface can depend on the normal and tangential components of the jump in displacement at the interface (stretch and slip), and the average of the projected strain in the tangent plane (average tangential strain).
Abstract: We discuss the deformation of a curved interface between solid phases, assuming small strains in the bulk phases and neglecting accretion at the interfaces. Such assumptions are relevant to the deformation of solid microstructures when atomic diffusion and the formation of defects such as dislocations are negligible. We base our theory on a constitutive equation giving the (excess) free energy ψ of the interface when the interfacial limits of the displacement fields in the abutting phases as well as the limits of the displacement gradients are known. Using general considerations of frame invariance, we show that ψ can depend on these quantities at most through: firstly the normal and tangential components of the jump in displacement at the interface (stretch and slip), secondly the average of the projected strain in the tangent plane (average tangential strain), thirdly the tangential component of the jump in the projected displacement gradient at the interface (relative tangential strain and rel...