
Showing papers by "Carnegie Mellon University" published in 1997


Journal ArticleDOI
01 Jul 1997
TL;DR: Multi-task Learning (MTL) as mentioned in this paper is an approach to inductive transfer that improves generalization by using the domain information contained in the training signals of related tasks as an inductive bias.
Abstract: Multitask Learning is an approach to inductive transfer that improves generalization by using the domain information contained in the training signals of related tasks as an inductive bias. It does this by learning tasks in parallel while using a shared representation; what is learned for each task can help other tasks be learned better. This paper reviews prior work on MTL, presents new evidence that MTL in backprop nets discovers task relatedness without the need of supervisory signals, and presents new results for MTL with k-nearest neighbor and kernel regression. In this paper we demonstrate multitask learning in three domains. We explain how multitask learning works, and show that there are many opportunities for multitask learning in real domains. We present an algorithm and results for multitask learning with case-based methods like k-nearest neighbor and kernel regression, and sketch an algorithm for multitask learning in decision trees. Because multitask learning works, can be applied to many different kinds of domains, and can be used with different learning algorithms, we conjecture there will be many opportunities for its use on real-world problems.
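
As a rough illustration of the shared-representation idea in the abstract above, here is a minimal sketch of hard parameter sharing in a tiny backprop net: all tasks share one hidden layer and each task gets its own output head, so every task's training signal shapes the shared representation. The architecture, toy data, and hyperparameters are illustrative assumptions, not the paper's experiments.

```python
# Minimal sketch of multitask learning via a shared representation:
# one hidden layer shared by all tasks, one linear output head per task.
# Hypothetical toy setup, not the paper's experimental configuration.
import numpy as np

rng = np.random.default_rng(0)

def mtl_train(X, Ys, hidden=16, lr=0.01, epochs=2000):
    """X: (n, d) inputs; Ys: (n, T) one regression target per task."""
    n, d = X.shape
    T = Ys.shape[1]
    W1 = rng.normal(scale=0.1, size=(d, hidden))      # shared weights
    W2 = rng.normal(scale=0.1, size=(hidden, T))      # one column per task
    for _ in range(epochs):
        H = np.tanh(X @ W1)                           # shared representation
        pred = H @ W2
        err = pred - Ys                               # errors for all tasks
        # Every task's error flows back into the shared layer, which is
        # where the inductive transfer between related tasks happens.
        gW2 = H.T @ err / n
        gH = err @ W2.T * (1 - H**2)
        gW1 = X.T @ gH / n
        W1 -= lr * gW1
        W2 -= lr * gW2
    return W1, W2

# Toy data: two related tasks that depend on overlapping features.
X = rng.normal(size=(200, 5))
Ys = np.stack([X[:, 0] + 0.5 * X[:, 1],
               X[:, 0] - 0.5 * X[:, 2]], axis=1)
W1, W2 = mtl_train(X, Ys)
print(np.mean((np.tanh(X @ W1) @ W2 - Ys) ** 2))      # joint training error
```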

5,181 citations


Book
01 Feb 1997
TL;DR: A textbook on human-computer interaction covering the human, the computer, the interaction, the design process, and designing for diversity in interactive systems.
Abstract: Contents Foreword Preface to the third edition Preface to the second edition Preface to the first edition Introduction Part 1 Foundations Chapter 1 The human 1.1 Introduction 1.2 Input-output channels Design Focus: Getting noticed Design Focus: Where's the middle? 1.3 Human memory Design Focus: Cashing in Design Focus: 7 +- 2 revisited 1.4 Thinking: reasoning and problem solving Design Focus: Human error and false memories 1.5 Emotion 1.6 Individual differences 1.7 Psychology and the design of interactive systems 1.8 Summary Exercises Recommended reading Chapter 2 The computer 2.1 Introduction Design Focus: Numeric keypads 2.2 Text entry devices 2.3 Positioning, pointing and drawing 2.4 Display devices Design Focus: Hermes: a situated display 2.5 Devices for virtual reality and 3D interaction 2.6 Physical controls, sensors and special devices Design Focus: Feeling the road Design Focus: Smart-Its - making sensors easy 2.7 Paper: printing and scanning Design Focus: Readability of text 2.8 Memory 2.9 Processing and networks Design Focus: The myth of the infinitely fast machine 2.10 Summary Exercises Recommended reading Chapter 3 The interaction 3.1 Introduction 3.2 Models of interaction Design Focus: Video recorder 3.3 Frameworks and HCI 3.4 Ergonomics Design Focus: Industrial interfaces 3.5 Interaction styles Design Focus: Navigation in 3D and 2D 3.6 Elements of the WIMP interface Design Focus: Learning toolbars 3.7 Interactivity 3.8 The context of the interaction Design Focus: Half the picture? 3.9 Experience, engagement and fun 3.10 Summary Exercises Recommended reading Chapter 4 Paradigms 4.1 Introduction 4.2 Paradigms for interaction 4.3 Summary Exercises Recommended reading Part 2 Design process Chapter 5 Interaction design basics 5.1 Introduction 5.2 What is design? 5.3 The process of design 5.4 User focus Design Focus: Cultural probes 5.5 Scenarios 5.6 Navigation design Design Focus: Beware the big button trap Design Focus: Modes 5.7 Screen design and layout Design Focus: Alignment and layout matter Design Focus: Checking screen colors 5.8 Iteration and prototyping 5.9 Summary Exercises Recommended reading Chapter 6 HCI in the software process 6.1 Introduction 6.2 The software life cycle 6.3 Usability engineering 6.4 Iterative design and prototyping Design Focus: Prototyping in practice 6.5 Design rationale 6.6 Summary Exercises Recommended reading Chapter 7 Design rules 7.1 Introduction 7.2 Principles to support usability 7.3 Standards 7.4 Guidelines 7.5 Golden rules and heuristics 7.6 HCI patterns 7.7 Summary Exercises Recommended reading Chapter 8 Implementation support 8.1 Introduction 8.2 Elements of windowing systems 8.3 Programming the application Design Focus: Going with the grain 8.4 Using toolkits Design Focus: Java and AWT 8.5 User interface management systems 8.6 Summary Exercises Recommended reading Chapter 9 Evaluation techniques 9.1 What is evaluation? 
9.2 Goals of evaluation 9.3 Evaluation through expert analysis 9.4 Evaluation through user participation 9.5 Choosing an evaluation method 9.6 Summary Exercises Recommended reading Chapter 10 Universal design 10.1 Introduction 10.2 Universal design principles 10.3 Multi-modal interaction Design Focus: Designing websites for screen readers Design Focus: Choosing the right kind of speech Design Focus: Apple Newton 10.4 Designing for diversity Design Focus: Mathematics for the blind 10.5 Summary Exercises Recommended reading Chapter 11 User support 11.1 Introduction 11.2 Requirements of user support 11.3 Approaches to user support 11.4 Adaptive help systems Design Focus: It's good to talk - help from real people 11.5 Designing user support systems 11.6 Summary Exercises Recommended reading Part 3 Models and theories Chapter 12 Cognitive models 12.1 Introduction 12.2 Goal and task hierarchies Design Focus: GOMS saves money 12.3 Linguistic models 12.4 The challenge of display-based systems 12.5 Physical and device models 12.6 Cognitive architectures 12.7 Summary Exercises Recommended reading Chapter 13 Socio-organizational issues and stakeholder requirements 13.1 Introduction 13.2 Organizational issues Design Focus: Implementing workflow in Lotus Notes 13.3 Capturing requirements Design Focus: Tomorrow's hospital - using participatory design 13.4 Summary Exercises Recommended reading Chapter 14 Communication and collaboration models 14.1 Introduction 14.2 Face-to-face communication Design Focus: Looking real - Avatar Conference 14.3 Conversation 14.4 Text-based communication 14.5 Group working 14.6 Summary Exercises Recommended reading Chapter 15 Task analysis 15.1 Introduction 15.2 Differences between task analysis and other techniques 15.3 Task decomposition 15.4 Knowledge-based analysis 15.5 Entity-relationship-based techniques 15.6 Sources of information and data collection 15.7 Uses of task analysis 15.8 Summary Exercises Recommended reading Chapter 16 Dialog notations and design 16.1 What is dialog? 
16.2 Dialog design notations 16.3 Diagrammatic notations Design Focus: Using STNs in prototyping Design Focus: Digital watch - documentation and analysis 16.4 Textual dialog notations 16.5 Dialog semantics 16.6 Dialog analysis and design 16.7 Summary Exercises Recommended reading Chapter 17 Models of the system 17.1 Introduction 17.2 Standard formalisms 17.3 Interaction models 17.4 Continuous behavior 17.5 Summary Exercises Recommended reading Chapter 18 Modeling rich interaction 18.1 Introduction 18.2 Status-event analysis 18.3 Rich contexts 18.4 Low intention and sensor-based interaction Design Focus: Designing a car courtesy light 18.5 Summary Exercises Recommended reading Part 4 Outside the box Chapter 19 Groupware 19.1 Introduction 19.2 Groupware systems 19.3 Computer-mediated communication Design Focus: SMS in action 19.4 Meeting and decision support systems 19.5 Shared applications and artifacts 19.6 Frameworks for groupware Design Focus: TOWER - workspace awareness Exercises Recommended reading Chapter 20 Ubiquitous computing and augmented realities 20.1 Introduction 20.2 Ubiquitous computing applications research Design Focus: Ambient Wood - augmenting the physical Design Focus: Classroom 2000/eClass - deploying and evaluating ubicomp 20.3 Virtual and augmented reality Design Focus: Shared experience Design Focus: Applications of augmented reality 20.4 Information and data visualization Design Focus: Getting the size right 20.5 Summary Exercises Recommended reading Chapter 21 Hypertext, multimedia and the world wide web 21.1 Introduction 21.2 Understanding hypertext 21.3 Finding things 21.4 Web technology and issues 21.5 Static web content 21.6 Dynamic web content 21.7 Summary Exercises Recommended reading References Index

5,095 citations


Proceedings ArticleDOI
03 Aug 1997
TL;DR: This work has developed a surface simplification algorithm which can rapidly produce high quality approximations of polygonal models, and which also supports non-manifold surface models.
Abstract: Many applications in computer graphics require complex, highly detailed models. However, the level of detail actually necessary may vary considerably. To control processing time, it is often desirable to use approximations in place of excessively detailed models. We have developed a surface simplification algorithm which can rapidly produce high quality approximations of polygonal models. The algorithm uses iterative contractions of vertex pairs to simplify models and maintains surface error approximations using quadric matrices. By contracting arbitrary vertex pairs (not just edges), our algorithm is able to join unconnected regions of models. This can facilitate much better approximations, both visually and with respect to geometric error. In order to allow topological joining, our system also supports non-manifold surface models. CR Categories: I.3.5 [Computer Graphics]: Computational Geometry and Object Modeling—surface and object representations
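
The abstract's "surface error approximations using quadric matrices" can be illustrated with a small numeric sketch: each plane contributes a 4x4 quadric, a vertex accumulates the sum of its planes' quadrics, and the error of placing a (contracted) vertex at position v is the quadratic form on the homogeneous coordinates of v. The planes and positions below are toy assumptions; pair selection and optimal-placement solving are omitted.

```python
# Minimal sketch of the quadric error idea behind the simplification
# algorithm: each plane (a, b, c, d) with ax + by + cz + d = 0 contributes the
# 4x4 matrix K = p p^T; a vertex's quadric Q is the sum over its planes, and
# the cost of moving/contracting to position v is vh^T Q vh (homogeneous vh).
import numpy as np

def plane_quadric(a, b, c, d):
    p = np.array([a, b, c, d], dtype=float)
    return np.outer(p, p)

def vertex_error(Q, v):
    vh = np.append(np.asarray(v, dtype=float), 1.0)   # homogeneous coords
    return vh @ Q @ vh

# Two unit planes meeting at the origin (z = 0 and x = 0).
Q = plane_quadric(0, 0, 1, 0) + plane_quadric(1, 0, 0, 0)

# Candidate positions for a contracted vertex pair: the quadric measures the
# sum of squared distances to the accumulated planes.
print(vertex_error(Q, [0, 0, 0]))   # 0.0  (lies on both planes)
print(vertex_error(Q, [1, 0, 2]))   # 1 + 4 = 5.0
```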

3,564 citations


Journal ArticleDOI
TL;DR: This approach, designed for mobile robots equipped with synchro-drives, is derived directly from the motion dynamics of the robot and safely controlled the mobile robot RHINO in populated and dynamic environments.
Abstract: The dynamic window approach, designed for mobile robots equipped with synchro-drives, is derived directly from the motion dynamics of the robot. In experiments, it safely controlled the mobile robot RHINO at speeds of up to 95 cm/sec in populated and dynamic environments.
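
A schematic sketch of the kind of velocity-space search the dynamic window approach performs: sample translational and rotational velocities reachable within one control cycle, drop pairs that cannot stop before the nearest obstacle, and score the rest by heading, clearance, and speed. The objective weights, clearance model, and discretization are illustrative assumptions, not the paper's tuning.

```python
# Schematic sketch of a dynamic-window style velocity search. The scoring
# terms and the clearance model below are illustrative stand-ins.
import math

def dynamic_window_step(v, w, goal_heading, clearance_fn,
                        v_max=0.95, w_max=1.0, acc_v=0.5, acc_w=1.0, dt=0.25,
                        alpha=0.8, beta=0.1, gamma=0.1):
    best, best_score = (0.0, 0.0), -math.inf
    # Dynamic window: velocities reachable from (v, w) within one cycle dt.
    for i in range(11):
        for j in range(11):
            vc = max(0.0, min(v_max, v - acc_v*dt + i * (2*acc_v*dt/10)))
            wc = max(-w_max, min(w_max, w - acc_w*dt + j * (2*acc_w*dt/10)))
            dist = max(0.0, clearance_fn(vc, wc))     # distance to nearest obstacle
            if vc > math.sqrt(2 * dist * acc_v):
                continue                              # cannot stop in time: inadmissible
            heading = 1.0 - abs(goal_heading - wc*dt) / math.pi
            score = alpha*heading + beta*dist + gamma*(vc / v_max)
            if score > best_score:
                best, best_score = (vc, wc), score
    return best

# Toy clearance model: turning right (negative w) buys room, turning left loses it.
print(dynamic_window_step(0.5, 0.0, goal_heading=0.3,
                          clearance_fn=lambda v, w: 2.0 - w))
```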

2,886 citations


Journal ArticleDOI
TL;DR: This survey reviews work in machine learning on methods for handling data sets containing large amounts of irrelevant information and describes the advances that have been made in both empirical and theoretical work in this area.

2,869 citations


Journal ArticleDOI
TL;DR: The combination of high volume and personal taste made Usenet news a promising candidate for collaborative filtering, and the potential predictive utility for Usenet news was very high.
Abstract: newsgroups carry a wide enough spread of messages to make most individuals consider Usenet news to be a high noise information resource. Furthermore, each user values a different set of messages. Both taste and prior knowledge are major factors in evaluating news articles. For example, readers of the rec.humor newsgroup, a group designed for jokes and other humorous postings, value articles based on whether they perceive them to be funny. Readers of technical groups, such as comp.lang.c++, value articles based on interest and usefulness to them—introductory questions and answers may be uninteresting to an expert C++ programmer just as debates over subtle and advanced language features may be useless to the novice. The combination of high volume and personal taste made Usenet news a promising candidate for collaborative filtering. More formally, we determined the potential predictive utility for Usenet news was very high. The GroupLens project started in 1992 and completed a pilot study at two sites to establish the feasibility of using collaborative filtering for Usenet news [8]. Several critical design decisions were made as part of that pilot study, including:
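
A minimal sketch of neighborhood-based collaborative filtering of the general kind GroupLens explores: predict one reader's rating of an article from the ratings of readers whose past ratings correlate with theirs. The rating matrix and weighting details below are made-up illustrations, not the project's actual design decisions.

```python
# Minimal sketch of neighborhood-based collaborative filtering: predict a
# user's rating of an article from the ratings of correlated users.
# Illustrative only; the ratings and weighting scheme are invented here.
import numpy as np

ratings = np.array([        # rows: users, cols: articles; 0 = unrated
    [5, 4, 0, 1],
    [4, 5, 3, 1],
    [1, 2, 4, 5],
], dtype=float)

def predict(ratings, user, item):
    mask_u = ratings[user] > 0
    num = den = 0.0
    for other in range(ratings.shape[0]):
        if other == user or ratings[other, item] == 0:
            continue
        common = mask_u & (ratings[other] > 0)
        if common.sum() < 2:
            continue
        # Pearson correlation over co-rated articles.
        r = np.corrcoef(ratings[user, common], ratings[other, common])[0, 1]
        if np.isnan(r):
            continue
        other_mean = ratings[other][ratings[other] > 0].mean()
        num += r * (ratings[other, item] - other_mean)
        den += abs(r)
    base = ratings[user][mask_u].mean()
    return base if den == 0 else base + num / den

print(round(predict(ratings, user=0, item=2), 2))
```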

2,657 citations


Journal ArticleDOI
TL;DR: DIGE is reproducible and sensitive: it can detect an exogenous difference between two Drosophila embryo extracts at nanogram levels, and an inducible E. coli protein was detected after 15 min of induction and identified using DIGE preparatively.
Abstract: We describe a modification of two-dimensional (2-D) polyacrylamide gel electrophoresis that requires only a single gel to reproducibly detect differences between two protein samples. This was accomplished by fluorescently tagging the two samples with two different dyes, running them on the same 2-D gel, post-run fluorescence imaging of the gel into two images, and superimposing the images. The amine reactive dyes were designed to insure that proteins common to both samples have the same relative mobility regardless of the dye used to tag them. Thus, this technique, called difference gel electrophoresis (DIGE), circumvents the need to compare several 2-D gels. DIGE is reproducible, sensitive, and can detect an exogenous difference between two Drosophila embryo extracts at nanogram levels. Moreover, an inducible protein from E. coli was detected after 15 min of induction and identified using DIGE preparatively.

2,220 citations


Journal ArticleDOI
TL;DR: In this article, the authors compare several methods of estimating Bayes factors when it is possible to simulate observations from the posterior distributions, via Markov chain Monte Carlo or other techniques, provided that each posterior distribution is well behaved in the sense of having a single dominant mode.
Abstract: The Bayes factor is a ratio of two posterior normalizing constants, which may be difficult to compute. We compare several methods of estimating Bayes factors when it is possible to simulate observations from the posterior distributions, via Markov chain Monte Carlo or other techniques. The methods that we study are all easily applied without consideration of special features of the problem, provided that each posterior distribution is well behaved in the sense of having a single dominant mode. We consider a simulated version of Laplace's method, a simulated version of Bartlett correction, importance sampling, and a reciprocal importance sampling technique. We also introduce local volume corrections for each of these. In addition, we apply the bridge sampling method of Meng and Wong. We find that a simulated version of Laplace's method, with local volume correction, furnishes an accurate approximation that is especially useful when likelihood function evaluations are costly. A simple bridge sampli...
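
For a concrete flavor of the simulation-based estimators being compared, here is a toy importance-sampling estimate of one marginal likelihood (a Bayes factor is the ratio of two such normalizing constants). The conjugate normal model is chosen only so the exact answer is available for comparison; the Gaussian proposal fitted to posterior draws is a generic choice, not the paper's volume-corrected Laplace or bridge-sampling estimators.

```python
# Toy estimate of a posterior normalizing constant (marginal likelihood) by
# importance sampling: p(y) = E_g[ p(y|theta) p(theta) / g(theta) ], with a
# Gaussian proposal g fitted to posterior draws. Conjugate normal toy model.
import numpy as np

rng = np.random.default_rng(1)
y = rng.normal(1.0, 1.0, size=50)         # data with known unit variance
tau2 = 4.0                                # prior: theta ~ N(0, tau2)
n = len(y)

def log_lik(theta):                       # sum_i log N(y_i | theta, 1)
    return -0.5 * ((y[None, :] - theta[:, None]) ** 2).sum(axis=1) \
           - 0.5 * n * np.log(2 * np.pi)

def log_prior(theta):
    return -0.5 * theta**2 / tau2 - 0.5 * np.log(2 * np.pi * tau2)

# Posterior draws (exact here because the toy model is conjugate; in general
# these would come from MCMC, as the abstract describes).
post_var = 1.0 / (1.0 / tau2 + n)
draws = rng.normal(post_var * y.sum(), np.sqrt(post_var), size=5000)

# Gaussian proposal g fitted to the draws, then importance sampling.
mu, sd = draws.mean(), draws.std()
theta = rng.normal(mu, sd, size=5000)
log_g = -0.5 * ((theta - mu) / sd) ** 2 - np.log(sd * np.sqrt(2 * np.pi))
log_w = log_lik(theta) + log_prior(theta) - log_g
log_marginal = np.log(np.exp(log_w - log_w.max()).mean()) + log_w.max()

# Exact log marginal for this toy: y is jointly normal with cov I + tau2 * J.
cov = np.eye(n) + tau2
exact = -0.5 * (y @ np.linalg.solve(cov, y)
                + np.linalg.slogdet(cov)[1] + n * np.log(2 * np.pi))
print(log_marginal, exact)   # a Bayes factor is the ratio (log difference)
                             # of two such estimates for two competing models
```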

2,191 citations


Journal ArticleDOI
TL;DR: An HIV vector system in which the virulence genes env, vif, vpr, vpu, and nef have been deleted is described, and this multiply attenuated vector conserved the ability to transduce growth-arrested cells and monocyte-derived macrophages in culture, and could efficiently deliver genes in vivo into adult neurons.
Abstract: Retroviral vectors derived from lentiviruses such as HIV-1 are promising tools for human gene therapy because they mediate the in vivo delivery and long-term expression of transgenes in nondividing tissues. We describe an HIV vector system in which the virulence genes env, vif, vpr, vpu, and nef have been deleted. This multiply attenuated vector conserved the ability to transduce growth-arrested cells and monocyte-derived macrophages in culture, and could efficiently deliver genes in vivo into adult neurons. These data demonstrate the potential of lentiviral vectors in human gene therapy.

2,110 citations


Journal ArticleDOI
TL;DR: The survey discusses distance functions, smoothing parameters, weighting functions, local model structures, regularization of the estimates and bias, assessing predictions, handling noisy data and outliers, improving the quality of predictions by tuning fit parameters, and applications of locally weighted learning.
Abstract: This paper surveys locally weighted learning, a form of lazy learning and memory-based learning, and focuses on locally weighted linear regression. The survey discusses distance functions, smoothing parameters, weighting functions, local model structures, regularization of the estimates and bias, assessing predictions, handling noisy data and outliers, improving the quality of predictions by tuning fit parameters, interference between old and new data, implementing locally weighted learning efficiently, and applications of locally weighted learning. A companion paper surveys how locally weighted learning can be used in robot learning and control.
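
A minimal sketch of the locally weighted linear regression the survey focuses on: for each query point, fit a linear model by weighted least squares, with weights given by a kernel on distance to the query. The Gaussian kernel, fixed bandwidth, ridge term, and toy data are illustrative choices.

```python
# Minimal sketch of locally weighted linear regression: a fresh local linear
# fit per query, weighted by a Gaussian kernel on distance to the query.
import numpy as np

def lwr_predict(X, y, query, bandwidth=0.3, ridge=1e-6):
    d = np.linalg.norm(X - query, axis=1)              # distance function
    w = np.exp(-0.5 * (d / bandwidth) ** 2)            # weighting function
    Xa = np.hstack([X, np.ones((len(X), 1))])          # local linear model
    qa = np.append(query, 1.0)
    A = Xa.T @ (w[:, None] * Xa) + ridge * np.eye(Xa.shape[1])  # regularization
    b = Xa.T @ (w * y)
    beta = np.linalg.solve(A, b)
    return qa @ beta

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
print(lwr_predict(X, y, np.array([1.0])))   # roughly sin(1.0) ~ 0.84
```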

1,863 citations


Journal ArticleDOI
10 Apr 1997-Nature
TL;DR: Functional magnetic resonance imaging is used to examine brain activation in human subjects during performance of a working memory task and to show that prefrontal cortex along with parietal cortex appears to play a role in active maintenance.
Abstract: Working memory is responsible for the short-term storage and online manipulation of information necessary for higher cognitive functions, such as language, planning and problem-solving. Traditionally, working memory has been divided into two types of processes: executive control (governing the encoding manipulation and retrieval of information in working memory) and active maintenance (keeping information available 'online'). It has also been proposed that these two types of processes may be subserved by distinct cortical structures, with the prefrontal cortex housing the executive control processes, and more posterior regions housing the content-specific buffers (for example verbal versus visuospatial) responsible for active maintenance. However, studies in non-human primates suggest that dorsolateral regions of the prefrontal cortex may also be involved in active maintenance. We have used functional magnetic resonance imaging to examine brain activation in human subjects during performance of a working memory task. We used the temporal resolution of this technique to examine the dynamics of regional activation, and to show that prefrontal cortex along with parietal cortex appears to play a role in active maintenance.

Proceedings ArticleDOI
01 Jan 1997
TL;DR: It is shown in this paper how proof-carrying code might be used to develop safe assembly-language extensions of ML programs and the adequacy of concrete representations for the safety policy, the safety proofs, and the proof validation is proved.
Abstract: This paper describes proof-carrying code (PCC), a mechanism by which a host system can determine with certainty that it is safe to execute a program supplied (possibly in binary form) by an untrusted source. For this to be possible, the untrusted code producer must supply with the code a safety proof that attests to the code's adherence to a previously defined safety policy. The host can then easily and quickly validate the proof without using cryptography and without consulting any external agents. In order to gain preliminary experience with PCC, we have performed several case studies. We show in this paper how proof-carrying code might be used to develop safe assembly-language extensions of ML programs. In the context of this case study, we present and prove the adequacy of concrete representations for the safety policy, the safety proofs, and the proof validation. Finally, we briefly discuss how we use proof-carrying code to develop network packet filters that are faster than similar filters developed using other techniques and are formally guaranteed to be safe with respect to a given operating system safety policy.

Journal ArticleDOI
TL;DR: The current understanding of the fundamentals of recrystallization is summarized in this paper, which includes understanding the as-deformed state, nucleation and growth, the development of misorientation during deformation, continuous, dynamic, and geometric dynamic recrystallization, particle effects, and texture.
Abstract: The current understanding of the fundamentals of recrystallization is summarized. This includes understanding the as-deformed state. Several aspects of recrystallization are described: nucleation and growth, the development of misorientation during deformation, continuous, dynamic, and geometric dynamic recrystallization, particle effects, and texture. This article is authored by the leading experts in these areas. The subjects are discussed individually and recommendations for further study are listed in the final section.

Journal ArticleDOI
TL;DR: Graphplan as mentioned in this paper is a planner based on constructing and analyzing a compact structure called a planning graph; it either finds the shortest possible partial-order plan or states that no valid plan exists.

Journal ArticleDOI
TL;DR: The key idea is to define architectural connectors as explicit semantic entities, specified as a collection of protocols that characterize each of the participant roles in an interaction and how these roles interact.
Abstract: As software systems become more complex, the overall system structure—or software architecture—becomes a central design problem. An important step toward an engineering discipline of software is a formal basis for describing and analyzing these designs. In the article we present a formal approach to one aspect of architectural design: the interactions among components. The key idea is to define architectural connectors as explicit semantic entities. These are specified as a collection of protocols that characterize each of the participant roles in an interaction and how these roles interact. We illustrate how this scheme can be used to define a variety of common architectural connectors. We further provide a formal semantics and show how this leads to a system in which architectural compatibility can be checked in a way analogous to type-checking in programming languages.
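
As a drastically simplified illustration of protocol-based compatibility checking, the sketch below models a connector role and a component port as finite, prefix-closed sets of event traces and treats compatibility as trace containment. This toy only gestures at the article's formal semantics and its type-checking-like compatibility check; the pipe "writer" role and its events are hypothetical examples.

```python
# Toy illustration of checking a component port against a connector role:
# both are given as finite, prefix-closed sets of event traces, and
# "compatibility" is just trace containment. The article itself uses a formal
# process notation and a richer check; this is only a stand-in sketch.

def prefix_closure(traces):
    closed = {()}
    for t in traces:
        for i in range(1, len(t) + 1):
            closed.add(t[:i])
    return closed

def compatible(port_traces, role_traces):
    """Every behavior the port can exhibit must be allowed by the role."""
    return prefix_closure(port_traces) <= prefix_closure(role_traces)

# A hypothetical pipe connector's "writer" role: open, then writes, then close
# (enumerated here only up to two writes, since the sets must be finite).
writer_role = {("open", "close"),
               ("open", "write", "close"),
               ("open", "write", "write", "close")}

good_port = {("open", "write", "close")}
bad_port = {("write", "close")}          # writes before opening

print(compatible(good_port, writer_role))   # True
print(compatible(bad_port, writer_role))    # False
```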

Journal ArticleDOI
TL;DR: In this paper, the authors describe the key features of the new synthesis and its implications for the role of monetary policy and find that the New Neoclassical Synthesis rationalizes an activist monetary policy which is a simple system of inflation targets.
Abstract: Macroeconomics is moving toward a New Neoclassical Synthesis, which like the synthesis of the 1960s melds Classical with Keynesian ideas. This paper describes the key features of the new synthesis and its implications for the role of monetary policy. We find that the New Neoclassical Synthesis rationalizes an activist monetary policy which is a simple system of inflation targets. Under this "neutral" monetary policy, real quantities evolve as suggested in the literature on real business cycles. Going beyond broad principles, we use the new synthesis to address several operational aspects of inflation targeting. These include its practicality, the response to oil shocks, the choice of price index, the design of a mandate, and the tactics of interest rate policy.

Journal ArticleDOI
TL;DR: In this paper, the authors use data on daily observations of wages and hours for New York City cab drivers to estimate the supply response to transitory fluctuations in wages and find that wage elasticities are persistently negative.
Abstract: Life-cycle models of labor supply predict a positive relationship between hours supplied and transitory changes in wages because such changes have virtually no effect on life-cycle wealth. Previous attempts to test this hypothesis empirically with time-series data have not been supportive; estimated elasticities are typically negative or nonsignificant. Such analyses, however, are vulnerable to measurement error and other estimation problems. We use data on daily observations of wages and hours for New York City cab drivers to estimate the supply response to transitory fluctuations in wages. Cab drivers decide daily how many hours to supply, and face wages that are positively correlated within days, but largely uncorrelated between days. Using these data, our central finding is that wage elasticities are persistently negative–from -.5 to -1 in three different samples–even after correcting for measurement error using instrumental variables. These negative wage elasticities challenge the notion that cab drivers trade off labor and leisure at different points in time and question the empirical adequacy of life-cycle formulations of labor supply.
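
To make the elasticity estimate concrete, the sketch below runs the basic log-hours on log-wage regression on simulated daily data with a built-in elasticity of -0.7, roughly in the reported range. The data are synthetic and the paper's instrumental-variables correction for measurement error is omitted.

```python
# Worked illustration of the kind of estimate reported: regress log daily
# hours on log daily wage and read the slope as the wage elasticity.
# The data are simulated with a true elasticity of -0.7; this is not the
# study's data, and the IV correction for measurement error is omitted.
import numpy as np

rng = np.random.default_rng(0)
log_wage = rng.normal(np.log(25), 0.2, size=300)    # log of each day's average hourly wage (simulated)
log_hours = 2.2 - 0.7 * (log_wage - np.log(25)) + 0.1 * rng.normal(size=300)

X = np.column_stack([np.ones_like(log_wage), log_wage])
intercept, elasticity = np.linalg.lstsq(X, log_hours, rcond=None)[0]
print(round(elasticity, 2))   # close to -0.7: higher-wage days, fewer hours
```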

Journal ArticleDOI
15 Oct 1997-JAMA
TL;DR: Although smoking, poor sleep quality, alcohol abstinence, low dietary intake of vitamin C, elevated catecholamine levels, and being introverted were all associated with greater susceptibility to colds, they could only partially account for the relation between social network diversity and incidence of colds.
Abstract: Objective. —To examine the hypothesis that diverse ties to friends, family, work, and community are associated with increased host resistance to infection. Design. —After reporting the extent of participation in 12 types of social ties (eg, spouse, parent, friend, workmate, member of social group), subjects were given nasal drops containing 1 of 2 rhinoviruses and monitored for the development of a common cold. Setting. —Quarantine. Participants. —A total of 276 healthy volunteers, aged 18 to 55 years, neither seropositive for human immunodeficiency virus nor pregnant. Outcome Measures. —Colds (illness in the presence of a verified infection), mucus production, mucociliary clearance function, and amount of viral replication. Results. —In response to both viruses, those with more types of social ties were less susceptible to common colds, produced less mucus, were more effective in ciliary clearance of their nasal passages, and shed less virus. These relationships were unaltered by statistical controls for prechallenge virus-specific antibody, virus type, age, sex, season, body mass index, education, and race. Susceptibility to colds decreased in a dose-response manner with increased diversity of the social network. There was an adjusted relative risk of 4.2 comparing persons with fewest (1 to 3) to those with most (6 or more) types of social ties. Although smoking, poor sleep quality, alcohol abstinence, low dietary intake of vitamin C, elevated catecholamine levels, and being introverted were all associated with greater susceptibility to colds, they could only partially account for the relation between social network diversity and incidence of colds. Conclusions. —More diverse social networks were associated with greater resistance to upper respiratory illness.

Journal ArticleDOI
TL;DR: In this article, a leading depiction of the evolution of new industries, the product life cycle, is used to organize the evidence. It is shown that many industries evolve through their formative eras in line with the product life cycle, but regular patterns occur when industries are mature that are not predicted by the product life cycle.
Abstract: Evidence on entry, exit, firm survival, innovation and firm structure in new industries is reviewed to assess whether industries proceed through regular cycles as they age. A leading depiction of the evolution of new industries, the product life cycle, is used to organize the evidence. It is shown that the product life cycle captures the way many industries evolve through their formative eras, but regular patterns occur when industries are mature that are not predicted by the product life cycle. Regularities in entry, exit, firm survival and firm structure are also developed for industries whose evolution departs significantly from the product life cycle. Opportunities for further research on the nature of industry life cycles and the factors that condition which life cycle pattern an industry follows are discussed. Copyright 1997 by Oxford University Press.

Journal ArticleDOI
TL;DR: In this article, the authors review studies conducted by themselves and coauthors that document a "self-serving" bias in judgments of fairness and demonstrate that the bias is an important cause of impasse in negotiations.
Abstract: The authors review studies conducted by themselves and coauthors that document a 'self-serving' bias in judgments of fairness and demonstrate that the bias is an important cause of impasse in negotiations. They discuss experimental evidence showing that (1) the bias causes impasse; (2) it is possible to reduce impasses by debiasing bargainers; and (3) the bias results from selective evaluation of information. The authors also review results from a field study of negotiations between teachers' unions and school boards in Pennsylvania that both documents the fairness bias in a naturalistic setting and demonstrates its impact on strikes.

Journal ArticleDOI
TL;DR: The chemical interactions of hydrophobic organic contaminants (HOCs) with soils and sediments (geosorbents) may result in strong binding and slow subsequent release rates that significantly affect remediation rates and endpoints.
Abstract: The chemical interactions of hydrophobic organic contaminants (HOCs) with soils and sediments (geosorbents) may result in strong binding and slow subsequent release rates that significantly affect remediation rates and endpoints. The underlying physical and chemical phenomena potentially responsible for this apparent sequestration of HOCs by geosorbents are not well understood. This challenges our concepts for assessing exposure and toxicity and for setting environmental quality criteria. Currently there are no direct observational data revealing the molecular-scale locations in which nonpolar organic compounds accumulate when associated with natural soils or sediments. Hence macroscopic observations are used to make inferences about sorption mechanisms and the chemical factors affecting the sequestration of HOCs by geosorbents. Recent observations suggest that HOC interactions with geosorbents comprise different inorganic and organic surfaces and matrices, and distinctions may be drawn along these lines,

Journal ArticleDOI
TL;DR: In this article, the authors present a technique for constructing random fields from a set of training samples, where each feature has a weight that is trained by minimizing the Kullback-Leibler divergence between the model and the empirical distribution of the training data.
Abstract: We present a technique for constructing random fields from a set of training samples. The learning paradigm builds increasingly complex fields by allowing potential functions, or features, that are supported by increasingly large subgraphs. Each feature has a weight that is trained by minimizing the Kullback-Leibler divergence between the model and the empirical distribution of the training data. A greedy algorithm determines how features are incrementally added to the field and an iterative scaling algorithm is used to estimate the optimal values of the weights. The random field models and techniques introduced in this paper differ from those common to much of the computer vision literature in that the underlying random fields are non-Markovian and have a large number of parameters that must be estimated. Relations to other learning approaches, including decision trees, are given. As a demonstration of the method, we describe its application to the problem of automatic word classification in natural language processing.
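
A minimal sketch of fitting feature weights in a log-linear (random field) model for word classification: maximize the training log-likelihood, which is equivalent to minimizing the KL divergence to the empirical distribution, by matching empirical and model feature expectations. Plain gradient ascent stands in for the paper's iterative scaling, the feature set is fixed rather than greedily induced, and the words and features are toy assumptions.

```python
# Minimal sketch of weight estimation in a log-linear (random field) model
# over a small label set. Gradient ascent replaces the paper's iterative
# scaling, and features are fixed rather than greedily induced.
import numpy as np

labels = ["noun", "verb"]
# Toy feature functions f_k(word, label) for word classification.
features = [
    lambda w, y: 1.0 if w.endswith("ing") and y == "verb" else 0.0,
    lambda w, y: 1.0 if w.endswith("tion") and y == "noun" else 0.0,
    lambda w, y: 1.0 if y == "noun" else 0.0,           # label bias feature
]
data = [("running", "verb"), ("station", "noun"),
        ("nation", "noun"), ("walking", "verb"), ("dog", "noun")]

def probs(w, lam):
    scores = np.array([sum(l * f(w, y) for l, f in zip(lam, features))
                       for y in labels])
    e = np.exp(scores - scores.max())
    return e / e.sum()

lam = np.zeros(len(features))
for _ in range(500):                       # gradient ascent on log-likelihood
    grad = np.zeros_like(lam)
    for w, y in data:
        p = probs(w, lam)
        for k, f in enumerate(features):
            # empirical feature value minus model expectation of the feature
            grad[k] += f(w, y) - sum(p[j] * f(w, yj)
                                     for j, yj in enumerate(labels))
    lam += 0.1 * grad / len(data)

print(dict(zip(labels, probs("washing", lam).round(3))))   # should favor "verb"
```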

Journal ArticleDOI
TL;DR: The homogeneous atom transfer radical polymerization (ATRP) of styrene using solubilizing 4,4′-dialkyl-substituted 2,2′-bipyridines yielded well-defined polymers with Mw/Mn ≤ 1.10 as mentioned in this paper.
Abstract: The homogeneous atom transfer radical polymerization (ATRP) of styrene using solubilizing 4,4′-dialkyl-substituted 2,2′-bipyridines yielded well-defined polymers with Mw/Mn ≤ 1.10. The polymerizations exhibited an increase in molecular weight in direct proportion to the ratio of the monomer consumed to the initial initiator concentration and also exhibited internal first-order kinetics with respect to monomer concentration. The optimum ratio of ligand to copper(I) halide for these polymerizations was found to be 2:1, which tentatively indicates that the coordination sphere of the active copper(I) center contains two bipyridine ligands. The exclusive role for this copper(I) complex in ATRP is atom transfer, since at typical concentrations that occur for these polymerizations (≈10⁻⁷−10⁻⁸ M), polymeric radicals were found not to react with the copper(I) center in any manner that enhanced or detracted from the observed control. ATRP also exhibited first-order kinetics with respect to both initiator and copper...

Proceedings ArticleDOI
01 Oct 1997
TL;DR: The design of Odyssey, a prototype implementing application-aware adaptation, is described, along with how it supports concurrent execution of diverse mobile applications; agility is identified as a key attribute of adaptive systems.
Abstract: In this paper we show that application-aware adaptation, a collaborative partnership between the operating system and applications, offers the most general and effective approach to mobile information access. We describe the design of Odyssey, a prototype implementing this approach, and show how it supports concurrent execution of diverse mobile applications. We identify agility as a key attribute of adaptive systems, and describe how to quantify and measure it. We present the results of our evaluation of Odyssey, indicating performance improvements up to a factor of 5 on a benchmark of three applications concurrently using remote services over a network with highly variable bandwidth.
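
A toy sketch of the application-aware adaptation idea: the application registers the resource window it can tolerate, the system issues an upcall when the observed resource leaves that window, and the application responds by changing fidelity. The class and method names are hypothetical stand-ins, not Odyssey's actual interface.

```python
# Toy sketch of collaborative, application-aware adaptation: the system
# monitors a resource, and applications adapt when it leaves their declared
# tolerance window. Names here are hypothetical, not Odyssey's API.

class ResourceMonitor:
    def __init__(self):
        self.registrations = []           # (low, high, callback)

    def register(self, low, high, callback):
        self.registrations.append((low, high, callback))

    def observe(self, bandwidth_kbps):
        for low, high, callback in self.registrations:
            if not (low <= bandwidth_kbps <= high):
                callback(bandwidth_kbps)  # upcall: resource left the window

class VideoPlayer:
    def __init__(self, monitor):
        self.quality = "high"
        # Tolerate 200 kbps and up at high quality; adapt outside that window.
        monitor.register(200, float("inf"), self.adapt)

    def adapt(self, bandwidth_kbps):
        self.quality = "low" if bandwidth_kbps < 200 else "high"
        print(f"bandwidth {bandwidth_kbps} kbps -> switching to {self.quality} quality")

monitor = ResourceMonitor()
player = VideoPlayer(monitor)
for bw in (500, 350, 90, 60):             # simulated bandwidth samples
    monitor.observe(bw)
```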

Proceedings ArticleDOI
01 Jun 1997
TL;DR: A simple randomized algorithm for accessing shared objects that tends to satisfy each access request with a nearby copy is designed, based on a novel mechanism to maintain and distribute information about object locations, and requires only a small amount of additional memory at each node.
Abstract: Consider a set of shared objects in a distributed network, where several copies of each object may exist at any given time. To ensure both fast access to the objects as well as efficient utilization of network resources, it is desirable that each access request be satisfied by a copy "close" to the requesting node. Unfortunately, it is not clear how to efficiently achieve this goal in a dynamic, distributed environment in which large numbers of objects are continuously being created, replicated, and destroyed. In this paper, we design a simple randomized algorithm for accessing shared objects that tends to satisfy each access request with a nearby copy. The algorithm is based on a novel mechanism to maintain and distribute information about object locations, and requires only a small amount of additional memory at each node. We analyze our access scheme for a class of cost functions that captures the hierarchical nature of wide-area networks. We show that under the particular cost model considered: (i) the expected cost of an individual access is asymptotically optimal, and (ii) if objects are sufficiently large, the memory used for objects dominates the additional memory used by our algorithm with high probability. We also address dynamic changes in both the network as well as the set of object copies.

Journal ArticleDOI
TL;DR: In this paper, a new visual medium, Virtualized Reality, immerses viewers in a virtual reconstruction of real-world events, which consists of real images and depth information computed from these images.
Abstract: A new visual medium, Virtualized Reality, immerses viewers in a virtual reconstruction of real-world events. The Virtualized Reality world model consists of real images and depth information computed from these images. Stereoscopic reconstructions provide a sense of complete immersion, and users can select their own viewpoints at view time, independent of the actual camera positions used to capture the event.

Journal ArticleDOI
TL;DR: In this paper, the authors examine the globalization of innovation and the phenomenon of foreign direct investment (FDI) in research and development. To do so, they draw from a survey of foreign-affiliated R&D laboratories in the United States.

Proceedings ArticleDOI
03 Aug 1997
TL;DR: This paper describes the use of DG interfaces for several parameter-setting problems: light selection and placement for image rendering, both standard and image-based; opacity and color transfer-function specification for volume rendering; and motion control for particle-system and articulated-figure animation.
Abstract: Image rendering maps scene parameters to output pixel values; animation maps motion-control parameters to trajectory values. Because these mapping functions are usually multidimensional, nonlinear, and discontinuous, finding input parameters that yield desirable output values is often a painful process of manual tweaking. Interactive evolution and inverse design are two general methodologies for computer-assisted parameter setting in which the computer plays a prominent role. In this paper we present another such methodology. Design GalleryTM (DG) interfaces present the user with the broadest selection, automatically generated and organized, of perceptually different graphics or animations that can be produced by varying a given input-parameter vector. The principal technical challenges posed by the DG approach are dispersion, finding a set of input-parameter vectors that optimally disperses the resulting output-value vectors, and arrangement, organizing the resulting graphics for easy and intuitive browsing by the user. We describe the use of DG interfaces for several parameter-setting problems: light selection and placement for image rendering, both standard and image-based; opacity and color transfer-function specification for volume rendering; and motion control for particle-system and articulated-figure animation. CR Categories: I.2.6 [Artificial Intelligence]: Problem Solving, Control Methods and Search—heuristic methods; I.3.6 [Computer Graphics]: Methodology and Techniques—interaction techniques; I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism.
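
The dispersion step described above can be sketched with a greedy farthest-point selection: sample many input-parameter vectors, map each through the black-box rendering or animation function, and keep the subset whose output vectors are maximally spread out. The mapping, distance measure, and sample sizes below are stand-in assumptions, and the arrangement/browsing step is not shown.

```python
# Minimal sketch of the dispersion step: sample candidate input-parameter
# vectors, evaluate the (black-box) mapping to output values, and greedily
# keep a maximally spread-out subset for the gallery. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def render(params):
    # Stand-in for the parameter -> output-value mapping (nonlinear on purpose).
    return np.array([np.sin(3 * params[0]) * params[1], params[1] ** 2])

candidates = rng.uniform(0, 1, size=(500, 2))          # input-parameter vectors
outputs = np.array([render(p) for p in candidates])

def disperse(outputs, k):
    chosen = [0]                                        # start from any candidate
    for _ in range(k - 1):
        dists = np.min(
            [np.linalg.norm(outputs - outputs[c], axis=1) for c in chosen], axis=0)
        chosen.append(int(np.argmax(dists)))            # farthest from chosen set
    return chosen

gallery = disperse(outputs, k=8)
print(candidates[gallery])   # 8 settings whose outputs look maximally different
```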

Journal ArticleDOI
TL;DR: It is concluded that face recognition normally depends on two systems: a holistic, face-specific system that is dependent on orientation-specific coding of second-order relational features (internal) and a part-based object-recognition system, which is damaged in CK and which contributes to face recognition when the face stimulus does not satisfy the domain-specific conditions needed to activate the face system.
Abstract: In order to study face recognition in relative isolation from visual processes that may also contribute to object recognition and reading, we investigated CK, a man with normal face recognition but with object agnosia and dyslexia caused by a closed-head injury. We administered recognition tests of upright faces, of family resemblance, of age-transformed faces, of caricatures, of cartoons, of inverted faces, of face features, of disguised faces, of perceptually degraded faces, of fractured faces, of face parts, and of faces whose parts were made of objects. We compared CK's performance with that of at least 12 control participants. We found that CK performed as well as controls as long as the face was upright and retained the configurational integrity among the internal facial features, the eyes, nose, and mouth. This held regardless of whether the face was disguised or degraded and whether the face was represented as a photo, a caricature, a cartoon, or a face composed of objects. In the last case, CK perceived the face but, unlike controls, was rarely aware that it was composed of objects. When the face, or just the internal features, were inverted or when the configurational gestalt was broken by fracturing the face or misaligning the top and bottom halves, CK's performance suffered far more than that of controls. We conclude that face recognition normally depends on two systems: (1) a holistic, face-specific system that is dependent on orientation-specific coding of second-order relational features (internal), which is intact in CK and (2) a part-based object-recognition system, which is damaged in CK and which contributes to face recognition when the face stimulus does not satisfy the domain-specific conditions needed to activate the face system.

Journal ArticleDOI
TL;DR: This survey describes ways in which locally weighted learning, a type of lazy learning, has been applied to control tasks, and explains the various forms that control tasks can take.
Abstract: Lazy learning methods provide useful representations and training algorithms for learning about complex phenomena during autonomous adaptive control of complex systems. This paper surveys ways in which locally weighted learning, a type of lazy learning, has been applied by us to control tasks. We explain various forms that control tasks can take, and how this affects the choice of learning paradigm. The discussion section explores the interesting impact that explicitly remembering all previous experiences has on the problem of learning to control.