Journal ArticleDOI

Proactive computing

David Tennenhouse1
01 May 2000-Communications of The ACM-Vol. 43, Iss: 5, pp 43-50
TL;DR: The computer science research community now enjoys a rare and exciting opportunity to redefine its agenda and establish the new goals that will propel society beyond interactive computing and the human/machine breakpoint.
Abstract: For the past 40 years, most of the IT research community has focused on interactive computing, J.C.R. Licklider’s powerful and human-centered vision of human-computer symbiosis [3]. In tandem with this research has come the creation of an IT industry that is hurtling toward the human/machine/network breakpoint—the point at which the number of networked interactive computers will surpass the number of people on the planet. We still have a long way to go before Licklider’s vision is attained—and are many years from extending per-capita penetration to most parts of the world. However, “missing science” may no longer be the factor limiting progress toward these long-cherished goals. It is reasonable, though perhaps heretical, to suggest that refinements of the existing science base will be sufficient to drive these efforts forward. It is time for a change. The computer science research community now enjoys a rare and exciting opportunity to redefine its agenda and establish the new goals that will propel society beyond interactive computing and the human/machine breakpoint. In lifting our sights toward a world in which networked computers outnumber human beings by a hundred or thousand to one, we should consider what these “excess” computers will be doing and craft a research agenda that can lead to increased human productivity and quality of life.
Citations
Proceedings ArticleDOI
16 Jul 2001
TL;DR: A suite of security building blocks optimized for resource-constrained environments and wireless communication is presented and shown to be practical even on minimal hardware: the performance of the protocol suite easily matches the data rate of the network.
Abstract: As sensor networks edge closer towards wide-spread deployment, security issues become a central concern. So far, much research has focused on making sensor networks feasible and useful, and has not concentrated on security. We present a suite of security building blocks optimized for resource-constrained environments and wireless communication. SPINS has two secure building blocks: SNEP and μTESLA. SNEP provides the following important baseline security primitives: data confidentiality, two-party data authentication, and data freshness. A particularly hard problem is to provide efficient broadcast authentication, which is an important mechanism for sensor networks. μTESLA is a new protocol which provides authenticated broadcast for severely resource-constrained environments. We implemented the above protocols, and show that they are practical even on minimal hardware: the performance of the protocol suite easily matches the data rate of our network. Additionally, we demonstrate that the suite can be used for building higher level protocols.

2,703 citations
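A minimal sketch of the delayed-key-disclosure idea behind μTESLA, for readers unfamiliar with the construction: the sender commits to a one-way hash chain and releases each MAC key a few intervals after the packets it authenticates. The chain length, hash/MAC choices, and interval handling below are simplifying assumptions, not the SPINS implementation.

```python
# Illustrative sketch of muTESLA-style delayed key disclosure (not the
# authors' code). Chain length, interval numbering, and MAC choice are
# simplified assumptions.
import hashlib, hmac

def make_key_chain(seed: bytes, length: int) -> list[bytes]:
    """Build a one-way key chain: chain[i] = H(chain[i+1])."""
    chain = [b""] * (length + 1)
    chain[length] = hashlib.sha256(seed).digest()
    for i in range(length - 1, -1, -1):
        chain[i] = hashlib.sha256(chain[i + 1]).digest()
    return chain  # chain[0] serves as the public commitment

def mac(key: bytes, msg: bytes) -> bytes:
    return hmac.new(key, msg, hashlib.sha256).digest()

def verify_disclosed_key(commitment: bytes, key: bytes, interval: int) -> bool:
    """A receiver hashes the disclosed key `interval` times and checks that it
    reaches the commitment K_0."""
    k = key
    for _ in range(interval):
        k = hashlib.sha256(k).digest()
    return k == commitment

# Usage: the sender MACs packets of interval i with chain[i] and discloses
# chain[i] a few intervals later; receivers buffer packets until the key arrives.
chain = make_key_chain(b"secret-seed", 10)
commitment = chain[0]
pkt, tag = b"reading=42", mac(chain[3], b"reading=42")
assert verify_disclosed_key(commitment, chain[3], 3)
assert hmac.compare_digest(tag, mac(chain[3], pkt))
```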

Journal ArticleDOI
TL;DR: A suite of security protocols optimized for sensor networks, SPINS, comprising the building blocks SNEP and μTESLA, is shown to be practical even on minimal hardware: the performance of the protocol suite easily matches the data rate of the network.
Abstract: Wireless sensor networks will be widely deployed in the near future. While much research has focused on making these networks feasible and useful, security has received little attention. We present a suite of security protocols optimized for sensor networks: SPINS. SPINS has two secure building blocks: SNEP and μTESLA. SNEP includes: data confidentiality, two-party data authentication, and evidence of data freshness. μTESLA provides authenticated broadcast for severely resource-constrained environments. We implemented the above protocols, and show that they are practical even on minimal hardware: the performance of the protocol suite easily matches the data rate of our network. Additionally, we demonstrate that the suite can be used for building higher level protocols.

2,298 citations
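For illustration, a sketch of the SNEP pattern of counter-mode encryption plus a MAC computed over a shared counter, which is what yields confidentiality, authentication, and weak freshness with little per-packet overhead. SNEP builds these primitives from RC5; the SHA-256 keystream and HMAC below are stand-ins chosen only to keep the sketch self-contained.

```python
# Illustrative sketch of SNEP-style counter-mode encryption plus a MAC over
# the counter. Not the paper's RC5-based construction; a SHA-256 keystream
# stands in purely for illustration.
import hashlib, hmac

def keystream(key: bytes, counter: int, n: int) -> bytes:
    out, block = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")
                              + block.to_bytes(4, "big")).digest()
        block += 1
    return out[:n]

def snep_send(enc_key: bytes, mac_key: bytes, counter: int, plaintext: bytes):
    ct = bytes(a ^ b for a, b in zip(plaintext, keystream(enc_key, counter, len(plaintext))))
    tag = hmac.new(mac_key, counter.to_bytes(8, "big") + ct, hashlib.sha256).digest()
    return ct, tag

def snep_recv(enc_key: bytes, mac_key: bytes, counter: int, ct: bytes, tag: bytes):
    expected = hmac.new(mac_key, counter.to_bytes(8, "big") + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, tag):
        raise ValueError("authentication failed")
    return bytes(a ^ b for a, b in zip(ct, keystream(enc_key, counter, len(ct))))

# Both sides keep the shared counter in sync; covering it with the MAC (without
# sending it) is what gives weak freshness while keeping packets small.
ct, tag = snep_send(b"ek", b"mk", 7, b"temp=21.5")
assert snep_recv(b"ek", b"mk", 7, ct, tag) == b"temp=21.5"
```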

Proceedings ArticleDOI
22 Apr 2001
TL;DR: This work establishes the main highlight of the paper: an optimal polynomial-time worst- and average-case algorithm for coverage calculation, which answers questions about the quality of service (surveillance) that can be provided by a particular sensor network.
Abstract: Wireless ad-hoc sensor networks have recently emerged as a premier research topic. They have great long-term economic potential, ability to transform our lives, and pose many new system-building challenges. Sensor networks also pose a number of new conceptual and optimization problems. Some, such as location, deployment, and tracking, are fundamental issues, in that many applications rely on them for needed information. We address one of the fundamental problems, namely coverage. Coverage, in general, answers questions about the quality of service (surveillance) that can be provided by a particular sensor network. We first define the coverage problem from several points of view including deterministic, statistical, worst and best case, and present examples in each domain. By combining computational geometry and graph theoretic techniques, specifically the Voronoi diagram and graph search algorithms, we establish the main highlight of the paper: an optimal polynomial time worst and average case algorithm for coverage calculation. We also present comprehensive experimental results and discuss future research directions related to coverage in sensor networks.

1,837 citations
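The worst-case ("maximal breach") measure in this paper asks for the crossing path that stays as far from the sensors as possible; the authors compute it exactly with Voronoi diagrams and graph search. The grid-based approximation below is only meant to convey the bottleneck-path idea; the field size, step, and sensor positions are made up.

```python
# Grid-based sketch of the worst-case (maximal breach) coverage measure:
# the best left-to-right crossing path maximizes its minimum distance to any
# sensor. This discretized approximation is illustrative only; the exact
# method in the paper uses the Voronoi diagram.
import heapq, math

def breach_value(sensors, width, height, step=1.0):
    """Max over left-to-right grid paths of the minimum distance to any sensor."""
    nx, ny = int(width / step) + 1, int(height / step) + 1
    def dist(i, j):
        x, y = i * step, j * step
        return min(math.hypot(x - sx, y - sy) for sx, sy in sensors)
    best = [[-1.0] * ny for _ in range(nx)]
    heap = []
    for j in range(ny):                      # a path may start anywhere on the left edge
        heapq.heappush(heap, (-dist(0, j), 0, j))
    while heap:                              # max-bottleneck variant of Dijkstra
        d, i, j = heapq.heappop(heap)
        d = -d
        if d <= best[i][j]:
            continue
        best[i][j] = d
        if i == nx - 1:                      # first right-edge node popped is optimal
            return d
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            a, b = i + di, j + dj
            if 0 <= a < nx and 0 <= b < ny:
                heapq.heappush(heap, (-min(d, dist(a, b)), a, b))
    return 0.0

# Example with two made-up sensors in a 10x10 field.
print(breach_value([(2.0, 2.0), (8.0, 7.0)], width=10.0, height=10.0))
```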


Cites background from "Proactive computing"

  • ...Moreover, embedded web servers [1,3] can be used to connect the physical world of sensors and actuators to the virtual world of information utilities and services....


Journal ArticleDOI
TL;DR: It is argued that a multiagent system can naturally be viewed and architected as a computational organization, and the appropriate organizational abstractions that are central to the analysis and design of such systems are identified.
Abstract: Systems composed of interacting autonomous agents offer a promising software engineering approach for developing applications in complex domains. However, this multiagent system paradigm introduces a number of new abstractions and design/development issues when compared with more traditional approaches to software development. Accordingly, new analysis and design methodologies, as well as new tools, are needed to effectively engineer such systems. Against this background, the contribution of this article is twofold. First, we synthesize and clarify the key abstractions of agent-based computing as they pertain to agent-oriented software engineering. In particular, we argue that a multiagent system can naturally be viewed and architected as a computational organization, and we identify the appropriate organizational abstractions that are central to the analysis and design of such systems. Second, we detail and extend the Gaia methodology for the analysis and design of multiagent systems. Gaia exploits the aforementioned organizational abstractions to provide clear guidelines for the analysis and design of complex and open software systems. Two representative case studies are introduced to exemplify Gaia's concepts and to show its use and effectiveness in different types of multiagent system.

1,432 citations
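As a rough illustration of the organizational abstractions Gaia works with, the sketch below models roles (responsibilities, permissions, protocols) and agents that adopt them. The names and fields are invented for exposition; Gaia is a methodology, not a library, so none of this is its actual notation.

```python
# Toy sketch of organizational abstractions in the Gaia spirit: roles carry
# responsibilities, permissions, and interaction protocols; agents take on
# roles. All identifiers here are illustrative inventions.
from dataclasses import dataclass, field

@dataclass
class Role:
    name: str
    responsibilities: list[str]
    permissions: list[str]
    protocols: list[str]           # interaction protocols the role engages in

@dataclass
class Agent:
    name: str
    roles: list[Role] = field(default_factory=list)

    def can(self, resource: str) -> bool:
        # An agent's permissions are the union of its roles' permissions.
        return any(resource in r.permissions for r in self.roles)

# An "organization" is then the set of roles plus the protocols linking them.
reviewer = Role("QuoteReviewer",
                responsibilities=["evaluate incoming quotes"],
                permissions=["read quote-db"],
                protocols=["contract-net"])
broker = Agent("agent-17", roles=[reviewer])
assert broker.can("read quote-db")
```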


Cites background from "Proactive computing"

  • ...These characteristics apply, for example, to the semantic web [Berners-Lee et al. 2001], to grid computing [Foster and Kesselman 1999], and to pervasive environments [Abelson et al. 2000; Tennenhouse 2000]....


  • ...INTRODUCTION The characteristics and expectations of software systems have changed dramatically in the past few years, with the result that a range of new software engineering challenges have arisen [Tennenhouse 2000; Zambonelli and Parunak 2003]....


  • ...…applications scenarios (e.g., virtual enterprises [Ricci et al. 2002], global [Babaoglu et al. 2002] and pervasive computing [Estrin et al. 2002; Tennenhouse 2000]) where a system may need to frequently adapt its organizational structure to the prevailing situation, possibly at run-time…...


  • ...…to decide what actions it should take at what time [Wooldridge and Jennings 1995]) reflects the decentralized nature of modern distributed systems [Tennenhouse 2000] and can be considered as the natural extension to the notions of modularity and encapsulation for systems that are owned by…...


Proceedings ArticleDOI
11 Jun 2001
TL;DR: The experimental results demonstrate that by using only a subset of sensor nodes at each moment, the system achieves a significant energy savings while fully preserving coverage.
Abstract: Wireless sensor networks have emerged recently as an effective way of monitoring remote or inhospitable physical environments. One of the major challenges in devising such networks lies in the constrained energy and computational resources available to sensor nodes. These constraints must be taken into account at all levels of the system hierarchy. The deployment of sensor nodes is the first step in establishing a sensor network. Since sensor networks contain a large number of sensor nodes, the nodes must be deployed in clusters, where the location of each particular node cannot be fully guaranteed a priori. Therefore, the number of nodes that must be deployed in order to completely cover the whole monitored area is often higher than if a deterministic procedure were used. In networks with stochastically placed nodes, activating only the necessary number of sensor nodes at any particular moment can save energy. We introduce a heuristic that selects mutually exclusive sets of sensor nodes, where the members of each of those sets together completely cover the monitored area. The intervals of activity are the same for all sets, and only one of the sets is active at any time. The experimental results demonstrate that by using only a subset of sensor nodes at each moment, we achieve a significant energy savings while fully preserving coverage.

1,074 citations
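A greedy sketch of the core idea in this paper: partition the sensors into mutually exclusive sets, each of which covers all monitored points, so the sets can be activated in rotation to save energy. The greedy most-new-coverage rule and the discretized target grid are simplifications, not the authors' heuristic.

```python
# Illustrative greedy partition of sensors into disjoint covers of a set of
# target points; only one cover needs to be active at a time. Simplified
# stand-in for the paper's heuristic, with a made-up layout.
import math

def covers(sensor, point, sensing_range):
    return math.hypot(sensor[0] - point[0], sensor[1] - point[1]) <= sensing_range

def disjoint_covers(sensors, targets, sensing_range):
    remaining = set(range(len(sensors)))
    cover_sets = []
    while True:
        uncovered = set(range(len(targets)))
        chosen, pool = [], set(remaining)
        while uncovered and pool:
            # Pick the sensor covering the most still-uncovered targets.
            best = max(pool, key=lambda s: sum(
                covers(sensors[s], targets[t], sensing_range) for t in uncovered))
            gained = {t for t in uncovered if covers(sensors[best], targets[t], sensing_range)}
            if not gained:
                break
            chosen.append(best)
            pool.discard(best)
            uncovered -= gained
        if uncovered:            # remaining sensors cannot fully cover the area
            return cover_sets
        cover_sets.append(chosen)
        remaining -= set(chosen)

# Example: a 3x3 grid of targets and four sensors clustered near the center.
targets = [(x, y) for x in range(3) for y in range(3)]
sensors = [(1, 1), (1, 1.2), (0.8, 1), (1.1, 0.9)]
print(disjoint_covers(sensors, targets, sensing_range=2.0))
```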

References
Book
01 Jan 1969
TL;DR: A new edition of Simon's classic work on artificial intelligence adds a chapter that sorts out the current themes and tools for analyzing complexity and complex systems, taking into account important advances in cognitive psychology and the science of design while confirming and extending Simon's basic thesis that a physical symbol system has the necessary and sufficient means for intelligent action.
Abstract: Continuing his exploration of the organization of complexity and the science of design, this new edition of Herbert Simon's classic work on artificial intelligence adds a chapter that sorts out the current themes and tools -- chaos, adaptive systems, genetic algorithms -- for analyzing complexity and complex systems. There are updates throughout the book as well. These take into account important advances in cognitive psychology and the science of design while confirming and extending the book's basic thesis: that a physical symbol system has the necessary and sufficient means for intelligent action. The chapter "Economic Reality" has also been revised to reflect a change in emphasis in Simon's thinking about the respective roles of organizations and markets in economic systems.

11,845 citations

Journal ArticleDOI
TL;DR: Consider writing, perhaps the first information technology: The ability to capture a symbolic representation of spoken language for long-term storage freed information from the limits of individual memory.
Abstract: Specialized elements of hardware and software, connected by wires, radio waves and infrared, will soon be so ubiquitous that no-one will notice their presence.

9,073 citations

Journal ArticleDOI

6,484 citations


"Proactive computing" refers background in this paper

  • ...Let’s Get Physical Herbert Simon, Nobel laureate and a professor at Carnegie Mellon University, identified the importance of bridging the physical and virtual domains quite some time ago [6]....


Book ChapterDOI
14 Jun 1999
TL;DR: An analysis of why certain design decisions have been so difficult to clearly capture in actual code is presented, along with the basis for a new programming technique, called aspect-oriented programming, that makes it possible to clearly express programs involving such aspects.
Abstract: We have found many programming problems for which neither procedural nor object-oriented programming techniques are sufficient to clearly capture some of the important design decisions the program must implement. This forces the implementation of those design decisions to be scattered throughout the code, resulting in “tangled” code that is excessively difficult to develop and maintain. We present an analysis of why certain design decisions have been so difficult to clearly capture in actual code. We call the properties these decisions address aspects, and show that the reason they have been hard to capture is that they cross-cut the system's basic functionality. We present the basis for a new programming technique, called aspect-oriented programming, that makes it possible to clearly express programs involving such aspects, including appropriate isolation, composition and reuse of the aspect code. The discussion is rooted in systems we have built using aspect-oriented programming.

3,355 citations
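As a rough analogy to the separation this paper argues for, the sketch below keeps a crosscutting tracing concern in one Python decorator instead of tangling logging calls through every function. Real AOP languages weave aspects far more generally (across classes, fields, and call sites); this is only meant to illustrate the motivation.

```python
# A decorator as a stand-in for an aspect: the tracing concern lives in one
# place instead of being scattered ("tangled") through the business logic.
import functools, logging

logging.basicConfig(level=logging.INFO)

def traced(func):
    """The crosscutting 'tracing' concern, written once."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        logging.info("enter %s args=%r kwargs=%r", func.__name__, args, kwargs)
        result = func(*args, **kwargs)
        logging.info("exit  %s -> %r", func.__name__, result)
        return result
    return wrapper

# The business logic stays free of logging code.
@traced
def transfer(src: str, dst: str, amount: float) -> float:
    return amount

@traced
def close_account(acct: str) -> bool:
    return True

transfer("A", "B", 10.0)
close_account("A")
```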

Proceedings ArticleDOI
01 Sep 2001
TL;DR: This tutorial shows how to use AOP to implement crosscutting concerns in a concise modular way and includes a description of the underlying model, in terms of which a wide range of AOP languages can be understood.
Abstract: Aspect-oriented programming (AOP) is a technique for improving separation of concerns in software design and implementation. AOP works by providing explicit mechanisms for capturing the structure of crosscutting concerns. This tutorial shows how to use AOP to implement crosscutting concerns in a concise modular way. It works with AspectJ, a seamless aspect-oriented extension to the Java(tm) programming language, and with AspectC, an aspect-oriented extension to C in the style of AspectJ. It also includes a description of their underlying model, in terms of which a wide range of AOP languages can be understood.

3,187 citations
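To illustrate the pointcut-and-advice model the tutorial describes, the sketch below treats a "pointcut" as a name pattern over methods and "advice" as a wrapper applied wherever it matches. It is plain Python, not AspectJ or AspectC syntax, and the class and method names are invented.

```python
# Sketch of the pointcut/advice model in plain Python: a pointcut selects
# join points (here, methods by name pattern) and advice runs around them.
# AspectJ/AspectC express this declaratively; none of this is their syntax.
import fnmatch, functools

def apply_aspect(cls, pointcut: str, advice):
    """Wrap every public method of `cls` whose name matches the glob `pointcut`."""
    for name in list(vars(cls)):
        attr = getattr(cls, name)
        if callable(attr) and not name.startswith("__") and fnmatch.fnmatch(name, pointcut):
            setattr(cls, name, advice(attr))
    return cls

def before_and_after(func):
    """'Around' advice: run code before and after the advised method."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        print(f"before {func.__name__}")
        result = func(*args, **kwargs)
        print(f"after {func.__name__}")
        return result
    return wrapper

class Account:
    def deposit(self, amount): return amount
    def withdraw(self, amount): return -amount
    def balance(self): return 0

# Advise every public method of Account without touching its body.
apply_aspect(Account, "*", before_and_after)
Account().deposit(5)
```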