
Showing papers in "IEEE Computer in 2003"


Journal ArticleDOI
Jeffrey O. Kephart, David M. Chess
TL;DR: A 2001 IBM manifesto noted the almost impossible difficulty of managing current and planned computing systems, which require integrating several heterogeneous environments into corporate-wide computing systems that extend into the Internet.
Abstract: A 2001 IBM manifesto observed that a looming software complexity crisis - caused by applications and environments that number into the tens of millions of lines of code - threatened to halt progress in computing. The manifesto noted the almost impossible difficulty of managing current and planned computing systems, which require integrating several heterogeneous environments into corporate-wide computing systems that extend into the Internet. Autonomic computing, perhaps the most attractive approach to solving this problem, creates systems that can manage themselves when given high-level objectives from administrators. Systems manage themselves according to an administrator's goals. New components integrate as effortlessly as a new cell establishes itself in the human body. These ideas are not science fiction, but elements of the grand challenge to create self-managing computing systems.

6,527 citations
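
The Kephart-Chess article frames each self-managing element as an autonomic manager running a monitor-analyze-plan-execute (MAPE) control loop over the element it manages. A minimal Python sketch of that loop follows; the class layout and all sensor, effector, and objective names are illustrative assumptions, not the paper's design.

# Sketch of an autonomic manager's monitor-analyze-plan-execute loop.
# Objectives, sensors, and effectors are hypothetical stand-ins.
class AutonomicManager:
    def __init__(self, objectives, sensors, effectors):
        self.objectives = objectives  # {metric: predicate(value)}, the administrator's goals
        self.sensors = sensors        # {metric: callable observing the managed element}
        self.effectors = effectors    # {metric: callable that corrects that metric}

    def run_once(self):
        readings = {m: read() for m, read in self.sensors.items()}              # monitor
        violated = [m for m, ok in self.objectives.items()                      # analyze
                    if m in readings and not ok(readings[m])]
        actions = [self.effectors[m] for m in violated if m in self.effectors]  # plan
        for act in actions:                                                     # execute
            act()

# Example: keep average latency under 200 ms by scaling out when it drifts above.
manager = AutonomicManager(
    objectives={"latency_ms": lambda v: v < 200},
    sensors={"latency_ms": lambda: 250},                   # stand-in for a real probe
    effectors={"latency_ms": lambda: print("scaling out")},
)
manager.run_once()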


Journal ArticleDOI
TL;DR: Combining Web services to create higher level, cross-organizational business processes requires standards to model the interactions.
Abstract: Combining Web services to create higher level, cross-organizational business processes requires standards to model the interactions. Several standards are working their way through industry channels and into vendor products.

1,291 citations


Journal ArticleDOI
TL;DR: Although many view iterative and incremental development as a modern practice, its application dates as far back as the mid-1950s, with prominent software-engineering thought leaders from each succeeding decade supporting IID practices.
Abstract: Although many view iterative and incremental development as a modern practice, its application dates as far back as the mid-1950s. Prominent software-engineering thought leaders from each succeeding decade supported IID practices, and many large projects used them successfully. These practices may have differed in their details, but all had a common theme - to avoid a single-pass, sequential, document-driven, gated-step approach.

1,289 citations


Journal ArticleDOI
TL;DR: The other source of power dissipation in microprocessors, dynamic power, arises from the repeated capacitance charge and discharge on the output of the hundreds of millions of gates in today's chips.
Abstract: Off-state leakage is static power, current that leaks through transistors even when they are turned off. The other source of power dissipation in today's microprocessors, dynamic power, arises from the repeated capacitance charge and discharge on the output of the hundreds of millions of gates in today's chips. Until recently, only dynamic power was a significant source of power consumption, and Moore's law helped control it. However, power consumption has now become a primary microprocessor design constraint, one that researchers in both industry and academia will struggle to overcome in the next few years. Microprocessor design has traditionally focused on dynamic power consumption as a limiting factor in system integration. As feature sizes shrink below 0.1 micron, static power is posing new low-power design challenges.

1,233 citations
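
For reference, the first-order textbook models behind the abstract's two power sources (standard formulas, not spelled out in the excerpt above): dynamic power scales with the activity factor \alpha, switched capacitance C, supply voltage V_{dd}, and clock frequency f, while static power is the supply voltage times the leakage current, which grows roughly exponentially as the threshold voltage V_{th} scales down (v_T is the thermal voltage, n a process-dependent factor):

P_{\mathrm{dynamic}} \approx \alpha\, C\, V_{dd}^{2}\, f, \qquad P_{\mathrm{static}} = V_{dd}\, I_{\mathrm{leak}}, \qquad I_{\mathrm{leak}} \propto e^{-V_{th}/(n\, v_T)}

This is why voltage scaling, which attacks the quadratic V_{dd}^2 term, has been so effective against dynamic power, and why lowering V_{th} to preserve speed at low V_{dd} makes leakage the new constraint.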


Journal ArticleDOI
Debashis Saha, Amitava Mukherjee
TL;DR: Pervasive computing is close to technical and economic viability; it recasts the computing environment as an information-enhanced physical space, not a virtual environment that exists to store and run software.
Abstract: Pervasive computing promises to make life simpler via digital environments that sense, adapt, and respond to human needs. Yet we still view computers as machines that run programs in a virtual environment. Pervasive computing presumes a different vision. A device can be a portal into an application-data space, not just a repository of custom software a user must manage. An application is a means by which a user performs a task, not software written to exploit a device's capabilities. And a computing environment is an information-enhanced physical space, not a virtual environment that exists to store and run software. Pervasive computing is close to technical and economic viability.

722 citations


Book ChapterDOI
TL;DR: There is a growing interest in Networks on Chips (NoC) that is related to the evolution of integrated circuit technology and to the growing requirements in performance and portability of electronic systems.
Abstract: We are witnessing a growing interest in Networks on Chips (NoC) that is related to the evolution of integrated circuit technology and to the growing requirements in performance and portability of electronic systems. Current integrated circuits contain several processing cores, and even relatively simple systems, such as cellular telephones, behave as multiprocessors. Moreover, many electronic systems consist of heterogeneous components and they require efficient on-chip communication. In the last few years, multiprocessing platforms have been developed to address high-performance computation, such as image rendering. Examples are Sony's Emotion Engine [OKA] and IBM's Cell chip [PHAM], where on-chip communication efficiency is key to the overall system performance.

641 citations


Journal ArticleDOI
TL;DR: The software as a service model composes services dynamically, as needed, by binding several lower-level services - thus overcoming many limitations that constrain traditional software use, deployment, and evolution.
Abstract: The software as a service model composes services dynamically, as needed, by binding several lower-level services - thus overcoming many limitations that constrain traditional software use, deployment, and evolution.

576 citations
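
As a loose illustration of the late-binding idea in this abstract (toy Python; the registry and service names are hypothetical, not from the article), the application resolves the lower-level services it needs by name at call time rather than linking them in at build time:

# Hypothetical service registry; bindings can change between calls.
registry = {
    "tax": lambda amount: amount * 0.20,
    "shipping": lambda amount: 5.0 if amount < 100 else 0.0,
}

def compose_invoice(amount, service_names, registry):
    # Bind each lower-level service dynamically, as needed.
    return amount + sum(registry[name](amount) for name in service_names)

print(compose_invoice(80.0, ["tax", "shipping"], registry))  # 101.0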


Journal ArticleDOI
TL;DR: The miniature wireless sensor nodes developed from low-cost off-the-shelf components at the University of California, Berkeley, as part of its Smart Dust project, establish a self-organizing sensor network when dispersed into an environment.
Abstract: Sensor networks offer economically viable solutions for a variety of applications. For example, current implementations monitor factory instrumentation, pollution levels, freeway traffic, and the structural integrity of buildings. Other applications include climate sensing and control in office buildings and home environmental sensing systems for temperature, light, moisture, and motion. Sensor networks are key to the creation of smart spaces, which embed information technology in everyday home and work environments. The miniature wireless sensor nodes, or motes, developed from low-cost off-the-shelf components at the University of California, Berkeley, as part of its Smart Dust project, establish a self-organizing sensor network when dispersed into an environment. The privacy and security issues posed by sensor networks represent a rich field of research problems. Improving network hardware and software may address many of the issues, but others will require new supporting technologies.

555 citations


Journal ArticleDOI
TL;DR: Based on a metamodel with formal semantics that developers can use to capture designs, Metropolis provides an environment for complex electronic-system design that supports simulation, formal analysis, and synthesis.
Abstract: Today, the design chain lacks adequate support, with most system-level designers using a collection of unlinked tools. The implementation then proceeds with informal techniques involving numerous human-language interactions that create unnecessary and unwanted iterations among groups of designers in different companies or different divisions. The move toward programmable platforms shifts the design implementation task toward embedded software design. When embedded software reaches the complexity typical of today's designs, the risk that the software will not function correctly increases exponentially. The Metropolis project seeks to develop a unified framework that can cope with this challenge. Based on a metamodel with formal semantics that developers can use to capture designs, Metropolis provides an environment for complex electronic-system design that supports simulation, formal analysis, and synthesis.

549 citations


Journal ArticleDOI
TL;DR: Commercial-server energy management now focuses on conserving power in the memory and microprocessor subsystems; system-wide approaches suit these multiprocessor environments better than techniques that primarily apply to single-application environments, such as those based on compiler optimizations.
Abstract: Servers - high-end, multiprocessor systems running commercial workloads - have typically included extensive cooling systems and resided in custom-built rooms for high-power delivery. Recently, as transistor density and demand for computing resources have rapidly increased, even these high-end systems face energy-use constraints. Commercial-server energy management now focuses on conserving power in the memory and microprocessor subsystems. Because their workloads are typically structured as multiple application programs, system-wide approaches are more applicable to multiprocessor environments in commercial servers than techniques that primarily apply to single-application environments, such as those based on compiler optimizations.

482 citations


Journal ArticleDOI
TL;DR: Research in battery-aware optimization is now moving from stand-alone devices to networks of wireless devices, specifically, ad hoc and distributed sensor networks.
Abstract: Advances in battery technology have not kept pace with rapidly growing energy demands. Most laptops, handheld PCs, and cell phones use batteries that take anywhere from 1.5 to 4 hours to fully charge but can run on this charge for only a few hours. The battery has thus become a key control parameter in the energy management of portables. To meet the stringent power budget of these devices, researchers have explored various architectural, hardware, software, and system-level optimizations to minimize the energy consumed per useful computation. Research in battery-aware optimization is now moving from stand-alone devices to networks of wireless devices, specifically, ad hoc and distributed sensor networks. Computationally feasible mathematical models are now available that capture battery discharge characteristics in sufficient detail to let designers develop an optimization strategy that extracts maximum charge.
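
One classic, computationally cheap discharge model of the kind this abstract alludes to is Peukert's law (a standard battery model, offered here as background rather than as the article's specific model): at a constant discharge current I, a battery of rated capacity C delivers usable runtime

t = \frac{C}{I^{k}}, \qquad k \approx 1.1\text{--}1.3 \;\text{for typical cells}

where k = 1 would be an ideal battery. Because k > 1, higher currents extract disproportionately less total charge - the rate-capacity effect that battery-aware schedulers exploit by spreading load over time.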

Journal ArticleDOI
TL;DR: The authors present a risk-based approach for structuring projects to incorporate both agile and plan-driven approaches in proportion to a project's needs.
Abstract: Both agile and plan-driven approaches have situation-dependent shortcomings that, if not addressed, can lead to project failure. The challenge is to balance the two approaches to take advantage of their strengths in a given situation while compensating for their weaknesses. The authors present a risk-based approach for structuring projects to incorporate both agile and plan-driven approaches in proportion to a project's needs.

Journal ArticleDOI
TL;DR: Currently, the focus is on determining how to blend agile methodologies with plan-driven approaches to software development.
Abstract: Currently, the focus is on determining how to blend agile methodologies with plan-driven approaches to software development.

Journal ArticleDOI
TL;DR: The central idea behind stream processing is to organize an application into streams and kernels to expose the inherent locality and concurrency in media-processing applications.
Abstract: The demand for flexibility in media processing motivates the use of programmable processors. Stream processing bridges the gap between inflexible special-purpose solutions and current programmable architectures that cannot meet the computational demands of media-processing applications. The central idea behind stream processing is to organize an application into streams and kernels to expose the inherent locality and concurrency in media-processing applications. The article reports the performance of the Imagine stream processor on these media applications.
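
A minimal sketch of the streams-and-kernels decomposition (illustrative Python, not the Imagine programming model itself): kernels are small stateless functions applied record by record to data flowing through streams, which makes both the locality and the record-level parallelism explicit.

def kernel(fn):
    # A kernel consumes one record at a time and keeps no cross-record state,
    # so records could be processed in parallel and stay local to the kernel.
    def apply_to_stream(stream):
        for record in stream:
            yield fn(record)
    return apply_to_stream

# Two kernels from a toy image pipeline, chained through streams.
to_gray = kernel(lambda px: 0.299 * px[0] + 0.587 * px[1] + 0.114 * px[2])
threshold = kernel(lambda v: 255 if v > 128 else 0)

pixels = [(200, 180, 90), (10, 20, 30)]        # the input stream
print(list(threshold(to_gray(pixels))))        # [255, 0]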

Journal ArticleDOI
TL;DR: A collaborative handheld system extends the instant messaging paradigm by adding context-awareness to support the intensive and distributed nature of information management within a hospital setting.
Abstract: A collaborative handheld system extends the instant messaging paradigm by adding context-awareness to support the intensive and distributed nature of information management within a hospital setting.

Journal ArticleDOI
Wayne Wolf
TL;DR: The term hardware/software codesign, coined about 10 years ago, describes a confluence of problems in integrated circuit design; multiple disciplines, from computer architecture to real-time system theory and computer-aided design, inform it.
Abstract: The term hardware/software codesign, coined about 10 years ago, describes a confluence of problems in integrated circuit design. By the 1990s, it became clear that microprocessor-based systems would be an important design discipline for IC designers as well. Large 16- and 32-bit microprocessors had already been used in board-level designs, and Moore's law ensured that chips would soon be large enough to include both a CPU and other subsystems. Multiple disciplines inform hardware/software codesign. Computer architecture tells us about the performance and energy consumption of single CPUs and multiprocessors. Real-time system theory helps analyze the deadline-driven performance of embedded systems. Computer-aided design assists hardware cost evaluation and design space exploration.

Journal ArticleDOI
TL;DR: Divisible load theory (DLT) has become a powerful tool for modeling data-intensive computational problems; it emerged from a desire to create intelligent sensor networks, but most recent applications involve parallel and distributed computing.
Abstract: During the past decade, divisible load theory has become a powerful tool for modeling data-intensive computational problems. DLT emerged from a desire to create intelligent sensor networks, but most recent applications involve parallel and distributed computing. Like other linear mathematical models such as Markovian queuing theory and electric resistive circuit theory, DLT offers easy computation, a schematic language, and equivalent network element modeling. While it can incorporate stochastic features, the basic model does not make statistical assumptions, which can be the Achilles' heel of a performance evaluation model.
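
To give the model's flavor in the standard star-network formulation (textbook DLT notation, not reproduced from this abstract): a root distributes load fractions \alpha_1, \ldots, \alpha_n sequentially to n workers, communication costs z\,T_{cm} per unit load, and worker i computes at w_i\,T_{cp} per unit load. Requiring that all workers finish simultaneously - the usual optimality condition - gives a linear recursion plus a normalization:

\alpha_{i+1} = \alpha_i \, \frac{w_i\, T_{cp}}{z\, T_{cm} + w_{i+1}\, T_{cp}}, \qquad \sum_{i=1}^{n} \alpha_i = 1

which is solvable in closed form - the easy computation the abstract credits to DLT's linearity.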

Journal ArticleDOI
TL;DR: The transition from a plan-driven to an agile software development process affects not only the development team members, but also other teams, departments, and management.
Abstract: The transition from a plan-driven to an agile software development process affects not only the development team members, but also other teams, departments, and management. Any new process will likely attract developers excited to try it while repelling those opposed to change. Thus, how an agile process is introduced into an organization significantly affects its ultimate success.

Journal ArticleDOI
TL;DR: The widescale deployment of wireless networks will improve communication among patients, physicians, and other healthcare workers, as well as enable the delivery of accurate medical information anytime, anywhere, thereby reducing errors and improving access.
Abstract: The US healthcare industry is confronting a number of challenges, including skyrocketing costs, a growing incidence of medical errors, inadequate staffing, and lack of coverage in rural and underserved urban areas. Healthcare workers are under increasing pressure to provide better services to more people using limited financial and human resources. One proposed solution to the current crisis is pervasive healthcare. The widescale deployment of wireless networks will improve communication among patients, physicians, and other healthcare workers, as well as enable the delivery of accurate medical information anytime, anywhere, thereby reducing errors and improving access. At the same time, advances in wireless technologies - such as intelligent mobile devices and wearable networks - have made possible a wide range of efficient and powerful medical applications. Pervasive healthcare has the potential to reduce long-term costs and improve quality of service, but it also faces many technical and administrative obstacles.

Journal ArticleDOI
TL;DR: Working in conjunction with teachers, researchers have developed a series of projects exploring the potential for using wireless handheld devices to enhance K-12 classroom instruction.
Abstract: Working in conjunction with teachers, researchers have developed a series of projects exploring the potential for using wireless handheld devices to enhance K-12 classroom instruction.

Journal ArticleDOI
TL;DR: Microsoft's next-generation secure computing base extends personal computers to offer mechanisms that let high-assurance software protect itself from the operating system, device drivers, BIOS, and other software running on the same machine.
Abstract: Microsoft's next-generation secure computing base extends personal computers to offer mechanisms that let high-assurance software protect itself from the operating system, device drivers, BIOS, and other software running on the same machine.

Journal ArticleDOI
TL;DR: The value-based approach to software development integrates value considerations into current and emerging software engineering principles and practices, while developing an overall framework in which these techniques compatibly reinforce each other.
Abstract: The information technology field's accelerating rate of change makes feedback control essential for organizations to sense, evaluate, and adapt to changing value propositions in their competitive marketplace. Although traditional project feedback control mechanisms can manage the development efficiency of stable projects in well-established value situations, they do little to address the project's actual value, and can lead to wasteful misuse of an organization's scarce resources. The value-based approach to software development integrates value considerations into current and emerging software engineering principles and practices, while developing an overall framework in which these techniques compatibly reinforce each other.

Journal ArticleDOI
TL;DR: Triage is the process of determining which requirements a product should satisfy given the time and resources available, and three product development case studies and 14 recommendations for practicing this neglected art are presented.
Abstract: Driven by an increasingly competitive market, companies add features and compress schedules for the delivery of every product, often creating a complete mismatch of requirements and resources that results in products failing to satisfy customer needs. Triage is the process of determining which requirements a product should satisfy given the time and resources available. The author presents three product development case studies and 14 recommendations for practicing this neglected art.
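
To make the core triage decision concrete (a toy greedy sketch under an assumed value/cost framing; this is not one of the article's 14 recommendations, and its case studies are not reproduced here):

def triage(requirements, budget):
    # requirements: list of (name, value, cost); budget: resource units available.
    # Greedy selection by value density; a sketch, not an optimal knapsack solver.
    chosen, spent = [], 0
    for name, value, cost in sorted(requirements,
                                    key=lambda r: r[1] / r[2], reverse=True):
        if spent + cost <= budget:
            chosen.append(name)
            spent += cost
    return chosen

reqs = [("export-to-pdf", 8, 5), ("dark-mode", 3, 2), ("sso-login", 9, 8)]
print(triage(reqs, budget=10))  # ['export-to-pdf', 'dark-mode']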

Journal ArticleDOI
TL;DR: The article presents a technology that uses event model interfaces and a novel event flow mechanism that extends formal analysis approaches from real-time system design into the multiprocessor system on chip domain.
Abstract: Multiprocessor system on chip designs use complex on-chip networks to integrate different programmable processor cores, specialized memories, and other components on a single chip. MPSoCs have become the architecture of choice in many industries. Their heterogeneity inevitably increases with intellectual-property integration and component specialization. System integration is becoming a major challenge in their design. Simulation is the state of the art in MPSoC performance verification, but it has conceptual disadvantages that become disabling as complexity increases. Formal approaches offer a systematic alternative. The article presents a technology that uses event model interfaces and a novel event flow mechanism that extends formal analysis approaches from real-time system design into the multiprocessor system on chip domain.
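
A sketch of the event-model idea (the periodic-with-jitter propagation rule below is the standard one from the compositional real-time analysis literature; the class and parameter names are illustrative): each stream of events between components is summarized by a period P and a jitter J, and passing through a task whose response time varies between r_min and r_max widens the jitter seen downstream.

from dataclasses import dataclass

@dataclass
class EventModel:
    period: float  # P: nominal spacing between events
    jitter: float  # J: maximum deviation from the periodic grid

def propagate(inp: EventModel, r_min: float, r_max: float) -> EventModel:
    # Output events keep the input period; response-time variation adds jitter.
    return EventModel(period=inp.period, jitter=inp.jitter + (r_max - r_min))

bus_input = EventModel(period=10.0, jitter=1.0)
print(propagate(bus_input, r_min=2.0, r_max=5.0))  # EventModel(period=10.0, jitter=4.0)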

Journal ArticleDOI
TL;DR: By using adaptive processing to dynamically tune major microprocessor resources, developers can achieve greater energy efficiency with reasonable hardware and software overhead while avoiding undue performance loss.
Abstract: By using adaptive processing to dynamically tune major microprocessor resources, developers can achieve greater energy efficiency with reasonable hardware and software overhead while avoiding undue performance loss. Adaptive processors require few additional transistors. Further, because adaptation occurs only in response to infrequent trigger events, the decision logic can be placed into a low-leakage state until such events occur.
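
A sketch of the trigger-driven style of adaptation this abstract describes (the tuned resource, thresholds, and trigger condition are all hypothetical): the decision logic runs only when a monitored metric crosses a trigger threshold, so it can sit in a low-leakage state the rest of the time.

def tune_cache_ways(ways, ipc, ipc_baseline, max_ways=8):
    # Called only on infrequent trigger events (e.g., a large IPC swing);
    # between triggers the decision logic sleeps in a low-leakage state.
    if ipc < 0.95 * ipc_baseline and ways < max_ways:
        return ways * 2   # performance dropped: re-enable cache ways
    if ipc >= 0.99 * ipc_baseline and ways > 1:
        return ways // 2  # performance unaffected: shrink to cut leakage
    return ways

ways = 8
for ipc in (1.00, 0.97, 0.85):            # observed IPC against a baseline of 1.0
    ways = tune_cache_ways(ways, ipc, ipc_baseline=1.0)
    print(ways)                            # 4, 4, 8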

Journal ArticleDOI
TL;DR: This research seeks to capture some of the regularity apparent in the composition process by using statistical and information-theoretic tools to analyze musical pieces and generate new works that imitate the style of the great masters.
Abstract: The ability to construct a musical theory from examples presents a great intellectual challenge that, if successfully met, could foster a range of new creative applications. Inspired by this challenge, we sought to apply machine-learning methods to the problem of musical style modeling. Our work so far has produced examples of musical generation and applications to a computer-aided composition system. Machine learning consists of deriving a mathematical model, such as a set of stochastic rules, from a set of musical examples. The act of musical composition involves a highly structured mental process. Although it is complex and difficult to formalize, it is clearly far from being a random activity. Our research seeks to capture some of the regularity apparent in the composition process by using statistical and information-theoretic tools to analyze musical pieces. The resulting models can be used for inference and prediction and, to a certain extent, to generate new works that imitate the style of the great masters.
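
A toy version of deriving stochastic rules from musical examples (a first-order Markov chain over note names; the article's actual models, built with richer statistical and information-theoretic methods, are more sophisticated):

import random
from collections import defaultdict

def learn(pieces):
    # Count note-to-note transitions across a corpus of example pieces.
    table = defaultdict(list)
    for piece in pieces:
        for a, b in zip(piece, piece[1:]):
            table[a].append(b)
    return table

def generate(table, start, length, seed=0):
    # Random-walk the learned transitions to imitate the corpus style.
    rng = random.Random(seed)
    out = [start]
    while len(out) < length and table.get(out[-1]):
        out.append(rng.choice(table[out[-1]]))
    return out

corpus = [["C", "E", "G", "E", "C"], ["C", "E", "G", "C"]]
print(generate(learn(corpus), start="C", length=6))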

Journal ArticleDOI
TL;DR: The authors describe a simulation environment that targets heterogeneous multiprocessor systems; they are currently working to extend their methodology to more complex on-chip architectures.
Abstract: SystemC is an open source C/C++ simulation environment that provides several class packages for specifying hardware blocks and communication channels. The design environment specifies software algorithmically as a set of functions embedded in abstract modules that communicate with one another and with hardware components via abstract communication channels. It enables transparent integration of instruction-set simulators and prototyping boards. The authors describe a simulation environment that targets heterogeneous multiprocessor systems. They are currently working to extend their methodology to more complex on-chip architectures.

Journal ArticleDOI
TL;DR: Single-assignment C is a C language variant designed to create an automated compilation path from an algorithmic programming language to an FPGA-based reconfigurable computing system.
Abstract: Reconfigurable computing (RC) systems typically consist of an array of configurable computing elements. The computational granularity of these elements ranges from simple gates - as abstracted by FPGA lookup tables - to complete arithmetic-logic units with or without registers. A rich programmable interconnect completes the array. The RC system developer manually partitions an application into two segments: a hardware component, written in a hardware description language such as VHDL or Verilog, that will execute as a circuit on the FPGA, and a software component that will execute as a program on the host. Single-assignment C is a C language variant designed to create an automated compilation path from an algorithmic programming language to an FPGA-based reconfigurable computing system.

Journal ArticleDOI
TL;DR: The authors have developed a simulation methodology that uses multiple simulations, pays careful attention to the effects of scaling on workload behavior, and extends Virtutech AB's Simics full system functional simulator with detailed timing models.
Abstract: As dependence on database management systems and Web servers increases, so does the need for them to run reliably and efficiently - goals that rigorous simulations can help achieve. Execution-driven simulation models system hardware. These simulations capture actual program behavior and detailed system interactions. The authors have developed a simulation methodology that uses multiple simulations, pays careful attention to the effects of scaling on workload behavior, and extends Virtutech AB's Simics full-system functional simulator with detailed timing models. The Wisconsin Commercial Workload Suite contains scaled and tuned benchmarks for multiprocessor servers, enabling full-system simulations to run on the PCs that are routinely available to researchers.

Journal ArticleDOI
TL;DR: The article looks at some of the areas this capability affects, including privacy risks, economic damages, location-based spam, intermittent connectivity, user interfaces, network privacy, and privacy protection.
Abstract: After more than two decades of hype, computing and communication technologies are finally converging. Java-enabled cell phones run a host of powerful applications including mobile Internet access, while many notebook computers offer high-speed wireless connectivity as a standard feature. The big decision when purchasing a PDA is whether to get integrated cellular service or Wi-Fi capability. Location-based services are emerging as the next killer app in personal wireless devices, but there are few safeguards on location privacy. In fact, the demand for improved public safety is pushing regulation in the opposite direction. Today, when a person reports an emergency from a landline phone by dialing 911 in the United States or 112 in Europe, the system displays the caller's phone number and address to the dispatcher. The US Federal Communications Commission has mandated that, by December 2005, all cellular carriers be able to identify the location of emergency callers using mobile phones to within 50 to 100 meters. However, how cellular carriers and other businesses will use this capability remains open to question. The article looks at some of the areas this capability affects, including: privacy risks; economic damages; location-based spam; intermittent connectivity; user interfaces; network privacy; and privacy protection.