
Showing papers in "Advances in Computers in 1993"


Book ChapterDOI
TL;DR: The chapter discusses two important directions of research to improve learning algorithms: dynamic node generation, as used by the cascade-correlation algorithm, and the design of learning algorithms where the choice of parameters is not an issue.
Abstract: Publisher Summary This chapter provides an account of different neural network architectures for pattern recognition. A neural network consists of several simple processing elements called neurons. Each neuron is connected to some other neurons and possibly to the input nodes. Neural networks provide a simple computing paradigm to perform complex recognition tasks in real time. The chapter categorizes neural networks into three types: single-layer networks, multilayer feedforward networks, and feedback networks. It discusses gradient descent and the relaxation method as the two underlying mathematical themes for deriving learning algorithms. Much research activity is centered on learning algorithms because of their fundamental importance in neural networks. The chapter discusses two important directions of research to improve learning algorithms: dynamic node generation, as used by the cascade-correlation algorithm, and the design of learning algorithms where the choice of parameters is not an issue. It closes with a discussion of performance and implementation issues.
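As a minimal illustration of the gradient-descent theme (a generic delta-rule sketch under assumed conventions, not code from the chapter), a single-layer network can be trained by adjusting each weight in proportion to the output error:

# Minimal illustrative sketch (not from the chapter): delta-rule gradient
# descent on a single sigmoid neuron learning the OR function.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# OR training set; each input vector carries a constant 1 for the bias.
data = [([0, 0, 1], 0), ([0, 1, 1], 1), ([1, 0, 1], 1), ([1, 1, 1], 1)]
w = [0.0, 0.0, 0.0]   # weights, the last one acting as the bias
eta = 0.5             # learning rate

for epoch in range(2000):
    for x, target in data:
        y = sigmoid(sum(wi * xi for wi, xi in zip(w, x)))
        # dE/dw_i for squared error E = (y - t)^2 / 2 with a sigmoid unit
        scale = (y - target) * y * (1.0 - y)
        w = [wi - eta * scale * xi for wi, xi in zip(w, x)]

for x, target in data:
    y = sigmoid(sum(wi * xi for wi, xi in zip(w, x)))
    print(x[:2], "->", round(y, 2), "(target", target, ")")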

13,033 citations


Book ChapterDOI
TL;DR: This chapter introduces key developments and results in the area of generalized information theory, a theory that deals with uncertainty-based information within mathematical frameworks that are broader than classical set theory and probability theory.
Abstract: Publisher Summary Three types of uncertainty are recognized in the five theories in which measurement of uncertainty is currently well established: classical set theory, fuzzy set theory, probability theory, possibility theory, and evidence theory. The three uncertainty types are: fuzziness (or vagueness), which results from the imprecise boundaries of fuzzy sets; nonspecificity (or imprecision), which is connected with the sizes (cardinalities) of relevant sets of alternatives; and strife (or discord), which expresses conflicts among the various sets of alternatives. This chapter introduces key developments and results in the area of generalized information theory, a theory that deals with uncertainty-based information within mathematical frameworks that are broader than classical set theory and probability theory. The chapter presents an overview of these measures and explains how they can be utilized for measuring information. Basic concepts and properties of relevant theories of uncertainty are introduced. This background knowledge is employed to give a comprehensive overview of conceptual and mathematical issues regarding measures of the various types of uncertainty in the introduced theories. The utility of these uncertainty measures is exemplified by three broad principles of uncertainty that are described in the chapter—namely, a principle of minimum uncertainty, a principle of maximum uncertainty, and a principle of uncertainty invariance. Current results regarding generalized information theory are critically assessed, some important open problems are overviewed, and prospective future directions of research in this area are discussed.
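For concreteness (a standard pairing in this literature, not formulas quoted from the chapter), the classical measures of two of these uncertainty types are the Hartley measure of nonspecificity for a finite set $A$ of alternatives and the Shannon entropy as a measure of conflict in a probability distribution $p$:

\[
H(A) = \log_2 |A| , \qquad S(p) = -\sum_{i=1}^{n} p_i \log_2 p_i .
\]

Generalized information theory extends measures of this kind to the broader frameworks, such as possibility and evidence theories, surveyed in the chapter.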

104 citations


Book ChapterDOI
TL;DR: The chapter discusses the basic principles of the dataflow model, along with its major attributes and the language support for dataflow, presents several proposed methodologies for program allocation, and discusses the various issues involved.
Abstract: Publisher Summary This chapter discusses the basic principles of the dataflow model, along with its major attributes and the language support for dataflow. It provides a general description of the dataflow model of execution based on the pioneering works of research groups at the Massachusetts Institute of Technology (MIT) and the University of Manchester, namely the static dataflow machine, the tagged-token dataflow architecture, and the Manchester machine. Major problems encountered in the design of these machines are outlined. The chapter surveys current dataflow projects and provides a comparison of the architectural characteristics and evolutionary improvements in dataflow computing. It discusses the difficulties in handling data structures in a dataflow environment. An overview of the methods for representing data structures is provided, supplemented by a discussion of the issues involved in handling data structures in a dataflow environment. The chapter addresses the issue of program allocation: several proposed methodologies are presented, and the various issues involved in program allocation are discussed. The resource requirements for dataflow computations are examined.
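To make the execution model concrete (an illustrative sketch with hypothetical node and arc names, not code from the chapter), a dataflow node fires as soon as tokens are present on all of its input arcs, so ordering comes from data availability rather than from a program counter:

# Illustrative sketch of static dataflow execution (hypothetical graph,
# not from the chapter): a node fires when a token is available on every
# one of its input arcs.
from collections import deque

class Node:
    def __init__(self, op, inputs):
        self.op = op              # function applied to the input tokens
        self.inputs = inputs      # names of the arcs feeding this node
        self.tokens = {}          # arc name -> waiting token

    def ready(self):
        return all(arc in self.tokens for arc in self.inputs)

    def fire(self):
        return self.op(*[self.tokens.pop(arc) for arc in self.inputs])

# Graph for (a + b) * (a - b); arcs are named after their producers.
nodes = {
    "add": Node(lambda x, y: x + y, ["a", "b"]),
    "sub": Node(lambda x, y: x - y, ["a", "b"]),
    "mul": Node(lambda x, y: x * y, ["add", "sub"]),
}
consumers = {"a": ["add", "sub"], "b": ["add", "sub"],
             "add": ["mul"], "sub": ["mul"]}

def run(initial_tokens, output="mul"):
    pending = deque(initial_tokens.items())   # queue of (arc, token value)
    while pending:
        arc, value = pending.popleft()
        if arc == output:
            return value
        for name in consumers.get(arc, []):
            node = nodes[name]
            node.tokens[arc] = value
            if node.ready():                  # all inputs present: fire
                pending.append((name, node.fire()))

print(run({"a": 7, "b": 3}))   # (7 + 3) * (7 - 3) = 40

Note that the firing of add and sub is unordered here; a dataflow machine would execute them in parallel.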

38 citations


Book ChapterDOI
TL;DR: The relevance of ANN technology to practical control problems is discussed, with a slight emphasis toward problems arising in robotics, and a thumbnail sketch of the field of artificial neural networks is presented.
Abstract: Publisher Summary In this chapter, a thumbnail sketch of the field of artificial neural networks is presented. This chapter discusses the relevance of ANN technology to practical control problems, with a slight emphasis toward problems arising in robotics. Models of ANNs are specified by three basic entities: models of the neurons themselves—that is, the node characteristics; models of synaptic interconnections and structures—that is, net topology and weights; and training or learning rules—that is, the method of adjusting the weights or the way the network interprets the information it is receiving. The nodes themselves can be characterized by analog (continuous) or digital (discrete) summing elements exhibiting either linear or nonlinear behavior. The back-propagation method appears to have a dominant role in the modeling and identification areas. Control problems fall into two broad categories, as sketched below. In regulation and tracking problems, the objective is to follow a reference trajectory; self-tuning regulators and model reference controllers belong to this class. In optimum control problems, the objective is to optimize some aspect of the system's behavior without recourse to any reference trajectory. When a detailed model of the plant being controlled is not available, adaptive control techniques can be used to solve either the tracking problem or the optimum control problem. The examples of neural controllers selected in the chapter are biased toward robotics. There are a number of other application areas, such as process control and optimization, where the potential of ANNs is being explored. With the advent of VLSI technology on the one hand and supercomputers on the other, it is now possible to routinely simulate large neural networks with relative ease. The time is now ripe to exploit the parallel, distributed nature of neural architectures.
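The distinction between the two categories can be stated as a pair of standard cost criteria (textbook formulations, not equations quoted from the chapter): tracking minimizes deviation from a reference trajectory, while optimum control minimizes a performance functional with no reference involved:

\[
J_{\text{track}} = \sum_{t} \left\| y(t) - y_{\text{ref}}(t) \right\|^2 ,
\qquad
J_{\text{opt}} = \int_{0}^{T} L\big(x(t), u(t)\big)\, dt .
\]

Here $y$ is the plant output, $y_{\text{ref}}$ the reference, and $L$ a designer-chosen cost on the state $x$ and control $u$.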

25 citations


Book ChapterDOI
Mary Carol Day, Susan Boyce
TL;DR: The chapter covers the reasons it is beneficial to have a human factors specialist on the team and the benefits—relative to the costs—of investing in human factors activities to ensure excellent user interface design.
Abstract: Publisher Summary This chapter focuses on the importance and role of human factors specialists in the design of human-computer systems. This chapter has several purposes: (1) to introduce the field of human factors, also known as "ergonomics," (2) to describe the role and importance of human factors specialists in the design and development of human-computer systems, and (3) to describe several of the key methodologies used by human factors specialists. A brief overview of the discipline of human factors is presented as a background for those unfamiliar with the field. The relation between user interface design and usability is discussed, and an overview of general principles for designing usable systems and the type of information that is relevant for user interface design is presented. Models of the software development process are described. They present a framework for a discussion of the roles of the human factors specialist on design and development teams and the specific human factors activities that should be integrated into the development process. Task analysis, user interface prototyping, and usability testing—key methodologies used by human factors specialists—are described, and the importance and the complexities of designing for consistency from the user's perspective are discussed. Because the value of having a human factors specialist on the design and development team is often not understood and the associated costs are often regarded as too high, the chapter covers the reasons it is beneficial to have a human factors specialist on the team and the benefits—relative to the costs—of investing in human factors activities to ensure excellent user interface design.

22 citations


Book ChapterDOI
TL;DR: This chapter examines various approaches to automatic programming and presents several myths related to it, including the beliefs that end-user-oriented automatic programming systems do not need domain knowledge and that end-user-oriented, general-purpose, fully automatic programming is possible.
Abstract: Publisher Summary This chapter examines various approaches to automatic programming and presents several myths related to it, including the beliefs that end-user-oriented automatic programming systems do not need domain knowledge; that end-user-oriented, general-purpose, fully automatic programming is possible; that requirements can be complete; that programming is a serial process; that there will be no more programming; and that there will be no more programming in the large. It discusses the three fundamental technical issues in automatic programming that must be addressed in the design of any automatic programming system: what the user sees, how the system works, and what the system knows. From the user's perspective, the most prominent aspect of an automatic programming system is the language used to communicate with it. The chapter discusses the range of possibilities using as an example the simple problem of determining the value of an octal number represented as a string. Academic research on automatic programming is focused on developing techniques that can support broad-coverage, fully automatic programming. Work in the commercial arena has focused on more modest goals and has been able to make significant steps toward automatic programming based on procedural methods.
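For reference, the chapter's running example is small enough to state directly (a plain hand-written sketch, not code from the chapter):

# Illustrative sketch of the running example: the value of an octal
# number represented as a string (not code from the chapter).

def octal_value(s: str) -> int:
    """Return the integer value of an octal numeral given as a string."""
    value = 0
    for ch in s:
        digit = ord(ch) - ord("0")
        if not 0 <= digit <= 7:
            raise ValueError(f"not an octal digit: {ch!r}")
        value = value * 8 + digit   # shift accumulated digits one octal place
    return value

assert octal_value("17") == 15    # 1*8 + 7
assert octal_value("100") == 64   # 1*8^2

The interest for automatic programming is not the solution itself (Python's built-in int(s, 8) already computes it) but the many different ways a user might specify such a task to the system.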

21 citations


Book ChapterDOI
TL;DR: The chapter discusses modeling issues with particular reference to abstract circuit models, based on graphs that can serve as a common basis for synthesis and that decouple synthesis and optimization from the particular features of any given language.
Abstract: Publisher Summary This chapter discusses the state of the art in the high-level synthesis of digital circuits, the success achieved by high-level synthesis, and the present difficulties. It mentions that high-level modeling is done by means of hardware description languages (HDLs). The lack of standardization of HDLs suitable for synthesis has been a major impediment to the diffusion of high-level synthesis. The chapter discusses modeling issues with particular reference to abstract circuit models, based on graphs, that can serve as a common basis for synthesis and that decouple synthesis and optimization from the particular features of any given language. In the sequel, circuit models are discussed at both the functional and logic abstraction levels with behavioral and structural flavors. The chapter describes scheduling, resource binding, and control synthesis. Structural synthesis and the related tasks are described as applied to non-pipelined circuits, and extensions to pipelined models are reported. The chapter concludes by giving a short history of high-level synthesis and by describing and comparing high-level synthesis systems.
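Of the tasks just listed, scheduling is the easiest to show in miniature (an illustrative sketch over a made-up dependence graph, not an algorithm from the chapter): as-soon-as-possible (ASAP) scheduling assigns each operation the earliest control step its data dependences allow, assuming unit delay per operation.

# Illustrative ASAP-scheduling sketch (hypothetical dependence graph,
# not from the chapter): each operation gets the earliest control step
# permitted by its data dependences, one step per operation.

deps = {                      # operation -> operations it consumes from
    "mul1": [],
    "mul2": [],
    "add1": ["mul1", "mul2"],
    "sub1": ["mul2"],
    "add2": ["add1", "sub1"],
}

def asap_schedule(deps):
    step = {}
    remaining = dict(deps)
    while remaining:
        for op, preds in list(remaining.items()):
            if all(p in step for p in preds):   # all inputs scheduled
                step[op] = 1 + max((step[p] for p in preds), default=0)
                del remaining[op]
    return step

print(asap_schedule(deps))
# {'mul1': 1, 'mul2': 1, 'add1': 2, 'sub1': 2, 'add2': 3}

Resource binding would then map the operations of each control step onto a limited set of functional units.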

20 citations


Book ChapterDOI
TL;DR: Cleanroom Engineering introduces new levels of practical precision for achieving correct software, using three engineering teams—namely, specification engineers, development engineers, and certification engineers.
Abstract: Publisher Summary Software is either correct or incorrect in design to a specification, in contrast with hardware, which is reliable to a certain level in performing to a correct design. Certifying the correctness of such software requires two conditions—namely, statistical testing with inputs characteristic of actual usage, and no failures in the testing. Cleanroom Engineering introduces new levels of practical precision for achieving correct software, using three engineering teams—namely, specification engineers, development engineers, and certification engineers. Software can be developed and certified as correct under statistical quality control (SQC) to well-formed specifications of user requirements. The chapter discusses the history and application of SQC to software development. Two major properties of Cleanroom Engineering are: no debugging by the developers before the software goes to independent testers, and statistical testing that takes into account both the usage and the criticality of software parts. Cleanroom software engineering achieves statistical quality control over software development by strictly separating the design process from the testing process in a pipeline of incremental software development. The three major engineering activities in this process are software specification, software development, and software certification. The chapter elaborates on statistical quality control in software engineering. Markov chain techniques for software certification are discussed. The Cleanroom engineering methods are outlined. Box-structured software system design is discussed.
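To illustrate the Markov idea (a toy usage model with hypothetical state names, not one from the chapter): in statistical usage testing, test cases are random walks through a chain whose states are user-visible situations and whose transition probabilities reflect expected field usage.

# Illustrative sketch (hypothetical usage model, not from the chapter):
# statistical usage testing draws test cases as random walks through a
# Markov chain of user-visible states.
import random

usage_model = {
    "Start": [("Menu", 1.0)],
    "Menu":  [("Edit", 0.6), ("Print", 0.3), ("Exit", 0.1)],
    "Edit":  [("Menu", 0.8), ("Exit", 0.2)],
    "Print": [("Menu", 1.0)],
}

def generate_test_case(model, start="Start", stop="Exit"):
    """One test case: the sequence of states visited from start to stop."""
    state, path = start, [start]
    while state != stop:
        targets, weights = zip(*model[state])
        state = random.choices(targets, weights)[0]
        path.append(state)
    return path

random.seed(1)
print(generate_test_case(usage_model))   # e.g. ['Start', 'Menu', ..., 'Exit']

Running many such cases without failures supports a statistical certification of reliability under the assumed usage distribution.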

17 citations


Book ChapterDOI
TL;DR: This chapter looks at the controversy that surrounded Frank Rosenblatt's perceptron machine in the late 1950s and early 1960s, and the influence that factors like the emergence of symbolic artificial intelligence and computer technology had on the closure of the neural network controversy.
Abstract: Publisher Summary This chapter discusses the scientific controversies that have shaped neural network research from a sociological point of view. It looks at the controversy that surrounded Frank Rosenblatt's perceptron machine in the late 1950s and early 1960s, arguing that Rosenblatt was well aware of the main problems of his machine and that he even insisted on them in his books and papers. Emphasis is given to one of the main problems of early neural network research, namely the issue of training multilayer systems. In the middle of the perceptron controversy, Minsky and Papert embarked on a project aimed at showing the limitations of Rosenblatt's perceptron beyond doubt. The chapter analyzes the main results of that project and shows that Minsky and Papert, on the one hand, and neural network researchers, on the other, interpreted those results rather differently. It discusses the processes through which this interpretative flexibility was closed and the effects that the crisis of early neural network research had upon the three most important neural network groups of the time, namely Widrow's group, Rosenblatt's group, and the group at SRI. The chapter also looks at the influence that factors like the emergence of symbolic artificial intelligence (AI) and computer technology had on the closure of the neural network controversy. After the closure of the perceptron controversy, symbol processing remained the dominant approach to AI over the years, until the early 1980s. Some of the most important aspects of that changing context are reviewed, and the history of back-propagation is discussed.

9 citations


Book ChapterDOI
TL;DR: There are many issues involved in specifying, building, and verifying complex software systems, and many techniques can be applied to help address these important development issues.
Abstract: Publisher Summary This chapter presents a brief survey of the various verification methods necessary to show correct functionality of a software product. Verification is only one aspect of the problem. The chapter discusses the role of verification from a historical perspective and describes the techniques. Several models for program verification are developed. These can be broken down into three general categories: (1) an axiomatic model, as an extension to predicate logic and mathematical theorem proving; (2) a functional model that views programs as functions from some input space to some output space; and (3) an algebraic model that models data interactions. These categories can be divided into five verification techniques, which are briefly described. Three of these, the axiomatic, functional, and denotational semantics models, are described in detail. The use of domains and fixed points allows programming language semantics to be described. The chapter discusses the Vienna Development Method as applied to real-world software development problems. A good specification consists of multiple attributes, with functionality being only one of them; in that light, some of the other models are briefly surveyed. As this chapter shows, there are many issues involved in specifying, building, and verifying complex software systems. The technology has been developing over the last 25 years. While the chapter does not present a fully automated process, there are many techniques that can be applied to help address these important development issues.
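As a small taste of the axiomatic style (a textbook example, not one taken from the chapter), the Hoare assignment axiom obtains a precondition by substituting the assigned expression into the postcondition:

\[
\{\, P[E/x] \,\}\; x := E \;\{\, P \,\}
\qquad \text{e.g.} \qquad
\{\, x + 1 > 0 \,\}\; x := x + 1 \;\{\, x > 0 \,\} .
\]

Verifying a whole program amounts to chaining such triples together and discharging the resulting logical obligations.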

8 citations


Book ChapterDOI
Jürg Nievergelt
TL;DR: This chapter presents the case of the smart game board, describing the main software tool for rapid prototyping of game implementations that was needed to conduct experiments, and presents paradigms of software development favored by the designers and implementers of exploratory development projects, explaining why these are often diametrically opposed to conventional software engineering lore.
Abstract: Publisher Summary This chapter presents examples that illustrate a type of programming project increasingly prominent in the field of knowledge engineering. The examples are chosen on grounds of familiarity, without any claim to represent the field of heuristic programming at large. The chapter begins by describing some software projects in computational heuristics. These projects are presented as case studies of the interaction between software engineering and knowledge engineering that illustrate the decisive importance of a powerful software environment. The chapter presents the case of the smart game board and describes the main software tool for rapid prototyping of game implementations, needed to conduct experiments. It also describes a project involving heuristics and knowledge engineering that has been evolving without interruption for the past five years. It presents paradigms of software development favored by system designers and implementers of exploratory development projects, and explains why these are often diametrically opposed to the conventional software engineering lore.

Book ChapterDOI
TL;DR: This chapter describes efforts arising from artificial intelligence (AI) research in algorithmic composition, whose goal is the composition of music in stylistic imitation of earlier composers.
Abstract: Publisher Summary The chapter describes activities in three areas of computing applications in music: music score input and output, sound generation and manipulation, and research on musical structure and performance. The chapter focuses on ways of representing music scores for computer storage, retrieval, and manipulation, and includes a discussion of two representative music codes, encoding devices and software, and music printing. The chapter discusses computer applications in music composition and includes information about the Musical Instrument Digital Interface (MIDI) standard, digital sound synthesis, and computer-aided composition. The chapter presents a survey of computer applications in music research and includes discussions of database developments, data structures for music analysis, adaptations of artificial intelligence techniques and concepts from cognitive science, and research into expressive musical performance. It also describes efforts arising from artificial intelligence (AI) research in algorithmic composition, whose goal is the composition of music in stylistic imitation of earlier composers. The most promising approaches are those using grammars and connectionist networks. In conclusion, the chapter presents comments on the current situation in each of these areas and then briefly addresses three philosophical issues attending the use of computers by musicians.
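As a concrete footnote on the MIDI representation mentioned above (the standard MIDI pitch convention, not an example from the chapter), pitch travels as an integer note number that maps to frequency in equal temperament:

# Standard MIDI pitch convention (illustrative, not from the chapter):
# note 69 is A4 at 440 Hz; each semitone multiplies frequency by 2**(1/12).

def midi_to_frequency(note: int) -> float:
    """Equal-temperament frequency in Hz for a MIDI note number."""
    return 440.0 * 2.0 ** ((note - 69) / 12)

print(midi_to_frequency(69))   # 440.0   (A4)
print(midi_to_frequency(60))   # ~261.63 (middle C)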
Abstract: Publisher Summary The chapter describes activities in three areas of computing applications in music: music score input and output, sound generation and manipulation, and research on musical structure and performance The chapter focuses on ways of representing music scores for computer storage, retrieval, and manipulation, and includes a discussion of two representative music codes, encoding devices and software, and music printing The chapter discusses computer applications in music composition and includes information about the Musical Instrument Digital Interface(MIDI) standard, digital sound synthesis, and computer-aided composition The chapter presents a survey of computer applications in music research and includes discussions of database developments, data structures for music analysis, adaptations of artificial intelligence techniques and concepts from cognitive science, and research into expressive musical performance This chapter describes efforts arising from Algorithmic composition (Al) research whose goal is the composition of music in stylistic imitation of earlier composers The most promising approaches are those using grammatical approaches and connectionist networks In conclusion, the chapter presents comments on the current situation in each of these areas and then address briefly three philosophical issues attending the use of computers by musicians