
Showing papers on "Application software published in 1987"


Patent
20 Apr 1987
TL;DR: In this paper, a file system for a portable data carrier provides improved security for files which support multiple applications, from custom repertory dialing to storage of individual medical and/or banking records.
Abstract: A file system for a portable data carrier (10) provides improved security for files which support multiple applications, from custom repertory dialing to storage of individual medical and/or banking records. Although the portable data carrier looks and feels much like an ordinary credit card, it includes a computer (110) and an electrically erasable programmable read-only memory (115). Power for operation of the portable data carrier is provided from an associated station (18) via a reader/writer (15). The reader/writer also couples data between the data carrier and the associated station. The applications reside in multiple files (42-48) in memory on the portable data carrier. Appropriate application software residing in the station, when accompanied by an appropriate password, enables the retrieval and modification of these files. A separate password is required for gaining access to each of the designated levels of interaction between the portable data carrier and the associated station. Additional restrictions, such as requiring an additional password for writing to a file or allowing a user logged in at a particular security level only to append information to a file, may be imposed in accordance with the file security on the portable data carrier. Since each of the files may have its own security requirements, multiple applications may exist on the portable data carrier without conflict or confusion.

150 citations
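The file-security scheme sketched in the abstract (a password per interaction level, an extra password for writes, and append-only access for some levels) can be illustrated with a short, hypothetical model. The sketch below is not the patented implementation; the class, file names, levels, and rules are assumptions drawn only from the abstract's wording.

```python
# Illustrative sketch of per-file security levels on a multi-application card.
# NOT the patented implementation; names and rules are assumptions drawn from
# the abstract (per-level passwords, extra write password, append-only access).

class CardFile:
    def __init__(self, name, level_passwords, write_password=None, append_only_levels=()):
        self.name = name
        self.level_passwords = level_passwords      # {security_level: password}
        self.write_password = write_password        # extra password needed to write
        self.append_only_levels = set(append_only_levels)
        self.records = []

    def login(self, level, password):
        return self.level_passwords.get(level) == password

    def write(self, level, password, write_password, record, append=True):
        if not self.login(level, password):
            raise PermissionError("wrong password for this security level")
        if self.write_password and write_password != self.write_password:
            raise PermissionError("write password required for this file")
        if level in self.append_only_levels and not append:
            raise PermissionError("this level may only append to the file")
        if append:
            self.records.append(record)
        else:
            self.records = [record]

# Two applications coexist on one card, each with its own security requirements.
medical = CardFile("medical", {1: "md-secret"}, write_password="md-write",
                   append_only_levels={1})
dialing = CardFile("repertory", {1: "user-pin"})
medical.write(1, "md-secret", "md-write", "1987-04-20: checkup OK")
```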


Proceedings ArticleDOI
27 Apr 1987
TL;DR: ABYSS is shown to be a general security base, in which many security applications may execute, and a novel use-once authorization mechanism, called a token, is introduced as a solution to the problem of providing authorizations without direct communication.
Abstract: ABYSS (A Basic Yorktown Security System) is an architecture for the trusted execution of application software. It supports a uniform security service across the range of computing systems. The use of ABYSS discussed in this paper is oriented towards solving the software protection problem, especially in the lower end of the market. Both current and planned software distribution channels are supportable by the architecture, and the system is nearly transparent to legitimate users. A novel use-once authorization mechanism, called a token, is introduced as a solution to the problem of providing authorizations without direct communication. Software vendors may use the system to obtain technical enforcement of virtually any terms and conditions of the sale of their software, including such things as rental software. Software may be transferred between systems and backed up to guard against loss in case of failure. We discuss the problem of protecting software on these systems and offer guidelines to its solution. ABYSS is shown to be a general security base in which many security applications may execute.

145 citations
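The use-once token introduced in the ABYSS abstract can be pictured with a toy sketch. This is an illustrative stand-in under assumed mechanics (a shared secret, a hash tag, and a set of spent tokens), not the mechanism the paper actually specifies.

```python
# Toy sketch of a use-once authorization token (not the actual ABYSS mechanism).
# A vendor issues a token tied to a right; the trusted processor accepts it
# exactly once, so authorization needs no direct vendor-user communication.

import hashlib, os

def issue_token(secret: bytes, right: str) -> tuple[bytes, str]:
    nonce = os.urandom(16)
    tag = hashlib.sha256(secret + nonce + right.encode()).hexdigest()
    return nonce, tag

spent_tokens = set()   # persistent state assumed to live inside the trusted hardware

def redeem(secret: bytes, right: str, nonce: bytes, tag: str) -> bool:
    expected = hashlib.sha256(secret + nonce + right.encode()).hexdigest()
    if tag != expected or tag in spent_tokens:
        return False          # forged or already used
    spent_tokens.add(tag)     # token is consumed: it cannot authorize twice
    return True

secret = b"shared-between-vendor-and-trusted-hw"
nonce, tag = issue_token(secret, "run:wordproc-v2")
assert redeem(secret, "run:wordproc-v2", nonce, tag) is True
assert redeem(secret, "run:wordproc-v2", nonce, tag) is False  # use-once
```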


Journal ArticleDOI
TL;DR: GEM is being used on a multiprocessor with robotics application software of substantial size and complexity and is closely coupled to prototype real-time programming environments that provide programming support for the models of computation offered by the operating system.
Abstract: To increase speed and reliability of operation, multiple computers are replacing uniprocessors and wired-logic controllers in modern robots and industrial control systems. However, performance increases are not attained by such hardware alone. The operating software controlling the robots or control systems must exploit the possible parallelism of various control tasks in order to perform the necessary computations within given real-time and reliability constraints. Such software consists of both control programs written by application programmers and operating system software offering means of task scheduling, intertask communication, and device control. The Generalized Executive for real-time Multiprocessor applications (GEM) is an operating system that addresses several requirements of operating software. First, when using GEM, programmers can select one of two different types of tasks differing in size, called processes and microprocesses. Second, the scheduling calls offered by GEM permit the implementation of several models of task interaction. Third, GEM supports multiple models of communication with a parameterized communication mechanism. Fourth, GEM is closely coupled to prototype real-time programming environments that provide programming support for the models of computation offered by the operating system. GEM is being used on a multiprocessor with robotics application software of substantial size and complexity.

73 citations
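GEM's parameterized communication mechanism is described only at a high level in the abstract; the sketch below illustrates the general idea of one primitive whose behavior is selected by parameters. It is not GEM's API; all names and parameters are invented.

```python
# Illustrative sketch of a parameterized communication call in the spirit of the
# abstract's description (not GEM's actual interface): one primitive, with
# parameters selecting blocking vs. non-blocking delivery and bounded buffering.

import queue

class Channel:
    def __init__(self, capacity=0):
        self.q = queue.Queue(maxsize=capacity)   # 0 means unbounded buffering

    def send(self, msg, blocking=True, timeout=None):
        try:
            self.q.put(msg, block=blocking, timeout=timeout)
            return True
        except queue.Full:
            return False       # non-blocking send on a full bounded channel

    def receive(self, blocking=True, timeout=None):
        try:
            return self.q.get(block=blocking, timeout=timeout)
        except queue.Empty:
            return None

# A control task and a sensor task interact through the same primitive,
# configured differently: the sensor never blocks, the controller waits briefly.
sensor_to_ctrl = Channel(capacity=8)
sensor_to_ctrl.send({"joint": 3, "angle": 1.57}, blocking=False)
reading = sensor_to_ctrl.receive(blocking=True, timeout=0.1)
```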


Journal ArticleDOI
Peter R. Wilson1
TL;DR: Information modeling is discussed in more detail; the central premise of the PDES/STEP exchange standardization effort is that the "stuff" to be exchanged is information, but during the actual exchange the information is represented by data.
Abstract: Continuing the theme from the last column,1 this time I want to discuss information modeling in more detail. The central premise of the PDES/STEP exchange standardization effort is that the "stuff" to be exchanged is information, but during the actual exchange the information is represented by data. The data forms used can differ according to the target computer exchange technology. The objectives of PDES can be summarized as

67 citations


Journal ArticleDOI
TL;DR: This article addresses several new and challenging problems for software developers, with a focus on distributed-software engineering, using the software life cycle as a guide.
Abstract: Distributed software is made up of a fixed set of software processes that are allocated to various interconnected computers but that closely cooperate to achieve a common goal. Interconnected computers are loosely coupled in the sense that interprocessor communication is done by message passing, not by sharing memory. Distributed-software engineering poses a host of new and challenging problems for software developers. This article addresses several of these problems, using the software life cycle as a guide.

55 citations
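The article's premise (cooperating processes that share no memory and coordinate only by message passing) can be made concrete with a minimal sketch. The code is purely illustrative and is not taken from the article.

```python
# Minimal sketch of the loose coupling the article describes: cooperating
# processes that share no memory and coordinate purely by message passing.

from multiprocessing import Process, Queue

def worker(inbox: Queue, outbox: Queue):
    while True:
        msg = inbox.get()              # receive a message; never read shared state
        if msg == "stop":
            break
        outbox.put({"echo": msg, "len": len(msg)})

if __name__ == "__main__":
    to_worker, from_worker = Queue(), Queue()
    p = Process(target=worker, args=(to_worker, from_worker))
    p.start()
    to_worker.put("hello")
    print(from_worker.get())           # {'echo': 'hello', 'len': 5}
    to_worker.put("stop")
    p.join()
```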


Journal ArticleDOI
TL;DR: An overview of the analytical and computational problems encountered in the large scale application of the TEF method is presented, and approaches used to overcome the problems are provided.
Abstract: This paper deals with the demonstration of the Transient Energy Function (TEF) method in large, realistic power networks. Documented examples of application in power system operation and associated software development for systems with up to 228 generators and 1644 buses are given. An overview of the analytical and computational problems encountered in the large scale application of the TEF method is presented. Approaches used to overcome the problems are provided, and the relevant improvements and modifications to the TEF software are discussed.

52 citations


Journal ArticleDOI
TL;DR: Several misunderstood performance characteristics are found that can have a major effect on the user's ability to use a tablet digitizer in many interactive applications.
Abstract: Tablet digitizers are common devices for getting graphical data into a computer. For many applications, they are the only practical device. Because tablet digitizers have been in existence for many years, they are often regarded as "old hat" technology: nothing new can be said about them and no significant technical developments other than price decreases can be expected. This is a misconception. We have found several misunderstood performance characteristics, characteristics that can have a major effect on the user's ability to use a tablet digitizer in many interactive applications. The risk to an application designer is that he or she may base the interface design on incorrect assumptions. As a consequence, the interface may be awkward to use. The article shows what some of these characteristics are and how to evaluate their effect in different classes of applications. Specific examples are taken from a hand-printing application for character recognition.

43 citations


Patent
14 Jul 1987
TL;DR: In this article, an interface unit is coupled with and responds to the physical manipulation of the control elements of the cockpit control panel mockup, so that control panel manipulative action of the computer user "pilot" causes the generation and delivery of keystroke representative signals to the keyboard port of a computer.
Abstract: A mockup device interfaces directly with the keyboard port of a desktop (personal) computer and requires no additional or specialized adapter. A hardware mockup of an aircraft cockpit control panel, including instruments, switches and control yoke, includes an interface unit through which digital signals that replicate keystroke sequences employed by flight simulation application software are generated. The interface unit is coupled with and responds to the physical manipulation of the control elements of the cockpit control panel mockup, so that control panel manipulative action of the computer user `pilot` causes the generation and delivery of keystroke representative signals to the keyboard port of the computer. In response to these keystroke representative signals the flight simulation software that has been loaded into the computer controls the flight simulation display just as though the user were operating the system directly from the keyboard. However, because the interface unit produces the command sequences in response to the manipulation of dedicated control/switch elements corresponding to those on an actual cockpit control panel and at a signalling rate that far exceeds the manipulative keystroke action of the computer user, the flight simulation display on the computer monitor is presented effectively in near real time, thereby creating a more realistic simulation of flight conditions to the `pilot` user.

35 citations
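The central idea of the patent is a fixed mapping from physical control events to the keystroke sequences the flight-simulation software already accepts. The sketch below illustrates that mapping; the key bindings and function names are invented, not taken from the patent.

```python
# Illustrative sketch of the mapping idea in the patent: physical control events
# are translated into keystroke sequences the flight-simulation software already
# understands. The key bindings below are made up, not taken from the patent.

KEYSTROKES = {
    ("gear", "down"):      ["g"],
    ("flaps", "extend"):   ["F6"],
    ("throttle", "inc"):   ["F3"],
    ("yoke", "pitch_up"):  ["KP2"],    # numeric-keypad code, assumed
}

def control_event_to_keys(control: str, action: str) -> list[str]:
    """Return the keystroke sequence to emit on the keyboard port."""
    return KEYSTROKES.get((control, action), [])

# Manipulating the mockup's gear lever would emit the same keys a user would type.
print(control_event_to_keys("gear", "down"))   # ['g']
```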




Journal ArticleDOI
William Harrison1
TL;DR: This model seeks to change the way tools are developed by enabling them to be extended to handle new kinds of input, not just new function.
Abstract: Monolithic tools that can't be extended to handle new kinds of input, not just new function, are hampering development. This model seeks to change that.

35 citations


Journal ArticleDOI
01 May 1987
TL;DR: This paper examines four such distributed systems with contrasting degrees of decentralized hardware, control, and redundancy: the first is a one-site system, the second is a node-replicated system at a remote site for disaster backup, the third is a multi-site system with central control, and the fourth is a multi-site system with node autonomy.
Abstract: Distributed computer applications built from off-the-shelf hardware and software are increasingly common. This paper examines four such distributed systems with contrasting degrees of decentralized hardware, control, and redundancy. The first is a one-site system, the second is a node-replicated system at a remote site for disaster backup, the third is a multi-site system with central control, and the fourth is a multi-site system with node autonomy. The application, design rationale, and experience of each of these systems are briefly sketched.

Journal ArticleDOI
01 Jun 1987
TL;DR: The basic relationships among microprocessor architecture, GaAs technology, and real-time applications are underlined, and an analytical execution-time model of the reduced vertical-migration architecture is developed.
Abstract: This paper analyzes the potential performance of a high-level language (HLL) microprocessor architecture for special-purpose real-time applications. Our approach is based on mapping of HLL constructs into microcode, a concept called vertical migration. An analytical execution-time model of the reduced vertical-migration architecture is developed. It is applied to two different workload models: one corresponding to statement mixes, and the other showing some HLL kernel routines. Performance evaluation results are compared across different forms of the reduced vertical-migration architecture and in various application domains. We underline in this paper the basic relationships among microprocessor architecture, GaAs technology, and real-time applications.
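The abstract does not give the execution-time model itself, so the sketch below shows only the generic pattern of such a model: a statement mix weighted by per-construct cycle counts, with and without vertical migration. All frequencies and cycle counts are invented for illustration.

```python
# A generic weighted-cost sketch of an execution-time model driven by a statement
# mix, in the spirit of the paper's approach. The statement frequencies and cycle
# counts below are invented for illustration, not the paper's data or its model.

mix = {"assign": 0.45, "if": 0.20, "loop": 0.15, "call": 0.20}   # statement mix

cycles_software = {"assign": 4, "if": 6, "loop": 9, "call": 20}  # construct in code
cycles_migrated = {"assign": 2, "if": 3, "loop": 4, "call": 7}   # construct in microcode

def mean_cycles(costs: dict) -> float:
    return sum(freq * costs[stmt] for stmt, freq in mix.items())

t_sw, t_vm = mean_cycles(cycles_software), mean_cycles(cycles_migrated)
print(f"speedup from vertical migration: {t_sw / t_vm:.2f}x")
```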

Book ChapterDOI
Stuart K. Card1, Austin Henderson1
01 Jan 1987
TL;DR: Catalogue, an adjunct to the Rooms multiple virtual workspace environment, is described; it employs the mail-order catalogue as a metaphor for the delivery of application software in an integrated work environment.
Abstract: This paper presents the mail-order catalogue as a metaphor for the delivery of application software in an integrated work environment. It also describes Catalogue, an adjunct to the Rooms multiple virtual workspace environment, which employs this metaphor. This mechanism can be used (1) to give users “instant starts” by letting them select a standard setup, (2) to allow users to assemble their own environment from standard components, (3) to parameterize a standard component, and (4) to load applications ready to run.

Journal ArticleDOI
TL;DR: This correspondence describes the organization and operation of a semantic network array processor (SNAP) as applicable to high level computer vision problems and the two general techniques, discrete relaxation and dynamic programming.
Abstract: The problems in computer vision range from edge detection and segmentation at the lowest level to the problem of cognition at the highest level. This correspondence describes the organization and operation of a semantic network array processor (SNAP) as applicable to high level computer vision problems. The architecture consists of an array of identical cells each containing a content addressable memory, microprogram control, and a communication unit. The applications discussed in this correspondence are the two general techniques, discrete relaxation and dynamic programming. While the discrete relaxation is discussed with reference to scene labeling and edge interpretation, the dynamic programming is tuned for stereo.
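Discrete relaxation, one of the two techniques the correspondence maps onto SNAP, can be summarized as iteratively discarding labels that have no consistent support at a neighboring node. The sketch below is a sequential toy version; SNAP performs this kind of update in parallel across its cell array, and the graph and constraint here are invented.

```python
# Minimal discrete-relaxation sketch (in the style of scene labeling): each node
# keeps a set of candidate labels, and labels with no consistent support at a
# neighboring node are repeatedly discarded until a fixed point is reached.

def discrete_relaxation(labels, neighbors, consistent):
    """labels: {node: set(labels)}, neighbors: {node: [node]},
    consistent(a, la, b, lb) -> bool. Prunes labels until a fixed point."""
    changed = True
    while changed:
        changed = False
        for a in labels:
            for la in list(labels[a]):
                for b in neighbors[a]:
                    if not any(consistent(a, la, b, lb) for lb in labels[b]):
                        labels[a].discard(la)   # no supporting label at neighbor b
                        changed = True
    return labels

# Toy example: two adjacent regions must not share the same label.
labs = {"r1": {"sky", "sea"}, "r2": {"sky"}}
result = discrete_relaxation(labs, {"r1": ["r2"], "r2": ["r1"]},
                             lambda a, la, b, lb: la != lb)
print(result)   # {'r1': {'sea'}, 'r2': {'sky'}}
```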

Journal ArticleDOI
K. Matsumura1, H. Mizutani1, M. Arai1
TL;DR: This paper proposes a computer-aided software design system (CASDS), which supports software engineers with a series of structural modeling methods that help to extract concepts from many fuzzy requirements.
Abstract: In software development, it has been pointed out that software engineers must pay attention to software requirements definition. One of the important problems in software engineering is to rationalize the processes from requirements definition to design. Computer tools are most useful and efficient for this purpose. This paper proposes a computer-aided software design system (CASDS), which supports software engineers with a series of structural modeling methods. As is well known in systems planning, structural modeling helps to extract concepts from many fuzzy requirements. This system contains three structural modeling methods. They are used 1) to determine functional terms from fuzzy software requirements, 2) to obtain modules by structuring the functions with respect to the data flows, and 3) to make a program skeleton by imposing control flows on the functional elements obtained by breaking down the modules.

Journal ArticleDOI
TL;DR: The synergetic combination of interactive 3D computer modeling, graphics, and database management provides an efficient and flexible tool for plant design, verification, and operation.
Abstract: This paper discusses the development philosophy and specific capabilities of an "intelligent" 3D modeling software system for power plants. The synergetic combination of interactive 3D computer modeling, graphics, and database management provides an efficient and flexible tool for plant design, verification, and operation. The advantages of an "intelligent" physical model and system drawings are outlined in detail, and examples of a wide range of applications are given.

Book ChapterDOI
01 Jan 1987
TL;DR: This chapter provides an overview of cognitive complexity theory, which treats elementary units of cognitive skill that are acquired in an all-or-none and once-and-for-all fashion.
Abstract: This chapter provides an overview of cognitive complexity theory. The large-scale introduction of office systems with multifunctional application software has led to several attempts to make the different functions of such a system usable. The different aspects of usability have to be related to psychological models of performance, learning, transfer, and development of cognitive skills and competence. One approach is based on the assumption that both ease of learning and efficiency in use are mainly determined by the degree of consistency between the human–computer interfaces of the respective functional domains. For the construction of indicators of consistency among computerized tasks, one needs a formal language to describe tasks in relation to operations to be performed by a user on a given target system. Cognitive complexity modeling is done using production systems for the representation of procedural knowledge. Production rules as elements of a production system form a complete and consistent description of the task to be fulfilled by the user. They are treated as elementary units of cognitive skill that are acquired in an all-or-none and once-and-for-all fashion.
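The production-rule bookkeeping behind cognitive complexity theory can be illustrated with a small sketch: each task is a set of condition-action rules, and learning or transfer effort is taken to be proportional to the number of rules not already acquired. The rules and editors below are invented examples, not taken from the chapter.

```python
# Illustrative sketch of counting production rules to estimate learning and
# transfer effort. Rule contents are invented; real models use richer notation.

editor_a = {
    ("goal: delete object", "object not selected"): "click object",
    ("goal: delete object", "object selected"):     "press DEL",
    ("goal: move object",   "object not selected"): "click object",
    ("goal: move object",   "object selected"):     "drag object",
}
editor_b_consistent = {
    ("goal: delete object", "object not selected"): "click object",
    ("goal: delete object", "object selected"):     "press DEL",
}
editor_b_inconsistent = {
    ("goal: delete object", "object not selected"): "type object name",
    ("goal: delete object", "object selected"):     "press CTRL-K",
}

def new_rules(known: dict, task: dict) -> int:
    """Rules in `task` that are not already part of the user's acquired skill."""
    return len(set(task.items()) - set(known.items()))

# Transfer: a consistent second editor needs no new rules, an inconsistent one does.
print(new_rules(editor_a, editor_b_consistent))    # 0
print(new_rules(editor_a, editor_b_inconsistent))  # 2
```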

Journal ArticleDOI
01 May 1987
TL;DR: The design of DDM, a general-purpose distributed database management system implemented in Ada that supports the use of Adaplex as interface language is described, which is the first full-scale distributed database system to support a semantically rich, functional data model.
Abstract: Adaplex is an integrated language for programming database applications. It results from the embedding of the database sublanguage Daplex in the general-purpose programming language Ada [1]. This paper describes the design of DDM, a general-purpose distributed database management system implemented in Ada that supports the use of Adaplex as interface language. There are two novel aspects in the design of this system. First, this is the first full-scale distributed database system to support a semantically rich, functional data model. DDM goes beyond systems like Distributed INGRES and R* (which are based on the relational technology) in providing advanced data modeling capabilities and ease of use. Second, this is the first full-function distributed DBMS designed to be compatible with the Ada programming environment. The coupling between Ada and Daplex has been achieved at the expression level, which is much tighter than the statement-level integration attained in previous systems. This tight coupling poses new implementation problems but also creates new opportunities for optimization. The current paper highlights the Adaplex language and discusses innovative aspects in DDM's design that are intended to meet the dual objectives of good performance and high data availability.

Journal ArticleDOI
TL;DR: In this article, the authors present an application of the two-step compensation method to the most commonly performed line-out and line-end fault calculations, demonstrating the use, generality and flexibility of the algorithm.
Abstract: This paper presents an application of the two-step compensation method to the most commonly performed line-out and line-end fault calculations, demonstrating the use, generality and flexibility of the algorithm. The paper is aimed at both computer applications engineers who might implement the algorithm described herein in software packages, and protection engineers who might use the paper's heuristics to efficiently guide their analyses. The compensation-based scheme is extremely powerful for solving problems involving large-scale power systems. Detailed, step-by-step numerical computations are carried out for five examples in this application paper. The examples include calculations for a line-end fault with ensuing line outages. Mutual couplings are included in the network.

Proceedings ArticleDOI
N. J. Elias1
01 Oct 1987
TL;DR: A comparative analysis with conventional layout design, accounting for software development and maintenance, is formulated, and critical factors in planning silicon compilation software development are identified.
Abstract: Philips Laboratories has developed HVDEV, a procedural language layout generator for compiling high voltage MOS device layouts from behavioral specifications. HVDEV is analyzed as a case study in silicon compilation software engineering. The paper formulates a comparative analysis with conventional layout design, accounting for software development and maintenance. Critical factors in planning silicon compilation software development are identified.

Proceedings ArticleDOI
01 Jun 1987
TL;DR: Findings are presented of a study to determine the feasibility of developing and demonstrating a long range autonomous underwater vehicle, and the evolution of the AUV system from simulation through component testing to the at-sea demonstration is discussed.
Abstract: Findings are presented of a study to determine the feasibility of developing and demonstrating a long range autonomous underwater vehicle. Based on a real world scale program need, a technology development and capability demonstration program is described. The program objectives necessary to provide a proof of principle, including expected system performance capabilities, are described together with an activity program for the demonstration system. Sensor systems for navigation, obstacle avoidance, passive detection, vehicle motion and vehicle health are described. Particular attention is paid to the discussion of the hardware and software architecture for the system, with an emphasis on providing as much top-down guidance as possible and on exploiting sensor modality differences to produce complementary perceptual processes in the system. The discussion of the software includes the application of a system capable of supporting parallelism in its knowledge source modules and an organized collection of perceptual and navigation modules tied together through a blackboard. The paper describes the database/communication system, the AUV and system block diagram together with the issues which are inherent in the integration of the multiple sensors of the system. Path planning abilities are described against a background of actual sonar-depth data obtained during the study. Simulations of a proposed vehicle, including six degrees of freedom, in a marine environment are described. The evolution of the AUV system from simulation through component testing to the at-sea demonstration is discussed.
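The blackboard coordination style mentioned in the abstract can be shown schematically: perceptual and navigation knowledge sources post and read hypotheses on a shared blackboard rather than calling each other directly. The module names and data fields below are invented for illustration.

```python
# Schematic sketch of blackboard coordination: independent perceptual and
# navigation modules ("knowledge sources") read and post entries on a shared
# blackboard instead of calling each other directly. All names are invented.

class Blackboard:
    def __init__(self):
        self.entries = {}                 # level -> latest posting

    def post(self, level, value):
        self.entries[level] = value

    def read(self, level, default=None):
        return self.entries.get(level, default)

def sonar_ks(bb):                         # perceptual knowledge source
    bb.post("obstacle", {"bearing_deg": 30, "range_m": 42})

def path_planner_ks(bb):                  # navigation knowledge source
    obstacle = bb.read("obstacle")
    if obstacle and obstacle["range_m"] < 50:
        bb.post("heading_cmd", {"turn_deg": -20})

bb = Blackboard()
for knowledge_source in (sonar_ks, path_planner_ks):   # control loop activates KSs
    knowledge_source(bb)
print(bb.read("heading_cmd"))             # {'turn_deg': -20}
```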

Journal Article
TL;DR: The development and validation of a psychological measure designed to investigate the meaning of documentation for its readers is a first step in a program of research involving the design and implementation of documentation as a factor in MIS utilization.
Abstract: Documentation constitutes a significant, primary interface between the individual whose job tasks are impacted by computer-based applications, and the application system or software package. The MIS literature has identified documentation as a factor influencing project success, software maintenance, and application package usage. However, the professional literature has repeatedly noted the lack of adequate, meaningful documentation in practice. The development and validation of a psychological measure designed to investigate the meaning of documentation for its readers is a first step in a program of research involving the design and implementation of documentation as a factor in MIS utilization. The primary research question addressed by this study is: "What is the nature of dimensions which readers use when evaluating application software documentation?"

Journal ArticleDOI
TL;DR: A methodology is illustrated for developing fault-tolerant application software in two different programming layers, with the advantages of higher protection between application and recovery programs, application independence, application transparency, and maximum use of the underlying hardware architectural features.

Journal ArticleDOI
TL;DR: In this paper, the authors reviewed recent progress in network security analysis in power system control centers and addressed issues concerning modeling, assumptions and limitations, solution techniques, computational efficiency and solution accuracy.

01 Jan 1987
TL;DR: This work addresses two problems in development of application software: the transfer of software from environments optimized for development, to target environments oriented towards development, and the creation of new systems for this purpose.
Abstract: This work addresses two problems in development of application software. One problem concerns the transfer of software from environments optimized for development, to target environments oriented t ...


Proceedings ArticleDOI
01 Apr 1987
TL;DR: SPEED is a software environment that provides facilities for creating networks of programs on a single or multiple processors and a mouse-driven user interface for creating and controlling these networks.
Abstract: SPEED is a software environment that provides facilities for creating networks of programs on a single or multiple processors. A mouse-driven user interface is provided for creating and controlling these networks. Routines and utilities are included for the programmer to develop high-throughput communications among the processes. Program and data control parameters can be set by the user or initialized automatically. An example of speech processing using models of the human auditory system is given.
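The "network of programs" idea can be pictured as independent stages wired together by streams, as one might arrange for a speech-processing chain. The sketch below is illustrative only; it does not use SPEED's routines or interface.

```python
# Toy sketch of a "network of programs": independent processing stages wired
# together by streams, as one might set up for a speech-processing chain.
# The stage names and the pipeline itself are illustrative, not SPEED's API.

def frame_source(samples, frame_len=4):
    for i in range(0, len(samples), frame_len):
        yield samples[i:i + frame_len]

def energy(frames):
    for frame in frames:
        yield sum(x * x for x in frame)

def threshold(energies, level=10.0):
    for e in energies:
        yield e > level                     # crude voice-activity decision

# Wiring the network: each stage consumes the previous stage's output stream.
signal = [0.1, 0.2, 3.0, 2.5, 0.0, 0.1, 0.1, 0.2]
pipeline = threshold(energy(frame_source(signal)))
print(list(pipeline))                       # [True, False]
```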

Book ChapterDOI
01 Jan 1987
TL;DR: The design and implementation of an application model for constructing an exemplar adaptive user interface are presented and conclusions are drawn regarding the potential benefits of application modelling.
Abstract: In developing interactive computer systems, the logical separation of the user interface from the application software is a well recognised design principle. The building of adaptive user interfaces requires this separation to be maintained during the specification and implementation stages. Maintaining this separation places special requirements on the communication between the user interface and the application software. This paper discusses the role of application modelling and a knowledge-based system approach for supporting these requirements. The design and implementation of an application model for constructing an exemplar adaptive user interface are presented, and conclusions are drawn regarding the potential benefits of application modelling.
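The separation the chapter advocates, with the user interface talking to the application only through an explicit application model, can be sketched briefly. The model contents, commands, and class names below are invented examples, not the paper's design.

```python
# Illustrative sketch of keeping the user interface separate from the application
# and letting them communicate only through an application model. Names invented.

class Application:                         # application software proper
    def open(self, filename):  return f"opened {filename}"
    def close(self):           return "closed"

class ApplicationModel:
    """Describes what the application can do, so the UI never calls it directly."""
    def __init__(self, application):
        self._app = application
        self.commands = {
            "open":  {"params": ["filename"], "help": "Open a document"},
            "close": {"params": [],           "help": "Close the current document"},
        }

    def invoke(self, command, **params):
        if command not in self.commands:
            raise ValueError(f"unknown command: {command}")
        return getattr(self._app, command)(**params)

class AdaptiveUI:                          # user interface; adapts using the model
    def __init__(self, model):
        self.model = model
    def menu(self, expertise="novice"):
        # a novice menu shows help text, an expert menu just the command names
        return [c if expertise == "expert" else f"{c} - {d['help']}"
                for c, d in self.model.commands.items()]

ui = AdaptiveUI(ApplicationModel(Application()))
print(ui.menu())                           # novice menu with help strings
print(ui.model.invoke("open", filename="report.txt"))
```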

Journal ArticleDOI
TL;DR: The "Virtual Fastbus Master" is a behavioral model of a general purpose Fastbus master module and the functions of the model, the behavioral modeling technique and the application are described.
Abstract: The "Virtual Fastbus Master" is a behavioral model of a general purpose Fastbus master module. In this paper, we describe the functions of the model, the behavioral modeling technique and the application of the model. Finally, we discuss some conclusions from this application of behavioral modeling.