
Showing papers on "Software portability published in 1989"


Journal ArticleDOI
01 Sep 1989
TL;DR: A methodology for measuring the performance of supercomputers is presented, including 13 Fortran programs that total over 50,000 lines of source code and a set of guidelines that allow portability to several types of machines.
Abstract: This report presents a methodology for measuring the performance of supercomputers. It includes 13 Fortran programs that total over 50,000 lines of source code. They represent applications in several areas of engineering and scientific computing, and in many cases the codes are currently being used by computational research and development groups. We also present the PERFECT Fortran standard, a set of guidelines that allow portability to several types of machines. Furthermore, we present some performance measures and a methodology for recording and sharing results among diverse users on different machines. The results presented in this paper should not be used to compare machines, except in a preliminary sense. Rather, they are presented to show how the methodology has been applied, and to encourage others to join us in this effort. The results should be regarded as the first step toward our objective, which is to develop a publicly accessible data base of performance information of this type.

489 citations


Journal ArticleDOI
01 Nov 1989
TL;DR: To facilitate the delivery of expert systems, the Artificial Intelligence Section of the Mission Planning and Analysis Division at NASA/Johnson Space Center has developed an expert system tool called CLIPS which provides high portability and ease of integration with most external computer languages.
Abstract: Much of the attention in large expert systems has been on LISP based systems. Several problems that are inherent in LISP based systems include: limited availability of LISP machines and LISP languages on conventional computers; low portability; and restricted capability for integration with other languages, particularly data base systems. To facilitate the delivery of expert systems, the Artificial Intelligence Section of the Mission Planning and Analysis Division at NASA/Johnson Space Center has developed an expert system tool called CLIPS which provides high portability and ease of integration with most external computer languages.

57 citations


Patent
10 Aug 1989
TL;DR: In this paper, the authors present a system and method for providing application program portability and consistency across a number of different hardware, database, transaction processing and operating system environments, which includes a plurality of processes for performing one or more tasks required by the application software.
Abstract: A system and method for providing application program portability and consistency across a number of different hardware, database, transaction processing and operating system environments. In the preferred embodiment, the system includes a plurality of processes for performing one or more tasks required by the application software in one or more distributed processors of a heterogeneous or "target" computer. In a run-time mode, program code of the application software is pre-processed, compiled and linked with system interface modules to create code executable by an operating system of the target computer. The executable code, which includes a number of functional calls to the processes, is run by the operating system to enable the processes to perform the tasks required by the application software. Communications to and from the processes are routed by a blackboard switch logic through a partitioned storage area or "blackboard".
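A minimal C sketch of the blackboard routing idea described above, assuming a single in-process array standing in for the partitioned storage area; the partition names, message format, and functions are hypothetical and are not taken from the patent.

```c
/* Hypothetical sketch of the "blackboard" routing idea: requests from the
 * application are posted to a partitioned shared area, and service
 * processes (here, plain functions) pick them up by partition.  All names
 * and the in-process queue are invented for illustration; the patent
 * describes separate processes and a blackboard switch. */
#include <stdio.h>
#include <string.h>

enum { PART_DATABASE = 0, PART_TRANSACTION = 1, NUM_PARTITIONS = 2 };

typedef struct Message {
    char text[64];
    int  used;
} Message;

/* One slot per partition keeps the sketch small. */
static Message blackboard[NUM_PARTITIONS];

static void post(int partition, const char *text)   /* application side */
{
    strncpy(blackboard[partition].text, text, sizeof blackboard[partition].text - 1);
    blackboard[partition].used = 1;
}

static void service(int partition, const char *who) /* process side */
{
    if (blackboard[partition].used) {
        printf("%s handles: %s\n", who, blackboard[partition].text);
        blackboard[partition].used = 0;
    }
}

int main(void)
{
    post(PART_DATABASE, "SELECT * FROM accounts");
    post(PART_TRANSACTION, "commit unit of work 42");
    service(PART_DATABASE, "database process");
    service(PART_TRANSACTION, "transaction process");
    return 0;
}
```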

50 citations


Journal ArticleDOI
TL;DR: A specialized programming language, called Apply, is developed, which reduces the problem of writing the algorithm for this class of programs to the task of writing the function to be applied to a window around a single pixel.
Abstract: Low-level vision is particularly amenable to implementation on parallel architectures, which offer an enormous speedup at this level. To take advantage of this, the algorithm must be adapted to the particular parallel architecture. Having to adapt programs in this way poses a significant barrier to the vision programmer, who must learn and practice a different method of parallelization for each different parallel machine. There is also no possibility of portability for programs written for a particular parallel architecture. We have developed a specialized programming language, called Apply, which reduces the problem of writing the algorithm for this class of programs to the task of writing the function to be applied to a window around a single pixel. Apply provides a method for programming these applications which is easy, consistent, and efficient. Apply is programming model specific (it implements the input partitioning model) but is architecture independent. It is possible to implement versions of Apply which run efficiently on a wide variety of computers. We describe implementations of Apply on Warp, various Warp-like architectures, UNIX, and the Hughes HBA and sketch implementations on bit-serial processor arrays and distributed memory machines.
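The abstract's core idea, writing only the per-pixel window function, can be illustrated with a short C sketch. This is not Apply's actual syntax or runtime; the Sobel window function and the serial driver loop are assumptions standing in for the framework that would partition the image across processors.

```c
/* Sketch of the input-partitioning idea behind Apply (not Apply syntax):
 * the programmer supplies only the per-pixel window function; the driver
 * loop stands in for the framework that would partition the image across
 * processors and handle borders. */
#include <stdio.h>
#include <stdlib.h>

/* User-supplied function: one output pixel from a 3x3 window (Sobel). */
static unsigned char sobel_window(unsigned char w[3][3])
{
    int gx = (w[0][2] + 2 * w[1][2] + w[2][2]) - (w[0][0] + 2 * w[1][0] + w[2][0]);
    int gy = (w[2][0] + 2 * w[2][1] + w[2][2]) - (w[0][0] + 2 * w[0][1] + w[0][2]);
    int mag = abs(gx) + abs(gy);
    return (unsigned char)(mag > 255 ? 255 : mag);
}

/* Framework-side driver: on a parallel target each processor would run
 * this loop over its own partition of the input image. */
static void apply_3x3(const unsigned char *in, unsigned char *out,
                      int rows, int cols,
                      unsigned char (*fn)(unsigned char w[3][3]))
{
    for (int r = 1; r < rows - 1; r++) {
        for (int c = 1; c < cols - 1; c++) {
            unsigned char w[3][3];
            for (int dr = -1; dr <= 1; dr++)
                for (int dc = -1; dc <= 1; dc++)
                    w[dr + 1][dc + 1] = in[(r + dr) * cols + (c + dc)];
            out[r * cols + c] = fn(w);
        }
    }
}

int main(void)
{
    unsigned char in[5 * 5] = {
        0, 0, 0,   0,   0,
        0, 0, 255, 255, 255,
        0, 0, 255, 255, 255,
        0, 0, 255, 255, 255,
        0, 0, 0,   0,   0 };
    unsigned char out[5 * 5] = { 0 };
    apply_3x3(in, out, 5, 5, sobel_window);
    printf("edge response at (2,2): %d\n", out[2 * 5 + 2]);
    return 0;
}
```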

45 citations


Book ChapterDOI
12 Jun 1989
TL;DR: The paper reviews the problems inhibiting the widespread use of parallel processing by both industry and software houses; some remarks on language and software tool developments for parallel programming form the conclusion.
Abstract: The paper reviews the problems inhibiting the widespread use of parallel processing by both industry and software houses. The two key issues of portability of code and of generality of parallel architectures are discussed. An overview of useful computational models and programming paradigms for parallel machines is presented along with some detailed case studies implemented on transputer arrays. Valiant's results on optimally universal parallel machines are reviewed along with the prospects of building truly general-purpose parallel computers. Some remarks on language and software tool developments for parallel programming form the conclusion to the paper.

31 citations


Proceedings ArticleDOI
R. Atkinson, Alan J. Demers, Carl Hauser, C. Jacobi, P. Kessler, Mark D. Weiser
21 Jun 1989
TL;DR: A brief description of the Cedar language, the portability strategy for the compiler and runtime, the manner of making connections to other languages and the Unix operating system, and some measures of the performance of the “Portable Cedar” are presented.
Abstract: Cedar is the name for both a language and an environment in use in the Computer Science Laboratory at Xerox PARC since 1980. The Cedar language is a superset of Mesa, the major additions being garbage collection and runtime types. Neither the language nor the environment was originally intended to be portable, and for many years they ran only on D-machines at PARC and a few other locations in Xerox. We recently re-implemented the language to make it portable across many different architectures. Our strategy was, first, to use machine-dependent C code as an intermediate language, second, to create a language-independent layer known as the Portable Common Runtime, and third, to write a relatively large amount of Cedar-specific runtime code in a subset of Cedar itself. By treating C as an intermediate code we are able to achieve reasonably fast compilation, very good eventual machine code, and all with relatively small programmer effort. Because Cedar is a much richer language than C, there were numerous issues to resolve in performing an efficient translation and in providing reasonable debugging. These strategies will be of use to many other porters of high-level languages who may wish to use C as an assembler language without giving up either ease of debugging or high performance. We present a brief description of the Cedar language, our portability strategy for the compiler and runtime, our manner of making connections to other languages and the Unix operating system, and some measures of the performance of our “Portable Cedar”.
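A hedged illustration of the "C as intermediate language" strategy: a garbage-collected, runtime-typed allocation as it might be lowered to plain C calls into a runtime layer. The Cedar-like fragment, the rt_gc_alloc name, and the stub runtime are invented for this sketch and do not reflect actual Cedar compiler output.

```c
/* Hypothetical lowering of a garbage-collected allocation to plain C
 * (not actual Cedar compiler output).  A Cedar-like fragment such as
 *     p: Pair = NEW[Pair];  p.x = 3;  p.y = 4;
 * might be emitted as calls into a portable runtime layer. */
#include <stdio.h>
#include <stdlib.h>

typedef struct TypeDesc { const char *name; unsigned size; } TypeDesc;

/* Stub standing in for a Portable-Common-Runtime-like allocator; a real
 * implementation would allocate from a garbage-collected heap. */
static void *rt_gc_alloc(const TypeDesc *td) { return calloc(1, td->size); }

typedef struct Pair { int x, y; } Pair;
static const TypeDesc Pair_td = { "Pair", sizeof(Pair) };

int main(void)
{
    Pair *p = rt_gc_alloc(&Pair_td);   /* NEW[Pair] lowered to a runtime call */
    p->x = 3;
    p->y = 4;
    printf("%s(%d, %d)\n", Pair_td.name, p->x, p->y);
    free(p);   /* stand-in only; a collector would reclaim this automatically */
    return 0;
}
```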

31 citations


Journal ArticleDOI
TL;DR: This paper describes a C compiler that uses abstract machine modelling to achieve portability and shows that a small number of very general handwritten patterns yields code that is comparable to the code from compilers that use more sophisticated code generators.
Abstract: Abstract machine modelling is a popular technique for developing portable compilers. A compiler can be quickly realized by translating the abstract machine operations to target machine operations. The problem with these compilers is that they trade execution efficiency for portability. Typically, the code emitted by these compilers runs two to three times slower than the code generated by compilers that employ sophisticated code generators. This paper describes a C compiler that uses abstract machine modelling to achieve portability. The emitted target machine code is improved by a simple, classical rule-directed peephole optimizer. Our experiments with this compiler on four machines show that a small number of very general handwritten patterns (under 40) yields code that is comparable to the code from compilers that use more sophisticated code generators. As an added bonus, compilation time on some machines is reduced by 10 to 20 per cent.
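To make the "rule-directed peephole optimizer" idea concrete, here is a toy C sketch that scans adjacent instruction pairs against a small pattern table. The mnemonics and the single rule are invented; the paper's optimizer works on its own abstract-machine output with a table of under 40 general patterns.

```c
/* Toy rule-directed peephole pass over textual two-instruction windows.
 * The mnemonics and the single rewrite rule are invented for illustration. */
#include <stdio.h>
#include <string.h>

typedef struct Rule {
    const char *first, *second;   /* pattern: two adjacent instructions */
    const char *replacement;      /* single instruction to emit instead */
} Rule;

static const Rule rules[] = {
    /* e.g. a store immediately followed by a load of the same location */
    { "STORE r1,tmp", "LOAD tmp,r1", "STORE r1,tmp" },
};

static void peephole(const char *code[], int n)
{
    for (int i = 0; i < n; i++) {
        int rewritten = 0;
        if (i + 1 < n) {
            for (size_t r = 0; r < sizeof rules / sizeof rules[0]; r++) {
                if (strcmp(code[i], rules[r].first) == 0 &&
                    strcmp(code[i + 1], rules[r].second) == 0) {
                    puts(rules[r].replacement);
                    i++;             /* skip the matched pair */
                    rewritten = 1;
                    break;
                }
            }
        }
        if (!rewritten)
            puts(code[i]);
    }
}

int main(void)
{
    const char *code[] = { "ADD r1,r2", "STORE r1,tmp", "LOAD tmp,r1", "MUL r1,r3" };
    peephole(code, 4);   /* prints ADD, STORE, MUL: the redundant LOAD is gone */
    return 0;
}
```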

30 citations


Journal ArticleDOI
TL;DR: SDEF is intended to provide systolic algorithm researchers/developers with an executable notation, and the software systems community with a target notation for the development of higher-level systolic software tools.

29 citations


Proceedings ArticleDOI
27 Mar 1989
TL;DR: The authors describe the PUNDIT (Prolog Understanding of Integrated Text) text-understanding system, which is designed to analyze and construct representations of paragraph-length text that is flexible, extensible and portable.
Abstract: The authors describe the PUNDIT (Prolog Understanding of Integrated Text) text-understanding system, which is designed to analyze and construct representations of paragraph-length text. PUNDIT is implemented in Quintus Prolog, and consists of distinct lexical, syntactic, semantic, and pragmatic components. Each component draws on one or more sets of data, including a lexicon, a broad-coverage grammar of English, semantic verb decompositions, rules mapping between syntactic and semantic constituents, and a domain model. Modularity, careful separation of declarative and procedural information, and separation of domain-specific and domain-independent information all contribute to a system which is flexible, extensible and portable. Versions of PUNDIT are now running in five domains, including four military and one medical.

28 citations


Journal ArticleDOI
M. A. Jenkins
TL;DR: The paper describes the design of the Q'Nial interpreter, discusses constraints imposed by the desire to achieve portability and comments on what has been learned in building an interpreter in this style.
Abstract: The Q'Nial interpreter combines ideas from APL and Lisp implementations to provide a rich programming environment that supports several paradigms of programming. The interpreter is structured to reflect the division of Nial semantics into levels corresponding to zero-, first- and second-order objects. The paper describes the design of the interpreter, discusses constraints imposed by the desire to achieve portability and comments on what has been learned in building an interpreter in this style.

26 citations


Journal ArticleDOI
TL;DR: A graphic query language (GQL) which enables users to retrieve information selectively from land data banks, in both report and graphical forms, is extremely useful.
Abstract: In the computing industry, many regional and international standards organizations such as the American National Standards Institute (ANSI), the Institute of Electrical and Electronics Engineers (IEEE) and the International Standards Organization (ISO) are actively engaged in the formulation of standards in an attempt to facilitate the development of computer software. It is therefore appropriate to take advantage of standards whenever possible to allow portability of the software. For users of land and geographical information, it is envisaged that there will be more land data banks being made available for public enquiries. There will certainly be a need to have portable software that can be easily used by users with low-end computing resources to retrieve data selectively, in both report and graphical forms, from land data banks. A high-level query language to facilitate such selection is extremely useful. A graphic query language (GQL) which enables users to retrieve information selecti...

Proceedings ArticleDOI
13 Nov 1989
TL;DR: A formalism for specifying the information about applications needed by the user interface building blocks (i.e. the UIMS/Application interface) so that all building blocks share a common set of assumptions is described.
Abstract: The user interface building blocks of any User Interface Management System (UIMS) have built-in assumptions about what information about application programs they need, and assumptions about how to get that information. The lack of a standard to represent this information leads to a proliferation of different assumptions by different building blocks, hampering changeability of the user interface and portability of applications to different sets of building blocks. This paper describes a formalism for specifying the information about applications needed by the user interface building blocks (i.e. the UIMS/Application interface) so that all building blocks share a common set of assumptions. The paper also describes a set of user interface building blocks specifically designed for these standard UIMS/Application interfaces. These building blocks can be used to produce a wide variety of user interfaces, and the interfaces can be changed without having to change the application program.
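A small C sketch of what a shared UIMS/Application interface description might look like: the application declares its operations and parameter types once, and building blocks drive them through that declaration. All names and the struct layout are assumptions for illustration, not the paper's formalism.

```c
/* Hypothetical sketch of a declarative UIMS/Application interface: the
 * application registers its operations and their parameter types once,
 * and any set of user-interface building blocks can consume the same
 * description.  All names here are invented for illustration. */
#include <stdio.h>

typedef enum { ARG_INT, ARG_REAL, ARG_STRING } ArgType;

typedef struct Param {
    const char *name;
    ArgType     type;
} Param;

typedef struct AppOperation {
    const char  *name;                 /* e.g. shown as a menu item or button */
    const Param *params;               /* what the UI must collect from the user */
    int          nparams;
    void       (*invoke)(void **args); /* called by the building blocks */
} AppOperation;

/* --- application side ------------------------------------------------ */
static void do_rotate(void **args)
{
    printf("rotate by %d degrees\n", *(int *)args[0]);
}

static const Param rotate_params[] = { { "angle", ARG_INT } };

static const AppOperation app_interface[] = {
    { "Rotate", rotate_params, 1, do_rotate },
};

/* --- building-block side: knows nothing about the application beyond
 *     the interface description above -------------------------------- */
int main(void)
{
    int angle = 90;
    void *args[] = { &angle };
    app_interface[0].invoke(args);  /* a menu or dialog block would do this */
    return 0;
}
```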

Journal ArticleDOI
TL;DR: The real-time performance of the REAL/IX operating system is discussed and compared to the MASSCOMP RTU operating system.
Abstract: The UNIX® operating system, developed by AT&T Bell Laboratories, has become a standard operating system gaining rapid acceptance because of its superior flexibility, portability, and a number of support tools to increase programmer productivity. However, UNIX was originally designed for multitasking and time-sharing, and therefore conventional UNIX does not provide the response time and data throughput needed to support real-time applications. Many attempts have been made to adapt the UNIX kernel to provide a real-time environment. MODCOMP has developed the REAL/IX operating system, which is a fully preemptive, low-latency UNIX kernel. This paper discusses the real-time performance of REAL/IX and compares it to the MASSCOMP RTU operating system.

Book
01 Jan 1989
TL;DR: Information is provided on how to set up and manage program subroutines in a robust way and how to manage these systems from the command line.
Abstract: Specifications for a Robust Code.- Program Subroutines.- Portability Issues.- User's Guide.

Journal ArticleDOI
TL;DR: In this paper, a computational model of the human process of generating texts is described, and a text generator called GENNY is presented, with particular attention to domain independence and cross-language portability.
Abstract: This article describes a computational model of the human process of generating texts: GENNY. After an introduction to generation, the article illustrates the overall system structure with some typical output and then presents the theoretical and practical motivation for the text generator. Next, the system design and generation process are discussed with particular attention to domain independence and cross-language portability. Then system tests are discussed and the generator is evaluated with respect to current generators. Finally, future directions are suggested.

Book ChapterDOI
Jeff Rothenberg
01 Jan 1989
TL;DR: A framework of evaluation criteria and a methodology for selecting an expert system tool are presented, not as a final answer to a fixed problem, but as a strategy for dealing with a dynamic problem whose complexity reflects the health of a research area whose impact on software engineering is only beginning to be felt.
Abstract: This chapter presents a framework of evaluation criteria and a methodology for selecting an expert system tool. Evaluating and choosing a tool requires matching a tool to its intended use including all aspects of the problem domain, the problem itself, and the anticipated project. Because of the evolving and inconsistent terminology in this new field, comparing features of different tools is of limited utility and limited longevity. Instead, the capabilities provided by these features must be analyzed, evaluated, and compared. The framework shows how to use specific assessment techniques to apply specific metrics to specific capabilities of a tool for a specific application in a specific context. The development of expert systems is reflected in the importance of issues such as integration, database access, portability, fielding, maintainability, robustness, reliability, concurrent access, performance, user interface, debugging support, and documentation. Though the difficulty of comparing and selecting tools may be daunting to a developer faced with a decision, this difficulty is largely a result of the richness of the field and the bewildering pace at which new ideas are being incorporated into tools. The evaluation approach is offered, not as a final answer to a fixed problem, but as a strategy for dealing with a dynamic problem whose complexity reflects the health of a research area whose impact on software engineering is only beginning to be felt.

Proceedings ArticleDOI
21 Feb 1989
TL;DR: A knowledge acquisition tool (KNACQ) is described that has sharply decreased the effort in building knowledge bases and is used by both the understanding components and the generation components of Janus.
Abstract: Although natural language technology has achieved a high degree of domain independence through separating domain-independent modules from domain-dependent knowledge bases, portability, as measured by effort to move from one application to another, is still a problem. Here we describe a knowledge acquisition tool (KNACQ) that has sharply decreased our effort in building knowledge bases. The knowledge bases acquired with KNACQ are used by both the understanding components and the generation components of Janus.

Journal ArticleDOI
TL;DR: The paper presents a set of techniques for building robust software, using as an example a case study of the development of the SUPPORT programming environment for a microprocessor, and discusses some of the attributes built into the system to handle the concerns of portability, functionality, and effectiveness.
Abstract: The paper presents a set of techniques for building robust software, using as an example a case study of the development of the SUPPORT programming environment for a microprocessor. It discusses some of the attributes built into the system to best handle the concerns of portability, functionality, and effectiveness that we had to deal with in order for the system to be reliable on a small machine, and it describes some of the choices made to obtain the best use of available hardware.

Journal ArticleDOI
TL;DR: The critical design decisions affecting CLEER's development are summarized along with lessons learned to date, all in terms of the "how," "why," and "when" of specific features that are being developed.
Abstract: This paper describes a modular AI (artificial intelligence) system being built to help improve electromagnetic compatibility among shipboard topside equipment and its associated systems. CLEER, as it is called, is intended to serve as an easy-to-use integrator of existing expert knowledge, pre-existing databases, and large-scale analytical models. These interfaces, the need for software portability, and certain AI-related design requirements made traditional expert system shells inappropriate, although relatively off-the-shelf AI technology could be incorporated. The critical design decisions affecting CLEER's development are summarized along with lessons learned to date, all in terms of the "how," "why," and "when" of specific features that are being developed.

Proceedings ArticleDOI
14 May 1989
TL;DR: The authors outline the steps that a researcher can follow in order to integrate a control, sensing hardware interface, or data analysis algorithm into an RCTS-based robot system, which is designed for flexibility, portability, ease of modification and ease of use.
Abstract: A description is given of the Robot Controller Test Station (RCTS), a software environment for implementing, testing, and evaluating robot control and sensor algorithms in both a simulated and a real robot system. RCTS is designed for flexibility, portability, ease of modification and ease of use. The authors discuss many of the design and implementation issues for RCTS. They outline the steps that a researcher can follow in order to integrate a control, sensing hardware interface, or data analysis algorithm into an RCTS-based robot system. These steps do not require the researcher to be an expert programmer or even to comprehend the remainder of the robot software, so he or she can concentrate on algorithm design and optimization.

Journal ArticleDOI
TL;DR: In this article, the authors use the computer as an on-line interactive device in which users do all modelling without resorting to pre-written programs, such as SAS, SPSS, TSP, or Microsoft to databases.
Abstract: ECONOMICS: ON-LINE AND INTERACTIVE. I. INTRODUCTION. Computers, though radically altering economic research over the last 30 years, have only recently begun to change teaching and have had limited impact on economic reasoning and methodology beyond econometrics and simulation modelling. Computer software now supplements texts with test banks and programmed exercises. Computer text editors permit faster and easier word processing. In addition, students and others undertake very sophisticated statistical and econometric research projects that involve prewritten, or "canned", programs, such as SAS, SPSS, TSP, or Microsoft to databases. Users follow rote predetermined sets of instructions, and the result is printed output containing statistics, regression coefficients, and the like. The user need not know or understand how or why these particular results are produced by the "black box." This is essentially a non-intellectual activity. Here I propose use of the computer as an on-line interactive device in which users do all modelling without resorting to prewritten programs. No existing model has been devised by an expert, instructor, publisher, or other modeller. There is, therefore, no black box. This on-line interactive approach has pedagogical, scientific, and intellectual virtues. First, one must know the economic statistics or econometrics needed to develop the tests or results desired. This avoids bad intellectual habits--reliance on partially understood methods in one's research. Second, one learns precisely how equations, models, and statistics are generated. Students in a classroom or individual researchers, who only partially understand material, cannot escape notice. Prospects of detecting confusion, and thus providing remedial action, increase. This raises the likelihood of accurate results. Third, through repeated application, tools of analysis become part of one's lexicon. In other words, ideas become the author's rather than those of an unknown expert. Fourth, and perhaps most important, each user has the maximum degree of research flexibility and portability. This advantage is especially important in light of the dynamic nature of economic analysis. How many economists have learned a program only to discover new tests or techniques are required to satisfy new research demands and that these techniques are not available on the package? Thus, users become capable of undertaking intellectually meaningful research--running tests that they choose for their own reasons, rather than following rote procedures of a canned program. The interactive aspect enables users to obtain immediate feedback on original ideas and explore alternatives as they occur. The computer becomes an extension of one's intellect rather than a fast machine. On-line interactive computing and instruction is made possible by several languages. True Basic, developed at Dartmouth College, is such a language, as I understand are newer versions of Fortran and Gauss. I use APL, "A Programming Language," to teach students to work without written programs. APL is compact, flexible, and well-suited to economic and statistical analysis. Its strength derives from the fact that the unit of operation is the array which may be a matrix, vector, or scalar rather than simply the scalar, as is the case with other languages such as Basic, Fortran IV, Cobol, and Pascal. Benefits do not come without costs. First, users must learn a language.
In the case of APL, portions of the character set are unique so that one must learn new symbols. Second, writing each formula anew while doing research can be tedious and distracting. The virtue of programs is that one can obtain numerous results without much intellectual effort. In my judgment, programming is worth the extra effort. One can minimize distractions and time by effective use of well-designed languages. Once conversant in programming, one can develop one's own models and programs. …

Book ChapterDOI
01 Mar 1989
TL;DR: The present paper discusses the configuration of this CTRON Reference Model and the concepts that went into its design, focusing on the following points.
Abstract: CTRON is an operating system interface that provides functions which can be applied commonly to a variety of nodes in an information communication network, and is designed to contribute to software portability improvement. As a way of ensuring a common conceptualization of CTRON functions and configuration among both users and system designers, a CTRON Reference Model has been devised, giving an overview of the interface configuration. The present paper discusses the configuration of this Reference Model and the concepts that went into its design, focusing on the following points. 1) Requirements for creation of the CTRON Reference Model. 2) Configuration of the CTRON Reference Model, and detailed structure within interface classes.

Book ChapterDOI
01 Jan 1989
TL;DR: As part of its Decentralized Hospital Computer Program, the Veterans Administration installed new hospital information systems in 169 of its facilities during 1984 and 1985; the application software is based on the ANS MUMPS language, is public domain, and is designed to be operating system and hardware independent.
Abstract: As part of its Decentralized Hospital Computer Program (DHCP) the Veterans Administration installed new hospital information systems in 169 of its facilities during 1984 and 1985. The application software for these systems is based on the ANS MUMPS language, is public domain, and is designed to be operating system and hardware independent. The software, developed by VA employees, is built upon a layered approach, where application packages layer on a common data dictionary which is supported by a Kernel of software. Communications between facilities are based on public domain Department of Defense ARPA net standards for domain naming, mail transfer protocols, and message formats, layered on a variety of communications technologies.

Proceedings ArticleDOI
20 Sep 1989
TL;DR: A programming methodology that implements many object-oriented features within a conventional programming environment is described, which provides all the benefits of a C/Unix environment, including portability, a rich variety of development tools, and efficiency.
Abstract: A programming methodology that implements many object-oriented features within a conventional programming environment is described. The methodology was created during the development of a computer animation system, The Clockworks. The methodology supports such object-oriented features as objects with variables and methods, class hierarchies, variable and method inheritance, object instantiation, and message passing. The methodology does not employ any special keywords or language extensions, thus removing the need for a preprocessor or compiler. The methodology has been implemented in a C/Unix environment. This allows the environment and any system developed within it to be ported to a wide variety of computers which support Unix. The methodology provides many object-oriented features and associated benefits. It also provides all the benefits of a C/Unix environment, including portability, a rich variety of development tools, and efficiency.
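The conventions the abstract describes (objects with variables and methods, inheritance, message passing) can be sketched in plain C roughly as follows; the class-record layout and names are assumptions, not The Clockworks' actual scheme.

```c
/* Sketch of object-oriented conventions in plain C, in the spirit the
 * abstract describes (no preprocessor or language extension). */
#include <stdio.h>
#include <stdlib.h>

/* "Class" record: method table plus a link to the superclass, which a
 * fuller scheme would search for inherited methods. */
typedef struct Class {
    const char         *name;
    const struct Class *super;
    void              (*draw)(void *self);   /* a "message" objects answer */
} Class;

typedef struct Object {   /* every instance starts with its class pointer */
    const Class *isa;
} Object;

typedef struct Sphere {
    Object base;           /* inherits Object's variables */
    double radius;
} Sphere;

static void object_draw(void *self)
{
    printf("<%s: nothing to draw>\n", ((Object *)self)->isa->name);
}

static void sphere_draw(void *self)
{
    printf("sphere of radius %g\n", ((Sphere *)self)->radius);
}

static const Class ObjectClass = { "Object", NULL,         object_draw };
static const Class SphereClass = { "Sphere", &ObjectClass, sphere_draw };

/* "Message passing": dispatch through the receiver's class. */
static void send_draw(void *receiver)
{
    ((Object *)receiver)->isa->draw(receiver);
}

int main(void)
{
    Sphere *s = malloc(sizeof *s);     /* object instantiation */
    s->base.isa = &SphereClass;
    s->radius = 2.5;
    send_draw(s);                      /* prints: sphere of radius 2.5 */
    free(s);
    return 0;
}
```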

31 Oct 1989
TL;DR: The Mneme store is reported on, discussing the original design, the construction of the initial prototype, and the approaches being considered for the next prototype.
Abstract: The Mneme project is an investigation of techniques for integrating programming language and database features to provide better support for cooperative, information-intensive tasks such as computer aided software engineering. We report here on the Mneme persistent object store, discussing the original design, the construction of the initial prototype, and approaches being considered for the next prototype. Mneme stores 'objects', with a simple and general format, and preserves the identity of the objects and their structural relationships. Mneme's goals include portability, extensibility (especially with respect to object management policies), and low overhead. The model of memory that Mneme aims to present is a single, cooperatively shared heap, distributed across a collection of networked computers.
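A hedged C sketch of a "simple and general" object format that preserves identity and structural relationships by referring to other objects through stable identifiers rather than addresses; the layout and field names are invented and are not Mneme's actual format.

```c
/* Hypothetical persistent object format: each object has a stable
 * identifier, a few bytes of uninterpreted data, and slots holding the
 * identifiers of the objects it references, so identity and structure
 * survive being written out and read back. */
#include <stdio.h>

typedef unsigned ObjId;               /* stable identity, not an address */

typedef struct PersistentObject {
    ObjId    id;
    unsigned nbytes;                  /* size of uninterpreted data */
    unsigned nrefs;                   /* number of outgoing references */
    ObjId    refs[4];                 /* structural relationships */
    unsigned char data[16];           /* the object's own bytes */
} PersistentObject;

int main(void)
{
    PersistentObject node  = { 1, 4, 1, { 2 }, { 'h', 'e', 'a', 'd' } };
    PersistentObject child = { 2, 4, 0, { 0 }, { 't', 'a', 'i', 'l' } };

    /* A store would fault objects in by id; here we just follow one ref. */
    PersistentObject *heap[] = { &node, &child };
    PersistentObject *next = heap[node.refs[0] - 1];
    printf("object %u -> object %u (%u bytes)\n", node.id, next->id, next->nbytes);
    return 0;
}
```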

Journal ArticleDOI
TL;DR: This paper details the design concepts and the functional and configurational features of the CTRON interfaces, and reports on implementations to date.

Proceedings ArticleDOI
27 Mar 1989
TL;DR: The implementation of the RCTS design is discussed, along with its advantages for integrating sensors into the distributed robot control environment.
Abstract: In present robot systems, much time and effort is expended in implementing, testing and evaluating sensor processing algorithms and control schemes. This tends to reduce creativity in prototype development and may discourage modernization as technology improves. Therefore, we have devised a "Robot Controller Test Station" (RCTS). RCTS is an environment for implementing, testing and comparing novel adaptive control and sensor processing algorithms. This environment integrates a dynamics simulator and a real robot into a distributed processing environment. Unlike other test stations, the RCTS design emphasizes flexibility, portability, ease of modification and ease of use. The software design of RCTS makes extensive use of standardized signal names, state tables and clearly bounded control and communication blocks. Therefore, new interfaces to hardware and processing routines can be easily integrated into RCTS. The robot control problem is modularized into three levels with control processing separated from sensor processing. This six-module structure and the prioritized communication scheme were chosen to reduce the response time of the robot to sensor data. This paper discusses the implementation of the design of RCTS and discusses its advantages for integrating sensors into the distributed robot control environment.
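As a rough illustration of the state-table style the abstract mentions (standardized signal names, state tables, bounded control blocks), a module could be written as a table mapping (state, signal) pairs to an action and a next state. The states, signals, and actions below are invented for the sketch.

```c
/* Hypothetical state-table-driven control module; states, signals and
 * actions are invented for illustration, not taken from RCTS. */
#include <stdio.h>

typedef enum { ST_IDLE, ST_MOVING, ST_FAULT, NUM_STATES } State;
typedef enum { SIG_START, SIG_SENSOR_LIMIT, SIG_RESET, NUM_SIGNALS } Signal;

typedef struct Transition {
    void (*action)(void);
    State next;
} Transition;

static void begin_motion(void) { puts("command joint motion"); }
static void halt_motion(void)  { puts("halt and raise fault"); }
static void clear_fault(void)  { puts("clear fault, return to idle"); }
static void ignore(void)       { }

/* One row per state, one column per standardized signal name. */
static const Transition table[NUM_STATES][NUM_SIGNALS] = {
    /* ST_IDLE   */ { { begin_motion, ST_MOVING }, { ignore, ST_IDLE },       { ignore, ST_IDLE } },
    /* ST_MOVING */ { { ignore, ST_MOVING },       { halt_motion, ST_FAULT }, { ignore, ST_MOVING } },
    /* ST_FAULT  */ { { ignore, ST_FAULT },        { ignore, ST_FAULT },      { clear_fault, ST_IDLE } },
};

int main(void)
{
    State s = ST_IDLE;
    Signal inputs[] = { SIG_START, SIG_SENSOR_LIMIT, SIG_RESET };
    for (int i = 0; i < 3; i++) {
        const Transition *t = &table[s][inputs[i]];
        t->action();          /* run the bounded control block */
        s = t->next;          /* advance to the next state */
    }
    return 0;
}
```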

Proceedings ArticleDOI
27 Nov 1989
TL;DR: A novel configuration which achieves TDMA (time division multiple access) functions on a connected bus topology for wide area applications, resulting in an integrated voice-data (IVD) transmission capability, a high throughput, connection portability for stations, and high reliability, is proposed.
Abstract: A novel configuration which achieves TDMA (time division multiple access) functions on a connected bus topology for wide area applications, resulting in an integrated voice-data (IVD) transmission capability, a high throughput, connection portability for stations, and high reliability, is proposed. Loop LANs that are composed of multiple bus lines and operate synchronously with a demand-assignment function are presented. The delay time and throughput are derived, revealing that a wide area system with a loop length of less than 25 km can be achieved. It is verified that all the loop systems considered (bidirectional, partial bidirectional, and one-way, in that order) are much more attractive than any existing LAN system with regard to IVD capability, throughput performance, portability, and reliability.

Proceedings ArticleDOI
25 Sep 1989
TL;DR: The capabilities of the Universal Ada Test Language (UATL) have been expanded after it was rehosted to an HP Model 9000/Series 300 processor, and a set of test program generation and soft panel instrument control functions were added.
Abstract: The capabilities of the Universal Ada Test Language (UATL) have been expanded. The UATL was rehosted to an HP Model 9000/Series 300 processor, and a set of test program generation and soft panel instrument control functions were added. HP's interactive test generator (ITG) includes ASCII instrument driver files that characterize the instrument functions and define interactive soft-panel control screens for developing test programs in BASIC. UATL/Ada-based instrument and operator screen control procedures have been written that provide similar functions for generating test programs in Ada. UATL provides a complete set of test support functions in Ada for performing closed-loop testing of a unit-under-test through its external interfaces. In addition to the instrument control functions, the UATL provides real-time digital stimulus/response controls over software or hardware interfaces, online test control and performance monitoring, data recording, and both ASCII and graphical data reduction. Ada introduces the advantages of portability and reusability, extensive data consistency checking, and built-in multitasking support. >