
Showing papers on "Software as a service published in 1994"


Patent
28 Dec 1994
TL;DR: In this article, the customer interface provides a seamless front end across the different software platforms of the network, allowing the customer to specify a range of optional operations to be performed at the service site, including service research, requesting service, applying service, and installing fixes from the service site to the remote location.
Abstract: A computer network system includes a central software service site that operates with a customer interface through which a customer at a remote location can request service and receive updated executable code back from the service site. The customer interface provides a seamless front end across the different software platforms of the network. A customer initiates servicing of a program product by composing a service request through the front end, which provides a mechanism for the collection of information regarding the nature of the customer request. The front end permits the customer to specify a range of optional operations to be performed at the service site, including service research, requesting service, applying service, and installing fixes from the service site to the remote location. A service facility at the service site performs the requested service and provides the results back to the customer, or collects the service and returns the product and service to the remote location for application of service. The source code for the program product being updated resides only at the service site. All code is returned to the remote location over the network in a form that is ready to be executed. In a distributed implementation, the service site is provided as a central node and one or more slave nodes that also perform service.
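The request/response flow described above can be sketched minimally. All names here (`ServiceRequest`, `handle_request`, `FIX_LIBRARY`) are hypothetical; the sketch only illustrates the core idea that the customer specifies optional operations and receives ready-to-execute code back, while source stays at the service site.

```python
from dataclasses import dataclass

@dataclass
class ServiceRequest:
    product: str
    operations: list  # e.g. ["service_research", "apply_service", "install_fixes"]

# Only the service site holds source; clients receive executables.
FIX_LIBRARY = {"editor": b"updated-executable-image"}

def handle_request(req: ServiceRequest) -> dict:
    """Perform the requested optional operations at the service site."""
    result = {"product": req.product, "performed": list(req.operations)}
    if "install_fixes" in req.operations:
        # Code is returned in a form that is ready to be executed.
        result["executable"] = FIX_LIBRARY.get(req.product)
    return result

resp = handle_request(ServiceRequest("editor", ["service_research", "install_fixes"]))
```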

71 citations


Patent
08 Feb 1994
TL;DR: In this article, the authors propose a system for quickly and appropriately distributing software, prepared and updated by a software provider, to many software users.
Abstract: PURPOSE: To quickly and appropriately distribute software, prepared and updated by a software provider, to many software users. CONSTITUTION: When the object software 1a of one of the user computers 1-1 to 1-n is activated, a client program 1b detects this and reports the current edition to the server program 3a of the provider's computer 3 through a network 2. The server program 3a compares this information with the contents of a software library 3b and returns update instruction information for the object software 1a together with the updated edition of the software. Using this information, the client program 1b automatically updates the object software 1a to the latest edition. The system can also activate the client program 1b at a set time, or automatically transmit fault/bug information about the object software to the server program 3a. COPYRIGHT: (C)1995,JPO
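The edition-check protocol in this patent can be sketched as follows. The function names, dictionary layout, and edition numbers are invented for illustration: on activation the client reports its edition, the server compares it against its software library, and the client applies any newer edition it receives.

```python
# Hypothetical server-side software library: name -> latest edition.
SOFTWARE_LIBRARY = {"object_software": {"edition": 3, "code": "v3-code"}}

def server_check(name: str, client_edition: int) -> dict:
    """Compare the client's edition with the library's latest."""
    entry = SOFTWARE_LIBRARY.get(name)
    if entry is not None and entry["edition"] > client_edition:
        return {"update": True, **entry}
    return {"update": False}

def client_on_activation(name: str, local: dict) -> dict:
    """Run when the object software starts, as the client program would."""
    reply = server_check(name, local["edition"])
    if reply["update"]:
        # Automatically update the object software to the latest edition.
        local.update(edition=reply["edition"], code=reply["code"])
    return local

state = client_on_activation("object_software", {"edition": 2, "code": "v2-code"})
```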

33 citations



Proceedings ArticleDOI
14 Feb 1994
TL;DR: In order to coordinate a "late binding" of processes, a special ODP Trader must manage the integration of new and existing services.
Abstract: Open Systems Interconnection (OSI) supports static connections between two application processes. Extending this, Open Distributed Processing (ODP) realizes dynamic connections among a varying number of application processes. Much work has been done to standardize the concept of ODP. In order to coordinate a "late binding" of processes, a special ODP Trader must manage the integration of new and existing services. This task is supported by service management, which is introduced in the following.
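The trader's role can be sketched as a toy register/lookup service. The interface below is invented for illustration, not taken from any ODP standard text; the point is "late binding": an importer binds to a provider only at the moment of use, so new services can join at any time.

```python
# A toy ODP-style trader: exporters register service offers, importers
# look one up at run time. Interface names are illustrative only.
class Trader:
    def __init__(self):
        self._offers = {}  # service type -> list of provider callables

    def export(self, service_type, provider):
        """A new service is integrated alongside existing offers."""
        self._offers.setdefault(service_type, []).append(provider)

    def import_(self, service_type):
        offers = self._offers.get(service_type)
        if not offers:
            raise LookupError(f"no offer for {service_type}")
        return offers[0]  # a real trader would match on service properties

trader = Trader()
trader.export("printing", lambda doc: f"printed {doc}")
print_service = trader.import_("printing")  # bound at run time, not statically
```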

11 citations


Book ChapterDOI
01 Jan 1994
TL;DR: The objective of this paper is to study the software development process in a high performance computing environment and to outline the stages typically encountered in this process.
Abstract: Software development in a High Performance Computing (HPC) environment is non-trivial and requires a thorough understanding of the application and the architecture. The objective of this paper is to study the software development process in a high performance computing environment and to outline the stages typically encountered in this process. Support required at each stage is also highlighted. The modeling of stock option pricing is used as a running example in the study.

7 citations


Proceedings ArticleDOI
06 Nov 1994
TL;DR: A software maintenance methodology, The Software Service Bay, is introduced which is analogous to the automotive service bay which employs a number of experts for particular maintenance problems.
Abstract: A software maintenance methodology, The Software Service Bay, is introduced. This methodology is analogous to the automotive service bay which employs a number of experts for particular maintenance problems. Problems in maintenance are reformulated so they may be solved with current AI tools and technologies.

6 citations


Journal ArticleDOI
TL;DR: During the mainframe era, software grew to become a major factor of business, government and military operations and many countries are now entering the commercial software domain, and this trend should accelerate in the 21st century.
Abstract: During the mainframe era, software grew to become a major factor of business, government and military operations. Mainframe software development favoured the industrialised nations such as the United States because of the high levels of investment needed for the mainframe computers themselves, and also because of the specialised data centres, cooling systems and power supplies that mainframes required. Personal computers and industrial microcomputers are changing the situation dramatically. The capital investment required to supply a programming staff with personal computers is almost trivial. Software development in the future can be carried out almost anywhere in the world. Software is a comparatively ‘green’ industry that is not harmful to the environment and uses little in the way of natural resources. Many countries are now entering the commercial software domain, and this trend should accelerate in the 21st century. Software usage and consumption are also expanding rapidly on a global basis.

4 citations


Journal ArticleDOI
TL;DR: Three basic capabilities required of every software CM environment are identified, the desirability of four additional capabilities is discussed, a generalized note of caution on CM tools in a client-server environment is provided, and lessons learned throughout are integrated.
Abstract: Software configuration management (CM) environments are an active field of industrial endeavor that has not been well discussed in the software engineering literature. The capabilities required by those environments are in addition to the usual capabilities specified for software development environments and have not yet found their way into the integrated project support environment discussions. This article identifies three basic capabilities required of every software CM environment, discusses the desirability of four additional capabilities, provides a generalized note of caution on CM tools in a client-server environment, and integrates lessons learned throughout.

4 citations


Book
01 Nov 1994
TL;DR: This publication describes a set of techniques for creating digital spatial databases, atlases, and wall maps that have proved to be relatively inexpensive, quick, and easy to learn, and to have delivered usable products.
Abstract: This publication describes a set of techniques for creating digital spatial databases, atlases, and wall maps. When applied to mapping, these techniques form a small subset of geographic information systems (GIS) technology, which deals with using computing equipment to acquire and use spatial data. The book describes a specific body of practical methods that have proved to be relatively inexpensive, quick, and easy to learn, and to have delivered usable products. The techniques make use of general-purpose, low-cost desktop computing hardware and software. Appendixes include: a glossary, recommendations on how to maintain a digital database, some very specific information on digital satellite data, background information on the hardware and software used in developing the techniques, and selected readings.

3 citations


Book ChapterDOI
18 Apr 1994
TL;DR: InfoMall is a programme led by the Northeast Parallel Architectures Center featuring a partnership of over twenty-five organisations and a plan for accelerating development of the High Performance Computing and Communications (HPCC) software and systems industry.
Abstract: InfoMall is a programme led by the Northeast Parallel Architectures Center (NPAC) featuring a partnership of over twenty-five organisations and a plan for accelerating development of the High Performance Computing and Communications (HPCC) software and systems industry. HPCC (or HPCN as it is known in Europe) is a critical technology which will have unprecedented impact on industry, education, society, and defense. Acceptance of HPCC by these real world sectors is held up by the extremely hard problem of HPCC software development. InfoMall employs a novel technology development strategy involving closely linked programmes in technology execution and certification, software development, marketing, education and training, economic development and small business support. InfoMall has excellent HPCC and other facility infrastructure. InfoMall partners have unrivaled expertise in all the areas critical to rapid development of the HPCC software industry. The process is constructed and explained by analogy to the full-service set of stores found in a shopping mall. InfoMall is a concept which can create 15,000 jobs in New York State alone, and be scaled in future years to create an order of magnitude more jobs nationally.

3 citations


Journal ArticleDOI
TL;DR: What every software supplier and user should know about these latest developments in software protection is outlined.
Abstract: 1993 brought a series of changes in the protection and licensing of software, which mark a shift in the legal and commercial balance between suppliers and users. In particular, they introduced a number of additional rights for software users, some of which will apply irrespective of anything in the relevant contract. Everyone involved in the development, licensing or use of computer programs should therefore be aware of the changes. Here, the author outlines what every software supplier and user should know about these latest developments in software protection.

20 Jun 1994
TL;DR: The Connection Machine Scientific Software Library, CMSSL, contains routines for managing the data distribution and provides data distribution independent functionality and evidence is provided that CMSSL has reached the goals of performance and scalability for an important set of applications.
Abstract: Massively parallel processors introduce new demands on software systems with respect to performance, scalability, robustness and portability. The increased complexity of the memory systems and the increased range of problem sizes for which a given piece of software is used pose serious challenges to software developers. The Connection Machine Scientific Software Library, CMSSL, uses several novel techniques to meet these challenges. The CMSSL contains routines for managing the data distribution and provides data distribution independent functionality. High performance is achieved through careful scheduling of arithmetic operations and data motion, and through the automatic selection of algorithms at run-time. We discuss some of the techniques used, and provide evidence that CMSSL has reached the goals of performance and scalability for an important set of applications.
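Run-time algorithm selection of the kind described for CMSSL can be sketched as a simple dispatch on problem shape. The variant names and the size threshold below are invented for illustration and are not CMSSL internals.

```python
# Two hypothetical implementation variants of the same operation.
def matmul_blocked(n):
    return ("blocked", n)

def matmul_pipelined(n):
    return ("pipelined", n)

def matmul(n):
    """Dispatch at run time to the variant expected to perform best
    for the given problem size (threshold is illustrative)."""
    return matmul_blocked(n) if n < 512 else matmul_pipelined(n)
```

The caller sees one interface; the library decides internally, which is what makes the selection transparent to applications.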

01 Jan 1994
TL;DR: COMPsych has now been designated as the official archive and distribution site for PC-based software and data files described in the journal Behavior Research Methods, Instruments, & Computers.
Abstract: COMPsych is an electronic clearinghouse for software information related to psychology instruction, research, and practice. Since 1987, COMPsych has provided five major services: (1) a catalog of descriptive information about available software, (2) a directory of software users, (3) a message system, (4) an announcement service, and (5) an electronic newsletter. In addition to these services, COMPsych has now been designated as the official archive and distribution site for PC-based software and data files described in the journal Behavior Research Methods, Instruments, & Computers. A general description of COMPsych and specific instructions for using the new archive service are presented. Those interested in joining COMPsych should send electronic mail to compsych@snyplava.bitnet or compsych@splava.cc.plattsburgh.edu. The system can be accessed via anonymous ftp at gluon.hawk.plattsburgh.edu in the directory pub/compsych. COMPsych files are also available through the gopher server at SUNY Plattsburgh.

Proceedings ArticleDOI
28 Nov 1994
TL;DR: The paper presents the failure detection approach of real- time supervision based on the belief method, which allows failures in a software system to be detected in real-time based only on boundary signals and a specification of its external behaviour.
Abstract: With the critical role played by software systems in the operation of telecommunications networks, the ability to detect and report software failures has become of great importance. The paper presents the failure detection approach of real-time supervision based on the belief method. Real-time supervision allows failures in a software system to be detected in real-time based only on boundary signals and a specification of its external behaviour. This feature is of particular benefit when software is purchased from other vendors and there is no direct access to source code. An implementation of real-time supervision has been shown to be capable of failure detection in a small telephone exchange.
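Supervision from boundary signals alone can be sketched as checking observed events against a specification of external behaviour. The spec format, state names, and signals below are assumptions for illustration (a toy telephone exchange); any signal the spec does not allow in the current state is flagged as a failure.

```python
# Hypothetical external-behaviour spec: state -> {allowed signal: next state}.
SPEC = {
    "idle":    {"offhook": "dialing"},
    "dialing": {"digit": "dialing", "connect": "talking"},
    "talking": {"onhook": "idle"},
}

def supervise(signals, state="idle"):
    """Replay observed boundary signals against the spec; no access to
    the supervised system's source code is needed."""
    failures = []
    for sig in signals:
        allowed = SPEC.get(state, {})
        if sig in allowed:
            state = allowed[sig]
        else:
            failures.append((state, sig))  # unexpected boundary signal
    return failures

faults = supervise(["offhook", "digit", "connect", "digit"])
```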

Proceedings ArticleDOI
15 May 1994
TL;DR: Total quality management techniques were applied in rewriting major sections of key software modules in the UCLA PACS, resulting in code development within predictable periods of time at a predictable cost, thus enhancing the software development cycle.
Abstract: A significant problem in building large-scale picture archiving and communications systems (PACS) is the production of reliable and accurate software, within a specified amount of time and cost, without impacting existing operations. PACS software management is particularly difficult because most PACS involve highly distributed processing over very heterogeneous components. We applied total quality management techniques to the problem of PACS software management. All potential users of PACS software were identified as "customers," and we optimized the quality of service provided to them. Our methodology involves each of these customers at each stage of the software development cycle to help ensure that PACS functions meet the requirements and priorities of the majority of PACS users. We used this approach in rewriting major sections of key software modules in the UCLA PACS, resulting in code development within predictable periods of time at a predictable cost, thus enhancing our software development cycle.

Proceedings ArticleDOI
07 Dec 1994
TL;DR: The operational model for SIMS is the client-server model and the internal structure of software information is based on the entity-relationship model, and the current status and future enhancements of SIMS are described.
Abstract: Provision of software information, which includes all kinds of documents produced during software development and maintenance, is helpful to developers or maintainers. In this paper, we describe a system, called the Software Information Management System (SIMS), for effective management of software information to support software development and maintenance. SIMS also provides a facility for a group of developers to collaborate via a project-wide and network-based bulletin board. The operational model for SIMS is the client-server model and the internal structure of software information is based on the entity-relationship model. We explain the architecture of SIMS and tools constituting SIMS. Finally, we describe the current status and future enhancements of SIMS.