
Showing papers on "Software published in 1996"


Journal ArticleDOI
TL;DR: A package of computer programs for analysis and visualization of three-dimensional human brain functional magnetic resonance imaging (FMRI) results is described, along with techniques for automatically generating transformed functional data sets from manually labeled anatomical data sets.

10,002 citations


Journal ArticleDOI
TL;DR: IMOD is useful for studying and modeling data from tomographic, serial section, and optical section reconstructions and allows image data to be visualized by several different methods.

4,830 citations


Journal ArticleDOI
TL;DR: Calibration methods and software have been developed for single crystal diffraction experiments, using both approaches to calibrate and apply corrections in order to obtain accurate angle and intensity information.
Abstract: Detector systems introduce distortions into acquired data. To obtain accurate angle and intensity information, it is necessary to calibrate, and apply corrections. Intensity non-linearity, spatial distortion, and non-uniformity of intensity response are the primary considerations. It is better to account for the distortions within scientific analysis software, but often it is more practical to correct the distortions to produce ‘idealised’ data. Calibration methods and software have been developed for single crystal diffraction experiments, using both approaches. For powder diffraction experiments the additional task of converting a two-dimensional image to a one-dimensional spectrum is needed to allow Rietveld analysis. This task may be combined with distortion correction to produce intensity information and error estimates. High-pressure experiments can introduce additional complications and place new demands on software. Flexibility is needed to be able to integrate different angular regions se...

4,426 citations
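The conversion of a two-dimensional detector image into a one-dimensional powder spectrum mentioned above is, at its core, a radial re-binning of pixels about the beam centre. The sketch below illustrates only that re-binning step under simplified assumptions (a known beam centre, no spatial-distortion or response corrections, radius rather than 2θ on the x-axis); the function and variable names are illustrative, not taken from the paper's software.

```python
import numpy as np

def integrate_to_1d(image, center, bin_width=1.0):
    """Collapse a 2D detector image into a 1D radial profile.

    Pixels are grouped by their distance from the beam centre (given as
    (x, y) in pixels) and the mean intensity per radial bin is returned.
    Real calibration software would first correct spatial distortion,
    non-uniform response and non-linearity, and convert radius to 2-theta.
    """
    ny, nx = image.shape
    y, x = np.indices((ny, nx))
    r = np.hypot(x - center[0], y - center[1])          # radius of every pixel
    bins = (r / bin_width).astype(int)                  # radial bin index
    sums = np.bincount(bins.ravel(), weights=image.ravel())
    counts = np.bincount(bins.ravel())
    profile = sums / np.maximum(counts, 1)              # mean intensity per bin
    radii = (np.arange(profile.size) + 0.5) * bin_width
    return radii, profile

# Example: a synthetic image containing a single diffraction ring at radius 60
img = np.zeros((256, 256))
yy, xx = np.indices(img.shape)
img[np.abs(np.hypot(xx - 128, yy - 128) - 60) < 1.5] = 100.0
r, i = integrate_to_1d(img, center=(128, 128))
print(r[np.argmax(i)])   # the peak of the 1D profile appears near radius 60
```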


Journal ArticleDOI
TL;DR: This paper outlines a methodology for developing and evaluating ontologies, first discussing informal techniques, concerning such issues as scoping, handling ambiguity, reaching agreement and producing definitions, and then considers a more formal approach.
Abstract: This paper is intended to serve as a comprehensive introduction to the emerging field concerned with the design and use of ontologies. We observe that disparate backgrounds, languages, tools and techniques are a major barrier to effective communication among people, organisations and/or software systems. We show how an explicit account of a shared understanding (i.e. an “ontology”) in a given subject area can improve such communication, which in turn, can give rise to greater reuse and sharing, inter-operability, and more reliable software. After motivating their need, we clarify just what ontologies are and what purpose they serve. We outline a methodology for developing and evaluating ontologies, first discussing informal techniques, concerning such issues as scoping, handling ambiguity, reaching agreement and producing definitions. We then consider the benefits of and describe a more formal approach. We re-visit the scoping phase, and discuss the role of formal languages and techniques in the specification, implementation and evaluation of ontologies. Finally, we review the state of the art and practice in this emerging field, considering various case studies, software tools for ontology development, key research issues and future prospects.

3,568 citations


Book
25 Nov 1996
TL;DR: Algorithms for Image Processing and Computer Vision, 2nd Edition provides the tools to speed development of image processing applications.
Abstract: A cookbook of algorithms for common image processing applications. Thanks to advances in computer hardware and software, algorithms have been developed that support sophisticated image processing without requiring an extensive background in mathematics. This bestselling book has been fully updated with the newest of these, including 2D vision methods in content-based searches and the use of graphics cards as image processing computational aids. It's an ideal reference for software engineers and developers, advanced programmers, graphics programmers, scientists, and other specialists who require highly specialized image processing. Algorithms now exist for a wide variety of sophisticated image processing applications required by software engineers and developers, advanced programmers, graphics programmers, scientists, and related specialists. This bestselling book has been completely updated to include the latest algorithms, including 2D vision methods in content-based searches, details on modern classifier methods, and graphics cards used as image processing computational aids. Saves hours of mathematical calculating by using distributed processing and GPU programming, and gives non-mathematicians the shortcuts needed to program relatively sophisticated applications. Algorithms for Image Processing and Computer Vision, 2nd Edition provides the tools to speed development of image processing applications.

1,517 citations


Book
01 Dec 1996
TL;DR: This book presents a broad overview of numerical methods for solving all the major problems in scientific computing, including linear and nonlinear equations, least squares, eigenvalues, optimization, interpolation, integration, ordinary and partial differential equations, fast Fourier transforms, and random number generators.
Abstract: From the Publisher: "This book presents a broad overview of numerical methods for solving all the major problems in scientific computing, including linear and nonlinear equations, least squares, eigenvalues, optimization, interpolation, integration, ordinary and partial differential equations, fast Fourier transforms, and random number generators The treatment is comprehensive yet concise, software-oriented yet compatible with a variety of software packages and programming languages The book features more than 160 examples, 500 review questions, 240 exercises, and 200 computer problems" "Changes for the second edition include: expanded motivational discussions and examples; formal statements of all major algorithms; expanded discussions of existence, uniqueness, and conditioning for each type of problem so that students can recognize "good" and "bad" problem formulations and understand the corresponding quality of results producted; and expanded coverage of several topics, particularly eigenvalues and constrained optimization" The book contains a wealth of material and can be used in a variety of one- or two-term courses in computer science, mathematics, or engineering Its comprehensiveness and modern perspective, as well as the software pointers provided, also make it a highly useful reference for practicing professionals who need to solve computational problems

1,065 citations


Book
01 Jan 1996
TL;DR: The Computer Music Tutorial is a comprehensive text and reference that covers all aspects of computer music, including digital audio, synthesis techniques, signal processing, musical input devices, performance software, editing systems, algorithmic composition, MIDI, synthesizer architecture, system interconnection, and psychoacoustics.
Abstract: From the Publisher: The Computer Music Tutorial is a comprehensive text and reference that covers all aspects of computer music, including digital audio, synthesis techniques, signal processing, musical input devices, performance software, editing systems, algorithmic composition, MIDI, synthesizer architecture, system interconnection, and psychoacoustics. A special effort has been made to impart an appreciation for the rich history behind current activities in the field. Profusely illustrated and exhaustively referenced and cross-referenced, The Computer Music Tutorial provides a step-by-step introduction to the entire field of computer music techniques. Written for nontechnical as well as technical readers, it uses hundreds of charts, diagrams, screen images, and photographs as well as clear explanations to present basic concepts and terms. Mathematical notation and program code examples are used only when absolutely necessary. Explanations are not tied to any specific software or hardware. Curtis Roads has served as editor-in-chief of Computer Music Journal for more than a decade and is a recognized authority in the field. The material in this book was compiled and refined over a period of several years of teaching in classes at Harvard University, Oberlin Conservatory, the University of Naples, IRCAM, Les Ateliers UPIC, and in seminars and workshops in North America, Europe, and Asia.

986 citations


Journal ArticleDOI
TL;DR: The SPICE system is described, current and future SPICE applications are identified, and customer support offered by NAIF is summarized.

917 citations


Journal ArticleDOI
TL;DR: A shell of software tools has been developed within a WINDOWS environment, including user-installed editor, linker, compiler, testbed generator, graphics, database and version control software. The resulting product codes its functions in the language most familiar to the developers of scientific modules while providing many of the features of object-oriented programming.

881 citations


Journal ArticleDOI
TL;DR: This work uses a mathematical framework to propose definitions of several important measurement concepts (size, length, complexity, cohesion, coupling); the authors argue that the formalisms and properties introduced are convenient and intuitive and contribute constructively to a firmer theoretical ground for software measurement.
Abstract: Little theory exists in the field of software system measurement. Concepts such as complexity, coupling, cohesion or even size are very often subject to interpretation and appear to have inconsistent definitions in the literature. As a consequence, there is little guidance provided to the analyst attempting to define proper measures for specific problems. Many controversies in the literature are simply misunderstandings and stem from the fact that some people talk about different measurement concepts under the same label (complexity is the most common case). There is a need to define unambiguously the most important measurement concepts used in the measurement of software products. One way of doing so is to define precisely what mathematical properties characterize these concepts, regardless of the specific software artifacts to which these concepts are applied. Such a mathematical framework could generate a consensus in the software engineering community and provide a means for better communication among researchers, better guidelines for analysts, and better evaluation methods for commercial static analyzers for practitioners. We propose a mathematical framework which is generic, because it is not specific to any particular software artifact, and rigorous, because it is based on precise mathematical concepts. We use this framework to propose definitions of several important measurement concepts (size, length, complexity, cohesion, coupling). It does not intend to be complete or fully objective; other frameworks could have been proposed and different choices could have been made. However, we believe that the formalisms and properties we introduce are convenient and intuitive. This framework contributes constructively to a firmer theoretical ground of software measurement.

771 citations
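One flavour of the mathematical properties such a framework imposes can be illustrated with size: a size measure is typically required to be additive over disjoint modules. The toy check below assumes a statement-count size measure and disjoint modules; it only illustrates the style of property, not the paper's actual axioms.

```python
def size(module_statements):
    """A toy size measure: count the statements attributed to a module."""
    return len(module_statements)

def system_size(modules):
    """Size of a system made up of the given modules."""
    all_statements = set().union(*modules)
    return size(all_statements)

# Module additivity: for disjoint modules m1 and m2,
#   size(m1 union m2) == size(m1) + size(m2)
m1 = {"s1", "s2", "s3"}
m2 = {"s4", "s5"}
assert system_size([m1, m2]) == size(m1) + size(m2)
print("additivity holds for this toy size measure")
```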


Journal ArticleDOI
TL;DR: This paper describes the current state of a large set of programs written by various members of the Laboratory of Molecular Biology for processing images of two-dimensional crystals and of particles with helical or icosahedral symmetry for determination of macromolecular structures by electron microscopy.

Journal ArticleDOI
TL;DR: Software quality is put on trial, with both the definition and evaluation of the authors' software products and processes examined.
Abstract: If you are a software developer, manager, or maintainer, quality is often on your mind. But what do you really mean by software quality? Is your definition adequate? Is the software you produce better or worse than you would like it to be? We put software quality on trial, examining both the definition and evaluation of our software products and processes.


Journal ArticleDOI
TL;DR: Recent and emerging trends in the facility layout problem, covering the last 10 years of research, including new methodologies, objectives, algorithms, and extensions to this well-studied combinatorial optimization problem, are presented in this article.

Patent
24 Jul 1996
TL;DR: In this article, a system and method for distributing software applications and data to many thousands of clients over a network is described, where applications are called channels, the server is called the transmitter, and the client is referred to as the tuner.
Abstract: A system and method for distributing software applications and data to many thousands of clients over a network. The applications are called "channels", the server is called the "transmitter", and the client is called the "tuner". The use of channels is based on subscription. The end-user needs to subscribe to a channel before it can be executed. When the end-user subscribes to a channel the associated code and data is downloaded to the local hard-disk, and once downloaded the channel can be executed many times without requiring further network access. Channels can be updated automatically at regular intervals by the tuner, and as a result the end-user is no longer required to manually install software updates; instead these software and data updates are automatically downloaded and installed in the background. This method of automatic downloading of updates achieves for the client the same result as the broadcast distribution of software over a connection based network, but wherein the client initiates each update request without requiring any special broadcast networking infrastructure.
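A minimal sketch of the subscription/update cycle described above: the tuner periodically compares the locally recorded channel version with a manifest published by the transmitter and downloads new code and data only when the version changes. The endpoint URL, manifest format, and field names are assumptions for illustration, not the patent's actual protocol.

```python
import json, time, urllib.request

TRANSMITTER = "http://transmitter.example.com/channels"   # hypothetical endpoint
LOCAL_STATE = {"news-channel": {"version": 3}}             # channel versions already on disk

def check_for_update(channel):
    """Ask the transmitter for the channel manifest and compare versions."""
    with urllib.request.urlopen(f"{TRANSMITTER}/{channel}/manifest.json") as resp:
        manifest = json.load(resp)
    if manifest["version"] > LOCAL_STATE[channel]["version"]:
        download_and_install(channel, manifest)            # runs in the background

def download_and_install(channel, manifest):
    """Fetch the new code/data files and record the new version locally."""
    for item in manifest["files"]:
        urllib.request.urlretrieve(f"{TRANSMITTER}/{channel}/{item}", item)
    LOCAL_STATE[channel]["version"] = manifest["version"]

# The tuner initiates every request itself -- no broadcast infrastructure is needed.
while True:
    for ch in LOCAL_STATE:
        check_for_update(ch)
    time.sleep(3600)   # re-check subscribed channels once an hour
```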

Book
30 Dec 1996
TL;DR: The new edition now prepares readers for success with the latest developments in Microsoft Office Excel 2010, including data sets, applications and screen visuals throughout that reflect Excel 2010.
Abstract: Gain a sound conceptual understanding of the role that management science plays in the decision-making process with the latest edition of the book that has defined today's management science course: Anderson/Sweeney/Williams/Camm/Martin's AN INTRODUCTION TO MANAGEMENT SCIENCE: QUANTITATIVE APPROACHES TO DECISION MAKING, REVISED 13th Edition. The trusted market leader for more than two decades, the new edition now prepares readers for success with the latest developments in Microsoft Office Excel 2010, including data sets, applications and screen visuals throughout that reflect Excel 2010. Readers learn from the book's proven applications-oriented approach, powerful examples, and problem-scenario approach that introduces each quantitative technique within an applications setting. Readers can get a copy of LINGO software and Excel add-ins with the book's online content. A copy of the popular Microsoft Project Professional 2010 accompanies the book on CD.

Patent
12 Jul 1996
TL;DR: A software-based computer security enhancing process and graphical software-authenticity method, and a method to apply aspects of the two are disclosed as discussed by the authors, which provides protection against certain attacks on executable software by persons or other software used on the computer.
Abstract: A software-based computer security enhancing process and graphical software-authenticity method, and a method to apply aspects of the two are disclosed. The process provides protection against certain attacks on executable software by persons or other software used on the computer. Software using this process is protected against eavesdropping (the monitoring of software, applications, the operating system, disks, keyboard, or other devices to record (steal) identification, authentication or sensitive data such as passwords, User-ID's, credit-card number and expiry dates, bank account and PIN numbers, smart-card data, biometric information (for example: the data comprising a retina or fingerprint scan), or encryption keys), local and remote tampering (altering software to remove, disable, or compromise security features of the altered software) examination (viewing the executable program, usually with the intent of devising security attacks upon it), tracing (observing the operating of an executable program step-by-step), and spoofing (substituting counterfeit software to emulate the interface of authentic software in order to subvert security) by rogues (e.g.: Trojan Horses, Hackers, Viruses, Terminate-and-stay-resident programs, co-resident software, multi-threaded operating system processes, Worms, Spoof programs, key-press password captures, macro recorders, sniffers, and other software or subversions). Aspects include executable encryption, obfuscation, anti-tracing, anti-tamper and self-verification, runtime self-monitoring, and audiovisual authentication (math, encryption, and graphics based method permitting users to immediately recognise the authenticity and integrity of software). The figure in the specification depicts the many components and their interaction.
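Among the aspects listed, self-verification is the most straightforward to sketch: a program hashes its own executable image and compares the digest with one recorded at build time. The sketch below assumes a SHA-256 digest and a placeholder expected value; it illustrates the general idea only, not the patented method.

```python
import hashlib, sys

# Digest recorded at build time (illustrative placeholder, not a real value).
EXPECTED_SHA256 = "0" * 64

def self_verify(path=sys.argv[0]):
    """Hash this program's own file and compare it with the build-time digest.

    A mismatch suggests the executable was altered after release.
    """
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest() == EXPECTED_SHA256

if not self_verify():
    print("integrity check failed: refusing to run")
    sys.exit(1)
```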

Journal ArticleDOI
01 Dec 1996
TL;DR: This paper describes an alternative, measurement based instruction level power analysis approach that provides an accurate and practical way of quantifying the power cost of software and guides the development of general tools and techniques for low power software.
Abstract: The increasing popularity of power constrained mobile computers and embedded computing applications drives the need for analyzing and optimizing power in all the components of a system. Software constitutes a major component of today's systems, and its role is projected to grow even further. Thus, an ever increasing portion of the functionality of today's systems is in the form of instructions, as opposed to gates. This motivates the need for analyzing power consumption from the point of view of instructions--something that traditional circuit and gate level power analysis tools are inadequate for. This paper describes an alternative, measurement based instruction level power analysis approach that provides an accurate and practical way of quantifying the power cost of software. This technique has been applied to three commercial, architecturally different processors. The salient results of these analyses are summarized. Instruction level analysis of a processor helps in the development of models for power consumption of software executing on that processor. The power models for the subject processors are described and interesting observations resulting from the comparison of these models are highlighted. The ability to evaluate software in terms of power consumption makes it feasible to search for low power implementations of given programs. In addition, it can guide the development of general tools and techniques for low power software. Several ideas in this regard as motivated by the power analysis of the subject processors are also described.
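Instruction-level power models of this kind are commonly expressed as a sum of measured per-instruction base costs, inter-instruction overheads, and other effects such as stalls or cache misses. The sketch below uses that common formulation with made-up cost numbers; the actual models and measurements for the three processors are in the paper.

```python
# Program energy, as commonly formulated in instruction-level power analysis:
#   E = sum_i(B_i * N_i) + sum_(i,j)(O_ij * N_ij) + sum_k(E_k)
# where B_i is the measured base cost of instruction i, N_i its execution count,
# O_ij the circuit-state overhead of executing i followed by j, and E_k other
# effects such as stalls or cache misses.  All numbers below are placeholders,
# not measurements from the processors studied in the paper.

base_cost = {"add": 1.0, "mul": 3.2, "load": 2.5}            # nJ per execution (assumed)
pair_overhead = {("add", "mul"): 0.4, ("mul", "load"): 0.6}  # nJ per adjacent pair (assumed)
other_effects = 5.0                                          # e.g. stall energy (assumed)

def program_energy(trace):
    """Estimate energy for an instruction trace (a list of opcode names)."""
    e = sum(base_cost[op] for op in trace)                           # base costs
    e += sum(pair_overhead.get(pair, 0.0) for pair in zip(trace, trace[1:]))  # overheads
    return e + other_effects

print(program_energy(["add", "mul", "load", "add"]))
```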

Journal ArticleDOI
Thomas Ball1, Stephen G. Eick1
TL;DR: The invisible nature of software hides system complexity, particularly for large team-oriented projects, and four innovative visual representations of code have evolved to help solve this problem: line representation; pixel representation; file summary representation; and hierarchical representation.
Abstract: The invisible nature of software hides system complexity, particularly for large team-oriented projects. The authors have evolved four innovative visual representations of code to help solve this problem: line representation; pixel representation; file summary representation; and hierarchical representation. We first describe these four visual code representations and then discuss the interaction techniques for manipulating them. We illustrate our software visualization techniques through five case studies. The first three focus on software history and static software characteristics; the last two discuss execution behavior. The software library and its implementation are then described. Finally, we briefly review some related work and compare and contrast our different techniques for visualizing software.
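The line representation can be approximated in a few lines: each source line is reduced to a thin row whose length tracks the line's length and whose shade encodes a per-line statistic such as age. The sketch below substitutes printed characters for pixels and uses invented age data, purely to show the mapping.

```python
# Each source line is drawn as a thin row whose length tracks the line's length
# and whose shade tracks a per-line statistic (here: age in days, invented data).
lines = [
    ("def parse(buf):",            400),   # (text, age in days) -- illustrative only
    ("    return buf.split(',')",   12),
    ("def render(rows):",          400),
    ("    for r in rows:",          90),
    ("        print(r)",            90),
]

max_age = max(age for _, age in lines)

for text, age in lines:
    shade = int(9 * age / max_age)               # 0 = newest line, 9 = oldest line
    row = str(shade) * max(len(text) // 4, 1)    # row length tracks line length
    print(f"{row:<12} age={age:>3}d  {text}")
```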

Patent
30 Sep 1996
TL;DR: In this article, the authors describe a process of distributing software and data in a digital computer network by combining the software, together with programs and data known as methods, into single entities referred to as Packages, and then by using specific techniques to transmit Packages from one computer to another.
Abstract: A process of distributing software and data in a digital computer network by combining the software and data, together with programs and data known as methods, into single entities referred to as Packages, and then by using specific techniques to transmit Packages from one computer to another. The methods are operable on a Target digital computer to unpack and perform useful functions such as installing and backing out software on the Target digital computer, collecting data from the Target digital computer and forwarding it to another digital computer, or completing a system administration function on the Target digital computer. The techniques used in transmitting Packages between digital computers include the use of Agent software to transfer and activate Packages at appropriate times. The techniques also include forwarding multiple copies of Packages received on a slow network connection to one or more digital computers connected on faster network connections. The techniques also include temporarily storing one or more Packages for later transmission to computers which connect occasionally to the network. Further, the techniques include limiting simultaneous transfers to Target digital computers based on network capacity and topology. The techniques also include limiting the type of software and data which each Package may affect on a Target digital computer, to more easily manage which Packages must be backed out in what order from the Target digital computers. Finally, the techniques also include constructing Packages which contain some software and data which depends on the configuration of the Target digital computer, and transferring only that part of the Package which is appropriate for each Target digital computer's configuration.

Patent
24 May 1996
TL;DR: In this article, the authors propose a Client-Server-Service (CSS) model imposed on service functionalities deployed across a local host and a remote host connected by a network, giving the user the illusion that a remote utility service resides locally on the user's local host.
Abstract: A local host computing system, a remote host computing system as connected by a network, and service functionalities: a human interface service functionality, a starter service functionality, and a desired utility service functionality, and a Client-Server-Service (CSS) model is imposed on each service functionality. In one embodiment, this results in nine logical components and three physical components (a local host, a remote host, and an intervening network), where two of the logical components are integrated into one Remote Object Client component, and that Remote Object Client component and the other seven logical components are deployed among the local host and remote host in a manner which eases compatibility and upgrade problems, and provides an illusion to a user that a desired utility service supported on a remote host resides locally on the user's local host, thereby providing ease of use and minimal software maintenance for users of that remote service.

Patent
02 Aug 1996
TL;DR: In this paper, a distributed computing system comprising a plurality of computer hosts, a communication network for exchanging information and data between the computer hosts and software-based services, distributed throughout the computing system is disclosed.
Abstract: A distributed computing system comprising a plurality of computer hosts, a communication network for exchanging information and data between the computer hosts, and a plurality of services, including software-based services, distributed throughout the computing system is disclosed. Each of the services within the present distributed computing system, including some basic system-services are adapted to perform prescribed functions in response to the receipt of an electronic message. The disclosed distributed computing system also includes a plurality of intelligent agents executing on the computer hosts and associated with one or more of the services, wherein an agent exercises control of an associated service by manipulating the electronic messages directed to and originating from the associated service. In addition, the services are adapted to cooperatively perform various tasks by exchanging electronic messages across the communication network via their associated agents.

Journal ArticleDOI
Alan Wood1
TL;DR: This work applied software reliability modeling to a subset of products for four major software releases at Tandem and reports on what was learned.
Abstract: Critical business applications require reliable software, but developing reliable software is one of the most difficult problems facing the software industry. After the software is shipped, software vendors receive customer feedback about software reliability. However, by then it is too late; software vendors need to know whether their products are reliable before they are delivered to customers. Software reliability growth models help provide that information. Unfortunately, very little real data and models from commercial applications have been published, possibly because of the proprietary nature of the data. Over the past few years, the author and his colleagues at Tandem have experimented with software reliability growth models. At Tandem, a major software release consists of substantial modifications to many products and may contain several million lines of code. Major software releases follow a well defined development process and involve a coordinated quality assurance effort. We applied software reliability modeling to a subset of products for four major releases. The article reports on what was learned.
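The abstract does not say which growth models Tandem fitted, but a representative example is the Goel-Okumoto model, mu(t) = a(1 - exp(-b*t)), fitted to cumulative defect counts observed during testing, where a is the eventual total number of defects and b the detection rate. The sketch below fits that model to invented weekly counts, not Tandem's data.

```python
import numpy as np
from scipy.optimize import curve_fit

# Goel-Okumoto growth model: expected cumulative defects after t weeks of test.
def mu(t, a, b):
    return a * (1.0 - np.exp(-b * t))

# Illustrative cumulative defect counts per test week (invented, not Tandem's data).
weeks   = np.arange(1, 11)
defects = np.array([12, 25, 34, 43, 49, 55, 58, 61, 63, 64])

(a, b), _ = curve_fit(mu, weeks, defects, p0=(80.0, 0.2))
print(f"estimated total defects a = {a:.1f}, detection rate b = {b:.3f}")
print(f"defects still latent at week 10: {a - mu(10, a, b):.1f}")
```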

Patent
03 May 1996
TL;DR: The Real-Time Embedded Software Repository Apparatus (RTES) as discussed by the authors is a repository system for real-time embedded software that allows the user to associate fixed and user defined Attributes to the RTES.
Abstract: The Real-Time Embedded Software Repository Apparatus fully characterizes, evaluates, and reuses real-time embedded software that is placed or stored in a repository database. The Repository System comprises at least one Repository Client and at least one Repository Server and utilizes simulation and translational techniques to allow Real Time Embedded Software (RTES) to be re-used, played, and evaluated on various desktop development environments or target operating environments. The Repository System organizes and processes Repository files as Repository Units which may comprise Software Source Files and Test Software. Repository Units also contain Attachments that provide current and historic information to static files that are stored in the Repository. The Repository Units are further characterized using analysis tools (software analysis) which allow the user to associate fixed and user defined Attributes to the RTES. A real-time embedded component (Component) provides a clear and well defined software interface to function at a high level of interaction with the RTES. Templates for both searching and displaying information in a multimedia format are also provided.

Proceedings ArticleDOI
01 Oct 1996
TL;DR: The results and experiences of using symbolic model checking to study the specification of an aircraft collision avoidance system and the approach to translating the specification to the SMV language and methods for achieving acceptable performance are reported.
Abstract: In this paper we present our results and experiences of using symbolic model checking to study the specification of an aircraft collision avoidance system. Symbolic model checking has been highly successful when applied to hardware systems. We are interested in the question of whether or not model checking techniques can be applied to large software specifications. To investigate this, we translated a portion of the finite-state requirements specification of TCAS II (Traffic Alert and Collision Avoidance System) into a form accepted by a model checker (SMV). We successfully used the model checker to investigate a number of dynamic properties of the system. We report on our experiences, describing our approach to translating the specification to the SMV language and our methods for achieving acceptable performance in model checking, and giving a summary of the properties that we were able to check. We consider the paper as a data point that provides reason for optimism about the potential for successful application of model checking to software systems. In addition, our experiences provide a basis for characterizing features that would be especially suitable for model checkers built specifically for analyzing software systems. The intent of this paper is to evaluate symbolic model checking of state-machine based specifications, not to evaluate the TCAS II specification. We used a preliminary version of the specification, the version 6.00, dated March, 1993, in our study. We did not have access to later versions, so we do not know if the properties identified here are present in later versions.
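The core of checking an invariant with a model checker is a reachability fixed point over the specification's state space; SMV performs it symbolically on BDD-encoded state sets, which is what makes large specifications tractable. The explicit-state sketch below shows the same fixed point on an invented two-variable toy system; it is not the TCAS II specification or the SMV algorithm, only an illustration of the flavor.

```python
# Toy transition system: states are (flight situation, advisory) pairs.
# We check the invariant "the advisory is always one of none/climb/descend"
# by computing the set of reachable states to a fixed point.  SMV performs
# the same reachability computation on BDD-encoded state sets.

initial = {("level", "none")}

def successors(state):
    situation, advisory = state
    if situation == "level":
        return {("intruder_above", "descend"), ("intruder_below", "climb")}
    return {("level", "none")}            # advisory resolved, back to level flight

def invariant(state):
    return state[1] in {"none", "climb", "descend"}

reachable, frontier = set(initial), set(initial)
while frontier:                           # breadth-first fixed point
    frontier = {s2 for s in frontier for s2 in successors(s)} - reachable
    reachable |= frontier

bad = [s for s in reachable if not invariant(s)]
print("invariant holds" if not bad else f"counterexample states: {bad}")
```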

Journal ArticleDOI
TL;DR: The question of what levels of contamination can be detected by this algorithm as a function of dimension, computation time, sample size, contamination fraction, and distance of the contamination from the main body of data is investigated.
Abstract: New insights are given into why the problem of detecting multivariate outliers can be difficult and why the difficulty increases with the dimension of the data. Significant improvements in methods for detecting outliers are described, and extensive simulation experiments demonstrate that a hybrid method extends the practical boundaries of outlier detection capabilities. Based on simulation results and examples from the literature, the question of what levels of contamination can be detected by this algorithm as a function of dimension, computation time, sample size, contamination fraction, and distance of the contamination from the main body of data is investigated. Software to implement the methods is available from the authors and STATLIB.
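The paper's hybrid algorithm is not reproduced here, but the classical baseline it improves on is easy to sketch: flag observations whose squared Mahalanobis distance from the sample mean exceeds a chi-squared cutoff. Because the mean and covariance are themselves distorted by the very outliers being sought, this baseline degrades as dimension and contamination grow, which is the difficulty the paper addresses. The data and cutoff below are illustrative.

```python
import numpy as np
from scipy.stats import chi2

def mahalanobis_outliers(X, alpha=0.975):
    """Flag rows of X whose squared Mahalanobis distance exceeds the
    chi-squared cutoff.  Classical (non-robust) version: the mean and
    covariance are estimated from the contaminated data themselves."""
    mean = X.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
    d = X - mean
    d2 = np.einsum("ij,jk,ik->i", d, cov_inv, d)     # squared distances
    return d2 > chi2.ppf(alpha, df=X.shape[1])

rng = np.random.default_rng(0)
clean = rng.normal(size=(200, 5))
contaminated = rng.normal(loc=6.0, size=(10, 5))     # ~5% planted contamination
X = np.vstack([clean, contaminated])
print(mahalanobis_outliers(X)[-10:])                 # mostly True for the planted outliers
```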

Book
01 Jan 1996
TL;DR: This programming manual accompanies the PcGive Professional 9.0 software, an interactive modelling package for modelling statistical and economic data for forecasting or observing trends.
Abstract: This programming manual accompanies the PcGive Professional 9.0 software, an interactive modelling package for modelling statistical and economic data for forecasting or observing trends.

Proceedings ArticleDOI
01 May 1996
TL;DR: An automated environment known as ANGEL is described that supports the collection, storage and identification of the most analogous projects in order to estimate the effort for a new project, and is shown to outperform traditional algorithmic methods for six different datasets.
Abstract: The staff resources or effort required for a software project are notoriously difficult to estimate in advance. To date most work has focused upon algorithmic cost models such as COCOMO and Function Points. These can suffer from the disadvantage of the need to calibrate the model to each individual measurement environment coupled with very variable accuracy levels even after calibration. An alternative approach is to use analogy for estimation. We demonstrate that this method has considerable promise in that we show it to outperform traditional algorithmic methods for six different datasets. A disadvantage of estimation by analogy is that it requires a considerable amount of computation. The paper describes an automated environment known as ANGEL that supports the collection, storage and identification of the most analogous projects in order to estimate the effort for a new project. ANGEL is based upon the minimisation of Euclidean distance in n-dimensional space. The software is flexible and can deal with differing datasets both in terms of the number of observations (projects) and in the variables collected. Our analogy approach is evaluated with six distinct datasets drawn from a range of different environments and is found to outperform other methods. It is widely accepted that effective software effort estimation demands more than one technique. We have shown that estimating by analogy is a candidate technique and that with the aid of an automated environment it is an eminently practical technique.
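Estimation by analogy as described above reduces to a nearest-neighbour lookup: measure the new project on the same features as completed projects, find the most analogous ones by Euclidean distance, and adapt their known efforts. The sketch below uses invented features and effort values, and omits the feature normalisation and feature-subset search that ANGEL also performs.

```python
import math

# Completed projects: measured features plus known effort in person-months.
# Feature names and values are illustrative, not taken from the paper's datasets.
history = [
    ({"size": 120, "team": 4},  33.0),
    ({"size": 300, "team": 9},  98.0),
    ({"size": 150, "team": 5},  41.0),
    ({"size": 600, "team": 14}, 210.0),
]

def distance(a, b, keys):
    """Euclidean distance between two projects in feature space."""
    return math.sqrt(sum((a[k] - b[k]) ** 2 for k in keys))

def estimate_by_analogy(new_project, k=2):
    """Predict effort as the mean effort of the k most analogous projects."""
    keys = new_project.keys()
    nearest = sorted(history, key=lambda p: distance(p[0], new_project, keys))[:k]
    return sum(effort for _, effort in nearest) / k

print(estimate_by_analogy({"size": 200, "team": 6}))   # mean effort of the two nearest (37.0 here)
```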

Patent
23 Dec 1996
TL;DR: In this paper, the authors present a common user interface for accessing many different application programs over the web via a common interface. But, the user interface is not designed for the general public.
Abstract: The present invention provides the capability to easily access many different application programs over the WWW via a common user interface. By providing standard procedures, routines, tools, and software "hooks" for accessing software applications over the WWW, software developers can concentrate on the functionality of the application program and easily use HTML to provide a GUI interface for the application program. HTML is a well-known language which can be used by almost any computer system on the market today. In addition, since HTML is a fairly well controlled and standardized language, new software application features can be added as they are developed and supported by HTML. In addition, since HTML is a widely adopted, non-proprietary technology, the present invention can provide open access to a large market for even very small software developers. Further, the present invention also allows software developers to adopt a standard access protocol, which allows them to provide support for any computer system which is capable of utilizing a HTML cognizant browser. Finally, by providing easy-to-implement, standardized solutions to the issues of user interface, authentication/security, and web transaction support, the common user interface of the present invention overcomes the limitations existing in previous solutions.

Journal ArticleDOI
TL;DR: This paper describes the architecture and implementation of a tool that automates the implementation of design patterns: the user supplies application-specific information for a given pattern, from which the tool generates all the pattern-prescribed code automatically.
Abstract: Design patterns raise the abstraction level at which people design and communicate design of object-oriented software. However, the mechanics of implementing design patterns is left to the programmer. This paper describes the architecture and implementation of a tool that automates the implementation of design patterns. The user of the tool supplies application-specific information for a given pattern, from which the tool generates all the pattern-prescribed code automatically. The tool has a distributed architecture that lends itself to implementation with off-the-shelf components.