
Showing papers on "Software" published in 1999


Journal Article
TL;DR: The laboratory investigates several areas, including protein-ligand docking, protein-protein docking, and complex molecular assemblies, and has developed a number of computational tools such as molecular surfaces, phenomenological potentials, and various docking and visualization programs, which are used in conjunction with programs developed by others.
Abstract: One of the challenges in bio-computing is to enable the efficient use and inter-operation of a wide variety of rapidly-evolving computational methods to simulate, analyze, and understand the complex properties and interactions of molecular systems. In our laboratory we investigate several areas, including protein-ligand docking, protein-protein docking, and complex molecular assemblies. Over the years we have developed a number of computational tools such as molecular surfaces, phenomenological potentials, and various docking and visualization programs, which we use in conjunction with programs developed by others. The number of programs available to compute molecular properties and/or simulate molecular interactions (e.g., molecular dynamics, conformational analysis, quantum mechanics, distance geometry, docking methods, ab-initio methods) is large and growing rapidly. Moreover, these programs come in many flavors and variations, using different force fields, search techniques, and algorithmic details (e.g., continuous vs. discrete space, Cartesian vs. torsional coordinates). Each variation presents its own characteristic set of advantages and limitations. These programs also tend to evolve rapidly and are usually not written as components, making it hard to get them to work together.

2,665 citations


Journal ArticleDOI
TL;DR: EMAN (Electron Micrograph ANalysis), a software package for performing semiautomated single-particle reconstructions from transmission electron micrographs, was written from scratch in C++ and is provided free of charge on the Web site.

2,551 citations


Journal ArticleDOI
TL;DR: Three of the most popular tasks used to study discriminability are discussed, together with the measures that SDT prescribes for quantifying performance in these tasks.
Abstract: Signal detection theory (SDT) may be applied to any area of psychology in which two different types of stimuli must be discriminated. We describe several of these areas and the advantages that can be realized through the application of SDT. Three of the most popular tasks used to study discriminability are then discussed, together with the measures that SDT prescribes for quantifying performance in these tasks. Mathematical formulae for the measures are presented, as are methods for calculating the measures with lookup tables, computer software specifically developed for SDT applications, and general purpose computer software (including spreadsheets and statistical analysis software).

2,438 citations
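
The yes/no discrimination measures the article describes reduce to z-transforms of the hit and false-alarm rates. Below is a minimal sketch in Python with scipy, rather than the lookup tables or dedicated SDT software the article surveys; the helper name and the log-linear correction for extreme rates are illustrative conventions, not the article's prescription.

```python
from scipy.stats import norm

def sdt_measures(hits, misses, false_alarms, correct_rejections):
    """d' (sensitivity) and c (criterion) from raw yes/no counts."""
    # Log-linear correction keeps rates away from 0 and 1, where the
    # z-transform would be infinite (one common convention).
    h = (hits + 0.5) / (hits + misses + 1.0)
    f = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z_h, z_f = norm.ppf(h), norm.ppf(f)
    return z_h - z_f, -0.5 * (z_h + z_f)   # d', c

# Example: 40 hits / 10 misses, 15 false alarms / 35 correct rejections.
d_prime, criterion = sdt_measures(40, 10, 15, 35)
print(f"d' = {d_prime:.3f}, c = {criterion:.3f}")
```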


Patent
03 Mar 1999
TL;DR: In this paper, the authors present a method and program product for instantiating tiered software applications running on an Internet or Intranet computer system.
Abstract: Instantiation of tiered software applications running on an Internet or Intranet computer system, including a method of instantiation and a program product for instantiation. The method and program product are particularly useful in instantiating multi-tiered applications having a user interface tier on the client, browser, or remote computer, from a meta data repository containing attributes and values of the attributes.

1,295 citations


Journal ArticleDOI
Mark D. Weiser1
TL;DR: Specialized elements of hardware and software, connected by wires, radio waves and infrared, will soon be so ubiquitous that no-one will notice their presence.
Abstract: Specialized elements of hardware and software, connected by wires, radio waves and infrared, will be so ubiquitous that no one will notice their presence.

1,101 citations


Journal ArticleDOI
TL;DR: Rather than planning, analyzing, and designing for the far-flung future, XP exploits the reduction in the cost of changing software to do all of these activities a little at a time, throughout software development.
Abstract: Traditional software engineering has been characterized by a rather predictable process. Users tell once and for all exactly what they want. Programmers design the system that will deliver those features. They code it, test it, and all is well. But all was not always well. The users did not tell once and for all exactly what they wanted. They changed their minds, and the users were not the only problem: programmers could misjudge their progress. The academic software engineering community took the high cost of changing software as a challenge, creating technologies like relational databases, modular programming, and information hiding. This is where extreme programming comes in. Rather than planning, analyzing, and designing for the far-flung future, XP exploits the reduction in the cost of changing software to do all of these activities a little at a time, throughout software development. The paper discusses the major practices of XP.

1,011 citations


Book
01 Jan 1999
TL;DR: Open source software brings to the computer software industry even greater freedom than hardware manufacturers and consumers have enjoyed, giving customers control over the technologies they use instead of enabling vendors to control their customers by restricting access to the code behind the technologies.
Abstract: From the Book: Freedom is not an abstract concept in business. The success of any industry is almost directly related to the degree of freedom the suppliers and the customers of that industry enjoy. Just compare the innovation in the US telephone business since AT&T lost its monopoly control over American consumers with the previously slow pace of innovation when those customers had no freedom to choose. The world's best example of the benefits of freedom in business is a comparison of the computer hardware business and the computer software business. In computer hardware, where freedom reigns for both the suppliers and consumers alike on a global scale, the industry generates the fastest innovation in product and customer value the world has ever seen. In the computer software industry, on the other hand, change is measured in decades. The office suite, the 1980s killer application, wasn't challenged until the 1990s with the introduction of the Web browser and server. Open source software brings to the computer software industry even greater freedom than the hardware manufacturers and consumers have enjoyed. Computer languages are called languages because they are just that. They enable the educated members of our society (in this case programmers) to build ideas that benefit the other members of our society, including other programmers. To legally restrict access (via the proprietary binary-only software licenses our industry historically has used) to the knowledge of the infrastructure that our society increasingly relies on has resulted in less freedom and slower innovation. Open source represents some pretty revolutionary concepts being thrown at an industry that thought it had all of its fundamental structures worked out. It gives customers control over the technologies they use, instead of enabling the vendors to control their customers through restricting access to the code behind the technologies. Supplying open source tools to the market will require new business models. But by delivering unique benefits to the market, those companies who develop the business models will be very successful competing with companies who attempt to retain control over their customers. There have always been two things that would be required if open source software was to materially change the world. One was for open source software to become widely used; the other was that the benefits this software development model supplied to its users had to be communicated and understood. This is Eric's great contribution to the success of the open source software revolution, to the adoption of Linux-based operating systems, and to the success of open source users and the companies who supply them. His ability to explain clearly, effectively, and accurately the benefits of this revolutionary software development model has been central to the success of this revolution. Robert Young, Chairman and CEO, Red Hat, Inc.

873 citations


Proceedings Article
01 Jan 1999
TL;DR: In this paper, a review of the existing frameworks and measures for coupling measurement in object-oriented systems is presented, and a unified framework based on the issues discovered in the review is provided and all existing measures are classified according to this framework.
Abstract: The increasing importance being placed on software measurement has led to an increased amount of research developing new software measures. Given the importance of object-oriented development techniques, one specific area where this has occurred is coupling measurement in object-oriented systems. However, despite a very interesting and rich body of work, there is little understanding of the motivation and empirical hypotheses behind many of these new measures. It is often difficult to determine how such measures relate to one another and for which application they can be used. As a consequence, it is very difficult for practitioners and researchers to obtain a clear picture of the state-of-the-art in order to select or define measures for object-oriented systems. This situation is addressed and clarified through several different activities. First, a standardized terminology and formalism for expressing measures is provided which ensures that all measures using it are expressed in a fully consistent and operational manner. Second, to provide a structured synthesis, a review of the existing frameworks and measures for coupling measurement in object-oriented systems takes place. Third, a unified framework, based on the issues discovered in the review, is provided and all existing measures are then classified according to this framework. This paper contributes to an increased understanding of the state-of-the-art: A mechanism is provided for comparing measures and their potential use, integrating existing measures which examine the same concepts in different ways, and facilitating more rigorous decision making regarding the definition of new measures and the selection of existing measures for a specific goal of measurement. In addition, our review of the state-of-the-art highlights that many measures are not defined in a fully operational form, and relatively few of them are based on explicit empirical models, as recommended by measurement theory.

815 citations
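
To make the kind of measure being surveyed concrete, here is a toy sketch of one classic coupling measure, CBO (coupling between objects): a class is coupled to every distinct class it references or is referenced by. The class names and the input format are hypothetical; real measurement tools would extract the reference map from source code, and the paper's own formalism is far more careful.

```python
def cbo(uses: dict[str, set[str]]) -> dict[str, int]:
    """uses[c] = classes that c references; CBO counts couplings both ways."""
    coupled = {c: set(refs) for c, refs in uses.items()}
    for c, refs in uses.items():
        for other in refs:                  # add fan-in: coupling is symmetric
            coupled.setdefault(other, set()).add(c)
    return {c: len(others - {c}) for c, others in coupled.items()}

print(cbo({"Order": {"Customer", "Invoice"}, "Invoice": {"Customer"}}))
# {'Order': 2, 'Invoice': 2, 'Customer': 2}
```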


Journal ArticleDOI
TL;DR: The structure and design of the software package PHC is described; the package features a great variety of root-counting methods among its tools, and its portability is ensured by the gnu-ada compiler.
Abstract: Polynomial systems occur in a wide variety of application domains. Homotopy continuation methods are reliable and powerful methods to compute numerically approximations to all isolated complex solutions. During the last decade considerable progress has been accomplished on exploiting structure in a polynomial system, in particular its sparsity. In this article the structure and design of the software package PHC is described. The main program operates in several modes, is menu driven, and is file oriented. This package features a great variety of root-counting methods among its tools. The outline of one black-box solver is sketched, and a report is given on its performance on a large database of test problems. The software has been developed on four different machine architectures. Its portability is ensured by the gnu-ada compiler.

708 citations
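
The core idea of homotopy continuation can be shown in a few lines for a single univariate polynomial: deform a start system g with known roots into the target f and track each root numerically. This is only a sketch under simplifying assumptions (fixed step size, a basic Euler predictor with Newton corrector, an illustrative complex constant gamma); PHC's multivariate path trackers, root counts, and step control are far more robust.

```python
f  = lambda x: x**2 - 5*x + 6          # target system: roots 2 and 3
df = lambda x: 2*x - 5
g  = lambda x: x**2 - 1                # start system: known roots +1 and -1
dg = lambda x: 2*x
gamma = 0.6 + 0.8j                     # random complex constant ("gamma trick")

H  = lambda x, t: (1 - t) * gamma * g(x) + t * f(x)
Hx = lambda x, t: (1 - t) * gamma * dg(x) + t * df(x)

def track(x, steps=200, newton_iters=5):
    """Follow one solution path of H(x, t) = 0 from t = 0 to t = 1."""
    for k in range(steps):
        t0, t1 = k / steps, (k + 1) / steps
        # Euler predictor: dx/dt = -(dH/dt)/(dH/dx), with dH/dt = f(x) - gamma*g(x)
        x -= (t1 - t0) * (f(x) - gamma * g(x)) / Hx(x, t0)
        for _ in range(newton_iters):  # Newton corrector at t = t1
            x -= H(x, t1) / Hx(x, t1)
    return x

for start in (1 + 0j, -1 + 0j):
    end = track(start)
    # the two paths end at the target roots 2 and 3
    print(f"{start} -> {end.real:.6f} + {end.imag:.6f}j")
```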


Proceedings Article
Galen C. Hunt1, Doug Brubacher1
12 Jul 1999
TL;DR: Detours, a library for instrumenting arbitrary Win32 functions on x86 machines, is presented; it is the first package on any platform to logically preserve the un-instrumented target function as a subroutine for use by the instrumentation.
Abstract: Innovative systems research hinges on the ability to easily instrument and extend existing operating system and application functionality. With access to appropriate source code, it is often trivial to insert new instrumentation or extensions by rebuilding the OS or application. However, in today's world of commercial software, researchers seldom have access to all relevant source code. We present Detours, a library for instrumenting arbitrary Win32 functions on x86 machines. Detours intercepts Win32 functions by re-writing target function images. The Detours package also contains utilities to attach arbitrary DLLs and data segments (called payloads) to any Win32 binary. While prior researchers have used binary rewriting to insert debugging and profiling instrumentation, to our knowledge, Detours is the first package on any platform to logically preserve the un-instrumented target function (callable through a trampoline) as a subroutine for use by the instrumentation. Our unique trampoline design is crucial for extending existing binary software. We describe our experiences using Detours to create an automatic distributed partitioning system, to instrument and analyze the DCOM protocol stack, and to create a thunking layer for a COM-based OS API. Micro-benchmarks demonstrate the efficiency of the Detours library.

587 citations
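
Detours itself works by rewriting x86 machine code in C/C++, and nothing below should be read as its API. Purely as a language-level analogy, this Python sketch (with hypothetical helper names) shows the property the paper emphasizes: the instrumentation keeps the un-instrumented original callable, playing the role of the trampoline.

```python
import functools
import math
import time

def detour(module, name, instrumentation):
    """Replace module.name, handing the preserved original to the probe."""
    trampoline = getattr(module, name)          # un-instrumented original

    @functools.wraps(trampoline)
    def detoured(*args, **kwargs):
        return instrumentation(trampoline, *args, **kwargs)

    setattr(module, name, detoured)
    return trampoline                           # still callable directly

def timing_probe(original, *args, **kwargs):
    start = time.perf_counter()
    try:
        return original(*args, **kwargs)        # call through the "trampoline"
    finally:
        print(f"{original.__name__}: {time.perf_counter() - start:.6f}s")

detour(math, "sqrt", timing_probe)
math.sqrt(2.0)   # prints the elapsed time and still returns 1.4142...
```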


Book
01 Jan 1999
TL;DR: This book focuses on the principles of numerical analysis and is intended to equip those readers who use statistics to craft their own software and to understand the advantages and disadvantages of different numerical methods.
Abstract: Numerical analysis is the study of computation and its accuracy, stability and often its implementation on a computer. This book focuses on the principles of numerical analysis and is intended to equip those readers who use statistics to craft their own software and to understand the advantages and disadvantages of different numerical methods.

Proceedings ArticleDOI
25 Feb 1999
TL;DR: PowerScope, a tool for profiling energy usage by applications, combines hardware instrumentation to measure current levels with kernel software support to perform statistical sampling of system activity.
Abstract: We describe the design and implementation of PowerScope, a tool for profiling energy usage by applications. PowerScope maps energy consumption to program structure, in much the same way that CPU profilers map processor cycles to specific processes and procedures. Our approach combines hardware instrumentation to measure current level with kernel software support to perform statistical sampling of system activity. Postprocessing software maps the sample data to program structure and produces a profile of energy usage by process and procedure. Using PowerScope, we have been able to reduce the energy consumption of an adaptive video playing application by 46%.
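
A toy version of the post-processing step described above might look as follows; the sample format, fixed supply voltage, and process/procedure names are all assumptions for illustration, not PowerScope's actual data layout.

```python
from collections import defaultdict

VOLTAGE = 5.0      # assumed constant supply voltage (volts)
INTERVAL = 0.001   # assumed sampling interval (seconds)

# Each sample pairs a current reading with the activity observed at that
# instant: (process, procedure, current in amps). Values are made up.
samples = [
    ("videoplayer", "decode_frame", 0.81),
    ("videoplayer", "decode_frame", 0.83),
    ("videoplayer", "display_frame", 0.40),
    ("kernel", "idle", 0.12),
]

energy = defaultdict(float)
for process, procedure, current in samples:
    # E = V * I * dt, attributed to whatever was running during the sample.
    energy[(process, procedure)] += VOLTAGE * current * INTERVAL

for (process, procedure), joules in sorted(energy.items(), key=lambda kv: -kv[1]):
    print(f"{process:12s} {procedure:14s} {joules * 1000:7.3f} mJ")
```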

Journal ArticleDOI
TL;DR: Three-dimensional volume rendering is useful in a wide variety of applications but is just now being incorporated into commercially available software packages for medical imaging, with wider availability and improved cost-to-performance ratios in computing.
Abstract: Three-dimensional (3D) medical images of computed tomographic (CT) data sets can be generated with a variety of computer algorithms. The three most commonly used techniques are shaded surface display, maximum intensity projection, and, more recently, 3D volume rendering. Implementation of 3D volume rendering involves volume data management, which relates to operations including acquisition, resampling, and editing of the data set; rendering parameters including window width and level, opacity, brightness, and percentage classification; and image display, which comprises techniques such as "fly-through" and "fly-around," multiple-view display, obscured structure and shading depth cues, and kinetic and stereo depth cues. An understanding of both the theory and method of 3D volume rendering is essential for accurate evaluation of the resulting images. Three-dimensional volume rendering is useful in a wide variety of applications but is just now being incorporated into commercially available software packages for medical imaging. Although further research is needed to determine the efficacy of 3D volume rendering in clinical applications, with wider availability and improved cost-to-performance ratios in computing, 3D volume rendering is likely to enjoy widespread acceptance in the medical community.
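
Two of the rendering ingredients named above, the window width/level transform and a projection display, are simple enough to sketch with NumPy on a synthetic volume; the window values are illustrative, and the projection shown is a maximum intensity projection rather than full volume rendering with opacity.

```python
import numpy as np

def window_level(volume, width, level):
    """Map raw intensities to [0, 1] display values."""
    lo = level - width / 2.0
    return np.clip((volume - lo) / width, 0.0, 1.0)

rng = np.random.default_rng(0)
volume = rng.normal(40.0, 200.0, size=(64, 64, 64))  # fake CT-like intensities

display = window_level(volume, width=400.0, level=40.0)  # illustrative window
mip = display.max(axis=0)             # maximum intensity projection, one axis

print(display.min(), display.max())   # clipped to 0.0 ... 1.0
print(mip.shape)                      # (64, 64) projected image
```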

Journal ArticleDOI
TL;DR: MCLUST is a software package for cluster analysis, written in Fortran and interfaced to the S-PLUS commercial software package, that includes functions combining hierarchical clustering, EM, and the Bayesian Information Criterion (BIC) in a comprehensive clustering strategy.
Abstract: MCLUST is a software package for cluster analysis written in Fortran and interfaced to the S-PLUS commercial software package. It implements parameterized Gaussian hierarchical clustering algorithms and the EM algorithm for parameterized Gaussian mixture models, with the possible addition of a Poisson noise term. MCLUST also includes functions that combine hierarchical clustering, EM, and the Bayesian Information Criterion (BIC) in a comprehensive clustering strategy. Methods of this type have shown promise in a number of practical applications, including character recognition, tissue segmentation, minefield and seismic fault detection, identification of textile flaws from images, and classification of astronomical data. A web page with related links can be found at
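
MCLUST's own implementation is Fortran interfaced to S-PLUS; as a rough modern stand-in, the same BIC-driven choice of the number of mixture components can be sketched with scikit-learn's GaussianMixture. The data here are synthetic, and sklearn's EM initialization differs from MCLUST's hierarchical-clustering start.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (100, 2)),    # two well-separated
               rng.normal(5, 1, (120, 2))])   # synthetic clusters

best = min(
    (GaussianMixture(n_components=k, random_state=0).fit(X) for k in range(1, 7)),
    key=lambda m: m.bic(X),   # lower BIC = better penalized fit (sklearn's sign)
)
print("chosen number of components:", best.n_components)   # expect 2
```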

Patent
16 Jul 1999
TL;DR: In this article, the authors proposed a method and apparatus for providing an automatically upgradeable software application that includes targeted advertising based upon demographics and user interaction with the computer, including a display region used for banner advertising.
Abstract: A method and apparatus for providing an automatically upgradeable software application includes targeted advertising based upon demographics and user interaction with the computer. The software application includes a display region used for banner advertising that is downloaded over a network such as the Internet. The software application is accessible from a server via the network and demographic information on the user is acquired by the server and used for determining what advertising will be sent to the user. The software application further targets the advertisements in response to normal user interaction with the computer. Data associated with each advertisement is used by the software application in determining when a particular advertisement is to be displayed. This includes the specification of certain programs that the user may have so that, when the user runs the program (e.g., a spreadsheet program), a relevant advertisement will be displayed (e.g., an advertisement for a stock brokerage). This provides two-tiered, real-time targeting of advertising—both demographically and reactively. The software application includes programming that accesses the server to determine if one or more components of the application need upgrading. If so, the components can be downloaded and installed without further action by the user. A distribution tool is provided for software distribution and upgrading over the network. Also provided is a user profile that is accessible to any computer on the network. Furthermore, multiple users of the same computer can possess Internet web resources and files that are personalized, maintained and organized.

Patent
11 Jun 1999
TL;DR: In this article, a storage device configuration manager is implemented in software for a computer system including a processor, a memory coupled to the processor, and at least one storage device coupled to the processor.
Abstract: A storage device configuration manager implemented in software for a computer system including a processor, a memory coupled to the processor, and at least one storage device coupled to the processor, can advantageously allow a user having relatively limited knowledge to configure storage devices for use with specific applications. The storage device configuration manager includes a user interface allowing for selecting, editing, deleting, and/or activating storage policies. The storage policies include information useful for configuring the storage device to operate efficiently with a particular application, or within a particular user environment. The information is used by a policy engine to configure the storage device, thereby reducing the knowledge and effort required by a user.

Journal ArticleDOI
TL;DR: An overview of work being conducted in software process simulation modeling is provided, offering some guidance in selecting a simulation modeling approach for practical application, and recommending some issues warranting additional research.

Journal ArticleDOI
TL;DR: An overview of common knowledge discovery tasks and approaches to solve these tasks is provided, and a feature classification scheme that can be used to study knowledge and data mining software is proposed.
Abstract: Knowledge discovery in databases is a rapidly growing field, whose development is driven by strong research interests as well as urgent practical, social, and economical needs. While in the last few years knowledge discovery tools have been used mainly in research environments, sophisticated software products are now rapidly emerging. In this paper, we provide an overview of common knowledge discovery tasks and approaches to solve these tasks. We propose a feature classification scheme that can be used to study knowledge and data mining software. This scheme is based on the software's general characteristics, database connectivity, and data mining characteristics. We then apply our feature classification scheme to investigate 43 software products, which are either research prototypes or commercially available. Finally, we specify features that we consider important for knowledge discovery software to possess in order to accommodate its users effectively, as well as issues that are either not addressed or insufficiently solved yet.

Patent
16 Sep 1999
TL;DR: In this article, a method and system for downloading software update data for installing a revised software product on a client computer minimizes the amount of update data to be transmitted over the network by downloading only those files needed to put the client computer in the state for installing the product.
Abstract: A method and system for downloading software update data for installing a revised software product on a client computer minimizes the amount of update data to be transmitted over the network by downloading only those files needed to put the client computer in the state for installing the product. In the beginning of the downloading process, the client computer obtains from a setup server an initial setup package that includes a setup program and a list of files required for installing the software product. The setup program running on the client computer then determines whether some current or earlier versions of those files required for installation already exist on the client computer, and compiles a request list of files needed for updating the client computer. The client computer sends the request list to a download server, which maintains a collection of update files and patches. In response to the request list, the download server downloads updating files to the client. Depending on the availability of the requested files or other factors, the downloaded files may or may not be exactly those requested. Using the downloaded files, the setup program updates the existing files to provide the set of installation files on the client computer. The desired revised software product is then installed on the client computer.
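
The request-list step can be sketched in a few lines: compare what is installed against the server's manifest and ask only for what differs. Everything here (the manifest format, SHA-256 as the comparison, the file names) is a hypothetical illustration of the idea, not the patent's actual protocol.

```python
import hashlib
from pathlib import Path

def file_hash(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def build_request_list(manifest: dict[str, str], install_dir: Path) -> list[str]:
    """manifest maps file name -> expected hash in the revised product."""
    needed = []
    for name, expected in manifest.items():
        local = install_dir / name
        if not local.exists() or file_hash(local) != expected:
            needed.append(name)   # missing or stale: request it from the server
    return needed

# Hypothetical manifest from the setup server (hashes abbreviated).
manifest = {"app.exe": "ab12...", "help.dat": "cd34..."}
print(build_request_list(manifest, Path("/opt/app")))
```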

Patent
21 Dec 1999
TL;DR: In this paper, an approach for dragging or manipulating an object across a non-touch-sensitive discontinuity between touch-sensitive screens of a computer is described. The user activates means to trigger manipulation of the object from the source screen to the target screen.
Abstract: Apparatus and process are provided for dragging or manipulating an object across a non-touch-sensitive discontinuity between touch-sensitive screens of a computer. The object is selected and its parameters are stored in a buffer. The user activates means to trigger manipulation of the object from the source screen to the target screen. In one embodiment, a pointer is manipulated continuously on the source screen to effect the transfer. The object can be latched in a buffer for release when the pointer contacts the target screen, preferably before a timer expires. Alternatively, the object is dragged in a gesture, or dragged to impinge a hot switch, which directs the computer to release the object on the target screen. In a hardware embodiment, buttons on a wireless pointer can be invoked to specify cut, copy or menu options and hold the object in the buffer despite a pointer lift. In another software/hardware embodiment, the steps of source screen and object selection can be aided with eye-tracking and voice recognition hardware and software.

Journal ArticleDOI
Joseph Mitola1
TL;DR: Analysis of the topological properties of the software radio architecture yields a layered distributed virtual machine reference model and a set of architecture design principles that may be useful in defining interfaces among hardware, middleware, and higher level software components that are needed for cost-effective software reuse.
Abstract: As the software radio makes its transition from research to practice, it becomes increasingly important to establish provable properties of the software radio architecture on which product developers and service providers can base technology insertion decisions. Establishing provable properties requires a mathematical perspective on the software radio architecture. This paper contributes to that perspective by critically reviewing the fundamental concept of the software radio, using mathematical models to characterize this rapidly emerging technology in the context of similar technologies like programmable digital radios. The software radio delivers dynamically defined services through programmable processing capacity that has the mathematical structure of the Turing machine. The bounded recursive functions, a subset of the total recursive functions, are shown to be the largest class of Turing-computable functions for which software radios exhibit provable stability in plug-and-play scenarios. Understanding the topological properties of the software radio architecture promotes plug-and-play applications and cost-effective reuse. Analysis of these topological properties yields a layered distributed virtual machine reference model and a set of architecture design principles for the software radio. These criteria may be useful in defining interfaces among hardware, middleware, and higher level software components that are needed for cost-effective software reuse.

Patent
Ray Waldin1, Carey Nachenberg1
25 Mar 1999
TL;DR: In this article, a software application (110) is updated to a newer version by means of incremental update patches (122), each containing the information necessary to transform one version of an application to another version.
Abstract: A software application (110) is updated to a newer version by means of incremental update patches (122). The incremental update patches (122) each contain the information necessary to transform one version of an application to another version. Any version of an application (110) may be upgraded to any other version of the application through the use of a series of incremental update patches (122). The appropriate incremental update patches (122) are distributed in a multi-tiered manner, such that some update patches (122) update the application (110) by only one version, and others update the application (110) by several versions.
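
Choosing which patches to apply in such a multi-tiered scheme amounts to finding a path through a version graph. A breadth-first search over a hypothetical patch set (version numbers invented for illustration) finds the shortest chain:

```python
from collections import deque

patches = {                    # from-version -> versions reachable by one patch
    "1.0": ["1.1", "1.3"],     # a one-step patch and a multi-tier jump
    "1.1": ["1.2"],
    "1.2": ["1.3"],
    "1.3": ["2.0"],
}

def patch_chain(installed: str, target: str) -> list[str] | None:
    queue, seen = deque([[installed]]), {installed}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for nxt in patches.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None                # no chain of patches reaches the target

print(patch_chain("1.0", "2.0"))   # ['1.0', '1.3', '2.0']
```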

Journal ArticleDOI
TL;DR: This work argues that checklists are seriously flawed in principle because they do not encompass a consideration of learning issues, and proposes an approach that adapts the idea of usability heuristics by taking account of a socio-constructivist learning perspective.

Patent
12 Oct 1999
TL;DR: In this paper, the authors present a system for tracking and graphically displaying the positions of vehicles in a fleet, and interacting with the vehicles from a base station, which includes update software which updates text data in a database, updates the graphical representation of the vehicle, and bidirectionally and dynamically links and integrates the text data with the graphical representations of a vehicle.
Abstract: The present invention is for a system for tracking and graphically displaying the positions of vehicles in a fleet, and interacting with the vehicles from a base station. The vehicles in the fleet are equipped with a G.P.S. receiver and communicate the G.P.S. information to a base station. A receiver at the base station receives the information. A computer system connected to the receiver then uses this information to display the position of the vehicle using mapping and tracking software. The system also includes update software which updates text data in a database, updates the graphical representation of the vehicle, and bidirectionally and dynamically links and integrates the text data with the graphical representation of a vehicle. The text data in the database includes information relating to the vehicle, the driver, the schedule of the fleet as well as information relating to the fleet. A user is able to select a vehicle using a selector, the update software can provide information relating to text data. If the user selects information relating to a vehicle or driver using the selector, the update software provides the graphical representation of the selected vehicle or driver. The system also has several features allowing a dispatcher to cooperate with the driver in delivery and vehicle operation.

Book
11 Jul 1999
TL;DR: This supplement to any standard DSP text is one of the first books to successfully integrate the use of MATLAB® in the study of DSP concepts and greatly expands the range and complexity of problems that students can effectively study in the course.
Abstract: From the Publisher: This supplement to any standard DSP text is one of the first books to successfully integrate the use of MATLAB® in the study of DSP concepts. In this book, MATLAB® is used as a computing tool to explore traditional DSP topics, and solve problems to gain insight. This greatly expands the range and complexity of problems that students can effectively study in the course. Since DSP applications are primarily algorithms implemented on a DSP processor or software, a fair amount of programming is required. Using interactive software such as MATLAB® makes it possible to place more emphasis on learning new and difficult concepts than on programming algorithms. Interesting practical examples are discussed and useful problems are explored. This updated printing revises the scripts in the book, available functions, and m-files (available for downloading from the Brooks/Cole Bookware Companion Resource Series™ Center Web site) to MATLAB® V5 (created with 5.3).

Journal ArticleDOI
TL;DR: This work demonstrates the advantages of applying methods and techniques from other domains to software engineering and illustrates how, despite difficulties, software evolution can be empirically studied.
Abstract: With the approach of the new millennium, a primary focus in software engineering involves issues relating to upgrading, migrating, and evolving existing software systems. In this environment, the role of careful empirical studies as the basis for improving software maintenance processes, methods, and tools is highlighted. One of the most important processes that merits empirical evaluation is software evolution. Software evolution refers to the dynamic behaviour of software systems as they are maintained and enhanced over their lifetimes. Software evolution is particularly important as systems in organizations become longer-lived. However, evolution is challenging to study due to the longitudinal nature of the phenomenon in addition to the usual difficulties in collecting empirical data. We describe a set of methods and techniques that we have developed and adapted to empirically study software evolution. Our longitudinal empirical study involves collecting, coding, and analyzing more than 25000 change events to 23 commercial software systems over a 20-year period. Using data from two of the systems, we illustrate the efficacy of flexible phase mapping and gamma sequence analytic methods, originally developed in social psychology to examine group problem solving processes. We have adapted these techniques in the context of our study to identify and understand the phases through which a software system travels as it evolves over time. We contrast this approach with time series analysis. Our work demonstrates the advantages of applying methods and techniques from other domains to software engineering and illustrates how, despite difficulties, software evolution can be empirically studied.

Proceedings ArticleDOI
18 Oct 1999
TL;DR: The system is equipped with a unique combination of sensors and software that supports natural language processing, speech recognition, machine translation, handwriting recognition and multimodal fusion.
Abstract: In this paper, we present our efforts towards developing an intelligent tourist system. The system is equipped with a unique combination of sensors and software. The hardware includes two computers, a GPS receiver, a lapel microphone plus an earphone, a video camera and a head-mounted display. This combination includes a multimodal interface to take advantage of speech and gesture input to provide assistance for a tourist. The software supports natural language processing, speech recognition, machine translation, handwriting recognition and multimodal fusion. A vision module is trained to locate and read written language, is able to adapt to new environments, and is able to interpret intentions offered by the user, such as a spoken clarification or pointing gesture. We illustrate the applications of the system using two examples.

Patent
20 Jul 1999
TL;DR: In this paper, the authors present a collection, configuration and integration of software programs that reside on multiple interconnected computer platforms, which share the software programs, data files, and visualization programs via a Local Area Network (LAN).
Abstract: The analysis system is a collection, configuration and integration of software programs that reside on multiple interconnected computer platforms. The software, less computer operating systems, is a combination of sensor, analysis, data conversion, and visualization programs. The hardware platforms consist of several different types of interconnected computers, which share the software programs, data files, and visualization programs via a Local Area Network (LAN). This collection and integration of software and the migration to a single computer platform results in an approach to LAN/WAN monitoring in a passive and/or active mode. The architecture permits digital data input from external sensors for analysis, display and correlation with data and displays derived from four major software concept groups. These are: Virus Computer Code Detection; Analysis of Computer Source and Executable Code; Dynamic Monitoring of Data Communication Networks; 3-D Visualization and Animation of Data.

Proceedings ArticleDOI
16 May 1999
TL;DR: This paper assesses and compares a selection of common cost modeling techniques fulfilling a number of important criteria, using a large multi-organizational database in the business application domain, and shows that the performances of the modeling techniques considered were not significantly different, with the exception of the analogy-based models, which appear to be less accurate.
Abstract: This paper investigates two essential questions related to data-driven, software cost modeling: (1) What modeling techniques are likely to yield more accurate results when using typical software development cost data? and (2) What are the benefits and drawbacks of using organization-specific data as compared to multi-organization databases? The former question is important in guiding software cost analysts in their choice of the right type of modeling technique, if at all possible. In order to address this issue, we assess and compare a selection of common cost modeling techniques fulfilling a number of important criteria using a large multi-organizational database in the business application domain. Namely, these are: ordinary least squares regression, stepwise ANOVA, CART, and analogy. The latter question is important in order to assess the feasibility of using multi-organization cost databases to build cost models and the benefits gained from local, company-specific data collection and modeling. As a large subset of the data in the multi-company database came from one organization, we were able to investigate this issue by comparing organization-specific models with models based on multi-organization data. Results show that the performances of the modeling techniques considered were not significantly different, with the exception of the analogy-based models, which appear to be less accurate. Surprisingly, when using standard cost factors (e.g., COCOMO-like factors, Function Points), organization-specific models did not yield better results than generic, multi-organization models.
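
The flavor of the comparison can be reproduced on synthetic data: fit two of the techniques named above and compare them on MMRE (mean magnitude of relative error), a standard accuracy criterion in this literature. The data, factor names, and scikit-learn stand-ins for OLS and CART are all illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(1)
size = rng.uniform(50, 2000, 150)        # e.g. function points (synthetic)
complexity = rng.integers(1, 6, 150)     # an ordinal cost factor (synthetic)
effort = 2.5 * size * (1 + 0.2 * complexity) * rng.lognormal(0, 0.3, 150)

X = np.column_stack([size, complexity])
train, test = slice(0, 100), slice(100, 150)

def mmre(model):
    pred = model.fit(X[train], effort[train]).predict(X[test])
    return float(np.mean(np.abs(pred - effort[test]) / effort[test]))

print("OLS  MMRE:", round(mmre(LinearRegression()), 3))
print("CART MMRE:", round(mmre(DecisionTreeRegressor(min_samples_leaf=5)), 3))
```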

Journal ArticleDOI
TL;DR: The fast Fourier transform autoindexing routines written by the Rossmann group at Purdue University have been incorporated in MOSFLM, providing a rapid and reliable method of indexing oscillation images.
Abstract: The fast Fourier transform (FFT) autoindexing routines written by the Rossmann group at Purdue University have been incorporated in MOSFLM, providing a rapid and reliable method of indexing oscillation images. This is a procedure which extracts direct-space information about the unit cell from the FFT. The method and its implementation in MOSFLM are discussed.
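
The principle behind FFT autoindexing can be shown in one dimension: spots projected onto a direction through reciprocal space are nearly periodic, and the Fourier transform of that point set peaks at the corresponding direct-space cell length. The sketch below evaluates the transform directly on synthetic data; it is a toy of the idea only, not the Rossmann group's 3D FFT routines or their implementation in MOSFLM.

```python
import numpy as np

rng = np.random.default_rng(2)
cell = 47.3                                  # "unknown" cell length (angstroms)
h = rng.integers(-20, 21, 400)               # reciprocal-lattice indices
x = h / cell + rng.normal(0, 1e-4, h.size)   # noisy projected reciprocal coords

# |sum_j exp(2*pi*i*d*x_j)| is large when d matches the underlying period.
lengths = np.linspace(10.0, 90.0, 4000)      # candidate direct-space lengths
spectrum = np.abs(np.exp(2j * np.pi * np.outer(lengths, x)).sum(axis=1))
print("estimated cell length:", round(lengths[spectrum.argmax()], 2))  # ~47.3
```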