
Showing papers on "Software" published in 2007


Journal ArticleDOI
TL;DR: z-Tree (Zurich Toolbox for Ready-made Economic Experiments) is a stable, easy-to-use toolbox that allows programming almost any kind of economic experiment in a short time.
Abstract: z-Tree (Zurich Toolbox for Ready-made Economic Experiments) is a software package for developing and conducting economic experiments. The software is stable and allows programming almost any kind of experiment in a short time. In this article, I present the guiding principles behind the software design, its features, and its limitations.

9,760 citations


Journal ArticleDOI
TL;DR: The most relevant features of WSXM, a freeware scanning probe microscopy package for MS-Windows, are described, and some relevant procedures of the software are discussed in detail.
Abstract: In this work we briefly describe the most relevant features of WSXM, a freeware scanning probe microscopy software package for MS-Windows. The article is structured in three different sections: the introduction is a perspective on the importance of software in scanning probe microscopy. The second section is devoted to describing the general structure of the application; in this section the capabilities of WSXM to read third-party files are stressed. Finally, a detailed discussion of some relevant procedures of the software is carried out.

6,996 citations


Journal ArticleDOI
TL;DR: A new free suite of software tools designed to make this task easier is described; it exploits the latest advances in hardware and software and is written in the Python interpreted language using entirely free libraries.

3,575 citations


Journal ArticleDOI
TL;DR: The constraint-based reconstruction and analysis toolbox is a software package running in the Matlab environment that allows quantitative prediction of cellular behavior using a constraint-based approach, including computations of steady-state and dynamic optimal growth behavior, the effects of gene deletions, comprehensive robustness analyses, sampling of the range of possible cellular metabolic states, and the determination of network modules.
Abstract: The manner in which microorganisms utilize their metabolic processes can be predicted using constraint-based analysis of genome-scale metabolic networks. Herein, we present the constraint-based reconstruction and analysis toolbox, a software package running in the Matlab environment, which allows for quantitative prediction of cellular behavior using a constraint-based approach. Specifically, this software allows predictive computations of both steady-state and dynamic optimal growth behavior, the effects of gene deletions, comprehensive robustness analyses, sampling the range of possible cellular metabolic states and the determination of network modules. Functions enabling these calculations are included in the toolbox, allowing a user to input a genome-scale metabolic model distributed in Systems Biology Markup Language format and perform these calculations with just a few lines of code. The results are predictions of cellular behavior that have been verified as accurate in a growing body of research. After software installation, calculation time is minimal, allowing the user to focus on the interpretation of the computational results.
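For a feel of this constraint-based workflow, the sketch below uses cobrapy, the later Python re-implementation of COBRA-style methods, rather than the MATLAB toolbox described in the paper; the SBML file name is a placeholder.

# Minimal constraint-based analysis sketch using cobrapy (Python), not the
# MATLAB COBRA Toolbox from the paper; "e_coli_core.xml" is a placeholder path.
from cobra.io import read_sbml_model
from cobra.flux_analysis import single_gene_deletion

model = read_sbml_model("e_coli_core.xml")      # genome-scale model in SBML format
solution = model.optimize()                     # steady-state optimal growth (FBA)
print("Predicted growth rate:", solution.objective_value)

deletions = single_gene_deletion(model)         # effect of single-gene knockouts
print(deletions.head())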

1,827 citations


Journal ArticleDOI
TL;DR: The paper presents a detailed description of the abstractions chosen for defining geometric information of meshes and the handling of degrees of freedom associated with finite element spaces, as well as of linear algebra, input/output capabilities and of interfaces to other software, such as visualization tools.
Abstract: An overview of the software design and data abstraction decisions chosen for deal.II, a general purpose finite element library written in C++, is given. The library uses advanced object-oriented and data encapsulation techniques to break finite element implementations into smaller blocks that can be arranged to fit users' requirements. Through this approach, deal.II supports a large number of different applications covering a wide range of scientific areas, programming methodologies, and application-specific algorithms, without imposing a rigid framework into which they have to fit. A judicious use of programming techniques allows us to avoid the computational costs frequently associated with abstract object-oriented class libraries. The paper presents a detailed description of the abstractions chosen for defining geometric information of meshes and the handling of degrees of freedom associated with finite element spaces, as well as of linear algebra, input/output capabilities and of interfaces to other software, such as visualization tools. Finally, some results obtained with applications built atop deal.II are shown to demonstrate the powerful capabilities of this toolbox.

1,306 citations


Proceedings ArticleDOI
23 May 2007
TL;DR: The term model-driven engineering (MDE) typically describes software development approaches in which abstract models of software systems are created and systematically transformed to concrete implementations; the authors argue that full realizations of the MDE vision may not be possible in the near to medium term, primarily because of the wicked problems involved.
Abstract: The term model-driven engineering (MDE) is typically used to describe software development approaches in which abstract models of software systems are created and systematically transformed to concrete implementations. In this paper we give an overview of current research in MDE and discuss some of the major challenges that must be tackled in order to realize the MDE vision of software development. We argue that full realizations of the MDE vision may not be possible in the near to medium-term primarily because of the wicked problems involved. On the other hand, attempting to realize the vision will provide insights that can be used to significantly reduce the gap between evolving software complexity and the technologies used to manage complexity.

1,155 citations


Posted Content
01 Jan 2007
TL;DR: In this article, the authors define Web 2.0 and explore its implications for the next generation of software, looking at both design patterns and business models; Web 2.0 applications are defined as those that make the most of the intrinsic advantages of the network as platform: delivering software as a continually updated service that gets better the more people use it.
Abstract: This paper was the first initiative to try to define Web 2.0 and understand its implications for the next generation of software, looking at both design patterns and business models. Web 2.0 is the network as platform, spanning all connected devices; Web 2.0 applications are those that make the most of the intrinsic advantages of that platform: delivering software as a continually-updated service that gets better the more people use it, consuming and remixing data from multiple sources, including individual users, while providing their own data and services in a form that allows remixing by others, creating network effects through an "architecture of participation," and going beyond the page metaphor of Web 1.0 to deliver rich user experiences.

1,149 citations


Journal ArticleDOI
TL;DR: The use of the open-source software, CellProfiler, to automatically identify and measure a variety of biological objects in images is described, enabling biologists to comprehensively and quantitatively address many questions that previously would have required custom programming.
Abstract: Careful visual examination of biological samples is quite powerful, but many visual analysis tasks done in the laboratory are repetitive, tedious, and subjective. Here we describe the use of the open-source software, CellProfiler, to automatically identify and measure a variety of biological objects in images. The applications demonstrated here include yeast colony counting and classifying, cell microarray annotation, yeast patch assays, mouse tumor quantification, wound healing assays, and tissue topology measurement. The software automatically identifies objects in digital images, counts them, and records a full spectrum of measurements for each object, including location within the image, size, shape, color intensity, degree of correlation between colors, texture (smoothness), and number of neighbors. Small numbers of images can be processed automatically on a personal computer and hundreds of thousands can be analyzed using a computing cluster. This free, easy-to-use software enables biologists to comprehensively and quantitatively address many questions that previously would have required custom programming, thereby facilitating discovery in a variety of biological fields of study.
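The kind of per-object identification and measurement described above can be illustrated with a few lines of scikit-image; this is a generic threshold-label-measure sketch on synthetic data, not CellProfiler's own pipeline.

# Generic identify-and-measure sketch with scikit-image (not CellProfiler itself).
import numpy as np
from skimage import filters, measure

# Synthetic image: two bright blobs on a dark background.
img = np.zeros((100, 100))
yy, xx = np.ogrid[:100, :100]
img[(yy - 30) ** 2 + (xx - 30) ** 2 < 100] = 1.0
img[(yy - 70) ** 2 + (xx - 65) ** 2 < 225] = 1.0
img += 0.05 * np.random.rand(100, 100)

threshold = filters.threshold_otsu(img)         # automatic global threshold
labels = measure.label(img > threshold)         # identify connected objects
for region in measure.regionprops(labels):      # per-object measurements
    print(region.label, region.area, region.centroid)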

849 citations


Book
01 May 2007
TL;DR: This book considers a wide range of tasks and processes in the data management and analysis process, and shows how software can help you at each stage.
Abstract: Using Software in Qualitative Research is an essential introduction to the practice and principles of Computer Assisted Qualitative Data Analysis Software (CAQDAS). The book will help you to choose the most appropriate package for your needs and get the most out of the software once you are using it. This book considers a wide range of tasks and processes in the data management and analysis process, and shows how software can help you at each stage. In the new edition, the authors present three case studies with different forms of data (text, video and mixed data) and show how each step in the analysis process for each project could be supported by software. The new edition is accompanied by an extensive companion website with step-by-step instructions produced by the software developers themselves. Software programmes covered in the second edition include the latest versions of: ATLAS.ti, DEDOOSE, HyperRESEARCH, MAXQDA, NVivo, QDA Miner and TRANSANA. Ann Lewins and Christina Silver are leading experts in the field of CAQDAS and have trained thousands of students and researchers in using software. Reading this book is like having Ann and Christina at your shoulder as you analyse your data!

811 citations


Book
01 Jan 2007
TL;DR: The second edition of the widely acclaimed "Geospatial Analysis" guide has been updated and extended to include a major new chapter on geocomputational methods, addressing the full spectrum of analytical techniques provided within modern Geographic Information Systems (GIS) and related geospatial software products.
Abstract: This second edition of the widely acclaimed "Geospatial Analysis" guide has been updated and extended to include a major new chapter on Geocomputational Methods. It addresses the full spectrum of analytical techniques that are provided within modern Geographic Information Systems (GIS) and related geospatial software products. It is broad in its treatment of concepts and methods and representative in terms of the software that people actually use. Topics covered include: the principal concepts of geospatial analysis, their origins and methodological context; core components of geospatial analysis, including distance and directional analysis, geometrical processing, map algebra, and grid models; basic methods of exploratory spatial data analysis (ESDA) and spatial statistics, including spatial autocorrelation and spatial regression; surface analysis, including surface form analysis, gridding and interpolation methods; network and locational analysis, including shortest path calculation, traveling salesman problems; facility location and arc routing; Geocomputational methods, including Cellular automata, Agent Based Modelling, Neural Networks and Genetic Algorithms. The Guide has been designed for everyone involved in geospatial analysis, from undergraduate and postgraduate to professional analyst, software engineer and GIS practitioner. It builds upon the spatial analysis topics included in the US National Academies 'Beyond Mapping' and 'Learning to think spatially' agendas, the UK 'Spatial Literacy in Teaching' program, the NCGIA Core Curriculum and the AAAG/UCGIS Body of Knowledge. As such it provides a valuable reference guide and accompaniment to courses built around these programs.
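As one concrete example of the spatial-statistics material the guide covers, the sketch below computes global Moran's I for a toy dataset with a hand-built binary neighbour matrix; the values are invented for illustration.

# Global Moran's I on toy data (illustrative values, plain NumPy).
import numpy as np

x = np.array([2.0, 3.0, 3.5, 8.0, 9.0, 8.5])    # attribute values at six locations
W = np.array([                                   # binary spatial weights (1 = neighbours)
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 1, 0, 0],
    [0, 0, 1, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
], dtype=float)

z = x - x.mean()
n, s0 = len(x), W.sum()
morans_i = (n / s0) * (z @ W @ z) / (z @ z)      # values near +1 indicate spatial clustering
print(round(morans_i, 3))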

620 citations


28 Sep 2007
TL;DR: An open-source MATLAB software Global Positioning System (GPS) receiver is presented; all functions from signal acquisition to position solution are described, along with results from Wide Area Augmentation System (WAAS) ranging, WAAS correction decoding, and navigation solution filtering.
Abstract: This paper presents a brief overview of an open-source, MATLAB software Global Positioning System (GPS) receiver. All functions from signal acquisition to position solution are described. The focus is then turned to developments at the University of Colorado, Boulder related to the software receiver. A Universal Serial Bus (USB) intermediate frequency (IF) data sampler module is described that researchers can use to collect data to support their research. Student projects that were the outcome of a graduate level global navigation satellite system (GNSS) receiver architecture course are also described. These projects included signal acquisition and tracking in a field-programmable gate array (FPGA), multi-correlator signal tracking, L2C acquisition, and Galileo signal tracking. The validity of the existing software receiver as a base for future research was confirmed by the success of these projects. The results from Wide Area Augmentation System (WAAS) ranging, WAAS correction decoding, and navigation solution filtering are described all with the intent of improving positioning accuracy.
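To illustrate the signal-acquisition stage such a software receiver starts with, here is a simplified parallel code-phase search; the spreading code is a random placeholder rather than a real C/A Gold code, and this is not code from the receiver described in the paper.

# Simplified FFT-based GPS signal acquisition (parallel code-phase search).
# A full acquisition would also scan Doppler frequency bins.
import numpy as np

rng = np.random.default_rng(0)
code = rng.choice([-1.0, 1.0], size=1023)        # placeholder spreading code
true_shift = 317
signal = np.roll(code, true_shift) + 0.5 * rng.standard_normal(1023)

# Circular cross-correlation via FFT: the peak location gives the code phase.
corr = np.fft.ifft(np.fft.fft(signal) * np.conj(np.fft.fft(code)))
code_phase = int(np.argmax(np.abs(corr)))
print("estimated code phase:", code_phase)       # should recover true_shift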

Book
02 Nov 2007
TL;DR: This book is intended as an introduction to the entire range of issues important to reconfigurable computing, using FPGAs as the context, or "computing vehicles" to implement this powerful technology.
Abstract: The main characteristic of Reconfigurable Computing is the presence of hardware that can be reconfigured to implement specific functionality that is more suited to specially tailored hardware than to a simple uniprocessor. Reconfigurable computing systems join microprocessors and programmable hardware in order to take advantage of the combined strengths of hardware and software and have been used in applications ranging from embedded systems to high performance computing. Many of the fundamental theories have been identified and used by the Hardware/Software Co-Design research field. Although the same background ideas are shared in both areas, they have different goals and use different approaches. This book is intended as an introduction to the entire range of issues important to reconfigurable computing, using FPGAs as the context, or "computing vehicles", to implement this powerful technology. It will take a reader with a background in the basics of digital design and software programming and provide them with the knowledge needed to be an effective designer or researcher in this rapidly evolving field. · Treatment of FPGAs as computing vehicles rather than glue-logic or ASIC substitutes · Views of FPGA programming beyond Verilog/VHDL · Broad set of case studies demonstrating how to use FPGAs in novel and efficient ways

Journal ArticleDOI
TL;DR: Bsoft is a software package written for image processing of electron micrographs, interpretation of reconstructions, molecular modeling, and general image processing; it supports shell scripting of processes and allows subtasks to be distributed across multiple computers for concurrent processing.

Patent
Emilie Phillips, Pavlo E. Rudakevych, Orjeta Taka, James Gordon Wolfe, Tom Frost
14 May 2007
TL;DR: A method for enhancing the operational efficiency of a remote vehicle using a diagnostic behavior: data received from a plurality of sensors are input and analyzed to determine the existence of deviations from normal operation, parameters in a reference mobility model are updated based on those deviations, and strategies for achieving an operational goal of the vehicle are revised to accommodate them.
Abstract: A method for enhancing operational efficiency of a remote vehicle using a diagnostic behavior. The method comprises inputting and analyzing data received from a plurality of sensors to determine the existence of deviations from normal operation of the remote vehicle, updating parameters in a reference mobility model based on deviations from normal operation, and revising strategies to achieve an operational goal of the remote vehicle to accommodate deviations from normal operation. An embedded simulation and training system for a remote vehicle. The system comprises a software architecture installed on the operator control unit and including software routines and drivers capable of carrying out mission simulations and training.


Journal ArticleDOI
TL;DR: A taxonomy is derived from the analysis of this literature and presents the work via four dimensions: the type of software repositories mined, the purpose, the adopted/invented methodology used, and the evaluation method.
Abstract: A comprehensive literature survey on approaches for mining software repositories (MSR) in the context of software evolution is presented. In particular, this survey deals with those investigations that examine multiple versions of software artifacts or other temporal information. A taxonomy is derived from the analysis of this literature and presents the work via four dimensions: the type of software repositories mined (what), the purpose (why), the adopted/invented methodology used (how), and the evaluation method (quality). The taxonomy is demonstrated to be expressive (i.e., capable of representing a wide spectrum of MSR investigations) and effective (i.e., facilitates similarities and comparisons of MSR investigations). Lastly, a number of open research issues in MSR that require further investigation are identified.

Journal ArticleDOI
TL;DR: Based on suitable video recordings of interactive pedestrian motion and improved tracking software, the authors apply an evolutionary optimization algorithm to determine optimal parameter specifications for the social force model, which is then used for large-scale pedestrian simulations of evacuation scenarios, pilgrimage, and urban environments.
Abstract: Based on suitable video recordings of interactive pedestrian motion and improved tracking software, we apply an evolutionary optimization algorithm to determine optimal parameter specifications for the social force model. The calibrated model is then used for large-scale pedestrian simulations of evacuation scenarios, pilgrimage, and urban environments.
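The social force model being calibrated here has a simple mathematical core; the sketch below performs one update step for two pedestrians with invented parameters, not the calibrated values from the paper.

# Minimal social force model step for two pedestrians (illustrative parameters).
import numpy as np

A, B, tau, dt = 2.0, 0.3, 0.5, 0.1               # interaction strength/range, relaxation time, time step
radius = 0.3                                     # pedestrian radius
pos = np.array([[0.0, 0.0], [1.0, 0.2]])
vel = np.array([[1.0, 0.0], [-1.0, 0.0]])
desired = np.array([[1.3, 0.0], [-1.3, 0.0]])    # desired velocities

force = (desired - vel) / tau                    # driving term toward desired velocity
for i in range(2):
    for j in range(2):
        if i == j:
            continue
        diff = pos[i] - pos[j]
        dist = np.linalg.norm(diff)
        force[i] += A * np.exp((2 * radius - dist) / B) * diff / dist  # pairwise repulsion

vel = vel + dt * force                           # forces act as accelerations (unit mass)
pos = pos + dt * vel
print(pos)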

Journal ArticleDOI
TL;DR: Why is it that some software engineers and computer scientists are able to produce clear, elegant designs and programs, while others cannot?
Abstract: Why is it that some software engineers and computer scientists are able to produce clear, elegant designs and programs, while others cannot? Is it possible to improve these skills through education and training? Critical to these questions is the notion of abstraction.

Journal ArticleDOI
TL;DR: An artifact management system is enhanced with a traceability recovery tool based on Latent Semantic Indexing (LSI), an information retrieval technique, and it is shown that such tools can help to identify quality problems in the textual description of traced artifacts.
Abstract: The main drawback of existing software artifact management systems is the lack of automatic or semi-automatic traceability link generation and maintenance. We have improved an artifact management system with a traceability recovery tool based on Latent Semantic Indexing (LSI), an information retrieval technique. We have assessed LSI to identify strengths and limitations of using information retrieval techniques for traceability recovery and identified the need for an incremental approach. The method and the tool have been evaluated during the development of seventeen software projects involving about 150 students. We observed that although tools based on information retrieval provide useful support for the identification of traceability links during software development, they are still far from supporting a complete semi-automatic recovery of all links. The results of our experience have also shown that such tools can help to identify quality problems in the textual description of traced artifacts.
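The LSI technique at the heart of this tool can be sketched in a few lines with scikit-learn; the artifact texts below are made up, and this is a generic illustration rather than the system evaluated in the paper.

# Toy LSI-based traceability recovery with scikit-learn (illustrative artifacts).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

requirements = ["the user shall log in with a password",
                "the system shall export reports as pdf"]
code_docs = ["class LoginController validate password user session",
             "class ReportExporter write pdf document output"]

vectorizer = TfidfVectorizer()
tfidf = vectorizer.fit_transform(requirements + code_docs)
lsi = TruncatedSVD(n_components=2).fit_transform(tfidf)   # latent semantic space

# Rows 0-1: requirements, rows 2-3: code artifacts; high similarity => candidate trace link.
print(cosine_similarity(lsi[:2], lsi[2:]))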

Proceedings ArticleDOI
23 May 2007
TL;DR: Software Performance Engineering encompasses efforts to describe and improve performance, with two distinct approaches: an early-cycle predictive model-based approach and a late-cycle measurement-based approach.
Abstract: Performance is a pervasive quality of software systems; everything affects it, from the software itself to all underlying layers, such as operating system, middleware, hardware, communication networks, etc. Software Performance Engineering encompasses efforts to describe and improve performance, with two distinct approaches: an early-cycle predictive model-based approach, and a late-cycle measurement-based approach. Current progress and future trends within these two approaches are described, with a tendency (and a need) for them to converge, in order to cover the entire development cycle.
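As a tiny illustration of the early-cycle, model-based side of this work, the sketch below evaluates a single-queue (M/M/1) performance model; the workload numbers are invented.

# Early-cycle predictive sketch: M/M/1 mean response time vs. arrival rate.
service_time = 0.02                              # mean service demand in seconds
for arrival_rate in (10, 25, 40, 49):            # requests per second
    utilization = arrival_rate * service_time
    response_time = service_time / (1.0 - utilization)   # M/M/1 mean response time
    print(f"lambda={arrival_rate:>2}/s  U={utilization:.2f}  R={response_time * 1000:.1f} ms")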

Journal ArticleDOI
TL;DR: In this paper, the authors discuss the software tools for reconstruction and analysis of tomographic data that are being developed at the UGCT, and the analysis of the 3D data focuses primarily on the characterization of pore structures, but will be extended to other applications.
Abstract: The technique of X-ray microtomography using X-ray tube radiation offers an interesting tool for the non-destructive investigation of a wide range of materials. A major challenge lies in the analysis and quantification of the resulting data, allowing for a full characterization of the sample under investigation. In this paper, we discuss the software tools for reconstruction and analysis of tomographic data that are being developed at the UGCT. The tomographic reconstruction is performed using Octopus, a high-performance and user-friendly software package. The reconstruction process transforms the raw acquisition data into a stack of 2D cross-sections through the sample, resulting in a 3D data set. A number of artifact and noise reduction algorithms are integrated to reduce ring artifacts, beam hardening artifacts, COR misalignment, detector or stage tilt, pixel non-linearities, etc. These corrections are very important to facilitate the analysis of the 3D data. The analysis of the 3D data focuses primarily on the characterization of pore structures, but will be extended to other applications. A first package for the analysis of pore structures in three dimensions was developed under Matlab®. A new package, called Morpho+, is being developed in a C++ environment, with optimizations and extensions of the previously used algorithms. The current status of this project will be discussed. Examples of pore analysis can be found in pharmaceuticals, material science, geology and numerous other fields.
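The reconstruction step described here (projections to cross-sections) can be sketched with scikit-image's filtered back-projection; this is a generic illustration on a synthetic phantom, not the Octopus or Morpho+ packages themselves.

# Filtered back-projection sketch with scikit-image (illustrative only).
import numpy as np
from skimage.transform import radon, iradon

phantom = np.zeros((128, 128))
yy, xx = np.ogrid[:128, :128]
phantom[(yy - 64) ** 2 + (xx - 64) ** 2 < 30 ** 2] = 1.0   # simple disc "sample"

theta = np.linspace(0.0, 180.0, 180, endpoint=False)
sinogram = radon(phantom, theta=theta)            # simulated projection (acquisition) data
reconstruction = iradon(sinogram, theta=theta)    # one 2D cross-section; a stack gives the 3D volume
print(reconstruction.shape)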

Book
26 Dec 2007
TL;DR: This first up-to-date textbook for machine vision software provides all the details on the theory and practical use of the relevant algorithms, and features real-world examples, example code with HALCON, and further exercises.
Abstract: This first up-to-date textbook for machine vision software provides all the details on the theory and practical use of the relevant algorithms. The first part covers image acquisition, including illumination, lenses, cameras, frame grabbers, and bus systems, while the second deals with the algorithms themselves. This includes data structures, image enhancement and transformations, segmentation, feature extraction, morphology, template matching, stereo reconstruction, and camera calibration. The final part concentrates on applications, and features real-world examples, example code with HALCON, and further exercises. Uniting the latest research results with an industrial approach, this textbook is ideal for students of electrical engineering, physics and informatics, electrical and mechanical engineers, as well as those working in the sensor, automation and optical industries. Free software available with registration code
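Template matching, one of the algorithms the book covers, can be demonstrated in a few lines of scikit-image; this is a generic normalized cross-correlation example on synthetic data, not HALCON example code from the book.

# Normalized cross-correlation template matching with scikit-image.
import numpy as np
from skimage.feature import match_template

image = np.zeros((60, 60))
image[20:30, 35:45] = 1.0                         # a bright square "part" in the scene
template = np.zeros((12, 12))
template[1:11, 1:11] = 1.0                        # the pattern to locate

result = match_template(image, template)          # correlation surface
y, x = np.unravel_index(np.argmax(result), result.shape)
print("best match at row/col:", y, x)             # top-left corner of the match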

Journal ArticleDOI
TL;DR: This paper provides a detailed analysis of the behavior of various similarity and distance measures that may be employed for software clustering, and analyzes the clustering process of various well-known clustering algorithms by using multiple criteria.
Abstract: Gaining an architectural level understanding of a software system is important for many reasons. When the description of a system's architecture does not exist, attempts must be made to recover it. In recent years, researchers have explored the use of clustering for recovering a software system's architecture, given only its source code. The main contributions of this paper are given as follows. First, we review hierarchical clustering research in the context of software architecture recovery and modularization. Second, to employ clustering meaningfully, it is necessary to understand the peculiarities of the software domain, as well as the behavior of clustering measures and algorithms in this domain. To this end, we provide a detailed analysis of the behavior of various similarity and distance measures that may be employed for software clustering. Third, we analyze the clustering process of various well-known clustering algorithms by using multiple criteria, and we show how arbitrary decisions taken by these algorithms during clustering affect the quality of their results. Finally, we present an analysis of two recently proposed clustering algorithms, revealing close similarities in their apparently different clustering approaches. Experiments on four legacy software systems provide insight into the behavior of well-known clustering algorithms and their characteristics in the software domain.
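The similarity measures and hierarchical algorithms analyzed here can be exercised with SciPy on toy data; the file names and feature matrix below are invented, and this is only a sketch of the clustering-for-modularization idea.

# Hierarchical clustering of source files by shared features (toy data, SciPy).
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

files = ["ui_form.c", "ui_menu.c", "db_conn.c", "db_query.c"]
# Rows: files, columns: binary "uses feature/global" indicators (made up).
features = np.array([[1, 1, 0, 0, 1],
                     [1, 1, 1, 0, 0],
                     [0, 0, 1, 1, 0],
                     [0, 0, 1, 1, 1]])

distances = pdist(features, metric="jaccard")     # one of the measures discussed
tree = linkage(distances, method="complete")      # complete-linkage hierarchy
print(dict(zip(files, fcluster(tree, t=2, criterion="maxclust"))))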

Proceedings ArticleDOI
09 Jun 2007
TL;DR: For certain workloads, SigTM can match the performance of a full-featured hardware TM system, while for workloads with large read-sets it can be up to two times slower.
Abstract: We propose signature-accelerated transactional memory (SigTM), a hybrid TM system that reduces the overhead of software transactions. SigTM uses hardware signatures to track the read-set and write-set for pending transactions and perform conflict detection between concurrent threads. All other transactional functionality, including data versioning, is implemented in software. Unlike previously proposed hybrid TM systems, SigTM requires no modifications to the hardware caches, which reduces hardware cost and simplifies support for nested transactions and multithreaded processor cores. SigTM is also the first hybrid TM system to provide strong isolation guarantees between transactional blocks and non-transactional accesses without additional read and write barriers in non-transactional code. Using a set of parallel programs that make frequent use of coarse-grain transactions, we show that SigTM accelerates software transactions by 30% to 280%. For certain workloads, SigTM can match the performance of a full-featured hardware TM system, while for workloads with large read-sets it can be up to two times slower. Overall, we show that SigTM combines the performance characteristics and strong isolation guarantees of hardware TM implementations with the low cost and flexibility of software TM systems.
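The signature-based conflict detection idea can be mocked up in software; the sketch below shows the general Bloom-filter-style mechanism, not SigTM's actual hardware implementation.

# Software mock-up of signature-based conflict detection (general idea only).
def signature(addresses, bits=64):
    """Hash a set of addresses into a fixed-size bit vector (Bloom-filter style)."""
    sig = 0
    for addr in addresses:
        sig |= 1 << (hash(addr) % bits)
    return sig

tx1_write = signature({0x1000, 0x1040})
tx2_read = signature({0x1040, 0x2000})
tx3_read = signature({0x3000})

# A possible conflict is flagged whenever a writer's signature intersects another
# transaction's signature; false positives can occur, missed conflicts cannot.
print(bool(tx1_write & tx2_read))   # True  -> conflict detected
print(bool(tx1_write & tx3_read))   # likely False -> no conflict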

Journal ArticleDOI
TL;DR: A flexible and fast computer program, called fast-dm, for diffusion model data analysis is introduced, which allows estimating all parameters of Ratcliff’s (1978) diffusion model from the empirical response time distributions of any binary classification task.
Abstract: In the present article, a flexible and fast computer program, called fast-dm, for diffusion model data analysis is introduced. Fast-dm is free software that can be downloaded from the authors' websites. The program allows estimating all parameters of Ratcliff's (1978) diffusion model from the empirical response time distributions of any binary classification task. Fast-dm is easy to use: it reads input data from simple text files, while program settings are specified by commands in a control file. With fast-dm, complex models can be fitted, where some parameters may vary between experimental conditions, while other parameters are constrained to be equal across conditions. Detailed directions for use of fast-dm are presented, as well as results from three short simulation studies exemplifying the utility of fast-dm.
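To make the underlying model concrete, the sketch below simulates single trials of the diffusion process whose parameters fast-dm estimates; the parameter values are invented for illustration.

# Simulate the Ratcliff diffusion process (illustrative parameter values).
import numpy as np

def simulate_trial(v=0.8, a=1.2, zr=0.5, t0=0.3, dt=0.001, rng=np.random.default_rng()):
    """Return (response, reaction_time) for one simulated trial."""
    x = zr * a                                     # starting point, relative to threshold separation a
    t = 0.0
    while 0.0 < x < a:
        x += v * dt + np.sqrt(dt) * rng.standard_normal()   # drift plus within-trial noise
        t += dt
    return ("upper" if x >= a else "lower"), t + t0          # add non-decision time t0

print([simulate_trial() for _ in range(3)])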

Journal ArticleDOI
TL;DR: Current component models are surveyed, analyzed, and classified into a taxonomy based on commonly accepted desiderata for CBD; for each category in the taxonomy, its key characteristics are described and evaluated with respect to these desiderata.
Abstract: Component-based development (CBD) is an important emerging topic in software engineering, promising long-sought-after benefits like increased reuse, reduced time to market, and, hence, reduced software production cost. The cornerstone of a CBD technology is its underlying software component model, which defines components and their composition mechanisms. Current models use objects or architectural units as components. These are not ideal for component reuse or systematic composition. In this paper, we survey and analyze current component models and classify them into a taxonomy based on commonly accepted desiderata for CBD. For each category in the taxonomy, we describe its key characteristics and evaluate them with respect to these desiderata.

Proceedings ArticleDOI
09 Jun 2007
TL;DR: Raksha is proposed, an architecture for software security based on dynamic information flow tracking (DIFT) that supports flexible and programmable security policies that enable software to direct hardware analysis towards a wide range of high-level and low-level attacks.
Abstract: High-level semantic vulnerabilities such as SQL injection and cross-site scripting have surpassed buffer overflows as the most prevalent security exploits. The breadth and diversity of software vulnerabilities demand new security solutions that combine the speed and practicality of hardware approaches with the flexibility and robustness of software systems. This paper proposes Raksha, an architecture for software security based on dynamic information flow tracking (DIFT). Raksha provides three novel features that allow for a flexible hardware/software approach to security. First, it supports flexible and programmable security policies that enable software to direct hardware analysis towards a wide range of high-level and low-level attacks. Second, it supports multiple active security policies that can protect the system against concurrent attacks. Third, it supports low-overhead security handlers that allow software to correct, complement, or extend the hardware-based analysis without the overhead associated with operating system traps. We present an FPGA prototype for Raksha that provides a full-featured Linux workstation for security analysis. Using unmodified binaries for real-world applications, we demonstrate that Raksha can detect high-level attacks such as directory traversal, command injection, SQL injection, and cross-site scripting as well as low-level attacks such as buffer overflows. We also show that low-overhead exception handling is critical for analyses such as memory corruption protection in order to address false positives that occur due to the diverse code patterns in frequently used software.
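The DIFT idea behind Raksha can be illustrated with a small pure-software taint-tracking sketch; Raksha does this in hardware with programmable policies, so the code below is only an analogy.

# Pure-software illustration of dynamic information flow tracking (DIFT).
class Tainted(str):
    """A string value whose 'untrusted' tag propagates through concatenation."""
    def __add__(self, other):
        return Tainted(str(self) + str(other))
    def __radd__(self, other):
        return Tainted(str(other) + str(self))

def run_query(sql):
    if isinstance(sql, Tainted):                  # policy check at the security sink
        raise RuntimeError("untrusted data reached SQL sink")
    print("executing:", sql)

user_input = Tainted("'; DROP TABLE users; --")   # data from an untrusted source
try:
    run_query("SELECT * FROM users WHERE name = '" + user_input + "'")
except RuntimeError as err:
    print("blocked:", err)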

Journal ArticleDOI
Galen C. Hunt, James R. Larus
TL;DR: This paper describes the efforts of the Singularity project to re-examine these design choices in light of advances in programming languages and verification tools, and sketches the ongoing research in experimental systems that build upon it.
Abstract: Every operating system embodies a collection of design decisions. Many of the decisions behind today's most popular operating systems have remained unchanged, even as hardware and software have evolved. Operating systems form the foundation of almost every software stack, so inadequacies in present systems have a pervasive impact. This paper describes the efforts of the Singularity project to re-examine these design choices in light of advances in programming languages and verification tools. Singularity systems incorporate three key architectural features: software-isolated processes for protection of programs and system services, contract-based channels for communication, and manifest-based programs for verification of system properties. We describe this foundation in detail and sketch the ongoing research in experimental systems that build upon it.

Book
01 Jan 2007
Abstract: This book proposes a set of models to describe fuzzy multi-objective decision making (MODM), fuzzy multi-criteria decision making (MCDM), fuzzy group decision making (GDM) and fuzzy multi-objective group decision-making problems, respectively. It also gives a set of related methods (including algorithms) to solve these problems. One distinguishing feature of this book is that it provides two decision support system software packages for readers to apply these proposed methods. A set of real-world applications and some new directions in this area are then described to further instruct readers how to use these methods and software in their practice.
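As a flavour of the kind of problem these models address, the sketch below performs a fuzzy weighted-sum ranking with triangular fuzzy numbers; it is a generic toy example, not one of the models or software packages described in the book.

# Generic fuzzy multi-criteria example with triangular fuzzy numbers (l, m, u).
import numpy as np

# Two alternatives rated on two criteria; each rating is a triangular fuzzy number.
ratings = np.array([[[3, 4, 5], [6, 7, 8]],       # alternative A
                    [[5, 6, 7], [4, 5, 6]]],      # alternative B
                   dtype=float)
weights = np.array([0.6, 0.4])                    # crisp criterion weights

weighted = (ratings * weights[None, :, None]).sum(axis=1)   # fuzzy weighted sum per alternative
scores = weighted.mean(axis=1)                    # defuzzify by centroid (l + m + u) / 3
for name, score in zip(["A", "B"], scores):
    print(name, round(score, 2))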