Showing papers on "System integration published in 2006"


Journal ArticleDOI
TL;DR: In this paper, the design challenges facing current and future processors (stringent power limits, high-frequency targets, and continuing system integration trends) are reviewed, and the architecture, circuit design, and physical implementation of a first-generation Cell processor are described.
Abstract: This paper reviews the design challenges that current and future processors must face, with stringent power limits, high-frequency targets, and continuing system integration trends. This paper then describes the architecture, circuit design, and physical implementation of a first-generation Cell processor and the design techniques used to overcome these challenges. A Cell processor consists of a 64-bit Power Architecture processor coupled with multiple synergistic processors, a flexible IO interface, and a memory interface controller; the platform supports multiple operating systems, including Linux. This multi-core SoC, implemented in 90-nm SOI technology, achieved a high clock rate by maximizing custom circuit design while maintaining reasonable complexity through design modularity and reuse.

258 citations


Patent
06 Dec 2006
TL;DR: In this article, the authors present a portable device that a computer can boot from, containing a prefabricated independent operating-system environment engineered from the ground up to prioritize security while maximizing usability, in order to provide a safe, reliable, and easy-to-use platform for high-risk applications.
Abstract: The present invention is a portable device that a computer can boot from, containing a prefabricated independent operating system environment which is engineered from the ground up to prioritize security while maximizing usability, in order to provide a safe, reliable, and easy-to-use practical platform for high-risk applications. An embodiment of the present invention may temporarily transform an ordinary computer into a naturally inexpensive logical appliance which encapsulates a turn-key functional solution within the digital equivalent of a military-grade security fortress. This allows existing hardware to be conveniently leveraged to provide a self-contained system which does not depend on the on-site labor of rare and expensive system integration and security experts.

247 citations


Journal ArticleDOI
TL;DR: In this paper, a review of ERP projects, especially in services, complemented by six case studies, has been undertaken, identifying and discussing characteristics of services that distinguish them from manufacturing.

242 citations


Journal ArticleDOI
TL;DR: Model-driven development is an emerging paradigm that solves numerous problems associated with the composition and integration of large-scale systems while leveraging advances in software development technologies such as component-based middleware.
Abstract: Historically, software development methodologies have focused more on improving tools for system development than on developing tools that assist with system composition and integration. Component-based middleware like Enterprise JavaBeans (EJB), Microsoft .NET, and the CORBA Component Model (CCM) have helped improve software reusability through component abstraction. However, as developers have adopted these commercial off-the-shelf technologies, a wide gap has emerged between the availability and sophistication of standard software development tools like compilers and debuggers, and the tools that developers use to compose, analyze, and test a complete system or system of systems. As a result, developers continue to accomplish system integration using ad hoc methods without the support of automated tools. Model-driven development is an emerging paradigm that solves numerous problems associated with the composition and integration of large-scale systems while leveraging advances in software development technologies such as component-based middleware. MDD elevates software development to a higher level of abstraction than is possible with third-generation programming languages.

213 citations


Journal ArticleDOI
TL;DR: In this article, the authors discuss the technology behind RFID systems, identify the applications of RFID in various industries, and discuss the technical challenges of the RFID implementation and the corresponding strategies to overcome those challenges.
Abstract: Purpose – The purpose of this paper is to discuss the technology behind RFID systems, identify the applications of RFID in various industries, and discuss the technical challenges of RFID implementation and the corresponding strategies to overcome those challenges. Design/methodology/approach – Comprehensive literature review and integration of the findings from the literature. Findings – Technical challenges of RFID implementation include tag cost, standards, tag and reader selection, data management, systems integration and security. The corresponding solution is suggested for each challenge. Research limitations/implications – A survey-type study is needed to validate the results. Practical implications – This research offers useful technical guidance for companies which plan to implement RFID and we expect it to provide the motivation for much future research in this area. Originality/value – Given the infancy of RFID applications, little research has addressed the technical issues of RFID implementation...

188 citations


Journal ArticleDOI
TL;DR: Fuzzy cognitive mapping is used as a technique to identify causal interrelationships among the EAI adoption factors; this approach enhances the quality of the evaluation process and emphasizes the importance of each factor and its interrelationships with other factors.
Abstract: The integration of heterogeneous information systems has always been problematic in health-care organizations, as it is associated with the delivery of key services and has high operational costs. Therefore, health-care organizations are looking for new means to increase their functional capabilities and reduce integration cost. In addressing this need, enterprise application integration (EAI) technology has emerged to facilitate systems integration, enhance the quality of services, and reduce integration costs. Despite the application of EAI in other sectors, its adoption in health care is slow. In seeking to build on the limited normative research surrounding EAI, the authors of this paper focus on the evaluation of factors that influence EAI adoption in the health-care sector. In doing so, they use fuzzy cognitive mapping as a technique to identify causal interrelationships among the EAI adoption factors. This approach enhances the quality of the evaluation process and emphasizes the importance of each factor and its interrelationship with other factors. The outcomes shown in this paper will support health-care organizations' decision makers in exploring the implications surrounding EAI adoption.

183 citations
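To make the fuzzy-cognitive-mapping technique concrete, here is a minimal sketch: adoption factors are nodes, signed weights encode causal influence, and node activations are iterated to a fixed point. The factor names and weight values below are invented for illustration and are not taken from the paper.

```python
import math

# Hypothetical EAI adoption factors and illustrative causal weights
# (row factor influences column factor); values are NOT from the paper.
factors = ["cost", "IT support", "security concerns", "EAI adoption"]
W = [
    [0.0, 0.0,  0.0, -0.6],   # cost suppresses adoption
    [0.0, 0.0, -0.3,  0.7],   # IT support eases concerns, drives adoption
    [0.0, 0.0,  0.0, -0.5],   # security concerns suppress adoption
    [0.0, 0.0,  0.0,  0.0],
]

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def fcm_step(state):
    """One synchronous FCM update: a_j <- sigmoid(a_j + sum_i a_i * w_ij)."""
    n = len(state)
    return [sigmoid(state[j] + sum(state[i] * W[i][j] for i in range(n)))
            for j in range(n)]

state = [0.8, 0.9, 0.5, 0.5]   # initial factor activations
for _ in range(20):            # iterate toward a fixed point
    state = fcm_step(state)

print(dict(zip(factors, (round(a, 2) for a in state))))
```

Reading off the converged activations shows which factors end up dominant once all causal feedback is accounted for, which is the "interrelationship" insight the paper attributes to the technique.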


Proceedings ArticleDOI
01 Oct 2006
TL;DR: The Society of Automotive Engineers (SAE) Architecture Analysis & Design Language, AS5506, provides a means for the formal specification of the hardware and software architecture of embedded computer systems and system of systems.
Abstract: The Society of Automotive Engineers (SAE) Architecture Analysis & Design Language, AS5506, provides a means for the formal specification of the hardware and software architecture of embedded computer systems and system of systems. It was designed to support a full Model Based Development lifecycle including system specification, analysis, system tuning, integration, and upgrade over the lifecycle. It was designed to support the integration of multiple forms of analyses and to be extensible in a standard way for additional analysis approaches. A system can be automatically integrated from AADL models when fully specified and when source code is provided for the software components. Analysis of large complex systems has been demonstrated in the avionics domain.

173 citations


Journal ArticleDOI
TL;DR: The Gaggle is described, a simple, open-source Java software environment that helps to solve the problem of software and database integration; its use identified a putative ricin-like protein, a discovery made possible by simultaneous data exploration using a wide range of publicly available data and a variety of popular bioinformatics software tools.
Abstract: Systems biologists work with many kinds of data, from many different sources, using a variety of software tools. Each of these tools typically excels at one type of analysis, such as of microarrays, of metabolic networks and of predicted protein structure. A crucial challenge is to combine the capabilities of these (and other forthcoming) data resources and tools to create a data exploration and analysis environment that does justice to the variety and complexity of systems biology data sets. A solution to this problem should recognize that data types, formats and software in this high-throughput age of biology are constantly changing. In this paper we describe the Gaggle, a simple, open-source Java software environment that helps to solve the problem of software and database integration. Guided by the classic software engineering strategy of separation of concerns and a policy of semantic flexibility, it integrates existing popular programs and web resources into a user-friendly, easily extended environment. We demonstrate that four simple data types (names, matrices, networks, and associative arrays) are sufficient to bring together diverse databases and software. We highlight some capabilities of the Gaggle with an exploration of Helicobacter pylori pathogenesis genes, in which we identify a putative ricin-like protein, a discovery made possible by simultaneous data exploration using a wide range of publicly available data and a variety of popular bioinformatics software tools. We have integrated diverse databases (for example, KEGG, BioCyc, String) and software (Cytoscape, DataMatrixViewer, R statistical environment, and TIGR Microarray Expression Viewer).
Through this loose coupling of diverse software and databases the Gaggle enables simultaneous exploration of experimental data (mRNA and protein abundance, protein-protein and protein-DNA interactions), functional associations (operon, chromosomal proximity, phylogenetic pattern), metabolic pathways (KEGG) and PubMed abstracts (STRING web resource), creating an exploratory environment useful to 'web browser and spreadsheet biologists', to statistically savvy computational biologists, and to those in between. The Gaggle uses Java RMI and Java Web Start technologies and can be found at http://gaggle.systemsbiology.net .

165 citations
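The Gaggle's broadcast model, a handful of simple data types relayed between connected tools, can be sketched as a toy message broker. The class and method names below are illustrative only, not the actual Gaggle Java RMI API.

```python
from typing import Any, Callable, Dict, List

# A broadcast is just a typed payload; the four Gaggle data types (names,
# matrices, networks, associative arrays) all fit this shape.
Broadcast = Dict[str, Any]

class Boss:
    """A minimal hub: tools ('geese') register handlers, broadcasts fan out."""
    def __init__(self) -> None:
        self.geese: Dict[str, Callable[[Broadcast], None]] = {}

    def register(self, name: str, handler: Callable[[Broadcast], None]) -> None:
        self.geese[name] = handler

    def broadcast(self, sender: str, message: Broadcast) -> None:
        for name, handler in self.geese.items():
            if name != sender:          # don't echo back to the sender
                handler(message)

boss = Boss()
received: List[Broadcast] = []
boss.register("network-viewer", received.append)
boss.register("spreadsheet", lambda msg: None)

# A name-list broadcast, the simplest of the data types
boss.broadcast("spreadsheet", {"type": "namelist", "names": ["VNG0101", "VNG0102"]})
print(received)
```

The point of such loose coupling is that each tool only agrees on the small set of broadcast types, never on each other's internals, which is how the Gaggle keeps constantly changing tools interoperable.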


Journal ArticleDOI
TL;DR: An INS/GPS integration method based on artificial neural networks (ANN) is suggested to fuse uncompensated INS measurements and differential GPS (DGPS) measurements, with two different architectures: the position update architecture (PUA) and the position and velocity update architecture (PVUA).
Abstract: Inertial-navigation system (INS) and global position system (GPS) technologies have been widely applied in many positioning and navigation applications. INS determines the position and the attitude of a moving vehicle in real time by processing the measurements of three-axis gyroscopes and three-axis accelerometers mounted along three mutually orthogonal directions. GPS, on the other hand, provides the position and the velocity through the processing of the code and the carrier signals of at least four satellites. Each system has its own unique characteristics and limitations. Therefore, the integration of the two systems offers several advantages and overcomes each of their drawbacks. The integration of INS and GPS is usually implemented utilizing the Kalman filter, which represents one of the best solutions for INS/GPS integration. However, the Kalman filter performs adequately only under certain predefined dynamic models. Alternatively, this paper suggests an INS/GPS integration method based on artificial neural networks (ANN) to fuse uncompensated INS measurements and differential GPS (DGPS) measurements. The proposed method suggests two different architectures: the position update architecture (PUA) and the position and velocity update architecture (PVUA). Both architectures were developed utilizing multilayer feed-forward neural networks with a conjugate gradient training algorithm.

133 citations
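The position update architecture described above can be sketched as a small feed-forward network that maps an INS position fix to a corrected position. The weights here are random placeholders; in the paper they are trained against DGPS data with a conjugate gradient algorithm, so the layer sizes and outputs below are illustrative only.

```python
import math
import random

# Sketch of a PUA-style network: normalized INS position in, correction out.
random.seed(0)

def layer(n_in: int, n_out: int):
    """One fully connected layer: each row holds n_in weights plus a bias."""
    return [[random.uniform(-0.1, 0.1) for _ in range(n_in + 1)]
            for _ in range(n_out)]

def forward(x, hidden, output):
    """Forward pass: tanh hidden layer, linear output layer."""
    h = [math.tanh(sum(w * v for w, v in zip(row[:-1], x)) + row[-1])
         for row in hidden]
    return [sum(w * v for w, v in zip(row[:-1], h)) + row[-1]
            for row in output]

hidden = layer(3, 8)     # inputs: normalized latitude, longitude, altitude
output = layer(8, 3)     # outputs: corrected position components

corrected = forward([0.52, -1.31, 0.12], hidden, output)
print(corrected)
```

Unlike a Kalman filter, nothing here encodes a predefined dynamic model; the mapping from INS output to corrected position is learned entirely from data, which is the trade-off the paper is exploring.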


Journal ArticleDOI
TL;DR: By using a generic communication framework, MARIE aims to create a flexible distributed component system that allows robotics developers to share software programs and algorithms, and design prototypes rapidly based on their own integration needs.
Abstract: This paper presents MARIE, a middleware framework oriented towards developing and integrating new and existing software for robotic systems. By using a generic communication framework, MARIE aims to create a flexible distributed component system that allows robotics developers to share software programs and algorithms, and design prototypes rapidly based on their own integration needs. The use of MARIE is illustrated with the design of a socially interactive autonomous mobile robot platform capable of map building, localization, navigation, tasks scheduling, sound source localization, tracking and separation, speech recognition and generation, visual tracking, message reading and graphical interaction using a touch screen interface.

133 citations


Journal ArticleDOI
TL;DR: A distributed system architecture that utilizes dominant state-of-the-art standard technologies, such as workflows, ontologies, and web services, in order to address the need for interoperability in the industrial enterprise environment in an efficient way is presented.
Abstract: The need for interoperability is prominent in the industrial enterprise environment. Different applications and systems that cover the overall range of the industrial infrastructure from the field to the enterprise level need to interoperate. This need is driven by the enterprise demand for greater flexibility and for the widest possible integration of enterprise systems. This paper presents a distributed system architecture that utilizes dominant state-of-the-art standard technologies, such as workflows, ontologies, and web services, in order to address this need in an efficient way.

Journal ArticleDOI
TL;DR: Methods to link virtual environments (VE) and quantitative ergonomic analysis tools in real time for occupational ergonomic studies are presented, contributing to a new trend of integrating different technology fields for synergistic use in industry.

Journal ArticleDOI
TL;DR: The Biozon system is an extensive knowledge resource of heterogeneous biological data that unifies multiple biological databases consisting of a variety of data types and allows propagation of knowledge through inference and fuzzy searches.
Abstract: Integration of heterogeneous data types is a challenging problem, especially in biology, where the number of databases and data types increase rapidly. Amongst the problems that one has to face are integrity, consistency, redundancy, connectivity, expressiveness and updatability. Here we present a system (Biozon) that addresses these problems, and offers biologists a new knowledge resource to navigate through and explore. Biozon unifies multiple biological databases consisting of a variety of data types (such as DNA sequences, proteins, interactions and cellular pathways). It is fundamentally different from previous efforts as it uses a single extensive and tightly connected graph schema wrapped with hierarchical ontology of documents and relations. Beyond warehousing existing data, Biozon computes and stores novel derived data, such as similarity relationships and functional predictions. The integration of similarity data allows propagation of knowledge through inference and fuzzy searches. Sophisticated methods of query that span multiple data types were implemented and first-of-a-kind biological ranking systems were explored and integrated. The Biozon system is an extensive knowledge resource of heterogeneous biological data. Currently, it holds more than 100 million biological documents and 6.5 billion relations between them. The database is accessible through an advanced web interface that supports complex queries, "fuzzy" searches, data materialization and more, online at http://biozon.org .

Patent
29 Sep 2006
TL;DR: A software management database contains data structures supporting computer software provisioning for a range of CTO/BTO variations, language variations, region variations, and operating system variations.
Abstract: A software management database contains data structures supporting computer software provisioning for a range of CTO/BTO variations, language variations, region variations, and operating system variations.

Journal ArticleDOI
TL;DR: A server, BiologicalNetworks, is described, which provides visualization, analysis services and an information management framework over PathSys, which allows easy retrieval, construction and visualization of complex biological networks, including genome-scale integrated networks of protein–protein, protein–DNA and genetic interactions.
Abstract: Systems level investigation of genomic scale information requires the development of truly integrated databases dealing with heterogeneous data, which can be queried for simple properties of genes or other database objects as well as for complex network level properties, for the analysis and modelling of complex biological processes. Towards that goal, we recently constructed PathSys, a data integration platform for systems biology, which provides dynamic integration over a diverse set of databases [Baitaluk et al. (2006) BMC Bioinformatics 7, 55]. Here we describe a server, BiologicalNetworks, which provides visualization, analysis services and an information management framework over PathSys. The server allows easy retrieval, construction and visualization of complex biological networks, including genome-scale integrated networks of protein-protein, protein-DNA and genetic interactions. Most importantly, BiologicalNetworks addresses the need for systematic presentation and analysis of high-throughput expression data by mapping and analysis of expression profiles of genes or proteins simultaneously on to regulatory, metabolic and cellular networks. BiologicalNetworks Server is available at http://brak.sdsc.edu/pub/BiologicalNetworks.

Proceedings ArticleDOI
Jim Woodcock1
24 Apr 2006
TL;DR: The computer science research community is collaborating to develop verification technology that will demonstrably enhance the productivity and reliability with which software is designed, developed, integrated, and maintained.
Abstract: Bugs have become an unpleasant fact for software producers. Awareness is growing in industry that something must be done about software reliability. A growing number of academic and industrial researchers believe that the way to revolutionize the production of software is by using formal methods, and they also believe that doing so is now feasible. Given the right computer-based tools, the use of formal methods will become widespread, transforming the practice of software engineering. The computer science research community is collaborating to develop verification technology that will demonstrably enhance the productivity and reliability with which software is designed, developed, integrated, and maintained.

Journal ArticleDOI
TL;DR: The application of an informatics evaluation framework that provides a heuristic for matching the stage of system design with the level of evaluation is illustrated, underscoring the value of sound evaluation methodologies throughout the stages of system development.
Abstract: Background Dramatic advances in health information technologies necessitate a comprehensive evaluation framework covering all phases of development, from conception to routine operational use. Objectives The purpose of this article was to illustrate the application of an informatics evaluation framework that provides a heuristic for matching the stage of system design and the level of evaluation. Methods An evaluation framework is illustrated in the context of five studies in different stages of system design. Results The studies discussed in this article represent the various stages of the framework. In addition, they are examples of how evaluation research studies not only contribute to the assessment of system design but also constitute distinct contributions to knowledge. Discussion The ability to engineer advanced information systems has exceeded the understanding of how to deploy them effectively in complex settings and to adapt them to a range of user populations. A systematic, continuous evaluation increases the likelihood that the conditions for success will be understood, including how to tailor a system, how to maintain the target population's use of the system, and how to innovate continually to enhance the functionality of the system and the users' experience. Without sound evaluation methodologies throughout the stages of system development, information systems have limited potential to influence healthcare processes positively.

Journal ArticleDOI
TL;DR: In this article, a field failure rate prediction methodology that starts with analyzing system test data and field data (of previous releases or products) using software reliability growth models (SRGMs) is presented.

Journal ArticleDOI
TL;DR: A framework of critical success factors (CSFs) that can be used to manage IS integration projects, according to a firm's current stage of IT integration maturity and other IS infrastructure characteristics is proposed.
Abstract: System integration is a complex technological task, and an infrastructure decision that seems right today might well be obsolete tomorrow. This article proposes a framework of critical success factors (CSFs) that can be used to manage IS integration projects, according to a firm's current stage of IT integration maturity and other IS infrastructure characteristics. To demonstrate the potential utility of this CSF framework, the authors analyze case studies at two firms using 86 metrics for 20 CSFs developed by the authors.

Journal ArticleDOI
TL;DR: Based on practical experience, a process of DSA system integration is presented that can assist utilities and grid operators in addressing key issues during the specification, development, and installation of such tools.
Abstract: The implementation of online dynamic security assessment (DSA) systems is growing worldwide, and the deployment of this advanced technology is expected to improve the real-time security and, hence, the reliability of power systems. While not insignificant, the cost and effort required to install online DSA tools are minor compared to the benefits of reducing the volume of offline studies required and, more importantly, the benefits of identifying and avoiding potential security problems in the systems to reduce the risk of blackouts. Based on practical experience, a process of DSA system integration is presented that can assist utilities and grid operators in addressing key issues during the specification, development, and installation of such tools. A number of successful online DSA projects are discussed to illustrate the viability and practicality of such applications, even for large, complex power systems. The penetration of online DSA tools is expected to continue to grow as operators seek timely and cost-effective approaches to enhance system performance. In the meantime, work is continuing on new methods of online analysis, advanced preventive and corrective control tools, and improved hardware architectures.

Journal ArticleDOI
TL;DR: Results from several test studies demonstrate the effectiveness of the approach in retrieving biologically interesting relations between genes and proteins and the networks connecting them, and the utility of PathSys as a scalable graph-based warehouse for interaction-network integration and a hypothesis generator system.
Abstract: Background The goal of information integration in systems biology is to combine information from a number of databases and data sets, which are obtained from both high and low throughput experiments, under one data management scheme such that the cumulative information provides greater biological insight than is possible with individual information sources considered separately.

Proceedings ArticleDOI
16 Oct 2006
TL;DR: In this paper, the authors examine the system integration and optimization issues associated with distributed energy systems and show the benefits of using power electronic (PE) interfaces for such applications, which allow DE systems to provide increased functionality through improved power quality and voltage/VAR support, increase electrical system compatibility by reducing fault contributions, and offer flexibility in operations with various other DE sources, while reducing overall interconnection costs.
Abstract: Optimization of overall electrical system performance is important for the long-term economic viability of distributed energy (DE) systems. With the increasing use of DE systems in industry and their technological advancement, it is becoming more important to understand the integration of these systems with the electric power systems. New markets and benefits for distributed energy applications include the ability to provide ancillary services, improve energy efficiency, enhance power system reliability, and allow customer choice. Advanced power electronic (PE) interfaces will allow DE systems to provide increased functionality through improved power quality and voltage/VAR support, increase electrical system compatibility by reducing the fault contributions, and offer flexibility in operations with various other DE sources, while reducing overall interconnection costs. This paper examines the system integration and optimization issues associated with DE systems and shows the benefits of using PE interfaces for such applications.

Journal ArticleDOI
TL;DR: The proposed integration solutions, which use off-the-shelf OPC products, will enable integrators to create robust solutions and provide true interoperability while at the same time reducing implementation time and costs.

Proceedings ArticleDOI
16 Aug 2006
TL;DR: An industrial case study from the domain of shipboard computing is used to show how system execution modeling tools can provide software and system engineers with quantitative estimates of system bottlenecks and performance characteristics to help evaluate the performance of component-based enterprise DRE systems and reduce time/effort in the integration phase.
Abstract: Component middleware is popular for enterprise distributed systems because it provides effective reuse of the core intellectual property (i.e., the "business logic"). Component-based enterprise distributed real-time and embedded (DRE) systems, however, incur new integration problems associated with component configuration and deployment. New research is therefore needed to minimize the gap between the development and deployment/configuration of components, so that deployment and configuration strategies can be evaluated well before system integration. This paper uses an industrial case study from the domain of shipboard computing to show how system execution modeling tools can provide software and system engineers with quantitative estimates of system bottlenecks and performance characteristics to help evaluate the performance of component-based enterprise DRE systems and reduce time/effort in the integration phase. The results from our case study show the benefits of system execution modeling tools and pinpoint where more work is needed.

Journal ArticleDOI
TL;DR: A product model based on ISO 10303 is developed to overcome problems such as data loss during the transfer and sharing of project data, and a framework for practical management of steel bridge information generated from existing tools, using open standards and web technology, is presented.

Journal ArticleDOI
TL;DR: A positive association between systems integration and client continuity of care was consistently demonstrated, and better results were obtained in systems characterized by stronger management arrangements, fewer service sectors, and system-wide implementation of intensive case management and centralized access to services.
Abstract: Continuity of care is a concern for mental health clients in the post-deinstitutionalization era of community care. A proposed solution is systems integration. This paper reviewed research on systems integration, focusing on continuity of care outcomes. A positive association between systems integration and client continuity of care was consistently demonstrated. Better results were obtained in systems characterized by stronger management arrangements, fewer service sectors, and system-wide implementation of intensive case management and centralized access to services. Future research should evaluate a wider range of systems integrating mechanisms, using client-based measures that more directly represent continuity of care experiences.

Journal ArticleDOI
TL;DR: The need to take an integrated hardware/software approach to developing SHM solutions is addressed with particular emphasis on the coupling of a general purpose data interrogation software package for structural health monitoring with a modular wireless sensing and processing platform.
Abstract: The process of implementing a damage detection strategy for aerospace, civil and mechanical engineering infrastructure is referred to as structural health monitoring (SHM). The authors' approach is to address the SHM problem in the context of a statistical pattern recognition paradigm. In this paradigm, the process can be broken down into four parts: (1) Operational Evaluation, (2) Data Acquisition and Cleansing, (3) Feature Extraction and Data Compression, and (4) Statistical Model Development for Feature Discrimination. These processes must be implemented through hardware or software and, in general, some combination of these two approaches will be used. This paper will discuss each portion of the SHM process with particular emphasis on the coupling of a general purpose data interrogation software package for structural health monitoring with a modular wireless sensing and processing platform. More specifically, this paper will address the need to take an integrated hardware/software approach to developing SHM solutions.
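The four-part statistical pattern recognition paradigm above can be sketched end to end on synthetic accelerometer data. The RMS feature and 3-sigma threshold below are illustrative choices for steps (3) and (4), not the authors' specific methods.

```python
import math
import random

# Toy SHM pipeline on simulated sensor data; all numbers are illustrative.
random.seed(1)

def acquire(damaged: bool, n: int = 512):
    """(2) Data acquisition: simulated trace, larger variance if damaged."""
    scale = 1.5 if damaged else 1.0
    return [random.gauss(0.0, scale) for _ in range(n)]

def extract_feature(signal):
    """(3) Feature extraction/compression: one RMS value per record."""
    return math.sqrt(sum(x * x for x in signal) / len(signal))

# (4) Statistical model: outlier threshold fit on undamaged baseline records
baseline = [extract_feature(acquire(False)) for _ in range(50)]
mean = sum(baseline) / len(baseline)
std = math.sqrt(sum((f - mean) ** 2 for f in baseline) / len(baseline))
threshold = mean + 3.0 * std

# Flag a new record as damaged if its feature exceeds the threshold
print(extract_feature(acquire(True)) > threshold)
```

Step (1), operational evaluation, has no code analogue: it is the up-front judgment of what "damage" means for the structure and what data are worth collecting, which then fixes the choices made in steps (2)-(4).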

Proceedings ArticleDOI
01 Jan 2006
TL;DR: An overview of current approaches to enterprise application integration together with an assessment of their strengths and weaknesses is provided and a new enterprise application architecture is proposed, based on the idea of OMG’s Model Driven Architecture, to structure the EAI problem into five general types of model.
Abstract: To achieve greater automation of their business processes, organizations face the challenge of integrating disparate systems. In attempting to overcome this problem, organizations are turning to new integration software called Enterprise Application Integration (EAI). Implementing EAI is a complex task involving both technological and business challenges and requires appropriate EAI architecture. This paper first provides an overview of current approaches to enterprise application integration together with an assessment of their strengths and weaknesses. It then proposes a new enterprise application architecture, based on the idea of OMG’s Model Driven Architecture, to structure the EAI problem into five general types of model. This architecture is developed in response to the need to separate the technological aspect from the business aspect so that both can evolve independently. The success of EAI lies in its resilience to both technological and business changes.

Journal ArticleDOI
TL;DR: Key issues of CRM in financial services networks are redundant competencies of partnering companies, privacy constraints, CRM process integration, customer information exchange, and CRM systems integration.
Abstract: Purpose – The aim of this paper is to identify key issues and successful patterns of collaborative customer relationship management (CRM) in financial services networks.Design/methodology/approach – The study takes the form of a multi‐case analysis.Findings – The paper finds that key issues of CRM in financial services networks are redundant competencies of partnering companies, privacy constraints, CRM process integration, customer information exchange, and CRM systems integration. To address these issues, partnering companies have to agree on clear responsibilities in collaborative processes. Data privacy protection laws require that customer data transfer between partnering companies has the explicit approval of customers. For process integration, companies have to agree on process standards and a joint integration architecture. Web services and internet‐based standards can be used for inter‐organizational systems integration. Data integration requires the development of a joint data model. Either a un...

Proceedings ArticleDOI
30 Mar 2006
TL;DR: A case study conducted in a systems integration company investigating the impact of MDD infusion is presented, taking a practical approach focused on better and more productively meeting customers' requirements.
Abstract: Integration projects are typically faced with a proliferation of standards, technologies, platforms and tools. Bespoke solutions are frequently used for what are often generic problems, generating work with no discernible business value. Business requirements naturally evolve during the development process. Because of the complexity of code-centric bespoke solutions, the reactivity to these changes is costly in terms of effort and time. Though model driven development (MDD) promises to have a positive response to these problems, there is little practical evidence of the impact of its infusion. This paper presents a case study conducted in a systems integration company investigating the impact of MDD infusion. We take a practical approach focused on better and more productively meeting customers' requirements. Besides this commercial perspective, our approach takes into account the practical aspects of project activities. One of these aspects is the influence of the motivation and beliefs of actors in the success of a technological change, which we analyse using actor-network theory (ANT). For systems integration companies, the ability to learn is a critical asset and differentiation factor. Knowledge management (KM), as a process through which organizations generate value from their intellectual assets, is another practical aspect we look at. We present preliminary findings of the work completed so far. We look for industrial representativeness of the results, defined here as being "close to real life" experience that industrial actors (project managers, developers, etc.) can relate to, draw conclusions from and translate into action.