
Showing papers in "International Journal of Intelligent Information Technologies in 2005"


Journal ArticleDOI
TL;DR: This framework provides novel methods to control coordinated systems using higher-order mobile agents, which are hierarchically structured agents that can contain other mobile agents.
Abstract: This paper presents a framework for controlling intelligent robots connected by the Internet. This framework provides novel methods to control coordinated systems using higher-order mobile agents. Higher-order mobile agents are hierarchically structured agents that can contain other mobile agents. By using higher-order mobile agents, intelligent robots in action can acquire new functionalities dynamically as well as exchange their roles with other colleague robots. The higher-order property of the mobile agents enables them to be organized hierarchically and dynamically. In addition to these advantages, higher-order mobile agents require minimal communication: a connection needs to be established only while an agent migrates.
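The containment relation the abstract describes resembles the composite pattern. A minimal sketch in Python, assuming nothing about the paper's actual system (all class and method names are invented):

```python
class MobileAgent:
    """A higher-order mobile agent: it may contain other mobile agents,
    and containment adds their functionality to the whole."""

    def __init__(self, name, behavior=None):
        self.name = name
        self.behavior = behavior      # callable implementing this agent's role
        self.children = []            # contained (lower-order) agents

    def add(self, agent):
        # Acquiring a new functionality dynamically = containing a new agent.
        self.children.append(agent)

    def migrate(self, host):
        # The whole hierarchy moves as one unit, so a connection is needed
        # only for the duration of the transfer.
        host.receive(self)

    def run(self):
        results = [self.behavior()] if self.behavior else []
        for child in self.children:   # contained agents contribute their roles
            results.extend(child.run())
        return results


class Host:
    """A robot or server that accepts migrating agents."""

    def __init__(self):
        self.agents = []

    def receive(self, agent):
        self.agents.append(agent)
```

A robot agent that contains a vision agent would then run both roles, and the vision agent could later be moved into a colleague robot's hierarchy to exchange roles.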

50 citations


Journal ArticleDOI
TL;DR: This paper presents one possible implementation framework for such Web services that should be Semantic Web enabled and form a Service Network based on internal and external agents, which can host heterogeneous mobile agents and coordinate them to perform needed tasks.
Abstract: Among traditional users of Web resources, industry has a growing set of smart industrial devices with embedded intelligence. Just like humans, these devices need online services (e.g., for condition monitoring, remote diagnostics, and maintenance). In this paper, we present one possible implementation framework for such Web services. Such services should be Semantic Web enabled and form a Service Network based on internal and external agents' platforms, which can host heterogeneous mobile agents and coordinate them to perform needed tasks. The concept of a "mobile service component" covers not only the exchange of queries and service responses, but also the delivery and composition of a service provider: a mobile service component carrier (an agent) can move to a field device's local environment (an embedded agent platform) and perform its activities locally. Service components improve their performance through online learning and communication with other components. Discovery of heterogeneous service components is based on semantic P2P search.

24 citations


Journal ArticleDOI
TL;DR: Results from 10 simulation runs per experiment show a reduction in malaria deaths when schools are included, particularly in combination with an increased number of hospitals.
Abstract: Malaria is a vector-borne disease that greatly affects social and economic development. We adopt the complex system paradigm in our analysis of the problem. Our aim is to assess the impact of education on malaria healthcare. Multi-agent systems are employed to model the spread of malaria in Haiti, where we introduce malaria education as a possible way of regulating deaths due to the parasite. We launch three experiments, each with environment modifications: three hospitals; three hospitals and 20 schools; and five hospitals and 20 schools. The results of running 10 simulations for each experiment show a reduction in malaria deaths when schools are included, particularly in combination with an increase in the number of hospitals.
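The experimental setup can be illustrated with a toy agent-based model in which schools lower the infection probability (education) and hospitals raise the recovery probability (treatment). All parameters and functional forms below are invented for illustration; this is not the paper's simulation of Haiti:

```python
import random


def simulate(pop=500, days=100, hospitals=3, schools=0, seed=1):
    """Toy agent-based malaria model. Each agent is susceptible (S),
    infected (I), or dead (D); schools reduce infection risk (capped at
    halving it) and hospitals raise the daily recovery chance."""
    rng = random.Random(seed)
    p_infect = 0.05 * (1 - min(0.5, 0.02 * schools))   # education effect
    p_recover = min(0.9, 0.1 * hospitals)              # treatment effect
    p_die = 0.01
    state = ["S"] * pop
    deaths = 0
    for _ in range(days):
        for i, s in enumerate(state):
            if s == "S" and rng.random() < p_infect:
                state[i] = "I"
            elif s == "I":
                r = rng.random()
                if r < p_die:
                    state[i] = "D"
                    deaths += 1
                elif r < p_die + p_recover:
                    state[i] = "S"      # recovered, susceptible again
    return deaths
```

Comparing `simulate(hospitals=3, schools=0)` against `simulate(hospitals=5, schools=20)` reproduces the qualitative finding: fewer deaths when education and extra hospitals are combined.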

23 citations


Journal ArticleDOI
TL;DR: This work presents a methodology that generates association rules without revealing confidential inputs such as statistical properties of individual sites, and yet retains a high level of accuracy in the resultant rules.
Abstract: Data mining is a process that analyzes voluminous digital data in order to discover hidden but useful patterns. However, such hidden patterns have statistical meaning and may disclose sensitive information, so privacy has become one of the prime concerns in the data-mining research community. Since distributed association mining discovers association rules by combining local models from various distributed sites, breaches of data privacy happen more often than they do in centralized environments. In this work, we present a methodology that generates association rules without revealing confidential inputs, such as the statistical properties of individual sites, yet retains a high level of accuracy in the resultant rules. An important outcome of the proposed technique is that it reduces overall communication costs: performance evaluation shows that it reduces the communication cost significantly compared with other well-known distributed association-rule-mining algorithms. Moreover, the global rule model generated by the proposed method is based on the exact global support of each itemset and hence avoids the inconsistency that occurs when global models are generated from partial support counts of an itemset. © 2005, IGI Global. All rights reserved.
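One classic way to combine local support counts without revealing any site's individual count is a ring-based secure sum, a generic privacy-preserving building block (not necessarily this paper's exact protocol; the counts below are illustrative):

```python
import random


def secure_sum(local_counts, modulus=10**9, seed=None):
    """Ring-based secure sum: the initiating site adds a random mask,
    each site adds its private count modulo `modulus`, and the initiator
    removes the mask at the end. No site ever sees another site's raw
    count, only a masked running total."""
    rng = random.Random(seed)
    mask = rng.randrange(modulus)
    running = mask
    for count in local_counts:            # value passed around the ring
        running = (running + count) % modulus
    return (running - mask) % modulus     # initiator unmasks the total


# Exact global support of an itemset across three sites:
site_counts = [120, 85, 240]              # private local support counts
total = secure_sum(site_counts, seed=7)   # equals sum(site_counts)
```

Because the result is the exact global support, rules derived from it avoid the inconsistency the abstract attributes to models built from partial support counts.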

23 citations


Journal ArticleDOI
TL;DR: An experimental multi-agent system developed for and aimed at a computer-supported fault diagnosis in electricity distribution networks is described, based on a hierarchy of five agents that cooperate with each other to diagnose a fault.
Abstract: When a fault occurs in a power system, the protective relays detect the fault and trip appropriate circuit breakers, which isolate the affected equipment from the rest of the power system. Fault diagnosis of power systems is the process of identifying faulty components and/or sections by analysing observable symptoms (telemetry messages). As the domain is characterised by dynamic situations, extensive telemetering, complex operations, and distribution of lines and substations over a large geographical area, it is difficult to tackle fault diagnosis problems through the strength and capability of a single intelligent system. This paper describes an experimental multi-agent system developed for and aimed at a computer-supported fault diagnosis in electricity distribution networks. The system is based on a hierarchy of five agents that cooperate with each other to diagnose a fault. A set of detailed case studies is presented, and the results obtained suggest that an agent-based approach is very efficient and has a good potential for real-time application.
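The symptom-to-fault inference the abstract describes can be sketched as a single rule over relay and breaker telemetry; the paper's five-agent hierarchy is far richer, and the protection scheme and names below are invented:

```python
def diagnose(telemetry, protection):
    """Infer faulty sections from observed symptoms: a section is
    suspected when its protective relay operated AND every breaker
    that relay trips is reported open."""
    faulty = []
    for section, (relay, breakers) in protection.items():
        relay_tripped = telemetry.get(relay, False)
        breakers_open = all(telemetry.get(b, False) for b in breakers)
        if relay_tripped and breakers_open:
            faulty.append(section)
    return faulty


# Invented single-line example: relay R1 protects line L1 via CB1 and CB2,
# relay R2 protects line L2 via CB2 and CB3.
protection = {
    "L1": ("R1", ["CB1", "CB2"]),
    "L2": ("R2", ["CB2", "CB3"]),
}
telemetry = {"R1": True, "CB1": True, "CB2": True}  # observed messages
```

With these messages only L1 is diagnosed, since R2 never operated even though CB2 opened; a real system must also handle missed and spurious telemetry, which is part of why a cooperating agent hierarchy is used.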

20 citations


Journal ArticleDOI
TL;DR: This article aims to provide a synopsis of agent-based modeling and how to adapt an agent- based research strategy for the scientific study of complex business systems.
Abstract: This article aims to provide a synopsis of agent-based modeling and how to adapt an agent-based research strategy for the scientific study of complex business systems. Agent-based systems have been a popular field of study in computer science for some time. While computer science-related research has been focused on the artifact itself, such as computational languages and algorithms, research in the management sciences is explicitly focused on business problems. Research in Information Systems (IS) has begun to advance knowledge in the use of agent-based systems as a means to seek different, computational explanations for business phenomena that have eluded scientific inquiry reliant on traditional—specifically, law and axiomatic—explanation (Kimbrough, 2003). The focus on business problems requires a different research approach than what is successful in computer science. Key modifications include first, the explicit articulation of benefits specific to the management sciences, and second, instrument validation.

18 citations


Journal ArticleDOI
TL;DR: The research described in this paper aims at specifying and developing an open model-based infrastructure and a set of tools that promote consistent knowledge management within collaborative environments in the construction sector.
Abstract: The research described in this paper aims at specifying and developing an open model-based infrastructure and a set of tools that promote consistent knowledge management within collaborative environments in the construction sector. The specified solution emerged from a comprehensive analysis of the business and information/knowledge management practices of four construction organizations and makes use of a construction-specific ontology for specifying adaptive mechanisms that can organize documents according to their content and interdependencies while maintaining their overall consistency. The proposed Web-based infrastructure includes services allowing the creation, capture, indexing, retrieval, and dissemination of knowledge. It also promotes the integration of third-party services, including proprietary and legacy tools. The Web-services model is used as the underlying middleware technology that supports the solution. The research is sponsored by the European Commission under the Framework V Programme (Information Society and Technology). Keywords: information engineering; Internet-based services; knowledge-based systems; knowledge management

14 citations


Journal ArticleDOI
TL;DR: A kind of anthropomorphic representation of the networked person with whom the user can identify and feel comfortable is introduced, first in an intuitive and informal way with an emphasis on its social aspect, then in a more detailed way with the analysis of its main components.
Abstract: As peer-to-peer computing finally reaches a critical mass, it triggers changes in the IT landscape that traditional network infrastructures, based on centralized, client/server topologies, cannot manage. Consequently, the ad hoc, self-organized, and loosely controlled nature of peer-to-peer networks needs to be supported by a new coordination layer representing the interests of the user. In order to define this new abstraction layer, this paper introduces the concept of the virtual twin — a kind of anthropomorphic representation of the networked person with whom the user can identify and feel comfortable. We discuss the inner structure of the virtual twin, first in an intuitive and informal way with an emphasis on its social aspect, then in a more detailed way with the analysis of its main components.

12 citations


Journal ArticleDOI
TL;DR: A generic electronic market platform that is designed to run different kinds of auctions and exchanges, and a generic OR/XOR bidding language that can express different OR/ XOR combinations is implemented for Web interfaces.
Abstract: In this paper, we present a generic electronic market platform that is designed to run different kinds of auctions and exchanges. Researchers can use the platform to implement different electronic market mechanisms, simulate the market behavior of their interests, and experiment with it. A generic OR/XOR bidding language that can express different OR/XOR combinations is implemented for Web interfaces. Different auctions, including combinatorial auctions, multiple-round reverse auctions, and multiple homogeneous good auctions, have been built and run successfully on the platform.
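An OR/XOR bid can be represented as an OR list of XOR groups of (bundle, price) atoms, where at most one atom per XOR group may be accepted and accepted bundles must not overlap. A brute-force evaluator sketch (illustrative only; the platform's actual language and semantics may differ):

```python
from itertools import product


def best_value(bid, awarded_items):
    """Best value of an OR-of-XOR bid given a set of awarded items:
    choose at most one (bundle, price) atom from each XOR group; a
    bundle counts only if fully contained in the awarded items, and
    chosen bundles must not overlap."""
    best = 0
    # Each XOR group contributes one atom or nothing (None).
    for choice in product(*[group + [None] for group in bid]):
        taken, value, ok = set(), 0, True
        for atom in choice:
            if atom is None:
                continue
            bundle, price = atom
            if not bundle <= awarded_items or bundle & taken:
                ok = False
                break
            taken |= bundle
            value += price
        if ok:
            best = max(best, value)
    return best


# "(A for 5 XOR B for 4) OR (C for 3)":
bid = [
    [({"A"}, 5), ({"B"}, 4)],   # XOR group: at most one of these wins
    [({"C"}, 3)],               # OR'ed with this second group
]
```

Awarding {A, C} yields 5 + 3 = 8 since the atoms come from different OR branches, while awarding {A, B} yields only 5: the XOR group forbids taking both A and B.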

12 citations


Journal ArticleDOI
TL;DR: This paper proposes a framework for building an automated intelligent agent for memory management under the client-server architecture with the emphasis on collecting the needs of the organization and acquiring the application usage patterns for each client involved in real time.
Abstract: Amidst the era of e-economy, one of the difficulties from the standpoint of the information systems manager is, among others, the forecast of memory needs for the organization. In particular, the manager is often confronted with maintaining a certain threshold amount of memory for a prolonged period of time. However, this constraint requires more than technical and managerial resolutions, encompassing knowledge management for the group, eliciting tacit knowledge from the end users, and pattern and time series analyses of utilization for various applications. This paper proposes a framework for building an automated intelligent agent for memory management under the client-server architecture. The emphasis is on collecting the needs of the organization and acquiring the application usage patterns for each client involved in real time. Due to the dynamic nature of the tasks, incorporation of a neural network architecture with tacit knowledge base is suggested. Considerations for future work associated with technical matters comprising platform independence, portability, and modularity are discussed.

11 citations


Journal ArticleDOI
TL;DR: This paper explores the interactions among humans, evolutionary agents, and memes in order to reflect upon the future, and develops the social implications of meritorious and malevolent memes exchanged by evolutionary agents.
Abstract: Push technologies are rapidly moving toward autonomous and evolutionary intelligent agents for seeking, organizing, and creating information via the Web and other pervasive and innovative information technologies. We describe and define autonomous and evolutionary agents designed for push technologies. We introduce memes, messages that one agent broadcasts to another causing the receiving agent to evolve, and explore how memes will influence evolutionary agents. We develop the social implications of meritorious and malevolent memes exchanged by evolutionary agents. In the conclusion we explore the interactions among humans, evolutionary agents, and memes in order to reflect upon the future. Finally, we raise a series of future research questions regarding genetic determination of evolutionary agents, whether it is possible to predict if a meme will be meritorious or malevolent, and whether it is desirable to legislate the evolution of agents that are evolved from malevolent memes. Our contribution is to raise awareness of the movement toward push technologies deploying evolutionary agents, with its promise and caveats, as well as to provide future research directions.

Journal ArticleDOI
TL;DR: The newly developed method utilizes the knowledge of the GOR-V information theory and the power of the neural networks to classify a novel protein sequence in one of its three secondary structure classes (helices, strands, and coils).
Abstract: Protein secondary structure prediction is a fundamental step in determining the 3D structure of a protein. In this paper, a new method for predicting protein secondary structure from amino acid sequences has been proposed and implemented. Cuff and Barton 513 protein data set is used in training and testing the prediction methods under the same hardware, platforms, and environments. The newly developed method utilizes the knowledge of the GOR-V information theory and the power of the neural networks to classify a novel protein sequence in one of its three secondary structure classes (helices, strands, and coils). The newly developed method (NN-GORV-I) is improved further by applying a filtering mechanism to the searched database and, hence, named NN-GORV-II. The developed prediction methods are rigorously analyzed and tested, together with other five well-known prediction methods in this domain in order to allow easy comparison and clear conclusions.
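The information-theoretic half of such a method can be illustrated with a much-simplified GOR-style scorer: estimate log-odds of each residue appearing near each structure class in training data, then classify each position by summing scores over a window. This is a toy in the spirit of GOR-I, not the NN-GORV implementation, and the training data below is invented:

```python
import math
from collections import defaultdict

WINDOW = 2        # residues considered on each side (GOR uses 8)
CLASSES = "HEC"   # helix, strand, coil


def train(examples):
    """examples: list of (sequence, structure) string pairs of equal
    length. Returns log-odds scores keyed by (offset, residue, class)."""
    pair = defaultdict(int)   # (offset, residue, class) counts
    res = defaultdict(int)    # (offset, residue) counts
    cls = defaultdict(int)    # class counts
    total = 0
    for seq, struct in examples:
        for i, s in enumerate(struct):
            cls[s] += 1
            total += 1
            for d in range(-WINDOW, WINDOW + 1):
                if 0 <= i + d < len(seq):
                    pair[(d, seq[i + d], s)] += 1
                    res[(d, seq[i + d])] += 1
    scores = {}
    for (d, r, s), n in pair.items():
        # log P(residue r at offset d | class s) - log P(residue r at offset d)
        scores[(d, r, s)] = math.log(n / cls[s]) - math.log(res[(d, r)] / total)
    return scores


def predict(seq, scores):
    out = []
    for i in range(len(seq)):
        def score(s):
            return sum(scores.get((d, seq[i + d], s), 0.0)
                       for d in range(-WINDOW, WINDOW + 1)
                       if 0 <= i + d < len(seq))
        out.append(max(CLASSES, key=score))
    return "".join(out)
```

NN-GORV additionally feeds such information-theoretic features into a neural network and, in NN-GORV-II, filters the searched database first; this sketch only shows the GOR-style scoring layer.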

Journal ArticleDOI
TL;DR: This work proposes a semantics-based approach to implement the data-sourcing service for business rules that captures semantics of business rules and provides an agent-enabled mechanism that dynamically maps business rules to the enterprise data model.
Abstract: In recent years, business rule management has become an important component of enterprise information systems. Business rules represent guidelines about how an enterprise should conduct its business and provide better service for customers. Business rules are being widely deployed in supply chains to support real-time decision making. The research reported in this paper is aimed at designing a dynamically adaptable data-sourcing service for deploying business rules effectively in supply chain management. Such a data-sourcing service is important since execution of business rules requires data to be retrieved from various data sources spread across the enterprise, including the enterprise data warehouse. We propose a semantics-based approach to implement the data-sourcing service for business rules. Our approach captures semantics of business rules and provides an agent-enabled mechanism that dynamically maps business rules to the enterprise data model. A prototype system is implemented to illustrate our sourcing service and demonstrate the feasibility of our approach.
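The mapping idea can be sketched as a semantic map that resolves the business terms a rule references to physical data sources at execution time. Everything below (terms, sources, and the rule itself) is hypothetical, not from the paper:

```python
# Hypothetical semantic map: business-rule terms -> physical data sources.
SEMANTIC_MAP = {
    "customer_credit_score": ("warehouse", "dim_customer", "credit_score"),
    "order_total":           ("oltp",      "orders",       "total_amount"),
}

# Stand-in for the enterprise data sources the service would query.
DATA = {
    ("warehouse", "dim_customer", "credit_score"): {"c1": 710, "c2": 580},
    ("oltp", "orders", "total_amount"): {"c1": 900.0, "c2": 150.0},
}


def source(term, key):
    """Resolve a business term through the semantic map and fetch the
    value from whichever data source it currently points to."""
    return DATA[SEMANTIC_MAP[term]][key]


def rule_priority_shipping(customer):
    # Business rule: priority shipping iff good credit and a large order.
    return (source("customer_credit_score", customer) >= 650
            and source("order_total", customer) >= 500)
```

Because the rule names only business terms, remapping `SEMANTIC_MAP` (e.g. when a table moves from the OLTP system to the warehouse) changes the data sourcing without touching the rule, which is the adaptability the paper's agent-enabled mechanism targets.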

Journal ArticleDOI
TL;DR: This research presents a system for agent-based simulation and support for clinical processes and describes the architecture of the system and its functionalities and the integration of existing Foundation for Intelligent Physical Agents, DICOM, and HL7 standards.
Abstract: There are several continuing challenges within the health care domain. On the one hand, there is a greater need for individualized, patient-oriented processes in diagnostics, therapy, nursing, and administration, and on the other hand, hospitals have extremely distributed decision processes and strong local (individual) autonomy with a high degree of situational dynamics. This research focuses on developing an information system that can substantially increase the efficiency of hospital process management. We present a system for agent-based simulation and support for clinical processes. In particular, we describe the architecture of the system and its functionalities and the integration of existing Foundation for Intelligent Physical Agents (FIPA), DICOM, and HL7 standards. We discuss an example scenario for clinical trials to illustrate how the system supports distributed clinical process management. This system interacts with other multiagent systems within the Agent.Hospital framework and hospital information systems (HISs) in the eHealth Lab. This research is part of the German Priority Research Program (SPP) 1083, Intelligent Agents and Their Application in Business Scenarios.

Journal ArticleDOI
TL;DR: This article recommends a code of ethics for online auctions and, based on it, proposes a model for an information system that supports and enhances ethical conduct in an online auction environment.
Abstract: The online auction has become an important form of e-commerce. Although using a different mode for conducting auction activities, online auctions should abide by the same code of ethics outlined in the face-to-face auction environment. Yet, ethics-related issues for online auctions have not been fully discussed in the current literature. The unique features of online auctions present an opportunity to address how ethical conduct could be supported, monitored, and enforced in an online auction environment. With technology being the backbone of the online auction, information systems appear to be a useful tool in facilitating ethics enforcement. This article summarizes ethics-related issues that are particularly relevant in online auctions, and recommends a code of ethics that could be applied to online auctions. Based on this set of ethics, this article proposes a model for an information system that will support and enhance ethical conduct in an online auction environment.

Journal ArticleDOI
TL;DR: The proposed multi-agent system tool, ADAM, is in the form of a self-administering wrapper around database systems, and it addresses and offers a solution to the problem of overburdened and expensive DBAs with the objective of making databases a cost-effective option for small/medium-sized organizations.
Abstract: In today’s world, databases and database systems have become an essential component of everyday life, so much so that a life without DBMSs has become inconceivable. This article focuses on relational database management systems in particular, and proposes a novel and innovative multi-agent system that would autonomously and rationally administer and maintain databases. The proposed multi-agent system tool, ADAM, is in the form of a self-administering wrapper around database systems, and it addresses and offers a solution to the problem of overburdened and expensive DBAs with the objective of making databases a cost-effective option for small/medium-sized organizations. An implementation of the agent-based system to proactively or reactively identify and resolve a small subset of DBA tasks is discussed, and the GAIA methodology is used to outline the detailed analysis and design of the same. Role models describing the responsibilities, permissions, activities, and protocols of the candidate agents, and interaction models representing the links between the roles, are explained. The Coordinated Intelligent Rational agent model is used to describe the agent architecture, and a brief description of the functionalities, responsibilities, and components of each agent in the ADAM multi-agent system is presented. Finally, a prototype system implementation using JADE 2.5 and Oracle 8.1.7 is presented as evidence of the feasibility of the proposed agent-based solution for the autonomous administration and maintenance of relational databases.

Journal ArticleDOI
TL;DR: This work proposes a framework to allow researchers and developers to choose the level of detail, the type of technologies, and the extent of computing power they want to utilize for their proposed solutions, building on the successful foundations of the Web: ease of use, flexibility, and almost unlimited expressive power.
Abstract: Currently, we face a major gap between the reality of the Web — a disjoined and tangled mass of loosely coupled information resources — and the vision for the Web — a tightly integrated and openly structured information network with machine-readable data that allows autonomous agencies to create new applications empowered by this wealth of information. Current research shows that we can hope to achieve this goal, but there are many obstacles left to be mastered. We propose a framework to allow researchers and developers to choose the level of detail, the type of technologies, and the extent of computing power they want to utilize for their proposed solutions. We focus on a flexible abstraction layer, pattern-oriented architecture, and open interfaces to build on the successful foundations of the Web: ease of use, flexibility, and almost unlimited expressive power. Agents are the central paradigm for software development using this architecture.