
Showing papers in "Journal of Software Engineering and Applications in 2010"


Journal ArticleDOI
TL;DR: A synergistic integration of the constituent parts of mechatronic systems, i.e. mechanical, electronic and software, is proposed through the 3+1 SysML view-model and the Model Integrated Mechatronics (MIM) paradigm.
Abstract: Software is becoming the driving force in today’s mechatronic systems. It not only realizes a significant part of their functionality but is also used to realize their most competitive advantages. However, the traditional development process is wholly inappropriate for the development of these systems, which impose a tighter coupling of software with electronics and mechanics. In this paper, a synergistic integration of the constituent parts of mechatronic systems, i.e. mechanical, electronic and software, is proposed through the 3+1 SysML view-model. SysML is used to specify the central view-model of the mechatronic system, while the other three views are for the different disciplines involved. The V-model, widely used in software engineering, is extended to address the requirements set by the 3+1 SysML view-model and the Model Integrated Mechatronics (MIM) paradigm. A SysML profile is described to facilitate the application of the proposed view-model in the development of mechatronic systems.

131 citations


Journal ArticleDOI
TL;DR: The paper proposes ten principles of knowledge creation in open source software community: Self-organizing, Code sharing, Adaptation, Usability, Sustention, Talent, Interaction, Collaboration, Happiness, and Democracy.
Abstract: In this paper, we discuss agile software process improvement at company P, describing its current level of process management and analyzing its problems. We design a success-factor model for company P covering organizational culture, systems, products, customers, markets, leadership, technology and other key dimensions, and verify it through a questionnaire at company P. Finally, we apply knowledge creation theory to analyze the open source software community, a successful application of typical agile software methods, and propose ten principles of knowledge creation in the open source software community: Self-organizing, Code sharing, Adaptation, Usability, Sustention, Talent, Interaction, Collaboration, Happiness, and Democracy.

54 citations


Journal ArticleDOI
TL;DR: This paper proposes a neural network architecture consisting of 6 sub-neural networks to solve the inverse kinematics problem for robotic manipulators with 2 or more degrees of freedom, reducing the complexity of the algorithm and the calculation faced when using the Inverse Geometric Model (IGM) implementation in robotics.
Abstract: One of the most important problems in robot kinematics and control is finding the solution of inverse kinematics. Inverse kinematics computation has been one of the main problems in robotics research. As the complexity of a robot increases, obtaining the inverse kinematics becomes difficult and computationally expensive. Traditional methods such as geometric, iterative and algebraic approaches are inadequate if the joint structure of the manipulator is complex. As alternative approaches, neural networks and optimal search methods have been widely used for inverse kinematics modeling and control in robotics. This paper proposes a neural network architecture that consists of 6 sub-neural networks to solve the inverse kinematics problem for robotic manipulators with 2 or more degrees of freedom. The neural networks utilized are multi-layered perceptrons (MLP) with a back-propagation training algorithm. This approach reduces the complexity of the algorithm and the calculation (matrix inversion) faced when using the Inverse Geometric Model (IGM) implementation in robotics. The obtained results are presented and analyzed in order to prove the efficiency of the proposed approach.

52 citations
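
To make the idea concrete, here is a minimal sketch (not the paper's six-sub-network architecture): a single scikit-learn MLP learns the inverse kinematics of a hypothetical 2-DOF planar arm from forward-kinematics samples. Link lengths, angle ranges and network sizes are assumptions.

```python
# Sketch: learning inverse kinematics of a 2-DOF planar arm with one MLP.
# Simplified illustration, not the paper's six-sub-network design.
import numpy as np
from sklearn.neural_network import MLPRegressor

L1, L2 = 1.0, 0.8                                  # assumed link lengths
rng = np.random.default_rng(0)

# Sample joint angles (one elbow branch only, so the inverse is unique)
# and compute end-effector positions via forward kinematics.
theta = rng.uniform(0, np.pi / 2, size=(5000, 2))
x = L1 * np.cos(theta[:, 0]) + L2 * np.cos(theta[:, 0] + theta[:, 1])
y = L1 * np.sin(theta[:, 0]) + L2 * np.sin(theta[:, 0] + theta[:, 1])
X = np.column_stack([x, y])

# Train the network to invert the mapping: (x, y) -> (theta1, theta2).
net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
net.fit(X, theta)

pred = net.predict([[1.2, 0.9]])[0]                # angles for a reachable point
print("predicted joint angles (rad):", pred)
```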


Journal ArticleDOI
TL;DR: The method described in this paper extracts feature-based assembly information from CAD models of products and builds up liaisons to facilitate assembly planning applications.
Abstract: A mechanical assembly is a composition of interrelated parts. An assembly database stores the geometric models of individual parts, the spatial positions and orientations of the parts in the assembly, and the relationships between parts. An assembly of parts can be represented by its liaisons, which describe the relationships between the various parts in the assembly. The problem is not only to make this information available but also to use the relevant information for making decisions, especially the determination of the assembly sequence plan. The method described in this paper extracts feature-based assembly information from CAD models of products and builds up liaisons to facilitate assembly planning applications. The system works on the assumption that the designer explicitly defines joints and mating conditions. Further, a computer representation of mechanical assemblies in the form of liaisons is necessary in order to automate the generation of assembly plans. A novel method of extracting the assembly information and representing it in the form of liaisons is presented in this paper.

39 citations
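
As an illustration of the liaison representation, a liaison set can be stored as a labeled adjacency structure that an assembly planner can query; the part names and mating conditions below are hypothetical, not from the paper.

```python
# Sketch: a liaison graph as a labeled adjacency structure.
# Part names and mating conditions below are hypothetical.
liaisons = {}

def add_liaison(part_a, part_b, mating):
    """Record an undirected liaison (contact/joint) between two parts."""
    liaisons.setdefault(part_a, {})[part_b] = mating
    liaisons.setdefault(part_b, {})[part_a] = mating

add_liaison("shaft", "bearing", "cylindrical fit")
add_liaison("bearing", "housing", "press fit")
add_liaison("housing", "cover", "bolted planar contact")

# An assembly planner can now query which parts constrain a candidate part.
print(liaisons["bearing"])   # {'shaft': 'cylindrical fit', 'housing': 'press fit'}
```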


Journal ArticleDOI
TL;DR: This article describes the development of an application for generating tonal melodies based upon a set of construction principles including the notion of a hierarchical organization, and the idea that melodies consist of a skeleton that may be elaborated in various ways.
Abstract: This article describes the development of an application for generating tonal melodies. The goal of the project is to ascertain our current understanding of tonal music by means of algorithmic music generation. The method followed consists of four stages: 1) selection of music-theoretical insights, 2) translation of these insights into a set of principles, 3) conversion of the principles into a computational model having the form of an algorithm for music generation, 4) testing the “music” generated by the algorithm to evaluate the adequacy of the model. As an example, the method is implemented in Melody Generator, an algorithm for generating tonal melodies. The program has a structure suited for generating, displaying, playing and storing melodies, functions which are all accessible via a dedicated interface. The actual generation of melodies is based in part on constraints imposed by the tonal context, i.e. by meter and key, the settings of which are controlled by means of parameters on the interface. It is also based upon a set of construction principles, including the notion of a hierarchical organization and the idea that melodies consist of a skeleton that may be elaborated in various ways. After these aspects were implemented as specific sub-algorithms, the device produces simple but well-structured tonal melodies.

38 citations


Journal ArticleDOI
TL;DR: This paper presents a new formalism for modeling multi-agent systems that is able to describe not only the internal state of each modeled agent but also its behavior, and proposes mathematical definitions attached to firing transitions.
Abstract: In this paper, we present a new formalism for modeling Multi-Agent Systems (MAS). Our model, based on Petri nets (PN), is able to describe not only the internal state of each modeled agent but also its behavior. Owing to these features, one can naturally model the dynamic behavior of complex systems and the communication between these entities. For this, we propose mathematical definitions attached to firing transitions. To validate our contribution, we deal with real examples.

32 citations


Journal ArticleDOI
TL;DR: It is found that software requirement development (SRD) is a knowledge creation process, and that the knowledge creation theory of Nonaka is appropriate for analyzing knowledge creation in SRD.
Abstract: After a field survey and literature review, we found that software requirement development (SRD) is a knowledge creation process, and that the knowledge creation theory of Nonaka is appropriate for analyzing knowledge creation in SRD. The characteristics of knowledge in the requirement elicitation process are analyzed, and the dissymmetric knowledge of SRD is discussed. Requirement experts are introduced into the SRD process as a third knowledge entity. In addition, a knowledge creation model of SRD is put forward, and the knowledge flow and the relationships between the entities of this model are illustrated. Case study findings are as follows: 1) The necessary diversity of the project team can facilitate the implementation of SRD. 2) The introduction of requirement experts can achieve the transformation of knowledge effectively, thus helping to carry out SRD. 3) Methodology and related technologies are important for carrying out SRD.

31 citations


Journal ArticleDOI
TL;DR: A robust spatial clustering algorithm named NSCABDT (Novel Spatial Clustering Algorithm Based on Delaunay Triangulation) is proposed, in which the Delaunay diagram is used to determine neighborhoods, with spatial association rules and collocations defined on the basis of this neighborhood notion.
Abstract: Exploratory data analysis is increasingly necessary as larger spatial data sets are managed in electromagnetic media. Spatial clustering is one of the most important spatial data mining techniques: the discovery of interesting relationships and characteristics that may exist implicitly in spatial databases. So far, many spatial clustering algorithms have been proposed for applications such as pattern recognition, data analysis, image processing and so forth. However, most of the well-known clustering algorithms have drawbacks, presented later, when applied to large spatial databases. To overcome these limitations, in this paper we propose a robust spatial clustering algorithm named NSCABDT (Novel Spatial Clustering Algorithm Based on Delaunay Triangulation). The Delaunay diagram is used to determine neighborhoods, with spatial association rules and collocations defined on the basis of this neighborhood notion. NSCABDT demonstrates several important advantages over previous works. Firstly, it discovers even arbitrary shapes of cluster distributions. Secondly, NSCABDT does not require any a priori knowledge of the distribution. Third, like DBSCAN, NSCABDT does not require much CPU processing time, as experiments show. Finally, it handles outliers efficiently.

29 citations
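
A minimal sketch of the underlying idea, under the assumption (common to many Delaunay-based clustering methods, though NSCABDT's exact criterion may differ) that edges much longer than the mean edge length are removed and connected components become clusters:

```python
# Sketch: Delaunay-based spatial clustering. Assumed simplification of the
# NSCABDT idea: build the Delaunay graph, drop edges much longer than the
# mean edge length, and take connected components as clusters.
import numpy as np
from scipy.spatial import Delaunay

def delaunay_clusters(points, factor=1.5):
    tri = Delaunay(points)
    edges = set()
    for s in tri.simplices:                        # collect triangle edges
        for i in range(3):
            a, b = sorted((s[i], s[(i + 1) % 3]))
            edges.add((a, b))
    lengths = {e: np.linalg.norm(points[e[0]] - points[e[1]]) for e in edges}
    cutoff = factor * np.mean(list(lengths.values()))

    parent = list(range(len(points)))              # union-find over short edges
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for (a, b), d in lengths.items():
        if d <= cutoff:
            parent[find(a)] = find(b)
    return np.array([find(i) for i in range(len(points))])

rng = np.random.default_rng(0)
pts = np.vstack([rng.random((50, 2)), rng.random((50, 2)) + 3.0])
print(delaunay_clusters(pts))                      # two cluster labels expected
```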


Journal ArticleDOI
TL;DR: The fuzzy analytic hierarchy process (FAHP) and other extensions of AHP have been configured to solve problems of alternative evaluation in the Analytic Hierarchy Process.
Abstract: This paper is concerned with proposing a fuzzy logic based expert system to break through the problem of alternative evaluation in the Analytic Hierarchy Process (AHP). AHP, as a multi-criteria decision aid, has helped decision makers analyze and prioritize alternatives in a hierarchical structure. Over time, AHP has encountered some problems. Hence, the fuzzy analytic hierarchy process (FAHP) and other extensions of AHP have been configured to solve those problems.

27 citations
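
For illustration, a minimal fuzzy-AHP weighting step using triangular fuzzy judgements, fuzzy geometric means and centroid defuzzification; this is a textbook simplification with invented judgements, not the paper's expert system:

```python
# Sketch: a minimal fuzzy-AHP step (assumed simplification). Pairwise
# judgements are triangular fuzzy numbers (l, m, u); weights come from
# fuzzy geometric means followed by centroid defuzzification.
import numpy as np

# 3 alternatives; M[i][j] is the fuzzy judgement "i vs j" as (l, m, u).
M = [
    [(1, 1, 1), (2, 3, 4), (4, 5, 6)],
    [(1/4, 1/3, 1/2), (1, 1, 1), (1, 2, 3)],
    [(1/6, 1/5, 1/4), (1/3, 1/2, 1), (1, 1, 1)],
]

n = len(M)
# Fuzzy geometric mean of each row, component-wise over (l, m, u).
geo = np.array([[np.prod([M[i][j][k] for j in range(n)]) ** (1 / n)
                 for k in range(3)] for i in range(n)])
crisp = geo.mean(axis=1)          # centroid defuzzification of (l, m, u)
weights = crisp / crisp.sum()     # normalized priority weights
print(weights)                    # alternative priorities summing to 1
```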


Journal ArticleDOI
TL;DR: An extension of the classical Resource Constrained Project Scheduling Problem (RCPSP) is presented in which staff members can have several skills with different proficiency, i.e., a staff member is able to perform more than one kind of activity, and the time needed to complete an assigned task depends on the staff member's individual skill level.
Abstract: In this paper, we present an extension of the classical Resource Constrained Project Scheduling Problem (RCPSP). We present a new type of resource constraint in which staff members are involved. We present a new model in which staff members can have several skills with different proficiency, i.e., a staff member is able to perform more than one kind of activity, and the time needed to complete an assigned task depends on the staff member's individual skill level. We call this model the Weighted-Multi-Skill Project Scheduling Problem (WMSPSP). In our model, an activity has specific skill requirements that must be satisfied. To solve this problem, we propose a lower bound that uses a linear programming scheme for the RCPSP.

26 citations
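
As a much cruder stand-in for the paper's LP-based bound, the sketch below computes two capacity relaxations of the makespan for a hypothetical staff-efficiency matrix; the actual LP bound would add precedence and assignment constraints.

```python
# Sketch: crude capacity-based lower bounds on the makespan, a stand-in
# for the paper's LP bound. eff[s][k] = staff member s's efficiency at
# skill k (0 = cannot perform); demand[k] = workload for skill k at full
# efficiency. All numbers are hypothetical.
import numpy as np

eff = np.array([[1.0, 0.5, 0.0],
                [0.0, 1.0, 0.8],
                [0.7, 0.0, 1.0]])
demand = np.array([10.0, 6.0, 8.0])

# Relaxation 1: all capable staff work on skill k simultaneously.
per_skill = demand / eff.sum(axis=0)
# Relaxation 2: every staff member always works at their best skill rate.
overall = demand.sum() / eff.max(axis=1).sum()

print("lower bound on makespan:", max(per_skill.max(), overall))
```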


Journal ArticleDOI
TL;DR: A set of features is proposed that influences the duration patterns of sequences of sound units; these features are derived from the results of duration analysis and can be further used to predict syllable durations more accurately by exploring various nonlinear models.
Abstract: Acoustic analysis and synthesis experiments have shown that duration and intonation patterns are the two most important prosodic features responsible for the quality of synthesized speech. In this paper a set of features is proposed which influence the duration patterns of sequences of sound units. These features are derived from the results of the duration analysis. Duration analysis provides a rough estimate of the features which affect the duration patterns of sequences of sound units, but the prediction of durations from these features using either linear models or a fixed rulebase is not accurate. From the analysis it is observed that there exists a gross trend in the durations of syllables with respect to syllable position in the phrase, syllable position in the word, word position in the phrase, syllable identity and the context of the syllable (the preceding and following syllables). These features can be further used to predict the durations of the syllables more accurately by exploring various nonlinear models. For analyzing the durations of sound units, broadcast news data in Telugu is used as the speech corpus. The prediction accuracy of the duration models developed using rulebases and neural networks is evaluated using objective measures such as the percentage of syllables predicted within a specified deviation, the average prediction error (µ), the standard deviation (σ) and the correlation coefficient (γ).
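
The evaluation measures named above are straightforward to compute; a sketch with hypothetical predicted and actual syllable durations:

```python
# Sketch: the objective measures named above, computed with numpy for a
# pair of hypothetical predicted/actual syllable-duration arrays (ms).
import numpy as np

actual = np.array([112.0, 95.0, 140.0, 180.0, 88.0, 130.0])
pred = np.array([105.0, 99.0, 150.0, 170.0, 92.0, 125.0])

err = pred - actual
within = np.mean(np.abs(err) <= 0.10 * actual) * 100   # % within 10% deviation
mu = np.mean(np.abs(err))                              # average prediction error
sigma = np.std(err)                                    # standard deviation
gamma = np.corrcoef(pred, actual)[0, 1]                # correlation coefficient

print(f"within 10%: {within:.1f}%  mu={mu:.1f}  sigma={sigma:.1f}  gamma={gamma:.3f}")
```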

Journal ArticleDOI
TL;DR: A new evolutionary method called the combinatorial optimisation with coincidence algorithm (COIN) is applied to Type I problems of MMUALBP in a just-in-time production system; experimental results showed that COIN outperformed NSGA II.
Abstract: Mixed-model U-shaped assembly line balancing problems (MMUALBP) are known to be NP-hard, making it nearly impossible to obtain an optimal solution for practical problems with deterministic algorithms. This paper presents a new evolutionary method called combinatorial optimisation with coincidence algorithm (COIN) being applied to Type I problems of MMUALBP in a just-in-time production system. Three objectives are considered simultaneously: a minimum number of workstations, minimum work relatedness, and minimum workload smoothness. Variants of COIN are also proposed, i.e. CNSGA II and COIN-MA. COIN and its variants are tested against a well-known algorithm, namely the non-dominated sorting genetic algorithm II (NSGA II), and MNSGA II (a memetic version of NSGA II). Experimental results showed that COIN outperformed NSGA II. In addition, although COIN-MA uses marginally more CPU time than CNSGA II, it dominates on the other performance measures.

Journal ArticleDOI
TL;DR: A mathematical model is proposed for the CFP and solved using the Ant Colony Optimization, Genetic Algorithm and Simulated Annealing meta-heuristic methods; the compared results show that the GA method is more effective in solving the model.
Abstract: A Cellular Manufacturing System (CMS) is an application of Group Technology (GT) that allows decomposing a manufacturing system into subsystems. Grouping the machines and parts in a cellular manufacturing system based on similarities is known as the cell formation problem (CFP), which is an NP-hard problem. In this paper, a mathematical model is proposed for the CFP and solved using the Ant Colony Optimization (ACO), Genetic Algorithm (GA) and Simulated Annealing (SA) meta-heuristic methods, and the results are compared. The computational results show that the GA method is more effective in solving the model.
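
A bare-bones GA for cell formation, shown only to fix ideas: machines are assigned to cells by an integer chromosome, each part follows the cell containing most of its operations, and the cost counts exceptional elements and voids. The encoding, operators and fitness are illustrative assumptions, not the paper's model.

```python
# Sketch: a minimal GA for machine-part cell formation (illustrative
# encoding and fitness only; the paper's model and operators may differ).
import numpy as np

rng = np.random.default_rng(1)
A = rng.integers(0, 2, size=(8, 10))   # machine x part incidence (hypothetical)
n_machines, n_parts, n_cells = A.shape[0], A.shape[1], 2

def fitness(assign):
    # Each part joins the cell holding most of its machines; cost counts
    # exceptional elements (ops outside the cell) and voids (idle slots).
    cost = 0
    for p in range(n_parts):
        ops = A[:, p]
        best = max(range(n_cells), key=lambda c: ops[assign == c].sum())
        inside = assign == best
        cost += ops[~inside].sum()                    # exceptional elements
        cost += (inside.sum() - ops[inside].sum())    # voids
    return cost

pop = rng.integers(0, n_cells, size=(30, n_machines))
for gen in range(100):
    scores = np.array([fitness(ind) for ind in pop])
    pop = pop[np.argsort(scores)]                     # elitist sort
    children = []
    for _ in range(len(pop) // 2):
        a, b = pop[rng.integers(0, 10)], pop[rng.integers(0, 10)]  # top parents
        mask = rng.integers(0, 2, n_machines).astype(bool)         # crossover
        child = np.where(mask, a, b)
        if rng.random() < 0.2:                        # mutation
            child[rng.integers(0, n_machines)] = rng.integers(0, n_cells)
        children.append(child)
    pop[-len(children):] = children

print("best assignment:", pop[0], "cost:", fitness(pop[0]))
```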

Journal ArticleDOI
TL;DR: A novel modified particle swarm optimization algorithm (MPSO) for both offline and online parametric identification of dynamic models that not only has better convergence properties than BPSO and GA but also effectively avoids the premature convergence problem.
Abstract: This paper presents a novel modified particle swarm optimization algorithm (MPSO) for both offline and online parametric identification of dynamic models. The MPSO is applied to identifying a suspension system represented by a quarter-car model. A novel mutation mechanism is employed in MPSO to enhance the global search ability and increase the convergence speed of the basic PSO (BPSO) algorithm. MPSO optimization is used to find the optimum values of parameters by minimizing the sum of squared errors. The performance of the MPSO is compared with other optimization methods, including BPSO and the Genetic Algorithm (GA), in offline parameter identification. The simulation results show that this algorithm not only has better convergence properties than BPSO and GA, but also avoids the premature convergence problem effectively. The MPSO algorithm is further improved to detect and determine the variation of parameters. This novel algorithm is successfully applied to online parameter identification of the suspension system.
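
The skeleton of PSO with an added mutation step is easy to show; the sketch below identifies the two parameters of a toy damped oscillator by minimizing the sum of squared errors, rather than the paper's quarter-car suspension model.

```python
# Sketch: PSO with a simple mutation step, fitting parameters of a toy
# damped oscillator by minimizing the sum of squared errors (the paper
# identifies a quarter-car suspension model; this is only the skeleton).
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 5, 200)
true = np.exp(-0.8 * t) * np.cos(3.0 * t)        # "measured" response

def sse(p):                                       # p = (damping, frequency)
    model = np.exp(-p[0] * t) * np.cos(p[1] * t)
    return np.sum((model - true) ** 2)

n, dim = 30, 2
pos = rng.uniform(0.1, 5.0, (n, dim))
vel = np.zeros((n, dim))
pbest, pbest_f = pos.copy(), np.array([sse(p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()

for it in range(200):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    # Mutation: randomly perturb a few particles to escape local optima.
    for i in rng.integers(0, n, size=3):
        pos[i] += rng.normal(0, 0.5, dim)
    f = np.array([sse(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

print("identified parameters:", gbest)            # approximately (0.8, 3.0)
```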

Journal ArticleDOI
TL;DR: This paper provides effort estimates during the pre-coding and post-coding phases using a neural network to predict more accurately, and gives a measure of the effort to be spent on the testing phase.
Abstract: In the software industry, the major problem encountered during project scheduling is deciding what proportion of the resources should be allocated to the testing phase. In general it has been observed that about 40%-50% of the resources need to be allocated to the testing phase. However, it is very difficult to predict the exact amount of effort required for the testing phase, and as a result project planning goes haywire. A project which has not been tested sufficiently can cause huge losses to the organization. This research paper focuses on finding a method which gives a measure of the effort to be spent on the testing phase. It provides effort estimates during the pre-coding and post-coding phases using a neural network to predict more accurately.

Journal ArticleDOI
TL;DR: The application of the multi-layer perceptron artificial neural network, ordinary kriging (OK), and inverse distance weighting (IDW) models in the estimation of local scour depth around bridge piers was outlined.
Abstract: This paper outlines the application of the multi-layer perceptron artificial neural network (ANN), ordinary kriging (OK), and inverse distance weighting (IDW) models in the estimation of local scour depth around bridge piers. As part of this study, bridge piers were installed with bed sills at the bed of an experimental flume. Experimental tests were conducted under different flow conditions and varying distances between bridge pier and bed sill. The ANN, OK and IDW models were applied to the experimental data, and it was shown that the artificial neural network model predicts local scour depth more accurately than the kriging and inverse distance weighting models. It was found that the ANN with two hidden layers was the optimum model overall, although the results from the sixth test case showed that the ANN with one hidden layer and 17 hidden nodes was the best model for that case, whereas the results from the fifth test case found that the ANN with three hidden layers was the best.
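
Of the three estimators, IDW is the simplest to show; a sketch with hypothetical scour-depth samples (the ANN and kriging models are omitted):

```python
# Sketch: inverse distance weighting, one of the three estimators compared
# above. Sample features and depths are hypothetical.
import numpy as np

def idw(x_known, y_known, x_query, power=2.0):
    d = np.linalg.norm(x_known - x_query, axis=1)
    if np.any(d == 0):                      # exact hit: return that sample
        return y_known[d.argmin()]
    w = 1.0 / d ** power
    return np.sum(w * y_known) / np.sum(w)

# Scour-depth samples (hypothetical flow/spacing features -> depth in m).
X = np.array([[0.3, 1.0], [0.5, 2.0], [0.7, 1.5], [0.4, 2.5]])
y = np.array([0.08, 0.14, 0.20, 0.11])
print(idw(X, y, np.array([0.45, 1.8])))
```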

Journal ArticleDOI
TL;DR: A computational study is provided to compare the performance of the genetic algorithm (GA) for the job shop scheduling problem under six different representations.
Abstract: Due to the NP-hardness of the job shop scheduling problem (JSP), many heuristic approaches have been proposed; among them is the genetic algorithm (GA). In the literature, there are eight different GA representations for the JSP; each one aims to provide an environment through which the GA’s reproduction and mutation operators can succeed in finding near-optimal solutions in small computational time. This paper provides a computational study to compare the performance of the GA under six of these representations.
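
One of the commonly compared encodings, the operation-based representation, is easy to illustrate: each job ID appears once per operation, and the k-th occurrence of job j schedules that job's k-th operation. A decoding sketch on a toy instance (the instance data are invented):

```python
# Sketch: decoding the operation-based GA representation for the JSP.
# proc[j][k] = (machine, duration) of job j's k-th operation (toy instance).
proc = [[(0, 3), (1, 2)], [(1, 4), (0, 1)], [(0, 2), (1, 3)]]

def decode(chromosome):
    job_ready = [0] * len(proc)      # finish time of each job's last op
    mach_ready = {}                  # finish time of each machine
    next_op = [0] * len(proc)        # next operation index per job
    for j in chromosome:
        m, d = proc[j][next_op[j]]
        start = max(job_ready[j], mach_ready.get(m, 0))
        job_ready[j] = mach_ready[m] = start + d
        next_op[j] += 1
    return max(job_ready)            # makespan of the decoded schedule

print(decode([0, 1, 2, 0, 2, 1]))    # any sequence with each job twice
```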

Journal ArticleDOI
TL;DR: Under weak conditions, the method possesses global convergence for nonconvex functions and R-linear convergence for convex functions; the given search direction has a sufficient descent property and belongs to a trust region without carrying out any line search rule.
Abstract: It is well known that line search methods play a very important role in optimization. In this paper a new line search method is proposed for solving unconstrained optimization problems. Under weak conditions, this method possesses global convergence for nonconvex functions and R-linear convergence for convex functions. Moreover, the given search direction has a sufficient descent property and belongs to a trust region without carrying out any line search rule. Numerical results show that the new method is effective.
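
For contrast with the proposed rule-free method, the classical Armijo backtracking line search that such methods aim to dispense with looks like this (a generic textbook sketch, not the paper's algorithm):

```python
# Sketch: classical Armijo backtracking, shown for contrast. The paper's
# method produces a sufficiently descending, trust-region-bounded
# direction without such a rule.
import numpy as np

def armijo(f, grad, x, d, c=1e-4, tau=0.5, alpha=1.0):
    """Shrink alpha until f(x + alpha*d) <= f(x) + c*alpha*grad(x).d"""
    fx, slope = f(x), grad(x) @ d        # slope < 0 for a descent direction
    while f(x + alpha * d) > fx + c * alpha * slope:
        alpha *= tau
    return alpha

f = lambda x: (x ** 2).sum()             # simple convex test function
grad = lambda x: 2 * x
x = np.array([3.0, -2.0])
d = -grad(x)                             # steepest descent direction
print("accepted step size:", armijo(f, grad, x, d))
```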

Journal ArticleDOI
TL;DR: This paper proposes a method for evaluating KAOS models through the extension of Wigmore’s model with features of Bayesian networks.
Abstract: Wigmore’s charts and Bayesian networks are used to represent the construction of arguments graphically and to evaluate them. KAOS is a goal-oriented requirements analysis method that enables analysts to capture requirements through the realization of business goals. However, KAOS has no inbuilt mechanism for evaluating these goals and the inference process. This paper proposes a method for evaluating KAOS models through the extension of Wigmore’s model with features of Bayesian networks.
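
A miniature example of the Bayesian flavor of the proposal (structure and probabilities are invented for illustration): a goal G depends on sub-goals A and B, B is observed, and belief in G is obtained by enumerating over A.

```python
# Sketch: scoring a goal with a two-parent Bayesian node (toy numbers).
P_A = 0.7                       # prior belief that sub-goal A is satisfied
P_G_given = {(True, True): 0.95, (True, False): 0.4,
             (False, True): 0.5, (False, False): 0.05}

def p_goal(b_observed=True):
    # Enumerate over the unobserved parent A.
    return sum(p_a * P_G_given[(a, b_observed)]
               for a, p_a in [(True, P_A), (False, 1 - P_A)])

print("P(goal satisfied | B true) =", p_goal(True))   # 0.815
```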

Journal ArticleDOI
TL;DR: An elaborated emulator of a 2-D massively parallel reconfigurable mesh computer of size n x n processing elements (PE) is presented, together with a hard kernel of a parallel virtual machine into which all the physical properties of its different components are translated.
Abstract: Emulating massively parallel computer architectures is a very important tool for parallel programmers, allowing them to implement and validate their algorithms. Due to the high cost of massively parallel real machines, they remain unavailable and unpopular in the parallel computing community. The goal of this paper is to present an elaborated emulator of a 2-D massively parallel reconfigurable mesh computer of size n x n processing elements (PE). Based on the object modeling method, we develop a hard kernel of a parallel virtual machine into which we translate all the physical properties of its different components. A parallel programming language and its compiler are also developed to edit, compile and run programs. The developed emulator is a multi-platform system; it can be installed on any sequential computer, whatever its operating system and processing unit (CPU) technology. The size n x n of this virtual reconfigurable mesh is not limited; it depends only on the performance of the sequential machine supporting the emulator.

Journal ArticleDOI
TL;DR: This paper proposes an artificial neural network algorithm to identify the three basic control chart patterns: natural, shift, and trend; the comparison shows that the proposed algorithm realizes better identification than others.
Abstract: The identification of control chart patterns is very important in statistical process control. Control chart patterns are categorized as natural and unnatural. The presence of unnatural patterns means that a process is out of statistical control and there are assignable causes of process variation that should be investigated. This paper proposes an artificial neural network algorithm to identify the three basic control chart patterns: natural, shift, and trend. This identification is in addition to the traditional statistical detection of runs in data, since runs are one of the out-of-control situations. It is assumed that a process starts as a natural pattern and then may undergo only one out-of-control pattern at a time. The performance of the proposed algorithm was evaluated by measuring the probability of success in identifying the three basic patterns accurately and comparing these results with previous research work. The comparison showed that the proposed algorithm realized better identification than others.
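
The three patterns are commonly synthesized for training; a sketch using an off-the-shelf MLP classifier (window length, shift magnitude, trend slope and network size are assumptions, not the paper's settings):

```python
# Sketch: synthesize natural/shift/trend windows and train a small MLP.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n, w = 500, 30                                     # samples per class, window

natural = rng.normal(0, 1, (n, w))
shift = rng.normal(0, 1, (n, w)) + np.where(np.arange(w) >= w // 2, 2.0, 0.0)
trend = rng.normal(0, 1, (n, w)) + 0.1 * np.arange(w)

X = np.vstack([natural, shift, trend])
y = np.repeat([0, 1, 2], n)                        # 0=natural, 1=shift, 2=trend

clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=1000, random_state=0)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```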

Journal ArticleDOI
TL;DR: This paper proposes an efficient, cost-effective approach for optimizing the cost of testing using Tabu Search (TS), which provides maximum code coverage, combined with Dijkstra’s algorithm implemented in the aspiration criteria of Tabu Search in order to optimize the cost and generate a minimum-cost path with maximum coverage.
Abstract: Testing is performed in order to deliver a complete, reliable software product. As the testing phase proceeds, the cost of the testing process increases, and it directly affects the overall project cost. It often happens that the actual cost becomes more than the estimated cost. In the software industry, cost is considered the most important parameter with respect to software testing. In recent years, researchers have done a variety of work in the area of cost optimization using concepts such as genetic algorithms, simulated annealing and automated test data generation. This paper proposes an efficient, cost-effective approach for optimizing the cost of testing using Tabu Search (TS), which provides maximum code coverage, combined with Dijkstra’s algorithm implemented in the aspiration criteria of Tabu Search in order to optimize the cost and generate a minimum-cost path with maximum coverage.
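
The Dijkstra component is standard; a sketch over a hypothetical control-flow graph with edge costs (the Tabu Search wrapper is omitted):

```python
# Sketch: Dijkstra's algorithm over a toy control-flow graph with costs.
import heapq

def dijkstra(graph, source):
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                        # stale heap entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

cfg = {"entry": [("a", 2), ("b", 5)], "a": [("exit", 4)], "b": [("exit", 1)]}
print(dijkstra(cfg, "entry"))     # cheapest cost from entry to each node
```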

Journal ArticleDOI
TL;DR: A different approach for the organization is presented, which focuses on positive outcomes known as benefits, and a comparison between the Benefits Management and PRINCE2 methodologies is illustrated.
Abstract: The benefits management approach complements most of the common project management methodologies, such as critical chain project management (CCPM) and PRINCE2. The majority of these methodologies focus on how to comply with three parameters: time, cost and quality, instead of identifying the positive outcomes and benefits for an organization. In this paper, a different approach for the organization is presented, which focuses on positive outcomes known as benefits. Moreover, a comparison between the Benefits Management and PRINCE2 methodologies is illustrated.

Journal ArticleDOI
TL;DR: The developed approach to localized linear models for the prediction of hourly PM10 concentration values returned a significant reduction of the prediction error under all examined metrics against conventional forecasting schemes such as the linear regression and the neural networks.
Abstract: The present paper discusses the application of localized linear models to the prediction of hourly PM10 concentration values. The advantages of the proposed approach lie in the clustering of the data based on a common property and the utilization of the target variable during this process, which enables the development of more coherent models. Two alternative localized linear modelling approaches are developed and compared against benchmark models: one in which data are clustered based on their spatial proximity in the embedding space, and one novel approach in which grouped data are described by the same linear model. Since the target variable is unknown during the prediction stage, a complementary pattern recognition approach is developed to account for this lack of information. The application of the developed approach to several PM10 data sets from the Greater Athens Area, Helsinki and London monitoring networks returned a significant reduction of the prediction error under all examined metrics against conventional forecasting schemes such as linear regression and neural networks.
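
The "cluster, then fit one linear model per cluster" idea can be sketched as follows; k-means stands in for the paper's clustering criteria, and the features are synthetic:

```python
# Sketch: localized linear models via clustering plus per-cluster fits.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, (300, 3))            # e.g. lagged PM10, wind, temperature
y = 2 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(0, 1, 300)

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
models = {c: LinearRegression().fit(X[km.labels_ == c], y[km.labels_ == c])
          for c in np.unique(km.labels_)}

# Prediction routes a new point to its cluster's local model.
x_new = np.array([[5.0, 2.0, 7.0]])
c = km.predict(x_new)[0]
print("local prediction:", models[c].predict(x_new)[0])
```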

Journal ArticleDOI
TL;DR: Experimental results show that the proposed algorithm has good performance in both invisibility and security, and also has good robustness against noise, cropping, filtering, JPEG compression and other attacks.
Abstract: A digital image watermarking algorithm based on the fast curvelet transform is proposed. Firstly, the carrier image is decomposed by the fast curvelet transform and the watermark image is scrambled by the Arnold transform. Secondly, the binary watermark image is embedded into the medium frequency coefficients according to human visual characteristics and the curvelet coefficients. Experimental results show that the proposed algorithm has good performance in both invisibility and security, and also has good robustness against noise, cropping, filtering, JPEG compression and other attacks.
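
The Arnold-transform scrambling step is simple to show (the curvelet decomposition and embedding are omitted); applying the inverse map the same number of times unscrambles the watermark after extraction:

```python
# Sketch: Arnold (cat map) scrambling of a square binary watermark.
import numpy as np

def arnold(img, iterations=1):
    n = img.shape[0]                       # requires a square image
    out = img
    for _ in range(iterations):
        nxt = np.empty_like(out)
        for x in range(n):
            for y in range(n):
                nxt[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = nxt
    return out

rng = np.random.default_rng(0)
wm = (rng.random((8, 8)) > 0.5).astype(np.uint8)   # toy binary watermark
print(arnold(wm, iterations=3))
```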

Journal ArticleDOI
TL;DR: Some properties of the fuzzy R-solution of controlled linear fuzzy differential inclusions are shown, and the optimal time problems for it are researched.
Abstract: In the present paper, we show some properties of the fuzzy R-solution of controlled linear fuzzy differential inclusions and research the optimal time problems for it.

Journal ArticleDOI
TL;DR: Ten practical key principles are proposed which aim to improve the quality of requirements specification, addressing what the author believes is the fundamental cause of failure: we think like programmers, not engineers and managers.
Abstract: We know many of our IT projects fail and disappoint. The poor state of requirements methods and practice is frequently stated as a factor for IT project failure. In this paper, I discuss what I believe is the fundamental cause: we think like programmers, not engineers and managers. We do not concentrate on value delivery, but instead on functions, on use-cases and on code delivery. Further, management is not taking its responsibility to make things better. In this paper, ten practical key principles are proposed, which aim to improve the quality of requirements specification.

Journal ArticleDOI
TL;DR: A reference model for requirements analysis and documentation is proposed and what kind of requirements management tools are needed to support an agile software process is suggested.
Abstract: Most requirements management processes and associated tools are designed for document-driven software development and are unlikely to be adopted for the needs of an agile software development team. We discuss how and what can make traditional requirements documentation a lightweight process, suitable for user requirements elicitation and analysis. We propose a reference model for requirements analysis and documentation and suggest what kind of requirements management tools are needed to support an agile software process. The approach and the reference model are demonstrated in Vixtory, a tool for lightweight requirements documentation in agile web application development.

Journal ArticleDOI
TL;DR: In this article, a method for reducing sudden noise using noise detection and classification methods, and noise power estimation, was described, which achieved good performance for recognition of utterances overlapped by sudden noises.
Abstract: This paper describes a method for reducing sudden noise using noise detection and classification methods, and noise power estimation. Sudden noise detection and classification have been dealt with in our previous study. In this paper, GMM-based noise reduction is performed using the detection and classification results. As a result of classification, we can determine the kind of noise we are dealing with, but the power is unknown. In this paper, this problem is solved by combining an estimation of noise power with the noise reduction method. In our experiments, the proposed method achieved good performance for recognition of utterances overlapped by sudden noises.
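
To illustrate why an estimate of the noise power matters, here is plain spectral subtraction on one frame; the paper's GMM-based reduction is more sophisticated, and all signals below are synthetic:

```python
# Sketch: spectral subtraction with a known per-bin noise power estimate.
# Illustrates the role of noise-power estimation only; the paper uses a
# GMM-based reduction, not this method.
import numpy as np

def spectral_subtract(frame, noise_power, floor=0.01):
    spec = np.fft.rfft(frame)
    power = np.abs(spec) ** 2
    clean_power = np.maximum(power - noise_power, floor * power)
    clean_spec = np.sqrt(clean_power) * np.exp(1j * np.angle(spec))
    return np.fft.irfft(clean_spec, n=len(frame))

rng = np.random.default_rng(0)
t = np.arange(256) / 8000.0
speech = np.sin(2 * np.pi * 440 * t)               # stand-in for a speech frame
noisy = speech + 0.3 * rng.normal(size=t.size)     # sudden noise burst
noise_power = 0.3 ** 2 * t.size                    # rough per-bin estimate
print(spectral_subtract(noisy, noise_power)[:5])
```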

Journal ArticleDOI
TL;DR: An intelligent multi-agent system to simulate supply chain management has been developed and it focuses on the way in which the agent purchases components using a mixed procurement strategy and how it sets its prices according to the prevailing market conditions and its own inventory level.
Abstract: Fuzzy logic is used to derive the optimal inventory policies of the supply chain (SC) members. We examine the performance of the optimal inventory policies by cutting costs and increasing supply chain management efficiency. The proposed inventory policy uses multiple agents and fuzzy logic, and provides managerial insights on the impact of decision making by all the SC members. In particular, we focus on the way in which our agent purchases components using a mixed procurement strategy (combining long- and short-term planning) and how it sets its prices according to the prevailing market conditions and its own inventory level, because this adaptivity and flexibility are key to its success. In the modern global market, one of the most important issues of supply chain (SC) management is satisfying changing customer demands, and enterprises should enhance their long-term advantage through optimal inventory control. In this paper, an intelligent multi-agent system to simulate supply chain management has been developed.
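
A single fuzzy inventory rule of the kind described can be sketched with triangular memberships and centroid defuzzification; the membership functions, universe and rule base below are invented for illustration.

```python
# Sketch: one fuzzy inventory rule (Mamdani-style, toy numbers).
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0)

inventory, demand = 20.0, 80.0              # current state (toy units)
low_inv = tri(inventory, -50, 0, 50)        # degree of "inventory is low"
high_dem = tri(demand, 50, 100, 150)        # degree of "demand is high"

# Rule: IF inventory is low AND demand is high THEN order a large quantity.
activation = min(low_inv, high_dem)
q = np.linspace(0, 100, 101)
large = np.minimum(tri(q, 50, 100, 150), activation)   # clipped output set
order_qty = (q * large).sum() / large.sum()            # centroid defuzzification
print("suggested order quantity:", order_qty)
```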