
Showing papers in "Information Technology and Control in 2015"


Journal ArticleDOI
TL;DR: This paper maps LO to KM, and presents, in a holistic manner, a conceptual model of LO and KM, which aims to be a basis for developing guidelines for how to introduce LO and KM.
Abstract: The need for organizations to become learning organizations is growing. Being a Learning Organization (LO) requires Knowledge Management (KM), which in turn depends on the LO. It is a chicken-and-egg situation: it is impossible to say which comes first, and each depends on the other for success. The literature emphasizes either LO or KM, despite the fact that they are interdependent. An organization that wants to become a Learning Organization must pay attention to both, and therefore there has to be a shift in emphasis to LO and KM together. This paper addresses this problem. It maps LO to KM and presents, in a holistic manner, a conceptual model of LO and KM. In future work this model is intended to be a basis for developing guidelines for how to introduce LO and KM.

64 citations


Journal ArticleDOI
TL;DR: The main objective of this study is to compare the originally proposed methods with the tangent intersection and the second derivative maximum methods with respect to error dispersion under different signal-to-noise ratios (SNR) and to the difference between the foot point of the APW without noise (the APW reference value) and the foot point of the APW with additive noise.
Abstract: The evaluation of the arterial wall condition is most frequently based on markers such as arterial pulse wave velocity (PWV) and pulse transit time (PTT). To calculate these markers, it is necessary to determine the location of the foot of the arterial pulse wave (APW). This foot point is usually determined with the help of the second derivative maximum or tangent intersection foot-to-foot methods. This paper proposes two original methods for locating the APW foot point, namely the bottom straight-line and forefront tangent intersection method and the APW foot polynomial approximation method. The main objective of this study is to compare the originally proposed methods with the tangent intersection and the second derivative maximum methods with respect to error dispersion under different signal-to-noise ratios (SNR) and to the difference between the foot point of the APW without noise (the APW reference value) and the foot point of the APW with a certain SNR. The analysis of the APW signal with additive noise reveals that the second derivative maximum method results in the widest error dispersion, whereas the tangent intersection method results in the greatest difference between the APW reference value and the foot point of the APW with additive noise. The smallest difference between the APW reference value and the foot point of the APW with additive noise, as well as the smallest error dispersion, is achieved with the APW foot polynomial approximation method.
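
As an illustration of the kind of foot-to-foot detection the study compares against, the sketch below implements the classical tangent intersection method in Python: the foot is taken as the intersection of the horizontal line through the waveform minimum and the tangent at the point of maximal upstroke slope. This is a generic reading of that baseline method, not the authors' two new algorithms.

```python
import numpy as np

def tangent_intersection_foot(t, p):
    """Locate the foot of an arterial pulse wave (APW) by the classical
    tangent intersection method: intersect the horizontal line through the
    waveform minimum with the tangent at the point of maximal upstroke slope."""
    p = np.asarray(p, dtype=float)
    i_min = np.argmin(p)                      # diastolic minimum
    dp = np.gradient(p, t)
    i_up = i_min + np.argmax(dp[i_min:])      # steepest point of the upstroke
    # Tangent at the steepest point: p = p[i_up] + dp[i_up] * (t - t[i_up]);
    # intersect it with the horizontal line p = p[i_min]:
    return t[i_up] + (p[i_min] - p[i_up]) / dp[i_up]

# Minimal usage on a synthetic, noise-free pulse-like wave
t = np.linspace(0, 1, 500)
p = np.maximum(0, np.sin(2 * np.pi * (t - 0.2))) ** 2
print(tangent_intersection_foot(t, p))
```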

49 citations


Journal ArticleDOI
TL;DR: This paper presents a framework for modeling software requirements consistently using multiple UML diagrams and discusses how such a framework could be implemented in one of the most popular UML tools, MagicDraw UML, by using its powerful features for customizing the modeling environment.
Abstract: UML is considered to be the de facto standard for software modeling. However, in software requirements analysis it is quite common to apply only use case and activity diagrams and to focus on the textual requirements specification with some non-standard graphical illustrations. In this paper we present a framework for modeling software requirements consistently using multiple UML diagrams. We illustrate the application of this framework with examples of different requirements artifacts based on the case study system MagicTest. We discuss how such a framework could be implemented in one of the most popular UML tools, MagicDraw UML, by using its powerful features for customizing the modeling environment, defining methodology wizards, specifying validation rules, analyzing model element relationships, and generating documentation based on user-defined templates. We recognize that our approach provides a foundation that could and should be refined and extended for special cases of requirements analysis. Our work should be considered as a starting point for practitioners trying to adopt UML for requirements analysis and for scientists working on creating more detailed requirements analysis methods based on UML.

30 citations


Journal ArticleDOI
TL;DR: The proposed method addresses the incompact usage of a locally applied language in document clustering through tag-based document representation, and improves clustering results by using a knowledge technology, ontology.
Abstract: Text documents are very significant in contemporary organizations, and their constant accumulation enlarges the scope of document storage. Standard text mining and information retrieval techniques for text documents usually rely on word matching. An alternative way of information retrieval is clustering. In this paper we suggest complementing the traditional clustering method with document representation based on tagging, and improving clustering results by using a knowledge technology, ontology. The proposed method addresses the incompact usage of a locally applied language in the process of document clustering.

29 citations


Journal ArticleDOI
TL;DR: The results show that the proposed variant of ITS outperforms both the straightforward TS algorithm and the other heuristic algorithms tested.
Abstract: In this paper, we propose an iterated tabu search (ITS) algorithm for the well-known combinatorial optimization problem, the traveling salesman problem (TSP). ITS is based on the so-called intensification (improvement) and diversification (reconstruction) (I&D) paradigm. The goal of the intensification is the search for a locally optimal solution in the neighbourhood of the current solution. The diversification is responsible for escaping from the current local optimum and moving towards new regions in the solution space. Using a limited standard tabu search (TS) in the role of an effective intensification (local improvement) procedure resulted in promising solutions during experimentation with a number of test instances from the TSP library TSPLIB. The results show that the proposed variant of ITS outperforms both the straightforward TS algorithm and the other heuristic algorithms tested.
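
A minimal sketch of the intensification/diversification loop described above, assuming a 2-opt neighbourhood for the tabu search and a random segment reversal as the reconstruction step; the operators and parameter values are illustrative, not the paper's exact configuration.

```python
import random

def tour_length(tour, dist):
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def tabu_2opt(tour, dist, iters=100, tenure=15):
    """Limited tabu search over 2-opt moves (the intensification phase)."""
    best, best_len = tour[:], tour_length(tour, dist)
    cur = tour[:]
    tabu, n = {}, len(tour)
    for it in range(iters):
        move, move_len = None, float("inf")
        for i in range(1, n - 1):
            for j in range(i + 1, n):
                cand = cur[:i] + cur[i:j][::-1] + cur[j:]
                cl = tour_length(cand, dist)
                # tabu moves are allowed only if they beat the best tour (aspiration)
                if (tabu.get((i, j), -1) < it or cl < best_len) and cl < move_len:
                    move, move_len = (i, j), cl
        if move is None:
            break
        i, j = move
        cur = cur[:i] + cur[i:j][::-1] + cur[j:]
        tabu[move] = it + tenure
        if move_len < best_len:
            best, best_len = cur[:], move_len
    return best, best_len

def iterated_tabu_search(dist, restarts=10):
    """I&D loop: tabu search intensification + random-reversal diversification."""
    n = len(dist)
    best, best_len = tabu_2opt(list(range(n)), dist)
    for _ in range(restarts):
        i, j = sorted(random.sample(range(n), 2))
        perturbed = best[:i] + best[i:j][::-1] + best[j:]   # diversification
        cand, cand_len = tabu_2opt(perturbed, dist)
        if cand_len < best_len:
            best, best_len = cand, cand_len
    return best, best_len

# Toy usage with a small random Euclidean instance
rng = random.Random(0)
pts = [(rng.random(), rng.random()) for _ in range(12)]
dist = [[((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5 for (bx, by) in pts] for (ax, ay) in pts]
print(iterated_tabu_search(dist)[1])
```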

25 citations


Journal ArticleDOI
TL;DR: This paper analyzes the properties of feature models and proposes three FD complexity measures, which describe structural complexity and cognitive complexity of the model expressed through the number of adequate sub-trees of the given model.
Abstract: Feature models represented by Feature Diagrams (FDs) prevail in the software product line approach. The product line approach and FDs are used to manage the variability and complexity of software families and to ensure higher quality and productivity of product development through higher-level feature modeling and reuse. In this paper we first analyze the properties of feature models. Then, combining some properties of FDs with ideas from Miller's, Metcalfe's and Keating's works, we propose three FD complexity measures. The first measure gives boundaries for estimating the cognitive complexity of a generic component to be derived from the feature model. The second measure describes the structural complexity of the model, expressed through the number of adequate sub-trees of the given model. The third measure estimates the total cognitive and structural complexity of the model. To validate the introduced measures, we present a case study with three feature models of varying complexity.

25 citations


Journal ArticleDOI
TL;DR: This paper presents an approach for automatically transforming knowledge represented by an ontology into a conceptual data model; a graph transformation language is presented and adapted for the formal transformation of the ontology into the conceptual model.
Abstract: The paper analyses a graph-oriented method for ontology transformation into a conceptual data model. A number of methods have been proposed to develop conceptual data models, but only a few deal with knowledge reuse. In this paper we present an approach for automatically transforming knowledge represented by an ontology into a conceptual data model. A graph transformation language is presented and adapted for the formal transformation of the ontology into the conceptual model. Details and examples of the proposed ontology transformation into a conceptual data model are also presented.

24 citations


Journal ArticleDOI
TL;DR: A new knowledge-based model for representing LO instances, based on factoring and aggregating knowledge units within a LO and presented as a structure of interface and functionality, contributes to better compositionality and reusability and can easily be generalized further to support personalized content delivery and automatic generation.
Abstract: Today there are many efforts to shift the reuse dimension from component-based to generative reuse in the learning object (LO) domain. This requires more precise LO models and commonality-variability analysis. We propose a new knowledge-based model for representing LO instances. The model is based on factoring and aggregating knowledge units within a LO and is presented as a structure of interface and functionality. The interface serves for explicitly describing knowledge communication to and from the LO. Functionality describes knowledge representation and management. The model contributes to better compositionality and reusability and can easily be generalized further to support personalized content delivery and automatic generation. Using the introduced model as a basis for generalization, we extended the known concept of generative LOs by linking domain commonality-variability analysis with meta-programming techniques for generating LO instances on demand from the generic LO specification.

24 citations


Journal ArticleDOI
TL;DR: The goal of this paper is to extend the capabilities of UML for the required types of integrity constraints by introducing stereotypes or reusing them from other methods, so as to make the constraints reusable in various activities.
Abstract: Integrity constraints are an inherent part of conceptual models, capturing part of the semantics of the problem domain. Analysis of the most important methods of conceptual modelling has revealed that none of them covers the complete set of integrity constraints needed for making a semantically meaningful model. In our previous work a taxonomy of integrity constraints relevant for the design of well-formed conceptual models was established. The goal of this paper is to extend the capabilities of UML for the required types of integrity constraints by introducing stereotypes or reusing them from other methods. In contrast with the current practice of deferring the description of constraints to detailed design, modelling constraints in the phase of conceptual analysis makes them reusable in various activities: not only in generating the DB schema, but also in early verification, validation, and transformation to other types of schemas and to program code.

23 citations


Journal ArticleDOI
TL;DR: Three versions of software codes for the simulation of granular material dynamics based on the discrete element method are presented, including a purely object-oriented version implemented in C++.
Abstract: The paper presents three versions of software codes for the simulation of granular material dynamics based on the discrete element method. The codes DEMMAT_F90 and DEMMAT_PAS implement a purely procedural approach in the programming languages FORTRAN 90 and OBJECT PASCAL, while the code DEMMAT_CPP represents a purely object-oriented programming approach based on C++.

21 citations


Journal ArticleDOI
TL;DR: This paper suggests a new procedure for large itemset generation which is more efficient than the corresponding procedure of the original Apriori algorithm, and a modified sort-merge-join algorithm which is more efficient than the nested-loop-join algorithm suggested in the original.
Abstract: One of the most important data mining problems is mining association rules. In this paper we consider discovering association rules from large transaction databases. The problem of discovering association rules can be decomposed into two sub-problems: find large itemsets and generate association rules from the large itemsets. The second sub-problem is the easier one, and the complexity of discovering association rules is determined by the complexity of discovering large itemsets. In this paper, we suggest an Apriori-based algorithm for discovering large itemsets. Specifically, we suggest a new procedure for large itemset generation which is more efficient than the corresponding procedure of the original Apriori algorithm. For its implementation, we suggest a modified sort-merge-join algorithm, which is more efficient than the nested-loop-join algorithm suggested in the original Apriori algorithm. In addition, we propose a way in which Apriori Multiple finishes in just two iterations.
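
The following sketch shows the standard Apriori join-and-prune step implemented in a sort-merge style over lexicographically sorted itemsets, which is the general idea the abstract refers to; the concrete Apriori Multiple procedure and its two-iteration variant are not reproduced here.

```python
def apriori_gen(frequent_k):
    """Generate (k+1)-candidates from frequent k-itemsets (tuples) using a
    sort-merge-style join: itemsets sharing the same (k-1)-prefix are merged."""
    frequent_k = sorted(frequent_k)           # lexicographic order enables merging
    fk = set(frequent_k)
    candidates = []
    i = 0
    while i < len(frequent_k):
        j = i + 1
        # frequent_k[i:j] is the run of itemsets sharing the same (k-1)-prefix
        while j < len(frequent_k) and frequent_k[j][:-1] == frequent_k[i][:-1]:
            j += 1
        group = frequent_k[i:j]
        for a in range(len(group)):
            for b in range(a + 1, len(group)):
                cand = group[a] + (group[b][-1],)
                # prune: every k-subset of the candidate must itself be frequent
                if all(cand[:m] + cand[m + 1:] in fk for m in range(len(cand))):
                    candidates.append(cand)
        i = j
    return candidates

# Example: frequent 2-itemsets as sorted tuples
print(apriori_gen([(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)]))
```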

Journal ArticleDOI
TL;DR: The paper introduces a decision making model based on the application of Artificial Neural Networks (ANNs) and the Particle Swarm Optimization (PSO) algorithm; the idea is to select the "global best" ANN for decision making and to adapt the weights of the other ANNs towards the weights of the best network.
Abstract: Recently, swarm intelligence has become a powerful tool for optimizing the operations of various businesses. Swarm intelligence is an artificial intelligence technique which studies the behavior of decentralized, self-organized systems. The goal of the authors of this paper is to elaborate swarm intelligence for the improvement of business rules management. The paper introduces a decision making model which is based on the application of Artificial Neural Networks (ANNs) and the Particle Swarm Optimization (PSO) algorithm. In the proposed decision making model, ANNs are applied in order to analyze the data and to calculate the decision. The training of the ANNs is based on the application of the PSO algorithm. The core idea of this algorithm's application is to select the "global best" ANN for decision making and to adapt the weights of the other ANNs towards the weights of the best network. The potential of applying the PSO algorithm for improving business rules management is shown in a case study.
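
A compact sketch of how PSO can train a population of small neural networks by pulling every particle's weight vector towards its personal best and the global best network, as the abstract describes; the network size, loss function and PSO coefficients are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_predict(w, X, hidden=5):
    """Tiny one-hidden-layer network whose weights are flattened into vector w."""
    n_in = X.shape[1]
    W1 = w[:n_in * hidden].reshape(n_in, hidden)
    b1 = w[n_in * hidden:n_in * hidden + hidden]
    W2 = w[n_in * hidden + hidden:-1].reshape(hidden, 1)
    b2 = w[-1]
    h = np.tanh(X @ W1 + b1)
    return 1 / (1 + np.exp(-(h @ W2 + b2)))

def pso_train(X, y, hidden=5, particles=20, iters=200, w_in=0.7, c1=1.5, c2=1.5):
    dim = X.shape[1] * hidden + hidden + hidden + 1
    pos = rng.normal(size=(particles, dim))       # each particle = one ANN's weights
    vel = np.zeros_like(pos)
    def loss(w):                                  # mean squared error of one ANN
        return np.mean((mlp_predict(w, X, hidden).ravel() - y) ** 2)
    pbest = pos.copy()
    pbest_val = np.array([loss(w) for w in pos])
    g = pbest[np.argmin(pbest_val)].copy()        # the "global best" network
    for _ in range(iters):
        r1, r2 = rng.random((particles, dim)), rng.random((particles, dim))
        vel = w_in * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
        pos += vel                                # weights drift towards the best network
        vals = np.array([loss(w) for w in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        g = pbest[np.argmin(pbest_val)].copy()
    return g

# Example: train on a tiny synthetic binary problem
X = rng.normal(size=(80, 3))
y = (X[:, 0] + X[:, 1] > 0).astype(float)
w_best = pso_train(X, y)
```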

Journal ArticleDOI
TL;DR: Instead of sequences, by taking real valued functions x, measurable (in the Lebesgue sense) on the interval (1, ∞), definitions of summability, strong summability, lacunary convergence, lacunary strong convergence, strong almost convergence, statistical convergence and lacunary statistical convergence of these functions are given.
Abstract: Strongly summable sequences and lacunary strongly summable sequences were studied by several authors, including [5]. Statistically convergent sequences and lacunary statistically convergent sequences were also studied by several authors, including [2], [3], [6], [7]. In this paper, instead of sequences, by taking real valued functions x, measurable (in the Lebesgue sense) on the interval (1, ∞), we give definitions of summability, strong summability, lacunary convergence, lacunary strong convergence, strong almost convergence, statistical convergence and lacunary statistical convergence of these functions. We also give some inclusion relations.
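
For orientation, the standard definition of statistical convergence of a measurable function is written below, under the assumption that the underlying interval is (1, ∞) as is usual in this literature; the paper's exact formulation may differ.

```latex
% Statistical convergence of a measurable function x on (1,\infty)
% (standard form; assumed to match the paper's setting):
\[
  \operatorname{st\text{-}lim}_{t\to\infty} x(t) = L
  \iff
  \lim_{b\to\infty} \frac{1}{b}\,
  \mu\bigl(\{\, t \in (1, b] : |x(t) - L| \ge \varepsilon \,\}\bigr) = 0
  \quad \text{for every } \varepsilon > 0,
\]
% where \mu denotes Lebesgue measure. The lacunary version replaces the
% intervals (1, b] by lacunary blocks (k_{r-1}, k_r] and the factor 1/b
% by 1/(k_r - k_{r-1}).
```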

Journal ArticleDOI
TL;DR: A methodology for checking conceptual models is proposed as a step-wise process during which model elements, including integrity constraints, are progressively checked for their adequacy to the values of objects, their relationships and the constraints of the corresponding problem domain.
Abstract: Due to the rising level of abstraction in information systems development, many activities of this process are migrating to its early phases. The same is true for testing: modern CASE tools are undertaking validation of software models. In this paper a methodology for checking conceptual models is proposed as a step-wise process during which model elements, including integrity constraints, are progressively checked for their adequacy to the values of objects, their relationships and the constraints of the corresponding problem domain. The checking process is associated with a particular methodology for the development of ordered and precise conceptual models (OPCM), which improves their quality: conformity to normal forms, to ontological foundations, and to the observed reality. The rules for checking integrity constraints are proposed on the basis of a taxonomy created as the result of an analysis of the most promising methods for conceptual modelling.

Journal ArticleDOI
TL;DR: The optimal priority list of jobs is found by applying the algorithms of local and global search, namely, Genetic Algorithm with constructed crossover, mutation, and selection operators, based on the job priority list.
Abstract: Applications of information technologies are often related to constructing schedules or timetables of tasks or jobs with constrained resources. In this paper, we consider algorithms of job scheduling related to resources, time, and other constraints. Schedule optimization procedures, based on coding a schedule by the priority list of jobs, are created and investigated. The optimal priority list of jobs is found by applying algorithms of local and global search, namely a Genetic Algorithm with crossover, mutation, and selection operators constructed on the basis of the job priority list. Computational results with test data from the project scheduling problem library are given.
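
A minimal serial schedule-generation scheme that decodes a job priority list into start times under precedence and a single renewable resource, illustrating the schedule coding the abstract mentions; the job data and resource model are toy assumptions, and the GA operators themselves are not shown.

```python
def decode_priority_list(priority, duration, preds, demand, capacity):
    """Serial schedule generation: schedule jobs in priority order at the earliest
    time satisfying precedence and resource-capacity constraints.
    Assumes the priority list is precedence-feasible and demand <= capacity."""
    horizon = sum(duration.values())
    usage = [0] * (horizon + 1)                 # resource usage per time unit
    start, finish = {}, {}
    for job in priority:
        est = max((finish[p] for p in preds[job]), default=0)   # precedence
        t = est
        while any(usage[u] + demand[job] > capacity
                  for u in range(t, t + duration[job])):
            t += 1                              # shift right until resources fit
        start[job], finish[job] = t, t + duration[job]
        for u in range(t, t + duration[job]):
            usage[u] += demand[job]
    return start, max(finish.values())          # schedule and its makespan

# Toy instance: durations, predecessors, resource demands, capacity 4
duration = {1: 3, 2: 2, 3: 2, 4: 1}
preds    = {1: [], 2: [1], 3: [1], 4: [2, 3]}
demand   = {1: 2, 2: 3, 3: 2, 4: 1}
print(decode_priority_list([1, 2, 3, 4], duration, preds, demand, 4))
```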

Journal ArticleDOI
TL;DR: A path planning algorithm for snake-like robots, modeled as discrete serial links with many degrees of freedom, is presented; the robots are able to follow smoothly curved paths consisting of many points by determining configurations that reach the goal while avoiding obstacles in the workspace.
Abstract: Snake-like robots have the ability to automatically perform various tasks that require man-equivalent capabilities by reaching areas that are difficult or impossible for human beings to reach. However, problems in the path planning and design of such robots prevent them from being fully functional. In this paper, a path planning algorithm for snake-like robots is presented. Snake-like robots are modeled with discrete serial links employing many degrees of freedom. They are able to follow smoothly curved paths consisting of many points by determining their configurations to reach the goal while avoiding obstacles in the workspace. Simulations have been accomplished to show the effectiveness of the algorithm.

Journal ArticleDOI
TL;DR: A hybrid genetic algorithm that uses a new kind of solution recombination operator, a so-called multiple parent crossover, is examined on the grey pattern problem, which is a special case of the well-known quadratic assignment problem.
Abstract: Recently, genetic algorithms (GAs) have become quite popular for solving combinatorial optimization problems. In this paper, we discuss a hybrid genetic algorithm that uses a new kind of solution recombination operator, a so-called multiple parent crossover. We examined this innovative crossover operator on the grey pattern problem, which is a special case of the well-known quadratic assignment problem. The results obtained during experimentation with a set of 62 instances of the grey pattern problem demonstrate the promising efficiency of the multiple parent crossover. All the instances tested were solved to pseudo-optimality within surprisingly small computation times.
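
One simple way to realize a multiple parent crossover for permutation-encoded solutions is a position-wise consensus of several parents, sketched below; this is only an illustration of the idea, not the authors' operator for the grey pattern problem.

```python
import random
from collections import Counter

def multi_parent_crossover(parents):
    """Build one child from several permutation parents: each position first
    receives the value most parents agree on (if still unused); the remaining
    positions are filled with the unused values in random order."""
    n = len(parents[0])
    child, used = [None] * n, set()
    for pos in range(n):
        for value, _ in Counter(p[pos] for p in parents).most_common():
            if value not in used:
                child[pos] = value
                used.add(value)
                break
    missing = [v for v in parents[0] if v not in used]
    random.shuffle(missing)
    it = iter(missing)
    return [v if v is not None else next(it) for v in child]

print(multi_parent_crossover([[0, 1, 2, 3, 4],
                              [1, 0, 2, 4, 3],
                              [0, 2, 1, 3, 4]]))
```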

Journal ArticleDOI
TL;DR: This paper proposes a method which advances currently used methods for generating relational database schemas by generating full-fledged relational database schemas from a conceptual model; the proposed method consists of metamodel-based and pattern-based transformations.
Abstract: In this paper, we briefly describe currently used methods for generating relational database schemas, their limitations and drawbacks, and propose a method which advances them by generating full-fledged relational database schemas from a conceptual model. The proposed method consists of metamodel-based and pattern-based transformations. Principles of creating pattern-based transformations are defined for the transformation of OCL expressions to corresponding SQL code.

Journal ArticleDOI
TL;DR: It is shown that C-3PEKE is not secure and suffers from off-line password guessing attacks.
Abstract: Recently, Chang proposed a practical three-party key exchange (C-3PEKE) protocol with round efficiency. Unfortunately, this paper shall show that C-3PEKE is not secure and suffers from off-line password guessing attacks.

Journal ArticleDOI
TL;DR: The purpose of this work is to review outlier detection methods and to test the possibility of using them to solve the task; the best results were achieved using the multilayer perceptron and the principal component analysis based technique.
Abstract: Doubtful real estate transactions, with prices far away from market prices, appear because of non-commercial transactions or efforts to hide taxes. To estimate the right values of parameters, such data must be removed from the data set, or robust methods of parameter estimation must be used, while developing a mass appraisal model. Such transactions are outlying observations, which can be detected and removed by outlier detection methods. The purpose of this work is to review outlier detection methods and to test the possibility of using them to solve the task. An overview of real estate market value, valuation methods and the process of mass appraisal is given as an introduction to real estate mass valuation. The overview of outlier detection methods covers scaling and the following methods: resampling by half means, the smallest half volume, the closest distance to the center, ellipsoidal multivariate trimming, minimum volume ellipsoid, minimum scatter determinant, analysis of the projection matrix, principal components and residuals, as well as influence measures, robust regression, and classification methods. The reviewed methods were categorized; commonly used methods were selected and tested experimentally in order to compare their effectiveness. The best results were achieved using the multilayer perceptron and the principal component analysis based technique.
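
A sketch of one of the reviewed families of techniques, principal-component-based outlier detection: rows whose reconstruction error from the leading components is unusually large are flagged. The variable scaling, number of components and threshold are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def pca_outliers(X, n_components=2, quantile=0.95):
    """Flag rows whose reconstruction error from the leading principal
    components exceeds the given quantile of all reconstruction errors."""
    X = np.asarray(X, dtype=float)
    mu, sigma = X.mean(axis=0), X.std(axis=0)
    Z = (X - mu) / np.where(sigma == 0, 1, sigma)        # scale the variables
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)     # principal axes
    V = Vt[:n_components].T
    residual = Z - Z @ V @ V.T                           # part not explained by the PCs
    score = np.linalg.norm(residual, axis=1)
    return score > np.quantile(score, quantile)          # boolean outlier mask

# Toy example: clean rows lie near a 2-D plane in 3-D, doubtful rows do not
rng = np.random.default_rng(1)
latent = rng.normal(size=(100, 2))
A = np.array([[1.0, 0.5], [0.2, 1.0], [0.7, -0.3]])
clean = latent @ A.T + 0.05 * rng.normal(size=(100, 3))
doubtful = 3 * rng.normal(size=(5, 3))
mask = pca_outliers(np.vstack([clean, doubtful]))
print(mask.nonzero()[0])                                 # indices of flagged rows
```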

Journal ArticleDOI
TL;DR: The paper presents the experimental results for the benchmark suite ISCAS'85; the value of this approach is highlighted by the fact that the selected input stimuli detect the same stuck-at faults as the initially generated test set.
Abstract: The software prototype model can be used for the generation of the verification test. The input stimuli which form essential activity vectors are selected from randomly generated ones on the basis of the software prototype. The essential activity vectors correspond to the terms of the output logic functions whose existence is tested during verification. The verification test is formed on the basis of the essential activity vectors according to the defined rules. The quality of the verification test is measured by the following parameters: the length of the test, the fault coverage of the stuck-at faults, the fault coverage of the pin pair faults, and the number of the essential activity vectors. The paper presents the experimental results for the benchmark suite ISCAS'85. The value of this approach is highlighted by the fact that the selected input stimuli detect the same stuck-at faults as the initially generated test set.

Journal ArticleDOI
TL;DR: The principles and major steps of Enterprise Meta-Model (EMM) based development of Use Case model (UCM) in CASE system environment are presented and illustrated.
Abstract: The principles and major steps of Enterprise Meta-Model (EMM) based development of a Use Case model (UCM) in a CASE system environment are presented in this paper. The Enterprise Meta-Model represents the key concepts of domain knowledge. The enterprise processes, management functions, and their interactions are considered as the critical components of the domain knowledge accumulated as the Enterprise model in the knowledge base of the CASE system. The formal background for the generation of the UCM is the set of mapping rules from EMM constructs to constructs of the UCM meta-model. The key rules and steps of Meta-Model based development of the UCM for a user-specified business function are presented and illustrated.

Journal ArticleDOI
TL;DR: It was concluded that the day-of-the-week effect influenced stocks with medium turnover, and that the trading volumes of RSU, LDJ, KJK and PZV shares differed significantly for some days of the week.
Abstract: In this article we examine the impact of daily trade turnover on the day-of-the-week effect in emerging stock markets. The empirical analysis was performed using Vilnius Stock OMX equity return data. The main method suggested for the analysis was based on the formation of three portfolios of equities having low, medium and high daily turnover. By applying traditional statistical research methods, such as the t-test, one-way ANOVA, and the Levene and Brown-Forsythe tests of homogeneity of variances, a statistically significant difference between Monday and the other weekdays was observed only for some equities with medium trading volume. Analyzing the influence of higher moments on the mean return distribution (Kolmogorov-Smirnov test), we concluded that the day-of-the-week effect influenced stocks with medium turnover. We also applied the Kolmogorov-Smirnov test for different days of the week to investigate the effect on the daily turnover of shares. By applying the test to all possible pairs of days, it was found that the volumes of RSU, LDJ, KJK and PZV shares differed significantly for some days of the week.
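
The workflow described above can be sketched with pandas and scipy: returns are grouped by weekday and the tests named in the abstract are applied (scipy's Levene test with its default median centering corresponds to the Brown-Forsythe variant). The data here are simulated; the study's OMX Vilnius portfolios are not reproduced.

```python
import numpy as np
import pandas as pd
from scipy import stats

def day_of_week_tests(returns):
    """returns: pd.Series of daily returns indexed by trading date.
    Groups the returns by weekday and applies the tests used in the study."""
    groups = [g.values for _, g in returns.groupby(returns.index.dayofweek) if len(g) > 1]
    _, p_anova = stats.f_oneway(*groups)            # equality of mean returns across days
    _, p_levene = stats.levene(*groups)             # homogeneity of variances (Brown-Forsythe)
    monday = returns[returns.index.dayofweek == 0]
    rest = returns[returns.index.dayofweek != 0]
    _, p_t = stats.ttest_ind(monday, rest, equal_var=False)   # Monday vs other days
    return {"anova_p": p_anova, "levene_p": p_levene, "monday_t_p": p_t}

# Toy usage with simulated returns on business days
idx = pd.bdate_range("2014-01-01", periods=250)
r = pd.Series(np.random.default_rng(0).normal(0, 0.01, len(idx)), index=idx)
print(day_of_week_tests(r))
```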

Journal ArticleDOI
TL;DR: Hsu and Chuang demonstrated that Mangipudi-Katti’s scheme is vulnerable to an identity disclosure attack, and further proposed a novel user identification scheme with key distribution preserving user anonymity for distributed computer networks.
Abstract: In 2004, Yang et al. proposed an efficient user identification scheme with key distribution. The scheme provides user anonymity, so it is possible for the user to anonymously log in to the remote server. Unfortunately, Mangipudi and Katti found that Yang et al.'s scheme suffers from a Denial-of-Service (DoS) attack and then proposed an improvement of Yang et al.'s scheme. However, Hsu and Chuang demonstrated that Mangipudi-Katti's scheme is vulnerable to an identity disclosure attack, and further proposed a novel user identification scheme with key distribution preserving user anonymity for distributed computer networks. They claimed that their scheme achieves the following advantages: (1) user anonymity, (2) key distribution, (3) mutual authentication, and (4) key confirmation. In this study, the author shows that Hsu-Chuang's scheme is vulnerable to three impersonation attacks. Then, an improvement of Hsu-Chuang's scheme is proposed.

Journal ArticleDOI
TL;DR: This paper describes and statistically motivates features and rules for the detection of phoneme groups using phonetically labeled data and proposes algorithms for recognition of stop and fricative consonants.
Abstract: Better discrimination of phonemic units still remains one of the most important problems in automatic speech recognition. Direct phoneme recognition in speaker-independent automatic speech recognition systems is unable to provide good enough recognition results. The assumption is made that better results could be achieved through the recognition of phoneme groups using group characteristic features: voiced/unvoiced, vowel/consonant, etc. This paper describes and statistically motivates features and rules for the detection of phoneme groups using phonetically labeled data. Algorithms for the recognition of stop and fricative consonants are presented. Experimental research confirmed the advantages of the hierarchical classification of phonemes. The combination of knowledge and rules for the detection of acoustic events with classical statistical classification methods produced an overall 3% improvement in phoneme recognition accuracy and a 52-55% reduction of the time taken by classification.

Journal ArticleDOI
TL;DR: A novel dual–PTZ-camera system, called Eagle-Eye, that can keep monitoring the whole area while tracking and focusing on the details of an object and achieves good tracking performance under various conditions is proposed.
Abstract: An active camera, i.e., pan-tilt-zoom (PTZ) camera, can be used either to monitor a wide area or capture a high resolution image of a specific object by adjusting the zoom value. In order to achieve the above two goals simultaneously just like an eagle’s eye, a novel dual–PTZ-camera system, called Eagle-Eye, is proposed in this paper. The system can keep monitoring the whole area while tracking and focusing on the details of an object. Two techniques, moving object detection and fuzzy matching, are used alternatively for target tracking. According to the experimental results obtained with the implemented prototype, the success rates of tracking tasks for various moving speeds during daytime and nighttime are about 90 percent. The success rate with occlusion condition is also more than 80 percent. Furthermore, the average success rates with four special moving paths are 83.8 percent. These results show that Eagle-Eye system is feasible and achieves good tracking performance under various conditions.

Journal ArticleDOI
TL;DR: This paper considers the quality of the tests generated for two types of delay faults, namely functional delay and transition faults, and compares the quality of functional delay tests with regard to transition faults and vice versa.
Abstract: Rapid advances of semiconductor technology lead to higher circuit integration as well as higher operating frequencies. The statistical variations of the parameters during the manufacturing process as well as physical defects in integrated circuits can sometimes degrade circuit performance without altering its logic functionality. These faults are called delay faults. In this paper we consider the quality of the tests generated for two types of delay faults, namely, functional delay and transition faults. We compared the quality of functional delay tests with regard to transition faults and vice versa. We have performed various comprehensive experiments with combinational benchmark circuits. The experiments exhibit that the test sets, which are generated according to the functional delay fault model, achieve high fault coverage of transition faults. However, the functional delay fault coverage of the test sets targeted at transition faults is low. It is very likely that the test vectors based on the functional delay fault model can cover other kinds of faults as well. Another advantage of a test set generated at the functional level is that it is independent of, and effective for, any implementation and, therefore, can be generated at early stages of the design process.

Journal ArticleDOI
TL;DR: Five variants (modifications) of TS are implemented for random QAP instances from the QAP instance library QAPLIB, showing the outstanding efficiency of the proposed modifications; a new best known solution has been achieved for the instance tai100a.
Abstract: Tabu search (TS) is a modern, highly effective meta-heuristic for solving various optimization problems. In this paper, we discuss some enhancements of TS for one of the difficult combinatorial optimization problems, the quadratic assignment problem (QAP). We implemented five variants (modifications) of TS for the random QAP instances from the QAP instance library QAPLIB. These random QAPs pose a real challenge for researchers. A number of experiments were carried out on these instances. The results obtained from the experiments demonstrate the outstanding efficiency of the proposed modifications, which appear to be superior to the earlier TS algorithms for the QAP. In addition, a new best known solution has been achieved for the instance tai100a.
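
A bare-bones swap-neighbourhood tabu search for the QAP with a tabu tenure and an aspiration criterion, to make the baseline concrete; production TS codes for the QAP use constant-time delta evaluation of swaps, which this sketch omits, and the paper's five modifications are not reproduced.

```python
import random

def qap_cost(perm, F, D):
    """Quadratic assignment objective: sum of flow[i][j] * dist[loc(i)][loc(j)]."""
    n = len(perm)
    return sum(F[i][j] * D[perm[i]][perm[j]] for i in range(n) for j in range(n))

def tabu_search_qap(F, D, iters=1000, tenure=8):
    """Plain swap-neighbourhood tabu search; full cost recomputation per move."""
    n = len(F)
    cur = list(range(n))
    random.shuffle(cur)
    best, best_cost = cur[:], qap_cost(cur, F, D)
    tabu = {}
    for it in range(iters):
        move, move_cost = None, float("inf")
        for r in range(n - 1):
            for s in range(r + 1, n):
                cand = cur[:]
                cand[r], cand[s] = cand[s], cand[r]
                c = qap_cost(cand, F, D)
                # a tabu swap is allowed only if it improves the best solution
                if (tabu.get((r, s), -1) < it or c < best_cost) and c < move_cost:
                    move, move_cost = (r, s), c
        if move is None:
            break
        r, s = move
        cur[r], cur[s] = cur[s], cur[r]
        tabu[(r, s)] = it + tenure            # forbid re-swapping r and s for a while
        if move_cost < best_cost:
            best, best_cost = cur[:], move_cost
    return best, best_cost

# Toy usage on a tiny symmetric instance
F = [[0, 3, 1], [3, 0, 2], [1, 2, 0]]
D = [[0, 1, 4], [1, 0, 2], [4, 2, 0]]
print(tabu_search_qap(F, D, iters=100, tenure=2))
```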

Journal ArticleDOI
TL;DR: In this paper, a system for discriminating fricative consonants and sonants was developed using phonetic-acoustic features, in which a phoneme group (voiced/unvoiced, vowel/consonant, etc.) is recognized first and then the phoneme itself.
Abstract: Direct recognition of phonemes in speaker-independent recognition systems still cannot guarantee good enough recognition results. Here we investigate the assumption that recognition could be improved via group features of the phonemic system. We propose to first recognize the group of a phoneme (voiced/unvoiced, vowel/consonant, etc.) and then to recognize the phoneme itself. In this experiment a system for discriminating fricative consonants and sonants was developed using phonetic-acoustic features.

Journal ArticleDOI
TL;DR: This paper discusses an extension of a hybrid genetic algorithm for the well-known combinatorial optimization problem, the quadratic assignment problem, based on a promising genetic-tabu search policy.
Abstract: Genetic algorithms (GAs) are modern population based heuristic approaches. Recently, GAs have become very popular for solving various optimization problems. In this paper, we discuss an extension of a hybrid genetic algorithm for the well-known combinatorial optimization problem, the quadratic assignment problem. This extension is based on a promising genetic-tabu search policy. An enhanced tabu search is used in the role of the local improvement of solutions, whereas a robust mutation (reconstruction) strategy is "responsible" for maintaining a high degree of diversity within the population and for avoiding premature convergence of the GA. We tested our algorithm on a set of QAP instances. The results obtained show the outstanding performance of the proposed algorithm.