
Showing papers by "Mojtaba Ahmadieh Khanesar published in 2009"


Book ChapterDOI
01 Jan 2009
TL;DR: A modified discrete particle swarm optimization (PSO)-based technique is successfully used to generate an optimal preventive maintenance schedule of generating units for economical and reliable operation of a power system while satisfying system load demand and crew constraints.
Abstract: Particle swarm optimization (PSO) was originally designed and introduced by Eberhart and Kennedy (Eberhart & Kennedy, 1995; Kennedy & Eberhart, 1995; Eberhart & Kennedy, 2001). PSO is a population-based search algorithm based on the simulation of the social behavior of birds, bees, or a school of fish. The algorithm was originally intended to graphically simulate the graceful and unpredictable choreography of a bird flock. Each individual within the swarm is represented by a vector in a multidimensional search space. Each position vector has an associated vector, called the velocity vector, which determines the next movement of the particle. The PSO algorithm also specifies how the velocity of a particle is updated: each particle updates its velocity based on its current velocity, the best position it has explored so far, and the global best position explored by the swarm (Engelbrecht, 2005; Sadri & Ching, 2006; Engelbrecht, 2002). The PSO process is then iterated for a fixed number of iterations or until a minimum error based on a desired performance index is achieved. It has been shown that this simple model can deal with difficult optimization problems efficiently. PSO was originally developed for real-valued spaces; many problems are, however, defined for discrete-valued spaces where the domain of the variables is finite. Classical examples of such problems are integer programming, scheduling, and routing (Engelbrecht, 2005). In 1997, Kennedy and Eberhart introduced a discrete binary version of PSO for discrete optimization problems (Kennedy & Eberhart, 1997). In binary PSO, each particle represents its position in binary values, 0 or 1. Each bit of a particle can then be changed (or, better said, mutated) from one to zero or vice versa. In binary PSO, the velocity of a particle is defined as the probability that the particle changes its state to one. This algorithm is discussed in more detail in the following sections.
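The binary PSO scheme described above, where a real-valued velocity is squashed through a sigmoid to give the probability of each bit taking the value 1, can be sketched as follows. This is a minimal illustration under common parameter choices (inertia weight, acceleration coefficients, velocity clamping), not the exact formulation of any paper cited here; the function name and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def binary_pso(fitness, n_bits, n_particles=20, n_iter=100,
               w=0.7, c1=1.5, c2=1.5, vmax=4.0):
    """Minimize `fitness` over binary vectors of length n_bits."""
    # Positions are binary vectors; velocities are real-valued.
    X = rng.integers(0, 2, size=(n_particles, n_bits))
    V = rng.uniform(-1, 1, size=(n_particles, n_bits))
    pbest = X.copy()                                  # personal best positions
    pbest_f = np.array([fitness(x) for x in X])
    g = pbest[pbest_f.argmin()].copy()                # global best position
    g_f = pbest_f.min()
    for _ in range(n_iter):
        r1, r2 = rng.random(X.shape), rng.random(X.shape)
        # Velocity update: inertia + cognitive (pbest) + social (gbest) terms.
        V = w * V + c1 * r1 * (pbest - X) + c2 * r2 * (g - X)
        V = np.clip(V, -vmax, vmax)                   # clamp to keep sigmoid responsive
        prob = 1.0 / (1.0 + np.exp(-V))               # P(bit becomes 1) via sigmoid
        X = (rng.random(X.shape) < prob).astype(int)  # stochastic bit flip
        f = np.array([fitness(x) for x in X])
        improved = f < pbest_f
        pbest[improved] = X[improved]
        pbest_f[improved] = f[improved]
        if pbest_f.min() < g_f:
            g_f = pbest_f.min()
            g = pbest[pbest_f.argmin()].copy()
    return g, g_f

# Toy discrete problem: maximize the number of ones (minimize the negative count).
best, best_f = binary_pso(lambda x: -x.sum(), n_bits=12)
```

Note that, unlike real-valued PSO, the position here is never updated by adding the velocity; the velocity only sets the flip probability of each bit, which is what makes the algorithm suitable for discrete problems such as scheduling or subset selection.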
Upon introduction of this new algorithm, it was used in a number of engineering applications. Using binary PSO, Wang and Xiang (Wang & Xiang, 2007) proposed a high-quality splitting criterion for codebooks of tree-structured vector quantizers (TSVQ), and reduced the computation time as well. Binary PSO has been used to train the structure of a Bayesian network (Chen et al., 2007). A modified discrete PSO-based technique has been successfully used to generate an optimal preventive maintenance schedule of generating units for economical and reliable operation of a power system while satisfying system load demand and crew constraints (Yare & Venayagamoorthy, 2007). Binary PSO has also been applied to choosing an optimum input subset for SVM (Zhang & Huo, 2005).

27 citations


Book ChapterDOI
01 Jan 2009
TL;DR: A novel incremental training algorithm for the class of neuro-fuzzy systems that are structured based on local linear classifiers; linear discriminant analysis is utilized to transform the data into a space in which the linear discriminancy of the training samples is maximized.
Abstract: Optimizing the antecedent part of a neuro-fuzzy system has been investigated in a number of studies. Current approaches typically suffer from high computational complexity or a lack of ability to extract knowledge from a given set of training data. In this paper, we introduce a novel incremental training algorithm for the class of neuro-fuzzy systems that are structured based on local linear classifiers. Linear discriminant analysis is utilized to transform the data into a space in which the linear discriminancy of the training samples is maximized. The neuro-fuzzy classifier is then built in the transformed space, starting from the simplest form. In addition, rule consequent parameters are optimized using a local least squares approach.
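The pipeline outlined in the abstract, a discriminant projection followed by membership-weighted (local) least squares for the rule consequents, might be illustrated roughly as below. This is a simplified two-class sketch under assumed Gaussian memberships with one rule per cluster; the function names, rule placement by quantiles, and all parameter choices are assumptions for illustration, not the paper's actual incremental method.

```python
import numpy as np

def fisher_lda(X, y):
    # Two-class Fisher discriminant: direction maximizing between-class
    # separation relative to within-class scatter.
    m0, m1 = X[y == 0].mean(0), X[y == 1].mean(0)
    Sw = np.cov(X[y == 0].T) + np.cov(X[y == 1].T)        # within-class scatter
    w = np.linalg.solve(Sw + 1e-6 * np.eye(X.shape[1]), m1 - m0)
    return w / np.linalg.norm(w)

def fit_rules(z, y, centers, sigma):
    # One local linear consequent (slope, intercept) per rule, fitted by
    # membership-weighted least squares in the projected 1-D space z.
    A = np.column_stack([z, np.ones_like(z)])
    rules = []
    for c in centers:
        mu = np.exp(-0.5 * ((z - c) / sigma) ** 2)        # Gaussian membership
        theta = np.linalg.solve(A.T @ (mu[:, None] * A), A.T @ (mu * y))
        rules.append((c, theta))
    return rules

def predict(z, rules, sigma):
    # Membership-normalized blend of the local linear models.
    A = np.column_stack([z, np.ones_like(z)])
    mus = np.stack([np.exp(-0.5 * ((z - c) / sigma) ** 2) for c, _ in rules])
    outs = np.stack([A @ th for _, th in rules])
    return (mus * outs).sum(0) / mus.sum(0)

# Toy two-class data: well-separated Gaussian clusters.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(4, 0.5, (50, 2))])
y = np.r_[np.zeros(50), np.ones(50)]

w = fisher_lda(X, y)
z = X @ w                                                 # project to 1-D LDA space
rules = fit_rules(z, y, centers=np.quantile(z, [0.25, 0.75]), sigma=z.std())
acc = ((predict(z, rules, z.std()) > 0.5) == y).mean()
```

Fitting each consequent against only the samples its rule covers (via the membership weights) keeps each least-squares problem small and local, which is the motivation for local rather than global least squares in such systems.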