Author

Karim Faez

Bio: Karim Faez is an academic researcher at Amirkabir University of Technology. He has contributed to research topics including feature extraction and facial recognition systems. He has an h-index of 36 and has co-authored 387 publications receiving 5,135 citations. His previous affiliations include the University of Tehran and Uttar Pradesh Technical University.


Papers
Journal ArticleDOI
TL;DR: A holistic system for the recognition of handwritten Farsi/Arabic words using right–left discrete hidden Markov models (HMM) and Kohonen self-organizing vector quantization is presented.
Abstract: A holistic system for the recognition of handwritten Farsi/Arabic words using right–left discrete hidden Markov models (HMM) and Kohonen self-organizing vector quantization is presented. The histogram of chain-code directions of the image strips, scanned from right to left by a sliding window, is used as the feature vector. The neighborhood information preserved in the self-organizing feature map (SOFM) is used for smoothing the observation probability distributions of the trained HMMs. Experiments carried out on test samples show promising results.
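The abstract does not give the exact strip width, contour extraction, or normalization; the following is a minimal sketch, assuming the word contour is already traced as 8-connected (x, y) points, of the per-strip Freeman chain-code histograms used as feature vectors here.

```python
import numpy as np

# Freeman chain codes: delta (dx, dy) -> direction index, 0 = east,
# numbered counter-clockwise (y grows downward in image coordinates).
DIRS = {(1, 0): 0, (1, -1): 1, (0, -1): 2, (-1, -1): 3,
        (-1, 0): 4, (-1, 1): 5, (0, 1): 6, (1, 1): 7}

def chain_code_histograms(contour, width, n_strips):
    """Per-strip histograms of chain-code directions along a word contour.
    contour  : list of (x, y) points, consecutive points 8-connected
    width    : image width in pixels
    n_strips : number of sliding-window strips, ordered right to left
    Returns an (n_strips, 8) array of direction counts (the feature vectors).
    """
    feats = np.zeros((n_strips, 8))
    strip_w = width / n_strips
    for (x0, y0), (x1, y1) in zip(contour, contour[1:]):
        code = DIRS[(x1 - x0, y1 - y0)]
        # strips are indexed right to left, matching the scan direction
        strip = min(int((width - 1 - x0) // strip_w), n_strips - 1)
        feats[strip, code] += 1
    return feats

# Usage: a 3x3 square contour traced in image coordinates.
square = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (1, 2), (0, 2), (0, 1), (0, 0)]
print(chain_code_histograms(square, width=3, n_strips=1))
```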

153 citations

Journal ArticleDOI
TL;DR: A new adaptive data-hiding method based on least-significant-bit (LSB) substitution and pixel-value differencing (PVD) for grey-scale images can embed a large amount of secret data while maintaining a high visual quality of the stego-images.
Abstract: This study presents a new adaptive data-hiding method based on least-significant-bit (LSB) substitution and pixel-value differencing (PVD) for grey-scale images. The proposed method partitions the cover image into non-overlapping blocks of three consecutive pixels and selects the second pixel of each block as the central pixel (called the base pixel). Then, k bits of secret data are embedded in the base pixel using LSB substitution and the optimal pixel adjustment process (OPAP). The differences between the base-pixel value and the other pixel values in the block are utilised to determine how many secret bits can be embedded in the two remaining pixels. The method also divides all possible differences into a lower level and a higher level with a number of ranges, and it obtains the number of secret bits to be embedded in each block depending on the range to which the difference values belong. The experimental results show that the proposed method can embed a large amount of secret data while maintaining the high visual quality of the stego-images. The peak signal-to-noise ratio (PSNR) values and the embedding capacity of the method are higher than those of the three other data-hiding methods investigated in this study.
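The paper's PVD range table and block layout are not reproduced in the abstract; as a rough illustration of the k-bit LSB substitution plus OPAP step it describes, here is a minimal single-pixel sketch.

```python
def embed_lsb_opap(pixel, bits, k):
    """Embed k secret bits into one 8-bit pixel via LSB substitution,
    then apply the optimal pixel adjustment process (OPAP) to cut distortion.
    pixel : original grey value, 0..255
    bits  : the k secret bits as an integer in [0, 2**k)
    k     : number of substituted LSBs
    """
    stego = (pixel & ~((1 << k) - 1)) | bits        # replace the k LSBs
    d = stego - pixel
    # OPAP: shifting by 2**k leaves the embedded LSBs intact
    # but may roughly halve the embedding error.
    if d > (1 << (k - 1)) and stego - (1 << k) >= 0:
        stego -= 1 << k
    elif d < -(1 << (k - 1)) and stego + (1 << k) <= 255:
        stego += 1 << k
    return stego

# Usage: embed 3 bits (0b101) into pixel 200; extraction is just stego & 0b111.
s = embed_lsb_opap(200, 0b101, 3)
assert s & 0b111 == 0b101
print(s, abs(s - 200))   # 197, error 3 instead of 5
```

OPAP works because adding or subtracting 2^k never changes the k embedded LSBs, so the adjustment can only reduce the embedding error.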

133 citations

Journal ArticleDOI
TL;DR: In this paper, the authors propose a feature selection method based on ant colony optimization (ACO), which is inspired by ants' social behavior in their search for the shortest paths to food sources.
Abstract: Feature selection (FS) is one of the most important steps affecting the performance of a pattern recognition system. This paper proposes a novel feature selection method based on ant colony optimization (ACO). The ACO algorithm is inspired by ants' social behavior in their search for the shortest paths to food sources. Most existing techniques for ACO-based feature selection use a priori information about the features. In the proposed algorithm, however, classifier performance and the length of the selected feature vector are adopted as the heuristic information for ACO, so the optimal feature subset can be selected in terms of the shortest feature length and the best classifier performance. Experimental results on a face recognition system using the ORL database show that the proposed approach is easily implemented and, without any a priori information about the features, its overall performance is better than that of GA-based and other ACO-based feature selection methods.
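The abstract omits the exact construction and update rules, so the following is only a toy sketch in the same spirit: ants sample subsets under pheromone guidance, and fitness mixes classifier accuracy with a subset-length penalty. The selection rule, constants, and the 1-NN evaluator below are all illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def aco_feature_select(X, y, evaluate, n_ants=10, n_iters=20, rho=0.2):
    """Toy ACO feature selection: no a priori feature scores are used;
    evaluate(subset) -> classifier accuracy on the chosen features."""
    n = X.shape[1]
    tau = np.ones(n)                                   # pheromone per feature
    best_subset, best_fit = None, -np.inf
    for _ in range(n_iters):
        for _ in range(n_ants):
            keep = rng.random(n) < tau / (tau + 1.0)   # keep prob rises with tau
            subset = np.where(keep)[0]
            if subset.size == 0:
                continue
            fit = evaluate(subset) - 0.1 * subset.size / n   # accuracy vs. length
            if fit > best_fit:
                best_fit, best_subset = fit, subset
        tau *= 1.0 - rho                               # pheromone evaporation
        if best_subset is not None:
            tau[best_subset] += max(best_fit, 0.0)     # reinforce best subset
    return best_subset

# Usage: leave-one-out 1-NN accuracy on toy data with two informative features.
X = rng.random((60, 8))
y = (X[:, 0] + X[:, 1] > 1).astype(int)

def evaluate(subset):
    Xs = X[:, subset]
    d = np.linalg.norm(Xs[:, None, :] - Xs[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                        # exclude each point itself
    return np.mean(y[d.argmin(axis=1)] == y)

print(aco_feature_select(X, y, evaluate))
```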

127 citations

Proceedings Article
01 Jan 2003
TL;DR: Though the Fast Circle Detection (FCD) method loses the generality of the CHT, it is shown that there are many applications that can use this method after simple preprocessing and gain a considerable improvement in performance over the CHT or its modified versions.
Abstract: The Circle Hough Transform (CHT) has become a common method for circle detection in numerous image processing applications. Because of its drawbacks, various modifications to the basic CHT method have been suggested. This paper presents an algorithm to find circles that are entirely brighter or darker than their backgrounds. The method is size-invariant, and such circular shapes can be found quickly and accurately. Though the Fast Circle Detection (FCD) method loses the generality of the CHT, we show that there are many applications that can use this method after simple preprocessing and gain a considerable improvement in performance over the CHT or its modified versions. The method has been evaluated on well-known industrial and medical applications, and the results show a significant improvement in finding circular objects.
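The FCD algorithm itself is not detailed in the abstract, but the CHT baseline it is compared against is easy to state; a minimal single-radius accumulator, assuming edge points are already extracted, follows.

```python
import numpy as np

def circle_hough(edge_points, radius, shape):
    """Minimal single-radius Circle Hough Transform: each edge point votes
    for all centres at distance `radius`; peaks in the accumulator are
    candidate circle centres. This is the baseline the paper speeds up."""
    acc = np.zeros(shape)
    thetas = np.linspace(0, 2 * np.pi, 64, endpoint=False)
    for (y, x) in edge_points:
        cy = np.round(y - radius * np.sin(thetas)).astype(int)
        cx = np.round(x - radius * np.cos(thetas)).astype(int)
        ok = (cy >= 0) & (cy < shape[0]) & (cx >= 0) & (cx < shape[1])
        acc[cy[ok], cx[ok]] += 1                  # vote for candidate centres
    return acc

# Usage: edge points of a radius-10 circle centred at (30, 40).
t = np.linspace(0, 2 * np.pi, 100)
pts = np.column_stack([30 + 10 * np.sin(t), 40 + 10 * np.cos(t)]).round().astype(int)
acc = circle_hough(pts, radius=10, shape=(64, 64))
print(np.unravel_index(acc.argmax(), acc.shape))  # peak near (30, 40)
```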

110 citations

Journal ArticleDOI
TL;DR: Experimental results demonstrate that the proposed algorithm is more accurate than previous methods in the gait recognition application and performs better in both recognition rate and computational speed.
Abstract: Analysis of behavioral features such as gait, the manner of walking, allows a human identification system to extract gait information from video sequences of different people at a distance and recognize a specific person. This paper proposes a fast gait recognition algorithm based on the averaged silhouette of a person. Three novelties in the proposed algorithm give it better performance than previous methods in both recognition rate and computational speed: (1) a frame sampling method, which roughly halves the algorithm's computation time without any change in recognition rate; (2) a new method for background estimation, which gives an acceptable estimate of the background; and (3) a local thresholding method, which gives more complete and more accurate binary silhouette images. Experimental results demonstrate that the proposed algorithm's accuracy is superior to that of previous methods in the gait recognition application.
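A minimal sketch of the averaged-silhouette template and nearest-template matching, with plain frame subsampling standing in for the paper's frame sampling method; the background estimation, local thresholding, centring, and scaling steps are assumed already done.

```python
import numpy as np

def averaged_silhouette(frames, step=2):
    """Average binary silhouettes of a gait sequence into a single template.
    `step` subsamples frames, standing in for the paper's frame sampling
    (roughly half the computation, little change to the averaged template).
    frames : (T, H, W) array of 0/1 silhouettes, already centred and scaled
    """
    return frames[::step].mean(axis=0)

def recognize(probe, gallery):
    """Nearest-template matching: smallest L1 distance between averaged
    silhouettes wins. `gallery` maps person id -> averaged silhouette."""
    return min(gallery, key=lambda pid: np.abs(probe - gallery[pid]).sum())

# Usage with synthetic silhouettes for two people.
rng = np.random.default_rng(0)
seq_a = (rng.random((30, 64, 44)) < 0.30).astype(float)
seq_b = (rng.random((30, 64, 44)) < 0.60).astype(float)
gallery = {"A": averaged_silhouette(seq_a), "B": averaged_silhouette(seq_b)}
probe = averaged_silhouette((rng.random((30, 64, 44)) < 0.31).astype(float))
print(recognize(probe, gallery))   # -> "A"
```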

109 citations


Cited by
Christopher M. Bishop
01 Jan 2006
TL;DR: Probability distributions and linear models for regression and classification are presented, along with a discussion of combining models in the context of machine learning and classification.
Abstract: Probability Distributions.- Linear Models for Regression.- Linear Models for Classification.- Neural Networks.- Kernel Methods.- Sparse Kernel Machines.- Graphical Models.- Mixture Models and EM.- Approximate Inference.- Sampling Methods.- Continuous Latent Variables.- Sequential Data.- Combining Models.

10,141 citations

Journal ArticleDOI
06 Jun 1986-JAMA
TL;DR: The editors have done a masterful job of weaving together the biologic, the behavioral, and the clinical sciences into a single tapestry in which everyone from the molecular biologist to the practicing psychiatrist can find and appreciate his or her own research.
Abstract: I have developed "tennis elbow" from lugging this book around the past four weeks, but it is worth the pain, the effort, and the aspirin. It is also worth the (relatively speaking) bargain price. Including appendixes, this book contains 894 pages of text. The entire panorama of the neural sciences is surveyed and examined, and it is comprehensive in its scope, from genomes to social behaviors. The editors explicitly state that the book is designed as "an introductory text for students of biology, behavior, and medicine," but it is hard to imagine any audience, interested in any fragment of neuroscience at any level of sophistication, that would not enjoy this book. The editors have done a masterful job of weaving together the biologic, the behavioral, and the clinical sciences into a single tapestry in which everyone from the molecular biologist to the practicing psychiatrist can find and appreciate his or her own research.

7,563 citations

Journal ArticleDOI
TL;DR: A critical review of the nature of the problem, the state-of-the-art technologies, and the current assessment metrics used to evaluate learning performance under the imbalanced learning scenario is provided.
Abstract: With the continuous expansion of data availability in many large-scale, complex, and networked systems, such as surveillance, security, Internet, and finance, it becomes critical to advance the fundamental understanding of knowledge discovery and analysis from raw data to support decision-making processes. Although existing knowledge discovery and data engineering techniques have shown great success in many real-world applications, the problem of learning from imbalanced data (the imbalanced learning problem) is a relatively new challenge that has attracted growing attention from both academia and industry. The imbalanced learning problem is concerned with the performance of learning algorithms in the presence of underrepresented data and severe class distribution skews. Due to the inherent complex characteristics of imbalanced data sets, learning from such data requires new understandings, principles, algorithms, and tools to transform vast amounts of raw data efficiently into information and knowledge representation. In this paper, we provide a comprehensive review of the development of research in learning from imbalanced data. Our focus is to provide a critical review of the nature of the problem, the state-of-the-art technologies, and the current assessment metrics used to evaluate learning performance under the imbalanced learning scenario. Furthermore, in order to stimulate future research in this field, we also highlight the major opportunities and challenges, as well as potential important research directions for learning from imbalanced data.
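A small illustration of the survey's starting point: under a 99:1 class split, a degenerate majority-class predictor scores 99% accuracy, while minority-class recall and the G-mean, two metrics common in this literature, expose it. The metric choices here are illustrative, not the paper's full assessment suite.

```python
import numpy as np

def imbalance_metrics(y_true, y_pred):
    """Compare plain accuracy with minority-class recall and the G-mean
    (the geometric mean of the per-class recalls)."""
    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    acc = (tp + tn) / len(y_true)
    recall_pos = tp / max(tp + fn, 1)            # minority-class recall
    recall_neg = tn / max(tn + fp, 1)            # majority-class recall
    gmean = np.sqrt(recall_pos * recall_neg)
    return acc, recall_pos, gmean

# Always predicting the majority class looks 99% accurate on a 99:1 split
# but has zero minority recall and zero G-mean.
y = np.array([0] * 99 + [1])
yhat = np.zeros(100, dtype=int)
print(imbalance_metrics(y, yhat))   # -> (0.99, 0.0, 0.0)
```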

6,320 citations

01 Jan 2004
TL;DR: Comprehensive and up-to-date, this book includes essential topics that either reflect practical significance or are of theoretical importance, and it describes numerous important application areas such as image-based rendering and digital libraries.
Abstract: From the Publisher: The accessible presentation of this book gives both a general view of the entire computer vision enterprise and also offers sufficient detail to be able to build useful applications. Users learn techniques that have proven to be useful by first-hand experience and a wide range of mathematical methods. A CD-ROM with every copy of the text contains source code for programming practice, color images, and illustrative movies. Comprehensive and up-to-date, this book includes essential topics that either reflect practical significance or are of theoretical importance. Topics are discussed in substantial and increasing depth. Application surveys describe numerous important application areas such as image-based rendering and digital libraries. Many important algorithms are broken down and illustrated in pseudocode. Appropriate for use by engineers as a comprehensive reference to the computer vision enterprise.

3,627 citations

01 Apr 2003
TL;DR: The EnKF has a large user group, and numerous publications have discussed its applications and theoretical aspects; this paper reviews those results and also presents new ideas and alternative interpretations that further explain the success of the EnKF.
Abstract: The purpose of this paper is to provide a comprehensive presentation and interpretation of the Ensemble Kalman Filter (EnKF) and its numerical implementation. The EnKF has a large user group, and numerous publications have discussed applications and theoretical aspects of it. This paper reviews the important results from these studies and also presents new ideas and alternative interpretations which further explain the success of the EnKF. In addition to providing the theoretical framework needed for using the EnKF, there is also a focus on the algorithmic formulation and optimal numerical implementation. A program listing is given for some of the key subroutines. The paper also touches upon specific issues such as the use of nonlinear measurements, in situ profiles of temperature and salinity, and data which are available with high frequency in time. An ensemble based optimal interpolation (EnOI) scheme is presented as a cost-effective approach which may serve as an alternative to the EnKF in some applications. A fairly extensive discussion is devoted to the use of time correlated model errors and the estimation of model bias.
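The paper's optimized implementation and subroutine listings are not reproduced here; a minimal sketch of one stochastic EnKF analysis step, assuming a linear observation operator H, captures the core update.

```python
import numpy as np

rng = np.random.default_rng(1)

def enkf_analysis(A, d, H, R):
    """One stochastic EnKF analysis step, sketched: the forecast covariance
    is estimated from the ensemble itself, and each member is updated
    against a perturbed copy of the observation.
    A : (n, N) forecast ensemble (n state variables, N members)
    d : (m,)   observation vector
    H : (m, n) linear observation operator
    R : (m, m) observation error covariance
    """
    n, N = A.shape
    Ap = A - A.mean(axis=1, keepdims=True)          # ensemble anomalies
    P = Ap @ Ap.T / (N - 1)                         # sample forecast covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)    # Kalman gain
    D = d[:, None] + rng.multivariate_normal(
        np.zeros(len(d)), R, size=N).T              # perturbed observations
    return A + K @ (D - H @ A)                      # updated (analysis) ensemble

# Usage: 2-variable state, 50 members, observe only the first variable.
A = rng.normal(0.0, 1.0, size=(2, 50))
H = np.array([[1.0, 0.0]])
R = np.array([[0.1]])
Aa = enkf_analysis(A, d=np.array([0.5]), H=H, R=R)
print(Aa.mean(axis=1))   # posterior mean pulled toward the observation
```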

2,975 citations