Author

Mike Nachtegael

Bio: Mike Nachtegael is an academic researcher from Ghent University. The author has contributed to research in the topics of fuzzy logic and fuzzy sets. The author has an h-index of 25 and has co-authored 107 publications receiving 2,669 citations.


Papers
Journal ArticleDOI
TL;DR: A new fuzzy filter is presented for the noise reduction of images corrupted with additive noise, based on fuzzy rules which make use of membership functions.
Abstract: A new fuzzy filter is presented for the noise reduction of images corrupted with additive noise. The filter consists of two stages. The first stage computes a fuzzy derivative for eight different directions. The second stage uses these fuzzy derivatives to perform fuzzy smoothing by weighting the contributions of neighboring pixel values. Both stages are based on fuzzy rules which make use of membership functions. The filter can be applied iteratively to effectively reduce heavy noise. In particular, the shape of the membership functions is adapted according to the remaining noise level after each iteration, making use of the distribution of the homogeneity in the image. A statistical model for the noise distribution can be incorporated to relate the homogeneity to the adaptation scheme of the membership functions. Experimental results are obtained to show the feasibility of the proposed approach. These results are also compared to those of other filters using numerical measures and visual inspection.
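
The paper specifies the fuzzy derivatives, rule base, and adaptive membership functions in detail; the Python sketch below is only a rough illustration of the two-stage idea and is not the authors' filter. The triangular membership function small_membership and the parameter k are assumptions introduced here for the example.

```python
import numpy as np

def small_membership(x, k=20.0):
    # Assumed triangular-style membership for the fuzzy set "small difference".
    return np.clip(1.0 - np.abs(x) / k, 0.0, 1.0)

def fuzzy_smooth(img, k=20.0):
    """Stage 1: directional differences; stage 2: fuzzy-weighted smoothing."""
    img = img.astype(float)
    out = img.copy()
    h, w = img.shape
    # Eight neighbour directions (NW, N, NE, W, E, SW, S, SE).
    dirs = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            num, den = 0.0, 0.0
            for di, dj in dirs:
                diff = img[i + di, j + dj] - img[i, j]  # crude directional derivative
                mu = small_membership(diff, k)          # degree to which the difference is "small"
                num += mu * diff
                den += mu
            if den > 0:
                out[i, j] = img[i, j] + num / den       # correction weighted by the memberships
    return out
```

The actual filter replaces these crude differences with rule-based fuzzy derivatives and re-shapes the membership functions after each iteration according to the remaining noise level; iterating fuzzy_smooth with a shrinking k only loosely mimics that behaviour.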

314 citations

BookDOI
01 Jan 2000
TL;DR: The use of fuzzy techniques in image processing is one of the main topics of the Fuzziness and Uncertainty Modelling Research Group of Prof. Kerre.
Abstract: Vision in general and images in particular have always played an important and essential role in human life. Today, image processing is a very active research area with many applications. In order to cope with the wide variety of image processing problems, several techniques have been introduced and developed, quite often with great success. Among the different techniques that are currently in use, we also encounter fuzzy techniques. The use of fuzzy techniques in image processing is one of the main topics of the Fuzziness and Uncertainty Modelling Research Group of Prof. Kerre. In this paper, we briefly summarize some achievements of the past years.

288 citations

Journal ArticleDOI
TL;DR: A new algorithm that is especially developed for reducing all kinds of impulse noise, the fuzzy impulse noise detection and reduction method (FIDRM), which can also be applied to images containing a mixture of impulse noise and other types of noise.
Abstract: Removing or reducing impulse noise is a very active research area in image processing. In this paper we describe a new algorithm that is especially developed for reducing all kinds of impulse noise: the fuzzy impulse noise detection and reduction method (FIDRM). It can also be applied to images containing a mixture of impulse noise and other types of noise. The result is an image almost entirely free of impulse noise, so that other filters can be applied afterwards. This nonlinear filtering technique consists of two separate steps: an impulse noise detection step and a reduction step that preserves edge sharpness. Based on the concept of fuzzy gradient values, the detection method constructs a fuzzy set "impulse noise". This fuzzy set is represented by a membership function that is used by the filtering method, which performs a fuzzy averaging of neighboring pixels. Experimental results show that FIDRM provides a significant improvement over other existing filters. FIDRM is not only very fast, but also very effective at reducing low as well as very high levels of impulse noise.
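
FIDRM's fuzzy gradient values and rule base are defined in the paper; as a loose, hypothetical illustration of its detect-then-filter structure only, the Python sketch below flags a pixel as impulse noise when it differs strongly from all of its neighbours and then replaces it with an average that trusts cleaner neighbours more. The ramp membership and the thresholds low and high are assumptions made for this example, not FIDRM's parameters.

```python
import numpy as np

def impulse_membership(img, low=10.0, high=50.0):
    """Detection step: degree to which each pixel is considered impulse noise."""
    img = img.astype(float)
    h, w = img.shape
    mu = np.zeros_like(img)
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            window = img[i - 1:i + 2, j - 1:j + 2]
            diffs = np.delete(np.abs(window - img[i, j]).ravel(), 4)  # drop the centre pixel
            m = diffs.min()  # for an impulse, even the closest neighbour is far away
            mu[i, j] = np.clip((m - low) / (high - low), 0.0, 1.0)    # assumed ramp membership
    return mu

def reduce_impulse_noise(img, low=10.0, high=50.0):
    """Reduction step: fuzzy averaging of neighbours, weighted by how clean they are."""
    img = img.astype(float)
    mu = impulse_membership(img, low, high)
    out = img.copy()
    h, w = img.shape
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            if mu[i, j] == 0:
                continue  # pixel judged noise-free, keep it
            window = img[i - 1:i + 2, j - 1:j + 2]
            weights = 1.0 - mu[i - 1:i + 2, j - 1:j + 2]  # trust cleaner neighbours more
            weights[1, 1] = 0.0                            # never average the pixel with itself
            if weights.sum() > 0:
                avg = (weights * window).sum() / weights.sum()
                out[i, j] = (1 - mu[i, j]) * img[i, j] + mu[i, j] * avg
    return out
```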

265 citations

Journal ArticleDOI
TL;DR: A new two-step fuzzy filter that adopts a fuzzy logic approach for the enhancement of images corrupted with impulse noise is presented, and it is found experimentally that the proposed method provides a significant improvement over other state-of-the-art methods.

129 citations

Journal ArticleDOI
TL;DR: This paper proposes similarity measures based on neighbourhoods, so that the relevant structures of the images are captured better; 13 new similarity measures were found to be appropriate for the comparison of images.
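
As a small, hypothetical illustration of the neighbourhood idea (not one of the paper's 13 measures), the Python sketch below compares the 3x3 neighbourhood around each pixel in image A with the corresponding neighbourhood in B and averages the per-neighbourhood scores, so that local structure rather than isolated pixel values drives the similarity.

```python
import numpy as np

def neighbourhood_similarity(a, b, max_val=255.0):
    """Average similarity of corresponding 3x3 neighbourhoods, in [0, 1]."""
    a = a.astype(float)
    b = b.astype(float)
    h, w = a.shape
    scores = []
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            na = a[i - 1:i + 2, j - 1:j + 2]
            nb = b[i - 1:i + 2, j - 1:j + 2]
            # 1 minus the mean normalised absolute difference over the neighbourhood.
            scores.append(1.0 - np.abs(na - nb).mean() / max_val)
    return float(np.mean(scores))
```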

127 citations


Cited by
Journal ArticleDOI
TL;DR: This paper presents results of an extensive subjective quality assessment study in which a total of 779 distorted images were evaluated by about two dozen human subjects; it is the largest subjective image quality study in the literature in terms of number of images, distortion types, and number of human judgments per image.
Abstract: Measurement of visual quality is of fundamental importance for numerous image and video processing applications, where the goal of quality assessment (QA) algorithms is to automatically assess the quality of images or videos in agreement with human quality judgments. Over the years, many researchers have taken different approaches to the problem, contributed significant research in this area, and claimed to have made progress in their respective domains. It is important to evaluate the performance of these algorithms in a comparative setting and to analyze their strengths and weaknesses. In this paper, we present results of an extensive subjective quality assessment study in which a total of 779 distorted images were evaluated by about two dozen human subjects. The "ground truth" image quality data obtained from about 25,000 individual human quality judgments is used to evaluate the performance of several prominent full-reference image quality assessment algorithms. To the best of our knowledge, apart from video quality studies conducted by the Video Quality Experts Group, the study presented in this paper is the largest subjective image quality study in the literature in terms of number of images, distortion types, and number of human judgments per image. Moreover, we have made the data from the study freely available to the research community. This allows other researchers to easily report comparative results in the future.

2,598 citations

Proceedings ArticleDOI
23 Aug 2010
TL;DR: A simple mathematical relationship is derived between the peak signal-to-noise ratio and the structural similarity index measure, which works for various kinds of image degradations such as Gaussian blur, additive Gaussian white noise, and JPEG and JPEG2000 compression.
Abstract: In this paper, we analyse two well-known objective image quality metrics, the peak signal-to-noise ratio (PSNR) and the structural similarity index measure (SSIM), and we derive a simple mathematical relationship between them which works for various kinds of image degradations such as Gaussian blur, additive Gaussian white noise, and JPEG and JPEG2000 compression. A series of tests carried out on images extracted from the Kodak database gives a better understanding of the similarities and differences between the SSIM and the PSNR.
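
For reference, the two metrics the paper relates can be sketched as follows: PSNR computed from the mean squared error, and a simplified single-window SSIM with the standard constants C1 = (0.01 L)^2 and C2 = (0.03 L)^2. The full SSIM is computed over local windows and averaged, so this global version is only an approximation; the derivation of the PSNR/SSIM relationship itself is in the paper.

```python
import numpy as np

def psnr(ref, test, max_val=255.0):
    """Peak signal-to-noise ratio in dB."""
    mse = np.mean((ref.astype(float) - test.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(max_val ** 2 / mse)

def global_ssim(ref, test, max_val=255.0):
    """Single-window (global) SSIM; the standard index averages this over local windows."""
    x, y = ref.astype(float), test.astype(float)
    c1, c2 = (0.01 * max_val) ** 2, (0.03 * max_val) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))
```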

2,540 citations

Posted Content
TL;DR: Fuzzy sets allow a far richer dialogue between ideas and evidence in social research than previously possible, and can be carefully tailored to fit evolving theoretical concepts, sharpening quantitative tools with in-depth knowledge gained through qualitative, case-oriented inquiry.
Abstract: In this innovative approach to the practice of social science, Charles Ragin explores the use of fuzzy sets to bridge the divide between quantitative and qualitative methods. Paradoxically, the fuzzy set is a powerful tool because it replaces an unwieldy, "fuzzy" instrument (the variable, which establishes only the positions of cases relative to each other) with a precise one: degree of membership in a well-defined set. Ragin argues that fuzzy sets allow a far richer dialogue between ideas and evidence in social research than previously possible. They let quantitative researchers abandon "homogenizing assumptions" about cases and causes, they extend diversity-oriented research strategies, and they provide a powerful connection between theory and data analysis. Most important, fuzzy sets can be carefully tailored to fit evolving theoretical concepts, sharpening quantitative tools with in-depth knowledge gained through qualitative, case-oriented inquiry. This book will revolutionize research methods not only in sociology, political science, and anthropology but in any field of inquiry dealing with complex patterns of causation.

1,828 citations

Journal ArticleDOI
TL;DR: This paper provides a state-of-the-art review and analysis of the different existing methods of steganography, along with some common standards and guidelines drawn from the literature, and offers some recommendations, advocating for the object-oriented embedding mechanism.

1,572 citations

Journal ArticleDOI
TL;DR: Computer and Robot Vision, Vol. 1, by R.M. Haralick and Linda G. Shapiro (Addison-Wesley, 1992).
Abstract: Computer and Robot Vision, Vol. 1, by R.M. Haralick and Linda G. Shapiro, Addison-Wesley, 1992, ISBN 0-201-10887-1.

1,426 citations