scispace - formally typeset
Author

Raman Maini

Other affiliations: Government College
Bio: Raman Maini is an academic researcher from Punjabi University. The author has contributed to research in the topics of Vehicle routing problem and Ant colony optimization algorithms. The author has an h-index of 8, has co-authored 40 publications receiving 691 citations, and was previously affiliated with Government College.

Papers
Posted Content
TL;DR: An overview of underlying concepts, along with algorithms commonly used for image enhancement, is provided, with particular reference to point processing methods and histogram processing.
Abstract: The principal objective of image enhancement is to process an image so that the result is more suitable than the original image for a specific application. Digital image enhancement techniques provide a multitude of choices for improving the visual quality of images. The appropriate choice of such techniques is greatly influenced by the imaging modality, the task at hand, and the viewing conditions. This paper provides an overview of underlying concepts, along with algorithms commonly used for image enhancement. The paper focuses on spatial domain techniques for image enhancement, with particular reference to point processing methods and histogram processing.
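As a concrete illustration of the histogram-processing techniques the abstract mentions, here is a minimal histogram-equalization sketch in Python. It is illustrative only, not code from the paper, and assumes 8-bit grayscale pixel values:

```python
# Illustrative sketch (not from the paper): histogram equalization,
# a classic histogram-processing technique for spatial-domain enhancement.
# Pixel values are assumed to be 8-bit integers (0-255).

def equalize_histogram(pixels, levels=256):
    """Remap gray levels so the cumulative distribution is roughly uniform."""
    n = len(pixels)
    # Build the histogram.
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    # Cumulative distribution function (CDF).
    cdf = []
    total = 0
    for count in hist:
        total += count
        cdf.append(total)
    cdf_min = next(c for c in cdf if c > 0)
    # Standard equalization mapping: stretch the CDF over the full range.
    def remap(p):
        return round((cdf[p] - cdf_min) / (n - cdf_min) * (levels - 1))
    return [remap(p) for p in pixels]

# A low-contrast ramp gets stretched across the full dynamic range.
enhanced = equalize_histogram([100, 101, 102, 103])  # -> [0, 85, 170, 255]
```

Point-processing methods like this act on each pixel independently of its neighbors, which is what makes them cheap enough for interactive use.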

363 citations

01 Jan 2011
TL;DR: A performance comparison between four of the most common encryption algorithms: DES, 3DES, Blowfish and AES is provided.
Abstract: This paper presents a fair comparison between the most common and widely used algorithms in the data encryption field. The two main characteristics that identify and differentiate one encryption algorithm from another are its ability to secure the protected data against attacks and its speed and efficiency in doing so. This paper provides a performance comparison between four of the most common encryption algorithms: DES, 3DES, Blowfish, and AES. The comparison has been conducted by running several encryption settings to process different sizes of data blocks.
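The benchmarking methodology described (timing each algorithm over different data-block sizes) can be sketched as below. This is an assumption-laden illustration, not the paper's code: DES/3DES/Blowfish/AES are not in the Python standard library, so a hashlib digest stands in for the cipher call.

```python
# Illustrative sketch of the comparison methodology: time each algorithm
# over several data-block sizes and report throughput. A stdlib stand-in
# (hashlib) replaces real cipher objects, which would be plugged in
# via the `transform` callable in practice.
import hashlib
import os
import time

def benchmark(name, transform, sizes=(1024, 16 * 1024, 256 * 1024)):
    """Time `transform` on a random block of each size; return MB/s figures."""
    results = {}
    for size in sizes:
        block = os.urandom(size)
        start = time.perf_counter()
        transform(block)
        elapsed = time.perf_counter() - start
        results[size] = size / elapsed / 1e6  # throughput in MB/s
    return name, results

# Stand-in workload; replace with e.g. an AES or DES encrypt call.
print(benchmark("sha256-standin", lambda data: hashlib.sha256(data).digest()))
```

Running each candidate cipher through the same harness with identical block sizes is what makes the resulting speed comparison fair.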

117 citations

Journal ArticleDOI
TL;DR: A hybrid algorithm, HAFA, incorporates certain aspects of firefly optimization and ant colony system algorithms for solving a class of vehicle routing problems; results demonstrate the superiority of the proposed approach over other existing firefly-algorithm-based approaches for this type of discrete optimization problem.

98 citations

01 Jan 2006
TL;DR: It has been observed that the Prewitt edge detector works effectively for digital images corrupted with Poisson noise, whereas its performance degrades sharply for other kinds of noise in digital images.
Abstract: Since edge detection is at the forefront of image processing for object detection, it is crucial to have a good understanding of edge detection algorithms. This paper evaluates the performance of the Prewitt edge detector for detecting edges in digital images corrupted with different kinds of noise. Different kinds of noise are studied in order to evaluate the performance of the Prewitt edge detector. Further, various standard test images are examined to validate our results. The software is developed using MATLAB 7.0.1. It has been observed that the Prewitt edge detector works effectively for digital images corrupted with Poisson noise, whereas its performance degrades sharply for other kinds of noise in digital images. The results of this study are quite promising.
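For reference, the Prewitt operator the paper evaluates combines two fixed 3x3 gradient kernels into an edge-magnitude image. The sketch below is a generic Python rendering (the paper's own implementation was in MATLAB and is not reproduced here):

```python
# Illustrative sketch (not the paper's MATLAB code): the Prewitt operator
# estimates horizontal and vertical gradients with two 3x3 kernels and
# combines them into an edge-magnitude image.
import math

PREWITT_X = [[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]]   # horizontal gradient
PREWITT_Y = [[-1, -1, -1], [0, 0, 0], [1, 1, 1]]   # vertical gradient

def respond_at(img, r, c, kernel):
    """Apply a 3x3 kernel centered on pixel (r, c)."""
    return sum(kernel[i][j] * img[r + i - 1][c + j - 1]
               for i in range(3) for j in range(3))

def prewitt(img):
    """Return the gradient-magnitude image (border pixels left at zero)."""
    rows, cols = len(img), len(img[0])
    out = [[0.0] * cols for _ in range(rows)]
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            gx = respond_at(img, r, c, PREWITT_X)
            gy = respond_at(img, r, c, PREWITT_Y)
            out[r][c] = math.hypot(gx, gy)
    return out

# A vertical step edge produces a strong horizontal-gradient response.
step = [[0, 0, 9, 9]] * 4
edges = prewitt(step)
```

Noise sensitivity enters through the small 3x3 support: a single corrupted pixel perturbs the gradient sums directly, which is consistent with the paper's observation that performance depends heavily on the noise type.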

80 citations

01 Jan 2009
TL;DR: This paper compares five different speckle reduction filters quantitatively using simulated imagery and recommends the best filter based on the statistical and experimental results.
Abstract: Today, ultrasound and magnetic resonance imaging are essential tools for noninvasive medical diagnosis. One of the fundamental problems in this field is speckle noise, which is a major limitation on image quality, especially in ultrasound imaging. The presence of speckle noise affects image interpretation by humans and the accuracy of computer-assisted diagnostic techniques. Low image quality is an obstacle to effective feature extraction, analysis, recognition, and quantitative measurement. Speckle is a granular noise that inherently exists in and degrades the quality of medical images. Before using ultrasound and magnetic resonance imaging, the very first step is to reduce the effect of speckle noise. Many speckle reduction techniques have been studied by researchers; however, there is no comprehensive method that takes all the constraints into consideration. Filtering is one of the common methods used to reduce speckle noise. This paper compares five different speckle reduction filters quantitatively using simulated imagery. The results are presented as filtered images, statistical tables, and diagrams. Finally, the best filter is recommended based on the statistical and experimental results.
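One of the simplest filters typically included in such speckle-reduction comparisons is the median filter. The abstract does not list the five filters compared, so the following is a generic illustration rather than the paper's method:

```python
# Illustrative sketch: a 3x3 median filter, a simple speckle-reduction
# filter commonly included in comparisons like the one described above
# (the paper does not publish its code; this is a generic example).

def median_filter_3x3(img):
    """Replace each interior pixel with the median of its 3x3 neighborhood."""
    rows, cols = len(img), len(img[0])
    out = [row[:] for row in img]  # border pixels copied unchanged
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            window = sorted(img[r + i][c + j]
                            for i in (-1, 0, 1) for j in (-1, 0, 1))
            out[r][c] = window[4]  # median of the 9 neighborhood values
    return out

# A single speckle spike in a flat region is removed entirely.
noisy = [[10, 10, 10], [10, 99, 10], [10, 10, 10]]
clean = median_filter_3x3(noisy)
```

Unlike a mean filter, the median rejects isolated outliers without blurring step edges, which is why it is a standard baseline in speckle studies.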

24 citations


Cited by
01 Apr 1997
TL;DR: The objective of this paper is to give a comprehensive introduction to applied cryptography with an engineer or computer scientist in mind, emphasizing the knowledge needed to create practical systems that support integrity, confidentiality, or authenticity.
Abstract: The objective of this paper is to give a comprehensive introduction to applied cryptography with an engineer or computer scientist in mind. The emphasis is on the knowledge needed to create practical systems that support integrity, confidentiality, or authenticity. Topics covered include an introduction to the concepts in cryptography, attacks against cryptographic systems, key use and handling, random bit generation, encryption modes, and message authentication codes. Recommendations on algorithms and further reading are given at the end of the paper. This paper should enable the reader to build, understand, and evaluate system descriptions and designs based on the cryptographic components described in the paper.

2,188 citations

Journal ArticleDOI
TL;DR: A review of Computer and Robot Vision, Vol. 1, by R.M. Haralick and Linda G. Shapiro.
Abstract: Computer and Robot Vision Vol. 1, by R.M. Haralick and Linda G. Shapiro, Addison-Wesley, 1992, ISBN 0-201-10887-1.

1,426 citations

Journal ArticleDOI
TL;DR: Experimental results demonstrate that the proposed enhancement algorithm can not only enhance the details but also preserve the naturalness for non-uniform illumination images.
Abstract: Image enhancement plays an important role in image processing and analysis. Among various enhancement algorithms, Retinex-based algorithms can efficiently enhance details and have been widely adopted. Since Retinex-based algorithms regard illumination removal as a default preference and fail to limit the range of reflectance, the naturalness of non-uniform illumination images cannot be effectively preserved. However, naturalness is essential for image enhancement to achieve pleasing perceptual quality. In order to preserve naturalness while enhancing details, we propose an enhancement algorithm for non-uniform illumination images. In general, this paper makes the following three major contributions. First, a lightness-order-error measure is proposed to assess naturalness preservation objectively. Second, a bright-pass filter is proposed to decompose an image into reflectance and illumination, which, respectively, determine the details and the naturalness of the image. Third, we propose a bi-log transformation, which is utilized to map the illumination to strike a balance between details and naturalness. Experimental results demonstrate that the proposed algorithm can not only enhance the details but also preserve the naturalness for non-uniform illumination images.
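The reflectance/illumination decomposition underlying Retinex-based methods can be sketched as follows. This is a hedged illustration: the paper uses a bright-pass filter to estimate illumination, whereas here a simple 1-D moving average stands in for that estimate.

```python
# Illustrative sketch of the Retinex-style decomposition I = R * L
# (image = reflectance * illumination). The paper's bright-pass filter
# is replaced here by a plain moving average as the illumination estimate.

def decompose(signal, radius=1):
    """Split a 1-D intensity signal into illumination and reflectance."""
    n = len(signal)
    illumination = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        illumination.append(sum(signal[lo:hi]) / (hi - lo))  # local average
    # Reflectance is what remains after dividing out the illumination.
    reflectance = [s / l for s, l in zip(signal, illumination)]
    return illumination, reflectance

illum, refl = decompose([4.0, 4.0, 8.0, 8.0])
# Recombining reproduces the input: signal[i] == illum[i] * refl[i]
```

Because enhancement then operates on the two components separately (details from reflectance, naturalness from illumination), how the illumination is mapped back, the paper's bi-log transformation, controls the balance the abstract describes.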

918 citations

Journal ArticleDOI
TL;DR: Validations based on four publicly available databases show that the proposed patch-based contrast quality index (PCQI) method provides accurate predictions on the human perception of contrast variations.
Abstract: Contrast is a fundamental attribute of images that plays an important role in human visual perception of image quality. While numerous approaches have been proposed to enhance image contrast, much less work has been dedicated to automatic quality assessment of contrast-changed images. Existing approaches rely on global statistics to estimate contrast quality. Here we propose a novel local patch-based objective quality assessment method using an adaptive representation of local patch structure, which allows us to decompose any image patch into its mean intensity, signal strength, and signal structure components and then evaluate their perceptual distortions in different ways. A unique feature that differentiates the proposed method from previous contrast quality models is the capability to produce a local contrast quality map, which predicts local quality variations over space and may be employed to guide contrast enhancement algorithms. Validations based on four publicly available databases show that the proposed patch-based contrast quality index (PCQI) method provides accurate predictions of the human perception of contrast variations.
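The patch decomposition the abstract describes (mean intensity, signal strength, signal structure) can be written out directly. This sketch shows the decomposition only, not the PCQI distortion measures built on top of it:

```python
# Illustrative sketch of the patch decomposition described in the PCQI
# abstract: a patch x splits into its mean intensity, signal strength
# (the norm of the zero-mean part), and signal structure (a unit vector),
# so that x == mean + strength * structure.
import math

def decompose_patch(patch):
    mean = sum(patch) / len(patch)
    centered = [p - mean for p in patch]
    strength = math.sqrt(sum(c * c for c in centered))
    if strength == 0:
        structure = [0.0] * len(patch)  # flat patch: no structure component
    else:
        structure = [c / strength for c in centered]
    return mean, strength, structure

mean, strength, structure = decompose_patch([1.0, 3.0, 5.0, 7.0])
```

Comparing the three components of a distorted patch against the reference patch, each with its own perceptual penalty, is what yields the local quality map mentioned in the abstract.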

270 citations

01 Jan 2011
TL;DR: This paper provides a fair comparison between three of the most common symmetric key cryptography algorithms, DES, AES, and Blowfish, on the basis of speed, block size, and key size.
Abstract: Security is one of the most challenging aspects of internet and network applications. Internet and network applications are growing very fast, so the importance and the value of the data exchanged over the internet or other media types are increasing. Hence, the search for the best solution to offer the necessary protection against data intruders' attacks, while providing these services in time, is one of the most interesting subjects in the security-related communities. Cryptography is one of the main categories of computer security; it converts information from its normal form into an unreadable form. The two main characteristics that identify and differentiate one encryption algorithm from another are its ability to secure the protected data against attacks and its speed and efficiency in doing so. This paper provides a fair comparison between three of the most common symmetric key cryptography algorithms: DES, AES, and Blowfish. Since the main concern here is the performance of the algorithms under different settings, the presented comparison takes into consideration the behavior and performance of each algorithm when different data loads are used. The comparison is made on the basis of these parameters: speed, block size, and key size. The simulation program is implemented in Java.
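Two of the three comparison parameters, block size and key size, are fixed properties of each cipher. The well-known values are summarized below (these are the standard published parameters, not the paper's measured speeds):

```python
# Well-known block-size and key-size parameters of the three ciphers
# compared above (not the paper's speed measurements, which depend on
# the implementation and hardware).

CIPHER_PARAMS = {
    # name: (block size in bits, supported key sizes in bits)
    "DES":      (64,  (56,)),                     # 56 effective key bits
    "AES":      (128, (128, 192, 256)),
    "Blowfish": (64,  tuple(range(32, 449, 8))),  # variable, 32-448 bits
}

def max_key_bits(name):
    """Largest key size a cipher supports, in bits."""
    return max(CIPHER_PARAMS[name][1])
```

Speed, the third parameter, has to be measured empirically, which is why the paper runs its Java simulation over different data loads rather than reading it off a table.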

220 citations