Topic

Run-length encoding

About: Run-length encoding is a research topic. Over the lifetime, 504 publications have been published within this topic, receiving 4,441 citations. The topic is also known as: RLE.


Papers
GO Guo-qin
01 Jan 2004
TL;DR: An effective approach to vectorization based on the run-length encoding of compressed raster data is developed in this paper, and the implementation of the algorithm is described in detail.

Abstract: In order to overcome the restrictions of traditional methods of converting raster data to vector data, an effective approach to vectorization based on the run-length encoding of compressed raster data is developed in this paper, and the implementation of the algorithm is described in detail. Because run-length encoding offers easy access, a reasonable compression ratio, and rapid conversion to and from ordinary raster form, the approach improves the computer's ability and speed in handling large-scale, complex, high-precision raster data. It can be used to extract attribute boundaries from the results of raster-based geographical spatial analysis and from remote-sensing images; in tests, the approach proved efficient and easy to implement.
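To make the representation concrete, the following is a minimal sketch of run-length encoding a single raster scanline into (value, start column, length) runs, the compressed form such a raster-to-vector approach operates on; the function and variable names are illustrative and not taken from the paper.

```python
# Minimal sketch: run-length encode one raster scanline into
# (value, start_column, length) runs, the compressed form a
# run-length-based raster-to-vector conversion can work on.
# Names are illustrative, not from the paper.

def encode_scanline(row):
    runs = []
    start = 0
    for col in range(1, len(row) + 1):
        # Close the current run when the value changes or the row ends.
        if col == len(row) or row[col] != row[start]:
            runs.append((row[start], start, col - start))
            start = col
    return runs

# Example: a scanline crossing two attribute regions.
print(encode_scanline([0, 0, 0, 5, 5, 0]))
# [(0, 0, 3), (5, 3, 2), (0, 5, 1)]
```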

10 citations

Journal ArticleDOI
TL;DR: In this article, a 3-level Haar wavelet transform is used as a common building block for color image blur detection and compression, performed in parallel with encryption, to save resources.
Abstract: This paper presents a 3-in-1 standalone FPGA system which can perform color image blur detection in parallel with compression and encryption. Both blur detection and compression are based on the 3-level Haar wavelet transform, which is used as a common building block to save resources. The compression is based on a hard thresholding scheme followed by the Run Length Encoding (RLE) technique. The encryption is based on the 128-bit Advanced Encryption Standard (AES), which is considered one of the most secure algorithms. Moreover, the modified Lorenz chaotic system is combined with the AES to perform the Cipher Block Chaining (CBC) mode. The proposed system is realized in HDL and implemented with Xilinx on an XC5VLX50T FPGA. The system utilizes only 25% of the available slices. Furthermore, the system can achieve a throughput of 3.458 Gbps, which is suitable for real-time applications. To validate the compression performance, the system has been tested with all the standard 256×256 images. It is shown that, depending on the amount of detail in the image, the system can achieve 30 dB PSNR at compression ratios in the range 0.08-0.38. The proposed system can be integrated with digital cameras to process captured images on the fly prior to transmission or storage. Depending on the application, blurred images can be either marked for future enhancement or simply filtered out.
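As a rough illustration of the compression stage described above (hard thresholding followed by RLE), the sketch below thresholds a list of wavelet coefficients and then run-length encodes the zero runs. The threshold value and the choice of (zero-run length, value) pairs are assumptions; the actual design is implemented in HDL on an FPGA rather than in software.

```python
# Sketch of the compression stage: hard-threshold the wavelet
# coefficients (small values set to zero), then run-length encode
# the resulting zero runs. Threshold and pair format are assumptions.

def hard_threshold(coeffs, t):
    return [c if abs(c) >= t else 0 for c in coeffs]

def rle_zero_runs(coeffs):
    """Encode as (zero_run_length, nonzero_value) pairs."""
    out, zeros = [], 0
    for c in coeffs:
        if c == 0:
            zeros += 1
        else:
            out.append((zeros, c))
            zeros = 0
    if zeros:
        out.append((zeros, None))  # trailing zeros carry no value
    return out

coeffs = [34, 1, 0, -2, 41, 0, 0, 1, -27, 0]
print(rle_zero_runs(hard_threshold(coeffs, t=5)))
# [(0, 34), (3, 41), (3, -27), (1, None)]
```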

10 citations

Proceedings ArticleDOI
27 Mar 2014
TL;DR: A lossless two-phase compression algorithm for DNA sequences is presented: in the first phase a modified version of Run Length Encoding (RLE) is applied, and in the second phase the resulting genetic sequence is compressed using ASCII values.
Abstract: The properties of DNA sequences offer an opportunity to develop DNA-specific compression algorithms. A lossless two-phase compression algorithm for DNA sequences is presented. In the first phase a modified version of Run Length Encoding (RLE) is applied, and in the second phase the resulting genetic sequence is compressed using ASCII values. Using eight-bit ASCII codes ensures one-fourth compression irrespective of the repeated or non-repeated behavior of the sequence, and the modified RLE technique enhances the compression further. Not only is the compression ratio of the algorithm quite encouraging, but the simplicity of the technique also makes it more attractive.
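The second phase described above amounts to packing four nucleotides, at two bits each, into one byte, which by itself yields the one-fourth compression the abstract mentions. The sketch below shows only that packing step; the paper's modified RLE first phase is not detailed in the abstract and is omitted here, and the bit assignment and padding choice are assumptions.

```python
# Sketch of the second phase: pack four nucleotides (2 bits each)
# into one byte, giving one-fourth compression versus 8-bit ASCII.
# The 2-bit codes and the 'A' padding are assumptions.

CODE = {'A': 0b00, 'C': 0b01, 'G': 0b10, 'T': 0b11}

def pack_bases(seq):
    seq = seq.upper()
    packed = bytearray()
    for i in range(0, len(seq), 4):
        chunk = seq[i:i + 4].ljust(4, 'A')  # pad the final group
        byte = 0
        for base in chunk:
            byte = (byte << 2) | CODE[base]
        packed.append(byte)
    return bytes(packed)

print(pack_bases("ACGTACGT").hex())  # '1b1b': 2 bytes for 8 bases
```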

9 citations

Patent
11 Dec 1990
TL;DR: In this article, a run length encoder is presented for converting bi-level video image data made up of white and black pixel values into a run-length code for use by a vector processor; it includes a transition detector.
Abstract: A run length encoder converts bi-level video image data made up of white and black pixel values into a run length code to be used by a vector processor; it includes a transition detector for detecting transitions of the input image data from white to black or from black to white. The input image data are addressed along each scanned line. At every white-to-black transition, the pixel address is stored in memory. The number of continuous black pixels following this transition is counted until a black-to-white transition is detected. The resulting count value corresponding to each stored address value for each black "run" is also stored in memory; these stored values represent the run length code of the video image data for use with the vector processor.
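The following is an illustrative software analogue of the logic described in the patent: scan a bi-level line, record the pixel address at each white-to-black transition, and count the continuous black pixels until the next black-to-white transition. The real encoder is a hardware circuit; this list-based version only mirrors the behavior.

```python
# Software analogue of the patented encoder's logic: record the
# address of each white-to-black transition and the length of the
# black run that follows it.

def encode_black_runs(line, black=1):
    runs = []  # (start_address, run_length) for each black run
    col = 0
    while col < len(line):
        if line[col] == black:          # white-to-black transition
            start = col
            while col < len(line) and line[col] == black:
                col += 1                # count continuous black pixels
            runs.append((start, col - start))
        else:
            col += 1
    return runs

print(encode_black_runs([0, 1, 1, 1, 0, 0, 1, 0]))
# [(1, 3), (6, 1)]
```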

9 citations

Proceedings ArticleDOI
26 Aug 2008
TL;DR: Experiments show that the proposed technique of fingerprint identification using a gray Hopfield neural network improved by run-length encoding is useful across a number of different fingerprint image samples, in terms of converged image quality and encoding and decoding performance.
Abstract: This paper presents a new technique for fingerprint identification using a gray Hopfield neural network (GHNN) improved by run-length encoding (RLE). A Gabor filter is used for image enhancement at the enrollment stage, and a vector field algorithm is used to detect the core as a reference point; finding this point makes it possible to cover most of the information around the core. The GHNN deals with gray-level images by learning on the bitplanes that represent the layers of the fingerprint image. For a large number of images, the GHNN's memory needs very large storage space to cover all learned fingerprint images. RLE is a simple and useful way to save network memory capacity by encoding the stored weights, so that runs of repeated weight values are stored compactly. Experiments carried out on fingerprint images show that the proposed technique is useful across a number of different fingerprint image samples, in terms of converged image quality and encoding and decoding performance.
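As a minimal sketch of the idea of encoding the stored weights, the code below run-length encodes a sequence of weight values so that runs of repeated values take constant space, and decodes them back; the weight values and memory layout are illustrative and not taken from the paper.

```python
# Minimal sketch: run-length encode repeated weight values so that
# long runs take constant space, then decode losslessly.
# Values and layout are illustrative only.

def rle_encode(values):
    encoded = []
    for v in values:
        if encoded and encoded[-1][0] == v:
            encoded[-1][1] += 1        # extend the current run
        else:
            encoded.append([v, 1])     # start a new run
    return encoded

def rle_decode(encoded):
    out = []
    for v, count in encoded:
        out.extend([v] * count)
    return out

weights = [0, 0, 0, 0, 1, 1, -1, -1, -1, 0, 0]
packed = rle_encode(weights)
print(packed)                          # [[0, 4], [1, 2], [-1, 3], [0, 2]]
assert rle_decode(packed) == weights   # lossless round trip
```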

9 citations

Network Information
Related Topics (5)
Network packet: 159.7K papers, 2.2M citations, 76% related
Feature extraction: 111.8K papers, 2.1M citations, 75% related
Convolutional neural network: 74.7K papers, 2M citations, 74% related
Image processing: 229.9K papers, 3.5M citations, 74% related
Cluster analysis: 146.5K papers, 2.9M citations, 74% related
Performance
Metrics
No. of papers in the topic in previous years:
2021: 23
2020: 20
2019: 20
2018: 28
2017: 27
2016: 24