
Showing papers on "Data compression published in 1969"


Journal ArticleDOI
01 Jan 1969
TL;DR: A method is developed for optimizing a code for data compression by run-length encoding, and the performance of this code (13.20 compression factor) is compared with the theoretically attainable compression based on the information content of the probability distribution of run lengths.
Abstract: A method is developed for optimizing a code for data compression by run-length encoding. The performance of this code (13.20 compression factor) is then compared with the theoretically attainable compression based on the information content of the probability distribution of run lengths.
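To make the run-length idea concrete, the following minimal Python sketch encodes a binary sequence by run lengths with a naive fixed-width code and compares the achieved compression factor with the bound implied by the entropy of the empirical run-length distribution. The field width and the sample data are assumptions for illustration, not the paper's optimized code.

# Minimal sketch: run-length encode a binary sequence and compare the
# achieved compression factor with the entropy of the run-length
# distribution (the kind of theoretical bound the paper compares against).
# The fixed-width run code and the sample data are illustrative assumptions.
from collections import Counter
from math import log2

def run_lengths(bits):
    """Return the list of run lengths in a 0/1 sequence."""
    runs, count = [], 1
    for prev, cur in zip(bits, bits[1:]):
        if cur == prev:
            count += 1
        else:
            runs.append(count)
            count = 1
    runs.append(count)
    return runs

def rle_stats(bits, code_bits_per_run=8):
    runs = run_lengths(bits)
    # Naive code: each run length stored in a fixed-width field.
    coded_bits = len(runs) * code_bits_per_run
    compression_factor = len(bits) / coded_bits
    # Entropy of the empirical run-length distribution, in bits per run.
    freq = Counter(runs)
    total = len(runs)
    entropy = -sum((n / total) * log2(n / total) for n in freq.values())
    # Best attainable factor if each run cost exactly its entropy.
    bound = len(bits) / (total * entropy) if entropy > 0 else float("inf")
    return compression_factor, bound

if __name__ == "__main__":
    sample = [0] * 200 + [1] * 5 + [0] * 230 + [1] * 2 + [0] * 150
    achieved, bound = rle_stats(sample)
    print(f"achieved compression factor: {achieved:.2f}")
    print(f"entropy-based upper bound:   {bound:.2f}")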

36 citations


Patent
15 Sep 1969
TL;DR: In this patent, a high-speed, multistage compressor-decompressor system for processing arbitrary bit strings by reversibly removing redundant information is presented; the information to be compressed is arranged in strings of bytes, and any information defining the removal of redundant information from a string is kept together with the string.
Abstract: A high-speed, multistage, compressor-decompressor system for processing arbitrary bit strings by reversibly removing redundant information. Alphanumeric information is processed by Type 1 compression, which involves removing patterns of contiguous bytes and replacing each removed pattern by decompression information which takes considerably less storage space, and Type 2 compression, which involves removing individual redundant bytes and constructing a bit map identifying the location of the removed bytes. Numerical information is processed by a compression technique involving truncation, recursive differencing, sequence removal, packing, and then utilizing the Type 1 and Type 2 compression which are used in conjunction with alphanumeric information. The information which is to be compressed is arranged in strings of bytes, and any information defining removal of redundant information from a string is kept together with the string. As a result, each string is self-defined in the sense that it contains all information needed to decompress that string.
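As an illustration of the Type 2 idea (removing individual redundant bytes and recording their positions in a bit map so the string stays self-defined), here is a minimal Python sketch. The choice of a single pad byte as the redundant value and the bit-map layout are assumptions, not the patent's exact format.

# Minimal sketch of "Type 2"-style compression: drop individual redundant
# bytes (assumed here to be a single pad value, 0x00) and keep a bit map
# marking where bytes were removed, so the process is reversible and the
# compressed string carries its own decompression information.

PAD = 0x00

def compress_type2(data: bytes):
    bitmap = bytearray((len(data) + 7) // 8)
    kept = bytearray()
    for i, b in enumerate(data):
        if b == PAD:
            bitmap[i // 8] |= 1 << (i % 8)   # mark removed position
        else:
            kept.append(b)
    return len(data), bytes(bitmap), bytes(kept)

def decompress_type2(length: int, bitmap: bytes, kept: bytes) -> bytes:
    out = bytearray()
    it = iter(kept)
    for i in range(length):
        if bitmap[i // 8] & (1 << (i % 8)):
            out.append(PAD)                  # reinsert removed byte
        else:
            out.append(next(it))
    return bytes(out)

if __name__ == "__main__":
    original = b"AB\x00\x00\x00CD\x00EF"
    meta = compress_type2(original)
    assert decompress_type2(*meta) == original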

27 citations


01 Dec 1969
TL;DR: The bond energy algorithm was found to be the most generally useful and versatile of the three algorithms for treating certain problems of multivariate analysis, while the moment ordering algorithm is an efficient technique for uncovering and displaying a univariate relationship inherent in the data.
Abstract: This research paper presents the results of a study conducted to develop algorithms for ordering and organizing data that can be presented in two-dimensional matrix form. The only restriction imposed on the analysis was that the rows and columns of the raw input data matrices could only be reordered, thus preventing the creation of artificial coefficients or the loss of essential input information. The study develops methods to extract latent data patterns, groupings, and structural relationships which are not apparent from the raw matrix data. Three distinct algorithms were developed and presented in detail. The first, the bond energy algorithm, is capable of identifying and displaying natural groups and clusters that occur in complex data matrices. The second, the moment ordering algorithm, identifies the single dominant relationship in an array of data and reorders the rows and columns of the array to produce a ranking under this dominant relationship. The moment compression algorithm is designed to identify natural groups and clusters of entities by factoring the data relationship matrix into a number of pieces. The authors concluded that the bond energy algorithm proved to be the most generally useful and versatile of the three algorithms for treating certain problems of multivariate analysis, while the moment ordering algorithm is an efficient technique for uncovering and displaying a univariate relationship inherent in the data. It is a fast and direct method for uncovering the principal axis of a data structure. The efficiency of the algorithm was found to be in direct proportion to its ultimate success in identifying a principal axis.
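For readers unfamiliar with the bond energy idea, the following minimal Python sketch shows a greedy version of it: columns, and then rows, are inserted one at a time into the position that maximizes the bond (the sum of products between adjacent entries), which tends to pull natural groups together. The greedy insertion order and the toy matrix are assumptions for illustration, not the report's exact procedure.

# Greedy bond-energy-style reordering: place each column (then each row)
# where it adds the most bond energy to its neighbors, so that similar
# columns and rows end up adjacent and clusters become visible.

def bond(vec_a, vec_b):
    """Bond strength between two adjacent columns (or rows)."""
    return sum(a * b for a, b in zip(vec_a, vec_b))

def order_vectors(vectors):
    """Insert each vector into the position that maximizes added bond energy."""
    placed = [vectors[0]]
    for vec in vectors[1:]:
        best_pos, best_gain = 0, float("-inf")
        for pos in range(len(placed) + 1):
            left = bond(placed[pos - 1], vec) if pos > 0 else 0
            right = bond(vec, placed[pos]) if pos < len(placed) else 0
            if left + right > best_gain:
                best_pos, best_gain = pos, left + right
        placed.insert(best_pos, vec)
    return placed

def bond_energy_reorder(matrix):
    # Reorder columns, then apply the same procedure to the rows.
    cols = order_vectors([list(c) for c in zip(*matrix)])
    rows = order_vectors([list(r) for r in zip(*cols)])
    return rows

if __name__ == "__main__":
    raw = [
        [1, 0, 1, 0],
        [0, 1, 0, 1],
        [1, 0, 1, 0],
        [0, 1, 0, 1],
    ]
    for row in bond_energy_reorder(raw):
        print(row)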

19 citations


Patent
14 Jan 1969

17 citations


Journal ArticleDOI
TL;DR: A simple manual method of plotting a flat contour map is described, with a few examples included, to help researchers generate new insights in data compression.

8 citations


15 Jun 1969
TL;DR: Information-preserving data compression systems are presented, with a coding algorithm developed for noiseless channel conditions.
Abstract: Information-preserving data compression systems with a coding algorithm developed for noiseless channel conditions
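The report itself is not reproduced here, so as a stand-in example of an information-preserving (lossless) code for a noiseless channel, the sketch below implements a small Huffman coder in Python: variable-length codewords are assigned from symbol frequencies, and decoding recovers the input exactly. This is a classic instance of the general technique, not the specific algorithm the report develops.

# Minimal Huffman coder: a lossless (information-preserving) prefix code
# built from observed symbol frequencies, suitable for a noiseless channel.
import heapq
from collections import Counter

def huffman_code(symbols):
    """Build a prefix code (symbol -> bit string) from observed frequencies."""
    freq = Counter(symbols)
    heap = [[n, i, {s: ""}] for i, (s, n) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                       # degenerate single-symbol case
        return {next(iter(freq)): "0"}
    while len(heap) > 1:
        lo = heapq.heappop(heap)
        hi = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in lo[2].items()}
        merged.update({s: "1" + c for s, c in hi[2].items()})
        heapq.heappush(heap, [lo[0] + hi[0], lo[1], merged])
    return heap[0][2]

def encode(symbols, code):
    return "".join(code[s] for s in symbols)

def decode(bits, code):
    inverse, out, cur = {v: k for k, v in code.items()}, [], ""
    for b in bits:
        cur += b
        if cur in inverse:
            out.append(inverse[cur])
            cur = ""
    return out

if __name__ == "__main__":
    data = list("aaaabbbccd")
    code = huffman_code(data)
    bits = encode(data, code)
    assert decode(bits, code) == data        # the information is preserved
    print(code, len(bits), "bits")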

4 citations


Journal ArticleDOI
Dieter Seitzer1, F. Closs, P. Stucki
TL;DR: A system is described that employs the idle periods of one picture to introduce information from other pictures, integrates over several TV frames, and is capable of multiplexing 15 high-resolution black-and-white TV channels.
Abstract: A system is described that is based on structural and perceptual redundancy. It employs the idle periods of one picture to introduce information from other pictures and integrates over several TV frames. The system is capable of multiplexing 15 high-resolution black-and-white TV channels.
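The sharing idea can be illustrated with a toy Python simulation: several sources produce a variable amount of active data per frame, and the idle capacity left by quiet sources on a fixed-rate link is used to drain the backlog of busy ones over several frames. The channel count, rates, and random activity model below are assumptions for illustration, not the parameters of the described system.

# Toy simulation: a fixed-rate link is shared by several picture sources;
# slots left idle by quiet sources carry the backlog of busy ones, and the
# backlog is worked off over several frames.
import random

CHANNELS = 15
SLOTS_PER_FRAME_PER_CHANNEL = 100        # nominal per-channel share of the link
LINK_CAPACITY = CHANNELS * SLOTS_PER_FRAME_PER_CHANNEL

def simulate(frames=20, seed=1):
    random.seed(seed)
    backlog = [0] * CHANNELS
    for frame in range(frames):
        # Each channel generates between 0 and 2x its nominal share this frame.
        for ch in range(CHANNELS):
            backlog[ch] += random.randint(0, 2 * SLOTS_PER_FRAME_PER_CHANNEL)
        # Fill the shared link: quiet channels take little, leaving idle
        # capacity that busier channels then use.
        capacity = LINK_CAPACITY
        for ch in sorted(range(CHANNELS), key=lambda c: backlog[c]):
            sent = min(backlog[ch], capacity)
            backlog[ch] -= sent
            capacity -= sent
        print(f"frame {frame:2d}: backlog carried to next frame = {sum(backlog)}")

if __name__ == "__main__":
    simulate()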

4 citations


01 Mar 1969
TL;DR: The results of a survey of pictorial data-compression techniques are summarized in this report, motivated by a study of half-time graphics communication over voice grade lines.
Abstract: The results of a survey of pictorial data-compression techniques are summarized in this report. The survey was motivated by a study of half-time graphics communication over voice grade lines. The principal compression techniques surveyed include the following: the optimization of the parameters of ordinary pulse coded modulation, pulse coded modulation using added pseudo-random noise, differential pulse coded modulation, predictive differential pulse coded modulation, run-length encoding, brightness edge detection and the use of 'synthetic highs,' brightness contour detection and encoding, area encoding, picture compression in the Fourier domain.
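One of the surveyed techniques, differential pulse code modulation (DPCM), is easy to show in miniature: each sample is predicted from the previously reconstructed sample, and only the quantized prediction error is coded. The uniform quantizer step and the sample scan line in the Python sketch below are assumptions for illustration, not parameters from the surveyed systems.

# Minimal DPCM sketch: code the quantized difference between each sample
# and the prediction (the previously reconstructed sample).

def dpcm_encode(samples, step=4):
    codes, prediction = [], 0
    for s in samples:
        error = s - prediction
        q = round(error / step)              # quantized prediction error
        codes.append(q)
        prediction += q * step               # track the decoder's reconstruction
    return codes

def dpcm_decode(codes, step=4):
    out, prediction = [], 0
    for q in codes:
        prediction += q * step
        out.append(prediction)
    return out

if __name__ == "__main__":
    scanline = [10, 12, 13, 13, 40, 42, 41, 90, 91, 90]
    codes = dpcm_encode(scanline)
    recon = dpcm_decode(codes)
    # Small prediction-error codes replace full-range samples; reconstruction
    # error is at most half a quantizer step per sample.
    print(codes)
    print(recon)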

3 citations


Journal ArticleDOI
TL;DR: Data compression and transmission technique for real time system to monitor manned space missions, noting Apollo telemetry data.
Abstract: Data compression and transmission technique for real time system to monitor manned space missions, noting Apollo telemetry data
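The entry gives no algorithmic detail, so purely as a stand-in illustration, the Python sketch below shows a zero-order predictor with a fixed aperture, a technique commonly used for telemetry data compression in that era: a sample is transmitted only when it leaves a tolerance band around the last transmitted value. The aperture and the sample stream are assumptions, and this is not claimed to be the cited paper's method.

# Zero-order predictor with fixed aperture: transmit a sample only when it
# departs from the last transmitted value by more than the tolerance, and
# hold the last value on reconstruction.

def zero_order_compress(samples, aperture=2):
    """Keep (index, value) pairs only when the value leaves the aperture band."""
    kept = [(0, samples[0])]
    reference = samples[0]
    for i, s in enumerate(samples[1:], start=1):
        if abs(s - reference) > aperture:
            kept.append((i, s))
            reference = s
    return kept

def zero_order_reconstruct(kept, length):
    """Hold the last transmitted value until the next kept sample."""
    out, value, nxt = [], kept[0][1], 1
    for i in range(length):
        if nxt < len(kept) and kept[nxt][0] == i:
            value = kept[nxt][1]
            nxt += 1
        out.append(value)
    return out

if __name__ == "__main__":
    stream = [50, 50, 51, 50, 49, 55, 56, 56, 70, 71, 70, 70]
    kept = zero_order_compress(stream)
    print(f"kept {len(kept)} of {len(stream)} samples:", kept)
    print(zero_order_reconstruct(kept, len(stream)))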

1 citation