Topic

Data Corruption

About: Data Corruption is a research topic. Over the lifetime, 435 publications have been published within this topic receiving 6784 citations.


Papers
Patent
04 Feb 2013
TL;DR: In this paper, the authors propose a method that prevents data corruption when the forwarding-source region and the forwarding-destination region of the data overlap, even when forwarding with a burst-forwarding function.
Abstract: The purpose of the present invention is to prevent the occurrence of data corruption when the forwarding-source region and the forwarding-destination region of the data overlap, even when forwarding using a burst-forwarding function. Data read from the forwarding-source region is first written into a ring buffer, and the data written in the ring buffer is then written into the forwarding-destination region. When doing so, the reading of data from the ring buffer is controlled on the basis of the comparison between the number of wraparounds caused by writing data into the ring buffer and the number of wraparounds caused by reading it.
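The gating rule in the abstract lends itself to a short illustration. The following is a minimal sketch, not the patented implementation: the function name, the byte-at-a-time loop, and the ring size are assumptions made for clarity.

```python
def overlap_copy(mem: bytearray, src: int, dst: int, length: int, ring_size: int = 8) -> None:
    """Copy mem[src:src+length] to mem[dst:dst+length] via a ring buffer,
    gating ring reads on the relative wraparound counts of writer and reader."""
    ring = bytearray(ring_size)
    write_wraps = read_wraps = 0   # wraparound counts of the two ring indices
    wi = ri = 0                    # ring write / read positions
    produced = consumed = 0        # bytes staged into / drained from the ring

    while consumed < length:
        # Stage source bytes until the ring is full (the writer may be at most
        # one wraparound ahead of the reader at the same position).
        while produced < length and not (write_wraps == read_wraps + 1 and wi == ri):
            ring[wi] = mem[src + produced]
            produced += 1
            wi += 1
            if wi == ring_size:
                wi, write_wraps = 0, write_wraps + 1
        # Drain to the destination while the writer's (wraps, index) pair is
        # strictly ahead of the reader's, i.e. the ring still holds unread data.
        while write_wraps > read_wraps or wi > ri:
            mem[dst + consumed] = ring[ri]
            consumed += 1
            ri += 1
            if ri == ring_size:
                ri, read_wraps = 0, read_wraps + 1

# Example: slide a 10-byte block 3 positions toward the start of the same
# buffer, so the source and destination regions overlap.
buf = bytearray(b"XXXABCDEFGHIJ")
overlap_copy(buf, src=3, dst=0, length=10)
assert buf[:10] == b"ABCDEFGHIJ"
```

Both the "ring is full" test and the "data is available" test are expressed purely in terms of the wraparound counters and ring indices, which is the comparison the abstract describes.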

3 citations

Patent
Eric S. Goldsmith
29 Sep 2000
TL;DR: In this article, the authors propose a method for transmitting facsimile data over a wireless interface in a manner that minimizes the occurrence of timeouts caused by wireless-interface data corruption.
Abstract: The present invention transmits facsimile data over a wireless interface in a manner that minimizes the occurrence of timeouts caused by wireless-interface data corruption. Facsimile data to be transmitted over a wireless interface is collected and then transmitted using a wireless error-detecting and error-correcting protocol (76). If a transmission is unsuccessful, the facsimile data is re-transmitted until it is delivered successfully. If the delays caused by re-transmission reach a predetermined limit, highly correlated lines in the collected facsimile data are identified, and certain of these lines that have a minimal impact on the received facsimile data are deleted to prevent a facsimile protocol timeout (84).
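As a rough sketch of the retry-then-trim idea described above (not the patented method), the outline below retries a wireless send and, once the retransmission delay crosses a budget, drops scan lines that are nearly identical to their neighbours; send_over_wireless, the delay budget, and the similarity threshold are hypothetical stand-ins.

```python
import time
from difflib import SequenceMatcher

def transmit_fax(lines, send_over_wireless, delay_budget_s=10.0, min_similarity=0.98):
    """Send fax scan lines with retries; trim highly correlated lines if the
    retransmission delay threatens a facsimile-protocol timeout."""
    start = time.monotonic()
    while True:
        if send_over_wireless(lines):          # error-detecting/correcting send
            return True                        # delivered intact
        if time.monotonic() - start > delay_budget_s:
            # Identify lines nearly identical to the previously kept line and
            # drop them: their loss has minimal impact on the received page.
            kept = [lines[0]]
            for line in lines[1:]:
                if SequenceMatcher(None, line, kept[-1]).ratio() < min_similarity:
                    kept.append(line)
            if len(kept) == len(lines):
                return False                   # nothing left to trim; give up
            lines = kept                       # smaller payload on the next retry
```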

3 citations

Posted Content
TL;DR: This paper proposes the Fast Integrity VERification (FIVER) algorithm, which overlaps the checksum computation and data transfer operations of files to minimize the cost of integrity verification, and implements FIVER-Hybrid, which mimics the disk access pattern of the sequential integrity-verification approach.
Abstract: The amount of data generated by scientific and commercial applications is growing at an ever-increasing pace. This data is often moved between geographically distributed sites for purposes such as collaboration and backup, which has led to a significant increase in data transfer rates. This surge, combined with the proliferation of scientific applications that cannot tolerate data corruption, has driven the development of enhanced integrity verification techniques. End-to-end integrity verification minimizes the likelihood of silent data corruption by comparing checksums of files at the source and destination servers using secure hash algorithms such as MD5 and SHA1. However, it imposes a significant performance penalty due to the overhead of checksum computation. In this paper, we propose the Fast Integrity VERification (FIVER) algorithm, which overlaps the checksum computation and data transfer operations of files to minimize the cost of integrity verification. Extensive experiments show that FIVER brings the cost down from the roughly 60% incurred by state-of-the-art solutions to below 10% by executing transfer and checksum operations concurrently and letting them share file I/O. We also implemented FIVER-Hybrid, which mimics the disk access pattern of the sequential integrity-verification approach in order to capture data corruption that may occur during file write operations and that FIVER may miss. Results show that FIVER-Hybrid reduces execution time by 20% compared to the sequential approach without compromising the reliability of integrity verification.
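The core idea, overlapping checksum computation with the transfer so both consume the same file reads, can be sketched as follows. This illustrates the general technique rather than the authors' FIVER implementation; send_block and the block size are hypothetical stand-ins for the transfer tool.

```python
import hashlib
import queue
import threading

def transfer_with_inline_checksum(path, send_block, block_size=4 << 20):
    """Read the file once; every block is both sent and hashed, so checksum
    computation overlaps the transfer instead of re-reading the file later."""
    digest = hashlib.md5()                 # the abstract mentions MD5/SHA1
    blocks = queue.Queue(maxsize=8)        # bounded queue keeps memory flat

    def hasher():
        while True:
            block = blocks.get()
            if block is None:              # sentinel: no more data
                break
            digest.update(block)

    worker = threading.Thread(target=hasher)
    worker.start()
    with open(path, "rb") as f:
        while True:
            block = f.read(block_size)     # a single read feeds both consumers
            if not block:
                break
            blocks.put(block)              # checksum runs in the worker thread...
            send_block(block)              # ...while the main thread transfers
    blocks.put(None)
    worker.join()
    return digest.hexdigest()
```

The caller would compare the returned digest with one computed independently at the destination to detect silent corruption.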

3 citations

Proceedings ArticleDOI
01 Oct 2018
TL;DR: The paper discusses a data-integrity methodology that is an alternative to MACs and ICVs and is based on the novel concept of ‘implicit integrity’, and it proposes practical constructions that can be used in communication systems to support implicit integrity at low cost.
Abstract: We address the problem of detecting data corruption in computer and device communications without generating, transmitting or verifying integrity metadata. Such metadata typically hold mathematical summaries of the content being transmitted, such as checksums, Integrity Check Values (ICVs) or Message Authentication Codes (MACs), and are costly to generate and transmit. In the paper we discuss a data integrity methodology which is an alternative to MACs or ICVs and is based on a novel concept of ‘implicit integrity’. Implicit integrity supports the detection of corruption based on the observation that regular unencrypted user data typically exhibit patterns, such as repeated bytes, repeated words, etc. When encrypted content becomes corrupted and is decrypted, it no longer exhibits such patterns. It is the presence or absence of patterns in the decrypted content that denotes whether the content has been modified. In the paper we summarize some of our findings, including discovered entropy properties of server and client data, security bounds associated with implicit integrity, and proposals for constructions that are practical and can be used in communication systems, supporting implicit integrity at low cost.
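As an illustration of the pattern test the abstract describes, the sketch below accepts a decrypted block only if it shows enough byte- or word-level repetition; the thresholds are illustrative and are not the security bounds derived in the paper.

```python
from collections import Counter
import os

def looks_intact(block: bytes, min_byte_repeats: int = 4, min_word_repeats: int = 2) -> bool:
    """Heuristic pattern check: structured plaintext usually repeats bytes or
    4-byte words, while a block decrypted from corrupted ciphertext looks
    uniformly random and rarely does."""
    byte_counts = Counter(block)
    word_counts = Counter(block[i:i + 4] for i in range(0, len(block) - 3, 4))
    max_byte = max(byte_counts.values())
    repeated_words = sum(c for c in word_counts.values() if c > 1)
    return max_byte >= min_byte_repeats or repeated_words >= min_word_repeats

# Structured content passes; a random block (what corrupted ciphertext
# decrypts to) almost always fails the test.
print(looks_intact(b"GET /index.html HTTP/1.1\r\nHost: example.com\r\n" + b"\x00" * 20))  # True
print(looks_intact(os.urandom(64)))  # False with high probability
```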

3 citations

Journal ArticleDOI
TL;DR: One major function of computer security is to ensure the availability of an IT system; preventing major failures avoids the unavailability of critical resources as well as potential information leakage and data corruption.

3 citations


Network Information

Related Topics (5)
Network packet: 159.7K papers, 2.2M citations, 82% related
Software: 130.5K papers, 2M citations, 81% related
Wireless sensor network: 142K papers, 2.4M citations, 78% related
Wireless network: 122.5K papers, 2.1M citations, 77% related
Cluster analysis: 146.5K papers, 2.9M citations, 76% related
Performance Metrics

No. of papers in the topic in previous years:
Year    Papers
2022    1
2021    21
2020    25
2019    27
2018    27
2017    27