Topic: Data Corruption

About: Data Corruption is a research topic. Over its lifetime, 435 publications have been published within this topic, receiving 6,784 citations.


Papers
03 Aug 2000
TL;DR: This paper outlines the steps taken to reduce the processing time of a Reed-Solomon encoding and the insight into modern optimization techniques gained from the experience.
Abstract: Computers transfer data in a number of different ways. Whether data travels through a serial port, a parallel port, over a modem, over an Ethernet cable, or internally from a hard disk to memory, some of it will be lost. To compensate for that loss, numerous error detection and correction algorithms have been developed. One of the most common error correction codes is the Reed-Solomon code, which is a special subset of BCH (Bose-Chaudhuri-Hocquenghem) linear cyclic block codes. In the AURA project, an unmanned aircraft sends the data it collects back to Earth so it can be analyzed during flight and possible flight modifications can be made. To counter possible data corruption during transmission, the data is encoded using a multi-block Reed-Solomon implementation with a possibly shortened final block. In order to maximize the amount of data transmitted, it was necessary to reduce the computation time of a Reed-Solomon encoding to three percent of the processor's time. To achieve such a reduction, many code optimization techniques were employed. This paper outlines the steps taken to reduce the processing time of a Reed-Solomon encoding and the insight into modern optimization techniques gained from the experience.

6 citations
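
To make the encoding step concrete, here is a minimal sketch of systematic Reed-Solomon encoding over GF(2^8) in Python. It is illustrative only, not the AURA project's optimized implementation; the names (gf_mul, rs_encode), the primitive polynomial 0x11d, and the parameters are this sketch's assumptions. The table-driven log/antilog multiplication it uses is itself one of the classic optimizations in the spirit of the paper.

```python
# Minimal sketch of systematic Reed-Solomon encoding over GF(2^8).
# Build log/antilog tables once (primitive polynomial 0x11d); table-driven
# multiplication is far cheaper than bitwise carry-less multiplication.
GF_EXP = [0] * 512
GF_LOG = [0] * 256
x = 1
for i in range(255):
    GF_EXP[i] = x
    GF_LOG[x] = i
    x <<= 1
    if x & 0x100:
        x ^= 0x11d
for i in range(255, 512):
    GF_EXP[i] = GF_EXP[i - 255]

def gf_mul(a, b):
    if a == 0 or b == 0:
        return 0
    return GF_EXP[GF_LOG[a] + GF_LOG[b]]

def rs_generator_poly(nsym):
    """g(x) = (x + a^0)(x + a^1)...(x + a^(nsym-1)) over GF(2^8)."""
    g = [1]
    for i in range(nsym):
        # Multiply g(x) by (x + a^i); addition in GF(2^m) is XOR.
        nxt = [0] * (len(g) + 1)
        for j, coef in enumerate(g):
            nxt[j] ^= coef                          # coef * x
            nxt[j + 1] ^= gf_mul(coef, GF_EXP[i])   # coef * a^i
        g = nxt
    return g

def rs_encode(msg, nsym):
    """Append nsym parity bytes: remainder of msg(x) * x^nsym mod g(x)."""
    gen = rs_generator_poly(nsym)
    rem = [0] * nsym
    for byte in msg:
        feedback = byte ^ rem[0]
        rem = rem[1:] + [0]
        if feedback:
            for j in range(nsym):
                rem[j] ^= gf_mul(gen[j + 1], feedback)
    return bytes(msg) + bytes(rem)

codeword = rs_encode(b"telemetry frame", 8)  # 8 parity bytes correct up to 4 errors
```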

Proceedings ArticleDOI
11 Apr 2011
TL;DR: The paper describes the Proactive Checking Framework (PCF), a new framework that enables a database system to deal with data corruption automatically and proactively, and outlines a challenging research agenda to address the problem.
Abstract: The danger of production or backup data becoming corrupted is a problem that database administrators dread. This position paper aims to bring this problem to the attention of the database research community, which, surprisingly, has by and large overlooked this problem. We begin by pointing out the causes and consequences of data corruption. We then describe the Proactive Checking Framework (PCF), a new framework that enables a database system to deal with data corruption automatically and proactively. We use a prototype implementation of PCF to give deeper insights into the overall problem and to outline a challenging research agenda to address it.

6 citations
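
The proactive idea itself is simple to sketch (a simplification of the general approach, not PCF's actual architecture or API): instead of waiting for a query to stumble on a bad page, a background scrubber walks database pages on a schedule and verifies a checksum stored at flush time. The Page/scrub names and the CRC-32 choice below are this sketch's assumptions.

```python
# A minimal sketch of proactive corruption checking: verify every page's
# stored checksum before any query or backup trips on the corruption.
import zlib
from dataclasses import dataclass

@dataclass
class Page:
    page_id: int
    payload: bytes
    stored_crc: int  # checksum written when the page was last flushed

def scrub(pages, on_corrupt):
    """Proactively recompute and compare each page's checksum; report
    mismatches instead of waiting for a read to fail."""
    for page in pages:
        if zlib.crc32(page.payload) != page.stored_crc:
            on_corrupt(page.page_id)

# Example: one silently corrupted page is detected before anything reads it.
good = Page(1, b"row data", zlib.crc32(b"row data"))
bad = Page(2, b"row dXta", zlib.crc32(b"row data"))  # bit flip after flush
scrub([good, bad], on_corrupt=lambda pid: print(f"page {pid} corrupt"))
```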

01 Jan 2013
TL;DR: Whenever data corruption has been detected during the storage correctness verification, the scheme can almost guarantee the simultaneous localization of data errors, i.e., the identification of the misbehaving server(s).
Abstract: Cloud computing is the newest term for the long-dreamed vision of computing as a utility. The cloud provides convenient, on-demand network access to a centralized pool of configurable computing resources that can be rapidly deployed with great efficiency and minimal management overhead. Industry leaders and customers have wide-ranging expectations for cloud computing, in which security concerns remain a major aspect. Dealing with “single cloud” providers is becoming less popular with customers due to potential problems such as service availability failure and the possibility of malicious insiders in the single cloud. In recent years, there has been a move towards “multiclouds”, “interclouds” or “clouds-of-clouds”. The proposed design allows users to audit the cloud storage with very lightweight communication and computation cost. Our scheme achieves storage correctness insurance as well as data error localization: whenever data corruption is detected during the storage correctness verification, our scheme can almost guarantee the simultaneous localization of data errors, i.e., the identification of the misbehaving server(s).

6 citations
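
A heavily simplified sketch of challenge-based error localization follows; the names and protocol are illustrative stand-ins, not the paper's exact scheme. The user precomputes a short token per server over a random subset of blocks while the data is known-good; at audit time, any server whose recomputed token mismatches is localized as misbehaving.

```python
# Simplified error localization across replicated cloud servers.
import hashlib, secrets

def token(blocks, indices, nonce):
    """Digest over the challenged blocks; computed by the user at setup
    time and recomputed by each server at audit time."""
    h = hashlib.sha256(nonce)
    for i in indices:
        h.update(blocks[i])
    return h.digest()

def audit(servers, precomputed, indices, nonce):
    """Any server whose response differs from the precomputed token is
    localized as misbehaving."""
    return [sid for sid, blocks in servers.items()
            if token(blocks, indices, nonce) != precomputed[sid]]

# Setup: tokens computed while the replicas were known-good.
servers = {"s1": [b"a0", b"a1", b"a2"], "s2": [b"a0", b"a1", b"a2"]}
nonce, indices = secrets.token_bytes(8), [0, 2]
precomputed = {sid: token(blocks, indices, nonce)
               for sid, blocks in servers.items()}

servers["s2"][2] = b"aX"                            # silent corruption on s2
print(audit(servers, precomputed, indices, nonce))  # -> ['s2']
```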

Book ChapterDOI
25 Mar 2017
TL;DR: The security level on the cloud platform is increased: data stored on the server is more secure, and integrity is maintained throughout the cloud platform.
Abstract: Many studies have derived ways to achieve security on the server and to maintain data integrity across multiple servers by detecting misbehavior in a server. The data is secured on the server using encryption techniques and divided into fragments before being stored on the virtual cloud. This study takes a different perspective on storing data on a virtual cloud to maintain integrity: it stores fragments of the address of the data. Hence, the data remains secure; only the address of the data, divided into fragments, is transmitted, while the data itself is secured with encryption, making it difficult for a third party to decrypt and access it on the server. Thus, the security level on the cloud platform is increased, as the data stored on the server is more secure and integrity is maintained throughout the cloud platform.

6 citations
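
A minimal sketch of the fragment-and-address idea, under loose assumptions: it is not the study's exact protocol, it uses the third-party cryptography package's Fernet for the encryption step, and the store/retrieve helpers and round-robin placement are invented for illustration. The data is encrypted, the ciphertext is scattered in fragments across nodes, and only the small address map is needed to reassemble it.

```python
# Sketch: encrypt, fragment, scatter; transmit only the address map.
# Requires the `cryptography` package (pip install cryptography).
from cryptography.fernet import Fernet

def store(data, nodes, frag_size=16):
    """Encrypt, fragment, and scatter; return the key and the address map."""
    key = Fernet.generate_key()
    ciphertext = Fernet(key).encrypt(data)
    address_map = []  # (node index, slot) pairs: the only data sent later
    for frag_no, start in enumerate(range(0, len(ciphertext), frag_size)):
        node_idx = frag_no % len(nodes)  # round-robin placement
        nodes[node_idx].append(ciphertext[start:start + frag_size])
        address_map.append((node_idx, len(nodes[node_idx]) - 1))
    return key, address_map

def retrieve(key, address_map, nodes):
    """Follow the address fragments, reassemble, and decrypt."""
    ciphertext = b"".join(nodes[n][s] for n, s in address_map)
    return Fernet(key).decrypt(ciphertext)

nodes = [[], [], []]  # three storage nodes on the virtual cloud
key, addr = store(b"account ledger contents", nodes)
assert retrieve(key, addr, nodes) == b"account ledger contents"
```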

Journal ArticleDOI
01 Apr 2016
TL;DR: A probabilistic verification algorithm executes the routine verification, and a homomorphism-based method performs the challenge verification, decreasing the cost of verification and avoiding the limitation of traditional hash-based verification, which can only execute for a finite number of rounds.
Abstract: Remote data storage is a promising service deployed at large scale. Verification of data integrity is an important measure to protect data from corruption or loss in remote-storage environments. Current verification algorithms are mainly oriented toward the service providers, who may supply false verification metadata or force the user to issue fewer verification requests owing to the verification expense. To solve these issues, we propose a probabilistic verification algorithm to execute the routine verification and apply a homomorphism-based method to perform the challenge verification. The algorithm improves metadata generation and innovates metadata replacement to decrease the cost of verification and to avoid the limitation of traditional hash-based verification, which can only execute for a finite number of rounds. Theoretical analysis and simulation results demonstrate that our algorithm resists spoofing attacks while avoiding excessive computation overhead, at a lower cost in terms of verification-data storage, transmission, and computation.

6 citations
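
The homomorphic trick that frees verification from a finite set of precomputed hashes can be sketched in a few lines (an illustrative construction, not the paper's actual algorithm; the toy modulus and tag function are this sketch's assumptions): per-block tags t_i = g^{b_i} mod N let the verifier check a random linear combination of blocks with a single equation, so fresh challenges can be issued indefinitely.

```python
# Sketch of homomorphism-based challenge verification.
import secrets

N = (2**127 - 1) * (2**61 - 1)  # toy RSA-like modulus; use a real one in practice
g = 65537

def tag(block: bytes) -> int:
    """Homomorphic tag: t = g^b mod N, where b is the block as an integer."""
    return pow(g, int.from_bytes(block, "big"), N)

# Setup: the owner tags blocks once, keeps the small tags, outsources the data.
blocks = [b"block-0 data", b"block-1 data", b"block-2 data"]
tags = [tag(b) for b in blocks]

# Challenge: fresh random coefficients; the server answers with one aggregate.
coeffs = [secrets.randbelow(2**32) for _ in blocks]
mu = sum(c * int.from_bytes(b, "big") for c, b in zip(coeffs, blocks))

# Verify: prod(t_i^c_i) == g^mu (mod N) holds iff the server still has the data.
lhs = 1
for t, c in zip(tags, coeffs):
    lhs = (lhs * pow(t, c, N)) % N
assert lhs == pow(g, mu, N)
```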


Network Information
Related Topics (5)

Network packet: 159.7K papers, 2.2M citations (82% related)
Software: 130.5K papers, 2M citations (81% related)
Wireless sensor network: 142K papers, 2.4M citations (78% related)
Wireless network: 122.5K papers, 2.1M citations (77% related)
Cluster analysis: 146.5K papers, 2.9M citations (76% related)
Performance Metrics

No. of papers in the topic in previous years:

Year    Papers
2022    1
2021    21
2020    25
2019    27
2018    27
2017    27