
Showing papers on "Data Corruption published in 1992"


Patent
10 Jul 1992
TL;DR: In this article, a system and method is presented that provides a complete software implementation of a device driver capable of detecting an otherwise undetectable data corruption problem without hardware redesign or internal modification to an existing FDC (floppy disk controller).
Abstract: A system and method which provides a complete software implementation of a device driver that is capable of detecting an undetectable data corruption problem without hardware redesign or internal modification to an existing FDC. The approach consists of software DMA shadowing and the use of a software decoding network, which allows the implementation to require only a small amount of memory and to degrade system performance only minimally when floppy diskette write operations occur.
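The core idea, shadowing the DMA transfer in software and verifying the written data, might be sketched roughly as follows. This is a hypothetical Python illustration only: the device interface and the direct read-back compare are my assumptions, not the patent's actual driver internals.

```python
# Hypothetical sketch of software DMA "shadowing": keep a shadow copy of
# the buffer handed to the controller for a write, then read the sector
# back and compare, so corruption the FDC itself cannot detect is caught
# in software. The device interface is invented for illustration.

class FakeDevice:
    """Minimal in-memory stand-in for a floppy drive."""
    def __init__(self):
        self.sectors = {}

    def write(self, sector, buf):
        self.sectors[sector] = bytes(buf)

    def read(self, sector):
        return self.sectors[sector]

def write_sector_shadowed(device, sector, buffer):
    shadow = bytes(buffer)             # shadow copy taken before the transfer
    device.write(sector, buffer)       # normal write through the controller
    if device.read(sector) != shadow:  # verify against the shadow
        raise IOError(f"data corruption detected on sector {sector}")

dev = FakeDevice()
write_sector_shadowed(dev, 0, b"\x01\x02\x03")  # passes: data round-trips intact
```

A corrupting controller would trip the compare and raise, which is the point: the check lives entirely in the driver, with no hardware change.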

24 citations


Journal ArticleDOI
TL;DR: One used to think of a business disaster as the inferno, tempest or flood that destroyed buildings, equipment and records vital to the running of a company, but an increasing number of disasters are less physical and more subtle.
Abstract: One used to think of a business disaster as the inferno, tempest or flood that destroyed buildings, equipment and records vital to the running of a company. Of course, such disasters still happen, but an increasing number, indeed the majority, of disasters are less physical and more subtle. They include: system failure and data corruption caused by electromagnetic interference; loss of processing capability through police denying access to computer installations as a result of nearby terrorist activity; chemical contamination of hardware and magnetic media; illegal access (hacking) into computer systems resulting in corruption or loss of data; theft of personal computers; rodents; loss of supplied services; and loss of power.

7 citations


Journal ArticleDOI
TL;DR: The authors showed that statistical tests of the effectiveness of alternative monetary fiscal policies may be inconclusive because the policies themselves may influence the observed time-series in such a way as to cause regression coefficients to differ from their true values.
Abstract: It is shown that statistical tests of the effectiveness of alternative monetary fiscal policies may be inconclusive because the policies themselves may influence the observed time-series in such a way as to cause regression coefficients to differ from their true values. It is established that simple rules such as a constant rate of monetary growth have no “corrupting” effect on the data, but this is not true for sophisticated policies designed to go beyond naive model relationships.
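The mechanism can be illustrated with a toy simulation of my own construction (not the paper's model): with serially correlated disturbances, a policy instrument that reacts to last period's outcome becomes correlated with the current disturbance, so OLS no longer recovers the true coefficient, while a fixed rule leaves the estimate intact.

```python
# Toy illustration (not the paper's model) of OLS bias under a feedback
# policy. True relation: y_t = b*x_t + u_t with AR(1) errors
# u_t = rho*u_{t-1} + e_t. A "sophisticated" policy sets x_t from y_{t-1},
# making x_t correlate with u_t; a constant rule keeps x_t exogenous.
import random

def simulate(policy, n=200_000, b=1.0, rho=0.8, seed=0):
    rng = random.Random(seed)
    u = y = 0.0
    sxx = sxy = 0.0
    for _ in range(n):
        x = policy(y)                 # policy may react to last period's y
        u = rho * u + rng.gauss(0.0, 1.0)
        y = b * x + u
        sxx += x * x
        sxy += x * y
    return sxy / sxx                  # OLS estimate of b (no intercept)

b_const = simulate(lambda y: 1.0)        # constant rule: x fixed, OLS consistent
b_feedback = simulate(lambda y: -0.5 * y)  # feedback rule: estimate drifts far from b
```

Under these (arbitrary) parameter choices the constant-rule estimate stays near the true b while the feedback-rule estimate deviates substantially, mirroring the paper's contrast between naive rules and sophisticated policies.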

2 citations


Patent
09 Dec 1992
TL;DR: In this paper, a method is presented for sufficiently testing the functional integrity of a storage device during data preservation: test data is written to each storage element, read back after a delay long enough for corruption to occur, and any corruption is detected as a parity error.
Abstract: PURPOSE: To sufficiently test the functional integrity of a storage device, including faults that occur during data preservation. CONSTITUTION: When each storage element of a storage device (test object device) 1A is to be subjected to a test of its information-storing operation, test data 4 is first written into the storage area of the storage device 1A by a test part 2. The written test data is then read out from the storage device 1A after the lapse of a delay time sufficient for data corruption to occur in each storage element. Through this read, faults due to data corruption are detected as parity errors.

1 citation


Proceedings ArticleDOI
16 Dec 1992
TL;DR: An outstanding problem in the study of adaptive learning is overspecialization of the learning system and its consequent inability to handle new data correctly; a means of addressing this difficulty is described here.
Abstract: An outstanding problem in the study of adaptive learning is overspecialization of the learning system, and its consequent inability to handle new data correctly. A means of addressing this difficulty is described here. When used in conjunction with standard processes such as backpropagation, it identifies the level of corruption of the training sample, and thus provides a 'best fit' to the entire domain of interest, rather than to the training sample alone. This is accomplished by a combination of simulated annealing, bootstrap estimation, and analysis methods derived from statistical mechanics. Its advantage is that data need not be reserved for an independent test set, and thus all available samples are used. A modified generalization error, defined through a thermalization parameter on the training set, provides a measure of the sample space consistent with the network function. A criterion for optimal match between network and sample set is obtained from the requirement that generalization error and training error be consistent. Numerical results are presented for examples which illustrate several distinct forms of data corruption. A quantity analogous to the specific heat in thermodynamic systems is found to exhibit anomalies at values of training error near the onset of overtraining. © (1992) COPYRIGHT SPIE, The International Society for Optical Engineering.
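One ingredient the abstract mentions, bootstrap estimation so that no data need be reserved for an independent test set, can be illustrated generically. This is a standard out-of-bag sketch, not the paper's annealing-based procedure, and the toy models are my own: a model that memorizes its noisy training sample has zero training error but a large out-of-bag error, exposing the overspecialization.

```python
# Generic out-of-bag (bootstrap) estimate of generalization error: train on
# a bootstrap resample, score on the points left out of that resample, so
# all samples are used and no separate test set is needed. NOT the paper's
# statistical-mechanics procedure; a sketch of the bootstrap ingredient only.
import random
import statistics

def oob_error(xs, ys, fit, predict, n_boot=200, seed=1):
    rng = random.Random(seed)
    n = len(xs)
    sq_errs = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]   # bootstrap resample
        in_bag = set(idx)
        model = fit([xs[i] for i in idx], [ys[i] for i in idx])
        sq_errs += [(predict(model, xs[i]) - ys[i]) ** 2
                    for i in range(n) if i not in in_bag]  # out-of-bag points
    return statistics.mean(sq_errs)

# Pure-noise data: the best possible predictor is the constant 0.
rng = random.Random(0)
xs = list(range(30))
ys = [rng.gauss(0.0, 1.0) for _ in xs]

# 1-nearest-neighbour "memorizer": zero training error, poor generalization.
def fit_nn(X, Y):
    return dict(zip(X, Y))
def pred_nn(m, x):
    return m[min(m, key=lambda xi: abs(xi - x))]

# Single fitted constant: nonzero training error, better generalization.
def fit_mean(X, Y):
    return statistics.mean(Y)
def pred_mean(m, x):
    return m

e_memorizer = oob_error(xs, ys, fit_nn, pred_nn)
e_constant = oob_error(xs, ys, fit_mean, pred_mean)
# The memorizer's out-of-bag error exceeds the constant model's, flagging
# overspecialization even though its error on the training sample is zero.
```

The paper goes further, replacing the held-out comparison with a thermalization parameter and an annealing schedule, but the motivation, detecting overtraining without sacrificing data to a test set, is the same.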