scispace - formally typeset

Showing papers on "Weak consistency published in 2002"


Journal ArticleDOI
Haifeng Yu, Amin Vahdat
TL;DR: It is argued that the TACT consistency model can simultaneously achieve the often conflicting goals of generality and practicality by describing how a broad range of applications can express their consistency semantics using TACT and by demonstrating that application-independent algorithms can efficiently enforce target consistency levels.
Abstract: The tradeoffs between consistency, performance, and availability are well understood. Traditionally, however, designers of replicated systems have been forced to choose from either strong consistency guarantees or none at all. This paper explores the semantic space between traditional strong and optimistic consistency models for replicated services. We argue that an important class of applications can tolerate relaxed consistency, but benefit from bounding the maximum rate of inconsistent access in an application-specific manner. Thus, we develop a conit-based continuous consistency model to capture the consistency spectrum using three application-independent metrics, numerical error, order error, and staleness. We then present the design and implementation of TACT, a middleware layer that enforces arbitrary consistency bounds among replicas using these metrics. We argue that the TACT consistency model can simultaneously achieve the often conflicting goals of generality and practicality by describing how a broad range of applications can express their consistency semantics using TACT and by demonstrating that application-independent algorithms can efficiently enforce target consistency levels. Finally, we show that three replicated applications running across the Internet demonstrate significant semantic and performance benefits from using our framework.
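The numerical-error metric above can be illustrated with a toy sketch (the class, names, and push policy here are hypothetical, not TACT's actual implementation): a replica accepts writes optimistically but forces anti-entropy with its peers before the total weight of writes they have not seen could exceed the bound.

```python
class Replica:
    """Toy sketch of numerical-error bounding in the spirit of TACT
    (illustrative only; names and policy are hypothetical)."""

    def __init__(self, name, bound):
        self.name = name
        self.bound = bound        # max total weight of unseen writes tolerated
        self.value = 0.0          # local view of the replicated numeric datum
        self.pending = []         # local writes not yet propagated to peers

    def write(self, delta, peers):
        # Anti-entropy is forced *before* the pending weight could exceed
        # the bound, so peers never lag by more than `bound`.
        if sum(abs(d) for d in self.pending) + abs(delta) > self.bound:
            self.sync(peers)
        self.value += delta
        self.pending.append(delta)

    def sync(self, peers):
        # Push every pending local write to each peer, then forget them.
        for p in peers:
            for d in self.pending:
                p.value += d
        self.pending.clear()
```

With a bound of 5, a second write of weight 3 triggers synchronization first, so no peer's view ever diverges by more than the bound.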

321 citations


Journal ArticleDOI
TL;DR: In this paper, regression quantiles for α-mixing time series are estimated by inverting a weighted Nadaraya-Watson (WNW) estimator of the conditional distribution function. At both boundary and interior points, the WNW conditional distribution estimator not only preserves the bias, variance and, more importantly, the automatic good boundary behavior of the local linear "double-kernel" estimators introduced by Yu and Jones (1998, Journal of the American Statistical Association 93, 228-237), but also has the additional advantage of always being a distribution itself.
Abstract: In this paper we study nonparametric estimation of regression quantiles for time series data by inverting a weighted Nadaraya–Watson (WNW) estimator of conditional distribution function, which was first used by Hall, Wolff, and Yao (1999, Journal of the American Statistical Association 94, 154–163). First, under some regularity conditions, we establish the asymptotic normality and weak consistency of the WNW conditional distribution estimator for α-mixing time series at both boundary and interior points, and we show that the WNW conditional distribution estimator not only preserves the bias, variance, and, more important, automatic good boundary behavior properties of local linear “double-kernel” estimators introduced by Yu and Jones (1998, Journal of the American Statistical Association 93, 228–237), but also has the additional advantage of always being a distribution itself. Second, it is shown that under some regularity conditions, the WNW conditional quantile estimator is weakly consistent and normally distributed and that it inherits all good properties from the WNW conditional distribution estimator. A small simulation study is carried out to illustrate the performance of the estimates, and a real example is also used to demonstrate the methodology.
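For reference, the WNW conditional distribution estimator discussed above takes the following standard form (notation assumed: kernel K with bandwidth h, data (X_i, Y_i)):

```latex
\hat F(y \mid x)
  = \frac{\sum_{i=1}^{n} p_i(x)\, K_h(X_i - x)\, \mathbf{1}\{Y_i \le y\}}
         {\sum_{i=1}^{n} p_i(x)\, K_h(X_i - x)},
\qquad
\hat q_\alpha(x) = \inf\{\, y : \hat F(y \mid x) \ge \alpha \,\},
```

where the weights \(p_i(x)\) are nonnegative, sum to one, and satisfy the discrete moment condition \(\sum_i p_i(x)(X_i - x)K_h(X_i - x) = 0\). This moment condition is what gives the estimator the local-linear bias and boundary behavior, while the ratio form keeps \(\hat F(\cdot \mid x)\) a genuine distribution function.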

192 citations


Proceedings Article
07 Aug 2002
TL;DR: In this article, the authors present two natural generalizations of the faithfulness assumption in the context of structural equation models, under which the typical algorithms in the literature (in some cases with modifications) are uniformly consistent even when the time order is unknown.
Abstract: A fundamental question in causal inference is whether it is possible to reliably infer manipulation effects from observational data. There are a variety of senses of asymptotic reliability in the statistical literature, among which the most commonly discussed frequentist notions are pointwise consistency and uniform consistency (see, e.g. Bickel, Doksum [2001]). Uniform consistency is in general preferred to pointwise consistency because the former allows us to control the worst case error bounds with a finite sample size. In the sense of pointwise consistency, several reliable causal inference algorithms have been constructed under the Markov and Faithfulness assumptions [Pearl 2000, Spirtes et al. 2001]. In the sense of uniform consistency, however, reliable causal inference is impossible under the two assumptions when time order is unknown and/or latent confounders are present [Robins et al. 2000]. In this paper we present two natural generalizations of the Faithfulness assumption in the context of structural equation models, under which we show that the typical algorithms in the literature (in some cases with modifications) are uniformly consistent even when the time order is unknown. We also discuss the situation where latent confounders may be present and the sense in which the Faithfulness assumption is a limiting case of the stronger assumptions.

76 citations


Posted Content
TL;DR: In this paper, the authors study the computation of transient solutions of a class of piecewise-linear (PL) circuits, which can be seen as dynamical extensions of the PL modeling structure.
Abstract: In this brief, we will study the computation of transient solutions of a class of piecewise-linear (PL) circuits. The network models will be so-called linear complementarity systems, which can be seen as dynamical extensions of the PL modeling structure. In particular, the numerical simulation will be based on a time-stepping method using the well-known backward Euler scheme. It will be demonstrated, by means of an example, that this widely applied time-stepping method does not necessarily produce useful output for arbitrary linear dynamical systems with ideal diode characteristics. Next the consistency of the method will be proven for PL networks that can be realized by linear passive circuit elements and ideal diodes by showing that the approximations generated by the method converge to the true solution of the system in a suitable sense. To give such a consistency proof, a fundamental framework developed previously is indispensable as it proposes a precise definition of a "solution" of a linear complementarity system and provides conditions under which solutions exist and are unique.
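The failure mode and the fix can be pictured with a scalar toy example (not the paper's general setup): the state x is driven downward at unit rate while an ideal-diode complementarity condition 0 ≤ λ ⊥ x ≥ 0 keeps it nonnegative, and each backward Euler step solves a one-dimensional linear complementarity problem.

```python
def backward_euler_diode(x0=1.0, h=0.01, steps=200):
    """Toy time-stepping sketch for a scalar linear complementarity system
    (illustrative, not the paper's general algorithm): x' = -1 + lam, with
    0 <= lam  perp  x >= 0 acting as an ideal diode that clamps x at zero."""
    xs = []
    x = x0
    for _ in range(steps):
        # Backward Euler step: x_new = x + h*(-1 + lam_new), where lam_new
        # and x_new must satisfy the complementarity condition.
        x_free = x - h          # tentative step assuming lam_new = 0
        if x_free >= 0:
            x = x_free          # diode blocking: lam_new = 0 is consistent
        else:
            x = 0.0             # diode conducting: lam_new = (h - x)/h > 0
        xs.append(x)
    return xs
```

Starting from x(0) = 1, the approximation decays linearly to zero near t = 1 and then stays clamped at zero, matching the true solution of this simple system.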

68 citations


Patent
17 Sep 2002
TL;DR: In this paper, the authors present a method for emulation in a multiprocessor system, in which the host multiprocessing system supports a weak consistency model and the target multiprocessing system supports a strong consistency model.
Abstract: A method (and system) of emulation in a multiprocessor system, includes performing an emulation in which a host multiprocessing system of the multiprocessor system supports a weak consistency model, and the target multiprocessing system of the multiprocessor system supports a strong consistency model.

43 citations


Journal ArticleDOI
TL;DR: This work deals with the problem of the reliability of quantitative rankings and uses quasi-linear means for providing a more general approach to get priority and antipriority vectors.
Abstract: It is known that in the Analytic Hierarchy Process (A.H.P.) a scale of relative importance for alternatives is derived from a pairwise comparisons matrix A = (aij). Priority vectors are basically provided by the following methods: the right eigenvector method, the geometric mean method and the arithmetic mean method. Antipriority vectors can also be considered; they are built by both the left eigenvector method and mean procedures applied to the columns of A. When the matrix A is inconsistent, priority and antipriority vectors do not necessarily indicate the same ranking. We deal with the problem of the reliability of quantitative rankings, and we use quasi-linear means to provide a more general approach for obtaining priority and antipriority vectors.
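As a concrete illustration of one of the mean-based procedures mentioned above, here is a minimal sketch of the geometric mean method (applying the same procedure to the columns of A would yield an antipriority vector):

```python
import numpy as np

def priority_vector_gm(A):
    """Geometric mean method for an AHP pairwise-comparison matrix A:
    the i-th weight is the geometric mean of row i, normalized to sum
    to one.  (A sketch of one standard method, not the full A.H.P.)"""
    n = A.shape[1]
    g = np.prod(A, axis=1) ** (1.0 / n)   # row-wise geometric means
    return g / g.sum()                     # normalize to a priority vector
```

For a perfectly consistent matrix, where a_ij = w_i / w_j, the method recovers the underlying weights w exactly; the methods only disagree, and rankings can diverge, when A is inconsistent.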

38 citations


Journal ArticleDOI
TL;DR: In this article, a non-isotropic mixing spatial data process is introduced, and under such a spatial structure a nonparametric kernel method is suggested to estimate a spatial conditional regression.
Abstract: Data collected on the surface of the earth often has spatial interaction. In this paper, a non-isotropic mixing spatial data process is introduced, and under such a spatial structure a nonparametric kernel method is suggested to estimate a spatial conditional regression. Under mild regularity conditions, sufficient conditions are derived to ensure the weak consistency as well as the convergence rates of the kernel estimator. Of interest are the following: (1) all the conditions imposed on the mixing coefficient and the bandwidth are simple; (2) unlike in the time series setting, the bandwidth is found to depend on the dimension of the site in space as well; (3) for weak consistency, the mixing coefficient is allowed to be unsummable, and the sample size may tend to infinity in different manners along different directions in space; (4) to obtain an optimal convergence rate, however, faster decay of the mixing coefficient and of the growth of the sample size along each direction are required.
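The kernel estimator itself is the familiar Nadaraya-Watson form applied to spatial sites; the paper's non-isotropic mixing framework affects the asymptotics, not the formula. A generic sketch (Gaussian product kernel, one bandwidth per coordinate, all names ours):

```python
import numpy as np

def nw_spatial(sites, values, x0, h):
    """Nadaraya-Watson kernel estimate of the conditional mean at location
    x0, from observations `values` made at the rows of `sites`.
    A generic sketch, not the paper's estimator verbatim."""
    d2 = np.sum(((sites - x0) / h) ** 2, axis=1)   # squared scaled distances
    w = np.exp(-0.5 * d2)                          # Gaussian kernel weights
    return np.sum(w * values) / np.sum(w)          # weighted local average
```

On a regular grid with a linear response, the symmetric weights around an interior point reproduce the true regression value there, which is the interior-point behavior the consistency results formalize.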

38 citations


Book ChapterDOI
18 Aug 2002
TL;DR: Three levels of consistency are investigated for the evaluation of action descriptions: uniform consistency, consistency of formulas and regional consistency, which provide an intuitive resolution of problems of explanation conflicts and fluent dependency.
Abstract: As a contribution to the metatheory of reasoning about actions, we present some characteristics of the consistency of action theories. Three levels of consistency are investigated for the evaluation of action descriptions: uniform consistency, consistency of formulas and regional consistency. The first two provide an intuitive resolution of problems of explanation conflicts and fluent dependency. The concept of regional consistency provides for a measure of ramification. A highly expressive form of action descriptions, the normal form, is introduced to facilitate this analysis. The relative satisfiability of the situation calculus is generalized to accommodate non-deterministic effects and ramifications.

26 citations


Proceedings ArticleDOI
10 Aug 2002
TL;DR: This paper shows that, from an implementation point of view, sequential consistency can be considered as a form of lazy linearizability, a claim supported by a versatile protocol that can be tailored to implement either criterion.
Abstract: This paper shows that, from an implementation point of view, sequential consistency can be considered as a form of lazy linearizability. This claim is supported by a versatile protocol that can be tailored to implement either criterion.

24 citations


Proceedings ArticleDOI
17 Nov 2002
TL;DR: This work focuses on a hierarchical caching system based on the time-to-live (TTL) expiration mechanism and presents a basic model for such a system, extracting several important performance metrics from the perspective of the caching system and end users.
Abstract: Caching is an important means to scale up the growth of the Internet. Weak consistency is a major approach used in Web caching and has been deployed in various forms. The paper investigates some properties and performance issues of an expiration-based caching system. We focus on a hierarchical caching system based on the time-to-live (TTL) expiration mechanism and present a basic model for such a system. By analyzing the intrinsic TTL timing behavior in the basic model, we derive several important performance metrics from the perspective of the caching system and end users, respectively. Our results offer some basic understanding of a hierarchical caching system based on the weak consistency paradigm.
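A minimal sketch of one level of such a TTL-based hierarchy (illustrative only; the paper's analytical model is more detailed): each cache answers from a fresh entry, and on a miss or expiry forwards the request upward and re-stamps the response with its own TTL.

```python
import time

class TTLCache:
    """One level of a TTL-based caching hierarchy (a sketch, not the
    paper's model).  Misses and expired entries are forwarded to the
    parent cache, or to the origin server at the root."""

    def __init__(self, ttl, parent=None, origin=None):
        self.ttl, self.parent, self.origin = ttl, parent, origin
        self.store = {}                      # key -> (value, expiry time)

    def get(self, key, now=None):
        now = time.time() if now is None else now
        entry = self.store.get(key)
        if entry is not None and entry[1] > now:
            return entry[0]                  # fresh hit, served locally
        # Miss or expired: fetch from parent (or origin at the root),
        # then re-stamp with this cache's own TTL.
        value = (self.parent.get(key, now) if self.parent
                 else self.origin(key))
        self.store[key] = (value, now + self.ttl)
        return value
```

Note the weak-consistency behavior this produces: a leaf can keep serving a value the origin has already changed, for up to the sum of the TTLs along its path to the root, which is exactly the timing behavior the paper's metrics quantify.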

16 citations


Proceedings ArticleDOI
17 Nov 2002
TL;DR: This work distinguishes definitions of consistency and coherence for Web-like caching environments, and presents a novel Web protocol called "basis token consistency" (BTC), which allows compliant caches to guarantee strong consistency of content retrieved from supporting servers.
Abstract: With Web caching and cache-related services like CDNs and edge services playing an increasingly significant role in the modern Internet, the problem of the weak consistency and coherence provisions in current Web protocols is drawing increasing attention. Toward this end, we differentiate definitions of consistency and coherence for Web-like caching environments, and then present a novel Web protocol we call "basis token consistency" (BTC). This protocol allows compliant caches to guarantee strong consistency of content retrieved from supporting servers. We then compare the performance of BTC with the traditional TTL (time to live) algorithm under a range of synthetic workloads in order to illustrate its qualitative performance properties.


Journal ArticleDOI
TL;DR: Two alternative characterizations of the egalitarian solution are presented, based on converse consistency together with either weak consistency or population monotonicity, in addition to other standard axioms of weak Pareto optimality, symmetry, and continuity.

01 Jan 2002
TL;DR: This work proposes definitions of consistency and coherence for web-like caching environments, and presents a novel web protocol called “Basis Token Consistency” (BTC), which allows compliant caches to guarantee strongly consistent views of content retrieved from supporting servers.
Abstract: With web caching and cache-related services like CDNs and edge services playing an increasingly significant role in the modern internet, the problem of the weak consistency and coherence provisions in currently standardized web protocols is drawing greater attention. Toward this end, we propose definitions of consistency and coherence for web-like caching environments, and then present a novel web protocol we call “Basis Token Consistency” (BTC). This protocol allows compliant caches to guarantee strongly consistent views of content retrieved from supporting servers. We discuss this protocol and its extensions, and compare the performance of BTC with the traditional TTL (Time To Live) algorithm under a range of synthetic workloads.

Proceedings ArticleDOI
02 Jul 2002
TL;DR: In zones of higher demand, the consistent state is reached up to six times quicker than with a normal weak consistency algorithm, without incurring the additional costs of the strong consistency.
Abstract: In many Internet-scale replicated systems, not all replicas can be dealt with in the same way, since some will be in greater demand than others. In the case of weak consistency algorithms, we have observed that by updating first the replicas in greatest demand, a greater number of clients gain access to updated content in a shorter period of time. In this work we have investigated the benefits that can be obtained by prioritizing replicas with greater demand, and considerable improvements have been achieved. In zones of higher demand, the consistent state is reached up to six times quicker than with a normal weak consistency algorithm, without incurring the additional costs of strong consistency.
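The scheduling idea, as described, can be sketched as a simple policy (the batching and data structures here are hypothetical, since the abstract does not give the algorithm): propagate the update in rounds, most-demanded replicas first.

```python
def propagate(update, replicas, batch=2):
    """Demand-first update propagation (a sketch of the policy described
    above, not the paper's algorithm).  Replicas are updated in rounds of
    `batch`, highest demand first, so clients behind high-demand replicas
    see new content earliest."""
    order = sorted(replicas, key=lambda r: r["demand"], reverse=True)
    rounds = []
    for i in range(0, len(order), batch):
        chunk = order[i:i + batch]
        for r in chunk:
            r["version"] = update            # apply the update this round
        rounds.append([r["name"] for r in chunk])
    return rounds
```

All replicas still converge, as with any weak consistency scheme; only the order of convergence changes, which is where the reported speedup in high-demand zones comes from.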

01 Jan 2002
TL;DR: This lift generalizes Fitting’s multiple-valued semantics of modal logic in that the treatment of negation generalizes Heyting negation beyond fully specified and consistent models.
Abstract: Using a priority preorder on requirements or specifications, we lift established property-verification techniques of three-valued model checking from single to multiple viewpoints. This lift guarantees a maximal degree of autonomy and accountability to single views, automatically synthesizes single-analysis results for multiple-view consistency and assertion checking, allows the re-use of single-view technology (e.g. standard model checkers), and transforms many meta-results (e.g. soundness of abstraction) from the single-view to the multiple-view setting. We formulate assertion-consistency lattices as a proper denotational universe for this lift, show that their symmetric versions are DeMorgan lattices, and classify both structures through (idempotent) order-isomorphisms on (self-dual) priority preorders in the finite case. In particular, this lift generalizes Fitting’s multiple-valued semantics of modal logic in that our treatment of negation generalizes Heyting negation beyond fully specified and consistent models. We compare our approach to existing work on multiple-valued model checking.

Proceedings ArticleDOI
17 Dec 2002
TL;DR: A protocol that distinguishes between two classes of consistency (i.e. weak and strong) and treats them differently is proposed and shows that switching from strong to weak consistency reduces the number of aborts due to conflicting operations by almost half even with high read/write sharing.
Abstract: Object caching is often used to improve the performance of mobile applications, but the gain is often lessened by the additional load of maintaining consistency between an original object and its cached copy. This paper aims at reducing the consistency maintenance work and proposes a protocol that distinguishes between two classes of consistency (i.e. weak and strong) and treats them differently. Strong consistency is used for data that needs to be consistent all the time, whereas weak consistency is for cases when stale data can be tolerated or only specific updates are relevant to the application. Consistency is maintained by using strict and permissive read/write time locks that enable data sharing for a fixed time period and support concurrency control. A notification protocol for propagating updates to clients is also proposed. Performance tests have shown that switching from strong to weak consistency reduces the number of aborts due to conflicting operations by almost half even with high read/write sharing.

Katsutoshi Wakai
01 Jan 2002
TL;DR: In this article, the authors derive the conditions under which the no-trade theorem of Milgrom-Stokey (1982) holds for the economy of agents with multiple priors.
Abstract: We derive the conditions under which the no-trade theorem of Milgrom-Stokey (1982) holds for an economy of agents with multiple priors. We first investigate individual behavior and derive the conditions under which an agent's preference relations conform to dynamic consistency. The main result is the converse of the proposition in Sarin-Wakker (1989): dynamic consistency and sequential consistency (or consequentialism) imply the recursive structure of multiple priors. Next we examine the connection between ex-ante and ex-post knowledge, and study the conditions required for the no-trade theorem to hold. With perfect anticipation of ex-post knowledge, the no-trade theorem holds under the multiple priors model.

Proceedings ArticleDOI
02 Jul 2002
TL;DR: This method dynamically distributes the variation of numerical data to replicas according to their demands while achieving fairness among them and can achieve extremely high fairness while processing update transactions at the maximum rate.
Abstract: This paper proposes a replica control method based on a fairly assigned variation of numerical data that has weak consistency, for loosely coupled systems managed or used by different organizations. This method dynamically distributes the variation of numerical data to replicas according to their demands while achieving fairness among them. By assigning the variation, a replica can determine the possibility that processed update transactions will be aborted and can notify a client of the possibility even when network partitioning happens. In addition, fairly assigning the variation of data to replicas enables the disadvantage among replicas caused by asynchronous updates to be balanced among them. Fairness control for assigning the variation of data is performed by averaging the demands for variation requested by the replicas. Simulation showed that our system can achieve extremely high fairness while processing update transactions at the maximum rate.
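One way to read "fairly assigning the variation" is a max-min-style split of the total permissible variation, capping each replica at its stated demand; this is only a sketch of the fairness idea, not necessarily the paper's exact averaging rule.

```python
def fair_shares(total, demands):
    """Max-min-style fair split of a total allowed variation among replica
    demands (a sketch of the fairness idea, not the paper's exact rule).
    No replica gets more than it asked for, and capacity left over by
    modest demands is redistributed equally among the others."""
    shares = {r: 0.0 for r in demands}
    active = set(demands)
    remaining = total
    while active and remaining > 1e-12:
        per = remaining / len(active)        # equal split of what is left
        satisfied = {r for r in active if demands[r] - shares[r] <= per}
        if not satisfied:                    # nobody can be fully satisfied:
            for r in active:                 # hand out equal shares and stop
                shares[r] += per
            break
        for r in satisfied:                  # fully satisfy modest demands
            remaining -= demands[r] - shares[r]
            shares[r] = demands[r]
        active -= satisfied
    return shares
```

With a total variation of 10 and demands (2, 5, 9), the modest demand of 2 is met in full and the remaining 8 is split equally, so no replica is disadvantaged by another's large request.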

Journal ArticleDOI
TL;DR: It is found that most data conflicts between a read-only transaction (ROT) and update transactions are independent data conflicts, which indicates that using a separate algorithm to process ROTs apart from update transactions is more effective than relaxing the consistency requirement.

Book ChapterDOI
08 Sep 2002
TL;DR: In this paper, the authors propose optimistic concurrency control based on timestamp intervals for the broadcast environment, adopting weak consistency, the most appropriate correctness criterion for read-only transactions.
Abstract: The broadcast environment has an asymmetric communication aspect: the bandwidth available from server to clients is typically much greater than in the opposite direction. In addition, mobile computing systems generate mostly read-only transactions from mobile clients retrieving different types of information such as stock data, traffic information and news updates. Since previous concurrency control protocols do not consider these particular characteristics, their performance degrades when applied to the broadcast environment. In this paper, we propose optimistic concurrency control based on timestamp intervals for the broadcast environment. The following requirements are satisfied by adopting weak consistency, the most appropriate correctness criterion for read-only transactions: (1) mutual consistency of the data maintained by the server and read by clients; (2) currency of the data read by clients. We also adopt the Timestamp Interval protocol to check weak consistency efficiently. As a result, performance is improved by reducing the unnecessary aborts and restarts of read-only transactions that occur when global serializability is adopted.
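The timestamp-interval check for a read-only transaction can be sketched as interval intersection (illustrative only; the actual protocol carries more state): each item version is valid over a half-open timestamp interval, and the transaction must find a single instant consistent with everything it has read.

```python
INF = float("inf")

class ReadOnlyTxn:
    """Sketch of timestamp-interval validation for a read-only transaction
    (illustrative; the protocol in the paper has more machinery).  Each item
    version is valid over [wts, next_wts); the transaction's interval is the
    intersection of the validity intervals of everything it has read, and it
    must restart if that intersection becomes empty."""

    def __init__(self):
        self.lb, self.ub = 0.0, INF          # current interval [lb, ub)

    def read(self, version):
        wts, next_wts = version              # version valid on [wts, next_wts)
        self.lb = max(self.lb, wts)          # shrink the interval
        self.ub = min(self.ub, next_wts)
        return self.lb < self.ub             # False -> no snapshot: restart
```

Only transactions whose reads straddle incompatible versions restart, which is how unnecessary aborts under global serializability are avoided.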

Journal Article
TL;DR: This paper proposes optimistic concurrency control based on timestamp intervals for the broadcast environment, adopting weak consistency, the most appropriate correctness criterion for read-only transactions, and using the Timestamp Interval protocol to check weak consistency efficiently.
Abstract: The broadcast environment has an asymmetric communication aspect: the bandwidth available from server to clients is typically much greater than in the opposite direction. In addition, mobile computing systems generate mostly read-only transactions from mobile clients retrieving different types of information such as stock data, traffic information and news updates. Since previous concurrency control protocols do not consider these particular characteristics, their performance degrades when applied to the broadcast environment. In this paper, we propose optimistic concurrency control based on timestamp intervals for the broadcast environment. The following requirements are satisfied by adopting weak consistency, the most appropriate correctness criterion for read-only transactions: (1) mutual consistency of the data maintained by the server and read by clients; (2) currency of the data read by clients. We also adopt the Timestamp Interval protocol to check weak consistency efficiently. As a result, performance is improved by reducing the unnecessary aborts and restarts of read-only transactions that occur when global serializability is adopted.

Dissertation
01 Jan 2002
TL;DR: This thesis presents extensions to the standard dual encoding that can compactly represent the given CSP using an equivalent dual encode that contains all the original solutions to the CSP, using constraint coverings and shows how enforcing arc consistency in these constraint covering based encodings, strictly dominates enforcement of generalised arc consistency (GAC) on the primal non-binary encoding.
Abstract: Recently, a lot of research has gone into the development of techniques that can directly handle non-binary constraints. On one hand, many extensions to existing binary constraint satisfaction algorithms have been proposed that directly deal with the non-binary constraints. The other choice is to perform a structural transformation of the representation of the problem, so that the resulting problem is a binary CSP except that now the original constraints which were non-binary are replaced by binary compatibility constraints between relations. A lot of recent work has been concerned with comparing different levels of local consistency enforceable in the non-binary representation with the dual representation. The dual encoding can often enforce high levels of consistency when compared to the primal representations. In some cases the space complexity of the dual encodings is prohibitive and this is sometimes a drawback when trying to use these encodings. In this thesis we present extensions to the standard dual encoding that can compactly represent the given CSP using an equivalent dual encoding that contains all the original solutions to the CSP, using constraint coverings. We show how enforcing arc consistency in these constraint covering based encodings, strictly dominates enforcement of generalised arc consistency (GAC) on the primal non-binary encoding. The main theme of this thesis concerns the study of the dual encodings of non-binary CSPs and extensions to the standard forms of local consistency defined in the dual, that can enforce arbitrarily high levels of consistency. Our extensions to standard dual arc consistency allow us to construct compact encodings of the original problem. This directly addresses the space complexity issues related to the standard dual encoding, while still being able to enforce provably high levels of consistency. 
We present a complete theoretical evaluation of these different consistency techniques while also providing experimental results in support of these theoretical results.

Journal ArticleDOI
TL;DR: In this article, the asymptotic properties of the least squares estimators of the coefficients in a polynomial regression are investigated when the error is heteroskedastic and its variance increases over time.
Abstract: We investigate asymptotic properties of the least squares estimators of the coefficients in a polynomial regression when the error is heteroskedastic and its variance increases over time. First, we show the weak consistency of the estimators under some assumptions on the covariance structure of the error. Second, the consistency of an estimator of the variance of the error is studied. Finally, we examine the asymptotic efficiency of the estimators of the coefficients when the errors are uncorrelated.
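A small simulation in the spirit of this result (a sketch, not the paper's proof technique): fit a quadratic by least squares while the error variance grows over time, and observe that for large n the estimates still land close to the true coefficients.

```python
import numpy as np

def poly_ls_demo(n=20000, seed=0):
    """Least squares fit of y = b0 + b1*t + b2*t^2 with heteroskedastic
    errors whose standard deviation grows with t (illustrative simulation;
    the specific coefficients and noise scale are our own choices)."""
    rng = np.random.default_rng(seed)
    t = np.linspace(0.0, 1.0, n)
    beta = np.array([1.0, -2.0, 3.0])              # true coefficients
    sigma = 0.1 * (1.0 + 5.0 * t)                  # variance increasing in t
    y = beta[0] + beta[1] * t + beta[2] * t**2 + sigma * rng.standard_normal(n)
    X = np.column_stack([np.ones(n), t, t**2])     # polynomial design matrix
    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta, beta_hat
```

Despite the growing variance, the estimation error shrinks as n grows, which is the weak consistency the paper establishes under conditions on the error covariance structure.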

Proceedings Article
24 Jun 2002
TL;DR: The View-based Consistency model is recognized as the model that offers the maximum performance advantage among the Weak Sequential Consistency models.
Abstract: Based on time, processor, and data selection techniques, a group of Weak Sequential Consistency models have been proposed to improve the performance of Sequential Consistency for Distributed Shared Memory. These models can guarantee Sequential Consistency for data-race-free programs that are properly labelled. This paper reviews and discusses these models in terms of their use of the selection techniques. Their programmer interfaces are also discussed and compared. Among them the View-based Consistency model is recognized as the model that can offer the maximum performance advantage among the Weak Sequential Consistency models. An implementation of the View-based Consistency model has been given. Finally this paper suggests future directions of implementation effort for Distributed Shared Memory.

Posted Content
TL;DR: In this paper, the structural distribution function of the cell probabilities of a multinomial sample in situations where the number of cells is large is considered and an estimator based on grouping the cells and a kernel type estimator is presented.
Abstract: We consider estimation of the structural distribution function of the cell probabilities of a multinomial sample in situations where the number of cells is large. We review the performance of the natural estimator, an estimator based on grouping the cells, and a kernel type estimator. Inconsistency of the natural estimator and weak consistency of the other two estimators are derived by Poissonization and other, new, technical devices.

Journal Article
TL;DR: Timed consistency as discussed by the authors addresses how quickly the effects of an operation are perceived by the rest of the system, by requiring that if a write operation is executed at time t, it must be visible to the entire distributed system by time t+Δ.
Abstract: A distributed computation satisfies sequential consistency if it is possible to establish a legal ordering of all the operations such that the program order of each site in the distributed system is respected. However, sequential consistency does not necessarily consider the particular real-time instant at which each operation is executed. A timed consistency model addresses how quickly the effects of an operation are perceived by the rest of the system, by requiring that if a write operation is executed at time t, it must be visible to the entire distributed system by time t+Δ. Timed consistency generalizes several existing consistency criteria and it is well suited for applications where the action of one user must be seen by others in a timely fashion.
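The defining condition is easy to state operationally; the sketch below (hypothetical bookkeeping, not from the paper) checks a recorded trace of writes against the t + Δ visibility deadline.

```python
def timed_consistent(writes, delta):
    """Check a trace against the timed consistency condition described
    above: every write issued at time t must be applied at every replica
    by time t + delta.  `writes` maps a write id to a pair
    (t_issue, {replica: t_applied}); the trace shape is our own sketch."""
    for t_issue, applied in writes.values():
        for t_applied in applied.values():
            if t_applied > t_issue + delta:
                return False     # some replica saw the write too late
    return True
```

Setting delta to zero recovers an atomic-visibility requirement, while letting it grow relaxes toward ordering-only criteria, which is the sense in which timed consistency generalizes existing models.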