
Showing papers on "Weak consistency published in 2008"


Proceedings ArticleDOI
10 May 2008
TL;DR: TReMer+ works by first constructing a merged model before checking consistency, which enables a flexible way of verifying global consistency properties that is not possible with other existing tools.
Abstract: We present TReMer+, a tool for consistency checking of distributed models (i.e., models developed by distributed teams). TReMer+ works by first constructing a merged model before checking consistency. This enables a flexible way of verifying global consistency properties that is not possible with other existing tools.

40 citations


Journal ArticleDOI
TL;DR: In this article, a new concept of convergence for random probability measures is introduced in order to discuss the weak consistency of the bootstrap distribution estimator, and some results on random probability measures and conditional distributions corresponding to the classical theorems are proved.

34 citations


Posted Content
Teemu Pennanen1
TL;DR: In this article, the authors derive dual characterizations of superhedging conditions for contingent claim processes in a market without a cash account in terms of stochastic discount factors that correspond to martingale densities.
Abstract: We study contingent claims in a discrete-time market model where trading costs are given by convex functions and portfolios are constrained by convex sets. In addition to classical frictionless markets and markets with transaction costs or bid-ask spreads, our framework covers markets with nonlinear illiquidity effects for large instantaneous trades. We derive dual characterizations of superhedging conditions for contingent claim processes in a market without a cash account. The characterizations are given in terms of stochastic discount factors that correspond to martingale densities in a market with a cash account. The dual representations are valid under a topological condition and a weak consistency condition reminiscent of the "law of one price", both of which are implied by the no arbitrage condition in the case of classical perfectly liquid market models. We give alternative sufficient conditions that apply to market models with nonlinear cost functions and portfolio constraints.

32 citations


Proceedings Article
27 Jun 2008
TL;DR: This paper demonstrates that when an adaptive heuristic is used, value deletions and domain wipeouts caused by individual constraints largely occur in clusters of consecutive or nearby constraint revisions, and develops a number of simple heuristics that allow the solver to switch dynamically between enforcing a weak and cheap local consistency and a strong but more expensive one, depending on the activity of individual constraints.
Abstract: Building adaptive constraint solvers is a major challenge in constraint programming. An important line of research towards this goal is concerned with ways to dynamically adapt the level of local consistency applied during search. A related problem that is receiving a lot of attention is the design of adaptive branching heuristics. The recently proposed adaptive variable ordering heuristics of Boussemart et al. use information derived from domain wipeouts to identify highly active constraints and focus search on hard parts of the problem, resulting in important savings in search effort. In this paper we show how information about domain wipeouts and value deletions gathered during search can be exploited, not only to perform variable selection, but also to dynamically adapt the level of constraint propagation achieved on the constraints of the problem. First we demonstrate that when an adaptive heuristic is used, value deletions and domain wipeouts caused by individual constraints largely occur in clusters of consecutive or nearby constraint revisions. Based on this observation, we develop a number of simple heuristics that allow us to switch dynamically between enforcing a weak and cheap local consistency and a strong but more expensive one, depending on the activity of individual constraints. As a case study we experiment with binary problems using AC as the weak consistency and maxRPC as the strong one. Results from various domains demonstrate the usefulness of the proposed heuristics.
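To make the switching idea concrete, here is a minimal sketch of the general pattern, not the authors' actual heuristics: the window size, the threshold, and the revise_ac/revise_maxrpc callbacks are hypothetical placeholders, and the only point illustrated is escalating a constraint's propagation level while it has recently caused deletions or wipeouts.

```python
from collections import deque

class AdaptivePropagator:
    """Generic sketch: pick a weak (AC-like) or strong (maxRPC-like)
    propagation level per constraint based on its recent activity."""

    def __init__(self, window=20, threshold=3):
        self.window = window        # how many recent revisions to remember
        self.threshold = threshold  # recent activity needed to escalate
        self.history = {}           # constraint -> deque of 0/1 activity flags

    def record(self, constraint, caused_deletion_or_wipeout):
        h = self.history.setdefault(constraint, deque(maxlen=self.window))
        h.append(1 if caused_deletion_or_wipeout else 0)

    def level(self, constraint):
        # Escalate to the strong consistency only while the constraint is "hot".
        recent = sum(self.history.get(constraint, ()))
        return "maxRPC" if recent >= self.threshold else "AC"

    def revise(self, constraint, revise_ac, revise_maxrpc):
        # revise_ac / revise_maxrpc are hypothetical callbacks that run the
        # revision and return whether it deleted values or wiped out a domain.
        fn = revise_maxrpc if self.level(constraint) == "maxRPC" else revise_ac
        active = fn(constraint)
        self.record(constraint, active)
        return active
```

In the paper's case study the weak and strong levels are AC and maxRPC; in the sketch they are just labels returned by level().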

31 citations


Journal ArticleDOI
TL;DR: In this paper, an extended formulation of the averaged periodogram method proposed in Robinson [1994] is derived, considering a certain homogeneous and isotropic behaviour of the spectral distribution in the low frequencies.

19 citations


Journal ArticleDOI
TL;DR: In this paper, the wavelet estimator of a nonparametric fixed-design regression function is considered when the errors are strictly stationary and associated random variables, and rates of uniform asymptotic normality for associated samples are given.

19 citations


Proceedings ArticleDOI
12 Jun 2008
TL;DR: Results show that IDEA is able to provide consistency guarantees adaptive to users' changing needs, and that it achieves low delay for inconsistency resolution and incurs small communication overhead.
Abstract: To maintain consistency, designers of replicated services have traditionally been forced to choose either strong consistency guarantees or none at all. Realizing that a continuum between strong and optimistic consistencies is semantically meaningful for a broad range of network services, previous research has proposed a continuous consistency model for replicated services to support the tradeoff between the guaranteed consistency level, performance and availability. However, to meet changing application needs and to make the model useful for interactive users of large-scale replicated services, the adaptability and the swiftness of inconsistency resolution are important and challenging. This paper presents IDEA (an infrastructure for detection-based adaptive consistency guarantees) for adaptive consistency guarantees of large-scale, Internet-based replicated services. The main functions enabled by IDEA include quick inconsistency detection and resolution, consistency adaptation and quantified consistency level guarantees. Through experimentation on PlanetLab, IDEA is evaluated from two aspects: its adaptive consistency guarantees and its performance for inconsistency resolution. Results show that IDEA is able to provide consistency guarantees adaptive to users' changing needs, and it achieves low delay for inconsistency resolution and incurs small communication overhead.

15 citations


Journal ArticleDOI
TL;DR: A performance model for the two-level architecture is developed, analytic results on the workload experienced by each server are obtained, and a novel technique is developed to achieve weak consistency among copies of the virtual environment at the various servers.
Abstract: A distributed virtual environment (DVE) is a shared virtual environment where multiple users at their workstations interact with each other over a network. Some of these systems may support a large number of users, for example, multiplayer online games. An important issue is how well the system scales as the number of users increases. In terms of scalability, a promising system architecture is a two-level hierarchical architecture. At the lower level, multiple servers are deployed; each server interacts with its assigned users. At the higher level, the servers ensure that their copies of the virtual environment are as consistent as possible. Although the two-level architecture is believed to have good properties with respect to scalability, not much is known about its performance characteristics. In this paper, we develop a performance model for the two-level architecture and obtain analytic results on the workload experienced by each server. Our results provide valuable insights into the scalability of the architecture. We also investigate the issue of consistency and develop a novel technique to achieve weak consistency among copies of the virtual environment at the various servers. Simulation results on the consistency/scalability trade-off are presented.
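The abstract does not detail the novel consistency technique; the sketch below only illustrates one generic way servers in such a two-level architecture might keep weakly consistent copies, using periodic exchange of timestamped updates with last-writer-wins resolution. The Server class and update format are hypothetical, not the authors' design.

```python
import time

class Server:
    """Generic last-writer-wins sketch of weak consistency between servers:
    a remote update to an entity is applied only if it is newer than the
    locally known version. Not the paper's technique."""

    def __init__(self, name):
        self.name = name
        self.state = {}    # entity_id -> (timestamp, value)
        self.outbox = []   # updates buffered for the next sync round

    def local_update(self, entity_id, value):
        stamped = (time.time(), value)
        self.state[entity_id] = stamped
        self.outbox.append((entity_id, *stamped))

    def sync_to(self, peer):
        # Periodically push buffered updates to a peer server.
        for entity_id, ts, value in self.outbox:
            peer.apply_remote(entity_id, ts, value)
        self.outbox.clear()

    def apply_remote(self, entity_id, ts, value):
        current = self.state.get(entity_id)
        if current is None or ts > current[0]:
            self.state[entity_id] = (ts, value)  # the newer update wins

# Usage sketch: two servers diverge briefly, then converge after a sync.
a, b = Server("A"), Server("B")
a.local_update("avatar-1", {"x": 10, "y": 3})
a.sync_to(b)
assert b.state["avatar-1"][1] == {"x": 10, "y": 3}
```

Between sync rounds the servers' copies may disagree, which is exactly the weak-consistency window the paper's consistency/scalability trade-off is about.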

14 citations


01 Jan 2008
TL;DR: An intermediate model is constructed that captures the physical transfers in a value model, thereby reducing the conceptual gap between value and process models and can be checked for consistency with a process model via the existing “reduced model” approach.
Abstract: Business value models and process models describe the same subject from a different perspective. Therefore, it is important that both models are consistent with each other. To do consistency checking, we construct an intermediate model that captures the physical transfers in a value model, thereby reducing the conceptual gap between value and process models. This physical transfer model can then be checked for consistency with a process model via the already existing “reduced model” approach. A reduced model is a simplified representation of a value model or process model, where common concepts represent aspects from both the value and process model. We illustrate our approach using a small case study in the electricity sector.

12 citations


Proceedings Article
Amitanand S. Aiyer1, Eric Anderson1, Xiaozhou Li1, Mehul A. Shah1, Jay J. Wylie1 
07 Dec 2008
TL;DR: This position paper motivates the need for and introduces the concept of consistability, a unified metric of consistency and availability, and describes the initial results of applying consistability reasoning to a key-value store the authors are developing and to other recent distributed systems.
Abstract: Current weak consistency semantics provide worst-case guarantees to clients. These guarantees fail to adequately describe systems that provide varying levels of consistency in the face of distinct failure modes, or that achieve better than worst-case guarantees during normal execution. The inability to make precise statements about consistency throughout a system's execution represents a lost opportunity to clearly understand client application requirements and to optimize systems and services appropriately. In this position paper, we motivate the need for and introduce the concept of consistability--a unified metric of consistency and availability. Consistability offers a means of describing, specifying, and discussing how much consistency a usually consistent system provides, and how often it does so. We describe our initial results of applying consistability reasoning to a key-value store we are developing and to other recent distributed systems. We also discuss the limitations of our consistability definition.
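As a rough, hypothetical illustration of quantifying "how much consistency, how often", the sketch below tallies the fraction of an execution trace spent at each consistency level. This is only one plausible reading of such a metric; the trace format and the consistability function are assumptions, not the definition given in the paper.

```python
def consistability(trace):
    """Fraction of total execution time spent at each consistency level.

    `trace` is a hypothetical list of (duration_seconds, level) pairs,
    e.g. [(3590, "linearizable"), (10, "eventual")].
    """
    total = sum(duration for duration, _ in trace)
    shares = {}
    for duration, level in trace:
        shares[level] = shares.get(level, 0.0) + duration / total
    return shares

# Example: a "usually consistent" service that is strongly consistent
# except during a short failure window.
print(consistability([(3590, "linearizable"), (10, "eventual")]))
# {'linearizable': 0.997..., 'eventual': 0.002...}
```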

12 citations


Proceedings ArticleDOI
01 Jun 2008
TL;DR: During the process of decision making, due to time limitations, knowledge structure and environmental factors, decision makers cannot provide precise numbers to express their preferences and show some hesitation in making decisions, so it is better to use interval-valued intuitionistic fuzzy numbers to express decision makers' preferences.
Abstract: During the process of decision making, due to time limitations, knowledge structure and environmental factors, decision makers cannot provide precise numbers to express their preferences and show some hesitation in making decisions. In this situation, it is better to use interval-valued intuitionistic fuzzy numbers to express decision makers' preferences. Therefore, the weak consistency of an interval-valued intuitionistic fuzzy matrix is defined, and some properties are proposed for judging the weak consistency of an interval-valued intuitionistic fuzzy matrix. The acceptability of a given interval-valued intuitionistic fuzzy matrix can then be measured, and advice can be given to a decision maker to adapt his/her preferences if the given matrix is not weakly consistent. Finally, illustrative examples are given.
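The abstract does not restate the formal condition. As a generic illustration only, weak consistency of a preference matrix is often understood as ordinal transitivity of the expressed preference directions; the condition below is stated for a plain fuzzy preference matrix with neutral value 0.5 and is not the paper's interval-valued intuitionistic fuzzy definition.

```latex
% Generic weak (ordinal) transitivity for a fuzzy preference matrix
% R = (r_{ij}) with neutral value 0.5; an illustration only, not the
% paper's interval-valued intuitionistic fuzzy definition.
\[
  r_{ij} \ge 0.5 \;\text{ and }\; r_{jk} \ge 0.5
  \;\Longrightarrow\; r_{ik} \ge 0.5
  \qquad \text{for all } i, j, k .
\]
```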

Journal Article
TL;DR: Some definitions concerning the consistency of the interval judgement matrix, including perfect consistency, strong consistency, consistency and satisfactory consistency, are proposed, and the relations between these definitions and the existing ones in some papers are discussed.
Abstract: The consistency problem of the interval judgment matrix is studied. Existing definitions of consistency in some papers are first summarized and analyzed, and the invalidity of these definitions is pointed out. Furthermore, some definitions concerning the consistency of the interval judgement matrix, including perfect consistency, strong consistency, consistency and satisfactory consistency, are proposed, and the relations between these definitions and the existing ones in some papers are discussed. The validity of these definitions is confirmed. Moreover, a method for testing consistency is given, and two examples show the effectiveness and potential practicality of this method.

Proceedings ArticleDOI
01 Dec 2008
TL;DR: It is proved that LC is strictly weaker than PRAM consistency and that, for data-race-free programs, the semantics of LC is equivalent to memory coherence; by introducing memory ordering semantics into LC judiciously, it is proved that the enhanced model is equivalent to SC for data-race-free programs.
Abstract: Location consistency (LC) is a weak memory consistency model which is defined entirely on partial order execution semantics of parallel programs. Compared with sequential consistency (SC), LC is scalable and provides ample theoretical parallelism. This makes LC an interesting memory model in the upcoming many-core parallel processing era. Previous work has pointed out that LC does not guarantee SC execution behavior for all data race free programs. In this paper, we compare the semantics of LC with PRAM consistency and memory coherence, and prove that LC is strictly weaker than PRAM consistency. For data race free programs, we prove that the semantics of LC is equivalent to memory coherence. In addition, by introducing memory ordering semantics into LC judiciously, we prove that the enhanced model is equivalent to SC for data race free programs. Finally, we discuss possible solutions for adding reasoning rules for LC-like weak memory models.

Proceedings ArticleDOI
15 Sep 2008
TL;DR: A specification for weak consistency in the context of a replicated service that tolerates Byzantine faults is proposed, using a real world application that can currently only tolerate crash faults to exemplify the need for such consistency guarantees.
Abstract: We propose a specification for weak consistency in the context of a replicated service that tolerates Byzantine faults. We define different levels of consistency for the replies that can be obtained from such a service---we use a real world application that can currently only tolerate crash faults to exemplify the need for such consistency guarantees.

Journal ArticleDOI
TL;DR: This paper generalizes CTPs to CTPPs by adding fuzzy preferences to the temporal constraints and by allowing fuzzy thresholds for the occurrence of some events.

Journal Article
TL;DR: A domain-based adaptive replica selection model named DARSM is proposed, in which component replicas are organized into a strong-consistency domain and a weak-consistency domain, and a scheme based on a consistency window is used to synchronize states between these domains.
Abstract: This paper proposes a domain-based adaptive replica selection model named DARSM, in which component replicas are organized into a strong-consistency domain and a weak-consistency domain, and a scheme based on a consistency window is used to synchronize states between these domains. Accordingly, a partition-weighted adaptive replica selection algorithm, PWARS, can be built on dynamic performance-metric information to select the appropriate replica set that satisfies specific QoS constraints. A consistency-window adaptive reconfiguration algorithm, CWAR, is presented to adapt to dynamic changes in the consistency constraints. In this algorithm, a probability model built on the current distribution of consistency constraints is used to dynamically adjust the configuration of the consistency window; as a result, the inconsistency of each replica is controlled adaptively. This approach has been implemented in OnceAS, and experimental results demonstrate that it can effectively enhance the performance of replica selection.

01 Jan 2008
TL;DR: The inconsistent data that exist under the attribute sets of relations with possible functional dependencies can be found effectively by applying the suggested rough-set-based consistency checking method.
Abstract: In order to deal with data inconsistency problems in relational databases, a new method based on rough set theory, which checks data consistency solely from the data, is presented. The inconsistent data that exist under the attribute sets of relations with possible functional dependencies can be found effectively by applying the suggested rough-set-based consistency checking method. The method is illustrated by examples.
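The abstract does not spell out the rough-set construction. As a minimal sketch of the underlying idea only, detecting rows that violate a candidate functional dependency X -> Y purely from the data, assuming rows represented as dictionaries:

```python
from collections import defaultdict

def fd_violations(rows, lhs, rhs):
    """Return rows that violate the candidate functional dependency lhs -> rhs.

    `rows` is a list of dicts, `lhs` and `rhs` are tuples of attribute names.
    A minimal, data-only consistency check; the paper's actual rough-set
    formulation is not reproduced here.
    """
    groups = defaultdict(list)
    for row in rows:
        groups[tuple(row[a] for a in lhs)].append(row)

    violating = []
    for same_lhs in groups.values():
        rhs_values = {tuple(r[a] for a in rhs) for r in same_lhs}
        if len(rhs_values) > 1:  # same X-values but different Y-values
            violating.extend(same_lhs)
    return violating

# Usage sketch with a hypothetical table.
rows = [
    {"zip": "02139", "city": "Cambridge"},
    {"zip": "02139", "city": "Boston"},     # violates zip -> city
    {"zip": "10001", "city": "New York"},
]
print(fd_violations(rows, ("zip",), ("city",)))
```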

Journal Article
TL;DR: The theory and an algorithm for finding the equivalent consistency matrix of a weakly consistent judgement matrix are designed; together with properties and theory for deleting redundant information from the judgement matrix and deriving a ranking method, they provide the necessary theoretical foundation.

Proceedings ArticleDOI
25 May 2008
TL;DR: The experiments show that the states from extensions of original information systems, having greater values of consistency factors, appear significantly more often in the future.
Abstract: The paper is devoted to the application of the extensions of information systems to state prediction problems. An information system (in Pawlak's sense) can describe the states of processes observed in a given concurrent system. If we extend a given information system by adding some new states which have not been observed yet, then we are interested in the degrees of consistency (also called consistency factors) of the added states with the knowledge of state coexistence included in the original information system. Such information can be helpful in predicting the possibility of given states appearing in the future in the examined system. The computed consistency factor lies between 0 and 1, with 0 for full inconsistency and 1 for full consistency. The experiments show that the states from extensions of original information systems that have greater values of consistency factors appear significantly more often in the future.

Journal ArticleDOI
TL;DR: It is shown that sequential consistency and linearizability cannot be distinguished by the timing conditions previously considered in the context of counting networks; thus, in contexts where these constraints apply, it is possible to rely on the stronger semantics oflinearizability, which simplifies proofs and enhances compositionality.
Abstract: We compare the impact of timing conditions on implementing sequentially consistent and linearizable counters using (uniform) counting networks in distributed systems. For counting problems in application domains which do not require linearizability but will run correctly if only sequential consistency is provided, the results of our investigation, and their potential payoffs, are threefold: First, we show that sequential consistency and linearizability cannot be distinguished by the timing conditions previously considered in the context of counting networks; thus, in contexts where these constraints apply, it is possible to rely on the stronger semantics of linearizability, which simplifies proofs and enhances compositionality. Second, we identify local timing conditions that support sequential consistency but not linearizability; thus, we suggest weaker, easily implementable timing conditions that are likely to be sufficient in many applications. Third, we show that any kind of synchronization that is too weak to support even sequential consistency may violate it significantly for some counting networks; hence, we identify timing conditions that are to be totally ruled out for specific applications that rely critically on either sequential consistency or linearizability.

Posted Content
TL;DR: In this paper, a nonparametric kernel regression for strongly mixing processes is proposed, where the regressor is nonnegative and the nonparametric regression is implemented using asymmetric kernels [Gamma (Chen, 2000b), Inverse Gaussian and Reciprocal Inverse Gaussian (Scaillet, 2004) kernels].
Abstract: This paper considers a nonstandard kernel regression for strongly mixing processes when the regressor is nonnegative. The nonparametric regression is implemented using asymmetric kernels [Gamma (Chen, 2000b), Inverse Gaussian and Reciprocal Inverse Gaussian (Scaillet, 2004) kernels] that possess some appealing properties such as lack of boundary bias and adaptability in the amount of smoothing. The paper investigates the asymptotic and finite-sample properties of the asymmetric kernel Nadaraya-Watson, local linear, and re-weighted Nadaraya-Watson estimators. Pointwise weak consistency, rates of convergence and asymptotic normality are established for each of these estimators. As an important economic application of asymmetric kernel regression estimators, we reexamine the problem of estimating scalar diffusion processes.
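As an illustration of the kind of estimator studied, here is a minimal sketch of a Nadaraya-Watson estimator with Chen's (2000) gamma kernel in its simplest form; the boundary-modified kernel variant, the local linear and re-weighted estimators, and the bandwidth choice are all omitted, and the synthetic data are purely for demonstration.

```python
import numpy as np
from scipy.stats import gamma

def gamma_kernel_nw(x, X, Y, b):
    """Gamma-kernel Nadaraya-Watson regression estimate at a point x >= 0.

    Weights are Gamma(shape = x/b + 1, scale = b) densities evaluated at the
    observations X, i.e. Chen's gamma kernel in its simplest form. The kernel
    adapts its shape to the evaluation point, avoiding boundary bias at x = 0.
    """
    weights = gamma.pdf(X, a=x / b + 1.0, scale=b)
    return np.sum(weights * Y) / np.sum(weights)

# Usage sketch with synthetic nonnegative data.
rng = np.random.default_rng(0)
X = rng.gamma(shape=2.0, scale=1.0, size=500)      # nonnegative regressor
Y = np.log1p(X) + 0.1 * rng.standard_normal(500)   # noisy regression function
grid = np.linspace(0.0, 5.0, 6)
print([round(gamma_kernel_nw(x, X, Y, b=0.2), 3) for x in grid])
```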

Journal Article
TL;DR: By constructing and analyzing a mediation judgment matrix, an approach for regulating consistency is proposed; the regulated judgment matrix has more satisfactory consistency and reflects the decision maker's wishes to the greatest degree.
Abstract: Based on the definition of the complete consistency of fuzzy judgment matrices, a criterion for identifying the degree of consistency of a fuzzy judgment matrix is presented, and a method for judging satisfactory consistency is given. Then, by constructing and analyzing a mediation judgment matrix, an approach for regulating consistency is proposed. The regulated judgment matrix has more satisfactory consistency and reflects the decision maker's wishes to the greatest degree. Finally, a numerical example is given to illustrate the use of the proposed approach.

Journal Article
TL;DR: In this paper, it is shown that the complete consistency of a judgment matrix based on some common reciprocal scales is incompatible, and the consistency proportions of the strictly consistent identity normal form across these scales are studied.
Abstract: This paper shows that the complete consistency of a judgment matrix based on some common reciprocal scales is incompatible, and the consistency proportions of the strictly consistent identity normal form across these scales are studied. The percentage of satisfactorily consistent matrices is then given, and it is argued that an exponential scale (a8=9) is more reasonable.


Proceedings ArticleDOI
10 Dec 2008
TL;DR: In this article, a system is built for violation detection under topo-semantic consistency with specific checking and correcting processes, using an SSRO-Tree structure for spatial data access.
Abstract: In any Active Geographical Information System, the reliability of any results of queries, analysis or reasoning depends on data quality (up-to-date data with properties of positional accuracy, consistency and so on). A system is built for violation detection under Topo-Semantic consistency with specific checking and correcting processes, using an SSRO-Tree structure for spatial data access. Results indicate that the developed Constraint Violation Detection (CVD) system is powerful when compared with popular conventional systems. Three kinds of errors are identified, which lead to three kinds of consistency, namely structural consistency, geometric consistency and Topo-Semantic consistency.

Journal ArticleDOI
TL;DR: In this article, the conditional distribution of the next outcome given the infinite past of a stationary process can be inferred from finite but growing segments of the past, and the question whether they are pointwise consistent is still open.
Abstract: The conditional distribution of the next outcome given the infinite past of a stationary process can be inferred from finite but growing segments of the past. Several schemes are known for constructing pointwise consistent estimates, but they all demand prohibitive amounts of input data. In this paper we consider real-valued time series and construct conditional distribution estimates that make much more efficient use of the input data. The estimates are consistent in a weak sense, and the question whether they are pointwise consistent is still open. For finite-alphabet processes one may rely on a universal data compression scheme like the Lempel-Ziv algorithm to construct conditional probability mass function estimates that are consistent in expected information divergence. Consistency in this strong sense cannot be attained in a universal sense for all stationary processes with values in an infinite alphabet, but weak consistency can. Some applications of the estimates to on-line forecasting, regression and classification are discussed.

Journal ArticleDOI
TL;DR: In this article, it was shown that weak consistency is not equivalent to negation-consistency or absolute consistency (i.e., non-triviality) in any logic included in positive contractionless intermediate logic LC plus the constructive negation of BKc1 and the (constructive) contraposition axioms.
Abstract: The logic BKc1 is the basic constructive logic for weak consistency (i.e., absence of the negation of a theorem) in the ternary relational semantics without a set of designated points. In this paper, a number of extensions of BKc1 with a propositional falsity constant are defined. It is also proved that weak consistency is not equivalent to negation-consistency or absolute consistency (i.e., non-triviality) in any logic included in positive contractionless intermediate logic LC plus the constructive negation of BKc1 and the (constructive) contraposition axioms.