Showing papers on "Weak consistency published in 1994"


Proceedings ArticleDOI
27 Jun 1994
TL;DR: This work examines the consistency problem for descriptions of trees based on remote dominance, and presents a consistency-checking algorithm which is polynomial in the number of nodes in the description, despite disjunctions inherent in the theory of trees.
Abstract: We examine the consistency problem for descriptions of trees based on remote dominance, and present a consistency-checking algorithm which is polynomial in the number of nodes in the description, despite disjunctions inherent in the theory of trees. The resulting algorithm allows for descriptions which go beyond sets of atomic formulas to allow certain types of disjunction and negation.
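For illustration only, a description based on dominance can be modeled as a set of ordered node pairs, and one simple necessary condition for consistency is that no two distinct nodes transitively dominate each other. The Python sketch below (with invented node names) checks exactly that; it is not the authors' algorithm, which runs in polynomial time while also handling the disjunction and negation mentioned above.

# Toy sketch: a necessary (not sufficient) check on a set of strict-dominance
# statements, each written as an ordered pair (ancestor, descendant).
def dominance_consistent(pairs):
    nodes = {n for pair in pairs for n in pair}
    dominates = {n: set() for n in nodes}
    for a, d in pairs:
        dominates[a].add(d)
    changed = True
    while changed:                      # transitive closure by propagation
        changed = False
        for a in nodes:
            reachable = set()
            for d in dominates[a]:
                reachable |= dominates[d]
            if not reachable <= dominates[a]:
                dominates[a] |= reachable
                changed = True
    # Strict dominance is irreflexive and antisymmetric: a cycle is inconsistent.
    return all(a not in dominates[a] for a in nodes)

print(dominance_consistent([("root", "vp"), ("vp", "v")]))                   # True
print(dominance_consistent([("root", "vp"), ("vp", "v"), ("v", "root")]))    # False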

35 citations


Book ChapterDOI
01 Jan 1994
TL;DR: This paper presents a new property called constraint tightness that can be used for characterizing the difficulty of problems formulated as constraint networks, and shows that when the constraints are tight they may require less preprocessing in order to guarantee a backtrack-free solution.
Abstract: Constraint networks are a simple representation and reasoning framework with diverse applications. In this paper, we present a new property called constraint tightness that can be used for characterizing the difficulty of problems formulated as constraint networks. Specifically, we show that when the constraints are tight they may require less preprocessing in order to guarantee a backtrack-free solution. This suggests, for example, that many instances of crossword puzzles are relatively easy while scheduling problems involving resource constraints are quite hard. Formally, we present a relationship between the tightness or restrictiveness of the constraints, and the level of local consistency sufficient to ensure global consistency, thus ensuring backtrack-freeness. Two definitions of local consistency are employed. The traditional variable-based notion leads to a condition involving the tightness of the constraints, the level of local consistency, and the arity of the constraints, while a new definition of relational consistency leads to a condition expressed in terms of tightness and local-consistency level, alone. New algorithms for enforcing relational consistency are introduced and analyzed.
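As a rough illustration of the tightness property used above (a hedged sketch; the paper's formal definitions and theorems are more general), a binary constraint given as a set of allowed value pairs is m-tight when every value of either variable is compatible with at most m values of the other variable. The Python sketch below computes the smallest such m for a single binary constraint:

# Illustrative only: smallest m such that the binary constraint is m-tight,
# i.e. no value has more than m supporting values on the other side.
from collections import defaultdict

def tightness(allowed_pairs):
    support_x = defaultdict(set)   # value of X -> compatible values of Y
    support_y = defaultdict(set)   # value of Y -> compatible values of X
    for x, y in allowed_pairs:
        support_x[x].add(y)
        support_y[y].add(x)
    supports = [len(s) for s in support_x.values()] + [len(s) for s in support_y.values()]
    return max(supports) if supports else 0

# A not-equal constraint over the domain {1, 2, 3} is 2-tight.
neq = [(a, b) for a in (1, 2, 3) for b in (1, 2, 3) if a != b]
print(tightness(neq))   # 2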

21 citations


Proceedings Article
05 Oct 1994
TL;DR: A new property called constraint looseness is presented and it is shown how it can be used to estimate the level of local consistency of a binary constraint network and an algorithm is developed that can sometimes find an ordering of the variables such that a network is backtrack-free.
Abstract: We present a new property called constraint looseness and show how it can be used to estimate the level of local consistency of a binary constraint network. Specifically, we present a relationship between the looseness of the constraints, the size of the domains, and the inherent level of local consistency of a constraint network. The results we present are useful in two ways. First, a common method for finding solutions to a constraint network is to first preprocess the network by enforcing local consistency conditions, and then perform a backtracking search. Here, our results can be used in deciding which low-order local consistency techniques will not change a given constraint network and thus are not useful for preprocessing the network. Second, much previous work has identified conditions for when a certain level of local consistency is sufficient to guarantee a network is backtrack-free. Here, our results can be used in deciding which local consistency conditions, if any, still need to be enforced to achieve the specified level of local consistency. As well, we use the looseness property to develop an algorithm that can sometimes find an ordering of the variables such that a network is backtrack-free.
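For the complementary looseness property (again a hedged sketch, not the paper's exact formulation), a binary constraint is m-loose when every value of either variable is compatible with at least m values of the other variable; the relationship above combines this m with the domain sizes to bound the level of local consistency the network already has. A sketch for a single constraint:

# Illustrative only: largest m such that the binary constraint is m-loose,
# i.e. every domain value has at least m supporting values on the other side.
from collections import defaultdict

def looseness(allowed_pairs, domain_x, domain_y):
    support_x = defaultdict(set)
    support_y = defaultdict(set)
    for x, y in allowed_pairs:
        support_x[x].add(y)
        support_y[y].add(x)
    supports = [len(support_x[x]) for x in domain_x] + [len(support_y[y]) for y in domain_y]
    return min(supports) if supports else 0

dom = (1, 2, 3, 4)
neq = [(a, b) for a in dom for b in dom if a != b]
print(looseness(neq, dom, dom))   # 3: every value keeps 3 supports in a 4-value domain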

16 citations


Journal ArticleDOI
TL;DR: Society and Space as discussed by the authors is a forum for social, economic, political, and cultural analyses, based in the understanding that economic theory and practice are also metaphorical and cultural performances, just as cultural practices are inseparable from economic ones.
Abstract: Change and consistency. Society and Space has a new look; the cover has been redesigned and the page layout has been altered. As a gesture towards embodying the intellect, we now publish both the first and last names of authors. As readers of Society and Space are no doubt alive to the politics of representation, I would like to underline the consistency of editorial ideals and thematic focus that persist despite changes in the look of the journal. First, we continue to invite a wide range of discussions around the theme of society and space, discussions that move across disciplines, perspectives, times, and places. Society and Space is a forum for social, economic, political, and cultural analyses, based in the understanding that economic theory and practice, for example, are also metaphorical and cultural performances, just as cultural practices are inseparable from economic ones. Second, we retain the ideal of opening the journal to argument and debate from diverse intellectual communities. Third, we work from an understanding that social theory has practical relevance. In Derek Gregory's terms (1993), we invite 'bloody theory', social analysis, and commentary that engage both empirical research and the immediacy of embodied everyday life. Fourth, our new look enacts a continuing willingness to experiment, within reasonable limits, with representational strategies and ways of writing. Noting these consistencies in thematic focus and editorial policy, let us take pleasure in change.

7 citations


11 Apr 1994
TL;DR: The Unify system explores scalable approaches for designing distributed multicomputers that support a shared memory paradigm; it employs highly efficient communication protocols to support new weak consistency sharing models and introduces the notion of spatial consistency together with a non-standard memory type called sequential segments.
Abstract: The Unify system is exploring scalable approaches for designing distributed multicomputers that support a shared memory paradigm. To achieve massive scalability, Unify employs highly efficient communication protocols to support new weak consistency sharing models. In particular, Unify introduces the notion of spatial consistency and a non-standard memory type called sequential segments. The combination of out-of-order spatial consistency and sequential segments increases concurrency, reduces the need for synchronization, and allows the use of highly efficient non-atomic multicast protocols. Our experience shows that there is a logical and intuitive mapping between sequential segments and a wide variety of parallel and distributed applications that require support for shared information. Moreover, the use of sequential segments results in simplified code and efficiency comparable to that of optimized message passing systems.

5 citations


Journal ArticleDOI
TL;DR: In this paper, a generalized cross validation (GCV) estimate for simultaneously estimating the weighting parameters and the smoothing parameters is developed; the estimate is named GCV-r, where r denotes the weighting parameter.
Abstract: The problem of using the “direct” variational methods in a statistical model that merges data from different sources with unknown relative weights is considered. To carry out this merging optimally, it is necessary to provide an estimate of the relative weights to be given to data from different sources. A new form of generalized cross validation (GCV) estimate for simultaneously estimating the weighting parameters and the smoothing parameters is developed here. We name this estimate GCV-r, where r represents the weighting parameter. We study the properties of the GCV-r estimators as well as the properties of the generalized maximum likelihood (GML-r) estimators proposed in Wahba, Johnson and Reames (1990). We prove the weak consistency and the asymptotic normality of all these estimators under a stochastic model. The convergence rates for these estimators are obtained under some conditions. Some simulation studies are carried out both to confirm the theoretical results and to compare different methods under...
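For orientation, the classical single-parameter GCV criterion (which the GCV-r estimate extends) chooses the smoothing parameter by minimizing a ratio of the residual sum of squares to a squared effective-degrees-of-freedom term; the GCV-r criterion of the paper additionally minimizes over the relative weighting parameter r, with the influence matrix then depending on both parameters. The display below shows only the standard form, not the paper's exact GCV-r formula:

$$
V(\lambda) \;=\; \frac{\tfrac{1}{n}\,\bigl\lVert \bigl(I - A(\lambda)\bigr)\,y \bigr\rVert^{2}}
{\Bigl[\tfrac{1}{n}\,\operatorname{tr}\bigl(I - A(\lambda)\bigr)\Bigr]^{2}},
\qquad
(\hat{\lambda},\,\hat{r}) \;=\; \operatorname*{arg\,min}_{\lambda,\,r}\; V(\lambda, r),
$$

where A(\lambda) is the influence ("hat") matrix of the variational fit and V(\lambda, r) denotes the analogous ratio built from the influence matrix A(\lambda, r) of the weighted fit.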

5 citations


Journal ArticleDOI
TL;DR: There was evidence that properties of a consistent underlying rule structure were made more salient when the mappings were consistent with users' expectations, and only under these circumstances were performance benefits observed.
Abstract: Consistency can be expressed in terms of minimal components of an interaction language. However, what is taken as a unit in describing stimulus events is crucial. A particular command set may be generated by very few rules (internal consistency) but should also map on to the users' expectations (higher level consistency, or compatibility). Sixty subjects took part in a simple computer game in order to explore the relationship between internal consistency, compatibility, and mode of learning. Internal consistency was found to be related to the subjects' ability to create an explicit model of the task, and compatibility was related to enhanced performance on the task. There was evidence that properties of a consistent underlying rule structure were made more salient when the mappings were consistent with users' expectations, and only under these circumstances were performance benefits observed.

4 citations


Journal ArticleDOI
TL;DR: A set of APL2 functions is presented for a new definition of consistency of pairwise comparisons that locate the source of inconsistency and can thus be used to improve relative judgements.
Abstract: A set of APL2 functions is presented for a new definition of consistency of pairwise comparisons. By calculating an inconsistency for each comparison, these functions locate the source of inconsistency and can thus be used to improve relative judgements.
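The abstract does not reproduce the new definition, but the idea of localizing inconsistency can be illustrated with a well-known triad-based measure (a hedged sketch in Python rather than APL2; the paper's definition and functions may differ): each comparison is checked against the indirect value obtained through a third alternative, and the triads that score worst point to the judgements to revisit.

# Illustrative only: triad inconsistency scores for a reciprocal pairwise
# comparison matrix A, where A[i][j] says how strongly alternative i is
# preferred to j and A[j][i] = 1 / A[i][j].
def triad_inconsistency(A):
    n = len(A)
    scores = {}
    for i in range(n):
        for j in range(i + 1, n):
            for k in range(j + 1, n):
                direct = A[i][k]
                indirect = A[i][j] * A[j][k]
                scores[(i, j, k)] = min(abs(1 - direct / indirect),
                                        abs(1 - indirect / direct))
    return scores

# Consistent judgements from weights (8, 4, 2, 1), except A[0][3] is misjudged as 2.
A = [[1,    2,    4,    2],
     [1/2,  1,    2,    4],
     [1/4,  1/2,  1,    2],
     [1/2,  1/4,  1/2,  1]]
print(triad_inconsistency(A))
# Only the triads containing the (0, 3) comparison score above 0,
# pointing at that judgement as the one to revise.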

2 citations



Proceedings ArticleDOI
19 Dec 1994
TL;DR: Stochastic timed Petri nets are developed to evaluate the relative performance of distributed shared memory models for scalable multithreaded multiprocessors; it is found that multithreading contributes more than half of the performance improvement, while the improvement from memory consistency models varies between 20% and 40% of the total performance gain.
Abstract: Stochastic timed Petri nets are developed to evaluate the relative performance of distributed shared memory models for scalable multithreaded multiprocessors. The shared memory models evaluated include the Sequential Consistency (SC), the Weak Consistency (WC), the Processor Consistency (PC) and the Release Consistency (RC) models. Under saturated conditions, we found that multithreading contributes more than 50% of the performance improvement, while the improvement from memory consistency models varies between 20% and 40% of the total performance gain. Our analytical results reveal that the SC model has the lowest performance. The PC model requires larger write buffers and may perform even worse than the SC model if a small buffer is used. The performance of the WC model depends heavily on the synchronization rate in user code. For a low synchronization rate, the WC model performs as well as the RC model. With sufficient multithreading and network bandwidth, the RC model shows the best performance among the four models.

1 citation


Proceedings ArticleDOI
S.F. Hummel
01 Jan 1994
TL;DR: A family of fault-tolerant algorithms for maintaining the consistency of cacheable data is described, which ensures the reliable delivery of operations on strongly consistent data.
Abstract: As the number of processors increases, so does communication latency and the probability of component failure. A technique that addresses these problems is data replication, which provides faster access and greater availability. Its drawback is that the replicas must be kept consistent. The author describes a family of fault-tolerant algorithms for maintaining the consistency of cacheable data. Processors specify the consistency requirements, strong or weak, of their shared data. Data are kept consistent by a fault-tolerant multicast protocol. The protocol ensures the reliable delivery of operations on strongly consistent data. Weakly consistent data updates are inserted as operation prologues. Depending on the consistency policy, the execution of an operation causes cached data to be either invalidated or updated.
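As a rough sketch of the dispatch described in the last two sentences (hypothetical names and structure, not the author's protocol or code), a replica that has an operation delivered by the reliable multicast layer either updates or invalidates its cached copy according to the per-datum consistency policy:

# Toy sketch of the invalidate-vs-update choice described above. The multicast
# layer, policy names, and operation format are invented for illustration;
# fault-tolerant, reliable delivery is assumed to be provided elsewhere.

INVALIDATE, UPDATE = "invalidate", "update"

class CacheReplica:
    def __init__(self, policy):
        self.policy = policy          # maps data key -> INVALIDATE or UPDATE
        self.cache = {}               # locally cached copies

    def on_deliver(self, op):
        """Called by the (assumed reliable) multicast layer for each operation."""
        key, new_value = op["key"], op["value"]
        if key not in self.cache:
            return                    # not cached here; nothing to do
        if self.policy.get(key) == UPDATE:
            self.cache[key] = new_value   # apply the write in place
        else:
            del self.cache[key]           # drop the stale copy; refetch on next read

replica = CacheReplica(policy={"x": UPDATE, "y": INVALIDATE})
replica.cache = {"x": 1, "y": 2}
replica.on_deliver({"key": "x", "value": 10})
replica.on_deliver({"key": "y", "value": 20})
print(replica.cache)   # {'x': 10} -- 'x' updated, 'y' invalidated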