
Showing papers on "Formal language published in 2005"


Journal ArticleDOI
TL;DR: This paper shows that every efficient algorithm for the smallest grammar problem has approximation ratio at least 8569/8568 unless P=NP, and bounds the approximation ratios of several of the best known grammar-based compression algorithms, including LZ78, BISECTION, SEQUENTIAL, LONGEST MATCH, GREEDY, and RE-PAIR.
Abstract: This paper addresses the smallest grammar problem: What is the smallest context-free grammar that generates exactly one given string σ? This is a natural question about a fundamental object connected to many fields such as data compression, Kolmogorov complexity, pattern identification, and addition chains. Due to the problem's inherent complexity, our objective is to find an approximation algorithm which finds a small grammar for the input string. We focus attention on the approximation ratio of the algorithm (and, implicitly, its worst-case behavior) to establish provable performance guarantees and to address shortcomings in the classical measure of redundancy in the literature. Our first results concern the hardness of approximating the smallest grammar problem. Most notably, we show that every efficient algorithm for the smallest grammar problem has approximation ratio at least 8569/8568 unless P=NP. We then bound approximation ratios for several of the best known grammar-based compression algorithms, including LZ78, BISECTION, SEQUENTIAL, LONGEST MATCH, GREEDY, and RE-PAIR. Among these, the best upper bound we show is O(n^(1/2)). We finish by presenting two novel algorithms with exponentially better ratios of O(log^3 n) and O(log(n/m*)), where m* is the size of the smallest grammar for that input. The latter algorithm highlights a connection between grammar-based compression and LZ77.
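
To make the grammar-based compression setting concrete, here is a minimal Python sketch in the spirit of RE-PAIR (an illustration only, not the paper's implementation or the exact variants it analyzes; the stopping rule and symbol handling are simplified): the most frequent adjacent pair of symbols is repeatedly replaced by a fresh nonterminal, yielding a straight-line grammar for the input string.

```python
# RE-PAIR-style sketch: repeatedly replace the most frequent adjacent
# pair of symbols with a fresh nonterminal, producing a straight-line
# grammar whose size approximates the smallest grammar for the string.
from collections import Counter

def repair(s: str):
    seq = list(s)                     # working sequence (start rule body)
    rules = {}                        # nonterminal -> (left, right)
    next_id = 0
    while True:
        pairs = Counter(zip(seq, seq[1:]))
        if not pairs:
            break
        pair, freq = pairs.most_common(1)[0]
        if freq < 2:
            break                     # no pair repeats: grammar is final
        nt = f"R{next_id}"
        next_id += 1
        rules[nt] = pair
        out, i = [], 0
        while i < len(seq):           # left-to-right, non-overlapping replace
            if i + 1 < len(seq) and (seq[i], seq[i + 1]) == pair:
                out.append(nt)
                i += 2
            else:
                out.append(seq[i])
                i += 1
        seq = out
    return seq, rules

start, rules = repair("abcabcabcabc")
# grammar size = len(start) + sum of rule body lengths
print(start, rules)
```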

457 citations


Journal ArticleDOI
TL;DR: An approach for the efficient solution of motion-planning problems for time-invariant dynamical control systems with symmetries, such as mobile robots and autonomous vehicles, under a variety of differential and algebraic constraints on the state and on the control input.
Abstract: In this paper, we introduce an approach for the efficient solution of motion-planning problems for time-invariant dynamical control systems with symmetries, such as mobile robots and autonomous vehicles, under a variety of differential and algebraic constraints on the state and on the control inputs. Motion plans are described as the concatenation of a number of well-defined motion primitives, selected from a finite library. Rules for the concatenation of primitives are given in the form of a regular language, defined through a finite-state machine called a Maneuver Automaton. We analyze the reachability properties of the language, and present algorithms for the solution of a class of motion-planning problems. In particular, it is shown that the solution of steering problems for nonlinear dynamical systems with symmetries and invariant constraints can be reduced to the solution of a sequence of kinematic inversion problems. A detailed example of the application of the proposed approach to motion planning for a small aerobatic helicopter is presented.
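
The Maneuver Automaton can be pictured as an ordinary finite-state machine whose edge labels are motion primitives, so that the admissible concatenations of primitives form a regular language. The toy Python sketch below uses invented primitive and state names (hover, cruise, and so on) purely for illustration; it is not the helicopter model from the paper.

```python
# Toy Maneuver Automaton: states are trim trajectories, edge labels are
# maneuvers, and the admissible primitive sequences form a regular language.
TRANSITIONS = {
    ("hover", "accelerate"): "cruise",
    ("cruise", "turn_left"): "cruise",
    ("cruise", "turn_right"): "cruise",
    ("cruise", "decelerate"): "hover",
}

def is_valid_plan(start, maneuvers):
    """Check that the concatenation of maneuvers is admissible."""
    state = start
    for m in maneuvers:
        if (state, m) not in TRANSITIONS:
            return False
        state = TRANSITIONS[(state, m)]
    return True

print(is_valid_plan("hover", ["accelerate", "turn_left", "decelerate"]))  # True
print(is_valid_plan("hover", ["turn_left"]))                              # False
```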

450 citations


Proceedings Article
30 Jul 2005
TL;DR: This paper introduces and illustrates BLOG, a formal language for defining probability models over worlds with unknown objects and identity uncertainty, and introduces a probabilistic form of Skolemization for handling evidence.
Abstract: This paper introduces and illustrates BLOG, a formal language for defining probability models over worlds with unknown objects and identity uncertainty. BLOG unifies and extends several existing approaches. Subject to certain acyclicity constraints, every BLOG model specifies a unique probability distribution over first-order model structures that can contain varying and unbounded numbers of objects. Furthermore, complete inference algorithms exist for a large fragment of the language. We also introduce a probabilistic form of Skolemization for handling evidence.
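
As rough intuition for "probability models over worlds with unknown objects and identity uncertainty", the following plain-Python generative sketch (deliberately not BLOG syntax; the distributions and parameters are invented) first samples how many objects exist and then samples observations that do not reveal which object produced them.

```python
# Generative sketch of a world with an unknown number of objects:
# the object count is itself random, and each observation is produced
# by an unknown (latent) source, i.e. there is identity uncertainty.
import random

def sample_world(rate=3.0, noise=0.5):
    # roughly geometric prior on the object count (BLOG models often use
    # e.g. a Poisson prior; this is just an illustration)
    n_sources = max(1, int(random.expovariate(1.0 / rate)))
    sources = [random.gauss(0.0, 10.0) for _ in range(n_sources)]  # latent
    observations = []
    for _ in range(2 * n_sources):
        origin = random.randrange(n_sources)      # hidden: identity uncertainty
        observations.append(random.gauss(sources[origin], noise))
    return n_sources, observations

print(sample_world())
```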

398 citations


Journal ArticleDOI
TL;DR: The goal of this paper is to introduce a number of papers related to grammatical inference: some are essential and should constitute a common background for research in the area, whereas others are specialized in particular problems or techniques but can be of great help on specific tasks.

275 citations


Proceedings Article
09 Jul 2005
TL;DR: This paper presents a method for inducing transformation rules that map natural-language sentences into a formal query or command language and shows that this method performs overall better and faster than previous approaches in both domains.
Abstract: This paper presents a method for inducing transformation rules that map natural-language sentences into a formal query or command language. The approach assumes a formal grammar for the target representation language and learns transformation rules that exploit the non-terminal symbols in this grammar. The learned transformation rules incrementally map a natural-language sentence or its syntactic parse tree into a parse tree for the target formal language. Experimental results are presented for two corpora: one which maps English instructions into an existing formal coaching language for simulated RoboCup soccer agents, and another which maps English U.S.-geography questions into a database query language. We show that our method performs overall better and faster than previous approaches in both domains.
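
The flavor of transformation rules that map English into a formal query language can be illustrated with a couple of hand-written, regex-based rules. The paper learns such rules automatically from the grammar of the target language; the patterns and the query syntax below are invented for the example.

```python
# Hand-written transformation rules mapping English geography questions
# into a nested formal query (illustrative only, not the learned rules).
import re

RULES = [
    (r"what is the capital of (.+)\?", r"answer(capital(\1))"),
    (r"what states border (.+)\?",     r"answer(state(borders(\1)))"),
]

def translate(sentence):
    s = sentence.lower().strip()
    for pattern, template in RULES:
        m = re.fullmatch(pattern, s)
        if m:
            return m.expand(template)   # substitute the captured phrase
    return None

print(translate("What is the capital of Texas?"))   # answer(capital(texas))
```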

251 citations




DOI
01 Jan 2005
TL;DR: This thesis enhances the state of the art in this field by offering the following contributions: developing a syntactic format guaranteeing a language construct to be commutative, and proposing syntactic rule formats for guaranteeing congruence of strong bisimilarity and higher-order bisimilarity in the setting of higher-order processes.
Abstract: Defining a formal (i.e., mathematical) semantics for computer languages is the first step towards developing rigorous techniques for reasoning about computer programs and specifications in such a language. Structural Operational Semantics (SOS), introduced by Plotkin in 1981, has become a popular technique for defining formal semantics. In this thesis, we first review the basic concepts of SOS and the existing meta-results. Subsequently, we enhance the state of the art in this field by offering the following contributions:
• developing a syntactic format guaranteeing a language construct to be commutative;
• extending the existing congruence and well-definedness meta-results to the setting with equational specifications;
• defining a more liberal notion of operational conservativity, called orthogonality, and formulating meta-theorems for it;
• prototyping a framework for checking the premises of congruence and conservativity meta-theorems and animating programs according to their SOS specification;
• defining notions of bisimulation with data and formulating syntactic rule formats guaranteeing congruence for these notions;
• proposing syntactic rule formats for guaranteeing congruence of strong bisimilarity and higher-order bisimilarity in the setting of higher-order processes.
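
For readers unfamiliar with SOS, the rules below for a standard nondeterministic choice operator illustrate the kind of syntactic symmetry that a commutativity rule format can exploit; these are textbook rules, not the thesis's general format.

```latex
% Textbook SOS rules for a choice operator "+": the two rules are mirror
% images of each other, and it is exactly this kind of syntactic symmetry
% that a commutativity format detects (x + y and y + x are then bisimilar).
\[
\frac{x \xrightarrow{a} x'}{x + y \xrightarrow{a} x'}
\qquad
\frac{y \xrightarrow{a} y'}{x + y \xrightarrow{a} y'}
\]
```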

85 citations


Journal ArticleDOI
08 Apr 2005
TL;DR: In this article, the LISA compiler generator is used to generate editors, inspectors, debuggers, and visualisers/animators from formal language specifications; such tools can be described by a generic fixed part that traverses the appropriate data structures generated by a specific variable part.
Abstract: Many tools have been constructed using different formal methods to process various parts of a language specification (e.g. scanner generators, parser generators and compiler generators). The automatic generation of a complete compiler was the primary goal of such systems, but researchers recognised the possibility that many other language-based tools could be generated from formal language specifications. Such tools can be generated automatically whenever they can be described by a generic fixed part that traverses the appropriate data structures generated by a specific variable part, which can be systematically derived from the language specifications. The paper identifies generic and specific parts for various language-based tools. Several language-based tools are presented in the paper, which are automatically generated using an attribute grammar-based compiler generator called LISA. The generated tools that are described in the paper include editors, inspectors, debuggers and visualisers/animators. Because of their complexity of construction, special emphasis is given to visualisers/animators and to the unique contribution of our approach toward generating such tools.
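
The "generic fixed part plus language-specific variable part" idea can be sketched as follows. The node classes and names are invented for the example, and LISA itself works from attribute grammar specifications rather than hand-written code; the point is only that the same generic traversal could back an inspector, a debugger view, or a visualiser.

```python
# Generic fixed part: a tree walker shared by all generated tools.
# Specific variable part: AST node classes one would derive from the grammar.
from dataclasses import dataclass

@dataclass
class Num:                 # specific part (language-dependent)
    value: int

@dataclass
class Add:                 # specific part (language-dependent)
    left: object
    right: object

def inspect(node, depth=0):     # generic part (tool-independent traversal)
    pad = "  " * depth
    print(f"{pad}{type(node).__name__}")
    for child in vars(node).values():
        if hasattr(child, "__dataclass_fields__"):
            inspect(child, depth + 1)
        else:
            print(f"{pad}  {child!r}")

inspect(Add(Num(1), Add(Num(2), Num(3))))
```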

79 citations


Journal ArticleDOI
TL;DR: This article gives an overview of a series of grammatical approaches to biological sequence analysis and related research, and focuses on learning stochastic grammars from biological sequences and predicting their functions based on the learned stochastic grammars.
Abstract: Bioinformatics is an active research area aimed at developing intelligent systems for the analysis of molecular biology data. Many methods based on formal language theory, statistical theory, and learning theory have been developed for modeling and analyzing biological sequences such as DNA, RNA, and proteins. In particular, grammatical inference methods are expected to find grammatical structures hidden in biological sequences. In this article, we give an overview of a series of our grammatical approaches to biological sequence analysis and related research, and focus on learning stochastic grammars from biological sequences and predicting their functions based on the learned stochastic grammars.
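
As a much-simplified stand-in for the stochastic grammars discussed above, the sketch below scores a DNA string with a first-order Markov model, which is a stochastic regular grammar; the probabilities are invented, and real applications use richer models such as stochastic context-free grammars for RNA structure.

```python
# Scoring a DNA string with a stochastic regular grammar (first-order
# Markov model); the transition probabilities here are purely illustrative.
import math

START = {"A": 0.25, "C": 0.25, "G": 0.25, "T": 0.25}
TRANS = {x: {"A": 0.25, "C": 0.25, "G": 0.25, "T": 0.25} for x in "ACGT"}
TRANS["C"]["G"] = 0.10   # illustrative: depress CpG dinucleotides
TRANS["C"]["A"] = 0.40   # keep the C row summing to 1.0

def log_prob(seq: str) -> float:
    lp = math.log(START[seq[0]])
    for prev, cur in zip(seq, seq[1:]):
        lp += math.log(TRANS[prev][cur])
    return lp

print(log_prob("ACGTCG"))
```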

75 citations


Book
01 Jan 2005
TL;DR: This volume collects the proceedings of ICTAC 2005, including invited talks such as "A Rewriting Logic Sampler" and contributed papers on formal languages, computer science logics, program construction, real-time systems, concurrency and refinement, software security, quantitative logics, object-orientation and component systems, model checking, and applied logics and computing theory.
Abstract: Invited Speakers.- A Rewriting Logic Sampler.- Codes and Length-Increasing Transitive Binary Relations.- Languages and Process Calculi for Network Aware Programming - Short Summary -.- Stochastic Analysis of Graph Transformation Systems: A Case Study in P2P Networks.- Component-Based Software Engineering.- Formal Languages.- Outfix-Free Regular Languages and Prime Outfix-Free Decomposition.- Solving First Order Formulae of Pseudo-Regular Theory.- Splicing Array Grammar Systems.- Computer Science Logics.- Compositionality of Fixpoint Logic with Chop.- An SLD-Resolution Calculus for Basic Serial Multimodal Logics.- Upside-Down Transformation in SOL/Connection Tableaux and Its Application.- Program Construction.- On the Stability Semantics of Combinational Programs.- Generating C Code from LOGS Specifications.- Formalizing the Debugging Process in Haskell.- Finding Resource Bounds in the Presence of Explicit Deallocation.- Real-Time Systems.- The Timer Cascade: Functional Modelling and Real Time Calculi.- A Robust Interpretation of Duration Calculus.- Symbolic Model Checking of Finite Precision Timed Automata.- Concurrency and Refinement.- Covarieties of Coalgebras: Comonads and Coequations.- Linking Theories of Concurrency.- On Cool Congruence Formats for Weak Bisimulations.- Externalized and Internalized Notions of Behavioral Refinement.- Software Security.- Information Flow Is Linear Refinement of Constancy.- On Typing Information Flow.- Representation and Reasoning on RBAC: A Description Logic Approach.- Revisiting Failure Detection and Consensus in Omission Failure Environments.- Quantitative Logics.- Congruences and Bisimulations for Continuous-Time Stochastic Logic.- A Logic for Quantum Circuits and Protocols.- Quantitative Temporal Logic Mechanized in HOL.- Weak Stochastic Bisimulation for Non-markovian Processes.- Object-Orientation and Component Systems.- On Refinement of Software Architectures.- POST: A Case Study for an Incremental Development in rCOS.- Implementing Application-Specific Object-Oriented Theories in HOL.- Constructing Open Systems via Consistent Components.- Model-Checking and Algorithms.- A Sub-quadratic Algorithm for Conjunctive and Disjunctive Boolean Equation Systems.- Using Fairness Constraints in Process-Algebraic Verification.- Maximum Marking Problems with Accumulative Weight Functions.- Applied Logics and Computing Theory.- Toward an Abstract Computer Virology.- On Superposition-Based Satisfiability Procedures and Their Combination.- Tutorials at ICTAC 2005.- A Summary of the Tutorials at ICTAC 2005.

75 citations


Dissertation
01 Jan 2005
TL;DR: The aim of this research is to design an executable meta-language, specified as an Object-Process Network (OPN), that supports system architects' modeling process by automating certain model construction, manipulation and simulation tasks.
Abstract: The aim of this research is to design an executable meta-language that supports system architects' modeling process by automating certain model construction, manipulation and simulation tasks. This language specifically addresses the need to systematically communicate architects' intent to a wide range of stakeholders and to organize knowledge from various domains. Our investigation into existing architecting approaches and technologies has pointed out the need to develop a simple and intuitive, yet formal language that expresses multiple layers of abstraction, provides reflexive knowledge about the models, and mechanizes data exchange and manipulation, while allowing integration with legacy infrastructures. A small set of linguistic primitives, namely stateful objects and the processes that transform them, was identified as providing both required and sufficient building blocks of the meta-language, specified as an Object-Process Network (OPN). To demonstrate the applicability of OPN, a software environment has been developed and applied to define meta-models of large-scale complex system architectures such as space transportation systems. OPN provides three supporting aspects of architectural modeling. As a declarative language, OPN provides a diagrammatic formal language to help architects specify the space of architectural options. As an imperative language, OPN automates the process of creating architectural option instances and computes associated performance metrics for those instances. As a simulation language, OPN uses a function-algebraic model to subsume and compose discrete, continuous, and probabilistic events within one unified execution engine. To demonstrate its practical value in large-scale engineering systems, the research applied OPN to two space exploration programs and one aircraft design problem. In our experiments, OPN was able to significantly change the modeling and architectural reasoning process by automating a number of manual model construction, manipulation, and simulation tasks.

Proceedings ArticleDOI
23 Jun 2005
TL;DR: It is shown that Constraint LTL over the simple domain augmented with the freeze operator is undecidable, which is a surprising result given the poor constraint language (only equality tests).
Abstract: Constraint LTL, a generalization of LTL over Presburger constraints, is often used as a formal language to specify the behavior of operational models with constraints. The freeze quantifier can be part of the language, as in some real-time logics, but this variable-binding mechanism is quite general and ubiquitous in many logical languages (first-order temporal logics, hybrid logics, logics for sequence diagrams, navigation logics, etc.). We show that Constraint LTL over the simple domain augmented with the freeze operator is undecidable, which is a surprising result given the poor constraint language (only equality tests). Many versions of freeze-free Constraint LTL are decidable over domains with qualitative predicates, and our undecidability result actually establishes Σ^1_1-completeness. On the positive side, we provide complexity results when the domain is finite (EXPSPACE-completeness) or when the formulae are flat in a sense introduced in the paper.
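
The freeze mechanism can be illustrated with a small formula. The notation follows the common freeze-LTL presentation and may differ from the paper's concrete syntax: the down-arrow stores the current data value in a register, and the up-arrow tests the current value against it, so only equality tests are used.

```latex
% Illustrative freeze-LTL formula: \downarrow_r stores the current data
% value in register r, \uparrow_r tests whether the current value equals
% the stored one.  The formula says that no data value ever repeats.
\[
\mathsf{G}\,\bigl(\,\downarrow_{r}\ \mathsf{X}\,\mathsf{G}\,\neg\!\uparrow_{r}\,\bigr)
\]
```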

Journal ArticleDOI
TL;DR: It is seen that prefix-closed languages are relatively hard to learn compared to arbitrary regular languages, and an optimized version of the learning algorithm is implemented and analyzed, with positive results.

Proceedings ArticleDOI
26 Jun 2005
TL;DR: The main results are that a variant of probabilistic Büchi automata (PBA) is more expressive than non-deterministic ω-automata, but a certain subclass of PBA has exactly the power of ω-regular languages.
Abstract: Probabilistic finite automata as acceptors for languages over finite words have been studied by many researchers. In this paper, we show how probabilistic automata can serve as acceptors for ω-regular languages. Our main results are that our variant of probabilistic Büchi automata (PBA) is more expressive than non-deterministic ω-automata, but a certain subclass of PBA, called uniform PBA, has exactly the power of ω-regular languages. This also holds for probabilistic ω-automata with Streett or Rabin acceptance. We show that certain ω-regular languages have uniform PBA of linear size, while any nondeterministic Streett automaton is of exponential size, and vice versa. Finally, we discuss the emptiness problem for uniform PBA and the use of PBA for the verification of Markov chains against qualitative linear-time properties.
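
A probabilistic Büchi automaton is a finite automaton whose transitions carry probabilities. The sketch below, with an invented two-state automaton, only computes the distribution over states after a finite prefix of an ω-word; acceptance itself concerns accepting states being visited infinitely often with positive probability, which this prefix computation merely hints at.

```python
# Minimal PBA data structure plus the state distribution reached after a
# finite prefix of an omega-word (illustrative two-state automaton).
PBA = {
    "accepting": {"q1"},
    "delta": {                       # (state, letter) -> {successor: prob}
        ("q0", "a"): {"q0": 0.5, "q1": 0.5},
        ("q1", "a"): {"q1": 1.0},
        ("q0", "b"): {"q0": 1.0},
        ("q1", "b"): {"q0": 1.0},
    },
}

def state_distribution(pba, prefix, initial="q0"):
    dist = {initial: 1.0}
    for letter in prefix:
        nxt = {}
        for state, p in dist.items():
            for succ, q in pba["delta"][(state, letter)].items():
                nxt[succ] = nxt.get(succ, 0.0) + p * q
        dist = nxt
    return dist

print(state_distribution(PBA, "aab"))   # {'q0': 1.0}: 'b' sends both states to q0
```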

Journal ArticleDOI
21 Nov 2005
TL;DR: This work introduces a new abstraction, called catchup equivalence, which is defined on event zones and which can be seen as an implementation of one of the (more abstract) previous congruences, and yields an algorithm to check emptiness which has the same complexity bound in the worst case as the algorithm to test emptiness in the classical semantics of timed automata.
Abstract: We present a new approach to the symbolic model checking of timed automata based on a partial order semantics. It relies on event zones that use vectors of event occurrences instead of clock zones that use vectors of clock values grouped in polyhedral clock constraints. We provide a description of the different congruences that arise when we consider an independence relation in a timed framework. We introduce a new abstraction, called catchup equivalence, which is defined on event zones and which can be seen as an implementation of one of the (more abstract) previous congruences. This formal language approach helps clarify what the issues are and which properties abstractions should have. The catchup equivalence yields an algorithm to check emptiness which has the same worst-case complexity bound as the algorithm to test emptiness in the classical semantics of timed automata. Our approach works for the class of timed automata proposed by Alur-Dill, except for state invariants (an extension including state invariants is discussed informally). First experiments show that the approach is promising and may yield very significant improvements.

Journal ArticleDOI
01 Oct 2005
TL;DR: This paper develops a direct method for checking iP-observability, based on the insightful observation that the iP function is a left congruence in terms of relations on formal languages, and applies it to detect denial-of-service vulnerabilities in security protocols based on INI.
Abstract: We propose an algorithmic approach to the problem of verification of the property of intransitive noninterference (INI), using tools and concepts of discrete event systems (DES). INI can be used to characterize and solve several important security problems in multilevel security systems. In a previous work, we have established the notion of iP-observability, which precisely captures the property of INI. We have also developed an algorithm for checking iP-observability by indirectly checking P-observability for systems with at most three security levels. In this paper, we generalize the results for systems with any finite number of security levels by developing a direct method for checking iP-observability, based on an insightful observation that the iP function is a left congruence in terms of relations on formal languages. To demonstrate the applicability of our approach, we propose a formal method to detect denial of service vulnerabilities in security protocols based on INI. This method is illustrated using the TCP/IP protocol. The work extends the theory of supervisory control of DES to a new application domain.

Book Chapter
01 Jan 2005
TL;DR: Contrary to the view, implicit in Montague's claim that there is no important theoretical difference between formal and natural languages, that ambiguity is not theoretically important, the authors argue that the highly ambiguous character of natural languages is surprising and that the very existence of ambiguity calls for an explanation.
Abstract: Montague’s celebrated claim that no “important theoretical difference exists between formal and natural languages” (Montague 1974: 188) implies that ambiguity is not theoretically important, for ambiguity abounds in natural languages, whereas formal languages are unambiguous by design. More generally, the pervasiveness of ambiguity in natural languages seems to be widely regarded as unremarkable. Our objective in this paper is to argue, to the contrary, that the highly ambiguous character of natural languages is surprising, and that the very existence of ambiguity calls for an explanation.

Book ChapterDOI
07 Nov 2005
TL;DR: This paper proposes to define the concrete syntax of a language by an extension of the already existing metamodel of the abstract syntax, which describes the concepts of the language, with a second layer describing the graphical representation of concepts by visual elements.
Abstract: Language-centric methodologies, triggered by the success of Domain Specific Languages, rely on precise specifications of modeling languages. While the definition of the abstract syntax is standardized by the 4-layer metamodel architecture of the OMG, most language specifications remain informal in their description of the semantics and the (graphical) concrete syntax. This paper tackles the problem of specifying the concrete syntax of a language in a formal and non-ambiguous way. We propose to define the concrete syntax by an extension of the already existing metamodel of the abstract syntax, which describes the concepts of the language, with a second layer describing the graphical representation of concepts by visual elements. In addition, an intermediate layer defines how elements of both layers are related to each other. Unlike similar approaches that became the basis of some CASE tools, the intermediate layer is not a pure mapping from abstract to concrete syntax but connects both layers in a flexible, declarative way. We illustrate our approach with a simplified form of statecharts.

Journal ArticleDOI
TL;DR: It is shown that shrinking two-pushdown automata and length-reducing two-pushdown automata are equivalent, both in the non-deterministic and the deterministic case, thus obtaining still another characterization of the growing context-sensitive languages and the Church-Rosser languages, respectively.
Abstract: The growing context-sensitive languages have been classified through the shrinking two-pushdown automaton, the deterministic version of which characterizes the class of generalized Church-Rosser languages [Inform. Comput. 141 (1998) 1]. Exploiting this characterization we prove that the latter class coincides with the class of Church-Rosser languages that was introduced by McNaughton et al. [J. ACM 35 (1988) 324]. Based on this result several open problems of McNaughton et al. are solved. In addition, we show that shrinking two-pushdown automata and length-reducing two-pushdown automata are equivalent, both in the non-deterministic and the deterministic case, thus obtaining still another characterization of the growing context-sensitive languages and the Church-Rosser languages, respectively.
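
To make the Church-Rosser language class concrete, here is a tiny recognizer built from a length-reducing, confluent string-rewriting system for L = { a^n b^n : n >= 1 }. This illustrates the rewriting-based definition only, not the two-pushdown automaton constructions of the paper.

```python
# Church-Rosser-style recognizer sketch: a length-reducing rewriting
# system with no overlapping rules (hence confluent); membership is
# decided by reducing to normal form and comparing with "ab".
RULES = [("aabb", "ab")]

def normal_form(w: str) -> str:
    changed = True
    while changed:
        changed = False
        for lhs, rhs in RULES:
            i = w.find(lhs)
            if i != -1:
                w = w[:i] + rhs + w[i + len(lhs):]
                changed = True
    return w

def accept(w: str) -> bool:
    return normal_form(w) == "ab"

print(accept("aaabbb"), accept("aabbb"))   # True False
```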

Book ChapterDOI
24 Feb 2005
TL;DR: An algebraic characterization of the regular languages of ranked labeled trees is given that yields a decision procedure for determining whether a regular collection of trees is first-order definable: the procedure is polynomial time in the minimal automaton presenting the regular language.
Abstract: We consider regular languages of ranked labeled trees. We give an algebraic characterization of the regular languages over such trees that are definable in first-order logic in the language of labeled graphs. These languages are the analog on ranked trees of the “locally threshold testable” languages on strings. We show that this characterization yields a decision procedure for determining whether a regular collection of trees is first-order definable: the procedure is polynomial time in the minimal automaton presenting the regular language.
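
A regular language of ranked labeled trees is one recognized by a finite bottom-up tree automaton; the minimal sketch below (an invented two-state example over one binary symbol and two constants) just makes that notion concrete. The paper's algebraic characterization and its polynomial-time decision procedure are not reproduced here.

```python
# Deterministic bottom-up tree automaton over ranked trees built from the
# binary symbol 'and' and the constants '0'/'1'; it accepts exactly the
# trees that evaluate to true (accepting state set {1}).
def run(tree):
    """tree is ('0',), ('1',) or ('and', left, right); returns state 0 or 1."""
    if tree[0] in ("0", "1"):
        return int(tree[0])
    _, left, right = tree
    return run(left) & run(right)

def accepts(tree) -> bool:
    return run(tree) == 1

print(accepts(("and", ("1",), ("and", ("1",), ("1",)))))   # True
print(accepts(("and", ("1",), ("0",))))                    # False
```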

Journal ArticleDOI
TL;DR: A list of known properties of DNA languages which are free of certain types of undesirable bonds is recalled and a general framework in which each of these properties is characterized by a solution of a uniform formal language inequation is introduced.

Journal ArticleDOI
TL;DR: Some fundamental results about primitive words are extended to primitive partial words, which are strings that may have a number of "do not know" symbols.
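
Partial words are strings with "do not know" positions; the basic comparison on them is compatibility, meaning agreement wherever both words are defined. The sketch below uses '?' as the hole symbol; the example is illustrative and not taken from the paper, whose primitivity results build on this kind of comparison.

```python
# Compatibility of partial words: two equal-length partial words are
# compatible if they agree at every position where both are defined.
HOLE = "?"

def compatible(u: str, v: str) -> bool:
    return len(u) == len(v) and all(
        a == b or HOLE in (a, b) for a, b in zip(u, v)
    )

print(compatible("ab?a", "abba"))   # True
print(compatible("ab?a", "abbb"))   # False
```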

Proceedings ArticleDOI
27 Dec 2005
TL;DR: The need for a formal language for the common description of aircraft intent in the context of trajectory computation is identified; such a language, denoted the aircraft intent description language (AIDL), is considered a possible solution to ensure interoperability among future ATM automation components.
Abstract: This paper summarizes the key characteristics of several different methods applied to encode aircraft intent in TP implementations. The differences appear to be significant. We conclude that a standardized formal method to express aircraft intent information (understood as input to the trajectory computation process) is a key enabler to achieve the required consistency. This paper identifies the need for a formal language for the common description of aircraft intent in the context of trajectory computation. Such a language, denoted as aircraft intent description language (AIDL), would allow expressing the aircraft intent with different levels of detail within a single standard unifying framework. By using the AIDL as a common standard for intent sharing, different TP clients could interoperate on the basis of a consistent input to the TP, regardless of their individual requirements. The proposed approach is considered as a possible solution to ensure interoperability among future ATM automation components.

Journal ArticleDOI
TL;DR: The proposed approach uses statistical alignment methods to produce a set of conventional strings from which a stochastic finite-state grammar is inferred, which is finally transformed into a resulting finite-state transducer.

Book ChapterDOI
TL;DR: This work proposes an approach to the verification of a priori conformance of an agent's conversation policy to a protocol, which is based on the theory of formal languages and can be applied to a wide variety of cases, with the proviso that the protocol specification and the protocol implementation can be translated into finite state automata.
Abstract: In open multi-agent systems agent interaction is usually ruled by public protocols defining the rules the agents should respect in message exchanging. The respect of such rules guarantees interoperability. Given two agents that agree on using a certain protocol for their interaction, a crucial issue (known as “a priori conformance test”) is verifying if their interaction policies, i.e. the programs that encode their communicative behavior, will actually produce interactions which are conformant to the agreed protocol. An issue that is not always made clear in the existing proposals for conformance tests is whether the test preserves agents' capability of interacting, besides certifying the legality of their possible conversations. This work proposes an approach to the verification of a priori conformance, of an agent's conversation policy to a protocol, which is based on the theory of formal languages. The conformance test is based on the acceptance of both the policy and the protocol by a special finite state automaton and it guarantees the interoperability of agents that are individually proved conformant. Many protocols used in multi-agent systems can be expressed as finite state automata, so this approach can be applied to a wide variety of cases with the proviso that both the protocol specification and the protocol implementation can be translated into finite state automata. In this sense the approach is general. Easy applicability to the case when a logic-based language is used to implement the policies is shown by means of a concrete example, in which the language DyLOG, based on computational logic, is used.
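
When both the policy and the protocol are given as finite automata, the most basic automata-theoretic check is language inclusion L(policy) ⊆ L(protocol); the sketch below implements that standard product construction on complete DFAs, with invented state and message names. The paper's a priori conformance test is finer than plain inclusion, since it must also preserve the agents' ability to interoperate.

```python
# Language inclusion L(policy) ⊆ L(protocol) for complete DFAs via a BFS
# over the product automaton: report failure if some reachable pair is
# accepting in the policy but not in the protocol.
from collections import deque

def included(policy, protocol, alphabet):
    """Each automaton is (initial, accepting_set, delta) with
    delta[(state, letter)] = successor; both DFAs must be complete."""
    (pi, pf, pd), (qi, qf, qd) = policy, protocol
    seen, queue = {(pi, qi)}, deque([(pi, qi)])
    while queue:
        p, q = queue.popleft()
        if p in pf and q not in qf:      # policy accepts a word the protocol rejects
            return False
        for a in alphabet:
            nxt = (pd[(p, a)], qd[(q, a)])
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return True

# Tiny example: the policy sends "req" then "inf"; the protocol allows
# "req" followed by either "inf" or "ref" ("X"/"Y" are non-accepting sinks).
policy   = ("s0", {"s2"}, {
    ("s0","req"):"s1", ("s0","inf"):"X",  ("s0","ref"):"X",
    ("s1","inf"):"s2", ("s1","req"):"X",  ("s1","ref"):"X",
    ("s2","req"):"X",  ("s2","inf"):"X",  ("s2","ref"):"X",
    ("X","req"):"X",   ("X","inf"):"X",   ("X","ref"):"X"})
protocol = ("t0", {"t2"}, {
    ("t0","req"):"t1", ("t0","inf"):"Y",  ("t0","ref"):"Y",
    ("t1","inf"):"t2", ("t1","ref"):"t2", ("t1","req"):"Y",
    ("t2","req"):"Y",  ("t2","inf"):"Y",  ("t2","ref"):"Y",
    ("Y","req"):"Y",   ("Y","inf"):"Y",   ("Y","ref"):"Y"})
print(included(policy, protocol, ["req", "inf", "ref"]))   # True
```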

Journal ArticleDOI
TL;DR: This paper examines the idea that in mathematics education it is important to wean pupils off the use of informal everyday language and to privilege the use of formal technical vocabulary, and makes some observations on the use of formal and informal language in the Dimensions transcript.
Abstract: This paper examines the idea that in mathematics education it is important to wean pupils off the use of informal everyday language and to privilege the use of formal technical vocabulary. I will first make some observations on the use of formal and informal language in the Dimensions transcript. The main focus of the next part of the discussion is on the complexities in establishing core and non-core vocabulary meaning and the need to use words to represent established meaning/s as well as to create new ones. After that I will draw on research in mathematics education to show that informal and formal language (including technical vocabulary) is used in various combinations and that pupils can, and indeed need to, use informal language productively to explore concepts represented by technical vocabulary.

Journal ArticleDOI
TL;DR: It is verified that PFAs computing strings of words can be implemented by means of calculating strings of symbols, and it is proved that Theorem 1 does hold for PCFRs and PRGs.

Book ChapterDOI
18 Jul 2005
TL;DR: It is shown how formal and informal modeling languages can be cooperatively used in the MDA framework and how the transformations between models in these languages can be achieved using an MDA development environment.
Abstract: The Model Driven Architecture (MDA) involves automated transformations between software models defined in different languages at different abstraction levels. This paper takes an MDA approach to integrate a formal modeling language (Object-Z) with an informal modeling language (UML) via model transformation. This paper shows how formal and informal modeling languages can be cooperatively used in the MDA framework and how the transformations between models in these languages can be achieved using an MDA development environment. The MDA model transformation techniques allow us to have a reusable transformation between formal and informal modeling languages. The integrated approach provides an effective V&V technique for the MDA.

01 Jan 2005
TL;DR: A reinforced filament tape for securing and releasing meeting edges of a container or articles within a container comprises a flexible backing ribbon with a pressure sensitive adhesive on one side; on the other side of the backing ribbon is a binding layer overlayed by a plurality of reinforcing filaments and a releasing layer, as mentioned in this paper.
Abstract: A reinforced filament tape for securing and releasing meeting edges of a container or articles within a container, comprises a flexible backing ribbon with a pressure sensitive adhesive on one side. On the other side of the backing ribbon is a binding layer overlayed by a plurality of reinforcing filaments and a releasing layer. The filaments are contacted with a filament coating and remain securely positioned in the binding layer.

Journal ArticleDOI
TL;DR: It is demonstrated that several types of unresolved equations cannot effectively simulate each other in spite of the equality of the language families they define, and it is proved that systems with linear concatenation and union only are as expressive as more general unresolved inequalities with all Boolean operations and unrestricted concatenation.
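
For orientation, a classical resolved language equation and its least solution are shown below; this is a standard textbook example, not one of the paper's systems, which concern the harder, unresolved (implicit) case where equations need not have the explicit form X = φ(X).

```latex
% A classical resolved language equation and its least solution: the
% unknown X appears explicitly on the left-hand side, unlike in the
% unresolved systems studied in the paper.
\[
X \;=\; \{a\}\,X \,\cup\, \{\varepsilon\}
\qquad\Longrightarrow\qquad
X \;=\; \{a\}^{*}
\]
```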