
Showing papers on "Formal grammar published in 2013"


Proceedings ArticleDOI
18 Mar 2013
TL;DR: A process calculus specifically designed to model systems based on the Internet of Things paradigm is presented, and a formal syntax and semantics for the calculus are defined and it is shown how it can be used to reason about relevant examples.
Abstract: This paper presents a process calculus specifically designed to model systems based on the Internet of Things paradigm. We define a formal syntax and semantics for the calculus, and show how it can be used to reason about relevant examples. We also define two notions of bisimilarity, one capturing the behavior seen by the end user of the system, and one allowing compositional reasoning.

48 citations


BookDOI
01 Jan 2013
TL;DR: Some of the most prominent figures in linguistics, including Noam Chomsky and Barbara H. Partee, offer new insights into the nature of linguistic meaning and pave the way for the further development of formal semantics and formal pragmatics.
Abstract: In recent years, the study of formal semantics and formal pragmatics has grown tremendously showing that core aspects of language meaning can be explained by a few principles. These principles are grounded in the logic that is behind - and tightly intertwined with - the grammar of human language. In this book, some of the most prominent figures in linguistics, including Noam Chomsky and Barbara H. Partee, offer new insights into the nature of linguistic meaning and pave the way for the further development of formal semantics and formal pragmatics. Each chapter investigates various dimensions in which the logical nature of human language manifests itself within a language and/or across languages. Phenomena like bare plurals, free choice items, scalar implicatures, intervention effects, and logical operators are investigated in depth and at times cross-linguistically and/or experimentally. This volume will be of interest to scholars working within the fields of semantics, pragmatics, language acquisition and psycholinguistics.

42 citations


Journal ArticleDOI
01 Dec 2013
TL;DR: The formalization of natural language syntaxes and grammars enables a wide range of applications not only in cognitive informatics, cognitive linguistics, natural language processing, cognitive computing, semantic computing, cognitive robotics, and computational linguistics in general, but also in word processing, web search engines, online text processing, machine-enabled language comprehension, autonomous machine learning, cognitive translators, computing with words, and cognitive systems in particular.
Abstract: It is recognized that formal grammars and rigorous syntactic rules are indispensable in formal, cognitive, and computational linguistics. The formal syntax of linguistics can be classified into two categories known as the analytic and synthetic syntax. The former are a set of lexical rules of individual words, phrases, and parts of speech; while the latter are a set of relational rules of formal syntaxes within and beyond sentences. The synthetic syntaxes can be formally embodied by a set of relational syntactic rules for lexis, phrases, and parts of speech within and beyond sentences. A theoretical framework of formal linguistics is coherently presented by analytic and synthetic syntaxes based on contemporary denotational mathematics. The formalization of natural language syntaxes and grammars enables a wide range of applications not only in cognitive informatics, cognitive linguistics, natural language processing, cognitive computing, semantic computing, cognitive robotics, and computational linguistics in general, but also in word processing, web search engines, online text processing, machine-enabled language comprehension, autonomous machine learning, cognitive translators, computing with words, and cognitive systems in particular.

27 citations


Dissertation
28 Nov 2013
TL;DR: This thesis demonstrates that statistical parsing techniques, adapted from NLP with little modification, can be successfully applied to recovering the harmonic structure underlying music.
Abstract: Various patterns of the organization of Western tonal music exhibit hierarchical structure, among them the harmonic progressions underlying melodies and the metre underlying rhythmic patterns. Recognizing these structures is an important part of unconscious human cognitive processing of music. Since the prosody and syntax of natural languages are commonly analysed with similar hierarchical structures, it is reasonable to expect that the techniques used to identify these structures automatically in natural language might also be applied to the automatic interpretation of music. In natural language processing (NLP), analysing the syntactic structure of a sentence is prerequisite to semantic interpretation. The analysis is made difficult by the high degree of ambiguity in even moderately long sentences. In music, a similar sort of structural analysis, with a similar degree of ambiguity, is fundamental to tasks such as key identification and score transcription. These and other tasks depend on harmonic and rhythmic analyses. There is a long history of applying linguistic analysis techniques to musical analysis. In recent years, statistical modelling, in particular in the form of probabilistic models, has become ubiquitous in NLP for large-scale practical analysis of language. The focus of the present work is the application of statistical parsing to automatic harmonic analysis of music. This thesis demonstrates that statistical parsing techniques, adapted from NLP with little modification, can be successfully applied to recovering the harmonic structure underlying music. It shows first how a type of formal grammar based on one used for linguistic syntactic processing, Combinatory Categorial Grammar (CCG), can be used to analyse the hierarchical structure of chord sequences. I introduce a formal language similar to first-order predicate logic to express the hierarchical tonal harmonic relationships between chords.
The syntactic grammar formalism then serves as a mechanism to map an unstructured chord sequence onto its structured analysis. In NLP, the high degree of ambiguity of the analysis means that a parser must consider a huge number of possible structures. Chart parsing provides an efficient mechanism to explore them. Statistical models allow the parser to use information about structures seen before in a training corpus to eliminate improbable interpretations early on in the process and to rank the final analyses by plausibility. To apply the same techniques to harmonic analysis of chord sequences, a corpus of tonal jazz chord sequences annotated by hand with harmonic analyses is constructed. Two statistical parsing techniques are adapted to the present task and evaluated on their success at
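As a minimal sketch of the chart-parsing idea described above, the following toy CKY recognizer parses a chord sequence with a two-rule grammar over harmonic functions. The grammar, categories, and chord labels are invented for illustration; the thesis itself uses a CCG and statistical models, not a bare CNF grammar:

```python
from itertools import product

# Toy grammar in Chomsky normal form over chord *functions* (hypothetical):
# a subdominant prepares a dominant, and a dominant phrase resolves to a tonic.
RULES = {                      # (B, C) -> set of A, for rules A -> B C
    ("D_phrase", "T"): {"T"},  # dominant phrase resolving to tonic spans a tonic
    ("S", "D"): {"D_phrase"},  # subdominant preparing a dominant
}
LEXICON = {"C": {"T"}, "F": {"S"}, "G": {"D", "D_phrase"}}

def cky(chords):
    """Return the set of categories spanning the whole chord sequence."""
    n = len(chords)
    chart = {(i, i + 1): set(LEXICON[c]) for i, c in enumerate(chords)}
    for width in range(2, n + 1):          # fill the chart bottom-up by span width
        for i in range(n - width + 1):
            k = i + width
            cell = set()
            for j in range(i + 1, k):      # try every split point of the span
                for b, c in product(chart[(i, j)], chart[(j, k)]):
                    cell |= RULES.get((b, c), set())
            chart[(i, k)] = cell
    return chart[(0, n)]

print(cky(["F", "G", "C"]))  # F-G-C analysed as subdominant -> dominant -> tonic
```

A statistical parser would additionally attach probabilities to rule applications and prune low-scoring chart cells; the chart itself is what keeps the exponential space of analyses tractable.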

25 citations


Proceedings ArticleDOI
27 Oct 2013
TL;DR: This paper presents GuLP, a graph query language for expressing preferences declaratively, together with its formal syntax and semantics and a polynomial-time algorithm for evaluating GuLP expressions.
Abstract: This paper presents GuLP, a graph query language that makes it possible to express preferences declaratively. Preferences allow the answers to a query to be ordered and can be stated in terms of node/edge attributes and complex paths. We present the formal syntax and semantics of GuLP and a polynomial-time algorithm for evaluating GuLP expressions. We describe an implementation of GuLP in the GuLP-it system, which is available for download. We evaluate the GuLP-it system on real-world and synthetic data.

20 citations


Journal Article
TL;DR: This work takes as its starting point a simple learning algorithm for substitutable context-free languages, based on principles of distributional learning, and modifies it so that it will converge to a canonical grammar for each language.
Abstract: Standard models of language learning are concerned with weak learning: the learner, receiving as input only information about the strings in the language, must learn to generalise and to generate the correct, potentially infinite, set of strings generated by some target grammar. Here we define the corresponding notion of strong learning: the learner, again only receiving strings as input, must learn a grammar that generates the correct set of structures or parse trees. We formalise this using a modification of Gold's identification in the limit model, requiring convergence to a grammar that is isomorphic to the target grammar. We take as our starting point a simple learning algorithm for substitutable context-free languages, based on principles of distributional learning, and modify it so that it will converge to a canonical grammar for each language. We prove a corresponding strong learning result for a subclass of context-free grammars.
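The distributional idea behind the starting-point algorithm can be sketched in a few lines: in a substitutable language, two substrings that share one context share all contexts, so observing a single shared context licenses placing them in the same congruence class. The sample and strings below are a toy illustration, not the paper's algorithm:

```python
def contexts(sample, sub):
    """All (left, right) contexts in which substring `sub` occurs in the sample."""
    ctxs = set()
    for s in sample:
        for i in range(len(s) - len(sub) + 1):
            if s[i:i + len(sub)] == sub:
                ctxs.add((s[:i], s[i + len(sub):]))
    return ctxs

def substitutable(sample, u, v):
    """Distributional test: do u and v share at least one context?
    For a substitutable language, one shared context implies all contexts
    are shared, so the learner may merge u and v into one syntactic class."""
    return bool(contexts(sample, u) & contexts(sample, v))

# Sample drawn from the language { a^n b^n }: "ab" and "aabb" both occur in
# the empty context, so they are judged congruent; "a" and "b" are not.
sample = ["ab", "aabb"]
print(substitutable(sample, "ab", "aabb"), substitutable(sample, "a", "b"))
```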

19 citations


Journal ArticleDOI
TL;DR: The ways that linguistics can be and has been applied to molecular biology are introduced, covering the relevant formal language theory at a relatively nontechnical level.
Abstract: Polymeric macromolecules, when viewed abstractly as strings of symbols, can be treated in terms of formal language theory, providing a mathematical foundation for characterizing such strings both as collections and in terms of their individual structures. In addition this approach offers a framework for analysis of macromolecules by tools and conventions widely used in computational linguistics. This article introduces the ways that linguistics can be and has been applied to molecular biology, covering the relevant formal language theory at a relatively nontechnical level. Analogies between macromolecules and human natural language are used to provide intuitive insights into the relevance of grammars, parsing, and analysis of language complexity to biology. © 2012 Wiley Periodicals, Inc. Biopolymers 99: 203–217, 2013.
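A standard textbook example of this string-based view, assuming nothing beyond elementary formal language theory, is the context-free description of an RNA hairpin: the rule S -> x S y, with (x, y) a Watson-Crick pair, expresses nested base pairing exactly the way nested dependencies are expressed in natural language syntax:

```python
PAIRS = {("a", "u"), ("u", "a"), ("c", "g"), ("g", "c")}

def stem_length(seq):
    """Depth of the nested rule S -> x S y with (x, y) a complementary pair."""
    if len(seq) >= 2 and (seq[0], seq[-1]) in PAIRS:
        return 1 + stem_length(seq[1:-1])
    return 0

def is_hairpin(seq, min_stem=2, min_loop=3):
    """Recognise a toy hairpin: a complementary stem closing an unpaired loop."""
    k = stem_length(seq)
    return k >= min_stem and len(seq) - 2 * k >= min_loop

# 'ggc...gcc' has a stem of three pairs (g-c, g-c, c-g) around the loop 'aaaa'.
print(is_hairpin("ggcaaaagcc"), is_hairpin("aaaa"))
```

Stochastic versions of exactly such grammars underlie practical RNA secondary-structure prediction, which is one reason parsing technology transfers so directly to this domain.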

16 citations


Journal ArticleDOI
TL;DR: Using pseudoverbs and the weird word order paradigm adapted to comprehension, this article provides evidence that as early as 19 months French-speaking infants have an abstract representation of the word order of their language.
Abstract: Word order is one of the earliest aspects of grammar that the child acquires, because her early utterances already respect the basic word order of the target language. However, the question of the nature of early syntactic representations is subject to debate. Approaches inspired by formal syntax assume that the head–complement order, differentiating verb–object and object–verb languages, is represented very early on in an abstract, rulelike format. In contrast, constructivist theories assume that it is initially encoded as lexicalized, verb-specific knowledge. In order to address this issue experimentally, we combined the preferential looking paradigm using pseudoverbs with the weird word order paradigm adapted to comprehension. The results, based on highly reliable, coder-independent eye-tracking measures, provide the first direct evidence that as early as 19 months French-speaking infants have an abstract representation of the word order of their language.

14 citations


01 Jan 2013
TL;DR: An analysis of Bangladeshi university EFL teachers’ attitudes towards grammar indicates that these teachers view grammar as an inseparable part of language teaching and learning and think that formal grammar instruction has a facilitative role in language learning.
Abstract: Grammar is now rehabilitated in language teaching and learning after years of debate and research on how best to teach grammar has produced a variety of options for the teachers to follow in their classrooms. The present study reports 30 Bangladeshi university EFL teachers’ attitudes towards grammar and its teaching and learning relating to those options. An analysis of their responses indicates that these teachers view grammar as an inseparable part of language teaching and learning and think that formal grammar instruction has a facilitative role in language learning. In teaching grammar, explicit grammar instruction and contextualized use of grammar within communicative activities are preferred and inductive approaches and correction of errors are considered effective and helpful by these teachers. Small class size, use of audio-visual materials and flexibility in teaching grammar are suggested by them for better results.

14 citations


Proceedings ArticleDOI
25 May 2013
TL;DR: An attribute grammar is presented which captures the semantics of a subset of English language assertion descriptions that are commonly used for result checking in the hardware verification process and is evaluated using a large set of industrial assertion descriptions.
Abstract: We present a technique to automatically generate formal, executable assertions from natural language assertion descriptions written in English. Assertions are program invariants which are commonly used for result checking in the hardware verification process. We present an attribute grammar which captures the semantics of a subset of English language assertion descriptions. Using the attribute grammar, we parse assertion descriptions and generate semantically equivalent formal models. We have evaluated our technique using a large set of industrial assertion descriptions. We present the successful assertion generation results, as well as the limitations of our approach and methods to address those limitations in the future.
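To give a flavour of the task (this is not the paper's attribute grammar), a single-template translator can map one English sentence pattern onto an SVA-style assertion string. The template, the signal names, and the `clk` clock are all hypothetical:

```python
import re

# Hypothetical one-template translator: the paper handles a whole subset of
# English via an attribute grammar; this regex covers just one sentence form.
TEMPLATE = re.compile(
    r"(?P<a>\w+) must be (?P<va>high|low) when (?P<b>\w+) is (?P<vb>high|low)",
    re.IGNORECASE,
)
LEVEL = {"high": "1'b1", "low": "1'b0"}

def to_assertion(sentence):
    """Translate one English assertion description into an SVA-style string."""
    m = TEMPLATE.match(sentence)
    if m is None:
        raise ValueError("sentence outside the toy grammar's subset")
    return (f"assert property (@(posedge clk) "
            f"({m['b']} == {LEVEL[m['vb'].lower()]}) |-> "
            f"({m['a']} == {LEVEL[m['va'].lower()]}));")

print(to_assertion("ready must be high when valid is high"))
```

An attribute grammar generalizes this by attaching semantic attributes to parse-tree nodes, so meaning is composed structurally rather than matched against flat templates.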

13 citations


Proceedings ArticleDOI
19 Dec 2013
TL;DR: This study experimentally investigates the use of natural language in the learning of programming fundamentals by two groups of undergraduate students without prior knowledge of programming and compares its use with that of a traditional grammar language.
Abstract: The complexity and importance of learning programming fundamentals (i.e., sequences of sentences that express actions, conditions, and repetitions in computing) for undergraduate students has motivated the development of an intense educational research area. One frequently studied problem is the difficulty in the learning of traditional context-free grammars which are present, for example, in programming languages such as Pascal and C. This study experimentally investigates the use of natural language in the learning of programming fundamentals by two groups of undergraduate students without prior knowledge of programming and compares its use with that of a traditional grammar language. Results suggest that the use of natural language is a good alternative, despite the small differences, to the use of traditional programming languages defined by context-free grammars. This alternative is attractive and promising because the student does not need to learn a formal grammar to learn the fundamentals of programming.

Journal ArticleDOI
15 Oct 2013
TL;DR: It is shown how a lexical licensing mechanism, which is formulated within a formal grammar framework, can deal with the data, and this proposal is then extended to the phenomenon of negative polarity.
Abstract: This article examines idiomatic expressions as sources of both regularity and irregularity in language. Some morphological, lexical, syntactical, and semantical characteristics of idioms are discussed. It is shown how a lexical licensing mechanism, which is formulated within a formal grammar framework, can deal with the data. After that, this proposal is extended to the phenomenon of negative polarity.

Journal Article
TL;DR: This paper presents some insights for educators that may help them consider the possibility of teaching formal grammar as part of the curriculum; based on personal teaching practices, the author observes that some learners produce the foreign language in a fluent, but sometimes inaccurate, form.
Abstract: With the rise of new tendencies and methodologies in the English as a foreign language field, formal grammar instruction has come to be seen as unnecessary during the last few years. Institutions and educators have made serious decisions in order to promote language production that is fluent and coherent. Thus, grammar instruction has been partially relegated and new trends have occupied its place. However, based on personal teaching practices, I have realized that some learners are producing the foreign language in a fluent, but sometimes inaccurate, form. The present reflection is aimed at presenting some insights for educators that may help them consider the possibility of teaching formal grammar as part of the curriculum.

Proceedings ArticleDOI
01 Dec 2013
TL;DR: Based on a mathematical model for Skinner's functional analysis of Verbal Behavior, formal definitions for the concepts semantic anchor, utterance-meaning pair, and micro-local grammar are derived and used for designing a bidirectional interface connecting automatic speech recognition to meaning oriented language processing.
Abstract: Based on a mathematical model for Skinner's functional analysis of Verbal Behavior, we derive formal definitions for the concepts semantic anchor, utterance-meaning pair, and micro-local grammar. We show how these concepts can be used for designing a bidirectional interface connecting automatic speech recognition to meaning oriented language processing. A semi-automatic process for constructing components for such an interface from sparse data, e. g. collected from Wizard of Oz experiments, is described. Finally, we use our formal approach to investigate some questions concerning the formal complexity of natural language.

Journal ArticleDOI
TL;DR: The authors discuss a case of possible non-acquisition by L2 children who had had considerable exposure to the L2; this apparent non-acquisition allows them to consider the value of lxSLA methodology on the one hand, and raises issues about what might be lacking in the current socio-SLA paradigm on the other.
Abstract: Generative linguistics has long been concerned with the linguistic competence of the “ideal speaker-listener, in a completely homogeneous speech-community, who knows its language perfectly” (Chomsky 1965: 3). Research in formal-linguistics-based second language acquisition takes as its starting point the second language (L2) speaker's underlying mental representation. Here the factors of interest are influence of the learner's native language and, in generative SLA, the operation of innate linguistic mechanisms (Universal Grammar). Similar to methodology in formal syntax, lxSLA adopts techniques such as grammaticality judgment, comprehension and perception tasks supplementing spontaneously produced oral data. While there may be individual differences in oral production, tasks that tap learners' mental representations reveal commonalities across learners from a given native language background with the same amount/type of exposure and age of initial L2 exposure. When it comes to phonology, age has long been a central factor with numerous comparative studies showing younger learners far outperforming older learners (see Piske et al. 2001). This paper discusses a case of possible non-acquisition by L2 children who had had considerable exposure to the L2. Children's non-acquisition is only apparent, and this allows us to consider the value of lxSLA methodology on the one hand, and raises issues about what might be lacking in the current socio-SLA paradigm, on the other. We argue that only when we return to the cooperation that marked its birth in the 1960s will we have a comprehensive picture of SLA.

Journal ArticleDOI
TL;DR: This paper is an attempt to step in this direction by providing a formal syntax together with a compositional semantics for GCL, and Zadeh's deduction rules are proved to be valid in the defined semantics.
Abstract: The generalized constraint language (GCL), introduced by Zadeh, serves as a basis for computing with words (CW). It provides an agenda to express the imprecise and fuzzy information embedded in natural language and allows reasoning with perceptions. Despite its fundamental role, the definition of GCL has remained informal since its introduction by Zadeh, and to our knowledge, no attempt has been made to formulate a rigorous theoretical framework for GCL. Such formalization is necessary for further theoretical and practical advancement of CW for two important reasons. First, it provides the underlying infrastructure for the development of useful inference patterns based on sound theories. Second, it determines the scope of GCL and hence facilitates the translation of natural language expressions into GCL. This paper is an attempt to step in this direction by providing a formal syntax together with a compositional semantics for GCL. A soundness theorem is defined, and Zadeh's deduction rules are proved to be valid in the defined semantics. Furthermore, a discussion is provided on how the proposed language may be used in practice.
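A generalized constraint of the basic form "X isr R" can be sketched concretely for the possibilistic case, where R is a fuzzy set acting as a possibility distribution on the values of X. The trapezoidal set "warm" below is a made-up example chosen for illustration, not one of Zadeh's:

```python
def trapezoid(a, b, c, d):
    """Membership function of a trapezoidal fuzzy set with a <= b <= c <= d."""
    def mu(x):
        if x <= a or x >= d:
            return 0.0
        if b <= x <= c:
            return 1.0
        if x < b:
            return (x - a) / (b - a)   # rising edge
        return (d - x) / (d - c)       # falling edge
    return mu

# Possibilistic generalized constraint "Temperature is warm" (X isr R with
# r = possibilistic): mu gives the degree of possibility of each value of X.
warm = trapezoid(15, 20, 26, 32)

assert warm(23) == 1.0              # fully compatible with the constraint
assert warm(10) == 0.0              # outside the support
assert abs(warm(17) - 0.4) < 1e-9   # partially possible value
```

A formal semantics for GCL, as the paper develops, then defines how such constraints compose and which deduction rules (e.g. Zadeh's extension-principle-based rules) preserve validity.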

Journal ArticleDOI
31 Dec 2013
TL;DR: The author sketches a formal grammar generating all, and only, the narrative structures represented through Greimas’s metalanguage, and expresses his sympathy for a morphodynamic framework.
Abstract: According to the author, Hjelmslev’s Glossematics is not adequate as a basis for Greimas’s narrative structures, for several reasons. First, Glossematics could be effectively encoded as a Turing machine. Unfortunately, it would be generative only in a weak sense: it could generate every sort of narrative structure, but not their structural descriptions. A comparison with Propp’s model underlines another shortcoming of Glossematics: it is not recursive. For these reasons, it cannot generate the infinite hierarchical self-embedded structures which represent the relation between base-narrative programs and use-narrative programs. The author sketches a formal grammar generating all, and only, the narrative structures represented through Greimas’s metalanguage. Finally, after a critical assessment of this generative model, its goals and its shortcomings, the author expresses his sympathy for a morphodynamic framework.

01 Jan 2013
TL;DR: The results indicate that the metalanguage included in three popular EFL textbooks compatibly reflects the trend of the related research during these years, i.e. a period of favoring grammar, followed by a phase of deemphasizing it, and finally a revival of grammar instruction.
Abstract: Formal grammar instruction in general – as well as teaching metalanguage in particular – has always generated a great deal of heated debate among researchers, teachers, and also materials developers. Metalanguage can be considered as a good touchstone of the emphasis that different textbooks put on formal grammar instruction. The present study, therefore, investigates the quality and quantity of metalanguage embodied in three popular English as a Foreign Language (EFL) textbooks, taught successively in one of the largest language institutes in Iran from 1996 to the present day. The results indicate that the metalanguage included in these textbooks compatibly reflects the trend of the related research during these years, i.e. a period of favoring grammar, followed by a phase of deemphasizing it, and finally a revival of grammar instruction. The results of the study have also implications for materials writers and teachers, which are discussed at the end.

Journal ArticleDOI
TL;DR: A pumping lemma for random permitting context languages and a shrinking lemma for random forbidding context languages are proven, and a new necessary condition for context-free languages is presented.

Patent
Ian Niles1
12 Mar 2013
TL;DR: In this article, a taxonomy slicer is applied to the subset of manually-tagged categories to determine parent/child relationships for each category in the subset, and to assemble the categories into the taxonomic view.
Abstract: Systems, methods, and computer-readable storage media are provided for generating a taxonomic view from a standard taxonomy and generating audience segments for targeting using a formal grammar. A manually-tagged subset of categories of the standard taxonomy is provided to a server. Each category in the subset of categories is tagged based on an entity's own legacy taxonomy. A taxonomy slicer is applied to the subset of manually-tagged categories to determine parent/child relationships for each category in the subset, and to assemble the categories into the taxonomic view. The taxonomic view maintains interoperability with the standard taxonomy, and with other entities using views of the standard taxonomy. Further, a formal grammar is defined for specifying online behaviors for targeting, and is applied to extracted categories of the standard taxonomy and/or taxonomic view. The formal grammar may be programmatically applied during behavioral targeting.

Proceedings ArticleDOI
01 Jan 2013
TL;DR: This Universal Scheme for modeling Energy Systems (USES) is the preferred language for the power system simulation presented here, which is part of the Scalable Electro-Mobility Simulation (SEMSim) platform.
Abstract: With the increasing complexity of real-world energy systems, the modeling process becomes even more crucial when large-scale simulations are conducted. Because such simulations are computationally intensive and therefore require efficient simulation models, a modeling scheme with a well-defined formal syntax is developed and, together with its meta model, proposed in this paper. This Universal Scheme for modeling Energy Systems (USES) is the preferred language for the power system simulation presented here, which is part of our Scalable Electro-Mobility Simulation (SEMSim) platform. To investigate the impact of electro-mobility on the city infrastructure, the transmission system of Singapore is described as a real, a data, and a formal model, the first two based on USES.

01 Jan 2013
TL;DR: A formal grammar is introduced as a computational approach to the generation of design; it allows the creation of new ornamental structures and can lead to a new language of architectural forms.
Abstract: We introduce a formal grammar as a computational approach to the generation of design. While existing shape-grammars transform primitive shapes as lines or rectangles, the presented production system specifically addresses polyhedral objects described by three-dimensional meshes composed of vertices, edges and faces. The parameters of the transformation rules are sensitive to topological and topographical properties of the selected input mesh. We demonstrate that this approach allows the creation of new ornamental structures and can lead to a new language of architectural forms.
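One such production rule on a polyhedral mesh can be sketched as follows (a hypothetical rule, not one from the paper): replace a selected face by a fan of triangles around its centroid lifted by a fixed height, so a single rewrite step adds a vertex and changes the face topology:

```python
# Minimal sketch of one mesh production rule (hypothetical; the paper's rules
# are parameterised by topological and topographical properties of the mesh).
def apply_pyramid_rule(vertices, faces, face_index, height=1.0):
    """Replace faces[face_index] by a fan of triangles around a lifted apex."""
    face = faces[face_index]
    n = len(face)
    cx = sum(vertices[i][0] for i in face) / n          # face centroid x
    cy = sum(vertices[i][1] for i in face) / n          # face centroid y
    cz = sum(vertices[i][2] for i in face) / n + height  # centroid z, lifted
    apex = len(vertices)                                 # index of new vertex
    new_vertices = vertices + [(cx, cy, cz)]
    fan = [[face[i], face[(i + 1) % n], apex] for i in range(n)]
    new_faces = faces[:face_index] + faces[face_index + 1:] + fan
    return new_vertices, new_faces

# One quad face becomes four triangles sharing the new apex vertex.
v = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
f = [[0, 1, 2, 3]]
v2, f2 = apply_pyramid_rule(v, f, 0)
print(len(v2), len(f2))  # 5 4
```

Iterating such rules, with parameters driven by properties of each selected face, is what lets a small rule set generate rich ornamental structures.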

01 Jun 2013
TL;DR: This paper proposes using a controlled natural language, namely International Technology Alliance Controlled English (CE), and CE-based tools to improve cross-linguistic/cross-cultural communication and to enable multi-nation teams to work together effectively and efficiently.
Abstract: Coalition operations involve multi-team and/or multi-nation collaborations. Linguistic variations and cultural differences often create unexpected challenges for effective communication, and thus for Command and Control (C2) during military operations. In this paper, we propose using a controlled natural language, namely International Technology Alliance Controlled English (CE), and CE-based tools to improve cross-linguistic/cross-cultural communication. We will discuss various types of linguistic variations and cultural differences manifested by U.S. and British groups during coalition operations. The differences include lexical differences, and more importantly, differences in language use. These differences often result in miscommunication that can impede effective operations. CE (Mott 2010) is a subset of English with a restricted grammar that is based on a formal syntax and semantics. CE is human friendly but it allows machine processing. The current version of CE provides a common form of expression that promotes standard terminology and usage to reduce ambiguity in person-to-person communication; allows end-users to create new concepts with associated syntax and semantics; and provides a basis for automated and assistive applications and tools that support natural human-computer interaction, reasoning, and explanation. CE and CE-based tools can play an important role in facilitating cross-linguistic and cross-cultural communication and enabling multi-nation teams to work together effectively and efficiently.

Proceedings ArticleDOI
01 Jan 2013
TL;DR: Experimental results show high accuracy of the approach in translating a natural language problem into a formal description, an essential step in text-to-diagram conversion.
Abstract: Natural language geometry problems are translated into formal representation. This is done as an essential step involved in text to diagram conversion. A parser is designed that analyzes a problem statement in order to describe it as a language independent, unambiguous formal representation. Natural language processing tools and a lexical knowledge base are used to assist the parser that finally generates a graph as the parsing output. The parse graph is the formal representation of the input natural language problem. This graph is later translated into another intermediate representation consisting of a set of graphics-friendly statements. High school level geometry problems are used to develop and test the proposed methods. Experimental results show high accuracy of the approach in translating a natural language problem into a formal description.
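As a toy, single-pattern stand-in for the parser (the paper uses full NLP tools and a lexical knowledge base), one sentence form can be mapped onto relation edges of a parse graph; the pattern and relation names below are invented for illustration:

```python
import re

# Hypothetical single-pattern extractor: map one geometry sentence form onto
# language-independent relation edges of a parse graph.
PATTERN = re.compile(
    r"line (?P<x>[A-Z]{2}) is (?P<rel>parallel|perpendicular) to line (?P<y>[A-Z]{2})"
)

def parse_statement(text):
    """Return (relation, entity, entity) edges extracted from the text."""
    edges = []
    for m in PATTERN.finditer(text):
        edges.append((m["rel"], m["x"], m["y"]))
    return edges

print(parse_statement("line AB is perpendicular to line CD"))
# [('perpendicular', 'AB', 'CD')]
```

Such edges, once accumulated into a graph, form the unambiguous intermediate representation that downstream graphics-friendly statements can be generated from.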

Book ChapterDOI
01 Jan 2013
TL;DR: This paper presents a model that could solve the problem of distorted patterns by incorporating an uncertainty factor (fuzziness/distortion) into the whole process of syntactic pattern recognition.
Abstract: The process of syntactic pattern recognition consists of two main phases. In the first one the symbolic representation of a pattern is created (so-called primitives are identified). In the second phase the representation is analyzed by a formal automaton on the basis of a previously defined formal grammar (i.e. syntax analysis / parsing is performed). One of the main problems of syntactic pattern recognition is the analysis of distorted (fuzzy) patterns. If a pattern is distorted and the results of the first phase are wrong, then the second phase usually will not bring satisfactory results either. In this paper we present a model that could solve this problem by incorporating an uncertainty factor (fuzziness/distortion) into the whole process of syntactic pattern recognition. The model is a hybrid one (based on artificial neural networks and GDPLL(k)-based automata) and it covers both phases of the recognition process (primitives’ identification and syntax analysis). We discuss the application area of this model, as well as the goals of further research.

Proceedings Article
13 Jul 2013
TL;DR: The principal features of the RECON language are described, with examples, and the corresponding formal logic constructs produced by the RECON tool are shown.
Abstract: Capturing business rules in a formal logic representation supports the enterprise in two important ways: it enables the evaluation of logs and audit records for conformance to, or violation of, the rules; and it enables the conforming automation of some enterprise activities. The problem is that formal logic representations of the rules are very difficult for an industry expert to read and even more difficult to write, and translating the natural language of the enterprise to formal logic is an unsolved problem. RECON – Restricted English for Constructing Ontologies – is a subset of English that can be easily read by an industry expert, while having a formal grammar and an unambiguous translation to formal logic. This paper describes the principal features of the RECON language, with examples, and shows the corresponding formal logic constructs that are produced by the RECON tool.

Journal Article
TL;DR: For instance, it has been said that the Chinese did not develop abstract sciences or formal logic, because their language was not suitable for this, because it lacked a formal grammar with inflection, and because the script was pictographic or ideographic and not phonetic.
Abstract: Shall East and West never meet? Of course they do. But do they “really” understand each other? And what role does language play in this? Chinese is different from English, so is Classical Chinese from Classical Greek or Latin. But does this account for cultural differences? Do Chinese think differently from Westerners, because they speak a different language? It has often been said that the Chinese did not develop abstract sciences or formal logic, because their language was not suitable for this, because it lacked a formal grammar with inflection, and because the script was pictographic or ideographic and not phonetic. There are at least two questions in all of this: First, does language determine thought? Second, is Chinese different from Western languages in ways that play a role regarding the first question? It is on these two questions that we will focus.

Journal Article
TL;DR: The authors investigated the effect of teaching phrasal verbs as a classroom activity on Iranian EFL learners' knowledge of grammatical patterns, and found that the knowledge of phrasals verbs might enhance higher knowledge in Iranian learners of English.
Abstract: The present study aimed to investigate the effect of teaching phrasal verbs as a classroom activity on Iranian EFL learners' knowledge of grammatical patterns. The main question this study tried to answer was whether the knowledge of phrasal verbs might enhance higher knowledge of grammatical patterns in Iranian learners of English. To answer the question, 40 English intermediate trainees participated in the experiment of the study. They were randomly selected from among a population of trainees via an OPT test score of at least one standard deviation below the mean score. They were then divided into two groups of 20 and were randomly assigned to an experimental and a control group. A pretest of English grammatical patterns (sentence word order) was administered to both groups; then they were taught grammatical patterns for 8 sessions but with different methodologies: the experimental group received a treatment of phrasal verbs while the control group received a placebo. A posttest of English grammatical patterns (sentence word order) was then administered to both groups. The data of the study were analyzed using an independent-samples t-test to indicate the groups' posttest mean difference. The results indicated that the Iranian EFL learners in the experimental group received higher scores, though not significantly, in grammatical patterns after being treated with 8 sessions of phrasal verbs.

Key Words: Phrasal Verbs, Grammatical Patterns, Sentence Word Order, Iranian EFL Learners, OPT.

1. Introduction

Over the last decade the number of studies concerned with the effects of different techniques, such as familiarizing learners with phrasal verbs, on learning a second or foreign language has increased considerably. This is due to the fact that many linguistic or non-linguistic factors can strongly influence language learning in individuals.
In this study, grammar is emphasized because it is considered one of the most basic components in learning English as a foreign language. The purpose of this study was to explore the probable effect of making the learners familiar with phrasal verbs, as an independent variable, on the learners' knowledge of grammatical patterns (word order), as a dependent variable. It is worth mentioning that by making the subjects familiar with phrasal verbs, the researcher can at least help the subjects become aware of the appropriate sequence and position of verbs, their particles, and their related objects in sentences.

Syntax, along with concepts such as grammar, grammatical patterns of sentences, and word order, which are deemed to be subcategories of the broad notion of syntax, is one of the first notions highlighted in this paper. The syntax of the English language has several interesting properties which have often been discussed in many works of research. Syntactic theories are commonly divided into two broad types, formal and functional. Linguistic form is what the formal theories of syntax focus on, relegating meaning to a peripheral position; by contrast, functional theories tend to focus on the functions that language serves, and the ways that syntax is organized to serve these functions; in other words, meaning plays a central role.

An enormous range of variation can be found in the extent to which theories are formal or functional within these two camps. Extreme functional theories recognize only meaning or functions, and deny the existence of structure in syntax. In extreme versions of formal syntax, by contrast, grammar tends to be conceptualized as an abstract algebraic system specifying the acceptable strings of symbols making up a language. Meaning is considered irrelevant, and syntax (in whole or part) is seen as constituting an autonomous system. The majority of theories fall somewhere between the two poles. These days there are different views regarding syntactic theories. Most syntacticians agree that there are limits on the range of syntactic variation possible among languages. …

Book ChapterDOI
01 Jan 2013
TL;DR: This work gives a formal syntax and semantics for the extended diagram language before introducing a collection of reasoning rules encapsulating logical equivalence and logical consequence and proves that the resulting logic is sound, complete and decidable.
Abstract: Diagrammatic reasoning can be described formally by a number of diagrammatic logics; spider diagrams are one of these, and are used for expressing logical statements about set membership and containment. Here, existing work on spider diagrams is extended to include constant spiders that represent specific individuals. We give a formal syntax and semantics for the extended diagram language before introducing a collection of reasoning rules encapsulating logical equivalence and logical consequence. We prove that the resulting logic is sound, complete and decidable.

Journal ArticleDOI
TL;DR: It turns out that the system of rules, axioms and constraints of grammar cannot be explicitly represented in a general architecture of the language faculty — which circumvents the ontological mismatch of mental representations and formal/axiomatic properties of language.
Abstract: This paper explores the link between rules of grammar, grammar formalisms and the architecture of the language faculty. In doing so, it provides a flexible meta-level theory of the language faculty through the postulation of general axioms that govern the interaction of different components of grammar. The idea is simply that such an abstract formulation allows us to view the structure of the language faculty independently of specific theoretical frameworks/formalisms. It turns out that the system of rules, axioms and constraints of grammar cannot be explicitly represented in a general architecture of the language faculty — which circumvents the ontological mismatch of mental representations and formal/axiomatic properties of language. Rather, the system of rules, axioms, constraints of grammar is intentionally projected by humans, and this projection realizes/instantiates what Dascal (1992) calls ‘psychopragmatics’. Relevant implications for linguistic theory, learnability and (computational) models of language processing are also explored.