Topic

Formal language

About: Formal language is a research topic. Over its lifetime, 5,763 publications have been published within this topic, receiving 154,114 citations.


Papers
Book
01 Jan 1996
TL;DR: Formal languages, automata, computability, and related matters form the major part of the theory of computation; this textbook is designed for an introductory course for computer science and computer engineering majors who have knowledge of some higher-level programming language.
Abstract: Formal languages, automata, computability, and related matters form the major part of the theory of computation. This textbook is designed for an introductory course for computer science and computer engineering majors who have knowledge of some higher-level programming language, the fundamentals of …
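As a purely illustrative aside (not taken from the textbook), the sketch below shows the kind of object such a course starts from: a deterministic finite automaton, here accepting binary strings with an even number of 1s. The function names are arbitrary.

```python
# Illustrative sketch only: a tiny deterministic finite automaton (DFA) that
# accepts binary strings containing an even number of 1s.
def make_dfa():
    states = {"even", "odd"}
    start = "even"
    accepting = {"even"}
    # Transition function: delta[(state, symbol)] -> next state
    delta = {
        ("even", "0"): "even", ("even", "1"): "odd",
        ("odd", "0"): "odd",   ("odd", "1"): "even",
    }
    return states, start, accepting, delta

def accepts(word: str) -> bool:
    _, state, accepting, delta = make_dfa()
    for symbol in word:
        state = delta[(state, symbol)]  # symbols outside {0, 1} raise KeyError
    return state in accepting

if __name__ == "__main__":
    for w in ["", "0", "1", "11", "1011"]:
        print(repr(w), accepts(w))
```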

322 citations

Journal ArticleDOI
TL;DR: An explicit expression for the membership function of the language L(G) generated by a fuzzy grammar G is given, and it is shown that any context-sensitive fuzzy grammar is recursive.
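For orientation only (this is the standard max-min formulation for fuzzy grammars in the style of Lee and Zadeh, not necessarily the explicit expression derived in the paper), the membership function of a string in a fuzzy language is usually defined as follows:

```latex
% Standard max-min membership function for a fuzzy grammar G = (V_N, V_T, P, S),
% where each production p carries a membership grade \mu(p) in (0, 1].
% Assumed textbook formulation, given here for context only.
\[
  \mu_{L(G)}(x) \;=\; \sup_{S \,\Rightarrow\, \alpha_1 \,\Rightarrow\, \cdots \,\Rightarrow\, \alpha_m \,=\, x}
  \;\min_{1 \le i \le m} \mu(p_i)
\]
% The supremum ranges over all derivations of x from the start symbol S,
% and p_i denotes the production applied at the i-th derivation step.
```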

322 citations

Proceedings Article
15 Feb 2018
TL;DR: This work proposes to use graphs to represent both the syntactic and semantic structure of code and to apply graph-based deep learning methods to learn to reason over program structures; the results suggest that these models learn to infer meaningful names and to solve the VarMisuse task in many cases.
Abstract: Learning tasks on source code (i.e., formal languages) have been considered recently, but most work has tried to transfer natural language methods and does not capitalize on the unique opportunities offered by code’s known syntax. For example, long-range dependencies induced by using the same variable or function in distant locations are often not considered. We propose to use graphs to represent both the syntactic and semantic structure of code and use graph-based deep learning methods to learn to reason over program structures. In this work, we present how to construct graphs from source code and how to scale Gated Graph Neural Networks training to such large graphs. We evaluate our method on two tasks: VarNaming, in which a network attempts to predict the name of a variable given its usage, and VarMisuse, in which the network learns to reason about selecting the correct variable that should be used at a given program location. Our comparison to methods that use less structured program representations shows the advantages of modeling known structure, and suggests that our models learn to infer meaningful names and to solve the VarMisuse task in many cases. Additionally, our testing showed that VarMisuse identifies a number of bugs in mature open-source projects.
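As a rough sketch of the general idea described above (and not the authors' actual graph construction, which uses many more edge types and a Gated Graph Neural Network on top), the hypothetical Python snippet below builds a toy program graph from source code: AST child edges for syntax, plus edges linking occurrences of the same variable as a crude stand-in for semantic structure.

```python
# Minimal sketch: build a graph over Python source code with AST child edges
# plus "same-variable" edges linking occurrences of the same identifier.
# Illustrative only; the paper's construction uses many more edge types
# (NextToken, LastWrite, LastUse, ...) and trains a GGNN on the result.
import ast
from collections import defaultdict

def build_program_graph(source: str):
    """Return (nodes, edges): AST nodes and labeled (src, dst, label) edges."""
    tree = ast.parse(source)
    nodes = list(ast.walk(tree))
    node_id = {node: i for i, node in enumerate(nodes)}

    edges = []
    # Syntactic structure: parent -> child edges from the AST.
    for parent in nodes:
        for child in ast.iter_child_nodes(parent):
            edges.append((node_id[parent], node_id[child], "Child"))

    # Crude semantic structure: chain occurrences of the same variable
    # (in ast.walk order, which only approximates source order in this sketch).
    uses = defaultdict(list)
    for node in nodes:
        if isinstance(node, ast.Name):
            uses[node.id].append(node_id[node])
    for occ in uses.values():
        for a, b in zip(occ, occ[1:]):
            edges.append((a, b, "SameVariable"))

    return nodes, edges

if __name__ == "__main__":
    nodes, edges = build_program_graph("x = 1\ny = x + 2\nprint(x, y)")
    print(len(nodes), "nodes,", len(edges), "edges")
    for a, b, label in edges:
        if label == "SameVariable":
            print(type(nodes[a]).__name__, "->", type(nodes[b]).__name__, label)
```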

321 citations

Book
01 Jan 1998
TL;DR: In Models of Computation, John Savage re-examines theoretical computer science, offering a fresh approach that gives priority to resource tradeoffs and complexity classifications over the structure of machines and their relationships to languages.
Abstract: From the Publisher: "Your book fills the gap which all of us felt existed too long. Congratulations on this excellent contribution to our field." --Jan van Leeuwen, Utrecht University

"This is an impressive book. The subject has been thoroughly researched and carefully presented. All the machine models central to the modern theory of computation are covered in depth, many for the first time in textbook form. Readers will learn a great deal from the wealth of interesting material presented." --Andrew C. Yao, Professor of Computer Science, Princeton University

"Models of Computation is an excellent new book that thoroughly covers the theory of computation, including significant recent material, and presents it all with insightful new approaches. This long-awaited book will serve as a milestone for the theory community." --Akira Maruoka, Professor of Information Sciences, Tohoku University

"This is computer science." --Elliot Winard, Student, Brown University

In Models of Computation: Exploring the Power of Computing, John Savage re-examines theoretical computer science, offering a fresh approach that gives priority to resource tradeoffs and complexity classifications over the structure of machines and their relationships to languages. This viewpoint reflects a pedagogy motivated by the growing importance of computational models that are more realistic than the abstract ones studied in the 1950s, '60s, and early '70s. Assuming only some background in computer organization, Models of Computation uses circuits to simulate machines with memory, thereby making possible an early discussion of P-complete and NP-complete problems. Circuits are also used to demonstrate that tradeoffs between parameters of computation, such as space and time, regulate all computations by machines with memory. Full coverage of formal languages and automata is included, along with a substantive treatment of computability. Topics such as space-time tradeoffs, memory hierarchies, parallel computation, and circuit complexity are integrated throughout the text, with an emphasis on finite problems and concrete computational models.

FEATURES:
- Includes introductory material for a first course on theoretical computer science.
- Builds on computer organization to provide an early introduction to P-complete and NP-complete problems.
- Includes a concise, modern presentation of regular, context-free and phrase-structure grammars, parsing, finite automata, pushdown automata, and computability.
- Includes an extensive, modern coverage of complexity classes.
- Provides an introduction to the advanced topics of space-time tradeoffs, memory hierarchies, parallel computation, the VLSI model, and circuit complexity, with parallelism integrated throughout.
- Contains over 200 figures and over 400 exercises along with an extensive bibliography.

311 citations

Journal ArticleDOI
Tobias Kuhn
TL;DR: A comprehensive survey of existing English-based controlled natural languages (CNLs) can be found in this article, where the author provides a common terminology and a common model for CNL, to contribute to the understanding of their general nature, to provide a starting point for researchers interested in the area, and to help developers make design decisions.
Abstract: What is here called controlled natural language (CNL) has traditionally been given many different names. Especially during the last four decades, a wide variety of such languages have been designed. They are applied to improve communication among humans, to improve translation, or to provide natural and intuitive representations for formal notations. Despite the apparent differences, it seems sensible to put all these languages under the same umbrella. To bring order to the variety of languages, a general classification scheme is presented here. A comprehensive survey of existing English-based CNLs is given, listing and describing 100 languages from 1930 until today. Classification of these languages reveals that they form a single scattered cloud filling the conceptual space between natural languages such as English on the one end and formal languages such as propositional logic on the other. The goal of this article is to provide a common terminology and a common model for CNL, to contribute to the understanding of their general nature, to provide a starting point for researchers interested in the area, and to help developers to make design decisions.

308 citations


Network Information
Related Topics (5)
Data structure: 28.1K papers, 608.6K citations (87% related)
Time complexity: 36K papers, 879.5K citations (86% related)
Graph (abstract data type): 69.9K papers, 1.2M citations (85% related)
Semantics: 24.9K papers, 653K citations (85% related)
Component-based software engineering: 24.2K papers, 461.9K citations (83% related)
Performance Metrics
No. of papers in the topic in previous years:
Year: Papers
2023: 7
2022: 37
2021: 113
2020: 175
2019: 173
2018: 142