scispace - formally typeset

Indexed language

About: Indexed language is a research topic. Over its lifetime, 334 publications have been published within this topic, receiving 11,000 citations.


Papers
Book Chapter, 01 Jan 1968
TL;DR: The authors constructed grammars for the number names of the four major Dravidian languages: Tamil, Malayalam, Kannada, and Telugu, for numbers having up to ten digits.
Abstract: Generative grammars for number names have been written by Van Katwijk [1] and Brainerd [2], and in this paper we have constructed grammars for the number names of the four major Dravidian languages: Tamil, Malayalam, Kannada, and Telugu. In the Indian system of numbers, a hundred thousand is called a lakh, a million ten lakhs, and ten million a crore (written as kōdi in Tamil). Even though Tamil names for numbers much larger than the crore are found in ancient treatises [3] on mathematics, these names are not commonly known and not used in practice, and hence we have restricted ourselves to numbers having up to ten digits.
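The Indian grouping described above (lakh = 10^5, crore = 10^7) can be sketched in code. This is a minimal illustration, not the paper's grammars: it uses English/loan words rather than the Tamil, Malayalam, Kannada, or Telugu forms, and covers numbers of up to ten digits as the paper does.

```python
def indian_number_name(n: int) -> str:
    """Name n (0 <= n < 10**10) using thousand/lakh/crore groupings."""
    assert 0 <= n < 10**10
    if n == 0:
        return "zero"
    crore, rest = divmod(n, 10**7)       # 1 crore = 10^7
    lakh, rest = divmod(rest, 10**5)     # 1 lakh  = 10^5
    thousand, units = divmod(rest, 10**3)
    parts = [f"{value} {word}"
             for value, word in [(crore, "crore"), (lakh, "lakh"),
                                 (thousand, "thousand")]
             if value]
    if units:
        parts.append(str(units))
    return " ".join(parts)

print(indian_number_name(1234567890))  # 123 crore 45 lakh 67 thousand 890
```

A full generative grammar would additionally spell out each group's digits as words in the target language; the grouping step above is the part specific to the Indian system.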
Journal Article
TL;DR: In this paper, the authors introduce two new classes of internal contextual grammars, called end-marked maximal depth-first and inner end-marked maximal depth-first contextual grammars.
Abstract: In this paper, we present a few results which are of interest for the potential application of contextual grammars to natural languages. We introduce two new classes of internal contextual grammars, called end-marked maximal depth-first and inner end-marked maximal depth-first contextual grammars. We analyze the new variants with respect to the basic properties of the mildly context-sensitive languages. To this end, we show that (i) the three basic non-context-free constructions in natural languages can be realized using these variants, (ii) the membership problem for these families of languages is decidable in polynomial time, and (iii) the family of languages generated by end-marked maximal depth-first grammars contains non-semilinear languages. We also solve the following open problem posed in [3] and [1]: whether the families of languages generated by maximal depth-first and maximal local contextual grammars are semilinear.
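The core mechanism of an internal contextual grammar is easy to simulate: a derivation step picks an occurrence of a selector string s inside the current word and wraps it with a context (u, v), rewriting x1 s x2 to x1 u s v x2. The sketch below is illustrative and not from the paper; the toy grammar (axiom "ab", selector "ab", context ("a", "b")) generates only the context-free language {a^n b^n : n >= 1}, chosen to keep the mechanism visible rather than to exhibit the non-context-free power or maximality conditions the paper studies.

```python
def step(w, selectors, context):
    """All words reachable from w in one internal contextual step."""
    u, v = context
    results = set()
    for s in selectors:
        start = w.find(s)
        while start != -1:
            # x1 s x2  =>  x1 u s v x2
            results.add(w[:start] + u + s + v + w[start + len(s):])
            start = w.find(s, start + 1)
    return results

w = "ab"
for _ in range(3):
    (w,) = step(w, {"ab"}, ("a", "b"))  # unique successor at each step here
print(w)  # aaaabbbb
```

The maximal depth-first variants discussed in the abstract further restrict which occurrence of the selector may be chosen, which is what keeps the derivations tractable while still reaching non-context-free constructions.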
Proceedings Article, 19 May 1965
TL;DR: These sets are developed by subjecting generators of context-sensitive grammars to abstract versions of a "hardware" restriction to which the users of natural languages, unlike the describers of natural languages, might be subject.
Abstract: We discuss some sets of grammars whose generative power lies between that of the set of context-free grammars and that of the set of context-sensitive grammars. These sets are developed by subjecting generators of context-sensitive grammars to abstract versions of a "hardware" restriction to which the users of natural languages, unlike the describers of natural languages, might be subject.
Book Chapter
TL;DR: This chapter introduces stochastic string languages and describes three applications: communication and coding; syntactic pattern recognition; and error-correcting parsing.
Abstract: Formal languages and their corresponding automata and parsing procedures have been used in the modeling and analysis of natural and computer languages and in the description and recognition of patterns. One natural extension of one-dimensional string languages to higher dimensions is tree languages. Interesting applications of tree languages to picture recognition include the classification of bubble-chamber events, the recognition of fingerprint patterns, and the interpretation of LANDSAT data. In some applications, a certain amount of uncertainty exists in the process under study. This chapter introduces stochastic string languages and describes three applications: communication and coding; syntactic pattern recognition; and error-correcting parsing. Stochastic tree languages are also introduced, and their application to texture modeling is described. Two natural ways of extending formal languages to stochastic languages are to randomize the productions of grammars and the state transitions of recognition devices, respectively. Some major results in stochastic grammars and stochastic syntax analysis are reviewed in the chapter.
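Randomizing the productions of a grammar, the first extension mentioned above, can be sketched concretely. This is an assumed toy example, not taken from the chapter: each nonterminal carries a probability distribution over its right-hand sides, and sampling a word repeatedly draws a production according to those probabilities.

```python
import random

# Toy stochastic CFG: S -> a S b (p = 0.4) | a b (p = 0.6),
# a distribution over the language {a^n b^n : n >= 1}.
GRAMMAR = {"S": [(0.4, ["a", "S", "b"]), (0.6, ["a", "b"])]}

def sample(symbol="S", rng=random):
    """Draw one word from the stochastic grammar, expanding left to right."""
    if symbol not in GRAMMAR:            # terminal symbol
        return symbol
    r, acc = rng.random(), 0.0
    for p, rhs in GRAMMAR[symbol]:
        acc += p
        if r <= acc:
            return "".join(sample(s, rng) for s in rhs)
    # fallback for floating-point rounding at the distribution's edge
    return "".join(sample(s, rng) for s in GRAMMAR[symbol][-1][1])

random.seed(0)
print(sample())
```

The probability of a sampled word is the product of the probabilities of the productions used, which is the quantity that stochastic syntax analysis and error-correcting parsing work with.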

Network Information
Related Topics (5)
- Time complexity: 36K papers, 879.5K citations (76% related)
- Finite-state machine: 15.1K papers, 292.9K citations (76% related)
- Logic programming: 11.1K papers, 274.2K citations (75% related)
- Type (model theory): 38.9K papers, 670.5K citations (74% related)
- Concurrency: 13K papers, 347.1K citations (74% related)
Performance Metrics
No. of papers in the topic in previous years

Year  Papers
2021  1
2019  5
2018  2
2017  7
2016  15
2015  7