Topic

Phrase

About: Phrase is a research topic. Over its lifetime, 12,580 publications have been published within this topic, receiving 317,823 citations. The topic is also known as: syntagma & phrases.


Papers
Journal ArticleDOI
TL;DR: Pianists memorized and performed polyphonic music in which the serial distance and phrase structure between the entrances of two musical voices were varied; the results suggest structural as well as linear constraints on the planning of complex sequences.
Abstract: Two factors influence the range of planning in music performance: the structural content of musical events and the serial distances between them. In this experiment, pianists memorized and performed polyphonic music (which contained multiple simultaneous voices) in which the serial distance and phrase structure between the entrances of 2 musical voices were varied. The distance between each musical element and its influence on other elements was assessed in production errors and interonset timing measures. Errors and timing measures offered converging evidence for interactive effects of serial distance and phrase structure; intervening phrase boundaries reduced the serial distances over which musical elements influenced one another. These findings suggest structural as well as linear constraints on the planning of complex sequences.

84 citations

Book
01 Jan 1934
TL;DR: This book gives an account of the basic experiments in parapsychology out of which came the phrase "extra-sensory perception".
Abstract: The account of the basic experiments in parapsychology, out of which came the phrase, 'extra-sensory perception'.

83 citations

Proceedings ArticleDOI
14 Jun 2020
TL;DR: GSMN explicitly models objects, relations and attributes as a structured phrase, which not only allows the correspondences of objects, relations and attributes to be learned separately, but also helps learn fine-grained correspondence for the structured phrase as a whole.
Abstract: Image-text matching has received growing interest since it bridges vision and language. The key challenge lies in how to learn correspondence between image and text. Existing works learn coarse correspondence based on object co-occurrence statistics, while failing to learn fine-grained phrase correspondence. In this paper, we present a novel Graph Structured Matching Network (GSMN) to learn fine-grained correspondence. The GSMN explicitly models objects, relations and attributes as a structured phrase, which not only allows the correspondences of objects, relations and attributes to be learned separately, but also helps learn fine-grained correspondence for the structured phrase. This is achieved by node-level matching and structure-level matching. The node-level matching associates each node with its relevant nodes from another modality, where the node can be an object, relation or attribute. The associated nodes then jointly infer fine-grained correspondence by fusing neighborhood associations at structure-level matching. Comprehensive experiments show that GSMN outperforms state-of-the-art methods on benchmarks, with relative Recall@1 improvements of nearly 7% and 2% on Flickr30K and MSCOCO, respectively. Code will be released at: https://github.com/CrossmodalGroup/GSMN.

83 citations
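The node-level and structure-level matching described in the abstract can be pictured with a small sketch: each phrase-graph node attends over image nodes by embedding similarity, and the resulting per-node match scores are then fused with their graph neighbours' scores. The numpy code below is a minimal illustration under assumed shapes, names, and a simple averaging step; it is not the authors' implementation (their code is at the GitHub link above).

```python
# Minimal sketch of node-level / structure-level matching; shapes, names and
# the averaging step are illustrative assumptions, not the GSMN codebase.
import numpy as np

def l2_normalize(x, axis=-1, eps=1e-8):
    return x / (np.linalg.norm(x, axis=axis, keepdims=True) + eps)

def node_level_matching(text_nodes, image_nodes, temperature=10.0):
    """For each text node (object/relation/attribute), attend over image nodes.

    text_nodes:  (n_t, d) embeddings of phrase-graph nodes
    image_nodes: (n_i, d) embeddings of region/relation nodes
    Returns an (n_t, d) attended image representation per text node.
    """
    sim = l2_normalize(text_nodes) @ l2_normalize(image_nodes).T   # (n_t, n_i)
    attn = np.exp(temperature * sim)
    attn /= attn.sum(axis=1, keepdims=True)                        # softmax over image nodes
    return attn @ image_nodes                                      # (n_t, d)

def structure_level_matching(text_nodes, attended, adjacency):
    """Fuse each node's match score with its graph neighbours' scores."""
    node_scores = np.sum(l2_normalize(text_nodes) * l2_normalize(attended), axis=1)  # (n_t,)
    deg = adjacency.sum(axis=1, keepdims=True) + 1e-8
    neighbour_scores = (adjacency @ node_scores[:, None]) / deg                      # (n_t, 1)
    return float(np.mean(0.5 * node_scores + 0.5 * neighbour_scores.ravel()))

# Toy usage: 3 phrase nodes, 4 image nodes, 64-d embeddings.
rng = np.random.default_rng(0)
t, v = rng.normal(size=(3, 64)), rng.normal(size=(4, 64))
adj = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)  # star-shaped phrase graph
score = structure_level_matching(t, node_level_matching(t, v), adj)
print(f"image-text matching score: {score:.3f}")
```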

Proceedings Article
10 Jul 2012
TL;DR: The key innovation provided by the toolkit is that the decoder can work with various grammars and offers different choices of decoding algorithms, such as phrase-based decoding, decoding as parsing/tree-parsing, and forest-based decoding.
Abstract: We present a new open source toolkit for phrase-based and syntax-based machine translation. The toolkit supports several state-of-the-art models developed in statistical machine translation, including the phrase-based model, the hierarchical phrase-based model, and various syntax-based models. The key innovation provided by the toolkit is that the decoder can work with various grammars and offers different choices of decoding algorithms, such as phrase-based decoding, decoding as parsing/tree-parsing, and forest-based decoding. Moreover, several useful utilities are distributed with the toolkit, including a discriminative reordering model, a simple and fast language model, and an implementation of minimum error rate training for weight tuning.

83 citations
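As a rough illustration of what the phrase-based model in such a toolkit involves, the sketch below covers a source sentence with entries from a toy phrase table and picks the best-scoring monotone segmentation by dynamic programming. The phrase table, scores, and function names are hypothetical; a real decoder also handles reordering, a language model, and beam search.

```python
# Toy phrase-based translation: cover the source sentence with known phrases
# and keep the best-scoring monotone segmentation. A sketch of the idea only,
# not the toolkit's decoder or API.
import math

# Hypothetical phrase table: source phrase -> (target phrase, log-probability)
PHRASE_TABLE = {
    ("das", "haus"): ("the house", math.log(0.8)),
    ("das",): ("the", math.log(0.6)),
    ("haus",): ("house", math.log(0.7)),
    ("ist", "klein"): ("is small", math.log(0.9)),
}

def decode(source_words, max_phrase_len=3):
    """Best monotone segmentation via dynamic programming over word positions."""
    n = len(source_words)
    best = [(-math.inf, [])] * (n + 1)   # best[i] = (score, phrases) covering words[:i]
    best[0] = (0.0, [])
    for i in range(n):
        score_i, hyp_i = best[i]
        if score_i == -math.inf:
            continue
        for j in range(i + 1, min(n, i + max_phrase_len) + 1):
            entry = PHRASE_TABLE.get(tuple(source_words[i:j]))
            if entry is None:
                continue
            target, logp = entry
            if score_i + logp > best[j][0]:
                best[j] = (score_i + logp, hyp_i + [target])
    score, phrases = best[n]
    return " ".join(phrases), score

print(decode(["das", "haus", "ist", "klein"]))  # -> ('the house is small', log(0.72))
```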

Proceedings Article
04 May 2007
TL;DR: This work shows that incorporating lexical syntactic descriptions in the form of supertags can yield significantly better PBSMT systems, and describes a novel PBSMT model that integrates supertags into the target language model and the target side of the translation model.
Abstract: Until quite recently, extending Phrase-based Statistical Machine Translation (PBSMT) with syntactic structure caused system performance to deteriorate. In this work we show that incorporating lexical syntactic descriptions in the form of supertags can yield significantly better PBSMT systems. We describe a novel PBSMT model that integrates supertags into the target language model and the target side of the translation model. Two kinds of supertags are employed: those from Lexicalized Tree-Adjoining Grammar and Combinatory Categorial Grammar. Despite the differences between these two approaches, the supertaggers give similar improvements. In addition to supertagging, we also explore the utility of a surface global grammaticality measure based on combinatory operators. We perform various experiments on the Arabic to English NIST 2005 test set addressing issues such as sparseness, scalability and the utility of system subcomponents. Our best result (0.4688 BLEU) improves by 6.1% relative to a state-of-the-art PBSMT model, which compares very favourably with the leading systems on the NIST 2005 task.

83 citations
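The integration of supertags into the target side can be pictured as a log-linear combination of a word-level model and a supertag-level model scored over the same hypothesis. The sketch below uses toy bigram tables and a hand-written CCG-style tag sequence purely for illustration; it is not the authors' system, and no real supertagger is invoked.

```python
# Sketch of the log-linear idea: score a target hypothesis with a word LM plus
# a supertag LM. Probabilities and the supertag sequence are toy placeholders,
# not the output of a real LTAG/CCG supertagger.
import math

def bigram_logprob(sequence, bigram_probs, fallback=1e-4):
    """Sum of log P(token_i | token_{i-1}) under a toy bigram table."""
    total = 0.0
    for prev, cur in zip(sequence, sequence[1:]):
        total += math.log(bigram_probs.get((prev, cur), fallback))
    return total

def hypothesis_score(words, supertags, word_lm, supertag_lm,
                     w_word=1.0, w_tag=0.5):
    """Log-linear combination of the two target-side models."""
    return (w_word * bigram_logprob(words, word_lm)
            + w_tag * bigram_logprob(supertags, supertag_lm))

# Toy target hypothesis with one supertag per word (assumed given).
words = ["the", "treaty", "was", "signed"]
tags = ["NP/N", "N", "(S\\NP)/(S\\NP)", "S\\NP"]      # CCG-style categories
word_lm = {("the", "treaty"): 0.2, ("treaty", "was"): 0.1, ("was", "signed"): 0.3}
tag_lm = {("NP/N", "N"): 0.6, ("N", "(S\\NP)/(S\\NP)"): 0.2,
          ("(S\\NP)/(S\\NP)", "S\\NP"): 0.5}
print(f"hypothesis score: {hypothesis_score(words, tags, word_lm, tag_lm):.3f}")
```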


Network Information
Related Topics (5)
Sentence: 41.2K papers, 929.6K citations, 92% related
Vocabulary: 44.6K papers, 941.5K citations, 88% related
Natural language: 31.1K papers, 806.8K citations, 84% related
Grammar: 33.8K papers, 767.6K citations, 83% related
Perception: 27.6K papers, 937.2K citations, 79% related
Performance Metrics
No. of papers in the topic in previous years:
Year    Papers
2023    467
2022    1,079
2021    360
2020    470
2019    525
2018    535