
Showing papers by "Jacob Eisenstein published in 2005"



19 Apr 2005
TL;DR: Gesture features are found to correlate well with sentence boundaries, yet they improve the overall performance of a language-only system only marginally; a regression analysis shows they are largely redundant with the language-model and pause features, which suggests that gestural features can still be useful when speech recognition is inaccurate.
Abstract: In human-human dialogues, face-to-face meetings are often preferred over phone conversations. One explanation is that non-verbal modalities such as gesture provide additional information, making communication more efficient and accurate. If so, computer processing of natural language could improve by attending to non-verbal modalities as well. We consider the problem of sentence segmentation, using hand-annotated gesture features to improve recognition. We find that gesture features correlate well with sentence boundaries, but that these features improve the overall performance of a language-only system only marginally. This finding is in line with previous research on this topic. We provide a regression analysis, revealing that for sentence boundary detection, the gestural features are largely redundant with the language model and pause features. This suggests that gestural features can still be useful when speech recognition is inaccurate.
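
The sketch below is a minimal illustration of the kind of feature-combination and ablation comparison described above, assuming a logistic-regression boundary classifier over synthetic data with made-up feature names (lm_score, pause, gesture_rest); the paper's actual corpus, feature set, and model are not reproduced here.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)

# Synthetic stand-in for candidate sentence boundaries: each row has a
# language-model score, a pause duration, and one binary gesture indicator.
# These feature names are hypothetical; the paper's gesture features are
# hand-annotated from video.
n = 2000
y = rng.integers(0, 2, size=n)                              # 1 = sentence boundary
lm_score = y * 1.5 + rng.normal(size=n)                     # language-model cue
pause = y * 1.0 + rng.normal(size=n)                        # pause-duration cue
gesture_rest = (y & (rng.random(n) < 0.6)).astype(float)    # e.g. hands return to rest

X_lang = np.column_stack([lm_score, pause])
X_all = np.column_stack([lm_score, pause, gesture_rest])

# Ablation: train a language-only model and a language + gesture model,
# then compare held-out F1.
for name, X in [("language-only", X_lang), ("language + gesture", X_all)]:
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = LogisticRegression().fit(X_tr, y_tr)
    print(name, "F1 =", round(f1_score(y_te, clf.predict(X_te)), 3))

Comparing the two F1 scores is the ablation; with real annotations, the size of the gap reflects how much non-redundant information the gesture features add beyond the language and pause cues.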

9 citations


01 Jan 2005
TL;DR: This work explores whether gestural cues can improve sentence boundary detection in informal, spontaneous speech and reports the probabilities of various gesture features conditioned on the presence of sentence boundary events.
Abstract: One of the ways in which gesture supplements communication is by helping to identify the “meta-data” that comprises the organizational structure of the discourse. One such type of meta-data is sentence unit boundaries; the detection of sentence boundaries in informal, spontaneous speech is a difficult problem. In this abstract, we explore whether gestural cues can improve sentence boundary detection. We have hand-annotated a corpus of 26 short videos of spontaneous speech and gesture (see [1] for a more complete account of this research). We employ the movement phase and gesture phrase taxonomies as summarized by McNeill [2]. These gesture features correlate well with sentence boundaries in this corpus. Table 1 shows the probabilities of various gesture features, conditioned on the presence of sentence boundary events.
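
As a rough illustration of the quantity reported in Table 1, the sketch below computes P(gesture feature | sentence boundary) from a toy list of annotations; the phase labels follow McNeill-style movement phases, but the data are invented for illustration and are not the corpus counts behind the paper's table.

from collections import Counter

# Hypothetical annotated instances: (gesture movement phase, is_sentence_boundary).
annotations = [
    ("rest", True), ("stroke", False), ("hold", False), ("rest", True),
    ("retraction", True), ("stroke", False), ("preparation", False),
    ("rest", False), ("retraction", True), ("hold", True),
]

# Count each phase label among the instances that fall at a sentence boundary.
boundary_counts = Counter(phase for phase, is_boundary in annotations if is_boundary)
n_boundaries = sum(1 for _, is_boundary in annotations if is_boundary)

# P(gesture feature | sentence boundary) for each movement-phase label.
for phase, count in boundary_counts.items():
    print(f"P({phase} | boundary) = {count / n_boundaries:.2f}")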

2 citations