
Journal of Experimental Psychology:
Learning, Memory, and Cognition
Effects of Iconicity and Semantic Relatedness on Lexical Access in
American Sign Language
Rain G. Bosworth, and Karen Emmorey
Online First Publication, October 4, 2010. doi: 10.1037/a0020934
CITATION
Bosworth, R. G., & Emmorey, K. (2010, October 4). Effects of Iconicity and Semantic Relatedness
on Lexical Access in American Sign Language. Journal of Experimental Psychology: Learning,
Memory, and Cognition. Advance online publication. doi: 10.1037/a0020934

Effects of Iconicity and Semantic Relatedness on Lexical Access in
American Sign Language
Rain G. Bosworth
University of California, San Diego
Karen Emmorey
San Diego State University
Iconicity is a property that pervades the lexicon of many sign languages, including American Sign
Language (ASL). Iconic signs exhibit a motivated, nonarbitrary mapping between the form of the sign
and its meaning. We investigated whether iconicity enhances semantic priming effects for ASL and
whether iconic signs are recognized more quickly than noniconic signs are (controlling for strength of
iconicity, semantic relatedness, familiarity, and imageability). Twenty deaf signers made lexical deci-
sions to the 2nd item of a prime–target pair. Iconic target signs were preceded by prime signs that were
(a) iconic and semantically related, (b) noniconic and semantically related, or (c) semantically unrelated.
In addition, a set of noniconic target signs was preceded by semantically unrelated primes. Significant
facilitation was observed for target signs when they were preceded by semantically related primes.
However, iconicity did not increase the priming effect (e.g., the target sign PIANO was primed equally
by the iconic sign GUITAR and the noniconic sign MUSIC). In addition, iconic signs were not
recognized faster or more accurately than were noniconic signs. These results confirm the existence of
semantic priming for sign language and suggest that iconicity does not play a robust role in online lexical
processing.
Keywords: semantic priming, iconicity, American Sign Language, lexical recognition
For spoken languages, lexical priming effects have been found
for words that are phonologically, morphologically, or semanti-
cally related (e.g., Hamburger & Slowiaczek, 1996; Marslen-
Wilson, Tyler, Waksler, & Older, 1994; Meyer & Schvaneveldt,
1971). Facilitatory and inhibitory priming effects (i.e., error rate
and reaction time for a target word are reduced or increased by a
preceding prime word) provide evidence for how linguistic infor-
mation is structured and accessed in the mental lexicon. A growing
body of research is beginning to establish the nature of lexical
priming for sign languages, uncovering both parallel and unique
aspects of lexical processing in visual–manual compared with
aural–oral languages. Identifying modality-independent and
modality-specific effects is imperative for determining what as-
pects of lexical processing are universal to all human languages
and for documenting how the characteristics of sign versus speech
shape the nature of lexical access and word recognition.
Phonological priming effects have been found for signed lan-
guages, despite the fact that sign phonology is not based on sound
and does not involve oral articulation. For spoken languages,
consonants and vowels constitute the basic units of phonological
structure, whereas for signed languages, handshape, location
(place of articulation), movement, and orientation constitute basic
phonological elements (for reviews, see Brentari, 1998; Sandler &
Lillo-Martin, 2006). Using a lexical decision task with sign pairs,
several studies report inhibitory (negative) priming effects when
prime and target signs share the same location (Carreiras,
Gutiérrez-Sigut, Baquero, & Corina, 2008; Corina & Emmorey,
1993; Corina & Hildebrandt, 2002). Carreiras et al. (2008) pro-
posed that this inhibitory effect is due to activation of lexical
competitors by the prime sign, which slows recognition of the
target sign, and they suggested that this effect parallels the inhi-
bition observed when spoken prime–target word pairs share initial
phonemes (e.g., Hamburger & Slowiaczek, 1996). No significant
priming effects have been observed for prime–target signs that
have the same handshape (Carreiras et al., 2008; Corina & Em-
morey, 1993), and mixed results are reported for phonological
priming with movement (Corina & Emmorey, 1993; Dye & Shih,
2006). It is currently unclear why different priming patterns are
observed for different phonological units in sign language, but the
answer likely lies in the nature of sign-specific phonological
representations—for example, handshape may be best treated as a
complex autosegment that is not easily primed (Sandler, 1986)—
and/or in the nature of visual processing—for example, location
information is perceived prior to movement (Emmorey & Corina,
1990), which could lead to early lexical competition (Carreiras et
al., 2008).
Morphological priming has also been observed for American
Sign Language (ASL). Emmorey (1991) used repetition priming to
investigate the organization of morphologically complex signs in
the ASL lexicon (see also Hanson & Feldman, 1989).
Rain G. Bosworth, Department of Psychology, University of California,
San Diego; Karen Emmorey, School of Speech, Language, and Hearing
Sciences, San Diego State University.
This research was supported in part by Grant R01 DC010997 to Karen
Emmorey from the National Institute on Deafness and Other Communica-
tion Disorders. Thanks to Kevin Clark and Lucinda Batch for assistance in
creating video images for this study. Special thanks to Dave Swinney for
guidance and to the Deaf people who made this work possible.
Correspondence concerning this article should be addressed to Karen
Emmorey, Laboratory for Language and Cognitive Neuroscience, 6495
Alvarado Road, Suite 200, San Diego, CA 92120. E-mail: kemmorey@
mail.sdsu.edu

Two separate experiments showed that verbs inflected with aspect morphol-
ogy (but not with agreement morphology) produced strong facil-
itation for later recognition of the base verb (i.e., the citation form
of the same verb). The task was continuous lexical decision and
approximately 1 min (30 items) intervened between the prime and
target signs. Repetition priming was not observed for nonsigns
produced with ASL aspectual inflections, which indicates that the
facilitation effect was a true lexical effect and not due to episodic
memory or to priming at the phonological level. Repetition prim-
ing with morphologically related forms is generally interpreted as
an index of the interrelation among morphologically related forms
in the lexicon (see Marslen-Wilson et al., 1994, for a review).
Surprisingly, there has been only one (unpublished) study in-
vestigating semantic priming in a signed language, although two
recent studies have documented semantic interference effects in
sign production (Baus, Gutiérrez-Sigut, Quer, & Carreiras, 2008;
Corina & Knapp, 2006). Corina and Emmorey (1993) asked deaf
ASL signers to make a lexical decision to the second sign in a
prime–target pair and found significantly faster response times
when targets were preceded by semantically related primes (e.g.,
PAPER–PENCIL; HOT–COLD; CAR–TRAFFIC).¹ This finding
suggests that semantic similarity effects are universal and may
reflect modality-independent principles of semantic organization
and representation. Semantic priming effects can be accounted for
by spreading activation across nodes within a lexical network (e.g.,
Collins & Loftus, 1975) or by activation of overlapping semantic
features (e.g., Cree, McRae, & McNorgan, 1999).
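To make the feature-overlap account concrete, the sketch below (our illustration, not part of the original study) scores prime–target relatedness as the proportion of shared semantic features; the feature sets are invented placeholders.

```python
# Minimal sketch of a feature-overlap view of semantic relatedness, in the
# spirit of Cree, McRae, & McNorgan (1999). Feature sets are invented.

def feature_overlap(features_a: set, features_b: set) -> float:
    """Jaccard overlap between two semantic feature sets (0 = disjoint, 1 = identical)."""
    return len(features_a & features_b) / len(features_a | features_b)

PIANO = {"instrument", "has_keys", "makes_music", "played_with_hands"}
GUITAR = {"instrument", "has_strings", "makes_music", "played_with_hands"}
TOMATO = {"food", "red", "round", "grows_on_vine"}

print(feature_overlap(PIANO, GUITAR))  # high overlap -> related prime, more pre-activation of the target
print(feature_overlap(PIANO, TOMATO))  # low overlap  -> unrelated prime, little pre-activation
```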
Our goals in the current study are to replicate the semantic
priming effects initially observed by Corina and Emmorey (1993)
and to examine a modality-specific semantic property of sign
language: the iconicity of linguistic forms. Sign languages exhibit
a greater capacity for iconic representation than do spoken lan-
guages because the visual–manual modality provides richer re-
sources for creating structural similarities between phonological
form and meaning. Spoken languages have iconic words that
sound like their referents, for example, onomatopoetic words that
denote animal sounds (in English, meow or moo) or reflect the
sounds of actions (pop or crash). However, such sound-based
iconicity is relatively rare, perhaps because most phenomena are
not easily depicted with sound, which is a one-dimensional se-
quential medium. In contrast, the visual three-dimensional modal-
ity of sign languages allows for iconic expression of a wide range
of basic conceptual structures, such as object and human actions,
movements, locations, and shapes (see Taub, 2001, for extensive
discussion). For example, the signs illustrated in Figures 1A and
1B all bear a resemblance to the concepts that they denote. The
ASL signs PIANO and GUITAR depict how these instruments are
played, whereas the signs BOOK and WRITE depict properties of
the object and the action that they denote.
There is currently an active debate about whether iconicity plays
a significant role in the representation and/or processing of sign
languages. Some have argued for a very strong link between form
and meaning such that there is no level of strictly meaningless
units in sign language (e.g., Armstrong, Stokoe, & Wilcox, 1995;
Wilcox, 2004), whereas others consider iconicity an attribute of
signs that is not linguistically relevant for sign language processing
(e.g., Emmorey, 2002; Klima & Bellugi, 1979; Newport & Meier,
1985). The evidence for these views is mixed. For example,
Poizner, Bellugi, and Tweney (1981) showed that iconicity does
not bestow a processing or memory advantage for short-term
recall. Iconic signs were remembered as accurately as noniconic
signs by deaf ASL signers in an immediate serial recall task.
Further, iconic and noniconic signs are equally impaired with sign
language aphasia (Marshall, Atkinson, Smulovitch, Thacker, &
Woll, 2004), and production of iconic and noniconic signs engages
the same language-related neural regions (Emmorey et al., 2004).
Similarly, iconicity does not appear to guide early sign language
acquisition in children. For example, iconic signs are not learned
first and are not overrepresented in the early vocabularies of
ASL-learning children (Anderson & Reilly, 2002; Orlansky &
Bonvillian, 1984). Iconicity appears to be ignored in the acquisi-
tion of pronouns, negation, and the directional aspect of verb
agreement (Anderson & Reilly, 1997; Meier, 1987; Petitto, 1987).
For very young hearing children, iconic and arbitrary referential
gestures are learned equally well (Namy, Campbell, & Tomasello,
2004). Together, these studies suggest that iconicity does not play
a significant role in language processing.
However, there is growing evidence that semantic processing
for sign language can be facilitated by iconicity. For example,
Thompson, Vinson, and Vigliocco (2009) recently reported that
iconicity aids lexical retrieval in a sign–picture verification task.
Signers had to decide whether a sign and a picture referred to the
same object, and the iconic relationship between the sign and the
picture was manipulated. For example, the beak of a bird is
depicted in the ASL sign BIRD, and this property is salient in a
picture of a bird’s head in profile but not in a picture of a bird in
flight. Response times were faster when the property that was
iconically depicted in the sign (e.g., the beak of the bird) was
highlighted in the corresponding picture. Similar results were
reported by Grote and Linz (2003) for German Sign Language and
by Ormel (2008) for Sign Language of the Netherlands. In addi-
tion, Vigliocco, Vinson, Woolfe, Dye, and Woll (2005) found that
iconicity influenced semantic similarity judgments for British Sign
Language. These studies suggest that iconicity may confer a se-
mantic processing advantage when there is a close mapping be-
tween meaning and phonological form.
In this study, we investigated whether iconicity enhances semantic
priming effects for ASL and whether iconic signs are recognized more
quickly than noniconic signs. Iconic signs may be more effective
semantic primes than noniconic signs because the phonological
representations of iconic signs have features that are grounded in
perception and action (at least historically). Vigliocco et al. (2005)
suggested that iconic signs stimulate mental imagery within a
semantic field, and it is possible that shared imagery will enhance
semantic priming effects as well as lexical recognition. We se-
lected the lexical decision paradigm to test this hypothesis, rather
than the sign–picture verification task, in order to (a) measure
lexical semantic priming effects and (b) determine whether ico-
nicity plays a role in sign recognition itself. In sign–picture veri-
fication tasks, decision times are recorded to the pictures, not to the
signs. We hypothesized that if iconic signs have stronger connec-
tions between phonological form and semantic features, then these
signs may be recognized more quickly and more accurately than
noniconic signs.
¹ Words in capital letters represent English glosses (the nearest equivalent translation) for ASL signs.

Method
Participants
Twenty prelingually and profoundly deaf participants (6 men, 14 women; mean age = 27 years, SD = 6 years) were tested at
Gallaudet University in Washington, DC; at the Salk Institute for
Biological Studies in San Diego, California; or at Deaf Community
Services in San Diego. All participants were exposed to ASL by
the age of 5 years, and all but three participants had at least one
deaf signing parent or older sibling. All participants used ASL on
a daily basis as their primary and preferred means of communica-
tion.
In addition, 68 hearing participants from the University of
California, San Diego, with no knowledge of ASL rated a large
corpus of ASL signs for iconicity and semantic relatedness (on the
basis of their English translations). An additional five deaf signers
who did not participate in the experiment provided iconicity,
semantic relatedness, and familiarity ratings for the final subset of
stimuli.
Materials
Semantically related and unrelated prime–target sign pairs were
created in which the prime was either iconic or noniconic (see
Figure 1 and Appendix). For these sign pairs, the target sign was
always iconic. A second set of prime–target pairs was created in
which the target sign was noniconic and the prime sign was
semantically unrelated to the target. The first set of prime–target
pairs was designed to allow us to investigate whether iconicity
enhances semantic priming, and the second set was designed to
help us determine whether iconic signs are recognized faster than
noniconic signs in a lexical decision task when both are preceded
by primes that are semantically unrelated.
Hearing nonsigners (n = 68) rated 234 ASL signs for degree of iconicity, using a scale of 1 (noniconic) to 5 (very iconic). The
stimuli were chosen with the expectation that about half would be
rated as iconic and half as noniconic, with some signs also falling
in the middle of the iconicity range. Prior to the rating task,
participants were given examples of clearly iconic signs that have
very transparent meanings (e.g., EAT is made by bringing the hand
to the mouth as if holding food) and examples of signs that are
arbitrary and clearly noniconic (e.g., NAME is made with two U
handshapes, index and middle fingers extended, contacting each
other). After watching and hearing a fluent ASL signer sign and
verbalize each item individually, the participants silently marked
their ratings on answer sheets. There were five practice items.
Participants were tested in groups of five.
After the iconicity ratings, the same 68 participants were asked
to rate 195 pairs of written words for semantic relatedness on a scale of 1 (no semantic relationship) to 7 (strongly related in meaning). The written word pairs represented English translations
of ASL sign pairs that could potentially be used in the experiment.
Both semantically related and associated pairs were included in the
stimuli set. Participants were instructed to assign high numbers to
pairs that go together, like king–queen, and to pairs that have similar meanings, like bird–duck. There were 10 practice pairs.
Items were counterbalanced across participants such that no par-
ticipant saw the same item in the iconicity and semantic related-
ness rating tasks. For both rating tasks, participants were encour-
aged to use the entire scale in making judgments.
Items with mean iconicity ratings at or above 4 were selected as
possible iconic signs and items with mean iconicity ratings at or
below 1.5 were selected as potential noniconic signs to be used in
the experiment. Word pairs with a mean semantic relatedness
rating at or above 5.5 were considered semantically related, and
pairs with a mean semantic relatedness rating at or below 2.5 were
considered unrelated pairs. Signs (n = 131) and sign pairs (n = 105) meeting these criteria were presented to five deaf participants
for further iconicity and semantic relatedness ratings. Signs and
word pairs that fell in the middle of the rating scales were also
included. The procedure for collecting these ratings was the same
as the procedure for the hearing participants.

Figure 1. Illustration of two iconic target signs (A) and the prime signs (B–D) that preceded each.

In addition, the deaf participants were asked to provide familiarity ratings for the 131
individual signs using a scale of 1 (rarely signed by deaf people)
to 5 (seen every day). These ratings were used to make sure that the
conditions were balanced for sign familiarity. Iconicity, semantic
relatedness, and familiarity ratings were not obtained on the same
day; rather, each rating task was separated by at least a week.
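The selection criteria described above reduce to simple threshold filters over the rating data. The sketch below is our own illustration, not the authors' materials; the file names and column names are hypothetical.

```python
# Hypothetical sketch of the stimulus-selection thresholds described above.
import pandas as pd

signs = pd.read_csv("sign_iconicity_ratings.csv")    # assumed columns: gloss, mean_iconicity (1-5 scale)
pairs = pd.read_csv("pair_relatedness_ratings.csv")  # assumed columns: prime, target, mean_relatedness (1-7 scale)

# Iconicity thresholds: >= 4 counted as iconic, <= 1.5 as noniconic.
iconic_candidates = signs[signs["mean_iconicity"] >= 4.0]
noniconic_candidates = signs[signs["mean_iconicity"] <= 1.5]

# Relatedness thresholds: >= 5.5 counted as related, <= 2.5 as unrelated.
related_pairs = pairs[pairs["mean_relatedness"] >= 5.5]
unrelated_pairs = pairs[pairs["mean_relatedness"] <= 2.5]

print(len(iconic_candidates), len(noniconic_candidates), len(related_pairs), len(unrelated_pairs))
```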
The final experimental stimuli consisted of 32 noniconic target
signs preceded by semantically unrelated prime signs and 32
iconic target signs preceded by primes that were (a) iconic and
semantically related, (b) noniconic and semantically related, or (c)
semantically unrelated (see Appendix).² Tables 1 and 2 provide the mean semantic relatedness ratings, iconicity ratings, imageability ratings (from the MRC Psycholinguistic Database; Coltheart, 1981), familiarity ratings, and mean durations for the final prime signs and for the iconic and noniconic target signs, respectively. The prime signs did not differ significantly in imageability, familiarity, or duration (all Fs < 1). The iconicity ratings for the semantically unrelated primes were near the middle of the 5-point scale (M = 3.17) because half of these primes were iconic and half were noniconic. The iconic target signs and the noniconic target signs did not differ significantly in imageability, familiarity, or duration (all ts < 1). Finally, the noniconic target signs were preceded by unrelated primes with a mean semantic relatedness rating of 1.44 (SD = 0.3), which was not significantly different from the relatedness rating for the unrelated primes preceding the iconic targets (M = 1.41; see Table 1), t < 1.
In addition, 64 sign–nonsign pairs were created. The nonsigns
were created by varying one or two phonological parameters of a
real sign (e.g., BUG produced on the chest instead of the nose or
MOON produced with a 3 handshape instead of a hooked L
handshape). Nonsigns were all permissible but nonoccurring signs
in ASL. Nonsign targets were preceded by both iconic and noni-
conic prime signs. Thus, participants viewed a total of 128 stim-
ulus pairs (64 sign–sign pairs and 64 sign–nonsign pairs). The
iconic target pairs were counterbalanced across participants such
that no one saw the same target sign twice. In addition, no iconic
sign appeared as a prime and as a target for a given participant. All
participants saw the same noniconic target signs (all preceded by
unrelated primes).
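The counterbalancing just described can be pictured as a rotation of the three prime conditions across three participant lists, so that each iconic target appears once per participant and in every condition across groups. The code below is an illustrative reconstruction of that logic, not the authors' script.

```python
# Illustrative counterbalancing sketch: rotate prime conditions across three
# participant groups (Latin-square style) so no participant sees a target twice.
CONDITIONS = ["iconic_related", "noniconic_related", "unrelated"]

def assign_condition(target_index: int, participant_group: int) -> str:
    """Return the prime condition for a given iconic target in a given group."""
    return CONDITIONS[(target_index + participant_group) % 3]

# Example: conditions for the first three iconic targets in groups 0-2.
for group in range(3):
    print(group, [assign_condition(t, group) for t in range(3)])
```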
The sign and nonsign stimuli were produced by a deaf native
ASL signer and recorded using a Panasonic video camera. When
editing the stimuli, we defined the beginning of a sign or nonsign
as the moment the hand(s) entered the frame and the end as the
moment the hand(s) began to move out of the sign configuration
and back down to resting position on the lap. A tone was aligned
with the first frame of each target item, and this audio signal was
fed into a Carnegie Mellon Button Box response timer. The primes
and targets were separated by 333 ms (10 video frames), and 3.5 s
of black videotape separated each trial. The videotapes were edited
using a Panasonic AG-650 editor controller and Panasonic AG
6500 and 6300 videocassette recorders.
Procedure
The participants were tested individually using a Sony PVM
1380 Trinitron color video monitor. Response latencies were re-
corded by a Power Macintosh G-3 computer using PsyScope
software. Participants were instructed in ASL to decide whether
the second sign of each pair (the target) was a true ASL sign or a
nonsense sign as quickly as they could without making errors.
They responded by pressing the appropriate green button marked
yes or red button marked no on the button box. A practice session
of 12 trials was given to each participant.
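For readers who want to picture the trial structure, the sketch below lays out one primed lexical decision trial: prime, 333-ms interstimulus interval, target, then a timed yes/no response. The original study presented videotaped stimuli via PsyScope and a button box; the play_video and collect_response functions here are hypothetical placeholders used only to show the sequence.

```python
# Schematic trial loop for the primed lexical decision task (placeholders, not the original setup).
import time

ISI_MS = 333  # 10 video frames at roughly 30 fps separated prime and target

def run_trial(prime_clip: str, target_clip: str, play_video, collect_response) -> dict:
    play_video(prime_clip)                  # present the prime sign
    time.sleep(ISI_MS / 1000.0)             # inter-stimulus interval
    onset = time.perf_counter()
    play_video(target_clip)                 # present the target (sign or nonsign)
    response, rt = collect_response(onset)  # yes/no button press, latency measured from target onset
    return {"prime": prime_clip, "target": target_clip, "response": response, "rt_ms": rt * 1000}
```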
Results
Mean response latencies and error rates are given in Table 3.
Incorrect responses were excluded from the response latency analyses (3.1% of the data). The mean response time for the nonsigns was 1,283 ms (SD = 60 ms), with a mean error rate of 9.9% (SD = 1.8%). We conducted separate one-way analyses of variance for latencies and error rates for the iconic target signs, with prime type as the independent measure and participants (F1) and items (F2) as random factors. For response latency, the effect of prime type was significant, F1(2, 38) = 5.93, p = .006; F2(2, 62) = 4.19, p = .02. As predicted, targets preceded by semantically related primes were responded to significantly faster than were targets preceded by unrelated primes, and this was true for both iconic primes, F1(1, 19) = 9.94, p = .005; F2(1, 31) = 4.71, p = .038, and noniconic primes, F1(1, 19) = 5.76, p = .027; F2(1, 31) = 5.94, p = .02. However, the iconicity of the prime did not increase the priming effect: There was no difference between response latency for targets preceded by iconic versus noniconic primes, F1(1, 19) = 0.43, p = .52; F2(1, 31) = 0.05, p = .94. As can be seen in Figure 2, the amount of priming created by iconic and noniconic semantically related primes was nearly identical.
For error rates, the main effect of prime type was not significant by participants and approached significance by items, F1(2, 38) = 1.84, p = .17; F2(2, 62) = 3.20, p = .05. No comparison between the three prime types was significant.
Next, we examined whether iconicity speeded lexical recognition time or reduced error rate by comparing responses to the iconic and noniconic targets when both were preceded by unrelated primes (see Table 3). The latency difference between iconic versus noniconic signs was not significant by participants, F1(1, 19) = 0.23, p = .63, or by items, F2(1, 54) = 2.04, p = .16. There was also no significant difference in error rate between the two sign types, F1(1, 19) = 1.14, p = .30; F2(1, 54) = 2.02, p = .09.
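As a rough illustration of the analysis logic (not the authors' code), the sketch below runs the by-participants (F1) and by-items (F2) repeated-measures ANOVAs on correct-response latencies from hypothetical trial-level data; the file and column names are assumptions.

```python
# Sketch of F1 (by-participants) and F2 (by-items) ANOVAs on response latency.
import pandas as pd
from statsmodels.stats.anova import AnovaRM

trials = pd.read_csv("lexical_decision_trials.csv")  # assumed columns: participant, item, prime_type, rt_ms, correct
correct = trials[trials["correct"] == 1]              # errors are excluded from the latency analysis

# F1: average over items within each participant x prime-type cell, then test prime type within participants.
by_subj = correct.groupby(["participant", "prime_type"], as_index=False)["rt_ms"].mean()
f1 = AnovaRM(by_subj, depvar="rt_ms", subject="participant", within=["prime_type"]).fit()

# F2: average over participants within each item x prime-type cell, then test prime type within items.
by_item = correct.groupby(["item", "prime_type"], as_index=False)["rt_ms"].mean()
f2 = AnovaRM(by_item, depvar="rt_ms", subject="item", within=["prime_type"]).fit()

print(f1.anova_table)
print(f2.anova_table)
```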
Discussion
As expected, signers were faster when making lexical decisions
to signs that were preceded by a semantically related prime than by
an unrelated prime (see Table 3). This result confirms that seman-
tic priming is a universal linguistic process that is unaffected by
language modality. Over the past several years, psycholinguistic
research has revealed both similarities and differences in lexical
access and representation for signed and spoken languages. For
example, lexical access for both languages involves a sequential
mapping process between an incoming linguistic signal and stored
lexical representations, both words and signs must be phonologi-
² After item selection and stimulus design, it turned out that the experimental groups were not balanced for mean imageability, so eight items were removed from the noniconic target set to equate imageability (while maintaining balanced iconicity, familiarity, and semantic relatedness ratings). The summary statistics reported in Tables 1 and 2 and in the text reflect the properties of the final 24 noniconic target signs.

References
Collins, A. M., & Loftus, E. F. (1975). A spreading-activation theory of semantic processing. Psychological Review, 82, 407–428.
Coltheart, M. (1981). The MRC Psycholinguistic Database. Quarterly Journal of Experimental Psychology, 33A, 497–505.
Klima, E. S., & Bellugi, U. (1979). The signs of language. Cambridge, MA: Harvard University Press.
Meyer, D. E., & Schvaneveldt, R. W. (1971). Facilitation in recognizing pairs of words: Evidence of a dependence between retrieval operations. Journal of Experimental Psychology, 90, 227–234.