
Showing papers on "Counterintuitive" published in 2010


01 Jan 2010
TL;DR: This paper found that disfluency can function as a desirable difficulty in education: increasing, rather than reducing, extraneous cognitive load in controlled ways may improve student learning and yield positive educational outcomes.
Abstract: Fortune Favors the Bold (and the Italicized): Effects of Disfluency on Educational Outcomes Daniel M. Oppenheimer (doppenhe@princeton.edu) Princeton University, Department of Psychology Green Hall, Princeton, NJ 08540 USA Connor Diemand Yauman (cdiemand@princeton.edu) Princeton University, Department of Psychology Green Hall, Princeton, NJ 08540 USA Erikka B. Vaughan (ebvaugha@umail.iu.edu) Indiana University, Department of Psychology 1101 E. 10th Street, Bloomington, IN 47405 Abstract Research has shown that disfluency – the metacognitive experience of difficulty associated with a cognitive task – engenders deeper processing. Since deeper processing typically leads to better retention, this paper examined whether decreasing perceptual fluency of educational materials would improve retention. Study 1 found that harder to read fonts led to increased retention in a controlled laboratory setting. Study 2 extended this finding to real-world classroom environments. It appears as though perceptual disfluency can function as a desirable difficulty in education. Implications and caveats are discussed. Introduction It seems logical that to effectively communicate an idea, one should present it in a manner which is clear and easy to follow. Educators follow this principle when designing textbooks: the order, wording, and formatting are designed to help students read the information with minimal effort. Indeed, there is evidence to support the notion that students benefit from decreased cognitive demands when learning new concepts (Sweller and Chandler, 1994). While it is commonly accepted that reducing extraneous cognitive load is beneficial to student learning, there is some research that seems to suggest there are exceptions to this rule. In fact, research shows that in certain instances, it may be beneficial to increase extraneous cognitive load (e.g. Bjork, 1994).
These aptly named "desirable difficulties" create additional cognitive burdens but nonetheless improve learning. For example, in one experimental paradigm (Hirshman & Bjork, 1988), participants are asked to remember pairs of words, such as "bread : butter." Hirshman and Bjork found that requiring subjects to mentally generate missing letters in a word pair, such as "bread : b_tt_r," leads to improved recall performance over participants who read the word pair without any missing letters. Bjork extended this strategy to realistic educational settings, finding that students who complete simple fill-in-the-blank sentences are better able to retain information than students who read the same sentences with the key words filled in and underlined for them (Richland, Bjork, Finley, & Linn, 2005). It seems counterintuitive that imposing unnecessary strain on students' limited cognitive capacity would actually improve performance, yet desirable difficulties seem to exploit nuances in our cognitive systems. Importantly, these instructional techniques appear sub-optimal. Without conscious recognition and implementation on the part of cognitive psychologists and educators, it is likely that these techniques would not even be considered for use. It is important to explore such techniques and seek out new methods of presentation that better reflect or utilize the way we process information. One such technique may come from explorations of the metacognitive experience of fluency: the subjective feeling of ease or difficulty which is associated with almost any mental task (Alter & Oppenheimer, 2009). For instance, a blurry photograph is disfluent because it is difficult to discern, a whisper is disfluent because it is difficult to hear, and a foreign word may be disfluent because it is difficult to pronounce.
Fluency has been shown to influence our judgments in a variety of ways, including our judgments of truth, confidence, intelligence, and familiarity (for a review, see Alter & Oppenheimer, 2009). Importantly, recent studies have begun to explore how fluency influences cognitive processing in ways that might yield positive educational outcomes. Recent work in fluency has demonstrated that when a problem is disfluent, people adopt a more deliberate processing strategy (Alter, Oppenheimer, Epley, & Eyre, 2007). In one experiment, participants were asked to read logical syllogisms and indicate whether they were true or false. Participants who read the syllogisms in a difficult-to-read (i.e., disfluent) font performed significantly better on the task than those who read the syllogisms in a clear, easy-to-read font. The authors replicated this result in three distinct cognitive domains. In this way, disfluency may be categorized as a desirable difficulty and can be used to improve student learning by encouraging students to select more accurate problem-solving strategies.

150 citations


Book
14 Jan 2010
TL;DR: Discredited but still-used methods of assessment are reviewed in this chapter, with a focus on pre-scientific, counterintuitive approaches to assessing and revealing the true nature of individuals, specifically their qualities, abilities, traits and motives.
Abstract: Since the beginning of time, individuals have had to make ‘people decisions’: who to marry, to employ, to fight. In recent decades, sociobiology and evolutionary psychology have suggested that many of these apparently quasi-logical decisions are based on powerful people markers that we respond to without being aware that we are doing so. We assess people on a daily basis. There is, however, in every culture, a rich and interesting history of the techniques groups have favoured in making people decisions. Many of these techniques have quietly passed into history but others remain in use despite being rigorously tested and found wanting. It appears that there have always been schools of thought with their ingenious methods that assess and reveal the ‘true nature’ of individuals, specifically their qualities, abilities, traits and motives. It is patently obvious that people are complex, capricious and quixotic. They are difficult to actually read, to understand and therefore to predict. Neither their virtues and values nor their potential for disaster are easily apparent. People are deceptive, both in the impression-management and the self-delusional sense. Some are self-aware: they know their strengths, limitations, even what really motivates them; they may even be able to report their condition. Many others are not. Charlatans, snake-oil salesmen and their ilk find easy pickings among those who feel they need to evaluate or assess others for work purposes. The odd thing is that many of these disproved, pre-scientific, worthless and misleading systems still exist. They have advocates who still ply their trade despite lack of evidence that their systems actually work in the sense of providing reliable and valid assessments (see Section 2.6 and Figure 2.7 for an explanation of the technical meaning of ‘reliability’ and ‘validity’, which are the two main psychometric requirements that accurate instruments ought to fulfil). We shall consider some of these.
These are essentially pre-scientific methods that pre-date the beginning of the twentieth century. Most have been thoroughly investigated and shown to be both unreliable and invalid. That is, there is ample evidence to suggest it is very unwise to use these methods in selection. However, they continue to be used. One reason for this is that scientific methods are often grounded in plain common sense, whereas these pre-scientific approaches are counterintuitive. Ironically, counterintuitive methods and approaches have wider appeal than simple, logical methods. In that sense employers and companies are fooled by non-qualified consultants because, like Oscar Wilde, they ‘believe anything as long as it is incredible’. Some of these discredited but still-used methods are reviewed in this chapter.

104 citations


Book
04 Jan 2010
TL;DR: The authors show how nearly every business and personal interaction has a game-theory component to it, and argue that mastering game theory will make you more successful in business and life.
Abstract: "I am hard pressed to think of another book that can match the combination of practical insights and reading enjoyment." (Steven Levitt) Game theory means rigorous strategic thinking. It is the art of anticipating your opponent's next moves, knowing full well that your rival is trying to do the same thing to you. Though parts of game theory involve simple common sense, much is counterintuitive, and it can only be mastered by developing a new way of seeing the world. Using a diverse array of rich case studies, from pop culture, TV, movies, sports, politics, and history, the authors show how nearly every business and personal interaction has a game-theory component to it. Mastering game theory will make you more successful in business and life, and this lively book is the key to that mastery.

97 citations


Journal ArticleDOI
TL;DR: In this paper, the authors consider how the nature and effects of neoliberal policy in education are illuminated by the outcomes of a study of white middle-class families choosing ordinary state secondary schools in England, and conclude that the conditions so generated not only provide advantages to those making conventional choices in keeping with a marketized service, but may also bring advantages to middle-class families making ‘counterintuitive’ choices.
Abstract: This article considers how the nature and effects of neoliberal policy in education are illuminated by the outcomes of a study of white middle-class families choosing ordinary state secondary schools in England. Having described the main features of the study and some of its findings, consideration is given to specific ‘global’ dimensions — one in terms of parental perceptions and the other drawing upon analysis of the global effects of neoliberalism, an example of which is illustrated with reference to an influential UK policy. The article concludes that the conditions so generated not only provide advantages to those making conventional choices in keeping with a marketized service, but that they may also bring advantages to middle-class families making ‘counterintuitive’ choices as well.

57 citations


Journal ArticleDOI
TL;DR: The normative significance of the distinction between therapy and enhancement has come under sustained philosophical attack in recent discussions of the ethics of shaping future persons by means of advanced genetic technologies.
Abstract: The normative significance of the distinction between therapy and enhancement has come under sustained philosophical attack in recent discussions of the ethics of shaping future persons by means of advanced genetic technologies. Giving up the idea that whether a condition is normal or not should play a crucial role in assessing the ethics of genetic interventions has unrecognized and strongly counterintuitive implications when it comes to selecting what sort of children should be brought into the world. According to standard philosophical accounts of the factors one should take into account when making such decisions, women are "better than men." Given the biological differences between the sexes, then, if the only concern is the capacities of an embryo rather than its capacities relative to some normatively significant baseline, there is compelling reason to choose only female embryos. In order to avoid this radical and counterintuitive conclusion, one must embrace the idea that both sexes are normal. The strength of the prima facie reasons to select or reject embryos depends on their sex, which is to say that it depends on the normal capacities of their sex. The therapy/enhancement distinction therefore plays a crucial role in determining the ethics of interventions into the genetics of future generations.

42 citations


Journal ArticleDOI
TL;DR: This paper outlines two approaches to account for the finding that concepts that are minimally counterintuitive are better remembered than intuitive or maximally counterintuitive concepts.

34 citations


Journal Article
TL;DR: The entrepreneurs and innovators who make the most of a tough business climate exploit four counterintuitive domains of opportunity: innovation, entrepreneurship, diversity, and sustainability.
Abstract: The entrepreneurs and innovators who make the most of a tough business climate exploit four counterintuitive domains of opportunity. Bhaskar Chakravorti.

19 citations


Journal ArticleDOI
TL;DR: This article found that humorous statements with parallel violations are recalled significantly better than statements which have only template-level violations, affective statements with only schema level violations, as well as intuitive statements in both immediate and 1-week follow-up sessions.
Abstract: The recent surge of interest in the cognitive science of religion has resulted in a number of studies regarding the memorability of minimally counterintuitive ideas (MCIs). The present model incorporates ontological templates and their respective inferences, and delineates between two major types of violations: schema- and template-level violations. As humor is also defined by its counterintuitiveness at the schema level, this study was designed to find what effects this emotion has on retention. Results suggest that humorous statements with parallel violations are recalled significantly better than statements which have only template-level violations, affective statements with only schema-level violations, and intuitive statements, in both immediate and 1-week follow-up sessions.

18 citations


Journal ArticleDOI
TL;DR: A field experiment conducted in a church community tested the counterintuitive notion that individuals were more likely to respond affirmatively to donation requests made through email than face to face, and demonstrated that the predicted positive interaction was moderated by the degree of in-group identification.
Abstract: A field experiment conducted in a church community tested the counterintuitive notion that individuals were more likely to respond affirmatively to donation requests made through email than face to face. According to social identity perspectives of computer-mediated communication, email increases the salience of group attributes and reduces cognitive perceptions of interpersonal differences. These processes depersonalize individuals who then become more sensitive to group norms and expectations. Analyses demonstrated that the predicted positive interaction was moderated by the degree of in-group identification, such that low identifiers were more likely to respond to email calls when the salience of social identity was heightened.

17 citations


Proceedings ArticleDOI
18 Jul 2010
TL;DR: The ‘creativity effect’ is introduced, in which the known opposition between attention and creativity is applied to paradigms used in recent tests for confidence in certain conscious experiences.
Abstract: We introduce the ‘creativity effect’, in which the known opposition between attention and creativity is applied to paradigms used in recent tests for confidence in certain conscious experiences. In particular, the way subjects could use the creativity effect to achieve greater subjective experience during low-attention paradigms is used to explain several counterintuitive phenomena.

8 citations


Journal ArticleDOI
TL;DR: This article argued that Kripke's arguments appear probative only so long as one fails to distinguish between semantics and presemantics: between the literal meanings of sentences, on the one hand, and the information on the basis of which one identifies those literal meanings, on the other.

01 Jan 2010
TL;DR: Upal argues that the memory advantages obtained by violating conceptual expectations are not limited to intuitive concepts but extend to ideas that violate cultural schemas, generalizing the minimal counterintuitiveness hypothesis beyond religious concepts.
Abstract: On Attractiveness of Surprising Ideas: How Memory for Counterintuitive Ideas Drives Cultural Dynamics M. Afzal Upal (Afzal.Upal@drdc-rddc.gc.ca) Adversarial Intent Section Defence Research & Development Canada (DRDC) Toronto 1133 Sheppard Ave W, Toronto, M3M 3B9 Abstract The emerging field of cognition and culture has had some success in explaining the spread of counterintuitive religious concepts around the world. However, researchers have been reluctant to extend its findings to explain the widespread occurrence of counterintuitive ideas in general. This article suggests a way to generalize the minimal counterintuitive hypothesis, which argues that such ideas spread because they are more memorable, to form the outline of a model of cultural dynamism which can help explain why strange and novel ideas spread more quickly than ordinary-seeming traditional ideas. Keywords: ideology, shared beliefs, counterintuitiveness.
Introduction Why do some aspects of group ideologies and cultural worldviews change over time while others stay unchanged for long periods of time? What explains the patterns of persistence and change in shared beliefs of social groups such as new religious movements and political parties? The cognition and culture researchers argue that any attempt to satisfactorily answer such questions must take the individual cognitive tendencies for communication, comprehension, and belief revision into account (Sperber, 1996). A key finding of this research has been the minimal counterintuitiveness hypothesis (Boyer, 1994, 2001), which suggests that the reason why minimally counterintuitive concepts, such as God and ghosts, dominate religious concepts is that people remember them better than intuitive and maximally counterintuitive ideas. This article first reviews the minimal counterintuitiveness hypothesis and then argues that it can be used to explain the spread of novel ideas in general and not just in the context of religious ideas. The Minimal Counterintuitiveness (MC) Hypothesis The minimal counterintuitive (MC) hypothesis posits that: 1. Most of the widespread religious concepts around the globe are minimally counterintuitive. 2. The minimally counterintuitive (MCI) concepts that violate a small number of intuitive expectations (such as a talking tree, a rock that eats, and an invisible cow) are more memorable than either intuitive concepts (such as a green tree, a brown rock, and a good person) or maximally counterintuitive concepts that violate a larger number of intuitive expectations (such as an invisible talking tree that does not occupy any space and a sad illuminant travelling rock).
While a number of subsequent empirical studies (Atran, 2004; J. Barrett & Nyhof, 2001; Boyer & Ramble, 2001; Gonce, Upal, Slone, & Tweney, 2006; Upal, 2005a; Upal, Gonce, Tweney, & Slone, 2007) have found some support for better memory for MCI concepts, some cultural scientists (Bloch, 2005; Harris & Koenig, 2002; Keller, 2004) have argued that a number of widespread religious concepts such as Gods and ghosts are maximally counterintuitive and not minimally counterintuitive as implied by the minimal counterintuitiveness hypothesis. Some cognitive scientists of religion (J. L. Barrett, 1997, 1999; J. L. Barrett & Keil, 1996; Slone, 2004) have responded by suggesting that this is because believers hold two different (“theologically correct” and “intuitive”) conceptualizations of God and that only the intuitive conceptualizations enjoy the transmission advantages because they are the only ones that are minimally counterintuitive. Barrett (1997, Page 124) says: God, and perhaps other religious objects and entities, are conceptualized on at least two different levels: the basic, everyday concept used in real-time processing of information, and the “T.C.” or theologically correct level used in theological discussion of God’s properties or activities outside of a real-time context. As was shown above, these two levels of conceptualization may represent God in substantially different ways. Thus, these cognitive scientists of religion argue, the MC hypothesis “does not apply” to the theological conceptualizations of God or to any other cultural concepts that do not involve violating expectations of intuitive reflective thinking (J. L. Barrett, 1997) (Page 127). This includes ideas that have been learned through explicit training such as the socio-cultural and religious schemas, scripts, and scientific concepts (J. L. Barrett, 2008). Another hurdle in the applicability of the MC-hypothesis to the spread of cultural beliefs in contemporary social groups is the often implicit assumption that the MC-hypothesis is only applicable to societies where oral transmission is the primary source of the transmission of cultural information. Since most modern cultural ideas are spread through pen, paper, and the internet, the MC-hypothesis may not apply to them. Previously (Upal, 2009a), I have argued against this narrow interpretation of the MC-hypothesis and suggested that memory advantages obtained by violating conceptual expectations should not be limited to “intuitive concepts”. Instead, I argued that ideas that violate cultural schemas,

Journal Article
TL;DR: Neuroscience stands at the crossroads between past beliefs that are still accepted by contemporary common sense and new, emergent findings, which are often counterintuitive for non-specialists.
Abstract: Upshot: Neuroscience is at the crossroads between past beliefs that are still accepted by contemporary common sense and new, emergent findings, which are often counterintuitive for non-specialists…

Book Chapter
01 Jan 2010
TL;DR: The authors employ a cognitive narratological approach to elements of the oral tradition from the Norwegian province Telemark, attempting to utilize tools developed in the cognitive science of religion for historiographical case studies.
Abstract: The article employs a cognitive narratological approach to elements of the oral tradition from the Norwegian province Telemark, attempting to utilize tools developed in the cognitive science of religion for historiographical case studies. The aim is to identify and account for recurring patterns both in the style and content of folk religious tales, giving way to an improved analysis of fragmented and - due to the process of textualization - ambiguous sources. Instead of asking for the "believed" substrate of the folk religious oral tradition, the analysis focuses on the stylistic function of counterintuitive concepts in the narrative context of legends about the Hidden People. Against the backdrop of cognitive story processing theory (Zwaan and Radvansky), it is shown how the narrators applied a set of narrative techniques suitable for utilizing basic cognitive mechanisms of sensory and narrative processing to create credibility. The basic traits of the legends are shown to maximize cognitive effort in story processing, forcing the recipient to recognize the coherence between highly heterogeneous information and to position him- or herself with respect to the ambivalence of the stories. This cognitive effort is often resolved by reference to superempirical elements, which are not provided in the story, but concluded by the recipient. The narrative culture not only kept their belief in the Hidden People alive, but provided an experiential reality where official Christian doctrine, skeptical thought, and traditional belief elements were synthesized, forming a dynamic field of discourse. Although mostly static in content and form, the legends were used as reactions to objections from the church and framed an arena in which norms, values, religious beliefs, and individual lifeplans could be negotiated.

Posted Content
TL;DR: Gallagher's work provides a brilliant overview of emerging knowledge that is redrawing the map of the body-mind relationship, knowledge which is often counterintuitive for non-specialists.
Abstract: Neuroscience is at the crossroads between past beliefs that are still accepted by contemporary common sense and new, emergent findings, which are often counterintuitive for non-specialists. Gallagher’s work provides a brilliant overview of this emerging knowledge that is redrawing the map of the body-mind relationship.

Proceedings ArticleDOI
09 Nov 2010
TL;DR: This paper introduces the concept of using a reasoning engine to draw causal inferences about the data and then expressing them in an explanatory narrative.
Abstract: Data-to-text natural language generation techniques do not currently impart deep meaning in their output and leave it to an expert user to draw causal inferences. Frequently, the expert is adding meaning that would be present in data sources that could be made available to the NLG system. As the system is intended to convey as much information as possible, it seems counterintuitive to require the user to add meaning that could already have been included in the system's output. In this paper, we introduce our concept of using a reasoning engine to draw causal inferences about the data and then expressing them in an explanatory narrative.

01 Jan 2010
TL;DR: This article critically examines the hypothesis that stories containing two or three counterintuitive concepts are more memorable than stories containing fewer or more such concepts, and finds a more complicated picture in which the contribution of counterintuitive concepts to global story cohesion is a key factor.
Abstract: Finding the Sweet Spot: Is There a Fixed Template for Culturally Successful Counterintuitive Narratives? M. Afzal Upal (Afzal.Upal@drdc-rddc.gc.ca) Adversarial Intent Section Defence Research & Development Canada (DRDC) Toronto 1133 Sheppard Ave W, Toronto, M3M 3B9 Abstract This article reports an investigation involving a series of studies carried out to critically examine the hypothesis that the presence of 2 or 3 counterintuitive concepts in a story makes it more memorable than stories containing fewer or more such concepts. Our results paint a more complicated picture involving a number of interacting factors, with the contribution of the counterintuitive concepts to the global story cohesion emerging as a key factor. Keywords: Memory, culture, folktales, concept learning. Introduction A number of recent studies have found that minimally counterintuitive concepts are recalled better than intuitive and maximally counterintuitive ideas (Barrett & Nyhof, 2001; Boyer, 1994, 2001; Boyer & Ramble, 2001). Better memorability for minimally counterintuitive concepts, these researchers argue, explains why such concepts form part of widespread religious beliefs and other widely shared cultural beliefs. However, as Atran (2003) has argued, these findings on their own are not sufficient to explain why most widespread cultural folktales contain only a small number of counterintuitive concepts and are mostly composed of intuitive concepts. (The rest of the article uses the term MCI concepts, or simply counterintuitive concepts, when referring to minimally counterintuitive concepts.) How and why do the apparently less memorable intuitive concepts continue to be successfully transmitted along with a small number of counterintuitive concepts? Does the presence of counterintuitive concepts improve overall recall for a story? If so, would an even larger number of counterintuitive concepts make the story even more memorable, or would memorability drop off if counterintuitive concepts are added beyond a certain number?
Norenzayan, Atran, Faulkner, and Schaller (2006) report on an investigation carried out to study these questions. They selected 42 Grimm Brothers folktales such that half of the stories were judged to be “culturally successful” (they attracted more Google hits) and the other half were considered to be “culturally unsuccessful” (because they received fewer Google hits). Counterintuitive concepts present in each story were then counted. They found that a vast majority of the culturally successful folktales had two or three counterintuitive ideas, whereas counterintuitive ideas were more evenly distributed among the unsuccessful folktales. Subjects were then asked to read the stories and answer a number of questions to determine if the subjects thought that the stories were familiar, memorable, easy to understand, easy to transmit, and interesting enough to tell others. Their results show that stories with more Google hits were judged by the subjects to be more memorable and worth telling their friends. On the basis of this evidence, Norenzayan et al. argued that stories that contain two or three counterintuitive ideas enjoy memorability advantages over stories that have fewer (0 or 1) or more (4, 5, 6, or larger) counterintuitive ideas. They further argue that this should be true for all stories, and not just Grimm Brothers' tales, or just Northern European folktales from the 19th century, or just narratives of a certain length. They call stories containing 2-3 counterintuitive concepts MCI narratives and state, “we propose that MCI narratives are culturally successful partly because they enjoy a stronger cognitive advantage in recall than other narrative templates” (Page 549) (Norenzayan, Atran, Faulkner, & Schaller, 2006).
Let us call the hypothesis that stories containing 2 or 3 counterintuitive ideas are more memorable than stories containing fewer or more such concepts the MCI-hypothesis. The objective of this paper is to carefully examine the MCI-hypothesis and its implications. This is accomplished through a series of studies. Initially, we replicate Norenzayan et al.'s methodology but then complement it with other techniques. Study I This study replicates Norenzayan et al.'s methodology for a different set of folktales. Aesop's fables are folktales credited to a Greek slave named Aesop, who is thought to have lived from 620 to 560 BC. Most of the short stories contain between 50 and 500 words and are organized around moral themes. A number of stories contain counterintuitive concepts such as anthropomorphic animals. While Aesop's fables have survived for hundreds (if not thousands) of years and are widely known around the world, not all tales are equally well known. This study used George Fyler Townsend's collection (1867) containing 350 fables. Using Norenzayan et al.'s methodology, Google hits were computed for all 350 fables by querying for “Aesop” and the title of a story (e.g., “The Hare and the Tortoise”). Besides Google's initial estimate of the number of matching documents (which was the only measure used by Norenzayan et al.), this study also computed the actual

01 Jan 2010
TL;DR: The authors showed that the less-is-more effect is not unique to recognition-based inference but can also be observed with a knowledge-based strategy provided two assumptions, limited information and differential access, are met.
Abstract: Inference on the basis of recognition alone is assumed to occur prior to accessing further information (Pachur & Hertwig, 2006). A counterintuitive result of this is the “less-is-more” effect: a drop in the accuracy with which choices are made as to which of two or more items scores highest on a given criterion as more items are learned (Frosch, Beaman & McCloy, 2007; Goldstein & Gigerenzer, 2002). In this paper, we show that less-is-more effects are not unique to recognition-based inference but can also be observed with a knowledge-based strategy provided two assumptions, limited information and differential access, are met. The LINDA model which embodies these assumptions is presented. Analysis of the less-is-more effects predicted by LINDA and by recognition-driven inference shows that these occur for similar reasons and casts doubt upon the “special” nature of recognition-based inference. Suggestions are made for empirical tests to compare knowledge-based and recognition-based less-is-more effects
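The less-is-more effect described in this abstract has a simple closed-form illustration under Goldstein and Gigerenzer's (2002) formalization of the recognition heuristic. The sketch below is illustrative only: the parameter values (N, alpha, beta) are assumptions chosen to make the effect visible, not values taken from the paper.

```python
# Sketch of the less-is-more effect under the recognition heuristic,
# following Goldstein & Gigerenzer's (2002) formalization.

def expected_accuracy(n, N, alpha, beta):
    """Expected proportion of correct pairwise comparisons when n of N
    items are recognized.
    alpha: recognition validity (accuracy when exactly one item is recognized)
    beta:  knowledge validity  (accuracy when both items are recognized)
    Pairs where neither item is recognized are decided by guessing (0.5)."""
    p_one = 2 * (n / N) * ((N - n) / (N - 1))         # exactly one recognized
    p_both = (n / N) * ((n - 1) / (N - 1))            # both recognized
    p_none = ((N - n) / N) * ((N - n - 1) / (N - 1))  # neither recognized
    return p_one * alpha + p_both * beta + p_none * 0.5

N, alpha, beta = 100, 0.8, 0.6   # alpha > beta is what produces the effect
accs = [expected_accuracy(n, N, alpha, beta) for n in range(N + 1)]
best_n = max(range(N + 1), key=lambda n: accs[n])
print(f"accuracy when all {N} items are known: {accs[N]:.3f}")
print(f"peak accuracy {accs[best_n]:.3f} at only {best_n} items recognized")
```

With recognition validity above knowledge validity, accuracy peaks before everything is recognized and then declines: knowing more items hurts, which is the counterintuitive drop the abstract refers to.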

01 Jan 2010
TL;DR: In this paper, the authors examined whether collaborative inhibition would disappear if the total possible number of unique items were equated between groups and individuals randomly placed into pairs and triads; the effect disappeared when the collaborative subjects were given the same number of chances to remember as the isolated subjects.
Abstract: Recollection is frequently social; people tend to remember with others, and when they do, their joint recollection is enhanced (Meudell, Hitch & Kirby, 1992). While one intuitively thinks that collaboration would enhance memory, Weldon et al. (1997) argued that recalling with others impairs retrieval of "unique items." This collaborative inhibition (CI) occurs when pairs of subjects recall fewer correct "unique" items than others recall in isolation. This is a common result in many studies and has been attributed to both social and cognitive causes. This study examined whether collaborative inhibition would disappear if the total possible number of unique items were equal in groups and individuals randomly put into pairs and triads. In a series of experiments, we showed that the nominal grouping condition remembered more unique correct items than collaborative groups, but the effects of collaborative inhibition disappeared when the collaborative subjects were given an equal number of chances to remember as the isolated subjects. This provides evidence that the effects of collaborative inhibition are caused by an artifact in the scoring procedure and not by a memory failure. This finding is vital in memory research because it alleviates the doubt that collaborative inhibition casts on group recall.

Collaborative inhibition: A counterintuitive phenomenon

Most memory research analyzes individual participants' data. The few past studies that investigated the relationship between individual recall and recall in a group all concluded that groups remember more items accurately than an individual (Meudell et al., 1992). Thus the expression "Two heads are better than one" became a common belief. With memory, particularly episodic memory, relying heavily on interpersonal interactions through the reminiscing of shared experiences, that conclusion seemed both logical and precise.
Early Collaborative Research

Meudell, Hitch, and Kirby (1992) took the analysis of collaborative recall a step further by asking whether the trend of groups recalling more items was due to collaborative groups facilitating memory or simply due to a statistical pooling of responses. The authors tested their theory by having participants recall in two separate sessions, one immediately after the other. In the first session all participants individually completed a free recall of words learned three months before. Immediately after the first recall, participants performed another recall, with some put into pairs (dyads) and some left as individuals to act as a control group. Scoring the data played an important part in how the results were analyzed. Meudell et al. (1992) theorized that if group recall facilitated individual recall, then group recall would not only contain more items than individual recall, but would also contain items that had not appeared in either of the participants' individual recalls. That is, collaborative group recall would elicit new information that would not have been recalled if the individuals stayed separate. To examine that hypothesis, participants' individual recall was compared to their group recall to see if new words appeared. Surprisingly, the results showed no significant difference between the new items recalled in collaborative groups and those recalled by individuals in the second recall session. This led Meudell et al. to two major conclusions about collaborative group recall: collaborative groups do not facilitate individual recall, and the trend of groups having a higher mean of items recalled is most likely due to simple pooling of answers. Not only did they find that group recall showed no facilitation of new items, but they also hinted at a possible inhibitory mechanism associated with collaborative group recall.
Other research comparing individual recall with collaborative recall found results that suggested an inhibitory mechanism involved in collaborative recall, although none thoroughly investigated those counterintuitive findings until Weldon and Bellinger (1997). Collaborative inhibition occurs when interacting collaborative groups fail to outperform the combined performance of individuals (Weldon & Bellinger, 1997).

Nominal Groups: Optimizing Group Recall

Before continuing the examination of collaborative inhibition, some explanations of terminology and methodology need to be made. First, the terms collaborative group and group are used interchangeably. Second, the method used in collaborative inhibition studies to determine the predicted performance value of a group requires some clarification. In order to compare the output of a collaborating group to that of an individual, it is necessary to compare the performance of a collaborating or interacting group to the performance of a nominal or non-interacting group (Thompson, 2008). Nominal groups consist of at least two individuals who work alone during the entire experiment. The nominal groups are then formed during the analysis of the data by randomly placing individual participants' data together as if they had worked together collaboratively. The data are scored by pooling together the unique, non-redundant answers from each individual set of data. For example, if the nominal group consists of two participants, with participant 1 recalling items a, b, c, d, e and participant 2 recalling items d, e, f, g, h, j, then the combined performance would be a, b, c, d, e, f, g, h, j, and this would be the nominal group's performance (Thompson, 2008). The scores from the nominal group act as a predictor of the performance of a collaborating group, given the finding in collaborative memory research that collaborative group performance is just a pooling of the individuals (Meudell et al., 1992).
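The nominal-group scoring just described is, in effect, a set union over the individual recall lists. A minimal sketch in Python using the two-participant example from the text (the function name is our own):

```python
def nominal_recall(*individual_recalls):
    """Pool the unique, non-redundant items across individuals' recall
    lists, as in nominal-group scoring: duplicates across participants
    count only once."""
    pooled = set()
    for recall in individual_recalls:
        pooled.update(recall)
    return pooled

# Example from the text: participant 1 recalls a-e, participant 2 recalls d-j.
p1 = ["a", "b", "c", "d", "e"]
p2 = ["d", "e", "f", "g", "h", "j"]
nominal = nominal_recall(p1, p2)  # 9 unique items: a, b, c, d, e, f, g, h, j
```

The shared items d and e are counted once, so the nominal group's predicted performance is nine unique items, not eleven.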
Comparing the scores of a nominal group to a collaborating group produces three potential predictions. (1) If the collaborating group performs better than the nominal group, then collaboration facilitates, or enhances, performance. (2) If the performance of both the collaborating and the nominal groups is equal, then there is no effect of collaboration on performance. (3) If the collaborating groups perform worse than the nominal groups, then collaboration must inhibit performance (Weldon & Bellinger, 1997). Collaborative inhibition is found when the third prediction is correct.

Early Collaborative Inhibition Research

Using the concept of nominal groups, Weldon and Bellinger (1997) were the first to explore collaboration in memory with the specific aim of determining whether groups or individuals are more productive in recall. The goal of their study was to determine whether collaboration facilitates, inhibits, or merely pools individuals' knowledge. They set up their experiments in a similar fashion to Meudell et al. (1992), having participants recall the same items in two testing sessions, one after the other, either individually-individually (II), collaboratively-collaboratively (CC), individually-collaboratively (IC), or collaboratively-individually (CI). For example, when working in isolation, one person might remember the items coffee, eggs, toast, jam and bacon and the other remembers coffee, eggs, jelly, juice and cereal (see Table 1). When these two people remember together after a one-week delay, they remember coffee, eggs, jelly, jam, juice and cereal. While they remember more at the one-week delay (six items instead of the five each remembered individually), the number of unique correct items recalled between the two of them is less than when they worked alone (six unique items as a pair instead of eight when working alone).
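The three predictions reduce to comparing the collaborating group's unique-item count against the nominal (pooled) baseline. A sketch using the breakfast example above; the function name is our own:

```python
def classify_collaboration(collab_items, nominal_items):
    """Compare a collaborating group's unique recall against the nominal
    baseline: facilitation (1), no effect (2), or inhibition (3)."""
    c, n = len(set(collab_items)), len(set(nominal_items))
    if c > n:
        return "facilitation"
    if c == n:
        return "no effect"
    return "inhibition"

# Breakfast example: each person alone, then the pair together.
alone_1 = {"coffee", "eggs", "toast", "jam", "bacon"}
alone_2 = {"coffee", "eggs", "jelly", "juice", "cereal"}
together = {"coffee", "eggs", "jelly", "jam", "juice", "cereal"}
nominal = alone_1 | alone_2  # 8 unique items pooled from the individuals
result = classify_collaboration(together, nominal)  # "inhibition": 6 < 8
```

Even though the pair's six items exceed either individual's five, six falls short of the eight-item nominal baseline, which is exactly the third prediction.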
The results showed that despite the output of a collaborating group being larger than an individual's output, the collaborating group's performance did not exceed the predicted performance of the nominal group. They called this curious lack of optimization collaborative inhibition.

Social Explanations of Collaborative Inhibition

With memory being a social activity, a logical reason for CI can be found in previously established theories of social inhibition dealing with other cognitive functions, such as brainstorming and group productivity. One such theory is social loafing. As described by Latane, Williams and Harkins (1979), social loafing refers to "a decrease in individual effort due to the social presence of other persons." Latane et al. (1979) demonstrated social loafing through two physical tasks, cheering and clapping, that are usually done in social settings. They compared the measured intensity of an individual clapping/cheering alone to the intensities of clapping/cheering in groups of two, four, and six. The results showed that, although the overall intensity of the group was higher than the individuals', the group's intensity level was not just a summation of the individuals' intensities. Instead, the group intensity level never reached the value predicted by multiplying the individual intensity level by the number of people in the group. In fact, as the group size increased, the ratio of actual intensity level to predicted intensity level decreased. That is, two-person groups performed at only 71% of their predicted performance, while six-person groups performed at 40%. These results support social loafing because, in a group, individuals feel as if they do not need to put as much effort into the activity because there are other people to contribute.
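The percentages Latane et al. report are the ratio of measured group output to an additive prediction (individual output multiplied by group size). A sketch in Python; the raw intensity values here are hypothetical, chosen only to reproduce the reported 71% and 40% figures:

```python
def loafing_ratio(group_intensity, individual_intensity, group_size):
    """Ratio of measured group output to the additive prediction
    (individual output x group size); a ratio below 1 indicates
    social loafing, and smaller ratios indicate more loafing."""
    predicted = individual_intensity * group_size
    return group_intensity / predicted

# Hypothetical intensities scaled so one person's output is 1.0.
pair = loafing_ratio(1.42, 1.0, 2)  # 0.71: pairs at 71% of prediction
six = loafing_ratio(2.40, 1.0, 6)   # 0.40: six-person groups at 40%
```

The ratio falling as group size grows (0.71 for pairs versus 0.40 for six-person groups) is the signature pattern of social loafing described above.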
Social loafing fits logically as the cause of collaborative inhibition because the effect of collaborative inhibition matches the effect seen in social loafing: as a group gets larger, it strays even farther from the predicted value of performance because there are others to assist in the contributions to the group (Latane et al., 1979). Weldon, Blair, and Huebsch (2000) investigated whether motivational factors, such as social loafing, contributed to CI during recall.