Author

Emily A. Thorson

Bio: Emily A. Thorson is an academic researcher at Syracuse University. She has contributed to research on topics including Misinformation and Fandom, has an h-index of 11, and has co-authored 17 publications receiving 1,850 citations. Her previous affiliations include Boston College and George Washington University.

Papers
Journal Article
09 Mar 2018-Science
TL;DR: The rise of fake news highlights the erosion of long-standing institutional bulwarks against misinformation in the internet age, but much remains unknown regarding the vulnerabilities of individuals, institutions, and society to manipulations by malicious actors.
Abstract: The rise of fake news highlights the erosion of long-standing institutional bulwarks against misinformation in the internet age. Concern over the problem is global. However, much remains unknown regarding the vulnerabilities of individuals, institutions, and society to manipulations by malicious actors. A new system of safeguards is needed. Below, we discuss extant social and computer science research regarding belief in fake news and the mechanisms by which it spreads. Fake news has a long history, but we focus on unanswered scientific questions raised by the proliferation of its most recent, politically oriented incarnation. Beyond selected references in the text, suggested further reading can be found in the supplementary materials.

2,106 citations

Journal Article
Emily A. Thorson
TL;DR: This article found that exposure to negative political information continues to shape attitudes even after the information has been effectively discredited, and that belief echoes can be created through an automatic or deliberative process.
Abstract: Across three separate experiments, I find that exposure to negative political information continues to shape attitudes even after the information has been effectively discredited. I call these effects “belief echoes.” Results suggest that belief echoes can be created through an automatic or deliberative process. Belief echoes occur even when the misinformation is corrected immediately, the “gold standard” of journalistic fact-checking. The existence of belief echoes raises ethical concerns about journalists’ and fact-checking organizations’ efforts to publicly correct false claims.

355 citations

Journal Article
TL;DR: The authors examined how the prevalence of news recommendation engines, such as the most-emailed stories list on the front page of the New York Times website, could change patterns of news consumption.
Abstract: This study examines how the prevalence of news recommendation engines, such as the most-emailed stories list on the front page of the New York Times website, could change patterns of news consumption. The top five most-emailed articles from the New York Times website were collected for two 23-day periods. The content of the most-emailed list was found to differ both from the articles cued by editors in a traditional newspaper format and from patterns of individual online news browsing. Opinion, business, and national news articles appear most frequently on the most-emailed list, and more than half of the total articles appeared on the list for multiple days. Counter-intuitive articles and articles that offered advice about life issues were significantly more likely to remain on the list for multiple days. The data suggest that the most-emailed list, part of a larger family of news recommendation engines (NREs), acts both as an aggregator of individual actions and as a new way for online users to navigate o...

110 citations

Journal Article
TL;DR: Drawing on theories from communication and psychology, this paper compares two prevailing fact-checking formats: an online experiment examined how the use of visual "truth scales" interacts with partisanship to shape the effectiveness of corrections, finding that truth scales make fact-checks more effective in some conditions.
Abstract: While fact-checking has grown dramatically in the last decade, little is known about the relative effectiveness of different formats in correcting false beliefs or overcoming partisan resistance to new information. This article addresses that gap by using theories from communication and psychology to compare two prevailing approaches. An online experiment examined how the use of visual “truth scales” interacts with partisanship to shape the effectiveness of corrections. We find that truth scales make fact-checks more effective in some conditions. Contrary to theoretical predictions and the fears of some journalists, their use does not increase partisan backlash against the correction or the organization that produced it.

84 citations


Cited by
Journal Article
TL;DR: A review of how previous studies have defined and operationalized the term "fake news," based on an examination of 34 academic articles that used the term between 2003 and 2013.
Abstract: This paper is based on a review of how previous studies have defined and operationalized the term “fake news.” An examination of 34 academic articles that used the term “fake news” between 2003 and...

1,065 citations

Journal Article
TL;DR: The results, which mirror those found previously for political fake news, suggest that nudging people to think about accuracy is a simple way to improve choices about what to share on social media.
Abstract: Across two studies with more than 1,700 U.S. adults recruited online, we present evidence that people share false claims about COVID-19 partly because they simply fail to think sufficiently about whether or not the content is accurate when deciding what to share. In Study 1, participants were far worse at discerning between true and false content when deciding what they would share on social media relative to when they were asked directly about accuracy. Furthermore, greater cognitive reflection and science knowledge were associated with stronger discernment. In Study 2, we found that a simple accuracy reminder at the beginning of the study (i.e., judging the accuracy of a non-COVID-19-related headline) nearly tripled the level of truth discernment in participants' subsequent sharing intentions. Our results, which mirror those found previously for political fake news, suggest that nudging people to think about accuracy is a simple way to improve choices about what to share on social media.

914 citations

Journal Article
25 Jan 2019-Science
TL;DR: Examining exposure to and sharing of fake news by registered voters on Twitter, the study found that engagement with fake news sources was extremely concentrated and that the individuals most likely to engage with fake news sources were conservative leaning, older, and highly engaged with political news.
Abstract: The spread of fake news on social media became a public concern in the United States after the 2016 presidential election. We examined exposure to and sharing of fake news by registered voters on Twitter and found that engagement with fake news sources was extremely concentrated. Only 1% of individuals accounted for 80% of fake news source exposures, and 0.1% accounted for nearly 80% of fake news sources shared. Individuals most likely to engage with fake news sources were conservative leaning, older, and highly engaged with political news. A cluster of fake news sources shared overlapping audiences on the extreme right, but for people across the political spectrum, most political news exposure still came from mainstream media outlets.

872 citations

Journal Article
TL;DR: The study demonstrates a clear link between susceptibility to misinformation and both vaccine hesitancy and a reduced likelihood to comply with health guidance measures, suggesting that interventions aimed at improving critical thinking and trust in science may be a promising avenue for future research.
Abstract: Misinformation about COVID-19 is a major threat to public health. Using five national samples from the UK (n = 1050 and n = 1150), Ireland (n = 700), the USA (n = 700), Spain (n = 700) and Mexico (n = 700), we examine predictors of belief in the most common statements about the virus that contain misinformation. We also investigate the prevalence of belief in COVID-19 misinformation across different countries and the role of belief in such misinformation in predicting relevant health behaviours. We find that while public belief in misinformation about COVID-19 is not particularly common, a substantial proportion views this type of misinformation as highly reliable in each country surveyed. In addition, a small group of participants find common factual information about the virus highly unreliable. We also find that increased susceptibility to misinformation negatively affects people's self-reported compliance with public health guidance about COVID-19, as well as people's willingness to get vaccinated against the virus and to recommend the vaccine to vulnerable friends and family. Across all countries surveyed, we find that higher trust in scientists and having higher numeracy skills were associated with lower susceptibility to coronavirus-related misinformation. Taken together, these results demonstrate a clear link between susceptibility to misinformation and both vaccine hesitancy and a reduced likelihood to comply with health guidance measures, and suggest that interventions which aim to improve critical thinking and trust in science may be a promising avenue for future research.

797 citations