Author

Winnie Goh Wen Pin

Bio: Winnie Goh Wen Pin is an academic researcher from Nanyang Technological University. The author has contributed to research in topics: News media & Journalism. The author has an h-index of 1 and has co-authored 2 publications receiving 4 citations.

Papers
Journal ArticleDOI
TL;DR: The authors examined the impact of fake news discourse on perceptions of news media credibility and found that those who saw fake news and were not debriefed did not change their perceptions of the news media.

10 citations

Journal Article
TL;DR: The authors examined the impact of fake news discourse on perceptions of news media credibility and found that those who saw fake news and were not debriefed did not change their perceptions of the news media.
Abstract: This study examines the impact of fake news discourse on perceptions of news media credibility. If participants are told they have been exposed to fake news, does this lead them to trust information institutions less, including the news media? Study 1 (n = 188) found that news media credibility decreased when participants were told they saw fake news, while news credibility did not change when participants were told they saw real news. Study 2 (n = 400) found that those who saw fake news – and were told they saw a fake news post – decreased their trust in the news media while those who saw fake news and were not debriefed did not change their perceptions of the news media. This shows that the social impact of fake news is not limited to its direct consequences of misinforming individuals, but also includes the potentially adverse effects of discussing fake news.

1 citation


Cited by
Journal ArticleDOI
12 Jan 2022
TL;DR: In this article, the authors show that, given the very limited prevalence of misinformation (including fake news), interventions aimed at reducing acceptance or spread of such news are bound to have very small effects on the overall quality of the information environment.
Abstract: A wealth of interventions have been devised to reduce belief in fake news or the tendency to share such news. By contrast, interventions aimed at increasing trust in reliable news sources have received less attention. In this article we show that, given the very limited prevalence of misinformation (including fake news), interventions aimed at reducing acceptance or spread of such news are bound to have very small effects on the overall quality of the information environment, especially compared to interventions aimed at increasing trust in reliable news sources. To make this argument, we simulate the effect that such interventions have on a global information score, which increases when people accept reliable information and decreases when people accept misinformation.

15 citations
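The prevalence argument in the abstract above can be illustrated with a short toy calculation. The sketch below is not the article's actual simulation; it only assumes a global information score that rises when reliable items are accepted and falls when misinformation is accepted, and all prevalence, acceptance, and intervention figures are illustrative assumptions.

# Toy illustration (assumed numbers, not the article's model): a global
# information score that rises when reliable items are accepted and falls
# when misinformation is accepted.

def information_score(prevalence, accept_reliable, accept_misinfo, n_items=1000):
    # +1 for every reliable item accepted, -1 for every misinformation item accepted
    n_misinfo = prevalence * n_items
    n_reliable = (1 - prevalence) * n_items
    return n_reliable * accept_reliable - n_misinfo * accept_misinfo

PREVALENCE = 0.05  # assumption: misinformation is a small share of the news diet

baseline = information_score(PREVALENCE, accept_reliable=0.50, accept_misinfo=0.50)
# Intervention A: halve acceptance of misinformation.
after_a = information_score(PREVALENCE, accept_reliable=0.50, accept_misinfo=0.25)
# Intervention B: raise acceptance of (trust in) reliable news by ten points.
after_b = information_score(PREVALENCE, accept_reliable=0.60, accept_misinfo=0.50)

print(f"baseline:       {baseline:.1f}")
print(f"intervention A: {after_a:.1f} (gain {after_a - baseline:+.1f})")
print(f"intervention B: {after_b:.1f} (gain {after_b - baseline:+.1f})")

Under these assumed numbers, a modest gain in acceptance of reliable news shifts the overall score far more than a large cut in misinformation acceptance, which is the shape of the argument summarized above.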

Journal ArticleDOI
Scott Wright

9 citations

Posted Content
TL;DR: In this article, the authors argue that cultural factors are neglected causes of the spread of fake news and of the distrust of established media more generally, and that the new architecture of networked information has a structurally corrosive effect.
Abstract: The Internet has famously democratized the information ecosystem. Online, everyone is a pundit: each participant can share news, analyze events, and opine. The analog system, by contrast, was one where incumbent intermediaries (frequently licensed by governments) performed a powerful, centralized gatekeeping function that largely regulated the creation and dissemination of news. Scholars have mostly welcomed the rise of the democratized, networked Fourth Estate. We argue that this transformation is not at all an unalloyed good. Moreover, in celebrating this technological revolution, commentators have neglected the role of cultural factors that tend to magnify the pernicious effects of a flattened information hierarchy. Distrust in social institutions has been on the rise since the Watergate crisis in the 1970s. While government has been the most obvious target of falling confidence, media entities and subject matter experts have also been increasingly the focus of skepticism. The advent of the Internet has magnified this effect: gatekeepers such as CBS and the New York Times are vilified when wrong and invisible when correct. Many eyes make media errors shallow. Moreover, traditional journalistic norms that require forthright admission of mistakes help reinforce narratives that portray the “mainstream media” as biased, incompetent, and out of touch. The current phenomenon labeled as “fake news,” and the older trend of conspiracy theories, are outgrowths of both the technological amplification of skeptical or nihilistic voices and the postmodern assault on information shibboleths. It is critical to realize that the Internet’s initial promise of disintermediation was illusory: gatekeepers have not been eliminated, but merely replaced. The new breed of intermediaries operates with radically different financial incentives and professional norms than their predecessors did. While Facebook moderates and removes information on its ubiquitous platform for violations of amorphous community standards, the company’s goal is not the production of truth, but rather the generation of increased traffic and interaction by users. Falsity can be profitable if it’s popular. Both the old and new bosses curated content, but to vastly different ends. We argue that the new architecture of networked information has a structurally corrosive effect. It is easier to generate doubt about narratives — even those produced by previously trusted sources — than it is to create trusted content. Previously, intermediaries served as choke points: they reacted to incentives that led them to filter unreliable material, in order to preserve their status as creators of the historical record. Now, authors and distributors attract attention (which they monetize) by casting doubt. The most pernicious feature of the Internet news ecosystem is that it leads to a cascade of cynicism: it reinforces not just skepticism about a particular source, but distrust for all media production. Importantly, current scholarly accounts of fake news and conspiracy theories are technologically overdetermined. The democratization of information flows by networked computing cannot fully account for the spread of fake news and the distrust of established media more generally. We argue that cultural factors are neglected causes of these phenomena. First, the technological transformation of the public sphere is accompanied by a social shift toward pervasive distrust of experts. This anti-intellectual turn both constitutes and is constituted by the spread of fake news. Second, while fake news has taken a stronger hold in America than in Europe, the technical systems that undergird the information economy are nearly identical on both sides of the Atlantic. Thus, we explore the non-technical factors that make the United States particularly amenable to the spread of fake news and a culture of media distrust.

4 citations

Journal ArticleDOI
TL;DR: While technology companies have been blamed for playing a key role in the rise of online falsehoods, it has not always been clear how these companies understand the nature of the problem.
Abstract: While technology companies have been blamed for playing a key role in the rise of online falsehoods, it has not always been clear how these companies understand the nature of the problem, which can...

3 citations

Journal ArticleDOI
TL;DR: In this article, the authors examined different disciplines (computer science, economics, history, information science, journalism, law, media, politics, philosophy, psychology, sociology) that investigate misinformation.
Abstract: In the last decade there has been a proliferation of research on misinformation. One important aspect of this work that receives less attention than it should is exactly why misinformation is a problem. To adequately address this question, we must first look to its speculated causes and effects. We examined different disciplines (computer science, economics, history, information science, journalism, law, media, politics, philosophy, psychology, sociology) that investigate misinformation. The consensus view points to advancements in information technology (e.g., the Internet, social media) as a main cause of the proliferation and increasing impact of misinformation, with a variety of illustrations of the effects. We critically analyzed both issues. As to the effects, misbehaviors are not yet reliably demonstrated empirically to be the outcome of misinformation; correlation as causation may have a hand in that perception. As to the cause, advancements in information technologies enable, as well as reveal, multitudes of interactions that represent significant deviations from ground truths through people's new way of knowing (intersubjectivity). This, we argue, is illusionary when understood in light of historical epistemology. Both doubts we raise are used to consider the cost to established norms of liberal democracy that come from efforts to target the problem of misinformation.

3 citations