Author

Eric Gilbert

Bio: Eric Gilbert is an academic researcher from the University of Michigan. The author has contributed to research in topics including social media and social networks. The author has an h-index of 37 and has co-authored 82 publications receiving 7,631 citations. Previous affiliations of Eric Gilbert include the Georgia Institute of Technology and the University of Valladolid.


Papers
Proceedings Article
16 May 2014
TL;DR: Using the authors' parsimonious rule-based model to assess the sentiment of tweets, VADER is found to outperform individual human raters and to generalize more favorably across contexts than any of their benchmarks.
Abstract: The inherent nature of social media content poses serious challenges to practical applications of sentiment analysis. We present VADER, a simple rule-based model for general sentiment analysis, and compare its effectiveness to eleven typical state-of-practice benchmarks including LIWC, ANEW, the General Inquirer, SentiWordNet, and machine learning oriented techniques relying on Naive Bayes, Maximum Entropy, and Support Vector Machine (SVM) algorithms. Using a combination of qualitative and quantitative methods, we first construct and empirically validate a gold-standard list of lexical features (along with their associated sentiment intensity measures) which are specifically attuned to sentiment in microblog-like contexts. We then combine these lexical features with consideration for five general rules that embody grammatical and syntactical conventions for expressing and emphasizing sentiment intensity. Interestingly, using our parsimonious rule-based model to assess the sentiment of tweets, we find that VADER outperforms individual human raters (F1 Classification Accuracy = 0.96 and 0.84, respectively), and generalizes more favorably across contexts than any of our benchmarks.

3,299 citations
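A minimal usage sketch of the rule-based model described in the abstract above, using the open-source vaderSentiment Python package that accompanies the paper; the example sentences below are illustrative and not drawn from the paper's evaluation corpus.

# pip install vaderSentiment  -- open-source implementation of the VADER model
from vaderSentiment.vaderSentiment import SentimentIntensityAnalyzer

analyzer = SentimentIntensityAnalyzer()

examples = [
    "The book was good.",
    "The book was VERY GOOD!!!",   # capitalization and punctuation raise intensity
    "The book was kind of good.",  # degree modifiers dampen intensity
]

for text in examples:
    # polarity_scores returns 'neg', 'neu', 'pos' proportions and a
    # normalized 'compound' score in [-1, 1]
    print(text, "->", analyzer.polarity_scores(text))

In practice the compound score is commonly thresholded (for example at roughly +/-0.05) to assign positive, negative, or neutral labels.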

Proceedings ArticleDOI
04 Apr 2009
TL;DR: A predictive model that maps social media data to tie strength is presented; it performs quite well, and the paper illustrates how modeling tie strength can improve social media design elements, including privacy controls, message routing, friend introductions, and information prioritization.
Abstract: Social media treats all users the same: trusted friend or total stranger, with little or nothing in between. In reality, relationships fall everywhere along this spectrum, a topic social science has investigated for decades under the theme of tie strength. Our work bridges this gap between theory and practice. In this paper, we present a predictive model that maps social media data to tie strength. The model builds on a dataset of over 2,000 social media ties and performs quite well, distinguishing between strong and weak ties with over 85% accuracy. We complement these quantitative findings with interviews that unpack the relationships we could not predict. The paper concludes by illustrating how modeling tie strength can improve social media design elements, including privacy controls, message routing, friend introductions and information prioritization.

1,416 citations
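A toy sketch in the spirit of the tie-strength model described above: a handful of made-up per-tie features stand in for the dozens of social media variables used in the paper, and the classifier, feature names, and data below are illustrative assumptions rather than the paper's actual model.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Hypothetical per-tie features:
# [wall words exchanged, days since last contact, mutual friends, inbox threads]
X = np.array([
    [450,   2, 35, 12],   # looks like a strong tie: lots of recent interaction
    [ 10, 300,  3,  0],   # looks like a weak tie: little interaction, long silence
    [220,  14, 20,  5],
    [  0, 420,  1,  0],
    [600,   1, 50, 20],
    [ 30,  90,  8,  1],
])
y = np.array([1, 0, 1, 0, 1, 0])  # 1 = strong tie, 0 = weak tie

model = LogisticRegression(max_iter=1000)
print("held-out accuracy:", cross_val_score(model, X, y, cv=3).mean())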

Proceedings ArticleDOI
26 Apr 2014
TL;DR: The first results on how photos with human faces relate to engagement in large-scale image-sharing communities are presented; photos with faces attract more likes and comments, while the number of faces, their age, and gender have no effect.
Abstract: Photos are becoming a prominent means of communication online. Despite photos' pervasive presence in social media and the online world, we know little about how people interact and engage with their content. Understanding how photo content might signify engagement can impact both science and design, influencing production and distribution. One common type of photo content shared on social media is photos of people. From studies of offline behavior, we know that human faces are powerful channels of non-verbal communication. In this paper, we study this behavioral phenomenon online. We ask how the presence of a face, its age, and its gender might impact social engagement with the photo. We use a corpus of 1 million Instagram images and organize our study around two social engagement feedback factors, likes and comments. Our results show that photos with faces are 38% more likely to receive likes and 32% more likely to receive comments, even after controlling for social network reach and activity. We find, however, that the number of faces, their age, and gender do not have an effect. This work presents the first results on how photos with human faces relate to engagement in large-scale image-sharing communities. In addition to contributing to the research around online user behavior, our findings offer a new line of future work using visual analysis.

350 citations
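A hedged sketch of the kind of engagement analysis described above: a count regression of likes on a face indicator while controlling for reach (followers) and activity (posts). The column names and simulated data are placeholders, and the negative binomial specification is an assumption, not necessarily the paper's exact model.

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "has_face": rng.integers(0, 2, n),          # 1 if a face was detected in the photo
    "followers": rng.lognormal(5, 1, n),        # proxy for social network reach
    "posts": rng.integers(1, 500, n),           # proxy for posting activity
})
# Simulate like counts with a positive face effect so the model has a signal to fit.
mu = np.exp(1.0 + 0.3 * df["has_face"] + 0.5 * np.log1p(df["followers"]))
df["likes"] = rng.poisson(mu)

model = smf.glm(
    "likes ~ has_face + np.log1p(followers) + np.log1p(posts)",
    data=df,
    family=sm.families.NegativeBinomial(),
).fit()
print(model.summary())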

Proceedings Article
16 May 2010
TL;DR: It is demonstrated that estimating emotions from weblogs provides novel information about future stock market prices, showing how the mood of millions in a large online community, even one that primarily discusses daily life, can anticipate changes in a seemingly unrelated system.
Abstract: Our emotional state influences our choices. Research on how it happens usually comes from the lab. We know relatively little about how real world emotions affect real world settings, like financial markets. Here, we demonstrate that estimating emotions from weblogs provides novel information about future stock market prices. That is, it provides information not already apparent from market data. Specifically, we estimate anxiety, worry and fear from a dataset of over 20 million posts made on the site LiveJournal. Using a Granger-causal framework, we find that increases in expressions of anxiety, evidenced by computationally-identified linguistic features, predict downward pressure on the S&P 500 index. We also present a confirmation of this result via Monte Carlo simulation. The findings show how the mood of millions in a large online community, even one that primarily discusses daily life, can anticipate changes in a seemingly unrelated system. Beyond this, the results suggest new ways to gauge public opinion and predict its impact.

306 citations
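A minimal sketch of a Granger-causal test like the one described above, run on synthetic series standing in for the daily anxiety index and S&P 500 returns; the variable names, lag choice, and simulated data are assumptions for illustration only.

import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(42)
n = 250  # roughly one trading year of daily observations
anxiety = rng.normal(size=n)
# Simulated returns that respond negatively to yesterday's anxiety,
# so the test has a relationship to detect.
returns = -0.3 * np.roll(anxiety, 1) + rng.normal(scale=0.5, size=n)
returns[0] = 0.0

data = pd.DataFrame({"returns": returns, "anxiety": anxiety})
# Tests whether the second column (anxiety) helps predict the first (returns)
# beyond the returns' own history, up to maxlag days back.
results = grangercausalitytests(data[["returns", "anxiety"]], maxlag=3)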

Proceedings ArticleDOI
15 Feb 2014
TL;DR: The factors which lead to successfully funding a crowdfunding project are explored; the language used in the project has surprising predictive power, accounting for 58.56% of the variance around successful funding.
Abstract: Crowdfunding sites like Kickstarter--where entrepreneurs and artists look to the internet for funding--have quickly risen to prominence. However, we know very little about the factors driving the 'crowd' to take projects to their funding goal. In this paper, we explore the factors which lead to successfully funding a crowdfunding project. We study a corpus of 45K crowdfunded projects, analyzing 9M phrases and 59 other variables commonly present on crowdfunding sites. The language used in the project has surprising predictive power, accounting for 58.56% of the variance around successful funding. A closer look at the phrases shows they exhibit general persuasion principles. For example, the phrase "also receive two" reflects the principle of Reciprocity and is one of the top predictors of successful funding. We conclude this paper by announcing the release of the predictive phrases along with the control variables as a public dataset, hoping that our work can enable new features on crowdfunding sites--tools to help both backers and project creators make the best use of their time and money.

288 citations
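An illustrative sketch of phrase-based prediction in the spirit of the paper above: unigram-to-trigram phrase features from project text feed a simple classifier for funded versus not funded. The example project texts, labels, and model choice are made up for illustration and do not reproduce the paper's 9M-phrase analysis or its control variables.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

projects = [
    "Back this project and you will also receive two signed prints.",
    "We need your help to get this off the ground, even a dollar helps.",
    "Pledge now! Given the chance, we will deliver on time.",
    "This is an option for anyone who missed our last campaign.",
]
funded = [1, 0, 1, 0]  # toy labels: 1 = reached funding goal

model = make_pipeline(
    CountVectorizer(ngram_range=(1, 3), binary=True),  # phrases up to trigrams
    LogisticRegression(),
)
model.fit(projects, funded)

# Inspect which phrases the toy model weights toward successful funding.
vec = model.named_steps["countvectorizer"]
clf = model.named_steps["logisticregression"]
top = sorted(zip(clf.coef_[0], vec.get_feature_names_out()), reverse=True)[:5]
print(top)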


Cited by

Journal ArticleDOI
TL;DR: As an example of how the current "war on terrorism" could generate a durable civic renewal, Putnam points to the burst in civic practices that occurred during and after World War II, which he says "permanently marked" the generation that lived through it and had a "terrific effect on American public life over the last half-century."
Abstract: The present historical moment may seem a particularly inopportune time to review Bowling Alone, Robert Putnam's latest exploration of civic decline in America. After all, the outpouring of volunteerism, solidarity, patriotism, and self-sacrifice displayed by Americans in the wake of the September 11 terrorist attacks appears to fly in the face of Putnam's central argument: that "social capital," defined as "social networks and the norms of reciprocity and trustworthiness that arise from them" (p. 19), has declined to dangerously low levels in America over the last three decades. However, Putnam is not fazed in the least by the recent effusion of solidarity. Quite the contrary, he sees in it the potential to "reverse what has been a 30- to 40-year steady decline in most measures of connectedness or community." As an example of how the current "war on terrorism" could generate a durable civic renewal, Putnam points to the burst in civic practices that occurred during and after World War II, which he says "permanently marked" the generation that lived through it and had a "terrific effect on American public life over the last half-century." If Americans can follow this example and channel their current civic …

5,309 citations

Book
01 Jan 2003
TL;DR: In this book, Sherry Turkle uses Internet MUDs (multi-user domains, or in older gaming parlance multi-user dungeons) as a launching pad for explorations of software design, user interfaces, simulation, artificial intelligence, artificial life, agents, virtual reality, and the on-line way of life.
Abstract: From the Publisher: A Question of Identity. Life on the Screen is a fascinating and wide-ranging investigation of the impact of computers and networking on society, people's perceptions of themselves, and the individual's relationship to machines. Sherry Turkle, a Professor of the Sociology of Science at MIT and a licensed psychologist, uses Internet MUDs (multi-user domains, or in older gaming parlance multi-user dungeons) as a launching pad for explorations of software design, user interfaces, simulation, artificial intelligence, artificial life, agents, "bots," virtual reality, and "the on-line way of life." Turkle's discussion of postmodernism is particularly enlightening. She shows how postmodern concepts in art, architecture, and ethics are related to concrete topics much closer to home, for example AI research (Minsky's "Society of Mind") and even MUDs (exemplified by students with X-window terminals who are doing homework in one window and simultaneously playing out several different roles in the same MUD in other windows). Those of you who have (like me) been turned off by the shallow, pretentious, meaningless paintings and sculptures that litter our museums of modern art may have a different perspective after hearing what Turkle has to say. This is a psychoanalytical book, not a technical one. However, software developers and engineers will find it highly accessible because of the depth of the author's technical understanding and credibility. Unlike most other authors in this genre, Turkle does not constantly jar the technically literate reader with blatant errors or bogus assertions about how things work. Although I personally don't have time or patience for MUDs, view most of AI as snake oil, and abhor postmodern architecture, I thought the time spent reading this book was an extremely good investment.

4,965 citations

Journal ArticleDOI
TL;DR: This work investigates whether measurements of collective mood states derived from large-scale Twitter feeds are correlated with the value of the Dow Jones Industrial Average (DJIA) over time, and indicates that the accuracy of DJIA predictions can be significantly improved by the inclusion of specific public mood dimensions but not others.

4,453 citations

Journal ArticleDOI
TL;DR: The authors found that people are much more likely to believe stories that favor their preferred candidate, especially if they have ideologically segregated social media networks, and that the average American adult saw on the order of one or perhaps several fake news stories in the months around the 2016 U.S. presidential election, with just over half of those who recalled seeing them believing them.
Abstract: Following the 2016 U.S. presidential election, many have expressed concern about the effects of false stories (“fake news”), circulated largely through social media. We discuss the economics of fake news and present new data on its consumption prior to the election. Drawing on web browsing data, archives of fact-checking websites, and results from a new online survey, we find: (i) social media was an important but not dominant source of election news, with 14 percent of Americans calling social media their “most important” source; (ii) of the known false news stories that appeared in the three months before the election, those favoring Trump were shared a total of 30 million times on Facebook, while those favoring Clinton were shared 8 million times; (iii) the average American adult saw on the order of one or perhaps several fake news stories in the months around the election, with just over half of those who recalled seeing them believing them; and (iv) people are much more likely to believe stories that favor their preferred candidate, especially if they have ideologically segregated social media networks.

3,959 citations