
Showing papers on "Emoticon" published in 2012


Journal ArticleDOI
TL;DR: To understand how emoticons are used in text messaging and, in particular, how the genders differ in the frequency and variety of emoticons used via this medium, data were collected from individuals' smartphones over a 6-month period.

187 citations


Journal ArticleDOI
TL;DR: Results generally support earlier findings, indicating that the valence of the cue (smiley or emoticon) affects the corresponding impression formation, whereas smiling smilies have a stronger impact on personal mood than smiling emoticons.
Abstract: Emoticons (ASCII-based character strings) and smilies (pictograms) are widely used in computer-mediated communication as substitutes to compensate for the absence of nonverbal cues. Although their usage has been investigated in numerous studies, it remains open whether they provoke differential effects and whether they lead to person perception patterns similar to what is known from face-to-face interactions. Based on findings from research about person perception and nonverbal communication, we investigated the differential effects of smilies and emoticons with regard to recipients' mood, message evaluation, and person perception in an experimental online study (n=127) with a 2(smiley/emoticon) by 2(positive/negative) between-subjects design (with an additional control condition). Results generally support earlier findings, indicating that the valence of the cue (smiley or emoticon) affects the corresponding impression formation. Further, findings concerning the differential influence of both forms of cues show that there are no differences with regard to message interpretation, whereas smiling smilies have a stronger impact on personal mood than smiling emoticons. The perception of a writer's commitment was only altered by smilies, suggesting that they elicit a stronger impact than emoticons.

95 citations


Patent
Aaron Druck1
19 Nov 2012
TL;DR: In this article, the authors discuss dynamically manipulating or modifying graphic user representations during an electronic communication, including text, images, video, or selecting an appropriate emoticon or avatar from a palette of predetermined emoticons or avatars.
Abstract: Disclosed is a system and method for an interactive communication experience on mobile devices. In general, the present disclosure discusses dynamically manipulating or modifying graphic user representations during an electronic communication. The modification or manipulation of these graphic user representations enables users to convey nuances of mood and feelings rather than being confined to conveying them through conventional communications, including text, images, video, or selecting an appropriate emoticon or avatar from a palette of predetermined emoticons or avatars.

52 citations


Patent
27 Sep 2012
TL;DR: In this article, content from a received text message is automatically used to identify at least one context-relevant emoticon, which is then displayed so that a user can select it for inclusion in a text-based response to that message.
Abstract: These teachings provide for automatically using content from a received text-based message to identify at least one context-relevant emoticon and then automatically displaying that context-relevant emoticon such that a user can select the context-relevant emoticon to include in a text-based response to that received message.

47 citations
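
As a rough illustration of how such context-relevant suggestion could work, here is a minimal sketch assuming a hypothetical keyword-to-emoticon table; the patent itself does not prescribe any particular algorithm or data structure.

    # Illustrative sketch only; the keyword table and function name are assumptions,
    # not taken from the patent.
    KEYWORD_EMOTICONS = {
        "congratulations": [":-)", ":-D"],
        "sorry": [":-("],
        "funny": [":-D"],
        "late": [":-/"],
    }

    def suggest_emoticons(received_message, limit=3):
        """Return context-relevant emoticons for the user to choose from."""
        suggestions = []
        for word in received_message.lower().split():
            for emoticon in KEYWORD_EMOTICONS.get(word.strip(".,!?"), []):
                if emoticon not in suggestions:
                    suggestions.append(emoticon)
        return suggestions[:limit]

    print(suggest_emoticons("Congratulations, that was funny!"))  # [':-)', ':-D']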


Journal ArticleDOI
TL;DR: This paper treats typographic emoticons as linguistic units, observes their structures and uses in sentences, and finds that emoticons are not only paraverbal devices but also structural markers that play a significant role in the formation of the sentence.
Abstract: With the flourishing of information technology in the last 50 years, electronic communication has become a significant part of our daily lives. As electronic language is written text, it is divorced from gestures, facial expressions, and prosodic features such as intonation, rhythm, and volume. That is why emoticons have entered cyberspace; they infuse electronic communication with an emotional, human touch. This paper deals with typographic emoticons as linguistic units, and observes their structures and uses in sentences. The research corpus covers 258 French text messages collected via an anonymous questionnaire around 2008-2009. After a graphic analysis of typographic emoticons, we define "emoticon structure" as "a pictogram-like unit formed with alphagrams and topograms of distinctive significative function, and visually conditioned to the referent". Morphological analysis has shown that, in emoticon structure, graphemes of entirely different significances and functions become morpheme-like units, which, like word morphemes, can be derivational, inflectional, or abbreviated, but never unbound. Relying on the corpus, we isolated the two main uses of the emoticon: non-verbal and verbal. The former is the more frequent use, so it is considered in more detail in this paper. Analysis has shown that emoticons are not only paraverbal devices, but also structural markers, and they play a significant role in the formation of the sentence.

47 citations


Patent
Jenny Yuen1, Luke St. Clair1
05 Dec 2012
TL;DR: In this article, a computing device receives input from a user participating in a message session and detects an emoticon in the received input and identifies an image corresponding to the emoticon.
Abstract: In one embodiment, a computing device receives input from a user participating in a message session. The computing device detects an emoticon in the received input and identifies an image corresponding to the emoticon. The computing device accesses the image corresponding to the emoticon and replaces the emoticon with the image in the message session.

36 citations
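
A minimal sketch of the detect-and-replace step described in this patent, assuming a hypothetical emoticon-to-image table and an HTML-style image tag; the actual lookup and rendering are left unspecified by the patent.

    import re

    # Hypothetical mapping from emoticons to image resources (assumption).
    EMOTICON_IMAGES = {":)": "smile.png", ":(": "frown.png", ":D": "grin.png"}

    # One alternation pattern that matches any known emoticon.
    EMOTICON_PATTERN = re.compile("|".join(re.escape(e) for e in EMOTICON_IMAGES))

    def replace_emoticons_with_images(message):
        """Replace each detected emoticon with its corresponding image tag."""
        return EMOTICON_PATTERN.sub(
            lambda m: '<img src="%s">' % EMOTICON_IMAGES[m.group(0)], message)

    print(replace_emoticons_with_images("Great news :D see you soon :)"))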


Patent
29 Aug 2012
TL;DR: This paper used emoticons identified from a source text to provide contextual text-to-speech expressivity, such as intonation, prosody, speed, pauses, and other expressivity characteristics.
Abstract: Techniques disclosed herein include systems and methods that improve audible emotional characteristics used when synthesizing speech from a text source. Systems and methods herein use emoticons identified from a source text to provide contextual text-to-speech expressivity. In general, techniques herein analyze text and identify emoticons included within the text. The source text is then tagged with corresponding mood indicators. For example, if the system identifies an emoticon at the end of a sentence, then the system can infer that this sentence has a specific tone or mood associated with it. Depending on whether the emoticon is a smiley face, angry face, sad face, laughing face, etc., the system can infer use or mood from the various emoticons and then change or modify the expressivity of the TTS output such as by changing intonation, prosody, speed, pauses, and other expressivity characteristics.

29 citations
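
A simplified sketch of the tagging step the patent describes, assuming a hypothetical emoticon-to-mood table and made-up prosody hints; real TTS engines expose their own expressivity controls.

    # Hypothetical mood table and prosody settings (illustrative values only).
    EMOTICON_MOODS = {":)": "happy", ":(": "sad", ">:(": "angry", ":D": "laughing"}
    MOOD_PROSODY = {
        "happy":    {"rate": "medium", "pitch": "+10%"},
        "sad":      {"rate": "slow",   "pitch": "-10%"},
        "angry":    {"rate": "fast",   "pitch": "+5%"},
        "laughing": {"rate": "fast",   "pitch": "+15%"},
    }

    def tag_sentence_mood(sentence):
        """Infer a mood from a trailing emoticon and return prosody hints for TTS."""
        stripped = sentence.rstrip()
        # Check longer emoticons first so ">:(" is not matched as ":(".
        for emoticon in sorted(EMOTICON_MOODS, key=len, reverse=True):
            if stripped.endswith(emoticon):
                mood = EMOTICON_MOODS[emoticon]
                return {"text": stripped[: -len(emoticon)].strip(),
                        "mood": mood, "prosody": MOOD_PROSODY[mood]}
        return {"text": stripped, "mood": "neutral", "prosody": {}}

    print(tag_sentence_mood("I passed the exam :D"))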


Patent
10 Dec 2012
TL;DR: In this article, the authors present systems, devices and techniques that generate a set of media portions associated with message inputs for a multimedia message based on an emoticon or an acronym.
Abstract: Disclosed are systems, devices and techniques that generate a set of media portions associated with a set of message inputs for a multimedia message based on an emoticon or an acronym. A text based message can be received having an emoticon or acronym. The emoticon or acronym is identified in the text based message. A splicing component extracts a set of media content portions from media content, in which the media content portions correspond to the emoticon or acronym received. A multimedia message is generated with the media content portions to convey the text based message as a multimedia message.

29 citations
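
One way to picture the splicing step, assuming a hypothetical clip library keyed by emoticon or acronym; the patent does not specify file formats or a lookup mechanism.

    # Hypothetical clip library keyed by emoticon or acronym (assumption).
    MEDIA_CLIPS = {":)": "clips/smile.mp4", "lol": "clips/laugh.mp4", ":(": "clips/sad.mp4"}

    def build_multimedia_message(text_message):
        """Collect the media portions corresponding to emoticons or acronyms in the text."""
        portions = []
        for token in text_message.lower().split():
            clip = MEDIA_CLIPS.get(token.strip(".,!?"))
            if clip:
                portions.append(clip)
        return portions

    print(build_multimedia_message("That was hilarious lol :)"))
    # ['clips/laugh.mp4', 'clips/smile.mp4']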


Patent
27 Nov 2012
TL;DR: In this paper, a method and apparatus for linking sounds and emoticons to allow a recipient of a message containing an emoticon to hear audio associated with the emoticon was presented.
Abstract: A method and apparatus for linking sounds and emoticons to allow a recipient of a message containing an emoticon to hear audio associated with the emoticon. In one aspect of the invention, a first user or sender establishes a link or association between the emoticon being sent and a sound file to be associated with that particular emoticon. The emoticon is then transmitted from the sender to the recipient along with the link or association such that the recipient can hear the audio when the emoticon is displayed.

21 citations
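
The sender-side association could look roughly like the sketch below, with a hypothetical JSON payload format; the patent does not define any particular wire format.

    import json

    def attach_sound_to_emoticon(message_text, emoticon, sound_url):
        """Bundle the message with an emoticon-to-sound link so the recipient's client
        can play the audio when the emoticon is displayed (payload format is assumed)."""
        payload = {"text": message_text, "emoticon_sounds": {emoticon: sound_url}}
        return json.dumps(payload)

    print(attach_sound_to_emoticon("See you tonight :)", ":)", "https://example.com/chime.wav"))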


Journal ArticleDOI
TL;DR: It is suggested that emoticon usage is prevalent in the writing of some non-native speakers of English but that usage patterns vary significantly across individuals, and concerns about the multiple and diverse interpretations of emoticons are raised.
Abstract: This mixed-methods study looks at patterns of emoticon usage in adult, ESL student writing. Data are drawn from 13 students and their participation in online discussion forums designed to supplement a traditional ESL writing course. The study conceptualizes computer mediated communication as a hybridized and emergent form which utilizes features of both oral and written discourse. Emoticons are seen as central to this hybridity in terms of their ability to serve as textual representations of oral discourse features. Findings from this study suggest that emoticon usage is prevalent in the writing of some non-native speakers of English but that usage patterns vary significantly across individuals. Previous experience with discussion forums in the first language as well as emoticon familiarity are identified as mediating factors in emoticon usage in English. The study also raises concerns about the multiple and diverse interpretations of emoticons and the possibilities for miscommunication and misunderstanding.

19 citations


Proceedings ArticleDOI
16 Nov 2012
TL;DR: The goal of this research is to solve the problem of inconsistency between the sender's intended tone and how the recipient perceives it by developing a system that supports fluent communication.
Abstract: Asynchronous communication using text messaging is a major mode of online communication. It is simple and easy to use; however, there is often an inconsistency between the sender's intended tone and how the recipient perceives it. Emoticons, additional textual expressions using icons for facial expressions, are often used to supplement or adjust the verbal part of the text, though the problem persists. The goal of our research is to solve this problem by developing a system that supports fluent communication. The system would estimate the emotions in the sender's text and note whether different intentions may be conveyed. For that purpose, we analyzed co-occurrences between emotional words and emoticons in a text. We observed the following: (1) there are cases in which the emotion represented in words and the emotion represented by the emoticon are inconsistent; (2) the expressed emotion can change between positive and negative according to the co-occurring words.
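
A small sketch of the co-occurrence analysis the authors describe, with a hypothetical emotion-word lexicon and emoticon list; the paper's actual lexicon is not reproduced here.

    from collections import Counter

    # Hypothetical emotion-word lexicon and emoticon list (assumptions).
    EMOTION_WORDS = {"happy": "positive", "glad": "positive", "sad": "negative", "angry": "negative"}
    EMOTICONS = {":)", ":(", ";)"}

    def count_cooccurrences(messages):
        """Count (emotion word, emoticon) pairs that appear in the same message."""
        pairs = Counter()
        for message in messages:
            tokens = message.lower().split()
            words = [t for t in tokens if t in EMOTION_WORDS]
            emoticons = [t for t in tokens if t in EMOTICONS]
            for w in words:
                for e in emoticons:
                    pairs[(w, e)] += 1
        return pairs

    # The pair ('sad', ':)') below is an example of word/emoticon inconsistency.
    print(count_cooccurrences(["I am so happy :)", "a bit sad :)", "so angry :("]))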

Proceedings ArticleDOI
17 Sep 2012
TL;DR: This paper utilizes sentences containing emoticons from articles in Yahoo! blogs to automatically detect users' emotions in messenger logs, and finds that the best performance among the proposed approaches for user emotion detection is achieved by the topical approach.
Abstract: This paper utilized sentences containing emoticons from articles in Yahoo! blogs to automatically detect users' emotions in messenger logs. Four approaches were proposed: a topical approach, an emotional approach, a retrieval approach, and a lexicon approach. Forty emoticon classes found in Yahoo! blog articles were used for the experiments. Two experiments were performed. The first experiment classified sentences into the 40 emoticon classes by calculating emotional scores of words. The second experiment took Yahoo! and MSN messenger logs collected from users as the experimental materials, classified them into the 40 emoticon classes with the proposed approaches, and mapped the 40 emoticon classes to 6 emotion classes to determine the user's emotion. The best performance among the proposed approaches for user emotion detection was achieved by the topical approach, and its micro-average precision of 0.48 was satisfactory.
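
A toy sketch of the word-score classification used in the first experiment, with hypothetical per-class word scores standing in for scores estimated from emoticon-labeled blog sentences; the numbers and class names are made up.

    # Hypothetical word scores per emoticon class (all values are made up).
    WORD_SCORES = {
        "smile": {"great": 0.9, "thanks": 0.7, "tired": 0.1},
        "cry":   {"great": 0.1, "thanks": 0.2, "tired": 0.8},
    }

    def classify_sentence(sentence):
        """Pick the emoticon class whose word scores sum highest for the sentence."""
        tokens = sentence.lower().split()
        totals = {cls: sum(scores.get(t, 0.0) for t in tokens)
                  for cls, scores in WORD_SCORES.items()}
        return max(totals, key=totals.get)

    print(classify_sentence("thanks that was great"))  # smile
    print(classify_sentence("i am so tired"))          # cry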

Patent
04 Jul 2012
TL;DR: In this article, an advertisement service providing method using emoticons is provided to offer advertisement through an existed chatting program by providing advertisement information through an emoticon linked with advertisement information and by transmitting the emoticon.
Abstract: PURPOSE: An advertisement service providing method using emoticons is provided to deliver advertisements through an existing chat program by linking an emoticon to advertisement information and transmitting the emoticon. CONSTITUTION: An emoticon linked with advertisement information is transmitted to a first user terminal (S100). When selection of the emoticon is input at the first user terminal, the abstracted advertisement information linked with the emoticon is transmitted to the first user terminal (S200). The emoticon is transmitted to a second user terminal at the request of the first user terminal (S300). The advertisement information linked with the emoticon is transmitted to the second user terminal (S400).
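
The S100-S400 flow in the abstract can be pictured roughly as follows; the server object, method names, and print statements are all hypothetical, since the patent describes steps rather than an API.

    class AdEmoticonServer:
        """Sketch of the S100-S400 flow; class and method names are assumptions."""

        def __init__(self, ad_info_by_emoticon):
            self.ads = ad_info_by_emoticon

        def send_emoticon(self, terminal, emoticon):
            print("S100: send ad-linked emoticon %s to %s" % (emoticon, terminal))

        def on_emoticon_selected(self, terminal, emoticon):
            print("S200: send ad info '%s' to %s" % (self.ads[emoticon], terminal))

        def forward_emoticon(self, sender, recipient, emoticon):
            print("S300: forward %s from %s to %s" % (emoticon, sender, recipient))
            print("S400: send ad info '%s' to %s" % (self.ads[emoticon], recipient))

    server = AdEmoticonServer({":)": "Spring sale at Example Mall"})
    server.send_emoticon("terminal-1", ":)")
    server.on_emoticon_selected("terminal-1", ":)")
    server.forward_emoticon("terminal-1", "terminal-2", ":)")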


Patent
08 Mar 2012
TL;DR: In this paper, an advertisement service using emoticons is proposed in which an advertisement is delivered through an existing chat program or messaging system by transmitting an emoticon linked to advertisement information; because the advertisement is part of the conversation rather than a separate screen element, negative user reactions can be minimized.
Abstract: According to the method for providing an advertisement service using an emoticon suggested in the present invention, an advertisement can be provided through an existing chat program or messaging system by transmitting an emoticon linked to advertisement information and presenting the advertisement information through the emoticon. Negative reactions from users can be minimized because the advertisement is included in the content of the conversation rather than taking up a portion of the screen, and thus does not impede the conversation. Also, according to the present invention, users can be encouraged to participate actively in spreading the advertisement, which increases attention to the advertisement and its effect, by being given benefits for using the advertisement-linked emoticon. Targeted advertising can also be enabled, and the advertising effect maximized, by receiving information on a user's location or interests and providing an emoticon linked to advertisement information related to that location or those interests.



01 Jan 2012
TL;DR: An emotion-based gesture animation generation system that uses a character's facial expressions and gestures to convey emotion expressively and clearly is suggested.
Abstract: Recently, people use mobile phones not only to make calls but also to send SMS messages. However, it is difficult to express complex emotions with the text and emoticons of existing SMS services. To express a user's emotions in an engaging and accurate way, we use character animation. This paper suggests an emotion-based gesture animation generation system that uses a character's facial expressions and gestures to convey emotion expressively and clearly. Michael [1] investigated interviews of two people with stylized gestures and suggested a gesture generation graph for stylized gesture animation. In this paper, we focus on analyzing and extracting the emotional gestures of Disney animation characters and creating 3D models of the extracted emotional gestures. To express a person's emotion, we use an emotion gesture generation graph that imports an emotion flow graph expressing the flow of emotions as probabilities. We investigated user reactions to assess the appropriateness of the suggested system and its alternatives.
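
A tiny sketch of an emotion flow graph driving gesture selection, in the spirit of the system described above; the transition probabilities and gesture names are made up.

    import random

    # Hypothetical emotion flow graph: transition probabilities between emotions.
    EMOTION_FLOW = {
        "joy":      {"joy": 0.6, "surprise": 0.3, "sadness": 0.1},
        "surprise": {"joy": 0.5, "surprise": 0.2, "sadness": 0.3},
        "sadness":  {"sadness": 0.7, "joy": 0.3},
    }
    # Hypothetical gesture extracted for each emotion.
    GESTURES = {"joy": "arms_raised", "surprise": "step_back", "sadness": "head_down"}

    def generate_gesture_sequence(start_emotion, length):
        """Walk the emotion flow graph probabilistically, emitting one gesture per step."""
        emotion, sequence = start_emotion, []
        for _ in range(length):
            sequence.append(GESTURES[emotion])
            options = EMOTION_FLOW[emotion]
            emotion = random.choices(list(options), weights=list(options.values()))[0]
        return sequence

    print(generate_gesture_sequence("joy", 4))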

Patent
30 Jun 2012
TL;DR: In this paper, a method and a mobile terminal for dynamic display of an emotion is presented. But, the method is restricted to the case of emoticons, and the animation instruction of the emoticon is not considered.
Abstract: The present invention relates to the field of communications technologies, and in particular, to a method and a mobile terminal for the dynamic display of an emoticon. The method for dynamic display of an emoticon includes: obtaining, by a mobile terminal, a trigger event, where the trigger event is used to trigger dynamic display of an emoticon; searching, by the mobile terminal and according to the trigger event, for an emoticon package of the emoticon, and obtaining an animation instruction of the emoticon, where the emoticon package includes a media material file of the emoticon; and utilizing, by the mobile terminal, the animation capability of the mobile terminal and operating, according to the animation instruction of the emoticon, on the media material file of the emoticon, so as to implement the dynamic display of the emoticon. By applying the present invention, because the mobile terminal implements the animation effect according to the animation instruction by invoking its own animation capability, the mobile terminal may use that capability repeatedly to reuse the animation effect of the emoticon.
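
The flow in the abstract can be sketched roughly like this; all class, field, and method names are hypothetical, since the patent does not define a concrete API.

    class EmoticonPackage:
        """Sketch of an emoticon package: a media material file plus an animation instruction."""
        def __init__(self, media_file, animation_instruction):
            self.media_file = media_file
            self.animation_instruction = animation_instruction

    class MobileTerminal:
        def __init__(self, packages):
            self.packages = packages  # emoticon id -> EmoticonPackage

        def on_trigger_event(self, emoticon_id):
            """Look up the emoticon package and replay its animation using the
            terminal's own animation capability (represented here by print)."""
            package = self.packages[emoticon_id]
            print("animating %s with instruction '%s'"
                  % (package.media_file, package.animation_instruction))

    terminal = MobileTerminal({"wink": EmoticonPackage("wink.gif", "loop 3 times")})
    terminal.on_trigger_event("wink")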