Showing papers on "Emoticon" published in 2010


Journal ArticleDOI
TL;DR: An exploratory experiment categorized workplace IM messages into coherent groups, identified the most commonly used emoticons (emblems) for expressing positive, negative, and neutral emotions in the case company, and examined the intention to use emoticons in IM in the workplace; the results showed that negative emoticons could cause a negative effect in both simple and complex task-oriented communication.

107 citations


Patent
07 Sep 2010
TL;DR: Graphical user representations, such as emoticons or avatars, used to convey mood and emotion, can be dynamically modified and manipulated, e.g., by squeezing, rotating, distorting, or colouring, as mentioned in this paper.
Abstract: Graphical user representations, such as emoticons or avatars, used to convey mood and emotion, can be dynamically modified and manipulated, e.g., by squeezing, rotating, distorting, or colouring. This enables a user to customize or tailor an existing emoticon to better reflect the user's current mood or emotion. For example, a user may insert a smiley face emoticon into a text message and then manipulate or distort the face or a component of the face (e.g., the smile) to broaden the smile into a grin, or twist the smile into an ironic smile. This lets the user personalize the emoticon rather than having to select the most appropriate emoticon from a palette of predefined emoticons. Another aspect is device hardware (e.g., dedicated or shared user interface elements, or specific touchscreen gestures) for recognizing the squeezes or other gestures that are meant to modify or manipulate the emoticon.
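The patent claims a mechanism rather than an implementation. A minimal sketch of how an emoticon might carry gesture-driven transform state is given below; all names, gestures, and parameters are hypothetical, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class MutableEmoticon:
    """Hypothetical emoticon whose components carry gesture-driven
    transform state (names and gestures are illustrative)."""
    base: str = ":-)"
    smile_width: float = 1.0   # 1.0 = original smile, >1.0 = grin
    rotation_deg: float = 0.0  # twist applied to the whole face
    squeeze: float = 0.0       # 0.0 = none, 1.0 = fully squeezed

    def apply_gesture(self, gesture: str, amount: float) -> None:
        # Map a recognized touchscreen gesture onto a transform parameter.
        if gesture == "pinch_out":
            self.smile_width += amount
        elif gesture == "rotate":
            self.rotation_deg += amount
        elif gesture == "squeeze":
            self.squeeze = min(1.0, self.squeeze + amount)
```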

92 citations


Journal ArticleDOI
TL;DR: The evaluation of CAO confirmed the system's capability to sufficiently detect and extract any emoticon, analyze its semantic structure, and estimate the potential emotion types expressed, outperforming existing emoticon analysis systems.
Abstract: This paper presents CAO, a system for affect analysis of emoticons in Japanese online communication. Emoticons are strings of symbols widely used in text-based online communication to convey user emotions. The presented system extracts emoticons from input and determines the specific emotion types they express with a three-step procedure. First, it matches the extracted emoticons to a predetermined raw emoticon database. The database contains over 10,000 emoticon samples extracted from the Web and annotated automatically. The emoticons for which emotion types could not be determined using only this database are automatically divided into semantic areas representing “mouths” or “eyes,” based on the idea of kinemes from the theory of kinesics. The areas are automatically annotated according to their co-occurrence in the database. The annotation is first based on the eye-mouth-eye triplet, and if no such triplet is found, all semantic areas are estimated separately. This provides hints about potential groups of expressed emotions, giving the system coverage exceeding 3 million possibilities. The evaluation, performed on both training and test sets, confirmed the system's capability to sufficiently detect and extract any emoticon, analyze its semantic structure, and estimate the potential emotion types expressed. The system achieved nearly ideal scores, outperforming existing emoticon analysis systems.
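The three-step back-off described above can be sketched as follows; the database entries, area segmentation, and voting rule here are toy stand-ins, not CAO's actual resources or logic.

```python
import re

# Toy stand-ins; CAO's actual database holds over 10,000 annotated emoticons.
RAW_DB = {"(^_^)": "joy", "(T_T)": "sadness"}
EYE_EMOTIONS = {"^": "joy", "T": "sadness"}
MOUTH_EMOTIONS = {"_": "neutral", "o": "surprise"}

def classify_emoticon(emo: str) -> str:
    # Step 1: exact match against the raw emoticon database.
    if emo in RAW_DB:
        return RAW_DB[emo]
    # Step 2: back off to the eye-mouth-eye triplet of semantic areas.
    m = re.match(r"\((.)(.)(.)\)$", emo)
    if m:
        eye1, mouth, eye2 = m.groups()
        if eye1 == eye2 and eye1 in EYE_EMOTIONS:
            return EYE_EMOTIONS[eye1]
        # Step 3: no usable triplet; estimate each area separately.
        votes = [e for e in (EYE_EMOTIONS.get(eye1),
                             MOUTH_EMOTIONS.get(mouth),
                             EYE_EMOTIONS.get(eye2)) if e]
        if votes:
            return max(set(votes), key=votes.count)
    return "unknown"

print(classify_emoticon("(^o^)"))  # "joy" via the triplet back-off
```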

55 citations


01 Jan 2010
TL;DR: The authors examined non-verbal representations of communication across high- and low-context cultures within the low-context medium of computer-mediated communication and found that the use of these non-verbal contextual cues is culturally grounded, with high-context cultures such as Japanese relying heavily on these graphical accents in their blog entries regardless of sex.
Abstract: Using Hall’s (1976) high/low context distinction and Hofstede’s (1980) individualist versus collectivist cultural dimension, this paper examined non-verbal representations of communication across high- and low-context cultures within the low-context medium of computer-mediated communication. A sample of 80 Japanese and English personal diary weblogs, divided into blog topic content, entries, and comments, was examined for emoticon use. Findings showed that the gender of the blog author as well as the topic of the personal blog may play a role in influencing emoticon use, and that the majority of emoticon usage occurred in the blog comments rather than in the blog articles themselves, where interaction is reduced to the blog writer and blog commentator. The data also showed that the use of these non-verbal contextual cues is culturally grounded, with high-context cultures such as Japanese relying heavily on these graphical accents in their blog entries regardless of sex. In contrast, low-context cultures were found to use these emoticons sparingly in comparison.

28 citations


Proceedings Article
11 Jul 2010
TL;DR: CAO, a system for affect analysis of emoticons, confirmed the system's capability to sufficiently detect and extract any emoticon, analyze its semantic structure and estimate the potential emotion types expressed.
Abstract: This paper presents CAO, a system for affect analysis of emoticons. Emoticons are strings of symbols widely used in text-based online communication to convey emotions. The system extracts emoticons from input and determines the specific emotions they express. First, it matches the extracted emoticons to a raw emoticon database containing over ten thousand emoticon samples extracted from the Web and annotated automatically. The emoticons for which emotion types could not be determined using only this database are automatically divided into semantic areas representing "mouths" or "eyes", based on the theory of kinesics. The areas are automatically annotated according to their co-occurrence in the database. The annotation is first based on the eye-mouth-eye triplet, and if no such triplet is found, all semantic areas are estimated separately. This gives the system coverage exceeding 3 million possibilities. The evaluation, performed on both training and test sets, confirmed the system's capability to sufficiently detect and extract any emoticon, analyze its semantic structure, and estimate the potential emotion types expressed. The system achieved nearly ideal scores, outperforming existing emoticon analysis systems.
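The extraction step that precedes classification, pulling candidate emoticon strings out of running text, might look like the sketch below; the character classes are illustrative, not CAO's actual patterns.

```python
import re

# Illustrative pattern: a parenthesized run of typical kaomoji symbols.
KAOMOJI = re.compile(r"\([\^\-_oTvu;@ ]{1,10}\)")

def extract_emoticons(text: str) -> list[str]:
    """Return candidate emoticon substrings found in the input text."""
    return KAOMOJI.findall(text)

print(extract_emoticons("That was great (^_^) but then it broke (T_T)"))
# ['(^_^)', '(T_T)']
```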

21 citations


Patent
06 May 2010
TL;DR: In this paper, an emotion specification device includes: an emoticon text storage part 40 for storing a plurality of pieces of text information, each of which includes emoticons; a text acquisition part 52 for, when an emoticon is input, acquiring the text information including that emoticon from the emoticon text storage part 40; an emotion word extraction part 53 for extracting emotion words, and the category of emotion corresponding to each emotion word, from the text information acquired by the text acquisition part 52; and an emotion specifying part 55 for specifying the category of emotion expressed by the input emoticon on the basis of the counted emotion words.
Abstract: PROBLEM TO BE SOLVED: To provide an emotion specification device, an emotion specification method, a program, and a recording medium that make the process of specifying the emotion expressed by an emoticon more efficient and more accurate. SOLUTION: The emotion specification device includes: an emoticon text storage part 40 for storing a plurality of pieces of text information, each of which includes emoticons; a text acquisition part 52 for, when an emoticon is input, acquiring the text information including that emoticon from the emoticon text storage part 40; an emotion word extraction part 53 for extracting emotion words, and the category of emotion corresponding to each emotion word, from the text information acquired by the text acquisition part 52; an emotion word measuring part 54 for counting, for each emotion word extracted by the emotion word extraction part 53, the number of emotion words present within 10 characters of the emoticon in the acquired text information; and an emotion specifying part 55 for specifying the category of emotion expressed by the input emoticon on the basis of the counts produced by the emotion word measuring part 54.
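A minimal sketch of the counting scheme, assuming a toy emotion-word lexicon and plain substring matching, is shown below; the part numbers in the comments refer to the patent's components, but the code itself is hypothetical.

```python
import re

# Toy lexicon standing in for the emotion word extraction part (53).
EMOTION_WORDS = {"happy": "joy", "glad": "joy", "gloomy": "sadness"}

def specify_emotion(emoticon, corpus):
    """Count emotion words within 10 characters of each occurrence of the
    emoticon (measuring part 54) and return the most frequent category
    (specifying part 55)."""
    counts = {}
    for text in corpus:
        for m in re.finditer(re.escape(emoticon), text):
            window = text[max(0, m.start() - 10):m.end() + 10]
            for word, category in EMOTION_WORDS.items():
                counts[category] = counts.get(category, 0) + window.count(word)
    return max(counts, key=counts.get) if counts else None

print(specify_emotion(":-)", ["so happy :-) today", "glad :-) again"]))  # joy
```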

4 citations


Patent
26 Aug 2010
TL;DR: In this article, a mobile terminal and a method for using an effect emoticon are provided; the effect emoticon comprises an expression method and a vibration pattern for an item, and a user input unit receives user commands for generating or using it.
Abstract: PURPOSE: A mobile terminal and a method for using an effect emoticon are provided, where the effect emoticon comprises an expression method and a vibration pattern for an item. CONSTITUTION: A memory (160) stores an item and an effect emoticon. The effect emoticon comprises an expression method for the item and a vibration pattern. A control unit (180) generates or uses the effect emoticon. An output unit (150) expresses the effect emoticon. A user input unit (130) receives user commands for generating or using the effect emoticon.
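A minimal sketch of the stored record, assuming the expression method and vibration pattern are simple fields, might look like this; all names and values are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class EffectEmoticon:
    """Hypothetical record pairing an item with how it is rendered
    and how the terminal vibrates when it is displayed."""
    item: str                     # e.g. an image or character sequence
    expression_method: str        # how the item is rendered ("blink", "slide")
    vibration_pattern: list[int]  # alternating on/off durations in ms

hug = EffectEmoticon(item="(>^_^)>", expression_method="slide",
                     vibration_pattern=[100, 50, 100])
```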

4 citations


Patent
02 Jul 2010
TL;DR: In this paper, a system and a method for transmitting an emoticon are provided; they change an emoticon into one recognizable by the receiving communication terminal, thereby reducing emoticon transmission errors.
Abstract: PURPOSE: A system and a method for transmitting an emoticon are provided to change an emoticon into one recognizable by the receiving communication terminal, thereby reducing transmission errors when an emoticon is included in a text message exchanged between terminals from different manufacturers. CONSTITUTION: A terminal information server (120) stores terminal information corresponding to terminal identification information. An emoticon information server (130) stores emoticon information arranged by terminal information. A message server (110) determines the presence of an emoticon in an SMS (Short Message Service) message and transmits the received SMS message to a receiving terminal (200), changing the emoticon into one recognizable on the receiving terminal.
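Server-side translation of this kind could be sketched as a lookup keyed by sender and receiver terminal models; the table, codes, and function below are hypothetical, not the patent's design.

```python
# Toy stand-in for the terminal/emoticon information servers: map each
# sender-side emoticon code to one the receiving terminal can render.
EMOTICON_TABLE = {
    ("vendorA", "vendorB"): {"\ue001": ":-)"},  # codes are hypothetical
}

def translate_message(body: str, sender_model: str, receiver_model: str) -> str:
    """Rewrite emoticons in an SMS body for the receiving terminal."""
    mapping = EMOTICON_TABLE.get((sender_model, receiver_model), {})
    for src, dst in mapping.items():
        body = body.replace(src, dst)
    return body

print(translate_message("hi \ue001", "vendorA", "vendorB"))  # hi :-)
```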

3 citations


01 Jan 2010
TL;DR: The database is created by gathering emoticons from numerous dictionaries of face marks and online jargon, and the inconsistencies in emotion classification provided by various dictionaries are resolved by processing them with a previously developed affect analysis system.
Abstract: In this paper we present our work on creating a database of emoticons, face marks widely used to convey emotions in text-based online communication. The database is created by gathering emoticons from numerous dictionaries of face marks and online jargon. The inconsistencies in emotion classification provided by various dictionaries are resolved by processing them with a previously developed affect analysis system. Having the emoticon database annotated automatically this way, we extract from it patterns of semantic areas of emoticons, such as "eyes" and "mouths". Finally, we perform annotation of the semantic areas based on co-occurrence statistics and the theory of kinesics.
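The co-occurrence-based annotation of semantic areas can be sketched as follows, assuming a toy annotated database and a simple eye-mouth-eye split; the real database is built from face-mark dictionaries and holds thousands of entries.

```python
import re
from collections import Counter

# Toy annotated database: emoticon -> emotion label.
DB = {"(^_^)": "joy", "(^o^)": "joy", "(T_T)": "sadness"}

# Split each emoticon into eye-mouth-eye semantic areas and count which
# emotion labels each area co-occurs with across the database.
area_labels = {}
for emo, label in DB.items():
    m = re.match(r"\((.)(.)(.)\)$", emo)
    if m:
        for area in m.groups():
            area_labels.setdefault(area, Counter())[label] += 1

print(area_labels["^"])  # Counter({'joy': 4}): "^" eyes co-occur with joy
```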

2 citations


Patent
06 May 2010
TL;DR: In this article, the authors proposed a method to improve the detection accuracy by recognizing the appearance tendency of bilateral symmetry in an emoticon, when detecting the emoticon in a text, using machine learning.
Abstract: PROBLEM TO BE SOLVED: To improve the detection accuracy by recognizing the appearance tendency of bilateral symmetry in an emoticon, when detecting the emoticon in a text. SOLUTION: A range that includes at least either two identical characters or a pair of symmetric characters, and corresponding to a bilateral symmetry character string having bilateral symmetry as a whole is extracted by a bilateral symmetric character string extracting part 11, and bilateral symmetry information, showing whether each character in the text divided into character units by a character division part 3 is contained in the bilateral symmetric character string is generated by a bilateral symmetry generating part 12 on the basis of the range; and this is used as the identity, together with the notation of each character in the text; and a BIO determining part 14 determines, to which of the first character "B" configuring the emoticon, the second and following characters "I" configuring the emoticon and the character "O" other than the emoticon, the character in the text corresponds by machine learning by using a model stored in a model storage part 13. COPYRIGHT: (C)2010,JPO&INPIT
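A minimal sketch of the symmetry feature, assuming a toy mirror-character table and a palindrome test, is given below; the patent itself operates on extracted ranges and a trained model, so everything here is illustrative.

```python
# Illustrative symmetry feature for a BIO emoticon tagger; the mirror
# table and feature layout are stand-ins, not the patent's actual design.
MIRROR = {"(": ")", "[": "]", "<": ">", "{": "}"}

def symmetry_flags(text):
    """Mark characters inside a bilaterally symmetric span (part 12)."""
    flags = [False] * len(text)
    for i, open_ch in enumerate(text):
        close_ch = MIRROR.get(open_ch)
        if close_ch is None:
            continue
        j = text.find(close_ch, i + 1)
        if j == -1:
            continue
        inner = text[i + 1:j]
        if inner == inner[::-1]:  # the span mirrors onto itself
            for k in range(i, j + 1):
                flags[k] = True
    return flags

text = "good job (^_^) everyone"
# Each (character, symmetry flag) pair becomes a feature for the model
# that labels characters B (emoticon start), I (continuation), or O.
features = list(zip(text, symmetry_flags(text)))
```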

1 citation