Graphically Speaking

Editor: Miguel Encarnação

iFeel_IM!: Augmenting Emotions during Online Communication

Dzmitry Tsetserukou, Toyohashi University of Technology
Alena Neviarouskaya, University of Tokyo

“All emotions use the body as their theater.” —Antonio Damasio

Nowadays, companies providing media for remote online communications place great importance on live communication and immersive technologies. Along with widely used instant messaging (IM) applications (such as Yahoo Messenger, Microsoft Windows Live Messenger, Google Talk, and Skype), Web services such as Twitter, Google Wave, and Google Buzz are gaining notability and popularity worldwide. Such applications let you keep in touch with friends in real time over multiple networks and devices. Recently, mobile-communications companies have launched IM services on cellular phones (for example, AOL Instant Messenger on the iPhone). 3D virtual worlds (for example, Second Life and OpenSim) also have embedded chat and IM. Such systems encourage people to establish or strengthen interpersonal relationships, share ideas, gain new experiences, and feel genuine emotions during their VR adventures. However, conventional mediated systems usually

■ support only simple textual cues such as emoticons,
■ lack visual emotional signals such as facial expressions and gestures,
■ support only manual control of the expressiveness of graphical representations of users (avatars), and
■ ignore such important social-communication channels as the sense of touch.

Tactile interfaces could let users enhance their emotional-communication abilities by adding a new dimension to mobile communications. Here, we introduce iFeel_IM! (see Figure 1), a system that employs haptic devices and visual stimulation to convey and augment the emotions experienced during online conversations. iFeel_IM! stands for Intelligent System for Feeling Enhancement Powered by Affect-Sensitive Instant Messenger.

Figure 1. A user communicating through iFeel_IM! (Intelligent System for Feeling Enhancement Powered by Affect-Sensitive Instant Messenger). The devices worn on the body reinforce communicators’ own feelings or simulate the partner’s emotions.

Affective Haptics—an Emerging Frontier


Human emotions can be easily evoked by different cues, and the sense of touch is one of the most emotionally charged channels. Affective haptics is an emerging area of research focusing on designing the devices and systems that elicit, enhance, or influence a human’s emotional state by means of touch. We distinguish four basic haptic (tactile) channels governing our emotions:

■ physiological changes (for example, heart rate and body temperature),
■ physical stimulation (for example, tickling),
■ social touch (for example, hugging), and
■ emotional haptic design (for example, a device’s shape, material, or texture; we discuss this concept in more detail later).

Figure 2. The iFeel_IM! architecture. To communicate through iFeel_IM!, users wear six affective haptic devices: HaptiHeart, HaptiHug, HaptiButterfly, HaptiTickler, HaptiTemper, and HaptiShiver.

Wanting to enhance social interactivity and provide an emotionally immersive experience for real-time messaging, we designed iFeel_IM! to reinforce (intensify) communicators’ own feelings or reproduce (simulate) the partner’s emotions. The philosophy behind iFeel_IM! is “I feel [therefore] I am!” The emotion elicited by physical stimulation might imbue our communication with passion and increase emotional intimacy: the ability to be close, loving, and vulnerable. Interpersonal relationships and the ability to express empathy grow strongly when people become emotionally closer through disclosing thoughts, feelings, and emotions for the sake of understanding. iFeel_IM! tries to influence human emotions not only by the four haptic channels but also by visual feedback, as we’ll show.

The iFeel_IM! Architecture

Figure 2 shows the structure of iFeel_IM! As you can see, the wearable part of the system is worn on the body, covering areas such as the heart, hands, abdomen, and sides. iFeel_IM! stresses

■ automatic sensing of emotions conveyed through text messages (artificial intelligence),
■ visualization of the detected emotions through avatars in a virtual world,
■ enhancement of the user’s affective state, and
■ reproduction of social touch through haptic stimulation in the real world.

We use Second Life as the communication platform. In Second Life, users can flexibly create their online identities (avatars) and trigger various avatar animations (for example, facial expressions and gestures) by typing special abbreviations in a chat window.

We implement control of the conversation through EmoHeart, a Second Life object attached to an avatar’s chest. EmoHeart communicates with the Affect Analysis Model (AAM), a system for textual affect sensing. It also senses symbolic cues or keywords in the text that indicate a hug and generates a hugging visualization (that is, it triggers the related animation).

iFeel_IM! stores the results from the AAM (the dominant emotion and intensity) and EmoHeart (the hug communicative function) along with the chat messages in a file on each user’s local computer. The haptic-device controller analyzes the data in real time and generates control signals for the digital/analog converter, which then feeds control cues for the haptic devices to the driver box. On the basis of the transmitted signal, iFeel_IM! activates the user’s corresponding haptic device.

iFeel_IM! employs six haptic devices: HaptiHeart, HaptiHug, HaptiButterfly, HaptiTickler, HaptiTemper, and HaptiShiver. We describe them in more detail later.
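To make this data flow concrete, the following is a minimal sketch of such a controller loop. The tab-separated log format and the send_to_dac helper are illustrative assumptions; the article does not specify the actual file format or driver interface.

```python
# Sketch only: the log format ("<message>\t<emotion>\t<intensity>") and the
# send_to_dac() helper are assumptions, not the actual iFeel_IM! implementation.
import time

def send_to_dac(emotion: str, intensity: float) -> None:
    """Placeholder for writing control voltages to the digital/analog converter,
    which feeds the driver box for the corresponding haptic devices."""
    print(f"D/A out -> emotion={emotion}, intensity={intensity:.2f}")

def watch_chat_log(path: str) -> None:
    """Tail the local chat log and generate control signals in (near) real time."""
    with open(path, "r", encoding="utf-8") as log:
        log.seek(0, 2)                   # start at the end of the file
        while True:
            line = log.readline()
            if not line:
                time.sleep(0.1)          # wait for new chat entries
                continue
            try:
                _message, emotion, intensity = line.rstrip("\n").split("\t")
            except ValueError:
                continue                 # skip lines that don't match the format
            send_to_dac(emotion, float(intensity))
```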

The Affect Analysis Model

The AAM senses nine emotions conveyed through text: anger, disgust, fear, guilt, interest, joy, sadness, shame, and surprise.1 The affect recognition algorithm, which takes into account the specific style and evolving language of online conversation, consists of five main stages:

1. symbolic-cue analysis,
2. syntactical-structure analysis,
3. word-level analysis,
4. phrase-level analysis, and
5. sentence-level analysis.

The AAM is based on the compositionality principle. According to this principle, we determine the emotional meaning of a sentence by composing the pieces that correspond to lexical units or other linguistic constituent types, governed by the rules of aggregation, propagation, domination, neutralization, and intensification, at various grammatical levels. Because it analyzes each sentence in sequential stages, this method can process sentences of various complexities, including simple, compound, complex (with a complement and relative clauses), and complex-compound sentences.

To measure the accuracy of our affect recognition algorithm, we extracted 700 sentences from a collection of diary-like blog posts provided by BuzzMetrics (http://en-us.nielsen.com/tab/product_families/nielsen_buzzmetrics). We focused on online-diary or personal-blog entries, which are typically written in a free style and are rich in emotional colorations. Three independent annotators labeled each sentence with one of the nine emotions (or labeled it neutral) and a corresponding intensity value. Empirical testing showed promising results regarding the AAM’s ability to accurately classify affective information. Employing the Connexor Machinese Syntax parser (www.connexor.eu/technology/machinese/machinesesyntax), the AAM achieves 81.5 percent accuracy.
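As a toy illustration of the compositionality principle (not the AAM’s actual rules, lexicon, or intensity scale), the sketch below combines word-level emotion estimates into a sentence-level estimate using just two of the rules named above, intensification and neutralization.

```python
# Invented mini-lexicon and rules for illustration only; the real AAM also
# applies aggregation, propagation, and domination at several grammatical levels.
LEXICON = {"happy": ("joy", 0.7), "terrified": ("fear", 0.9), "sad": ("sadness", 0.6)}
INTENSIFIERS = {"very": 1.5, "extremely": 2.0}
NEGATIONS = {"not", "never"}

def sentence_emotion(tokens):
    """Return (dominant_emotion, intensity) for a tokenized sentence, or ('neutral', 0.0)."""
    scores = {}
    boost, negate = 1.0, False
    for tok in (t.lower() for t in tokens):
        if tok in INTENSIFIERS:
            boost *= INTENSIFIERS[tok]        # intensification rule
        elif tok in NEGATIONS:
            negate = True                     # neutralization rule
        elif tok in LEXICON:
            emotion, strength = LEXICON[tok]
            if not negate:
                scores[emotion] = max(scores.get(emotion, 0.0), min(1.0, strength * boost))
            boost, negate = 1.0, False        # reset modifiers after each emotion word
    if not scores:
        return "neutral", 0.0
    dominant = max(scores, key=scores.get)
    return dominant, scores[dominant]

print(sentence_emotion("I am very happy".split()))   # ('joy', 1.0)
```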

EmoHeart

The EmoHeart object listens to its owner’s messages and sends each message to the AAM’s Web-based interface. After receiving the results (the dominant emotion and intensity), it visually reflects the sensed affective state through

■ animation of the avatar’s facial expression,
■ EmoHeart’s texture (its expression, shape, and color, which indicate the type of emotion), and
■ EmoHeart’s size (indicating the emotion’s strength: low, medium, or high).

If the AAM detects no emotion, EmoHeart remains invisible, and the avatar’s facial expression remains neutral. Figures 3 and 4 show avatar facial expressions and EmoHeart textures.

From December 2008 to January 2009, 89 Second Life users became owners of EmoHeart; 74 actually communicated using it. We stored the text messages along with the AAM results in an EmoHeart log database. The AAM categorized 20 percent of the sentences as emotional and 80 percent as neutral. The most frequent emotion conveyed was joy (68.8 percent of all emotional sentences), followed by surprise (9.0 percent), sadness (8.8 percent), and interest (6.9 percent). We believe that this dominance of positivity is due to the nature and purpose of online communication media (to establish or strengthen interpersonal relationships).
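One way to picture EmoHeart’s visualization logic is as a small mapping from the AAM output to display parameters. The intensity thresholds and texture names below are invented for illustration; the article states only that texture encodes the emotion type, size encodes low/medium/high strength, and EmoHeart stays invisible for neutral text.

```python
# Hypothetical mapping from (emotion, intensity) to EmoHeart's visual state.
def emoheart_appearance(emotion: str, intensity: float) -> dict:
    if emotion == "neutral" or intensity <= 0.0:
        return {"visible": False}                 # no emotion: EmoHeart stays hidden
    if intensity < 0.34:
        size = "low"
    elif intensity < 0.67:
        size = "medium"
    else:
        size = "high"
    return {
        "visible": True,
        "texture": f"emoheart_{emotion}",         # texture encodes the emotion type
        "size": size,                             # size encodes the emotion's strength
        "avatar_animation": f"face_{emotion}",    # matching facial expression
    }

print(emoheart_appearance("joy", 0.8))
```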

Affective Haptic Devices

According to the James-Lange theory, the conscious experience of emotion occurs after the cortex receives signals about physiological-state changes. That is, certain physiological changes precede feelings. Also, something as simple as changing a facial expression can easily evoke emotions; for example, a smile can cause happiness. This fact supports noncognitive theories of the nature of emotions. To support affective communication, iFeel_IM! incorporates three types of haptic devices:

■ HaptiHeart, HaptiButterfly, HaptiTemper, and HaptiShiver implicitly elicit emotion.
■ HaptiTickler directly evokes emotion.
■ HaptiHug uses social touch to influence mood and provide a sense of physical copresence.

Figure 5 shows these devices worn on a human body and their 3D models. Each emotion is characterized by a specific pattern of physiological changes. We selected four distinct emotions having strong physical features: anger, fear, sadness, and joy. The AAM recognizes these emotions with considerably higher precision (anger, 92 percent; fear, 91 percent; joy, 95 percent; and sadness, 88 percent) than it does for other emotions. Table 1 lists the emotions that each haptic device induces.

HaptiHug. Online interactions rely heavily on vision and hearing, so a substantial need exists for mediated social touch.2 Of the forms of physical contact, hugging is particularly emotionally charged; it conveys warmth, love, and affiliation. Recently, researchers have made several attempts to create a hugging device.3–6 However, the proposed interfaces don’t produce a natural hugging sensation, so they can’t elicit a strong affective experience. They also lack a visual representation of the partner, which adds ambiguity (real-life hugging involves both a visual and physical experience). In addition, they don’t consider the power of social pseudohaptic illusion (that is, they don’t incorporate hugging animation).


Figure 3. Avatar facial expressions with the corresponding EmoHeart: (a) joy, (b) sadness, (c) anger, and (d) fear. EmoHeart’s texture (its expression, shape, and color) indicates the type of emotion; its size indicates the emotion’s strength.

Figure 4. Users chatting in Second Life through their avatars. The EmoHeart vividly and expressively represents the communicated emotions.


Figure 5. Our affective haptic devices worn on a human body, along with their 3D models. These devices implicitly elicit emotion, directly evoke emotion, or use social touch to influence mood and provide a sense of physical copresence.

Table 1. The iFeel_IM! affective haptic devices with the emotions they stimulate.

Device           Joy    Sadness    Anger    Fear
HaptiHeart       No     Yes        Yes      Yes
HaptiButterfly   Yes    No         No       No
HaptiShiver      No     No         No       Yes
HaptiTemper      Yes    No         Yes      Yes
HaptiTickler     Yes    No         No       No
HaptiHug*        Yes    No         No       No

*HaptiHug also simulates social touch.
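Expressed as data, Table 1 is a lookup from a recognized emotion to the devices that should be activated. The sketch below mirrors the table; the names are ours for illustration, and how each device is then driven (heartbeat pattern, temperature, tickling, or hug pressure) is device-specific and not shown.

```python
# Table 1 as a lookup from detected emotion to the haptic devices it triggers.
DEVICES_FOR_EMOTION = {
    "joy":     ["HaptiButterfly", "HaptiTemper", "HaptiTickler", "HaptiHug"],
    "sadness": ["HaptiHeart"],
    "anger":   ["HaptiHeart", "HaptiTemper"],
    "fear":    ["HaptiHeart", "HaptiShiver", "HaptiTemper"],
}

def devices_for(emotion: str) -> list[str]:
    """Return the devices to activate for a recognized emotion (empty if neutral)."""
    return DEVICES_FOR_EMOTION.get(emotion, [])
```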

Figure 6. HaptiHug physically reproduces a human hug by generating pressure on the user’s chest and back.


We developed HaptiHug to create a wearable haptic display generating forces similar to those of a human hugging another human. Such a device should be lightweight, compact, comfortable, and aesthetically pleasing and should have low power consumption. HaptiHug’s key feature is that it physically reproduces the human-hug pattern, generating pressure simultaneously on each user’s chest and back.

In HaptiHug, a holder on the user’s chest contains two oppositely rotating motors. The device’s hands (called Soft Hands), which are aligned horizontally, contact the user’s back. Once HaptiHug receives the hug command, the motors tense the belt, thus pressing HaptiHug’s hands and chest part toward the human body (see Figure 6).

Soft Hands are based on real human hands and are made from 5-mm-thick sponge rubber (see Figure 7). Two pieces of the hand-shaped material sandwich narrow belt slots, with plastic screws connecting them. This structure provides enough flexibility to snugly fit the human back when the belt presses against it. Moreover, the belt can move loosely inside the hands during tension.

Experiments with actual human hugging showed that during plain hugging, the average pressure on the male back, female back, and chest area was 1.4 kN/m², 1.7 kN/m², and 2.3 kN/m², respectively. HaptiHug can achieve the force level of a plain hug. Producing stronger forces would require more powerful motors and might cause unpleasant sensations.


Figure 7. Soft Hands (a) dimensions and (b) structure. Soft Hands are based on those of a real human and made from soft material so that hugging partners can realistically feel the social presence of each other.

So, for online communication, we assigned the actual pressure of a plain hug to a “great big hug.” That is, we scaled the force level so that the hug sensation for a great big hug would be more comfortable for users.

On the basis of our experimental results, we designed the control signals such that the pressure intensity, pattern, and duration are like those of human hugs. iFeel_IM! controls the hug’s duration and intensity in accordance with the emoticon or keyword detected from the text. To present a regular hug (for example, “(>^_^)>,” “{},” or “”), a big hug (for example, “>:D
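As a rough sketch of this scaling, the snippet below maps three hug levels detected from text cues to a target pressure and duration for the motor controller. The scale factors and durations are illustrative assumptions; only the measured plain-hug pressures and the rule that the strongest (“great big”) hug receives the full plain-hug pressure come from the article.

```python
# Measured plain-hug pressures from the article, in kN/m^2 (= kPa).
PLAIN_HUG_PRESSURE = {"back_male": 1.4, "back_female": 1.7, "chest": 2.3}

# Hypothetical hug levels: (fraction of plain-hug pressure, duration in seconds).
HUG_LEVELS = {"regular": (0.5, 1.0), "big": (0.75, 1.5), "great_big": (1.0, 2.0)}

def hug_command(level: str, area: str = "chest") -> tuple[float, float]:
    """Return (target pressure in kN/m^2, duration in s) for the given hug level."""
    scale, duration = HUG_LEVELS[level]
    return PLAIN_HUG_PRESSURE[area] * scale, duration

print(hug_command("great_big"))   # (2.3, 2.0)
```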
