
A Light-weight Emotion Model for Non-Player Characters in a
Video Game

by

Yathirajan Brammadesam Manavalan

A thesis submitted in partial fulfillment of the requirements for the degree of

Master of Science

Department of Computing Science

University of Alberta

© Yathirajan Brammadesam Manavalan, 2015


Abstract

Displaying believable emotional reactions in virtual characters is required in applications ranging from virtual-reality trainers to video games. Manual scripting is the most frequently used method and enables an arbitrarily high fidelity of the emotions displayed. However, scripting is labor-intensive and thus greatly reduces the scope of the emotions displayed and of emotionally affected behavior in virtual characters. As a result, only a few virtual characters can display believable emotions, and only in pre-scripted encounters. In this thesis we implement and evaluate a light-weight algorithm for procedurally controlling both the emotionally affected behavior and the emotional appearance of a virtual character. The algorithm is based on two psychological models of emotions: conservation of resources and appraisal. The former component controls the emotionally affected behavior of a virtual character, whereas the latter generates explicit numeric descriptors of the character's emotions, which can be used to drive the character's appearance. We implement the algorithm in a simple testbed and compare it to two baseline approaches via a user study. Human participants judged the emotions displayed by the algorithm to be more believable than those of the baselines.



Figure 5.2: An individual showing hope (top left), joy (top right), fear (bottom left)
and distress (bottom right).

At every time step, the person at the head of the line purchases the video game and leaves the line. The simulation was stopped when the line became empty. Visually, each person in line was represented with a photograph showing their facial expression (only the highest-intensity emotion was shown; Figure 5.2), their name, and the three resources. Health was visualized with a bar underneath the image. Reputation was shown by the color of the frame around the portrait (Figure 5.3). Rank was shown by the position of the person in line (Figure 5.1). Additionally, people in line uttered one-line remarks shown as text above their heads (Table 5.1). There were 18 individuals (Figure 5.4), from whom a line of six was randomly populated (without repetition).
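The queue dynamics described above can be summarized in a short sketch. The following Python snippet is only a minimal illustration under our own assumptions: the Person class, its attribute names, and the run_simulation function are hypothetical stand-ins for the actual ACORE testbed code, and the per-step emotional updates are elided.

import random

# Hypothetical stand-in for a person in the line; the real ACORE state
# also carries emotion intensities produced by the appraisal component.
class Person:
    def __init__(self, name):
        self.name = name
        self.health = 100      # visualized as a bar underneath the portrait
        self.reputation = 0    # visualized as the color of the portrait frame
        # rank is implicit: the person's current position in the line

def run_simulation(roster, line_length=6):
    # A line of six is drawn from the 18 individuals without repetition.
    line = random.sample(roster, line_length)
    while line:                  # stop when the line becomes empty
        buyer = line.pop(0)      # the person at the head buys the game and leaves
        print(f"{buyer.name} purchases the game and leaves the line.")
        # ... per-step updates would go here: resource changes, appraisal,
        #     and display of the highest-intensity emotion ...

roster = [Person(f"Person {i}") for i in range(18)]
run_simulation(roster)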

5.1.1 Implementation Details

ACORE was originally implemented as a text-based demonstration, with the algorithm written in Python. We then decided to show the emotions as facial expressions on a web page. The web page was created with Django, which could run the Python code directly in the back end. For the facial expressions we initially used Face Plus (Mixamo, 2013) (Figure 5.5). However, the number of characters available in Face Plus was very limited, so to incorporate more variety we decided to use photographs of human faces (Figure 5.4). To run the user studies we obtained the domain http://acore.cs.ualberta.ca/, but we ran into issues when trying to deploy the Django server on that site. As a result, we rewrote the ACORE algorithm and the user interface in client-side JavaScript and ran the algorithm from a web page in our user study.

Figure 5.3: The visual representation of resources.

Table 5.1: One-line utterances.

Condition                    Utterance
At the head of the line      Can I get a copy of Destiny?
Being passed                 Stop!
                             Where are you going?
                             You shall not pass!
Having been just passed      Not fair!
                             I will get back at you!
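For illustration only, the display logic described above can be sketched as follows. The shipped version was written in client-side JavaScript; this Python sketch uses the emotion names of Figure 5.2 and the utterances of Table 5.1, while the dictionary layout, condition keys, and function names are our own assumptions rather than the thesis code.

import random

# Utterances keyed by the conditions of Table 5.1 (illustrative key names).
UTTERANCES = {
    "at_head_of_line": ["Can I get a copy of Destiny?"],
    "being_passed": ["Stop!", "Where are you going?", "You shall not pass!"],
    "just_passed": ["Not fair!", "I will get back at you!"],
}

def expression_to_show(emotions):
    """Return the emotion with the highest intensity (only that face is shown)."""
    # emotions: e.g. {"hope": 0.2, "joy": 0.1, "fear": 0.7, "distress": 0.4}
    return max(emotions, key=emotions.get)

def utterance_for(condition):
    """Pick one of the one-line remarks associated with the given condition."""
    return random.choice(UTTERANCES[condition])

# Example: a fearful character who has just been passed in line.
print(expression_to_show({"hope": 0.2, "joy": 0.1, "fear": 0.7, "distress": 0.4}))
print(utterance_for("just_passed"))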

5.1.2 Participants

For the user study we recruited 94 participants (30 males, 64 females; mean age 20). The participants came from the research pool at the Department of Psychology at the University of Alberta. The students participated for partial course credit and were also given the option of completing an alternate assignment (Appendix A.5) if they chose not to participate in the experiment. None of the participants opted for the alternate assignment.



Figure B.9: Kory.

Figure B.10: Leona.

Figure B.11: Matt.

Figure B.12: Nicole.

Figure B.13: Nitya.



Figure B.14: Pankaj.

Figure B.15: Rohit.

Figure B.16: Stephanie.

Figure B.17: Tom.

Figure B.18: Vince.

