
The Artificial Emotion Engine: Driving Emotional Behavior.

    Ian Wilson

    Artificial-Emotion.Com

    http://artificial-emotion.com

    [email protected] 1999.

Abstract. The Artificial Emotion Engine is designed to generate a variety of rich emotional behaviors for autonomous, believable, interactive characters. These behaviors take the form of facial and body gestures, motivational actions and emotional internal states. The engine helps developers populate simulations with emotional characters while freeing them of the need for an in-depth knowledge of neurophysiological systems. Technologies like the Artificial Emotion Engine are needed to help fuel the growth of the interactive entertainment industry from a niche medium to the mass-market medium it is becoming. Mass-market products will require social interaction and engaging emotional characters to facilitate the creation of some of their most powerful experiences. Many of the technologies currently employed to produce character behavior, such as finite state machines (FSMs) and scripted conditional logic, are being stretched beyond their limits. Robotic, one-dimensional characters with limited sets of behaviors populate many interactive experiences; this problem needs to be addressed. In this paper we present an overview of the engine and a review of the neurophysiological model it employs.

    Introduction.

Recent advances in neuroscience have begun to pave the way for the simulation of emotion. We define emotion as comprising two conceptual, and in some senses physical, properties: cognitive emotions and non-cognitive, or innate, emotions. We deal primarily with non-cognitive emotions in the first version of the engine. Dealing with cognition is the Holy Grail of A.I., but it is as yet some way off and fraught with problems and complications. Neuroscience has no clear understanding of our cognitive systems, symbolic representations or memory and how these work, which makes any attempt at simulation speculation at best. Simulation of non-cognitive emotions, while less expressive than cognitive emotions, is an attainable goal in the near term.

Recent work is showing us that emotion is an integral part of our brain's overall functioning. There is no clear delineation between thinking and feeling; both are neural functions that interact and feed back as a unified system, the brain. So what are we looking to achieve with this engine? Because this system is being developed as an engine, it needs to work in a variety of applications, many of which will not previously have been attempted, especially socially based applications. This constrains the design of the engine but also takes some of the burden away from it. From an action perspective, the engine provides the character's current motivations. The designer or developer can then use this information to determine what the character will do next.

We conceptually divide emotions into three layers of behavior. At the top level are what we term reactions, or momentary emotions; these are the behaviors that we display briefly in reaction to events, for example when someone strikes us. The next level down is moods; these are prolonged emotional states caused by the cumulative effect of momentary emotions, that is, signals of punishment and reward. Underlying both of these layers, and always present, is our personality; this is the behavior that we generally display when no momentary emotion or mood overrides it (Fig 1).

Fig 1. The three layers of emotional behavior: reactions, moods and underlying personality.


These behavior levels have different levels of prominence. Reactions, or momentary emotions, have the highest priority. If no reaction is generated, we generate behavior based on the current mood. If the mood level is below a certain threshold, we generate behavior based on the underlying personality (Fig 2).

Fig 2. Behavior selection by priority: reactions, then mood, then personality.
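
To make the selection logic concrete, the following is a minimal Python sketch of the priority scheme just described. The function name and the threshold value are our own illustrative choices, not part of the engine's published API.

```python
from typing import Optional

MOOD_THRESHOLD = 0.2  # hypothetical cutoff below which mood no longer drives behavior

def select_behavior(reaction: Optional[str], mood_level: float,
                    mood_behavior: str, personality_behavior: str) -> str:
    """Pick a behavior source by priority: reaction > mood > personality."""
    # 1. Momentary emotional reactions always take precedence.
    if reaction is not None:
        return reaction
    # 2. Otherwise a sufficiently strong mood drives behavior.
    if abs(mood_level) >= MOOD_THRESHOLD:
        return mood_behavior
    # 3. With no reaction and a weak mood, the underlying personality shows through.
    return personality_behavior

print(select_behavior(None, 0.05, "sulk", "cheerful_idle"))  # -> cheerful_idle
```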

The character's personality modulates all of the systems in the engine and produces unique behaviors for each character. This allows the engine to be very small, elegant and robust while still providing an almost infinite array of behaviors. The most powerful behaviors that the engine produces are the character's body and facial gestures. These are very important attributes in the creation of believable interactive characters. Gestures add extra channels of communication to a character and help not only in expressing emotion but also in pushing the viewer's brain over a "believability acceptance threshold". These behavioral gestures can be broken down further and described in terms of specific body parts. Facial gestures are comprised of eyebrow, eye and mouth movements. Body gestures are comprised of spinal, shoulder, head and neck, and hip movements. In the following sections we describe the model that the engine is based upon, the architecture of the engine, the engine's interface (API) and the different modules of the engine, and we detail an example scenario of the engine at work and how it fits into an application.

    A Model of Human Personality.

The conceptual model we use here is based on the extensive work of Professor Hans Eysenck (1985) and Professor Jeffrey Gray (1970, 1971, 1975), two of the most prominent researchers in the fields of personality psychology and neurology. Their models represent some of the most robust and solidly grounded work in the field. Our model takes fundamental elements from Eysenck's behavioral and psychobiological top-down model and Gray's neurological and psychobiological bottom-up model and combines the two to form a new synthesis representing the emotional brain. Although this model was developed with the intention of simulating rather than authentically reproducing human personality, it has nonetheless been described by Professor Gray as a valid and sound model.

What is personality? We define personality as the genetic and environmental differences in the brain structures of individuals. As a species our brains are almost identical, which gives rise to our common sets of behaviors. We are, however, all genetically and environmentally unique, and as such our brains, although similar, all differ to varying degrees. It is these differences that give us our unique behavioral variations on common behavior patterns: our personality.

The model defines a character's personality as an area in 3-dimensional Cartesian space. The axes of this space are Extroversion, Fear and Aggression (EFA space). Personality traits are represented by points within this space and are positioned according to the amount by which they correlate with each axis. For example, the personality trait of anxiety is positioned at (E -30, F +70, A -10). This tells us that the trait of anxiety is associated -30% with Extroversion, 70% with Fear and -10% with Aggression (Fig 3).

Fig 3. Personality traits positioned as points in EFA (Extroversion, Fear, Aggression) space.
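
As a small illustration, a trait point might be represented as follows. This is a sketch under our own assumptions: the class and field names are hypothetical, and only anxiety's coordinates come from the example above.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Trait:
    """A personality trait as a point in EFA space (each axis in -100..+100)."""
    name: str
    e: float  # % correlation with Extroversion
    f: float  # % correlation with Fear
    a: float  # % correlation with Aggression

# The paper's example: anxiety at (E -30, F +70, A -10).
anxiety = Trait("anxiety", e=-30.0, f=70.0, a=-10.0)
```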

The position of the center of the personality area in this EFA space (point P) represents by how much each of those axes (central traits) defines the overall personality. From this personality point we produce a sphere which contains the set of personality traits that make up the aggregate personality and are available to the character. Representing the many subtle facets of human personality as being constructed from only three constituent parts seems, on the surface, like a gross oversimplification.


However, if we only partition our EFA axes into intervals of 10 units we have a potential 20 x 20 x 20 = 8,000 different personality traits. In general, the average person would find it difficult to define more than 50 individual traits. Think of the way in which the simple colors red, green and blue can be combined to produce millions of colors and we have a good analogy of how the three dimensions of personality combine to produce many of the facets of human behavior.

The reason for having three central dimensions of personality becomes clear when we consider that the brain may have three central systems that mediate behavior. We believe these are the Approach System, which is associated with Extroversion; the Behavioral Inhibition System, which is associated with Fear; and the Fight/Flight System, which is believed to be the biological basis of Aggression (Gray 1975). The behavior we display at any time is thought to be controlled by these three systems, and the way in which we behave is determined by the genetic prominence of each system. So if our personality were that of an anxious person, our Behavioral Inhibition System would be very prominent in determining our behavior, our personality would have a high Fear component and we would be generally fearful or cautious.
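
To make this concrete, here is a minimal Python sketch of the aggregate personality as a sphere of traits around personality point P, and of reading off the dominant behavioral system. The radius, the Euclidean distance test and the sociability coordinates are illustrative inventions; only anxiety's coordinates and the axis-to-system mapping come from the paper.

```python
import math

# Traits as (name, (E, F, A)) points; anxiety's coordinates are from the
# paper, sociability's are invented for the example.
TRAITS = [("anxiety", (-30.0, 70.0, -10.0)),
          ("sociability", (80.0, -20.0, 0.0))]

def traits_in_sphere(p, radius, traits):
    """The aggregate personality: traits within `radius` of personality point P."""
    return [name for name, point in traits if math.dist(p, point) <= radius]

def dominant_system(p):
    """Map the most prominent EFA axis to its hypothesized brain system."""
    systems = {"E": "Approach System",
               "F": "Behavioral Inhibition System",
               "A": "Fight/Flight System"}
    axis = max(zip("EFA", p), key=lambda pair: pair[1])[0]
    return systems[axis]

# An anxious character: a personality point P with a high Fear component.
P = (-30.0, 70.0, -10.0)
print(traits_in_sphere(P, 25.0, TRAITS))  # -> ['anxiety']
print(dominant_system(P))                 # -> Behavioral Inhibition System
```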

A Model of Human Moods.

As with most elements of this system, mood is determined by perceived signals of punishment and reward, modulated by the character's personality. This means that the character's personality position in EFA space affects the range of positive and negative moods and also the mood's rate of change. The maximum level of positive moods increases with increased E, the maximum level of negative moods increases with increased F, and the speed at which moods build up and decay increases with increased A values. If we take a character with a personality in the high-E, high-F, high-A area of our 3-dimensional space, their moods would display large positive and large negative values and would build up and decay (change) quickly. Such a character would be very "moody" and have large mood swings, as is the case for people with personalities in this area.
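
A minimal sketch of one way such a personality-modulated mood update could look. The specific scaling of the bounds and rate by E, F and A is our own illustrative reading of the description above, not the engine's published formula.

```python
def update_mood(mood, perceived_pr, e, f, a):
    """One time step of a personality-modulated mood update.

    mood:         current mood (negative = bad, positive = good)
    perceived_pr: perceived punishment (< 0) or reward (> 0) this step
    e, f, a:      Extroversion, Fear, Aggression, each scaled to 0..1
    """
    max_positive = e           # higher E widens the positive mood range
    max_negative = -f          # higher F widens the negative mood range
    rate = 0.1 + 0.9 * a       # higher A makes moods build and decay faster

    mood += rate * perceived_pr   # build up toward the incoming signal
    mood *= 1.0 - 0.05 * rate     # decay back toward neutral
    return max(max_negative, min(max_positive, mood))

# A high-E, high-F, high-A character: large swings that change quickly.
mood = 0.0
for signal in (+0.5, +0.5, -0.5, -0.5, -0.5):
    mood = update_mood(mood, signal, e=0.9, f=0.9, a=0.9)
print(round(mood, 2))  # a clearly negative mood after a run of punishment
```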

    Artificial Emotion Engine Architecture.

As we can see from Figure 4, the engine is built around nine modules. Six of these modules represent conceptual neural systems: Emotional Reactions, Personality, Punishment and Reward, Mood, Memory and Motivations. Input and Output represent the engine's interface, the API. The Self State is the central data repository for the engine and represents the character's general emotional state at any time. The arrows in the diagram represent the feedback and interactions within the system, making it highly dynamic and complex in nature and allowing for some degree of self-organization.

Fig 4. The nine modules of the Artificial Emotion Engine and their interactions.

Punishment and Reward. We, like most animals, seem to function with just two types of input signal: punishment and reward. To this end, the majority of this system is fueled by these two inputs wherever possible. When this is not possible (i.e., the mechanism is not yet understood) we use higher-level conceptualizations. The function of the Punishment and Reward module is to take the incoming signals of raw, sensed punishment and reward (p/r) and translate them into perceived signals of p/r. The difference between these two varieties of p/r is that perceived p/r is dependent on the character's previous history of received p/r and its personality. Two concepts are used here: habituation and novelty. The more the character receives, in succession, a signal of one type, the lower the effect that signal has; this is habituation. Conversely, the longer the character goes without receiving one type of signal, the greater the effect of that signal when it is received; this is novelty. The character's personality determines how susceptible they are to signals of punishment or reward. For example, the psychopath is highly susceptible to reward but highly unsusceptible to punishment. This makes them go after thrills without regard for the consequences.
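
The following Python sketch shows one plausible reading of the habituation and novelty mechanism. The decay constant, recovery rate and susceptibility weighting are our own assumptions, not the engine's published values.

```python
class PerceivedSignal:
    """Turns raw p/r signals into perceived p/r using habituation and novelty."""

    def __init__(self, susceptibility, recovery_steps=10.0):
        self.susceptibility = susceptibility  # personality-derived gain
        self.recovery_steps = recovery_steps  # steps for full novelty to return
        self.sensitivity = 1.0                # current novelty of this signal type
        self.last_step = 0

    def perceive(self, raw, step):
        # Novelty: sensitivity recovers the longer this signal has been absent.
        elapsed = step - self.last_step
        self.sensitivity = min(1.0, self.sensitivity + elapsed / self.recovery_steps)
        perceived = raw * self.sensitivity * self.susceptibility
        # Habituation: each successive signal dulls the effect of the next one.
        self.sensitivity *= 0.5
        self.last_step = step
        return perceived

# A psychopath: highly susceptible to reward, barely touched by punishment.
reward = PerceivedSignal(susceptibility=0.9)
punishment = PerceivedSignal(susceptibility=0.1)
print(reward.perceive(1.0, step=1), reward.perceive(1.0, step=2))  # second is weaker
```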


Motivations. Our motivations are arranged into a four-layer hierarchy of needs (Maslow 1970). The physiological needs are at the bottom, followed by safety needs, affiliation needs and finally esteem needs. Each of these layers is made up of specific sub-needs. The physiological need layer comprises the hunger, thirst, warmth and energy sub-needs. Punishment and reward determine whether each of these needs has been satisfied or frustrated (is closer to or further from an equilibrium point). We prioritize these needs depending on how far they are from their equilibrium point, and this determines what action the character needs to take to rectify the situation. You may have realized that although this engine does not have any cognitive components, these motivations supply a very wide range of low-level actions for the character to take. These actions are at the level of lower-order primate behavior and may suffice for many applications. Personality determines where each layer sits in the hierarchy. Physiological needs are always the top priority, but the other layers are interchangeable. For example, our psychopath may prioritize esteem above affiliation (friendship and kinship), depending on how much of a psychopath they are.
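
A minimal sketch of need prioritization as described above. The sub-need names come from the paper; the level and equilibrium values and the simple distance-based sort are illustrative assumptions.

```python
# Each need tracks its current level against an equilibrium point.
needs = {
    "hunger": {"level": 0.8, "equilibrium": 0.2},
    "thirst": {"level": 0.3, "equilibrium": 0.2},
    "warmth": {"level": 0.1, "equilibrium": 0.5},
    "energy": {"level": 0.5, "equilibrium": 0.5},
}

def prioritize(needs):
    """Order needs by distance from their equilibrium point, furthest first."""
    return sorted(needs,
                  key=lambda n: abs(needs[n]["level"] - needs[n]["equilibrium"]),
                  reverse=True)

print(prioritize(needs))  # -> ['hunger', 'warmth', 'thirst', 'energy']
```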

Personality. The description of the model of personality above gives a good conceptual view of this module. Essentially it is a database of personality traits and their positions in EFA space, along with a rating of each trait's social desirability. This social desirability scale value (sdsv) determines whether the trait should be used when the character is in a good (+sdsv) mood or a bad (-sdsv) mood.

Mood. The mood module takes the incoming perceived signal of punishment or reward and uses it to alter the character's current mood. Obviously, punishment pushes the character's mood in a negative direction while reward does the opposite. How far and to what level the mood alters in any one time step is a function of the personality (as mentioned previously). The psychopath, with a large A component to his personality, has moods that change quickly and so is easily pushed into a negative mood by a few signals of punishment.

Memory. In this version of the engine we are implementing a very constrained form of memory to facilitate some non-cognitive social processing; it is therefore a social memory. It is used in conjunction with the affiliation and esteem needs to provide enhanced social behavior, which is the primary application focus of this engine. It keeps track of how much a particular character is liked based on a variety of factors, including the social preferences of our family and friends. It also allows us to deepen the abilities of the reactive emotion module using this enhanced data.

Reactive Emotions. These are, generally, innate emotional reactions that have not involved any deep cognitive processing. As such, they represent stimulus-response reactions of the kind hard-wired into our neural circuits. We use the six types of emotional reaction described by Ekman: Joy, Sadness, Fear, Anger, Surprise and Disgust. Again, personality plays a part in modulating these reactions. Our psychopath would show very little fear in his reactions but may show a great deal of anger. The signals of punishment and reward and the levels of the motivational needs trigger the reactions if these signals cross a reaction threshold.
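
A sketch of reaction triggering under these assumptions: Ekman's six categories come from the paper, while the per-reaction gains and the single threshold value are our own illustrative choices.

```python
EKMAN_REACTIONS = ("joy", "sadness", "fear", "anger", "surprise", "disgust")

def triggered_reactions(signals, personality_gain, threshold=0.5):
    """Reactions whose personality-scaled drive crosses the reaction threshold.

    signals:          raw drive per reaction from p/r signals and need levels
    personality_gain: per-reaction modulation derived from personality
    """
    fired = []
    for reaction in EKMAN_REACTIONS:
        level = signals.get(reaction, 0.0) * personality_gain.get(reaction, 1.0)
        if level >= threshold:
            fired.append(reaction)
    return fired

# Our psychopath: fear heavily damped, anger amplified.
gain = {"fear": 0.1, "anger": 1.5}
print(triggered_reactions({"fear": 0.9, "anger": 0.4}, gain))  # -> ['anger']
```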

The Self State. This is where we centrally store all data about the emotional state of the agent.

    API Input.

The interface to the engine is designed to be as simple and elegant as possible, leaving the developer with very little work. As input, the engine requires only signals of punishment or reward for each of the physiological and safety needs (eight in total). It also takes input representing signals received by the five senses, for emotional reactions. Some of these will remain constant if not changed, like the shelter need (we assume we are sheltered unless we are informed otherwise). Some will increase or decrease unless information is received to the contrary (i.e., hunger will increase unless we receive a reward signal for the hunger need). Obviously, the ideal situation would be for the character to determine for itself when it was, for example, sheltered or feeding. But right now, in a general-purpose engine, this is not possible and the (small) cognitive burden has to be placed with the developer or designer. A data structure is provided to hold these values, and at each (developer-determined) time step a call is made to update the engine. This, apart from the character's engine initialization, is all that is required as input to the engine.
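
A sketch of what such an input structure and per-step update call might look like. The field and function names are hypothetical: the paper specifies only that there are eight need signals plus sense inputs, and it names hunger, thirst, warmth, energy and shelter, so the remaining fields are placeholders.

```python
from dataclasses import dataclass, field

@dataclass
class EngineInput:
    # Punishment/reward per physiological and safety need (eight in total).
    # Convention here: reward > 0, punishment < 0, 0 = no signal this step.
    hunger: float = 0.0
    thirst: float = 0.0
    warmth: float = 0.0
    energy: float = 0.0
    shelter: float = 0.0
    safety_1: float = 0.0   # hypothetical: the paper does not name all four safety needs
    safety_2: float = 0.0   # hypothetical
    safety_3: float = 0.0   # hypothetical
    # Signals received by the five senses, used for emotional reactions.
    senses: dict = field(default_factory=dict)

# Each developer-determined time step, something like:
# engine.update(EngineInput(hunger=+0.3, senses={"sight": "loud_stranger"}))
```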

    API Output.

Output is provided in three formats to give the developer a choice of power or flexibility. At its most expressively powerful, the engine generates the positional information required to produce emotional body and facial gestures. For example, if we had a character whose body joints were being driven by an inverse kinematics system, the emotion engine would provide the joint deviations from a plain "vanilla" movement needed to make that movement emotional. So if we had a timid character, the spine would be curved backward, the shoulders hunched forward and the head down. Further to this, the engine determines the character's movement speed, style and smoothness. A neurotic character would move quickly, in short bursts and in a very staccato, jerky fashion.

At the second level, the engine generates a semantic action plan determined by the character's motivational needs. This plan determines not only what to do but how; the how is taken from the current mood and the personality trait selected.


As an example, if the character had been out in the cold rain for a while, their temperature (warmth need) would be receiving a constant stream of punishment. This would have the effect of making the mood level highly negative and the warmth need a high priority. If the character had a timid personality, the resulting semantic output might be: increase warmth very anxiously. Here, increase warmth comes from the motivational need for warmth satisfaction, very comes from the importance of the need and the mood level, and anxiously is a trait that this particular character possesses and is appropriate for a negative mood (-sdsv). The how of this action plan adds a great deal of depth to the action.

The third level of output is simply the character's raw emotional state data. This gives the developer the flexibility to use the emotional state in ways not handled by the other two modes.
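
A sketch of how the second-level semantic plan from the example above could be assembled. The action / intensity / trait decomposition comes from the paper; the function, its inputs and the selection rules are hypothetical.

```python
def semantic_plan(need, priority, mood, traits_by_sdsv):
    """Build a plan like 'increase warmth very anxiously'.

    need:           the highest-priority motivational need, e.g. "warmth"
    priority:       0..1 urgency from the need's distance from equilibrium
    mood:           current mood in [-1, 1]
    traits_by_sdsv: {"positive": [...], "negative": [...]} trait adverbs
    """
    action = f"increase {need}"
    # Intensity derived from the need's importance and the mood level.
    intensity = "very " if priority > 0.7 or abs(mood) > 0.7 else ""
    # Pick a trait matching the mood's sign (+sdsv for good, -sdsv for bad).
    pool = traits_by_sdsv["negative" if mood < 0 else "positive"]
    return f"{action} {intensity}{pool[0]}"

print(semantic_plan("warmth", 0.9, -0.8, {"negative": ["anxiously"],
                                          "positive": ["cheerfully"]}))
# -> "increase warmth very anxiously"
```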

    Example Scenario.

Jane is a 38-year-old mother of two playing an office simulation. She is the boss and has to manage the well-being of her office co-workers. Unlike most games, the tasks that the team have to perform are secondary to their social interactions. Different team members respond differently to the same situations that the team encounters. She has to make them all work well together and keep them happy (the goal). Jane starts out by taking a personality test to determine her own personality. The result is passed to the Artificial Emotion Engine of her character, which will now display behaviors somewhat similar to her own. She then decides on the personalities of her five co-workers. She chooses an extrovert, a neurotic, a psychopath, an introvert and a character with a personality similar to her own. Sounds like a sitcom in the making!

Day one, Jane's (alter ego's) first day at work. Her character is greeted warmly by the extrovert, who smiles a wide smile; the introvert sits at his desk and carries on typing, looking glum and hardly acknowledging her presence. The neurotic looks mildly panicked; she does not like change. Her shoulders slump forward and she curls up, looking submissive. The psychopath sticks out his chest, shoulders back, and fixes Jane's character with a steely gaze as he marches, quickly and firmly, over to her to make his presence known. The character with the personality similar to Jane's, John, looks unsure. He is unable to decide if his new boss is a good thing or a bad one for him. Jane, knowing his personality, can empathize with him, so she makes the first move to smile and greet John.

What we are trying to show here is that, in addition to improving current genres, a new generation of interactive entertainment experience based on emotion and social interaction is possible using this kind of technology.

    Conclusion.

The use of a solidly based, thoroughly researched and credible model of personality and emotion has reaped many benefits in the work we have done. All of the systems built on this core model (moods, gestures, behaviors etc.) have fitted neatly into the overall scheme and convince us further of the validity of the model. It has given us a solid foundation from which to build more complexity and depth for our characters. Recent research and development has centered on the neurological basis of the three axes of the personality model and their function in a system of emotion. Using the work of Joseph LeDoux (1998) we have been able to add non-cognitive emotions to our model of personality. Autonomous and semi-autonomous characters can be given more depth and can react emotionally to world events and designer-scripted events. These characters will help populate the new frontiers of the Internet and virtual environments that we are building.

    Future Work.

Simulating human emotion is an endeavor that is very much open-ended, at least for the next twenty years or so. We plan to add cognitive processing of some sort to the system and also to migrate the system's modules to a neural-network-based representation. Further to this, we anxiously await the discoveries being made by colleagues in the fields of neuroscience.

    References.

Ekman, P. & Rosenberg, E. eds. 1997. What the Face Reveals. Oxford University Press.

Eysenck, H.J. & Eysenck, M.W. 1985. Personality and Individual Differences. Plenum.

Gray, J.A. 1970. The psychophysiological basis of introversion-extraversion. Behaviour Research and Therapy, 8.

Gray, J.A. 1971. The Psychology of Fear and Stress. Cambridge University Press.

Gray, J.A. 1975. Elements of a Two-Process Theory of Learning. Academic Press.

LeDoux, J. 1998. The Emotional Brain. Touchstone.

Maslow, A.H. 1970. Motivation and Personality. Harper & Row.

Wilson, I. 1996. Believable Autonomous Characters. Thesis, Dept. of Artificial Intelligence, Univ. of Westminster.

Copyright 2000, American Association for Artificial Intelligence. All rights reserved. http://www.aaai.org