WO2004079530A2 - Using avatars to communicate - Google Patents


Info

Publication number
WO2004079530A2
WO2004079530A2 (PCT/US2004/006284)
Authority
WO
WIPO (PCT)
Prior art keywords
avatar
user
sender
animation
Prior art date
Application number
PCT/US2004/006284
Other languages
French (fr)
Other versions
WO2004079530A3 (en)
Inventor
Patrick Blattner
John Robinson
Jamie Odell
Brian Heikes
Tom Love
Mike Blackwell
David S. Levinson
Andrew Weaver
Original Assignee
America Online, Inc.
Priority claimed from US10/747,701 external-priority patent/US7484176B2/en
Priority claimed from US10/747,696 external-priority patent/US7636755B2/en
Application filed by America Online, Inc. filed Critical America Online, Inc.
Priority to EP04716149A priority Critical patent/EP1599862A2/en
Priority to JP2006508976A priority patent/JP2006520053A/en
Priority to CA002517909A priority patent/CA2517909A1/en
Priority to AU2004216758A priority patent/AU2004216758A1/en
Publication of WO2004079530A2 publication Critical patent/WO2004079530A2/en
Publication of WO2004079530A3 publication Critical patent/WO2004079530A3/en

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00: User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/07: User-to-user messaging characterised by the inclusion of specific contents
    • H04L51/18: Commands or executable codes
    • H04L51/04: Real-time or near real-time messaging, e.g. instant messaging [IM]

Definitions

  • This description relates to projecting a graphical representation of a communications application operator (hereinafter "sender") in communications sent in a network of computers.
  • Online services may provide users with the ability to send and receive instant messages.
  • Instant messages are private online conversations between two or more people who have access to an instant messaging service, who have installed communications software necessary to access and use the instant messaging service, and who each generally have access to information reflecting the online status of other users.
  • An instant message sender may send self-expression items to an instant message recipient.
  • Current implementations of instant messaging self-expression enable a user to individually select self-expression settings, such as a Buddy Icon and a Buddy Wallpaper, which settings thereafter project to other users who see or interact with that person online.
  • a graphical user interface on a display device of a computer enables communications using an avatar.
  • the graphical user interface includes an instant message sender display.
  • the instant message sender display has a sender portion that displays a sender avatar capable of displaying multiple animations.
  • the instant message sender display also has a message compose area capable of displaying text included in the message sent from the sender to the recipient and communication controls.
  • At least one communication control is operable to receive an indication that the message displayed in the message compose area is to be sent from the sender to the recipient.
  • the sender avatar is animated in response to a trigger related to content of a message sent from a sender to a recipient.
  • the instant message sender display may include a recipient portion that displays a recipient avatar and a message history area.
  • the recipient avatar may be capable of displaying multiple animations in response to a trigger related to content of a message sent from a sender to a recipient.
  • the message history area may be capable of displaying the content of multiple messages sent between the sender and the recipient and identifying an identity associated with the recipient.
  • the recipient avatar is animated in response to an animation of the sender avatar.
  • the graphical user interface may include a contact list display for displaying potential recipients.
  • the contact list display may indicate whether each potential recipient is available to receive a message.
  • the potential recipients may be grouped and associated with an indication of a group identity.
  • a potential recipient displayed on the contact list may be associated with a potential recipient avatar
  • the potential recipient avatar may be displayed on the contact list in association with an identity of the potential recipient.
  • the potential recipient avatar may be animated on the contact list in response to animation of the potential recipient avatar displayed elsewhere.
  • the animation of the potential recipient avatar on the contact list may include an animation that is substantially similar to, or different than, the animation of the potential recipient avatar displayed elsewhere.
  • the animation of the potential recipient avatar on the contact list may include an animation that is representative of the animation of the potential recipient avatar displayed elsewhere.
  • the graphical user interface may be a graphical user interface that is used for an instant messaging communication session.
  • the trigger comprises a portion or all of the text of the message.
  • the appearance or animation of the sender avatar may indicate an environmental condition, a personality characteristic associated with the sender, an emotional state associated with the sender, a setting characteristic or an activity associated with the sender.
  • the sender avatar may be animated in response to the passing of a predetermined amount of time during which the sender does not communicate a message to the recipient or during which the sender does not use a computing device that is used by the sender to communicate with the recipient in the communications session.
  • the avatar animation used as the communication conduit may include a breakout animation that involves displaying the avatar outside of the normal display space occupied by the avatar.
  • communicating includes graphically representing, with an avatar capable of being animated, a first user in a communication session involving the first user and a second user.
  • a message is communicated between the first user and the second user.
  • the message conveys explicit information from the first user to the second user.
  • Out-of-band information is communicated to the second user using a change in the avatar appearance or avatar animation as a communication conduit.
  • the out-of-band communication includes a communication that is related to a context of the first user and that differs from the information conveyed in the message sent between the first user and the second user.
  • the communication session may be an instant messaging communication session.
  • the avatar may be a facial animation that does not include a body having an ear or a leg or may be a facial animation, including a neck, that does not include a body having an ear or a leg.
  • the out-of-band information may include information indicating an environmental condition associated with the first user.
  • the environmental condition may include an environmental condition related to weather occurring in a geographic location near the first user.
  • the out-of-band information may indicate a personality characteristic associated with the first user or an emotional state associated with the first user.
  • the out-of-band information may include information indicating a setting characteristic associated with the first user.
  • the setting characteristic may include a characteristic related to time of day of the first user or a characteristic related to time of year.
  • the time of year may include a holiday or a season that is one of spring, summer, fall or winter.
  • the setting characteristic may include a characteristic associated with a work setting or a recreation setting.
  • the recreation setting may include a beach setting, a tropical setting or a winter sport setting.
  • the out-of-band information may include information related to a mood of the first user.
  • the mood of the first user may be one of happy, sad or angry.
  • the out-of-band information may include information associated with an activity ofthe first user.
  • the activity may be performed by the first user at substantially the same time that the out-of-band message is communicated from the first user to the second user.
  • the activity may be working or listening to music.
  • the out-of-band information may include information conveying that the first user has muted sounds associated with the avatar.
  • An animation of the avatar to convey the out-of-band information from the first user to the second user may be triggered based on the information conveyed in the message from the first user to the second user.
  • the trigger may include a portion or all of the text of the message.
  • the trigger may include an audio portion of the message.
  • the trigger may include the passing of a predetermined amount of time during which the first user does not communicate a message to the second user or does not use a computing device that is used by the first user to communicate with the second user in the communication session.
  • the avatar animation used as the communication conduit may include a facial expression of the avatar, a gesture made by a hand or arm of the avatar, movement of a body of the avatar or sounds made by the avatar. At least some of the sounds may include a voice based on a voice of the first user.
  • the avatar animation used as the communication conduit may include a breakout animation that involves displaying the avatar outside of the normal display space occupied by the avatar.
  • a breakout animation may include telescoping, resizing, or repositioning the avatar.
  • the first user may be provided with multiple preconfigured avatars having associated preselected animations.
  • the first user may be enabled to select a particular avatar to represent the user in the communications session.
  • the first user may be persistently associated with the selected avatar to represent the first user in subsequent communication sessions.
  • the characteristic of the first user may be one of age, gender, hair color, eye color, or a facial feature.
  • Enabling the first user to modify the appearance of the avatar may include enabling the first user to modify the appearance of the avatar by adding, changing or deleting a prop displayed with the avatar.
  • a prop may be one of eyeglasses, sunglasses, a hat, or earrings.
  • the first user may be enabled to modify a trigger used to cause an animation of the avatar.
  • the trigger may include text included in the message sent from the first user to the second user.
  • the avatar may be animated for use as an information assistant to convey information to the first user.
  • Use of the avatar by an application other than a communications application, including an online journal, may be enabled.
  • a depiction of the avatar may be displayed in a form that is substantially similar to a trading card.
  • the trading card depiction of the avatar may include characteristics associated with the first user.
  • perception of multiple online personas is enabled in an instant messaging communications session. At least two identities within a communications environment to whom messages may be directed are identified. A first persona of a user is enabled to be projected to a first of the identities while a second persona of the same user is enabled to be concurrently projected to a second of the identities. The first and second personas each include an avatar capable of being animated, and the first persona and the second persona differ.
  • Implementations may include one or more of the following features.
  • the first persona may differ from the second persona such that the first persona invokes a different avatar than an avatar invoked by the second persona.
  • the first persona may invoke a first avatar
  • the second persona may invoke a second avatar.
  • the first avatar and the second avatar may be the same avatar.
  • An animation associated with the first avatar may be different from animations associated with the second avatar.
  • An appearance associated with the first avatar may be different from appearances associated with the second avatar.
  • An avatar may be associated with multiple sounds.
  • An avatar may be capable of being animated based on text of a message sent in the instant message communications session.
  • An avatar also may be capable of being animated to send an out-of-band communication.
  • the first persona may be associated with a first group of identities so that the first persona is projected in communications sessions with members of the first group of identities.
  • the second persona may be associated with a second group of identities so that the second persona is projected in communications sessions with members of the second group of identities.
  • a persona may be associated with the first of the identities, and a different persona may be associated with a group of the identities with which the first of the identities is associated.
  • the first persona projected to the first of the identities may be an amalgamation of the persona associated with the first of the identities and the different persona associated with the group of the identities.
  • the persona associated with the first of the identities may override the different persona associated with the group of the identities to the extent a conflict exists.
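The amalgamation-and-override behavior in the last two items can be sketched as a dictionary merge in which the individual identity's settings win on conflict. This is a hypothetical illustration; the setting names are invented.

```python
def resolve_persona(group_persona: dict, identity_persona: dict) -> dict:
    """Amalgamate the persona associated with a group of identities with
    the persona associated with an individual identity; where the two
    conflict, the individual identity's persona overrides the group's."""
    merged = dict(group_persona)      # start from the group-level settings
    merged.update(identity_persona)   # individual settings win on conflict
    return merged
```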
  • perception of multiple online personas is enabled in an instant messaging communications session.
  • An instant messaging application user interface for an instant messaging communications session is rendered on an instant messaging recipient system.
  • the communications session involves at least one potential instant messaging recipient and a single potential instant messaging sender.
  • a message is sent that includes a text message and a persona.
  • the persona is selected among multiple possible personas associated with the instant messaging sender to be displayed by the potential instant messaging recipient when displaying the text message.
  • the selected persona includes a collection of one or more self-expression items and a sender avatar capable of being animated.
  • the selected persona is rendered at the potential instant messaging recipient system when rendering another portion of the message.
  • Implementations may include one or more of the following features.
  • the sender persona may be selected by the instant messaging sender from the multiple possible personas associated with the instant messaging sender.
  • the persona may be rendered before or after communications are initiated by the potential instant messaging sender.
  • the self-expression items may include one or more of a wallpaper, an emoticon, and a sound.
  • One or more personas may be defined.
  • a first persona may be assigned to a first potential instant messaging recipient so that the first persona is thereafter automatically invoked and projected in an instant messaging communications session involving the first potential instant messaging recipient.
  • a second persona may be assigned to a second potential instant messaging recipient so that the second persona is thereafter automatically invoked and projected in an instant messaging communications session involving the second potential instant messaging recipient.
  • the second persona may be at least partially distinguishable from the first persona.
  • a first persona may be assigned to a first group of potential instant messaging recipients so that the first persona is thereafter automatically invoked and projected in an instant messaging communications session involving a member of the first group of potential instant messaging recipients.
  • a second persona may be assigned to a second potential instant messaging recipient so that the second persona is thereafter automatically invoked and projected in an instant messaging communications session involving the second potential instant messaging recipient.
  • the second persona may be at least partially distinguishable from the first persona.
  • the use of one of the multiple personas may be disabled. Disabling the use of one of the multiple personas may be based on the instant messaging recipient.
  • One of the multiple personas may be a work persona associated with presence of the instant messaging sender at a work location associated with the instant messaging sender.
  • One of the multiple personas may be a home persona associated with presence of the instant messaging sender at home.
  • a determination may be made as to whether the instant messaging sender is at home or at the work location. In response to a determination that the instant messaging sender is at home, the home persona may be selected for use in the instant messaging communications session. In response to a determination that the instant messaging sender is at the work location, the work persona may be selected for use in the instant messaging communications session.
  • a persona to be displayed may be selected by the potential instant messaging recipient based on time of day, day of week, or a group of potential instant messaging recipients that are associated with the potential instant messaging recipient.
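The context-based persona selection described above (work versus home location, with time of day as another possible input) might look like the following sketch. The persona names and the time-of-day fallback rule are assumptions for illustration, not taken from the patent.

```python
def select_persona(location: str, hour: int, personas: dict) -> str:
    """Pick a persona based on the sender's detected location, falling back
    to a time-of-day choice when the location is neither work nor home."""
    if location == "work":
        return personas.get("work", "default")
    if location == "home":
        return personas.get("home", "default")
    # Fallback: choose by time of day (illustrative rule, not from the patent).
    return personas.get("evening" if hour >= 18 else "daytime", "default")
```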
  • At least some of the characteristics of a persona may be transparent to the instant messaging sender.
  • the sender avatar may be animated to send an out-of-band communication from the instant messaging sender to the potential instant messaging recipient.
  • an avatar is used to communicate.
  • a user is represented graphically using an avatar capable of being animated.
  • the avatar is associated with multiple animations and multiple features of appearance that represent a pattern of characteristics representing a personality of the avatar.
  • Implementations may include one or more of the following features.
  • the avatar may be associated with a description that identifies the personality of the avatar.
  • the personality of the avatar may include at least some characteristics that are distinct from at least some characteristics of a personality of the user.
  • a second user may be graphically represented with a second avatar capable of being animated.
  • the second avatar may be associated with multiple animations and multiple features of appearance that represent a pattern of characteristics representing a personality of the second avatar.
  • the personality of the second avatar may include at least some characteristics that are distinct from at least some characteristics of the personality of the first avatar. Communication messages may be sent between the first user and the second user.
  • a first avatar is animated based on perceived animation of a second avatar.
  • a first user is graphically represented with a first avatar capable of being animated
  • a second user is graphically represented with a second avatar capable of being animated.
  • Communication messages are sent between the first user and the second user.
  • An indication of an animation of the first avatar is received, and the second avatar is animated in response to, and based on, the received indication of the animation.
  • Implementations may include one or more of the following features.
  • the indication of an animation received may be any type of animation of the first avatar or may be an indication of a particular animation of multiple possible animations of the first avatar.
  • the first avatar may be subsequently animated in response to and based on the animation ofthe second avatar.
  • the first avatar may be animated in response to a particular portion of a message sent between the first user and the second user.
  • the message may be sent from the first user to the second user or may be sent to the first user from the second user.
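The avatar-to-avatar response described here (the second avatar animating in response to a received indication of the first avatar's animation) can be sketched as a lookup from received animation to response animation. The specific pairings below are invented for illustration; the patent only describes the general mechanism.

```python
# Hypothetical response table: which animation the local avatar plays when
# an indication of the remote avatar's animation is received.
RESPONSE_ANIMATIONS = {
    "wave": "wave_back",
    "laugh": "smile",
    "wink": "blush",
}

def respond_to_animation(received_animation: str):
    """Return the local avatar's response animation, or None when the
    received animation has no configured response."""
    return RESPONSE_ANIMATIONS.get(received_animation)
```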
  • FIGS. 1, 2 and 5 are diagrams of user interfaces for an instant messaging service capable of enabling a user to project an avatar for self-expression.
  • FIG. 3 is a flow chart of a process for animating an avatar based on the content of an instant message.
  • FIG. 4 is a block diagram illustrating exemplary animations of an avatar and textual triggers for each animation.
  • FIG. 6 is a diagram illustrating an exemplary process involving communications between two instant messaging client systems and an instant message host system, whereby an avatar of a user of one of the instant message client systems is animated based on the animation of an avatar of a user of the other of the instant message client systems.
  • FIG. 7 is a flow chart of a process for selecting and optionally customizing an avatar.
  • FIG. 8 is a block diagram depicting examples of avatars capable of being projected by a user for self-expression.
  • FIG. 9 is a diagram of a user interface for customizing the appearance of an avatar.
  • FIG. 10 is a diagram of a user interface used to present a snapshot description of an avatar.
  • FIG. 11A is a block diagram illustrating relationships between online personas, avatars, avatar behaviors and avatar appearances.
  • FIG. 11B is a flow chart of a process for using a different online personality to communicate with each of two instant message recipients.
  • FIG. 12 is a diagram of a user interface that enables an instant message sender to select among available online personas.
  • FIG. 13 is a diagram of exemplary user interfaces for enabling an instant message sender to create and store an online persona that includes an avatar for self-expression.
  • FIG. 14 is a flow chart of a process for enabling a user to change an online persona that includes an avatar for self-expression.
  • FIG. 15 is a flow chart of a process for using an avatar to communicate an out-of-band message to an instant message recipient.
  • FIGS. 16, 17 and 18 are diagrams of exemplary communications systems capable of enabling an instant message user to project an avatar for self-expression.
  • An avatar representing an instant messaging user may be animated based on the message sent between a sender and recipient.
  • An instant messaging application interface is configured to detect entry of predetermined or user-defined character strings, and to relate those character strings to predefined animations of an avatar.
  • the avatar representing or selected by the sender is animated in the recipient's instant messaging application interface and, optionally, in the sender's instant messaging application interface.
  • the avatar is rendered based on an animation model including a mesh that defines, using polygons, the form of the avatar; a texture that defines an image to cover the mesh of the avatar; and a light map that defines the effect of a light source on the avatar.
  • the animation model for the avatar includes particular geometry, including at least one thousand polygons in the underlying wire model that makes up the avatar's mesh, and at least twenty blend shapes, each of which defines a different facial expression or shape.
  • the animation model includes multiple animations capable of being rendered for the avatar defined by the animation model, and the animations are capable of association with one or more sound effects.
  • the animation model for the avatar may include only a face, or a face and neck, of the avatar.
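The animation-model structure just described (a polygon mesh, a texture covering it, a light map, and blend shapes, with stated minimums of one thousand polygons and twenty blend shapes) can be captured in a small container type. This is an illustrative sketch; the field and method names are assumptions, not from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class AnimationModel:
    """Container for the rendering data described above: a polygon mesh,
    a texture covering the mesh, a light map, and named blend shapes."""
    mesh_polygons: int                                # wire-model polygon count
    blend_shapes: list = field(default_factory=list)  # each defines a facial expression
    texture: str = ""                                 # image covering the mesh
    light_map: str = ""                               # effect of a light source

    def meets_minimums(self) -> bool:
        # The text calls for at least 1000 polygons and 20 blend shapes.
        return self.mesh_polygons >= 1000 and len(self.blend_shapes) >= 20
```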
  • An avatar representing a user in a communications session also may be used to send to another user an out-of-band communication that conveys information independent of information conveyed directly in the text message sent.
  • the out-of-band information may be communicated using a change in the avatar appearance or avatar animation as a communication conduit.
  • an out-of-band communication may include information about the sender's setting, environment, activity or mood, which is not explicitly communicated as part of a text message exchanged by a sender and a recipient.
  • a user may name and save multiple different "online personas" or "online personalities," which are groups of instant messaging self-expression settings such as, for example, avatars, Buddy Sounds, Buddy Wallpaper and Emoticons (e.g., Smileys). Depending on the identity with whom the user communicates, the user may access and project a preselected one of these online personas in an instant messaging environment, and/or may manually invoke and manage the online persona projected to others. Functionality and features of the instant messaging interface may differ based upon the online personas being used in the instant message conversation.
  • An avatar that represents a user in a communications session may be animated, without user manipulation, based on the animation of another avatar that represents another user in the same communications session. This may be referred to as an automatic response of an avatar to the behavior of another avatar.
  • FIG. 1 illustrates an exemplary graphical user interface 100 for an instant messaging service capable of enabling a user to project an avatar for self-expression.
  • the user interface 100 may be viewed by a user who is an instant message sender and whose instant messaging communications program is configured to project an avatar associated with and used as an identifier for the user to one or more other users or user groups (collectively, instant message recipients).
  • the user IMSender is an instant message sender using the user interface 100.
  • the instant message sender projects a sender avatar 135 in an instant messaging communications session with an instant message recipient SuperBuddyFan1, who projects a recipient avatar 115.
  • a corresponding graphical user interface (not shown) is used by the instant message recipient SuperBuddyFan1. In this manner, the sender avatar 135 is visible in each of the sender's user interface and the recipient's user interface, as is the recipient avatar 115.
  • the instant messaging communications session may be conducted simultaneously, near-simultaneously, or serially.
  • the user interface (UI) 100 includes an instant message user interface 105 and an instant messaging buddy list window 170.
  • the instant message user interface 105 has an instant message recipient portion 110 and an instant message sender portion 130.
  • the instant message recipient portion 110 displays the recipient avatar 115 chosen by the instant message recipient with whom the instant message sender is having an instant message conversation.
  • the instant message sender portion 130 displays the sender avatar 135 chosen by the instant message sender.
  • the display of the sender avatar 135 in the instant message user interface 105 enables the instant message sender to perceive the avatar being projected to the particular instant message recipient with whom the instant message sender is communicating.
  • the avatars 135 and 115 are personalization items selectable by an instant message user for self-expression.
  • the instant message user interface 105 includes an instant message composition area 145 for composing instant message messages to be sent to the instant message recipient and a message history text box 125 for displaying a transcript of the instant message communications session with the instant message recipient.
  • Each of the messages sent to, or received from, the instant message recipient is listed in chronological order in the message history text box 125, each with an indication of the user that sent the message as shown at 126.
  • the message history text box 125 optionally may include a time stamp 127 for each of the messages sent.
  • Wallpaper may be applied to portions of the graphical user interface 100.
  • wallpaper may be applied to window portion 120 that is outside of the message history box 125 or window portion 140 that is outside of the message composition area 145.
  • the recipient avatar 115 is displayed over, or in place of, the wallpaper applied to the window portion 120, and the wallpaper applied to the window portion 120 corresponds to the recipient avatar 115.
  • the sender avatar 135 is displayed over, or in place of, the wallpaper applied to the window portion 140, and the wallpaper applied to the window portion 140 corresponds to the sender avatar 135.
  • a box or other type of boundary may be displayed around the avatar, as shown by boundary 157 displayed around the sender avatar 135.
  • a different wallpaper may be applied to window portion 158 inside the boundary 157 than the wallpaper applied to the window portion 140 outside of the message composition area 145 but not within the boundary 157.
  • the wallpaper may appear to be non-uniform and may include objects that are animated.
  • the wallpapers applied to the window portions 120 and 140 may be personalization items selectable by an instant message user for self-expression.
  • the instant message user interface 105 also includes a set of feature controls 165 and a set of transmission controls 150.
  • the feature controls 165 may control features such as encryption, conversation logging, conversation forwarding to a different communications mode, font size and color control, and spell checking, among others.
  • the set of transmission controls 150 includes a control 160 to trigger sending of the message that was typed into the instant message composition area 145, and a control 155 for modifying the appearance or behavior of the sender avatar 135.
  • the instant message buddy list window 170 includes an instant message sender-selected list 175 of potential instant messaging recipients ("buddies") 180a-180g. Buddies typically are contacts who are known to the potential instant message sender (here, IMSender).
  • the representations 180a-180g include text identifying the screen names of the buddies included in list 175; however, additional or alternative information may be used to represent one or more of the buddies, such as an avatar associated with the buddy, that is reduced in size and either still or animated.
  • the representation 180a includes the screen name and avatar of the instant message recipient named SuperBuddyFan1.
  • the representations 180a-180g may provide connectivity information to the instant message sender about the buddy, such as whether the buddy is online, how long the buddy has been online, whether the buddy is away, or whether the buddy is using a mobile device.
  • Buddies may be grouped by an instant message sender into one or more user-defined or pre-selected groupings ("groups"). As shown, the instant message buddy list window 170 has three groups, Buddies 182, Co-Workers 184, and Family 186. SuperBuddyFan1 185a belongs to the Buddies group 182, and ChattingChuck 185c belongs to the Co-Workers group 184. When a buddy's instant message client program is able to receive communications, the representation of the buddy in the buddy list is displayed under the name or representation of the buddy group to which the buddy belongs.
  • each of the sender avatar 135 and the recipient avatar 115 is a graphical image that represents a user in an instant message communications session. The sender projects the sender avatar 135 for self-expression, whereas the recipient projects the recipient avatar 115 also for self-expression.
  • each of the avatars 135 or 115 is an avatar that only includes a graphical image of a face, which may be referred to as a facial avatar or a head avatar.
  • an avatar may include additional body components.
  • a Thanksgiving turkey avatar may include an image of a whole turkey, including a head, a neck, a body and feathers.
  • the sender avatar 135 may be animated in response to an instant message sent to the instant message recipient, and the recipient avatar 115 may be animated in response to an instant message sent by the instant message recipient.
  • the text of an instant message sent by the sender may trigger an animation of the sender avatar 135, and the text of an instant message sent by the instant message recipient to the sender may trigger an animation of the recipient avatar 115.
  • the text of a message to be sent is specified by the sender in the message specification text box 145.
  • the text entered in the message specification text box 145 is sent to the recipient when the sender activates the send button 160.
  • the instant message application searches the text of the message for animation triggers.
  • when an animation trigger is identified, the sender avatar 135 is animated with an animation that is associated with the identified trigger. This process is described more fully later.
  • the text of a message sent by the instant message recipient and received by the sender is searched for animation triggers and, when found, the recipient avatar 115 is animated with an animation associated with the identified trigger.
  • the text of a message may include a character string "LOL," which is an acronym that stands for "laughing out loud."
  • the character string "LOL" may trigger an animation in the sender avatar 135 or the recipient avatar 115 such that the sender avatar 135 or the recipient avatar 115 appears to be laughing.
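The trigger-matching behavior described above amounts to a lookup: scan the message text for known character strings and return the associated animation. A minimal sketch follows; the trigger strings and animation names are illustrative assumptions, not values taken from the patent's figures.

```python
# Sketch of the animation-trigger search described above.
# Trigger strings and animation names are illustrative assumptions.
ANIMATION_TRIGGERS = {
    "LOL": "laugh",  # acronym for "laughing out loud"
    ":)": "smile",
    ":-(": "frown",
}

def find_animation(message_text):
    """Return the animation name for the first trigger found in the
    message text, or None if the message contains no trigger."""
    for trigger, animation in ANIMATION_TRIGGERS.items():
        if trigger in message_text:
            return animation
    return None
```

For example, `find_animation("That was funny, LOL")` returns `"laugh"`, so the avatar associated with the sender of that message would appear to be laughing.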
  • the sender avatar 135 may be animated in response to an instant message sent from the instant message recipient, and the recipient avatar 115 may be animated in response to a message sent from the instant message sender.
  • the text of an instant message sent by the sender may trigger an animation of the recipient avatar 115, and the text of an instant message sent by the instant message recipient to the sender may trigger an animation of the sender avatar 135.
  • the text of a message to be sent is specified by the sender in the message specification text box 145.
  • the text entered in the message specification text box 145 is sent to the recipient when the sender activates the send button 160.
  • the instant message application searches the text of the message for animation triggers.
  • when an animation trigger is identified, the recipient avatar 115 is animated with an animation that is associated with the identified trigger.
  • the text of a message sent by the instant message recipient and received by the sender is searched for animation triggers and, when found, the sender avatar 135 is animated with an animation associated with the identified trigger.
  • the sender avatar 135 or the recipient avatar 115 may be animated in direct response to a request from the sender or the recipient.
  • Direct animation of the sender avatar 135 or the recipient avatar 115 enables use of the avatars as a means for communicating information between the sender and the recipient without an accompanying instant message.
  • the sender may perform an action that directly causes the sender avatar 135 to be animated, or the recipient may perform an action that directly causes the recipient avatar 115 to be animated.
  • the action may include pressing a button corresponding to the animation to be played or selecting the animation to be played from a list of animations.
  • the sender may be presented with a button that inspires an animation in the sender avatar 135 and that is distinct from the send button 160. Selecting the button may cause an animation of the sender avatar 135 to be played without performing any other actions, such as sending an instant message specified in the message composition area 145.
  • the played animation may be chosen at random from the possible animations of the sender avatar 135, or the played animation may be chosen before the button is selected.
  • An animation in one of the avatars 135 or 115 displayed on the instant messaging user interface 105 may cause an animation in the other avatar.
  • an animation of the recipient avatar 115 may trigger an animation in the sender avatar 135, and vice versa.
  • the sender avatar 135 may be animated to appear to be crying. In response to the animation of the sender avatar 135, the recipient avatar 115 also may be animated to appear to be crying.
  • the recipient avatar 115 may be animated to appear comforting or sympathetic in response to the crying animation of the sender avatar 135.
  • a sender avatar 135 may be animated to show a kiss and, in response, a recipient avatar 115 may be animated to blush.
  • the recipient avatar 115 may appear to respond to a mood of the sender communicated by the sender avatar 135.
  • in response to a frowning or teary animation of the sender avatar 135, the recipient avatar 115 also may appear sad.
  • the recipient avatar 115 may be animated to try to cheer up the sender avatar 135, such as by smiling, exhibiting a comical expression, such as sticking its tongue out, or exhibiting a sympathetic expression.
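The cross-avatar responses described above reduce to a mapping from one avatar's animation to a reaction in the other avatar. A sketch under assumed animation names:

```python
# Sketch: when one avatar plays an animation, the other avatar may
# play a responsive animation. The pairings below are illustrative
# assumptions based on the examples in the text (cry/comfort,
# kiss/blush, frown/cheer-up), not an exhaustive table.
RESPONSE_ANIMATIONS = {
    "cry": "comfort",     # recipient avatar appears sympathetic
    "kiss": "blush",
    "frown": "cheer_up",  # e.g. smiling or a comical expression
}

def respond_to(animation):
    """Return the responsive animation for the other avatar,
    or None if no response is configured."""
    return RESPONSE_ANIMATIONS.get(animation)
```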
  • An avatar 135 or 115 may be animated in response to a detected idle period of a predetermined duration. For example, after a period of sender inactivity, the sender avatar 135 may be animated to give the appearance that the avatar is sleeping, falling off of the instant messaging interface 105, or some other activity indicative of inactivity. An avatar 135 or 115 also may progress through a series of animations during a period of sender inactivity. The series of animations may repeat continuously or play only once in response to the detection of an idle period. In one example, the sender avatar 135 may be animated to give the appearance that the avatar is sleeping and then may appear to fall off the instant messaging user interface 105 after a period of sleeping.
  • Animating an avatar 135 or 115 through a progression of multiple animations representative of a period of sender inactivity may provide entertainment to the sender. This may lead to increased usage of the instant messaging user interface 105 by the sender, which, in turn, may lead to an increased market share for the instant message service provider.
  • the sender avatar 135 or the recipient avatar 115 may be animated to reflect the weather at the geographic locations of the sender and the recipient, respectively. For example, if rain is falling at the geographic location of the sender, then the sender avatar 135 may be animated to put on a rain coat or open an umbrella.
  • the wallpaper corresponding to the sender avatar 135 also may include rain drops animated to appear to be falling on the sender avatar 135.
  • the animation of the sender avatar 135 or the recipient avatar 115 played in response to the weather may be triggered by weather information received on the sender's computer or the recipient's computer, respectively.
  • the weather information may be pushed to the sender's computer by a host system of an instant messaging system being used. If the pushed weather information indicates that it is raining, then an animation of the sender avatar 135 corresponding to rainy weather is played.
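The weather-driven behavior can be sketched as a small dispatch on the pushed weather data. The condition names and animation names below are assumptions for illustration, not the patent's values:

```python
# Sketch: map a pushed weather condition to an avatar animation and
# a wallpaper animation. Names are illustrative assumptions.
WEATHER_ANIMATIONS = {
    "rain": ("open_umbrella", "falling_raindrops"),
    "snow": ("put_on_coat", "falling_snow"),
}

def on_weather_update(condition):
    """Return (avatar_animation, wallpaper_animation) for a pushed
    weather condition, or (None, None) if no animation applies."""
    return WEATHER_ANIMATIONS.get(condition, (None, None))
```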
  • the avatar may be used to audibly verbalize content other than the text communicated between parties during a communications session. For example, if the text "Hi" appears within a message sent by the sender, the sender avatar 135 may be animated to verbally say "Hello" in response. As another example, when the text "otp" or the text "on the phone" appears within a message sent by the recipient, the recipient avatar 115 may be animated to verbally say "be with you in just a minute" in response. As another example, in response to an idle state, an avatar may audibly try to get the attention of the sender or the recipient.
  • the recipient avatar 115 may audibly say "Hello? You there?" to try to elicit a response from the sender regarding the recipient's question.
  • the sender may mute the recipient avatar 115 or the sender avatar 135 to prevent the recipient avatar 115 or the sender avatar 135 from speaking further.
  • the sender may prefer to mute the recipient avatar 115 to prevent the recipient avatar 115 from speaking.
  • the avatar may appear to be wearing a gag.
  • the voice of an avatar may correspond to the voice of a user associated with the avatar.
  • the characteristics of the user's voice may be extracted from audio samples of the user's voice.
  • the extracted characteristics and the audio samples may be used to create the voice of the avatar.
  • the voice of the avatar need not correspond to the voice of the user and may be any generated or recorded voice.
  • the sender avatar 135 may be used to communicate an aspect of the setting or the environment of the sender.
  • the animation and appearance of the sender avatar 135 may reflect aspects of the time, date or place of the sender or aspects of the circumstances, objects or conditions of the sender.
  • the sender avatar 135 may appear to be dressed in pajamas and have a light turned on to illuminate an otherwise dark portion of the screen on which the avatar is displayed, and/or the sender avatar 135 may periodically appear to yawn.
  • When the sender uses the instant messaging user interface 105 during a holiday period, the sender avatar 135 may be dressed in a manner illustrative of the holiday, such as appearing as Santa Claus during December, a pumpkin near Halloween, or Uncle Sam during early July.
  • the appearance of the sender avatar 135 also may reflect the climate or geographic location of the sender. For example, when rain is falling in the location of the sender, wallpaper corresponding to the sender avatar 135 may include falling raindrops and/or the sender avatar 135 may wear a rain hat or appear under an open umbrella. In another example, when the sender is sending an instant message from a tropical location, the sender avatar 135 may appear in beach attire.
  • the sender avatar 135 also may communicate an activity being performed by the sender while the sender is using the instant messaging user interface 105. For example, when the sender is listening to music, the avatar 135 may appear to be wearing headphones. When the sender is working, the sender avatar 135 may be dressed in business attire, such as appearing in a suit and a tie.
  • the appearance of the sender avatar 135 also may communicate the mood or an emotional state of the sender.
  • the sender avatar 135 may communicate a sad state of the sender by frowning or shedding a tear.
  • the appearance of the sender avatar 135 or the recipient avatar 115 may resemble the sender or the recipient, respectively.
  • the appearance of the sender avatar 135 may be such that the sender avatar 135 appears to be of a similar age as the sender.
  • the sender avatar 135 also may appear to age.
  • the appearance of the recipient avatar 115 may be such that the recipient avatar 115 has an appearance similar to that of the recipient.
  • the wallpaper applied to the window portion 120 and/or the wallpaper applied to the window portion 140 may include one or more animated objects.
  • the animated objects may repeat a series of animations continuously, or periodically on a predetermined or random basis.
  • the wallpapers applied to the window portions 120 and 140 may be animated in response to the text of messages sent between the sender and the recipient.
  • the text of an instant message sent by the sender may trigger an animation of the animated objects included in the wallpaper corresponding to the sender avatar 135, and the text of an instant message sent by the instant message recipient to the sender may trigger an animation of the animated objects included in the wallpaper corresponding to the recipient avatar 115.
  • the animated objects included in the wallpapers may be animated to reflect the setting or environment, activity and mood of the recipient and the sender, respectively.
  • An avatar may be used as a mechanism to enable self-expression or additional non-text communication by a user associated with the avatar.
  • the sender avatar 135 is a projection of the sender.
  • the recipient avatar 115 is a projection of the recipient.
  • the avatar represents the user in instant messaging communications sessions that involve the user.
  • the personality or emotional state of a sender may be projected or otherwise communicated through the personality of the avatar.
  • Some users may prefer to use an avatar that more accurately represents the user. As such, a user may change the appearance and behavior of an avatar to more accurately reflect the personality of the user. In some cases, a sender may prefer to use an avatar for self-expression rather than projecting an actual image of the sender. For example, some people may prefer using an avatar to sending a video or photograph of the sender.
  • the animation of an avatar may involve resizing or repositioning the avatar such that the avatar occupies more or different space on the instant message user interface 105 than the original boundary of the avatar.
  • the size of sender avatar 205 has been increased such that the avatar 205 covers a portion of the instant message composition area 145 and the control 155.
  • elements of the user interface 100 other than an avatar also may be displayed using additional space or using different space on the user interface 100.
  • a sender avatar may depict a starfish with an expressive face and may be displayed on wallpaper that includes animated fish. The animated fish included in the wallpaper may be drawn outside the original boundary around the sender avatar 135 and appear to swim outside the original boundary area.
  • a process 300 is illustrated for animating an avatar for self-expression based on the content of an instant message. In particular, an avatar representing an instant message sender is animated in response to text sent by the sender.
  • the wallpaper of the avatar also is animated.
  • the process 300 is performed by a processor executing an instant messaging communications program.
  • the text of a message sent to an instant message recipient is searched for an animation trigger and, when a trigger is found, the avatar that represents the instant message sender is animated in a particular manner based on the particular trigger that is found.
  • the wallpaper displayed for the avatar includes an animated object or animated objects.
  • the object or objects may be animated based on the content of the instant message sent or may be animated based on other triggers, including (but not limited to) the passing of a predetermined amount of time, the occurrence of a particular day or time of day, any type of animation of the sender avatar, a particular type of animation of the sender avatar, any type of animation of the recipient avatar, or a particular type of animation of the recipient avatar. Also, when the sender is inactive for a predetermined duration, the avatar sequentially displays each of multiple animations associated with an idle state.
  • the process 300 begins when an instant message sender who is associated with an avatar starts an instant messaging communications session with an instant message recipient (step 305).
  • the sender may select the name of the recipient from a buddy list, such as the buddy list 170 from FIG. 1.
  • the name of the recipient may be entered into a form that enables instant messages to be specified and sent.
  • the sender may start an instant messaging application that may be used to sign on for access to the instant messaging system and specify the recipient as a user of the instant messaging system with which a communications session is to be started. Once the recipient has been specified in this manner, a determination is made as to whether copies of the avatars associated with the sender and the recipient exist on the instant message client system being used by the sender.
  • copies of the avatars are retrieved for use during the instant message communications session.
  • information to render an avatar of the recipient may be retrieved from an instant message host system or the instant message recipient client. In some cases, a particular avatar may be selected by the sender for use during the instant messaging communications session. Alternatively or additionally, the avatar may have been previously identified and associated with the sender.
  • the processor displays a user interface for the instant messaging session including the avatar associated with the sender and wallpaper applied to the user interface over which the avatar is displayed (step 307).
  • the avatar may be displayed over, for example, wallpaper applied to a portion of a window in which an instant message interface is displayed.
  • the avatar is displayed over a portion or portions of an instant message interface, such as window portions 120 or 140 of FIG. 1.
  • the wallpaper corresponding to the avatar may include an object or objects that are animated during the instant message communications session.
  • the processor receives text of a message entered by the sender to be sent to the instant message recipient (step 310) and sends a message corresponding to the entered text to the recipient (step 315).
  • the processor compares the text of the message to multiple animation triggers that are associated with the avatar projected by the sender (step 320).
  • a trigger may include any letter, number, or symbol that may be typed or otherwise entered using a keyboard or keypad. Multiple triggers may be associated with an animation.
  • each of the animations 405a-405q has multiple associated triggers 410a-410q. More particularly, by way of example, the animation 405a, in which the avatar is made to smile, has associated triggers 410a.
  • Each of the triggers 410a includes multiple character strings. In particular, triggers 410a include a ":)" trigger 411a, a ":-)" trigger 412a, a "0:-)" trigger 413a, a "0:)" trigger 414a, and a "Nice" trigger 415a.
  • a trigger may be an English word, such as 415a, or an emoticon, such as 411a-414a.
  • Other examples of a trigger include a particular abbreviation, such as "lol" 411n, and an English phrase, such as "Oh no" 415e.
  • the avatar is animated with an animation that is associated with the trigger. In one example, when "Nice" is included in an instant message, the avatar is made to smile.
  • one or more of the triggers associated with an animation is modifiable by a user.
  • a user may associate a new trigger with an animation, such as by adding "Happy" to triggers 410a to make the avatar smile. In another example, a user may delete a trigger associated with an animation (that is, disassociate a trigger from an animation), such as by deleting "Nice" 415a.
  • a user may change a trigger that is associated with an animation, such as by changing the "wink" trigger 413b to "winks."
  • a particular trigger may be associated with only one animation. In other implementations, a particular trigger may be permitted to be associated with multiple animations. In some implementations, only one of the multiple animations may be played in response to a particular trigger. The single animation to be played may be chosen randomly or in a pre-determined manner from the multiple animations. In other implementations, all of the multiple animations may be played serially based on a single trigger. In some implementations, a user may be permitted to delete a particular animation. For example, the user may delete the yell animation 405g. In such a case, the user may delete some or all of the triggers associated with the yell animation 405g or may choose to associate some or all of the triggers 410g with a different animation, such as a smile animation 405a.
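One way to model this user-modifiable association between triggers and animations is a table keyed by animation type, with operations to add and delete triggers. The sketch below uses assumed names and is not the patent's data structure (which may be a database table, list, or file):

```python
# Sketch of a user-modifiable trigger table: each animation type
# (e.g. the "smile" animation 405a) has multiple trigger strings.
class TriggerTable:
    def __init__(self):
        self.triggers = {}  # animation name -> set of trigger strings

    def associate(self, animation, trigger):
        """Associate a new trigger with an animation (e.g. add "Happy")."""
        self.triggers.setdefault(animation, set()).add(trigger)

    def disassociate(self, animation, trigger):
        """Delete a trigger from an animation (e.g. remove "Nice")."""
        self.triggers.get(animation, set()).discard(trigger)

    def animation_for(self, text):
        """Return the animation whose trigger appears in the text, if any."""
        for animation, trigs in self.triggers.items():
            if any(t in text for t in trigs):
                return animation
        return None

# The smile triggers from the text, then two example user edits.
table = TriggerTable()
for t in (":)", ":-)", "0:-)", "0:)", "Nice"):
    table.associate("smile", t)
table.associate("smile", "Happy")    # user adds a trigger
table.disassociate("smile", "Nice")  # user deletes a trigger
```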
  • the processor determines whether a trigger is included within the message (step 325).
  • the processor identifies a type of animation that is associated with the identified trigger (step 330). This may be accomplished by using a database table, a list, or a file that associates one or more triggers with a type of animation for the avatar to identify a particular type of animation.
  • Types of animation include, by way of example, a smile 405a, a wink 405b, a frown 405c, an expression with a tongue out 405d, a shocked expression 410d, a kiss 405f, a yell 405g, a big smile 405h, a sleeping expression 405i, a nodding expression 405j, a sigh 405k, a sad expression 4051, a cool expression 405m, a laugh 405n, a disappearance 405o, a smell 405p, or a negative expression 405q, all of FIG. 4.
  • the identified type of animation for the avatar is played (step 335).
  • the processor may identify and play an animation of at least one wallpaper object based on the match of a trigger with the text of the message sent (step 337).
  • the processor monitors the communications activity ofthe sender for periods of inactivity (step 340) to detect when the sender is in an idle state or an idle period of communications activity (step 345).
  • the sender may be in an idle state after a period during which no messages were sent.
  • the processor may determine whether the sender has not typed or sent an instant message or otherwise interacted with the instant message communications application for a predetermined amount of time.
  • an idle state may be detected by the processor when the sender has not used the computer system in which the processor operates for a predetermined amount of time.
  • When the processor detects inactivity (which may be referred to as an idle state), a type of animation associated with the idle state is identified (step 350).
  • This may be accomplished by using a database table, list or file that identifies one or more types of animations to play during a detected idle period.
  • the types of animations played during a detected idle state may be the same as or different from the types of animations played based on a trigger in an instant message.
  • the identified type of animation is played (step 355).
  • multiple types of animation associated with the idle state may be identified and played.
  • When the processor detects that the sender is no longer idle, such as by receiving an input from the sender, the processor may immediately stop playing the animation event (not shown).
  • a user may select types of animations to be played during an idle period and/or select the order in which the animations are played when multiple animations are played during an idle period.
  • a user may configure or otherwise determine the duration of time during which no messages are sent that constitutes an idle period for the user.
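The idle-state behavior of steps 340-355 can be sketched as a timer check: after a user-configurable quiet interval, play a configured sequence of idle animations, and stop as soon as activity resumes. The threshold value and animation names below are illustrative assumptions:

```python
import time

# Sketch of idle detection (steps 340-355): the sender is considered
# idle after a configurable period with no sent messages or input.
class IdleMonitor:
    def __init__(self, idle_threshold_seconds=300.0,
                 idle_animations=("sleep", "fall_off_screen")):
        self.idle_threshold = idle_threshold_seconds  # user-configurable
        self.idle_animations = idle_animations        # user-selected order
        self.last_activity = time.monotonic()

    def record_activity(self):
        """Called whenever the sender types, sends, or otherwise interacts;
        ends any idle period."""
        self.last_activity = time.monotonic()

    def is_idle(self, now=None):
        now = time.monotonic() if now is None else now
        return (now - self.last_activity) >= self.idle_threshold

    def animations_to_play(self, now=None):
        """Return the idle animation sequence when idle, else nothing."""
        return list(self.idle_animations) if self.is_idle(now) else []
```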
  • the processor may detect a wallpaper object trigger that is different than the trigger used to animate the sender avatar (step 360). For example, the processor may detect the passage of a predetermined amount of time. In another example, the processor may detect that the content of the instant message includes a trigger for a wallpaper object animation that is different from the trigger used to animate the sender avatar.
  • Other wallpaper object triggers may include (but are not limited to) the occurrence of a particular day or a particular time of day, the existence of any animations by the sender avatar, the existence of a particular type of animation by the sender avatar, the existence of animations by the recipient avatar, and/or the existence of a particular type of animation by the recipient avatar.
  • the triggers for the animation of wallpaper objects also may be user-configurable such that a user selects whether a particular type of animation, or any animations, are to be played, and selects the triggers for one or more of the wallpaper objects.
  • a trigger for a type of animation of a wallpaper object or objects may be the same as, or different from, one of the triggers associated with animating the avatar.
  • When the processor detects a wallpaper object trigger (step 360), the processor identifies and plays an animation of at least one wallpaper object (step 337).
  • The process of identifying and playing types of animations during a sent instant message (steps 310-335) is performed for every instant message that is sent and for every instant message that is received by the processor.
  • the process of identifying and playing types of animation events during periods of inactivity (steps 340-355) may occur multiple times during the instant messaging communications session. Steps 310-355 may be repeated indefinitely until the end of the instant messaging communications session.
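Steps 310-355 can be summarized as a per-message loop: send the text, search it for a trigger, and play any matching avatar and wallpaper animations. The function below is a compressed sketch of that control flow; the trigger tables are hypothetical stand-ins for the database table, list, or file of trigger/animation associations mentioned above.

```python
# Compressed sketch of the per-message part of process 300
# (steps 310-337). Trigger and animation names are assumptions.
AVATAR_TRIGGERS = {"LOL": "laugh", ":)": "smile"}
WALLPAPER_TRIGGERS = {"rain": "falling_raindrops"}

def process_outgoing_message(text):
    """Return the animations to play for one sent message as a pair
    (avatar_animation, wallpaper_animation); either may be None."""
    avatar_anim = next(
        (anim for trig, anim in AVATAR_TRIGGERS.items() if trig in text),
        None)
    wallpaper_anim = next(
        (anim for trig, anim in WALLPAPER_TRIGGERS.items() if trig in text),
        None)
    return avatar_anim, wallpaper_anim
```

In a full client, this function would be called once per sent message, and the same scan would be applied to every received message as well.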
  • The process of identifying and playing the types of animations that correspond to a sent instant message or that are played during a period of sender inactivity (steps 320-355) also is performed by the processor of the instant message communications application that received the message.
  • the animation of the sender avatar may be viewed by the sender and the recipient of the instant message.
  • the animation of the avatar conveys information from the sender to the recipient that is not directly included in the instant message.
  • an instant messaging interface 500 may be used by a sender of a speech-based instant messaging system to send and receive instant messages.
  • instant messages are heard rather than read by users.
  • the instant messages may be audio recordings of the users of the speech-based instant messaging system, or the instant messages may include text that is converted into audible speech with a text-to-speech engine. The audio recordings or the audible speech are played by the users.
  • the speech-based instant messaging interface 500 may display an avatar 505 corresponding to a user of the instant messaging system from which speech-based instant messages are received.
  • the avatar 505 may be animated automatically in response to the received instant messages such that the avatar 505 appears to be speaking the contents of the instant message.
  • the recipient may view the animation of the avatar 505 and gather information not directly or explicitly conveyed in the instant message. Depending on the animation played, the recipient may be able to determine, for example, the mood of the sender or whether the sender is being serious or joking.
  • the audio message may be processed in the same or similar manner as a textual instant message is processed with respect to the animation process 300 of FIG. 3.
  • types of animations are triggered by audio triggers included in an instant message.
  • the avatar 505 may appear to be speaking the instant message.
  • the avatar 505 may include animations of mouth movements corresponding to phonemes in human speech to increase the accuracy of the speaking animations.
  • a text-to-speech process may be used to generate sounds spoken by the avatar 505, animations corresponding to phonemes in the text may be generated, and a lip synchronization process may be used to synchronize the playing of the audio with the lip animation such that the phonemes are heard at the same time that the corresponding animation of the mouth of the avatar 505 is seen.
  • When the instant message includes an audio recording, animations corresponding to phonemes in the audio recording may be generated, and a lip synchronization process used to synchronize the playing of the audio recording with the lip animation.
  • a sender may record an audio portion to be associated with one or more animations of the avatar 505. The recording then may be played when the corresponding animation of the avatar 505 is played.
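A lip-synchronization pass like the one described can be sketched as: derive a phoneme sequence from the text or recording, map each phoneme to a mouth shape (a viseme), and schedule each mouth animation at the time its phoneme is heard. The phoneme-to-viseme map and the uniform timing below are illustrative assumptions, not the patent's method:

```python
# Sketch of lip synchronization: map phonemes to mouth-shape
# animations (visemes) and schedule them on the audio timeline.
# The phoneme labels, viseme names, and uniform per-phoneme
# duration are all assumptions for illustration.
PHONEME_TO_VISEME = {
    "HH": "open_slight",
    "AY": "open_wide",
    "M": "closed",
}

def schedule_mouth_animations(phonemes, phoneme_duration=0.1):
    """Return (start_time, viseme) pairs so that each mouth shape is
    shown at the same moment its phoneme is heard."""
    schedule = []
    t = 0.0
    for p in phonemes:
        schedule.append((round(t, 3), PHONEME_TO_VISEME.get(p, "neutral")))
        t += phoneme_duration
    return schedule
```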
  • FIG. 6 illustrates an example process 600 for communicating between instant message clients 602a and 602b, through an instant message host system 604, to animate one avatar in response to an animation played in a different avatar.
  • Each of the users using client 602a or client 602b is associated with an avatar that represents and projects the user during the instant message session.
  • the communications between the clients 602a and 602b are facilitated by an instant messaging host system 604.
  • the communications process 600 enables a first client 602a and a second client 602b to send and receive communications from each other.
  • the communications are sent through the instant messaging host system 604.
  • Some or all of the communications may trigger an animation or animations in an avatar associated with the user of the first client 602a and an animation or animations in an avatar associated with the user of the second client 602b.
  • An instant messaging communications session is established between the first client 602a and the second client 602b in which communications are sent through the instant messaging server host system 604 (step 606).
  • the communications session involves a first avatar that represents the user of the first client 602a and a second avatar that represents the user of the second client 602b. This may be accomplished, for example, as described previously with respect to step 305 of FIG. 3. In general, both the user of the first client 602a and the user of the second client 602b may use a user interface similar to the user interface 100 of FIG. 1 in which the sender avatar and the recipient avatar are displayed on the first client 602a and on the second client 602b.
  • a user associated with the first client 602a enters text of an instant message to be sent to a user of the second client 602b, which is received by the processor on the client 602a executing the instant messaging communications application (step 608).
  • the entered text may include a trigger for one ofthe animations from the first avatar model.
  • the processor executing the instant messaging communications application sends the entered text to the second client 602b in the instant message by way of the host system 604 (step 610).
  • the host system 604 receives the message and forwards the message from the first client 602a to the second client 602b (step 612).
  • the message then is received by the second client 602b (step 614).
  • Upon receipt of the message, the second client 602b displays the message in a user interface in which messages from the user of the first client 602a are displayed.
  • the user interface may be similar to the instant messaging user interface 105 from FIG. 1, in which avatars corresponding to the sender and the recipient are displayed.
  • Both the first client 602a and the second client 602b have a copy of the message, and both the first client 602a and the second client 602b begin processing the text of the message to determine if the text of the message triggers any animations in the respective copies of the first and second avatar models.
  • the first client 602a and the second client 602b may actually process the message substantially concurrently or serially, but both the first client 602a and the second client 602b process the message in the same way.
  • the first client 602a searches the text of the message for animation triggers to identify a type of animation to play (step 616a).
  • the first client 602a identifies an animation having the identified type of animation for a first avatar associated with the user of the first client 602a (step 618a).
  • the first client 602a plays the identified animation for the first avatar that is associated with the user of the first client 602a (step 620a).
  • the first avatar model is used to identify the animation to be played because the first avatar model is associated with the first client 602a, which sent the message.
  • the first client 602a and the second client 602b use identical copies of the first avatar model to process the message, so the same animation event is seen on the first client 602a and the second client 602b.
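The deterministic lookup described in steps 616a-620a can be sketched as follows. The trigger words and animation-type names below are illustrative assumptions, not part of the specification; the key point is that both clients run the same search over identical copies of the avatar model, so both play the same animation.

```python
from typing import Optional

# Hypothetical avatar model: maps text triggers to animation types.
# Both clients hold identical copies, so the lookup is deterministic.
AVATAR_MODEL_TRIGGERS = {
    "lol": "laugh",
    "brb": "wave",
    ":(": "frown",
}

def find_animation_type(message_text: str) -> Optional[str]:
    """Search the message text for the first matching trigger (step 616a)."""
    lowered = message_text.lower()
    for trigger, animation_type in AVATAR_MODEL_TRIGGERS.items():
        if trigger in lowered:
            return animation_type
    return None  # no trigger found; no animation is played
```

Because the sending and receiving clients both evaluate `find_animation_type` over the same text and the same trigger table, the identified animation type is necessarily the same on both ends.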
  • the animation from the first avatar model triggers an animation from the second avatar model.
  • the first client 602a identifies, based on the identified type of animation played for the first avatar in response to the text trigger, a type of animation to be played for a second avatar that is associated with the user of the second client 602b (step 622a).
  • the first client 602a plays the identified type of animation for the second avatar (step 624a).
  • the first client 602a also may identify a type of animation to be played for wallpaper corresponding to the first avatar and play the identified wallpaper animation of the first avatar (step 626a).
  • the wallpaper of the avatar may include an object or objects that are animated during the instant message communications session.
  • the animation of the object or objects may occur based on, for example, a trigger in an instant message or the passage of a predetermined amount of time.
  • the animation of wallpaper objects also may be user-configurable, such that a user selects whether a particular type of animation, or any animations, are played, and the triggers for one or more of the wallpaper objects.
  • a trigger for a type of animation of a wallpaper object or objects may be the same as, or different from, one of the triggers associated with animating the avatar.
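A per-object wallpaper configuration of the kind described above might look like the following sketch. The object names, trigger words, and animation names are hypothetical; the sketch shows only the structure: each wallpaper object carries its own trigger table and a user-settable enabled flag.

```python
# Hypothetical per-user wallpaper configuration: each wallpaper object has
# its own trigger-to-animation table and an enabled flag, so a user can
# turn individual wallpaper animations on or off (all names are assumed).
wallpaper_config = {
    "fish": {"enabled": True, "triggers": {"lol": "jump", "swim": "swim"}},
    "clock": {"enabled": False, "triggers": {"time": "chime"}},
}

def wallpaper_animations(message_text: str, config: dict) -> list:
    """Return (object, animation) pairs triggered by the message text."""
    hits = []
    lowered = message_text.lower()
    for obj, settings in config.items():
        if not settings["enabled"]:
            continue  # the user has disabled animations for this object
        for trigger, animation in settings["triggers"].items():
            if trigger in lowered:
                hits.append((obj, animation))
    return hits
```

Note that a wallpaper trigger ("lol" here) may coincide with an avatar trigger, matching the observation that wallpaper and avatar triggers may be the same or different.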
  • the user of the first client 602a may not send any additional messages for a period of time.
  • the first client 602a detects such a period of inactivity (step 628a).
  • the first client 602a identifies and plays an animation of a type associated with the period of inactivity detected by the first client 602a (step 630a). This may be accomplished by using a database table, list or file that identifies one or more types of animations to play during a detected idle period.
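Steps 628a-630a can be sketched as an inactivity check plus a lookup in a table of idle animation types. The threshold value and animation names are illustrative assumptions; the specification only requires that some table, list, or file identifies the animations to play during an idle period.

```python
import random

# Hypothetical table of animation types to play during a detected idle
# period (step 630a); the names and the threshold are illustrative.
IDLE_ANIMATION_TYPES = ["yawn", "look_around", "sleep"]
IDLE_THRESHOLD_SECONDS = 60.0

def pick_idle_animation(last_message_time: float, now: float):
    """Return an idle animation type once the inactivity threshold passes."""
    if now - last_message_time < IDLE_THRESHOLD_SECONDS:
        return None  # user is still active; no idle animation (step 628a)
    return random.choice(IDLE_ANIMATION_TYPES)  # step 630a
```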
  • the second client 602b processes the instant message in the same way as the first client 602a. Specifically, the second client 602b processes the message with steps 616b through 630b, each of which parallels the corresponding one of the message processing steps 616a through 630a performed by the first client 602a. Because each of the first client 602a and the second client 602b has copies of the avatars corresponding to the users of the first client 602a and the second client 602b, the same animations that were played on the first client 602a as a result of executing steps 616a through 630a are played on the second client 602b as a result of executing the similar steps 616b through 630b.
  • a text-based message indicates the types of animations that occur.
  • messages with different types of content also may trigger animations of the avatars.
  • characteristics of an audio signal included in an audio-based message may trigger animations from the avatars.
  • a process 700 is used to select and optionally customize an avatar for use with an instant messaging system.
  • An avatar may be customized to reflect a personality to be expressed or another aspect of self-expression of the user associated with the avatar.
  • the process 700 begins when a user selects an avatar from multiple avatars and the selection is received by the processor executing the process 700 (step 705). For example, a user may select a particular avatar from multiple avatars such as the avatars illustrated in FIG. 8.
  • Each of the avatars 805a-805r is associated with an avatar model that specifies the appearance of the avatar.
  • Each of the avatars 805a-805r also includes multiple associated animations, each animation identified as being of a particular animation type. The selection may be accomplished, for example, when a user selects one avatar from a group of displayed avatars.
  • the display of the avatars may show multiple avatars in a window, such as by showing a small representation (which in some implementations may be referred to as a "thumbnail") of each avatar. Additionally or alternatively, the display may be a list of avatar names from which the user selects.
  • FIG. 8 illustrates multiple avatars 805a-805r.
  • Each avatar 805a-805r includes an appearance, name, and personality description.
  • avatar 805a has an appearance 810a, a name 810b and a personality description 810c.
  • the appearance of an avatar may represent, by way of example, living, fictional or historical people, sea creatures, amphibians, reptiles, mammals, birds, or animated objects.
  • Some avatars may be represented only with a head, such as avatars 805a-805r.
  • the appearance of the avatar 805b includes a head of a sheep.
  • the appearance of other avatars may include only a portion or a specific part of a head.
  • the appearance of the avatar 805l resembles a set of lips.
  • avatars may be represented by a body in addition to a head.
  • the appearance of the avatar 805n includes a full crab body in addition to a head.
  • An avatar may be displayed over wallpaper that is related in subject matter to the avatar.
  • the avatar 805j is displayed over wallpaper that is indicative of a swamp in which the avatar 805j lives.
  • Each of the avatars 805a-805r has a base state expression.
  • the avatar 805f appears to be happy
  • the avatar 805j appears to be sad
  • the avatar 805m appears to be angry.
  • Avatars may have other base state expressions, such as scared or bored.
  • the base state expression of an avatar may influence the behavior of the avatar, including the animations and the sounds of the avatar.
  • the avatar 805f has a happy base state expression and consequently has a generally happy behavior
  • the avatar 805m has a creepy base state expression and consequently has a generally scary, creepy and spooky demeanor.
  • a happy avatar may have upbeat sounds while an angry avatar may appear to be shouting when a sound is produced.
  • the base state expression of an avatar may be changed as a result of the activities of a user associated with the avatar.
  • the degree of happiness expressed by the avatar may be related to the number of messages sent or received by the user. When the user sends or receives many messages in a predetermined period of time, the avatar may appear happier than when the user sends or receives fewer messages in the predetermined period of time.
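The activity-based mood described above can be sketched as a simple mapping from the count of messages sent or received in the predetermined window to a base-state expression. The thresholds and state names below are illustrative assumptions; the specification only requires that more activity in the window yields a happier appearance.

```python
# Hypothetical mapping from recent message activity to a base-state
# expression; the thresholds and state names are assumptions.
def base_state_from_activity(messages_in_window: int) -> str:
    """More messages sent/received in the window -> happier base state."""
    if messages_in_window >= 20:
        return "very_happy"
    if messages_in_window >= 5:
        return "happy"
    return "neutral"
```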
  • One of multiple avatars 805a-805r may be chosen by a user of the instant messaging system.
  • Each of the avatars 805a-805r is associated with an appearance, characteristics and behaviors that express a particular type of personality. For example, an avatar 805f, which has appearance characteristics of a dolphin, may be chosen.
  • Each of the avatars 805a-805r is a multi-dimensional character with depth of personality, voice, and visual attributes.
  • an avatar of the avatars 805a-805r is capable of indicating a rich variety of information about the user projecting the avatar.
  • Properties of the avatar enable the communication of physical attributes, emotional attributes, and other types of context information about the user that are not well-suited (or even available) for presentation through the use of two-dimensional icons that are not animated.
  • the avatar may reflect the user's mood, emotions, and personality.
  • the avatar may reflect the location, activities and other context of the user.
  • an avatar named SoccerBuddy (not shown) is associated with an energetic personality.
  • the personality of the SoccerBuddy avatar may be described as energetic, bouncy, confidently enthusiastic, and youthful.
  • the SoccerBuddy avatar's behaviors reflect events in soccer matches.
  • the avatar's yell animation is an "ole, ole, ole" chant
  • the avatar has a big-smile animation and, during a frown animation or a tongue-out animation, the avatar shows a yellow card.
  • the SoccerBuddy is customizable to represent a specific team.
  • Special features of the SoccerBuddy avatar include cleated feet to represent the avatar's base. In general, the feet act as the base for the avatar.
  • the SoccerBuddy avatar is capable of appearing to move about by pogo-sticking on his feet. In a few animations, such as when the avatar goes away, the avatar's feet may become large and detach from the SoccerBuddy. The feet are able to be animated to kick a soccer ball around the display.
  • a silent movie avatar is reminiscent of a silent film actor of the 1920s and 1930s.
  • a silent movie avatar is depicted using a stove-pipe hat and a handle-bar moustache.
  • the silent movie avatar is not associated with audio. Instead of speaking, the silent movie avatar is replaced by, or displays, placards having text in a manner similar to how speech was conveyed in a silent movie.
  • an avatar may be appropriate to current events or a season.
  • an avatar may represent a team or a player on a team involved in professional or amateur sport.
  • An avatar may represent a football team, a baseball team, or a basketball team, or a particular player of a team.
  • teams engaged in a particular playoff series may be represented.
  • seasonal avatars include a Santa Claus avatar, an Uncle Sam avatar, a Thanksgiving turkey avatar, a Jack-o-Lantern avatar, a Valentine's Day heart avatar, an Easter egg avatar, and an Easter bunny avatar.
  • Animation triggers of the avatar may be modified to customize when various types of animations associated with the avatar are to occur (step 710).
  • a user may modify the triggers shown in FIG. 4 to indicate when an avatar is to be animated, as described previously with respect to FIG. 3.
  • the triggers may be augmented to include frequently used words, phrases, or character strings.
  • the triggers also may be modified such that the animations that are played as a result of the triggers are indicative of the personality of the avatar. Modifying the triggers may help to define the personality expressed by the avatar and used for user self-expression.
  • a user also may configure the appearance of an avatar (step 715). This also may help define the personality of the avatar and communicate a self-expressive aspect of the sender.
  • an appearance modification user interface 900 may be used to configure the appearance of an avatar.
  • the appearance modification user interface 900 enables the user to modify multiple characteristics of a head of an avatar. For example, hair, eyes, nose, lips and skin tone of the avatar may be configured with the appearance modification user interface 900.
  • a hair slider 905 may be used to modify the length of the avatar's hair.
  • the various positions of the hair slider 905 represent different possible lengths of hair for the avatar that correspond to different representations of the hair of the avatar included in the avatar model file associated with the avatar being configured.
  • An eyes slider 910 may be used to modify the color of the avatar's eyes, with each position of the eyes slider 910 representing a different possible color of the avatar's eyes and each color being represented in the avatar model file.
  • a nose slider 915 may be used to modify the appearance of the avatar's nose, with each position of the nose slider 915 representing a different possible appearance of the avatar's nose and each possible appearance being represented in the avatar model file.
  • a lips slider 920 may be used to modify the appearance of the avatar's lips, with each position of the lips slider 920 representing a different possible appearance of the avatar's lips and associated with a different lip representation in the avatar model file.
  • the avatar's skin tone also may be modified with a skin tone slider 925.
  • Each of the possible positions of the skin tone slider 925 represents a possible skin tone for the avatar, with each being represented in the avatar model file.
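The slider mechanism described above amounts to indexing into a set of alternative representations stored in the avatar model file. The feature names and variant lists below are illustrative assumptions; the sketch shows only the position-to-representation mapping.

```python
# Hypothetical avatar model file: each configurable feature stores the
# alternative representations that the slider positions select among.
AVATAR_MODEL_FILE = {
    "hair": ["bald", "short", "medium", "long"],
    "eyes": ["brown", "blue", "green"],
    "skin_tone": ["light", "medium", "dark"],
}

def apply_slider(feature: str, position: int) -> str:
    """Map a slider position to the corresponding model representation."""
    variants = AVATAR_MODEL_FILE[feature]
    position = max(0, min(position, len(variants) - 1))  # clamp to range
    return variants[position]
```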
  • the appearance of the avatar that is created as a result of using the sliders 905-925 may be previewed in an avatar viewer 930.
  • the values chosen with the sliders 905-925 are reflected in the avatar illustrated in the avatar viewer 930.
  • the avatar viewer 930 may be updated as each of the sliders 905-925 is moved such that the changes made to the avatar's appearance are immediately visible. In another implementation, the avatar viewer 930 may be updated once after all of the sliders 905-925 have been used.
  • a rotation slider 935 enables the rotation of the avatar illustrated in the avatar viewer 930.
  • the avatar may be rotated about an axis by a number of degrees chosen on the rotation slider 935 relative to an unrotated orientation of the avatar.
  • the axis extends vertically through the center of the avatar's head, and the unrotated orientation of the avatar is when the avatar is facing directly forward.
  • Rotating the avatar's head with the rotation slider 935 enables viewing of all sides of the avatar to illustrate the changes to the avatar's appearance made with the sliders 905-925.
  • the avatar viewer 930 may be updated as the rotation slider 935 is moved such that changes in the orientation of the avatar may be immediately visible.
  • the appearance modification user interface 900 also includes a hair tool button 940, a skin tool button 945, and a props tool button 950.
  • Selecting the hair tool button 940 displays a tool for modifying various characteristics of the avatar's hair.
  • the tool displayed as a result of selecting the hair tool button 940 may enable changes to, for example, the length, color, cut, and comb of the avatar's hair.
  • the changes made to the avatar's hair with the tool displayed as a result of selecting the hair tool button 940 are reflected in the illustration of the avatar in the avatar viewer 930.
  • selecting a skin tool button 945 displays a tool for modifying various aspects of the avatar's skin.
  • the tool displayed as a result of selecting the skin tool button 945 may enable, for example, changing the color of the avatar's skin, giving the avatar a tan, giving the avatar tattoos, or changing the weathering of the avatar's skin to give the appearance of the age represented by the avatar.
  • the changes made to the avatar's skin with the tool displayed as a result of selecting the skin tool button 945 are reflected in the illustration of the avatar in the avatar viewer 930.
  • selecting the props tool button 950 displays a tool for associating one or more props with the avatar.
  • the avatar may be given eyeglasses, earrings, hats, or other objects that may be worn by, or displayed on or near, the avatar through use of the props tool.
  • the props given to the avatar with the tool displayed as a result of selecting the props tool button 950 are shown in the illustration of the avatar in the avatar viewer 930.
  • all of the props that may be associated with the avatar are included in the avatar model file. The props tool controls whether each of the props is made visible when the avatar is displayed. In some implementations, a prop may be created using, and rendered by, two-dimensional animation techniques.
  • the rendering of the prop is synchronized with animations for the three-dimensional avatar. Props may be generated and associated with an avatar after the avatar is initially created. Once all desired changes have been made to the avatar's appearance, the user may accept the changes by selecting a publish button 955. Selecting the publish button 955 saves the changes made to the avatar's appearance. In addition, when copies of the avatar are held by other users of the instant messaging system, the other users are sent updated copies of the avatar that reflect the changes made by the user to the avatar.
  • the copies of the avatar may be updated so that all copies of the avatar have the same appearance, such that there is consistency among the avatars used to send and receive out-of-band communications.
  • the appearance modification user interface 900 may be used by the user to change only copies of the avatar corresponding to the user. Therefore, the user is prevented from making changes to other avatars corresponding to other users; such changes would be overwritten when the user is sent updated copies of the other avatars after the other users make changes to them. Preventing the user from modifying the other avatars ensures that all copies of the avatars are identical.
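The publish-and-propagate step can be sketched as follows. The data layout (a per-user dictionary of held avatar copies) and the function name are assumptions made for illustration; the point is that saving a change pushes an updated copy to every user who holds one, keeping all copies identical.

```python
# Sketch of the publish step (button 955): when a user saves appearance
# changes, updated copies of the avatar model are sent to every other
# user who holds a copy, so all copies stay consistent (names assumed).
def publish_avatar(owner: str, new_model: dict, held_copies: dict) -> None:
    """Overwrite every held copy of the owner's avatar with the new model."""
    for holder in held_copies:
        held_copies[holder][owner] = dict(new_model)  # send updated copy

# Two other users each hold a (stale) copy of bob's avatar model.
held_copies = {
    "alice": {"bob": {"hair": "short"}},
    "carol": {"bob": {"hair": "short"}},
}
publish_avatar("bob", {"hair": "long"}, held_copies)
```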
  • the avatar illustrated in the avatar viewer 930 may have an appearance that does not include one of hair, eyes, a nose, lips, or skin tone that are modified with the sliders 905-925.
  • the appearance of the avatar 805l from FIG. 8 does not include hair, eyes, a nose, or skin tone.
  • the appearance modification user interface 900 may omit the sliders 905-925 and instead include sliders to control other aspects of the appearance of the avatar.
  • the appearance modification user interface 900 may include a teeth slider when the appearance of the avatar 805l is being modified.
  • the interface 900 may be customized based on the avatar selected, to enable appropriate and relevant visual enhancements thereto.
  • a configurable facial feature of an avatar may be created using blend shapes of the animation model corresponding to the avatar.
  • a blend shape defines a portion of the avatar that may be animated.
  • a blend shape may include a mesh percentage that may be modified to cause a corresponding modification in the facial feature.
  • a user may be able to configure a facial feature of an avatar by using a slider or other type of control to modify the mesh percentage of the blend shapes associated with the facial feature being configured.
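A common way to realize a blend shape (not spelled out in the specification, so the details here are assumptions) is to linearly interpolate each vertex of the affected region between a base position and a target position according to the mesh percentage:

```python
# Minimal blend-shape sketch: each vertex of the configurable region is
# interpolated between a base position and a target position according
# to the mesh percentage set by the slider (geometry values are made up).
def blend_vertices(base, target, mesh_percentage):
    """Linearly interpolate vertex positions; 0% = base, 100% = target."""
    t = mesh_percentage / 100.0
    return [
        tuple(b + t * (tg - b) for b, tg in zip(bv, tv))
        for bv, tv in zip(base, target)
    ]

# Two vertices of a hypothetical "lips" region, as (x, y, z) tuples.
base = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
target = [(0.0, 1.0, 0.0), (1.0, 1.0, 0.0)]
```

Moving the slider adjusts `mesh_percentage`, which smoothly morphs the facial feature between its two extremes.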
  • the color, texture, and particles of the avatar may be modified. More particularly, the color or shading of the avatar may be changed.
  • the texture applied to the avatar may be changed to age or weather the skin of the avatar.
  • the width, length, texture, and color of particles of the avatar may be customized.
  • particles of the avatar used to portray hair or facial hair, such as a beard, may be modified to show hair or beard growth in the avatar.
  • wallpaper over which the avatar is illustrated and an animation for objects in the wallpaper may be chosen (step 720). This may be accomplished by, for example, choosing wallpaper from a set of possible wallpapers.
  • the wallpapers may include animated objects, or the user may choose objects and animations for the chosen objects to be added to the chosen wallpaper.
  • a trading card that includes an image of the avatar and a description of the avatar may be created (step 725).
  • the trading card also may include a description of the user associated with the avatar.
  • the trading card may be shared with other users of the instant messaging system to inform the other users of the avatar associated with the user.
  • Referring to FIG. 10, one example of a trading card is depicted.
  • the front side 1045 of the trading card shows the avatar 1046.
  • the animations of the avatar may be played by selecting the animations control 1047.
  • the back side 1050 of the trading card includes descriptive information 1051 about the avatar, including the avatar's name, date of birth, city, species, likes, dislikes, hobbies, and aspirations.
  • As illustrated in FIG. 10, both the front side 1045 and the back side 1050 of the trading card are shown. In some implementations, only one side 1045 or 1050 of the trading card is able to be displayed at one time. In such a case, a user may be able to control the side of the trading card that is displayed by using one of the flip controls 1048 or 1052. A store from which accessories for the avatar 1046 illustrated in the trading card may be purchased is accessed by selecting a shopping control 1049.
  • an avatar also may be exported for use in another application (step 730).
  • an avatar may be used by an application other than a messaging application.
  • an avatar may be displayed as part of a user's customized home page of the user's access provider, such as an Internet service provider.
  • An instant message sender may drag-and-drop an avatar to the user's customized home page such that the avatar is viewable by the user corresponding to the avatar.
  • the avatar may be used in an application in which the avatar is viewable by anyone.
  • An instant message sender may drag-and-drop the sender's avatar to the sender's blog or another type of publicly-accessible online journal.
  • the avatar settings user interface 1000 includes a personality section 1002. Selecting a personality tab 1010 displays a personality section of the avatar settings interface 1000 for modifying the behavior of the one or more avatars.
  • the avatar settings user interface 1000 may be used with the process 700 of FIG. 7 to choose the wallpaper of an avatar and/or to create a trading card for an avatar.
  • the personality section 1002 of the avatar settings interface 1000 includes an avatar list 1015 including the one or more various avatars corresponding to the user of the instant messaging system.
  • Each of the one or more avatars may be specified to have a distinct personality for use while communicating with a specific person or in a specific situation.
  • an avatar may change appearance or behavior depending on the person with which the user interacts.
  • an avatar may be created with a personality that is appropriate for business communications, and another avatar may be created with a personality that is appropriate for communications with family members.
  • Each of the avatars may be presented in the list with a name as well as a small illustration of each avatar's appearance. Selection of an avatar from the avatar list 1015 enables the specification of the behavior of the selected avatar.
  • the avatar 1020, which is chosen to be the user's default avatar, has been selected from the avatar list 1015, so the behavior of the avatar 1020 may be specified.
  • Names of the avatars included in the avatar list may be changed through selection of a rename button 1025. Selecting the rename button displays a tool for changing the name of an avatar selected from the avatar list 1015. Similarly, an avatar may be designated as a default avatar by selecting a default button 1030 after selecting the avatar from the avatar list 1015. Avatars may be deleted by selecting a delete button 1035 after selecting the avatar from the avatar list 1015. In one implementation, a notification is displayed before the avatar is deleted from the avatar list 1015. Avatars also may be created by selecting a create button 1040. When the create button 1040 is pressed, a new entry is added to the avatar list 1015. The entry may be selected and modified in the same way as other avatars in the avatar list 1015. The behavior of the avatar is summarized in a card front 1045 and a card back 1050.
  • the card front 1045 includes an illustration of the avatar and wallpaper over which the avatar 1020 is illustrated.
  • the card front 1045 also includes a shopping control 1049 that provides a means for purchasing props for the selected avatar 1020.
  • the card back 1050 includes information describing the selected avatar 1020 and a user of the selected avatar. The description may include a name, a birth date, a location, as well as other identifying and descriptive information for the avatar and the user of the avatar.
  • the card back 1050 also may include an illustration of the selected avatar 1020 as well as the wallpaper over which the avatar 1020 is illustrated.
  • the trading card created as part of the avatar customization process 700 includes the card front 1045 and the card back 1050 automatically generated by the avatar settings interface 1000.
  • the personality section 1002 of the avatar settings interface 1000 may include multiple links 1055-1070 to tools for modifying other aspects of the selected avatar's 1020 behavior.
  • an avatar link 1055 may lead to a tool for modifying the appearance of the selected avatar 1020.
  • selecting the avatar link 1055 may display the appearance modification user interface 900 from FIG. 9.
  • the avatar link 1055 may display a tool for substituting or otherwise selecting the selected avatar 1020.
  • the avatar link 1055 may allow the appearance of the avatar to be changed to a different species.
  • the tool may allow the appearance of the avatar 1020 to be changed from that of a dog to that of a cat.
  • a wallpaper link 1060 may be selected to display a tool for choosing the wallpaper over which the selected avatar 1020 is drawn. In one implementation, the wallpaper may be animated.
  • a sound link 1065 may be selected to display a tool with which the sounds made by the avatar 1020 may be modified.
  • the sounds may be played when the avatar is animated, or at other times, to get the attention of the user.
  • An emoticon link 1070 may be selected to display a tool for specifying emoticons that are available when communicating with the selected avatar 1020.
  • Emoticons are two-dimensional non-animated images that are sent when certain triggers are included in the text of an instant message. Changes made using the tools that are accessible through the links 1055-1070 may be reflected in the card front 1045 and the card back 1050. After all desired changes have been made to the avatars included in the avatar list 1015, the avatar settings interface 1000 may be dismissed by selecting a close button 1075. It is possible, through the systems and techniques described herein, particularly with respect to FIGS.
  • Each self-expression item is used to represent the instant message sender or a characteristic or preference of the instant message sender, and may include user-selectable binary objects.
  • the self-expression items may be made perceivable by a potential instant message recipient ("instant message recipient") before, during, or after the initiation of communications by a potential instant message sender ("instant message sender").
  • self-expression items may include an avatar and images, such as wallpaper, that are applied in a location having a contextual placement on a user interface. The contextual placement typically indicates an association with the user represented by the self-expression item.
  • the wallpaper may be applied in an area where messages from the instant message sender are displayed, or in an area around a dialog area on a user interface.
  • Self-expression items also include sounds, animation, video clips, and emoticons (e.g., smileys).
  • the personality may also include a set of features or functionality associated with the personality. For example, features such as encrypted transmission, instant message conversation logging, and forwarding of instant messages to an alternative communication system may be enabled for a given personality.
  • Users may assign personalities to be projected when conversing with other users, either in advance of or "on-the-fly" during a communication session. This allows the user to project different personalities to different people on-line.
  • users may save one or more personalities (e.g., where each personality typically includes groups of instant messaging self-expression items such as, for example, avatars, Buddy Sounds, Buddy Wallpaper, and Smileys, and/or a set of features and functionalities), and they may name those personalities to enable their invocation; they may associate each of different personalities with different users with whom they communicate, or groups of such users, so as to automatically display an appropriate/selected personality during communications with such other users or groups; or they may establish each of different personalities during the process of creating, adding or customizing lists or groups of users or the individual users themselves.
  • the personalities may be projected to others in interactive online environments (e.g., Instant Messaging and Chat) according to the assignments made by the user.
  • personalities may be assigned, established and/or associated with other settings, such that a particular personality may be projected based on time-of-day, geographic or virtual location, or even characteristics or attributes of each (e.g., a cold personality for winter in Colorado or a chatting personality while participating in a chat room).
  • an instant message sender may have multiple online personas for use in an instant message communications session. Each online persona is associated with an avatar representing the particular online persona of the instant message sender. In many cases, each online persona of a particular instant message sender is associated with a different avatar. This need not necessarily be so. Moreover, even when two or more online personas of a particular instant message sender include the same avatar, the appearance or behavior of the avatar may be different for each of the online personas. In one example, a starfish avatar may be associated with two online personas of a particular instant message sender. The starfish avatar that is associated with one online persona may have different animations than the starfish avatar that is associated with the other online persona. Even when both of the starfish avatars include the same animations, one of the starfish avatars may be animated to display an animation of a particular type based on different triggers than the same animation that is displayed for the other of the starfish avatars.
  • FIG. 11A shows relationships between online personas, avatars, avatar behaviors and avatar appearances. In particular, FIG. 11A shows online personas 1102a-1102e and avatars 1104a-1104d that are associated with the online personas 1102a-1102e.
  • Each of the avatars 1104a-1104d includes an appearance 1106a-1106c and a behavior 1108a-1108d. More particularly, the avatar 1104a includes an appearance 1106a and a behavior 1108a; the avatar 1104b includes an appearance 1106b and a behavior 1108b; the avatar 1104c includes the appearance 1106c and a behavior 1108c; and the avatar 1104d includes the appearance 1106c and a behavior 1108d.
  • the avatars 1104c and 1104d are similar in that both include the appearance 1106c. However, the avatars 1104c and 1104d differ in that the avatar 1104c includes the behavior 1108c while the avatar 1104d includes the behavior 1108d.
  • Each of the online personas 1102a-1102e is associated with one of the avatars 1104a-1104d. More particularly, the online persona 1102a is associated with the avatar 1104a; the online persona 1102b is associated with the avatar 1104b; the online persona 1102c also is associated with the avatar 1104b; the online persona 1102d is associated with the avatar 1104c; and the online persona 1102e is associated with the avatar 1104d. As illustrated by the online persona 1102a that is associated with the avatar 1104a, an online persona may be associated with an avatar that is not also associated with a different online persona.
  • Multiple online personas may use the same avatar. This is illustrated by the online personas 1102b and 1102c, which are both associated with the avatar 1104b. In this case, the appearance and behavior exhibited by the avatar 1104b is the same for both of the online personas 1102b and 1102c. In some cases, multiple online personas may use similar avatars that have the same appearance but exhibit different behavior, as illustrated by the online personas 1102d and 1102e. The online personas 1102d and 1102e are associated with similar avatars 1104c and 1104d that have the same appearance 1106c. The avatars 1104c and 1104d, however, exhibit different behaviors 1108c and 1108d, respectively.
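The relationships of FIG. 11A can be sketched directly as data, using the reference numerals from the figure as placeholder strings. The `Avatar` type and the dictionary layout are illustrative assumptions; the sketch only encodes the associations the figure describes.

```python
from dataclasses import dataclass

# Sketch of the relationships in FIG. 11A: personas reference avatars,
# and two avatars may share an appearance while differing in behavior.
@dataclass(frozen=True)
class Avatar:
    appearance: str
    behavior: str

avatar_1104b = Avatar(appearance="1106b", behavior="1108b")
avatar_1104c = Avatar(appearance="1106c", behavior="1108c")
avatar_1104d = Avatar(appearance="1106c", behavior="1108d")

personas = {
    "1102b": avatar_1104b,
    "1102c": avatar_1104b,  # two personas sharing one avatar
    "1102d": avatar_1104c,
    "1102e": avatar_1104d,  # same appearance as 1104c, different behavior
}
```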
  • The instant message sender may forbid a certain personality to be shown to designated instant message recipients and/or groups. For example, if the instant message sender wants to ensure that the "Casual" personality is not accidentally displayed to the boss or to co-workers, the instant message sender may prohibit the display of the "Casual" personality to the boss on an individual basis, and may prohibit the display of the "Casual" personality to the "Co-workers" group on a group basis. An appropriate user interface may be provided to assist the instant message sender in making such a selection. Similarly, the instant message sender may be provided an option to "lock" a personality to an instant message recipient or a group of instant message recipients to guard against accidental or unintended personality switching and/or augmenting.
  • For example, the instant message sender may choose to lock the "Work" personality to the boss on an individual basis, or to lock the "Work" personality to the "Co-workers" group on a group basis. In one example, the "Casual" personality will not be applied to a locked personality.
  • FIG. 11B shows an exemplary process 1100 to enable an instant message sender to select an online persona to be made perceivable to an instant message recipient.
  • The selected online persona includes an avatar representing the online persona of the instant message sender.
  • The process 1100 generally involves selecting and projecting an online persona that includes an avatar representing the sender.
  • The instant message sender creates or modifies one or more online personalities, including an avatar representing the sender (step 1105).
  • The online personalities may be created or modified with, for example, the avatar settings user interface 1000 of FIG. 10.
  • Creating an online persona generally involves the instant message sender selecting one or more self-expression items and/or features and functionalities to be displayed to a certain instant message recipient or group of instant message recipients.
  • A user interface may be provided to assist the instant message sender in making such a selection, as illustrated in FIG. 12.
  • FIG. 12 shows a chooser user interface 1200 that enables the instant message sender to select among available personalities 1205, 1210, 1215, 1220, 1225, 1230, 1235, 1240, 1245, 1250, and 1255.
  • The user interface 1200 also has a control 1260 to enable the instant message sender to "snag" the personality of another user, and a control 1265 to review the personality settings currently selected by the instant message sender.
  • The user may change the personality, including the avatar, being projected to the instant message recipient before, during, or after the instant message conversation with the recipient.
  • The selection of a personality also may occur automatically without sender intervention. For example, an automatic determination may be made that the sender is sending instant messages from work.
  • In that case, a personality to be used at work may be selected automatically and used for all communications.
  • Similarly, an automatic determination may be made that the sender is sending instant messages from home, and a personality to be used at home may be selected automatically and used for all communications.
  • In some cases, the sender is not able to control which personality is selected for use.
  • Automatic selection of a personality may be used in conjunction with sender selection of a personality, in which case the personality automatically selected may act as a default that may be changed by the sender.
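One way to combine automatic selection with sender selection, as described above, is to treat the automatically selected personality as a default that an explicit sender choice overrides. The location values and personality names below are illustrative assumptions only:

```python
# Hypothetical defaults keyed by the sender's detected location.
DEFAULT_PERSONALITY = {"work": "Work", "home": "Casual"}

def select_personality(detected_location, sender_choice=None):
    """Return the sender's explicit choice when given; otherwise fall back
    to the personality automatically selected for the detected location."""
    if sender_choice is not None:
        return sender_choice
    return DEFAULT_PERSONALITY.get(detected_location, "Casual")
```

Here the automatically selected personality acts only as a default, so a sender's explicit selection always takes precedence.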
  • FIG. 13 shows a series 1300 of exemplary user interfaces for enabling an instant message sender to create and store a personality, and/or select various aspects of the personality such as avatars, buddy wallpaper, buddy sounds, and smileys.
  • User interface 1305 enables an instant message sender to select a set of one or more self-expression items and save the set of self-expression items as a personality.
  • The user interface 1305 also enables an instant message sender to review and make changes to an instant message personality.
  • The user interface 1305 enables an instant message sender to choose an avatar 1310 (here, referred to as a SuperBuddy), buddy wallpaper 1315, emoticons 1320 (here, referred to as Smileys), and buddy sounds 1325.
  • A set of controls 1340 is provided to enable the instant message sender to preview 1340a the profile and to save 1340b these selected self-expression items as a personality.
  • The instant message sender is able to name and save the personality 1345 and then is able to apply the personality 1350 to one or more individual instant message recipients or one or more groups of instant message recipients.
  • A management area 1350a is provided to enable the instant message sender to delete, save, or rename various instant message personalities. In choosing the self-expression items, other interfaces such as user interface 1355 may be displayed to enable the instant message sender to select the particular self-expression items.
  • The user interface 1355 includes a set of themes 1360 for avatars which enables an instant message sender to select a particular theme 1365 and choose a particular avatar 1370 in the selected theme.
  • A set of controls 1375 is provided to assist the instant message sender in making the selection of self-expression items.
  • An instant message sender may be enabled to choose a pre-determined theme, for example, by using a user interface 1380.
  • The instant message sender may select various categories 1385 of pre-selected themes and, upon selecting a particular category 1390, a set of default pre-selected self-expression items is displayed, 1390a, 1390b, 1390c, 1390d, 1390e, and 1390f.
  • The set may be unchangeable or the instant message sender may be able to individually change any of the pre-selected self-expression items in the set.
  • A control section 1395 is also provided to enable the instant message sender to select the themes.
  • The features or functionality of the instant message interface may vary based upon user-selected or pre-selected options for the personality selected or currently in use.
  • The features or functionality may be transparent to the instant message sender.
  • For example, when the "Work" personality is selected, the outgoing instant messages may be encrypted, and a copy may be recorded in a log, or a copy may be forwarded to a designated contact such as an administrative assistant.
  • A warning may be provided to an instant message recipient that the instant message conversation is being recorded or viewed by others, as appropriate to the situation.
  • When the non-professional "Casual" personality is selected, the outgoing instant messages may not be encrypted and no copy is recorded or forwarded.
  • Similarly, when the instant message sender indicates an unavailability to receive instant messages (e.g., through selection of an "away" message or by going offline), messages received from others during periods of unavailability may be forwarded to another instant message recipient such as an administrative assistant, or may be forwarded to an e-mail address for the instant message sender.
  • When the non-professional "Casual" personality is selected, no extra measures are taken to ensure delivery of the message.
  • The features and functionality associated with the personality would be transparent to the instant message sender, and may be based upon one or more pre-selected profile types when setting up the personality.
  • The instant message sender may be asked to choose from a group of personality types such as professional, management, informal, vacation, offbeat, etc.
  • In the example above, the "Work" personality may have been set up as a "professional" personality type and the "Casual" personality may have been set up as an "informal" personality type.
  • Alternatively, the instant message sender may individually select the features and functionalities associated with the personality.
  • The personality is then stored (step 1110).
  • The personality may be stored on the instant message sender system, on the instant message host system, or on a different host system such as a host system of an authorized partner or access provider.
  • The instant message sender assigns a personality to be projected during future instant message sessions or when engaged in future instant message conversations with an instant message recipient (step 1115).
  • The instant message sender may wish to display different personalities to different instant message recipients and/or groups in the buddy list.
  • The instant message sender may use a user interface to assign personalization items to personalities on at least a per-buddy group basis. For example, an instant message sender may assign a global avatar to all personalities, but assign different buddy sounds on a per-group basis to other personalities (e.g., work, family, friends), and assign buddy wallpaper and smileys on an individual basis to individual personalities corresponding to particular instant message recipients within a group.
  • The instant message sender may assign other personality attributes based upon the occurrence of certain predetermined events or triggers.
  • For example, certain potential instant message recipients may be designated to see certain aspects of the Rainy Day personality if the weather indicates rain at the geographic location of the instant message sender.
  • Default priority rules may be implemented to resolve conflicts, or the user may select priority rules to resolve conflicts among personalities being projected or among self-expression items being projected for an amalgamated personality.
  • A set of default priority rules may resolve conflicts among assigned personalities by assigning the highest priority to personalities and self-expression items of personalities assigned on an individual basis, assigning the next highest priority to assignments of personalities and personalization items made on a group basis, and assigning the lowest priority to assignments of personalities and personalization items made on a global basis.
  • The user may be given the option to override these default priority rules and assign different priority rules for resolving conflicts.
  • An instant message session between the instant message sender and the instant message recipient is initiated (step 1120).
  • The instant message session may be initiated by either the instant message sender or the instant message recipient.
  • An instant message user interface is rendered to the instant message recipient, configured to project the personality, including the avatar, assigned to the instant message recipient by the instant message sender (step 1125), as illustrated, for example, in the user interface 100 in FIG. 1.
  • The personality, including an avatar associated with the personality, chosen by an instant messaging recipient may be made perceivable upon opening of a communication window by the instant message sender for a particular instant message recipient but prior to initiation of communications. This may allow a user to determine whether to initiate communications with the instant message recipient. For example, an instant message sender may notice that the instant message recipient is projecting an at-work personality, and the instant message sender may decide to refrain from sending an instant message. This may be particularly true when the avatar of the instant message recipient is displayed on a contact list. On the other hand, rendering the instant message recipient avatar after sending an instant message may result in more efficient communications.
  • The appropriate personality/personalization item set for a buddy is sent to the buddy when the buddy communicates with the instant message sender through the instant messaging client program. For example, in an implementation that supports global personalization items, group personalization items, and personal personalization items, a personal personalization item is sent to the buddy if set; otherwise, a group personalization item is sent, if set. If neither a personal nor a group personalization item is set, then the global personalization item is sent. As another example, in an implementation that supports global personalization items and group personalization items, the group personalization item for the group to which the buddy belongs is sent, if set; otherwise, the global personalization item is sent. In an implementation that only supports group personalization items, the group personalization item for the group to which the buddy belongs is sent to the buddy. An instant message session between the instant message sender and another instant message recipient also may be initiated (step 1130) by either the instant message sender or the second instant message recipient.
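The per-buddy fallback described above (personal item, then group item, then global item) can be sketched as follows; the dictionary-based lookup and the example names are illustrative assumptions rather than the patent's actual data structures:

```python
def item_to_send(buddy, buddy_groups, personal_items, group_items, global_item):
    """Pick the personalization item to send to a buddy: the personal item
    if set, otherwise the item for the buddy's group if set, otherwise the
    global item."""
    if buddy in personal_items:
        return personal_items[buddy]
    group = buddy_groups.get(buddy)
    if group in group_items:
        return group_items[group]
    return global_item

# Example: Bob has a personal item, Alice falls back to her group's item,
# and Carol (no personal or group item set) receives the global item.
buddy_groups = {"alice": "Co-workers", "carol": "Family"}
personal_items = {"bob": "pirate avatar"}
group_items = {"Co-workers": "suit avatar"}
```

The same function covers the simpler implementations described above by leaving the unsupported dictionaries empty.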
  • A second instant message user interface is rendered to the second instant message recipient, configured to project the personality, including the avatar, assigned to the second instant message recipient by the instant message sender (step 1135), similar to the user interface illustrated by FIG. 1.
  • The personality may be projected in a similar manner to that described above with respect to step 1125.
  • The personality and avatar projected to the second instant message recipient may differ from the personality and avatar projected to the first instant message recipient described above in step 1125.
  • An exemplary process 1400 enables an instant message sender to change a personality assigned to an instant message recipient. In process 1400, a user selection of a new online persona, including an avatar, to be assigned to the instant message recipient is received (step 1405).
  • The change may be received through an instant message chooser 1200, such as that discussed above with respect to FIG. 12, and may include choosing self-expression items and/or features and functionality using such an interface, or may include "snagging" an online persona or an avatar of the buddy using such an interface.
  • Snagging an avatar refers to the appropriation by the instant message sender of one or more personalization items, such as the avatar, used by the instant message recipient.
  • All personalization items in the online persona of the instant message recipient are appropriated by the instant message sender when "snagging" an online persona.
  • The updated user interface for that instant message recipient is rendered based on the newly selected personality (step 1410).
  • FIG. 15 illustrates an example process 1500 for modifying the appearance, or the behavior, of an avatar associated with an instant message sender to communicate an out-of-band message to an instant message recipient.
  • The process may be performed by an instant messaging system, such as communications systems 1600, 1700, and 1800 described with respect to FIGS. 16, 17, and 18, respectively.
  • An out-of-band message refers to sending a message that communicates context out-of-band - that is, conveying information independent of information conveyed directly through the text of the instant message itself sent to the recipient.
  • The recipient views the appearance and behavior of the avatar to receive information that is not directly or explicitly conveyed in the instant message itself.
  • For example, an out-of-band communication may include information about the sender's setting, environment, activity or mood, which is not communicated as part of a text message exchanged by a sender and a recipient.
  • The process 1500 begins with the instant messaging system monitoring the communications environment and sender's environment for an out-of-band communications indicator (step 1510).
  • The indicator may be an indicator of the sender's setting, environment, activity, or mood that is not expressly conveyed in instant messages sent by the sender.
  • For example, the out-of-band indicator may be an indication of time and date of the sender's location, which may be obtained from a clock application associated with the instant messaging system or with the sender's computer.
  • The indicator may be an indication of the sender's physical location.
  • The indicator may be an indication of weather conditions of the sender's location, which may be obtained from a weather reporting service, such as a web site that provides weather information for geographic locations.
  • The indicator may indicate the activities of the sender that take place at, or near, the time when an instant message is sent.
  • For example, the indicator may be determined from other applications on the sender's computer that are active at, or near, the time that an instant message is sent.
  • The system may detect that the sender is using a media-playing application to play music, so the avatar associated with the sender may appear to be wearing headphones to reflect that the sender is listening to music.
  • Similarly, the system may detect that the sender is working with a calculator application, so the avatar may appear to be wearing glasses to reflect that the sender is working.
  • The activities of the sender also may be monitored through use of a camera focused on the sender.
  • Visual information taken from the camera may be used to determine the activities and mood of the sender.
  • For example, the location of points on the face of the sender may be determined from the visual information taken from the camera.
  • The position and motion of the facial points may be reflected in the avatar associated with the sender. Therefore, if the sender were, for example, to smile, then the avatar also would smile.
  • The indicator of the sender's mood also may come from another device that is operable to determine the sender's mood and send an indication of mood to the sender's computer.
  • For example, the sender may be wearing a device that monitors heart rate and determines the sender's mood from the heart rate.
  • The device may conclude that the sender is agitated or excited when an elevated heart rate is detected.
  • The device may send the indication of the sender's mood to the sender's computer for use with the sender's avatar.
  • The instant messaging system makes a determination as to whether an out-of-band communications indicator has been detected (step 1520).
  • When an indicator is detected, the instant messaging system determines whether the avatar must be modified, customized, or animated to reflect the detected out-of-band communications indicator (step 1530); meanwhile or otherwise, the instant messaging system continues to monitor for out-of-band communications indicators (step 1510).
  • To make this determination, the instant messaging system may use a data table, list or file that includes out-of-band communications indicators and an associated action to be taken for each out-of-band communications indicator. Action may not be required for each out-of-band communications indicator detected. For example, action may only be required for some out-of-band communications indicators when an indicator has changed from a previous indicator setting.
  • The instant messaging system may periodically monitor the clock application to determine whether the setting associated with the sender is daytime or nighttime. Once the instant messaging system has taken action based on detecting an out-of-band communications indicator having a nighttime setting, the instant messaging system need not take action based on the detection of a subsequent nighttime setting indicator. The instant messaging system only takes action based on the nighttime setting after receiving an intervening out-of-band communications indicator for a daytime setting. When action is required (step 1540), the appearance and/or behavior of the avatar is modified in response to the out-of-band communications indicator (step 1550).
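The edge-triggered handling just described (act only when an indicator changes from its previous setting, as in the daytime/nighttime example) can be sketched as follows; the indicator names and action strings are illustrative assumptions:

```python
class IndicatorMonitor:
    """Track an out-of-band indicator and report an action only when the
    indicator differs from its previously observed setting."""

    def __init__(self, actions):
        self.actions = actions   # indicator setting -> avatar action
        self.last = None         # previously observed setting

    def observe(self, setting):
        """Return the action for a changed indicator, or None when the
        indicator matches the previous setting (no action required)."""
        if setting == self.last:
            return None
        self.last = setting
        return self.actions.get(setting)

monitor = IndicatorMonitor({
    "nighttime": "dress avatar in pajamas",
    "daytime": "dress avatar in daytime clothes",
})
```

A repeated nighttime reading produces no action; only an intervening daytime reading re-arms the nighttime action, matching the behavior described above.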
  • For example, when an out-of-band communications indicator shows that the sender is sending instant messages at night, the appearance of the avatar is modified to be dressed in pajamas.
  • When the indicator shows that the sender is sending instant messages during a holiday period, the avatar may be dressed in a manner illustrative of the holiday.
  • The avatar may be dressed as Santa Claus during December, a pumpkin near Halloween, or Uncle Sam during early July.
  • When the out-of-band indicator shows that the sender is at the office, the avatar may be dressed in business attire, such as a suit and a tie.
  • The appearance of the avatar also may reflect the weather or general climate of the geographic location of the sender.
  • When the out-of-band communications indicator shows that it is raining at the location of the sender, the wallpaper of the avatar may be modified to include falling raindrops or display an open umbrella, and/or the avatar may appear to wear a rain hat.
  • When the out-of-band communications indicator shows that the sender is listening to music, the appearance of the avatar may be changed to show the avatar wearing headphones. Additionally or alternatively, the appearance of the avatar may be changed based on the type of music to which the sender is listening.
  • When the indicator indicates that the sender is working (at the sender's work location or at another location), the avatar may appear in business attire, such as wearing a suit and a tie.
  • Different out-of-band communications indicators may trigger the same appearance of the avatar.
  • For example, both the out-of-band communications indicator of the sender being located at work and the out-of-band communications indicator of the sender performing a work activity cause the avatar to appear to be wearing a suit and tie.
  • The mood of the sender also may be so indicated, in which case the appearance of the avatar may be changed to reflect the indicated mood.
  • For example, when the sender is sad, the avatar may be modified to reflect the sad state of the sender, such as by animating the avatar to frown or cry.
  • Similarly, a frazzled, busy or pressed mood may be detected and the avatar animated to communicate such an emotional state.
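The indicator-to-appearance pairings in the examples above can be collected into the kind of data table mentioned earlier (step 1530). The keys and values here are illustrative assumptions drawn from the text, not the patent's actual table:

```python
# Hypothetical lookup table mapping detected out-of-band indicators to
# avatar modifications; entries echo the examples in the description.
APPEARANCE_TABLE = {
    "nighttime": "pajamas",
    "holiday:december": "Santa Claus outfit",
    "at_office": "suit and tie",
    "work_activity": "suit and tie",   # different indicators, same appearance
    "raining": "rain hat and umbrella wallpaper",
    "listening_to_music": "headphones",
    "mood:sad": "frown or cry animation",
}

def avatar_modification(indicator):
    """Return the avatar modification for an indicator, or None when the
    table requires no action for that indicator."""
    return APPEARANCE_TABLE.get(indicator)
```

Note that two distinct indicators ("at_office" and "work_activity") map to the same appearance, as described above.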
  • In one implementation, the updated avatar, or an indication that the avatar has been changed, is provided in association with the next instant message sent by the sender; however, this is not necessarily so in every implementation. In some implementations, a change in the avatar may be communicated to the recipient independently of the sending of a communication. Additionally or alternatively, when a buddy list of the instant message user interface includes a display of a sender's avatar, the change of the avatar appearance may be communicated to each buddy list that includes the sender. Thus, the recipient is able to perceive the updated avatar, with the behavior and/or appearance providing an out-of-band communication to the recipient.
  • FIG. 16 illustrates a communications system 1600 that includes an instant message sender system 1605 capable of communicating with an instant message host system 1610 through a communication link 1615.
  • the communications system 1600 also includes an instant message recipient system 1620 capable of communicating with the instant message host system 1610 through the communication link 1615.
  • A user of the instant message sender system 1605 is capable of exchanging communications with a user of the instant message recipient system 1620.
  • the communications system 1600 is capable of animating avatars for use in self-expression by an instant message sender.
  • Any of the instant message sender system 1605, the instant message recipient system 1620, or the instant message host system 1610 may include one or more general-purpose computers, one or more special-purpose computers (e.g., devices specifically programmed to communicate with each other), or a combination of one or more general-purpose computers and one or more special-purpose computers.
  • The instant message sender system 1605 or the instant message recipient system 1620 may be a personal computer or other type of personal computing device, such as a personal digital assistant or a mobile communications device.
  • For example, the instant message sender system 1605 and/or the instant message recipient system 1620 may be a mobile telephone that is capable of receiving instant messages.
  • The instant message sender system 1605, the instant message recipient system 1620 and the instant message host system 1610 may be arranged to operate within or in concert with one or more other systems, such as, for example, one or more LANs ("Local Area Networks") and/or one or more WANs ("Wide Area Networks").
  • The communications link 1615 typically includes a delivery network (not shown) that provides direct or indirect communication between the instant message sender system 1605 and the instant message host system 1610, irrespective of physical separation.
  • Examples of a delivery network include the Internet, the World Wide Web, WANs, LANs, analog or digital wired and wireless telephone networks (e.g., Public Switched Telephone Network (PSTN), Integrated Services Digital Network (ISDN), and various implementations of a Digital Subscriber Line (DSL)), radio, television, cable, or satellite systems, and other delivery mechanisms for carrying data.
  • The communications link 1615 may include communication pathways (not shown) that enable communications through the one or more delivery networks described above. Each of the communication pathways may include, for example, a wired, wireless, cable or satellite communication pathway.
  • The instant message host system 1610 may support instant message services irrespective of an instant message sender's network or Internet access. Thus, the instant message host system 1610 may allow users to send and receive instant messages, regardless of whether they have access to any particular Internet service provider (ISP).
  • The instant message host system 1610 also may support other services, including, for example, an account management service, a directory service, and a chat service.
  • The instant message host system 1610 has an architecture that enables the devices (e.g., servers) within the instant message host system 1610 to communicate with each other. To transfer data, the instant message host system 1610 employs one or more standard or proprietary instant message protocols.
  • To access the instant message host system 1610 to begin an instant message session in the implementation of FIG. 16, the instant message sender system 1605 establishes a connection to the instant message host system 1610 over the communication link 1615. Once a connection to the instant message host system 1610 has been established, the instant message sender system 1605 may directly or indirectly transmit data to and access content from the instant message host system 1610. By accessing the instant message host system 1610, an instant message sender can use an instant message client application located on the instant message sender system 1605 to view whether particular users are online, view whether users may receive instant messages, exchange instant messages with particular instant message recipients, participate in group chat rooms, trade files such as pictures, invitations or documents, find other instant message recipients with similar interests, get customized information such as news and stock quotes, and search the Web.
  • The instant message recipient system 1620 may be similarly manipulated to establish a contemporaneous connection with the instant message host system 1610.
  • The instant message sender may view or perceive an avatar and/or other aspects of an online persona associated with the instant message sender prior to engaging in communications with an instant message recipient.
  • An instant message recipient-selected personality, such as an avatar chosen by the instant message recipient, may be perceivable through the buddy list itself prior to engaging in communications.
  • Other aspects of a selected personality chosen by an instant message recipient may be made perceivable upon opening of a communication window by the instant message sender for a particular instant message recipient but prior to initiation of communications.
  • Animations of an avatar associated with the instant message sender may be viewable only in a communication window, such as the user interface 100 of FIG. 1.
  • In one implementation, the instant messages sent between instant message sender system 1605 and instant message recipient system 1620 are routed through the instant message host system 1610.
  • In another implementation, the instant messages sent between instant message sender system 1605 and instant message recipient system 1620 are routed through a third party server (not shown), and, in some cases, are also routed through the instant message host system 1610.
  • In yet another implementation, the instant messages are sent directly between instant message sender system 1605 and instant message recipient system 1620.
  • The processes described above may be implemented using communications system 1600.
  • One or more ofthe processes may be implemented in a client/host context, a standalone or offline client context, or a combination thereof.
  • Some functions of one or more of the processes may be performed entirely by the instant message sender system 1605, other functions may be performed by host system 1610, or the collective operation of the instant message sender system 1605 and the host system 1610.
  • The avatar of an instant message sender may be respectively selected and rendered by the standalone/offline device, and other aspects of the online persona of the instant message sender may be accessed or updated through a remote device in a non-client/host environment such as, for example, a LAN server serving an end user or a mainframe serving a terminal device.
  • FIG. 17 illustrates a communications system 1700 that includes an instant message sender system 1605, an instant message host system 1610, a communication link 1615, and an instant message recipient 1620.
  • System 1700 illustrates another possible implementation of the communications system 1600 of FIG. 16 that is used for animating avatars used for self-expression by an instant message sender.
  • The instant message host system 1610 includes a login server 1770 for enabling access by instant message senders and routing communications between the instant message sender system 1605 and other elements of the instant message host system 1610.
  • the instant message host system 1610 also includes an instant message server 1790.
  • The instant message sender system 1605 and the instant message recipient system 1620 may include communication software, such as, for example, an online service provider client application and/or an instant message client application.
  • The instant message sender system 1605 establishes a connection to the login server 1770 in order to access the instant message host system 1610 and begin an instant message session.
  • The login server 1770 typically determines whether the particular instant message sender is authorized to access the instant message host system 1610 by verifying the instant message sender's identification and password. If the instant message sender is authorized to access the instant message host system 1610, the login server 1770 usually employs a hashing technique on the instant message sender's screen name to identify a particular instant message server 1790 within the instant message host system 1610 for use during the instant message sender's session.
  • The login server 1770 provides the instant message sender (e.g., instant message sender system 1605) with the Internet protocol ("IP") address of the instant message server 1790, gives the instant message sender system 1605 an encrypted key, and breaks the connection.
  • the instant message sender system 1605 then uses the IP address to establish a connection to the particular instant message server 1790 through the communications link 1615, and obtains access to the instant message server 1790 using the encrypted key. Typically, the instant message sender system 1605 will be able to establish an open TCP connection to the instant message server 1790.
  • the instant message recipient system 1620 establishes a connection to the instant message host system 1610 in a similar manner.
  • the instant message host system 1610 also includes a user profile server (not shown) connected to a database (not shown) for storing large amounts of user profile data.
  • the user profile server may be used to enter, retrieve, edit, manipulate, or otherwise process user profile data.
  • an instant message sender's profile data includes, for example, the instant message sender's screen name, buddy list, identified interests, and geographic location.
  • the instant message sender's profile data may also include self-expression items selected by the instant message sender.
  • the instant message sender may enter, edit and/or delete profile data using an installed instant message client application on the instant message sender system 1605 to interact with the user profile server.
  • the instant message sender does not have to reenter or update such information in the event that the instant message sender accesses the instant message host system 1610 using a new or different instant message sender system 1605. Accordingly, when an instant message sender accesses the instant message host system 1610, the instant message server can instruct the user profile server to retrieve the instant message sender's profile data from the database and to provide, for example, the instant message sender's self-expression items and buddy list to the instant message server. Alternatively, user profile data may be saved locally on the instant message sender system 1605.
  • FIG. 18 illustrates another example communications system 1800 capable of exchanging communications between users that project avatars for self-expression.
  • the communications system 1800 includes an instant message sender system 1605, an instant message host system 1610, a communications link 1615 and an instant message recipient system 1620.
  • the host system 1610 includes instant messaging server software 1832 that routes communications between the instant message sender system 1605 and the instant message recipient system 1620.
  • the instant messaging server software 1832 may make use of user profile data 1834.
  • the user profile data 1834 includes indications of self-expression items selected by an instant message sender.
  • the user profile data 1834 also includes associations 1834a of avatar models with users (e.g., instant message senders).
  • the user profile data 1834 may be stored, for example, in a database or another type of data collection, such as a series of extensible mark-up language (XML) files.
  • some portions of the user profile data 1834 may be stored in a database while other portions, such as associations 1834a of avatar models with users, may be stored in an XML file.
  • an example of user profile data 1834 appears in the table below.
  • the user profile data includes a screen name to uniquely identify the user for whom the user profile data applies, a password for signing-on to the instant message service, an avatar associated with the user, and an optional online persona.
  • a user may have multiple online personas, each associated with the same or a different avatar.
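The profile record just described can be sketched as a simple in-memory structure. The field names (screen_name, password, avatar, personas) and values are illustrative placeholders, not taken from the patent's table.

```python
# A hypothetical user profile record: screen name, password, default avatar,
# and multiple online personas, each associated with the same or a different avatar.
profile = {
    "screen_name": "ChattingChuck",   # uniquely identifies the user
    "password": "secret",             # for signing on to the service
    "avatar": "fish",                 # avatar associated with the user
    "personas": {
        "Work": {"avatar": "fish"},
        "Casual": {"avatar": "dragon"},
    },
}

def avatar_for_persona(profile, persona=None):
    """Return the avatar projected for a persona, falling back to the default."""
    if persona and persona in profile["personas"]:
        return profile["personas"][persona]["avatar"]
    return profile["avatar"]
```

Keeping the persona-to-avatar mapping inside the profile record is one way to let a single user project different avatars to different recipients, as described later in the summary.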
  • the host system 1610 also includes an avatar model repository 1835 in which definitions of avatars that may be used in the instant message service are stored. In this implementation, an avatar definition includes an avatar model file, an avatar expression file for storing instructions to control the animation of the avatar, and a wallpaper file.
  • the avatar model repository 1835 includes avatar model files 1836, avatar expression files 1837 and avatar wallpaper files 1838.
  • the avatar model files 1836 define the appearance and animations of each of the avatars included in the avatar model repository 1835.
  • Each of the avatar model files 1836 defines the mesh, texture, lighting, sounds, and animations used to render an avatar.
  • the mesh of a model file defines the form of the avatar, and the texture defines the image that covers the mesh.
  • the mesh may be represented as a wire structure composed of a multitude of polygons that may be geometrically transformed to enable the display of an avatar to give the illusion of motion.
  • lighting information of an avatar model file is in the form of a light map that portrays the effect of a light source on the avatar.
  • the avatar model file also includes multiple animation identifiers. Each animation identifier identifies a particular animation that may be played for the avatar. For example, each animation identifier may identify one or more morph targets to describe display changes to transform the mesh of an avatar and display changes in the camera perspective used to display the avatar.
  • When an instant message user projects an avatar self-expression, it may be desirable for facial animations to use a larger number of blend shapes, which may result in an avatar that, when rendered, appears more expressive.
  • a blend shape defines a portion of the avatar that may be animated and, in general, the more blend shapes that are defined for an animation model, the more expressive the image rendered from the animation model may appear.
  • information to define an avatar may be stored in multiple avatar files that may be arranged in a hierarchical structure, such as a directory structure.
  • the association between a user and an avatar may be made through an association of the user with the root file in a directory of model files for the avatar.
  • an avatar model file may include all possible appearances of an avatar, including different features and props that are available for user-customization.
  • user preferences for the appearance of the user's avatar include indications of which portions of the avatar model are to be displayed, and flags or other indications for each optional appearance feature or prop may be set to indicate whether the feature or prop is to be displayed.
  • an avatar model may be configured to display sunglasses, reading glasses, short hair and long hair. When a user configures the avatar to wear sunglasses and have long hair, the sunglasses and long hair features are turned on, the reading glasses and short hair features are turned off, and subsequent renderings of the avatar display the avatar having long hair and sunglasses.
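The feature-flag scheme above can be sketched briefly: the model defines every optional feature and prop, and per-user flags select which of them are rendered. The feature names below are the ones used in the example; the flag representation is an assumption.

```python
# All optional appearance features and props defined by a hypothetical avatar model.
MODEL_FEATURES = ["sunglasses", "reading_glasses", "short_hair", "long_hair"]

def rendered_features(flags):
    """Return the subset of model features whose flag is turned on.

    Features with no flag set default to off, so only explicitly enabled
    features and props appear in the rendering.
    """
    return [f for f in MODEL_FEATURES if flags.get(f, False)]

# The user configures the avatar to wear sunglasses and have long hair.
user_flags = {"sunglasses": True, "long_hair": True,
              "reading_glasses": False, "short_hair": False}
```

Because the model file already contains every possible appearance, customization only toggles flags; no new model data needs to be downloaded when the user changes props.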
  • the avatar model repository 1835 also includes avatar expression files 1837.
  • Each of the avatar expression files 1837 defines triggers that cause animations in the avatars.
  • each of the avatar expression files 1837 may define the text triggers that cause an animation when the text trigger is identified in an instant message, as previously described with respect to FIGS. 3 and 4.
  • An avatar expression file also may store associations between out-of-band communication indicators and animations that are played when a particular out-of-band communication indicator is detected.
  • One example of a portion of an avatar expression file is depicted in Table 2 below.
  • the association between a particular animation and a particular animation identifier is indirectly determined for a particular trigger or out-of-band communication indicator.
  • a particular trigger or out-of-band communication indicator may be associated with a type of animation (such as a smile, gone away, or sleep), as illustrated in Table 2.
  • a type of animation also may be associated with a particular animation identifier included in a particular avatar model file, as illustrated in Table 3 below.
  • the type of animation is identified, the animation identifier associated with the identified type of animation is determined, and the animation identified by the animation identifier is played.
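The two-level lookup just described can be sketched directly: a trigger maps to an animation *type* (in the spirit of Table 2), and a per-avatar table maps that type to a concrete animation identifier (in the spirit of Table 3). All table contents below are illustrative placeholders, not the patent's actual tables.

```python
# Triggers mapped to animation types (cf. Table 2).
TRIGGER_TO_TYPE = {
    ":)": "smile",
    "brb": "gone_away",
    "zzz": "sleep",
}

# Per-avatar mapping of animation type to animation identifier (cf. Table 3).
TYPE_TO_IDENTIFIER = {
    "fish": {"smile": 1, "gone_away": 2, "sleep": 3},
    "dragon": {"smile": 4, "gone_away": 5, "sleep": 6},
}

def animation_for_trigger(avatar, trigger):
    """Resolve a text trigger to the animation identifier to play.

    First the type of animation is identified from the trigger, then the
    identifier associated with that type in the avatar's model is returned.
    """
    animation_type = TRIGGER_TO_TYPE.get(trigger)
    if animation_type is None:
        return None  # this trigger does not animate the avatar
    return TYPE_TO_IDENTIFIER[avatar].get(animation_type)
```

The indirection lets every avatar share one trigger table while each avatar model supplies its own concrete animations for the common types.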
  • Other computer animation and programming techniques also may be used.
  • each avatar may use the same animation identifier for a particular animation type rather than including the avatar name shown in the table.
  • the association of animation types and animation identifiers may be stored separately for each avatar.
  • the avatar expression files 1837 also include information to define the way that an avatar responds to an animation of another avatar. In one implementation, an avatar expression file includes pairs of animation identifiers. One of the animation identifiers in each pair identifies a type of animation that, when the type of animation is played for one avatar, triggers an animation that is identified by the other animation identifier in the pair in another avatar. In this manner, the avatar expression file may define an animation played for an instant message recipient's avatar in response to an animation played by an instant message sender's avatar.
  • the avatar expression files 1837 may include XML files having elements for defining the text triggers for each of the animations of the corresponding avatar and elements for defining the animations that are played in response to animations seen from other avatars.
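An XML expression file of the kind just described might look like the snippet below; the element and attribute names are invented for illustration, since the actual file format is not specified here.

```python
import xml.etree.ElementTree as ET

# Hypothetical expression file: trigger elements for text triggers, and
# reaction elements for animations played in response to another avatar.
EXPRESSION_XML = """
<expressions avatar="fish">
  <trigger text="lol" animation="laugh"/>
  <trigger text="brb" animation="gone_away"/>
  <reaction seen="wave" play="wave_back"/>
</expressions>
"""

root = ET.fromstring(EXPRESSION_XML)
# Text triggers for the corresponding avatar's animations.
triggers = {t.get("text"): t.get("animation") for t in root.findall("trigger")}
# Animations played in response to animations seen from other avatars.
reactions = {r.get("seen"): r.get("play") for r in root.findall("reaction")}
```

Parsing the file into two small lookup tables keeps the client's animation logic independent of how any particular avatar's expression file is authored.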
  • the avatar model repository 1835 also includes avatar wallpaper files 1838 that define the wallpaper over which an avatar is drawn.
  • the wallpaper may be defined using the same or different type of file structure as the avatar model files.
  • an avatar model file may be defined as an animation model file that is generated and playable using animation software from Viewpoint Corporation of New York, New York, whereas the wallpaper files may be in the form of a Macromedia Flash file that is generated and playable using animation software available from Macromedia, Inc. of San Francisco, California.
  • the avatar wallpaper files 1838 also may include one or more triggers that are associated with the wallpaper animation.
  • Each of the instant message sender system 1605 and the instant message recipient system 1620 includes an instant messaging communication application 1807 or 1827 that is capable of exchanging instant messages over the communications link 1615 with the instant message host system 1610.
  • the instant messaging communication application 1807 or 1827 also may be referred to as an instant messaging client.
  • Each of the instant message sender system 1605 and the instant message recipient system 1620 also includes avatar data 1808 or 1828.
  • the avatar data 1808 or 1828 include avatar model files 1808a or 1828a, avatar expression files 1808b or 1828b, and avatar wallpaper files 1808c or 1828c for the avatars that are capable of being rendered by the instant message sender system 1605 or the instant message recipient system 1620, respectively.
  • the avatar data 1808 or 1828 may be stored in persistent storage, transient storage, or stored using a combination of persistent and transient storage. When all or some of the avatar data 1808 or 1828 is stored in persistent storage, it may be useful to associate a predetermined date on which some or all of the avatar data 1808 or 1828 is to be deleted from the instant message sender system 1605 or the instant message recipient system 1620, respectively. In this manner, avatar data may be removed from the instant message sender system 1605 or the instant message recipient system 1620 after the data has resided on the instant message sender system 1605 or the instant message recipient system 1620 for a predetermined period of time and presumably is no longer needed. This may help reduce the amount of storage space used for instant messaging on the instant message sender system 1605 or the instant message recipient system 1620.
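The deletion-date idea above can be sketched as a small purge routine: each cached avatar data file carries a predetermined date after which it is removed from the local system. The record layout and file names are assumptions for illustration.

```python
from datetime import date

# Hypothetical local cache of avatar data, each entry tagged with the
# predetermined date after which it may be deleted.
avatar_cache = [
    {"file": "fish.model", "delete_after": date(2004, 1, 1)},
    {"file": "dragon.model", "delete_after": date(2004, 6, 1)},
]

def purge_expired(cache, today):
    """Keep only cached avatar data whose deletion date has not yet passed."""
    return [entry for entry in cache if entry["delete_after"] >= today]
```

Running such a purge on startup (or periodically) bounds the storage used for cached avatars without requiring the host system to track what each client has stored.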
  • the avatar data 1808 or 1828 is installed on the instant message sender system 1605 or the instant message recipient system 1620, respectively, with the instant messaging client software installed on the instant message sender system 1605 or the instant message recipient system 1620.
  • the avatar data 1808 or 1828 is transmitted to the instant message sender system 1605 or the instant message recipient system 1620, respectively, from the avatar model repository 1835 of the instant messaging host system 1610.
  • the avatar data 1808 or 1828 is copied from a source unrelated to instant messaging and stored for use as instant messaging avatars on the instant message sender system 1605 or the instant message recipient system 1620, respectively.
  • the avatar data 1808 or 1828 is sent to the instant message sender system 1605 or the instant message recipient system 1620, respectively, with or incident to instant messages sent to the instant message sender system 1605 or the instant message recipient system 1620.
  • the avatar data sent with an instant message corresponds to the instant message sender that sent the message.
  • the avatar expression files 1808b or 1828b are used to determine when an avatar is to be rendered on the instant message sender system 1605 or the instant message recipient 1620, respectively.
  • one of the avatar model files 1808a or 1828a is displayed on the two-dimensional display of the instant messaging system 1605 or 1620 by an avatar model player 1809 or 1829, respectively.
  • the avatar model player 1809 or 1829 is an animation player by Viewpoint Corporation.
  • the processor of the instant messaging system 1605 or 1620 calls the avatar model player 1809 or 1829 and identifies an animation included in one of the avatar model files 1808a or 1828a.
  • the animation is identified by an animation identifier in the avatar model file.
  • the avatar model player 1809 or 1829 then accesses the avatar model file and plays the identified animation.
  • multiple animations may be played based on a single trigger or out-of-band communications indicator. This may occur, for example, when one avatar reacts to an animation of another avatar that is animated based on a text trigger, as described previously with respect to FIG. 6.
  • four animations may be separately initiated based on a text trigger in one instant message.
  • An instant message sender projecting a self-expressive avatar uses the instant message sender system 1605 to send a text message to an instant message recipient using the instant message recipient system 1620.
  • the instant message recipient also is projecting a self-expressive avatar.
  • the display of the instant message sender system 1605 shows an instant message user interface, such as user interface 100 of FIG. 1, as does the display of the instant message recipient system 1620.
  • the sender avatar is shown on both the instant message sender system 1605 and the instant message recipient system 1620, as is the recipient avatar.
  • the instant message sent from the instant message sender system 1605 includes a text trigger that causes the animation of the sender avatar on the instant message sender system 1605 and the sender avatar on the instant message recipient system 1620.
  • an instant messaging user is permitted to customize one or more of the animation triggers or out-of-band communications indicators for avatar animations, the wallpaper displayed for an avatar, the triggers or out-of-band communications indicators for animating objects of the wallpaper, and the appearance of the avatar. In one implementation, a copy of an avatar model file, an expression file or a wallpaper file is made, and the modifications of the user are stored in the copy of the avatar model file, expression file or wallpaper file. The copy that includes the modifications is then associated with the user.
  • the changes - that is, the differences between the avatar before the modifications and the avatar after the modifications are made - are stored.
  • different versions ofthe same avatar may be stored and associated with a user. This may enable a user to modify an avatar, use the modified avatar for a period of time, and then return to using a previous version ofthe avatar that does not include the modification.
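The version-storage idea above can be sketched as a base configuration plus a list of per-version overrides: each saved version records only the differences from the base avatar, and any earlier version can be reconstructed. The representation is an assumption, not the patent's storage format.

```python
import copy

# Hypothetical base avatar configuration and a list of saved versions,
# where each version stores only the changes relative to the base.
base_avatar = {"hair": "short", "glasses": None}
versions = [{}]  # version 0: the unmodified avatar

def save_version(changes):
    """Record the differences from the base avatar as a new version."""
    versions.append(dict(changes))
    return len(versions) - 1

def render_version(n):
    """Reconstruct the avatar as it appeared at version n."""
    avatar = copy.deepcopy(base_avatar)
    avatar.update(versions[n])
    return avatar
```

Storing only differences keeps every version cheap to retain, so a user can use a modified avatar for a while and later return to a previous version.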
  • the avatars from which a user may choose may be limited by the instant message service provider. This may be referred to as a closed implementation or a locked-down implementation. In such an implementation, the animations and triggers associated with each avatar within the closed set of avatars may be preconfigured.
  • In some closed implementations, the user may customize the animations and/or triggers of a chosen avatar. For example, a user may include a favorite video clip as an animation of an avatar, and the avatar may be configured to play the video clip after certain text triggers appear in the messages sent by the user. In other closed implementations, the user is also prevented from adding animations to an avatar.
  • the set of avatars from which a user may choose is not limited by the instant message service provider, and the user may use an avatar other than an avatar provided by the instant message service provider.
  • This may be referred to as an open implementation or an unlocked implementation.
  • an avatar usable in an instant message service may be created by a user using animation software provided by the instant message service provider, off-the-shelf computer animation software, or software tools provided by a third party that are specialized for creating avatars compatible with one or more instant message services.
  • an instant message service provider may limit the selection by users who are minors to a set of predetermined avatars provided by the instant message service provider while permitting users who are adults to use an avatar other than an avatar available from the instant message service provider.
  • the avatars from which a user may select may be limited based on a user characteristic, such as age. As illustrated in Table 4 below and using the avatars shown in FIG. 8 only as an example, a user who is under the age of 10 may be limited to one group of avatars. A user who is between 10 and 18 may be limited to a different group of avatars, some of which are the same as the avatars selectable by users under the age of 10. A user who is 18 or older may select from any avatar available from the instant message provider service.
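The age-based restriction described above (in the spirit of Table 4) can be sketched as a lookup from age to the group of selectable avatars; the group contents and boundaries below are illustrative placeholders.

```python
# Hypothetical avatar groups: younger users choose from smaller, partly
# overlapping groups, while adults may use any avatar.
UNDER_10 = {"fish", "frog", "dog"}
AGE_10_TO_17 = {"fish", "frog", "dragon", "robot"}
ALL_AVATARS = UNDER_10 | AGE_10_TO_17 | {"wizard", "alien"}

def selectable_avatars(age):
    """Return the avatars a user of the given age may select."""
    if age < 10:
        return UNDER_10
    if age < 18:
        return AGE_10_TO_17
    return ALL_AVATARS
```

Overlapping sets allow some avatars (here, "fish" and "frog") to be selectable across age groups while others remain restricted to adults.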
  • Instant messaging programs typically allow instant message senders to communicate in real-time with each other in a variety of ways. For example, many instant messaging programs allow instant message senders to send text as an instant message, to transfer files, and to communicate by voice. Examples of instant messaging communication applications include AIM (America Online Instant Messenger), AOL (America Online) Buddy List and Instant Messages, which is an aspect of many client communication applications provided by AOL, Yahoo Messenger, MSN Messenger, and ICQ, among others. Although discussed above primarily with respect to instant message applications, other implementations are contemplated for providing similar functionality in platforms and online applications. For example, the techniques and concepts may be applied to an animated avatar that acts as an information assistant to convey news, weather, and other information to a user of a computer system or a computing device.
  • the examples above describe an instant messaging system that uses an instant messaging host system to facilitate the instant messaging communication between instant message senders and instant message recipients.
  • Other instant message implementations are contemplated, such as an instant message service in which instant messages are exchanged directly between an instant message sender system and an instant message recipient system.
  • although the examples above are given in an instant message context, other communications systems with similar attributes may be used.
  • multiple personalities may be used in a chat room or in e-mail communications.
  • the user interface may be a viewable interface, an audible interface, a tactile interface, or a combination of these.
  • Other implementations are within the scope of the following claims.

Abstract

A graphical user interface on a display device for using a computer to communicate using an avatar includes an instant message sender display. The instant message sender display includes a sender portion that displays a sender avatar capable of displaying multiple animations. The sender avatar is animated in response to a trigger related to content of a message sent from a sender to a recipient and may be animated to send to another user an out-of-band communication that conveys information independent of information conveyed directly in the text message sent. An avatar also may be included in one or more of multiple online personas enabled for a user in an instant messaging communications session. An avatar may be animated in response to the animation of another avatar in the same communications session.

Description

USING AVATARS TO COMMUNICATE
TECHNICAL FIELD
This description relates to projecting a graphical representation of a communications application operator (hereinafter "sender") in communications sent in a network of computers.
BACKGROUND
Online services may provide users with the ability to send and receive instant messages. Instant messages are private online conversations between two or more people who have access to an instant messaging service, who have installed communications software necessary to access and use the instant messaging service, and who each generally have access to information reflecting the online status of other users.
An instant message sender may send self-expression items to an instant message recipient. Current implementations of instant messaging self-expression enable a user to individually select self-expression settings, such as a Buddy Icon and a Buddy Wallpaper, which settings thereafter project to other users who see or interact with that person online.
SUMMARY
A graphical user interface on a display device of a computer enables communications using an avatar. The graphical user interface includes an instant message sender display. The instant message sender display has a sender portion that displays a sender avatar capable of displaying multiple animations. The instant message sender display also has a message compose area capable of displaying text included in the message sent from the sender to the recipient and communication controls. At least one communication control is operable to receive an indication that the message displayed in the message compose area is to be sent from the sender to the recipient. The sender avatar is animated in response to a trigger related to content of a message sent from a sender to a recipient.
Implementations may include one or more of the following features. For example, the instant message sender display may include a recipient portion that displays a recipient avatar and a message history area. The recipient avatar may be capable of displaying multiple animations in response to a trigger related to content of a message sent from a sender to a recipient. The message history area may be capable of displaying the content of multiple messages sent between the sender and the recipient and identifying an identity associated with the recipient. The recipient avatar is animated in response to an animation of the sender avatar.
The graphical user interface may include a contact list display for displaying potential recipients. The contact list display may indicate whether each potential recipient is available to receive a message. The potential recipients may be grouped and associated with an indication of a group identity. A potential recipient displayed on the contact list may be associated with a potential recipient avatar. The potential recipient avatar may be displayed on the contact list in association with an identity of the potential recipient. The potential recipient avatar may be animated on the contact list in response to animation of the potential recipient avatar displayed elsewhere. The animation of the potential recipient avatar on the contact list may include an animation that is substantially similar to, or different than, the animation of the potential recipient avatar displayed elsewhere. The animation of the potential recipient avatar on the contact list may include an animation that is representative of the animation of the potential recipient avatar displayed elsewhere. The graphical user interface may be a graphical user interface that is used for an instant messaging communication session. The trigger comprises a portion or all of the text of the message.
The appearance or animation of the sender avatar may indicate an environmental condition, a personality characteristic associated with the sender, an emotional state associated with the sender, a setting characteristic or an activity associated with the sender. The sender avatar may be animated in response to the passing of a predetermined amount of time during which the sender does not communicate a message to the recipient or during which the sender does not use a computing device that is used by the sender to communicate with the recipient in the communications session.
The avatar animation used as the communication conduit may include a breakout animation that involves displaying the avatar outside of the normal display space occupied by the avatar. The sender avatar may be animated to produce sounds used for verbal communication. Implementations of the techniques discussed above may include a computer program product for generating a graphical user interface, a graphical user interface configured for presentation on a display device, or a system or apparatus.
In another general aspect, communicating includes graphically representing, with an avatar capable of being animated, a first user in a communication session involving the first user and a second user. A message is communicated between the first user and the second user. The message conveys explicit information from the first user to the second user. Out-of-band information is communicated to the second user using a change in the avatar appearance or avatar animation as a communication conduit. The out-of-band communication includes a communication that is related to a context of the first user and that differs from the information conveyed in the message sent between the first user and the second user.
Implementations may include one or more of the following features. For example, the communication session may be an instant messaging communication session. The avatar may be a facial animation that does not include a body having an ear or a leg or may be a facial animation, including a neck, that does not include a body having an ear or a leg.
The out-of-band information may include information indicating an environmental condition associated with the first user. The environmental condition may include an environmental condition related to weather occurring in a geographic location near the first user. The out-of-band information may indicate a personality characteristic associated with the first user or an emotional state associated with the first user.
The out-of-band information may include information indicating a setting characteristic associated with the first user. The setting characteristic may include a characteristic related to time of day of the first user or a characteristic related to time of year. The time of year may include a holiday or a season that is one of spring, summer, fall or winter. The setting characteristic may include a characteristic associated with a work setting or a recreation setting. The recreation setting may include a beach setting, a tropical setting or a winter sport setting. The out-of-band information may include information related to a mood of the first user. The mood of the first user may be one of happy, sad or angry.
The out-of-band information may include information associated with an activity of the first user. The activity may be performed by the first user at substantially the same time that the out-of-band message is communicated from the first user to the second user. The activity may be working or listening to music. The out-of-band information may include information conveying that the first user has muted sounds associated with the avatar.
An animation of the avatar to convey the out-of-band information from the first user to the second user may be triggered based on the information conveyed in the message from the first user to the second user. The trigger may include a portion or all of the text of the message. The trigger may include an audio portion of the message. The trigger may include the passing of a predetermined amount of time during which the first user does not communicate a message to the second user or does not use a computing device that is used by the first user to communicate with the second user in the communication session.
The avatar animation used as the communication conduit may include a facial expression of the avatar, a gesture made by a hand or arm of the avatar, movement of a body of the avatar or sounds made by the avatar. At least some of the sounds may include a voice based on a voice of the first user. The avatar animation used as the communication conduit may include a breakout animation that involves displaying the avatar outside of the normal display space occupied by the avatar. A breakout animation may include telescoping, resizing, or repositioning the avatar.
The first user may be provided with multiple preconfigured avatars having associated preselected animations. The first user may be enabled to select a particular avatar to represent the user in the communications session. The first user may be persistently associated with the selected avatar to represent the first user in subsequent communication sessions.
The first user may be enabled to modify the appearance of the avatar. Enabling the first user to modify the appearance of the avatar may include enabling the first user to use a slide bar to indicate a particular modification of a particular feature of the avatar or enabling the first user to modify the appearance of the avatar to reflect a characteristic of the first user. The characteristic of the first user may be one of age, gender, hair color, eye color, or a facial feature.
Enabling the first user to modify the appearance of the avatar may include enabling the first user to modify the appearance of the avatar by adding, changing or deleting a prop displayed with the avatar. A prop may be one of eyeglasses, sunglasses, a hat, or earrings.
The first user may be enabled to modify a trigger used to cause an animation of the avatar. The trigger may include text included in the message sent from the first user to the second user.
The avatar may be animated for use as an information assistant to convey information to the first user. Use of the avatar by an application other than a communications application, including an online journal, may be enabled.
A depiction of the avatar may be displayed in a form that is substantially similar to a trading card. The trading card depiction of the avatar may include characteristics associated with the first user.
In yet another general aspect, perception of multiple online personas is enabled in an instant messaging communications session. At least two identities within a communications environment to whom messages may be directed are identified. A first persona of a user is enabled to be projected to a first of the identities while a second persona of the same user is enabled to be concurrently projected to a second of the identities. The first and second personas each include an avatar capable of being animated, and the first persona and the second persona differ.
Implementations may include one or more of the following features. For example, the first persona may differ from the second persona such that the first persona invokes a different avatar than an avatar invoked by the second persona.
The first persona may invoke a first avatar, and the second persona may invoke a second avatar. The first avatar and the second avatar may be the same avatar. An animation associated with the first avatar may be different from animations associated with the second avatar. An appearance associated with the first avatar may be different from appearances associated with the second avatar.
An avatar may be associated with multiple sounds. An avatar may be capable of being animated based on text of a message sent in the instant message communications session. An avatar also may be capable of being animated to send an out-of-band communication. The first persona may be associated with a first group of identities so that the first persona is projected in communications sessions with members of the first group of identities. The second persona may be associated with a second group of identities so that the second persona is projected in communications sessions with members of the second group of identities. A persona may be associated with the first of the identities, and a different persona may be associated with a group of the identities with which the first of the identities is associated. The first persona projected to the first of the identities may be an amalgamation of the persona associated with the first of the identities and the different persona associated with the group of the identities. The persona associated with the first of the identities may override the different persona associated with the group of the identities to the extent a conflict exists.
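The amalgamation-with-override behavior described above can be sketched as a simple merge of two persona definitions, where identity-level settings win over group-level settings on conflict. This is a minimal illustration, not the patented implementation; the setting names ("avatar", "wallpaper", "sounds") are assumptions chosen for the example.

```python
# Hypothetical sketch: amalgamating a per-identity persona with a
# per-group persona; identity-level entries override group-level
# entries to the extent a conflict exists.

def amalgamate_personas(group_persona: dict, identity_persona: dict) -> dict:
    """Merge two persona definitions; the identity's settings take
    precedence over the group's settings when both define a value."""
    merged = dict(group_persona)      # start from the group's settings
    merged.update(identity_persona)   # identity settings override on conflict
    return merged

group = {"avatar": "starfish", "wallpaper": "ocean", "sounds": "waves"}
identity = {"avatar": "robot"}        # conflicts with the group avatar

projected = amalgamate_personas(group, identity)
print(projected["avatar"])            # robot (identity overrides group)
print(projected["wallpaper"])         # ocean (inherited from the group)
```

Any settings the identity persona leaves unspecified fall through to the group persona, matching the amalgamation described in the text.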
In still another general aspect, perception of multiple online personas is enabled in an instant messaging communications session. An instant messaging application user interface for an instant messaging communications session is rendered on an instant messaging recipient system. The communications session involves at least one potential instant messaging recipient and a single potential instant messaging sender. A message is sent that includes a text message and a persona. The persona is selected among multiple possible personas associated with the instant messaging sender to be displayed by the potential instant messaging recipient when displaying the text message. The selected persona includes a collection of one or more self-expression items and a sender avatar capable of being animated. The selected persona is rendered at the potential instant messaging recipient system when rendering another portion of the message.
Implementations may include one or more of the following features. For example, the sender persona may be selected by the instant messaging sender from the multiple possible personas associated with the instant messaging sender. The persona may be rendered before or after communications are initiated by the potential instant messaging sender. The self-expression items may include one or more of a wallpaper, an emoticon, and a sound. One or more personas may be defined.
A first persona may be assigned to a first potential instant messaging recipient so that the first persona is thereafter automatically invoked and projected in an instant messaging communications session involving the first potential instant messaging recipient. A second persona may be assigned to a second potential instant messaging recipient so that the second persona is thereafter automatically invoked and projected in an instant messaging communications session involving the second potential instant messaging recipient. The second persona may be at least partially distinguishable from the first persona.
A first persona may be assigned to a first group of potential instant messaging recipients so that the first persona is thereafter automatically invoked and projected in an instant messaging communications session involving a member of the first group of potential instant messaging recipients. A second persona may be assigned to a second potential instant messaging recipient so that the second persona is thereafter automatically invoked and projected in an instant messaging communications session involving the second potential instant messaging recipient. The second persona may be at least partially distinguishable from the first persona. The use of one of the multiple personas may be disabled. Disabling the use of one of the multiple personas may be based on the instant messaging recipient. One of the multiple personas may be a work persona associated with presence of the instant messaging sender at a work location associated with the instant messaging sender. One of the multiple personas may be a home persona associated with presence of the instant messaging sender at home. A determination may be made as to whether the instant messaging sender is at home or at the work location. In response to a determination that the instant messaging sender is at home, the home persona may be selected for use in the instant messaging communications session. In response to a determination that the instant messaging sender is at the work location, the work persona may be selected for use in the instant messaging communications session.
A persona to be displayed may be selected by the potential instant messaging recipient based on time of day, day of week, or a group of potential instant messaging recipients that are associated with the potential instant messaging recipient.
At least some characteristics of a persona may be transparent to the instant messaging sender. The sender avatar may be animated to send an out-of-band communication from the instant messaging sender to the potential instant messaging recipient.
In yet another general aspect, an avatar is used to communicate. A user is represented graphically using an avatar capable of being animated. The avatar is associated with multiple animations and multiple features of appearance that represent a pattern of characteristics representing a personality of the avatar.
Implementations may include one or more of the following features. For example, the avatar may be associated with a description that identifies the personality of the avatar. The personality of the avatar may include at least some characteristics that are distinct from at least some characteristics of a personality of the user. A second user may be graphically represented with a second avatar capable of being animated. The second avatar may be associated with multiple animations and multiple features of appearance that represent a pattern of characteristics representing a personality of the second avatar. The personality of the second avatar may include at least some characteristics that are distinct from at least some characteristics of the personality of the first avatar. Communication messages may be sent between the first user and the second user.

In yet another general aspect, a first avatar is animated based on perceived animation of a second avatar. A first user is graphically represented with a first avatar capable of being animated, and a second user is graphically represented with a second avatar capable of being animated. Communication messages are sent between the first user and the second user. An indication of an animation of the first avatar is received, and the second avatar is animated in response to, and based on, the received indication of the animation.

Implementations may include one or more of the following features. For example, the indication of an animation received may be any type of animation of the first avatar or may be an indication of a particular animation of multiple possible animations of the first avatar. The first avatar may be subsequently animated in response to and based on the animation of the second avatar. The first avatar may be animated in response to a particular portion of a message sent between the first user and the second user.
The message may be sent from the first user to the second user or may be sent to the first user from the second user. The first avatar may be animated to send an out-of-band communication from the first user to the second user.

Implementations of the techniques discussed above may include a method or process, a system or apparatus, or computer software on a computer-accessible medium.
The details of one or more of the implementations are set forth in the accompanying drawings and description below. Other features will be apparent from the description and drawings, and from the claims.
DESCRIPTION OF DRAWINGS
FIGS. 1, 2 and 5 are diagrams of user interfaces for an instant messaging service capable of enabling a user to project an avatar for self-expression. FIG. 3 is a flow chart of a process for animating an avatar based on the content of an instant message. FIG. 4 is a block diagram illustrating exemplary animations of an avatar and textual triggers for each animation.
FIG. 6 is a diagram illustrating an exemplary process involving communications between two instant messaging client systems and an instant message host system, whereby an avatar of a user of one of the instant message client systems is animated based on the animation of an avatar of a user of the other of the instant message client systems.
FIG. 7 is a flow chart of a process for selecting and optionally customizing an avatar. FIG. 8 is a block diagram depicting examples of avatars capable of being projected by a user for self-expression.
FIG. 9 is a diagram of a user interface for customizing the appearance of an avatar.
FIG. 10 is a diagram of a user interface used to present a snapshot description of an avatar.
FIG. 11 A is a block diagram illustrating relationships between online personas, avatars, avatar behaviors and avatar appearances.
FIG. 11B is a flow chart of a process for using a different online personality to communicate with each of two instant message recipients. FIG. 12 is a diagram of a user interface that enables an instant message sender to select among available online personas.
FIG. 13 is a diagram of exemplary user interfaces for enabling an instant message sender to create and store an online persona that includes an avatar for self-expression. FIG. 14 is a flow chart of a process for enabling a user to change an online persona that includes an avatar for self-expression.
FIG. 15 is a flow chart of a process for using an avatar to communicate an out-of-band message to an instant message recipient.
FIGS. 16, 17 and 18 are diagrams of exemplary communications systems capable of enabling an instant message user to project an avatar for self-expression.

DETAILED DESCRIPTION
An avatar representing an instant messaging user may be animated based on the message sent between a sender and recipient. An instant messaging application interface is configured to detect entry of predetermined or user-defined character strings, and to relate those character strings to predefined animations of an avatar. The avatar representing or selected by the sender is animated in the recipient's instant messaging application interface and, optionally, in the sender's instant messaging application interface. The avatar is rendered based on an animation model including a mesh that defines, using polygons, the form of the avatar, a texture that defines an image to cover the mesh of the avatar, and a light map that defines the effect of a light source on the avatar. The animation model for the avatar includes particular geometry, including at least one thousand polygons in the underlying wire model that makes up the avatar's mesh, and at least twenty blend shapes, each of which defines a different facial expression or shape. The animation model includes multiple animations capable of being rendered for the avatar defined by the animation model, and the animations are capable of association with one or more sound effects. The animation model for the avatar may include only a face and/or a face and neck of the avatar.

An avatar representing a user in a communications session also may be used to send to another user an out-of-band communication that conveys information independent of information conveyed directly in the text message sent. The out-of-band information may be communicated using a change in the avatar appearance or avatar animation as a communication conduit. By way of example, an out-of-band communication may include information about the sender's setting, environment, activity or mood, which is not explicitly communicated as part of a text message exchanged by a sender and a recipient.
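The animation model described above (a polygon mesh, a texture, a light map, at least twenty blend shapes, and named animations with optional sound effects) can be outlined as a data structure. All class and field names below are assumptions chosen for illustration; the document does not specify an implementation.

```python
# Illustrative sketch of the avatar animation model described above.
from dataclasses import dataclass, field

@dataclass
class BlendShape:
    name: str             # e.g. "smile", "frown" (hypothetical names)
    vertex_offsets: list  # per-vertex displacements from the base mesh

@dataclass
class Animation:
    name: str                   # e.g. "laugh"
    blend_shape_sequence: list  # blend shapes played in order
    sound_effect: str = ""      # optional associated sound effect

@dataclass
class AnimationModel:
    mesh_polygons: list         # >= 1,000 polygons forming the wire model
    texture: bytes              # image covering the mesh
    light_map: bytes            # effect of a light source on the avatar
    blend_shapes: list = field(default_factory=list)  # >= 20 expressions
    animations: list = field(default_factory=list)

    def meets_geometry_constraints(self) -> bool:
        """Check the geometry stated in the description: at least one
        thousand polygons and at least twenty blend shapes."""
        return len(self.mesh_polygons) >= 1000 and len(self.blend_shapes) >= 20
```

A face-only or face-and-neck avatar would simply restrict `mesh_polygons` to those body regions; the structure is otherwise unchanged.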
A user may name and save multiple different "online personas" or "online personalities," which are groups of instant messaging self-expression settings such as, for example, avatars, Buddy Sounds, Buddy Wallpaper and Emoticons (e.g., Smileys). Then, depending on the identity with whom the user communicates, they may access and project a preselected one of their online personas in an instant messaging environment, and/or they may manually invoke and manage the online persona they project to others. Functionality and features of the instant messaging interface may differ based upon the online personas being used in the instant message conversation.

An avatar that represents a user in a communications session may be animated, without user manipulation, based on the animation of another avatar that represents another user in the same communications session. This may be referred to as an automatic response of an avatar to the behavior of another avatar.
FIG. 1 illustrates an exemplary graphical user interface 100 for an instant messaging service capable of enabling a user to project an avatar for self-expression. The user interface 100 may be viewed by a user who is an instant message sender and whose instant messaging communications program is configured to project an avatar associated with and used as an identifier for the user to one or more other users or user groups (collectively, instant message recipients). In particular, the user IMSender is an instant message sender using the user interface 100. The instant message sender projects a sender avatar 135 in an instant messaging communications session with an instant message recipient SuperBuddyFanl, who projects a recipient avatar 115. A corresponding graphical user interface (not shown) is used by the instant message recipient SuperBuddyFanl. In this manner, the sender avatar 135 is visible in each of the sender's user interface and the recipient's user interface, as is the recipient avatar 115. The instant messaging communications session may be conducted simultaneously, near-simultaneously, or serially.
The user interface (UI) 100 includes an instant message user interface 105 and an instant messaging buddy list window 170. The instant message user interface 105 has an instant message recipient portion 110 and an instant message sender portion 130. The instant message recipient portion 110 displays the recipient avatar 115 chosen by the instant message recipient with whom the instant message sender is having an instant message conversation. Similarly, the instant message sender portion 130 displays the sender avatar 135 chosen by the instant message sender. The display of the sender avatar 135 in the instant message user interface 105 enables the instant message sender to perceive the avatar being projected to the particular instant message recipient with whom the instant message sender is communicating. The avatars 135 and 115 are personalization items selectable by an instant message user for self-expression. The instant message user interface 105 includes an instant message composition area 145 for composing instant messages to be sent to the instant message recipient and a message history text box 125 for displaying a transcript of the instant message communications session with the instant message recipient. Each of the messages sent to, or received from, the instant message recipient is listed in chronological order in the message history text box 125, each with an indication of the user that sent the message as shown at 126. The message history text box 125 optionally may include a time stamp 127 for each of the messages sent.
Wallpaper may be applied to portions of the graphical user interface 100. For example, wallpaper may be applied to window portion 120 that is outside of the message history box 125 or window portion 140 that is outside of the message composition area 145. The recipient avatar 115 is displayed over, or in place of, the wallpaper applied to the window portion 120, and the wallpaper applied to the window portion 120 corresponds to the recipient avatar 115. Likewise, the sender avatar 135 is displayed over, or in place of, the wallpaper applied to the window portion 140, and the wallpaper applied to the window portion 140 corresponds to the sender avatar 135. In some implementations, a box or other type of boundary may be displayed around the avatar, as shown by boundary 157 displayed around the sender avatar 135. A different wallpaper may be applied to window portion 158 inside the boundary 157 than the wallpaper applied to the window portion 140 outside of the message composition area 145 but not within the boundary 157. The wallpaper may appear to be non-uniform and may include objects that are animated. The wallpapers applied to the window portions 120 and 140 may be personalization items selectable by an instant message user for self-expression.
The instant message user interface 105 also includes a set of feature controls 165 and a set of transmission controls 150. The feature controls 165 may control features such as encryption, conversation logging, conversation forwarding to a different communications mode, font size and color control, and spell checking, among others. The set of transmission controls 150 includes a control 160 to trigger sending of the message that was typed into the instant message composition area 145, and a control 155 for modifying the appearance or behavior of the sender avatar 135.

The instant message buddy list window 170 includes an instant message sender-selected list 175 of potential instant messaging recipients ("buddies") 180a-180g. Buddies typically are contacts who are known to the potential instant message sender (here, IMSender). In the list 175, the representations 180a-180g include text identifying the screen names of the buddies included in list 175; however, additional or alternative information may be used to represent one or more of the buddies, such as an avatar associated with the buddy, that is reduced in size and either still or animated. For example, the representation 180a includes the screen name and avatar of the instant message recipient named SuperBuddyFanl. The representations 180a-180g may provide connectivity information to the instant message sender about the buddy, such as whether the buddy is online, how long the buddy has been online, whether the buddy is away, or whether the buddy is using a mobile device.
Buddies may be grouped by an instant message sender into one or more user-defined or pre-selected groupings ("groups"). As shown, the instant message buddy list window 170 has three groups, Buddies 182, Co-Workers 184, and Family 186. SuperBuddyFanl 185a belongs to the Buddies group 182, and ChattingChuck 185c belongs to the Co-Workers group 184. When a buddy's instant message client program is able to receive communications, the representation of the buddy in the buddy list is displayed under the name or representation of the buddy group to which the buddy belongs. As shown, the potential instant messaging recipients 180a-180g are online. In contrast, when a buddy's instant message client program is not able to receive communications, the representation of the buddy in the buddy list may not be displayed under the group with which it is associated, but it may instead be displayed with representations of buddies from other groups under the heading Offline 188. All buddies included in the list 175 are displayed either under one of the groups 182, 184, or 186, or under the heading Offline 188.

As illustrated in FIG. 1, each of the sender avatar 135 and the recipient avatar 115 is a graphical image that represents a user in an instant message communications session. The sender projects the sender avatar 135 for self-expression, whereas the recipient projects the recipient avatar 115 also for self-expression. Here, each of the avatars 135 and 115 is an avatar that only includes a graphical image of a face, which may be referred to as a facial avatar or a head avatar. In other implementations, an avatar may include additional body components. By way of example, a Thanksgiving turkey avatar may include an image of a whole turkey, including a head, a neck, a body and feathers.
The sender avatar 135 may be animated in response to an instant message sent to the instant message recipient, and the recipient avatar 115 may be animated in response to an instant message sent by the instant message recipient. For example, the text of an instant message sent by the sender may trigger an animation of the sender avatar 135, and the text of an instant message sent by the instant message recipient to the sender may trigger an animation of the recipient avatar 115.
More particularly, the text of a message to be sent is specified by the sender in the message specification text box 145. The text entered in the message specification text box 145 is sent to the recipient when the sender activates the send button 160. When the send button 160 is activated, the instant message application searches the text of the message for animation triggers. When an animation trigger is identified, the sender avatar 135 is animated with an animation that is associated with the identified trigger. This process is described more fully later. In a similar manner, the text of a message sent by the instant message recipient and received by the sender is searched for animation triggers and, when found, the recipient avatar 115 is animated with an animation associated with the identified trigger. By way of example, the text of a message may include a character string "LOL," which is an acronym that stands for "laughing out loud." The character string "LOL" may trigger an animation in the sender avatar 135 or the recipient avatar 115 such that the sender avatar 135 or the recipient avatar 115 appears to be laughing.

Alternatively or additionally, the sender avatar 135 may be animated in response to an instant message sent from the instant message recipient, and the recipient avatar 115 may be animated in response to a message sent from the instant message sender. For example, the text of an instant message sent by the sender may trigger an animation of the recipient avatar 115, and the text of an instant message sent by the instant message recipient to the sender may trigger an animation of the sender avatar 135.
More particularly, the text of a message to be sent is specified by the sender in the message specification text box 145. The text entered in the message specification text box 145 is sent to the recipient when the sender activates the send button 160. When the send button 160 is activated, the instant message application searches the text of the message for animation triggers. When an animation trigger is identified, the recipient avatar 115 is animated with an animation that is associated with the identified trigger. In a similar manner, the text of a message sent by the instant message recipient and received by the sender is searched for animation triggers and, when found, the sender avatar 135 is animated with an animation associated with the identified trigger.

In addition, the sender avatar 135 or the recipient avatar 115 may be animated in direct response to a request from the sender or the recipient. Direct animation of the sender avatar 135 or the recipient avatar 115 enables use of the avatars as a means for communicating information between the sender and the recipient without an accompanying instant message. For example, the sender may perform an action that directly causes the sender avatar 135 to be animated, or the recipient may perform an action that directly causes the recipient avatar 115 to be animated. The action may include pressing a button corresponding to the animation to be played or selecting the animation to be played from a list of animations. For example, the sender may be presented with a button that inspires an animation in the sender avatar 135 and that is distinct from the send button 160. Selecting the button may cause an animation of the sender avatar 135 to be played without performing any other actions, such as sending an instant message specified in the message composition area 145.
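The trigger search described above (scan the outgoing message text for known trigger strings and play the associated animation) can be sketched minimally as follows. The trigger strings and animation names here are examples only; the "LOL" trigger is the one the description itself mentions, while the others are assumptions.

```python
# Minimal sketch of the animation-trigger search: when a message is
# sent, its text is scanned for known triggers, and the first match
# determines which avatar animation to play.

ANIMATION_TRIGGERS = {
    "lol": "laugh",   # "LOL" is the example given in the description
    ":)": "smile",    # hypothetical additional triggers
    ":(": "frown",
}

def find_animation(message_text: str):
    """Return the animation name for the first trigger found in the
    message text, or None if the text contains no trigger."""
    lowered = message_text.lower()  # match triggers case-insensitively
    for trigger, animation in ANIMATION_TRIGGERS.items():
        if trigger in lowered:
            return animation
    return None

print(find_animation("That joke... LOL"))  # laugh
print(find_animation("See you soon"))      # None
```

The same scan is applied both to messages the sender transmits and to messages received from the recipient, per the surrounding description.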
The played animation may be chosen at random from the possible animations of the sender avatar 135, or the played animation may be chosen before the button is selected.

An animation in one of the avatars 135 or 115 displayed on the instant messaging user interface 105 may cause an animation in the other avatar. For example, an animation of the recipient avatar 115 may trigger an animation in the sender avatar 135, and vice versa. By way of example, the sender avatar 135 may be animated to appear to be crying. In response to the animation of the sender avatar 135, the recipient avatar 115 also may be animated to appear to be crying. Alternatively, the recipient avatar 115 may be animated to appear comforting or sympathetic in response to the crying animation of the sender avatar 135. In another example, a sender avatar 135 may be animated to show a kiss and, in response, a recipient avatar 115 may be animated to blush.
The recipient avatar 115 may appear to respond to a mood of the sender communicated by the sender avatar 135. By way of example, in response to a frowning or teary animation of the sender avatar 135, the recipient avatar 115 also may appear sad. Alternatively, the recipient avatar 115 may be animated to try to cheer up the sender avatar 135, such as by smiling, exhibiting a comical expression, such as sticking its tongue out, or exhibiting a sympathetic expression.
An avatar 135 or 115 may be animated in response to a detected idle period of a predetermined duration. For example, after a period of sender inactivity, the sender avatar 135 may be animated to give the appearance that the avatar is sleeping, falling off of the instant messaging interface 105, or some other activity indicative of inactivity. An avatar 135 or 115 also may progress through a series of animations during a period of sender inactivity. The series of animations may repeat continuously or play only once in response to the detection of an idle period. In one example, the sender avatar 135 may be animated to give the appearance that the avatar is sleeping and then having the avatar appear to fall off the instant messaging user interface 105 after a period of sleeping. Animating an avatar 135 or 115 through a progression of multiple animations representative of a period of sender inactivity may provide entertainment to the sender. This may lead to increased usage of the instant messaging user interface 105 by the sender, which in turn, may lead to an increased market share for the instant message service provider.

The sender avatar 135 or the recipient avatar 115 may be animated to reflect the weather at the geographic locations of the sender and the recipient, respectively. For example, if rain is falling at the geographic location of the sender, then the sender avatar 135 may be animated to put on a rain coat or open an umbrella. The wallpaper corresponding to the sender avatar 135 also may include rain drops animated to appear to be falling on the sender avatar 135. The animation of the sender avatar 135 or the recipient avatar 115 played in response to the weather may be triggered by weather information received on the sender's computer or the recipient's computer, respectively. For example, the weather information may be pushed to the sender's computer by a host system of an instant messaging system being used.
If the pushed weather information indicates that it is raining, then an animation of the sender avatar 135 corresponding to rainy weather is played.
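The idle-period behavior described above (after a predetermined duration of inactivity, the avatar steps through a series of animations, such as sleeping and then falling off the interface) can be sketched as follows. The threshold value and animation names are assumptions for illustration; the description specifies only that the duration is predetermined.

```python
# Hedged sketch of idle-period animation selection: once the sender
# has been inactive past a predetermined threshold, the avatar plays
# a progression of idle animations in order.

IDLE_THRESHOLD_SECONDS = 300  # assumed 5-minute threshold (illustrative)
IDLE_ANIMATION_SEQUENCE = ["sleep", "fall_off_interface"]  # example sequence

def idle_animations(seconds_inactive: int) -> list:
    """Return the idle animations to play, in order, once the
    inactivity threshold has been crossed; otherwise no animations."""
    if seconds_inactive < IDLE_THRESHOLD_SECONDS:
        return []
    return list(IDLE_ANIMATION_SEQUENCE)

print(idle_animations(60))   # []
print(idle_animations(600))  # ['sleep', 'fall_off_interface']
```

Whether the returned sequence repeats continuously or plays only once is a separate policy choice, as the description notes both variants.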
Furthermore, the avatar may be used to audibly verbalize content other than the text communicated between parties during a communications session. For example, if the text "Hi" appears within a message sent by the sender, the sender avatar 135 may be animated to verbally say "Hello" in response. As another example, when the text "otp" or the text "on the phone" appears within a message sent by the recipient, the recipient avatar 115 may be animated to verbally say "be with you in just a minute" in response. As another example, in response to an idle state, an avatar may audibly try to get the attention of the sender or the recipient. For example, when the recipient sends a message to the sender that includes a question mark and the sender is determined to be idle, the recipient avatar 115 may audibly say "Hello? You there?" to try to elicit a response from the sender regarding the recipient's question. The sender may mute the recipient avatar 115 or the sender avatar 135 to prevent the recipient avatar 115 or the sender avatar 135 from speaking further. By way of example, the sender may prefer to mute the recipient avatar 115 to prevent the recipient avatar 115 from speaking. In one implementation, to show that an avatar is muted, the avatar may appear to be wearing a gag.
The voice of an avatar may correspond to the voice of a user associated with the avatar. To do so, the characteristics of the user's voice may be extracted from audio samples of the user's voice. The extracted characteristics and the audio samples may be used to create the voice of the avatar. Additionally or alternatively, the voice of the avatar need not correspond to the voice of the user and may be any generated or recorded voice.
The sender avatar 135 may be used to communicate an aspect of the setting or the environment of the sender. By way of example, the animation and appearance of the sender avatar 135 may reflect aspects of the time, date or place of the sender or aspects of the circumstances, objects or conditions of the sender. For example, when the sender uses the instant messaging user interface 105 at night, the sender avatar 135 may appear to be dressed in pajamas and have a light turned on to illuminate an otherwise dark portion of the screen on which the avatar is displayed and/or the sender avatar 135 may periodically appear to yawn. When the sender uses the instant messaging user interface 105 during a holiday period, the sender avatar 135 may be dressed in a manner illustrative of the holiday, such as appearing as Santa Claus during December, a pumpkin near Halloween, or Uncle Sam during early July. The appearance of the sender avatar 135 also may reflect the climate or geographic location of the sender. For example, when rain is falling in the location of the sender, wallpaper corresponding to the sender avatar 135 may include falling raindrops and/or the sender avatar 135 may wear a rain hat or appear under an open umbrella. In another example, when the sender is sending an instant message from a tropical location, the sender avatar 135 may appear in beach attire.
The sender avatar 135 also may communicate an activity being performed by the sender while the sender is using the instant messaging user interface 105. For example, when the sender is listening to music, the avatar 135 may appear to be wearing headphones. When the sender is working, the sender avatar 135 may be dressed in business attire, such as appearing in a suit and a tie.
The appearance of the sender avatar 135 also may communicate the mood or an emotional state of the sender. For example, the sender avatar 135 may communicate a sad state of the sender by frowning or shedding a tear.

The appearance of the sender avatar 135 or the recipient avatar 115 may resemble the sender or the recipient, respectively. For example, the appearance of the sender avatar 135 may be such that the sender avatar 135 appears to be of a similar age as the sender. In one implementation, as the sender ages, the sender avatar 135 also may appear to age. As another example, the appearance of the recipient avatar 115 may be such that the recipient avatar 115 has an appearance similar to that of the recipient.

In some implementations, the wallpaper applied to the window portion 120 and/or the wallpaper applied to the window portion 140 may include one or more animated objects. The animated objects may repeat a series of animations continuously or periodically on a predetermined or random basis. Additionally or alternatively, the wallpapers applied to the window portions 120 and 140 may be animated in response to the text of messages sent between the sender and the recipient. For example, the text of an instant message sent by the sender may trigger an animation of the animated objects included in the wallpaper corresponding to the sender avatar 135, and the text of an instant message sent by the instant message recipient to the sender may trigger an animation of the animated objects included in the wallpaper corresponding to the recipient avatar 115. The animated objects included in the wallpapers may be animated to reflect the setting or environment, activity and mood of the sender and the recipient, respectively.
An avatar may be used as a mechanism to enable self-expression or additional non-text communication by a user associated with the avatar. For example, the sender avatar 135 is a projection of the sender, and the recipient avatar 115 is a projection of the recipient. The avatar represents the user in instant messaging communications sessions that involve the user. The personality or emotional state of a sender may be projected or otherwise communicated through the personality of the avatar. Some users may prefer to use an avatar that more accurately represents the user. As such, a user may change the appearance and behavior of an avatar to more accurately reflect the personality of the user. In some cases, a sender may prefer to use an avatar for self-expression rather than projecting an actual image of the sender. For example, some people may prefer using an avatar to sending a video or photograph of the sender.
Referring to FIG. 2, the animation of an avatar may involve resizing or repositioning the avatar such that the avatar occupies more or different space on the instant message user interface 105 than the original boundary of the avatar. In the illustration of FIG. 2, the size of sender avatar 205 has been increased such that the avatar 205 covers a portion of the instant message composition area 145 and the control 155. In addition, elements of the user interface 100 other than an avatar also may be displayed using additional space or using different space on the user interface 100. For example, a sender avatar may depict a starfish with an expressive face and may be displayed on wallpaper that includes animated fish. The animated fish included in the wallpaper may be drawn outside the original boundary around the sender avatar 135 and appear to swim outside the original boundary area.
Referring to FIG. 3, a process 300 is illustrated for animating an avatar for self-expression based on the content of an instant message. In particular, an avatar representing an instant message sender is animated in response to text sent by the sender. The wallpaper of the avatar also is animated. The process 300 is performed by a processor executing an instant messaging communications program. In general, the text of a message sent to an instant message recipient is searched for an animation trigger and, when a trigger is found, the avatar that represents the instant message sender is animated in a particular manner based on the particular trigger that is found. The wallpaper displayed for the avatar includes an animated object or animated objects. The object or objects may be animated based on the content of the instant message sent or may be animated based on other triggers, including (but not limited to) the passing of a predetermined amount of time, the occurrence of a particular day or time of day, any type of animation of the sender avatar, a particular type of animation of the sender avatar, any type of animation of the recipient avatar, or a particular type of animation of the recipient avatar. Also, when the sender is inactive for a predetermined duration, the avatar sequentially displays each of multiple animations associated with an idle state.
The process 300 begins when an instant message sender who is associated with an avatar starts an instant messaging communications session with an instant message recipient (step 305). To do so, the sender may select the name of the recipient from a buddy list, such as the buddy list 170 from FIG. 1. Alternatively, the name of the recipient may be entered into a form that enables instant messages to be specified and sent. As another alternative, the sender may start an instant messaging application that may be used to sign on for access to the instant messaging system and specify the recipient as a user of the instant messaging system with which a communications session is to be started. Once the recipient has been specified in this manner, a determination is made as to whether a copy of avatars associated with the sender and the recipient exist on the instant message client system being used by the sender. If not, copies of the avatars are retrieved for use during the instant message communications session. For example, information to render an avatar of the recipient may be retrieved from an instant message host system or the instant message recipient client. In some cases, a particular avatar may be selected by the sender for use during the instant messaging communications session. Alternatively or additionally, the avatar may have been previously identified and associated with the sender.
The processor displays a user interface for the instant messaging session including the avatar associated with the sender and wallpaper applied to the user interface over which the avatar is displayed (step 307). The avatar may be displayed over, for example, wallpaper applied to a portion of a window in which an instant message interface is displayed. In another example, the avatar is displayed over a portion or portions of an instant message interface, such as window portions 120 or 140 of FIG. 1. In the example of FIG. 3, the wallpaper corresponding to the avatar may include an object or objects that are animated during the instant message communications session.
The processor receives text of a message entered by the sender to be sent to the instant message recipient (step 310) and sends a message corresponding to the entered text to the recipient (step 315). The processor compares the text of the message to multiple animation triggers that are associated with the avatar projected by the sender (step 320). A trigger may include any letter, number, or symbol that may be typed or otherwise entered using a keyboard or keypad. Multiple triggers may be associated with an animation.
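The comparison of message text against animation triggers in step 320 might be sketched as a simple table lookup. The following is a minimal illustration, not the actual implementation; the table contents loosely follow the examples of FIG. 4, and the function name is hypothetical:

```python
from typing import Optional

# Hypothetical trigger table: each character string maps to the animation
# type it triggers. Values loosely follow the examples shown in FIG. 4.
TRIGGERS = {
    ":)": "smile",
    ":-)": "smile",
    "Nice": "smile",
    "lol": "laugh",
    "Oh no": "frown",
}

def find_animation_type(message: str) -> Optional[str]:
    """Search the message text for triggers; prefer the longest match."""
    matches = [t for t in TRIGGERS if t in message]
    if not matches:
        return None  # no trigger in the message, so no animation plays
    return TRIGGERS[max(matches, key=len)]
```

Preferring the longest match is one possible way to resolve a message that contains more than one trigger; the patent does not specify a tie-breaking rule.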
Referring also to FIG. 4, examples 400 of triggers associated with animations 405a-405q of a particular avatar model are shown. Each of the animations 405a-405q has multiple associated triggers 410a-410q. More particularly, by way of example, the animation 405a, in which the avatar is made to smile, has associated triggers 410a. Each of the triggers 410a includes multiple character strings. In particular, triggers 410a include a ":)" trigger 411a, a ":-)" trigger 412a, a "0:-)" trigger 413a, a "0:)" trigger 414a, and a "Nice" trigger 415a. As illustrated, a trigger may be an English word, such as 415a, or an emoticon, such as 411a-414a. Other examples of a trigger include a particular abbreviation, such as "lol" 411n, and an English phrase, such as "Oh no" 415e. As discussed previously, when one of the triggers is included in an instant message, the avatar is animated with an animation that is associated with the trigger. In one example, when "Nice" is included in an instant message, the avatar is made to smile. In one implementation, one or more of the triggers associated with an animation is modifiable by a user. For example, a user may associate a new trigger with an animation, such as by adding "Happy" to triggers 410a to make the avatar smile. In another example, a user may delete a trigger associated with an animation (that is, disassociate a trigger from an animation), such as by deleting "Nice" 415a. In yet another example, a user may change a trigger that is associated with an animation, such as by changing the "wink" trigger 413b to "winks."
In some implementations, a particular trigger may be associated with only one animation. In other implementations, a particular trigger may be permitted to be associated with multiple animations. In some implementations, only one of the multiple animations may be played in response to a particular trigger. The single animation to be played may be chosen randomly or in a pre-determined manner from the multiple animations. In other implementations, all of the multiple animations may be played serially based on a single trigger. In some implementations, a user may be permitted to delete a particular animation. For example, the user may delete the yell animation 405g. In such a case, the user may delete some or all of the triggers associated with the yell animation 405g or may choose to associate some or all of the triggers 410g with a different animation, such as a smile animation 405a.
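The two playback policies described above for a trigger associated with multiple animations could be sketched as follows. The table and function names are illustrative, not drawn from the patent:

```python
import random

# Hypothetical table associating one trigger with several animations.
MULTI_ANIMATIONS = {"cheer": ["smile", "big_smile", "laugh"]}

def pick_one(trigger, table, rng=None):
    """Random policy: exactly one of the associated animations is played."""
    rng = rng or random.Random()
    return rng.choice(table[trigger])

def play_all(trigger, table):
    """Serial policy: every associated animation is played, in order."""
    return list(table[trigger])
```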
Referring again to FIG. 3, the processor determines whether a trigger is included within the message (step 325). When the message includes a trigger (step 325), the processor identifies a type of animation that is associated with the identified trigger (step 330). This may be accomplished by using a database table, a list, or a file that associates one or more triggers with a type of animation for the avatar. Types of animation include, by way of example, a smile 405a, a wink 405b, a frown 405c, an expression with a tongue out 405d, a shocked expression 410d, a kiss 405f, a yell 405g, a big smile 405h, a sleeping expression 405i, a nodding expression 405j, a sigh 405k, a sad expression 405l, a cool expression 405m, a laugh 405n, a disappearance 405o, a smell 405p, or a negative expression 405q, all of FIG. 4. The identified type of animation for the avatar is played (step 335).
Optionally, the processor may identify and play an animation of at least one wallpaper object based on the match of a trigger with the text of the message sent (step 337).
The processor monitors the communications activity of the sender for periods of inactivity (step 340) to detect when the sender is in an idle state or an idle period of communications activity (step 345). The sender may be in an idle state after a period during which no messages were sent. To detect an idle state, the processor may determine whether the sender has not typed or sent an instant message or otherwise interacted with the instant message communications application for a predetermined amount of time. Alternatively, an idle state may be detected by the processor when the sender has not used the computer system in which the processor operates for a predetermined amount of time. When the processor detects inactivity (which may be referred to as an idle state), a type of animation associated with the idle state is identified (step 350). This may be accomplished by using a database table, list, or file that identifies one or more types of animations to play during a detected idle period. The types of animations played during a detected idle state may be the same as or different from the types of animations played based on a trigger in an instant message. The identified type of animation is played (step 355). In one implementation, multiple types of animation associated with the idle state may be identified and played. When the processor detects that the sender is no longer idle, such as by receiving an input from the sender, the processor may immediately stop playing the animation event (not shown). In some implementations, a user may select the types of animations to be played during an idle period and/or select the order in which the animations are played when multiple animations are played during an idle period. A user may configure or otherwise determine the duration of time during which no messages are sent that constitutes an idle period for the user.
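The idle detection of steps 340-345 amounts to comparing the time since the last sender activity against a threshold that, as noted above, may be user-configured. A minimal sketch, with hypothetical names:

```python
import time

class IdleMonitor:
    """Sketch of idle detection: the sender is considered idle when no
    activity has occurred for a configurable threshold duration."""

    def __init__(self, threshold_seconds=600.0):
        self.threshold = threshold_seconds  # user-configurable duration
        self.last_activity = time.monotonic()

    def record_activity(self, when=None):
        # Called whenever the sender types, sends a message, or otherwise
        # interacts with the instant messaging application.
        self.last_activity = time.monotonic() if when is None else when

    def is_idle(self, now=None):
        now = time.monotonic() if now is None else now
        return (now - self.last_activity) >= self.threshold
```

The optional `when`/`now` parameters make the sketch testable with explicit timestamps; a real implementation would simply read the clock.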
In some implementations, the processor may detect a wallpaper object trigger that is different than the trigger used to animate the sender avatar (step 360). For example, the processor may detect the passage of a predetermined amount of time. In another example, the processor may detect that the content of the instant message includes a trigger for a wallpaper object animation that is different from the trigger used to animate the sender avatar. Other wallpaper object triggers may include (but are not limited to) the occurrence of a particular day or a particular time of day, the existence of any animations by the sender avatar, the existence of a particular type of animation by the sender avatar, the existence of animations by the recipient avatar, and/or the existence of a particular type of animation by the recipient avatar. The triggers for the animation of wallpaper objects also may be user-configurable such that a user selects whether a particular type of animation is to be included, whether any animations are to be played, and the triggers for one or more of the wallpaper objects. A trigger for a type of animation of a wallpaper object or objects may be the same as, or different from, one of the triggers associated with animating the avatar.
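Several of the wallpaper object triggers listed above are independent of message text, so step 360 can be pictured as a predicate over elapsed time, the calendar, and avatar activity. A hedged sketch, with invented thresholds and date:

```python
import datetime

# Illustrative wallpaper-object trigger check. The interval and the
# special day below are invented values, not from the patent.
def wallpaper_should_animate(elapsed_seconds, today, avatar_animated,
                             interval=30.0,
                             special_day=datetime.date(2004, 12, 25)):
    if elapsed_seconds >= interval:   # passage of a predetermined time
        return True
    if today == special_day:          # occurrence of a particular day
        return True
    return avatar_animated            # any animation by the sender avatar
```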
When the processor detects a wallpaper object trigger (step 360), the processor identifies and plays an animation of at least one wallpaper object (step 337).
The process of identifying and playing types of animations during a sent instant message (steps 310-335) is performed for every instant message that is sent and for every instant message that is received by the processor. The process of identifying and playing types of animation events during periods of inactivity (steps 340-355) may occur multiple times during the instant messaging communications session. Steps 310-355 may be repeated indefinitely until the end of the instant messaging communications session.
The process of identifying and playing the types of animations that correspond to a sent instant message or that are played during a period of sender inactivity (steps 320-355) also is performed by the processor of the instant message communications application that received the message. In this manner, the animation of the sender avatar may be viewed by the sender and the recipient of the instant message. Thus, the animation of the avatar conveys information from the sender to the recipient that is not directly included in the instant message.
Referring to FIG. 5, an instant messaging interface 500 may be used by a sender of a speech-based instant messaging system to send and receive instant messages. In the speech-based instant messaging system, instant messages are heard rather than read by users. The instant messages may be audio recordings of the users of the speech-based instant messaging system, or the instant messages may include text that is converted into audible speech with a text-to-speech engine. The audio recordings or the audible speech are played by the users. The speech-based instant messaging interface 500 may display an avatar 505 corresponding to a user of the instant messaging system from which speech-based instant messages are received. The avatar 505 may be animated automatically in response to the received instant messages such that the avatar 505 appears to be speaking the contents of the instant message. The recipient may view the animation of the avatar 505 and gather information not directly or explicitly conveyed in the instant message. Depending on the animation played, the recipient may be able to determine, for example, the mood of the sender or whether the sender is being serious or joking.
More particularly, the audio message may be processed in the same or similar manner as a textual instant message is processed with respect to the animation process 300 of FIG. 3. In such a case, types of animations are triggered by audio triggers included in an instant message.
In some implementations, the avatar 505 may appear to be speaking the instant message. For example, the avatar 505 may include animations of mouth movements corresponding to phonemes in human speech to increase the accuracy of the speaking animations. When the instant message includes text, a text-to-speech process may generate sounds spoken by the avatar 505, animations corresponding to phonemes in the text may be generated, and a lip synchronization process may be used to synchronize the playing of the audio with the lip animation such that the phonemes are heard at the same time that the corresponding animation of the mouth of the avatar 505 is seen. When the instant message includes an audio recording, animations corresponding to phonemes in the audio recording may be generated, and a lip synchronization process may be used to synchronize the playing of the audio recording with the lip animation.
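The lip synchronization described above can be reduced to scheduling a mouth shape at the start time of each phoneme, so that the phoneme is heard at the moment the mouth animation is seen. A simplified sketch; the phoneme and mouth-shape names are illustrative, not a real phoneme inventory:

```python
# Hypothetical mapping from phonemes to mouth shapes for the avatar.
PHONEME_TO_MOUTH_SHAPE = {"AA": "open", "M": "closed", "F": "teeth_on_lip"}

def build_lip_track(phoneme_timings):
    """phoneme_timings: list of (start_seconds, phoneme) pairs.
    Returns (start_seconds, mouth_shape) pairs so each mouth animation
    is displayed when its phoneme is heard."""
    return [(start, PHONEME_TO_MOUTH_SHAPE.get(phoneme, "neutral"))
            for start, phoneme in phoneme_timings]
```

Unknown phonemes fall back to a neutral mouth shape here; the patent leaves that case unspecified.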
In another example, a sender may record an audio portion to be associated with one or more animations of the avatar 505. The recording then may be played when the corresponding animation of the avatar 505 is played.
FIG. 6 illustrates an example process 600 for communicating between instant message clients 602a and 602b, through an instant message host system 604, to animate one avatar in response to an animation played in a different avatar. Each of the users using client 602a or client 602b is associated with an avatar that represents and projects the user during the instant message session. The communications between the clients 602a and 602b are facilitated by an instant messaging host system 604. In general, the communications process 600 enables a first client 602a and a second client 602b to send and receive communications from each other. The communications are sent through the instant messaging host system 604. Some or all of the communications may trigger an animation or animations in an avatar associated with the user of the first client 602a and an animation or animations in an avatar associated with the user of the second client 602b.
An instant messaging communications session is established between the first client 602a and the second client 602b in which communications are sent through the instant messaging server host system 604 (step 606). The communications session involves a first avatar that represents the user of the first client 602a and a second avatar that represents the user of the second client 602b. This may be accomplished, for example, as described previously with respect to step 305 of FIG. 3. In general, both the user of the first client 602a and the user of the second client 602b may use a user interface similar to the user interface 100 of FIG. 1 in which the sender avatar and the recipient avatar are displayed on the first client 602a and on the second client 602b.
During the instant messaging communications session, a user associated with the first client 602a enters text of an instant message to be sent to a user of the second client 602b, which is received by the processor on the client 602a executing the instant messaging communications application (step 608). The entered text may include a trigger for one of the animations from the first avatar model. The processor executing the instant messaging communications application sends the entered text to the second client 602b in the instant message by way of the host system 604 (step 610). Specifically, the host system 604 receives the message and forwards the message from the first client 602a to the second client 602b (step 612). The message then is received by the second client 602b (step 614). Upon receipt of the message, the second client 602b displays the message in a user interface in which messages from the user of the first client 602a are displayed. The user interface may be similar to the instant messaging user interface 105 from FIG. 1, in which avatars corresponding to the sender and the recipient are displayed.
Both the first client 602a and the second client 602b have a copy of the message, and both the first client 602a and the second client 602b begin processing the text of the message to determine if the text of the message triggers any animations in the respective copies of the first and second avatar models. When processing the message, the first client 602a and the second client 602b may actually process the message substantially concurrently or serially, but both the first client 602a and the second client 602b process the message in the same way.
Specifically, the first client 602a searches the text of the message for animation triggers to identify a type of animation to play (step 616a). The first client 602a then identifies an animation having the identified type of animation for a first avatar associated with the user of the first client 602a (step 618a). The first client 602a plays the identified animation for the first avatar that is associated with the user of the first client 602a (step 620a). The first avatar model is used to identify the animation to be played because the first avatar model is associated with the first client 602a, which sent the message. The first client 602a and the second client 602b use identical copies of the first avatar model to process the message, so the same animation event is seen on the first client 602a and the second client 602b.
The animation from the first avatar model triggers an animation from the second avatar model. To do so, the first client 602a identifies, based on the identified type of animation played for the first avatar in response to the text trigger, a type of animation to be played for a second avatar that is associated with the user of the second client 602b (step 622a). The first client 602a plays the identified type of animation for the second avatar (step 624a).
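Step 622a amounts to looking up the animation just played for the first avatar in a table of responsive animations for the second avatar. A minimal sketch; the response table contents are hypothetical:

```python
# Hypothetical table: first-avatar animation type -> responsive animation
# type for the second avatar.
RESPONSE_ANIMATIONS = {"yell": "shocked", "kiss": "big_smile", "frown": "sad"}

def respond_to(first_avatar_animation):
    """Return the responsive animation type for the second avatar, or
    None to leave it unanimated when no response is configured."""
    return RESPONSE_ANIMATIONS.get(first_avatar_animation)
```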
The first client also may identify a type of animation to be played for wallpaper corresponding to the first avatar and play the identified wallpaper animation of the first avatar (step 626a). The wallpaper of the avatar may include an object or objects that are animated during the instant message communications session. The animation of the object or objects may occur based on, for example, a trigger in an instant message or the passage of a predetermined amount of time. The animation of wallpaper objects also may be user-configurable such that a user selects whether a particular type of animation, or any animations, are played, and the triggers for one or more of the wallpaper objects. A trigger for a type of animation of a wallpaper object or objects may be the same as, or different from, one of the triggers associated with animating the avatar. After the message has been sent and processed, the user of the first client 602a may not send any additional messages for a period of time. The first client 602a detects such a period of inactivity (step 628a). The first client 602a identifies and plays an animation of a type associated with a period of inactivity detected by the first client 602a (step 630a). This may be accomplished by using a database table, list, or file that identifies one or more types of animations to play during a detected idle period.
The second client 602b processes the instant message in the same way as the first client 602a. Specifically, the second client 602b processes the message with steps 616b through 630b, each of which is substantially the same as the corresponding message processing step 616a through 630a performed by the first client 602a. Because each of the first client 602a and the second client 602b has copies of the avatars corresponding to the users of the first client 602a and the second client 602b, the same animations that were played on the first client 602a as a result of executing steps 616a through 630a are played on the second client 602b as a result of executing the similar steps 616b through 630b.

During the communications process 600, a text-based message indicates the types of animations that occur. However, messages with different types of content also may trigger animations of the avatars. For example, characteristics of an audio signal included in an audio-based message may trigger animations from the avatars.

Referring to FIG. 7, a process 700 is used to select and optionally customize an avatar for use with an instant messaging system. An avatar may be customized to reflect a personality to be expressed or another aspect of self-expression of the user associated with the avatar. The process 700 begins when a user selects an avatar from multiple avatars and the selection is received by the processor executing the process 700 (step 705). For example, a user may select a particular avatar from multiple avatars such as the avatars illustrated in FIG. 8. Each of the avatars 805a-805r is associated with an avatar model that specifies the appearance of the avatar. Each of the avatars 805a-805r also includes multiple associated animations, each animation identified as being of a particular animation type. The selection may be accomplished, for example, when a user selects one avatar from a group of displayed avatars.
The display ofthe avatars may show multiple avatars in a window, such as by showing a small representation (which in some implementations may be referred to as a "thumbnail") of each avatar. Additionally or alternatively, the display may be a list of avatar names from which the user selects.
FIG. 8 illustrates multiple avatars 805a-805r. Each avatar 805a-805r includes an appearance, name, and personality description. In one example, avatar 805a has an appearance 810a, a name 810b and a personality description 810c. The appearance of an avatar may represent, by way of example, living, fictional or historical people, sea creatures, amphibians, reptiles, mammals, birds, or animated objects. Some avatars may be represented only with a head, such as avatars 805a-805r. In one example, the appearance of the avatar 805b includes a head of a sheep. The appearance of other avatars may include only a portion or a specific part of a head. For example, the appearance of the avatar 805l resembles a set of lips. Other avatars may be represented by a body in addition to a head. For example, the appearance of the avatar 805n includes a full crab body in addition to a head. An avatar may be displayed over wallpaper that is related in subject matter to the avatar. In one example, the avatar 805i is displayed over wallpaper that is indicative of a swamp in which the avatar 805j lives.

Each of the avatars 805a-805r has a base state expression. For example, the avatar 805f appears to be happy, the avatar 805j appears to be sad, and the avatar 805m appears to be angry. Avatars may have other base state expressions, such as scared or bored. The base state expression of an avatar may influence the behavior of the avatar, including the animations and the sounds of the avatar. In one example, the avatar 805f has a happy base state expression and consequently has a generally happy behavior, whereas the avatar 805m has a creepy base state expression and consequently has a generally scary, creepy and spooky demeanor. In another example, a happy avatar may have upbeat sounds while an angry avatar may appear to be shouting when a sound is produced. The base state expression of an avatar may be changed as a result of the activities of a user associated with the avatar.
By way of example, the degree of happiness expressed by the avatar may be related to the number of messages sent or received by the user. When the user sends or receives many messages in a predetermined period of time, the avatar may appear happier than when the user sends or receives fewer messages in the predetermined period of time. One of multiple avatars 805a-805r may be chosen by a user of the instant messaging system. Each of the avatars 805a-805r is associated with an appearance, characteristics and behaviors that express a particular type of personality. For example, an avatar 805f, which has appearance characteristics of a dolphin, may be chosen.
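The relation between message volume and the avatar's degree of happiness could be sketched as a simple bucketing of the message count within the predetermined window. The thresholds below are invented for the example:

```python
# Illustrative sketch: more messages in the window -> a happier avatar.
# The bucket thresholds are hypothetical, not from the patent.
def happiness_level(messages_in_window):
    if messages_in_window >= 20:
        return "very_happy"
    if messages_in_window >= 5:
        return "happy"
    return "neutral"
```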
Each of the avatars 805a-805r is a multi-dimensional character with depth of personality, voice, and visual attributes. In contrast to representing a single aspect of a user through the use of an unanimated, two-dimensional graphical icon, an avatar of the avatars 805a-805r is capable of indicating a rich variety of information about the user projecting the avatar. Properties of the avatar enable the communication of physical attributes, emotional attributes, and other types of context information about the user that are not well-suited (or even available) for presentation through the use of two-dimensional icons that are not animated. In one example, the avatar may reflect the user's mood, emotions, and personality. In another example, the avatar may reflect the location, activities and other context of the user. These characteristics of the user may be communicated through the appearance, the visual animations, and the audible sounds of the avatar.
In one example of an avatar personality, an avatar named SoccerBuddy (not shown) is associated with an energetic personality. In fact, the personality of the SoccerBuddy avatar may be described as energetic, bouncy, confidently enthusiastic, and youthful. The SoccerBuddy avatar's behaviors reflect events in soccer matches. For example, the avatar's yell animation is an "ole, ole, ole" chant, his big-smile animation is soccer-themed, and, during a frown animation or a tongue-out animation, the avatar shows a yellow card. Using wallpaper, the SoccerBuddy is customizable to represent a specific team. Special features of the SoccerBuddy avatar include cleated feet to represent the avatar's base. In general, the feet act as the base for the avatar. The SoccerBuddy avatar is capable of appearing to move about by pogo-sticking on his feet. In a few animations, such as when the avatar goes away, the avatar's feet may become large and detach from the SoccerBuddy. The feet are able to be animated to kick a soccer ball around the display.
In another example, a silent movie avatar is reminiscent of a silent film actor of the 1920's and 1930's. A silent movie avatar is depicted using a stove-pipe hat and a handle-bar moustache. The silent movie avatar is not associated with audio. Instead of speaking, the silent movie avatar is replaced by, or displays, placards having text in a manner similar to how speech was conveyed in a silent movie.
In other examples, an avatar may be appropriate to current events or a season. In one example, an avatar may represent a team or a player on a team involved in professional or amateur sport. An avatar may represent a football team, a baseball team, a basketball team, or a particular player of a team. In one example, teams engaged in a particular playoff series may be represented. Examples of seasonal avatars include a Santa Claus avatar, an Uncle Sam avatar, a Thanksgiving turkey avatar, a Jack-o-Lantern avatar, a Valentine's Day heart avatar, an Easter egg avatar, and an Easter bunny avatar.
Animation triggers of the avatar may be modified to customize when various types of animations associated with the avatar are to occur (step 710). For example, a user may modify the triggers shown in FIG. 4 to indicate when an avatar is to be animated, as described previously with respect to FIG. 3. The triggers may be augmented to include frequently used words, phrases, or character strings. The triggers also may be modified such that the animations that are played as a result of the triggers are indicative of the personality of the avatar. Modifying the triggers may help to define the personality expressed by the avatar and used for user self-expression.
A user also may configure the appearance of an avatar (step 715). This also may help define the personality of the avatar, and communicate a self-expressive aspect of the sender. For example, referring also to FIG. 9, an appearance modification user interface 900 may be used to configure the appearance of an avatar. In the example of FIG. 9, the appearance modification user interface 900 enables the user to modify multiple characteristics of a head of an avatar. For example, hair, eyes, nose, lips and skin tone of the avatar may be configured with the appearance modification user interface 900. For example, a hair slider 905 may be used to modify the length of the avatar's hair. The various positions of the hair slider 905 represent different possible lengths of hair for the avatar that correspond to different representations of the hair of the avatar included in the avatar model file associated with the avatar being configured. An eyes slider 910 may be used to modify the color of the avatar's eyes, with each position of the eyes slider 910 representing a different possible color of the avatar's eyes and each color being represented in the avatar model file. A nose slider 915 may be used to modify the appearance of the avatar's nose, with each position of the nose slider 915 representing a different possible appearance of the avatar's nose and each possible appearance being represented in the avatar model file. In a similar manner, a lips slider 920 may be used to modify the appearance of the avatar's lips, with each position of the lips slider 920 representing a different possible appearance of the avatar's lips and associated with a different lip representation in the avatar model file. The avatar's skin tone also may be modified with a skin tone slider 925. Each of the possible positions of the skin tone slider 925 represents a possible skin tone for the avatar, with each being represented in the avatar model file.
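Since each slider position corresponds to one of the discrete representations stored in the avatar model file, the slider mechanism can be pictured as indexing into a list of options. A hedged sketch; the attribute names and option lists are illustrative, not the actual model file format:

```python
# Hypothetical stand-in for the discrete representations stored in the
# avatar model file for each configurable attribute.
AVATAR_MODEL_FILE = {
    "hair_length": ["bald", "short", "medium", "long"],
    "eye_color": ["brown", "green", "blue"],
}

def apply_slider(avatar, attribute, position):
    """Set an avatar attribute from a slider position (0-based index)."""
    options = AVATAR_MODEL_FILE[attribute]
    if not 0 <= position < len(options):
        raise ValueError("slider position out of range")
    avatar[attribute] = options[position]
    return avatar
```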
The appearance of the avatar that is created as a result of using the sliders 905-925 may be previewed in an avatar viewer 930. The values chosen with the sliders 905-925 are reflected in the avatar illustrated in the avatar viewer 930. In one implementation, the avatar viewer 930 may be updated as each of the sliders 905-925 is moved such that the changes made to the avatar's appearance are immediately visible. In another implementation, the avatar viewer 930 may be updated once after all of the sliders 905-925 have been used.
A rotation slider 935 enables the rotation of the avatar illustrated in the avatar viewer 930. For example, the avatar may be rotated about an axis by a number of degrees chosen on the rotation slider 935 relative to an unrotated orientation of the avatar. In one implementation, the axis extends vertically through the center of the avatar's head, and the unrotated orientation of the avatar is when the avatar is facing directly forward. Rotating the avatar's head with the rotation slider 935 enables viewing of all sides of the avatar to illustrate the changes to the avatar's appearance made with the sliders 905-925. The avatar viewer 930 may be updated as the rotation slider 935 is moved such that changes in the orientation of the avatar may be immediately visible.
The appearance modification user interface 900 also includes a hair tool button 940, a skin tool button 945, and a props tool button 950. Selecting the hair tool button 940 displays a tool for modifying various characteristics of the avatar's hair. For example, the tool displayed as a result of selecting the hair tool button 940 may enable changes to, for example, the length, color, cut, and comb of the avatar's hair. In one implementation, the changes made to the avatar's hair with the tool displayed as a result of selecting the hair tool button 940 are reflected in the illustration of the avatar in the avatar viewer 930.
Similarly, selecting the skin tool button 945 displays a tool for modifying various aspects of the avatar's skin. For example, the tool displayed as a result of selecting the skin tool button 945 may enable, for example, changing the color of the avatar's skin, giving the avatar a tan, giving the avatar tattoos, or changing the weathering of the avatar's skin to give an appearance of the age represented by the avatar. In one implementation, the changes made to the avatar's skin with the tool displayed as a result of selecting the skin tool button 945 are reflected in the illustration of the avatar in the avatar viewer 930.
In a similar manner, selecting the props tool button 950 displays a tool for associating one or more props with the avatar. For example, the avatar may be given eyeglasses, earrings, hats, or other objects that may be worn by, or displayed on or near, the avatar through use of the props tool. In one implementation, the props given to the avatar with the tool displayed as a result of selecting the props tool button 950 are shown in the illustration of the avatar in the avatar viewer 930. In some implementations, all of the props that may be associated with the avatar are included in the avatar model file. The props tool controls whether each of the props is made visible when the avatar is displayed. In some implementations, a prop may be created using and rendered by two-dimensional animation techniques. The rendering of the prop is synchronized with animations for the three-dimensional avatar. Props may be generated and associated with an avatar after the avatar is initially created.

Once all desired changes have been made to the avatar's appearance, the user may accept the changes by selecting a publish button 955. Selecting the publish button 955 saves the changes made to the avatar's appearance. In addition, when copies of the avatar are held by other users of the instant messaging system, the other users are sent updated copies of the avatar that reflect the changes made by the user to the avatar. The copies of the avatar may be updated so that all copies of the avatar have the same appearance such that there is consistency among the avatars used to send and receive out-of-band communications. The appearance modification user interface 900 may be used by the user to change only copies of the avatar corresponding to the user.
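The prop handling described above, in which all props live in the avatar model file and only their visibility is toggled when the avatar is displayed, might be sketched as follows. The prop names, data layout, and functions are illustrative assumptions.

```python
# Sketch of props kept in the avatar model file with per-prop visibility
# flags; giving the avatar a prop toggles a flag rather than adding geometry.

props = {"eyeglasses": False, "earrings": False, "hat": False}

def give_prop(name):
    """Make a prop visible; props not in the model file cannot be added."""
    if name not in props:
        raise KeyError(f"prop {name!r} is not in the avatar model file")
    props[name] = True

def visible_props():
    """List the props currently shown with the avatar."""
    return [name for name, shown in props.items() if shown]

give_prop("hat")
print(visible_props())  # ['hat']
```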
Therefore, the user is prevented from making changes to the other avatars corresponding to other users, as such changes could be overwritten when the user is sent updated copies of the other avatars because the other users made changes to them. Preventing the user from modifying the other avatars ensures that all copies of the avatars are identical.
The avatar illustrated in the avatar viewer 930 may have an appearance that does not include one of hair, eyes, a nose, lips, or skin tone that are modified with the sliders 905-925. For example, the appearance of the avatar 8051 from FIG. 8 does not include hair, eyes, a nose, or skin tone. In such a case, the appearance modification user interface 900 may omit the sliders 905-925 and instead include sliders to control other aspects of the appearance of the avatar. For example, the appearance modification user interface 900 may include a teeth slider when the appearance of the avatar 8051 is being modified. Moreover, the interface 900 may be customized based on the avatar selected, to enable appropriate and relevant visual enhancements thereto.

In another example of configuring the appearance of an avatar, a configurable facial feature of an avatar may be created using blend shapes of the animation model corresponding to the avatar. A blend shape defines a portion of the avatar that may be animated. In some implementations, a blend shape may include a mesh percentage that may be modified to cause a corresponding modification in the facial feature. In such a case, a user may be able to configure a facial feature of an avatar by using a slider or other type of control to modify the mesh percentage of the blend shapes associated with the facial feature being configured.

In addition to modifying the appearance of the avatar with the appearance modification user interface 900, the color, texture, and particles of the avatar may be modified. More particularly, the color or shading of the avatar may be changed. The texture applied to the avatar may be changed to age or weather the skin of the avatar. Furthermore, the width, length, texture, and color of particles of the avatar may be customized. In one example, particles of the avatar used to portray hair or facial hair, such as a beard, may be modified to show hair or beard growth in the avatar.
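The blend-shape configuration described above, where a slider drives the mesh percentage of the blend shapes tied to a facial feature, could be sketched roughly as follows. The class, field names, and clamping behavior are assumptions, not details from the specification.

```python
# Rough sketch of blend-shape-based facial configuration: one slider value
# drives the mesh percentage of every blend shape tied to a feature.

class BlendShape:
    def __init__(self, name, mesh_percentage=0.0):
        self.name = name
        self.mesh_percentage = mesh_percentage  # 0.0 .. 100.0

feature_shapes = {
    "lips": [BlendShape("lip_width"), BlendShape("lip_fullness")],
}

def set_feature(feature, percentage):
    """Apply one slider value to all blend shapes of a facial feature."""
    percentage = max(0.0, min(100.0, float(percentage)))
    for shape in feature_shapes[feature]:
        shape.mesh_percentage = percentage
    return percentage

set_feature("lips", 150)  # out-of-range input is clamped to 100.0
print([s.mesh_percentage for s in feature_shapes["lips"]])  # [100.0, 100.0]
```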
Referring again to FIG. 7, wallpaper over which the avatar is illustrated and an animation for objects in the wallpaper may be chosen (step 720). This may be accomplished by, for example, choosing wallpaper from a set of possible wallpapers. The wallpapers may include animated objects, or the user may choose objects and animations for the chosen objects to be added to the chosen wallpaper.
A trading card that includes an image of the avatar and a description of the avatar may be created (step 725). In some implementations, the trading card also may include a description of the user associated with the avatar. The trading card may be shared with other users of the instant messaging system to inform the other users of the avatar associated with the user. Referring also to FIG. 10, one example of a trading card is depicted. The front side 1045 of the trading card shows the avatar 1046. The animations of the avatar may be played by selecting the animations control 1047. The back side 1050 of the trading card includes descriptive information 1051 about the avatar, including the avatar's name, date of birth, city, species, likes, dislikes, hobbies, and aspirations. As illustrated in FIG. 10, both the front side 1045 and the back side 1050 of the trading card are shown. In some implementations, only one side 1045 or 1050 of the trading card is able to be displayed at one time. In such a case, a user may be able to control the side of the trading card that is displayed by using one of the flip controls 1048 or 1052. A store that offers accessories for the avatar 1046 illustrated in the trading card may be accessed by selecting a shopping control 1049.
Referring again to FIG. 7, the avatar also may be exported for use in another application (step 730). In some implementations, an avatar may be used by an application other than a messaging application. In one example, an avatar may be displayed as part of a user's customized home page of the user's access provider, such as an internet service provider. An instant message sender may drag-and-drop an avatar to the user's customized home page such that the avatar is viewable by the user corresponding to the avatar. In another example, the avatar may be used in an application in which the avatar is viewable by anyone. An instant message sender may drag-and-drop the sender's avatar to the sender's blog or another type of publicly-accessible online journal.

The user may repeat one or more of the steps in process 700 until the user is satisfied with the appearance and behavior of the avatar. The avatar is saved and made available for use in an instant messaging communications session.

Referring again to FIG. 10, the avatar settings user interface 1000 includes a personality section 1002. Selecting a personality tab 1010 displays a personality section of the avatar settings interface 1000 for modifying the behavior of the one or more avatars. In one implementation, the avatar settings user interface 1000 may be used with the process 700 of FIG. 7 to choose the wallpaper of an avatar and/or to create a trading card for an avatar. The personality section 1002 of the avatar settings interface 1000 includes an avatar list 1015 including the one or more avatars corresponding to the user of the instant messaging system. Each of the one or more avatars may be specified to have a distinct personality for use while communicating with a specific person or in a specific situation. In one implementation, an avatar may change appearance or behavior depending on the person with which the user interacts.
For example, an avatar may be created with a personality that is appropriate for business communications, and another avatar may be created with a personality that is appropriate for communications with family members. Each of the avatars may be presented in the list with a name as well as a small illustration of each avatar's appearance. Selection of an avatar from the avatar list 1015 enables the specification of the behavior of the selected avatar. For example, the avatar 1020, which is chosen to be the user's default avatar, has been selected from the avatar list 1015, so the behavior of the avatar 1020 may be specified.

Names of the avatars included in the avatar list may be changed through selection of a rename button 1025. Selecting the rename button displays a tool for changing the name of an avatar selected from the avatar list 1015. Similarly, an avatar may be designated as a default avatar by selecting a default button 1030 after selecting the avatar from the avatar list 1015. Avatars may be deleted by selecting a delete button 1035 after selecting the avatar from the avatar list 1015. In one implementation, a notification is displayed before the avatar is deleted from the avatar list 1015. Avatars also may be created by selecting a create button 1040. When the create button 1040 is pressed, a new entry is added to the avatar list 1015. The entry may be selected and modified in the same way as other avatars in the avatar list 1015.

The behavior of the avatar is summarized in a card front 1045 and a card back 1050 displayed on the personality section. The card front 1045 includes an illustration of the avatar and wallpaper over which the avatar 1020 is illustrated. The card front 1045 also includes a shopping control 1049 that provides a means for purchasing props for the selected avatar 1020. The card back 1050 includes information describing the selected avatar 1020 and a user of the selected avatar. The description may include a name, a birth date, a location, as well as other identifying and descriptive information for the avatar and the user of the avatar. The card back 1050 also may include an illustration of the selected avatar 1020 as well as the wallpaper over which the avatar 1020 is illustrated. The trading card created as part of the avatar customization process 700 includes the card front 1045 and the card back 1050 automatically generated by the avatar settings interface 1000.
The personality section 1002 of the avatar settings interface 1000 may include multiple links 1055-1070 to tools for modifying other aspects of the behavior of the selected avatar 1020. For example, an avatar link 1055 may lead to a tool for modifying the appearance of the selected avatar 1020. In one implementation, selecting the avatar link 1055 may display the appearance modification user interface 900 from FIG. 9. In another implementation, the avatar link 1055 may display a tool for substituting or otherwise selecting the selected avatar 1020. In yet another example, the avatar link 1055 may allow the appearance of the avatar to be changed to a different species. For example, the tool may allow the appearance of the avatar 1020 to be changed from that of a dog to that of a cat.
A wallpaper link 1060 may be selected to display a tool for choosing the wallpaper over which the selected avatar 1020 is drawn. In one implementation, the wallpaper may be animated.
A sound link 1065 may be selected to display a tool with which the sounds made by the avatar 1020 may be modified. The sounds may be played when the avatar is animated, or at other times, to get the attention of the user.
An emoticon link 1070 may be selected to display a tool for specifying emoticons that are available when communicating with the selected avatar 1020. Emoticons are two-dimensional non-animated images that are sent when certain triggers are included in the text of an instant message. Changes made using the tools that are accessible through the links 1055-1070 may be reflected in the card front 1045 and the card back 1050. After all desired changes have been made to the avatars included in the avatar list 1015, the avatar settings interface 1000 may be dismissed by selecting a close button 1075.

It is possible, through the systems and techniques described herein, particularly with respect to FIGS. 11A-14, to enable users to assemble multiple self-expression items into a collective "online persona" or "online personality," which may then be saved and optionally associated with one or more customized names. Each self-expression item is used to represent the instant message sender or a characteristic or preference of the instant message sender, and may include user-selectable binary objects. The self-expression items may be made perceivable by a potential instant message recipient ("instant message recipient") before, during, or after the initiation of communications by a potential instant message sender ("instant message sender"). For example, self-expression items may include an avatar, or images, such as wallpaper, that are applied in a location having a contextual placement on a user interface. The contextual placement typically indicates an association with the user represented by the self-expression item. For instance, the wallpaper may be applied in an area where messages from the instant message sender are displayed, or in an area around a dialog area on a user interface. Self-expression items also include sounds, animation, video clips, and emoticons (e.g., smileys).
The personality may also include a set of features or functionality associated with the personality. For example, features such as encrypted transmission, instant message conversation logging, and forwarding of instant messages to an alternative communication system may be enabled for a given personality.
Users may assign personalities to be projected when conversing with other users, either in advance of or "on-the-fly" during a communication session. This allows the user to project different personalities to different people on-line. In particular, users may save one or more personalities (e.g., where each personality typically includes groups of instant messaging self-expression items such as, for example, avatars, Buddy Sounds, Buddy Wallpaper, and Smileys, and/or a set of features and functionalities), and they may name those personalities to enable their invocation; they may associate each of different personalities with different users with whom they communicate, or with groups of such users, so as to automatically display an appropriate/selected personality during communications with such other users or groups; or they may establish each of different personalities during this process of creating, adding, or customizing lists or groups of users or the individual users themselves. Thus, the personalities may be projected to others in interactive online environments (e.g., Instant Messaging and Chat) according to the assignments made by the user. Moreover, personalities may be assigned, established, and/or associated with other settings, such that a particular personality may be projected based on time-of-day, geographic or virtual location, or even characteristics or attributes of each (e.g., a cold personality for winter in Colorado or a chatting personality while participating in a chat room).
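The context-based personality assignment described above, where a personality is projected based on time of day or location, might be sketched as a small rule-based selector. The specific rules, hours, and personality names here are illustrative assumptions, not the patent's implementation.

```python
# Hypothetical rule-based selector for projecting a personality based on
# context (time-of-day and geographic or virtual location).

def select_personality(hour, location, default="Casual"):
    """Pick a personality from the time of day and the sender's location."""
    if location == "work" or 9 <= hour < 17:
        return "Work"          # business hours, or physically at work
    if location == "chat_room":
        return "Chatting"      # virtual-location-based assignment
    return default

print(select_personality(10, "home"))       # business hours -> Work
print(select_personality(20, "home"))       # evening at home -> Casual
print(select_personality(20, "chat_room"))  # in a chat room -> Chatting
```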
In many instances, an instant message sender may have multiple online personas for use in an instant message communications session. Each online persona is associated with an avatar representing the particular online persona of the instant message sender. In many cases, each online persona of a particular instant message sender is associated with a different avatar. This need not necessarily be so. Moreover, even when two or more online personas of a particular instant message sender include the same avatar, the appearance or behavior of the avatar may be different for each of the online personas. In one example, a starfish avatar may be associated with two online personas of a particular instant message sender. The starfish avatar that is associated with one online persona may have different animations than the other starfish avatar that is associated with the other online persona. Even when both of the starfish avatars include the same animations, one of the starfish avatars may be animated to display an animation of a particular type based on different triggers than the same animation that is displayed for the other of the starfish avatars.
FIG. 11A shows relationships between online personas, avatars, avatar behaviors, and avatar appearances. In particular, FIG. 11A shows online personas 1102a-1102e and avatars 1104a-1104d that are associated with the online personas 1102a-1102e. Each of the avatars 1104a-1104d includes an appearance 1106a-1106c and a behavior 1108a-1108d. More particularly, the avatar 1104a includes an appearance 1106a and a behavior 1108a; the avatar 1104b includes an appearance 1106b and a behavior 1108b; the avatar 1104c includes the appearance 1106c and a behavior 1108c; and the avatar 1104d includes the appearance 1106c and a behavior 1108d. The avatars 1104c and 1104d are similar in that both include the appearance 1106c. However, the avatars 1104c and 1104d differ in that the avatar 1104c includes the behavior 1108c while the avatar 1104d includes the behavior 1108d.
Each of the online personas 1102a-1102e is associated with one of the avatars 1104a-1104d. More particularly, the online persona 1102a is associated with the avatar 1104a; the online persona 1102b is associated with the avatar 1104b; the online persona 1102c also is associated with the avatar 1104b; the online persona 1102d is associated with the avatar 1104c; and the online persona 1102e is associated with the avatar 1104d. As illustrated by the online persona 1102a that is associated with the avatar 1104a, an online persona may be associated with an avatar that is not also associated with a different online persona.
Multiple online personas may use the same avatar. This is illustrated by the online personas 1102b and 1102c that are both associated with the avatar 1104b. In this case, the appearance and behavior exhibited by the avatar 1104b is the same for both of the online personas 1102b and 1102c. In some cases, multiple online personas may use similar avatars that have the same appearance but which exhibit different behavior, as illustrated by the online personas 1102d and 1102e. The online personas 1102d and 1102e are associated with similar avatars 1104c and 1104d that have the same appearance 1106c. The avatars 1104c and 1104d, however, exhibit different behavior 1108c and 1108d, respectively.

In creating personalities, the instant message sender may forbid a certain personality to be shown to designated instant message recipients and/or groups. For example, if the instant message sender wants to ensure that the "Casual" personality is not accidentally displayed to the boss or to co-workers, the instant message sender may prohibit the display of the "Casual" personality to the boss on an individual basis, and may prohibit the display of the "Casual" personality to the "Co-workers" group on a group basis. An appropriate user interface may be provided to assist the instant message sender in making such a selection. Similarly, the instant message sender may be provided an option to "lock" a personality to an instant message recipient or a group of instant message recipients to guard against accidental or unintended personality switching and/or augmenting. Thus, for example, the instant message sender may choose to lock the "Work" personality to the boss on an individual basis, or to lock the "Work" personality to the "Co-workers" group on a group basis. In one example, the "Casual" personality will not be applied to a locked personality.
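The relationships of FIG. 11A, in which each avatar pairs an appearance with a behavior and two avatars may share an appearance while differing in behavior, can be modeled minimally as follows. Identifiers reuse the figure's reference numerals; the data structures themselves are illustrative, not part of the specification.

```python
# Minimal data model mirroring FIG. 11A: an avatar pairs an appearance with
# a behavior; each online persona references exactly one avatar.
from dataclasses import dataclass

@dataclass(frozen=True)
class Avatar:
    appearance: str
    behavior: str

avatar_1104c = Avatar(appearance="1106c", behavior="1108c")
avatar_1104d = Avatar(appearance="1106c", behavior="1108d")

# personas 1102d and 1102e use avatars with the same appearance
personas = {"1102d": avatar_1104c, "1102e": avatar_1104d}

same_appearance = personas["1102d"].appearance == personas["1102e"].appearance
same_behavior = personas["1102d"].behavior == personas["1102e"].behavior
print(same_appearance, same_behavior)  # True False
```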
FIG. 11B shows an exemplary process 1100 to enable an instant message sender to select an online persona to be made perceivable to an instant message recipient. The selected online persona includes an avatar representing the online persona of the instant message sender. The process 1100 generally involves selecting and projecting an online persona that includes an avatar representing the sender. The instant message sender creates or modifies one or more online personalities, including an avatar representing the sender (step 1105). The online personalities may be created or modified with, for example, the avatar settings user interface 1000 of FIG. 10. Creating an online persona generally involves the instant message sender selecting one or more self-expression items and/or features and functionalities to be displayed to a certain instant message recipient or group of instant message recipients. A user interface may be provided to assist the instant message sender in making such a selection, as illustrated in FIG. 12.
FIG. 12 shows a chooser user interface 1200 that enables the instant message sender to select among available personalities 1205, 1210, 1215, 1220, 1225, 1230, 1235, 1240, 1245, 1250, and 1255. The user interface 1200 also has a control 1260 to enable the instant message sender to "snag" the personality of another user, and a control 1265 to review the personality settings currently selected by the instant message sender. Through the use of the avatar settings interface 1000, the user may change the personality, including the avatar, being projected to the instant message recipient before, during, or after the instant message conversation with the recipient.

Alternatively, the selection of a personality also may occur automatically without sender intervention. For example, an automatic determination may be made that the sender is sending instant messages from work. In such a case, a personality to be used at work may be selected automatically and used for all communications. As another example, an automatic determination may be made that the sender is sending instant messages from home, and a personality to be used at home may be selected automatically and used for all communications. In such an implementation, the sender is not able to control which personality is selected for use. In other implementations, automatic selection of a personality may be used in conjunction with sender selection of a personality, in which case the personality automatically selected may act as a default that may be changed by the sender.

FIG. 13 shows a series 1300 of exemplary user interfaces for enabling an instant message sender to create and store a personality, and/or select various aspects of the personality such as avatars, buddy wallpaper, buddy sounds, and smileys. As shown, user interface 1305 enables an instant message sender to select a set of one or more self-expression items and save the set of self-expression items as a personality.
The user interface 1305 also enables an instant message sender to review and make changes to an instant message personality. For example, the user interface 1305 enables an instant message sender to choose an avatar 1310 (here, referred to as a SuperBuddy), buddy wallpaper 1315, emoticons 1320 (here, referred to as Smileys), and buddy sounds 1325. A set of controls 1340 is provided to enable the instant message sender to preview 1340a the profile and to save 1340b these selected self-expression items as a personality. The instant message sender is able to name and save the personality 1345 and then is able to apply the personality 1350 to one or more individual instant message recipients or one or more groups of instant message recipients. A management area 1350a is provided to enable the instant message sender to delete, save, or rename various instant message personalities.

In choosing the self-expression items, other interfaces such as user interface 1355 may be displayed to enable the instant message sender to select the particular self-expression items. The user interface 1355 includes a set of themes 1360 for avatars which enables an instant message sender to select a particular theme 1365 and choose a particular avatar 1370 in the selected theme. A set of controls 1375 is provided to assist the instant message sender in making the selection of self-expression items. Also, an instant message sender may be enabled to choose a pre-determined theme, for example, by using a user interface 1380. In user interface 1380, the instant message sender may select various categories 1385 of pre-selected themes and, upon selecting a particular category 1390, a set of default pre-selected self-expression items 1390a, 1390b, 1390c, 1390d, 1390e, and 1390f is displayed. The set may be unchangeable, or the instant message sender may be able to individually change any of the pre-selected self-expression items in the set.
A control section 1395 also is provided to enable the instant message sender to select the themes.

In another implementation, the features or functionality of the instant message interface may vary based upon user-selected or pre-selected options for the personality selected or currently in use. The features or functionality may be transparent to the instant message sender. For example, when using the "Work" personality, the outgoing instant messages may be encrypted, and a copy may be recorded in a log, or a copy may be forwarded to a designated contact such as an administrative assistant. A warning may be provided to an instant message recipient that the instant message conversation is being recorded or viewed by others, as appropriate to the situation. By comparison, if the non-professional "Casual" personality is selected, the outgoing instant messages may not be encrypted and no copy is recorded or forwarded.

As a further example, if the "Work" personality is selected and the instant message sender indicates an unavailability to receive instant messages (e.g., through selection of an "away" message or by going offline), then messages received from others during periods of unavailability may be forwarded to another instant message recipient such as an administrative assistant, or may be forwarded to an e-mail address for the instant message sender. By comparison, if the non-professional "Casual" personality is selected, no extra measures are taken to ensure delivery of the message.

In one implementation, the features and functionality associated with the personality would be transparent to the instant message sender, and may be based upon one or more pre-selected profile types when setting up the personality. For example, the instant message sender may be asked to choose from a group of personality types such as professional, management, informal, vacation, offbeat, etc.
In the example above, the "Work" personality may have been set up as a "professional" personality type and the "Casual" personality may have been set up as an "informal" personality type. In another implementation, the instant message sender may individually select the features and functionalities associated with the personality.

Referring again to FIG. 11B, the personality is then stored (step 1110). The personality may be stored on the instant message sender system, on the instant message host system, or on a different host system such as a host system of an authorized partner or access provider.

Next, the instant message sender assigns a personality to be projected during future instant message sessions or when engaged in future instant message conversations with an instant message recipient (step 1115). The instant message sender may wish to display different personalities to different instant message recipients and/or groups in the buddy list. The instant message sender may use a user interface to assign personalization items to personalities on at least a per-buddy group basis. For example, an instant message sender may assign a global avatar to all personalities, but assign different buddy sounds on a per-group basis to other personalities (e.g., work, family, friends), and assign buddy wallpaper and smileys on an individual basis to individual personalities corresponding to particular instant message recipients within a group. The instant message sender may assign other personality attributes based upon the occurrence of certain predetermined events or triggers. For example, certain potential instant message recipients may be designated to see certain aspects of the Rainy Day personality if the weather indicates rain at the geographic location of the instant message sender.
Default priority rules may be implemented to resolve conflicts, or the user may select priority rules to resolve conflicts among personalities being projected or among self-expression items being projected for an amalgamated personality.
For example, a set of default priority rules may resolve conflicts among assigned personalities by assigning the highest priority to personalities and self-expression items of personalities assigned on an individual basis, assigning the next highest priority to assignments of personalities and personalization items made on a group basis, and assigning the lowest priority to assignments of personalities and personalization items made on a global basis. However, the user may be given the option to override these default priority rules and assign different priority rules for resolving conflicts.

Next, an instant message session between the instant message sender and the instant message recipient is initiated (step 1120). The instant message session may be initiated by either the instant message sender or the instant message recipient.
An instant message user interface is rendered to the instant message recipient, configured to project the personality, including the avatar, assigned to the instant message recipient by the instant message sender (step 1125), as illustrated, for example, in the user interface 100 in FIG. 1. The personality, including an avatar associated with the personality, chosen by an instant messaging recipient may be made perceivable upon opening of a communication window by the instant message sender for a particular instant message recipient but prior to initiation of communications. This may allow a user to determine whether to initiate communications with the instant message recipient. For example, an instant message sender may notice that the instant message recipient is projecting an at-work personality, and the instant message sender may decide to refrain from sending an instant message. This may be particularly true when the avatar of the instant message recipient is displayed on a contact list. On the other hand, rendering the instant message recipient avatar after sending an instant message may result in more efficient communications.
The appropriate personality/personalization item set for a buddy is sent to the buddy when the buddy communicates with the instant message sender through the instant messaging client program. For example, in an implementation which supports global personalization items, group personalization items, and personal personalization items, a personal personalization item is sent to the buddy if set, otherwise a group personalization item is sent, if set. If neither a personal nor a group personalization item is set, then the global personalization item is sent. As another example, in an implementation that supports global personalization items and group personalization items, the group personalization item for the group to which the buddy belongs is sent, if set, otherwise the global personalization item is sent. In an implementation that only supports group personalization items, the group personalization item for the group to which the buddy belongs is sent to the buddy. An instant message session between the instant message sender and another instant message recipient also may be initiated (step 1130) by either the instant message sender or the second instant message recipient.
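The fallback chain just described (a personal item if set, otherwise the group item, otherwise the global item) can be sketched as follows. The function and data-structure names are hypothetical, not taken from the patent:

```python
def select_personalization_item(buddy, personal_items, group_of, group_items,
                                global_item):
    # A per-buddy (personal) item takes precedence if set.
    if buddy in personal_items:
        return personal_items[buddy]
    # Otherwise fall back to the item for the buddy's group, if set.
    group = group_of.get(buddy)
    if group in group_items:
        return group_items[group]
    # Otherwise the global item is sent.
    return global_item
```

An implementation that supports only group items would simply omit the first lookup, as the text notes.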
Relative to the second instant message session, a second instant message user interface is rendered to the second instant message recipient, configured to project the personality, including the avatar, assigned to the second instant message recipient by the instant message sender (step 1135), similar to the user interface illustrated by FIG. 1. The personality may be projected in a similar manner to that described above with respect to step 1125. However, the personality and avatar projected to the second instant message recipient may differ from the personality and avatar projected to the first instant message recipient described above in step 1125.
Referring to FIG. 14, an exemplary process 1400 enables an instant message sender to change a personality assigned to an instant message recipient. In process 1400, a user selection of a new online persona, including an avatar, to be assigned to the instant message recipient is received (step 1405). The change may be received through an instant message chooser 1200, such as that discussed above with respect to FIG. 12, and may include choosing self-expression items and/or features and functionality using such an interface or may include "snagging" an online persona or an avatar of the buddy using such an interface. Snagging an avatar refers to the appropriation by the instant message sender of one or more personalization items, such as the avatar, used by the instant message recipient. Typically, all personalization items in the online persona of the instant message recipient are appropriated by the instant message sender when "snagging" an online persona.
Next, the updated user interface for that instant message recipient is rendered based on the newly selected personality (step 1410).
FIG. 15 illustrates an example process 1500 for modifying the appearance, or the behavior, of an avatar associated with an instant message sender to communicate an out-of-band message to an instant message recipient. The process may be performed by an instant messaging system, such as communications systems 1600, 1700, and 1800 described with respect to FIGS. 16, 17, and 18, respectively. An out-of-band message refers to sending a message that communicates context out-of-band - that is, conveying information independent of information conveyed directly through the text of the instant message itself sent to the recipient. Thus, the recipient views the appearance and behavior of the avatar to receive information that is not directly or explicitly conveyed in the instant message itself. By way of example, an out-of-band communication may include information about the sender's setting, environment, activity or mood, which is not communicated as part of a text message exchanged by a sender and a recipient.
The process 1500 begins with the instant messaging system monitoring the communications environment and sender's environment for an out-of-band communications indicator (step 1510). The indicator may be an indicator of the sender's setting, environment, activity, or mood that is not expressly conveyed in instant messages sent by the sender. For example, the out-of-band indicator may be an indication of the time and date of the sender's location, which may be obtained from a clock application associated with the instant messaging system or with the sender's computer. The indicator may be an indication of the sender's physical location. The indicator may be an indication of weather conditions of the sender's location, which may be obtained from a weather reporting service, such as a web site that provides weather information for geographic locations.
In addition, the indicator may indicate the activities of the sender that take place at, or near, the time when an instant message is sent. For example, the indicator may be determined from other applications on the sender's computer that are active at, or near, the time that an instant message is sent. For example, the indicator may detect that the sender is using a media-playing application to play music, so the avatar associated with the sender may appear to be wearing headphones to reflect that the sender is listening to music. As another example, the indicator may detect that the sender is working with a calculator application, so the avatar may appear to be wearing glasses to reflect that the sender is working.
The activities of the sender also may be monitored through use of a camera focused on the sender. Visual information taken from the camera may be used to determine the activities and mood of the sender. For example, the location of points on the face of the sender may be determined from the visual information taken from the camera. The position and motion of the facial points may be reflected in the avatar associated with the sender. Therefore, if the sender were to, for example, smile, then the avatar also would smile.
The indicator of the sender's mood also may come from another device that is operable to determine the sender's mood and send an indication of mood to the sender's computer. For example, the sender may be wearing a device that monitors heart rate, and determines the sender's mood from the heart rate. For example, the device may conclude that the sender is agitated or excited when an elevated heart rate is detected. The device may send the indication of the sender's mood to the sender's computer for use with the sender's avatar.
The instant messaging system makes a determination as to whether an out-of-band communications indicator has been detected (step 1520). When an out-of-band communications indicator is detected, the instant messaging system determines whether the avatar must be modified, customized, or animated to reflect the detected out-of-band communications indicator (step 1530); meanwhile or otherwise, the instant messaging system continues to monitor for out-of-band communications indicators (step 1510). To determine whether action is required, the instant messaging system may use a data table, list or file that includes out-of-band communications indicators and an associated action to be taken for each out-of-band communications indicator. Action may not be required for each out-of-band communications indicator detected. For example, action may only be required for some out-of-band communications indicators when an indicator has changed from a previous indicator setting. By way of example, the instant messaging system may periodically monitor the clock application to determine whether the setting associated with the sender is daytime or nighttime. Once the instant messaging system has taken action based on detecting an out-of-band communications indicator having a nighttime setting, the instant messaging system need not take action based on the detection of a subsequent nighttime setting indicator. The instant messaging system only takes action based on the nighttime setting after receiving an intervening out-of-band communications indicator for a daytime setting. When action is required (step 1540), the appearance and/or behavior of the avatar is modified in response to the out-of-band communications indicator (step 1550).
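The decision logic of steps 1510 through 1550 (consult a table that maps each out-of-band indicator to an avatar action, but act only when the indicator's value differs from its previously observed value) might be sketched as follows. All names and table entries here are illustrative, not from the patent:

```python
class OutOfBandMonitor:
    def __init__(self, action_table):
        # Maps (indicator, value) pairs to an avatar action, e.g. the
        # nighttime setting to dressing the avatar in pajamas.
        self.action_table = action_table
        self.last_seen = {}

    def observe(self, indicator, value):
        """Return an avatar action, or None when no action is required."""
        if self.last_seen.get(indicator) == value:
            return None  # unchanged since the previous observation
        self.last_seen[indicator] = value
        return self.action_table.get((indicator, value))

monitor = OutOfBandMonitor({
    ("time_of_day", "night"): "dress_in_pajamas",
    ("time_of_day", "day"): "dress_normally",
    ("weather", "rain"): "add_rain_hat",
})
```

With this sketch, a second consecutive nighttime observation returns no action, matching the behavior described in the text.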
In one example, when an out-of-band communications indicator shows that the sender is sending instant messages at night, the appearance of the avatar is modified to be dressed in pajamas. When the indicator shows that the sender is sending instant messages during a holiday period, the avatar may be dressed in a manner illustrative of the holiday. By way of example, the avatar may be dressed as Santa Claus during December, a pumpkin near Halloween, or Uncle Sam during early July.
In another example, when the out-of-band indicator shows that the sender is at the office, the avatar may be dressed in business attire, such as a suit and a tie. The appearance of the avatar also may reflect the weather or general climate of the geographic location of the sender. For example, when the out-of-band communications indicator shows that it is raining at the location of the sender, the wallpaper of the avatar may be modified to include falling raindrops or display an open umbrella and/or the avatar may appear to wear a rain hat.
As another example, when the out-of-band communications indicator shows that the sender is listening to music, the appearance of the avatar may be changed to show the avatar wearing headphones. Additionally or alternatively, the appearance of the avatar may be changed based on the type of music to which the sender is listening. When the indicator indicates that the sender is working (at the sender's work location or at another location), the avatar may appear in business attire, such as wearing a suit and a tie. As indicated by this example, different out-of-band communications indicators may trigger the same appearance of the avatar. In particular, both the out-of-band communications indicator of the sender being located at work and the out-of-band communications indicator of the sender performing a work activity cause the avatar to appear to be wearing a suit and tie.
In yet another example of an out-of-band communications indicator, the mood of the sender may be so indicated. In such a case, the appearance of the avatar may be changed to reflect the indicated mood. For example, when the sender is sad, the avatar may be modified to reflect the sad state of the sender, such as by animating the avatar to frown or cry. In another example, based on the detected activity of the sender, a frazzled, busy or pressed mood may be detected and the avatar animated to communicate such an emotional state. After the avatar appearance and/or behavior has been modified to reflect the out-of-band indicator (step 1550), the updated avatar, or an indication that the avatar has been updated, is communicated to the recipient (step 1560). Generally, the updated avatar, or indication that the avatar has been changed, is provided in association with the next instant message sent by the sender; however, this is not necessarily so in every implementation. In some implementations, a change in the avatar may be communicated to the recipient independently of the sending of a communication. Additionally or alternatively, when a buddy list of the instant message user interface includes a display of a sender's avatar, the change of the avatar appearance may be communicated to each buddy list that includes the sender. Thus, the recipient is able to perceive the updated avatar, the behavior and/or appearance providing an out-of-band communication from the sender.
FIG. 16 illustrates a communications system 1600 that includes an instant message sender system 1605 capable of communicating with an instant message host system 1610 through a communication link 1615. The communications system 1600 also includes an instant message recipient system 1620 capable of communicating with the instant message host system 1610 through the communication link 1615. Using the communications system 1600, a user of the instant message sender system 1605 is capable of exchanging communications with a user of the instant message recipient system 1620. The communications system 1600 is capable of animating avatars for use in self-expression by an instant message sender.
In one implementation, any of the instant message sender system 1605, the instant message recipient system 1620, or the instant message host system 1610 may include one or more general-purpose computers, one or more special-purpose computers (e.g., devices specifically programmed to communicate with each other), or a combination of one or more general-purpose computers and one or more special-purpose computers. By way of example, the instant message sender system 1605 or the instant message recipient system 1620 may be a personal computer or other type of personal computing device, such as a personal digital assistant or a mobile communications device. In some implementations, the instant message sender system 1605 and/or the instant message recipient 1620 may be a mobile telephone that is capable of receiving instant messages.
The instant message sender system 1605, the instant message recipient system 1620 and the instant message host system 1610 may be arranged to operate within or in concert with one or more other systems, such as, for example, one or more LANs ("Local Area Networks") and/or one or more WANs ("Wide Area Networks"). The communications link 1615 typically includes a delivery network (not shown) that provides direct or indirect communication between the instant message sender system 1605 and the instant message host system 1610, irrespective of physical separation. Examples of a delivery network include the Internet, the World Wide Web, WANs, LANs, analog or digital wired and wireless telephone networks (e.g., Public Switched Telephone Network (PSTN), Integrated Services Digital Network (ISDN), and various implementations of a Digital Subscriber Line (DSL)), radio, television, cable, or satellite systems, and other delivery mechanisms for carrying data. The communications link 1615 may include communication pathways (not shown) that enable communications through the one or more delivery networks described above. Each of the communication pathways may include, for example, a wired, wireless, cable or satellite communication pathway.
The instant message host system 1610 may support instant message services irrespective of an instant message sender's network or Internet access. Thus, the instant message host system 1610 may allow users to send and receive instant messages, regardless of whether they have access to any particular Internet service provider (ISP). The instant message host system 1610 also may support other services, including, for example, an account management service, a directory service, and a chat service. The instant message host system 1610 has an architecture that enables the devices (e.g., servers) within the instant message host system 1610 to communicate with each other. To transfer data, the instant message host system 1610 employs one or more standard or proprietary instant message protocols. To access the instant message host system 1610 to begin an instant message session in the implementation of FIG. 16, the instant message sender system 1605 establishes a connection to the instant message host system 1610 over the communication link 1615. Once a connection to the instant message host system 1610 has been established, the instant message sender system 1605 may directly or indirectly transmit data to and access content from the instant message host system 1610. By accessing the instant message host system 1610, an instant message sender can use an instant message client application located on the instant message sender system 1605 to view whether particular users are online, view whether users may receive instant messages, exchange instant messages with particular instant message recipients, participate in group chat rooms, trade files such as pictures, invitations or documents, find other instant message recipients with similar interests, get customized information such as news and stock quotes, and search the Web. The instant message recipient system 1620 may be similarly manipulated to establish contemporaneous connection with instant message host system 1610.
Furthermore, the instant message sender may view or perceive an avatar and/or other aspects of an online persona associated with the instant message recipient prior to engaging in communications with that instant message recipient. For example, certain aspects of a personality selected by an instant message recipient, such as an avatar chosen by the instant message recipient, may be perceivable through the buddy list itself prior to engaging in communications. Other aspects of a selected personality chosen by an instant message recipient may be made perceivable upon opening of a communication window by the instant message sender for a particular instant message recipient but prior to initiation of communications. For example, animations of an avatar associated with the instant message sender only may be viewable in a communication window, such as the user interface 100 of FIG. 1.
In one implementation, the instant messages sent between instant message sender system 1605 and instant message recipient system 1620 are routed through the instant message host system 1610. In another implementation, the instant messages sent between instant message sender system 1605 and instant message recipient system 1620 are routed through a third party server (not shown), and, in some cases, are also routed through the instant message host system 1610. In yet another implementation, the instant messages are sent directly between instant message sender system 1605 and instant message recipient system 1620.
The techniques, processes and concepts in this description may be implemented using communications system 1600. One or more of the processes may be implemented in a client/host context, a standalone or offline client context, or a combination thereof. For example, while some functions of one or more of the processes may be performed entirely by the instant message sender system 1605, other functions may be performed by host system 1610, or the collective operation of the instant message sender system 1605 and the host system 1610. By way of example, in process 300, the avatar of an instant message sender may be respectively selected and rendered by the standalone/offline device, and other aspects of the online persona of the instant message sender may be accessed or updated through a remote device in a non-client/host environment such as, for example, a LAN server serving an end user or a mainframe serving a terminal device.
FIG. 17 illustrates a communications system 1700 that includes an instant message sender system 1605, an instant message host system 1610, a communication link 1615, and an instant message recipient 1620. System 1700 illustrates another possible implementation of the communications system 1600 of FIG. 16 that is used for animating avatars used for self-expression by an instant message sender.
In contrast to the depiction of the instant message host system 1610 in FIG. 16, the instant message host system 1610 includes a login server 1770 for enabling access by instant message senders and routing communications between the instant message sender system 1605 and other elements of the instant message host system 1610. The instant message host system 1610 also includes an instant message server 1790. To enable access to and facilitate interactions with the instant message host system 1610, the instant message sender system 1605 and the instant message recipient system 1620 may include communication software, such as for example, an online service provider client application and/or an instant message client application. In one implementation, the instant message sender system 1605 establishes a connection to the login server 1770 in order to access the instant message host system 1610 and begin an instant message session. The login server 1770 typically determines whether the particular instant message sender is authorized to access the instant message host system 1610 by verifying the instant message sender's identification and password. If the instant message sender is authorized to access the instant message host system 1610, the login server 1770 usually employs a hashing technique on the instant message sender's screen name to identify a particular instant message server 1790 within the instant message host system 1610 for use during the instant message sender's session. The login server 1770 provides the instant message sender (e.g., instant message sender system 1605) with the Internet protocol ("IP") address of the instant message server 1790, gives the instant message sender system 1605 an encrypted key, and breaks the connection.
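The hashing step described above, in which the login server maps a screen name to a particular instant message server for the session, could look like the following sketch. The patent does not specify the hash function, so MD5 here is purely illustrative, as are the server names:

```python
import hashlib

def pick_instant_message_server(screen_name, servers):
    # Hash the normalized screen name and map it onto the server list,
    # so the same screen name always reaches the same server.
    digest = hashlib.md5(screen_name.lower().encode("utf-8")).digest()
    index = int.from_bytes(digest[:4], "big") % len(servers)
    return servers[index]
```

Because the choice depends only on the screen name, every login for a given user lands on the same server without the login server keeping per-user state.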
The instant message sender system 1605 then uses the IP address to establish a connection to the particular instant message server 1790 through the communications link 1615, and obtains access to the instant message server 1790 using the encrypted key. Typically, the instant message sender system 1605 will be able to establish an open TCP connection to the instant message server 1790. The instant message recipient system 1620 establishes a connection to the instant message host system 1610 in a similar manner.
In one implementation, the instant message host system 1610 also includes a user profile server (not shown) connected to a database (not shown) for storing large amounts of user profile data. The user profile server may be used to enter, retrieve, edit, manipulate, or otherwise process user profile data. In one implementation, an instant message sender's profile data includes, for example, the instant message sender's screen name, buddy list, identified interests, and geographic location. The instant message sender's profile data may also include self-expression items selected by the instant message sender. The instant message sender may enter, edit and/or delete profile data using an installed instant message client application on the instant message sender system 1605 to interact with the user profile server.
Because the instant message sender's data are stored in the instant message host system 1610, the instant message sender does not have to reenter or update such information in the event that the instant message sender accesses the instant message host system 1610 using a new or different instant message sender system 1605. Accordingly, when an instant message sender accesses the instant message host system 1610, the instant message server can instruct the user profile server to retrieve the instant message sender's profile data from the database and to provide, for example, the instant message sender's self-expression items and buddy list to the instant message server. Alternatively, user profile data may be saved locally on the instant message sender system 1605.
FIG. 18 illustrates another example communications system 1800 capable of exchanging communications between users that project avatars for self-expression. The communications system 1800 includes an instant message sender system 1605, an instant message host system 1610, a communications link 1615 and an instant message recipient system 1620.
The host system 1610 includes instant messaging server software 1832 that routes communications between the instant message sender system 1605 and the instant message recipient system 1620. The instant messaging server software 1832 may make use of user profile data 1834. The user profile data 1834 includes indications of self-expression items selected by an instant message sender. The user profile data 1834 also includes associations 1834a of avatar models with users (e.g., instant message senders). The user profile data 1834 may be stored, for example, in a database or another type of data collection, such as a series of extensible mark-up language (XML) files. In some implementations, some portions of the user profile data 1834 may be stored in a database while other portions, such as associations 1834a of avatar models with users, may be stored in an XML file.
One implementation of user profile data 1834 appears in the table below. In this example, the user profile data includes a screen name to uniquely identify the user for whom the user profile data applies, a password for signing-on to the instant message service, an avatar associated with the user, and an optional online persona. As shown in Table 1, a user may have multiple online personas, each associated with the same or a different avatar.
[Table 1 is reproduced as an image in the original publication and is not available as text.]
Table 1. The host system 1610 also includes an avatar model repository 1835 in which definitions of avatars that may be used in the instant message service are stored. In this implementation, an avatar definition includes an avatar model file, an avatar expression file for storing instructions to control the animation of the avatar, and a wallpaper file. Thus, the avatar model repository 1835 includes avatar model files 1836, avatar expression files 1837 and avatar wallpaper files 1838.
The avatar model files 1836 define the appearance and animations of each of the avatars included in the avatar model repository 1835. Each of the avatar model files 1836 defines the mesh, texture, lighting, sounds, and animations used to render an avatar. The mesh of a model file defines the form of the avatar, and the texture defines the image that covers the mesh. The mesh may be represented as a wire structure composed of a multitude of polygons that may be geometrically transformed to enable the display of an avatar to give the illusion of motion. In one implementation, lighting information of an avatar model file is in the form of a light map that portrays the effect of a light source on the avatar. The avatar model file also includes multiple animation identifiers. Each animation identifier identifies a particular animation that may be played for the avatar. For example, each animation identifier may identify one or more morph targets to describe display changes to transform the mesh of an avatar and display changes in the camera perspective used to display the avatar.
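As a rough illustration, the contents of an avatar model file described above could be represented as follows. The field names are assumptions made for this sketch, not the patent's actual file format:

```python
from dataclasses import dataclass, field

@dataclass
class AvatarModel:
    mesh: list                      # polygons forming the wire structure
    texture: str                    # image that covers the mesh
    light_map: str                  # effect of a light source on the avatar
    sounds: list = field(default_factory=list)
    # animation identifier -> morph targets describing display changes
    animations: dict = field(default_factory=dict)

    def animation_ids(self):
        """Identifiers of the animations that may be played for the avatar."""
        return sorted(self.animations)
```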
When an instant message user projects an avatar for self-expression, it may be desirable to define an avatar with multiple animations, including facial animations, to provide more types of animations usable by the user for self-expression. Additionally, it may be desirable for facial animations to use a larger number of blend shapes, which may result in an avatar that, when rendered, may appear more expressive. A blend shape defines a portion of the avatar that may be animated and, in general, the more blend shapes that are defined for an animation model, the more expressive the image rendered from the animation model may appear.
Various data management techniques may be used to implement the avatar model files. In some implementations, information to define an avatar may be stored in multiple avatar files that may be arranged in a hierarchical structure, such as a directory structure. In such a case, the association between a user and an avatar may be made through an association of the user with the root file in a directory of model files for the avatar.
In one implementation, an avatar model file may include all possible appearances of an avatar, including different features and props that are available for user-customization. In such a case, user preferences for the appearance of the user's avatar include indications of which portions of the avatar model are to be displayed, and flags or other indications for each optional appearance feature or prop may be set to indicate whether the feature or prop is to be displayed. By way of example, an avatar model may be configured to display sunglasses, reading glasses, short hair and long hair. When a user configures the avatar to wear sunglasses and have long hair, the sunglasses feature and long hair features are turned on, the reading glasses and short hair features are turned off, and subsequent renderings of the avatar display the avatar having long hair and sunglasses.
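The per-feature display flags described above, with mutually exclusive options such as short versus long hair, could be sketched like this. The feature names and groupings are illustrative assumptions:

```python
# One on/off flag per optional appearance feature or prop.
DEFAULT_FLAGS = {"sunglasses": False, "reading_glasses": False,
                 "short_hair": True, "long_hair": False}

# Features that cannot be displayed at the same time.
EXCLUSIVE_GROUPS = [{"short_hair", "long_hair"},
                    {"sunglasses", "reading_glasses"}]

def set_feature(flags, feature):
    """Turn a feature on, turning off any feature it is exclusive with."""
    for group in EXCLUSIVE_GROUPS:
        if feature in group:
            for other in group:
                flags[other] = (other == feature)
            return flags
    flags[feature] = True
    return flags
```

A renderer would then draw only the portions of the model whose flags are on.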
The avatar model repository 1835 also includes avatar expression files 1837. Each of the avatar expression files 1837 defines triggers that cause animations in the avatars. For example, each of the avatar expression files 1837 may define the text triggers that cause an animation when the text trigger is identified in an instant message, as previously described with respect to FIGS. 3 and 4. An avatar expression file also may store associations between out-of-band communication indicators and animations that are played when a particular out-of-band communication indicator is detected. One example of a portion of an avatar expression file is depicted in Table 2 below.
[Table 2 is reproduced as an image in the original publication and is not available as text.]
Table 2. In some implementations, the association between a particular trigger or out-of-band communication indicator and a particular animation identifier is determined indirectly. For example, a particular trigger or out-of-band communication indicator may be associated with a type of animation (such as a smile, gone away, or sleep), as illustrated in Table 2. A type of animation also may be associated with a particular animation identifier included in a particular avatar model file, as illustrated in Table 3 below. In such a case, to play an animation based on a particular trigger or out-of-band communication indicator, the type of animation is identified, the animation identifier associated with the identified type of animation is determined, and the animation identified by the animation identifier is played. Other computer animation and programming techniques also may be used. For example, each avatar may use the same animation identifier for a particular animation type rather than including the avatar name shown in the table. Alternatively or additionally, the association of animation types and animation identifiers may be stored separately for each avatar.
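The two-step resolution just described (trigger to animation type, then avatar and type to animation identifier) can be sketched as follows. The triggers and identifiers are invented for illustration, since the actual table contents appear only as images in the original:

```python
# Table 2-style mapping: text trigger or indicator -> animation type.
TRIGGER_TO_TYPE = {":)": "smile", "brb": "gone_away",
                   "night_setting": "sleep"}

# Table 3-style mapping: (avatar, animation type) -> animation identifier.
TYPE_TO_ID = {("soccer_buddy", "smile"): "anim_smile_01",
              ("soccer_buddy", "gone_away"): "anim_away_02"}

def animation_for(avatar, trigger):
    """Resolve a trigger to an animation identifier in two steps."""
    animation_type = TRIGGER_TO_TYPE.get(trigger)
    if animation_type is None:
        return None
    return TYPE_TO_ID.get((avatar, animation_type))
```

The indirection lets every avatar share one trigger table while mapping each animation type to its own identifiers.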
[Table 3 is reproduced as an image in the original publication and is not available as text.]
Table 3. The avatar expression files 1837 also include information to define the way that an avatar responds to an animation of another avatar. In one implementation, an avatar expression file includes pairs of animation identifiers. One of the animation identifiers in each pair identifies a type of animation that, when the type of animation is played for one avatar, triggers an animation that is identified by the other animation identifier in the pair in another avatar. In this manner, the avatar expression file may define an animation played for an instant message recipient's avatar in response to an animation played by an instant message sender's avatar. In some implementations, the avatar expression files 1837 may include XML files having elements for defining the text triggers for each of the animations of the corresponding avatar and elements for defining the animations that are played in response to animations seen from other avatars.
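The pairing of animation identifiers described above, where an animation played for one avatar triggers a reaction in another avatar, might be sketched as follows. The specific pairings are hypothetical examples:

```python
# Pairs of animation types: when the left type plays for the sender's
# avatar, the recipient's avatar plays the right type in response.
REACTION_PAIRS = {"wave": "wave_back",
                  "throw_object": "duck",
                  "smile": "smile_back"}

def reaction_to(played_type):
    """Animation type the other avatar plays in response, if any."""
    return REACTION_PAIRS.get(played_type)
```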
The avatar model repository 1835 also includes avatar wallpaper files 1838 that define the wallpaper over which an avatar is drawn. The wallpaper may be defined using the same or different type of file structure as the avatar model files. For example, an avatar model file may be defined as an animation model file that is generated and playable using animation software from Viewpoint Corporation of New York, New York, whereas the wallpaper files may be in the form of a Macromedia Flash file that is generated and playable using animation software available from Macromedia, Inc. of San Francisco, California. When wallpaper includes animated objects that are triggered by an instant message, an out-of-band communication indicator or an animation of an avatar, the avatar wallpaper files 1838 also may include one or more triggers that are associated with the wallpaper animation. Each of the instant message sender system 1605 and the instant message recipient system 1620 includes an instant messaging communication application 1807 or 1827 that is capable of exchanging instant messages over the communications link 1615 with the instant message host system 1610. The instant messaging communication application 1807 or 1827 also may be referred to as an instant messaging client. Each of the instant message sender system 1605 and the instant message recipient system 1620 also includes avatar data 1808 or 1828. The avatar data 1808 or 1828 include avatar model files 1808a or 1828a, avatar expression files 1808b or 1828b, and avatar wallpaper files 1808c or 1828c for the avatars that are capable of being rendered by the instant message sender system 1605 or the instant message recipient system 1620, respectively. The avatar data 1808 or 1828 may be stored in persistent storage, transient storage, or stored using a combination of persistent and transient storage.
When all or some of the avatar data 1808 or 1828 is stored in persistent storage, it may be useful to associate a predetermined date on which some or all of the avatar data 1808 or 1828 is to be deleted from the instant message sender system 1605 or the instant message recipient system 1620, respectively. In this manner, avatar data may be removed from the instant message sender system 1605 or the instant message recipient system 1620 after the data has resided on the system for a predetermined period of time and presumably is no longer needed. This may help reduce the amount of storage space used for instant messaging on the instant message sender system 1605 or the instant message recipient system 1620.

In one implementation, the avatar data 1808 or 1828 is installed on the instant message sender system 1605 or the instant message recipient system 1620, respectively, with the instant messaging client software installed on the instant message sender system 1605 or the instant message recipient system 1620. In another implementation, the avatar data 1808 or 1828 is transmitted to the instant message sender system 1605 or the instant message recipient system 1620, respectively, from the avatar model repository 1835 of the instant messaging host system 1610. In yet another implementation, the avatar data 1808 or 1828 is copied from a source unrelated to instant messaging and stored for use as instant messaging avatars on the instant message sender system 1605 or the instant message recipient system 1620, respectively. In yet another implementation, the avatar data 1808 or 1828 is sent to the instant message sender system 1605 or the instant message recipient system 1620, respectively, with or incident to instant messages sent to the instant message sender system 1605 or the instant message recipient system 1620. The avatar data sent with an instant message corresponds to the instant message sender that sent the message.
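The predetermined deletion date described above could be implemented by stamping each locally cached avatar record with an expiry time and periodically purging stale entries. This is a minimal sketch; the thirty-day retention period and the record layout are assumptions.

```python
from datetime import datetime, timedelta

RETENTION = timedelta(days=30)  # assumed retention period for cached avatar data

def stamp(avatar_record, now=None):
    """Associate a predetermined deletion date with cached avatar data."""
    now = now or datetime.now()
    avatar_record["delete_after"] = now + RETENTION
    return avatar_record

def purge(cache, now=None):
    """Drop avatar records whose deletion date has passed."""
    now = now or datetime.now()
    return {name: rec for name, rec in cache.items()
            if rec["delete_after"] > now}

cache = {"fish": stamp({"model": "fish.mtx"}, datetime(2004, 3, 1))}
print(purge(cache, datetime(2004, 3, 15)))  # fish avatar still retained
print(purge(cache, datetime(2004, 4, 15)))  # {} -- past its deletion date
```

Stamping at download time rather than scanning file timestamps keeps the purge decision independent of how the data arrived (installed with the client, fetched from the repository, or sent with a message).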
The avatar expression files 1808b or 1828b are used to determine when an avatar is to be rendered on the instant message sender system 1605 or the instant message recipient system 1620, respectively. To render an avatar, one of the avatar model files 1808a is displayed on the two-dimensional display of the instant messaging system 1605 or 1620 by an avatar model player 1809 or 1829, respectively. In one implementation, the avatar model player 1809 or 1829 is an animation player by Viewpoint Corporation. More particularly, the processor of the instant messaging system 1605 or 1620 calls the avatar model player 1809 or 1829 and identifies an animation included in one of the avatar model files 1808a or 1828a. In general, the animation is identified by an animation identifier in the avatar model file. The avatar model player 1809 or 1829 then accesses the avatar model file and plays the identified animation.

In many cases multiple animations may be played based on a single trigger or out-of-band communications indicator. This may occur, for example, when one avatar reacts to an animation of another avatar that is animated based on a text trigger, as described previously with respect to FIG. 6. In the system 1800, four animations may be separately initiated based on a text trigger in one instant message. An instant message sender projecting a self-expressive avatar uses instant message sender system 1605 to send a text message to an instant message recipient using instant message recipient system 1620. The instant message recipient also is projecting a self-expressive avatar. The display of the instant message sender system 1605 shows an instant message user interface, such as user interface 100 of FIG. 1, as does the display of instant message recipient system 1620. Thus, the sender avatar is shown on both the instant message sender system 1605 and the instant message recipient system 1620, as is the recipient avatar.
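The four separately initiated animations described above (the sender avatar animating on both systems, then the recipient avatar reacting on both systems) can be sketched as follows. The trigger text, animation names, and system labels are illustrative assumptions.

```python
# One text trigger yields four animation plays: the sender avatar animates on
# both the sender and recipient systems, and the recipient avatar reacts to
# that animation on both systems.
TRIGGERS = {":-)": "smile"}          # text trigger -> sender avatar animation
REACTIONS = {"smile": "smile_back"}  # seen animation -> reactive animation

def animations_for_message(text):
    played = []
    for trigger, animation in TRIGGERS.items():
        if trigger in text:
            # Sender avatar animates on both displays.
            for system in ("sender_system", "recipient_system"):
                played.append((system, "sender_avatar", animation))
            # Recipient avatar reacts on both displays, per its expression file.
            reaction = REACTIONS.get(animation)
            if reaction:
                for system in ("sender_system", "recipient_system"):
                    played.append((system, "recipient_avatar", reaction))
    return played

print(len(animations_for_message("hi :-)")))  # 4
```

Note that each client plays its animations locally from its own copy of the model and expression files, so only the trigger needs to travel with the message.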
The instant message sent from the instant message sender system includes a text trigger that causes the animation of the sender avatar on the instant message sender system 1605 and the sender avatar on the instant message recipient system 1620. In response to the animation of the sender avatar, the recipient avatar is animated, as described previously with respect to FIG. 6. The reactive animation of the recipient avatar occurs in both the recipient avatar displayed on the instant message sender system 1605 and the recipient avatar displayed on the instant message recipient system 1620.

In some implementations, an instant messaging user is permitted to customize one or more of the animation triggers or out-of-band communications indicators for avatar animations, wallpaper displayed for an avatar, triggers or out-of-band communications indicators for animating objects of the wallpaper, and the appearance of the avatar. In one implementation, a copy of an avatar model file, an expression file or a wallpaper file is made, and the modifications of the user are stored in the copy of the avatar model file, expression file or wallpaper file. The copy that includes the modifications is then associated with the user. Alternatively or additionally, only the changes, that is, the differences between the avatar before the modifications and the avatar after the modifications are made, are stored. In some implementations, different versions of the same avatar may be stored and associated with a user. This may enable a user to modify an avatar, use the modified avatar for a period of time, and then return to using a previous version of the avatar that does not include the modification.
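Storing only the differences between the base avatar and the customized avatar, as described above, can be sketched as follows. The attribute names are illustrative, and a real avatar definition would be richer than a flat dictionary.

```python
# Store a user's customizations as a diff against the shared base avatar
# definition, so the base file is stored once and each version stays small.
def diff(base, modified):
    """Return only the attributes that differ from the base avatar."""
    return {k: v for k, v in modified.items() if base.get(k) != v}

def apply_diff(base, changes):
    """Reconstruct the customized avatar from the base plus stored changes."""
    merged = dict(base)
    merged.update(changes)
    return merged

base_avatar = {"hair": "brown", "hat": None, "eyes": "blue"}
customized = {"hair": "brown", "hat": "pirate", "eyes": "green"}

changes = diff(base_avatar, customized)
print(changes)                                         # {'hat': 'pirate', 'eyes': 'green'}
print(apply_diff(base_avatar, changes) == customized)  # True
```

Keeping a list of such diffs, one per saved version, would also support the reversion behavior mentioned above: dropping the latest diff restores the previous version of the avatar.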
In some implementations, the avatars from which a user may choose may be limited by the instant message service provider. This may be referred to as a closed implementation or a locked-down implementation. In such an implementation, the animations and triggers associated with each avatar within the closed set of avatars may be preconfigured. In some closed implementations, the user may customize the animations and/or triggers of a chosen avatar. For example, a user may include a favorite video clip as an animation of an avatar, and the avatar may be configured to play the video clip after certain text triggers appear in the messages sent by the user. In other closed implementations, the user is also prevented from adding animations to an avatar.
In some implementations, the set of avatars from which a user may choose is not limited by the instant message service provider, and the user may use an avatar other than an avatar provided by the instant message service provider. This may be referred to as an open implementation or an unlocked implementation. For example, an avatar usable in an instant message service may be created by a user using animation software provided by the instant message service provider, off-the-shelf computer animation software, or software tools provided by a third party that are specialized for creating avatars compatible with one or more instant message services.
In some implementations, a combination of a closed implementation and an open implementation may be used. For example, an instant message service provider may limit the selection by users who are minors to a set of predetermined avatars provided by the instant message service provider while permitting users who are adults to use an avatar other than an avatar available from the instant message service provider.
In some implementations, the avatars from which a user may select may be limited based on a user characteristic, such as age. As illustrated in Table 4 below and using the avatars shown in FIG. 8 only as an example, a user who is under the age of 10 may be limited to one group of avatars. A user who is between 10 and 18 may be limited to a different group of avatars, some of which are the same as the avatars selectable by users under the age of 10. A user who is 18 or older may select from any avatar available from the instant message service provider.
[Table 4 appears in the original document as an image: imgf000067_0001]
Table 4.

Instant messaging programs typically allow instant message senders to communicate in real-time with each other in a variety of ways. For example, many instant messaging programs allow instant message senders to send text as an instant message, to transfer files, and to communicate by voice. Examples of instant messaging communication applications include AIM (America Online Instant Messenger); AOL (America Online) Buddy List and Instant Messages, which is an aspect of many client communication applications provided by AOL; Yahoo Messenger; MSN Messenger; and ICQ, among others. Although discussed above primarily with respect to instant message applications, other implementations are contemplated for providing similar functionality in platforms and online applications. For example, the techniques and concepts may be applied to an animated avatar that acts as an information assistant to convey news, weather, and other information to a user of a computer system or a computing device.
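The age-based limitation described for Table 4 can be sketched as follows. Because Table 4 appears only as an image in this copy of the document, the avatar group contents below are assumed purely for illustration.

```python
# Assumed avatar groups; the actual groups are given in Table 4 of the source.
UNDER_10 = {"sheep", "cow", "dolphin"}
AGE_10_TO_18 = UNDER_10 | {"robot", "frog"}   # overlaps with the under-10 group
ALL_AVATARS = AGE_10_TO_18 | {"wizard", "pirate"}

def selectable_avatars(age):
    """Return the set of avatars a user of the given age may select."""
    if age < 10:
        return UNDER_10
    if age < 18:
        return AGE_10_TO_18
    return ALL_AVATARS  # users 18 or older may select any avatar

print("robot" in selectable_avatars(9))   # False
print("robot" in selectable_avatars(12))  # True
```

Because the groups are nested supersets here, raising a user's age never removes an avatar that was previously selectable; the source text only requires overlap between the first two groups, so a real deployment need not nest them.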
The techniques and concepts generally have been described in the context of an instant messaging system that uses an instant messaging host system to facilitate the instant messaging communication between instant message senders and instant message recipients. Other instant message implementations are contemplated, such as an instant message service in which instant messages are exchanged directly between an instant message sender system and an instant message recipient system. Moreover, although the examples above are given in an instant message context, other communications systems with similar attributes may be used. For example, multiple personalities may be used in a chat room or in e-mail communications. Also, the user interface may be a viewable interface, an audible interface, a tactile interface, or a combination of these. Other implementations are within the scope of the following claims.

Claims

WHAT IS CLAIMED IS:
1. A graphical user interface configured for presentation on a display device and comprising:
a sender portion that displays a sender avatar capable of displaying multiple animations;
a message compose area capable of displaying text included in the message sent from the sender to the recipient; and
communication controls, at least one communication control being operable to receive an indication that the message displayed in the message compose area is to be sent from the sender to the recipient,
wherein the sender avatar is animated in response to a trigger related to content of a message sent from a sender to a recipient.
2. The graphical user interface of claim 1 wherein the instant message sender display comprises a recipient portion that displays a recipient avatar capable of displaying multiple animations in response to a trigger related to content of a message sent from a sender to a recipient, and a message history area capable of displaying the content of multiple messages sent between the sender and the recipient and identifying an identity associated with the recipient.
3. The graphical user interface of claim 2 wherein the recipient avatar is animated in response to an animation of the sender avatar.
4. The graphical user interface of claim 1 wherein the graphical user interface comprises a contact list display for displaying potential recipients.
5. The graphical user interface of claim 4 wherein the contact list display indicates whether each potential recipient is available to receive a message.
6. The graphical user interface of claim 4 wherein the potential recipients are grouped and associated with an indication of a group identity.
7. The graphical user interface of claim 4, wherein a potential recipient displayed on the contact list is associated with a potential recipient avatar, further comprising:
displaying the potential recipient avatar on the contact list in association with an identity of the potential recipient; and
animating the potential recipient avatar on the contact list in response to animation of the potential recipient avatar displayed elsewhere.
8. The graphical user interface of claim 7 wherein the animation of the potential recipient avatar on the contact list comprises an animation that is substantially similar to the animation of the potential recipient avatar displayed elsewhere.
9. The graphical user interface of claim 7 wherein the animation of the potential recipient avatar on the contact list comprises an animation that is different than the animation of the potential recipient avatar displayed elsewhere.
10. The graphical user interface of claim 7 wherein the animation of the potential recipient avatar on the contact list comprises an animation that is representative of the animation of the potential recipient avatar displayed elsewhere.
11. The graphical user interface of claim 1 wherein the graphical user interface is used for an instant messaging communication session.
12. The graphical user interface of claim 1 wherein the trigger comprises a portion of the text of the message.
13. The graphical user interface of claim 1 wherein the trigger comprises all of the text of the message.
14. The graphical user interface of claim 1 wherein appearance or animation of the sender avatar indicates an environmental condition associated with the sender.
15. The graphical user interface of claim 1 wherein appearance or animation of the sender avatar indicates a personality characteristic associated with the sender.
16. The graphical user interface of claim 1 wherein appearance or animation of the sender avatar indicates an emotional state associated with the sender.
17. The graphical user interface of claim 1 wherein appearance or animation of the sender avatar indicates a setting characteristic associated with the sender.
18. The graphical user interface of claim 1 wherein appearance or animation of the sender avatar indicates an activity associated with the sender.
19. The graphical user interface of claim 1 wherein the sender avatar is animated in response to the passing of a predetermined amount of time during which the sender does not communicate a message to the recipient.
20. The graphical user interface of claim 1 wherein the sender avatar is animated in response to the passing of a predetermined amount of time during which the sender does not use a computing device that is used by the sender to communicate with the recipient in the communications session.
21. The graphical user interface of claim 1 wherein the avatar animation used as the communication conduit comprises a breakout animation that involves displaying the avatar outside of the normal display space occupied by the avatar.
22. The graphical user interface of claim 1 wherein the sender avatar is animated to produce sounds used for verbal communication.
23. The graphical user interface of claims 1-22 wherein the graphical user interface is generated by an executing computer program product.
24. An apparatus for generating a graphical user interface configured for presentation on a display device, the apparatus comprising a processor connected to one or more input components and one or more output components, wherein the processor is configured to:
generate a sender portion that displays a sender avatar capable of displaying multiple animations;
generate a message compose area capable of displaying text included in the message sent from the sender to the recipient; and
generate communication controls, at least one communication control being operable to receive an indication that the message displayed in the message compose area is to be sent from the sender to the recipient,
wherein the sender avatar is animated in response to a trigger related to content of a message sent from a sender to a recipient.
25. A method for communicating, the method comprising:
graphically representing, with an avatar capable of being animated, a first user in a communication session involving the first user and a second user;
communicating a message between the first user and the second user, the message conveying explicit information from the first user to the second user; and
communicating out-of-band information to the second user using a change in the avatar appearance or avatar animation as a communication conduit,
wherein the out-of-band communication comprises a communication that is related to a context of the first user and that differs from the information conveyed in the message sent between the first user and the second user.
26. The method of claim 25 wherein the communication session is an instant messaging communication session.
27. The method of claim 25 wherein the avatar comprises a facial animation that does not include a body having an ear or a leg.
28. The method of claim 25 wherein the avatar comprises a facial animation, including a neck, that does not include a body having an ear or a leg.
29. The method of claim 25 wherein the out-of-band information comprises information indicating an environmental condition associated with the first user.
30. The method of claim 29 wherein the environmental condition comprises an environmental condition related to weather occurring in a geographic location near the first user.
31. The method of claim 25 wherein the out-of-band information comprises information indicating a personality characteristic associated with the first user.
32. The method of claim 25 wherein the out-of-band information comprises information indicating an emotional state associated with the first user.
33. The method of claim 25 wherein the out-of-band information comprises information indicating a setting characteristic associated with the first user.
34. The method of claim 33 wherein the setting characteristic comprises a characteristic related to time of day of the first user.
35. The method of claim 33 wherein the setting characteristic comprises a characteristic related to time of year.
36. The method of claim 35 wherein the time of year comprises a holiday.
37. The method of claim 35 wherein the time of year comprises a season wherein the season is one of spring, summer, fall or winter.
38. The method of claim 33 wherein the setting characteristic comprises a characteristic associated with a work setting.
39. The method of claim 33 wherein the setting characteristic comprises a characteristic associated with a recreation setting.
40. The method of claim 39 wherein the recreation setting comprises a beach setting or a tropical setting.
41. The method of claim 39 wherein the recreation setting comprises a winter sport setting.
42. The method of claim 25 wherein out-of-band information comprises information related to a mood of the first user.
43. The method of claim 42 wherein the mood of the first user comprises one of happy, sad or angry.
44. The method of claim 25 wherein out-of-band information comprises information associated with an activity ofthe first user.
45. The method of claim 44 wherein the activity is being performed by the first user at substantially the same time that the out-of-band message is communicated from the first user to the second user.
46. The method of claim 45 wherein the activity comprises one of working or listening to music.
47. The method of claim 29 wherein out-of-band information comprises information conveying that the first user has muted sounds associated with the avatar.
48. The method of claim 25 further comprising triggering, based on the information conveyed in the message from the first user to the second user, an animation of the avatar to convey the out-of-band information from the first user to the second user.
49. The method of claim 48 wherein the trigger comprises a portion of text.
50. The method of claim 48 wherein the trigger comprises all of the text of the message.
51. The method of claim 48 wherein the trigger comprises an audio portion of the message.
52. The method of claim 48 wherein the trigger comprises passing a predetermined amount of time during which the first user does not communicate a message to the second user.
53. The method of claim 48 wherein the trigger comprises passing a predetermined amount of time during which the first user does not use a computing device that is used by the first user to communicate with the second user in the communication session.
54. The method of claim 25 wherein the avatar animation used as the communication conduit comprises a facial expression of the avatar.
55. The method of claim 25 wherein the avatar animation used as the communication conduit comprises a gesture made by a hand of the avatar or a gesture made by an arm of the avatar.
56. The method of claim 25 wherein the avatar animation used as the communication conduit comprises movement of a body of the avatar.
57. The method of claim 25 wherein the avatar animation used as the communication conduit comprises sounds made by the avatar.
58. The method of claim 57 wherein at least some of the sounds comprise a voice based on a voice of the first user.
59. The method of claim 25 wherein the avatar animation used as the communication conduit comprises a breakout animation that involves displaying the avatar outside of the normal display space occupied by the avatar.
60. The method of claim 59 wherein the breakout animation comprises telescoping the avatar.
61. The method of claim 59 wherein the breakout animation comprises resizing the avatar.
62. The method of claim 59 wherein the breakout animation comprises repositioning the avatar.
63. The method of claim 25 further comprising:
providing the first user with multiple preconfigured avatars having associated preselected animations; and
enabling the first user to select a particular avatar to represent the user in the communications session.
64. The method of claim 63 further comprising persistently associating the first user with the selected avatar to represent the first user in subsequent communication sessions.
65. The method of claim 63 further comprising enabling the first user to modify the appearance of the avatar.
66. The method of claim 65 wherein enabling the first user to modify the appearance of the avatar comprises enabling the first user to use a slide bar to indicate a particular modification of a particular feature of the avatar.
67. The method of claim 65 wherein enabling the first user to modify the appearance of the avatar comprises enabling the first user to modify appearance of the avatar to reflect a characteristic of the first user.
68. The method of claim 67 wherein the characteristic of the first user comprises one of age, gender, hair color, eye color, or a facial feature.
69. The method of claim 65 wherein enabling the first user to modify the appearance of the avatar comprises enabling the first user to modify appearance of the avatar by adding, changing or deleting a prop displayed with the avatar.
70. The method of claim 69 wherein the prop comprises one of eyeglasses, sunglasses, a hat, or earrings.
71. The method of claim 25 further comprising enabling the first user to modify a trigger used to cause an animation of the avatar.
72. The method of claim 71 wherein the trigger comprises text included in the message sent from the first user to the second user.
73. The method of claim 25 further comprising animating the avatar for use as an information assistant to convey information to the first user.
74. The method of claim 25 further comprising enabling use of the avatar by an application other than a communications application.
75. The method of claim 74 wherein enabling use of the avatar by an application other than a communications application comprises enabling use of the avatar in an online journal.
76. The method of claim 25 further comprising displaying a depiction of the avatar in a form that is substantially similar to a trading card.
77. The method of claim 76 wherein the trading card depiction of the avatar comprises a trading card depiction of the avatar that includes characteristics associated with the first user.
78. The method of claims 25-77 wherein the processes are performed by a computer program that is configured to communicate and that is embodied on a computer-readable medium or propagated signal.
79. An apparatus for communicating, the apparatus comprising a processor connected to a storage device and one or more input/output devices, wherein the processor is configured to:
graphically represent, with an avatar capable of being animated, a first user in a communication session involving the first user and a second user;
communicate a message between the first user and the second user, the message conveying explicit information from the first user to the second user; and
communicate out-of-band information to the second user using a change in the avatar appearance or avatar animation as a communication conduit,
wherein the out-of-band communication comprises a communication that is related to a context of the first user and that differs from the information conveyed in the message sent between the first user and the second user.
80. A computer-implemented method for enabling perception of multiple online personas in an instant messaging communications session, the method comprising:
identifying at least two identities within a communications environment to whom messages may be directed; and
enabling a first persona of a user to be projected to a first of the identities while enabling a second persona of the same user to be concurrently projected to a second of the identities,
wherein:
the first and second personas each comprise an avatar capable of being animated, and
the first persona and the second persona differ.
81. The method of claim 80 wherein the first persona differs from the second persona such that the first persona invokes a different avatar than an avatar invoked by the second persona.
82. The method of claim 80 wherein:
the first persona invokes a first avatar,
the second persona invokes a second avatar,
the first avatar and the second avatar are the same avatar, and
an animation associated with the first avatar is different from animations associated with the second avatar.

83. The method of claim 80 wherein:
the first persona invokes a first avatar,
the second persona invokes a second avatar,
the first avatar and the second avatar are the same avatar, and
an appearance associated with the first avatar is different from appearances associated with the second avatar.
84. The method of claim 80 wherein at least one of the avatars comprises an avatar that is associated with multiple sounds.
85. The method of claim 80 wherein at least one of the avatars comprises an avatar capable of being animated based on text of a message sent in the instant message communications session.
86. The method of claim 80 wherein at least one of the avatars comprises an avatar capable of being animated to send an out-of-band communication.
87. The method of claim 80 further comprising associating the first persona with a first group of identities so that the first persona is projected in communications sessions with members of the first group of identities.
88. The method of claim 87 further comprising associating the second persona with a second group of identities so that the second persona is projected in communications sessions with members of the second group of identities.
89. The method of claim 80 further comprising associating a persona with the first of the identities and associating a different persona with a group of the identities with which the first of the identities is associated, wherein the first persona projected to the first of the identities comprises an amalgamation of the persona associated with the first of the identities and the different persona associated with the group of the identities.
90. The method of claim 89 wherein the persona associated with the first of the identities overrides the different persona associated with the group of the identities to the extent a conflict exists.
91. The method of claims 80-90 wherein the processes are performed by a computer program configured to enable perception of multiple online personas in an instant messaging communications session and that is embodied on a computer-readable medium or propagated signal.
92. An apparatus for enabling perception of multiple online personas in an instant messaging communications session, the apparatus comprising a processor connected to a storage device and one or more input/output devices, wherein the processor is configured to:
identify at least two identities within a communications environment to whom messages may be directed; and
enable a first persona of a user to be projected to a first of the identities while enabling a second persona of the same user to be concurrently projected to a second of the identities,
wherein:
the first and second personas each comprise an avatar capable of being animated, and
the first persona and the second persona differ.
93. The apparatus of claim 92 wherein the first persona differs from the second persona such that the first persona invokes a different avatar than an avatar invoked by the second persona.
94. The apparatus of claim 92 wherein:
the first persona invokes a first avatar,
the second persona invokes a second avatar,
the first avatar and the second avatar are the same avatar, and
an animation associated with the first avatar is different from animations associated with the second avatar.

95. The apparatus of claim 92 wherein:
the first persona invokes a first avatar,
the second persona invokes a second avatar,
the first avatar and the second avatar are the same avatar, and
an appearance associated with the first avatar is different from appearances associated with the second avatar.
96. The apparatus of claim 92 wherein at least one of the avatars comprises an avatar capable of being animated based on text of a message sent in the instant message communications session.
97. The apparatus of claim 92 wherein at least one of the avatars comprises an avatar capable of being animated to send an out-of-band communication.
98. A computer-implemented method for enabling perception of multiple online personas in an instant messaging communications session, the method comprising:
rendering, on an instant messaging recipient system, an instant messaging application user interface for an instant messaging communications session involving at least one potential instant messaging recipient and a single potential instant messaging sender;
sending a message that includes a text message and a persona selected among multiple possible personas associated with the instant messaging sender to be displayed by the potential instant messaging recipient when displaying the text message, the selected persona comprising a collection of one or more self-expression items and a sender avatar capable of being animated; and
rendering the selected persona at the potential instant messaging recipient system when rendering another portion of the message.
99. The method of claim 98 wherein the sender persona is selected by the instant messaging sender from the multiple possible personas associated with the instant messaging sender.
100. The method of claim 98 wherein the persona is rendered before communications are initiated by the potential instant messaging sender.
101. The method of claim 98 wherein the persona is rendered after communications are initiated by the potential instant messaging sender.
102. The method of claim 98 in which self-expression items comprise one or more of a wallpaper, an emoticon, and a sound.
103. The method of claim 98 further comprising defining one or more personas.
104. The method of claim 103 further comprising:
assigning a first persona to a first potential instant messaging recipient so that the first persona is thereafter automatically invoked and projected in an instant messaging communications session involving the first potential instant messaging recipient; and
assigning a second persona to a second potential instant messaging recipient so that the second persona is thereafter automatically invoked and projected in an instant messaging communications session involving the second potential instant messaging recipient, wherein the second persona is at least partially distinguishable from the first persona.
105. The method of claim 104 further comprising: assigning a first persona to a first group of potential instant messaging recipients so that the first persona is thereafter automatically invoked and projected in an instant messaging communications session involving a member of the first group of potential instant messaging recipients; and assigning a second persona to a second potential instant messaging recipient so that the second persona is thereafter automatically invoked and projected in an instant messaging communications session involving the second potential instant messaging recipient, wherein the second persona is at least partially distinguishable from the first persona.
106. The method of claim 98 further comprising disabling use of one of the multiple personas.
107. The method of claim 106 wherein disabling use of one of the multiple personas comprises disabling use of one of the multiple personas based on the instant messaging recipient.
108. The method of claim 98 wherein: one of the multiple personas comprises a work persona associated with presence of the instant messaging sender at a work location associated with the instant messaging sender, and one of the multiple personas comprises a home persona associated with presence of the instant messaging sender at home, the method further comprising: determining whether the instant messaging sender is at home or at the work location; in response to a determination that the instant messaging sender is at home, selecting the home persona for use in the instant messaging communications session; and in response to a determination that the instant messaging sender is at the work location, selecting the work persona for use in the instant messaging communications session.
109. The method of claim 98 further comprising selecting a persona to be displayed by the potential instant messaging recipient based on time of day.
110. The method of claim 98 further comprising selecting a persona to be displayed by the potential instant messaging recipient based on day of week.
111. The method of claim 98 further comprising selecting a persona to be displayed by the potential instant messaging recipient based on a group of potential instant messaging recipients that is associated with the potential instant messaging recipient.
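Claims 104 through 111 describe selecting among multiple personas using the identity of the recipient, a group the recipient belongs to, the sender's location, the time of day, and the day of week. As an illustrative sketch only (the application contains no code; every name and rule ordering below is hypothetical), such a selection could be layered as:

```python
from datetime import datetime

# Hypothetical persona assignments keyed by recipient or recipient group,
# as in claims 104-105; values name one of the sender's defined personas.
RECIPIENT_PERSONAS = {"boss@example.com": "work", "family": "home"}

def select_persona(recipient, groups, sender_location, now=None):
    """Return the name of the persona to project (claims 104-111)."""
    now = now or datetime.now()
    # 1. An explicit per-recipient assignment wins (claim 104).
    if recipient in RECIPIENT_PERSONAS:
        return RECIPIENT_PERSONAS[recipient]
    # 2. Otherwise, an assignment to a group containing the recipient
    #    (claims 105 and 111).
    for group in groups.get(recipient, []):
        if group in RECIPIENT_PERSONAS:
            return RECIPIENT_PERSONAS[group]
    # 3. Otherwise, the sender's presence at the work location (claim 108).
    if sender_location == "work":
        return "work"
    # 4. Otherwise, day of week and time of day (claims 109-110).
    if now.weekday() >= 5:  # Saturday (5) or Sunday (6)
        return "weekend"
    return "home" if now.hour >= 18 else "work"
```

For example, `select_persona("boss@example.com", {}, "home")` yields "work" regardless of location or time, because the per-recipient assignment takes priority.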
112. The method of claim 98 wherein at least some characteristics of a persona may be transparent to the instant messaging sender.
113. The method of claim 98 wherein the sender avatar is animated to send an out-of-band communication from the instant messaging sender to the potential instant messaging recipient.
114. The method of claim 113 wherein the out-of-band communication comprises a communication indicating an environmental condition associated with the instant messaging sender.
115. The method of claim 114 wherein the environmental condition comprises an environmental condition related to weather occurring in a geographic location near the instant messaging sender.
116. The method of claim 113 wherein the out-of-band communication comprises a communication indicating a personality characteristic associated with the instant messaging sender.
117. The method of claim 113 wherein the out-of-band communication comprises a communication indicating an emotional state associated with the instant messaging sender.
118. The method of claim 113 wherein the out-of-band communication comprises a communication indicating a setting characteristic associated with the instant messaging sender.
119. The method of claim 118 wherein the setting characteristic comprises a characteristic related to time of day of the instant messaging sender.
120. The method of claim 118 wherein the setting characteristic comprises a characteristic related to time of year.
121. The method of claim 120 wherein the time of year comprises a holiday.
122. The method of claim 120 wherein the time of year comprises a season wherein the season is one of spring, summer, fall or winter.
123. The method of claim 118 wherein the setting characteristic comprises a characteristic associated with a work setting.
124. The method of claim 118 wherein the setting characteristic comprises a characteristic associated with a recreation setting.
125. The method of claim 124 wherein the recreation setting comprises a beach setting or a tropical setting.
126. The method of claim 124 wherein the recreation setting comprises a winter sport setting.
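Claims 113 through 126 concern out-of-band communication: information such as weather, mood, holiday, or season is conveyed by the avatar's animation rather than by the message text. A minimal sketch of a condition-to-animation table (the condition keys and animation names are hypothetical, not taken from the application):

```python
# Ordered rules mapping out-of-band context to an avatar animation;
# the first condition that matches drives the animation (claims 113-126).
OUT_OF_BAND_RULES = [
    (lambda ctx: ctx.get("weather") == "rain",     "open_umbrella"),      # claim 115
    (lambda ctx: ctx.get("mood") == "happy",       "big_smile"),          # claim 117
    (lambda ctx: ctx.get("season") == "winter",    "wear_scarf"),         # claim 122
    (lambda ctx: ctx.get("holiday") == "new_year", "launch_fireworks"),   # claim 121
    (lambda ctx: ctx.get("setting") == "beach",    "put_on_sunglasses"),  # claim 125
]

def out_of_band_animation(context):
    """Return the animation conveying the sender's context, or a neutral idle."""
    for condition, animation in OUT_OF_BAND_RULES:
        if condition(context):
            return animation
    return "idle"
```

Because the rules are ordered, a rainy-day context still produces the umbrella animation even when a mood is also present; a real client would likely need a richer priority scheme.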
127. The method of claims 98-126 wherein the processes are performed by a computer program that is configured to enable perception of multiple online personas in an instant messaging communications session and that is embodied on a computer-readable medium or propagated signal.
128. An apparatus for enabling perception of multiple online personas in an instant messaging communications session, the apparatus comprising a processor connected to a storage device and one or more input/output devices, wherein the processor is configured to: render, on an instant messaging recipient system, an instant messaging application user interface for an instant messaging communications session involving at least one potential instant messaging recipient and a single potential instant messaging sender; send a message that includes a text message and a persona selected among multiple possible personas associated with the instant messaging sender to be displayed by the potential instant messaging recipient when displaying the text message, the selected persona comprising a collection of one or more self-expression items and a sender avatar capable of being animated; and render the selected persona at the potential instant messaging recipient system when rendering another portion of the message.
129. A computer-implemented method for using an avatar to communicate, the method comprising: representing a user graphically using an avatar capable of being animated, wherein the avatar is associated with multiple animations and multiple features of appearance that represent a pattern of characteristics representing a personality of the avatar.
130. The method of claim 129 wherein the avatar is associated with a description that identifies the personality of the avatar.
131. The method of claim 129 wherein the personality of the avatar includes at least some characteristics that are distinct from at least some characteristics of a personality of the user.
132. The method of claim 129 further comprising: graphically representing a second user with a second avatar capable of being animated wherein the second avatar is associated with multiple animations and multiple features of appearance that represent a pattern of characteristics representing a personality of the second avatar, wherein: the personality of the second avatar includes at least some characteristics that are distinct from at least some characteristics of the personality of the first avatar, and communication messages are being sent between the first user and the second user.
133. The method of claims 129-132 wherein the processes are performed by a computer program that is configured to use an avatar to communicate and that is embodied on a computer-readable medium or propagated signal.
134. An apparatus for using an avatar to communicate, the apparatus comprising a processor connected to a storage device and one or more input/output devices, wherein the processor is configured to: represent a user graphically using an avatar capable of being animated, wherein the avatar is associated with multiple animations and multiple features of appearance that represent a pattern of characteristics representing a personality of the avatar.
135. An apparatus for using an avatar to communicate, the apparatus comprising: means for representing a user graphically using an avatar capable of being animated, wherein the avatar is associated with multiple animations and multiple features of appearance that represent a pattern of characteristics representing a personality of the avatar.
136. A computer-implemented method for animating a first avatar based on perceived animation of a second avatar, the method comprising: graphically representing a first user with a first avatar capable of being animated; graphically representing a second user with a second avatar capable of being animated wherein communication messages are being sent between the first user and the second user; receiving an indication of an animation of the first avatar; and in response to and based on the received indication of the animation, animating the second avatar.
137. The method of claim 136 wherein receiving the indication of an animation comprises receiving an indication of any type of animation of the first avatar.
138. The method of claim 136 wherein receiving the indication of an animation comprises receiving an indication of a particular animation of multiple possible animations of the first avatar.
139. The method of claim 136 further comprising animating the first avatar in response to and based on the animation of the second avatar.
140. The method of claim 136 wherein the first avatar is animated in response to a particular portion of a message sent between the first user and the second user.
141. The method of claim 140 wherein the first avatar is animated in response to a particular portion of a message sent from the first user to the second user.
142. The method of claim 140 wherein the first avatar is animated in response to a particular portion of a message sent to the first user from the second user.
143. The method of claim 136 wherein the first avatar is animated to send an out-of-band communication from the first user to the second user.
144. The method of claim 143 wherein the out-of-band communication comprises a communication indicating an environmental condition associated with the first user.
145. The method of claim 144 wherein the environmental condition comprises an environmental condition related to weather occurring in a geographic location near the first user.
146. The method of claim 143 wherein the out-of-band communication comprises a communication indicating a personality characteristic associated with the first user.
147. The method of claim 143 wherein the out-of-band communication comprises a communication indicating an emotional state associated with the first user.
148. The method of claim 143 wherein the out-of-band communication comprises a communication indicating a setting characteristic associated with the first user.
149. The method of claim 148 wherein the setting characteristic comprises a characteristic related to time of day of the first user.
150. The method of claim 148 wherein the setting characteristic comprises a characteristic related to time of year.
151. The method of claim 150 wherein the time of year comprises a holiday.
152. The method of claim 150 wherein the time of year comprises a season wherein the season is one of spring, summer, fall or winter.
153. The method of claim 148 wherein the setting characteristic comprises a characteristic associated with a work setting.
154. The method of claim 148 wherein the setting characteristic comprises a characteristic associated with a recreation setting.
155. The method of claim 154 wherein the recreation setting comprises a beach setting or a tropical setting.
156. The method of claim 154 wherein the recreation setting comprises a winter sport setting.
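Claims 136 through 139 describe reactive animation: the second user's avatar is animated in response to an indication that the first user's avatar was animated, either generically for any animation or specifically per source animation. A minimal sketch (the animation names and response table are hypothetical):

```python
# Specific responses of the second avatar to particular animations of the
# first avatar (claim 138); anything else draws a generic reaction (claim 137).
RESPONSES = {
    "wave": "wave_back",
    "wink": "blush",
    "yawn": "fall_asleep",
}

def react(first_avatar_animation):
    """Choose the second avatar's animation from the first avatar's animation."""
    return RESPONSES.get(first_avatar_animation, "glance")
```

Since claim 139 lets the first avatar react in turn, a real client would have to cap or damp this exchange to keep mirrored avatars from animating each other indefinitely.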
157. The method of claims 136-156 wherein the processes are performed by a computer program that is configured to animate a first avatar based on perceived animation of a second avatar and that is embodied on a computer-readable medium or propagated signal.
158. An apparatus for animating a first avatar based on perceived animation of a second avatar, the apparatus comprising a processor connected to a storage device and one or more input/output devices, wherein the processor is configured to: graphically represent a first user with a first avatar capable of being animated; graphically represent a second user with a second avatar capable of being animated wherein communication messages are being sent between the first user and the second user; receive an indication of an animation of the first avatar; and animate the second avatar in response to and based on the received indication of the animation.
159. The apparatus of claim 158 wherein the processor is configured to receive an indication of any type of animation of the first avatar.
160. The apparatus of claim 158 wherein the processor is configured to receive an indication of a particular animation of multiple possible animations of the first avatar.
161. The apparatus of claim 158 wherein the processor is further configured to animate the first avatar in response to and based on the animation of the second avatar.
162. The apparatus of claim 158 wherein the processor is further configured to animate the first avatar in response to a particular portion of a message sent between the first user and the second user.
163. The apparatus of claim 158 wherein the processor is further configured to animate the first avatar to send an out-of-band communication from the first user to the second user.
PCT/US2004/006284 2003-03-03 2004-03-01 Using avatars to communicate WO2004079530A2 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
EP04716149A EP1599862A2 (en) 2003-03-03 2004-03-01 Using avatars to communicate
JP2006508976A JP2006520053A (en) 2003-03-03 2004-03-01 How to use an avatar to communicate
CA002517909A CA2517909A1 (en) 2003-03-03 2004-03-01 Using avatars to communicate
AU2004216758A AU2004216758A1 (en) 2003-03-03 2004-03-01 Using avatars to communicate

Applications Claiming Priority (12)

Application Number Priority Date Filing Date Title
US45066303P 2003-03-03 2003-03-03
US60/450,663 2003-03-03
US51285203P 2003-10-22 2003-10-22
US60/512,852 2003-10-22
US10/747,701 US7484176B2 (en) 2003-03-03 2003-12-30 Reactive avatars
US10/747,255 US20040179039A1 (en) 2003-03-03 2003-12-30 Using avatars to communicate
US10/747,652 US20040179037A1 (en) 2003-03-03 2003-12-30 Using avatars to communicate context out-of-band
US10/747,701 2003-12-30
US10/747,255 2003-12-30
US10/747,652 2003-12-30
US10/747,696 US7636755B2 (en) 2002-11-21 2003-12-30 Multiple avatar personalities
US10/747,696 2003-12-30

Publications (2)

Publication Number Publication Date
WO2004079530A2 true WO2004079530A2 (en) 2004-09-16
WO2004079530A3 WO2004079530A3 (en) 2004-10-28

Family

ID=32966868

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2004/006284 WO2004079530A2 (en) 2003-03-03 2004-03-01 Using avatars to communicate

Country Status (5)

Country Link
EP (1) EP1599862A2 (en)
JP (1) JP2006520053A (en)
AU (1) AU2004216758A1 (en)
CA (1) CA2517909A1 (en)
WO (1) WO2004079530A2 (en)

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102004061884A1 (en) * 2004-12-22 2006-07-13 Combots Product Gmbh & Co. Kg Messaging type telecommunication between users who are each provided with a virtual representative whose animated interactions are used to transfer chat-type text between the communicating users
WO2007072328A2 (en) 2005-12-20 2007-06-28 Philips Intellectual Property & Standards Gmbh Method of sending motion control content in a message, message transmitting device and message rendering device
WO2007129144A2 (en) 2005-12-09 2007-11-15 Ebuddy Holding B.V. High level network layer system and method
DE102006025687A1 (en) * 2006-06-01 2007-12-06 Combots Product Gmbh Communication device for animated communication, has communication terminal with display device, on which window-based graphic user interface is represented, and area is defined as window, which is represented transparently
DE102006025685A1 (en) * 2006-06-01 2007-12-06 Combots Product Gmbh & Co. Kg Communication device for animated communication, has display device, on which window-based graphic user interface is represented, and area is defined as window, which is represented transparently
DE102006025686A1 (en) * 2006-06-01 2008-02-07 Combots Product Gmbh Communication device, has display unit, and area defined as window, where combot is composed of non-transparent pixels and represented in window, and window-based graphical user interface can be represented on unit
EP1887524A1 (en) * 2005-06-02 2008-02-13 Tencent Technology (Shenzhen) Company Limited Animation displaying method and system thereof
JP2008523477A (en) * 2004-12-09 2008-07-03 ▲騰▼▲訊▼科技(深▲セン▼)有限公司 Data transmission management method, system and server
US7711778B2 (en) 2006-09-05 2010-05-04 Samsung Electronics Co., Ltd Method for transmitting software robot message
CN103797761A (en) * 2013-08-22 2014-05-14 华为技术有限公司 Communication method, client, and terminal
WO2015148585A1 (en) * 2014-03-28 2015-10-01 Microsoft Technology Licensing, Llc Delivering an action
WO2016045005A1 (en) 2014-09-24 2016-03-31 Intel Corporation User gesture driven avatar apparatus and method
EP3101845A1 (en) * 2015-06-01 2016-12-07 Facebook, Inc. Providing augmented message elements in electronic communication threads
US9686219B2 (en) 2010-04-14 2017-06-20 Nokia Technologies Oy Systems, methods, and apparatuses for facilitating determination of a message recipient
WO2017137952A1 (en) * 2016-02-10 2017-08-17 Vats Nitin Intelligent chatting on digital communication network
DK201670642A1 (en) * 2016-05-18 2017-12-04 Apple Inc Devices, Methods, and Graphical User Interfaces for Messaging
US9959037B2 (en) 2016-05-18 2018-05-01 Apple Inc. Devices, methods, and graphical user interfaces for messaging
US9973458B2 (en) 2012-04-06 2018-05-15 I-On Communications Co., Ltd. Mobile chat system for supporting cartoon story-style communication on webpage
CN110023985A (en) * 2016-10-24 2019-07-16 斯纳普公司 Simultaneously displaying format customization head portrait is generated in electronic information
CN110799937A (en) * 2017-04-27 2020-02-14 斯纳普公司 Location-based virtual avatar
US10706606B2 (en) 2017-08-24 2020-07-07 Fuji Xerox Co., Ltd. Information processing apparatus for modifying a graphical object based on sensor input
US10791081B2 (en) 2015-06-01 2020-09-29 Facebook, Inc. Providing augmented message elements in electronic communication threads
US11159922B2 (en) 2016-06-12 2021-10-26 Apple Inc. Layers in messaging applications
US11221751B2 (en) 2016-05-18 2022-01-11 Apple Inc. Devices, methods, and graphical user interfaces for messaging
US11607616B2 (en) 2012-05-08 2023-03-21 Snap Inc. System and method for generating and displaying avatars
US11966579B2 (en) 2016-08-24 2024-04-23 Apple Inc. Devices, methods, and graphical user interfaces for messaging

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101542776B1 (en) 2007-02-15 2015-08-07 엘지전자 주식회사 Controlling Method of Instant Messenger Service for Mobile Communication Terminal
KR20100037119A (en) * 2007-06-27 2010-04-08 카렌 날리스 엔터프라이지즈 피티와이 엘티디 Communication method, system and products
JP4931245B2 (en) * 2007-11-30 2012-05-16 インターナショナル・ビジネス・マシーンズ・コーポレーション Access control method, server device and system
TWI439960B (en) 2010-04-07 2014-06-01 Apple Inc Avatar editing environment
WO2013080636A1 (en) * 2011-12-02 2013-06-06 株式会社コナミデジタルエンタテインメント Server device, recording medium, and avatar management method
JP5048877B1 (en) 2012-02-14 2012-10-17 株式会社 ディー・エヌ・エー Social game computing
JP5427925B2 (en) * 2012-06-28 2014-02-26 株式会社 ディー・エヌ・エー Social game computing
JP5460918B1 (en) * 2013-11-20 2014-04-02 株式会社 ディー・エヌ・エー GAME PROGRAM AND GAME SYSTEM
JP2016071571A (en) * 2014-09-30 2016-05-09 大日本印刷株式会社 Message transmission device and computer program
EP3238176B1 (en) * 2014-12-11 2023-11-01 Intel Corporation Avatar selection mechanism
AU2017100670C4 (en) 2016-06-12 2019-11-21 Apple Inc. User interfaces for retrieving contextually relevant media content
KR20230144661A (en) 2017-05-16 2023-10-16 애플 인크. Emoji recording and sending
US10210648B2 (en) * 2017-05-16 2019-02-19 Apple Inc. Emojicon puppeting
US10375313B1 (en) 2018-05-07 2019-08-06 Apple Inc. Creative camera
DK180078B1 (en) 2018-05-07 2020-03-31 Apple Inc. USER INTERFACE FOR AVATAR CREATION
US11722764B2 (en) 2018-05-07 2023-08-08 Apple Inc. Creative camera
CN113535306B (en) * 2018-05-07 2023-04-07 苹果公司 Avatar creation user interface
JP2019050018A (en) * 2018-11-09 2019-03-28 富士ゼロックス株式会社 Method for controlling display, information processor, and program
US11107261B2 (en) 2019-01-18 2021-08-31 Apple Inc. Virtual avatar animation based on facial feature movement
JP6921925B2 (en) * 2019-04-08 2021-08-18 バイドゥ ドットコム タイムス テクノロジー (ベイジン) カンパニー リミテッド Parameter adjustment methods, devices, servers, computer-readable storage media and computer programs
DK181103B1 (en) 2020-05-11 2022-12-15 Apple Inc User interfaces related to time
US11921998B2 (en) 2020-05-11 2024-03-05 Apple Inc. Editing features of an avatar
US11714536B2 (en) 2021-05-21 2023-08-01 Apple Inc. Avatar sticker editor user interfaces
US11776190B2 (en) 2021-06-04 2023-10-03 Apple Inc. Techniques for managing an avatar on a lock screen

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5793365A (en) * 1996-01-02 1998-08-11 Sun Microsystems, Inc. System and method providing a computer user interface enabling access to distributed workgroup members
US5880731A (en) * 1995-12-14 1999-03-09 Microsoft Corporation Use of avatars with automatic gesturing and bounded interaction in on-line chat session

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11184790A (en) * 1997-12-25 1999-07-09 Casio Comput Co Ltd Cyberspace system and recording medium for storing program for providing cyberspace to user terminal
JP3450760B2 (en) * 1999-10-14 2003-09-29 富士通株式会社 Communication promotion method and system
JP3720230B2 (en) * 2000-02-18 2005-11-24 シャープ株式会社 Expression data control system, expression data control apparatus constituting the same, and recording medium on which the program is recorded
JP2001338077A (en) * 2000-05-24 2001-12-07 Digital Passage:Kk Language lesson method through internet, system for the same and recording medium
JP2003058484A (en) * 2001-08-21 2003-02-28 Sony Corp Method and device for providing community service, program storage medium and program

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5880731A (en) * 1995-12-14 1999-03-09 Microsoft Corporation Use of avatars with automatic gesturing and bounded interaction in on-line chat session
US5793365A (en) * 1996-01-02 1998-08-11 Sun Microsystems, Inc. System and method providing a computer user interface enabling access to distributed workgroup members

Cited By (70)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9002743B2 (en) 2004-12-09 2015-04-07 Tencent Technology (Shenzhen) Company Limited Method, system and server for managing data transmission
JP2008523477A (en) * 2004-12-09 2008-07-03 ▲騰▼▲訊▼科技(深▲セン▼)有限公司 Data transmission management method, system and server
DE102004061884B4 (en) * 2004-12-22 2007-01-18 Combots Product Gmbh & Co. Kg Method and system for telecommunications with virtual substitutes
DE102004061884A1 (en) * 2004-12-22 2006-07-13 Combots Product Gmbh & Co. Kg Messaging type telecommunication between users who are each provided with a virtual representative whose animated interactions are used to transfer chat-type text between the communicating users
EP1887524A1 (en) * 2005-06-02 2008-02-13 Tencent Technology (Shenzhen) Company Limited Animation displaying method and system thereof
EP1887524A4 (en) * 2005-06-02 2012-02-15 Tencent Tech Shenzhen Co Ltd Animation displaying method and system thereof
US10536412B2 (en) 2005-12-09 2020-01-14 Ebuddy Technologies B.V. Contact list aggregation and display
US10523612B2 (en) 2005-12-09 2019-12-31 Ebuddy Technologies B.V. Message history display system and method
US11438291B2 (en) 2005-12-09 2022-09-06 Ebuddy Holding B.V. Message history display system and method
EP1969786A2 (en) * 2005-12-09 2008-09-17 Ebuddy Holding B.V. High level network layer system and method
US10735364B2 (en) 2005-12-09 2020-08-04 Ebuddy Technologies B.V. Title provisioning for event notification on a mobile device
US9584453B2 (en) 2005-12-09 2017-02-28 Ebuddy Holding B.V. Contact list aggregation and display
US8356070B2 (en) 2005-12-09 2013-01-15 Ebuddy Holding B.V. High level network layer system and method
EP1969786B1 (en) * 2005-12-09 2013-06-26 Ebuddy Holding B.V. High level network layer system and method
US8510395B2 (en) 2005-12-09 2013-08-13 Ebuddy Holding B.V. Contact list display system and method
US8700713B2 (en) 2005-12-09 2014-04-15 Ebuddy Holding B.V. Picture provisioning system and method
US11438293B2 (en) 2005-12-09 2022-09-06 Ebuddy Holding B.V. Title provisioning for event notification on a mobile device
US8806084B2 (en) 2005-12-09 2014-08-12 Ebuddy Holding B.V. Event notification system and method
WO2007129144A2 (en) 2005-12-09 2007-11-15 Ebuddy Holding B.V. High level network layer system and method
USRE46328E1 (en) 2005-12-09 2017-02-28 Ebuddy Holding B.V. Event notification system and method
US10389666B2 (en) 2005-12-09 2019-08-20 Ebuddy Technologies B.V. Event notification
US9250984B2 (en) 2005-12-09 2016-02-02 Ebuddy Holding B.V. Message history display system and method
US10986057B2 (en) 2005-12-09 2021-04-20 Ebuddy Technologies B.V. Message history display system and method
US11012393B2 (en) 2005-12-09 2021-05-18 Ebuddy Technologies B.V. Contact list aggregation and display
US11689489B2 (en) 2005-12-09 2023-06-27 Ebuddy Technologies B.V. Message history display system and method
WO2007072328A2 (en) 2005-12-20 2007-06-28 Philips Intellectual Property & Standards Gmbh Method of sending motion control content in a message, message transmitting device and message rendering device
DE102006025686A1 (en) * 2006-06-01 2008-02-07 Combots Product Gmbh Communication device, has display unit, and area defined as window, where combot is composed of non-transparent pixels and represented in window, and window-based graphical user interface can be represented on unit
DE102006025687A1 (en) * 2006-06-01 2007-12-06 Combots Product Gmbh Communication device for animated communication, has communication terminal with display device, on which window-based graphic user interface is represented, and area is defined as window, which is represented transparently
DE102006025685A1 (en) * 2006-06-01 2007-12-06 Combots Product Gmbh & Co. Kg Communication device for animated communication, has display device, on which window-based graphic user interface is represented, and area is defined as window, which is represented transparently
US7711778B2 (en) 2006-09-05 2010-05-04 Samsung Electronics Co., Ltd Method for transmitting software robot message
US9686219B2 (en) 2010-04-14 2017-06-20 Nokia Technologies Oy Systems, methods, and apparatuses for facilitating determination of a message recipient
US9973458B2 (en) 2012-04-06 2018-05-15 I-On Communications Co., Ltd. Mobile chat system for supporting cartoon story-style communication on webpage
US11607616B2 (en) 2012-05-08 2023-03-21 Snap Inc. System and method for generating and displaying avatars
US11925869B2 (en) 2012-05-08 2024-03-12 Snap Inc. System and method for generating and displaying avatars
CN103797761B (en) * 2013-08-22 2017-02-22 华为技术有限公司 Communication method, client, and terminal
US9531841B2 (en) 2013-08-22 2016-12-27 Huawei Technologies Co., Ltd. Communications method, client, and terminal
JP2016536695A (en) * 2013-08-22 2016-11-24 華為技術有限公司Huawei Technologies Co.,Ltd. Communication method, client, and terminal
EP2866391A4 (en) * 2013-08-22 2015-06-24 Huawei Tech Co Ltd Communication method, client, and terminal
CN103797761A (en) * 2013-08-22 2014-05-14 华为技术有限公司 Communication method, client, and terminal
WO2015148585A1 (en) * 2014-03-28 2015-10-01 Microsoft Technology Licensing, Llc Delivering an action
EP3198560A4 (en) * 2014-09-24 2018-05-09 Intel Corporation User gesture driven avatar apparatus and method
WO2016045005A1 (en) 2014-09-24 2016-03-31 Intel Corporation User gesture driven avatar apparatus and method
US11233762B2 (en) 2015-06-01 2022-01-25 Facebook, Inc. Providing augmented message elements in electronic communication threads
EP3101845A1 (en) * 2015-06-01 2016-12-07 Facebook, Inc. Providing augmented message elements in electronic communication threads
US10791081B2 (en) 2015-06-01 2020-09-29 Facebook, Inc. Providing augmented message elements in electronic communication threads
WO2017137952A1 (en) * 2016-02-10 2017-08-17 Vats Nitin Intelligent chatting on digital communication network
US11320982B2 (en) 2016-05-18 2022-05-03 Apple Inc. Devices, methods, and graphical user interfaces for messaging
US11625165B2 (en) 2016-05-18 2023-04-11 Apple Inc. Devices, methods, and graphical user interfaces for messaging
US10983689B2 (en) 2016-05-18 2021-04-20 Apple Inc. Devices, methods, and graphical user interfaces for messaging
US11954323B2 (en) 2016-05-18 2024-04-09 Apple Inc. Devices, methods, and graphical user interfaces for initiating a payment action in a messaging session
US11112963B2 (en) 2016-05-18 2021-09-07 Apple Inc. Devices, methods, and graphical user interfaces for messaging
US11126348B2 (en) 2016-05-18 2021-09-21 Apple Inc. Devices, methods, and graphical user interfaces for messaging
AU2019204403B2 (en) * 2016-05-18 2019-08-01 Apple Inc. Devices, Methods, and Graphical User Interfaces for Messaging
US11221751B2 (en) 2016-05-18 2022-01-11 Apple Inc. Devices, methods, and graphical user interfaces for messaging
US10949081B2 (en) 2016-05-18 2021-03-16 Apple Inc. Devices, methods, and graphical user interfaces for messaging
EP3594795A1 (en) * 2016-05-18 2020-01-15 Apple Inc. Devices, methods, and graphical user interfaces for messaging
US9959037B2 (en) 2016-05-18 2018-05-01 Apple Inc. Devices, methods, and graphical user interfaces for messaging
DK201670642A1 (en) * 2016-05-18 2017-12-04 Apple Inc Devices, Methods, and Graphical User Interfaces for Messaging
US10852935B2 (en) 2016-05-18 2020-12-01 Apple Inc. Devices, methods, and graphical user interfaces for messaging
US11159922B2 (en) 2016-06-12 2021-10-26 Apple Inc. Layers in messaging applications
US11778430B2 (en) 2016-06-12 2023-10-03 Apple Inc. Layers in messaging applications
US11966579B2 (en) 2016-08-24 2024-04-23 Apple Inc. Devices, methods, and graphical user interfaces for messaging
CN110023985B (en) * 2016-10-24 2023-07-21 斯纳普公司 Generating and displaying custom avatars in electronic messages
US11843456B2 (en) 2016-10-24 2023-12-12 Snap Inc. Generating and displaying customized avatars in media overlays
US11876762B1 (en) 2016-10-24 2024-01-16 Snap Inc. Generating and displaying customized avatars in media overlays
CN110023985A (en) * 2016-10-24 2019-07-16 斯纳普公司 Simultaneously displaying format customization head portrait is generated in electronic information
CN110799937A (en) * 2017-04-27 2020-02-14 斯纳普公司 Location-based virtual avatar
US11842411B2 (en) 2017-04-27 2023-12-12 Snap Inc. Location-based virtual avatars
US11893647B2 (en) 2017-04-27 2024-02-06 Snap Inc. Location-based virtual avatars
US10706606B2 (en) 2017-08-24 2020-07-07 Fuji Xerox Co., Ltd. Information processing apparatus for modifying a graphical object based on sensor input

Also Published As

Publication number Publication date
WO2004079530A3 (en) 2004-10-28
AU2004216758A1 (en) 2004-09-16
CA2517909A1 (en) 2004-09-16
JP2006520053A (en) 2006-08-31
EP1599862A2 (en) 2005-11-30

Similar Documents

Publication Publication Date Title
US10504266B2 (en) Reactive avatars
US20180054466A1 (en) Multiple avatar personalities
US10616367B2 (en) Modifying avatar behavior based on user action or mood
US7913176B1 (en) Applying access controls to communications with avatars
EP1599862A2 (en) Using avatars to communicate
US20070168863A1 (en) Interacting avatars in an instant messaging communication session
US20070113181A1 (en) Using avatars to communicate real-time information
US10042536B2 (en) Avatars reflecting user states
US7468729B1 (en) Using an avatar to generate user profile information
US6948131B1 (en) Communication system and method including rich media tools
AU2001241645A1 (en) Communication system and method including rich media tools
WO2007134402A1 (en) Instant messaging system
US9652809B1 (en) Using user profile information to determine an avatar and/or avatar characteristics

Legal Events

Date Code Title Description
AK Designated states
Kind code of ref document: A2
Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents
Kind code of ref document: A2
Designated state(s): BW GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application

WWE Wipo information: entry into national phase
Ref document number: 2004216758
Country of ref document: AU

WWE Wipo information: entry into national phase
Ref document number: 3516/DELNP/2005
Country of ref document: IN

WWE Wipo information: entry into national phase
Ref document number: 2004716149
Country of ref document: EP

WWE Wipo information: entry into national phase
Ref document number: 2006508976
Country of ref document: JP

ENP Entry into the national phase
Ref document number: 2004216758
Country of ref document: AU
Date of ref document: 20040301
Kind code of ref document: A

WWP Wipo information: published in national office
Ref document number: 2004216758
Country of ref document: AU

WWE Wipo information: entry into national phase
Ref document number: 2517909
Country of ref document: CA

WWE Wipo information: entry into national phase
Ref document number: 20048057907
Country of ref document: CN

WWP Wipo information: published in national office
Ref document number: 2004716149
Country of ref document: EP