JP2006520053A - How to use an avatar to communicate - Google Patents

How to use an avatar to communicate

Info

Publication number
JP2006520053A
JP2006520053A (application JP2006508976A)
Authority
JP
Japan
Prior art keywords
avatar
user
method
sender
personality
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2006508976A
Other languages
Japanese (ja)
Inventor
ウエアベル アンドレウ
オデル ジャミエ
ディー. ロビンソン ジョフン
エス. レビンソン ダビド
ジー. ロベ トム
ディー. ブラットネル パトリック
ディー. ヘイケス ブリアン
ジェイ. ブラックウエル ミカエル
Original Assignee
America Online, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US45066303P priority Critical
Priority to US51285203P priority
Priority to US10/747,701 priority patent/US7484176B2/en
Priority to US10/747,255 priority patent/US20040179039A1/en
Priority to US10/747,652 priority patent/US20040179037A1/en
Priority to US10/747,696 priority patent/US7636755B2/en
Application filed by America Online, Inc.
Priority to PCT/US2004/006284 priority patent/WO2004079530A2/en
Publication of JP2006520053A
Application status: Pending

Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04L — TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00 — Arrangements for user-to-user messaging in packet-switching networks, e.g. e-mail or instant messages
    • H04L51/18 — Messages including commands or codes to be executed either at an intermediate node or at the recipient to perform message-related actions
    • H04L51/04 — Real-time or near real-time messaging, e.g. instant messaging [IM]

Abstract

A method and apparatus enable communication using an avatar through a graphical user interface on the display device of a computer.
The graphical user interface includes an instant message sender display with a sender portion that shows a sender avatar capable of displaying a number of animations. The sender avatar is animated in response to a trigger related to the content of a message sent from the sender to a recipient, and can also be animated to send out-of-band communications to another user, carrying information that is independent of the information conveyed directly in the transmitted text message. The avatar can further be included in one or more of a number of online personalities enabled for users of instant messaging activities, and can be animated in response to the animation of another avatar in the same communication activity.

Description

  This description relates to projecting a graphical representation of an operator (hereinafter "sender") of a communications application in communications transmitted within a network of computers.

  Online services can provide users with the ability to send and receive instant messages. Instant messages are private online conversations between two or more people, each of whom has access to an instant messaging service, has installed the communications software necessary to access and use the instant messaging service, and generally has access to information reflecting the online status of other users.

  The sender of an instant message can send self-expression items to the recipient of the instant message. Current implementations of instant messaging self-expression allow a user to individually select self-expression settings, such as a Buddy Icon and Buddy Wallpaper, that are projected to other users who see that the user is online or who later interact with that user.

(Summary)
The graphical user interface of a computer display enables communication using an avatar. The graphical user interface includes an instant message sender display having a sender portion that displays a sender avatar capable of displaying a number of animations. The instant message sender display also has a message composition area, in which text of a message to be sent from the sender to a recipient can be displayed, and at least one communications control. The communications control is operable to receive an indication that the message displayed in the message composition area is to be sent from the sender to the recipient. The sender avatar is animated in response to a trigger related to the content of the message sent from the sender to the recipient.

  Implementations can include one or more of the following features. For example, the instant message sender display can include a recipient portion that displays a recipient avatar, and a message history area. The recipient avatar may be capable of displaying a number of animations in response to triggers associated with the content of messages sent from the sender to the recipient. The message history area can display the contents of multiple messages transmitted between the sender and the recipient and can identify the recipient. The recipient avatar can be animated in response to an animation of the sender avatar.

  The graphical user interface can include a contact list display for displaying potential recipients. The contact list display can indicate whether each potential recipient is available to receive the message. Potential recipients can be grouped and associated with an indication of the group's identity.

  Potential recipients displayed on the contact list can be associated with potential recipient avatars. A potential recipient avatar can be displayed on the contact list along with the identity of the potential recipient. The potential recipient avatar can be animated on the contact list in response to an animation of the potential recipient avatar displayed elsewhere. The potential recipient avatar animation on the contact list may include substantially the same or different animation as the potential recipient avatar animation displayed elsewhere. The potential recipient avatar animation on the contact list may include an animation representing the potential recipient avatar animation displayed elsewhere.

  The graphical user interface may be a graphical user interface used for instant messaging communication activities. The activation trigger includes part or all of the message text.

  The appearance or animation of the sender avatar can indicate an environmental condition, a personality characteristic of the sender, an emotional state of the sender, a setting characteristic, or an activity of the sender.

  The sender avatar can be animated when a predetermined length of time passes during which the sender does not communicate a message to the recipient, or does not use the computing device being used to communicate with the recipient in the communication activity.

  An avatar animation used as a communication channel may include a breakout animation, which involves displaying the avatar outside the normal display space occupied by the avatar. The sender avatar can also be animated to produce sounds used for verbal communication.

  Implementations of the techniques discussed above may include a computer program product for creating a graphical user interface, a graphical user interface configured for display on a display device, or a system or apparatus.

  In another general aspect, communicating includes graphically representing a first user, in a communication activity involving the first user and a second user, with an avatar capable of being animated. Messages are communicated between the first user and the second user. A message conveys explicit information from the first user to the second user. Out-of-band information is communicated to the second user using a change in the appearance of the avatar, or an animation of the avatar, as a communication channel. The out-of-band communication relates to the circumstances of the first user and differs from the information conveyed in the messages transmitted between the first user and the second user.
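  By way of illustration only (such code is not part of the patent disclosure), the pairing of explicit in-band text with out-of-band avatar state described above might be modeled as follows; all class, field, and value names here are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class AvatarState:
    """Out-of-band state conveyed through the avatar rather than the text."""
    mood: str = "neutral"      # e.g. "happy", "sad", "angry"
    environment: str = ""      # e.g. weather near the first user
    activity: str = ""         # e.g. "listening to music"

@dataclass
class InstantMessage:
    sender: str
    recipient: str
    text: str                  # explicit, in-band information
    avatar_state: AvatarState = field(default_factory=AvatarState)

# The recipient's client renders the text and, separately, animates the
# avatar to reflect the out-of-band mood and environment.
msg = InstantMessage("first_user", "second_user", "See you at noon",
                     AvatarState(mood="happy", environment="sunny"))
```

  The point of the sketch is that the avatar state travels alongside the message but is not part of its text, so the two information channels can vary independently.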

  Implementations can include one or more of the following features. For example, the communication activity can be an instant messaging communication activity. The avatar can be an animation of a face that does not include a torso, ears, or legs, or an animation of a face and neck that does not include a torso, ears, or legs.

  The out-of-band information can include information indicating an environmental condition related to the first user. The environmental condition can include a condition related to weather occurring at a geographic location near the first user. The out-of-band information can also indicate a personality characteristic of the first user or an emotional state of the first user.

  The out-of-band information can include information indicating a setting characteristic related to the first user. The setting characteristic may include a characteristic related to the first user's time of day or a characteristic related to the time of year. The time of year may include a holiday or a season, the season being one of spring, summer, autumn, or winter. Setting characteristics may also include characteristics related to a work setting or a vacation setting. The vacation setting can include a beach setting, a tropical setting, or a winter-sports setting.

  The out-of-band information may include information related to the first user's mood. The first user's mood can be one of happy, sad or angry.

  The out-of-band information can be information regarding an activity of the first user. The activity can be one performed by the first user at substantially the same time that the out-of-band message is communicated from the first user to the second user. The activity can be, for example, working or listening to music. The out-of-band information can also include information conveying that the first user has muted sounds associated with the avatar.

  An animation of the avatar that carries out-of-band information from the first user to the second user can be triggered based on information conveyed in a message from the first user to the second user. The trigger can include some or all of the text of the message, or an audio portion of the message. The trigger can also be the passage of a predetermined length of time during which the first user does not communicate a message to the second user, or does not use the computing device being used to communicate with the second user in the communication activity.
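  The two kinds of triggers described above — a string found in the message text, and the passage of an idle period — might be sketched as follows. This is an illustrative sketch only; the timeout value and all names are hypothetical, not part of the patent disclosure:

```python
IDLE_TIMEOUT_SECONDS = 300  # hypothetical "predetermined length of time"

def find_text_trigger(message_text, trigger_map):
    """Return the animation bound to the first trigger string found in the
    message text, or None; matching is case-insensitive."""
    lowered = message_text.lower()
    for trigger, animation in trigger_map.items():
        if trigger.lower() in lowered:
            return animation
    return None

def idle_trigger(seconds_since_last_message):
    """An idle animation fires once the user has been inactive long enough."""
    if seconds_since_last_message >= IDLE_TIMEOUT_SECONDS:
        return "sleep"
    return None

triggers = {"LOL": "laugh", "brb": "wave"}
```

  A client would run the text check on each outgoing or incoming message and the idle check on a timer.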

  An avatar animation used as a communication channel can include a facial expression of the avatar, a gesture made by the avatar's hand or arm, a movement of the avatar's body, or a sound made by the avatar. At least some of the sounds can include audio based on the first user's voice. Avatar animations used as communication channels can include breakout animations, which involve displaying the avatar outside the normal display space occupied by the avatar. A breakout animation can include nesting, resizing, or repositioning the avatar.

  The first user can be provided with a number of predetermined avatars with associated pre-selected animations. The first user may be able to select a specific avatar to represent the user in communication activity. The first user can be permanently associated with the selected avatar to represent the first user in subsequent communication activities.

  The first user can modify the appearance of the avatar. Enabling the first user to modify the appearance of the avatar can include enabling the first user to use a slide bar to indicate a degree of modification of a particular feature of the avatar, or enabling the first user to modify the appearance of the avatar to reflect a characteristic of the first user. The characteristic of the first user may be one of age, gender, hair color, eye color, or a facial feature.

  Enabling the first user to modify the appearance of the avatar can also include enabling the first user to add, change, or delete an accessory displayed with the avatar. The accessory can be one of glasses, sunglasses, a hat, or earrings.

  The first user may be able to modify a trigger used to activate an avatar animation. The trigger can include text included in a message sent from the first user to the second user.

  The avatar can be animated for use as an information assistant that conveys information to the first user. Use of the avatar by applications other than communications applications, including an online journal, can be enabled.

  A representation of the avatar can be displayed in a form substantially similar to a trading card. The trading-card depiction of the avatar may include characteristics related to the first user.

  In yet another general aspect, use of multiple online personalities is enabled in an instant messaging communication activity. At least two identities in a communication environment to which messages can be directed are identified. A first personality of a user is projected to a first one of the identities while a second personality of the same user is concurrently projected to a second one of the identities. The first and second personalities each include an avatar capable of being animated, and the first personality and the second personality differ.

  Implementations can include one or more of the following features. For example, the first personality may differ from the second personality in that the first personality invokes a different avatar than the avatar invoked by the second personality.

  The first personality can invoke a first avatar, and the second personality can invoke a second avatar. The first avatar and the second avatar can be the same avatar. An animation associated with the first avatar may differ from an animation associated with the second avatar, and an appearance associated with the first avatar can differ from an appearance associated with the second avatar.

  An avatar can be accompanied by multiple sounds. The avatar can be animated based on the text of the message sent in the instant messaging activity. The avatar can be animated to send out-of-band communications.

  The first personality may be related to a first group of identities, so that the first personality is projected in communication activities with members of the first group of identities. The second personality may be related to a second group of identities, so that the second personality is projected in communication activities with members of the second group of identities.

  A personality can be associated with a first one of the identities, and a different personality can be associated with a group of identities to which the first identity belongs. The first personality projected to the first identity may be a fusion of the personality associated with the first identity and the different personality associated with the group of identities, with the personality associated with the first identity overriding the group personality to the extent a conflict exists.
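  The fusion rule above — per-identity settings override group settings wherever the two conflict — behaves like an ordered dictionary merge. The following sketch is illustrative only (the setting names are hypothetical, not taken from the patent):

```python
def project_personality(group_personality, individual_personality):
    """Fuse the group's personality settings with the per-identity settings;
    where both define the same setting, the per-identity value overrides."""
    merged = dict(group_personality)       # start from the group defaults
    merged.update(individual_personality)  # per-identity values win conflicts
    return merged

group = {"avatar": "fox", "wallpaper": "beach", "sound": "chime"}
individual = {"avatar": "robot"}   # conflicts with the group's avatar
projected = project_personality(group, individual)
```

  Settings the individual personality leaves undefined (wallpaper, sound) fall through to the group; the conflicting avatar setting is taken from the individual personality.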

  In yet another general aspect, use of multiple online personalities is enabled in an instant messaging communication activity. An instant messaging application user interface for an instant messaging communication activity is rendered on an instant message recipient system. The communication activity involves at least one potential instant message recipient and a single potential instant message sender. A message that includes a text message and a personality is sent. The personality is selected from among multiple possible personalities of the instant message sender and is rendered by the potential instant message recipient when the text message is displayed. The selected personality includes a collection of one or more self-expression items and a sender avatar capable of being animated. The selected personality is rendered on the potential instant message recipient system when other parts of the message are rendered.

  Implementations can include one or more of the following features. For example, the sender's personality can be selected by the instant message sender from multiple possible personalities associated with the instant message sender. The personality can be rendered before or after communication is initiated by the potential instant message sender. Self-expression items can include one or more of wallpaper, emoticons, and sounds. One or more personalities can be defined.

  A first personality can be assigned to a first potential instant message recipient, so that the first personality is thereafter automatically invoked and projected in instant messaging communication activities involving the first potential instant message recipient. A second personality can be assigned to a second potential instant message recipient, so that the second personality is thereafter automatically invoked and projected in instant messaging communication activities involving the second potential instant message recipient. The second personality can be at least partially distinct from the first personality.

  A first personality can be assigned to a first group of potential instant message recipients, so that the first personality is thereafter automatically invoked and projected in instant messaging communication activities involving members of the first group. A second personality can be assigned to a second potential instant message recipient, so that the second personality is thereafter automatically invoked and projected in instant messaging communication activities involving the second potential instant message recipient. The second personality can be at least partially distinct from the first personality.
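  The automatic invocation described above amounts to a lookup: a per-recipient assignment, if one exists, otherwise the assignment for the recipient's group, otherwise a default. A minimal sketch (all persona and group names hypothetical, not part of the patent disclosure):

```python
# Hypothetical assignments: a personality per recipient group, plus a
# per-recipient assignment that takes precedence when present.
group_personality = {"Co-Workers": "WorkPersona", "Family": "CasualPersona"}
recipient_personality = {"Boss123": "FormalPersona"}

def personality_for(recipient, recipient_group, default="DefaultPersona"):
    """Automatically invoke the personality projected to this recipient."""
    if recipient in recipient_personality:
        return recipient_personality[recipient]
    return group_personality.get(recipient_group, default)
```

  The per-recipient table is consulted first so that an individual assignment is at least partially distinct from, and wins over, the group assignment.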

  Use of one of the multiple personalities can be disabled. Disabling use of one of the multiple personalities can be based on the identity of the instant message recipient.

  One of the multiple personalities can be a work personality associated with the instant message sender's presence at a workplace, and another can be a home personality associated with the instant message sender's presence at home. A determination can be made as to whether the instant message sender is at home or at work. In response to a determination that the instant message sender is at home, the home personality can be selected for use in the instant messaging activity; in response to a determination that the instant message sender is at work, the work personality can be selected.
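  The home-or-work selection described above could be sketched as a simple mapping from the determined location to a personality; the location labels and persona names below are hypothetical:

```python
def select_personality(sender_location, personalities):
    """Select the work or home personality based on where the sender is
    determined to be; fall back to a default for unknown locations."""
    return personalities.get(sender_location, personalities["default"])

personalities = {
    "work": "WorkPersona",
    "home": "HomePersona",
    "default": "HomePersona",
}
```

  How the location itself is determined (network address, schedule, explicit user setting) is left open here, as it is in the text above.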

  The personality to be rendered can be selected based on the time of day, the day of the week, or a group of potential instant message recipients with which the potential instant message recipient is associated.

  At least some of the personality characteristics may be transparent to the instant message sender. The sender avatar can be animated to send an out-of-band communication from the instant message sender to the potential instant message recipient.

  In yet another general aspect, avatars are used to communicate. A user is graphically represented using an avatar capable of being animated. The avatar is associated with multiple animations and multiple appearance features that together represent a pattern of characteristics representing the avatar's personality.

  Implementations can include one or more of the following features. For example, the avatar may be associated with a description that identifies the avatar's personality. The avatar's personality can include at least some characteristics that differ from at least some characteristics of the user's personality. A second user can be graphically represented using a second avatar capable of being animated. The second avatar may be associated with multiple animations and multiple appearance features that together represent a pattern of characteristics representing the personality of the second avatar. The personality of the second avatar can include at least some characteristics that differ from at least some characteristics of the personality of the first avatar. Communication messages can be transmitted between the first user and the second user.

  In yet another general aspect, a first avatar is animated based on a perceived animation of a second avatar. A first user is graphically represented using a first avatar capable of being animated, and a second user is graphically represented using a second avatar capable of being animated. Communication messages are transmitted between the first user and the second user. An indication of an animation of the first avatar is received, and the second avatar is animated in response to and based on the received indication.

  Implementations can include one or more of the following features. For example, the received indication can identify any type of animation of the first avatar, or can identify a particular animation from among multiple possible animations of the first avatar. Thereafter, the first avatar can be animated in response to and based on the animation of the second avatar.

  The first avatar can be animated in response to a particular portion of a message sent between the first user and the second user. The message can be sent from the first user to the second user, or from the second user to the first user. The first avatar can be animated to send an out-of-band communication from the first user to the second user.

  Implementations of the techniques discussed above can include a method or process, a system or apparatus, or computer software on a computer-accessible medium.

  The details of one or more embodiments are set forth in the accompanying drawings and the description below. Other features will be apparent from the description and drawings, and from the claims.

(Detailed Description)
An avatar representing an instant messaging user can be animated based on the messages sent between the sender and a recipient. The instant messaging application interface is configured to detect the entry of predetermined or user-defined character strings and to associate those character strings with predetermined animations of the avatar. The avatar representing or selected by the sender is animated in the recipient's instant messaging application interface and, optionally, in the sender's instant messaging application interface. The avatar is based on an animation model that includes a mesh of polygons defining the avatar's shape, textures defining images that cover the avatar's mesh, and a light map defining the effect of light sources on the avatar. The animation model for an avatar may include at least one thousand polygons in the base wireframe that makes up the avatar's mesh, and at least twenty blend shapes, each of which is a geometric shape defining a different facial expression or configuration. The animation model includes a number of animations that can be applied to an avatar defined by the model, and an animation can be associated with one or more sound effects. The animation model for an avatar can include only the avatar's face, or the avatar's face and neck.
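  The constraints on the animation model stated above (a base mesh of at least a thousand polygons, at least twenty blend shapes) could be captured in a data structure like the following. This is an illustrative sketch, not an implementation from the patent; all field and file names are hypothetical:

```python
from dataclasses import dataclass, field

MIN_POLYGONS = 1000     # "at least a thousand polygons" in the base wireframe
MIN_BLEND_SHAPES = 20   # "at least twenty blend shapes"

@dataclass
class AnimationModel:
    mesh_polygon_count: int            # polygons defining the avatar's shape
    texture: str                       # image covering the mesh
    light_map: str                     # effect of light sources on the avatar
    blend_shapes: list = field(default_factory=list)  # named expressions

    def meets_minimums(self):
        """Check the model against the polygon and blend-shape minimums."""
        return (self.mesh_polygon_count >= MIN_POLYGONS
                and len(self.blend_shapes) >= MIN_BLEND_SHAPES)

model = AnimationModel(
    mesh_polygon_count=1200,
    texture="avatar_texture.png",
    light_map="avatar_lightmap.png",
    blend_shapes=[f"expression_{i}" for i in range(20)],
)
```

  A face-only avatar would simply restrict the mesh and blend shapes to the head and, optionally, the neck.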

  An avatar representing a user in a communication activity can also be used to send out-of-band communications to another user, carrying information that is independent of the information conveyed directly in the transmitted text messages. Out-of-band information can be communicated using a change in the avatar's appearance, or an animation of the avatar, as the communication channel. As one example, the out-of-band communication can include information about the sender's setting, environment, activity, or mood that is not explicitly communicated and is not part of the text messages exchanged by the sender and recipient.

  Users can name and save a number of different "online personalities" or "online personas," each a group of instant messaging self-expression settings such as an avatar, Buddy Sounds, Buddy Wallpaper, and emoticons (e.g., smileys). Depending on the identity with which a user is communicating, the user can then access and project a pre-selected one of these online personalities in the instant messaging environment, or can manually invoke and manage the online personality projected to others. The functions and features of the instant messaging interface can vary based on the online personality used in an instant messaging conversation.

  An avatar representing a user in a communication activity can also be animated based on an animation of another avatar representing another user in the same communication activity, without intervention by either user. This can be referred to as an automatic response of one avatar to the behavior of another avatar.

  FIG. 1 illustrates an exemplary graphical user interface 100 for an instant messaging service that allows a user to project an avatar for self-expression. The user interface 100 can be viewed by a user who is an instant message sender and whose instant messaging communications program is configured to project an avatar associated with and used as an identifier for the user to one or more other users or user groups (collectively, instant message recipients). In particular, the user IMSender is an instant message sender using the user interface 100. The instant message sender projects a sender avatar 135 in an instant messaging communication activity with the instant message recipient SuperBuddyFan1, who projects a recipient avatar 115. A corresponding graphical user interface (not shown) is used by the instant message recipient SuperBuddyFan1. In this manner, the sender avatar 135, like the recipient avatar 115, is visible in each of the sender's user interface and the recipient's user interface. The instant messaging communication activities can occur simultaneously, nearly simultaneously, or sequentially.

  The user interface (UI) 100 includes an instant message user interface 105 and an instant message buddy list window 170.

  The instant message user interface 105 has an instant message recipient portion 110 and an instant message sender portion 130. The instant message recipient portion 110 displays the recipient avatar 115 selected by the instant message recipient with whom the instant message sender is having an instant message conversation. Similarly, the instant message sender portion 130 displays the sender avatar 135 selected by the instant message sender. Displaying the sender avatar 135 in the instant message user interface 105 allows the instant message sender to perceive the avatar being projected to the particular instant message recipient with whom the sender is communicating. Avatars 135 and 115 are personalization items that can be selected by an instant messaging user for self-expression.

  The instant message user interface 105 includes an instant message composition area 145, for composing instant messages to be sent to the instant message recipient, and a message history text box 125, which displays a transcript of the instant message communication activity with the instant message recipient. Each message sent to or received from the instant message recipient is listed in chronological order in the message history text box 125, together with an indication of the user who sent the message, as shown at 126. The message history text box 125 can optionally include a time stamp 127 for each message.

  Wallpaper can be applied to portions of the graphical user interface 100. For example, wallpaper can be applied to the window portion 120 outside the message history box 125 and to the window portion 140 outside the message composition area 145. The recipient avatar 115 is displayed over, or in place of, the wallpaper applied to the window portion 120, and the wallpaper applied to the window portion 120 corresponds to the recipient avatar 115. Similarly, the sender avatar 135 is displayed over, or in place of, the wallpaper applied to the window portion 140, and the wallpaper applied to the window portion 140 corresponds to the sender avatar 135. In some implementations, a box or other boundary can be displayed around an avatar, as illustrated by the boundary 157 displayed around the sender avatar 135. The window portion 158 inside the boundary 157 can be given a wallpaper different from the wallpaper applied to the window portion 140, which is outside the message composition area 145 but not within the boundary 157. Wallpaper can appear uneven and can include animated objects. The wallpaper applied to the window portions 120 and 140 can be a personalization item selected by an instant messaging user for self-expression.

  The instant messaging user interface 105 also includes a set of feature controls 165 and a set of transmission controls 150. The feature controls 165 can control features such as encryption, conversation logging, conversation forwarding to a different communication mode, font size and color, and spell checking, among others. The set of transmission controls 150 includes a control 160 for initiating transmission of the message typed in the instant message composition area 145 and a control 155 for modifying the appearance or behavior of the sender avatar 135.

  The instant message buddy list window 170 includes an instant message sender-selected list 175 of potential instant message recipients ("buddies") 180a-180g. Buddies are typically contacts known to the potential instant message sender (here, IMSender). In the list 175, the entries 180a-180g include text identifying the screen names of the buddies included in the list 175; additionally or alternatively, a buddy can be represented by other information, such as an avatar associated with the buddy that is reduced in size and can be static or animated. For example, the entry 180a includes the screen name and avatar of the instant message recipient named SuperBuddyFan1. The entries 180a-180g can provide the instant message sender with connectivity information about a buddy, such as whether the buddy is online, how long the buddy has been online, whether the buddy is away, or whether the buddy is using a mobile device.

  Buddies can be grouped by the instant message sender into one or more user-defined or pre-selected groupings ("groups"). As shown, the instant message buddy list window 170 has three groups: Buddies 182, Co-Workers 184, and Family 186. SuperBuddyFan1 185a belongs to the Buddies group 182, and Chatting Chuck 185c belongs to the Co-Workers group 184. When a buddy's instant messaging client program is able to receive communications, the representation of the buddy in the buddy list is displayed under the name or representation of the buddy group to which the buddy belongs. As shown, at least potential instant message recipients 180a-180g are online. In contrast, when a buddy's instant messaging client program is not able to receive communications, the representation of the buddy in the buddy list is not displayed under the group with which it is associated, but is instead displayed with representations of buddies from other groups under the heading Offline 188. All buddies included in the list 175 are displayed either under one of the groups 182, 184, or 186, or under the heading Offline 188.
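  The grouping rule above — online buddies appear under their group, offline buddies collect under a shared Offline heading — could be sketched as follows. This is illustrative only; the tuple layout and example data are hypothetical:

```python
def buddy_list_sections(buddies):
    """Group online buddies under their group heading; offline buddies are
    collected under a shared "Offline" heading regardless of group."""
    sections = {}
    for screen_name, group, online in buddies:
        heading = group if online else "Offline"
        sections.setdefault(heading, []).append(screen_name)
    return sections

buddies = [
    ("SuperBuddyFan1", "Buddies", True),
    ("Chatting Chuck", "Co-Workers", True),
    ("Boss123", "Co-Workers", False),
]
sections = buddy_list_sections(buddies)
```

  Every buddy lands under exactly one heading, matching the invariant stated in the paragraph above.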

  As shown in FIG. 1, each of the sender avatar 135 and the recipient avatar 115 is a graphical image representing a user in the instant messaging activity. The sender projects the sender avatar 135 for self-expression, and the recipient likewise projects the recipient avatar 115 for self-expression. Here, each of the avatars 135 and 115 includes only a graphical image of a face, which may be referred to as a facial avatar or a head avatar. In other embodiments, an avatar can include additional body components. For example, a Thanksgiving turkey avatar can include an image of an entire turkey, including a head, neck, torso, and feathers.

  The sender avatar 135 can be animated in response to an instant message sent to the instant message recipient, and the recipient avatar 115 can be animated in response to an instant message sent by the instant message recipient. For example, the text of an instant message sent by the sender can trigger an animation of the sender avatar 135, and the text of an instant message sent by the instant message recipient to the sender can trigger an animation of the recipient avatar 115.

  More specifically, the text of a message to be sent is specified by the sender in the message composition text box 145. The text entered in the message composition text box 145 is transmitted to the recipient when the sender activates the send button 160. When the send button 160 is activated, the instant message application searches the message text for an animation activation trigger. Once an activation trigger is identified, the sender avatar 135 is animated using the animation associated with the identified activation trigger. This process is described more fully later. In the same manner, the text of a message sent by the instant message recipient and received by the sender is searched for an activation trigger and, when a trigger is found, the recipient avatar 115 is animated using the animation associated with the identified activation trigger. For example, the text of a message can include the character string "LOL", which is an acronym for "laughing out loud". The string "LOL" can trigger an animation of the sender avatar 135 or the recipient avatar 115 so that the avatar appears to be laughing.
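  The activation-trigger search described above can be sketched roughly as follows. This is an illustrative assumption, not the patent's implementation; the trigger strings and animation names are hypothetical examples.

```python
# Hypothetical trigger table: strings found in message text map to
# animations of the avatar. "LOL" appears in the text above; the
# other entries are illustrative assumptions.
ANIMATION_TRIGGERS = {
    "lol": "laugh",
    ":)": "smile",
    ":(": "frown",
}

def find_animation(message_text):
    """Return the animation for the first trigger found in the message, or None."""
    lowered = message_text.lower()
    for trigger, animation in ANIMATION_TRIGGERS.items():
        if trigger in lowered:
            return animation
    return None
```

Under this sketch, sending "That was funny, LOL" would cause the laughing animation to play, while a message with no known trigger leaves the avatar unchanged.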

  Alternatively or additionally, the sender avatar 135 can be animated in response to an instant message sent from the instant message recipient, and the recipient avatar 115 can be animated in response to a message sent from the instant message sender. For example, the text of an instant message sent by the sender can trigger an animation of the recipient avatar 115, and the text of an instant message sent by the instant message recipient to the sender can trigger an animation of the sender avatar 135.

  More specifically, the text of a message to be sent is specified by the sender in the message composition text box 145. The text entered in the message composition text box 145 is transmitted to the recipient when the sender activates the send button 160. When the send button 160 is activated, the instant message application searches the message text for an animation activation trigger. Once an activation trigger is identified, the recipient avatar 115 is animated using the animation associated with the identified activation trigger. In the same manner, the text of a message sent by the instant message recipient and received by the sender is searched for an activation trigger and, when a trigger is found, the sender avatar 135 is animated using the animation associated with the identified activation trigger.

  In addition, the sender avatar 135 or the recipient avatar 115 can be animated directly in response to a request from the sender or the recipient. Direct animation of the sender avatar 135 or the recipient avatar 115 enables the avatars to be used as a means of communicating information between the sender and the recipient without sending an instant message. For example, the sender can take an action to animate the sender avatar 135 directly, or the recipient can take an action to animate the recipient avatar 115 directly. Such an action can include pressing a button corresponding to the animation to be played, or selecting the animation to be played from a list of animations. For example, the sender can be presented with a button, distinct from the send button 160, that triggers an animation of the sender avatar 135. Selecting that button can play an animation of the sender avatar 135 without any other action being taken, such as sending the instant message specified in the message composition area 145. The played animation can be randomly selected from the possible animations of the sender avatar 135, or a particular animation can be selected before the button is selected.

  An animation of one of the avatars 135 and 115 displayed on the instant message user interface 105 can cause an animation of the other avatar. For example, an animation of the recipient avatar 115 can trigger an animation of the sender avatar 135, and vice versa. For example, the sender avatar 135 can be animated to appear to cry. In response to the animation of the sender avatar 135, the recipient avatar 115 can also be animated to appear to cry. Alternatively, the recipient avatar 115 can be animated to appear to comfort or sympathize with the crying sender avatar 135. In another example, the sender avatar 135 can be animated to blow a kiss and, in response, the recipient avatar 115 can be animated to blush.
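  One way to realize the avatar-to-avatar reactions described above is a small reaction table keyed by the animation just played, as in this hedged sketch. The pairings (crying triggers comforting, a kiss triggers blushing) come from the examples in the text; the function name and table form are assumptions.

```python
# Illustrative reaction table: the animation played on one avatar
# determines the responsive animation, if any, on the other avatar.
REACTIONS = {
    "cry": "comfort",
    "kiss": "blush",
}

def reaction_for(played_animation):
    """Animation to play on the other avatar in response, or None."""
    return REACTIONS.get(played_animation)
```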

  The recipient avatar 115 can appear to respond to a mood of the sender communicated by the sender avatar 135. For example, in response to a frowning or tearful animation of the sender avatar 135, the recipient avatar 115 can also appear sad. Alternatively, the recipient avatar 115 can be animated to try to cheer up the sender avatar 135, such as by smiling, sticking out its tongue, or displaying another funny expression, or by displaying a sympathetic expression.

  The avatar 135 or 115 can be animated in response to a detected idle period of a predetermined duration. For example, after a period of sender inactivity, the sender avatar 135 can be animated to give the appearance that the avatar is sleeping, has disconnected from the instant messaging user interface 105, or is engaged in some other activity indicative of inactivity. The avatar 135 or 115 can also progress through a series of animations during the period of sender inactivity. The series of animations can repeat continuously or can be played only once, in response to the detection of the idle period. In one embodiment, the sender avatar 135 can be animated to give the appearance that the avatar is sleeping and, after a period of sleeping, to give the appearance that the avatar has disconnected from the instant messaging user interface 105. Animating the avatar 135 or 115 through a progression of animations that reflect the duration of the sender's inactivity can entertain the sender. This can lead to increased use of the instant messaging user interface 105 by the sender, which in turn can lead to increased market share for the instant messaging service provider.
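  The idle-animation progression described above, where the avatar first sleeps and later appears to disconnect, can be sketched as a sequence stepped through as idle time accumulates. The animation names, step length, and looping option are illustrative assumptions, not details from the patent.

```python
# Hypothetical idle progression: one animation per elapsed "step" of
# inactivity, either stopping at the last animation or looping.
IDLE_SEQUENCE = ["yawn", "sleep", "sign_off"]

def idle_animation(elapsed_idle_seconds, step_seconds=60, repeat=False):
    """Pick the idle animation for the current idle duration."""
    step = int(elapsed_idle_seconds // step_seconds)
    if repeat:
        # Series of animations repeated continuously.
        return IDLE_SEQUENCE[step % len(IDLE_SEQUENCE)]
    # Series played once: hold the final animation.
    return IDLE_SEQUENCE[min(step, len(IDLE_SEQUENCE) - 1)]
```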

  The sender avatar 135 or the recipient avatar 115 can be animated to reflect the weather at the sender's and the recipient's geographic locations, respectively. For example, if it is raining at the sender's geographic location, the sender avatar 135 can be animated to put on a raincoat or open an umbrella. The wallpaper corresponding to the sender avatar 135 can also include raindrops animated to appear to fall on the sender avatar 135. The weather-related animation of the sender avatar 135 or the recipient avatar 115 can be triggered by weather information received by the sender's computer or the recipient's computer, respectively. For example, the weather information can be pushed to the sender's computer by a host system of the instant messaging system being used. If the pushed weather information indicates rain, an animation of the sender avatar 135 corresponding to rainy weather is played.

  In addition, an avatar can be used to audibly verbalize content other than the text communicated between the parties to the communication activity. For example, if the text "Hey" appears in a message sent by the sender, the sender avatar 135 can be animated in response to audibly say "Hello". As another example, when the text "otp" or "on the phone" appears in a message sent by the recipient, the recipient avatar 115 can be animated in response to audibly say "be right back". As another example, in response to an idle state, an avatar can audibly try to get the attention of the sender or the recipient. For example, if the recipient sent a message containing a question mark to the sender and the sender has remained idle, the recipient avatar 115 can audibly say "Are you there?" to try to elicit a response from the sender to the recipient's question.

  The sender can mute the recipient avatar 115 or the sender avatar 135 to prevent the avatar from speaking further. For example, the sender may prefer to mute the recipient avatar 115 so that the recipient avatar 115 does not speak. In one embodiment, a muted avatar can appear to be wearing a gag to indicate that the avatar has been muted.

  The avatar's voice can correspond to the voice of the user associated with the avatar. To accomplish this, characteristics of the user's voice can be extracted from acoustic samples of the user's voice. The extracted characteristics and the acoustic samples can then be used to create the avatar's voice. Additionally or alternatively, the avatar's voice need not correspond to the user's voice and can be any generated or recorded voice.

  The sender avatar 135 can be used to communicate aspects of the sender's setting or environment. For example, the animation and appearance of the sender avatar 135 can reflect aspects of the time, date, or place of the sender, or aspects of the sender's environment, activity, or status. For example, when the sender uses the instant messaging user interface 105 at night, the sender avatar 135 can appear wearing pajamas, and/or a light can periodically appear to illuminate darker areas of the screen. When the sender uses the instant messaging user interface 105 during a holiday period, the sender avatar 135 can be dressed in a manner indicative of the holiday, such as appearing as Santa Claus in December, as a pumpkin near Halloween, or as Uncle Sam in early July. The appearance of the sender avatar 135 can also reflect the climate or geographic location of the sender. For example, when it is raining at the sender's location, the wallpaper corresponding to the sender avatar 135 can include falling raindrops, and/or the sender avatar 135 can wear a rain hat or appear under an opened umbrella. In another example, the sender avatar 135 can appear in beach attire when the sender is sending instant messages from a tropical location.

  While the sender uses the instant messaging user interface 105, the sender avatar 135 can also communicate an activity being performed by the sender. For example, when the sender is listening to music, the avatar 135 can appear to be wearing headphones. When the sender is working, the sender avatar 135 can be dressed for business, such as appearing in a suit and tie.

  The appearance of the sender avatar 135 can also communicate the mood or emotional state of the sender. For example, the sender avatar 135 can communicate a sad state of the sender by frowning or shedding tears. The appearance of the sender avatar 135 or the recipient avatar 115 can also resemble the sender or the recipient, respectively. For example, the appearance of the sender avatar 135 can be such that the sender avatar 135 appears to be of the same age as the sender. In one embodiment, as the sender ages, the sender avatar 135 can also appear to age. As another example, the appearance of the recipient avatar 115 can be such that the recipient avatar 115 has an appearance similar to that of the recipient.

  In some implementations, the wallpaper applied to the window portion 120 and/or the wallpaper applied to the window portion 140 can include one or more animated objects. An animated object can repeat a series of animations continuously or periodically, on a predetermined or random basis. Additionally or alternatively, the wallpaper applied to the window portions 120 and 140 can be animated in response to the text of messages sent between the sender and the recipient. For example, the text of an instant message sent by the sender can trigger an animation of an animated object included in the wallpaper corresponding to the sender avatar 135, and the text of an instant message sent to the sender by the instant message recipient can trigger an animation of an animated object included in the wallpaper corresponding to the recipient avatar 115. The animated objects included in the wallpaper can be animated to reflect the setting or environment, the activity, and the mood of the sender and the recipient, respectively.

  An avatar can be used as a mechanism to enable self-expression or additional non-textual communication by the user associated with the avatar. For example, the sender avatar 135 is a projection of the sender, and the recipient avatar 115 is a projection of the recipient. An avatar represents the user in instant messaging activities involving that user. The personality or emotional state of the sender can be projected or otherwise communicated through the personality of the avatar. Some users may prefer to use an avatar that more accurately represents the user. To this end, a user can change the appearance and behavior of an avatar to more accurately reflect the user's personality. In some cases, the sender may prefer to use an avatar for self-expression rather than projecting an actual image of the sender. For example, some people may prefer using an avatar to sending a video or photograph of the sender.

  Referring to FIG. 2, the animation of an avatar can involve changing the size or position of the avatar so that the avatar occupies space on the instant messaging user interface 105 that is larger than, or different from, the avatar's original boundary. In the illustration of FIG. 2, the size of the sender avatar 205 has been increased so that the avatar 205 covers the message in the instant message composition area 145 and a portion of the controls 155. In addition, elements of the user interface 100 other than an avatar can be displayed using additional space or different space on the user interface 100. For example, the sender avatar can depict a starfish with an expressive face and can be displayed over wallpaper that includes animated fish. The animated fish included in the wallpaper can be drawn outside the original boundary around the sender avatar 135 and can appear to swim outside the original boundary area.

  Referring to FIG. 3, a process 300 is shown for animating an avatar for self-expression based on the content of an instant message. In particular, an avatar representing the instant message sender is animated in response to text sent by the sender. The avatar's wallpaper is also animated. The process 300 is performed by a processor executing an instant messaging communication program. In general, the text of a message sent to an instant message recipient is searched for an animation activation trigger and, when an activation trigger is found, the avatar representing the instant message sender is animated in a manner based on the particular trigger found. The wallpaper displayed for the avatar includes animated objects. An object can be animated based on the content of the instant message sent, or based on other triggers, including (but not limited to) the passage of a predetermined length of time, the occurrence of a particular date or time, any animation of the sender avatar, a particular type of animation of the sender avatar, any animation of the recipient avatar, or a particular type of animation of the recipient avatar. Similarly, when the sender has been idle for a predetermined duration, the avatar sequentially displays each of multiple animations associated with the idle state.

  The process 300 begins when the instant message sender associated with the avatar starts an instant messaging communication activity with an instant message recipient (step 305). To do so, the sender can select the name of the recipient from a buddy list, such as the buddy list 170 of FIG. 1. Alternatively, the name of the recipient can be entered into a form that enables instant messages to be specified and sent. As another alternative, the sender can start an instant messaging application that can be used to sign on for access to the instant messaging system and that designates the recipient as a user of the instant messaging system with whom a communication activity is to begin. Once the recipient has been specified in this manner, a determination is made as to whether copies of the avatars associated with the sender and the recipient exist on the instant messaging client system being used by the sender. If not, a copy of the avatar is retrieved for use during the instant messaging communication activity. For example, information for rendering the recipient's avatar can be retrieved from the instant message host system or from the instant message recipient's client. In some cases, a particular avatar can be selected by the sender for use during the instant messaging communication activity. Alternatively or additionally, the avatar may have been previously identified and associated with the sender.

  The processor displays a user interface for the instant messaging activity, including the avatar associated with the sender and wallpaper that is applied to the user interface and over which the avatar is displayed (step 307). The avatar can be displayed, for example, over wallpaper applied to a portion of the window in which the instant messaging interface is displayed. In another embodiment, the avatar is displayed over a portion or portions of the instant messaging interface, such as the window portions 120 or 140 of FIG. 1. In the example of FIG. 3, the wallpaper corresponding to the avatar can include objects that are animated during the instant messaging communication activity.

  The processor receives the text of a message entered by the sender to be sent to the instant message recipient (step 310) and sends a message corresponding to the entered text to the recipient (step 315). The processor compares the message text against multiple animation activation triggers associated with the avatar projected by the sender (step 320). An activation trigger can include any letter, number, or symbol that can be typed or otherwise entered using a keyboard or keypad. Multiple activation triggers can be associated with a single animation.

  Referring to FIG. 4, examples 400 of activation triggers associated with animations 405a-405q of a particular avatar model are shown. Each of the animations 405a through 405q has multiple associated activation triggers 410a through 410q. More specifically, for example, the animation 405a in which the avatar smiles is associated with the activation triggers 410a. Each of the activation triggers 410a includes multiple character strings. In particular, these include the activation trigger ":)" 410a, the activation trigger ":-)" 411a, the activation trigger "0:-)" 412a, the activation trigger "0:)" 413a, the activation trigger 414a, and the activation trigger "Nice" 415a. As shown, an activation trigger can be an English word, such as 415a, or an emoticon, such as 411a through 414a. Other examples of activation triggers include abbreviations, such as "lol" 411n, and English phrases, such as "Oh, no" 415e. As discussed previously, when one of the activation triggers is included in an instant message, the avatar is animated using the animation associated with that activation trigger. In one example, when "Nice" is included in an instant message, the avatar smiles. In one implementation, one or more of the activation triggers associated with an animation can be modified by the user. For example, the user can associate a new activation trigger with an animation, such as by adding "happy" to the activation triggers 410a to make the avatar smile. In another example, the user can delete an activation trigger associated with an animation (that is, disassociate the activation trigger from the animation), such as by deleting "Nice" 415a. In yet another example, the user can change an activation trigger associated with an animation, such as by changing the "wink" activation trigger 413b to "winks".
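  The user-editable trigger table of FIG. 4 can be modeled as a mapping from trigger strings to animations, with operations for the add, delete, and change cases described above. This is a hedged sketch; only the concrete strings ":)"/":-)", "Nice", "happy", "wink"/"winks", and "lol" come from the text, and the mapping layout is an assumption.

```python
# Trigger strings map to animation names; several triggers can share
# one animation, mirroring FIG. 4.
triggers = {
    ":)": "smile", ":-)": "smile", "Nice": "smile",
    "lol": "laugh",
    "wink": "wink",
}

def add_trigger(table, trigger, animation):
    """Associate a new activation trigger with an animation."""
    table[trigger] = animation

def remove_trigger(table, trigger):
    """Disassociate an activation trigger from its animation."""
    table.pop(trigger, None)

def change_trigger(table, old, new):
    """Rename an activation trigger, keeping its animation."""
    if old in table:
        table[new] = table.pop(old)
```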

  In some implementations, a particular activation trigger can be associated with only one animation. In other implementations, a particular activation trigger can be permitted to be associated with multiple animations. In some implementations, only one of the multiple animations can be played in response to a particular activation trigger. The single animation to be played can be selected from the multiple animations randomly or in a predetermined manner. In other implementations, all of the multiple animations can be played sequentially based on a single activation trigger. In some implementations, the user can be permitted to delete a particular animation. For example, the user can delete the yell animation 405g. In such a case, the user can delete some or all of the activation triggers associated with the yell animation 405g, or can choose to associate some or all of the activation triggers 410g with a different animation, such as the smile animation 405a.

  Referring again to FIG. 3, the processor determines whether an activation trigger is included in the message (step 325). If the message includes an activation trigger (step 325), the processor identifies a type of animation associated with the identified activation trigger (step 330). This can be accomplished by using a database table, list, or file that associates one or more activation triggers with a type of animation for the avatar, in order to identify the particular type of animation to play. Types of animation include, for example, a smile 405a, a wink 405b, a frown 405c, an expression with a tongue out 405d, a shocked expression 405e, a kiss 405f, a yell 405g, a big smile 405h, a sleeping expression 405i, a nodding expression 405j, a sigh 405k, a sad expression 405l, a calm expression 405m, a laugh 405n, a discouraged expression 405o, a smell 405p, or a negative expression 405q, all of FIG. 4. The identified type of animation for the avatar is played (step 335).

  Optionally, the processor can identify and play an animation of at least one wallpaper object based on a match between an activation trigger and the text of the sent message (step 337).

  The processor monitors the sender's communication activity for inactivity (step 340) and detects when the sender is idle with respect to the communication activity (step 345). The sender can enter an idle state after a period in which no messages have been sent. To detect the idle state, the processor can determine whether the sender has typed, sent an instant message, or otherwise interacted with the instant messaging application within a predetermined length of time. Alternatively, the idle state can be detected by the processor when the sender has not used the computer system on which the instant messaging application operates for a predetermined amount of time.
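  The idle detection of steps 340-345 amounts to tracking the time of the sender's last activity and comparing it against a configurable threshold. The sketch below is an illustrative assumption (class name, float timestamps, and default threshold are all hypothetical).

```python
class IdleMonitor:
    """Detect the idle state: no keystroke or sent message for a threshold."""

    def __init__(self, threshold_seconds=300.0):
        self.threshold = threshold_seconds
        self.last_activity = 0.0

    def record_activity(self, now):
        # Call whenever the sender types, sends a message, or otherwise
        # interacts with the instant messaging application.
        self.last_activity = now

    def is_idle(self, now):
        # Idle once the predetermined length of time has elapsed.
        return (now - self.last_activity) >= self.threshold
```

In a real client, `now` would come from a monotonic clock, and `is_idle` would be polled by the animation loop to decide when to start the idle animations.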

  When the processor detects inactivity (which can be referred to as an idle state), a type of animation associated with the idle state is identified (step 350). This can be accomplished by using a database table, list, or file that identifies one or more types of animation to play during a detected idle period. The type of animation played during a detected idle state can be the same as, or different from, the type of animation played based on an activation trigger in an instant message. The identified type of animation is played (step 355). In one implementation, multiple types of animation associated with the idle state can be identified and played. When the processor detects that the sender is no longer idle, such as by receiving input from the sender, the processor can immediately stop playing the animation (not shown). In some implementations, the user can select the types of animation to be played during the idle period and/or, when multiple animations are played during the idle state, the order in which they are played. The user can configure the duration during which no messages are sent that constitutes the idle state, or the duration can be otherwise determined.

  In some implementations, the processor can detect a wallpaper object activation trigger that is different from the activation trigger used to animate the sender avatar (step 360). For example, the processor can detect the passage of a predetermined length of time. In another example, the processor can detect that the content of an instant message includes an activation trigger for a wallpaper object animation that differs from the activation trigger used to animate the sender avatar. Other wallpaper object activation triggers can include (but are not limited to) the occurrence of a particular day or time, the presence of any animation of the sender avatar, the presence of a particular type of animation of the sender avatar, the presence of any animation of the recipient avatar, and/or the presence of a particular type of animation of the recipient avatar. Whether a particular type of wallpaper object animation, or any animation at all, is played can also be configurable by the user, and the user can select the activation triggers for one or more of the wallpaper objects. An activation trigger for a wallpaper object, or for a particular type of animation of an object, may be the same as, or different from, one of the activation triggers for the avatar animations.
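  The several kinds of wallpaper object triggers just listed (elapsed time, a particular date, an avatar animation event) can be dispatched from a single event handler, as in this hedged sketch. The event shapes, thresholds, and animation names are hypothetical; only the trigger categories come from the text.

```python
def wallpaper_trigger(event):
    """Map a detected event to a wallpaper object animation, or None.

    Events are dicts with a 'type' field; the field names and the
    resulting animation names are illustrative assumptions.
    """
    if event.get("type") == "elapsed" and event.get("seconds", 0) >= 30:
        return "ripple"          # passage of a predetermined length of time
    if event.get("type") == "date" and event.get("date") == "12-25":
        return "snowfall"        # occurrence of a particular day
    if event.get("type") == "avatar_animation" and event.get("name") == "cry":
        return "rain"            # a particular type of avatar animation
    return None
```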

  When the processor detects a wallpaper object activation trigger (step 360), the processor identifies and plays an animation of at least one wallpaper object (step 337).

  The steps of identifying and playing a type of animation based on a sent instant message (steps 310 through 335) are performed for each instant message sent and for each instant message received by the processor. The steps of identifying and playing a type of animation during a period of inactivity (steps 340 through 355) can occur multiple times during an instant messaging communication activity. Steps 310 through 355 can be repeated indefinitely until the instant messaging communication activity ends.

  The steps of identifying and playing the types of animation played based on a sent instant message, or during the sender's period of inactivity (steps 320 through 355), are also performed by the processor of the instant messaging application that received the message corresponding to the sent instant message. In this way, the animation of the sender avatar can be viewed by both the sender and the recipient of the instant message. Thus, the animation of the avatar conveys information from the sender to the recipient that is not directly contained in the instant message.

  Referring to FIG. 5, an instant messaging interface 500 can be used by a sender of a speech-based instant messaging system to send and receive instant messages. In a speech-based instant messaging system, instant messages are heard rather than read by the user. An instant message can be a recording of a user of the speech-based instant messaging system, or an instant message can include text that is converted to audible speech using a text-to-speech engine. The recorded or synthesized speech is played for the user. The speech-based instant messaging interface 500 can display an avatar 505 corresponding to the user of the instant messaging system from whom the speech-based instant message was received. The avatar 505 can be automatically animated in response to a received instant message so that the avatar 505 appears to speak the content of the instant message. The recipient can view the animation of the avatar 505 and gather information that is not directly or explicitly conveyed in the instant message. Depending on the animation played, the recipient may be able to determine, for example, the mood of the sender or whether the sender is being serious or joking.

  More specifically, a speech message can be processed in the same or a similar manner as a text instant message is processed with respect to the animation process 300 of FIG. 3. In such cases, types of animation are triggered by audio activation triggers included in an instant message.

  In some implementations, the avatar 505 appears to speak the instant message. For example, the avatar 505 can include animations of mouth movements corresponding to the phonemes of human speech to increase the realism of the speech animation. When an instant message includes text, a text-to-speech process generates the speech to be spoken by the avatar 505, animations corresponding to the phonemes in the text are generated, and a lip-synchronization process can be used to synchronize the playing of the audio with the lip animation so that each phoneme is heard at the same time that the corresponding animation of the mouth of the avatar 505 is seen. When an instant message includes a recording, animations corresponding to the phonemes in the recording can be generated, and lip synchronization can be used to synchronize the playing of the recording with the lip animation.
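  The phoneme-to-mouth-animation step described above is commonly implemented by mapping phonemes to a small set of mouth shapes (often called visemes). This rough sketch is an assumption for illustration; the patent does not specify a viseme table, and the phoneme labels here follow a simplified ARPAbet-like convention.

```python
# Simplified phoneme-to-viseme table: several phonemes share one
# mouth shape. Unknown phonemes fall back to a neutral mouth.
VISEMES = {
    "AA": "open", "IY": "wide", "UW": "round",
    "M": "closed", "B": "closed", "P": "closed",
    "F": "teeth_on_lip", "V": "teeth_on_lip",
}

def mouth_shapes(phonemes):
    """Return the viseme sequence to display for a phoneme sequence."""
    return [VISEMES.get(p, "neutral") for p in phonemes]
```

A lip-synchronization process would then schedule each shape in this sequence at the timestamp of its phoneme in the generated or recorded audio.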

  In another implementation, the sender can record an audio portion to be associated with one or more animations of the avatar 505. The recording can then be played when the corresponding animation of the avatar 505 is played.

  FIG. 6 illustrates an example process 600 for communicating between instant messaging clients 602a and 602b, through an instant message host system 604, to animate one avatar in response to an animation played for a different avatar. Each of the users of client 602a and client 602b is associated with an avatar that represents and projects that user in the instant messaging activity. Communication between the clients 602a and 602b is facilitated by the instant message host system 604. In general, the communication process 600 enables the first client 602a and the second client 602b to send communications to, and receive communications from, each other. The communications are sent through the instant message host system 604. Some or all of the communications can trigger an animation of the avatar associated with the user of the first client 602a and an animation of the avatar associated with the user of the second client 602b.

  An instant messaging communication activity is established between the first client 602a and the second client 602b, with communications transmitted through the instant message server host system 604 (step 606). The communication activity involves a first avatar that represents the user of the first client 602a and a second avatar that represents the user of the second client 602b. This can be accomplished, for example, as described previously with respect to step 305 of FIG. 3. In general, the user of the first client 602a and the user of the second client 602b can each use a user interface similar to the user interface 100 of FIG. 1, in which the sender and recipient avatars are displayed on the first client 602a and the second client 602b.

  During the instant messaging communication activity, the user associated with the first client 602a enters the text of an instant message to be sent to the user of the second client 602b, and the text is received by the processor on the client 602a that is executing the instant messaging communication application (step 608). The entered text can include an activation trigger for one of the animations of the first avatar model. The processor executing the instant messaging application transmits the entered text to the second client 602b in an instant message, by way of the host system 604 (step 610). Specifically, the host system 604 receives the message from the first client 602a and forwards the message to the second client 602b (step 612). The message is then received by the second client 602b (step 614). Upon receiving the message, the second client 602b displays the message in a user interface in which messages from the user of the first client 602a are displayed. The user interface can be similar to the instant messaging user interface 105 of FIG. 1, in which avatars corresponding to the sender and the recipient are displayed.

   Both the first client 602a and the second client 602b have a copy of the message, and both begin processing the text of the message to determine whether the message text triggers any animations in their respective copies of the first and second avatar models. When processing the message, the first client 602a and the second client 602b may process the message substantially simultaneously or sequentially, but both process the message in the same way.

   Specifically, the first client 602a searches the text of the message for an animation trigger to identify a type of animation to play (step 616a). The first client 602a then identifies an animation of the identified type for the first avatar associated with the user of the first client 602a (step 618a). The first client 602a plays the identified animation for the first avatar associated with the user of the first client 602a (step 620a). The first avatar model is used to identify the animation to be played because the first avatar model is associated with the first client 602a that sent the message. The first client 602a and the second client 602b use the same copy of the first avatar model to process the message, so the same animation event occurs on both the first client 602a and the second client 602b.
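
   The trigger-matching of steps 616a through 620a can be sketched as follows. This is an illustrative assumption, not the actual implementation; the class name, trigger strings, and animation identifiers are all hypothetical.

```python
# Hypothetical sketch of steps 616a-620a: scan the message text for an
# animation trigger and look up a matching animation in the sender's
# avatar model. All names and sample data below are assumptions.

class AvatarModel:
    def __init__(self, triggers, animations):
        self.triggers = triggers      # text trigger -> animation type
        self.animations = animations  # animation type -> animation id

    def find_animation_type(self, message_text):
        """Return the animation type for the first trigger found, else None."""
        text = message_text.lower()
        for trigger, animation_type in self.triggers.items():
            if trigger in text:
                return animation_type
        return None

    def animation_for(self, animation_type):
        """Return the playable animation of the given type, if any."""
        return self.animations.get(animation_type)

# Both clients hold the same copy of the sender's model, so both resolve
# the same animation for the same message, which is the consistency the
# description relies on.
sender_model = AvatarModel(
    triggers={"lol": "laugh", ":-(": "frown"},
    animations={"laugh": "laugh_clip_01", "frown": "frown_clip_01"},
)
```

   With the sample model above, a message containing "LOL" resolves to the "laugh" type and its associated animation on both clients.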

   An animation of the first avatar model may trigger an animation of the second avatar model. To do so, the first client 602a identifies, based on the identified type of animation played for the first avatar in response to the text trigger, a type of animation to be played for the second avatar associated with the user of the second client 602b (step 622a). The first client 602a plays the identified type of animation for the second avatar (step 624a).
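
   The mapping of step 622a, from the animation just played for the sender's avatar to a responsive animation for the recipient's avatar, might look like the following sketch; the mapping table and type names are invented for illustration.

```python
# Hypothetical sketch of steps 622a-624a: given the type of animation just
# played for the first (sender's) avatar, choose a responsive animation
# type for the second (recipient's) avatar. The table is an assumption.

RESPONSE_ANIMATIONS = {
    # sender-avatar animation type -> recipient-avatar animation type
    "laugh": "smile",
    "frown": "console",
    "wave": "wave_back",
}

def responsive_animation_type(sender_animation_type):
    """Return the animation type to play for the second avatar, or None."""
    return RESPONSE_ANIMATIONS.get(sender_animation_type)
```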

   The first client 602a also may identify a type of animation to be played for the wallpaper corresponding to the first avatar and may play the identified wallpaper animation for the first avatar (step 626a). The wallpaper of the avatar may include objects that are animated during the instant messaging session. Animation of an object may be produced, for example, based on a trigger in an instant message or based on the passage of a predetermined amount of time. For the animation of wallpaper objects, the user may configure whether a particular type of animation, or any animation at all, is played, and the user may select the triggers for one or more of the wallpaper objects. A trigger for a type of animation of a wallpaper object may be the same as, or different from, one of the triggers associated with animation of the avatar. After the message has been sent and processed, the user of the first client 602a may not send any additional messages for a period of time. The first client 602a detects such a period of sending inactivity (step 628a). The first client 602a identifies and plays an animation of a type associated with the detected period of inactivity (step 630a). This may be accomplished by using a database table, list, or file that identifies one or more types of animation to play during a detected idle period.
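
   The idle-period handling of steps 628a and 630a can be sketched like this; the threshold values and animation names are hypothetical, standing in for the database table, list, or file mentioned above.

```python
# Hypothetical sketch of steps 628a-630a: detect that no message has been
# sent for some time and pick an idle animation from a lookup table.
# Thresholds and animation names are assumptions for illustration.

IDLE_ANIMATIONS = [
    # (minimum idle seconds, animation type), checked longest-first
    (120.0, "sleep"),
    (60.0, "yawn"),
    (30.0, "look_around"),
]

def idle_animation(seconds_since_last_message):
    """Return the idle animation type to play, or None if not idle yet."""
    for minimum_idle, animation_type in IDLE_ANIMATIONS:
        if seconds_since_last_message >= minimum_idle:
            return animation_type
    return None
```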

   The second client 602b processes the instant message in the same manner as the first client 602a. Specifically, the second client 602b processes the message using steps 616b through 630b, each of which is substantially the same as the corresponding message-processing step 616a through 630a performed by the first client 602a. Because each of the first client 602a and the second client 602b has a copy of the avatars corresponding to the users of the first client 602a and the second client 602b, the same animations that were played on the first client 602a as a result of executing steps 616a through 630a are played on the second client 602b as a result of executing the similar steps 616b through 630b.

   In the communication process 600, a text-based message indicates the types of animations that occur. However, messages having different types of content also may trigger animations of the avatars. For example, characteristics of an audio signal included in an audio-based message may trigger an animation of an avatar.

   Referring to FIG. 7, a process 700 is used to select, and optionally customize, an avatar for use with an instant messaging system. An avatar may be customized to reflect the personality of the user, or other aspects of self-expression, to be associated with the avatar. The process 700 begins when the user selects an avatar from multiple available avatars and the selection is received by the processor executing the process 700 (step 705). For example, the user may select a particular avatar from among multiple avatars, such as one of the avatars shown in FIG. 8. Each of the avatars 805a through 805r is associated with an avatar model that specifies the appearance of the avatar. Each of the avatars 805a through 805r also includes multiple associated animations, each animation being identified as being of a particular animation type. The selection may be accomplished, for example, when the user selects one avatar from a group of displayed avatars. The display of the avatars may show multiple avatars in a window, such as by showing a small representation of each avatar (which may be referred to as a "thumbnail" in some implementations). Additionally or alternatively, the display may be a list of avatar names from which the user selects.

   FIG. 8 shows multiple avatars 805a through 805r. Each of the avatars 805a through 805r is associated with an appearance, a name, and a personality description. In one example, the avatar 805a has an appearance 810a, a name 810b, and a personality description 810c. The appearance of an avatar may represent, by way of example, living, fictional, or historical people, sea creatures, amphibians, reptiles, mammals, birds, or animated objects. Some avatars may be represented only with a head, as are the avatars 805a through 805r. In one example, the appearance of the avatar 805b includes the head of a sheep. The appearance of other avatars may include only a portion or a specific part of a head. For example, the appearance of the avatar 805l resembles a set of lips. Other avatars may be represented by a body in addition to a head. For example, the appearance of the avatar 805n includes a full body in addition to a head. An avatar may be displayed over wallpaper that is related in subject matter to the avatar. In one example, the avatar 805i is displayed over wallpaper that is indicative of the swamp in which the avatar 805i lives.

   Each of the avatars 805a through 805r has a base state expression. For example, the avatar 805f appears happy, the avatar 805j appears sad, and the avatar 805m appears angry. Avatars may have other base state expressions, such as scared or bored. The base state expression of an avatar may influence the behavior of the avatar, including the animations and the sounds of the avatar. In one example, the avatar 805f has a happy base state expression and consequently has generally happy behaviors, whereas the avatar 805m has an angry base state expression and consequently has generally angry behaviors. In another example, a happy avatar may have cheerful sounds, while an angry avatar may appear to scream when making a sound. The base state expression of an avatar may change as a result of the activities of the user associated with the avatar. By way of example, the degree of happiness expressed by an avatar may be related to the number of messages sent or received by the user. When the user sends or receives many messages in a predetermined period of time, the avatar may appear happier than when the user sends or receives fewer messages in the predetermined period.
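
   The message-count rule described above could be sketched as follows; the window, the thresholds, and the mood labels are invented assumptions.

```python
# Hypothetical sketch: derive the avatar's base state expression from the
# number of messages sent or received within a fixed time window. The
# thresholds and mood names are assumptions for illustration only.

def base_mood(messages_in_window):
    """Map message volume in the window to a base state expression."""
    if messages_in_window >= 20:
        return "happy"
    if messages_in_window >= 5:
        return "neutral"
    return "sad"
```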

   A user of the instant messaging system may select one of the multiple avatars 805a through 805r. The avatars 805a through 805r are associated with appearances, characteristics, and behaviors that express a particular type of personality. For example, the avatar 805f, which has the appearance characteristics of a dolphin, may be selected.

   Each of the avatars 805a through 805r is a multi-dimensional character with depth of personality, voice, and visual attributes. In contrast to representing a single aspect of a user through the use of an unanimated, two-dimensional graphical icon, an avatar of the avatars 805a through 805r is capable of indicating a rich variety of information about the user projecting the avatar. Properties of the avatar enable the communication of physical, emotional, and other types of context information about the user that are not well-suited (or even available) for representation through the use of unanimated, two-dimensional icons. In one example, an avatar may reflect the user's mood, emotions, and personality. In another example, an avatar may reflect the location, activities, and other context of the user. These characteristics of the user may be communicated through the appearance, the visual animations, and the audible sounds of the avatar.

   In one example of an avatar personality, an avatar named SoccerBuddy (not shown) is associated with an energetic personality. In fact, the personality of the SoccerBuddy avatar may be described as energetic, dashing, self-confident, passionate, and youthful. The behaviors of the SoccerBuddy avatar reflect events in soccer matches. For example, the avatar's yelling animation is an "Olé, Olé, Olé" chant, the avatar's big-smile animation is a "Goooal" celebration, and during a frown animation or a tongue-out animation, the avatar shows a yellow card. Using wallpaper, the SoccerBuddy may be customized to represent a particular team. Special features of the SoccerBuddy avatar include spiked feet that serve as the base of the avatar. In general, the feet act as the base for the avatar. The SoccerBuddy avatar may appear to move around by bouncing on its feet. In a few animations, such as when the avatar goes away, the avatar's feet may grow and detach from the SoccerBuddy. The feet may be animated to kick a soccer ball while displayed.

   In another example, a silent movie avatar is reminiscent of silent film actors of the 1920s and 1930s. The silent movie avatar is depicted with a bowler hat and a handlebar mustache. The silent movie avatar is not associated with sound. Instead of speaking, the silent movie avatar displays a placard bearing text, in the same way that dialogue was conveyed in a silent movie.

   In another example, an avatar may be appropriate for current events or a season. In one example, an avatar may represent a team, or a player on a team, involved in professional or amateur sports. An avatar may represent a football team, a baseball team, or a basketball team, or a particular player on a team. In one example, the teams engaged in a particular championship series may be represented. Examples of seasonal avatars include a Santa Claus avatar, an Uncle Sam avatar, a Thanksgiving turkey avatar, a jack-o'-lantern avatar, a Valentine's Day heart avatar, an Easter egg avatar, and an Easter bunny avatar.

   The animation triggers of the avatar may be modified to customize when various types of animations associated with the avatar are to occur (step 710). For example, the user may modify the triggers shown in FIG. 4 to indicate when the avatar is to be animated, as described above with respect to FIG. 4. The triggers may be augmented to include frequently used words, phrases, or character strings. The triggers also may be modified so that the animations played as a result of the triggers reflect the personality of the avatar. Modifying the triggers may help to define the personality expressed by the avatar and to use the avatar for the user's self-expression.

   The user also may configure the appearance of the avatar (step 715). This also may help to define the personality of the avatar and to communicate self-expressive aspects of the sender. For example, referring also to FIG. 9, an appearance modification user interface 900 may be used to configure the appearance of an avatar. In the example of FIG. 9, the appearance modification user interface 900 enables the user to modify multiple characteristics of the head of an avatar. For example, the hair, eyes, nose, lips, and skin tone of the avatar may be configured with the appearance modification user interface 900. For example, a hair slider 905 may be used to modify the length of the avatar's hair. The various positions of the hair slider 905 represent different possible lengths of hair for the avatar, corresponding to different representations of the avatar's hair included in the avatar model file for the avatar being configured. An eyes slider 910 may be used to modify the color of the avatar's eyes, with each position of the eyes slider 910 representing a different possible color of the eyes, and each color being represented in the avatar model file. A nose slider 915 may be used to modify the appearance of the avatar's nose, with each position of the nose slider 915 representing a different possible appearance of the nose, and each possible appearance being represented in the avatar model file. In a similar manner, a lips slider 920 may be used to modify the appearance of the avatar's lips, with each position of the lips slider 920 representing a different possible appearance of the lips, and each different appearance of the lips being represented in the avatar model file. The skin tone of the avatar also may be modified with a skin tone slider 925. Each possible position of the skin tone slider 925 represents a possible skin tone for the avatar, with each possible skin tone being represented in the avatar model file.
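
   The slider-to-model-file relationship described above can be sketched as follows; the feature names and the variant lists are hypothetical stand-ins for the representations stored in an avatar model file.

```python
# Hypothetical sketch: each slider position indexes one of the variants
# stored in the avatar model file for that feature. The feature names and
# variants are assumptions, not the actual model-file format.

AVATAR_MODEL_FILE = {
    "hair_length": ["bald", "short", "medium", "long"],
    "eye_color": ["brown", "green", "blue"],
    "skin_tone": ["light", "medium", "dark"],
}

def apply_slider(feature, position):
    """Return the model-file variant selected by a slider position."""
    variants = AVATAR_MODEL_FILE[feature]
    if not 0 <= position < len(variants):
        raise ValueError("slider position out of range")
    return variants[position]
```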

   The appearance of the avatar that results from using the sliders 905 through 925 may be previewed in an avatar viewer 930. The values selected with the sliders 905 through 925 are reflected in the avatar shown in the avatar viewer 930. In one implementation, the avatar viewer 930 may be updated as each of the sliders 905 through 925 is moved, so that changes made to the avatar's appearance are immediately visible. In another implementation, the avatar viewer 930 may be updated once after all of the sliders 905 through 925 have been used.

   A rotation slider 935 enables rotation of the avatar shown in the avatar viewer 930. For example, the avatar may be rotated about an axis by a number of degrees selected on the rotation slider 935, relative to an unrotated orientation of the avatar. In one implementation, the axis extends vertically through the center of the avatar's head, and the unrotated orientation of the avatar is the orientation in which the avatar is directly facing forward. Rotating the avatar's head with the rotation slider 935 enables viewing of all sides of the avatar, to see the changes to the avatar's appearance made with the sliders 905 through 925. The avatar viewer 930 may be updated as the rotation slider 935 is moved, so that changes in the orientation of the avatar are immediately visible.

   The appearance modification user interface 900 also includes a hair tool button 940, a skin tool button 945, and a props tool button 950. Selecting the hair tool button 940 displays a tool for modifying various characteristics of the avatar's hair. For example, the tool displayed as a result of selecting the hair tool button 940 may enable changes to, for example, the length, color, cut, and style of the avatar's hair. In one implementation, the changes made to the avatar's hair with the tool displayed as a result of selecting the hair tool button 940 are reflected in the display of the avatar in the avatar viewer 930.

   Similarly, selecting the skin tool button 945 displays a tool for modifying various aspects of the avatar's skin. For example, the tool displayed as a result of selecting the skin tool button 945 may enable, for example, changing the color of the avatar's skin, giving the avatar a tan, giving the avatar a tattoo, or weathering the avatar's skin to give the avatar an aged appearance. In one implementation, the changes made to the avatar's skin with the tool displayed as a result of selecting the skin tool button 945 are reflected in the display of the avatar in the avatar viewer 930.

   In a similar manner, selecting the props tool button 950 displays a tool for associating one or more props with the avatar. For example, through the use of the props tool, the avatar may be given eyeglasses, earrings, a hat, or another object that may be worn by, or displayed on or near, the avatar. In one implementation, the props given to the avatar with the tool displayed as a result of selecting the props tool button 950 are shown in the display of the avatar in the avatar viewer 930. In some implementations, all of the props that may be associated with the avatar are included in the avatar model file, and the model file controls whether each of the props is made visible when the avatar is displayed. In some implementations, props may be created and applied using two-dimensional animation techniques, with the application of a prop synchronized with the animations of the three-dimensional avatar. Props may be created after the avatar is first created and later associated with the avatar.
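
   The prop-visibility behavior described here might be sketched as follows; the class and the prop names are assumptions made for illustration.

```python
# Hypothetical sketch: every prop that can accompany the avatar already
# lives in the model file, and a per-prop visibility flag decides whether
# it is drawn with the avatar. The prop names are illustrative assumptions.

class AvatarProps:
    def __init__(self, props_in_model_file):
        # All props ship in the model file, initially hidden.
        self.visible = {prop: False for prop in props_in_model_file}

    def wear(self, prop):
        if prop not in self.visible:
            raise KeyError("prop is not in the avatar model file")
        self.visible[prop] = True

    def shown(self):
        """Names of the props currently drawn on or near the avatar."""
        return sorted(p for p, on in self.visible.items() if on)

props = AvatarProps(["eyeglasses", "earrings", "hat"])
props.wear("hat")
```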

   Once all desired changes have been made to the appearance of the avatar, the user may accept the changes by selecting a publish button 955. Selecting the publish button 955 saves the changes made to the appearance of the avatar. In addition, when copies of the avatar are held by other users of the instant messaging system, those other users are sent an updated copy of the avatar that reflects the changes made by the user to the avatar. The copies of the avatar are updated so that all copies of the avatar have the same appearance, which provides consistency among the avatars used to send and receive out-of-band communications. The appearance modification user interface 900 may be used by a user to change only the copy of the avatar that corresponds to that user. Thus, the user is prevented from making changes to other avatars corresponding to other users, changes which would otherwise be overwritten when the user is sent updated copies of the other avatars after the other users change their own avatars. Preventing a user from modifying the avatars of other users ensures that all copies of each avatar remain identical.

   The avatar shown in the avatar viewer 930 may have an appearance that does not include one or more of the hair, eyes, nose, lips, or skin tone that are modified with the sliders 905 through 925. For example, the appearance of the avatar 805l of FIG. 8 does not include hair, eyes, a nose, or a skin tone. In such a case, the appearance modification user interface 900 may omit the sliders 905 through 925 and instead include sliders that control other aspects of the appearance of the avatar. For example, the appearance modification user interface 900 may include a teeth slider when the appearance of the avatar 805l is being modified. Further, the interface 900 may be customized based on the selected avatar, to enable appropriate and relevant visual enhancements to the selected avatar.

   In another example of configuring the appearance of an avatar, a configurable facial feature of an avatar may be created using blend shapes of the animation model corresponding to the avatar. A blend shape defines a portion of the avatar that may be animated. In some implementations, a blend shape may include a mesh percentage that may be modified to cause a corresponding modification of a facial feature. In such a case, the user may be able to configure a facial feature of an avatar by using a slider, or another type of control, to modify the mesh percentage of the blend shape associated with the facial feature being configured.
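
   The mesh-percentage idea can be sketched as a linear blend between two meshes; the one-dimensional vertex lists and the 0-100 percentage range are simplifying assumptions.

```python
# Hypothetical sketch of a blend shape: a facial feature is interpolated
# between a neutral mesh and a target mesh by a "mesh percentage" that a
# slider would set. Real meshes are 3-D; 1-D vertex lists keep it simple.

def blend(neutral_vertices, target_vertices, percentage):
    """Interpolate each vertex coordinate by percentage (0 to 100)."""
    if not 0 <= percentage <= 100:
        raise ValueError("percentage must be between 0 and 100")
    t = percentage / 100.0
    return [n + (g - n) * t
            for n, g in zip(neutral_vertices, target_vertices)]
```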

   In addition to modifying the appearance of the avatar with the appearance modification user interface 900, the color, texture, and particles of the avatar may be modified. More particularly, the color or shading of the avatar may be changed. The texture applied to the avatar may be changed to age or weather the skin of the avatar. Furthermore, the width, length, and color of the avatar's particles may be customized. In one example, particles of the avatar that are used to portray hair or facial hair, such as a beard, may be modified to show hair or beard growth of the avatar.

   Referring again to FIG. 7, wallpaper over which the avatar is shown, and animations for objects in the wallpaper, may be selected (step 720). This may be accomplished, for example, by selecting wallpaper from a set of available wallpapers. The wallpaper may include animated objects, or the user may select an object, and an animation for the selected object, to be added to the selected wallpaper.

   A trading card that includes an image of the avatar and a description of the avatar may be created (step 725). In some implementations, the trading card also may include a description of the user associated with the avatar. The trading card may be shared with other users of the instant messaging system to inform the other users of the avatar associated with the user.

   Referring also to FIG. 10, one example of a trading card is depicted. The front side 1045 of the trading card shows the avatar 1046. Animations of the avatar may be played by selecting an animation control 1047. The back side 1050 of the trading card includes descriptive information 1051 about the avatar, including the avatar's name, birth date, city, species, likes, dislikes, hobbies, and aspirations. As illustrated in FIG. 10, both the front side 1045 and the back side 1050 of the trading card are shown. In some implementations, only one side 1045 or 1050 of the trading card is displayed at a time. In such a case, the user may use one of flip controls 1048 or 1052 to control which side of the trading card is displayed. A store from which accessories for the avatar 1046 shown on the trading card may be obtained may be accessed by selecting a shopping control 1049.

   Referring again to FIG. 7, the avatar also may be exported for use in another application (step 730). In some implementations, the avatar may be used by an application other than a messaging application. In one example, the avatar may be displayed as part of a user's customized home page of the user's access provider, such as an Internet service provider. An instant message sender may drag-and-drop an avatar onto the user's customized home page so that the avatar may be viewed by the user corresponding to the avatar. In another example, the avatar may be used in an application in which the avatar may be viewed by anyone. An instant message sender may drag-and-drop the sender's avatar onto the sender's blog or another type of publicly accessible online journal. The user may repeat one or more of the steps of the process 700 until the user is satisfied with the appearance and behavior of the avatar. The avatar is then stored and made available for use in instant messaging sessions.

   Referring again to FIG. 10, an avatar settings user interface 1000 includes a personality section 1002. Selecting a personality tab 1010 displays the personality section of the avatar settings interface 1000 for modifying the behavior of one or more avatars. In one implementation, the avatar settings user interface 1000 may be used in conjunction with the process 700 of FIG. 7 to choose wallpaper for an avatar and/or to create a trading card for an avatar.

   The personality section 1002 of the avatar settings interface 1000 includes an avatar list 1015 that includes one or more avatars corresponding to a user of the instant messaging system. Each of the one or more avatars may be specified to have a distinct personality for use while communicating with a specific person or in a specific situation. In one implementation, an avatar may change appearance or behavior depending on the person with whom the user interacts. For example, one avatar may be created with a personality that is appropriate for business communications, and another avatar may be created with a personality that is appropriate for communicating with family members. Each avatar may be presented in the list with a name as well as a small illustration of the avatar's appearance. Selecting an avatar from the avatar list 1015 enables the behavior of the selected avatar to be specified. For example, the avatar 1020, which is shown as the user's default avatar, has been selected from the avatar list 1015, so the behavior of the avatar 1020 may be specified.

   The name of an avatar included in the avatar list may be changed by selecting a rename button 1025. Selecting the rename button displays a tool for changing the name of the avatar selected from the avatar list 1015. Similarly, an avatar may be designated as the default avatar by selecting a default button 1030 after selecting the avatar from the avatar list 1015. An avatar may be deleted by selecting the avatar from the avatar list 1015 and then selecting a delete button 1035. In one implementation, a notification is displayed before the avatar is deleted from the avatar list 1015. An avatar also may be created by selecting a create button 1040. When the create button 1040 is pressed, a new entry is added to the avatar list 1015. The entry may be selected and modified in the same way as the other avatars in the avatar list 1015.

   The behavior of the avatar is summarized in a card front 1045 and a card back 1050 displayed in the personality section. The card front 1045 includes an illustration of the avatar and the wallpaper over which the avatar 1020 is illustrated. The card front 1045 also includes a shopping control 1049 that links to a means for purchasing props for the selected avatar 1020. The card back 1050 includes information describing the selected avatar 1020 and the user of the selected avatar. The description may include a name, a birth date, a location, and other identifying and descriptive information for the avatar and the user of the avatar. The card back 1050 also may include an illustration of the avatar as well as the wallpaper over which the avatar 1020 is illustrated. The trading card created as part of the avatar customization process 700 includes the card front 1045 and the card back 1050, which are automatically generated by the avatar settings interface 1000.

   The personality section 1002 of the avatar settings interface 1000 may include multiple links 1055 through 1070 to tools for modifying other aspects of the behavior of the selected avatar 1020. For example, an avatar link 1055 may lead to a tool for modifying the appearance of the selected avatar 1020. In one implementation, selecting the avatar link 1055 may display the appearance modification user interface 900 of FIG. 9. In another implementation, the avatar link 1055 may display a tool for substituting or otherwise selecting the selected avatar 1020. In yet another example, the avatar link 1055 may allow the appearance of the avatar to be changed to a different species. For example, the tool may allow the appearance of the avatar 1020 to be changed from that of a dog to that of a cat.

   A wallpaper link 1060 may be selected to display a tool for choosing the wallpaper over which the selected avatar 1020 is drawn. In one implementation, the wallpaper may be animated.

   A sounds link 1065 may be selected to display a tool with which the sounds made by the avatar 1020 may be modified. The sounds may be played when the avatar is animated, or at other times, to get the attention of the user.

   An emoticons link 1070 may be selected to display a tool for specifying the emoticons that are available when communicating with the selected avatar 1020. An emoticon is a two-dimensional, non-animated image that is sent when the text of an instant message includes a particular trigger. Changes made using the tools accessible through the links 1055 through 1070 may be reflected in the card front 1045 and the card back 1050. After all desired changes have been made to the avatars included in the avatar list 1015, the avatar settings interface 1000 may be exited by selecting an exit button 1075.

   In particular, referring to FIGS. 11A through 14, the systems and techniques described here may enable a user to assemble multiple self-expression items into a comprehensive "online personality" or "online persona" that the user may then save and optionally associate with one or more customized names. Each self-expression item is used to represent the instant message sender, or a characteristic or preference of the instant message sender, and may include a user-selectable binary object. The self-expression items may be perceivable by a potential instant message recipient ("instant message recipient") before, during, or after the initiation of communications by a potential instant message sender ("instant message sender"). For example, the self-expression items may include an avatar, or an image such as wallpaper, that is applied in a location having a contextual placement on the user interface. The contextual placement typically indicates an association with the user represented by the self-expression item. For example, the wallpaper may be applied in an area in which messages from the instant message sender are displayed, or in an area around a dialog area on the user interface. The self-expression items also include sounds, animations, video clips, and emoticons (e.g., smileys). A personality also may include a set of features or functionality associated with the personality. For example, features such as encrypted transmission, instant message conversation logging, and forwarding of instant messages to an alternative communication system may be enabled for a given personality.

   A user may assign a personality to be projected when conversing with other users, either before a communication session or in "real time" during the communication session. This enables the user to project different personalities to different people who are online. In particular, the user may save one or more personalities (e.g., where each personality typically includes a group of instant message self-expression items such as, for example, an avatar, Buddy Sounds, Buddy Wallpaper, and Smileys, and/or a set of features and functionality), and the user may name these personalities to enable their invocation. The user may associate each of the various personalities with a user or group of users with whom the user communicates, so that the appropriate/selected personality is automatically displayed when communicating with that user or group. Alternatively or additionally, the user may establish each of the various personalities during the process of creating, adding to, or customizing the user's own contact list. Thus, personalities may be projected to others in online environments (e.g., instant messaging and chat) according to the assignments made by the user. Moreover, personalities may be assigned, established, and/or associated with other settings, such as a time, a geographic or virtual location, or a characteristic or attribute of either (e.g., a cold personality for winter in Colorado, or a chatty personality when participating in a chat room).
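
   The per-contact assignment described above can be sketched as a lookup with a default fallback; all of the names and item sets below are hypothetical.

```python
# Hypothetical sketch: the sender stores named personalities (groups of
# self-expression items) and maps contacts or groups to them; unmatched
# contacts fall back to a default personality. All names are assumptions.

class PersonalityStore:
    def __init__(self, default_name):
        self.personalities = {}   # personality name -> self-expression items
        self.by_contact = {}      # contact or group name -> personality name
        self.default_name = default_name

    def save(self, name, items):
        self.personalities[name] = items

    def assign(self, contact_or_group, name):
        self.by_contact[contact_or_group] = name

    def personality_for(self, contact):
        """Return the personality projected when talking to this contact."""
        name = self.by_contact.get(contact, self.default_name)
        return self.personalities[name]

store = PersonalityStore(default_name="casual")
store.save("work", {"avatar": "suit", "wallpaper": "office"})
store.save("casual", {"avatar": "starfish", "wallpaper": "beach"})
store.assign("coworkers", "work")
```

   A contact in the "coworkers" group sees the work personality; anyone else sees the default.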

   In many implementations, an instant message sender may have multiple online personalities for use in instant messaging sessions. Each online personality is associated with an avatar that represents that particular online personality of the instant message sender. In many cases, each online personality of a particular instant message sender is associated with a different avatar, though this need not necessarily be the case. Moreover, even when two or more online personalities of a particular instant message sender include the same avatar, the appearance or behavior of the avatar may differ for each of the online personalities. In one example, a starfish avatar may be associated with two online personalities of a particular instant message sender. The starfish avatar associated with one online personality may have different animations than the starfish avatar associated with the other online personality. Even when both of the starfish avatars include the same animations, one of the starfish avatars may be animated to display a particular type of animation based on a different trigger than the trigger used to display the same animation for the other starfish avatar.

  FIG. 11A shows the relationship between online personalities, avatars, avatar behaviors, and avatar appearances. In particular, FIG. 11A shows online personalities 1102a through 1102e and the avatars 1104a through 1104d associated with the online personalities 1102a through 1102e. Each of the avatars 1104a through 1104d includes one of the appearances 1106a through 1106c and one of the behaviors 1108a through 1108d. More particularly, avatar 1104a includes appearance 1106a and behavior 1108a, avatar 1104b includes appearance 1106b and behavior 1108b, avatar 1104c includes appearance 1106c and behavior 1108c, and avatar 1104d includes appearance 1106c and behavior 1108d. Avatars 1104c and 1104d are similar in that both include appearance 1106c. However, avatars 1104c and 1104d differ in that avatar 1104c includes behavior 1108c, while avatar 1104d includes behavior 1108d.

  Each of the online personalities 1102a through 1102e is associated with one of the avatars 1104a through 1104d. More particularly, online personality 1102a is associated with avatar 1104a, online personality 1102b is associated with avatar 1104b, online personality 1102c also is associated with avatar 1104b, online personality 1102d is associated with avatar 1104c, and online personality 1102e is associated with avatar 1104d. As illustrated by online personality 1102a and avatar 1104a, an online personality can be associated with an avatar that is not also associated with a different online personality.

  Multiple online personalities can use the same avatar. This is illustrated by online personalities 1102b and 1102c, both of which are associated with avatar 1104b. In this case, the appearance and behavior exhibited by avatar 1104b is the same for both online personalities 1102b and 1102c. In other cases, multiple online personalities can use avatars having the same appearance but exhibiting different behavior, as illustrated by online personalities 1102d and 1102e. Online personalities 1102d and 1102e are associated with avatars 1104c and 1104d, respectively, which have the same appearance 1106c. However, avatars 1104c and 1104d exhibit different behaviors 1108c and 1108d, respectively.
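  The relationships of FIG. 11A can be sketched as a small data model. This is a hypothetical illustration only (the class names and example values are not from the specification); it shows how two avatars can share the same appearance while exhibiting different behaviors.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Appearance:
    name: str                      # e.g. the "starfish" appearance 1106c

@dataclass(frozen=True)
class Behavior:
    animations: tuple              # (activation trigger, animation) pairs

@dataclass
class Avatar:
    appearance: Appearance
    behavior: Behavior

@dataclass
class OnlinePersonality:
    label: str
    avatar: Avatar

# Two avatars share one appearance (cf. 1106c) but have different
# behaviors (cf. 1108c vs. 1108d): the same animation can be bound
# to different activation triggers.
starfish = Appearance("starfish")
behavior_c = Behavior(animations=(("smile_trigger", "wave"),))
behavior_d = Behavior(animations=(("greeting_trigger", "wave"),))

avatar_c = Avatar(appearance=starfish, behavior=behavior_c)
avatar_d = Avatar(appearance=starfish, behavior=behavior_d)

personality_d = OnlinePersonality("1102d", avatar_c)
personality_e = OnlinePersonality("1102e", avatar_d)

same_appearance = personality_d.avatar.appearance == personality_e.avatar.appearance
same_behavior = personality_d.avatar.behavior == personality_e.avatar.behavior
```

In this sketch, `same_appearance` holds while `same_behavior` does not, mirroring the distinction between avatars 1104c and 1104d.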

  In creating a personality, an instant message sender can prohibit a particular personality from being shown to designated instant message recipients and/or groups. For example, if the instant message sender wants to ensure that the "normal" personality is not accidentally displayed to the boss or to colleagues, the instant message sender can prohibit display of the "normal" personality to the boss on an individual basis, and can prohibit display of the "normal" personality to the "colleagues" group on a group basis. An appropriate user interface can be provided to assist the instant message sender in making such selections. Similarly, to guard against accidental or unintended personality switching and/or augmenting, the instant message sender can be provided with an option to "lock" a personality to an instant message recipient or group of instant message recipients. So, for example, the instant message sender can choose to lock the "work" personality to the boss on an individual basis, or to lock the "work" personality to the "colleagues" group on a group basis. In one implementation, a default personality assignment does not apply to a locked personality.
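  The lock and prohibition rules described above can be sketched as a small resolution function. This is a minimal hypothetical sketch (the function name, the preference keys, and the rule that a prohibited personality falls back to the default are illustrative assumptions, not the specification's method).

```python
def personality_for(sender_prefs, recipient, recipient_groups):
    """Resolve which personality to project to a recipient, honoring
    per-recipient and per-group locks and prohibitions (hypothetical
    rule names; illustrative only)."""
    # A locked personality always wins, guarding against accidental
    # or unintended personality switching.
    locked = sender_prefs.get("locks", {}).get(recipient)
    if locked is None:
        for grp in recipient_groups:
            locked = sender_prefs.get("group_locks", {}).get(grp)
            if locked is not None:
                break
    if locked is not None:
        return locked
    candidate = sender_prefs.get("assignments", {}).get(
        recipient, sender_prefs.get("default"))
    # A prohibited personality is never shown to that recipient;
    # fall back to the default personality instead.
    if candidate in sender_prefs.get("prohibit", {}).get(recipient, set()):
        return sender_prefs.get("default")
    return candidate

prefs = {
    "default": "work",
    "assignments": {"friend1": "normal"},
    "prohibit": {"boss": {"normal"}},
    "locks": {"boss": "work"},
    "group_locks": {"colleagues": "work"},
}
```

With these preferences, the boss and the "colleagues" group always receive the locked "work" personality, while "friend1" receives the individually assigned "normal" personality.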

  FIG. 11B shows an exemplary process 1100 by which an instant message sender can select an online personality to be made perceivable to an instant message recipient. The selected online personality includes an avatar representing the online personality of the instant message sender. Process 1100 generally involves selecting and projecting an online personality that includes an avatar representing the sender. The instant message sender creates or modifies one or more online personalities, including an avatar representing the sender (step 1105). The online personalities can be created or modified using, for example, the avatar settings user interface 1000 of FIG. 10. Creating an online personality generally involves the instant message sender selecting one or more self-expression items and/or features and functions to be displayed to a particular instant message recipient or group of instant message recipients. As shown in FIG. 12, a user interface can be provided to assist the instant message sender in making such selections.

  FIG. 12 shows a personality-selector user interface 1200 that enables an instant message sender to select from among available personalities 1205, 1210, 1215, 1220, 1225, 1230, 1235, 1240, 1245, 1250, and 1255. The user interface 1200 also includes a control 1260 to enable the instant message sender to "steal" the personality of another user, and a control 1265 to enable the instant message sender to review the personality settings currently selected by the instant message sender. Through the use of the avatar settings interface 1000, the user can change the personality, including the avatar, projected to an instant message recipient before, during, or after an instant message conversation with the recipient.

  As an alternative, personality selection can occur automatically without intervention by the sender. For example, an automatic determination can be made that the sender is sending instant messages from work. In such a case, a personality to be used at work is selected automatically and used for all communications. As another example, an automatic determination can be made that the sender is sending instant messages from home, and a personality to be used at home is selected automatically and used for all communications. In such an implementation, the sender is unable to control which personality is selected for use. In another implementation, automatic selection of a personality can be used in conjunction with sender selection of a personality, in which case the automatically selected personality acts as a default that can be changed by the sender.
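  The automatic-selection-with-override behavior described above can be sketched in a few lines. This is a hypothetical illustration (the function name, the location labels, and the fallback to "normal" are assumptions for the example).

```python
def select_personality(detected_location, sender_override=None):
    """Automatically select a personality from the sender's detected
    location; the automatic choice acts only as a default that the
    sender may change (hypothetical sketch)."""
    automatic = {"work": "work", "home": "home"}.get(detected_location, "normal")
    # When the sender explicitly picks a personality, that choice
    # overrides the automatically selected default.
    return sender_override if sender_override is not None else automatic

at_work = select_personality("work")                       # automatic default
at_home_overridden = select_personality("home", "casual")  # sender override
elsewhere = select_personality("cafe")                     # no rule: fallback
```

Here `at_work` resolves to "work", the override yields "casual", and an unrecognized location falls back to the "normal" personality.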

  FIG. 13 shows an exemplary series of user interfaces 1300 for enabling an instant message sender to create and store a personality, and/or to select various aspects of the personality, such as the avatar, buddy wallpaper, buddy sounds, and smileys. As shown, user interface 1305 enables the sender to select from a set of self-expression items and to save the set of self-expression items as a personality. User interface 1305 also enables the instant message sender to review and make changes to an instant message personality. For example, user interface 1305 enables the instant message sender to choose an avatar 1310 (here referred to as a SuperBuddy), buddy wallpaper 1315, emoticons 1320 (here referred to as Smileys), and buddy sounds 1325. A set of controls 1340 is provided to enable the instant message sender to review the settings 1340a and save the selected self-expression items as a personality 1340b. The instant message sender can name and save the personality 1345, and then apply the personality 1350 to one or more individual instant message recipients or to one or more groups of instant message recipients. A management area 1350a is provided to enable the instant message sender to delete, save, or rename various instant message personalities. In selecting a self-expression item, other interfaces, such as user interface 1355, can be displayed to enable the instant message sender to select the particular self-expression item. User interface 1355 includes a set of themes 1360 for avatars, enabling the instant message sender to select a particular theme 1365 and a particular avatar 1370 within the selected theme. A set of controls 1375 is provided to assist the instant message sender in selecting self-expression items. Similarly, the instant message sender can select a predetermined theme, for example, by using user interface 1380. In user interface 1380, the instant message sender can select from various categories 1385 of predetermined themes; selecting a particular category 1390 displays a set of default, pre-selected self-expression items 1390a, 1390b, 1390c, 1390d, 1390e, and 1390f. The set can be unchangeable, or the instant message sender can individually change any of the pre-selected self-expression items in the set. A control section 1395 is also provided to enable the instant message sender to select a theme.

  In another implementation, features or functions of the instant messaging interface can vary based on options chosen or pre-selected by the user for the personality currently in use. These features or functions can be transparent to the instant message sender. For example, when a "work" personality is in use, outgoing instant messages can be encrypted, copies can be recorded in a log, or copies can be forwarded to a designated contact, such as an administrative assistant. The instant message recipient can be alerted that the instant message conversation is being recorded and may be viewed by others, as appropriate to the situation. By comparison, when a non-professional "normal" personality is selected, outgoing instant messages are not encrypted, and no copies are recorded or forwarded.

  As a further example, when the "work" personality is selected and the instant message sender indicates unavailability to receive instant messages (e.g., by selecting an "away" message or by disconnecting), messages received from others during the unavailable period can be forwarded to another instant message recipient, such as an administrative assistant, or to an e-mail address of the instant message sender. By comparison, when the non-professional "normal" personality is selected, no action is taken to ensure delivery of the message.

  In one implementation, the features and functions of a personality are transparent to the instant message sender and can be based on one or more pre-selected profile types chosen when setting up the personality. For example, the instant message sender can be asked to select from a group of personality types, such as professional, administrative, informal, and occupational types, where, per the example above, the "work" personality would be constructed as a "professional" personality type and the "normal" personality would be constructed as an "informal" personality type. In another implementation, the instant message sender can individually select the features and functions of a personality.
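  The mapping from personality to profile type to transparent features can be sketched as two small lookup tables. This is a hypothetical illustration (the table names, the feature flags, and the fallback to "informal" are assumptions for the example, not the specification's design).

```python
# Hypothetical feature sets keyed by profile type; these features are
# transparent to the instant message sender.
PROFILE_FEATURES = {
    "professional": {"encrypt": True, "log": True, "forward_copy": True},
    "informal":     {"encrypt": False, "log": False, "forward_copy": False},
}

# Each named personality is constructed as one of the profile types.
PERSONALITY_TYPE = {"work": "professional", "normal": "informal"}

def features_for(personality):
    """Derive the transparent features and functions from the profile
    type pre-selected for the personality (fallback assumed here)."""
    profile = PERSONALITY_TYPE.get(personality, "informal")
    return PROFILE_FEATURES[profile]
```

Under this sketch, messages sent with the "work" personality are encrypted and logged, while the "normal" personality leaves them untouched.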

  Referring again to FIG. 11B, the personality is then stored (step 1110). The personality can be stored on the instant message sender system, on the instant message host system, or on a different host system, such as a host system of an authorized partner or access provider.

  Next, the instant message sender assigns the personality to be projected during future instant message activities or future instant message conversations with an instant message recipient (step 1115). The instant message sender may wish to display different personalities to different instant message recipients and/or groups in the buddy list. The instant message sender can use a user interface to assign the personalization items of a personality, at least on a per-buddy-group basis. For example, an instant message sender can assign a global avatar to all personalities, while for other personalities (e.g., work, family, friends) different buddy sounds can be assigned on a group basis, and, for individual personalities corresponding to particular instant message recipients within a group, buddy wallpaper and smileys can be assigned on an individual basis. The instant message sender also can assign other personality attributes based on the occurrence of certain predetermined events or activation triggers. For example, if the weather indicates rain at the geographic location of the instant message sender, particular potential instant message recipients can be designated to view certain aspects of a "rainy day" personality. Default priority rules can be implemented to resolve conflicts, or the user can select priority rules to resolve conflicts among projected personalities or among the self-expression items projected for a merged personality.

  For example, a set of default priority rules can resolve conflicts among assigned personalities by assigning the highest priority to personality and self-expression items assigned on an individual basis, assigning the next highest priority to personality and personalization items assigned on a group basis, and assigning the lowest priority to personality and personalization items assigned on a global basis. However, the user can be given the option to override these default priority rules and to select different priority rules for resolving conflicts.
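  The default priority rules just described (individual over group over global) can be sketched as a one-line resolver. This is a minimal hypothetical sketch; the scope labels and function name are assumptions for the example.

```python
# Lower number = higher priority, per the default rules: individual
# assignments beat group assignments, which beat global assignments.
PRIORITY = {"individual": 0, "group": 1, "global": 2}

def resolve(assignments):
    """Given (scope, personality) pairs that all apply to one
    recipient, return the personality from the highest-priority
    scope (hypothetical default-priority resolver)."""
    return min(assignments, key=lambda pair: PRIORITY[pair[0]])[1]

conflicting = [
    ("global", "rainy-day"),
    ("group", "work"),
    ("individual", "golf-buddy"),
]
```

Applied to `conflicting`, the individually assigned "golf-buddy" personality wins; with only the first two entries, the group-level "work" personality would win.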

  Next, instant message activity between the instant message sender and the instant message recipient is initiated (step 1120). Instant message activity can be initiated by either an instant message sender or an instant message recipient.

  As illustrated, for example, in user interface 100 of FIG. 1, an instant message user interface is rendered to the instant message recipient and is configured to project the personality, including the avatar, assigned to the instant message recipient by the instant message sender (step 1125). The personality, including the avatar associated with the personality, chosen by an instant message recipient can be made perceivable upon the opening of a communication window by the instant message sender for a specific instant message recipient, but prior to the initiation of communications. This can allow a user to determine whether to initiate communications with the instant message recipient. For example, an instant message sender may notice that the instant message recipient is projecting an at-work personality, and the instant message sender may decide to refrain from sending an instant message. This may be particularly true when the avatar of the instant message recipient is displayed on a contact list. On the other hand, rendering an avatar of an instant message recipient after sending an instant message can result in more efficient communications.

  When a buddy communicates with the instant message sender through the instant message client program, the appropriate personality/personalization item set for the buddy is sent to the buddy. For example, in an implementation supporting global personalization items, group personalization items, and personal personalization items, a personal personalization item is sent to the buddy if one is set; otherwise, a group personalization item is sent if one is set. If neither a personal nor a group personalization item is set, a global personalization item is sent. As another example, in an implementation supporting global personalization items and group personalization items, a group personalization item for the group to which the buddy belongs is sent if one is set; otherwise, a global personalization item is sent. In an implementation supporting only group personalization items, a group personalization item for the group to which the buddy belongs is sent to the buddy.
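  The personal-over-group-over-global fallback described above can be sketched as a single lookup function. This is a hypothetical sketch (the function signature and the example data are illustrative assumptions).

```python
def item_for_buddy(buddy, group_of, personal, group, global_item):
    """Return the personalization item (e.g. an avatar) to send to a
    buddy: the personal item if set, else the item for the buddy's
    group if set, else the global item (hypothetical helper)."""
    if buddy in personal:
        return personal[buddy]
    buddy_group = group_of.get(buddy)
    if buddy_group in group:
        return group[buddy_group]
    return global_item

# Illustrative assignments: alice has a personal item, carol inherits
# her group's item, and bob falls back to the global item.
group_of = {"alice": "work", "bob": "friends", "carol": "work"}
personal = {"alice": "starfish-avatar"}
group = {"work": "suit-avatar"}
```

With these assignments, alice receives "starfish-avatar", carol receives "suit-avatar", and bob receives whatever global item is passed in.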

  Instant message activity between an instant message sender and another instant message recipient can be initiated by either the instant message sender or the instant message recipient (step 1130).

  Similar to the user interface illustrated by FIG. 1, based on the second instant message activity, a second instant message user interface is rendered to the second instant message recipient and is configured to project the personality, including the avatar, assigned by the instant message sender to the second instant message recipient (step 1135). The personality can be projected in a manner similar to that described above with respect to step 1125. However, the personality and avatar projected to the second instant message recipient can differ from the personality and avatar projected to the first instant message recipient described above in step 1125.

  Referring to FIG. 14, an exemplary process 1400 enables an instant message sender to change a personality assigned to an instant message recipient. In process 1400, a user selection of a new online personality, including an avatar, to be assigned to the instant message recipient is received (step 1405). The change can be received through a personality selector, such as the user interface 1200 discussed above with respect to FIG. 12, and can include selections of self-expression items and/or features and functions using such an interface, or can include "stealing" a buddy's online personality or avatar using such an interface. Stealing an avatar refers to the instant message sender assigning to himself or herself one or more personalization items, such as the avatar, used by the instant message recipient. Typically, all of the personalization items in the online personality of the instant message recipient are assigned to the instant message sender when the online personality is "stolen."

  Next, an updated user interface for the instant message recipient is provided based on the newly selected personality (step 1410).

  FIG. 15 illustrates an exemplary process 1500 for modifying the appearance or behavior of an avatar in order for an instant message sender to communicate an out-of-band message to an instant message recipient. The process can be performed by an instant messaging system, such as the communication systems 1600, 1700, and 1800 described with respect to FIGS. 16, 17, and 18, respectively. An out-of-band message refers to a message that communicates context out-of-band, that is, a message that carries information independent of the information carried directly in the text of the instant message itself sent to the recipient. Thus, the recipient views the appearance and behavior of the avatar to receive information that is not directly or explicitly conveyed in the instant message itself. By way of example, an out-of-band communication can include information about the sender's setting, environment, activity, or mood that is not communicated in, and is not part of, a text message exchanged by the sender and the recipient.

  Process 1500 begins with the instant messaging system monitoring the communication environment and the sender's environment for an out-of-band communication indicator (step 1510). The indicator can be an indication of the sender's setting, environment, activity, or mood that is not explicitly conveyed in the instant messages sent by the sender. For example, the out-of-band indicator can be an indication of the time and date at the sender's location, which can be obtained from a clock application associated with the instant messaging system or the sender's computer. The indicator can be an indication of the sender's physical location. The indicator can be an indication of the weather conditions at the sender's location, which can be obtained from a weather reporting service, such as a web site that provides weather information for geographic locations.

  In addition, the indicator can indicate the activity of the sender at or near the time an instant message is sent. For example, other applications that are active on the sender's computer at or near the time the instant message is sent can be detected. For example, it can be detected that the sender is using a media-playing application to listen to music, and the avatar associated with the sender can be made to appear to be wearing headphones to reflect that the sender is listening to music. As another example, it can be detected that the sender is working with a calculator application, and the avatar can be made to appear to be wearing glasses to reflect that the sender is working.

  The activity of the sender also can be monitored through the use of a camera focused on the sender. Visual information taken from the camera can be used to determine the activity and mood of the sender. For example, the positions of points on the sender's face can be determined from the visual information taken from the camera. The position and movement of those points can be reflected in the avatar associated with the sender. Thus, if the sender smiles, for example, the avatar also smiles.

  An indication of the sender's mood also can come from another device that is operable to determine the sender's mood and to send an indication of the mood to the sender's computer. For example, the sender can wear a device that monitors heart rate, and the sender's mood can be determined from the heart rate. For example, the device can conclude that the sender is agitated or excited when an elevated heart rate is detected. The device can send the indication of the sender's mood to the sender's computer for use with the sender's avatar.

  The instant messaging system determines whether an out-of-band communication indicator has been detected (step 1520). When an out-of-band communication indicator is detected, the instant messaging system determines whether the avatar must be modified, customized, or animated to reflect the detected out-of-band communication indicator (step 1530); otherwise, the instant messaging system continues to monitor for out-of-band communication indicators (step 1510). To determine whether action is required, the instant messaging system can use a data table, list, or file that includes out-of-band communication indicators and the associated action to be taken for each out-of-band communication indicator. Action may not be required for every detected out-of-band communication indicator. For example, action may be required only for some out-of-band communication indicators, such as when an indicator changes from a previously received indicator setting. By way of example, the instant messaging system can periodically monitor a clock application to determine whether the setting at the sender's location is daytime or nighttime. Once the instant messaging system has taken action based on detection of an out-of-band communication indicator having a nighttime setting, the instant messaging system need not take action based on subsequent detections of the nighttime-setting indicator. The instant messaging system next takes action based on the nighttime setting only after receiving an out-of-band communication indicator for a daytime setting.
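  The change-detection behavior in the daytime/nighttime example above can be sketched as a small monitor class backed by an action table. This is a hypothetical sketch (the class, the indicator names, and the action strings are illustrative assumptions, not the specification's implementation).

```python
class OutOfBandMonitor:
    """Takes an action only when an indicator's value changes from
    its previously received setting, so a repeated nighttime
    indicator triggers no further action (hypothetical sketch)."""

    def __init__(self, actions):
        self.actions = actions   # (indicator name, value) -> avatar action
        self.last = {}           # indicator name -> last seen value

    def observe(self, name, value):
        if self.last.get(name) == value:
            return None          # unchanged setting: no action required
        self.last[name] = value
        return self.actions.get((name, value))

monitor = OutOfBandMonitor({
    ("time_of_day", "night"): "wear_pajamas",
    ("time_of_day", "day"): "wear_default",
})

first = monitor.observe("time_of_day", "night")    # acts: setting changed
repeat = monitor.observe("time_of_day", "night")   # no action: unchanged
after_day = (monitor.observe("time_of_day", "day"),
             monitor.observe("time_of_day", "night"))
```

Here the first nighttime indicator produces an action, the repeated one produces none, and a nighttime action becomes possible again only after a daytime indicator has been received.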

  If action is required (step 1540), the appearance and / or behavior of the avatar is modified according to the out-of-band communication indicator (step 1550).

  In one example, when the out-of-band communication indicator shows that the sender is sending instant messages at night, the appearance of the avatar is modified so that the avatar appears to be dressed in pajamas. When the indicator shows that the sender is sending instant messages during a holiday period, the avatar can be dressed in a manner associated with the holiday. By way of example, the avatar can be dressed as Santa Claus during December, as a pumpkin near Halloween, or as Uncle Sam during early July.

  In another example, when the out-of-band indicator shows that the sender is at the office, the avatar can be dressed in business attire, such as a suit and tie. The appearance of the avatar also can reflect the weather or general climate of the geographic location of the sender. For example, when the out-of-band indicator shows that it is raining at the sender's location, the wallpaper of the avatar can be modified to include falling raindrops, the avatar can be shown holding an open umbrella, and/or the avatar can be made to appear to be wearing a rain hat.

  As another example, when the out-of-band communication indicator shows that the sender is listening to music, the appearance of the avatar can be changed to show the avatar wearing headphones. Additionally or alternatively, the appearance of the avatar can be changed based on the type of music to which the sender is listening. When the indicator shows that the sender is working (at the sender's workplace or elsewhere), the avatar can appear in business attire, such as wearing a suit and tie. As illustrated by this example, different out-of-band communication indicators can trigger the same appearance of the avatar. In particular, both an out-of-band communication indicator showing that the sender is located at the workplace and an out-of-band communication indicator showing that the sender is performing work activities can cause the avatar to appear to be wearing a suit and tie.
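  The many-to-one mapping noted above, where distinct indicators trigger the same avatar appearance, can be sketched as a rule table. This is a hypothetical illustration (the indicator and appearance names are assumptions for the example).

```python
# Distinct out-of-band indicators may map to the same appearance:
# both "at_workplace" and "working" yield the suit-and-tie look.
APPEARANCE_RULES = {
    "at_workplace": "suit_and_tie",
    "working": "suit_and_tie",
    "listening_to_music": "headphones",
    "raining": "rain_hat",
}

def avatar_appearance(indicators):
    """Return the appearance modifications for the detected
    indicators, de-duplicating when several indicators map to one
    appearance (hypothetical sketch)."""
    looks = {APPEARANCE_RULES[i] for i in indicators if i in APPEARANCE_RULES}
    return sorted(looks)
```

Passing both work-related indicators yields a single suit-and-tie modification rather than two, while unrelated indicators each contribute their own appearance.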

  In yet another example, the sender's mood can be indicated by an out-of-band communication indicator. In that case, the appearance of the avatar can be changed to reflect the indicated mood. For example, when the sender is sad, the avatar can be modified to reflect the sad state of the sender, such as by animating the avatar to frown or cry. In another example, based on the detected activity of the sender, a tired, busy, or harried mood can be detected, and the avatar can be animated to communicate such an emotional state.

  After the appearance and/or behavior of the avatar has been modified to reflect the out-of-band indicator (step 1550), the updated avatar, or an indication that the avatar has been updated, is communicated to the recipient (step 1560). Generally, the updated avatar, or an indication that the avatar has been changed, is provided in association with the next instant message sent by the sender; however, this need not necessarily be the case in every implementation. In some implementations, the change to the avatar can be communicated to the recipient independently of the sending of a communication. Additionally or alternatively, when a buddy list of the instant message user interface includes a display of the sender's avatar, the change in avatar appearance can be communicated to each buddy list that includes the sender. Thus, the recipient is made able to perceive the updated avatar, whose behavior and/or appearance provides the out-of-band communication of the sender.

  FIG. 16 shows a communication system 1600 that includes an instant message sender system 1605 that can communicate with an instant message host system 1610 via a communication link 1615. The communication system 1600 also includes an instant message recipient system 1620 that can communicate with the instant message host system 1610 via the communication link 1615. Using communication system 1600, a user of instant message sender system 1605 can exchange communications with a user of instant message recipient system 1620. The communication system 1600 can animate the avatar for use in self-expression by the instant message sender.

  In one implementation, each of the instant message sender system 1605, the instant message recipient system 1620, and the instant message host system 1610 can be implemented using, for example, one or more general-purpose computers, one or more special-purpose computers (e.g., devices specifically programmed to communicate with each other), or a combination of one or more general-purpose computers and one or more special-purpose computers. By way of example, the instant message sender system 1605 or the instant message recipient system 1620 can be a personal computer or another type of personal information device, such as a personal digital assistant or a mobile communications device. In some implementations, the instant message sender system 1605 and/or the instant message recipient system 1620 can be a mobile telephone that is capable of receiving instant messages.

  The instant message sender system 1605, the instant message recipient system 1620, and the instant message host system 1610 can be configured to operate within or in concert with one or more other systems, such as, for example, one or more LANs ("Local Area Networks") and/or one or more WANs ("Wide Area Networks"). The communication link 1615 typically includes a delivery network (not shown) that provides direct or indirect communication between the instant message sender system 1605 and the instant message host system 1610, irrespective of physical separation. Examples of delivery networks include the Internet, the World Wide Web, WANs, LANs, analog or digital wired and wireless telephone networks (e.g., the Public Switched Telephone Network (PSTN), the Integrated Services Digital Network (ISDN), and various implementations of the Digital Subscriber Line (DSL)), radio, television, cable, or satellite systems, and other delivery mechanisms for carrying data. The communication link 1615 can include communication pathways (not shown) that enable communications through the one or more delivery networks described above. Each of the communication pathways can include, for example, a wired, wireless, cable, or satellite communication pathway.

  The instant message host system 1610 can support instant messaging services irrespective of the instant message sender's network or Internet access. Thus, the instant message host system 1610 can allow users to send and receive instant messages regardless of whether they have access to any particular Internet service provider (ISP). The instant message host system 1610 also can support other services, including, for example, an account management service, a directory service, and a chat service. The instant message host system 1610 has an architecture that enables the devices (e.g., servers) within the instant message host system 1610 to communicate with each other. To transfer data, the instant message host system 1610 employs one or more standard or proprietary instant message protocols.

  To access the instant message host system 1610 and begin instant message activity in the implementation of FIG. 16, the instant message sender system 1605 establishes a connection to the instant message host system 1610 over the communication link 1615. Once a connection to the instant message host system 1610 has been established, the instant message sender system 1605 can directly or indirectly transmit data to, and access content from, the instant message host system 1610. By accessing the instant message host system 1610, an instant message sender can use an instant message client application located on the instant message sender system 1605 to view whether particular users are online, view whether users can receive instant messages, exchange instant messages with particular instant message recipients, participate in group chat rooms, trade files such as pictures, invitations, or documents, find other instant message recipients with similar interests, get customized information such as news and stock quotes, and search the Web. The instant message recipient system 1620 can be similarly manipulated to establish a contemporaneous connection with the instant message host system 1610.

  Furthermore, the instant message sender can view or perceive an avatar and/or other aspects of an online personality associated with an instant message recipient prior to engaging in communications with that instant message recipient. For example, certain aspects of the personality chosen by an instant message recipient, such as an avatar chosen by the instant message recipient, can be perceivable through the buddy list itself prior to engaging in communications. Other aspects of the personality chosen by an instant message recipient can be made perceivable upon the opening of a communication window by the instant message sender for a particular instant message recipient, but prior to the initiation of communications. For example, in a communication window such as user interface 100 of FIG. 1, only the avatar animations associated with the instant message sender may be viewable.

  In one implementation, instant messages sent between the instant message sender system 1605 and the instant message recipient system 1620 are routed through the instant message host system 1610. In another implementation, instant messages sent between the instant message sender system 1605 and the instant message recipient system 1620 are routed through a third-party server (not shown) and, in some cases, also through the instant message host system 1610. In yet another implementation, instant messages are sent directly between the instant message sender system 1605 and the instant message recipient system 1620.

  The techniques, processes, and concepts described herein may be implemented using the communication system 1600. One or more of the processes may be implemented in a client/host context, a standalone or offline client context, or a combination thereof. For example, some functions of one or more of the processes may be performed entirely by the instant message sender system 1605, while other functions may be performed by the host system 1610 or by the collective operation of the instant message sender system 1605 and the host system 1610. By way of example, the instant message sender's avatar may be selected and rendered by a standalone/offline device, while other aspects of the instant message sender's online personality may be accessed or updated through a remote device in a non-client/host environment, such as a LAN server serving end users or a mainframe serving terminal devices.

  FIG. 17 shows a communication system 1700 that includes an instant message sender system 1605, an instant message host system 1610, a communication link 1615, and an instant message recipient system 1620. The system 1700 illustrates another possible implementation of the communication system 1600 of FIG. 16, used to animate an avatar used for self-expression by an instant message sender.

  In contrast to the depiction of the instant message host system 1610 in FIG. 16, the instant message host system 1610 in FIG. 17 includes a login server 1770 for enabling access by instant message senders and for routing communications between the instant message sender system 1605 and other elements of the instant message host system 1610. The instant message host system 1610 also includes an instant message server 1790. To enable access to and facilitate interaction with the instant message host system 1610, the instant message sender system 1605 and the instant message recipient system 1620 may include communication software, such as an online service provider client application and/or an instant messaging client application.

  In one embodiment, the instant message sender system 1605 establishes a connection to the login server 1770 in order to access the instant message host system 1610 and begin instant messaging activity. The login server 1770 typically determines whether a particular instant message sender is authorized to access the instant message host system 1610 by verifying the sender's identity and password. If the instant message sender is authorized to access the instant message host system 1610, the login server 1770 typically employs a hashing technique on the instant message sender's screen name to identify the particular instant message server 1790 within the instant message host system 1610 to be used during the sender's session. The login server 1770 provides the instant message sender system 1605 with the Internet Protocol ("IP") address of that instant message server 1790, gives the instant message sender system 1605 an encrypted key, and breaks the connection. The instant message sender system 1605 then uses the IP address and the encrypted key to establish a connection to the particular instant message server 1790 over the communication link 1615 and gain access to that instant message server 1790. Typically, the instant message sender system 1605 establishes an open TCP connection to the instant message server 1790. The instant message recipient system 1620 establishes a connection to the instant message host system 1610 in a similar manner.
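  By way of illustration only, the hashing step described above might be sketched as follows in Python. The server pool, the addresses, and the function names are hypothetical and are not part of the described system; the sketch merely shows how hashing a screen name can deterministically select one instant message server from a pool.

```python
import hashlib

# Hypothetical pool of instant message servers (addresses are illustrative).
IM_SERVERS = ["10.0.0.11", "10.0.0.12", "10.0.0.13", "10.0.0.14"]

def normalize(screen_name: str) -> str:
    """Screen names are typically compared case- and space-insensitively."""
    return screen_name.replace(" ", "").lower()

def assign_server(screen_name: str, servers=IM_SERVERS) -> str:
    """Hash the screen name to deterministically pick one server.

    The same user is always routed to the same server while the pool
    is unchanged, which keeps that user's session state in one place.
    """
    digest = hashlib.sha256(normalize(screen_name).encode("utf-8")).digest()
    index = int.from_bytes(digest[:4], "big") % len(servers)
    return servers[index]
```

  Because the assignment depends only on the normalized screen name, the login server need not remember prior assignments; recomputing the hash reproduces them.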

  In one embodiment, the instant message host system 1610 also includes a user profile server (not shown) connected to a database (not shown) for storing large amounts of user profile data. The user profile server may be used to enter, retrieve, edit, manipulate, or otherwise process user profile data. In one embodiment, an instant message sender's profile data includes, for example, the instant message sender's screen name, buddy list, identified interests, and geographic location. The instant message sender's profile data may also include self-expression items selected by the instant message sender. The instant message sender may enter, edit, and/or delete profile data using an instant messaging client application installed on the instant message sender system 1605 to interact with the user profile server.

  Because the instant message sender's data is stored in the instant message host system 1610, the instant message sender does not have to re-enter or update such information even when the instant message sender accesses the instant message host system 1610 using a new or different instant message sender system 1605. Accordingly, when an instant message sender accesses the instant message host system 1610, the instant message server can instruct the user profile server to retrieve the instant message sender's profile data from the database and to provide, for example, the instant message sender's self-expression items and buddy list. Alternatively, user profile data may be stored locally on the instant message sender system 1605.

  FIG. 18 shows another example communication system 1800 that can exchange communications between users that project avatars for self-expression. The communication system 1800 includes an instant message sender system 1605, an instant message host system 1610, a communication link 1615, and an instant message recipient system 1620.

  The host system 1610 includes instant messaging server software 1832 that routes communications between the instant message sender system 1605 and the instant message recipient system 1620. The instant messaging server software 1832 can make use of user profile data 1834. The user profile data 1834 includes an indication of the self-expression items selected by an instant message sender. The user profile data 1834 also includes an association 1834a of an avatar model with a user (e.g., an instant message sender). The user profile data 1834 may be stored, for example, in a database or in another type of data collection, such as a set of extensible markup language (XML) files. In some implementations, some portions of the user profile data 1834 may be stored in a database, while other portions, such as the association 1834a of an avatar model with a user, may be stored in an XML file.

  One implementation of the user profile data 1834 appears in Table 1 below. In this implementation, the user profile data includes a screen name that uniquely identifies the user to whom the profile data applies, a password for signing on to the instant messaging service, an avatar associated with the user, and any online personalities of the user. As shown in Table 1, a user can have multiple online personalities, each with the same or a different avatar.
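  A minimal sketch of the profile structure just described might look as follows in Python. The screen name, avatar names, and personality names are hypothetical placeholders (Table 1 itself is not reproduced here); the sketch only illustrates that one user record can carry a default avatar plus several personalities, each with the same or a different avatar.

```python
from dataclasses import dataclass, field

@dataclass
class Personality:
    name: str
    avatar: str  # name of the avatar model associated with this personality

@dataclass
class UserProfile:
    screen_name: str       # uniquely identifies the user
    password: str          # sign-on credential (stored hashed in practice)
    avatar: str            # default avatar associated with the user
    personalities: list = field(default_factory=list)

# Hypothetical example record.
profile = UserProfile(
    screen_name="chattingchuck",
    password="********",
    avatar="dolphin",
    personalities=[
        Personality("Work", "robot"),      # a different avatar
        Personality("Casual", "dolphin"),  # same avatar as the default
    ],
)
```

  Such a record maps naturally onto either a database row with a child table of personalities or a per-user XML file, consistent with the storage options described above.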

  The host system 1610 also includes an avatar model storage location 1835 where avatar definitions that can be used in the instant messaging service are stored. In this embodiment, the avatar definition includes an avatar model file, an avatar expression file for storing instructions for controlling the animation of the avatar, and a wallpaper file. Accordingly, the avatar model storage location 1835 includes an avatar model file 1836, an avatar expression file 1837, and an avatar wallpaper file 1838.

  The avatar model files 1836 define the appearance and animations of each of the avatars included in the avatar model storage location 1835. Each of the avatar model files 1836 defines the mesh, texture, lighting, sounds, and animations used to render the avatar. The mesh of a model file defines the form of the avatar, and the texture defines the image that covers the mesh. The mesh may be represented as a wire structure composed of a large number of polygons that may be geometrically transformed to create the illusion of motion when the avatar is displayed. In one implementation, the lighting information in an avatar model file takes the form of a light map that portrays the effect of a light source on the avatar. An avatar model file also includes multiple animation identifiers. Each animation identifier identifies a particular animation that may be played for the avatar. For example, each animation identifier may identify one or more morph targets that describe the changes to be made to the avatar mesh, as well as changes in the camera perspective used to display the avatar.

  Because an instant messaging user projects an avatar for self-expression, it may be desirable to define an avatar with a large number of animations, including facial animations, so that more types of animation are available to the user for self-expression. It also may be desirable for the facial animations to use a large number of blend shapes, which can result in an avatar that appears more expressive when rendered. A blend shape defines a portion of the avatar that may be animated; in general, the more blend shapes that are defined for an animation model, the more expressive the images rendered from that model may appear.

  Various data management techniques may be used to implement the avatar model files. In some implementations, the information for defining an avatar may be stored in multiple avatar files that may be arranged in a hierarchical structure, such as a directory structure. In such a case, the association between a user and an avatar may be made through the user's association with the root file in the model file directory for that avatar.

  In one implementation, an avatar model file may include all possible appearances of an avatar, including the different features and accessories that are available for user customization. In such a case, the user's preferences for the appearance of the user's avatar may be stored as a set of flags or other indicators, each indicating whether the corresponding feature or accessory is to be displayed. By way of example, an avatar model may be configured to display sunglasses, reading glasses, short hair, and long hair. When the user configures the avatar to wear sunglasses and long hair, the sunglasses and long-hair features are turned on, the reading-glasses and short-hair features are turned off, and subsequent renderings of the avatar display it with long hair and sunglasses.
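  The flag-based appearance configuration described above can be sketched as follows; the feature names and function name are hypothetical, taken only from the example given in the text.

```python
# All features the avatar model is able to render; names are illustrative.
MODEL_FEATURES = ["sunglasses", "reading_glasses", "short_hair", "long_hair"]

def configure_appearance(enabled_features):
    """Return one boolean flag per model feature: True means render it."""
    unknown = set(enabled_features) - set(MODEL_FEATURES)
    if unknown:
        raise ValueError(f"model cannot render: {sorted(unknown)}")
    return {feature: feature in enabled_features for feature in MODEL_FEATURES}

# The example from the text: sunglasses and long hair turned on.
flags = configure_appearance({"sunglasses", "long_hair"})
```

  A renderer would then consult these flags at draw time, displaying only the parts of the model whose flag is set.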

  The avatar model storage location 1835 also includes the avatar expression files 1837. Each of the avatar expression files 1837 defines triggers that cause animations of the avatar to be played. For example, as described previously with respect to FIGS. 3 and 4, each of the avatar expression files 1837 may define text triggers that cause an animation to be played when the text trigger is identified in an instant message. An avatar expression file also may store associations between out-of-band communication indicators and animations that are played when a particular out-of-band communication indicator is detected. One example of a portion of an avatar expression file is depicted in Table 2 below.

  In some implementations, the association between a particular animation and a particular trigger or out-of-band communication indicator may be indirect. For example, as shown in Table 2, a particular trigger or out-of-band communication indicator may be associated with a type of animation (such as a smile, a goodbye, or sleep). As shown in Table 3 below, a type of animation may in turn be associated with a particular animation identifier included in a particular avatar model file. In such a case, to play an animation based on a particular trigger or out-of-band communication indicator, the type of animation is identified, the animation identifier for the identified type of animation is determined, and the animation identified by the animation identifier is played. Other computer animation and programming techniques may also be used. For example, each avatar may use the same animation identifier for a particular animation type, rather than including the avatar name shown in the table. Alternatively or additionally, the association between animation types and animation identifiers may be stored separately for each avatar.
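  The two-step, indirect resolution just described (trigger → animation type → animation identifier) might be sketched as follows. The trigger strings, type names, and identifiers are hypothetical stand-ins for the entries of Tables 2 and 3, which are not reproduced here.

```python
# Hypothetical expression-file entries: trigger text -> animation type.
TRIGGER_TO_TYPE = {
    ":)": "smile",
    "brb": "away",
    "zzz": "sleep",
}

# Hypothetical per-model table: animation type -> animation identifier
# within one avatar model file.
TYPE_TO_IDENTIFIER = {
    "smile": "anim_012",
    "away": "anim_031",
    "sleep": "anim_044",
}

def resolve_animation(message_text,
                      trigger_to_type=TRIGGER_TO_TYPE,
                      type_to_identifier=TYPE_TO_IDENTIFIER):
    """Resolve the first trigger found in a message to an animation id.

    The resolution is indirect (trigger -> type -> identifier), so each
    avatar model can map the same animation type to its own animation.
    """
    for trigger, animation_type in trigger_to_type.items():
        if trigger in message_text:
            return type_to_identifier.get(animation_type)
    return None  # no trigger present: leave the avatar in its idle behavior
```

  Swapping in a different avatar's `TYPE_TO_IDENTIFIER` table changes which concrete animation plays without touching the trigger definitions, which is the point of the indirection.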

  The avatar expression files 1837 also include information defining how an avatar responds to an animation of another avatar. In one implementation, an avatar expression file includes pairs of animation identifiers. One animation identifier of each pair identifies a type of animation that, when played for one avatar, causes the animation identified by the other animation identifier of the pair to be played by another avatar. In this manner, the avatar expression file can define an animation to be played for the instant message recipient's avatar in response to an animation played by the instant message sender's avatar. In some implementations, the avatar expression files 1837 may include XML files having elements for defining the text triggers for each of the animations of the corresponding avatar, and elements for defining the animations to be played in response to animations seen from another avatar.
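  The pairing mechanism described above might be sketched as follows; the pair entries are hypothetical examples, not taken from an actual expression file.

```python
# Hypothetical pairs from an expression file: when the animation type on
# the left is played for one avatar, the type on the right is played in
# response by the other avatar.
REACTION_PAIRS = [
    ("throw_water", "duck"),
    ("wave", "wave"),
]

def reactions_for(played_type, pairs=REACTION_PAIRS):
    """Return the animation types another avatar plays in response."""
    return [reply for trigger_type, reply in pairs if trigger_type == played_type]
```

  A client rendering the recipient's avatar would call such a lookup whenever the sender's avatar finishes an animation, then resolve the returned type to a concrete animation identifier for the recipient's model.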

  The avatar model storage location 1835 also includes the avatar wallpaper files 1838 that define the wallpaper over which an avatar is drawn. The wallpaper may be defined using the same or a different type of file structure than the avatar model files. For example, an avatar model file may be defined as an animation model file that is generated and playable using animation software from Viewpoint Corporation of New York, New York, while the wallpaper files may take the form of Macromedia Flash files that are generated and playable using animation software available from Macromedia, Inc. of San Francisco, California. The avatar wallpaper files 1838 also may include one or more triggers for animating the wallpaper, where the wallpaper includes animated objects that are activated by an instant message, an out-of-band communication indicator, or an animation of an avatar.

  Each of the instant message sender system 1605 and the instant message recipient system 1620 includes an instant messaging communication application 1807 or 1827, respectively, that can exchange instant messages with the instant message host system 1610 over the communication link 1615. The instant messaging application 1807 or 1827 also may be referred to as an instant messaging client.

  Each of the instant message sender system 1605 and the instant message recipient system 1620 also includes avatar data 1808 or 1828, respectively. The avatar data 1808 or 1828 includes avatar model files 1808a or 1828a, avatar expression files 1808b or 1828b, and avatar wallpaper files 1808c or 1828c for the avatars that can be rendered by the instant message sender system 1605 or the instant message recipient system 1620, respectively. The avatar data 1808 or 1828 may be stored in persistent storage, transient storage, or a combination of persistent and transient storage. When all or some of the avatar data 1808 or 1828 is stored in persistent storage, it may be useful to associate with the data a predetermined date on which some or all of the avatar data is to be deleted from the instant message sender system 1605 or the instant message recipient system 1620, respectively. In this manner, avatar data that has resided on the instant message sender system 1605 or the instant message recipient system 1620 for a predetermined length of time, and that is probably no longer needed, can be removed. This may help reduce the amount of storage space used for instant messaging on the instant message sender system 1605 or the instant message recipient system 1620.
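  A minimal sketch of the deletion-date bookkeeping described above follows. The cache layout, the one-week lifetime, and the function names are hypothetical; the sketch only shows avatar data being stamped with an expiry date and purged once that date passes.

```python
import time

# Illustrative lifetime: one week. Any predetermined period would do.
LIFETIME_SECONDS = 7 * 24 * 3600

def cache_avatar(cache, name, data, now=None):
    """Store avatar data together with the date on which to delete it."""
    now = time.time() if now is None else now
    cache[name] = (data, now + LIFETIME_SECONDS)

def purge_expired(cache, now=None):
    """Delete cached avatar data whose deletion date has passed."""
    now = time.time() if now is None else now
    for name in [n for n, (_, expires) in cache.items() if expires <= now]:
        del cache[name]
```

  A client could run such a purge at startup or sign-off, reclaiming space from avatars of buddies the user no longer communicates with.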

  In one implementation, the avatar data 1808 or 1828 is installed on the instant message sender system 1605 or the instant message recipient system 1620, respectively, along with the instant messaging client software installed on that system. In another implementation, the avatar data 1808 or 1828 is transmitted to the instant message sender system 1605 or the instant message recipient system 1620, respectively, from the avatar model storage location 1835 of the instant message host system 1610. In yet another implementation, the avatar data 1808 or 1828 is copied from a source unrelated to instant messaging and stored on the instant message sender system 1605 or the instant message recipient system 1620, respectively, for use as an instant messaging avatar. In yet another implementation, the avatar data 1808 or 1828 is sent to the instant message sender system 1605 or the instant message recipient system 1620, respectively, with, or in association with, an instant message. The avatar data sent with an instant message corresponds to the instant message sender who sent the message.

  The avatar expression files 1808b or 1828b are used to determine when the avatar is to be animated on the instant message sender system 1605 or the instant message recipient system 1620, respectively. To render an avatar, one of the avatar model files 1808a or 1828a is displayed on the two-dimensional display of the instant messaging system 1605 or 1620 by an avatar model player 1809 or 1829, respectively. In one implementation, the avatar model player 1809 or 1829 is an animation player from Viewpoint Corporation. More particularly, the processor of the instant messaging system 1605 or 1620 calls the avatar model player 1809 or 1829 and identifies an animation included in one of the avatar model files 1808a or 1828a. In general, the animation is identified by an animation identifier in the avatar model file. The avatar model player 1809 or 1829 then accesses the avatar model file and plays the identified animation.

  In many cases, multiple animations may be played based on a single text trigger or out-of-band communication indicator. This occurs, for example, when one avatar reacts to the animation of another avatar that is animated based on a text trigger, as described previously with respect to FIG. 6.

  In the system 1800, four animations may be separately initiated based on a single text trigger in one instant message. An instant message sender projecting a self-expressive avatar uses the instant message sender system 1605 to send a text message to an instant message recipient using the instant message recipient system 1620. The instant message recipient also projects a self-expressive avatar. The display of the instant message sender system 1605 shows an instant message user interface, such as the user interface 100 of FIG. 1, as does the display of the instant message recipient system 1620. Thus, the sender avatar is shown on both the instant message sender system 1605 and the instant message recipient system 1620, as is the recipient avatar. The instant message sent from the instant message sender system includes a text trigger that causes the animation of the sender avatar on both the instant message sender system 1605 and the instant message recipient system 1620. As described previously with respect to FIG. 6, the recipient avatar is animated in response to the animation of the sender avatar. The responsive animation of the recipient avatar occurs in both the recipient avatar displayed on the instant message sender system 1605 and the recipient avatar displayed on the instant message recipient system 1620.

  In some implementations, an instant messaging user may be permitted to customize one or more of: the triggers or out-of-band communication indicators for avatar animations, the wallpaper displayed for the avatar, the triggers or out-of-band communication indicators for animating objects of the wallpaper, and the appearance of the avatar. In one implementation, a copy of the avatar model file, expression file, or wallpaper file is made, and the user's modifications are saved in the copy. The copy containing the modifications is then associated with the user. Alternatively or additionally, only the changes, that is, the differences between the avatar before modification and the avatar after modification, are saved. In some implementations, different versions of the same avatar may be saved and associated with a user. This enables a user to modify an avatar, use the modified avatar for a period of time, and then return to using a previous version of the avatar that does not include the modifications.
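  The differences-only alternative mentioned above might be sketched as follows; the appearance keys are hypothetical, and in practice the diff would be taken over the model, expression, or wallpaper file rather than a small dictionary.

```python
def diff_customization(base, customized):
    """Store only the settings the user changed relative to the base avatar."""
    return {k: v for k, v in customized.items() if base.get(k) != v}

def apply_customization(base, changes):
    """Rebuild the customized avatar from the base plus the saved diff."""
    merged = dict(base)
    merged.update(changes)
    return merged

# Hypothetical appearance settings.
base = {"hair": "short", "sunglasses": False}
customized = {"hair": "long", "sunglasses": False}
changes = diff_customization(base, customized)
```

  Reverting to the previous version then costs nothing: the client simply ignores the saved diff and renders the base avatar.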

  In some implementations, the avatars from which a user may choose may be limited by the instant messaging service provider. This may be referred to as a closed, or locked-down, implementation. In such an implementation, the animations and triggers of each avatar within the closed set of avatars may be preconfigured. In some closed implementations, the user may nevertheless customize the animations and/or triggers of a selected avatar. For example, a user may include a favorite video clip as an animation of an avatar, and the avatar may be configured to play that video clip after a particular text trigger appears in a message sent by the user. In other closed implementations, the user is precluded from adding animations to an avatar.

  In some implementations, the set of avatars from which a user may choose is not limited by the instant messaging service provider, and the user may use an avatar other than one provided by the instant messaging service provider. This may be referred to as an open, or unlocked, implementation. For example, an avatar for use in the instant messaging service may be created by the user using animation software provided by the instant messaging service provider, off-the-shelf computer animation software, or software tools provided by a third party that specializes in creating avatars for one or more instant messaging services.

  In some implementations, a combination of closed and open implementations may be used. For example, an instant messaging service provider may restrict the selection by users who are minors to a predetermined set of avatars provided by the instant messaging service provider, while permitting adult users to use avatars other than those available from the instant messaging service provider.

  In some implementations, the avatars from which a user may choose may be limited based on a user characteristic, such as age. As illustrated in Table 4 below, and using the avatars shown in FIG. 8 only by way of example, users who are under the age of 10 may be limited to one group of avatars. Users who are between 10 and 18 may be limited to a different group of avatars, some of which are the same as the avatars selectable by users under the age of 10. Users who are 18 or older may select from any avatar available from the instant messaging provider's service.
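  The age-based limitation described above might be sketched as follows. The avatar names and the particular group memberships are hypothetical (Table 4 is not reproduced here); the sketch only shows the three age brackets, with some overlap between the first two groups and the oldest bracket free to choose any avatar.

```python
# Hypothetical avatar groups keyed by age bracket; names are illustrative.
AVATARS_UNDER_10 = {"dolphin", "frog", "robot"}
AVATARS_10_TO_17 = {"dolphin", "wizard", "rock_star"}  # shares "dolphin"
ALL_AVATARS = AVATARS_UNDER_10 | AVATARS_10_TO_17 | {"devil", "secret_agent"}

def selectable_avatars(age):
    """Return the avatars a user of the given age may choose from."""
    if age < 10:
        return AVATARS_UNDER_10
    if age < 18:
        return AVATARS_10_TO_17
    return ALL_AVATARS
```

  The service would consult the user's profile data for the age and filter the avatar-selection interface accordingly.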

  Instant messaging programs typically enable instant message senders to communicate with one another in real time in a variety of ways. For example, many instant messaging programs enable instant message senders to send text as instant messages, to transfer files, and to communicate by voice. Examples of instant messaging applications include AIM (America Online Instant Messenger), AOL (America Online) Buddy List and Instant Messages, Yahoo Messenger, MSN Messenger, and ICQ. Although discussed above primarily with respect to instant messaging applications, other implementations may provide similar functionality in other platforms and online applications. For example, the techniques and concepts may be applied to an animated avatar that acts as an information assistant to convey news, weather, or other information to a user of a computer system or information device.

  The techniques and concepts generally have been described in the context of an instant messaging system that uses an instant messaging host system to facilitate instant messaging communication between instant message senders and instant message recipients. Other instant message implementations are contemplated, such as an instant messaging service in which instant messages are exchanged directly between an instant message sender system and an instant message recipient system.

  Moreover, although the examples above are given in an instant messaging context, other communication systems with similar attributes may be used. For example, multiple personalities may be used in a chat room or in e-mail communications. Also, the user interface may be a viewable interface, an audible interface, a tactile interface, or a combination of these.

  Other embodiments are within the scope of the following claims.

A diagram of a user interface for an instant messaging service that allows a user to project an avatar for self-expression.
A diagram of a user interface for an instant messaging service that allows a user to project an avatar for self-expression.
A flowchart of a process for animating an avatar based on the content of an instant message.
A block diagram illustrating exemplary animations of an avatar and the text triggers for each animation.
A diagram of a user interface for an instant messaging service that allows a user to project an avatar for self-expression.
A diagram of an example process involved in communications between two instant messaging client systems and an instant messaging host system, whereby an avatar of a user of one of the instant messaging client systems is animated based on the animation of an avatar of a user of the other instant messaging client system.
A flowchart of a process for selecting an avatar and optionally customizing it.
A block diagram depicting examples of avatars that may be projected by a user for self-expression.
A diagram of a user interface for customizing the appearance of an avatar.
A diagram of a user interface used to present a brief description of an avatar.
A block diagram illustrating the relationships among an online personality, avatars, avatar behaviors, and avatar appearances.
A flowchart of a process for using a different online personality to communicate with each of two instant message recipients.
A diagram of a user interface that enables an instant message sender to select among available online personalities.
A diagram of an exemplary user interface for enabling an instant message sender to create and store an online personality that includes an avatar for self-expression.
A flowchart of a process for enabling a user to change an online personality that includes an avatar for self-expression.
A flowchart of a process for using an avatar to communicate an out-of-band message to an instant message recipient.
A diagram of an example communication system capable of enabling an instant messaging user to project an avatar for self-expression.
A diagram of an example communication system capable of enabling an instant messaging user to project an avatar for self-expression.
A diagram of an example communication system capable of enabling an instant messaging user to project an avatar for self-expression.

Claims (163)

  1. A graphical user interface configured for display on a display device, comprising:
    a sender portion that displays a sender avatar capable of displaying multiple animations;
    a message composition area capable of displaying text included in a message sent from a sender to a recipient; and
    a plurality of communication controls, at least one of which is operable to receive an indication that the message displayed in the message composition area is to be sent from the sender to the recipient,
    wherein the sender avatar is animated in response to an activation trigger associated with the content of the message sent from the sender to the recipient.
  2.   The graphical user interface of claim 1, wherein the instant message sender display includes a recipient portion that displays a recipient avatar capable of displaying multiple animations in response to an activation trigger associated with the content of a message sent from the sender to the recipient, and a message history area capable of displaying the contents of multiple messages sent between the sender and the recipient and of identifying the identity of the recipient.
  3.   The graphical user interface of claim 2, wherein the recipient avatar is animated in response to the sender avatar.
  4.   The graphical user interface of claim 1 including a contact list display for displaying potential recipients.
  5.   The graphical user interface of claim 4, wherein the contact list display indicates whether each potential recipient is available to receive a message.
  6.   The graphical user interface of claim 4, wherein the potential recipients are grouped and associated with an indicator of group identity.
  7.   The graphical user interface of claim 4, wherein a potential recipient displayed in the contact list is associated with a potential recipient avatar, the potential recipient avatar is displayed on the contact list in association with the identity of the potential recipient, and the potential recipient avatar on the contact list is animated in response to an animation of the potential recipient avatar displayed elsewhere.
  8.   The graphical user interface of claim 7, wherein the animation of the potential recipient avatar on the contact list includes an animation that is substantially the same as the animation of the potential recipient avatar displayed elsewhere.
  9.   The graphical user interface of claim 7, wherein the animation of the potential recipient avatar on the contact list includes an animation that is different from the animation of the potential recipient avatar displayed elsewhere.
  10.   8. The graphical user interface of claim 7, wherein the animation of the potential recipient avatar on the contact list includes an animation representing the animation of the potential recipient avatar displayed elsewhere. .
  11.   The graphical user interface of claim 1 used for instant messaging activities.
  12.   The graphical user interface of claim 1, wherein the activation trigger includes a portion of the text of the message.
  13.   The graphical user interface of claim 1, wherein the activation trigger includes all of the text of the message.
  14.   The graphical user interface of claim 1, wherein an appearance or animation of the sender avatar indicates an environmental condition associated with the sender.
  15.   The graphical user interface of claim 1, wherein the appearance or animation of the sender avatar exhibits personality characteristics associated with the sender.
  16.   The graphical user interface of claim 1, wherein an appearance or animation of the sender avatar indicates an emotional situation associated with the sender.
  17.   The graphical user interface of claim 1, wherein the appearance or animation of the sender avatar indicates a setting characteristic associated with the sender.
  18.   The graphical user interface of claim 1, wherein an appearance or animation of the sender avatar indicates activity related to the sender.
  19.   The graphical user interface of claim 1, wherein the sender avatar is animated after the passage of a predetermined period of time during which the sender does not communicate a message to the recipient.
  20.   The graphical user interface of claim 1, wherein the sender avatar is animated after the passage of a predetermined period of time during which the sender does not use the information device that the sender uses to communicate with the recipient in the communication activity.
  21.   The graphical user interface of claim 1, wherein the avatar animation used as a communication path includes a breakout animation involving the display of an avatar outside the normal display space occupied by the avatar.
  22.   The graphical user interface of claim 1, wherein the sender avatar is animated to generate sound used for verbal communication.
  23.   A graphical user interface according to any one of claims 1 to 22, generated by executing a computer program product.
  24. An apparatus for generating a graphical user interface configured for display on a display device, comprising a processor coupled to one or more input components and one or more output components, wherein the processor is configured to:
    generate a sender portion that displays a sender avatar capable of displaying multiple animations;
    generate a message composition area capable of displaying text included in a message transmitted from a sender to a recipient; and
    generate multiple communication controls, wherein at least one of the communication controls is operable to receive an indication that the message displayed in the message composition area is to be transmitted from the sender to the recipient,
    wherein the sender avatar is animated in response to an activation trigger related to the content of a message sent from the sender to the recipient.
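The activation-trigger behavior recited in the claims above (an animation plays when a trigger, such as part or all of the message text, appears in an outgoing message) can be sketched as follows. This is a minimal illustration, not the patent's implementation; all class, method, and trigger names are hypothetical.

```python
# Minimal sketch: an avatar animation fires when trigger text appears
# in a message sent from the sender to the recipient.

class SenderAvatar:
    def __init__(self, triggers):
        # triggers maps trigger text (a portion of a message) to an
        # animation identifier for this avatar
        self.triggers = triggers
        self.played = []

    def on_message_sent(self, text):
        """Play every animation whose trigger appears in the message."""
        lowered = text.lower()
        for trigger, animation in self.triggers.items():
            if trigger in lowered:
                self.played.append(animation)

avatar = SenderAvatar({"lol": "laugh", "brb": "wave"})
avatar.on_message_sent("That's hilarious, LOL!")
print(avatar.played)  # ['laugh']
```

A real client would also handle the non-text triggers the claims mention, such as idle-time triggers that animate the avatar after a predetermined period of inactivity.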
  25. A communication method comprising:
    graphically representing a first user in a communication activity involving the first user and a second user, using an avatar capable of being animated;
    communicating a message between the first user and the second user, the message conveying explicit information from the first user to the second user; and
    communicating out-of-band information to the second user using a change in the appearance of the avatar or an animation of the avatar as a communication path,
    wherein the out-of-band communication includes communication that is related to the situation of the first user and that differs from the information conveyed in the messages transmitted between the first user and the second user.
  26.   26. The method of claim 25, wherein the communication activity is an instant message communication activity.
  27.   The method of claim 25, wherein the avatar includes an animation of a face and ears without a torso or legs.
  28.   The method of claim 25, wherein the avatar includes an animation of a face, ears, and a neck without a torso or legs.
  29.   26. The method of claim 25, wherein the out-of-band information includes information indicating an environmental condition related to the first user.
  30.   30. The method of claim 29, wherein the environmental condition comprises an environmental condition associated with weather occurring at a geographical location near the first user.
  31.   26. The method of claim 25, wherein the out-of-band information includes information indicative of personality characteristics associated with the first user.
  32.   26. The method of claim 25, wherein the out-of-band information includes information indicating an emotional state related to the first user.
  33.   The method of claim 25, wherein the out-of-band information includes information indicating a setting characteristic related to the first user.
  34.   The method of claim 33, wherein the setting features include features related to the time of day of the first user.
  35.   34. The method of claim 33, wherein the setting features include features related to time of year.
  36.   36. The method of claim 35, wherein the time of the year includes holidays.
  37.   36. The method of claim 35, wherein the time of the year includes a season, the season being one of spring, summer, autumn, or winter.
  38.   34. The method of claim 33, wherein the setting features include features related to work settings.
  39.   34. The method of claim 33, wherein the setting features include features related to entertainment settings.
  40.   40. The method of claim 39, wherein the entertainment setting comprises a sandy beach setting or a tropical setting.
  41.   40. The method of claim 39, wherein the entertainment setting comprises a winter sports setting.
  42.   26. The method of claim 25, wherein out-of-band information includes information related to the first user's mood.
  43.   43. The method of claim 42, wherein the mood of the first user includes one of happy, sad, or angry.
  44.   The method of claim 25, wherein the out-of-band information includes information related to an activity of the first user.
  45.   The method of claim 44, wherein the activity is performed by the first user at substantially the same time that the out-of-band message is communicated from the first user to the second user.
  46.   46. The method of claim 45, wherein the activity includes one of working or listening to music.
  47.   30. The method of claim 29, wherein out-of-band information includes information conveying that the first user has muted sound associated with the avatar.
  48.   The method of claim 25, further comprising activating an animation of the avatar based on the information conveyed in the message from the first user to the second user, to convey the out-of-band information from the first user to the second user.
  49.   49. The method of claim 48, wherein the activation trigger includes a portion of text.
  50.   49. The method of claim 48, wherein the activation trigger includes all of the text of the message.
  51.   49. The method of claim 48, wherein the activation trigger includes an acoustic portion of the message.
  52.   49. The method of claim 48, wherein the activation trigger includes a predetermined length of time that the first user has not communicated a message to the second user.
  53.   The method of claim 48, wherein the activation trigger includes the elapse of a predetermined length of time during which the first user does not use the information device that the first user uses to communicate with the second user in the communication activity.
  54.   26. The method of claim 25, wherein the avatar animation used as the communication path includes facial expressions of the avatar.
  55.   26. The method of claim 25, wherein the avatar animation used as the communication path includes a gesture made by a hand of the avatar or a gesture made by the arm of the avatar.
  56.   26. The method of claim 25, wherein the avatar animation used as the communication path includes movement of the avatar's torso.
  57.   26. The method of claim 25, wherein the avatar animation used as the communication path includes sound created by the avatar.
  58.   The method of claim 57, wherein at least some of the sounds comprise voices based on the voice of the first user.
  59.   26. The method of claim 25, wherein the avatar animation used as the communication path comprises a breakout animation involving the display of an avatar outside the normal display space occupied by the avatar.
  60.   60. The method of claim 59, wherein the breakout animation includes nesting the avatar.
  61.   60. The method of claim 59, wherein the breakout animation includes changing the size of the avatar.
  62.   60. The method of claim 59, wherein the breakout animation includes changing the position of the avatar.
  63.   The method of claim 25, further comprising: providing a number of preconfigured avatars with preselected animations; and allowing the first user to select a particular avatar to represent the first user in the communication activity.
  64.   64. The method of claim 63, further comprising permanently associating the first user with the selected avatar to represent the first user in subsequent communication activities.
  65.   64. The method of claim 63, further comprising allowing the first user to modify the appearance of the avatar.
  66.   The method of claim 65, wherein allowing the first user to modify the appearance of the avatar comprises allowing the first user to use a slide bar to indicate a particular modification of a particular feature of the avatar.
  67.   The method of claim 65, wherein allowing the first user to modify the appearance of the avatar comprises allowing the first user to modify the appearance of the avatar to reflect a characteristic of the first user.
  68.   68. The method of claim 67, wherein the first user characteristic includes one of age, hair color, eye color, or facial features.
  69.   The method of claim 65, wherein allowing the first user to modify the appearance of the avatar comprises allowing the first user to add, change, or delete an accessory displayed with the avatar.
  70.   The method of claim 69, wherein the accessory includes one of glasses, sunglasses, a hat, or earrings.
  71.   26. The method of claim 25, further comprising allowing the first user to modify an activation trigger used to activate the avatar animation.
  72.   72. The method of claim 71, wherein the activation trigger includes text included in the message sent from the first user to the second user.
  73.   26. The method of claim 25, further comprising animating the avatar for use as an informational supplement for conveying information to the first user.
  74.   26. The method of claim 25, further comprising allowing the avatar to be used by an application other than a communication application.
  75.   The method of claim 74, wherein enabling the use of the avatar by an application other than a communication application comprises enabling the use of the avatar in an online journal.
  76.   The method of claim 25, further comprising displaying a depiction of the avatar in a manner substantially similar to a trading card.
  77.   The method of claim 76, wherein the trading-card depiction of the avatar includes a rendering of the avatar together with characteristics related to the first user.
  78.   The method of any one of claims 25 to 77, wherein the steps are performed by a computer program embodied on a computer-readable medium or a propagated signal and configured to enable communication.
  79. An apparatus for communication comprising a processor connected to a storage device and one or more input/output devices, wherein the processor is configured to:
    graphically represent a first user in a communication activity involving the first user and a second user, using an avatar capable of being animated;
    communicate, between the first user and the second user, a message conveying explicit information from the first user to the second user; and
    communicate out-of-band information to the second user using a change in the appearance of the avatar or an animation of the avatar as a communication path,
    wherein the out-of-band communication includes communication that is related to the situation of the first user and that differs from the information conveyed in the messages transmitted between the first user and the second user.
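The out-of-band mechanism claimed above carries the sender's situation (environment, mood, activity) through the avatar's appearance rather than through message text. A hypothetical sketch, with all field and overlay names invented for illustration:

```python
# Sketch: derive an avatar appearance overlay from out-of-band
# situation data, separate from any message content.

def avatar_appearance(base_avatar, situation):
    """Map situation data to appearance overlays on the avatar."""
    overlays = []
    if situation.get("weather") == "rain":
        overlays.append("umbrella")            # environmental condition
    if situation.get("mood") == "sad":
        overlays.append("frown")               # emotional state
    if situation.get("activity") == "listening_to_music":
        overlays.append("headphones")          # current activity
    return {"avatar": base_avatar, "overlays": overlays}

state = avatar_appearance("fox", {"weather": "rain", "mood": "sad"})
print(state["overlays"])  # ['umbrella', 'frown']
```

The point of the design is that the recipient perceives the sender's situation without the sender typing it into any message.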
  80. A computer-implemented method for enabling recognition of multiple online personalities in an instant messaging activity, comprising:
    identifying at least two identities in a communication environment to which messages can be directed;
    enabling a first personality of a user to be projected to a first of the identities; and
    enabling a second personality of the same user to be projected concurrently to a second of the identities,
    wherein the first and second personalities each include an avatar capable of being animated, and
    wherein the first personality and the second personality differ.
  81.   The method of claim 80, wherein the first personality differs from the second personality in that the first personality invokes an avatar different from the avatar invoked by the second personality.
  82. The method of claim 80, wherein the first personality invokes a first avatar;
    the second personality invokes a second avatar;
    the first avatar and the second avatar are the same avatar; and
    the animation associated with the first avatar differs from the animation associated with the second avatar.
  83. The method of claim 80, wherein the first personality invokes a first avatar;
    the second personality invokes a second avatar;
    the first avatar and the second avatar are the same avatar; and
    the appearance associated with the first avatar differs from the appearance associated with the second avatar.
  84.   The method of claim 80, wherein at least one of the avatars includes an avatar associated with multiple sounds.
  85.   The method of claim 80, wherein at least one of the avatars includes an avatar capable of being animated based on the text of messages transmitted in the instant messaging activity.
  86.   81. The method of claim 80, wherein at least one of the avatars includes an avatar that can be animated to transmit an out-of-band communication.
  87.   The method of claim 80, further comprising associating the first personality with a first group of identities, such that the first personality is projected during communication activities with members of the first group of identities.
  88.   The method of claim 87, further comprising associating the second personality with a second group of identities, such that the second personality is projected during communication activities with members of the second group of identities.
  89.   The method of claim 80, further comprising associating a personality with the first of the identities and associating a different personality with a group of identities to which the first of the identities belongs, wherein the first personality projected to the first of the identities comprises an amalgamation of the personality associated with the first of the identities and the different personality associated with the group of identities.
  90.   The method of claim 89, wherein the personality associated with the first of the identities overrides the different personality associated with the group of identities to the extent a conflict exists.
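The merge-and-override behavior of claims 89 and 90 (a per-identity personality amalgamated with a group personality, with the identity-level setting winning on conflict) can be sketched as a dictionary merge. All data structures and names here are assumptions made for illustration:

```python
# Sketch of claims 89-90: group personality settings apply to every
# member, but per-identity settings override them where they conflict.

def effective_personality(identity, identity_personalities,
                          group_personalities, groups):
    """Merge group and per-identity settings; identity wins on conflict."""
    merged = {}
    for group, members in groups.items():
        if identity in members:
            merged.update(group_personalities.get(group, {}))
    merged.update(identity_personalities.get(identity, {}))  # override
    return merged

groups = {"coworkers": ["alice", "bob"]}
group_p = {"coworkers": {"avatar": "suit", "wallpaper": "office"}}
identity_p = {"alice": {"avatar": "robot"}}
print(effective_personality("alice", identity_p, group_p, groups))
# {'avatar': 'robot', 'wallpaper': 'office'}
```

Here "alice" inherits the group wallpaper but her own avatar choice overrides the group avatar, matching the override rule in claim 90.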
  91.   The method of any one of claims 80 to 90, wherein the steps are performed by a computer program embodied on a computer-readable medium or a propagated signal and configured to enable recognition of multiple online personalities in an instant messaging activity.
  92. An apparatus for enabling recognition of multiple online personalities in an instant messaging activity, comprising a processor connected to a storage device and one or more input/output devices, wherein the processor is configured to:
    identify at least two identities in a communication environment to which messages can be directed; and
    enable a first personality of a user to be projected to a first of the identities while concurrently enabling a second personality of the same user to be projected to a second of the identities,
    wherein the first and second personalities each include an avatar capable of being animated, and
    wherein the first personality and the second personality differ.
  93.   The apparatus of claim 92, wherein the first personality differs from the second personality in that the first personality invokes an avatar different from the avatar invoked by the second personality.
  94. The apparatus of claim 92, wherein the first personality invokes a first avatar;
    the second personality invokes a second avatar;
    the first avatar and the second avatar are the same avatar; and
    the animation associated with the first avatar differs from the animation associated with the second avatar.
  95. The apparatus of claim 92, wherein the first personality invokes a first avatar;
    the second personality invokes a second avatar;
    the first avatar and the second avatar are the same avatar; and
    the appearance associated with the first avatar differs from the appearance associated with the second avatar.
  96.   The apparatus of claim 92, wherein at least one of the avatars includes an avatar capable of being animated based on the text of messages transmitted in the instant messaging activity.
  97.   94. The apparatus of claim 92, wherein at least one of the avatars includes an avatar that can be animated to transmit out-of-band communications.
  98. A computer-implemented method for enabling recognition of multiple online personalities in an instant messaging activity, comprising:
    providing an instant messaging application user interface on an instant message recipient system for an instant messaging communication activity involving at least one potential instant message recipient and a single potential instant message sender;
    sending a message that includes a text message and a personality selected from multiple possible personalities of the instant message sender to be displayed by the potential instant message recipient when the text message is displayed, wherein the selected personality includes a collection of one or more self-expression items and a sender avatar capable of being animated; and
    rendering the selected personality at the potential instant message recipient system when other parts of the message are rendered.
  99.   The method of claim 98, wherein the personality is selected by the instant message sender from the multiple possible personalities of the instant message sender.
  100.   The method of claim 98, wherein the personality is rendered before a communication is initiated by the potential instant message sender.
  101.   The method of claim 98, wherein the personality is rendered after a communication is initiated by the potential instant message sender.
  102.   The method of claim 98, wherein the self-expression items include one or more of a wallpaper, an emoticon, and a sound.
  103.   99. The method of claim 98, further comprising defining one or more personalities.
  104.   The method of claim 103, further comprising: assigning a first personality to a first potential instant message recipient, such that the first personality is thereafter automatically invoked and projected during instant messaging activities involving the first potential instant message recipient; and assigning a second personality to a second potential instant message recipient, such that the second personality is thereafter automatically invoked and projected during instant messaging activities involving the second potential instant message recipient.
  105.   The method of claim 104, further comprising: assigning a first personality to a first group of potential instant message recipients, such that the first personality is automatically invoked and projected during instant messaging activities involving members of the first group of potential instant message recipients; and assigning a second personality to a second potential instant message recipient, such that the second personality is thereafter automatically invoked and projected during instant messaging activities involving the second potential instant message recipient, wherein the second personality is at least partially distinguishable from the first personality.
  106.   The method of claim 98, further comprising disabling use of one of the multiple personalities.
  107.   The method of claim 106, wherein disabling use of one of the multiple personalities comprises disabling use of one of the multiple personalities based on the instant message recipient.
  108. The method of claim 98, wherein one of the multiple personalities includes a work personality related to the presence of the instant message sender at the workplace of the instant message sender, and
    one of the multiple personalities includes a home personality related to the presence of the instant message sender at home,
    the method further comprising: determining whether the instant message sender is at home or at the workplace;
    selecting the home personality for use in the instant messaging activity in response to a determination that the instant message sender is at home; and
    selecting the work personality for use in the instant messaging activity in response to a determination that the instant message sender is at the workplace.
  109.   99. The method of claim 98, further comprising selecting a personality to be displayed by the potential instant message recipient based on time of day.
  110.   99. The method of claim 98, further comprising selecting a personality to be displayed by the potential instant message recipient based on a day of the week.
  111.   The method of claim 98, further comprising selecting the personality to be displayed by the potential instant message recipient based on a group of potential instant message recipients to which the potential instant message recipient belongs.
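Claims 108 through 111 select a personality by sender location, time of day, day of week, or recipient group. A minimal rule-based sketch under assumed data shapes (the profile keys, rule order, and context fields are all illustrative, not taken from the patent):

```python
# Sketch of the personality-selection rules in claims 108-111: pick a
# personality from the sender's profile using the first matching rule.

def select_personality(profile, context):
    """Choose a personality based on location, time, or recipient group."""
    if context.get("location") == "work":       # claim 108: work setting
        return profile["work"]
    if context.get("location") == "home":       # claim 108: home setting
        return profile["home"]
    if context.get("hour", 12) >= 18:           # claim 109: time of day
        return profile.get("evening", profile["default"])
    group = context.get("recipient_group")
    if group in profile:                        # claim 111: recipient group
        return profile[group]
    return profile["default"]

profile = {"work": "formal", "home": "casual", "default": "neutral"}
print(select_personality(profile, {"location": "work"}))  # formal
```

A deployed client would likely let rules compose rather than short-circuit, but first-match is enough to show the selection idea.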
  112.   99. The method of claim 98, wherein at least some of the personality characteristics can be made transparent to the instant message sender.
  113.   99. The method of claim 98, wherein the sender avatar is animated to send an out-of-band communication from the instant message sender to the potential instant message recipient.
  114.   114. The method of claim 113, wherein the out-of-band communication includes a communication indicating an environmental condition related to the instant message sender.
  115.   115. The method of claim 114, wherein the environmental condition comprises an environmental condition associated with weather occurring at a geographic location near the instant message sender.
  116.   114. The method of claim 113, wherein the out-of-band communication includes communication exhibiting personality characteristics associated with the instant message sender.
  117.   114. The method of claim 113, wherein the out-of-band communication includes a communication that indicates an emotional state related to the instant message sender.
  118.   114. The method of claim 113, wherein the out-of-band communication includes a communication that indicates configuration characteristics related to the instant message sender.
  119.   The method of claim 118, wherein the setting features include features related to the time of day of the instant message sender.
  120.   The method of claim 118, wherein the setting features include features related to a time of year.
  121.   121. The method of claim 120, wherein the period of the year includes a holiday.
  122.   121. The method of claim 120, wherein the period of the year includes a season, the season being one of spring, summer, autumn, or winter.
  123.   The method of claim 118, wherein the setting features include features related to a work setting.
  124.   The method of claim 118, wherein the setting features include features related to an entertainment setting.
  125.   129. The method of claim 124, wherein the entertainment setting comprises a sandy beach setting or a tropical setting.
  126.   The method of claim 124, wherein the entertainment setting comprises a winter sports setting.
  127.   The method of any one of claims 98 to 126, wherein the steps are performed by a computer program embodied on a computer-readable medium or a propagated signal and configured to enable recognition of multiple online personalities in an instant messaging activity.
  128. An apparatus for enabling recognition of multiple online personalities in an instant messaging activity, comprising a processor connected to a storage device and one or more input/output devices, wherein the processor is configured to:
    provide an instant messaging application user interface on an instant message recipient system for an instant messaging communication activity involving at least one potential instant message recipient and a single potential instant message sender;
    send a message that includes a text message and a personality selected from multiple possible personalities of the instant message sender to be displayed by the potential instant message recipient when the text message is displayed, wherein the selected personality includes a collection of one or more self-expression items and a sender avatar capable of being animated; and
    render the selected personality at the potential instant message recipient system when other parts of the message are rendered.
  129.   A computer-implemented method for using an avatar to communicate, comprising graphically representing a user using an avatar capable of being animated, wherein the avatar is associated with multiple animations and multiple features of appearance that together represent a pattern of characteristics representing a personality of the avatar.
  130.   The method of claim 129, wherein the avatar is associated with a description that identifies the personality of the avatar.
  131.   The method of claim 129, wherein the personality of the avatar includes at least some characteristics that are distinct from at least some characteristics of a personality of the user.
  132. The method of claim 129, further comprising graphically representing a second user using a second avatar capable of being animated, wherein the second avatar is associated with multiple animations and multiple features of appearance that represent a pattern of characteristics representing a personality of the second avatar,
    wherein the personality of the second avatar includes at least some characteristics that are distinct from at least some characteristics of the personality of the first avatar, and
    wherein communication messages are transmitted between the first user and the second user.
  133.   The method of any one of claims 129 to 132, wherein the steps are performed by a computer program embodied on a computer-readable medium or a propagated signal and configured to use an avatar to communicate.
  134. An apparatus for using an avatar to communicate, comprising a processor connected to a storage device and one or more input/output devices, wherein the processor is configured to graphically represent a user using an avatar capable of being animated,
    wherein the avatar is associated with multiple animations and multiple features of appearance that represent a pattern of characteristics representing a personality of the avatar.
  135.   An apparatus for using an avatar to communicate, comprising means for graphically representing a user using an avatar capable of being animated, wherein the avatar is associated with multiple animations and multiple features of appearance that represent a pattern of characteristics representing a personality of the avatar.
  136. A computer-implemented method for animating a first avatar based on a recognized animation of a second avatar, comprising:
    Graphically representing the first user using a first avatar that can be animated;
    Graphically representing a second user using a second avatar that can be animated, wherein a communication message is transmitted between the first user and the second user;
    Receiving an indication of the animation of the first avatar;
    Animating the second avatar in response to and based on the received indication of the animation.
  137.   The method of claim 136, wherein receiving the indication of the animation comprises receiving an indication of any type of animation of the first avatar.
  138.   The method of claim 136, wherein receiving the indication of the animation comprises receiving an indication of a specific animation of multiple possible animations of the first avatar.
  139.   138. The method of claim 136, further comprising animating the first avatar in response to and based on the animation of the second avatar.
  140.   137. The method of claim 136, wherein the first avatar is animated in response to a particular portion of a message transmitted between the first user and the second user.
  141.   141. The method of claim 140, wherein the first avatar is animated in response to a particular portion of a message sent from the first user to the second user.
  142.   141. The method of claim 140, wherein the first avatar is animated in response to a particular portion of a message sent from the second user to the first user.
  143.   137. The method of claim 136, wherein the first avatar is animated to send out-of-band communications from the first user to the second user.
  144.   145. The method of claim 143, wherein the out-of-band communication includes a communication that indicates an environmental condition for the first user.
  145.   145. The method of claim 144, wherein the environmental condition comprises an environmental condition associated with weather occurring at a geographical location near the first user.
  146.   145. The method of claim 143, wherein the out-of-band communication includes communication that exhibits personality characteristics associated with the first user.
  147.   145. The method of claim 143, wherein the out-of-band communication includes a communication indicating an emotional state related to the first user.
  148.   The method of claim 143, wherein the out-of-band communication includes communication indicating a setting characteristic related to the first user.
  149.   The method of claim 148, wherein the setting features include features related to the time of day of the first user.
  150.   The method of claim 148, wherein the setting features include features related to a time of year.
  151.   161. The method of claim 150, wherein the period of the year includes a holiday.
  152.   156. The method of claim 150, wherein the time period of the year includes a season, the season being one of spring, summer, autumn, or winter.
  153.   The method of claim 148, wherein the setting features include features related to a work setting.
  154.   The method of claim 148, wherein the setting features include features related to an entertainment setting.
  155.   157. The method of claim 154, wherein the entertainment setting comprises a sandy beach setting or a tropical setting.
  156.   157. The method of claim 154, wherein the entertainment setting comprises a winter sports setting.
  157.   The method of any one of claims 136 to 156, wherein the steps are performed by a computer program embodied on a computer-readable medium or a propagated signal and configured to animate the first avatar based on a recognized animation of the second avatar.
  158. An apparatus for animating a first avatar based on a recognized animation of a second avatar, comprising a processor connected to a storage device and one or more input / output devices, the processor comprising:
    Graphically representing the first user using a first avatar that can be animated,
    A second user is graphically represented using a second avatar that can be animated, and a communication message is transmitted between the first user and the second user;
    Receiving an indication of the animation of the first avatar; and
    An apparatus configured to animate the second avatar in response to and based on the received indication of the animation.
  159.   159. The apparatus of claim 158, wherein the processor is configured to receive an indication of any type of animation of the first avatar.
  160.   The apparatus of claim 158, wherein the processor is configured to receive an indication of a specific animation of multiple possible animations of the first avatar.
  161.   159. The apparatus of claim 158, wherein the processor is further configured to animate the first avatar in response to and based on the animation of the second avatar.
  162.   The apparatus of claim 158, wherein the processor is further configured to animate the first avatar in response to a particular portion of a message transmitted between the first user and the second user.
  163.   159. The apparatus of claim 158, wherein the processor is further configured to animate the first avatar to transmit out-of-band communications from the first user to the second user.
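The reactive-avatar claims above (claims 136 onward) describe one avatar animating in response to, and based on, a received indication of another avatar's animation. A hypothetical sketch of that response mapping; the mapping table and class are illustrative only:

```python
# Sketch of the reactive-avatar behavior: when an indication of the
# peer avatar's animation is received, this avatar responds with an
# animation chosen from its own repertoire.

RESPONSES = {"wave": "wave_back", "laugh": "smile", "cry": "comfort"}

class ReactiveAvatar:
    def __init__(self):
        self.playing = None

    def on_peer_animation(self, animation):
        """Animate in response to, and based on, the peer's animation."""
        self.playing = RESPONSES.get(animation, "idle")

peer_view = ReactiveAvatar()
peer_view.on_peer_animation("wave")
print(peer_view.playing)  # wave_back
```

In an actual client each avatar could react in turn (claim 139's mutual animation), so a loop-prevention rule would be needed; that detail is omitted here.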
JP2006508976A 2002-11-21 2004-03-01 How to use an avatar to communicate Pending JP2006520053A (en)

Priority Applications (7)

Application Number Priority Date Filing Date Title
US45066303P true 2003-03-03 2003-03-03
US51285203P true 2003-10-22 2003-10-22
US10/747,255 US20040179039A1 (en) 2003-03-03 2003-12-30 Using avatars to communicate
US10/747,652 US20040179037A1 (en) 2003-03-03 2003-12-30 Using avatars to communicate context out-of-band
US10/747,696 US7636755B2 (en) 2002-11-21 2003-12-30 Multiple avatar personalities
US10/747,701 US7484176B2 (en) 2003-03-03 2003-12-30 Reactive avatars
PCT/US2004/006284 WO2004079530A2 (en) 2003-03-03 2004-03-01 Using avatars to communicate

Publications (1)

Publication Number Publication Date
JP2006520053A true JP2006520053A (en) 2006-08-31

Family

ID=32966868

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2006508976A Pending JP2006520053A (en) 2002-11-21 2004-03-01 How to use an avatar to communicate

Country Status (5)

Country Link
EP (1) EP1599862A2 (en)
JP (1) JP2006520053A (en)
AU (1) AU2004216758A1 (en)
CA (1) CA2517909A1 (en)
WO (1) WO2004079530A2 (en)

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100388666C (en) 2004-12-09 2008-05-14 腾讯科技(深圳)有限公司 Method and system for controlling data transmission procedure
DE102004061884B4 (en) * 2004-12-22 2007-01-18 Combots Product Gmbh & Co. Kg Method and system for telecommunications with virtual substitutes
CA2610054C (en) * 2005-06-02 2012-09-11 Tencent Technology (Shenzhen) Company Limited Method for displaying animation and system thereof
WO2007129143A2 (en) 2005-12-09 2007-11-15 Ebuddy Holding B.V. Message history display system and method
KR20080085049A (en) 2005-12-20 2008-09-22 코닌클리케 필립스 일렉트로닉스 엔.브이. Method of sending a message, message transmitting device and message rendering device
DE102006025685A1 (en) * 2006-06-01 2007-12-06 Combots Product Gmbh & Co. Kg Communication device for animated communication, has display device, on which window-based graphic user interface is represented, and area is defined as window, which is represented transparently
DE102006025687A1 (en) * 2006-06-01 2007-12-06 Combots Product Gmbh Communication device for animated communication, has communication terminal with display device, on which window-based graphic user interface is represented, and area is defined as window, which is represented transparently
DE102006025686A1 (en) * 2006-06-01 2008-02-07 Combots Product Gmbh Communication device, has display unit, and area defined as window, where combot is composed of non-transparent pixels and represented in window, and window-based graphical user interface can be represented on unit
KR100834646B1 (en) 2006-09-05 2008-06-02 삼성전자주식회사 Method for transmitting software robot message
US9686219B2 (en) 2010-04-14 2017-06-20 Nokia Technologies Oy Systems, methods, and apparatuses for facilitating determination of a message recipient
CN103797761B (en) * 2013-08-22 2017-02-22 华为技术有限公司 Communication method, client, and terminal
GB201405651D0 (en) * 2014-03-28 2014-05-14 Microsoft Corp Delivering an action
EP3198560A4 (en) * 2014-09-24 2018-05-09 Intel Corporation User gesture driven avatar apparatus and method
EP3101845A1 (en) * 2015-06-01 2016-12-07 Facebook, Inc. Providing augmented message elements in electronic communication threads
WO2017137952A1 (en) * 2016-02-10 2017-08-17 Vats Nitin Intelligent chatting on digital communication network
US9959037B2 (en) * 2016-05-18 2018-05-01 Apple Inc. Devices, methods, and graphical user interfaces for messaging
DK201670641A1 (en) * 2016-05-18 2017-12-04 Apple Inc Devices, Methods, and Graphical User Interfaces for Messaging

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11184790A (en) * 1997-12-25 1999-07-09 Casio Comput Co Ltd Cyberspace system and recording medium for storing program for providing cyberspace to user terminal
JP2001117894A (en) * 1999-10-14 2001-04-27 Fujitsu Ltd Method and system for promoting communication
JP2001228947A (en) * 2000-02-18 2001-08-24 Sharp Corp Expression data control system, expression data controller to constitute the same and recording medium in which its program is recorded
JP2001338077A (en) * 2000-05-24 2001-12-07 Digital Passage:Kk Language lesson method through internet, system for the same and recording medium
JP2003058484A (en) * 2001-08-21 2003-02-28 Sony Corp Method and device for providing community service, program storage medium and program

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5880731A (en) * 1995-12-14 1999-03-09 Microsoft Corporation Use of avatars with automatic gesturing and bounded interaction in on-line chat session
US5793365A (en) * 1996-01-02 1998-08-11 Sun Microsystems, Inc. System and method providing a computer user interface enabling access to distributed workgroup members

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101542776B1 (en) 2007-02-15 2015-08-07 LG Electronics Inc. Controlling Method of Instant Messenger Service for Mobile Communication Terminal
JP2015018563A (en) * 2007-06-27 2015-01-29 Karen Knowles Enterprises Pty Ltd Communication method, system, and products
JP2010533902A (en) * 2007-06-27 2010-10-28 Karen Knowles Enterprises Pty Ltd Communication method, system and product
JP2009134653A (en) * 2007-11-30 2009-06-18 International Business Machines Corp <IBM> Access control method, server device and system
WO2013080636A1 (en) * 2011-12-02 2013-06-06 Konami Digital Entertainment Co., Ltd. Server device, recording medium, and avatar management method
JP5048877B1 (en) * 2012-02-14 2012-10-17 DeNA Co., Ltd. Social game computing
US9289690B2 (en) 2012-02-14 2016-03-22 DeNA Co., Ltd. Computing of social game
US8834275B2 (en) 2012-02-14 2014-09-16 DeNA Co., Ltd. Computing of social game
JP2015514273A (en) * 2012-04-06 2015-05-18 I-ON Communications Co., Ltd. Mobile chat system to support comic story style dialogue on web pages
US9973458B2 (en) 2012-04-06 2018-05-15 I-On Communications Co., Ltd. Mobile chat system for supporting cartoon story-style communication on webpage
JP2013165951A (en) * 2012-06-28 2013-08-29 DeNA Co., Ltd. Computing for social game
JP2014087657A (en) * 2013-11-20 2014-05-15 DeNA Co., Ltd. Game program and game system
JP2016071571A (en) * 2014-09-30 2016-05-09 Dai Nippon Printing Co., Ltd. Message transmission device and computer program

Also Published As

Publication number Publication date
EP1599862A2 (en) 2005-11-30
CA2517909A1 (en) 2004-09-16
WO2004079530A2 (en) 2004-09-16
WO2004079530A3 (en) 2004-10-28
AU2004216758A1 (en) 2004-09-16

Similar Documents

Publication Publication Date Title
Valentine Creating transgressive space: the music of kd lang
RU2527199C2 (en) Avatar integrated shared media selection
US7908556B2 (en) Method and system for media landmark identification
US9215095B2 (en) Multiple personalities
US9576400B2 (en) Avatar editing environment
Jacobson Impression formation in cyberspace: Online expectations and offline experiences in text-based virtual communities
US10357881B2 (en) Multi-segment social robot
US9568993B2 (en) Automated avatar mood effects in a virtual world
US9542038B2 (en) Personalizing colors of user interfaces
US9521364B2 (en) Ambulatory presence features
EP1420366A2 (en) System and method for modifying a portrait image in response to a stimulus
CN102132244B (en) Image tagging user interface
US7091976B1 (en) System and method of customizing animated entities for use in a multi-media communication application
US20130080467A1 (en) Social networking system and method
US20040148346A1 (en) Multiple personalities
US20100153453A1 (en) Communication method, system and products
US9160773B2 (en) Mood-based organization and display of co-user lists
US8264505B2 (en) Augmented reality and filtering
US20040001090A1 (en) Indicating the context of a communication
US7065711B2 (en) Information processing device and method, and recording medium
US9280545B2 (en) Generating and updating event-based playback experiences
US9223469B2 (en) Configuring a virtual world user-interface
US20120284623A1 (en) Online search, storage, manipulation, and delivery of video content
KR20190086056A (en) Automatic suggestions and other content for messaging applications
US20130117365A1 (en) Event-based media grouping, playback, and sharing

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20061106

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20091117

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20100511