EP1714465A1 - Procede et systeme de telecommunication a l'aide de representants virtuels - Google Patents

Procede et systeme de telecommunication a l'aide de representants virtuels

Info

Publication number
EP1714465A1
Authority
EP
European Patent Office
Prior art keywords
user
animation
interaction
users
representative
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP05701278A
Other languages
German (de)
English (en)
Inventor
Christian REISSMÜLLER
Frank SCHÜLER
Markus Knaup
Pierre-Alain Cotte (c/o WEB.DE AG)
Michael Greve (c/o WEB.DE AG)
Matthias Greve (c/o WEB.DE AG)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Combots Product GmbH and Co KG
Original Assignee
Combots Product GmbH and Co KG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from EP04002154A external-priority patent/EP1560402A1/fr
Priority claimed from DE102004033164A external-priority patent/DE102004033164A1/de
Priority claimed from DE102004061884A external-priority patent/DE102004061884B4/de
Application filed by Combots Product GmbH and Co KG filed Critical Combots Product GmbH and Co KG
Priority to EP05701278A priority Critical patent/EP1714465A1/fr
Publication of EP1714465A1 publication Critical patent/EP1714465A1/fr
Withdrawn legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486Drag-and-drop
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/50Network services
    • H04L67/75Indicating network or usage conditions on the user display
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00Data switching networks
    • H04L12/02Details
    • H04L12/16Arrangements for providing special services to substations
    • H04L12/18Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
    • H04L12/1813Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms
    • H04L12/1822Conducting the conference, e.g. admission, detection, selection or grouping of participants, correlating users to one or more conference sessions, prioritising transmission
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/07User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail characterised by the inclusion of specific contents
    • H04L51/10Multimedia information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/131Protocols for games, networked simulations or virtual reality
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/72427User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality for supporting games or graphical animations

Definitions

  • The invention relates to a method and a system by means of which at least two users can communicate with one another via corresponding terminals. The communication is enhanced and supported through the use of virtual representatives.
  • In instant messaging (IM), multiple users can use a client program to exchange written messages in real time ("chat").
  • During a chat, the text entered by a user is transmitted in real time to the other participating users, who can in turn respond to the transmitted text messages with their own text input.
  • Emoticons are character strings emulating a face ("smiley") that are used in written electronic communication to express moods and emotional states.
  • Although emoticons slightly expand the means of expression available in instant messaging, the user still lacks the opportunity to convey emotions and moods to a chat partner in a layered, clear, responsive and multimedia-rich way.
  • The method and the system according to the invention should enable a particularly direct, versatile and varied communication of moods, emotions and emotional states.
  • To this end, the present invention proposes a method for telecommunication between at least two users via a telecommunications network, wherein the first user is connected to the telecommunications network via a first terminal and the second user via a second terminal, and wherein a virtual representative is assigned to each user, with the following steps: representation of the two representatives on the first terminal and the second terminal; and transmission of information from the first user to the second user and vice versa by animating at least one representative and/or by an interaction between the representatives.
  • Through the use of virtual representatives, the communication between the two users is substantially expanded and improved.
  • The users are no longer bound exclusively to written form for the exchange of information, but can convey information to their communication partners directly, visually and acoustically, through an animation of their respective representative.
  • The virtual representative not only stands for the respective user, but also includes communication functions, in particular the functions described below for non-verbal communication.
  • Each representative is therefore to be understood not only as a graphic element, but also as a program element or object of a user program that runs on the terminal of each user for the purpose of communication with the other user.
  • The representatives are thus also small communication control programs.
  • The representatives are therefore also referred to below as "Communication Robots", in short "ComBots".
  • Telecommunication in the context of the invention means communication between the two users over a distance, understood very broadly: all types of communication over any communication network are covered. The communication takes place, for example, via a telecommunications network such as a telephone network, a wireless network, a computer network, a satellite network or a combination of these networks. Preferably, the network according to the invention comprises the Internet or the World Wide Web.
  • Both the first user and the second user communicate with each other via so-called terminals.
  • These terminals are used for the telecommunication and allow the visual, audible and/or written exchange of information between the users.
  • The terminals may be telephones, cell phones, PDAs, computers or the like. The users can also communicate with each other via different types of devices.
  • Preferably, the terminals are Internet-capable computers.
  • a "user” is seen as a natural person or as a human individual.
  • a virtual deputy (ComBot) is assigned to each user in the telecommunication taking place.
  • This virtual proxy can also be referred to as a doppelganger or avatar.
  • It is a graphic placeholder representing the respective user.
  • a user can have a familiar comic character as a virtual deputy, such as Donald Duck or Bart Simpson. The user is shown his graphic figure representing him during communication on his terminal. At the same time, the user also sees another graphic object, which stands for his communication partner.
  • A communication partner can thus be informed of something, such as an expression of feeling, in a novel manner: the virtual representative of the communicating party is animated accordingly. Additionally or alternatively, an interaction between the two representatives can be represented.
  • That a representative is animated means that its graphical representation changes visually and/or acoustically over time.
  • A representative is thus not just a static image or symbol, but dynamic, and can perform a variety of actions.
  • A representative can, for example, wave as a sign of greeting.
  • The animation and/or interaction of the representatives takes place in response to a user command, in particular in response to a drag-and-drop command of the user.
  • The user can control his representative individually, for example in order to illustrate his current mood to his communication partner through the control of the representative.
  • The control is carried out by a corresponding operation of the respective terminal; preferably, the terminal has a graphical user interface (desktop) with a mouse-like pointing device.
  • The user can then trigger an animation or interaction of his representative in a particularly simple manner by drag and drop.
  • To do so, the user moves his mouse pointer to a graphic representation of the animation which his representative is to perform and "drags" this image onto the graphical representation of his representative.
  • A predefined area of the desktop, or a window or window area generated by the user program, can also serve as the drop target.
  • In response to a control command of the first user, an animation of the second user's representative can also take place, and vice versa. With this function, the described interaction between the representatives becomes possible in a simple manner.
  • This function is particularly useful when one user wants his representative to perform an action that is to have an effect on the representative of the other user.
  • The first user can, for example, instruct his representative to throw an object at the representative of the other user.
  • A corresponding "reaction" of the targeted representative then takes place in the form of a corresponding animation.
  • The first user may achieve such an animation of the second user's representative by dragging and dropping onto the second user's representative (a sketch of this control scheme follows below).
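  • As an illustration only: the patent does not prescribe an implementation, but the drag-and-drop control just described could be wired up roughly as in the following sketch, using standard HTML5 drag-and-drop events (all identifiers, such as dispatchAnimation, are invented for this example):

```typescript
// Hypothetical sketch of the drag-and-drop control described above.
type AnimationId = string;

interface Representative {
  element: HTMLElement; // graphical representation on the desktop
  isOwn: boolean;       // true for the user's own representative
}

// An animation icon becomes draggable and carries its animation id.
function makeIconDraggable(icon: HTMLElement, animation: AnimationId): void {
  icon.draggable = true;
  icon.addEventListener("dragstart", (e: DragEvent) => {
    e.dataTransfer?.setData("text/animation-id", animation);
  });
}

// Each representative accepts drops; the effect differs depending on
// whether the icon lands on the own or on the partner's representative.
function makeDropTarget(
  rep: Representative,
  dispatchAnimation: (a: AnimationId, target: Representative) => void,
): void {
  rep.element.addEventListener("dragover", (e: DragEvent) => e.preventDefault());
  rep.element.addEventListener("drop", (e: DragEvent) => {
    e.preventDefault();
    const animation = e.dataTransfer?.getData("text/animation-id");
    if (animation) dispatchAnimation(animation, rep);
  });
}
```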
  • The animation and/or interaction taking place in response to a user command is displayed simultaneously, in parallel and in real time on both terminals of the two users. This means that both users can follow, live so to speak, the behavior of the representatives in response to the commands entered on their respective terminals.
  • The control commands entered by the users for the animation of the representatives can be processed in different ways.
  • A newly entered user command can lead to the immediate termination of a running animation or interaction; immediately after this abort, the new animation desired by the user is then played.
  • Alternatively, the running animation or interaction may first be completed, so that the desired animation occurs immediately after the completed one.
  • Furthermore, the desired animations or interactions can be enqueued in a waiting list of animations or interactions still to be carried out.
  • The animations specified by the users are then processed one after the other according to the waiting list.
  • Finally, a first animation or interaction triggered by the first user can also be terminated and replaced by a second animation or interaction triggered by the second user, and vice versa (see the scheduler sketch after the following example).
  • For example, the first user triggers an interaction with which his representative shoots an arrow at the second user's representative.
  • The second user could interrupt that interaction by instructing his representative to fend off the arrow with a shield.
  • The first user could in turn interrupt this second interaction by triggering another interaction, and so on.
  • In this way, a veritable interactive action-and-reaction game between the two users can develop using the representatives.
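  • The three processing strategies named above (immediate abort, completion of the running animation, and a waiting list) can be pictured as a small scheduler. The following sketch is one possible reading; the mode names and the class are assumptions for illustration, not part of the patent:

```typescript
// Hypothetical scheduler for animation commands, mirroring the three
// processing alternatives described above.
type Mode = "abort" | "complete" | "enqueue";

class AnimationScheduler {
  private queue: string[] = [];
  private current: string | null = null;

  constructor(private mode: Mode, private play: (a: string) => void) {}

  request(animation: string): void {
    if (this.current === null) { // idle: play immediately in every mode
      this.start(animation);
      return;
    }
    switch (this.mode) {
      case "abort":    // terminate the running animation at once
        this.start(animation);
        break;
      case "complete": // let the current one finish, suppress what followed
        this.queue = [animation];
        break;
      case "enqueue":  // append to the waiting list ("playlist")
        this.queue.push(animation);
        break;
    }
  }

  // to be called by the renderer when the current animation has finished
  onFinished(): void {
    this.current = null;
    const next = this.queue.shift();
    if (next !== undefined) this.start(next);
  }

  private start(animation: string): void {
    this.current = animation;
    this.play(animation);
  }
}
```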
  • The course of the interaction can depend on predefinable parameters which the users specify and/or which are stored in the system in user profiles assigned to the users.
  • The parameters may include, e.g., information about the person of the respective user, such as his nationality, his place of residence or current whereabouts, his interests and/or hobbies.
  • The respective user profile can be managed by the system and kept up to date, so that suitable interactions are automatically used for the representative (ComBot) of the respective user, or at least a suitable selection is offered to the user, e.g. a number of favorite interactions ("favorites"). The system thus has a function that automatically changes and adjusts the interactions on the basis of the parameters. This automatic function can be switched on and off by the user at any time.
  • To further expand the depth of communication between the users, a recognition of a voice or text input made by a user into his terminal can additionally be carried out.
  • The recognized voice or text input is then analyzed so that its meaning is captured.
  • Video recognition, for example by means of a video camera, is also possible.
  • In particular, a video recognition of the facial expression of a user, together with its analysis and interpretation, can take place: the facial expressions of a user are preferably recorded and evaluated with respect to certain emotional expressions. Following the analysis and interpretation, the user can be offered several suitable animation or interaction possibilities corresponding to the meaning of his voice or text input or his facial expression. If the user indicates, e.g. in writing, verbally or through his facial expression, that he is pleased, he can be offered animations expressing joy for his representative (the animation "smile" or "laugh" or "hopping", etc.).
  • Instead of such a suggestion function, an animation of a representative and/or an interaction between the representatives can also take place directly and automatically in accordance with the meaning of the voice or text input or the facial expression.
  • The meaning of a voice or text message or of a facial expression can thus be determined automatically, and as a result the behavior of the corresponding representative is automatically adjusted to that meaning.
  • If, for example, a user's voice or text message or facial expression conveys "I'm feeling sad", the user's representative can automatically take on a sad expression; alternatively, the user can first confirm the interpretation before the representative mimics the recognized meaning. The automatic recognition of the meaning of a text message can also be called "parsing".
  • During parsing, the text is searched for keywords and terms, for which appropriate animations and, if applicable, interactions are then offered to the user and/or automatically used by the system in the communication.
  • Such a "parsing" function can also be applied to non-textual messages, in particular to voice messages. The analysis of the message content can also draw on information about the user queried from the user profile stored in the system; for example, the writing and speaking habits of each user can be stored there and then taken into account when converting messages into animations and interactions.
  • The tabular overview described below applies to terminals that provide the user with a graphical user interface.
  • The user can select an action to be performed by a representative on the basis of the graphically displayed table, which contains the available control commands in the form of small graphical symbols ("icons" or "thumbnails").
  • The overview table can also be called a grid, matrix or lattice.
  • Preferably, the tabular overview has a fixed number of classes, under which the animation and interaction possibilities are grouped and can be retrieved.
  • For example, the tabular overview can consist of a two-by-three matrix, each of the six fields of the matrix standing for one of six conclusively determined classes.
  • Particularly preferably, the animations are grouped under the six classes "Mood", "Love", "Topics", "Commentary", "Aggression" and "Events" (a possible data layout is sketched below).
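  • Assuming the two-by-three matrix just mentioned, the selection table could be modeled as simply as in this sketch (only the six class names are taken from the text; the per-class animation lists are illustrative placeholders):

```typescript
// The six classes arranged as a 2x3 selection matrix; animation lists
// are placeholders, not an exhaustive catalogue.
interface AnimationClass {
  name: string;
  animations: string[];
}

const CLASS_GRID: AnimationClass[][] = [
  [
    { name: "Mood", animations: ["smile", "cry"] },
    { name: "Love", animations: ["heart"] },
    { name: "Topics", animations: ["birthday cake"] },
  ],
  [
    { name: "Commentary", animations: ["applause"] },
    { name: "Aggression", animations: ["bomb", "lightning", "shooting"] },
    { name: "Events", animations: ["fireworks"] },
  ],
];
```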
  • If the users are additionally provided with a drawing function enabling the real-time transmission of a drawing made by one user on his terminal to the other user's terminal, yet another type of communication is available to them.
  • With a drawing tool, a user can create a drawing on his graphical user interface. The other user and communication partner can then follow the creation of the drawing in real time on his own device. In this way, information can also be transmitted that is difficult to describe in writing or to express via the representative.
  • A mood display can further be provided for the representatives. It can be realized in the form of a mood bar and/or in the form of a more or less smiling face ("smiley"), so that each user can see directly what the current mood of his own and of the other representative looks like. Depending on the behavior of the representatives and as a result of their exchange, the respective mood display can vary.
  • This makes the behavior of the representatives particularly varied and lifelike.
  • For example, the representative of the first user can automatically start to jump for joy once the mood display has exceeded a certain threshold.
  • The mood display can be presented in a variety of forms, e.g. also in the form of a "fever thermometer" or a "fever curve". The current mood of the user can also be displayed by a color change or another transformation of the representative (ComBot). It is particularly advantageous if the depth of communication is also made dependent on the current mood. For this purpose, the system evaluates the current mood display and accordingly alters the animations and/or interactions relating to the representative (ComBot) of this user.
  • Advantageously, the presentation of the two representatives on the first terminal is a mirror image of the presentation of the two representatives on the second terminal.
  • Advantageously, the animation of the at least one representative and/or the interaction between the representatives takes place as a function of predefinable criteria, in particular criteria that are stored in a user profile assigned to at least one of the two users.
  • At least one of the two users may also be provided with a selection of animations and/or interactions to be transmitted. This, too, can be done according to predefinable criteria, in particular criteria stored in a user profile assigned to at least one of the two users; at least this user is then proposed a selection of animations and/or interactions to be transmitted.
  • As predefinable criteria, information about at least one of the two users can be provided, in particular information on gender, age, nationality, mother tongue, speech habits or patterns, residence, interests and/or hobbies.
  • Advantageously, the animation of the at least one representative and/or the interaction between the representatives takes place in response to a drag-and-drop command of the user, wherein the drag-and-drop command refers either to the user's own representative or to the representative of the other user, and wherein the animation or interaction occurs depending on which of the two representatives the drag-and-drop command refers to.
  • Again, this can be done as a function of predefinable criteria, in particular criteria stored in a user profile assigned to at least one of the two users.
  • The predefinable criteria include information about at least one of the two users, in particular information on gender, age, nationality, mother tongue, speech habits or patterns, residence, interests and/or hobbies.
  • The animation of the at least one representative and/or the interaction between the representatives may also depend on the mood display, which indicates the current emotional mood for at least one of the two users.
  • Preferably, the automatic reaction of a representative in response to a received emotion depends on the current basic mood of the receiving representative. If, for example, the mood of a representative is good-humored and that representative receives an aggressive emotion, its automatic reaction could be a simple shake of the head. But if the mood of the representative receiving the aggression is negative, it might instead automatically clench and swing its fist.
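  • Read as a rule, this mood-dependent reaction could be expressed as in the following sketch (the numeric mood scale and all names are assumptions for illustration):

```typescript
// Hypothetical mood-dependent automatic reaction: the same incoming
// aggressive emotion triggers a different response depending on the
// receiving representative's current basic mood.
interface RepresentativeState {
  mood: number; // e.g. -1 (bad mood) .. +1 (good mood)
}

function autoReaction(receiver: RepresentativeState, incoming: string): string {
  if (incoming === "aggression") {
    // good-humored: a simple shake of the head;
    // bad-tempered: clench and swing the fist instead
    return receiver.mood >= 0 ? "shake head" : "clench fist";
  }
  return "idle";
}
```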
  • Advantageously, the mood display, which indicates the current emotional mood for at least one of the two users, can be changed depending on the transmitted emotion and/or interaction.
  • The selection of the animations and/or interactions to be transmitted that is provided to at least one of the two users can likewise depend on the mood display indicating that user's current emotional mood. It is advantageous if the selection of animations and/or interactions to be transmitted is provided in the form of groupings and/or classes. In this context, the composition of the classes and/or the selection of animations and/or interactions can take place automatically and in a pseudo-random manner.
  • The present invention also includes a system for carrying out the method according to the invention just described.
  • FIG. 1 shows an overall view of a user interface of a user for carrying out the method according to the invention
  • Figures 2a to 2c show alternative embodiments of user interfaces according to the invention
  • FIGS. 3a to 3f show a further alternative embodiment of a user interface according to the invention
  • FIGS. 4a and 4b show, by way of example, an interaction between two virtual proxies
  • FIGS. 5a and 5b show two embodiments of the tables according to the invention for selecting a control command;
  • FIG. 6 shows different control options that are available to the user on the basis of the tables according to FIGS. 5a and 5b;
  • FIG. 7 shows the text recognition and interpretation (parsing) according to the invention
  • Figures 8a to 8d show various processing possibilities of the control commands issued by the users
  • FIG. 9 shows the mirror-inverted view of both users
  • FIGS. 10 to 28 exemplify the flow of communication between two users using the method and system of the present invention;
  • FIG. 29 shows an example of a complex communication with text-based elements and with elements configured according to the invention.
  • Fig. 1 shows the user interface 1 that a user named Franz uses to communicate with a user named Manuela; it depicts what Franz sees on his screen during this communication.
  • Manuela has a similar interface to the user interface 1 on the screen of her computer.
  • The user interface 1 is realized as a separate window with three main sections 2, 3 and 4.
  • Section 2 can be called the animation section.
  • In it, the virtual representatives 5 and 6 of Franz and Manuela are presented.
  • Section 3 is the text and control section. Here, the text messages exchanged between Franz and Manuela are displayed.
  • Section 3 also houses the control panels for controlling the communication.
  • Section 4 is the menu bar.
  • In section 2, the two virtual representatives 5 and 6 (ComBots) of the two users can be recognized.
  • The virtual representative 5 of Franz is a car, while the virtual representative 6 of Manuela is a doll.
  • The representative 5 of Franz is currently undergoing an animation phase and is sending the "heart animation" shown to the representative 6 of Manuela.
  • In section 2, small windows 10a and 10b are also arranged. These windows indicate which actions are currently being performed by the respective user. If, e.g., a pencil appears in the window 10b, Franz knows that Manuela is just using the drawing function described in detail later.
  • The text and control section 3 is divided into a message area 11, a control bar 12 and a design area 13.
  • In the message area 11, the messages already exchanged between Franz and Manuela are displayed.
  • The control bar 12 has a plurality of buttons 15, with which different animations of the representatives 5 and 6 can be triggered. Thus, by dragging the heart symbol onto Franz's car and dropping it there, the "heart animation" indicated in Fig. 1 can be triggered. By dragging the boxing glove onto Manuela's doll, Franz can induce representative 5 to hit the doll with a punch.
  • The button 17 allows the opening and closing (showing and hiding) of the animation section 2.
  • Reference numeral 19 denotes the drawing function described in more detail below.
  • In the context of section 11, the already exchanged emotions are also displayed symbolically, i.e. the emotions belonging to the history of this still ongoing communication are shown.
  • For example, an emotion provided with the reference numeral 19H and displayed as a "boxing glove" is shown, namely a rather aggressive emotion that Franz had previously sent to Manuela.
  • By clicking on the symbol 19H of this historical emotion, it can immediately be repeated spontaneously.
  • FIG. 2a shows an alternative embodiment of the animation section 2.
  • In contrast to Fig. 1, section 2 here additionally has mood displays 20.1 and 20.2.
  • The mood display 20.1 is a stylized face (a "smiley"), which expresses the current mood of the associated representative, and thus of the corresponding user, through its facial expression. As can be seen, the mood of "Franz" is better than that of "Vroni", because the smile of the mood display 20.1a (smiley) of "Franz" is wider than that of the smiley 20.1b of "Vroni". The mood display may alternatively or additionally be implemented in the form of mood bars 20.2a and 20.2b, where the length of the bar indicates how good the mood is.
  • FIGS. 2b and 2c show, in comparison to FIG. 1, two further variants of the representation of the interaction between two virtual representatives. Shown is the desktop of the user "Franz".
  • The interaction between the representatives 21 and 22 then takes place in the opened thought bubble 24.
  • A user can also address a communication to several communication partners at the same time. If, for example, the user wants to transmit the same animation to two users simultaneously, he can combine the two corresponding representatives into one group. The user can then send the desired animation to the group of two communication partners in a single operation, by means of a single "drag and drop". With this "intelligent" formation of representative groups, a wide variety of groups can be generated, such as temporary or permanent groups. Several individual representatives can also be combined into a single group representative.
  • The group representative ("GroupComBot") is a single representative with whose help the user can immediately connect with a whole group of communication partners.
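  • One way to picture the GroupComBot is as a single drop target that fans an animation out to all group members; a minimal sketch under that assumption (all names invented):

```typescript
// Hypothetical group representative: one drag-and-drop, many recipients.
interface Contact {
  id: string;
}

class GroupComBot {
  constructor(
    private members: Contact[],
    private send: (animation: string, to: Contact) => void,
  ) {}

  // Called when an animation icon is dropped onto the group representative.
  receiveDrop(animation: string): void {
    for (const member of this.members) {
      this.send(animation, member);
    }
  }
}
```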
  • The system also provides the following note or reminder function: if the recipient "Franz" does not respond to an incoming emotion by reacting to it manually, and the system does not initiate an automatic reaction either, a note 21Z is displayed at the ComBot 21 of Vroni. This note 21Z shows, for example, the current number of unanswered emotions. Thus, if the potential recipient "Franz" is not present when emotions arrive, he can see immediately whether and how many emotions he has missed during his absence.
  • Whether the potential recipient has perceived or missed the emotion is determined by the system on the basis of monitoring the activities of the recipient. For example, if there are no mouse or keyboard inputs by the recipient on the receiving side during the presentation of the emotion and for at least five seconds thereafter, the system assumes that the recipient missed the animation (see the idle-check sketch below).
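  • The five-second rule amounts to a simple idle check around the playback window. A browser-side sketch of this follows; the function name and the choice of monitored events are assumptions:

```typescript
// Hypothetical "missed emotion" detection: if no mouse or keyboard input
// occurs during playback and for at least five seconds afterwards, the
// emotion counts as missed.
const GRACE_MS = 5_000; // "at least five seconds thereafter"

function detectMissed(playbackDurationMs: number, onMissed: () => void): void {
  let sawActivity = false;
  const note = () => { sawActivity = true; };

  window.addEventListener("mousemove", note);
  window.addEventListener("keydown", note);

  setTimeout(() => {
    window.removeEventListener("mousemove", note);
    window.removeEventListener("keydown", note);
    if (!sawActivity) onMissed(); // e.g. increment the 21Z counter, log to "history"
  }, playbackDurationMs + GRACE_MS);
}
```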
  • Alternatively or additionally, activity detection may be carried out via a video camera connected to the recipient's computer. The system uses the camera to check whether the recipient is present. The video camera can also be used by the recipient as an input means: the recipient can then, for example, give commands to his computer by hand gestures which are detected by the camera, e.g. in order to control his ComBot or to respond to a received emotion.
  • Preferably, the camera can even perform a body-language recognition of the user, preferably in the form of real-time monitoring.
  • For this purpose, the camera continuously records the behavior of the user.
  • By means of recognition or interpretation software, the system can interpret the user's behavior and animate the virtual representative in real time in accordance with it. Thanks to the camera recording, it can be determined, for example, that the user has just adopted a posture suggesting that he is sad.
  • The ComBot of the user will then automatically and simultaneously express the sadness of the user. With camera recognition, the ComBot thus imitates, so to speak, any behavior of the user.
  • In other words, the video camera captures the mood, the posture or the gestures of the user. The detected mood is then automatically transferred by the system to the ComBot.
  • If, for example, the user clenches his fist, the movement is recorded by the camera, then interpreted by the system, and finally the user's ComBot is made to replay the user's movement: the ComBot, like the user, clenches its fist.
  • If the recipient has missed an emotion, an indication 21Z is displayed at the corresponding ComBot.
  • Alternatively or additionally, an entry about the missed emotion is made in a dedicated logbook (a so-called "history").
  • The recipient can then trigger or play back the missed animation via the logbook and/or the note 21Z. The system thus provides a kind of recorder or memory function for missed animations.
  • Advantageously, a speech bubble is displayed for a received emotion, but a thought bubble for an outgoing emotion.
  • The different presentation can, however, also refer to whether an emotion has already been transmitted or not. If a user merely wants to prepare the transmission of an emotion (edit mode and/or preview mode), a thought bubble appears on his desktop; at first, nothing appears on the desktop of the communication partner. However, as soon as the emotion is transmitted (interaction mode), a speech bubble appears on both desktops.
  • Another variant is shown in FIG. 2c:
  • Here, the two virtual representatives 21 and 22 are placed on the respective desktop 23.
  • The animation is carried out by combining the two representatives in an overall representation, a so-called "arena", which preferably has the shape of a tube or cylinder 25.
  • FIGS. 3a to 3f illustrate a further variant of the operation and interaction of the virtual representatives.
  • Fig. 3a shows the desktop, i.e. the screen surface, of a user named Bob.
  • Bob has placed a representative 59 in the form of a snowman on his desktop.
  • The representative 59 is assigned to Bob's acquaintance Alice. Via the representative 59, Bob can communicate with Alice.
  • An animation chosen by Bob manifests itself on Alice's desktop in such a way that the representative deposited there is animated accordingly.
  • FIGS. 4a and 4b show a typical example of an interaction between two virtual representatives.
  • FIGS. 5a and 5b illustrate two embodiments 28a and 28b of the command table, which may be invoked by pressing the button 16 (see Figure 1).
  • In the table 28a, substantially all actions that can be performed by means of a virtual representative are represented in an overall grid 29.
  • Each available animation is represented by a corresponding square icon in the table 28a.
  • The icons may each be arranged according to common groupings (for example, for "love", "friendship", etc.).
  • The overall grid 29 is divided into two sections 30a and 30b.
  • In the first section 30a are the basic animations ("Basic Emotions"), which are freely available to each user, such as laughing, crying, etc.
  • The second section 30b contains special animations ("Gold Emotions").
  • These special abilities of the representatives can be acquired by a user, for example by buying them or by bartering or trading with other users.
  • An icon can also stand not only directly for a single emotion or animation, but for a whole group of animations.
  • For example, the heart icon 32 stands for the group of "love animations".
  • By clicking on it, a further sub-table 31 is opened, from which the user can then select the desired love animation for his representative; that is, several variants of a basic representation of an emotion, such as the heart representation described here, are available.
  • Those animations which cannot be assigned to a grouping are shown in a separate column 33.
  • In Fig. 5b, another type of distribution of the emotions is shown, with the summary table 28b limited to six fields. Each of these fields represents a whole class of animations.
  • A sub-table for the respective class, e.g. class 34 "Mood", is faded in by pressing the corresponding field in the table 28b; within this sub-table, the desired animation can then be selected. For example, another class comprises all kinds of aggressive emotions and is symbolized in the output table 28b by a bomb.
  • A mouse click on this symbol opens the sub-table, in which the different emotions are available.
  • The emotions summarized within a class can differ not only in their presentation, but in principle. This means that quite different emotions can be assigned to one class; what they have in common is the same meaning, substantive statement or mood.
  • The class of aggressive emotions described here includes, e.g., a bomb animation, a lightning animation or a shooting animation.
  • FIG. 6 illustrates how the desired animation is selected and triggered by a user on the basis of a table 28.
  • The three variants are indicated by corresponding arrows.
  • In variant A, the user moves the selected icon to the appropriate representative and drops it there.
  • The representative addressed in this way then immediately performs the desired animation.
  • In FIG. 6, a storm cloud is selected and dragged onto the other user's representative 6.
  • As a result, a storm cloud is sent by the user's own representative 5 to the other representative 6 and drenches it.
  • In variant B, the icon is dragged into the message area 11 and dropped there. As a result, the selected icon appears in the message area of the respective communication partner.
  • The communication partner can then trigger the animation transmitted by his counterpart by clicking on the icon.
  • In variant C, the icon is simply clicked by the user.
  • The icon is then integrated into the design area 13 at the current cursor position.
  • In addition, a suitable text can be offered to the user automatically. If the user, e.g., clicks on the icon "birthday cake", the text "happy birthday!" appears in the design area 13 above the "birthday cake".
  • By pressing the send button 14, the written text message is sent together with the integrated icon to the communication partner.
  • In the message area 11, a small face 41, also referred to as an "emoticon", can be recognized.
  • Such faces, which express a certain mood, can be entered in the message text as shown: the user simply enters the character string of the desired emoticon in a text message in the design area 13, for example :-). This string is then automatically converted into the corresponding face. If the send button 14 is pressed, the text and the emoticon are transmitted to the communication partner. From the selection of emotions displayed in table 28, each individual emotion can also be activated immediately by double-clicking.
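  • The automatic conversion of a typed emoticon string into its graphical face can be done with a simple substitution pass; the mapping table here is a hypothetical stand-in:

```typescript
// Hypothetical emoticon substitution: a typed string such as ":-)" is
// replaced by the corresponding face symbol before display.
const EMOTICONS: Record<string, string> = {
  ":-)": "🙂",
  ":-(": "🙁",
};

function replaceEmoticons(text: string): string {
  let out = text;
  for (const [seq, face] of Object.entries(EMOTICONS)) {
    out = out.split(seq).join(face); // literal replacement, no regex escaping needed
  }
  return out;
}

// replaceEmoticons("see you soon :-)") -> "see you soon 🙂"
```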
  • The automatic text recognition and interpretation ("parsing") is illustrated in Fig. 7. If a user enters a text 35 into his design area 13, the meaning of the text is automatically determined, and the determined terms are then presented to the user, here in the form of a speech bubble 36. At the same time, the user receives two matching animation suggestions.
  • FIGS. 8a to 8d illustrate various alternatives for processing the animation commands issued by the users.
  • In the first variant, an animation 38a of the representative is immediately interrupted and replaced by a new animation 38b when the user directs a command to execute the new animation 38b to his representative.
  • The advantage of this processing of the control commands is a direct and delay-free implementation, so that the behavior of the representative appears quick and dynamic.
  • Alternatively, the animation 38a is first completed before the new animation 38b takes place.
  • The originally scheduled subsequent animation 38c is then suppressed.
  • In a further variant, all animations triggered by the users are processed linearly one after the other; there is no suppression of animations.
  • The requested animations are thus arranged according to their chronological order in a so-called "playlist" and performed one after the other.
  • FIG. 8d illustrates how a multiple interaction between two representatives can be processed and played back.
  • A first user lets his representative perform an action 38a. This is then interrupted by a response 39a of the representative of the second user, which is performed instead, until the representative of the first user in turn responds with the action 38b.
  • FIG. 9 shows the mirror-inverted animation sections 2a and 2b of a first and a second user in relation to each other.
  • The first user and the second user use their respective animation sections 2a and 2b in the manner described to exchange emotions with each other via their representatives.
  • The exchange takes place via the network 40 (e.g. the Internet).
  • Both the first user ("my PC") and the second user ("your PC") see their own representative on the left side and the other's representative on the right, so that the result is a mirror image.
  • Nevertheless, both users see the same process in their respective animation sections at the same time; one could say that both users see "everything", that is, the totality of all processes that take place.
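  • The mirror-image layout follows from a single rule: each terminal draws the local user's own representative on the left and the partner's on the right. A tiny sketch of that rule (names assumed):

```typescript
// Hypothetical layout rule producing the mirrored views of FIG. 9.
interface Layout {
  left: string;  // representative drawn on the left
  right: string; // representative drawn on the right
}

function layoutFor(ownRep: string, partnerRep: string): Layout {
  // On every terminal the own representative is placed on the left,
  // so the two users' screens are mirror images of each other.
  return { left: ownRep, right: partnerRep };
}
```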
  • FIGS. 10 to 28 show an example of a communication that could take place between Franz and Manuela.
  • FIGS. 10 to 28 each depict snapshots ("screenshots") of the user interface 1 of Franz, with FIG. 10 showing the beginning of the communication and FIG. 28 the end.
  • In FIG. 13, Franz has addressed a first text message to Manuela, to which Manuela immediately replies (see Fig. 14). While Manuela enters her text, a hand appears in Franz's window 10b, indicating that Manuela is currently performing an action. Following her text input, Manuela draws a "sad face" with the drawing function 19 already described. On the basis of the pencil shown in the window 10b, Franz can see that Manuela is drawing (see Fig. 15).
  • FIG. 29 shows a user interface 50 with a communication area 51 ("Communications Area"), comprising a news section in which the actual communication takes place in real time or near real time, and with a preparation section 53 containing a design area in which the user can prepare his intended contributions (text, graphics, emotions, etc.) before sending them to the other user by pressing the send button.
  • A control bar 52 is also provided, which separates the areas 51 and 53 from each other and provides the control elements for text input, for drawing, etc. So far, the structure of this surface corresponds to that of the user interface described above.
  • In FIG. 29, however, an overview area 55 with a history is now also provided, in which all previous communications are listed.
  • The listing can be chronological, thematic or user-related.
  • The layout shown in FIG. 29 also has a navigation area 56, which serves for navigating within a single (still ongoing or already completed) communication.
  • For this purpose there is, among other things, a movable window with a cutout 57, which captures a partial area of the navigation area; this partial area is then shown enlarged in the area 51. This is thus an enlargement or magnifying function.
  • In an ongoing communication, the cutout 57 always runs along in real time with the area 51. Through this "running along", the user always keeps an overview of the current communication.
  • The user interface 50 also has a special area in which the representatives 5 and 6 (ComBots) of the two users (here the car of "Franz" and the eye of "Vroni") are displayed in interaction. In this case, however, not only the non-verbal communication by emotions described above (Figs. 1 to 28) is represented.
  • With the method and system according to the invention, two or more users can communicate with each other in a particularly appealing, versatile and varied manner.
  • By means of the virtual representatives and their animation or interaction, moods, feelings and emotions can be exchanged between the users in a particularly effective and vivid way.

Abstract

A method and a system for telecommunication between at least two users via a telecommunications network (40). The first user is connected to the telecommunications network (40) via a first terminal and the second user via a second terminal, and a virtual representative (5, 6) is assigned to each user. The method comprises the representation of the two representatives (5, 6) on the first terminal and on the second terminal, and the transmission of information from the first user to the second user and vice versa by the animation of at least one representative (5, 6) and by an interaction between the representatives (5, 6).
EP05701278A 2004-01-30 2005-01-31 Procede et systeme de telecommunication a l'aide de representants virtuels Withdrawn EP1714465A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP05701278A EP1714465A1 (fr) 2004-01-30 2005-01-31 Procede et systeme de telecommunication a l'aide de representants virtuels

Applications Claiming Priority (10)

Application Number Priority Date Filing Date Title
EP04002154A EP1560402A1 (fr) 2004-01-30 2004-01-30 Robot de communication
EP04012120 2004-05-21
US57448904P 2004-05-26 2004-05-26
US58469804P 2004-07-01 2004-07-01
US58646904P 2004-07-08 2004-07-08
DE102004033164A DE102004033164A1 (de) 2004-07-08 2004-07-08 Verfahren und System zum Verwalten und Vorschlagen von Kontaktgruppen
DE102004061884A DE102004061884B4 (de) 2004-12-22 2004-12-22 Verfahren und System zur Telekommunikation mit virtuellen Stellvertretern
US63855604P 2004-12-23 2004-12-23
EP05701278A EP1714465A1 (fr) 2004-01-30 2005-01-31 Procede et systeme de telecommunication a l'aide de representants virtuels
PCT/EP2005/000939 WO2005074235A1 (fr) 2004-01-30 2005-01-31 Procede et systeme de telecommunication a l'aide de representants virtuels

Publications (1)

Publication Number Publication Date
EP1714465A1 true EP1714465A1 (fr) 2006-10-25

Family

ID=56290655

Family Applications (1)

Application Number Title Priority Date Filing Date
EP05701278A Withdrawn EP1714465A1 (fr) 2004-01-30 2005-01-31 Procede et systeme de telecommunication a l'aide de representants virtuels

Country Status (5)

Country Link
US (1) US20080214214A1 (fr)
EP (1) EP1714465A1 (fr)
JP (1) JP2007520005A (fr)
CA (1) CA2551782A1 (fr)
WO (1) WO2005074235A1 (fr)

Families Citing this family (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8365084B1 (en) * 2005-05-31 2013-01-29 Adobe Systems Incorporated Method and apparatus for arranging the display of sets of information while preserving context
USRE49187E1 (en) 2005-09-06 2022-08-23 Samsung Electronics Co., Ltd. Mobile communication terminal and method of the same for outputting short message
EP1771002B1 (fr) * 2005-09-30 2017-12-27 LG Electronics Inc. Terminal de communication vidéo mobile
DE102006021399B4 (de) * 2006-05-08 2008-08-28 Combots Product Gmbh & Co. Kg Verfahren und Vorrichtung zum Bereitstellen eines einem dargestellten Symbol zugeordneten Auswahlmenüs
DE102006024449A1 (de) * 2006-05-24 2007-11-29 Combots Product Gmbh & Co. Kg Übermittlung von Nachrichten mittels animierter Kommunikationselemente
US8726195B2 (en) * 2006-09-05 2014-05-13 Aol Inc. Enabling an IM user to navigate a virtual world
US20080098315A1 (en) * 2006-10-18 2008-04-24 Dao-Liang Chou Executing an operation associated with a region proximate a graphic element on a surface
KR101079592B1 (ko) * 2006-11-03 2011-11-04 삼성전자주식회사 디스플레이장치 및 그 정보갱신방법
DE102006059174A1 (de) * 2006-12-14 2008-06-19 Combots Product Gmbh & Co. Kg Verfahren, Vorrichtung und System zum Bereitstellen eines einer Kommunikation zugeordneten Auswahlmenüs für non-verbale Botschaften
US8504926B2 (en) * 2007-01-17 2013-08-06 Lupus Labs Ug Model based avatars for virtual presence
EP1995909A1 (fr) * 2007-05-25 2008-11-26 France Telecom Procédé d'évaluation dynamique de l'humeur d'un utilisateur de messagerie instantanée
US7890876B1 (en) * 2007-08-09 2011-02-15 American Greetings Corporation Electronic messaging contextual storefront system and method
CN100514290C (zh) * 2007-11-08 2009-07-15 腾讯科技(深圳)有限公司 显示面板的管理系统和管理方法
US8584025B2 (en) 2008-05-02 2013-11-12 International Business Machines Corporation Virtual world teleportation
US8464167B2 (en) * 2008-12-01 2013-06-11 Palo Alto Research Center Incorporated System and method for synchronized authoring and access of chat and graphics
CN102566863B (zh) * 2010-12-25 2016-07-27 上海量明科技发展有限公司 在即时通信工具中设置辅助区的方法及系统
JP5742378B2 (ja) * 2011-03-30 2015-07-01 ソニー株式会社 情報処理装置、プレイリスト生成方法及びプレイリスト生成プログラム
US20130268119A1 (en) * 2011-10-28 2013-10-10 Tovbot Smartphone and internet service enabled robot systems and methods
WO2014061715A1 (fr) * 2012-10-19 2014-04-24 グリー株式会社 Procédé de distribution d'images, dispositif de serveur de distribution d'images et système de dialogue en ligne
CN104184760B (zh) * 2013-05-22 2018-08-07 阿里巴巴集团控股有限公司 通讯过程中的信息交互方法、客户端及服务器
CN103369477B (zh) * 2013-07-02 2016-12-07 华为技术有限公司 显示媒体信息方法、装置、客户端,图形控件显示方法和装置
US20160154959A1 (en) * 2013-07-23 2016-06-02 Banff Cyber Technologies Pte. Ltd. A method and system for monitoring website defacements
US10516629B2 (en) * 2014-05-15 2019-12-24 Narvii Inc. Systems and methods implementing user interface objects
US9857939B2 (en) 2015-02-27 2018-01-02 Accenture Global Services Limited Three-dimensional virtualization
US10225220B2 (en) 2015-06-01 2019-03-05 Facebook, Inc. Providing augmented message elements in electronic communication threads
US9658704B2 (en) * 2015-06-10 2017-05-23 Apple Inc. Devices and methods for manipulating user interfaces with a stylus
US20170054662A1 (en) * 2015-08-21 2017-02-23 Disney Enterprises, Inc. Systems and methods for facilitating gameplay within messaging feeds
CN105468244A (zh) * 2015-12-11 2016-04-06 俺朋堂(北京)网络科技有限公司 一种多人聊天页面实现方法
US20190114037A1 (en) 2017-10-17 2019-04-18 Blend Systems, Inc. Systems and methods for distributing customized avatars responsive to events
US10698583B2 (en) * 2018-09-28 2020-06-30 Snap Inc. Collaborative achievement interface
US11209964B1 (en) * 2020-06-05 2021-12-28 SlackTechnologies, LLC System and method for reacting to messages

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5347306A (en) * 1993-12-17 1994-09-13 Mitsubishi Electric Research Laboratories, Inc. Animated electronic meeting place
JP2727982B2 (ja) * 1994-10-28 1998-03-18 日本電気株式会社 インクジェット式プリントヘッド
US5880731A (en) * 1995-12-14 1999-03-09 Microsoft Corporation Use of avatars with automatic gesturing and bounded interaction in on-line chat session
US6476830B1 (en) * 1996-08-02 2002-11-05 Fujitsu Software Corporation Virtual objects for building a community in a virtual world
US5732232A (en) * 1996-09-17 1998-03-24 International Business Machines Corp. Method and apparatus for directing the expression of emotion for a graphical user interface
US6230111B1 (en) * 1998-08-06 2001-05-08 Yamaha Hatsudoki Kabushiki Kaisha Control system for controlling object using pseudo-emotions and pseudo-personality generated in the object
JP2000076487A (ja) * 1998-09-03 2000-03-14 Sony Corp 情報処理装置および方法、並びに提供媒体
WO2001047210A2 (fr) * 1999-12-20 2001-06-28 Nokia Corporation Ameliorations apportees a ou relatives a des dispositifs de communication
US6404438B1 (en) * 1999-12-21 2002-06-11 Electronic Arts, Inc. Behavioral learning for a visual representation in a communication environment
NZ519616A (en) * 1999-12-24 2002-10-25 Siemens Ltd A portable symbol for establishing a telephone call where data and software is copied with the symbol
US6784901B1 (en) * 2000-05-09 2004-08-31 There Method, system and computer program product for the delivery of a chat message in a 3D multi-user environment
JP2002055935A (ja) * 2000-08-07 2002-02-20 Sony Corp 情報処理装置および情報処理方法、サービス提供システム、並びに記録媒体
US6910186B2 (en) * 2000-12-08 2005-06-21 Kyunam Kim Graphic chatting with organizational avatars
US7203356B2 (en) * 2002-04-11 2007-04-10 Canesta, Inc. Subject segmentation and tracking using 3D sensing technology for video compression in multimedia applications
US20040220850A1 (en) * 2002-08-23 2004-11-04 Miguel Ferrer Method of viral marketing using the internet
US7853652B2 (en) * 2003-01-18 2010-12-14 International Business Machines Corporation Instant messaging system with privacy codes
US7484176B2 (en) * 2003-03-03 2009-01-27 Aol Llc, A Delaware Limited Liability Company Reactive avatars
US20050054381A1 (en) * 2003-09-05 2005-03-10 Samsung Electronics Co., Ltd. Proactive user interface
US10225373B2 (en) * 2003-11-21 2019-03-05 Thomson Reuters (Grc) Llc Financial-information systems, methods, interfaces, and software
US8171084B2 (en) * 2004-01-20 2012-05-01 Microsoft Corporation Custom emoticons

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2005074235A1 *

Also Published As

Publication number Publication date
US20080214214A1 (en) 2008-09-04
JP2007520005A (ja) 2007-07-19
WO2005074235A1 (fr) 2005-08-11
CA2551782A1 (fr) 2005-08-11

Similar Documents

Publication Publication Date Title
EP1714465A1 (fr) Procede et systeme de telecommunication a l'aide de representants virtuels
DE60209261T2 (de) Rich-kommunikation über das internet
DE102006021399B4 (de) Verfahren und Vorrichtung zum Bereitstellen eines einem dargestellten Symbol zugeordneten Auswahlmenüs
Cassell et al. Fully embodied conversational avatars: Making communicative behaviors autonomous
DE69629983T2 (de) Verfahren zum anteiligen Nutzen eines dreidimensionalen Raumes in virtueller Realität und System dafür
EP2198589B1 (fr) Procédé pour réaliser une communication multimédia reposant sur un protocole réseau, en particulier sur tcp/ip et/ou udp
DE202016003234U1 (de) Vorrichtung zum Einfangen von und Interagieren mit erweiterten digitalen Bildern
DE202015006142U1 (de) Elektronische Touch-Kommunikation
DE202016002529U1 (de) Vorrichtungen zum Einfangen von und Interagieren mit erweiterten digitalen Bildern
DE112015004021T5 (de) Elektronische touch-kommunikation
DE102019217730A1 (de) Verfahren zum Betreiben eines Bediensystems in einem Fahrzeug und Bediensystem für ein Fahrzeug
CN107480766B (zh) 多模态虚拟机器人的内容生成的方法和系统
CN112000781A (zh) 用户对话中的信息处理方法、装置、电子设备及存储介质
DE102004061884B4 (de) Verfahren und System zur Telekommunikation mit virtuellen Stellvertretern
DE102006021376A1 (de) Verfahren und Vorrichtung zum Bereitstellen von angepassten Kommunikationsfenstern zur Kommunikation mit einem Kommunikationspartner in einer Anwendung auf einem Endgerät eines Benutzers
WO2007014698A1 (fr) Systeme de communication permettant de securiser la communication entre des terminaux d'interlocuteurs, et appareils peripheriques correspondants
KR20070018843A (ko) 가상 상징물들을 이용한 전기통신 방법 및 시스템
KR100372929B1 (ko) 채팅 이미지 작성 방법
KR102244280B1 (ko) 이모티콘 입력에 동기화된 광고컨텐츠 순간 노출 광고 방법
WO2005074237A1 (fr) Robot de communication servant a ameliorer les contacts et la communication
EP0921482A2 (fr) Dispositif et méthode pour la détermination de modèles de comportement de consommateurs et de chercheurs d'informations dans une plateforme communautaire ouverte
DE102006059174A1 (de) Verfahren, Vorrichtung und System zum Bereitstellen eines einer Kommunikation zugeordneten Auswahlmenüs für non-verbale Botschaften
DE102006010770B4 (de) Verfahren zur Einladung, ein Kommunikationssystem zu nutzen, und zur Installation eines Kommunikationselementes, sowie eine das Verfahren ausführende Vorrichtung
DE102006025687A1 (de) Kommunikationseinrichtung zur animierten Kommunikation über einen ComBOT
Hsu et al. Application of Robots for Enhancing Social Presence in Remote Communication Scenarios

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20060830

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU MC NL PL PT RO SE SI SK TR

17Q First examination report despatched

Effective date: 20061122

DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20080819