GB2422454A - A system for communicating user emotion - Google Patents

A system for communicating user emotion

Info

Publication number
GB2422454A
GB2422454A GB0501393A
Authority
GB
United Kingdom
Prior art keywords
emotion
nodes
input device
user
line
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0501393A
Other versions
GB0501393D0 (en)
Inventor
James David Smith
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Siemens PLC
Original Assignee
Siemens PLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens PLC filed Critical Siemens PLC
Priority to GB0501393A priority Critical patent/GB2422454A/en
Publication of GB0501393D0 publication Critical patent/GB0501393D0/en
Publication of GB2422454A publication Critical patent/GB2422454A/en
Withdrawn legal-status Critical Current

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • H04N7/15Conference systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/10Office automation; Time management
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/011Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns

Abstract

A communications system has two or more nodes, at least one of the nodes comprising an emotion input device for user selection of a stored emotion type and a transmitter for transmitting the selected emotion type to output devices at other nodes. An output device may be a display for displaying an image representing the selected emotion type, e.g. an icon depicting a smile or a grimace. In this way, a participant in the communication may indicate a currently felt emotion to the other participants. This will be particularly useful in a video conference. The emotion input device may detect user emotions by movement of the input device alone, e.g. a mouse, virtual-reality glove, or computer-controlled camera. The output device may, alternatively, provide a sensory output such as a sound, music, a smell or a physical force such as vibration.

Description

A COMMUNICATIONS SYSTEM
This invention relates to a communications system.
It is well known that in face-to-face communication it is not merely the speech that is important but also the emotions displayed on the faces of those taking part and their body language. When direct communication is not possible and communication takes place via a telephone, facial expressions cannot be seen and the communication is less effective.
Video conferencing seeks to overcome these limitations by enabling the faces of the participants to be seen. However, because of bandwidth restrictions it is not always possible for the facial expressions to be clearly seen, so in effect video conferencing does not always satisfactorily address the problem it seeks to overcome.
Furthermore, not all parties may have the required video conferencing equipment or access to a communication line having the required characteristics.
According to the invention there is provided a communications system comprising two or more nodes, at least one of the nodes comprising an emotion input device for user input of an emotion type; a transmitter for transmitting the input emotion type to the other of the nodes; and an output device at the other node for outputting a representation of the input emotion type to the user.
The emotion input device may detect user emotions via the movement of the device alone. Thus the use of the emotion input device is more akin to the natural body movements which occur when humans communicate. The emotion input device may be any device that can detect and quantify user motion. As such, a range of devices may be employed as an emotion input device: from the PC mouse, through Virtual Reality gloves, to automated visualisation via computer-controlled cameras.
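As a rough illustration, any such device could reduce its raw movement stream to a few quantities such as path length and mean speed. The sketch below is an assumption of the author of this edit: the patent does not specify any algorithm, sample format or function names.

```python
import math

def quantify_motion(samples):
    """Summarise a stream of (timestamp, x, y) pointer samples into raw
    motion quantities. Illustrative only: the patent does not prescribe
    how an emotion input device quantifies user motion."""
    if len(samples) < 2:
        return {"length": 0.0, "duration": 0.0, "mean_speed": 0.0}
    length = 0.0
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        length += math.hypot(x1 - x0, y1 - y0)  # Euclidean path length
    duration = samples[-1][0] - samples[0][0]
    return {
        "length": length,
        "duration": duration,
        "mean_speed": length / duration if duration > 0 else 0.0,
    }
```

Any pointing device that can yield such samples, whether a mouse, a glove or a camera-based tracker, could feed the same summary.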
Alternatively, the input device may enable user selection of an emotion type from a number of stored emotion types.
The output device may be a means to provide a sensory output such as a sound, music, a smell, a physical force such as a vibration, or some other means that provides sensory stimulation. In the described embodiment the device is a display to display an image conveying the selected type of emotion.
The image may be an icon depicting, say, a smile or a grimace, or some other image indicative of a particular emotion.
A specific embodiment of the invention will now be described, by way of example only, with reference to the accompanying drawing, in which:
Figure 1 shows a communications system in accordance with the invention;
Figure 2 shows a communication node of the system shown in figure 1;
Figure 3 shows in schematic block diagram form the architecture of the communication node shown in figure 2;
Figure 4 shows blocks of functionality furnished by the architecture shown in figure 3; and
Figure 5 shows a display of a node while a video conference is taking place.
As is shown in figure 1, a communications system 1, in accordance with the invention, comprises a number of network nodes 2 to 5 interconnected via a telecommunications network 6.
Each network node 2 to 5 comprises a computer terminal 7 having a display 8, a keyboard 9, a mouse 10, a connection 11 via a modem to the network 6, and a camera 12, as is shown in figure 2. In addition to the usual computer operations, the mouse 10 also provides an emotion input device in a manner to be described later. (Other input devices may be used, such as keypads or pressure-sensitive devices.)
As will be understood by a person skilled in the art, the computer terminal 7 is formed by a computer architecture such as is depicted in figure 3. A processor 13 is connected to input devices 14, such as the mouse and keyboard, to memory 15, and to output devices 16, such as the display. The memory 15 provides storage space for the program controlling the operation of the processor 13. In this particular case an application is held which enables video-conferencing between the terminals.
The application provides, in conjunction with the processor 13, the blocks of functionality shown in figure 4. The core function is a video conference processor 17 which enables the connections to the participants and monitors for inputs from the keyboard 9 and mouse 10. The camera 12 provides its video output to the video conference processor 17. If the image is to be transmitted to the other participants it is encoded and packetised before being passed to a transmitter 18 and a modem 19 and thus via the telecommunications network 6 to the participants.
The video conference processor 17 receives information from the other nodes involved in the conference call via a receiver section which is also connected to the modem 19. It strips the information from the received packets and passes part of the received information concerning participants' input emotions to an expression processor 21.
The local user may also input emotion types via the mouse which is also connected to the expression processor 21. The expression processor 21 selects from expression memory 22 an expression graphic which is passed to the display driver 23. The display driver is connected to the display 8. It is also connected to the video conference processor 17 to receive decoded video information received from the camera 12 and from the other nodes involved in the conference.
A typical conference display is shown in figure 5 and it comprises a first region 24 and a second region 25. The first region 24 displays the video information received from a camera at one of the nodes. The region 25 displays images representing emotions input by the other participants. For participant A, the icon displayed indicates an input emotion of agreement. For participant B, the icon displayed is one indicating a confused state. For participant C, the icon displayed represents disagreement, and for participant D, the icon represents boredom.
The images in region 25 are not fixed but are updated as the input made by the participants changes.
In the described embodiment, a number of actions may be input by moving a cursor on the display with the mouse and activating the right or left button or the middle scrolling wheel. The actions appear in the tables below.
In Table 1 the action is a reciprocating line in which the cursor is moved from left to right and then back from right to left. That basic motion can be varied in its characteristics as shown under the appropriate descriptions. Table 2 shows other actions. Each action may be attributed to a particular emotion type.
Table 1

Action: Reciprocating line

Characteristic   Value                     Description
Direction        Left                      Lateral line, started from the right side
                 Right                     Lateral line, started from the left side
                 Up                        Transverse line, starting from the bottom
                 Down                      Transverse line, starting from the top
Length           Short / Long              Mean length of line over the last cycle
                 Shortening                Mean length decreasing over cycles
                 Growing                   Mean length increasing over cycles
Cycle duration   Very fast / fast / slow   Time to return to the start of another cycle
Displacement     Up                        Mean line middle moves up over cycles
                 Down                      Mean line middle moves down over cycles
End emphasis     Start end                 Moves slower at the start end of the line
                 Turn end                  Moves faster at the start end of the line
                 Out leg                   Moves slower out from the start end of the line
                 In leg                    Moves faster out from the start end of the line
Repetition       2-4 / >4                  Number of cycles
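As a sketch of how such line characteristics might be extracted from cursor samples, the function below infers orientation, initial direction and the number of straight legs (one leg suggests a unidirectional line, more a reciprocating line). The heuristics, the screen-coordinate convention (y increasing downwards) and all names are assumptions, not taken from the patent.

```python
def classify_line(samples):
    """Classify a line gesture from (x, y) cursor samples.

    Returns a dict with the dominant orientation, the initial
    direction of travel, and the number of straight legs.
    Heuristic sketch only; the patent specifies no algorithm."""
    xs = [p[0] for p in samples]
    ys = [p[1] for p in samples]
    lateral = (max(xs) - min(xs)) >= (max(ys) - min(ys))
    values = xs if lateral else ys
    deltas = [b - a for a, b in zip(values, values[1:]) if b != a]
    if not deltas:
        return {"action": "stationary"}
    if lateral:
        direction = "Right" if deltas[0] > 0 else "Left"
    else:
        # screen coordinates: y grows downwards
        direction = "Down" if deltas[0] > 0 else "Up"
    legs = 1
    for a, b in zip(deltas, deltas[1:]):
        if (a > 0) != (b > 0):  # reversal of travel starts a new leg
            legs += 1
    action = "unidirectional line" if legs == 1 else "reciprocating line"
    return {
        "action": action,
        "orientation": "lateral" if lateral else "transverse",
        "direction": direction,
        "legs": legs,
    }
```

Characteristics such as cycle duration or end emphasis would need timestamped samples; they are omitted here for brevity.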
Table 2
Action: Unidirectional line

Characteristic   Value                     Description
Direction        Left                      Lateral line, started from the right side
                 Right                     Lateral line, started from the left side
                 Up                        Transverse line, starting from the bottom
                 Down                      Transverse line, starting from the top
Length           Short / Long              Length of line
Duration         Very fast / fast / slow   Time to complete line
End emphasis     Start end                 Moves slower at the start end of the line
                 Turn end                  Moves faster at the start end of the line
Repetition       2-4 / >4                  Number of times repeated

Action: Single enclosed space

Characteristic   Value                     Description
Direction        Left                      Anticlockwise motion
                 Right                     Clockwise motion
Diameter         Small / Large             Mean diameter over the last cycle
                 Shrinking                 Mean diameter decreasing over cycles
                 Growing                   Mean diameter increasing over cycles
Cycle duration   Very fast / fast / slow   Time to return to the start of another cycle
Displacement     Up                        Mean centre moves up over cycles
                 Down                      Mean centre moves down over cycles
End emphasis     Start end                 Moves slower at the top
                 Turn end                  Moves faster at the top
Repetition       2-4 / >4                  Number of cycles

Action: Double enclosed space

Characteristic   Value                     Description
Direction        Left                      Lateral layout, started from the right side
                 Right                     Lateral layout, started from the left side
                 Up                        Transverse layout, starting from the bottom
                 Down                      Transverse layout, starting from the top
Length           Short / Long              Mean length of shape over the last cycle
                 Shortening                Mean length decreasing over cycles
                 Growing                   Mean length increasing over cycles
Cycle duration   Very fast / fast / slow   Time to return to the start of another cycle
Displacement     Up                        Mean middle moves up over cycles
                 Down                      Mean middle moves down over cycles
End emphasis     Start end                 Moves slower over the start part of the shape
                 Turn end                  Moves faster at the start part of the shape
                 Out leg                   Moves slower out from the start end of the shape
                 In leg                    Moves faster out from the start end of the shape
Loop bias        Start                     Loop at the start end is larger
                 End                       Loop at the start end is smaller
Repetition       2-4 / >4                  Number of cycles

It may be that the device provides additional means by which the user may express their emotions. Typically these might include the left button, the right button and the scroll wheel.
These can be used in isolation, in combination, or together with a motion action, to express a specific emotion. Some examples of device action templates that could be used are given in Table 3.
Table 3

Action         Characteristic   Description
Wheel toggle   Slow / Fast      The scroll wheel is moved backwards and forwards about a quarter turn
               Repetition
Wheel flick    Up / Down        The scroll wheel is moved about half a turn in one direction
               Repetition
Button click   Left / Right     A button is pressed and released in less than half a second
               Repetition
Button hold    Left / Right     A button is pressed and held for longer than half a second
               Duration
               Repetition

By way of illustration, Table 4 indicates some possible emotion types and how they could be associated with the use of the emotion input device.
Table 4
Emotion                  Screen pointer device action
Normal                   Pointer stationary (or not recognised as a defined motion)
Agree                    Slow transverse reciprocating line
YES!                     Fast transverse reciprocating line
Disagree                 Slow lateral reciprocating line
NO!                      Fast lateral reciprocating line
Uncertain                Small slow lateral double enclosed space
Confused                 Lateral double enclosed space with start loop bias
Questioning              Large fast lateral double enclosed space
Need more information    Transverse double enclosed space
Distracted               Pointer focus moves to another application
Impatient                Slow wheel toggle
Frustrated               Fast wheel toggle
Taking note              Keyboard entry with focus on the application
Attentive                Transverse unidirectional line
Interruption             Fast repeated transverse unidirectional line
Attention request        Right button click
Attention urgent         Repeated right button click
Amazed                   Right button hold
Discount                 Wheel flick
Happy                    Large slow single enclosed space
Sad                      Slow lateral unidirectional line
Angry                    Left and right button click
Aggressive               Repeated left and right button click

Each terminal may be trained by a particular user to attribute particular actions to emotions. In the training, an emotion type will be presented to the user and the user will manipulate the mouse in the way that the user wishes. Default options are provided in the event that the user does not wish to undergo the training operation.
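The trainable per-terminal mapping described above could be sketched as a dictionary seeded with defaults and overridden by training. The class, its method names and the default entries (drawn loosely from Table 4) are assumptions of this sketch, not part of the patent.

```python
# A few default bindings, loosely following Table 4; illustrative only.
DEFAULT_EMOTION_ACTIONS = {
    "slow transverse reciprocating line": "Agree",
    "fast transverse reciprocating line": "YES!",
    "slow lateral reciprocating line": "Disagree",
    "fast lateral reciprocating line": "NO!",
    "right button click": "Attention request",
    "right button hold": "Amazed",
}

class EmotionMapper:
    """Per-terminal action-to-emotion mapping with user training.
    Defaults apply until the user records their own action for an
    emotion; unrecognised actions fall back to 'Normal'."""

    def __init__(self):
        self.mapping = dict(DEFAULT_EMOTION_ACTIONS)

    def train(self, action, emotion):
        # Remove any earlier action bound to this emotion, then rebind.
        self.mapping = {a: e for a, e in self.mapping.items() if e != emotion}
        self.mapping[action] = emotion

    def lookup(self, action):
        return self.mapping.get(action, "Normal")
```

A training session would simply present each emotion type in turn and call `train` with whatever action the user performs.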
The images to be displayed to depict a particular emotion may be tailored by the user by selection from an image set or in other ways.
Different image sets may be provided for different cultures. For example, a western set, an oriental set or a middle eastern set may be provided to cater for the cultural heritage of the participants and to avoid offence being caused.
Once the image is selected, the expression processor 21 informs the video conference processor 17, which transmits the emotion image identification to the other participants in the conference in an information packet. The packet includes the identity of the user expressing the emotion, a timestamp with the local time and the time since the application was first linked to the conversation, the emotion image identifier, a suggested image set and the duration of the emotion. In some applications a URL of a custom image may be transmitted to select a previously used image.
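The fields of such an information packet could be carried in a structure like the one below. The patent names the fields but not their representation, so the field names, types and the JSON wire encoding are all assumptions of this sketch.

```python
import json
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class EmotionPacket:
    """Fields named in the description: sender identity, local time,
    time since the application joined the conversation, emotion image
    identifier, suggested image set, emotion duration and an optional
    URL of a custom image."""
    user_id: str
    local_time: float        # sender's local wall-clock time
    session_time: float      # seconds since joining the conversation
    emotion_id: str
    image_set: str
    duration_s: float
    custom_image_url: Optional[str] = None

    def encode(self) -> bytes:
        # JSON chosen purely for illustration of serialisation
        return json.dumps(asdict(self)).encode("utf-8")

    @classmethod
    def decode(cls, raw: bytes) -> "EmotionPacket":
        return cls(**json.loads(raw.decode("utf-8")))
```

A receiving node would decode the packet and hand the emotion identifier and image-set hint to its expression processor.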
Whilst in the described embodiment a video conference is established, the invention is applicable to other situations. For example, a conference call may be established and only the region 25 of the display is used to show the images representing the input emotion types. The first region will then be blank. This then caters for the situation where video cameras are not available.
In a similar manner, a simpler system is envisaged in which a telephone is associated with a PC which provides a mouse that is used to express a particular emotion type. The emotion type may then be displayed to similarly equipped participants via the PC display.
Another example is where the emotion type expressed is associated with the user's "presence" data that may be accessed by other users both prior to and within a communication.
In the described embodiment the emotions are displayed as icons; in alternative embodiments it may be possible to use digital images of the participants. The images will comprise a set of photographs of the participant in different applicable emotional states. These images may be transferred when communication is established.

Claims (8)

1. A communications system comprising two or more nodes where users at the nodes may communicate with users at the other nodes, at least one of the nodes comprising an emotion input device for user input of an emotion type; a transmitter for transmitting the input emotion type to the other of the nodes and an output device at the other nodes for outputting a representation of the emotion type to the user.
2. A system as claimed in claim 1 wherein the output device outputs a representation in the form of at least one of: an image, a smell, a force or other user perceivable output.
3. A system as claimed in claim 2 wherein the output device is a display which displays an image as a representation of the selected emotion type.
4. A system as claimed in claim 1 wherein the emotion input device enables a user to make a selection of emotion based on their body movements.
5. A system as claimed in claim 4 wherein the emotion input device comprises a computer mouse.
6. A system as claimed in any preceding claim wherein the transmitter transmits an identifier of the selected emotion.
7. A system as claimed in claim 6 wherein the transmitter transmits an identifier of an image set from which emotions are to be selected.
8. A communications system as hereinbefore described with reference to, and as illustrated by, the accompanying drawing.
GB0501393A 2005-01-22 2005-01-22 A system for communicating user emotion Withdrawn GB2422454A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB0501393A GB2422454A (en) 2005-01-22 2005-01-22 A system for communicating user emotion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0501393A GB2422454A (en) 2005-01-22 2005-01-22 A system for communicating user emotion

Publications (2)

Publication Number Publication Date
GB0501393D0 GB0501393D0 (en) 2005-03-02
GB2422454A true GB2422454A (en) 2006-07-26

Family

ID=34259536

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0501393A Withdrawn GB2422454A (en) 2005-01-22 2005-01-22 A system for communicating user emotion

Country Status (1)

Country Link
GB (1) GB2422454A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070139516A1 (en) * 2005-09-30 2007-06-21 Lg Electronics Inc. Mobile communication terminal and method of processing image in video communications using the same
EP3229477A4 (en) * 2014-12-03 2018-05-23 Sony Corporation Information processing apparatus, information processing method, and program

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5974262A (en) * 1997-08-15 1999-10-26 Fuller Research Corporation System for generating output based on involuntary and voluntary user input without providing output information to induce user to alter involuntary input
US6190314B1 (en) * 1998-07-15 2001-02-20 International Business Machines Corporation Computer input device with biosensors for sensing user emotions
WO2002003172A2 (en) * 2000-06-30 2002-01-10 Immersion Corporation Chat interface with haptic feedback functionality
US20020194006A1 (en) * 2001-03-29 2002-12-19 Koninklijke Philips Electronics N.V. Text to visual speech system and method incorporating facial emotions
US20020197967A1 (en) * 2001-06-20 2002-12-26 Holger Scholl Communication system with system components for ascertaining the authorship of a communication contribution
US20030159567A1 (en) * 2002-10-18 2003-08-28 Morton Subotnick Interactive music playback system utilizing gestures
US20040013252A1 (en) * 2002-07-18 2004-01-22 General Instrument Corporation Method and apparatus for improving listener differentiation of talkers during a conference call
EP1460588A2 (en) * 2003-03-19 2004-09-22 Matsushita Electric Industrial Co., Ltd. Videophone Terminal
WO2004088960A1 (en) * 2003-03-31 2004-10-14 British Telecommunications Public Limited Company Sensory output devices

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5974262A (en) * 1997-08-15 1999-10-26 Fuller Research Corporation System for generating output based on involuntary and voluntary user input without providing output information to induce user to alter involuntary input
US6190314B1 (en) * 1998-07-15 2001-02-20 International Business Machines Corporation Computer input device with biosensors for sensing user emotions
WO2002003172A2 (en) * 2000-06-30 2002-01-10 Immersion Corporation Chat interface with haptic feedback functionality
US20020194006A1 (en) * 2001-03-29 2002-12-19 Koninklijke Philips Electronics N.V. Text to visual speech system and method incorporating facial emotions
US20020197967A1 (en) * 2001-06-20 2002-12-26 Holger Scholl Communication system with system components for ascertaining the authorship of a communication contribution
US20040013252A1 (en) * 2002-07-18 2004-01-22 General Instrument Corporation Method and apparatus for improving listener differentiation of talkers during a conference call
US20030159567A1 (en) * 2002-10-18 2003-08-28 Morton Subotnick Interactive music playback system utilizing gestures
EP1460588A2 (en) * 2003-03-19 2004-09-22 Matsushita Electric Industrial Co., Ltd. Videophone Terminal
WO2004088960A1 (en) * 2003-03-31 2004-10-14 British Telecommunications Public Limited Company Sensory output devices

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070139516A1 (en) * 2005-09-30 2007-06-21 Lg Electronics Inc. Mobile communication terminal and method of processing image in video communications using the same
EP3229477A4 (en) * 2014-12-03 2018-05-23 Sony Corporation Information processing apparatus, information processing method, and program
US10721525B2 (en) 2014-12-03 2020-07-21 Sony Corporation Information processing device, information processing method, and program
US11218768B2 (en) 2014-12-03 2022-01-04 Sony Corporation Information processing device, information processing method, and program

Also Published As

Publication number Publication date
GB0501393D0 (en) 2005-03-02

Similar Documents

Publication Publication Date Title
JP4199665B2 (en) Rich communication via the Internet
US20180160075A1 (en) Automatic Camera Selection
US20170302709A1 (en) Virtual meeting participant response indication method and system
AU2004248274B2 (en) Intelligent collaborative media
EP3649588A1 (en) Virtual meeting participant response indication method and system
CN113741765B (en) Page jump method, device, equipment, storage medium and program product
US20030177444A1 (en) System for describing markup language for mobile use, and information processing apparatus and program for generating display content
WO2015142600A1 (en) Stop recording and send using a single action
CN107085495A (en) A kind of information displaying method, electronic equipment and storage medium
CN103747308A (en) Method and system for controlling smart television with analog keys, and mobile terminal
US20150264312A1 (en) Highlighting Unread Messages
WO2021218555A1 (en) Information display method and apparatus, and electronic device
TW201145152A (en) Flash content navigation method, mobile electronic device, and computer-readable medium
KR102078209B1 (en) Avatar visual conversion apparatus expressing text message into V-moji and message conversion method
US20150264305A1 (en) Playback of Interconnected Videos
KR101967998B1 (en) Method for creating moving image based key input, and user device for performing the method
GB2422454A (en) A system for communicating user emotion
KR100946914B1 (en) Video call device supporting whiteboard function and operating method of the same
KR20140089069A (en) user terminal device for generating playable object and method thereof
CN114095611A (en) Incoming call display interface processing method and device and electronic equipment
JP2005348144A (en) Information terminal device, method and program for shared media data exhibition
KR102607377B1 (en) Content recommendation method and user terminal
KR102619340B1 (en) Method and user terminal of providing contents to user
JP2024050636A (en) Content recommendation method and user device
KR20230024174A (en) Method and user terminal of providing emoticon to user

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)