WO2024080970A1 - Emotion state monitoring - Google Patents

Emotion state monitoring

Info

Publication number
WO2024080970A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
input
emotional state
peripheral devices
peripheral
Prior art date
Application number
PCT/US2022/046231
Other languages
French (fr)
Inventor
I-Chen Lin
Isaac Lagnado
Chung-Chun Chen
Original Assignee
Hewlett-Packard Development Company, L.P.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hewlett-Packard Development Company, L.P.
Priority to PCT/US2022/046231
Publication of WO2024080970A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00Data switching networks
    • H04L12/02Details
    • H04L12/16Arrangements for providing special services to substations
    • H04L12/18Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
    • H04L12/1813Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms
    • H04L12/1822Conducting the conference, e.g. admission, detection, selection or grouping of participants, correlating users to one or more conference sessions, prioritising transmission
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00Data switching networks
    • H04L12/02Details
    • H04L12/16Arrangements for providing special services to substations
    • H04L12/18Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
    • H04L12/1813Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms
    • H04L12/1827Network arrangements for conference optimisation or adaptation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/2866Architectures; Arrangements
    • H04L67/30Profiles
    • H04L67/306User profiles
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/50Network services
    • H04L67/535Tracking the activity of the user
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/011Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns

Definitions

  • the non-transitory machine-readable medium 440 stores instructions 444 executable by a processing resource to match the stored behavioral input to the emotional state of the user.
  • the processing resource will use the received input from the peripheral devices to determine the emotional state of the user.
  • the computing device can learn the emotional state tied to specific behaviors, as it relates to using peripheral devices, by monitoring the user's use of the peripheral devices and prompting the user with questions based on that use. When similar behavior with the peripheral devices is repeated, the processing resource can match the current behavior using the peripheral devices with the learned behavior to determine the emotional state of the user.
  • the non-transitory machine-readable medium 440 stores instructions 445 executable by a processing resource to determine the emotional state of the user based on stored behavioral inputs and the input from the plurality of peripheral devices.
  • the processing resource will match current input from peripheral devices to stored input from peripheral devices to determine the emotional state of the user. Once determined, the emotional state will be sent to other participants of the virtual meeting. That is, the other participants of the virtual meeting will get an indication of the emotional state of the user in the conference screen.
  • the non-transitory machine-readable medium 440 stores instructions 446 executable by a processing resource to update the emotional state matched to the behavioral input responsive to receiving additional behavioral input.
  • the indication of the emotional state of the user may be updated as the emotional state of the user changes. For example, if the input from the peripheral devices shows that the user is happy and later in the virtual meeting the input determines that the user is angry or sad, the indication of the emotional state of the user will be adjusted on the conference screen.
  • the non-transitory machine-readable medium 440 stores instructions 447 executable by a processing resource to analyze input from the plurality of peripheral devices from a first user and input from a plurality of peripheral devices from a second user to determine the quality of the virtual meeting.
  • the input from each peripheral device connected to each computing device participating in the virtual meeting may be used to determine the quality of the meeting.
  • the processing resource can record the number of participants in the virtual meeting and determine how many participants are active in the virtual meeting via the peripheral devices connected to each computing device.
  • the processing resource for each computing device participating in the virtual meeting will receive input related to the use of the peripheral devices connected to the respective computing device.
  • the processing resource of each computer can use the input to determine if the owner of the particular computing device actively participated in the virtual meeting. For example, each processing resource can determine how many times the owner of the particular computing device talked into the microphone peripheral device, how often the owner typed on the keyboard peripheral device, if the camera peripheral device was turned on, and/or if the mouse peripheral device was used for the virtual meeting.
  • the non-transitory machine-readable medium 440 stores instructions 448 executable by a processing resource to indicate the quality of the virtual meeting on a display.
  • the information determined by each processing resource will be shared with the processing resources of the other computing devices that participated in the virtual meeting. The shared information can then be used to determine the quality of the meeting.
  • If most participants of the virtual meeting actively participated, the meeting would be determined to have a high quality. However, if only a few participants of the virtual meeting actively participated, then the virtual meeting would be determined to have a low quality. That is, the number of attendees (e.g., the number of people participating in the virtual meeting), the number of inputs from a peripheral device from each attendee (e.g., how many times each participant used a peripheral device during the virtual meeting for meeting purposes), the length of the input (e.g., how long the participant used the peripheral device), and the type of input (e.g., camera peripheral device, microphone peripheral device, keyboard peripheral device, mouse peripheral device, etc.) can be used to determine the quality of the meeting (see the sketch following this list).
  • the quality of the virtual meeting can be displayed on the conference screen via the display to inform the participants of the meeting quality.
  • the quality of the virtual meeting can actively change during the meeting. That is, if participation lessens during the meeting, the quality will start to fall. Conversely, if participation in the virtual meeting begins to increase, the quality of the meeting will increase.
  • the indication of the quality of the meeting will be updated as the quality changes.
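As the bullet points above suggest, meeting quality can be derived from the number of attendees, how many inputs each attendee provided, how long those inputs lasted, and which peripheral devices were used. The following is only an illustrative sketch of one such scoring scheme; the activity thresholds, field names, and quality labels are assumptions and do not come from the disclosure.

```python
def meeting_quality(participants: list[dict]) -> str:
    """Estimate virtual-meeting quality from per-participant peripheral activity.
    Each dict is a hypothetical summary: count of peripheral inputs, total
    seconds of input, and whether the camera peripheral device was turned on."""
    if not participants:
        return "low"

    def was_active(p: dict) -> bool:
        return p["input_count"] >= 5 or p["input_seconds"] >= 60 or p["camera_on"]

    active_ratio = sum(was_active(p) for p in participants) / len(participants)
    if active_ratio >= 0.75:
        return "high"
    if active_ratio >= 0.4:
        return "medium"
    return "low"

# Hypothetical activity summaries shared by three computing devices.
participants = [
    {"input_count": 12, "input_seconds": 300, "camera_on": True},
    {"input_count": 2,  "input_seconds": 15,  "camera_on": False},
    {"input_count": 7,  "input_seconds": 90,  "camera_on": True},
]
print(meeting_quality(participants))  # 2 of 3 active -> "medium"
```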

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Examples described herein relate to systems and devices consistent with the disclosure. For instance, a computing device comprises a plurality of peripheral devices and a non-transitory machine-readable medium storing instructions executable by a processing resource to receive input from the plurality of peripheral devices, analyze the received input from the plurality of peripheral devices for emotional cues, determine an emotional state of a user based on the analyzed input from the plurality of peripheral devices, and display the determined emotional state on a display device.

Description

EMOTION STATE MONITORING
BACKGROUND
[0001] Computing systems, such as notebooks, desktops, etc., can communicate with other electronic devices during a telecommunication session. The computing system may activate different applications to access a telecommunication session.
Components, such as microphones, of the computing systems can be used during the telecommunication session. The microphone may be used to communicate with other electronic devices. Electronic devices coupled to the computing system may transmit information to and from the computing system at various times.
BRIEF DESCRIPTION OF THE DRAWINGS
[0002] Figure 1 illustrates an example of a system including a plurality of peripheral devices for emotion state monitoring.
[0003] Figure 2 illustrates an example of an apparatus that includes a processing resource and memory resource for emotion state monitoring.
[0004] Figure 3 illustrates an example of a system including a plurality of systems each including a computing device and a plurality of peripheral devices for emotion state monitoring.
[0005] Figure 4 illustrates an example of a system that includes a non-transitory machine-readable medium including instructions for emotion state monitoring.
DETAILED DESCRIPTION
[0006] Systems described herein can be used to detect the emotional state of a user using a computing device during a virtual meeting. The system can include peripheral devices that connect to the computing device. The peripheral devices can be used during the virtual meeting and when the computing device is activated. That is, the peripheral devices may assist the user in communicating with the computing device and others while using the computing device.
[0007] In some systems, the computing device would rely on a camera peripheral device to determine the emotional state of the user. For example, the camera would use facial detection to determine the mood of the person using the computing device. However, relying only on the camera peripheral device can provide only a limited understanding of the emotional state of the user. Hence, the system may not be able to adequately determine the emotional state of the user when the user is using the computing device.
[0008] As such, systems, as described herein, include computing devices that are able to determine the emotional state of a person using a plurality of peripheral devices and stored behavior of the user. For example, the computing device can include a plurality of peripheral devices and a processing resource to receive input from the plurality of peripheral devices. The processing resource can analyze the received input from the peripheral devices and determine the emotional state of the user based on the received input. In some examples, the received input may show behavior that can lead to a determination of the emotional state of the user.
[0009] Notably, such systems improve the quality of detecting the emotional state of the user, as compared to systems that rely solely on one peripheral device (e.g., a camera peripheral device) to determine the emotional state of the user. That is, using a plurality of peripheral devices to detect the emotional state of the user would allow for a more accurate determination of the emotional state of the user.
[0010] Figure 1 illustrates an example of a system 100 including a plurality of peripheral devices 108 for emotion state monitoring. System 100 can include a variety of devices, such as a computing device 102, a display 104, and peripheral devices 108 (peripheral devices 108-1, 108-2, 108-3 can be referred to as peripheral devices 108). In some examples, the computing device 102 can be a desktop computer, a portable computer, an all-in-one (AIO) computer, a tablet, a mobile phone, an Internet of Things (IoT) device, or a phablet, etc. In some cases, the display 104 can be combined with the computing device 102 (e.g., portable computer, AIO, tablet, mobile phone, phablet, etc.). The computing device 102 can initiate a variety of applications that utilize different components of the computing device 102. For example, the computing device 102 can initiate an audio/video telecommunication application that uses peripheral devices 108 to communicate with another computing device.
[0011] In some examples, when the computing device 102 initiates an application, the peripheral devices 108 can aid a user in communicating with the computing device 102 and other computing devices. As used herein, an “application” refers to a collection of instructions and data that instructs a computing-related device how to execute specific tasks. As used herein, a “peripheral device” refers to a device used to input information to and output information from a computing device. For example, a peripheral device can be a camera, microphone, keyboard, computer mouse, joystick, etc. In some examples, the application can cause a virtual meeting to initiate. The virtual meeting allows the computing device 102 to communicate with other computing devices. The peripheral devices 108 allow a user of the computing device 102 to communicate with the users of other computing devices during a virtual meeting.
[0012] In some examples, the computing device can determine the emotional state of the user. For example, a processing resource of the computing device 102 can cause behavioral information related to the user, in reference to the use of the peripheral devices 108 by the user, to be stored on a memory resource. The information stored on the memory resource can provide insight into the emotional state of the user based on the behavior of the user. Said differently, the processing resource can cause emotional cues from the previous interactions of the user with the peripheral devices 108 to be stored on a memory resource for later access. That is, the processing resource will cause behavioral inputs from peripheral devices 108 to be stored and evaluated for emotional cues. For example, the processing resource can cause the volume, speed, tone, intonation, context, and/or content of a talking user to be evaluated for emotional cues and then store the evaluation on the memory resource. Similarly, the processing resource can evaluate the way a user types, when using a keyboard peripheral device 108-3, for emotional cues. That is, the processing resource can cause the typing pressure, typing speed, and/or typing accuracy to be evaluated and store the evaluation on the memory resource.
[0013] This allows the processing resource to match current behaviors of the user when using the peripheral devices 108 to past behaviors of the user using the peripheral devices 108 and the emotion associated with the behavior. As used herein, “emotional cue” refers to facial expressions, body movement, tone of voice, and/or the action of a person that indicates the emotion of the person. In some examples, the processing resource will store the standard behavior of the user when using the peripheral device. The stored standard behavior (e.g., behavior when the user is in a neutral mood) may be used to determine when the user is excited or sad. For instance, if the use of the peripheral devices 108 by the user is not standard, the processing resource can determine that the user is not in a neutral state and can use other behavioral cues to determine which emotional state the user is in. For example, the processing resource may compare the stored standard behavior of the user to received input from the peripheral device to determine the emotional state of the user.
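Paragraph [0013] describes comparing stored standard (neutral) behavior to received input. The following sketch illustrates one way that comparison could look for a keyboard peripheral device; the feature names and the tolerance value are assumptions made for illustration, not details taken from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class TypingSample:
    """Behavioral input from a keyboard peripheral device (hypothetical features)."""
    speed_wpm: float      # typing speed, words per minute
    accuracy: float       # fraction of correctly spelled words, 0.0-1.0
    key_pressure: float   # normalized key press force, 0.0-1.0

def deviates_from_baseline(current: TypingSample, baseline: TypingSample,
                           tolerance: float = 0.25) -> bool:
    """Return True when current behavior differs from the stored neutral
    baseline by more than the relative tolerance on any feature."""
    for name in ("speed_wpm", "accuracy", "key_pressure"):
        base = getattr(baseline, name)
        if base and abs(getattr(current, name) - base) / base > tolerance:
            return True
    return False

# Stored standard (neutral) behavior vs. behavior observed during a meeting.
neutral = TypingSample(speed_wpm=55, accuracy=0.97, key_pressure=0.4)
observed = TypingSample(speed_wpm=82, accuracy=0.83, key_pressure=0.7)
print(deviates_from_baseline(observed, neutral))  # True -> user is not in a neutral state
```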
[0014] In some examples, the computing device 102 uses a plurality of peripheral devices 108 to determine the emotion of the user. For example, the computing device 102 can analyze the interaction (e.g., behavior) of the user when using a camera peripheral device 108-1, a microphone peripheral device 108-2, a keyboard peripheral device 108-3, a computer mouse peripheral device 108-4, or a combination thereof. The analyzed interactions (e.g., learned behavior) will then be stored in the memory resource and the stored interactions will be applied to future actions, by the user, to determine the emotional state of the user. The computing device 102, via the processing resource, can continue to store interactions as the user continues to use the peripheral devices 108. The newly learned interaction behavior can be stored on the memory resource. The processing resource can access the stored interactions to determine how the user is feeling at the particular time when using the peripheral devices 108.
[0015] In some examples, when the user is on a virtual meeting, the computing device 102, via the processing resource, will analyze the current behavior of the user with the peripheral devices 108 and compare the current behavior with past behavior to determine the emotional state of the user. For example, information can be stored in the memory resource stating that if the camera peripheral device 108-1 detects a user not looking at the display device during a virtual meeting, the user is uninterested and/or distracted. Further, information could be stored in the memory resource stating that if the user has a specific facial expression (e.g., smile, frown, etc.), the user is happy, angry, or sad, depending on the facial expression.
[0016] The processing resource may notify other participants of the virtual meeting of the emotional state of the user. For example, once the emotional state of the user is determined, the processing resource can cause the emotional state of the user to appear on the display 104. Specifically, an indication of the emotional state of the user will appear around the profile 105 of the user in a conference screen 106 and other participants will be able to view the emotional state of the user. That is, the emotional state of the user may be displayed on each display connected to the virtual meeting via the profile 105 of the user. As used herein, “profile” refers to the graphical representation of a person in the form of a still image, characters (e.g., letters, numbers, etc.), and/or a video image.
[0017] The processing resource may analyze input from two or more peripheral devices to determine the current emotional state of the user. That is, the processing resource may collect data, individually, from the peripheral devices 108 and combine that data to determine the emotional state of the user. For example, data from the keyboard peripheral device 108-3, data from the computer mouse peripheral device 108-4, and data from the microphone peripheral device 108-2 can be used to determine the emotional state of the user. Using a plurality of devices to gauge the emotional state of the user can ensure that an accurate emotional state of the user is determined.
[0018] Figure 2 illustrates an example of an apparatus 220 that includes a processing resource 221 and memory resource 222 for emotion state monitoring. The processing resource 221 may be a hardware processing unit such as a microprocessor, application specific instruction set processor, coprocessor, network processor, application specific integrated circuit (ASIC), general purpose input output (GPIO), a central processing unit (CPU), or similar hardware circuitry that may cause machine-readable instructions to be executed. In some examples, the processing resource 221 may be a plurality of hardware processing units that may cause machine-readable instructions to be executed. The memory resource 222 may be any type of volatile or non-volatile memory or storage, such as random-access memory (RAM), flash memory, read-only memory (ROM), storage volumes, a hard disk, or a combination thereof. In some examples, the memory resource 222 may be a non-transitory machine-readable medium (e.g., the non-transitory machine-readable medium 440 of Figure 4).
[0019] The memory resource 222 may store instructions thereon, such as instructions 223, 224, 225, 226, and 227. When executed by the processing resource 221, the instructions 223, 224, 225, 226, and 227 may cause the apparatus 220 to perform specific tasks and/or functions. For example, the memory resource 222 may store instructions 223 which may be executed by the processing resource 221 to store behavioral input related to emotional cues. Behavioral input stored over time can lead to an understanding of which actions performed by the user are indicative of a specific emotional state. The processing resource 221 can store interactions and use the stored interactions to analyze current and/or future behavior to determine the emotional state of the user. That is, the processing resource 221 can receive behavioral input related to the user from peripheral devices and store the behavioral input. The behavioral input can be matched to an emotional state for the user, and the emotional state matched to a particular behavior can then be matched to future behavior by the user as it relates to the peripheral devices.
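Paragraph [0019] describes storing behavioral input, matching it to an emotional state, and matching that stored state to future behavior. A minimal sketch of such a store and lookup is shown below; the feature vector (typing speed, speaking volume, mouse click rate) and the nearest-match rule are illustrative assumptions rather than details from the disclosure.

```python
import math

# Each stored entry pairs a behavioral feature vector (e.g., typing speed,
# speaking volume, mouse click rate) with the emotional state it was matched to.
stored_behavior: list[tuple[list[float], str]] = [
    ([55.0, 60.0, 1.0], "neutral"),
    ([90.0, 78.0, 3.5], "angry"),
    ([70.0, 72.0, 2.0], "excited"),
]

def match_emotion(current: list[float]) -> str:
    """Match current behavioral input to the closest stored behavior and
    return the emotional state associated with that behavior."""
    def distance(a: list[float], b: list[float]) -> float:
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    closest = min(stored_behavior, key=lambda entry: distance(entry[0], current))
    return closest[1]

print(match_emotion([88.0, 80.0, 3.0]))  # "angry"
```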
[0020] For example, the processing resource 221 can cause information to be stored on the memory resource 222 stating that when the user types fast or types inaccurately (e.g., misspells words) on the keyboard peripheral device, the user is angry. The processing resource 221 can cause this information to be stored on the memory resource 222 and use the stored information to analyze later behavior of the user. In addition, the processing resource 221 can cause information to be stored on the memory resource 222 stating that when the user yells into the microphone peripheral device or has a high-pitched voice, the user is angry or excited. Moreover, the processing resource 221 can cause information to be stored on the memory resource 222 stating that when the user speaks calmly into the microphone, the user is in a neutral emotional state. That is, a particular tone and the emotion associated with the tone can be stored on the memory resource 222 and used to determine the emotional state of the user.
[0021] In some examples, if the user is conducting the virtual meeting on a smartphone, tablet, and/or phablet, the emotional state of the user can be determined based on the interaction with the smartphone, tablet, and/or phablet. For example, information can be stored in the memory resource 222, by the processing resource 221, stating that if the user swipes or touches the touch screen slowly, erratically, or in a smooth manner, the user could be upset, angry, or calm, respectively. That is, the interactions with the smartphone, tablet, and/or phablet and the associated emotion can be stored on the memory resource 222.
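Paragraphs [0020] and [0021] effectively describe behavior-to-emotion rules. A hedged sketch of such rules follows; every threshold value here is invented for illustration and is not taken from the disclosure.

```python
def classify_keyboard(speed_wpm: float, misspell_rate: float) -> str:
    """Fast or inaccurate typing is stored as indicating anger."""
    if speed_wpm > 90 or misspell_rate > 0.15:
        return "angry"
    return "neutral"

def classify_voice(volume_db: float, pitch_hz: float) -> str:
    """Yelling or a high-pitched voice maps to angry/excited; calm speech to neutral."""
    if volume_db > 75:
        return "angry"
    if pitch_hz > 300:
        return "excited"
    return "neutral"

def classify_touch(swipe_is_erratic: bool, swipe_speed: float) -> str:
    """Erratic, slow, or smooth touch-screen use maps to angry, upset, or calm."""
    if swipe_is_erratic:
        return "angry"
    if swipe_speed < 0.2:
        return "upset"
    return "calm"

print(classify_voice(volume_db=80, pitch_hz=220))  # "angry"
```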
[0022] The memory resource 222 may store instructions 224 which may be executed by the processing resource 221 to cause the apparatus 220 to receive input from a plurality of peripheral devices during a virtual meeting. In some examples, when a user uses the peripheral devices, the input describing the behavior behind the use is stored on the memory resource 222. As such, the processing resource 221 can receive current input from peripheral devices. The current input from the peripheral devices can assist the computing device in determining the emotional state of the user. In some examples, when the user enters a virtual meeting, the user will use peripheral devices to communicate with the other participants of the virtual meeting. For example, the user may type on the keyboard peripheral device, speak into the microphone peripheral device, click on items with the computer mouse peripheral device, and/or show video with the camera peripheral device during the virtual meeting. How the user uses these peripheral devices can be recorded and sent to the processing resource 221 for analysis. That is, as the user uses the peripheral devices, the processing resource 221 will receive input from the peripheral devices and the current input can be used to determine the emotional state of the user.
[0023] The memory resource 222 may store instructions 225 which may be executed by the processing resource 221 to cause the apparatus 220 to compare the stored behavioral input to the received input from the plurality of peripheral devices. As used herein, “stored behavioral input” refers to behavior from the use of a peripheral device analyzed for a specific emotional cue, stored in the memory resource, and used to determine the emotion behind future and/or current behavior related to the use of peripheral devices. In some examples, the stored behavioral input may be received during a virtual meeting. However, this disclosure is not so limited. For example, stored behavioral input may be received when a virtual meeting is not in session and the user is using the computing device. In some examples, the processing resource will compare the current input from the peripheral devices to the stored behavior from the peripheral device to determine the emotion of the user. Comparing the current input from the peripheral devices will allow the computing device to determine the emotion of the user at the time the input was created. That is, the processing resource 221 will analyze the received input from the peripheral devices by comparing it to stored behavior and the corresponding emotion matched to the stored behavior.
[0024] The stored behavior can be linked to specific emotions which can inform the processing resource of the emotional state of the user. In some examples, the processing resource 221 may have to analyze each input from the peripheral devices separately and then combine each analysis to determine the emotional state of the user. That is, the processing resource 221 can analyze input from the microphone peripheral device and the keyboard peripheral device to determine the emotional state of the user. For example, the processing resource 221 can compare and analyze stored input from the microphone peripheral device to the current input received from the microphone peripheral device and compare and analyze stored input from the keyboard peripheral device to the current input received from the keyboard peripheral device to determine the emotional state of the user. Hence, the processing resource 221 will combine the compared/analyzed input to determine the emotional state of the user.
[0025] The memory resource 222 may store instructions 226 which may be executed by the processing resource 221 to cause the apparatus 220 to adjust an emotional state indication of a user based on the compared stored behavioral input and the input from the plurality of peripheral devices. In some examples, the processing resource 221 will determine the emotional state of the user. That is, the emotional state of the user can change from time to time. For instance, the manner in which the user uses the peripheral devices can be an indication of the emotional state the user is in. When the emotional state of the user changes based on the use of the peripheral devices by the user, the processing resource 221 will adjust the known emotional state of the user.
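Paragraphs [0017] and [0024] describe analyzing each peripheral device's input separately and then combining the analyses. One plausible way to sketch that combination is a weighted vote over per-device estimates; the device names and weights below are assumptions, not part of the disclosure.

```python
from collections import Counter

def fuse_estimates(per_device: dict[str, str],
                   weights: dict[str, float] | None = None) -> str:
    """Combine per-peripheral emotional-state estimates into a single label
    by weighted voting; each device's input is analyzed separately first."""
    weights = weights or {}
    votes: Counter[str] = Counter()
    for device, label in per_device.items():
        votes[label] += weights.get(device, 1.0)
    return votes.most_common(1)[0][0]

# Hypothetical per-device results after comparing stored input to current input.
state = fuse_estimates(
    {"microphone": "angry", "keyboard": "angry", "mouse": "neutral"},
    weights={"microphone": 1.5},  # e.g., weight voice cues slightly more
)
print(state)  # "angry"
```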
[0026] In some examples, the processing resource 221 can set a threshold range for each emotional state determined by the behavioral inputs. For example, a threshold can be set for the speed at which a user types on a keyboard peripheral device, the volume at which the user speaks on the microphone peripheral device, the force with which the user touches the touchpad of a smart device, or the amount of force used on a smart pen when using a smart device, etc. For instance, if the typing speed, volume, or touch/smart pen force provided by a user exceeds a certain amount, the processing resource 221 can determine that a threshold range has been met and adjust the emotional state. That is, the processing resource can update the emotional status indicator responsive to the threshold range for the emotional state being reached and/or exceeded.
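A small sketch of the threshold-range check described in paragraph [0026]; the specific metrics, threshold values, and emotion labels are assumptions for illustration.

```python
# Hypothetical threshold ranges for behavioral inputs; reaching or exceeding a
# range triggers an update of the emotional-state indicator.
THRESHOLDS = {
    "typing_speed_wpm": (90.0, "angry"),
    "speaking_volume_db": (75.0, "angry"),
    "touch_force": (0.8, "frustrated"),
}

def check_thresholds(inputs: dict[str, float]) -> str | None:
    """Return the emotional state whose threshold was reached or exceeded,
    or None if the user stays within all threshold ranges."""
    for metric, value in inputs.items():
        limit, emotion = THRESHOLDS.get(metric, (float("inf"), ""))
        if value >= limit:
            return emotion
    return None

update = check_thresholds({"typing_speed_wpm": 95.0, "speaking_volume_db": 60.0})
if update is not None:
    print(f"update emotional status indicator to: {update}")  # "angry"
```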
[0027] The memory resource 222 may store instructions 227 which may be executed by the processing resource 221 to cause the apparatus 220 to transmit the emotional state to a second device. In some examples, the processing resource 221 will determine the emotional state of the user when the user is in a virtual meeting. Once the processing resource 221 determines the emotional state of the user, a notification is sent to the conference screen of the virtual meeting. This allows other participants of the virtual meeting to know the emotional state of the user. For example, the indication can be a border of a particular color around the profile of the user or words in a corner of the profile of the user. In some examples, the indication of the emotional state of the user can appear when a participant hovers a computer mouse over the profile of the user.
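Paragraph [0027] describes transmitting the determined emotional state as a notification that a second device renders, for example, as a colored border around the user's profile. The payload below is a hypothetical illustration; the field names and color mapping are not specified by the disclosure.

```python
import json

INDICATION_COLORS = {"happy": "green", "neutral": "gray", "angry": "red", "sad": "blue"}

def build_emotion_notification(user_id: str, emotional_state: str) -> str:
    """Build a JSON payload that a second device could use to draw a colored
    border around the user's profile on the conference screen."""
    return json.dumps({
        "user": user_id,
        "emotional_state": emotional_state,
        "border_color": INDICATION_COLORS.get(emotional_state, "gray"),
        "show_on_hover": True,   # optionally reveal the label on mouse hover
    })

print(build_emotion_notification("user-102", "happy"))
```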
[0028] Figure 3 illustrates an example of a system 301 including a plurality of systems 300 each including a computing device 302 and a plurality of peripheral devices 308 for emotion state monitoring. Figure 3 can include analogous or similar elements as Figure 1. For example, Figure 3 can include system 300, computing device 302, display 304, conference screen 306, camera peripheral device 308-1, microphone peripheral device 308-2, keyboard peripheral device 308-3, and computer mouse peripheral device 308-4.
[0029] In some examples, a plurality of systems 300-1, 300-2, and 300-3 can be communicatively connected via connection 303 (connection 303-1, 303-2, 303-3 can be referred to as connection 303). That is, each system 300 (system 300-1, 300-2, 300-3 can be referred to as system 300) can connect via connection 303 during a virtual meeting. The connection 303 between the plurality of systems 300-1, 300-2, and 300-3 can be a direct or indirect connection. That is, the connection can be via the internet (e.g., indirect) using various network interfaces, a local area network (LAN) (e.g., direct connection), a video conferencing platform (e.g., indirect) connected to the internet, etc. In some examples, each computing device 302-1, 302-2, 302-3 can store input based on the emotion of the user from the peripheral devices 308 (peripheral devices 308-1A, 308-2A, 308-3A, 308-4A, 308-1B, 308-2B, 308-3B, 308-4B, 308-1C, 308-2C, 308-3C, 308-4C can be referred to as peripheral devices 308) connected to the computing device 302-1, 302-2, 302-3. The computing device 302 can use the stored input to predict the emotional state of the user. For example, during a virtual meeting when a user uses peripheral devices 308 connected to the computing device 302 (computing device 302-1, 302-2, 302-3 can be referred to as computing devices 302), the processing resource will analyze the behavior and compare the current input with the stored input to determine the emotional state of the user of the particular computing device 302.
[0030] The processing resource of a computing device (e.g., 302-2) will send the emotional state of the user to the other computing devices (e.g., 302-1, 302-3) via connection (e.g., 303-1, 303-2). The emotional state of the user will appear on the conference screen for all of the participants of the virtual meeting to see. For example, the processing resource of the first computing device 302-1 can receive input from the peripheral devices 308-1A, 308-2A, 308-3A, 308-4A and compare the received input to stored input to determine the emotional state of the user. The processing resource will then send, via connections 303-1 and 303-3, the determined emotional state to the other computing devices 302-2 and 302-3 participating in the virtual meeting. The emotional state of the user of the first computing device 302-1 will appear on the conference screen 306 (conference screen 306-1, 306-2, 306-3 can be referred to as conference screens 306), via display 304-1, 304-2, 304-3, on all other computing devices 302 participating in the virtual meeting.
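The sketch below illustrates, with hypothetical names, how a first computing device might broadcast a determined emotional state to the other computing devices in the meeting; the Peer registry and the send callable stand in for the direct or indirect connection 303 described above.

```python
# Hypothetical sketch: one computing device broadcasting the determined emotional
# state to the other devices in the meeting. The peer registry and send() callable
# are stand-ins for the direct or indirect connection between systems.

from dataclasses import dataclass
from typing import Callable

@dataclass
class Peer:
    device_id: str
    send: Callable[[dict], None]  # transport over LAN, internet, or platform

def broadcast_state(sender_id: str, emotional_state: str, peers: list[Peer]) -> None:
    """Send the sender's emotional state to every other participating device."""
    message = {"from": sender_id, "emotional_state": emotional_state}
    for peer in peers:
        if peer.device_id != sender_id:
            peer.send(message)

# Example with print() standing in for the network transport.
broadcast_state("302-1", "happy", [Peer("302-2", print), Peer("302-3", print)])
```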
[0031] In some examples, the processing resource of each computing device 302 can receive input from each peripheral device 308 attached to the particular computing device 302. The input can be used to gauge the participation of the user in a virtual meeting. As described herein, the amount of participation can be sent, via connection 303, to each computing device 302, and the combined analysis of each user's participation can be used to determine the quality of the meeting. In some examples, the quality of the virtual meeting can be indicated on the conference screen 306 of each computing device 302 in the virtual meeting. In some examples, the virtual meeting may be on a separate device (e.g., a server), such that the plurality of systems 300 each send the data to the separate device (e.g., server) and the separate device (e.g., server) can send the data to each system 300.
[0032] Figure 4 illustrates an example of a system that includes a non-transitory machine-readable medium 440 including instructions for emotion state monitoring. A processing resource may execute instructions stored on the non-transitory machine-readable medium 440. The non-transitory machine-readable medium 440 may be any type of volatile or non-volatile memory or storage, such as random-access memory (RAM), flash memory, read-only memory (ROM), storage volumes, a hard disk, or a combination thereof.
[0033] The non-transitory machine-readable medium 440 stores instructions 443 executable by a processing resource to receive input from a plurality of peripheral devices connected to a computing device. In some examples, when a user is in a virtual meeting, the user may use peripheral devices to communicate with the other participants of the virtual meeting. For example, the user may use the keyboard to chat during the virtual meeting, a microphone to speak to other participants in the virtual meeting, a camera to show the video stream of the user, and/or a computer mouse to click on items relevant to the virtual meeting. As the user uses the peripheral devices, information related to how each device is being used may be sent to the processing resource. For example, the processing resource will receive information such as how fast the user is typing on the keyboard peripheral device, how many spelling errors the user is making when typing, the amount of pressure being exerted on the keys of the keyboard or the computer mouse buttons when being depressed, the volume at which the user is speaking into the microphone peripheral device, and/or the facial expression the user is making when the camera peripheral device is being used, etc. In some examples, the information related to how each device is being used may be received during a virtual meeting. However, this disclosure is not so limited. For example, the information related to how each device is being used may be received when a virtual meeting is not in session and the user is using the computing device.
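For illustration, the following sketch aggregates a few of the usage signals named above into a single behavioral record for the processing resource; the field names and the keyboard, microphone, and camera sources are assumptions, since the disclosure does not specify a data format.

```python
# Hypothetical sketch: collecting usage signals from several peripheral devices
# into a single behavioral record. Field names and sampling details are
# illustrative; a real driver stack would supply these values.

from dataclasses import dataclass, asdict

@dataclass
class PeripheralSample:
    typing_speed_wpm: float = 0.0     # keyboard
    spelling_errors: int = 0          # keyboard
    key_press_force_n: float = 0.0    # keyboard / mouse buttons
    speech_volume_db: float = 0.0     # microphone
    facial_expression: str = "none"   # camera-derived label

def collect_sample(keyboard, microphone, camera) -> dict:
    """Poll each peripheral source once and return a combined record."""
    return asdict(PeripheralSample(
        typing_speed_wpm=keyboard["wpm"],
        spelling_errors=keyboard["errors"],
        key_press_force_n=keyboard["force"],
        speech_volume_db=microphone["volume"],
        facial_expression=camera["expression"],
    ))

# Example with literal readings standing in for live peripheral data.
print(collect_sample(
    {"wpm": 92.0, "errors": 3, "force": 2.1},
    {"volume": 62.0},
    {"expression": "smiling"},
))
```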
[0034] The non-transitory machine-readable medium 440 stores instructions 444 executable by a processing resource to match the stored behavioral input to the emotional state of the user. In some examples, the processing resource will use the received input from the peripheral devices to determine the emotional state of the user. For example, the computing device can learn the emotional state tied to specific behaviors, as it relates to using peripheral devices, by monitoring the user's use of the peripheral devices and prompting questions for the user based on that use. When similar behavior with the peripheral devices is repeated, the processing resource can match the current behavior using the peripheral devices with the learned behavior to determine the emotional state of the user.
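One possible way to realize this learn-then-match behavior is sketched below: the user is prompted to label an observed episode, and later behavior is matched to the most similar stored episode; the prompting mechanism and similarity rule are assumptions for illustration.

```python
# Hypothetical sketch: learning a behavior-to-emotion mapping by asking the user
# to label observed episodes, then matching later behavior to the stored labels.
# The prompting mechanism and similarity rule are assumptions for illustration.

labeled_episodes: list[tuple[dict, str]] = []

def prompt_and_store(observed: dict, ask_user) -> None:
    """Ask the user how they felt during an observed episode and remember it."""
    label = ask_user("How were you feeling just now (e.g., happy, angry, sad)?")
    labeled_episodes.append((observed, label))

def match_behavior(current: dict) -> str | None:
    """Return the label of the most similar stored episode, if any exist."""
    if not labeled_episodes:
        return None
    def diff(episode: dict) -> float:
        return sum(abs(current[k] - episode[k]) for k in episode)
    _, label = min(labeled_episodes, key=lambda pair: diff(pair[0]))
    return label

# Example: a lambda stands in for an interactive prompt.
prompt_and_store({"typing_speed_wpm": 130.0}, ask_user=lambda q: "angry")
print(match_behavior({"typing_speed_wpm": 128.0}))  # "angry"
```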
[0035] The non-transitory machine-readable medium 440 stores instructions 445 executable by a processing resource to determine the emotional state of the user based on stored behavioral inputs and the input from the plurality of peripheral devices.
During a virtual meeting, the processing resource will match current input from peripheral devices to stored input from peripheral devices to determine the emotional state of the user. Once determined, the emotional state will be sent to the other participants of the virtual meeting. That is, the other participants of the virtual meeting will get an indication of the emotional state of the user on the conference screen.
[0036] The non-transitory machine-readable medium 440 stores instructions 446 executable by a processing resource to update the emotional state matched to the behavioral input responsive to receiving additional behavioral input. As described herein, once the emotional state of the user is determined, an indication of the emotional state of the user is shown on the conference screen. In some examples, the indication of the emotional state of the user may be updated as the emotional state of the user changes. For example, if the input from the peripheral devices shows that the user is happy and, later in the virtual meeting, the input indicates that the user is angry or sad, the indication of the emotional state of the user will be adjusted on the conference screen.
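A minimal sketch of such an update, assuming a hypothetical render_indicator hook on the conference screen, is shown below; the indicator is only re-rendered when the determined emotional state actually changes.

```python
# Hypothetical sketch: refreshing the on-screen indication only when the
# determined emotional state changes. render_indicator() is a stand-in for
# whatever the conference screen uses to draw the border or label.

class EmotionIndicator:
    def __init__(self, render_indicator):
        self._render = render_indicator
        self._current = None

    def update(self, new_state: str) -> None:
        """Re-render the indicator only if the emotional state has changed."""
        if new_state != self._current:
            self._current = new_state
            self._render(new_state)

indicator = EmotionIndicator(render_indicator=print)
indicator.update("happy")   # rendered
indicator.update("happy")   # no change, nothing rendered
indicator.update("angry")   # rendered
```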
[0037] The non-transitory machine-readable medium 440 stores instructions 447 executable by a processing resource to analyze input from the plurality of peripheral devices from a first user and input from a plurality of peripheral devices from a second user to determine the quality of the virtual meeting. In some examples, when the user is in a virtual meeting, the input from each peripheral device connected to each computing device participating in the virtual meeting may be used to determine the quality of the meeting. For example, the processing resource can record the number of participants in the virtual meeting and determine how many participants are active in the virtual meeting via the peripheral devices connected to each computing device. In some examples, the processing resource for each computing device participating in the virtual meeting will receive input related to the use of the peripheral devices connected to the respective computing device. The processing resource of each computer can use the input to determine if the owner of the particular computing device actively participated in the virtual meeting. For example, each processing resource can determine how many times the owner of the particular computing device talked into the microphone peripheral device, how often the owner typed on the keyboard peripheral device, if the camera peripheral device was turned on, and/or if the mouse peripheral device was used for the virtual meeting.

[0038] The non-transitory machine-readable medium 440 stores instructions 448 executable by a processing resource to indicate the quality of the virtual meeting on a display. In some examples, the information determined by each processing resource will be shared with the processing resources of the other computing devices that participated in the virtual meeting. The shared information can then be used to determine the quality of the meeting. For instance, if a majority of the participants actively participated in the meeting, the meeting would be determined to have a high quality. However, if only a few participants of the virtual meeting actively participated, then the virtual meeting would be determined to have a low quality. That is, the number of attendees (e.g., the number of people participating in the virtual meeting), the number of inputs from a peripheral device from each attendee (e.g., how many times each participant used a peripheral device during the virtual meeting for meeting purposes), the length of the input (e.g., how long the participant used the peripheral device), and the type of input (e.g., camera peripheral device, microphone peripheral device, keyboard peripheral device, mouse peripheral device, etc.) can be used to determine the quality of the meeting.
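As a non-limiting illustration of the quality determination described above, the sketch below counts how many attendees actively used a peripheral device and labels the meeting high or low quality by a simple majority rule; the per-attendee counters and the threshold are assumptions made for the example.

```python
# Hypothetical sketch: estimating virtual-meeting quality from how many attendees
# actively used their peripherals. The per-attendee counters and the simple
# majority rule are assumptions for illustration.

from dataclasses import dataclass

@dataclass
class Participation:
    spoke_count: int = 0        # times the microphone was used
    typed_count: int = 0        # keyboard/chat activity
    camera_on: bool = False
    mouse_clicks: int = 0

def was_active(p: Participation) -> bool:
    """An attendee counts as active if any peripheral shows meeting-related use."""
    return p.spoke_count > 0 or p.typed_count > 0 or p.camera_on or p.mouse_clicks > 0

def meeting_quality(attendees: list[Participation]) -> str:
    """Label the meeting high or low quality by the share of active attendees."""
    if not attendees:
        return "unknown"
    active = sum(1 for p in attendees if was_active(p))
    return "high" if active / len(attendees) > 0.5 else "low"

print(meeting_quality([
    Participation(spoke_count=4, camera_on=True),
    Participation(typed_count=2),
    Participation(),
]))  # "high"
```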
[0039] In some examples, once the quality of the meeting is determined, the quality of the virtual meeting can be displayed on the conference screen via the display to inform the participants of the meeting quality. The quality of the virtual meeting can actively change during the meeting. That is, if participation lessens during the meeting, the quality will start to fall. Conversely, if participation in the virtual meeting begins to increase, the quality of the meeting will increase. In some examples, the indication of the quality of the meeting will be updated as the quality changes.
[0040] The figures herein follow a numbering convention in which the first digit corresponds to the drawing figure number and the remaining digits identify an element or component in the drawing. Similar elements or components between different figures can be identified by the use of similar digits. For example, 102 can reference element “02” in Figure 1, and a similar element can be referenced as 302 in Figure 3.
[0041] Elements shown in the various figures herein can be capable of being added, exchanged, and/or eliminated so as to provide a number of additional examples of the disclosure. In addition, the proportion and the relative scale of the elements provided in the figures are intended to illustrate the examples of the disclosure and should not be taken in a limiting sense.
[0042] The above specification and examples provide a description of the method and applications and use of the system and method of the present disclosure. Since many examples can be made without departing from the scope of the system and method, this specification merely sets forth some of the many possible example configurations and implementations.
[0043] It should be understood that the descriptions of various examples may not be drawn to scale and thus, the descriptions can have a different size and/or configuration other than as shown therein.

Claims

What is claimed:
1. A computing device comprising:
    a plurality of peripheral devices; and
    a non-transitory machine-readable medium storing instructions executable by a processing resource to:
        receive input from the plurality of peripheral devices;
        analyze the received input from the plurality of peripheral devices for emotional cues;
        determine an emotional state of a user based on the analyzed input from the plurality of peripheral devices; and
        display the determined emotional state on a display device.
2. The computing device of claim 1, further comprising instructions to store a standard behavior of the user while using a peripheral device of the plurality of peripheral devices.
3. The computing device of claim 2, further comprising instructions to compare the stored standard behavior of the user to the received input from the peripheral device of the plurality of peripheral devices to determine the emotional state of the user.
4. The computing device of claim 1, wherein the plurality of peripheral devices includes a camera peripheral device, a microphone peripheral device, a keyboard peripheral device, and a computer mouse peripheral device.
5. The computing device of claim 1, wherein displaying the determined emotional state on the display device comprises displaying the emotional state on each display device connected to a virtual meeting.
6. A non-transitory machine-readable medium storing instructions executable by a processing resource to:
    store behavioral input related to emotional cues;
    receive input from a plurality of peripheral devices during a virtual meeting;
    compare the stored behavioral input to the received input from the plurality of peripheral devices;
    adjust an emotional state indication of a user based on the compared stored behavioral input and the input from the plurality of peripheral devices; and
    transmit the emotional state to a second device.
7. The medium of claim 6, further comprising instructions to match the stored behavioral input to the emotional state of the user.
8. The medium of claim 7, further comprising instructions to update the emotional state matched to the behavioral input responsive to receiving additional behavioral input.
9. The medium of claim 6, further comprising instructions to analyze input from the plurality of peripheral devices from a first user and input from a plurality of peripheral devices from a second user to determine a quality of the virtual meeting.
10. The medium of claim 9, further comprising instructions to indicate the quality of the virtual meeting on a display device.
11. A system comprising:
    a display device;
    a plurality of peripheral devices; and
    a non-transitory machine-readable medium storing instructions executable by a processing resource to:
        receive behavioral input related to a user;
        match the behavioral input to an emotional state of the user;
        receive input from the plurality of peripheral devices;
        determine the emotional state of the user by comparing the input from the plurality of peripheral devices, the behavioral input, and the matched emotional state; and
        update an emotion status indicator based on the determined emotional state.
12. The system of claim 11, further comprising instructions to analyze the input from the plurality of peripheral devices to determine a quality of a virtual meeting.
13. The system of claim 11, further comprising instructions to update the emotion status indicator by displaying a color associated with the emotional state on the display device.
14. The system of claim 11, further comprising instructions to set a threshold range for each emotional state determined by the behavioral input.
15. The system of claim 14, further comprising instructions to update the emotion status indicator responsive to the threshold range for the emotional state being reached and/or exceeded.

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2022/046231 WO2024080970A1 (en) 2022-10-11 2022-10-11 Emotion state monitoring

Publications (1)

Publication Number Publication Date
WO2024080970A1 (en) 2024-04-18

Family

ID=84329605

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130019187A1 (en) * 2011-07-15 2013-01-17 International Business Machines Corporation Visualizing emotions and mood in a collaborative social networking environment
US20210076002A1 (en) * 2017-09-11 2021-03-11 Michael H Peters Enhanced video conference management

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 22800936

Country of ref document: EP

Kind code of ref document: A1