CN112950398A - Online social contact method and device based on emotion recognition and storage medium - Google Patents

Online social contact method and device based on emotion recognition and storage medium

Info

Publication number
CN112950398A
CN112950398A
Authority
CN
China
Prior art keywords
emotion
data
user
chat
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110239315.1A
Other languages
Chinese (zh)
Inventor
王化
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Technology University
Original Assignee
Shenzhen Technology University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Technology University filed Critical Shenzhen Technology University
Priority to CN202110239315.1A priority Critical patent/CN112950398A/en
Publication of CN112950398A publication Critical patent/CN112950398A/en
Pending legal-status Critical Current

Classifications

    • G06Q50/01 Social networking (G PHYSICS > G06 COMPUTING; CALCULATING OR COUNTING > G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES > G06Q50/00 Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism)
    • G06F3/013 Eye tracking input arrangements (G PHYSICS > G06 COMPUTING; CALCULATING OR COUNTING > G06F ELECTRIC DIGITAL DATA PROCESSING > G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit > G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer > G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality)
    • G06F3/015 Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection (same hierarchy as G06F3/013 up to G06F3/011)

Abstract

The invention discloses an online social method, device, and storage medium based on emotion recognition. The emotion of the local chat user is recognized from the user's physiological characteristic data, and emotion expression content corresponding to that emotion, such as emotion expression animations and emotion expression dynamic text, is sent in real time to the chat client used by the opposite chat user for display. The animations and dynamic text render different emotions and moods in corresponding dynamic forms: when the chat user's mood is high, they move more cheerfully, change faster, grow larger, and use brighter colors; when the mood is low, they move more heavily, change more slowly, shrink, and use darker colors. By expressing the user's emotion visually through animations and dynamic text, both chat parties can reliably perceive each other's emotions, and the effectiveness of their chat interaction is significantly improved.

Description

Online social contact method and device based on emotion recognition and storage medium
Technical Field
The invention relates to the technical field of electronics, in particular to an online social contact method and device based on emotion recognition and a storage medium.
Background
With the continuous development of internet technology, online chat applications have become a primary tool for instant messaging. However, compared with face-to-face communication, online chat is a much more limited channel and often fails to convey basic information such as the user's emotion. In practice, if the two chat parties cannot accurately perceive each other's emotions, the chat content may be misunderstood and the interaction becomes less effective.
Disclosure of Invention
The embodiments of the present invention mainly aim to provide an online social method, device, and storage medium based on emotion recognition, which can at least solve the problem in the related art that chat content is misunderstood and chat interaction is ineffective because the two chat parties cannot accurately perceive each other's emotions.
In order to achieve the above object, a first aspect of the embodiments of the present invention provides an online social method based on emotion recognition, where the method includes:
acquiring physiological characteristic data of a local chat user through a sensor;
determining a corresponding user emotion recognition result based on the physiological characteristic data;
acquiring corresponding emotion expression content according to the user emotion recognition result; the emotion expression content comprises at least one of emotion expression animation and emotion expression dynamic characters;
and sending the emotion expression content to a chat client used by an opposite-end chat user in real time.
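The four steps above can be sketched as a minimal pipeline. This is an illustrative sketch only: all function names, the `EmotionContent` structure, and the threshold values are assumptions introduced here for clarity, not part of the patent.

```python
from dataclasses import dataclass

@dataclass
class EmotionContent:
    kind: str        # "animation" or "dynamic_text"
    emotion: str     # content type, e.g. "excited", "sad"
    speed: float     # display attributes
    size: float
    brightness: float

def recognize_emotion(gsr: float, emg: float) -> str:
    # Toy stand-in for the arousal/valence analysis: high GSR is treated
    # as high arousal, high EMG as positive valence (assumed thresholds).
    if gsr > 0.5:
        return "excited" if emg > 0.5 else "afraid"
    return "relaxed" if emg > 0.5 else "sad"

def select_content(emotion: str) -> EmotionContent:
    # Map the recognition result to emotion expression content:
    # high-arousal emotions get faster, larger, brighter content.
    lively = emotion in ("excited", "afraid")
    return EmotionContent("animation", emotion,
                          speed=2.0 if lively else 0.5,
                          size=1.5 if lively else 0.8,
                          brightness=1.0 if lively else 0.4)

def chat_pipeline(gsr: float, emg: float) -> EmotionContent:
    # Steps: acquire sensor data -> recognize emotion -> select content
    # (the real-time send to the peer chat client is omitted here).
    return select_content(recognize_emotion(gsr, emg))
```

For example, `chat_pipeline(0.9, 0.9)` yields excited, fast, bright animation content, while `chat_pipeline(0.1, 0.1)` yields slow, dark content for a sad user.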
In order to achieve the above object, a second aspect of the embodiments of the present invention provides an online social device based on emotion recognition, including:
the acquisition module is used for acquiring physiological characteristic data of the local chat user through the sensor;
a determining module for determining a corresponding user emotion recognition result based on the physiological characteristic data;
the acquisition module is used for acquiring corresponding emotion expression content according to the user emotion recognition result; the emotion expression content comprises at least one of emotion expression animation and emotion expression dynamic characters;
and the sending module is used for sending the emotion expression content to the chat client used by the opposite-end chat user in real time.
To achieve the above object, a third aspect of embodiments of the present invention provides an electronic apparatus, including: a processor, a memory, and a communication bus;
the communication bus is used for realizing connection communication between the processor and the memory;
the processor is configured to execute one or more programs stored in the memory to implement the steps of any one of the above-mentioned online social methods.
To achieve the above object, a fourth aspect of the embodiments of the present invention provides a computer-readable storage medium storing one or more programs, which are executable by one or more processors to implement the steps of any one of the above-mentioned online social methods.
According to the online social method, device, and storage medium based on emotion recognition provided by the embodiments of the invention, physiological characteristic data of the local chat user is acquired through a sensor; a corresponding user emotion recognition result is determined based on the physiological characteristic data; corresponding emotion expression content is obtained according to the recognition result; and the emotion expression content is sent in real time to the chat client used by the opposite chat user for display. In this way, the emotion of the local chat user is recognized from physiological data, and emotion expression content such as animations and dynamic text corresponding to that emotion is displayed to the opposite user in real time. Emotion information is thus conveyed through animations and dynamic text, both chat parties can reliably perceive each other's emotions, and the effectiveness of their chat interaction is significantly improved.
Other features and corresponding effects of the present invention are set forth in the following portions of the specification, and it should be understood that at least some of the effects are apparent from the description of the present invention.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a basic flowchart of an online social method according to a first embodiment of the present invention;
fig. 2 is an emotion recognition map provided in a first embodiment of the present invention;
FIG. 3 is a diagram of an emotional expression animation according to a first embodiment of the invention;
fig. 4 is a schematic diagram illustrating a GSR data change according to a first embodiment of the present invention;
FIG. 5 is a schematic diagram illustrating program modules of an online social device according to a second embodiment of the present invention;
fig. 6 is a schematic structural diagram of an electronic device according to a third embodiment of the invention.
Detailed Description
In order to make the objects, features and advantages of the present invention more obvious and understandable, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The first embodiment:
To solve the problem in the related art that chat content is misunderstood and chat interaction is ineffective because the two chat parties cannot accurately perceive each other's emotions, this embodiment provides an online social method based on emotion recognition. Fig. 1 shows the basic flow of the method provided in this embodiment, which includes the following steps:
step 101, collecting physiological characteristic data of a local chat user through a sensor.
Specifically, linguistic studies indicate that text chat conveys only 20% to 30% of the information exchanged in a face-to-face conversation, which demonstrates the importance of non-verbal information such as emotion; emotion is essential to human cognition and affects many aspects of human life. For example, when we are happy our perception tends to favor pleasant events, and the reverse holds for negative emotions. Online chat would therefore benefit from an understanding of the other party's emotional state: with that information, the system can interact with the user more effectively.
In practical applications, the user wears a specific sensor during the chat; the sensor performs digital sampling, and data capture can be implemented with a module written in Visual C++.
In an optional implementation of this embodiment, the physiological characteristic data may be galvanic skin response data and muscle electrical activity data. The galvanic skin response (GSR) data is collected by a GSR instrument, which records changes in the skin's electrical response as a continuous waveform; the muscle electrical activity data can be collected by an electromyograph, which records muscle bioelectric signals.
In another optional implementation of this embodiment, the physiological characteristic data may instead be gaze tracking data, which may include pupil dilation data, eye movement data, and the like. In practice, an image sensor captures images of the user's eyes, and an image recognition algorithm processes consecutive eye images to extract the gaze tracking data.
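Once consecutive pupil-diameter and gaze-position samples have been extracted from the eye images, simple attention and interest proxies can be derived from them. The following is a minimal sketch under assumed conventions (baseline-relative dilation, Manhattan gaze travel); the function name and metrics are illustrative, not from the patent.

```python
def gaze_metrics(pupil_mm: list, gaze_xy: list) -> dict:
    """Derive simple attention/interest proxies from consecutive eye samples.

    pupil_mm: pupil diameters (mm) per frame; gaze_xy: (x, y) gaze points.
    """
    # Pupil dilation relative to the first (baseline) sample: dilation is
    # often read as a proxy for interest/arousal.
    baseline = pupil_mm[0]
    dilation = (pupil_mm[-1] - baseline) / baseline
    # Total gaze travel between frames: small travel suggests fixation,
    # i.e. focused attention on one spot.
    travel = sum(abs(x2 - x1) + abs(y2 - y1)
                 for (x1, y1), (x2, y2) in zip(gaze_xy, gaze_xy[1:]))
    return {"pupil_dilation": dilation, "gaze_travel": travel}
```

A sequence whose pupil diameter grows while gaze travel stays small would then be read as rising interest with sustained attention.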
And 102, determining a corresponding emotion recognition result of the user based on the physiological characteristic data.
Specifically, in practical applications, when the emotion of the user changes, the physiological characteristics of the user generally change, and thus the present embodiment can adaptively sense the emotion of the user through the physiological characteristic data.
In the present embodiment, specific implementations of determining the corresponding emotion recognition result of the user based on the physiological characteristic data include, but are not limited to, the following two types:
determining corresponding emotional arousal data based on galvanic skin response data and corresponding emotional valence data based on galvanic muscle response data; and determining a corresponding emotion recognition result of the user by combining the emotional arousal data and the emotional valence data.
Specifically, since GSR data contains less noise than blood volume pulse (BVP) data, this embodiment uses a GSR sensor to detect the emotional arousal data. The GSR sensor, which measures skin conductance (SC), is attached to the middle and index fingers of the user's non-dominant hand, and the signal can be recorded with a ProComp+ device. SC varies roughly linearly with the overall level of arousal, increasing with anxiety and stress. In this embodiment, emotional arousal data can be detected by analyzing the peaks and troughs of the GSR data.
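The peak-and-trough analysis mentioned above can be sketched as follows. This is a simplified illustration assuming a pre-sampled GSR sequence; the scoring formula (mean peak height above the lowest trough) is an assumption, not the patent's specified method.

```python
def arousal_from_gsr(samples: list) -> float:
    """Estimate arousal from a GSR sample sequence via peaks and troughs.

    More and larger peaks relative to the troughs yield a higher score.
    """
    # A local maximum strictly above both neighbors counts as a peak.
    peaks = [samples[i] for i in range(1, len(samples) - 1)
             if samples[i - 1] < samples[i] > samples[i + 1]]
    troughs = [samples[i] for i in range(1, len(samples) - 1)
               if samples[i - 1] > samples[i] < samples[i + 1]]
    if not peaks:
        return 0.0
    # Mean peak-to-trough amplitude as a crude arousal score.
    base = min(troughs) if troughs else min(samples)
    return sum(p - base for p in peaks) / len(peaks)
```

A flat signal scores 0.0, while a signal with sharp spikes scores higher, matching the observation that SC rises with anxiety and stress.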
Fig. 2 shows an emotion recognition graph provided in this embodiment, in which a horizontal axis represents emotion valence data, a vertical axis represents emotion awakening data, and elements in each quadrant represent different types of emotion recognition results, including excitement, relaxation, sadness, fear, and the like.
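The quadrant lookup described for Fig. 2 can be expressed as a short classifier. The quadrant-to-label assignment below (high arousal with positive valence = excitement, and so on) is inferred from the four example emotions named in the description and is an assumption about the figure's exact layout.

```python
def classify_emotion(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) point onto the quadrants of the Fig. 2
    emotion map: valence on the horizontal axis, arousal on the vertical."""
    if arousal >= 0:
        return "excitement" if valence >= 0 else "fear"
    return "relaxation" if valence >= 0 else "sadness"
```

Combining this with the two measurements gives the full recognition step: GSR-derived arousal picks the upper or lower half, EMG-derived valence picks the left or right half.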
Determining corresponding attention characterization data and interest characterization data based on the gaze tracking data; and determining a corresponding emotion recognition result of the user by combining the attention characterization data and the interest characterization data.
Specifically, this embodiment reacts to the user's gaze information in real time, recording eye fixation and pupil dilation data during the online chat. Eye movements can reveal the user's interest and attention; by detecting attention, interest, and similar information from real-time eye movement data, non-verbal information is enriched, and gaze and eye movement information can support emotion-related inference.
And 103, acquiring corresponding emotion expression content according to the emotion recognition result of the user.
In this embodiment, the emotion expression content includes at least one of an emotion expression animation and an emotion expression dynamic text, and fig. 3 is a schematic diagram of the emotion expression animation provided in this embodiment. The emotion expression content of the embodiment is adapted to the current emotional state of the user, and can be used for expressing more detailed information including the emotion of the user, different from the plain text information.
In an optional implementation manner of this embodiment, the step of obtaining corresponding emotion expression content according to the emotion recognition result of the user specifically includes: determining a corresponding content type and a content display attribute according to a user emotion recognition result; and acquiring corresponding emotion expression content according to the content type and the content display attribute.
Specifically, the content types of the emotion expression content may include joy, anger, sadness, happiness, and the like, and the content display attributes may include speed of change, size, color, and the like. This embodiment determines the display attributes of the emotion expression animation and dynamic text from the emotional arousal data, and determines their specific type from the emotional valence data. In practice, for example, when the GSR data rises sharply and peaks, the generated animation or dynamic text moves faster, grows larger, and uses brighter colors; when the GSR data falls, the generated content changes more slowly and its colors darken.
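The mapping from arousal to display attributes can be sketched as a simple linear interpolation. The attribute ranges below are illustrative assumptions chosen to match the qualitative behavior described (faster/larger/brighter at high arousal, slower/darker at low arousal), not values from the patent.

```python
def display_attributes(arousal: float) -> dict:
    """Map normalized arousal in [0, 1] to animation display attributes.

    High arousal (e.g. a GSR spike) gives faster, larger, brighter content;
    low arousal gives slower changes and darker colors.
    """
    a = max(0.0, min(1.0, arousal))      # clamp to [0, 1]
    return {
        "speed":      0.5 + 1.5 * a,     # playback-speed multiplier
        "size":       0.8 + 0.7 * a,     # relative size of the content
        "brightness": 0.3 + 0.7 * a,     # 0 = dark, 1 = full brightness
    }
```

The content type (joy, sadness, and so on) would be chosen separately from the valence data; these attributes only control how vividly that content is rendered.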
And step 104, sending the emotion expression content to the chat client used by the opposite-end chat user in real time.
Specifically, this embodiment uses emotion expression content to describe and convey the users' emotions to both chat parties, transmitting mood and subtle cues during online chat; since both parties can effectively perceive each other's emotions, their engagement in the chat increases and the interaction becomes more effective. In practice, the user can create an emotion expression animation or dynamic text via buttons, shortcut keys, or markers embedded in the text message, and embed the animation at a specific position (for example, the front) of the message to convey emotion accurately, such as "<Happy> I'm happy!", where <Happy> denotes the emotion expression animation shown in the first row, first column of Fig. 3, which fully conveys the user's pleasant mood through animation. Different types of emotion expression animations can be used flexibly for different application scenarios; for example, when the user's emotion is sad, the <Sad> animation shown in the second row, third column of Fig. 3 can be used.
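The marker convention above ("<Happy>" embedded in front of the message text) can be parsed on the receiving client with a small helper. The exact marker syntax and the function name are assumptions for illustration; the patent only describes markers embedded in the text message.

```python
import re
from typing import Optional, Tuple

# Assumed convention: an emotion tag such as "<Happy>" at the front of the
# message selects the emotion expression animation to display.
MARKER_RE = re.compile(r"^<(\w+)>\s*(.*)$")

def parse_message(raw: str) -> Tuple[Optional[str], str]:
    """Split a chat message into (emotion marker, plain text).

    Returns (None, raw) when the message carries no marker.
    """
    m = MARKER_RE.match(raw)
    if m:
        return m.group(1), m.group(2)
    return None, raw
```

The receiving client would then look up the returned marker ("Happy", "Sad", ...) in its animation set and render the animation before the plain text.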
It should be further noted that a usability study was also conducted on a chat client implementing this online social method, with users of both this embodiment's chat client and a conventional chat client as subjects. The test subjects were randomly assigned to the two chat systems; they were in the same building but never met. They were divided into three pairs and conversed separately with their partners about topics such as school lessons, each conversation lasting over an hour. Afterwards, the subjects answered a questionnaire evaluating the chat system. First, the GSR data was examined against the subjects' responses: when subjects were asked which parts of the chat they enjoyed most, their answers were compared with the GSR results, and there was a good correlation between the GSR data and the tension the users reported. Subjects reported gradually focusing their attention on the conversation, and the GSR showed a matching trend. As shown in Fig. 4, which plots the GSR data recorded once per minute, the GSR increased as the subjects became more engaged in the conversation. This indicates that GSR can be used to track emotion changes in real time during an online conversation. The results also suggest that affective information increases the subjects' engagement: emotional information from the chat partner gave users the feeling that they were exchanging not only text but also their own feelings, which made them participate more in the conversation.
According to the online social method based on emotion recognition provided by this embodiment, physiological characteristic data of the local chat user is collected through a sensor; a corresponding user emotion recognition result is determined based on the physiological characteristic data; corresponding emotion expression content is obtained according to the recognition result; and the emotion expression content is sent in real time to the chat client used by the opposite chat user for display. In this way, the emotion of the local chat user is recognized from physiological data, and emotion expression content such as animations and dynamic text corresponding to that emotion is displayed to the opposite user in real time, so that both chat parties reliably perceive each other's emotions and the effectiveness of their chat interaction is significantly improved.
Second embodiment:
To solve the problem in the related art that chat content is misunderstood and chat interaction is ineffective because the two chat parties cannot accurately perceive each other's emotions, this embodiment provides an online social device based on emotion recognition. As shown in fig. 5, the online social device of this embodiment includes:
the acquisition module 501 is used for acquiring physiological characteristic data of the local chat user through a sensor;
a determining module 502 for determining a corresponding user emotion recognition result based on the physiological characteristic data;
an obtaining module 503, configured to obtain corresponding emotion expression content according to the user emotion recognition result; wherein the emotion expression content comprises at least one of an emotion expression animation and emotion expression dynamic text;
and a sending module 504, configured to send the emotion expression content to a chat client used by an opposite-end chat user in real time.
In some embodiments of this embodiment, the physiological characteristic data is galvanic skin response data and galvanic muscle response data. Correspondingly, the determining module 502 is specifically configured to: determining corresponding emotional arousal data based on the galvanic skin response data and corresponding emotional valence data based on the galvanic muscle response data; and determining a corresponding emotion recognition result of the user by combining the emotional arousal data and the emotional valence data.
In other embodiments of this embodiment, the physiological characteristic data is gaze tracking data. Correspondingly, the determining module 502 is specifically configured to: determining respective attention characterization data and interest characterization data based on the gaze tracking data; and determining a corresponding emotion recognition result of the user by combining the attention characterization data and the interest characterization data.
In some implementations of this embodiment, the obtaining module 503 is specifically configured to: determine a corresponding content type and content display attribute according to the user emotion recognition result; and obtain corresponding emotion expression content according to the content type and the content display attribute.
It should be noted that, the online social method based on emotion recognition in the foregoing embodiments may be implemented based on the online social device based on emotion recognition provided in this embodiment, and it may be clearly understood by a person having ordinary skill in the art that, for convenience and brevity of description, the specific working process of the online social device described in this embodiment may refer to the corresponding process in the foregoing method embodiments, and details are not described here.
By adopting the online social device based on emotion recognition of this embodiment, physiological characteristic data of the local chat user is acquired through a sensor; a corresponding user emotion recognition result is determined based on the physiological characteristic data; corresponding emotion expression content is obtained according to the recognition result; and the emotion expression content is sent to the chat client used by the opposite chat user for display. In this way, the emotion of the local chat user is recognized from physiological data, and emotion expression content such as animations and dynamic text corresponding to that emotion is displayed to the opposite user in real time, so that both chat parties reliably perceive each other's emotions and the effectiveness of their chat interaction is significantly improved.
The third embodiment:
the present embodiment provides an electronic device, as shown in fig. 6, which includes a processor 601, a memory 602, and a communication bus 603, wherein: the communication bus 603 is used for realizing connection communication between the processor 601 and the memory 602; the processor 601 is configured to execute one or more computer programs stored in the memory 602 to implement at least one step of the online social method in the first embodiment.
The present embodiments also provide a computer-readable storage medium including volatile or non-volatile, removable or non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, computer program modules or other data. Computer-readable storage media include, but are not limited to, RAM (Random Access Memory), ROM (Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), flash Memory or other Memory technology, CD-ROM (Compact disk Read-Only Memory), Digital Versatile Disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computer.
The computer-readable storage medium in this embodiment may be used for storing one or more computer programs, and the stored one or more computer programs may be executed by a processor to implement at least one step of the method in the first embodiment.
The present embodiment also provides a computer program, which can be distributed on a computer readable medium and executed by a computing device to implement at least one step of the method in the first embodiment; and in some cases at least one of the steps shown or described may be performed in an order different than that described in the embodiments above.
The present embodiments also provide a computer program product comprising a computer readable means on which a computer program as shown above is stored. The computer readable means in this embodiment may include a computer readable storage medium as shown above.
It will be apparent to those skilled in the art that all or some of the steps of the methods, systems, functional modules/units in the devices disclosed above may be implemented as software (which may be implemented in computer program code executable by a computing device), firmware, hardware, and suitable combinations thereof. In a hardware implementation, the division between functional modules/units mentioned in the above description does not necessarily correspond to the division of physical components; for example, one physical component may have multiple functions, or one function or step may be performed by several physical components in cooperation. Some or all of the physical components may be implemented as software executed by a processor, such as a central processing unit, digital signal processor, or microprocessor, or as hardware, or as an integrated circuit, such as an application specific integrated circuit.
In addition, communication media typically embodies computer readable instructions, data structures, computer program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media as known to one of ordinary skill in the art. Thus, the present invention is not limited to any specific combination of hardware and software.
The foregoing is a more detailed description of embodiments of the present invention, and the present invention is not to be considered limited to such descriptions. For those skilled in the art to which the invention pertains, several simple deductions or substitutions can be made without departing from the spirit of the invention, and all shall be considered as belonging to the protection scope of the invention.

Claims (10)

1. An online social method based on emotion recognition, comprising:
acquiring physiological characteristic data of a local chat user through a sensor;
determining a corresponding user emotion recognition result based on the physiological characteristic data;
acquiring corresponding emotion expression content according to the user emotion recognition result; the emotion expression content comprises at least one of emotion expression animation and emotion expression dynamic characters;
and sending the emotion expression content to a chat client used by an opposite-end chat user in real time.
2. The online social method according to claim 1, wherein the physiological characteristic data comprise galvanic skin response data and electromyography data;
the determining a corresponding user emotion recognition result based on the physiological characteristic data comprises:
determining corresponding emotional arousal data based on the galvanic skin response data, and determining corresponding emotional valence data based on the electromyography data; and
determining the corresponding user emotion recognition result by combining the emotional arousal data and the emotional valence data.
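The arousal/valence combination in this claim resembles a circumplex-style mapping: arousal (derived from the skin response) and valence (derived from the muscle response) together select an emotion label. The quadrant thresholds and label names below are illustrative assumptions, since the claim does not specify them.

```python
def classify_emotion(arousal: float, valence: float) -> str:
    """Combine arousal (e.g., from galvanic skin response) and valence
    (e.g., from muscle electrical activity) into a coarse emotion label.
    Thresholds and labels are assumptions for illustration only."""
    if valence >= 0:
        # Positive valence: high arousal suggests excitement, low suggests calm.
        return "excited" if arousal >= 0.5 else "calm"
    # Negative valence: high arousal suggests anger, low suggests sadness.
    return "angry" if arousal >= 0.5 else "sad"
```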
3. The online social method according to claim 1, wherein the physiological characteristic data are gaze tracking data;
the determining a corresponding user emotion recognition result based on the physiological characteristic data comprises:
determining corresponding attention characterization data and interest characterization data based on the gaze tracking data; and
determining the corresponding user emotion recognition result by combining the attention characterization data and the interest characterization data.
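The claim does not define how attention and interest are derived from gaze data. The sketch below assumes two common eye-tracking measures purely for illustration: mean fixation duration as an attention indicator, and dwell-time ratio within a hypothetical region of interest as an interest indicator.

```python
def attention_and_interest(fixations):
    """Derive simple attention and interest indicators from gaze fixations.

    `fixations` is a list of (x, y, duration_ms) tuples with x, y in
    normalized screen coordinates [0, 1]. Both metrics are assumptions:
    the claim only requires that some attention and interest data be
    determined from the gaze tracking data."""
    if not fixations:
        return 0.0, 0.0
    total = sum(d for _, _, d in fixations)
    attention = total / len(fixations)  # mean fixation duration (ms)
    # Hypothetical region of interest: the left half of the screen.
    roi_dwell = sum(d for x, _, d in fixations if x < 0.5)
    interest = roi_dwell / total        # dwell-time ratio inside the ROI
    return attention, interest
```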
4. The online social method according to any one of claims 1 to 3, wherein the obtaining corresponding emotion expression content according to the user emotion recognition result comprises:
determining a corresponding content type and a content display attribute according to the user emotion recognition result; and
obtaining the corresponding emotion expression content according to the content type and the content display attribute.
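The two-step selection in this claim (recognition result → content type and display attribute → content) can be sketched as follows. The intensity-driven choice between animation and dynamic text, and the intensity-scaled display size, are illustrative assumptions; the claim does not fix any particular mapping.

```python
def select_content(emotion: str, intensity: float) -> dict:
    """Claim 4 sketch: derive a content type and a display attribute from
    the emotion recognition result, then package the content selection.

    `intensity` in [0, 1] is a hypothetical strength of the recognized
    emotion; stronger emotions get an animation rendered at a larger scale."""
    content_type = "animation" if intensity > 0.5 else "dynamic_text"
    display = {"scale": round(0.5 + intensity, 2)}  # display attribute: render size
    return {"type": content_type, "emotion": emotion, "display": display}
```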
5. An online social device based on emotion recognition, comprising:
a collection module, configured to collect physiological characteristic data of a local chat user through a sensor;
a determining module, configured to determine a corresponding user emotion recognition result based on the physiological characteristic data;
an obtaining module, configured to obtain corresponding emotion expression content according to the user emotion recognition result, wherein the emotion expression content comprises at least one of an emotion expression animation and emotion expression dynamic text; and
a sending module, configured to send the emotion expression content in real time to a chat client used by a peer chat user.
6. The online social device according to claim 5, wherein the physiological characteristic data comprise galvanic skin response data and electromyography data;
the determining module is specifically configured to: determine corresponding emotional arousal data based on the galvanic skin response data, and determine corresponding emotional valence data based on the electromyography data; and determine the corresponding user emotion recognition result by combining the emotional arousal data and the emotional valence data.
7. The online social device according to claim 5, wherein the physiological characteristic data are gaze tracking data;
the determining module is specifically configured to: determine corresponding attention characterization data and interest characterization data based on the gaze tracking data; and determine the corresponding user emotion recognition result by combining the attention characterization data and the interest characterization data.
8. The online social device according to any one of claims 5 to 7, wherein the obtaining module is specifically configured to: determine a corresponding content type and a content display attribute according to the user emotion recognition result; and obtain the corresponding emotion expression content according to the content type and the content display attribute.
9. An electronic device, comprising: a processor, a memory, and a communication bus;
the communication bus is used for realizing connection communication between the processor and the memory;
the processor is configured to execute one or more programs stored in the memory to implement the steps of the online social method as claimed in any one of claims 1 to 4.
10. A computer readable storage medium, characterized in that the computer readable storage medium stores one or more programs which are executable by one or more processors to implement the steps of the online social method according to any one of claims 1 to 4.
CN202110239315.1A 2021-03-04 2021-03-04 Online social contact method and device based on emotion recognition and storage medium Pending CN112950398A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110239315.1A CN112950398A (en) 2021-03-04 2021-03-04 Online social contact method and device based on emotion recognition and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110239315.1A CN112950398A (en) 2021-03-04 2021-03-04 Online social contact method and device based on emotion recognition and storage medium

Publications (1)

Publication Number Publication Date
CN112950398A 2021-06-11

Family

ID=76247593

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110239315.1A Pending CN112950398A (en) 2021-03-04 2021-03-04 Online social contact method and device based on emotion recognition and storage medium

Country Status (1)

Country Link
CN (1) CN112950398A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103782253A * 2011-09-09 2014-05-07 Qualcomm Incorporated Transmission of emotions as haptic feedback
CN105345818A * 2015-11-04 2016-02-24 Shenzhen Haoweilai Intelligent Technology Co., Ltd. 3D video interaction robot with emotion module and expression module
CN107464188A * 2017-06-23 2017-12-12 Zhejiang University Internet social networking application system based on Internet-of-Things emotion sensing technology
CN107562186A * 2017-07-14 2018-01-09 Huaqiao University 3D campus guide method performing affective computing based on attention recognition
TW201813584A * 2016-10-11 2018-04-16 National Taipei University of Nursing and Health Sciences Brainwave emotion detecting device
CN110503027A * 2019-08-20 2019-11-26 Hefei Lingji Xiya Electronic Technology Co., Ltd. Children's interest and character analysis system and method based on data collection and analysis
CN110998566A * 2017-06-30 2020-04-10 PCMS Holdings, Inc. Method and apparatus for generating and displaying 360-degree video based on eye tracking and physiological measurements
CN112908066A * 2021-03-04 2021-06-04 Shenzhen Technology University Online teaching implementation method and device based on sight tracking and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
CHEN ZHENG ET AL.: "Real-time environmental emotional experience evaluation based on wearable sensors", Chinese Landscape Architecture, vol. 34, no. 03, pages 12-17 *

Similar Documents

Publication Publication Date Title
Nakasone et al. Emotion recognition from electromyography and skin conductance
US9646046B2 (en) Mental state data tagging for data collected from multiple sources
US9934425B2 (en) Collection of affect data from multiple mobile devices
Dobs et al. Identity information content depends on the type of facial movement
JP2015505087A (en) Evaluation of advertising effectiveness based on emotion
US11430561B2 (en) Remote computing analysis for cognitive state data metrics
US20130189661A1 (en) Scoring humor reactions to digital media
Perusquía-Hernández et al. Human perception and biosignal-based identification of posed and spontaneous smiles
Lee et al. Using physiological recordings for studying user experience: case of conversational agent-equipped TV
US20130052621A1 (en) Mental state analysis of voters
CN112950398A (en) Online social contact method and device based on emotion recognition and storage medium
CN112613364A (en) Target object determination method, target object determination system, storage medium, and electronic device
US20220036481A1 (en) System and method to integrate emotion data into social network platform and share the emotion data over social network platform
WO2022183424A1 (en) Emotion recognition-based online social method and apparatus, and storage medium
Camerini et al. Exploring the Emotional Experience During Instant Messaging Among Young Adults: An Experimental Study Incorporating Physiological Correlates of Arousal
WO2022183423A1 (en) Online teaching implementation method and apparatus based on gaze tracking, and storage medium
Soyel et al. Towards an affect sensitive interactive companion
Santuber et al. Using body signals and facial expressions to study the norms that drive creative collaboration
Shahab et al. Using electroencephalograms to interpret and monitor the emotions
Carofiglio et al. A Rough BCI-based Assessment of User's Emotions for Interface Adaptation: Application to a 3D-Virtual-Environment Exploration Task.
Cierro et al. Eye-tracking for Sense of Immersion and Linguistic Complexity in the Skyrim Game: Issues and Perspectives
Kim et al. Mediating individual affective experience through the emotional photo frame
WO2014106216A1 (en) Collection of affect data from multiple mobile devices
US11119572B2 (en) Selective display of objects based on eye gaze attributes
US11429188B1 (en) Measuring self awareness utilizing a mobile computing device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination