WO2022183424A1 - Emotion recognition-based online social method and apparatus, and storage medium - Google Patents

Emotion recognition-based online social method and apparatus, and storage medium Download PDF

Info

Publication number
WO2022183424A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
user
emotion recognition
chat
emotional
Prior art date
Application number
PCT/CN2021/079035
Other languages
French (fr)
Chinese (zh)
Inventor
王化
Original Assignee
深圳技术大学
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳技术大学 (Shenzhen Technology University)
Priority to PCT/CN2021/079035 priority Critical patent/WO2022183424A1/en
Publication of WO2022183424A1 publication Critical patent/WO2022183424A1/en

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators

Definitions

  • the present invention relates to the field of electronic technology, and in particular, to an online social networking method, device and storage medium based on emotion recognition.
  • online chat applications have become the main tool for users to conduct instant communication.
  • online chat is far more limited than face-to-face communication, and it often fails to convey much basic information, such as user emotions.
  • if both parties in a chat cannot accurately know each other's emotions, the chat content cannot be accurately understood and the effectiveness of the chat interaction is poor.
  • the main purpose of the embodiments of the present invention is to provide an online social networking method, device and storage medium based on emotion recognition, which can at least solve the problem in the related art that the two chat parties cannot accurately know each other's emotions, so that the chat content cannot be accurately understood and the chat interaction is less effective.
  • a first aspect of the embodiments of the present invention provides an online social networking method based on emotion recognition, the method comprising:
  • the emotional expression content includes at least one of emotional expression animation and emotional expression dynamic text
  • the emotional expression content is sent to the chat client used by the peer chat user in real time.
  • a second aspect of the embodiments of the present invention provides an online social networking device based on emotion recognition, the device comprising:
  • a collection module, configured to collect the physiological characteristic data of the local chat user through a sensor
  • a determining module for determining a corresponding user emotion recognition result based on the physiological characteristic data
  • an acquisition module configured to acquire corresponding emotional expression content according to the user emotion recognition result; wherein, the emotional expression content includes at least one of emotional expression animation and emotional expression dynamic text;
  • a sending module configured to send the emotional expression content to the chat client used by the opposite chat user in real time.
  • a third aspect of the embodiments of the present invention provides an electronic device, the electronic device includes: a processor, a memory, and a communication bus;
  • the communication bus is used to realize the connection communication between the processor and the memory
  • the processor is configured to execute one or more programs stored in the memory to implement the steps of any one of the above-mentioned online social networking methods.
  • a fourth aspect of the embodiments of the present invention provides a computer-readable storage medium, where the computer-readable storage medium stores one or more programs, and the one or more programs can be executed by one or more processors to implement the steps of any one of the above-mentioned online social networking methods.
  • the physiological characteristic data of the local chat user is collected by a sensor; the corresponding user emotion recognition result is determined based on the physiological characteristic data; the corresponding emotional expression content is obtained according to the user emotion recognition result; and the emotional expression content is sent in real time to the chat client used by the peer chat user for display.
  • in this way, the emotion of the local chat user is identified based on the user's physiological characteristic data, and emotional expression content corresponding to that emotion, such as emotion expression animations and emotion expression dynamic text, is sent in real time to the chat client used by the peer chat user for display; conveying the chat user's emotional information through animation, dynamic text and the like effectively ensures that both chat parties know each other's emotions, which can significantly improve the effectiveness of their chat interaction.
  • FIG. 1 is a schematic diagram of a basic flow of an online social networking method provided by a first embodiment of the present invention
  • Fig. 2 is a kind of emotion recognition map provided by the first embodiment of the present invention.
  • FIG. 3 is a schematic diagram of an emotion expression animation provided by the first embodiment of the present invention.
  • FIG. 4 is a schematic diagram of a GSR data change provided by the first embodiment of the present invention.
  • FIG. 5 is a schematic diagram of a program module of an online social networking device provided by a second embodiment of the present invention.
  • FIG. 6 is a schematic structural diagram of an electronic device according to a third embodiment of the present invention.
  • FIG. 1 is a schematic diagram of the basic flow of the online socializing method provided by this embodiment.
  • the online socializing method provided by this embodiment includes the following steps:
  • Step 101: collect the physiological characteristic data of the local chat user through a sensor.
  • the physiological characteristic data may be galvanic skin response data and muscle electrical response data. The galvanic skin response data is collected by a galvanic skin response (GSR) apparatus, an instrument that records changes in the galvanic skin response in the form of a curve; an electromyograph can be used to collect the muscle electrical response data and record a muscle bioelectricity graph.
  • the physiological characteristic data may also be gaze tracking data, which may include pupil dilation data, eye movement data, and the like. Eye images are collected, and multiple eye images are then processed by an image recognition algorithm to obtain the gaze tracking data.
  • Step 102: determine a corresponding user emotion recognition result based on the physiological characteristic data.
  • as a user's emotional state changes, these physiological characteristics usually change accordingly; therefore, in this embodiment, the user's emotions can be sensed through the physiological characteristic data.
  • specific implementations for determining the corresponding user emotion recognition result based on the physiological feature data include but are not limited to the following two:
  • Mode 1: determine the corresponding emotional arousal data based on the galvanic skin response data, and determine the corresponding emotional valence data based on the muscle electrical response data; then combine the emotional arousal data and the emotional valence data to determine the corresponding user emotion recognition result.
  • this embodiment uses a GSR sensor to detect emotional arousal data: the GSR sensor, which measures skin conductance (SC), is attached to the middle finger and index finger of the user's non-dominant hand, and the signal can be recorded with a ProComp+ device. SC varies roughly linearly with overall arousal level and increases with anxiety and stress.
  • the emotional arousal data can be detected and obtained by analyzing the peaks and troughs of the GSR data.
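As a rough illustration of the peak-and-trough analysis described above, a minimal sketch (hypothetical, not taken from the patent) might score arousal as the conductance range within a sampling window, so that sudden GSR rises map to higher arousal:

```python
def estimate_arousal(gsr_samples):
    """Estimate an arousal score from a window of GSR (skin conductance) samples.

    Arousal is taken as the peak-to-trough range of the window, so a sudden
    conductance rise (stress, excitement) yields a higher score. The windowing
    and scoring rule are illustrative assumptions.
    """
    if len(gsr_samples) < 2:
        return 0.0
    return max(gsr_samples) - min(gsr_samples)
```

A real implementation would likely smooth the signal and detect individual peaks rather than use the raw range.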
  • Figure 2 shows an emotion recognition map provided by this embodiment, in which the horizontal axis represents emotional valence data, the vertical axis represents emotional arousal data, and the elements in each quadrant are different types of emotion recognition results, including excitement, relaxation, sadness, fear, etc.
  • the emotional valence data represents the user's emotional type
  • the negative valence emotions include fear, sadness, etc.
  • the positive valence emotions include happiness, excitement, etc.
  • the emotional arousal data indicates the intensity of the user's emotion under the corresponding emotion type.
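The quadrant layout of Figure 2 could be sketched as a simple classifier. The four labels below follow the examples named above (excitement, relaxation, fear, sadness); the zero thresholds on each axis are an assumption for illustration:

```python
def classify_emotion(valence, arousal):
    """Map (valence, arousal) coordinates to one of four example emotions,
    following a quadrant layout like Figure 2: positive valence with high
    arousal reads as excitement, with low arousal as relaxation; negative
    valence with high arousal reads as fear, with low arousal as sadness.
    """
    if valence >= 0:
        return "excitement" if arousal >= 0 else "relaxation"
    return "fear" if arousal >= 0 else "sadness"
```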
  • Mode 2: determine the corresponding attention representation data and interest representation data based on the gaze tracking data; then combine the attention representation data and the interest representation data to determine the corresponding user emotion recognition result.
  • this embodiment responds in real time to the user's gaze information, recording eye gaze and pupil dilation data during the online chat.
  • eye movements can reveal the user's interest and attention; detecting attention, interest, etc. from the user's real-time eye movement data enriches the non-verbal information, and gaze and eye movement information can facilitate emotion-related reasoning.
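One hypothetical way to turn raw gaze data into the attention and interest indicators mentioned above (the units, feature choices, and formulas are illustrative assumptions, not the patent's method):

```python
def gaze_features(fixation_durations_ms, pupil_diameters_mm):
    """Derive simple attention/interest indicators from eye-tracking data.

    A longer mean fixation duration is read as higher attention; pupil
    dilation relative to the start of the window is read as rising interest.
    Both mappings are illustrative assumptions.
    """
    attention = (sum(fixation_durations_ms) / len(fixation_durations_ms)
                 if fixation_durations_ms else 0.0)
    interest = (pupil_diameters_mm[-1] - pupil_diameters_mm[0]
                if len(pupil_diameters_mm) >= 2 else 0.0)
    return {"attention_ms": attention, "interest_mm": interest}
```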
  • Step 103: acquire corresponding emotional expression content according to the user emotion recognition result.
  • the emotion expression content includes at least one of emotion expression animation and emotion expression dynamic text.
  • FIG. 3 is a schematic diagram of an emotion expression animation provided in this embodiment.
  • the emotional expression content of this embodiment is adapted to the current emotional state of the user and, unlike plain text information, can be used to express more subtle information, including the user's emotions.
  • the above-mentioned step of obtaining corresponding emotional expression content according to the user emotion recognition result specifically includes: determining the corresponding content type and content display attributes according to the user emotion recognition result; and obtaining the corresponding emotional expression content according to the content type and content display attributes.
  • the content type of the emotional expression content may include: joy, anger, sorrow, happiness, etc.
  • the content display attributes may include: change speed, size, color, and the like.
  • the display attributes of the emotional expression animation and the emotional expression dynamic text can be determined according to the emotional arousal data, and their specific types can be determined according to the emotional valence data.
  • when the GSR data suddenly increases and peaks, the generated animation or dynamic text can have a higher speed, larger size and brighter color; when the GSR data decreases, the change speed of the generated emotional expression content correspondingly slows and its display color darkens.
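The arousal-driven display behaviour described above might be sketched as follows; the particular ranges chosen for speed, size and brightness are illustrative assumptions:

```python
def display_attributes(arousal, max_arousal=1.0):
    """Scale animation speed, size and colour brightness with arousal,
    mirroring the behaviour described above: a GSR spike yields faster,
    larger, brighter content, while falling GSR slows and darkens it.
    The numeric ranges are illustrative assumptions.
    """
    level = max(0.0, min(arousal / max_arousal, 1.0))  # clamp to [0, 1]
    return {
        "speed": 0.5 + 1.5 * level,       # playback-rate multiplier
        "size": 24 + int(24 * level),     # font/animation size in px
        "brightness": 0.4 + 0.6 * level,  # 0.4 (dim) .. 1.0 (bright)
    }
```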
  • Step 104: send the emotional expression content in real time to the chat client used by the peer chat user.
  • this embodiment uses emotional expression content to describe and express user emotions to both chat parties, conveying emotions and subtle information during online chatting. Since both parties can effectively perceive each other's emotions, their engagement in the chat can increase and the chat interaction becomes more effective.
  • users can trigger emotion expression animations or emotion expression dynamic text through buttons, shortcut keys or markers embedded in text messages, and can embed an emotion expression animation at a specific position (such as the front) of a text message to accurately convey emotions to the other chat party. For example, in "<Happy> I'm happy!", <Happy> denotes the emotion expression animation shown in the first row and first column of Figure 3, so that the user's joyful emotion is fully conveyed through this animation.
  • different types of emotion expression animations can be used flexibly in different application scenarios; for example, when the user's emotion is sadness, the <Sad> animation shown in the second row and third column of Figure 3 can be used.
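The marker syntax in the example above (an emotion tag such as <Happy> at the front of a message) could be parsed with a sketch like this; the regular expression and the tuple interface are assumptions for illustration, not the patent's format:

```python
import re

# Hypothetical marker syntax: an emotion tag such as "<Happy>" at the front
# of a text message selects the matching emotion-expression animation.
MARKER = re.compile(r"^<(?P<emotion>\w+)>\s*(?P<text>.*)$")

def parse_message(raw):
    """Split a chat message into (emotion tag or None, remaining text)."""
    m = MARKER.match(raw)
    if m:
        return m.group("emotion"), m.group("text")
    return None, raw
```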
  • this embodiment also conducted a usability study of the chat client implementing the online social networking method; the research subjects were users who used both the chat client of this embodiment and a traditional chat client.
  • subjects were randomly assigned to use both chat systems; they were in the same building but were not supposed to meet. They were divided into three pairs, and each pair talked individually for more than an hour about topics such as school lessons. After the conversation, subjects answered questionnaires and rated the chat systems.
  • this embodiment examined the GSR data together with the subjects' responses: subjects were asked when they felt they enjoyed chatting the most, and their responses were compared with the GSR results. There is a good correlation between the GSR data and user-reported tension.
  • FIG. 4 shows a schematic diagram of the changes in the GSR data provided in this embodiment (the graph records the user's GSR every minute). As the subjects focused more on the dialogue, the GSR increased. This suggests that GSR can be used to judge changes in emotion in online conversations in real time. The findings also suggest that emotional information may increase subjects' engagement in the conversation: emotional messages from chat partners give users the feeling that they are not only exchanging text messages but also communicating their feelings with each other, which enables them to be more involved in the conversation.
  • the physiological characteristic data of the local chat user is collected by a sensor; the corresponding user emotion recognition result is determined based on the physiological characteristic data; the corresponding emotional expression content is obtained according to the user emotion recognition result; and the emotional expression content is sent in real time to the chat client used by the peer chat user for display.
  • in this way, the emotion of the local chat user is identified based on the user's physiological characteristic data, and emotional expression content corresponding to that emotion, such as emotion expression animations and emotion expression dynamic text, is sent in real time to the chat client used by the peer chat user for display; conveying the chat user's emotional information through animation, dynamic text and the like effectively ensures that both chat parties know each other's emotions, which can significantly improve the effectiveness of their chat interaction.
  • this embodiment provides an online social networking device based on emotion recognition. Referring to FIG. 5, the online social networking device in this embodiment includes:
  • the collection module 501 is used for collecting the physiological characteristic data of the chat user at the local end through the sensor;
  • a determination module 502 configured to determine a corresponding user emotion recognition result based on the physiological characteristic data
  • the obtaining module 503 is configured to obtain the corresponding emotional expression content according to the user emotion recognition result; wherein, the emotional expression content includes at least one of emotion expression animation and emotion expression dynamic text;
  • the sending module 504 is configured to send the emotional expression content to the chat client used by the opposite chat user in real time.
  • the physiological characteristic data are galvanic skin response data and electrical muscle response data.
  • the determining module 502 is specifically configured to: determine the corresponding emotional arousal data based on the galvanic skin response data, and determine the corresponding emotional valence data based on the electrical muscle response data; and determine the corresponding user based on the emotional arousal data and the emotional valence data. Emotion recognition results.
  • the physiological feature data is gaze tracking data.
  • the determining module 502 is specifically configured to: determine the corresponding attention representation data and interest representation data based on the gaze tracking data; and determine the corresponding user emotion recognition result in combination with the attention representation data and the interest representation data.
  • the obtaining module 503 is specifically configured to: obtain the corresponding content type and content display attribute according to the user emotion recognition result; obtain the corresponding emotion expression content according to the content type and content display attribute.
  • the online social networking methods based on emotion recognition in the foregoing embodiments can all be implemented by the online social networking device based on emotion recognition provided in this embodiment. Those of ordinary skill in the art can clearly understand that, for convenience and brevity of description, reference may be made to the corresponding process in the foregoing method embodiments for the specific working process of the online social networking device described in this embodiment, which will not be repeated here.
  • the physiological characteristic data of the local chat user is collected through a sensor; the corresponding user emotion recognition result is determined based on the physiological characteristic data; the corresponding emotional expression content is obtained according to the user emotion recognition result; and the emotional expression content is sent in real time to the chat client used by the peer chat user for display.
  • in this way, the emotion of the local chat user is identified based on the user's physiological characteristic data, and emotional expression content corresponding to that emotion, such as emotion expression animations and emotion expression dynamic text, is sent in real time to the chat client used by the peer chat user for display; conveying the chat user's emotional information through animation, dynamic text and the like effectively ensures that both chat parties know each other's emotions, which can significantly improve the effectiveness of their chat interaction.
  • this embodiment provides an electronic device. As shown in FIG. 6, it includes a processor 601, a memory 602, and a communication bus 603, wherein the communication bus 603 is used to implement connection and communication between the processor 601 and the memory 602, and the processor 601 is configured to execute one or more computer programs stored in the memory 602 to implement at least one step of the online social networking method in the first embodiment.
  • this embodiment also provides a computer-readable storage medium, which includes volatile or nonvolatile, removable or non-removable media implemented in any method or technology for storing information, such as computer-readable instructions, data structures, computer program modules, or other data.
  • computer-readable storage media include, but are not limited to, RAM (Random Access Memory), ROM (Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), flash memory or other memory technology, CD-ROM (Compact Disc Read-Only Memory), Digital Versatile Disc (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store the desired information and that can be accessed by a computer.
  • the computer-readable storage medium in this embodiment may be used to store one or more computer programs, and the stored one or more computer programs may be executed by a processor to implement at least one step of the method in the first embodiment.
  • this embodiment also provides a computer program, which can be distributed on a computer-readable medium and executed by a computing device to implement at least one step of the method in the above-mentioned first embodiment; in some cases, at least one of the steps shown or described may be performed in an order different from that described in the above embodiment.
  • This embodiment also provides a computer program product, including a computer-readable device, where the computer program as shown above is stored on the computer-readable device.
  • the computer-readable device may include the computer-readable storage medium as described above.
  • the functional modules/units in the system and the device can be implemented as software (computer program code executable by a computing device), firmware, hardware, or an appropriate combination thereof.
  • the division between functional modules/units mentioned above does not necessarily correspond to the division of physical components; for example, one physical component may have multiple functions, or one function or step may be executed cooperatively by several physical components.
  • some or all physical components may be implemented as software executed by a processor, such as a central processing unit, digital signal processor or microprocessor, as hardware, or as an integrated circuit, such as an application-specific integrated circuit.
  • as is well known to those of ordinary skill in the art, communication media typically embody computer-readable instructions, data structures, computer program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and can include any information delivery medium. Therefore, the present invention is not limited to any particular combination of hardware and software.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • General Engineering & Computer Science (AREA)
  • Economics (AREA)
  • Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Computer Hardware Design (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

Disclosed are an emotion recognition-based online social method and apparatus, and a storage medium. The emotion of a local chat user is recognized on the basis of physiological characteristic data of the user, and emotion expression content corresponding to the user's emotion, such as an emotion expression animation and emotion expression dynamic text, is sent in real time to a chat client used by a peer chat user for display. The emotion expression animation and the emotion expression dynamic text can be displayed in corresponding dynamic modes to reflect different emotions and feelings: for example, when the chat user is in high spirits, the animation or dynamic text can use faster actions, higher speeds, larger sizes, and brighter colors; when the chat user is in low spirits, the animation or dynamic text uses slower actions, lower speeds, smaller sizes, and darker colors. The emotion of a user is visualized through an animation or dynamic text, thereby effectively ensuring that the two chat parties know each other's emotions and significantly improving the effectiveness of the chat interaction between them.

Description

基于情绪识别的在线社交方法、装置及存储介质Online social networking method, device and storage medium based on emotion recognition 技术领域technical field
本发明涉及电子技术领域,尤其涉及一种基于情绪识别的在线社交方法、装置及存储介质。The present invention relates to the field of electronic technology, and in particular, to an online social networking method, device and storage medium based on emotion recognition.
背景技术Background technique
随着互联网技术的不断发展,在线聊天应用已经成为用户进行即时通信的主要工具。然而,在线聊天相对面对面交流要原始得多,在线聊天通常很难传达很多基本信息,比如用户情绪。在实际应用中,若聊天双方无法准确获知彼此情绪,则会使得聊天内容无法被准确理解,聊天互动的有效性较差。With the continuous development of Internet technology, online chat applications have become the main tool for users to conduct instant communication. However, online chat is much more primitive than face-to-face communication, and it is often difficult for online chat to convey many basic information, such as user emotions. In practical applications, if both parties in the chat cannot accurately know each other's emotions, the content of the chat cannot be accurately understood, and the effectiveness of the chat interaction is poor.
技术问题technical problem
本发明实施例的主要目的在于提供一种基于情绪识别的在线社交方法、装置及存储介质,至少能够解决相关技术中的聊天双方无法准确获知彼此情绪,所导致的聊天内容无法被准确理解、聊天互动的有效性较差的问题。The main purpose of the embodiments of the present invention is to provide an online social networking method, device and storage medium based on emotion recognition, which can at least solve the problem that the chatting parties in the related art cannot accurately know each other's emotions, resulting in that the chatting content cannot be accurately understood and chatted. The problem of less effective interaction.
技术解决方案technical solutions
为实现上述目的,本发明实施例第一方面提供了一种基于情绪识别的在线社交方法,该方法包括:To achieve the above object, a first aspect of the embodiments of the present invention provides an online social networking method based on emotion recognition, the method comprising:
通过传感器采集本端聊天用户的生理特征数据;Collect the physiological characteristic data of local chat users through sensors;
基于所述生理特征数据确定相应的用户情绪识别结果;Determine a corresponding user emotion recognition result based on the physiological characteristic data;
根据所述用户情绪识别结果获取对应的情绪表达内容;其中,所述情绪表达内容包括情绪表达动画、情绪表达动态文字中至少一种;Obtain corresponding emotional expression content according to the user emotion recognition result; wherein, the emotional expression content includes at least one of emotional expression animation and emotional expression dynamic text;
将所述情绪表达内容实时发送至对端聊天用户使用的聊天客户端。The emotional expression content is sent to the chat client used by the peer chat user in real time.
为实现上述目的,本发明实施例第二方面提供了一种基于情绪识别的在线社交装置,该装置包括:To achieve the above object, a second aspect of the embodiments of the present invention provides an online social networking device based on emotion recognition, the device comprising:
采集模块,用于通过传感器采集本端聊天用户的生理特征数据;The acquisition module is used to collect the physiological characteristic data of the local chat user through the sensor;
确定模块,用于基于所述生理特征数据确定相应的用户情绪识别结果;a determining module, for determining a corresponding user emotion recognition result based on the physiological characteristic data;
获取模块,用于根据所述用户情绪识别结果获取对应的情绪表达内容;其中,所述情绪表达内容包括情绪表达动画、情绪表达动态文字中至少一种;an acquisition module, configured to acquire corresponding emotional expression content according to the user emotion recognition result; wherein, the emotional expression content includes at least one of emotional expression animation and emotional expression dynamic text;
发送模块,用于将所述情绪表达内容实时发送至对端聊天用户使用的聊天客户端。A sending module, configured to send the emotional expression content to the chat client used by the opposite chat user in real time.
为实现上述目的,本发明实施例第三方面提供了一种电子装置,该电子装置包括:处理器、存储器和通信总线;To achieve the above object, a third aspect of the embodiments of the present invention provides an electronic device, the electronic device includes: a processor, a memory, and a communication bus;
所述通信总线用于实现所述处理器和存储器之间的连接通信;The communication bus is used to realize the connection communication between the processor and the memory;
所述处理器用于执行所述存储器中存储的一个或者多个程序,以实现上述任意一种在线社交方法的步骤。The processor is configured to execute one or more programs stored in the memory to implement the steps of any one of the above-mentioned online social networking methods.
为实现上述目的,本发明实施例第四方面提供了一种计算机可读存储介质,该计算机可读存储介质存储有一个或者多个程序,所述一个或者多个程序可被一个或者多个处理器执行,以实现上述任意一种在线社交方法的步骤。To achieve the above object, a fourth aspect of the embodiments of the present invention provides a computer-readable storage medium, where the computer-readable storage medium stores one or more programs, and the one or more programs can be processed by one or more The server executes the steps to implement any one of the above-mentioned online social networking methods.
有益效果beneficial effect
根据本发明实施例提供的基于情绪识别的在线社交方法、装置及存储介质,通过传感器采集本端聊天用户的生理特征数据;基于生理特征数据确定相应的用户情绪识别结果;根据用户情绪识别结果获取对应的情绪表达内容;将情绪表达内容实时发送至对端聊天用户使用的聊天客户端进行展示。通过本发明的实施,基于用户生理特征数据识别本端聊天用户的情绪,并将用户情绪相应的情绪表达动画、情绪表达动态文字等情绪表达内容,实时发送至对端聊天用户使用的聊天客户端进行展示,实现了通过动画、动态文字等来传达聊天用户的情绪信息,有效保证了聊天双方获知彼此情绪,可显著提升双方聊天互动的有效性。According to the online social networking method, device and storage medium based on emotion recognition provided by the embodiment of the present invention, the physiological characteristic data of the chatting user at the local end is collected by sensors; the corresponding user emotion recognition result is determined based on the physiological characteristic data; and the user emotion recognition result is obtained according to the user emotion recognition result The corresponding emotional expression content; the emotional expression content is sent to the chat client used by the peer chat user in real time for display. Through the implementation of the present invention, the emotion of the chat user at the local end is identified based on the user's physiological characteristic data, and emotion expression content such as emotion expression animation and emotion expression dynamic text corresponding to the user's emotion are sent to the chat client used by the opposite end chat user in real time. The display realizes that the emotional information of chatting users is conveyed through animation, dynamic text, etc., which effectively ensures that both parties in the chat know each other's emotions, which can significantly improve the effectiveness of the chat interaction between the two parties.
本发明其他特征和相应的效果在说明书的后面部分进行阐述说明,且应当理解,至少部分效果从本发明说明书中的记载变的显而易见。Other features of the present invention and corresponding effects are set forth in later parts of the specification, and it should be understood that at least some of the effects will become apparent from the description of the present specification.
附图说明Description of drawings
为了更清楚地说明本发明实施例或现有技术中的技术方案,下面将对实施例或现有技术描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图仅仅是本发明的一些实施例,对于本领域技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其他的附图。In order to explain the embodiments of the present invention or the technical solutions in the prior art more clearly, the following briefly introduces the accompanying drawings that need to be used in the description of the embodiments or the prior art. Obviously, the accompanying drawings in the following description are only These are some embodiments of the present invention. For those skilled in the art, other drawings can also be obtained according to these drawings without creative efforts.
图1为本发明第一实施例提供的在线社交方法的基本流程示意图;FIG. 1 is a schematic diagram of a basic flow of an online social networking method provided by a first embodiment of the present invention;
图2为本发明第一实施例提供的一种情绪识别图谱;Fig. 2 is a kind of emotion recognition map provided by the first embodiment of the present invention;
图3为本发明第一实施例提供的一种情绪表达动画的示意图;3 is a schematic diagram of an emotion expression animation provided by the first embodiment of the present invention;
图4为本发明第一实施例提供的一种GSR数据变化示意图;4 is a schematic diagram of a GSR data change provided by the first embodiment of the present invention;
图5为本发明第二实施例提供的在线社交装置的程序模块示意图;5 is a schematic diagram of a program module of an online social networking device provided by a second embodiment of the present invention;
图6为本发明第三实施例提供的电子装置的结构示意图。FIG. 6 is a schematic structural diagram of an electronic device according to a third embodiment of the present invention.
本发明的实施方式Embodiments of the present invention
为使得本发明的发明目的、特征、优点能够更加的明显和易懂,下面将结合本发明实施例中的附图,对本发明实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例仅仅是本发明一部分实施例,而非全部实施例。基于本发明中的实施例,本领域技术人员在没有做出创造性劳动前提下所获得的所有其他实施例,都属于本发明保护的范围。In order to make the purpose, features and advantages of the present invention more obvious and understandable, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are only a part of the embodiments of the present invention, rather than all of them. Based on the embodiments of the present invention, all other embodiments obtained by those skilled in the art without creative efforts shall fall within the protection scope of the present invention.
第一实施例:First embodiment:
为了解决相关技术中的聊天双方无法准确获知彼此情绪,所导致的聊天内容无法被准确理解、聊天互动的有效性较差的问题,本实施例提出了一种基于情绪识别的在线社交方法,如图1所示为本实施例提供的在线社交方法的基本流程示意图,本实施例提出的在线社交方法包括以下的步骤:In order to solve the problem in the related art that the two chatting parties cannot accurately perceive each other's emotions, so that the chat content cannot be accurately understood and the chat interaction is less effective, this embodiment proposes an online social networking method based on emotion recognition. FIG. 1 is a schematic diagram of the basic flow of the online social networking method provided by this embodiment, which includes the following steps:
步骤101、通过传感器采集本端聊天用户的生理特征数据。Step 101 , collecting the physiological characteristic data of the chat user at the local end through the sensor.
具体的,语言学家研究表明,通过文字聊天所能传达的信息仅占谈话中传达信息的20%至30%,从而证明了非语言信息的重要性,如情绪,这是人类认知所必需的,并影响着人们生活的不同方面。例如,当我们兴奋时,我们的感知倾向于选择快乐事件,而消极情绪则相反。这表明在线聊天将受益于对他人情绪状态的了解。有了用户情绪状态的信息,系统就能更有效地与用户交互。Specifically, linguistic studies have shown that the information that can be conveyed through text chat accounts for only 20% to 30% of the information conveyed in a conversation, which demonstrates the importance of non-verbal information such as emotion, which is essential to human cognition and affects different aspects of people's lives. For example, when we are excited, our perception tends to select happy events, while negative emotions have the opposite effect. This suggests that online chat would benefit from an understanding of the other party's emotional state. With information about the user's emotional state, the system can interact with the user more effectively.
在实际应用中,用户在聊天过程中连接特定传感器,传感器进行数字采样,而数据捕获可以通过使用Visual C++编写的模块来实现。In practical applications, users connect specific sensors during chat, the sensors are digitally sampled, and data capture can be achieved through modules written in Visual C++.
在本实施例的一种可选的实施方式中,生理特征数据可以为皮肤电反应数据和肌肉电反应数据;其中,通过皮肤电反应仪(GSR,Galvanic Skin Response apparatus)来进行皮肤电反应数据的采集,皮肤电反应仪是以曲线波纹形式记录皮肤电反应变化的仪器,并可通过肌电仪来进行肌肉电反应数据的采集,对肌肉生物电图形进行记录。In an optional implementation of this embodiment, the physiological characteristic data may be galvanic skin response data and muscle electrical response data. The galvanic skin response data is collected by a galvanic skin response apparatus (GSR, Galvanic Skin Response apparatus), an instrument that records changes in the galvanic skin response in the form of curve ripples; the muscle electrical response data can be collected by an electromyograph, which records the muscle bioelectric pattern.
在本实施例的另一种可选的实施方式中,生理特征数据还可以为视线跟踪数据,视线跟踪数据可以包括瞳孔扩张数据、眼动数据等,在实际应用中可以通过图像传感器来进行用户眼部图像采集,然后根据图像识别算法识别连续多张眼部图像来获取视线跟踪数据。In another optional implementation of this embodiment, the physiological characteristic data may also be gaze tracking data, which may include pupil dilation data, eye movement data, and the like. In practical applications, images of the user's eyes can be collected by an image sensor, and the gaze tracking data is then obtained by recognizing a sequence of eye images with an image recognition algorithm.
步骤102、基于生理特征数据确定相应的用户情绪识别结果。Step 102: Determine a corresponding user emotion recognition result based on the physiological characteristic data.
具体的,在实际应用中,用户在情绪变化时,通常会在生理特征上适应性有所变化,由此,本实施例可以通过生理特征数据来适应性感知用户情绪。Specifically, in practical applications, when the user's emotions change, the physiological characteristics usually change adaptively. Therefore, in this embodiment, the user's emotions can be adaptively sensed through the physiological characteristics data.
在本实施例中,基于生理特征数据确定相应的用户情绪识别结果的具体实现方式包括但不限于以下两种:In this embodiment, specific implementations for determining the corresponding user emotion recognition result based on the physiological feature data include but are not limited to the following two:
方式一,基于皮肤电反应数据确定相应的情绪唤醒数据,以及基于肌肉电反应数据确定相应的情绪效价数据;结合情绪唤醒数据和情绪效价数据,确定相应的用户情绪识别结果。Mode 1: Determine the corresponding emotional arousal data based on the galvanic skin response data, and determine the corresponding emotional valence data based on the muscle electrical response data; combine the emotional arousal data and the emotional valence data to determine the corresponding user emotion recognition result.
具体的,由于GSR数据相对于BVP数据的噪声信号较少,从而本实施例使用GSR传感器来检测情绪唤醒数据,测量皮肤电导率(SC)的GSR传感器连接在用户非支配手的中指和食指上,并可用ProComp+装置记录信号。SC随整体唤醒水平线性变化,随焦虑和压力增加而增加。在实施例中,通过分析GSR数据的波峰和波谷即可检测得到情绪唤醒数据。Specifically, since GSR data contains less noise than BVP data, this embodiment uses a GSR sensor to detect the emotional arousal data. The GSR sensor, which measures skin conductance (SC), is attached to the middle finger and index finger of the user's non-dominant hand, and the signal can be recorded with a ProComp+ device. SC varies linearly with the overall arousal level and increases with anxiety and stress. In this embodiment, the emotional arousal data can be detected by analyzing the peaks and troughs of the GSR data.
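The peak-and-trough analysis described above can be sketched as follows. This is a minimal illustration rather than the patented implementation: the function names, the `min_rise` threshold, and the mapping from peak density to an arousal score are all assumptions introduced for demonstration.

```python
def find_gsr_peaks(samples, min_rise=0.05):
    """Detect local peaks in a sequence of GSR samples.

    A sample counts as a peak when it exceeds both neighbours by at
    least `min_rise` (an illustrative threshold, e.g. in microsiemens).
    """
    peaks = []
    for i in range(1, len(samples) - 1):
        if (samples[i] - samples[i - 1] >= min_rise and
                samples[i] - samples[i + 1] >= min_rise):
            peaks.append(i)
    return peaks


def arousal_score(samples):
    """Map peak density to a coarse arousal level in [0, 1]."""
    if len(samples) < 3:
        return 0.0
    peaks = find_gsr_peaks(samples)
    # Normalize by an assumed baseline of one peak per ten samples.
    return min(1.0, len(peaks) / (len(samples) / 10))
```

A fixed neighbour-difference threshold is the simplest possible detector; a production system would typically low-pass filter the raw GSR signal before looking for peaks.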
如图2所示为本实施例提供的一种情绪识别图谱,其中,横轴代表情绪效价数据,纵轴代表情绪唤醒数据,各象限中的元素则为不同类型的情绪识别结果,包括兴奋、放松、悲伤、恐惧等,在本实施例中,情绪效价数据表示用户情绪类型,负价情绪包括恐惧、悲伤等,正价情绪包括快乐、兴奋等,而情绪唤醒数据则表示用户在相应情绪类型下的情绪强弱程度。FIG. 2 shows an emotion recognition map provided by this embodiment, in which the horizontal axis represents emotional valence data, the vertical axis represents emotional arousal data, and the elements in each quadrant are different types of emotion recognition results, including excitement, relaxation, sadness, fear, and so on. In this embodiment, the emotional valence data represents the user's emotion type: negative-valence emotions include fear, sadness, etc., and positive-valence emotions include happiness, excitement, etc., while the emotional arousal data represents the intensity of the user's emotion under the corresponding emotion type.
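The quadrant lookup implied by the map of FIG. 2 can be expressed as a simple rule. The sketch below assumes valence and arousal are normalized to [-1, 1]; the exact labels and sign conventions are illustrative assumptions.

```python
def classify_emotion(valence, arousal):
    """Map (valence, arousal) in [-1, 1] x [-1, 1] to a quadrant label.

    Following the map of FIG. 2: positive valence with high arousal
    reads as excitement, positive valence with low arousal as
    relaxation, negative valence with high arousal as fear, and
    negative valence with low arousal as sadness.
    """
    if valence >= 0:
        return "excited" if arousal >= 0 else "relaxed"
    return "afraid" if arousal >= 0 else "sad"
```

The recognition result combines both axes, which is why neither the GSR-derived arousal nor the valence signal alone is sufficient.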
方式二,基于视线跟踪数据确定相应的注意力表征数据和兴趣表征数据;结合注意力表征数据和兴趣表征数据,确定相应的用户情绪识别结果。In the second method, the corresponding attention representation data and interest representation data are determined based on the gaze tracking data; the corresponding user emotion recognition result is determined by combining the attention representation data and the interest representation data.
具体的,本实施例对用户的视线信息作出实时反应,于在线聊天过程中记录眼睛注视和瞳孔扩张数据。眼动可以显示出用户的兴趣和注意力,通过从用户的实时眼动数据中检测注意力、兴趣等信息来丰富非语言信息,使用视线以及眼动信息来促进情绪相关的推理。Specifically, this embodiment reacts in real time to the user's gaze information and records eye gaze and pupil dilation data during the online chat. Eye movements can reveal the user's interest and attention; non-verbal information is enriched by detecting attention, interest and similar signals from the user's real-time eye movement data, and gaze and eye movement information are used to facilitate emotion-related reasoning.
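As one illustration of how eye-movement data might be reduced to an attention measure, the sketch below scores attention as the share of a time window spent in gaze fixations. The fixation representation, function name, and window length are assumptions for demonstration, not details taken from the embodiment.

```python
def attention_from_fixations(fixations, window_s=60.0):
    """Estimate a coarse attention score in [0, 1] from gaze fixations.

    `fixations` is a list of (x, y, duration_s) tuples recorded inside
    a time window of `window_s` seconds; the score is the fraction of
    that window spent fixating (e.g. on the chat area).
    """
    total = sum(duration for _, _, duration in fixations)
    return min(1.0, total / window_s)
```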
步骤103、根据用户情绪识别结果获取对应的情绪表达内容。Step 103: Acquire corresponding emotion expression content according to the user emotion recognition result.
在本实施例中,情绪表达内容包括情绪表达动画、情绪表达动态文字中至少一种,如图3所示为本实施例提供的一种情绪表达动画的示意图。本实施例的情绪表达内容适应于用户当前的情感状态,区别于纯文本信息,可以用于抒发包括用户情绪在内的更为细微的信息。In this embodiment, the emotion expression content includes at least one of emotion expression animation and emotion expression dynamic text. FIG. 3 is a schematic diagram of an emotion expression animation provided in this embodiment. The emotion expression content of this embodiment is adapted to the current emotional state of the user, and is different from the plain text information, and can be used to express more subtle information including the user's emotion.
在本实施例的一种可选的实施方式中,上述根据用户情绪识别结果获取对应的情绪表达内容的步骤,具体包括:根据用户情绪识别结果确定对应的内容类型以及内容显示属性;按照内容类型以及内容显示属性,获取对应的情绪表达内容。In an optional implementation of this embodiment, the above step of acquiring the corresponding emotional expression content according to the user emotion recognition result specifically includes: determining the corresponding content type and content display attribute according to the user emotion recognition result; and acquiring the corresponding emotional expression content according to the content type and the content display attribute.
具体的,情绪表达内容的内容类型可以包括:喜悦、愤怒、悲哀、高兴等,内容显示属性可以包括:变化速度、大小、色彩等。本实施例可以根据情绪唤醒数据确定情绪表达动画、情绪表达动态文字的显示属性,以及根据情绪效价数据确定情绪表达动画、情绪表达动态文字的具体类型。在实际应用中,例如,当GSR数据突然增加并出现峰值时,所对应生成的动画或动态文字可以具有更高的速度、更大的尺寸和更亮的颜色;而当GSR数据减少时,所对应生成的情绪表达内容变化速度变慢、显示颜色变暗。Specifically, the content types of the emotional expression content may include joy, anger, sadness, happiness, etc., and the content display attributes may include change speed, size, color, and the like. In this embodiment, the display attributes of the emotion expression animation and the emotion expression dynamic text can be determined according to the emotional arousal data, and their specific types can be determined according to the emotional valence data. In practical applications, for example, when the GSR data suddenly increases and peaks, the generated animation or dynamic text can have a higher speed, a larger size and brighter colors; when the GSR data decreases, the generated emotional expression content changes more slowly and is displayed in darker colors.
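The mapping from arousal to display attributes described above might be sketched as follows; the attribute names and numeric ranges are illustrative assumptions, not values from the embodiment.

```python
def display_attributes(arousal, rising):
    """Derive illustrative display attributes from emotional arousal data.

    When the GSR signal rises sharply (`rising=True`) the animation gets
    faster, larger, and brighter; when it falls, the content slows down
    and is shown in darker colors. `arousal` is assumed to be in [0, 1].
    """
    speed = 0.5 + arousal if rising else max(0.1, 0.5 - arousal / 2)
    scale = 1.0 + 0.5 * arousal if rising else 1.0
    brightness = 0.9 if rising else 0.4
    return {"speed": round(speed, 2), "scale": round(scale, 2),
            "brightness": brightness}
```

The content type (joy, anger, sadness, etc.) would be chosen separately from the valence data, so this function only covers the display-attribute half of the step.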
步骤104、将情绪表达内容实时发送至对端聊天用户使用的聊天客户端。Step 104: Send the emotional expression content to the chat client used by the opposite chat user in real time.
具体的,本实施例采用情绪表达内容向聊天双方进行用户情绪的描述与表达,以在在线聊天时传达情感和细微的信息,由于聊天双方可以有效感知彼此的情绪,从而可以增加彼此在聊天中的参与度,更有效的进行聊天互动。在实际应用中,用户可以通过按钮、快捷键或嵌入在文本消息中的标记来制定表达动画或情绪表达动态文字,通过在文本消息特定位置(例如前面)嵌入情绪表达动画,来准确传达聊天双方的情绪,例如“<Happy>我很快乐!”,其中<Happy>也即图3中第一行第一列位置所示出的情绪表达动画,以通过该动画来充分传达用户的喜悦情绪。在实际应用中,根据应用场景的不同,还可以灵活采用不同类型的情绪表达动画,例如在用户情绪较为悲伤时,可以采用图3中第二行第三列位置所示出的<Sad>动画。Specifically, this embodiment uses emotional expression content to describe and express the users' emotions to both chatting parties, so as to convey emotion and subtle information during online chat; since both parties can effectively perceive each other's emotions, their engagement in the chat increases and the chat interaction becomes more effective. In practical applications, users can specify expression animations or emotion expression dynamic text through buttons, shortcut keys, or markers embedded in text messages, and an emotion expression animation can be embedded at a specific position (for example, the front) of a text message to accurately convey the emotions of both chatting parties, e.g. "<Happy>我很快乐!" ("I'm happy!"), where <Happy> is the emotion expression animation shown in the first row and first column of FIG. 3, which fully conveys the user's joy. In practical applications, different types of emotion expression animations can be flexibly used according to the application scenario; for example, when the user is sad, the <Sad> animation shown in the second row and third column of FIG. 3 can be used.
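Parsing an emotion marker such as `<Happy>` from the front of a text message could look like the following sketch; the tag set and function name are assumptions introduced for illustration.

```python
import re

# Assumed tag vocabulary; the embodiment does not enumerate one.
EMOTION_TAGS = {"Happy", "Sad", "Angry", "Relaxed"}


def parse_message(text):
    """Split a chat message into a leading emotion tag and body text.

    Tags are embedded as `<Happy>` at the front of the message, as in
    the `<Happy>I'm happy!` example above; unrecognized tags are left
    in the body untouched.
    """
    match = re.match(r"<(\w+)>\s*(.*)", text)
    if match and match.group(1) in EMOTION_TAGS:
        return match.group(1), match.group(2)
    return None, text
```

The receiving client would use the extracted tag to select the corresponding animation and render the remaining body as the message text.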
还应当说明的是,本实施例还对采用该在线社交方法实现的聊天客户端进行了可用性研究,研究对象是同时使用本实施例的聊天客户端和传统的聊天客户端的用户。受试者被随机分配使用两种聊天系统,他们在同一幢楼里,但不应该见面。将他们分成三对分别和搭档交谈,每次谈话持续了一个多小时,他们就学校课程等进行了交谈。交谈后,受试者回答了问卷,并对聊天系统进行了评价。首先,本实施例检查了GSR数据和受试者的反应,他们被问到什么时候觉得自己最喜欢聊天,本实施例将他们的回答与GSR的结果进行了比较。GSR数据和用户报告的张力之间有很好的相关性。受试者表示他们逐渐将注意力集中在对话上,GSR也显示出类似的变化,如图4为本实施例提供的GSR数据变化示意图,随着被试更多地集中在对话上,GSR增加。(图表每分钟记录用户的GSR)。这表明GSR可以用来实时判断在线会话中情绪的变化。研究结果还表明,情感信息可能会增加受试者在谈话中的参与度。来自聊天伙伴的情感信息给用户的感觉是,他们不仅在交换文本信息,而且在相互交流自己的感受,这使得他们能够更多地参与到对话中。It should also be noted that a usability study was conducted on a chat client implemented with the online social networking method of this embodiment; the subjects were users who used both the chat client of this embodiment and a traditional chat client. Subjects were randomly assigned to use the two chat systems; they were in the same building but were not supposed to meet. They were divided into three pairs, each of which talked with its partner; each conversation lasted more than an hour and covered topics such as school courses. After the conversations, the subjects answered questionnaires and evaluated the chat systems. First, the GSR data and the subjects' responses were examined: the subjects were asked when they felt they enjoyed the chat most, and their answers were compared with the GSR results. There was a good correlation between the GSR data and the user-reported tension. The subjects reported that they gradually focused their attention on the conversation, and the GSR showed similar changes. FIG. 4 is a schematic diagram of the GSR data changes provided by this embodiment: as the subjects focused more on the conversation, the GSR increased (the chart records the user's GSR every minute). This indicates that GSR can be used to judge changes in emotion during an online conversation in real time. The results also suggest that emotional information may increase the subjects' engagement in the conversation. Emotional messages from chat partners give users the feeling that they are not only exchanging text messages but also communicating their feelings with each other, which enables them to be more involved in the conversation.
根据本发明实施例提供的基于情绪识别的在线社交方法,通过传感器采集本端聊天用户的生理特征数据;基于生理特征数据确定相应的用户情绪识别结果;根据用户情绪识别结果获取对应的情绪表达内容;将情绪表达内容发送至对端聊天用户使用的聊天客户端进行展示。通过本发明的实施,基于用户生理特征数据识别本端聊天用户的情绪,并将用户情绪相应的情绪表达动画、情绪表达动态文字等情绪表达内容,实时发送至对端聊天用户使用的聊天客户端进行展示,实现了通过动画、动态文字等来传达聊天用户的情绪信息,有效保证了聊天双方获知彼此情绪,可显著提升双方聊天互动的有效性。According to the emotion recognition-based online social networking method provided by the embodiments of the present invention, the physiological characteristic data of the local chat user is collected by a sensor; a corresponding user emotion recognition result is determined based on the physiological characteristic data; the corresponding emotional expression content is acquired according to the user emotion recognition result; and the emotional expression content is sent to the chat client used by the peer chat user for display. Through the implementation of the present invention, the emotion of the local chat user is recognized based on the user's physiological characteristic data, and emotional expression content corresponding to the user's emotion, such as emotion expression animations and emotion expression dynamic text, is sent in real time to the chat client used by the peer chat user for display. The emotional information of chat users is thus conveyed through animations, dynamic text and the like, which effectively ensures that both chatting parties are aware of each other's emotions and significantly improves the effectiveness of the chat interaction between the two parties.
第二实施例:Second embodiment:
为了解决相关技术中的聊天双方无法准确获知彼此情绪,所导致的聊天内容无法被准确理解、聊天互动的有效性较差的问题,本实施例示出了一种基于情绪识别的在线社交装置,具体请参见图5,本实施例的在线社交装置包括:In order to solve the problems in the related art that the two chatting parties cannot accurately know each other's emotions, the chat content cannot be accurately understood, and the effectiveness of the chat interaction is poor, this embodiment shows an online social device based on emotion recognition, specifically Referring to FIG. 5 , the online social networking device in this embodiment includes:
采集模块501,用于通过传感器采集本端聊天用户的生理特征数据;The collection module 501 is used for collecting the physiological characteristic data of the chat user at the local end through the sensor;
确定模块502,用于基于生理特征数据确定相应的用户情绪识别结果;A determination module 502, configured to determine a corresponding user emotion recognition result based on the physiological characteristic data;
获取模块503,用于根据用户情绪识别结果获取对应的情绪表达动画;其中,情绪表达内容包括情绪表达动画、情绪表达动态文字中至少一种;The obtaining module 503 is configured to obtain the corresponding emotion expression animation according to the user emotion recognition result; wherein, the emotion expression content includes at least one of emotion expression animation and emotion expression dynamic text;
发送模块504,用于将情绪表达内容实时发送至对端聊天用户使用的聊天客户端。The sending module 504 is configured to send the emotional expression content to the chat client used by the opposite chat user in real time.
在本实施例的一些实施方式中,生理特征数据为皮肤电反应数据和肌肉电反应数据。相应的,确定模块502具体用于:基于皮肤电反应数据确定相应的情绪唤醒数据,以及基于肌肉电反应数据确定相应的情绪效价数据;结合情绪唤醒数据和情绪效价数据,确定相应的用户情绪识别结果。In some implementations of this embodiment, the physiological characteristic data are galvanic skin response data and electrical muscle response data. Correspondingly, the determining module 502 is specifically configured to: determine the corresponding emotional arousal data based on the galvanic skin response data, and determine the corresponding emotional valence data based on the electrical muscle response data; and determine the corresponding user based on the emotional arousal data and the emotional valence data. Emotion recognition results.
在本实施例的另一些实施方式中,生理特征数据为视线跟踪数据。相应的,确定模块502具体用于:基于视线跟踪数据确定相应的注意力表征数据和兴趣表征数据;结合注意力表征数据和兴趣表征数据,确定相应的用户情绪识别结果。In other implementations of this embodiment, the physiological feature data is gaze tracking data. Correspondingly, the determining module 502 is specifically configured to: determine the corresponding attention representation data and interest representation data based on the gaze tracking data; and determine the corresponding user emotion recognition result in combination with the attention representation data and the interest representation data.
在本实施例的一些实施方式中,获取模块503具体用于:根据用户情绪识别结果获取对应的内容类型以及内容显示属性;按照内容类型以及内容显示属性,获取对应的情绪表达内容。In some implementations of this embodiment, the obtaining module 503 is specifically configured to: obtain the corresponding content type and content display attribute according to the user emotion recognition result; obtain the corresponding emotion expression content according to the content type and content display attribute.
应当说明的是,前述实施例中的基于情绪识别的在线社交方法,均可基于本实施例提供的基于情绪识别的在线社交装置实现,所属领域的普通技术人员可以清楚的了解到,为描述的方便和简洁,本实施例中所描述的在线社交装置的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。It should be noted that the emotion recognition-based online social networking methods in the foregoing embodiments can all be implemented based on the emotion recognition-based online social networking apparatus provided in this embodiment. Those of ordinary skill in the art can clearly understand that, for convenience and brevity of description, for the specific working process of the online social networking apparatus described in this embodiment, reference may be made to the corresponding processes in the foregoing method embodiments, which will not be repeated here.
采用本实施例提供的基于情绪识别的在线社交装置,通过传感器采集本端聊天用户的生理特征数据;基于生理特征数据确定相应的用户情绪识别结果;根据用户情绪识别结果获取对应的情绪表达内容;将情绪表达内容发送至对端聊天用户使用的聊天客户端进行展示。通过本发明的实施,基于用户生理特征数据识别本端聊天用户的情绪,并将用户情绪相应的情绪表达动画、情绪表达动态文字等情绪表达内容,实时发送至对端聊天用户使用的聊天客户端进行展示,实现了通过动画、动态文字等来传达聊天用户的情绪信息,有效保证了聊天双方获知彼此情绪,可显著提升双方聊天互动的有效性。With the emotion recognition-based online social networking apparatus provided by this embodiment, the physiological characteristic data of the local chat user is collected by a sensor; a corresponding user emotion recognition result is determined based on the physiological characteristic data; the corresponding emotional expression content is acquired according to the user emotion recognition result; and the emotional expression content is sent to the chat client used by the peer chat user for display. Through the implementation of the present invention, the emotion of the local chat user is recognized based on the user's physiological characteristic data, and emotional expression content corresponding to the user's emotion, such as emotion expression animations and emotion expression dynamic text, is sent in real time to the chat client used by the peer chat user for display. The emotional information of chat users is thus conveyed through animations, dynamic text and the like, which effectively ensures that both chatting parties are aware of each other's emotions and significantly improves the effectiveness of the chat interaction between the two parties.
第三实施例:Third embodiment:
本实施例提供了一种电子装置,参见图6所示,其包括处理器601、存储器602及通信总线603,其中:通信总线603用于实现处理器601和存储器602之间的连接通信;处理器601用于执行存储器602中存储的一个或者多个计算机程序,以实现上述实施例一中的在线社交方法中的至少一个步骤。This embodiment provides an electronic device, as shown in FIG. 6, which includes a processor 601, a memory 602 and a communication bus 603, where the communication bus 603 is used to implement connection and communication between the processor 601 and the memory 602, and the processor 601 is configured to execute one or more computer programs stored in the memory 602 to implement at least one step of the online social networking method in the first embodiment.
本实施例还提供了一种计算机可读存储介质,该计算机可读存储介质包括在用于存储信息(诸如计算机可读指令、数据结构、计算机程序模块或其他数据)的任何方法或技术中实施的易失性或非易失性、可移除或不可移除的介质。计算机可读存储介质包括但不限于RAM(Random Access Memory,随机存取存储器)、ROM(Read-Only Memory,只读存储器)、EEPROM(Electrically Erasable Programmable Read-Only Memory,带电可擦可编程只读存储器)、闪存或其他存储器技术、CD-ROM(Compact Disc Read-Only Memory,光盘只读存储器)、数字多功能盘(DVD)或其他光盘存储、磁盒、磁带、磁盘存储或其他磁存储装置、或者可以用于存储期望的信息并且可以被计算机访问的任何其他的介质。This embodiment further provides a computer-readable storage medium, which includes volatile or non-volatile, removable or non-removable media implemented in any method or technology for storing information (such as computer-readable instructions, data structures, computer program modules or other data). Computer-readable storage media include, but are not limited to, RAM (Random Access Memory), ROM (Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), flash memory or other memory technology, CD-ROM (Compact Disc Read-Only Memory), digital versatile discs (DVD) or other optical disc storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store the desired information and that can be accessed by a computer.
本实施例中的计算机可读存储介质可用于存储一个或者多个计算机程序,其存储的一个或者多个计算机程序可被处理器执行,以实现上述实施例一中的方法的至少一个步骤。The computer-readable storage medium in this embodiment may be used to store one or more computer programs, and the stored one or more computer programs may be executed by a processor to implement at least one step of the method in the first embodiment.
本实施例还提供了一种计算机程序,该计算机程序可以分布在计算机可读介质上,由可计算装置来执行,以实现上述实施例一中的方法的至少一个步骤;并且在某些情况下,可以采用不同于上述实施例所描述的顺序执行所示出或描述的至少一个步骤。This embodiment also provides a computer program, which can be distributed on a computer-readable medium and executed by a computer-readable device to implement at least one step of the method in the above-mentioned first embodiment; and in some cases , at least one of the steps shown or described may be performed in an order different from that described in the above embodiments.
本实施例还提供了一种计算机程序产品,包括计算机可读装置,该计算机可读装置上存储有如上所示的计算机程序。本实施例中该计算机可读装置可包括如上所示的计算机可读存储介质。This embodiment also provides a computer program product, including a computer-readable device, where the computer program as shown above is stored on the computer-readable device. In this embodiment, the computer-readable device may include the computer-readable storage medium as described above.
可见,本领域的技术人员应该明白,上文中所公开方法中的全部或某些步骤、系统、装置中的功能模块/单元可以被实施为软件(可以用计算装置可执行的计算机程序代码来实现)、固件、硬件及其适当的组合。在硬件实施方式中,在以上描述中提及的功能模块/单元之间的划分不一定对应于物理组件的划分;例如,一个物理组件可以具有多个功能,或者一个功能或步骤可以由若干物理组件合作执行。某些物理组件或所有物理组件可以被实施为由处理器,如中央处理器、数字信号处理器或微处理器执行的软件,或者被实施为硬件,或者被实施为集成电路,如专用集成电路。It can be seen that those skilled in the art should understand that all or some of the steps in the methods disclosed above, and the functional modules/units in the systems and apparatuses, may be implemented as software (which may be implemented by computer program code executable by a computing device), firmware, hardware, or an appropriate combination thereof. In a hardware implementation, the division between the functional modules/units mentioned in the above description does not necessarily correspond to the division of physical components; for example, one physical component may have multiple functions, or one function or step may be performed cooperatively by several physical components. Some or all of the physical components may be implemented as software executed by a processor such as a central processing unit, a digital signal processor or a microprocessor, or as hardware, or as an integrated circuit such as an application-specific integrated circuit.
此外,本领域普通技术人员公知的是,通信介质通常包含计算机可读指令、数据结构、计算机程序模块或者诸如载波或其他传输机制之类的调制数据信号中的其他数据,并且可包括任何信息递送介质。所以,本发明不限制于任何特定的硬件和软件结合。In addition, as is well known to those of ordinary skill in the art, communication media typically contain computer-readable instructions, data structures, computer program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism, and may include any information delivery medium. Therefore, the present invention is not limited to any particular combination of hardware and software.
以上内容是结合具体的实施方式对本发明实施例所作的进一步详细说明,不能认定本发明的具体实施只局限于这些说明。对于本发明所属技术领域的普通技术人员来说,在不脱离本发明构思的前提下,还可以做出若干简单推演或替换,都应当视为属于本发明的保护范围。The above content is a further detailed description of the embodiments of the present invention in combination with specific embodiments, and it cannot be considered that the specific implementation of the present invention is limited to these descriptions. For those of ordinary skill in the technical field of the present invention, without departing from the concept of the present invention, some simple deductions or substitutions can be made, which should be regarded as belonging to the protection scope of the present invention.

Claims (10)

  1. 一种基于情绪识别的在线社交方法,其特征在于,包括:An online social networking method based on emotion recognition, characterized in that it includes:
    通过传感器采集本端聊天用户的生理特征数据;Collect the physiological characteristic data of local chat users through sensors;
    基于所述生理特征数据确定相应的用户情绪识别结果;Determine a corresponding user emotion recognition result based on the physiological characteristic data;
    根据所述用户情绪识别结果获取对应的情绪表达内容;其中,所述情绪表达内容包括情绪表达动画、情绪表达动态文字中至少一种;Obtain corresponding emotional expression content according to the user emotion recognition result; wherein, the emotional expression content includes at least one of emotional expression animation and emotional expression dynamic text;
    将所述情绪表达内容实时发送至对端聊天用户使用的聊天客户端。The emotional expression content is sent to the chat client used by the peer chat user in real time.
  2. 如权利要求1所述的在线社交方法,其特征在于,所述生理特征数据为皮肤电反应数据和肌肉电反应数据;The online social networking method according to claim 1, wherein the physiological characteristic data is galvanic skin response data and electrical muscle response data;
    所述基于所述生理特征数据确定相应的用户情绪识别结果,包括:The determining a corresponding user emotion recognition result based on the physiological characteristic data includes:
    基于所述皮肤电反应数据确定相应的情绪唤醒数据,以及基于所述肌肉电反应数据确定相应的情绪效价数据;Determine corresponding emotional arousal data based on the galvanic skin response data, and determine corresponding emotional valence data based on the electrical muscle response data;
    结合所述情绪唤醒数据和所述情绪效价数据,确定相应的用户情绪识别结果。Combining the emotional arousal data and the emotional valence data, a corresponding user emotion recognition result is determined.
  3. 如权利要求1所述的在线社交方法,其特征在于,所述生理特征数据为视线跟踪数据;The online social networking method according to claim 1, wherein the physiological characteristic data is gaze tracking data;
    所述基于所述生理特征数据确定相应的用户情绪识别结果,包括:The determining a corresponding user emotion recognition result based on the physiological characteristic data includes:
    基于所述视线跟踪数据确定相应的注意力表征数据和兴趣表征数据;Determine corresponding attention representation data and interest representation data based on the gaze tracking data;
    结合所述注意力表征数据和所述兴趣表征数据,确定相应的用户情绪识别结果。Combining the attention representation data and the interest representation data, a corresponding user emotion recognition result is determined.
  4. 如权利要求1至3中任意一项所述的在线社交方法,其特征在于,所述根据所述用户情绪识别结果获取对应的情绪表达内容,包括:The online social networking method according to any one of claims 1 to 3, wherein the obtaining corresponding emotional expression content according to the user's emotion recognition result comprises:
    根据所述用户情绪识别结果确定对应的内容类型以及内容显示属性;Determine the corresponding content type and content display attribute according to the user emotion recognition result;
    按照所述内容类型以及所述内容显示属性,获取对应的情绪表达内容。According to the content type and the content display attribute, the corresponding emotional expression content is acquired.
  5. 一种基于情绪识别的在线社交装置,其特征在于,包括:An online social device based on emotion recognition, comprising:
    采集模块,用于通过传感器采集本端聊天用户的生理特征数据;The acquisition module is used to collect the physiological characteristic data of the local chat user through the sensor;
    确定模块,用于基于所述生理特征数据确定相应的用户情绪识别结果;a determining module, for determining a corresponding user emotion recognition result based on the physiological characteristic data;
    获取模块,用于根据所述用户情绪识别结果获取对应的情绪表达内容;其中,所述情绪表达内容包括情绪表达动画、情绪表达动态文字中至少一种;an acquisition module, configured to acquire corresponding emotional expression content according to the user emotion recognition result; wherein, the emotional expression content includes at least one of emotional expression animation and emotional expression dynamic text;
    发送模块,用于将所述情绪表达内容实时发送至对端聊天用户使用的聊天客户端。A sending module, configured to send the emotional expression content to the chat client used by the opposite chat user in real time.
  6. 如权利要求5所述的在线社交装置,其特征在于,所述生理特征数据为皮肤电反应数据和肌肉电反应数据;The online social networking device according to claim 5, wherein the physiological characteristic data is galvanic skin response data and electrical muscle response data;
    所述确定模块具体用于:基于所述皮肤电反应数据确定相应的情绪唤醒数据,以及基于所述肌肉电反应数据确定相应的情绪效价数据;结合所述情绪唤醒数据和所述情绪效价数据,确定相应的用户情绪识别结果。The determining module is specifically configured to: determine corresponding emotional arousal data based on the galvanic skin response data, and determine corresponding emotional valence data based on the electrical muscle response data; combine the emotional arousal data and the emotional valence data to determine the corresponding user emotion recognition results.
  7. The online social networking apparatus according to claim 5, wherein the physiological feature data is gaze tracking data; and
    the determination module is specifically configured to: determine corresponding attention representation data and interest representation data based on the gaze tracking data, and determine the corresponding user emotion recognition result by combining the attention representation data and the interest representation data.
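For claim 7, a rough sketch of deriving the two intermediate measures from gaze samples. Using fixation duration as an attention proxy and revisits to the same screen region as an interest proxy is a common heuristic; the patent does not specify its measures, so these choices and thresholds are assumptions:

```python
# Hypothetical sketch of claim 7: gaze tracking -> attention and interest
# representation data -> emotion recognition result. Fixations are
# (region, duration_seconds) pairs; all thresholds are illustrative.

def attention_score(fixations, window_s=10.0):
    """Fraction of the window spent fixating (vs. saccades / off-screen)."""
    total_fixation = sum(dur for _, dur in fixations)
    return min(1.0, total_fixation / window_s)

def interest_score(fixations):
    """Revisit ratio: repeated looks at the same region suggest interest."""
    regions = [r for r, _ in fixations]
    if not regions:
        return 0.0
    revisits = len(regions) - len(set(regions))
    return revisits / len(regions)

def recognize(fixations):
    """Combine attention and interest into a coarse emotional state."""
    a, i = attention_score(fixations), interest_score(fixations)
    if a > 0.6 and i > 0.3:
        return "engaged"
    if a > 0.6:
        return "focused"
    return "distracted"
```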
  8. The online social networking apparatus according to any one of claims 5 to 7, wherein the obtaining module is specifically configured to: determine a corresponding content type and content display attribute according to the user emotion recognition result, and acquire the corresponding emotional expression content according to the content type and the content display attribute.
  9. An electronic apparatus, comprising a processor, a memory, and a communication bus;
    the communication bus being configured to implement connection and communication between the processor and the memory; and
    the processor being configured to execute one or more programs stored in the memory to implement the steps of the online social networking method according to any one of claims 1 to 4.
  10. A computer-readable storage medium, wherein the computer-readable storage medium stores one or more programs, the one or more programs being executable by one or more processors to implement the steps of the online social networking method according to any one of claims 1 to 4.
PCT/CN2021/079035 2021-03-04 2021-03-04 Emotion recognition-based online social method and apparatus, and storage medium WO2022183424A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/079035 WO2022183424A1 (en) 2021-03-04 2021-03-04 Emotion recognition-based online social method and apparatus, and storage medium

Publications (1)

Publication Number Publication Date
WO2022183424A1 true WO2022183424A1 (en) 2022-09-09

Family

ID=83153895

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/079035 WO2022183424A1 (en) 2021-03-04 2021-03-04 Emotion recognition-based online social method and apparatus, and storage medium

Country Status (1)

Country Link
WO (1) WO2022183424A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120182211A1 (en) * 2011-01-14 2012-07-19 Research In Motion Limited Device and method of conveying emotion in a messaging application
CN105897551A (en) * 2015-02-13 2016-08-24 国际商业机器公司 Point In Time Expression Of Emotion Data Gathered From A Chat Session
CN110837294A (en) * 2019-10-14 2020-02-25 成都西山居世游科技有限公司 Facial expression control method and system based on eyeball tracking
CN111124150A (en) * 2019-12-25 2020-05-08 王少白 Input method capable of visually reflecting emotion
CN111339442A (en) * 2020-02-25 2020-06-26 北京声智科技有限公司 Online friend interaction method and device

Similar Documents

Publication Publication Date Title
Oh Kruzic et al. Facial expressions contribute more than body movements to conversational outcomes in avatar-mediated virtual environments
Rafaeli et al. Assessing interactivity in computer-mediated
Farrugia Facebook and relationships: A study of how social media use is affecting long-term relationships
Fujiwara et al. Evaluating interpersonal synchrony: Wavelet transform toward an unstructured conversation
Becker et al. Evaluating affective feedback of the 3D agent max in a competitive cards game
US20130169680A1 (en) Social system and method used for bringing virtual social network into real life
US20130339875A1 (en) Method and apparatus for presenting a participant engagement level in an online interaction
Bernieri The expression of rapport
US20130120114A1 (en) Biofeedback control system and method for human-machine interface
Bente et al. Effects of simulated gaze on social presence, person perception and personality attribution in avatar-mediated communication
CN109670385B (en) Method and device for updating expression in application program
Fujiwara et al. Video-based tracking approach for nonverbal synchrony: a comparison of Motion Energy Analysis and OpenPose
WO2017016123A1 (en) Method and device for testing application to be tested
Nakano et al. Generating robot gaze on the basis of participation roles and dominance estimation in multiparty interaction
Qiu et al. THE PERSUASIVE POWER OF EMOTICONS IN ELECTRONIC WORD-OF-MOUTH COMMUNICATION ON SOCIAL NETWORKING SERVICES.
Celiktutan et al. Continuous prediction of perceived traits and social dimensions in space and time
US11294474B1 (en) Controlling video data content using computer vision
Ertay et al. Challenges of emotion detection using facial expressions and emotion visualisation in remote communication
WO2022183424A1 (en) Emotion recognition-based online social method and apparatus, and storage medium
de la Rosa et al. View dependencies in the visual recognition of social interactions
CN111783587A (en) Interaction method, device and storage medium
CN112950398A (en) Online social contact method and device based on emotion recognition and storage medium
CN112843731B (en) Shooting method, shooting device, shooting equipment and storage medium
Marom et al. Relational dynamics in perception: impacts on trial-to-trial variation
Piton et al. Exploring peak-end effects in player affect through hearthstone

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21928511

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21928511

Country of ref document: EP

Kind code of ref document: A1