WO2021135812A1 - 一种情绪信息的处理方法及装置 (Method and apparatus for processing emotion information) - Google Patents

一种情绪信息的处理方法及装置 (Method and apparatus for processing emotion information)

Info

Publication number
WO2021135812A1
Authority
WO
WIPO (PCT)
Prior art keywords
emotion
information
robot
type
behavior
Application number
PCT/CN2020/133746
Other languages
English (en)
French (fr)
Inventor
段素霞
赵安莉
Original Assignee
华为技术有限公司
Application filed by 华为技术有限公司
Priority to EP20908501.8A (published as EP3923198A4)
Publication of WO2021135812A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/004 Artificial life, i.e. computing arrangements simulating life
    • G06N 3/008 Artificial life, i.e. computing arrangements simulating life based on physical entities controlled by simulated intelligence so as to replicate intelligent life forms, e.g. based on robots replicating pets or humans in their appearance or behaviour
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 11/00 Manipulators not otherwise provided for
    • B25J 11/0005 Manipulators having means for high-level communication with users, e.g. speech generator, face recognition means
    • B25J 11/001 Manipulators having means for high-level communication with users, with emotions simulating means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks

Definitions

  • This application relates to the field of information processing, and in particular to a method and device for processing emotional information.
  • A robot is a machine that performs work automatically. It can accept human commands, run pre-arranged programs, or act according to principles and guidelines formulated with artificial intelligence technology. Its task is to assist or replace human work, for example in manufacturing, construction, and other fields. With the development of science and technology, robot research has gradually expanded from the industrial field to medical care, health care, entertainment, and education. Robots not only have the ability to complete designated tasks, but also the ability to interact with people. To obtain a better interactive experience, the interaction between robots and humans needs to be made more anthropomorphic.
  • External or internal stimuli usually cause changes in a person's emotions, and people generally express different emotions through different behaviors (for example, different facial expressions, voices, or actions). Since the ways people express emotions are largely shared, observing the other party's behavior during communication helps determine the other party's emotion. For example, when you see a person smiling, you can generally consider their current emotion to be a positive one such as happiness; when you hear a person crying, you can generally consider their current emotion to be a negative one such as sadness.
  • The embodiments of the present application provide a method and device for processing emotion information, a computer device, and a readable storage medium, which help to show the robot's emotion-dissipation process and improve the robot's degree of personification.
  • A first aspect of the embodiments of the present application provides a method for processing emotion information, including: in response to a robot detecting a first emotion trigger event, determining first emotion information according to the first emotion trigger event; and generating first control information according to the first emotion information, where the first control information is used to instruct the robot to perform a first behavior and a second behavior in sequence, both of which are used to express emotions, the first behavior is used to express the emotion indicated by the first emotion information, and the second behavior is used to express an emotion milder than the emotion indicated by the first emotion information.
  • Emotion trigger events are events that can cause changes in the emotion information of the robot and can be detected by the robot;
  • the execution subject may be the robot, the server, or a system comprising the robot and the server;
  • the emotion information can be quantified as a parameter that expresses the emotion of the robot;
  • the emotion corresponding to the second behavior is milder than the first emotion; in other words, the second behavior is closer than the first behavior to the behavior under a calm emotion, or the type of emotion corresponding to the second behavior is the same as the type of the first emotion but of a lower degree; the robot can perform emotion-expressing behaviors.
  • the first and second behaviors are used to express emotions, for example by making sounds, displaying images or text, or driving physical parts to perform actions, so as to imitate human emotion-expressing behaviors such as laughing, crying, or frowning.
  • after the first emotion information of the robot is determined, instructing the robot to perform the first behavior and the second behavior in sequence helps the person interacting with the robot perceive the dissipation of the robot's emotion, which improves the delicacy of the robot's emotional expression, thereby increasing the robot's degree of personification, making it easier for people to resonate with the robot, improving user stickiness, and increasing the value of the robot.
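  • For illustration, the following minimal Python sketch shows this flow under assumptions: the event names, the EmotionInfo/Behavior structures, and the mapping values are hypothetical and are not defined by the application.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class EmotionInfo:
    kind: str      # the "first type", e.g. "happy" or "sad"
    degree: float  # the "first degree", e.g. a value in (0, 1]

@dataclass
class Behavior:
    kind: str
    degree: float  # stands in for amplitude / frequency / volume

def determine_first_emotion_info(event: str) -> EmotionInfo:
    # Stand-in for determining emotion information from a trigger event,
    # here via a preset correspondence (a trained model could be used instead).
    table = {
        "greeted_enthusiastically": EmotionInfo("happy", 0.5),
        "task_failed": EmotionInfo("sad", 0.8),
    }
    return table.get(event, EmotionInfo("calm", 0.0))

def generate_first_control_info(e: EmotionInfo) -> List[Behavior]:
    # The first behavior expresses the emotion itself; the second behavior
    # expresses a milder emotion of the same type.
    return [Behavior(e.kind, e.degree), Behavior(e.kind, e.degree * 0.5)]

def on_emotion_trigger_event(event: str) -> None:
    e1 = determine_first_emotion_info(event)
    for behavior in generate_first_control_info(e1):
        print(f"perform {behavior.kind} behavior at degree {behavior.degree:.2f}")

on_emotion_trigger_event("greeted_enthusiastically")
```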
  • the first emotion trigger event includes at least one of: the robot being moved, the robot falling down, an environmental parameter of the robot being worse than a preset parameter, a task of the robot failing, and a task of the robot succeeding.
  • the environmental parameters may be light intensity, temperature, noise decibels, and so on.
  • the degree of the second behavior is milder than the degree of the first behavior, where the degree of a behavior is at least one of its amplitude, frequency, and volume.
  • the first control information is also used to instruct the robot to perform a third behavior for expressing an initial emotion after performing the second behavior and before detecting the next emotion trigger event.
  • the initial emotion information used to indicate the initial emotion is pre-stored in the storage medium.
  • the third behavior is the first emotion-expressing behavior performed after the robot is powered on.
  • the first emotion information is used to indicate that the type of emotion is the first type, and the degree of emotion is the first degree.
  • the type of emotion expressed by the second behavior is the first type, and the degree of emotion expressed by the second behavior is lighter than the first degree.
  • the method further includes: before the first emotion information is determined, in response to the robot detecting a second emotion trigger event, determining historical emotion information according to the second emotion trigger event, where the historical emotion information is used to generate historical control information, the historical control information is used to instruct the robot to perform a historical behavior, and the historical behavior is used to express the emotion indicated by the historical emotion information;
  • the determining of the first emotion information according to the first emotion trigger event includes: determining first emotion change information according to the first emotion trigger event, where the first emotion change information is used to indicate the type of emotion whose degree is changed by the first emotion trigger event and the amount of that change; and determining the first emotion information according to the first emotion change information and the historical emotion information.
  • if the emotion type indicated by the first emotion change information and the emotion type indicated by the historical emotion information are both the emotion type in a first direction, then the first type is the emotion type in the first direction, the first degree is positively correlated with the amount of change indicated by the first emotion change information, and the first degree is positively correlated with the degree of emotion indicated by the historical emotion information; the emotion type in the first direction is a positive emotion type (for example, happiness, satisfaction, or confidence) or a negative emotion type (for example, sadness, disappointment, or lack of confidence).
  • the first degree is negatively correlated with a first time length, where the first time length is the length of the interval between the second moment at which the robot detects the second emotion trigger event and the first moment at which the robot detects the first emotion trigger event.
  • the first time length is not strictly limited to the exact interval between the first moment and the second moment; it may be slightly larger or smaller than that interval, as long as it reflects changes in the interval, for example, when the interval becomes larger, the first time length becomes correspondingly larger.
  • the emotion type in the first direction is a positive emotion type and the emotion type in a second direction is a negative emotion type, or the emotion type in the first direction is a negative emotion type and the emotion type in the second direction is a positive emotion type;
  • if the emotion type indicated by the first emotion change information is the emotion type in the first direction and the emotion type indicated by the historical emotion information is the emotion type in the second direction, then the first type is the emotion type in the first direction, the first degree is positively correlated with the amount of change indicated by the first emotion change information, and the first degree is negatively correlated with the degree of emotion indicated by the historical emotion information;
  • the first degree is positively correlated with a first duration, or the first degree is negatively correlated with the first duration, where the first duration is the interval between the moment at which the robot detects the second emotion trigger event and the moment at which it detects the first emotion trigger event.
  • A second aspect of the embodiments of the present application provides an apparatus for processing emotion information, including: a determining module, configured to determine first emotion information according to a first emotion trigger event in response to the robot detecting the first emotion trigger event; and a generating module, configured to generate first control information according to the first emotion information, where the first control information is used to instruct the robot to perform a first behavior and a second behavior in sequence, both of which are used to express emotions, the first behavior is used to express the emotion indicated by the first emotion information, and the second behavior is used to express an emotion milder than the emotion indicated by the first emotion information.
  • the first emotion trigger event includes at least one of: the robot being moved, the robot falling down, an environmental parameter of the robot being worse than a preset parameter, a task of the robot failing, and a task of the robot succeeding.
  • the degree of the second behavior is milder than the degree of the first behavior, where the degree of a behavior is at least one of its amplitude, frequency, and volume.
  • the first control information is also used to instruct the robot to perform a third behavior for expressing an initial emotion after performing the second behavior and before detecting the next emotion trigger event.
  • the initial emotion information used to indicate the initial emotion is pre-stored in the storage medium.
  • the third behavior is the first emotion-expressing behavior performed after the robot is powered on.
  • the first emotion information is used to indicate that the type of emotion is the first type, and the degree of emotion is the first degree.
  • the type of emotion expressed by the second behavior is the first type, and the degree of emotion expressed by the second behavior is lighter than the first degree.
  • the determining module is further configured to: before the first emotion information is determined according to the first emotion trigger event, in response to the robot detecting a second emotion trigger event, determine historical emotion information according to the second emotion trigger event, where the historical emotion information is used to generate historical control information, the historical control information is used to instruct the robot to perform a historical behavior, and the historical behavior is used to express the emotion indicated by the historical emotion information; when determining the first emotion information according to the first emotion trigger event, the determining module is specifically configured to: determine first emotion change information according to the first emotion trigger event, where the first emotion change information is used to indicate the type of emotion whose degree is changed by the first emotion trigger event and the amount of that change; and determine the first emotion information according to the first emotion change information and the historical emotion information.
  • if the emotion type indicated by the first emotion change information and the emotion type indicated by the historical emotion information are both the emotion type in the first direction, then the first type is the emotion type in the first direction, the first degree is positively correlated with the amount of change indicated by the first emotion change information, and the first degree is positively correlated with the degree of emotion indicated by the historical emotion information; the emotion type in the first direction is a positive emotion type or a negative emotion type.
  • the first degree is negatively correlated with a first time length, where the first time length is the length of the interval between the second moment at which the robot detects the second emotion trigger event and the first moment at which the robot detects the first emotion trigger event.
  • the emotion type in the first direction is a positive emotion type and the emotion type in the second direction is a negative emotion type, or the emotion type in the first direction is a negative emotion type and the emotion type in the second direction is a positive emotion type;
  • if the emotion type indicated by the first emotion change information is the emotion type in the first direction and the emotion type indicated by the historical emotion information is the emotion type in the second direction, then the first type is the emotion type in the first direction, the first degree is positively correlated with the amount of change indicated by the first emotion change information, and the first degree is negatively correlated with the degree of emotion indicated by the historical emotion information;
  • the first degree is positively correlated with a first duration, or the first degree is negatively correlated with the first duration, where the first duration is the interval between the moment at which the robot detects the second emotion trigger event and the moment at which it detects the first emotion trigger event.
  • the device further includes an execution module configured to: after the generating module generates the first control information according to the first emotion information, execute the first behavior and the second behavior in sequence according to the first control information.
  • A third aspect of the embodiments of the present application provides a computer device, including a processor and a memory; when the processor runs the computer instructions stored in the memory, it executes the method described in the first aspect or any one of the possible implementations of the first aspect of the embodiments of the present application.
  • A fourth aspect of the embodiments of the present application provides a computer-readable storage medium including instructions which, when run on a computer, cause the computer to execute the method described in the first aspect or any one of the possible implementations of the first aspect of the embodiments of the present application.
  • A fifth aspect of the embodiments of the present application provides a computer program product including instructions which, when run on a computer, cause the computer to execute the method described in the first aspect or any one of the possible implementations of the first aspect of the embodiments of the present application.
  • A sixth aspect of the embodiments of the present application provides a robot, including an input module, an output module, a processor, and a memory; the input module is used to detect emotion trigger events; the memory is used to store computer instructions; when the processor runs the computer instructions, it executes the method described in the first aspect or any one of the possible implementations of the first aspect; the output module is used to execute the control information generated by the processor, for example, after the first control information is generated, performing the first behavior and the second behavior in sequence according to the first control information.
  • A seventh aspect of the embodiments of the present application provides a robot system, including a robot and a server; the robot is used to detect an emotion trigger event and send the detected emotion trigger event to the server; the server is used to execute the method described in the first aspect or any one of the possible implementations of the first aspect of the embodiments of the present application according to the emotion trigger event detected by the robot, and to send control information to the robot; the robot is also used to execute the control information sent by the server, for example, after receiving the first control information sent by the server, performing the first behavior and the second behavior in sequence according to the first control information.
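  • As a rough illustration of this robot/server split, the sketch below simulates the exchange with plain function calls; the event names, the control-information format, and the mapping values are assumptions and not part of the application.

```python
from typing import List, Tuple

def server_handle_event(event: str) -> List[Tuple[str, float]]:
    # Server side: runs the method of the first aspect and returns control
    # information listing the behaviors to perform in sequence.
    mapping = {"task_failed": ("sad", 0.8), "greeted_enthusiastically": ("happy", 0.5)}
    kind, degree = mapping.get(event, ("calm", 0.0))
    return [(kind, degree), (kind, degree * 0.5)]  # first behavior, then a milder second one

def robot_main() -> None:
    event = "task_failed"                        # the robot detects an emotion trigger event
    control_info = server_handle_event(event)    # in practice sent to and returned by the server
    for kind, degree in control_info:            # the robot executes the behaviors in sequence
        print(f"express {kind} (degree {degree:.2f})")

robot_main()
```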
  • FIG. 1A is a schematic diagram of an embodiment of a robot provided by the present application;
  • FIG. 1B is a schematic diagram of an embodiment of the server provided by the present application;
  • FIG. 1C is a schematic diagram of an embodiment of the robot system provided by the present application;
  • FIG. 2 is a schematic diagram of an embodiment of a method for processing emotion information provided by the present application;
  • FIG. 3A is a schematic diagram of different behaviors corresponding to different degrees of happiness provided in this application;
  • FIG. 3B is a schematic diagram of different behaviors corresponding to different degrees of sadness provided in this application;
  • FIG. 4A is a schematic diagram of the dissipation process of behaviors used by the robot of the present application to express emotions;
  • FIG. 4B is another schematic diagram of the dissipation process of behaviors used by the robot of the present application to express emotions;
  • FIG. 4C is another schematic diagram of the dissipation process of behaviors used by the robot of the present application to express emotions;
  • FIG. 5 is a possible schematic diagram of the curves corresponding to the dissipation equations of two emotions provided in this application;
  • FIG. 6A to FIG. 6E are schematic diagrams of a possible application scenario of the emotion information processing method provided by this application;
  • FIG. 7A is a schematic diagram of an embodiment of an apparatus for processing emotion information provided by the present application;
  • FIG. 7B is a schematic diagram of another embodiment of the apparatus for processing emotion information provided by the present application.
  • the embodiments of the present application provide a method and device for processing emotional information, a robot, a server, a robot system, a computer-readable storage medium, and a computer program product.
  • the embodiments of the present application will be described below with reference to the accompanying drawings.
  • FIG. 1A is a schematic diagram of an embodiment of the robot 110 provided in the present application.
  • the robot 110 may include an input module 111, a processor 112, a memory 113, and an output module 114.
  • the input module 111 may include a sensor and a data processing module.
  • the sensor is used to detect data
  • the data processing module is used to process the data detected by the sensor.
  • for example, a camera is used to detect image data, a microphone array is used to detect audio data, a thermal sensor is used to detect ambient temperature data, and a photosensitive sensor is used to detect light intensity data; the memory 113 is used to store computer programs; the processor 112 is used to execute the computer programs in the memory, perform data processing, and send control information to the output module;
  • the output module 114 is used to interact with users (human or other robots).
  • the output module 114 may be one or more of a display screen (for example, it may be set on the head of the robot in FIG. 1A), a speaker, and a driving mechanism.
  • the driving mechanism can be used to control the robot to change its body posture, for example, by controlling the rotation of the robot's hands, arms, legs, head, etc., and the driving mechanism can also be used to control the movement of the robot.
  • FIG. 1A takes a humanoid form of the robot 110 as an example. It should be noted that this application does not limit the shape of the robot 110, nor does it require that the robot 110 be able to perform all human behaviors; it is sufficient that the robot 110 can act according to principles and programs formulated with artificial intelligence technology and perform one or more types of behaviors used to express emotions.
  • the types of human behaviors used to express emotions generally include facial expressions, emotional body movements (such as clapping hands or shaking the body), and tone of voice; it is sufficient that the robot 110 can show the user (a human or another robot) at least one of these types of emotion-expressing behaviors, for example by showing facial expressions, making body movements, or producing sounds with different tones.
  • the embodiment of the present application provides that the robot 110 may also be a smart terminal device (for example, a mobile phone).
  • an embodiment of the present application also provides a server 120.
  • the server 120 may include a processor 121 and a memory 122.
  • the processor in FIG. 1A and/or FIG. 1B may be a central processing unit (CPU), a network processor (NP), a combination of a CPU and an NP, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
  • the general-purpose processor may be a microprocessor or the processor may also be any conventional processor or the like.
  • the steps in the method disclosed in this application can be directly embodied as being executed and completed by a hardware decoding processor, or executed and completed by a combination of hardware and software modules in the decoding processor.
  • the software module can be located in a storage medium mature in the art, such as a random access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory, or register.
  • the storage medium is located in the memory, and the processor reads the information in the memory and completes the steps of the above method in combination with its hardware.
  • the apparatus may include multiple processors or the processors may include multiple processing units.
  • the processor may be a single-core processor, or a multi-core or many-core processor.
  • the processor may be an ARM architecture processor.
  • the memory in FIG. 1A and/or FIG. 1B is used to store computer instructions executed by the processor.
  • the memory can be a storage circuit or a memory.
  • the memory may be volatile memory or non-volatile memory, or may include both volatile and non-volatile memory.
  • the non-volatile memory can be read-only memory (ROM), programmable read-only memory (programmable ROM, PROM), erasable programmable read-only memory (erasable PROM, EPROM), electrically erasable programmable read-only memory (electrically EPROM, EEPROM), or flash memory.
  • the volatile memory may be random access memory (RAM), which is used as an external cache.
  • the memory may be independent of the processor.
  • the processor and the memory may be connected to each other through a bus.
  • the bus may be a peripheral component interconnect standard (PCI) bus or an extended industry standard architecture (EISA) bus, etc.
  • the bus can be divided into an address bus, a data bus, a control bus, and so on.
  • the memory may also be a storage unit in the processor, which is directly attached to the processor, which is not limited here. Although only one memory is shown in the figure, the device may also include multiple memories or the memory may include multiple storage units.
  • FIG. 1C is a schematic diagram of an embodiment of the robot system 100 provided by the present application.
  • the robot system 100 includes a robot 110 and a server 120.
  • the robot 110 and the server 120 may further include a communication module, and the robot 110 and the server 120 may perform wired or wireless communication through their respective communication modules, for example, interact through the Internet.
  • the robot 110, the server 120, and the robot system 100 provided by the embodiments of the present application are described above, and the processing method of the emotional information provided by the present application is described below.
  • This method can be applied to the robot 110 shown in Fig. 1A, can also be applied to the server 120 shown in Fig. 1B, and can also be applied to the robot system 100 shown in Fig. 1C.
  • the method is executed by the robot 110 and the server 120 together.
  • the execution subject of this method is collectively referred to as a computer device below.
  • Human emotions are usually affected by certain events, which produce emotional changes; events that cause changes in emotion are called emotion trigger events. Similarly, events that cause changes in the emotion information of a robot can be called emotion trigger events. After people perceive an external emotion trigger event (such as being praised) or an internal emotion trigger event (such as hunger), they produce a corresponding emotion, such as happiness or sadness, and express the generated emotion through corresponding behaviors, for example laughing to express happiness and frowning to express sadness. When the generated emotion returns to calm (that is, there is no emotion), people no longer show behaviors for expressing emotion, that is, they show no expression.
  • the robot can generate emotion information in response to detecting an emotion trigger event and, imitating the way people express emotions, perform behaviors for expressing emotions, so that interaction partners (such as people or other robots) can understand the robot's emotions.
  • when the robot generates emotion information e1 in response to detecting an emotion trigger event i1, the robot executes the behavior a1 corresponding to the emotion information e1 to express the emotion; after that, the robot can maintain the behavior a1 until the behavior a1 has been maintained for a preset duration, or until the next emotion trigger event i2 is detected.
  • the weakening of an emotion can be a weakening in degree within the same type, or a change from one type of emotion to another, for example from joy to satisfaction.
  • after detecting the emotion trigger event i1 and before detecting the next emotion trigger event i2, the robot generally executes the behavior for expressing emotion in one of the following two ways:
  • FIG. 2 is a schematic diagram of an embodiment of a method for processing emotion information provided by an embodiment of the application.
  • an embodiment of the method for processing emotion information of the present application may include the following steps:
  • the robot can detect emotion trigger events through cameras, thermal sensors, light sensors, sound detection devices and other sensors.
  • the computer device can obtain the emotion trigger event and determine the robot's emotion information based on the emotion trigger event; the emotion trigger event detected in step 201 is referred to as the first emotion trigger event, and the determined emotion information is referred to as the first emotion information.
  • Emotional information can be quantified as parameters that express the emotions of the robot, and different values of the parameters can be used to represent different emotional information of the robot.
  • the embodiment of this application does not limit the way that the computer device determines the first emotion information of the robot according to the first emotion trigger event.
  • for example, the computer device can input the detected first emotion trigger event into a neural network model trained with artificial intelligence, which outputs the first emotion information of the robot; or the computer device can determine the first emotion information corresponding to the first emotion trigger event through a preset correspondence relationship.
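  • The two alternatives can be seen as interchangeable strategies behind the same interface; the sketch below, with hypothetical names and values, illustrates the preset-correspondence path and a pluggable model path.

```python
from typing import Callable, Optional, Tuple

EmotionInfo = Tuple[str, float]  # (type of emotion, degree of emotion)

PRESET_CORRESPONDENCE = {
    # trigger event -> first emotion information (values are illustrative)
    "greeted_enthusiastically": ("happy", 0.5),
    "task_failed": ("sad", 0.8),
}

def determine_emotion_info(event: str,
                           model: Optional[Callable[[str], EmotionInfo]] = None) -> EmotionInfo:
    # Either consult a trained model or fall back to the preset correspondence.
    if model is not None:
        return model(event)
    return PRESET_CORRESPONDENCE.get(event, ("calm", 0.0))

print(determine_emotion_info("task_failed"))  # -> ('sad', 0.8)
```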
  • the computer device After the computer device determines the first emotion information, it can generate first control information based on the first emotion information.
  • the first control information is used to instruct the robot to perform the first behavior and the second behavior in sequence, and both the first behavior and the second behavior are behaviors used to express emotions.
  • the first behavior is used to express emotions indicated by the first emotion information
  • the second behavior is used to express emotions that are milder than the emotions indicated by the first emotion information.
  • after the robot detects the first stimulus data and before detecting the next stimulus data, in addition to performing the first behavior for expressing the emotion corresponding to the first emotion information, it can also perform the second behavior, which is used to express an emotion milder than the emotion indicated by the first emotion information.
  • the method for processing emotion information provided by the embodiments of the present application helps the robot show a process in which emotion gradually weakens over time, allows the robot to express emotions in a more delicate way, improves the personification of the robot, and increases user stickiness.
  • the robot may include multiple output modules, and each output module is used to output a unit action, for example, a speaker is used to output a sound, a display screen is used to output expressions, and a driving mechanism is used to output an action of shaking the body.
  • the first behavior for expressing emotion information may include multiple unit actions performed by the robot at the same time, for example outputting a smiling expression through the display screen while playing laughter through the speaker and having the driving mechanism shake the body slightly.
  • the second behavior may include multiple unit actions performed by the robot at the same time.
  • the behavior of robots for expressing emotions can be understood by referring to the behaviors of humans for expressing emotions.
  • the first and second behaviors can be one or more of behaviors such as laughing, crying, frowning, sighing, clapping, singing, and speaking in a pleasant or angry tone; the emotion expressed by a behavior can follow the emotion expressed by the corresponding human behavior, for example, laughing is used to express happiness, and crying and frowning are used to express sadness.
  • the first behavior and the second behavior can be the same type of behavior, for example, both are laughing, but the degree of the two is different.
  • the degree of the behavior may refer to the amplitude or frequency or volume of the behavior.
  • for example, if the first behavior is a laughing expression of the robot, the amplitude of the first behavior refers to the curvature of the mouth, eyes, or eyebrows in the expression; if the first behavior is the robot laughing aloud or clapping, the frequency of the first behavior refers to the frequency of the sound or of the clapping; if the first behavior is the robot crying, the volume of the first behavior refers to the volume of the crying.
  • the emotion expressed by the second behavior is milder than the emotion expressed by the first behavior.
  • the degree of the second behavior is milder than the degree of the first behavior, where the degree of a behavior can be understood as at least one of amplitude, frequency, and volume; that is, the amplitude of the second behavior is smaller than that of the first behavior, and/or the frequency of the second behavior is lower than that of the first behavior, and/or the sound produced by the second behavior is quieter than that produced by the first behavior.
  • people's emotions are generally divided into two categories, positive emotions and negative emotions, and the same type of emotion can also be divided into different degrees, for example using adverbs that indicate different levels (such as extremely, very, somewhat, and slightly, or severe, moderate, mild, and slight) or numerical values to distinguish different degrees of emotion.
  • positive emotions and negative emotions will be divided into more types, or more finely divided into different degrees of the same type of emotions.
  • emotion information can be used to indicate the type of emotion and the degree of emotion.
  • a positive number can represent the degree of emotion, and the larger the value, the higher the degree of emotion.
  • I1 and I2 are used to represent the degree of happiness and the degree of sadness, respectively.
  • degree thresholds can be set for the various emotions; for example, in FIG. 3A, TH1 (0.1) represents the threshold of happiness, and in FIG. 3B, TH2 (0.2) represents the threshold of sadness. When the degree of happiness exceeds 0.1, a facial expression corresponding to that degree of happiness is shown; when the degree of happiness does not exceed 0.1, the behavior for expressing happiness may not be performed, that is, no expression is shown on the face.
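  • The following sketch illustrates this thresholding; the threshold values follow FIGS. 3A/3B as described above, while the concrete expressions and the 0.35/0.4 split points (taken from the degree ranges mentioned further below) are only illustrative.

```python
THRESHOLDS = {"happy": 0.1, "sad": 0.2}  # TH1 and TH2 from FIGS. 3A and 3B

def expression_for(kind: str, degree: float) -> str:
    if degree <= THRESHOLDS.get(kind, 0.0):
        return "no expression"  # below the threshold, the behavior is not performed
    if kind == "happy":
        return "big smile" if degree > 0.35 else "slight smile"
    if kind == "sad":
        return "crying face" if degree > 0.4 else "frown"
    return "no expression"

print(expression_for("happy", 0.05))  # -> no expression
print(expression_for("happy", 0.5))   # -> big smile
```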
  • the type of emotion corresponding to the second behavior is the same as the type of emotion corresponding to the first behavior, and the degree of emotion expressed by the second behavior is lighter than the degree of emotion expressed by the first behavior.
  • the robot after the robot performs the second behavior, it can also perform one or more other behaviors for expressing emotions.
  • thereby showing a process in which the robot's expressed emotion gradually dissipates.
  • the robot may stop performing the behavior for expressing emotional information after it performs the last behavior for expressing emotional information before detecting the next stimulus data.
  • FIGS. 4A to 4C exemplarily show processes in which emotion-expressing behaviors dissipate. Referring to FIG. 4A, as time goes by, the robot's face successively shows face 1, face 2, and face 0, where face 1 and face 2 represent the first behavior (specifically, a first expression) and the second behavior (specifically, a second expression), and face 0 represents stopping the behavior of expressing emotion (specifically, no expression).
  • initial emotional information can be set for the robot, and after the robot performs the second behavior, it can continue to perform the third behavior for expressing the initial emotional information.
  • the initial emotion information can be stored in a storage medium in advance.
  • the storage medium can store multiple candidate initial emotional information, and the computer device can dynamically select one as the current initial emotional information.
  • for example, the robot can be designed to change the initial emotional information periodically, for example with a cycle of 28 days; the computer device can store 28 candidate initial emotional information items, each corresponding to one day in the cycle, with the candidate initial emotional information of two adjacent days being different; the computer device can then determine the current initial emotional information according to the position of the current date within the current cycle (for example, today is the second day of the current cycle).
  • face 3 and face 4 are used to represent different initial emotional information.
  • referring to FIG. 4B together with FIG. 3A, the initial emotional information corresponding to face 3 is: happy, with a degree between 0.1 and 0.35;
  • referring to FIG. 4C together with FIG. 3B, the initial emotional information corresponding to face 4 is: sad, with a degree between 0.2 and 0.4.
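  • A minimal sketch of such a periodic selection of initial emotional information is shown below; the candidate list contents are placeholders, only the 28-day cycle and the adjacent-days-differ rule follow the description above.

```python
import datetime

CYCLE_DAYS = 28
# 28 candidate initial emotions; adjacent days alternate so neighbouring days differ.
CANDIDATES = [("happy", 0.1 + 0.01 * i) if i % 2 == 0 else ("sad", 0.2 + 0.01 * i)
              for i in range(CYCLE_DAYS)]

def initial_emotion_for(date: datetime.date, cycle_start: datetime.date):
    day_index = (date - cycle_start).days % CYCLE_DAYS  # position of the date in the cycle
    return CANDIDATES[day_index]

start = datetime.date(2020, 1, 1)
print(initial_emotion_for(datetime.date(2020, 1, 2), start))  # the second day of the cycle
```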
  • the currently detected stimulus data is called the first stimulus data; the first stimulus data is not limited to data detected by one type of sensor and can include data detected by multiple types of sensors, for example image data detected by a camera, audio data detected by a voice detection device, ambient temperature data detected by a thermal sensor, and light intensity data detected by a photosensitive sensor.
  • the computer device can determine, according to the first stimulus data, whether an emotion trigger event, that is, an event that causes a change in the robot's emotion, has occurred; if such an event has occurred, it is determined that the robot has detected an emotion trigger event (referred to as the first emotion trigger event).
  • the computer device can detect corresponding data through sensors such as a camera, a thermal sensor, a light sensor, a sound detection device, etc., for example, a camera detects image data, and a voice detection device detects audio data.
  • the ambient temperature data is detected by the thermal sensor, and the light intensity data is detected by the photosensitive sensor.
  • the computer device can determine whether an emotion trigger event is detected based on the data detected by one or more sensors (referred to as stimulus data). Specifically, the computer device can recognize and process the stimulus data; for example, it can analyze the detected image data to perform face recognition, recognizing the expression of the person in the image, the number of people, and so on; after a face is recognized, the computer device can also recognize the person's body posture, for example recognizing that the person is beckoning; the computer device can also perform text recognition on the detected audio data and apply semantic analysis to the recognized text to obtain the speaker's intention, such as ordering the robot to perform a task or greeting the robot.
  • the computer device can recognize the emotional trigger event indicated by the stimulus data (referred to as the first emotional trigger event).
  • the first emotion trigger event may include at least one of: the robot being moved, the robot falling, an environmental parameter of the robot being worse than a preset parameter, a task of the robot failing, and a task of the robot succeeding.
  • Environmental parameters can refer to light intensity, ambient temperature, noise decibels, and so on.
  • the computer device may pre-store the correspondence between emotion trigger events and emotion change information, so that after identifying the first emotion trigger event indicated by the first stimulus data, the corresponding first emotion change information can be determined;
  • the first emotion change information is used to indicate the type and degree of change of the emotion whose degree is changed by the first emotion trigger event.
  • the type and degree of emotion indicated by the first emotion change information are the type and degree of emotion indicated by the first emotion information.
  • the correspondence between emotion trigger events and emotion change information can be designed according to the needs of the robot, which helps design a correspondence that better matches human emotional changes and improves the robot's degree of personification.
  • the first emotion change information may be used to indicate the type of emotion affected by the first emotion triggering event and the degree of change caused by the type of emotion.
  • Table 1 and Table 2 respectively show the types of emotions affected by various emotion triggering events, and the amount of change in the degree of the corresponding type of emotion caused.
  • the degree change is represented by a value greater than 0 and not more than 1, and the greater the value, the greater the change in degree.
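  • Tables 1 and 2 are not reproduced in this excerpt; the dictionary below is a stand-in that only mirrors their structure (event to affected emotion type and change amount). The 0.5 and 0.8 entries follow the example values given later in the text; the remaining entries are assumptions.

```python
EMOTION_CHANGE_TABLE = {
    # event                      (affected emotion type, amount of change in (0, 1])
    "greeted_enthusiastically": ("happy", 0.5),
    "task_succeeded":           ("happy", 0.3),
    "robot_moved":              ("sad",   0.2),
    "robot_fell":               ("sad",   0.6),
    "task_failed":              ("sad",   0.8),
}

def emotion_change_for(event: str):
    # Returns the first emotion change information, or None if the event causes no change.
    return EMOTION_CHANGE_TABLE.get(event)

print(emotion_change_for("task_failed"))  # -> ('sad', 0.8)
```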
  • when a person perceives an emotion trigger event, the person's current mood generally changes; the mood before the change caused by the emotion trigger event can be called the historical emotion margin, and the mood after the change can be called the stimulus-response emotion. The stimulus-response emotion is determined not only by the emotional change brought about by the stimulus but also by the historical emotion margin, and the historical emotion margin is determined by the emotional response to the previous stimulus. Therefore, to improve the personification of the robot, if the robot detects a second emotion trigger event before detecting the first emotion trigger event and determines the corresponding emotion information (called historical emotion information) according to the second emotion trigger event, where the historical emotion information is used to generate historical control information, the historical control information is used to instruct the robot to perform a historical behavior, and the historical behavior is used to express the emotion indicated by the historical emotion information, then, in a possible implementation, step 201 may include: determining the first emotion change information according to the first emotion trigger event, and then determining the first emotion information according to the first emotion change information and the historical emotion information.
  • the type of emotion indicated by the first emotion information is referred to as the first type, and the degree of emotion indicated by the first emotion information is referred to as the first degree.
  • the influence between emotions can be mutually reinforcing or mutually inhibiting.
  • if the emotion type indicated by the first emotion change information and the emotion type indicated by the historical emotion information are both the emotion type in the first direction, then the first type is the emotion type in the first direction, the first degree is positively correlated with the amount of change indicated by the first emotion change information, and the first degree is positively correlated with the degree of emotion indicated by the historical emotion information; the emotion type in the first direction is a positive emotion type or a negative emotion type.
  • alternatively, the emotion type in the first direction is a positive emotion type and the emotion type in the second direction is a negative emotion type, or the emotion type in the first direction is a negative emotion type and the emotion type in the second direction is a positive emotion type; if the type of emotion indicated by the first emotion change information is the emotion type in the first direction and the type of emotion indicated by the historical emotion information is the emotion type in the second direction, then: if the first type is the emotion type in the first direction, the first degree is positively correlated with the amount of change indicated by the first emotion change information and negatively correlated with the degree of emotion indicated by the historical emotion information; if the first type is the emotion type in the second direction, the first degree is negatively correlated with the amount of change indicated by the first emotion change information and positively correlated with the degree of emotion indicated by the historical emotion information.
  • the historical emotion margin is also related to the length of time between the time when the second emotion trigger event is acquired and the time when the first emotion trigger event is acquired.
  • the first duration is the interval duration between the second moment when the second emotion trigger event is acquired and the first moment when the first emotion trigger event is acquired.
  • the first degree is negatively correlated with the first duration.
  • the first time length is not strictly limited to the exact interval between the first moment and the second moment; it may be slightly larger or smaller than that interval, as long as it reflects changes in the interval, for example, when the interval becomes larger, the first time length becomes correspondingly larger.
  • if the first type is a negative emotion type and the type of emotion indicated by the historical emotion information is a positive emotion type, or the first type is a positive emotion type and the type of emotion indicated by the historical emotion information is a negative emotion type, then the first degree is positively correlated with the first duration.
  • the historical emotion margin I_r can be calculated according to the first duration and the degree of emotion indicated by the historical emotion information according to the following formula (referred to as the dissipation equation):
  • one or more parameters in the dissipation equations corresponding to different types of emotions may be different.
  • the solid curve and the dashed curve in FIG. 5 respectively represent the dissipation curve for happiness with a degree of 1 and the dissipation curve for sadness with a degree of 1. The two are similar in that both are "inverted S-shaped" curves, that is, the rate at which the degree changes over time first increases and then decreases; they differ in that the time taken for the degree of sadness to decrease from 1 to 0 (called the dissipation duration) is shorter than the time taken for the degree of happiness to decrease from 1 to 0.
  • the dissipating time of negative emotions is shorter than the dissipating time of positive emotions, which is beneficial for the robot to express more positive emotions.
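  • The exact dissipation equation is not reproduced in this excerpt; the logistic-style decay below is only an assumed stand-in that matches the described "inverted S" shape and the shorter dissipation duration of negative emotions.

```python
import math

# Dissipation durations in seconds; sadness dissipates faster than happiness (values assumed).
DISSIPATION_DURATION = {"happy": 60.0, "sad": 40.0}

def historical_emotion_margin(kind: str, degree: float, elapsed: float) -> float:
    T = DISSIPATION_DURATION[kind]
    if elapsed >= T:
        return 0.0
    # Logistic curve centered at T/2: slow decay at first, fastest in the middle, slow at the end.
    k = 10.0 / T
    return degree / (1.0 + math.exp(k * (elapsed - T / 2.0)))

print(round(historical_emotion_margin("happy", 1.0, 5.0), 3))   # shortly after the event
print(round(historical_emotion_margin("happy", 1.0, 55.0), 3))  # nearly dissipated
```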
  • the above respectively introduces the determination methods of the emotion change information and the historical emotion margin corresponding to the emotion triggering event.
  • the following example introduces the method of determining the first emotion information based on the emotion change information and the historical emotion margin.
  • the first type indicated by the first emotion information of the robot corresponds to an emotion type whose degree exceeds its threshold; the degree of each type of emotion can be determined according to the emotion change information, the historical emotion margin, and the amount of mutual influence between different types of emotions; a degree that exceeds its threshold is regarded as the first degree, and the emotion type corresponding to that degree is regarded as the first type.
  • the following example introduces the method for determining the first emotion information of the robot.
  • the robot includes three types of emotions, denoted as A, B, and C, respectively, and the degrees of the three emotions are denoted as: I(A), I(B), and I(C), A and B promote each other, A and C inhibits each other, and B and C inhibit each other.
  • the amount of change is a.
  • the degree of three emotions can be calculated separately according to the following process:
  • I(A) = F(A) + E(A) + In(A_B) - In(A_C);
  • F(A), F(B), and F(C) are the historical emotional margins of A, B, and C, respectively.
  • the first emotional trigger event is a robot
  • E(A), E(B), E(C) are the degree changes of A, B, and C caused by the first emotion triggering event, respectively.
  • In(A_B) is the influence of B on A
  • In(B_A) is the influence of A on B.
  • In(A_B) = in(A_B) * I(B)
  • In(B_A) = in(B_A) * E(A)
  • in(A_B) is the influence factor of B on A, and in(B_A) is the influence factor of A on B; the two can be different or the same.
  • the meanings of other In() and in() can be understood similarly, and will not be repeated here.
  • the degree exceeding its threshold among I(A), I(B), and I(C) is used as the degree indicated by the first emotion information, and the emotion type corresponding to that degree is used as the type of emotion indicated by the first emotion information.
  • alternatively, the maximum degree may be used as the degree indicated by the first emotion information, and the type of emotion corresponding to that degree as the type of emotion indicated by the first emotion information.
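  • The sketch below works this example through in Python. Only I(A) is written out in the text; the analogous forms for I(B) and I(C), the rule that each influence term acts on the other emotion's margin plus change, and all numeric values are assumptions made for illustration.

```python
F = {"A": 0.2, "B": 0.1, "C": 0.3}   # historical emotion margins (example values)
E = {"A": 0.5, "B": 0.0, "C": 0.0}   # degree changes caused by the first emotion trigger event
inf = {("A", "B"): 0.3, ("B", "A"): 0.3,   # in(X_Y): influence factor of Y on X
       ("A", "C"): 0.2, ("C", "A"): 0.2,
       ("B", "C"): 0.1, ("C", "B"): 0.1}

def In(x: str, y: str) -> float:
    # Influence of emotion y on emotion x, assumed to act on y's margin plus its change.
    return inf[(x, y)] * (F[y] + E[y])

I = {
    "A": F["A"] + E["A"] + In("A", "B") - In("A", "C"),  # B promotes A, C inhibits A
    "B": F["B"] + E["B"] + In("B", "A") - In("B", "C"),  # A promotes B, C inhibits B
    "C": F["C"] + E["C"] - In("C", "A") - In("C", "B"),  # A and B both inhibit C
}
first_type = max(I, key=I.get)        # e.g. take the emotion with the maximum degree
print(first_type, round(I[first_type], 3))
```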
  • the method embodiment of the present application may further include: sequentially executing the first behavior and the second behavior according to the first control information.
  • the emotion corresponding to the expression of the robot may be "calm”.
  • the robot determines the initial emotional information according to the current time
  • the robot detects emotional trigger events
  • the sensor detects the stimulus signal through sound, contact, light and other signals, processes the stimulus signal, and judges whether an emotion trigger event is detected.
  • the robot determines the emotion change information v1 according to the emotion trigger event i1;
  • the user (FIG. 6B takes a child as an example) greets the robot enthusiastically and says, "Good morning, can you tell me a story?"
  • the robot can determine that an emotion trigger event i1 is detected, namely "being greeted enthusiastically"; referring to Table 1, the emotion change information v1 can be determined as: happy, 0.5.
  • the robot determines the emotion information e1 according to the initial emotion information and the emotion change information v1;
  • the robot can determine the emotion information e1 according to the initial emotion information and the emotion change information v1.
  • the specific process for determining the emotion information e1 can be as follows:
  • the robot executes the emotion dissipation behavior a1 according to the first emotion information
  • t represents time, t1, t2, and t3 respectively represent different moments, and t1 is earlier than t2, and t2 is earlier than t3.
  • taking as an example that the emotion-dissipating behavior a1 includes three emotion behaviors (behavior 1, behavior 2, and behavior 3): behavior 1 corresponds to the emotion information e1 (refer to FIG. 3A); behavior 2 corresponds to the same emotion type as behavior 1, but with a lower emotion degree; behavior 3 corresponds to the initial emotion information.
  • the robot detects emotional trigger events
  • the robot determines the emotion change information v2 according to the emotion trigger event i2;
  • the robot can also obtain the task instruction issued by the user based on the detected audio data and image data, that is, the instruction to read a story.
  • the robot can download the audio of a story through the network or read the audio from the local storage medium, and play the audio data of the story through the loudspeaker.
  • if this task fails, the robot can detect the emotion trigger event i2, specifically a task failure; referring to Table 2, the robot can determine the emotion change information v2 according to the emotion trigger event i2.
  • the emotion change information v2 is: sad, 0.8.
  • the robot determines the dissipation margin of the emotion information e1 according to t1 and t4;
  • the order of step 8 and step 9 is not limited.
  • the robot determines the emotion information e2 according to the dissipation margin of the emotion information e1 and the emotion change information v2;
  • the robot can determine the emotion information e2 according to the dissipation margin of the emotion information e1 and the emotion change information v2.
  • the specific process of determining the emotion information e2 may be as follows:
  • the robot executes the emotion dissipation behavior a2 according to the emotion information e2.
  • t represents time, t4, t5, and t6 respectively represent different moments, and t4 is earlier than t5, and t5 is earlier than t6.
  • taking as an example that the emotion-dissipating behavior a2 includes three emotion behaviors (behavior 4, behavior 5, and behavior 3): behavior 4 corresponds to the emotion information e2 (refer to FIG. 3B); behavior 5 corresponds to the same emotion type as behavior 4, but the emotion degree corresponding to behavior 5 is lower than that corresponding to behavior 4.
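  • Putting the scenario together, the sketch below traces e1, its partial dissipation between t1 and t4, and the combination with v2; the dissipation curve, the linear inhibition between happiness and sadness, and the 30-second gap are assumptions, while the change amounts 0.5 and 0.8 follow the example above.

```python
import math

def dissipated(degree: float, elapsed: float, duration: float = 60.0) -> float:
    # Assumed inverted-S dissipation of a previously generated emotion's degree.
    if elapsed >= duration:
        return 0.0
    return degree / (1.0 + math.exp(10.0 / duration * (elapsed - duration / 2.0)))

# Steps 4-6: the enthusiastic greeting at t1 produces emotion information e1.
v1 = ("happy", 0.5)
e1 = v1                                  # the initial emotion is assumed neutral here

# Steps 7-9: the task failure is detected at t4; compute the margin left from e1.
elapsed = 30.0                           # assumed t4 - t1 in seconds
happy_margin = dissipated(e1[1], elapsed)

# Step 10: opposite-direction emotions inhibit each other (assumed linear inhibition).
v2 = ("sad", 0.8)
e2 = ("sad", round(max(0.0, v2[1] - happy_margin), 3))
print(e2)  # -> ('sad', 0.55)
```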
  • the computer device in any of the foregoing method embodiments may be the robot 110.
  • the input module 111 may be used to detect an emotion trigger event (for example, the first emotion trigger event in the foregoing method embodiments), and the memory 113 is used to store the computer instructions for executing the solution of the present application.
  • when the processor 112 executes the computer instructions in the memory 113, it executes any method embodiment provided in the embodiments of the present application.
  • the output module 114 is used to receive and execute the control information generated by the processor 112. For example, the first behavior and the second behavior are executed in sequence according to the first control information.
  • the computer device in any of the foregoing method embodiments may be the server 120; the memory 122 is used to store computer instructions for executing the solution of the present application, and the processor 121 is used to execute the computer instructions in the memory 122 to perform any method embodiment provided in the embodiments of the present application, for example, to generate the first control information.
  • the computer device in any of the foregoing method embodiments may be the computer system 100, and the robot 110 and the server 120 jointly execute any of the foregoing method embodiments.
  • the robot 110 is used to detect emotion trigger events and send the detected emotion trigger event to the server 120; the server 120 is used to execute any method embodiment of this application according to the emotion trigger event detected by the robot 110, generate control information, and send the control information to the robot 110; the robot 110 is further used to execute the control information sent by the server 120, for example, after receiving the first control information sent by the server 120, the first behavior and the second behavior are sequentially executed according to the first control information.
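A minimal sketch of that robot/server split; the message shapes, transport, and function names are assumptions for illustration, not part of the application:

```python
from typing import Any, Callable, Dict

def server_handle(event: Dict[str, Any]) -> Dict[str, Any]:
    """Server 120: run the emotion-information processing method on the reported
    trigger event and return control information listing the behaviors to perform."""
    # A real server would determine the emotion information here (see the sketches above).
    return {"behaviors": ["first behavior", "second behavior"]}

def robot_cycle(detect_event: Callable[[], Dict[str, Any]],
                send_to_server: Callable[[Dict[str, Any]], Dict[str, Any]],
                execute_behavior: Callable[[str], None]) -> None:
    """Robot 110: detect a trigger event, forward it to the server, then execute
    the behaviors in the returned control information in sequence."""
    control = send_to_server(detect_event())
    for behavior in control["behaviors"]:
        execute_behavior(behavior)

robot_cycle(lambda: {"type": "task failed"}, server_handle,
            lambda b: print("executing:", b))
```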
  • the present application may divide the device that executes the emotion information processing method into functional modules according to the above method embodiments.
  • each functional module may be divided corresponding to each function, or two or more functions may be integrated into one functional module.
  • the above-mentioned integrated functional modules can be implemented either in the form of hardware or in the form of software functional units.
  • FIG. 7A shows a schematic structural diagram of an emotional information processing device.
  • an embodiment of an apparatus 700 for processing emotion information of the present application may include:
  • the determining module 701 is configured to determine first emotion information according to the first emotion trigger event in response to the robot detecting the first emotion trigger event; the generating module 702 is configured to generate first control information according to the first emotion information, the first control information being used to instruct the robot to perform the first behavior and the second behavior in sequence, where both the first behavior and the second behavior are used to express emotions, the first behavior is used to express the emotion indicated by the first emotion information, and the second behavior is used to express an emotion that is milder than the emotion indicated by the first emotion information.
  • the first emotion trigger event includes at least one of the robot being moved, the robot falling down, the environmental parameters of the robot being inferior to the preset parameters, the failure of the robot's task and the success of the robot's task.
  • the degree of the second behavior is milder than the degree of the first behavior, and the degree of the first behavior and the second behavior is at least one of amplitude, frequency, and volume.
  • the first control information is also used to instruct the robot, after performing the second behavior and before detecting the next emotion trigger event, to perform a third behavior for expressing the initial emotion; the initial emotion information used to indicate the initial emotion is stored in a storage medium in advance.
  • the third behavior is the first behavior used to express emotions performed after the robot is turned on.
  • the first emotion information is used to indicate that the type of emotion is the first type, and the degree of emotion is the first degree.
  • the type of emotion expressed by the second behavior is the first type, and the degree of emotion expressed by the second behavior is lighter than the first degree.
  • the determining module 701 is further configured to, before determining the first emotion information according to the first emotion trigger event, determine historical emotion information according to the second emotion trigger event in response to the robot detecting the second emotion trigger event, where the historical emotion information is used to generate historical control information, the historical control information is used to instruct the robot to perform a historical behavior, and the historical behavior is used to express the emotion indicated by the historical emotion information.
  • when the determining module determines the first emotion information according to the first emotion trigger event, it is specifically used to determine first emotion change information according to the first emotion trigger event, where the first emotion change information is used to indicate the type of the emotion whose degree is changed by the first emotion trigger event and the amount of change in that degree, and to determine the first emotion information according to the first emotion change information and the historical emotion information.
  • based on the emotion type indicated by the first emotion change information and the emotion type indicated by the historical emotion information both being emotion types in the first direction, the first type is the emotion type in the first direction, the first degree is positively correlated with the amount of change of the emotion indicated by the first emotion change information, and the first degree is positively correlated with the degree of the emotion indicated by the historical emotion information; the emotion type in the first direction is a positive emotion type or a negative emotion type.
  • the first degree is negatively correlated with a first duration, where the first duration is the length of the interval between the second moment at which the robot detects the second emotion trigger event and the first moment at which the robot detects the first emotion trigger event.
  • the emotion type in the first direction is a positive emotion type and the emotion type in the second direction is a negative emotion type; or, the emotion type in the first direction is a negative emotion type and the emotion type in the second direction is a positive emotion type; based on the emotion type indicated by the first emotion change information being the emotion type in the first direction, the emotion type indicated by the historical emotion information being the emotion type in the second direction, and the first type being the emotion type in the first direction, the first degree is positively correlated with the amount of change of the emotion indicated by the first emotion information, and the first degree is negatively correlated with the degree of the emotion indicated by the historical emotion information.
  • the first degree is positively correlated with the first duration, or the first degree is negatively correlated with the first duration, where the first duration is the length of the interval between the second moment at which the robot detects the second emotion trigger event and the first moment at which the robot detects the first emotion trigger event.
  • the apparatus further includes an execution module 703.
  • the execution module 703 is configured to: after the generating module 702 generates the first control information according to the first emotion information, sequentially execute the first behavior and the second behavior according to the first control information.
  • the device corresponding to FIG. 7B may be provided in the robot 110 or the robot system 100.
  • the computer execution instructions or computer instructions in the embodiments of the present application may also be referred to as application program codes, which are not specifically limited in the embodiments of the present application.
  • the computer program product includes one or more computer instructions.
  • the computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable devices.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium.
  • the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (for example, coaxial cable, optical fiber, or digital subscriber line (DSL)) or a wireless manner (for example, infrared, radio, or microwave).
  • the computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center that integrates one or more available media.
  • the usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, and a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid state disk (SSD)).
  • words such as "exemplary" or "for example" are used to represent an example, an instance, or an illustration. Any embodiment or design solution described as "exemplary" or "for example" in the embodiments of the present application should not be construed as being more preferable or advantageous than other embodiments or design solutions. Rather, words such as "exemplary" or "for example" are used to present related concepts in a specific manner.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Robotics (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Software Systems (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Mathematical Physics (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Manipulator (AREA)

Abstract

A method and apparatus (700) for processing emotion information, a computer device, and a readable storage medium, which help to show the emotion dissipation process of a robot (110) and improve how anthropomorphic the robot (110) appears. The method includes: in response to the robot (110) detecting a first emotion trigger event, determining first emotion information according to the first emotion trigger event (201); and generating first control information according to the first emotion information (202), where the first control information is used to instruct the robot (110) to sequentially perform a first behavior and a second behavior for expressing emotion, the first behavior is used to express the emotion indicated by the first emotion information, and the second behavior is used to express an emotion milder than the emotion indicated by the first emotion information.

Description

一种情绪信息的处理方法及装置
本申请要求于2019年12月31日提交中国专利局、申请号为201911415571.0、申请名称为“一种情绪信息的处理方法及装置”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及信息处理领域,尤其涉及一种情绪信息的处理方法及装置。
背景技术
机器人(robot)是自动执行工作的机器装置,它既可以接受人类指挥,又可以运行预先编排的程序,也可以根据以人工智能技术制定的原则纲领行动。它的任务是协助或取代人类的工作,例如生产业、建筑业等领域的工作。随着科学技术的发展,对机器人的研究领域已经逐渐从工业领域扩展到医疗、保健、娱乐和教育等领域,机器人不仅具有完成指定任务的能力,还具有与人交互的能力。为了获得更好的交互体验,需要使机器人与人的交互更加拟人化。
外界刺激或内部刺激通常会引起人情绪的变化,人一般会通过不同的行为(例如不同的面部表情、声音或动作等)来表达不同的情绪。由于人们对情绪的表达方式通常具有共通性,在人与人的沟通过程中,通过观察对方的行为有利于确定对方的情绪。例如,当看到一个人微笑时,一般可以认为他当前的情绪是积极的情绪,例如开心;当听到一个人的哭泣声时,一般可以认为他当前的情绪是消极的情绪,例如伤心。
当机器人受到外界刺激或内部刺激时,若机器人能够像人一样表达其情绪的变化,将使机器人更加拟人,使人对机器人更容易产生共鸣,提高人们对机器人的使用粘性。但是,针对机器人受到刺激后如何像人一样表达其情绪,并没有太多解决方案。
发明内容
本申请实施例提供了一种情绪信息的处理方法及装置、计算机设备及可读存储介质,有利于展示机器人的情绪消散过程,提高机器人的拟人程度。
本申请实施例第一方面提供一种情绪信息的处理方法,包括:响应于机器人检测到第一情绪触发事件,根据所述第一情绪触发事件确定第一情绪信息;根据所述第一情绪信息生成第一控制信息,所述第一控制信息用于指示所述机器人依次执行第一行为和第二行为,所述第一行为和第二行为均用于表达情绪,其中,所述第一行为用于表达所述第一情绪信息所指示的情绪,所述第二行为用于表达比所述第一情绪信息所指示的情绪更为轻微的情绪。
情绪触发事件为引起机器人的情绪信息发生变化的、可被机器人检测到的事件;执行主体为机器人或服务器,或机器人与服务器的系统;情绪信息可以被量化为表达机器人情绪的参数;所述第二行为对应的情绪比所述第一情绪更轻微,或者说,第二行为比第一行 为更接近平静情绪下的行为,或者说,第二行为对应的情绪的类型与第一情绪的类型相同,但是程度较低;机器人可以执行表达情绪的行为,第一行为、第二行为用于表达情绪,例如,通过发出声音、显示图像或文字、驱动实体做出动作等展示笑、哭、皱眉等人的情绪表达行为。
在第一方面提供的方法中,在确定机器人的第一情绪信息后,通过指示机器人依次执行第一行为和第二行为,有利于使得与机器人交互的人感受到机器人情绪的消散过程,有利于提高机器人的情绪细腻程度,进而提高机器人的拟人程度,使人对机器人更容易产生共鸣,提高机器人的使用粘性,提高机器人的价值。
在一种可能的实现方式中,所述第一情绪触发事件包括所述机器人被移动,所述机器人摔倒,所述机器人的环境参数劣于预设参数,所述机器人的任务失败和所述机器人的任务成功中的至少一项。示例性的,环境参数可以为光照程度、温度、噪音分贝等。
在一种可能的实现方式中,第二行为的程度比所述第一行为的程度更轻微,所述第一行为和所述第二行为的程度为幅度、频率和音量中的至少一种。
在一种可能的实现方式中,所述第一控制信息还用于指示所述机器人在执行所述第二行为之后,在检测到下一个情绪触发事件之前,执行用于表达初始情绪的第三行为,用于指示所述初始情绪的初始情绪信息预先存储在存储介质中。
在一种可能的实现方式中,所述第三行为是所述机器人开机后执行的第一个用于表达情绪的行为。
在一种可能的实现方式中,所述第一情绪信息用于指示情绪的类型为第一类型,且情绪的程度为第一程度。
在一种可能的实现方式中,所述第二行为所表达的情绪的类型为所述第一类型,所述第二行为所表达的情绪的程度轻于所述第一程度。
在一种可能的实现方式中,在所述根据所述第一情绪触发事件确定第一情绪信息之前,所述方法还包括:响应于所述机器人检测到第二情绪触发事件,根据所述第二情绪触发事件确定历史情绪信息,所述历史情绪信息用于生成历史控制信息,所述历史控制信息用于指示所述机器人执行历史行为,所述历史行为用于表达所述历史情绪信息所指示的情绪;所述根据所述第一情绪触发事件确定第一情绪信息,包括:根据所述第一情绪触发事件确定第一情绪变化信息,所述第一情绪变化信息用于指示由所述第一情绪触发事件引起程度发生变化的情绪的类型和程度的变化量;根据所述第一情绪变化信息和所述历史情绪信息确定所述第一情绪信息。
在一种可能的实现方式中,基于所述第一情绪变化信息所指示的情绪的类型与所述历史情绪信息所指示的情绪的类型均为第一方向的情绪类型,那么所述第一类型为所述第一方向的情绪类型,且所述第一程度与所述第一情绪变化信息所指示的情绪的变化量正相关,所述第一程度与所述历史情绪信息所指示的情绪的程度正相关,所述第一方向的情绪类型为积极的情绪类型(例如开心、满意、自信等),或者为消极的情绪类型(例如伤心、失望、不自信等)。
在一种可能的实现方式中,所述第一程度与第一时长负相关,所述第一时长为所述机 器人检测到所述第二情绪触发事件的第二时刻与所述机器人检测到所述第一情绪触发事件的第一时刻之间的间隔时长。
在实际设计中,不严格限定第一时长等于第一时刻和第二时刻的时间间隔,可以略大于或小于时间间隔,只要能够体现时间间隔的变化即可,例如,时间间隔变大时,第一时长也相应变大。
在一种可能的实现方式中,第一方向的情绪类型为积极的情绪类型,且第二方向的情绪类型为消极的情绪类型;或者,所述第一方向的情绪类型为消极的情绪类型,且所述第二方向的情绪类型为积极的情绪类型;基于所述第一情绪变化信息所指示的情绪的类型为第一方向的情绪类型,所述历史情绪信息所指示的情绪的类型为第二方向的情绪类型,且所述第一类型为所述第一方向的情绪类型,那么所述第一程度与所述第一情绪信息所指示的情绪的变化量正相关,所述第一程度与所述历史情绪信息所指示的情绪的程度负相关。
在一种可能的实现方式中,所述第一程度与第一时长正相关,所述第一程度与第一时长负相关,所述第一时长为所述机器人检测到所述第二情绪触发事件的第二时刻与所述机器人检测到所述第一情绪触发事件的第一时刻之间的间隔时长。
本申请实施例第二方面还提供一种情绪信息的处理装置,包括:确定模块,用于响应于机器人检测到第一情绪触发事件,根据所述第一情绪触发事件确定第一情绪信息;生成模块,用于根据所述第一情绪信息生成第一控制信息,所述第一控制信息用于指示所述机器人依次执行第一行为和第二行为,所述第一行为和第二行为均用于表达情绪,其中,所述第一行为用于表达所述第一情绪信息所指示的情绪,所述第二行为用于表达比所述第一情绪信息所述指示的情绪更为轻微的情绪。
在一种可能的实现方式中,所述第一情绪触发事件包括所述机器人被移动,所述机器人摔倒,所述机器人的环境参数劣于预设参数,所述机器人的任务失败和所述机器人的任务成功中的至少一项。
在一种可能的实现方式中,所述第二行为的程度比所述第一行为的程度更轻微,所述第一行为和所述第二行为的程度为幅度、频率和音量中的至少一种。
在一种可能的实现方式中,所述第一控制信息还用于指示所述机器人在执行所述第二行为之后,在检测到下一个情绪触发事件之前,执行用于表达初始情绪的第三行为,用于指示所述初始情绪的初始情绪信息预先存储在存储介质中。
在一种可能的实现方式中,所述第三行为是所述机器人开机后执行的第一个用于表达情绪的行为。
在一种可能的实现方式中,所述第一情绪信息用于指示情绪的类型为第一类型,且情绪的程度为第一程度。
在一种可能的实现方式中,所述第二行为所表达的情绪的类型为所述第一类型,所述第二行为所表达的情绪的程度轻于所述第一程度。
在一种可能的实现方式中,所述确定模块还用于:在所述根据所述第一情绪触发事件确定第一情绪信息之前,响应于所述机器人检测到第二情绪触发事件,根据所述第二情绪触发事件确定历史情绪信息,所述历史情绪信息用于生成历史控制信息,所述历史控制信 息用于指示所述机器人执行历史行为,所述历史行为用于表达所述历史情绪信息所指示的情绪;所述确定模块根据所述第一情绪触发事件确定第一情绪信息,具体用于:根据所述第一情绪触发事件确定第一情绪变化信息,所述第一情绪变化信息用于指示由所述第一情绪触发事件引起程度发生变化的情绪的类型和程度的变化量;根据所述第一情绪变化信息和所述历史情绪信息确定所述第一情绪信息。
在一种可能的实现方式中,基于所述第一情绪变化信息所指示的情绪的类型与所述历史情绪信息所指示的情绪的类型均为第一方向的情绪类型,那么所述第一类型为所述第一方向的情绪类型,且所述第一程度与所述第一情绪变化信息所指示的情绪的变化量正相关,所述第一程度与所述历史情绪信息所指示的情绪的程度正相关,所述第一方向的情绪类型为积极的情绪类型,或者为消极的情绪类型。
在一种可能的实现方式中,所述第一程度与第一时长负相关,所述第一时长为所述机器人检测到所述第二情绪触发事件的第二时刻与所述机器人检测到所述第一情绪触发事件的第一时刻之间的间隔时长。
在一种可能的实现方式中,第一方向的情绪类型为积极的情绪类型,且第二方向的情绪类型为消极的情绪类型;或者,所述第一方向的情绪类型为消极的情绪类型,且所述第二方向的情绪类型为积极的情绪类型;基于所述第一情绪变化信息所指示的情绪的类型为第一方向的情绪类型,所述历史情绪信息所指示的情绪的类型为第二方向的情绪类型,且所述第一类型为所述第一方向的情绪类型,那么所述第一程度与所述第一情绪信息所指示的情绪的变化量正相关,所述第一程度与所述历史情绪信息所指示的情绪的程度负相关。
在一种可能的实现方式中,所述第一程度与第一时长正相关,所述第一程度与第一时长负相关,所述第一时长为所述机器人检测到所述第二情绪触发事件的第二时刻与所述机器人检测到所述第一情绪触发事件的第一时刻之间的间隔时长。
在一种可能的实现方式中,所述装置还包括执行模块,所述执行模块用于:在所述生成模块根据所述第一情绪信息生成第一控制信息之后,根据所述第一控制信息依次执行所述第一行为和所述第二行为。
本申请实施例第三方面提供一种计算机设备,包括处理器和存储器,所述处理器在运行所述存储器存储的计算机指令时,执行如本申请实施例第一方面或第一方面的任意一种可能的实现方式所述的方法。
本申请实施例第四方面提供一种计算机可读存储介质,包括指令,当所述指令在计算机上运行时,使得计算机执行如本申请实施例第一方面或第一方面的任意一种可能的实现方式所述的方法。
本申请实施例第五方面提供一种计算机程序产品,包括指令,当所述指令在计算机上运行时,使得计算机执行如本申请实施例第一方面或第一方面的任意一种可能的实现方式所述的方法。
本申请实施例第六方面提供一种机器人,包括输入模块、输出模块、处理器和存储器,所述输入模块用于检测情绪触发事件,所述存储器用于存储计算机指令,所述处理器在运行所述存储器存储的计算机指令时,执行如本申请实施例第一方面或第一方面的任意一种 可能的实现方式所述的方法,所述输出模块用于执行所述处理器生成的控制信息,例如,在接收到服务器发送的第一控制信息后,根据所述第一控制信息依次执行所述第一行为和所述第二行为。
本申请实施例第七方面提供一种机器人系统,包括机器人和服务器;所述机器人用于检测情绪触发事件,并将检测到的情绪触发事件发送给所述服务器;所述服务器用于根据所述机器人检测到的情绪触发事件执行如本申请实施例第一方面或第一方面的任意一种可能的实现方式所述的方法,并向所述机器人发送控制信息;所述机器人还用于执行所述服务器发送的控制信息,例如,在接收到服务器发送的第一控制信息后,根据所述第一控制信息依次执行所述第一行为和所述第二行为。
附图说明
图1A是本申请提供的机器人一个实施例示意图;
图1B是本申请提供的服务器一个实施例示意图;
图1C是本申请提供的机器人系统一个实施例示意图;
图2是本申请提供的情绪信息的处理方法一个实施例示意图;
图3A是本申请提供的不同程度的开心对应的不同行为的示意图;
图3B是本申请提供的不同程度的伤心对应的不同行为的示意图;
图4A是本申请机器人用于表达情绪的行为消散过程的一个示意图;
图4B是本申请机器人用于表达情绪的行为消散过程的另一个示意图;
图4C是本申请机器人用于表达情绪的行为消散过程的另一个示意图;
图5是本申请提供两种情绪的消散方程对应的曲线一种可能的示意图;
图6A至图6E是本申请提供的情绪信息的处理方法一种可能的应用场景示意图;
图7A是本申请提供的情绪信息的处理装置一个实施例示意图;
图7B是本申请提供的情绪信息的处理装置另一个实施例示意图。
具体实施方式
本申请实施例提供了一种情绪信息的处理方法及装置、机器人、服务器、机器人系统、计算机可读存储介质和计算机程序产品。下面结合附图,对本申请实施例进行描述。
图1A是本申请提供的机器人110的一个实施例示意图。机器人110可以包括输入模块111、处理器112、存储器113和输出模块114。其中,输入模块111可以包括传感器和数据处理模块,传感器用于检测数据,数据处理模块用于对传感器检测到的数据进行处理,例如摄像头用于检测图像数据,麦克风阵列用于检测音频数据,热敏传感器用于检测环境温度数据,光敏传感器用于检测光强数据等;存储器113用于存储计算机程序;处理器112用于执行存储器中的计算机程序,进行数据处理,向输出模块发送控制信息;输出模块114用于与用户(人或其他机器人)交互,例如,输出模块114可以为显示屏(例如可以设置在图1A中机器人的头部)、扬声器、驱动机构中的一种或多种,若机器人为人形机器人,驱动机构可以用于控制机器人改变身体姿态,例如通过控制机器人的手、胳膊、腿、头部 等转动,驱动机构还可以用于控制机器人移动。
图1A以机器人110的外形为人形为例,需要说明的是,本申请不对机器人110的外形进行限定,也不限定机器人110能够执行人的所有行为,只要机器人110能够根据以人工智能技术制定的原则纲领行动,并且执行用于表达情绪的一类或多类行为即可。例如,人的用于表达情绪的行为类型一般可以包括表情、表达情绪的肢体动作(比如拍手、晃动身体等)、说话的语气等,机器人110只要能够向用户(人或其他机器人)展示上述至少一类用于表达情绪的行为即可,例如能够展示表情,或者能够做出肢体动作,或者能够发出带有不同语气的声音即可。示例性的,本申请实施例提供机器人110还可以为智能终端设备(例如手机)。
参考图1B,本申请实施例还提供一种服务器120。服务器120可以包括处理器121和存储器122。
图1A和/或图1B中的处理器可以是中央处理器(central processing unit,CPU),网络处理器(network processor,NP)或者CPU和NP的组合、数字信号处理器(digital signal processor,DSP)、专用集成电路(application specific integrated circuit,ASIC)、现成可编程门阵列(field programmable gate array,FPGA)或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件。可以实现或者执行本申请中的公开的各方法、步骤及逻辑框图。通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等。结合本申请所公开的方法的步骤可以直接体现为硬件译码处理器执行完成,或者用译码处理器中的硬件及软件模块组合执行完成。软件模块可以位于随机存储器,闪存、只读存储器,可编程只读存储器或者电可擦写可编程存储器、寄存器等本领域成熟的存储介质中。该存储介质位于存储器,处理器读取存储器中的信息,结合其硬件完成上述方法的步骤。虽然图中仅仅示出了一个处理器,该装置可以包括多个处理器或者处理器包括多个处理单元。具体的,处理器可以是一个单核处理器,也可以是一个多核或众核处理器。该处理器可以是ARM架构处理器。
图1A和/或图1B中的存储器用于存储处理器执行的计算机指令。存储器可以是存储电路也可以是存储器。存储器可以是易失性存储器或非易失性存储器,或可包括易失性和非易失性存储器两者。其中,非易失性存储器可以是只读存储器(read-only memory,ROM)、可编程只读存储器(programmable ROM,PROM)、可擦除可编程只读存储器(erasable PROM,EPROM)、电可擦除可编程只读存储器(electrically EPROM,EEPROM)或闪存。易失性存储器可以是随机存取存储器(random access memory,RAM),其用作外部高速缓存。存储器可以独立于处理器,一种可能的实现方式中,处理器和存储器可以通过总线相互连接。总线可以是外设部件互连标准(peripheral component interconnect,PCI)总线或扩展工业标准结构(extended industry standard architecture,EISA)总线等。所述总线可以分为地址总线、数据总线、控制总线等。或者,存储器也可以是处理器中的存储单元,与处理器直接相连(attach),在此不做限定。虽然图中仅仅示出了一个存储器,该装置也可以包括多个存储器或者存储器包括多个存储单元。
图1C是本申请提供的机器人系统100的一个实施例示意图。机器人系统100包括机器 人110和服务器120。机器人110的结构可以参考关于图1A的相关描述,服务器120的结构可以参考关于图1B的相关描述,此处不再赘述。机器人110和服务器120还可以包括通信模块,机器人110和服务器120可以通过各自的通信模块进行有线或无线的通信,例如,通过互联网进行交互。
上面对本申请实施例提供的机器人110、服务器120和机器人系统100进行了介绍,下面对本申请提供的情绪信息的处理方法进行介绍。该方法可以应用于图1A所示的机器人110中,也可以应用于图1B所示的服务器120中,也可以应用于图1C所示的机器人系统100中。当该方法应用于机器人系统100时,该方法由机器人110和服务器120共同执行。为了便于描述,下面将本方法的执行主体统称作计算机设备。
人的情绪通常会被一些事件影响,而产生情绪的变化,将引起情绪的变化的事件称作情绪触发事件,类似的,可以将引起机器人的情绪信息变化的事件称作情绪触发事件。人感受到来自外界的情绪触发事件(比如被夸奖)或内部的情绪触发事件(例如饥饿)后,会产生相应的情绪,例如开心或伤心,并通过相应的行为来表达产生的情绪,例如通过表情-笑来表达开心,通过表情-皱眉来表达伤心,当产生的情绪恢复平静(即没有情绪)后,人不再展现用于表达情绪的行为,即没有表情。
由于人们对情绪的表达方式通常具有共通性,在人与人的沟通过程中,通过观察对方的行为有利于确定对方的情绪。例如,当看到一个人微笑时,一般可以认为他当前的情绪是积极的情绪,例如开心;当听到一个人的哭泣声时,一般可以认为他当前的情绪是消极的情绪,例如伤心。
为了实现机器人的拟人化,机器人可以响应于检测到情绪触发事件产生情绪信息,并模仿人表达情绪的方式,执行用于表达情绪的行为,使交互对象(比如人或其他机器人)了解机器人的情绪。
现有技术中,机器人响应于检测到一个情绪触发事件i1产生情绪信息e1时,机器人会执行情绪信息e1对应的行为a1来表达该情绪,之后,机器人可以维持该行为a1,直至该行为a1维持预设时长,或者,检测到下一个情绪触发事件i2。
人的情绪通常比较细腻,这种细腻可以体现在因情绪触发事件产生的情绪会随着时间逐渐减弱,直至恢复到比较平静的情绪,相应的,人所展现的用于表达情绪的行为也随着时间逐渐减弱,直至不再展现用于表达情绪的行为。情绪的减弱可以为同种类型的程度减弱,或者可以为由一种类型的情绪转变为另一种类型的情绪,例如,由喜悦转变为满意。
而现有技术中,机器人在检测到情绪触发事件i1后,在检测到下一个情绪触发事件i2之前,一般按照如下两种方式中的一种方式执行用于表达情绪的行为:
1)由执行行为a1直接切换至不执行任何表达情绪的行为;
2)维持行为a1。
可见,和人的情绪表达相比,现有机器人表达情绪的方式不够细腻,不利于提高机器人的拟人程度。
图2为本申请实施例提供的情绪信息的处理方法一个实施例示意图。参考图2,本申请情绪信息的处理方法一个实施例可以包括如下步骤:
201、响应于机器人检测到第一情绪触发事件,根据第一情绪触发事件确定第一情绪信息;
机器人可以通过摄像头、热敏传感器、光敏传感器、声音检测装置等传感器来检测情绪触发事件,当机器人检测到情绪触发事件时,计算机设备可以获取该情绪触发事件,并根据该情绪触发事件确定机器人的情绪信息。将步骤201中检测到的情绪触发事件称作第一情绪触发事件,确定的情绪信息称作第一情绪信息。情绪信息可以被量化为表达机器人的情绪的参数,参数的不同值可以用来表示机器人的不同情绪信息。
本申请实施例不对计算机设备根据第一情绪触发事件确定机器人的第一情绪信息的方式进行限定,示例性的,计算机设备可以通过人工智能的方式,将检测到的第一情绪触发事件输入训练好的神经网络模型,输出机器人的第一情绪信息,或者,计算机设备可以通过预设的对应关系确定第一情绪触发事件对应的第一情绪信息。
202、根据第一情绪信息生成第一控制信息;
计算机设备确定第一情绪信息后,可以根据第一情绪信息生成第一控制信息,第一控制信息用于指示机器人依次执行第一行为和第二行为,第一行为和第二行为均是用于表达情绪的行为。其中,第一行为用于表达第一情绪信息所指示的情绪,第二行为用于表达比第一情绪信息所指示的情绪更为轻微的情绪。
机器人在检测到第一刺激数据后,在检测到下一次刺激数据之前,除了执行用于表达第一情绪信息对应的情绪的第一行为,还可以执行第二行为,第二行为用于表达比第一情绪信息所指示的情绪更轻微的情绪,和现有技术相比,本申请实施例提供的情绪信息的处理方法有利于使机器人展现出情绪随着时间逐渐减弱的过程,有利于使得机器人以更加细腻的方式表达情绪,从而有利于提高机器人的拟人程度,增加机器人的用户粘性。
机器人可以包括多个输出模块,每个输出模块用于输出一个单位动作,例如扬声器用于输出声音,显示屏用于输出表情,通过驱动机构用于输出晃动身体的动作。在一种可能的实现方式中,用于表达情绪信息的第一行为可以包括机器人同时执行的多个单位动作,例如在通过显示屏输出微笑的表情的同时,通过扬声器输出笑声,通过驱动机构驱动身体微微晃动。类似的,第二行为可以包括机器人同时执行的多个单位动作。
机器人的用于表达情绪的行为可以参考人的用于表达情绪的行为进行理解,例如,第一行为和第二行为可以为笑、哭、皱眉、叹气、拍手、唱歌、以愉悦或生气的口气说话等行为中的一种或多种,并且,行为所表达的情绪信息可以参考人的行为所表达的情绪,例如,笑用于表达开心,哭和皱眉用来表达伤心。需要说明的是,第一行为和第二行为可以为同一类型的行为,例如,都是笑,只是二者的程度不同。
在一种可能的实现方式中,行为的程度可以指该行为的幅度或频率或音量等。示例性的,第一行为指机器人做出笑的表情,第一行为的幅度指表情中嘴巴、眼睛、眉毛中的一个弯曲的幅度;第一行为指机器人发出笑声或拍手,第一行为的频率指声音的频率或拍手的频率;第一行为指机器人发出哭声,第一行为的音量指哭声的音量。
在一种可能的实现方式中,第二行为所表达的情绪比第一行为所表达的情绪更轻微,那么,第二行为的程度比第一行为的程度更轻微,行为的程度可以理解为幅度、频率和音 量中的至少一种,即第二行为的幅度比第一行为的幅度更小,和/或,第二行为的频率比第一行为的频率更低,和/或,第二行为产生的声音比第一行为产生的声音更小。
在对人的情绪以及情绪的表达进行分析研究时,一般会将人的情绪划分为两大类,积极的情绪和消极的情绪,并且,同一类型的情绪还会被划分为不同程度,比如,用表示不同程度的副词(例如极度、非常、很和轻微,或者重度、中度、轻度、略微)或数值来区分情绪的不同程度。在一些更为具体的研究中,还会将积极的情绪和消极的情绪分别划分为更多的类型,或者,更加精细的划分同一类型的情绪的不同程度。
为了提高机器人的拟人程度,在一种可能的实现方式中,情绪信息可以用于指示情绪的类型和情绪的程度。
不同类型的情绪,其对应的行为不同;对于同一类型、不同程度的情绪,其对应的行为也不同。示例性的,以正整数代表情绪的程度,数值越大,情绪的程度越高。参考图3A和图3B,分别展示了开心的不同程度对应的不同行为(图3A和图3B中以面孔中的表情代表表达情绪的行为),I1和I2分别用于代表开心的程度和伤心的程度。在一种可能的实现方式中,可以为各类情绪设置程度阈值,例如,图3A中以TH1(0.1)代表开心的程度阈值,图3B中以TH2(0.2)代表伤心的程度阈值,当开心的程度超过0.1时,才在面孔中表达相应程度的开心的表情,当开心的程度未超过0.1时,可以不执行表达开心的行为,即面孔中不表达任何表情。
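A minimal sketch, in Python, of the threshold rule just described: an expression is shown only when the degree of the corresponding emotion type exceeds its threshold (TH1 = 0.1 for happiness and TH2 = 0.2 for sadness, as in FIG. 3A/3B); the degree brackets and expression labels are illustrative assumptions:

```python
THRESHOLDS = {"happy": 0.1, "sad": 0.2}   # TH1 and TH2

# Illustrative brackets; FIG. 3A/3B associate each bracket with a facial expression.
BRACKETS = {
    "happy": [(0.10, 0.35, "slight smile"), (0.35, 0.70, "smile"), (0.70, 1.00, "broad smile")],
    "sad":   [(0.20, 0.40, "slight frown"), (0.40, 0.70, "frown"), (0.70, 1.00, "crying")],
}

def expression_for(emotion_type: str, degree: float):
    """Return the expression to show, or None when the degree does not exceed the
    threshold of that emotion type (i.e. no expression is displayed)."""
    if degree <= THRESHOLDS[emotion_type]:
        return None
    for low, high, expression in BRACKETS[emotion_type]:
        if low < degree <= high:
            return expression
    return BRACKETS[emotion_type][-1][2]    # clamp degrees above 1.0

print(expression_for("happy", 0.05))   # None: below TH1, nothing is expressed
print(expression_for("sad", 0.756))    # 'crying' (illustrative label)
```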
在一种可能的实现方式中,可以认为第二行为对应的情绪的类型与第一行为对应的情绪的类型相同,第二行为所表达的情绪的程度轻于第一行为所表达的情绪的程度。
在一种可能的实现方式中,在机器人执行第二行为之后,还可以执行其他的一个或多个用于表达情绪的行为,越轻微的行为,执行的次序越靠后,从而有利于更加细腻的表达机器人情绪逐渐消失的过程。
在一种可能的实现方式中,机器人在未检测到下一次刺激数据之前,其执行最后一个用于表达情绪信息的行为后,可以停止执行用于表达情绪信息的行为。图4A至图4C分别示例性的示出了表达情绪的行为消散的过程。参考图4A,随着时间的流逝,机器人的面孔依次变为面孔1、面孔2和面孔0,其中,面孔1和面孔2分别代表第一行为(具体为第一表情)和第二行为(具体为第二表情),面孔0用于代表停止执行表达情绪信息的行为(具体为无表情)。
或者,在一种可能的实现方式中,可以为机器人设置初始情绪信息,机器人在执行第二行为后,可以持续执行用于表达初始情绪信息的第三行为。该初始情绪信息可以预先存储在存储介质中。在一种可能的实现方式中,存储介质中可以存储多个备选初始情绪信息,计算机设备可以动态选择一个作为当前的初始情绪信息,例如,可以将机器人设计为初始情绪信息周期性变化,示例性的,可以以28天为一个周期,计算机设备可以存储28个备选初始情绪信息,每个备选初始情绪信息对应一个周期中的一天,相邻两天的备选初始情绪信息不同,计算机设备可以根据当前日期在当前周期中的次序(例如,今天是当前周日的第二天),便可以确定当前的初始情绪信息。参考图4B和图4C,面孔3和面孔4用于代表不同的初始情绪信息。图4B结合图3A,面孔3对应的初始情绪信息为:开心,程度在 0.1至0.35之间;图4C结合图3B,面孔4对应的初始情绪信息为:伤心,程度在0.2至0.4之间。
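A sketch of the periodic selection of the initial emotion information described above: 28 candidate entries, one per day of the cycle, with adjacent days differing; the candidate values and the cycle's anchor date are assumptions:

```python
from datetime import date

# 28 candidate initial emotion entries, one per day of the cycle; adjacent days differ.
CANDIDATES = [("happy", 0.30), ("happy", 0.15)] * 14
CYCLE_START = date(2020, 1, 1)      # assumed anchor for the 28-day cycle

def initial_emotion(today: date):
    """Pick today's initial emotion information by the day's position in the cycle."""
    day_in_cycle = (today - CYCLE_START).days % 28
    return CANDIDATES[day_in_cycle]

print(initial_emotion(date(2020, 1, 1)))   # ('happy', 0.3) - first day of the cycle
```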
刺激数据(将当前检测到的刺激数据称作第一刺激数据),不限定第一刺激数据为一种传感器检测到的数据,可以包括多类传感器检测到的数据,例如,包括通过摄像头检测到的图像数据、通过语音检测装置检测到的音频数据、通过热敏传感器检测到的环境温度数据以及通过光敏传感器检测到的光强数据等。
计算机设备可以根据第一刺激数据判断当前是否出现情绪触发事件,即引起机器人的情绪变化的事件,若出现情绪触发事件,则确定机器人检测到情绪触发事件(称作第一情绪触发事件)。
关于步骤201,在一种可能的实现方式中,计算机设备可以通过摄像头、热敏传感器、光敏传感器、声音检测装置等传感器检测相应数据,例如通过摄像头检测图像数据,通过语音检测装置检测音频数据,通过热敏传感器检测环境温度数据,通过光敏传感器检测,光强数据。计算机设备可以根据一个或多个传感器检测到的数据(称作刺激数据)判断是否检测到情绪触发事件,具体的,计算机设备可以对刺激数据进行识别处理,例如,计算机设备可以对检测到的图像进行人脸识别,识别图像中人的表情、人的个数等;识别到人脸后,计算机设备还可以对人的身体姿势进行识别,例如识别到人在招手;计算机设备可以对检测到的音频数据进行文字识别,并对识别的文字进行语义分析,得到人说话的意图,例如命令机器人执行任务,或者跟机器人打招呼。对刺激数据识别处理后,计算机设备可以识别刺激数据所指示的情绪触发事件(称作第一情绪触发事件),示例性的,第一情绪触发事件可以包括机器人被移动,机器人摔倒,机器人的环境参数劣于预设参数,机器人的任务失败和机器人的任务成功中的至少一项。环境参数可以指光照强度、环境温度、噪音分贝等。
在一种可能的实现方式中,计算机设备可以预先存储情绪触发事件与情绪变化信息的对应关系,识别出第一刺激数据所指示的第一情绪触发事件后,可以确定相应的第一情绪变化信息,第一情绪变化信息用于指示由第一情绪触发事件引起程度发生变化的情绪的类型和程度的变化量。在一种可能的实现方式中,第一情绪变化信息所指示的情绪的类型和程度即为第一情绪信息所指示的情绪的类型和程度。
在一种可能的实现方式中,可以根据机器人的需求层级来确定情绪触发事件与情绪变化信息的对应关系,有利于设计出更加符合人的情绪变化的对应关系,提高机器人的拟人程度。
在一种可能的实现方式中,第一情绪变化信息可以用于指示第一情绪触发事件影响的情绪的类型和引起的该类型情绪的程度变化量。示例性的,表1和表2分别示出了多种情绪触发事件分别影响的情绪的类型,以及引起的相应类型情绪的程度变化量,以大于0,且不超过1的数值来表示程度变化量,并且,数值越大,程度变化量越大。
表1
Figure PCTCN2020133746-appb-000001
表2
Figure PCTCN2020133746-appb-000002
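The correspondence recorded in Tables 1 and 2 can be held as a simple lookup, as sketched below; only the warm-greeting and task-failure entries appear in the worked examples of this description, and the remaining event names and values are illustrative assumptions:

```python
# Emotion change information per trigger event: (emotion type, degree change in (0, 1]).
EMOTION_CHANGE = {
    "greeted warmly":        ("happy", 0.5),   # from the worked example (event i1)
    "task failed":           ("sad",   0.8),   # from the worked example (event i2)
    "task succeeded":        ("happy", 0.6),   # illustrative
    "robot moved":           ("sad",   0.3),   # illustrative
    "robot fell down":       ("sad",   0.7),   # illustrative
    "environment below par": ("sad",   0.2),   # illustrative
}

def emotion_change_for(event: str):
    """Return the emotion change information for a detected trigger event,
    or None when the event does not change any emotion."""
    return EMOTION_CHANGE.get(event)

print(emotion_change_for("task failed"))   # ('sad', 0.8)
```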
感受到情绪触发事件时,人的当前情绪一般会发生改变,为了便于描述,可以将因情绪触发事件而发生改变之前的当前情绪称作历史情绪余量,将改变后的当前情绪称作刺激响应情绪,刺激响应情绪不仅由刺激带来的情绪变化决定,还由历史情绪余量决定。
在一种可能的实现方式中,历史情绪余量由前一次刺激响应情绪决定。因此,为了提高机器人的拟人程度,若机器人在检测到第一情绪触发事件之前,检测到第二情绪触发事件,并且根据第二情绪触发事件确定了对应的情绪信息(称作历史情绪信息),其中历史情绪信息用于生成历史控制信息,历史控制信息用于指示机器人执行历史行为,历史行为用于表达历史情绪信息所指示的情绪,那么在一种可能的实现方式中,步骤201可以包括:根据第一情绪触发事件确定第一情绪变化信息,之后,根据第一情绪变化信息和历史情绪信息确定第一情绪信息。
为了便于描述,将第一情绪信息指示的情绪的类型称作第一类型,将第一情绪信息指 示的情绪的程度称作第一程度。
情绪之间的影响可以为相互促进的,或者可以为相互抑制的。
在一种可能的实现方式中,基于第一情绪变化信息所指示的情绪的类型与历史情绪信息所指示的情绪的类型均为第一方向的情绪类型,那么第一类型为第一方向的情绪类型,且第一程度与第一情绪变化信息所指示的情绪的变化量正相关,第一程度与历史情绪信息所指示的情绪的程度正相关,第一方向的情绪类型为积极的情绪类型,或者为消极的情绪类型。
假设第一方向的情绪类型为积极的情绪类型,且第二方向的情绪类型为消极的情绪类型;或者,第一方向的情绪类型为消极的情绪类型,且第二方向的情绪类型为积极的情绪类型。在一种可能的实现方式中,基于第一情绪变化信息所指示的情绪的类型为第一方向的情绪类型,历史情绪信息所指示的情绪的类型为第二方向的情绪类型,若第一类型为第一方向的情绪类型,那么第一程度与第一情绪信息所指示的情绪的变化量正相关,第一程度与历史情绪信息所指示的情绪的程度负相关;若第一类型为第二方向的情绪类型,那么第一程度与第一情绪信息所指示的情绪的变化量负相关,第一程度与历史情绪信息所指示的情绪的程度正相关。
在一种可能的实现方式中,历史情绪余量还与获取第二情绪触发事件的时刻与获取第一情绪触发事件的时刻之间的时长相关。假设,第一时长为获取第二情绪触发事件的第二时刻与获取第一情绪触发事件的第一时刻之间的间隔时长。
在一种可能的实现方式中,若第一类型和历史情绪信息所指示的情绪的类型均为消极的情绪类型或均为积极的情绪类型,那么第一程度与第一时长负相关。
在实际设计中,不严格限定第一时长等于第一时刻和第二时刻的时间间隔,可以略大于或小于时间间隔,只要能够体现时间间隔的变化即可,例如,时间间隔变大时,第一时长也相应变大。
在一种可能的实现方式中,若第一类型为消极的情绪类型,历史情绪信息所指示的情绪的类型为积极的情绪类型;或者,第一类型为积极的情绪类型,历史情绪信息所指示的情绪的类型为消极的情绪类型,那么第一程度与第一时长正相关。
示例性的,可以按照如下公式(称作消散方程),根据第一时长和历史情绪信息所指示的情绪的程度计算历史情绪余量I_r:
I_r=I_2*f=I_2*1/(param1+param2*Math.exp(param3/t)),其中,param1、param2和param3是该方程的3个参数,t代表第一时长,I_2代表历史情绪信息所指示的情绪的程度。通过调整这3个参数,可以拟合出不同的消散曲线。
在一种可能的实现方式中,不同类型的情绪对应的消散方程中的一个或多个参数可以不同。图5中的实线曲线和虚线曲线分别代表程度为1的开心对应的消散曲线和程度为1的伤心对应的消散曲线,可以看出,二者的相同之处在于,均是“倒S形”曲线,即程度随时间的变化率先增加,后减小;不同之处在于,伤心的程度由1减少至0所经历的时长(称作消散时长)比开心的程度由1减少至0所经历的时长(消散时长)更短。在一种可能的实现方式中,消极的情绪的消散时长比积极的情绪的消散时长更短,有利于机器人更 多的表达积极的情绪。
上面分别介绍了情绪触发事件对应的情绪变化信息和历史情绪余量的确定方式,下面举例介绍根据情绪变化信息和历史情绪余量确定第一情绪信息的方法。
若机器人的情绪的类型包括多种,在一种可能的实现方式中,机器人的第一情绪信息所指示的第一类型对应于一种情绪的类型,并且第一类型对应于情绪的程度高于阈值的情绪类型。在一种可能的实现方式中,可以根据情绪变化信息、历史情绪余量和不同类型的情绪之间的影响量分别确定每种类型的情绪的程度,从中选择高于阈值的程度,并以该程度作为第一程度,以该程度对应的情绪的类型作为第一类型。
下面举例介绍机器人的第一情绪信息的确定方法。
假设机器人包括三种类型的情绪,分别记为A、B和C,三种情绪的程度分别记作:I(A)、I(B)和I(C),A和B相互促进,A和C相互抑制,B和C相互抑制。
机器人检测到第一情绪触发事件时,假设第一情绪触发事件引起A的程度变化,变化量为a,可以按照如下过程分别计算三种情绪的程度:
I(B)=F(B)+E(B)+In(B_A)-In(B_C);
I(C)=F(C)+E(C)-In(C_B)-In(C_A);
I(A)=F(A)+E(A)+In(A_B)-In(A_C);
其中,F(A)、F(B)、F(C)分别为A、B、C的历史情绪余量,当三者的历史情绪余量均为0时,例如第一情绪触发事件为机器人开机后检测到的第一个情绪触发事件,F(A)、F(B)、F(C)为根据初始情绪信息确定的A、B、C的程度,例如,若初始情绪信息用于指示情绪B,程度为b,那么F(A)=0,F(B)=b,F(C)=0。
E(A)、E(B)、E(C)分别为第一情绪触发事件导致的A、B、C的程度变化,对于第一情绪触发事件引起A的程度变化,变化量为a,那么,E(A)=a,E(B)=0,E(C)=0。
In(A_B)为B对A的影响量,In(B_A)为A对B的影响量。在一种可能的实现方式中,In(A_B)=in(A_B)*I(B),In(B_A)=in(B_A)*E(A),其中,in(A_B)为B对A的影响因子,in(B_A)为A对B的影响因子,二者可以不同,也可以相同。其他In()和in()的含义可以类似理解,此处不再赘述。
从I(A)、I(B)和I(C)中选择超过阈值的程度作为第一情绪信息所指示的程度,以该程度对应的情绪的类型作为第一情绪信息所指示的情绪的类型。当多个程度均超过阈值时,可以以最大的程度作为第一情绪信息所指示的程度,以该程度对应的情绪的类型作为第一情绪信息所指示的情绪的类型。
若上述计算机设备为机器人,那么本申请方法实施例还可以包括:根据第一控制信息依次执行第一行为和第二行为。
下面介绍本申请提供的情绪信息的处理方法一个应用场景。假设情绪信息与情绪表达的对应关系如图3A和3B所示,情绪触发事件与情绪变化的对应关系如图表1和表2所示,计算机设备为机器人。
1、机器人开机;
参考图6A,机器人的表情对应的情绪可以为“平静”。
2、机器人根据当前时间确定初始情绪信息;
假设初始情绪信息为:开心,程度为f1=0.3。
3、机器人检测情绪触发事件;
通过声音、接触、光等信号的传感器检测刺激信号,对刺激信号进行处理,判断是否检测到情绪触发事件。
4、响应于在t1时刻检测到情绪触发事件i1,机器人根据该情绪触发事件i1确定情绪变化信息v1;
参考图6B,用户(图6B以儿童为例)对机器人热情的打招呼说“早上好~给我讲个故事好吗?”,此时,机器人可以判定检测到情绪触发事件i1,该情绪触发事件i1为“被热情的打招呼”,可以参考表1,确定情绪变化信息v1为:开心,0.5。
5、机器人根据初始情绪信息和情绪变化信息v1确定情绪信息e1;
之后,机器人可以根据初始情绪信息和情绪变化信息v1确定情绪信息e1,示例性的,以机器人包括两种类型的情绪(A和C)为例,其中A为开心,C为伤心,in(A_C)为0.1,in(C_A)为0.2,C的程度的阈值为0.2,A的程度的阈值为0.1确定情绪信息e1的具体过程可以如下:
I(C)=F(C)+E(C)-in(C_A)*E(A)=0+0-0.2*0.5=-0.1,对于伤心的程度小于0,可以认为伤心的程度为0。
I(A)=F(A)+E(A)-in(A_C)*I(C)=0.3+0.6-0.1*0=0.9。
关于上述两个公式的理解可以参考前述对三种情绪的程度的计算过程的介绍，此处不再赘述。
由于I(C)<0.2,I(A)>0.1,因此,情绪信息e1为:开心,程度I(A)为0.9。
6、机器人根据第一情绪信息执行情绪消散行为a1;
参考图6C,t代表时间,t1、t2、t3分别代表不同的时刻,且t1早于t2,t2早于t3,以情绪消散行为a1包括三个情绪行为(行为1、行为2和行为3)为例,其中,行为1对应于情绪信息e1,可以参考图3A;行为2与行为1对应的情绪类型相同,但是行为2对应的情绪程度低于行为1对应的情绪程度;行为3对应于初始情绪信息。
7、机器人检测情绪触发事件;
8、响应于在t4时刻检测到情绪触发事件i2,机器人根据情绪触发事件i2确定情绪变化信息v2;
步骤4之后,机器人还可以根据检测到的音频数据和图像数据,获取到用户下达的任务指令,即读个故事的指令,机器人可以通过网络下载或从本地存储介质中读取一个故事的音频,并通过扩音器播放该故事的音频数据。
参考图6D,假设播放完该故事的音频数据时(假设在t4时刻),用户皱着眉说“这个故事不好听”,此时机器人可以检测到情绪触发事件i2,具体为任务失败,机器人可以根据情绪触发事件i2确定情绪变化信息v2,可以参考表2,历史情绪变化信息为:伤心,0.8。
9、响应于在t4时刻检测到情绪触发事件i2,机器人根据t1和t4确定情绪信息e1 的消散余量;
假设t1为9点15分,t4为9点20分,第一情绪信息的消散余量为:开心,程度为I1*f(t4-t1)=0.3,其中,f(t4-t1)可以参考图5对应的描述进行理解,此处不再赘述。
不限定步骤8和步骤9之间的时序关系。
10、机器人根据情绪信息e1的消散余量和情绪变化信息v2确定情绪信息e2;
之后，机器人可以根据情绪信息e1的消散余量和情绪变化信息v2确定情绪信息e2，示例性的，确定情绪信息e2的具体过程可以如下：
I(A)=F(A)+E(A)-in(A_C)*E(C)=0.3+0-0.1*0.8=0.22。
I(C)=F(C)+E(C)-in(C_A)*I(A)=0+0.8-0.2*0.22=0.8-0.044=0.756。
由于I(C)>0.2,I(A)>0.1,I(C)>I(A),因此,情绪信息e2为:伤心,程度I(C)为0.756。
11、机器人根据情绪信息e2执行情绪消散行为a2。
参考图6E,t代表时间,t4、t5、t6分别代表不同的时刻,且t4早于t5,t5早于t6,以情绪消散行为a2包括三个情绪行为(行为4、行为5和行为3)为例,其中,行为4对应于情绪信息e2,参考图3B;行为5与行为4对应的情绪类型相同,但是行为5对应的情绪程度低于行为4对应的情绪程度。
结合图1A,上述任一方法实施例中的计算机设备可以为机器人110,那么,输入模块111可以用于检测情绪触发事件(例如上述方法实施例中的第一情绪触发事件),存储器113用于存储执行本申请方案的计算机指令,处理112器用于执行存储器113中的计算机指令时,执行本申请实施例提供的任意一个方法实施例,输出模块114用于接收并执行处理器112生成的控制信息,例如,根据第一控制信息依次执行第一行为和第二行为。
结合图1B，上述任一方法实施例中的计算机设备可以为服务器120，存储器122用于存储执行本申请方案的计算机指令，处理器121在执行存储器122中的计算机指令时，执行本申请实施例提供的任意一个方法实施例，生成第一控制信息。
结合图1C,上述任一方法实施例中的计算机设备可以为计算机系统100,机器人110和服务器120共同执行上述任一方法实施例,例如,机器人110用于检测情绪触发事件,并将检测到的情绪触发事件发送给服务器120;服务器120用于根据机器人110检测到的情绪触发事件执行本申请任一方法实施例,生成控制信息,并向机器人110发送控制信息;机器人110还用于执行服务器120发送的控制信息,例如,在接收到服务器120发送的第一控制信息后,根据第一控制信息依次执行第一行为和第二行为。
上面从方法和实体设备的角度对本申请实施例进行了介绍。下面,从功能模块的角度,介绍本申请实施例提供的情绪信息的处理装置。
从功能模块的角度,本申请可以根据上述方法实施例对执行情绪信息的处理方法的装置进行功能模块的划分,例如,可以对应各个功能划分各个功能模块,也可以将两个或两个以上的功能集成在一个功能模块中。上述集成的功能模块既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
比如,以采用集成的方式划分各个功能单元的情况下,图7A示出了一种情绪信息的处 理装置的结构示意图。如图7A所示,本申请情绪信息的处理装置700的一个实施例可以包括:
确定模块701,用于响应于机器人检测到第一情绪触发事件,根据第一情绪触发事件确定第一情绪信息;生成模块702,用于根据第一情绪信息生成第一控制信息,第一控制信息用于指示机器人依次执行第一行为和第二行为,第一行为和第二行为均用于表达情绪,其中,第一行为用于表达第一情绪信息所指示的情绪,第二行为用于表达比第一情绪信息指示的情绪更为轻微的情绪。
在一种可能的实现方式中,第一情绪触发事件包括机器人被移动,机器人摔倒,机器人的环境参数劣于预设参数,机器人的任务失败和机器人的任务成功中的至少一项。
在一种可能的实现方式中,第二行为的程度比第一行为的程度更轻微,第一行为和第二行为的程度为幅度、频率和音量中的至少一种。
在一种可能的实现方式中,第一控制信息还用于指示机器人在执行第二行为之后,在检测到下一个情绪触发事件之前,执行用于表达初始情绪的第三行为,用于指示初始情绪的初始情绪信息预先存储在存储介质中。
在一种可能的实现方式中,第三行为是机器人开机后执行的第一个用于表达情绪的行为。
在一种可能的实现方式中,第一情绪信息用于指示情绪的类型为第一类型,且情绪的程度为第一程度。
在一种可能的实现方式中,第二行为所表达的情绪的类型为第一类型,第二行为所表达的情绪的程度轻于第一程度。
在一种可能的实现方式中,确定模块701还用于,在根据第一情绪触发事件确定第一情绪信息之前,响应于机器人检测到第二情绪触发事件,根据第二情绪触发事件确定历史情绪信息,历史情绪信息用于生成历史控制信息,历史控制信息用于指示机器人执行历史行为,历史行为用于表达历史情绪信息所指示的情绪。确定模块根据第一情绪触发事件确定第一情绪信息,具体用于,根据第一情绪触发事件确定第一情绪变化信息,第一情绪变化信息用于指示由第一情绪触发事件引起程度发生变化的情绪的类型和程度的变化量;根据第一情绪变化信息和历史情绪信息确定第一情绪信息。
在一种可能的实现方式中,基于第一情绪变化信息所指示的情绪的类型与历史情绪信息所指示的情绪的类型均为第一方向的情绪类型,那么第一类型为第一方向的情绪类型,且第一程度与第一情绪变化信息所指示的情绪的变化量正相关,第一程度与历史情绪信息所指示的情绪的程度正相关,第一方向的情绪类型为积极的情绪类型,或者为消极的情绪类型。
在一种可能的实现方式中,第一程度与第一时长负相关,第一时长为机器人检测到第二情绪触发事件的第二时刻与机器人检测到第一情绪触发事件的第一时刻之间的间隔时长。
在一种可能的实现方式中,第一方向的情绪类型为积极的情绪类型,且第二方向的情绪类型为消极的情绪类型;或者,第一方向的情绪类型为消极的情绪类型,且第二方向的情绪类型为积极的情绪类型;基于第一情绪变化信息所指示的情绪的类型为第一方向的情 绪类型,历史情绪信息所指示的情绪的类型为第二方向的情绪类型,且第一类型为第一方向的情绪类型,那么第一程度与第一情绪信息所指示的情绪的变化量正相关,第一程度与历史情绪信息所指示的情绪的程度负相关。
在一种可能的实现方式中,第一程度与第一时长正相关,第一程度与第一时长负相关,第一时长为机器人检测到第二情绪触发事件的第二时刻与机器人检测到第一情绪触发事件的第一时刻之间的间隔时长。
参考图7B,在一种可能的实现方式中,装置还包括执行模块703,执行模块703用于:在生成模块702根据第一情绪信息生成第一控制信息之后,根据第一控制信息依次执行第一行为和第二行为。图7B对应的装置可以设置在机器人110或机器人系统100中。
一种可能的实现方式,本申请实施例中的计算机执行指令或计算机指令也可以称之为应用程序代码,本申请实施例对此不作具体限定。
上述实施例,可以全部或部分地通过软件、硬件、固件或者其任意组合来实现,当使用软件实现时,可以全部或部分地以计算机程序产品的形式实现。
所述计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行所述计算机执行指令时,全部或部分地产生按照本申请实施例所述的流程或功能。所述计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。所述计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一计算机可读存储介质传输,例如,所述计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线(DSL))或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。所述计算机可读存储介质可以是计算机能够存储的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。所述可用介质可以是磁性介质,(例如,软盘、硬盘、磁带)、光介质(例如,DVD)、或者半导体介质(例如固态硬盘Solid State Disk(SSD))等。
本申请的说明书和权利要求书及上述附图中的术语“第一”、“第二”等是用于区别类似的对象,而不必用于描述特定的顺序或先后次序。应该理解这样使用的术语在适当情况下可以互换,这仅仅是描述本申请的实施例中对相同属性的对象在描述时所采用的区分方式。此外,术语“包括”和“具有”以及他们的任何变形,意图在于覆盖不排他的包含,以便包含一系列单元的过程、方法、系统、产品或设备不必限于那些单元,而是可包括没有清楚地列出的或对于这些过程、方法、产品或设备固有的其它单元。在本申请实施例中,“多个”指两个或两个以上。
本申请实施例中,“示例性的”或者“例如”等词用于表示作例子、例证或说明。本申请实施例中被描述为“示例性的”或者“例如”的任何实施例或设计方案不应被解释为比其他实施例或设计方案更优选或更具优势。确切而言,使用“示例性的”或者“例如”等词旨在以具体方式呈现相关概念。
在本申请的各实施例中，为了方便理解，进行了多种举例说明。然而，这些例子仅仅是一些举例，并不意味着是实现本申请的最佳实现方式。
以上对本申请所提供的技术方案进行了详细介绍,本申请中应用了具体个例对本申请 的原理及实施方式进行了阐述,以上实施例的说明只是用于帮助理解本申请的方法及其核心思想;同时,对于本领域的一般技术人员,依据本申请的思想,在具体实施方式及应用范围上均会有改变之处,综上所述,本说明书内容不应理解为对本申请的限制。

Claims (29)

  1. 一种情绪信息的处理方法,其特征在于,包括:
    响应于机器人检测到第一情绪触发事件,根据所述第一情绪触发事件确定第一情绪信息;
    根据所述第一情绪信息生成第一控制信息,所述第一控制信息用于指示所述机器人依次执行第一行为和第二行为,所述第一行为和第二行为均用于表达情绪,其中,所述第一行为用于表达所述第一情绪信息所指示的情绪,所述第二行为用于表达比所述第一情绪信息所指示的情绪更为轻微的情绪。
  2. 根据权利要求1所述的方法,其特征在于,所述第一情绪触发事件包括所述机器人被移动,所述机器人摔倒,所述机器人的环境参数劣于预设参数,所述机器人的任务失败和所述机器人的任务成功中的至少一项。
  3. 根据权利要求1或2所述的方法,其特征在于,所述第二行为的程度比所述第一行为的程度更轻微,所述第一行为和所述第二行为的程度为幅度、频率和音量中的至少一种。
  4. 根据权利要求1至3中任一项所述的方法,其特征在于,所述第一控制信息还用于指示所述机器人在执行所述第二行为之后,在检测到下一个情绪触发事件之前,执行用于表达初始情绪的第三行为,用于指示所述初始情绪的初始情绪信息预先存储在存储介质中。
  5. 根据权利要求4所述的方法,其特征在于,所述第三行为是所述机器人开机后执行的第一个用于表达情绪的行为。
  6. 根据权利要求1至5中任一项所述的方法,其特征在于,所述第一情绪信息用于指示情绪的类型为第一类型,且情绪的程度为第一程度。
  7. 根据权利要求6所述的方法,其特征在于,所述第二行为所表达的情绪的类型为所述第一类型,所述第二行为所表达的情绪的程度轻于所述第一程度。
  8. 根据权利要求6或7所述的方法,其特征在于,在所述根据所述第一情绪触发事件确定第一情绪信息之前,所述方法还包括:
    响应于所述机器人检测到第二情绪触发事件,根据所述第二情绪触发事件确定历史情绪信息,所述历史情绪信息用于生成历史控制信息,所述历史控制信息用于指示所述机器人执行历史行为,所述历史行为用于表达所述历史情绪信息所指示的情绪;
    所述根据所述第一情绪触发事件确定第一情绪信息,包括:
    根据所述第一情绪触发事件确定第一情绪变化信息,所述第一情绪变化信息用于指示由所述第一情绪触发事件引起程度发生变化的情绪的类型和程度的变化量;
    根据所述第一情绪变化信息和所述历史情绪信息确定所述第一情绪信息。
  9. 根据权利要求8所述的方法,其特征在于,基于所述第一情绪变化信息所指示的情绪的类型与所述历史情绪信息所指示的情绪的类型均为第一方向的情绪类型,那么所述第一类型为所述第一方向的情绪类型,且所述第一程度与所述第一情绪变化信息所指示的情绪的变化量正相关,所述第一程度与所述历史情绪信息所指示的情绪的程度正相关,所述第一方向的情绪类型为积极的情绪类型,或者为消极的情绪类型。
  10. 根据权利要求9所述的方法,其特征在于,所述第一程度与第一时长负相关,所 述第一时长为所述机器人检测到所述第二情绪触发事件的第二时刻与所述机器人检测到所述第一情绪触发事件的第一时刻之间的间隔时长。
  11. 根据权利要求8所述的方法,其特征在于,第一方向的情绪类型为积极的情绪类型,且第二方向的情绪类型为消极的情绪类型;
    或者,所述第一方向的情绪类型为消极的情绪类型,且所述第二方向的情绪类型为积极的情绪类型;
    基于所述第一情绪变化信息所指示的情绪的类型为第一方向的情绪类型,所述历史情绪信息所指示的情绪的类型为第二方向的情绪类型,且所述第一类型为所述第一方向的情绪类型,那么所述第一程度与所述第一情绪信息所指示的情绪的变化量正相关,所述第一程度与所述历史情绪信息所指示的情绪的程度负相关。
  12. 根据权利要求11所述的方法,其特征在于,所述第一程度与第一时长正相关,所述第一时长为所述机器人检测到所述第二情绪触发事件的第二时刻与所述机器人检测到所述第一情绪触发事件的第一时刻之间的间隔时长。
  13. 一种情绪信息的处理装置,其特征在于,包括:
    确定模块,用于响应于机器人检测到第一情绪触发事件,根据所述第一情绪触发事件确定第一情绪信息;
    生成模块,用于根据所述第一情绪信息生成第一控制信息,所述第一控制信息用于指示所述机器人依次执行第一行为和第二行为,所述第一行为和第二行为均用于表达情绪,其中,所述第一行为用于表达所述第一情绪信息所指示的情绪,所述第二行为用于表达比所述第一情绪信息所述指示的情绪更为轻微的情绪。
  14. 根据权利要求13所述的装置,其特征在于,所述第一情绪触发事件包括所述机器人被移动,所述机器人摔倒,所述机器人的环境参数劣于预设参数,所述机器人的任务失败和所述机器人的任务成功中的至少一项。
  15. 根据权利要求13或14所述的装置,其特征在于,所述第二行为的程度比所述第一行为的程度更轻微,所述第一行为和所述第二行为的程度为幅度、频率和音量中的至少一种。
  16. 根据权利要求13至15中任一项所述的装置,其特征在于,所述第一控制信息还用于指示所述机器人在执行所述第二行为之后,在检测到下一个情绪触发事件之前,执行用于表达初始情绪的第三行为,用于指示所述初始情绪的初始情绪信息预先存储在存储介质中。
  17. 根据权利要求16所述的装置,其特征在于,所述第三行为是所述机器人开机后执行的第一个用于表达情绪的行为。
  18. 根据权利要求13至17中任一项所述的装置,其特征在于,所述第一情绪信息用于指示情绪的类型为第一类型,且情绪的程度为第一程度。
  19. 根据权利要求18所述的装置,其特征在于,所述第二行为所表达的情绪的类型为所述第一类型,所述第二行为所表达的情绪的程度轻于所述第一程度。
  20. 根据权利要求18或19所述的装置,其特征在于,所述确定模块还用于:
    在所述根据所述第一情绪触发事件确定第一情绪信息之前,响应于所述机器人检测到第二情绪触发事件,根据所述第二情绪触发事件确定历史情绪信息,所述历史情绪信息用于生成历史控制信息,所述历史控制信息用于指示所述机器人执行历史行为,所述历史行为用于表达所述历史情绪信息所指示的情绪;
    所述确定模块根据所述第一情绪触发事件确定第一情绪信息,具体用于:
    根据所述第一情绪触发事件确定第一情绪变化信息,所述第一情绪变化信息用于指示由所述第一情绪触发事件引起程度发生变化的情绪的类型和程度的变化量;
    根据所述第一情绪变化信息和所述历史情绪信息确定所述第一情绪信息。
  21. 根据权利要求20所述的装置,其特征在于,基于所述第一情绪变化信息所指示的情绪的类型与所述历史情绪信息所指示的情绪的类型均为第一方向的情绪类型,那么所述第一类型为所述第一方向的情绪类型,且所述第一程度与所述第一情绪变化信息所指示的情绪的变化量正相关,所述第一程度与所述历史情绪信息所指示的情绪的程度正相关,所述第一方向的情绪类型为积极的情绪类型,或者为消极的情绪类型。
  22. 根据权利要求21所述的装置,其特征在于,所述第一程度与第一时长负相关,所述第一时长为所述机器人检测到所述第二情绪触发事件的第二时刻与所述机器人检测到所述第一情绪触发事件的第一时刻之间的间隔时长。
  23. 根据权利要求18或19所述的装置,其特征在于,第一方向的情绪类型为积极的情绪类型,且第二方向的情绪类型为消极的情绪类型;
    或者,所述第一方向的情绪类型为消极的情绪类型,且所述第二方向的情绪类型为积极的情绪类型;
    基于所述第一情绪变化信息所指示的情绪的类型为第一方向的情绪类型,所述历史情绪信息所指示的情绪的类型为第二方向的情绪类型,且所述第一类型为所述第一方向的情绪类型,那么所述第一程度与所述第一情绪信息所指示的情绪的变化量正相关,所述第一程度与所述历史情绪信息所指示的情绪的程度负相关。
  24. 根据权利要求23所述的装置,其特征在于,所述第一程度与第一时长正相关,所述第一程度与第一时长负相关,所述第一时长为所述机器人检测到所述第二情绪触发事件的第二时刻与所述机器人检测到所述第一情绪触发事件的第一时刻之间的间隔时长。
  25. 一种计算机设备,其特征在于,包括处理器和存储器,所述处理器在运行所述存储器存储的计算机指令时,执行如权利要求1至12中任一项所述的方法。
  26. 一种计算机可读存储介质,其特征在于,包括指令,当所述指令在计算机上运行时,使得计算机执行如权利要求1至12中任一项所述的方法。
  27. 一种计算机程序产品,其特征在于,包括指令,当所述指令在计算机上运行时,使得计算机执行如权利要求1至12中任一项所述的方法。
  28. 一种机器人,其特征在于,包括输入模块、输出模块、处理器和存储器,所述输入模块用于检测情绪触发事件,所述存储器用于存储计算机指令,所述处理器在运行所述存储器存储的计算机指令时,执行如权利要求1至12中任一项所述的方法,所述输出模块用于执行所述处理器生成的控制信息。
  29. 一种机器人系统,其特征在于,包括机器人和服务器;
    所述机器人用于检测情绪触发事件,并将检测到的情绪触发事件发送给所述服务器;
    所述服务器用于根据所述机器人检测到的情绪触发事件执行如权利要求1至12中任一项所述的方法,并向所述机器人发送控制信息;
    所述机器人还用于执行所述服务器发送的控制信息。
PCT/CN2020/133746 2019-12-31 2020-12-04 一种情绪信息的处理方法及装置 WO2021135812A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP20908501.8A EP3923198A4 (en) 2019-12-31 2020-12-04 METHOD AND DEVICE FOR PROCESSING EMOTIONAL INFORMATION

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911415571.0 2019-12-31
CN201911415571.0A CN111191765A (zh) 2019-12-31 2019-12-31 一种情绪信息的处理方法及装置

Publications (1)

Publication Number Publication Date
WO2021135812A1 true WO2021135812A1 (zh) 2021-07-08

Family

ID=70707948

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/133746 WO2021135812A1 (zh) 2019-12-31 2020-12-04 一种情绪信息的处理方法及装置

Country Status (3)

Country Link
EP (1) EP3923198A4 (zh)
CN (1) CN111191765A (zh)
WO (1) WO2021135812A1 (zh)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111191765A (zh) * 2019-12-31 2020-05-22 华为技术有限公司 一种情绪信息的处理方法及装置
CN112115847B (zh) * 2020-09-16 2024-05-17 深圳印像数据科技有限公司 人脸情绪愉悦度判断方法
CN112379821A (zh) * 2020-11-24 2021-02-19 浙江同善人工智能技术有限公司 一种服务型机器人的交互系统
CN112847369B (zh) * 2021-01-08 2023-04-07 深圳市注能科技有限公司 机器人情绪转变的方法、装置、机器人及存储介质
CN113524179A (zh) * 2021-07-05 2021-10-22 上海仙塔智能科技有限公司 基于情绪累积数值的控制方法、装置、设备以及介质
WO2023000310A1 (en) * 2021-07-23 2023-01-26 Huawei Technologies Co., Ltd. Methods, devices, and media for customizing and expressing personality of robot

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10289006A (ja) * 1997-04-11 1998-10-27 Yamaha Motor Co Ltd 疑似感情を用いた制御対象の制御方法
KR100825719B1 (ko) * 2005-12-09 2008-04-29 한국전자통신연구원 복수의 감정 생성 로봇 및 로봇에서 복수의 감정 생성 방법
CN107020637A (zh) * 2016-01-29 2017-08-08 深圳光启合众科技有限公司 宠物机器人的情绪表达方法及宠物机器人

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105345818A (zh) * 2015-11-04 2016-02-24 深圳好未来智能科技有限公司 带有情绪及表情模块的3d视频互动机器人
CN106096717A (zh) * 2016-06-03 2016-11-09 北京光年无限科技有限公司 面向智能机器人的信息处理方法及系统
US20170365277A1 (en) * 2016-06-16 2017-12-21 The George Washington University Emotional interaction apparatus
CN106200959A (zh) * 2016-07-08 2016-12-07 北京光年无限科技有限公司 面向智能机器人的信息处理方法及系统
CN106970703A (zh) * 2017-02-10 2017-07-21 南京威卡尔软件有限公司 基于心情指数的多层情感计算方法
CN111191765A (zh) * 2019-12-31 2020-05-22 华为技术有限公司 一种情绪信息的处理方法及装置

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3923198A4

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117250883A (zh) * 2022-12-06 2023-12-19 北京小米机器人技术有限公司 智能设备控制方法、装置、存储介质与芯片

Also Published As

Publication number Publication date
CN111191765A (zh) 2020-05-22
EP3923198A1 (en) 2021-12-15
EP3923198A4 (en) 2022-06-22

Similar Documents

Publication Publication Date Title
WO2021135812A1 (zh) 一种情绪信息的处理方法及装置
CN109789550B (zh) 基于小说或表演中的先前角色描绘的社交机器人的控制
CN110609620B (zh) 基于虚拟形象的人机交互方法、装置及电子设备
TWI379205B (en) Instant communication interacting system and method thereof
US11037576B2 (en) Distributed machine-learned emphatic communication for machine-to-human and machine-to-machine interactions
CN116524924A (zh) 数字人交互控制方法、装置、电子设备和存储介质
WO2016206645A1 (zh) 为机器装置加载控制数据的方法及装置
WO2022141142A1 (zh) 一种确定目标音视频的方法及系统
US11997445B2 (en) Systems and methods for live conversation using hearing devices
KR102361038B1 (ko) 적응형 다중 생체정보를 이용한 고수준 상황인지 기반의 스마트미러 챗봇 시스템
EP4141867A1 (en) Voice signal processing method and related device therefor
WO2020130734A1 (ko) 사용자 상태에 기초하여 반응을 제공하는 전자 장치 및 그의 동작 방법
KR102078792B1 (ko) 아스퍼거 증후군 치료 장치
CN110718119A (zh) 基于儿童专用穿戴智能设备的教育能力支持方法及系统
JP7123028B2 (ja) 情報処理システム、情報処理方法、及びプログラム
WO2024219506A1 (ja) 電子機器、行動制御システムおよび制御システム
KR102637704B1 (ko) 아동에게 칭찬 메시지를 제공하는 방법 및 그것을 수행하는 서버
KR102128812B1 (ko) 로봇의 사회 지능 평가 방법 및 이를 위한 장치
JP2018051648A (ja) ロボット制御装置、ロボット、ロボット制御方法、及びプログラム
JP4355823B2 (ja) 表情等の情報処理装置
KR102722595B1 (ko) 전자 장치 및 그 제어 방법
Campbell et al. Expressivity in interactive speech synthesis; some paralinguistic and nonlinguistic issues of speech prosody for conversational dialogue systems
JP2024155861A (ja) 電子機器
JP2024155820A (ja) 電子機器
Zomer Shaping proactivity: Designing interactions with proactive ambient artefacts

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20908501

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020908501

Country of ref document: EP

Effective date: 20210908

NENP Non-entry into the national phase

Ref country code: DE