CN107066956B - Multisource emotion recognition robot based on body area network - Google Patents
- Publication number
- CN107066956B (application CN201710181535.7A)
- Authority
- CN
- China
- Prior art keywords
- user
- mobile robot
- intelligent mobile
- heart rate
- emotion
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/174—Facial expression recognition
- G06V40/176—Dynamic expression
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B25J11/0005—Manipulators having means for high-level communication with users, e.g. speech generator, face recognition means
- B25J11/001—Manipulators having means for high-level communication with users, e.g. speech generator, face recognition means with emotions simulating means
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/04—Programme control other than numerical control, i.e. in sequence controllers or logic controllers
- G05B19/042—Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/28—Databases characterised by their database models, e.g. relational or object models
- G06F16/284—Relational databases
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2415—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
- G06F18/24155—Bayesian classification
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/015—Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
Abstract
The invention provides a multi-source emotion recognition robot based on a body area network that achieves higher emotion recognition accuracy. The multi-source emotion recognition robot comprises: a body area network wearable device and an intelligent mobile robot. The body area network wearable device acquires the user's physiological information and transmits it to the intelligent mobile robot. The intelligent mobile robot acquires the user's facial information, extracts emotional features from the acquired facial information and the received physiological information, and performs emotion fusion analysis on the extracted features to obtain a physiology-face based emotional state of the user. The invention is applicable to the fields of industrial automation and human-machine interaction.
Description
Technical Field
The invention relates to the fields of industrial automation and human-computer interaction, and in particular to a multi-source emotion recognition robot based on a body area network.
Background
In recent years, as China's demographics have shifted, a rapidly aging society is approaching, and elderly care has become an increasingly important service field. Beyond medical care, the mental health of the elderly is indispensable to improving their quality of life, so intelligent mobile robots that monitor the emotions of the elderly have become important devices in elderly care. At present, most elderly-companion robots rely only on facial expressions for emotion recognition; their recognition accuracy is insufficient, they cannot simultaneously monitor physiological information, and the user experience is poor.
Disclosure of Invention
The technical problem the invention aims to solve is to provide a multi-source emotion recognition robot based on a body area network, addressing the low emotion recognition accuracy of the prior art.
In order to solve the above technical problem, an embodiment of the present invention provides a multi-source emotion recognition robot based on a body area network, including: wearable equipment of a body area network and an intelligent mobile robot;
the body area network wearable device is used for acquiring physiological information of a user and transmitting the acquired physiological information to the intelligent mobile robot;
the intelligent mobile robot is used for acquiring the user's facial information, extracting emotional features from the acquired facial information and the received physiological information, and performing emotion fusion analysis on the extraction results to obtain a physiology-face based emotional state of the user.
Further, the emotional states include: surprise, fear, disgust, anger, happiness and sadness.
Further, the body area network wearable device comprises: a somatosensory temperature sensor sub-node, a conductivity sensor sub-node, a heart rate sensor sub-node and a sink node;
the somatosensory temperature sensor sub-node comprises: a somatosensory temperature sensor for collecting the user's somatosensory temperature in real time, and a first Bluetooth communication module for sending the collected temperature to the sink node;
the conductivity sensor sub-node comprises: a conductivity sensor for collecting the conductivity of the user's skin sweat in real time, and a second Bluetooth communication module for sending the collected conductivity to the sink node;
the heart rate sensor sub-node comprises: a heart rate sensor for collecting the user's heart rate in real time, and a third Bluetooth communication module for sending the collected heart rate to the sink node;
the sink node comprises: a fourth Bluetooth communication module for transmitting the received somatosensory temperature, skin-sweat conductivity and heart rate to the intelligent mobile robot.
Further, the heart rate sensor sub-node further comprises:
an analog-to-digital conditioning circuit for pre-processing the heart rate signal collected by the heart rate sensor before it is sent out, the pre-processing comprising: amplification, shaping and filtering operations.
Further, the heart rate sensor is a patch-type heart rate sensor worn over an arterial blood vessel at the wrist;
the conductivity sensor is a wristband worn on the wrist;
the somatosensory temperature sensor is worn on the inner side of the upper arm.
Further, the intelligent mobile robot is configured to acquire the user's facial information using a face recognition algorithm when the user is static and the user's face is within the robot's shooting range;
the intelligent mobile robot is further configured, when the user is moving, to control its own movement according to the user's head movement trajectory using the mean-shift algorithm, and to perform face tracking along that trajectory to acquire the user's facial information.
Further, the intelligent mobile robot is configured to extract facial-region expression features from the acquired facial information using a facial expression recognition algorithm, and to obtain a facial emotion probability vector using a Bayesian classification algorithm based on prior knowledge from an expression feature-emotion mapping database pre-stored in a local database, where the facial-region expression features are the positions of labeled feature points.
Further, the intelligent mobile robot is configured to obtain the corresponding statistical feature values from the received physiological information, and to obtain a physiological-feature emotion probability vector using a Bayesian classification algorithm based on prior knowledge from a physiological feature-emotion mapping database pre-stored in the local database, where the physiological-feature emotion probability vector is a vector of the probabilities of the 6 emotional states (surprise, fear, disgust, anger, happiness and sadness) derived from the physiological features.
Further, the intelligent mobile robot is configured to perform regression analysis on the obtained facial and physiological-feature emotion probability vectors to obtain the correlation between the facial and physiological features and the conditional probabilities between the two feature sets and the target variable;
the intelligent mobile robot is further configured to perform decision-level fusion analysis using evidential reasoning based on the Gordon-Shortliffe algorithm on the obtained correlation and conditional probabilities, thereby obtaining the user's emotional state.
Furthermore, the intelligent mobile robot is further configured to generate a corresponding behavior instruction according to the obtained physiological-facial based user emotional state, so as to realize emotional interaction between the intelligent mobile robot and the user.
The technical scheme of the invention has the following beneficial effects:
in this scheme, the body area network wearable device acquires the user's physiological information and transmits it to the intelligent mobile robot; the intelligent mobile robot acquires the user's facial information, extracts emotional features from the acquired facial information and the received physiological information, and performs emotion fusion analysis on the extraction results. The resulting physiology-face based emotional state of the user has higher emotion recognition accuracy.
Drawings
Fig. 1 is a schematic structural diagram of a multi-source emotion recognition robot based on a body area network according to an embodiment of the present invention;
FIG. 2 is a detailed structural schematic diagram of a multi-source emotion recognition robot based on a body area network according to an embodiment of the present invention;
fig. 3 is a flowchart of a multi-source emotion recognition robot based on a body area network according to an embodiment of the present invention.
Detailed Description
In order to make the technical problems, technical solutions and advantages of the present invention more apparent, the following detailed description is given with reference to the accompanying drawings and specific embodiments.
To address the low accuracy of existing emotion recognition, the invention provides a multi-source emotion recognition robot based on a body area network.
As shown in fig. 1, the multi-source emotion recognition robot based on the body area network according to the embodiment of the present invention includes: a body area network wearable device 11 and an intelligent mobile robot 12;
the body area network wearable device 11 is used for acquiring physiological information of a user and transmitting the acquired physiological information to the intelligent mobile robot;
the intelligent mobile robot 12 is configured to acquire facial information of a user, perform emotion feature extraction on the acquired facial information and the received physiological information, and perform emotion fusion analysis according to an emotion feature extraction result to obtain a user emotion state based on physiology-face.
According to the body area network-based multi-source emotion recognition robot, physiological information of a user is acquired through body area network wearable equipment, and the acquired physiological information is transmitted to an intelligent mobile robot; the method comprises the steps of acquiring facial information of a user through an intelligent mobile robot, extracting emotional features of the acquired facial information and received physiological information, carrying out emotion fusion analysis according to an emotional feature extraction result to obtain a user emotional state based on physiology-face, and enabling the obtained user emotional state based on physiology-face to have higher emotion recognition accuracy.
In the foregoing specific implementation of the body area network-based multi-source emotion recognition robot, further, the emotional state includes: surprise, fear, disgust, anger, happiness and sadness.
In this embodiment, the body area network wearable device comprises 3 types of sensor sub-nodes (a somatosensory temperature sensor sub-node, a conductivity sensor sub-node and a heart rate sensor sub-node) and a sink node.
In this embodiment, as shown in fig. 2, as an optional embodiment, the somatosensory temperature sensor sub-node comprises: a somatosensory temperature sensor for collecting the user's somatosensory temperature in real time, and a first Bluetooth communication module for sending the collected temperature to the sink node;
the conductivity sensor sub-node comprises: a conductivity sensor for collecting the conductivity of the user's skin sweat in real time, and a second Bluetooth communication module for sending the collected conductivity to the sink node;
the heart rate sensor sub-node comprises: a heart rate sensor for collecting the user's heart rate in real time, and a third Bluetooth communication module for sending the collected heart rate to the sink node;
the sink node comprises: a fourth Bluetooth communication module for transmitting the received somatosensory temperature, skin-sweat conductivity and heart rate to the intelligent mobile robot.
In this embodiment, the sink node communicates with the intelligent mobile robot through the fourth Bluetooth communication module; that is, the body area network wearable device communicates with the intelligent mobile robot through the fourth Bluetooth communication module.
In the embodiment of the present invention, the sink node comprises the fourth Bluetooth communication module, a microprocessor, and a power module that supplies the fourth Bluetooth communication module and the microprocessor. The power module may be a CR2032 button cell, which is compact and long-lasting. The fourth Bluetooth communication module may be Bluetooth Low Energy 4.0, and the microprocessor may be an STC89C51 chip. Through Bluetooth Low Energy 4.0, the microprocessor receives in real time the user's somatosensory temperature, skin-sweat conductivity and heart rate, collected by the 3 types of sensor sub-nodes and sent by the Bluetooth communication modules of their respective nodes, and forwards the received physiological information to the intelligent mobile robot in real time; the sink node thus both acquires physiological data and transmits it in real time.
In this embodiment, the sink node is connected wirelessly (with no physical wiring) to the somatosensory temperature sensor, conductivity sensor and heart rate sensor sub-nodes. The three sensor sub-nodes are configured as Bluetooth slave ends and the sink node as the Bluetooth master end; after power-up the master listens while the slaves broadcast, a connection is established when the master receives a slave's broadcast message, the slaves then transmit physiological information to the master, and the data transmission interval is set to 1 s.
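The 1 s transmission interval implies a small periodic payload from each sub-node. The patent does not specify a packet format, so the Python sketch below assumes a hypothetical 6-byte little-endian layout (node id, somatosensory temperature in 0.01 °C, skin-sweat conductivity in microsiemens, heart rate in bpm) purely to illustrate how the three physiological readings could be framed for the Bluetooth link and decoded on the sink/robot side:

```python
import struct

# Hypothetical 1 Hz physiological payload for the body-area-network link.
# This layout is an illustrative assumption, not taken from the patent:
# uint8 node id, int16 temperature * 100, uint16 conductivity (uS), uint8 bpm.
PACKET_FMT = "<BhHB"

def pack_sample(node_id, temp_c, conductivity_us, heart_rate_bpm):
    """Encode one sensor sample for transmission to the sink node."""
    return struct.pack(PACKET_FMT, node_id, round(temp_c * 100),
                       conductivity_us, heart_rate_bpm)

def unpack_sample(payload):
    """Decode a sample on the sink/robot side."""
    node_id, temp_raw, conductivity_us, bpm = struct.unpack(PACKET_FMT, payload)
    return {"node": node_id, "temp_c": temp_raw / 100.0,
            "conductivity_us": conductivity_us, "heart_rate_bpm": bpm}
```

A fixed binary frame like this keeps each 1 s update to a few bytes, which suits the low-throughput Bluetooth Low Energy link described above.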
In an embodiment of the foregoing multi-source emotion recognition robot based on the body area network, further, the heart rate sensor sub-node further comprises:
an analog-to-digital conditioning circuit for pre-processing the heart rate signal collected by the heart rate sensor before it is sent out, the pre-processing comprising: amplification, shaping and filtering operations.
In this embodiment, the heart rate sensor sub-node comprises the heart rate sensor together with a signal-conditioning circuit that pre-processes the signal the sensor collects. The circuit comprises an amplifier connected to the heart rate sensor, a filter connected to the amplifier, and a comparator connected to the filter: the amplifier amplifies and shapes the weak raw heart rate signal, the filter denoises the amplified signal, and the comparator performs analog-to-digital conversion on the denoised signal, finally outputting a heart rate signal with a stable waveform in the form of pulse signals.
In a specific embodiment of the multi-source emotion recognition robot based on the body area network, further, the heart rate sensor is a patch-type heart rate sensor worn over an arterial blood vessel at the wrist;
the conductivity sensor is a wristband worn on the wrist;
the somatosensory temperature sensor is worn on the inner side of the upper arm.
In this embodiment, for ease of wearing, the heart rate, conductivity and somatosensory temperature sensors are mounted on the wearable device at different body parts. The heart rate sensor may be a patch-type sensor worn over an arterial blood vessel at the wrist to collect the user's heart rate signal in real time; the conductivity sensor is a wristband worn on the wrist to collect the conductivity of the user's skin sweat in real time; the somatosensory temperature sensor may be worn on the inner side of the upper arm to collect the user's somatosensory temperature in real time. Each sensor transmits its collected physiological information (heart rate, skin-sweat conductivity, somatosensory temperature) to the sink node through its corresponding Bluetooth communication module.
In this embodiment, the hardware of the intelligent mobile robot comprises: a camera for acquiring the user's facial image, a mobile base responsible for the robot's movement, a distributed microprocessor, a touch screen, a fifth Bluetooth communication module, a power module for the robot, and interactive devices for interacting with the user, including but not limited to a display screen, a voice module, a call module and an emergency alarm module. The mobile base comprises two drive wheels, two universal wheels and an infrared obstacle-avoidance sensor.
In this embodiment, because the physiological and facial data are complex and the computational load is large, the user's emotion recognition model is built on a distributed microprocessor. The distributed microprocessor is organized as a Hadoop cluster, and the operating system may optionally be Red Hat's open-source Linux.
In this embodiment, the fifth Bluetooth communication module communicates mainly with the sink node of the body area network wearable device. Since the sink node is already the master of the somatosensory temperature, conductivity and heart rate sensor sub-nodes, the sink node and the fifth Bluetooth communication module can form an ad-hoc network in which the sink node acts as the slave of the fifth module. The fifth Bluetooth communication module communicates with the robot's distributed microprocessor over a serial port.
In this embodiment, the software composition of the intelligent mobile robot includes: the system comprises a local database, a face recognition module, a face tracking module, a feature extraction module and a multi-source emotion fusion analysis module; wherein the local database comprises: an expression feature-emotion mapping database and a physiological feature-emotion mapping database.
In the foregoing specific embodiment of the multi-source emotion recognition robot based on the body area network, further, the intelligent mobile robot is configured to acquire the user's facial information using a face recognition algorithm when the user is static and the user's face is within the camera's shooting range;
the intelligent mobile robot is further configured, when the user is moving, to control its own movement according to the user's head movement trajectory using the mean-shift algorithm, and to perform face tracking along that trajectory to acquire the user's facial information.
In this embodiment, as shown in fig. 3, when the user is static and the user's face is within the robot's shooting range, the camera acquires the user's facial image, and the face recognition module in the intelligent mobile robot preprocesses the acquired image using a face recognition algorithm to obtain the user's facial information.
In this embodiment, when the user is moving, the face tracking module in the intelligent mobile robot uses the mean-shift algorithm to control the robot's movement according to the user's head movement trajectory and performs face tracking along that trajectory; this path is mainly used to obtain the user's facial information while the user is in motion.
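The mean-shift (average displacement) tracker works by repeatedly moving a search window to the weighted mean of nearby samples until it settles on a density mode, which is what lets the robot follow the head as it moves. A minimal 1-D sketch of that iteration, with an assumed bandwidth rather than any value from the patent:

```python
def mean_shift_1d(points, start, bandwidth=2.0, iters=20):
    """Shift `start` toward the local density mode of `points`: at each step
    the center moves to the mean of samples within `bandwidth` of it, the
    same update a mean-shift face tracker applies to its window each frame."""
    center = float(start)
    for _ in range(iters):
        near = [p for p in points if abs(p - center) <= bandwidth]
        if not near:
            break  # no support near the window: stop
        new_center = sum(near) / len(near)
        if abs(new_center - center) < 1e-6:
            break  # converged on a mode
        center = new_center
    return center
```

In a 2-D tracker the samples would be pixel coordinates weighted by how face-like they are, but the convergence step is the same.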
In the specific implementation of the multi-source emotion recognition robot based on the body area network, further, the intelligent mobile robot is configured to extract facial-region expression features from the acquired facial information using a facial expression recognition algorithm, and to obtain a facial emotion probability vector using a Bayesian classification algorithm based on prior knowledge from the expression feature-emotion mapping database pre-stored in the local database, where the facial-region expression features are the positions of labeled feature points.
In this embodiment, as shown in fig. 3, the feature extraction module in the intelligent mobile robot extracts facial expression features from the obtained facial information using a facial expression recognition algorithm, and obtains a facial emotion probability vector using a Bayesian classification algorithm based on prior knowledge from the expression feature-emotion mapping database pre-stored in the local database; the facial expression features may be the positions of labeled feature points.
In the foregoing specific embodiment of the multi-source emotion recognition robot based on the body area network, further, the intelligent mobile robot is configured to obtain the corresponding statistical feature values from the received physiological information, and to obtain a physiological-feature emotion probability vector using a Bayesian classification algorithm based on prior knowledge from the physiological feature-emotion mapping database pre-stored in the local database, where the physiological-feature emotion probability vector is a vector of the probabilities of the 6 emotional states (surprise, fear, disgust, anger, happiness and sadness) derived from the physiological features.
In this embodiment, as shown in fig. 3, the feature extraction module in the intelligent mobile robot obtains a physiological-feature emotion probability vector (a posterior probability) from the received physiological information, using a Bayesian classification algorithm (for example, a BAN Bayesian network classifier) based on prior knowledge from the physiological feature-emotion mapping database pre-stored in the local database. This vector consists of the probabilities of the 6 emotional states (surprise, fear, disgust, anger, happiness and sadness) derived from the physiological features; since the physiological features are continuous variables, their values can be assumed to follow a Gaussian distribution.
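Under the Gaussian assumption above, the physiological-feature emotion probability vector can be sketched with a naive Bayes posterior. This is an illustrative simplification of the BAN classifier mentioned in the text (a BAN network also models dependencies among features), and every mean, standard deviation and prior here stands in for values that would come from the physiological feature-emotion mapping database:

```python
import math

EMOTIONS = ["surprise", "fear", "disgust", "anger", "happiness", "sadness"]

def gaussian_pdf(x, mean, std):
    """Likelihood of a continuous physiological feature under the
    Gaussian assumption stated in the text."""
    return math.exp(-((x - mean) ** 2) / (2 * std ** 2)) / (std * math.sqrt(2 * math.pi))

def emotion_posterior(features, priors, params):
    """Bayesian posterior over the 6 emotions from continuous features.
    `params[emotion][feature]` holds (mean, std) learned from the
    mapping database; conditional independence of the features is an
    illustrative simplification."""
    scores = {}
    for e in EMOTIONS:
        s = priors[e]
        for name, value in features.items():
            mean, std = params[e][name]
            s *= gaussian_pdf(value, mean, std)
        scores[e] = s
    total = sum(scores.values())
    return {e: s / total for e, s in scores.items()}
```

The normalized output is exactly the shape the fusion stage expects: a probability vector over the six emotional states.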
In this embodiment, as shown in Fig. 3, the emotional state of the user may be obtained by performing emotion fusion analysis on the obtained facial emotion probability vector and physiological-feature emotion probability vector with the multi-source emotion fusion analysis module in the intelligent mobile robot. The specific steps may include:
performing regression analysis on the obtained facial emotion probability vector and physiological-feature emotion probability vector to obtain the correlation between the facial features and the physiological features, and the conditional probabilities relating the two feature sets to the target variable;
the intelligent mobile robot is further used for performing decision-level fusion analysis with evidence reasoning theory based on the Gordon-Shortliffe algorithm, according to the obtained correlation between the facial features and the physiological features and the conditional probabilities relating the two feature sets to the target variable, to obtain the physiological-facial based emotional state of the user; the emotional state obtained in this way has higher emotion recognition accuracy.
In this embodiment, the target variable is the emotional state corresponding to the features, and discretized values are assigned to the different emotional states so that the emotional state can be calculated. As an example of the conditional probability relating the two feature sets to the target variable: let A be the positions of the facial feature points and B a statistical feature of the heart rate (such as its variance or standard deviation); the probability of a given emotional state when A and B occur together is known from prior computation over the database, and this conditional probability can be used in the evidence-reasoning fusion analysis. The combination is, of course, not limited to A and B.
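The decision-level fusion step can be illustrated with Dempster's rule of combination, the core operation of evidence-reasoning theory. The Gordon-Shortliffe method named in the text is an approximation designed for hierarchical hypothesis spaces; for a flat frame of six singleton emotional states, combination reduces to the normalised element-wise product sketched below. The example mass values are hypothetical.

```python
EMOTIONS = ["surprise", "fear", "disgust", "anger", "happiness", "sadness"]

def combine_evidence(m_face, m_phys):
    """Dempster's rule of combination for two mass functions whose focal
    elements are the six singleton emotional states.  With singletons only,
    the combined mass is the normalised element-wise product; the normaliser
    (1 - K) measures the agreement between facial and physiological evidence,
    K being the conflict mass."""
    joint = {e: m_face[e] * m_phys[e] for e in EMOTIONS}
    agreement = sum(joint.values())  # = 1 - K
    if agreement == 0:
        raise ValueError("totally conflicting evidence, cannot combine")
    return {e: v / agreement for e, v in joint.items()}

def decide(mass):
    """Pick the emotional state with the largest combined belief."""
    return max(mass, key=mass.get)
```

When both sources favour the same state, combination sharpens the belief in it, which is why the fused result can reach higher recognition accuracy than either source alone.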
In the foregoing embodiment of the multi-source emotion recognition robot based on the body area network, the intelligent mobile robot is further configured to generate a corresponding behavior instruction according to the obtained physiological-facial based emotional state of the user, so as to realize emotional interaction between the intelligent mobile robot and the user.
In this embodiment, after the emotional state of the user is obtained, the multi-source emotion fusion analysis module generates a corresponding behavior instruction according to the physiological-facial based emotional state and controls the interaction device in the intelligent mobile robot to execute the instruction, thereby carrying out emotional interaction with the user.
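A minimal sketch of the behavior-instruction step: the mapping below from fused emotional state to instruction is purely illustrative, as the patent does not specify concrete instructions.

```python
# Hypothetical mapping from the recognised emotional state to a behavior
# instruction for the robot's interaction device; the instruction names are
# illustrative placeholders, not taken from the patent.
BEHAVIOR_TABLE = {
    "surprise":  "play_calm_tone",
    "fear":      "approach_slowly_and_reassure",
    "disgust":   "increase_distance",
    "anger":     "lower_volume_and_soothe",
    "happiness": "mirror_positive_gesture",
    "sadness":   "offer_comfort_dialogue",
}

def behavior_instruction(emotional_state):
    """Map the fused emotional state to a behavior instruction,
    falling back to an idle behavior for unrecognised states."""
    return BEHAVIOR_TABLE.get(emotional_state, "idle")
```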
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
While the foregoing is directed to the preferred embodiment of the present invention, it will be understood by those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the invention as defined in the appended claims.
Claims (3)
1. A multi-source emotion recognition robot based on a body area network, characterized by comprising: a body area network wearable device and an intelligent mobile robot;
the body area network wearable device is used for acquiring physiological information of a user and transmitting the acquired physiological information to the intelligent mobile robot;
the intelligent mobile robot is configured to acquire the facial information of the user, extract emotional features from the acquired facial information and the received physiological information, and perform emotion fusion analysis on the extracted emotional features to obtain the physiological-facial based emotional state of the user;
wherein the emotional state comprises: surprise, fear, disgust, anger, happiness and sadness;
wherein the body area network wearable device comprises: a somatosensory temperature sensor sub-node, a conductivity sensor sub-node, a heart rate sensor sub-node and a sink node;
the somatosensory temperature sensor sub-node comprises: a somatosensory temperature sensor for collecting the somatosensory temperature of the user in real time, and a first Bluetooth communication module for sending the collected somatosensory temperature of the user to the sink node;
the conductivity sensor sub-node comprises: a conductivity sensor for collecting the conductivity of the user's skin sweat in real time, and a second Bluetooth communication module for sending the collected conductivity of the user's skin sweat to the sink node;
the heart rate sensor sub-node comprises: a heart rate sensor for collecting the heart rate of the user in real time, and a third Bluetooth communication module for sending the collected heart rate of the user to the sink node;
the sink node comprises: a fourth Bluetooth communication module for transmitting the received somatosensory temperature of the user, conductivity of the user's skin sweat and heart rate of the user to the intelligent mobile robot;
the heart rate sensor is a patch-type heart rate sensor worn over the wrist artery;
the conductivity sensor is a bracelet and is worn on the wrist;
the somatosensory temperature sensor is worn on the inner side of the upper arm;
the intelligent mobile robot is configured to extract facial expression features from the acquired user facial information based on a facial expression recognition algorithm, and to obtain a facial emotion probability vector using a Bayesian classification algorithm based on the prior knowledge of an expression feature-emotion mapping database pre-stored in a local database, wherein the facial expression features are the positions of labeled feature points;
the intelligent mobile robot is configured to obtain corresponding statistical feature values from the received physiological information, and to obtain a physiological-feature emotion probability vector using a Bayesian classification algorithm based on the prior knowledge of a physiological feature-emotion mapping database pre-stored in the local database, wherein the physiological-feature emotion probability vector is formed by the probabilities of six emotional states, namely surprise, fear, disgust, anger, happiness and sadness, derived from the physiological features;
the intelligent mobile robot is configured to perform regression analysis on the obtained facial emotion probability vector and physiological-feature emotion probability vector to obtain the correlation between the facial features and the physiological features and the conditional probabilities relating the two feature sets to target variables, wherein a target variable is the emotional state corresponding to the features;
the intelligent mobile robot is further configured to perform decision-level fusion analysis with evidence reasoning theory based on the Gordon-Shortliffe algorithm, according to the obtained correlation between the facial features and the physiological features and the conditional probabilities relating the two feature sets to the target variable, to obtain the emotional state of the user;
the intelligent mobile robot is configured to acquire the user facial information using a face recognition algorithm when the user is stationary and the user's face is within the shooting range of the camera of the intelligent mobile robot;
the intelligent mobile robot is further configured, when the user is moving, to control its own movement according to the head movement track of the user using a mean-shift (average displacement) algorithm, and to perform face tracking along the head movement track of the user so as to acquire the user facial information.
2. The body area network-based multi-source emotion recognition robot of claim 1, wherein the heart rate sensor sub-node further comprises:
a digital-analog circuit for pre-processing the heart rate signal collected by the heart rate sensor before the signal is sent out, the pre-processing comprising: amplification, shaping and filtering operations.
3. The body area network-based multi-source emotion recognition robot of claim 1, wherein the intelligent mobile robot is further configured to generate a corresponding behavior instruction according to the obtained physiological-facial based emotional state of the user, so as to realize emotional interaction between the intelligent mobile robot and the user.
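Claim 2 above describes amplification, shaping and filtering of the raw heart-rate signal. The claim concerns an analog circuit, but as a rough software analogue of the filtering stage, a moving-average smoother could look like:

```python
def moving_average(signal, window=5):
    """Simple digital smoothing filter: a software stand-in for the
    amplify/shape/filter pre-processing described for the heart rate
    sensor sub-node (the claim itself describes analog circuitry)."""
    if window < 1 or window > len(signal):
        raise ValueError("window must be between 1 and len(signal)")
    out = []
    for i in range(len(signal) - window + 1):
        out.append(sum(signal[i:i + window]) / window)  # mean of each window
    return out
```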
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710181535.7A CN107066956B (en) | 2017-03-24 | 2017-03-24 | Multisource emotion recognition robot based on body area network |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107066956A CN107066956A (en) | 2017-08-18 |
CN107066956B true CN107066956B (en) | 2020-06-19 |
Family
ID=59620430
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710181535.7A Expired - Fee Related CN107066956B (en) | 2017-03-24 | 2017-03-24 | Multisource emotion recognition robot based on body area network |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107066956B (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109186778A (en) * | 2018-09-30 | 2019-01-11 | 深圳和呈睿国际技术有限公司 | Heat source reminding method, system, readable storage medium and companion robot |
CN109670406B (en) * | 2018-11-25 | 2023-06-20 | 华南理工大学 | Non-contact emotion recognition method for game user by combining heart rate and facial expression |
CN110334626B (en) * | 2019-06-26 | 2022-03-04 | 北京科技大学 | Online learning system based on emotional state |
CN112017403B (en) * | 2020-09-15 | 2022-02-01 | 青岛联合创智科技有限公司 | Community-house integrated intelligent service electronic board |
CN112990067A (en) * | 2021-03-31 | 2021-06-18 | 上海理工大学 | Robot intelligent emotion recognition and cure method for solitary people |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101887721A (en) * | 2010-07-19 | 2010-11-17 | 东南大学 | Electrocardiosignal and voice signal-based bimodal emotion recognition method |
CN102968550A (en) * | 2012-10-18 | 2013-03-13 | 上海交通大学无锡研究院 | Human health unified management system for community based on body area network |
CN103413113A (en) * | 2013-01-15 | 2013-11-27 | 上海大学 | Intelligent emotional interaction method for service robot |
CN104305561A (en) * | 2014-09-30 | 2015-01-28 | 肖南 | Emotion wearable system and emotion judging method |
CN105082150A (en) * | 2015-08-25 | 2015-11-25 | 国家康复辅具研究中心 | Robot man-machine interaction method based on user mood and intension recognition |
TW201607511A (en) * | 2014-08-29 | 2016-03-01 | 國立臺中教育大學 | Integration of multi-physiological signals for developing emotion recognition engine system and method |
CN105976809A (en) * | 2016-05-25 | 2016-09-28 | 中国地质大学(武汉) | Voice-and-facial-expression-based identification method and system for dual-modal emotion fusion |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10776710B2 (en) * | 2015-03-24 | 2020-09-15 | International Business Machines Corporation | Multimodal data fusion by hierarchical multi-view dictionary learning |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20200619 |
|