CN111951930A - Emotion identification system based on big data - Google Patents

Emotion identification system based on big data

Info

Publication number
CN111951930A
Authority
CN
China
Prior art keywords: unit, emotion, user, face, data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010837903.0A
Other languages
Chinese (zh)
Other versions
CN111951930B (en)
Inventor
Chen Xiao (陈霄)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CHINESE FOOD ANHONG (GUANGDONG) HEALTH INDUSTRY Co.,Ltd.
Original Assignee
Chen Xiao (陈霄)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chen Xiao (陈霄)
Priority to CN202010837903.0A (granted as CN111951930B)
Priority to CN202110882961.XA (granted as CN113724838B)
Publication of CN111951930A
Application granted
Publication of CN111951930B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/70 ICT specially adapted for therapies or health-improving plans relating to mental therapies, e.g. psychological therapy or autogenous training
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The invention discloses an emotion identification system based on big data, belonging to the technical field of big data, which comprises a feature processing module, an emotion identification module and an identification determination module. The feature processing module is used for processing the acquired face information features; the emotion identification module is used for identifying the emotion of the user according to the digitized face information processed by the feature processing module; the identification determination module is used for determining the emotion identification result of the emotion identification module. The system is scientific, reasonable, safe and convenient to use. It uses the face information modeling unit and the coordinate system establishing unit to establish a two-dimensional model of the user's face and to extract positioning information for its feature positions, so that changes in the user's face can be analyzed numerically; because the face changes noticeably when emotion is low, comparing those changes with historical data in a database makes the identification of the user's emotion more accurate.

Description

Emotion identification system based on big data
Technical Field
The invention relates to the technical field of big data, in particular to an emotion identification system based on big data.
Background
With the continuous progress of society and the continuous development of science and technology, working pressure keeps increasing, which affects the emotions of today's young people; low emotion in turn reduces working efficiency, so improving emotion can effectively improve working efficiency. Existing emotion identification has the following problems:
1. the emotion identification accuracy is low, and the emotion of the user cannot be identified through the digitalized information;
2. the existing emotion identification cannot verify the emotion after identification, so that the final identification result cannot reach the emotion expectation of a user;
therefore, an emotion identification system based on big data is urgently needed to solve the above problems.
Disclosure of Invention
The invention aims to provide an emotion identification system based on big data so as to solve the problems in the prior art.
In order to achieve the purpose, the invention provides the following technical scheme: an emotion identification system based on big data comprises a feature processing module, an emotion identification module, an identification determination module and a suggestion reminding module;
the characteristic processing module is used for processing the collected human face information characteristics to enable the human face information to be digitalized and facilitate the analysis and identification of the emotion of the user;
the emotion identification module is used for identifying the emotion of the user according to the digitalized human face information processed by the feature processing module;
the identification determining module is used for determining the emotion identification result of the emotion identification module, so that the accuracy of the system for emotion identification is improved;
the suggestion reminding module is used for giving corresponding suggestions when the emotion of the user is identified to be low, helping the user improve the emotion and improving the working and learning efficiency;
the output end of the characteristic processing module is electrically connected with the input end of the emotion identification module, the output end of the emotion identification module is electrically connected with the input end of the identification determination module, and the output end of the identification determination module is electrically connected with the input end of the suggestion reminding module.
According to the technical scheme, the feature processing module comprises a face information acquisition unit, a face information modeling unit, a feature position positioning unit and a coordinate system establishing unit;
the face information acquisition unit is used for acquiring the face information of the user, so that the emotion of the user can be identified according to the current and historical face information; the face information modeling unit is used for establishing a two-dimensional model of the face information, so that the face information can be represented as data and analyzed more easily; the characteristic position positioning unit is used for positioning the facial characteristic positions of the user, so that the emotion change of the user can be determined according to data changes at those positions; the coordinate system establishing unit is used for establishing a two-dimensional coordinate system for the two-dimensional model of the face information, so as to facilitate data positioning of the facial characteristic positions;
the output end of the face information acquisition unit is electrically connected with the input end of the face information modeling unit, the output end of the coordinate system establishing unit is electrically connected with the input end of the face information modeling unit, and the output end of the face information modeling unit is electrically connected with the input end of the characteristic position positioning unit.
According to the technical scheme, the emotion identification module comprises a face data analysis unit, a face information comparison unit and a database;
the face data analysis unit is used for identifying and analyzing the emotion of the user according to the face characteristic information data of the user; the face information comparison unit is used for comparing the analysis data of the face data analysis unit with historical analysis data stored in a database to determine the emotion of the user; the database is used for storing and marking the facial feature information data of the user so as to facilitate comparison of the facial feature information data and identification of emotion in the later period;
the output end of the face data analysis unit is electrically connected with the input end of the face information comparison unit, and the output end of the database is electrically connected with the input end of the face information comparison unit.
According to the technical scheme, the identification determining module comprises a conclusion verifying unit, an intelligent information collecting bracelet, a popup inquiring unit and an emotion identification marking unit;
the conclusion verification unit is used for verifying the emotion identification result according to the comparison result of the face information comparison unit and the information acquired by the intelligent information acquisition bracelet, so that the accuracy of emotion identification is improved; the intelligent information acquisition bracelet is used for acquiring body temperature information, heart rate information and blood pressure information of a user, so that the emotion identification result of the user can be further determined; the popup inquiring unit is used for inquiring whether the emotion identification result of the user is correct or not in a popup mode so as to determine the accuracy of the system for the emotion identification of the user, and therefore the identification condition of the system can be adjusted according to the determination result; the emotion identification marking unit is used for marking the facial feature data after emotion identification so as to record the emotion change frequency of the user and give corresponding suggestions and opinions while continuously expanding the database;
the face information comparison unit comprises an output end, an input end, an output end and an input end, wherein the output end of the face data analysis unit is electrically connected with the input end of the emotion identification marking unit, the output end of the face information comparison unit is electrically connected with the input end of the conclusion verification unit, the output end of the intelligent information collection bracelet is electrically connected with the input end of the conclusion verification unit, the output end of the conclusion verification unit is electrically connected with.
According to the technical scheme, the suggestion reminding module comprises an entertainment reminding unit, a rest reminding unit and a movement reminding unit;
the entertainment reminding unit reminds the user by pushing entertainment information and short entertainment videos to improve the user's mood; the rest reminding unit is used for reminding the user to pay attention to rest, avoiding low emotion caused by fatigue; the exercise reminding unit is used for reminding the user to stretch and exercise, avoiding low emotion caused by physical fatigue.
According to the technical scheme, a sedentariness detection unit is further installed inside the intelligent information acquisition bracelet to detect whether the user has been sitting for a long time; combined with the exercise reminding unit, it can effectively prevent low emotion caused by the fatigue of prolonged sitting.
According to the technical scheme, the face information acquisition unit is the camera of the user's mobile phone, which acquires the user's face information while the user uses the phone, so that emotion identification does not disturb the user's normal life. The face information modeling unit establishes a two-dimensional model from the acquired face information, and the coordinate system establishing unit assigns the model a two-dimensional rectangular coordinate system, so that every point in the model has a coordinate value and the face information can be analyzed numerically, which makes the emotion identification result more convincing. The feature position positioning unit locates the coordinate values of the feature positions of the user's face information, including a coordinate value set O of the eye contour, a coordinate value set P of the eyebrow points and a coordinate value set Q of the mouth contour.
According to the above technical scheme, the coordinate value set of the eye contour is O = {(X_1, Y_1), (X_2, Y_2), (X_3, Y_3), ..., (X_n, Y_n)}, the coordinate value set of the eyebrow points is P = {(M_1, N_1), (M_2, N_2), (M_3, N_3), ..., (M_m, N_m)}, and the coordinate value set of the mouth contour is Q = {(E_1, F_1), (E_2, F_2), (E_3, F_3), ..., (E_k, F_k)};

The face data analysis unit extracts from the eye contour set O the coordinate (X_max, Y_max) with the largest value on the Y axis and the coordinate (X_min, Y_min) with the smallest value on the Y axis, and calculates the difference between the maximum and minimum values on the Y axis according to the following formula:

L = Y_max - Y_min

wherein L represents the distance by which the eye contour is open;

The face data analysis unit calculates the sum S of the differences on the Y axis between adjacent coordinate values in the eyebrow point set P according to the following formula:

S = Σ|N_(i+1) - N_i|, i = 1, ..., m-1

wherein S represents the sum of the differences on the Y axis between adjacent coordinate values;

The face data analysis unit extracts from the mouth contour set Q the coordinate (E_min, F_min) with the smallest value on the E axis, the coordinate (E_max, F_max) with the largest value on the E axis, and the coordinate (E_mid, F_mid) with the largest value on the F axis, and calculates the droop of the mouth corners according to the following formulas:

L_min = F_mid - F_min

L_max = F_mid - F_max

wherein L_min and L_max are the differences on the F axis between the highest point of the mouth contour and the mouth corners on the two sides.
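For concreteness, the three metrics can be computed from landmark coordinates as in the following Python sketch, assuming each contour is supplied as a list of (x, y) pairs in the two-dimensional rectangular coordinate system; the function names are illustrative assumptions:

def eye_openness(eye_contour):
    # L = Y_max - Y_min: vertical opening of the eye contour.
    ys = [y for _, y in eye_contour]
    return max(ys) - min(ys)

def eyebrow_variation(eyebrow_points):
    # S: sum of Y-axis differences between adjacent eyebrow points.
    ns = [n for _, n in eyebrow_points]
    return sum(abs(ns[i + 1] - ns[i]) for i in range(len(ns) - 1))

def mouth_corner_droop(mouth_contour):
    # L_min, L_max: height of the mouth contour's highest point above each corner.
    _, f_min = min(mouth_contour, key=lambda p: p[0])   # leftmost corner (smallest E)
    _, f_max = max(mouth_contour, key=lambda p: p[0])   # rightmost corner (largest E)
    _, f_mid = max(mouth_contour, key=lambda p: p[1])   # highest point (largest F)
    return f_mid - f_min, f_mid - f_max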
According to the above technical solution, the face information comparison unit compares the data L, S, L_min and L_max analyzed by the face data analysis unit with the historical data stored in the database; among the data stored in the database, the data recorded when the user's emotion was low satisfy L ≤ a, S ≥ b, and L_min ≥ c and L_max ≥ c, where a, b and c are thresholds derived from analysis of the historical data stored in the database; thus:

when L ≤ a, or S ≥ b, or L_min ≥ c and L_max ≥ c, the user's current emotion is low; the conclusion verification unit verifies the emotion identification of the user according to the detection data of the intelligent information acquisition bracelet, and the suggestion reminding module reminds the user to take a rest so as to improve the emotion.
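A minimal sketch of this decision rule, assuming the thresholds a, b and c have already been derived from the stored low-emotion history (any one matching condition flags low emotion):

def is_low_emotion(L, S, L_min, L_max, a, b, c):
    # Conditions learned from historical low-emotion records:
    # narrowed eyes, strong eyebrow variation, or drooping mouth corners.
    return L <= a or S >= b or (L_min >= c and L_max >= c)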
According to the technical scheme, the popup inquiring unit inquires in a popup mode whether the identification of the system is correct;
when the user confirms that the identification is correct, the emotion identification marking unit marks the currently identified data, labels the facial feature information data as data recorded when the emotion was low, and stores them in the database as a data source for the next emotion identification;
when the user indicates that the identification is incorrect, the emotion identification marking unit does not mark the current data.
Compared with the prior art, the invention has the beneficial effects that:
1. The invention uses the face information modeling unit and the coordinate system establishing unit to build a two-dimensional model of the user's face and to extract positioning information for its feature positions, so that changes in the user's face can be analyzed numerically. Because facial changes are pronounced when emotion is low, comparing those changes with historical data in the database makes the emotion identification more accurate; meanwhile, verifying the identification result against the data collected by the intelligent information acquisition bracelet increases the success rate of emotion identification.
2. The invention asks, through a popup inquiry, whether the emotion identification result is correct, and uses the emotion identification marking unit to mark and store the facial feature data of confirmed identifications. This continuously expands the database, increases the reference data available for comparison, and effectively improves the success rate of later emotion identification.
3. The invention reminds the user to take a rest through the suggestion reminding module whenever the user's emotion is identified as low, which effectively improves the user's emotion and the efficiency of work and study.
Drawings
FIG. 1 is a schematic diagram of a module structure of a big data-based emotion identification system according to the present invention;
FIG. 2 is a schematic diagram of a module connection structure of an emotion identification system based on big data.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in FIG. 1 and FIG. 2, the emotion identification system based on big data comprises a feature processing module, an emotion identification module, an identification determination module and a suggestion reminding module;
the characteristic processing module is used for processing the collected human face information characteristics to enable the human face information to be digitalized and facilitate the analysis and identification of the emotion of the user;
the emotion identification module is used for identifying the emotion of the user according to the digitalized human face information processed by the feature processing module;
the identification determining module is used for determining the emotion identification result of the emotion identification module, so that the accuracy of the system for emotion identification is improved;
the suggestion reminding module is used for giving corresponding suggestions when the emotion of the user is identified to be low, helping the user improve the emotion and improving the working and learning efficiency;
the output end of the characteristic processing module is electrically connected with the input end of the emotion identification module, the output end of the emotion identification module is electrically connected with the input end of the identification determination module, and the output end of the identification determination module is electrically connected with the input end of the suggestion reminding module.
The characteristic processing module comprises a face information acquisition unit, a face information modeling unit, a characteristic position positioning unit and a coordinate system establishing unit;
the face information acquisition unit is used for acquiring the face information of the user, so that the emotion of the user can be identified according to the current and historical face information; the face information modeling unit is used for establishing a two-dimensional model of the face information, so that the face information can be represented as data and analyzed more easily; the characteristic position positioning unit is used for positioning the facial characteristic positions of the user, so that the emotion change of the user can be determined according to data changes at those positions; the coordinate system establishing unit is used for establishing a two-dimensional coordinate system for the two-dimensional model of the face information, so as to facilitate data positioning of the facial characteristic positions;
the output end of the face information acquisition unit is electrically connected with the input end of the face information modeling unit, the output end of the coordinate system establishing unit is electrically connected with the input end of the face information modeling unit, and the output end of the face information modeling unit is electrically connected with the input end of the characteristic position positioning unit.
The emotion identification module comprises a face data analysis unit, a face information comparison unit and a database;
the face data analysis unit is used for identifying and analyzing the emotion of the user according to the face characteristic information data of the user; the face information comparison unit is used for comparing the analysis data of the face data analysis unit with historical analysis data stored in a database to determine the emotion of the user; the database is used for storing and marking the facial feature information data of the user so as to facilitate comparison of the facial feature information data and identification of emotion in the later period;
the output end of the face data analysis unit is electrically connected with the input end of the face information comparison unit, and the output end of the database is electrically connected with the input end of the face information comparison unit.
The identification determining module comprises a conclusion verifying unit, an intelligent information acquisition bracelet, a popup inquiring unit and an emotion identification marking unit;
the conclusion verification unit is used for verifying the emotion identification result according to the comparison result of the face information comparison unit and the information acquired by the intelligent information acquisition bracelet, so that the accuracy of emotion identification is improved; the intelligent information acquisition bracelet is used for acquiring body temperature information, heart rate information and blood pressure information of a user, so that the emotion identification result of the user can be further determined; the popup inquiring unit is used for inquiring whether the emotion identification result of the user is correct or not in a popup mode so as to determine the accuracy of the system for the emotion identification of the user, and therefore the identification condition of the system can be adjusted according to the determination result; the emotion identification marking unit is used for marking the facial feature data after emotion identification so as to record the emotion change frequency of the user and give corresponding suggestions and opinions while continuously expanding the database;
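As a hedged illustration of the conclusion verification step, the sketch below combines the face-based result with bracelet vitals; the vital-sign thresholds are invented for illustration, since the patent names the signals (body temperature, heart rate, blood pressure) but specifies no values:

def verify_low_emotion(face_says_low, temperature_c, heart_rate_bpm, systolic_mmhg):
    # Corroborate a low-emotion face reading with physiological signals
    # from the bracelet. All numeric thresholds here are assumptions.
    physiological_flag = (heart_rate_bpm > 90
                          or systolic_mmhg > 135
                          or temperature_c > 37.3)
    return face_says_low and physiological_flag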
the output of face data analysis unit electric connection mood appraisal mark unit's input, the input of mood appraisal mark unit's output electric connection database, the input of the output electric connection conclusion verification unit of face information comparison unit, the input of the output electric connection conclusion verification unit of intelligent information collection bracelet, the output electric connection popup inquiry unit's of conclusion verification unit input, the output electric connection face data analysis unit's of popup inquiry unit input.
The suggestion reminding module comprises an entertainment reminding unit, a rest reminding unit and a movement reminding unit;
the entertainment reminding unit reminds the user through the mode of propelling movement entertainment information and entertainment short video, improves user's mood, and the rest reminding unit is used for reminding the user to pay attention to the rest, avoids tired to lead to the mood to fall, and the motion reminding unit is used for reminding the user to pay attention to the motion activity muscles and bones, avoids the mood that the health fatigue brought to fall.
A sedentariness detection unit is further installed inside the intelligent information acquisition bracelet to detect whether the user has been sitting for a long time; combined with the exercise reminding unit, it can effectively prevent low emotion caused by the fatigue of prolonged sitting.
The face information acquisition unit is the camera of the user's mobile phone, which acquires the user's face information while the user uses the phone, so that emotion identification does not disturb the user's normal life. The face information modeling unit establishes a two-dimensional model from the acquired face information, and the coordinate system establishing unit assigns the model a two-dimensional rectangular coordinate system, so that every point in the model has a coordinate value and the face information can be analyzed numerically, which makes the emotion identification result more convincing. The feature position positioning unit locates the coordinate values of the feature positions of the user's face information, including a coordinate value set O of the eye contour, a coordinate value set P of the eyebrow points and a coordinate value set Q of the mouth contour.
The coordinate value set of the eye contour is O = {(X_1, Y_1), (X_2, Y_2), (X_3, Y_3), ..., (X_n, Y_n)}, the coordinate value set of the eyebrow points is P = {(M_1, N_1), (M_2, N_2), (M_3, N_3), ..., (M_m, N_m)}, and the coordinate value set of the mouth contour is Q = {(E_1, F_1), (E_2, F_2), (E_3, F_3), ..., (E_k, F_k)};

The face data analysis unit extracts from the eye contour set O the coordinate (X_max, Y_max) with the largest value on the Y axis and the coordinate (X_min, Y_min) with the smallest value on the Y axis, and calculates the difference between the maximum and minimum values on the Y axis according to the following formula:

L = Y_max - Y_min

wherein L represents the distance by which the eye contour is open;

The face data analysis unit calculates the sum S of the differences on the Y axis between adjacent coordinate values in the eyebrow point set P according to the following formula:

S = Σ|N_(i+1) - N_i|, i = 1, ..., m-1

wherein S represents the sum of the differences on the Y axis between adjacent coordinate values;

The face data analysis unit extracts from the mouth contour set Q the coordinate (E_min, F_min) with the smallest value on the E axis, the coordinate (E_max, F_max) with the largest value on the E axis, and the coordinate (E_mid, F_mid) with the largest value on the F axis, and calculates the droop of the mouth corners according to the following formulas:

L_min = F_mid - F_min

L_max = F_mid - F_max

wherein L_min and L_max are the differences on the F axis between the highest point of the mouth contour and the mouth corners on the two sides.
The face information comparison unit compares the data L, S, L_min and L_max analyzed by the face data analysis unit with the historical data stored in the database; among the data stored in the database, the data recorded when the user's emotion was low satisfy L ≤ a, S ≥ b, and L_min ≥ c and L_max ≥ c, where a, b and c are thresholds derived from analysis of the historical data stored in the database; thus:

when L ≤ a, or S ≥ b, or L_min ≥ c and L_max ≥ c, the user's current emotion is low; the conclusion verification unit verifies the emotion identification of the user according to the detection data of the intelligent information acquisition bracelet, and the suggestion reminding module reminds the user to take a rest so as to improve the emotion.
The popup inquiring unit inquires in a popup mode whether the identification of the system is correct;
when the user confirms that the identification is correct, the emotion identification marking unit marks the currently identified data, labels the facial feature information data as data recorded when the emotion was low, and stores them in the database as a data source for the next emotion identification;
when the user indicates that the identification is incorrect, the emotion identification marking unit does not mark the current data.
Example one:
set of coordinate values of eye contour O { (X)1,Y1),(X2,Y2),(X3,Y3),…,(Xn,Yn) The coordinate value set P { (M) of each point of the eyebrow1,N1),(M2,N2),(M3,N3),…,(Mm,Nm) -a set of coordinate values Q { (E) for the mouth contour1,F1),(E2,F2),(E3,F3),…,(Ek,Fk)};
The face data analysis unit extracts a coordinate value (X) having a maximum value on the Y axis from the coordinate value set O of the eye contourmin,Ymin) The face data analysis unit extracts the coordinate value (X) having the smallest value on the Y axis from the coordinate value set O of the eye contour (1.5,1.8)max,Ymax) (1.5, 0.72), the difference between the maximum and minimum values on the Y axis is calculated according to the following formula:
L=Ymax-Ymin=1.8-0.72=1.08;
wherein L ═ 1.08 denotes the distance at which the eye contour is open;
the face data analysis unit calculates the sum of the difference values of two adjacent coordinate values on the Y axis in the coordinate value set P of each point of the eyebrow according to the following formula:
Figure BDA0002640370070000141
wherein the content of the first and second substances,
Figure BDA0002640370070000142
representing the sum of the difference values of two adjacent coordinate values on the Y axis;
the face data analysis unit extracts a coordinate value (E) having the smallest value on the E axis from the coordinate value set Q of the mouth contourmin,Fmin) (2.2, -6.2) and the coordinate value (E) which has the largest value on the E-axismax,Fmax) (5.4, -6.2), the face data analysis unit extracts a coordinate value (E) having the largest value on the F-axis from the coordinate value set Q of the mouth contourmid,Fmid) (3.3, -5.3), the face data analysis unit calculates the distance of the mouth corner depression according to the following formula:
Lmin=Fmid-Fmin=-5.3+6.2=0.9;
Lmax=Fmid-Fmax=-5.3+6.2=0.9;
wherein L isminAnd LmaxThe difference value of the maximum coordinate values on the mouth angles on the two sides and the F axis is respectively obtained.
The face information comparison unit compares the data L analyzed by the face data analysis unit,
Figure BDA0002640370070000151
LminAnd LmaxComparing with historical data stored in a database, wherein in the data stored in the database, L is less than or equal to a in the data with low emotion of the user,
Figure BDA0002640370070000152
Lmin≤c,Lmaxc, wherein a 1.11, b 1.54, and c 0.95 are thresholds, respectively, derived from analysis of historical data stored in the database, such that:
l > a or
Figure BDA0002640370070000153
Or LminC is less than or equal to cLmaxC represents that the current emotion of the user is normal, and the conclusion verification unit verifies the emotion identification of the user according to detection data of the intelligent information acquisition bracelet.
Example two:
set of coordinate values of eye contour O { (X)1,Y1),(X2,Y2),(X3,Y3),…,(Xn,Yn) The coordinate value set P { (M) of each point of the eyebrow1,N1),(M2,N2),(M3,N3),…,(Mm,Nm) -a set of coordinate values Q { (E) for the mouth contour1,F1),(E2,F2),(E3,F3),…,(Ek,Fk)};
The face data analysis unit extracts a coordinate value (X) having a maximum value on the Y axis from the coordinate value set O of the eye contourmin,Ymin) The face data analysis unit extracts the coordinate value (X) having the smallest value on the Y axis from the coordinate value set O of the eye contour (1.5,1.8)max,Ymax) (1.5, 0.8), the difference between the maximum and minimum values on the Y axis is calculated according to the following formula:
L=Ymax-Ymin=1.8-0.8=1;
wherein L ═ 1 denotes the distance at which the eye contour is open;
the face data analysis unit calculates the sum of the difference values of two adjacent coordinate values on the Y axis in the coordinate value set P of each point of the eyebrow according to the following formula:
Figure BDA0002640370070000161
wherein the content of the first and second substances,
Figure BDA0002640370070000162
indicating the difference between two adjacent coordinate values on the Y axisAnd;
the face data analysis unit extracts a coordinate value (E) having the smallest value on the E axis from the coordinate value set Q of the mouth contourmin,Fmin) (2.2, -6.2) and the coordinate value (E) which has the largest value on the E-axismax,Fmax) (5.4, -6.2), the face data analysis unit extracts a coordinate value (E) having the largest value on the F-axis from the coordinate value set Q of the mouth contourmid,Fmid) (3.3, -5.3), the face data analysis unit calculates the distance of the mouth corner depression according to the following formula:
Lmin=Fmid-Fmin=-5.3+6.2=0.9;
Lmax=Fmid-Fmax=-5.3+6.2=0.9;
wherein L isminAnd LmaxThe difference value of the maximum coordinate values on the mouth angles on the two sides and the F axis is respectively obtained.
The face information comparison unit compares the data L analyzed by the face data analysis unit,
Figure BDA0002640370070000163
LminAnd LmaxComparing with historical data stored in a database, wherein in the data stored in the database, L is less than or equal to a in the data with low emotion of the user,
Figure BDA0002640370070000164
Lmin≤c,Lmaxc, wherein a 1.11, b 1.54, and c 0.95 are thresholds, respectively, derived from analysis of historical data stored in the database, such that:
l is less than or equal to a
Figure BDA0002640370070000165
The current emotion of the user is low, the conclusion verification unit collects detection data of the bracelet according to the intelligent information, the emotion identification of the user is verified, and the suggestion module reminds the user to pay attention to rest and improve the emotion.
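As a quick numeric check of the two examples (the eyebrow sum S is omitted because its value survives only as an image in the original):

print(1.8 - 0.72)     # 1.08: eye opening L in example one
print(1.8 - 0.8)      # 1.0:  eye opening L in example two; 1.0 <= a = 1.11
print(-5.3 - (-6.2))  # 0.9:  mouth corner droop L_min = L_max in both examples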
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned.

Claims (10)

1. An emotion identification system based on big data, characterized in that: the system comprises a feature processing module, an emotion identification module, an identification determination module and a suggestion reminding module;
the characteristic processing module is used for processing the acquired human face information characteristics;
the emotion identification module is used for identifying the emotion of the user according to the digitalized human face information processed by the feature processing module;
the identification determination module is used for determining the emotion identification result of the emotion identification module;
the suggestion reminding module is used for giving corresponding suggestions when the emotion of the user is identified to be low;
the output end of the characteristic processing module is electrically connected with the input end of the emotion identification module, the output end of the emotion identification module is electrically connected with the input end of the identification determination module, and the output end of the identification determination module is electrically connected with the input end of the suggestion reminding module.
2. A big data based emotion recognition system as claimed in claim 1, wherein: the characteristic processing module comprises a face information acquisition unit, a face information modeling unit, a characteristic position positioning unit and a coordinate system establishing unit;
the face information acquisition unit is used for acquiring face information of a user; the face information modeling unit is used for establishing a two-dimensional model of the face information; the feature position positioning unit is used for positioning the face feature positions of the user; the coordinate system establishing unit is used for establishing a two-dimensional coordinate system for the face information two-dimensional model;
the output end of the face information acquisition unit is electrically connected with the input end of the face information modeling unit, the output end of the coordinate system establishing unit is electrically connected with the input end of the face information modeling unit, and the output end of the face information modeling unit is electrically connected with the input end of the characteristic position positioning unit.
3. A big-data based emotion recognition system as claimed in claim 2, wherein: the emotion identification module comprises a face data analysis unit, a face information comparison unit and a database;
the face data analysis unit is used for identifying and analyzing the emotion of the user according to the face characteristic information data of the user; the face information comparison unit is used for comparing the analysis data of the face data analysis unit with historical analysis data stored in a database to determine the emotion of the user; the database is used for storing and marking the facial feature information data of the user;
the output end of the face data analysis unit is electrically connected with the input end of the face information comparison unit, and the output end of the database is electrically connected with the input end of the face information comparison unit.
4. A big-data based emotion recognition system as claimed in claim 3, wherein: the identification determining module comprises a conclusion verifying unit, an intelligent information acquisition bracelet, a popup inquiring unit and an emotion identification marking unit;
the conclusion verification unit is used for verifying the emotion identification result according to the comparison result of the face information comparison unit and the information acquired by the intelligent information acquisition bracelet; the intelligent information acquisition bracelet is used for acquiring body temperature information, heart rate information and blood pressure information of a user; the popup inquiring unit is used for inquiring whether the emotion identification result of the user is correct or not in a popup mode; the emotion identification marking unit is used for marking the facial feature data after emotion identification;
the face information comparison unit comprises an output end, an input end, an output end and an input end, wherein the output end of the face data analysis unit is electrically connected with the input end of the emotion identification marking unit, the output end of the face information comparison unit is electrically connected with the input end of the conclusion verification unit, the output end of the intelligent information collection bracelet is electrically connected with the input end of the conclusion verification unit, the output end of the conclusion verification unit is electrically connected with.
5. The big-data based emotion recognition system of claim 4, wherein: the suggestion reminding module comprises an entertainment reminding unit, a rest reminding unit and a movement reminding unit;
the entertainment reminding unit reminds the user by pushing entertainment information and short entertainment videos to improve the emotion of the user, the rest reminding unit is used for reminding the user to pay attention to rest, and the exercise reminding unit is used for reminding the user to stretch and exercise.
6. The big-data based emotion recognition system of claim 5, wherein: a sedentariness detection unit is further installed inside the intelligent information acquisition bracelet and is used for detecting whether the user has been sitting for a long time.
7. A big-data based emotion recognition system as claimed in claim 6, wherein: the face information acquisition unit is the camera of the user's mobile phone, which acquires the user's face information while the user uses the phone; the face information modeling unit establishes a two-dimensional model from the acquired face information; the coordinate system establishing unit assigns the established two-dimensional model a two-dimensional rectangular coordinate system; and the feature position positioning unit is used for positioning the coordinate values of the feature positions of the user's face information, including a coordinate value set O of the eye contour, a coordinate value set P of the eyebrow points and a coordinate value set Q of the mouth contour.
8. A big-data based emotion recognition system as claimed in claim 6, wherein: the coordinate value set of the eye contour is O = {(X_1, Y_1), (X_2, Y_2), (X_3, Y_3), ..., (X_n, Y_n)}, the coordinate value set of the eyebrow points is P = {(M_1, N_1), (M_2, N_2), (M_3, N_3), ..., (M_m, N_m)}, and the coordinate value set of the mouth contour is Q = {(E_1, F_1), (E_2, F_2), (E_3, F_3), ..., (E_k, F_k)};

the face data analysis unit extracts from the eye contour set O the coordinate (X_max, Y_max) with the largest value on the Y axis and the coordinate (X_min, Y_min) with the smallest value on the Y axis, and calculates the difference between the maximum and minimum values on the Y axis according to the following formula:

L = Y_max - Y_min

wherein L represents the distance by which the eye contour is open;

the face data analysis unit calculates the sum S of the differences on the Y axis between adjacent coordinate values in the eyebrow point set P according to the following formula:

S = Σ|N_(i+1) - N_i|, i = 1, ..., m-1

wherein S represents the sum of the differences on the Y axis between adjacent coordinate values;

the face data analysis unit extracts from the mouth contour set Q the coordinate (E_min, F_min) with the smallest value on the E axis, the coordinate (E_max, F_max) with the largest value on the E axis, and the coordinate (E_mid, F_mid) with the largest value on the F axis, and calculates the droop of the mouth corners according to the following formulas:

L_min = F_mid - F_min

L_max = F_mid - F_max

wherein L_min and L_max are the differences on the F axis between the highest point of the mouth contour and the mouth corners on the two sides.
9. A big-data based emotion recognition system as claimed in claim 8, wherein: the face information comparison unit compares the data L, S, L_min and L_max analyzed by the face data analysis unit with the historical data stored in the database; among the data stored in the database, the data recorded when the user's emotion was low satisfy L ≤ a, S ≥ b, and L_min ≥ c and L_max ≥ c, where a, b and c are thresholds derived from analysis of the historical data stored in the database; thus:

when L ≤ a, or S ≥ b, or L_min ≥ c and L_max ≥ c, the user's current emotion is low; the conclusion verification unit verifies the emotion identification of the user according to the detection data of the intelligent information acquisition bracelet, and the suggestion reminding module reminds the user to pay attention to rest and improve the emotion.
10. A big-data based emotion recognition system as claimed in claim 9, wherein: the popup inquiring unit inquires in a popup mode whether the identification of the system is correct;

when the user confirms that the identification is correct, the emotion identification marking unit marks the currently identified data, labels the facial feature information data as data recorded when the emotion was low, and stores them in the database as a data source for the next emotion identification;

when the user indicates that the identification is incorrect, the emotion identification marking unit does not mark the current data.
CN202010837903.0A 2020-08-19 2020-08-19 Emotion identification system based on big data Active CN111951930B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010837903.0A CN111951930B (en) 2020-08-19 2020-08-19 Emotion identification system based on big data
CN202110882961.XA CN113724838B (en) 2020-08-19 2020-08-19 Emotion identification system based on big data

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010837903.0A CN111951930B (en) 2020-08-19 2020-08-19 Emotion identification system based on big data

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202110882961.XA Division CN113724838B (en) 2020-08-19 2020-08-19 Emotion identification system based on big data

Publications (2)

Publication Number Publication Date
CN111951930A true CN111951930A (en) 2020-11-17
CN111951930B CN111951930B (en) 2021-10-15

Family

ID=73358424

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202110882961.XA Active CN113724838B (en) 2020-08-19 2020-08-19 Emotion identification system based on big data
CN202010837903.0A Active CN111951930B (en) 2020-08-19 2020-08-19 Emotion identification system based on big data

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202110882961.XA Active CN113724838B (en) 2020-08-19 2020-08-19 Emotion identification system based on big data

Country Status (1)

Country Link
CN (2) CN113724838B (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115797523B (en) * 2023-01-05 2023-04-18 武汉创研时代科技有限公司 Virtual character processing system and method based on face motion capture technology

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101908149A (en) * 2010-07-06 2010-12-08 北京理工大学 Method for identifying facial expressions from human face image sequence
CN102970438A (en) * 2012-11-29 2013-03-13 广东欧珀移动通信有限公司 Automatic mobile phone alarming method and device
CN106580346A (en) * 2015-10-14 2017-04-26 松下电器(美国)知识产权公司 Emotion estimating method, and emotion estimating apparatus
CN107358218A (en) * 2017-07-24 2017-11-17 英锐科技(深圳)有限公司 Fatigue detection method and the fatigue detecting system using this method
CN109394207A (en) * 2018-08-17 2019-03-01 西安易朴通讯技术有限公司 Emotion identification method and system, electronic equipment
CN110215218A (en) * 2019-06-11 2019-09-10 北京大学深圳医院 A kind of wisdom wearable device and its mood identification method based on big data mood identification model
CN110321477A (en) * 2019-05-24 2019-10-11 平安科技(深圳)有限公司 Information recommendation method, device, terminal and storage medium
CN110765838A (en) * 2019-09-02 2020-02-07 合肥工业大学 Real-time dynamic analysis method for facial feature region for emotional state monitoring
US20200081535A1 (en) * 2018-09-07 2020-03-12 Hyundai Motor Company Emotion recognition apparatus and control method thereof

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10799186B2 (en) * 2016-02-12 2020-10-13 Newton Howard Detection of disease conditions and comorbidities
US11298062B2 (en) * 2017-02-01 2022-04-12 Conflu3Nce Ltd Multi-purpose interactive cognitive platform
CN110390048A (en) * 2019-06-19 2019-10-29 深圳壹账通智能科技有限公司 Information-pushing method, device, equipment and storage medium based on big data analysis
CN111312394B (en) * 2020-01-15 2023-09-29 东北电力大学 Psychological health assessment system based on combined emotion and processing method thereof


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112487474A (en) * 2020-11-26 2021-03-12 冉令峻 Private information protection system based on big data
CN112487474B (en) * 2020-11-26 2024-02-02 宁波英派尔科技有限公司 Secret information protection system based on big data

Also Published As

Publication number Publication date
CN113724838A (en) 2021-11-30
CN111951930B (en) 2021-10-15
CN113724838B (en) 2023-06-20

Similar Documents

Publication Title
CN103593598B (en) User's on-line authentication method and system based on In vivo detection and recognition of face
Luettin et al. Speechreading using shape and intensity information
CN105740780A (en) Method and device for human face in-vivo detection
CN111951930B (en) Emotion identification system based on big data
CN107103309A (en) A kind of sitting posture of student detection and correcting system based on image recognition
CN103902978A (en) Face detection and identification method
CN106203256A (en) A kind of low resolution face identification method based on sparse holding canonical correlation analysis
US20230237694A1 (en) Method and system for detecting children's sitting posture based on face recognition of children
CN110969073B (en) Facial expression recognition method based on feature fusion and BP neural network
CN106529377A (en) Age estimating method, age estimating device and age estimating system based on image
CN102831408A (en) Human face recognition method
WO2021248815A1 (en) High-precision child sitting posture detection and correction method and device
CN116269355B (en) Safety monitoring system based on figure gesture recognition
CN102799872A (en) Image processing method based on face image characteristics
CN104679967B (en) A kind of method for judging psychological test reliability
CN112801859A (en) Cosmetic mirror system with cosmetic guiding function
CN110148092A (en) The analysis method of teenager's sitting posture based on machine vision and emotional state
CN111062021B (en) Method and device for identity authentication based on wearable equipment
CN113920568A (en) Face and human body posture emotion recognition method based on video image
CN114255508A (en) OpenPose-based student posture detection analysis and efficiency evaluation method
Bi et al. SmartGe: identifying pen-holding gesture with smartwatch
Lanz et al. Automated classification of therapeutic face exercises using the Kinect
CN113034322B (en) Internet-based online education supervision system and method
CN109886091A (en) Three-dimensional face expression recognition methods based on Weight part curl mode
CN104850225A (en) Activity identification method based on multi-level fusion

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210907

Address after: Room 2501, No. 163, Huangpu Avenue West, Tianhe District, Guangzhou City, Guangdong Province

Applicant after: CHINESE FOOD ANHONG (GUANGDONG) HEALTH INDUSTRY Co.,Ltd.

Address before: 215000 No. 511 Yushan Road, high tech Zone, Suzhou, Jiangsu

Applicant before: Chen Xiao

GR01 Patent grant