CN107392124A - Emotion identification method, apparatus, terminal and storage medium - Google Patents


Info

Publication number
CN107392124A
CN107392124A (application CN201710557733.9A)
Authority
CN
China
Prior art keywords
user
physiologic parameter
current
parameter value
similarity
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN201710557733.9A
Other languages
Chinese (zh)
Inventor
沈婉婷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Meizu Technology Co Ltd
Original Assignee
Meizu Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Meizu Technology Co Ltd filed Critical Meizu Technology Co Ltd
Priority to CN201710557733.9A
Publication of CN107392124A

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/174 Facial expression recognition
    • G06V40/175 Static expression
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/0205 Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/024 Detecting, measuring or recording pulse rate or heart rate
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/08 Detecting, measuring or recording devices for evaluating the respiratory organs
    • A61B5/0816 Measuring devices for examining respiratory frequency
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/165 Evaluating the state of mind, e.g. depression, anxiety

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Physiology (AREA)
  • Cardiology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Pulmonology (AREA)
  • Psychiatry (AREA)
  • Child & Adolescent Psychology (AREA)
  • Developmental Disabilities (AREA)
  • Educational Technology (AREA)
  • Hospice & Palliative Care (AREA)
  • Psychology (AREA)
  • Social Psychology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The application provides an emotion identification method. The method includes: obtaining a current first physiological parameter value of a user of a terminal, matching the current first physiological parameter value against multiple preset first physiological parameter values, and determining a first mood model and a first similarity; obtaining a current second physiological parameter value of the user, matching the current second physiological parameter value against multiple preset second physiological parameter values, and determining a second mood model and a second similarity; and determining that the mood model corresponding to the higher of the first similarity and the second similarity is the user's current mood model, wherein the current first physiological parameter value and the current second physiological parameter value are of different types. The application also provides an emotion identification apparatus, a terminal, and a storage medium. The method, apparatus, terminal, and storage medium can improve the accuracy of emotion identification.

Description

Emotion identification method, apparatus, terminal and storage medium
Technical field
The present invention relates to emotion identification technology, and in particular to a camera-based method for identifying a user's emotion, and a corresponding apparatus, terminal, and storage medium.
Background technology
With the development of terminals, and of mobile terminal technology in particular, mobile terminals (e.g., mobile phones and tablets) have become everyday necessities. Some users are never without their phone, even for a single day; the phone is like a friend. A user may change the phone theme when sad, search online for jokes or cheerful videos to lift a sad mood, or chat with friends on the phone when happy. All of these actions, however, require manual operation by the user; the phone cannot adapt to the user's emotion on its own. A technology that automatically identifies user emotion is therefore desirable. Several such technologies have been developed, for example based on the user's call voice, facial expression, or heart rate. However, the accuracy with which these technologies identify user emotion is often unsatisfactory.
The content of the invention
In view of this, it is necessary to provide an emotion identification method, apparatus, terminal, and storage medium that can improve the accuracy of emotion identification.
The application provides an emotion identification method. The method includes:
obtaining a current first physiological parameter value of a user of a terminal, and matching the current first physiological parameter value against multiple preset first physiological parameter values, wherein the mood model corresponding to the preset first physiological parameter value that best matches the current first physiological parameter value is determined to be a first mood model, and the similarity between the current first physiological parameter value and that best-matching preset value is a first similarity;
obtaining a current second physiological parameter value of the user of the terminal, and matching the current second physiological parameter value against multiple preset second physiological parameter values, wherein the mood model corresponding to the preset second physiological parameter value that best matches the current second physiological parameter value is determined to be a second mood model, and the similarity between the current second physiological parameter value and that best-matching preset value is a second similarity; and
determining that the mood model corresponding to the higher of the first similarity and the second similarity is the user's current mood model, wherein the current first physiological parameter value and the current second physiological parameter value are of different types.
In another possible implementation, the first physiological parameter value is a facial expression feature value of the user, the user's facial expression is obtained through a camera, and the terminal includes at least two cameras. Obtaining the current first physiological parameter value of the user includes:
obtaining the pixel values of the at least two cameras respectively;
determining the camera with the lowest pixel value among the at least two cameras; and
starting the camera with the lowest pixel value to capture a current facial expression image of the user.
In another possible implementation, the second physiological parameter value is the respiratory rate and/or heart rate of the user, and obtaining the current second physiological parameter value of the user includes:
capturing a current image of the user through the infrared lamp of the camera, and analyzing the current image to determine the user's heart rate; or
capturing a current image of the user through the camera, and performing signal sampling and transform processing on a preset image region of the current image to obtain the user's respiratory rate.
In another possible implementation, the method further includes:
changing the current theme of the terminal according to the current mood model, the current theme including one or more of a wallpaper, a ringtone, operation icons, and an interface.
In another possible implementation, the method further includes determining a degree of emotional fluctuation of the user according to the first mood model and the second mood model, and selecting a corresponding theme as the replacement theme according to the user's degree of emotional fluctuation; or
judging, according to the current image of the user, whether a factor that influences the user's emotion exists in the environment of the terminal, such factors including environmental color; and, if a factor that influences the user's emotion exists, obtaining a theme that helps reduce the environmental influence, from a theme database stored in the terminal or on a remote server, as the replacement theme.
The application also provides an emotion identification apparatus. The emotion identification apparatus is applied to a terminal and includes:
a first acquisition module, configured to obtain a current first physiological parameter value of a user of the terminal and match the current first physiological parameter value against multiple preset first physiological parameter values, wherein the mood model corresponding to the preset first physiological parameter value that best matches the current first physiological parameter value is determined to be a first mood model, and the similarity between the current first physiological parameter value and that best-matching preset value is a first similarity;
a second acquisition module, configured to obtain a current second physiological parameter value of the user of the terminal and match the current second physiological parameter value against multiple preset second physiological parameter values, wherein the mood model corresponding to the preset second physiological parameter value that best matches the current second physiological parameter value is determined to be a second mood model, and the similarity between the current second physiological parameter value and that best-matching preset value is a second similarity; and
a determining module, configured to determine that the mood model corresponding to the higher of the first similarity and the second similarity is the user's current mood model, wherein the current first physiological parameter value and the current second physiological parameter value are of different types.
In another possible implementation:
the first physiological parameter value is a facial expression feature value of the user, the user's facial expression is obtained through a camera, and the terminal includes at least two cameras; obtaining the current first physiological parameter value of the user includes: obtaining the pixel values of the at least two cameras respectively; determining the camera with the lowest pixel value among the at least two cameras; and starting that camera to capture a current facial expression image of the user.
In another possible implementation:
the second physiological parameter value is the respiratory rate and/or heart rate of the user; obtaining the current second physiological parameter value of the user includes: capturing a current image of the user through the infrared lamp of the camera and analyzing the current image to determine the user's heart rate; or capturing a current image of the user through the camera and performing signal sampling and transform processing on a preset image region of the current image to obtain the user's respiratory rate.
In another possible implementation, the determining module is further configured to determine a corresponding theme as the replacement theme according to the user's current mood model; or
to judge, according to the current image of the user, whether a factor that influences the user's emotion exists in the environment of the terminal, and, if such a factor exists, to select a theme that helps reduce the environmental influence as the replacement theme.
In another possible implementation, the theme includes one or more of a wallpaper, a ringtone, operation icons, and an interface.
In another possible implementation, the wallpaper is selected from wallpapers stored in the terminal, photos stored in the terminal's photo album, and wallpapers and pictures stored on a remote server.
In another possible implementation, the determining module is further configured to determine the degree of emotional fluctuation of the terminal user according to the first mood model and the second mood model, the degree of emotional fluctuation being divided into three grades: slight, moderate, and severe.
In another possible implementation, when the user's emotional fluctuation is slight, a playful animation effect that helps the emotion develop in a positive direction is matched into the theme the user is currently using; when the fluctuation is moderate, the wallpaper of the terminal theme is changed; and when the fluctuation is severe, the entire terminal theme is changed.
In another possible implementation, the time for which the replacement theme is presented depends on the duration of the terminal user's emotional fluctuation.
The application also provides a terminal. The terminal includes a processor configured to execute a computer program stored in a memory to implement the steps of the method described above.
The application also provides a computer-readable storage medium on which a computer program (instructions) is stored, wherein the computer program (instructions), when executed by a processor, implements the steps of the method described above.
In the present invention, a first mood model and a first similarity are determined from the first physiological parameter value, a second mood model and a second similarity are determined from the second physiological parameter value, and the user's current mood model is then determined from the first mood model and first similarity together with the second mood model and second similarity, so that the user's emotion can be identified more accurately.
Brief description of the drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the accompanying drawings required in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those of ordinary skill in the art can derive other drawings from them without creative effort.
Fig. 1 is a flow chart of an emotion identification method provided by an embodiment of the present invention;
Fig. 2 is a structure diagram of an emotion identification apparatus provided by an embodiment of the present invention;
Fig. 3 is a schematic structural diagram of a terminal for implementing the emotion identification method, provided by an embodiment of the present invention.
The following specific embodiments will further illustrate the present invention with reference to the above drawings.
Embodiment
To make the above objects, features, and advantages of the present invention more clearly understood, the present invention is described in detail below with reference to the accompanying drawings and specific embodiments. It should be noted that, where no conflict arises, the embodiments of the application and the features of the embodiments may be combined with one another.
Many specific details are set forth in the following description to facilitate a thorough understanding of the present invention. The described embodiments are only some, rather than all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the scope of protection of the present invention.
Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by those skilled in the technical field of the present invention. The terms used in the description of the present invention are intended only to describe specific embodiments and are not intended to limit the present invention.
Embodiment
Fig. 1 is a schematic flow diagram of an emotion identification method provided by an embodiment of the present invention. The emotion identification method 100 shown in Fig. 1 may include the following steps:
Step 101: obtain the current first physiological parameter value of the terminal user, and determine the first mood model and the first similarity.
The emotion identification method provided by the present invention is applied to a terminal, which may be an electronic device such as a mobile phone or a tablet.
The main channels of emotion expression are facial expression, voice, body posture, physiological signals, and so on. In some embodiments, the first physiological parameter value is a facial expression feature value of the user, the user's facial expression is obtained through a camera, and the terminal includes at least two cameras. Obtaining the current first physiological parameter value of the user includes:
obtaining the pixel values of the at least two cameras respectively;
determining the camera with the lowest pixel value among the at least two cameras; and
starting the camera with the lowest pixel value to capture a current image of the user.
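The camera-selection steps above can be sketched as follows. This is a minimal illustration: the `Camera` record and the pixel counts are assumptions introduced here, not part of the patent, which only specifies choosing the camera with the lowest pixel value.

```python
from dataclasses import dataclass

@dataclass
class Camera:
    name: str
    pixel_value: int  # e.g. sensor pixel count (illustrative)

def pick_capture_camera(cameras):
    """Select the camera with the lowest pixel value, per the steps above."""
    return min(cameras, key=lambda cam: cam.pixel_value)

cams = [Camera("front", 8_000_000), Camera("rear", 12_000_000)]
print(pick_capture_camera(cams).name)  # front
```

The lower-resolution camera presumably suffices for expression capture while costing less power, though the patent does not state the rationale.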
The current image of the user includes, but is not limited to, the facial expression (emotion signals conveyed by the eyes, mouth, eyebrows, muscles, and so on) and body posture (breathing, heart rate), which serve as the basis for determining the user's emotion.
In some embodiments, the emotion identification method may also correct and predict the acquired image according to the distance between the user and the image acquisition device, to improve the accuracy of emotion identification.
For example, under ideal conditions a frontal face image of the user is acquired as the basis for emotion identification, but sometimes the user does not directly face the image acquisition device. When acquiring the user image, the user's distance and azimuth relative to the image acquisition device are detected, the acquired image is corrected and predicted by a deep neural network algorithm, and the expression that the user's current emotion would present frontally is inferred, thereby obtaining a frontal facial expression image of the user.
The acquired facial expression image (which may be a whole facial expression image or a local one, for example a feature image including one or more of the eyebrows, eyes, lips, and muscles) is matched and analyzed against a preset first emotion identification database to find the corresponding user mood model: the expression is identified and compared against the database to obtain the first similarity between the expression and the data in the first emotion identification database, and when the first similarity reaches a predetermined value, the first mood model is obtained. For example, the user's facial expression is screened and matched against the data in the first emotion identification database; if the first similarity with the facial expression image corresponding to the angry mood model in the database reaches 80%, the user's current first mood model is determined to be anger.
Common mood models include, but are not limited to, happy, angry, sad, fearful, and neutral. The user's facial expression images under different mood models can be acquired in advance, associated with each mood model, and saved as the first emotion identification database. In some embodiments, each mood model may also correspond to multiple facial expression images. In some embodiments, if the similarity between the acquired user expression image and the pre-stored facial expression images never reaches the predetermined value, the user is allowed to add the new facial expression image to the first emotion identification database and associate it with a preset mood model, or with a mood model newly defined by the user.
Step 102: obtain the current second physiological parameter value of the terminal user, and determine the second mood model and the second similarity.
In some embodiments, the second physiological parameter value is the respiratory rate and/or heart rate of the user, and obtaining the current second physiological parameter value of the user includes:
capturing a current image of the user through the infrared lamp of the camera, and analyzing the current image to determine the user's heart rate; or
capturing a current image of the user through the camera, and performing signal sampling and transform processing on a preset image region of the current image (such as the chest position) to obtain the user's respiratory rate.
In some embodiments, the breath-signal acquisition region is determined as follows: the face location is first determined from the user's current image, and the user's chest position is then extrapolated from the face location according to the general proportions of the human body. This is a non-contact measurement method. It can be understood that, in some other embodiments, the user's heart rate or respiratory rate may instead be measured by contact, for example through a worn heart rate/respiratory rate measurement device (such as a bracelet).
Once the breath-signal acquisition region is determined, signal sampling and transform processing can be performed on that region, as follows: signal sampling -> image grayscale processing -> signal normalization -> filtering to remove jitter -> Fourier transform -> determination of the respiratory variation curve over a certain period. The user's heart rate/respiratory rate can then be obtained from the respiratory variation curve.
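The tail end of the pipeline above (normalization, Fourier transform, rate extraction) can be sketched on a synthetic signal. The camera frame rate, the synthetic brightness trace, and the naive DFT are illustrative assumptions, not the patent's implementation; a real system would use sampled chest-region pixel intensities and a filtered FFT.

```python
import cmath
import math

def dft_peak_frequency(signal, fs):
    """Return the dominant non-DC frequency (Hz) of a real-valued signal
    via a naive discrete Fourier transform (the 'Fourier transform' step)."""
    n = len(signal)
    mean = sum(signal) / n
    centered = [s - mean for s in signal]  # normalization: remove the DC offset
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2):
        coeff = sum(c * cmath.exp(-2j * math.pi * k * i / n)
                    for i, c in enumerate(centered))
        if abs(coeff) > best_mag:
            best_k, best_mag = k, abs(coeff)
    return best_k * fs / n

# Synthetic chest-region brightness trace: 0.25 Hz breathing (15 breaths/min),
# sampled at an assumed camera frame rate of 10 frames per second.
fs = 10.0
signal = [math.sin(2 * math.pi * 0.25 * t / fs) for t in range(200)]
breaths_per_min = dft_peak_frequency(signal, fs) * 60
print(round(breaths_per_min))  # 15
```

The same peak-frequency idea applies to heart rate, with a higher expected frequency band.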
The user's current respiratory rate and heart rate are matched and analyzed against a preset second emotion identification database to obtain the second similarity between the user's current respiratory rate and heart rate and the preset respiratory rates and heart rates in the database; when the second similarity reaches a predetermined value, the second mood model is determined. For example, if comparison shows that the second similarity between the user's current respiratory rate and heart rate and those corresponding to the angry model in the database reaches 80%, the second mood model is determined to be the angry model.
In some embodiments, the similarity obtained by comparing the user's respiratory rate with the preset respiratory rate and the similarity obtained by comparing the user's heart rate with the preset heart rate may be combined by a calculation to obtain the second similarity; such calculations include, but are not limited to, averaging and weighted averaging. In some embodiments, the second similarity and the second mood model may also be determined based on only one of the user's respiratory rate and heart rate.
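The averaging/weighted-averaging combination mentioned above might look like the following sketch; the weights are assumptions, since the patent leaves them unspecified.

```python
def combine_similarity(sim_resp, sim_hr, w_resp=0.5, w_hr=0.5):
    """Weighted average of the respiration and heart-rate similarities."""
    return (w_resp * sim_resp + w_hr * sim_hr) / (w_resp + w_hr)

print(round(combine_similarity(0.7, 0.9), 2))        # 0.8 (plain average)
print(round(combine_similarity(0.7, 0.9, 1, 3), 2))  # 0.85 (heart rate weighted 3x)
```

Using only one signal corresponds to setting the other weight to zero.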
Similar to the first emotion identification database described above, the user's respiratory rate and heart rate under different mood models can be acquired in advance, associated with each mood model, and saved as the second emotion identification database. In some embodiments, each mood model may correspond to a range of respiratory rate values and a range of heart rate values. In some embodiments, if the similarity between the acquired respiratory rate and heart rate and the pre-stored values never reaches the predetermined value, the user is allowed to add the new respiratory rate and heart rate to the second emotion identification database and associate them with a preset mood model, or with a mood model newly defined by the user.
Step 103: determine that the mood model corresponding to the higher of the first similarity and the second similarity is the user's current mood model.
If the first mood model coincides with the second mood model, for example both are the angry model, the user's current mood model is determined to be the angry model. If the two mood models do not coincide, the mood model corresponding to the higher of the first similarity and the second similarity is the user's current mood model. For example, if the calculated first similarity is 60% with a first mood model of anger, while the second similarity is 80% with a second mood model of neutral, the user's current mood model is determined to be neutral.
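The Step 103 decision rule, including the agreement case, can be sketched as follows; the mood labels and scores are illustrative, and the tie-breaking choice (prefer the first channel on equal similarity) is an assumption the patent does not address.

```python
def identify_mood(sim1, mood1, sim2, mood2):
    """Return the user's current mood model: the agreed model if the two
    channels coincide, otherwise the model with the higher similarity."""
    if mood1 == mood2:
        return mood1
    return mood1 if sim1 >= sim2 else mood2

# Example from the text: expression says angry at 60%,
# respiration/heart rate says neutral at 80%.
print(identify_mood(0.60, "angry", 0.80, "neutral"))  # neutral
```

Combining two independent physiological channels this way is what the summary credits for the improved accuracy.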
In some embodiments, the emotion identification method may further include an updating step. The updating step changes a predetermined state of the phone according to the user's current mood model; the predetermined state may include, but is not limited to, the terminal theme, and the terminal theme may include one or more of the desktop wallpaper, screen-saver wallpaper, ringtone, operation icons, interface, and so on. The wallpaper may be selected from wallpapers stored on the phone, photos stored in the phone's photo album, and wallpapers and pictures stored on a remote server.
In some embodiments, themes preset to match the various mood models are stored in the terminal or on a remote server, and each mood model may correspond to one or more themes. Changing the current theme of the terminal according to the user's current mood model includes judging whether the current theme matches the user's current mood model; if not, a theme matching the user's current mood model is obtained from the theme library stored in the terminal or on the remote server, and the terminal's current theme is changed to that theme. In some embodiments, when multiple themes match the user's current mood model, one may be selected at random as the current theme, or the matching themes may be presented as alternatives for the user to choose from. In some embodiments, even if the current theme already matches the user's current mood model, a different matching theme may still be obtained from the theme library stored in the terminal or on the remote server as the replacement theme.
In some embodiments, the method may further include determining the degree of emotional fluctuation of the terminal user according to the first mood model and the second mood model. The degree of emotional fluctuation may be divided into three grades: slight, moderate, and severe. For example, if the user is very angry, with a ferocious facial expression, dilated pupils, correspondingly rapid breathing, and a fast heart rate, it may be determined that the user's emotional fluctuation is relatively violent; if the facial expression shows the features of anger but the breathing is less rapid and the heart rate is at its usual value, it may be determined that the degree of emotional fluctuation is relatively low.
An example of changing the terminal theme according to the degree of emotional fluctuation: when the fluctuation is slight, a playful animation effect that helps the emotion develop in a positive direction is matched into the theme the user is currently using; when the fluctuation is moderate, the wallpaper of the phone theme is changed; and when the fluctuation is severe, the entire phone theme is changed. The time for which the replacement theme is presented depends on the duration of the user's emotional fluctuation.
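The three-grade theme policy above can be sketched as a simple mapping; the action strings are paraphrases of the text, not the patent's wording, and a real terminal would dispatch to theme-engine calls rather than return descriptions.

```python
def theme_action(fluctuation):
    """Map the emotional-fluctuation grade to the theme change described above."""
    actions = {
        "slight": "add a positive animation effect to the current theme",
        "moderate": "change the theme wallpaper",
        "severe": "change the entire theme",
    }
    return actions[fluctuation]

print(theme_action("moderate"))  # change the theme wallpaper
```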
In certain embodiments, the Emotion identification method can also by the acquisition of the information to user surrounding environment, Analyzed with reference to big data, judge environment with the presence or absence of influence user emotion factor, if in the presence of, convert theme when Wait and recommend some accordingly to reduce the subject content that environment influences on user emotion.
For example, if Emotion identification determines that the user's current mood model is nervousness, and it is detected that the ambient color of the user's surroundings belongs to the red family (red tends to cause tension), the influence of the environment on mood can be analyzed, and themes in colors such as green, blue, or yellow may be pushed, or the user may be prompted to first leave the current environment. The presentation of the prompt message should likewise be accented with colors such as green, blue, or yellow, which can somewhat ease the user's tension while reading the message.
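One way to approximate the ambient-color check, assuming pixels are available as RGB triples and treating "red" as an HSV hue band; the band limits and saturation threshold are illustrative assumptions, not part of the disclosure:

```python
import colorsys

def red_fraction(pixels):
    """Fraction of sufficiently saturated pixels whose hue lies in the red
    band; `pixels` is an iterable of (r, g, b) tuples in 0-255."""
    red = total = 0
    for r, g, b in pixels:
        h, s, _v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
        total += 1
        # hue near 0 (or wrapping past 330 degrees) counts as red
        if s > 0.3 and (h < 30 / 360.0 or h > 330 / 360.0):
            red += 1
    return red / total if total else 0.0

# e.g. flag a predominantly red environment when more than half the
# sampled pixels fall in the red band
frac = red_fraction([(255, 0, 0)] * 3 + [(0, 255, 0)])
```

The cutoff for deciding the environment "belongs to the red family" (here, a simple fraction threshold) would be a tuning choice in practice.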
Embodiment
Fig. 2 is a structural diagram of an Emotion identification device 20 provided by an embodiment of the present invention. As shown in Fig. 2, the Emotion identification device may include: a first acquisition module 202, a second acquisition module 204, and a determining module 206. A module, as referred to in the present invention, is a series of computer program segments that can be executed by a computer and that performs a fixed function.
The Emotion identification device 20 provided by the present invention can be applied to a terminal, and the terminal may be an electronic device such as a mobile phone or a tablet.
The first acquisition module 202 is used to obtain a current first physiologic parameter value of the terminal user, and to determine a first mood model and a first similarity according to the first physiologic parameter value.
The major channels of emotion expression include facial expression, voice, body posture, physiological signals, and so on. In some embodiments, the first physiologic parameter value is a facial expression feature value of the user, the user's facial expression is obtained through a camera, and the terminal includes at least two cameras. Obtaining the current first physiologic parameter value of the user includes:
obtaining the pixel counts of the at least two cameras respectively;
determining the camera with the lowest pixel count among the at least two cameras; and
starting the camera with the lowest pixel count to obtain a current image of the user.
The current image of the user includes, but is not limited to, facial expression (emotion signals conveyed by the eyes, mouth, eyebrows, muscles, and so on) and body posture (breathing, heart rate), and such images serve as the basis for determining the user's mood.
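Assuming each candidate camera reports its pixel count, the camera-selection steps above can be sketched as follows; the object shape and field names are illustrative, not from the disclosure:

```python
def pick_lowest_pixel_camera(cameras):
    """Return the camera whose sensor has the fewest pixels.

    `cameras` is a list of dicts with illustrative keys "id" and "pixels";
    the selected camera is the one started to capture the user's image.
    """
    return min(cameras, key=lambda cam: cam["pixels"])

cams = [
    {"id": "front_main", "pixels": 8_000_000},  # 8 MP main camera
    {"id": "front_aux", "pixels": 2_000_000},   # 2 MP auxiliary camera
]
chosen = pick_lowest_pixel_camera(cams)
```

Choosing the lowest-pixel camera is consistent with using a low-power sensor for continuous capture, as the terminal embodiment later mentions a low-power camera.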
In certain embodiments, the first acquisition module 202 may also correct and predict the acquired image according to the user's distance from the image acquisition device, to improve the accuracy of Emotion identification.
For example, under ideal conditions a frontal face image of the user is obtained as the basis for Emotion identification, but sometimes the user is not directly facing the image acquisition device. When obtaining the user image, the user's distance and azimuth relative to the image acquisition device are detected, and the acquired image is corrected and predicted by a deep neural network algorithm to infer what the user's current emotional expression would look like viewed from the front, thereby obtaining a frontal expression image of the user.
The first acquisition module 202 may perform matching analysis between the acquired expression image (which may be a whole facial expression image, or a partial one, for example a feature image including one or more of the eyebrows, eyes, lips, and muscles) and a preset first Emotion identification database, to find the corresponding user mood model. By recognizing the expression and analyzing the first similarity between the expression and the data in the first Emotion identification database, the first mood model is obtained when the first similarity reaches a predetermined value. For example, the acquired user facial expression is screened and matched against the data in the first Emotion identification database; if the first similarity with the expression image corresponding to the angry mood model in the first Emotion identification database reaches 80%, it is concluded that the user's current first mood model is anger.
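As one hedged illustration of this matching analysis — assuming expression features have been reduced to numeric vectors and using cosine similarity, neither of which the disclosure mandates — the database lookup could be sketched as:

```python
import math

def best_mood_match(features, database, threshold=0.8):
    """Compare a feature vector against per-mood reference vectors.

    Returns (mood, similarity) when the best match reaches `threshold`,
    otherwise (None, best_similarity). Vector form, cosine similarity,
    and the 0.8 threshold are all illustrative assumptions.
    """
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
        return dot / norm if norm else 0.0

    best_mood, best_sim = None, 0.0
    for mood, ref in database.items():
        sim = cosine(features, ref)
        if sim > best_sim:
            best_mood, best_sim = mood, sim
    return (best_mood, best_sim) if best_sim >= threshold else (None, best_sim)

db = {"angry": [1.0, 0.0, 0.0], "happy": [0.0, 1.0, 0.0]}
mood, sim = best_mood_match([0.9, 0.1, 0.0], db)
```

The no-match branch (returning `None`) corresponds to the case below where the user is allowed to add a new expression image to the database.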
Common mood models include, but are not limited to, happy, angry, sad, frightened, and neutral. The user's expression images under different mood models may be obtained in advance, associated with each mood model, and saved as the first Emotion identification database. In some embodiments, each mood model may also correspond to multiple expression images. In certain embodiments, if the similarity between the acquired user expression image and all of the pre-stored expression images fails to reach the predetermined value, the user is allowed to add an expression image to the first Emotion identification database, and the newly added expression image is associated with and saved to a preset mood model, or to a mood model newly defined by the user.
The second acquisition module 204 is used to obtain a current second physiologic parameter value of the terminal user, and to determine a second mood model and a second similarity according to the second physiologic parameter value.
In certain embodiments, the second physiologic parameter value is the respiratory rate and/or heart rate of the user, and obtaining the current second physiologic parameter value of the user includes:
acquiring the current image of the user through the infrared lamp of the camera, and analyzing the current image to determine the heart rate of the user; or
acquiring the current image of the user through the camera, and performing signal sampling and conversion processing in a preset image region of the current image (such as the chest position) to obtain the respiratory rate of the user.
In certain embodiments, the breathing-signal pickup area is determined as follows: first the face location is determined from the current image of the user, and then the chest position of the user is extrapolated from the face location according to the general rules of human body proportion. This is a non-contact measurement method. It can be understood that, in some other embodiments, the user's heart rate or respiratory rate may be measured by a contact method, for example, by obtaining the user's heart rate/respiratory rate from a worn heart-rate/respiratory-rate measurement device (such as a wristband).
After the breathing-signal pickup area is determined, signal sampling and conversion processing can be carried out in that region. The process is as follows: signal sampling -> image grayscale processing -> signal normalization -> filtering to eliminate jitter -> Fourier transform -> determining the respiratory variation curve over a certain period of time. The user's respiratory rate (or heart rate) can then be obtained from the respiratory variation curve.
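A minimal sketch of the normalize-and-Fourier-transform steps, assuming the sampled signal is the mean grayscale intensity of the chest region per video frame; the NumPy implementation and the breathing-band limits are assumptions, not part of the disclosure:

```python
import numpy as np

def breathing_rate_bpm(samples, fs):
    """Estimate breaths per minute from a chest-region intensity signal.

    samples: per-frame mean grayscale values; fs: sampling rate in frames/s.
    Normalizes the signal, takes an FFT, and picks the dominant frequency
    inside a plausible breathing band (0.1-0.7 Hz, i.e. 6-42 breaths/min).
    """
    x = np.asarray(samples, dtype=float)
    x = (x - x.mean()) / (x.std() + 1e-9)           # normalization step
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    band = (freqs >= 0.1) & (freqs <= 0.7)          # keep breathing band only
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0

# synthetic 0.25 Hz breathing signal sampled at 10 fps for 60 s -> ~15 bpm
t = np.arange(0, 60, 0.1)
rate = breathing_rate_bpm(np.sin(2 * np.pi * 0.25 * t), fs=10.0)
```

Restricting the search to the breathing band plays the role of the filtering step in the pipeline above, suppressing jitter outside plausible respiratory frequencies.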
The user's current respiratory rate and heart rate are subjected to matching analysis with a preset second Emotion identification database, to obtain a second similarity between the user's current respiratory rate and heart rate and the preset respiratory rates and heart rates in the Emotion identification database; when the second similarity reaches a predetermined value, the second mood model is determined. For example, if comparison shows that the second similarity between the user's current respiratory rate and heart rate and the respiratory rate and heart rate corresponding to the angry model in the database reaches 80%, the second mood model is determined to be the angry model.
In certain embodiments, the similarity obtained by comparing the user's respiratory rate with the preset respiratory rate and the similarity obtained by comparing the user's heart rate with the preset heart rate may be combined algorithmically to obtain the second similarity; such algorithms include, but are not limited to, averaging and weighted averaging. In certain embodiments, only one of the user's respiratory rate and heart rate may be used as the basis for determining the second similarity and the second mood model.
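The weighted-averaging variant mentioned above can be written in one line; the equal 0.5/0.5 weights here are an illustrative assumption:

```python
def combined_second_similarity(resp_sim, heart_sim, w_resp=0.5, w_heart=0.5):
    """Combine the respiratory-rate and heart-rate similarities into the
    second similarity by weighted averaging; the weights are illustrative."""
    return w_resp * resp_sim + w_heart * heart_sim

sim = combined_second_similarity(0.7, 0.9)  # plain average of the two
```

Setting one weight to zero recovers the single-parameter case where only the respiratory rate or only the heart rate is used.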
Similar to the first Emotion identification database described above, the user's respiratory rates and heart rates under different mood models may also be obtained in advance, associated with each mood model, and saved as a second Emotion identification database. In certain embodiments, each mood model may correspond to a range of respiratory rate values and a range of heart rate values. In certain embodiments, if the similarities between the acquired user respiratory rate and heart rate and all of the pre-stored respiratory rates and heart rates fail to reach the predetermined value, the user is allowed to add respiratory rate and heart rate values to the second Emotion identification database, and the newly added respiratory rate and heart rate are associated with and saved to a preset mood model, or to a mood model newly defined by the user.
The determining module 206 is used to determine that the mood model corresponding to the higher of the first similarity and the second similarity is the user's current mood model.
If the first mood model coincides with the second mood model, for example both are the angry model, the user's current mood model is determined to be the angry model; if the first mood model and the second mood model do not coincide, the mood model corresponding to the higher of the first similarity and the second similarity is the user's current mood model. For example, if the calculated first similarity is 60% and the first mood model is anger, while the second similarity is 80% and the second mood model is neutral, the user's current mood model is determined to be neutral.
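The decision rule just described reduces to a few lines; the function and argument names are assumptions for illustration only:

```python
def current_mood(first_model, first_sim, second_model, second_sim):
    """Pick the user's current mood model from the two channels.

    If both mood models coincide, return the agreed model; otherwise
    return the model whose similarity is higher.
    """
    if first_model == second_model:
        return first_model
    return first_model if first_sim >= second_sim else second_model

# the worked example: anger at 60% vs neutral at 80% -> neutral
mood = current_mood("angry", 0.60, "neutral", 0.80)
```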
In certain embodiments, the Emotion identification device 20 may further include an update module 208, which changes a predetermined state of the phone according to the user's current mood model. The predetermined state may include, but is not limited to, the theme of the terminal; the terminal theme may include one or more of the desktop wallpaper, screen-saver wallpaper, ringtone, application icons, interface, and so on. The wallpaper may be selected from the wallpapers stored on the phone, the photos stored in the phone album, and the wallpapers and pictures stored in a remote server.
In certain embodiments, preset themes matched with various mood models are stored in the terminal or in a remote server. Each mood model may correspond to one or more themes. Changing the current theme according to the user's current mood model includes determining whether the current theme matches the user's current mood model; if it does not, a theme that matches the user's current mood model is obtained from a theme library stored in the terminal or in the remote server, and the current theme of the terminal is changed to the matching theme. In some embodiments, when multiple themes match the user's current mood model, one may be selected at random as the current theme, or the matching themes may be presented as candidates for the user to choose from. In certain embodiments, if the current theme already matches the user's current mood model, a different matching theme may still be obtained from the theme library stored in the terminal or in the remote server and used as the changed theme.
In certain embodiments, the update module 208 may also determine the degree of mood fluctuation of the terminal user according to the first mood model and the second mood model. The degree of mood fluctuation may be divided into three grades: slight, moderate, and severe. For example, if the user appears very angry, with a fierce facial expression and dilated pupils, accompanied by correspondingly rapid breathing and a fast heart rate, it may be determined that the user's mood fluctuation is relatively violent; if the user's facial expression matches the facial features of anger, but the breathing is not particularly rapid and the heart rate is at its usual value, it may be determined that the degree of mood fluctuation is relatively low.
An example of changing the theme of the terminal according to the user's degree of mood fluctuation: when the mood fluctuation is slight, a playful animation effect that helps the mood develop in a positive direction is added to the user's current theme; when the mood fluctuation is moderate, the wallpaper of the phone theme is changed; when the mood fluctuation is severe, the entire phone theme is changed. How long the changed theme remains in place depends on the duration of the terminal user's mood fluctuation.
In certain embodiments, the Emotion identification device 20 may also collect information about the user's surroundings, analyze it in combination with big data, and determine whether factors that affect the user's mood exist in the environment; if such factors exist, themes whose content reduces the influence of the environment on the user's mood may be recommended when switching themes.
For example, if the determining module 206 determines that the user's current mood model is nervousness, and it is detected that the ambient color of the user's surroundings belongs to the red family (red tends to cause tension), the influence of the environment on mood can be analyzed, and themes in colors such as green, blue, or yellow may be pushed, or the user may be prompted to first leave the current environment. The presentation of the prompt message should likewise be accented with colors such as green, blue, or yellow, which can somewhat ease the user's tension while reading the message.
Embodiment
Please refer to Fig. 3, which is a schematic structural diagram of a terminal 30 for realizing the Emotion identification method, provided by an embodiment of the present invention.
The terminal 30 of this embodiment is a device capable of automatically performing numerical computation and/or information processing according to preset or stored instructions, and includes: a memory 31, a processor 32, and a computer program stored in the memory 31 and runnable on the processor 32, such as an Emotion identification program. When the processor 32 executes the computer program, the steps in each of the above Emotion identification method embodiments are realized, such as the steps shown in Fig. 1. Alternatively, when the processor executes the computer program, the functions of each module in each of the above device embodiments are realized.
Exemplarily, the computer program may be divided into one or more modules/units, the one or more modules/units being stored in the memory 31 and executed by the processor 32 to complete the present invention. The one or more modules/units may be a series of computer program instruction segments capable of completing specific functions, the instruction segments being used to describe the execution process of the computer program in the terminal. For example, the computer program may be divided into the first acquisition module 202, the second acquisition module 204, the determining module 206, and the update module 208 shown in Fig. 2.
The terminal 30 may be, but is not limited to, any electronic product capable of obtaining user images, for example, a tablet computer, a smartphone, a personal digital assistant (PDA), a smart wearable device, or another terminal.
The memory 31 can be used to store the computer program and/or modules. The processor 32 realizes the various functions of the terminal 30 by running or executing the computer programs and/or modules stored in the memory 31 and by calling the data stored in the memory 31. The memory 31 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system and the application programs required for at least one function (such as a sound playing function, an image playing function, etc.), and the data storage area may store data created according to the use of the terminal 30 (such as audio data, a phone book, etc.). In addition, the memory 31 may include a high-speed random access memory, and may also include a non-volatile memory, such as a hard disk, an internal memory, a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, a flash card, at least one magnetic disk storage device, a flash memory device, or another solid-state storage part.
The processor 32 may be a central processing unit (CPU), and may also be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor; the processor is the control center of the terminal, using various interfaces and lines to connect the various parts of the whole terminal.
Preferably, the processor 32 can call the program code stored in the memory 31 to perform related functions. For example, the modules described in Fig. 2 are program code stored in the memory 31 and executed by the processor 32 to realize an Emotion identification method (such as the Emotion identification method in the embodiment shown in Fig. 1).
The terminal 30 further includes at least one communication device 33, at least one image acquisition device 34, at least one display device 35, and at least one communication bus, where the communication bus is used to realize connection and communication between these components.
The communication device 33 may be a wired communication device or a wireless communication device. The wired communication device includes a communication port, such as a universal serial bus (USB), a controller area network (CAN), a serial and/or other standard network connection, an inter-integrated circuit (I2C) bus, etc. The wireless communication device may use a wireless communication system of any category, for example, Bluetooth, infrared, wireless fidelity (WiFi), cellular technology, satellite, or broadcast; the cellular technology may include mobile communication technologies such as the second generation (2G), third generation (3G), fourth generation (4G), or fifth generation (5G).
The image acquisition device 34 may be a low-power camera assembled in the terminal 30, or an independent image acquisition device communicatively connected to the terminal 30 through the communication device 33, such as a web camera.
The display device 35 may be a touch liquid crystal display (LCD), a light-emitting diode (LED) display, an organic light-emitting diode (OLED) display, or another suitable display.
Those skilled in the art will understand that the schematic diagram is only an example of a terminal and does not limit the structure of the terminal; the terminal may include more or fewer parts than illustrated, may combine certain parts, or may have different parts. For example, the terminal may also include input/output devices, network access devices, etc. The input/output devices may include any suitable input device, including but not limited to a mouse, a keyboard, a touchscreen, or a contactless input such as gesture input or voice input.
Embodiment
If the integrated modules/units of the terminal are realized in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium. Based on this understanding, the present invention may realize all or part of the flow of the Emotion identification method described in the above embodiments by instructing relevant hardware through a computer program; the computer program may be stored in a computer-readable storage medium, and when executed by a processor, the computer program can realize the steps of the Emotion identification method described in the above method embodiments. The computer program includes computer program code, which may be in source code form, object code form, an executable file, certain intermediate forms, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disc, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunication signal, a software distribution medium, etc. It should be noted that the content included in the computer-readable medium may be appropriately increased or decreased according to the requirements of legislation and patent practice in the jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, the computer-readable medium does not include electrical carrier signals and telecommunication signals.
In the several embodiments provided by the present invention, it should be understood that the disclosed method and device may also be realized in other ways. For example, the device embodiments described above are only schematic; the division of the modules is only a division by logical function, and there may be other division manners in actual implementation.
It is obvious to those skilled in the art that the invention is not restricted to the details of the above exemplary embodiments, and that the present invention can be realized in other specific forms without departing from the spirit or essential attributes of the invention. Therefore, from whatever point of view, the embodiments should be regarded as exemplary and non-restrictive; the scope of the present invention is limited by the appended claims rather than by the above description, and it is intended that all changes falling within the meaning and scope of equivalency of the claims be included in the present invention. Any reference sign in a claim should not be considered as limiting the claim involved. In addition, it is clear that the word "comprising" does not exclude other units or steps, and the singular does not exclude the plural. Multiple devices stated in a device claim may also be realized by the same device or system through software or hardware. Words such as "first" and "second" are used to denote names and do not indicate any specific order.
Finally, it should be noted that the above embodiments are merely illustrative of the technical solutions of the present invention and are not restrictive. Although the present invention has been described in detail with reference to the preferred embodiments, those skilled in the art will understand that the technical solutions of the present invention may be modified or equivalently substituted without departing from the spirit and scope of the technical solutions of the present invention.

Claims (10)

  1. An Emotion identification method, characterized in that the Emotion identification method includes:
    obtaining a current first physiologic parameter value of a user of a terminal, matching the current first physiologic parameter value against multiple preset first physiologic parameter values, and determining that the mood model corresponding to the preset first physiologic parameter value, among the multiple preset first physiologic parameter values, that best matches the current first physiologic parameter value is a first mood model, the similarity between the current first physiologic parameter value and the best-matching preset first physiologic parameter value being a first similarity;
    obtaining a current second physiologic parameter value of the user of the terminal, matching the current second physiologic parameter value against multiple preset second physiologic parameter values, and determining that the mood model corresponding to the preset second physiologic parameter value, among the multiple preset second physiologic parameter values, that best matches the current second physiologic parameter value is a second mood model, the similarity between the current second physiologic parameter value and the best-matching preset second physiologic parameter value being a second similarity; and
    determining that the mood model corresponding to the higher of the first similarity and the second similarity is the current mood model of the user; wherein the type of the current first physiologic parameter value is different from the type of the current second physiologic parameter value.
  2. The Emotion identification method as claimed in claim 1, characterized in that the first physiologic parameter value is a facial expression feature value of the user, the facial expression of the user is obtained through a camera, and the terminal includes at least two cameras; obtaining the current first physiologic parameter value of the user includes:
    obtaining the pixel counts of the at least two cameras respectively;
    determining the camera with the lowest pixel count among the at least two cameras; and
    starting the camera with the lowest pixel count to obtain a current facial expression image of the user.
  3. The Emotion identification method as claimed in claim 2, characterized in that the second physiologic parameter value is the respiratory rate and/or heart rate of the user; obtaining the current second physiologic parameter value of the user includes:
    acquiring the current image of the user through the infrared lamp of the camera, and analyzing the current image to determine the heart rate of the user; or
    acquiring the current image of the user through the camera, and performing signal sampling and conversion processing in a preset image region of the current image to obtain the respiratory rate of the user.
  4. The Emotion identification method as claimed in claim 1, characterized in that the method further includes:
    changing the current theme of the terminal according to the current mood model, the current theme including one or more of a wallpaper, a ringtone, application icons, and an interface.
  5. The Emotion identification method as claimed in claim 4, characterized in that the method further includes determining the degree of mood fluctuation of the user according to the first mood model and the second mood model, and selecting a corresponding theme as the changed theme according to the user's degree of mood fluctuation; or
    determining, according to the current image of the user, whether factors that affect the user's mood exist in the environment where the terminal is located, the image factors affecting the user's mood including ambient color; and, if factors affecting the user's mood exist, obtaining, from a theme database stored in the terminal or in a remote server, a theme that helps reduce the environmental influence as the changed theme.
  6. An Emotion identification device, applied to a terminal, characterized in that the Emotion identification device includes:
    a first acquisition module, used to obtain a current first physiologic parameter value of a user of the terminal, match the current first physiologic parameter value against multiple preset first physiologic parameter values, and determine that the mood model corresponding to the preset first physiologic parameter value, among the multiple preset first physiologic parameter values, that best matches the current first physiologic parameter value is a first mood model, the similarity between the current first physiologic parameter value and the best-matching preset first physiologic parameter value being a first similarity;
    a second acquisition module, used to obtain a current second physiologic parameter value of the user of the terminal, match the current second physiologic parameter value against multiple preset second physiologic parameter values, and determine that the mood model corresponding to the preset second physiologic parameter value, among the multiple preset second physiologic parameter values, that best matches the current second physiologic parameter value is a second mood model, the similarity between the current second physiologic parameter value and the best-matching preset second physiologic parameter value being a second similarity; and
    a determining module, used to determine that the mood model corresponding to the higher of the first similarity and the second similarity is the current mood model of the user; wherein the type of the current first physiologic parameter value is different from the type of the current second physiologic parameter value.
  7. The Emotion identification device as claimed in claim 6, characterized in that:
    the first physiologic parameter value is a facial expression feature value of the user, the facial expression of the user is obtained through a camera, and the terminal includes at least two cameras; obtaining the current first physiologic parameter value of the user includes: obtaining the pixel counts of the at least two cameras respectively; determining the camera with the lowest pixel count among the at least two cameras; and starting the camera with the lowest pixel count to obtain a current facial expression image of the user; and/or
    the second physiologic parameter value is the respiratory rate and/or heart rate of the user; obtaining the current second physiologic parameter value of the user includes: acquiring the current image of the user through the infrared lamp of the camera and analyzing the current image to determine the heart rate of the user; or acquiring the current image of the user through the camera and performing signal sampling and conversion processing in a preset image region of the current image to obtain the respiratory rate of the user.
  8. The Emotion identification device as claimed in claim 6, characterized in that the determining module is further used to determine a corresponding theme as the changed theme according to the user's current mood model; or
    to determine, according to the current image of the user, whether factors that affect the user's mood exist in the environment where the terminal is located; if factors affecting the user's mood exist, a theme that helps reduce the environmental impact is selected as the changed theme.
  9. A terminal, characterized in that the terminal includes a processor, the processor being used to realize the steps of the method of any one of claims 1-5 when executing a computer program stored in a memory.
  10. A computer-readable storage medium on which a computer program (instructions) is stored, characterized in that the steps of the method of any one of claims 1-5 are realized when the computer program (instructions) is executed by a processor.
CN201710557733.9A 2017-07-10 2017-07-10 Emotion identification method, apparatus, terminal and storage medium Withdrawn CN107392124A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710557733.9A CN107392124A (en) 2017-07-10 2017-07-10 Emotion identification method, apparatus, terminal and storage medium

Publications (1)

Publication Number Publication Date
CN107392124A 2017-11-24

Family

ID=60335577

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710557733.9A Withdrawn CN107392124A (en) 2017-07-10 2017-07-10 Emotion identification method, apparatus, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN107392124A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104699972A (en) * 2015-03-18 2015-06-10 小米科技有限责任公司 Emotion recognition reminding method and device
CN105607822A (en) * 2014-11-11 2016-05-25 中兴通讯股份有限公司 Theme switching method and device of user interface, and terminal
CN106126017A (en) * 2016-06-20 2016-11-16 北京小米移动软件有限公司 Intelligent identification Method, device and terminal unit
CN106469297A (en) * 2016-08-31 2017-03-01 北京小米移动软件有限公司 Emotion identification method, device and terminal unit

Cited By (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108307037A (en) * 2017-12-15 2018-07-20 努比亚技术有限公司 Terminal control method, terminal and computer readable storage medium
CN108742516A (en) * 2018-03-26 2018-11-06 浙江广厦建设职业技术学院 The mood measuring and adjusting system and method for smart home
CN108937973A (en) * 2018-06-15 2018-12-07 四川文理学院 A kind of robotic diagnostic human body indignation mood method and device
WO2020015152A1 (en) * 2018-07-18 2020-01-23 平安科技(深圳)有限公司 Method and device for setting system theme, computer apparatus, and storage medium
CN109255310A (en) * 2018-08-28 2019-01-22 百度在线网络技术(北京)有限公司 Animal mood recognition methods, device, terminal and readable storage medium storing program for executing
CN109240786A (en) * 2018-09-04 2019-01-18 广东小天才科技有限公司 A kind of subject replacement method and electronic equipment
CN109240786B (en) * 2018-09-04 2021-11-26 广东小天才科技有限公司 Theme changing method and electronic equipment
CN109345258A (en) * 2018-09-26 2019-02-15 广东小天才科技有限公司 A kind of safe payment method, device and intelligent wearable device
CN109345258B (en) * 2018-09-26 2021-03-23 广东小天才科技有限公司 Safe payment method and device and intelligent wearable device
CN109360130A (en) * 2018-10-29 2019-02-19 四川文轩教育科技有限公司 A kind of student's mood monitoring method based on artificial intelligence
WO2020098013A1 (en) * 2018-11-14 2020-05-22 深圳创维-Rgb电子有限公司 Television program recommendation method, terminal, system, and storage medium
CN111209445A (en) * 2018-11-21 2020-05-29 中国电信股份有限公司 Method and device for recognizing emotion of terminal user
CN111209445B (en) * 2018-11-21 2023-05-02 中国电信股份有限公司 Method and device for identifying emotion of terminal user
CN109753889A (en) * 2018-12-18 2019-05-14 深圳壹账通智能科技有限公司 Service evaluation method, apparatus, computer equipment and storage medium
CN111413874A (en) * 2019-01-08 2020-07-14 北京京东尚科信息技术有限公司 Method, device and system for controlling intelligent equipment
CN109800734A (en) * 2019-01-30 2019-05-24 北京津发科技股份有限公司 Human facial expression recognition method and device
CN110311950A (en) * 2019-05-21 2019-10-08 平安科技(深圳)有限公司 Information-pushing method, device, equipment and computer readable storage medium
CN110013261A (en) * 2019-05-24 2019-07-16 京东方科技集团股份有限公司 Method, apparatus, electronic equipment and the storage medium of mood monitoring
CN110013261B (en) * 2019-05-24 2022-03-08 京东方科技集团股份有限公司 Emotion monitoring method and device, electronic equipment and storage medium
CN110399836A (en) * 2019-07-25 2019-11-01 深圳智慧林网络科技有限公司 User emotion recognition methods, device and computer readable storage medium
CN110853605A (en) * 2019-11-15 2020-02-28 中国传媒大学 Music generation method and device and electronic equipment
WO2021147901A1 (en) * 2020-01-20 2021-07-29 北京津发科技股份有限公司 Pressure recognition bracelet
CN111354053A (en) * 2020-02-27 2020-06-30 北京华峰创业科技有限公司 Method and device for generating cartoon image icon and storage medium
CN112200462B (en) * 2020-10-13 2024-04-26 中国银行股份有限公司 Risk assessment method and risk assessment device
CN112200462A (en) * 2020-10-13 2021-01-08 中国银行股份有限公司 Risk assessment method and device
CN113793578A (en) * 2021-08-12 2021-12-14 咪咕音乐有限公司 Tune generation method, device, equipment and computer readable storage medium
CN113793578B (en) * 2021-08-12 2023-10-20 咪咕音乐有限公司 Method, device and equipment for generating tune and computer readable storage medium
CN117370768A (en) * 2023-12-08 2024-01-09 北京回龙观医院(北京心理危机研究与干预中心) Mood fluctuation detection method and system for mental patients
CN117370768B (en) * 2023-12-08 2024-03-05 北京回龙观医院(北京心理危机研究与干预中心) Mood fluctuation detection method and system for mental patients

Similar Documents

Publication Publication Date Title
CN107392124A (en) Emotion identification method, apparatus, terminal and storage medium
KR102299764B1 (en) Electronic device, server and method for ouptting voice
US20200364457A1 (en) Emotion recognition-based artwork recommendation method and device, medium, and electronic apparatus
CN111161035B (en) Dish recommendation method and device, server, electronic equipment and storage medium
WO2017096979A1 (en) Program playing method and system based on emotion of user
CN110021061A (en) Collocation model building method, dress ornament recommended method, device, medium and terminal
CN108197185A (en) A kind of music recommends method, terminal and computer readable storage medium
CN113520340A (en) Sleep report generation method, device, terminal and storage medium
CN107784114A (en) Recommendation method, apparatus, terminal and the storage medium of facial expression image
US10437332B1 (en) System and method for emotional context communication
US11430561B2 (en) Remote computing analysis for cognitive state data metrics
WO2020088102A1 (en) Emotion intervention method, device and system, computer-readable storage medium, and therapeutic cabin
CN109241336A (en) Music recommended method and device
CN109272994A (en) Speech data processing method and the electronic device for supporting the speech data processing method
CN115699095A (en) Augmented reality content from third-party content
CN110399836A (en) User emotion recognition methods, device and computer readable storage medium
CN108141490A (en) For handling the electronic equipment of image and its control method
CN107705245A (en) Image processing method and device
CN111009031A (en) Face model generation method, model generation method and device
CN108024763A (en) Action message provides method and supports its electronic equipment
CN113576452A (en) Respiration rate detection method and device based on thermal imaging and electronic equipment
CN107665232A (en) Detect the method for similar application and its electronic installation of adaptation
Geller How do you feel? Your computer knows
JP7288064B2 (en) visual virtual agent
CN113556603B (en) Method and device for adjusting video playing effect and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 2017-11-24