CN109829082A - Emotion recognition application method, system and intelligent terminal - Google Patents


Info

Publication number
CN109829082A
CN109829082A (application CN201811574475.6A)
Authority
CN
China
Prior art keywords
user
emotion identification
identification result
music
result
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811574475.6A
Other languages
Chinese (zh)
Inventor
李绿松
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guizhou Fortuneship Technology Co Ltd
Original Assignee
Guizhou Fortuneship Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guizhou Fortuneship Technology Co Ltd filed Critical Guizhou Fortuneship Technology Co Ltd
Priority to CN201811574475.6A
Publication of CN109829082A
Legal status: Pending

Landscapes

  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention proposes an application method and system for emotion recognition. The application method includes: obtaining the user's current emotion recognition result; and executing a corresponding operation according to the emotion recognition result, the operation including at least social sharing and/or data statistics. By detecting the user's emotional changes and taking different corresponding actions, a user's positive moods and activity updates can be shared with friends in time through the emotion recognition result, guiding the user toward a positive life, or the user can be comforted promptly through timely sharing. In addition, by recording the user's mood results, the user can better understand his or her everyday emotional state while sharing life with others; based on the records over a period of time, the user can optimise the management of his or her emotional state, and memorable moments can be recalled, thereby improving the user's life experience. The method is simple, practical, convenient and efficient.

Description

Emotion recognition application method, system and intelligent terminal
Technical field
The present invention relates to the field of emotion recognition, and in particular to an emotion recognition application method, system and intelligent terminal.
Background technique
With the development of science and technology, people increasingly pursue a high quality of life, and many electronic products with corresponding functions have emerged. Mood is one of the important factors affecting people's quality of life, and accordingly some emotion recognition technologies have begun to be used in daily life. Notably, emotion recognition was first applied in investigation and interrogation, and has only in recent years begun to spread among the general public. However, the current applications of emotion recognition remain narrow, generally limited to rudimentary uses such as adding expressions to photos or recommending related music based on the recognised emotion; the potentially significant role of emotion recognition has not been exploited.
Summary of the invention
The main object of the present invention is to provide an emotion recognition application method, system and intelligent terminal that take different actions according to a person's different emotional states.
The present invention proposes an emotion recognition application method, applied in an intelligent terminal, comprising:
obtaining the user's current emotion recognition result;
executing a corresponding operation according to the emotion recognition result, the operation including at least social sharing and/or data statistics.
Further, before the step of executing the corresponding operation according to the emotion recognition result, the method further includes:
judging whether the operation needs the user's confirmation;
if so, generating a confirmation reminder and pushing it to the user, the confirmation reminder prompting the user to confirm execution of the operation;
after receiving the user's instruction confirming execution, entering the step of executing the operation.
Further, the step of judging whether the operation needs the user's confirmation includes:
judging whether the terminal is in automatic operation mode;
if not in automatic operation mode, deciding that the user's confirmation is needed.
Further, the operation further includes playing music; the step of executing the corresponding operation according to the emotion recognition result includes:
determining the music category to be played according to the emotion recognition result;
searching the user's music records or favourites list for specific music of that category;
playing the specific music.
Further, after the step of searching the user's music records or favourites list for specific music of that category, the method includes:
if no specific music is found, intelligently recommending and playing music of the corresponding category.
Further, when the operation is social sharing, the step of executing the corresponding operation according to the emotion recognition result includes:
determining a corresponding preset sharing template according to the emotion recognition result;
socially sharing the emotion recognition result according to the preset sharing template.
Further, when the operation is data statistics, the step of executing the corresponding operation according to the emotion recognition result includes:
obtaining scene information at the time the emotion recognition result is generated, the scene information including at least one of time information, geographical location information and photo information;
assigning corresponding storage labels to the emotion recognition result, the time information, the geographical location information and the photo information;
storing the emotion recognition result, time information, geographical location information and photo information in association with the storage labels.
Further, the operation further includes sending a distress signal to a medical institution; the step of executing the corresponding operation according to the emotion recognition result includes:
obtaining the user's current geographical location information;
searching for medical institutions within a preset distance according to the current geographical location information;
sending the distress signal to the medical institution.
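The distress-signal flow above (obtain the user's position, find medical institutions within a preset distance, send the signal) can be sketched as follows. The haversine distance formula, the data structures and all names here are illustrative assumptions, not details from the patent.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two lat/lon points, in kilometres.
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def find_nearby_institutions(user_pos, institutions, max_km):
    # Return names of institutions within the preset distance, nearest first.
    lat, lon = user_pos
    hits = [(haversine_km(lat, lon, i["lat"], i["lon"]), i["name"])
            for i in institutions]
    return [name for d, name in sorted(hits) if d <= max_km]

institutions = [
    {"name": "City Hospital", "lat": 23.13, "lon": 113.26},
    {"name": "District Clinic", "lat": 23.14, "lon": 113.27},
    {"name": "Remote Hospital", "lat": 24.50, "lon": 114.00},
]
nearby = find_nearby_institutions((23.13, 113.26), institutions, max_km=5.0)
```

In practice the distress signal would be sent to the first (nearest) entry of the returned list.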
The invention also provides an emotion recognition application system, comprising:
an obtaining module for obtaining the user's current emotion recognition result;
an execution module for executing a corresponding operation according to the emotion recognition result, the operation including at least social sharing and/or data statistics.
Further, the system further includes:
a judgment module for judging whether the operation needs the user's confirmation;
a reminder module for, if so, generating a confirmation reminder and pushing it to the user, the confirmation reminder prompting the user to confirm execution of the operation;
a receiving module for, after receiving the user's instruction confirming execution, entering the step of executing the operation.
Further, the judgment module includes:
a judging unit for judging whether the terminal is in automatic operation mode;
a deciding unit for, if not in automatic operation mode, deciding that the user's confirmation is needed.
Further, the operation further includes playing music; the execution module includes:
a first determination unit for determining the music category to be played according to the emotion recognition result;
a first searching unit for searching the user's music records or favourites list for specific music of that category;
a first playing unit for playing the specific music.
Further, the execution module further includes:
a second playing unit for, if no specific music is found, intelligently recommending and playing music of the corresponding category.
Further, when the operation is social sharing, the execution module includes:
a second determination unit for determining a corresponding preset sharing template according to the emotion recognition result;
a sharing unit for carrying out social sharing according to the preset sharing template.
Further, when the operation is data statistics, the execution module includes:
a first acquisition unit for obtaining scene information at the time the emotion recognition result is generated, the scene information including at least one of time information, geographical location information and photo information;
an allocation unit for assigning corresponding storage labels to the emotion recognition result, time information, geographical location information and photo information;
a storage unit for storing the emotion recognition result, time information, geographical location information and photo information in association with the storage labels.
Further, the operation further includes sending a distress signal to a medical institution; the execution module includes:
a second acquisition unit for obtaining the user's current geographical location information;
a second searching unit for searching for medical institutions within a preset distance according to the current geographical location information;
a transmission unit for sending the distress signal to the medical institution.
The invention also provides an intelligent terminal, including a memory and a processor, the memory storing a computer program, wherein the processor implements the steps of any one of the above methods when executing the computer program.
Compared with the prior art, the beneficial effects of the present invention are as follows. The invention proposes an emotion recognition application method and system, in which the application method includes obtaining the user's current emotion recognition result and executing a corresponding operation according to it, the operation including social sharing and/or data statistics. By automatically and quickly detecting the user's emotional changes and taking different corresponding actions, a user's positive moods and activity updates can be shared with friends in time, guiding the user toward a positive life, or the user can be comforted promptly through timely sharing. In addition, by recording the user's mood results, the user can better understand his or her everyday emotional state while sharing life with others; based on the records over a period of time, the user can carry out targeted, optimised management of his or her emotional state, and memorable moments can be recalled, thereby improving the user's life experience and quality of life. The method is simple, practical, convenient and efficient.
Brief description of the drawings
Fig. 1 is a schematic diagram of the steps of an embodiment of the emotion recognition application method of the present invention;
Fig. 2 is a schematic block diagram of a first embodiment of the emotion recognition application system of the present invention;
Fig. 3 is a schematic block diagram of a second embodiment of the emotion recognition application system of the present invention;
Fig. 4 is a schematic block diagram of the judgment module in Fig. 3 of the emotion recognition application system of the present invention;
Fig. 5 is a schematic block diagram of a first embodiment of the execution module in Fig. 2 of the emotion recognition application system of the present invention;
Fig. 6 is a schematic block diagram of a second embodiment of the execution module in Fig. 2;
Fig. 7 is a schematic block diagram of a third embodiment of the execution module in Fig. 2;
Fig. 8 is a schematic block diagram of a fourth embodiment of the execution module in Fig. 2;
Fig. 9 is a schematic block diagram of a fifth embodiment of the execution module in Fig. 2.
The realisation of the objects, functional features and advantages of the present invention will be further described with reference to the accompanying drawings in conjunction with the embodiments.
Specific embodiment
It should be appreciated that the specific embodiments described herein are merely illustrative of the present invention and are not intended to limit it.
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention.
It should be noted that directional indications in the embodiments of the present invention (such as up, down, left, right, front, rear, etc.) are only used to explain the relative positional relationships, movements and the like between components in a particular posture (as shown in the figures); if that particular posture changes, the directional indication changes accordingly. A connection may be a direct connection or an indirect connection.
In addition, descriptions involving "first" and "second" in the present invention are for descriptive purposes only and shall not be understood as indicating or implying relative importance or implicitly indicating the number of the technical features concerned. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. Moreover, the technical solutions of the embodiments may be combined with each other, provided the combination can be realised by those of ordinary skill in the art; when a combination of technical solutions is contradictory or cannot be realised, it shall be deemed not to exist and not to fall within the protection scope claimed by the present invention.
As shown in Fig. 1, an emotion recognition application method comprises:
S1: obtaining the user's current emotion recognition result;
S2: executing a corresponding operation according to the emotion recognition result, the operation including at least social sharing and/or data statistics.
In step S1 of obtaining the user's current emotion recognition result, in some embodiments the user's eyeball information may be scanned and captured in real time by a camera to generate a scan image. In other embodiments, eyeball information scanned and captured by other terminals may be received over a network; the other terminals may be a video camera, a smart mirror or an infrared imaging sensor, and the network may be a wireless network, Bluetooth, NFC (Near Field Communication), etc. The acquired scan image is then analysed with expression analysis software to extract the user's eyeball features at that moment. The general relationships between a user's eyeball information and mood are, for example, as follows: when recalling something, the pupils naturally move to the left; when thinking something over, they involuntarily move to the right. When a person encounters a disgusting, irritating or unpleasant stimulus, the pupils involuntarily contract; conversely, a pleasant stimulus makes the pupils dilate automatically; and when the subject feels frightened, pleased, fond, excited or thrilled, the pupils can dilate to up to four times their usual size. If the pupils show no change at all, the person holds an indifferent or bored attitude toward what is being viewed. Studies suggest that about 90 percent of people fit these relationships, so the accuracy of the analysis is high.
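The pupil-to-mood relationships described above could be turned into a coarse classifier keyed on pupil diameter relative to a per-user baseline. The patent only states the directions of change (contraction for aversive stimuli, dilation for pleasant ones, strong dilation up to ~4x for fear/excitement, no change for indifference); the concrete thresholds below are illustrative assumptions.

```python
def classify_pupil_response(baseline_mm, current_mm):
    """Coarse mood cue from pupil size relative to a per-user baseline.

    Thresholds (2.0, 1.1, 0.9) are hypothetical; the source text gives
    only the qualitative directions of change.
    """
    ratio = current_mm / baseline_mm
    if ratio >= 2.0:
        return "strong arousal (fear/excitement)"  # pronounced dilation
    if ratio > 1.1:
        return "pleasant stimulus"                 # mild dilation
    if ratio < 0.9:
        return "aversive stimulus"                 # contraction
    return "indifferent"                           # essentially no change
```

A real system would combine this cue with gaze direction and facial features, as the next paragraph describes.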
When it is further necessary to judge which specific emotion the subject is in — fear, pleasure, fondness, excitement or thrill — facial recognition techniques may additionally be combined, such as identifying changes in mouth shape and facial muscles. Furthermore, the user's eyeball information and facial information may be combined for recognition to obtain a more accurate emotion recognition result, so that the next step can be carried out according to that result.
In step S2 of executing a corresponding operation according to the emotion recognition result, after the intelligent terminal obtains the emotion recognition result from the scanned eyeball information and/or facial information of the user, it executes the corresponding operation according to that result. In some embodiments, the intelligent terminal carries out social sharing according to the recognised result: the recognised mood of the user is shared in a particular manner through a social application, so that the user's relatives and friends learn the user's mood. This enriches the content of social sharing and brings more frequent and deeper emotional interaction between relatives and friends; every user can care more about others and receive others' care. In other embodiments, the intelligent terminal may record the recognised emotion recognition results and compile statistics on them. For example, at least one of the time information, geographical location information or photo information at the time the emotion recognition result was generated is first obtained. Since the emotion recognition result is obtained by scanning an image of the user, and that scan image records the user's emotional state at the time, it is worth preserving for the user. The scan image is therefore stored as a photo and recorded in the form of photo information; the scan image may be a real-time picture of the user, the real-time scene around the user, a combination of the two, or another existing photo. For example, under a stored photo the display may read: 3 August 2015, 4:25 pm, Guangzhou. Corresponding storage labels are then assigned to the emotion recognition result, the time information, the geographical location information and the photo information; a storage label may be a mood label such as "sad" or "happy", a geographical label such as "Guangzhou" or "Shenzhen", or a label of another form. In some embodiments, multiple different storage labels may be assigned to the emotion recognition result, time information, geographical location information and corresponding photo information to facilitate multiple classifications. Finally the emotion recognition result, time information, geographical location information and storage labels are stored in association, establishing a personal mood warehouse for the user, which records moods and makes them easy to review. For example, the user can search by storage label: through mood labels such as "sad" or "happy" the user can review a specific past mood state; through geographical labels such as "Guangzhou" or "Shenzhen" the user can review past activity in a certain place; and through time labels such as "2015" or "2016" the user can review activity in a certain period. Combined with the stored photo information, past activities are presented to the user in both pictures and text, and combining multiple labels for compound filtering allows a precise, targeted review.
In some embodiments, before step S2 of executing the corresponding operation according to the emotion recognition result, the method further includes:
S01: judging whether the operation needs the user's confirmation;
S02: if so, generating a confirmation reminder and pushing it to the user, the confirmation reminder prompting the user to confirm execution of the operation;
S03: after receiving the user's instruction confirming execution, entering the step of executing the operation.
In step S01 of judging whether the operation needs the user's confirmation: after the emotion recognition result is obtained and the operation corresponding to it is determined, the terminal first confirms, before executing, whether the operation requires the user's confirmation — that is, whether further authorisation by the user is needed before the corresponding operation can be executed. Operations corresponding to emotion recognition results are classified by importance and urgency, and different operations require different permissions. This both prevents misoperation by the system and respects the user's personal wishes, giving the user a high degree of operational freedom.
In step S02 of generating a confirmation reminder and pushing it to the user if needed: if it is judged that the operation corresponding to the current emotion recognition result can only be executed with the user's confirmation — for example playing corresponding music according to the emotion recognition result, or sharing a status update with relatives and friends: operations that concern the user but are not urgent — then a confirmation reminder is generated and pushed to notify the user, respecting the user's personal wishes. In some embodiments, the confirmation reminder is pushed directly to the screen the user is watching and awaits the user's confirmation. In other embodiments, the confirmation reminder is pushed to the screen of an intelligent terminal the user has bound in advance and awaits the user's confirmation.
In step S03 of entering the step of executing the operation after receiving the user's instruction confirming execution: after receiving the instruction, the system executes the corresponding operation. In some embodiments, the confirmation reminder is confirmed manually by the user, which is safe, accurate and fully respects the user's wishes. In other embodiments, the user can pre-authorise certain operations, such as automatic music playback, for example by adding them to a trust list, so that manual confirmation is not needed every time; the confirmation reminder is then confirmed automatically, making use more convenient. In still other embodiments, if it is judged that the operation corresponding to the current emotion recognition result can be executed without the user's confirmation — for example, when the emotion recognition result suggests the user may be in a state of illness onset or another emergency and an ambulance must be dialled automatically, a more urgent kind of operation — then, because the user may be unable to confirm the reminder in time or at all, such operations fall within the range that can be executed without the user's confirmation. For example, such operations are set to automatic operation mode, which is equivalent to pre-authorising them, so as to better safeguard the user's health and life and better realise the positive effect of emotion recognition. In some embodiments, even though the user's confirmation is not needed, a waiting period, for example 30 s, can be shown on the screen; if the user still has not cancelled after the 30 s wait, the corresponding operation is executed automatically according to the emotion recognition result. The waiting period is set to prevent misoperation and thus avoid wasting medical or police resources.
In some embodiments, step S01 of judging whether the operation needs the user's confirmation includes:
S011: judging whether the terminal is in automatic operation mode;
S012: if not in automatic operation mode, deciding that the user's confirmation is needed.
In step S011 of judging whether the terminal is in automatic operation mode: automatic operation mode refers to permissions granted in advance. In some embodiments, automatic operation mode includes a trust list, and operations added to the trust list can be executed automatically without a manual confirmation reminder. Setting an automatic operation mode spares the user from confirming manually every time and is convenient to use.
In step S012 of deciding that the user's confirmation is needed if not in automatic operation mode: if it is detected that the current operation is not in automatic operation mode, or the operation required by the emotion recognition result is not in the trust list, then it is decided that the user's confirmation is needed, and a confirmation reminder is generated to notify the user, respecting the user's personal wishes. In some embodiments, the confirmation reminder is pushed directly to the screen the user is watching, awaiting the user's confirmation; in others, it is pushed to the screen of an intelligent terminal the user has bound in advance.
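The S011/S012 check reduces to a two-condition predicate: an operation skips manual confirmation only when automatic mode is active and the operation is on the trust list. This one-liner is a sketch; the signature and names are assumptions.

```python
def needs_user_confirmation(operation, automatic_mode, trust_list):
    # Confirmation is skipped only when automatic operation mode is on
    # AND the operation was pre-authorised in the trust list (S011/S012).
    return not (automatic_mode and operation in trust_list)
```

Either condition failing routes the flow back to the confirmation reminder of step S02.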
In some embodiments, the operation further includes playing music; step S2 of executing the corresponding operation according to the emotion recognition result includes:
S21: determining the music category to be played according to the emotion recognition result;
S22: searching the user's music records or favourites list for specific music of that category;
S23: playing the specific music.
In step S21 of determining the music category to be played according to the emotion recognition result: after the user's emotion recognition result is obtained, when the result shows that the user's mood has reached a corresponding threshold, a corresponding music category is determined for subsequent automatic playback, soothing the user's mood by changing the user's background environment. For example, when the result shows the user is currently sad, a cheerful music category is chosen to make the user feel happier to some extent; when the result shows the user is dejected, an impassioned music category is chosen so the user feels inspired and stimulated; and when the result shows the user is over-excited, a soft music category is chosen to calm the user, preventing extreme joy from turning into sorrow. In other embodiments, besides playing a corresponding music category, a corresponding video genre may also be played.
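The mood-to-category pairings named above (sad → cheerful, dejected → impassioned, over-excited → soft) plus the trigger threshold suggest a simple lookup. The dictionary layout and the 0.6 threshold value are illustrative assumptions; the patent specifies only the pairings and the existence of a threshold.

```python
# Hypothetical mapping; the source names the pairings but no data structure.
MOOD_TO_CATEGORY = {
    "sad": "cheerful",
    "dejected": "impassioned",
    "excited": "soft",
}

def pick_music_category(emotion, intensity, threshold=0.6):
    # Only intervene once the detected mood passes the trigger threshold.
    if intensity < threshold:
        return None
    return MOOD_TO_CATEGORY.get(emotion)
```

Returning `None` below the threshold models the method doing nothing when the mood is not pronounced enough to warrant playback.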
In step S22 of searching the user's music records or favourites list for specific music of the category: after the corresponding music category has been determined according to the emotion recognition result, that category is first searched for within the user's music records or favourites list, because if the music played is something the user does not like, the effect may be greatly diminished. Playing the corresponding music from the user's records or favourites list allows the user's favourite music within the category to be played more accurately, better calming the user's mood.
In some embodiments, the user can also set the intelligent terminal so that, after the corresponding music category has been determined according to the emotion recognition result, intelligent recommendation playback proceeds directly by category rather than being limited to searching the user's records or favourites list, enabling the user to discover more music he or she may like.
In step S23 of playing the specific music: after music of the corresponding category has been found in the user's records or favourites list, that specific music is played automatically. In some embodiments, tracks are played in descending order of the user's play frequency, so that what plays first is what the user is most familiar with and fond of, raising acceptance from the outset and better calming the mood.
In some embodiments, after step S22 of searching the user's music records or favourites list for specific music of the category, the method includes:
S24: if no specific music is found, intelligently recommending and playing music of the corresponding category.
In step S24 of intelligently recommending and playing music of the corresponding category if no specific music is found: in some embodiments the user may listen to little music, or have no preference within the corresponding category and have heard few of its specific tracks, so that the category cannot be found in the user's music records or favourites list. In that case an intelligent recommendation playback mode is started, selecting more popular tracks within the category for playback, calming the user's mood.
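Steps S22–S24 combine into one selection routine: prefer the user's own library (most-played first, per step S23), and fall back to a popular recommendation when the category is absent. All data structures and names below are illustrative assumptions.

```python
def select_track(category, user_library, popular_by_category):
    """Pick a track of `category` from the user's records/favourites,
    most-played first (S22/S23); fall back to a popular
    recommendation when nothing matches (S24)."""
    matches = [t for t in user_library if t["category"] == category]
    if matches:
        return max(matches, key=lambda t: t["play_count"])["title"]
    recs = popular_by_category.get(category, [])
    return recs[0] if recs else None

library = [
    {"title": "Song A", "category": "cheerful", "play_count": 12},
    {"title": "Song B", "category": "cheerful", "play_count": 30},
    {"title": "Song C", "category": "soft", "play_count": 5},
]
popular = {"impassioned": ["Anthem X"]}
```

With this library, a "cheerful" request plays the user's most-played cheerful track, while an "impassioned" request falls through to the recommendation list.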
In some embodiments, operate for social activity share when, according to Emotion identification result execute corresponding operation the step of S2, further includes:
S25: determining a corresponding preset sharing template according to the emotion recognition result;
S26: performing social sharing according to the preset sharing template.
In step S25 described above, of determining the corresponding preset sharing template according to the emotion recognition result: in addition to playing the corresponding music category, mood sharing may also be performed according to the emotion recognition result, for example via WeChat, Weibo, Facebook, SMS or other channels. In some embodiments, the user pre-edits the preset sharing templates, with different emotion recognition results corresponding to different templates, so that the shared result appears natural rather than abrupt.
In step S26 described above, of performing social sharing according to the preset sharing template: after the specific sharing template corresponding to the emotion recognition result has been determined, status sharing is performed according to that template, so that relatives and friends can learn the user's emotional state at the first moment and share the user's happiness or bear the sadness together, which benefits the user's mental health.
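Steps S25 and S26 amount to a lookup from emotion recognition result to a pre-edited template, then filling it in for posting. The template strings, emotion keys, and helper name below are hypothetical placeholders under that reading.

```python
# Hypothetical sketch of S25/S26: preset sharing templates keyed by the
# emotion recognition result; the filled-in text is what would be posted
# to a social channel (WeChat, Weibo, etc.).

SHARING_TEMPLATES = {
    "happy": "Feeling great today! {note}",
    "sad": "Having a rough day... {note}",
}
DEFAULT_TEMPLATE = "Just checking in. {note}"

def build_share_text(emotion, note=""):
    """S25: pick the template for this emotion; S26: fill it for sharing."""
    template = SHARING_TEMPLATES.get(emotion, DEFAULT_TEMPLATE)
    return template.format(note=note).strip()
```

A per-emotion template is what keeps the shared result "natural rather than abrupt": the wording already matches the detected mood before any user text is added.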
In some embodiments, when the operation is data statistics, step S2 of executing the corresponding operation according to the emotion recognition result comprises:
S27: obtaining scene information at the time the emotion recognition result is generated, the scene information including at least one of time information, geographical location information and photo information;
S28: assigning corresponding storage labels to the emotion recognition result, the time information, the geographical location information and the photo information;
S29: storing the emotion recognition result, the time information, the geographical location information, the photo information and the storage labels in association with one another.
In step S27 described above, of obtaining the scene information at the time the emotion recognition result is generated: after the corresponding operation has been executed according to the emotion recognition result, at least one of the time information, geographical location information and photo information from the time the result was generated is obtained. Since the emotion recognition result is obtained by scanning an image of the user, the scanned image itself records the user's emotional state at that moment and is worth saving for the user. The scanned image is therefore stored as a photo and recorded in the form of photo information; the scanned image may be a live picture of the user, the real-time scene the user is in, a combination of the two, or another existing photo. For example, under one stored photo the display reads: August 3, 2015, 4:25 p.m., Guangzhou.
In step S28 described above, of assigning corresponding storage labels to the emotion recognition result, the time information, the geographical location information and the photo information: after the time information, geographical location information and photo information of the operation corresponding to the emotion recognition result have been obtained, corresponding storage labels are assigned to them. A storage label may be a mood label, such as "sad" or "happy", a geographical label, such as "Guangzhou" or "Shenzhen", or a label of another form. In some embodiments, multiple different storage labels may be assigned to the emotion recognition result, the time information and the geographical location information, to facilitate multiple classifications.
In step S29 described above, of storing the emotion recognition result, the time information, the geographical location information and the storage labels in association with one another: after corresponding storage labels have been assigned, these items are stored in association, building a personal mood warehouse for the user, for recording and convenient review. For example, the user can search by storage label: mood labels such as "sad" and "happy" let the user review past states of a specific mood; geographical labels such as "Guangzhou" and "Shenzhen" let the user review past life in a certain place; time labels such as "2015" and "2016" let the user review past life in a certain period. Combined with the stored photo information, the recorded past life is presented to the user in pictures and text, and combining multiple labels enables accurate, specific review, recording the user's life. In some embodiments, at a specific interval, such as a season or a year, or whenever the user leaves a specific place, a summary of the emotion recognition results is produced from the mood warehouse — for example, a special album the user can conveniently review and share, or a selection of the information from when the user's mood was best in the season or year (such as the time, place and photos of the happiest moments of the year), guiding the user to remember good times. The user can also adjust his or her life according to the review offered by the data statistics: if the statistics show that the user was mostly in a positive emotional state over a past period, the current lifestyle can be maintained; if they show that the user was mostly in a negative emotional state, something in the current lifestyle needs adjusting.
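Steps S27–S29 describe labeled, associated storage plus label-based lookup. A minimal in-memory sketch of such a "mood warehouse" might look like this; the class name, record fields, and the convention of deriving a year label from the time string are all assumptions for illustration.

```python
# Hypothetical sketch of the mood warehouse (steps S27-S29): each record is
# stored together with a set of storage labels (mood, place, year, ...), and
# review is a search for records carrying all requested labels.

class MoodWarehouse:
    def __init__(self):
        self.records = []

    def store(self, emotion, time_info, location, photo=None, extra_labels=()):
        """S28/S29: derive labels and store everything in association."""
        labels = {emotion, location, time_info[:4]} | set(extra_labels)
        self.records.append({
            "emotion": emotion, "time": time_info,
            "location": location, "photo": photo, "labels": labels,
        })

    def search(self, *labels):
        """Review: return records carrying every requested label."""
        wanted = set(labels)
        return [r for r in self.records if wanted <= r["labels"]]

warehouse = MoodWarehouse()
warehouse.store("happy", "2015-08-03 16:25", "Guangzhou", photo="scan_001.jpg")
warehouse.store("sad", "2016-01-01 10:00", "Shenzhen")
hits = warehouse.search("happy", "Guangzhou")  # combined-label review
```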
In some embodiments, the operation further includes sending a distress signal to a medical institution; step S2 of executing the corresponding operation according to the emotion recognition result comprises:
S210: obtaining the user's current geographical location information;
S211: searching for medical institutions within a preset distance according to the current geographical location information;
S212: sending a distress signal to a medical institution.
In step S210 described above, of obtaining the user's current geographical location information: if, according to the emotion recognition result, the user is detected to be in an emergency and in need of first aid, the user's current geographical location information is obtained for use in the subsequent distress signal. For example, the scanned image acquired in step S1 is analyzed with expression analysis software: the eyeball information and facial information in the scanned image are analyzed to obtain the user's eyeball and facial features at that moment. In some embodiments, if the analysis of these features shows that the user is in an emotional state of pain of the highest level — for example, the degree of abnormal pupil change reaches a threshold, the degree of facial muscle distortion reaches a threshold, and liquid flows from the corner of the mouth — the user is judged to be in an emergency, possibly ill or injured. In some embodiments, this emergency detection function can be turned on or off at the user's choice: turning it on plays a large remedial role for users with a relevant medical history, while users without such a history may choose to turn it off, fully respecting the user's personal wishes.
In step S211 described above, of searching for medical institutions within a preset distance according to the current geographical location information: after the user's current geographical location information has been obtained, medical institutions within the preset distance are searched for, so that the user can be treated as soon as possible.
In step S212 described above, of sending a distress signal to a medical institution: after a medical institution near the user has been found, an emergency call is placed to that institution so that the user can obtain first aid. In some embodiments, the user can preset a preferred medical institution; when the user is detected to be in an emergency needing first aid, the emergency call is placed automatically to the preset institution, rescuing the user in time. In other embodiments, while the emergency call is being placed to the medical institution, a distress signal may also be sent automatically to a preset emergency contact, or a rescue call dialed, so that relatives and friends learn that the user needs to be rescued.
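Step S211's "medical institutions within a preset distance" implies a geographic distance filter around the user's position. A plausible sketch, using the standard haversine great-circle formula (the institution list, coordinates, and 5 km default are illustrative assumptions):

```python
import math

def haversine_km(a, b):
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(h))  # mean Earth radius ~6371 km

def nearby_institutions(user_pos, institutions, max_km=5.0):
    """S211 sketch: names of institutions within max_km, nearest first."""
    hits = [(haversine_km(user_pos, inst["pos"]), inst["name"])
            for inst in institutions]
    return [name for d, name in sorted(hits) if d <= max_km]

hospitals = [
    {"name": "City Hospital", "pos": (23.14, 113.27)},   # ~1.5 km away
    {"name": "Far Clinic", "pos": (22.54, 114.06)},      # ~100 km away
]
reachable = nearby_institutions((23.13, 113.26), hospitals)
```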
The invention proposes an application method of emotion recognition, comprising: S1: obtaining the user's current emotion recognition result; S2: executing a corresponding operation according to the emotion recognition result, the operation including at least social sharing and/or data statistics. By detecting and obtaining the user's emotional changes, different corresponding operations are taken automatically and quickly. Sharing mood according to the emotion recognition result lets the user share positive moods and life with friends in time, guiding a positive life, or lets the user be comforted promptly through timely sharing. In addition, by recording the user's mood results, the user can share life with others while better understanding his or her everyday emotional state and, from the records over a period, carry out targeted optimized management of that emotional state; the records can also recall memorable moments. The method thus improves people's life experience and quality of life, and is simple, practical, convenient and fast.
As shown in Fig. 2, the invention also provides an application system of emotion recognition, comprising:
an obtaining module 10 for obtaining the user's current emotion recognition result;
an execution module 20 for executing a corresponding operation according to the emotion recognition result, the operation including at least social sharing and/or data statistics.
In the obtaining module 10 described above, in some embodiments, the intelligent terminal scans and captures the user's eyeball information in real time through a camera and generates a scanned image. In other embodiments, the intelligent terminal receives, over a network, eyeball information scanned and captured by other terminals; the other terminals may be video cameras, smart mirrors or infrared imaging sensors, and the network may be a wireless network, Bluetooth, NFC (Near Field Communication), etc. The scanned image acquired by the obtaining module 10 is then analyzed with expression analysis software, and the eyeball information in the scanned image yields the user's eyeball features at that moment. The general relationship between a user's eyeball information and mood is as follows: when recalling something, the pupils naturally move to the left; when thinking something over, they involuntarily move to the right. When a person encounters a disgusting or irritating stimulus, the pupils contract involuntarily; conversely, a pleasing stimulus makes the pupils dilate automatically, and when the subject feels frightened, pleased, fond, excited or thrilled, the pupils can dilate to four times their usual size. If the pupils show no change at all, the subject holds an indifferent or bored attitude toward what is seen. Studies have shown that ninety percent of people fit this relationship, so the accuracy of the analysis is high. When it is further necessary to judge which specific mood — fear, pleasure, fondness, excitement or thrill — the user is in, the obtaining module 10 can additionally combine facial recognition techniques, such as identifying changes in mouth shape and facial muscles. By combining the user's eyeball information and facial information for recognition, a more accurate emotion recognition result is obtained, on which the next step is carried out.
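The pupil/mood relationship described above can be reduced to a toy decision rule over a dilation ratio (current pupil diameter divided by a baseline). The thresholds, category names, and function below are illustrative assumptions — real expression-analysis software would be far more involved.

```python
# Toy heuristic following the pupil/mood relationship in the text:
# contraction -> aversion (disgust/anger), strong dilation (up to ~4x) ->
# strong arousal (fear/joy/excitement), no change -> indifference.

def classify_pupil_response(dilation_ratio, no_change_band=0.05):
    """Map a pupil dilation ratio (current / baseline) to a coarse mood class."""
    if abs(dilation_ratio - 1.0) <= no_change_band:
        return "indifferent"        # no change: unconcerned or bored
    if dilation_ratio < 1.0:
        return "aversion"           # involuntary contraction
    if dilation_ratio >= 4.0:
        return "strong arousal"     # frightened/pleased/excited peak
    return "pleasant"               # moderate automatic dilation
```

As the text notes, a real system would combine this eyeball signal with facial features (mouth shape, muscle changes) before committing to a specific emotion.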
After the obtaining module 10 obtains the emotion recognition result from the scanned and captured eyeball information and/or facial information of the user, the execution module 20 executes a corresponding operation according to that result. In some embodiments, the execution module 20 performs social sharing according to the recognized emotion recognition result, i.e., shares the recognized mood result of the user in a specific way through a social application, so that the user's relatives and friends learn the user's mood. This enriches the content of social sharing and increases frequent, deep emotional interaction among relatives and friends, so that each user can care more about others and receive others' care. In other embodiments, the execution module 20 records the recognized emotion recognition results and compiles data statistics: first, at least one of the time information, geographical location information and photo information at the time the emotion recognition result was generated is obtained. Since the emotion recognition result is obtained by scanning an image of the user, the scanned image records the user's emotional state at that moment and is worth saving for the user; it is therefore stored as a photo and recorded in the form of photo information. The scanned image may be a live picture of the user, the real-time scene the user is in, a combination of the two, or another existing photo — for example, under one stored photo the display reads: August 3, 2015, 4:25 p.m., Guangzhou. Corresponding storage labels are then assigned to the emotion recognition result, the time information and the geographical location information; a storage label may be a mood label such as "sad" or "happy", a geographical label such as "Guangzhou" or "Shenzhen", or a label of another form. In some embodiments, multiple different storage labels may be assigned to the emotion recognition result, the time information, the geographical location information and the corresponding photo information, to facilitate multiple classifications. Finally, the emotion recognition result, the time information, the geographical location information and the storage labels are stored in association, building the user's personal mood warehouse for recording and convenient review; for example, the user can search by storage label — mood labels such as "sad" and "happy" to review past states of a specific mood, geographical labels such as "Guangzhou" and "Shenzhen" to review past life in a certain place, and time labels such as "2015" and "2016" to review past life in a certain period. Combined with the stored photo information, the recorded past life is presented to the user in pictures and text, and combining multiple labels enables accurate, specific review.
In some embodiments, the system further comprises:
a judgment module 30 for judging whether the operation needs the user's confirmation;
a reminding module 40 for, if so, generating a confirmation reminder and pushing it to the user, the confirmation reminder being a reminder to confirm execution of the operation;
a receiving module 50 for, after receiving the user's instruction confirming execution of the operation, proceeding to the step of executing the operation.
Before the execution module 20 executes the corresponding operation, the judgment module 30 first judges whether the operation corresponding to the emotion recognition result requires the user's confirmation before it can be executed — that is, whether further authorization from the user is needed. The operations corresponding to emotion recognition results are classified by their importance and urgency, and different operations require different permissions; this both prevents system misoperation and respects the user's personal preferences, giving the user a higher degree of operational freedom.
If the judgment module 30 judges that the operation corresponding to the current emotion recognition result can only be executed with the user's confirmation — for example, playing corresponding music according to the emotion recognition result, or sharing status with relatives and friends according to the emotion recognition result: operations that concern the user but are not urgent — then the reminding module 40 generates and pushes a confirmation reminder to notify the user, respecting the user's personal preferences. In some embodiments, the confirmation reminder generated by the reminding module 40 may be pushed directly onto the screen the user is watching, awaiting the user's confirmation. In other embodiments, it may be pushed onto the screen of an intelligent terminal the user has bound in advance, awaiting the user's confirmation.
After the receiving module 50 receives the user's instruction confirming execution of the operation, the system executes the corresponding operation. In some embodiments, the confirmation reminder is confirmed manually by the user, which is safe and precise and fully respects the user's personal preferences. In other embodiments, the user can authorize certain corresponding operations in advance, such as automatic music playback — for example by adding them to a trust list — so that manual confirmation is not needed each time and the confirmation reminder is confirmed automatically, making use more convenient and fast. In further embodiments, if the judgment module 30 judges that the operation corresponding to the current emotion recognition result can be executed without the user's confirmation — for example, when the emotion recognition result shows the user may be ill or in another emergency, and the more urgent operation of automatically dialing an emergency call is needed — then, since the user may be unable to confirm the reminder manually in time, or at all, such corresponding operations are placed in the range that can be executed without the user's confirmation, i.e., set to the automatic operation mode and authorized in advance, so as to better protect the user's health and life safety and bring out the positive effect of emotion recognition. In some embodiments, although no confirmation is needed, a waiting time, such as 30 s, may still be shown on the screen; if after the waiting time the receiving module 50 has still not received an instruction from the user cancelling the operation, the execution module 20 automatically executes the corresponding operation according to the emotion recognition result. The waiting time is set to prevent misoperation, so as not to waste medical or police resources.
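The confirmation logic above — trust list, urgent auto-execution, and a cancel window — can be condensed into one decision function. The operation names, the three outcome strings, and the `decide_execution` signature are hypothetical; timing of the 30 s window is abstracted into the presence or absence of a user response.

```python
# Hypothetical sketch of the judgment/receiving flow: trusted and urgent
# operations run without manual confirmation, but an urgent operation still
# honors an explicit cancel received during the waiting window.

def decide_execution(operation, trust_list, urgent_ops, user_response=None):
    """Return 'execute', 'await_confirmation', or 'cancelled'."""
    if operation in urgent_ops:
        # Auto mode with a cancel window (e.g. 30 s) to prevent misoperation.
        return "cancelled" if user_response == "cancel" else "execute"
    if operation in trust_list:
        return "execute"          # pre-authorized, no reminder needed
    # Otherwise a confirmation reminder is pushed and we wait for the user.
    return "execute" if user_response == "confirm" else "await_confirmation"
```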
In some embodiments, the judgment module 30 comprises:
a judging unit 301 for judging whether the automatic operation mode is active;
a judging unit 302 for, if the automatic operation mode is not active, deciding that the user's confirmation is needed.
In the judging unit 301 described above, the automatic operation mode refers to authorization permissions granted in advance. In some embodiments, the automatic operation mode includes a trust list; operations added to the trust list do not need manual confirmation by the user, and the confirmation reminder is handled automatically. Providing an automatic operation mode spares the user from confirming manually each time, which is convenient and fast.
If the judging unit 301 detects that the current operation is not in the automatic operation mode, or that the corresponding operation required by the emotion recognition result is not in the trust list, the judging unit 302 decides that the user's confirmation is needed, and a confirmation reminder is generated to notify the user, respecting the user's personal preferences. In some embodiments, the confirmation reminder may be pushed directly onto the screen the user is watching, awaiting the user's confirmation. In other embodiments, it may be pushed onto the screen of an intelligent terminal the user has bound in advance, awaiting the user's confirmation.
In some embodiments, the operation further includes playing music; the execution module 20 comprises:
a first determination unit 201 for determining the music category to be played according to the emotion recognition result;
a first searching unit 202 for searching the user's play history or favorites list for specific music of the corresponding category;
a first playing unit 203 for playing the specific music.
After the obtaining module 10 obtains the user's emotion recognition result, when the result shows that the user's mood reaches a corresponding threshold, the first determination unit 201 determines the corresponding music category for subsequent automatic playback, soothing the user's mood by changing the background environment. For example, when the emotion recognition result shows that the user is currently sad, the first determination unit 201 decides to play a cheerful music category to make the user feel happier to some extent; when the result shows that the user is dejected, it decides to play a stirring music category to inspire and energize the user; and when the result shows that the user is overexcited, it decides to play a soft music category to calm the user's mood and prevent extreme joy from turning to sorrow.
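The three example mappings above (sad → cheerful, dejected → stirring, overexcited → soft), gated by a mood-intensity threshold, can be written directly as a lookup. The threshold value, intensity scale, and emotion keys are illustrative assumptions.

```python
# Hypothetical sketch of the first determination unit 201: map the recognized
# emotion to a counterbalancing music category once the mood crosses a
# threshold, per the examples in the text.

EMOTION_TO_CATEGORY = {
    "sad": "cheerful",        # lift a low mood
    "dejected": "stirring",   # inspire and energize
    "overexcited": "soft",    # calm things down
}

def category_for(emotion, intensity, threshold=0.6):
    """Return a music category, or None if the mood is below threshold
    or has no mapped category (no playback is triggered)."""
    if intensity < threshold:
        return None
    return EMOTION_TO_CATEGORY.get(emotion)
```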
After the first determination unit 201 has determined the corresponding music category according to the emotion recognition result, the first searching unit 202 first searches the user's play history or favorites list for that category, because if the music played is not to the user's liking, the effect may be greatly diminished; playing the user's own music of the corresponding category from the play history or favorites list plays the user's favorite music within that category more accurately and better calms the user's mood. In some embodiments, the user may configure the intelligent terminal so that, after the first determination unit 201 has determined the category, intelligent recommendation and playback are performed directly within that category, without being limited to searching the user's play history or favorites list, enabling the user to discover more music that he or she may like.
After the first searching unit 202 finds the corresponding music category in the user's play history or favorites list, the first playing unit 203 automatically plays the specific music of that category. In some embodiments, the first playing unit 203 plays tracks in descending order of the user's play frequency, so that the user's most familiar and best-loved tracks are played first, raising the user's initial acceptance and better calming the mood. In other embodiments, besides the corresponding music category, the first playing unit 203 may also play a corresponding video genre.
In some embodiments, the execution module 20 further comprises:
a second playing unit 204 for, if no specific music is found, performing intelligent recommendation and playing music of the corresponding category.
In some embodiments, the user may listen to little music, or may have no preference for the corresponding music category, so that the first searching unit 202 cannot find specific music of that category in the user's play history or favorites list. In that case the intelligent recommendation mode is activated, and the second playing unit 204 selects more popular tracks in the corresponding category and plays them intelligently, calming the user's mood.
In some embodiments, when the operation is social sharing, the execution module 20 further comprises:
a second determination unit 205 for determining a corresponding preset sharing template according to the emotion recognition result;
a sharing unit 206 for performing social sharing according to the preset sharing template.
In the execution module 20, in addition to playing the corresponding music category through the first playing unit 203 and the second playing unit 204, mood sharing can also be performed according to the emotion recognition result, for example via WeChat, Weibo, Facebook, SMS or other channels. In some embodiments, the user pre-edits the preset sharing templates, and the second determination unit 205 determines different preset templates for different emotion recognition results, so that the shared result appears natural rather than abrupt.
After the second determination unit 205 has determined the specific sharing template corresponding to the emotion recognition result, the sharing unit 206 performs status sharing according to that template, so that relatives and friends can learn the user's emotional state at the first moment and share the user's happiness or bear the sadness together, benefiting the user's mental health.
In some embodiments, when the operation is data statistics, the execution module 20 comprises:
a first acquisition unit 207 for obtaining scene information at the time the emotion recognition result is generated, the scene information including at least one of time information, geographical location information and photo information;
an allocation unit 208 for assigning corresponding storage labels to the emotion recognition result, the time information, the geographical location information and the photo information;
a storage unit 209 for storing the emotion recognition result, the time information, the geographical location information, the photo information and the storage labels in association with one another.
In the execution module 20, the time information, geographical location information and photo information of the corresponding operation can also be obtained by the first acquisition unit 207 — for example, under one stored photo the display reads: August 3, 2015, 4:25 p.m., Guangzhou.
After the first acquisition unit 207 obtains the time information, geographical location information or photo information of the operation corresponding to the emotion recognition result, the allocation unit 208 assigns corresponding storage labels to the emotion recognition result, the time information, the geographical location information and the photo information. A storage label may be a mood label, such as "sad" or "happy", a geographical label, such as "Guangzhou" or "Shenzhen", or a label of another form. In some embodiments, the allocation unit 208 may assign multiple different storage labels to the emotion recognition result, the time information and the geographical location information, to facilitate multiple classifications.
After the allocation unit 208 has assigned corresponding storage labels to the emotion recognition result, the time information and the geographical location information, the storage unit 209 stores the emotion recognition result, the time information, the geographical location information and the storage labels in association, building the user's personal mood warehouse for recording and convenient review. For example, the user can search by storage label: mood labels such as "sad" and "happy" to review past states of a specific mood, geographical labels such as "Guangzhou" and "Shenzhen" to review past life in a certain place, and time labels such as "2015" and "2016" to review past life in a certain period. Combined with the stored photo information, the recorded past life is presented to the user in pictures and text, and combining multiple labels enables accurate, specific review, recording the user's life. In some embodiments, at a specific interval, such as a season or a year, or whenever the user leaves a specific place, a summary of the emotion recognition results is produced from the mood warehouse — for example, a special album the user can conveniently review and share, or a selection of the information from when the user's mood was best in the season or year (such as the time, place and photos of the happiest moments of the year), guiding the user to remember good times. The user can also adjust his or her life according to the review offered by the data statistics: if the statistics show that the user was mostly in a positive emotional state over a past period, the current lifestyle can be maintained; if they show that the user was mostly in a negative emotional state, something in the current lifestyle needs adjusting.
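The periodic "best mood of the year" summary described above amounts to filtering stored records by period and positive emotion, then taking the most intense one. The record shape, the `intensity` field, and the positive-emotion set are assumptions layered on the patent's description.

```python
# Hypothetical sketch of the periodic summary: pick the most intense positive
# mood record of a given year as the highlight (its time, place, and photo
# would then be shown to the user).

def yearly_highlight(records, year, positive=("happy", "excited")):
    """Return the strongest positive record whose time falls in `year`,
    or None if there were no positive records that year."""
    candidates = [r for r in records
                  if r["time"].startswith(year) and r["emotion"] in positive]
    if not candidates:
        return None
    return max(candidates, key=lambda r: r["intensity"])

history = [
    {"time": "2015-08-03", "emotion": "happy", "intensity": 0.9},
    {"time": "2015-09-01", "emotion": "sad", "intensity": 0.95},
    {"time": "2016-02-02", "emotion": "happy", "intensity": 0.99},
]
best_2015 = yearly_highlight(history, "2015")
```

A similar filter over negative emotions would support the lifestyle-adjustment check: comparing the counts of positive versus negative records over a period.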
In some embodiments, the operation further includes sending a distress signal to a medical institution; the execution module 20 further comprises:
a second acquisition unit 210 for obtaining the user's current geographical location information;
a second searching unit 211 for searching for medical institutions within a preset distance according to the current geographical location information;
a transmission unit 212 for sending a distress signal to a medical institution.
If, according to the emotion recognition result, the execution module 20 detects that the user is in an emergency and in need of first aid, the second acquisition unit 210 obtains the user's current geographical location information for use in the subsequent distress signal. For example, the scanned image acquired by the obtaining module 10 is analyzed with expression analysis software; the eyeball information and facial information in the scanned image are analyzed to obtain the user's eyeball and facial features at that moment. In some embodiments, if the analysis of these features shows that the user is in an emotional state of pain of the highest level — for example, the degree of abnormal pupil change reaches a threshold, the degree of facial muscle distortion reaches a threshold, and liquid flows from the corner of the mouth — the user is judged to be in an emergency, possibly ill or injured. In some embodiments, this emergency detection function can be turned on or off at the user's choice: turning it on plays a large remedial role for users with a relevant medical history, while users without such a history may choose to turn it off, fully respecting the user's personal wishes.
After the second acquisition unit 210 obtains the user's current geographic position information, the second searching unit 211 searches for medical institutions within a preset distance of the user, so that the user can be treated as soon as possible.
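Searching for medical institutions within a preset distance reduces to filtering candidate positions by great-circle distance from the user. A sketch assuming a hypothetical dictionary of institutions keyed by name with latitude/longitude coordinates:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points in kilometres."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def institutions_within(user_pos, institutions, max_km=5.0):
    """Return names of institutions within max_km of the user, nearest first."""
    lat, lon = user_pos
    nearby = [(haversine_km(lat, lon, i_lat, i_lon), name)
              for name, (i_lat, i_lon) in institutions.items()]
    return [name for dist, name in sorted(nearby) if dist <= max_km]
```

In practice the candidate list would come from a map or places service rather than a local dictionary; the filtering logic is the same.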
After the second searching unit 211 finds a medical institution near the user, the transmission unit 212 dials that institution's emergency telephone number so that the user can obtain first aid. In some embodiments, the user can preset a preferred medical institution; when the user is detected to be in an emergency state and in need of first aid, the transmission unit 212 automatically dials the emergency number of the preset institution so that the user is rescued in time. In other embodiments, while dialing the medical institution, the transmission unit 212 also automatically sends a distress signal to, or dials, preset emergency contacts, so that relatives and friends learn that the user needs to be rescued.
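The dispatch logic of transmission unit 212 (preferred institution first, otherwise the nearest found, plus notification of preset emergency contacts) can be sketched with injected `dialer` and `messenger` callables; both interfaces are hypothetical stand-ins for the terminal's telephony and messaging services:

```python
def send_distress(dialer, messenger, nearest_institutions,
                  preferred_institution=None, emergency_contacts=()):
    """Dial the user's preferred institution if one was configured,
    otherwise the nearest institution found; in parallel, notify any
    preset emergency contacts. Returns the list of parties reached."""
    target = preferred_institution or (nearest_institutions[0]
                                       if nearest_institutions else None)
    reached = []
    if target is not None:
        dialer(target)            # place the emergency call
        reached.append(target)
    for contact in emergency_contacts:
        messenger(contact, "distress: user may need emergency assistance")
        reached.append(contact)
    return reached
```

Injecting the callables keeps the dispatch policy testable without real telephony; on an actual terminal they would wrap the platform's call and messaging APIs.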
The invention proposes an application system of emotion recognition, comprising: an acquisition module 10 for obtaining the user's current emotion recognition result; and an execution module 20 for executing a corresponding operation according to the emotion recognition result, the operation including at least social sharing and/or data statistics. By detecting the user's emotional changes, different operations are taken automatically and quickly. Sharing the emotion recognition result allows the user's positive moods and life updates to be shared with friends in time, guiding an active life, or lets friends comfort the user promptly. Recording the user's emotion results lets the user share life with others while better understanding his or her own everyday emotional state; from the records accumulated over a period of time, the user can perform targeted optimization and management of his or her emotional state, and memorable moments can be recalled, thereby improving life experience and quality of life in a way that is simple, practical, convenient and efficient.
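The two-module structure (acquisition module 10 feeding execution module 20) can be sketched as a minimal pipeline; the recognizer callable and the emotion-to-operation table below are hypothetical stand-ins for whatever recognizer and operations (sharing, statistics, music, distress) a concrete embodiment provides:

```python
class AcquisitionModule:
    """Module 10: obtains the user's current emotion recognition result.
    The recognizer is an injected callable (hypothetical interface)."""
    def __init__(self, recognizer):
        self.recognizer = recognizer

    def get_emotion(self):
        return self.recognizer()

class ExecutionModule:
    """Module 20: dispatches an operation for the recognized emotion.
    The operation table (label -> callable) is supplied by the caller."""
    def __init__(self, operations):
        self.operations = operations

    def execute(self, emotion):
        op = self.operations.get(emotion)
        return op(emotion) if op else None

def run_pipeline(acquisition, execution):
    """One recognition-then-dispatch cycle, as the system describes."""
    return execution.execute(acquisition.get_emotion())
```

Dispatching through a table rather than hard-coded branches matches the claim language, where the set of operations (social sharing, data statistics, and others) is open-ended.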
The invention also provides an intelligent terminal, including a processor and a computer program; when the computer program is executed by the processor, the steps of any of the above application methods of emotion recognition are realized, comprising: obtaining the user's current emotion recognition result; and executing a corresponding operation according to the emotion recognition result, the operation including at least social sharing and/or data statistics.
Those of ordinary skill in the art will appreciate that all or part of the processes in the above embodiment methods can be completed by a computer program instructing relevant hardware; the computer program can be stored in a non-volatile computer-readable storage medium, and when executed may include the processes of the embodiments of each of the above methods. Any reference to memory, storage, a database or other media used herein and in the embodiments may include non-volatile and/or volatile memory. Non-volatile memory may include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable ROM (EEPROM) or flash memory. Volatile memory may include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link (Synchlink) DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM), etc.
The above description is only a preferred embodiment of the present invention and is not intended to limit the scope of the invention; all equivalent structural or flow transformations made using the description and accompanying drawings, whether applied directly or indirectly in other related technical fields, are included within the scope of protection of the present invention.

Claims (10)

1. An application method of emotion recognition, characterized in that it is applied in an intelligent terminal and comprises:
obtaining a current emotion recognition result of a user;
executing a corresponding operation according to the emotion recognition result, the operation comprising at least social sharing and/or data statistics.
2. The application method of emotion recognition according to claim 1, characterized in that, before the step of executing the corresponding operation according to the emotion recognition result, the method further comprises:
judging whether the operation requires confirmation by the user;
if so, generating a confirmation reminder and pushing it to the user, the confirmation reminder being a reminder to confirm execution of the operation;
after receiving an instruction from the user confirming execution of the operation, entering the step of executing the operation.
3. The application method of emotion recognition according to claim 2, characterized in that the step of judging whether the operation requires confirmation by the user comprises:
judging whether the terminal is in an automatic operation mode;
if it is not in the automatic operation mode, determining that the operation requires confirmation by the user.
4. The application method of emotion recognition according to claim 1, characterized in that the operation further comprises playing music; the step of executing the corresponding operation according to the emotion recognition result comprises:
determining a music category to be played according to the emotion recognition result;
searching the user's music playing record or favorites list for specific music corresponding to the music category;
playing the specific music.
5. The application method of emotion recognition according to claim 4, characterized in that, after the step of searching the user's music playing record or favorites list for specific music corresponding to the music category, the method comprises:
if the specific music is not found, intelligently recommending and playing music of the corresponding music category.
6. The application method of emotion recognition according to claim 1, characterized in that, when the operation is social sharing, the step of executing the corresponding operation according to the emotion recognition result comprises:
determining a corresponding preset sharing template according to the emotion recognition result;
socially sharing the emotion recognition result according to the preset sharing template.
7. The application method of emotion recognition according to claim 1, characterized in that, when the operation is data statistics, the step of executing the corresponding operation according to the emotion recognition result comprises:
obtaining scene information at the time the emotion recognition result is generated, the scene information comprising at least one of time information, geographic position information and photo information;
assigning a corresponding storage label to the emotion recognition result, the time information, the geographic position information and the photo information;
jointly storing, in association, the emotion recognition result, the time information, the geographic position information, the photo information and the storage label.
8. The application method of emotion recognition according to claim 1, characterized in that the operation further comprises sending a distress signal to a medical institution; the step of executing the corresponding operation according to the emotion recognition result comprises:
obtaining current geographic position information of the user;
searching for medical institutions within a preset distance according to the current geographic position information;
sending the distress signal to the medical institution.
9. An application system of emotion recognition, characterized by comprising:
an acquisition module for obtaining a current emotion recognition result of a user;
an execution module for executing a corresponding operation according to the emotion recognition result, the operation comprising at least social sharing and/or data statistics.
10. An intelligent terminal, comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, realizes the steps of the method of any one of claims 1 to 8.
CN201811574475.6A 2018-12-21 2018-12-21 Application method, system and the intelligent terminal of Emotion identification Pending CN109829082A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811574475.6A CN109829082A (en) 2018-12-21 2018-12-21 Application method, system and the intelligent terminal of Emotion identification

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811574475.6A CN109829082A (en) 2018-12-21 2018-12-21 Application method, system and the intelligent terminal of Emotion identification

Publications (1)

Publication Number Publication Date
CN109829082A true CN109829082A (en) 2019-05-31

Family

ID=66859922

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811574475.6A Pending CN109829082A (en) 2018-12-21 2018-12-21 Application method, system and the intelligent terminal of Emotion identification

Country Status (1)

Country Link
CN (1) CN109829082A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110853605A (en) * 2019-11-15 2020-02-28 中国传媒大学 Music generation method and device and electronic equipment
CN111383669A (en) * 2020-03-19 2020-07-07 杭州网易云音乐科技有限公司 Multimedia file uploading method, device, equipment and computer readable storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080190138A1 (en) * 2006-06-23 2008-08-14 Mindful Moods, Llp Calibratable Mood Patch-Apparatus and Method of Use
CN101836219A (en) * 2007-11-01 2010-09-15 索尼爱立信移动通讯有限公司 Generating music playlist based on facial expression
CN102467668A (en) * 2010-11-16 2012-05-23 鸿富锦精密工业(深圳)有限公司 Emotion detecting and soothing system and method
CN104523247A (en) * 2014-12-01 2015-04-22 成都智信优创科技有限公司 Wearable health medical treatment device
CN105719426A (en) * 2016-01-25 2016-06-29 广东小天才科技有限公司 Method and device for sorted calling for help
CN107320114A (en) * 2017-06-29 2017-11-07 京东方科技集团股份有限公司 Shooting processing method, system and its equipment detected based on brain wave


Similar Documents

Publication Publication Date Title
US20210005224A1 (en) System and Method for Determining a State of a User
CN105009556B (en) Intention engine for the enhancing response in interactive remote communication
US9936127B2 (en) Systems and methods for providing attention directing functions in an image capturing device
US11806145B2 (en) Photographing processing method based on brain wave detection and wearable device
US20180101776A1 (en) Extracting An Emotional State From Device Data
US9235263B2 (en) Information processing device, determination method, and non-transitory computer readable storage medium
JP5120777B2 (en) Electronic data editing apparatus, electronic data editing method and program
CN103561652A (en) Method and system for assisting patients
US20100086204A1 (en) System and method for capturing an emotional characteristic of a user
US20110181684A1 (en) Method of remote video communication and system of synthesis analysis and protection of user video images
CN117633868A (en) Privacy protection method and device
CN106126017A (en) Intelligent identification Method, device and terminal unit
CN109416591A (en) Image data for enhanced user interaction
CN108989662A (en) A kind of method and terminal device of control shooting
CN110326300A (en) Information processing equipment, information processing method and program
JP2006165821A (en) Portable telephone
CN111797249A (en) Content pushing method, device and equipment
CN109191776A (en) A kind of calling method, device, terminal, wearable device and storage medium
CN107562952A (en) The method, apparatus and terminal that music matching plays
CN109829082A (en) Application method, system and the intelligent terminal of Emotion identification
CN111797304A (en) Content pushing method, device and equipment
KR102263154B1 (en) Smart mirror system and realization method for training facial sensibility expression
CN108335734A (en) Clinical image recording method, device and computer readable storage medium
CN110892411B (en) Detecting popular faces in real-time media
KR102423257B1 (en) Supporting system for service about processing record data of dementia patient

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20190531