CN110134316A - Model training method, Emotion identification method and relevant apparatus and equipment - Google Patents

Model training method, Emotion identification method and relevant apparatus and equipment Download PDF

Info

Publication number
CN110134316A
CN110134316A · Application CN201910309245.5A · Granted publication CN110134316B
Authority
CN
China
Prior art keywords
touch control
user
control manner
emotional state
touch
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910309245.5A
Other languages
Chinese (zh)
Other versions
CN110134316B (en)
Inventor
李向东
田艳
王剑平
张艳存
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN201910309245.5A priority Critical patent/CN110134316B/en
Publication of CN110134316A publication Critical patent/CN110134316A/en
Priority to PCT/CN2020/084216 priority patent/WO2020211701A1/en
Application granted granted Critical
Publication of CN110134316B publication Critical patent/CN110134316B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416: Control or interface arrangements specially adapted for digitisers
    • G06F3/044: Digitisers, e.g. for touch screens or touch pads, characterised by capacitive transducing means
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

Embodiments of the present application disclose a model training method and an emotion recognition method. The application trains a classification model using machine learning techniques, taking as training data the touch manner a user exhibits when operating a terminal device together with the emotional state corresponding to that touch manner, to obtain an emotion recognition model. In practical applications, the emotion recognition model can then determine the user's current emotional state from the touch manner with which the user operates the terminal device. Because the classification model is trained on the touch manners, and their corresponding emotional states, observed when users operate a particular terminal device, the resulting emotion recognition model is adapted to that device; accordingly, when the terminal device uses the model to identify the user's emotional state, it can do so accurately and in a targeted way from the user's touch manner.

Description

Model training method, Emotion identification method and relevant apparatus and equipment
Technical field
This application relates to the field of computer technology, and in particular to a model training method, an emotion recognition method, and related apparatus and equipment.
Background technique
Nowadays, terminal devices such as smartphones and tablet computers play an increasingly important role in people's daily lives, and the user experience a terminal device delivers has become a key factor by which users judge it. How to provide personalized services for users and improve user experience has therefore become a research focus for terminal device manufacturers.
At present, some terminal devices can identify the user's emotional state and provide personalized services accordingly; whether those services are reasonable depends mainly on the accuracy of the emotional state recognition. The currently common approach of recognizing mood from facial expressions degrades as factors such as ambient light and the relative position between the user's face and the terminal device change; that is, it cannot guarantee that the user's facial expression is identified accurately, so the emotional state inferred from the expression may be inaccurate.
In addition, the prior art also includes methods that determine the user's emotional state from physiological signals. Such a method measures physiological signals such as heart rate, body temperature, and blood pressure with an additional measuring device and then analyzes those signals to determine the emotional state. It requires extra external equipment during its implementation, and using external equipment is burdensome for the user; on the whole, this method does not truly achieve the goal of improving user experience.
Summary of the invention
Embodiments of the present application provide a model training method, an emotion recognition method, and related apparatus and equipment, which can accurately identify the user's emotional state based on a trained emotion recognition model, so that the terminal device can provide more reasonable personalized services according to the identified emotional state.
In view of this, a first aspect of the application provides a model training method, applicable to a terminal device or a server. In the method, the touch manner used when the user operates the terminal device is obtained, the emotional state corresponding to that touch manner is labeled, and the touch manner together with its corresponding emotional state is taken as a training sample. A preset classification model is then trained on such training samples using machine learning techniques, yielding an emotion recognition model adapted to the terminal device; given the touch manner with which the user operates the device, the model determines the corresponding emotional state.
The model training method above can be applied per device: for each terminal device, the touch manners observed when its users operate it, together with the corresponding emotional states, are used to train, with a machine learning algorithm, an emotion recognition model specifically suited to that device. Running on a terminal device the model trained for it ensures that the model can accurately determine the user's emotional state from the touch manner used to operate that device.
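As a concrete illustration of the training step of this first aspect, the following minimal sketch reduces each touch manner to a two-dimensional feature vector (a capacitance-change magnitude as a pressure proxy, and a swipe speed) and uses a simple nearest-centroid classifier as the "classification model". The feature names, label set, and classifier choice are assumptions for illustration only; the patent does not specify them.

```python
# Minimal sketch: train a nearest-centroid "emotion recognition model" on
# (touch-manner features, labeled emotional state) training samples.
# Features and labels are illustrative assumptions, not from the patent.
from collections import defaultdict
from math import dist

# Each sample: ([capacitance_change, swipe_speed], emotional_state)
training_samples = [
    ([0.9, 1.8], "agitated"), ([0.8, 2.0], "agitated"),
    ([0.3, 0.5], "calm"),     ([0.2, 0.4], "calm"),
]

def train(samples):
    """Return one feature centroid per labeled emotional state."""
    grouped = defaultdict(list)
    for features, label in samples:
        grouped[label].append(features)
    return {label: [sum(col) / len(col) for col in zip(*feats)]
            for label, feats in grouped.items()}

def predict(model, features):
    """Emotion recognition: the nearest centroid's label wins."""
    return min(model, key=lambda label: dist(model[label], features))

model = train(training_samples)
print(predict(model, [0.85, 1.9]))  # a hard, fast touch manner -> "agitated"
```

A real implementation would of course use a richer feature set and a stronger classifier; the point is only the shape of the training data and the input/output contract of the resulting model.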
In a first implementation of the first aspect, when determining the emotional state corresponding to a given touch manner, a reference time interval is first determined from the trigger time of the touch manner; the operation data content generated by the user operating the terminal device within that interval is then obtained, such as text content or voice content the user entered into the device; and by analyzing that operation data content, the emotional state corresponding to it is determined and taken as the emotional state corresponding to the touch manner.
In this way, determining the touch manner's corresponding emotional state from the operation data content generated while the user operates the device helps ensure that the determined emotional state is reasonable and accurate, and hence that the determined correspondence between touch manner and emotional state is reasonable and accurate.
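A toy sketch of this first implementation, under the assumption that the operation data content is typed text and that a small keyword lexicon stands in for the content analysis; the lexicon, the labels, and the 60-second window length are invented for illustration:

```python
# Sketch: label a touch manner with an emotional state inferred from text
# the user typed in a reference window around the touch trigger time.
# The keyword lexicon and 60 s window are illustrative assumptions.
REFERENCE_WINDOW_S = 60

LEXICON = {
    "happy": ("great", "love", "haha"),
    "angry": ("terrible", "hate", "ugh"),
}

def label_from_content(messages, touch_time_s):
    """messages: list of (timestamp_s, text) the user typed."""
    lo, hi = touch_time_s - REFERENCE_WINDOW_S, touch_time_s + REFERENCE_WINDOW_S
    window_text = " ".join(t for ts, t in messages if lo <= ts <= hi).lower()
    scores = {state: sum(w in window_text for w in words)
              for state, words in LEXICON.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "neutral"

msgs = [(10, "ugh this is terrible"), (500, "haha great")]
print(label_from_content(msgs, touch_time_s=30))  # near the angry message
```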
In a second implementation of the first aspect, when determining the emotional state corresponding to a given touch manner, a preset emotional state mapping table can be called; the table records correspondences between touch manners and emotional states, and the emotional state corresponding to the touch manner is looked up in it.
Existing research has investigated the correspondence between the touch manners users employ when touching terminal devices and their emotional states, producing findings that reflect this correspondence. An emotional state mapping table generated from such findings, and used to determine the emotional state corresponding to a touch manner, can effectively ensure that the emotional state determined for the touch manner is objective and reasonable.
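The mapping-table lookup of this second implementation is essentially a static dictionary; the entries below are invented placeholders, since the patent does not publish a concrete table:

```python
# Sketch: preset emotional-state mapping table keyed by touch manner.
# All entries are illustrative placeholders, not from the patent.
EMOTION_TABLE = {
    "rapid_hard_tap": "agitated",
    "slow_light_swipe": "calm",
    "long_press": "hesitant",
}

def lookup_emotion(touch_manner, default="unknown"):
    """Look the touch manner up in the preset mapping table."""
    return EMOTION_TABLE.get(touch_manner, default)

print(lookup_emotion("rapid_hard_tap"))  # agitated
```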
In a third implementation of the first aspect, when obtaining training samples, the touch data generated as the user operates the terminal device can first be collected over a preset time period; the touch data are then clustered into touch data sets, and the touch manner corresponding to each set is determined. The set containing the most touch data is taken as the target touch data set and its corresponding touch manner as the target touch manner; the emotional state corresponding to the target touch manner is then labeled, and the target touch manner together with its corresponding emotional state is taken as a training sample.
Normally a user employs several different touch manners to operate the device within a period of time, while the user's emotional state may not change much within that period. This implementation therefore selects, from the touch manners the user used in that period, the one that best characterizes the user's current emotional state, namely the target touch manner above; taking the target touch manner and the user's current emotional state as a training sample effectively ensures that the correspondence between touch manner and emotional state is accurate and reasonable.
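A minimal sketch of this third implementation, with a deliberately trivial stand-in for the clustering step (grouping events by a coarsely quantized feature vector, where a real system might use k-means or similar); the bucket size and the two features are assumptions:

```python
# Sketch: cluster touch events collected over a preset period and keep the
# largest cluster as the "target touch manner". Quantized-bucket grouping
# stands in for a real clustering algorithm; bucket size is an assumption.
from collections import Counter

def target_touch_manner(events, bucket=0.5):
    """events: list of (capacitance_change, swipe_speed) tuples."""
    buckets = Counter(
        (round(c / bucket), round(s / bucket)) for c, s in events
    )
    # The most populated cluster characterizes the period's touch manner.
    (cap_q, speed_q), _count = buckets.most_common(1)[0]
    return {"capacitance_change": cap_q * bucket, "swipe_speed": speed_q * bucket}

events = [(0.9, 1.8), (1.0, 2.1), (0.95, 1.9), (0.2, 0.4)]
print(target_touch_manner(events))  # dominated by the hard/fast cluster
```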
In a fourth implementation of the first aspect, the touch data mentioned in the third implementation include screen capacitance change data and coordinate value change data. Since the touch screens of most current touch-screen devices are capacitive, taking screen capacitance change data and coordinate value change data as the touch data ensures that the method provided by the embodiments of the present application can be widely applied in everyday work and life.
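As an illustration of how those two raw streams could be reduced to touch-manner features, the sketch below derives a pressure proxy from the capacitance changes and a swipe speed from the coordinate changes; the specific features are an assumption, since the patent only names the raw data:

```python
# Sketch: derive touch-manner features from the two raw streams named in
# the fourth implementation: screen capacitance changes (a pressure proxy)
# and coordinate value changes (movement). Feature choice is an assumption.
from math import hypot

def touch_features(cap_samples, coords):
    """cap_samples: capacitance deltas; coords: list of (t_s, x, y)."""
    peak_cap = max(cap_samples)                  # how hard the press was
    (t0, x0, y0), (t1, x1, y1) = coords[0], coords[-1]
    speed = hypot(x1 - x0, y1 - y0) / (t1 - t0)  # swipe speed, px/s
    return peak_cap, speed

print(touch_features([0.2, 0.9, 0.4], [(0.0, 10, 10), (0.5, 40, 50)]))
# -> (0.9, 100.0)
```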
In a fifth implementation of the first aspect, after the emotion recognition model is trained, the touch manner used when the user subsequently operates the terminal device can further be obtained as an optimization touch manner, the emotional state corresponding to it labeled, and the optimization touch manner together with its corresponding emotional state taken as an optimization training sample; the optimization training sample can later be used to perform optimization training on the emotion recognition model.
As usage time grows, the touch manners a user employs when operating the terminal device may change. To ensure that the emotion recognition model can always accurately identify the user's emotional state from the touch manner, touch manners and their corresponding emotional states can continue to be collected after training as optimization training samples; when the emotion recognition model can no longer accurately identify the user's emotional state, it can be further trained with these samples so that it always maintains good model performance.
In a sixth implementation of the first aspect, feedback information from the user about the emotion recognition model can be obtained, and when the feedback information indicates that the model's performance does not meet the user's needs, optimization training is performed on the model with the optimization training samples obtained in the fifth implementation.
Since the emotion recognition model in the application is aimed at the user of the terminal device, the user's experience is one of the most important measures of the model's performance. When the user reports that the model's performance no longer meets their needs, that is, when the user considers the identified emotional states insufficiently accurate, the model is retrained with the optimization training samples obtained in the fifth implementation so that it meets the user's needs and improves the user experience.
In a seventh implementation of the first aspect, the terminal device can perform optimization training on the emotion recognition model with the optimization training samples obtained in the fifth implementation when it satisfies one or more of three conditions: it is in a charging state, its remaining battery is above a preset level, or it has been idle for longer than a preset duration.
Since optimization training of the emotion recognition model usually consumes considerable computation and power and may affect other functions of the device, performing it only when one or more of the three conditions above are met ensures that the user's normal use of the terminal device is not affected, safeguarding the user experience.
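The trigger condition of this seventh implementation reduces to a simple predicate; the threshold values below are illustrative assumptions:

```python
# Sketch: decide whether the device may run optimization training now.
# Any one of the three conditions suffices; thresholds are assumptions.
BATTERY_THRESHOLD = 0.8   # the "preset electricity" level
IDLE_THRESHOLD_S = 1800   # the preset idle duration

def may_optimize(charging, battery_level, idle_seconds):
    """True if optimization training will not disturb normal use."""
    return (charging
            or battery_level > BATTERY_THRESHOLD
            or idle_seconds > IDLE_THRESHOLD_S)

print(may_optimize(charging=False, battery_level=0.9, idle_seconds=0))  # True
```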
A second aspect of the application provides an emotion recognition method, usually applied to a terminal device. In the method, the terminal device obtains the touch manner with which the user operates it and uses the emotion recognition model it runs to determine the emotional state corresponding to that touch manner as the user's current emotional state; the emotion recognition model is trained for the terminal device with the model training method provided in the first aspect above.
The emotion recognition method uses the emotion recognition model to determine the user's emotional state specifically from the touch manner used to operate the terminal device, which ensures the accuracy of the determined emotional state; moreover, the method requires no additional external equipment while determining the user's emotional state, truly achieving the goal of improving user experience.
In a first implementation of the second aspect, when the terminal device is displaying its desktop interface, it can switch the display style of the desktop interface according to the user's current emotional state as identified by the emotion recognition model. By changing the display style of its desktop interface, the device directly alters the user's visual experience, visually adjusting to or matching the user's emotional state and thereby improving the user experience.
In a second implementation of the second aspect, when the terminal device has an application open, it can recommend related content through the application according to the user's current emotional state identified by the emotion recognition model, for example related music, video, or text content. Recommending related content through the appropriate application in combination with the user's emotional state adjusts the user's emotional state in real time from multiple angles and improves the user experience.
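Both implementations of the second aspect amount to dispatching on the recognized state; the theme and playlist names below are invented placeholders:

```python
# Sketch: act on a recognized emotional state, as in the two second-aspect
# implementations: switch the desktop display style, or pick content to
# recommend. Theme and playlist names are illustrative placeholders.
THEMES = {"agitated": "soft_pastel", "calm": "default"}
PLAYLISTS = {"agitated": "relaxing piano", "calm": "daily mix"}

def react_to_emotion(state):
    """Map a recognized emotional state to concrete personalization."""
    return {
        "desktop_theme": THEMES.get(state, "default"),
        "recommended_playlist": PLAYLISTS.get(state, "daily mix"),
    }

print(react_to_emotion("agitated"))
```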
A third aspect of the application provides a model training apparatus, the apparatus including:
a training sample acquisition module, configured to obtain the touch manner used when the user operates the terminal device and label the emotional state corresponding to that touch manner, taking the touch manner and its corresponding emotional state as a training sample;
a model training module, configured to train a classification model on the training samples using a machine learning algorithm to obtain an emotion recognition model, the emotion recognition model taking as input the touch manner with which the user operates the terminal device and producing as output the emotional state corresponding to that touch manner.
In a first implementation of the third aspect, the training sample acquisition module is specifically configured to:
determine a reference time interval according to the trigger time corresponding to the touch manner;
obtain the operation data content generated by the user operating the terminal device within the reference time interval;
determine the user's emotional state from the operation data content, as the emotional state corresponding to the touch manner.
In a second implementation of the third aspect, the training sample acquisition module is specifically configured to:
call a preset emotional state mapping table, in which correspondences between touch manners and emotional states are recorded;
search the emotional state mapping table to determine the emotional state corresponding to the touch manner.
In a third implementation of the third aspect, the training sample acquisition module is specifically configured to:
collect, within a preset time period, the touch data generated as the user operates the terminal device;
cluster the touch data into touch data sets and determine the touch manner corresponding to each touch data set;
take the touch data set containing the most touch data as the target touch data set and its corresponding touch manner as the target touch manner, and label the emotional state corresponding to the target touch manner;
take the target touch manner and its corresponding emotional state as a training sample.
In a fourth implementation of the third aspect, the touch data include screen capacitance change data and coordinate value change data.
In a fifth implementation of the third aspect, the apparatus further includes:
an optimization training sample acquisition module, configured to obtain the touch manner used when the user operates the terminal device as an optimization touch manner, label the emotional state corresponding to the optimization touch manner, and take the optimization touch manner and its corresponding emotional state as an optimization training sample; the optimization training sample is used to perform optimization training on the emotion recognition model.
In a sixth implementation of the third aspect, the apparatus further includes:
a feedback information acquisition module, configured to obtain the user's feedback information about the emotion recognition model, the feedback information characterizing whether the model's performance meets the user's needs;
a first optimization training module, configured to perform optimization training on the emotion recognition model with the optimization training samples when the feedback information indicates that the model's performance does not meet the user's needs.
In a seventh implementation of the third aspect, the apparatus further includes:
a second optimization training module, configured to perform optimization training on the emotion recognition model with the optimization training samples when the terminal device is in a charging state, and/or when its remaining battery is above a preset level, and/or when it has been idle for longer than a preset duration.
A fourth aspect of the application provides an emotion recognition device, the device including:
a touch manner acquisition module, configured to obtain the touch manner used when the user operates the terminal device;
an emotional state identification module, configured to determine, with an emotion recognition model, the emotional state corresponding to the touch manner as the user's current emotional state; the emotion recognition model is trained by executing the model training method of the first aspect.
In a first implementation of the fourth aspect, the device further includes:
a display style switching module, configured to switch the display style of the desktop interface according to the user's current emotional state when the terminal device is displaying the desktop interface.
In a second implementation of the fourth aspect, the device further includes:
a content recommendation module, configured to recommend related content through an application according to the user's current emotional state when the terminal device has the application open.
A fifth aspect of the application provides a server, the server including a processor and a memory:
the memory is configured to store program code and transfer the program code to the processor;
the processor is configured to execute, according to instructions in the program code, the model training method of the first aspect above.
A sixth aspect of the application provides a terminal device, the terminal device including a processor and a memory:
the memory is configured to store program code and transfer the program code to the processor;
the processor is configured to execute, according to instructions in the program code, the model training method of the first aspect and/or the emotion recognition method of the second aspect.
A seventh aspect of the application provides a computer-readable storage medium including instructions which, when run on a computer, cause the computer to execute the model training method of the first aspect and/or the emotion recognition method of the second aspect.
Detailed description of the invention
Fig. 1 is a schematic diagram of an application scenario of the model training method and emotion recognition method provided by the embodiments of the present application;
Fig. 2 is a schematic flowchart of a model training method provided by the embodiments of the present application;
Fig. 3 is a schematic flowchart of an emotion recognition method provided by the embodiments of the present application;
Fig. 4 is a schematic structural diagram of a model training apparatus provided by the embodiments of the present application;
Fig. 5 is a schematic structural diagram of an emotion recognition device provided by the embodiments of the present application;
Fig. 6 is a schematic structural diagram of a server provided by the embodiments of the present application;
Fig. 7 is a schematic structural diagram of an electronic device provided by the embodiments of the present application;
Fig. 8 is a software architecture diagram of an electronic device provided by the embodiments of the present application.
Specific embodiment
To help those skilled in the art better understand the solution of the application, the technical solutions in the embodiments of the application are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the application, not all of them; all other embodiments obtained by those of ordinary skill in the art based on the embodiments in the application without creative effort shall fall within the protection scope of the application.
The terms "first", "second", "third", "fourth", and so on (if present) in the description, claims, and drawings of the application are used to distinguish similar objects and are not intended to describe a particular order or sequence. It should be understood that data used in this way are interchangeable where appropriate, so that the embodiments described herein can be implemented in orders other than those illustrated or described here. Moreover, the terms "comprise" and "have" and any variations thereof are intended to cover non-exclusive inclusion; for example, a process, method, system, product, or device comprising a series of steps or units is not necessarily limited to the steps or units expressly listed, but may include other steps or units not expressly listed or inherent to such a process, method, product, or device.
To further improve the user experience terminal devices provide and offer users more attentive, more personalized services, some terminal device manufacturers currently develop, from the angle of identifying the user's emotional state, functions that let the terminal device recognize that state. Three methods of emotional state recognition are relatively common on terminal devices at present:
Facial expression recognition: the camera of the terminal device captures the user's facial expression, which is then analyzed to determine the emotional state. Because lighting differs across scenes and the relative position between the user's face and the device is unstable, this method cannot guarantee accurate recognition of the user's facial expression in all cases; correspondingly, when the accuracy of facial expression recognition is low, neither can it guarantee the accuracy of the determined emotional state.
Speech recognition: the terminal device collects voice content input by the user and analyzes it to determine the emotional state. This requires the user to actively speak content expressing their emotional state to the device so that the device can determine the state accordingly; in most cases, however, users do not actively tell the device how they feel, so the practical application value of this method is low.
Physiological signal recognition: the terminal device collects physiological signals such as heart rate, body temperature, and blood pressure through an additional measuring device or sensor, then analyzes the collected signals to determine the user's emotional state. This method requires extra external equipment during its implementation, and such equipment is usually burdensome for the user, degrading the experience from another side; it does not truly achieve the goal of improving user experience.
To enable a terminal device to identify the user's emotional state accurately, and to genuinely improve the user experience the device provides, the application takes a different route. Building on the facts that the screen-to-body ratios of current terminal devices keep growing and that users frequently interact with devices through the touch screen (touch panel, TP), and on the observation that a user's touch manner when operating the device follows similar patterns under the same emotional state, the application trains a model that determines the user's current emotional state from the touch manner used to operate the device, so that the terminal device can identify the user's emotional state with the model and provide reasonable personalized services.
Specifically, in the model training method provided by the embodiments of this application, the terminal device obtains the touch control manner used when the user operates the terminal device, labels the emotional state corresponding to that touch control manner, and takes the touch control manner together with its corresponding emotional state as a training sample; it then uses a machine learning algorithm to train a classification model with these training samples, obtaining an emotion recognition model. Correspondingly, in the emotion recognition method provided by the embodiments of this application, the terminal device obtains the touch control manner used when the user operates the terminal device, determines the emotional state corresponding to that touch control manner with the emotion recognition model trained by the above model training method, and takes that emotional state as the user's current emotional state.
It should be noted that the model training method provided by the embodiments of this application can, for each different terminal device, use a machine learning algorithm to train an emotion recognition model specifically suited to that device, based on the touch control manners its user employs when operating it and the emotional states corresponding to those touch control manners. Applying on a terminal device the emotion recognition model trained for it ensures that the model can accurately determine the user's emotional state from the touch control manner with which that device's user operates it.
Compared with the several emotion recognition methods commonly used in the prior art, the method provided by the embodiments of this application uses the emotion recognition model to determine the user's emotional state specifically from the touch control manner used when operating the terminal device, guaranteeing the accuracy of the determined emotional state. Moreover, the method requires no additional external equipment in determining the user's emotional state, truly achieving the goal of improving the user experience.
It should be understood that the model training method and emotion recognition method provided by the embodiments of this application can be applied to terminal devices (also called electronic devices) equipped with a touch screen, and to servers. The terminal device may specifically be a smartphone, tablet computer, computer, personal digital assistant (Personal Digital Assistant, PDA), etc.; the server may specifically be an application server or a web server.
To facilitate understanding of the technical solutions provided by the embodiments of this application, the application scenarios of the model training method and emotion recognition method provided by the embodiments of this application are introduced below, taking a terminal device as the executing subject by way of example.
Referring to Fig. 1, Fig. 1 is a schematic diagram of an application scenario of the model training method and emotion recognition method provided by the embodiments of this application. As shown in Fig. 1, the application scenario includes a terminal device 101, which both executes the model training method provided by the embodiments of this application to train an emotion recognition model, and executes the emotion recognition method provided by the embodiments of this application to identify the user's emotional state.
In the model training stage, the terminal device 101 obtains the touch control manner used when the user operates it; the touch control manner may specifically include click operations and slide operations of different forces and/or different frequencies. It labels the emotional state corresponding to the obtained touch control manner, and then takes the obtained touch control manner and its corresponding emotional state as a training sample. After obtaining the training samples, the terminal device 101 uses a machine learning algorithm to train a classification model pre-built in the terminal device 101 with the obtained training samples, thereby obtaining the emotion recognition model.
In the model application stage, the terminal device 101 executes the emotion recognition method provided by the embodiments of this application, identifying the user's emotional state with the emotion recognition model obtained in the model training stage. Specifically, the terminal device 101 obtains the touch control manner used when the user operates it, and determines with the emotion recognition model the emotional state corresponding to the obtained touch control manner, taking it as the user's current emotional state.
It should be noted that in the model training stage, the terminal device 101 trains the emotion recognition model based on the touch control manners its own user employs when operating it and their corresponding emotional states, so the trained emotion recognition model is specifically suited to the terminal device 101. Correspondingly, in the model application stage, the terminal device 101 uses the emotion recognition model trained in the model training stage to determine the user's current emotional state from the touch control manner used when operating it, which effectively guarantees the accuracy of the determined emotional state.
It should be understood that the application scenario shown in Fig. 1 is only an example; in practical applications, the model training method and emotion recognition method provided by the embodiments of this application can also be applied to other scenarios, and no specific limitation is placed here on their application scenarios.
The model training method provided by this application is first introduced below through embodiments.
Referring to Fig. 2, Fig. 2 is a schematic flowchart of a model training method provided by the embodiments of this application. As shown in Fig. 2, the model training method includes the following steps:
Step 201: obtain the touch control manner used when the user operates the terminal device, and label the emotional state corresponding to that touch control manner; take the touch control manner and its corresponding emotional state as a training sample.
The terminal device obtains the touch control manner used when the user touches the touch screen; the touch control manner used when the user operates the device can be understood as a touch operation. A touch operation may specifically be a single-step touch operation initiated by the user on the touch screen, such as click operations of different forces or slide operations of different forces, or a continuous touch operation initiated by the user on the touch screen, such as consecutive click operations of different frequencies or continuous slide operations of different frequencies. Of course, other touch operations used when the user touches the touch screen may also be regarded as touch control manners in this application, and no specific limitation is placed on the touch control manner here.
The terminal device then labels the emotional state corresponding to the obtained touch control manner, that emotional state being the user's emotional state when initiating the touch control manner, and takes the touch control manner and its corresponding emotional state as a training sample.
It should be understood that to ensure that the emotion recognition model trained on the training samples has good model performance, a large number of training samples usually needs to be obtained. Of course, to reduce the data processing load of the terminal device, the number of training samples obtained can also be reduced according to actual needs; no specific limitation is placed on the number of training samples here.
It should be noted that the touch control manner usually needs to be determined from the touch data generated when the user touches the touch screen. For a capacitive screen, the touch data generally includes screen capacitance change data and screen coordinate change data. The screen capacitance change data can characterize the force with which the user clicks or slides on the touch screen and the contact area between the user and the touch screen during the click or slide: the greater the force of the click or slide, the larger the amplitude of the capacitance change; the larger the contact area, the more screen capacitances change. The screen coordinate change data, which is itself derived from the screen capacitance change data, can characterize the click position when the user taps the screen and the slide direction and slide distance when the user slides on it. When the user touches the touch screen of the terminal device, the device's low-level driver reports the screen capacitance change data and its corresponding position coordinates to the device's processor through the input subsystem; by recording the continuously changing position coordinates, the slide direction and slide distance can be determined.
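As a minimal sketch of how reported coordinate and capacitance data could be condensed into the features described above (click vs. slide, slide direction and distance, force), the following Python function summarizes a sequence of touch events. The `(timestamp, x, y, cap_delta)` event format and the 10-pixel slide threshold are assumptions for illustration; real input-subsystem reports are device-specific.

```python
import math

def summarize_touch(events):
    """Condense a sequence of reported touch events into coarse features.

    Each event is assumed to be (timestamp_s, x, y, cap_delta), where
    cap_delta is the capacitance-change amplitude reported by the driver
    (used here as a force proxy). Format and thresholds are illustrative.
    """
    t0, x0, y0, _ = events[0]
    t1, x1, y1, _ = events[-1]
    distance = math.hypot(x1 - x0, y1 - y0)
    return {
        # treat any movement beyond 10 px as a slide (toy threshold)
        "kind": "slide" if distance > 10 else "click",
        "distance": distance,
        "direction_deg": math.degrees(math.atan2(y1 - y0, x1 - x0)),
        "peak_force": max(e[3] for e in events),
        "duration_s": t1 - t0,
    }
```

Recording only the first and last coordinates mirrors the text above, where slide direction and distance are recovered from the continuously changing position coordinates.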
It should be understood that for other types of touch screens, the user's touches will correspondingly generate other touch data; for example, on a resistive screen the user's touches will correspondingly generate screen resistance change data and screen coordinate change data. Such data can likewise reflect the user's current touch control manner, and no restriction is placed here on the specific type of touch data.
When specifically determining the touch control manner from the touch data and constructing training samples, the terminal device may collect the touch data generated by the user's operation of the terminal device within a preset time period; cluster the collected touch data into touch data sets and determine the touch control manner corresponding to each touch data set; take the touch data set containing the most touch data as the target touch data set, and the touch control manner corresponding to the target touch data set as the target touch control manner; then label the emotional state corresponding to the target touch control manner; and finally take the target touch control manner and its corresponding emotional state as a training sample.
Specifically, within a preset time period the user usually operates the terminal device many times, so the terminal device can collect many pieces of touch data. Touch data with similar characteristics are clustered together: for example, screen capacitance change data with similar change amplitudes can be clustered together, screen capacitance change data with similar click positions can be clustered together, screen coordinate change data characterizing similar slide traces can be clustered together, and so on, yielding several touch data sets. Then, according to the type of touch data in each set, the touch control manner corresponding to each set is labeled accordingly: for example, a set composed of touch data whose change amplitude exceeds a preset amplitude threshold can be labeled as heavy clicking; a set composed of touch data whose change frequency exceeds a preset frequency threshold can be labeled as frequent clicking; a set composed of screen coordinate change data whose change frequency exceeds a preset frequency threshold can be labeled as frequent sliding; and so on.
Then, the touch data set containing the most touch data is determined as the target touch data set, and the touch control manner corresponding to the target touch data set is correspondingly taken as the target touch control manner. The emotional state corresponding to the target touch control manner is determined according to the operation data content, collected by the terminal device within the preset time period, that can characterize the user's emotional state, and/or according to the correspondence between touch control manners and emotional states recorded in an emotional state mapping table. Finally, the target touch control manner and its corresponding emotional state are taken as a training sample.
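The selection of a target touch control manner described above can be sketched as follows: touches from one preset period are grouped by a coarse label and the label of the largest group wins. The labels, dict format, amplitude threshold, and frequency threshold are all illustrative assumptions, not values from this application.

```python
from collections import Counter

AMPLITUDE_THRESHOLD = 8  # hypothetical preset amplitude threshold

def target_touch_manner(touches, period_s, freq_threshold_hz=1.0):
    """Group touches from one preset period by a coarse manner label and
    return the label of the largest group as the target touch manner.

    Each touch is assumed to be {"kind": "click"|"slide", "force": amplitude};
    names and thresholds are illustrative.
    """
    labels = Counter()
    for t in touches:
        if t["kind"] == "slide":
            labels["sliding"] += 1
        elif t["force"] > AMPLITUDE_THRESHOLD:
            labels["heavy clicking"] += 1
        else:
            labels["light clicking"] += 1
    manner, count = labels.most_common(1)[0]
    # a dominant cluster whose event rate exceeds the threshold is "frequent"
    if count / period_s > freq_threshold_hz:
        manner = "frequent " + manner
    return manner
```

A real implementation would cluster raw capacitance and coordinate data (for example with a general-purpose clustering algorithm) rather than bin pre-labeled touches, but the majority-set selection step is the same.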
It should be understood that in the course of collecting training samples, many target touch control manners and corresponding emotional states can usually be collected in the above way. Correspondingly, when training the emotion recognition model, the touch control manners the model can recognize can be determined from the categories of the collected target touch control manners, and the emotional states the model can determine can be determined from the emotional states corresponding to the target touch control manners.
For labeling the emotional state corresponding to a touch control manner, this application provides the following two implementation methods:
In the first method, the terminal device determines a reference time period according to the trigger time corresponding to the touch control manner; obtains the operation data content generated by the user's operation of the terminal device within that reference time period; and then determines the user's emotional state according to the operation data content, taking it as the emotional state corresponding to the touch control manner.
Specifically, the terminal device may determine the trigger time corresponding to the touch control manner and, taking the trigger time as the center point, determine the reference time period according to a preset reference time period length. Alternatively, the terminal device may take the trigger time corresponding to the touch control manner as the starting point or the ending point of the reference time period and determine it according to the preset reference time period length. Of course, the terminal device may also determine the reference time period from the trigger time corresponding to the touch control manner in other ways; no restriction is placed here on how the reference time period is determined.
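The three window-placement alternatives just described (trigger time as center, start, or end of the reference time period) can be sketched in a few lines; the function and parameter names are illustrative.

```python
def reference_window(trigger_time, length, anchor="center"):
    """Return (start, end) of the reference time period for a touch manner.

    trigger_time and length are in seconds. anchor selects whether the
    trigger time is the window's center, start, or end, matching the
    alternatives described above.
    """
    if anchor == "center":
        return (trigger_time - length / 2, trigger_time + length / 2)
    if anchor == "start":
        return (trigger_time, trigger_time + length)
    if anchor == "end":
        return (trigger_time - length, trigger_time)
    raise ValueError("anchor must be 'center', 'start' or 'end'")
```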
It should be understood that the above reference time period length can be set according to actual needs, and no specific limitation is placed on it here.
After determining the reference time period, the terminal device obtains the operation data content generated by the user's operation of the terminal device within the reference time period. The operation data content is the related data content generated by the user operating the terminal device; it may specifically be the text content the user inputs to the terminal device within the reference time period, the voice content the user inputs to the terminal device within the reference time period, or other operation data content the user generates through applications on the terminal device. No restriction is placed here on the type of the operation data content.
After obtaining the operation data content, the terminal device can analyze and process it accordingly to determine the emotional state corresponding to the operation data content. For example, when the operation data content is text content input by the user, the terminal device can determine its emotional state through semantic analysis of the text; when the operation data content is voice content input by the user, the terminal device can determine the corresponding emotional state through speech recognition analysis of the voice content; when the operation data content is data content of other forms, the terminal device can correspondingly determine the emotional state in other ways. No restriction is placed here on how the emotional state corresponding to the operation data content is determined. Finally, the emotional state corresponding to the operation data content is taken as the emotional state corresponding to the touch control manner.
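As a toy stand-in for the semantic analysis of text content mentioned above, the following keyword-count sketch maps text to a coarse emotional state. The lexicon and labels are invented for illustration; a real implementation would use a proper sentiment or semantic analysis model.

```python
# toy keyword lexicon; purely illustrative, not part of this application
POSITIVE = {"great", "happy", "thanks", "love"}
NEGATIVE = {"annoying", "angry", "hate", "terrible"}

def emotion_from_text(text):
    """Label text content with a coarse emotional state by keyword count."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"
```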
It should be understood that when the target touch control manner is determined by clustering the touch data within a preset time period, that preset time period can be used directly as the reference time period; the emotional state corresponding to the operation data content generated by the user's operation of the terminal device within that preset time period is then determined and taken as the emotional state corresponding to the target touch control manner.
It should be noted that before obtaining the operation data content, the terminal device needs to obtain the user's authorization; only when the user permits the terminal device to obtain operation data content can the terminal device obtain the operation data content generated by the user's operation of the device and label the corresponding emotional state for the touch control manner based on it. Moreover, after obtaining the operation data content, the terminal device also needs to store the obtained operation data content in encrypted form, to ensure the user's data privacy and security.
In the second method, the terminal device calls a preset emotional state mapping table in which the correspondence between touch control manners and emotional states is recorded, and then looks up in that emotional state mapping table the emotional state corresponding to the touch control manner.
Existing related research has found that there are certain mapping relations between the touch control manner used when a user operates a terminal device and the user's emotional state, and some research results reflecting these mapping relations have been produced. By organizing these existing research results, an emotional state mapping table can be generated accordingly, recording the correspondence between various touch control manners and emotional states.
After obtaining the touch control manner used when the user operates the terminal device, the terminal device can call its preset emotional state mapping table and look up in that emotional state mapping table the emotional state corresponding to the obtained touch control manner.
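The mapping-table lookup described above amounts to a keyed dictionary. The entries below are invented examples; a real table would be populated from the organized research results, and the default value handles touch control manners the table does not yet cover.

```python
# illustrative entries only; not the mapping table of this application
EMOTION_TABLE = {
    "heavy clicking": "angry",
    "frequent clicking": "anxious",
    "frequent sliding": "impatient",
    "light clicking": "calm",
}

def lookup_emotion(touch_manner, default="unknown"):
    """Look up the emotional state for a touch manner in the preset table."""
    return EMOTION_TABLE.get(touch_manner, default)
```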
It should be understood that when the target touch control manner is determined by clustering the touch data within a preset time period, the emotional state corresponding to the target touch control manner can be looked up in the emotional state mapping table.
It should be noted that after the first method above has been used to label the emotional state corresponding to a touch control manner according to the operation data content generated by the user's operation of the terminal device, the correspondence between the touch control manner and the emotional state thus determined can further be used to optimize and update the above emotional state mapping table, continuously enriching the mapping relations recorded in it.
It should be noted that in practical applications, the first method or the second method above can be used alone to label the emotional state corresponding to a touch control manner, or the two methods can be combined: when the first method cannot accurately determine the emotional state corresponding to the touch control manner, the second method can be used to determine it; when the second method cannot accurately determine it, the first method can be used; or the emotional state corresponding to the touch control manner can be determined from the emotional states determined by the two methods respectively.
It should be understood that in practical applications, besides the above two methods of labeling emotional states for touch control manners, other methods can also be selected according to actual needs to determine the emotional state corresponding to a touch control manner; no restriction is placed here on the labeling method.
It should be noted that for the same user, the emotional states the user often exhibits are largely particular to that user, and the touch control manners used to operate the terminal device under a specific emotional state are likewise particular. Collecting training samples based on the above method ensures that most of the touch control manners included in the collected training samples are touch control manners the user frequently uses, and that the corresponding emotional states mostly belong to emotional states the user often exhibits. Correspondingly, this ensures that the emotion recognition model trained on these training samples can more sensitively determine the emotional states the user often exhibits from the touch control manners the user frequently uses when operating the terminal device.
Step 202: using a machine learning algorithm, train a classification model with the training samples to obtain an emotion recognition model; the emotion recognition model takes as input the touch control manner used when the user operates the terminal device, and outputs the emotional state corresponding to that touch control manner.
After obtaining the training samples for training the emotion recognition model, the terminal device can use a machine learning algorithm to train the classification model preset in the terminal device with the obtained training samples, continuously optimizing the model parameters of the classification model; after the classification model satisfies the training termination condition, the emotion recognition model is generated from the classification model's structure and model parameters.
When specifically training the emotion recognition model, the terminal device can input the touch control manner in a training sample into the classification model; the classification model analyzes and processes the touch control manner and outputs the corresponding emotional state. A loss function is constructed from the emotional state output by the classification model and the emotional state corresponding to the touch control manner in the training sample, and the model parameters of the classification model are then adjusted according to this loss function, thereby optimizing the classification model. When the classification model satisfies the training termination condition, the emotion recognition model can be generated from the current classification model's structure and model parameters.
When specifically judging whether the classification model satisfies the training termination condition, test samples can be used to verify a first model. A test sample is similar to a training sample and includes a touch control manner and its corresponding emotional state; the first model is the model obtained by performing a first round of training optimization on the classification model with multiple training samples. Specifically, the terminal device inputs the touch control manner in a test sample into the first model, which processes it accordingly and obtains the emotional state corresponding to that touch control manner. The prediction accuracy is then calculated from the emotional state corresponding to the touch control manner in the test sample and the emotional state output by the first model. When the prediction accuracy is greater than a preset threshold, the model performance of the first model can be considered to satisfy the demand, and the emotion recognition model can be generated from the first model's parameters and structure.
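The prediction-accuracy check just described can be sketched as follows, with the model treated abstractly as any callable from touch control manner to emotional state. Function names, the sample format, and the 0.9 default threshold are assumptions for illustration.

```python
def prediction_accuracy(model, test_samples):
    """Fraction of test samples whose predicted emotional state matches the label.

    model is any callable touch_manner -> emotional_state; a test sample is
    (touch_manner, emotional_state), mirroring the training-sample form.
    """
    correct = sum(model(manner) == emotion for manner, emotion in test_samples)
    return correct / len(test_samples)

def meets_termination(model, test_samples, threshold=0.9):
    """Training termination check: stop once accuracy exceeds the preset threshold."""
    return prediction_accuracy(model, test_samples) > threshold
```

In the flow above, `meets_termination` would be evaluated after each training round: if it returns False, training continues with further samples; if True, the current model's structure and parameters become the emotion recognition model.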
It should be understood that the above preset threshold can be set according to the actual situation, and no specific limitation is placed on the preset threshold here.
Moreover, when judging whether the classification model satisfies the training termination condition, it is also possible to decide, from the multiple models obtained through multiple rounds of training, whether to continue training the classification model, so as to obtain the emotion recognition model with the best model performance. Specifically, test samples can be used to verify respectively the multiple classification models obtained through multiple rounds of training. If the gaps between the prediction accuracies of the models obtained in each round are small, the performance of the classification model is considered to have no further room for improvement; the classification model with the highest prediction accuracy can then be chosen, and the emotion recognition model determined from that classification model's parameters and structure. If there are large gaps between the prediction accuracies of the classification models obtained in each round, the performance of the classification model is considered to still have room for improvement, and training of the classification model can continue until the emotion recognition model with the most stable and best model performance is obtained.
In addition, the terminal device can also determine whether the classification model satisfies the training termination condition according to the user's feedback information. Specifically, the terminal device can prompt the user to test the trained classification model and correspondingly feed back information about it. If the user's feedback information for the classification model characterizes its current performance as still unable to satisfy the user's current demand, the terminal device needs to continue optimizing and training the classification model with training samples; conversely, if the user's feedback information characterizes the current performance of the classification model as good and basically satisfying the user's current demand, the terminal device can generate the emotion recognition model from the classification model's structure and model parameters.
It should be noted that the touch control manner with which a user operates a terminal device may change as usage time increases. Therefore, after the emotion recognition model is obtained through training, the terminal device can also continue to collect optimization training samples and use the collected optimization training samples to further optimize the training of the emotion recognition model, so as to improve the model performance of the emotion recognition model and determine the user's emotional state more accurately from the user's touch control manner.
Specifically, after the emotion recognition model is obtained, the terminal device can continue to obtain the touch control manner used when the user operates the terminal device, as an optimization touch control manner, and label the emotional state corresponding to the optimization touch control manner; the specific labeling method may refer to the related description in step 201. The optimization touch control manner and its corresponding emotional state are taken as an optimization training sample, which is used for optimization training of the emotion recognition model.
In one possible implementation, the terminal device can initiate optimization training of the emotion recognition model in response to the user's feedback information. That is, the terminal device can obtain the user's feedback information for the emotion recognition model, which characterizes whether the performance of the emotion recognition model satisfies the user's demand; when the obtained feedback information characterizes the performance of the emotion recognition model as not satisfying the user's demand, the emotion recognition model is optimized and trained with the optimization training samples.
Specifically, the terminal device can periodically initiate a feedback information acquisition operation; for example, it can periodically display a feedback acquisition interface for the emotion recognition model, so as to obtain through that interface the user's feedback information for the model. Of course, the terminal device can also obtain feedback information in other ways, and no restriction is placed here on how feedback information is obtained.
After the terminal device obtains the feedback information, if it determines that the feedback information characterizes the current performance of the emotion recognition model as not satisfying the user's demand, it correspondingly obtains optimization training samples and further optimizes the training of the emotion recognition model; conversely, if it determines that the feedback information characterizes the current performance of the emotion recognition model as already satisfying the user's demand, it temporarily does not further optimize the training of the emotion recognition model.
In another possible implementation, the terminal device can optimize the training of the emotion recognition model with the optimization training samples directly when it is in a charging state, and/or when its remaining battery level is higher than a preset level, and/or when the duration for which it has been in an idle state exceeds a preset duration.
Optimization training of the emotion recognition model consumes the terminal device's battery, and the optimization training process may affect other functions of the terminal device, for example slowing the running speed of applications on it. To guarantee timely optimization training of the emotion recognition model without affecting the user's use of the terminal device, the terminal device can optimize the training of the emotion recognition model with the optimization training samples while it is in a charging state; alternatively, when its remaining battery level is higher than a preset level; alternatively, when the duration for which it has been in an idle state exceeds a preset duration, where the idle state specifically refers to the state the terminal device is in when the user is not using it; or again, when any two or all three of these conditions (charging state, remaining battery above the preset level, idle duration exceeding the preset duration) are satisfied.
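The trigger logic above (charging, battery above a preset level, idle longer than a preset duration, individually or in combination) can be sketched as a single predicate. Parameter names and default thresholds are illustrative assumptions.

```python
def should_optimize_training(charging, battery_pct, idle_s,
                             min_battery_pct=50, min_idle_s=600,
                             required=1):
    """Decide whether to run optimization training of the emotion model now.

    The three conditions mirror the text: device charging, remaining battery
    above a preset level, and idle time above a preset duration. `required`
    selects how many of the conditions must hold (1, 2, or all 3).
    """
    met = sum([charging,
               battery_pct > min_battery_pct,
               idle_s > min_idle_s])
    return met >= required
```

Requiring two or three conditions (`required=2` or `required=3`) corresponds to the stricter combinations described above.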
It should be understood that the preset battery level can be set according to actual needs, and its specific value is not limited here; the preset duration can likewise be set according to actual needs, and its specific value is not limited here either.
It should be understood that, in practical applications, besides the above two implementations for deciding when to optimize the emotion recognition model, the timing of optimization training can also be determined by other conditions. For example, the model may be optimized when the number of optimization training samples reaches a preset quantity; as another example, an optimization training period may be set and the model optimized according to that period. No restriction is placed here on how the timing of optimization training is determined.
The model training method provided by the embodiments of this application can, for each different terminal device, use a machine learning algorithm to train an emotion recognition model specifically suited to that terminal device, based on the touch manners with which that device's user manipulates it and the emotional states corresponding to those touch manners. By applying on each terminal device the emotion recognition model trained for it, it can be ensured that the model accurately determines the user's emotional state from the touch manner with which that device's user manipulates it.
Based on the model training method provided in the above embodiments, an emotion recognition model with good performance can be trained. Building on that model, this application further provides an emotion recognition method, so that the role of the emotion recognition model in practical applications can be understood more clearly. The emotion recognition method provided by this application is described in detail below through embodiments.
Referring to Fig. 3, Fig. 3 is a flow diagram of the emotion recognition method provided by an embodiment of this application. As shown in Fig. 3, the emotion recognition method includes the following steps:
Step 301: Obtain the touch manner with which the user manipulates the terminal device.
When the user manipulates the terminal device, the terminal device can correspondingly obtain the user's touch manner, which can also be understood as a touch operation. The touch operation may specifically be a single-step touch operation the user performs on the touch screen, such as a click of a given force or a slide of a given force, or a continuous touch operation the user performs on the touch screen, such as consecutive clicks at different frequencies or continuous slides at different frequencies. Of course, other touch operations the user performs on the touch screen may also be considered touch manners in this application; the touch manner in this application is not specifically limited here.
It should be noted that, in general, the above touch manner is determined based on touch data collected by the terminal device; that is, when the user manipulates the terminal device, the terminal device obtains the touch data generated by the user touching the touch screen, and then determines the touch manner based on the collected touch data.
For a capacitive touch screen, the touch data generally includes screen capacitance change data and screen coordinate change data. The screen capacitance change data can characterize the force with which the user clicks or slides on the touch screen and the contact area between the user and the touch screen when clicking or sliding; the screen coordinate change data, which is in fact also derived from the screen capacitance change data, can characterize the click position when the user taps the touch screen and the sliding direction and sliding distance when the user slides on it.
Correspondingly, after the terminal device obtains the screen capacitance change data and the screen coordinate change data, it can determine from them the touch manner with which the user is currently manipulating the terminal device. For example, from the amplitude of the capacitance change it can determine whether the user's current touch manner is a heavy click or a light click; from the frequency of the capacitance change it can determine whether the current touch manner is frequent clicking; from the sliding trace characterized by the coordinate change data it can determine whether the current touch manner is a large-range or small-range slide; and from the change frequency of the coordinate data it can determine whether the current touch manner is frequent sliding. Of course, the terminal device can also determine other touch manners from the touch data; the above touch manners are merely examples, and the touch manner is not specifically limited here.
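The heuristics in the preceding paragraph can be sketched as a small rule-based classifier. The thresholds and feature names below are invented for illustration; a real implementation would derive them from the device's actual capacitance and coordinate streams:

```python
def classify_touch(cap_amplitude: float, events_per_sec: float,
                   trace_length_px: float) -> str:
    """Map raw touch features to a touch manner.

    cap_amplitude:   normalized amplitude of the capacitance change (0..1)
    events_per_sec:  how often touch events recur (change frequency)
    trace_length_px: length of the sliding trace; 0 means a stationary tap
    """
    if trace_length_px > 0:                  # the finger moved: a slide
        if events_per_sec > 2.0:
            return "frequent slide"
        return "large slide" if trace_length_px > 300 else "small slide"
    if events_per_sec > 2.0:                 # stationary but rapid: clicks
        return "frequent click"
    return "heavy click" if cap_amplitude > 0.7 else "light click"
```

Each branch mirrors one of the text's rules: amplitude distinguishes heavy from light clicks, event frequency detects frequent clicking or sliding, and trace length separates large-range from small-range slides.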
It should be understood that, for other types of touch screens, the user's touch will correspondingly generate other touch data. For example, on a resistive touch screen the user's touch correspondingly generates screen resistance change data and screen coordinate change data, from which the user's current touch manner can likewise be determined; no restriction is placed here on the specific type of touch data.
Step 302: Use the emotion recognition model to determine the emotional state corresponding to the touch manner, as the user's current emotional state; the emotion recognition model is obtained by performing the model training method shown in Fig. 2.
After obtaining the touch manner, the terminal device inputs it into the emotion recognition model running on the terminal device; the model analyzes and processes the input touch manner and then outputs the emotional state corresponding to it, as the user's current emotional state.
It should be noted that the above emotion recognition model is a model obtained through the model training method shown in Fig. 2. During training, based on the touch data generated when the user manipulates the terminal device and the emotional states corresponding to that touch data, an emotion recognition model specifically suited to the terminal device is obtained; this model can accurately determine the user's emotional state from the touch manner with which the user manipulates the terminal device.
It should be understood that the emotional states the emotion recognition model can recognize depend on the training samples used when training it. The touch manners included in the training samples are the touch manners with which the user manipulates the terminal device, and the emotional states included in the training samples are the emotional states the user exhibited while using the terminal device; that is, the training samples are generated entirely from the touch manners of this terminal device's user and the emotional states that user exhibited. Correspondingly, the emotion recognition model obtained by training on these samples can accurately determine the user's current emotional state from the touch manner with which the user manipulates the terminal device; in other words, it can sensitively recognize, from the touch manner this user employs, the emotional state corresponding to it.
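As a minimal stand-in for such a per-user model, the sketch below uses a 1-nearest-neighbour rule over two invented touch features (force, frequency). This is not the patent's actual model — the patent leaves the machine learning algorithm open — but it shows the input/output contract of step 302: touch features in, emotional state out:

```python
# Hypothetical per-user training points; in the patent these would be
# collected from this device's own user, not hard-coded.
TRAINING = [
    ((0.9, 3.0), "irritated"),   # heavy, frequent touches
    ((0.3, 0.5), "calm"),        # light, slow touches
    ((0.5, 1.0), "neutral"),
]

def predict_emotion(force: float, freq: float) -> str:
    """Return the emotional state of the nearest labelled touch sample."""
    def dist2(sample):
        (f, q), _ = sample
        return (f - force) ** 2 + (q - freq) ** 2
    return min(TRAINING, key=dist2)[1]
```

Because the labelled points come from one specific user, the classifier's decision boundary is tailored to that user's habits, which is the property the paragraph above attributes to the trained emotion recognition model.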
After recognizing the user's current emotional state with the emotion recognition model, the terminal device can, based on the recognized state, further provide personalized services for the user, so as to improve the user experience the terminal device offers.
In one possible implementation, when the terminal device is displaying the desktop interface, it can switch the display style of the desktop interface, for example switching the desktop's display theme, wallpaper, display font, and so on.
For example, when the touch manner the terminal device obtains is frequent sliding on the touch screen, that touch manner is input into the emotion recognition model, and the model may determine that the corresponding emotional state is irritation. At this point, if the interface the terminal device is displaying is the desktop, the terminal device can correspondingly switch the desktop wallpaper to a brighter, more pleasant image; alternatively, the terminal device can change the display theme and/or display font, to bring the user a pleasing visual experience.
Of course, the terminal device can also modify other display styles of the desktop interface according to the user's current emotional state; no restriction is placed here on which display styles can be changed.
In another possible implementation, when the terminal device has an application open, it can recommend related content to the user through that application.
For example, suppose the application the terminal device currently has open is a music player. Correspondingly, if the emotion recognition model determines from the user's touch manner that the user's current emotional state is low, the music player can recommend some cheerful music to the user, to relieve the user's low mood. Or suppose the application currently open is a video player; correspondingly, if the emotion recognition model determines from the user's touch manner that the user's current emotional state is sad, the video player can recommend some funny videos to the user, to lift the user's sad mood. Of course, the terminal device can also, according to the user's current emotional state, recommend related text content to the user through other applications, for example recommending related articles, jokes, and so on.
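The per-application recommendation behaviour described above can be sketched as a lookup keyed on (application, emotional state). All entries below are illustrative assumptions, not content the patent prescribes:

```python
# Hypothetical (application, emotion) -> recommendation table.
RECOMMENDATIONS = {
    ("music", "low"): "upbeat playlist",
    ("video", "sad"): "comedy clips",
    ("reader", "irritated"): "light articles and jokes",
}

def recommend(app: str, emotion: str, default: str = "no change") -> str:
    """Return the content to recommend, or leave the app unchanged."""
    return RECOMMENDATIONS.get((app, emotion), default)
```

The default branch reflects that the text only adapts content for certain recognized states; for other states the application behaves normally.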
No restriction is placed here on which applications can recommend related content according to the user's emotional state, nor on the specific related content the applications recommend.
It should be understood that, besides the above two possible implementations, the terminal device can also, according to the actual situation, take other measures to provide reasonable personalized services based on the user's current emotional state, for example recommending that the user perform operations that may relieve their mood; the personalized services the terminal device can provide are not specifically limited here.
In the emotion recognition method provided by the embodiments of this application, the terminal device uses the emotion recognition model trained for itself to determine the user's current emotional state from the touch manner with which the user manipulates it. Compared with conventional emotion recognition methods, this method can use the emotion recognition model to determine the user's emotional state specifically from the touch manner with which the user manipulates the terminal device, guaranteeing the accuracy of the determined emotional state; moreover, in determining the user's emotional state this method requires no additional external equipment, truly achieving the goal of improving the user experience.
For the model training method described above, this application also provides a corresponding model training apparatus, to enable the application and realization of the above model training method in practice.
Referring to Fig. 4, Fig. 4 is a structural schematic diagram of a model training apparatus provided by an embodiment of this application; as shown in Fig. 4, the model training apparatus includes:
a training sample obtaining module 401, configured to obtain the touch manner when the user manipulates the terminal device, label the emotional state corresponding to that touch manner, and use the touch manner and its corresponding emotional state as a training sample; and
a model training module 402, configured to use a machine learning algorithm to train a classification model with the training samples, obtaining an emotion recognition model whose input is the touch manner with which the user manipulates the terminal device and whose output is the emotional state corresponding to that touch manner.
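A minimal sketch of module 402's contract: labelled training samples in, a touch-manner-to-emotion classifier out. The patent does not fix the learning algorithm, so a dependency-free majority-vote table stands in for a real machine learning model here; the sample data are invented:

```python
from collections import Counter, defaultdict

def train_emotion_model(samples):
    """samples: iterable of (touch_manner, emotional_state) pairs."""
    by_manner = defaultdict(Counter)
    for manner, emotion in samples:
        by_manner[manner][emotion] += 1
    # each touch manner maps to its most frequently labelled emotional state
    return {m: c.most_common(1)[0][0] for m, c in by_manner.items()}

model = train_emotion_model([
    ("frequent click", "irritated"),
    ("frequent click", "irritated"),
    ("frequent click", "calm"),
    ("small slide", "calm"),
])
```

Majority voting over noisy labels is the simplest classifier that still honours the module's interface; swapping in a decision tree or neural network would change only the body of `train_emotion_model`.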
In specific implementation, the training sample obtaining module 401 can be used to perform the method in step 201; for details, refer to the description of step 201 in the method embodiment shown in Fig. 2. The model training module 402 can be used to perform the method in step 202; for details, refer to the description of step 202 in the method embodiment shown in Fig. 2, which is not repeated here.
Optionally, the training sample obtaining module 401 is specifically configured to:
determine a reference time period according to the trigger time corresponding to the touch manner;
obtain the operation data content generated by the user manipulating the terminal device within the reference time period; and
determine the user's emotional state according to the operation data content, as the emotional state corresponding to the touch manner.
In specific implementation, the training sample obtaining module 401 can refer to the description of the related content about determining the emotional state corresponding to a touch manner in the embodiment shown in Fig. 2.
Optionally, the training sample obtaining module 401 is specifically configured to:
call a preset emotional state mapping table, in which correspondences between touch manners and emotional states are recorded; and
look up the emotional state mapping table to determine the emotional state corresponding to the touch manner.
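The preset mapping table and its lookup can be sketched directly as a dictionary; the table entries themselves are invented examples, since the patent does not enumerate them:

```python
from typing import Optional

# Hypothetical preset emotional state mapping table.
EMOTION_TABLE = {
    "heavy click": "irritated",
    "frequent click": "anxious",
    "light click": "calm",
    "frequent slide": "impatient",
}

def lookup_emotion(touch_manner: str) -> Optional[str]:
    """Return the mapped emotional state, or None if the table has no entry."""
    return EMOTION_TABLE.get(touch_manner)
```

Returning `None` for an unmapped touch manner lets the caller fall back to the other labelling strategies the apparatus describes, such as inferring the state from operation data content.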
In specific implementation, the training sample obtaining module 401 can refer to the description of the related content about determining the emotional state corresponding to a touch manner in the embodiment shown in Fig. 2.
Optionally, the training sample obtaining module 401 is specifically configured to:
collect, within a preset time period, the touch data generated by the user manipulating the terminal device;
cluster the touch data to generate touch data sets, and determine the touch manner corresponding to each touch data set;
take the touch data set containing the most touch data as the target touch data set, take the touch manner corresponding to the target touch data set as the target touch manner, and label the emotional state corresponding to the target touch manner; and
use the target touch manner and its corresponding emotional state as a training sample.
In specific implementation, the training sample obtaining module 401 can refer to the description of the related content about determining the emotional state corresponding to a touch manner in the embodiment shown in Fig. 2.
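The cluster-and-pick-largest procedure of this implementation can be sketched as follows. A coarse bucketing by force and frequency stands in for a real clustering algorithm (such as k-means), and the feature names and thresholds are invented:

```python
from collections import defaultdict

def target_touch_manner(samples):
    """samples: (force, frequency) tuples collected in the preset period."""
    clusters = defaultdict(list)
    for force, freq in samples:
        # coarse force/frequency bands stand in for a real clustering step
        key = ("heavy" if force > 0.6 else "light",
               "frequent" if freq > 2.0 else "sparse")
        clusters[key].append((force, freq))
    # the set containing the most touch data becomes the training target
    label, members = max(clusters.items(), key=lambda kv: len(kv[1]))
    return f"{label[1]} {label[0]} touch", members

manner, members = target_touch_manner(
    [(0.8, 3.1), (0.9, 2.5), (0.7, 2.8), (0.2, 0.5)])
```

Keeping only the dominant cluster filters out stray touches, so the labelled training sample reflects the touch manner the user actually sustained during the preset period.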
Optionally, the touch data includes screen capacitance change data and coordinate change data.
Optionally, the apparatus further includes:
an optimization training sample obtaining module, configured to obtain the touch manner when the user manipulates the terminal device as an optimization touch manner, label the emotional state corresponding to the optimization touch manner, and use the optimization touch manner and its corresponding emotional state as an optimization training sample; the optimization training sample is used to optimize the emotion recognition model.
In specific implementation, the optimization training sample obtaining module can refer to the description of the related content about obtaining optimization training samples in the embodiment shown in Fig. 2.
Optionally, the apparatus further includes:
a feedback information obtaining module, configured to obtain the user's feedback information for the emotion recognition model, the feedback information being used to characterize whether the performance of the emotion recognition model meets the user's needs; and
a first optimization training module, configured to optimize the emotion recognition model with the optimization training samples when the feedback information indicates that the performance of the emotion recognition model does not meet the user's needs.
In specific implementation, the feedback information obtaining module and the first optimization training module can refer to the description of the related content about optimizing the emotion recognition model in the embodiment shown in Fig. 2.
Optionally, the apparatus further includes:
a second optimization training module, configured to optimize the emotion recognition model with the optimization training samples when the terminal device is in a charging state, and/or when the terminal device's remaining battery level is above a preset level, and/or when the duration for which the terminal device has been idle exceeds a preset duration.
In specific implementation, the second optimization training module can refer to the description of the related content about optimizing the emotion recognition model in the embodiment shown in Fig. 2.
The model training apparatus provided by the embodiments of this application can, for each different terminal device, use a machine learning algorithm to train an emotion recognition model specifically suited to that terminal device, based on the touch manners with which that device's user manipulates it and the emotional states corresponding to those touch manners. By applying on each terminal device the emotion recognition model trained for it, it can be ensured that the model accurately determines the user's emotional state from the touch manner with which that device's user manipulates it.
For the emotion recognition method described above, this application also provides a corresponding emotion recognition apparatus, to enable the application and realization of the above emotion recognition method in practice.
Referring to Fig. 5, Fig. 5 is a structural schematic diagram of an emotion recognition apparatus provided by an embodiment of this application; as shown in Fig. 5, the emotion recognition apparatus includes:
a touch manner obtaining module 501, configured to obtain the touch manner when the user manipulates the terminal device; and
an emotional state recognition module 502, configured to use the emotion recognition model to determine the emotional state corresponding to the touch manner, as the user's current emotional state; the emotion recognition model is obtained by performing the model training method described in Fig. 2.
In specific implementation, the touch manner obtaining module 501 can be used to perform the method in step 301; for details, refer to the description of step 301 in the method embodiment shown in Fig. 3. The emotional state recognition module 502 can be used to perform the method in step 302; for details, refer to the description of step 302 in the method embodiment shown in Fig. 3, which is not repeated here.
Optionally, the apparatus further includes:
a display style switching module, configured to switch the display style of the desktop interface according to the user's current emotional state when the terminal device is displaying the desktop interface.
In specific implementation, the display style switching module can refer to the description of the related content about switching the desktop interface display style in the embodiment shown in Fig. 3.
Optionally, the apparatus further includes:
a content recommendation module, configured to recommend related content through an application according to the user's current emotional state when the terminal device has that application open.
In specific implementation, the content recommendation module can refer to the description of the related content about recommending related content through an application in the embodiment shown in Fig. 3.
In the emotion recognition apparatus provided by the embodiments of this application, the terminal device uses the emotion recognition model trained for itself to determine the user's current emotional state from the touch manner with which the user manipulates it. The apparatus can use the emotion recognition model to determine the user's emotional state specifically from the touch manner with which the user manipulates the terminal device, guaranteeing the accuracy of the determined emotional state; moreover, in determining the user's emotional state the apparatus requires no additional external equipment, truly achieving the goal of improving the user experience.
This application also provides a server for training models. Referring to Fig. 6, Fig. 6 is a structural schematic diagram of a server for training models provided by an embodiment of this application. The server 600 may vary considerably due to differences in configuration or performance, and may include one or more central processing units (CPUs) 622 (for example, one or more processors), memory 632, and one or more storage media 630 (for example, one or more mass storage devices) storing application programs 642 or data 644. The memory 632 and the storage media 630 may provide short-term or persistent storage. A program stored on a storage medium 630 may include one or more modules (not shown in the figure), each of which may include a series of instruction operations on the server. Further, the central processing unit 622 may be configured to communicate with the storage medium 630, executing on the server 600 the series of instruction operations in the storage medium 630.
The server 600 may also include one or more power supplies 626, one or more wired or wireless network interfaces 650, one or more input/output interfaces 658, and/or one or more operating systems 641, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, and so on.
The steps performed by the server in the above embodiments may be based on the server structure shown in Fig. 6.
The CPU 622 is configured to perform the following steps:
obtain the touch manner when the user manipulates the terminal device, and label the emotional state corresponding to the touch manner; use the touch manner and its corresponding emotional state as a training sample;
use a machine learning algorithm to train a classification model with the training samples, obtaining an emotion recognition model whose input is the touch manner with which the user manipulates the terminal device and whose output is the emotional state corresponding to that touch manner.
Optionally, the CPU 622 may also perform the method steps of any specific implementation of the model training method in the embodiments of this application.
It should be noted that when the emotion recognition model is trained using the server shown in Fig. 6, the server needs to communicate with the terminal devices to obtain training samples from them. It should be understood that training samples from different terminal devices should carry identifiers of their respective terminal devices, so that the server's CPU 622 can use the training samples from one and the same terminal device to train, with the model training method provided by the embodiments of this application, an emotion recognition model suited to that terminal device.
An embodiment of this application also provides another electronic device for training models and recognizing emotions (the electronic device may be the terminal device described above), configured to perform the model training method provided by the embodiments of this application to train an emotion recognition model suited to itself, and/or to perform the emotion recognition method provided by the embodiments of this application, using the trained emotion recognition model to recognize the user's current emotional state from the touch manner with which the user manipulates the device.
Fig. 7 shows a structural schematic diagram of the above electronic device 100.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and so on. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, a barometric pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and so on.
It can be understood that the structure illustrated in this embodiment of the present invention does not constitute a specific limitation on the electronic device 100. In other embodiments of this application, the electronic device 100 may include more or fewer components than illustrated, combine certain components, split certain components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), and so on. The different processing units may be independent devices or may be integrated in one or more processors.
The controller can generate operation control signals according to instruction opcodes and timing signals, completing the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110, for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. This memory can hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use such an instruction or data again, it can be called directly from this memory, avoiding repeated accesses and reducing the waiting time of the processor 110, thereby improving system efficiency.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, and so on.
The I2C interface is a bidirectional synchronous serial bus, comprising a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may include multiple groups of I2C buses. The processor 110 may be coupled via different I2C bus interfaces to the touch sensor 180K, a charger, a flash, the camera 193, and so on. For example, the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 communicates with the touch sensor 180K through the I2C bus interface, realizing the touch function of the electronic device 100.
The I2S interface can be used for audio communication. In some embodiments, the processor 110 may include multiple groups of I2S buses. The processor 110 may be coupled to the audio module 170 through an I2S bus, realizing communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, realizing the function of answering calls through a Bluetooth headset.
The PCM interface can also be used for audio communication, sampling, quantizing, and encoding analog signals. In some embodiments, the audio module 170 may be coupled to the wireless communication module 160 through a PCM bus interface. In some embodiments, the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface, realizing the function of answering calls through a Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communication. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial and parallel communication. In some embodiments, the UART interface is generally used to connect the processor 110 and the wireless communication module 160. For example, the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface, realizing the Bluetooth function. In some embodiments, the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, realizing the function of playing music through a Bluetooth headset.
The MIPI interface may be used to connect the processor 110 to peripheral components such as the display screen 194 and the camera 193. The MIPI interface includes a camera serial interface (camera serial interface, CSI), a display serial interface (display serial interface, DSI), and the like. In some embodiments, the processor 110 communicates with the camera 193 through the CSI interface, to implement the photographing function of the electronic device 100. The processor 110 communicates with the display screen 194 through the DSI interface, to implement the display function of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal, or as a data signal. In some embodiments, the GPIO interface may be used to connect the processor 110 to the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, or the like.
The USB interface 130 is an interface that conforms to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, or to transmit data between the electronic device 100 and a peripheral device. It may also be used to connect a headset and play audio through the headset. The interface may further be used to connect other electronic devices, such as an AR device.
It can be understood that the interface connection relationships between the modules illustrated in this embodiment of the present invention are merely schematic, and do not constitute a limitation on the structure of the electronic device 100. In other embodiments of this application, the electronic device 100 may also use an interface connection manner different from that in the foregoing embodiment, or a combination of multiple interface connection manners.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger through the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. While charging the battery 142, the charging management module 140 may also supply power to the electronic device through the power management module 141.
The power management module 141 is configured to connect the battery 142, the charging management module 140, and the processor 110. The power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, and battery health state (leakage, impedance). In some other embodiments, the power management module 141 may also be disposed in the processor 110. In still other embodiments, the power management module 141 and the charging management module 140 may also be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.
The antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover one or more communication bands. Different antennas may also be multiplexed to improve antenna utilization. For example, the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, an antenna may be used in combination with a tuning switch.
The mobile communication module 150 may provide wireless communication solutions applied to the electronic device 100, including 2G/3G/4G/5G. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (low noise amplifier, LNA), and the like. The mobile communication module 150 may receive electromagnetic waves through the antenna 1, perform processing such as filtering and amplification on the received electromagnetic waves, and send them to the modem processor for demodulation. The mobile communication module 150 may also amplify a signal modulated by the modem processor, and convert it into an electromagnetic wave for radiation through the antenna 1. In some embodiments, at least some functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some functional modules of the mobile communication module 150 may be disposed in the same device as at least some modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is configured to modulate a low-frequency baseband signal to be sent into a medium- or high-frequency signal. The demodulator is configured to demodulate a received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then sends the demodulated low-frequency baseband signal to the baseband processor. After being processed by the baseband processor, the low-frequency baseband signal is delivered to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, and the like), or displays an image or video through the display screen 194. In some embodiments, the modem processor may be an independent device. In other embodiments, the modem processor may be independent of the processor 110, and disposed in the same device as the mobile communication module 150 or another functional module.
The wireless communication module 160 may provide wireless communication solutions applied to the electronic device 100, including wireless local area networks (wireless local area networks, WLAN) (such as wireless fidelity (wireless fidelity, Wi-Fi) networks), Bluetooth (bluetooth, BT), global navigation satellite systems (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication (near field communication, NFC), infrared (infrared, IR) technology, and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering on the electromagnetic wave signals, and sends the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be sent from the processor 110, perform frequency modulation and amplification on it, and convert it into an electromagnetic wave for radiation through the antenna 2.
In some embodiments, the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology. The GNSS may include the global positioning system (global positioning system, GPS), the global navigation satellite system (global navigation satellite system, GLONASS), the BeiDou navigation satellite system (beidou navigation satellite system, BDS), the quasi-zenith satellite system (quasi-zenith satellite system, QZSS), and/or satellite based augmentation systems (satellite based augmentation systems, SBAS).
The electronic device 100 implements the display function through the GPU, the display screen 194, the application processor, and the like. The GPU is a microprocessor for image processing, and connects the display screen 194 and the application processor. The GPU is configured to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is configured to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may use a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flex light-emitting diode, FLED), a Mini-LED, a Micro-LED, a Micro-OLED, quantum dot light-emitting diodes (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The electronic device 100 may implement the photographing function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
The ISP is configured to process the data fed back by the camera 193. For example, during photographing, the shutter is opened, light is transmitted through the lens to the camera photosensitive element, the optical signal is converted into an electrical signal, and the camera photosensitive element passes the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye. The ISP may also perform algorithm optimization on the noise, brightness, and skin tone of the image, and may also optimize parameters such as the exposure and color temperature of the photographed scene. In some embodiments, the ISP may be disposed in the camera 193.
The camera 193 is configured to capture still images or video. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a complementary metal-oxide-semiconductor (complementary metal-oxide-semiconductor, CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal and passes it to the ISP, which converts it into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the electronic device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
The digital signal processor is configured to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is configured to perform a Fourier transform and the like on the frequency point energy.
The video codec is used to compress or decompress digital video. The electronic device 100 may support one or more video codecs, so that the electronic device 100 can play or record video in multiple encoding formats, such as moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, and MPEG4.
The NPU is a neural-network (neural-network, NN) computing processor. By drawing on the structure of biological neural networks, for example the transfer mode between neurons in the human brain, it processes input information quickly and can also continuously learn by itself. Applications such as intelligent cognition of the electronic device 100 can be implemented through the NPU, for example image recognition, face recognition, speech recognition, and text understanding.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120, to implement the data storage function, for example storing files such as music and videos in the external memory card.
The internal memory 121 may be used to store computer-executable program code, where the executable program code includes instructions. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store the operating system, application programs required by at least one function (such as a sound playback function and an image playback function), and the like. The data storage area may store data created during use of the electronic device 100 (such as audio data and a phone book). In addition, the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, for example at least one magnetic disk storage device, a flash memory device, or universal flash storage (universal flash storage, UFS). The processor 110 executes the various function applications and data processing of the electronic device 100 by running instructions stored in the internal memory 121 and/or instructions stored in a memory disposed in the processor.
The electronic device 100 may implement audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headset interface 170D, the application processor, and the like.
The audio module 170 is configured to convert digital audio information into an analog audio signal for output, and is also configured to convert analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "loudspeaker", is configured to convert an audio electrical signal into a sound signal. The electronic device 100 can play music or conduct a hands-free call through the speaker 170A.
The receiver 170B, also called an "earpiece", is configured to convert an audio electrical signal into a sound signal. When the electronic device 100 answers a call or a voice message, the voice can be heard by placing the receiver 170B close to the ear.
The microphone 170C, also called a "mike" or "mic", is configured to convert a sound signal into an electrical signal. When making a call or sending a voice message, the user can input a sound signal into the microphone 170C by speaking close to it. At least one microphone 170C may be disposed in the electronic device 100. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In still other embodiments, the electronic device 100 may be provided with three, four, or more microphones 170C, which collect sound signals and reduce noise, and can also identify sound sources to implement a directional recording function and the like.
The headset interface 170D is used to connect a wired headset. The headset interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (open mobile terminal platform, OMTP) standard interface or a cellular telecommunications industry association of the USA (cellular telecommunications industry association of the USA, CTIA) standard interface.
The pressure sensor 180A is configured to sense a pressure signal and convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed in the display screen 194. There are many types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may include at least two parallel plates with conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the pressure intensity according to the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation through the pressure sensor 180A, and may also calculate the touch position according to the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch position but have different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
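The intensity-dependent dispatch described above can be sketched as follows. The threshold value and the instruction names are hypothetical placeholders for illustration, not values from the patent.

```python
FIRST_PRESSURE_THRESHOLD = 0.5  # assumed, normalized force units


def dispatch_touch(target, intensity):
    """Map a touch on the short-message icon to an operation instruction
    according to the detected touch operation intensity."""
    if target != "sms_icon":
        return None
    if intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_message"       # light press: view the short message
    return "create_message"         # firm press: create a new short message
```

The same touch position thus yields different instructions purely as a function of the pressure reported by the sensor.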
The gyroscope sensor 180B may be used to determine the motion posture of the electronic device 100. In some embodiments, the angular velocities of the electronic device 100 about three axes (that is, the x, y, and z axes) may be determined through the gyroscope sensor 180B. The gyroscope sensor 180B may be used for image stabilization during photographing. For example, when the shutter is pressed, the gyroscope sensor 180B detects the shaking angle of the electronic device 100, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to counteract the shaking of the electronic device 100 through reverse motion, thereby implementing image stabilization. The gyroscope sensor 180B may also be used in navigation and somatosensory gaming scenarios.
The barometric pressure sensor 180C is configured to measure air pressure. In some embodiments, the electronic device 100 calculates the altitude from the barometric pressure value measured by the barometric pressure sensor 180C, to assist positioning and navigation.
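One common way to convert a barometric pressure reading into an altitude estimate is the international barometric formula for the standard atmosphere; the sketch below assumes a sea-level reference pressure of 1013.25 hPa. The patent does not specify which formula is used, so this is illustrative only.

```python
def pressure_to_altitude(p_hpa, p0_hpa=1013.25):
    """Estimate altitude in meters from barometric pressure (hPa),
    using the international barometric formula (ISA atmosphere)."""
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))
```

At the reference pressure the estimate is 0 m, and lower readings map to higher altitudes, which is the quantity used for positioning assistance.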
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may use the magnetic sensor 180D to detect the opening and closing of a flip leather case. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip cover through the magnetic sensor 180D, and then set features such as automatic unlocking on opening according to the detected open or closed state of the leather case or the flip cover.
The acceleration sensor 180E can detect the magnitude of acceleration of the electronic device 100 in all directions (generally along three axes), and can detect the magnitude and direction of gravity when the electronic device 100 is stationary. It may also be used to identify the posture of the electronic device, and applied to landscape/portrait switching, pedometers, and the like.
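Landscape/portrait switching from the gravity vector can be sketched as a comparison of the accelerometer axes. The axis convention (y along the device's long edge) and the device-at-rest assumption are illustrative, not taken from the patent.

```python
def detect_orientation(ax, ay, az):
    """Infer portrait vs. landscape from the gravity direction reported
    by a 3-axis accelerometer (device at rest; readings in m/s^2).
    Gravity dominating the y axis means the device is held upright."""
    return "landscape" if abs(ax) > abs(ay) else "portrait"
```

A UI layer would debounce this raw decision before actually rotating the screen, but the core signal is just which axis gravity projects onto most strongly.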
The distance sensor 180F is configured to measure distance. The electronic device 100 may measure distance by infrared or laser. In some embodiments, in a photographing scenario, the electronic device 100 may use the distance sensor 180F for ranging to implement fast focusing.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The electronic device 100 emits infrared light outward through the light-emitting diode, and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, the electronic device 100 can determine that there is an object nearby; when insufficient reflected light is detected, the electronic device 100 can determine that there is no object nearby. The electronic device 100 may use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear during a call, so as to automatically turn off the screen to save power. The proximity light sensor 180G may also be used for automatic unlocking and screen locking in leather case mode and pocket mode.
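The sufficient/insufficient reflected-light decision can be sketched as a simple threshold test driving the screen state during a call. The threshold value and its units (raw photodiode counts) are assumed for illustration.

```python
REFLECTION_THRESHOLD = 100  # assumed photodiode ADC counts


def near_ear(reflected_light):
    """Sufficient infrared reflection implies an object (the ear) is near."""
    return reflected_light >= REFLECTION_THRESHOLD


def screen_state_during_call(reflected_light):
    """Turn the screen off when the device is held near the ear, to save power."""
    return "off" if near_ear(reflected_light) else "on"
```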
The ambient light sensor 180L is configured to sense ambient light brightness. The electronic device 100 may adaptively adjust the brightness of the display screen 194 according to the perceived ambient brightness. The ambient light sensor 180L may also be used for automatic white balance adjustment during photographing, and may cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket, to prevent accidental touches.
The fingerprint sensor 180H is configured to collect fingerprints. The electronic device 100 may use the collected fingerprint characteristics to implement fingerprint unlocking, application lock access, fingerprint photographing, fingerprint call answering, and the like.
The temperature sensor 180J is configured to detect temperature. In some embodiments, the electronic device 100 executes a temperature handling strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to avoid an abnormal shutdown of the electronic device 100 caused by low temperature. In still other embodiments, when the temperature is lower than yet another threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
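The three temperature handling strategies described above amount to a threshold-based policy. The sketch below uses assumed threshold values, since the patent does not specify them; only the structure (throttle when hot, heat the battery when cold, boost the output voltage when very cold) follows the text.

```python
THROTTLE_TEMP_C = 45.0   # assumed thresholds for illustration
HEAT_BATTERY_C = 0.0
BOOST_OUTPUT_C = -10.0


def thermal_policy(temp_c):
    """Select a temperature handling strategy from the reported temperature."""
    if temp_c > THROTTLE_TEMP_C:
        return "reduce_processor_performance"  # thermal protection
    if temp_c < BOOST_OUTPUT_C:
        return "boost_battery_output_voltage"  # very cold: avoid shutdown
    if temp_c < HEAT_BATTERY_C:
        return "heat_battery"                  # cold: avoid shutdown
    return "normal"
```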
The touch sensor 180K is also called a "touch device". The touch sensor 180K may be disposed in the display screen 194, and the touch sensor 180K and the display screen 194 form a touchscreen, also called a "touch screen". The touch sensor 180K is configured to detect touch control operations acting on or near it. The touch sensor may pass the detected touch control operation to the application processor, to determine the touch control manner. Visual output related to the touch control operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100, at a location different from that of the display screen 194.
The bone conduction sensor 180M can acquire vibration signals. In some embodiments, the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone of the human vocal part. The bone conduction sensor 180M can also contact the human pulse and receive blood pressure beat signals. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset, combined into a bone conduction headset. The audio module 170 can parse a voice signal based on the vibration signal of the vibrating bone acquired by the bone conduction sensor 180M, to implement a voice function. The application processor can parse heart rate information based on the blood pressure beat signal acquired by the bone conduction sensor 180M, to implement a heart rate detection function.
The keys 190 include a power key, volume keys, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 can receive key input and generate key signal input related to user settings and function control of the electronic device 100.
The motor 191 can generate a vibration prompt. The motor 191 may be used for incoming call vibration prompts, and may also be used for touch vibration feedback. For example, touch operations acting on different applications (such as photographing and audio playback) may correspond to different vibration feedback effects. Touch operations acting on different areas of the display screen 194 may also correspond to different vibration feedback effects of the motor 191. Different application scenarios (for example, time reminders, receiving messages, alarm clocks, and games) may also correspond to different vibration feedback effects. Touch vibration feedback effects may also be customized.
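The scenario-to-effect mapping for vibration feedback, including customized overrides, can be sketched as a lookup table. All scenario and effect names here are hypothetical; the patent only states that different scenarios map to different effects and that effects may be customized.

```python
# hypothetical default scenario -> vibration effect mapping
VIBRATION_EFFECTS = {
    "incoming_call": "long_pulse",
    "alarm_clock": "double_pulse",
    "game": "short_tick",
    "touch_feedback": "light_tick",
}


def vibration_for(scenario, custom=None):
    """Pick a vibration feedback effect; user-customized effects override
    the defaults, and unknown scenarios fall back to a default pulse."""
    if custom and scenario in custom:
        return custom[scenario]
    return VIBRATION_EFFECTS.get(scenario, "default_pulse")
```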
The indicator 192 may be an indicator light, and may be used to indicate the charging state and power changes, and to indicate messages, missed calls, notifications, and the like.
The SIM card interface 195 is used to connect a SIM card. A SIM card can be inserted into or removed from the SIM card interface 195, to make contact with or separate from the electronic device 100. The electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support Nano SIM cards, Micro SIM cards, SIM cards, and the like. Multiple cards can be inserted into the same SIM card interface 195 at the same time; the types of the multiple cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards, and may be compatible with external memory cards. The electronic device 100 interacts with the network through the SIM card, to implement functions such as calls and data communication. In some embodiments, the electronic device 100 uses an eSIM, that is, an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
The software system of the electronic device 100 may use a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. The embodiment of the present invention takes the Android system with a layered architecture as an example to exemplarily describe the software structure of the electronic device 100.
Fig. 8 is a software architecture diagram of the electronic device 100 according to an embodiment of the present invention.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers: from top to bottom, the application layer, the application framework layer, Android runtime and system libraries, and the kernel layer.
The application layer may include a series of application packages.
As shown in Fig. 8, the application packages may include applications such as camera, gallery, calendar, phone, maps, navigation, WLAN, Bluetooth, music, video, and short message.
The application framework layer provides an application programming interface (application programming interface, API) and a programming framework for the applications of the application layer. The application framework layer includes some predefined functions.
As shown in Fig. 8, the application framework layer may include a window manager, content providers, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used to manage window programs. The window manager can obtain the display screen size, determine whether there is a status bar, lock the screen, capture the screen, and so on.
Content providers are used to store and retrieve data and make that data accessible to applications. The data may include videos, images, audio, calls made and received, browsing history and bookmarks, the phone book, and so on.
The view system includes visual controls, such as controls for displaying text and controls for displaying pictures. The view system can be used to build applications. A display interface may consist of one or more views. For example, a display interface that includes a short message notification icon may include a view for displaying text and a view for displaying pictures.
The telephony manager is used to provide the communication functions of the electronic device 100, for example call state management (including connecting, hanging up, and so on).
The resource manager provides applications with various resources, such as localized strings, icons, pictures, layout files, and video files.
The notification manager allows applications to display notification information in the status bar, and can be used to convey notification-type messages that automatically disappear after a short stay without user interaction. For example, the notification manager is used to notify that a download is complete, to give message reminders, and so on. The notification manager may also present notifications in the system's top status bar in the form of a chart or scroll bar text, for example notifications from applications running in the background, or present notifications on the screen in the form of a dialog box. Other examples include prompting text information in the status bar, issuing a prompt tone, vibrating the electronic device, and flashing the indicator light.
Android Runtime includes the core libraries and the virtual machine. Android Runtime is responsible for the scheduling and management of the Android system.
The core libraries include two parts: one part is the functions that the Java language needs to call, and the other part is the core libraries of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system libraries may include multiple functional modules, for example: the surface manager (surface manager), the media libraries (Media Libraries), the three-dimensional graphics processing library (such as OpenGL ES), and the 2D graphics engine (such as SGL).
The surface manager is used to manage the display subsystem, and provides fusion of 2D and 3D layers for multiple applications.
The media libraries support playback and recording of many common audio and video formats, as well as static image files. The media libraries can support multiple audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphics processing library is used to implement three-dimensional graphics drawing, image rendering, compositing, layer processing, and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is the layer between hardware and software. The kernel layer contains at least the display driver, the camera driver, the audio driver, and the sensor drivers.
The following exemplarily describes the workflow of the software and hardware of the electronic device 100 with reference to a photographing scenario.
When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is sent to the kernel layer. The kernel layer processes the touch operation into a raw input event (including information such as the touch coordinates and the timestamp of the touch operation). The raw input event is stored in the kernel layer. The application framework layer obtains the raw input event from the kernel layer and identifies the control corresponding to the input event. Taking the touch operation being a tap, and the control corresponding to the tap being the camera application icon, as an example: the camera application calls the interface of the application framework layer to start the camera application, which in turn starts the camera driver by calling the kernel layer, and captures a still image or video through the camera 193.
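The event flow described above (hardware interrupt → kernel raw input event → framework hit-testing → application start) can be sketched schematically. None of the function names or data shapes below are actual Android APIs; they are illustrative stand-ins for the layers named in the text.

```python
def kernel_process_interrupt(x, y, timestamp):
    """Kernel layer: package a hardware interrupt from the touch sensor
    into a raw input event (touch coordinates plus timestamp)."""
    return {"x": x, "y": y, "timestamp": timestamp}


# hypothetical hit table mapping screen coordinates to controls
CONTROLS = {(100, 200): "camera_app_icon"}


def framework_dispatch(event):
    """Application framework layer: read the raw event from the kernel,
    identify the control that was hit, and start the matching application."""
    control = CONTROLS.get((event["x"], event["y"]))
    if control == "camera_app_icon":
        return "start_camera_app"  # would then start the camera driver
    return "ignore"
```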
The embodiment of the present application also provides a kind of computer readable storage medium, for storing program code, the program code For executing any one embodiment and/or Emotion identification side in model training method described in foregoing individual embodiments Any one embodiment in method.
The embodiment of the present application also provides a kind of computer program product including instruction, when run on a computer, So that computer executes any one embodiment and/or mood in model training method described in foregoing individual embodiments Any one embodiment in recognition methods.
It will be apparent to those skilled in the art that, for convenience and brevity of description, the specific working processes of the systems, apparatuses, and units described above may refer to the corresponding processes in the foregoing method embodiments, and details are not repeated here.
In the several embodiments provided in this application, it should be understood that the disclosed systems, apparatuses, and methods may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative: the division into units is only a logical functional division, and other divisions are possible in actual implementation; for instance, multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. Furthermore, the mutual couplings, direct couplings, or communication connections shown or discussed may be implemented through interfaces; the indirect couplings or communication connections between apparatuses or units may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the purposes of the solutions of the embodiments.
In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application essentially, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above embodiments are merely intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that they may still modify the technical solutions described in the foregoing embodiments, or make equivalent replacements of some of the technical features therein; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (24)

1. A model training method, characterized in that the method comprises:
obtaining a touch control manner used when a user operates a terminal device, and labeling an emotional state corresponding to the touch control manner; and taking the touch control manner and the emotional state corresponding to the touch control manner as a training sample; and
training a classification model with the training sample by using a machine learning algorithm, to obtain an emotion recognition model, wherein the emotion recognition model takes as input the touch control manner used when the user operates the terminal device and takes as output the emotional state corresponding to the touch control manner.
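As a toy illustration of the training step in claim 1 (and expressly not the patent's actual machine learning algorithm), a classifier mapping touch control manners to emotional states can be sketched as a majority-vote model over labeled samples; the sample touch manners and emotion labels below are invented for demonstration.

```python
from collections import Counter, defaultdict

def train_emotion_model(training_samples):
    """Train a toy classifier: for each touch control manner, predict the
    emotional state that was most frequently labeled for it."""
    by_manner = defaultdict(Counter)
    for touch_manner, emotional_state in training_samples:
        by_manner[touch_manner][emotional_state] += 1
    # The "model" maps each touch manner to its majority emotional state.
    return {m: c.most_common(1)[0][0] for m, c in by_manner.items()}

samples = [
    ("fast_heavy_tap", "angry"),
    ("fast_heavy_tap", "angry"),
    ("fast_heavy_tap", "calm"),
    ("slow_light_swipe", "calm"),
]
model = train_emotion_model(samples)
```

A real embodiment would instead feed numeric touch features into a proper classifier; the input/output contract (touch manner in, emotional state out) is the part this sketch preserves.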
2. The method according to claim 1, characterized in that the labeling an emotional state corresponding to the touch control manner comprises:
determining a reference time period according to a trigger time corresponding to the touch control manner;
obtaining operation data content generated when the user operates the terminal device within the reference time period; and
determining the emotional state of the user according to the operation data content, as the emotional state corresponding to the touch control manner.
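The steps of claim 2 can be sketched as follows. The window length and the keyword rule for deriving an emotion from operation data content are illustrative assumptions, not taken from the patent.

```python
def label_emotion(trigger_time, operations, window=60.0,
                  sad_words=("terrible", "awful")):
    """Claim-2-style labeling sketch: examine operation data content generated
    within [trigger_time - window, trigger_time + window] and derive a label
    from it (here, via a hypothetical keyword rule on typed text)."""
    start, end = trigger_time - window, trigger_time + window
    in_window = [text for ts, text in operations if start <= ts <= end]
    if any(w in text for text in in_window for w in sad_words):
        return "sad"
    return "neutral"

# (timestamp, typed text) pairs standing in for operation data content
ops = [(100.0, "what an awful day"), (500.0, "great weather")]
label = label_emotion(trigger_time=120.0, operations=ops)
```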
3. The method according to claim 1, characterized in that the labeling an emotional state corresponding to the touch control manner comprises:
invoking a preset emotional state mapping table, wherein the emotional state mapping table records correspondences between touch control manners and emotional states; and
looking up the emotional state mapping table to determine the emotional state corresponding to the touch control manner.
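The mapping table of claim 3 can be represented as a plain dictionary lookup; the concrete entries below are assumptions for illustration, not from the patent.

```python
# Illustrative preset emotional state mapping table.
EMOTION_TABLE = {
    "heavy_rapid_tap": "irritated",
    "light_slow_swipe": "relaxed",
}

def lookup_emotion(touch_manner, table=EMOTION_TABLE, default="unknown"):
    """Claim-3-style labeling: look the touch control manner up in the
    preset table to determine its corresponding emotional state."""
    return table.get(touch_manner, default)

state = lookup_emotion("heavy_rapid_tap")
```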
4. The method according to any one of claims 1 to 3, characterized in that the obtaining a touch control manner used when a user operates a terminal device, labeling an emotional state corresponding to the touch control manner, and taking the touch control manner and the emotional state corresponding to the touch control manner as a training sample comprises:
collecting, within a preset time period, touch data generated when the user operates the terminal device;
performing clustering processing on the touch data to generate touch data sets, and determining the touch control manner corresponding to each touch data set;
taking the touch data set containing the most touch data as a target touch data set, taking the touch control manner corresponding to the target touch data set as a target touch control manner, and labeling the emotional state corresponding to the target touch control manner; and
taking the target touch control manner and the emotional state corresponding to the target touch control manner as a training sample.
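The cluster-then-pick-largest logic of claim 4 can be sketched as follows. A real embodiment would use a proper clustering algorithm (e.g. k-means); here a deterministic grid-bucketing stand-in groups similar (capacitance change, coordinate change) samples, and the grid size is an invented parameter.

```python
from collections import defaultdict

def cluster_touch_data(touch_samples, grid=0.5):
    """Toy stand-in for the clustering step of claim 4: bucket touch samples
    (capacitance_change, coordinate_change) onto a coarse grid, so similar
    samples fall into the same touch data set."""
    clusters = defaultdict(list)
    for cap, coord in touch_samples:
        key = (round(cap / grid), round(coord / grid))
        clusters[key].append((cap, coord))
    return clusters

def target_touch_set(clusters):
    """Pick the touch data set containing the most touch data."""
    return max(clusters.values(), key=len)

samples = [(1.0, 0.9), (1.1, 1.0), (0.9, 1.1), (3.0, 3.1)]
clusters = cluster_touch_data(samples)
target = target_touch_set(clusters)  # the three mutually similar samples
```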
5. The method according to claim 4, characterized in that the touch data comprises screen capacitance change data and coordinate value change data.
6. The method according to claim 1, characterized in that after the obtaining an emotion recognition model, the method further comprises:
obtaining a touch control manner used when the user operates the terminal device, as an optimization touch control manner; labeling an emotional state corresponding to the optimization touch control manner; and taking the optimization touch control manner and the emotional state corresponding to the optimization touch control manner as an optimization training sample, wherein the optimization training sample is used for performing optimization training on the emotion recognition model.
7. The method according to claim 6, characterized in that the method further comprises:
obtaining feedback information from the user regarding the emotion recognition model, wherein the feedback information is used to characterize whether the performance of the emotion recognition model meets the user's needs; and
performing optimization training on the emotion recognition model by using the optimization training sample when the feedback information characterizes that the performance of the emotion recognition model does not meet the user's needs.
8. The method according to claim 6, characterized in that the method further comprises:
performing optimization training on the emotion recognition model by using the optimization training sample when the terminal device is in a charging state, and/or when the remaining battery level of the terminal device is higher than a preset level, and/or when the duration for which the terminal device has been idle exceeds a preset duration.
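The retraining gate of claim 8 is a simple disjunction of device conditions. In this sketch the threshold values are illustrative assumptions; the patent only requires that some preset level and preset duration exist.

```python
def should_optimize(charging, battery_pct, idle_seconds,
                    battery_threshold=80, idle_threshold=600):
    """Claim-8-style gate: run optimization training when the device is
    charging, OR its battery level exceeds a preset level, OR it has been
    idle longer than a preset duration."""
    return (charging
            or battery_pct > battery_threshold
            or idle_seconds > idle_threshold)

ok = should_optimize(charging=False, battery_pct=90, idle_seconds=0)
```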
9. An emotion recognition method, characterized in that the method comprises:
obtaining a touch control manner used when a user operates a terminal device; and
determining, by using an emotion recognition model, the emotional state corresponding to the touch control manner as the user's current emotional state, wherein the emotion recognition model is obtained by performing the model training method according to any one of claims 1 to 8.
10. The method according to claim 9, characterized in that the method further comprises:
in a case where the terminal device displays a desktop interface, switching the display style of the desktop interface according to the user's current emotional state.
11. The method according to claim 9, characterized in that the method further comprises:
in a case where the terminal device has an application open, recommending related content through the application according to the user's current emotional state.
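Claims 10 and 11 describe emotion-driven adaptations of the device. A combined sketch follows; the theme names and recommendation entries are invented for illustration and are not part of the claims.

```python
# Hypothetical emotion-to-action policy illustrating claims 10 and 11.
THEME_BY_EMOTION = {"sad": "warm_theme", "happy": "bright_theme"}
CONTENT_BY_EMOTION = {"sad": ["uplifting music"], "happy": ["party playlist"]}

def react_to_emotion(emotion, app_open):
    """Switch the desktop display style according to the current emotional
    state (claim 10), and recommend related content if an application is
    open (claim 11)."""
    theme = THEME_BY_EMOTION.get(emotion, "default_theme")
    recs = CONTENT_BY_EMOTION.get(emotion, []) if app_open else []
    return theme, recs

theme, recs = react_to_emotion("sad", app_open=True)
```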
12. A model training apparatus, characterized in that the apparatus comprises:
a training sample obtaining module, configured to obtain a touch control manner used when a user operates a terminal device, label an emotional state corresponding to the touch control manner, and take the touch control manner and the emotional state corresponding to the touch control manner as a training sample; and
a model training module, configured to train a classification model with the training sample by using a machine learning algorithm, to obtain an emotion recognition model, wherein the emotion recognition model takes as input the touch control manner used when the user operates the terminal device and takes as output the emotional state corresponding to the touch control manner.
13. The apparatus according to claim 12, characterized in that the training sample obtaining module is specifically configured to:
determine a reference time period according to a trigger time corresponding to the touch control manner;
obtain operation data content generated when the user operates the terminal device within the reference time period; and
determine the emotional state of the user according to the operation data content, as the emotional state corresponding to the touch control manner.
14. The apparatus according to claim 12, characterized in that the training sample obtaining module is specifically configured to:
invoke a preset emotional state mapping table, wherein the emotional state mapping table records correspondences between touch control manners and emotional states; and
look up the emotional state mapping table to determine the emotional state corresponding to the touch control manner.
15. The apparatus according to any one of claims 12 to 14, characterized in that the training sample obtaining module is specifically configured to:
collect, within a preset time period, touch data generated when the user operates the terminal device;
perform clustering processing on the touch data to generate touch data sets, and determine the touch control manner corresponding to each touch data set;
take the touch data set containing the most touch data as a target touch data set, take the touch control manner corresponding to the target touch data set as a target touch control manner, and label the emotional state corresponding to the target touch control manner; and
take the target touch control manner and the emotional state corresponding to the target touch control manner as a training sample.
16. The apparatus according to claim 15, characterized in that the touch data comprises screen capacitance change data and coordinate value change data.
17. The apparatus according to claim 12, characterized in that the apparatus further comprises:
an optimization training sample obtaining module, configured to obtain a touch control manner used when the user operates the terminal device, as an optimization touch control manner; label an emotional state corresponding to the optimization touch control manner; and take the optimization touch control manner and the emotional state corresponding to the optimization touch control manner as an optimization training sample, wherein the optimization training sample is used for performing optimization training on the emotion recognition model.
18. The apparatus according to claim 17, characterized in that the apparatus further comprises:
a feedback information obtaining module, configured to obtain feedback information from the user regarding the emotion recognition model, wherein the feedback information is used to characterize whether the performance of the emotion recognition model meets the user's needs; and
a first optimization training module, configured to perform optimization training on the emotion recognition model by using the optimization training sample when the feedback information characterizes that the performance of the emotion recognition model does not meet the user's needs.
19. The apparatus according to claim 17, characterized in that the apparatus further comprises:
a second optimization training module, configured to perform optimization training on the emotion recognition model by using the optimization training sample when the terminal device is in a charging state, and/or when the remaining battery level of the terminal device is higher than a preset level, and/or when the duration for which the terminal device has been idle exceeds a preset duration.
20. An emotion recognition apparatus, characterized in that the apparatus comprises:
a touch control manner obtaining module, configured to obtain a touch control manner used when a user operates a terminal device; and
an emotional state recognition module, configured to determine, by using an emotion recognition model, the emotional state corresponding to the touch control manner as the user's current emotional state, wherein the emotion recognition model is obtained by performing the model training method according to any one of claims 1 to 8.
21. The apparatus according to claim 20, characterized in that the apparatus further comprises:
a display style switching module, configured to switch, in a case where the terminal device displays a desktop interface, the display style of the desktop interface according to the user's current emotional state.
22. The apparatus according to claim 20, characterized in that the apparatus further comprises:
a content recommendation module, configured to recommend, in a case where the terminal device has an application open, related content through the application according to the user's current emotional state.
23. An electronic device, characterized in that the electronic device comprises a processor and a memory, wherein:
the memory is configured to store program code and transmit the program code to the processor; and
the processor is configured to execute, according to instructions in the program code, the model training method according to any one of claims 1 to 8, and/or the emotion recognition method according to any one of claims 9 to 11.
24. A computer-readable storage medium, comprising instructions which, when run on a computer, cause the computer to execute the model training method according to any one of claims 1 to 8, and/or the emotion recognition method according to any one of claims 9 to 11.
CN201910309245.5A 2019-04-17 2019-04-17 Model training method, emotion recognition method, and related device and equipment Active CN110134316B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910309245.5A CN110134316B (en) 2019-04-17 2019-04-17 Model training method, emotion recognition method, and related device and equipment
PCT/CN2020/084216 WO2020211701A1 (en) 2019-04-17 2020-04-10 Model training method, emotion recognition method, related apparatus and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910309245.5A CN110134316B (en) 2019-04-17 2019-04-17 Model training method, emotion recognition method, and related device and equipment

Publications (2)

Publication Number Publication Date
CN110134316A true CN110134316A (en) 2019-08-16
CN110134316B CN110134316B (en) 2021-12-24

Family

ID=67570305

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910309245.5A Active CN110134316B (en) 2019-04-17 2019-04-17 Model training method, emotion recognition method, and related device and equipment

Country Status (2)

Country Link
CN (1) CN110134316B (en)
WO (1) WO2020211701A1 (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111530081A (en) * 2020-04-17 2020-08-14 成都数字天空科技有限公司 Game level design method and device, storage medium and electronic equipment
CN111626191A (en) * 2020-05-26 2020-09-04 深圳地平线机器人科技有限公司 Model generation method and device, computer readable storage medium and electronic device
WO2020211701A1 (en) * 2019-04-17 2020-10-22 华为技术有限公司 Model training method, emotion recognition method, related apparatus and device
CN112906555A (en) * 2021-02-10 2021-06-04 华南师范大学 Artificial intelligence mental robot and method for recognizing expressions from person to person
WO2021139471A1 (en) * 2020-01-06 2021-07-15 华为技术有限公司 Health status test method and device, and computer storage medium
CN113656635A (en) * 2021-09-03 2021-11-16 咪咕音乐有限公司 Video color ring back tone synthesis method, device, equipment and computer readable storage medium
CN113791690A (en) * 2021-09-22 2021-12-14 入微智能科技(南京)有限公司 Man-machine interaction public equipment with real-time emotion recognition function
CN114223139A (en) * 2019-10-29 2022-03-22 深圳市欢太科技有限公司 Interface switching method and device, wearable electronic equipment and storage medium
CN115457645A (en) * 2022-11-11 2022-12-09 青岛网信信息科技有限公司 User emotion analysis method, medium and system based on interactive verification
CN115611393A (en) * 2022-11-07 2023-01-17 中节能晶和智慧城市科技(浙江)有限公司 Multi-end cooperative coagulant feeding method and system for multiple water plants
CN116662638A (en) * 2022-09-06 2023-08-29 荣耀终端有限公司 Data acquisition method and related device

Families Citing this family (4)

Publication number Priority date Publication date Assignee Title
CN115657870A (en) * 2021-07-07 2023-01-31 荣耀终端有限公司 Method for adjusting sampling rate of touch screen and electronic equipment
CN113744738B (en) * 2021-09-10 2024-03-19 安徽淘云科技股份有限公司 Man-machine interaction method and related equipment thereof
CN114363049A (en) * 2021-12-30 2022-04-15 武汉杰创达科技有限公司 Internet of things equipment multi-ID identification method based on personalized interaction difference
CN115496113B (en) * 2022-11-17 2023-04-07 深圳市中大信通科技有限公司 Emotional behavior analysis method based on intelligent algorithm

Citations (6)

Publication number Priority date Publication date Assignee Title
CN103926997A (en) * 2013-01-11 2014-07-16 北京三星通信技术研究有限公司 Method for determining emotional information based on user input and terminal
US20160027452A1 * 2014-07-28 2016-01-28 Sony Computer Entertainment Inc. Emotional speech processing
CN105549885A (en) * 2015-12-10 2016-05-04 重庆邮电大学 Method and device for recognizing user emotion during screen sliding operation
CN106528538A (en) * 2016-12-07 2017-03-22 竹间智能科技(上海)有限公司 Method and device for intelligent emotion recognition
CN108227932A (en) * 2018-01-26 2018-06-29 上海智臻智能网络科技股份有限公司 Interaction is intended to determine method and device, computer equipment and storage medium
CN108334583A (en) * 2018-01-26 2018-07-27 上海智臻智能网络科技股份有限公司 Affective interaction method and device, computer readable storage medium, computer equipment

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
US10884503B2 (en) * 2015-12-07 2021-01-05 Sri International VPA with integrated object recognition and facial expression recognition
CN106055236A (en) * 2016-05-30 2016-10-26 努比亚技术有限公司 Content pushing method and terminal
CN108073336A (en) * 2016-11-18 2018-05-25 香港中文大学 User emotion detecting system and method based on touch
CN107608956B (en) * 2017-09-05 2021-02-19 广东石油化工学院 Reader emotion distribution prediction algorithm based on CNN-GRNN
CN110134316B (en) * 2019-04-17 2021-12-24 华为技术有限公司 Model training method, emotion recognition method, and related device and equipment

Patent Citations (6)

Publication number Priority date Publication date Assignee Title
CN103926997A (en) * 2013-01-11 2014-07-16 北京三星通信技术研究有限公司 Method for determining emotional information based on user input and terminal
US20160027452A1 * 2014-07-28 2016-01-28 Sony Computer Entertainment Inc. Emotional speech processing
CN105549885A (en) * 2015-12-10 2016-05-04 重庆邮电大学 Method and device for recognizing user emotion during screen sliding operation
CN106528538A (en) * 2016-12-07 2017-03-22 竹间智能科技(上海)有限公司 Method and device for intelligent emotion recognition
CN108227932A (en) * 2018-01-26 2018-06-29 上海智臻智能网络科技股份有限公司 Interaction is intended to determine method and device, computer equipment and storage medium
CN108334583A (en) * 2018-01-26 2018-07-27 上海智臻智能网络科技股份有限公司 Affective interaction method and device, computer readable storage medium, computer equipment

Cited By (16)

Publication number Priority date Publication date Assignee Title
WO2020211701A1 (en) * 2019-04-17 2020-10-22 华为技术有限公司 Model training method, emotion recognition method, related apparatus and device
CN114223139A (en) * 2019-10-29 2022-03-22 深圳市欢太科技有限公司 Interface switching method and device, wearable electronic equipment and storage medium
CN114223139B (en) * 2019-10-29 2023-11-24 深圳市欢太科技有限公司 Interface switching method and device, wearable electronic equipment and storage medium
WO2021139471A1 (en) * 2020-01-06 2021-07-15 华为技术有限公司 Health status test method and device, and computer storage medium
CN111530081A (en) * 2020-04-17 2020-08-14 成都数字天空科技有限公司 Game level design method and device, storage medium and electronic equipment
CN111626191A (en) * 2020-05-26 2020-09-04 深圳地平线机器人科技有限公司 Model generation method and device, computer readable storage medium and electronic device
CN112906555A (en) * 2021-02-10 2021-06-04 华南师范大学 Artificial intelligence mental robot and method for recognizing expressions from person to person
CN112906555B (en) * 2021-02-10 2022-08-05 华南师范大学 Artificial intelligence mental robot and method for recognizing expressions from person to person
CN113656635A (en) * 2021-09-03 2021-11-16 咪咕音乐有限公司 Video color ring back tone synthesis method, device, equipment and computer readable storage medium
CN113656635B (en) * 2021-09-03 2024-04-09 咪咕音乐有限公司 Video color ring synthesis method, device, equipment and computer readable storage medium
CN113791690A (en) * 2021-09-22 2021-12-14 入微智能科技(南京)有限公司 Man-machine interaction public equipment with real-time emotion recognition function
CN113791690B (en) * 2021-09-22 2024-03-29 入微智能科技(南京)有限公司 Man-machine interaction public equipment with real-time emotion recognition function
CN116662638A (en) * 2022-09-06 2023-08-29 荣耀终端有限公司 Data acquisition method and related device
CN116662638B (en) * 2022-09-06 2024-04-12 荣耀终端有限公司 Data acquisition method and related device
CN115611393A (en) * 2022-11-07 2023-01-17 中节能晶和智慧城市科技(浙江)有限公司 Multi-end cooperative coagulant feeding method and system for multiple water plants
CN115457645A (en) * 2022-11-11 2022-12-09 青岛网信信息科技有限公司 User emotion analysis method, medium and system based on interactive verification

Also Published As

Publication number Publication date
CN110134316B (en) 2021-12-24
WO2020211701A1 (en) 2020-10-22

Similar Documents

Publication Publication Date Title
CN110134316A Model training method, emotion recognition method, and related apparatus and device
KR102470275B1 (en) Voice control method and electronic device
CN109271081B Scrolling screenshot method and electronic device
EP3893129A1 (en) Recommendation method based on user exercise state, and electronic device
CN110910872B (en) Voice interaction method and device
CN110362373A Method for controlling a small screen window and related device
CN109766043A Operation method of an electronic device and electronic device
CN110417991A Screen recording method and electronic device
CN111061912A Method for processing video file and electronic device
CN110045819A Gesture processing method and device
CN109890067A Method for identifying a specific position on a specific route and electronic device
CN110506416A Method for a terminal to switch cameras, and terminal
CN110032307A Method for moving an application icon and electronic device
CN110138959A Method for displaying a human-computer interaction instruction prompt and electronic device
CN110531864A Gesture interaction method, apparatus, and terminal device
CN110489215A Method and apparatus for handling a waiting scene in an application
CN110244893A Operation method for split-screen display and electronic device
CN109976626A Method for switching application icons and electronic device
CN109920240A Method, apparatus, and device for automatically matching an infrared controller with an infrared device
CN109274828A Method for generating a screenshot, control method, and electronic device
WO2022068819A1 (en) Interface display method and related apparatus
WO2022095788A1 (en) Panning photography method for target user, electronic device, and storage medium
CN111742539B (en) Voice control command generation method and terminal
CN109634495A Payment method, device and user equipment
CN110059211A Method for recording user emotion and related apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant