CN103926997A - Method for determining emotional information based on user input and terminal - Google Patents

Method for determining emotional information based on user input and terminal

Info

Publication number: CN103926997A
Application number: CN201310011809.XA
Authority: CN (China)
Prior art keywords: user, information, emotion, model, mood
Inventor: 王炜
Current assignee: Beijing Samsung Telecom R&D Center; Beijing Samsung Telecommunications Technology Research Co Ltd; Samsung Electronics Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis)
Original assignee: Beijing Samsung Telecommunications Technology Research Co Ltd; Samsung Electronics Co Ltd
Application filed by Beijing Samsung Telecommunications Technology Research Co Ltd and Samsung Electronics Co Ltd
Filing date: 2013-01-11
Publication date: 2014-07-16
Other languages: Chinese (zh)
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a method for determining emotional information based on user input. The method includes: collecting a user's input parameter data with a sensor located beneath the terminal panel; performing feature extraction on the parameter data to determine corresponding feature information; selecting, from a model library, an emotion model that matches the feature information; and determining the user's emotional information according to the emotion type corresponding to the selected emotion model. The invention further discloses a terminal. With the method, a user's emotional information can be provided.

Description

Method for determining emotional information based on user input, and terminal
Technical field
The present application relates to terminal technology, and in particular to a method and terminal for determining emotional information based on user input.
Background technology
In the mobile Internet era, information is largely exchanged through intelligent terminals. The pace of daily life keeps accelerating, and users' moods fluctuate considerably: when in a bad mood, people often long for comfort from friends and relatives, and when happy, they want to share the feeling with everyone.
Although, with the rapid spread of intelligent terminals and the availability of real-time connectivity, some information can be transmitted while chatting or collaborating, this is limited to simple explicit data such as the transmitted "writing" itself; it cannot convey the writer's current state or enable more personalized communication.
Therefore, a method is needed that can capture the current user's mood and deliver it to other users, so that users can share their moods and the recipients can respond in a timely manner, for example by sending mood-lifting messages and content, or by being affected by the sender's mood, thereby improving the mobile terminal's perceived role as a personal companion.
Summary of the invention
The present application provides a method and terminal for determining emotional information based on user input, which can provide a user's emotional information.
A method for determining emotional information based on user input comprises:
collecting a user's input parameter data with a sensor located beneath the terminal panel, performing feature extraction on the parameter data to determine corresponding feature information, selecting from a model library the emotion model matching the feature information, and determining the user's emotional information according to the emotion type corresponding to the selected emotion model.
Preferably, the parameter data comprise input pressure, input position, input speed, input angle, input acceleration and/or input trajectory.
Preferably, the collected parameter data are statistically analyzed and feature-extracted by time period.
Preferably, the feature information comprises: the average movement range during user input, hover (in-air) time, average movement speed, average pressure level, average curvature of the input trajectory, and/or the input text content.
Preferably, any emotion model in the model library is established by:
collecting in advance, via the sensor, multiple groups of input parameter data produced while inputting with a specified emotion type, performing feature extraction on each group of input parameter data to determine corresponding feature information, and statistically determining from the feature information of all groups the emotion model corresponding to the specified emotion type;
and/or custom-defining the feature information corresponding to any emotion model and the value range of each feature;
and/or custom-defining the input keywords corresponding to any emotion model.
Preferably, the emotional information is represented by text and/or pictures.
Preferably, determining the user's emotional information according to the emotion type corresponding to the selected emotion model comprises:
taking the emotion type corresponding to the selected emotion model as the user's real-time emotional information;
and/or saving the emotion type corresponding to the selected emotion model together with the corresponding time information, and determining the average emotional information within a preset time according to the emotion types and corresponding time information saved within the preset time.
Preferably, the method further comprises:
sending the user's emotional information to an authorized counterpart through communication software; and/or
displaying the user's emotional information on the interface of the local device; and/or
updating the desktop image or icon of the local device according to the user's emotional information; and/or
updating the default avatar state in communication software according to the user's emotional information; and/or
sending the user's emotional information to the currently running game software; and/or
sending the user's emotional information to other players in the current game.
Preferably, when determining the user's emotional information, the determination is further made according to a facial image of the user collected by a camera.
A terminal comprises: a sensor, a model library, an emotion recognition unit, and a processing unit.
The sensor is configured to collect a user's input parameter data.
The emotion recognition unit is configured to perform feature extraction on the parameter data collected by the sensor to determine corresponding feature information, and to select from the model library the emotion model matching the feature information.
The processing unit is configured to determine the user's emotional information according to the emotion type corresponding to the emotion model selected by the emotion recognition unit.
The model library is configured to store the emotion models.
Preferably, the emotion recognition unit is further configured to statistically analyze and feature-extract the parameter data by time period.
Preferably, the terminal further comprises an emotion-model establishing unit.
The sensor is further configured to collect multiple groups of input parameter data produced while inputting with a specified emotion type; the emotion-model establishing unit is configured to perform feature extraction on each group of input parameter data collected by the sensor to determine corresponding feature information, to statistically determine from the feature information of all groups the emotion model corresponding to the specified emotion type, and to save it in the model library.
And/or the emotion-model establishing unit is configured to custom-define, in the model library, the feature information corresponding to any emotion model and the value range of each feature.
And/or the emotion-model establishing unit is configured to custom-define, in the model library, the input keywords corresponding to any emotion model.
Preferably, the processing unit is configured to take the emotion type corresponding to the emotion model selected by the emotion recognition unit as the user's real-time emotional information.
And/or the processing unit is configured to save the emotion type corresponding to the emotion model selected by the emotion recognition unit together with the corresponding time information, and to determine the average emotional information within a preset time according to the emotion types and corresponding time information saved within the preset time.
Preferably, the terminal further comprises an output unit.
The output unit is configured to send the user's emotional information determined by the processing unit to an authorized counterpart through communication software;
and/or to display the user's emotional information determined by the processing unit on the interface of the local device;
and/or to update the desktop image or icon of the local device according to the user's emotional information determined by the processing unit;
and/or to update the default avatar state in communication software according to the user's emotional information determined by the processing unit;
and/or to send the user's emotional information determined by the processing unit to the currently running game software;
and/or to send the user's emotional information determined by the processing unit to other players in the current game.
Preferably, the terminal further comprises a camera and a face recognition unit.
The camera is configured to collect a facial image of the user during input.
The face recognition unit is configured to recognize and analyze the facial image collected by the camera.
The processing unit is further configured to determine the user's emotional information according to the recognition and analysis results of the face recognition unit.
In the above technical solution, a sensor in the terminal collects the user's input parameter data; feature extraction is performed on the parameter data to determine corresponding feature information; the emotion model matching the feature information is then selected from the model library; and finally the user's emotional information is determined according to the emotion type corresponding to the selected emotion model. Through this processing, the parameter data produced while the user inputs, such as pressure and position, can be used to analyze and determine the user's emotional information.
Brief description of the drawings
Fig. 1 is a schematic flowchart of the basic method for determining emotional information in the present application;
Fig. 2 is a schematic flowchart of the method of Example 1 of the present application;
Fig. 3 is a schematic diagram of the basic structure of the terminal in the present application.
Embodiment
To make the purpose, technical means, and advantages of the present application clearer, the present application is described in further detail below with reference to the accompanying drawings.
The basic idea of the present application is: while writing, a person involuntarily betrays his or her current true mood, so the user's emotion can be determined by monitoring the user's actions during the writing process.
Specifically, the method of the present application comprises: collecting a user's input parameter data with a sensor located beneath the terminal panel, performing feature extraction on the parameter data to determine corresponding feature information, selecting from a model library the emotion model matching the feature information, and determining the user's emotional information according to the emotion type corresponding to the selected emotion model.
The user may input by handwriting or with a stylus; styluses can further be divided into electromagnetic pens, capacitive pens, and so on. Different sensing methods require different sensors. Because the sensing precision of an electromagnetic pen is higher than that of a capacitive pen, more information can be collected when inputting with an electromagnetic pen. The specific implementation of the present application is described in detail below through embodiments, taking electromagnetic-pen input as an example. Fig. 1 is a schematic flowchart of the method for determining emotional information in the present application. As shown in Fig. 1, the method comprises:
Step 101: an electromagnetic sensor located beneath the terminal panel collects the user's input parameter data.
Here, the present embodiment uses electromagnetic-pen input, so the corresponding sensor is an electromagnetic sensor. The input parameter data are the action parameters of the user's input process and may include input pressure, input position, input speed, input angle, input acceleration and/or input trajectory. These parameters can be collected using various existing techniques.
Specifically, when the user writes with an electromagnetic pen on a digitizer tablet (currently a common handwriting input device), the pen tip can be sampled at rates up to 150 times per second. Pen-down and pen-up states are generally determined by the pen tip's position and the pressure between the pen and the tablet: when the distance between the tip and the tablet falls below a set threshold, a pen-down state is triggered; when it exceeds the threshold, a pen-up state is triggered. While characters are written, the trajectory of the pen tip on the tablet is recorded; it is generally described by the tip's coordinates on the tablet and the pen-down/pen-up states, and pen-up states separate the trajectory into individual strokes. This step mainly collects signals such as the electromagnetic pen's position, angle, speed, and pressure.
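A minimal sketch (not from the patent) of the pen-down/pen-up logic just described: a hypothetical stream of pen samples is split into strokes using a hover-distance threshold. The sample format and threshold value are assumptions for illustration.

```python
# Sketch of pen-down / pen-up detection from hover distance.
# Assumed sample format: (x, y, hover_mm, pressure, t); the threshold
# value is illustrative -- real values are device-specific.

HOVER_THRESHOLD_MM = 2.0

def split_into_strokes(samples):
    """Group consecutive pen-down samples into strokes; a pen-up
    (hover distance above the threshold) closes the current stroke."""
    strokes, current = [], []
    for x, y, hover_mm, pressure, t in samples:
        if hover_mm < HOVER_THRESHOLD_MM:      # pen-down state
            current.append((x, y, pressure, t))
        elif current:                          # pen-up ends the stroke
            strokes.append(current)
            current = []
    if current:
        strokes.append(current)
    return strokes
```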
Step 102: feature extraction is performed on the parameter data from step 101 to determine the corresponding feature information.
The parameter data collected in step 101 are usually voluminous, so statistical analysis and feature extraction can be performed by time period, for example frame by frame (per time slice).
The determined feature information may include: the average movement range during electromagnetic-pen input, hover (in-air) time, average movement speed, average pressure level, average curvature of the input trajectory, and/or the input text content.
During feature extraction, methods such as coordinate extraction, curvature extraction, velocity extraction, and pressure extraction can be used to obtain features such as the pen's average movement range, hover time, average movement speed, average pressure level, and average curvature.
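The following minimal sketch illustrates per-frame (time-slice) extraction of the features named above; the sample format (x, y, pressure, t) and the specific formulas are illustrative assumptions, not prescriptions from the patent.

```python
import math

def frame_features(samples):
    """Feature vector for one frame of (x, y, pressure, t) pen samples:
    movement range, average speed, average pressure, average curvature."""
    xs, ys = [s[0] for s in samples], [s[1] for s in samples]
    move_range = (max(xs) - min(xs)) + (max(ys) - min(ys))

    speeds, curvatures = [], []
    pressures = [s[2] for s in samples]
    for a, b in zip(samples, samples[1:]):
        dt = (b[3] - a[3]) or 1e-6             # guard against zero dt
        speeds.append(math.hypot(b[0] - a[0], b[1] - a[1]) / dt)
    for a, b, c in zip(samples, samples[1:], samples[2:]):
        h1 = math.atan2(b[1] - a[1], b[0] - a[0])
        h2 = math.atan2(c[1] - b[1], c[0] - b[0])
        d = abs(h2 - h1)
        curvatures.append(min(d, 2 * math.pi - d))  # wrap to [0, pi]

    mean = lambda v: sum(v) / len(v) if v else 0.0
    return {"range": move_range, "speed": mean(speeds),
            "pressure": mean(pressures), "curvature": mean(curvatures)}
```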
Step 103: the emotion model matching the feature information determined in step 102 is selected from the model library.
Within a period of input, different emotion types can be defined from feature information such as the pen's average movement range, hover time, average movement speed, and average pressure level. For example: hovering over the tablet without writing can be defined as thinking; moving back and forth, or repeatedly erasing, can be defined as hesitation; steady, fast writing can be defined as a firm writing mood; shaky writing can be defined as nervous writing. The variation of stroke angles during writing can also be used: if most strokes are polyline-like, the mood can be defined as earnest and calm; if most strokes are arc-like and the pressure is relatively high, the mood can be defined as angry writing; fast writing with smooth strokes can be defined as a cheerful and steady mood. These different emotion types are represented by the emotion models kept in the model library.
The system model library stores various emotion models. These models can be established by training on data collected in advance, or they can be set directly.
Several modeling approaches are introduced below:
1. Use an existing modeling technique to build the emotion model, for example HMM (hidden Markov model) pattern recognition. A person's writing process can be regarded as a doubly stochastic process: the writing action itself is an observable time-varying sequence, whose parameters are streams of actions issued by the brain according to the current writing needs and the (unobservable) emotional state at that time. An HMM can reasonably imitate this process and describe the writing-action signals.
Specifically, when building emotion models with HMMs, one HMM is built for each emotion type; each emotion type is described by one HMM, which can be trained by learning from a large number of writing-mood actions. For example, for any emotion model to be established (calm, happy, surprised, angry, sad, afraid), a large number of users are sampled: they input with the electromagnetic pen under the specified emotion type, the sensor collects their input parameter data, and after framing, parameter analysis, and feature extraction, a feature-parameter sequence X1, X2, ..., Xt, ..., XT is obtained for each user, where Xt is the value of some feature at time frame t (for example, the pen's average movement range, hover time, average movement speed, average pressure level, or average curvature) and T is the time span of the observations, i.e., the number of frames. Statistical analysis of the features extracted from the many users' input yields the value ranges of the corresponding features under the specified emotion type, forming the HMM model. One HMM may correspond to a single feature or to a combination of features: with a single feature, each user's feature sequence is one-dimensional; with multiple features, it is multi-dimensional. Which features each emotion model corresponds to can be set as needed.
Through the HMM method, parameter range values for the various emotion types are formulated. The final HMM models, obtained in advance from large-scale user sampling, can be stored in a local database; alternatively, the application can upload the collected behavior parameter data, or the analyzed feature information, to a cloud model library, which increases the statistical sample size and achieves more accurate matching.
2. Use a custom definition tool to define the various emotion types and their corresponding feature information in a personalized way. The defined emotion types and their mapping data (i.e., the features and their value ranges) are stored in the local model library. The emotion types are not limited to the basic states above (calm, happy, surprised, angry, sad, afraid); self-defined emotional states such as "naughty" are also possible. Likewise, the features are not limited to input-action parameters; the semantics of the input text can also be matched.
After modeling is completed in the above manner, each emotion model contains the value ranges of one or more features and corresponds to one emotion type. The feature information determined through steps 101 and 102 is compared against the emotion models in the model library; if it falls within the value range of every feature in some emotion model, it is judged to match that model, and the corresponding emotion type is thereby determined.
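A minimal sketch of this range-based matching follows. The feature names and value ranges in the example library are purely illustrative assumptions; the patent does not specify any numeric ranges.

```python
# Hypothetical model library: emotion type -> {feature: (low, high)}.
MODEL_LIBRARY = {
    "thinking": {"speed": (0.0, 0.5), "hover_time": (2.0, 60.0)},
    "nervous":  {"curvature": (1.0, 3.14), "speed": (3.0, 10.0)},
    "cheerful": {"speed": (2.0, 6.0), "curvature": (0.0, 0.5)},
}

def match_emotion(features):
    """Return the first emotion type whose every feature range contains
    the observed value; None if no model matches."""
    for emotion, ranges in MODEL_LIBRARY.items():
        if all(name in features and lo <= features[name] <= hi
               for name, (lo, hi) in ranges.items()):
            return emotion
    return None
```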
Actions that the model library fails to identify, or identifies with low discrimination, can be stored into the library through manual learning, with corresponding emotion types defined for them.
Step 104: the user's emotional information is determined according to the emotion type corresponding to the selected emotion model.
The emotional information can be represented by text, pictures, or a combination of the two.
As mentioned above, feature extraction can be performed by time period, so the resulting feature information covers the corresponding period, and the emotion type of the emotion model matched in the model library is the emotion type for that period. For example, if feature extraction is done frame by frame, the resulting emotion type is the emotion type for the corresponding frame. Once the emotion type is obtained, it can directly serve as the emotional information for the corresponding period. Alternatively, the emotion type can be saved as a record together with the corresponding time information, and the average emotional information can then be determined from the records within a preset time.
For example, by monitoring the electromagnetic pen, the system stores the parameter data of a period, compares them against the model library, outputs an emotion type, and stores the current writing time and the emotion type as a record. In this way, records at different times of a day can be stored; upper-layer applications can query the emotion type of the most recent writing, or the average emotion type of the day, then look up the mood-expressing text and pictures predefined by the user or the system and display them or send them to authorized devices.
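The following sketch illustrates such record-keeping and the reporting of an "average" emotion over a preset window; taking the most frequent emotion type in the window as the average is an assumption, since the patent does not define the averaging rule.

```python
import time
from collections import Counter

RECORDS = []  # list of (timestamp, emotion_type) records

def store_record(emotion):
    RECORDS.append((time.time(), emotion))

def average_emotion(window_seconds=24 * 3600):
    """Most frequent emotion type within the last window_seconds,
    or None if no records fall inside the window."""
    cutoff = time.time() - window_seconds
    recent = [e for t, e in RECORDS if t >= cutoff]
    return Counter(recent).most_common(1)[0][0] if recent else None
```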
In addition, the user's facial image can be collected by a camera during input, and with face recognition technology the determination of the user's emotional information can be further based on the collected facial image.
At this point, the basic method flow for determining emotional information shown in Fig. 1 ends.
After the corresponding emotional information is determined, it can be displayed or transmitted according to different application needs. For example, the user's emotional information can be sent to an authorized counterpart through communication software; and/or displayed on the interface of the local device; and/or used to update the desktop image or icon of the local device; and/or used to update the default avatar state in communication software; and/or sent to the currently running game software; and/or sent to other players in the current game.
Examples of displaying or transmitting the determined emotional information in different applications are given below.
Example 1:
The flow shown in Fig. 2 transmits mood through a smartphone equipped with an electromagnetic pen. The screen of the intelligent terminal is used as the sensor to detect information such as the electromagnetic pen's position, angle, and pressure on the screen, which is recognized as a mood and transmitted. The steps are as follows:
Step 201: open a communication application such as chat, microblog, or SMS on the smartphone equipped with an electromagnetic pen.
Step 202: input with the electromagnetic pen.
Step 203: the program responsible for electromagnetic-pen input detects the input over a period of time, collecting the input angle, speed, and pressure.
Step 204: the corresponding input pattern library is queried for the matching mood code, which is returned to the communication application.
Step 205: the angle, speed, and pressure of a new round of input are collected, and the above steps are repeated, so the input mood can be provided or updated continuously; the detection time of each round can be user-defined (see the sketch after step 206).
Step 206: the communication application sends the mood to the counterpart together with the application content (for example, a chat program can deliver it in the form of a gift, while SMS and microblog can append the emotional state at the end of the message).
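A minimal sketch of the round-based detection loop of steps 203-205; the sensor/recognizer/application interfaces and the window length are illustrative assumptions, not interfaces defined by the patent.

```python
import time

ROUND_SECONDS = 5.0  # detection time per round; user-definable

def detection_loop(sensor, recognizer, app, rounds=3):
    """Each round: collect input parameters, match a mood code,
    and hand it to the communication application."""
    for _ in range(rounds):
        samples = sensor.collect(ROUND_SECONDS)  # angle, speed, pressure...
        mood = recognizer.match(recognizer.extract(samples))
        if mood is not None:
            app.attach_mood(mood)                # e.g. appended to an SMS
        time.sleep(ROUND_SECONDS)
```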
Example 2:
Mood is transmitted through a smartphone equipped with an electromagnetic pen (or another capacitive-sensing input device); information such as the electromagnetic pen's position, speed, angle, and pressure on the screen is detected and recognized as an emotion type. During input, a reminder box can pop up, or the status bar can remind the user of his or her current mood, so that the user can quickly adjust it.
Example 3:
Mood is transmitted through a smartphone equipped with an electromagnetic pen (or another capacitive-sensing input device); information such as the electromagnetic pen's position, speed, angle, and pressure on the screen is detected, recognized as emotion types, and stored in the user's intelligent terminal system. Statistics can then be computed on recent mood trends (over a day, a week, or a month). The user's desktop image or icon (such as a desktop image representing the mood) or the user's avatar in the counterpart's social software (such as an avatar carrying a mood) can also be updated according to the mood in different time periods, and different responses can be adopted so that the counterpart clearly understands the user's mood.
Example 4:
Mood is transmitted through a smartphone equipped with an electromagnetic pen (or another capacitive-sensing input device); information such as the electromagnetic pen's position, speed, angle, and pressure on the screen is detected and recognized as an emotion type. While playing a game, the terminal interacts with the game software and passes the mood as a parameter to the game software or to the other party, making the game closer to augmented reality and more enjoyable. Alternatively, mood information such as the user's tension at each game level can be collected to help improve the game.
Example 5:
Mood is transmitted through a smartphone equipped with an electromagnetic pen (or another capacitive-sensing input device), which detects characteristic parameters such as the electromagnetic pen's position, speed, angle, and pressure on the screen. Beyond the parameters obtained from the electromagnetic pen, keywords obtained after handwriting recognition, handwriting analysis, or the intelligent terminal's camera combined with face recognition can also be used to expand the types and accuracy of the recognized emotional states.
The present application also provides a terminal that can implement the above method. Fig. 3 is a schematic diagram of the terminal's basic structure. As shown in Fig. 3, the terminal comprises: a sensor, a model library, an emotion recognition unit, and a processing unit.
The sensor collects the user's input parameter data. The emotion recognition unit performs feature extraction on the parameter data collected by the sensor to determine corresponding feature information, and selects from the model library the emotion model matching the feature information. The processing unit determines the user's emotional information according to the emotion type corresponding to the emotion model selected by the emotion recognition unit. The model library stores the emotion models.
Since the amount of collected data is large, for ease of analysis the emotion recognition unit can be further used to statistically analyze and feature-extract the parameter data by time period.
Meanwhile, to establish the emotion models, the terminal, or a system side connected to the terminal over a network, can further comprise an emotion-model establishing unit, whose implementation varies with the modeling approach.
In one approach, the sensor can be further used to collect multiple groups of input parameter data produced while inputting with a specified emotion type. The emotion-model establishing unit performs feature extraction on each group of input parameter data collected by the sensor to determine corresponding feature information, statistically determines from the feature information of all groups the emotion model corresponding to the specified emotion type, and saves it in the model library.
In another approach, the emotion-model establishing unit can custom-define, in the model library, the feature information corresponding to any emotion model and the value range of each feature; or it can custom-define, in the model library, the input keywords corresponding to any emotion model.
The emotion models can be established in any one of the above ways, or through a combination of them.
Similarly to the above method, when determining emotional information, the processing unit can take the emotion type corresponding to the emotion model selected by the emotion recognition unit as the user's real-time emotional information; and/or save the emotion type corresponding to the selected emotion model together with the corresponding time information, and then determine the average emotional information within a preset time according to the emotion types and time information saved within that preset time.
To apply the emotional information, the terminal can further include an output unit. To suit different application needs, the output unit can send the user's emotional information determined by the processing unit to an authorized counterpart through communication software; and/or display it on the interface of the local device; and/or update the desktop image or icon of the local device according to it; and/or update the default avatar state in communication software according to it; and/or send it to the currently running game software; and/or send it to other players in the current game.
To collect information in more forms and determine the emotional information in more ways, the terminal can further include a camera and a face recognition unit. The camera collects the user's facial image during input; the face recognition unit recognizes and analyzes the facial image collected by the camera; and the processing unit is further used to determine the user's emotional information according to the recognition and analysis results of the face recognition unit.
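The unit structure just described can be sketched as follows; all class and method names are illustrative assumptions rather than interfaces from the patent.

```python
class Terminal:
    """Wiring of the units in Fig. 3: sensor -> emotion recognition
    unit -> processing unit -> output unit."""

    def __init__(self, sensor, recognizer, processor, output):
        self.sensor, self.recognizer = sensor, recognizer
        self.processor, self.output = processor, output

    def on_input_period(self):
        samples = self.sensor.collect()            # input parameter data
        features = self.recognizer.extract(samples)
        model = self.recognizer.match(features)    # model from the library
        info = self.processor.determine(model)     # emotional information
        self.output.deliver(info)                  # display locally or send
```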
In summary, the method and terminal for determining emotional information based on user input provided in the present application exploit the characteristics of handwriting input: software detects signals such as the writer's input angle, speed, and pressure, and recognizes the writer's various moods. In applications such as chat, microblog, and SMS, this adds a new kind of information to the transmission, so that along with the original message the recipient also receives the sender's emotional state. In this way, mobile terminals, PCs, and other devices equipped with handwriting input can convey the writer's mood and share it with the other party, giving users a better personal experience.
The above are only preferred embodiments of the present invention and are not intended to limit it. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present invention shall be included in the scope of protection of the present invention.

Claims (15)

1. A method for determining emotional information based on user input, characterized by comprising:
collecting a user's input parameter data with a sensor of the terminal, performing feature extraction on the parameter data to determine corresponding feature information, selecting from a model library the emotion model matching the feature information, and determining the user's emotional information according to the emotion type corresponding to the selected emotion model.
2. The method according to claim 1, characterized in that the parameter data comprise input pressure, input position, input speed, input angle, input acceleration and/or input trajectory.
3. The method according to claim 1, characterized in that the collected parameter data are statistically analyzed and feature-extracted by time period.
4. The method according to claim 1, characterized in that the feature information comprises: the average movement range during user input, hover time, average movement speed, average pressure level, average curvature of the input trajectory, and/or the input text content.
5. The method according to claim 1, characterized in that any emotion model in the model library is established by:
collecting in advance, via the sensor, multiple groups of input parameter data produced while inputting with a specified emotion type, performing feature extraction on each group of input parameter data to determine corresponding feature information, and statistically determining from the feature information of all groups the emotion model corresponding to the specified emotion type;
and/or custom-defining the feature information corresponding to any emotion model and the value range of each feature;
and/or custom-defining the input keywords corresponding to any emotion model.
6. The method according to claim 1, characterized in that the emotional information is represented by text and/or pictures.
7. The method according to claim 1, characterized in that determining the user's emotional information according to the emotion type corresponding to the selected emotion model comprises:
taking the emotion type corresponding to the selected emotion model as the user's real-time emotional information;
and/or saving the emotion type corresponding to the selected emotion model together with the corresponding time information, and determining the average emotional information within a preset time according to the emotion types and corresponding time information saved within the preset time.
8. The method according to claim 1, characterized in that the method further comprises:
sending the user's emotional information to an authorized counterpart through communication software; and/or
displaying the user's emotional information on the interface of the local device; and/or
updating the desktop image or icon of the local device according to the user's emotional information; and/or
updating the default avatar state in communication software according to the user's emotional information; and/or
sending the user's emotional information to the currently running game software; and/or
sending the user's emotional information to other players in the current game.
9. The method according to claim 1, characterized in that, when determining the user's emotional information, the determination is further made according to a facial image of the user collected by a camera.
10. A terminal, characterized by comprising: a sensor, a model library, an emotion recognition unit, and a processing unit;
the sensor is configured to collect a user's input parameter data;
the emotion recognition unit is configured to perform feature extraction on the parameter data collected by the sensor to determine corresponding feature information, and to select from the model library the emotion model matching the feature information;
the processing unit is configured to determine the user's emotional information according to the emotion type corresponding to the emotion model selected by the emotion recognition unit;
the model library is configured to store the emotion models.
11. The terminal according to claim 10, characterized in that the emotion recognition unit is further configured to statistically analyze and feature-extract the parameter data by time period.
12. The terminal according to claim 10, characterized in that the terminal further comprises an emotion-model establishing unit;
the sensor is further configured to collect multiple groups of input parameter data produced while inputting with a specified emotion type; the emotion-model establishing unit is configured to perform feature extraction on each group of input parameter data collected by the sensor to determine corresponding feature information, to statistically determine from the feature information of all groups the emotion model corresponding to the specified emotion type, and to save it in the model library;
and/or the emotion-model establishing unit is configured to custom-define, in the model library, the feature information corresponding to any emotion model and the value range of each feature;
and/or the emotion-model establishing unit is configured to custom-define, in the model library, the input keywords corresponding to any emotion model.
13. The terminal according to claim 10, characterized in that the processing unit is configured to take the emotion type corresponding to the emotion model selected by the emotion recognition unit as the user's real-time emotional information;
and/or the processing unit is configured to save the emotion type corresponding to the emotion model selected by the emotion recognition unit together with the corresponding time information, and to determine the average emotional information within a preset time according to the emotion types and corresponding time information saved within the preset time.
14. The terminal according to claim 10, characterized in that the terminal further comprises an output unit;
the output unit is configured to send the user's emotional information determined by the processing unit to an authorized counterpart through communication software;
and/or to display the user's emotional information determined by the processing unit on the interface of the local device;
and/or to update the desktop image or icon of the local device according to the user's emotional information determined by the processing unit;
and/or to update the default avatar state in communication software according to the user's emotional information determined by the processing unit;
and/or to send the user's emotional information determined by the processing unit to the currently running game software;
and/or to send the user's emotional information determined by the processing unit to other players in the current game.
15. The terminal according to claim 10, characterized in that the terminal further comprises a camera and a face recognition unit;
the camera is configured to collect a facial image of the user during input;
the face recognition unit is configured to recognize and analyze the facial image collected by the camera;
the processing unit is further configured to determine the user's emotional information according to the recognition and analysis results of the face recognition unit.
CN201310011809.XA · priority and filing date 2013-01-11 · Method for determining emotional information based on user input and terminal · Pending · CN103926997A (en)

Priority Applications (1)

CN201310011809.XA · Priority date: 2013-01-11 · Filing date: 2013-01-11 · Method for determining emotional information based on user input and terminal


Publications (1)

CN103926997A · Publication date: 2014-07-16

Family

ID=51145252

Family Applications (1)

CN201310011809.XA · Pending · CN103926997A (en)

Country Status (1)

Country Link
CN (1) CN103926997A (en)


Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104202718A (en) * 2014-08-05 2014-12-10 百度在线网络技术(北京)有限公司 Method and device for providing information for user
CN104618459A (en) * 2015-01-13 2015-05-13 北京中交兴路车联网科技有限公司 Method and system for automatically acquiring data model
CN104753766A (en) * 2015-03-02 2015-07-01 小米科技有限责任公司 Expression sending method and device
CN105549885A (en) * 2015-12-10 2016-05-04 重庆邮电大学 Method and device for recognizing user emotion during screen sliding operation
CN107292221A (en) * 2016-04-01 2017-10-24 北京搜狗科技发展有限公司 A kind of trajectory processing method and apparatus, a kind of device for trajectory processing
CN109643160A (en) * 2016-09-01 2019-04-16 株式会社和冠 Coordinate input processing device, emotion estimating device, emotion deduction system and emotion the presumption construction device of database
US11625110B2 (en) 2016-09-01 2023-04-11 Wacom Co., Ltd. Coordinate input processing apparatus, emotion estimation apparatus, emotion estimation system, and building apparatus for building emotion estimation-oriented database
CN109643160B (en) * 2016-09-01 2023-03-17 株式会社和冠 Coordinate input processing device, emotion estimation device, system, and construction device
US11237647B2 (en) 2016-09-01 2022-02-01 Wacom Co., Ltd. Coordinate input processing apparatus, emotion estimation apparatus, emotion estimation system, and building apparatus for building emotion estimation-oriented database
EP3508948A4 (en) * 2016-09-01 2019-08-21 Wacom Co., Ltd. Coordinate input processing device, emotion estimation device, emotion estimation system, and device for constructing database for emotion estimation
CN106383595A (en) * 2016-10-28 2017-02-08 维沃移动通信有限公司 Method for adjusting display interface of input method and mobile terminal
CN108334522B (en) * 2017-01-20 2021-12-14 阿里巴巴集团控股有限公司 Method for determining customs code, and method and system for determining type information
CN108334522A (en) * 2017-01-20 2018-07-27 阿里巴巴集团控股有限公司 The method for determining customs's coding, and determine the method and system of type information
CN108628504A (en) * 2017-03-17 2018-10-09 上海掌门科技有限公司 A kind of method and apparatus generating displaying content
WO2018166241A1 (en) * 2017-03-17 2018-09-20 上海掌门科技有限公司 Method and device for generating presentation content
CN108665505A (en) * 2017-04-02 2018-10-16 田雪松 A kind of substrate and the data processing method based on substrate
CN107736893A (en) * 2017-09-01 2018-02-27 合肥迅大信息技术有限公司 mental emotion monitoring system based on mobile device
CN110895738A (en) * 2018-09-12 2020-03-20 丰田自动车株式会社 Driving evaluation device, driving evaluation system, driving evaluation method, and storage medium
CN109513205A (en) * 2018-11-05 2019-03-26 努比亚技术有限公司 A kind of user emotion rendering method, terminal and readable storage medium storing program for executing
CN109525725A (en) * 2018-11-21 2019-03-26 三星电子(中国)研发中心 A kind of information processing method and device based on emotional state
CN109683718A (en) * 2019-01-16 2019-04-26 深圳市中视典数字科技有限公司 A kind of interactive display unit and method
CN110069991A (en) * 2019-03-18 2019-07-30 深圳壹账通智能科技有限公司 Feedback information determines method, apparatus, electronic equipment and storage medium
CN110134316A (en) * 2019-04-17 2019-08-16 华为技术有限公司 Model training method, Emotion identification method and relevant apparatus and equipment
CN110134316B (en) * 2019-04-17 2021-12-24 华为技术有限公司 Model training method, emotion recognition method, and related device and equipment
CN110755847B (en) * 2019-10-30 2021-03-16 腾讯科技(深圳)有限公司 Virtual operation object generation method and device, storage medium and electronic device
US11380037B2 (en) 2019-10-30 2022-07-05 Tencent Technology (Shenzhen) Company Limited Method and apparatus for generating virtual operating object, storage medium, and electronic device
CN110755847A (en) * 2019-10-30 2020-02-07 腾讯科技(深圳)有限公司 Virtual operation object generation method and device, storage medium and electronic device
CN114816036A (en) * 2021-01-19 2022-07-29 北京搜狗科技发展有限公司 Emotion processing method, device and medium

Similar Documents

Publication Publication Date Title
CN103926997A (en) Method for determining emotional information based on user input and terminal
US11663784B2 (en) Content creation in augmented reality environment
CN102789313B (en) User interaction system and method
EP3000026B1 (en) Attributing user action based on biometric identity
CN103226388B (en) A kind of handwriting sckeme based on Kinect
CN106462598A (en) Information processing device, information processing method, and program
US20180253163A1 (en) Change of active user of a stylus pen with a multi-user interactive display
CN105960626A (en) Grip detection
CN104898825A (en) Electronic device and method for outputting feedback
CN110084056A (en) Privacy information is shown on personal device
AU2014318661A1 (en) Simultaneous hover and touch interface
CN104571482A (en) Digital device control method based on somatosensory recognition
CN102868830A (en) Switching control method and device of mobile terminal themes
CN112241715A (en) Model training method, expression recognition method, device, equipment and storage medium
CN105393200A (en) Interference data acquisition method and device
WO2015084686A1 (en) Crane gesture
CN105659202A (en) Detecting primary hover point for multi-hover point device
CN111798259A (en) Application recommendation method and device, storage medium and electronic equipment
CN103838444A (en) Input method and input equipment
CN107292221B (en) Track processing method and device and track processing device
CN106446912A (en) Media processing method and media processing device
CN103593052A (en) Gesture capture method based on Kinect and OpenNI
DE112018007850B4 (en) VOICE RECOGNITION SYSTEM AND OPERATING METHOD OF A VOICE RECOGNITION SYSTEM
CN107992193A (en) Gesture confirmation method, device and electronic equipment
CN108469912A (en) A kind of character input method and system

Legal Events

C06: Publication
PB01: Publication
C10: Entry into substantive examination
SE01: Entry into force of request for substantive examination
RJ01: Rejection of invention patent application after publication (application publication date: 2014-07-16)