CN107037890A - Emoticon processing method and device, computer device, and computer-readable storage medium - Google Patents
Emoticon processing method and device, computer device, and computer-readable storage medium
- Publication number
- CN107037890A CN107037890A CN201710292245.XA CN201710292245A CN107037890A CN 107037890 A CN107037890 A CN 107037890A CN 201710292245 A CN201710292245 A CN 201710292245A CN 107037890 A CN107037890 A CN 107037890A
- Authority
- CN
- China
- Prior art keywords
- expression
- emoticon
- target
- current
- user
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/02—Input arrangements using manually operated switches, e.g. using keyboards or dials
- G06F3/023—Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
- G06F3/0233—Character input methods
- G06F3/0237—Character input methods using prediction or retrieval techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/02—Input arrangements using manually operated switches, e.g. using keyboards or dials
- G06F3/023—Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
- G06F3/0233—Character input methods
- G06F3/0236—Character input methods using selection techniques to select from displayed items
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/20—Natural language analysis
- G06F40/279—Recognition of textual entities
- G06F40/284—Lexical analysis, e.g. tokenisation or collocates
Abstract
The present invention provides an emoticon processing method and device, a computer device, and a computer-readable storage medium. The method includes: receiving a target emotion feature of a target emoticon input by a user; and generating, with a pre-acquired expression object, the target emoticon that identifies the target emotion feature. According to the technical solution of the present invention, target emoticons corresponding to the various target emotion features a user requires can be generated on demand, thereby effectively enriching the expressive range of emoticons.
Description
【Technical field】
The present invention relates to the field of computer application technology, and in particular to an emoticon processing method and device, a computer device, and a computer-readable storage medium.
【Background technology】
With the rapid development of science and technology and the popularization of intelligent terminals, users can communicate and exchange with friends in various ways, such as forums and instant messaging.
In the prior art, users typically rely on an input method to enter information when communicating on an intelligent terminal. Existing input methods support not only text input but also the input of various expressions, such as emoji, kaomoji, and animated emoticons. At present, more and more users like to use emoticons when chatting, whether on forums or over instant messaging. Using emoticons saves the trouble of typing, expresses the intended information clearly, and can also lighten the mood of a conversation and improve communication efficiency. Many input methods come with a large set of built-in emoticons, and users can also download emoticons from the network to use as needed.
As can be seen from the above, emoticons in the prior art are all preset, either in the input method or on the network. Every copy is identical: whichever user downloads an emoticon uses exactly the same symbol, which makes the expressive range of emoticons excessively rigid.
【Summary of the invention】
The present invention provides an emoticon processing method and device, a computer device, and a computer-readable storage medium, for enriching the expressive range of emoticons.
The present invention provides an emoticon processing method, the method including:
receiving a target emotion feature of a target emoticon input by a user; and
generating, with a pre-acquired expression object, the target emoticon that identifies the target emotion feature.
Further optionally, in the method as described above, generating, with the pre-acquired expression object, the target emoticon that identifies the target emotion feature specifically includes:
triggering the expression object with the target emotion feature, so that the expression object presents the target emoticon corresponding to the target emotion feature;
wherein the target emoticon carries text identifying the target emotion feature.
Further optionally, in the method as described above, before generating, with the pre-acquired expression object, the target emoticon that identifies the target emotion feature, the method further includes:
receiving a selection request for the expression object input by the user;
acquiring, from an expression object library, the expression object corresponding to the selection request and the attribute parameters of the expression object; and
displaying the expression object on a growth interface of the expression object.
Further optionally, in the method as described above, after acquiring, from the expression object library, the expression object corresponding to the selection request, the method further includes:
receiving a nursing instruction input by the user; and
adjusting the lateral size and longitudinal size of the expression object according to the nursing instruction and a growth condition in the attribute parameters.
Further optionally, in the method as described above, after acquiring, from the expression object library, the expression object corresponding to the selection request, the method further includes:
receiving a training instruction input by the user; and
adjusting the lateral size of the expression object according to the training instruction and the growth condition in the attribute parameters.
Further optionally, in the method as described above, after acquiring, from the expression object library, the expression object corresponding to the selection request, the method further includes:
acquiring the current emotion feature of the expression object according to the correspondence between nursing states and emotion features in the attribute parameters and the current nursing state, and/or the correspondence between training states and emotion features in the attribute parameters and the current training state; and
adjusting, according to the current emotion feature of the expression object, the display interface of an input method themed on the expression object, so that the emotional state presented by the display interface of the input method is consistent with the emotional state identified by the current emotion feature of the expression object.
Further optionally, in the method as described above, after acquiring the current emotion feature of the expression object according to the correspondence between nursing states and emotion features in the attribute parameters and the current nursing state, and/or the correspondence between training states and emotion features in the attribute parameters and the current training state, the method further includes:
generating, with the expression object and according to its current emotion feature, a current emoticon that identifies the current emotion feature; and
sharing the current emoticon.
Further optionally, in the method as described above, after generating, with the pre-acquired expression object, the target emoticon that identifies the target emotion feature, the method further includes:
sending the target emoticon to a friend.
The present invention provides an emoticon processing device, the device including:
a receiving module, configured to receive a target emotion feature of a target emoticon input by a user; and
an emoticon generation module, configured to generate, with a pre-acquired expression object, the target emoticon that identifies the target emotion feature.
Further optionally, in the device as described above, the emoticon generation module is specifically configured to:
trigger the expression object with the target emotion feature, so that the expression object presents the target emoticon corresponding to the target emotion feature;
wherein the target emoticon carries text identifying the target emotion feature.
Further optionally, in the device as described above, the device further includes an acquisition module and a display module;
the receiving module is further configured to receive a selection request for the expression object input by the user;
the acquisition module is configured to acquire, from an expression object library, the expression object corresponding to the selection request and the attribute parameters of the expression object; and
the display module is configured to display the expression object on a growth interface of the expression object.
Further optionally, in the device as described above, the device further includes an adjusting module;
the receiving module is further configured to receive a nursing instruction input by the user; and
the adjusting module is configured to adjust the lateral size and longitudinal size of the expression object according to the nursing instruction and a growth condition in the attribute parameters.
Further optionally, in the device as described above, the receiving module is further configured to receive a training instruction input by the user; and
the adjusting module is further configured to adjust the lateral size of the expression object according to the training instruction and the growth condition in the attribute parameters.
Further optionally, in the device as described above, the acquisition module is further configured to acquire the current emotion feature of the expression object according to the correspondence between nursing states and emotion features in the attribute parameters and the current nursing state, and/or the correspondence between training states and emotion features in the attribute parameters and the current training state; and
the adjusting module is further configured to adjust, according to the current emotion feature of the expression object, the display interface of an input method themed on the expression object, so that the emotional state presented by the display interface of the input method is consistent with the emotional state identified by the current emotion feature of the expression object.
Further optionally, in the device as described above, the device further includes a sharing module;
the emoticon generation module is further configured to generate, with the expression object and according to its current emotion feature, a current emoticon that identifies the current emotion feature; and
the sharing module is configured to share the current emoticon.
Further optionally, in the device as described above, the device further includes:
a sending module, configured to send the target emoticon to a friend.
The present invention further provides a computer device, the device including:
one or more processors; and
a memory for storing one or more programs,
wherein, when the one or more programs are executed by the one or more processors, the one or more processors implement the emoticon processing method as described above.
The present invention further provides a computer-readable medium on which a computer program is stored, the program implementing the emoticon processing method as described above when executed by a processor.
The emoticon processing method and device, computer device, and computer-readable storage medium of the present invention receive the target emotion feature of the target emoticon input by a user, and generate, with a pre-acquired expression object, the target emoticon that identifies the target emotion feature. According to this technical solution, target emoticons corresponding to the various target emotion features a user requires can be generated on demand, thereby effectively enriching the expressive range of emoticons.
【Brief description of the drawings】
Fig. 1 is a flowchart of an embodiment of the emoticon processing method of the present invention.
Fig. 2 is a structural diagram of embodiment one of the emoticon processing device of the present invention.
Fig. 3 is a structural diagram of embodiment two of the emoticon processing device of the present invention.
Fig. 4 is a structural diagram of an embodiment of the computer device of the present invention.
Fig. 5 is an exemplary diagram of a computer device provided by the present invention.
【Embodiment】
To make the objectives, technical solutions, and advantages of the present invention clearer, the present invention is described in detail below with reference to the accompanying drawings and specific embodiments.
Fig. 1 is a flowchart of an embodiment of the emoticon processing method of the present invention. As shown in Fig. 1, the emoticon processing method of this embodiment may specifically include the following steps:
100. Receive the target emotion feature of the target emoticon input by the user.
The execution subject of the emoticon processing method of this embodiment is an emoticon processing device. The emoticon processing device can be arranged in an input method, cooperating with the input method to realize personalized emoticon processing for the user. In the emoticon processing method of this embodiment, receiving the target emotion feature of the target emoticon input by the user may specifically be: receiving the target emotion feature of the required target emoticon input by the user through a human-machine interface module; or receiving the target emotion feature of the required target emoticon through speech input, then obtaining the corresponding semantic information through speech recognition, and thereby acquiring the target emotion feature of the target emoticon the user requires.
For example, an emoticon generation button may be provided in the input method. By clicking this button, the user can bring up an emoticon generation interface, which may display multiple emotion features for the user to choose from; the user can then select one of them as the target emotion feature of the target emoticon through a human-machine interface module such as a mouse and/or keyboard. Alternatively, if the user uses a mobile terminal with a touch screen, an emotion feature can be selected directly through the touch screen as the target emotion feature of the target emoticon. Accordingly, the emoticon processing device can receive the target emotion feature of the target emoticon the user requires. Alternatively, if the emoticon generation interface does not display emotion features for the user to choose from, the user can also directly input the target emotion feature of the required target emoticon. The target emotion feature of this embodiment can include emotion features of various levels, such as average mood, fairly good mood, secretly pleased, happy, and very happy; or a little sad, sad, very sad, and in pain. Target emotion features can also include work is easy, work is rather busy, work is exhausting, and so on. Emotion features of multiple kinds and levels can be preset according to actual requirements, and examples are not enumerated one by one here.
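As a minimal sketch of step 100, the following assumes a hypothetical catalogue of preset emotion features, and a simple keyword match standing in for the speech-recognition path; the names, catalogue, and matching strategy are illustrative and not specified by this embodiment.

```python
# Preset emotion features of various levels (an assumed catalogue).
EMOTION_FEATURES = [
    "neutral", "fairly good", "secretly pleased", "happy", "very happy",
    "a little sad", "sad", "very sad", "in pain",
]

def receive_target_emotion_feature(selected=None, speech_text=None):
    """Return the user's target emotion feature from either input channel.

    `selected` models a mouse/keyboard/touch choice (free-form input is
    also allowed); `speech_text` models already-transcribed speech, from
    which the feature is recovered by keyword matching, standing in for
    the semantic analysis this embodiment mentions.
    """
    if selected is not None:
        return selected
    if speech_text is not None:
        # Longest features first, so "a little sad" wins over "sad".
        for feature in sorted(EMOTION_FEATURES, key=len, reverse=True):
            if feature in speech_text:
                return feature
    return None

print(receive_target_emotion_feature(selected="very happy"))
print(receive_target_emotion_feature(speech_text="I feel a little sad today"))
```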
101. Generate, with the pre-acquired expression object, the target emoticon that identifies the target emotion feature.
The expression object of this embodiment is a pre-acquired carrier used to identify various moods. For example, the expression object can be an electronic pet; in this embodiment, the electronic pet expresses the target emotion feature, yielding the target emoticon of the electronic pet under that target emotion feature. The expression object of this embodiment can be used to identify a variety of emotion features according to the user's needs, expressing the target emoticon the user intends. This differs from an emoticon downloaded from the network in the prior art, which is always in a fixed form: if a user downloads a kitten emoticon that says thanks, that kitten can only make the thanking emoticon and cannot produce the personalized emoticons other users may want, so the expressive range of such emoticons is excessively rigid.
In this embodiment, after the target emoticon the user needs is obtained, the user can share it on a social platform such as a post bar or forum, or send it to a friend over instant messaging. A friend in this embodiment refers to a friend account on a social network platform, not a person. After the user obtains the required target emoticon, it can be used when communicating with friends. For example, a usage scenario of the emoticon processing method of this embodiment can be: when chatting with a friend, the user does not want to type and instead wants to use their own personalized expression object to express some target emotion feature. The user inputs the target emotion feature through the emotion feature input window of the expression interface of the emoticon processing device; the device thus obtains the target emotion feature the user desires, generates the target emoticon identifying that feature with the expression object, and can then send the target emoticon to the friend. Since the target emoticon is generated with the expression object the user has acquired, the user's personalized demands are met, the emoticons are richly varied, and the fun of using emoticons is increased.
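The generation step can be sketched as follows, under the assumption that the expression object stores an expression way (a description of the body actions to present) per emotion feature in its attribute parameters; the class and field names are illustrative, not taken from the patent.

```python
class ExpressionObject:
    """A pre-acquired carrier (e.g. an electronic pet) for identifying moods."""

    def __init__(self, name, expression_ways):
        self.name = name
        # attribute parameters: emotion feature -> body actions to present
        self.expression_ways = expression_ways

    def generate_emoticon(self, target_emotion_feature, caption=None):
        """Trigger the object with the target emotion feature; return the
        target emoticon, optionally carrying text identifying the feature."""
        actions = self.expression_ways[target_emotion_feature]
        emoticon = {
            "object": self.name,
            "emotion": target_emotion_feature,
            "actions": actions,
        }
        if caption is not None:
            emoticon["caption"] = caption  # e.g. "so excited, hahaha~~"
        return emoticon

kitten = ExpressionObject("kitten", {
    "very happy": "mouth open laughing, eyes narrowed, dancing with joy",
    "neutral": "straight face, no grin, eyes not narrowed",
})
emoticon = kitten.generate_emoticon("very happy", caption="so excited, hahaha~~")
print(emoticon["actions"])
```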
In the emoticon processing method of this embodiment, the target emotion feature of the target emoticon input by the user is received, and the target emoticon identifying the target emotion feature is generated with the pre-acquired expression object. According to the technical solution of this embodiment, target emoticons corresponding to the various target emotion features a user requires can be generated on demand, thereby effectively enriching the expressive range of emoticons; at the same time, the user's personalized demands on emoticons can be met, and the fun of using emoticons is enhanced.
Further optionally, on the basis of the technical solution of the embodiment shown in Fig. 1, before step 101 of generating, with the pre-acquired expression object, the target emoticon that identifies the target emotion feature, the following steps may also be included:
(a1) receiving a selection request for the expression object input by the user;
(a2) acquiring, from an expression object library, the expression object corresponding to the selection request and the attribute parameters of the expression object; and
(a3) displaying the expression object on a growth interface of the expression object.
The expression object library of this embodiment can include multiple expression objects, for example various cartoon animals the user is interested in, such as a kitten, a puppy, a baby penguin, a piglet, or a bunny, or various cartoon characters the user likes. That is, the expression object of this embodiment can be a seemingly living emoticon expression body, used to express the target emoticons corresponding to the various target emotion features the user requires.
In use, the user can input a selection request for an expression object through a human-machine interface module on the expression object selection interface. The selection request can carry the name of the expression object, such as puppy, kitten, bunny, or baby penguin. If the user knows in advance the identifier of each expression object in the expression object library, the selection request can also directly carry that identifier. Alternatively, the emoticon processing device can directly display the symbols of the candidate expression objects on an expression object selection interface, and the user then selects the symbol of one expression object from among them. In any case, after the emoticon processing device obtains the user's selection request, it acquires the corresponding expression object and its attribute parameters from the expression object library. It should be noted that if the expression object is a chick, duckling, baby penguin, or the like, the expression object at this point can be regarded as a newly adopted egg, which needs time to hatch and must then be fed slowly to grow up. If the expression object is a bunny or another small pet that does not hatch, it can be an expression object of immature size that needs to grow up through nursing. The initially acquired expression object can then be displayed on the growth interface of the input method's expression objects, indicating that the user has adopted it successfully; after feeding it over time, the user can use the expression object to display the various moods the user intends. That is, in the technical solution of this embodiment, a growth interface for the expression object can be added in the input method, always showing the expression object the user has adopted; the expression object can then be triggered with a target emotion feature so that it presents that feature, and the expression object under the target emotion feature is the corresponding target emoticon.
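Steps (a1)-(a3) can be sketched as a lookup in an expression object library; the library contents and parameter names below are assumptions for illustration only.

```python
# Hypothetical expression object library: name -> attribute parameters.
EXPRESSION_OBJECT_LIBRARY = {
    "kitten": {"lateral_size": 10, "longitudinal_size": 10, "hatches": False},
    "baby penguin": {"lateral_size": 8, "longitudinal_size": 12, "hatches": True},
}

def handle_selection_request(name):
    """(a2): return the selected expression object and a copy of its
    attribute parameters, ready to be shown on the growth interface (a3)."""
    if name not in EXPRESSION_OBJECT_LIBRARY:
        raise KeyError(f"{name!r} is not in the expression object library")
    return name, dict(EXPRESSION_OBJECT_LIBRARY[name])

obj, params = handle_selection_request("baby penguin")
# A hatching object starts as a newly adopted egg, as described above.
print(obj, "starts as an egg" if params["hatches"] else "starts as a small pet")
```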
The attribute parameters of the expression object of this embodiment include parameters of the expression object such as its growth condition, the correspondence between each nursing state and an emotion feature, and the correspondence between each training state and an emotion feature. For example, a growth condition can specify that, over an accumulated online preset period T, if the number of feeds (counting both eating and drinking) reaches X1 times but is fewer than X2 times, the lateral and longitudinal growth sizes of the expression object are S1 and Z1; if, over the accumulated online period T, the number of feeds exceeds X2 times, the longitudinal size of the expression object stays unchanged and the lateral growth size increases by S2; and if, over the accumulated online period T, the number of training sessions increases by Y1 times, the lateral size decreases by S3. Training can thus keep the expression object from getting fat: reducing its lateral size makes it look stronger. The above cases are only examples; other growth conditions are possible in practical applications. In short, the growth condition defines that with continual feeding the expression object keeps growing in size, and with continual training the form of the expression object can be changed between fat and thin. In this way, the user can use a personally raised expression object to generate emoticons for various target emotions.
For example, the attribute parameters can also specify that when the number of feeds within the preset period T is between X1 and X2 times, the emotion feature of the expression object is its happiest state; when the number of feeds within the preset period T is fewer than X1 times, the emotion feature of the expression object is a slightly sad state; and when the number of feeds within the preset period T exceeds X2 times, the emotion feature of the expression object is an unhappy state. By the same principle, multiple groups of correspondences between nursing states and emotion features can be set in the attribute parameters. In addition, correspondences between training states and emotion features can also be set in the attribute parameters; these are not enumerated one by one. For example, within the preset period T, if the number of training sessions is between Y1 and Y2 times, the emotion feature of the expression object is an energetic state; if the number of training sessions is fewer than Y1 times, the emotion feature of the expression object is a listless state; and if the number of training sessions exceeds Y2 times, the emotion feature of the expression object is a worn-out state. By the same principle, multiple groups of correspondences between training states and emotion features can be set in the attribute parameters; examples are not enumerated one by one here. Furthermore, the attribute parameters of the expression object can carry the expression ways of a variety of emotion features. For example, for an average mood, the expression object can be controlled to make a straight face, without grinning or narrowing its eyes; for very happy, the expression object can be controlled to laugh with its mouth open, its eyes narrowed in laughter, dancing with joy. That is, the actions of each part of the body under various emotion features can be defined in the attribute parameters of the expression object, so that by controlling the actions of each body part, the expression object presents the corresponding emotion feature. In practical applications, the expression way of a new emotion feature can also be added to the attribute parameters of the expression object, and the expression object can be trained to represent that expression way, so that the expression object can express the new emotion feature and present the corresponding emoticon.
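The correspondences just described can be sketched as threshold checks over the per-period feed and training counts; the threshold values X1, X2, Y1, Y2 and the state names below are illustrative assumptions, since the patent leaves them unspecified.

```python
X1, X2 = 3, 10  # feed-count thresholds per preset period T (assumed values)
Y1, Y2 = 2, 8   # training-count thresholds per preset period T (assumed values)

def emotion_from_nursing(feed_count):
    """Map the current nursing state to an emotion feature."""
    if feed_count < X1:
        return "a little sad"
    if feed_count > X2:
        return "unhappy"       # overfed
    return "happiest"

def emotion_from_training(training_count):
    """Map the current training state to an emotion feature."""
    if training_count < Y1:
        return "listless"
    if training_count > Y2:
        return "worn out"      # overtrained
    return "energetic"

print(emotion_from_nursing(5), emotion_from_training(1))
```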
Optionally, after the expression object is adopted as described above, step 101 of generating, with the pre-acquired expression object, the target emoticon that identifies the target emotion feature can specifically be: triggering the expression object with the target emotion feature, so that the expression object presents the target emoticon corresponding to the target emotion feature. For example, when the expression object is triggered with the target emotion feature, the emoticon processing device can, according to the expression way of that target emotion feature in the attribute parameters, control the expression object to express the mood of the target emotion feature, thereby obtaining the target emoticon.
Further, after the expression object is triggered with the target emotion feature so that it presents the corresponding target emoticon, text identifying the target emotion feature can also be carried in the target emoticon. For example, text identifying the target emotion feature can pop up beside the target emoticon. If the target emotion feature is an average mood, indicating boredom, the identifying text can be "so bored~~"; if the target emotion feature is very happy, indicating excitement and joy, the identifying text can be "so excited, so happy, hahaha~~".
Further, (a2) " obtains expression expression object from expression expression library of object the step of above-described embodiment
After the property parameters of the corresponding expression expression object of selection request and expression expression object ", it can also comprise the following steps:
(b1) the nursing instruction of user's input is received;
(b2) according to feed instruction and property parameters in growth condition, adjustment expression expression object lateral dimension and
Longitudinal size.
The emoticon processing apparatus of this embodiment may control the growth interface of the expression object to display food and drink, such as bread and milk, available for the expression object's growth. When the user clicks one of these foods through the human-machine interface module, the processing apparatus receives the feeding instruction input by the user. It can then not only control the growth interface to display an emoticon of the expression object eating or drinking the food, according to the expression manner defined for feeding in the property parameters, but also adjust the lateral and longitudinal dimensions of the expression object according to the feeding instruction and the growth condition in the property parameters. For example, if the growth condition specifies that every feeding adjusts the dimensions, the lateral and longitudinal dimensions may be adjusted after each feeding. If instead the growth condition specifies that the dimensions are adjusted only once a predetermined number of feedings occur within a preset time period T, then on receiving a feeding instruction the apparatus first checks against the growth condition whether this feeding satisfies the growth requirement, i.e. the requirement for adjusting the size; if the feedings within the current period T reach the predetermined number, the lateral and longitudinal dimensions of the expression object are adjusted.
Further, after step (a2) of the above embodiment, "obtaining, from the expression object library, the expression object corresponding to the selection request and the property parameters of the expression object", the method may further comprise the following steps:
(c1) receiving a training instruction input by the user;
(c2) adjusting the lateral dimension of the expression object according to the training instruction and the growth condition in the property parameters.
The emoticon processing apparatus of this embodiment may control the growth interface of the expression object to display various training items available to the expression object, such as rope skipping, dancing, and running. When the user clicks one of these training items through the human-machine interface module, the processing apparatus receives the training instruction input by the user. It can then not only control the growth interface to display an emoticon of the expression object performing the training item, according to the expression manner defined for that item in the property parameters, but also adjust the lateral dimension of the expression object according to the training instruction and the growth condition in the property parameters. Because training helps with weight loss, only the lateral dimension is adjusted, so that the expression object becomes thinner. For example, if the growth condition specifies that every training adjusts the lateral dimension, the lateral dimension may be adjusted after each training. If instead the growth condition specifies that the lateral dimension is adjusted only once a predetermined number of trainings occur within a preset time period T, then on receiving a training instruction the apparatus first checks against the growth condition whether this training satisfies the requirement for adjusting the lateral dimension; if the trainings within the current period T reach the predetermined number, the lateral dimension of the expression object is adjusted.
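The training branch differs from feeding only in that it shrinks a single dimension. A sketch under the same assumptions (step size and required count are illustrative):

```python
class TrainableExpressionObject:
    """Training only reduces the lateral dimension (the object gets
    thinner); the longitudinal dimension is left untouched."""

    def __init__(self, width=120, trainings_needed=2):
        self.width = width
        self.trainings_needed = trainings_needed
        self._count = 0

    def train(self):
        """Record one training; slim down once the growth condition
        (trainings_needed trainings) is satisfied."""
        self._count += 1
        if self._count >= self.trainings_needed:
            self.width -= 10
            self._count = 0

o = TrainableExpressionObject()
o.train()
o.train()
print(o.width)  # 110
```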
Further, after step (a2) of the above embodiment, "obtaining, from the expression object library, the expression object corresponding to the selection request and the property parameters of the expression object", the method may further comprise the following steps:
(d1) obtaining the current emotional characteristic of the expression object according to the correspondence between feeding states and emotional characteristics in the property parameters and the current feeding state, and/or the correspondence between training states and emotional characteristics in the property parameters and the current training state;
(d2) adjusting, according to the current emotional characteristic of the expression object, the display interface of an input method themed on the expression object, so that the emotional state presented by the display interface of the input method is consistent with the emotional state identified by the current emotional characteristic of the expression object.
In this embodiment, the property parameters of the expression object may define the correspondence between feeding states and emotional characteristics, and between training states and emotional characteristics. For example, as described in the above embodiment, when the number of feedings in the preset time period T is between X1 and X2, the emotional characteristic of the expression object is its happiest state; when the number of feedings in the period T is less than X1, the emotional characteristic is a slightly sad state; when the number of feedings in the period T is greater than X2, the emotional characteristic is an unhappy state; and so on. Thus, from the correspondence between feeding states and emotional characteristics and the current feeding state, the emotional characteristic corresponding to the feeding state can be obtained.
Or, as described in the above embodiment, the property parameters of the expression object may also define that, within the preset time period T, if the number of trainings is between Y1 and Y2, the emotional characteristic of the expression object is an energetic state; if the number of trainings in the period T is less than Y1, the emotional characteristic is a listless state; if the number of trainings in the period T is greater than Y2, the emotional characteristic is an exhausted state; and so on. Thus, from the correspondence between training states and emotional characteristics and the current training state, the emotional characteristic corresponding to the training state can be obtained.
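The two threshold tables above can be sketched directly; the threshold values X1/X2 and Y1/Y2 and the mood names are illustrative assumptions:

```python
def mood_from_feeding(feeds_in_t, x1=2, x2=5):
    """Map the number of feedings in period T to an emotional
    characteristic using the X1/X2 thresholds."""
    if feeds_in_t < x1:
        return "slightly sad"
    if feeds_in_t <= x2:
        return "happiest"
    return "unhappy"            # fed more than X2 times

def mood_from_training(trainings_in_t, y1=1, y2=4):
    """Same mapping for the training state, using Y1/Y2 thresholds."""
    if trainings_in_t < y1:
        return "listless"
    if trainings_in_t <= y2:
        return "energetic"
    return "exhausted"          # trained more than Y2 times

print(mood_from_feeding(3), mood_from_training(0))  # happiest listless
```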
For example, after the user adopts an expression object, the display interface of the input method may be themed on that expression object. The emoticon processing apparatus can then, according to the current emotional characteristic of the expression object, adjust the display interface of the input method themed on the expression object, so that the emotional state presented by the display interface is consistent with the emotional state identified by the current emotional characteristic of the expression object. For example, the display interface of the input method may show a pattern of the expression object as its background, and the background of the theme may also carry text identifying the current emotional characteristic of the expression object. Alternatively, the property parameters may define different colours of the input method's display interface to identify different emotional states of the expression object; in that case, the colour corresponding to the current emotional characteristic is obtained from the property parameters, and the colour of the display interface of the input method themed on the expression object is adjusted to that colour, so that the emotional state presented by the display interface is consistent with the emotional state identified by the current emotional characteristic of the expression object.
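The colour alternative reduces to a lookup; the mood-to-colour mapping below stands in for the colour definitions the patent places in the property parameters and is entirely assumed:

```python
# Assumed colour definitions from the property parameters.
MOOD_COLOURS = {
    "happiest": "#FFD54F",
    "slightly sad": "#90A4AE",
    "unhappy": "#616161",
}

def theme_colour(current_mood, default="#FFFFFF"):
    """Return the input-method skin colour matching the expression
    object's current emotional characteristic."""
    return MOOD_COLOURS.get(current_mood, default)

print(theme_colour("happiest"))  # #FFD54F
```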
In addition, in the above embodiment, the current emotional characteristic of the expression object may be obtained solely from the correspondence between feeding states and emotional characteristics in the property parameters and the current feeding state, or solely from the correspondence between training states and emotional characteristics in the property parameters and the current training state. Alternatively, in this embodiment, both may be considered: the emotional characteristic corresponding to the feeding state is obtained from the feeding-state correspondence and the current feeding state, and the emotional characteristic corresponding to the training state is obtained from the training-state correspondence and the current training state; the two are then combined, taking the better or the worse of the two, or choosing one of them, to obtain a comprehensive emotional characteristic.
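Taking "the better or the worse of the two" requires some ordering over moods; the ranking below is an assumption introduced purely so the comparison can be done mechanically:

```python
# Assumed ranking of mood names from worst (0) to best (3).
MOOD_RANK = {"unhappy": 0, "exhausted": 0, "slightly sad": 1,
             "listless": 1, "energetic": 2, "happiest": 3}

def combined_mood(feeding_mood, training_mood, pick="better"):
    """Fuse the feeding-derived and training-derived moods by taking
    the better or the worse of the two according to MOOD_RANK."""
    chooser = max if pick == "better" else min
    return chooser(feeding_mood, training_mood, key=MOOD_RANK.__getitem__)

print(combined_mood("happiest", "listless"))           # happiest
print(combined_mood("happiest", "listless", "worse"))  # listless
```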
Still further optionally, after step (d1) of the above embodiment, "obtaining the current emotional characteristic of the expression object according to the correspondence between feeding states and emotional characteristics in the property parameters and the current feeding state, and/or the correspondence between training states and emotional characteristics in the property parameters and the current training state", the method may further comprise the following steps:
(e1) generating, according to the current emotional characteristic of the expression object, a current emoticon identifying that characteristic by using the expression object;
(e2) sharing the current emoticon.
Specifically, reference may be made to the manner of generating, in step 102 above, the target emoticon identifying the target emotion feature by using the pre-obtained expression object; in this embodiment, the current emoticon identifying the current emotional characteristic is generated from that characteristic in the same way, which is not repeated here. Finally, the generated current emoticon can be shared to a board such as a personal space or a personal homepage for friends to view; it can of course also be sent directly to a friend.
With the emoticon processing method of the above implementation, the user can adjust the size of the expression object by feeding and training it, satisfying the user's wish to generate emoticons from an expression object of arbitrary size or shape and thereby effectively enriching the ways emoticons can be expressed; it can also meet the user's demand for personalised emoticons, while making the use of emoticons more entertaining.
Fig. 2 is a structural diagram of embodiment one of the emoticon processing apparatus of the present invention. As shown in Fig. 2, the emoticon processing apparatus of this embodiment may specifically include a receiving module 10 and an emoticon generation module 11.
The receiving module 10 is configured to receive the target emotion feature of the target emoticon input by the user;
the emoticon generation module 11 is configured to generate, by using a pre-obtained expression object, the target emoticon identifying the target emotion feature received by the receiving module 10.
The emoticon processing apparatus of this embodiment uses the above modules to implement emoticon processing; its implementation principle and technical effect are the same as those of the related method embodiments above, to whose detailed description reference may be made, and which is not repeated here.
Fig. 3 is a structural diagram of embodiment two of the emoticon processing apparatus of the present invention. As shown in Fig. 3, the emoticon processing apparatus of this embodiment further refines the technical solution of the present invention on the basis of the technical solution of the above embodiment.
In the emoticon processing apparatus of this embodiment, the emoticon generation module 11 is specifically configured to trigger the expression object with the target emotion feature, so that the expression object presents the target emoticon corresponding to the target emotion feature received by the receiving module 10; text identifying the target emotion feature received by the receiving module 10 is carried in the target emoticon.
Still further optionally, as shown in Fig. 3, the emoticon processing apparatus of this embodiment further includes an acquisition module 12 and a display module 13.
The receiving module 10 is further configured to receive a selection request for an expression object input by the user;
the acquisition module 12 is configured to obtain, from the expression object library, the expression object corresponding to the selection request received by the receiving module 10 and the property parameters of the expression object;
the display module 13 is configured to display, on the growth interface of the expression object, the expression object obtained by the acquisition module 12.
Still further optionally, as shown in Fig. 3, the emoticon processing apparatus of this embodiment further includes an adjusting module 14.
The receiving module 10 is further configured to receive a feeding instruction input by the user;
the adjusting module 14 is configured to adjust the lateral and longitudinal dimensions of the expression object according to the feeding instruction received by the receiving module 10 and the growth condition in the property parameters obtained by the acquisition module 12.
Still further optionally, in the emoticon processing apparatus of this embodiment, the receiving module 10 is further configured to receive a training instruction input by the user;
the adjusting module 14 is further configured to adjust the lateral dimension of the expression object according to the training instruction received by the receiving module 10 and the growth condition in the property parameters obtained by the acquisition module 12.
Still further optionally, in the emoticon processing apparatus of this embodiment, the acquisition module 12 is further configured to obtain the current emotional characteristic of the expression object according to the correspondence between feeding states and emotional characteristics in the property parameters and the current feeding state, and/or the correspondence between training states and emotional characteristics in the property parameters and the current training state;
the adjusting module 14 is further configured to adjust, according to the current emotional characteristic of the expression object obtained by the acquisition module 12, the display interface of the input method themed on the expression object, so that the emotional state presented by the display interface of the input method is consistent with the emotional state identified by the current emotional characteristic of the expression object.
Still further optionally, as shown in Fig. 3, the emoticon processing apparatus of this embodiment further includes a sharing module 15.
The emoticon generation module 11 is further configured to generate, according to the current emotional characteristic of the expression object obtained by the acquisition module 12, a current emoticon identifying that characteristic by using the expression object;
the sharing module 15 is configured to share the current emoticon generated by the emoticon generation module 11.
Still further optionally, as shown in Fig. 3, the emoticon processing apparatus of this embodiment further includes a sending module 16 configured to send the target emoticon generated by the emoticon generation module 11 to a friend.
The emoticon processing apparatus of this embodiment uses the above modules to implement emoticon processing; its implementation principle and technical effect are the same as those of the related method embodiments above, to whose detailed description reference may be made, and which is not repeated here.
Fig. 4 is a structural diagram of the computer device embodiment of the present invention. As shown in Fig. 4, the computer device of this embodiment includes one or more processors 30 and a memory 40 for storing one or more programs; when the one or more programs stored in the memory 40 are executed by the one or more processors 30, the one or more processors 30 implement the emoticon processing method of the above embodiments. The embodiment shown in Fig. 4 takes multiple processors 30 as an example.
Fig. 5 is an exemplary diagram of a computer device provided by the present invention, showing a block diagram of an exemplary computer device 12a suitable for implementing embodiments of the invention. The computer device 12a shown in Fig. 5 is only an example and should not impose any limitation on the function or scope of use of the embodiments of the present invention.
As shown in Fig. 5, the computer device 12a takes the form of a general-purpose computing device. Its components may include, but are not limited to: one or more processors 16a, a system memory 28a, and a bus 18a connecting the different system components (including the system memory 28a and the processors 16a).
The bus 18a represents one or more of several classes of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, these architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
The computer device 12a typically comprises a variety of computer-system-readable media. These may be any usable media accessible by the computer device 12a, including volatile and non-volatile media, and removable and non-removable media.
The system memory 28a may include computer-system-readable media in the form of volatile memory, such as random access memory (RAM) 30a and/or cache memory 32a. The computer device 12a may further include other removable/non-removable, volatile/non-volatile computer-system storage media. By way of example only, the storage system 34a may be used to read from and write to non-removable, non-volatile magnetic media (not shown in Fig. 5, commonly called a "hard drive"). Although not shown in Fig. 5, a magnetic disk drive for reading from and writing to a removable non-volatile magnetic disk (such as a "floppy disk") and an optical disc drive for reading from and writing to a removable non-volatile optical disc (such as a CD-ROM, DVD-ROM, or other optical media) may also be provided. In these cases, each drive may be connected to the bus 18a through one or more data media interfaces. The system memory 28a may include at least one program product having a set of (for example, at least one) program modules configured to perform the functions of the above embodiments of Figs. 1-3 of the present invention.
A program/utility 40a having a set of (at least one) program modules 42a may be stored, for example, in the system memory 28a. Such program modules 42a include, but are not limited to, an operating system, one or more application programs, other program modules, and program data; each of these examples, or some combination thereof, may include an implementation of a network environment. The program modules 42a generally perform the functions and/or methods in the above embodiments of Figs. 1-3 of the present invention.
The computer device 12a may also communicate with one or more external devices 14a (such as a keyboard, a pointing device, a display 24a, etc.), with one or more devices that enable a user to interact with the computer device 12a, and/or with any device (such as a network card, a modem, etc.) that enables the computer device 12a to communicate with one or more other computing devices. Such communication can take place through input/output (I/O) interfaces 22a. Moreover, the computer device 12a can communicate with one or more networks (such as a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) through a network adapter 20a. As shown, the network adapter 20a communicates with the other modules of the computer device 12a through the bus 18a. It should be understood that, although not shown in the figure, other hardware and/or software modules may be used in conjunction with the computer device 12a, including but not limited to: microcode, device drivers, redundant processors, external disk drive arrays, RAID systems, tape drives, and data backup storage systems.
The processor 16a executes various functional applications and data processing by running programs stored in the system memory 28a, for example implementing the emoticon processing method of the above embodiments.
The present invention also provides a computer-readable medium on which a computer program is stored; when executed by a processor, the program implements the emoticon processing method of the above embodiments.
The computer-readable medium of this embodiment may include the RAM 30a, and/or the cache memory 32a, and/or the storage system 34a in the system memory 28a of the embodiment shown in Fig. 5 above.
With the development of technology, the propagation route of a computer program is no longer limited to tangible media; a program can also be downloaded directly from a network or obtained in other ways. Therefore, the computer-readable medium of this embodiment may include not only tangible media but also intangible media.
The computer-readable medium of this embodiment may employ any combination of one or more computer-readable media. A computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium. A computer-readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above. More specific examples (a non-exhaustive list) of computer-readable storage media include: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fibre, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above. In this document, a computer-readable storage medium may be any tangible medium that contains or stores a program for use by, or in connection with, an instruction execution system, apparatus, or device.
A computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, carrying computer-readable program code. Such a propagated data signal may take various forms, including, but not limited to, an electromagnetic signal, an optical signal, or any suitable combination of the above. A computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium that can send, propagate, or transmit a program for use by, or in connection with, an instruction execution system, apparatus, or device.
Program code contained on a computer-readable medium may be transmitted over any appropriate medium, including, but not limited to, wireless, wireline, optical cable, RF, etc., or any suitable combination of the above.
Computer program code for carrying out the operations of the present invention may be written in one or more programming languages or a combination thereof, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or it may be connected to an external computer (for example, through the Internet using an Internet service provider).
In the several embodiments provided by the present invention, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative; the division of units is only a division of logical functions, and other divisions are possible in actual implementation.
Units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network elements. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist physically on its own, or two or more units may be integrated into one unit. The above integrated unit may be implemented in the form of hardware or in the form of hardware plus a software functional unit.
The above integrated unit implemented in the form of a software functional unit may be stored in a computer-readable storage medium. The software functional unit is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) or a processor to perform part of the steps of the methods described in the embodiments of the present invention. The foregoing storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, or an optical disc.
The above are merely preferred embodiments of the present invention and are not intended to limit it; any modification, equivalent substitution, improvement, and the like made within the spirit and principles of the present invention shall be included within the scope of protection of the present invention.
Claims (18)
1. A method of processing emoticons, characterised in that the method comprises:
receiving a target emotion feature of a target emoticon input by a user;
generating, by using a pre-obtained expression object, the target emoticon identifying the target emotion feature.
2. The method according to claim 1, characterised in that generating, by using the pre-obtained expression object, the target emoticon identifying the target emotion feature specifically comprises:
triggering the expression object with the target emotion feature, so that the expression object presents the target emoticon corresponding to the target emotion feature;
wherein text identifying the target emotion feature is carried in the target emoticon.
3. The method according to claim 1, characterised in that, before generating, by using the pre-obtained expression object, the target emoticon identifying the target emotion feature, the method further comprises:
receiving a selection request for the expression object input by the user;
obtaining, from an expression object library, the expression object corresponding to the selection request and the property parameters of the expression object;
displaying the expression object on a growth interface of the expression object.
4. The method according to claim 3, characterised in that, after obtaining from the expression object library the expression object corresponding to the selection request, the method further comprises:
receiving a feeding instruction input by the user;
adjusting the lateral dimension and the longitudinal dimension of the expression object according to the feeding instruction and a growth condition in the property parameters.
5. The method according to claim 4, characterised in that, after obtaining from the expression object library the expression object corresponding to the selection request, the method further comprises:
receiving a training instruction input by the user;
adjusting the lateral dimension of the expression object according to the training instruction and the growth condition in the property parameters.
6. The method according to claim 5, wherein after the obtaining, from the expression expression object library, of the expression expression object corresponding to the selection request, the method further comprises:
Obtaining a current emotional characteristic of the expression expression object according to a correspondence between nursing states and emotional characteristics in the property parameters and a current nursing state, and/or a correspondence between training states and emotional characteristics in the property parameters and a current training state;
Adjusting, according to the current emotional characteristic of the expression expression object, a display interface of an input method themed on the expression expression object, so that an emotional state presented by the display interface of the input method is consistent with the emotional state identified by the current emotional characteristic of the expression expression object.
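The state-to-emotion look-up of claim 6 can be pictured as a pair of correspondence tables; the state names, emotion labels, and the rule that nursing state takes precedence are hypothetical choices made for this sketch, not specified by the patent:

```python
# Assumed correspondence tables (property parameters of the object).
NURSING_TO_EMOTION = {"well_fed": "happy", "hungry": "sad"}
TRAINING_TO_EMOTION = {"trained": "energetic", "untrained": "listless"}

def current_emotion(nursing_state=None, training_state=None):
    """Resolve the object's current emotional characteristic from its states."""
    if nursing_state is not None:
        return NURSING_TO_EMOTION[nursing_state]
    if training_state is not None:
        return TRAINING_TO_EMOTION[training_state]
    return "neutral"  # assumed fallback when neither state is known
```

The resulting characteristic would then drive the themed input-method interface so that its presented mood matches the object's.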
7. The method according to claim 6, wherein after the obtaining of the current emotional characteristic of the expression expression object according to the correspondence between nursing states and emotional characteristics in the property parameters and the current nursing state, and/or the correspondence between training states and emotional characteristics in the property parameters and the current training state, the method further comprises:
Generating, with the expression expression object and according to its current emotional characteristic, a current emoticon identifying the current emotional characteristic;
Sharing the current emoticon.
8. The method according to any one of claims 1-7, wherein after the generating, with the pre-obtained expression expression object, of the target emoticon identifying the target emotion feature, the method further comprises:
Sending the target emoticon to a friend.
9. An emoticon processing apparatus, wherein the apparatus comprises:
A receiving module, configured to receive a target emotion feature of a target emoticon input by a user;
An emoticon generation module, configured to generate, with a pre-obtained expression expression object, the target emoticon identifying the target emotion feature.
10. The apparatus according to claim 9, wherein the emoticon generation module is specifically configured to:
Trigger the expression expression object with the target emotion feature, so that the expression expression object displays the target emoticon corresponding to the target emotion feature;
Wherein the target emoticon carries a word identifying the target emotion feature.
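A minimal sketch of the generation step in claim 10, assuming the expression expression object maps emotion features to image assets (the dictionary layout, file names, and fallback key are all hypothetical):

```python
def generate_emoticon(expression_object, emotion_feature):
    """Trigger the expression object with an emotion feature and return
    an emoticon record that carries a word identifying that feature."""
    image = expression_object.get(emotion_feature, expression_object["default"])
    return {"image": image, "label": emotion_feature}

# Hypothetical expression expression object:
obj = {"happy": "happy.png", "default": "neutral.png"}
```

Under these assumptions, `generate_emoticon(obj, "happy")` would pair the matching image with the label word "happy", satisfying the requirement that the emoticon carry text identifying the target emotion feature.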
11. The apparatus according to claim 9, wherein the apparatus further comprises an acquisition module and a display module;
The receiving module is further configured to receive a selection request for the expression expression object input by a user;
The acquisition module is configured to obtain, from an expression expression object library, the expression expression object corresponding to the selection request and the property parameters of the expression expression object;
The display module is configured to display the expression expression object in a growth interface of the expression expression object.
12. The apparatus according to claim 11, wherein the apparatus further comprises an adjusting module;
The receiving module is further configured to receive a nursing instruction input by a user;
The adjusting module is configured to adjust a lateral dimension and a longitudinal dimension of the expression expression object according to the nursing instruction and a growth condition in the property parameters.
13. The apparatus according to claim 12, wherein:
The receiving module is further configured to receive a training instruction input by a user;
The adjusting module is further configured to adjust the lateral dimension of the expression expression object according to the training instruction and the growth condition in the property parameters.
14. The apparatus according to claim 13, wherein:
The acquisition module is further configured to obtain a current emotional characteristic of the expression expression object according to a correspondence between nursing states and emotional characteristics in the property parameters and a current nursing state, and/or a correspondence between training states and emotional characteristics in the property parameters and a current training state;
The adjusting module is further configured to adjust, according to the current emotional characteristic of the expression expression object, a display interface of an input method themed on the expression expression object, so that an emotional state presented by the display interface of the input method is consistent with the emotional state identified by the current emotional characteristic of the expression expression object.
15. The apparatus according to claim 14, wherein the apparatus further comprises a sharing module;
The emoticon generation module is further configured to generate, with the expression expression object and according to its current emotional characteristic, a current emoticon identifying the current emotional characteristic;
The sharing module is configured to share the current emoticon.
16. The apparatus according to any one of claims 9-15, wherein the apparatus further comprises:
A sending module, configured to send the target emoticon to a friend.
17. A computer device, wherein the device comprises:
One or more processors;
A memory for storing one or more programs,
Wherein, when the one or more programs are executed by the one or more processors, the one or more processors implement the method according to any one of claims 1-8.
18. A computer-readable medium on which a computer program is stored, wherein the program, when executed by a processor, implements the method according to any one of claims 1-8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710292245.XA CN107037890B (en) | 2017-04-28 | 2017-04-28 | Processing method and device of emoticons, computer equipment and readable medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107037890A true CN107037890A (en) | 2017-08-11 |
CN107037890B CN107037890B (en) | 2021-08-17 |
Family
ID=59536927
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710292245.XA Active CN107037890B (en) | 2017-04-28 | 2017-04-28 | Processing method and device of emoticons, computer equipment and readable medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107037890B (en) |
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110184727A1 (en) * | 2010-01-25 | 2011-07-28 | Connor Robert A | Prose style morphing |
CN106209587A (en) * | 2016-07-08 | 2016-12-07 | 中国银联股份有限公司 | For presenting equipment and the method for virtual expression in terminal in a personalized manner |
CN106530096A (en) * | 2016-10-08 | 2017-03-22 | 广州阿里巴巴文学信息技术有限公司 | Emotion icon processing method, device and electronic apparatus |
Non-Patent Citations (1)
Title |
---|
Wang Lan: "Emotional behavior selection model for intelligent agents", Journal of Lanzhou University of Technology * |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110176044A (en) * | 2018-06-08 | 2019-08-27 | 腾讯科技(深圳)有限公司 | Information processing method, device, storage medium and computer equipment |
CN110287895A (en) * | 2019-04-17 | 2019-09-27 | 北京阳光易德科技股份有限公司 | A method of emotional measurement is carried out based on convolutional neural networks |
CN110287895B (en) * | 2019-04-17 | 2021-08-06 | 北京阳光易德科技股份有限公司 | Method for measuring emotion based on convolutional neural network |
Also Published As
Publication number | Publication date |
---|---|
CN107037890B (en) | 2021-08-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10169897B1 (en) | Systems and methods for character composition | |
US11452941B2 (en) | Emoji-based communications derived from facial features during game play | |
CN105431813B (en) | It is acted based on biometric identity home subscriber | |
US10486312B2 (en) | Robot, robot control method, and robot system | |
CN109983430A (en) | Determination includes the graphic element in electronic communication | |
CN110286756A (en) | Method for processing video frequency, device, system, terminal device and storage medium | |
CN110109592A (en) | Head portrait creation and editor | |
CN110413841A (en) | Polymorphic exchange method, device, system, electronic equipment and storage medium | |
CN108780389A (en) | Image retrieval for computing device | |
CN107005550A (en) | Related communication model selection | |
CN109324688A (en) | Exchange method and system based on visual human's behavioral standard | |
KR20220039702A (en) | Multimodal model for dynamically responsive virtual characters | |
CN102479291A (en) | Methods and devices for generating and experiencing emotion description, and emotion interactive system | |
CN108876751A (en) | Image processing method, device, storage medium and terminal | |
US20170311861A1 (en) | Mood-conscious interaction device and method | |
CN110019918A (en) | Information displaying method, device, equipment and the storage medium of virtual pet | |
CN107037890A (en) | Processing method and processing device, computer equipment and the computer-readable recording medium of emoticon | |
CN109992187A (en) | A kind of control method, device, equipment and storage medium | |
CN109550256A (en) | Virtual role method of adjustment, device and storage medium | |
CN107219917A (en) | Emoticon generation method and device, computer equipment and computer-readable recording medium | |
CN110781421A (en) | Virtual resource display method and related device | |
CN110177041A (en) | The sending method and device of voice messaging, storage medium, electronic device | |
CN113050859B (en) | Driving method, device and equipment of interaction object and storage medium | |
CN108229640A (en) | The method, apparatus and robot of emotion expression service | |
Bhor et al. | Dynamic emotion recognition and emoji generation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||