CN104598127A - Method and device for inserting emoticon in dialogue interface - Google Patents

Method and device for inserting emoticon in dialogue interface

Info

Publication number
CN104598127A
Authority
CN
China
Prior art keywords
expression
emoticon
user
current
facial
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410857573.6A
Other languages
Chinese (zh)
Other versions
CN104598127B (en)
Inventor
祝云龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201410857573.6A
Publication of CN104598127A
Application granted
Publication of CN104598127B
Legal status: Active (granted)

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/174Facial expression recognition

Abstract

The invention relates to the field of intelligent terminals, and in particular to a method and a device for inserting an emoticon in a dialogue interface. The method comprises the steps of: starting a camera to capture an image of the user's current facial expression when an emoticon input request is obtained; identifying the current expression category to which the captured expression belongs; and determining and preferentially displaying the preset emoticons corresponding to that category. The method and the device automatically recognize the user's current expression and preferentially display the corresponding preset emoticons, narrowing the range of emoticons to choose from. This makes it convenient for the user to select the emoticon to be input, simplifies the selection flow for inserting an emoticon during a message conversation, gives the user a better application experience, and makes the intelligent terminal more fashionable and personalized.

Description

Method and device for inserting an expression in a dialogue interface
Technical field
The present invention relates to the field of intelligent terminals, and in particular to a method and a device for inserting an expression in a dialogue interface.
Background art
With the development of mobile communication technology, people increasingly use terminal devices to communicate in writing by means such as SMS (Short Message Service) and E-Mail (Electronic Mail), and often insert emoticons into the text; a single expression can say more than a thousand words. In frequent exchanges, users more and more like to convey mood and meaning with emoticons. However, a current terminal device cannot judge, when the emoticon function is invoked, which emoticon the user wants to insert; the user has to search through several expression libraries via selection menus to find the desired one, which makes this otherwise easy way of communicating cumbersome.
For users who like to express themselves with emoticons, the selection method of the prior art is too complicated, inconvenient and inflexible, and does not reflect the personalization of a terminal device offering this function. Existing emoticons are also not rich enough in content. The emoticon itself is a highlight of written communication on terminal devices, but the inconvenience of using it inevitably weakens the appeal of this function to consumers.
Summary of the invention
The present invention provides a method and a device for inserting an expression in a dialogue interface, which automatically recognize the user's current expression and preferentially display the preset emoticons corresponding to it, thereby narrowing the range of emoticons to choose from and simplifying the selection flow for inserting an emoticon during a message conversation.
In a first aspect, the invention provides a method for inserting an expression in a dialogue interface, comprising:
when an expression input request is obtained, starting a camera and capturing an image of the user's current facial expression;
identifying the current expression category to which said user facial expression image belongs;
determining and preferentially displaying the preset emoticons corresponding to said current expression category.
In a second aspect, the invention provides a device for inserting an expression in a dialogue interface, comprising:
a capture unit, configured to start a camera and capture an image of the user's current facial expression when an expression input request is obtained;
a recognition unit, configured to identify the current expression category to which said user facial expression image belongs;
a determining unit, configured to determine and preferentially display the preset emoticons corresponding to said current expression category.
The invention thus provides a method and a device for inserting an expression in a dialogue interface that, when an expression input request is obtained, start a camera and capture an image of the user's current facial expression, identify the current expression category to which said user facial expression image belongs, and determine and preferentially display the corresponding preset emoticons. The invention automatically recognizes the user's current expression and preferentially displays the corresponding preset emoticons, narrowing the range of emoticons to choose from, making it easy for the user to select the emoticon to be input, simplifying the selection flow for inserting an emoticon during a message conversation, giving the user a better application experience, and making the intelligent terminal more fashionable and personalized.
Brief description of the drawings
Fig. 1 is a flowchart of the method for inserting an expression in a dialogue interface provided by the first embodiment of the invention;
Fig. 2 is a flowchart of the method for inserting an expression in a dialogue interface provided by the second embodiment of the invention;
Fig. 3 is a flowchart of the method for inserting an expression in a dialogue interface provided by the third embodiment of the invention;
Fig. 4 is a flowchart of the method for inserting an expression in a dialogue interface provided by the fourth embodiment of the invention;
Fig. 5 is a structural diagram of the device for inserting an expression in a dialogue interface provided by the fifth embodiment of the invention.
Detailed description of the embodiments
The present invention is described in further detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described herein only explain the invention and do not limit it. It should also be noted that, for ease of description, the drawings show only the parts relevant to the invention rather than the full content.
First embodiment
Fig. 1 is a flowchart of the method for inserting an expression in a dialogue interface provided by the first embodiment of the invention. Referring to Fig. 1, the method comprises the following steps:
Step S110: when an expression input request is obtained, start the camera and capture an image of the user's current facial expression.
During a message conversation on an intelligent terminal, a user who wants to input an emoticon only needs to tap the expression library button. When the intelligent terminal detects that the expression library button has been selected, i.e. detects an expression input request, it triggers the camera (in particular the front-facing camera) to start and capture the light reflected from the user's face, and the photosensitive element (a CCD or CMOS sensor) converts the collected light signal into an electrical signal. After amplification, the A/D conversion module converts the electrical signal into a digital image signal containing the user's facial features and facial expression information and sends it to the CPU of the intelligent terminal for the subsequent steps.
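Purely as an illustration of step S110 on a general-purpose computing device, the sketch below captures one frame with OpenCV when an expression input request has been detected; the camera index and the calling convention are assumptions, not part of the invention.

```python
# Illustrative sketch of step S110 (not the patented implementation):
# grab one still frame from the front-facing camera when an expression
# input request is detected. Camera index 1 is an assumption.
import cv2

def capture_user_expression(front_camera_index: int = 1):
    """Open the front camera, capture a single BGR frame, and return it."""
    cap = cv2.VideoCapture(front_camera_index)
    if not cap.isOpened():
        raise RuntimeError("front camera could not be started")
    ok, frame = cap.read()      # one still frame is enough for expression recognition
    cap.release()
    if not ok:
        raise RuntimeError("failed to capture the user's expression image")
    return frame

# Called, for example, when the expression library button is tapped:
# user_expression_image = capture_user_expression()
```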
Step S120: identify the current expression category to which said user facial expression image belongs.
The digital image signal corresponding to the current user facial expression image is obtained; the part of the signal characterizing the user's facial features is subjected to geometric normalization and gray-scale normalization, followed by facial feature extraction and feature matching, to determine the current expression category corresponding to said user facial expression image.
Step S130: determine and preferentially display the preset emoticons corresponding to said current expression category.
Said preset emoticons are stored in advance in an expression library; they are identifiers, composed of characters and symbols or of graphics, that express the mood of their author. The preset emoticons are categorized so that each corresponds to the expression category it represents; once the expression category has been determined, the preset emoticons corresponding to that category can be determined and preferentially displayed.
With the method for inserting an expression in a dialogue interface provided by this embodiment, when an expression input request is obtained the camera is started and an image of the user's current facial expression is captured, the current expression category to which said user facial expression image belongs is identified, and the preset emoticons corresponding to that category are determined and preferentially displayed. The method automatically recognizes the user's current expression and preferentially displays the corresponding preset emoticons, narrowing the range of emoticons to choose from, making it easy for the user to select the emoticon to be input, simplifying the selection flow for inserting an emoticon during a message conversation, and giving the user a better application experience.
Second embodiment
Fig. 2 is a flowchart of the method for inserting an expression in a dialogue interface provided by the second embodiment of the invention. This method is based on the first embodiment and further refines the step of identifying the current expression category to which said user facial expression image belongs as: extracting the facial features of the current user facial expression image captured by said camera, and determining from said facial features the current expression category to which the user facial expression image belongs.
Referring to Fig. 2, the method comprises the following steps:
Step S210: obtain the digital image signal corresponding to the current user facial expression image, and process the part of the signal that characterizes the user's facial features.
The digital image signal corresponding to the current user facial expression image is obtained, and geometric normalization and gray-scale normalization are applied to the signal characterizing the user's facial features. Geometric normalization determines a rectangular feature region from the facial feature points and a geometric model and then crops the image to a size of 150 × 100, so that the face is transformed to the same position and size in every image. Gray-scale normalization refers to processing such as illumination compensation, which reduces the influence of illumination variation to some extent and helps improve the recognition rate.
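As a sketch of the normalization just described, the code below uses OpenCV's stock Haar face detector to locate the face; the patent only specifies the 150 × 100 face region plus geometric and gray-scale normalization, so the concrete detector and the histogram equalization are illustrative assumptions.

```python
# Illustrative normalization for step S210: locate the face, crop it,
# resize to 150 x 100 (geometric normalization) and equalize the histogram
# as a simple form of illumination compensation (gray-scale normalization).
import cv2

_face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def normalize_face(image_bgr):
    """Return a 150 x 100 gray face patch, or None if no face is found."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    faces = _face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]                    # take the first detected face
    face = gray[y:y + h, x:x + w]
    face = cv2.resize(face, (100, 150))      # 100 columns x 150 rows, i.e. 150 x 100
    face = cv2.equalizeHist(face)            # illumination compensation
    return face
```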
Step S220: extract said facial features to generate a face image, divide the face image into two symmetrical halves, and select the half-face image with the larger entropy.
The normalized signal characterizing the user's facial features is used to generate a face image, which is cut into two symmetrical half-face images. Changes in the direction and brightness of the light can cause much of the facial information to be lost and leave some areas in shadow; in such cases illumination changes strongly affect the recognition result. However, one half of the face is always noticeably brighter than the other; that half has less shadow and retains more information. Since the face is roughly symmetrical, the information in one half-face image is largely redundant, so recognizing only the brighter half, which preserves more information, eliminates the negative effect of shadow interference and of the information loss in the other half and thus improves the recognition rate. In addition, extracting features from only half of the face greatly reduces the amount of computation.
The entropy of a face image is computed as follows:
H(u) = -Σ_{i=0}^{255} p(i) log p(i)
where H(u) is the entropy of the face image, i is a pixel value, and p(i) is the probability that pixel value i occurs in the image.
The entropies of the left and right half-face images are computed with the above formula and compared, and the half-face image with the larger entropy is selected as the object of feature extraction.
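A minimal sketch of this selection, assuming a NumPy gray-level face image such as the one produced by the normalization above; the entropy follows the formula given above, with a base-2 logarithm chosen for illustration.

```python
# Illustrative sketch of step S220: compute H(u) = -sum p(i) log p(i)
# for each half-face and keep the half with the larger entropy.
import numpy as np

def image_entropy(gray_patch) -> float:
    hist = np.bincount(gray_patch.ravel(), minlength=256).astype(np.float64)
    p = hist / hist.sum()
    p = p[p > 0]                             # skip zero-probability gray levels
    return float(-(p * np.log2(p)).sum())

def brighter_half(face_gray):
    """Split the face into left/right halves and return the half with larger entropy."""
    half = face_gray.shape[1] // 2
    left, right = face_gray[:, :half], face_gray[:, half:]
    return left if image_entropy(left) >= image_entropy(right) else right
```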
Step S230: perform feature extraction on the half-face image with the larger entropy to obtain Gabor features.
Said Gabor features are facial feature values obtained by processing the target image (the half-face image with the larger entropy) with Gabor functions; they have good local characteristics in the time domain together with multi-resolution and scaling capabilities. The half-face image with the larger entropy is passed through a Gabor transform to obtain the Gabor features.
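The sketch below illustrates one possible Gabor feature extraction with OpenCV; the filter sizes, number of orientations and the per-filter statistics are illustrative choices, not values specified by the patent.

```python
# Illustrative Gabor feature extraction for step S230: filter the half-face
# with a small bank of Gabor kernels and keep simple response statistics.
import cv2
import numpy as np

def gabor_features(half_face, kernel_sizes=(7, 11, 15), orientations=8):
    feats = []
    src = half_face.astype(np.float32)
    for ksize in kernel_sizes:
        for k in range(orientations):
            theta = k * np.pi / orientations
            kernel = cv2.getGaborKernel((ksize, ksize), sigma=4.0, theta=theta,
                                        lambd=10.0, gamma=0.5, psi=0)
            response = cv2.filter2D(src, cv2.CV_32F, kernel)
            feats.extend([response.mean(), response.std()])   # per-filter statistics
    return np.array(feats, dtype=np.float32)
```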
Step S240: classify said Gabor features, call the face database for feature comparison, and determine the current expression category corresponding to said user facial expression image.
A cascade classifier based on the AdaBoost algorithm is designed, which performs classification voting against the images in the input face database to judge the expression category, yielding the current expression category corresponding to said user facial expression image.
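As a stand-in for the AdaBoost-based cascade classifier, the sketch below uses scikit-learn's AdaBoostClassifier; the expression class list and the training features and labels (which would come from the face database) are assumptions for illustration.

```python
# Illustrative expression classification for step S240 with an AdaBoost classifier.
from sklearn.ensemble import AdaBoostClassifier

EXPRESSION_CLASSES = ["excited", "happy", "calm", "sad", "crying", "angry", "surprised"]

def train_expression_classifier(train_features, train_labels):
    """train_labels are integer indices into EXPRESSION_CLASSES."""
    clf = AdaBoostClassifier(n_estimators=200)
    clf.fit(train_features, train_labels)
    return clf

def classify_expression(clf, gabor_feature_vector) -> str:
    label = clf.predict([gabor_feature_vector])[0]
    return EXPRESSION_CLASSES[int(label)]
```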
With the method for inserting an expression in a dialogue interface provided by this embodiment, the digital image signal corresponding to the current user facial expression image is obtained and the signal characterizing the user's facial features is processed; the facial features are extracted to generate a face image, which is divided into two symmetrical halves, and the half with the larger entropy is selected; feature extraction is performed on that half to obtain Gabor features; and the Gabor features are classified and compared against the face database to determine the current expression category corresponding to said user facial expression image. In this way the expression category to which the user facial expression image belongs is identified quickly.
Third embodiment
Fig. 3 is a flowchart of the method for inserting an expression in a dialogue interface provided by the third embodiment of the invention. This method is based on the first embodiment and further refines the step of determining and preferentially displaying the preset emoticons corresponding to said current expression category as: obtaining the emoticons corresponding to said current expression category; and, when said emoticons are not at the front of the expression selection page, adjusting their position so that they are displayed preferentially.
Furthermore, before the preset emoticons corresponding to said current expression category are determined and preferentially displayed, the following step is added: classifying said preset emoticons according to the expressions they convey.
Referring to Fig. 3, the method comprises the following steps:
Step S310: classify said preset emoticons according to the expressions they convey.
The preset emoticons in the expression library are divided, according to the expressions they convey, into several expression classes such as excited, happy, calm, sad, crying, angry and surprised.
Step S320: obtain the emoticons corresponding to said current expression category.
After the current expression category to which the user facial expression image belongs has been determined, the expression library is queried, the expression class corresponding to the current expression category is looked up, and the emoticons in that class are obtained.
Step S330: judge whether said emoticons are the emoticons preferentially displayed on the expression selection page; if not, perform step S340; if so, perform step S350.
The expression selection page is the page in the message conversation interface that displays the emoticons of the expression library; the number of emoticons shown per page can be configured through preset display conditions. By default, the display condition places the most frequently used emoticons on the first page of the expression selection page.
After the emoticons corresponding to said current expression category have been obtained, their position on the expression selection page is queried, and the query result determines whether the expression selection page needs to be updated.
Step S340: adjust the position of said emoticons so that they are displayed preferentially, then continue with step S350.
When an emoticon corresponding to the current expression category is not one of the emoticons preferentially displayed on the expression selection page, judge whether the emoticon immediately before it also corresponds to the current expression category; if not, swap the two so that the emoticon moves forward by one position, and continue comparing until the emoticon corresponding to the current expression category is among those preferentially displayed on the expression selection page. If several emoticons correspond to the current expression category, this step is repeated until all of them are among the emoticons preferentially displayed on the expression selection page.
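The swap-forward procedure of steps S330 and S340 amounts to a stable reordering that moves the matching emoticons to the front of the selection page; the sketch below expresses that reordering, with the emoticon records and their "category" field assumed for illustration.

```python
# Illustrative sketch of steps S330-S340: bring the emoticons whose class
# matches the recognized expression to the front of the selection page,
# keeping the original order inside each group.
def prioritize_emoticons(emoticon_page, current_class):
    matching = [e for e in emoticon_page if e["category"] == current_class]
    others = [e for e in emoticon_page if e["category"] != current_class]
    return matching + others

# Example: page = prioritize_emoticons(page, "happy")  # "happy" emoticons show first
```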
Step S350: display the expression selection page in the dialogue interface.
If the emoticons corresponding to the current expression category are already the preferentially displayed emoticons of the expression selection page, the expression selection page is displayed in the dialogue interface directly; otherwise, the positions of the emoticons corresponding to the current expression category are adjusted as described in step S340, the expression selection page is updated, and the updated page is displayed in the dialogue interface.
In the method for inserting an expression in a dialogue interface provided by this embodiment, the preset emoticons are classified in advance; after the current expression category to which the user facial expression image belongs has been determined, the corresponding emoticons are obtained, it is judged whether they are the preferentially displayed emoticons of the expression selection page, the expression selection page is updated according to the result, and the page is displayed in the dialogue interface. In this way the emoticons corresponding to the current expression category are preferentially displayed on the expression selection page, making it easy for the user to select an emoticon that matches his or her real current expression.
Fourth embodiment
Fig. 4 is a flowchart of the method for inserting an expression in a dialogue interface provided by the fourth embodiment of the invention. This method is based on the first embodiment and adds the following step: storing said user facial expression image in the expression library to which the preset emoticons belong, and preferentially displaying said user facial expression image.
Referring to Fig. 4, the method comprises the following steps:
Step S410: when an expression input request is obtained, start the camera and capture an image of the user's current facial expression.
When the intelligent terminal detects an expression input request, it triggers the camera (in particular the front-facing camera) to start, and the photosensitive element (a CCD or CMOS sensor) converts the collected light signal into an electrical signal. After amplification, the A/D conversion module converts the electrical signal into a digital image signal containing the user's facial features and facial expression information.
Step S420: store said user facial expression image in the expression library to which the preset emoticons belong.
The obtained digital image signal containing the user's facial features and facial expression information is stored in the expression library to which the preset emoticons belong.
Step S430: judge whether said user facial expression image is the emoticon preferentially displayed on the expression selection page; if not, perform step S440; if so, perform step S450.
Judge whether said user facial expression image is the first item on the expression selection page, and decide from the result whether the expression selection page needs to be updated.
Step S440: adjust the position of said user facial expression image so that it is placed first on the expression selection page.
Swap said user facial expression image with the emoticon currently ranked first on the expression selection page, so that the user facial expression image takes the first position of the expression selection page.
Step S450: identify the current expression category to which said user facial expression image belongs.
Once said user facial expression image is the first item on the expression selection page, the operation of identifying the current expression category to which it belongs is carried out: the digital image signal corresponding to the current user facial expression image is obtained and the signal characterizing the user's facial features is processed; the facial features are extracted to generate a face image, and feature extraction is performed on the half-face image with the larger entropy to obtain Gabor features; the Gabor features are classified and compared against the face database to determine the current expression category corresponding to said user facial expression image.
Step S460: obtain the emoticons corresponding to said current expression category.
After the current expression category to which the user facial expression image belongs has been determined, the expression library is queried, the expression class to which the preset emoticons corresponding to the current expression category belong is determined, and the emoticons in that class are obtained.
Step S470: judge whether said emoticons are the emoticons preferentially displayed on the expression selection page; if not, perform step S480; otherwise, perform step S490.
After the emoticons corresponding to said current expression category have been obtained, their position on the expression selection page is queried, and the query result determines whether the expression selection page needs to be updated.
Step S480: adjust the position of said emoticons so that they are displayed preferentially on the expression selection page, with a priority lower than the user facial expression image.
The priority of said emoticons is lower than that of said user facial expression image. When an emoticon corresponding to the current expression category is not one of the preferentially displayed emoticons of the expression selection page, first judge whether the item immediately before it is the user facial expression image; if not, judge whether the emoticon immediately before it also corresponds to the current expression category; if not, swap the two so that the emoticon moves forward by one position. Continue judging and swapping in this way until the item immediately before the emoticon corresponding to the current expression category is the user facial expression image.
If several emoticons correspond to the current expression category, then once the item before one of them is the user facial expression image, continue judging whether the other emoticons corresponding to the current expression category are preferentially displayed on the expression selection page, repeating the judge-and-swap steps above, until all emoticons corresponding to the current expression category are preferentially displayed on the expression selection page with a priority lower than the user facial expression image.
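The ordering that results from steps S440 to S480 can equally be described as: the user facial expression image first, then the emoticons of the recognized class, then the remaining emoticons. The sketch below illustrates this ordering, with the record fields again assumed.

```python
# Illustrative sketch of the fourth embodiment's ordering: user image first,
# then matching emoticons, then everything else, each group in original order.
def prioritize_with_user_image(emoticon_page, user_image_item, current_class):
    rest = [e for e in emoticon_page if e is not user_image_item]
    matching = [e for e in rest if e.get("category") == current_class]
    others = [e for e in rest if e.get("category") != current_class]
    return [user_image_item] + matching + others
```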
Step S490: display the expression selection page in the dialogue interface.
If said emoticons are already the preferentially displayed emoticons of the expression selection page, the expression selection page is displayed in the dialogue interface directly; otherwise, the positions of said emoticons are adjusted as described in step S480 so that they are displayed preferentially with a priority lower than the user facial expression image, and the updated expression selection page is then displayed in the dialogue interface.
The method for inserting an expression in a dialogue interface provided by this embodiment adds, after the camera captures the current user facial expression image, the steps of storing said user facial expression image in the expression library to which the preset emoticons belong and displaying it preferentially, and gives the user facial expression image a higher priority than the emoticons corresponding to the current expression category to which it belongs. The user's current expression can thus be captured quickly and stored in the expression library, enriching the material in the library and making it convenient for the user to insert into the dialogue interface either the real current expression or an emoticon corresponding to it.
A preferred variant of this embodiment for inserting an expression in the dialogue interface is as follows:
When the intelligent terminal detects an expression input request, it triggers the camera (in particular the front-facing camera) to start, and the photosensitive element (a CCD or CMOS sensor) converts the collected light signal into an electrical signal. After amplification, the A/D conversion module converts the electrical signal into a digital image signal containing the user's facial features and facial expression information. After this digital image signal has been obtained, the image characterizing the user's current expression is inserted directly into the message conversation and sent to the other participants of the conversation.
In this variant, when an expression input request is obtained, the camera is started to capture the user's current expression and the digital signal characterizing it is sent to the other participants of the conversation, so that the user's real current expression is obtained and sent quickly, adding interest to the message conversation.
Fifth embodiment
Fig. 5 is a structural diagram of the device for inserting an expression in a dialogue interface provided by the fifth embodiment of the invention. Referring to Fig. 5, the device for inserting an expression in a dialogue interface may comprise:
a capture unit 510, configured to start a camera and capture an image of the user's current facial expression when an expression input request is obtained;
a recognition unit 530, configured to identify the current expression category to which said user facial expression image belongs;
a determining unit 550, configured to determine and preferentially display the preset emoticons corresponding to said current expression category.
Furthermore, said recognition unit 530 may specifically be configured to: extract the facial features of the current user facial expression image captured by said camera, and determine from said facial features the current expression category to which the user facial expression image belongs.
Furthermore, said determining unit 550 may specifically be configured to: obtain the emoticons corresponding to said current expression category; and, when said emoticons are not at the front of the expression selection page, adjust their position so that they are displayed preferentially.
Furthermore, said device may also comprise: a classification unit 540, configured to classify said preset emoticons according to the expressions they convey before the preset emoticons corresponding to said current expression category are determined and preferentially displayed.
Furthermore, said device may also comprise: a storage unit 520, configured to store said user facial expression image in the expression library to which the preset emoticons belong and to preferentially display said user facial expression image, after the camera is started and the current user facial expression image is captured when an expression input request is obtained.
The above device for inserting an expression in a dialogue interface can perform the method for inserting an expression in a dialogue interface provided by the embodiments of the invention, and has the functional modules and beneficial effects corresponding to that method.
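Purely as an illustration of how the units of Fig. 5 could be composed in software, the sketch below reuses the hypothetical helpers from the earlier sketches; the patent describes functional modules, not a concrete class layout.

```python
# Illustrative composition of the capture (510), recognition (530) and
# determining (550) units, reusing the sketched helper functions above.
class ExpressionInsertionDevice:
    def __init__(self, classifier, emoticon_library):
        self.classifier = classifier              # trained expression classifier
        self.emoticon_library = emoticon_library  # list of preset emoticon records

    def handle_expression_request(self):
        image = capture_user_expression()                     # capture unit 510
        face = normalize_face(image)
        if face is None:                                      # no face found: leave the page unchanged
            return self.emoticon_library
        features = gabor_features(brighter_half(face))
        current_class = classify_expression(self.classifier, features)     # recognition unit 530
        return prioritize_emoticons(self.emoticon_library, current_class)  # determining unit 550
```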
The sequence numbers of the above embodiments of the invention are for description only and do not indicate the relative merit of the embodiments.
Those of ordinary skill in the art will understand that the modules or steps of the invention described above can be implemented with a general-purpose computing device; they can be concentrated on a single computing device or distributed over a network of several computing devices. Alternatively, they can be implemented as program code executable by a computing device, so that they can be stored in a storage device and executed by the computing device; or they can be made into individual integrated circuit modules, or several of the modules or steps can be made into a single integrated circuit module. The invention is thus not limited to any specific combination of hardware and software.
Each embodiment in this specification is described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and the same or similar parts of the embodiments can be referred to one another.
The above are only preferred embodiments of the invention and are not intended to limit it; for those skilled in the art, the invention may be subject to various modifications and changes. Any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the invention shall fall within the scope of protection of the invention.

Claims (10)

1. A method for inserting an expression in a dialogue interface, characterized by comprising:
when an expression input request is obtained, starting a camera and capturing an image of the user's current facial expression;
identifying the current expression category to which said user facial expression image belongs; and
determining and preferentially displaying the preset emoticons corresponding to said current expression category.
2. The method according to claim 1, characterized in that identifying the current expression category to which said user facial expression image belongs comprises:
extracting the facial features of the current user facial expression image captured by said camera, and determining from said facial features the current expression category to which the user facial expression image belongs.
3. The method according to claim 1, characterized in that determining and preferentially displaying the preset emoticons corresponding to said current expression category comprises:
obtaining the emoticons corresponding to said current expression category; and
when said emoticons are not at the front of the expression selection page, adjusting the position of said emoticons so that they are displayed preferentially.
4. The method according to claim 1, characterized in that, before determining and preferentially displaying the preset emoticons corresponding to said current expression category, the method further comprises:
classifying said preset emoticons according to the expressions they convey.
5. The method according to claim 1, characterized in that, after the camera is started and the current user facial expression image is captured, the method further comprises:
storing said user facial expression image in the expression library to which the preset emoticons belong, and preferentially displaying said user facial expression image.
6. A device for inserting an expression in a dialogue interface, characterized by comprising:
a capture unit, configured to start a camera and capture an image of the user's current facial expression when an expression input request is obtained;
a recognition unit, configured to identify the current expression category to which said user facial expression image belongs; and
a determining unit, configured to determine and preferentially display the preset emoticons corresponding to said current expression category.
7. The device according to claim 6, characterized in that said recognition unit is specifically configured to:
extract the facial features of the current user facial expression image captured by said camera, and determine from said facial features the current expression category to which the user facial expression image belongs.
8. The device according to claim 6, characterized in that said determining unit is specifically configured to:
obtain the emoticons corresponding to said current expression category; and
when said emoticons are not at the front of the expression selection page, adjust the position of said emoticons so that they are displayed preferentially.
9. The device according to claim 6, characterized by further comprising:
a classification unit, configured to classify said preset emoticons according to the expressions they convey before the preset emoticons corresponding to said current expression category are determined and preferentially displayed.
10. The device according to claim 6, characterized by further comprising:
a storage unit, configured to store said user facial expression image in the expression library to which the preset emoticons belong and to preferentially display said user facial expression image, after the camera is started and the current user facial expression image is captured when an expression input request is obtained.
CN201410857573.6A 2014-12-31 2014-12-31 A kind of method and device in dialog interface insertion expression Active CN104598127B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410857573.6A CN104598127B (en) 2014-12-31 2014-12-31 A kind of method and device in dialog interface insertion expression

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410857573.6A CN104598127B (en) 2014-12-31 2014-12-31 A kind of method and device in dialog interface insertion expression

Publications (2)

Publication Number Publication Date
CN104598127A true CN104598127A (en) 2015-05-06
CN104598127B CN104598127B (en) 2018-01-26

Family

ID=53123957

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410857573.6A Active CN104598127B (en) 2014-12-31 2014-12-31 A kind of method and device in dialog interface insertion expression

Country Status (1)

Country Link
CN (1) CN104598127B (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120030629A1 (en) * 2007-05-24 2012-02-02 Yahoo! Inc. Visual browsing system and method
CN103530313A (en) * 2013-07-08 2014-01-22 北京百纳威尔科技有限公司 Searching method and device of application information
CN104063683A (en) * 2014-06-06 2014-09-24 北京搜狗科技发展有限公司 Expression input method and device based on face identification

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105447164A (en) * 2015-12-02 2016-03-30 小天才科技有限公司 Method and apparatus for automatically pushing chat expressions
CN106339103A (en) * 2016-08-15 2017-01-18 珠海市魅族科技有限公司 Image checking method and device
CN106503744A (en) * 2016-10-26 2017-03-15 长沙军鸽软件有限公司 Input expression in chat process carries out the method and device of automatic error-correcting
CN107451560A (en) * 2017-07-31 2017-12-08 广东欧珀移动通信有限公司 User's expression recognition method, device and terminal
CN107634901A (en) * 2017-09-19 2018-01-26 广东小天才科技有限公司 Method for pushing, pusher and the terminal device of session expression
CN107784114A (en) * 2017-11-09 2018-03-09 广东欧珀移动通信有限公司 Recommendation method, apparatus, terminal and the storage medium of facial expression image
CN108038102A (en) * 2017-12-08 2018-05-15 北京小米移动软件有限公司 Recommendation method, apparatus, terminal and the storage medium of facial expression image
WO2020006863A1 (en) * 2018-07-06 2020-01-09 平安科技(深圳)有限公司 Automatic approval comment input method and apparatus, computer device, and storage medium
CN111369645A (en) * 2020-02-28 2020-07-03 北京百度网讯科技有限公司 Expression information display method, device, equipment and medium
CN111369645B (en) * 2020-02-28 2023-12-05 北京百度网讯科技有限公司 Expression information display method, device, equipment and medium

Also Published As

Publication number Publication date
CN104598127B (en) 2018-01-26

Similar Documents

Publication Publication Date Title
CN104598127A (en) Method and device for inserting emoticon in dialogue interface
CN113473182B (en) Video generation method and device, computer equipment and storage medium
US20160283595A1 (en) Image directed search
CN104239535A (en) Method and system for matching pictures with characters, server and terminal
CN105630915A (en) Method and device for classifying and storing pictures in mobile terminals
CN110633669B (en) Mobile terminal face attribute identification method based on deep learning in home environment
US20140250457A1 (en) Video analysis system
KR20160074500A (en) Mobile video search
KR20120001285A (en) Method for searching product classification and providing shopping data based on object recognition, server and system thereof
CN104200249A (en) Automatic clothes matching method, device and system
CN111539960A (en) Image processing method and related device
CN110458796A (en) A kind of image labeling method, device and storage medium
US20160125472A1 (en) Gesture based advertisement profiles for users
CN109151318A (en) A kind of image processing method, device and computer storage medium
CN108228852A (en) The method, apparatus and computer readable storage medium of electron album cover generation
CN111158924A (en) Content sharing method and device, electronic equipment and readable storage medium
CN108596241B (en) Method for quickly classifying user genders based on multidimensional sensing data
US20210150243A1 (en) Efficient image sharing
CN103888423A (en) Information processing method and information processing device
CN112256890A (en) Information display method and device, electronic equipment and storage medium
US20170171462A1 (en) Image Collection Method, Information Push Method and Electronic Device, and Mobile Phone
CN111695008A (en) Message integration method and device
WO2022063189A1 (en) Salient element recognition method and apparatus
KR20150101846A (en) Image classification service system based on a sketch user equipment, service equipment, service method based on sketch and computer readable medium having computer program recorded therefor
CN112232890B (en) Data processing method, device, equipment and storage medium

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: No. 18 Wusha Beach Road, Chang'an Town, Dongguan, Guangdong 523860

Patentee after: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd.

Address before: No. 18 Wusha Beach Road, Chang'an Town, Dongguan, Guangdong 523860

Patentee before: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd.

CP01 Change in the name or title of a patent holder