CN107122054A - A kind of detection method and device of face deflection angle and luffing angle - Google Patents
- Publication number: CN107122054A
- Application number: CN201710289717.6A
- Authority
- CN
- China
- Legal status: Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
Abstract
The invention provides a detection method and device for face deflection angle and pitch angle, belonging to the field of computer technology. The method includes: obtaining the position coordinates of the nose tip point (prenasale), a first specified point and a second specified point in a face image to be processed, where the first specified point and the second specified point are symmetric about the nose tip point; determining the position coordinates of the intersection point of the perpendicular from the nose tip point to the line connecting the first specified point and the second specified point; and determining the face deflection angle in the face image based on the ratio of the distance from the first specified point to the intersection point to the distance from the second specified point to the intersection point. The present invention improves the accuracy of the determined face deflection angle.
Description
Technical field
The present invention relates to the field of computer technology, and in particular to a detection method and device for face deflection angle and pitch angle.
Background technology
With the development of computer and network technologies, smart devices such as smart televisions and smart refrigerators have gradually entered people's lives. While a smart device is in use, after the user rotates his or her face, the device can determine the deflection angle of the face and then perform a corresponding function based on that angle; for example, when the user turns the face to the left by 15 degrees or more, a smart television switches channels.
In the prior art, the method for determining the face deflection angle is as follows. A face image is first acquired, and the nose tip point and the nose saddle point contained in the face are extracted from it; the face deflection angle is then determined based on the position coordinates of these two points. For example, after the face is deflected, the position coordinates of the nose tip point are (xn, yn, zn) and those of the nose saddle point are (xs, ys, zs), and the face deflection angle is computed from the relative positions of these two coordinates.
The nose saddle point is the lowest point of the saddle-shaped depression of the nose at the glabella (the area between the eyebrows), and normally lies between the two eyes. In the prior art it is generally located based on gray values: the point with the minimum gray value between the two eyes is taken as the nose saddle point. However, factors such as lighting, makeup and eyeglasses can change the gray values between the eyes, making the determined position coordinates of the nose saddle point inaccurate, which in turn makes the calculated face deflection angle inaccurate.
Summary of the invention
In order to solve the problems in the prior art, the embodiments of the present invention provide a detection method and device for face deflection angle and pitch angle. The technical scheme is as follows:
In a first aspect, a detection method of face deflection angle is provided, the method including:
obtaining the position coordinates of the nose tip point, a first specified point and a second specified point in a face image to be processed, where the first specified point and the second specified point are symmetric about the nose tip point;
determining the position coordinates of the intersection point of the perpendicular from the nose tip point to the line connecting the first specified point and the second specified point;
and determining the face deflection angle in the face image to be processed, based on the ratio of the distance from the first specified point to the intersection point to the distance from the second specified point to the intersection point.
Optionally, the first specified point and the second specified point represent the left-eye center point and the right-eye center point respectively; or the first specified point and the second specified point represent the left mouth-corner point and the right mouth-corner point respectively.
In this way, the determined position coordinates of the first specified point and the second specified point can be made more accurate.
In a second aspect, a detection method of face deflection angle is provided, the method including:
obtaining the position coordinates of the nose tip point, a first specified point, a second specified point, a third specified point and a fourth specified point in a face image to be processed, where the first specified point and the second specified point are symmetric about the nose tip point, the third specified point and the fourth specified point are symmetric about the nose tip point, the first and second specified points lie above the nose tip point while the third and fourth specified points lie below it, and the first specified point and the third specified point lie on the same side of the nose tip point;
determining the position coordinates of the first intersection point of the perpendicular from the nose tip point to the line connecting the first specified point and the third specified point;
determining the position coordinates of the second intersection point of the perpendicular from the nose tip point to the line connecting the second specified point and the fourth specified point;
and determining the face deflection angle in the face image to be processed, based on the ratio of the distance from the nose tip point to the first intersection point to the distance from the nose tip point to the second intersection point.
In a third aspect, a detection method of face pitch angle is provided, the method including:
obtaining the position coordinates of the nose tip point, a first specified point, a second specified point, a third specified point and a fourth specified point in a face image to be processed, where the first specified point and the second specified point are symmetric about the nose tip point, the third specified point and the fourth specified point are symmetric about the nose tip point, the first and second specified points lie above the nose tip point while the third and fourth specified points lie below it, and the first specified point and the third specified point lie on the same side of the nose tip point;
determining the position coordinates of the first intersection point of the perpendicular from the nose tip point to the line connecting the first specified point and the second specified point;
determining the position coordinates of the second intersection point of the perpendicular from the nose tip point to the line connecting the third specified point and the fourth specified point;
and determining the face pitch angle in the face image to be processed, based on the ratio of the distance from the nose tip point to the first intersection point to the distance from the nose tip point to the second intersection point.
Optionally, the first specified point and the second specified point represent the left-eye center point and the right-eye center point respectively, and the third specified point and the fourth specified point represent the left mouth-corner point and the right mouth-corner point respectively.
In this way, the determined position coordinates of the first specified point and the second specified point can be made more accurate.
In a fourth aspect, a detection device of face deflection angle is provided, the device including:
a first acquisition module, configured to obtain the position coordinates of the nose tip point, a first specified point and a second specified point in a face image to be processed, where the first specified point and the second specified point are symmetric about the nose tip point;
a first determining module, configured to determine the position coordinates of the intersection point of the perpendicular from the nose tip point to the line connecting the first specified point and the second specified point;
and a second determining module, configured to determine the face deflection angle in the face image to be processed, based on the ratio of the distance from the first specified point to the intersection point to the distance from the second specified point to the intersection point.
Optionally, the first specified point and the second specified point represent the left-eye center point and the right-eye center point respectively; or the first specified point and the second specified point represent the left mouth-corner point and the right mouth-corner point respectively.
In a fifth aspect, a detection device of face deflection angle is provided, the device including:
a second acquisition module, configured to obtain the position coordinates of the nose tip point, a first specified point, a second specified point, a third specified point and a fourth specified point in a face image to be processed, where the first specified point and the second specified point are symmetric about the nose tip point, the third specified point and the fourth specified point are symmetric about the nose tip point, the first and second specified points lie above the nose tip point while the third and fourth specified points lie below it, and the first specified point and the third specified point lie on the same side of the nose tip point;
a third determining module, configured to determine the position coordinates of the first intersection point of the perpendicular from the nose tip point to the line connecting the first specified point and the third specified point;
a fourth determining module, configured to determine the position coordinates of the second intersection point of the perpendicular from the nose tip point to the line connecting the second specified point and the fourth specified point;
and a fifth determining module, configured to determine the face deflection angle in the face image to be processed, based on the ratio of the distance from the nose tip point to the first intersection point to the distance from the nose tip point to the second intersection point.
In a sixth aspect, a detection device of face pitch angle is provided, the device including:
a third acquisition module, configured to obtain the position coordinates of the nose tip point, a first specified point, a second specified point, a third specified point and a fourth specified point in a face image to be processed, where the first specified point and the second specified point are symmetric about the nose tip point, the third specified point and the fourth specified point are symmetric about the nose tip point, the first and second specified points lie above the nose tip point while the third and fourth specified points lie below it, and the first specified point and the third specified point lie on the same side of the nose tip point;
a sixth determining module, configured to determine the position coordinates of the first intersection point of the perpendicular from the nose tip point to the line connecting the first specified point and the second specified point;
a seventh determining module, configured to determine the position coordinates of the second intersection point of the perpendicular from the nose tip point to the line connecting the third specified point and the fourth specified point;
and an eighth determining module, configured to determine the face pitch angle in the face image to be processed, based on the ratio of the distance from the nose tip point to the first intersection point to the distance from the nose tip point to the second intersection point.
Optionally, the first specified point and the second specified point represent the left-eye center point and the right-eye center point respectively, and the third specified point and the fourth specified point represent the left mouth-corner point and the right mouth-corner point respectively.
The beneficial effects brought by the technical scheme provided by the embodiments of the present invention are as follows:
Based on the above processing, the terminal determines the face deflection angle according to the position coordinates of the nose tip point and of a first specified point and a second specified point on the face that are symmetric about the nose tip point, or according to the position coordinates of the nose tip point and of a first, second, third and fourth specified point distributed symmetrically about the nose tip point, without having to determine the position coordinates of the nose saddle point, thereby improving the accuracy of the determined face deflection angle.
Brief description of the drawings
Fig. 1 is a flowchart of a detection method of face deflection angle provided by an embodiment of the present invention;
Fig. 2(a) is a schematic diagram of a face provided by an embodiment of the present invention;
Fig. 2(b) is a schematic diagram of a face provided by an embodiment of the present invention;
Fig. 3 is a flowchart of a detection method of face deflection angle provided by an embodiment of the present invention;
Fig. 4 is a schematic diagram of a face provided by an embodiment of the present invention;
Fig. 5 is a flowchart of a detection method of face pitch angle provided by an embodiment of the present invention;
Fig. 6 is a schematic structural diagram of a detection device of face deflection angle provided by an embodiment of the present invention;
Fig. 7 is a schematic structural diagram of a detection device of face deflection angle provided by an embodiment of the present invention;
Fig. 8 is a schematic structural diagram of a detection device of face pitch angle provided by an embodiment of the present invention;
Fig. 9 is a schematic structural diagram of a terminal provided by an embodiment of the present invention.
Detailed description of the embodiments
To make the objects, technical solutions and advantages of the present invention clearer, the embodiments of the present invention are described in further detail below with reference to the accompanying drawings.
An embodiment of the present invention provides a detection method of face deflection angle, and the execution body of the method may be a terminal. The terminal may be a mobile phone, a tablet computer, a smart television, or the like, and may be provided with a processor, a memory, a transceiver, an image acquisition component, and so on. The processor may be used for the processing of the face-deflection-angle detection process; the memory may be used for the data needed in and generated by that process; the transceiver may be used for receiving and sending messages; and the image acquisition component may be used for capturing face images. The terminal may also be provided with input/output devices such as a screen, which may be used to display face images. In this embodiment, the scheme is described in detail taking a smart television as an example of the terminal; other cases are similar and are not repeated here.
As shown in Fig. 1, the processing flow of the method may include the following steps.
Step 101: obtain the position coordinates of the nose tip point, the first specified point and the second specified point in the face image to be processed.
In implementation, when the user wants to control the terminal through face deflection while using it, the user can enable the terminal's face-deflection control function; the terminal then turns on the image acquisition component and captures a face image (hereinafter called the face image to be processed). For example, a user watching television programs on a smart television who wants to switch channels can enable the smart television's function of controlling channel switching by face; the smart television then turns on the image acquisition component and captures a face image. After the terminal obtains the face image to be processed, a convolutional neural network algorithm can be used to determine the position coordinates of the nose tip point, the first specified point and the second specified point contained in the image; each position coordinate can be represented by a two-dimensional coordinate (x, y). For example, after the position of the nose tip point is determined, a rectangular coordinate system can be established with the nose tip point as the origin, so that the position coordinates of the nose tip point are (0, 0).
It should be noted that the method used here of determining the position coordinates of the nose tip point, the first specified point and the second specified point by a convolutional neural network algorithm is the same as in the prior art; other methods, such as an SVM (Support Vector Machine) algorithm, may also be used to determine these position coordinates, which the embodiments of the present invention do not limit.
It should be noted that the first specified point and the second specified point may be any two points on the face that are symmetric about the nose tip point; that is, in the front view of the face, the first specified point and the second specified point are equidistant from the nose tip point and are distributed on the two sides of the nose tip point. For example, the user may calibrate the positions of the first specified point and the second specified point on the face in the front view of the face.
Preferably, the first specified point and the second specified point represent the left-eye center point and the right-eye center point respectively, or the left mouth-corner point and the right mouth-corner point respectively. The left-eye center point and the right-eye center point are symmetrically distributed on the two sides of the nose tip point, and their features are obvious, so the user does not need to calibrate them in the front view of the face: the terminal can identify the left-eye center point and the right-eye center point of the face through a convolutional neural network. Similarly, when the first specified point and the second specified point represent the left and right mouth-corner points, no calibration by the user in the front view is needed, and the terminal can identify the left and right mouth-corner points of the face through a convolutional neural network.
Furthermore, for a frame of face image, because the features of the eye center points and the mouth-corner points are stable and prominent, the terminal can identify them accurately through a convolutional neural network regardless of factors such as lighting differences, makeup, or eyeglasses. Basing detection on the eye center points and mouth-corner points of the face therefore helps improve the accuracy of the detected face deflection angle and pitch angle.
Step 102: determine the position coordinates of the intersection point of the perpendicular from the nose tip point to the line connecting the first specified point and the second specified point.
In implementation, let the position coordinates of the nose tip point be (x1, y1), those of the first specified point be (x2, y2), those of the second specified point be (x3, y3), and those of the intersection point be (x4, y4). Then the distance between the first specified point and the intersection point is √((x2 − x4)² + (y2 − y4)²), the distance between the second specified point and the intersection point is √((x3 − x4)² + (y3 − y4)²), the distance between the nose tip point and the intersection point is √((x1 − x4)² + (y1 − y4)²), the distance between the nose tip point and the first specified point is √((x1 − x2)² + (y1 − y2)²), and the distance between the nose tip point and the second specified point is √((x1 − x3)² + (y1 − y3)²). Since the intersection point is the foot of the perpendicular from the nose tip point on the line connecting the first specified point and the second specified point, the following relations hold:
((x2 − x4)² + (y2 − y4)²) + ((x1 − x4)² + (y1 − y4)²) = (x1 − x2)² + (y1 − y2)²
((x3 − x4)² + (y3 − y4)²) + ((x1 − x4)² + (y1 − y4)²) = (x1 − x3)² + (y1 − y3)²
Because the position coordinates of the nose tip point, the first specified point and the second specified point are all known, the position coordinates (x4, y4) of the intersection point can be obtained from these relations.
For example, as shown in Fig. 2(a), when the first specified point is the left-eye center point and the second specified point is the right-eye center point, the intersection point is point A; as shown in Fig. 2(b), when the first specified point is the left mouth-corner point and the second specified point is the right mouth-corner point, the intersection point is point B.
It should be noted that the above is only one feasible method of determining the position coordinates (x4, y4) of the intersection point; there are various other methods of determining them, which the embodiments of the present invention do not limit.
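As one such alternative, the foot of the perpendicular can be obtained directly by vector projection instead of by solving the distance relations. The sketch below assumes two-dimensional (x, y) landmark coordinates; the function and variable names are illustrative, not taken from the patent.

```python
def foot_of_perpendicular(nose, p1, p2):
    """Project `nose` onto the line through `p1` and `p2`.

    Each argument is an (x, y) tuple; returns the coordinates (x4, y4)
    of the foot of the perpendicular from `nose` to that line.
    """
    x1, y1 = nose
    x2, y2 = p1
    x3, y3 = p2
    # Direction vector of the line p1 -> p2.
    dx, dy = x3 - x2, y3 - y2
    # Scalar projection of (nose - p1) onto the direction vector.
    t = ((x1 - x2) * dx + (y1 - y2) * dy) / (dx * dx + dy * dy)
    return (x2 + t * dx, y2 + t * dy)

# Example: nose tip above the segment joining two specified points
# that lie on the x-axis.
foot = foot_of_perpendicular((0.5, 1.0), (-1.0, 0.0), (2.0, 0.0))
print(foot)  # (0.5, 0.0)
```

The projection form avoids solving the two quadratic relations simultaneously and is numerically stable as long as the two specified points do not coincide.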
Step 103: determine the face deflection angle in the face image to be processed, based on the ratio of the distance from the first specified point to the intersection point to the distance from the second specified point to the intersection point.
In implementation, based on the position coordinates (x4, y4) of the intersection point determined in step 102 and the position coordinates of the first specified point, the distance from the first specified point to the intersection point can be calculated as a1 = √((x2 − x4)² + (y2 − y4)²); based on the position coordinates (x4, y4) of the intersection point and the position coordinates of the second specified point, the distance from the second specified point to the intersection point can be calculated as b1 = √((x3 − x4)² + (y3 − y4)²).
The ratio of the distance from the first specified point to the intersection point to the distance from the second specified point to the intersection point is then calculated as A1 = a1/b1.
Using the ratio of a1 to b1, the face deflection angle can be determined in either of the following two ways.
Mode one: determine the face deflection angle as α = K1 · A1, where A1 = a1/b1 and K1 is a pre-stored first deflection coefficient.
The first deflection coefficient can be preset by a technician and stored in the terminal, for example as 7.8. The method of determining the first deflection coefficient can be as follows: using multiple face images deflected in different directions (for example, deflected to the right or to the left), determine the face deflection angle of each deflected face image manually, and determine A1 for each deflected face image; then calculate the ratio of the face deflection angle of each deflected face image to its corresponding A1, and average the multiple calculated ratios to obtain the first deflection coefficient. The face deflection angle of each deflected face image can also be determined by professional equipment, which is not limited here.
In implementation, after the terminal determines the ratio of a1 to b1, it can obtain the pre-stored first deflection coefficient K1, calculate the product of A1 and K1, and take the product as the face deflection angle. After the terminal determines the face deflection angle, if the angle is greater than the preset reference angle for channel switching, the terminal can switch channels.
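The calibration procedure of mode one can be sketched as follows. This is an illustrative reading in which A1 is taken as the ratio a1/b1 and the labelled sample data are invented for the example; a real calibration would use manually measured angles from deflected face images as described above.

```python
def calibrate_k1(samples):
    """Estimate the first deflection coefficient K1.

    samples: list of (known_angle_degrees, a1_ratio) pairs from
    manually labelled deflected face images. K1 is the average of
    angle / A1 over the samples.
    """
    return sum(angle / a1 for angle, a1 in samples) / len(samples)

def deflection_angle(a1_ratio, k1):
    """Mode one: the face deflection angle is the product K1 * A1."""
    return k1 * a1_ratio

# Invented calibration data in which the angle grows linearly with A1.
k1 = calibrate_k1([(15.0, 2.0), (30.0, 4.0), (45.0, 6.0)])
print(k1)                          # 7.5
print(deflection_angle(3.0, k1))   # 22.5
```

With real data the per-sample ratios would not be identical, and averaging them smooths out the labelling noise.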
Mode two: after the terminal determines the ratio of a1 to b1, it can look up the ratio range to which the ratio belongs and then, from a pre-stored correspondence between ratio ranges and deflection angles, determine the deflection angle corresponding to that ratio range as the face deflection angle. After the terminal determines the face deflection angle, if the angle is greater than the preset reference angle for channel switching, the terminal can switch channels.
A technician can pre-store the correspondence between ranges of the ratio of a1 to b1 and deflection angles. The method of determining this correspondence can be as follows: using multiple face images deflected in different directions (for example, deflected to the right or to the left), determine the face deflection angle of each deflected face image manually, and determine the ratio of a1 to b1 for each deflected face image; then store each ratio of a1 to b1 in correspondence with its deflection angle. In this way, with a large number of ratios of a1 to b1 corresponding to deflection angles, the correspondence between ratio ranges and deflection angles can be obtained.
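Mode two amounts to a table lookup. A minimal sketch follows, with an invented correspondence table (the patent does not disclose concrete ratio ranges or angles):

```python
import bisect

# Each entry: (upper bound of the a1/b1 ratio range, deflection angle
# in degrees). Values are invented for illustration only.
RATIO_TABLE = [
    (0.80, -30.0),
    (0.95, -15.0),
    (1.05, 0.0),
    (1.25, 15.0),
    (float("inf"), 30.0),
]

def lookup_deflection(ratio):
    """Find the ratio range that `ratio` falls into and return its angle."""
    bounds = [upper for upper, _ in RATIO_TABLE]
    idx = bisect.bisect_left(bounds, ratio)
    return RATIO_TABLE[idx][1]

print(lookup_deflection(1.0))   # 0.0
print(lookup_deflection(1.3))   # 30.0
```

Keeping the upper bounds sorted lets the lookup run in O(log n) via binary search, which matters little for a handful of ranges but keeps the table easy to extend.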
It should be noted that determining the face deflection angle based on the reciprocal of the ratio of a1 to b1 is also a simple variation within the embodiments of the present invention.
In addition, the face deflection direction can also be determined based on the ratio of a1 to b1. With the first specified point being the left-eye center point and the second specified point being the right-eye center point, the ratio of a1 to b1 when the face is in the frontal (straight-ahead) state can first be determined and taken as a reference value. If the ratio of a1 to b1 is less than the reference value, it can be determined that the face is deflected to the left; if the ratio is greater than the reference value, it can be determined that the face is deflected to the right; and if the ratio is equal to the reference value, it can be determined that the face is not deflected.
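The direction test just described can be sketched as a comparison against the frontal-face reference value. The tolerance parameter is our own addition, for floating-point robustness; the patent compares the values exactly.

```python
def yaw_direction(ratio, reference, tol=1e-6):
    """Classify the deflection direction from the a1/b1 ratio.

    ratio: current a1/b1; reference: a1/b1 measured while the face
    looks straight ahead. Below the reference means deflected left,
    above it means deflected right, equal means no deflection.
    """
    if abs(ratio - reference) <= tol:
        return "none"
    return "left" if ratio < reference else "right"

print(yaw_direction(0.8, 1.0))  # left
print(yaw_direction(1.2, 1.0))  # right
print(yaw_direction(1.0, 1.0))  # none
```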
In the embodiments of the present invention, the terminal determines the face deflection angle according to the position coordinates of the nose tip point, the first specified point and the second specified point, without having to determine the position coordinates of the nose saddle point, thereby improving the accuracy of the determined face deflection angle.
Another embodiment of the present invention also provides a detection method of face deflection angle, and the execution body of the method may be a terminal.
As shown in Fig. 3, the processing flow of the method may include the following steps.
Step 301: obtain the position coordinates of the nose tip point, the first specified point, the second specified point, the third specified point and the fourth specified point in the face image to be processed.
Here, the first specified point and the second specified point are symmetric about the nose tip point, and the third specified point and the fourth specified point are symmetric about the nose tip point; the first and second specified points lie above the nose tip point while the third and fourth specified points lie below it, and the first specified point and the third specified point lie on the same side of the nose tip point.
In implementation, when the user wants to control the terminal through face deflection while using it, the user can enable the terminal's face-deflection control function; the terminal then turns on the image acquisition component and captures a face image (hereinafter called the face image to be processed). For example, a user watching television programs on a smart television who wants to switch channels can enable the smart television's function of controlling channel switching by face; the smart television then turns on the image acquisition component and captures a face image. After the terminal obtains the face image to be processed, a convolutional neural network algorithm can be used to determine the position coordinates of the nose tip point, the first specified point, the second specified point, the third specified point and the fourth specified point contained in the image; each position coordinate can be represented by a two-dimensional coordinate (x, y).
It should be noted that the method used here of determining these position coordinates by a convolutional neural network algorithm is the same as in the prior art; other methods, such as an SVM algorithm, may also be used, which the embodiments of the present invention do not limit.
Optionally, as shown in Fig. 4, the first specified point and the second specified point represent the left-eye center point and the right-eye center point respectively, and the third specified point and the fourth specified point represent the left mouth-corner point and the right mouth-corner point respectively.
Step 302, determine prenasale to the first specified point and the position coordinates of the first intersection point of the line of the 3rd specified point.
Wherein, the first intersection point for by prenasale and perpendicular to the first specified point and the 3rd specified point line vertical line it is vertical
Foot.
In implementation, let the position coordinates of the nose tip be (x1, y1), those of the first specified point be (x2, y2), those of the third specified point be (x5, y5), and those of the first intersection point be (x4, y4). Then the distance between the first specified point and the first foot of perpendicular is √((x2−x4)² + (y2−y4)²); the distance between the third specified point and the first intersection point is √((x5−x4)² + (y5−y4)²); the distance between the nose tip and the first intersection point is √((x1−x4)² + (y1−y4)²); the distance between the nose tip and the first specified point is √((x1−x2)² + (y1−y2)²); and the distance between the nose tip and the third specified point is √((x1−x5)² + (y1−y5)²). Because the first intersection point is the foot of the perpendicular dropped from the nose tip onto the line connecting the first specified point and the third specified point, the following relational expressions hold:

(x2−x4)² + (y2−y4)² + (x1−x4)² + (y1−y4)² = (x1−x2)² + (y1−y2)²

(x5−x4)² + (y5−y4)² + (x1−x4)² + (y1−y4)² = (x1−x5)² + (y1−y5)²

Since the position coordinates of the nose tip, the first specified point, and the third specified point are all known, the position coordinates (x4, y4) of the first intersection point can be obtained from these relational expressions.
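The relational expressions above determine the foot of the perpendicular; equivalently, it can be computed directly as an orthogonal projection onto the line. The following is a minimal illustrative sketch (not part of the embodiment; the function name is ours), assuming the nose tip and the two specified points are given as (x, y) tuples:

```python
def foot_of_perpendicular(p, a, b):
    """Foot of the perpendicular dropped from point p onto the line through a and b."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    # Parameter t of the orthogonal projection of p onto the line a + t*(b - a).
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    return (ax + t * dx, ay + t * dy)

# Example: nose tip (2, 3), line through (0, 0) and (4, 0) -> foot at (2, 0).
x4, y4 = foot_of_perpendicular((2, 3), (0, 0), (4, 0))
```

The same routine applies to every intersection point used below, since each is defined as the foot of a perpendicular from the nose tip onto a line connecting two specified points.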
Step 303: determine the position coordinates of the second intersection point on the line connecting the second specified point and the fourth specified point.

The second intersection point is the foot of the perpendicular dropped from the nose tip onto the line connecting the second specified point and the fourth specified point.
In implementation, the position coordinates (x6, y6) of the second intersection point can be calculated using the method of step 302. Let the position coordinates of the second specified point be (x3, y3) and those of the fourth specified point be (x7, y7). Because the second intersection point is the foot of the perpendicular dropped from the nose tip onto the line connecting the second specified point and the fourth specified point, the following relational expressions hold:

(x3−x6)² + (y3−y6)² + (x1−x6)² + (y1−y6)² = (x1−x3)² + (y1−y3)²

(x7−x6)² + (y7−y6)² + (x1−x6)² + (y1−y6)² = (x1−x7)² + (y1−y7)²

Since the position coordinates of the nose tip, the second specified point, and the fourth specified point are all known, the position coordinates (x6, y6) of the second intersection point can be obtained from these relational expressions.
It should be noted that the foregoing gives only one feasible method of determining the position coordinates of the first intersection point and the second intersection point; various other methods may also be used, and the embodiment of the present invention is not limited in this respect.
Step 304: determine the face deflection angle in the to-be-processed face image based on the ratio of the distance from the nose tip to the first intersection point to the distance from the nose tip to the second intersection point.
In implementation, based on the position coordinates (x4, y4) of the first intersection point determined in step 302 and the position coordinates (x1, y1) of the nose tip, the distance from the nose tip to the first intersection point can be calculated as a2 = √((x1−x4)² + (y1−y4)²). Based on the position coordinates (x6, y6) of the second intersection point determined in step 303 and the position coordinates (x1, y1) of the nose tip, the distance from the nose tip to the second intersection point can be calculated as b2 = √((x1−x6)² + (y1−y6)²).

The ratio of the distance from the nose tip to the first intersection point to the distance from the nose tip to the second intersection point is then A2 = a2/b2.
Using the ratio of a2 to b2, the face deflection angle can be determined in the following two ways:

Mode one: determine the face deflection angle as α = K2·A2, where A2 = a2/b2 and K2 is a pre-stored second deflection coefficient.
The second deflection coefficient may be preset by a technician and stored in the terminal, for example 10.1. It may be determined as follows: using multiple face images deflected in different directions (for example, deflected to the right or to the left), the face deflection angle of each deflected face image is determined manually, and the A2 = a2/b2 of each deflected face image is determined. Then, for each deflected face image, the ratio of its face deflection angle to its own A2 is calculated, and the average of the resulting ratios is taken as the second deflection coefficient. The face deflection angle of each deflected face image may also be determined by professional equipment, which is not limited herein.
In implementation, after the terminal determines the ratio of a2 to b2, it can obtain the pre-stored second deflection coefficient K2, calculate the product of A2 and K2, and determine that product as the face deflection angle of the to-be-processed face image. After the terminal determines the face deflection angle, if the face deflection angle is greater than a preset channel-switching reference angle, the terminal may switch channels.
Mode two: after the terminal determines the ratio of a2 to b2, it can check which ratio range the ratio of a2 to b2 belongs to, and then, from a pre-stored correspondence between ratio ranges and deflection angles, determine the deflection angle corresponding to that ratio range as the face deflection angle of the to-be-processed face image. After the terminal determines the face deflection angle, if the face deflection angle is greater than the preset channel-switching reference angle, the terminal may switch channels.
A technician may pre-store the correspondence between ranges of the ratio of a2 to b2 and deflection angles. The correspondence may be determined as follows: using multiple face images deflected in different directions (for example, deflected to the right or to the left), the face deflection angle of each deflected face image is determined manually, along with the a2/b2 of each deflected face image; each ratio of a2 to b2 is then stored in correspondence with its deflection angle. With a large number of such ratio-angle pairs, the correspondence between ratio ranges of a2 to b2 and deflection angles can be obtained.
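Mode two is a table lookup over the pre-stored ratio ranges. A sketch of that lookup follows; the concrete ranges and angles below are illustrative placeholders only, since the embodiment does not give numeric values:

```python
# Each entry: (lower bound, upper bound, deflection angle in degrees).
# These ranges and angles are placeholders, not values from the embodiment.
RATIO_TABLE = [
    (0.0, 0.8, -30.0),          # ratio well below 1: deflected one way
    (0.8, 1.2, 0.0),            # ratio near 1: roughly frontal
    (1.2, float("inf"), 30.0),  # ratio well above 1: deflected the other way
]

def yaw_angle_mode_two(a2, b2, table=RATIO_TABLE):
    """Mode two: find the pre-stored ratio range that a2/b2 falls in."""
    ratio = a2 / b2
    for low, high, angle in table:
        if low <= ratio < high:
            return angle
    return None

result = yaw_angle_mode_two(1.0, 1.0)
```

In practice the table would be built from the manually labeled deflected face images described above.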
It should be noted that determining the face deflection angle based on the reciprocal of the ratio of a2 to b2 is also a simple variation of the embodiment of the present invention.
In addition, the face deflection direction can also be determined based on the ratio of a2 to b2. The ratio of a2 to b2 when the face is in a frontal (head-on) state can first be determined and taken as a reference value. If the ratio of a2 to b2 is less than the reference value, it can be determined that the face is deflected to the left; if the ratio of a2 to b2 is greater than the reference value, it can be determined that the face is deflected to the right; and if the ratio of a2 to b2 equals the reference value, it can be determined that the face is not deflected.
Optionally, the face deflection angle of the to-be-processed face image may also be determined in the following manner, where the first specified point is the left-eye center point, the second specified point is the right-eye center point, the third specified point is the left mouth-corner point, and the fourth specified point is the right mouth-corner point. The corresponding processing may be as follows:
Based on the position coordinates of the left-eye center point, the right-eye center point, and the nose tip, determine the distance a1 between the left-eye center point and a third intersection point and the distance b1 between the right-eye center point and the third intersection point, where the third intersection point is the foot of the perpendicular dropped from the nose tip onto the line connecting the left-eye center point and the right-eye center point; and based on the position coordinates of the left mouth-corner point, the right mouth-corner point, and the nose tip, determine the distance a3 between the left mouth-corner point and a fourth intersection point and the distance b3 between the right mouth-corner point and the fourth intersection point, where the fourth intersection point is the foot of the perpendicular dropped from the nose tip onto the line connecting the left mouth-corner point and the right mouth-corner point. Determine the face deflection angle as α = K3·A1 + K4·A3, where A1 = a1/b1, A3 = a3/b3, K3 is a pre-stored third deflection coefficient, and K4 is a pre-stored fourth deflection coefficient.
K3 and K4 may be set by a technician and stored in the terminal. They may be determined as follows: using multiple face images deflected in different directions (for example, deflected to the right or to the left), the face deflection angle of each deflected face image is determined manually, along with the A1 and A3 of each deflected face image. Using α = K3·A1 + K4·A3, multiple groups of K3 and K4 are calculated; the average of the K3 values and the average of the K4 values are then taken respectively to obtain the final K3 and K4. The face deflection angle of each deflected face image may also be determined by professional equipment, which is not limited herein.
In implementation, the methods for determining a1, b1, a3, and b3 have been described in detail above and are not repeated here.
Furthermore, the face deflection angle in the to-be-processed face image may also be determined by combining the above two embodiments, where the first specified point is the left-eye center point, the second specified point is the right-eye center point, the third specified point is the left mouth-corner point, and the fourth specified point is the right mouth-corner point. The corresponding processing may be as follows:
Based on the position coordinates of the left-eye center point, the right-eye center point, and the nose tip, determine the distance a1 between the left-eye center point and a third intersection point and the distance b1 between the right-eye center point and the third intersection point, where the third intersection point is the foot of the perpendicular dropped from the nose tip onto the line connecting the left-eye center point and the right-eye center point; based on the position coordinates of the left mouth-corner point, the right mouth-corner point, and the nose tip, determine the distance a3 between the left mouth-corner point and a fourth intersection point and the distance b3 between the right mouth-corner point and the fourth intersection point, where the fourth intersection point is the foot of the perpendicular dropped from the nose tip onto the line connecting the left mouth-corner point and the right mouth-corner point; based on the position coordinates of the left-eye center point, the left mouth-corner point, and the nose tip, determine the distance a2 from the nose tip to the first intersection point on the line connecting the left-eye center point and the left mouth-corner point; and based on the position coordinates of the right-eye center point, the right mouth-corner point, and the nose tip, determine the distance b2 from the nose tip to the second intersection point on the line connecting the right-eye center point and the right mouth-corner point. Determine the face deflection angle as α = K5·A1 + K6·A2 + K7·A3, where A1 = a1/b1, A2 = a2/b2, A3 = a3/b3, K5 is a pre-stored fifth deflection coefficient, K6 is a pre-stored sixth deflection coefficient, and K7 is a pre-stored seventh deflection coefficient.
K5, K6, and K7 may be set by a technician and stored in the terminal. They may be determined as follows: using multiple face images deflected in different directions (for example, deflected to the right or to the left), the face deflection angle of each deflected face image is determined manually, along with the A1, A2, and A3 of each deflected face image. Using α = K5·A1 + K6·A2 + K7·A3, multiple groups of K5, K6, and K7 are calculated; the averages of the K5, K6, and K7 values are then taken respectively to obtain the final K5, K6, and K7. The face deflection angle of each deflected face image may also be determined by professional equipment, which is not limited herein.
In implementation, the methods for determining a1, b1, a2, b2, a3, and b3 have been described in detail above and are not repeated here.
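The combined embodiment is a weighted sum of the three distance ratios. A minimal sketch, with the three pre-stored coefficients passed in as parameters (their values are determined offline from labeled images as described above; the function name is ours):

```python
def yaw_angle_combined(a1, b1, a2, b2, a3, b3, k5, k6, k7):
    """Combined embodiment: face deflection angle = K5*A1 + K6*A2 + K7*A3,
    where A1 = a1/b1 (eye line), A2 = a2/b2 (diagonal lines), A3 = a3/b3 (mouth line).
    """
    return k5 * (a1 / b1) + k6 * (a2 / b2) + k7 * (a3 / b3)

# For a perfectly frontal face all three ratios are 1, so the angle is K5 + K6 + K7.
frontal = yaw_angle_combined(1, 1, 1, 1, 1, 1, 2.0, 3.0, 5.0)
```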
After the terminal determines the face deflection angle, if the face deflection angle is greater than the preset channel-switching reference angle, the terminal may switch channels.
It should be noted that the method for determining the face deflection angle in the embodiment of the present invention is not only applicable to switching television channels but can also be used to control game operations and the like; the embodiment of the present invention is not limited in this respect.
In the embodiment of the present invention, the terminal determines the face deflection angle according to the position coordinates of the nose tip, the first specified point, the second specified point, the third specified point, and the fourth specified point, without determining the position coordinates of the nasal saddle point, thereby improving the accuracy of the determined face deflection angle.
In another embodiment of the present invention, a method for determining the face pitch angle of a to-be-processed image is also provided. As shown in Fig. 5, the processing flow may be as follows:
Step 501: obtain the position coordinates of the nose tip, the first specified point, the second specified point, the third specified point, and the fourth specified point in the to-be-processed face image.
The processing of step 501 is the same as that of step 301 and is not repeated here.
Optionally, the first specified point and the second specified point represent the left-eye center point and the right-eye center point, respectively, and the third specified point and the fourth specified point represent the left mouth-corner point and the right mouth-corner point, respectively.
Step 502: determine the position coordinates of the first intersection point on the line connecting the first specified point and the second specified point.

The first intersection point is the foot of the perpendicular dropped from the nose tip onto the line connecting the first specified point and the second specified point.
In implementation, let the position coordinates of the nose tip be (x1, y1), those of the first specified point be (x2, y2), those of the second specified point be (x3, y3), and those of the first intersection point be (x4, y4). Then the distance between the first specified point and the first foot of perpendicular is √((x2−x4)² + (y2−y4)²); the distance between the second specified point and the first intersection point is √((x3−x4)² + (y3−y4)²); the distance between the nose tip and the first intersection point is √((x1−x4)² + (y1−y4)²); the distance between the nose tip and the first specified point is √((x1−x2)² + (y1−y2)²); and the distance between the nose tip and the second specified point is √((x1−x3)² + (y1−y3)²). Because the first intersection point is the foot of the perpendicular dropped from the nose tip onto the line connecting the first specified point and the second specified point, the following relational expressions hold:

(x2−x4)² + (y2−y4)² + (x1−x4)² + (y1−y4)² = (x1−x2)² + (y1−y2)²

(x3−x4)² + (y3−y4)² + (x1−x4)² + (y1−y4)² = (x1−x3)² + (y1−y3)²

Since the position coordinates of the nose tip, the first specified point, and the second specified point are all known, the position coordinates (x4, y4) of the first intersection point can be obtained from these relational expressions.
Step 503: determine the position coordinates of the second intersection point on the line connecting the third specified point and the fourth specified point.

Using the method of step 502, the position coordinates (x6, y6) of the second intersection point can be calculated.
Step 504: determine the face pitch angle in the to-be-processed face image based on the ratio of the distance from the nose tip to the first intersection point to the distance from the nose tip to the second intersection point.
In implementation, based on the position coordinates (x4, y4) of the first intersection point determined in step 502 and the position coordinates (x1, y1) of the nose tip, the distance from the nose tip to the first intersection point can be calculated as a4 = √((x1−x4)² + (y1−y4)²). Based on the position coordinates (x6, y6) of the second intersection point determined in step 503 and the position coordinates (x1, y1) of the nose tip, the distance from the nose tip to the second intersection point can be calculated as b4 = √((x1−x6)² + (y1−y6)²).

The ratio of the distance from the nose tip to the first intersection point to the distance from the nose tip to the second intersection point is then A4 = a4/b4.
Using the ratio of a4 to b4, the face pitch angle may be determined as follows: the terminal can obtain a pre-stored pitch coefficient K8 and determine the product α = K8·A4 as the face pitch angle. In this way, the problem that the face pitch angle cannot be determined in the prior art can be solved.
In the embodiment of the present invention, the terminal can also determine the pitch state of the to-be-processed face image; the pitch state includes a looking-up state, a looking-down state, and a head-on (level) state. After the terminal calculates the ratio A4 = a4/b4, it can obtain a pre-stored pitch reference value. If A4 is greater than the pre-stored pitch reference value, the pitch state of the face is determined to be the looking-down state; if A4 is less than the pitch reference value, the pitch state of the face is determined to be the looking-up state; and if A4 equals the pitch reference value, the pitch state of the face is determined to be the head-on state.
In this way, the terminal can determine both that the user is in the looking-up state and the angle of looking up, or that the user is in the looking-down state and the angle of looking down, and can perform corresponding processing based on the determined pitch angle.
In addition, the above pitch reference value may be determined by the terminal before use. The specific processing is as follows: when the user wishes to control the terminal by the pitch of the face, the user can turn on the pitch-control function. The terminal then detects that the pitch-control function has been turned on, controls the image-acquisition component to start, and plays a preset voice prompt such as "please look straight at the phone". The image-acquisition component can continuously capture multiple face images; a4 and b4 are then calculated for each face image, along with the ratio A4 of a4 to b4. The largest A4 and the smallest A4 are removed, the remaining A4 values are averaged, and the average can then be stored as the pitch reference value. Because each person's pitch reference value may differ, the pitch reference value can be calculated separately for different users, making the determined pitch state of the face more accurate.
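The calibration described above (discard the largest and smallest A4, average the rest) and the subsequent pitch-state comparison can be sketched as follows; the function names are ours:

```python
def pitch_reference(a4_values, b4_values):
    """Compute a user's pitch reference value from frontal calibration frames.

    For each captured frame, A4 = a4 / b4; the largest and smallest A4 are
    discarded and the remaining values are averaged.
    """
    ratios = [a / b for a, b in zip(a4_values, b4_values)]
    if len(ratios) <= 2:
        raise ValueError("need more than two calibration frames")
    ratios.remove(max(ratios))
    ratios.remove(min(ratios))
    return sum(ratios) / len(ratios)

def pitch_state(a4, b4, reference):
    """A4 above the reference -> looking down; below -> looking up; equal -> head-on."""
    ratio = a4 / b4
    if ratio > reference:
        return "looking down"
    if ratio < reference:
        return "looking up"
    return "head-on"
```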
It should be noted that determining the face pitch angle based on the reciprocal of the ratio of a4 to b4 is also a simple variation of the embodiment of the present invention.
It should also be noted that, in the accompanying drawings provided in the embodiment of the present invention, only the nose tip, the first specified point, the second specified point, the third specified point, and the fourth specified point are shown on the face image.
In the embodiment of the present invention, the terminal determines the face pitch angle according to the position coordinates of the nose tip, the first specified point, the second specified point, the third specified point, and the fourth specified point, solving the problem that the face pitch angle cannot be determined in the prior art.
Based on the same technical concept, an embodiment of the present invention further provides an apparatus for determining a face deflection angle. As shown in Fig. 6, the apparatus includes:
a first acquisition module 610, configured to obtain the position coordinates of the nose tip, a first specified point, and a second specified point in a to-be-processed face image, where the first specified point and the second specified point are symmetric about the nose tip;

a first determining module 620, configured to determine the position coordinates of the intersection point of the perpendicular dropped from the nose tip onto the line connecting the first specified point and the second specified point; and

a second determining module 630, configured to determine the face deflection angle in the to-be-processed face image based on the ratio of the distance between the first specified point and the intersection point to the distance between the second specified point and the intersection point.
Optionally, the first specified point and the second specified point represent the left-eye center point and the right-eye center point, respectively; or the first specified point and the second specified point represent the left mouth-corner point and the right mouth-corner point, respectively.
In the embodiment of the present invention, the terminal determines the face deflection angle according to the position coordinates of the nose tip, the first specified point, and the second specified point, without determining the position coordinates of the nasal saddle point, thereby improving the accuracy of the determined face deflection angle.
It should be noted that, when the apparatus for detecting a face deflection angle provided in the above embodiment detects the face deflection angle, the division into the above functional modules is merely illustrative; in practical applications, the above functions may be allocated to different functional modules as needed, that is, the internal structure of the apparatus may be divided into different functional modules to complete all or part of the functions described above. In addition, the apparatus for detecting a face deflection angle provided in the above embodiment and the method embodiment for detecting a face deflection angle belong to the same concept; for the specific implementation process, refer to the method embodiment, which is not repeated here.
Based on the same technical concept, an embodiment of the present invention further provides an apparatus for detecting a face deflection angle. As shown in Fig. 7, the apparatus includes:
a second acquisition module 710, configured to obtain the position coordinates of the nose tip, a first specified point, a second specified point, a third specified point, and a fourth specified point in a to-be-processed face image, where the first specified point and the second specified point are symmetric about the nose tip, the third specified point and the fourth specified point are symmetric about the nose tip, the first specified point and the second specified point on the one hand and the third specified point and the fourth specified point on the other are respectively located on the upper and lower sides of the nose tip, and the first specified point and the third specified point are located on the same side of the nose tip;

a third determining module 720, configured to determine the position coordinates of the first intersection point of the perpendicular dropped from the nose tip onto the line connecting the first specified point and the third specified point;

a fourth determining module 730, configured to determine the position coordinates of the second intersection point of the perpendicular dropped from the nose tip onto the line connecting the second specified point and the fourth specified point; and

a fifth determining module 740, configured to determine the face deflection angle in the to-be-processed face image based on the ratio of the distance from the nose tip to the first intersection point to the distance from the nose tip to the second intersection point.
In the embodiment of the present invention, the terminal determines the face deflection angle according to the position coordinates of the nose tip, the first specified point, the second specified point, the third specified point, and the fourth specified point, without determining the position coordinates of the nasal saddle point, thereby improving the accuracy of the determined face deflection angle.
It should be noted that, when the apparatus for detecting a face deflection angle provided in the above embodiment detects the face deflection angle, the division into the above functional modules is merely illustrative; in practical applications, the above functions may be allocated to different functional modules as needed, that is, the internal structure of the apparatus may be divided into different functional modules to complete all or part of the functions described above. In addition, the apparatus for detecting a face deflection angle provided in the above embodiment and the method embodiment for detecting a face deflection angle belong to the same concept; for the specific implementation process, refer to the method embodiment, which is not repeated here.
Based on the same technical concept, an embodiment of the present invention further provides an apparatus for detecting a face pitch angle. As shown in Fig. 8, the apparatus includes:
a third acquisition module 810, configured to obtain the position coordinates of the nose tip, a first specified point, a second specified point, a third specified point, and a fourth specified point in a to-be-processed face image, where the first specified point and the second specified point are symmetric about the nose tip, the third specified point and the fourth specified point are symmetric about the nose tip, the first specified point and the second specified point on the one hand and the third specified point and the fourth specified point on the other are respectively located on the upper and lower sides of the nose tip, and the first specified point and the third specified point are located on the same side of the nose tip;

a sixth determining module 820, configured to determine the position coordinates of the first intersection point of the perpendicular dropped from the nose tip onto the line connecting the first specified point and the second specified point;

a seventh determining module 830, configured to determine the position coordinates of the second intersection point of the perpendicular dropped from the nose tip onto the line connecting the third specified point and the fourth specified point; and

an eighth determining module 840, configured to determine the face pitch angle in the to-be-processed face image based on the ratio of the distance from the nose tip to the first intersection point to the distance from the nose tip to the second intersection point.
Optionally, the first specified point and the second specified point represent the left-eye center point and the right-eye center point, respectively, and the third specified point and the fourth specified point represent the left mouth-corner point and the right mouth-corner point, respectively.
In the embodiment of the present invention, the terminal determines the face pitch angle according to the position coordinates of the nose tip, the first specified point, the second specified point, the third specified point, and the fourth specified point, solving the problem that the face pitch angle cannot be determined in the prior art.
It should be noted that, when the apparatus for detecting a face pitch angle provided in the above embodiment detects the face pitch angle, the division into the above functional modules is merely illustrative; in practical applications, the above functions may be allocated to different functional modules as needed, that is, the internal structure of the apparatus may be divided into different functional modules to complete all or part of the functions described above. In addition, the apparatus for detecting a face pitch angle provided in the above embodiment and the method embodiment for detecting a face pitch angle belong to the same concept; for the specific implementation process, refer to the method embodiment, which is not repeated here.
Referring to Fig. 9, a schematic structural diagram of the terminal involved in an embodiment of the present invention is shown. The terminal can be used to implement the face deflection angle detection method provided in the above embodiments. Specifically:
The terminal 900 may include an RF (Radio Frequency) circuit 110, a memory 120 including one or more computer-readable storage media, an input unit 130, a display unit 140, a sensor 150, an audio circuit 160, a WiFi (Wireless Fidelity) module 170, a processor 180 including one or more processing cores, a power supply 190, and other components. Those skilled in the art will understand that the terminal structure shown in Fig. 9 does not constitute a limitation on the terminal, which may include more or fewer components than illustrated, combine certain components, or adopt a different component arrangement. Wherein:
The RF circuit 110 can be used to receive and send signals in the course of receiving and sending messages or during a call. In particular, after receiving downlink information from a base station, it passes the information to one or more processors 180 for processing; in addition, it sends uplink data to the base station. Generally, the RF circuit 110 includes, but is not limited to, an antenna, at least one amplifier, a tuner, one or more oscillators, a subscriber identity module (SIM) card, a transceiver, a coupler, an LNA (Low Noise Amplifier), a duplexer, and the like. In addition, the RF circuit 110 can also communicate with networks and other devices by wireless communication. The wireless communication can use any communication standard or protocol, including but not limited to GSM (Global System for Mobile Communications), GPRS (General Packet Radio Service), CDMA (Code Division Multiple Access), WCDMA (Wideband Code Division Multiple Access), LTE (Long Term Evolution), e-mail, SMS (Short Messaging Service), and the like.
The memory 120 can be used to store software programs and modules; the processor 180 executes various functional applications and performs data processing by running the software programs and modules stored in the memory 120. The memory 120 may mainly include a program storage area and a data storage area, where the program storage area can store the operating system, applications required by at least one function (such as a sound-playing function or an image-playing function), and the like, and the data storage area can store data created according to the use of the terminal 900 (such as audio data and a phone book). In addition, the memory 120 may include a high-speed random access memory and may also include a non-volatile memory, for example at least one magnetic disk storage device, a flash memory device, or another solid-state storage device. Correspondingly, the memory 120 may also include a memory controller to provide the processor 180 and the input unit 130 with access to the memory 120.
The input unit 130 can be used to receive input numeric or character information and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control. Specifically, the input unit 130 may include a touch-sensitive surface 131 and other input devices 132. The touch-sensitive surface 131, also referred to as a touch display screen or a touchpad, collects touch operations by the user on or near it (such as operations performed on or near the touch-sensitive surface 131 by the user using a finger, a stylus, or any other suitable object or accessory) and drives the corresponding connected devices according to a preset program. Optionally, the touch-sensitive surface 131 may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 180, and can receive and execute commands sent by the processor 180. Furthermore, the touch-sensitive surface 131 may be implemented in multiple types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch-sensitive surface 131, the input unit 130 may also include other input devices 132. Specifically, the other input devices 132 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and a power switch key), a trackball, a mouse, a joystick, and the like.
The display unit 140 can be used to display information input by the user or information provided to the user, as well as the various graphical user interfaces of the terminal 900; these graphical user interfaces may be composed of graphics, text, icons, video, and any combination thereof. The display unit 140 may include a display panel 141, which may optionally be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like. Furthermore, the touch-sensitive surface 131 may cover the display panel 141; when the touch-sensitive surface 131 detects a touch operation on or near it, it transmits the operation to the processor 180 to determine the type of the touch event, and the processor 180 then provides a corresponding visual output on the display panel 141 according to the type of the touch event. Although in Fig. 9 the touch-sensitive surface 131 and the display panel 141 implement the input and output functions as two independent components, in some embodiments the touch-sensitive surface 131 and the display panel 141 may be integrated to implement the input and output functions.
The terminal 900 may also include at least one sensor 150, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor: the ambient light sensor may adjust the brightness of the display panel 141 according to the brightness of the ambient light, and the proximity sensor may turn off the display panel 141 and/or the backlight when the terminal 900 is moved close to the ear. As one kind of motion sensor, a gravity acceleration sensor can detect the magnitude of acceleration in each direction (generally three axes), can detect the magnitude and direction of gravity when stationary, and can be used in applications that recognize the posture of the mobile phone (such as landscape/portrait switching, related games, and magnetometer pose calibration) and in vibration-recognition functions (such as a pedometer or tap detection). Other sensors that may also be configured in the terminal 900, such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, are not described here again.
The audio circuit 160, a speaker 161, and a microphone 162 may provide an audio interface between the user and the terminal 900. The audio circuit 160 may transmit the electrical signal converted from the received audio data to the speaker 161, which converts it into a sound signal for output; on the other hand, the microphone 162 converts the collected sound signal into an electrical signal, which the audio circuit 160 receives and converts into audio data. After the audio data is output to the processor 180 for processing, it is sent through the RF circuit 110 to, for example, another terminal, or output to the memory 120 for further processing. The audio circuit 160 may also include an earphone jack to provide communication between a peripheral earphone and the terminal 900.
WiFi is a short-range wireless transmission technology. Through the WiFi module 170, the terminal 900 can help the user send and receive e-mail, browse web pages, access streaming media, and so on; it provides the user with wireless broadband Internet access. Although Fig. 9 shows the WiFi module 170, it can be understood that it is not an essential component of the terminal 900 and may be omitted as needed without changing the essence of the invention.
The processor 180 is the control center of the terminal 900. It connects all parts of the whole mobile phone through various interfaces and lines, and performs the various functions of the terminal 900 and processes data by running or executing the software programs and/or modules stored in the memory 120 and calling the data stored in the memory 120, thereby monitoring the mobile phone as a whole. Optionally, the processor 180 may include one or more processing cores; preferably, the processor 180 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interfaces, application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the above modem processor may also not be integrated into the processor 180.
The terminal 900 also includes a power supply 190 (such as a battery) that supplies power to all the components. Preferably, the power supply may be logically connected to the processor 180 through a power management system, so that functions such as charging management, discharging management, and power-consumption management are implemented through the power management system. The power supply 190 may also include any components such as one or more DC or AC power sources, a recharging system, a power-failure detection circuit, a power converter or inverter, and a power status indicator.
Although not shown, the terminal 900 may also include a camera, a Bluetooth module, and the like, which are not described here again. Specifically, in this embodiment, the display unit of the terminal 900 is a touch-screen display, and the terminal 900 also includes a memory and one or more programs, where the one or more programs are stored in the memory and configured to be executed by one or more processors, so as to perform the method for detecting the face deflection angle and pitch angle described in each of the above embodiments.
A person of ordinary skill in the art will understand that all or part of the steps of the above embodiments may be completed by hardware, or by a program instructing the relevant hardware; the program may be stored in a computer-readable storage medium, and the storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disc, or the like.
The above descriptions are merely preferred embodiments of the present invention and are not intended to limit the present invention. Any modification, equivalent substitution, improvement, and the like made within the spirit and principles of the present invention shall be included within the protection scope of the present invention.
Claims (10)
1. A method for detecting a face deflection angle, characterized in that the method comprises:
acquiring position coordinates of a nose tip point, a first specified point and a second specified point in a face image to be processed, wherein the first specified point and the second specified point are distributed symmetrically with respect to the nose tip point;
determining position coordinates of the intersection point of the perpendicular from the nose tip point to the line connecting the first specified point and the second specified point;
determining the face deflection angle in the face image to be processed based on the ratio of the distance between the first specified point and the intersection point to the distance between the second specified point and the intersection point.
2. The method according to claim 1, characterized in that the first specified point and the second specified point respectively represent a left eye center point and a right eye center point; or the first specified point and the second specified point respectively represent a left mouth corner point and a right mouth corner point.
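The geometry of claims 1 and 2 can be illustrated with a short sketch. This is explanatory only and not part of the claims: the "intersection point" is read as the foot of the perpendicular from the nose tip onto the line joining the two specified points (the wording the corresponding device claim 6 uses), and since the claims leave the ratio-to-angle mapping to the embodiments, `deflection_angle` uses a hypothetical placeholder mapping with an assumed scale `k`.

```python
import math

def perpendicular_foot(p, a, b):
    """Foot of the perpendicular dropped from 2-D point p onto the line through a and b."""
    (ax, ay), (bx, by), (px, py) = a, b, p
    vx, vy = bx - ax, by - ay
    t = ((px - ax) * vx + (py - ay) * vy) / (vx * vx + vy * vy)
    return (ax + t * vx, ay + t * vy)

def deflection_ratio(nose_tip, p1, p2):
    """Claim 1: ratio of the distances from the two specified points to the
    intersection point.  A frontal face gives a ratio of 1.0."""
    foot = perpendicular_foot(nose_tip, p1, p2)
    return math.dist(p1, foot) / math.dist(p2, foot)

def deflection_angle(nose_tip, p1, p2, k=90.0):
    """Hypothetical ratio-to-angle mapping (NOT specified by the claims):
    0 for a frontal face, sign follows the direction of deflection."""
    r = deflection_ratio(nose_tip, p1, p2)
    return k * (r - 1.0) / (r + 1.0)
```

With the left and right eye centers as the two specified points (claim 2), a nose tip projected nearer one eye drives the ratio away from 1, which an embodiment would then map to a yaw angle.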
3. A method for detecting a face deflection angle, characterized in that the method comprises:
acquiring position coordinates of a nose tip point, a first specified point, a second specified point, a third specified point and a fourth specified point in a face image to be processed, wherein the first specified point and the second specified point are distributed symmetrically with respect to the nose tip point, the third specified point and the fourth specified point are distributed symmetrically with respect to the nose tip point, the first specified point and the second specified point, and the third specified point and the fourth specified point, are respectively located on the upper and lower sides of the nose tip point, and the first specified point and the third specified point are located on the same side of the nose tip point;
determining position coordinates of a first intersection point of the perpendicular from the nose tip point to the line connecting the first specified point and the third specified point;
determining position coordinates of a second intersection point of the perpendicular from the nose tip point to the line connecting the second specified point and the fourth specified point;
determining the face deflection angle in the face image to be processed based on the ratio of the distance between the nose tip point and the first intersection point to the distance between the nose tip point and the second intersection point.
4. A method for detecting a face pitch angle, characterized in that the method comprises:
acquiring position coordinates of a nose tip point, a first specified point, a second specified point, a third specified point and a fourth specified point in a face image to be processed, wherein the first specified point and the second specified point are distributed symmetrically with respect to the nose tip point, the third specified point and the fourth specified point are distributed symmetrically with respect to the nose tip point, the first specified point and the second specified point, and the third specified point and the fourth specified point, are respectively located on the upper and lower sides of the nose tip point, and the first specified point and the third specified point are located on the same side of the nose tip point;
determining position coordinates of a first intersection point of the perpendicular from the nose tip point to the line connecting the first specified point and the second specified point;
determining position coordinates of a second intersection point of the perpendicular from the nose tip point to the line connecting the third specified point and the fourth specified point;
determining the face pitch angle in the face image to be processed based on the ratio of the distance between the nose tip point and the first intersection point to the distance between the nose tip point and the second intersection point.
5. The method according to claim 3 or 4, characterized in that the first specified point and the second specified point respectively represent a left eye center point and a right eye center point, and the third specified point and the fourth specified point respectively represent a left mouth corner point and a right mouth corner point.
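Claims 3-5 share one construction: with eye centers above the nose tip and mouth corners below it (claim 5), the nose tip is projected onto the two lateral eye-to-mouth lines for yaw (claim 3) and onto the eye line and the mouth line for pitch (claim 4). A minimal sketch of both ratios follows; again the "intersection points" are read as feet of perpendiculars, and the ratio-to-angle mapping, left to the embodiments, is omitted:

```python
import math

def perpendicular_foot(p, a, b):
    """Foot of the perpendicular from 2-D point p onto the line through a and b."""
    (ax, ay), (bx, by), (px, py) = a, b, p
    vx, vy = bx - ax, by - ay
    t = ((px - ax) * vx + (py - ay) * vy) / (vx * vx + vy * vy)
    return (ax + t * vx, ay + t * vy)

def yaw_ratio(nose_tip, left_eye, right_eye, left_mouth, right_mouth):
    """Claim 3: distance from the nose tip to the left eye-mouth line versus
    the right eye-mouth line (first and second intersection points)."""
    f1 = perpendicular_foot(nose_tip, left_eye, left_mouth)
    f2 = perpendicular_foot(nose_tip, right_eye, right_mouth)
    return math.dist(nose_tip, f1) / math.dist(nose_tip, f2)

def pitch_ratio(nose_tip, left_eye, right_eye, left_mouth, right_mouth):
    """Claim 4: distance from the nose tip to the eye line versus the mouth line."""
    f1 = perpendicular_foot(nose_tip, left_eye, right_eye)
    f2 = perpendicular_foot(nose_tip, left_mouth, right_mouth)
    return math.dist(nose_tip, f1) / math.dist(nose_tip, f2)
```

Both ratios equal 1.0 for a centered frontal face; sideways nose-tip displacement moves only the yaw ratio, vertical displacement only the pitch ratio, which is what lets the two angles be estimated independently.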
6. An apparatus for detecting a face deflection angle, characterized in that the apparatus comprises:
a first acquisition module, configured to acquire position coordinates of a nose tip point, a first specified point and a second specified point in a face image to be processed, wherein the first specified point and the second specified point are distributed symmetrically with respect to the nose tip point;
a first determination module, configured to determine position coordinates of the foot of the perpendicular from the nose tip point to the line connecting the first specified point and the second specified point;
a second determination module, configured to determine the face deflection angle in the face image to be processed based on the ratio of the distance between the first specified point and the foot of the perpendicular to the distance between the second specified point and the foot of the perpendicular.
7. The apparatus according to claim 6, characterized in that the first specified point and the second specified point respectively represent a left eye center point and a right eye center point; or the first specified point and the second specified point respectively represent a left mouth corner point and a right mouth corner point.
8. An apparatus for detecting a face deflection angle, characterized in that the apparatus comprises:
a second acquisition module, configured to acquire position coordinates of a nose tip point, a first specified point, a second specified point, a third specified point and a fourth specified point in a face image to be processed, wherein the first specified point and the second specified point are distributed symmetrically with respect to the nose tip point, the third specified point and the fourth specified point are distributed symmetrically with respect to the nose tip point, the first specified point and the second specified point, and the third specified point and the fourth specified point, are respectively located on the upper and lower sides of the nose tip point, and the first specified point and the third specified point are located on the same side of the nose tip point;
a third determination module, configured to determine position coordinates of a first intersection point of the perpendicular from the nose tip point to the line connecting the first specified point and the third specified point;
a fourth determination module, configured to determine position coordinates of a second intersection point of the perpendicular from the nose tip point to the line connecting the second specified point and the fourth specified point;
a fifth determination module, configured to determine the face deflection angle in the face image to be processed based on the ratio of the distance between the nose tip point and the first intersection point to the distance between the nose tip point and the second intersection point.
9. An apparatus for detecting a face pitch angle, characterized in that the apparatus comprises:
a third acquisition module, configured to acquire position coordinates of a nose tip point, a first specified point, a second specified point, a third specified point and a fourth specified point in a face image to be processed, wherein the first specified point and the second specified point are distributed symmetrically with respect to the nose tip point, the third specified point and the fourth specified point are distributed symmetrically with respect to the nose tip point, the first specified point and the second specified point, and the third specified point and the fourth specified point, are respectively located on the upper and lower sides of the nose tip point, and the first specified point and the third specified point are located on the same side of the nose tip point;
a sixth determination module, configured to determine position coordinates of a first intersection point of the perpendicular from the nose tip point to the line connecting the first specified point and the second specified point;
a seventh determination module, configured to determine position coordinates of a second intersection point of the perpendicular from the nose tip point to the line connecting the third specified point and the fourth specified point;
an eighth determination module, configured to determine the face pitch angle in the face image to be processed based on the ratio of the distance between the nose tip point and the first intersection point to the distance between the nose tip point and the second intersection point.
10. The apparatus according to claim 8 or 9, characterized in that the first specified point and the second specified point respectively represent a left eye center point and a right eye center point, and the third specified point and the fourth specified point respectively represent a left mouth corner point and a right mouth corner point.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710289717.6A CN107122054A (en) | 2017-04-27 | 2017-04-27 | A kind of detection method and device of face deflection angle and luffing angle |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107122054A true CN107122054A (en) | 2017-09-01 |
Family
ID=59725461
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710289717.6A Pending CN107122054A (en) | 2017-04-27 | 2017-04-27 | A kind of detection method and device of face deflection angle and luffing angle |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107122054A (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101561710A (en) * | 2009-05-19 | 2009-10-21 | 重庆大学 | Man-machine interaction method based on estimation of human face posture |
CN103605965A (en) * | 2013-11-25 | 2014-02-26 | 苏州大学 | Multi-pose face recognition method and device |
CN104850825A (en) * | 2015-04-18 | 2015-08-19 | 中国计量学院 | Facial image face score calculating method based on convolutional neural network |
CN104951767A (en) * | 2015-06-23 | 2015-09-30 | 安阳师范学院 | Three-dimensional face recognition technology based on correlation degree |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107944393B (en) * | 2017-11-27 | 2021-03-30 | 电子科技大学 | Human face nose tip positioning method |
CN107944393A (en) * | 2017-11-27 | 2018-04-20 | 电子科技大学 | Face nose localization method |
CN110363052A (en) * | 2018-04-11 | 2019-10-22 | 杭州海康威视数字技术股份有限公司 | Determine the method, apparatus and computer equipment of the human face posture in image |
CN110363052B (en) * | 2018-04-11 | 2022-05-20 | 杭州海康威视数字技术股份有限公司 | Method and device for determining human face pose in image and computer equipment |
CN110390229A (en) * | 2018-04-20 | 2019-10-29 | 杭州海康威视数字技术股份有限公司 | A kind of face picture screening technique, device, electronic equipment and storage medium |
CN109086727A (en) * | 2018-08-10 | 2018-12-25 | 北京奇艺世纪科技有限公司 | A kind of method, apparatus and electronic equipment of the movement angle of determining human body head |
CN108921148A (en) * | 2018-09-07 | 2018-11-30 | 北京相貌空间科技有限公司 | Determine the method and device of positive face tilt angle |
CN109345447A (en) * | 2018-09-20 | 2019-02-15 | 广州酷狗计算机科技有限公司 | The method and apparatus of face replacement processing |
CN109598196A (en) * | 2018-10-29 | 2019-04-09 | 华中科技大学 | A kind of multiform becomes the characteristic point positioning method of multi-pose Face sequence |
CN109753886A (en) * | 2018-12-17 | 2019-05-14 | 北京爱奇艺科技有限公司 | A kind of evaluation method of facial image, device and equipment |
CN109753886B (en) * | 2018-12-17 | 2024-03-08 | 北京爱奇艺科技有限公司 | Face image evaluation method, device and equipment |
CN110097021A (en) * | 2019-05-10 | 2019-08-06 | 电子科技大学 | Face pose estimation based on MTCNN |
CN110097021B (en) * | 2019-05-10 | 2022-09-06 | 电子科技大学 | MTCNN-based face pose estimation method |
CN111700619A (en) * | 2020-05-28 | 2020-09-25 | 广西壮族自治区人民医院 | Neck rehabilitation auxiliary system and device thereof |
CN111914783A (en) * | 2020-08-10 | 2020-11-10 | 深圳市视美泰技术股份有限公司 | Method and device for determining human face deflection angle, computer equipment and medium |
CN114162138A (en) * | 2020-09-10 | 2022-03-11 | 中国移动通信有限公司研究院 | Automatic driving mode switching method and device |
CN114162138B (en) * | 2020-09-10 | 2023-10-27 | 中国移动通信有限公司研究院 | Automatic driving mode switching method and device |
CN112488016A (en) * | 2020-12-09 | 2021-03-12 | 重庆邮电大学 | Multi-angle face recognition method and application |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107122054A (en) | A kind of detection method and device of face deflection angle and luffing angle | |
CN103455256B (en) | The method and terminal of Rotation screen display picture | |
CN109445736A (en) | A kind of multi-screen display method and mobile terminal | |
CN107580143B (en) | A kind of display methods and mobile terminal | |
CN106959761B (en) | A kind of terminal photographic method, device and terminal | |
CN106445339B (en) | A kind of method and apparatus that double screen terminal shows stereo-picture | |
CN105183296B (en) | interactive interface display method and device | |
CN103714161B (en) | The generation method of image thumbnails, device and terminal | |
CN107833178A (en) | A kind of image processing method, device and mobile terminal | |
CN105224556B (en) | Waterfall stream interface display methods and device | |
CN108491123A (en) | A kind of adjusting application program image target method and mobile terminal | |
CN107817939A (en) | A kind of image processing method and mobile terminal | |
CN106504303B (en) | A kind of method and apparatus playing frame animation | |
CN107506732A (en) | Method, equipment, mobile terminal and the computer-readable storage medium of textures | |
CN107396193B (en) | The method and apparatus of video playing | |
CN108415641A (en) | A kind of processing method and mobile terminal of icon | |
CN108668024A (en) | A kind of method of speech processing and terminal | |
CN109343788A (en) | A kind of method of controlling operation thereof and mobile terminal of mobile terminal | |
CN107864336A (en) | A kind of image processing method, mobile terminal | |
CN109739394A (en) | A kind of processing method of SAR value, mobile terminal | |
CN110191426A (en) | A kind of method and terminal of information sharing | |
CN110099434A (en) | A kind of power regulating method, terminal device and computer readable storage medium | |
CN109542572A (en) | A kind of interface display method and mobile terminal | |
CN109739300A (en) | A kind of method of controlling antenna and terminal | |
CN109727212A (en) | A kind of image processing method and mobile terminal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
RJ01 | Rejection of invention patent application after publication | Application publication date: 20170901 |