CN109344710B - Image feature point positioning method and device, storage medium and processor - Google Patents

Image feature point positioning method and device, storage medium and processor

Info

Publication number
CN109344710B
Authority
CN
China
Prior art keywords
point
adsorption force
points
pixel
characteristic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811002527.2A
Other languages
Chinese (zh)
Other versions
CN109344710A
Inventor
罗子懿
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Neusoft Corp
Original Assignee
Neusoft Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Neusoft Corp filed Critical Neusoft Corp
Priority to CN201811002527.2A priority Critical patent/CN109344710B/en
Publication of CN109344710A publication Critical patent/CN109344710A/en
Application granted granted Critical
Publication of CN109344710B publication Critical patent/CN109344710B/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168 - Feature extraction; Face representation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 - Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 - Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/757 - Matching configurations of points or features


Abstract

The application discloses an image feature point positioning method and device, a storage medium, and a processor. The method corrects the position of an inaccurately located feature point by exploiting the adsorption effect between pixel points in the image. First, all pixel points in the neighborhood of the feature point are determined. Next, the adsorption force of each pixel point in the neighborhood on the feature point is acquired. The adsorption forces of the individual pixel points are then combined into the comprehensive adsorption force of all pixel points in the neighborhood on the feature point. Finally, the position of the inaccurately located feature point is corrected according to the magnitude and direction of the comprehensive adsorption force to obtain the corrected position of the feature point. The method can correct inaccurately located feature points and, compared with the prior art, significantly improves the accuracy of feature point positioning.

Description

Image feature point positioning method and device, storage medium and processor
Technical Field
The present application relates to the field of image processing, and in particular to a method and an apparatus for locating image feature points, a storage medium, and a processor.
Background
In image processing, a feature point is a point with representative meaning in an image, such as a corner point or an edge point. Feature points are also called key points or interest points. Their identification and positioning play an important role in image recognition, image matching, image deformation, 3D modeling, and the like, for example in face recognition. Precise positioning of feature points is key to the post-processing stages of image recognition: the more accurately the feature points in an image are located, the higher the image recognition rate.
Although some image key point detection techniques already exist, they can only locate feature points to the vicinity of the true feature points, and their positioning accuracy is low.
Disclosure of Invention
To solve the above problems in the prior art, the present application provides a method, an apparatus, a storage medium, and a processor for positioning image feature points, which optimize the positioning of image feature points and make it more accurate.
The application provides the following technical scheme:
In a first aspect of the present application, a method for positioning an image feature point is provided, including:
determining all pixel points in the neighborhood of the feature point, where each pixel point in the neighborhood exerts an adsorption effect on the feature point;
acquiring the adsorption force of each pixel point on the feature point;
acquiring the comprehensive adsorption force of all pixel points on the feature point from the adsorption force of each individual pixel point;
and correcting the position of the feature point according to the magnitude and direction of the comprehensive adsorption force to obtain the corrected position of the feature point.
As a possible implementation, acquiring the adsorption force of each pixel point on the feature point specifically includes:
acquiring the adsorption force of the pixel point on the feature point according to the gradient vector of the pixel point, the position of the pixel point, and the position of the feature point.
As a possible implementation, acquiring the adsorption force of the pixel point on the feature point according to the gradient vector of the pixel point, the position of the pixel point, and the position of the feature point specifically includes:
obtaining the distance between the pixel point and the feature point from the position of the pixel point and the position of the feature point;
obtaining the included angle between the direction from the feature point to the pixel point and the gradient direction of the pixel point;
and obtaining the adsorption force of the pixel point on the feature point from the gradient vector of the pixel point, the distance between the pixel point and the feature point, and the included angle.
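For illustration, the three inputs just listed can be sketched as a small function. The patent does not give a closed-form expression at this point, so the magnitude model below (gradient magnitude, modulated by the cosine of the included angle and attenuated by distance) is an assumption, as are the function and variable names:

```python
import numpy as np

def adsorption_force(grad, p_pixel, p_feat):
    """Adsorption force of one pixel point on a feature point.

    Assumed model: the force points along the pixel's gradient
    direction, with magnitude |grad| * cos(theta) / d, where d is the
    pixel-feature distance and theta is the included angle between the
    feature-to-pixel direction and the gradient direction.
    """
    d_vec = np.asarray(p_pixel, float) - np.asarray(p_feat, float)
    d = np.linalg.norm(d_vec)                     # distance term
    g = np.asarray(grad, float)
    g_norm = np.linalg.norm(g)                    # gradient magnitude
    if d == 0.0 or g_norm == 0.0:
        return np.zeros(2)                        # no pull from itself or flat areas
    cos_theta = float(np.dot(d_vec, g)) / (d * g_norm)  # included-angle term
    magnitude = g_norm * cos_theta / d            # assumed distance attenuation
    return magnitude * (g / g_norm)               # direction = gradient direction
```

Under this model, a pixel at (1, 0) with gradient (1, 0) pulls a feature point at the origin with force (1, 0); the same pixel at twice the distance pulls half as hard.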
As a possible implementation, after acquiring the adsorption force of each pixel point on the feature point, the method further includes:
normalizing the adsorption force of each pixel point on the feature point;
acquiring the comprehensive adsorption force of all pixel points on the feature point from the adsorption force of each pixel point then specifically means:
acquiring the comprehensive adsorption force of all pixel points on the feature point using the normalized adsorption force of each pixel point on the feature point.
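The normalization step above can be sketched as follows; scaling each force to unit magnitude is one plausible reading, since the patent does not fix the normalization scheme, and the function name is illustrative:

```python
import numpy as np

def normalize_forces(forces):
    """Normalize per-pixel adsorption forces to unit magnitude so that
    no single pixel dominates the combination; zero forces are left
    unchanged. (Unit-length scaling is an assumed normalization scheme.)"""
    forces = np.asarray(forces, float)
    mags = np.linalg.norm(forces, axis=1, keepdims=True)
    mags[mags == 0.0] = 1.0          # avoid division by zero for flat pixels
    return forces / mags
```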
As a possible implementation, the comprehensive adsorption force of all pixel points on the feature point is obtained from the adsorption force of each pixel point on the feature point by the following formula:

F(p_o) = Σ_{p_i ∈ Ω_o} ω_i · F_i

where p_o denotes the feature point, p_i denotes any pixel point in the neighborhood Ω_o of p_o, F_i denotes the adsorption force of p_i on p_o, and ω_i is the weight of that adsorption force, obtained by the following formula:

ω_i = exp(−d(p_i, p_o)² / (2σ_d²))

where d(p_i, p_o) denotes the distance between p_i and p_o, and σ_d denotes the standard deviation of the distances between all pixel points in the neighborhood Ω_o and p_o.
As a possible implementation, after correcting the position of the feature point and obtaining the corrected position, the method further includes:
changing the neighborhood of the feature point and continuing to correct the position of the feature point for the new neighborhood until the magnitude of the comprehensive adsorption force corresponding to the new neighborhood is lower than or equal to a preset threshold.
In a second aspect of the present application, an apparatus for locating image feature points is provided, the apparatus including:
a determining unit, configured to determine all pixel points in the neighborhood of the feature point, where each pixel point in the neighborhood exerts an adsorption effect on the feature point;
an adsorption force acquisition unit, configured to acquire the adsorption force of each pixel point on the feature point;
a comprehensive adsorption force acquisition unit, configured to acquire the comprehensive adsorption force of all pixel points on the feature point from the adsorption force of each individual pixel point;
and a positioning unit, configured to correct the position of the feature point according to the magnitude and direction of the comprehensive adsorption force to obtain the corrected position of the feature point.
As a possible implementation, the adsorption force acquisition unit includes a first acquisition subunit;
the first acquisition subunit is configured to acquire the adsorption force of the pixel point on the feature point according to the gradient vector of the pixel point, the position of the pixel point, and the position of the feature point.
As a possible implementation, the first acquisition subunit includes:
a distance acquisition subunit, configured to obtain the distance between the pixel point and the feature point from the position of the pixel point and the position of the feature point;
an included angle acquisition subunit, configured to obtain the included angle between the direction from the feature point to the pixel point and the gradient direction of the pixel point;
and an adsorption force acquisition subunit, configured to obtain the adsorption force of the pixel point on the feature point from the gradient vector of the pixel point, the distance between the pixel point and the feature point, and the included angle.
As a possible implementation, the apparatus further includes a normalization unit;
the normalization unit is configured to normalize the adsorption force of each pixel point on the feature point;
the comprehensive adsorption force acquisition unit is then specifically configured to:
acquire the comprehensive adsorption force of all pixel points on the feature point using the normalized adsorption force of each pixel point on the feature point.
As a possible implementation, the comprehensive adsorption force acquisition unit obtains the comprehensive adsorption force of all pixel points on the feature point by the following formula:

F(p_o) = Σ_{p_i ∈ Ω_o} ω_i · F_i

where p_o denotes the feature point, p_i denotes any pixel point in the neighborhood Ω_o of p_o, F_i denotes the adsorption force of p_i on p_o, and ω_i is the weight of that adsorption force, obtained by the following formula:

ω_i = exp(−d(p_i, p_o)² / (2σ_d²))

where d(p_i, p_o) denotes the distance between p_i and p_o, and σ_d denotes the standard deviation of the distances between all pixel points in the neighborhood Ω_o and p_o.
As a possible implementation, the apparatus further includes:
a cyclic correction unit, configured to change the neighborhood of the feature point and continue correcting the position of the feature point for the new neighborhood until the magnitude of the comprehensive adsorption force corresponding to the new neighborhood is lower than or equal to a preset threshold.
In a third aspect of the present application, a computer-readable storage medium is provided, on which a computer program is stored, which when executed by a processor, implements the method for locating image feature points as provided in the first aspect of the present application.
In a fourth aspect of the present application, a processor is provided, where the processor is configured to run a program, and the program is executed to perform the method for locating the image feature points according to the first aspect of the present application.
Compared with the prior art, the present application has the following advantages:
to determine the feature points of an image accurately, the positioning method provided by the application corrects the positions of inaccurately located feature points by exploiting the adsorption effect between pixel points in the image. First, all pixel points in the neighborhood of the feature point are determined. Next, the adsorption force of each pixel point in the neighborhood on the feature point is acquired. The adsorption forces of the individual pixel points are then combined into the comprehensive adsorption force of all pixel points on the feature point. Finally, the position of the inaccurately located feature point is corrected according to the magnitude and direction of the comprehensive adsorption force to obtain the corrected position.
The comprehensive adsorption force is a vector with both a magnitude and a direction: the magnitude characterizes the deviation of the current feature point position from the accurate position, and the direction indicates where the accurate position lies relative to the current position. Correction means moving the feature point from its current position, along the direction of the comprehensive adsorption force, by a distance corresponding to the magnitude of that force; the resulting position is the corrected position of the feature point. The method can therefore correct inaccurately located feature points and, compared with the prior art, significantly improves the accuracy of feature point positioning.
Drawings
To illustrate the technical solutions of the embodiments of the present application or of the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application; other drawings can be derived from them by those skilled in the art without creative effort.
Fig. 1 is a flowchart of a method for positioning image feature points according to an embodiment of the present disclosure;
fig. 2 is a distribution diagram of the comprehensive adsorption force exerted on each feature point in an eye image by the pixel points in its neighborhood, according to an embodiment of the present application;
fig. 3 is a flowchart of another image feature point positioning method according to an embodiment of the present disclosure;
fig. 4 is a schematic diagram illustrating an adsorption force of a pixel point in a feature point neighborhood to a feature point according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of a positioning apparatus for image feature points according to an embodiment of the present disclosure;
fig. 6 is a hardware structure diagram of a positioning device for image feature points according to an embodiment of the present disclosure.
Detailed Description
To make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application are described below clearly and completely with reference to the drawings. Obviously, the described embodiments are only a part of the embodiments of the present application, not all of them. All other embodiments derived by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present application.
Image recognition is widely applied in daily life, and the feature points of an image play a key role in such applications. The accuracy of feature point positioning affects both the recognizability of the image and the quality of its processing: the more accurately the feature points are located, the better the recognition and processing results. For example, accurate feature point positioning helps register images of the same object in two pictures more accurately and naturally, so that image stitching and image synthesis work better; it helps distinguish different objects in an image clearly; and it helps make image enhancement, image deformation, and three-dimensional modeling more realistic.
The following image "beautification" scenario illustrates, visually and intuitively, the significance of accurately positioning feature points.
Consider a mobile application (APP) with a face "beautification" function, for example one that applies makeup to a portrait. The APP beautifies a photo of a person stored locally on the mobile terminal by automatically adding cosmetic eyelashes. However, because the feature points (the pixel points forming the person's eyelids in the image) are identified inaccurately, the eyelid positions identified by the APP deviate from the eyelid positions where the cosmetic eyelashes should be added. The roots of the automatically added eyelashes therefore fail to coincide with the person's eyelids in the picture, and the beautification effect is poor. Conversely, if the feature points are identified accurately, the roots of the cosmetic eyelashes can be added exactly at the eyelid positions, achieving the intended effect.
Addressing the problem of inaccurate feature point positioning, the inventor found through research that adsorption exists between pixel points in an image. In the present application, the concept of "adsorption force" is used to describe this adsorption between points in detail. As a vector, the adsorption force has both a magnitude and a direction. It should be noted that the adsorption force differs from the mechanical interaction between physical objects; it describes the relationship between the data information contained in different pixel points of the same image, namely the interaction between a given point in the image and its surrounding pixel points.
Based on this, the inventor provides a method, an apparatus, a storage medium, and a processor for positioning image feature points, which correct the positions of inaccurately located feature points by exploiting the adsorption effect between pixel points in the image. The method, apparatus, storage medium, and processor are described below with reference to the embodiments and the drawings.
First embodiment
Referring to fig. 1, the figure is a flowchart of a method for positioning image feature points according to an embodiment of the present application.
As shown in fig. 1, a method for positioning image feature points provided in the embodiment of the present application includes:
step 101: determining all pixel points in the neighborhood of the characteristic point; and each pixel point in the neighborhood has an adsorption effect on the characteristic point.
In this embodiment, an image in which the feature points are located by using the prior art is obtained in advance. Due to the limited positioning accuracy, the positions of the feature points positioned in the image are deviated from the accurate positions. This embodiment is based on the pre-acquired image, and the feature points therein are repositioned.
According to the research of the inventor, adsorption exists between pixel points in the image and pixel points. The present embodiment locates feature points based on the adsorption between pixel points and pixel points. The pixel points in the neighborhood of the feature point can be used as the pixel points which have important effect on the correction of the position of the feature point in the embodiment, and the pixel points outside the neighborhood of the feature point have no important effect on the correction of the position of the feature point.
It should be noted that the size range of the neighborhood may be set according to the requirement of the positioning accuracy and the positioning speed of the feature point, and the larger the range of the neighborhood is, the more the number of the pixel points in the neighborhood is, the higher the positioning accuracy of the feature point is. However, the number of pixel points is correspondingly large, so that the calculation amount required for positioning is large, and the calculation amount is large, so that the positioning speed is slow. Therefore, on the premise of meeting the requirement of positioning accuracy, the neighborhood which is as small as possible can be selected, and the positioning speed is improved. The embodiment is not particularly limited, and the selection can be performed according to actual application scenarios and requirements.
For example, to determine the bit feature point, the neighborhood may be set to a range of 9 × 9 pixel points centered on the feature point; if the requirement for positioning accuracy is higher, the set neighborhood range can be expanded, for example, the range is set to be 15 × 15 pixel points with the feature point as the center; if the positioning speed needs to be increased, the set neighborhood range, for example, the range of 7 × 7 pixels centered on the feature point, can be narrowed.
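The neighborhood sizes discussed above (9 × 9, 15 × 15, 7 × 7) can be sketched as a coordinate-window helper; the function name and the clipping behavior at image borders are assumptions:

```python
import numpy as np

def neighborhood(p_feat, size, shape):
    """Coordinates (row, col) of all pixel points in a size x size
    window centered on the feature point, clipped to the image bounds;
    size is odd, e.g. 9 for a 9 x 9 neighborhood."""
    r = size // 2
    y0, x0 = p_feat
    ys = np.arange(max(0, y0 - r), min(shape[0], y0 + r + 1))
    xs = np.arange(max(0, x0 - r), min(shape[1], x0 + r + 1))
    yy, xx = np.meshgrid(ys, xs, indexing="ij")
    return np.stack([yy.ravel(), xx.ravel()], axis=1)
```

A 9 × 9 window in the image interior yields 81 pixel coordinates; near a corner it is clipped to the pixels that actually exist.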
Step 102: and acquiring the adsorption force of each pixel point in the neighborhood to the characteristic point.
In this embodiment, the position of the inaccurately located feature point is corrected using the adsorption effect between pixel points in the image. Each pixel point in the neighborhood of the feature point exerts an adsorption effect, and thus an adsorption force, on the feature point. The adsorption force of a pixel point on the feature point is generally related to the gradient vector of the pixel point: the larger the gradient vector, the larger the adsorption force.
Step 103: and obtaining the comprehensive adsorption force of all pixel points to the characteristic points by utilizing the adsorption force of each pixel point to the characteristic points.
The comprehensive adsorption force represents the combined adsorption effect, on the inaccurately located feature point, of all pixel points in its neighborhood. It is obtained from the per-pixel adsorption forces acquired in step 102, for which there are various implementations.
As an example, weights may be assigned to the adsorption forces according to the distance between each pixel point in the neighborhood and the feature point, and the weighted adsorption forces may then be accumulated to obtain the comprehensive adsorption force of all pixel points on the feature point.
Step 104: and correcting the position of the characteristic point according to the magnitude and the direction of the comprehensive adsorption force to obtain the corrected position of the characteristic point.
The comprehensive adsorption force obtained in step 103 has both a magnitude and a direction. The magnitude characterizes the deviation of the current feature point position from the accurate position: the larger the comprehensive adsorption force, the larger the position change of the feature point before and after correction. The direction indicates where the accurate position lies relative to the current position, that is, the direction in which the feature point's position needs to be corrected.
This step can be understood in conjunction with fig. 2, which shows the distribution of the comprehensive adsorption force exerted on each feature point in an eye image by its neighborhood. Each arrow in fig. 2 represents the comprehensive adsorption force exerted on the feature point at its location.
The length of an arrow indicates the magnitude of the comprehensive adsorption force, and its direction indicates the force's direction: the longer the arrow, the greater the comprehensive adsorption force on the feature point at that location, and correspondingly, the shorter the arrow, the smaller the force. The direction of an arrow indicates the direction in which the current feature point position needs to be corrected.
As can be seen from fig. 2, the arrows in the neighborhoods of edge points and corner points are noticeably longer, i.e., the comprehensive adsorption force on edge points and corner points is larger. The arrows in the region between the eyebrow and the upper eyelid are short, because the neighborhoods of those feature points contain no edge points or corner points, the adsorption of the individual pixel points on the feature point is relatively balanced, and the comprehensive adsorption force is therefore insignificant.
In addition, the arrows at the center of the eyebrow and the center of the pupil are also relatively short. Although those neighborhoods do contain edge points and corner points, the adsorption of the individual pixel points on the feature point is relatively balanced, so the comprehensive adsorption force is again insignificant.
Finally, the position of the inaccurately located feature point is corrected according to the magnitude and direction of the comprehensive adsorption force to obtain the corrected position. As an example, let the position of the feature point before correction be (x0, y0). If the magnitude of the comprehensive adsorption force is D and its direction is along the positive x-axis, the corrected feature point is (x', y'), where x' = x0 + D and y' = y0.
It should be noted that accurate positioning of the feature points can ultimately be achieved by iterating the positioning method provided by this embodiment several times. The number of iterations depends on factors such as the size of the chosen neighborhood and the distance between the feature point's position before correction and its accurate position. For example, if that distance is small, accurate positioning is likely to be completed within a small number of iterations.
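The iterative correction described above can be sketched as a loop that repeatedly moves the feature point along the comprehensive adsorption force until the force magnitude falls to a preset threshold; `force_fn`, the step rule, and the default values are assumptions:

```python
import numpy as np

def refine_feature_point(p0, force_fn, threshold=0.1, max_iters=50):
    """Iteratively correct a feature point: while the comprehensive
    adsorption force at the current position exceeds the preset
    threshold, move the point along the force direction by a distance
    equal to the force magnitude (the correction rule described above).

    force_fn(p) -> combined force vector at position p, e.g. built
    from the per-pixel forces in the neighborhood of p.
    """
    p = np.asarray(p0, float)
    for _ in range(max_iters):
        f = np.asarray(force_fn(p), float)
        if np.linalg.norm(f) <= threshold:
            break                     # comprehensive force small enough: done
        p = p + f                     # step of length |f| along the force
    return p
```

With a toy force field that always points halfway toward a true position at (10, 10), the loop converges to within the threshold of that position.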
In summary, the positioning method provided by the application corrects the position of an inaccurately located feature point by exploiting the adsorption effect between pixel points in the image. The comprehensive adsorption force is a vector with both a magnitude and a direction: the magnitude characterizes the deviation of the current feature point position from the accurate position, and the direction indicates where the accurate position lies relative to the current position. Correction means moving the feature point from its current position, along the direction of the comprehensive adsorption force, by a distance corresponding to the magnitude of that force; the resulting position is the corrected position. The method can therefore correct inaccurately located feature points and, compared with the prior art, significantly improves the accuracy of feature point positioning.
Based on the image feature point positioning method provided by the foregoing embodiment, the present application further provides a specific implementation manner of obtaining the adsorption force of any pixel point in the feature point neighborhood to the feature point, and obtaining the comprehensive adsorption force of all pixel points in the neighborhood to the feature point, thereby providing another image feature point positioning method. A second embodiment of the method will be described with reference to the accompanying drawings.
Second embodiment
Referring to fig. 3, this figure is a flowchart of another method for locating image feature points according to the embodiment of the present application.
As shown in fig. 3, a method for positioning image feature points provided in the embodiment of the present application includes:
Step 301: determining all pixel points in the neighborhood of the feature point.

In this embodiment, step 301 is the same as step 101 in the first embodiment; for the specific implementation of step 301, reference may be made to the description in the first embodiment, which is not repeated herein.
Step 302: and acquiring the adsorption force of the pixel points on the characteristic points according to the gradient vectors of the pixel points, the positions of the pixel points and the positions of the characteristic points.
For the feature point whose current positioning is inaccurate, the adsorption force of any pixel point in its neighborhood on the feature point is related to the gradient vector of that pixel point and to the positions of the two points.
The image is stored in the form of discrete digital signals, and if the image is regarded as a two-dimensional discrete function, the gradient vector of a pixel point in the image is actually the result of differentiating this two-dimensional discrete function. As a vector, the gradient has both a direction and a magnitude. The gradient direction of any pixel point in the neighborhood represents the direction in which the data information contained in that pixel point (such as its gray value) attracts the data information contained in the feature point, so the direction of the pixel point's adsorption force on the feature point coincides with the pixel point's gradient direction.
There are various formulas for calculating the gradient vector of a pixel point, and the gradient vector is a relatively mature concept in the field of image processing, so it is not described in detail herein. This step does not limit the specific way of obtaining the gradient vector of the pixel point.
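As an illustration only (the embodiment leaves the gradient operator open), the following is a minimal Python/NumPy sketch that computes per-pixel gradient vectors of a grayscale image with central differences; Sobel or other difference kernels would serve equally well:

```python
import numpy as np

def gradient_vectors(image):
    """Per-pixel gradient vectors of a grayscale image.

    Uses central differences via np.gradient; the embodiment does not
    prescribe a particular operator, so this choice is an assumption.
    Returns an array of shape (H, W, 2) holding (gx, gy) per pixel.
    """
    gy, gx = np.gradient(image.astype(float))  # derivatives along rows, then columns
    return np.stack([gx, gy], axis=-1)
```

Any operator that yields a gradient with both a magnitude and a direction fits the role the embodiment assigns to the gradient vector.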
As a specific implementation, step 302 may include the following steps.
Step 302a: and obtaining the distance between the pixel point and the characteristic point according to the position of the pixel point and the position of the characteristic point.
As an example, in step 302a, the distance between the two points can be obtained directly by using the Euclidean distance formula. Of course, other calculation formulas can also be used in this step to obtain the distance between the pixel point and the feature point. This embodiment does not limit the specific calculation manner for obtaining the distance between the pixel point and the feature point.
Step 302b: and obtaining an included angle between the direction of the characteristic point pointing to the pixel point and the gradient direction of the pixel point.
Step 302c: and obtaining the adsorption force of the pixel point on the characteristic point according to the gradient vector of the pixel point, the distance between the pixel point and the characteristic point, and the included angle.
To facilitate the understanding of steps 302a to 302c, see FIG. 4 and equation (1). Fig. 4 is a schematic diagram of an adsorption force of a pixel point in a feature point neighborhood to a feature point provided in this embodiment of the present application.
In FIG. 4, $p_o$ denotes the feature point whose current positioning is inaccurate, $p_i$ denotes any one pixel point in the neighborhood of $p_o$, $\psi_{p_i}$ denotes the adsorption force of $p_i$ on $p_o$, $g_{p_i}$ denotes the gradient vector of $p_i$, and $\theta$ is the included angle between the direction in which $p_o$ points to $p_i$ and the gradient direction of $p_i$.
Specifically, the adsorption force $\psi_{p_i}$ of $p_i$ on $p_o$ can be calculated using the following formula:

$$\psi_{p_i} = \frac{\|g_{p_i}\|\cos\theta}{d_{p_i}} \cdot \frac{g_{p_i}}{\|g_{p_i}\|} \qquad (1)$$

where $d_{p_i}$ is the distance between $p_i$ and $p_o$ obtained in step 302a.
It can be seen from formula (1) that the magnitude of the adsorption force is directly proportional to the norm of the pixel point's gradient vector: the larger the gradient vector, the larger the corresponding adsorption force.
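The exact form of formula (1) is rendered only as an image in the source, so the sketch below assumes one plausible reading consistent with the surrounding text: magnitude $\|g_{p_i}\|\cos\theta/d$ with the direction along the pixel's gradient. The function name and this reading are assumptions, not the patent's verbatim formula:

```python
import numpy as np

def adsorption_force(p_o, p_i, g_pi):
    """Sketch of the per-pixel adsorption force of p_i on feature point p_o.

    Assumed reading of formula (1): magnitude ||g_pi|| * cos(theta) / d,
    direction along the gradient g_pi, where theta is the angle between
    the direction from p_o to p_i and g_pi, and d = ||p_i - p_o||.
    """
    p_o = np.asarray(p_o, dtype=float)
    p_i = np.asarray(p_i, dtype=float)
    g_pi = np.asarray(g_pi, dtype=float)
    r = p_i - p_o
    d = np.linalg.norm(r)
    g_norm = np.linalg.norm(g_pi)
    if d == 0.0 or g_norm == 0.0:
        return np.zeros_like(g_pi)  # degenerate-case guard, not in the original
    cos_theta = np.dot(r, g_pi) / (d * g_norm)
    # magnitude ||g|| * cos(theta) / d, direction g / ||g||
    return (g_norm * cos_theta / d) * (g_pi / g_norm)
```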
Step 303: and normalizing the adsorption force of each pixel point on the characteristic points.
In an image there are, more or less, some noise points, and their presence affects the precise positioning of feature points and reduces positioning accuracy. For example, the gradient vectors of some noise points may be large, causing large adsorption forces; if the feature point is moved along such a force, an erroneous movement results. Therefore, in order to suppress the influence of noise points in the image on feature point positioning, the feature point positioning method provided by the application normalizes the adsorption force of each pixel point in the neighborhood on the feature point. The normalization weakens the adsorption force of noise points in the neighborhood on the feature point, avoiding positioning deviations caused by excessively large adsorption forces of noise points.
This embodiment provides one implementation of normalizing the adsorption force: the adsorption force obtained in step 302 is multiplied by a normalization factor to obtain the normalized adsorption force. The normalized adsorption force $\hat{\psi}_{p_i}$ of each pixel point in the neighborhood on the feature point can be expressed by formula (2):

$$\hat{\psi}_{p_i} = \eta \, \psi_{p_i} \qquad (2)$$
In the formula (2), η is a normalization factor. The calculation formula of the normalization factor η is:
Figure BDA0001783281660000114
in the formula (3), max (| | g)piI) is expressed at the feature point poAnd (4) the maximum value of the gradient vector norm corresponding to all the pixel points in the neighborhood.
Step 304: and obtaining the comprehensive adsorption force of all pixel points on the feature point by using the normalized adsorption force of each pixel point on the feature point.
In order to prevent the feature point from being trapped in a local optimum, the adsorption forces of pixel points close to the feature point and of those far from it are constrained separately. On one hand, the adsorption force of pixel points close to the feature point must not be allowed to dominate, which would trap the feature point in a local optimum and cause farther pixel points to be ignored; on the other hand, the adsorption force of far-away pixel points with large gradients (such as abnormal pixel points like noise points) must not dominate either. Therefore, this embodiment introduces a Gaussian-like model, which assigns a weight to the adsorption force of each pixel point in the neighborhood of $p_o$ on the feature point, so as to constrain the comprehensive adsorption force of all pixel points in the neighborhood on $p_o$.
The formula of the Gaussian-like model is specifically:

$$\omega_i = \exp\!\left(-\frac{d_{p_i}^2}{2\sigma_d^2}\right) \qquad (4)$$

In formula (4), $d_{p_i}$ is the distance between the feature point $p_o$ and any one pixel point $p_i$ in its neighborhood $\Omega_o$, $\sigma_d$ represents the standard deviation of the distances from all pixel points in $\Omega_o$ to $p_o$, and $\omega_i$ is the weight of the adsorption force of $p_i$ on the feature point $p_o$. From formula (4), within the neighborhood $\Omega_o$, the smaller the distance between a pixel point and the feature point $p_o$, the larger the corresponding adsorption force weight.
After the Gaussian-like model is introduced, the comprehensive adsorption force $\psi_o$ is calculated by the following formula:

$$\psi_o = \sum_{p_i \in \Omega_o} \omega_i \, \hat{\psi}_{p_i}$$

In the above formula, $\hat{\psi}_{p_i}$ may be the normalized adsorption force of any pixel point $p_i$ in the neighborhood $\Omega_o$ of the feature point $p_o$ obtained in step 303, or it may be the (unnormalized) adsorption force of $p_i$ on the feature point $p_o$ obtained in step 302.
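The comprehensive adsorption force is then just a weighted vector sum over the neighborhood; a sketch (accepting either the normalized forces from step 303 or the raw forces from step 302):

```python
import numpy as np

def comprehensive_adsorption_force(weights, forces):
    """Comprehensive adsorption force: weighted vector sum of the
    per-pixel adsorption forces over the neighborhood."""
    weights = np.asarray(weights, dtype=float)
    forces = np.asarray(forces, dtype=float)
    return np.sum(weights[:, None] * forces, axis=0)
```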
Step 305: and correcting the position of the characteristic point according to the magnitude and the direction of the comprehensive adsorption force to obtain the corrected position of the characteristic point.
In this embodiment, step 305 is the same as step 104 in the first embodiment, and the specific implementation of step 305 may refer to the description in the first embodiment, which is not repeated herein.
The above is the method for positioning image feature points provided in this embodiment of the present application. In the method, after all pixel points in the neighborhood of the feature point are determined, the adsorption force of each pixel point on the feature point is obtained according to the gradient vector of the pixel point, the position of the pixel point, and the position of the feature point; then, the adsorption force of each pixel point on the feature point is normalized, and the comprehensive adsorption force of all pixel points on the feature point is obtained using the normalized adsorption forces; finally, the position of the feature point is corrected according to the magnitude and direction of the comprehensive adsorption force to obtain the corrected position of the feature point.
In the method, in order to weaken the influence of noise points in the neighborhood of an inaccurately positioned feature point on its positioning, the adsorption force of each pixel point in the neighborhood on the feature point is normalized. In addition, when the comprehensive adsorption force of all pixel points in the neighborhood on the feature point is calculated, a weight is assigned to the adsorption force of each pixel point through a Gaussian-like model related to the distance between the pixel point and the feature point. This prevents the adsorption force of nearby pixel points from becoming too dominant, and at the same time prevents noise points and other abnormal pixel points, whose gradients may be large even at a distance, from interfering with the accurate positioning of the feature point. The accuracy of feature point positioning is therefore further improved.
In addition, it should be noted that, in general, with the image feature point positioning method provided by this embodiment, the correction of the feature point is not completed in one pass; it is a loop iteration process. That is, the position of the feature point after one correction is closer to its accurate position than the position before that correction, but may still require further correction. Therefore, after step 305, the method for positioning image feature points further includes:
step 306: and changing the neighborhood of the characteristic point, and continuously correcting the position of the characteristic point aiming at the new neighborhood until the magnitude of the comprehensive adsorption force corresponding to the new neighborhood is lower than or equal to a preset threshold value.
The following describes a specific implementation process of step 306 by way of an example.
For example, suppose there is an inaccurately positioned feature point A in the image, and the position of feature point A in the image is (x1, y1). By performing steps 301 to 305, the comprehensive adsorption force of all pixel points in the neighborhood of feature point A on feature point A is obtained, and the position of feature point A is corrected according to that force, giving the corrected position (x2, y2) of feature point A. Then, the pixel point at the (x2, y2) position is taken as the feature point, a new neighborhood corresponding to the feature point at (x2, y2) is determined, and steps 301 to 304 are repeated to obtain the comprehensive adsorption force of all pixel points in the new neighborhood on the feature point at (x2, y2).
Thereafter, the magnitude of the comprehensive adsorption force is compared with a preset threshold. The preset threshold is a comprehensive adsorption force limit set in this embodiment to improve the accuracy of feature point position correction: when the magnitude of the comprehensive adsorption force is greater than the preset threshold, the current position of the feature point deviates considerably from the accurate position, the accuracy of the current position is not high enough, and the position of the feature point needs further correction; when the magnitude of the comprehensive adsorption force is lower than or equal to the preset threshold, the deviation between the current position and the accurate position of the feature point is very small, the accuracy of the current position is high enough, and no further correction of the feature point position is needed.
That is, if the magnitude of the comprehensive adsorption force corresponding to the neighborhood of the pixel point at the (x2, y2) position is lower than or equal to the preset threshold, the (x2, y2) position can be taken as the corrected feature point, and accurate correction of the feature point position is completed in a single correction. If the comprehensive adsorption force corresponding to the neighborhood of the pixel point at the (x2, y2) position is greater than the preset threshold, the (x2, y2) position is corrected according to the comprehensive adsorption force to obtain the feature point position (x3, y3) after the second correction; the comprehensive adsorption force of the neighborhood of the pixel point at the (x3, y3) position on the feature point at (x3, y3) is then obtained and compared with the preset threshold, and so on, until the comprehensive adsorption force is determined to be lower than or equal to the preset threshold, at which point correction of the feature point position stops. In this way, accurate correction of the feature point position is completed through multiple cyclic corrections.
During the movement of the feature point, its neighborhood changes; after the neighborhood changes, the pixel points in the neighborhood change, and therefore the comprehensive adsorption force of the pixel points in the neighborhood on the feature point also changes. When the comprehensive adsorption force is sufficiently small or 0, the position correction of the feature point is finished, that is, the feature point has been corrected into place. As an example, the preset threshold may be set as needed; of course, for high accuracy it may be set to 0, that is, the correction of the position of the feature point is not stopped until the magnitude of the comprehensive adsorption force is 0.
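Putting steps 301 to 306 together, the iterative correction can be sketched as below. Here `integrated_force` stands for the whole per-neighborhood pipeline of steps 301 to 304, and `max_iters` is a safety bound not present in the original:

```python
import numpy as np

def locate_feature_point(p0, integrated_force, threshold=0.0, max_iters=100):
    """Iteratively correct a feature-point position.

    `integrated_force(p)` must return the comprehensive adsorption force
    of the neighborhood of position p (steps 301-304). Iteration stops
    when the force magnitude drops to `threshold` (0 for the strictest
    setting, per the text) or after `max_iters` corrections.
    """
    p = np.asarray(p0, dtype=float)
    for _ in range(max_iters):
        f = np.asarray(integrated_force(p), dtype=float)
        if np.linalg.norm(f) <= threshold:
            break  # corrected into place
        p = p + f  # move along the force direction by its magnitude
    return p
```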
Based on the image feature point positioning method provided by the foregoing embodiment, the present application also provides an image feature point positioning device. A third embodiment of the apparatus will be described with reference to the accompanying drawings.
Third embodiment
Referring to fig. 5, the figure is a schematic structural diagram of a positioning device for image feature points according to an embodiment of the present application.
As shown in fig. 5, an apparatus for locating an image feature point provided in an embodiment of the present application includes: a determination unit 51, an adsorption force acquisition unit 52, a comprehensive adsorption force acquisition unit 53, and a positioning unit 54.
The determining unit 51 is configured to determine all pixel points in the neighborhood of the feature point; each pixel point in the neighborhood has an adsorption effect on the characteristic point;
an adsorption force obtaining unit 52, configured to obtain an adsorption force of each pixel point on the feature point;
a comprehensive adsorption force obtaining unit 53, configured to obtain a comprehensive adsorption force of all the pixel points on the feature point by using an adsorption force of each pixel point on the feature point;
and the positioning unit 54 is configured to correct the position of the feature point according to the magnitude and the direction of the comprehensive adsorption force, and obtain a corrected position of the feature point.
In the image feature point positioning device provided in this embodiment of the present application, the device obtains the comprehensive adsorption force of the neighboring pixel points on the feature point whose current positioning is inaccurate by using the adsorption force of each neighboring pixel point on that feature point, and finally corrects the position of the feature point by using the comprehensive adsorption force. The comprehensive adsorption force is a vector with both a magnitude and a direction: the magnitude represents the deviation between the current position of the feature point and its accurate position, and the direction represents the direction of the accurate position relative to the current position. Correction means moving the feature point from its current position, along the direction of the comprehensive adsorption force, by a distance corresponding to the magnitude of that force; the moved-to position is the corrected position of the feature point. The device can therefore correct the position of an inaccurately positioned feature point, and compared with the prior art, the accuracy of feature point positioning is significantly improved.
In some possible implementations of the present application, the absorption force obtaining unit 52 may include: a first acquisition subunit;
the first obtaining subunit is configured to obtain, according to the gradient vector of the pixel point, and the position of the pixel point and the position of the feature point, an adsorption force of the pixel point to the feature point.
In some possible implementations of the present application, the first obtaining subunit includes: a distance acquisition subunit, an included angle acquisition subunit, and an adsorption force acquisition subunit.
The distance obtaining subunit is configured to obtain, according to the positions of the pixel points and the positions of the feature points, a distance between the pixel point and the feature point;
the included angle acquisition subunit is used for acquiring an included angle between the direction of the characteristic point pointing to the pixel point and the gradient direction of the pixel point;
and the adsorption force obtaining subunit is used for obtaining the adsorption force of the pixel point on the feature point according to the gradient vector of the pixel point, the distance and the included angle.
In some possible implementations of the present application, the device for positioning image feature points provided in this embodiment may further include: a normalization unit;
the normalization unit is used for normalizing the adsorption force of each pixel point on the characteristic point;
the comprehensive adsorption force acquisition unit 53 is specifically configured to:
and acquiring the comprehensive adsorption force of all the pixel points on the characteristic points by utilizing the adsorption force of each pixel point on the characteristic points after the characteristic points are normalized.
The comprehensive adsorption force obtaining unit 53 may specifically obtain the comprehensive adsorption force of all the pixel points on the feature point by using the following formula:

$$\psi_o = \sum_{p_i \in \Omega_o} \omega_i \, \psi_{p_i}$$

wherein $p_o$ represents the feature point, $p_i$ represents any one pixel point in the neighborhood $\Omega_o$ of $p_o$, $\psi_{p_i}$ represents the adsorption force of $p_i$ on $p_o$, and $\omega_i$ is the weight of the adsorption force $\psi_{p_i}$, obtained by the following formula:

$$\omega_i = \exp\!\left(-\frac{d_{p_i}^2}{2\sigma_d^2}\right)$$

wherein $d_{p_i}$ represents the distance between $p_i$ and $p_o$, and $\sigma_d$ represents the standard deviation of the distances from all pixel points in the neighborhood $\Omega_o$ to $p_o$.
In some possible implementations of the present application, the device for positioning image feature points provided in this embodiment may further include:
and the cyclic correction unit is used for changing the neighborhood of the characteristic point and continuously correcting the position of the characteristic point aiming at the new neighborhood until the magnitude of the comprehensive adsorption force corresponding to the new neighborhood is lower than or equal to a preset threshold value.
In this embodiment, the preset threshold is a comprehensive adsorption force limit set to improve the accuracy of feature point position correction: when the magnitude of the comprehensive adsorption force is greater than the preset threshold, the current position of the feature point deviates considerably from the accurate position, and the position of the feature point needs further correction; when the magnitude of the comprehensive adsorption force is lower than or equal to the preset threshold, the deviation between the current position and the accurate position of the feature point is very small, and no further correction is needed. The cyclic correction unit of the positioning device can correct the feature point multiple times in succession, finally achieving accurate correction of the feature point position.
The embodiment of the present application further provides a storage medium, on which a program is stored, and when the program is executed by a processor, the program implements some or all of the steps in the positioning method of the image feature points protected in the first embodiment and the second embodiment of the present application. The storage medium may be a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and various media capable of storing program codes.
The embodiment of the present application provides a processor, where the processor is configured to execute a program, where when the program runs, part or all of the steps in the method for positioning image feature points protected in the first embodiment and the second embodiment are executed.
Based on the storage medium and the processor provided by the foregoing embodiments, the present application also provides a device for positioning image feature points.
Referring to fig. 6, this figure is a hardware structure diagram of the positioning device for image feature points provided in this embodiment.
As shown in fig. 6, the apparatus for locating image feature points includes: memory 601, processor 602, communication bus 603, and communication interface 604.
The memory 601 stores a program that can be executed on the processor, and when the program is executed, part or all of the steps of the image feature point positioning method provided in the first embodiment and the second embodiment of the present application are implemented. The memory 601 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
In the positioning apparatus, the processor 602 and the memory 601 transmit signaling, logic instructions, and the like through a communication bus. The positioning device is capable of communicative interaction with other devices via the communication interface 604.
By executing the method through the program, the position of the characteristic point with inaccurate positioning can be corrected, and compared with the prior art, the accuracy of positioning the characteristic point is obviously improved.
It should be understood that in the present application, "at least one" means one or more, "a plurality" means two or more. "and/or" for describing an association relationship of associated objects, indicating that there may be three relationships, e.g., "a and/or B" may indicate: only A, only B and both A and B are present, wherein A and B may be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of single item(s) or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", wherein a, b, c may be single or plural.
The foregoing is merely a preferred embodiment of the present application and is not intended to limit the present application in any way. Although the present application has been disclosed above by way of preferred embodiments, these are not intended to limit it. Those skilled in the art can make numerous possible variations and modifications to the disclosed technical solution, or modify it into equivalent embodiments, using the methods and technical content disclosed above, without departing from the scope of the technical solution of the present application. Therefore, any simple modification, equivalent change, or refinement made to the above embodiments according to the technical essence of the present application, without departing from the content of the technical solution of the present application, still falls within the protection scope of the technical solution of the present application.

Claims (9)

1. A method for locating image feature points, comprising:
determining all pixel points in the neighborhood of the characteristic point; each pixel point in the neighborhood has an adsorption effect on the characteristic point;
acquiring the adsorption force of each pixel point on the characteristic point;
acquiring the comprehensive adsorption force of all pixel points on the characteristic points by utilizing the adsorption force of each pixel point on the characteristic points;
correcting the position of the characteristic point according to the magnitude and the direction of the comprehensive adsorption force to obtain the corrected position of the characteristic point;
the comprehensive adsorption force of all the pixel points on the feature point is obtained by using the adsorption force of each pixel point on the feature point, specifically by the following formula:

$$\psi_o = \sum_{p_i \in \Omega_o} \omega_i \, \psi_{p_i}$$

wherein $p_o$ represents the feature point, $p_i$ represents any one pixel point in the neighborhood $\Omega_o$ of $p_o$, $\psi_{p_i}$ represents the adsorption force of $p_i$ on $p_o$, and $\omega_i$ is the weight of the adsorption force $\psi_{p_i}$, obtained by the following formula:

$$\omega_i = \exp\!\left(-\frac{d_{p_i}^2}{2\sigma_d^2}\right)$$

wherein $d_{p_i}$ represents the distance between $p_i$ and $p_o$, and $\sigma_d$ represents the standard deviation of the distances from all pixel points in the neighborhood $\Omega_o$ to $p_o$.
2. The method according to claim 1, wherein the obtaining of the adsorption force of each pixel point to the feature point specifically includes:
and acquiring the adsorption force of the pixel point on the characteristic point according to the gradient vector of the pixel point, the position of the pixel point and the position of the characteristic point.
3. The method for locating an image feature point according to claim 2, wherein the obtaining of the adsorption force of the pixel point to the feature point according to the gradient vector of the pixel point, and the position of the pixel point and the position of the feature point specifically includes:
obtaining the distance between the pixel point and the characteristic point according to the position of the pixel point and the position of the characteristic point;
obtaining the included angle between the direction of the characteristic point pointing to the pixel point and the gradient direction of the pixel point;
and obtaining the adsorption force of the pixel point to the characteristic point according to the gradient vector of the pixel point, the distance between the pixel point and the characteristic point and the included angle.
4. The method for locating image feature points according to any one of claims 1 to 3, wherein after the obtaining of the adsorption force of each of the pixel points to the feature point, the method further comprises:
normalizing the adsorption force of each pixel point on the characteristic point;
acquiring the comprehensive adsorption force of all pixel points on the feature points by using the adsorption force of each pixel point on the feature points, specifically:
and acquiring the comprehensive adsorption force of all the pixel points on the characteristic points by utilizing the adsorption force of each pixel point on the characteristic points after the characteristic points are normalized.
5. The method according to any one of claims 1 to 3, wherein after the correcting the position of the feature point and obtaining the corrected position of the feature point, the method further comprises:
and changing the neighborhood of the characteristic point, and continuously correcting the position of the characteristic point aiming at the new neighborhood until the magnitude of the comprehensive adsorption force corresponding to the new neighborhood is lower than or equal to a preset threshold value.
6. An apparatus for locating an image feature point, comprising:
the determining unit is used for determining all pixel points in the neighborhood of the characteristic point; each pixel point in the neighborhood has an adsorption effect on the characteristic point;
the adsorption force acquisition unit is used for acquiring the adsorption force of each pixel point on the characteristic point;
the comprehensive adsorption force acquisition unit is used for acquiring the comprehensive adsorption force of all the pixel points on the characteristic points by utilizing the adsorption force of each pixel point on the characteristic points;
the positioning unit is used for correcting the position of the characteristic point according to the magnitude and the direction of the comprehensive adsorption force to obtain the corrected position of the characteristic point;
the comprehensive adsorption force acquisition unit obtains the comprehensive adsorption force of all the pixel points on the feature point through the following formula:

$$\vec{F}(p_o) = \sum_{p_i \in \Omega_o} \omega_i \, \vec{f}(p_i, p_o)$$

where $p_o$ represents the feature point, $p_i$ represents any pixel point in the neighborhood $\Omega_o$ of $p_o$, $\vec{f}(p_i, p_o)$ represents the adsorption force of $p_i$ on $p_o$, and $\omega_i$ is the weight of the adsorption force $\vec{f}(p_i, p_o)$, obtained by the following formula:

$$\omega_i = \exp\!\left(-\frac{d(p_i, p_o)^2}{2\sigma_d^2}\right)$$

where $d(p_i, p_o)$ represents the distance between $p_i$ and $p_o$, and $\sigma_d$ represents the standard deviation of the distances from all pixel points in the neighborhood $\Omega_o$ to $p_o$.
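The weighted summation described above can be sketched as follows. This is a minimal sketch: the original weight formula is an image placeholder in this extraction, so the Gaussian weight form below is an assumption consistent with the distance and standard-deviation symbols the text describes; function and variable names are illustrative.

```python
import numpy as np

def integrated_force(forces, pixels, p_o):
    """Comprehensive adsorption force on feature point p_o: a weighted sum
    of the per-pixel adsorption forces over the neighborhood.

    The weight for each pixel is assumed Gaussian in its distance to p_o,
    scaled by sigma_d, the standard deviation of all neighborhood
    distances, matching the symbols described in the claim.
    """
    forces = np.asarray(forces, dtype=float)   # one 2D force per pixel
    pixels = np.asarray(pixels, dtype=float)   # pixel coordinates
    d = np.linalg.norm(pixels - np.asarray(p_o, dtype=float), axis=1)
    sigma_d = d.std()
    if sigma_d == 0:
        weights = np.ones_like(d)  # degenerate case: all distances equal
    else:
        weights = np.exp(-d**2 / (2 * sigma_d**2))
    return (weights[:, None] * forces).sum(axis=0)
```

For two pixels equidistant from the feature point, the weights are equal and the result reduces to a plain vector sum of the per-pixel forces.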
7. The apparatus according to claim 6, wherein the suction force acquisition unit includes a first acquisition subunit;
the first acquisition subunit is configured to acquire the adsorption force of the pixel point on the feature point according to the gradient vector of the pixel point, the position of the pixel point, and the position of the feature point.
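The per-pixel adsorption force of claim 7 could be sketched as follows. The claim states only that the force is derived from the pixel's gradient vector and the two positions, so the specific combination here — gradient magnitude applied along the direction from the feature point toward the pixel — is a hypothetical formulation, and the names are illustrative.

```python
import numpy as np

def adsorption_force(grad_i, pos_i, pos_o):
    """Adsorption force of pixel p_i on feature point p_o.

    Hypothetical formulation: magnitude taken from the gradient vector at
    p_i, direction pointing from p_o toward p_i, so strong edges attract
    the feature point. The exact combination is not disclosed in the claim.
    """
    direction = np.asarray(pos_i, dtype=float) - np.asarray(pos_o, dtype=float)
    dist = np.linalg.norm(direction)
    if dist == 0:
        return np.zeros(2)  # pixel coincides with the feature point
    return np.linalg.norm(grad_i) * direction / dist
```

A pixel with a strong gradient thus pulls the feature point toward it, which matches the "adsorption" intuition in the claims.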
8. A computer-readable storage medium, characterized in that the medium has stored thereon a computer program which, when executed by a processor, implements the method for locating image feature points according to any one of claims 1 to 5.
9. A processor, characterized in that the processor is configured to run a program which, when running, executes the method for locating image feature points according to any one of claims 1 to 5.
CN201811002527.2A 2018-08-30 2018-08-30 Image feature point positioning method and device, storage medium and processor Active CN109344710B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811002527.2A CN109344710B (en) 2018-08-30 2018-08-30 Image feature point positioning method and device, storage medium and processor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811002527.2A CN109344710B (en) 2018-08-30 2018-08-30 Image feature point positioning method and device, storage medium and processor

Publications (2)

Publication Number Publication Date
CN109344710A CN109344710A (en) 2019-02-15
CN109344710B true CN109344710B (en) 2020-12-18

Family

ID=65296993

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811002527.2A Active CN109344710B (en) 2018-08-30 2018-08-30 Image feature point positioning method and device, storage medium and processor

Country Status (1)

Country Link
CN (1) CN109344710B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113077513B (en) * 2021-06-03 2021-10-29 深圳市优必选科技股份有限公司 Visual positioning method and device and computer equipment

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9659234B1 (en) * 2015-12-01 2017-05-23 Intel Corporation Adaptive selection of scale invariant image feature keypoints
CN106339693A (en) * 2016-09-12 2017-01-18 华中科技大学 Positioning method of face characteristic point under natural condition
CN106897662B (en) * 2017-01-06 2020-03-10 北京交通大学 Method for positioning key feature points of human face based on multi-task learning
CN106991388B (en) * 2017-03-27 2020-04-21 中国科学院自动化研究所 Key point positioning method
CN107392215A (en) * 2017-08-02 2017-11-24 焦点科技股份有限公司 A kind of multigraph detection method based on SIFT algorithms
CN107742312A (en) * 2017-10-09 2018-02-27 沈阳东软医疗系统有限公司 A kind of method and apparatus that key point is positioned in medical image

Also Published As

Publication number Publication date
CN109344710A (en) 2019-02-15

Similar Documents

Publication Publication Date Title
CN110175558B (en) Face key point detection method and device, computing equipment and storage medium
US10062198B2 (en) Systems and methods for generating computer ready animation models of a human head from captured data images
US20160283780A1 (en) Positioning feature points of human face edge
US20200020173A1 (en) Methods and systems for constructing an animated 3d facial model from a 2d facial image
US9916495B2 (en) Face comparison device, method, and recording medium
CN113128449A (en) Neural network training method and device for face image processing, and face image processing method and device
WO2017186016A1 (en) Method and device for image warping processing and computer storage medium
US20230252664A1 (en) Image Registration Method and Apparatus, Electronic Apparatus, and Storage Medium
CN104715447A (en) Image synthesis method and device
CN108734078B (en) Image processing method, image processing apparatus, electronic device, storage medium, and program
US20230230305A1 (en) Online streamer avatar generation method and apparatus
CN110263745A (en) A kind of method and device of pupil of human positioning
CN114723888B (en) Three-dimensional hair model generation method, device, equipment, storage medium and product
WO2017070923A1 (en) Human face recognition method and apparatus
CN103745209A (en) Human face identification method and system
CN113689503A (en) Target object posture detection method, device, equipment and storage medium
CN108615256A (en) A kind of face three-dimensional rebuilding method and device
CN113421204A (en) Image processing method and device, electronic equipment and readable storage medium
CN112488067B (en) Face pose estimation method and device, electronic equipment and storage medium
CN104091148A (en) Facial feature point positioning method and device
CN113658324A (en) Image processing method and related equipment, migration network training method and related equipment
CN110084219A (en) Interface alternation method and device
CN109344710B (en) Image feature point positioning method and device, storage medium and processor
JP7143931B2 (en) Control method, learning device, identification device and program
CN109934058B (en) Face image processing method, face image processing device, electronic apparatus, storage medium, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant