CN110533760B - Ambient light information determination method, device, electronic equipment and storage medium


Publication number
CN110533760B
CN110533760B
Authority
CN
China
Prior art keywords
color
sample
dimensional face
face image
ambient light
Prior art date
Legal status
Active
Application number
CN201910707522.8A
Other languages
Chinese (zh)
Other versions
CN110533760A (en
Inventor
刘晓强
郑文
Current Assignee
Beijing Dajia Internet Information Technology Co Ltd
Original Assignee
Beijing Dajia Internet Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Dajia Internet Information Technology Co Ltd
Priority to CN201910707522.8A
Publication of CN110533760A
Application granted
Publication of CN110533760B
Status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/90 Determination of colour characteristics
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/50 Lighting effects
    • G06T 15/55 Radiosity
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30196 Human being; Person
    • G06T 2207/30201 Face
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B 20/00 Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B 20/40 Control techniques providing energy savings, e.g. smart controller or presence detection

Abstract

The disclosure relates to an ambient light information determination method and apparatus, an electronic device and a storage medium. The method comprises: determining a plurality of sampling points in a three-dimensional face image; determining, for each sampling point, its direction information and the ratio of the color value of its actual color to the color value of its sample color; constructing a spherical harmonic illumination model equation according to the association between the color values of the ambient light in the three-dimensional face image along each direction component and the ratio; and calculating the color values of the ambient light along each direction component from the system of equations formed by the spherical harmonic illumination model equations corresponding to the sampling points. According to the embodiments of the disclosure, the obtained color values of the ambient light along each direction component represent both the components of each color in the ambient light and the color components of the ambient light along each direction, i.e. they contain both the color values and the direction information of the illumination. With this illumination information, accurate illumination information can be recovered and the rendering effect of the three-dimensional face image can be improved.

Description

Ambient light information determination method, device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an ambient light information determining method, an ambient light information determining device, an electronic apparatus, and a storage medium.
Background
In the related art, after a three-dimensional face image is constructed from a two-dimensional face image, the three-dimensional face image needs to be rendered. The rendering needs to be performed based on ambient light information, so that the rendering result of the three-dimensional face image can accurately represent how brightly the light falls on the face.
However, the current method determines the ambient light information mainly based on the color of the face and the brightness of the pixels, which can only obtain the brightness information of the ambient light but cannot obtain information such as the color and the irradiation direction of the ambient light, so that the rendering effect on the three-dimensional face image is poor.
Disclosure of Invention
The present disclosure provides an ambient light information determination method, an ambient light information determination device, an electronic apparatus, and a storage medium to at least solve the problem that ambient light information determined in the related art does not contain information such as color, irradiation direction, and the like of ambient light. The technical scheme of the present disclosure is as follows:
according to a first aspect of embodiments of the present disclosure, there is provided an ambient light information determining method, including:
processing key points in the two-dimensional face image based on the three-dimensional deformation model to generate a three-dimensional face image;
determining a plurality of sampling points in the three-dimensional face image;
Determining direction information of the sampling point and a ratio of a color value of an actual color of the sampling point to a color value of a sample color of the sampling point, wherein the direction information comprises a plurality of direction components;
constructing a spherical harmonic illumination model equation according to the association relation between the color value of the ambient light in the three-dimensional face image along each direction component and the ratio;
and calculating the color value of the ambient light along each direction component according to an equation set formed by spherical harmonic illumination model equations corresponding to the sampling points.
Optionally, the determining the plurality of sampling points in the three-dimensional face image includes:
determining sample points positioned at preset positions in the three-dimensional face image;
and determining points outside a preset area from the sample points as the sampling points, wherein the preset area comprises:
nostril area, lateral face area, glasses area.
Optionally, the direction information of the sampling point includes the normal of a tangent plane whose tangent point is the sampling point, where the curved surface formed by the three-dimensional face image is tangent to the tangent plane at the sampling point.
Optionally, the ratio is determined by:
Determining the position of the sampling point;
inquiring historical sample points positioned at the positions in a plurality of historical three-dimensional face images stored in a database in advance;
calculating the average value of the color values of a plurality of historical sample points as the color value of the sample color;
wherein the actual color comprises color values a_i of n channels and the sample color comprises color values b_i of n channels, the i-th channel of the actual color and the i-th channel of the sample color correspond to the same color, and 1 ≤ i ≤ n;
calculating the ratio of the color value of the actual color to the color value of the sample color as
I = (a_1/b_1, a_2/b_2, ..., a_n/b_n).
Optionally, the normal includes an x-direction component, a y-direction component and a z-direction component, and constructing a spherical harmonic illumination model equation according to an association relationship between a color value of ambient light in the three-dimensional face image along each direction component and a color value of the sampling point includes:
constructing a spherical harmonic illumination model equation according to the association relation between the color value lighta of the ambient light in the three-dimensional face image along the x-direction component normal.x, the color value lightb along the y-direction component normal.y, the color value lightc along the z-direction component normal.z and the ratio I of the actual color of the sampling point to the sample color of the sampling point:
I=normal.x*lighta+normal.y*lightb+normal.z*lightc。
According to a second aspect of the embodiments of the present disclosure, there is provided an ambient light information determining apparatus, including:
the three-dimensional generation module is configured to execute processing on key points in the two-dimensional face image based on the three-dimensional deformation model so as to generate a three-dimensional face image;
a sampling point determination module configured to perform determining a plurality of sampling points in the three-dimensional face image;
a direction determination module configured to perform determining direction information of the sampling point, wherein the direction information includes a plurality of direction components;
a ratio determination module configured to perform determining a ratio of color values of an actual color of the sampling point to color values of a sample color of the sampling point;
an equation construction module configured to perform construction of a spherical harmonic illumination model equation from an association relationship between a color value of the ambient light in the three-dimensional face image along each of the direction components and the ratio;
and the color value calculation module is configured to execute an equation set formed according to spherical harmonic illumination model equations corresponding to the sampling points and calculate the color value of the ambient light along each direction component.
Optionally, the sampling point determining module includes:
A sample point determination sub-module configured to perform determining a sample point located at a preset position in the three-dimensional face image;
a sampling point determination submodule configured to perform determination of a point located outside a preset area among the sampling points as the sampling point, wherein the preset area includes:
nostril area, lateral face area, glasses area.
Optionally, the direction information of the sampling point includes the normal of a tangent plane whose tangent point is the sampling point, where the curved surface formed by the three-dimensional face image is tangent to the tangent plane at the sampling point.
Optionally, the ratio determining module includes:
a position determination sub-module configured to perform determining a position of the sampling point;
a sample query sub-module configured to perform a query of a plurality of historical three-dimensional face images pre-stored in a database for historical sample points located at the location;
a color value calculation sub-module configured to perform calculation of a mean value of color values of a plurality of the history sample points as a color value of the sample color;
wherein the actual color comprises color values a_i of n channels and the sample color comprises color values b_i of n channels, the i-th channel of the actual color and the i-th channel of the sample color correspond to the same color, and 1 ≤ i ≤ n;
a ratio calculation sub-module configured to perform calculation of the ratio of the color value of the actual color to the color value of the sample color as
I = (a_1/b_1, a_2/b_2, ..., a_n/b_n).
Optionally, the normal includes an x-direction component, a y-direction component, and a z-direction component, and the equation construction module is configured to perform constructing a spherical harmonic illumination model equation according to an association of a color value lighta of ambient light in the three-dimensional face image along the x-direction component normal.x, a color value lightb along the y-direction component normal.y, a color value lightc along the z-direction component normal.z, and a ratio I of an actual color of the sampling point to a sample color of the sampling point:
I=normal.x*lighta+normal.y*lightb+normal.z*lightc。
according to a third aspect of embodiments of the present disclosure, there is provided an electronic device, comprising:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the ambient light information determination method as described in any of the embodiments above.
According to a fourth aspect of embodiments of the present disclosure, there is provided a storage medium, wherein instructions in the storage medium, when executed by a processor of an electronic device, enable the electronic device to perform the ambient light information determination method as described in any one of the embodiments above.
According to a fifth aspect of embodiments of the present disclosure, there is provided a computer program product configured to perform the ambient light information determination method of any one of the embodiments described above.
The technical scheme provided by the embodiment of the disclosure at least brings the following beneficial effects:
according to the embodiments of the disclosure, a spherical harmonic illumination model equation can be constructed according to the association between the color values of the ambient light in the three-dimensional face image along each direction component and the ratio of the color value of the actual color to the color value of the sample color, and the color values of the ambient light along each direction component can then be calculated from the system of equations formed by the spherical harmonic illumination model equations constructed for the plurality of sampling points. The obtained color values contain both the color values and the direction information of the illumination. With this illumination information, accurate illumination information can be recovered and the rendering effect on the three-dimensional face image can be improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure and do not constitute an undue limitation on the disclosure.
Fig. 1 is a schematic flow chart of an ambient light information determination method shown in accordance with an embodiment of the present disclosure.
Fig. 2 is a schematic diagram of one sample point shown in accordance with an embodiment of the present disclosure.
Fig. 3 is a schematic diagram of a rendering effect shown according to an embodiment of the present disclosure.
Fig. 4 is a schematic flow chart diagram illustrating another ambient light information determination method according to an embodiment of the present disclosure.
Fig. 5 is a schematic flow chart diagram illustrating one determination of a ratio according to an embodiment of the present disclosure.
Fig. 6 is a schematic flow chart diagram illustrating yet another ambient light information determination method according to an embodiment of the present disclosure.
Fig. 7 is a hardware configuration diagram of a device in which the ambient light information determining apparatus is located, according to an embodiment of the present disclosure.
Fig. 8 is a schematic block diagram of an ambient light information determining apparatus shown according to an embodiment of the present disclosure.
Fig. 9 is a schematic block diagram of a sample point determination module shown in accordance with an embodiment of the present disclosure.
FIG. 10 is a schematic block diagram of a ratio determination module shown in accordance with an embodiment of the disclosure.
Detailed Description
In order to enable those skilled in the art to better understand the technical solutions of the present disclosure, the technical solutions of the embodiments of the present disclosure will be clearly and completely described below with reference to the accompanying drawings.
It should be noted that the terms "first," "second," and the like in the description and claims of the present disclosure and in the foregoing figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the disclosure described herein may be capable of operation in sequences other than those illustrated or described herein. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present disclosure as detailed in the accompanying claims.
Fig. 1 is a schematic flow chart of an ambient light information determination method shown in accordance with an embodiment of the present disclosure. The method for determining the ambient light information can be applied to electronic equipment capable of rendering three-dimensional face images based on the ambient light information, wherein the electronic equipment can be a mobile phone, a tablet computer, a wearable device and other terminals, and can also be a server.
As shown in fig. 1, the ambient light information determining method may include the steps of:
in step S1, processing key points in a two-dimensional face image based on a three-dimensional deformation model to generate a three-dimensional face image;
in one embodiment, a plurality of key points may be determined in a two-dimensional face image (the number and positions of the key points may be set as required), and then the plurality of key points are processed based on a three-dimensional deformation model (3 d morphable model, abbreviated as 3 dmm) to generate the three-dimensional face image. Specifically, a face in a two-dimensional face image can be represented by a set of parameterized bases (which can be understood as feature vectors), then key points in the two-dimensional face image are processed according to the bases to obtain key points in a three-dimensional face image, and the key points in the three-dimensional face image are determined because the three-dimensional face image can be represented based on the key points in the three-dimensional face image, so that the three-dimensional face image is determined.
In step S2, a plurality of sampling points are determined in the three-dimensional face image;
in one embodiment, for the three-dimensional face image, a point located at a preset position therein may be taken as a sample point, the position of each sample point on the face of the three-dimensional face image is preset, and an identification, such as a number, may be set for each sample point, respectively, so that the sample point of a specific position is determined according to the identification.
Further, for a sample point in the three-dimensional face image, a point outside a preset area can be determined as a sampling point in the sample point, wherein the preset area comprises at least one of the following: nostril area, lateral face area, glasses area.
For the preset areas such as the nostril area, the side face area and the glasses area, the ambient light reaching them is blocked: the nostril area is shielded by the nose, the side face area by the cheek, and the glasses area by the frame and the lenses. The ambient light information in these preset areas is therefore inconsistent with the actual ambient light information; for example, no matter what color the ambient light is, the nostril area appears black because it is shielded by the nose. Because not everyone wears glasses, whether glasses exist in the three-dimensional face image can first be identified, and if so, the area where the glasses are located is determined as the glasses area.
In this embodiment, sample points located outside the preset areas may be selected as sampling points. Since such a sampling point lies outside the preset areas, it is generally not blocked, so its ambient light information matches the actual ambient light information; performing the subsequent processing on these sampling points therefore helps ensure that the ambient light information is determined accurately.
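A sketch of this selection step is given below, assuming for illustration that each sample point carries a numeric identifier and that the identifiers falling inside the preset areas are known in advance:

import numpy as np

def select_sampling_points(sample_ids, nostril_ids, side_face_ids, glasses_ids):
    # Keep only the sample points that lie outside the preset (possibly occluded) areas.
    excluded = set(nostril_ids) | set(side_face_ids) | set(glasses_ids)
    mask = np.array([sid not in excluded for sid in sample_ids])
    return np.asarray(sample_ids)[mask]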
Fig. 2 is a schematic diagram of one sample point shown in accordance with an embodiment of the present disclosure.
When the component of a point perpendicular to the plane of the three-dimensional face image is not considered, the positions of the sampling points in the three-dimensional face image are the same as their positions in the corresponding two-dimensional face image. Therefore, as shown in fig. 2, taking the two-dimensional face image as an example, 40,000 sample points may be set, and 1,200 of them may be determined as sampling points. The preset areas in fig. 2 include the side face area and the nostril area, i.e. the sampling points determined among the sample points are located outside the side face area and the nostril area, for example mainly in the vicinity of the tip of the nose, the bridge of the nose, the chin and the philtrum.
In step S3, determining direction information of the sampling point and a ratio of a color value of an actual color of the sampling point to a color value of a sample color of the sampling point, wherein the direction information includes a plurality of direction components;
in one embodiment, the direction information of the sampling point may be represented by a normal of the sampling point, which may be a three-dimensional vector.
The method for determining the normal line of the sampling point can be selected according to the need, for example, a triangle can be constructed by taking the sampling point as a vertex, a plane where the triangle is located and a curved surface formed by a three-dimensional face image (for example, the face surface in the three-dimensional face image) are tangent to the sampling point, and then the normal line of the plane where the triangle is located is calculated as the normal line of the sampling point; for example, two non-parallel vectors may be determined by using a sampling point as an intersection point of two vectors, a plane formed by the two vectors and a curved surface formed by the three-dimensional face image are tangent to the sampling point, and then a normal line of the plane formed by the two vectors is calculated as a normal line of the sampling point.
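A sketch of the triangle-based way of computing the normal is given below (the choice of the two neighbouring vertices and the unit normalization are assumptions made for illustration):

import numpy as np

def sampling_point_normal(p, q, r):
    # p is the sampling point; q and r are two neighbouring mesh vertices chosen so that
    # the plane of triangle (p, q, r) is tangent to the face surface at p.
    n = np.cross(q - p, r - p)      # normal of the plane in which the triangle lies
    return n / np.linalg.norm(n)    # unit normal: (normal.x, normal.y, normal.z)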
In one embodiment, the color value of the sample color of the sampling point may be obtained by querying a database, for example, a plurality of historical three-dimensional face images may be pre-stored in the database, where the skin colors of faces in the stored historical three-dimensional face images are the same or similar, for example, all yellow skin, all white skin, or all black skin, so that the accuracy of determining the sample color is favorably not affected by the skin color difference.
Each of the history three-dimensional face images may include a plurality of history sample points, the position of each of the history sample points on the face of the history three-dimensional face image being set in advance, and an identification, such as a number, may be set for each of the history sample points, respectively. Further, for the sampling points in the three-dimensional face image, the positions of the sampling points can be determined first, and then the identification corresponding to the positions, such as the numbers of the sampling points, can be determined.
Then, the historical sample points located at that position are queried in the plurality of historical three-dimensional face images in the database, for example by querying the historical sample points having the same identification (specifically, the same number) as the sampling point; the queried historical sample points then occupy the same position on the face as the sampling point. The color value of the sample color can then be determined from the color values of these historical sample points, for example by weighted summation or by taking the mean, and the obtained result is used as the color value of the sample color. The color value of the actual color of the sampling point can be obtained directly from the three-dimensional face image.
The sample color and the actual color may comprise color values of multiple channels, for example, a color value of a red channel, a color value of a green channel, and a color value of a blue channel, in which case the color values of the sample color and the actual color are three-dimensional vectors containing three channel color values.
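A sketch of this look-up and of the per-channel ratio is given below, assuming for illustration that the database has already been arranged as an array of per-image RGB values indexed by sample-point number:

import numpy as np

def color_ratio(actual_color, history_colors):
    # actual_color:   (3,) RGB color of the sampling point in the current three-dimensional face image.
    # history_colors: (num_images, 3) RGB colors of the historical sample points at the same position.
    sample_color = history_colors.mean(axis=0)   # mean over the historical three-dimensional face images
    return actual_color / sample_color           # per-channel ratio I = (R/R', G/G', B/B')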
In step S4, a spherical harmonic illumination model equation is constructed according to the association relationship between the color value of the ambient light in the three-dimensional face image along each direction component and the ratio;
in one embodiment, for each sampling point, the association between the color value of the sampling point and the color values of the ambient light in the three-dimensional face image along each of the direction components may be expressed by a spherical harmonic illumination model formula.
Specifically, the spherical harmonic illumination model formula may represent the relationship between the ratio of the color value of the actual color of the sampling point to the color value of the sample color of the sampling point, the normal of the sampling point, and the color values of the ambient light along each of the direction components, specifically:
I=normal.x*lighta+normal.y*lightb+normal.z*lightc;
where I represents the ratio of the color value of the actual color of the sampling point to the color value of the sample color of the sampling point. When both the sample color and the actual color are three-dimensional vectors containing the color values of three channels, the ratio is obtained by dividing the color value of the actual color by the color value of the sample color dimension by dimension. For example, if the color value of the actual color is (R, G, B) and the color value of the sample color is (R', G', B'), where the first dimension is the red component, the second dimension is the green component and the third dimension is the blue component, then the ratio of the color value of the actual color to the color value of the sample color, i.e. I, is equal to (R/R', G/G', B/B').
The normal of the sampling point is (normal.x, normal.y, normal.z), wherein normal.x is the component of the normal in the x direction in the three-dimensional face image, normal.y is the component of the normal in the y direction in the three-dimensional face image, normal.z is the component of the normal in the z direction in the three-dimensional face image, the x direction and the y direction are parallel to the plane of the three-dimensional face image, and the z direction is perpendicular to the plane of the three-dimensional face image.
Ambient light is denoted (lighta, lightb, lightc), where lighta, lightb and lightc are vectors of the same dimensions as the color values of the actual color and the sample color. For example, lighta is (Ra, Ga, Ba), representing the color value of the ambient light along the direction component normal.x (i.e. the x direction), where Ra represents the red component of the ambient light along the x direction, Ga the green component and Ba the blue component; lightb is (Rb, Gb, Bb), representing the color value of the ambient light along the direction component normal.y (i.e. the y direction), where Rb represents the red component of the ambient light along the y direction, Gb the green component and Bb the blue component; lightc is (Rc, Gc, Bc), representing the color value of the ambient light along the direction component normal.z (i.e. the z direction), where Rc represents the red component of the ambient light along the z direction, Gc the green component and Bc the blue component.
The color value of the ambient light along each direction component obtained according to the spherical harmonic illumination model formula can not only represent the component of each color in the ambient light, but also represent the color component of the ambient light in each direction, namely the color value and the direction information of illumination are contained.
In the related art, the spherical harmonic illumination model formula is mainly used to calculate the color value of the actual color, i.e. I, when the ambient light information is known. In this embodiment, conversely, the color value of the actual color is known, and the color values of the ambient light along each of the direction components are determined from the spherical harmonic illumination model formula.
In step S5, a color value of the ambient light along each direction component is calculated according to an equation set formed by spherical harmonic illumination model equations corresponding to the plurality of sampling points.
In one embodiment, since one equation can be constructed for each sampling point according to the spherical harmonic illumination model described above, a plurality of equations can be constructed for the plurality of sampling points, and these equations form an equation set. By solving the equation set, optimal ambient light information (lighta, lightb, lightc) can be obtained; the manner of solving the equation set includes, but is not limited to, singular value decomposition (Singular Value Decomposition, SVD).
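A sketch of steps S4 and S5 is given below, stacking one spherical harmonic illumination model equation per sampling point and solving the resulting system in the least-squares sense; np.linalg.lstsq (which is SVD-based) is used here as one possible solver, an assumption for illustration only:

import numpy as np

def solve_ambient_light(normals, ratios):
    # normals: (num_points, 3) normal (normal.x, normal.y, normal.z) of each sampling point.
    # ratios:  (num_points, 3) per-channel ratio I of each sampling point.
    # Each sampling point gives one equation
    #     I = normal.x*lighta + normal.y*lightb + normal.z*lightc,
    # so stacking the points yields  normals @ L = ratios  with L = [lighta; lightb; lightc].
    light, *_ = np.linalg.lstsq(normals, ratios, rcond=None)
    return light   # (3, 3): rows lighta, lightb, lightc; columns are the R, G, B components

With, say, 1,200 sampling points the system is heavily over-determined, which is why a least-squares (SVD-based) solution is a natural fit.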
According to the embodiments of the disclosure, a spherical harmonic illumination model equation can be constructed according to the association between the color values of the ambient light in the three-dimensional face image along each direction component and the ratio of the color value of the actual color to the color value of the sample color, and the color values of the ambient light along each direction component can then be calculated from the system of equations formed by the spherical harmonic illumination model equations constructed for the plurality of sampling points. The obtained color values contain both the color values and the direction information of the illumination. With this illumination information, accurate illumination information can be recovered and the rendering effect on the three-dimensional face image can be improved.
Fig. 3 is a schematic diagram of a rendering effect shown according to an embodiment of the present disclosure.
As shown in fig. 3, in order to conveniently show the comparison effect, the two-dimensional face image and the three-dimensional face image are placed in one image for comparison, and in practical application, the two-dimensional face image and the three-dimensional face image may be placed in two images for display respectively.
In the related art, when a three-dimensional face image is rendered, the rendering is based on the brightness of the pixels in the two-dimensional face image. However, the sizes and shapes of the facial features in the two-dimensional face image and the three-dimensional face image are not identical; for example, in fig. 3 the nose in the three-dimensional face image is smaller than the nose in the two-dimensional face image, and the eyes in the three-dimensional face image are larger than the eyes in the two-dimensional face image. As a result, a position that is dark in the two-dimensional face image may correspond to a position in the three-dimensional face image that should in fact be illuminated and appear brighter.
For example, in fig. 3 the area A between the eye on the right side of the face (the person's left eye) and the eyebrow is dark in the two-dimensional face image. If the three-dimensional face image were rendered according to the prior art, this area would not be illuminated and all of its pixels would have low brightness; such rendering does not accurately reflect the degree to which the light in area A is blocked in the three-dimensional face image.
According to the embodiments of the disclosure, since the determined ambient light information includes the irradiation direction, whether the ambient light is blocked can be determined from the three-dimensional coordinates of each point in the three-dimensional face image when it is rendered. For example, as shown in fig. 3, since the nose in the three-dimensional face image is smaller than the nose in the two-dimensional face image, the nose in the three-dimensional face image blocks the ambient light coming from the lower left to a lesser degree than the nose in the two-dimensional face image does. A small amount of light therefore reaches the area A between the eye and the eyebrow on the right side of the face in the three-dimensional face image, and in the rendering result some of the pixels in area A are brighter. Thus, the ambient light information determined according to the embodiments of the disclosure contains both the color values and the irradiation direction information, and the rendering effect on the three-dimensional face image can be improved accordingly.
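As an illustration of how the recovered values can be used during rendering, the sketch below shades each vertex with the same spherical harmonic relationship. This is only a simplified shading pass under an assumed per-vertex base color, not the disclosure's full rendering pipeline, which additionally accounts for occlusion along the irradiation direction:

import numpy as np

def shade_vertices(normals, base_colors, light):
    # normals:     (num_vertices, 3) unit normals of the mesh vertices.
    # base_colors: (num_vertices, 3) assumed per-vertex unlit (sample) colors.
    # light:       (3, 3) rows lighta, lightb, lightc as returned by solve_ambient_light().
    intensity = normals @ light                        # per-vertex, per-channel illumination factor I
    return np.clip(base_colors * intensity, 0.0, 1.0)  # lit color = base color scaled per channel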
Fig. 4 is a schematic flow chart diagram illustrating another ambient light information determination method according to an embodiment of the present disclosure. As shown in fig. 4, the determining a plurality of sampling points in the three-dimensional face image includes:
in step S11, determining a sample point located at a preset position in the three-dimensional face image;
in step S12, a point outside a preset area is determined as the sampling point from the sample points, where the preset area includes:
nostril area, lateral face area, glasses area.
In one embodiment, for the three-dimensional face image, a point located at a preset position therein may be taken as a sample point, the position of each sample point on the face of the three-dimensional face image is preset, and an identification, such as a number, may be set for each sample point, respectively, so that the sample point of a specific position is determined according to the identification.
Further, for a sample point in the three-dimensional face image, a point outside a preset area can be determined as a sampling point in the sample point, wherein the preset area comprises at least one of the following: nostril area, lateral face area, glasses area.
For the preset areas such as the nostril area, the side face area and the glasses area, the ambient light reaching them is blocked: the nostril area is shielded by the nose, the side face area by the cheek, and the glasses area by the frame and the lenses. The ambient light information in these preset areas is therefore inconsistent with the actual ambient light information; for example, no matter what color the ambient light is, the nostril area appears black because it is shielded by the nose. Because not everyone wears glasses, whether glasses exist in the three-dimensional face image can first be identified, and if so, the area where the glasses are located is determined as the glasses area.
In this embodiment, sample points located outside the preset areas may be selected as sampling points. Since such a sampling point lies outside the preset areas, it is generally not blocked, so its ambient light information matches the actual ambient light information; performing the subsequent processing on these sampling points therefore helps ensure that the ambient light information is determined accurately.
In one embodiment, since more sampling points mean more equations and therefore a larger amount of computation, this embodiment determines a relatively small number of the sample points as sampling points; for example, 1,200 points can be determined among 40,000 sample points as sampling points. This helps reduce the amount of computation, so that the method described in the embodiments of the disclosure is suitable for electronic devices with relatively limited computing capability, such as mobile phones and wearable devices.
However, a small number of sampling points may not accurately represent the illumination condition at every position in the three-dimensional face image. Therefore, this embodiment can determine the points outside the preset areas as sampling points at a density lower than a preset density (for example, 100 points per square centimeter), so that the small number of sampling points are distributed relatively uniformly over the whole three-dimensional face image and can represent the illumination condition at each position relatively accurately, which is favorable for accurately determining the ambient light information.
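A sketch of one simple way to thin the candidate points roughly uniformly is given below; the stride-based selection and the target count are assumptions made for illustration, since the disclosure only requires that the resulting density stays below the preset density:

import numpy as np

def thin_uniformly(candidate_ids, target_count=1200):
    # Pick roughly target_count points spread evenly across the ordered candidates.
    candidate_ids = np.asarray(candidate_ids)
    stride = max(1, len(candidate_ids) // target_count)
    return candidate_ids[::stride]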
Optionally, the direction information of the sampling point includes the normal of a tangent plane whose tangent point is the sampling point, where the curved surface formed by the three-dimensional face image is tangent to the tangent plane at the sampling point.
In one embodiment, the direction information of the sampling point may be represented by a normal of the sampling point, which may be a three-dimensional vector. For example, the normal of the sampling point is (normal.x, normal.y, normal.z), where normal.x is the component of the normal in the x-direction in the three-dimensional face image, normal.y is the component of the normal in the y-direction in the three-dimensional face image, normal.z is the component of the normal in the z-direction in the three-dimensional face image, the x-direction and the y-direction are parallel to the plane of the three-dimensional face image, and the z-direction is perpendicular to the plane of the three-dimensional face image.
The method for determining the normal line of the sampling point can be selected according to the need, for example, a triangle can be constructed by taking the sampling point as a vertex, a plane where the triangle is located and a curved surface formed by a three-dimensional face image (for example, the face surface in the three-dimensional face image) are tangent to the sampling point, and then the normal line of the plane where the triangle is located is calculated as the normal line of the sampling point; for example, two non-parallel vectors may be determined by using a sampling point as an intersection point of two vectors, a plane formed by the two vectors and a curved surface formed by the three-dimensional face image are tangent to the sampling point, and then a normal line of the plane formed by the two vectors is calculated as a normal line of the sampling point.
Fig. 5 is a schematic flow chart diagram illustrating one determination of a ratio according to an embodiment of the present disclosure. As shown in fig. 5, the ratio is determined by:
in step S1', determining the position of the sampling point;
in step S2', a history sample point positioned at the position is inquired in a plurality of history three-dimensional face images stored in advance in a database;
in step S3', calculating a mean value of the color values of the plurality of history sample points as a color value of the sample color;
wherein the actual color comprises color values a_i of n channels and the sample color comprises color values b_i of n channels, the i-th channel of the actual color and the i-th channel of the sample color correspond to the same color, and 1 ≤ i ≤ n;
in step S4', the ratio of the color value of the actual color to the color value of the sample color is calculated as
I = (a_1/b_1, a_2/b_2, ..., a_n/b_n).
In one embodiment, the color value of the sample color of the sampling point may be obtained by querying a database, for example, a plurality of historical three-dimensional face images may be pre-stored in the database, where the skin colors of faces in the stored historical three-dimensional face images are the same or similar, for example, all yellow skin, all white skin, or all black skin, so that the accuracy of determining the sample color is favorably not affected by the skin color difference.
Each of the history three-dimensional face images may include a plurality of history sample points, the position of each of the history sample points on the face of the history three-dimensional face image being set in advance, and an identification, such as a number, may be set for each of the history sample points, respectively. Further, for the sampling points in the three-dimensional face image, the positions of the sampling points can be determined first, and then the identification corresponding to the positions, such as the numbers of the sampling points, can be determined.
Then, the historical sample points located at that position are queried in the plurality of historical three-dimensional face images in the database, for example by querying the historical sample points having the same identification (specifically, the same number) as the sampling point; the queried historical sample points then occupy the same position on the face as the sampling point. The color value of the sample color can then be determined from the color values of these historical sample points, for example by weighted summation or by taking the mean, and the obtained result is used as the color value of the sample color. The color value of the actual color of the sampling point can be obtained directly from the three-dimensional face image.
The sample color and the actual color may comprise color values of multiple channels, for example, a color value of a red channel, a color value of a green channel, and a color value of a blue channel, in which case the color values of the sample color and the actual color are three-dimensional vectors containing three channel color values.
In case the sample color and the actual color are both three-dimensional vectors containing the color values of three channels, the ratio of the color value of the actual color to the color value of the sample color is obtained by dividing the color value of the actual color by the color value of the sample color dimension by dimension. For example, if the color value of the actual color is (R, G, B) and the color value of the sample color is (R', G', B'), where the first dimension is the red component, the second dimension is the green component and the third dimension is the blue component, then the ratio of the color value of the actual color to the color value of the sample color, i.e. the ratio I, is equal to (R/R', G/G', B/B').
More generally, where the actual color contains color values a_i of n channels, the sample color contains color values b_i of n channels, and the i-th channel of the actual color corresponds to the same color as the i-th channel of the sample color, the ratio of the color value of the actual color to the color value of the sample color is
I = (a_1/b_1, a_2/b_2, ..., a_n/b_n).
Fig. 6 is a schematic flow chart diagram illustrating yet another ambient light information determination method according to an embodiment of the present disclosure. As shown in fig. 6, the normal includes an x-direction component, a y-direction component and a z-direction component, and the constructing a spherical harmonic illumination model equation according to the association relationship between the color value of the ambient light in the three-dimensional face image along each direction component and the color value of the sampling point includes:
in step S41, a spherical harmonic illumination model equation is constructed according to the association relationship between the color value lighta of the ambient light along the x-direction component normal.x, the color value lightb along the y-direction component normal.y, the color value lightc along the z-direction component normal.z, and the ratio I of the actual color of the sampling point to the sample color of the sampling point in the three-dimensional face image:
I=normal.x*lighta+normal.y*lightb+normal.z*lightc。
in one embodiment, for each sampling point, the association between the color value of the sampling point and the color values of the ambient light in the three-dimensional face image along each of the direction components may be expressed by a spherical harmonic illumination model formula.
Specifically, the spherical harmonic illumination model formula may represent the relationship between the ratio of the color value of the actual color of the sampling point to the color value of the sample color of the sampling point, the normal of the sampling point, and the color values of the ambient light along each of the direction components, specifically:
I=normal.x*lighta+normal.y*lightb+normal.z*lightc;
where I represents the ratio of the color value of the actual color of the sampling point to the color value of the sample color of the sampling point. When both the sample color and the actual color are three-dimensional vectors containing the color values of three channels, the ratio is obtained by dividing the color value of the actual color by the color value of the sample color dimension by dimension. For example, if the color value of the actual color is (R, G, B) and the color value of the sample color is (R', G', B'), where the first dimension is the red component, the second dimension is the green component and the third dimension is the blue component, then the ratio of the color value of the actual color to the color value of the sample color, i.e. I, is equal to (R/R', G/G', B/B').
The normal of the sampling point is (normal.x, normal.y, normal.z), wherein normal.x is the component of the normal in the x direction in the three-dimensional face image, normal.y is the component of the normal in the y direction in the three-dimensional face image, normal.z is the component of the normal in the z direction in the three-dimensional face image, the x direction and the y direction are parallel to the plane of the three-dimensional face image, and the z direction is perpendicular to the plane of the three-dimensional face image.
Ambient light is denoted (lighta, lightb, lightc), where lighta, lightb and lightc are vectors of the same dimensions as the color values of the actual color and the sample color. For example, lighta is (Ra, Ga, Ba), representing the color value of the ambient light along the direction component normal.x (i.e. the x direction), where Ra represents the red component of the ambient light along the x direction, Ga the green component and Ba the blue component; lightb is (Rb, Gb, Bb), representing the color value of the ambient light along the direction component normal.y (i.e. the y direction), where Rb represents the red component of the ambient light along the y direction, Gb the green component and Bb the blue component; lightc is (Rc, Gc, Bc), representing the color value of the ambient light along the direction component normal.z (i.e. the z direction), where Rc represents the red component of the ambient light along the z direction, Gc the green component and Bc the blue component.
The color value of the ambient light along each direction component obtained according to the spherical harmonic illumination model formula not only can represent the component of each color in the ambient light, but also can represent the color component of the ambient light in each direction, namely the color value and the direction information of illumination are contained.
The embodiment of the ambient light information determining apparatus shown in the embodiments of the present disclosure may be applied to a terminal, a server, or other devices. The apparatus embodiments may be implemented by software, or by hardware, or by a combination of hardware and software. Taking a software implementation as an example, the apparatus in a logical sense is formed by the processor of the device in which the apparatus is located reading corresponding computer program instructions from a non-volatile memory into memory and running them. In terms of hardware, fig. 7 is a hardware structure diagram of the device in which the ambient light information determining apparatus is located according to an embodiment of the present disclosure. In addition to the processor, the network interface, the memory and the non-volatile memory shown in fig. 7, the device in which the apparatus of this embodiment is located may generally include other hardware, such as a forwarding chip responsible for processing packets; in terms of hardware architecture, the device may also be a distributed device, possibly comprising a plurality of interface cards, so that packet processing can be extended at the hardware level.
The present disclosure also proposes embodiments of an ambient light information determining apparatus, corresponding to the embodiments of the ambient light information determining method described above.
Fig. 8 is a schematic block diagram of an ambient light information determining apparatus shown according to an embodiment of the present disclosure. The ambient light information determining apparatus can be applied to electronic equipment capable of rendering three-dimensional face images based on ambient light information, where the electronic equipment can be a terminal such as a mobile phone, a tablet computer or a wearable device, and can also be a server.
As shown in fig. 8, the ambient light information determining apparatus may include:
a three-dimensional generation module 1 configured to perform processing of key points in the two-dimensional face image based on the three-dimensional deformation model to generate a three-dimensional face image;
a sampling point determination module 2 configured to perform determination of a plurality of sampling points in the three-dimensional face image;
a direction determination module 3 configured to perform determination of direction information of the sampling points, wherein the direction information includes a plurality of direction components;
a ratio determining module 4 configured to perform determining a ratio of color values of an actual color of the sampling point to color values of a sample color of the sampling point;
An equation construction module 5 configured to perform construction of a spherical harmonic illumination model equation from an association relationship between a color value of the ambient light in the three-dimensional face image along each of the direction components and the ratio;
a color value calculation module 6 configured to perform an equation set constituted according to the spherical harmonic illumination model equations corresponding to the plurality of sampling points, and calculate a color value of the ambient light along each of the direction components.
Fig. 9 is a schematic block diagram of a sample point determination module shown in accordance with an embodiment of the present disclosure. As shown in fig. 9, the sampling point determining module 2 includes:
a sample point determination sub-module 21 configured to perform determination of a sample point located at a preset position in the three-dimensional face image;
a sampling point determination submodule 22 configured to perform determination of points, among the sample points, located outside a preset area as the sampling points, wherein the preset area includes:
nostril area, lateral face area, glasses area.
Optionally, the direction information of the sampling point includes the normal of a tangent plane whose tangent point is the sampling point, where the curved surface formed by the three-dimensional face image is tangent to the tangent plane at the sampling point.
FIG. 10 is a schematic block diagram of a ratio determination module shown in accordance with an embodiment of the disclosure. As shown in fig. 10, the ratio determining module 4 includes:
a position determination sub-module 41 configured to perform a determination of the position of the sampling point;
a sample querying sub-module 42 configured to perform querying of a plurality of historical three-dimensional face images stored in advance in a database for historical sample points located at the locations;
a color value calculation sub-module 43 configured to perform calculation of a mean value of color values of a plurality of the history sample points as a color value of the sample color;
wherein the actual color comprises color values a_i of n channels and the sample color comprises color values b_i of n channels, the i-th channel of the actual color and the i-th channel of the sample color correspond to the same color, and 1 ≤ i ≤ n;
a ratio calculation sub-module 44 configured to perform calculation of the ratio of the color value of the actual color to the color value of the sample color as
I = (a_1/b_1, a_2/b_2, ..., a_n/b_n).
Optionally, the normal includes an x-direction component, a y-direction component, and a z-direction component, and the equation construction module is configured to perform constructing a spherical harmonic illumination model equation according to an association of a color value lighta of ambient light in the three-dimensional face image along the x-direction component normal.x, a color value lightb along the y-direction component normal.y, a color value lightc along the z-direction component normal.z, and a ratio I of an actual color of the sampling point to a sample color of the sampling point:
I=normal.x*lighta+normal.y*lightb+normal.z*lightc。
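As a purely hypothetical numeric illustration, a sampling point whose normal is (0, 0.6, 0.8) and whose ratio is I = 0.9 contributes the equation 0.9 = 0*lighta + 0.6*lightb + 0.8*lightc; combining three or more such equations from sampling points with non-coplanar normals allows lighta, lightb, and lightc to be solved, as in the equation set handled by the color value calculation module.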
The specific manner in which the various modules perform their operations in the apparatus of the above embodiments has been described in detail in the embodiments of the method, and will not be repeated here.
The embodiment of the disclosure also proposes an electronic device, including:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the ambient light information determination method according to any one of the embodiments described above.
Embodiments of the present disclosure also propose a storage medium; when the instructions in the storage medium are executed by a processor of an electronic device, the electronic device is enabled to perform the ambient light information determination method of any one of the above embodiments.
Optionally, the storage medium may be a non-transitory computer-readable storage medium, which may be, for example, a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Embodiments of the present disclosure also propose a computer program product configured to perform the ambient light information determination method of any one of the embodiments described above.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. The present disclosure is intended to cover any variations, uses, or adaptations of the disclosure following its general principles and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.
It is noted that relational terms such as first and second are used solely to distinguish one entity or action from another entity or action, without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a … …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
The foregoing is a detailed description of the method and apparatus provided by the embodiments of the present disclosure. Specific examples have been used herein to explain the principles and implementations of the present disclosure, and the above examples are provided only to help understand the method of the present disclosure and its core ideas. Meanwhile, one of ordinary skill in the art may make changes to the specific implementations and the scope of application in light of the ideas of the present disclosure. In view of the above, the contents of this specification should not be construed as limiting the present disclosure.

Claims (8)

1. A method for determining ambient light information, comprising:
processing key points in a two-dimensional face image based on a three-dimensional deformation model to generate a three-dimensional face image;
determining a plurality of sampling points in the three-dimensional face image;
determining direction information of the sampling point and a ratio of the color value of the actual color of the sampling point to the color value of the sample color of the sampling point, wherein the direction information includes a plurality of direction components, the direction information of the sampling point includes the normal of a tangent plane with the sampling point as the tangent point, the normal includes an x-direction component, a y-direction component, and a z-direction component, and the curved surface formed by the three-dimensional face image is tangent to the tangent plane at the sampling point;
constructing a spherical harmonic illumination model equation according to the association between the color value of the ambient light in the three-dimensional face image along each direction component and the ratio, wherein the constructing comprises:
constructing the spherical harmonic illumination model equation according to the association between the color value lighta of the ambient light in the three-dimensional face image along the x-direction component normal.x, the color value lightb along the y-direction component normal.y, the color value lightc along the z-direction component normal.z, and the ratio I of the actual color of the sampling point to the sample color of the sampling point: I = normal.x*lighta + normal.y*lightb + normal.z*lightc;
and calculating the color value of the ambient light along each direction component according to an equation set formed by spherical harmonic illumination model equations corresponding to the sampling points.
2. The method of claim 1, wherein determining a plurality of sampling points in the three-dimensional face image comprises:
determining sample points positioned at preset positions in the three-dimensional face image;
and determining, from the sample points, the points located outside a preset area as the sampling points, wherein the preset area comprises:
a nostril area, a lateral face area, and a glasses area.
3. The method of claim 1, wherein the ratio is determined by:
determining the position of the sampling point;
querying, in a plurality of historical three-dimensional face images stored in a database in advance, the historical sample points located at the position;
calculating the mean value of the color values of the plurality of historical sample points as the color value of the sample color;
wherein the actual color comprises color values a_i of n channels, the sample color comprises color values b_i of n channels, the i-th channel of the actual color corresponds to the i-th channel of the sample color having the same color, and 1 ≤ i ≤ n;
calculating the ratio of the color value of the actual color to the color value of the sample color:
I = (a_1/b_1 + a_2/b_2 + … + a_n/b_n) / n.
4. An ambient light information determination apparatus, comprising:
a three-dimensional generation module configured to process key points in a two-dimensional face image based on a three-dimensional deformation model to generate a three-dimensional face image;
a sampling point determination module configured to determine a plurality of sampling points in the three-dimensional face image;
a direction determination module configured to determine direction information of the sampling point, wherein the direction information includes a plurality of direction components, the direction information of the sampling point includes the normal of a tangent plane with the sampling point as the tangent point, the normal includes an x-direction component, a y-direction component, and a z-direction component, and the curved surface formed by the three-dimensional face image is tangent to the tangent plane at the sampling point;
a ratio determination module configured to determine a ratio of the color value of the actual color of the sampling point to the color value of the sample color of the sampling point;
an equation construction module configured to construct a spherical harmonic illumination model equation according to the association between the color value of the ambient light in the three-dimensional face image along each of the direction components and the ratio,
wherein the equation construction module is configured to construct the spherical harmonic illumination model equation according to the association between the color value lighta of the ambient light in the three-dimensional face image along the x-direction component normal.x, the color value lightb along the y-direction component normal.y, the color value lightc along the z-direction component normal.z, and the ratio I of the actual color of the sampling point to the sample color of the sampling point: I = normal.x*lighta + normal.y*lightb + normal.z*lightc;
and a color value calculation module configured to calculate the color value of the ambient light along each of the direction components according to an equation set formed by the spherical harmonic illumination model equations corresponding to the plurality of sampling points.
5. The apparatus of claim 4, wherein the sampling point determination module comprises:
a sample point determination sub-module configured to determine sample points located at preset positions in the three-dimensional face image;
a sampling point determination sub-module configured to determine, among the sample points, the points located outside a preset area as the sampling points, wherein the preset area includes:
a nostril area, a lateral face area, and a glasses area.
6. The apparatus of claim 4, wherein the ratio determination module comprises:
a position determination sub-module configured to determine the position of the sampling point;
a sample query sub-module configured to query, in a plurality of historical three-dimensional face images pre-stored in a database, the historical sample points located at the position;
a color value calculation sub-module configured to calculate the mean value of the color values of the plurality of historical sample points as the color value of the sample color;
wherein the actual color comprises color values a_i of n channels, the sample color comprises color values b_i of n channels, the i-th channel of the actual color corresponds to the i-th channel of the sample color having the same color, and 1 ≤ i ≤ n;
a ratio calculation sub-module configured to calculate the ratio of the color value of the actual color to the color value of the sample color:
I = (a_1/b_1 + a_2/b_2 + … + a_n/b_n) / n.
7. An electronic device, comprising:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the ambient light information determination method of any one of claims 1 to 3.
8. A storage medium, characterized in that instructions in the storage medium, when executed by a processor of an electronic device, enable the electronic device to perform the ambient light information determination method of any one of claims 1 to 3.
CN201910707522.8A 2019-08-01 2019-08-01 Ambient light information determination method, device, electronic equipment and storage medium Active CN110533760B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910707522.8A CN110533760B (en) 2019-08-01 2019-08-01 Ambient light information determination method, device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910707522.8A CN110533760B (en) 2019-08-01 2019-08-01 Ambient light information determination method, device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN110533760A CN110533760A (en) 2019-12-03
CN110533760B true CN110533760B (en) 2023-05-30

Family

ID=68661229

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910707522.8A Active CN110533760B (en) 2019-08-01 2019-08-01 Ambient light information determination method, device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110533760B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0341570A (en) * 1989-07-10 1991-02-22 Hitachi Ltd Color image processing method
CN102013005A (en) * 2009-09-07 2011-04-13 泉州市铁通电子设备有限公司 Local dynamic threshold color balance based detecting human face detection method with polarized colored light based on
CN108510583A (en) * 2018-04-03 2018-09-07 北京华捷艾米科技有限公司 The generation method of facial image and the generating means of facial image


Also Published As

Publication number Publication date
CN110533760A (en) 2019-12-03

Similar Documents

Publication Publication Date Title
CN108197618B (en) Method and device for generating human face detection model
CN106331492A (en) Image processing method and terminal
CN109558864A (en) Face critical point detection method, apparatus and storage medium
CN111428581A (en) Face shielding detection method and system
CN106778453B (en) Method and device for detecting glasses wearing in face image
CN112419170A (en) Method for training occlusion detection model and method for beautifying face image
CN107203963A (en) A kind of image processing method and device, electronic equipment
CN107506738A (en) Feature extracting method, image-recognizing method, device and electronic equipment
CN108491823A (en) Method and apparatus for generating eye recognition model
CN112836625A (en) Face living body detection method and device and electronic equipment
CN109492540B (en) Face exchange method and device in image and electronic equipment
US20240127404A1 (en) Image content extraction method and apparatus, terminal, and storage medium
CN111784660B (en) Method and system for analyzing frontal face degree of face image
CN111274476B (en) House source matching method, device, equipment and storage medium based on face recognition
CN110533760B (en) Ambient light information determination method, device, electronic equipment and storage medium
CN110533777B (en) Three-dimensional face image correction method and device, electronic equipment and storage medium
CN110047126B (en) Method, apparatus, electronic device, and computer-readable storage medium for rendering image
CN115115552B (en) Image correction model training method, image correction device and computer equipment
CN115840550A (en) Angle-adaptive display screen display method, device and medium
CN113435358B (en) Sample generation method, device, equipment and program product for training model
CN112070662B (en) Evaluation method and device of face changing model, electronic equipment and storage medium
CN110969674B (en) Method and device for generating winding drawing, terminal equipment and readable storage medium
CN103971111B (en) Image processing method and terminal device
CN112069885A (en) Face attribute identification method and device and mobile terminal
CN113223103A (en) Method, device, electronic device and medium for generating sketch

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant