CN107563302B - Face restoration method and device for removing glasses - Google Patents


Info

Publication number
CN107563302B
Authority
CN
China
Prior art keywords
user
infrared
glasses
light source
structure light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710676984.9A
Other languages
Chinese (zh)
Other versions
CN107563302A (en
Inventor
张学勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201710676984.9A priority Critical patent/CN107563302B/en
Publication of CN107563302A publication Critical patent/CN107563302A/en
Application granted granted Critical
Publication of CN107563302B publication Critical patent/CN107563302B/en
Legal status: Active


Abstract

The invention discloses a face restoration method and device for removing glasses, wherein the method comprises the following steps: projecting an infrared structured light source onto a user wearing glasses, and capturing the infrared structured-light image modulated after the light source passes through the glasses; demodulating the phase corresponding to the deformed-position pixels in the infrared structured-light image, and generating depth-of-field information of the user's eyes according to the phase; and generating the user's eye contour according to the depth-of-field information of the user's eyes, and performing color processing on the eye contour according to the skin-color information of the user's face to generate a user image with the glasses removed. The ways of restoring a face are thereby enriched, and the accuracy and efficiency of face restoration are improved.

Description

Face restoration method and device for removing glasses
Technical Field
The invention relates to the technical field of information processing, in particular to a face restoration method and device for removing glasses.
Background
In daily life, photos of some people, such as celebrities, are often taken with glasses on; or a user who wears glasses may look back at photos taken earlier, and so on. In such cases the viewer can only guess what the face would look like with the glasses removed, or try to recall its earlier appearance.
In the related art, glasses can be removed with image-processing software, but this approach to face restoration is inefficient and not accurate enough.
Disclosure of Invention
The invention provides a face restoration method and device for removing glasses, and aims to solve the technical problems in the prior art of low efficiency and low accuracy when a face is restored by removing glasses.
An embodiment of the invention provides a face restoration method for removing glasses, which comprises the following steps: projecting an infrared structured light source onto a user wearing glasses, and capturing the infrared structured-light image modulated after the light source penetrates the glasses; demodulating the phase corresponding to the deformed-position pixels in the infrared structured-light image, and generating depth-of-field information of the user's eyes according to the phase; and generating the user's eye contour according to the depth-of-field information of the user's eyes, and performing color processing on the eye contour according to the skin-color information of the user's face to generate a user image with the glasses removed.
Another embodiment of the present invention provides a face restoration device for removing glasses, comprising: a projection shooting module, configured to project an infrared structured light source onto a user wearing glasses and capture the infrared structured-light image modulated after the light source penetrates the glasses; a demodulation generation module, configured to demodulate the phase corresponding to the deformed-position pixels in the infrared structured-light image and generate depth-of-field information of the user's eyes according to the phase; and a generating module, configured to generate the user's eye contour according to the depth-of-field information of the user's eyes, perform color processing on the eye contour according to the skin-color information of the user's face, and generate a user image with the glasses removed.
Another embodiment of the present invention provides a terminal, including a memory and a processor, where the memory stores computer-readable instructions, and the instructions, when executed by the processor, cause the processor to execute the method for restoring a human face with glasses removed according to the embodiment of the first aspect of the present invention.
A further embodiment of the present invention provides a non-transitory computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the method for restoring a human face by removing glasses according to the embodiment of the first aspect of the present invention.
The technical scheme provided by the embodiment of the invention has the following beneficial effects:
the method projects an infrared structured light source onto a user wearing glasses, captures the infrared structured-light image modulated after the light source penetrates the glasses, demodulates the phase corresponding to the deformed-position pixels in the infrared structured-light image, generates depth-of-field information of the user's eyes according to the phase, generates the user's eye contour according to that depth-of-field information, and performs color processing on the eye contour according to the skin-color information of the user's face to generate a user image with the glasses removed. The ways of restoring a face are thereby enriched, and the accuracy and efficiency of face restoration are improved.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The foregoing and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a flow diagram of a method for restoring a human face with glasses removed according to one embodiment of the present invention;
FIG. 2(a) is a schematic diagram of scene one of structured light measurement according to one embodiment of the present invention;
FIG. 2(b) is a schematic diagram of scene two of structured light measurement according to one embodiment of the present invention;
FIG. 2(c) is a schematic diagram of scene three of structured light measurement according to one embodiment of the present invention;
FIG. 2(d) is a schematic diagram of scene four of structured light measurement according to one embodiment of the present invention;
FIG. 2(e) is a schematic diagram of scene five of structured light measurement according to one embodiment of the present invention;
FIG. 3(a) is a schematic diagram of a partial diffractive structure of a collimating beam splitting element according to one embodiment of the present invention;
FIG. 3(b) is a schematic diagram of a partial diffractive structure of a collimating beam splitting element according to another embodiment of the present invention;
fig. 4 is a block diagram illustrating a structure of a face restoration apparatus with glasses removed according to an embodiment of the present invention;
fig. 5 is a block diagram illustrating a structure of a face restoration apparatus with glasses removed according to another embodiment of the present invention;
fig. 6 is a schematic structural diagram of a terminal device according to an embodiment of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are illustrative and intended to be illustrative of the invention and are not to be construed as limiting the invention.
The following describes a method and an apparatus for restoring a face with glasses removed according to an embodiment of the present invention with reference to the accompanying drawings.
Fig. 1 is a flowchart of a face restoration method for removing glasses according to an embodiment of the present invention.
As shown in fig. 1, the method for restoring a human face with glasses removed may include:
Step 101, projecting an infrared structured light source onto a user wearing glasses, and capturing the infrared structured-light image modulated after the light source passes through the glasses.
Step 102, demodulating the phase corresponding to the deformed-position pixels in the infrared structured-light image, and generating depth-of-field information of the user's eyes according to the phase.
It can be understood that when glasses are removed for face restoration by picture-processing software alone, problems such as low efficiency and insufficient accuracy arise.
To solve this technical problem, in the embodiment of the invention structured-light technology is used: the infrared structured light source can penetrate the glasses and land directly on the user's eyes, so that when the depth-of-field information of the user's eyes is obtained, the contour information of the eyes is obtained as well. The glasses (even sunglasses) are then removed via the 3D modeling information, color processing is performed on the eye region according to the surrounding skin-color information, and a photo of the user with the glasses removed is generated. The ways of restoring a face are thereby enriched, and the accuracy and efficiency of face restoration are improved.
Specifically, the infrared structured light source can be projected onto the user wearing glasses by means of an infrared emitter or the like. It can be understood that the higher the power of the infrared emitter, the stronger the ability of the infrared structured light to penetrate the glasses, so an appropriate power can be selected according to the needs of the actual application.
As a possible implementation, in order to further improve the efficiency and accuracy of face restoration, the lens color of the glasses may be identified and the projection intensity of the infrared structured light source adjusted according to that color. User requirements are thereby better met and the user experience is improved.
While projecting, the infrared structured-light image modulated after the light source passes through the glasses is captured.
Specifically, in practical applications, an infrared structured light source can be projected onto a user wearing glasses. The structural features of the light source can take many forms, such as laser stripes, Gray codes, sinusoidal stripes, or non-uniform speckle. Because structured light acquires three-dimensional information of the user's eyes based on contour and depth information, it is more accurate than a two-dimensional face-restoration approach that relies on image-processing software alone, which helps guarantee the accuracy of face restoration.
In order to make clear to those skilled in the art how the infrared structured light source is projected onto the user wearing glasses and how the infrared structured-light image modulated after passing through the glasses is captured, the widely applied grating projection technology (fringe projection) is taken as an example to illustrate the specific principle; grating projection belongs to surface structured light in the broad sense.
When surface structured light projection is used, as shown in fig. 2(a), sinusoidal fringes are generated by computer programming and projected onto the measured object through a projection device; a CCD camera photographs the degree to which the fringes are bent (modulated) by the object; the bent fringes are demodulated to obtain the phase, and the phase is converted into height over the full field. Of course, the most critical point is the calibration of the system, including calibration of the system geometry and of the internal parameters of the CCD camera and the projection device; otherwise errors or error coupling may result. If the external parameters of the system are not calibrated, the correct depth-of-field information cannot be calculated from the phase.
Specifically, in the first step, a sinusoidal fringe pattern is programmed. Because the phase is subsequently acquired from the deformed fringes, for example by the four-step phase-shifting method, four fringe patterns with a phase difference of π/2 are generated here and projected onto the measured object in a time-sharing manner. The pattern on the left of fig. 2(b) is then acquired, along with the fringes on the reference plane shown on the right of fig. 2(b).
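As an illustrative sketch (not part of the patent itself), the four phase-shifted sinusoidal fringe patterns described above could be generated as follows; the image size, fringe period, and intensity model are arbitrary assumptions:

```python
import numpy as np

def make_fringe_patterns(width=640, height=480, period=32):
    """Generate four sinusoidal fringe patterns phase-shifted by pi/2,
    as used by the four-step phase-shifting method. Intensity model:
    I_k(x) = 0.5 + 0.5 * cos(2*pi*x/period + k*pi/2), k = 0..3."""
    x = np.arange(width)
    patterns = []
    for k in range(4):
        row = 0.5 + 0.5 * np.cos(2 * np.pi * x / period + k * np.pi / 2)
        patterns.append(np.tile(row, (height, 1)))  # vertical fringes
    return patterns
```

In a real system these four patterns would be projected time-sequentially onto the measured object by the projection device.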
In the second step, phase recovery is performed: the modulated phase is calculated from the four acquired modulated fringe patterns. The resulting phase map is a truncated (wrapped) phase map, because the four-step phase-shifting algorithm computes the phase with an arctangent function, so the result is limited to [-π, π]: whenever the value exceeds this range, it wraps around and starts over. The resulting principal phase value is shown in fig. 2(c).
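The arctangent-based phase recovery can be sketched as follows (a hedged illustration, not necessarily the patent's exact algorithm); assuming captured intensities I_k = A + B·cos(φ + kπ/2), the wrapped phase follows from atan2:

```python
import numpy as np

def wrapped_phase(i0, i1, i2, i3):
    """Four-step phase shifting: recover the truncated (wrapped) phase
    from four images shifted by 0, pi/2, pi, 3pi/2. With
    I_k = A + B*cos(phi + k*pi/2):
        I3 - I1 = 2*B*sin(phi),   I0 - I2 = 2*B*cos(phi),
    so atan2 yields phi, limited to [-pi, pi]."""
    return np.arctan2(i3 - i1, i0 - i2)
```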
Still within this step, the jumps must be cancelled, i.e. the truncated phase must be restored to a continuous phase. As shown in fig. 2(d), the modulated continuous phase is on the left and the reference continuous phase on the right.
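Restoring the truncated phase to a continuous phase is classic phase unwrapping; a minimal row-wise sketch using NumPy (assuming neighbouring samples differ by less than π, which real noisy 2-D phase maps may violate):

```python
import numpy as np

def unwrap_rows(wrapped):
    """Remove the 2*pi jumps of a truncated phase map along each row,
    yielding a continuous phase. This simple 1-D unwrapping suffices
    for clean data; robust 2-D unwrapping needs stronger algorithms."""
    return np.unwrap(wrapped, axis=-1)
```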
In the third step, the reference continuous phase is subtracted from the modulated continuous phase to obtain the phase difference, which represents the depth-of-field information of the measured object relative to the reference plane. Substituting the phase difference into the phase-to-depth conversion formula (whose parameters are calibrated) yields the three-dimensional model of the measured object shown in fig. 2(e).
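Under a common simplifying assumption (object height much smaller than the camera-to-reference distance), the phase-to-depth conversion reduces to h ≈ L·p·Δφ/(2π·d). The function below is a sketch of that simplified formula only; the parameter values stand in for calibrated system constants:

```python
import math

def phase_to_height(delta_phi, l_dist, d_baseline, period):
    """Simplified fringe-projection conversion h = L*p*dphi/(2*pi*d),
    where L is the camera-to-reference-plane distance, d the
    camera-projector baseline, and p the fringe period on the
    reference plane. All three must come from system calibration,
    and the formula is only valid for h << L."""
    return l_dist * period * delta_phi / (2 * math.pi * d_baseline)
```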
It should be understood that, in practical applications, the structured light used in the embodiments of the present invention may be any pattern other than the grating, according to different application scenarios.
It should be emphasized that, as a possible implementation, speckle structured light may be projected onto the user's face, so that the depth-of-field information of the face is restored from the displacement (which is equivalent to the modulation) produced when the scattered spots, arranged according to a preset algorithm, are projected onto the face.
In this embodiment, a substantially flat diffraction element may be used that has a relief diffraction structure with a specific phase distribution and a stepped relief cross-section with two or more levels of concavities and convexities. The thickness of the substrate is approximately 1 micrometer, and the heights of the steps are non-uniform, ranging from 0.7 to 0.9 micrometers. Fig. 3(a) shows a partial diffraction structure of the collimating beam-splitting element of this embodiment, and fig. 3(b) is a cross-sectional side view along section A-A; the units of both the abscissa and the ordinate are micrometers.
By contrast, a general diffraction element diffracts a light beam into a number of diffracted beams whose intensities differ greatly, so there is a considerable risk of injury to the human eye.
The collimating beam-splitting element in this embodiment not only collimates the non-collimated beam but also splits it: after the non-collimated light reflected by the mirror passes through the collimating beam-splitting element, multiple collimated beams exit at different angles. The cross-sectional areas of these emitted beams are approximately equal, and their energy fluxes are approximately equal, so the scattered-spot light obtained by diffracting them gives a better result for image processing or projection. At the same time, because the laser output is dispersed across the individual beams, the risk of injuring the human eye is further reduced; and compared with other, uniformly arranged structured light, speckle structured light consumes less electrical energy while achieving the same collection effect.
Based on the above description, in this embodiment, the infrared structured light source is projected to the user wearing the glasses, the infrared structured light image modulated after the infrared structured light source passes through the glasses is captured, the phase corresponding to the deformed position pixel in the infrared structured light image is demodulated, and the depth of field information of the user's eyes is generated according to the phase.
It should be emphasized that, by the principle of structured light, the structured-light image processing results obtained for the same measured object differ under different acquisition and environmental conditions: for example, a front-lit environment with 2 meters between the structured-light device and user A gives a different result from a back-lit condition with 3 meters between them.
Therefore, in order to reduce the difficulty of face restoration and improve its efficiency for a given user, the projection light source is generated according to the structured-light processing parameters that were used when the user's registered verification feature information was processed with structured light in advance; this light source is projected onto the user wearing glasses, and the infrared structured-light image of the projected light as modulated by the user is captured.
It should be noted that, according to different application scenarios, the implementation manners of projecting the infrared structured light source to the user wearing the glasses and shooting the infrared structured light image modulated by the infrared structured light source after passing through the glasses are different, which are exemplified as follows:
in a first example, an infrared structured light source is projected toward the face of a user wearing glasses and an infrared structured light image is captured of the infrared structured light source modulated by the face of the user after passing through the glasses.
Further, the phase corresponding to the deformed position pixel in the infrared structured light image is demodulated, and the depth of field information of the user's eyes is generated according to the phase.
In this example, the measured depth-of-field information differs according to the characteristics of the user's eyes, and this difference can be reflected by the phase: the more stereoscopic the features around the user's eyes, the larger the phase distortion and the deeper the depth-of-field information. Thus the phase corresponding to the deformed-position pixels in the structured-light image is demodulated, the depth-of-field information of the user's eyes is generated from the phase, and a 3D model of the eyes is generated from that depth information.
In the second example, a glasses wearing area of a user in a preview picture is identified, a projection angle of an infrared projector is adjusted according to the glasses wearing area, an infrared structure light source is projected to the glasses wearing area, and an infrared structure light image modulated by eyes of the user after the infrared structure light source penetrates through glasses is shot.
In the example, the infrared structure light source is projected only aiming at the glasses wearing area, so that the more accurate infrared structure light image based on the eye area can be obtained, the accuracy of face restoration is further improved, and the user requirements are met.
Step 103, generating the user's eye contour according to the depth-of-field information of the user's eyes, performing color processing on the eye contour according to the skin-color information of the user's face, and generating a user image with the glasses removed.
Specifically, when the depth information of the user's eyes is acquired, the contour information around the eyes can be acquired at the same time, so the glasses can be removed automatically to restore the face. It can be understood that, after the glasses are removed, in order to further improve the realism and accuracy of the restoration, the skin-color information of the skin around the eyes must be obtained and color processing performed on the eye contour according to it; this preserves the naturalness of the restored face and produces a user image with the glasses removed.
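A toy sketch of this colour-processing step: fill the reconstructed eye region with the mean colour of a thin ring of surrounding skin. The dilation-based ring and the mean fill are illustrative assumptions of ours; a production system would more likely use proper image inpainting:

```python
import numpy as np

def dilate(mask, iters=3):
    """Binary dilation with a cross-shaped structuring element."""
    out = mask.copy()
    for _ in range(iters):
        out = (out | np.roll(out, 1, 0) | np.roll(out, -1, 0)
                   | np.roll(out, 1, 1) | np.roll(out, -1, 1))
    return out

def fill_with_surrounding_skin(image, eye_mask):
    """Replace pixels inside eye_mask with the mean colour of a thin
    ring of pixels just outside it (a crude skin-tone fill)."""
    ring = dilate(eye_mask) & ~eye_mask       # band of surrounding skin
    skin = image[ring].mean(axis=0)           # average skin colour
    out = image.copy()
    out[eye_mask] = skin
    return out
```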
In summary, in the face restoration method for removing glasses according to the embodiments of the present invention, an infrared structured light source is projected onto a user wearing glasses, and the infrared structured-light image modulated after the light source passes through the glasses is captured; the phase corresponding to the deformed-position pixels in the image is then demodulated, and depth-of-field information of the user's eyes is generated from the phase; finally, the user's eye contour is generated from that depth-of-field information, and color processing is performed on it according to the skin-color information of the user's face to generate a user image with the glasses removed. The ways of restoring a face are thereby enriched, and the accuracy and efficiency of face restoration are improved.
In order to implement the above embodiment, the present invention further provides a human face restoring apparatus with glasses removed, fig. 4 is a block diagram of a structure of the human face restoring apparatus with glasses removed according to an embodiment of the present invention, and as shown in fig. 4, the apparatus includes a projection shooting module 100, a demodulation generating module 200, and a generating module 300.
The projection shooting module 100 is configured to project an infrared structured light source to a user wearing glasses, and shoot an infrared structured light image modulated by the infrared structured light source after passing through the glasses.
And the demodulation generation module 200 is configured to demodulate a phase corresponding to the deformed position pixel in the infrared structured light image, and generate depth-of-field information of the user's eye according to the phase.
The generating module 300 is configured to generate an eye contour of the user according to the depth-of-field information of the eyes of the user, perform color processing on the eye contour according to the skin color information of the face of the user, and generate a user image with glasses removed.
In an embodiment of the present invention, the projection shooting module 100 is specifically configured to: projecting an infrared structure light source to the face of a user wearing glasses, and shooting an infrared structure light image modulated by the infrared structure light source through the face of the user after the infrared structure light source penetrates through the glasses; the demodulation generating module 200 is specifically configured to: demodulating a phase corresponding to a deformed position pixel in the infrared structured light image, and generating depth-of-field information of the face of the user according to the phase; depth of field information of the eyes of the user is extracted from the depth of field information of the face of the user.
In an embodiment of the present invention, the projection shooting module 100 is further specifically configured to: identifying a glasses wearing area of a user in a preview picture; adjusting the projection angle of the infrared projector according to the glasses wearing area; and projecting the infrared structure light source to the glasses wearing area, and shooting an infrared structure light image modulated by the eyes of the user after the infrared structure light source penetrates through the glasses.
In one embodiment of the present invention, the structural features of the infrared structured light source include: laser stripes, gray codes, sinusoidal stripes, uniform speckles, or non-uniform speckles.
As a possible implementation, in order to further improve the efficiency and accuracy of face restoration, as shown in fig. 5, the apparatus further includes, on the basis of fig. 4, an identification module 400 and an adjustment module 500.
The recognition module 400 is used to recognize the lens color of the glasses, and the adjustment module 500 is used to adjust the projection intensity of the infrared structure light source according to the lens color. Therefore, the user requirements are further met, and the user experience is improved.
It should be noted that the foregoing explanation of the method for restoring a human face with glasses removed is also applicable to the device for restoring a human face with glasses removed in the embodiment of the present invention, and details not disclosed in the embodiment of the present invention are not repeated herein.
In summary, in the face restoration device for removing glasses according to the embodiment of the present invention, an infrared structured light source is projected onto a user wearing glasses, and the infrared structured-light image modulated after the light source passes through the glasses is captured; the phase corresponding to the deformed-position pixels in the image is then demodulated, and depth-of-field information of the user's eyes is generated from the phase; finally, the user's eye contour is generated from that depth-of-field information, and color processing is performed on it according to the skin-color information of the user's face to generate a user image with the glasses removed. The ways of restoring a face are thereby enriched, and the accuracy and efficiency of face restoration are improved.
The embodiment of the invention also provides the terminal equipment. The above-mentioned terminal includes therein an image processing circuit, which may be implemented using hardware and/or software components, and may include various processing units defining an ISP (image signal processing) pipeline. Fig. 6 is a schematic structural diagram of an image processing circuit in the terminal according to an embodiment of the present invention. As shown in fig. 6, for ease of explanation, only aspects of the image processing techniques associated with embodiments of the present invention are shown.
As shown in fig. 6, the image processing circuit includes an imaging device 1010, an ISP processor 1030, and control logic 1040. The imaging device 1010 may include a camera with one or more lenses 1012, an image sensor 1014, and a structured light projector 1016. The structured light projector 1016 projects structured light onto an object to be measured. The structured light pattern may be a laser stripe, a gray code, a sinusoidal stripe, or a randomly arranged speckle pattern. The image sensor 1014 captures a structured light image projected onto the object to be measured, and transmits the structured light image to the ISP processor 1030, and the ISP processor 1030 demodulates the structured light image to acquire depth information of the object to be measured. At the same time, the image sensor 1014 can also capture color information of the object under test. Of course, the two image sensors 1014 may capture the structured light image and the color information of the object to be measured, respectively.
Taking speckle structured light as an example, the ISP processor 1030 demodulates the structured-light image by acquiring the speckle image of the measured object from it, performing image-data calculation on the speckle image and a reference speckle image according to a predetermined algorithm, and obtaining the displacement of each scattered spot of the speckle image on the measured object relative to the corresponding reference spot in the reference speckle image. The depth value of each scattered spot of the speckle image is then obtained by a triangulation-based conversion, and the depth information of the measured object is obtained from these depth values.
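The spot-displacement and triangulation steps can be sketched as follows; the SAD patch matcher and the numeric constants are illustrative assumptions of ours, not the predetermined algorithm referred to in the patent:

```python
import numpy as np

def patch_shift(captured_row, reference_row, x, half=4, search=10):
    """Find the horizontal shift of a small patch around column x in
    the captured speckle row relative to the reference row, by
    sum-of-absolute-differences (SAD) matching over a search window."""
    patch = captured_row[x - half:x + half + 1]
    best_shift, best_cost = 0, np.inf
    for s in range(-search, search + 1):
        cand = reference_row[x - half + s:x + half + 1 + s]
        cost = np.abs(patch - cand).sum()
        if cost < best_cost:
            best_cost, best_shift = cost, s
    return best_shift

def depth_from_shift(shift_px, focal_px, baseline_mm):
    """Classic triangulation: depth Z = f * b / d, where the measured
    spot shift (in pixels) plays the role of the disparity d."""
    return focal_px * baseline_mm / shift_px
```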
Of course, the depth image information may also be acquired by binocular vision or by time-of-flight (TOF) methods; the present embodiment is not limited in this respect, and any method by which the depth information of the measured object can be acquired or calculated falls within its scope.
After ISP processor 1030 receives the color information of the measured object captured by image sensor 1014, it may process the corresponding image data. ISP processor 1030 analyzes the image data to obtain image statistics that may be used to determine one or more control parameters of imaging device 1010. Image sensor 1014 may include a color filter array (e.g., a Bayer filter), with which it can acquire the light intensity and wavelength information captured by each imaging pixel and provide a set of raw image data that can be processed by ISP processor 1030.
ISP processor 1030 processes raw image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and ISP processor 1030 may perform one or more image processing operations on the raw image data, collecting image statistics about the image data. Wherein the image processing operations may be performed with the same or different bit depth precision.
ISP processor 1030 may also receive pixel data from image memory 1020. Image memory 1020 may be part of a memory device, a storage device, or a separate dedicated memory within the electronic device, and may include a DMA (direct memory access) feature.
Upon receiving the raw image data, ISP processor 1030 may perform one or more image processing operations.
After acquiring the color information and the depth information of the measured object, ISP processor 1030 may fuse them to obtain a three-dimensional image. Features of the measured object may be extracted by an appearance-contour extraction method, a contour-feature extraction method, or both — for example by the active shape model (ASM), active appearance model (AAM), principal component analysis (PCA), or discrete cosine transform (DCT) methods, without limitation. The features extracted from the depth information and those extracted from the color information are then registered and fused. The fusion may directly combine the two feature sets, may combine the same features from different images after setting weights, or may generate the three-dimensional image from the fused features in some other fusion mode.
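The weighted variant of the fusion described above — combining the same features from different images after setting weights — can be sketched as follows. The weights and the simple concatenation alternative are illustrative assumptions.

```python
import numpy as np

def fuse_features(depth_feat, color_feat, w_depth=0.4, w_color=0.6):
    """Weighted fusion of registered feature vectors extracted from the
    depth information and the color information (weights illustrative)."""
    depth_feat = np.asarray(depth_feat, dtype=float)
    color_feat = np.asarray(color_feat, dtype=float)
    assert depth_feat.shape == color_feat.shape, "features must be registered"
    return w_depth * depth_feat + w_color * color_feat

def concat_features(depth_feat, color_feat):
    """Direct combination: stack the two feature sets side by side."""
    return np.concatenate([np.asarray(depth_feat), np.asarray(color_feat)])
```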
The image data of the three-dimensional image may be sent to image memory 1020 for additional processing before being displayed. ISP processor 1030 receives processed data from image memory 1020 and performs image data processing on it in the raw domain and in the RGB and YCbCr color spaces. The image data may be output to display 1060 for viewing by the user and/or for further processing by a graphics engine or GPU (graphics processing unit). In addition, the output of ISP processor 1030 may be sent to image memory 1020, from which display 1060 may read the image data. In one embodiment, image memory 1020 may be configured to implement one or more frame buffers. The output of ISP processor 1030 may also be transmitted to encoder/decoder 1050, which may be implemented by a CPU, a GPU, or a coprocessor, to encode/decode the image data; the encoded image data may be saved and decompressed before being displayed on display 1060.
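The frame-buffer arrangement mentioned above — the ISP writing completed frames into image memory while the display reads them back — can be modeled with a minimal buffer rotation. This is a schematic sketch of the producer/consumer pattern only, not the actual hardware interface.

```python
from collections import deque

class FrameBuffers:
    """Minimal model of N frame buffers in image memory: the ISP writes into
    a free buffer, the display consumes the oldest completed frame."""
    def __init__(self, n=2):
        self.free = deque(range(n))   # buffer indices available for writing
        self.ready = deque()          # buffers holding completed frames

    def write_frame(self):
        idx = self.free.popleft()     # ISP claims a free buffer
        self.ready.append(idx)        # frame is now ready for display
        return idx

    def read_frame(self):
        idx = self.ready.popleft()    # display consumes the oldest frame
        self.free.append(idx)         # buffer returns to the free pool
        return idx
```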
The image statistics determined by ISP processor 1030 may be sent to the control logic unit 1040. Control logic 1040 may include a processor and/or microcontroller executing one or more routines (e.g., firmware) that determine control parameters of imaging device 1010 based on the received image statistics.
With the image processing technology of FIG. 6, the face restoration method for removing glasses may be implemented through the following steps:
projecting an infrared structure light source to a user wearing glasses, and shooting an infrared structure light image modulated after the infrared structure light source penetrates through the glasses;
demodulating a phase corresponding to a deformed position pixel in the infrared structured light image, and generating depth-of-field information of the user eye according to the phase;
generating an eye contour of the user according to the depth of field information of the eyes of the user, and performing color processing on the eye contour according to the skin color information of the face of the user to generate a user image with glasses removed.
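The three steps above can be sketched end to end as follows. Everything here is a hypothetical illustration: the phase-demodulation function is supplied by the caller, and the depth threshold used to segment the eye contour and the skin-color fill are assumptions, not the patent's actual algorithm.

```python
import numpy as np

def remove_glasses(ir_image, face_image, skin_color, depth_from_phase,
                   eye_threshold=30.0):
    """Hypothetical sketch of the method: demodulated depth segments the
    eye region, which is then filled with the face's skin color.

    depth_from_phase: caller-supplied function that demodulates the phase
    of the deformed structured-light pattern into a depth map (mm).
    """
    depth = depth_from_phase(ir_image)            # phase -> depth of field
    median = np.median(depth)
    eye_mask = depth > median + eye_threshold     # eye contour: pixels that
                                                  # stand off the face surface
    restored = face_image.copy()
    restored[eye_mask] = skin_color               # color-process the contour
    return restored
```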
In order to implement the above embodiments, the present invention also proposes a non-transitory computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, is capable of implementing the method for restoring a human face by removing glasses according to the foregoing embodiments.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.

Claims (10)

1. A face restoration method for removing glasses is characterized by comprising the following steps:
projecting an infrared structure light source to a user wearing glasses, and acquiring the lens color of the glasses, wherein the structure light parameter of the infrared structure light source is the same as the structure light processing parameter when the verification feature information is registered through structure light processing;
determining the projection intensity of the infrared structure light source according to the lens color;
shooting an infrared structure light image modulated after the infrared structure light source penetrates through the glasses;
demodulating a phase corresponding to a deformed position pixel in the infrared structured light image, and generating depth-of-field information of the user eye according to the phase;
generating an eye contour of the user according to the depth of field information of the eyes of the user, and performing color processing on the eye contour according to the skin color information of the face of the user to generate a user image with glasses removed.
2. The method of claim 1, wherein projecting the structured infrared light source to a user wearing the eyewear and capturing an image of the structured infrared light modulated by the structured infrared light source after transmission through the eyewear comprises:
projecting an infrared structure light source to the face of a user wearing glasses, and shooting an infrared structure light image modulated by the face of the user after the infrared structure light source penetrates through the glasses;
the demodulating a phase corresponding to a deformed position pixel in the infrared structured light image and generating depth-of-field information of the eyes of the user according to the phase comprises:
demodulating a phase corresponding to a deformed position pixel in the infrared structured light image, and generating depth-of-field information of the user face according to the phase;
and extracting the depth of field information of the eyes of the user from the depth of field information of the face of the user.
3. The method of claim 1, wherein projecting the structured infrared light source to a user wearing the eyewear and capturing an image of the structured infrared light modulated by the structured infrared light source after transmission through the eyewear comprises:
identifying a glasses wearing area of a user in a preview picture;
adjusting the projection angle of an infrared projector according to the glasses wearing area;
and projecting an infrared structure light source to the glasses wearing area, and shooting an infrared structure light image modulated by the user eyes after the infrared structure light source penetrates through the glasses.
4. The method of claim 1, further comprising:
identifying a lens color of the eyewear;
and adjusting the projection intensity of the infrared structure light source according to the color of the lens.
5. The method of claim 1, wherein the structural features of the infrared structured light source comprise:
laser stripes, gray codes, sinusoidal stripes, uniform speckles, or non-uniform speckles.
6. A face restoration device for removing eyeglasses, comprising:
the projection shooting module is used for projecting an infrared structure light source to a user wearing glasses and acquiring the lens color of the glasses, wherein the structure light parameter of the infrared structure light source is the same as the structure light processing parameter when the verification feature information is registered through structure light processing;
the projection shooting module is further used for determining the projection intensity of the infrared structure light source according to the lens color and shooting an infrared structure light image modulated after the infrared structure light source penetrates through the glasses;
the demodulation generation module is used for demodulating a phase corresponding to a deformed position pixel in the infrared structured light image and generating depth-of-field information of the user eye according to the phase;
and the generating module is used for generating an eye contour of the user according to the depth of field information of the eyes of the user, carrying out color processing on the eye contour according to the skin color information of the face of the user, and generating a user image with glasses removed.
7. The apparatus of claim 6, wherein the projection capture module is specifically configured to:
projecting an infrared structure light source to the face of a user wearing glasses, and shooting an infrared structure light image modulated by the face of the user after the infrared structure light source penetrates through the glasses;
the demodulation generation module is specifically configured to:
demodulating a phase corresponding to a deformed position pixel in the infrared structured light image, and generating depth-of-field information of the user face according to the phase;
and extracting the depth of field information of the eyes of the user from the depth of field information of the face of the user.
8. The apparatus of claim 6, wherein the projection capture module is further specifically configured to:
identifying a glasses wearing area of a user in a preview picture;
adjusting the projection angle of an infrared projector according to the glasses wearing area;
and projecting an infrared structure light source to the glasses wearing area, and shooting an infrared structure light image modulated by the user eyes after the infrared structure light source penetrates through the glasses.
9. A terminal device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the program, implements the method for restoring a human face by removing glasses according to any one of claims 1 to 5.
10. A non-transitory computer-readable storage medium having stored thereon a computer program, wherein the program, when executed by a processor, implements the method for restoring a human face by removing glasses according to any one of claims 1 to 5.
CN201710676984.9A 2017-08-09 2017-08-09 Face restoration method and device for removing glasses Active CN107563302B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710676984.9A CN107563302B (en) 2017-08-09 2017-08-09 Face restoration method and device for removing glasses

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710676984.9A CN107563302B (en) 2017-08-09 2017-08-09 Face restoration method and device for removing glasses

Publications (2)

Publication Number Publication Date
CN107563302A CN107563302A (en) 2018-01-09
CN107563302B true CN107563302B (en) 2020-10-02

Family

ID=60974392

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710676984.9A Active CN107563302B (en) 2017-08-09 2017-08-09 Face restoration method and device for removing glasses

Country Status (1)

Country Link
CN (1) CN107563302B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109445583A (en) * 2018-10-19 2019-03-08 Oppo广东移动通信有限公司 Page control method, device and mobile terminal
CN114827561B (en) * 2022-03-07 2023-03-28 成都极米科技股份有限公司 Projection control method, projection control device, computer equipment and computer-readable storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103020579A (en) * 2011-09-22 2013-04-03 上海银晨智能识别科技有限公司 Face recognition method and system, and removing method and device for glasses frame in face image
CN106991377A (en) * 2017-03-09 2017-07-28 广东欧珀移动通信有限公司 With reference to the face identification method, face identification device and electronic installation of depth information

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106991379B (en) * 2017-03-09 2020-07-10 Oppo广东移动通信有限公司 Human skin recognition method and device combined with depth information and electronic device

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103020579A (en) * 2011-09-22 2013-04-03 上海银晨智能识别科技有限公司 Face recognition method and system, and removing method and device for glasses frame in face image
CN106991377A (en) * 2017-03-09 2017-07-28 广东欧珀移动通信有限公司 With reference to the face identification method, face identification device and electronic installation of depth information

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research of spectacle frame measurement system based on structured light method;Dong Guan等;《PROCEEDINGS OF SPIE》;20061009;第1-8页 *

Also Published As

Publication number Publication date
CN107563302A (en) 2018-01-09

Similar Documents

Publication Publication Date Title
CN107368730B (en) Unlocking verification method and device
CN107480613B (en) Face recognition method and device, mobile terminal and computer readable storage medium
CN107025635B (en) Depth-of-field-based image saturation processing method and device and electronic device
CN107464280B (en) Matching method and device for user 3D modeling
CN107563304B (en) Terminal equipment unlocking method and device and terminal equipment
CN107734267B (en) Image processing method and device
CN104776815B (en) A kind of color three dimension contour outline measuring set and method based on Darman raster
CN107797664B (en) Content display method and device and electronic device
CN107564050B (en) Control method and device based on structured light and terminal equipment
CN107590828B (en) Blurring processing method and device for shot image
CN107517346B (en) Photographing method and device based on structured light and mobile device
US11138740B2 (en) Image processing methods, image processing apparatuses, and computer-readable storage medium
CN107483815B (en) Method and device for shooting moving object
CN107610171B (en) Image processing method and device
CN107734264B (en) Image processing method and device
CN108564540B (en) Image processing method and device for removing lens reflection in image and terminal equipment
CN107509043B (en) Image processing method, image processing apparatus, electronic apparatus, and computer-readable storage medium
CN107392874B (en) Beauty treatment method and device and mobile equipment
CN107451561A (en) Iris recognition light compensation method and device
CN107491744B (en) Human body identity recognition method and device, mobile terminal and storage medium
CN107610127B (en) Image processing method, image processing apparatus, electronic apparatus, and computer-readable storage medium
CN107454336B (en) Image processing method and apparatus, electronic apparatus, and computer-readable storage medium
CN107613239B (en) Video communication background display method and device
CN107592491B (en) Video communication background display method and device
CN107563302B (en) Face restoration method and device for removing glasses

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: Changan town in Guangdong province Dongguan 523860 usha Beach Road No. 18

Applicant after: OPPO Guangdong Mobile Communications Co., Ltd.

Address before: Changan town in Guangdong province Dongguan 523860 usha Beach Road No. 18

Applicant before: Guangdong OPPO Mobile Communications Co., Ltd.

GR01 Patent grant