CN111723651B - Face recognition method, face recognition device and terminal equipment - Google Patents

Face recognition method, face recognition device and terminal equipment

Info

Publication number
CN111723651B
Authority
CN
China
Prior art keywords
face
visible light
near infrared
image sample
image
Prior art date
Legal status
Active
Application number
CN202010387768.4A
Other languages
Chinese (zh)
Other versions
CN111723651A (en)
Inventor
李治农
何柳青
林晓清
曹娜
Current Assignee
Entropy Technology Co Ltd
Original Assignee
Entropy Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Entropy Technology Co Ltd
Priority to CN202010387768.4A
Publication of CN111723651A
Application granted
Publication of CN111723651B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 - Detection; Localisation; Normalisation
    • G06V40/167 - Detection; Localisation; Normalisation using comparisons between temporally consecutive images
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/22 - Matching criteria, e.g. proximity measures

Abstract

The application is applicable to the technical field of image processing, and provides a face recognition method, a face recognition device and terminal equipment, wherein the method comprises the following steps: obtaining a visible light image and a near infrared image; extracting face features from the visible light image and the near infrared image respectively to obtain visible light face feature information and near infrared face feature information; performing face recognition based on the visible light face feature information, and if a matching visible light face image sample exists, determining the registered user associated with the matching visible light face image sample as the face recognition result; otherwise, performing face recognition based on the near infrared face feature information; if a matching near infrared face image sample exists, determining the registered user associated with the matching near infrared face image sample as the face recognition result, and if not, outputting a reminding message. By this method, face recognition can be achieved for people of all skin tones worldwide.

Description

Face recognition method, face recognition device and terminal equipment
Technical Field
The application belongs to the technical field of image processing, and particularly relates to a face recognition method, a face recognition device, terminal equipment and a computer readable storage medium.
Background
In the related art, visible light face recognition technology has high usability and supports large-capacity face database recognition, but it is easily affected by illumination conditions and adapts poorly across skin tones: for example, the recognition rate for yellow-skinned individuals is high, while the recognition rates for white-skinned, brown-skinned and intermediate-skin-tone individuals are lower, and dark-skinned individuals can barely be recognized at all. Therefore, how to overcome the influence of illumination variation and facial skin color has become a difficult problem.
Disclosure of Invention
In view of the above, the present application provides a face recognition method, a face recognition device, a terminal device, and a computer readable storage medium, which can achieve face recognition for people of all skin tones worldwide and can meet the requirements of large-capacity database recognition.
In a first aspect, the present application provides a face recognition method, including:
obtaining a visible light image and a near infrared image, wherein the visible light image and the near infrared image are images obtained by shooting the same scene;
face feature extraction is respectively carried out on the visible light image and the near infrared image so as to obtain visible light face feature information of the visible light image and near infrared face feature information of the near infrared image;
Performing face recognition based on the visible light face characteristic information;
if the matching visible light face image sample exists, determining registered users associated with the matching visible light face image sample as a face recognition result, wherein the matching visible light face image sample is a visible light face image sample matched with the visible light face characteristic information in a preset face database, and each visible light face image sample in the face database is respectively associated with different registered users;
if the matching visible light face image sample does not exist, face recognition is carried out based on the near infrared face characteristic information;
if the matching near infrared face image sample exists, determining registered users associated with the matching near infrared face image sample as a face recognition result, wherein the matching near infrared face image sample is a near infrared face image sample matched with the near infrared face feature information in the face database, and each near infrared face image sample in the face database is respectively associated with different registered users;
and if the matching near infrared face image sample does not exist, outputting a reminding message to remind that the face recognition fails.
In a second aspect, the present application provides a face recognition apparatus, comprising:
an image acquisition unit, configured to acquire a visible light image and a near infrared image, where the visible light image and the near infrared image are images obtained by shooting a same scene;
the feature extraction unit is used for extracting face features of the visible light image and the near infrared image respectively to obtain visible light face feature information of the visible light image and near infrared face feature information of the near infrared image;
the visible light face recognition unit is used for recognizing the face based on the visible light face characteristic information;
a first judging unit, configured to determine, if there is a matching visible light face image sample, a registered user associated with the matching visible light face image sample as a result of face recognition, where the matching visible light face image sample is a visible light face image sample matched with the visible light face feature information in a preset face database, and each visible light face image sample in the face database is associated with a different registered user;
the second judging unit is used for carrying out face recognition based on the near infrared face characteristic information if the matching visible light face image sample does not exist;
A third judging unit, configured to determine, if there is a matching near-infrared face image sample, a registered user associated with the matching near-infrared face image sample as a face recognition result, where the matching near-infrared face image sample is a near-infrared face image sample in the face database that matches the near-infrared face feature information, and each near-infrared face image sample in the face database is associated with a different registered user;
and the fourth judging unit is used for outputting a reminding message to remind the face recognition failure if the matching near infrared face image sample does not exist.
In a third aspect, the present application provides a terminal device comprising a memory, a processor and a computer program stored in said memory and executable on said processor, said processor implementing the method as provided in the first aspect when executing said computer program.
In a fourth aspect, the present application provides a computer readable storage medium storing a computer program which, when executed by a processor, implements a method as provided in the first aspect.
In a fifth aspect, the present application provides a computer program product for causing a terminal device to carry out the method provided in the first aspect above, when the computer program product is run on the terminal device.
From the above, in the scheme of the application, a visible light image and a near infrared image are first obtained, and face feature extraction is carried out on the visible light image and the near infrared image respectively, so as to obtain visible light face feature information of the visible light image and near infrared face feature information of the near infrared image; face recognition is then carried out based on the visible light face feature information: if a matching visible light face image sample exists, the registered user associated with the matching visible light face image sample is determined as the face recognition result; if the matching visible light face image sample does not exist, face recognition is carried out based on the near infrared face feature information; if a matching near infrared face image sample exists, the registered user associated with the matching near infrared face image sample is determined as the face recognition result; and if the matching near infrared face image sample does not exist, a reminding message is output. The scheme of the application fuses, cross-registers and recognizes near infrared faces and visible light faces, and can achieve face recognition for people of all skin tones worldwide.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the embodiments or in the description of the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and that other drawings can be obtained from these drawings by a person of ordinary skill in the art without inventive effort.
Fig. 1 is a schematic flow chart of a face recognition method according to an embodiment of the present application;
fig. 2 is a block diagram of a face recognition method provided by an embodiment of the present application;
fig. 3 is a schematic structural diagram of a face recognition device according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, techniques, etc., in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in the present specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in the present description and the appended claims, the term "if" may be interpreted, depending on the context, as "when", "once", "in response to determining", or "in response to detecting". Similarly, the phrases "if it is determined" or "if [a described condition or event] is detected" may be interpreted, depending on the context, as meaning "upon determining", "in response to determining", "upon detecting [the described condition or event]", or "in response to detecting [the described condition or event]".
Furthermore, the terms "first," "second," "third," and the like in the description of the present specification and in the appended claims are used only to distinguish between descriptions and are not to be construed as indicating or implying relative importance.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
Fig. 1 shows a flowchart of a face recognition method according to an embodiment of the present application, which is described in detail below:
step 101, obtaining visible light images and near infrared images;
in the embodiment of the application, the visible light image is obtained by shooting by a visible light sensor, the near infrared image is obtained by shooting by a near infrared sensor, and the visible light image and the near infrared image are obtained by shooting the same scene by corresponding sensors. For example, the visible light sensor and the near infrared sensor may be installed at the same position, and lenses of the visible light sensor and the near infrared sensor are aligned in the same direction. When the user to be identified appears in the direction aligned with the lens, the visible light sensor and the near infrared sensor are triggered to shoot so as to obtain a visible light image and a near infrared image. Optionally, after the visible light image and the near infrared image are acquired, face detection may be performed on the visible light image and the near infrared image, respectively, so as to detect whether the visible light image and the near infrared image contain a face; if the face detection of the visible light image fails, re-acquiring the visible light image; and if the near infrared image face detection fails, re-acquiring the near infrared image.
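For illustration only, the capture-and-detect logic above can be sketched as follows; the sensor objects, the detect_face helper and the retry limit are assumptions introduced for this example and are not part of the application:

def acquire_image_pair(vis_sensor, nir_sensor, detect_face, max_retries=3):
    # Minimal sketch: capture a visible light / near infrared pair of the same
    # scene and re-acquire any image in which face detection fails.
    vis_img = vis_sensor.capture()
    nir_img = nir_sensor.capture()
    for _ in range(max_retries):
        if not detect_face(vis_img):
            vis_img = vis_sensor.capture()      # re-acquire only the visible light image
            continue
        if not detect_face(nir_img):
            nir_img = nir_sensor.capture()      # re-acquire only the near infrared image
            continue
        return vis_img, nir_img                 # both images contain a face
    return None                                 # no usable image pair within the retry budget

In practice the two sensors would be triggered by the same event, so that both frames show the user to be identified at the same moment and from the same angle.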
Optionally, in the process of acquiring the visible light image and the near infrared image in the above step 101, some preprocessing operations may be performed on the visible light image and the near infrared image, such as image quality detection, living body detection, and near infrared anti-counterfeiting. Image quality detection checks whether the quality of the visible light image and the near infrared image is acceptable; living body detection determines whether the face in the visible light image belongs to a living body, effectively distinguishing a real face from a photo; and near infrared anti-counterfeiting determines whether the face in the near infrared image belongs to a living body. Referring to fig. 2, the specific flow of the above step 101 may be as follows (a sketch in code of this flow is given after step S11 below):
step S1, obtaining a visible light image, detecting the image quality of the visible light image, and jumping to step S3 if the quality detection is qualified; if the quality detection is not qualified, jumping to the step S2;
Step S2, judging whether the current comparison mode is the normal comparison mode or the hybrid comparison mode; if the current comparison mode is the normal comparison mode, outputting a reminding message to remind that the face recognition fails, and stopping the subsequent steps; if the current comparison mode is the hybrid comparison mode, jumping to step S3; the comparison mode is preset by the user according to requirements and controls the working mode of the face recognition method in the embodiment of the application;
Step S3, detecting whether a living body detection function based on the visible light image is opened, and if the living body detection function is opened, jumping to the step S4; if the living body detection function is not on, jumping to step S6;
step S4, performing image quality detection on the visible light image, and if the quality detection is qualified, jumping to the step S5; if the quality detection is not qualified, outputting a reminding message to remind that the face recognition fails;
step S5, performing living body detection based on the visible light image, and if the living body detection is successful, jumping to step S6; if the living body detection fails, outputting a reminding message to remind that the face recognition fails;
step S6, detecting whether the near infrared sensor is turned on, and if the near infrared sensor is turned on, jumping to step S7; if the near infrared sensor is not on, jumping to step S11;
step S7, detecting whether the anti-counterfeiting function based on the near infrared image is closed or not, and whether the current comparison mode is a common comparison mode or not, if the anti-counterfeiting function is closed and the current comparison mode is the common comparison mode, jumping to the step S11; if the anti-counterfeiting function is opened and/or the current comparison mode is not the common comparison mode, jumping to the step S8;
Step S8, acquiring a near infrared image, detecting whether a near infrared anti-counterfeiting function is opened, and if the near infrared anti-counterfeiting function is opened, jumping to step S9; if the near infrared anti-counterfeiting function is not opened, jumping to the step S11;
step S9, respectively carrying out image quality detection on the visible light image and the near infrared image, and if the quality detection of the visible light image and the near infrared image is qualified, jumping to the step S10; if the quality detection of the visible light image and/or the near infrared image is not qualified, outputting a reminding message to remind that the face recognition fails;
step S10, near infrared anti-counterfeiting detection is carried out based on the near infrared image, and if the near infrared anti-counterfeiting detection is successful, the step S11 is skipped; if the near infrared anti-counterfeiting detection fails, outputting a reminding message to remind that the face recognition fails;
step S11 is the following step 102.
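Purely as a reading aid, the branching of steps S1 to S11 can be collected into a single routine. Everything below (the configuration flags, capture callbacks and check helpers) is a hedged sketch of the flow just described, with names invented for this example:

def preprocess(cfg, capture_vis, capture_nir, quality_ok, is_live, nir_genuine):
    # Returns (visible light image, near infrared image or None) when the flow
    # reaches S11, or None when a face recognition failure should be reminded.
    vis_img = capture_vis()                                   # S1: obtain the visible light image
    if not quality_ok(vis_img):
        if cfg["mode"] == "normal":                           # S2: normal mode -> fail
            return None
        # hybrid comparison mode: continue to S3 despite the failed quality check
    if cfg["vis_liveness_on"]:                                # S3
        if not quality_ok(vis_img):                           # S4
            return None
        if not is_live(vis_img):                              # S5
            return None
    nir_img = None
    if cfg["nir_sensor_on"]:                                  # S6
        # S7: skip near infrared processing only if anti-counterfeiting is off
        # and the current comparison mode is the normal comparison mode
        if cfg["nir_anti_counterfeit_on"] or cfg["mode"] != "normal":
            nir_img = capture_nir()                           # S8
            if cfg["nir_anti_counterfeit_on"]:
                if not (quality_ok(vis_img) and quality_ok(nir_img)):   # S9
                    return None
                if not nir_genuine(nir_img):                  # S10
                    return None
    return vis_img, nir_img                                   # S11: continue with step 102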
Step 102, face feature extraction is performed on the visible light image and the near infrared image respectively to obtain visible light face feature information of the visible light image and near infrared face feature information of the near infrared image;
in the embodiment of the application, the face features of the user to be identified are extracted from the visible light image and the near infrared image respectively, so as to obtain the visible light face feature information of the visible light image and the near infrared face feature information of the near infrared image. The facial features include, but are not limited to, one or more of skin tone features, eye features, nose features, lip features, and facial shape features. The visible light face characteristic information is information for representing face characteristics of a user to be identified extracted from a visible light face image; the near-infrared face feature information is information for characterizing face features of a user to be identified extracted from a near-infrared face image. Alternatively, the face feature extraction process may be implemented by a trained neural network model. That is, the visible light image and the near infrared image are input to the neural network model, and face features are extracted from the visible light image and the near infrared image, respectively, by the neural network model.
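As a small illustration of this step, feature extraction for both images might look like the sketch below; the .embed interface of the trained models and the normalisation step are assumptions made for this example:

def l2_normalize(vec):
    # Normalise an embedding so that cosine similarity reduces to a dot product.
    norm = sum(x * x for x in vec) ** 0.5
    return [x / norm for x in vec] if norm > 0 else list(vec)

def extract_features(vis_img, nir_img, vis_model, nir_model):
    # Sketch: run each image through its trained neural network and return
    # (visible light face feature information, near infrared face feature information);
    # an entry is None when the image is missing or no face feature was extracted.
    vis_feat = vis_model.embed(vis_img) if vis_img is not None else None
    nir_feat = nir_model.embed(nir_img) if nir_img is not None else None
    return (l2_normalize(vis_feat) if vis_feat is not None else None,
            l2_normalize(nir_feat) if nir_feat is not None else None)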
Optionally, referring to fig. 2, after the above step 102, the method further includes: judging whether the current comparison mode is the normal comparison mode or the hybrid comparison mode; if the current comparison mode is the normal comparison mode, judging whether the visible light face feature information is successfully extracted, and if the visible light face feature information is not extracted, outputting a reminding message to remind that the face recognition fails; if the visible light face feature information is successfully extracted, step 103 is executed: if a matching visible light face image sample exists, the registered user associated with the matching visible light face image sample is determined as the face recognition result; if the matching visible light face image sample does not exist, the recognition fails and a reminding message is output.
If the current comparison mode is the hybrid comparison mode, judging whether the visible light face feature information and the near infrared face feature information are successfully extracted; the judging result has the following four cases (a code sketch of this branching is given after the four cases):
1. if the visible light face characteristic information is not extracted and the near infrared face characteristic information is not extracted, outputting a reminding message to remind that the face recognition fails.
2. If only the visible light face feature information is extracted, executing step 103; if the matching visible light face image sample exists, determining registered users associated with the matching visible light face image sample as a face recognition result; if the matching visible light face image sample does not exist, the recognition fails, and a reminding message is output.
3. If only near infrared face feature information is extracted, steps similar to steps C1, C2, C3 and C4 are executed: respectively comparing each near infrared face image sample in the face database with the near infrared image to obtain a third similarity of each near infrared face image sample relative to the near infrared image; determining the maximum value of the obtained third similarities as the maximum third similarity; comparing the maximum third similarity with a preset second near-infrared similarity threshold; if the maximum third similarity is greater than a second near-infrared similarity threshold, determining a registered user associated with a near-infrared face image sample corresponding to the maximum third similarity as a face recognition result; otherwise, the identification fails, and a reminding message is output.
4. If the visible light face feature information and the near infrared face feature information are simultaneously extracted, the following steps 103, 104, 105, 106 and 107 are performed.
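The branching above can be summarised in code as follows; the three recognizer callables stand for step 103, steps C1 to C4, and the full steps 103 to 107 respectively, and all names here are placeholders for this sketch:

def dispatch(mode, vis_feat, nir_feat, recognize_visible, recognize_nir_only, recognize_full):
    # Returns the registered user determined as the face recognition result,
    # or None when a reminding message should be output instead.
    if mode == "normal":
        if vis_feat is None:
            return None                                    # extraction failed
        return recognize_visible(vis_feat)                 # step 103 and its outcome
    # hybrid comparison mode
    if vis_feat is None and nir_feat is None:              # case 1: nothing extracted
        return None
    if vis_feat is not None and nir_feat is None:          # case 2: only visible light features
        return recognize_visible(vis_feat)
    if vis_feat is None:                                   # case 3: only near infrared features
        return recognize_nir_only(nir_feat)
    return recognize_full(vis_feat, nir_feat)              # case 4: steps 103 to 107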
Optionally, the step 102 specifically includes:
s111, performing image quality detection on the visible light image and performing image quality detection on the near infrared image;
S112, when the image quality of the visible light image passes detection, extracting facial features of the visible light image;
s113, when the image quality detection of the near infrared image passes, face feature extraction is performed on the near infrared image.
Wherein the image quality detection includes quality judgments such as blur, stretching, noise, brightness, exposure, angle, color, inversion, occlusion, face size and the like. Image quality detection is performed on the visible light image and the near infrared image respectively; when the image quality detection of the visible light image passes, face feature extraction is performed on the visible light image; and when the image quality detection of the near infrared image passes, face feature extraction is performed on the near infrared image. Optionally, when the image quality detection of the near infrared image passes, it is further determined whether the current comparison mode is the normal comparison mode, and if not, face feature extraction is performed on the near infrared image.
Step 103, carrying out face recognition based on the visible light face characteristic information;
In the embodiment of the application, the visible light face feature information characterizes the face features of the user to be identified extracted from the visible light image, and a matching visible light face image sample is searched for in a preset face database according to the visible light face feature information. Specifically, a plurality of visible light face image samples are stored in the face database, each visible light face image sample is associated with a different registered user, and each visible light face image sample is a visible light image of the face of its associated registered user. The matching visible light face image sample is the one among the visible light face image samples that matches the visible light face feature information.
Optionally, the step 103 specifically includes:
a1, based on the visible light face characteristic information, respectively comparing each visible light face image sample in the face database with the visible light image to obtain first similarity of each visible light face image sample relative to the visible light image;
a2, determining the maximum value in the obtained first similarity as the maximum first similarity;
a3, comparing the maximum first similarity with a preset first visible light similarity threshold;
and A4, if the maximum first similarity is larger than the first visible light similarity threshold, determining the visible light face image sample corresponding to the maximum first similarity as the matching visible light face image sample.
Specifically, the face database stores the feature information of each visible light face image sample. The similarity between the feature information of each visible light face image sample and the visible light face feature information of the visible light image is then calculated, and the calculated similarity is recorded as the first similarity of that visible light face image sample relative to the visible light image. The maximum value among the obtained first similarities is selected as the maximum first similarity. The first visible light similarity threshold is preset in the application; if the maximum first similarity is greater than the first visible light similarity threshold, the visible light face image sample corresponding to the maximum first similarity is considered to be very similar to the visible light image, the risk of false recognition is negligible, and that visible light face image sample can be directly determined as the matching visible light face image sample.
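A minimal sketch of steps A1 to A4 is given below; cosine similarity and the db.visible_features mapping (registered user to stored feature vector) are assumptions chosen for the example, since the application does not prescribe a particular similarity measure or storage layout:

def cosine(a, b):
    # Assumed similarity measure for the sketches in this description.
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na > 0 and nb > 0 else 0.0

def match_visible(vis_feat, db, first_vis_threshold):
    # A1: first similarity of every visible light face image sample to the probe image
    first_sims = {user: cosine(vis_feat, feat) for user, feat in db.visible_features.items()}
    if not first_sims:
        return None, first_sims
    best_user = max(first_sims, key=first_sims.get)        # A2: maximum first similarity
    if first_sims[best_user] > first_vis_threshold:        # A3 / A4: accept only above threshold
        return best_user, first_sims                       # matching visible light sample found
    return None, first_sims                                # no matching visible light sample

The per-sample first similarities are returned as well, because the fallback of steps B1 to B5 described below reuses them to build the first set.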
Step 104, if a matching visible light face image sample exists, determining a registered user associated with the matching visible light face image sample as a face recognition result;
in the embodiment of the application, if the matching visible light face image sample exists in the face database, the user to be identified is one of registered users, namely, the user to be identified and the registered user associated with the matching visible light face image sample are the same person. At this time, the face recognition is successful, and the registered user associated with the matching visible face image sample is determined as the result of the face recognition.
Step 105, if the matching visible light face image sample does not exist, face recognition is performed based on the near infrared face feature information;
In the embodiment of the application, if no matching visible light face image sample exists in the face database, a matching near infrared face image sample is searched for in the face database according to the near infrared face feature information. Specifically, a plurality of near infrared face image samples are stored in the face database, each near infrared face image sample is associated with a different registered user, and each near infrared face image sample is a near infrared image of the face of its associated registered user. The matching near infrared face image sample is the one among the near infrared face image samples that matches the near infrared face feature information.
Optionally, the face recognition based on the near infrared face feature information specifically includes:
b1, screening a first set from the face database;
b2, determining one candidate visible light face image sample in the first set as a target visible light face image sample;
b3, calculating a second similarity between the near-infrared image and a target near-infrared face image sample based on the near-infrared face characteristic information;
b4, if the second similarity is larger than a preset first near-infrared similarity threshold, determining the target near-infrared face image sample as the matching near-infrared face image sample;
and B5, if the second similarity is smaller than or equal to the first near infrared similarity threshold, returning to the step of determining one candidate visible light face image sample in the first set as the target visible light face image sample and the subsequent steps.
Specifically, the face database includes a plurality of visible face image samples and a plurality of near infrared face image samples, where each visible face image sample corresponds to a near infrared face image sample, and near infrared face image samples corresponding to different visible face image samples are different. The visible light face image sample and the near infrared face image sample with the corresponding relation are face images of the same registered user. That is, in the face database, there is a set of images for each registered user, and each set of images includes a visible face image sample and a near infrared face image sample. And if the matching visible light face image samples do not exist, screening a first set from all the visible light face image samples in the face database, wherein the first set is composed of a plurality of candidate visible light face image samples. The first similarity corresponding to each candidate visible light face image sample is larger than a preset second visible light similarity threshold value. In the embodiment of the application, the second visible light similarity threshold is set smaller than the first visible light similarity threshold.
Further, a candidate visible light face image sample is selected from the first set to serve as the target visible light face image sample. Then, according to the correspondence between visible light face image samples and near infrared face image samples in the face database, the near infrared face image sample corresponding to the target visible light face image sample is obtained and recorded as the target near infrared face image sample. Face features are extracted from the target near infrared face image sample to obtain its feature information. The similarity between the feature information of the target near infrared face image sample and the near infrared face feature information of the near infrared image is then calculated and recorded as the second similarity between the target near infrared face image sample and the near infrared image. The second similarity is compared with the preset first near infrared similarity threshold; if the second similarity is greater than the first near infrared similarity threshold, the target near infrared face image sample is determined as the matching near infrared face image sample; if the second similarity is less than or equal to the first near infrared similarity threshold, the flow returns to step B2, another candidate visible light face image sample is selected from the first set as the target visible light face image sample, and steps B3, B4 and B5 are executed for the new target visible light face image sample. It should be noted that a candidate visible light face image sample that has already served as the target should not be selected again; that is, each candidate visible light face image sample in the first set is determined as the target visible light face image sample no more than once. Finally, there are two possible outcomes after steps B2, B3, B4 and B5 are performed: first, a matching near infrared face image sample is determined within the first set; second, after all candidate visible light face image samples in the first set have been traversed, no matching near infrared face image sample has been determined, in which case the matching near infrared face image sample is considered absent.
Optionally, a candidate visible light face image sample may be sequentially selected from the first set according to the order of the first similarity corresponding to the candidate visible light face image sample from high to low, where the candidate visible light face image sample is used as the target visible light face image sample; alternatively, a candidate visible light face image sample may be sequentially selected from the first set in random order as the target visible light face image sample, where the selection order of the target visible light face image sample is not limited.
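Under the same assumptions (and reusing the cosine helper and the per-sample first similarities from the previous sketch), steps B1 to B5 might be sketched as follows:

def nir_fallback(nir_feat, first_sims, db, second_vis_threshold, first_nir_threshold):
    # B1: screen the first set - candidates whose first similarity exceeds the
    # second visible light similarity threshold.
    first_set = [user for user, sim in first_sims.items() if sim > second_vis_threshold]
    if not first_set:
        return None, True                        # empty first set: handled by steps C1 to C4
    # B2 to B5: visit candidates in descending order of first similarity; each
    # candidate is used as the target visible light face image sample at most once.
    for user in sorted(first_set, key=first_sims.get, reverse=True):
        target_nir_feat = db.nir_features[user]  # paired near infrared sample of this user
        second_sim = cosine(nir_feat, target_nir_feat)           # B3
        if second_sim > first_nir_threshold:                     # B4
            return user, False                   # matching near infrared sample found
    return None, False                           # all candidates traversed: no match (B5)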
Optionally, the face recognition based on the near infrared face feature information may be further implemented in the following manner: according to the sequence from high to low of the first similarity corresponding to each visible light face image sample in the face database, acquiring the front N visible light face image samples, wherein N is a positive integer and is smaller than or equal to the number of the visible light face image samples in the face database; in the front N visible light face image samples, selecting one visible light face image sample at each time as a target visible light face image sample according to the sequence of the first similarity corresponding to the front N visible light face image samples from high to low; according to the corresponding relation between the visible light face image sample and the near infrared face image sample in the face database, acquiring a near infrared face image sample corresponding to the target visible light face image sample, and recording the near infrared face image sample as a target near infrared face image sample; based on the near-infrared face characteristic information, calculating the similarity between the near-infrared image and the target near-infrared face image sample, and recording the similarity as a second similarity; if the first similarity corresponding to the target visible light face image sample is larger than the second visible light similarity threshold and the second similarity is larger than the first near infrared similarity threshold, determining the target near infrared face image sample as a matching near infrared face image sample, and stopping selecting the next visible light face image sample as the target visible light face image sample; if the first similarity corresponding to the target visible light face image sample is not greater than the second visible light similarity threshold value and/or the second similarity is not greater than the first near infrared similarity threshold value, selecting the next visible light face image sample from the previous N visible light face image samples as the target visible light face image sample, and executing the same steps on the new target visible light face image sample; when the target visible light face image sample is the last visible light face image sample in the first N visible light face image samples, if the first similarity corresponding to the target visible light face image sample is not greater than the second visible light similarity threshold value and/or the second similarity is not greater than the first near infrared similarity threshold value, the matching near infrared face image sample is considered to be absent.
Optionally, after the step B1, the face recognition method further includes:
if the first set is an empty set, respectively comparing each near infrared face image sample in the face database with the near infrared image to obtain a third similarity of each near infrared face image sample relative to the near infrared image;
c2, determining the maximum value of the obtained third similarities as the maximum third similarity;
c3, comparing the maximum third similarity with a preset second near-infrared similarity threshold, wherein the second near-infrared similarity threshold is larger than the first near-infrared similarity threshold;
and C4, if the maximum third similarity is larger than the second near-infrared similarity threshold, determining the near-infrared face image sample corresponding to the maximum third similarity as a matching near-infrared face image sample.
Specifically, if the first set is an empty set, that is, no candidate visible light face image sample exists in the first set, face features are extracted from each near infrared face image sample in the face database to obtain the feature information of each near infrared face image sample. The similarity between the feature information of each near infrared face image sample and the near infrared face feature information of the near infrared image is then calculated, and the calculated similarity is recorded as the third similarity of that near infrared face image sample relative to the near infrared image. The maximum value among the obtained third similarities is selected as the maximum third similarity. A second near infrared similarity threshold, which is greater than the first near infrared similarity threshold, is preset in the application; if the maximum third similarity is greater than the second near infrared similarity threshold, the near infrared face image sample corresponding to the maximum third similarity is directly determined as the matching near infrared face image sample.
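A sketch of steps C1 to C4 under the same assumptions (again reusing the cosine helper and the db.nir_features mapping introduced earlier):

def nir_full_search(nir_feat, db, second_nir_threshold):
    # C1: third similarity of every near infrared face image sample to the probe image
    third_sims = {user: cosine(nir_feat, feat) for user, feat in db.nir_features.items()}
    if not third_sims:
        return None
    best_user = max(third_sims, key=third_sims.get)         # C2: maximum third similarity
    # C3 / C4: the second near infrared similarity threshold is set higher than the
    # first near infrared similarity threshold used in steps B1 to B5.
    if third_sims[best_user] > second_nir_threshold:
        return best_user                                    # matching near infrared sample
    return None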
Step 106, if a matching near-infrared face image sample exists, determining a registered user associated with the matching near-infrared face image sample as a face recognition result;
In the embodiment of the application, if the matching near-infrared face image sample exists in the face database, the user to be identified is one of the registered users, that is, the user to be identified and the registered user associated with the matching near-infrared face image sample are the same person. At this time, the face recognition is successful, and the registered user associated with the matching near-infrared face image sample is determined as the face recognition result.
Step 107, if the matching near infrared face image sample does not exist, outputting a reminding message to remind that the face recognition fails.
In the embodiment of the application, if the matching near infrared face image sample does not exist in the face database, a reminding message is output, and the reminding message is used for indicating the face recognition failure. Optionally, after the face recognition fails, the above step 101 may be re-performed.
Optionally, the step 101 specifically includes:
acquiring continuous M visible light images and continuous M near infrared images;
The M continuous visible light images are M visible light images obtained by continuous photographing with a visible light sensor, and similarly, the M continuous near-infrared images are M near-infrared images obtained by continuous photographing with a near-infrared sensor. Wherein M is an integer greater than 1, the ith visible light image and the ith near infrared image are obtained by shooting at the same moment and the same shooting angle by the corresponding sensor, and i is a positive integer not greater than M.
Correspondingly, the step 104 specifically includes:
if each visible light image in the M visible light images has a corresponding matching visible light face image sample and each matching visible light face image sample is the same, determining a registered user associated with the matching visible light face image sample as a face recognition result;
wherein the operations in steps 102, 103 described above are performed for each of the M visible light images. If each visible light image in the M visible light images has a corresponding matching visible light face image sample, and M matching visible light face image samples corresponding to the M visible light images are the same, determining registered users associated with the matching visible light face image samples as face recognition results.
Accordingly, the step 105 specifically includes:
if any one of the M visible light images does not have a corresponding matching visible light face image sample and/or has a different matching visible light face image sample, face recognition is performed based on the near infrared face feature information.
And if any one of the M visible light images does not have a corresponding matching visible light face image sample, and/or if at least one matching visible light face image sample is different from other visible light face image samples, performing face recognition based on near infrared face feature information corresponding to each near infrared image of the M near infrared images.
Optionally, the step 106 specifically includes:
if each near infrared image in the M near infrared images has a corresponding matching near infrared face image sample and each matching near infrared face image sample is the same, determining registered users associated with the matching near infrared face image sample as a face recognition result;
And if each near infrared image in the M near infrared images has a corresponding matching near infrared face image sample and the M matching near infrared face image samples corresponding to the M near infrared images are the same, determining the registered user associated with the matching near infrared face image sample as the face recognition result.
Accordingly, the step 107 specifically includes:
if any near infrared image in the M near infrared images does not have a corresponding matching near infrared face image sample and/or has a different matching near infrared face image sample, outputting a reminding message to remind that face recognition fails;
if any near infrared image in the M near infrared images does not have a corresponding matching near infrared face image sample, and/or if at least one matching near infrared face image sample is different from other matching near infrared face image samples, outputting a reminding message, wherein the reminding message is used for indicating that the face recognition fails. Optionally, after the face recognition fails, the above step 101 may be re-performed.
In the embodiment of the application, M groups of images are acquired, and each group of images comprises a visible light image and a near infrared image which are shot at the same moment. And extracting face features of the visible light image and the near infrared image for each group of images respectively to obtain visible light face feature information and near infrared face feature information of each group of images. Searching a matching visible light face image sample in a face database according to the visible light face characteristic information of each group of images, and determining registered users associated with the matching visible light face image sample as face recognition results if M identical matching visible light face image samples are obtained; if M identical matching visible light face image samples are not obtained, searching matching near-infrared face image samples in a face database according to near-infrared face characteristic information of each group of images, and if M identical matching near-infrared face image samples are obtained, determining registered users associated with the matching near-infrared face image samples as face recognition results; and if M identical matching near infrared face image samples are not obtained, outputting a reminding message. The accuracy of face recognition can be improved through judging a plurality of groups of images.
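As a sketch of this multi-frame decision rule, the list of per-group outcomes used below is an assumed representation (the registered user matched for each of the M image groups, or None when no matching sample was found for that group):

def consensus(per_frame_results):
    # Accept a recognition result only when every one of the M groups produced a
    # match and all M matches name the same registered user; otherwise fail.
    if not per_frame_results:
        return None
    if any(result is None for result in per_frame_results):
        return None                              # at least one group had no matching sample
    first = per_frame_results[0]
    if all(result == first for result in per_frame_results):
        return first                             # M identical matches: recognition succeeds
    return None                                  # matches disagree: output a reminding message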
From the above, the present application obtains the visible light image and the near infrared image; face feature extraction is respectively carried out on the visible light image and the near infrared image so as to obtain visible light face feature information and near infrared face feature information; face recognition is performed based on the visible light face feature information, and if a matching visible light face image sample exists, the registered user associated with the matching visible light face image sample is determined as the face recognition result; otherwise, face recognition is performed based on the near infrared face feature information; if a matching near infrared face image sample exists, the registered user associated with the matching near infrared face image sample is determined as the face recognition result, and if not, a reminding message is output. The scheme of the application fuses, cross-registers and recognizes near infrared faces and visible light faces, and can achieve face recognition for people of all skin tones worldwide.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
Fig. 3 is a schematic structural diagram of a face recognition device according to an embodiment of the present application, and for convenience of explanation, only a portion related to the embodiment of the present application is shown.
The face recognition device 300 includes:
an image acquisition unit 301, configured to acquire a visible light image and a near infrared image, where the visible light image and the near infrared image are images obtained by capturing the same scene;
a feature extraction unit 302, configured to extract face features of the visible light image and the near infrared image, so as to obtain visible face feature information of the visible light image and near infrared face feature information of the near infrared image;
a visible light face recognition unit 303, configured to perform face recognition based on the above visible light face feature information;
a first judging unit 304, configured to determine, if there is a matching visible light face image sample, a registered user associated with the matching visible light face image sample as a result of face recognition, where the matching visible light face image sample is a visible light face image sample matched with the visible light face feature information in a preset face database, and each visible light face image sample in the face database is associated with a different registered user;
A second judging unit 305, configured to perform face recognition based on the near infrared face feature information if the matching visible face image sample does not exist;
a third judging unit 306, configured to determine, if there is a matching near-infrared face image sample, a registered user associated with the matching near-infrared face image sample as a face recognition result, where the matching near-infrared face image sample is a near-infrared face image sample in the face database that is matched with the near-infrared face feature information, and each near-infrared face image sample in the face database is associated with a different registered user;
and a fourth judging unit 307, configured to output a prompting message to prompt a face recognition failure if the matching near infrared face image sample does not exist.
Optionally, the aforementioned visible light face recognition unit 303 further includes:
the first comparison subunit is configured to compare each visible light face image sample in the face database with the visible light image based on the visible light face feature information, so as to obtain a first similarity of each visible light face image sample with respect to the visible light image;
A maximum first similarity determination subunit configured to determine a maximum value among the obtained respective first similarities as a maximum first similarity;
the first comparison subunit is used for comparing the maximum first similarity with a preset first visible light similarity threshold;
and the matching visible light sample determining subunit is configured to determine, as the matching visible light face image sample, a visible light face image sample corresponding to the maximum first similarity if the maximum first similarity is greater than the first visible light similarity threshold.
Optionally, each visible light face image sample in the face database corresponds to a near infrared face image sample, and the near infrared face image samples corresponding to different visible light face image samples are different, where a visible light face image sample and a near infrared face image sample having the corresponding relation are face images of the same registered user; in terms of performing face recognition based on the near infrared face feature information, the second judging unit 305 further includes:
a first set screening subunit, configured to screen a first set from the face database, where the first set is formed by candidate visible face image samples, a first similarity corresponding to each candidate visible face image sample is greater than a preset second visible similarity threshold, and the second visible similarity threshold is smaller than the first visible similarity threshold;
A target visible light sample determining subunit, configured to determine one candidate visible light face image sample in the first set as a target visible light face image sample;
a second similarity calculating subunit, configured to calculate, based on the near-infrared face feature information, a second similarity between the near-infrared image and a target near-infrared face image sample, where the target near-infrared face image sample is a near-infrared face image sample corresponding to the target visible face image sample in the face database;
a first near-infrared matching sample determining subunit, configured to determine the target near-infrared face image sample as the matching near-infrared face image sample if the second similarity is greater than a preset first near-infrared similarity threshold;
and a skip sub-unit, configured to return to executing the step and subsequent steps of determining one candidate visible face image sample in the first set as a target visible face image sample if the second similarity is less than or equal to the first near-infrared similarity threshold, where the number of times that one candidate visible face image sample is determined as the target visible face image sample is not more than one.
Optionally, the second determining unit 305 further includes:
the empty set judging subunit is configured to, if the first set is an empty set, respectively compare each near-infrared face image sample in the face database with the near-infrared image to obtain a third similarity of each near-infrared face image sample with respect to the near-infrared image;
a maximum third similarity determination subunit configured to determine a maximum value among the obtained respective third similarities as a maximum third similarity;
a second comparing subunit, configured to compare the maximum third similarity with a preset second near-infrared similarity threshold, where the second near-infrared similarity threshold is greater than the first near-infrared similarity threshold;
and the second matching near-infrared sample determining subunit is configured to determine the near-infrared face image sample corresponding to the maximum third similarity as a matching near-infrared face image sample if the maximum third similarity is greater than the second near-infrared similarity threshold.
Optionally, the image acquisition unit 301 further includes:
the continuous image acquisition subunit is used for acquiring continuous M visible light images and continuous M near infrared images, wherein the ith visible light image and the ith near infrared image are images obtained by shooting the same scene, i is a positive integer not more than M, and M is an integer more than 1.
Optionally, the first determining unit 304 further includes:
and the continuous first judging subunit is used for determining registered users associated with the matched visible light face image samples as the face recognition result if each visible light image in the M visible light images has a corresponding matched visible light face image sample and each matched visible light face image sample is the same.
Optionally, the second determining unit 305 further includes:
and the continuous second judging subunit is used for carrying out face recognition based on the near infrared face characteristic information if any one of the M visible light images does not have a corresponding matching visible light face image sample and/or has different matching visible light face image samples.
Optionally, the third determining unit 306 further includes:
and the continuous third judging subunit is configured to determine a registered user associated with the matching near-infrared face image sample as a face recognition result if each near-infrared image in the M near-infrared images has a corresponding matching near-infrared face image sample and each matching near-infrared face image sample is the same.
Optionally, the fourth determining unit 307 further includes:
and the continuous fourth judging subunit is used for outputting a reminding message to remind the face recognition failure if any near infrared image in the M near infrared images does not have a corresponding matching near infrared face image sample and/or a different matching near infrared face image sample exists.
Optionally, the feature extraction unit 302 further includes:
a quality detection subunit, configured to perform image quality detection on the visible light image and perform image quality detection on the near infrared image;
a first quality inspection passing subunit, configured to extract face features from the visible light image when the image quality detection of the visible light image passes;
and a second quality inspection passing subunit, configured to extract face features from the near infrared image when the image quality detection of the near infrared image passes.
From the above, the present application obtains a visible light image and a near infrared image; performs face feature extraction on the visible light image and the near infrared image respectively to obtain visible light face feature information and near infrared face feature information; performs face recognition based on the visible light face feature information, and if a matching visible light face image sample exists, determines the registered user associated with the matching visible light face image sample as the face recognition result, otherwise performs face recognition based on the near infrared face feature information; if a matching near infrared face image sample exists, determines the registered user associated with the matching near infrared face image sample as the face recognition result, and otherwise outputs a reminding message. The scheme of the present application fuses, cross-registers and recognizes near infrared faces and visible light faces, and can realize face recognition for people of various skin colors worldwide.
Fig. 4 is a schematic structural diagram of a terminal device according to an embodiment of the present application. As shown in fig. 4, the terminal device 4 of this embodiment includes: at least one processor 40 (only one is shown in fig. 4), a memory 41 and a computer program 42 stored in said memory 41 and executable on said at least one processor 40, said processor 40 implementing the following steps when executing said computer program 42:
obtaining a visible light image and a near infrared image, wherein the visible light image and the near infrared image are images obtained by shooting the same scene;
face feature extraction is respectively carried out on the visible light image and the near infrared image so as to obtain visible light face feature information of the visible light image and near infrared face feature information of the near infrared image;
performing face recognition based on the visible light face characteristic information;
if the matching visible light face image sample exists, determining registered users associated with the matching visible light face image sample as a face recognition result, wherein the matching visible light face image sample is a visible light face image sample matched with the visible light face characteristic information in a preset face database, and each visible light face image sample in the face database is respectively associated with different registered users;
If the matching visible light face image sample does not exist, face recognition is carried out based on the near infrared face characteristic information;
if the matching near infrared face image sample exists, determining registered users associated with the matching near infrared face image sample as a face recognition result, wherein the matching near infrared face image sample is a near infrared face image sample matched with the near infrared face feature information in the face database, and each near infrared face image sample in the face database is respectively associated with different registered users;
and if the matching near infrared face image sample does not exist, outputting a reminding message to remind that the face recognition fails.
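As a rough illustration of how the steps executed by the processor 40 fit together, the sketch below walks through the cascaded visible-light-then-near-infrared matching flow in Python. All names here (recognize, cosine_similarity, face_db, the 0.8 threshold values) are hypothetical placeholders chosen for this sketch, not the implementation of the present application:

import numpy as np

def cosine_similarity(a, b):
    # Similarity between two feature vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def recognize(vis_feat, nir_feat, face_db, vis_thr=0.8, nir_thr=0.8):
    # face_db: registered-user id -> {"vis": feature vector, "nir": feature vector}.
    # Returns the matched registered user, or None if recognition fails and
    # the caller should output the reminding message.

    # Step 1: face recognition based on the visible light face feature information.
    best_user, best_sim = None, -1.0
    for user, sample in face_db.items():
        sim = cosine_similarity(vis_feat, sample["vis"])
        if sim > best_sim:
            best_user, best_sim = user, sim
    if best_sim > vis_thr:
        return best_user  # a matching visible light face image sample exists

    # Step 2: fall back to the near infrared face feature information.
    best_user, best_sim = None, -1.0
    for user, sample in face_db.items():
        sim = cosine_similarity(nir_feat, sample["nir"])
        if sim > best_sim:
            best_user, best_sim = user, sim
    if best_sim > nir_thr:
        return best_user  # a matching near infrared face image sample exists

    return None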
Assuming that the foregoing is a first possible embodiment, in a second possible embodiment based on the first possible embodiment, the performing face recognition based on the visible light face feature information includes:
based on the visible light face characteristic information, respectively comparing each visible light face image sample in the face database with the visible light image to obtain first similarity of each visible light face image sample relative to the visible light image;
Determining the maximum value of the obtained first similarity as the maximum first similarity;
comparing the maximum first similarity with a preset first visible light similarity threshold;
and if the maximum first similarity is greater than the first visible light similarity threshold, determining the visible light face image sample corresponding to the maximum first similarity as the matching visible light face image sample.
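A minimal Python sketch of this maximum-first-similarity comparison follows; the feature vectors, the 0.8 value of the first visible light similarity threshold and the helper names are illustrative assumptions only:

import numpy as np

def match_visible(vis_feat, vis_samples, first_vis_threshold=0.8):
    # vis_samples: sample id -> visible light feature vector (assumed non-empty).
    # Compute the first similarity of every visible light face image sample.
    first_sims = {
        sid: float(np.dot(vis_feat, v) / (np.linalg.norm(vis_feat) * np.linalg.norm(v)))
        for sid, v in vis_samples.items()
    }
    best_id = max(first_sims, key=first_sims.get)  # maximum first similarity
    if first_sims[best_id] > first_vis_threshold:
        return best_id, first_sims[best_id]        # matching visible light face image sample
    return None, first_sims[best_id]               # no sample clears the threshold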
In a third possible embodiment based on the second possible embodiment, each visible light face image sample in the face database corresponds to a near infrared face image sample, and the near infrared face image samples corresponding to different visible light face image samples are different, wherein a visible light face image sample and its corresponding near infrared face image sample are face images of the same registered user, and the face recognition based on the near infrared face feature information includes:
screening a first set from the face database, wherein the first set is composed of candidate visible light face image samples, the first similarity corresponding to each candidate visible light face image sample is larger than a preset second visible light similarity threshold, and the second visible light similarity threshold is smaller than the first visible light similarity threshold;
Determining a candidate visible light face image sample in the first set as a target visible light face image sample;
calculating a second similarity between the near-infrared image and a target near-infrared face image sample based on the near-infrared face feature information, wherein the target near-infrared face image sample is a near-infrared face image sample corresponding to the target visible-light face image sample in the face database;
if the second similarity is greater than a preset first near-infrared similarity threshold, determining the target near-infrared face image sample as the matching near-infrared face image sample;
and if the second similarity is less than or equal to the first near infrared similarity threshold, returning to the step of determining one candidate visible light face image sample in the first set as the target visible light face image sample and the subsequent steps, wherein the number of times that one candidate visible light face image sample is determined as the target visible light face image sample is not more than one.
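The candidate screening and second similarity check of this embodiment might be sketched as follows; the threshold values, the helper names, and the assumption that first_sims holds the previously computed first similarities are illustrative only, and each candidate is tried as the target sample at most once:

import numpy as np

def _cos(a, b):
    # Cosine similarity between two feature vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_nir_via_candidates(nir_feat, first_sims, nir_samples,
                             second_vis_threshold=0.6, first_nir_threshold=0.8):
    # first_sims  : sample id -> first similarity from the visible light comparison
    # nir_samples : sample id -> near infrared feature vector (same ids; each
    #               visible light sample is assumed to have exactly one paired
    #               near infrared sample, as in this embodiment)

    # Screen the first set: candidates whose first similarity exceeds the
    # (lower) second visible light similarity threshold.
    first_set = [sid for sid, s in first_sims.items() if s > second_vis_threshold]

    # Determine each candidate as the target sample at most once.
    for sid in first_set:
        second_sim = _cos(nir_feat, nir_samples[sid])
        if second_sim > first_nir_threshold:
            return sid  # matching near infrared face image sample found
    return None  # no candidate confirmed; the empty-set / failure path follows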
In a fourth possible embodiment based on the third possible embodiment, after the screening of the first set from the face database, the face recognition method further includes:
If the first set is an empty set, respectively comparing each near infrared face image sample in the face database with the near infrared image to obtain a third similarity of each near infrared face image sample relative to the near infrared image;
determining the maximum value of the obtained third similarities as the maximum third similarity;
comparing the maximum third similarity with a preset second near-infrared similarity threshold, wherein the second near-infrared similarity threshold is larger than the first near-infrared similarity threshold;
and if the maximum third similarity is greater than the second near-infrared similarity threshold, determining the near-infrared face image sample corresponding to the maximum third similarity as a matching near-infrared face image sample.
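The empty-set fallback amounts to an exhaustive comparison against the near infrared samples under a stricter threshold; a sketch, again with illustrative names and a hypothetical 0.9 value for the second near infrared similarity threshold:

import numpy as np

def match_nir_exhaustive(nir_feat, nir_samples, second_nir_threshold=0.9):
    # Fallback when the first set is empty: compare the near infrared image
    # against every near infrared face image sample in the database and accept
    # only the maximum third similarity if it clears the stricter second threshold.
    third_sims = {
        sid: float(np.dot(nir_feat, v) / (np.linalg.norm(nir_feat) * np.linalg.norm(v)))
        for sid, v in nir_samples.items()
    }
    best_id = max(third_sims, key=third_sims.get)  # maximum third similarity
    if third_sims[best_id] > second_nir_threshold:
        return best_id
    return None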
In a fifth possible embodiment based on the first possible embodiment, the acquiring of a visible light image and a near infrared image includes:
acquiring continuous M visible light images and continuous M near infrared images, wherein the ith visible light image and the ith near infrared image are images obtained by shooting the same scene, i is a positive integer not more than M, and M is an integer more than 1;
Correspondingly, if the matching visible light face image sample exists, determining the registered user associated with the matching visible light face image sample as a face recognition result, including:
if each visible light image in the M visible light images has a corresponding matching visible light face image sample and each matching visible light face image sample is the same, determining a registered user associated with the matching visible light face image sample as a face recognition result;
correspondingly, if the matching visible light face image sample does not exist, performing face recognition based on the near infrared face feature information, including:
if any one of the M visible light images does not have a corresponding matching visible light face image sample and/or has a different matching visible light face image sample, face recognition is performed based on the near infrared face feature information.
In a sixth possible embodiment based on the fifth possible embodiment, if there is a matching near infrared face image sample, the determining of a registered user associated with the matching near infrared face image sample as a face recognition result includes:
If each near infrared image in the M near infrared images has a corresponding matching near infrared face image sample and each matching near infrared face image sample is the same, determining registered users associated with the matching near infrared face image sample as a face recognition result;
correspondingly, if the matching near infrared face image sample does not exist, outputting a reminding message, wherein the reminding message comprises the following steps:
if any near infrared image in the M near infrared images does not have a corresponding matching near infrared face image sample and/or has a different matching near infrared face image sample, outputting a reminding message to remind that face recognition fails.
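One way to read this multi-frame rule in code: every one of the M frames must yield a match, and all matches must agree on the same sample, otherwise the stage fails. A small sketch with a hypothetical per-frame result list:

def consistent_match(frame_matches):
    # frame_matches: matched sample id (or None) for each of the M frames.
    # Returns the agreed sample id if every frame matched the same sample,
    # otherwise None, in which case the caller falls back to the near
    # infrared stage or outputs the reminding message.
    if not frame_matches or any(m is None for m in frame_matches):
        return None  # at least one frame had no matching sample
    if len(set(frame_matches)) != 1:
        return None  # the frames matched different samples
    return frame_matches[0]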
In a seventh possible embodiment based on any one of the first to sixth possible embodiments, the extracting of face features from the visible light image and the near infrared image, respectively, includes:
Detecting the image quality of the visible light image and detecting the image quality of the near infrared image;
when the image quality of the visible light image passes the detection, extracting the face characteristics of the visible light image;
and when the image quality detection of the near infrared image passes, extracting face characteristics of the near infrared image.
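A sketch of quality-gated feature extraction follows; the quality metric used here (a Laplacian-variance blur check via OpenCV) and its threshold are illustrative assumptions only — the application's image quality detection covers many more factors (brightness, angle, occlusion, face size, and the like) — and extract_features stands in for whatever feature extractor is used:

import cv2
import numpy as np

def quality_ok(image: np.ndarray, blur_threshold: float = 100.0) -> bool:
    # Very rough stand-in for image quality detection: reject frames that
    # are too blurry, measured by the variance of the Laplacian.
    gray = image if image.ndim == 2 else cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var() > blur_threshold

def extract_if_quality_ok(vis_image, nir_image, extract_features):
    # Extract face features only from the images whose quality check passes.
    vis_feat = extract_features(vis_image) if quality_ok(vis_image) else None
    nir_feat = extract_features(nir_image) if quality_ok(nir_image) else None
    return vis_feat, nir_feat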
The terminal device may include, but is not limited to, a processor 40 and a memory 41. It will be appreciated by those skilled in the art that fig. 4 is merely an example of the terminal device 4 and does not limit the terminal device 4, which may include more or fewer components than shown, combine certain components, or use different components; for example, it may also include input-output devices, network access devices, and the like.
The processor 40 may be a central processing unit (Central Processing Unit, CPU), and the processor 40 may also be another general purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 41 may in some embodiments be an internal storage unit of the terminal device 4, such as a hard disk or a memory of the terminal device 4. The memory 41 may also be an external storage device of the terminal device 4 in other embodiments, for example, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card) or the like, which are provided on the terminal device 4. Further, the memory 41 may include both the internal storage unit and the external storage device of the terminal device 4. The memory 41 is used for storing an operating system, an application program, a boot loader (BootLoader), data, other programs, and the like, such as program codes of the computer programs. The above-described memory 41 may also be used to temporarily store data that has been output or is to be output.
From the above, the present application obtains a visible light image and a near infrared image; performs face feature extraction on the visible light image and the near infrared image respectively to obtain visible light face feature information and near infrared face feature information; performs face recognition based on the visible light face feature information, and if a matching visible light face image sample exists, determines the registered user associated with the matching visible light face image sample as the face recognition result, otherwise performs face recognition based on the near infrared face feature information; if a matching near infrared face image sample exists, determines the registered user associated with the matching near infrared face image sample as the face recognition result, and otherwise outputs a reminding message. The scheme of the present application fuses, cross-registers and recognizes near infrared faces and visible light faces, and can realize face recognition for people of various skin colors worldwide.
It should be noted that, because the content of information interaction and execution process between the above devices/units is based on the same concept as the method embodiment of the present application, specific functions and technical effects thereof may be referred to in the method embodiment section, and will not be described herein.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, the specific names of the functional units and modules are only for distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
The embodiments of the present application also provide a computer readable storage medium storing a computer program which, when executed by a processor, implements steps for implementing the various method embodiments described above.
Embodiments of the present application provide a computer program product enabling a terminal device to carry out the steps of the method embodiments described above when the computer program product is run on the terminal device.
The integrated units described above, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the present application may implement all or part of the flow of the methods of the above embodiments by instructing related hardware through a computer program, which may be stored in a computer readable storage medium; when executed by a processor, the computer program may implement the steps of each of the method embodiments described above. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer readable medium may include at least: any entity or device capable of carrying the computer program code to a terminal device, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunication signal, and a software distribution medium, such as a USB flash drive, a removable hard disk, a magnetic disk, or an optical disk. In some jurisdictions, in accordance with legislation and patent practice, computer readable media may not include electrical carrier signals and telecommunication signals.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts that are not described or illustrated in detail in a particular embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/network device and method may be implemented in other manners. For example, the apparatus/network device embodiments described above are merely illustrative, e.g., the division of modules or elements described above is merely a logical functional division, and there may be additional divisions in actual implementation, e.g., multiple elements or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described above as separate components may or may not be physically separate, and components shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (10)

1. A face recognition method, comprising:
obtaining a visible light image and a near infrared image, wherein the visible light image and the near infrared image are images obtained by shooting the same scene;
Preprocessing the visible light image and the near infrared image respectively;
respectively carrying out image quality detection on the visible light image and the near infrared image, and respectively extracting face characteristics of the visible light image and the near infrared image when the image quality detection passes so as to obtain visible light face characteristic information of the visible light image and near infrared face characteristic information of the near infrared image, wherein the image quality detection comprises blurring, stretching, noise, brightness, exposure degree, angle, color, inversion, shielding and face size and quality judgment;
judging a current comparison mode, and if the current comparison mode is a common comparison mode and the visible light face characteristic information of the visible light image is not successfully extracted, outputting a reminding message to remind that the visible light face recognition fails;
if the current comparison mode is a mixing ratio comparison mode, a living body detection function and a near infrared anti-counterfeiting function are both enabled, and the image quality detection of both the visible light image and the near infrared image passes, respectively extracting visible light face characteristic information of the visible light image and near infrared face characteristic information of the near infrared image, and carrying out face recognition based on the visible light face characteristic information and the near infrared face characteristic information;
If the current comparison mode is a common comparison mode and the visible light face characteristic information of the visible light image is successfully extracted, or if the current comparison mode is a mixing ratio comparison mode and the visible light face characteristic information of the visible light image is successfully extracted, face recognition is carried out based on the visible light face characteristic information;
if a matching visible light face image sample exists, determining registered users associated with the matching visible light face image sample as a face recognition result, wherein the matching visible light face image sample is a visible light face image sample matched with the visible light face characteristic information in a preset face database, and each visible light face image sample in the face database is respectively associated with different registered users;
if the matching visible light face image sample does not exist, outputting a reminding message to remind that the visible light face recognition fails;
if the current comparison mode is a mixing ratio comparison mode and the near infrared face characteristic information of the near infrared image is successfully extracted, face recognition is carried out based on the near infrared face characteristic information; if a matching near infrared face image sample exists, determining registered users associated with the matching near infrared face image sample as a face recognition result, wherein the matching near infrared face image sample is a near infrared face image sample matched with the near infrared face feature information in the face database, and each near infrared face image sample in the face database is respectively associated with different registered users;
And if the matching near infrared face image sample does not exist, outputting a reminding message to remind that the face recognition fails.
2. The face recognition method according to claim 1, wherein the face recognition based on the visible light face feature information comprises:
based on the visible light face characteristic information, respectively comparing each visible light face image sample in the face database with the visible light image to obtain first similarity of each visible light face image sample relative to the visible light image;
determining the maximum value of the obtained first similarity as the maximum first similarity;
comparing the maximum first similarity with a preset first visible light similarity threshold;
and if the maximum first similarity is larger than the first visible light similarity threshold, determining the visible light face image sample corresponding to the maximum first similarity as the matching visible light face image sample.
3. The face recognition method according to claim 2, wherein each visible face image sample in the face database corresponds to a near infrared face image sample, and near infrared face image samples corresponding to different visible face image samples are different, wherein the visible face image sample and the near infrared face image sample having a correspondence are face images of the same registered user, and the face recognition based on the near infrared face feature information comprises:
Screening a first set from the face database, wherein the first set is composed of candidate visible light face image samples, the first similarity corresponding to each candidate visible light face image sample is larger than a preset second visible light similarity threshold, and the second visible light similarity threshold is smaller than the first visible light similarity threshold;
determining one candidate visible light face image sample in the first set as a target visible light face image sample;
calculating a second similarity between the near-infrared image and a target near-infrared face image sample based on the near-infrared face feature information, wherein the target near-infrared face image sample is a near-infrared face image sample corresponding to the target visible-light face image sample in the face database;
if the second similarity is larger than a preset first near-infrared similarity threshold, determining the target near-infrared face image sample as the matched near-infrared face image sample;
and if the second similarity is smaller than or equal to the first near infrared similarity threshold, returning to the step of determining one candidate visible light face image sample in the first set as the target visible light face image sample and the subsequent steps, wherein the number of times that one candidate visible light face image sample is determined as the target visible light face image sample is not more than one.
4. A face recognition method according to claim 3, wherein after said screening of the first set from the face database, the face recognition method further comprises:
if the first set is an empty set, respectively comparing each near infrared face image sample in the face database with the near infrared image to obtain a third similarity of each near infrared face image sample relative to the near infrared image;
determining the maximum value of the obtained third similarities as the maximum third similarity;
comparing the maximum third similarity with a preset second near-infrared similarity threshold, wherein the second near-infrared similarity threshold is larger than the first near-infrared similarity threshold;
and if the maximum third similarity is larger than the second near-infrared similarity threshold, determining the near-infrared face image sample corresponding to the maximum third similarity as a matching near-infrared face image sample.
5. The face recognition method according to claim 1, wherein the acquiring visible light images and near infrared images includes:
acquiring continuous M visible light images and continuous M near infrared images, wherein the ith visible light image and the ith near infrared image are images obtained by shooting the same scene, i is a positive integer not more than M, and M is an integer more than 1;
Correspondingly, if the matching visible light face image sample exists, determining the registered user associated with the matching visible light face image sample as a face recognition result, including:
if each visible light image in the M visible light images has a corresponding matching visible light face image sample and each matching visible light face image sample is the same, determining a registered user associated with the matching visible light face image sample as a face recognition result;
correspondingly, if the matching visible light face image sample does not exist, performing face recognition based on the near infrared face feature information, including:
and if any one of the M visible light images does not have a corresponding matching visible light face image sample and/or has a different matching visible light face image sample, performing face recognition based on the near infrared face feature information.
6. The method according to claim 5, wherein if there is a matching near infrared face image sample, determining a registered user associated with the matching near infrared face image sample as a result of face recognition, comprising:
If each near infrared image in the M near infrared images has a corresponding matching near infrared face image sample and each matching near infrared face image sample is the same, determining registered users associated with the matching near infrared face image sample as a face recognition result;
correspondingly, if the matching near infrared face image sample does not exist, outputting a reminding message, wherein the reminding message comprises the following steps:
if any near infrared image in the M near infrared images does not have a corresponding matching near infrared face image sample and/or has a different matching near infrared face image sample, outputting a reminding message to remind that face recognition fails.
7. The face recognition method according to any one of claims 1 to 6, wherein the face feature extraction of the visible light image and the near infrared image, respectively, includes:
performing image quality detection on the visible light image and performing image quality detection on the near infrared image;
when the image quality of the visible light image passes the detection, extracting the face characteristics of the visible light image;
and when the image quality detection of the near infrared image passes, extracting face characteristics of the near infrared image.
8. A face recognition device, comprising:
the image acquisition unit is used for acquiring a visible light image and a near infrared image, wherein the visible light image and the near infrared image are images obtained by shooting the same scene;
the feature extraction unit is used for extracting face features of the visible light image and the near infrared image respectively to obtain visible light face feature information of the visible light image and near infrared face feature information of the near infrared image;
the visible light face recognition unit is used for recognizing the face based on the visible light face characteristic information;
a first judging unit, configured to determine, if there is a matching visible light face image sample, a registered user associated with the matching visible light face image sample as a result of face recognition, where the matching visible light face image sample is a visible light face image sample matched with the visible light face feature information in a preset face database, and each visible light face image sample in the face database is associated with a different registered user respectively;
the second judging unit is used for carrying out face recognition based on the near infrared face characteristic information if the matching visible light face image sample does not exist;
A third judging unit, configured to determine, if there is a matching near-infrared face image sample, a registered user associated with the matching near-infrared face image sample as a face recognition result, where the matching near-infrared face image sample is a near-infrared face image sample in the face database that is matched with the near-infrared face feature information, and each near-infrared face image sample in the face database is associated with a different registered user respectively;
and the fourth judging unit is used for outputting a reminding message to remind the face recognition failure if the matched near infrared face image sample does not exist.
9. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1 to 7 when executing the computer program.
10. A computer readable storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the method according to any one of claims 1 to 7.
CN202010387768.4A 2020-05-09 2020-05-09 Face recognition method, face recognition device and terminal equipment Active CN111723651B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010387768.4A CN111723651B (en) 2020-05-09 2020-05-09 Face recognition method, face recognition device and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010387768.4A CN111723651B (en) 2020-05-09 2020-05-09 Face recognition method, face recognition device and terminal equipment

Publications (2)

Publication Number Publication Date
CN111723651A CN111723651A (en) 2020-09-29
CN111723651B true CN111723651B (en) 2023-10-10

Family

ID=72564813

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010387768.4A Active CN111723651B (en) 2020-05-09 2020-05-09 Face recognition method, face recognition device and terminal equipment

Country Status (1)

Country Link
CN (1) CN111723651B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113516089B (en) * 2021-07-27 2024-04-12 中国平安人寿保险股份有限公司 Face image recognition method, device, equipment and readable storage medium

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013131407A1 (en) * 2012-03-08 2013-09-12 无锡中科奥森科技有限公司 Double verification face anti-counterfeiting method and device
CN109427124A (en) * 2017-09-05 2019-03-05 北京中科奥森数据科技有限公司 A kind of binocular camera recognition of face access control equipment and its control method
CN108388878A (en) * 2018-03-15 2018-08-10 百度在线网络技术(北京)有限公司 The method and apparatus of face for identification
CN109858371A (en) * 2018-12-29 2019-06-07 深圳云天励飞技术有限公司 The method and device of recognition of face
CN110072083A (en) * 2019-04-10 2019-07-30 深圳市万睿智能科技有限公司 A kind of control method and system of binocular camera Visual Speaker-phone
CN110532992A (en) * 2019-09-04 2019-12-03 深圳市捷顺科技实业股份有限公司 A kind of face identification method based on visible light and near-infrared

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
Sellami, L. et al. Near-infrared facial recognition utilizing OpenCV software. 40th Conference on Infrared Technology and Applications, 2014, vol. 2014, full text. *
Fang Yuanyuan. Face Recognition and Beautification Algorithms in Practice: Based on Python, Machine Learning and Deep Learning. China Machine Press, 2020 (1st ed.), pp. 113-115. *
Zhu Daoming. Building Security Technology. Donghua University Press, 2013, p. 163. *
Li Meili. Pixel-Level Image Fusion Algorithms and Applications. Xidian University Press, 2016 (1st ed.), pp. 59-60. *
Li Kefeng. Face Image Processing and Recognition Technology. Yellow River Water Conservancy Press, 2018 (1st ed.), full text. *
Deng Qianwen; Feng Ziliang; Qiu Chenpeng. Living face detection method based on near-infrared and visible-light binocular vision. Journal of Computer Applications, 2015, 30(07), full text. *

Also Published As

Publication number Publication date
CN111723651A (en) 2020-09-29

Similar Documents

Publication Publication Date Title
CN107423690B (en) Face recognition method and device
KR102466997B1 (en) Liveness test method and apparatus
CN110889312B (en) Living body detection method and apparatus, electronic device, computer-readable storage medium
CN110866466B (en) Face recognition method, device, storage medium and server
JP5106356B2 (en) Image monitoring device
US7643674B2 (en) Classification methods, classifier determination methods, classifiers, classifier determination devices, and articles of manufacture
CN107346419B (en) Iris recognition method, electronic device, and computer-readable storage medium
US9704024B2 (en) Object discriminating apparatus and method
CN111079816A (en) Image auditing method and device and server
CN104063709B (en) Sight line detector and method, image capture apparatus and its control method
CN108108711A (en) Face supervision method, electronic equipment and storage medium
CN111931548A (en) Face recognition system, method for establishing face recognition data and face recognition method
CN111027400A (en) Living body detection method and device
CN111723651B (en) Face recognition method, face recognition device and terminal equipment
JP2017058833A (en) Object identification device, object identification method, and program
CN113642639B (en) Living body detection method, living body detection device, living body detection equipment and storage medium
CN112183504B (en) Video registration method and device based on non-contact palm vein image
CN113837006A (en) Face recognition method and device, storage medium and electronic equipment
CN112183167A (en) Attendance checking method, authentication method, living body detection method, device and equipment
CN112699811A (en) Living body detection method, apparatus, device, storage medium, and program product
CN113255766B (en) Image classification method, device, equipment and storage medium
CN114596638A (en) Face living body detection method, device and storage medium
CN112200120B (en) Identity recognition method, living body recognition device and electronic equipment
CN113516089B (en) Face image recognition method, device, equipment and readable storage medium
US10726259B2 (en) Image processing method and system for iris recognition

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant