CN113065507B - Method and device for realizing face authentication - Google Patents


Info

Publication number
CN113065507B
CN113065507B (application CN202110425395.XA)
Authority
CN
China
Prior art keywords
face
visible light
data
human face
radar
Prior art date
Legal status
Active
Application number
CN202110425395.XA
Other languages
Chinese (zh)
Other versions
CN113065507A (en)
Inventor
贺三元
Current Assignee
Alipay Hangzhou Information Technology Co Ltd
Original Assignee
Alipay Hangzhou Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Alipay Hangzhou Information Technology Co Ltd filed Critical Alipay Hangzhou Information Technology Co Ltd
Priority to CN202110425395.XA priority Critical patent/CN113065507B/en
Publication of CN113065507A publication Critical patent/CN113065507A/en
Application granted granted Critical
Publication of CN113065507B publication Critical patent/CN113065507B/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168 Feature extraction; Face representation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172 Classification, e.g. identification

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Collating Specific Patterns (AREA)

Abstract

This specification provides a method for implementing face authentication, comprising the following steps: acquiring face visible light data and face radar data of a user to be authenticated, the two being collected by the terminal at the same time; identifying face visible light features based on the face visible light data, and face radar features based on the face radar data; and, when the face visible light features match the face radar features, authenticating the user using the face visible light data and the user's verified face information.

Description

Method and device for realizing face authentication
Technical Field
The present disclosure relates to the field of network communications technologies, and in particular, to a method and apparatus for implementing face authentication.
Background
Face recognition is a biometric technology that identifies a person based on facial feature information. With the rapid development of computer vision, big data, machine learning and other technologies in recent years, user authentication through face recognition has been applied ever more widely in financial transactions, unmanned retail, security monitoring, public transportation and other fields, bringing great convenience to people's work and life.
Meanwhile, the application of face authentication also introduces new security risks. For example, an injection attack on a camera can disguise a previously captured face photo or video as a live photo or video captured from the camera, so that the device responsible for user authentication may mistakenly treat the replayed data as an ordinary live capture, draw a wrong conclusion, and cause losses to the user. Enhancing the security of face authentication has become a new technical challenge.
Disclosure of Invention
In view of this, the present disclosure provides a method for implementing face authentication, including:
acquiring face visible light data and face radar data of a user to be authenticated, the two being collected simultaneously;
identifying face visible light features based on the face visible light data, and face radar features based on the face radar data;
and, when the face visible light features match the face radar features, authenticating the user using the face visible light data and the user's verified face information.
This specification also provides a device for implementing face authentication, comprising:
a data acquisition unit, used to acquire face visible light data and face radar data of a user to be authenticated, the two being collected simultaneously;
a feature recognition unit, used to identify face visible light features based on the face visible light data, and face radar features based on the face radar data;
and a user authentication unit, used to authenticate the user with the face visible light data and the user's verified face information when the face visible light features match the face radar features.
A computer device provided in this specification includes a memory and a processor. The memory stores a computer program executable by the processor, and when the processor runs the computer program, the above method for implementing face authentication is performed.
This specification also provides a computer-readable storage medium storing a computer program which, when executed by a processor, performs the above method for implementing face authentication.
According to this technical scheme, in the embodiments of this specification, the face visible light data and face radar data of the user to be authenticated are collected at the same time; only after the face visible light features derived from the visible light data are verified to match the face radar features derived from the radar data is the user authenticated against the user's verified face information.
Drawings
Fig. 1 is a flowchart of the method for implementing face authentication in an embodiment of this specification;
Fig. 2 is a flowchart of the interaction between a terminal performing face authentication and an authentication server in an application example of this specification;
Fig. 3 is a hardware block diagram of an embodiment of this specification;
Fig. 4 is a logical structure diagram of the device for implementing face authentication in an embodiment of this specification.
Detailed Description
The embodiments of this specification provide a new method for implementing face authentication. Face visible light data and face radar data of a user are collected; face visible light features are derived from the two-dimensional visible light data, and face radar features are derived from the three-dimensional radar data. Only after the face visible light features are verified to match the face radar features is the user authenticated against the user's verified face information. As a result, an injection attack can pass authentication only by supplying matched two-dimensional and three-dimensional data. Since the difficulty, time and effort of forging three-dimensional data far exceed those of forging two-dimensional data, the threshold and cost of an injection attack rise greatly, giving face authentication better security.
In the embodiments of this specification, face authentication may be completed by the terminal alone, or cooperatively by the terminal and an authentication server. In addition, the terminal or the authentication server may belong to different owners in different application scenarios. Several examples follow:
For example, in device unlocking, the terminal may be the terminal of the user to be authenticated, and face authentication is completed by the user's terminal alone.
For example, in login authentication, the terminal may be the terminal of the user to be authenticated; face authentication may be completed by the user's terminal alone with the result reported to the authentication server of the system being logged into, or completed by the user's terminal together with that authentication server.
For example, in face-scan payment, the terminal may be the terminal of a payee receiving the user's payment, and face authentication is completed cooperatively by the payee's terminal and the authentication server of the payment system.
In the embodiments of this specification, the terminal may be a mobile phone, a tablet computer, a PC (Personal Computer), a notebook, a face-scan payment machine, a face-scan attendance machine, or the like. The authentication server may be one physical or logical server, or two or more physical or logical servers with different responsibilities cooperating to implement its functions. In a scenario where the terminal and the authentication server cooperate in face authentication, the two can reach each other through a network. The embodiments of this specification do not limit the types of the terminal and the authentication server, nor the type, protocol, etc. of the communication network between them.
The flow of the method for implementing face authentication in an embodiment of this specification is shown in Fig. 1.
Step 110, obtaining face visible light data and face radar data of a user to be authenticated.
In the embodiments of this specification, the terminal is equipped with a visible light camera and a radar camera. The visible light camera is a conventional camera for taking photos or recording videos; it is called a visible light camera here to distinguish it from the radar camera and the infrared camera. The radar camera may be a millimeter-wave radar camera or a lidar camera; any radar camera whose precision meets the requirements of the application scenario can be used.
After the face authentication flow starts, the terminal uses the visible light camera and the radar camera simultaneously to acquire the face visible light data and face radar data of the user to be authenticated. The face visible light data may be one face photo, two or more face photos (taken continuously or not), a video segment containing the face, and so on, without limitation. The face radar data is generally three-dimensional data (such as point cloud data) acquired within the radar's emission range at the same time as the visible light capture; its specific data format (such as matrix-format data) is not limited.
That the face visible light data and face radar data are acquired simultaneously means they are acquired for the same face pose, angle and action. Thus, when the visible light camera and radar camera capture a person's face at the same time in a normal face authentication flow, the collected face visible light data and face radar data reflect the same face information and can be matched against each other.
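As a sketch of this simultaneity requirement, the paired captures can be validated with a timestamp check. The `Frame` type, millisecond timestamps, and 30 ms tolerance below are illustrative assumptions, not taken from the specification:

```python
from dataclasses import dataclass

@dataclass
class Frame:
    timestamp_ms: int  # capture time reported by the sensor
    payload: object    # image pixels or radar point cloud

def captured_simultaneously(visible: Frame, radar: Frame,
                            tolerance_ms: int = 30) -> bool:
    # Treat the two captures as one acquisition only if their
    # timestamps differ by at most the tolerance (about one video frame).
    return abs(visible.timestamp_ms - radar.timestamp_ms) <= tolerance_ms
```

A pair whose timestamps drift beyond the tolerance would be rejected before any feature comparison is attempted.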
Step 120, identifying face visible light features based on the face visible light data, and face radar features based on the face radar data.
Face features include information about an identifiable part itself, the position of an identifiable part on the face, the positional relationship between two or more identifiable parts, and so on. For distinction, in the embodiments of this specification, face features identified from face visible light data are called face visible light features, and face features identified from face radar data are called face radar features.
Any one or more of the above kinds of information that can be identified from the face visible light data or face radar data may serve as the face features used in the embodiments of this specification, without limitation. For example, the identifiable parts may be facial organs such as the eyes, nose, mouth and eyebrows, locatable facial points such as the philtrum and the midpoint between the eyebrows, facial boundaries (such as the face contour), and so on. The information about an identifiable part itself may be the horizontal and vertical width of an eye, the length and width of the nose, and so on. The position of an identifiable part on the face may be the horizontal distance from the center of the left eye to the left face contour line, the vertical distance from the nose tip to the upper face contour line, the vertical distance from an eyebrow to the lower face contour line, and so on. The positional relationship between identifiable parts may be the distance between the centers of the two eyes, the side lengths of the triangle formed by the centers of the two eyes and the center of the mouth, and so on.
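Distance-style features of this kind can be computed directly from landmark coordinates. The landmark names and coordinates in this sketch are hypothetical, chosen only to illustrate the two kinds of features:

```python
import math

# Hypothetical landmark coordinates in centimetres; the names and values
# are illustrative, not taken from the specification.
landmarks = {
    "left_eye_center": (4.0, 6.0),
    "right_eye_center": (10.0, 6.0),
    "nose_tip": (7.0, 4.0),
    "upper_contour_point": (7.0, 9.0),
}

def face_features(pts):
    # Positional-relationship feature: distance between the eye centers.
    # Position-on-face feature: vertical distance from the nose tip to
    # the upper face contour line.
    return {
        "eye_center_distance": math.dist(pts["left_eye_center"],
                                         pts["right_eye_center"]),
        "nose_to_upper_contour": abs(pts["nose_tip"][1]
                                     - pts["upper_contour_point"][1]),
    }
```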
In an actual application scenario, one or more face features can be selected as the face visible light features or face radar features according to factors such as the precision of the acquired data, the required recognition speed and the required recognition accuracy. The face visible light features are planar features, for example one or more of the distance between any two facial organs and the distance between a facial organ and the facial boundary; the face radar features may be planar or stereoscopic features, for example one or more of the distance between any two facial organs, the distance between a facial organ and the facial boundary, and the distance between a facial organ and the camera.
In the embodiments of this specification, the face visible light features comprise one or more face features, the face radar features comprise one or more face features, and at least one face feature is common to both. For example, the vertical distance from the nose tip to the upper face contour line and the distance between the centers of the two eyes may be used as the face visible light features, and the distance between the centers of the two eyes as the face radar features; the two feature sets then share the eye-center-distance feature.
After the face visible light data and the face radar data are obtained, the face visible light data can be used for identifying the face visible light characteristics, and the face radar data can be used for identifying the face radar characteristics.
Two trained machine learning models (called the visible light recognition model and the radar recognition model for distinction) are typically used to recognize the face visible light features and the face radar features, respectively. In the embodiments of this specification, the input of the visible light recognition model is the acquired face visible light data, or one or more vectors generated from it, and its output includes the face visible light features; the input of the radar recognition model is the acquired face radar data, or one or more vectors generated from it, and its output includes the face radar features. The embodiments of this specification do not limit the algorithms, training methods, etc. used for the two models.
Step 130, when the face visible light features match the face radar features, authenticating the user using the face visible light data and the user's verified face information.
After the face visible light features and face radar features are identified, the shared face features (those belonging to both feature sets) are compared. If, for every shared face feature, its value in the face visible light features differs from its value in the face radar features by no more than a preset range (different face features may have different preset ranges), the face visible light features are considered to match the face radar features; if any shared face feature differs by more than its preset range, they are considered not to match.
For example, suppose both the face visible light features and the face radar features include the distance between the centers of the two eyes. If this distance in the face visible light features differs from that in the face radar features by more than 0.3 cm, the two feature sets do not match. If the difference is no more than 0.3 cm, they match on this face feature; and if they also match on every other shared face feature, the face visible light features are judged to match the face radar features.
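The matching rule above (compare only the shared features, each against its own preset range) can be sketched as follows; the feature names and tolerance values are illustrative:

```python
def features_match(visible_feats, radar_feats, tolerances):
    # Compare only the face features present in BOTH feature sets;
    # each feature has its own preset range, and every shared feature
    # must agree for the overall match to succeed.
    shared = visible_feats.keys() & radar_feats.keys()
    if not shared:
        return False
    return all(abs(visible_feats[f] - radar_feats[f]) <= tolerances[f]
               for f in shared)
```

With a 0.3 cm tolerance on the eye-center distance, values of 6.2 cm and 6.4 cm would match, while 6.2 cm and 6.6 cm would not.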
If the face visible light features do not match the face radar features, authentication fails. If they match, the user is authenticated using the face visible light data and the verified face information of the user to be authenticated.
Depending on the specific implementation of the actual application scenario, the user's verified face information may be a visible light face image verified to belong to the user, or verified face privacy data obtained by privacy-processing a verified visible light face image. The verified visible light face image may be stored on the user's terminal or obtained from the server of a national authority; the verified face privacy data may be stored on the terminal, on the authentication server, or in a network storage location accessible to the terminal or the authentication server. The original face image cannot be recovered from the verified face privacy data.
The specific way of authenticating with the face visible light data and the user's verified face information is not limited. For example, when the verified face information is a verified visible light face image, the face visible light features can be compared with the face features of that image, following the comparison of photo or video data with verified face information in existing face authentication flows, which is not repeated here. As another example, when the verified face information is verified face privacy data, the same privacy processing can be applied to the face visible light data to obtain face visible light privacy data, and the two privacy data are then compared. When the face visible light data matches the user's verified face information, the user's face passes authentication; otherwise, face authentication fails.
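The "apply the same privacy processing, then compare" step might look like the sketch below. The exact-digest comparison only illustrates one-way privacy processing; a real system would match at the feature level, since two captures of the same face are never byte-identical. The salt and function names are assumptions:

```python
import hashlib

def privacy_process(face_bytes: bytes, salt: bytes) -> str:
    # One-way transform: the original image cannot be recovered from
    # the digest, matching the text's privacy requirement.
    return hashlib.sha256(salt + face_bytes).hexdigest()

def authenticate(captured: bytes, verified_digest: str, salt: bytes) -> bool:
    # Apply the SAME privacy processing to the fresh capture, then
    # compare the two privacy results, as the text describes.
    return privacy_process(captured, salt) == verified_digest
```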
In an application scenario where the terminal and the authentication server cooperate in face authentication, the terminal can simultaneously collect the face visible light data and face radar data of the user to be authenticated, identify the face visible light features from the former and the face radar features from the latter; when the two match, the terminal privacy-processes the face visible light data to obtain face visible light privacy data and uploads it to the authentication server; the authentication server then authenticates the user against the verified face privacy data and the uploaded face visible light privacy data.
The working principle of radar is to transmit a beam toward a target; by comparing the time difference between the transmitted beam and the beam reflected back from the target, parameters such as the target's distance, azimuth, altitude, speed, attitude and shape can be obtained, constructing 3D (three-dimensional) environment perception data. A photo or video taken by a visible light camera contains only 2D (two-dimensional) planar data. In the embodiments of this specification, face features of the 3D stereoscopic face are compared with face features of the 2D planar face, so an injection attack on the camera must forge not only the planar data of a video or photo but also 3D stereoscopic data matched to it. This greatly raises the technical threshold of forgery: the counterfeiter must build a 3D model for each photo or video to generate the corresponding stereoscopic data, which costs a great deal of time and effort, so the probability of a successful injection attack drops sharply.
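The ranging part of this principle reduces to one formula: the one-way distance is half the round-trip time multiplied by the speed of light (the beam travels out and back). A minimal sketch:

```python
C = 299_792_458.0  # speed of light in m/s

def target_range_m(round_trip_s: float) -> float:
    # The beam travels to the target and back, so the one-way
    # distance is half the round-trip time times the speed of light.
    return C * round_trip_s / 2.0
```

A reflection arriving 2 microseconds after transmission, for instance, corresponds to a target roughly 300 m away.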
In some application scenarios, a face visible light contour can also be identified based on the face visible light data and a face radar contour based on the face radar data, and the two contours compared. The user is then authenticated with the face visible light data and the user's verified face information only when the face visible light features match the face radar features, the face visible light contour matches the face radar contour, and the positions of the two contours coincide. If the two contours do not match, or their positions do not coincide, authentication fails.
In this scenario, depending on the specific implementation of the visible light and radar recognition models, the face visible light contour may be included in the output of the visible light recognition model, or a separate face visible light contour model may be used to identify it from the face visible light data; likewise, the face radar contour may be included in the output of the radar recognition model, or a separate face radar contour model may be used to identify it from the face radar data. The face visible light contour model and face radar contour model can be implemented with various suitable machine learning algorithms and trained in a manner suited to those algorithms, without limitation.
In addition, the face visible light contour matching the face radar contour may mean that the deviation between the two contours is within a first predetermined range, and the positions of the two contours coinciding may mean that the positional deviation between them is within a second predetermined range.
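The two predetermined ranges can be illustrated by separating shape deviation (mean point deviation after centroid alignment) from positional deviation (centroid offset). This deviation measure is a simplification chosen for illustration, not the specification's definition:

```python
import math

def centroid(points):
    xs, ys = zip(*points)
    return (sum(xs) / len(points), sum(ys) / len(points))

def contours_agree(a, b, shape_range, position_range):
    # a, b: equal-length lists of (x, y) contour points in one frame.
    # "Match" (first predetermined range): mean point deviation after
    # aligning the two centroids. "Positions coincide" (second
    # predetermined range): offset between the two centroids.
    ca, cb = centroid(a), centroid(b)
    shape_dev = sum(
        math.dist((px - ca[0], py - ca[1]), (qx - cb[0], qy - cb[1]))
        for (px, py), (qx, qy) in zip(a, b)
    ) / len(a)
    return shape_dev <= shape_range and math.dist(ca, cb) <= position_range
```

A contour that is the right shape but displaced (e.g. a replayed image misaligned with the radar return) fails the position check even though it passes the shape check.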
The visible light recognition model can generally judge from the face visible light data whether a face is present and include that result in its output; the radar recognition model can generally judge from the face radar data whether a stereoscopic face is present and include that result in its output. In this case, face authentication can be judged to have failed when no face is present in the face visible light data or no stereoscopic face is present in the face radar data. The judgment method and criteria can be determined by the needs of the actual application scenario and the specific implementation of the visible light or radar recognition model, without limitation. For example, failure to extract the face contour, or failure to recognize one or more specific parts of the face, may serve as the condition for judging that no face or no stereoscopic face is present.
Some terminals are equipped with an infrared camera in addition to the visible light camera and radar camera. On these terminals, the infrared camera can collect face infrared data of the user to be authenticated at the same time as the face visible light data and face radar data are collected. Face visible light data, face radar data and face infrared data acquired at the same moment are captured for the same face pose, angle and action and reflect the same face information. The face infrared data may be one infrared face picture, two or more infrared face pictures (taken continuously or not), an infrared video segment containing the face, and so on, without limitation.
After the face infrared data is obtained, the face infrared contour is identified based on it and compared with the face visible light contour and the face radar contour. The user is authenticated with the face visible light data and the verified face information of the user to be authenticated only when the face visible light features match the face radar features, the face visible light contour, face radar contour and face infrared contour match one another, and the positions of the three contours coincide.
A trained infrared contour model (a machine learning model) may be used to identify the face infrared contour. In the embodiments of this specification, the input of the infrared contour model is the face infrared data, or one or more vectors generated from it, and its output includes the face infrared contour. The infrared contour model can be implemented with various machine learning algorithms and trained in a manner suited to the chosen algorithm, without limitation. In addition, the three contours matching one another means that the deviation among them is within a third predetermined range, and their positions coinciding means that the positional deviation among them is within a fourth predetermined range.
The infrared contour model can generally judge from the face infrared data whether a face is present and include that result in its output. The judgment method and criteria can be determined by the needs of the actual application scenario and the specific implementation of the infrared contour model, without limitation. Before the face contours and face features are compared, it can be judged whether a face is present in the face visible light data, the face radar data and the face infrared data; if a face is absent from any of the three, authentication is judged to have failed.
The infrared camera images from the infrared rays radiated or reflected by objects and can image in darkness; unlike a visible light face image, an infrared face image is not affected by changes in ambient illumination, nor by facial makeup or disguise, giving it unique advantages in face authentication. Using face visible light data, face radar data and face infrared data together means an injection attack on the camera must forge matched infrared data in addition to the planar data of a video or photo and the matched 3D stereoscopic data, further increasing the difficulty of forgery and improving the security of face authentication.
The foregoing describes specific embodiments of the present disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims can be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
In one application example of the present specification, a visible light camera, an infrared camera and a laser radar camera are mounted on a terminal. The terminal stores a trained visible light recognition model, a radar recognition model and an infrared contour model. The visible light recognition model takes face visible light data as input and outputs whether a face is present, the face visible light contour and the face visible light features; the radar recognition model takes face radar data as input and outputs whether a three-dimensional face is present, the face radar contour and the face radar features; the infrared contour model takes face infrared data as input and outputs whether a face is present and the face infrared contour.
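The inputs and outputs of the three models can be sketched as typed records; the field names are illustrative assumptions, and the asymmetry (the infrared contour model outputs no features) follows the description above:

```python
from dataclasses import dataclass

Point = tuple[float, float]

@dataclass
class VisibleLightOutput:
    face_present: bool
    contour: list[Point]        # face visible light contour as (x, y) points
    features: dict[str, float]  # e.g. {"eye_to_eye": 62.0, ...} (assumed names)

@dataclass
class RadarOutput:
    face_3d_present: bool       # radar verifies the face is three-dimensional
    contour: list[Point]
    features: dict[str, float]

@dataclass
class InfraredOutput:
    face_present: bool
    contour: list[Point]        # the infrared contour model outputs no features
```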
The flow of face authentication by the user is shown in fig. 2. When face authentication starts, the terminal collects face visible light data, face radar data and face infrared data through the visible light camera, the laser radar camera and the infrared camera, respectively.
The terminal inputs the face visible light data into the visible light recognition model, the face radar data into the radar recognition model, and the face infrared data into the infrared contour model, obtaining the outputs of the three models. After determining that a face is present in the face visible light data, a three-dimensional face is present in the face radar data and a face is present in the face infrared data, the terminal compares the face visible light contour, the face radar contour and the face infrared contour. If no face is present in any of the face visible light data, the face radar data or the face infrared data, authentication failure is returned to the user.
When the face visible light contour, the face radar contour and the face infrared contour match and their positions coincide, the terminal compares the face visible light features with the face radar features; otherwise, it returns authentication failure to the user.
If the face visible light features match the face radar features, the terminal performs the preset privacy processing on the face visible light data to obtain face visible light privacy data, and uploads the face visible light privacy data to the authentication server. If the two do not match, the terminal returns authentication failure to the user.
The authentication server stores the verified face privacy data of users, obtained from the national authority. The verified face privacy data are generated by the national authority by applying the same privacy processing to the verified face images of users, and are provided to the authentication server. After receiving the face visible light privacy data of the user to be authenticated uploaded by the terminal, the authentication server retrieves the verified face privacy data of the same user; if it is identical to the data uploaded by the terminal, a message of successful authentication is returned to the terminal, otherwise a message of authentication failure is returned, and the terminal informs the user that authentication has failed.
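The terminal-server exchange above can be sketched as follows. The specification does not fix the privacy processing algorithm; the salted SHA-256 digest and the in-memory server below are illustrative assumptions standing in for the preset processing and the authentication server:

```python
import hashlib

def privacy_process(face_visible_data: bytes, salt: bytes = b"demo-salt") -> str:
    """Illustrative privacy processing: an irreversible digest of the image.

    The specification only requires that the terminal and the national
    authority apply the *same* preset processing, so the same salt and
    algorithm must be used on both sides; salted SHA-256 stands in here.
    """
    return hashlib.sha256(salt + face_visible_data).hexdigest()

class AuthServer:
    """Holds verified face privacy data keyed by user id."""

    def __init__(self, verified: dict[str, str]):
        self.verified = verified

    def authenticate(self, user_id: str, uploaded_privacy_data: str) -> bool:
        # Identical privacy data -> authentication succeeds.
        return self.verified.get(user_id) == uploaded_privacy_data

def finish_authentication(server: AuthServer, user_id: str,
                          face_visible_data: bytes) -> bool:
    """Terminal side, after the contour and feature checks have passed."""
    uploaded = privacy_process(face_visible_data)
    return server.authenticate(user_id, uploaded)
```

Because only the digest leaves the terminal, the raw face image never reaches the authentication server under this scheme.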
Corresponding to the above flow, embodiments of the present specification further provide a device for implementing face authentication. The device may be realized by software, by hardware, or by a combination of the two. Taking software implementation as an example, the device in a logical sense is formed by the CPU (Central Processing Unit) of the host device reading the corresponding computer program instructions into memory and running them. In terms of hardware, in addition to the CPU, memory and storage shown in fig. 3, the device in which the face authentication device is located generally includes other hardware such as a chip for wireless signal transceiving, and/or a board for implementing a network communication function.
Fig. 4 shows a device for implementing face authentication according to an embodiment of the present disclosure, including a data acquisition unit, a feature recognition unit, and a user authentication unit, where: the data acquisition unit is used for acquiring face visible light data and face radar data of a user to be authenticated; the face visible light data and the face radar data are collected simultaneously; the feature recognition unit is used for recognizing the visible light features of the human face based on the visible light data of the human face; identifying a face radar feature based on the face radar data; and the user authentication unit is used for authenticating the user by adopting the face visible light data and the verified face information of the user when the face visible light characteristics are matched with the face radar characteristics.
In one implementation, the apparatus further includes: the contour recognition unit is used for recognizing the visible light contour of the human face based on the visible light data of the human face; identifying a face radar profile based on the face radar data; the user authentication unit is specifically configured to: and when the human face visible light characteristics are matched with the human face radar characteristics, the human face visible light outline is matched with the human face radar outline, and the positions of the human face visible light outline and the human face radar outline are coincident, authenticating the user by adopting the human face visible light data and the verified human face information of the user.
In the above implementation manner, the data acquisition unit is specifically configured to: acquiring face visible light data, face radar data and face infrared data of a user to be authenticated; the face visible light data, the face radar data and the face infrared data are collected simultaneously; the device also comprises an infrared contour unit for identifying the infrared contour of the face based on the infrared data of the face; the user authentication unit is specifically configured to: and when the human face visible light characteristics are matched with the human face radar characteristics, the human face visible light outline, the human face radar outline and the human face infrared outline are matched, and the positions of the human face visible light outline, the human face radar outline and the human face infrared outline are overlapped, authenticating the user by adopting the human face visible light data and the verified human face information of the user.
In one example, the apparatus further comprises: the human face existence unit is used for judging whether a human face exists or not based on the human face visible light data and judging whether a three-dimensional human face exists or not based on the human face radar data; when the human face is not present in the human face visible light data or the human face is not present in the human face radar data, authentication fails.
In the above example, the data acquisition unit is specifically configured to: acquire face visible light data, face radar data and face infrared data of the user to be authenticated, the three being collected simultaneously. The face existence unit is specifically configured to: judge whether a face is present based on the face visible light data, judge whether a three-dimensional face is present based on the face radar data, and judge whether a face is present based on the face infrared data; when no face is present in the face visible light data, no three-dimensional face is present in the face radar data, or no face is present in the face infrared data, authentication fails.
Optionally, the data acquisition unit is specifically configured to: the terminal collects face visible light data and face radar data of a user to be authenticated at the same time; the feature recognition unit is specifically configured to: identifying, by a terminal, a face visible light feature based on the face visible light data; identifying a face radar feature based on the face radar data; the verified face information includes: the authenticated face visible light image is subjected to privacy treatment to obtain verified face privacy data; the user authentication unit is specifically configured to: when the face visible light characteristics are matched with the face radar characteristics, the terminal performs privacy processing on the face visible light data to obtain face visible light privacy data, and the face visible light privacy data is uploaded to an authentication server; and the authentication server authenticates the user according to the verified face privacy data and the face visible light privacy data.
Optionally, the face visible light features include one or more of: the distance between any two facial organs, and the distance between a facial organ and the face boundary. The face radar features include one or more of: the distance between any two facial organs, the distance between a facial organ and the face boundary, and the distance between a facial organ and the camera.
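The feature comparison can be sketched over these distance features. Because the radar features may additionally include organ-to-camera distances with no visible light counterpart, only the shared distances are compared; the dictionary representation, feature names and tolerance are illustrative assumptions:

```python
def features_match(visible_features: dict[str, float],
                   radar_features: dict[str, float],
                   tolerance: float = 2.0) -> bool:
    """Compare only the distances both modalities can measure.

    The radar features may also contain organ-to-camera distances, which
    the visible light model cannot produce, so matching is restricted to
    the shared keys. The per-distance tolerance is an assumed stand-in
    for the preset range within which the features are deemed to match.
    """
    shared = visible_features.keys() & radar_features.keys()
    if not shared:
        return False  # nothing comparable: treat as a mismatch
    return all(abs(visible_features[k] - radar_features[k]) <= tolerance
               for k in shared)
```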
Embodiments of the present specification provide a computer device that includes a memory and a processor. The memory stores a computer program executable by the processor; when running the stored computer program, the processor performs the steps of the method for implementing face authentication in the embodiments of the present specification. For a detailed description of those steps, refer to the foregoing description, which is not repeated here.

Embodiments of the present specification also provide a computer-readable storage medium on which a computer program is stored; when executed by a processor, the program performs the steps of the method for implementing face authentication in the embodiments of the present specification. For a detailed description of those steps, refer to the foregoing description, which is not repeated here.
The foregoing description of the preferred embodiments is merely illustrative and is not intended to limit the claimed subject matter to the particular embodiments disclosed; on the contrary, the intention is to cover all modifications, equivalents and alternatives falling within the spirit and principles of the invention.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or nonvolatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises", "comprising" or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article or apparatus. Without further limitation, an element introduced by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article or apparatus that comprises the element.
It will be appreciated by those skilled in the art that embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, embodiments of the present specification may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Moreover, embodiments of the present description may take the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.

Claims (14)

1. A method for realizing face authentication comprises the following steps:
acquiring face visible light data and face radar data of a user to be authenticated; the face visible light data and the face radar data are collected simultaneously;
identifying a face visible light feature based on the face visible light data; identifying a face radar feature based on the face radar data; the face visible light features include one or more of: the distance between any two facial organs, and the distance between a facial organ and the face boundary; the face radar features include one or more of: the distance between any two facial organs, and the distance between a facial organ and the face boundary;
identifying a face visible light profile based on the face visible light data; identifying a face radar profile based on the face radar data;
and when the human face visible light characteristics are matched with the human face radar characteristics, the human face visible light outline is matched with the human face radar outline, and the positions of the human face visible light outline and the human face radar outline are coincident, authenticating the user by adopting the human face visible light data and the verified human face information of the user.
2. The method of claim 1, the acquiring face visible light data and face radar data of the user to be authenticated, comprising: acquiring face visible light data, face radar data and face infrared data of a user to be authenticated; the face visible light data, the face radar data and the face infrared data are collected simultaneously;
the method further comprises the steps of: identifying a face infrared contour based on the face infrared data;
when the face visible light feature is matched with the face radar feature, the face visible light outline is matched with the face radar outline, and the positions of the face visible light outline and the face radar outline are coincident, authenticating the user by adopting the face visible light data and the verified face information of the user comprises the following steps: and when the human face visible light characteristics are matched with the human face radar characteristics, the human face visible light outline, the human face radar outline and the human face infrared outline are matched, and the positions of the human face visible light outline, the human face radar outline and the human face infrared outline are overlapped, authenticating the user by adopting the human face visible light data and the verified human face information of the user.
3. The method of claim 1, further comprising: judging whether a face is present based on the face visible light data, and judging whether a three-dimensional face is present based on the face radar data; when no face is present in the face visible light data or no face is present in the face radar data, authentication fails.
4. A method according to claim 3, the obtaining face visible light data and face radar data of the user to be authenticated comprising: acquiring face visible light data, face radar data and face infrared data of a user to be authenticated; the face visible light data, the face radar data and the face infrared data are collected simultaneously;
the judging whether a human face exists based on the human face visible light data, judging whether a three-dimensional human face exists based on the human face radar data, comprises the following steps: judging whether a human face exists based on the human face visible light data, judging whether a three-dimensional human face exists based on the human face radar data, and judging whether a human face exists based on the human face infrared data;
the failing of authentication when no face is present in the face visible light data or no face is present in the face radar data comprises: failing authentication when no face is present in the face visible light data, no three-dimensional face is present in the face radar data, or no face is present in the face infrared data.
5. The method of claim 1, the acquiring face visible light data and face radar data of the user to be authenticated, comprising: the terminal collects face visible light data and face radar data of a user to be authenticated at the same time;
the face visible light characteristics are identified based on the face visible light data; identifying facial radar features based on the facial radar data, comprising: identifying, by a terminal, a face visible light feature based on the face visible light data; identifying a face radar feature based on the face radar data;
the verified face information includes: the authenticated face visible light image is subjected to privacy treatment to obtain verified face privacy data;
when the face visible light feature is matched with the face radar feature, the face visible light outline is matched with the face radar outline, and the positions of the face visible light outline and the face radar outline are coincident, authenticating the user by adopting the face visible light data and the verified face information of the user comprises the following steps: when the human face visible light characteristics are matched with the human face radar characteristics, the human face visible light outline is matched with the human face radar outline, and the positions of the human face visible light outline and the human face radar outline are coincident, the terminal performs privacy processing on the human face visible light data to obtain human face visible light privacy data, and the human face visible light privacy data is uploaded to an authentication server; and the authentication server authenticates the user according to the verified face privacy data and the face visible light privacy data.
6. The method of claim 1, the face radar feature further comprising: the distance of a certain facial organ to the camera.
7. An implementation device for face authentication, comprising:
the data acquisition unit is used for acquiring face visible light data and face radar data of a user to be authenticated; the face visible light data and the face radar data are collected simultaneously;
the feature recognition unit is used for identifying a face visible light feature based on the face visible light data and identifying a face radar feature based on the face radar data; the face visible light features include one or more of: the distance between any two facial organs, and the distance between a facial organ and the face boundary; the face radar features include one or more of: the distance between any two facial organs, and the distance between a facial organ and the face boundary;
the contour recognition unit is used for recognizing the visible light contour of the human face based on the visible light data of the human face; identifying a face radar profile based on the face radar data;
and the user authentication unit is used for authenticating the user by adopting the human face visible light data and the verified human face information of the user when the human face visible light characteristics are matched with the human face radar characteristics, the human face visible light outline is matched with the human face radar outline, and the positions of the human face visible light outline and the human face radar outline are coincident.
8. The apparatus of claim 7, the data acquisition unit is specifically configured to: acquiring face visible light data, face radar data and face infrared data of a user to be authenticated; the face visible light data, the face radar data and the face infrared data are collected simultaneously;
the apparatus further comprises: the infrared contour unit is used for identifying the infrared contour of the face based on the infrared data of the face;
the user authentication unit is specifically configured to: and when the human face visible light characteristics are matched with the human face radar characteristics, the human face visible light outline, the human face radar outline and the human face infrared outline are matched, and the positions of the human face visible light outline, the human face radar outline and the human face infrared outline are overlapped, authenticating the user by adopting the human face visible light data and the verified human face information of the user.
9. The apparatus of claim 7, further comprising: the human face existence unit is used for judging whether a human face exists or not based on the human face visible light data and judging whether a three-dimensional human face exists or not based on the human face radar data; when the human face is not present in the human face visible light data or the human face is not present in the human face radar data, authentication fails.
10. The apparatus of claim 9, the data acquisition unit is specifically configured to: acquiring face visible light data, face radar data and face infrared data of a user to be authenticated; the face visible light data, the face radar data and the face infrared data are collected simultaneously;
the face existence unit is specifically configured to: judge whether a face is present based on the face visible light data, judge whether a three-dimensional face is present based on the face radar data, and judge whether a face is present based on the face infrared data; when no face is present in the face visible light data, no three-dimensional face is present in the face radar data, or no face is present in the face infrared data, authentication fails.
11. The apparatus of claim 7, the data acquisition unit is specifically configured to: the terminal collects face visible light data and face radar data of a user to be authenticated at the same time;
the feature recognition unit is specifically configured to: identifying, by a terminal, a face visible light feature based on the face visible light data; identifying a face radar feature based on the face radar data;
the verified face information includes: the authenticated face visible light image is subjected to privacy treatment to obtain verified face privacy data;
The user authentication unit is specifically configured to: when the human face visible light characteristics are matched with the human face radar characteristics, the human face visible light outline is matched with the human face radar outline, and the positions of the human face visible light outline and the human face radar outline are coincident, the terminal performs privacy processing on the human face visible light data to obtain human face visible light privacy data, and the human face visible light privacy data is uploaded to an authentication server; and the authentication server authenticates the user according to the verified face privacy data and the face visible light privacy data.
12. The apparatus of claim 7, the face radar features further comprising: the distance of a certain facial organ to the camera.
13. A computer device, comprising: a memory and a processor; the memory has stored thereon a computer program executable by the processor; the processor, when running the computer program, performs the method of any one of claims 1 to 6.
14. A computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the method of any of claims 1 to 6.
CN202110425395.XA 2021-04-20 2021-04-20 Method and device for realizing face authentication Active CN113065507B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110425395.XA CN113065507B (en) 2021-04-20 2021-04-20 Method and device for realizing face authentication


Publications (2)

Publication Number Publication Date
CN113065507A CN113065507A (en) 2021-07-02
CN113065507B true CN113065507B (en) 2023-06-02

Family

ID=76567324

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110425395.XA Active CN113065507B (en) 2021-04-20 2021-04-20 Method and device for realizing face authentication

Country Status (1)

Country Link
CN (1) CN113065507B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109117725A (en) * 2018-07-09 2019-01-01 深圳市科脉技术股份有限公司 Face identification method and device

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101339607B (en) * 2008-08-15 2012-08-01 北京中星微电子有限公司 Human face recognition method and system, human face recognition model training method and system
CN105513221B (en) * 2015-12-30 2018-08-14 四川川大智胜软件股份有限公司 A kind of ATM machine antifraud apparatus and system based on three-dimensional face identification
CN107292283A (en) * 2017-07-12 2017-10-24 深圳奥比中光科技有限公司 Mix face identification method
CN111126146B (en) * 2018-04-12 2024-03-05 Oppo广东移动通信有限公司 Image processing method, image processing device, computer readable storage medium and electronic apparatus
CN110443096A (en) * 2018-05-02 2019-11-12 上海聚虹光电科技有限公司 The application method of changeable filter camera for cloud platform identification
CN108492431B (en) * 2018-05-30 2024-03-29 中海云智慧(北京)物联网科技有限公司 Intelligent access control system
CN110059644A (en) * 2019-04-23 2019-07-26 杭州智趣智能信息技术有限公司 A kind of biopsy method based on facial image, system and associated component
CN112257641A (en) * 2020-10-30 2021-01-22 中电万维信息技术有限责任公司 Face recognition living body detection method


Also Published As

Publication number Publication date
CN113065507A (en) 2021-07-02


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant