CN113065507A - Method and device for realizing face authentication - Google Patents


Publication number
CN113065507A
CN113065507A (application CN202110425395.XA; granted publication CN113065507B)
Authority
CN
China
Prior art keywords
face
human face
visible light
data
radar
Prior art date
Legal status
Granted
Application number
CN202110425395.XA
Other languages
Chinese (zh)
Other versions
CN113065507B (en)
Inventor
贺三元
Current Assignee
Alipay Hangzhou Information Technology Co Ltd
Original Assignee
Alipay Hangzhou Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Alipay Hangzhou Information Technology Co Ltd
Priority to CN202110425395.XA
Publication of CN113065507A
Application granted
Publication of CN113065507B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168 Feature extraction; Face representation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172 Classification, e.g. identification

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Collating Specific Patterns (AREA)

Abstract

The present specification provides a method for implementing face authentication, including: acquiring face visible light data and face radar data of a user to be authenticated, the two being collected by the terminal simultaneously; identifying face visible light features based on the face visible light data and face radar features based on the face radar data; and, when the face visible light features match the face radar features, authenticating the user using the face visible light data and the user's verified face information.

Description

Method and device for realizing face authentication
Technical Field
The present disclosure relates to the field of network communication technologies, and in particular, to a method and an apparatus for implementing face authentication.
Background
Face recognition is a biometric technology for identity recognition based on facial feature information of a person. With the rapid development of computer vision technology, big data, machine learning and other technologies in recent years, the technology for user authentication through face recognition is more and more widely applied in the fields of financial transactions, unmanned retail, security monitoring, public transportation and the like, and brings great convenience to the work and life of people.
Meanwhile, the application of face authentication also introduces new security risks. For example, an injection attack on the camera can pass off a pre-recorded face photo or face video as a live photo or video captured by the camera, so that the device responsible for face-recognition authentication mistakes the attack for a normal face authentication, reaches a wrong conclusion, and causes losses to the user. Strengthening the security of face authentication has therefore become a new technical challenge.
Disclosure of Invention
In view of this, the present specification provides a method for implementing face authentication, including:
acquiring visible light data of a face and radar data of the face of a user to be authenticated; the human face visible light data and the human face radar data are collected simultaneously;
identifying visible light characteristics of the face based on the visible light data of the face; identifying face radar features based on the face radar data;
and when the human face visible light characteristics are matched with the human face radar characteristics, authenticating the user by adopting the human face visible light data and verified human face information of the user.
This specification also provides an implementation apparatus for face authentication, including:
the data acquisition unit is used for acquiring the visible light data of the face of the user to be authenticated and the face radar data; the human face visible light data and the human face radar data are collected simultaneously;
the characteristic identification unit is used for identifying the visible light characteristics of the human face based on the visible light data of the human face; identifying face radar features based on the face radar data;
and the user authentication unit is used for authenticating the user by adopting the human face visible light data and the verified human face information of the user when the human face visible light characteristic is matched with the human face radar characteristic.
This specification provides a computer device comprising: a memory and a processor; the memory having stored thereon a computer program executable by the processor; when the processor runs the computer program, the method for realizing the human face authentication is executed.
The present specification also provides a computer-readable storage medium on which a computer program is stored; when the program is executed by a processor, it performs the above method for implementing face authentication.
According to the above technical solution, in the embodiments of this specification, the face visible light data and face radar data of the user to be authenticated are collected at the same time; after it is verified that the face visible light features obtained from the visible light data match the face radar features obtained from the radar data, the user is authenticated against the user's verified face information.
Drawings
Fig. 1 is a flowchart of a method for implementing face authentication in an embodiment of the present disclosure;
fig. 2 is a flowchart of interaction between a terminal for face authentication and an authentication server in an application example of the present specification;
FIG. 3 is a diagram of hardware architecture for implementing an embodiment of the present description;
fig. 4 is a logic structure diagram of an apparatus for implementing face authentication in an embodiment of the present specification.
Detailed Description
The embodiments of this specification provide a new method for implementing face authentication: face visible light data and face radar data of the user are collected at the same time, face visible light features are obtained from the two-dimensional visible light data, face radar features are obtained from the three-dimensional radar data, and, after it is verified that the two feature sets match, the user is authenticated based on the user's verified face information. An injection attack can then succeed only by supplying matched two-dimensional and three-dimensional data, and the difficulty and time needed to forge three-dimensional data far exceed those of forging two-dimensional data. This greatly raises the threshold and cost of injection attacks and gives face authentication better security.
In the embodiment of the present specification, the face authentication may be performed independently on the terminal, or may be performed cooperatively by the terminal and the authentication server. In addition, the terminal or the authentication server may belong to different owners in different application scenarios. Several examples are given below:
for example, when the device is unlocked, the terminal may be a terminal of a user to be authenticated, and the face authentication is independently completed by the terminal of the user.
For example, in login authentication, the terminal may be a terminal of a user to be authenticated, face authentication may be independently performed by the terminal of the user and the result is notified to an authentication server of the system to be logged in, or face authentication may be performed by both the terminal of the user and the authentication server of the system to be logged in.
And in the face brushing payment, the terminal can be a payee terminal for receiving the user payment, and the face authentication is completed by the payee terminal and the authentication server of the payment system in a cooperation manner.
In the embodiments of the present description, the terminal may be a mobile phone, a tablet Computer, a PC (Personal Computer), a notebook, a face-brushing payment machine, a face-brushing attendance machine, or the like; the authentication server may be one physical or logical server, or two or more physical or logical servers sharing different responsibilities and cooperating with each other to implement the functions of the authentication server in the embodiments of the present specification. In a scene that the terminal and the authentication server cooperatively perform face authentication, the terminal and the authentication server can be mutually accessed through a network. The embodiments of the present specification do not limit the types of the terminal and the authentication server, and the type and protocol of the communication network between the terminal and the authentication server.
In the embodiment of the present specification, a flow of a method for implementing face authentication is shown in fig. 1.
And step 110, acquiring the visible light data of the face and the radar data of the face of the user to be authenticated.
In the embodiments of the present specification, a visible light camera and a radar camera are mounted on the terminal. The visible light camera is a conventional camera used for taking photos or recording videos; to distinguish it from the radar camera and the infrared camera, this specification calls it the visible light camera. The radar camera may be a millimeter-wave radar camera or a lidar camera; any radar camera whose precision meets the requirements of the application scenario can be used.
After the face authentication process is started, the terminal simultaneously uses the visible light camera and the radar camera to acquire, respectively, face visible light data and face radar data of the user to be authenticated. The face visible light data may be one face photo, two or more face photos (taken continuously or not), a video containing the face, and so on, without limitation. The face radar data is generally three-dimensional data (such as point cloud data) collected by the radar camera within its emission range; the specific data format is not limited (for example, data in a matrix format).
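The two simultaneous captures can be sketched as plain data structures. This is only a sketch: the class and field names below are illustrative assumptions, not from the patent, and a timestamp-skew bound stands in for the patent's requirement that both captures target the same face pose, angle and motion.

```python
from dataclasses import dataclass

@dataclass
class VisibleLightFrame:
    """A 2D capture from the visible light camera (photo or video frame)."""
    pixels: list          # H x W rows of RGB tuples; a real system would use an image array
    timestamp: float      # capture time in seconds

@dataclass
class RadarFrame:
    """A 3D capture from the radar camera, e.g. point cloud data."""
    points: list          # list of (x, y, z) points within the radar emission range
    timestamp: float      # capture time in seconds

def captured_simultaneously(vis: VisibleLightFrame, radar: RadarFrame,
                            max_skew_s: float = 0.05) -> bool:
    """Approximate 'collected simultaneously' with a bound on timestamp skew
    (the 50 ms default is an assumption, not a value from the patent)."""
    return abs(vis.timestamp - radar.timestamp) <= max_skew_s
```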
It should be noted that the face visible light data and the face radar data are acquired simultaneously, in other words, the acquisition of the face visible light data and the face radar data is directed to the same face pose, angle and motion. Therefore, when the visible light camera and the radar camera simultaneously collect the face information of a person in a normal face authentication process, the collected face visible light data and the collected face radar data reflect the same face information, and the face visible light data and the face radar data can be matched with each other.
Step 120, identifying visible light characteristics of the human face based on the visible light data of the human face; and identifying the face radar characteristics based on the face radar data.
The face features include information of the recognizable parts of the face itself, position information of the recognizable parts in the face, and positional relationship information between two or more recognizable parts, and the like. For the sake of distinction, in the embodiments of the present specification, the face features recognized from the face visible light data are referred to as face visible light features, and the face features recognized from the face radar data are referred to as face radar features.
One or more of the above-mentioned information (information of the recognizable portion itself, position information of the recognizable portion on the face, position relationship information between two or more recognizable portions, and the like) that can be recognized from the face visible light data or the face radar data may be used as the face feature used in the embodiment of the present specification, and is not limited. For example, the recognizable part may be a face organ such as eyes, nose, mouth, eyebrows, etc., a locatable face position point in a person or in the center of the eyebrows, a face boundary (e.g., a face contour line), etc.; the information of the recognizable portion itself may be the lateral and longitudinal widths of the eyes, the length and width of the nose, etc.; the position information of the recognizable part on the face can be the horizontal distance from the center point of the left eye to the contour line of the left face, the vertical distance from the nose tip to the contour line of the upper face, the vertical distance from the eyebrow center to the contour line of the lower face and the like; the positional relationship information between the recognizable portions may be a distance between center points of both eyes, a side length of each side of a triangle formed by the center points of both eyes and a center point of a mouth, and the like.
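The kinds of features listed above (sizes of parts, positions of parts on the face, and relationships between parts) reduce to distances between located landmarks. A minimal sketch, with assumed landmark names and a common unit such as centimeters:

```python
import math

def dist(p, q):
    """Euclidean distance between two 2D landmark points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def face_features(landmarks):
    """Derive example face features of the kinds listed in the text.
    `landmarks` maps part names (assumed here) to (x, y) coordinates."""
    le = landmarks["left_eye_center"]
    re = landmarks["right_eye_center"]
    mouth = landmarks["mouth_center"]
    return {
        # positional relationship between two recognizable parts
        "interocular_distance": dist(le, re),
        # side lengths of the triangle formed by the eye centers and mouth center
        "left_eye_to_mouth": dist(le, mouth),
        "right_eye_to_mouth": dist(re, mouth),
    }
```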
In an actual application scene, one or more human face features to be used in the application scene can be selected as human face visible light features or human face radar features according to factors such as precision of acquired data, requirements on identification speed, requirements on identification accuracy and the like. The visible light feature of the human face belongs to a plane feature, and can be one or more of the distance between any two facial organs and the distance between a certain facial organ and a facial boundary, for example; the face radar feature may be a planar feature or a stereo feature, and may be, for example, one or more of a distance between any two face organs, a distance between a certain face organ and a face boundary, and a distance from a certain face organ to a camera.
In an embodiment of the present specification, the visible light face feature includes one or more face features, the radar face feature includes one or more face features, and at least one of the visible light face feature and the radar face feature is the same. For example, the vertical distance from the tip of the nose to the upper face contour line and the distance between the center points of the two eyes may be used as the visible light feature of the face, the distance between the center points of the two eyes may be used as the radar feature of the face, and both the visible light feature of the face and the radar feature of the face include the face feature of the distance between the center points of the two eyes.
After the face visible light data and the face radar data are obtained, the face visible light characteristics can be identified by adopting the face visible light data, and the face radar characteristics can be identified by adopting the face radar data.
Two trained machine learning models (referred to as visible light recognition model and radar recognition model for distinction) are typically used to recognize visible light features of a face and radar features of the face, respectively. In the embodiments of the present description, the input of the visible light recognition model is the obtained human face visible light data, or one or more vectors generated by using the obtained human face visible light data, and the output of the visible light recognition model includes human face visible light features; the input of the radar recognition model is the acquired face radar data or one to a plurality of vectors generated by using the acquired face radar data, and the output of the radar recognition model comprises the face radar characteristics. The embodiment of the present specification does not limit the algorithms, training modes, and the like used by the visible light recognition model and the radar recognition model.
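Since the patent deliberately leaves the model algorithms and training open, only the I/O contract described above can be pinned down: each recognizer maps raw capture data to a dictionary of face features. The wrapper names below are illustrative assumptions.

```python
from typing import Callable, Dict

class VisibleLightRecognizer:
    """Wraps a trained visible light recognition model as a callable that
    maps face visible light data (or vectors derived from it) to features."""
    def __init__(self, model: Callable[[object], Dict[str, float]]):
        self._model = model

    def features(self, visible_light_data) -> Dict[str, float]:
        return self._model(visible_light_data)

class RadarRecognizer:
    """Wraps a trained radar recognition model, e.g. a point-cloud network,
    mapping face radar data to face radar features."""
    def __init__(self, model: Callable[[object], Dict[str, float]]):
        self._model = model

    def features(self, radar_data) -> Dict[str, float]:
        return self._model(radar_data)
```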
And step 130, when the visible light characteristics of the face are matched with the radar characteristics of the face, authenticating the user by adopting the visible light data of the face and the verified face information of the user.
After the visible light characteristic of the face and the radar characteristic of the face are identified, the same face characteristic (namely the face characteristic belonging to the visible light characteristic of the face and the radar characteristic of the face) in the visible light characteristic of the face and the radar characteristic of the face is compared, if the value of each of the same face characteristic in the visible light characteristic of the face and the value difference in the radar characteristic of the face are in a preset range (different face characteristics can have different preset ranges), the visible light characteristic of the face is considered to be matched with the radar characteristic of the face; and if the value of any one of the same human face features in the visible light features is different from the value in the human face radar feature by more than a preset range, the human face visible light features are not matched with the human face radar features.
For example, assume that the face visible light features and the face radar features both include the distance between the center points of the two eyes. If the difference between this distance in the visible light features and in the radar features exceeds 0.3 cm, the two feature sets do not match. If the difference is less than or equal to 0.3 cm, the two feature sets match on that face feature; and if all other face features belonging to both the visible light features and the radar features also match, the face visible light features are judged to match the face radar features.
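The per-feature matching rule above can be sketched as follows. The function and parameter names are assumptions; the 0.3 cm default mirrors the interocular-distance example, and the patent allows a different preset range per feature.

```python
def features_match(visible, radar, tolerances, default_tol=0.3):
    """Return True when every face feature present in BOTH feature sets
    agrees within its preset range (cm). Features appearing in only one
    set are ignored, as only shared features are compared."""
    shared = set(visible) & set(radar)
    if not shared:
        return False  # nothing comparable, so a match cannot be confirmed
    return all(abs(visible[f] - radar[f]) <= tolerances.get(f, default_tol)
               for f in shared)
```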
And if the human face visible light characteristic and the human face radar characteristic are not matched, the authentication fails. And if the human face visible light characteristics are matched with the human face radar characteristics, the user is authenticated by adopting human face visible light data and verified human face information of the user to be authenticated.
According to the specific implementation of the practical application scenario, the verified face information of the user may be a visible light image of a face that belongs to the user himself or verified face privacy data obtained by performing privacy processing on the visible light image of the verified face. The verified face visible light image can be stored on the terminal of the user or acquired from a server of a state organ; the verified face privacy data can be stored on the terminal, the authentication server or a network storage location accessible by the terminal or the authentication server. Note that image data of a human face cannot be obtained from verified human face privacy data.
The specific manner of performing authentication by using the visible light data of the face and the verified face information of the user is not limited, for example, when the verified face information is a verified face visible light image, the visible light feature of the face may be used to compare with the face feature of the verified face information, and the comparison process between the picture or video data and the verified face information in the existing face authentication process may also be referred to, which is not described again. For another example, when the verified face information is verified face privacy data, the same privacy processing may be performed on the face visible light data to obtain face visible light privacy data, and then the verified face privacy data and the face visible light privacy data are compared to determine whether they are the same. When the visible light data of the human face is matched with the verified human face information of the user, the human face authentication of the user is passed; when the visible light data of the face does not match the verified face information of the user, the face authentication fails.
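The patent does not specify the privacy transform, only that face image data cannot be recovered from the verified face privacy data and that the same processing is applied on both sides before comparison. One possible sketch (an assumption, not the patent's method) quantizes the feature values, serializes them deterministically, and takes a salted one-way hash:

```python
import hashlib

def privacy_digest(features, salt=b"per-user-salt", precision=1):
    """Quantize feature values, serialize them in a canonical order, and
    hash with a per-user salt. The original image is not recoverable.
    Note: plain quantization is boundary-sensitive; a production scheme
    would need a tolerance-aware comparison instead."""
    canonical = ",".join(f"{name}:{round(value, precision)}"
                         for name, value in sorted(features.items()))
    return hashlib.sha256(salt + canonical.encode("utf-8")).hexdigest()

def privacy_match(enrolled_digest, live_features, salt=b"per-user-salt"):
    """Server-side comparison: equal digests count as the same face data."""
    return privacy_digest(live_features, salt) == enrolled_digest
```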
In an application scene of face authentication cooperatively performed by a terminal and an authentication server, the terminal can simultaneously acquire face visible light data and face radar data of a user to be authenticated, identify face visible light characteristics based on the face visible light data, and identify face radar characteristics based on the face radar data; when the human face visible light characteristics are matched with the human face radar characteristics, the terminal carries out privacy processing on the human face visible light data to obtain human face visible light privacy data, and uploads the human face visible light privacy data to the authentication server; and the authentication server authenticates the user to be authenticated according to the verified face privacy data and the face visible light privacy data uploaded by the terminal.
A radar works by transmitting beams at a target and comparing the time difference between the transmitted beams and the beams reflected back from the target; from this it can obtain parameters such as the target's distance, direction, height, speed, attitude and shape, building 3D (three-dimensional) environment perception data. A photo or video taken by a visible light camera contains only 2D (two-dimensional) plane data. In the embodiments of this specification, comparing the face features of the 3D face with those of the 2D face means that an injection attack on the camera must forge not only the plane data of a video or photo but also 3D data matched to that plane data. This greatly raises the technical threshold of forgery: an attacker must build a dedicated 3D model for every photo or video to generate the corresponding stereo data, which costs substantial time and effort, so the chance of a successful injection attack drops sharply.
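The ranging part of the radar principle above follows from the round-trip time of the beam: the beam travels to the target and back, so the one-way distance is c times the delay divided by two.

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def radar_range_m(round_trip_time_s: float) -> float:
    """One-way distance to the target from the echo delay: d = c * t / 2."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0
```

For instance, an echo delay of 4 nanoseconds corresponds to a target roughly 0.6 m away, a plausible face-capture distance.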
In some application scenarios, a face visible light profile can additionally be identified from the face visible light data and a face radar profile from the face radar data, and the two profiles compared. The user is then authenticated with the face visible light data and the user's verified face information only when the face visible light features match the face radar features, the visible light profile matches the radar profile, and the positions of the two profiles coincide. If the visible light profile does not match the radar profile, or their positions do not coincide, the authentication fails.
In the application scenario, according to the specific implementation of the visible light recognition model and the radar recognition model, the visible light profile of the face may be included in the output of the visible light recognition model, or another visible light profile model of the face may be used to recognize the visible light profile of the face from the visible light data of the face; the face radar profile can be included in the output of the radar recognition model, and another face radar profile model can be adopted to recognize the face radar profile from the face radar data; the human face visible light outline model and the human face radar outline model can be realized by adopting various suitable machine learning algorithms, and are trained by adopting a training mode adaptive to the algorithms, which is not limited.
In addition, the matching of the face visible light profile and the face radar profile can be that the deviation of the face visible light profile and the face radar profile is within a first preset range, and the coincidence of the positions of the face visible light profile and the face radar profile can be that the deviation of the positions of the face visible light profile and the face radar profile is within a second preset range.
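The two conditions above, profile deviation within a first preset range and position deviation within a second preset range, can be sketched for two equal-length contours as follows. The range values and function names are illustrative assumptions; the patent does not fix them.

```python
import math

def _centroid(contour):
    """Mean point of a contour given as a list of (x, y) tuples."""
    xs = [p[0] for p in contour]
    ys = [p[1] for p in contour]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def contours_agree(a, b, shape_range=0.5, position_range=0.5):
    """Check profile match (shape deviation after removing position, first
    preset range) and positional coincidence (centroid offset, second
    preset range) for contours with corresponding points."""
    ca, cb = _centroid(a), _centroid(b)
    if math.hypot(ca[0] - cb[0], ca[1] - cb[1]) > position_range:
        return False  # positions do not coincide
    # shape deviation: mean distance between corresponding centered points
    dev = sum(math.hypot((pa[0] - ca[0]) - (pb[0] - cb[0]),
                         (pa[1] - ca[1]) - (pb[1] - cb[1]))
              for pa, pb in zip(a, b)) / len(a)
    return dev <= shape_range
```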
The visible light recognition model can generally judge whether a human face exists or not from human face visible light data and include whether the human face exists or not in the output of the model; the radar recognition model may typically determine whether a stereo face is present based on the face radar data and include the presence of the stereo face in the output of the model. In this case, it may be determined that the face authentication fails when a face does not exist in the face visible light data or a stereoscopic face does not exist in the face radar data. The determination method and the determination standard for determining whether a face exists or not and whether a three-dimensional face exists or not may be determined according to the requirements of the actual application scene and the specific implementation of the visible light recognition model or the radar recognition model, and are not limited. For example, one or more specific portions of the face that cannot be extracted and identified may be used as the determination condition for the absence of the face or the absence of the stereoscopic face.
Some terminals are equipped with infrared cameras in addition to visible light cameras and radar cameras. On the terminals, the infrared camera can be used for collecting the face infrared data of the user to be authenticated while collecting the face visible light data and the face radar data. The human face visible light data, the human face radar data and the human face infrared data which are collected simultaneously are obtained aiming at the same human face posture, angle and action, and the same face information is reflected. The face infrared data may be one face infrared picture, two or more face infrared pictures (continuous shooting or non-continuous shooting), one section of infrared video including a face, and the like, without limitation.
After the face infrared data are obtained, the face infrared profile is identified based on the face infrared data, the identified face infrared profile is compared with the face visible light profile and the face radar profile, and when the face visible light characteristic is matched with the face radar characteristic, the face visible light profile is matched with the face radar profile, the face radar profile is matched with the face infrared profile and the positions of the face visible light profile, the face radar profile and the face infrared profile are coincident, the face visible light data and verified face information of a user to be authenticated are adopted to authenticate the user.
A trained infrared contour model (a machine learning model) can be typically employed to recognize the infrared contour of a human face. In the embodiment of the present specification, the input of the infrared contour model is face infrared data or one or more vectors generated by using the face infrared data, and the output of the infrared contour model comprises a face infrared contour. The infrared contour model can be trained by adopting various machine learning algorithms and a training mode suitable for the algorithms without limitation. In addition, the human face visible light profile, the human face radar profile and the human face infrared profile are matched, wherein the deviation of the three profiles is within a third preset range; the positions of the visible face outline, the radar face outline and the infrared face outline are coincident, and the position deviation of the three outlines is within a fourth preset range.
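When the infrared profile is added, the visible light, radar and infrared profiles must mutually agree in shape and position (the third and fourth preset ranges). That three-way gate reduces to running a pairwise contour check over every pair; the helper below takes any such pairwise checker and is an illustrative sketch, not the patent's wording.

```python
from itertools import combinations

def three_contours_coincide(visible, radar, infrared, agree):
    """True only if every pair of the three face profiles passes the
    pairwise shape-and-position check `agree`."""
    return all(agree(a, b)
               for a, b in combinations((visible, radar, infrared), 2))
```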
The infrared contour model can generally determine from the face infrared data whether a face is present and include the presence of the face in the output of the model. The determination mode and the determination standard for determining whether the human face exists in the human face infrared data can be determined according to the requirements of the actual application scene and the specific implementation of the infrared contour model, and are not limited. Before comparing the human face outline with the human face characteristics, whether human faces exist in the human face visible light data, the human face radar data and the human face infrared data or not can be judged. And when the visible light data of the face does not have the face, or the radar data of the face does not have a stereoscopic face, or the infrared data of the face does not have the face, judging that the authentication fails.
An infrared camera images from the infrared rays radiated or reflected by an object. It can image in the dark, is not susceptible to changes in ambient illumination the way a visible light face image is, and is unaffected by facial makeup or camouflage, giving it unique advantages in face authentication. Moreover, when face visible light data, face radar data and face infrared data are all used for face authentication, an injection attack on the camera must forge not only the planar data of a video or photo and the 3D stereo data matching it, but also matching infrared data, which further increases the difficulty of forgery and improves the security of face authentication.
The foregoing description has been directed to specific embodiments of this disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
In one application example of this specification, a visible light camera, an infrared camera and a laser radar camera are mounted on a terminal, and the trained visible light recognition model, radar recognition model and infrared contour model are stored on the terminal. The input of the visible light recognition model is face visible light data, and its output is whether a face is present, the face visible light contour and the face visible light features. The input of the radar recognition model is face radar data, and its output is whether a three-dimensional face is present, the face radar contour and the face radar features. The input of the infrared contour model is face infrared data, and its output is whether a face is present and the face infrared contour.
The flow of face authentication performed by the user is shown in Fig. 2. When face authentication starts, the terminal simultaneously collects face visible light data, face radar data and face infrared data through the visible light camera, the laser radar camera and the infrared camera.
The terminal inputs the face visible light data into the visible light recognition model, the face radar data into the radar recognition model, and the face infrared data into the infrared contour model, and obtains the outputs of the three models. After determining that a face is present in the face visible light data, a three-dimensional face is present in the face radar data, and a face is present in the face infrared data, the terminal compares the face visible light contour, the face radar contour and the face infrared contour. If no face is present in the face visible light data, the face radar data or the face infrared data, authentication failure is returned to the user.
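The Fig. 2 flow on the terminal can be sketched as the following orchestration. The three model calls and the comparators are stand-ins (assumptions) injected as callables; only the ordering is taken from the text: presence checks first, then contour comparison, then feature comparison, then upload.

```python
def authenticate(vis_data, radar_data, ir_data, models, compare, upload):
    """Terminal-side cascade: any failed stage short-circuits to failure."""
    vis = models["visible"](vis_data)    # -> (has_face, contour, features)
    rad = models["radar"](radar_data)    # -> (has_stereo_face, contour, features)
    ir = models["infrared"](ir_data)     # -> (has_face, contour)

    if not (vis[0] and rad[0] and ir[0]):
        return "authentication failed"   # some modality saw no face
    if not compare["contours"](vis[1], rad[1], ir[1]):
        return "authentication failed"   # contours mismatch or not coincident
    if not compare["features"](vis[2], rad[2]):
        return "authentication failed"   # visible vs. radar features mismatch
    return upload(vis_data)              # privacy-process and send to server
```

The callable-based wiring is only for illustration; an actual terminal would invoke its on-device models directly.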
When the face visible light contour, the face radar contour and the face infrared contour match and their positions coincide, the terminal compares the face visible light features with the face radar features; otherwise, authentication failure is returned to the user.
If the face visible light features match the face radar features, the terminal performs preset privacy processing on the face visible light data to obtain face visible light privacy data, and uploads the face visible light privacy data to the authentication server. If the face visible light features and the face radar features do not match, the terminal returns authentication failure to the user.
The authentication server stores verified face privacy data of the user, obtained from a national authority. The verified face privacy data are generated by the national authority by applying the same privacy processing to the verified face image of the user, and are provided to the authentication server. After receiving the face visible light privacy data of the user to be authenticated uploaded by the terminal, the authentication server retrieves the verified face privacy data of the same user. If the verified face privacy data are identical to the data uploaded by the terminal, an authentication success message is returned to the terminal; otherwise an authentication failure message is returned, and the terminal notifies the user that authentication has failed.
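The patent does not specify what the "preset privacy processing" is; as one illustrative assumption, an irreversible salted digest lets the server compare records byte-for-byte, as the text describes, without ever holding the raw face image. The salt and function names here are hypothetical.

```python
import hashlib

# Hypothetical shared parameter: both the terminal and the national authority
# must apply exactly the same privacy processing for equality comparison to work.
SALT = b"example-shared-salt"

def privacy_process(face_visible_bytes: bytes) -> str:
    """Irreversibly transform raw visible-light face data before upload."""
    return hashlib.sha256(SALT + face_visible_bytes).hexdigest()

def server_authenticate(uploaded: str, verified: str) -> bool:
    """Server side: pass only when the two privacy records are identical."""
    return uploaded == verified
```

Note that an exact-equality digest only works if the compared inputs are bit-identical; a deployed privacy transform would need to be robust to capture variation, but exact comparison of the processed records is what the text states.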
Corresponding to the above flow, the embodiments of this specification further provide a face authentication implementation apparatus. The apparatus may be implemented by software, by hardware, or by a combination of hardware and software. Taking a software implementation as an example, the logical apparatus is formed when the central processing unit (CPU) of the device where it resides reads the corresponding computer program instructions into memory and runs them. In terms of hardware, in addition to the CPU, memory and storage shown in Fig. 3, the device where the face authentication implementation apparatus resides typically also includes other hardware such as chips for wireless signal transmission and reception and/or boards for network communication.
Fig. 4 shows an implementation apparatus for face authentication provided in an embodiment of this specification, including a data acquisition unit, a feature recognition unit, and a user authentication unit, where: the data acquisition unit is used for acquiring the visible light data of the face and the radar data of the face of the user to be authenticated; the human face visible light data and the human face radar data are collected simultaneously; the characteristic identification unit is used for identifying the visible light characteristic of the human face based on the visible light data of the human face; identifying face radar features based on the face radar data; and the user authentication unit is used for authenticating the user by adopting the human face visible light data and the verified human face information of the user when the human face visible light characteristic is matched with the human face radar characteristic.
In one implementation, the apparatus further includes: the contour identification unit is used for identifying the visible light contour of the human face based on the visible light data of the human face; identifying a face radar profile based on the face radar data; the user authentication unit is specifically configured to: and when the human face visible light characteristic is matched with the human face radar characteristic, the human face visible light profile is matched with the human face radar profile, and the positions of the human face visible light profile and the human face radar profile are coincident, authenticating the user by adopting the human face visible light data and verified human face information of the user.
In the foregoing implementation manner, the data obtaining unit is specifically configured to: acquiring visible light data of a face, radar data of the face and infrared data of the face of a user to be authenticated; the human face visible light data, the human face radar data and the human face infrared data are collected simultaneously; the device also comprises an infrared outline unit which is used for identifying the infrared outline of the human face based on the infrared data of the human face; the user authentication unit is specifically configured to: and when the human face visible light characteristic is matched with the human face radar characteristic, the human face visible light profile, the human face radar profile and the human face infrared profile are matched, and the positions of the human face visible light profile, the human face radar profile and the human face infrared profile are coincident, authenticating the user by adopting the human face visible light data and verified human face information of the user.
In one example, the apparatus further comprises: the face existence unit is used for judging whether a face exists or not based on the face visible light data and judging whether a three-dimensional face exists or not based on the face radar data; and when the human face visible light data does not have a human face or the human face radar data does not have a three-dimensional human face, the authentication fails.
In the foregoing example, the data obtaining unit is specifically configured to: acquiring visible light data of a face, radar data of the face and infrared data of the face of a user to be authenticated; the human face visible light data, the human face radar data and the human face infrared data are collected simultaneously; the face presence unit is specifically configured to: judging whether a human face exists based on the human face visible light data, judging whether a three-dimensional human face exists based on the human face radar data, and judging whether the human face exists based on the human face infrared data; and when the human face visible light data does not have a human face, the human face radar data does not have a stereoscopic human face, or the human face infrared data does not have a human face, the authentication fails.
Optionally, the data obtaining unit is specifically configured to: simultaneously acquiring face visible light data and face radar data of a user to be authenticated by a terminal; the feature identification unit is specifically configured to: recognizing, by the terminal, a face visible light feature based on the face visible light data; identifying face radar features based on the face radar data; the verified face information includes: verified face privacy data is obtained after privacy processing is carried out on the authenticated face visible light image; the user authentication unit is specifically configured to: when the human face visible light characteristics are matched with the human face radar characteristics, the terminal carries out privacy processing on the human face visible light data to obtain human face visible light privacy data, and the human face visible light privacy data are uploaded to an authentication server; and the authentication server authenticates the user according to the verified face privacy data and the face visible light privacy data.
Optionally, the face visible light features include one or more of: the distance between any two face parts, and the distance between a face part and the face boundary; the face radar features include: the distance between any two face parts, the distance between a face part and the face boundary, and the distance from a face part to the camera.
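The distance-based features listed above can be sketched as follows. The part names and the matching tolerance are assumptions for illustration; the patent only says the visible light and radar features are compared for a match.

```python
import math

def visible_features(parts_2d):
    """Pairwise 2-D distances between named face parts (e.g. eyes, nose)."""
    names = sorted(parts_2d)
    return {(a, b): math.dist(parts_2d[a], parts_2d[b])
            for i, a in enumerate(names) for b in names[i + 1:]}

def features_match(vis_feats, radar_feats, tol=0.1):
    """Match when every shared distance agrees within a relative tolerance."""
    for key in vis_feats.keys() & radar_feats.keys():
        v, r = vis_feats[key], radar_feats[key]
        if abs(v - r) > tol * max(v, r, 1e-9):
            return False
    return True
```

Radar features would additionally carry a part-to-camera depth per part; that extra dimension is omitted here to keep the sketch to the shared distance comparison.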
Embodiments of this specification provide a computer device that includes a memory and a processor. The memory stores a computer program executable by the processor; when running the stored computer program, the processor executes the steps of the face authentication implementation method in the embodiments of this specification. For a detailed description of each step of the method, refer to the preceding content; it is not repeated here.
Embodiments of this specification provide a computer-readable storage medium on which a computer program is stored; when executed by a processor, the program performs the steps of the face authentication implementation method in the embodiments of this specification. For a detailed description of each step of the method, refer to the preceding content; it is not repeated here.
While the invention has been described in terms of what is presently considered to be the most practical and preferred embodiment, it is to be understood that the invention is not to be limited to the disclosed embodiment, but on the contrary, is intended to cover various modifications, equivalents, and alternatives falling within the spirit and scope of the invention as defined by the appended claims.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read only memory (ROM), electrically erasable programmable read only memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer readable media do not include transitory computer readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
As will be appreciated by one skilled in the art, embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, embodiments of the present description may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present description may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and so forth) having computer-usable program code embodied therein.

Claims (16)

1. A method for realizing face authentication comprises the following steps:
acquiring visible light data of a face and radar data of the face of a user to be authenticated; the human face visible light data and the human face radar data are collected simultaneously;
identifying visible light characteristics of the face based on the visible light data of the face; identifying face radar features based on the face radar data;
and when the human face visible light characteristics are matched with the human face radar characteristics, authenticating the user by adopting the human face visible light data and verified human face information of the user.
2. The method of claim 1, further comprising: recognizing a human face visible light outline based on the human face visible light data; identifying a face radar profile based on the face radar data;
when the human face visible light characteristic is matched with the human face radar characteristic, the human face visible light data and the verified human face information of the user are adopted to authenticate the user, and the authentication method comprises the following steps: and when the human face visible light characteristic is matched with the human face radar characteristic, the human face visible light profile is matched with the human face radar profile, and the positions of the human face visible light profile and the human face radar profile are coincident, authenticating the user by adopting the human face visible light data and verified human face information of the user.
3. The method of claim 2, wherein the obtaining of the visible light data of the face and the radar data of the face of the user to be authenticated comprises: acquiring visible light data of a face, radar data of the face and infrared data of the face of a user to be authenticated; the human face visible light data, the human face radar data and the human face infrared data are collected simultaneously;
the method further comprises the following steps: identifying a face infrared profile based on the face infrared data;
when the human face visible light characteristic is matched with the human face radar characteristic, the human face visible light data and the verified human face information of the user are adopted to authenticate the user, and the authentication method comprises the following steps: and when the human face visible light characteristic is matched with the human face radar characteristic, the human face visible light profile, the human face radar profile and the human face infrared profile are matched, and the positions of the human face visible light profile, the human face radar profile and the human face infrared profile are coincident, authenticating the user by adopting the human face visible light data and verified human face information of the user.
4. The method of claim 1, further comprising: judging whether a human face exists or not based on the human face visible light data, and judging whether a three-dimensional human face exists or not based on the human face radar data; and when the human face visible light data does not have a human face or the human face radar data does not have a three-dimensional human face, the authentication fails.
5. The method of claim 4, wherein the obtaining of the visible light data of the face and the radar data of the face of the user to be authenticated comprises: acquiring visible light data of a face, radar data of the face and infrared data of the face of a user to be authenticated; the human face visible light data, the human face radar data and the human face infrared data are collected simultaneously;
the judging whether a human face exists based on the human face visible light data and the judging whether a three-dimensional human face exists based on the human face radar data comprises the following steps: judging whether a human face exists based on the human face visible light data, judging whether a three-dimensional human face exists based on the human face radar data, and judging whether the human face exists based on the human face infrared data;
when the face visible light data does not have a face or the face radar data does not have a three-dimensional face, the authentication fails, and the authentication method comprises the following steps: and when the human face visible light data does not have a human face, the human face radar data does not have a stereoscopic human face, or the human face infrared data does not have a human face, the authentication fails.
6. The method of claim 1, wherein the obtaining of the visible light data of the face and the radar data of the face of the user to be authenticated comprises: simultaneously acquiring face visible light data and face radar data of a user to be authenticated by a terminal;
identifying human face visible light characteristics based on the human face visible light data; identifying face radar features based on the face radar data, comprising: recognizing, by the terminal, a face visible light feature based on the face visible light data; identifying face radar features based on the face radar data;
the verified face information includes: verified face privacy data is obtained after privacy processing is carried out on the authenticated face visible light image;
when the human face visible light characteristic is matched with the human face radar characteristic, the human face visible light data and the verified human face information of the user are adopted to authenticate the user, and the authentication method comprises the following steps: when the human face visible light characteristics are matched with the human face radar characteristics, the terminal carries out privacy processing on the human face visible light data to obtain human face visible light privacy data, and the human face visible light privacy data are uploaded to an authentication server; and the authentication server authenticates the user according to the verified face privacy data and the face visible light privacy data.
7. The method of claim 1, the visible light features of the human face comprising: one or more of a distance between any two face parts, and a distance between a certain face part and a face boundary; the face radar features include: the distance between any two face parts, the distance between a certain face part and a face boundary, and the distance from a certain face part to the camera.
8. An implementation device for face authentication comprises:
the data acquisition unit is used for acquiring the visible light data of the face of the user to be authenticated and the face radar data; the human face visible light data and the human face radar data are collected simultaneously;
the characteristic identification unit is used for identifying the visible light characteristics of the human face based on the visible light data of the human face; identifying face radar features based on the face radar data;
and the user authentication unit is used for authenticating the user by adopting the human face visible light data and the verified human face information of the user when the human face visible light characteristic is matched with the human face radar characteristic.
9. The apparatus of claim 8, further comprising: the contour identification unit is used for identifying the visible light contour of the human face based on the visible light data of the human face; identifying a face radar profile based on the face radar data;
the user authentication unit is specifically configured to: and when the human face visible light characteristic is matched with the human face radar characteristic, the human face visible light profile is matched with the human face radar profile, and the positions of the human face visible light profile and the human face radar profile are coincident, authenticating the user by adopting the human face visible light data and verified human face information of the user.
10. The apparatus according to claim 9, wherein the data acquisition unit is specifically configured to: acquiring visible light data of a face, radar data of the face and infrared data of the face of a user to be authenticated; the human face visible light data, the human face radar data and the human face infrared data are collected simultaneously;
the device further comprises: the infrared outline unit is used for identifying a human face infrared outline based on the human face infrared data;
the user authentication unit is specifically configured to: and when the human face visible light characteristic is matched with the human face radar characteristic, the human face visible light profile, the human face radar profile and the human face infrared profile are matched, and the positions of the human face visible light profile, the human face radar profile and the human face infrared profile are coincident, authenticating the user by adopting the human face visible light data and verified human face information of the user.
11. The apparatus of claim 8, further comprising: the face existence unit is used for judging whether a face exists or not based on the face visible light data and judging whether a three-dimensional face exists or not based on the face radar data; and when the human face visible light data does not have a human face or the human face radar data does not have a three-dimensional human face, the authentication fails.
12. The apparatus according to claim 11, wherein the data acquisition unit is specifically configured to: acquiring visible light data of a face, radar data of the face and infrared data of the face of a user to be authenticated; the human face visible light data, the human face radar data and the human face infrared data are collected simultaneously;
the face presence unit is specifically configured to: judging whether a human face exists based on the human face visible light data, judging whether a three-dimensional human face exists based on the human face radar data, and judging whether the human face exists based on the human face infrared data; and when the human face visible light data does not have a human face, the human face radar data does not have a stereoscopic human face, or the human face infrared data does not have a human face, the authentication fails.
13. The apparatus according to claim 8, wherein the data acquisition unit is specifically configured to: simultaneously acquiring face visible light data and face radar data of a user to be authenticated by a terminal;
the feature identification unit is specifically configured to: recognizing, by the terminal, a face visible light feature based on the face visible light data; identifying face radar features based on the face radar data;
the verified face information includes: verified face privacy data is obtained after privacy processing is carried out on the authenticated face visible light image;
the user authentication unit is specifically configured to: when the human face visible light characteristics are matched with the human face radar characteristics, the terminal carries out privacy processing on the human face visible light data to obtain human face visible light privacy data, and the human face visible light privacy data are uploaded to an authentication server; and the authentication server authenticates the user according to the verified face privacy data and the face visible light privacy data.
14. The apparatus of claim 8, the human face visible light features comprising: one or more of a distance between any two face parts, and a distance between a certain face part and a face boundary; the face radar features include: the distance between any two face parts, the distance between a certain face part and a face boundary, and the distance from a certain face part to the camera.
15. A computer device, comprising: a memory and a processor; the memory having stored thereon a computer program executable by the processor; the processor, when executing the computer program, performs the method of any of claims 1 to 7.
16. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 7.
CN202110425395.XA 2021-04-20 2021-04-20 Method and device for realizing face authentication Active CN113065507B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110425395.XA CN113065507B (en) 2021-04-20 2021-04-20 Method and device for realizing face authentication

Publications (2)

Publication Number Publication Date
CN113065507A true CN113065507A (en) 2021-07-02
CN113065507B CN113065507B (en) 2023-06-02

Family

ID=76567324

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110425395.XA Active CN113065507B (en) 2021-04-20 2021-04-20 Method and device for realizing face authentication

Country Status (1)

Country Link
CN (1) CN113065507B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101339607A (en) * 2008-08-15 2009-01-07 北京中星微电子有限公司 Human face recognition method and system, human face recognition model training method and system
CN105513221A (en) * 2015-12-30 2016-04-20 四川川大智胜软件股份有限公司 ATM (Automatic Teller Machine) cheat-proof device and system based on three-dimensional human face identification
CN107292283A (en) * 2017-07-12 2017-10-24 深圳奥比中光科技有限公司 Mix face identification method
CN108492431A (en) * 2018-05-30 2018-09-04 中海云智慧(北京)物联网科技有限公司 A kind of intelligent access control system
CN108549867A (en) * 2018-04-12 2018-09-18 Oppo广东移动通信有限公司 Image processing method, device, computer readable storage medium and electronic equipment
CN109117725A (en) * 2018-07-09 2019-01-01 深圳市科脉技术股份有限公司 Face identification method and device
CN110059644A (en) * 2019-04-23 2019-07-26 杭州智趣智能信息技术有限公司 A kind of biopsy method based on facial image, system and associated component
CN110443096A (en) * 2018-05-02 2019-11-12 上海聚虹光电科技有限公司 The application method of changeable filter camera for cloud platform identification
CN112257641A (en) * 2020-10-30 2021-01-22 中电万维信息技术有限责任公司 Face recognition living body detection method

Also Published As

Publication number Publication date
CN113065507B (en) 2023-06-02

Similar Documents

Publication Publication Date Title
US11669607B2 (en) ID verification with a mobile device
US10943095B2 (en) Methods and systems for matching extracted feature descriptors for enhanced face recognition
US9652663B2 (en) Using facial data for device authentication or subject identification
CN105868677B (en) Living body face detection method and device
Das et al. Recent advances in biometric technology for mobile devices
Killioğlu et al. Anti-spoofing in face recognition with liveness detection using pupil tracking
KR101647803B1 (en) Face recognition method through 3-dimension face model projection and Face recognition system thereof
US10733279B2 (en) Multiple-tiered facial recognition
CN112825128A (en) Method and apparatus for liveness testing and/or biometric verification
US11961329B2 (en) Iris authentication device, iris authentication method and recording medium
WO2020079741A1 (en) Iris authentication device, iris authentication method, and recording medium
WO2020065954A1 (en) Authentication device, authentication method, and storage medium
KR102215535B1 (en) Partial face image based identity authentication method using neural network and system for the method
CN113065507B (en) Method and device for realizing face authentication
CN114581978A (en) Face recognition method and system
CN112990047B (en) Multi-pose face verification method combining face angle information
Lin et al. A novel framework for automatic 3D face recognition using quality assessment
KR101718244B1 (en) Apparatus and method of processing wide angle image for recognizing face
Beumier et al. Automatic face recognition
US20230342947A1 (en) Determination method, storage medium, and information processing apparatus
US20240071135A1 (en) Image processing device, image processing method, and program
Fagbolu et al. Secured banking operations with face-based automated teller machine
Drishty Review on gait recognition using artificial intelligence
Gossen Head pose normalization for recognition of human identities using color and depth data
CN113487492A (en) Parallax value correction method, parallax value correction device, electronic apparatus, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant