OA17098A - Authentication method, device and system based on biological characteristics.


Info

Publication number
OA17098A
Authority
OA
OAPI
Prior art keywords
biometric
client
template
authenticated
image
Prior art date
Application number
OA1201400421
Inventor
Pengfei XIONG
Bo Chen
Jie Hou
Hailong Liu
Original Assignee
Tencent Technology (Shenzhen) Company Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology (Shenzhen) Company Limited filed Critical Tencent Technology (Shenzhen) Company Limited
Publication of OA17098A publication Critical patent/OA17098A/en


Abstract

A biometric-based authentication method, an apparatus, and a system are described. The method includes: receiving a biometric image to be authenticated sent from a client; performing feature extraction on the biometric image to be authenticated to obtain a biometric template to be authenticated; comparing the biometric template to be authenticated with a locally-stored biometric template; and returning an authentication result. Because the feature extraction process may be implemented at a cloud server side, the complexity of the client may be reduced, the expandability of the client may be increased, the limitation that biometric recognition may only be implemented on the client may be eliminated, and diversified utilization may be supported.

Description

AUTHENTICATION METHOD, DEVICE AND SYSTEM BASED ON BIOLOGICAL CHARACTERISTICS
Field of the Invention [0001] The present disclosure relates to the computer field, and more particularly, to a biometric-based authentication method, an apparatus, and a system.
Background of the Invention [0002] Biometric recognition technology may be defined as identifying an individual's identity using human physiological or behavioral features. In the current information age, how to accurately identify the identity of a person and protect information security has become a crucial social issue that must be addressed. Traditional identity authentication may easily be forged or lost, and thus finds it more and more difficult to meet social requirements. Currently, the safest and most convenient solution may be biometric recognition technology, which may be simple and rapid. Further, identity authentication using biometric recognition technology may be very safe, reliable, and accurate.
[0003] Currently, biometric recognition technology may mainly include human face recognition, fingerprint recognition, iris recognition, etc. Taking face recognition as an example, there are currently a variety of authentication services based on face recognition. One example is an attendance-checking service based on hardware such as an attendance-checking device, in which a human face may be collected and matched locally, and functions of face attendance checking and access control may be achieved, e.g., the face attendance-checking devices of some companies. Another example may be a login service based on a computer or a mobile terminal, such as face-verification boot of some notebooks and face-verification unlock of some smart phones.
Summary of the Invention [0004] Examples of the present disclosure provide a biometric-based authentication method, an apparatus, and a system. The technical solution of the examples of the present disclosure is described as follows.
[0005] A biometric-based authentication method includes: receiving, by a cloud server, a biometric image to be authenticated sent from a client;
performing, by the cloud server, feature extraction on the biometric image to be authenticated to obtain a biometric template to be authenticated;
comparing, by the cloud server, the biometric template to be authenticated with a biometric template stored in the cloud server; and returning an authentication result to the client.
[0006] According to an example of the present disclosure, the operation of performing the feature extraction on the biometric image to be authenticated to obtain the biometric template to be authenticated includes:
performing illumination-normalization processing on the biometric image to be authenticated;
performing the feature extraction on the biometric image to be authenticated which has been processed with the illumination normalization;
performing dimension-reducing calculation on an extracted feature; and linking results of the dimension-reducing calculation one by one to obtain the biometric template to be authenticated.
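The four operations above can be sketched end to end. Every helper body below is an illustrative placeholder, not the disclosure's actual algorithm: intensity rescaling stands in for illumination normalization, per-region slicing stands in for feature extraction, and truncation stands in for dimension reduction; the pixel values are invented.

```python
# Hedged sketch of the server-side template pipeline in paragraph [0006].
# All helper implementations are stand-ins for the real algorithms.

def normalize_illumination(image):
    """Rescale pixel intensities into [0, 1] (placeholder normalization)."""
    lo, hi = min(image), max(image)
    span = (hi - lo) or 1
    return [(p - lo) / span for p in image]

def extract_features(image, region_size=2):
    """Split the image into fixed-size regions (placeholder extraction)."""
    return [image[i:i + region_size] for i in range(0, len(image), region_size)]

def reduce_dimension(feature, keep=1):
    """Keep only the first `keep` values (placeholder for real reduction)."""
    return feature[:keep]

def build_template(image):
    """Link the dimension-reduced features one by one into one template."""
    template = []
    for feature in extract_features(normalize_illumination(image)):
        template.extend(reduce_dimension(feature))
    return template

print(build_template([10, 20, 30, 40, 50, 60, 70, 80]))
```

The point of the sketch is the data flow, not the operators: each stage consumes the previous stage's output, and the final template is the concatenation ("linking one by one") of the reduced per-region features.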
[0007] According to an example of the present disclosure, before the operation of receiving the biometric image to be authenticated sent from the client, the method further includes:
receiving a biometric image to be registered, a client identifier (ID), and a first user ID that are sent from the client;
performing the feature extraction on the biometric image to be registered to obtain a biometric template;
storing a relationship associating the biometric template, the client ID, and the first user ID to complete registration of a user; and returning a registration result.
[0008] According to an example of the present disclosure, the method further includes:
upon receiving the biometric image to be authenticated sent from the client, receiving the client ID sent from the client;
wherein the operation of comparing the biometric template to be authenticated with the biometric template stored in the cloud server and returning the authentication result comprises: searching out, according to the client ID, biometric templates stored in the cloud server and associated with the client ID to obtain a collection of the biometric templates stored in the cloud server and associated with the client ID;
calculating a similarity between the biometric template to be authenticated and each biometric template in the collection;
when a similarity between the biometric template to be authenticated and a biometric template in the collection is greater than a predetermined recognition threshold, adding a user ID associated with that biometric template to a recognition result collection;
sorting the user IDs in the recognition result collection in descending order of similarity; and returning the recognition result collection to the client.
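The one-to-many recognition flow in paragraph [0008] can be sketched as follows. The cosine similarity measure, the threshold value, and all IDs and template values are assumptions for illustration; the disclosure only requires some similarity greater than a predetermined recognition threshold.

```python
# Sketch of the recognition flow: filter stored templates by client ID,
# score each against the probe template, keep matches above a threshold,
# and return user IDs sorted by descending similarity.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def recognize(template, stored, client_id, threshold=0.9):
    """Return matching user IDs, sorted by descending similarity."""
    # Collection of templates registered under this client ID.
    collection = [(uid, t) for (cid, uid), t in stored.items() if cid == client_id]
    results = []
    for uid, t in collection:
        sim = cosine_similarity(template, t)
        if sim > threshold:
            results.append((sim, uid))
    results.sort(reverse=True)  # descending order of similarity
    return [uid for _, uid in results]

stored = {
    ("client-1", "alice"): [1.0, 0.0, 0.2],
    ("client-1", "bob"):   [0.0, 1.0, 0.0],
    ("client-2", "carol"): [1.0, 0.0, 0.2],
}
print(recognize([1.0, 0.1, 0.2], stored, "client-1"))  # → ['alice']
```

Note that carol's template is excluded up front because it belongs to a different client ID, matching the "search out according to the client ID" step.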
[0009] According to an example of the present disclosure, the method further includes:
upon receiving the biometric image to be authenticated sent from the client, receiving the client ID and a second user ID sent from the client;
wherein the operation of comparing the biometric template to be authenticated with the biometric template stored in the cloud server and returning the authentication result comprises: searching out, according to the client ID and the second user ID, a biometric template stored in the cloud server and associated with the client ID and the second user ID;
calculating a similarity between the biometric template to be authenticated and the biometric template stored in the cloud server and associated with the client ID and the second user ID; and
when the calculated similarity is greater than a predetermined verification threshold, determining that verification of a user is passed, and returning a verification result to the client.
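The one-to-one verification flow in paragraph [0009] compares the probe template against the single template enrolled for the claimed (client ID, user ID) pair. The similarity function and threshold below are illustrative assumptions, not taken from the disclosure.

```python
# Sketch of one-to-one verification: look up the claimed identity's
# enrolled template and pass only if similarity exceeds the threshold.

def similarity(a, b):
    """A simple similarity in (0, 1]: 1 / (1 + sum of squared differences)."""
    return 1.0 / (1.0 + sum((x - y) ** 2 for x, y in zip(a, b)))

def verify(template, stored, client_id, user_id, threshold=0.8):
    """Return True when the claimed user's stored template matches closely."""
    enrolled = stored.get((client_id, user_id))
    if enrolled is None:
        return False  # no template registered for this (client, user) pair
    return similarity(template, enrolled) > threshold

stored = {("client-1", "alice"): [0.9, 0.1, 0.4]}
print(verify([0.9, 0.1, 0.4], stored, "client-1", "alice"))  # True  (match)
print(verify([0.0, 0.9, 0.0], stored, "client-1", "alice"))  # False (mismatch)
```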
[0010] A biometric-based authentication method includes:
collecting, by a client, a user image;
performing, by the client, biometric-positioning processing on the user image to obtain a biometric image to be authenticated;
transmitting, by the client, the biometric image to be authenticated to a cloud server, so that the cloud server performs feature extraction on the biometric image to be authenticated to obtain a biometric template to be authenticated, and compares the biometric template to be authenticated with a biometric template stored in the cloud server; and receiving, by the client, an authentication result returned from the cloud server.
[0011] According to an example of the present disclosure, the operation of performing the biometric-positioning processing on the user image to obtain the biometric image to be authenticated includes:
upon detecting that a predetermined biometric feature is included in the user image, determining and marking a position of the biometric feature;
selecting a key-point position of the biometric feature;
obtaining a coordinate of the key-point position; and performing, based on the obtained coordinate of the key-point position, posture correcting on the key-point position to obtain the biometric image to be authenticated.
[0012] A cloud server includes:
an access module, to receive a biometric image to be authenticated sent from a client;
an authentication module, to perform feature extraction on the biometric image to be authenticated to obtain a biometric template to be authenticated, compare the biometric template to be authenticated with a biometric template stored in a data module, and return an authentication result to the client; and the data module, to store the biometric template.
[0013] According to an example of the present disclosure, the authentication module includes:
an illumination processing unit, to perform illumination-normalization processing on the biometric image to be authenticated;
a feature extracting unit, to perform the feature extraction on the biometric image to be authenticated which has been processed with the illumination normalization, perform dimension-reducing calculation on an extracted feature, and link results of the dimension-reducing calculation one by one to obtain the biometric template to be authenticated; and an authenticating unit, to compare the biometric template to be authenticated with the biometric template stored in the data module, and return the authentication result.
[0014] According to an example of the present disclosure, the access module is further to, before receiving the biometric image to be authenticated sent from the client, receive a biometric image to be registered, a client identifier (ID), and a first user ID that are sent from the client;
the cloud server further includes:
a session module, to perform the feature extraction on the biometric image to be registered to obtain a biometric template, send a relationship associating the biometric template, the client ID, and the first user ID to the data module to complete registration of a user, and return a registration result; and
the data module is further to store the relationship associating the biometric template, the client ID, and the first user ID.
[0015] According to an example of the present disclosure, the access module is further to, upon receiving the biometric image to be authenticated sent from the client, receive the client ID sent from the client;
the authentication module includes:
a template obtaining unit, to perform the feature extraction on the biometric image to be authenticated to obtain the biometric template to be authenticated;
a collection obtaining unit, to search out, according to the client ID, biometric templates stored in the data module and associated with the client ID to obtain a collection of the biometric templates stored in the data module and associated with the client ID;
a recognizing unit, to calculate a similarity between the biometric template to be authenticated and each biometric template in the collection, and, when a similarity between the biometric template to be authenticated and a biometric template in the collection is greater than a predetermined recognition threshold, add a user ID associated with that biometric template to a recognition result collection; and a recognition result transmitting unit, to sort the user IDs in the recognition result collection in descending order of similarity, and return the recognition result collection to the client through the access module.
[0016] According to an example of the present disclosure, the access module is further to, upon receiving the biometric image to be authenticated sent from the client, receive the client ID and a second user ID sent from the client;
the authentication module includes:
a template obtaining unit, to perform the feature extraction on the biometric image to be authenticated to obtain the biometric template to be authenticated;
a searching unit, to search out, according to the client ID and the second user ID, a biometric template stored in the data module and associated with the client ID and the second user ID;
a verifying unit, to calculate a similarity between the biometric template to be authenticated and the biometric template stored in the data module and associated with the client ID and the second user ID, and, when the calculated similarity is greater than a predetermined verification threshold, determine that verification of a user is passed; and a verification result transmitting unit, to return a verification result to the client through the access module.
[0017] A client includes:
a collecting module, to collect a user image, and perform biometric-positioning processing on the user image to obtain a biometric image to be authenticated;
a transmitting module, to transmit the biometric image to be authenticated to a cloud server, so that the cloud server performs feature extraction on the biometric image to be authenticated to obtain a biometric template to be authenticated, and compares the biometric template to be authenticated with a biometric template stored in the cloud server; and a receiving module, to receive an authentication result returned from the cloud server.
[0018] According to an example of the present disclosure, the collecting module includes:
a collecting unit, to collect the user image;
a detecting unit, to determine and mark, upon detecting that a predetermined biometric feature is included in the user image, a position of the biometric feature;
a key-point positioning unit, to select a key-point position of the biometric feature, and obtain a coordinate of the key-point position; and a position-posture normalization unit, to perform, based on the obtained coordinate of the key-point position, posture correcting on the key-point position to obtain the biometric image to be authenticated.
[0019] A biometric-based authentication system includes:
a cloud server as described above and a client as described above.
[0020] Advantages of the technical solution provided by the examples of the present disclosure may at least be illustrated as follows. A client may obtain a biometric image and send the biometric image to a cloud server; the cloud server may perform feature extraction on the biometric image to obtain a biometric template, and may perform biometric-based authentication of a user or the client. Because the feature extraction process may be implemented at the cloud server side, the complexity of the client may be reduced, the expandability of the client may be increased, the limitation that biometric recognition may only be implemented on the client may be eliminated, and diversified utilization may be supported.
Brief Description of Drawings [0021] Hereinafter, the accompanying drawings used to describe examples of the present disclosure are briefly introduced to make the technical solution of the present disclosure clearer. Obviously, the drawings described below are only some example embodiments of the present disclosure. Based on these drawings, those skilled in the art may obtain other drawings without making creative efforts.
[0022] FIG. 1 is a flowchart illustrating a biometric-based authentication method, according to an example of the present disclosure.
[0023] FIG. 2 is a schematic diagram illustrating network architecture for implementing a biometric-based registration and authentication service, according to an example of the present disclosure.
[0024] FIG. 3 is a schematic diagram illustrating network architecture for implementing biometric-based registration, according to an example of the present disclosure.
[0025] FIG. 4 is a flowchart illustrating a biometric-based registration method, according to an example of the present disclosure.
[0026] FIG. 5 is a flowchart illustrating a biometric-based verification method, according to an example of the present disclosure.
[0027] FIG. 6 is a schematic diagram illustrating network architecture for implementing biometric-based verification, according to an example of the present disclosure.
[0028] FIG. 7 is a flowchart illustrating a biometric-based verification method, according to an example of the present disclosure.
[0029] FIG. 8 is a flowchart illustrating a biometric-based recognition method, according to an example of the present disclosure.
[0030] FIG. 9 is a schematic diagram illustrating network architecture for implementing biometric-based recognition, according to an example of the present disclosure.
[0031] FIG. 10 is a flowchart illustrating a biometric-based recognition method, according to an example of the present disclosure.
[0032] FIG. 11 is a schematic diagram illustrating a structure of a cloud server, according to an example of the present disclosure.
[0033] FIG. 12 is a schematic diagram illustrating a structure of an authentication module in a cloud server, according to an example of the present disclosure.
[0034] FIG. 13 is a schematic diagram illustrating a second structure of a cloud server, according to an example of the present disclosure.
[0035] FIG. 14 is a schematic diagram illustrating a second structure of an authentication module in a cloud server, according to an example of the present disclosure.
[0036] FIG. 15 is a schematic diagram illustrating a third structure of an authentication module in a cloud server, according to an example of the present disclosure.
[0037] FIG. 16 is a schematic diagram illustrating a structure of a client, according to an example of the present disclosure.
[0038] FIG. 17 is a schematic diagram illustrating a structure of a collecting module in a client, according to an example of the present disclosure.
[0039] FIG. 18 is a schematic diagram illustrating a hardware structure of a client, according to an example of the present disclosure.
Detailed Description of the Invention [0040] Hereinafter, the present disclosure will be described in further detail with reference to the accompanying drawings and exemplary examples.
[0041] Conventionally, biometric recognition techniques may all be implemented on a client. The utilization of the conventional biometric recognition techniques may be limited, may not support multi-client expansion, and may lack diversified functions. Further, the authentication performed on the client may lead to relatively complex authentication logic on the client.
[0042] According to an example of the present disclosure, a client may include but may not be limited to a cell phone, a tablet personal computer (PC), a laptop PC, a PC, a vehicle-mounted electronic system, a personal digital assistant (PDA), etc. The client may be any peripheral that can be connected to the internet, which may not be particularly limited in examples of the present disclosure.
[0043] As shown in FIG. 1, an example of the present disclosure may provide a biometric-based authentication method, which may include the following operations.
[0044] In block 101, a client may collect a user image and perform biometric-positioning processing on the user image to obtain a biometric image to be authenticated.
[0045] In block 102, the client may send the biometric image to be authenticated to a cloud server.
[0046] In block 103, the cloud server may perform feature extraction on the biometric image to be authenticated to obtain a biometric template to be authenticated, and compare the biometric template to be authenticated with a biometric template pre-stored in the cloud server.
[0047] In block 104, the cloud server may return an authentication result to the client.
[0048] Examples of the present disclosure provide a biometric-based authentication method, in which a client may obtain a biometric image and send the biometric image to a cloud server; the cloud server may perform feature extraction on the biometric image to obtain a biometric template, and may perform biometric-based authentication of a user or the client. Because the feature extraction process may be implemented at the cloud server side, the complexity of the client may be reduced, the expandability of the client may be increased, the limitation that biometric recognition may only be implemented on the client may be eliminated, and diversified utilization may be supported.
[0049] The method provided by the examples of the present disclosure may implement a biometric-based registration and authentication service for the user. In this case, the authentication service may include a verification service and a recognition service. Furthermore, an example of the present disclosure may provide the architecture as shown in FIG. 2 to achieve the above functions.
[0050] As shown in FIG. 2, an access server, a session server, a verification server, a recognition server, and a data server may form a cloud server. Among them, the access server may exchange data with the client, or with other servers included in the cloud server, through an internet-based protocol like the hypertext transfer protocol (HTTP) or the transmission control protocol (TCP);
the session server may implement a biometric-based registration service of the user;
the verification server may implement a biometric-based verification service of the user; the recognition server may implement a biometric-based recognition service of the user; and the data server may store a user identifier (user ID), a client ID, a legitimate biometric template, and a relationship associating the user ID, the client ID, and the legitimate biometric template.
[0051] According to an example of the present disclosure, a biometric-based registration method may be provided. It should be noted that before the user performs the biometric-based authentication through the client, the biometric-based registration may be performed, in which the user ID, the client ID, and the legitimate biometric template may be associated with each other at the cloud server side.
[0052] Examples of the present disclosure may be implemented, based on the architecture as shown in FIG. 3, to implement the registration. FIG. 3 is a schematic diagram illustrating the network architecture for implementing the biometric-based registration, according to an example of the present disclosure. As shown in FIG. 3, the network architecture may include a client, an access server, a session server, and a data server.
[0053] It should be noted that this example of the present disclosure and the examples described later may be illustrated taking face recognition as the biometric recognition technique. However, examples of the present disclosure may not be limited to face recognition. Other biometric-based techniques such as iris recognition and fingerprint recognition may also be applicable to the examples of the present disclosure.
[0054] As shown in FIG. 4, an example of the present disclosure may provide a biometric-based registration method, which may include the following operations.
[0055] In block 201, a client may collect a user image.
[0056] In this case, the client may collect the user image. Specifically, the client may collect the user image from local pictures or videos, or may collect the user image through other collection devices, such as a camera in a mobile phone.
[0057] In block 202, the client may detect a human face in the user image, and may determine and mark a position of the face.
[0058] Specifically, when there is a human face in the user image, the position of the face may be determined and marked.
[0059] The operations in block 202 may be implemented by means of Haar features plus an adaboost face detection algorithm. The Haar features may be divided into three categories: edge features, linear features, and center and diagonal features. The adaboost face detection algorithm may include Haar feature selection and feature calculation. Among them, the feature calculation may be implemented by means of an integral-image method. According to the integral-image method, the three categories of Haar features may be combined to form a feature template, in which there are a white rectangle and a black rectangle, and a feature value of the template may be obtained by subtracting a pixel sum of the black rectangle from a pixel sum of the white rectangle.
[0060] According to the main concept of the integral image, a sum of pixels of each rectangle area in an image, formed from a starting point to each point in the image, may be stored in a memory as an element of an array. When a pixel sum of an area is to be calculated, the corresponding element of the array may be used directly without re-calculating the pixel sum of the area, so that the calculation may be accelerated. When a same area comes in various sizes, the integral image may calculate different features of the area in the same amount of time; as such, the detection speed may be greatly improved.
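The integral-image idea in blocks [0059] and [0060] can be shown concretely: build the cumulative table once, after which any rectangle's pixel sum, and hence any Haar-like feature value (white sum minus black sum), costs only four table lookups. The 3x3 image below is invented for illustration.

```python
# Integral image: ii[y][x] holds the sum of all pixels above and to the
# left of (y, x), so any rectangle sum is four lookups in constant time.

def integral_image(img):
    """Return a (h+1) x (w+1) table of cumulative pixel sums."""
    h, w = len(img), len(img[0])
    ii = [[0] * (w + 1) for _ in range(h + 1)]
    for y in range(h):
        row = 0
        for x in range(w):
            row += img[y][x]
            ii[y + 1][x + 1] = ii[y][x + 1] + row
    return ii

def rect_sum(ii, top, left, bottom, right):
    """Sum of img[top:bottom][left:right] via four table lookups."""
    return ii[bottom][right] - ii[top][right] - ii[bottom][left] + ii[top][left]

img = [
    [1, 2, 3],
    [4, 5, 6],
    [7, 8, 9],
]
ii = integral_image(img)
# A two-rectangle Haar-like value: white (left column) minus black (right column).
white = rect_sum(ii, 0, 0, 3, 1)   # 1 + 4 + 7 = 12
black = rect_sum(ii, 0, 2, 3, 3)   # 3 + 6 + 9 = 18
print(white - black)               # -6
```

This is why detection speed is largely independent of window size: scaling the feature rectangles changes only the lookup coordinates, not the amount of work.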
[0061] The adaboost algorithm may be a conventional way of face detection, and may not be repeated herein.
[0062] It should be noted that more accurate positioning may be obtained under a small posture of the human face, in which a position of the human face is at a left-right inclination from -30 degrees to 30 degrees.
[0063] In block 203, the client may select a key-point position on the human face, and may obtain a coordinate of the key-point position.
[0064] According to an example of the present disclosure, an eye and a mouth on the human face may be selected as the key-point positions. As such, the operation of selecting the key-point position on the human face and obtaining the coordinate of the key-point position may include:
determining and marking positions of the eye and the mouth on the obtained face area; obtaining a candidate eye area and a candidate mouth area through image projection; obtaining, on the candidate eye area, an accurate coordinate of a center of the eye using the Haar feature plus the adaboost algorithm; and obtaining, on the candidate mouth area, an accurate coordinate of a corner of the mouth using the gabor feature plus the adaboost algorithm.
[0065] In this case, the extraction of the gabor feature may be a conventional way of face recognition, and may not be repeated herein.
[0066] In block 204, the client may perform position-posture normalization processing on the key-point position to obtain a face image.
[0067] In this case, the operation of performing the position-posture normalization processing on the key-point position to obtain the face image may include:
based on the obtained positions of the eye and the mouth, i.e., the accurate coordinate of the center of the eye and the accurate coordinate of the corner of the mouth, converting the original user image to a standard human face template through normalization operations, which may include clipping, zooming, posture correcting, etc., so as to ensure that the eye and the mouth are in standard positions on the standard face template; thus the standard face image may be obtained.
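The normalization in block 204 can be illustrated with a two-point similarity transform: given the detected eye-center and mouth-corner coordinates, solve for the scale, rotation, and translation that carry them onto fixed template positions. The coordinates and template positions below are invented for illustration; a production system would typically use more key points and a least-squares fit.

```python
# Two-point alignment sketch: representing 2-D points as complex numbers,
# a similarity transform is p -> a*p + b, where a encodes scale + rotation
# and b the translation. Two point pairs determine a and b exactly.

def similarity_transform(src, dst):
    """Map the two complex points in src onto dst; return (a, b)."""
    s0, s1 = src
    d0, d1 = dst
    a = (d1 - d0) / (s1 - s0)   # combined scale and rotation
    b = d0 - a * s0             # translation
    return a, b

def apply_transform(a, b, p):
    return a * p + b

# Detected key points (x + yj) and assumed standard template positions.
src = (complex(30, 40), complex(34, 80))   # eye center, mouth corner
dst = (complex(25, 30), complex(25, 70))   # template eye, template mouth
a, b = similarity_transform(src, dst)
print(apply_transform(a, b, src[0]))  # lands on the template eye position
print(apply_transform(a, b, src[1]))  # lands on the template mouth position
```

Applying the same `(a, b)` to every pixel coordinate performs the clipping, zooming, and posture correction in one pass.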
[0068] In block 205, the client may compress the face image and send the compressed face image to a cloud server through a network.
[0069] In this case, when the compressed face image is sent to the cloud server, a user ID and a client ID may be sent to the cloud server as well.
[0070] In block 206, the cloud server may decompress the compressed face image.
[0071] In block 207, illumination-normalization processing may be performed on the decompressed face image.
[0072] In this case, the accuracy of face recognition may be decreased due to different intensities and directions of light acting on the human face. Through the illumination-normalization processing, the obtained face image may be brought under a same illumination condition, and therefore the accuracy of face recognition may be improved.
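One widely used illumination-normalization technique is histogram equalization, sketched below as an assumed stand-in; the disclosure does not name the specific normalization it applies. The idea is to spread a narrow band of gray levels (e.g. an underexposed face) across the full range.

```python
# Histogram equalization over a flat list of integer gray levels.

def equalize(pixels, levels=256):
    """Remap gray levels so their cumulative distribution becomes uniform."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    cdf, total = [0] * levels, 0
    for i, count in enumerate(hist):
        total += count
        cdf[i] = total
    cdf_min = next(c for c in cdf if c > 0)
    n = len(pixels)
    if n == cdf_min:  # flat image: nothing to spread
        return list(pixels)
    return [round((cdf[p] - cdf_min) / (n - cdf_min) * (levels - 1)) for p in pixels]

dark = [10, 10, 12, 12, 14, 14, 16, 16]   # a dim, low-contrast patch
print(equalize(dark))                      # spread across 0..255
```

After equalization, two face images captured under very different lighting end up with comparable intensity distributions, which is what makes the later template comparison meaningful.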
[0073] In block 208, the cloud server may perform feature extraction to obtain a face-feature template.
[0074] According to an example of the present disclosure, the operation of performing, by the cloud server, the feature extraction to obtain the face-feature template may include:
performing, on the face image processed with the illumination normalization, global partitioning feature extraction, which may include gabor local features, local binary patterns (LBP), and histograms of oriented gradients (HOG); performing dimension-reducing calculation on an extracted feature using a linear discriminant analysis (LDA) model; and linking results of the dimension-reducing calculation one by one to obtain the face-feature template.
[0075] In this case, the LDA model may process a discrete data collection and reduce its dimensionality.
[0076] In block 209, the registration based on the human face feature may be implemented.
[0077] According to an example of the present disclosure, the operations in block 209 may include:
the cloud server creating and storing a relationship associating the user ID, the client ID, and the legitimate face-feature template to complete the registration of the user. Among them, the relationship associating the user ID, the client ID, and the legitimate face-feature template may be stored in a template database of the cloud server.
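The stored relationship of block 209 amounts to a mapping from a (client ID, user ID) pair to the enrolled face-feature template. A minimal sketch, with all names and values invented for illustration:

```python
# Template database sketch: one entry per (client ID, user ID) pair,
# holding that user's legitimate face-feature template.

template_db = {}

def register(client_id, user_id, face_template):
    """Store the association and return a registration result for the client."""
    template_db[(client_id, user_id)] = face_template
    return {"status": "registered", "user": user_id}

result = register("client-1", "alice", [0.12, 0.55, 0.91])
print(result["status"])                      # registered
print(("client-1", "alice") in template_db)  # True
```

Keying by the pair is what later lets recognition fetch all templates for one client ID and verification fetch exactly one template for a claimed identity.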
[0078] In block 210, a registration result may be returned to the client.
[0079] Examples of the present disclosure provide a biometric-based registration method, in which a user or a client may transmit a biometric image to a cloud server for registration, and a relationship associating a user ID, a client ID, and the biometric image may be stored in the cloud server. As such, the biometric authentication may be performed based on the internet, so that the complexity of the client may be reduced, the expandability of the client may be increased, the limitation that biometric recognition may only be implemented on the client may be eliminated, and diversified utilization may be supported.
[0080] Examples of the present disclosure provide a biometric-based verification method. FIG. 5 is a flowchart illustrating the biometric-based verification method, according to an example of the present disclosure. As shown in FIG. 5, a face-feature template to be verified of a user may be obtained through feature extraction, and a template corresponding to a user ID and a client ID may be selected from a template database in the cloud server and compared with the face-feature template to be verified. As such, the biometric-based verification of the user and the client may be implemented, and the permission of the user for using the client may be determined.
[0081] An example of the present disclosure may provide the architecture as shown in FIG. 6 to implement the biometric-based verification. FIG. 6 is a schematic diagram illustrating the network architecture for implementing the biometric-based verification, according to an example of the present disclosure. As shown in FIG. 6, the network architecture may include a client, an access server, a verification server, and a data server. Examples of the present disclosure may still be illustrated taking face recognition as the biometric recognition technique.
[0082] As shown in FIG. 7, an example of the present disclosure may provide a biometric-based verification method, which may include the following operations.
[0083] In block 301, a client may collect a user image.
[0084] In this case, the client may collect the user image. Specifically, the client may collect the user image from local pictures or videos, or may collect the user image through other collection devices, such as a camera in a mobile phone.
[0085] In block 302, the client may detect a human face in the user image, and may determine and mark a position of the face.
[0086] Specifically, when there is a human face in the user image, the position of the face may be determined and marked.
[0087] The operations in block 302 may be implemented by means of the Haar features plus the adaboost face detection algorithm. Implementation of the operations in block 302 may be the same as that of the operations in block 202 in the aforementioned biometric-based registration method, which may not be repeated herein.
[0088] It should be noted that more accurate positioning may be obtained under a small posture of the human face, in which a position of the human face is at a left-right inclination from -30 degrees to 30 degrees.
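The Haar-plus-adaboost detection referenced in block 302 rests on integral images, which make any rectangular Haar-like feature computable in constant time. The sketch below is illustrative only and is not code from the disclosure; the two-rectangle feature layout and window coordinates are assumptions for demonstration.

```python
import numpy as np

def integral_image(img):
    """Summed-area table: entry (y, x) holds the sum of img[:y+1, :x+1]."""
    return np.cumsum(np.cumsum(img, axis=0), axis=1)

def rect_sum(ii, y, x, h, w):
    """Sum of the h-by-w rectangle with top-left corner (y, x), computed in
    O(1) from the integral image via a zero-padded four-corner lookup."""
    p = np.pad(ii, ((1, 0), (1, 0)))
    return p[y + h, x + w] - p[y, x + w] - p[y + h, x] + p[y, x]

def haar_two_rect(img, y, x, h, w):
    """One two-rectangle Haar-like feature (left half minus right half),
    of the kind a boosted face-detection cascade thresholds."""
    ii = integral_image(img.astype(np.int64))
    half = w // 2
    return rect_sum(ii, y, x, h, half) - rect_sum(ii, y, x + half, h, half)
```

A full Viola-Jones detector would evaluate thousands of such features, selected and weighted by adaboost, over sliding windows at multiple scales.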
[0089] In block 303, the client may select a key-point position on the human face, and may obtain a coordinate of the key-point position.
[0090] According to an example of the present disclosure, an eye and a mouth on the human face may be selected as the key-point position. As such, the operation of selecting the key-point position on the human face and obtaining the coordinate of the key-point position may include:

determining and marking positions of the eye and the mouth on the obtained face area, obtaining a candidate eye area and a candidate mouth area through image projection, obtaining, on the candidate eye area, an accurate coordinate of a center of the eye using the Haar feature plus the adaboost algorithm, and obtaining, on the candidate mouth area, an accurate coordinate of a corner of the mouth using the gabor feature plus the adaboost algorithm.

[0091] In this case, the extraction of the gabor feature may be a conventional way of the face recognition, and may not be repeated herein.

[0092] In block 304, the client may perform position-posture normalization processing to the key-point position to obtain a face image to be verified.
[0093] In this case, the operation of performing the position-posture normalization processing to the key-point position to obtain the face image to be verified may include:
based on the obtained positions of the eye and the mouth, i.e., the accurate coordinate of the center of the eye and the accurate coordinate of the corner of the mouth, converting the original user image to a standard face template through normalization operations, which may include clipping, zooming, posture correcting, etc., so as to ensure that the eye and the mouth may be in a standard position on the standard face template, and thus the standard face image to be verified may be obtained.
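The normalization step in [0093] can be thought of as solving for the similarity transform (scale, rotation, translation) that carries the detected key points onto fixed canonical positions. A minimal sketch from the two eye centers follows; the canonical template coordinates are illustrative assumptions, not values from the disclosure, and a real pipeline would also use the mouth corners and then crop/resample the warped image.

```python
import numpy as np

def similarity_from_eyes(left_eye, right_eye,
                         tmpl_left=(30.0, 45.0), tmpl_right=(82.0, 45.0)):
    """Solve the 2-D similarity transform mapping the detected eye centers
    onto canonical template positions (assumed coordinates, for illustration).
    Represented compactly as complex numbers: z -> m*z + t."""
    src = np.asarray(right_eye, float) - np.asarray(left_eye, float)
    dst = np.asarray(tmpl_right, float) - np.asarray(tmpl_left, float)
    m = complex(*dst) / complex(*src)              # rotation + scale
    t = complex(*tmpl_left) - m * complex(*left_eye)  # translation
    return m, t

def warp_point(p, m, t):
    """Apply the similarity transform to one (x, y) point."""
    z = m * complex(*p) + t
    return z.real, z.imag
```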
[0094] In block 305, the client may compress the face image to be verified and send the compressed face image to be verified to a cloud server through a network.
[0095] In this case, when the compressed face image to be verified is sent to the cloud server, a user ID and a client ID may be sent to the cloud server, as well.
[0096] In block 306, the cloud server may decompress the compressed face image to be verified.
[0097] In block 307, illumination-normalization processing may be performed to the decompressed face image to be verified.
[0098] In this case, the accuracy of the face recognition may be decreased due to different intensities and directions of lights acting on the human face. As such, the obtained face image may be under a same illumination condition through the illumination-normalization processing, and therefore the accuracy of the face recognition may be improved.
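The disclosure does not fix a particular illumination-normalization technique; histogram equalization is one common choice, since it spreads pixel intensities so that images captured under different lighting end up with comparable contrast. The sketch below assumes 8-bit grayscale input.

```python
import numpy as np

def equalize_histogram(gray):
    """Histogram equalization for an 8-bit grayscale image: build the
    cumulative distribution of pixel values, rescale it to [0, 255],
    and use it as a lookup table. One possible illumination-normalization
    step; the patent does not mandate this specific method."""
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = np.cumsum(hist).astype(np.float64)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())  # normalize to [0, 1]
    lut = np.round(cdf * 255).astype(np.uint8)
    return lut[gray]
```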
[0099] In block 308, the cloud server may perform the feature extraction to obtain a face-feature template to be verified.
[0100] According to an example of the present disclosure, the operation of performing, by the cloud server, the feature extraction to obtain the face-feature template to be verified may include:
performing, on the face image to be verified processed with the illumination normalization, global partitioning feature extraction, which may include gabor local features, LBP, and HOG, performing dimension-reducing calculation to an extracted feature using a LDA model, and linking results of the dimension-reducing calculation one by one to obtain the face-feature template to be verified.
[0101] In this case, the LDA may be a collection-probability model, and may process a discrete data collection and reduce the dimension.
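Of the local descriptors listed in [0100], LBP is the simplest to sketch: each interior pixel is encoded by comparing its 8 neighbours against it, and the per-block histograms of those codes are concatenated into part of the feature template. This is an illustrative numpy sketch, not the disclosure's implementation; the block size and bit ordering are assumptions.

```python
import numpy as np

def lbp_codes(gray):
    """Basic 3x3 local binary pattern: compare each interior pixel with
    its 8 neighbours, packing the comparison results into an 8-bit code."""
    g = gray.astype(np.int32)
    c = g[1:-1, 1:-1]  # interior pixels (centers)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(c)
    for bit, (dy, dx) in enumerate(offsets):
        nb = g[1 + dy:g.shape[0] - 1 + dy, 1 + dx:g.shape[1] - 1 + dx]
        codes |= (nb >= c).astype(np.int32) << bit
    return codes

def block_histograms(codes, block=8):
    """Partition the code map into blocks and histogram each one; the
    concatenated histograms form one part of the face-feature template,
    before any LDA-style dimension reduction."""
    h, w = codes.shape
    feats = []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            feats.append(np.bincount(codes[y:y + block, x:x + block].ravel(),
                                     minlength=256))
    return np.concatenate(feats)
```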
[0102] In block 309, the face-feature template to be verified may be compared with a locally-stored face-feature template, and a verification result may be returned.
[0103] In this case, the operation of comparing the face-feature template to be verified with the locally-stored face-feature template and returning the verification result may include the following processes.
[0104] In process 309-1, a face-feature template associated with the user ID and the client ID may be obtained from the template database.
[0105] In process 309-2, similarity between the face-feature template to be verified and the face-feature template associated with the user ID and the client ID may be calculated.
[0106] According to an example of the present disclosure, the calculation of the similarity may employ a cosine distance and k-nearest neighbor (KNN) algorithm, which may not be repeated herein.
[0107] In process 309-3, it may be determined whether the calculated similarity is greater than a predetermined verification threshold. In response to a determination that the calculated similarity is greater than the predetermined verification threshold, the verification may be passed. In response to a determination that the calculated similarity is not greater than the predetermined verification threshold, the verification may not be passed.
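Processes 309-2 and 309-3 amount to a 1:1 match: score the probe template against the single enrolled template for this user ID and client ID, then threshold. A minimal sketch using cosine similarity follows; the threshold value is an illustrative assumption, not a figure from the disclosure.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two feature vectors, in [-1, 1]."""
    a = np.asarray(a, float)
    b = np.asarray(b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe_template, stored_template, threshold=0.8):
    """Verification passes only when the similarity between the probe
    template and the enrolled template exceeds the predetermined
    threshold (0.8 here is illustrative)."""
    return cosine_similarity(probe_template, stored_template) > threshold
```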
[0108] In block 310, a verification result may be returned to the client.
[0109] According to an example of the present disclosure, the biometric-based verification method may be illustrated as follows. It may be assumed that a user A logs in an instant messaging (IM) application, such as Tencent QQ, at a mobile phone A (i.e., a client), and a login password is a face of the user A. As such, a process of verifying the user A may be illustrated as follows. The user A may input, at the mobile phone A, a QQ number A (i.e., a user ID) that is registered by the user A. Meanwhile, the user A may collect the face of the user A using the mobile phone A and may send the QQ number A, the face of the user A, and an ID of the mobile phone A (i.e., a client ID) to a cloud server for verification. If the verification at the cloud server is passed, the user A may successfully log in the QQ number A at the mobile phone A.
[0110] Examples of the present disclosure provide a biometric-based verification method, in which a client may obtain a biometric image and send the biometric image to a cloud server; the cloud server may perform feature extraction to the biometric image to obtain a biometric template, and may perform biometric-based verification to a user or the client. In this case, the feature extraction process may be implemented at the cloud server side, so that the complexity of the client may be reduced, the expandability of the client may be increased, a limitation that the biometric recognition may only be implemented on the client may be eliminated, and diversified utilization may be supported.
[0111] Examples of the present disclosure provide a biometric-based recognition method. FIG. 8 is a flowchart illustrating the biometric-based recognition method, according to an example of the present disclosure. As shown in FIG. 8, a face-feature template to be recognized of a user may be obtained through feature extraction, and a template associated with a client ID may be selected from a template database in a cloud server and may be compared with the face-feature template to be recognized. As such, the biometric-based recognition of the user and the client may be implemented, and a user ID corresponding to the face-feature template to be recognized may be obtained.
[0112] An example of the present disclosure may provide the architecture as shown in FIG. 9 to implement the biometric-based recognition. FIG. 9 is a schematic diagram illustrating the network architecture for implementing the biometric-based recognition, according to an example of the present disclosure. As shown in FIG. 9, the network architecture may include a client, an access server, a recognition server, and a data server. Examples of the present disclosure may be illustrated still taking the face recognition as the biometric recognition technique.
[0113] As shown in FIG. 10, an example of the present disclosure may provide a biometric-based recognition method, which may include the following operations.
[0114] In block 401, a client may collect a user image.
[0115] In this case, the client may collect the user image. Specifically, the client may collect the user image from local pictures or videos, or may collect the user image through other collection devices, such as a camera in a mobile phone.
[0116] In block 402, the client may detect a human face in the user image, and may determine and mark a position of the face.
[0117] Specifically, when there is a human face in the user image, the position of the face may be determined and marked.
[0118] The operations in block 402 may be implemented by means of the Haar features plus the adaboost face detection algorithm. Implementation of the operations in block 402 may be the same as that of the operations in block 202 in the aforementioned biometric-based registration method, which may not be repeated herein.
[0119] It should be noted that more accurate positioning may be obtained under a small posture of the human face, in which a position of the human face is at a left-right inclination from -30 degrees to 30 degrees.
[0120] In block 403, the client may select a key-point position on the face, and may obtain a coordinate of the key-point position.
[0121] According to an example of the present disclosure, an eye and a mouth on the human face may be selected as the key-point position. As such, the operation of selecting the key-point position on the human face and obtaining the coordinate of the key-point position may include:
determining and marking positions of the eye and the mouth on the obtained face area, obtaining a candidate eye area and a candidate mouth area through image projection, obtaining, on the candidate eye area, an accurate coordinate of a center of the eye using the Haar feature plus the adaboost algorithm, and obtaining, on the candidate mouth area, an accurate coordinate of a corner of the mouth using the gabor feature plus the adaboost algorithm.
[0122] In this case, the extraction of the gabor feature may be a conventional way of the face recognition, and may not be repeated herein.

[0123] In block 404, the client may perform position-posture normalization processing to the key-point position to obtain a face image to be recognized.
[0124] In this case, the operation of performing the position-posture normalization processing to the key-point position to obtain the face image to be recognized may include:
based on the obtained positions of the eye and the mouth, i.e., the accurate coordinate of the center of the eye and the accurate coordinate of the corner of the mouth, converting the original user image to a standard face template through normalization operations, which may include clipping, zooming, posture correcting, etc., so as to ensure that the eye and the mouth may be in a standard position on the standard face template, and thus the standard face image to be recognized may be obtained.
[0125] In block 405, the client may compress the face image to be recognized and send the compressed face image to be recognized to a cloud server through a network.
[0126] In this case, when the compressed face image to be recognized is sent to the cloud server, a client ID may be sent to the cloud server, as well.
[0127] In block 406, the cloud server may decompress the compressed face image to be recognized.
[0128] In block 407, illumination-normalization processing may be performed to the decompressed face image to be recognized.
[0129] In this case, the accuracy of the face recognition may be decreased due to different intensities and directions of lights acting on the human face. As such, the obtained face image may be under a same illumination condition through the illumination-normalization processing, and therefore the accuracy of the face recognition may be improved.
[0130] In block 408, the cloud server may perform the feature extraction to obtain a face-feature template to be recognized.
[0131] According to an example of the present disclosure, the operation of performing, by the cloud server, the feature extraction to obtain the face-feature template to be recognized may include:
performing, on the face image to be recognized processed with the illumination normalization, global partitioning feature extraction, which may include gabor local features, LBP, and HOG, performing dimension-reducing calculation to an extracted feature using a LDA model, and linking results of the dimension-reducing calculation one by one to obtain the face-feature template to be recognized.
[0132] In this case, the LDA may be a collection-probability model, and may process a discrete data collection and reduce the dimension.
[0133] In block 409, the face-feature template to be recognized may be compared with a locally-stored face-feature template, and a recognition result may be returned.
[0134] In this case, the operation of comparing the face-feature template to be recognized with the locally-stored face-feature template and returning the recognition result may include the following processes.
[0135] In process 409-1, a collection of a face-feature template associated with the client ID may be obtained from the template database.
[0136] In this case, the collection may include one or more than one face-feature templates associated with the client ID.
[0137] In process 409-2, a similarity between each face-feature template included in the collection and the face-feature template to be recognized may be calculated.
[0138] According to an example of the present disclosure, the calculation of the similarity may employ the cosine distance and KNN algorithm, which may not be repeated herein.
[0139] In process 409-3, it may be determined whether each calculated similarity is greater than a predetermined recognition threshold. In response to a determination that a calculated similarity is greater than the predetermined recognition threshold, process 409-4 may be performed. In response to a determination that a calculated similarity is not greater than the predetermined recognition threshold, the recognition may not be passed.
[0140] In process 409-4, a user ID associated with a face-feature template of which the similarity is greater than the predetermined recognition threshold may be added to a recognition result collection.
[0141] In process 409-5, the user ID in the recognition result collection may be sorted according to a descending order of the similarity.
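Processes 409-2 through 409-5 describe a 1:N match: score the probe template against every template enrolled under the client ID, keep user IDs whose similarity clears the recognition threshold, and sort them by descending similarity. A minimal sketch follows; the dictionary shape of the enrolled collection and the threshold value are illustrative assumptions.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two feature vectors, in [-1, 1]."""
    a = np.asarray(a, float)
    b = np.asarray(b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def recognize(probe, enrolled, threshold=0.8):
    """Score the probe template against each enrolled template for the
    client ID (`enrolled` maps user ID -> template, an assumed shape),
    keep IDs whose similarity exceeds the recognition threshold, and
    return them sorted by descending similarity. An empty list means
    the recognition is not passed. The 0.8 threshold is illustrative."""
    scored = [(cosine_similarity(probe, tmpl), uid)
              for uid, tmpl in enrolled.items()]
    passed = [(s, uid) for s, uid in scored if s > threshold]
    passed.sort(key=lambda p: p[0], reverse=True)
    return [uid for _, uid in passed]
```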
[0142] In block 410, a recognition result may be returned to the client.
[0143] According to an example of the present disclosure, a recognition result may be defined to mean that the recognition is not passed, or may be a sorted recognition result collection.
[0144] According to an example of the present disclosure, the biometric-based recognition method may be illustrated as follows. It may be assumed that a user A logs in an IM application, such as Tencent QQ, at a mobile phone A (i.e., a client), and a login password is a face of the user A. The user A has registered three QQ numbers at the mobile phone A, which may be referred to as QQ number A, QQ number B, and QQ number C, respectively. As such, a process of recognizing the user A may be illustrated as follows. When the user A wants to log in QQ at the mobile phone A, the user A may collect the face of the user A using the mobile phone A and may send the face of the user A and an ID of the mobile phone A (i.e., a client ID) to a cloud server for recognition. If the recognition at the cloud server is passed, the QQ numbers A, B, and C may be returned to the mobile phone A. The user A may select any one of the QQ numbers A, B, and C to directly log in the QQ without performing the verification process. In other words, when the recognition is passed, the corresponding verification is passed, as well.
[0145] Examples of the present disclosure provide a biometric-based recognition method, in which a client may obtain a biometric image and send the biometric image to a cloud server; the cloud server may perform feature extraction to the biometric image to obtain a biometric template, perform biometric-based recognition to a user or the client, and return a corresponding user ID. In this case, the feature extraction process may be implemented at the cloud server side, so that the complexity of the client may be reduced, the expandability of the client may be increased, a limitation that the biometric recognition may only be implemented on the client may be eliminated, and diversified utilization may be supported.

[0146] As shown in FIG. 11, according to an example of the present disclosure, a cloud server 5 may be provided. The cloud server 5 may include:
an access module 51, to receive a biometric image to be authenticated sent from a client;
an authentication module 52, to perform feature extraction to the biometric image to be authenticated, which is received by the access module 51, to obtain a biometric template to be authenticated, compare the biometric template to be authenticated with a biometric template pre-stored in a data module 53, and return an authentication result; and
the data module 53, to store the biometric template.
[0147] In the cloud server 5 as described above, the access module 51 may be implemented by the access server provided by the examples of the present disclosure, the authentication module 52 may be implemented by the verification server or the recognition server provided by the examples of the present disclosure, and the data module 53 may be implemented by the data server provided by the examples of the present disclosure.
[0148] As shown in FIG. 12, according to an example of the present disclosure, the authentication module 52 may include:
an illumination processing unit 521, to perform illumination-normalization processing to the biometric image to be authenticated;
a feature extracting unit 522, to perform the feature extraction to the biometric image to be authenticated which is processed with the illumination normalization, perform dimension-reducing calculation to an extracted feature, and link results of the dimension-reducing calculation one by one to obtain the biometric template to be authenticated; and
an authenticating unit 523, to compare the biometric template to be authenticated that is obtained by the feature extracting unit 522 with the biometric template pre-stored in the data module 53, and return the authentication result.
[0149] According to an example of the present disclosure, before receiving the biometric image to be authenticated sent from the client, the access module 51 may further receive a biometric image, a client ID, and a user ID that are sent from the client. Accordingly, as shown in FIG. 13, the cloud server 5 may further include:
a session module 54, to perform the feature extraction to the biometric image received by the access module 51 to obtain a biometric template, send a relationship associated with the biometric template, the client ID, and the user ID that are received by the access module 51 to the data module 53 to complete the registration of the user, and return a registration result.
[0150] Accordingly, the data module 53 may further store the relationship associated with the biometric template, the client ID, and the user ID, which is sent by the session module 54.
[0151] In this case, the session module 54 in the cloud server 5 as described above may be implemented by the session server provided by the examples of the present disclosure.
[0152] According to an example of the present disclosure, when receiving the biometric image to be authenticated sent from the client, the access module 51 may further receive the client ID sent from the client. Accordingly, as shown in FIG. 14, the authentication module 52 may further include:
a first template obtaining unit 524, to perform the feature extraction to the biometric image to be authenticated that is received by the access module 51 to obtain the biometric template to be authenticated;
a collection obtaining unit 525, to search, according to the client ID received by the access module 51, from the biometric template stored in the data module 53 to obtain a collection of a biometric template associated with the client ID;
a recognizing unit 526, to calculate a similarity between the biometric template to be authenticated that is obtained by the first template obtaining unit 524 and each biometric template included in the collection obtained by the collection obtaining unit 525; when a similarity between the biometric template to be authenticated and a biometric template included in the collection is greater than a predetermined recognition threshold, add a user ID associated with the biometric template included in the collection to a recognition result collection; otherwise, determine that the recognition is not passed; and
a recognition result transmitting unit 527, to sort the user ID in the recognition result collection according to a descending order of the similarity, and return the recognition result collection to the client through the access module 51.
[0153] According to an example of the present disclosure, a recognition result may be defined to mean that the recognition is not passed, or may be a sorted recognition result collection.
[0154] In this case, the authentication module 52 as shown in FIG. 14 may be implemented by the recognition server provided by the examples of the present disclosure.
[0155] According to an example of the present disclosure, upon receiving the biometric image to be authenticated sent from the client, the access module 51 may further receive the client ID and the user ID that are sent from the client. Accordingly, as shown in FIG. 15, the authentication module 52 may further include:
a second template obtaining unit 528, to perform the feature extraction to the biometric image to be authenticated that is received by the access module 51 to obtain the biometric template to be authenticated;
a searching unit 529, to search out, according to the client ID and the user ID that are received by the access module 51, a biometric template associated with the client ID and the user ID;
a verifying unit 5210, to calculate a similarity between the biometric template to be authenticated that is obtained by the second template obtaining unit 528 and the biometric template that is obtained by the searching unit 529 and is associated with the client ID and the user ID that are received by the access module 51; when the calculated similarity is greater than a predetermined verification threshold, determine that the verification of the user is passed; otherwise, determine that the verification of the user is not passed; and
a verification result transmitting unit 5211, to return a verification result to the client through the access module 51.
[0156] In this case, the authentication module 52 as shown in FIG. 15 may be implemented by the verification server provided by the examples of the present disclosure.
[0157] The modules and/or units in the examples of the present disclosure may be software (e.g., computer readable instructions stored in a computer readable medium and executable by a processor), hardware (e.g., the processor of an application specific integrated circuit (ASIC)), or a combination thereof. The modules and/or units in the examples of the present disclosure may be deployed either in a centralized or a distributed configuration.
[0158] Examples of the present disclosure provide a cloud server, which may obtain a biometric image sent from a client, perform feature extraction to the biometric image to obtain a biometric template, and may perform biometric-based authentication to a user or the client. In this case, the feature extraction process may be implemented at the cloud server side, so that the complexity of the client may be reduced, the expandability of the client may be increased, a limitation that the biometric recognition may only be implemented on the client may be eliminated, and diversified utilization may be supported.
[0159] As shown in FIG. 16, according to an example of the present disclosure, a client 6 may be provided. The client 6 may include:
a collecting module 61, to collect a user image, and perform biometric-positioning processing to the user image to obtain a biometric image to be authenticated;
a transmitting module 62, to transmit the biometric image to be authenticated that is obtained by the collecting module 61 to a cloud server, so that the cloud server may perform feature extraction to the biometric image to be authenticated to obtain a biometric template to be authenticated, and may compare the biometric template to be authenticated with a biometric template pre-stored in the cloud server; and
a receiving module 63, to receive an authentication result returned from the cloud server.
[0160] According to an example of the present disclosure, as shown in FIG. 17, the collecting module 61 as described above may include:
a collecting unit 611, to collect the user image;
a detecting unit 612, to determine and mark, upon detecting that a predetermined biometric is included in the user image, a position of the biometric;
a key-point positioning unit 613, to select a key-point position of the biometric, and obtain a coordinate of the key-point position; and
a position-posture normalization unit 614, to perform, based on the coordinate of the key-point position obtained by the key-point positioning unit 613, posture correcting to the key-point position to obtain the biometric image to be authenticated.
[0161] The modules and/or units in the examples of the present disclosure may be software (e.g., computer readable instructions stored in a computer readable medium and executable by a processor), hardware (e.g., the processor of an application specific integrated circuit (ASIC)), or a combination thereof. The modules and/or units in the examples of the present disclosure may be deployed either in a centralized or a distributed configuration.
[0162] FIG. 18 is a schematic diagram illustrating a hardware structure of the client, according to an example of the present disclosure. As shown in FIG. 18, the client may include a processor 701, a storage medium 702, and an I/O port 703, in which the storage medium 702 may store computer instructions, and the processor 701 may execute the computer instructions to perform operations including:
collecting a user image, and performing biometric-positioning processing to the user image to obtain a biometric image to be authenticated;
transmitting the obtained biometric image to be authenticated to a cloud server, so that the cloud server may perform feature extraction to the biometric image to be authenticated to obtain a biometric template to be authenticated, and may compare the biometric template to be authenticated with a biometric template pre-stored in the cloud server; and
receiving an authentication result returned from the cloud server.

[0163] The processor 701 may execute the computer instructions to further perform operations including:
determining and marking, upon detecting that a predetermined biometric is included in the user image, a position of the biometric;
selecting a key-point position of the biometric, and obtaining a coordinate of the key-point position; and
performing, based on the obtained coordinate of the key-point position, posture correcting to the key-point position to obtain the biometric image to be authenticated.
[0164] The storage medium 702 as described above may be a transitory storage medium (such as random access memory (RAM)), a non-transitory storage medium (such as read-only memory (ROM), or flash memory), or a combination thereof, which may not be limited herein.
[0165] It may be seen that when the computer instructions stored in the storage medium 702 are executed by the processor 701, functions of the aforementioned collecting module 61, the transmitting module 62, and the receiving module 63 are achieved.
[0166] Examples of the present disclosure provide a client, which may obtain a biometric image of a user and send the biometric image to a cloud server; the cloud server may perform feature extraction to the biometric image to obtain a biometric template, and may perform biometric-based authentication to the user or the client. In this case, the feature extraction process may be implemented at the cloud server side, so that the complexity of the client may be reduced, the expandability of the client may be increased, a limitation that the biometric recognition may only be implemented on the client may be eliminated, and diversified utilization may be supported.
[0167] According to an example of the present disclosure, a biometric-based authentication system may further be provided. The biometric-based authentication system may include the cloud server 5 and the client 6 as described above.
[0168] Those skilled in the art may understand that all or part of the procedures of the methods of the above examples may be implemented by hardware, or by hardware following machine readable instructions of a computer program. The computer program may be stored in a computer readable storage medium. When running, the computer program may provide the procedures of the above method examples. The storage medium may be a diskette, CD, ROM, or RAM, etc.
[0169] The above are several examples of the present disclosure, and are not used for limiting the protection scope of the present disclosure. Any modifications, equivalents, improvements, etc., made under the principle of the present disclosure should be included in the protection scope of the present disclosure.

Claims (15)

    1. A biometric-based authentication method, comprising:
receiving, by a cloud server, a biometric image to be authenticated sent from a client;
performing, by the cloud server, feature extraction to the biometric image to be authenticated to obtain a biometric template to be authenticated;
comparing, by the cloud server, the biometric template to be authenticated with a biometric template stored in the cloud server; and
returning an authentication result to the client.
2. The method of claim 1, wherein the operation of performing the feature extraction to the biometric image to be authenticated to obtain the biometric template to be authenticated comprises:
performing illumination-normalization processing to the biometric image to be authenticated;
performing the feature extraction to the biometric image to be authenticated which is processed with the illumination normalization;
performing dimension-reducing calculation to an extracted feature; and
linking results of the dimension-reducing calculation one by one to obtain the biometric template to be authenticated.
3. The method of claim 1, wherein before the operation of receiving the biometric image to be authenticated sent from the client, the method further comprises:
    receiving a biométrie image to be registered, a client identifier (ID), and a first user ID that are sent from the client;
    performing the feature extraction to the biométrie image to be registered to obtain a biométrie template;
    5 storing a relationship associated with the biométrie template, the client
    ID, and the first user ID to complété registration of a user, and retuming a registration resuit.
  4. The method of claim 3, further comprising:
    upon receiving the biometric image to be authenticated sent from the client, receiving the client ID sent from the client;
    wherein the operation of comparing the biometric template to be authenticated with the biometric template stored in the cloud server and returning the authentication result comprises:
    searching out, according to the client ID, a biometric template stored in the cloud server and associated with the client ID to obtain a collection of the biometric template stored in the cloud server and associated with the client ID;
    calculating a similarity between the biometric template to be authenticated and each biometric template in the collection;
    when a similarity between the biometric template to be authenticated and a biometric template in the collection is greater than a predetermined recognition threshold, adding a user ID associated with the biometric template in the collection to a recognition result collection;
    sorting the user ID in the recognition result collection according to a descending order of the similarity; and
    returning the recognition result collection to the client.
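A minimal sketch of the 1:N recognition flow of claim 4, assuming cosine similarity as the similarity measure and a plain dictionary as the per-client template collection (both are illustrative choices, not fixed by the claim):

```python
import math

def cosine_similarity(a, b):
    # One common similarity measure between two template vectors;
    # the claim does not prescribe a particular similarity function.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a)) or 1.0
    norm_b = math.sqrt(sum(y * y for y in b)) or 1.0
    return dot / (norm_a * norm_b)

def identify(probe_template, gallery, recognition_threshold=0.8):
    # `gallery` maps user IDs to stored templates for one client ID
    # (a hypothetical layout of the cloud-side collection).
    matches = []
    for user_id, template in gallery.items():
        similarity = cosine_similarity(probe_template, template)
        if similarity > recognition_threshold:
            matches.append((user_id, similarity))
    # Sort user IDs by descending similarity before returning the
    # recognition result collection to the client.
    matches.sort(key=lambda m: m[1], reverse=True)
    return [user_id for user_id, _ in matches]

gallery = {"alice": [1.0, 0.0], "bob": [0.0, 1.0], "carol": [0.9, 0.1]}
result = identify([1.0, 0.0], gallery)
# candidates below the recognition threshold are excluded from the result
```

Only candidates above the recognition threshold appear in the returned collection, best match first.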
  5. The method of claim 3, further comprising:
    upon receiving the biometric image to be authenticated sent from the client, receiving the client ID and a second user ID sent from the client;
    wherein the operation of comparing the biometric template to be authenticated with the biometric template stored in the cloud server and returning the authentication result comprises:
    searching out, according to the client ID and the second user ID, a biometric template stored in the cloud server and associated with the client ID and the second user ID;
    calculating a similarity between the biometric template to be authenticated and the biometric template stored in the cloud server and associated with the client ID and the second user ID; and
    when the calculated similarity is greater than a predetermined verification threshold, determining that verification of a user is passed, and returning a verification result to the client.
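The 1:1 verification flow of claim 5 differs from the 1:N flow of claim 4 in that a single stored template is looked up via the client ID and second user ID and compared against one threshold. A minimal sketch, again assuming cosine similarity as an illustrative similarity measure:

```python
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a)) or 1.0
    norm_b = math.sqrt(sum(y * y for y in b)) or 1.0
    return dot / (norm_a * norm_b)

def verify(probe_template, stored_template, verification_threshold=0.9):
    # 1:1 check: the single template found via the client ID and the
    # second user ID is compared against the probe; verification passes
    # only when the similarity exceeds the verification threshold.
    return cosine_similarity(probe_template, stored_template) > verification_threshold
```

The verification threshold would typically be set higher than the recognition threshold, since a 1:1 claim of identity warrants a stricter match.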
  6. A biometric-based authentication method, comprising:
    collecting, by a client, a user image;
    performing, by the client, biometric-positioning processing to the user image to obtain a biometric image to be authenticated;
    transmitting, by the client, the biometric image to be authenticated to a cloud server, so that the cloud server performs feature extraction to the biometric image to be authenticated to obtain a biometric template to be authenticated, and compares the biometric template to be authenticated with a biometric template stored in the cloud server; and
    receiving, by the client, an authentication result returned from the cloud server.
  7. The method of claim 6, wherein the operation of performing the biometric-positioning processing to the user image to obtain the biometric image to be authenticated comprises:
    upon detecting that a predetermined biometric is included in the user image, determining and marking a position of the biometric;
    selecting a key-point position of the biometric;
    obtaining a coordinate of the key-point position; and
    performing, based on the obtained coordinate of the key-point position, posture correcting to the key-point position to obtain the biometric image to be authenticated.
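For a face biometric, the posture correcting of claim 7 may, for example, be realized as a similarity transform computed from eye key-point coordinates. The canonical target coordinates below are assumptions for illustration only; the claim does not prescribe specific key points or a specific transform:

```python
import math

def posture_correction_params(left_eye, right_eye,
                              target_left=(30.0, 40.0),
                              target_right=(70.0, 40.0)):
    # Hypothetical posture correction: derive the rotation angle and
    # scale factor that map the detected eye key points onto canonical
    # target positions; the targets are illustrative assumptions.
    dx, dy = right_eye[0] - left_eye[0], right_eye[1] - left_eye[1]
    tdx, tdy = target_right[0] - target_left[0], target_right[1] - target_left[1]
    angle = math.atan2(tdy, tdx) - math.atan2(dy, dx)   # rotation to apply
    scale = math.hypot(tdx, tdy) / (math.hypot(dx, dy) or 1.0)
    return angle, scale

# Eyes detected on a tilted face: the correction rotates and rescales
# them onto the horizontal canonical positions.
angle, scale = posture_correction_params((20.0, 20.0), (60.0, 60.0))
```

The resulting angle and scale would then parameterize an affine warp of the marked biometric region, producing the posture-corrected image sent to the cloud server.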
  8. A cloud server, comprising:
    an access module, to receive a biometric image to be authenticated sent from a client;
    an authentication module, to perform feature extraction to the biometric image to be authenticated to obtain a biometric template to be authenticated, compare the biometric template to be authenticated with a biometric template stored in a data module, and return an authentication result to the client; and
    the data module, to store the biometric template.
  9. The cloud server of claim 8, wherein the authentication module comprises:
    an illumination processing unit, to perform illumination-normalization processing to the biometric image to be authenticated;
    a feature extracting unit, to perform the feature extraction to the biometric image to be authenticated which is processed with the illumination normalization, perform dimension-reducing calculation to an extracted feature, and link results of the dimension-reducing calculation one by one to obtain the biometric template to be authenticated; and
    an authenticating unit, to compare the biometric template to be authenticated with the biometric template stored in the data module, and return the authentication result.
  10. The cloud server of claim 8, wherein the access module is further to, before receiving the biometric image to be authenticated sent from the client, receive a biometric image to be registered, a client identifier (ID), and a first user ID that are sent from the client;
    the cloud server further comprises:
    a session module, to perform the feature extraction to the biometric image to be registered to obtain a biometric template, and send a relationship associated with the biometric template, the client ID, and the first user ID to the data module to complete registration of a user, and return a registration result; and
    the data module is further to store the relationship associated with the biometric template, the client ID, and the first user ID.
  11. The cloud server of claim 10, wherein the access module is further to, upon receiving the biometric image to be authenticated sent from the client, receive the client ID sent from the client;
    the authentication module comprises:
    a template obtaining unit, to perform the feature extraction to the biometric image to be authenticated to obtain the biometric template to be authenticated;
    a collection obtaining unit, to search out, according to the client ID, a biometric template stored in the data module and associated with the client ID to obtain a collection of the biometric template stored in the data module and associated with the client ID;
    a recognizing unit, to calculate a similarity between the biometric template to be authenticated and each biometric template in the collection, and, when a similarity between the biometric template to be authenticated and a biometric template in the collection is greater than a predetermined recognition threshold, add a user ID associated with the biometric template in the collection to a recognition result collection; and
    a recognition result transmitting unit, to sort the user ID in the recognition result collection according to a descending order of the similarity, and return the recognition result collection to the client through the access module.
  12. The cloud server of claim 10, wherein the access module is further to, upon receiving the biometric image to be authenticated sent from the client, receive the client ID and a second user ID sent from the client;
    the authentication module comprises:
    a template obtaining unit, to perform the feature extraction to the biometric image to be authenticated to obtain the biometric template to be authenticated;
    a searching unit, to search out, according to the client ID and the second user ID, a biometric template stored in the data module and associated with the client ID and the second user ID;
    a verifying unit, to calculate a similarity between the biometric template to be authenticated and the biometric template stored in the data module and associated with the client ID and the second user ID, and, when the calculated similarity is greater than a predetermined verification threshold, determine that verification of a user is passed; and
    a verification result transmitting unit, to return a verification result to the client through the access module.
  13. A client, comprising:
    a collecting module, to collect a user image, and perform biometric-positioning processing to the user image to obtain a biometric image to be authenticated;
    a transmitting module, to transmit the biometric image to be authenticated to a cloud server, so that the cloud server performs feature extraction to the biometric image to be authenticated to obtain a biometric template to be authenticated, and compares the biometric template to be authenticated with a biometric template stored in the cloud server; and
    a receiving module, to receive an authentication result returned from the server.
  14. The client of claim 13, wherein the collecting module comprises:
    a collecting unit, to collect the user image;
    a detecting unit, to determine and mark, upon detecting that a predetermined biometric is included in the user image, a position of the biometric;
    a key-point positioning unit, to select a key-point position of the biometric, and obtain a coordinate of the key-point position; and
    a position-posture normalization unit, to perform, based on the obtained coordinate of the key-point position, posture correcting to the key-point position to obtain the biometric image to be authenticated.
  15. A biometric-based authentication system, comprising:
    a cloud server claimed in any of the preceding claims 8 to 12; and
    a client claimed in any of the preceding claims 13 to 14.
OA1201400421 2012-03-19 2013-03-13 Authentication method, device and system based on biological characteristics. OA17098A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201210072147.2 2012-03-19

Publications (1)

Publication Number Publication Date
OA17098A true OA17098A (en) 2016-03-23
