CN117235694A - Login system and method based on face recognition big data - Google Patents


Info

Publication number
CN117235694A
CN117235694A (application CN202311181342.3A)
Authority
CN
China
Prior art keywords
user
data
face
facial
feature
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311181342.3A
Other languages
Chinese (zh)
Inventor
李淑红
刘飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Heilongjiang Duyue Technology Co ltd
Original Assignee
Heilongjiang Duyue Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Heilongjiang Duyue Technology Co ltd filed Critical Heilongjiang Duyue Technology Co ltd
Priority: CN202311181342.3A
Publication: CN117235694A
Legal status: Pending


Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00: Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Collating Specific Patterns (AREA)

Abstract

The invention discloses a login system and method based on face recognition big data, belonging to the technical field of face recognition big data. The login system based on face recognition big data comprises a user registration module, a user login module, a face recognition module, a user management module, and a security and privacy module. The user registration module is responsible for registering user accounts; the user login module is responsible for matching a user's face data against the features in the database, and the user obtains access rights after the match succeeds; the face recognition module receives the face data provided by the user and recognizes it; the user management module is responsible for storing and managing users' face data and account information, which an administrator can manage; the security and privacy module is responsible for ensuring user privacy and data security. The login method based on face recognition big data is an identity-verification approach that uses the user's facial features.

Description

Login system and method based on face recognition big data
Technical Field
The invention relates to the technical field of face recognition big data, in particular to a login system and a login method based on the face recognition big data.
Background
The face recognition big data technology is a large-scale data set processing and analyzing method based on face images and facial features, and aims to realize high-precision face recognition and biological feature recognition. The technology combines knowledge in the fields of artificial intelligence, computer vision, machine learning and the like, and has wide application fields including identity verification, security access control, monitoring systems, social media analysis and the like.
Conventional face recognition systems can be compromised by presentation attacks, in which an attacker fools the system with a photo or recorded video of the user. Face recognition systems are also affected by factors such as lighting conditions, viewing angle, and changes in expression, leading to misrecognition. Storing the biometric data of a large number of users raises privacy concerns, and if such data is not properly managed and protected it can lead to data leakage and abuse. Feature extraction and matching can require significant computational resources and time; efficiency becomes a challenge especially when processing large-scale data. Furthermore, conventional authentication methods may require multiple steps, resulting in a poor user experience.
Disclosure of Invention
The invention aims to provide a login system and a login method based on face recognition big data, so as to solve the problems in the background technology.
In order to solve the technical problems, the invention provides the following technical scheme:
the login system based on the face recognition big data comprises a user registration module, a user login module, a face recognition module, a user management module and a security and privacy module;
the user registration module comprises a user information collection unit, a face data collection unit and a data preprocessing unit, and is responsible for registering accounts of users;
the user login module comprises a user information collection unit, a face data collection unit, a data preprocessing unit, and an authentication and access control unit, and is responsible for matching the face data of a logging-in user with the features in the database; the user obtains access rights after the face recognition module reports a successful match;
the face recognition module comprises a face data processing unit and a face recognition unit, and is responsible for receiving the feature vector of the face data provided by the user, comparing it with the face feature vectors stored in the database, and returning the comparison result to the user login module;
the user management module comprises a user information storage unit, a face data storage unit and a user account management unit, and is responsible for storing and managing face data and account information of a user, and an administrator uses the module to create, delete, disable or enable a user account;
the security and privacy module comprises a data encryption unit, an access control unit, a compliance checking unit, and a logging and monitoring unit, and is responsible for ensuring the security of the system and the privacy of user data, guaranteeing that user data is fully protected during transmission and storage.
The user registration module and the user login module both comprise a user information collection unit, a face data collection unit and a data preprocessing unit. The user information collection unit is responsible for collecting the user's user name and password; the face data collection unit is responsible for receiving face video data uploaded by the user; the data preprocessing unit is responsible for preprocessing the video data, where the preprocessing operations comprise frame extraction, image cropping, denoising and alignment. After preprocessing is finished, the user's facial features are extracted, including eye positions and contours, mouth and lip features, nose shape, facial contour, skin texture, facial expressions and muscle movements, and eyeball colors and textures. Finally, the data preprocessing unit is responsible for converting the facial features into feature vectors. The user login module additionally comprises an authentication and access control unit that decides whether to allow the user to log in.
The authentication and access control unit of the user login module operates as follows:
s3-1, acquiring a user name of a user from a face data collection unit and acquiring a feature vector of facial features of the user from a data preprocessing unit;
s3-2, packing the user name and the feature vector of the facial feature of the user into user data;
s3-3, transmitting the user data to a face recognition module for recognition;
s3-4, acquiring a face comparison result transmitted by a face recognition module;
s3-5, if the face comparison succeeds, granting the user access rights; if the face comparison fails, the user is denied access and must attempt to log in again.
The face recognition module performs recognition through the following steps:
s4-1, receiving packed user data from a user login module, wherein the packed user data comprises a user name and a feature vector of facial features;
s4-2, inquiring a user database to obtain feature vectors of facial features of registered users;
s4-3, comparing the feature vector of the facial feature provided by the user with the feature vector of the registered user to generate a similarity score;
s4-4, setting a similarity threshold as N, and judging whether the two feature vectors are similar enough to be considered as the features of the same person;
s4-5, judging a comparison result according to the comparison similarity score and a set similarity threshold value; if the difference value between the similarity score and the threshold value N is within U, the face comparison is considered to be successful;
s4-6, returning the comparison result to an authentication and access control unit of the user login module; the comparison result comprises comparison success and comparison failure.
In step S4-3, the feature vectors are compared using the Euclidean distance. Assume there are two feature vectors A and B, expressed respectively as A = (a1, a2, a3, …, an) and B = (b1, b2, b3, …, bn), where n is the dimension of the feature vectors and ai and bi are the values of the two vectors in the i-th dimension;
the calculation formula of the Euclidean distance is as follows:
d(A, B) = sqrt((a1 - b1)^2 + (a2 - b2)^2 + … + (an - bn)^2)
the distance value of the Euclidean distance represents the difference degree between the two feature vectors, and the distance value is substituted into the step S4-5 for comparison; the calculation formula of the similarity score in step S4-5 is as follows:
V = exp(-k * d(A, B))
where V is the similarity score, k is a constant, and d(A, B) is the Euclidean distance.
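As an illustrative sketch (the function names are not from the patent), the Euclidean distance of step S4-3 and the similarity score above can be computed as:

```python
import math

def euclidean_distance(a, b):
    """d(A, B) = sqrt(sum_i (a_i - b_i)^2) for two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def similarity_score(a, b, k=1.0):
    """V = exp(-k * d(A, B)); larger V means more similar vectors."""
    return math.exp(-k * euclidean_distance(a, b))
```

With k = 1, identical vectors give V = 1, and the score decays toward 0 as the distance grows.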
The step of converting the facial features into feature vectors is as follows:
s6-1, detecting and positioning a human face in an input video by using a human face detection algorithm, and only extracting facial features of a user in the detection process;
s6-2, cropping out the image region containing the face according to the detected face position, reducing the computation required for feature extraction and focusing the analysis on the face;
s6-3, preprocessing the cropped facial image to reduce noise and improve the reliability of the features; the preprocessing steps comprise: graying, histogram equalization and denoising;
s6-4, extracting the facial features from the preprocessed facial images by using a feature extraction algorithm;
s6-5, converting the extracted facial features into feature vectors; the feature vector is obtained by arranging the measurement of each feature into a one-dimensional vector (feature1, feature2, feature3, …, featuren), where n indicates that the user has n different facial features;
s6-6, normalizing the feature vector.
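Steps S6-5 and S6-6 can be sketched as follows; since the patent does not specify the normalization method, L2 (unit-length) normalization is assumed here, and the function names are illustrative:

```python
import math

def to_feature_vector(measurements):
    """S6-5: arrange the per-feature measurements into a one-dimensional vector."""
    return list(measurements)

def l2_normalize(vec):
    """S6-6: scale the feature vector to unit length (one possible normalization)."""
    norm = math.sqrt(sum(v * v for v in vec))
    return [v / norm for v in vec] if norm > 0 else list(vec)
```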
In S6-1, the face detection algorithm uses an RPN (region proposal network), a deep learning network structure used to generate anchor boxes in an image, where an anchor box is a candidate target's bounding box. Using a sliding window, multiple anchor boxes of different sizes and aspect ratios are generated at each location on the feature map, each anchor box being defined by its center coordinates, width, and height. The RPN outputs coordinate information for each anchor box, comprising its center point coordinates, width, and height, and these coordinates are used to adjust the anchor boxes;
the RPN also outputs a confidence score for each anchor box, representing the probability that the anchor box contains a target; this score is used to filter the anchor boxes, retaining bounding boxes with high confidence;
the RPN generates coordinates and confidence scores for the candidate bounding boxes using the following formula:
for each anchor box, the adjustment values for its center point coordinates and its width and height are calculated; the adjustment values comprise Δx, Δy, Δw and Δh, and the new bounding box coordinates are: new center point coordinate x' = x + Δx, new center point coordinate y' = y + Δy, new width w' = w × exp(Δw), new height h' = h × exp(Δh);
for each anchor box, calculating a confidence score representing the probability that the anchor box contains the target; the score is output through the classification network, and the sigmoid activation function is used to map the output to between 0 and 1, representing the probability that the anchor box contains the target.
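The anchor adjustment formulas above can be sketched in a few lines (the function name `adjust_anchor` is illustrative, not from the patent):

```python
import math

def adjust_anchor(x, y, w, h, dx, dy, dw, dh):
    """Apply RPN regression offsets: shift the center additively,
    scale the width and height exponentially."""
    return (x + dx, y + dy, w * math.exp(dw), h * math.exp(dh))
```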
The sigmoid activation function is commonly used in binary classification problems to map network outputs to values between 0 and 1. In calculating the confidence score, the original score is converted into the probability that the anchor box contains the target using the sigmoid activation function. The sigmoid activation function is expressed mathematically as follows:
σ(x) = 1 / (1 + exp(-x))
where x is the original score or network output value and σ(x) is the mapped probability value.
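A minimal implementation of the sigmoid function described above:

```python
import math

def sigmoid(x):
    """sigma(x) = 1 / (1 + exp(-x)): maps any real-valued score to (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))
```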
In S6-4, feature extraction is performed using PCA (principal component analysis), which reduces the data dimensionality and extracts the principal information in the data. The calculation proceeds as follows:
PCA calculates the covariance matrix of the data, which describes the relationships between the different features; for a data set with N features, the covariance matrix has size N×N, and the covariance matrix C is calculated using the following formula:
C = (1/N) * (X - μ) * (X - μ)^T
where X is the data matrix, μ is the mean of the data, and ^T denotes the matrix transpose;
then, eigenvalue decomposition is performed on the covariance matrix to obtain the eigenvalues and corresponding eigenvectors; the eigenvalues represent the importance of each eigenvector, and the eigenvectors describe the dominant directions in the new feature space; the number of principal components to retain is selected according to the magnitude of the eigenvalues, retaining W% of the total variance;
finally, the original data is projected into the new feature space using the selected principal components, the projection being accomplished by multiplying the data matrix by the principal component matrix, the projected data having dimensionality-reduced features for subsequent analysis and modeling tasks.
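A compact sketch of the PCA procedure above, assuming rows of the data matrix are samples and columns are features (`pca_project` and the 95%-variance default are illustrative choices, since the patent leaves W unspecified):

```python
import numpy as np

def pca_project(X, variance_to_keep=0.95):
    """Covariance-eigendecomposition PCA, keeping enough components
    to retain the requested fraction of total variance."""
    mu = X.mean(axis=0)
    Xc = X - mu
    C = Xc.T @ Xc / X.shape[0]          # covariance matrix over features
    vals, vecs = np.linalg.eigh(C)      # eigenvalues in ascending order
    order = np.argsort(vals)[::-1]      # sort descending by importance
    vals, vecs = vals[order], vecs[:, order]
    ratio = np.cumsum(vals) / vals.sum()
    k = int(np.searchsorted(ratio, variance_to_keep) + 1)
    return Xc @ vecs[:, :k]             # projected, dimension-reduced data
```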
The user information storage unit of the user management module is responsible for storing users' basic information; this information is stored in a database in structured form so that the system can retrieve and update it at any time. The face data storage unit is responsible for storing users' facial feature data. Such data is typically stored in a secure manner to prevent unauthorized access. The facial feature data is stored in the form of feature vectors for comparison during user login. In the user account management unit, the administrator is allowed to perform the following operations:
a. the administrator may add a new user account, assign them a unique user ID, and record their basic information.
b. The administrator may delete user accounts that no longer need to access the system.
c. An administrator may disable or enable user accounts to control whether they have access to the system.
d. The administrator may perform resetting of the password or access rights if the user forgets the password or needs to reset the access rights.
The user management module also includes data backup and recovery functions to ensure that user information and face data can be recovered in the event of a system failure or data loss. To enhance security and compliance, the user management module may generate an audit log that records all the administrator's actions on the user's accounts and rights. These logs help track the activity of the system and detect potential problems.
The data encryption unit of the security and privacy module uses RSA encryption; in the initialization stage of the data encryption unit, a pair of RSA keys including a public key and a private key are generated; the public key is used for encrypting data, and the private key is used for decrypting data; when a user or a system needs to encrypt sensitive data, firstly, an RSA public key needs to be acquired, the data to be encrypted and the RSA public key are transmitted to a data encryption unit, the data encryption unit encrypts plaintext data by using the RSA public key, and the encryption operation steps are as follows:
s9-1, converting the plaintext data into numeric form and processing the data with the PKCS#1 padding scheme to meet the requirements of the RSA encryption algorithm;
s9-2, encrypting the digital data by using the RSA public key;
s9-3, ciphertext data is generated;
the ciphertext data is transmitted over the communication channel; at the receiving end, the ciphertext data is decrypted using the RSA private key to restore it to the original plaintext data, with the decryption operation likewise executed by the data encryption unit.
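The encryption and decryption steps above can be illustrated with textbook RSA using deliberately tiny primes (the classic 61/53 example); this toy omits the PKCS#1 padding the text requires and is not secure, so real deployments should use a vetted cryptography library:

```python
# Toy textbook RSA -- illustrative only, NOT secure (tiny primes, no padding).
p, q = 61, 53
n = p * q                      # modulus n = 3233
phi = (p - 1) * (q - 1)        # Euler totient = 3120
e = 17                         # public exponent, coprime with phi
d = pow(e, -1, phi)            # private exponent via modular inverse (Python 3.8+)

def rsa_encrypt(m: int) -> int:
    """S9-2: encrypt the numeric plaintext m with the public key (e, n)."""
    return pow(m, e, n)

def rsa_decrypt(c: int) -> int:
    """Receiving end: decrypt ciphertext c with the private key (d, n)."""
    return pow(c, d, n)
```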
The access control unit of the security and privacy module is used to manage users' and administrators' access rights to the system. It defines who can access which parts of the system, and in what way. Mechanisms such as access control lists (ACLs) or role-based access control (RBAC) ensure that only authorized users can perform certain operations. The compliance checking unit is responsible for ensuring that the system complies with applicable regulations and legal requirements. It can perform compliance reviews, monitoring, reporting, and automated compliance checks, which are important for systems that must comply with GDPR, HIPAA, or other relevant regulations. The logging and monitoring unit records system activity and monitors for potential threats and abnormal behavior. It generates audit logs covering user logins, data access, system configuration changes, and so on. These logs can be used for troubleshooting, security auditing, and monitoring the operating condition of the system.
The login method based on the face recognition big data comprises the following steps:
s10-1, registering an account in a system by a user;
s10-2, preprocessing the facial data uploaded by the user; the preprocessing comprises image cropping, denoising, alignment and normalization;
s10-3, extracting facial features from facial data uploaded by a user; features include eye position and contour, mouth and lip features, nose shape, facial contour, skin texture, facial expression and muscle movement, eye ball color and texture;
s10-4, converting the extracted facial features into feature vectors, and mapping the high-dimensional features into low-dimensional vectors by using PCA;
s10-5, the system associates the facial feature vector of the user with account information thereof and stores the facial feature vector in a database;
s10-6, a user needs to log in the system to access an account or perform a specific operation; when logging in, the user uploads the face data of the user;
s10-7, preprocessing the uploaded user face data;
s10-8, extracting facial features from facial data uploaded by a user;
s10-9, converting the extracted facial features into feature vectors;
s10-10, comparing the registered user's feature vector with the feature vector of the person logging in, calculating the similarity between the feature vectors using the Euclidean distance, and setting a similarity threshold;
s10-11, if the face recognition result is higher than the similarity threshold, the user is granted access rights, and the system can be accessed or specific operation can be executed;
in the whole login process, the privacy and the data security of the user are ensured, and the security measures of data encryption, access control, log recording and monitoring are adopted to protect the system.
Compared with the prior art, the invention has the following beneficial effects:
the invention adopts complex face comparison algorithm based on face recognition big data, thereby improving the safety of the system. While conventional passwords and PIN codes are susceptible to risks of guessing and theft, the present invention uses biometric identification to reduce these risks.
Through the built-in security and privacy module, the invention improves the capability of resisting photo and video attacks, and ensures that only real users can obtain access rights.
The privacy and data security module of the present invention ensures that the biometric data of the user is adequately protected, including data encryption and compliance checking.
The face recognition technology based on the face recognition big data can provide high-precision face recognition, and meets the recognition requirements under different light, angle and expression conditions.
Drawings
The accompanying drawings are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate the invention and together with the embodiments of the invention, serve to explain the invention. In the drawings:
FIG. 1 is a system structure diagram of a login system based on face recognition big data of the present invention;
fig. 2 is a schematic diagram of steps of a login method based on face recognition big data.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Referring to fig. 1-2, the present invention provides the following technical solutions:
according to one embodiment of the present invention, as shown in a system configuration diagram of a login system based on face recognition big data in fig. 1, the system includes a user registration module, a user login module, a face recognition module, a user management module, a security and privacy module;
the user registration module comprises a user information collection unit, a face data collection unit and a data preprocessing unit, and is responsible for registering accounts of users;
the user login module comprises a user information collection unit, a face data collection unit, a data preprocessing unit, and an authentication and access control unit, and is responsible for matching the face data of a logging-in user with the features in the database; the user obtains access rights after the face recognition module reports a successful match;
the face recognition module comprises a face data processing unit and a face recognition unit, and is responsible for receiving the feature vector of the face data provided by the user, comparing it with the face feature vectors stored in the database, and returning the comparison result to the user login module;
the user management module comprises a user information storage unit, a face data storage unit and a user account management unit, and is responsible for storing and managing face data and account information of a user, and an administrator uses the module to create, delete, disable or enable a user account;
the security and privacy module comprises a data encryption unit, an access control unit, a compliance checking unit, and a logging and monitoring unit, and is responsible for ensuring the security of the system and the privacy of user data, guaranteeing that user data is fully protected during transmission and storage.
The user registration module is responsible for collecting and registering information of new users, including face data. This information is stored in the user management module. The user login module is responsible for verifying the identity of a registered user, and the face recognition module is used for comparing face data provided by the user with feature vectors stored during registration. If authentication is successful, the user will gain access.
The main connection between the user login module and the face recognition module is authentication through face data. The user login module transmits the face data of the user to the face recognition module, which compares the data with the feature vectors in the database. The comparison will be used to determine if the user is granted access.
The user management module is the core component for storing and managing user information. It is associated with the user registration module for storing new users' information and facial feature vectors. Meanwhile, the user login module is also associated with the user management module for verifying users' identities. The administrator may create, delete, disable, or enable user accounts using the user management module.
The link between the user login module and the security and privacy module is to ensure the security of the system. The authentication and access control unit checks the identity of the user during the login process and ensures that only legitimate users can obtain access rights. Meanwhile, the data encryption unit ensures that the data of the user is fully protected in transmission and storage so as to ensure privacy and security.
According to another embodiment of the invention, the login method based on the face recognition big data comprises the following steps:
s10-1, registering an account in a system by a user;
s10-2, preprocessing the facial data uploaded by the user; the preprocessing comprises image cropping, denoising, alignment and normalization;
s10-3, extracting facial features from facial data uploaded by a user; features include eye position and contour, mouth and lip features, nose shape, facial contour, skin texture, facial expression and muscle movement, eye ball color and texture;
s10-4, converting the extracted facial features into feature vectors, and mapping the high-dimensional features into low-dimensional vectors by using PCA;
s10-5, the system associates the facial feature vector of the user with account information thereof and stores the facial feature vector in a database;
s10-6, a user needs to log in the system to access an account or perform a specific operation; when logging in, the user uploads the face data of the user;
s10-7, preprocessing the uploaded user face data;
s10-8, extracting facial features from facial data uploaded by a user;
s10-9, converting the extracted facial features into feature vectors;
s10-10, comparing the registered user's feature vector with the feature vector of the person logging in, calculating the similarity between the feature vectors using the Euclidean distance, and setting a similarity threshold;
s10-11, if the face recognition result is higher than the similarity threshold, the user is granted access rights, and the system can be accessed or specific operation can be executed;
in the whole login process, the privacy and the data security of the user are ensured, and the security measures of data encryption, access control, log recording and monitoring are adopted to protect the system.
The user registers the account, inputs the following user information:
User name: JohnDoe; password: 123456; email address: johndoe@example.com. Face capture is then performed using a camera.
In the face recognition process, the RPN face detection algorithm is used: for each anchor box, the adjustment values for its center point coordinates and its width and height are calculated; the adjustment values comprise Δx, Δy, Δw and Δh, and the new bounding box coordinates are: new center point coordinate x' = x + Δx, new center point coordinate y' = y + Δy, new width w' = w × exp(Δw), new height h' = h × exp(Δh);
we have an initial anchor frame with its center coordinates (x, y) of (100 ), width (w) of 50 and height (h) of 60. The coordinates and the sizes of the anchor frame after adjustment are calculated as follows:
Δx=5,Δy=-3,Δw=0.1,Δh=0.2;
new center point coordinate x' = x + Δx = 100 + 5 = 105;
new center point coordinate y' = y + Δy = 100 - 3 = 97;
new width w' = w × exp(Δw) = 50 × exp(0.1) ≈ 55.26;
new height h' = h × exp(Δh) = 60 × exp(0.2) ≈ 73.28;
thus, by applying these adjustment values, the initial anchor box is adjusted to the new position (105, 97) and size (55.26, 73.28).
For each anchor box, calculating a confidence score representing the probability that the anchor box contains the target; the score is output through the classification network, and the sigmoid activation function is used to map the output to between 0 and 1, representing the probability that the anchor box contains the target.
The sigmoid activation function is commonly used in binary classification problems to map network outputs to values between 0 and 1. In calculating the confidence score, the original score is converted into the probability that the anchor box contains the target using the sigmoid activation function. The sigmoid activation function is expressed mathematically as follows:
σ(x) = 1 / (1 + exp(-x))
where x is the original score or network output value and σ(x) is the mapped probability value.
Calculated: Δx = sigmoid(5) ≈ 0.9933; Δy = sigmoid(-3) ≈ 0.0474; Δw = sigmoid(0.1) ≈ 0.5250; Δh = sigmoid(0.2) ≈ 0.5498.
These values are final adjustment values processed by a sigmoid activation function for adjusting the center coordinates and the size of the anchor frame to generate candidate regions for face detection.
The user name, password and mailbox address are successfully entered and submitted, the uploaded face photo is received by the system, and the system begins preprocessing the uploaded photo, including image cropping, denoising, alignment and standardization. Facial features extracted from the photograph include: eye position and contour, mouth and lip characteristics, nose shape, facial contour, skin texture, facial expression and muscle movement, eye ball color and texture.
When extracting facial features, the PCA feature extraction algorithm is used. PCA calculates the covariance matrix of the data, which describes the relationships between the different features; for a data set with N features, the covariance matrix has size N×N, and the covariance matrix C is calculated using the following formula:
C = (1/N) * (X - μ) * (X - μ)^T
where X is the data matrix, μ is the mean of the data, and ^T denotes the matrix transpose;
then, eigenvalue decomposition is performed on the covariance matrix to obtain the eigenvalues and corresponding eigenvectors; the eigenvalues represent the importance of each eigenvector, and the eigenvectors describe the dominant directions in the new feature space; the number of principal components to retain is selected according to the magnitude of the eigenvalues, retaining 95% of the total variance;
finally, the original data is projected into the new feature space using the selected principal components, the projection being accomplished by multiplying the data matrix by the principal component matrix, the projected data having dimensionality-reduced features for subsequent analysis and modeling tasks.
The extracted facial features are converted into feature vectors; the high-dimensional features are mapped to low-dimensional vectors using PCA, and the user's facial feature vector is associated with the account information and stored in a database. The facial feature vector of the user is [0.1, 0.2, 0.3, …, 0.1] (128 floating-point values).
After the user finishes registering, a login operation is performed: the user name JohnDoe is entered and the camera is opened to carry out the face recognition operation. The user name "JohnDoe" is entered successfully. After the "scan face" button is clicked, the system starts the camera to scan the face, and the scanned face data is received by the system.
The system begins preprocessing the scanned face data, including image cropping, denoising, alignment and normalization. The facial features extracted from the scanned face data include: eye position and contour, mouth and lip features, nose shape, facial contour, skin texture, facial expression and muscle movement, and eyeball color and texture. The extracted facial features are converted into feature vectors, and the high-dimensional features are mapped to low-dimensional vectors using PCA. The system retrieves the feature vector previously registered for user "JohnDoe", compares it with the feature vector of the login user, and calculates the similarity between the two feature vectors using Euclidean distance.
Feature vector A = [0.1, 0.2, 0.3, …, 0.1] (128 floating-point values),
feature vector B = [0.2, 0.1, 0.2, …, 0.3] (128 floating-point values),
where feature vector A is the feature vector of user "JohnDoe" in the system database and feature vector B is the facial feature vector captured by face recognition when the user logs in; the Euclidean distance between the two vectors is calculated as d(A, B) = √((a1 − b1)² + (a2 − b2)² + … + (an − bn)²).
The similarity score = exp(−k × Euclidean distance); the similarity threshold is set to 1, and the system judges whether the difference between the similarity score and the similarity threshold is within 0.5; if it is within 0.5, the feature vectors are considered similar, otherwise they are considered dissimilar. Here k is a constant used to adjust the shape of the similarity score; with k = 1 and a Euclidean distance of 0.2, the similarity score = exp(−1 × 0.2) ≈ 0.8187, which is close to 1 and within 0.5 of the threshold, so the two feature vectors are judged very similar, the system recognizes that they are from the same person, and the user is allowed to log in.
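A minimal worked sketch of the comparison above; the 128-value vectors themselves are elided in the text, so the sketch starts from the stated Euclidean distance of 0.2 and assumes k = 1, the value consistent with exp(−0.2) ≈ 0.8187:

```python
import math

def euclidean(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def similarity(d: float, k: float = 1.0) -> float:
    """V = exp(-k * d); V approaches 1 as the vectors get closer."""
    return math.exp(-k * d)

# Worked numbers from the text: distance 0.2 between the stored vector
# for "JohnDoe" and the login scan.
d = 0.2
score = similarity(d)                    # about 0.8187
threshold, tolerance = 1.0, 0.5          # values stated in the embodiment
allow_login = abs(score - threshold) <= tolerance
print(round(score, 4), allow_login)      # 0.8187 True
```

Since |0.8187 − 1| ≈ 0.18 is within the 0.5 tolerance, the login is accepted under this decision rule.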
It is noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
Finally, it should be noted that the foregoing description is only a preferred embodiment of the present invention, and the present invention is not limited thereto; although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art may still modify the technical solutions described therein or make equivalent replacements of some of the technical features. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A login system based on face recognition big data, characterized in that: the system comprises a user registration module, a user login module, a face recognition module, a user management module and a security and privacy module;
the user registration module comprises a user information collection unit, a face data collection unit and a data preprocessing unit, and is responsible for registering accounts of users;
the user login module comprises a user information collection unit, a face data collection unit, a data preprocessing unit and an authentication and access control unit, and is responsible for matching the face data of a login user with the features in the database; the user obtains access rights after the face recognition module reports a successful match;
the face recognition module comprises a face data processing unit and a face recognition unit, is responsible for receiving feature vectors of face data provided by a user, compares the feature vectors with the face feature vectors stored in the database, and returns a comparison result to the user login module;
the user management module comprises a user information storage unit, a face data storage unit and a user account management unit, and is responsible for storing and managing face data and account information of a user, and an administrator uses the module to create, delete, disable or enable a user account;
the security and privacy module comprises a data encryption unit, an access control unit, a compliance checking unit, a log recording and monitoring unit and is responsible for ensuring the security of the system and the privacy of user data and ensuring that the user data is fully protected in the transmission and storage processes.
2. The face recognition big data based login system according to claim 1, wherein: the user information collection unit is responsible for collecting the user names and passwords of users; the face data collection unit is responsible for receiving face video data uploaded by a user; the data preprocessing unit is responsible for preprocessing the video data, wherein the preprocessing operations comprise frame extraction, image cropping, denoising and alignment; facial features of the user are extracted after the preprocessing is finished, wherein the facial features comprise eye position and contour, mouth and lip features, nose shape, facial contour, skin texture, facial expression and muscle movement, and eyeball color and texture; finally, the data preprocessing unit is responsible for converting the facial features into feature vectors; the authentication and access control unit included in the user login module decides whether to allow the user to log in.
3. The face recognition big data based login system according to claim 2, wherein: the steps of the authentication and access control unit of the user login module are as follows:
s3-1, acquiring a user name of a user from a face data collection unit and acquiring a feature vector of facial features of the user from a data preprocessing unit;
s3-2, packing the user name and the feature vector of the facial feature of the user into user data;
s3-3, transmitting the user data to a face recognition module for recognition;
s3-4, acquiring a face comparison result transmitted by a face recognition module;
s3-5, if the face comparison succeeds, the user is granted access rights; if the face comparison fails, the user is denied access and must attempt to log in again.
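Steps S3-1 through S3-5 can be sketched as follows; the UserData class and the compare() callback are hypothetical stand-ins for the face data collection unit and the face recognition module of claim 4:

```python
from dataclasses import dataclass

@dataclass
class UserData:
    """S3-2: the user name and facial feature vector packed together."""
    username: str
    feature_vector: list  # from the data preprocessing unit (S3-1)

def authenticate(user_data: UserData, compare) -> bool:
    """S3-3/S3-4: send the packed data to the face recognition module
    (here a callback) and collect its comparison result; S3-5: grant
    access on success, deny it on failure."""
    result = compare(user_data.username, user_data.feature_vector)
    return bool(result)  # True -> access granted; False -> retry login

# Usage with a stand-in comparison function that always succeeds:
ok = authenticate(UserData("JohnDoe", [0.1] * 128), lambda u, v: True)
print("access granted" if ok else "access denied")
```

The callback keeps the login module decoupled from the recognition module, mirroring the separation between claims 3 and 4.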
4. The face recognition big data based login system according to claim 1, wherein: the face recognition module comprises the following steps of:
s4-1, receiving packed user data from a user login module, wherein the packed user data comprises a user name and a feature vector of facial features;
s4-2, inquiring a user database to obtain feature vectors of facial features of registered users;
s4-3, comparing the feature vector of the facial feature provided by the user with the feature vector of the registered user to generate a similarity score;
s4-4, setting a similarity threshold as N, and judging whether the two feature vectors are similar enough to be considered as the features of the same person;
s4-5, judging a comparison result according to the comparison similarity score and a set similarity threshold value; if the difference value between the similarity score and the threshold value N is within U, the face comparison is considered to be successful;
s4-6, returning the comparison result to an authentication and access control unit of the user login module; the comparison result comprises comparison success and comparison failure.
5. The face recognition big data based login system according to claim 4, wherein: in step S4-3, the feature vectors are compared using Euclidean distance; assuming two feature vectors A and B, expressed respectively as A = (a1, a2, a3, …, an) and B = (b1, b2, b3, …, bn), where n represents the dimension of the feature vectors and ai and bi represent the values of the two vectors in the i-th dimension;
the calculation formula of the Euclidean distance is as follows: d(A, B) = √((a1 − b1)² + (a2 − b2)² + … + (an − bn)²);
the distance value of the Euclidean distance represents the degree of difference between the two feature vectors and is substituted into step S4-5 for comparison; the similarity score in step S4-5 is calculated with the following formula:
V=exp(-k*d(A,B))
where V is the similarity score, k is a constant, and d (A, B) is the Euclidean distance.
6. The face recognition big data based login system according to any one of claims 1-5, wherein: the step of converting the facial features into feature vectors is as follows:
s6-1, detecting and positioning a human face in an input video by using a human face detection algorithm, and only extracting facial features of a user in the detection process;
s6-2, cutting out an image area containing the face according to the detected face position, reducing the calculated amount of feature extraction, and analyzing the face in a concentrated manner;
s6-3, preprocessing the cut facial image to reduce noise and improve the reliability of the features; the pretreatment step comprises the following steps: graying, histogram equalization and denoising;
s6-4, extracting the facial features from the preprocessed facial images by using a feature extraction algorithm;
s6-5, converting the extracted facial features into feature vectors; the feature vector is obtained by arranging the measurement result of each feature into a one-dimensional vector (feature 1, feature 2, feature 3, …, feature n), wherein n indicates that the user has n different facial features;
s6-6, normalizing the feature vector.
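A hedged sketch of steps S6-5 and S6-6: the feature names below are hypothetical placeholders for the measurements listed in claim 2, and normalization is illustrated with the common L2 scheme, since the claim does not name a specific normalization:

```python
import math

def to_feature_vector(measurements: dict) -> list:
    """S6-5: arrange per-feature measurements into a one-dimensional
    vector (feature 1, feature 2, ..., feature n)."""
    order = ["eye", "mouth", "nose", "contour", "skin", "expression", "iris"]
    return [measurements[k] for k in order]

def l2_normalize(vec: list) -> list:
    """S6-6: scale the vector to unit length so later distance
    comparisons are insensitive to overall magnitude."""
    norm = math.sqrt(sum(x * x for x in vec)) or 1.0
    return [x / norm for x in vec]

# Hypothetical measurement values for one face
m = {"eye": 0.4, "mouth": 0.1, "nose": 0.3, "contour": 0.2,
     "skin": 0.5, "expression": 0.2, "iris": 0.6}
v = l2_normalize(to_feature_vector(m))
print(round(sum(x * x for x in v), 6))  # 1.0
```

Fixing the feature order once ensures that the i-th dimension always measures the same facial attribute across users, which the Euclidean comparison in claim 5 depends on.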
7. The face recognition big data based login system according to claim 6, wherein: in S6-1, the face detection algorithm uses an RPN, wherein the RPN is a deep learning network structure used for generating anchor boxes in an image, an anchor box being a bounding box of a candidate target; a sliding window generates a plurality of anchor boxes of different sizes and aspect ratios at each location on the feature map, each anchor box being defined by its center coordinates and its width and height; the RPN outputs coordinate information for each anchor box, comprising the center-point coordinates and the width and height, and these coordinates are used to adjust the anchor boxes;
the RPN also outputs a confidence score for each anchor box, representing the probability that the anchor box contains a target; this score is used to filter the anchor boxes, retaining only the bounding boxes with high confidence;
the RPN generates coordinates and confidence scores for the candidate bounding boxes using the following formula:
for each anchor box, the center-point coordinate and width-height adjustment values Δx, Δy, Δw and Δh are calculated, and the new bounding-box coordinates are: new center-point coordinate x′ = x + Δx, new center-point coordinate y′ = y + Δy, new width w′ = w × exp(Δw), new height h′ = h × exp(Δh);
for each anchor box, a confidence score representing the probability that the anchor box contains the target is calculated; the score is output by a classification network, and a sigmoid activation function maps the output to between 0 and 1, representing the probability that the anchor box contains a target;
the mathematical representation of the sigmoid activation function is: σ(x) = 1 / (1 + e^(−x)), where x is the original score or network output value and σ(x) is the mapped probability value.
8. The face recognition big data based login system according to claim 6, wherein: in S6-4, the feature extraction algorithm uses PCA (principal component analysis), which reduces the data dimension and extracts the principal information in the data; the calculation is as follows:
PCA calculates a covariance matrix of the data, the covariance matrix describing the relationships between different features; for a data set with N features, the covariance matrix has size N×N, and the covariance matrix C is calculated using the following formula:
C = (1/N) × (X − μ)(X − μ)^T
wherein X is the data matrix, μ is the mean of the data, and ^T denotes the matrix transpose;
then, eigenvalue decomposition is performed on the covariance matrix to obtain the eigenvalues and corresponding eigenvectors; each eigenvalue represents the importance of its eigenvector, which describes a dominant direction in the new feature space; the number of principal components to retain is selected according to the eigenvalue magnitudes, retaining W% of the total variance;
finally, the original data is projected into the new feature space using the selected principal components, the projection being accomplished by multiplying the data matrix by the principal component matrix, the projected data having dimensionality-reduced features for subsequent analysis and modeling tasks.
9. The face recognition big data based login system according to claim 1, wherein: the data encryption unit uses RSA encryption; in the initialization stage of the data encryption unit, a pair of RSA keys, comprising a public key and a private key, is generated; the public key is used for encrypting data and the private key is used for decrypting data; when a user or the system needs to encrypt sensitive data, the RSA public key is first acquired, the data to be encrypted and the RSA public key are transmitted to the data encryption unit, and the data encryption unit encrypts the plaintext data using the RSA public key; the encryption operation steps are as follows:
s9-1, converting plaintext data into a digital form, and processing the data by using a PKCS#1 filling scheme to adapt to the requirements of an RSA encryption algorithm;
s9-2, encrypting the digital data by using the RSA public key;
s9-3, ciphertext data is generated;
the ciphertext data is transmitted through the communication channel, and at the receiving end, the ciphertext data is decrypted by using the RSA private key to restore the ciphertext data into original plaintext data, and the decryption operation is executed on the data encryption unit.
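For illustration only, a textbook-RSA sketch of the encrypt/decrypt round trip of S9-1 to S9-3 and the decryption at the receiving end; it uses toy primes and omits the PKCS#1 padding named in S9-1, so it is a teaching aid, not a secure implementation:

```python
# Toy RSA: real systems must use a vetted cryptographic library and
# 2048-bit or larger keys with proper PKCS#1 padding.
p, q = 61, 53                      # toy primes
n = p * q                          # public modulus
phi = (p - 1) * (q - 1)
e = 17                             # public exponent, coprime with phi
d = pow(e, -1, phi)                # private exponent (modular inverse)

def encrypt(m: int, pub=(e, n)) -> int:
    """S9-2: c = m^e mod n, using the RSA public key."""
    return pow(m, pub[0], pub[1])

def decrypt(c: int, priv=(d, n)) -> int:
    """Receiving end: m = c^d mod n, using the RSA private key."""
    return pow(c, priv[0], priv[1])

plaintext = 42                     # S9-1: data already in numeric form
ciphertext = encrypt(plaintext)    # S9-3: ciphertext data is generated
print(decrypt(ciphertext) == plaintext)  # True
```

The asymmetry is the point of the claim's design: anything encrypted with the public key (e, n) can only be recovered with the private exponent d held by the receiving end.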
10. A login method based on face recognition big data, characterized by comprising the following steps:
s10-1, registering an account in a system by a user;
s10-2, preprocessing facial data uploaded by a user by the system, wherein the preprocessing comprises image cutting, denoising, alignment and standardization;
s10-3, extracting facial features from the facial data uploaded by the user; the features comprise eye position and contour, mouth and lip features, nose shape, facial contour, skin texture, facial expression and muscle movement, and eyeball color and texture;
s10-4, converting the extracted facial features into feature vectors, and mapping the high-dimensional features into low-dimensional vectors by using PCA;
s10-5, the system associates the facial feature vector of the user with account information thereof and stores the facial feature vector in a database;
s10-6, a user needs to log in the system to access an account or perform a specific operation; when logging in, the user uploads the face data of the user;
s10-7, preprocessing the uploaded user face data;
s10-8, extracting facial features from facial data uploaded by a user;
s10-9, converting the extracted facial features into feature vectors;
s10-10, comparing the feature vector of the registered user with the feature vector of the login user, and calculating the similarity between the feature vectors using Euclidean distance; a similarity threshold is set;
s10-11, if the face recognition result is higher than the similarity threshold, the user is granted access rights, and the system can be accessed or specific operation can be executed;
in the whole login process, the privacy and the data security of the user are ensured, and the security measures of data encryption, access control, log recording and monitoring are adopted to protect the system.
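The method of claim 10 condenses to the following hedged end-to-end sketch; preprocessing is reduced to simple vector normalization, and the parameter values (k, threshold, tolerance) follow the embodiment described earlier:

```python
import math

def preprocess(raw: list) -> list:
    """Stand-in for S10-2/S10-7 (crop, denoise, align, normalize):
    here just L2 normalization of an already-numeric vector."""
    norm = math.sqrt(sum(x * x for x in raw)) or 1.0
    return [x / norm for x in raw]

def login(stored: list, scanned: list,
          k: float = 1.0, threshold: float = 1.0, tol: float = 0.5) -> bool:
    """S10-7 to S10-11 condensed: preprocess both vectors, compare by
    Euclidean distance, score with exp(-k*d), gate on the threshold."""
    a, b = preprocess(stored), preprocess(scanned)
    d = math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return abs(math.exp(-k * d) - threshold) <= tol

registered = [0.1, 0.2, 0.3, 0.1]                 # toy stored vector (S10-5)
print(login(registered, [0.12, 0.19, 0.31, 0.1])) # near-identical scan
print(login(registered, [0.9, -0.5, 0.1, 0.02]))  # very different scan
```

With these toy vectors the near-identical scan passes the gate while the dissimilar one fails, mirroring the grant/deny decision of S10-11.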
CN202311181342.3A 2023-09-14 2023-09-14 Login system and method based on face recognition big data Pending CN117235694A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311181342.3A CN117235694A (en) 2023-09-14 2023-09-14 Login system and method based on face recognition big data

Publications (1)

Publication Number Publication Date
CN117235694A true CN117235694A (en) 2023-12-15

Family

ID=89096037

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311181342.3A Pending CN117235694A (en) 2023-09-14 2023-09-14 Login system and method based on face recognition big data

Country Status (1)

Country Link
CN (1) CN117235694A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20080042357A (en) * 2006-11-09 2008-05-15 현대자동차주식회사 Driver certification device and method that do fetters in face realization
CN103607387A (en) * 2013-11-14 2014-02-26 中国科学技术大学 A network login authentication cloud service system based on face identification and a method
CN104469767A (en) * 2014-10-28 2015-03-25 杭州电子科技大学 Implementation method for integrated security protection subsystem of mobile office system
CN109325472A (en) * 2018-11-01 2019-02-12 四川大学 A kind of human face in-vivo detection method based on depth information
CN109919056A (en) * 2019-02-26 2019-06-21 桂林理工大学 A kind of face identification method based on discriminate principal component analysis
CN110490044A (en) * 2019-06-14 2019-11-22 杭州海康威视数字技术股份有限公司 Face modelling apparatus and human face model building
CN111523405A (en) * 2020-04-08 2020-08-11 绍兴埃瓦科技有限公司 Face recognition method and system and electronic equipment
CN112733114A (en) * 2021-01-14 2021-04-30 天津大学 Privacy protection face recognition system and method for smart home
CN115761840A (en) * 2022-10-24 2023-03-07 林义江 Face recognition protection system based on big data platform

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
TWN29004: ""三维目标检测中的RPN网络总结"", pages 1 - 6, Retrieved from the Internet <URL:https://blog.csdn.net/qq_41621517/article/details/121879229?ops_request_misc=&request_id=&biz_id=102&utm_term=%E5%8C%BA%E5%9F%9F%E6%8F%90%E8%AE%AE%E7%BD%91%E7%BB%9C%EF%BC%88RPN%EF%BC%89%20%E7%BD%AE%E4%BF%A1%E5%BA%A6&utm_medium=distribute.pc_search_result.none-task-blog-2~all~sobaiduweb~default-4-121879229.142^v100^pc_search_result_base9&spm=1018.2226.3001.4187> *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination