CN113626783A - Identity authentication method and device applied to mobile terminal - Google Patents
Identity authentication method and device applied to mobile terminal
- Publication number
- CN113626783A (application number CN202110839741.9A)
- Authority
- CN
- China
- Prior art keywords
- user
- data
- mobile terminal
- identity authentication
- prediction model
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/44—Program or device authentication
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
- G06F21/62—Protecting access to data via a platform, e.g. using keys or access control rules
- G06F21/6218—Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
- G06F21/6245—Protecting personal data, e.g. for financial or medical purposes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/084—Backpropagation, e.g. using gradient descent
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Computer Security & Cryptography (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- General Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- Computer Hardware Design (AREA)
- General Health & Medical Sciences (AREA)
- Bioethics (AREA)
- Biophysics (AREA)
- Computing Systems (AREA)
- Molecular Biology (AREA)
- Evolutionary Computation (AREA)
- Mathematical Physics (AREA)
- Data Mining & Analysis (AREA)
- Computational Linguistics (AREA)
- Biomedical Technology (AREA)
- Medical Informatics (AREA)
- Databases & Information Systems (AREA)
- Artificial Intelligence (AREA)
- Life Sciences & Earth Sciences (AREA)
- Collating Specific Patterns (AREA)
Abstract
The invention provides an identity authentication method and device applied to a mobile terminal, relating to the technical field of identity recognition. The method comprises the following steps: collecting the typing behavior of a user with a sensor of the mobile terminal to obtain data to be predicted; and inputting the data to be predicted into a typing behavior prediction model to obtain the identity authentication result output by the typing behavior prediction model. The typing behavior prediction model is trained either on features extracted from the template samples of the user and of other users, or directly on the template samples of the user and of other users.
Description
Technical Field
The invention relates to the technical field of identity recognition, in particular to an identity authentication method and device applied to a mobile terminal.
Background
Biometric characteristics such as the face, fingerprint and iris are known as hard biometrics. Regulation of hard biometrics is becoming increasingly strict, and once a hard biometric is leaked, the impact on the user is extremely long-term, even lifelong. Hard biometrics also require a high degree of authorization: the user generally has to cooperate actively for detection to work, which makes for a poor user experience.
Future identity authentication therefore needs to protect user privacy to the greatest extent, so that a data leak causes no long-term damage, and to keep the required permissions low enough that the authorization process can be applied universally.
Building a separate binary-classification authentication model for every user is too complex in practical application: such models can only be maintained automatically, which affects recognition accuracy. Matching historical data against current data with a single general model, thereby eliminating most of the resource consumption and complexity, is therefore an important problem for the industry to solve.
Disclosure of Invention
In view of this, the present invention provides an identity authentication method and apparatus applied to a mobile terminal, so as to overcome the defects of the prior art, namely insufficient protection of user privacy and a non-universal authorization process during identity authentication, and to meet the requirements of universality and accuracy in identity authentication.
Based on the above purpose, the present invention provides an identity authentication method applied to a mobile terminal, which comprises the following steps:
collecting the typing behavior of a user by using a sensor of a mobile terminal to obtain data to be predicted;
inputting the data to be predicted into a typing behavior prediction model to obtain an identity authentication result output by the typing behavior prediction model; the typing behavior prediction model is trained either on features extracted from the template samples of the user and of other users, or directly on the template samples of the user and of other users.
Optionally, the collecting typing behavior data of the user by using the sensor of the mobile terminal specifically includes the following steps:
judging whether the user has an input behavior, and acquiring input behavior data of the user by using a sensor of a mobile terminal when the user has the input behavior; the sensor comprises a linear accelerometer and a gyroscope which are arranged in the mobile terminal;
and dividing the input behavior data into a plurality of groups of data according to preset dividing time to obtain the data to be predicted.
Optionally, the typing behavior prediction model is obtained by training through the following steps:
constructing a training sample pair based on the template sample of the user and the template samples of other users;
and training with the training sample pairs as input data in a deep learning mode to obtain the typing behavior prediction model for generating the identity recognition result of the data to be predicted.
Optionally, the typing behavior prediction model is obtained by training through the following steps:
constructing a training sample pair based on the template sample of the user and the template samples of other users;
extracting training features of the training sample pairs;
and training with the training features as input data in a deep learning mode to obtain the typing behavior prediction model for generating the identity recognition result of the data to be predicted.
Optionally, the training features include statistical features, local features, signal features, frequency domain features, and cross features.
Optionally, after the step of constructing the training sample pair based on the template sample of the user and the template samples of other users, the method further includes the following steps:
preprocessing the training sample pairs; the preprocessing mode comprises smoothing filtering, median filtering, average filtering and Kalman filtering.
Optionally, the inputting the data to be predicted into a typing behavior prediction model to obtain an identity authentication result output by the typing behavior prediction model specifically includes the following steps:
inputting the data to be predicted into a typing behavior prediction model to obtain an authentication distance output by the typing behavior prediction model;
judging the relation between the authentication distance and a preset distance threshold;
if the authentication distance is greater than the preset distance threshold, the identity authentication result is that the current user is not the legitimate user but another user;
and if the authentication distance is smaller than the preset distance threshold, the identity authentication result is that the current user is the legitimate user.
The invention also provides an identity authentication device applied to the mobile terminal, which comprises:
the data acquisition module is used for acquiring the typing behavior of the user by utilizing a sensor of the mobile terminal to obtain data to be predicted;
the behavior prediction module is used for inputting the data to be predicted into a typing behavior prediction model to obtain an identity authentication result output by the typing behavior prediction model; the typing behavior prediction model is trained either on features extracted from the template samples of the user and of other users, or directly on the template samples of the user and of other users.
The invention also provides an electronic device, which comprises a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein the processor executes the program to realize the steps of the identity authentication method applied to the mobile terminal.
The present invention also provides a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of any of the above-described identity authentication methods applied to a mobile terminal.
As can be seen from the above, with the identity authentication method and apparatus applied to a mobile terminal provided by the present invention, when the user uses the mobile terminal, for example an application (App) installed on it, a series of verifications is completed imperceptibly in the background, without interrupting the App for dedicated biometric authentication as with a face or fingerprint. Using data acquired by the sensors for identity authentication is friendlier to user privacy: it achieves the verification effect while minimizing the harm to the user in the event of a leak. The sensor data is also time-limited; after a period of time it automatically becomes invalid as input habits or other objective factors change, which improves the accuracy of identity authentication. Moreover, the model built on the sensor data is universal and meets the accuracy requirement.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic flow chart of an identity authentication method applied to a mobile terminal according to the present invention;
fig. 2 is a flowchart illustrating a specific process of step S100 in the identity authentication method applied to the mobile terminal according to the present invention;
FIG. 3 is a first schematic flow chart illustrating the construction of a typing behavior prediction model in the identity authentication method applied to a mobile terminal according to the present invention;
FIG. 4 is a second schematic flow chart illustrating the construction of a typing behavior prediction model in the identity authentication method applied to a mobile terminal according to the present invention;
FIG. 5 is a third schematic flow chart illustrating the construction of a typing behavior prediction model in the identity authentication method applied to a mobile terminal according to the present invention;
FIG. 6 is a fourth schematic flowchart of the typing behavior prediction model construction in the identity authentication method applied to the mobile terminal according to the present invention;
fig. 7 is a flowchart illustrating a specific process of step S200 in the identity authentication method applied to the mobile terminal according to the present invention;
fig. 8 is a schematic structural diagram of an identity authentication device applied to a mobile terminal according to the present invention;
fig. 9 is a schematic structural diagram of a data acquisition module in an identity authentication apparatus applied to a mobile terminal according to the present invention;
fig. 10 is a schematic structural diagram of a behavior prediction module in the identity authentication apparatus applied to the mobile terminal according to the present invention;
fig. 11 is a schematic structural diagram of an electronic device provided in the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to specific embodiments and the accompanying drawings.
It is to be noted that technical terms or scientific terms used in the embodiments of the present invention should have the ordinary meanings as understood by those having ordinary skill in the art to which the present disclosure belongs, unless otherwise defined. The use of "first," "second," and similar terms in this disclosure is not intended to indicate any order, quantity, or importance, but rather is used to distinguish one element from another. The word "comprising" or "comprises", and the like, means that the element or item listed before the word covers the element or item listed after the word and its equivalents, but does not exclude other elements or items. The terms "connected" or "coupled" and the like are not restricted to physical or mechanical connections, but may include electrical connections, whether direct or indirect. "upper", "lower", "left", "right", and the like are used merely to indicate relative positional relationships, and when the absolute position of the object being described is changed, the relative positional relationships may also be changed accordingly.
As a preferred embodiment of the present invention, the present invention provides an identity authentication method applied to a mobile terminal, including the steps of:
collecting the typing behavior of a user by using a sensor of a mobile terminal to obtain data to be predicted;
inputting the data to be predicted into a typing behavior prediction model to obtain an identity authentication result output by the typing behavior prediction model; the typing behavior prediction model is trained either on features extracted from the template samples of the user and of other users, or directly on the template samples of the user and of other users.
The invention also provides an identity authentication device applied to the mobile terminal, which comprises:
the data acquisition module is used for acquiring the typing behavior of the user by utilizing a sensor of the mobile terminal to obtain data to be predicted;
the behavior prediction module is used for inputting the data to be predicted into a typing behavior prediction model to obtain an identity authentication result output by the typing behavior prediction model; the typing behavior prediction model is trained either on features extracted from the template samples of the user and of other users, or directly on the template samples of the user and of other users.
With the identity authentication method and device applied to a mobile terminal, when the user uses the mobile terminal, for example an App installed on it, a series of verifications is completed imperceptibly in the background, without interrupting the App for dedicated biometric authentication as with a face or fingerprint. Using data acquired by the sensors for identity authentication is friendlier to user privacy: it achieves the verification effect while minimizing the harm to the user in the event of a leak. The sensor data is also time-limited; after a period of time it automatically becomes invalid as input habits or other objective factors change, which improves the accuracy of identity authentication. Moreover, the model built on the sensor data is universal and meets the accuracy requirement.
The following describes preferred embodiments of the identity authentication method and apparatus applied to a mobile terminal according to the present invention with reference to the accompanying drawings.
Referring to fig. 1, the method includes the following steps:
and S100, collecting the typing behavior of the user by using a sensor of the mobile terminal to obtain data to be predicted. It is understood that mobile terminals include, but are not limited to, cell phones, notebooks, tablets, and wearable devices.
S200, inputting the data to be predicted into a typing behavior prediction model to obtain the identity authentication result output by the typing behavior prediction model; the typing behavior prediction model is trained either on features extracted from the template samples of the user and of other users, or directly on the template samples of the user and of other users.
With this method, through step S100, when the user uses the mobile terminal, for example an App installed on it, a series of verifications is completed imperceptibly in the background, without interrupting the App for dedicated biometric authentication as with a face or fingerprint. Using data acquired by the sensors for identity authentication is friendlier to user privacy: it achieves the verification effect while minimizing the harm to the user in the event of a leak. The sensor data is also time-limited; after a period of time it automatically becomes invalid as input habits or other objective factors change, which improves the accuracy of identity authentication. The model constructed in step S200 from the sensor data is universal and meets the accuracy requirement.
Referring to fig. 2, step S100 specifically includes the following steps:
S110, judging whether the user has an input behavior, and acquiring input behavior data of the user by using a sensor of the mobile terminal when the user has the input behavior; the sensor comprises a linear accelerometer and a gyroscope which are installed in the mobile terminal.
A linear accelerometer and a gyroscope are used to acquire the movement acceleration, the change in rotational angular velocity and similar signals while the user is using an App on the mobile terminal, and the sampling frequency of each sensor, for example 50 Hz or 100 Hz, is set before acquisition. When a user of the mobile terminal produces an input behavior, the sensors are started for monitoring and the sensor data generated while the user types is collected.
In the early stage, because no template data of the user is yet available, sensor data collected in a session in which the user has passed face or fingerprint verification can be stored as the sample template of the authenticated user. The sample template is collected for the subsequent judgment of unknown samples and serves as the reference data for authentication comparison. In this early data-collection stage, other biometric authentication modes such as fingerprint and face recognition can also be combined to label the user's data.
And S120, dividing the input behavior data into a plurality of groups according to a preset division time to obtain the data to be predicted. Once the user's input behavior data has accumulated for a certain time, it is split into small time windows, for example 0.2 s or 0.5 s, so that the long recording is divided into a number of small time-window samples.
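As an illustration of this segmentation step, the following sketch splits a buffered accelerometer and gyroscope recording into fixed-length windows. It is an assumed implementation for clarity only, not the patented code; the 50 Hz sampling rate, the 0.2 s window and the six-channel layout (three acceleration axes plus three gyroscope axes) are the example values mentioned above.

```python
import numpy as np

def split_into_windows(samples: np.ndarray, sample_rate_hz: int = 50,
                       window_seconds: float = 0.2) -> np.ndarray:
    """Split a (T, 6) array of sensor rows (3-axis linear acceleration +
    3-axis gyroscope) into non-overlapping time-window samples.

    Returns an array of shape (num_windows, window_len, 6); a trailing
    remainder that does not fill a whole window is dropped.
    """
    window_len = int(sample_rate_hz * window_seconds)   # 50 Hz * 0.2 s = 10 rows
    num_windows = len(samples) // window_len
    trimmed = samples[:num_windows * window_len]
    return trimmed.reshape(num_windows, window_len, samples.shape[1])

# Example: 3 seconds of typing data sampled at 50 Hz -> 15 windows of 0.2 s
recording = np.random.randn(150, 6)
print(split_into_windows(recording).shape)   # (15, 10, 6)
```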
Referring to fig. 3, the typing behavior prediction model is trained by the following steps:
and A110, constructing a training sample pair based on the template sample of the user and the template samples of other users.
And A120, training with the training sample pairs as input data in a deep learning mode to obtain a typing behavior prediction model for generating the identity recognition result of the data to be predicted.
In this method a template sample is required when constructing sample pairs, and there are two main ways to select it.
The first is the single-sample method. A sample that is representative of the user is called the template sample or enrollment sample. A common way to obtain it is to have the user, before typing-based identity authentication is used for the first time and after passing another identity verification means, enter a predetermined piece of content; the sensor data recorded while entering that content serves as the user's template sample. This is only one way to generate the template sample: the optimal template sample can also be selected by computation from the authenticated historical data. One such method applies the fast Fourier transform to convert the small time-window samples from the time domain to the frequency domain, computes frequency-domain features, traverses the existing samples, computes each sample's similarity to all other samples, and selects the sample with the highest overall similarity as the template sample. With the single-sample method, the template sample does not change in the short term and there is only one template sample at a time, hence the name.
The second is the random-sample method, which randomly draws an authenticated historical sample as the template sample.
After the template sample is constructed, the positive and negative samples needed for training and testing are built through positive sampling and negative sampling. The template sample is combined with other samples to obtain a number of sample pairs, from which the complete sample set is built. The other samples are either samples of the same user or samples of other users. If the other sample comes from the same user, the pair is a positive example and the behavior patterns of the template sample and the other sample are consistent; if it comes from another user, the pair is a negative example and the behavior patterns are inconsistent. When the training sample pairs are formed, each pair is therefore labeled with a matching label indicating whether it matches.
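A minimal sketch of the template-selection and pair-construction logic described above. It assumes that each window is summarized by the magnitude spectrum of its FFT and that cosine similarity is the similarity measure; the patent text does not fix a specific similarity function, so these are illustrative choices.

```python
import numpy as np

def fft_features(window: np.ndarray) -> np.ndarray:
    """Frequency-domain features: magnitude spectrum of each sensor axis, flattened."""
    return np.abs(np.fft.rfft(window, axis=0)).ravel()

def select_template(windows: np.ndarray) -> int:
    """Single-sample method: return the index of the window whose overall
    similarity to all other windows is highest."""
    feats = np.stack([fft_features(w) for w in windows])
    feats /= np.linalg.norm(feats, axis=1, keepdims=True) + 1e-12
    sim = feats @ feats.T                      # pairwise cosine similarities
    np.fill_diagonal(sim, 0.0)
    return int(np.argmax(sim.sum(axis=1)))     # most representative window

def build_pairs(template, own_windows, other_windows):
    """Positive pairs: template vs. windows of the same user (label 1).
    Negative pairs: template vs. windows of other users (label 0)."""
    return ([(template, w, 1) for w in own_windows] +
            [(template, w, 0) for w in other_windows])
```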
Referring to fig. 4, the typing behavior prediction model may alternatively be trained through the following steps:
and B110, constructing a training sample pair based on the template sample of the user and the template samples of other users. The process of step B110 to construct training sample pairs is identical to step a110 and will not be described in detail herein.
And B120, extracting training features of the training sample pairs.
In this method there are two schemes for processing the training sample pairs. The first is the feature-engineering scheme corresponding to step B120; the second is the end-to-end scheme corresponding to step A120. In the feature-engineering scheme, features are extracted manually and the model is then built on them. The extracted features include general statistical features, local features, signal features, frequency-domain features, cross features and the like, such as the minimum, maximum, mean, variance, frequency-domain features, spectral entropy, amplitude, rolling features, zero-crossing rate, number of peaks and rate of change. In the end-to-end scheme, the raw data is fed directly into the deep learning model, and the model extracts the features itself.
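As a sketch of what such hand-crafted features might look like, the function below computes a subset of the statistical, signal and frequency-domain features listed above for one sensor window. The exact feature set and the lack of normalization are assumptions; the patent only enumerates feature categories.

```python
import numpy as np
from scipy.stats import entropy

def window_features(window: np.ndarray) -> np.ndarray:
    """Per-axis min, max, mean, variance, zero-crossing rate, peak count,
    mean rate of change and spectral entropy for a (window_len, 6) window."""
    feats = []
    for axis in range(window.shape[1]):
        x = window[:, axis]
        spectrum = np.abs(np.fft.rfft(x))
        p = spectrum / (spectrum.sum() + 1e-12)                  # normalized spectrum
        zero_cross = np.mean(np.diff(np.sign(x)) != 0)           # zero-crossing rate
        peaks = np.sum((x[1:-1] > x[:-2]) & (x[1:-1] > x[2:]))   # local maxima
        feats += [x.min(), x.max(), x.mean(), x.var(),
                  zero_cross, peaks,
                  np.mean(np.abs(np.diff(x))),                   # rate of change
                  entropy(p)]                                    # spectral entropy
    return np.asarray(feats, dtype=np.float32)

print(window_features(np.random.randn(10, 6)).shape)             # (48,)
```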
And B130, training with the training feature pairs as input data in a deep learning mode to obtain a typing behavior prediction model for generating the identity recognition result of the data to be predicted.
In this method, model construction mainly adopts metric learning, one form of which is a twin (Siamese) network structure. The twin network has two inputs and two feature-extraction sub-networks whose weights are shared. Because two schemes are used for processing the training sample pairs, the sub-network structure differs between them. For the feature-engineering scheme corresponding to step B120, the sub-network consists mainly of fully connected layers. For the end-to-end scheme corresponding to step A120, where the model input is the raw data of the template sample and the other sample, the sub-network needs a structure capable of extracting temporal features, such as a recurrent neural network, a temporal convolutional structure, a temporal Transformer structure or a fully connected structure; combinations of different structures can also provide feature-extraction capability.
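A minimal PyTorch sketch of such a twin network with a shared-weight sub-network. The fully connected encoder corresponds to the feature-engineering variant; the layer widths, the 48-dimensional input and the use of Euclidean distance between embeddings are illustrative assumptions rather than values taken from the patent.

```python
import torch
import torch.nn as nn

class TwinNetwork(nn.Module):
    """Twin (Siamese) network: a single encoder is applied to both inputs,
    so the two branches share all weights."""
    def __init__(self, feature_dim: int, embed_dim: int = 64):
        super().__init__()
        self.encoder = nn.Sequential(            # fully connected sub-network
            nn.Linear(feature_dim, 128), nn.Mish(),
            nn.Linear(128, 128), nn.Mish(),
            nn.Linear(128, embed_dim),
        )

    def forward(self, template: torch.Tensor, other: torch.Tensor):
        return self.encoder(template), self.encoder(other)

# The distance between the two embeddings drives the match decision.
net = TwinNetwork(feature_dim=48)
emb_t, emb_o = net(torch.randn(8, 48), torch.randn(8, 48))
distance = torch.norm(emb_t - emb_o, dim=1)      # smaller = more likely the same user
```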
In this embodiment, the sub-network structure in the first scheme is built either by stacking one-dimensional convolutions and a WaveNet network in multiple layers, or by combining one-dimensional convolution with a bidirectional long short-term memory network; the one-dimensional convolution plus WaveNet structure is preferred.
The one-dimensional convolution plus WaveNet model is structured as follows:
First layer: one-dimensional convolution with 16 kernels, kernel size 1, 'same' padding, mish activation;
Second layer: WaveNet module with 16 kernels, kernel size 3, 16 stacked layers inside the module;
Third layer: one-dimensional convolution with 32 kernels, kernel size 1, 'same' padding, mish activation;
Fourth layer: WaveNet module with 32 kernels, kernel size 3, 8 stacked layers inside the module;
Fifth layer: one-dimensional convolution with 64 kernels, kernel size 1, 'same' padding, mish activation;
Sixth layer: WaveNet module with kernel size 3, 4 stacked layers inside the module;
Seventh layer: one-dimensional convolution with 128 kernels, kernel size 1, 'same' padding, mish activation;
Eighth layer: WaveNet module with 128 kernels, kernel size 3, 1 stacked layer inside the module;
Ninth layer: one-dimensional global average pooling.
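The sketch below assembles this nine-layer stack in PyTorch. The internal structure of a "WaveNet module" is not spelled out in the text, so the residual dilated convolutions with cyclically doubling dilation are an assumption based on the standard WaveNet design; the kernel-1 convolutions with 'same' padding, the mish activations, the 16/8/4/1 stack depths and the global average pooling follow the layer list above. The six input channels and the 50-sample window length are illustrative.

```python
import torch
import torch.nn as nn

class WaveNetModule(nn.Module):
    """Assumed WaveNet-style block: residual dilated 1-D convolutions."""
    def __init__(self, channels: int, kernel_size: int = 3, num_layers: int = 16):
        super().__init__()
        self.convs = nn.ModuleList([
            nn.Conv1d(channels, channels, kernel_size,
                      padding='same', dilation=2 ** (i % 8))   # dilation cycles 1..128
            for i in range(num_layers)
        ])
        self.act = nn.Mish()

    def forward(self, x):
        for conv in self.convs:
            x = x + self.act(conv(x))            # residual connection
        return x

def conv_block(in_ch, out_ch):
    """Kernel-1 convolution with 'same' padding and mish activation."""
    return nn.Sequential(nn.Conv1d(in_ch, out_ch, kernel_size=1, padding='same'),
                         nn.Mish())

class SensorEncoder(nn.Module):
    """Sub-network following the nine-layer 1-D convolution + WaveNet description."""
    def __init__(self, in_channels: int = 6):
        super().__init__()
        self.body = nn.Sequential(
            conv_block(in_channels, 16), WaveNetModule(16, num_layers=16),
            conv_block(16, 32),          WaveNetModule(32, num_layers=8),
            conv_block(32, 64),          WaveNetModule(64, num_layers=4),
            conv_block(64, 128),         WaveNetModule(128, num_layers=1),
            nn.AdaptiveAvgPool1d(1),     # one-dimensional global average pooling
        )

    def forward(self, x):                # x: (batch, channels, time)
        return self.body(x).squeeze(-1)  # (batch, 128) embedding

print(SensorEncoder()(torch.randn(4, 6, 50)).shape)   # torch.Size([4, 128])
```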
Accordingly, there are two ways to produce the model output. The first is to concatenate, take the difference of, or otherwise combine the features of the template sample and the other sample, connect fully connected layers or another discriminative model, and finally output a binary classification: if the template sample and the other sample come from the same user, the output is "matched"; if they do not belong to the same user, the output is "not matched". The second is to optimize the network with a pair-based loss function such as triplet loss or contrastive loss.
Taking triplet loss as an example, a training set consisting of template samples, positive samples and negative samples is constructed; the weight-sharing base network of the twin network extracts deep features from the sensor data of the training set; the extracted deep embedding features are fed into the triplet loss for error calculation; and error back-propagation is then used to train the twin network. In this embodiment the triplet loss is the preferred loss function.
If triplet loss is chosen, a main network must be defined for the training stage, and a base network must be defined inside the main network. The base network is the weight-sharing model; it mainly adopts a time-series neural network architecture to extract sample features, and the way it is applied differs with the loss function. The main network takes the template sample, the positive sample and the negative sample as simultaneous inputs, extracts the features of the three samples with the shared-weight base network, and uses those three sets of features as its output. The main network then computes the loss over the three outputs with the triplet loss.
During training of the base network and the main network, the activation functions involved include relu, mish, sigmoid and tanh; the Ranger optimizer is used; the optimization strategies include batch normalization, L2 weight-decay regularization, early stopping and Dropout; and Bayesian optimization is used to tune the hyperparameters of the main network and the base network.
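A compact sketch of the triplet-loss training step. nn.TripletMarginLoss stands in for the triplet loss; Adam with L2 weight decay replaces the Ranger optimizer mentioned above, since Ranger is not part of core PyTorch; and the tiny convolutional encoder is only a stand-in for the shared-weight base network (the WaveNet-style encoder sketched earlier could be substituted).

```python
import torch
import torch.nn as nn

# Stand-in shared-weight base network; any sequence encoder could be used here.
encoder = nn.Sequential(
    nn.Conv1d(6, 32, kernel_size=3, padding='same'), nn.Mish(),
    nn.AdaptiveAvgPool1d(1), nn.Flatten(),
)
criterion = nn.TripletMarginLoss(margin=1.0)
optimizer = torch.optim.Adam(encoder.parameters(), lr=1e-3, weight_decay=1e-4)  # L2 decay

def train_step(template, positive, negative):
    """One step of the main network: the same encoder embeds the template,
    positive and negative windows; the triplet loss pulls the positive
    embedding toward the template and pushes the negative one away, and the
    error is back-propagated through the shared weights."""
    optimizer.zero_grad()
    loss = criterion(encoder(template), encoder(positive), encoder(negative))
    loss.backward()
    optimizer.step()
    return loss.item()

t, p, n = (torch.randn(8, 6, 50) for _ in range(3))   # 8 toy triplets of windows
print(train_step(t, p, n))
```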
Referring to fig. 5 and fig. 6, after step A110 and step B110 respectively, the method further includes the following steps:
A111/B111, preprocessing the training sample pairs. Filtering and noise-reduction techniques are applied to remove noise; for example, the training sample pairs may be denoised by smoothing filtering, median filtering, average filtering or Kalman filtering.
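The sketch below applies some of these filters to one sensor window with SciPy. The kernel sizes and the exponential-smoothing coefficient are illustrative assumptions, and the Kalman filter is omitted because it needs a per-axis state model that the patent does not specify.

```python
import numpy as np
from scipy.ndimage import uniform_filter1d
from scipy.signal import medfilt

def denoise(window: np.ndarray, method: str = "median", kernel: int = 5) -> np.ndarray:
    """Denoise each axis of a (window_len, channels) sensor window."""
    if method == "median":           # median filtering
        return np.stack([medfilt(window[:, i], kernel)
                         for i in range(window.shape[1])], axis=1)
    if method == "average":          # average (moving-mean) filtering
        return uniform_filter1d(window, size=kernel, axis=0)
    if method == "smoothing":        # simple exponential smoothing
        out = window.astype(float).copy()
        for t in range(1, len(out)):
            out[t] = 0.3 * window[t] + 0.7 * out[t - 1]
        return out
    raise ValueError(f"unknown method: {method}")

print(denoise(np.random.randn(10, 6), "average").shape)   # (10, 6)
```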
Referring to fig. 7, step S200 specifically includes the following steps:
and S210, inputting the data to be predicted into the typing behavior prediction model to obtain the authentication distance output by the typing behavior prediction model.
And S220, judging the relation between the authentication distance and a preset distance threshold value.
And S230, if the authentication distance is greater than the preset distance threshold, the identity authentication result is that the current user is not the legitimate user but another user.
And S240, if the authentication distance is smaller than the preset distance threshold, the identity authentication result is that the current user is the legitimate user.
The outputs of the base network for the different sample types form the output of the main network. Taking triplet loss as the example, the triplet loss constrains the features learned by the base network so that the features of a positive sample are drawn closer to those of the template sample while the features of a negative sample are pushed farther away; the base network thus learns the feature differences between the sensor data of different people. In the prediction stage, taking a login scenario as an example, when an account is logged into, sensor data to be authenticated is generated and a template sample is taken from the authenticated data of that account's user. The trained typing behavior prediction model computes the embedded feature representations of the data to be predicted and of the authenticated template sample, and the feature distance between them is calculated. This authentication distance is compared with the preset distance threshold, i.e. the authentication pass threshold set when the typing behavior prediction model was validated. If the authentication distance is greater than the preset distance threshold, authentication fails and the identity authentication result is that the current user is not the legitimate user but another user; if the authentication distance is smaller than the preset distance threshold, authentication passes and the identity authentication result is that the current user is the legitimate user.
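A minimal sketch of this prediction-stage decision, assuming the trained encoder maps a window to an embedding and that Euclidean distance is used as the feature distance; the threshold value is a placeholder to be calibrated during model validation as described above.

```python
import torch

def authenticate(encoder, template_window: torch.Tensor, new_window: torch.Tensor,
                 distance_threshold: float = 1.0) -> bool:
    """Embed the authenticated template sample and the data to be predicted,
    compare their feature distance with the preset threshold, and pass
    authentication only if the distance is below the threshold."""
    encoder.eval()
    with torch.no_grad():
        emb_template = encoder(template_window.unsqueeze(0))
        emb_new = encoder(new_window.unsqueeze(0))
        distance = torch.norm(emb_template - emb_new).item()
    return distance < distance_threshold   # True: legitimate user; False: another user

# Example with the stand-in encoder from the training sketch (inputs shaped (6, 50)):
# passed = authenticate(encoder, torch.randn(6, 50), torch.randn(6, 50))
```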
The identity authentication device applied to a mobile terminal provided by the present invention is described below; the device described below and the identity authentication method described above may be referred to in correspondence with each other.
Referring to fig. 8, the apparatus specifically includes:
the data acquisition module 100 is configured to acquire a typing behavior of a user by using a sensor of the mobile terminal to obtain data to be predicted. It is understood that mobile terminals include, but are not limited to, cell phones, notebooks, tablets, and wearable devices.
The behavior prediction module 200 is configured to input the data to be predicted into the typing behavior prediction model to obtain the identity authentication result output by the typing behavior prediction model; the typing behavior prediction model is trained either on features extracted from the template samples of the user and of other users, or directly on the template samples of the user and of other users.
With this device, through the data acquisition module 100, when the user uses the mobile terminal, for example an App installed on it, a series of verifications is completed imperceptibly in the background, without interrupting the App for dedicated biometric authentication as with a face or fingerprint. Using data acquired by the sensors for identity authentication is friendlier to user privacy: it achieves the verification effect while minimizing the harm to the user in the event of a leak. The sensor data is also time-limited; after a period of time it automatically becomes invalid as input habits or other objective factors change, which improves the accuracy of identity authentication. The model constructed by the behavior prediction module 200 from the sensor data is universal and meets the accuracy requirement.
Referring to fig. 9, the data acquisition module 100 specifically includes:
a behavior determining unit 110, configured to determine whether a user has an input behavior, and when the user has the input behavior, collect input behavior data of the user by using a sensor of the mobile terminal; the sensor comprises a linear accelerometer and a gyroscope which are installed in the mobile terminal.
A linear accelerometer and a gyroscope are used to acquire the movement acceleration, the change in rotational angular velocity and similar signals while the user is using an App on the mobile terminal, and the sampling frequency of each sensor, for example 50 Hz or 100 Hz, is set before acquisition. When a user of the mobile terminal produces an input behavior, the sensors are started for monitoring and the sensor data generated while the user types is collected.
In the early stage, because no template data of the user is yet available, sensor data collected in a session in which the user has passed face or fingerprint verification can be stored as the sample template of the authenticated user. The sample template is collected for the subsequent judgment of unknown samples and serves as the reference data for authentication comparison. In this early data-collection stage, other biometric authentication modes such as fingerprint and face recognition can also be combined to label the user's data.
The data acquisition unit 120 is configured to divide the input behavior data into a plurality of groups according to a preset division time to obtain the data to be predicted. Once the user's input behavior data has accumulated for a certain time, it is split into small time windows, for example 0.2 s or 0.5 s, so that the long recording is divided into a number of small time-window samples.
Referring to fig. 10, the behavior prediction module 200 specifically includes:
the distance obtaining unit 210 is configured to input data to be predicted into the typing behavior prediction model, and obtain an authentication distance output by the typing behavior prediction model.
The relationship obtaining unit 220 is configured to determine a relationship between the authentication distance and a preset distance threshold.
A first result unit 230, configured to determine, if the authentication distance is greater than the preset distance threshold, that the identity authentication result is that the current user is not the legitimate user but another user.
A second result unit 240, configured to determine, if the authentication distance is smaller than the preset distance threshold, that the identity authentication result is that the current user is the legitimate user.
The outputs of the base network for the different sample types form the output of the main network. Taking triplet loss as the example, the triplet loss constrains the features learned by the base network so that the features of a positive sample are drawn closer to those of the template sample while the features of a negative sample are pushed farther away; the base network thus learns the feature differences between the sensor data of different people. In the prediction stage, taking a login scenario as an example, when an account is logged into, sensor data to be authenticated is generated and a template sample is taken from the authenticated data of that account's user. The trained typing behavior prediction model computes the embedded feature representations of the data to be predicted and of the authenticated template sample, and the feature distance between them is calculated. This authentication distance is compared with the preset distance threshold, i.e. the authentication pass threshold set when the typing behavior prediction model was validated. If the authentication distance is greater than the preset distance threshold, authentication fails and the identity authentication result is that the current user is not the legitimate user but another user; if the authentication distance is smaller than the preset distance threshold, authentication passes and the identity authentication result is that the current user is the legitimate user.
Fig. 11 illustrates the physical structure of an electronic device. As shown in fig. 11, the electronic device may include: a processor 810, a communication interface 820, a memory 830 and a communication bus 840, wherein the processor 810, the communication interface 820 and the memory 830 communicate with one another via the communication bus 840. The processor 810 may invoke logic instructions in the memory 830 to perform the identity authentication method applied to the mobile terminal, the method comprising the following steps:
S100, collecting the typing behavior of the user by using a sensor of the mobile terminal to obtain data to be predicted;
S200, inputting the data to be predicted into a typing behavior prediction model to obtain the identity authentication result output by the typing behavior prediction model; the typing behavior prediction model is trained either on features extracted from the template samples of the user and of other users, or directly on the template samples of the user and of other users.
In addition, the logic instructions in the memory 830 may be implemented as software functional units and, when sold or used as an independent product, stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention may be embodied in the form of a software product stored in a storage medium, including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk.
In another aspect, the present invention also provides a computer program product comprising a computer program stored on a non-transitory computer-readable storage medium, the computer program comprising program instructions, which when executed by a computer, enable the computer to execute the identity authentication method applied to a mobile terminal provided by the above methods, the method comprising the steps of:
S100, collecting the typing behavior of the user by using a sensor of the mobile terminal to obtain data to be predicted;
S200, inputting the data to be predicted into a typing behavior prediction model to obtain the identity authentication result output by the typing behavior prediction model; the typing behavior prediction model is trained on features extracted from the template samples of the user and of other users.
In still another aspect, the present invention also provides a non-transitory computer-readable storage medium having stored thereon a computer program, which when executed by a processor is implemented to perform the above-provided identity authentication method applied to a mobile terminal, the method including the steps of:
S100, collecting the typing behavior of the user by using a sensor of the mobile terminal to obtain data to be predicted;
S200, inputting the data to be predicted into a typing behavior prediction model to obtain the identity authentication result output by the typing behavior prediction model; the typing behavior prediction model is trained on features extracted from the template samples of the user and of other users.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. With this understanding in mind, the above-described technical solutions may be embodied in the form of a software product, which can be stored in a computer-readable storage medium such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the embodiments or some parts of the embodiments.
Those of ordinary skill in the art will understand that: the discussion of any embodiment above is meant to be exemplary only, and is not intended to intimate that the scope of the disclosure, including the claims, is limited to these examples; within the idea of the invention, also features in the above embodiments or in different embodiments may be combined, steps may be implemented in any order, and there are many other variations of the different aspects of the invention as described above, which are not provided in detail for the sake of brevity.
The embodiments of the invention are intended to embrace all such alternatives, modifications and variances that fall within the broad scope of the appended claims. Therefore, any omissions, modifications, substitutions, improvements and the like that may be made without departing from the spirit and principles of the invention are intended to be included within the scope of the invention.
Claims (10)
1. An identity authentication method applied to a mobile terminal is characterized by comprising the following steps:
collecting the typing behavior of a user by using a sensor of a mobile terminal to obtain data to be predicted;
inputting the data to be predicted into a typing behavior prediction model to obtain an identity authentication result output by the typing behavior prediction model; the typing behavior prediction model is trained either on features extracted from the template samples of the user and of other users, or directly on the template samples of the user and of other users.
2. The identity authentication method applied to the mobile terminal according to claim 1, wherein the step of collecting typing behavior data of the user by using the sensor of the mobile terminal specifically comprises the following steps:
judging whether the user has an input behavior, and acquiring input behavior data of the user by using a sensor of a mobile terminal when the user has the input behavior; the sensor comprises a linear accelerometer and a gyroscope which are arranged in the mobile terminal;
and dividing the input behavior data into a plurality of groups of data according to preset dividing time to obtain the data to be predicted.
3. The identity authentication method applied to the mobile terminal according to claim 1, wherein the typing behavior prediction model is trained by the following steps:
constructing a training sample pair based on the template sample of the user and the template samples of other users;
and training with the training sample pairs as input data in a deep learning mode to obtain the typing behavior prediction model for generating the identity recognition result of the data to be predicted.
4. The identity authentication method applied to the mobile terminal according to claim 1, wherein the typing behavior prediction model is trained by the following steps:
constructing a training sample pair based on the template sample of the user and the template samples of other users;
extracting training features of the training sample pairs;
and training with the training features as input data in a deep learning mode to obtain the typing behavior prediction model for generating the identity recognition result of the data to be predicted.
5. The identity authentication method applied to the mobile terminal according to claim 4, wherein the training features comprise statistical features, local features, signal features, frequency domain features and cross features.
6. The identity authentication method applied to the mobile terminal according to any one of claims 3 to 5, wherein after the step of constructing the training sample pair based on the template sample of the user and the template samples of other users, the method further comprises the following steps:
preprocessing the training sample pairs; the preprocessing mode comprises smoothing filtering, median filtering, average filtering and Kalman filtering.
7. The identity authentication method applied to a mobile terminal according to claim 1, wherein the step of inputting the data to be predicted into a typing behavior prediction model to obtain an identity authentication result output by the typing behavior prediction model comprises the following steps:
inputting the data to be predicted into a typing behavior prediction model to obtain an authentication distance output by the typing behavior prediction model;
judging the relation between the authentication distance and a preset distance threshold;
if the authentication distance is greater than the preset distance threshold, the identity authentication result is that the current user is not the legitimate user but another user;
and if the authentication distance is smaller than the preset distance threshold, the identity authentication result is that the current user is the legitimate user.
8. An identity authentication device applied to a mobile terminal, characterized by comprising:
the data acquisition module (100) is used for acquiring the typing behavior of the user by utilizing a sensor of the mobile terminal to obtain data to be predicted;
the behavior prediction module (200) is used for inputting the data to be predicted into a typing behavior prediction model to obtain an identity authentication result output by the typing behavior prediction model; the typing behavior prediction model is trained either on features extracted from the template samples of the user and of other users, or directly on the template samples of the user and of other users.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the steps of the identity authentication method as claimed in any one of claims 1 to 7 applied to a mobile terminal when executing the program.
10. A non-transitory computer readable storage medium having stored thereon a computer program, wherein the computer program when executed by a processor implements the steps of the identity authentication method applied to a mobile terminal according to any one of claims 1 to 7.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110839741.9A CN113626783A (en) | 2021-07-23 | 2021-07-23 | Identity authentication method and device applied to mobile terminal |
CN202210879197.5A CN115248910A (en) | 2021-07-23 | 2022-07-25 | Identity authentication method and device applied to mobile terminal |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110839741.9A CN113626783A (en) | 2021-07-23 | 2021-07-23 | Identity authentication method and device applied to mobile terminal |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113626783A true CN113626783A (en) | 2021-11-09 |
Family
ID=78380850
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110839741.9A Pending CN113626783A (en) | 2021-07-23 | 2021-07-23 | Identity authentication method and device applied to mobile terminal |
CN202210879197.5A Pending CN115248910A (en) | 2021-07-23 | 2022-07-25 | Identity authentication method and device applied to mobile terminal |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210879197.5A Pending CN115248910A (en) | 2021-07-23 | 2022-07-25 | Identity authentication method and device applied to mobile terminal |
Country Status (1)
Country | Link |
---|---|
CN (2) | CN113626783A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115103127A (en) * | 2022-08-22 | 2022-09-23 | 环球数科集团有限公司 | High-performance embedded intelligent camera design system and method |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109977639A (en) * | 2018-10-26 | 2019-07-05 | 招商银行股份有限公司 | Identity identifying method, device and computer readable storage medium |
CN110324350A (en) * | 2019-07-09 | 2019-10-11 | 中国工商银行股份有限公司 | Identity identifying method and server based on the non-sensitive sensing data in mobile terminal |
EP3699790A1 (en) * | 2019-02-19 | 2020-08-26 | Nxp B.V. | Method for enabling a biometric template |
Also Published As
Publication number | Publication date |
---|---|
CN115248910A (en) | 2022-10-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103533546B (en) | Implicit user verification and privacy protection method based on multi-dimensional behavior characteristics | |
US20170227995A1 (en) | Method and system for implicit authentication | |
CN106068512B (en) | Method and apparatus for verifying user on the mobile device | |
CN102890776A (en) | Method for searching emoticons through facial expression | |
WO2019192253A1 (en) | Mobile device-based user identity authentication method and system | |
CN110782333B (en) | Equipment risk control method, device, equipment and medium | |
CN102890777B (en) | The computer system of recognizable facial expression | |
TWI679586B (en) | Handwriting data processing method and device | |
CN104143083A (en) | Face recognition system based on process management | |
CN111030992A (en) | Detection method, server and computer readable storage medium | |
CN111563746A (en) | Method, device, electronic equipment and medium for user identity authentication | |
CN106921500B (en) | Identity authentication method and device for mobile equipment | |
CN112492090A (en) | Continuous identity authentication method fusing sliding track and dynamic characteristics on smart phone | |
CN113626783A (en) | Identity authentication method and device applied to mobile terminal | |
CN118154194A (en) | Digital payment identity security verification method and system based on cloud platform | |
CN113742669A (en) | User authentication method based on twin network | |
CN116151965B (en) | Risk feature extraction method and device, electronic equipment and storage medium | |
Chaitanya et al. | Verification of pattern unlock and gait behavioural authentication through a machine learning approach | |
CN110311898B (en) | Man-in-the-middle attack detection method of networked numerical control system based on Gaussian radial basis function classifier | |
CN112272195B (en) | Dynamic detection authentication system and method thereof | |
CN101639876A (en) | Identity authentication method | |
CN114550224A (en) | Fingerprint image identification comparison method and device based on deep learning and electronic equipment | |
CN114021181A (en) | Mobile intelligent terminal privacy continuous protection system and method based on use habits | |
CN111353139A (en) | Continuous authentication method and device, electronic equipment and storage medium | |
CN113365273A (en) | Packet-level wireless equipment authentication method based on channel state information |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20211109 |