CN110765939A - Identity recognition method and device, mobile terminal and storage medium - Google Patents


Info

Publication number
CN110765939A
CN110765939A (application CN201911008522.5A; granted publication CN110765939B)
Authority
CN
China
Prior art keywords
behavior
identity recognition
category
data
model
Prior art date
Legal status
Granted
Application number
CN201911008522.5A
Other languages
Chinese (zh)
Other versions
CN110765939B (en)
Inventor
郭子亮
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201911008522.5A
Publication of CN110765939A
Application granted
Publication of CN110765939B
Legal status: Active
Anticipated expiration

Classifications

    • G06V40/20 Movements or behaviour, e.g. gesture recognition (recognition of biometric, human-related or animal-related patterns in image or video data)
    • G06F18/2411 Classification techniques based on the proximity to a decision surface, e.g. support vector machines
    • G06F21/32 User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G06F2218/02 Pattern recognition for signal processing: preprocessing
    • G06F2218/08 Pattern recognition for signal processing: feature extraction
    • G06F2218/12 Pattern recognition for signal processing: classification; matching
    • Y02D30/70 Reducing energy consumption in wireless communication networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Security & Cryptography (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Human Computer Interaction (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Health & Medical Sciences (AREA)
  • Social Psychology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Multimedia (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Psychiatry (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)
  • Telephone Function (AREA)

Abstract

The application discloses an identity recognition method and apparatus, a mobile terminal, and a storage medium. The method is applied to a mobile terminal that includes a sensor for collecting behavior data, and comprises the following steps: acquiring behavior data collected by the sensor; performing feature extraction on the behavior data to obtain behavior features; inputting the behavior features into a pre-trained behavior classification model to obtain the behavior category output by that model; and inputting the behavior features into the identity recognition model corresponding to the behavior category to obtain the identity recognition result output by that model, where the identity recognition model is trained in advance to output an identity recognition result from the input behavior features. The method enables identity recognition without the user's awareness and improves user experience.

Description

Identity recognition method and device, mobile terminal and storage medium
Technical Field
The present application relates to the field of mobile terminal technologies, and in particular, to an identity recognition method and apparatus, a mobile terminal, and a storage medium.
Background
Mobile terminals such as smart watches and mobile phones have become among the most common consumer electronics products in daily life. As technology develops, mobile terminals can realize more and more functions, and identity recognition features are gradually appearing on them. Identity recognition on mobile terminals is usually performed with passwords, face recognition, and similar technologies; these methods generally require the user's active participation, which can be inconvenient in some situations.
Disclosure of Invention
In view of the foregoing problems, the present application provides an identity recognition method, an identity recognition apparatus, a mobile terminal, and a storage medium to address them.
In a first aspect, an embodiment of the present application provides an identity identification method, which is applied to a mobile terminal, where the mobile terminal includes a sensor for collecting behavior data, and the method includes: acquiring behavior data acquired by the sensor; performing feature extraction on the behavior data to obtain behavior features; inputting the behavior characteristics into a pre-trained behavior classification model to obtain a behavior class output by the behavior classification model; and inputting the behavior characteristics into an identity recognition model corresponding to the behavior category to obtain an identity recognition result output by the identity recognition model, wherein the identity recognition model is trained in advance to output the identity recognition result according to the input behavior characteristics.
In a second aspect, an embodiment of the present application provides an identity recognition apparatus applied to a mobile terminal that includes a sensor for collecting behavior data. The apparatus comprises a data acquisition module, a feature extraction module, a behavior classification module, and a feature recognition module. The data acquisition module is configured to acquire the behavior data collected by the sensor; the feature extraction module is configured to perform feature extraction on the behavior data to obtain behavior features; the behavior classification module is configured to input the behavior features into a pre-trained behavior classification model to obtain the behavior category output by that model; and the feature recognition module is configured to input the behavior features into the identity recognition model corresponding to the behavior category to obtain the identity recognition result output by that model, the identity recognition model being trained in advance to output an identity recognition result from the input behavior features.
In a third aspect, an embodiment of the present application provides a mobile terminal, including: one or more processors; a memory; one or more application programs, wherein the one or more application programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs configured to perform the identification method provided in the first aspect above.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, where a program code is stored in the computer-readable storage medium, and the program code may be called by a processor to execute the identity recognition method provided in the first aspect.
According to the above scheme, behavior data are collected by a sensor of the mobile terminal and feature extraction is performed on them to obtain behavior features; the behavior features are input into a pre-trained behavior classification model to obtain the behavior category it outputs, and then into the identity recognition model corresponding to that category to obtain the identity recognition result, the identity recognition model having been trained in advance to output an identity recognition result from the input behavior features. Active participation of the user in the identity recognition process is thereby avoided: identity recognition can be performed without the user noticing, which improves user experience.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described here are only some embodiments of the present application; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 shows a flow chart of an identification method according to an embodiment of the application.
Fig. 2 shows a graph of acceleration versus time provided by an embodiment of the present application.
Fig. 3 shows a graph of acceleration versus frequency provided by an embodiment of the present application.
Fig. 4 shows a flow chart of an identification method according to another embodiment of the present application.
Fig. 5 is a schematic diagram illustrating the principle of behavior classification in the identity recognition method according to an embodiment of the present application.
Fig. 6 shows a schematic diagram of the principle of the identity recognition method provided by the embodiment of the present application.
Fig. 7 shows a flow chart of an identification method according to yet another embodiment of the present application.
Fig. 8 shows a flow chart of an identification method according to yet another embodiment of the present application.
Fig. 9 shows a block diagram of an identity recognition apparatus according to an embodiment of the present application.
Fig. 10 shows a block diagram of the behavior classification module in an identity recognition apparatus according to an embodiment of the present application.
Fig. 11 is a block diagram of a mobile terminal for executing an identity recognition method according to an embodiment of the present application.
Fig. 12 shows a storage unit for storing or carrying program code that implements the identity recognition method according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application.
At present, human identity recognition technology is widely researched because it plays an important role in human-computer interaction and can support many emerging applications such as smart homes, augmented reality, and medical care. Moreover, with the development of mobile terminals, identity recognition is also widely applied on them.
Among traditional identity recognition technologies, password verification is the least convenient: weak passwords are easily broken and reliability cannot be guaranteed. Fingerprint, voiceprint, iris, and face recognition improve convenience to some extent, but they all require the user to actively complete certain actions, so the user is clearly aware of the process. In an emergency where identity recognition is urgently needed, verification may fail (wet fingers defeating fingerprint recognition, face recognition failing, and so on), so that identity recognition cannot be completed and the corresponding operation cannot be performed.
In view of the above problems, the inventor provides the identity recognition method, apparatus, mobile terminal, and storage medium of the embodiments of the present application. Behavior features extracted from behavior data collected by a sensor of the mobile terminal are recognized by the identity recognition model of the corresponding behavior category to obtain an identity recognition result, without requiring any active action from the user; identity recognition is thus achieved without the user's awareness, improving user experience. The specific method is described in detail in the following embodiments.
Referring to fig. 1, fig. 1 is a schematic flow chart of an identity recognition method according to an embodiment of the present application. The method recognizes the behavior features extracted from behavior data collected by a sensor of the mobile terminal with the identity recognition model of the corresponding behavior category, thereby obtaining an identity recognition result without any active action from the user. In a specific embodiment, the method is applied to the identity recognition apparatus 400 shown in fig. 9 and the mobile terminal 100 (fig. 11) configured with the apparatus 400. The mobile terminal may include a sensor for acquiring behavior data, such as an acceleration sensor, a gyroscope sensor, a gravity sensor, a heart rate sensor, a brain wave sensor, a positioning sensor, or an infrared sensor, without limitation; the mobile terminal itself may be a smart watch, a mobile phone, or the like, which is likewise not limited. As detailed in the flow shown in fig. 1, the identity recognition method may specifically include the following steps:
step S110: and acquiring the behavior data acquired by the sensor.
In the embodiment of the present application, the mobile terminal may be provided with various sensors for acquiring behavior data, such as an acceleration sensor, a gyroscope sensor, a gravity sensor, a heart rate sensor, a brain wave sensor, a positioning sensor, or an infrared sensor. Behavior data refers to data that characterizes user behavior; user behaviors may include walking, standing, running, squatting, hand movement, head shaking, and other types, and the specific behaviors are not limited. The data collected by these sensors varies with the behavior: for example, the data collected while the user is walking differs from that collected while the user is stationary. Because both the behavior data and the underlying behavioral habits differ between users, the behavior data collected by the sensor can be used for identity recognition.
In some embodiments, when performing identity recognition, the mobile terminal may acquire the behavior data collected by the sensors used for this purpose. The acquired behavior data may come from multiple sensors: all behavior data from every sensor capable of collecting it may be acquired, or only data from some of the sensors, which is not limited here. It can be understood that the more types of behavior data are acquired (i.e., the more sensor types are used), the more dimensions the behavior features used for identity recognition have, which benefits the recognition rate and accuracy.
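The multi-sensor acquisition described above can be sketched as follows. This is a minimal illustration assuming fixed-length windows per sensor; the function name, sensor names, and window sizes are assumptions for the example, not taken from the patent.

```python
import numpy as np

def build_sample(sensor_windows):
    """Concatenate time windows from several sensors into one
    behavior-data sample; adding more sensor types adds more
    dimensions for the later recognition steps."""
    return np.concatenate(
        [np.asarray(w, dtype=float).ravel() for w in sensor_windows.values()]
    )

# Hypothetical windows: 50 readings of 3-axis motion data plus heart rate
sample = build_sample({
    "accelerometer": np.zeros((50, 3)),
    "gyroscope":     np.zeros((50, 3)),
    "heart_rate":    np.zeros(50),
})
```

The flattened vector (here 50×3 + 50×3 + 50 = 350 values) is what the feature extraction step below would consume.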
Step S120: and performing feature extraction on the behavior data to obtain behavior features.
In the embodiment of the application, after the mobile terminal acquires the behavior data acquired by the sensor for acquiring the behavior data, the mobile terminal may perform feature extraction on the behavior data to acquire the behavior feature, so as to perform the subsequent process of identity recognition according to the behavior feature.
In some embodiments, the features extracted from the acquired behavior data may include time-series features, frequency-domain features, and statistical features; the specific features extracted are not limited. The behavior data collected by the sensor is typically time-series data (i.e., time-domain data); for example, when the sensor is an acceleration sensor, fig. 2 shows a curve of acceleration over time. The mobile terminal may extract statistical features, for example the median, mean, maximum, minimum, and peak values of the behavior data detected by the sensor, such as the maximum and minimum of the acceleration curve shown in fig. 2. The mobile terminal may also obtain, from the time-series data, the values before and after a certain point on the time axis at preset time intervals, thereby obtaining time-series features. The mobile terminal may further perform a fast Fourier transform on the time-series data to obtain frequency-domain data, separate the high- and low-frequency signals, calculate the overall energy of the frequency-domain signal, and take at least some of the coefficients as frequency-domain features. For example, as shown in fig. 3, a fast Fourier transform of the acceleration-versus-time curve of fig. 2 yields an acceleration-versus-frequency curve, from which the frequency-domain features can be calculated. Of course, the specific manner of extracting the behavior features corresponding to the behavior data is not limited.
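A minimal sketch of this kind of feature extraction: statistical features plus frequency-domain features from a fast Fourier transform. The particular statistics, the number of FFT coefficients kept, and the function name are illustrative assumptions rather than the patent's implementation.

```python
import numpy as np

def extract_features(samples):
    """Extract statistical and frequency-domain features from one
    sensor channel (e.g. acceleration magnitude over a time window)."""
    x = np.asarray(samples, dtype=float)

    # Statistical features of the time-series data
    stats = [np.median(x), np.mean(x), np.max(x), np.min(x),
             np.ptp(x), np.std(x)]

    # Frequency-domain features via FFT: overall spectral energy
    # plus the first few magnitude coefficients
    spectrum = np.abs(np.fft.rfft(x))
    energy = float(np.sum(spectrum ** 2) / len(x))
    freq = [energy] + spectrum[1:6].tolist()

    return np.array(stats + freq)

feats = extract_features(np.sin(np.linspace(0.0, 20.0, 200)))
```

One such vector per sensor channel would then be concatenated into the behavior feature vector passed to the classification step.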
Step S130: and inputting the behavior characteristics into a pre-trained behavior classification model to obtain the behavior class output by the behavior classification model.
In the embodiment of the present application, after the mobile terminal performs feature extraction on the behavior data and obtains the behavior features, the obtained features may be input into a pre-trained behavior classification model to obtain the classification result output by that model, which may include the behavior category corresponding to the behavior features.
In some embodiments, the behavior classification model may be obtained by training an initial model in advance on a large number of training samples. The training samples may include input samples and output samples: the input samples may be the behavior features corresponding to behavior data detected by the various sensors, and the output samples the behavior categories corresponding to those features, so that the trained model outputs a behavior category for the acquired behavior features. The output behavior category may include horizontal walking, vertical walking, horizontal running, vertical running, standing, stationary, and so on, without limitation. The behavior classification model may be obtained by training a Support Vector Machine (SVM), a neural network, or the like, which is not limited here.
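As an illustration of training such a behavior classification model with a support vector machine (here via scikit-learn; the toy feature clusters and category labels are invented for the example):

```python
import numpy as np
from sklearn.svm import SVC

# Toy training set: rows are behavior feature vectors, labels are
# behavior categories such as "walking" or "standing".
rng = np.random.default_rng(0)
X_walk = rng.normal(loc=2.0, scale=0.3, size=(40, 4))
X_stand = rng.normal(loc=0.0, scale=0.3, size=(40, 4))
X = np.vstack([X_walk, X_stand])
y = ["walking"] * 40 + ["standing"] * 40

clf = SVC(kernel="rbf")  # support vector machine classifier
clf.fit(X, y)

# A new feature vector near the "walking" cluster
category = clf.predict([[2.1, 1.9, 2.0, 2.2]])[0]
```

A real deployment would train on many users and sessions; the predicted category is what selects the identity recognition model in the next step.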
Step S140: and inputting the behavior characteristics into an identity recognition model corresponding to the behavior category to obtain an identity recognition result output by the identity recognition model, wherein the identity recognition model is trained in advance to output the identity recognition result according to the input behavior characteristics.
In the embodiment of the present application, different behavior categories may correspond to different identity recognition models. When the identity recognition is performed according to the obtained behavior characteristics, the mobile terminal can input the behavior characteristics into the identity recognition model corresponding to the behavior category to obtain an identity recognition result.
In some embodiments, the identity recognition model corresponding to each behavior category may likewise be trained from a large number of training samples, which again include input samples and output samples: the input samples may be behavior features belonging to the same behavior category, and the output samples the identity information of the user corresponding to those features, so that the trained model outputs an identity recognition result for the input behavior features. The result may indicate identity information matching the input features, or that no identity matches, and so on, without limitation. The identity recognition model may be a Support Vector Machine (SVM), a neural network, or the like, which is not limited here. It can be understood that behavior data of different behavior categories differ greatly, and so do the corresponding behavior features; if a single model were trained on behavior features spanning many categories, it might be unable to accurately distinguish the behavior features of different users, which would affect its accuracy. Training a separate identity recognition model for each behavior category therefore improves the accuracy of the output identity recognition results.
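The per-category training could be sketched as below, with one model per behavior category kept in a lookup table keyed by the category from the classification step. The user IDs, data, and helper name are hypothetical.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)

def train_identity_model(features_by_user):
    """Train one identity recognition model from behavior features of
    a single behavior category, labelled by user identity."""
    X = np.vstack(list(features_by_user.values()))
    y = sum(([uid] * len(f) for uid, f in features_by_user.items()), [])
    model = SVC(kernel="rbf")
    model.fit(X, y)
    return model

# Hypothetical walking-category features for two users
walking_data = {
    "owner":    rng.normal(1.0, 0.2, size=(30, 4)),
    "stranger": rng.normal(-1.0, 0.2, size=(30, 4)),
}
identity_models = {"walking": train_identity_model(walking_data)}

result = identity_models["walking"].predict([[1.1, 0.9, 1.0, 1.0]])[0]
```

Keeping the models in a dictionary mirrors the dispatch described in the text: the behavior category chooses which identity model sees the features.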
According to the identity recognition method provided by this embodiment, the mobile terminal collects behavior data through its sensor, performs feature extraction to obtain behavior features, inputs them into a pre-trained behavior classification model to obtain the behavior category, and then inputs them into the identity recognition model corresponding to that category to obtain the identity recognition result, the identity recognition model having been trained in advance to output a result from the input behavior features. Active participation of the user in the identity recognition process is avoided: identity recognition happens without the user noticing, improving user experience. In addition, recognizing behavior features with models dedicated to different behavior categories improves the accuracy of identity recognition and lets the user be identified through many types of behavior, further improving the experience.
Referring to fig. 4, fig. 4 is a schematic flow chart illustrating an identity recognition method according to another embodiment of the present application. The method is applied to the mobile terminal, where the mobile terminal includes a sensor for acquiring behavior data, and as will be described in detail with reference to the flow shown in fig. 4, the identity recognition method may specifically include the following steps:
step S210: and acquiring the behavior data acquired by the sensor.
Step S220: and performing feature extraction on the behavior data to obtain behavior features.
Step S230: and performing characteristic processing on the behavior characteristics to obtain the processed behavior characteristics, wherein the characteristic processing at least comprises characteristic cleaning and characteristic mining.
In the embodiment of the application, after the mobile terminal performs feature extraction on the behavior data acquired by the sensor to obtain the behavior features, before the behavior classification model is used for identifying the behavior classification corresponding to the behavior features, the mobile terminal may further perform feature processing on the behavior features.
In some embodiments, the feature processing performed by the mobile terminal on the obtained behavior features may include feature cleaning and feature mining. Feature cleaning means cleaning the content of the behavior features according to preset cleaning rules; feature mining means mining the behavior features to form features of more dimensions.
In some embodiments, feature cleaning by the mobile terminal may include removing missing values and abnormal values from the behavior features, for example removing incomplete data and data of the wrong type. As one specific implementation, feature cleaning may be missing-value processing: for a dimension whose proportion of missing values is smaller than a preset percentage, the missing values may be fitted from the other values of that dimension; if the proportion of missing values is greater than the preset percentage, the feature is invalid and the dimension is removed. The preset percentage may be 35%, 40%, or another value, which is not limited. As another specific implementation, feature cleaning may remove abnormal data, where normal data satisfies rules such as the following: the peak value of the time-series data collected by the acceleration sensor in a single session falls within plus or minus 2 standard deviations of the mean, as does its valley value; the peak and valley values of the time-series data collected by the heart rate sensor in a single session fall within plus or minus 1.3 standard deviations of the mean; and the change in the rotation angle of the device is not more than 140 degrees.
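A sketch of the two cleaning rules just described, the missing-value threshold and the mean plus-or-minus k standard deviations peak rule. The function names and the use of NaN to mark missing values are assumptions for the example.

```python
import numpy as np

MISSING_THRESHOLD = 0.35  # drop a dimension if over 35% of values are missing

def clean_feature_matrix(X):
    """Missing-value processing: fit sparse gaps from the dimension's
    mean; remove dimensions with too many missing values."""
    X = np.asarray(X, dtype=float)
    kept_columns = []
    for j in range(X.shape[1]):
        col = X[:, j].copy()
        missing = np.isnan(col)
        if missing.mean() > MISSING_THRESHOLD:
            continue  # invalid feature: remove the whole dimension
        col[missing] = np.nanmean(X[:, j])  # fit from the other values
        kept_columns.append(col)
    return np.column_stack(kept_columns)

def peak_is_normal(samples, k=2.0):
    """Abnormal-data rule for one window: the peak must fall within
    mean +/- k standard deviations (k = 2 for acceleration data,
    k = 1.3 for heart rate data, per the rules above)."""
    x = np.asarray(samples, dtype=float)
    return abs(x.max() - x.mean()) <= k * x.std()
```

Windows that fail `peak_is_normal` (or the analogous valley check) would be discarded before classification.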
In some embodiments, feature mining the behavior features may include mining the cleaned behavior features with a boosted tree model. Before the behavior features are input into the boosted tree model, the mobile terminal may quantize the numerical features among them into a vector. The quantized vector is then input into the boosted tree model, which outputs a multi-dimensional feature vector according to the input, thereby producing the feature-processed behavior features.
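One common way to realize this kind of boosted-tree feature mining is to use the leaf index each sample reaches in every tree as a new, higher-dimensional (one-hot) feature vector. The sketch below uses scikit-learn's gradient boosting for this; the data, labels, and hyperparameters are invented, and the patent does not specify this exact construction.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.preprocessing import OneHotEncoder

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))                  # quantized behavior features
y = (X[:, 0] + X[:, 1] > 0).astype(int)        # stand-in behavior labels

# Fit a small boosted-tree model, then map every sample to the leaf it
# lands in within each tree; one-hot encoding those leaf indices yields
# a feature vector of many more dimensions than the input.
gbt = GradientBoostingClassifier(n_estimators=10, max_depth=2, random_state=0)
gbt.fit(X, y)
leaves = gbt.apply(X)[:, :, 0]                 # shape (n_samples, n_trees)
mined = OneHotEncoder().fit_transform(leaves)  # sparse mined feature matrix
```

The mined matrix would then feed the cascaded behavior classification model discussed in the next step.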
Step S240: and inputting the characteristic-processed behavior characteristics into a pre-trained behavior classification model to obtain the behavior category output by the behavior classification model.
In the embodiment of the present application, after the feature processing, the processed behavior features may be input into the behavior classification model to obtain the behavior category it outputs. For example, as shown in fig. 5, a vector x is input into the boosted tree model and feature-mined to form multi-dimensional vectors W0, W1, W2, W3, and W4, which are input into a cascaded behavior classification model (which may be a trained linear classifier) that outputs the result. It can be understood that performing feature cleaning and feature mining on the behavior features before inputting them into the behavior classification model can improve the accuracy of the classification result.
In some embodiments, when the behavior classification model is trained, the same feature cleaning and feature mining may be applied to the training behavior features, so that the trained behavior classification model has better classification ability and the behavior classification results it outputs are more accurate.
Step S250: and judging whether the behavior category is a set category.
In the embodiment of the application, after obtaining the behavior category corresponding to the behavior features, the mobile terminal may determine whether the behavior category is a set category, so as to determine whether the current behavior features may be used for identity recognition. It can be understood that if the behavior features corresponding to the behavior data of the user belong to certain specific behavior categories, identity recognition cannot be performed with those behavior features. For example, when the behavior category is a stationary category, the behavior data collected by the sensor is the same for different users who are stationary, so the behavior features of this category cannot distinguish users and cannot be used for identity recognition. The set categories may include, for example, stationary, standing, and sitting; the specific set categories are not limited herein. It can be understood that if the behavior category is not a set category, the behavior features can be used for identity recognition; if the behavior category is a set category, the behavior features cannot be used for identity recognition.
Step S260: and if the behavior category is not the set category, inputting the behavior characteristics into an identity recognition model corresponding to the behavior category to obtain an identity recognition result output by the identity recognition model, wherein the identity recognition model is trained in advance to output the identity recognition result according to the input behavior characteristics.
In the embodiment of the application, if it is determined that the behavior category of the acquired behavior features is not a set category, identity recognition may be performed with the behavior features, and in this case an identity recognition model corresponding to the behavior category exists. Therefore, the behavior features can be input into the identity recognition model corresponding to the behavior category to obtain the identity recognition result output by the identity recognition model. For example, as shown in fig. 6, after the behavior features are classified by the behavior classification model, they are input, according to the behavior category, into the identity recognition model corresponding to that category, and the identity recognition result is finally obtained.
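The category check and per-category model routing described above can be sketched as follows (all names and the stub model are hypothetical illustrations, not the application's implementation):

```python
SET_CATEGORIES = {"stationary", "standing", "sitting"}

def identify(feature, classify, models):
    """Route a behavior feature to the identity model of its category.

    classify: maps a feature to a behavior category.
    models:   maps each non-set category to its identity recognition model.
    """
    category = classify(feature)
    if category in SET_CATEGORIES:
        return None  # set category: prompt for a non-set behavior instead
    return models[category].predict(feature)

class StubIdentityModel:
    def predict(self, feature):
        return "user_42"  # placeholder identity recognition result

print(identify([0.1], lambda f: "walking", {"walking": StubIdentityModel()}))
print(identify([0.0], lambda f: "stationary", {}))  # None
```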
Step S270: and if the behavior category is the set category, outputting first prompt content, wherein the first prompt content is used for prompting a user to perform human body behaviors which are not in the set category.
In the embodiment of the application, if it is determined that the behavior category of the acquired behavior features is a set category, identity recognition cannot be performed with the behavior features, and no identity recognition model corresponding to that behavior category exists; the recognition process would therefore fail, so ending it early avoids an invalid recognition operation. Instead, if the behavior category is a set category, the first prompt content can be output to prompt the user to perform a human behavior of a non-set category so that identity recognition can proceed. The first prompt content may be output in the form of sound, image, text, and the like, which is not limited herein.
In the identity recognition method provided by the embodiment of the application, the mobile terminal acquires behavior data through a sensor, performs feature extraction on the behavior data to obtain behavior features, and performs feature cleaning and feature mining on the behavior features. The feature-processed behavior features are input into a pre-trained behavior classification model to obtain the behavior category output by the behavior classification model, and whether the behavior category is a set category is judged. If the behavior category is not a set category, the feature-processed behavior features are input into the identity recognition model corresponding to the behavior category, and the identity recognition result output by the identity recognition model is obtained, the identity recognition model having been pre-trained to output an identity recognition result according to the input behavior features. In this way, the identity recognition process does not require active participation of the user, identity recognition can be achieved without the user's perception, and the user experience is improved. In addition, the behavior features are recognized by identity recognition models of different behavior categories, which improves the accuracy of identity recognition.
Referring to fig. 7, fig. 7 is a schematic flow chart illustrating an identity recognition method according to another embodiment of the present application. The method is applied to the mobile terminal, where the mobile terminal includes a sensor for acquiring behavior data, and as will be described in detail with reference to the flow shown in fig. 7, the identity recognition method may specifically include the following steps:
step S310: acquiring a training data set, wherein the training data set comprises identity information of a set user and sample behavior characteristics corresponding to the identity information, and the sample behavior characteristics are behavior characteristics corresponding to the behavior categories.
In this embodiment of the present application, for the identity recognition model in the foregoing embodiment, a training method for the identity recognition model is also provided. It is worth noting that the identity recognition model may be trained in advance according to the acquired training data set; subsequently, the identity recognition model can be used each time identity recognition is performed, without training the model anew for each recognition.
The following is a description of training an identity recognition model corresponding to an action category.
In some embodiments, the acquiring the training data set may include: acquiring behavior data of the set user collected by the sensor; performing feature extraction on the behavior data of the set user to obtain sample behavior features of the set user; inputting the sample behavior features of the set user into the pre-trained behavior classification model to obtain the behavior classification result output by the behavior classification model; and if the behavior classification result is the behavior category, establishing a correspondence between the sample behavior features of the set user and the identity information of the set user, thereby obtaining the training data set.
It can be understood that when the training data set is acquired, a large amount of behavior data of human behaviors of the behavior category can be acquired for different set users, where a set user may be a user whose identity information is known. After the behavior data of the behavior category of the set user is acquired, feature extraction can be performed to obtain sample behavior features, which are then classified by the behavior classification model to obtain the behavior classification result output by the behavior classification model, and whether the behavior classification result is the behavior category is determined. If the behavior classification result is the behavior category, the sample behavior features are behavior features corresponding to the behavior category and can therefore be used to train the identity recognition model corresponding to the behavior category. The sample behavior features of the set user are labeled with the identity information of the set user, thereby establishing the correspondence between the sample behavior features and the identity information of the set user. After a large amount of behavior data of set users is processed in this way, the training data set can be obtained.
In some embodiments, before the acquiring the behavior data of the set user collected by the sensor, the method further comprises: outputting second prompt content, where the second prompt content is used to prompt the set user to perform a human behavior of the behavior category. It can be understood that when the training data set of the behavior category is acquired, prompt content may be output to prompt the user to perform a human behavior of that category, which facilitates acquiring behavior data of the behavior category from which the sample behavior features are extracted. In addition, even though the user performs the human behavior of the behavior category, the user's action may be nonstandard; therefore, in the process of acquiring the training data set, the correspondence between the sample behavior features and the identity information of the set user is established only after the sample behavior features have been classified and the behavior classification result is determined to be the behavior category.
In the embodiment of the present application, in the training data set, the sample behavior feature is an input sample used for training, the labeled identity information of the setting user is an output sample used for training, and each set of training data may include one input sample and one output sample. The training data set may include input samples and output samples of a plurality of set users, so that the identity recognition model obtained by subsequent training may recognize identity information of a plurality of users.
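The data-set construction described above — keeping a sample only when the behavior classifier confirms the target category, then pairing it with the set user's identity label — can be sketched as follows (the names, data, and threshold are illustrative assumptions):

```python
def build_training_set(raw_samples, classify, target_category):
    """Pair confirmed sample behavior features with identity labels."""
    dataset = []
    for user_id, feature in raw_samples:
        # Discard nonstandard actions whose classified category differs.
        if classify(feature) == target_category:
            dataset.append((feature, user_id))  # (input sample, output sample)
    return dataset

samples = [("alice", [1.0, 0.2]), ("bob", [0.9, 0.3]), ("alice", [0.0, 0.0])]
classify = lambda f: "walking" if f[0] > 0.5 else "stationary"
walking_set = build_training_set(samples, classify, "walking")
print(len(walking_set))  # 2: the stationary sample is filtered out
```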
Step S320: and inputting the training data set into a neural network, training the neural network, and obtaining the identity recognition model corresponding to the behavior category.
In the embodiment of the application, the training data set can be input into the neural network for training, so as to obtain the identity recognition model. The neural network may be a deep neural network, which is not limited herein.
The training of the initial model from the training data set is explained below.
The sample behavior features in a group of data in the training data set serve as an input sample of the neural network, and the identity information of the set user labeled in that group of data serves as the corresponding output sample. In the neural network, the neurons of the input layer are fully connected with the neurons of the hidden layer, and the neurons of the hidden layer are fully connected with the neurons of the output layer, so that potential features of different granularities can be effectively extracted. There may be multiple hidden layers, which better fits nonlinear relations and makes the trained identity recognition model more accurate. It is understood that the training process of the identity recognition model may or may not be performed by the mobile terminal. When the training process is not performed by the mobile terminal, the mobile terminal acts as a direct or indirect user of the model; that is, the mobile terminal may send the acquired behavior features to the server storing the identity recognition model and obtain the identity recognition result from the server.
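The fully connected structure described here can be illustrated with a plain forward pass; the layer sizes and the number of set-user identities below are assumptions for the sketch, not values from the application.

```python
import numpy as np

rng = np.random.default_rng(0)

def dense_relu(x, w, b):
    return np.maximum(x @ w + b, 0.0)  # fully connected layer + ReLU

# Illustrative sizes (assumptions): 32-dim behavior feature vector,
# two fully connected hidden layers of 64 neurons, 10 set-user identities.
w1, b1 = rng.normal(size=(32, 64)), np.zeros(64)
w2, b2 = rng.normal(size=(64, 64)), np.zeros(64)
w3, b3 = rng.normal(size=(64, 10)), np.zeros(10)

x = rng.normal(size=(4, 32))                  # a batch of input samples
hidden = dense_relu(dense_relu(x, w1, b1), w2, b2)
logits = hidden @ w3 + b3                     # one score per set user
print(logits.shape)  # (4, 10)
```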
In some embodiments, the trained identity recognition model may be stored locally in the mobile terminal, or it may be stored in a server in communication connection with the mobile terminal; the latter reduces the storage space occupied in the mobile terminal and improves its operating efficiency.
In some embodiments, new training data may be acquired periodically or aperiodically to retrain and update the identity recognition model.
In the embodiment of the present application, the manner of training the behavior classification model may also refer to the process of training the identity recognition model, the training data may be composed of the sample behavior characteristics and the labeled behavior categories, and the training process is not described herein again.
Step S330: and acquiring the behavior data acquired by the sensor.
Step S340: and performing feature extraction on the behavior data to obtain behavior features.
Step S350: and inputting the behavior characteristics into a pre-trained behavior classification model to obtain the behavior class output by the behavior classification model.
Step S360: and inputting the behavior characteristics into an identity recognition model corresponding to the behavior category to obtain an identity recognition result output by the identity recognition model, wherein the identity recognition model is trained in advance to output the identity recognition result according to the input behavior characteristics.
In the embodiment of the present application, steps S330 to S360 may refer to the contents of the foregoing embodiments, and are not described herein again.
The identity recognition method provided by the embodiment of the application provides a training method for the identity recognition model: an initial model is trained with sample behavior features labeled with the identity information of set users, so that the resulting identity recognition model can output identity recognition results according to the behavior features corresponding to the collected behavior data. The mobile terminal collects behavior data through the sensor, performs feature extraction on the behavior data to obtain behavior features, inputs the behavior features into the pre-trained behavior classification model to obtain the behavior category output by the behavior classification model, and then inputs the behavior features into the identity recognition model corresponding to the behavior category to obtain the identity recognition result output by the identity recognition model. The identity recognition process therefore does not require active participation of the user; identity recognition can be achieved without the user's perception, and the user experience is improved. In addition, the behavior features are recognized by identity recognition models of different behavior categories, which improves the accuracy of identity recognition.
Referring to fig. 8, fig. 8 is a schematic flow chart illustrating an identity recognition method according to still another embodiment of the present application. The method is applied to the mobile terminal, where the mobile terminal includes a sensor for acquiring behavior data, and as will be described in detail with reference to the flow shown in fig. 8, the identity recognition method may specifically include the following steps:
step S410: and when the scene of identity recognition is triggered, acquiring the current identity recognition mode.
In this embodiment of the application, the mobile terminal may implement multiple identity recognition modes, where the multiple identity recognition modes may include a human behavior recognition mode, a human face recognition mode, an iris recognition mode, a fingerprint recognition mode, and the like, which are not limited herein.
The mobile terminal can monitor scenes of identity recognition, so as to determine, when such a scene is triggered, whether identity recognition should be performed using the human behavior recognition mode. The identity recognition scene is not limited here and may be, for example, a screen unlocking scene or a payment scene. When detecting that an identity recognition scene is triggered, the mobile terminal can acquire the current identity recognition mode and determine whether it is the human behavior recognition mode, so as to determine whether to perform identity recognition with the identity recognition method provided by the embodiment of the application.
Step S420: and if the mode is the recognition mode of human behavior, acquiring the behavior data acquired by the sensor.
In the embodiment of the application, if the current identity recognition mode is the human behavior recognition mode, the steps from acquiring the behavior data collected by the sensor through inputting the behavior features into the identity recognition model corresponding to the behavior category to obtain the identity recognition result output by the identity recognition model can be executed, so that identity recognition is realized using the human behavior recognition mode.
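A hypothetical dispatch on the current identity recognition mode might look like the following (all names are illustrative; the other recognition modes are elided):

```python
def recognize_by_behavior(sensor_data):
    # Placeholder for: feature extraction -> behavior classification ->
    # per-category identity recognition model, as described above.
    return "user_42"

def on_scene_triggered(mode, sensor_data):
    """Dispatch when an identity recognition scene (unlock, payment) fires."""
    if mode == "human_behavior":
        return recognize_by_behavior(sensor_data)
    raise NotImplementedError(mode)  # face, iris, fingerprint modes elided

print(on_scene_triggered("human_behavior", [0.1, 0.2]))  # user_42
```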
In an application scenario, the mobile terminal can be a smart wearable device (such as a smart watch or a smart band), and the identity recognition scene can be a payment scene. In this case, if the selected identity recognition mode is the human behavior recognition mode, the user only needs to perform a simple behavior action, and the behavior data collected by the sensor can be used to recognize the identity and complete the payment, thereby realizing perception-free payment.
Step S430: and performing feature extraction on the behavior data to obtain behavior features.
Step S440: and inputting the behavior characteristics into a pre-trained behavior classification model to obtain the behavior class output by the behavior classification model.
Step S450: and inputting the behavior characteristics into an identity recognition model corresponding to the behavior category to obtain an identity recognition result output by the identity recognition model, wherein the identity recognition model is trained in advance to output the identity recognition result according to the input behavior characteristics.
According to the identity recognition method provided by the embodiment of the application, when an identity recognition scene is detected, the identity recognition mode is determined. When the mode is the human behavior recognition mode, feature extraction and behavior classification are performed on the behavior data collected by the sensor, and the behavior features are input into the identity recognition model corresponding to the behavior category for recognition, so as to obtain the identity recognition result output by the identity recognition model. Active participation of the user is thus not needed in the identity recognition process; identity recognition can be achieved without the user's perception, and the user experience is improved. In addition, the behavior features are recognized by identity recognition models of different behavior categories, which improves the accuracy of identity recognition.
Referring to fig. 9, a block diagram of an identification apparatus 400 according to an embodiment of the present disclosure is shown. The identity recognition apparatus 400 is applied to the mobile terminal described above, which includes a sensor for collecting behavior data. The identification apparatus 400 includes: a data acquisition module 410, a feature extraction module 420, a behavior classification module 430, and a feature identification module 440. The data acquisition module 410 is configured to acquire behavior data acquired by the sensor; the feature extraction module 420 is configured to perform feature extraction on the behavior data to obtain behavior features; the behavior classification module 430 is configured to input the behavior features into a pre-trained behavior classification model, and obtain a behavior class output by the behavior classification model; the characteristic recognition module 440 is configured to input the behavior characteristics into an identity recognition model corresponding to the behavior category, and obtain an identity recognition result output by the identity recognition model, where the identity recognition model is pre-trained to output an identity recognition result according to the input behavior characteristics.
In some embodiments, referring to fig. 10, the behavior classification module 430 may include a feature processing unit 431 and a feature classification unit 432. The feature processing unit 431 is configured to perform feature processing on the behavior features to obtain processed behavior features, where the feature processing at least includes feature cleaning and feature mining; the feature classification unit 432 is configured to input the feature-processed behavior feature into a pre-trained behavior classification model, so as to obtain a behavior class output by the behavior classification model.
In this embodiment, the feature recognition module 440 may be specifically configured to: and inputting the behavior characteristics after the characteristic processing into the identity recognition model corresponding to the behavior category.
In some embodiments, the identification apparatus 400 may further include a category determination module. The category judgment module is used for judging whether the behavior category is a set category or not before the behavior characteristics are input into the identity recognition model corresponding to the behavior category and the identity recognition result output by the identity recognition model is obtained. If the behavior category is not the set category, the feature recognition module 440 inputs the behavior feature into the identity recognition model corresponding to the behavior category to obtain an identity recognition result output by the identity recognition model.
In this embodiment, the identification apparatus 400 may further include a first prompting module. The first prompting module is used for outputting first prompting content if the behavior category is the set category, and the first prompting content is used for prompting a user to perform human body behaviors which are not in the set category.
In the embodiment of the present application, the identity recognition apparatus 400 may further include a model training module. The model training module may be specifically configured to: acquiring a training data set, wherein the training data set comprises identity information of a set user and sample behavior characteristics corresponding to the identity information, and the sample behavior characteristics are behavior characteristics corresponding to the behavior categories; and inputting the training data set into a neural network, training the neural network, and obtaining the identity recognition model corresponding to the behavior category.
Further, the model training module acquiring the training data set may include: acquiring behavior data of the set user collected by the sensor; performing feature extraction on the behavior data of the set user to obtain sample behavior features of the set user; inputting the sample behavior features of the set user into the pre-trained behavior classification model to obtain the behavior classification result output by the behavior classification model; and if the behavior classification result is the behavior category, establishing a correspondence between the sample behavior features of the set user and the identity information of the set user, thereby obtaining the training data set.
In some embodiments, the identification apparatus 400 may further include a second prompting module. The second prompting module is used for outputting second prompting contents before the behavior data of the set user, which is acquired by the sensor, is acquired, wherein the second prompting contents are used for prompting the set user to perform the human body behaviors of the behavior categories.
In some embodiments, the data acquisition module 410 may be specifically configured to: when a scene of identity recognition is triggered, acquiring a current identity recognition mode; and if the mode is the recognition mode of human behavior, acquiring the behavior data acquired by the sensor.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses and modules may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, the coupling between the modules may be electrical, mechanical or other type of coupling.
In addition, functional modules in the embodiments of the present application may be integrated into one processing module, or each of the modules may exist alone physically, or two or more modules are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode.
To sum up, according to the scheme provided by the application, behavior data is collected through a sensor of the mobile terminal, feature extraction is performed on the behavior data to obtain behavior features, and the behavior features are input into a pre-trained behavior classification model to obtain the behavior category output by the behavior classification model. The behavior features are then input into the identity recognition model corresponding to the behavior category to obtain the identity recognition result output by the identity recognition model, the identity recognition model having been pre-trained to output an identity recognition result according to the input behavior features. The identity recognition process therefore does not require active participation of the user; identity recognition can be achieved without the user's perception, and the user experience is improved.
Referring to fig. 11, a block diagram of a mobile terminal according to an embodiment of the present application is shown. The mobile terminal 100 may be an electronic device capable of running an application, such as a smart watch, a smart phone, or the like. The mobile terminal 100 in the present application may include one or more of the following components: a processor 110, a memory 120, and one or more applications, wherein the one or more applications may be stored in the memory 120 and configured to be executed by the one or more processors 110, the one or more programs configured to perform a method as described in the aforementioned method embodiments.
Processor 110 may include one or more processing cores. The processor 110 connects various parts of the entire mobile terminal 100 using various interfaces and lines, and performs various functions of the mobile terminal 100 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 120 and invoking data stored in the memory 120. Optionally, the processor 110 may be implemented in hardware using at least one of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA). The processor 110 may integrate one or more of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like. The CPU mainly handles the operating system, the user interface, application programs, and the like; the GPU is used for rendering and drawing display content; and the modem is used to handle wireless communications. It is understood that the modem may also not be integrated into the processor 110 but instead be implemented by a separate communication chip.
The memory 120 may include a Random Access Memory (RAM) or a Read-Only Memory (ROM). The memory 120 may be used to store instructions, programs, code sets, or instruction sets. The memory 120 may include a program storage area and a data storage area, where the program storage area may store instructions for implementing an operating system, instructions for implementing at least one function (such as a touch function, a sound playing function, or an image playing function), instructions for implementing the method embodiments described above, and the like. The data storage area may also store data created by the mobile terminal 100 in use, such as a phonebook, audio and video data, and chat log data.
Referring to fig. 12, a block diagram of a computer-readable storage medium according to an embodiment of the present application is shown. The computer-readable medium 800 has stored therein a program code that can be called by a processor to execute the method described in the above-described method embodiments.
The computer-readable storage medium 800 may be an electronic memory such as a flash memory, an EEPROM (Electrically Erasable Programmable Read-Only Memory), an EPROM, a hard disk, or a ROM. Optionally, the computer-readable storage medium 800 includes a non-volatile computer-readable storage medium. The computer-readable storage medium 800 has storage space for program code 810 for performing any of the method steps of the methods described above. The program code can be read from or written into one or more computer program products. The program code 810 may, for example, be compressed in a suitable form.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not necessarily depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (11)

1. An identity recognition method is applied to a mobile terminal, the mobile terminal comprises a sensor for collecting behavior data, and the method comprises the following steps:
acquiring behavior data acquired by the sensor;
performing feature extraction on the behavior data to obtain behavior features;
inputting the behavior characteristics into a pre-trained behavior classification model to obtain a behavior class output by the behavior classification model;
and inputting the behavior characteristics into an identity recognition model corresponding to the behavior category to obtain an identity recognition result output by the identity recognition model, wherein the identity recognition model is trained in advance to output the identity recognition result according to the input behavior characteristics.
2. The method of claim 1, wherein the inputting the behavior features into a pre-trained behavior classification model to obtain the behavior class output by the behavior classification model comprises:
performing feature processing on the behavior features to obtain processed behavior features, wherein the feature processing at least comprises feature cleaning and feature mining;
inputting the characteristic-processed behavior characteristics into a pre-trained behavior classification model to obtain behavior categories output by the behavior classification model;
the inputting the behavior characteristics into the identity recognition model corresponding to the behavior category includes:
and inputting the feature-processed behavior features into the identity recognition model corresponding to the behavior category.
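Claim 2's "feature cleaning and feature mining" step is not further specified in the claims; the sketch below assumes one plausible reading (NaN repair plus outlier clipping for cleaning, appended aggregate statistics for mining). All function names and thresholds are hypothetical.

```python
import numpy as np

def clean_features(features, clip_sigma=3.0):
    # Assumed feature cleaning: replace NaNs with zero and clip outliers
    # beyond clip_sigma standard deviations from the feature-vector mean.
    f = np.nan_to_num(np.asarray(features, dtype=float))
    mu, sigma = f.mean(), f.std()
    if sigma == 0:
        return f
    return np.clip(f, mu - clip_sigma * sigma, mu + clip_sigma * sigma)

def mine_features(features):
    # Assumed feature mining: append simple derived statistics so the
    # downstream models see both raw and aggregate information.
    f = np.asarray(features, dtype=float)
    return np.concatenate([f, [f.min(), f.max(), f.mean()]])

def process_features(features):
    # Cleaning first, then mining, matching the order in claim 2.
    return mine_features(clean_features(features))
```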
3. The method according to claim 1, wherein before the inputting of the behavior features into the identity recognition model corresponding to the behavior category to obtain the identity recognition result output by the identity recognition model, the method further comprises:
judging whether the behavior category is a set category;
and if the behavior category is not the set category, executing the step of inputting the behavior features into the identity recognition model corresponding to the behavior category to obtain the identity recognition result output by the identity recognition model.
4. The method of claim 3, further comprising:
and if the behavior category is the set category, outputting first prompt content, wherein the first prompt content is used for prompting a user to perform human body behaviors which are not in the set category.
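The gating of claims 3-4 can be sketched as a single branch: recognition proceeds only for categories outside the set category; otherwise the first prompt content is emitted. The concrete set category, prompt text, and return shape are illustrative assumptions.

```python
# Hypothetical set of behavior categories for which no identity recognition
# is performed (claims 3-4); "stationary" is an assumed example.
SET_CATEGORIES = {"stationary"}

def gate_recognition(category, identity_model, features):
    if category in SET_CATEGORIES:
        # First prompt content (claim 4): ask the user to perform a human
        # body behavior that is not of the set category.
        return {"prompt": "Please perform a behavior outside the set category"}
    # Claim 3: the behavior category is not the set category, so run the
    # corresponding identity recognition model.
    return {"identity": identity_model(features)}
```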
5. The method of claim 1, wherein the identity recognition model is trained by:
acquiring a training data set, wherein the training data set comprises identity information of a set user and sample behavior characteristics corresponding to the identity information, and the sample behavior characteristics are behavior characteristics corresponding to the behavior categories;
and inputting the training data set into a neural network, training the neural network, and obtaining the identity recognition model corresponding to the behavior category.
6. The method of claim 5, wherein the acquiring of the training data set comprises:
acquiring behavior data of the set user, which is acquired by the sensor;
performing feature extraction on the behavior data of the set user to obtain sample behavior features of the set user;
inputting the sample behavior features of the set user into the pre-trained behavior classification model to obtain a behavior classification result output by the behavior classification model;
and if the behavior classification result is the behavior category, establishing a corresponding relation between the sample behavior characteristics of the set user and the identity information of the set user, and acquiring the training data set.
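Claims 5-6 describe building a per-category training set by keeping only the set user's sample features whose predicted behavior matches the target category, then training an identity model on it. The sketch below substitutes a nearest-centroid model for the claimed neural network purely to keep the example self-contained; the function names and data layout are assumptions.

```python
import numpy as np

def build_training_set(samples, identity, behavior_classifier, target_category):
    # Claim 6: keep only the sample features whose predicted behavior matches
    # the target category, and pair each kept sample with the set user's
    # identity. The behavior classifier is assumed to be pre-trained.
    kept = [f for f in samples if behavior_classifier(f) == target_category]
    return [(f, identity) for f in kept]

def train_identity_model(training_set):
    # Minimal stand-in for the claimed neural network (claim 5): a
    # nearest-centroid model over each enrolled identity's feature vectors.
    grouped = {}
    for features, identity in training_set:
        grouped.setdefault(identity, []).append(np.asarray(features, dtype=float))
    centroids = {k: np.mean(v, axis=0) for k, v in grouped.items()}

    def model(features):
        f = np.asarray(features, dtype=float)
        return min(centroids, key=lambda k: np.linalg.norm(f - centroids[k]))
    return model
```

In practice the per-category model would be a neural network as the claim states; the centroid classifier only illustrates the data flow from filtered training set to category-specific recognizer.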
7. The method of claim 6, wherein before the acquiring of the behavior data of the set user collected by the sensor, the method further comprises:
and outputting second prompt contents, wherein the second prompt contents are used for prompting the set user to perform the human body behaviors of the behavior categories.
8. The method of any one of claims 1-7, wherein the acquiring behavioral data collected by the sensor comprises:
when a scene of identity recognition is triggered, acquiring a current identity recognition mode;
and if the current identity recognition mode is a human behavior recognition mode, acquiring the behavior data collected by the sensor.
9. An identification device, applied to a mobile terminal including a sensor for collecting behavior data, the device comprising: a data acquisition module, a feature extraction module, a behavior classification module and a feature identification module, wherein,
the data acquisition module is used for acquiring behavior data acquired by the sensor;
the characteristic extraction module is used for extracting the characteristics of the behavior data to obtain behavior characteristics;
the behavior classification module is used for inputting the behavior characteristics into a pre-trained behavior classification model to obtain behavior classes output by the behavior classification model;
the characteristic recognition module is used for inputting the behavior characteristics into the identity recognition model corresponding to the behavior category to obtain an identity recognition result output by the identity recognition model, and the identity recognition model is trained in advance to output the identity recognition result according to the input behavior characteristics.
10. A mobile terminal, comprising:
one or more processors;
a memory;
one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors, the one or more applications being configured to perform the method of any one of claims 1-8.
11. A computer-readable storage medium, having stored thereon program code that can be invoked by a processor to perform the method according to any one of claims 1 to 8.
CN201911008522.5A 2019-10-22 2019-10-22 Identity recognition method and device, mobile terminal and storage medium Active CN110765939B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911008522.5A CN110765939B (en) 2019-10-22 2019-10-22 Identity recognition method and device, mobile terminal and storage medium

Publications (2)

Publication Number Publication Date
CN110765939A true CN110765939A (en) 2020-02-07
CN110765939B CN110765939B (en) 2023-03-28

Family

ID=69332909

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911008522.5A Active CN110765939B (en) 2019-10-22 2019-10-22 Identity recognition method and device, mobile terminal and storage medium

Country Status (1)

Country Link
CN (1) CN110765939B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107016346A (en) * 2017-03-09 2017-08-04 中国科学院计算技术研究所 Gait identification method and system
CN107506702A (en) * 2017-08-08 2017-12-22 江西高创保安服务技术有限公司 Multi-angle-based human face recognition model training and testing system and method
CN108323201A (en) * 2016-11-16 2018-07-24 华为技术有限公司 Identity authentication method and device
CN110163086A (en) * 2019-04-09 2019-08-23 缤刻普达(北京)科技有限责任公司 Neural-network-based body-building action recognition method, device, equipment and medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
郇战 (Huan Zhan): "Human identity recognition based on gait trajectory curve features", Journal of Zhengzhou University (Natural Science Edition) *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021184468A1 (en) * 2020-03-18 2021-09-23 中国科学院深圳先进技术研究院 Action recognition method and apparatus, device, and medium
CN113742665A (en) * 2020-06-05 2021-12-03 国家计算机网络与信息安全管理中心 User identity identification model construction method, user identity authentication method and device
CN113742665B (en) * 2020-06-05 2024-03-26 国家计算机网络与信息安全管理中心 User identity recognition model construction and user identity verification methods and devices

Also Published As

Publication number Publication date
CN110765939B (en) 2023-03-28

Similar Documents

Publication Publication Date Title
CN107437074B (en) Identity authentication method and device
CN108920639B (en) Context obtaining method and device based on voice interaction
WO2019033525A1 (en) Au feature recognition method, device and storage medium
WO2017156965A1 (en) Method for fingerprint unlocking and terminal
CN110765939B (en) Identity recognition method and device, mobile terminal and storage medium
CN111260220B (en) Group control equipment identification method and device, electronic equipment and storage medium
CN112200796B (en) Image processing method, device and equipment based on privacy protection
CN110741387B (en) Face recognition method and device, storage medium and electronic equipment
CN110288085B (en) Data processing method, device and system and storage medium
CN105760851A (en) Fingerprint identification method and terminal
CN110766074B (en) Method and device for testing identification qualification of abnormal grains in biological identification method
CN112836661A (en) Face recognition method and device, electronic equipment and storage medium
CN112274909A (en) Application operation control method and device, electronic equipment and storage medium
CN112258238A (en) User life value cycle detection method and device and computer equipment
CN114272612A (en) Identity recognition method, identity recognition device, storage medium and terminal
CN114223139B (en) Interface switching method and device, wearable electronic equipment and storage medium
CN112818317A (en) Browsing process monitoring method, monitoring device and readable storage medium
CN116665278A (en) Micro-expression recognition method, micro-expression recognition device, computer equipment and storage medium
Monisha et al. A real-time embedded system for human action recognition using template matching
CN111931148A (en) Image processing method and device and electronic equipment
CN113095153A (en) Mobile terminal human situation recognition method based on depth residual error network
CN111797077A (en) Data cleaning method and device, storage medium and electronic equipment
CN117421199B (en) Behavior determination method and system
CN111369985A (en) Voice interaction method, device, equipment and medium
CN115294986B (en) Method for reducing false awakening of intelligent voice interaction

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant