CN108108649B - Identity verification method and device - Google Patents

Identity verification method and device

Info

Publication number
CN108108649B
Authority
CN
China
Prior art keywords
gesture
information
user
gesture information
hand
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201611049484.4A
Other languages
Chinese (zh)
Other versions
CN108108649A (en)
Inventor
陈阳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201611049484.4A priority Critical patent/CN108108649B/en
Publication of CN108108649A publication Critical patent/CN108108649A/en
Application granted granted Critical
Publication of CN108108649B publication Critical patent/CN108108649B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168Feature extraction; Face representation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • G06F21/32User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00Payment architectures, schemes or protocols
    • G06Q20/38Payment protocols; Details thereof
    • G06Q20/40Authorisation, e.g. identification of payer or payee, verification of customer or shop credentials; Review and approval of payers, e.g. check credit lines or negative lists
    • G06Q20/401Transaction verification
    • G06Q20/4014Identity check for transactions
    • G06Q20/40145Biometric identity checks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172Classification, e.g. identification
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/28Recognition of hand or arm movements, e.g. recognition of deaf sign language

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Business, Economics & Management (AREA)
  • Multimedia (AREA)
  • Computer Security & Cryptography (AREA)
  • Human Computer Interaction (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Accounting & Taxation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Strategic Management (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • General Business, Economics & Management (AREA)
  • Finance (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses an identity verification method and device, belonging to the technical field of information security. The method comprises the following steps: acquiring face information of a target user; detecting whether the face information of the target user matches the face information of a registered user; if the face information of the target user matches the face information of the registered user, acquiring gesture information of the target user; detecting whether the gesture information of the target user matches the gesture information of the registered user; and if the gesture information of the target user matches the gesture information of the registered user, determining that the target user passes identity verification. According to the invention, gesture information is introduced and combined with face information for identity verification. Even if the face information of a user is acquired by others, as long as the gesture information set by the user is not leaked, others still cannot impersonate the user to pass identity verification, so the security is higher.

Description

Identity verification method and device
Technical Field
The invention relates to the technical field of information security, in particular to an identity authentication method and device.
Background
The identity authentication technology based on face recognition is widely applied due to the advantages of convenience and high efficiency. For example, the technology is used in practical application scenarios such as login authentication, payment authentication and the like.
In the prior art, an identity verification process based on face recognition is roughly as follows: a terminal device such as a mobile phone collects face information of a target user, detects whether the face information of the target user matches the face information of a registered user, and determines that the target user passes identity verification if the two match. The face information is usually a face image acquired by a camera.
In the prior art, the identity of a user is verified only according to face information. However, in an Internet environment where selfies are constantly taken and shared, face images (such as photos) of a user are easily obtained by others, and a hacker can build a face model from the face images the user has published in social applications and easily pass identity verification with that model.
Therefore, the security of the authentication process based on face recognition provided by the prior art is low.
Disclosure of Invention
In order to solve the problem that the security of an identity authentication process based on face recognition is low in the prior art, the embodiment of the invention provides an identity authentication method and device. The technical scheme is as follows:
in a first aspect, an identity verification method is provided, the method including:
acquiring face information of a target user;
detecting whether the face information of the target user is matched with the face information of the registered user;
if the face information of the target user is matched with the face information of the registered user, acquiring gesture information of the target user;
detecting whether the gesture information of the target user is matched with the gesture information of the registered user;
and if the gesture information of the target user is matched with the gesture information of the registered user, determining that the target user passes identity authentication.
In a second aspect, an identity authentication apparatus is provided, the apparatus comprising:
the first acquisition module is used for acquiring the face information of a target user;
the first detection module is used for detecting whether the face information of the target user is matched with the face information of the registered user;
the second acquisition module is used for acquiring the gesture information of the target user if the face information of the target user is matched with the face information of the registered user;
the second detection module is used for detecting whether the gesture information of the target user is matched with the gesture information of the registered user;
and the determining module is used for determining that the target user passes the identity authentication if the gesture information of the target user is matched with the gesture information of the registered user.
The technical scheme provided by the embodiment of the invention has the following beneficial effects:
Face information is relatively weak in confidentiality and can be regarded as "plaintext", while gesture information is relatively strong in confidentiality, sufficiently varied, and can be regarded as "ciphertext". In the face-recognition-based identity verification process, the target user is considered to pass identity verification only when both the face information and the gesture information match, which improves the security of the verification process. Moreover, because both the gesture information and the face information can be acquired through the camera, the convenience of information acquisition is not affected and no additional hardware cost is incurred.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. The drawings in the following description are only some embodiments of the present invention; other drawings can be obtained by those skilled in the art based on these drawings without creative effort.
Fig. 1 is a flowchart of an authentication method according to an embodiment of the present invention;
FIG. 2 illustrates a schematic diagram of several different hand types;
FIG. 3 illustrates a schematic diagram of a combination of a single hand type and gesture trajectory;
FIG. 4 illustrates a schematic diagram of a combination of a hand type sequence and gesture trajectory;
FIG. 5 is a flow diagram of a registration process provided by one embodiment of the present invention;
FIG. 6 illustrates a schematic diagram of a registration process;
fig. 7 is a block diagram of an authentication apparatus provided by an embodiment of the present invention;
fig. 8 is a schematic structural diagram of a terminal device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
The technical scheme provided by the embodiment of the invention is mainly used for carrying out identity authentication on the user. Authentication of a user is involved in many application scenarios, such as login authentication, payment authentication, check-in authentication, etc.
In the method provided by the embodiment of the invention, the entity executing each step may be a terminal device equipped with a camera. For example, the terminal device may be an electronic device such as a mobile phone, a tablet computer, an electronic book reader, a multimedia player, or a PC (personal computer). For convenience of description, in the following method embodiments, the entity executing each step is described as a terminal device.
Referring to fig. 1, a flowchart of an authentication method according to an embodiment of the present invention is shown. The method may include several steps as follows.
Step 101, obtaining face information of a target user.
The target user refers to the user who is currently requesting authentication. The face information of the target user refers to information including a face of the target user, for example, the face information of the target user may be an image including the face of the target user.
Optionally, the terminal device is configured with a camera, and the terminal device obtains the face information of the target user through the camera.
And 102, detecting whether the face information of the target user is matched with the face information of the registered user. If yes, go to step 103; if not, the following step 106 is performed.
The registered user refers to a user who sets face information and gesture information for authentication. The face information of the registered user refers to information including a face of the registered user, for example, the face information of the registered user may be an image including the face of the registered user.
Optionally, taking the face information as a face image as an example, the process of detecting whether the face information of the target user matches with the face information of the registered user may be as follows: extracting face features from a face image of a target user; comparing the face features of the target user with the face features of the registered users, and calculating the similarity between the face features; if the similarity is larger than a first preset threshold, determining that the face information of the target user is matched with the face information of the registered user; and if the similarity is smaller than a first preset threshold, determining that the face information of the target user is not matched with the face information of the registered user. The facial features of the registered user are extracted from the facial image of the registered user, and the facial features of the registered user can be extracted in advance and stored. For example, facial features of the registered user are extracted from facial images of the registered user and stored in the registration process. The facial features may include locations of points of facial features such as corners of the eyes, center of the eyes, tip of the nose, wings of the nose, corners of the mouth, corners of the eyebrows, and the like. The value range of the first preset threshold is greater than 0 and less than 1, for example, the first preset threshold is 0.9. In practical application, a first preset threshold value can be set according to the precision requirement of face matching, the higher the precision requirement is, the larger the first preset threshold value is, and the lower the precision requirement is, the smaller the first preset threshold value is.
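For illustration only, the following is a minimal sketch of the threshold comparison described above. It assumes the face features have already been extracted as fixed-length numeric vectors (e.g., flattened landmark coordinates) and uses cosine similarity as a stand-in for whatever similarity measure an implementation chooses; the function names and the threshold value are illustrative assumptions, not part of the patented method.

```python
# Minimal sketch of the face-matching step: compare two face feature vectors
# and apply the first preset threshold. Feature extraction is out of scope here.
from typing import Sequence
import math

FIRST_PRESET_THRESHOLD = 0.9  # example value from the description; tunable per accuracy requirement

def cosine_similarity(a: Sequence[float], b: Sequence[float]) -> float:
    """Similarity in (0, 1] for non-negative feature vectors; 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def face_matches(target_features: Sequence[float],
                 registered_features: Sequence[float],
                 threshold: float = FIRST_PRESET_THRESHOLD) -> bool:
    """Return True if the target user's face features match the registered user's."""
    return cosine_similarity(target_features, registered_features) > threshold
```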
And 103, acquiring gesture information of the target user.
The gesture information of the target user refers to information including a gesture performed by the target user, for example, the gesture information of the target user may be an image including the gesture performed by the target user. The terminal equipment can also acquire gesture information of the target user through the camera.
Optionally, the gesture information includes any one of the following types: a single hand type, a sequence of hand types, a gesture trajectory, a combination of a single hand type and a gesture trajectory, a combination of a sequence of hand types and a gesture trajectory.
A hand type (hand shape) refers to the shape and state of the hand. Optionally, in the embodiment of the present invention, the hand refers to the part composed of the palm and the fingers. Illustratively, fig. 2 shows a schematic diagram of several different hand types, such as a fist, five fingers open, and a thumbs-up. Because the hand is very flexible and can also face different directions, many different hand types can be made. A hand type may also be referred to as a two-dimensional hand shape or a static two-dimensional gesture.
A hand type sequence refers to a sequence comprising a plurality of hand types. For example, a hand type sequence may include 4 hand types performed in order: fist, five fingers open, index finger extended, and thumbs-up. In one example, the hand type sequence is obtained as follows: acquiring one hand type of the target user through the camera; detecting a continuous acquisition instruction or a stop acquisition instruction, where the continuous acquisition instruction is used for indicating continuous acquisition of hand types and the stop acquisition instruction is used for indicating that acquisition of hand types should stop; if the continuous acquisition instruction is acquired, executing again the step of acquiring one hand type of the target user through the camera; and if the stop acquisition instruction is acquired, generating the hand type sequence of the target user from the acquired hand types. The continuous acquisition instruction and the stop acquisition instruction may be triggered by corresponding selection controls. In the time dimension, a hand type sequence is formed by performing a plurality of hand types in order. A hand type sequence is analogous to a conventional password composed of characters, with each hand type in the sequence corresponding to one character of the password.
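As an illustration of the acquisition loop just described, the sketch below models the camera capture and the user's continue/stop choice as hypothetical callbacks; only the control flow follows the description.

```python
# Sketch of the hand-type-sequence acquisition loop: collect one hand type at a
# time until a stop-acquisition instruction is received.
from typing import Callable, List

def acquire_hand_shape_sequence(capture_hand_shape: Callable[[], str],
                                next_instruction: Callable[[], str]) -> List[str]:
    """Collect hand types until a stop-acquisition instruction is received.

    capture_hand_shape -- returns one recognized hand type label (e.g. "fist")
    next_instruction   -- returns "continue" or "stop", as triggered by the
                          corresponding selection controls in the interface
    """
    sequence: List[str] = []
    while True:
        sequence.append(capture_hand_shape())   # acquire one hand type via the camera
        if next_instruction() == "stop":        # stop acquisition instruction
            break                               # continuous acquisition: loop again
    return sequence
```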
A gesture trajectory (gesture track) refers to the overall motion path of the hand. In the spatial dimension, a gesture trajectory is formed by the overall motion of the hand. For example, the gesture trajectory may take the form of a circle, a rectangle, a hook (check mark), or the like.
The combination of the single hand type and the gesture track may be that the single hand type is executed first and then the gesture track is executed, or the single hand type is maintained in the process of executing the gesture track. For example, as shown in fig. 3, where the single hand shape is a fist and the gesture trajectory is a circle, the target user may hold the hand in the fist hand shape and move the arm so that the hand draws a circle.
The combination of a hand type sequence and a gesture trajectory may be that the hand type sequence is performed first and then the gesture trajectory, or that the hand type sequence is performed during the execution of the gesture trajectory. For example, as shown in fig. 4, the hand type sequence includes 4 hand types performed in order (fist, five fingers open, index finger extended, and thumbs-up) and the gesture trajectory is a rectangle: the target user makes a fist and, keeping the fist hand type, moves the hand from the first vertex to the second vertex of the rectangle; the target user then changes the hand type from the fist to five fingers open and, keeping that hand type, moves from the second vertex to the third vertex; the target user then changes the hand type from five fingers open to index finger extended and, keeping that hand type, moves from the third vertex to the fourth vertex; finally, the target user changes the hand type from index finger extended to thumbs-up and, keeping that hand type, moves from the fourth vertex back to the first vertex of the rectangle.
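The five gesture-information types listed above could, purely as an assumption for illustration, be modeled with a single data structure in which an empty or one-element hand type list and an optional trajectory cover all cases; the names and fields below are hypothetical.

```python
# Hypothetical data model for gesture information: single hand type, hand type
# sequence, gesture trajectory, and the two combinations.
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

Point = Tuple[float, float]   # hand position in image coordinates

@dataclass
class GestureInfo:
    hand_shapes: List[str] = field(default_factory=list)  # 1 item = single hand type; more = sequence
    trajectory: Optional[List[Point]] = None               # None = no gesture trajectory component

# Examples of the types described above:
single_fist = GestureInfo(hand_shapes=["fist"])
fist_then_circle = GestureInfo(hand_shapes=["fist"],
                               trajectory=[(0, 0), (1, 1), (2, 0), (1, -1), (0, 0)])
sequence_with_rectangle = GestureInfo(
    hand_shapes=["fist", "five_fingers_open", "index_finger_extended", "thumbs_up"],
    trajectory=[(0, 0), (4, 0), (4, 3), (0, 3), (0, 0)])
```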
In this embodiment, the description is given by taking as an example the case where the face information of the target user is obtained first and then the gesture information of the target user is obtained. In other possible embodiments, the gesture information of the target user may be obtained first and then the face information, or the two items of information may be obtained at the same time.
And 104, detecting whether the gesture information of the target user is matched with the gesture information of the registered user. If yes, go to step 105; if not, the following step 106 is performed.
The gesture information of the registered user refers to information including a gesture performed by the registered user, for example, the gesture information of the registered user may be an image including the gesture performed by the registered user.
When the type of the gesture information is a single hand type, the process of detecting whether the gesture information of the target user matches the gesture information of the registered user may be as follows: identifying the single hand type of the target user from the image containing the single hand type of the target user in a preset manner; if the single hand type of the target user is the same as the single hand type of the registered user, determining that the gesture information of the target user matches the gesture information of the registered user; and if the single hand type of the target user is different from the single hand type of the registered user, determining that the gesture information of the target user does not match the gesture information of the registered user. The single hand type of the registered user is identified from the image containing the single hand type of the registered user, and may be identified and stored in advance. For example, during the registration process, the single hand type of the registered user is identified from the image containing the single hand type of the registered user in the preset manner and stored.
In one example, the preset manner may be to identify the single hand type from the image by template matching. A hand type template set is constructed in advance; the set comprises a plurality of hand type templates, and each hand type template may be an image containing a single hand type. Taking the identification of the single hand type of the target user as an example, the similarity between the image containing the single hand type of the target user and each hand type template is calculated, and the recognition result is the single hand type of the template with the highest similarity. This approach is simple in principle, and hand type templates are easy to add and refine. In another example, the preset manner may be to identify the single hand type from the image by using a hand type recognition model. The hand type recognition model may adopt a Hidden Markov Model (HMM), or may be a Neural Network (NN) model constructed with a machine learning algorithm, which is not limited in the embodiments of the present invention. When a neural network model is used for hand type recognition, the model is obtained by training on a large number of training samples, so this approach has high fault tolerance.
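The template-matching option can be sketched as below (the neural-network option is not shown). The image similarity measure is a placeholder passed in as a function; any measure such as normalized cross-correlation could be substituted. This is an illustrative assumption, not the patented implementation.

```python
# Sketch of template matching: return the label of the hand type template that
# is most similar to the acquired image. Assumes a non-empty template set.
from typing import Callable, Dict, Sequence

Image = Sequence[Sequence[float]]  # grayscale image as a 2-D array of floats

def recognize_hand_shape(image: Image,
                         templates: Dict[str, Image],
                         similarity: Callable[[Image, Image], float]) -> str:
    """Return the label of the hand type template most similar to the image."""
    # The hand type template set is built in advance; each template is an image
    # containing a single hand type ("fist", "five_fingers_open", ...).
    return max(templates, key=lambda label: similarity(image, templates[label]))
```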
When the type of the gesture information is a hand type sequence, the process of detecting whether the gesture information of the target user matches the gesture information of the registered user may be as follows: detecting whether each hand type in the hand type sequence of the target user matches the hand type at the corresponding position in the hand type sequence of the registered user; if all of them match, determining that the gesture information of the target user matches the gesture information of the registered user; if not all of them match (i.e., there is at least one group of unmatched hand types), determining that the gesture information of the target user does not match the gesture information of the registered user. A hand type sequence corresponds to an image sequence comprising a plurality of images, and each image in the image sequence contains one hand type. Taking the acquisition of the hand type sequence of the target user as an example, the terminal device collects a plurality of images through the camera, and each image contains one hand type performed by the target user. When comparing whether each group of hand types matches, the method described above can be adopted, and the details are not repeated here.
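A minimal sketch of the position-by-position sequence comparison follows; it assumes each hand type has already been recognized to a label (for example with the template-matching sketch above).

```python
# Sketch of hand-type-sequence matching: every position in the target user's
# sequence must match the hand type at the same position in the registered
# user's sequence.
from typing import List

def hand_sequence_matches(target_sequence: List[str],
                          registered_sequence: List[str]) -> bool:
    """True only if both sequences have the same length and agree at every position."""
    if len(target_sequence) != len(registered_sequence):
        return False
    return all(t == r for t, r in zip(target_sequence, registered_sequence))
```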
When the type of the gesture information is a gesture trajectory, the process of detecting whether the gesture information of the target user matches the gesture information of the registered user may be as follows: extracting gesture trajectory features from the gesture trajectory of the target user; comparing the gesture trajectory features of the target user with the gesture trajectory features of the registered user, and calculating the similarity between them; if the similarity is greater than a second preset threshold, determining that the gesture information of the target user matches the gesture information of the registered user; and if the similarity is smaller than the second preset threshold, determining that the gesture information of the target user does not match the gesture information of the registered user. The gesture trajectory features of the registered user are extracted from the gesture trajectory of the registered user, and may be extracted and stored in advance. For example, the gesture trajectory features of the registered user are extracted from the gesture trajectory of the registered user during the registration process and stored. The gesture trajectory features may include the shape of the gesture trajectory, the locations of certain turning points, and the like. The value range of the second preset threshold is greater than 0 and less than 1; for example, the second preset threshold is 0.9. In practical applications, the second preset threshold may be set according to the accuracy requirement of trajectory matching: the higher the accuracy requirement, the larger the second preset threshold, and the lower the accuracy requirement, the smaller the second preset threshold. In addition, the gesture trajectory can be determined from the position of the hand in a plurality of continuously acquired images. Taking the gesture trajectory of the target user as an example, while the target user performs the gesture, the terminal device continuously collects a plurality of images containing the hand of the target user through the camera, and then obtains the gesture trajectory of the target user from the positions of the hand of the target user in those images.
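The sketch below illustrates one possible way to compare trajectories built from the hand positions in consecutively captured frames: resample, normalize for position and scale, and compare point by point against the second preset threshold. The specific trajectory features named above (shape, turning points) are abstracted into this simple similarity; it is an assumption for illustration only and presumes non-empty trajectories.

```python
# Sketch of gesture-trajectory matching over lists of hand positions.
from typing import List, Tuple
import math

Point = Tuple[float, float]
SECOND_PRESET_THRESHOLD = 0.9  # example value; tunable per accuracy requirement

def _normalize(points: List[Point], n: int = 32) -> List[Point]:
    """Resample to at most n points (by index) and normalize to zero mean / unit scale."""
    step = max(1, len(points) // n)
    sampled = points[::step][:n]
    cx = sum(p[0] for p in sampled) / len(sampled)
    cy = sum(p[1] for p in sampled) / len(sampled)
    scale = max(math.hypot(p[0] - cx, p[1] - cy) for p in sampled) or 1.0
    return [((p[0] - cx) / scale, (p[1] - cy) / scale) for p in sampled]

def trajectory_similarity(a: List[Point], b: List[Point]) -> float:
    """Similarity in (0, 1]; 1.0 means the normalized trajectories coincide."""
    na, nb = _normalize(a), _normalize(b)
    m = min(len(na), len(nb))
    mean_dist = sum(math.hypot(na[i][0] - nb[i][0], na[i][1] - nb[i][1]) for i in range(m)) / m
    return 1.0 / (1.0 + mean_dist)

def trajectory_matches(target: List[Point], registered: List[Point],
                       threshold: float = SECOND_PRESET_THRESHOLD) -> bool:
    """Apply the second preset threshold to the trajectory similarity."""
    return trajectory_similarity(target, registered) > threshold
```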
When the type of the gesture information is a combination of a single hand type and a gesture trajectory, a process of detecting whether the gesture information of the target user matches the gesture information of the registered user may be as follows: detecting whether the single hand type of the target user is matched with the single hand type of the registered user or not and whether the gesture track of the target user is matched with the gesture track of the registered user or not; if the single hand type of the target user is matched with the single hand type of the registered user and the gesture track of the target user is matched with the gesture track of the registered user, determining that the gesture information of the target user is matched with the gesture information of the registered user; and if the single hand type of the target user is not matched with the single hand type of the registered user and/or the gesture track of the target user is not matched with the gesture track of the registered user, determining that the gesture information of the target user is not matched with the gesture information of the registered user. The matching detection process for a single hand shape and gesture trajectory can be referred to above, and is not described here.
When the type of the gesture information is a combination of a hand type sequence and a gesture trajectory, a process of detecting whether the gesture information of the target user matches the gesture information of the registered user may be as follows: detecting whether the hand type sequence of the target user is matched with the hand type sequence of the registered user or not and whether the gesture track of the target user is matched with the gesture track of the registered user or not; if the hand type sequence of the target user is matched with the hand type sequence of the registered user and the gesture track of the target user is matched with the gesture track of the registered user, determining that the gesture information of the target user is matched with the gesture information of the registered user; and if the hand type sequence of the target user is not matched with the hand type sequence of the registered user and/or the gesture track of the target user is not matched with the gesture track of the registered user, determining that the gesture information of the target user is not matched with the gesture information of the registered user. The matching detection process of the hand type sequence and the gesture track can be referred to above, and is not described here again.
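For the two "combination" types above, the overall check reduces to requiring that the hand-type part and the gesture-trajectory part both match. The sketch below shows this dispatch; the two component checks are passed in as functions (e.g., the sequence and trajectory checks sketched earlier), and the gesture objects are assumed to expose hand_shapes and trajectory fields as in the hypothetical GestureInfo model above.

```python
# Sketch of combining the per-component checks: every component present in the
# registered user's gesture information must match.
from typing import Callable, List, Tuple

Point = Tuple[float, float]

def combined_gesture_matches(target, registered,
                             sequence_matches: Callable[[List[str], List[str]], bool],
                             trajectory_matches: Callable[[List[Point], List[Point]], bool]) -> bool:
    """True only if every component of the registered gesture information is matched."""
    if registered.hand_shapes:  # single hand type or hand type sequence component
        if not sequence_matches(target.hand_shapes, registered.hand_shapes):
            return False
    if registered.trajectory is not None:  # gesture trajectory component
        if target.trajectory is None or not trajectory_matches(target.trajectory,
                                                               registered.trajectory):
            return False
    return True  # all required components matched
```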
In this embodiment, whether the face information matches is detected first and then whether the gesture information matches is detected. In other possible embodiments, whether the gesture information matches may be detected first and then whether the face information matches, or the two items of information may be checked at the same time.
And step 105, determining that the target user passes the identity authentication.
And when the two items of information, namely the face information and the gesture information, are correspondingly matched, determining that the target user passes the identity authentication.
And step 106, determining that the target user is not authenticated.
And when at least one item of information of the face information and the gesture information is not correspondingly matched, determining that the target user does not pass the identity authentication.
It should be noted that the number of registered users may be one or more. When there are multiple registered users, the face information and the gesture information of each registered user are stored in correspondence with each other. When the face information of the target user matches the face information of a target registered user and the gesture information of the target user matches the gesture information of the same target registered user, it is determined that the target user passes identity verification; the target registered user refers to one of the registered users.
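For illustration, a sketch of the multi-user case follows: the stored records pair face information with gesture information per registered user, and verification passes only when both items match the same registered user. The record layout and the two match functions are assumptions; they could be the face and gesture checks sketched in the earlier examples.

```python
# Sketch of verification against one or more registered users.
from dataclasses import dataclass
from typing import Any, Callable, Iterable, Optional

@dataclass
class RegisteredRecord:
    user_id: str
    face_info: Any       # e.g. stored face features
    gesture_info: Any    # e.g. stored hand type sequence and/or trajectory

def verify(target_face: Any,
           target_gesture: Any,
           records: Iterable[RegisteredRecord],
           face_matches: Callable[[Any, Any], bool],
           gesture_matches: Callable[[Any, Any], bool]) -> Optional[str]:
    """Return the id of the target registered user, or None if verification fails."""
    for record in records:
        if face_matches(target_face, record.face_info) and \
           gesture_matches(target_gesture, record.gesture_info):
            return record.user_id   # both items match the same registered user
    return None                     # at least one item failed to match every registered user
```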
It should be further added that, when the number of times that the target user fails to pass the authentication is greater than the preset number of times, the terminal device may send an exception notification to the background server, and the background staff performs manual verification on the authentication process of the target user according to the exception notification, so as to find the reason that the target user repeatedly fails to pass the authentication and help the target user pass the authentication.
In summary, in the embodiment of the present invention, face information is less confidential and can be regarded as "plaintext", while gesture information is more confidential, sufficiently varied, and can be regarded as "ciphertext". In the face-recognition-based identity verification process, the target user is considered to pass identity verification only when both the face information and the gesture information match, which improves the security of the verification process. Moreover, because both the gesture information and the face information can be acquired through the camera, the convenience of information acquisition is not affected and no additional hardware cost is incurred.
By introducing gesture information and combining it with face information for identity verification, even if the face information of a user is acquired by others, as long as the gesture information set by the user is not leaked, others still cannot impersonate the user to pass identity verification, so the security is higher. For example, mobile phones are frequently lost, and the camera application of a phone inevitably contains selfies of some users; with the face-recognition-based verification process provided in the prior art, a hacker could easily use these selfies to impersonate the user and pass verification. In the scheme provided by the embodiment of the invention, the additional gesture information sets a higher threshold for cracking, so the security of the verification process is improved.
In addition, in practical applications, different implementation forms of the gesture information can be selected according to the requirements for security and convenience. A single hand type or a gesture trajectory is simple to perform and suitable for application scenarios with high requirements for convenience and verification efficiency; a hand type sequence, the combination of a single hand type and a gesture trajectory, and the combination of a hand type sequence and a gesture trajectory are suitable for application scenarios with high security requirements.
In addition, in this embodiment, it is first detected whether the face information of the target user matches the face information of the registered user, and if the detection result of the face information is matching, it is continuously detected whether the gesture information of the target user matches the gesture information of the registered user, otherwise, if the detection result of the face information is not matching, it is not necessary to execute the matching detection procedure of the gesture information, which is helpful to save processing overhead.
In the related prior art, a living body detection mechanism is added on the basis of the existing process, when the face information of the target user is collected, the target user is required to perform specified actions such as blinking, nodding, opening the mouth and the like to detect whether the target user is a living body, and when the face information of the target user is detected to be matched with the face information of the registered user and the target user is the living body, the target user is determined to pass the identity authentication. However, since the number of actions that can be performed by the face is limited, a hacker can still set the above-mentioned specified actions on the face model in advance to easily pass live body detection. According to the technical scheme provided by the embodiment of the invention, as the hands are very flexible and changeable, and the actions which can be executed by the hands are more various, the gesture information is difficult to crack, and the safety is higher.
In one example, the terminal device displays a face acquisition interface, acquires the face information entered by the target user in the face acquisition interface, and detects whether the face information of the target user matches the face information of the registered user. If they match, the terminal device displays a gesture acquisition interface, acquires the gesture information entered by the target user in the gesture acquisition interface, and detects whether the gesture information of the target user matches the gesture information of the registered user; if they match, it is determined that the target user passes identity verification. The face acquisition interface may comprise a viewfinder of the camera and first prompt information, where the first prompt information is used to prompt the user to move the face into the viewfinder so that the face information can be collected. The gesture acquisition interface comprises a viewfinder of the camera and second prompt information, where the second prompt information is used to prompt the user to perform a gesture in the viewfinder so that the gesture information can be collected.
In the embodiment shown in fig. 1 above, the verification process is described. Next, the registration process will be described.
In the registration process, the terminal device needs to acquire face information of the registered user and gesture information of the registered user. The registered user refers to a user who sets face information and gesture information for authentication. The terminal equipment can acquire the face information of the registered user and the gesture information of the registered user through the camera.
In this embodiment, the order of obtaining the face information of the registered user and the gesture information of the registered user is not limited, and the face information of the registered user may be obtained first and then the gesture information of the registered user, or the gesture information of the registered user may be obtained first and then the face information of the registered user, or both of the above two items of information may be obtained simultaneously.
In one example, as shown in fig. 5, the registration process includes the following steps:
step 501, obtaining face information of a registered user.
Step 502, after the face information of the registered user is acquired, a gesture setting interface is displayed.
The gesture setting interface comprises a plurality of selection items, and each selection item corresponds to one type of gesture information. Illustratively, the gesture setting interface includes 3 selection items: a first selection item, a second selection item, and a third selection item. The first selection item is used to select the single hand type, the second selection item is used to select the hand type sequence, and the third selection item is used to select the combination of a single hand type and a gesture trajectory. The registered user can select a suitable type according to his or her requirements for security and convenience. For example, if the registered user has a high demand for convenience, the first selection item may be chosen; if the registered user has a high requirement for security, the second or third selection item may be chosen.
Step 503, after the selection instruction corresponding to the target selection item is obtained, obtaining gesture information of the registered user, where the gesture information of the registered user belongs to the type of gesture information corresponding to the target selection item.
Optionally, after the selection indication corresponding to the target selection item is acquired, a gesture acquisition interface is displayed, and the gesture information entered by the registered user in the gesture acquisition interface is acquired. As described above, the gesture acquisition interface includes the viewfinder of the camera and the second prompt information, which is used to prompt the user to perform a gesture in the viewfinder so that the gesture information can be collected.
And step 504, correspondingly storing the face information of the registered user and the gesture information of the registered user.
After acquiring the face information and the gesture information of the registered user, the terminal device correspondingly stores the two items of information so as to be used for identity authentication in the following.
Referring collectively to fig. 6, a schematic diagram of the registration flow is shown. After the registered user triggers the start of the registration process, the terminal device acquires the face information of the registered user. If the face information is successfully acquired, the terminal device displays the gesture setting interface, which may include the 3 selection items introduced above. The registered user selects, in the gesture setting interface, the type of gesture information to be set. If the registered user selects a single hand type, the terminal device collects a single hand type of the registered user, and after the single hand type is successfully collected, the registration is completed. If the registered user selects a hand type sequence, the terminal device collects n hand types of the registered user to generate the hand type sequence, where n is an integer greater than 1, and after the hand type sequence is successfully collected, the registration is completed. If the registered user selects the combination of a single hand type and a gesture trajectory, the terminal device first collects one hand type of the registered user, prompts the user to perform the gesture trajectory after the hand type is successfully collected, and then collects the gesture trajectory of the registered user; after the gesture trajectory is successfully collected, the registration is completed.
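The registration flow of figs. 5-6 can be sketched as below. All user-interaction steps (acquire_face, choose_option, capture_hand_shape, more_shapes, capture_trajectory, store) are hypothetical callbacks; only the ordering follows the description: face information first, then the gesture setting interface, then the selected type of gesture information, then storing the two items in correspondence.

```python
# Sketch of the registration flow (steps 501-504).
def register(acquire_face, choose_option, capture_hand_shape,
             more_shapes, capture_trajectory, store):
    face_info = acquire_face()                        # step 501: acquire face information
    option = choose_option()                          # step 502: gesture setting interface
    if option == "single_hand_type":                  # first selection item
        gesture_info = {"hand_shapes": [capture_hand_shape()], "trajectory": None}
    elif option == "hand_type_sequence":              # second selection item
        shapes = [capture_hand_shape()]
        while more_shapes():                          # continuous acquisition instruction
            shapes.append(capture_hand_shape())
        gesture_info = {"hand_shapes": shapes, "trajectory": None}
    else:                                             # third item: single hand type + gesture trajectory
        gesture_info = {"hand_shapes": [capture_hand_shape()],
                        "trajectory": capture_trajectory()}
    store(face_info, gesture_info)                    # step 504: store the two items together
    return face_info, gesture_info
```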
In addition, unlike face information, the gesture information can be reset by the registered user according to actual requirements, and security can be further enhanced by periodically modifying the gesture information. As shown in fig. 5, the process of modifying the gesture information may include the following steps:
and 505, after the gesture resetting instruction is obtained, performing identity verification on the current user according to the face information of the current user and the gesture information of the current user.
The current user refers to a user currently requesting a reset of the gesture information. After the terminal device obtains the gesture resetting instruction, the identity verification process shown in fig. 1 is adopted to obtain the face information of the current user and the gesture information of the current user, and identity verification is performed on the current user according to the two items of information.
If the current user passes identity verification, the current user is confirmed to be the registered user and is allowed to reset the gesture information of the registered user.
Step 506, acquiring gesture information input again by the current user;
in step 507, the gesture information input again is set as the gesture information of the registered user.
For example, the originally set gesture information of the registered user is a hand type sequence A, and the reset gesture information is a hand type sequence B that is different from sequence A; the terminal device then replaces hand type sequence A with hand type sequence B as the stored gesture information.
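A minimal sketch of this reset flow is given below: the current user must first pass the full face + gesture verification of fig. 1, and only then is the newly entered gesture information stored in place of the old one. The storage record and the verify/acquire callbacks are assumptions for illustration.

```python
# Sketch of the gesture-reset flow (steps 505-507).
def reset_gesture(record, current_face, current_gesture,
                  verify_identity, acquire_new_gesture):
    """record is the stored registration (face info + gesture info) of the registered user."""
    # Step 505: authenticate the current user with face + gesture information.
    if not verify_identity(current_face, current_gesture, record):
        return False                                  # not the registered user; keep old gesture
    # Steps 506-507: the re-entered gesture information replaces the old one,
    # e.g. hand type sequence B replaces hand type sequence A.
    record["gesture_info"] = acquire_new_gesture()
    return True
```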
In this way, a function of resetting the gesture information is provided, so that the user can modify the gesture information regularly, further enhancing security.
The following are embodiments of the apparatus of the present invention that may be used to perform embodiments of the method of the present invention. For details which are not disclosed in the embodiments of the apparatus of the present invention, reference is made to the embodiments of the method of the present invention.
Referring to fig. 7, a block diagram of an authentication apparatus according to an embodiment of the present invention is shown. The device has the functions of realizing the method examples, and the functions can be realized by hardware or by hardware executing corresponding software. The apparatus may include: a first acquisition module 710, a first detection module 720, a second acquisition module 730, a second detection module 740, and a determination module 750.
A first obtaining module 710, configured to perform step 101.
The first detecting module 720 is configured to perform the step 102.
A second obtaining module 730, configured to perform the step 103.
A second detecting module 740, configured to perform the step 104.
A determining module 750 for executing the above steps 105 and 106.
In an example, when the type of the gesture information is a hand type sequence, the second obtaining module 730 is specifically configured to: acquire one hand type of the target user through a camera; detect a continuous acquisition instruction or a stop acquisition instruction, where the continuous acquisition instruction is used for indicating continuous acquisition of hand types and the stop acquisition instruction is used for indicating that acquisition of hand types should stop; if the continuous acquisition instruction is acquired, execute again the step of acquiring one hand type of the target user through the camera; and if the stop acquisition instruction is acquired, generate the hand type sequence of the target user from the acquired hand types of the target user.
Optionally, the apparatus further comprises: the device comprises a third acquisition module and a storage module.
And the third acquisition module is used for acquiring the face information of the registered user and the gesture information of the registered user.
A storage module, configured to perform step 504.
In one example, a third obtaining module includes: the device comprises a face acquisition unit, an interface display unit and a gesture acquisition unit.
A face obtaining unit, configured to perform step 501.
An interface display unit, configured to perform step 502 described above.
A gesture obtaining unit, configured to perform step 503.
Optionally, the apparatus further comprises: the device comprises an identity verification module, a gesture acquisition module and a gesture resetting module.
And an identity authentication module, configured to perform step 505.
And a gesture obtaining module for executing the step 506.
And a gesture resetting module for executing the step 507.
Reference may be made to the above-described method embodiments for relevant details.
It should be noted that: in the above embodiment, when the device implements the functions thereof, only the division of the functional modules is illustrated, and in practical applications, the functions may be distributed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to implement all or part of the functions described above. In addition, the apparatus and method embodiments provided by the above embodiments belong to the same concept, and specific implementation processes thereof are described in the method embodiments for details, which are not described herein again.
Referring to fig. 8, a schematic structural diagram of a terminal device according to an embodiment of the present invention is shown. The terminal device is configured to implement the identity authentication method provided in the above embodiment. Specifically, the method comprises the following steps:
terminal device 800 may include RF (Radio Frequency) circuitry 810, memory 820 including one or more computer-readable storage media, input unit 830, display unit 840, sensor 850, audio circuitry 860, WiFi (wireless fidelity) module 870, processor 880 including one or more processing cores, and power supply 890. Those skilled in the art will appreciate that the terminal device configuration shown in fig. 8 does not constitute a limitation of the terminal device and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components. Wherein:
the RF circuit 810 may be used for receiving and transmitting signals during information transmission and reception or during a call, and in particular, for receiving downlink information from a base station and then processing the received downlink information by the one or more processors 880; in addition, data relating to uplink is transmitted to the base station. In general, RF circuit 810 includes, but is not limited to, an antenna, at least one Amplifier, a tuner, one or more oscillators, a Subscriber Identity Module (SIM) card, a transceiver, a coupler, an LNA (Low Noise Amplifier), a duplexer, and the like. In addition, the RF circuit 810 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA (Code Division Multiple Access), WCDMA (Wideband Code Division Multiple Access), LTE (Long Term Evolution), e-mail, SMS (short messaging Service), etc.
The memory 820 may be used to store software programs and modules, and the processor 880 executes various functional applications and data processing by operating the software programs and modules stored in the memory 820. The memory 820 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the terminal apparatus 800, and the like. Further, the memory 820 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device. Accordingly, the memory 820 may also include a memory controller to provide the processor 880 and the input unit 830 access to the memory 820.
The input unit 830 may be used to receive input numeric or character information and generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control. Specifically, the input unit 830 may include an image input device 831 and other input devices 832. The image input device 831 may be a camera or a photoelectric scanning device. The input unit 830 may include other input devices 832 in addition to the image input device 831. In particular, other input devices 832 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 840 may be used to display information input by or provided to a user and various graphical user interfaces of the terminal device 800, which may be made up of graphics, text, icons, video, and any combination thereof. The Display unit 840 may include a Display panel 841, and the Display panel 841 may be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like, as an option.
The terminal device 800 may also include at least one sensor 850, such as light sensors, motion sensors, and other sensors. Specifically, the light sensor may include an ambient light sensor that may adjust the brightness of the display panel 841 according to the brightness of ambient light, and a proximity sensor that may turn off the display panel 841 and/or backlight when the terminal device 800 is moved to the ear. As one of the motion sensors, the gravity acceleration sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when the mobile phone is stationary, and can be used for applications of recognizing the posture of the mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the terminal device 800, further description is omitted here.
Audio circuitry 860, speaker 861, microphone 862 may provide an audio interface between a user and terminal device 800. The audio circuit 860 can transmit the electrical signal converted from the received audio data to the speaker 861, and the electrical signal is converted into a sound signal by the speaker 861 and output; on the other hand, the microphone 862 converts the collected sound signal into an electric signal, converts the electric signal into audio data after being received by the audio circuit 860, and outputs the audio data to the processor 880 for processing, and then transmits the audio data to, for example, another terminal device via the RF circuit 810, or outputs the audio data to the memory 820 for further processing. The audio circuitry 860 may also include an earbud jack to provide communication of peripheral headphones with the terminal device 800.
WiFi belongs to short-distance wireless transmission technology, and the terminal device 800 can help the user send and receive e-mail, browse web pages, access streaming media, etc. through the WiFi module 870, and it provides the user with wireless broadband internet access. Although fig. 8 shows WiFi module 870, it is understood that it does not belong to the essential constitution of terminal apparatus 800, and may be omitted entirely as needed within the scope not changing the essence of the invention.
The processor 880 is a control center of the terminal device 800, connects various parts of the entire cellular phone using various interfaces and lines, and performs various functions of the terminal device 800 and processes data by operating or executing software programs and/or modules stored in the memory 820 and calling data stored in the memory 820, thereby performing overall monitoring of the cellular phone. Optionally, processor 880 may include one or more processing cores; preferably, the processor 880 may integrate an application processor, which mainly handles operating systems, user interfaces, applications, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 880.
Terminal device 800 further includes a power supply 890 (e.g., a battery) for powering the various components, which may be logically coupled to processor 880 via a power management system that may be used to manage charging, discharging, and power consumption. Power supply 890 may also include any component of one or more dc or ac power sources, recharging systems, power failure detection circuitry, power converters or inverters, power status indicators, and the like.
Although not shown, the terminal apparatus 800 may further include a bluetooth module or the like, which is not described in detail herein.
In particular, in this embodiment, the terminal device 800 further includes a memory, and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for performing the authentication method described above.
In an exemplary embodiment, a non-transitory computer readable storage medium comprising instructions, e.g., a memory comprising instructions, executable by a processor of a terminal device to perform the steps in the above method embodiments is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
It should be understood that reference to "a plurality" herein means two or more. "And/or" describes an association between the associated objects and indicates that three relationships are possible; for example, "A and/or B" may mean that A exists alone, that A and B exist simultaneously, or that B exists alone. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship.
The serial numbers of the above embodiments of the present invention are for description only and do not indicate the relative merits of the embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or by a program instructing the relevant hardware. The program may be stored in a computer-readable storage medium, such as a read-only memory, a magnetic disk, or an optical disk.
The above description covers only preferred embodiments of the present invention and is not intended to limit the invention; any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present invention shall be included within its protection scope.

Claims (12)

1. An identity verification method, the method comprising:
acquiring face information of a target user;
detecting whether the face information of the target user is matched with the face information of the registered user;
if the face information of the target user is matched with the face information of the registered user, acquiring gesture information of the target user;
detecting whether the gesture information of the target user is matched with the gesture information of the registered user;
if the gesture information of the target user is matched with the gesture information of the registered user, determining that the target user passes identity authentication;
the type of the gesture information is a hand type sequence, and the hand type sequence refers to a sequence comprising a plurality of hand types; the detecting whether the gesture information of the target user matches the gesture information of the registered user comprises:
detecting whether all the hand types in the hand type sequence of the target user are matched with the hand types at the corresponding positions in the hand type sequence of the registered user;
if all the gesture information of the target user is matched with the gesture information of the registered user, determining that the gesture information of the target user is matched with the gesture information of the registered user;
if not, determining that the gesture information of the target user is not matched with the gesture information of the registered user; the hand shape sequence corresponds to an image sequence, the image sequence comprises a plurality of images, and each image in the image sequence comprises a hand shape;
and when each group of hand types are matched or not through comparison, identifying a single hand type from the image by adopting a hand type identification model, wherein the hand type identification model comprises a neural network model constructed by adopting a machine learning algorithm.
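By way of illustration only, the following Python sketch shows the two-stage check described in claim 1 under stated assumptions: face_matcher and hand_shape_model are hypothetical callables standing in for whatever face-matching routine and hand shape recognition model (for example, a neural network model constructed with a machine learning algorithm) a concrete implementation uses; the sketch only demonstrates the position-by-position comparison of the hand shape sequences.

def verify_identity(target_face, target_hand_images, registered_face,
                    registered_hand_shapes, face_matcher, hand_shape_model):
    # Stage 1: the face information must match before any gesture is checked.
    if not face_matcher(target_face, registered_face):
        return False
    # The two hand shape sequences must have the same length to be compared
    # position by position.
    if len(target_hand_images) != len(registered_hand_shapes):
        return False
    # Stage 2: recognize a single hand shape from each image and compare it with
    # the hand shape at the corresponding position of the registered sequence.
    for image, expected_shape in zip(target_hand_images, registered_hand_shapes):
        if hand_shape_model(image) != expected_shape:
            return False
    return True

Returning False as soon as one comparison fails corresponds to the "if not, determining that the gesture information ... does not match" branch of the claim.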
2. The method of claim 1, wherein the obtaining gesture information of the target user comprises:
acquiring one hand shape of the target user through a camera;
detecting a continue-acquisition instruction or a stop-acquisition instruction, wherein the continue-acquisition instruction is used for indicating that acquisition of hand shapes should continue, and the stop-acquisition instruction is used for indicating that acquisition of hand shapes should stop;
if the continue-acquisition instruction is acquired, executing the step of acquiring one hand shape of the target user through the camera again;
and if the stop-acquisition instruction is acquired, generating the hand shape sequence of the target user according to the acquired hand shapes of the target user.
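A minimal sketch of the acquisition loop in claim 2, assuming three hypothetical callables: capture_hand_image for the camera, hand_shape_model for the hand shape recognizer, and next_instruction for the source of continue/stop instructions (for example, on-screen buttons).

def collect_hand_shape_sequence(capture_hand_image, hand_shape_model, next_instruction):
    sequence = []
    while True:
        image = capture_hand_image()              # acquire one hand shape via the camera
        sequence.append(hand_shape_model(image))  # recognize and record the hand shape
        if next_instruction() == "stop":          # otherwise "continue": capture again
            break
    return sequence                               # the generated hand shape sequence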
3. The method of claim 1, wherein the gesture information further comprises any one of the following types:
a gesture trajectory;
a combination of a single hand shape and the gesture trajectory;
wherein the gesture trajectory refers to the overall motion trajectory of the hand.
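For illustration only, the following data-structure sketch (with hypothetical field names) shows one way the three gesture-information types named in claims 1 and 3 could be represented in code.

from dataclasses import dataclass
from enum import Enum, auto
from typing import List, Optional, Tuple

class GestureType(Enum):
    HAND_SHAPE_SEQUENCE = auto()   # a sequence comprising a plurality of hand shapes
    GESTURE_TRAJECTORY = auto()    # the overall motion trajectory of the hand
    SHAPE_AND_TRAJECTORY = auto()  # a single hand shape combined with a trajectory

@dataclass
class GestureInfo:
    gesture_type: GestureType
    hand_shapes: Optional[List[str]] = None                 # used by the first and third types
    trajectory: Optional[List[Tuple[float, float]]] = None  # sampled (x, y) points of the hand's path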
4. The method of claim 3, wherein before the obtaining of the face information of the target user, the method further comprises:
acquiring the face information of the registered user and the gesture information of the registered user;
and correspondingly storing the face information of the registered user and the gesture information of the registered user.
5. The method of claim 4, wherein the obtaining of the face information of the registered user and the gesture information of the registered user comprises:
acquiring face information of the registered user;
after the face information of the registered user is acquired, displaying a gesture setting interface, wherein the gesture setting interface comprises a plurality of selection items, and each selection item corresponds to one type of gesture information;
after a selection instruction corresponding to a target selection item is acquired, acquiring gesture information of the registered user, wherein the gesture information of the registered user belongs to the type of the gesture information corresponding to the target selection item.
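A possible enrollment flow corresponding to claims 4 and 5, sketched with hypothetical callables: capture_face acquires the registered user's face information, show_gesture_setting_interface displays the selection items and returns the chosen gesture-information type, capture_gesture collects gesture information of that type, and store persists the pair.

def register_user(capture_face, show_gesture_setting_interface, capture_gesture, store):
    face_info = capture_face()                       # acquire the registered user's face information
    gesture_type = show_gesture_setting_interface()  # one selection item per gesture-information type
    gesture_info = capture_gesture(gesture_type)     # collected gesture belongs to the chosen type
    store(face_info, gesture_type, gesture_info)     # store face and gesture information correspondingly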
6. The method according to claim 4, further comprising, after storing the face information of the registered user and the gesture information of the registered user correspondingly:
after a gesture resetting instruction is acquired, performing identity verification on a current user according to face information of the current user and gesture information of the current user;
acquiring gesture information input again by the current user under the condition that the current user passes identity authentication;
and setting the gesture information input again as the gesture information of the registered user.
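The gesture reset of claim 6 could look like the following sketch, assuming hypothetical verify_identity and capture_gesture callables; the re-entered gesture replaces the registered one only if the current user passes the combined face-and-gesture verification.

def reset_gesture(current_face, current_gesture_images, registered_record,
                  verify_identity, capture_gesture):
    if not verify_identity(current_face, current_gesture_images, registered_record):
        return False                                  # reset refused: identity verification failed
    registered_record["gesture"] = capture_gesture()  # re-entered gesture becomes the registered gesture
    return True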
7. An identity verification apparatus, the apparatus comprising:
the first acquisition module is used for acquiring the face information of a target user;
the first detection module is used for detecting whether the face information of the target user is matched with the face information of the registered user;
the second acquisition module is used for acquiring the gesture information of the target user if the face information of the target user is matched with the face information of the registered user;
the second detection module is used for detecting whether the gesture information of the target user is matched with the gesture information of the registered user;
the determining module is used for determining that the target user passes identity authentication if the gesture information of the target user is matched with the gesture information of the registered user;
the type of the gesture information is a hand shape sequence, and the hand shape sequence refers to a sequence comprising a plurality of hand shapes; the detecting whether the gesture information of the target user matches the gesture information of the registered user comprises:
detecting whether each hand shape in the hand shape sequence of the target user matches the hand shape at the corresponding position in the hand shape sequence of the registered user;
if all the hand shapes match, determining that the gesture information of the target user matches the gesture information of the registered user;
if not, determining that the gesture information of the target user does not match the gesture information of the registered user; the hand shape sequence corresponds to an image sequence, the image sequence comprises a plurality of images, and each image in the image sequence contains one hand shape;
and when comparing whether each pair of hand shapes matches, a single hand shape is recognized from the image by using a hand shape recognition model, wherein the hand shape recognition model comprises a neural network model constructed by using a machine learning algorithm.
8. The apparatus of claim 7, wherein the second obtaining module is specifically configured to:
acquiring one hand shape of the target user through a camera;
detecting a continue-acquisition instruction or a stop-acquisition instruction, wherein the continue-acquisition instruction is used for indicating that acquisition of hand shapes should continue, and the stop-acquisition instruction is used for indicating that acquisition of hand shapes should stop;
if the continue-acquisition instruction is acquired, executing the step of acquiring one hand shape of the target user through the camera again;
and if the stop-acquisition instruction is acquired, generating the hand shape sequence of the target user according to the acquired hand shapes of the target user.
9. The apparatus of claim 7, wherein the gesture information further comprises any one of the following types:
a gesture trajectory;
a combination of a single hand shape and the gesture trajectory;
wherein the gesture trajectory refers to the overall motion trajectory of the hand.
10. The apparatus of claim 9, further comprising:
the third acquisition module is used for acquiring the face information of the registered user and the gesture information of the registered user;
and the storage module is used for correspondingly storing the face information of the registered user and the gesture information of the registered user.
11. The apparatus of claim 10, wherein the third obtaining module comprises:
the face acquisition unit is used for acquiring face information of the registered user;
the interface display unit is used for displaying a gesture setting interface after the face information of the registered user is acquired, wherein the gesture setting interface comprises a plurality of selection items, and each selection item corresponds to one type of gesture information;
the gesture obtaining unit is used for obtaining gesture information of the registered user after obtaining a selection instruction corresponding to a target selection item, wherein the gesture information of the registered user belongs to the type of the gesture information corresponding to the target selection item.
12. The apparatus of claim 10, further comprising:
the identity authentication module is used for authenticating the identity of the current user according to the face information of the current user and the gesture information of the current user after the gesture resetting instruction is obtained;
the gesture obtaining module is used for obtaining gesture information input again by the current user under the condition that the current user passes identity authentication;
and the gesture resetting module is used for setting the gesture information input again as the gesture information of the registered user.
CN201611049484.4A 2016-11-24 2016-11-24 Identity verification method and device Active CN108108649B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611049484.4A CN108108649B (en) 2016-11-24 2016-11-24 Identity verification method and device


Publications (2)

Publication Number Publication Date
CN108108649A CN108108649A (en) 2018-06-01
CN108108649B true CN108108649B (en) 2020-04-07

Family

ID=62204829

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611049484.4A Active CN108108649B (en) 2016-11-24 2016-11-24 Identity verification method and device

Country Status (1)

Country Link
CN (1) CN108108649B (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110610354A (en) * 2018-06-15 2019-12-24 腾讯科技(深圳)有限公司 Method and device for settlement of articles in unmanned store and storage medium
CN109284689A (en) * 2018-08-27 2019-01-29 苏州浪潮智能软件有限公司 A method of In vivo detection is carried out using gesture identification
CN109376513A (en) * 2018-08-29 2019-02-22 盐城线尚天使科技企业孵化器有限公司 Two-way verification method and system based on gesture control
CN109450867B (en) * 2018-10-22 2019-11-15 腾讯科技(深圳)有限公司 A kind of identity identifying method, device and storage medium
CN110677390B (en) * 2019-09-10 2023-03-24 中国平安财产保险股份有限公司 Abnormal account identification method and device, electronic equipment and storage medium
CN111046804A (en) * 2019-12-13 2020-04-21 北京旷视科技有限公司 Living body detection method, living body detection device, electronic equipment and readable storage medium
CN112417411A (en) * 2020-11-18 2021-02-26 深圳市优必选科技股份有限公司 Identity recognition method, device, electronic equipment and medium
DE112021002222T5 (en) * 2021-01-26 2023-02-23 Boe Technology Group Co., Ltd. Control method, electronic device and storage medium
CN113014980B (en) * 2021-02-23 2023-07-18 北京字跳网络技术有限公司 Remote control method and device and electronic equipment
JP2024519297A (en) * 2021-04-30 2024-05-10 ホアウェイ・テクノロジーズ・カンパニー・リミテッド Display adjustment method and device
CN116740768B (en) * 2023-08-11 2023-10-20 南京诺源医疗器械有限公司 Navigation visualization method, system, equipment and storage medium based on nasoscope


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9665725B2 (en) * 2015-02-06 2017-05-30 Microchip Technology Incorporated Gesture based access control method and system

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1276572A (en) * 1999-06-08 2000-12-13 松下电器产业株式会社 Hand shape and gesture identifying device, identifying method and medium for recording program contg. said method
CN102968612A (en) * 2012-07-27 2013-03-13 中国工商银行股份有限公司 Bank identity identification method and system
CN104932797A (en) * 2014-03-17 2015-09-23 深圳富泰宏精密工业有限公司 Gesture unlocking method and system

Also Published As

Publication number Publication date
CN108108649A (en) 2018-06-01

Similar Documents

Publication Publication Date Title
CN108108649B (en) Identity verification method and device
US10445482B2 (en) Identity authentication method, identity authentication device, and terminal
US10169639B2 (en) Method for fingerprint template update and terminal device
US9779527B2 (en) Method, terminal device and storage medium for processing image
RU2618932C2 (en) Method, installation and device of unblocking process for terminal
US11176235B2 (en) Permission control method and related product
CN107992728B (en) Face verification method and device
EP3035283A1 (en) Image processing method and apparatus, and terminal device
WO2014180095A1 (en) Systems and methods for real human face recognition
CN106203235B (en) Living body identification method and apparatus
CN108958606B (en) Split screen display method and device, storage medium and electronic equipment
CN108650408B (en) Screen unlocking method and mobile terminal
CN109145552A (en) Information ciphering method and terminal device
CN104573437B (en) Information authentication method, device and terminal
CN110555171A (en) Information processing method, device, storage medium and system
CN108989534A (en) Message prompt method, mobile terminal and computer readable storage medium
CN109951889A (en) A kind of Internet of Things matches network method and mobile terminal
CN109495638A (en) A kind of information display method and terminal
CN108287738A (en) A kind of application control method and device
CN109766050A (en) Fingerprint identification method, terminal and computer readable storage medium
CN106095289B (en) A kind of method and terminal of unlocked by fingerprint
CN108874281A (en) A kind of application program launching method and terminal device
CN110062412B (en) Wireless pairing method, system, storage medium and mobile terminal
CN110113486A (en) A kind of moving method and terminal of application icon
CN106845413B (en) Fingerprint identification method and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant