WO2016034069A1 - Identity authentication method, apparatus, terminal and server
Identity authentication method, apparatus, terminal and server
- Publication number: WO2016034069A1 (PCT/CN2015/088215)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- user
- face
- facial
- server
- Prior art date
Classifications
- H04L63/0861 — Network security; authentication of entities using biometric features, e.g. fingerprint, retina scan
- G06F21/305 — Authentication by remotely controlling device operation
- G06F21/32 — User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
- G06F21/35 — User authentication involving external additional devices communicating wirelessly
- G06F21/45 — Structures or tools for the administration of authentication
- G06F3/017 — Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06V40/161, G06V40/164 — Human faces: detection, localisation, normalisation (using holistic features)
- G06V40/168, G06V40/169, G06V40/171 — Face feature extraction and representation (holistic features; local features and facial parts)
- G06V40/172 — Face classification, e.g. identification
- G06V40/174, G06V40/176 — Facial expression recognition; dynamic expression
- G06V40/20 — Movements or behaviour, e.g. gesture recognition
- G10L17/22 — Speaker identification or verification; interactive procedures, man-machine interfaces
- G06F2221/2101, G06F2221/2103, G06F2221/2115, G06F2221/2117 — Auditing; challenge-response; third party; user registration
- G06V30/142 — Image acquisition using hand-held instruments
Definitions
- the present application relates to the field of communications technologies, and in particular, to an identity authentication method, apparatus, terminal, and server.
- in an existing approach, the server verifies that the authentication password entered by the user is consistent with the authentication password set when the user registered, and if so confirms that the user passes identity authentication.
- however, authentication passwords are often simple combinations of numbers and letters that are easily stolen by malicious third parties. Therefore, the reliability of the existing identity authentication method is poor, and user information is easily stolen, resulting in low security of the authentication.
- the present application provides an identity authentication method, device, terminal, and server to solve the problem that the identity authentication method in the prior art is less reliable and less secure.
- according to a first aspect, an identity authentication method is provided for the terminal side.
- according to a second aspect, an identity authentication method is provided for the server side, where the gesture recognition information is obtained by the terminal by recognizing a facial gesture presented by the user according to the facial dynamic authentication prompt information.
- an identity authentication apparatus includes:
- a receiving unit configured to receive a face dynamic authentication prompt message sent by the server when the user performs identity authentication
- a recognition unit configured to obtain gesture recognition information of the face dynamic authentication prompt information by identifying a face gesture presented by the user
- a sending unit configured to send the gesture recognition information to the server, so that the server determines that the user passes identity authentication when verifying that the gesture recognition information is consistent with the face dynamic authentication prompt information.
- an identity authentication apparatus includes:
- a sending unit configured to send a face dynamic authentication prompt message to the terminal when the user performs identity authentication
- a receiving unit configured to receive the gesture recognition information sent by the terminal, where the gesture recognition information is obtained by the terminal by recognizing a facial gesture presented by the user according to the facial dynamic authentication prompt information;
- a determining unit configured to determine that the user passes identity authentication when verifying that the gesture recognition information is consistent with the face dynamic authentication prompt information.
- a terminal including: a processor; and a memory for storing processor-executable instructions; where the processor is configured to perform the terminal-side steps described above.
- a server including: a processor; and a memory for storing processor-executable instructions; where the processor is configured to perform the server-side steps described above, the gesture recognition information being information obtained by the terminal by recognizing a facial gesture presented by the user according to the facial dynamic authentication prompt information.
- in the embodiments of the present application, when the user performs identity authentication, the server sends face dynamic authentication prompt information to the terminal; the terminal obtains gesture recognition information for the prompt information by recognizing the facial gesture presented by the user and sends the gesture recognition information to the server; when the server verifies that the gesture recognition information is consistent with the face dynamic authentication prompt information, it determines that the user passes identity authentication.
- this face dynamic authentication method can perform highly secure authentication of the user identity. Compared with an existing authentication password, the authentication information cannot easily be stolen by a malicious third party, which improves the reliability of authentication; moreover, face dynamic authentication can verify that the user is a live user, thereby further improving the accuracy of identity authentication and reducing security risks in the authentication process.
- FIG. 1 is a schematic diagram of an identity authentication scenario according to an embodiment of the present application.
- FIG. 2A is a flowchart of an embodiment of an identity authentication method of the present application.
- FIG. 2B is a flowchart of another embodiment of the identity authentication method of the present application.
- FIG. 3A is a flowchart of another embodiment of an identity authentication method of the present application.
- FIG. 3B is a schematic diagram of a gesture of a human head in a face authentication process according to an embodiment of the present application.
- FIG. 4A is a flowchart of another embodiment of an identity authentication method according to the present application.
- FIG. 4B and FIG. 4C are schematic diagrams of key points of a face in an embodiment of the present application.
- FIG. 5 is a hardware structural diagram of a device where the identity authentication device of the present application is located;
- FIG. 6 is a block diagram of an embodiment of an identity authentication apparatus of the present application.
- FIG. 7 is a block diagram of another embodiment of the identity authentication apparatus of the present application.
- although the terms first, second, third, etc. may be used to describe various information in this application, such information should not be limited to these terms; these terms are only used to distinguish information of the same type from each other.
- first information may also be referred to as the second information without departing from the scope of the present application.
- second information may also be referred to as the first information.
- the word "if" as used herein may be interpreted as "when", "upon", or "in response to a determination".
- FIG. 1 it is a schematic diagram of an application scenario for implementing identity authentication according to an embodiment of the present application: a user completes identity authentication of a user by interacting with a terminal and a server, and communication between the terminal and the server may be completed based on a network.
- the network includes various wireless networks or wired networks, which is not limited in the embodiments of the present application.
- the terminal may be specifically a mobile phone, a tablet computer, a personal computer, or the like.
- two databases may be set on the server, which are a face feature information database and a face dynamic authentication prompt information database.
- the terminal may obtain the facial feature information of the registered user and send it to the server, and the server saves the facial feature information of the registered user to the face feature information database.
- face authentication may be performed first.
- the terminal sends the acquired facial feature information to the server, and the server verifies whether the facial feature information matches the facial feature information of the user saved in the facial feature information database.
- the server can perform face dynamic authentication.
- the server can return face dynamic authentication prompt information obtained from the face dynamic authentication prompt information database; the terminal recognizes the face gesture presented by the user, thereby obtaining the gesture recognition information for the prompt information, and sends the gesture recognition information to the server.
- when the server verifies that the gesture recognition information is consistent with the face dynamic authentication prompt information, it can be known that the currently authenticated user is a live user, thereby finally determining that the user passes identity authentication.
- the facial feature information of the user acquired in the face registration phase may be referred to as the second facial feature information in the embodiments of the present application, and the facial feature information of the user acquired in the face authentication phase is referred to as the first facial feature information.
- the embodiments of the present application are described in detail below.
- FIG. 2A it is a flowchart of an embodiment of an identity authentication method according to the present application. The embodiment is described from a terminal side that implements identity authentication:
- Step 201 Receive a face dynamic authentication prompt message sent by the server when the user performs identity authentication.
- the server may randomly extract face dynamic authentication prompt information from the face dynamic authentication prompt information database and return it to the terminal, where the face dynamic authentication prompt information may include at least one of the following: expression action prompt information, for example, closing the eyes, opening the mouth, or turning the head; and voice read prompt information, for example, "pay 20 yuan" or the like.
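- as an illustrative sketch (not the patent's implementation), the server-side random extraction of a prompt from the database and the later consistency check might look like the following Python, where the prompt entries and record structure are hypothetical:

```python
import random

# Hypothetical face dynamic authentication prompt database, matching the
# example prompts in the text above.
PROMPT_DB = [
    {"type": "expression", "action": "close_eyes"},
    {"type": "expression", "action": "open_mouth"},
    {"type": "expression", "action": "turn_head"},
    {"type": "voice_read", "text": "pay 20 yuan"},
]

# Per-user record of the issued challenge, so the server can later verify
# the returned gesture recognition information against it.
issued_challenges = {}

def issue_challenge(username):
    """Randomly extract a prompt and remember which user it was sent to."""
    prompt = random.choice(PROMPT_DB)
    issued_challenges[username] = prompt
    return prompt

def verify_challenge(username, gesture_recognition_info):
    """The user passes when the reported gesture matches the issued prompt."""
    prompt = issued_challenges.get(username)
    return prompt is not None and gesture_recognition_info == prompt
```

Because the prompt is chosen at random per authentication, a replayed recording of an earlier session is unlikely to match the newly issued challenge.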
- the terminal may first acquire the facial feature information of the user, use the facial feature information acquired during identity authentication as the first facial feature information of the user, and send it to the server.
- the server sends the facial dynamic authentication prompt information to the terminal when verifying that the first facial feature information matches the saved second facial feature information.
- the terminal may activate an integrated imaging device, such as a camera, to detect the face of the user, and when a face is detected, perform face tracking on the user.
- the server may search the facial feature information database according to the user name of the user to obtain the second facial feature information corresponding to the user name, and then compare the first facial feature information with the second facial feature information in a preset comparison manner; if the feature comparison value is within a preset similarity range, the server determines that the first facial feature information matches the second facial feature information. After the match is determined, the user is determined to pass face authentication, and at this time the server sends the face dynamic authentication prompt information to the terminal.
- Step 202 Obtain gesture recognition information of the face dynamic authentication prompt information by recognizing the face gesture presented by the user.
- the terminal displays the face dynamic authentication prompt information on the identity authentication interface, and the user can present the corresponding face gesture according to the information, and the terminal recognizes the face gesture.
- the terminal may perform face tracking on the user to obtain face tracking information, where the face tracking information may include at least one of facial key point position information and head posture information; the terminal then obtains the gesture recognition information of the user by analyzing the face tracking information.
- the position information of the facial key points can be used to determine whether the user closes the eyes or opens the mouth according to the expression action prompt, or to determine the mouth shape of the user when reading the voice read prompt information (the pronunciation of each word has a corresponding relationship with a mouth shape, so the gesture recognition information of the user can be determined through the mouth shape); the head posture information can be used to determine whether the user turns the head, lowers the head, and the like.
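- a minimal sketch of how tracked key points and head pose might be mapped to gesture labels; the field names and thresholds below are hypothetical illustrations, not values from the patent:

```python
def recognize_gesture(tracking):
    """Map face tracking information to coarse gesture labels.

    `tracking` is a dict with hypothetical fields:
      - mouth_gap: vertical lip distance / face height (0.0 = closed)
      - eye_gap:   eyelid distance / face height
      - yaw_deg:   head rotation left-right, in degrees
      - pitch_deg: head rotation up-down, in degrees (negative = lowered)
    """
    gestures = []
    if tracking.get("mouth_gap", 0.0) > 0.08:   # lips clearly apart
        gestures.append("open_mouth")
    if tracking.get("eye_gap", 1.0) < 0.01:     # eyelids essentially touching
        gestures.append("close_eyes")
    if abs(tracking.get("yaw_deg", 0.0)) > 25:  # head turned to one side
        gestures.append("turn_head")
    if tracking.get("pitch_deg", 0.0) < -20:    # chin dropped toward chest
        gestures.append("lower_head")
    return gestures
```

The resulting labels play the role of the gesture recognition information sent to the server for comparison with the issued prompt.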
- Step 203 Send the gesture recognition information to the server, so that the server determines that the user passes identity authentication when it verifies that the gesture recognition information is consistent with the facial dynamic authentication prompt information.
- before this step, the server may, when sending the face dynamic authentication prompt information to the terminal, record the correspondence between the user name of the user and the face dynamic authentication prompt information; in this step, after the terminal sends the gesture recognition information to the server, the server obtains the corresponding face dynamic authentication prompt information according to the user name of the user, and when it verifies that the gesture recognition information is consistent with the face dynamic authentication prompt information, it knows that the user is a live user and determines that the user passes identity authentication.
- when the prompt is voice read prompt information, the terminal can obtain the user's audio information in addition to the user's mouth shape, and obtain the voice information read by the user through voice recognition of the audio information; the server then determines whether the user passes identity authentication according to whether the voice information is consistent with the voice read prompt information.
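- for the voice read prompt, the server-side check reduces to comparing the recognized text with the issued prompt. A sketch, assuming the speech recognition step has already produced a text string (the normalization rules are illustrative, not from the patent):

```python
def normalize(text):
    # Ignore case, surrounding whitespace and punctuation so that minor
    # transcription differences do not fail the comparison.
    return "".join(ch for ch in text.lower().strip()
                   if ch.isalnum() or ch.isspace())

def voice_read_matches(recognized_text, prompt_text):
    """True when the recognized speech is consistent with the voice read prompt."""
    return normalize(recognized_text) == normalize(prompt_text)
```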
- FIG. 2B it is a flowchart of another embodiment of the identity authentication method of the present application, which is described from the server side that implements identity authentication:
- Step 211 When the user performs identity authentication, send the face dynamic authentication prompt information to the terminal.
- Step 212 Receive the gesture recognition information sent by the terminal, where the gesture recognition information is posture recognition information obtained by the terminal by recognizing a facial gesture presented by the user according to the facial dynamic authentication prompt information.
- Step 213 When verifying that the gesture recognition information is consistent with the facial dynamic authentication prompt information, determine that the user passes identity authentication.
- FIG. 2B differs from the identity authentication process shown in FIG. 2A only in the execution subject: FIG. 2A is described from the terminal side, while FIG. 2B is described from the server side. Therefore, for the related implementation process in the embodiment of FIG. 2B, reference may be made to the description of FIG. 2A above, and details are not repeated here.
- this embodiment can perform highly secure authentication of the user identity by using the face dynamic authentication mode. Compared with the existing authentication mode, the authentication information cannot easily be stolen by a malicious third party, which improves the reliability of authentication; and the dynamic authentication of the face can verify that the user is a live user, thereby further improving the accuracy of identity authentication and reducing security risks in the authentication process.
- FIG. 3A shows another embodiment of the identity authentication method of the present application, illustrating the process of face registration in detail:
- Step 301 The user registers with the server through the terminal.
- Step 302 When the terminal detects the face of the user, the terminal performs face tracking on the user.
- the terminal is integrated with a camera device, such as a camera, and can automatically start the camera device to detect the face of the user when the user registers.
- the user can face the camera device frontally.
- the terminal can track the face of the user through a face tracking algorithm. It should be noted that any existing face tracking algorithm can be used in the embodiments of the present application, and details are not described herein.
- Step 303 The terminal acquires a face image according to a preset time interval in the face tracking process.
- the terminal acquires the face image according to the preset time interval by the camera device, and the time interval is set to avoid extracting substantially the same face image.
- the preset time interval may be 3 seconds.
- Step 304 Determine whether the sharpness of the face image meets a preset sharpness threshold. If yes, execute step 305; otherwise, end the current process.
- the sharpness may be judged first to exclude face images with insufficient sharpness.
- the terminal can invoke a preset blur judgment function to determine whether the sharpness of the face image satisfies the sharpness threshold.
- the blur judgment function can adopt a blur judgment function in existing image recognition processing technology, which is not limited in the embodiments of the present application. For a face image satisfying the sharpness threshold, step 305 is performed; a face image that does not satisfy the sharpness threshold is directly discarded, and the process returns to step 303.
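- one common blur judgment function (offered here only as an illustrative stand-in, since the patent does not specify one) is the variance of a Laplacian filter response: sharp images have strong edge energy, blurry ones do not:

```python
def laplacian_variance(gray):
    """Blur judgment via the variance of a 3x3 Laplacian response.

    `gray` is a 2D list of grayscale pixel values. Higher variance means
    more edge energy, i.e. a sharper image; a very blurry image scores low.
    """
    h, w = len(gray), len(gray[0])
    responses = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # 4-neighbour Laplacian kernel
            lap = (gray[y - 1][x] + gray[y + 1][x] +
                   gray[y][x - 1] + gray[y][x + 1] - 4 * gray[y][x])
            responses.append(lap)
    mean = sum(responses) / len(responses)
    return sum((r - mean) ** 2 for r in responses) / len(responses)

def is_sharp_enough(gray, threshold):
    """Step 304 style check: keep the image only if it meets the threshold."""
    return laplacian_variance(gray) >= threshold
```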
- Step 305 The terminal extracts the head posture information from the face image.
- after determining in step 304 that the acquired face image is a sufficiently sharp face image, the terminal extracts the head posture information from the face image.
- the head posture information in this embodiment may include at least one of the following angles: a head-lowering angle, a side-face angle, and a head-tilt angle.
- Step 306 The terminal determines whether each angle included in the gesture information of the human head is within a preset angle range. If yes, step 307 is performed; otherwise, the current flow ends.
- whether the face image is a front face image of the user can be determined through the head posture information: the terminal can determine whether each angle included in the head posture information is within a preset angle range, for example, 0 to 10 degrees.
- for a face image whose head posture information yields a "yes" determination, step 307 is executed; a face image whose head posture information yields a "no" determination is discarded, and the process returns to step 303.
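- the front-face filter above amounts to a simple range check over the pose angles; a sketch with hypothetical angle names, using the 0-10 degree example range:

```python
def is_front_face(pose_angles, max_angle=10.0):
    """Return True when every head posture angle lies within the preset
    range (0 to `max_angle` degrees), i.e. the image is close enough to
    a front face to proceed to feature extraction (step 307)."""
    return all(0.0 <= angle <= max_angle for angle in pose_angles.values())
```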
- Step 307 The terminal extracts facial feature information of the user from the face image.
- an LBP (Local Binary Pattern) feature extraction algorithm may be used to extract a face feature vector value from the face image as the facial feature information of the user.
- in addition, a face feature extraction algorithm used in any existing image processing technology can be applied to the embodiments of the present application, for example, a windowed Fourier transform (Gabor) feature extraction algorithm, etc.
- the facial feature information of the user may be extracted from a plurality of face images, and the number of face images may be preset, for example, five; correspondingly, according to the set number of face images, steps 303 to 307 may be executed cyclically to acquire the preset number of face images and extract facial feature information from them.
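- to make the LBP idea concrete, here is a toy version (not a production face feature extractor): each interior pixel is encoded by comparing it with its 8 neighbours, and the histogram of the resulting 8-bit codes serves as the feature vector:

```python
def lbp_histogram(gray):
    """Toy Local Binary Pattern feature extraction over a 2D grayscale list.

    Each interior pixel yields an 8-bit code: bit i is set when the i-th
    neighbour (clockwise from the top-left) is >= the centre pixel. The
    256-bin histogram of codes is returned as the feature vector.
    """
    # Clockwise neighbour offsets starting at the top-left corner
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    hist = [0] * 256
    h, w = len(gray), len(gray[0])
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            centre = gray[y][x]
            code = 0
            for bit, (dy, dx) in enumerate(offsets):
                if gray[y + dy][x + dx] >= centre:
                    code |= 1 << bit
            hist[code] += 1
    return hist
```

Real systems typically compute such histograms per image block and concatenate them, but the per-pixel encoding is the same.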
- Step 308 The terminal sends the face feature information to the server.
- Step 309 The server saves the correspondence between the user name of the registered user and the facial feature information, and ends the current process.
- after receiving the facial feature information sent by the terminal, the server may save the correspondence between the user name of the registered user and the facial feature information in the facial feature information database.
- when facial feature information has been extracted from a plurality of face images, the correspondence between the user name and the plurality of pieces of facial feature information is saved.
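- the registration-side storage described above can be sketched as a mapping from user name to feature vectors; the in-memory structures below are illustrative stand-ins for the two server-side databases:

```python
# Illustrative stand-ins for the facial feature information database and
# the face dynamic authentication prompt information database.
face_feature_db = {}  # user name -> list of facial feature vectors
prompt_db = ["close_eyes", "open_mouth", "turn_head", "read: pay 20 yuan"]

def register_user(username, feature_vectors):
    """Save the correspondence between the registered user name and the
    (possibly multiple) pieces of facial feature information."""
    face_feature_db[username] = list(feature_vectors)

def lookup_features(username):
    """Return the second facial feature information used later during
    authentication; empty if the user never registered."""
    return face_feature_db.get(username, [])
```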
- FIG. 4A shows another embodiment of the identity authentication method of the present application, which is based on the face registration process shown in FIG. 3A and describes the process of authenticating a user in detail:
- Step 401 Start identity authentication for the user.
- Step 402 The terminal acquires first facial feature information of the user.
- the manner in which the terminal obtains the facial feature information of the user is consistent with the manner of acquiring facial feature information in the face registration process, specifically steps 302 to 307 shown in FIG. 3A, and details are not repeated here.
- the terminal may acquire at least one first facial feature information.
- Step 403 The terminal sends the first facial feature information of the user to the server.
- Step 404 The server verifies whether the first facial feature information matches the saved second facial feature information of the user. If yes, step 405 is performed; otherwise, the current process is ended.
- the server may search the facial feature information database according to the user name of the user to obtain the second facial feature information corresponding to the user name, and then compare the first facial feature information with the second facial feature information in a preset comparison manner. If the feature comparison value is within a preset similarity range, the first facial feature information and the second facial feature information may be determined to match.
- the first facial feature information and the second facial feature information may be compared using the Euclidean distance comparison method: the sum of the squared differences between the second face feature vector and the first face feature vector is calculated, and if this sum of squares is less than a preset threshold, it may be determined that the identity authentication is being performed by the user himself or herself;
- alternatively, the first facial feature information and the second facial feature information may be compared using the cosine distance comparison method: if the first face feature vector is V1 and the second face feature vector is V2, the value V1·V2/(|V1|·|V2|) may be calculated, and if this value is within the preset similarity range, it may be determined that the identity authentication is being performed by the user himself or herself.
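Both comparison manners described above can be illustrated in a few lines. The thresholds below are placeholders, since the patent leaves the similarity range as a preset value:

```python
import math

def euclidean_match(v1, v2, threshold=0.5):
    """Sum of squared differences between the two feature vectors;
    a small enough value is treated as a match."""
    sq_sum = sum((a - b) ** 2 for a, b in zip(v1, v2))
    return sq_sum < threshold

def cosine_match(v1, v2, threshold=0.9):
    """Cosine similarity V1.V2 / (|V1||V2|); a value close to 1 means similar."""
    dot = sum(a * b for a, b in zip(v1, v2))
    norm = math.sqrt(sum(a * a for a in v1)) * math.sqrt(sum(b * b for b in v2))
    return dot / norm > threshold

first = [0.11, 0.79, 0.30]   # feature vector captured at authentication time
second = [0.12, 0.80, 0.33]  # feature vector saved at registration time
print(euclidean_match(first, second))  # True
print(cosine_match(first, second))     # True
```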
- Step 405 The server sends the face dynamic authentication prompt information to the terminal.
- the server may randomly extract a face dynamic authentication prompt information from the face dynamic authentication prompt information database.
- the face dynamic authentication prompt information may include expression action prompt information or voice reading prompt information.
- the expression action prompt information usually prompts an action that the user can easily present through a facial gesture, for example, opening the mouth, closing the eyes, or turning the head.
- the voice reading prompt information is usually short, so that the user can read it aloud at the time of authentication and the terminal can conveniently recognize the facial gesture presented while the user reads.
- Step 406 The terminal obtains face tracking information by performing face tracking on the user.
- the terminal may output the face dynamic authentication prompt information on the authentication interface, and the user may present the corresponding face gesture according to the information.
- during this process, the terminal acquires the face tracking information of the user through a face tracking algorithm. The face tracking information may include at least one of the following: face key point position information and head posture information.
- Step 407 The terminal analyzes the face tracking information to obtain the gesture recognition information of the user.
- FIG. 4B and FIG. 4C are schematic diagrams of position information of facial key points in the embodiment of the present application.
- FIG. 4B shows the key point position information of the user's mouth extracted in a normal state,
- and FIG. 4C shows the key point position information extracted when the user presents an "open mouth" posture. By comparing the key point position information extracted in FIG. 4B and FIG. 4C, that is, by comparing the coordinate distance between the two key points above and below the mouth, the gesture recognition information of the user can be obtained as "open mouth."
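The "open mouth" decision described above reduces to comparing the vertical distance between the upper-lip and lower-lip key points across the two frames. The coordinates and the ratio threshold below are illustrative assumptions:

```python
def mouth_open(neutral_top, neutral_bottom, current_top, current_bottom, ratio=1.5):
    """Compare the lip key-point distance against the neutral-state distance.

    Each argument is an (x, y) key-point coordinate; a gap markedly larger
    than in the neutral frame is recognized as the "open mouth" gesture.
    """
    neutral_gap = abs(neutral_bottom[1] - neutral_top[1])
    current_gap = abs(current_bottom[1] - current_top[1])
    return current_gap > ratio * neutral_gap

# Neutral frame (FIG. 4B-style) vs. prompted frame (FIG. 4C-style).
print(mouth_open((100, 200), (100, 210), (100, 198), (100, 230)))  # True
```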
- the terminal may also obtain head posture information by performing face tracking on the user, as shown in FIG. 3B:
- three angles are obtained (a head lowering/raising angle, a side face angle, and a head tilting angle), and if the values of the three angles satisfy the angle ranges defined for "turning the head", the gesture recognition information of the user can be obtained as "turning the head".
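The head-turn check can likewise be sketched as testing each extracted head-pose angle against a preset range. The angle names follow the description, while the numeric ranges are invented purely for illustration:

```python
# Preset ranges (degrees) that define the "turn head" gesture -- illustrative only.
TURN_HEAD_RANGES = {
    "lower_raise_angle": (-15.0, 15.0),  # head should stay roughly level
    "side_face_angle":   (30.0, 90.0),   # face clearly rotated to the side
    "tilt_angle":        (-20.0, 20.0),
}

def recognize_turn_head(head_pose):
    """head_pose maps each angle name to a measured value in degrees."""
    return all(lo <= head_pose[name] <= hi
               for name, (lo, hi) in TURN_HEAD_RANGES.items())

pose = {"lower_raise_angle": 3.0, "side_face_angle": 45.0, "tilt_angle": -5.0}
print("turn head" if recognize_turn_head(pose) else "no gesture")  # turn head
```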
- Step 408 The terminal sends the gesture recognition information to the server.
- Step 409 The server verifies whether the gesture recognition information is consistent with the face dynamic authentication prompt information. If yes, step 410 is performed; otherwise, the current flow is ended.
- Step 410 The server determines that the user passes the identity authentication and ends the current process.
- the embodiment combines face authentication and dynamic authentication to perform high-security authentication of the user identity. The face authentication can initially verify whether the user is the registered user; compared with existing password authentication, the authentication information is not easily stolen by a malicious third party, which improves the reliability of the authentication. On the basis of the user being confirmed as the registered user, the face dynamic authentication can further verify that the user is a living user, thereby improving the accuracy of identity authentication and reducing security risks in the authentication process.
- the present application also provides an embodiment of an identity authentication device, a terminal, and a server.
- Embodiments of the identity authentication apparatus of the present application can be applied to terminals and servers, respectively.
- the device embodiment may be implemented by software, by hardware, or by a combination of hardware and software. Taking software implementation as an example, as a logical apparatus, it is formed by the processor of the device in which it is located reading the corresponding computer program instructions from non-volatile memory into memory and running them. At the hardware level, FIG. 5 shows a hardware structure diagram of the device in which the identity authentication apparatus is located. Besides the processor, memory, network interface, and non-volatile memory shown in FIG. 5, the device in which the apparatus is located may also include other hardware according to its actual function: for example, the terminal may include a camera, a touch screen, and a communication component, and the server may include a forwarding chip responsible for processing packets, and so on.
- the identity authentication apparatus may be applied to a terminal, and the apparatus includes: a receiving unit 610, an identifying unit 620, and a sending unit 630.
- the receiving unit 610 is configured to receive the facial dynamic authentication prompt information sent by the server when the user performs identity authentication.
- the identifying unit 620 is configured to obtain the gesture recognition information of the facial dynamic authentication prompt information by identifying a facial gesture presented by the user;
- the sending unit 630 is configured to send the gesture identification information to the server, so that the server determines that the user passes the identity authentication when verifying that the gesture identification information is consistent with the face dynamic authentication prompt information.
- the identification unit 620 can include (not shown in FIG. 6):
- a face information obtaining sub-unit configured to obtain face tracking information by performing face tracking on the user when the user presents a face gesture according to the face dynamic authentication prompt information
- the face information analysis subunit is configured to analyze the face tracking information to obtain the gesture identification information of the user.
- the face information analysis subunit may be specifically configured to: when the face tracking information is face key point position information, obtain the expression gesture recognition information of the user by analyzing the face key point position information; and when the face tracking information is head posture information, obtain the head rotation recognition information of the user by analyzing the head posture information.
- the face dynamic authentication prompt information may include at least one of the following information: an expression action prompt information, and a voice read prompt information.
- the apparatus may also include (not shown in Figure 6):
- An acquiring unit configured to acquire facial feature information of the user, and use the facial feature information acquired during the identity authentication as the first facial feature information of the user;
- the sending unit 630 may be further configured to send the first facial feature information of the user to the server, so that the server sends the face dynamic authentication prompt information when verifying that the first facial feature information matches the saved second facial feature information of the user.
- the acquiring unit may be further configured to: when the user performs registration, acquire facial feature information of the user, and use the facial feature information acquired during the registration as the second facial feature information of the user.
- the sending unit 630 is further configured to send the second facial feature information to the server, so that the server saves the correspondence between the user name of the user and the second facial feature information.
- the acquiring unit may include: a face tracking subunit, configured to perform face tracking on the user when the face of the user is detected; an image obtaining subunit, configured to acquire face images at preset time intervals during the face tracking; a condition determining subunit, configured to determine whether a face image satisfies a preset feature extraction condition; and a feature extraction subunit, configured to extract the facial feature information of the user from the face image when the feature extraction condition is satisfied.
- the condition determining subunit may further include: a clarity determining module, configured to determine whether the clarity of the face image meets a preset clarity threshold; a posture information extracting module, configured to extract head posture information from the face image if the clarity threshold is met, the head posture information including at least one of the following angles: a head lowering/raising angle, a side face angle, and a head tilting angle; an angle determining module, configured to determine whether each angle included in the head posture information is within a preset angle range; and a determination module, configured to determine that the face image satisfies the feature extraction condition if each angle is within the preset angle range.
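The condition chain above (clarity first, then head-pose angles) can be expressed as a short predicate. The thresholds are placeholders, since the patent leaves them as preset values:

```python
CLARITY_THRESHOLD = 0.6      # placeholder sharpness score in [0, 1]
ANGLE_RANGE = (-10.0, 10.0)  # placeholder acceptable range for every angle (degrees)

def satisfies_extraction_condition(clarity, head_pose_angles):
    """Return True when a face image is usable for feature extraction.

    clarity: a sharpness score for the image.
    head_pose_angles: measured lowering/raising, side-face and tilting angles.
    """
    if clarity < CLARITY_THRESHOLD:
        return False  # too blurry: skip this frame
    lo, hi = ANGLE_RANGE
    return all(lo <= a <= hi for a in head_pose_angles)

print(satisfies_extraction_condition(0.8, [2.0, -3.5, 1.0]))  # True
print(satisfies_extraction_condition(0.4, [0.0, 0.0, 0.0]))   # False (blurry)
```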
- the feature extraction subunit may be specifically configured to extract, using a preset feature extraction algorithm, a face feature vector value from the face image as the facial feature information of the user; the preset feature extraction algorithm may include a local binary pattern (LBP) feature extraction algorithm, a windowed Fourier transform (Gabor) feature extraction algorithm, or the like.
- the identity authentication apparatus may be applied to a server, and the apparatus includes: a sending unit 710, a receiving unit 720, and a determining unit 730.
- the sending unit 710 is configured to send the face dynamic authentication prompt information to the terminal when the user performs identity authentication.
- the receiving unit 720 is configured to receive the gesture recognition information sent by the terminal, where the gesture recognition information is obtained by the terminal by recognizing a facial gesture presented by the user according to the facial dynamic authentication prompt information;
- the determining unit 730 is configured to determine that the user passes the identity authentication when verifying that the gesture identification information is consistent with the facial dynamic authentication prompt information.
- the receiving unit 720 is further configured to receive the first facial feature information of the user that is sent by the terminal;
- the device may further include: (not shown in FIG. 7): a verification unit, configured to verify whether the first facial feature information matches the saved second facial feature information of the user;
- the sending unit 710 may be specifically configured to send the face dynamic authentication prompt information to the terminal when the first and second facial feature information match.
- the receiving unit 720 is further configured to: when the user performs registration, receive the second facial feature information of the user sent by the terminal; the device may further include a saving unit, configured to save the correspondence between the user name of the user and the second facial feature information.
- the verification unit may include: a feature search subunit, configured to search the correspondence according to the user name of the user to obtain the second facial feature information corresponding to the user name; a feature comparison subunit, configured to compare the first facial feature information and the second facial feature information in a preset comparison manner; and a matching determination subunit, configured to determine that the first facial feature information matches the second facial feature information if the feature comparison value is within a preset similarity range.
- the preset comparison manner that the feature comparison subunit can adopt includes the Euclidean distance comparison method or the cosine distance comparison method.
- since the device embodiments basically correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant details.
- the device embodiments described above are merely illustrative. The units described as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the modules may be selected according to actual needs to achieve the objectives of the present application. Those of ordinary skill in the art can understand and implement the solution without creative effort.
- as can be seen from the above embodiments, the face dynamic authentication method can perform high-security authentication of the user identity. Compared with existing authentication modes, the authentication information is not easily stolen by a malicious third party, which improves the reliability of the authentication, and the face dynamic authentication can verify that the user is a living user, thereby further improving the accuracy of identity authentication and reducing security risks in the authentication process.
Abstract
Description
Claims (25)
- An identity authentication method, wherein the method comprises: when a user performs identity authentication, receiving face dynamic authentication prompt information sent by a server; obtaining gesture recognition information for the face dynamic authentication prompt information by recognizing a facial gesture presented by the user; and sending the gesture recognition information to the server, so that the server determines that the user passes the identity authentication when verifying that the gesture recognition information is consistent with the face dynamic authentication prompt information.
- The method according to claim 1, wherein obtaining the gesture recognition information for the face dynamic authentication prompt information by recognizing the facial gesture presented by the user comprises: when the user presents a facial gesture according to the face dynamic authentication prompt information, obtaining face tracking information by performing face tracking on the user; and analyzing the face tracking information to obtain the gesture recognition information of the user.
- The method according to claim 2, wherein analyzing the face tracking information to obtain the gesture recognition information of the user comprises: when the face tracking information is face key point position information, obtaining expression gesture recognition information of the user by analyzing the face key point position information; and when the face tracking information is head posture information, obtaining head rotation recognition information of the user by analyzing the head posture information.
- The method according to any one of claims 1 to 3, wherein the face dynamic authentication prompt information comprises at least one of the following: expression action prompt information and voice reading prompt information.
- The method according to claim 1, wherein before receiving the face dynamic authentication prompt information sent by the server, the method further comprises: acquiring facial feature information of the user, and using the facial feature information acquired during the identity authentication as first facial feature information of the user; and sending the first facial feature information of the user to the server, so that the server sends the face dynamic authentication prompt information when verifying that the first facial feature information matches saved second facial feature information of the user.
- The method according to claim 5, wherein the method further comprises: when the user performs registration, acquiring facial feature information of the user, and using the facial feature information acquired during the registration as the second facial feature information of the user; and sending the second facial feature information to the server, so that the server saves a correspondence between a user name of the user and the second facial feature information.
- The method according to claim 5 or 6, wherein acquiring the facial feature information of the user comprises: performing face tracking on the user when the face of the user is detected; acquiring face images at preset time intervals during the face tracking; determining whether a face image satisfies a preset feature extraction condition; and if the feature extraction condition is satisfied, extracting the facial feature information of the user from the face image.
- The method according to claim 7, wherein determining whether the face image satisfies the preset feature extraction condition comprises: determining whether the clarity of the face image meets a preset clarity threshold; if the clarity threshold is met, extracting head posture information from the face image, the head posture information comprising at least one of the following angles: a head lowering/raising angle, a side face angle, and a head tilting angle; determining whether each angle included in the head posture information is within a preset angle range; and if so, determining that the face image satisfies the feature extraction condition.
- An identity authentication method, wherein the method comprises: when a user performs identity authentication, sending face dynamic authentication prompt information to a terminal; receiving gesture recognition information sent by the terminal, the gesture recognition information being obtained by the terminal by recognizing a facial gesture presented by the user according to the face dynamic authentication prompt information; and when verifying that the gesture recognition information is consistent with the face dynamic authentication prompt information, determining that the user passes the identity authentication.
- The method according to claim 9, wherein before sending the face dynamic authentication prompt information to the terminal, the method further comprises: receiving first facial feature information of the user sent by the terminal; verifying whether the first facial feature information matches saved second facial feature information of the user; and if they match, performing the sending of the face dynamic authentication prompt information to the terminal.
- The method according to claim 10, wherein the method further comprises: when the user performs registration, receiving the second facial feature information of the user sent by the terminal; and saving a correspondence between a user name of the user and the second facial feature information.
- The method according to claim 11, wherein verifying whether the first facial feature information matches the saved second facial feature information of the user comprises: searching the correspondence according to the user name of the user to obtain the second facial feature information corresponding to the user name; comparing the first facial feature information and the second facial feature information in a preset comparison manner; and if the feature comparison value is within a preset similarity range, determining that the first facial feature information matches the second facial feature information.
- An identity authentication apparatus, wherein the apparatus comprises: a receiving unit, configured to receive face dynamic authentication prompt information sent by a server when a user performs identity authentication; a recognition unit, configured to obtain gesture recognition information for the face dynamic authentication prompt information by recognizing a facial gesture presented by the user; and a sending unit, configured to send the gesture recognition information to the server, so that the server determines that the user passes the identity authentication when verifying that the gesture recognition information is consistent with the face dynamic authentication prompt information.
- The apparatus according to claim 13, wherein the recognition unit comprises: a face information obtaining subunit, configured to obtain face tracking information by performing face tracking on the user when the user presents a facial gesture according to the face dynamic authentication prompt information; and a face information analysis subunit, configured to analyze the face tracking information to obtain the gesture recognition information of the user.
- The apparatus according to claim 14, wherein the face information analysis subunit is specifically configured to: when the face tracking information is face key point position information, obtain expression gesture recognition information of the user by analyzing the face key point position information; and when the face tracking information is head posture information, obtain head rotation recognition information of the user by analyzing the head posture information.
- The apparatus according to claim 13, wherein the apparatus further comprises: an acquiring unit, configured to acquire facial feature information of the user, and use the facial feature information acquired during the identity authentication as first facial feature information of the user; and the sending unit is further configured to send the first facial feature information of the user to the server, so that the server sends the face dynamic authentication prompt information when verifying that the first facial feature information matches saved second facial feature information of the user.
- The apparatus according to claim 16, wherein the acquiring unit is further configured to: when the user performs registration, acquire facial feature information of the user, and use the facial feature information acquired during the registration as the second facial feature information of the user; and the sending unit is further configured to send the second facial feature information to the server, so that the server saves a correspondence between a user name of the user and the second facial feature information.
- The apparatus according to claim 16 or 17, wherein the acquiring unit comprises: a face tracking subunit, configured to perform face tracking on the user when the face of the user is detected; an image obtaining subunit, configured to acquire face images at preset time intervals during the face tracking; a condition determining subunit, configured to determine whether a face image satisfies a preset feature extraction condition; and a feature extraction subunit, configured to extract the facial feature information of the user from the face image if the feature extraction condition is satisfied.
- The apparatus according to claim 18, wherein the condition determining subunit comprises: a clarity determining module, configured to determine whether the clarity of the face image meets a preset clarity threshold; a posture information extracting module, configured to extract head posture information from the face image if the clarity threshold is met, the head posture information comprising at least one of the following angles: a head lowering/raising angle, a side face angle, and a head tilting angle; an angle determining module, configured to determine whether each angle included in the head posture information is within a preset angle range; and a determination module, configured to determine that the face image satisfies the feature extraction condition if each angle is within the preset angle range.
- An identity authentication apparatus, wherein the apparatus comprises: a sending unit, configured to send face dynamic authentication prompt information to a terminal when a user performs identity authentication; a receiving unit, configured to receive gesture recognition information sent by the terminal, the gesture recognition information being obtained by the terminal by recognizing a facial gesture presented by the user according to the face dynamic authentication prompt information; and a determining unit, configured to determine that the user passes the identity authentication when verifying that the gesture recognition information is consistent with the face dynamic authentication prompt information.
- The apparatus according to claim 20, wherein the receiving unit is further configured to receive first facial feature information of the user sent by the terminal; the apparatus further comprises: a verification unit, configured to verify whether the first facial feature information matches saved second facial feature information of the user; and the sending unit is specifically configured to send the face dynamic authentication prompt information to the terminal when they match.
- The apparatus according to claim 21, wherein the receiving unit is further configured to: when the user performs registration, receive the second facial feature information of the user sent by the terminal; and the apparatus further comprises: a saving unit, configured to save a correspondence between a user name of the user and the second facial feature information.
- The apparatus according to claim 22, wherein the verification unit comprises: a feature search subunit, configured to search the correspondence according to the user name of the user to obtain the second facial feature information corresponding to the user name; a feature comparison subunit, configured to compare the first facial feature information and the second facial feature information in a preset comparison manner; and a matching determination subunit, configured to determine that the first facial feature information matches the second facial feature information if the feature comparison value is within a preset similarity range.
- A terminal, comprising: a processor; and a memory for storing instructions executable by the processor; wherein the processor is configured to: when a user performs identity authentication, receive face dynamic authentication prompt information sent by a server; obtain gesture recognition information for the face dynamic authentication prompt information by recognizing a facial gesture presented by the user; and send the gesture recognition information to the server, so that the server determines that the user passes the identity authentication when verifying that the gesture recognition information is consistent with the face dynamic authentication prompt information.
- A server, comprising: a processor; and a memory for storing instructions executable by the processor; wherein the processor is configured to: when a user performs identity authentication, send face dynamic authentication prompt information to a terminal; receive gesture recognition information sent by the terminal, the gesture recognition information being obtained by the terminal by recognizing a facial gesture presented by the user according to the face dynamic authentication prompt information; and when verifying that the gesture recognition information is consistent with the face dynamic authentication prompt information, determine that the user passes the identity authentication.
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017512314A JP6820062B2 (ja) | 2014-09-03 | 2015-08-27 | アイデンティティ認証方法ならびに装置、端末及びサーバ |
SG11201701497SA SG11201701497SA (en) | 2014-09-03 | 2015-08-27 | Identity authentication method and apparatus, terminal and server |
KR1020177005848A KR101997371B1 (ko) | 2014-09-03 | 2015-08-27 | 신원 인증 방법 및 장치, 단말기 및 서버 |
EP15838136.8A EP3190534B1 (en) | 2014-09-03 | 2015-08-27 | Identity authentication method and apparatus, terminal and server |
EP19172346.9A EP3540621B1 (en) | 2014-09-03 | 2015-08-27 | Identity authentication method and apparatus, terminal and server |
PL19172346T PL3540621T3 (pl) | 2014-09-03 | 2015-08-27 | Sposób oraz urządzenie do uwierzytelniania tożsamości, terminal i serwer |
US15/448,534 US10601821B2 (en) | 2014-09-03 | 2017-03-02 | Identity authentication method and apparatus, terminal and server |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410446657.0 | 2014-09-03 | ||
CN201410446657.0A CN105468950B (zh) | 2014-09-03 | 2014-09-03 | 身份认证方法、装置、终端及服务器 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/448,534 Continuation US10601821B2 (en) | 2014-09-03 | 2017-03-02 | Identity authentication method and apparatus, terminal and server |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016034069A1 true WO2016034069A1 (zh) | 2016-03-10 |
Family
ID=55439122
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2015/088215 WO2016034069A1 (zh) | 2014-09-03 | 2015-08-27 | 身份认证方法、装置、终端及服务器 |
Country Status (10)
Country | Link |
---|---|
US (1) | US10601821B2 (zh) |
EP (2) | EP3540621B1 (zh) |
JP (1) | JP6820062B2 (zh) |
KR (1) | KR101997371B1 (zh) |
CN (2) | CN105468950B (zh) |
ES (1) | ES2810012T3 (zh) |
HK (1) | HK1221795A1 (zh) |
PL (1) | PL3540621T3 (zh) |
SG (2) | SG10201901818UA (zh) |
WO (1) | WO2016034069A1 (zh) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110460993A (zh) * | 2019-08-21 | 2019-11-15 | 广州大学 | 一种基于手势验证的认证方法及系统 |
Families Citing this family (56)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10929550B2 (en) | 2015-04-30 | 2021-02-23 | Masaaki Tokuyama | Terminal device and computer program |
US10452823B2 (en) * | 2015-04-30 | 2019-10-22 | Masaaki Tokuyama | Terminal device and computer program |
CN106302330B (zh) * | 2015-05-21 | 2021-01-05 | 腾讯科技(深圳)有限公司 | 身份验证方法、装置和系统 |
CN105893828A (zh) * | 2016-05-05 | 2016-08-24 | 南京甄视智能科技有限公司 | 基于移动终端的人脸验证驾考系统与方法 |
CN107644190A (zh) * | 2016-07-20 | 2018-01-30 | 北京旷视科技有限公司 | 行人监控方法和装置 |
CN106228133B (zh) * | 2016-07-21 | 2020-04-10 | 北京旷视科技有限公司 | 用户验证方法及装置 |
CN107819807A (zh) * | 2016-09-14 | 2018-03-20 | 腾讯科技(深圳)有限公司 | 一种信息验证方法、装置和设备 |
CN108629260B (zh) * | 2017-03-17 | 2022-02-08 | 北京旷视科技有限公司 | 活体验证方法和装置及存储介质 |
CN106971163A (zh) * | 2017-03-28 | 2017-07-21 | 深圳市校联宝科技有限公司 | 一种接领人识别方法、装置和系统 |
CN109844747A (zh) * | 2017-04-01 | 2019-06-04 | 深圳市大疆创新科技有限公司 | 身份认证服务器、身份认证终端、身份认证系统及方法 |
CN107045744A (zh) * | 2017-04-14 | 2017-08-15 | 特斯联(北京)科技有限公司 | 一种智能别墅门禁认证方法及系统 |
CN107066983B (zh) * | 2017-04-20 | 2022-08-09 | 腾讯科技(上海)有限公司 | 一种身份验证方法及装置 |
US11080316B1 (en) * | 2017-05-26 | 2021-08-03 | Amazon Technologies, Inc. | Context-inclusive face clustering |
CN107707738A (zh) | 2017-09-07 | 2018-02-16 | 维沃移动通信有限公司 | 一种人脸识别方法及移动终端 |
CN108229120B (zh) * | 2017-09-07 | 2020-07-24 | 北京市商汤科技开发有限公司 | 人脸解锁及其信息注册方法和装置、设备、程序、介质 |
CN107562204B (zh) * | 2017-09-14 | 2021-10-01 | 深圳Tcl新技术有限公司 | 电视交互方法、电视及计算机可读存储介质 |
CN109670386A (zh) * | 2017-10-16 | 2019-04-23 | 深圳泰首智能技术有限公司 | 人脸识别方法及终端 |
CN107818253B (zh) * | 2017-10-18 | 2020-07-17 | Oppo广东移动通信有限公司 | 人脸模板数据录入控制方法及相关产品 |
CN107993174A (zh) * | 2017-11-06 | 2018-05-04 | 国政通科技股份有限公司 | 一种多功能便民服务系统 |
US10594690B2 (en) * | 2017-11-16 | 2020-03-17 | Bank Of America Corporation | Authenticating access to a computing resource using facial recognition based on involuntary facial movement |
CN107944380B (zh) * | 2017-11-20 | 2022-11-29 | 腾讯科技(深圳)有限公司 | 身份识别方法、装置及存储设备 |
CN107798548A (zh) * | 2017-11-27 | 2018-03-13 | 甘平安 | 一种购买方法及购买系统 |
US10924476B2 (en) * | 2017-11-29 | 2021-02-16 | Ncr Corporation | Security gesture authentication |
CN108229937A (zh) * | 2017-12-20 | 2018-06-29 | 阿里巴巴集团控股有限公司 | 基于增强现实的虚拟对象分配方法及装置 |
CN108171026A (zh) * | 2018-01-19 | 2018-06-15 | 百度在线网络技术(北京)有限公司 | 鉴权方法和装置 |
US11366884B2 (en) | 2018-02-14 | 2022-06-21 | American Express Travel Related Services Company, Inc. | Authentication challenges based on fraud initiation requests |
JP6859970B2 (ja) * | 2018-03-09 | 2021-04-14 | 京セラドキュメントソリューションズ株式会社 | ログイン支援システム |
CN108734084A (zh) * | 2018-03-21 | 2018-11-02 | 百度在线网络技术(北京)有限公司 | 人脸注册方法和装置 |
CN108446664A (zh) * | 2018-03-30 | 2018-08-24 | 广东华电网维信息科技有限公司 | 一种基于人脸识别的身份确认方法及装置 |
CN108712381A (zh) * | 2018-04-16 | 2018-10-26 | 出门问问信息科技有限公司 | 一种身份验证方法及装置 |
US10733676B2 (en) * | 2018-05-17 | 2020-08-04 | Coupa Software Incorporated | Automatic generation of expense data using facial recognition in digitally captured photographic images |
CN110555330A (zh) * | 2018-05-30 | 2019-12-10 | 百度在线网络技术(北京)有限公司 | 图像面签方法、装置、计算机设备及存储介质 |
CN109165614A (zh) * | 2018-08-31 | 2019-01-08 | 杭州行开科技有限公司 | 基于3d摄像头的人脸识别系统 |
CN109065058B (zh) * | 2018-09-30 | 2024-03-15 | 合肥鑫晟光电科技有限公司 | 语音通信方法、装置及系统 |
US10956548B2 (en) * | 2018-10-09 | 2021-03-23 | Lenovo (Singapore) Pte. Ltd. | User authentication via emotion detection |
CN111104658A (zh) * | 2018-10-25 | 2020-05-05 | 北京嘀嘀无限科技发展有限公司 | 注册方法及装置、认证方法及装置 |
CN109376684B (zh) | 2018-11-13 | 2021-04-06 | 广州市百果园信息技术有限公司 | 一种人脸关键点检测方法、装置、计算机设备和存储介质 |
JP7054847B2 (ja) * | 2019-03-04 | 2022-04-15 | パナソニックIpマネジメント株式会社 | 顔認証登録装置および顔認証登録方法 |
US10860705B1 (en) | 2019-05-16 | 2020-12-08 | Capital One Services, Llc | Augmented reality generated human challenge |
KR20210009596A (ko) * | 2019-07-17 | 2021-01-27 | 엘지전자 주식회사 | 지능적 음성 인식 방법, 음성 인식 장치 및 지능형 컴퓨팅 디바이스 |
CN110490106B (zh) * | 2019-08-06 | 2022-05-03 | 万翼科技有限公司 | 信息管理方法及相关设备 |
CN110611734A (zh) * | 2019-08-08 | 2019-12-24 | 深圳传音控股股份有限公司 | 交互方法及终端 |
CN110765436A (zh) * | 2019-10-26 | 2020-02-07 | 福建省伟志地理信息科学研究院 | 一种不动产信息分析管理系统和方法 |
CN111144896A (zh) * | 2019-12-16 | 2020-05-12 | 中国银行股份有限公司 | 一种身份验证方法及装置 |
US12033428B2 (en) | 2020-02-04 | 2024-07-09 | Grabtaxi Holdings Pte. Ltd. | Method, server and communication system of verifying user for transportation purposes |
KR102531579B1 (ko) * | 2020-11-18 | 2023-05-16 | 주식회사 코밴 | 이중 인증을 통한 간편 결제 방법 |
CN112487389A (zh) * | 2020-12-16 | 2021-03-12 | 熵基科技股份有限公司 | 一种身份认证方法、装置和设备 |
CN112383571B (zh) * | 2021-01-12 | 2021-06-04 | 浙江正元智慧科技股份有限公司 | 基于人脸识别大数据的登录管理系统 |
CN115131904A (zh) * | 2021-03-25 | 2022-09-30 | 中国移动通信集团安徽有限公司 | 一种门禁控制方法、装置、设备及计算机存储介质 |
CN113128969A (zh) * | 2021-04-26 | 2021-07-16 | 广州太玮生物科技有限公司 | 一种色谱柱日常使用管理系统 |
CN113255529A (zh) * | 2021-05-28 | 2021-08-13 | 支付宝(杭州)信息技术有限公司 | 一种生物特征的识别方法、装置及设备 |
CN113696851B (zh) * | 2021-08-27 | 2023-04-28 | 上海仙塔智能科技有限公司 | 基于车外手势的车辆控制方法、装置、设备以及介质 |
CN114268453B (zh) * | 2021-11-17 | 2024-07-12 | 中国南方电网有限责任公司 | 电力系统解锁方法、装置、计算机设备和存储介质 |
WO2023159462A1 (zh) * | 2022-02-25 | 2023-08-31 | 百果园技术(新加坡)有限公司 | 身份认证方法、装置、终端、存储介质及程序产品 |
CN114760074B (zh) * | 2022-06-13 | 2022-09-02 | 中广(绍兴上虞)有线信息网络有限公司 | 一种基于大数据安全的身份认证方法及系统 |
CN118196876B (zh) * | 2024-05-20 | 2024-08-16 | 东南大学 | 一种虚拟身份认证装置及其认证方法 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101075868A (zh) * | 2006-05-19 | 2007-11-21 | 华为技术有限公司 | 一种远程身份认证的系统、终端、服务器和方法 |
CN102385703A (zh) * | 2010-08-27 | 2012-03-21 | 北京中星微电子有限公司 | 一种基于人脸的身份认证方法及系统 |
CN103259796A (zh) * | 2013-05-15 | 2013-08-21 | 金硕澳门离岸商业服务有限公司 | 认证系统和方法 |
CN104298909A (zh) * | 2013-07-19 | 2015-01-21 | 富泰华工业(深圳)有限公司 | 电子装置、身份验证系统及方法 |
CN104518877A (zh) * | 2013-10-08 | 2015-04-15 | 鸿富锦精密工业(深圳)有限公司 | 身份认证系统及方法 |
Family Cites Families (40)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5572261A (en) * | 1995-06-07 | 1996-11-05 | Cooper; J. Carl | Automatic audio to video timing measurement device and method |
JP2000306090A (ja) * | 1999-04-20 | 2000-11-02 | Ntt Data Corp | 個人認証装置、方法及び記録媒体 |
SG91841A1 (en) * | 1999-11-03 | 2002-10-15 | Kent Ridge Digital Labs | Face direction estimation using a single gray-level image |
JP2003216955A (ja) * | 2002-01-23 | 2003-07-31 | Sharp Corp | ジェスチャ認識方法、ジェスチャ認識装置、対話装置及びジェスチャ認識プログラムを記録した記録媒体 |
US9412142B2 (en) * | 2002-08-23 | 2016-08-09 | Federal Law Enforcement Development Services, Inc. | Intelligent observation and identification database system |
JP2004110813A (ja) * | 2002-08-30 | 2004-04-08 | Victor Co Of Japan Ltd | 人物認証装置 |
WO2005093637A1 (de) * | 2004-03-29 | 2005-10-06 | Hoffmann Andre | Verfahren und system zur identifikation, verifikation, erkennung und wiedererkennung |
US7634106B2 (en) * | 2004-09-22 | 2009-12-15 | Fujifilm Corporation | Synthesized image generation method, synthesized image generation apparatus, and synthesized image generation program |
US8370639B2 (en) * | 2005-06-16 | 2013-02-05 | Sensible Vision, Inc. | System and method for providing secure access to an electronic device using continuous facial biometrics |
KR100725771B1 (ko) * | 2005-09-23 | 2007-06-08 | 삼성전자주식회사 | 휴대용 단말기용 얼굴 인식 및 인증 장치 및 방법 |
KR100777922B1 (ko) * | 2006-02-06 | 2007-11-21 | 에스케이 텔레콤주식회사 | 영상인식을 이용한 개인인증 및 전자서명 시스템 및 그방법 |
JP4367424B2 (ja) * | 2006-02-21 | 2009-11-18 | 沖電気工業株式会社 | 個人識別装置,個人識別方法 |
WO2007105193A1 (en) * | 2006-03-12 | 2007-09-20 | Nice Systems Ltd. | Apparatus and method for target oriented law enforcement interception and analysis |
JP5219184B2 (ja) * | 2007-04-24 | 2013-06-26 | 任天堂株式会社 | トレーニングプログラム、トレーニング装置、トレーニングシステムおよびトレーニング方法 |
JP4999570B2 (ja) * | 2007-06-18 | 2012-08-15 | キヤノン株式会社 | 表情認識装置及び方法、並びに撮像装置 |
KR20100062413A (ko) * | 2008-12-02 | 2010-06-10 | 한국전자통신연구원 | 텔레매틱스 장치를 위한 음성인식 장치 및 그 방법 |
JP2010231350A (ja) * | 2009-03-26 | 2010-10-14 | Toshiba Corp | 人物識別装置、そのプログラム、及び、その方法 |
KR101092820B1 (ko) * | 2009-09-22 | 2011-12-12 | 현대자동차주식회사 | 립리딩과 음성 인식 통합 멀티모달 인터페이스 시스템 |
WO2011065952A1 (en) * | 2009-11-30 | 2011-06-03 | Hewlett-Packard Development Company, L.P. | Face recognition apparatus and methods |
TWI411935B (zh) * | 2009-12-25 | 2013-10-11 | Primax Electronics Ltd | 利用影像擷取裝置辨識使用者姿勢以產生控制訊號之系統以及方法 |
US8818025B2 (en) * | 2010-08-23 | 2014-08-26 | Nokia Corporation | Method and apparatus for recognizing objects in media content |
US8897500B2 (en) * | 2011-05-05 | 2014-11-25 | At&T Intellectual Property I, L.P. | System and method for dynamic facial features for speaker recognition |
CN102298443B (zh) * | 2011-06-24 | 2013-09-25 | 华南理工大学 | 结合视频通道的智能家居语音控制系统及其控制方法 |
US9082235B2 (en) * | 2011-07-12 | 2015-07-14 | Microsoft Technology Licensing, Llc | Using facial data for device authentication or subject identification |
CN102324035A (zh) * | 2011-08-19 | 2012-01-18 | 广东好帮手电子科技股份有限公司 | 口型辅助语音识别术在车载导航中应用的方法及系统 |
KR20130022607A (ko) * | 2011-08-25 | 2013-03-07 | 삼성전자주식회사 | 입술 이미지를 이용한 음성 인식 장치 및 이의 음성 인식 방법 |
KR101242390B1 (ko) * | 2011-12-29 | 2013-03-12 | 인텔 코오퍼레이션 | 사용자를 인증하기 위한 방법, 장치, 및 컴퓨터 판독 가능한 기록 매체 |
US9083532B2 (en) * | 2012-03-06 | 2015-07-14 | Ebay Inc. | Physiological response PIN entry |
US8687880B2 (en) * | 2012-03-20 | 2014-04-01 | Microsoft Corporation | Real time head pose estimation |
EP2871640B1 (en) * | 2012-07-09 | 2021-01-06 | LG Electronics, Inc. | Speech recognition apparatus and method |
CN102932212A (zh) * | 2012-10-12 | 2013-02-13 | 华南理工大学 | 一种基于多通道交互方式的智能家居控制系统 |
US20140118257A1 (en) * | 2012-10-29 | 2014-05-01 | Amazon Technologies, Inc. | Gesture detection systems |
CN103036680A (zh) * | 2012-12-10 | 2013-04-10 | 中国科学院计算机网络信息中心 | 基于生物特征识别的域名认证系统及方法 |
US9310977B2 (en) * | 2012-12-14 | 2016-04-12 | Biscotti Inc. | Mobile presence detection |
JP6132232B2 (ja) * | 2013-02-01 | 2017-05-24 | パナソニックIpマネジメント株式会社 | メイクアップ支援装置、メイクアップ支援システム、およびメイクアップ支援方法 |
US20140341444A1 (en) * | 2013-05-14 | 2014-11-20 | Tencent Technology (Shenzhen) Company Limited | Systems and Methods for User Login |
JP6209067B2 (ja) * | 2013-11-21 | 2017-10-04 | NTT Docomo, Inc. | Image recognition device and image recognition method |
CN103593598B (zh) * | 2013-11-25 | 2016-09-21 | Shanghai Junyu Digital Technology Co., Ltd. | Online user authentication method and system based on liveness detection and face recognition |
CN103634120A (zh) * | 2013-12-18 | 2014-03-12 | Shanghai Digital Certificate Certification Center Co., Ltd. | Real-name authentication method and system based on face recognition |
US9866820B1 (en) * | 2014-07-01 | 2018-01-09 | Amazon Technologies, Inc. | Online calibration of cameras |
2014
- 2014-09-03 CN CN201410446657.0A patent/CN105468950B/zh active Active
- 2014-09-03 CN CN202010732960.2A patent/CN111898108B/zh active Active

2015
- 2015-08-27 PL PL19172346T patent/PL3540621T3/pl unknown
- 2015-08-27 SG SG10201901818UA patent/SG10201901818UA/en unknown
- 2015-08-27 EP EP19172346.9A patent/EP3540621B1/en active Active
- 2015-08-27 EP EP15838136.8A patent/EP3190534B1/en active Active
- 2015-08-27 SG SG11201701497SA patent/SG11201701497SA/en unknown
- 2015-08-27 JP JP2017512314A patent/JP6820062B2/ja active Active
- 2015-08-27 ES ES19172346T patent/ES2810012T3/es active Active
- 2015-08-27 WO PCT/CN2015/088215 patent/WO2016034069A1/zh active Application Filing
- 2015-08-27 KR KR1020177005848A patent/KR101997371B1/ko active IP Right Grant

2016
- 2016-08-18 HK HK16109898.6A patent/HK1221795A1/zh unknown

2017
- 2017-03-02 US US15/448,534 patent/US10601821B2/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101075868A (zh) * | 2006-05-19 | 2007-11-21 | Huawei Technologies Co., Ltd. | Remote identity authentication system, terminal, server and method |
CN102385703A (zh) * | 2010-08-27 | 2012-03-21 | Vimicro Corporation | Face-based identity authentication method and system |
CN103259796A (zh) * | 2013-05-15 | 2013-08-21 | Jinshuo Macao Offshore Commercial Service Co., Ltd. | Authentication system and method |
CN104298909A (zh) * | 2013-07-19 | 2015-01-21 | Fu Tai Hua Industry (Shenzhen) Co., Ltd. | Electronic device, identity verification system and method |
CN104518877A (zh) * | 2013-10-08 | 2015-04-15 | Hongfujin Precision Industry (Shenzhen) Co., Ltd. | Identity authentication system and method |
Non-Patent Citations (1)
Title |
---|
See also references of EP3190534A4 * |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110460993A (zh) * | 2019-08-21 | 2019-11-15 | Guangzhou University | Authentication method and system based on gesture verification |
Also Published As
Publication number | Publication date |
---|---|
SG11201701497SA (en) | 2017-03-30 |
ES2810012T3 (es) | 2021-03-08 |
EP3190534B1 (en) | 2019-06-26 |
CN111898108A (zh) | 2020-11-06 |
EP3540621A1 (en) | 2019-09-18 |
JP6820062B2 (ja) | 2021-01-27 |
CN111898108B (zh) | 2024-06-04 |
EP3190534A4 (en) | 2018-03-21 |
KR20170047255A (ko) | 2017-05-04 |
US10601821B2 (en) | 2020-03-24 |
US20170180362A1 (en) | 2017-06-22 |
CN105468950B (zh) | 2020-06-30 |
HK1221795A1 (zh) | 2017-06-09 |
CN105468950A (zh) | 2016-04-06 |
JP2017530457A (ja) | 2017-10-12 |
SG10201901818UA (en) | 2019-03-28 |
PL3540621T3 (pl) | 2021-05-17 |
EP3190534A1 (en) | 2017-07-12 |
EP3540621B1 (en) | 2020-07-29 |
KR101997371B1 (ko) | 2019-07-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2016034069A1 (zh) | Identity authentication method, apparatus, terminal and server | |
CN108804884B (zh) | Identity authentication method, apparatus and computer storage medium | |
WO2017198014A1 (zh) | Identity authentication method and apparatus | |
US20190012450A1 (en) | Biometric-based authentication method, apparatus and system | |
CN104298909B (zh) | Electronic device, identity verification system and method | |
US9177131B2 (en) | User authentication method and apparatus based on audio and video data | |
US9122913B2 (en) | Method for logging a user in to a mobile device | |
US20170046508A1 (en) | Biometric authentication using gesture | |
US20160269411A1 (en) | System and Method for Anonymous Biometric Access Control | |
TWI727329B (zh) | Anti-spoofing system and method for providing selective access to resources based on a deep learning method | |
WO2017193826A1 (zh) | Cloud desktop login verification method, cloud desktop control system and client | |
US20080013794A1 (en) | Feature Extraction Algorithm for Automatic Ear Recognition | |
US8983207B1 (en) | Mitigating replay attacks using multiple-image authentication | |
US9792421B1 (en) | Secure storage of fingerprint related elements | |
KR101724971B1 (ko) | Face recognition system using a wide-angle camera and face recognition method using the same | |
CN105407069B (zh) | Liveness authentication method, apparatus, client device and server | |
US10547610B1 (en) | Age adapted biometric authentication | |
WO2016062200A1 (zh) | Fingerprint authentication method, apparatus and server | |
WO2017041358A1 (zh) | User identity recognition method, apparatus and mobile terminal | |
CN107370769B (zh) | User authentication method and system | |
WO2016058540A1 (zh) | Identity verification method, apparatus and storage medium | |
KR101718244B1 (ko) | Wide-angle image processing apparatus and method for face recognition | |
CN113344586B (zh) | Face recognition payment system for mobile terminals | |
CN112149085A (zh) | Game account login method and apparatus based on user biometric features |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15838136; Country of ref document: EP; Kind code of ref document: A1 |
ENP | Entry into the national phase | Ref document number: 20177005848; Country of ref document: KR; Kind code of ref document: A. Ref document number: 2017512314; Country of ref document: JP; Kind code of ref document: A |
REEP | Request for entry into the european phase | Ref document number: 2015838136; Country of ref document: EP |
WWE | Wipo information: entry into national phase | Ref document number: 2015838136; Country of ref document: EP |
NENP | Non-entry into the national phase | Ref country code: DE |