WO2016034069A1 - Identity authentication method, apparatus, terminal and server - Google Patents

Identity authentication method, apparatus, terminal and server

Info

Publication number
WO2016034069A1
WO2016034069A1 (PCT/CN2015/088215)
Authority
WO
WIPO (PCT)
Prior art keywords
information
user
face
facial
server
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CN2015/088215
Other languages
English (en)
French (fr)
Chinese (zh)
Inventor
杜志军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alibaba Group Holding Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd filed Critical Alibaba Group Holding Ltd
Priority to KR1020177005848A priority Critical patent/KR101997371B1/ko
Priority to SG11201701497SA priority patent/SG11201701497SA/en
Priority to PL19172346T priority patent/PL3540621T3/pl
Priority to EP19172346.9A priority patent/EP3540621B1/en
Priority to EP15838136.8A priority patent/EP3190534B1/en
Priority to JP2017512314A priority patent/JP6820062B2/ja
Publication of WO2016034069A1 publication Critical patent/WO2016034069A1/zh
Priority to US15/448,534 priority patent/US10601821B2/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/08Network architectures or network communication protocols for network security for authentication of entities
    • H04L63/0861Network architectures or network communication protocols for network security for authentication of entities using biometrical features, e.g. fingerprint, retina-scan
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/305Authentication, i.e. establishing the identity or authorisation of security principals by remotely controlling device operation
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • G06F21/32User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • G06F21/34User authentication involving the use of external additional devices, e.g. dongles or smart cards
    • G06F21/35User authentication involving the use of external additional devices, e.g. dongles or smart cards communicating wirelessly
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/45Structures or tools for the administration of authentication
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • G06V40/164Detection; Localisation; Normalisation using holistic features
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168Feature extraction; Face representation
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168Feature extraction; Face representation
    • G06V40/169Holistic features and representations, i.e. based on the facial image taken as a whole
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168Feature extraction; Face representation
    • G06V40/171Local features and components; Facial parts ; Occluding parts, e.g. glasses; Geometrical relationships
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172Classification, e.g. identification
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/174Facial expression recognition
    • G06V40/176Dynamic expression
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L17/00Speaker identification or verification techniques
    • G10L17/22Interactive procedures; Man-machine interfaces
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2221/00Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/21Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/2101Auditing as a secondary aspect
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2221/00Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/21Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/2103Challenge-response
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2221/00Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/21Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/2115Third party
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2221/00Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/21Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/2117User registration
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10Character recognition
    • G06V30/14Image acquisition
    • G06V30/142Image acquisition using hand-held instruments; Constructional details of the instruments
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/08Network architectures or network communication protocols for network security for authentication of entities

Definitions

  • the present application relates to the field of communications technologies, and in particular, to an identity authentication method, apparatus, terminal, and server.
  • the server verifies that the entered authentication password is consistent with the authentication password set when the user registered, and thereby confirms that the user passes identity authentication.
  • authentication passwords are often simple combinations of numbers and letters that are easily stolen by malicious third parties. Therefore, the reliability of the existing identity authentication method is poor, and the user information is easily stolen, resulting in low security of the authentication.
  • the present application provides an identity authentication method, device, terminal, and server to solve the problem that the identity authentication method in the prior art is less reliable and less secure.
  • an identity authentication method is provided, where the method includes:
  • an identity authentication method is provided, where the method includes:
  • the gesture recognition information is the gesture recognition information obtained by the terminal by identifying a facial gesture presented by the user according to the facial dynamic authentication prompt information
  • an identity authentication apparatus includes:
  • a receiving unit configured to receive a face dynamic authentication prompt message sent by the server when the user performs identity authentication
  • a recognition unit configured to obtain gesture recognition information of the face dynamic authentication prompt information by identifying a face gesture presented by the user
  • a sending unit configured to send the gesture identification information to the server, so that the server determines that the user passes the identity authentication when verifying that the gesture recognition information is consistent with the face dynamic authentication prompt information.
  • an identity authentication apparatus includes:
  • a sending unit configured to send a face dynamic authentication prompt message to the terminal when the user performs identity authentication
  • a receiving unit configured to receive the gesture identification information sent by the terminal, where the gesture recognition information is the gesture recognition information obtained by the terminal by identifying a facial gesture presented by the user according to the facial dynamic authentication prompt information;
  • a determining unit configured to determine that the user passes the identity authentication when verifying that the gesture identification information is consistent with the face dynamic authentication prompt information.
  • a terminal including:
  • a processor; and a memory for storing processor-executable instructions;
  • wherein the processor is configured to:
  • a server including:
  • a processor; and a memory for storing processor-executable instructions;
  • wherein the processor is configured to:
  • the gesture recognition information is the gesture recognition information obtained by the terminal by identifying a facial gesture presented by the user according to the facial dynamic authentication prompt information
  • In the above technical solutions, when the user performs identity authentication, the server sends the face dynamic authentication prompt information to the terminal; the terminal obtains gesture recognition information for the prompt information by identifying the facial gesture presented by the user, and sends the gesture recognition information to the server.
  • When the server verifies that the gesture recognition information is consistent with the face dynamic authentication prompt information, it determines that the user passes identity authentication.
  • This face dynamic authentication method can perform highly secure authentication of the user's identity. Compared with existing authentication passwords, the authentication information cannot easily be stolen by a malicious third party, which improves the reliability of authentication; moreover, face dynamic authentication can confirm that the user is a live user, thereby further improving the accuracy of identity authentication and reducing security risks in the authentication process.
  • FIG. 1 is a schematic diagram of an identity authentication scenario according to an embodiment of the present application.
  • 2A is a flowchart of an embodiment of an identity authentication method of the present application.
  • 2B is a flowchart of another embodiment of the identity authentication method of the present application.
  • 3A is a flowchart of another embodiment of an identity authentication method of the present application.
  • FIG. 3B is a schematic diagram of a gesture of a human head in a face authentication process according to an embodiment of the present application.
  • FIG. 4A is a flowchart of another embodiment of an identity authentication method according to the present application.
  • FIG. 4B and FIG. 4C are schematic diagrams of key points of a face in an embodiment of the present application.
  • FIG. 5 is a hardware structural diagram of a device where the identity authentication device of the present application is located;
  • FIG. 6 is a block diagram of an embodiment of an identity authentication apparatus of the present application.
  • FIG. 7 is a block diagram of another embodiment of the identity authentication apparatus of the present application.
  • Although the terms first, second, third, etc. may be used to describe various information in this application, such information should not be limited by these terms; the terms are only used to distinguish information of the same type from one another.
  • For example, without departing from the scope of the present application, first information may also be referred to as second information, and similarly, second information may also be referred to as first information.
  • Depending on the context, the word "if" as used herein may be interpreted as "when", "upon", or "in response to determining".
  • FIG. 1 it is a schematic diagram of an application scenario for implementing identity authentication according to an embodiment of the present application: a user completes identity authentication of a user by interacting with a terminal and a server, and communication between the terminal and the server may be completed based on a network.
  • The network includes various wireless or wired networks, which the embodiments of the present application do not limit.
  • the terminal may be specifically a mobile phone, a tablet computer, a personal computer, or the like.
  • two databases may be set on the server, which are a face feature information database and a face dynamic authentication prompt information database.
  • the terminal may obtain the facial feature information of the registered user and send it to the server, and the server saves the facial feature information of the registered user to the face feature information database.
  • When the user performs identity authentication, face authentication may be performed first.
  • The terminal sends the acquired facial feature information to the server, and the server verifies that the facial feature information matches the user's facial feature information saved in the face feature information database.
  • After face authentication is passed, the server can perform face dynamic authentication.
  • In the face dynamic authentication, the server can return face dynamic authentication prompt information obtained from the face dynamic authentication prompt information database, and the terminal identifies the face gesture presented by the user, thereby obtaining gesture recognition information for the prompt information and sending it to the server.
  • When the server verifies that the gesture recognition information is consistent with the face dynamic authentication prompt information, it can be known that the currently authenticated user is a live user, and it is thereby finally determined that the user passes identity authentication.
  • For ease of distinction, the facial feature information of the user acquired in the face registration phase may be referred to as the second facial feature information in the embodiments of the present application, and the facial feature information acquired in the face authentication phase may be referred to as the first facial feature information.
  • the embodiments of the present application are described in detail below.
  • FIG. 2A it is a flowchart of an embodiment of an identity authentication method according to the present application. The embodiment is described from a terminal side that implements identity authentication:
  • Step 201 Receive a face dynamic authentication prompt message sent by the server when the user performs identity authentication.
  • In this step, the server may randomly extract face dynamic authentication prompt information from the face dynamic authentication prompt information database and return it to the terminal. The face dynamic authentication prompt information may include at least one of the following: expression action prompt information, for example closing the eyes, opening the mouth, or turning the head; and voice-read prompt information, for example "pay 20 yuan" or the like.
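The random extraction described above can be sketched as follows (the function name and the in-memory prompt list are hypothetical stand-ins for the face dynamic authentication prompt information database; the patent does not specify a storage format):

```python
import random

# Hypothetical stand-in for the face dynamic authentication prompt
# information database: expression action prompts and voice-read prompts.
PROMPT_DATABASE = [
    {"type": "expression", "action": "close_eyes"},
    {"type": "expression", "action": "open_mouth"},
    {"type": "expression", "action": "turn_head"},
    {"type": "voice", "text": "pay 20 yuan"},
]

def pick_dynamic_auth_prompt(rng=random):
    # Randomly extract one prompt entry to return to the terminal.
    return rng.choice(PROMPT_DATABASE)
```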
  • Before this step, the terminal may first acquire the facial feature information of the user and send the facial feature information acquired during identity authentication to the server as the user's first facial feature information.
  • the server sends the facial dynamic authentication prompt information to the terminal when verifying that the first facial feature information matches the saved second facial feature information.
  • When acquiring the facial feature information, the terminal may activate an integrated imaging device, such as a camera, to detect the user's face, and perform face tracking on the user once a face is detected.
  • When verifying the facial feature information, the server may search the face feature information database according to the user's user name to obtain the second facial feature information corresponding to that user name, and then compare the first facial feature information and the second facial feature information in a preset comparison manner. If the feature comparison value is within a preset similarity range, the first facial feature information is determined to match the second facial feature information, and the user is determined to pass face authentication; at this point, the server sends the face dynamic authentication prompt information to the terminal.
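As a minimal sketch of this server-side check (the lookup table, function names, and threshold are illustrative assumptions, not the patent's implementation):

```python
# Hypothetical face feature information database: user name -> second
# (registered) face feature vector.
FACE_FEATURE_DB = {"alice": [0.12, 0.80, 0.33]}

def features_match(first, second, threshold=0.5):
    # One "preset comparison manner": sum of squared differences must
    # fall within a preset similarity range (here, below threshold).
    return sum((a - b) ** 2 for a, b in zip(first, second)) < threshold

def verify_first_factor(user_name, first_features):
    # Look up the saved second feature vector by user name and compare.
    second = FACE_FEATURE_DB.get(user_name)
    if second is None:
        return False
    return features_match(first_features, second)
```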
  • Step 202 Obtain gesture recognition information of the face dynamic authentication prompt information by recognizing the face gesture presented by the user.
  • the terminal displays the face dynamic authentication prompt information on the identity authentication interface, and the user can present the corresponding face gesture according to the information, and the terminal recognizes the face gesture.
  • When identifying the face gesture, the terminal may perform face tracking on the user to obtain face tracking information, which may include at least one of facial key point position information and head posture information; the terminal then obtains the user's gesture recognition information by analyzing the face tracking information.
  • The facial key point position information can be used to determine whether the user closed the eyes or opened the mouth according to the expression action prompt, or to determine the user's mouth shape when reading the voice-read prompt information (the pronunciation of each word has a corresponding relationship with a mouth shape, so the user's gesture recognition information can be determined through the mouth shapes); the head posture information can be used to determine whether the user turned the head, bowed the head, and so on.
  • Step 203 Send the gesture recognition information to the server, so that the server determines that the user passes the identity authentication when the verification gesture identification information is consistent with the facial dynamic authentication prompt information.
  • When sending the face dynamic authentication prompt information to the terminal, the server may record the correspondence between the user's user name and the prompt information. In this step, after the terminal sends the gesture recognition information to the server, the server obtains the corresponding face dynamic authentication prompt information according to the user name; if it verifies that the gesture recognition information is consistent with the prompt information, the user is a live user and is determined to pass identity authentication.
  • For the voice-read prompt information, the terminal can obtain the user's audio information in addition to the user's mouth shape, and obtain the information the user read aloud by performing speech recognition on the audio. Accordingly, the server also determines whether the recognized voice information is consistent with the voice-read prompt information when deciding whether the user passes identity authentication.
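The speech check can be illustrated with a hedged sketch (the normalization and exact-match comparison are assumptions; a real system would compare text produced by a speech recognizer against the stored prompt):

```python
def normalize(text):
    # Case-fold and drop whitespace so minor transcription differences
    # do not cause spurious mismatches (an illustrative choice).
    return "".join(text.lower().split())

def voice_matches_prompt(recognized_text, prompt_text):
    # Server-side check that the information recognized from the user's
    # audio is consistent with the voice-read prompt information.
    return normalize(recognized_text) == normalize(prompt_text)
```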
  • FIG. 2B it is a flowchart of another embodiment of the identity authentication method of the present application, which is described from the server side that implements identity authentication:
  • Step 211 When the user performs identity authentication, send the face dynamic authentication prompt information to the terminal.
  • Step 212 Receive gesture identification information sent by the terminal, where the gesture recognition information is posture recognition information obtained by the terminal by recognizing a facial gesture presented by the user according to the facial dynamic authentication prompt information.
  • Step 213 When the verification gesture identification information is consistent with the facial dynamic authentication prompt information, determine that the user passes the identity authentication.
  • The identity authentication process shown in FIG. 2B differs from that shown in FIG. 2A only in the executing subject: FIG. 2A is described from the terminal side, while FIG. 2B is described from the server side. Therefore, for the related implementation of the embodiment in FIG. 2B, reference may be made to the description of FIG. 2A above, and details are not repeated here.
  • It can be seen from the above that this embodiment can perform highly secure authentication of the user's identity by using the face dynamic authentication mode. Compared with existing authentication modes, the authentication information cannot easily be stolen by a malicious third party, which improves the reliability of authentication; moreover, face dynamic authentication can confirm that the user is a live user, thereby further improving the accuracy of identity authentication and reducing security risks in the authentication process.
  • FIG. 3A another embodiment of the identity authentication method of the present application, which illustrates the process of face registration in detail:
  • Step 301 The user registers with the server through the terminal.
  • Step 302 When the terminal detects the face of the user, the terminal performs face tracking on the user.
  • The terminal is integrated with an imaging device, such as a camera. The terminal can automatically start the imaging device to detect the user's face when the user registers, and the user can point the imaging device at his or her frontal face.
  • When a face is detected, the terminal can track the user's face through a face tracking algorithm. It should be noted that any existing face tracking algorithm can be used in the embodiments of the present application, and details are not described herein.
  • Step 303 The terminal acquires a face image according to a preset time interval in the face tracking process.
  • the terminal acquires the face image according to the preset time interval by the camera device, and the time interval is set to avoid extracting substantially the same face image.
  • the preset time interval may be 3 seconds.
  • Step 304 Determine whether the definition of the face image meets the preset definition threshold. If yes, execute step 305; otherwise, end the current process.
  • The sharpness may be judged first to exclude face images of insufficient clarity. The terminal can invoke a preset blur judgment function to determine whether the sharpness of the face image satisfies the sharpness threshold; any blur judgment function from existing image recognition processing technology can be adopted, and the embodiments of the present application are not limited in this respect. For a face image satisfying the sharpness threshold, step 305 is performed; a face image that does not satisfy the sharpness threshold is directly discarded, and the process returns to step 303.
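One common blur judgment function, offered here only as an example of the "preset blur judgment function" the text leaves open, is the variance of a Laplacian filter response (the threshold value below is an arbitrary placeholder):

```python
import numpy as np

def laplacian_variance(gray):
    # Variance of a 4-neighbour Laplacian over a grayscale image;
    # higher values indicate sharper images.
    lap = (-4.0 * gray[1:-1, 1:-1]
           + gray[:-2, 1:-1] + gray[2:, 1:-1]
           + gray[1:-1, :-2] + gray[1:-1, 2:])
    return float(lap.var())

def is_sharp_enough(gray, threshold=100.0):
    # Placeholder threshold; a real system would tune this empirically.
    return laplacian_variance(gray) >= threshold
```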
  • Step 305 The terminal extracts the head posture information from the face image.
  • After determining in step 304 that the acquired face image is a clear face image, the terminal extracts the head posture information from the face image.
  • The head posture information in this embodiment may include at least one of the following angles: a head-down angle, a side-face angle, and a head-tilt angle.
  • Step 306 The terminal determines whether each angle included in the gesture information of the human head is within a preset angle range. If yes, step 307 is performed; otherwise, the current flow ends.
  • Whether the face image is a frontal face image of the user is determined from the head posture information: the terminal can determine whether each angle included in the head posture information is within a preset angle range, for example 0 to 10 degrees.
  • Step 307 is executed for a face image whose head posture information yields a positive determination; a face image whose head posture information yields a negative determination is discarded, and the process returns to step 303.
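The angle-range check can be sketched as follows (the angle names and dict layout are assumptions; the 10-degree limit follows the example above):

```python
def is_frontal_face(pose, max_angle=10.0):
    # pose: mapping of angle name -> angle in degrees, e.g. head-down,
    # side-face, and head-tilt angles. All must be within the preset
    # range for the image to count as a frontal face image.
    return all(abs(angle) <= max_angle for angle in pose.values())
```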
  • Step 307 The terminal extracts facial feature information of the user from the face image.
  • In this embodiment, an LBP (Local Binary Patterns) feature extraction algorithm may be used to extract a face feature vector from the face image as the user's face feature information.
  • It should be noted that a face feature extraction algorithm used in any existing image processing technology can be applied to the embodiments of the present application, for example, a Gabor (windowed Fourier transform) feature extraction algorithm, and so on.
  • In order to improve authentication accuracy, face feature information may be extracted from a plurality of face images, and the number of face images may be preset, for example five; correspondingly, steps 303 to 307 may be executed in a loop according to the set number until the preset number of face images is acquired and face feature information is extracted from them.
  • Step 308 The terminal sends the face feature information to the server.
  • Step 309 The server saves the correspondence between the user name of the registered user and the face feature information, and ends the current process.
  • After receiving the face feature information sent by the terminal, the server may save the correspondence between the registered user's user name and the face feature information in the face feature information database. When face feature information has been extracted from a plurality of face images, the correspondence between the user name and the plurality of face feature information entries is saved.
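A minimal sketch of saving this correspondence (an in-memory dict stands in for the face feature information database; names are hypothetical):

```python
from collections import defaultdict

# Hypothetical face feature information database: each registered user
# name maps to the list of feature vectors extracted at registration.
face_feature_db = defaultdict(list)

def register_face_features(user_name, feature_vectors):
    # Save the correspondence between the user name and the (possibly
    # multiple) face feature vectors sent by the terminal.
    face_feature_db[user_name].extend(feature_vectors)
```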
  • As shown in FIG. 4A, another embodiment of the identity authentication method of the present application builds on the face registration process shown in FIG. 3A and describes the process of authenticating a user in detail:
  • Step 401 Start identity authentication for the user.
  • Step 402 The terminal acquires first facial feature information of the user.
  • The manner in which the terminal acquires the user's facial feature information is the same as in the face registration process shown in FIG. 3, specifically steps 302 to 307, and is not repeated here.
  • The terminal may acquire at least one piece of first facial feature information.
  • Step 403 The terminal sends the first facial feature information of the user to the server.
  • Step 404 The server verifies whether the first facial feature information matches the saved second facial feature information of the user. If yes, step 405 is performed; otherwise, the current process is ended.
  • The server may search the facial feature information database according to the user's user name to obtain the second facial feature information corresponding to that user name, and then compare the first facial feature information and the second facial feature information in a preset comparison manner. If the feature comparison value is within a preset similarity range, the first facial feature information and the second facial feature information may be determined to match.
  • For example, the first and second facial feature information may be compared using the Euclidean distance comparison method: the sum of the squared differences between the second face feature vector and the first face feature vector is calculated, and if this sum of squares is less than a preset threshold, it may be determined that the authentication is being performed by the user himself or herself;
  • alternatively, the first and second facial feature information may be compared using the cosine distance comparison method: if the first face feature vector is V1 and the second face feature vector is V2, the value V2·V1/(|V2|·|V1|) is calculated, and if this value is within a preset similarity range, the two may be determined to match.
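The two comparison manners can be sketched directly from the formulas above. The threshold and similarity-range arguments stand in for the preset values mentioned in the text; the function names are illustrative.

```python
import numpy as np

def euclidean_match(v1, v2, threshold):
    """Match if the sum of squared differences between the two face
    feature vectors is below the preset threshold."""
    v1, v2 = np.asarray(v1, dtype=float), np.asarray(v2, dtype=float)
    return float(np.sum((v2 - v1) ** 2)) < threshold

def cosine_match(v1, v2, low, high):
    """Match if V2.V1 / (|V2| * |V1|) falls within the preset
    similarity range [low, high]."""
    v1, v2 = np.asarray(v1, dtype=float), np.asarray(v2, dtype=float)
    value = float(np.dot(v2, v1) / (np.linalg.norm(v2) * np.linalg.norm(v1)))
    return low <= value <= high
```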
  • Step 405 The server sends the face dynamic authentication prompt information to the terminal.
  • the server may randomly extract a face dynamic authentication prompt information from the face dynamic authentication prompt information database.
  • The face dynamic authentication prompt information may include expression action prompt information or voice reading prompt information.
  • For expression action prompt information, the prompted action is usually one that the user can easily present through a facial gesture, for example opening the mouth, closing the eyes, or turning the head.
  • Voice reading prompt information is usually short, so that the user can read it conveniently during authentication and the terminal can easily recognize the user's facial gesture while reading.
  • Step 406 The terminal obtains face tracking information by performing face tracking on the user.
  • the terminal may output the face dynamic authentication prompt information on the authentication interface, and the user may present the corresponding face gesture according to the information.
  • Through a face tracking algorithm, the terminal acquires the user's face tracking information, which may include at least one of the following: face key point position information and head posture information.
  • Step 407 The terminal analyzes the face tracking information to obtain the gesture recognition information of the user.
  • FIG. 4B and FIG. 4C are schematic diagrams of position information of facial key points in the embodiment of the present application.
  • FIG. 4B shows the key point position information of the user's mouth extracted in the normal state,
  • and FIG. 4C shows it after the user presents the "open mouth" gesture. By comparing the key point position information extracted in FIG. 4B and FIG. 4C, that is, comparing the coordinate distance between the two key points above and below the mouth, the user's gesture recognition information can be obtained as "open mouth".
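The "open mouth" comparison described above can be sketched as follows. The 1.5 gap ratio is an illustrative threshold, not a value from the text, and the key points are reduced to the two points above and below the mouth as in FIGS. 4B/4C.

```python
def detect_open_mouth(normal_pts, tracked_pts, ratio=1.5):
    """Compare the vertical gap between the mouth's upper and lower key
    points in the normal state vs. the tracked frame.

    normal_pts / tracked_pts: ((x_top, y_top), (x_bottom, y_bottom)).
    Returns the gesture label "open mouth" or None.
    """
    def gap(pts):
        (_, y_top), (_, y_bottom) = pts
        return abs(y_bottom - y_top)

    # An open mouth widens the gap well beyond its normal-state value.
    return "open mouth" if gap(tracked_pts) > ratio * gap(normal_pts) else None
```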
  • For head posture information, the terminal may obtain it by performing face tracking on the user, as shown in FIG. 3B:
  • the three angles are obtained, and if their values satisfy the angle ranges defined for "turn head", the user's gesture recognition information can be obtained as "turn head".
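The "turn head" determination can be sketched similarly. The specific angle ranges here are illustrative assumptions: the text only states that the three angle values must satisfy the ranges defined for "turn head".

```python
def detect_turn_head(pitch, yaw, roll, yaw_range=(20.0, 60.0), max_other=15.0):
    """Return the gesture label "turn head" if the side-face (yaw) angle is
    large while the lowered-head (pitch) and tilted-head (roll) angles
    stay small; otherwise None. All angles in degrees."""
    in_yaw = yaw_range[0] <= abs(yaw) <= yaw_range[1]
    others_small = abs(pitch) <= max_other and abs(roll) <= max_other
    return "turn head" if in_yaw and others_small else None
```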
  • Step 408 The terminal sends the gesture recognition information to the server.
  • Step 409 The server verifies whether the gesture recognition information is consistent with the face dynamic authentication prompt information. If yes, step 410 is performed; otherwise, the current flow is ended.
  • Step 410 The server determines that the user passes the identity authentication and ends the current process.
  • This embodiment combines face authentication and dynamic authentication to perform high-security authentication of the user's identity. Face authentication can preliminarily verify that the user is the registered user; compared with existing password authentication, the authentication information is not easily stolen by a malicious third party, which improves the reliability of authentication. On that basis, face dynamic authentication can verify that the user is a living person, further improving the accuracy of identity authentication and reducing security risks in the authentication process.
  • the present application also provides an embodiment of an identity authentication device, a terminal, and a server.
  • Embodiments of the identity authentication apparatus of the present application can be applied to terminals and servers, respectively.
  • The apparatus embodiments may be implemented by software, by hardware, or by a combination of hardware and software. Taking software implementation as an example, the apparatus is formed as a logical device when the processor of the device where it resides reads the corresponding computer program instructions from non-volatile memory into memory and runs them. At the hardware level, FIG. 5 shows a hardware structure diagram of the device where the identity authentication apparatus resides; besides the processor, memory, network interface, and non-volatile memory shown in FIG. 5, the device may also include other hardware according to its actual function.
  • For example, the terminal may include a camera, a touch screen, a communication component, and the like;
  • the server may include a forwarding chip responsible for processing packets, and the like.
  • the identity authentication apparatus may be applied to a terminal, and the apparatus includes: a receiving unit 610, an identifying unit 620, and a sending unit 630.
  • the receiving unit 610 is configured to receive the facial dynamic authentication prompt information sent by the server when the user performs identity authentication.
  • the identifying unit 620 is configured to obtain the gesture recognition information of the facial dynamic authentication prompt information by identifying a facial gesture presented by the user;
  • the sending unit 630 is configured to send the gesture identification information to the server, so that the server determines that the user passes the identity authentication when verifying that the gesture identification information is consistent with the face dynamic authentication prompt information.
  • the identification unit 620 can include (not shown in FIG. 6):
  • a face information obtaining sub-unit configured to obtain face tracking information by performing face tracking on the user when the user presents a face gesture according to the face dynamic authentication prompt information
  • the face information analysis subunit is configured to analyze the face tracking information to obtain the gesture identification information of the user.
  • The face information analysis subunit may be specifically configured to: when the face tracking information is face key point position information, obtain the user's facial expression recognition information by analyzing the key point position information; and when the face tracking information is head posture information, obtain the user's head-turn recognition information by analyzing the head posture information.
  • The face dynamic authentication prompt information may include at least one of the following: expression action prompt information and voice reading prompt information.
  • the apparatus may also include (not shown in Figure 6):
  • An acquiring unit configured to acquire facial feature information of the user, and use the facial feature information acquired during the identity authentication as the first facial feature information of the user;
  • The sending unit 630 may be further configured to send the user's first facial feature information to the server, so that the server sends the face dynamic authentication prompt information when it verifies that the first facial feature information matches the saved second facial feature information of the user.
  • The acquiring unit may be further configured to acquire the user's facial feature information during registration and use the facial feature information acquired during registration as the user's second facial feature information;
  • the sending unit 630 is further configured to send the second facial feature information to the server, so that the server saves the correspondence between the user's user name and the second facial feature information.
  • The acquiring unit may include: a face tracking subunit configured to perform face tracking on the user when the user's face is detected; an image obtaining subunit configured to obtain face images at preset time intervals during face tracking; a condition determining subunit configured to determine whether a face image satisfies a preset feature extraction condition; and a feature extraction subunit configured to extract the user's facial feature information from face images that satisfy the feature extraction condition.
  • The condition determining subunit may further include:
  • a clarity determining module configured to determine whether the resolution of the face image meets a preset clarity threshold;
  • a posture information extracting module configured to, when the clarity threshold is met, extract head posture information from the face image, the head posture information including at least one of the following angles: a lowered-head angle, a side-face angle, and a tilted-head angle;
  • an angle determining module configured to determine whether each angle included in the head posture information is within a preset angle range;
  • a result determining module configured to determine that the face image satisfies the feature extraction condition when each angle is within the preset angle range.
  • The feature extraction subunit may be specifically configured to use a preset feature extraction algorithm to extract a face feature vector from the face image as the user's facial feature information; the preset feature extraction algorithm may include a Local Binary Patterns (LBP) feature extraction algorithm, a Gabor (windowed Fourier transform) feature extraction algorithm, or the like.
  • the identity authentication apparatus may be applied to a server, and the apparatus includes: a sending unit 710, a receiving unit 720, and a determining unit 730.
  • the sending unit 710 is configured to send the face dynamic authentication prompt information to the terminal when the user performs identity authentication.
  • The receiving unit 720 is configured to receive the gesture recognition information sent by the terminal, where the gesture recognition information is obtained by the terminal by recognizing the facial gesture presented by the user according to the face dynamic authentication prompt information;
  • the determining unit 730 is configured to determine that the user passes the identity authentication when verifying that the gesture identification information is consistent with the facial dynamic authentication prompt information.
  • the receiving unit 720 is further configured to receive the first facial feature information of the user that is sent by the terminal;
  • the device may further include: (not shown in FIG. 7): a verification unit, configured to verify whether the first facial feature information matches the saved second facial feature information of the user;
  • The sending unit 710 may be specifically configured to send the face dynamic authentication prompt information to the terminal when the first and second facial feature information match.
  • The receiving unit 720 is further configured to receive, when the user performs registration, the second facial feature information of the user sent by the terminal; the apparatus may further include a saving unit configured to save the correspondence between the user's user name and the second facial feature information.
  • The verification unit may include: a feature search subunit configured to search the correspondence according to the user's user name to obtain the second facial feature information corresponding to that user name; a feature comparison subunit configured to compare the first and second facial feature information in a preset comparison manner; and a match determining subunit configured to determine that the first facial feature information matches the second facial feature information when the feature comparison value is within a preset similarity range.
  • The preset comparison manner adopted by the feature comparison subunit may include a Euclidean distance comparison method or a cosine distance comparison method.
  • Since the apparatus embodiments basically correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant details.
  • The apparatus embodiments described above are merely illustrative. Units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed across multiple network units. Some or all of the modules may be selected according to actual needs to achieve the objectives of the solution of the present application. Those of ordinary skill in the art can understand and implement this without creative effort.
  • As can be seen from the above embodiments, face dynamic authentication can perform high-security authentication of the user's identity. Compared with existing authentication modes, the authentication information is not easily stolen by a malicious third party, which improves the reliability of authentication; moreover, face dynamic authentication can verify that the user is a living person, further improving the accuracy of identity authentication and reducing security risks in the authentication process.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • General Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Biomedical Technology (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Acoustics & Sound (AREA)
  • Collating Specific Patterns (AREA)
  • Image Analysis (AREA)
  • Financial Or Insurance-Related Operations Such As Payment And Settlement (AREA)
PCT/CN2015/088215 2014-09-03 2015-08-27 身份认证方法、装置、终端及服务器 Ceased WO2016034069A1 (zh)

Priority Applications (7)

Application Number Priority Date Filing Date Title
KR1020177005848A KR101997371B1 (ko) 2014-09-03 2015-08-27 신원 인증 방법 및 장치, 단말기 및 서버
SG11201701497SA SG11201701497SA (en) 2014-09-03 2015-08-27 Identity authentication method and apparatus, terminal and server
PL19172346T PL3540621T3 (pl) 2014-09-03 2015-08-27 Sposób oraz urządzenie do uwierzytelniania tożsamości, terminal i serwer
EP19172346.9A EP3540621B1 (en) 2014-09-03 2015-08-27 Identity authentication method and apparatus, terminal and server
EP15838136.8A EP3190534B1 (en) 2014-09-03 2015-08-27 Identity authentication method and apparatus, terminal and server
JP2017512314A JP6820062B2 (ja) 2014-09-03 2015-08-27 アイデンティティ認証方法ならびに装置、端末及びサーバ
US15/448,534 US10601821B2 (en) 2014-09-03 2017-03-02 Identity authentication method and apparatus, terminal and server

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201410446657.0 2014-09-03
CN201410446657.0A CN105468950B (zh) 2014-09-03 2014-09-03 身份认证方法、装置、终端及服务器

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/448,534 Continuation US10601821B2 (en) 2014-09-03 2017-03-02 Identity authentication method and apparatus, terminal and server

Publications (1)

Publication Number Publication Date
WO2016034069A1 true WO2016034069A1 (zh) 2016-03-10

Family

ID=55439122

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2015/088215 Ceased WO2016034069A1 (zh) 2014-09-03 2015-08-27 身份认证方法、装置、终端及服务器

Country Status (9)

Country Link
US (1) US10601821B2 (enExample)
EP (2) EP3190534B1 (enExample)
JP (1) JP6820062B2 (enExample)
KR (1) KR101997371B1 (enExample)
CN (2) CN111898108B (enExample)
ES (1) ES2810012T3 (enExample)
PL (1) PL3540621T3 (enExample)
SG (2) SG11201701497SA (enExample)
WO (1) WO2016034069A1 (enExample)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110460993A (zh) * 2019-08-21 2019-11-15 广州大学 一种基于手势验证的认证方法及系统

Families Citing this family (64)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10452823B2 (en) * 2015-04-30 2019-10-22 Masaaki Tokuyama Terminal device and computer program
US10929550B2 (en) 2015-04-30 2021-02-23 Masaaki Tokuyama Terminal device and computer program
CN106302330B (zh) * 2015-05-21 2021-01-05 腾讯科技(深圳)有限公司 身份验证方法、装置和系统
CN105893828A (zh) * 2016-05-05 2016-08-24 南京甄视智能科技有限公司 基于移动终端的人脸验证驾考系统与方法
CN107644190A (zh) * 2016-07-20 2018-01-30 北京旷视科技有限公司 行人监控方法和装置
CN106228133B (zh) * 2016-07-21 2020-04-10 北京旷视科技有限公司 用户验证方法及装置
CN107819807A (zh) * 2016-09-14 2018-03-20 腾讯科技(深圳)有限公司 一种信息验证方法、装置和设备
CN108629260B (zh) * 2017-03-17 2022-02-08 北京旷视科技有限公司 活体验证方法和装置及存储介质
CN106971163A (zh) * 2017-03-28 2017-07-21 深圳市校联宝科技有限公司 一种接领人识别方法、装置和系统
WO2018176485A1 (zh) * 2017-04-01 2018-10-04 深圳市大疆创新科技有限公司 身份认证服务器、身份认证终端、身份认证系统及方法
CN107045744A (zh) * 2017-04-14 2017-08-15 特斯联(北京)科技有限公司 一种智能别墅门禁认证方法及系统
CN107066983B (zh) * 2017-04-20 2022-08-09 腾讯科技(上海)有限公司 一种身份验证方法及装置
US11080316B1 (en) * 2017-05-26 2021-08-03 Amazon Technologies, Inc. Context-inclusive face clustering
CN108229120B (zh) * 2017-09-07 2020-07-24 北京市商汤科技开发有限公司 人脸解锁及其信息注册方法和装置、设备、程序、介质
CN107707738A (zh) * 2017-09-07 2018-02-16 维沃移动通信有限公司 一种人脸识别方法及移动终端
CN107562204B (zh) * 2017-09-14 2021-10-01 深圳Tcl新技术有限公司 电视交互方法、电视及计算机可读存储介质
CN109670386A (zh) * 2017-10-16 2019-04-23 深圳泰首智能技术有限公司 人脸识别方法及终端
CN107818253B (zh) * 2017-10-18 2020-07-17 Oppo广东移动通信有限公司 人脸模板数据录入控制方法及相关产品
CN107993174A (zh) * 2017-11-06 2018-05-04 国政通科技股份有限公司 一种多功能便民服务系统
US10594690B2 (en) * 2017-11-16 2020-03-17 Bank Of America Corporation Authenticating access to a computing resource using facial recognition based on involuntary facial movement
CN107944380B (zh) * 2017-11-20 2022-11-29 腾讯科技(深圳)有限公司 身份识别方法、装置及存储设备
CN107798548A (zh) * 2017-11-27 2018-03-13 甘平安 一种购买方法及购买系统
US10924476B2 (en) * 2017-11-29 2021-02-16 Ncr Corporation Security gesture authentication
CN108229937A (zh) * 2017-12-20 2018-06-29 阿里巴巴集团控股有限公司 基于增强现实的虚拟对象分配方法及装置
CN108171026A (zh) * 2018-01-19 2018-06-15 百度在线网络技术(北京)有限公司 鉴权方法和装置
US11366884B2 (en) 2018-02-14 2022-06-21 American Express Travel Related Services Company, Inc. Authentication challenges based on fraud initiation requests
JP6859970B2 (ja) * 2018-03-09 2021-04-14 京セラドキュメントソリューションズ株式会社 ログイン支援システム
CN108734084A (zh) * 2018-03-21 2018-11-02 百度在线网络技术(北京)有限公司 人脸注册方法和装置
CN108446664A (zh) * 2018-03-30 2018-08-24 广东华电网维信息科技有限公司 一种基于人脸识别的身份确认方法及装置
CN108712381A (zh) * 2018-04-16 2018-10-26 出门问问信息科技有限公司 一种身份验证方法及装置
US10733676B2 (en) * 2018-05-17 2020-08-04 Coupa Software Incorporated Automatic generation of expense data using facial recognition in digitally captured photographic images
CN110555330A (zh) * 2018-05-30 2019-12-10 百度在线网络技术(北京)有限公司 图像面签方法、装置、计算机设备及存储介质
CA3125586C (en) * 2018-06-11 2025-07-08 Laurencehamid ACTIVE STATE DETECTION
CN109165614A (zh) * 2018-08-31 2019-01-08 杭州行开科技有限公司 基于3d摄像头的人脸识别系统
CN109065058B (zh) * 2018-09-30 2024-03-15 合肥鑫晟光电科技有限公司 语音通信方法、装置及系统
US10956548B2 (en) * 2018-10-09 2021-03-23 Lenovo (Singapore) Pte. Ltd. User authentication via emotion detection
CN111104658A (zh) * 2018-10-25 2020-05-05 北京嘀嘀无限科技发展有限公司 注册方法及装置、认证方法及装置
CN109376684B (zh) * 2018-11-13 2021-04-06 广州市百果园信息技术有限公司 一种人脸关键点检测方法、装置、计算机设备和存储介质
JP7054847B2 (ja) * 2019-03-04 2022-04-15 パナソニックIpマネジメント株式会社 顔認証登録装置および顔認証登録方法
US10860705B1 (en) 2019-05-16 2020-12-08 Capital One Services, Llc Augmented reality generated human challenge
KR20210009596A (ko) * 2019-07-17 2021-01-27 엘지전자 주식회사 지능적 음성 인식 방법, 음성 인식 장치 및 지능형 컴퓨팅 디바이스
CN110472394A (zh) * 2019-07-24 2019-11-19 天脉聚源(杭州)传媒科技有限公司 一种预留信息处理方法、系统、装置和存储介质
CN110490106B (zh) * 2019-08-06 2022-05-03 万翼科技有限公司 信息管理方法及相关设备
CN110611734A (zh) * 2019-08-08 2019-12-24 深圳传音控股股份有限公司 交互方法及终端
CN110765436A (zh) * 2019-10-26 2020-02-07 福建省伟志地理信息科学研究院 一种不动产信息分析管理系统和方法
CN111144896A (zh) * 2019-12-16 2020-05-12 中国银行股份有限公司 一种身份验证方法及装置
WO2021158168A1 (en) * 2020-02-04 2021-08-12 Grabtaxi Holdings Pte. Ltd. Method, server and communication system of verifying user for transportation purposes
CN112422817B (zh) * 2020-10-28 2022-04-12 维沃移动通信有限公司 图像处理方法及装置
US20220158986A1 (en) 2020-11-17 2022-05-19 Titaniam, Inc. Non-stored multiple factor verification
KR102531579B1 (ko) * 2020-11-18 2023-05-16 주식회사 코밴 이중 인증을 통한 간편 결제 방법
CN112487389A (zh) * 2020-12-16 2021-03-12 熵基科技股份有限公司 一种身份认证方法、装置和设备
CN112383571B (zh) * 2021-01-12 2021-06-04 浙江正元智慧科技股份有限公司 基于人脸识别大数据的登录管理系统
CN115131904A (zh) * 2021-03-25 2022-09-30 中国移动通信集团安徽有限公司 一种门禁控制方法、装置、设备及计算机存储介质
CN113128969A (zh) * 2021-04-26 2021-07-16 广州太玮生物科技有限公司 一种色谱柱日常使用管理系统
CN113255529A (zh) * 2021-05-28 2021-08-13 支付宝(杭州)信息技术有限公司 一种生物特征的识别方法、装置及设备
CN113696851B (zh) * 2021-08-27 2023-04-28 上海仙塔智能科技有限公司 基于车外手势的车辆控制方法、装置、设备以及介质
US12199964B1 (en) * 2021-10-29 2025-01-14 United Services Automobile Association (Usaa) Tic detection-based video authentication method and system
CN114268453B (zh) * 2021-11-17 2024-07-12 中国南方电网有限责任公司 电力系统解锁方法、装置、计算机设备和存储介质
CN114283450A (zh) * 2021-12-23 2022-04-05 国网福建省电力有限公司信息通信分公司 一种变电站用作业人员身份识别方法及模块
JP7654313B1 (ja) * 2022-02-25 2025-04-01 ビゴ テクノロジー ピーティーイー. リミテッド 身分認証方法、装置、端末、記憶媒体及びプログラム製品
CN114760074B (zh) * 2022-06-13 2022-09-02 中广(绍兴上虞)有线信息网络有限公司 一种基于大数据安全的身份认证方法及系统
CN118196876B (zh) * 2024-05-20 2024-08-16 东南大学 一种虚拟身份认证装置及其认证方法
CN118821094A (zh) * 2024-07-19 2024-10-22 江苏芯灵智能科技有限公司 一种动态口令与多生物特征结合的身份认证方法
CN119046911B (zh) * 2024-10-30 2025-04-18 国网浙江省电力有限公司杭州供电公司 一种设备身份认证方法及系统

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101075868A (zh) * 2006-05-19 2007-11-21 华为技术有限公司 一种远程身份认证的系统、终端、服务器和方法
CN102385703A (zh) * 2010-08-27 2012-03-21 北京中星微电子有限公司 一种基于人脸的身份认证方法及系统
CN103259796A (zh) * 2013-05-15 2013-08-21 金硕澳门离岸商业服务有限公司 认证系统和方法
CN104298909A (zh) * 2013-07-19 2015-01-21 富泰华工业(深圳)有限公司 电子装置、身份验证系统及方法
CN104518877A (zh) * 2013-10-08 2015-04-15 鸿富锦精密工业(深圳)有限公司 身份认证系统及方法

Family Cites Families (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5572261A (en) * 1995-06-07 1996-11-05 Cooper; J. Carl Automatic audio to video timing measurement device and method
JP2000306090A (ja) * 1999-04-20 2000-11-02 Ntt Data Corp 個人認証装置、方法及び記録媒体
SG91841A1 (en) * 1999-11-03 2002-10-15 Kent Ridge Digital Labs Face direction estimation using a single gray-level image
JP2003216955A (ja) * 2002-01-23 2003-07-31 Sharp Corp ジェスチャ認識方法、ジェスチャ認識装置、対話装置及びジェスチャ認識プログラムを記録した記録媒体
US9412142B2 (en) * 2002-08-23 2016-08-09 Federal Law Enforcement Development Services, Inc. Intelligent observation and identification database system
JP2004110813A (ja) * 2002-08-30 2004-04-08 Victor Co Of Japan Ltd 人物認証装置
CA2600938A1 (en) * 2004-03-24 2005-10-06 Andre Hoffmann Identification, verification, and recognition method and system
US7634106B2 (en) * 2004-09-22 2009-12-15 Fujifilm Corporation Synthesized image generation method, synthesized image generation apparatus, and synthesized image generation program
US8370639B2 (en) * 2005-06-16 2013-02-05 Sensible Vision, Inc. System and method for providing secure access to an electronic device using continuous facial biometrics
KR100725771B1 (ko) * 2005-09-23 2007-06-08 삼성전자주식회사 휴대용 단말기용 얼굴 인식 및 인증 장치 및 방법
KR100777922B1 (ko) * 2006-02-06 2007-11-21 에스케이 텔레콤주식회사 영상인식을 이용한 개인인증 및 전자서명 시스템 및 그방법
JP4367424B2 (ja) * 2006-02-21 2009-11-18 沖電気工業株式会社 個人識別装置,個人識別方法
WO2007105193A1 (en) 2006-03-12 2007-09-20 Nice Systems Ltd. Apparatus and method for target oriented law enforcement interception and analysis
JP5219184B2 (ja) * 2007-04-24 2013-06-26 任天堂株式会社 トレーニングプログラム、トレーニング装置、トレーニングシステムおよびトレーニング方法
JP4999570B2 (ja) * 2007-06-18 2012-08-15 キヤノン株式会社 表情認識装置及び方法、並びに撮像装置
KR20100062413A (ko) * 2008-12-02 2010-06-10 한국전자통신연구원 텔레매틱스 장치를 위한 음성인식 장치 및 그 방법
JP2010231350A (ja) * 2009-03-26 2010-10-14 Toshiba Corp 人物識別装置、そのプログラム、及び、その方法
KR101092820B1 (ko) * 2009-09-22 2011-12-12 현대자동차주식회사 립리딩과 음성 인식 통합 멀티모달 인터페이스 시스템
WO2011065952A1 (en) * 2009-11-30 2011-06-03 Hewlett-Packard Development Company, L.P. Face recognition apparatus and methods
TWI411935B (zh) * 2009-12-25 2013-10-11 Primax Electronics Ltd 利用影像擷取裝置辨識使用者姿勢以產生控制訊號之系統以及方法
US8818025B2 (en) * 2010-08-23 2014-08-26 Nokia Corporation Method and apparatus for recognizing objects in media content
US8897500B2 (en) * 2011-05-05 2014-11-25 At&T Intellectual Property I, L.P. System and method for dynamic facial features for speaker recognition
CN102298443B (zh) * 2011-06-24 2013-09-25 华南理工大学 结合视频通道的智能家居语音控制系统及其控制方法
US9082235B2 (en) * 2011-07-12 2015-07-14 Microsoft Technology Licensing, Llc Using facial data for device authentication or subject identification
CN102324035A (zh) * 2011-08-19 2012-01-18 广东好帮手电子科技股份有限公司 口型辅助语音识别术在车载导航中应用的方法及系统
KR20130022607A (ko) * 2011-08-25 2013-03-07 삼성전자주식회사 입술 이미지를 이용한 음성 인식 장치 및 이의 음성 인식 방법
KR101242390B1 (ko) * 2011-12-29 2013-03-12 인텔 코오퍼레이션 사용자를 인증하기 위한 방법, 장치, 및 컴퓨터 판독 가능한 기록 매체
US9083532B2 (en) * 2012-03-06 2015-07-14 Ebay Inc. Physiological response PIN entry
US8687880B2 (en) * 2012-03-20 2014-04-01 Microsoft Corporation Real time head pose estimation
US9443510B2 (en) * 2012-07-09 2016-09-13 Lg Electronics Inc. Speech recognition apparatus and method
CN102932212A (zh) * 2012-10-12 2013-02-13 华南理工大学 一种基于多通道交互方式的智能家居控制系统
US20140118257A1 (en) * 2012-10-29 2014-05-01 Amazon Technologies, Inc. Gesture detection systems
CN103036680A (zh) * 2012-12-10 2013-04-10 中国科学院计算机网络信息中心 基于生物特征识别的域名认证系统及方法
US8914837B2 (en) * 2012-12-14 2014-12-16 Biscotti Inc. Distributed infrastructure
JP6132232B2 (ja) * 2013-02-01 2017-05-24 パナソニックIpマネジメント株式会社 メイクアップ支援装置、メイクアップ支援システム、およびメイクアップ支援方法
US20140341444A1 (en) * 2013-05-14 2014-11-20 Tencent Technology (Shenzhen) Company Limited Systems and Methods for User Login
JP6209067B2 (ja) * 2013-11-21 2017-10-04 株式会社Nttドコモ 画像認識装置、及び画像認識方法
CN103593598B (zh) * 2013-11-25 2016-09-21 上海骏聿数码科技有限公司 基于活体检测和人脸识别的用户在线认证方法及系统
CN103634120A (zh) * 2013-12-18 2014-03-12 上海市数字证书认证中心有限公司 基于人脸识别的实名认证方法及系统
US9866820B1 (en) * 2014-07-01 2018-01-09 Amazon Technologies, Inc. Online calibration of cameras

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101075868A (zh) * 2006-05-19 2007-11-21 华为技术有限公司 一种远程身份认证的系统、终端、服务器和方法
CN102385703A (zh) * 2010-08-27 2012-03-21 北京中星微电子有限公司 一种基于人脸的身份认证方法及系统
CN103259796A (zh) * 2013-05-15 2013-08-21 金硕澳门离岸商业服务有限公司 认证系统和方法
CN104298909A (zh) * 2013-07-19 2015-01-21 富泰华工业(深圳)有限公司 电子装置、身份验证系统及方法
CN104518877A (zh) * 2013-10-08 2015-04-15 鸿富锦精密工业(深圳)有限公司 身份认证系统及方法

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3190534A4 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110460993A (zh) * 2019-08-21 2019-11-15 广州大学 一种基于手势验证的认证方法及系统

Also Published As

Publication number Publication date
CN111898108A (zh) 2020-11-06
SG11201701497SA (en) 2017-03-30
EP3190534A1 (en) 2017-07-12
JP6820062B2 (ja) 2021-01-27
CN111898108B (zh) 2024-06-04
PL3540621T3 (pl) 2021-05-17
US10601821B2 (en) 2020-03-24
EP3540621A1 (en) 2019-09-18
KR20170047255A (ko) 2017-05-04
ES2810012T3 (es) 2021-03-08
US20170180362A1 (en) 2017-06-22
CN105468950A (zh) 2016-04-06
KR101997371B1 (ko) 2019-07-05
HK1221795A1 (zh) 2017-06-09
EP3190534A4 (en) 2018-03-21
JP2017530457A (ja) 2017-10-12
EP3540621B1 (en) 2020-07-29
SG10201901818UA (en) 2019-03-28
EP3190534B1 (en) 2019-06-26
CN105468950B (zh) 2020-06-30

Similar Documents

Publication Publication Date Title
CN105468950B (zh) 身份认证方法、装置、终端及服务器
RU2589344C2 (ru) Способ, устройство и система аутентификации на основе биологических характеристик
CN108804884B (zh) 身份认证的方法、装置及计算机存储介质
TWI578181B (zh) 電子裝置、身份驗證系統及方法
CN105654033B (zh) 人脸图像验证方法和装置
CN104008317B (zh) 认证设备和认证方法
US20170046508A1 (en) Biometric authentication using gesture
WO2017198014A1 (zh) 一种身份认证方法和装置
US8983207B1 (en) Mitigating replay attacks using multiple-image authentication
US9792421B1 (en) Secure storage of fingerprint related elements
KR101724971B1 (ko) 광각 카메라를 이용한 얼굴 인식 시스템 및 그를 이용한 얼굴 인식 방법
US10547610B1 (en) Age adapted biometric authentication
CN105407069B (zh) 活体认证方法、装置、客户端设备及服务器
WO2006027743A1 (en) Feature extraction algorithm for automatic ear recognition
KR102317598B1 (ko) 서버, 서버의 제어 방법 및 단말 장치
WO2016062200A1 (zh) 一种指纹认证的方法、装置及服务器
WO2017041358A1 (zh) 一种用户身份识别方法、装置和移动终端
CN110582771A (zh) 基于生物计量信息执行认证的方法和装置
WO2016058540A1 (zh) 身份验证方法、装置和存储介质
CN110519061A (zh) 一种基于生物特征的身份认证方法、设备及系统
HK1221795B (zh) 身份认证方法、装置、终端及服务器
KR101718244B1 (ko) 얼굴 인식을 위한 광각 영상 처리 장치 및 방법
HK1226206B (zh) 活体认证方法、装置、客户端设备及服务器
CN106549908A (zh) 使用者登录方法与应用此使用者登录方法的用户登录系统
HK1226206A (en) Method, apparatus, client device and sever for living body authentication

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15838136

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 20177005848

Country of ref document: KR

Kind code of ref document: A

Ref document number: 2017512314

Country of ref document: JP

Kind code of ref document: A

REEP Request for entry into the european phase

Ref document number: 2015838136

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2015838136

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE